Authors: Ghassabian A, Albert PS, Hornig M, Yeung E, Cherkerzian S, Goldstein RB, Buka SL, Goldstein JM, Gilman SE
Journal: Transl Psychiatry. 2018 Mar 13;8(1):64. doi: 10.1038/s41398-018-0112-z.
Gestational inflammation may contribute to brain abnormalities associated with childhood neuropsychiatric disorders. Limited knowledge exists regarding the associations of maternal cytokine levels during pregnancy with offspring neurocognitive development. We assayed the concentrations of five cytokines (interleukin (IL)-6, IL-1β, IL-8, tumor necrosis factor alpha (TNF-α), and IL-10) up to four times in the 2nd and 3rd trimesters of pregnancy using stored prenatal sera from 1366 participants in the New England Family Study (enrollment 1959-1966). Intelligence (IQ), academic achievement, and neuropsychological functioning of singleton offspring were assessed at age 7 years using standardized tests. We used linear mixed models with random effects to estimate cumulative exposure to each cytokine during the 2nd and 3rd trimesters, and then related cumulative cytokine exposure to a wide range of offspring neurocognitive outcomes. We found that children of women with higher levels of the pro-inflammatory cytokine TNF-α in the 2nd and 3rd trimesters had lower IQ (B = -2.51, 99% CI: -4.84, -0.18), higher problem scores in visual-motor maturity (B = 0.12, 99% CI: 0.001, 0.24), and lower Draw-a-Person test scores (B = -1.28, 99% CI: -2.49, -0.07). Higher gestational levels of IL-8, another pro-inflammatory molecule, were associated with better Draw-a-Person test scores and tactile finger recognition scores. Other cytokines were not associated with our outcomes of interest. The opposing directions of the associations of TNF-α and IL-8 with childhood outcomes suggest pleiotropic effects of gestational inflammation across the domains of neurocognitive functioning. Although the path to psychopathological disturbances in children is no doubt multifactorial, our findings point to a potential role for immune processes in the neurocognitive development of children.
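As a rough illustration of the random-effects idea behind the cumulative-exposure estimates (not the authors' actual model), a random-intercept mixed model effectively shrinks each mother's mean of her repeated cytokine measurements toward the grand mean, with more shrinkage when she has fewer measurements. The function name and the known variance components below are assumptions for the sketch:

```python
def shrunken_means(subject_values, sigma2_between, sigma2_within):
    """Empirical-Bayes (random-intercept) estimate of each subject's
    underlying exposure level from repeated measurements.

    Each subject's mean is pulled toward the grand mean with a weight
    that depends on the number of repeats and the variance components.
    This mirrors, in miniature, what the predicted random effects from
    a linear mixed model provide for cumulative cytokine exposure.
    Variance components are treated as known here; in practice they are
    estimated from the data (e.g., by REML).
    """
    all_vals = [v for vals in subject_values for v in vals]
    grand = sum(all_vals) / len(all_vals)
    estimates = []
    for vals in subject_values:
        k = len(vals)
        # shrinkage weight: reliability of the subject's mean of k repeats
        w = sigma2_between / (sigma2_between + sigma2_within / k)
        estimates.append(grand + w * (sum(vals) / k - grand))
    return estimates
```

A subject with a single measurement is shrunk harder toward the grand mean than one with several, which is exactly why repeated assays per pregnancy stabilize the exposure estimate.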
Authors: Scroggins B, Matsuo M, White A, Saito K, Munasinghe JP, Sourbier C, Yamamoto K, Diaz V, Ichikawa K, Mitchell JB, Cherukuri MK, Citrin DE
Journal: Clin Cancer Res. 2018 Mar 29. pii: clincanres.1957.2017. doi: 10.1158/1078-0432.CCR-17-1957. [Epub ahead of print]
PURPOSE: To evaluate the potential of hyperpolarized [1-13C]-pyruvate magnetic resonance spectroscopic imaging (MRSI) of prostate cancer as a predictive biomarker for targeting the Warburg effect.
EXPERIMENTAL DESIGN: Two human prostate cancer cell lines (DU145 and PC3) were grown as xenografts. The conversion of pyruvate to lactate in xenografts was measured with hyperpolarized [1-13C]-pyruvate MRSI after systemic delivery of [1-13C] pyruvic acid. Steady state metabolomic analysis of xenograft tumors was performed with mass spectrometry, and steady state lactate concentrations were measured with proton (1H) MRS. Perfusion and oxygenation of xenografts were measured with electron paramagnetic resonance (EPR) imaging with OX063. Tumor growth was assessed after lactate dehydrogenase (LDH) inhibition with FX-11 (42 µg/mouse/day for 5 days x 2 weekly cycles). Lactate production, pyruvate uptake, extracellular acidification rates, and oxygen consumption of the prostate cancer cell lines were analyzed in vitro. LDH activity was assessed in tumor homogenates.
RESULTS: DU145 tumors demonstrated an enhanced conversion of pyruvate to lactate with hyperpolarized [1-13C]-pyruvate MRSI compared with PC3 tumors, and a corresponding greater sensitivity to LDH inhibition. No difference was observed between PC3 and DU145 xenografts in steady state measures of pyruvate fermentation, oxygenation, or perfusion. The two cell lines exhibited similar sensitivity to FX-11 in vitro. LDH activity correlated with FX-11 sensitivity.
CONCLUSIONS: Hyperpolarized [1-13C]-pyruvate MRSI of prostate cancer predicts efficacy of targeting the Warburg effect.
Authors: Tracy JD, Acra S, Chen KY, Buchowski MS.
Journal: PLoS One. 2018 Mar 23;13(3):e0194461. doi: 10.1371/journal.pone.0194461.
OBJECTIVES: To adapt and refine a previously developed youth-specific algorithm for identifying bedrest for use in adults. The algorithm is based on automated decision tree (DT) analysis of accelerometry data.
DESIGN: Healthy adults (n = 141, 85 females, 19-69 years old) wore accelerometers on the waist, with a subset (n = 45) also wearing accelerometers on the dominant wrist. Participants spent ≈24 h in a whole-room indirect calorimeter equipped with a force-platform floor to detect movement.
METHODS: Minute-by-minute data from recordings of waist-worn or wrist-worn accelerometers were used to identify bedrest and wake periods. Participants were randomly allocated to development (n = 69 and 23) and validation (n = 72 and 22) groups for waist-worn and wrist-worn accelerometers, respectively. The optimized DT algorithm parameters were block length, threshold, bedrest-start trigger, and bedrest-end trigger. Differences between the DT classification and the synchronized objective classification of bedrest or wake by the room calorimeter were assessed for sensitivity, specificity, and accuracy using a receiver operating characteristic (ROC) procedure applied to 1-min epochs (n = 92,543 waist; n = 30,653 wrist).
RESULTS: The optimal algorithm parameter values for block length were 60 and 45 min, thresholds 12.5 and 400 counts/min, bedrest-start trigger 120 and 400 counts/min, and bedrest-end trigger 1,200 and 1,500 counts/min, for the waist and wrist-worn accelerometers, respectively. Bedrest was identified correctly in the validation group with sensitivities of 0.819 and 0.912, specificities of 0.966 and 0.923, and accuracies of 0.755 and 0.859 by the waist and wrist-worn accelerometer, respectively. The DT algorithm identified bedrest/sleep with greater accuracy than a commonly used automated algorithm (Cole-Kripke) for wrist-worn accelerometers (p<0.001).
CONCLUSIONS: The adapted DT algorithm accurately identifies bedrest in data from accelerometers worn by adults on either the wrist or the waist. The automated bedrest/sleep detection DT algorithm for both youth and adults is openly available as the package "PhysActBedRest" for the R language.
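The reported parameters (block length, threshold, start trigger, end trigger) suggest a run-based classification of 1-min count epochs. The sketch below is a deliberately simplified, hypothetical reading of that logic, not the published PhysActBedRest code; defaults use the waist parameters from the abstract:

```python
def detect_bedrest(counts, block_len=60, threshold=12.5,
                   start_trigger=120, end_trigger=1200):
    """Label each 1-min epoch as bedrest (True) or wake (False).

    Simplified, illustrative logic (NOT the published algorithm):
    - a candidate bedrest period starts when a count falls below
      `start_trigger` AND the mean of the next `block_len` minutes
      is below `threshold`;
    - the period ends at the first epoch exceeding `end_trigger`.
    """
    n = len(counts)
    labels = [False] * n
    i = 0
    while i < n:
        window = counts[i:i + block_len]
        if (counts[i] < start_trigger and len(window) == block_len
                and sum(window) / block_len < threshold):
            j = i
            while j < n and counts[j] <= end_trigger:
                labels[j] = True   # inside a confirmed bedrest run
                j += 1
            i = j
        else:
            i += 1
    return labels
```

On synthetic data with a quiet middle segment (e.g., two hours of high counts, three hours near zero, two hours high again), only the quiet segment is labeled bedrest.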
Authors: Ma J, Hennein R, Liu C, Long MT, Hoffmann U, Jacques PF, Lichtenstein AH, Hu FB, Levy D
Journal: Gastroenterology. 2018 Mar 28. pii: S0016-5085(18)30345-7. doi: 10.1053/j.gastro.2018.03.038. [Epub ahead of print]
BACKGROUND & AIMS: Dietary modification has been recommended for treatment of nonalcoholic fatty liver disease (NAFLD), although it is not clear whether improving diet quality can prevent its development. We performed a prospective study to examine the association between change in diet quality and change in liver fat. We also examined the association between genetic risk score and liver fat change in individuals with different levels of diet quality change.
METHODS: Our study included 1521 participants who attended the seventh and eighth examinations (1998-2001 and 2005-2008) of the second-generation cohort or the first and second examinations (2002-2005 and 2008-2011) of the third-generation cohort in the Framingham Heart Study. The self-administered, semi-quantitative, 126-item Harvard food frequency questionnaire was used to determine dietary intake in the year leading up to an examination. We assessed levels of liver fat using the liver-phantom ratio (LPR) measured on computed tomography images acquired from 2002 through 2005 and again from 2008 through 2011. LPR values are inversely related to liver fat: an increased LPR indicates decreased liver fat. We examined associations of changes in 2 diet scores, the Mediterranean-style diet score (MDS) and the Alternative Healthy Eating Index (AHEI), with changes in liver fat and with new-onset fatty liver. We evaluated interactions between diet score change and a weighted genetic risk score for NAFLD, based on multiple single nucleotide polymorphisms identified in genome-wide association studies of NAFLD. The primary outcome was change in LPR between the baseline and follow-up measurements.
RESULTS: For each 1-standard deviation increase in MDS, the LPR increased (meaning liver fat decreased) by 0.57 (95% CI, 0.27-0.86; P<.001) and the odds for incident fatty liver decreased by 26% (95% CI, 10%-39%; P=.002). For each 1-standard deviation increase in AHEI, LPR increased by 0.56 (95% CI, 0.29-0.84; P<.001) and the odds for incident fatty liver decreased by 21% (95% CI, 5%-35%; P=.02). Increased diet scores were also associated with reduced odds of developing more-advanced fatty liver. Higher genetic risk scores were associated with increased liver fat accumulation in participants who had decreased MDS (P<.001) or AHEI scores (P=.001), but not in those with stable or improved diet scores (P for gene-diet interaction <.001).
CONCLUSIONS: In an analysis of participants in the Framingham Heart Study, increasing diet quality, determined based on MDS and AHEI scores, is associated with less liver fat accumulation and reduced risk for new-onset fatty liver. An improved diet is particularly important for individuals with a high genetic risk for NAFLD.
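A weighted genetic risk score of the kind used here is typically the sum, over the panel of SNPs, of each individual's risk-allele count multiplied by that SNP's published effect-size weight. A minimal sketch (SNP names and weights below are placeholders, not the actual NAFLD panel):

```python
def weighted_grs(genotypes, weights):
    """Weighted genetic risk score.

    `genotypes` maps SNP id -> risk-allele count (0, 1, or 2);
    `weights` maps SNP id -> effect-size weight (e.g., per-allele
    log-odds or beta from a GWAS). The score is the weighted sum.
    SNP ids here are hypothetical.
    """
    return sum(genotypes[snp] * w for snp, w in weights.items())
```

For example, with placeholder weights {"rsA": 0.5, "rsB": 0.2} and genotypes {"rsA": 2, "rsB": 1}, the score is 1.2.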
Authors: Mitchell SJ, Bernier M, Aon MA, Cortassa S, Kim EY, Fang EF, Palacios HH, Ali A, Navas-Enamorado I, Di Francesco A, Kaiser TA, Waltz TB, Zhang N, Ellis JL, Elliott PJ, Frederick DW, Bohr VA, Schmidt MS, Brenner C, Sinclair DA, Sauve AA, Baur JA, de Cabo R
Journal: Cell Metab. 2018 Mar 6;27(3):667-676.e4. doi: 10.1016/j.cmet.2018.02.001.
The role of nicotinamide (NAM), the physiological precursor of NAD+, in longevity and healthspan is elusive. Here, we report that chronic NAM supplementation improves healthspan measures in mice without extending lifespan. Untargeted metabolite profiling of the liver and metabolic flux analysis of liver-derived cells revealed NAM-mediated improvement in glucose homeostasis in mice on a high-fat diet (HFD) that was associated with reduced hepatic steatosis and inflammation concomitant with increased glycogen deposition and flux through the pentose phosphate and glycolytic pathways. Targeted NAD metabolome analysis in liver revealed depressed expression of NAM salvage in NAM-treated mice, an effect counteracted by higher expression of de novo NAD biosynthetic enzymes. Although neither hepatic NAD+ nor NADP+ was boosted by NAM, acetylation of some SIRT1 targets was enhanced by NAM supplementation in a diet- and NAM dose-dependent manner. Collectively, our results show health improvement in NAM-supplemented HFD-fed mice in the absence of survival effects.
Authors: Limpose KL, Trego KS, Li Z, Leung SW, Sarker AH, Shah JA, Ramalingam SS, Werner EM, Dynan WS, Cooper PK, Corbett AH, Doetsch PW
Journal: Nucleic Acids Res. 2018 Mar 7. doi: 10.1093/nar/gky162. [Epub ahead of print]
Base excision repair (BER), which is initiated by DNA N-glycosylase proteins, is the frontline for repairing potentially mutagenic DNA base damage. The NTHL1 glycosylase, which excises DNA base damage caused by reactive oxygen species, is thought to be a tumor suppressor. However, in addition to NTHL1 loss-of-function mutations, our analysis of cancer genomic datasets reveals that NTHL1 frequently undergoes amplification or upregulation in some cancers. Whether NTHL1 overexpression could contribute to cancer phenotypes has not yet been explored. To address the functional consequences of NTHL1 overexpression, we employed transient overexpression. Both NTHL1 and a catalytically dead NTHL1 (CATmut) induce DNA damage and genomic instability in non-transformed human bronchial epithelial cells (HBEC) when overexpressed. Strikingly, overexpression of either NTHL1 or CATmut causes replication stress signaling and a decrease in homologous recombination (HR). HBEC cells that overexpress NTHL1 or CATmut acquire the ability to grow in soft agar and exhibit loss of contact inhibition, suggesting that a mechanism independent of NTHL1 catalytic activity contributes to acquisition of cancer-related cellular phenotypes. We provide evidence that NTHL1 interacts with the multifunctional DNA repair protein XPG, suggesting that interference with HR is a possible mechanism that contributes to acquisition of early cellular hallmarks of cancer.
Authors: Sinha R, Goedert JJ, Vogtmann E, Hua X, Porras C, Hayes R, Safaeian M, Yu G, Sampson J, Ahn J, Shi J
Journal: Am J Epidemiol. 2018 Mar 28. doi: 10.1093/aje/kwy064. [Epub ahead of print]
Temporal variation in microbiome measurements can reduce power. Quantification of this variation is essential for designing chronic disease studies. We analyzed 16S rRNA profiles in paired specimens collected six months apart from three studies. We evaluated temporal stability by calculating intraclass correlation coefficients (ICCs). Sample sizes to detect microbiome differences between equal numbers of cases and controls in a nested case-control design were calculated based on the estimated ICCs. Across body sites, 12 phylum-level ICCs were greater than 0.5. Similarly, 11 alpha-diversity ICCs were greater than 0.5. Fecal beta diversity estimates had ICCs over 0.5. For most microbiome metrics measured at a single collection, detecting an odds ratio (OR) of 2.0 would require 300-500 cases when matching one case to one control at P = 0.05. Two or three sequential specimens reduce the number of required subjects by 40%-50% for low-ICC metrics. Relative abundances of major phyla and alpha diversity metrics have low temporal stability. Thus, detecting associations of moderate effect size with these metrics will require large sample sizes. As beta-diversity for feces is reasonably stable over time, smaller sample sizes can detect associations with community composition. Sequential pre-diagnostic specimens from thousands of prospectively ascertained cases are required to detect modest disease associations with particular microbiome metrics.
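The link between ICC and sample size can be sketched with a standard regression-dilution argument: averaging k repeated specimens raises the metric's reliability by the Spearman-Brown formula, the detectable standardized effect is attenuated by the square root of that reliability, and the required n scales accordingly. This is an illustrative normal-approximation calculation under assumed inputs, not the authors' exact procedure:

```python
from math import ceil, log, sqrt
from statistics import NormalDist

def n_cases_per_arm(or_per_sd, icc, k=1, alpha=0.05, power=0.8):
    """Approximate cases needed per arm (1:1 nested case-control) to
    detect an odds ratio per 1-SD difference in a microbiome metric
    measured with temporal reliability `icc` on `k` occasions.

    Illustrative only: a two-sample normal approximation with the
    log-OR attenuated by sqrt(reliability); the paper's calculations
    may differ in detail.
    """
    z = NormalDist()
    reliability = k * icc / (1 + (k - 1) * icc)   # Spearman-Brown
    beta = log(or_per_sd) * sqrt(reliability)     # attenuated log-OR per SD
    za = z.inv_cdf(1 - alpha / 2)
    zb = z.inv_cdf(power)
    # standard two-sample z-test sample size per group
    return ceil(2 * (za + zb) ** 2 / beta ** 2)
```

With an assumed low ICC of 0.1, detecting OR = 2.0 lands in the low hundreds of cases per arm, and moving from one to two specimens cuts the requirement by roughly 45%, consistent in spirit with the abstract's figures.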
Authors: Postnikova E, Cong Y, DeWald LE, Dyall J, Yu S, Hart BJ, Zhou H, Gross R, Logue J, Cai Y, Deiuliis N, Michelotti J, Honko AN, Bennett RS, Holbrook MR, Olinger GG, Hensley LE, Jahrling PB.
Journal: PLoS One. 2018 Mar 22;13(3):e0194880. doi: 10.1371/journal.pone.0194880. eCollection 2018.
Identifying effective antivirals for treating Ebola virus disease (EVD) and minimizing its transmission is critical. A variety of cell-based assays have been developed for evaluating compounds for activity against Ebola virus. However, very few reports discuss the variable assay conditions that can affect the results obtained from these drug screens. Here, we describe variable conditions tested during the development of our cell-based drug screen assays designed to identify compounds with anti-Ebola virus activity using established cell lines and human primary cells. The effects of multiple assay readouts and variable assay conditions, including virus input, time of infection, and cell passage number, were compared, and the impact on the effective concentration for 50% and/or 90% inhibition (EC50, EC90) was evaluated using the FDA-approved compound toremifene citrate. In these studies, we show that altering cell-based assay conditions can have an impact on apparent drug potency as measured by the EC50. These results further support the importance of developing standard operating procedures for generating reliable and reproducible in vitro data sets for potential antivirals.
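An EC50 is the concentration at which a dose-response curve fit to the assay readout crosses 50% inhibition. The sketch below fits a simple Hill inhibition model by grid-search least squares; it is a deliberately minimal stand-in for the nonlinear (e.g., four-parameter logistic) fits used by real dose-response software, and the function names are ours:

```python
def hill(conc, ec50, h):
    """Fraction of control signal remaining at drug concentration
    `conc` (same units as `ec50`) under a simple Hill inhibition
    model with slope `h`."""
    return 1.0 / (1.0 + (conc / ec50) ** h)

def fit_ec50(concs, responses, ec50_grid, h_grid):
    """Grid-search least-squares fit of the Hill model.

    Returns the (ec50, h) pair on the supplied grids minimizing the
    sum of squared errors against the observed fractional responses.
    Illustrative only; production analyses use proper nonlinear
    optimization with confidence intervals.
    """
    best = None
    for ec50 in ec50_grid:
        for h in h_grid:
            sse = sum((r - hill(c, ec50, h)) ** 2
                      for c, r in zip(concs, responses))
            if best is None or sse < best[0]:
                best = (sse, ec50, h)
    return best[1], best[2]
```

Shifting assay conditions (virus input, infection time, cell passage) effectively changes the observed `responses`, which is why the fitted EC50 moves even though the compound is unchanged.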
Authors: Mahale P, Engels EA, Li R, Torres HA, Hwang LY, Brown EL, Kramer JR.
Journal: Gut. 2018 Mar;67(3):553-561. doi: 10.1136/gutjnl-2017-313983. Epub 2017 Jun 20.
BACKGROUND AND AIM: Chronic HCV infection is associated with several extrahepatic manifestations (EHMs). Data on the effect of sustained virological response (SVR) on the risk of EHMs are limited.
METHODS: We conducted a retrospective cohort study using data from patients in the US Veterans Affairs HCV Clinical Case Registry who had a positive HCV RNA test (10/1999-08/2009). Patients receiving interferon-based antiviral therapy (AVT) were identified. SVR was defined as negative HCV RNA at least 12 weeks after the end of AVT. Risks of eight incident EHMs were evaluated in Cox regression models.
RESULTS: Of the 160 875 HCV-infected veterans, 31 143 (19.4%) received AVT, of whom 10 575 (33.9%) experienced SVR. EHM risk was reduced in the SVR group compared with untreated patients for mixed cryoglobulinaemia (adjusted HR (aHR)=0.61; 95% CI 0.39 to 0.94), glomerulonephritis (aHR=0.62; 95% CI 0.48 to 0.79), porphyria cutanea tarda (PCT) (aHR=0.41; 95% CI 0.20 to 0.83), non-Hodgkin's lymphoma (NHL) (aHR=0.64; 95% CI 0.43 to 0.95), diabetes (aHR=0.82; 95% CI 0.76 to 0.88) and stroke (aHR=0.84; 95% CI 0.74 to 0.94), but not for lichen planus (aHR=1.11; 95% CI 0.78 to 1.56) or coronary heart disease (aHR=1.12; 95% CI 0.81 to 1.56). Risk reductions were also observed when patients with SVR were compared with treated patients without SVR for mixed cryoglobulinaemia, glomerulonephritis, PCT and diabetes. Significant reductions in the magnitude of aHRs towards the null with increasing time to initiation of AVT after HCV diagnosis were observed for glomerulonephritis, NHL and stroke.
CONCLUSIONS: Risks of several EHMs of HCV infection are reduced after AVT with SVR. However, early initiation of AVT may be required to reduce the risk of glomerulonephritis, NHL and stroke.
Authors: Hooper LG, Young MT, Keller JP, Szpiro AA, O'Brien KM, Sandler DP, Vedal S, Kaufman JD, London SJ.
Journal: Environ Health Perspect. 2018 Feb 6;126(2):027005. doi: 10.1289/EHP2199.
BACKGROUND: Limited evidence links air pollution exposure to chronic cough and sputum production. Few reports have investigated the association between long-term exposure to air pollution and classically defined chronic bronchitis.
OBJECTIVES: Our objective was to estimate the association between long-term exposure to particulate matter (diameter <10 μm, PM10; <2.5 μm, PM2.5) and nitrogen dioxide (NO2) and both incident and prevalent chronic bronchitis.
METHODS: We estimated annual average PM2.5, PM10, and NO2 concentrations using a national land-use regression model with spatial smoothing at the home addresses of participants in a prospective nationwide U.S. cohort study of sisters of women with breast cancer. Incident and prevalent chronic bronchitis, cough, and phlegm were assessed by questionnaire.
RESULTS: Among 47,357 individuals with complete data, 1,383 had prevalent chronic bronchitis at baseline, and 647 incident cases occurred over 5.7-y average follow-up. No associations with incident chronic bronchitis were observed. Prevalent chronic bronchitis was associated with PM10 [adjusted odds ratio (aOR) per interquartile range (IQR) difference (5.8 μg/m3)=1.07; 95% confidence interval (CI): 1.01, 1.13]. In never-smokers, PM2.5 was associated with prevalent chronic bronchitis (aOR=1.18 per IQR difference; 95% CI: 1.04, 1.34), and NO2 was associated with prevalent chronic bronchitis (aOR=1.10; 95% CI=1.01, 1.20), cough (aOR=1.10; 95% CI: 1.05, 1.16), and phlegm (aOR=1.07; 95% CI: 1.01, 1.14); interaction p-values (nonsmokers vs. smokers) <0.05.
CONCLUSIONS: PM10 exposure was related to chronic bronchitis prevalence. Among never-smokers, PM2.5 and NO2 exposure was associated with chronic bronchitis and component symptoms. Results may have policy ramifications for PM10 regulation by providing evidence for respiratory health effects related to long-term PM10 exposure.
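The "aOR per IQR difference" reported above is just the logistic-regression coefficient rescaled: an odds ratio per any increment of exposure is exp(beta × increment). A small sketch of that arithmetic (the function name is ours):

```python
from math import exp, log

def or_per_increment(beta_per_unit, increment):
    """Convert a logistic-regression log-odds coefficient per unit of
    exposure into an odds ratio per `increment` units (e.g., per IQR
    of PM10 in μg/m3)."""
    return exp(beta_per_unit * increment)
```

For example, an aOR of 1.07 per 5.8 μg/m3 of PM10 corresponds to beta = log(1.07)/5.8 per μg/m3, and the same coefficient rescaled to a 10 μg/m3 increment gives an aOR of about 1.12.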