Research advances from the National Institutes of Health (NIH) Intramural Research Program (IRP) often make headlines. Read the news releases that describe our most recent findings:
BETHESDA, Md. (AP) — Sam Srisatta, a 20-year-old Florida college student, spent a month living inside a government hospital here last fall, playing video games and allowing scientists to document every morsel of food that went into his mouth.
From big bowls of salad to platters of meatballs and spaghetti sauce, Srisatta noshed his way through a nutrition study aimed at understanding the health effects of ultraprocessed foods, the controversial fare that now accounts for more than 70% of the U.S. food supply. He allowed The Associated Press to tag along for a day.
“Today my lunch was chicken nuggets, some chips, some ketchup,” said Srisatta, one of three dozen participants paid $5,000 each to devote 28 days of their lives to science. “It was pretty fulfilling.”
Examining exactly what made those nuggets so satisfying is the goal of the widely anticipated research led by National Institutes of Health nutrition researcher Kevin Hall.
“What we hope to do is figure out what those mechanisms are so that we can better understand that process,” Hall said.
A genomic analysis of lung cancer in people with no history of smoking has found that a majority of these tumors arise from the accumulation of mutations caused by natural processes in the body. This study was conducted by an international team led by researchers at the National Cancer Institute (NCI), part of the National Institutes of Health (NIH), and describes for the first time three molecular subtypes of lung cancer in people who have never smoked.
These insights will help unlock the mystery of how lung cancer arises in people who have no history of smoking and may guide the development of more precise clinical treatments. The findings were published September 6, 2021, in Nature Genetics.
“What we’re seeing is that there are different subtypes of lung cancer in never smokers that have distinct molecular characteristics and evolutionary processes,” said epidemiologist Maria Teresa Landi, M.D., Ph.D., of the Integrative Tumor Epidemiology Branch in NCI’s Division of Cancer Epidemiology and Genetics, who led the study, which was done in collaboration with researchers at the National Institute of Environmental Health Sciences, another part of NIH, and other institutions. “In the future we may be able to have different treatments based on these subtypes.”
[Image: Illustration of lungs made up of DNA sequences, with a magnifying glass over a portion of a DNA sequence showing a mutational change.]
Neurofibromatosis type 1, or NF1, is the most common cancer predisposition syndrome, affecting 1 in 3,000 people worldwide.
People with an inherited condition known as neurofibromatosis type 1, or NF1, often develop non-cancerous, or benign, tumors that grow along nerves. These tumors can sometimes turn into aggressive cancers, but there hasn’t been a good way to determine whether this transformation to cancer has happened.
Researchers from the National Cancer Institute’s (NCI) Center for Cancer Research, part of the National Institutes of Health, and Washington University School of Medicine in St. Louis have developed a blood test that they believe could one day offer a highly sensitive and inexpensive way to detect cancer early in people with NF1. The blood test could also help doctors monitor how well patients are responding to treatment for their cancer.
The findings are published in the August 31 issue of PLOS Medicine.
National Institutes of Health scientists studying SARS-CoV-2, the virus that causes COVID-19, have defined in Syrian hamsters how different routes of virus exposure are linked to disease severity. Their study, published in Nature Communications, details the efficiency of airborne transmission between hamsters and examines how the virus replicates and causes disease throughout the respiratory system. Their work also shows that virus transmission via fomites (exposure through contact with contaminated surfaces) is markedly less efficient than airborne transmission, but does occur.
To investigate how different routes of exposure affected disease development, the scientists exposed hamsters to SARS-CoV-2 via both aerosols and fomites. For aerosol exposure, the scientists used equipment that controlled the size of virus-loaded droplets. For fomite exposure, they placed a dish contaminated with SARS-CoV-2 in the animal cages.
The scientists found that aerosol exposure directly deposited SARS-CoV-2 deep into the lungs, whereas fomite exposure resulted in initial virus replication in the nose. Regardless of exposure route, animals had SARS-CoV-2 replicating in the lungs, but lung damage was more severe in aerosol-exposed animals compared to the fomite group.
Scientists at the National Institutes of Health (NIH) have developed a new sample preparation method to detect SARS-CoV-2, the virus that causes COVID-19. The method bypasses extraction of the virus’ genetic RNA material, simplifying sample purification and potentially reducing test time and cost. The method is the result of a collaboration among researchers at the National Eye Institute (NEI), the NIH Clinical Center (CC), and the National Institute of Dental and Craniofacial Research (NIDCR).
Diagnostic testing remains a crucial tool in the fight against the COVID-19 pandemic. Standard tests for detection of SARS-CoV-2 involve amplifying viral RNA to detectable levels using a technique called quantitative reverse transcription PCR (RT-qPCR). But first, the RNA must be extracted from the sample. Manufacturers of RNA extraction kits have had difficulty keeping up with demand during the COVID-19 pandemic, hindering testing capacity worldwide. With new virus variants emerging, the need for better, faster tests is greater than ever.
A team led by Robert B. Hufnagel, M.D., Ph.D., chief of the NEI Medical Genetics and Ophthalmic Genomic Unit, and Bin Guan, Ph.D., a fellow at the Ophthalmic Genomics Laboratory at NEI, used a chelating agent called Chelex 100 resin, made by the lab supply company Bio-Rad, to preserve SARS-CoV-2 RNA in samples for detection by RT-qPCR.
“We used nasopharyngeal and saliva samples with various virion concentrations to evaluate whether they could be used for direct RNA detection,” said Guan, the lead author of a report on the technique, which published this week in iScience. “The answer was yes, with markedly high sensitivity. Also, this preparation inactivated the virus, making it safer for lab personnel to handle positive samples.”
One dose of a new monoclonal antibody discovered and developed at the National Institutes of Health safely prevented malaria for up to nine months in people who were exposed to the malaria parasite. The small, carefully monitored clinical trial is the first to demonstrate that a monoclonal antibody can prevent malaria in people. The trial was sponsored and conducted by scientists from the Vaccine Research Center (VRC) of the National Institute of Allergy and Infectious Diseases (NIAID), part of NIH, and was funded by NIAID. The findings were published today in The New England Journal of Medicine.
“Malaria continues to be a major cause of illness and death in many regions of the world, especially in infants and young children; therefore, new tools are needed to prevent this deadly disease,” said NIAID Director Anthony S. Fauci, M.D. “The results reported today suggest that a single infusion of a monoclonal antibody can protect people from malaria for at least 9 months. Additional research is needed, however, to confirm and extend this finding.”
According to the World Health Organization, an estimated 229 million cases of malaria occurred worldwide in 2019, resulting in an estimated 409,000 deaths, mostly in children in sub-Saharan Africa. So far, no licensed or experimental malaria vaccine that has completed Phase 3 testing provides more than 50% protection from the disease over the course of a year or longer.
[Image: Colorized electron micrograph showing a malaria parasite (right, blue) attaching to a human red blood cell. The inset shows a detail of the attachment point at higher magnification.]
New study brings into question current policies on receiving secondary genomic findings
A study published today by researchers at the National Institutes of Health revealed that about half of the individuals who said they did not want to receive secondary genomic findings changed their minds after their healthcare provider gave them more detailed information. The paper, published in Genetics in Medicine, examines people’s attitudes about receiving secondary genomic findings related to treatable or preventable diseases.
With the broader adoption of genome sequencing in clinical care, researchers and the bioethics community are considering options for how to navigate the discovery of secondary genomic findings. Secondary findings that come out of genome sequencing reflect information that is separate from the primary reason for an individual's medical care or participation in a study. For example, the genomic data of a patient who undergoes genome sequencing to address an autoimmune problem might reveal genomic variants that are associated with a heightened risk for breast cancer.
NIH study in monkeys finds that in visual decision-making, information relevant to the decision is broadcast widely
Researchers at the National Institutes of Health have discovered that in decisions based on visual information, which involve a complex stream of data flowing forward and backward along the brain’s visual pathways, decision-related signals are broadcast widely to neurons in the visual system, including those that are not being used to make the decision. Feedback — such as information about a decision traveling back to neurons detecting visual features like color or shape — is thought to help the brain focus on visual information that is relevant to decision-making. The study, by scientists at the National Eye Institute (NEI), was published in Nature Communications.
“Why and how decision-making information is relayed back into the visual processing parts of the brain is an open question. Some theories posit that this type of feedback should be selective – only affecting those neurons that are involved in the decision,” said Hendrikje Nienborg, Ph.D., chief of the NEI Unit on Visual Decision Making and lead author of the study. “This study shows that decision-related feedback is spatially unselective, affecting neurons much more broadly than one might suppose.”
Feedback is used by the brain in many ways and many systems. When a decision is based on what we see, information about expectation or attention — such as where the object is, or about its features — is fed back to brain regions involved in the visual process, raising the activity of neurons involved in seeing the object or event in question.
[Image: Decision signals about feature information, like object depth, are broadcast widely in the visual cortex via feedback during visual decision-making.]
A new study authored by scientists at the National Institutes of Health, in collaboration with colleagues at the Centers for Disease Control and Prevention and Harvard University, Boston, and Emory University, Atlanta, suggests that one in four COVID-19 deaths in U.S. hospitals may be attributable to strain on hospitals from surging caseloads. Published in the Annals of Internal Medicine, the analysis looked at data from 150,000 COVID-19 inpatients admitted to 558 U.S. hospitals between March and August 2020. More than half of those admissions were patients arriving at hospitals during peak COVID-19 surges.
The surge–mortality relationship was stronger from June to August 2020 versus March to May 2020 (i.e., the contrast in outcomes between surging and non-surging hospitals appeared to grow over time), despite greater corticosteroid use and more judicious intubation during later and higher surging months. Surges were associated with mortality across ward, intensive care unit (ICU) and intubated patients.
These findings have implications for triage strategies, hospital preparedness, how healthcare facilities allocate resources and how public health authorities can assess and react to local data. To better highlight the strain experienced by hospitals, investigators used a measure of surge that considered not only the number of patients with COVID-19, but also illness severity and the hospitals’ typical bed capacity. By tracking these data, hospitals could preemptively divert patients and ask for help sooner — potentially avoiding excess deaths.
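To make the surge measure described above concrete, here is a minimal sketch of how such an index might be computed. The severity weights, function name, and category labels are illustrative assumptions for exposition, not the study’s published formula.

```python
# Illustrative sketch only: a surge index that weights each hospitalized
# COVID-19 patient by illness severity and scales by the hospital's typical
# bed capacity. The weights below are hypothetical, not from the study.

SEVERITY_WEIGHTS = {"ward": 1.0, "icu": 2.0, "intubated": 4.0}  # assumed values

def surge_index(census_by_severity, baseline_beds):
    """Severity-weighted COVID-19 census relative to typical bed capacity."""
    weighted_load = sum(
        SEVERITY_WEIGHTS[level] * count
        for level, count in census_by_severity.items()
    )
    return weighted_load / baseline_beds

# Example: 40 ward patients, 10 in the ICU, 5 intubated, in a hospital
# with a typical capacity of 200 beds.
print(surge_index({"ward": 40, "icu": 10, "intubated": 5}, 200))
```

Under this kind of measure, two hospitals with the same raw COVID-19 census can register very different strain, depending on how sick their patients are and how large the hospitals normally run.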
NIH-funded study finds frequency, intensity of monthly migraines declined among those on higher fish oil diet
A diet higher in fatty fish helped frequent migraine sufferers reduce their monthly number of headaches and intensity of pain compared to participants on a diet higher in vegetable-based fats and oils, according to a new study. The findings, by a team of researchers from the National Institute on Aging (NIA) and the National Institute on Alcohol Abuse and Alcoholism (NIAAA), parts of the National Institutes of Health, and the University of North Carolina (UNC) at Chapel Hill, were published in the July 3 issue of The BMJ.
This study of 182 adults with frequent migraines expanded on the team’s previous work on the impact of linoleic acid and chronic pain. Linoleic acid is a polyunsaturated fatty acid commonly derived in the American diet from corn, soybean, and other similar oils, as well as some nuts and seeds. The team’s previous smaller studies explored whether linoleic acid inflamed migraine-related pain-processing tissues and pathways in the trigeminal nerve, the largest and most complex of the body’s 12 cranial nerves. They found that a diet lower in linoleic acid and higher in omega-3 fatty acids (like those found in fish and shellfish) could soothe this pain pathway inflammation.
In a 16-week dietary intervention, participants were randomly assigned to one of three healthy diet plans. Participants all received meal kits that included fish, vegetables, hummus, salads, and breakfast items. One group received meals that had high levels of fatty fish or oils from fatty fish and lowered linoleic acid. A second group received meals that had high levels of fatty fish and higher linoleic acid. The third group received meals with high linoleic acid and lower levels of fatty fish to mimic average U.S. intakes.
Two U.S. Phase 1 clinical trials of a novel candidate malaria vaccine have found that the regimen conferred unprecedentedly high levels of durable protection when volunteers were later exposed to disease-causing malaria parasites. The vaccine combines live parasites with either of two widely used antimalarial drugs, an approach termed chemoprophylaxis vaccination. A Phase 2 clinical trial of the vaccine is now underway in Mali, a malaria-endemic country. If the approach proves successful there, chemoprophylaxis vaccination, or CVac, potentially could help restart the stalled decline of global malaria. Currently, there is no vaccine in widespread use for the mosquito-transmitted disease.
The Sanaria vaccine, called PfSPZ, is composed of sporozoites, the form of the malaria parasite transmitted to people by mosquito bites. Sporozoites travel through blood to the liver to initiate infection. In the CVac trials, healthy adult volunteers received PfSPZ along with either pyrimethamine, a drug that kills liver-stage parasites, or chloroquine, which kills blood-stage parasites. Three months later, under carefully controlled conditions, the volunteers were exposed to either an African malaria parasite strain that was the same as that in the vaccine (homologous challenge) or a variant South American parasite (heterologous challenge) that was more genetically distant from the vaccine strain than hundreds of African parasites. Exposure in both cases was via inoculation into venous blood, which infects all unvaccinated individuals.