Sensitive non-invasive technology developed to detect changes in protein biomarkers associated with TBI

From NYU Langone Medical Center:

Enhancing Studies on a Possible Blood Biomarker for Traumatic Brain Injury

NYU Langone Medical Center Introducing New Technology to Identify Head Injuries at Earliest Possible Stages

February 27, 2015 (9:00AM)

New technology being introduced at NYU Langone Medical Center could help researchers advance blood biomarker capabilities that show changes in low concentrations of specific proteins present following a neurological injury.

The single molecule array (Simoa) technology developed by Quanterix, together with the fully automated HD-1 analyzer, offers an unprecedented improvement in sensitivity over current technologies for the detection of blood-based biomarkers – as much as 1,000 times more sensitive than conventional immunoassays. Specifically, it allows for more effective measurement of low concentrations of proteins such as tau, a normal protein that is released from brain cells following a brain injury.

Utilizing this technology, researchers hope to develop and validate simpler and more objective blood biomarkers for the diagnosis, prognosis and treatment of traumatic brain injury (TBI).

“This diagnostic advancement provides us with a more precise ruler for measuring the effectiveness of diagnosis, treatment and progression of TBI,” says Mony J. de Leon, EdD, director of the Center for Brain Health at NYU Langone, professor of psychiatry and an investigator with NYU Langone’s Steven and Alexandra Cohen Veterans Center. “We know that increased tau proteins in the cerebrospinal fluid are a marker for TBI. Having more immediate and consistent access to tau measurements from the blood or saliva will allow us to more accurately determine if a brain injury has, indeed, occurred, and how well a patient is responding to treatment.”

Until recently, there have been limits in tissue availability and technology to detect specific proteins and other potential biomarkers in the blood following TBI. Analyzing cerebrospinal fluid (CSF) has been the most common approach to date to measure these proteins. However, this requires an invasive procedure, and it is not always available or obtainable with certain injuries. Nor is it regularly used to monitor progression and recovery because of its invasive nature.

The result is that a large number of people whose brains appear normal on standard tests (X-ray, computed tomography, or magnetic resonance imaging) could actually have some form of injury whose course is not well understood. “This presents a unique opportunity to develop a mechanisms-based classification of TBI in the context of personalized medicine,” adds Andreas Jeromin, PhD, scientific and medical advisor to Quanterix, Inc.

With further study, researchers also feel that the technology has the potential to move beyond the clinical spectrum and into military battlefields, sports arenas and other settings where TBI, concussion and other head injuries often occur. In fact, the Simoa technology was recently selected as a winner of the 2014 GE and NFL Head Health Challenge from more than 400 entries across 27 countries by a panel of leading healthcare experts in brain research.

An estimated 1.7 million Americans suffer a traumatic brain injury each year, according to the Centers for Disease Control and Prevention, and an estimated 5.3 million individuals – approximately two percent of the U.S. population – are living with disability as a result of TBI. Traumatic brain injuries can occur from even the slightest bump or blow to the head.

“The Quanterix Simoa will accelerate the discovery of new biomarkers to identify TBI and the development of new treatments, including targeted medications and other therapies,” says Charles L. Marmar, MD, the Lucius Littauer professor and chairman of Psychiatry at NYU Langone and executive director of the Cohen Veterans Center. “It is truly a breakthrough for advancing detection and treatment of brain injuries.”

The acquisition of the Simoa technology continues the collaborative partnership between TBI researchers at NYU Langone and the Steven & Alexandra Cohen Foundation. In 2014, the foundation made a major $17 million gift to establish the Steven and Alexandra Cohen Veterans Center at NYU Langone. The foundation’s latest gift will fund the purchase of the Simoa technology and support Dr. de Leon’s research to study its use as a potential biomarker for TBI.

“Finding a biomarker to detect PTS and TBI is critical to identifying which veterans need treatment, what kind of treatment, and how we track their progress in getting better,” says Mr. Cohen. “We owe it to our veterans to make sure they get the treatment they need and deserve.”

Dr. de Leon also points out that brain injury increases the risk for future neurodegenerative disorders such as Alzheimer’s disease. “Since both TBI and Alzheimer’s share features such as neurofibrillary pathology, a variant of the tau molecule, the Quanterix device offers opportunities to further explore tauopathy that contributes to Alzheimer’s disease risk,” Dr. de Leon adds.

Read more.

Vitamin D regulates synthesis of serotonin; EPA and DHA increase serotonin effect at synapse

Image retrieved from Vitamin D Council

From EurekAlert:

Public Release: 26-Feb-2015

Omega-3 fatty acids and vitamin D may control brain serotonin

Affecting behavior and psychiatric disorders

Children’s Hospital & Research Center Oakland

Oakland, CA (February 26, 2015) – Although essential marine omega-3 fatty acids and vitamin D have been shown to improve cognitive function and behavior in the context of certain brain disorders, the underlying mechanism has been unclear. In a new paper published in FASEB Journal by Rhonda Patrick, PhD, and Bruce Ames, PhD, of Children’s Hospital Oakland Research Institute (CHORI), serotonin is proposed as the possible missing link explaining why vitamin D and marine omega-3 fatty acids might ameliorate the symptoms associated with a broad array of brain disorders.

In a previous paper published last year, authors Patrick and Ames discussed the implications of their finding that vitamin D regulates the conversion of the essential amino acid tryptophan into serotonin, and how this may influence the development of autism, particularly in developing children with poor vitamin D status.

Here they discuss the relevance of these micronutrients for neuropsychiatric illness. Serotonin affects a wide range of cognitive functions and behaviors including mood, decision-making, and social behavior, and plays a role in social decision-making by keeping aggressive social responses and impulsive behavior in check.

Many clinical disorders, such as autism spectrum disorder (ASD), attention deficit hyperactivity disorder (ADHD), bipolar disorder, schizophrenia, and depression share low brain serotonin as a unifying attribute. “In this paper we explain how serotonin is a critical modulator of executive function, impulse control, sensory gating, and pro-social behavior,” says Dr. Patrick. “We link serotonin production and function to vitamin D and omega-3 fatty acids, suggesting one way these important micronutrients help the brain function and affect the way we behave.”

Eicosapentaenoic acid (EPA) increases serotonin release from presynaptic neurons by reducing inflammatory signaling molecules in the brain known as E2 series prostaglandins, which inhibit serotonin release; this suggests how inflammation may negatively impact serotonin in the brain. EPA, however, is not the only omega-3 that plays a role in the serotonin pathway. Docosahexaenoic acid (DHA) also influences the action of various serotonin receptors, making them more accessible to serotonin by increasing cell membrane fluidity in postsynaptic neurons.

Their paper illuminates the mechanistic links that explain why deficiencies in vitamin D, which is mostly produced by the skin when exposed to sun, and in marine omega-3 fatty acids interact with genetic pathways, such as the serotonin pathway, that are important for brain development, social cognition, and decision-making, and how these gene-micronutrient interactions may influence neuropsychiatric outcomes. “Vitamin D, which is converted to a steroid hormone that controls about 1,000 genes, many in the brain, is a major deficiency in the US and omega-3 fatty acid deficiencies are very common because people don’t eat enough fish,” said Dr. Ames.

Read more.

Also see

Research suggests vitamin D could affect brain function

New study finds vitamin D helps to regulate three genes involved in autism

Mechanism proposed that links decreased levels of vitamin D and serotonin in autism patients

Vitamin D hormone regulates serotonin synthesis. Part 1: relevance for autism.

Somatostatin activation of GABAb receptors on excitatory neurons in neocortex acts as a cloaking device

From Carnegie Mellon News:

Thursday, February 26, 2015

Intermediary Neuron Acts as Synaptic Cloaking Device, Says Carnegie Mellon Study

Researchers Find That Somatostatin Neurons Regulate Synaptic Activity in the Neocortex

By Jocelyn Duffy / 412-268-9982

Neuroscientists believe that the connectome, a map of each and every connection between the billions of neurons in the brain, will provide a blueprint that will allow them to link brain anatomy to brain function. But a new study from Carnegie Mellon University has found that a specific type of neuron might be thwarting their efforts at mapping the connectome by temporarily cloaking the synapses that link a wide field of neurons.

If you’re a Star Trek fan, think of it as a Romulan or Klingon cloaking device, which hides a warship. The cloaked ship is invisible, until it fires at an enemy. In the study published in the March 16 issue of Current Biology, the researchers found that a class of inhibitory neurons, called somatostatin cells, send out a signal — much like a cloaking device — that silences neighboring excitatory neurons. Synapses, like a cloaked warship, can’t be seen if they aren’t firing; activating the somatostatin cells makes the synapses and local network of neurons invisible to researchers.

Furthermore, by silencing certain parts of the neuronal network, the activity of the somatostatin neurons also can change the way the brain functions, heightening some perceptual pathways and silencing others.

“It was totally unexpected that these cells would work this way,” said Alison Barth, professor of biological sciences and a member of BrainHub, Carnegie Mellon’s neuroscience research initiative. “Changing the activity of just this one cell type can let you change the brain’s circuit structure at will. This could dramatically change how we look at — and use — the connectome.”

The Carnegie Mellon researchers discovered this synaptic cloaking device, much in the same way Starfleet would detect a cloaked Klingon warship — they were conducting their normal research and noticed that something just didn’t look quite right.

Joanna Urban-Ciecko, a research scientist in Barth’s lab, noticed that the synapses in her experiments were not behaving the way that previous experimenters had reported. Prior studies reported that the synapses should be strong and reliable, and that they should always grow and strengthen in response to a stimulus. But the neurons Urban-Ciecko looked at were weak and unreliable.

The difference between Urban-Ciecko’s research and the previously completed work was that her research was being done under real-life conditions. Prior research on synapse function was done under conditions optimized for observing synapses. However, such experimental conditions don’t reflect the noisy brain environment in which synapses normally exist.

“There’s this big black box in neuroscience. We know how to make synapses stronger in a dish. But what’s going on in the brain to initiate synaptic strengthening in real life?” Barth asked.

To find out, Urban-Ciecko looked at neurons in the brain’s neocortex that were functioning under normal, noisy conditions. She took paired-cell recordings from pyramidal cells, a type of excitatory neuron, and found that many of the synapses between the neurons were not functioning, or functioning at an unexpectedly low level. Urban-Ciecko then recorded the activity of somatostatin cells, a type of inhibitory neuron, and found that those neurons were much more active than expected.

“The somatostatin cells were so active, I wondered if they could possibly be driving the inhibition of synapses,” Urban-Ciecko said.

To test her hypothesis, Urban-Ciecko turned to optogenetics, a technique that controls neurons with light. She used light to trigger a light-sensitive protein that activated and deactivated the somatostatin neurons. When the somatostatin cells were turned off, synapses grew big and strong. When the cells were turned on, the synapses became weaker and, in some cases, disappeared entirely.

“You have inputs coming at you all the time, why do you remember one thing and not the other? We think that somatostatin neurons may be gating whether synapses are used, and whether they can be changed during some important event, to enable learning,” said Barth, who is also a member of the joint CMU/University of Pittsburgh Center for the Neural Basis of Cognition (CNBC).

The researchers found that when the somatostatin neurons were turned on, this triggered the cloaking device. The neuron activated the GABAb receptors on hundreds of excitatory neurons in the immediate area. Activating this receptor suppressed the excitatory neurons, which prevented them from creating and strengthening synapses — and made them invisible to researchers.

The researchers next plan to see if the somatostatin cells behave similarly in other areas of the brain. If they do, it could represent a novel target for studying and improving learning and memory.

Read more.

Exercise-induced galanin protects neurons from degeneration caused by stress

From UGA Today:

Exercise reduces stress: UGA scientists discover why

Neuroscientists reveal mechanism behind stress-reducing benefit of exercise

February 23, 2015

Read more.

Brain waves may be infrastructure supporting neural communication

From MIT News:

Two areas of the brain — the hippocampus (yellow) and the prefrontal cortex (blue) — use two different brain-wave frequencies to communicate as the brain learns to associate unrelated objects. Illustration: Jose-Luis Olivares/MIT

How brain waves guide memory formation

Neurons hum at different frequencies to tell the brain which memories it should store.

Anne Trafton | MIT News Office
February 23, 2015

Our brains generate a constant hum of activity: As neurons fire, they produce brain waves that oscillate at different frequencies. Though long thought to be merely a byproduct of neuronal activity, these waves, recent studies suggest, may play a critical role in communication between different parts of the brain.

A new study from MIT neuroscientists adds to that evidence. The researchers found that two brain regions that are key to learning — the hippocampus and the prefrontal cortex — use two different brain-wave frequencies to communicate as the brain learns to associate unrelated objects. Whenever the brain correctly links the objects, the waves oscillate at a higher frequency, called “beta,” and when the guess is incorrect, the waves oscillate at a lower “theta” frequency.

“It’s like you’re playing a computer game and you get a ding when you get it right, and a buzz when you get it wrong. These two areas of the brain are playing two different ‘notes’ for correct guesses and wrong guesses,” says Earl Miller, the Picower Professor of Neuroscience, a member of MIT’s Picower Institute for Learning and Memory, and senior author of a paper describing the findings in the Feb. 23 online edition of Nature Neuroscience.

Furthermore, these oscillations may reinforce the correct guesses while repressing the incorrect guesses, helping the brain learn new information, the researchers say.

Signaling right and wrong

Miller and lead author Scott Brincat, a research scientist at the Picower Institute, examined activity in the brain as it forms a type of memory called explicit memory — memory for facts and events. This includes linkages between items such as names and faces, or between a location and an event that took place there.

During the learning task, animals were shown pairs of images and gradually learned, through trial and error, which pairs went together. Each correct response was signaled with a reward.

As the researchers recorded brain waves in the hippocampus and the prefrontal cortex during this task, they noticed that the waves occurred at different frequencies depending on whether the correct or incorrect response was given. When the guess was correct, the waves occurred in the beta frequency, about 9 to 16 hertz (cycles per second). When incorrect, the waves oscillated in the theta frequency, about 2 to 6 hertz.
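
As an illustration of how activity in those two bands can be separated, the short sketch below estimates theta-band (2-6 Hz) and beta-band (9-16 Hz) power in a synthetic signal using a Fourier transform. The signal, sampling rate, and analysis here are illustrative assumptions, not the study’s actual recordings or pipeline.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Estimate power in the [f_lo, f_hi] Hz band via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].sum()

fs = 250                          # sampling rate in Hz (assumed)
t = np.arange(0, 2, 1.0 / fs)    # 2 seconds of data
# Toy "correct trial": a strong 12 Hz (beta) component plus a weak 4 Hz (theta) one
sig = 2.0 * np.sin(2 * np.pi * 12 * t) + 0.5 * np.sin(2 * np.pi * 4 * t)

theta = band_power(sig, fs, 2, 6)    # theta band, 2-6 Hz
beta = band_power(sig, fs, 9, 16)    # beta band, 9-16 Hz
print(beta > theta)  # True: beta dominates this synthetic trial
```

The same band-power comparison, applied trial by trial, is the kind of measurement that lets researchers say which frequency dominates after a correct versus an incorrect guess.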

Previous studies by MIT’s Mark Bear, also a member of the Picower Institute, have found that stimulating neurons in brain slices at beta frequencies strengthens the connections between the neurons, while stimulating the neurons at theta frequencies weakens the connections.

Miller believes the same thing is happening during this learning task.

“When the animal guesses correctly, the brain hums at the correct answer note, and that frequency reinforces the strengthening of connections,” he says. “When the animal guesses incorrectly, the ‘wrong’ buzzer buzzes, and that frequency is what weakens connections, so it’s basically telling the brain to forget about what it just did.”

The findings represent a major step in revealing how memories are formed, says Howard Eichenbaum, director of the Center for Memory and Brain at Boston University.

“This study offers a very specific, detailed story about the role of different directions of flow, who’s sending information to whom, at what frequencies, and how that feedback contributes to memory formation,” says Eichenbaum, who was not part of the research team.

The study also highlights the significance of brain waves in cognitive function, a role that Miller and others have only recently begun to uncover.

“Brain waves had been ignored for decades in neuroscience. It’s been thought of as the humming of a car engine,” Miller says. “What we’re discovering through this experiment and others is that these brain waves may be the infrastructure that supports neural communication.”

Read more.

Highly processed foods implicated in “food addiction”

From University of Michigan News:

Highly processed foods linked to addictive eating

Feb 18, 2015

ANN ARBOR—A new University of Michigan study confirms what has long been suspected: highly processed foods like chocolate, pizza and French fries are among the most addictive.

This is one of the first studies to examine specifically which foods may be implicated in “food addiction,” which has become of growing interest to scientists and consumers in light of the obesity epidemic.

Previous studies in animals concluded that highly processed foods, or foods with added fat or refined carbohydrates (like white flour and sugar), may be capable of triggering addictive-like eating behavior. Clinical studies in humans have observed that some individuals meet the criteria for substance dependence when the substance is food.

Although highly processed foods are generally known to be highly palatable and preferred, it was unknown whether these types of foods can elicit addiction-like responses in humans, or which specific foods produce these responses, said Ashley Gearhardt, U-M assistant professor of psychology.

Unprocessed foods with no added fat or refined carbohydrates, such as brown rice and salmon, were not associated with addictive-like eating behavior.

Individuals with symptoms of food addiction or with higher body mass indexes reported greater problems with highly processed foods, suggesting some may be particularly sensitive to the possible “rewarding” properties of these foods, said Erica Schulte, a U-M psychology doctoral student and the study’s lead author.

Read More.

Brain mapping study reveals hitherto unknown cell types in cerebral cortex

From Karolinska Institutet Newsroom:

New brain mapping reveals unknown cell types

18 February 2015 Karolinska Institutet

Using a process known as single cell sequencing, scientists at Karolinska Institutet have produced a detailed map of cortical cell types and the genes active within them. The study, which is published in the journal Science, marks the first time this method of analysis has been used on such a large scale on such complex tissue. The team studied over three thousand cells, one at a time, and even managed to identify a number of hitherto unknown types.

“If you compare the brain to a fruit salad, you could say that previous methods were like running the fruit through a blender and seeing what colour juice you got from different parts of the brain,” says Sten Linnarsson, senior researcher at the Department of Medical Biochemistry and Biophysics. “But in recent years we’ve developed much more sensitive methods of analysis that allow us to see which genes are active in individual cells. This is like taking pieces of the fruit salad, examining them one by one and then sorting them into piles to see how many different kinds of fruit it contains, what they’re made up of and how they interrelate.”

The knowledge that all living organisms are built up of cells is almost 200 years old. Since the discovery was made by a group of 19th century German scientists, we have also learnt that the nature of a particular body tissue is determined by its constituent cells, which are, in turn, determined by which genes are active in their DNA. However, little is still known about how this happens in detail, especially as regards the brain, the body’s most complex organ.

In the present study, the scientists used large-scale single-cell analysis to answer some of these questions. By studying over three thousand cells from the cerebral cortex in mice, one at a time and in detail, and comparing which of the 20,000 genes were active in each one, they were able to sort the cells into virtual piles. They identified 47 different kinds of cell, including a large proportion of specialised neurons, some blood vessel cells and glial cells, which take care of waste products, protect against infection and supply nerve cells with nutrients.
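
Computationally, the “virtual piles” described above are clusters of cells with similar gene-activity profiles. The sketch below illustrates that idea on invented toy data with a basic k-means-style loop; it is not the clustering method the study actually used, and all numbers (cell counts, genes, expression levels) are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy expression matrix: 60 cells x 5 genes, two cell types with
# distinct gene-activity profiles (type A high in genes 0-1,
# type B high in genes 3-4). Counts drawn from Poisson distributions.
type_a = rng.poisson(lam=[9, 8, 1, 1, 1], size=(30, 5))
type_b = rng.poisson(lam=[1, 1, 1, 8, 9], size=(30, 5))
cells = np.vstack([type_a, type_b]).astype(float)

# Simple k-means: seed centroids with one cell of each type, then
# repeatedly assign every cell to its nearest centroid and update.
centroids = cells[[0, 59]]
for _ in range(10):
    dists = np.linalg.norm(cells[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    centroids = np.array([cells[labels == k].mean(axis=0) for k in (0, 1)])

# Cells of the same type should end up in the same "pile".
print(len(set(labels[:30])), len(set(labels[30:])))
```

Real single-cell studies work on thousands of cells and tens of thousands of genes, with far more careful normalization and clustering, but the underlying idea of grouping cells by which genes are active is the same.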

With the help of this detailed map, the scientists were able to identify hitherto unknown cell types, including a nerve cell in the most superficial cortical layer, and six different types of oligodendrocyte, which are cells that form the electrically insulating myelin sheath around the nerve cells. The new knowledge the project has generated can shed more light on diseases that affect the myelin, such as multiple sclerosis (MS).

“We could also confirm previous findings, such as that the pyramidal cells of the cerebral cortex are functionally organised in layers,” says Jens Hjerling-Leffler, who co-led the study with Dr Linnarsson. “But above all, we have created a much more detailed map of the cells of the brain that describes each cell type in detail and shows which genes are active in it. This gives science a new tool for studying these cell types in disease models and helps us to understand better how brain cells respond to disease and injury.”

There are estimated to be 100 million cells in a mouse brain, and 65 billion in a human brain. Nerve cells are approximately 20 micrometres in diameter, glial cells about 10 micrometres. A micrometre is equivalent to a thousandth of a millimetre.

The study was carried out by Sten Linnarsson’s and Jens Hjerling-Leffler’s research groups at the Department of Medical Biochemistry and Biophysics, in particular by Amit Zeisel and Ana Muñoz Manchado. It also involved researchers from Karolinska Institutet’s Department of Oncology-Pathology, and Uppsala University.

Read more.

Scientists discover auditory nociceptive pathway from cochlea to brain

From Northwestern University Feinberg School of Medicine News Center:

Auditory Pain Pathway May Protect Against Hearing Loss

Read more.

Risk of psychosis five times higher with daily use of high potency cannabis

Patterns of cannabis use between patients with first-episode psychosis and population controls

Figure retrieved from The Lancet.

From King’s College London News:

‘Skunk-like’ cannabis associated with 24% of new psychosis cases

Posted on 16/02/2015

Scientists have found that 24% of all new cases of psychosis are associated with the use of high potency ‘skunk-like’ cannabis.  In addition, the risk of psychosis is three times higher for potent ‘skunk-like’ cannabis users and five times higher for those who use it every day, according to research from the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) at King’s College London, published today in Lancet Psychiatry.

The findings, based on a study of nearly 800 people aged 18-65 in South London, have major implications for prevention of cannabis-associated psychosis as well as developing new treatments.

“Compared with those who had never tried cannabis, users of high potency ‘skunk-like’ cannabis had a threefold increase in risk of psychosis,” said Dr Marta Di Forti from the IoPPN, King’s College London, and lead author on the research. “The risk to those who use every day was even higher; a fivefold increase compared to people who never use.

“The results show that psychosis risk in cannabis users depends on both the frequency of use and cannabis potency.  The use of hash was not associated with increased risk of psychosis,” she added.

Sir Robin Murray, Professor of Psychiatric Research at the IoPPN at King’s and senior researcher on the study stated: “It is now well known that use of cannabis increases the risk of psychosis. However, sceptics still claim that this is not an important cause of schizophrenia-like psychosis. This paper suggests that we could prevent almost one quarter of cases of psychosis if no-one smoked high potency cannabis. This could save young patients a lot of suffering and the NHS a lot of money.”

Between 2005 and 2011, researchers worked with 410 patients aged 18-65 who reported a first episode of psychosis at the South London and Maudsley NHS Foundation Trust. A further 370 healthy participants from the same area of South London were included as controls. A main finding was that the frequency of use and cannabis potency, which are often overlooked when determining how harmful the drug can be, are essential factors in the mental health effects on users. These factors are not sufficiently considered by doctors.
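
The “almost one quarter of cases” figure quoted above is a population attributable fraction. As a worked illustration of the arithmetic only, the sketch below combines the reported fivefold risk with a hypothetical exposure prevalence (the 8% figure is invented for this example, not taken from the paper):

```python
def attributable_fraction(prevalence, relative_risk):
    """Levin's population attributable fraction:
    PAF = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Hypothetical prevalence of daily high-potency cannabis use (0.08),
# combined with the reported fivefold increase in psychosis risk.
paf = attributable_fraction(prevalence=0.08, relative_risk=5)
print(f"{paf:.0%}")  # roughly a quarter of cases attributable
```

Under these assumed inputs the formula yields about 24%, matching the scale of the headline figure; the paper’s own estimate rests on the exposure prevalence actually measured in its South London sample.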

Read more.

Treating uninjured side of brain helps limit damage after stroke

From Georgia Regents University and Health News:

Treating the uninjured side of the brain appears to aid stroke recovery

Read more.

Damage to brain from chronic methamphetamine use is greater in adolescents

From University of Utah Health Care News:

Feb 11, 2015 1:28 PM

Meth Damages Adolescent Brains Far More than Those of Adults, Study Finds

Perry F. Renshaw, M.D., Ph.D., M.B.A.

(SALT LAKE CITY)—Adolescents who chronically use methamphetamine suffer greater and more widespread alterations in their brain than adults who chronically abuse the drug, and damage is particularly evident in a part of the brain believed to control the “executive function,” researchers from the University of Utah and South Korea report.

In a study with chronic adolescent and adult meth abusers in South Korea, MRI brain scans showed decreased thickness in the gray matter of younger users’ frontal cortex, the area of the brain believed to direct people’s ability to organize, reason and remember things, known as the executive function. A different type of MRI, diffusion tensor imaging (DTI), indicated alterations to the adolescents’ white matter, meaning possible damage to neurons–the cells that relay information via electrical signals from one part of the brain to another. The gray and white matter of chronic adult meth users showed far less damage than that of the adolescents.

The researchers found the evidence of damage to cortical thickness in the frontal cortex of adolescent users alarming.

“It’s particularly unfortunate that meth appears to damage that part of the brain, which is still developing in young people and is critical for cognitive ability,” says In Kyoon Lyoo, M.D., Ph.D., of Ewha W. University in Seoul, South Korea. “Damage to that part of the brain is especially problematic because adolescents’ ability to control risky behavior is less mature than that of adults. The findings may help explain the severe behavioral issues and relapses that are common in adolescent drug addiction.”

Lyoo is first author on the study, published Feb. 10, 2015, in Molecular Psychiatry online. Perry F. Renshaw, M.D., Ph.D., M.B.A., University of Utah USTAR investigator and professor of psychiatry, is the study’s senior author. The results also indicate that it might take much less meth to cause greater damage in adolescent brains because youths typically use smaller amounts of the drug than adults.

Meth is one of the most widely abused drugs in Asia, but it’s also a problem in the United States, with the Western region of the country experiencing the highest rates of use. Studies with rodents have shown that meth damages the brains of adult rats more than young ones, but whether that holds true in people has been cause for debate.

In one of the largest studies of its type, Lyoo, Renshaw and their colleagues scanned the brains of 111 South Korean adolescents and 114 adults. Among the younger people, 51 used meth while 60 did not. The adults included 54 meth users and 60 non-users.

“There is a critical period of brain development for specific functions, and it appears that adolescents who abuse methamphetamine are at great risk for derailing that process,” Renshaw says. “I think the results show it is hugely important to keep kids off drugs.”

Read more.

Naturally occurring humanin peptide may protect against cell death after TBI, stroke, or heart attack

Since naturally occurring humanin provides this protective effect, why not use it instead of analogues which are not exact copies?  The answer, of course, is patents and money.

From EurekAlert:

Public Release: 8-Feb-2015

Promising peptide for TBI, heart attack and stroke

Caption: AGA(C8R)-HNG17 and the mitochondrial tracker tetramethylrhodamine methyl ester in PC-12 cells (rat pheochromocytoma, of neuronal origin) 10 min after inducing necrosis by cyanide, exhibiting co-localization of humanin and mito-tracker at the mitochondria. Both trackers are co-localized where their lifetime is the longest. Credit: Parola/Ben-Gurion University

Researchers at Ben Gurion University of the Negev and Soroka University Medical Center explore the uses of a molecule called humanin to halt necrosis – stopping cell death in its tracks

Biophysical Society

WASHINGTON, D.C., February 8, 2015 — Strokes, heart attacks and traumatic brain injuries are separate diseases with certain shared pathologies that achieve a common end – cell death and human injury due to hypoxia, or lack of oxygen. In these diseases, a lack of blood supply to affected tissues begins a signaling pathway that ultimately halts the production of energy-releasing ATP molecules – a death sentence for most cells.

By employing derivatives of humanin, a naturally occurring peptide encoded in the genome of cellular mitochondria, researchers at Ben Gurion University of the Negev are working to interrupt this process, buying precious time for tissues whose cellular mechanisms have called it quits.

“The present findings could provide a new lead compound for the development of drug therapies for necrosis-related diseases such as traumatic brain injury, stroke and myocardial infarction – conditions for which no effective drug-based treatments are currently available [that work by blocking necrosis],” said Abraham Parola, a professor of biophysical chemistry at Ben-Gurion University of the Negev in Beer-Sheva, Israel. Parola is presently a visiting professor of Biophysical Chemistry & Director of Natural Sciences at New York University Shanghai, and will speak about his lab’s findings this week at the Biophysical Society’s 59th annual meeting in Baltimore, Md.

The humanin derivatives work by counteracting the decrease in ATP levels caused by necrosis. The researchers tested the effectiveness of the humanin analogues AGA(C8R)-HNG17 and AGA-HNG by treating neuronal cells with these peptides prior to exposure to a necrotic agent. The experiments were a success.

Parola’s previous work has dealt with membrane dynamics and the mechanism of action of anti-angiogenesis drugs, which cause starvation of malignant tumor growths by preventing the supply of nutrients and oxygen to the fast growing tissue, in addition to various other biophysical and molecular medicine and diagnostic topics.

“A recent paper published by our group suggested the involvement of cardiolipin [a phospholipid in inner mitochondrial membranes] in the necrotic process,” Parola said. “During this work we stumbled upon humanin and were intrigued by its anti-apoptotic effect, and extended it to an anti-necrotic effect.”

Parola and his colleagues also performed in vivo studies by treating mice that had had traumatic brain injuries with an HNG17 analogue, which successfully reduced cranial fluid buildup and lowered the mice’s neurological severity scores, a metric in which a higher number corresponds with greater degrees of neurological motor impairment.

As the peptides Parola and his colleagues used are derivatives of naturally occurring humanin, an ideal treatment might involve a drug delivery system with the HNG17 as the lead compound, a process aided by the ability of the peptides to penetrate the cell membrane without the use of additional reagents.

Read more.

Brain uses statistically optimal procedure in decision-making

From ScienceDaily:

Similar statistics play role in decision-making and World War II code breaking

Date: February 5, 2015
—————–
“The brain reaches a decision by combining samples of evidence in much the way a good statistician would,” says Michael Shadlen, a Professor of Neuroscience at the Kavli Institute for Brain Science at Columbia University. In a new paper in Neuron, Shadlen and colleagues from the University of Washington and the Shanghai Institutes for Biological Sciences demonstrate this theory by monitoring the decision-making process in rhesus monkeys to determine how much and what information they need to confidently choose a correct answer.

The monkeys were shown a sequence of shapes that served as clues about the location of a reward. They could look at as few or as many such clues as they wished before making their choice. The scientists found that the monkeys’ neurons increased or decreased their activity depending on whether a shape in the sequence supported one or the other location (or color). The process halted when the accumulated evidence reached a critical level. This strategy explained both the choice and the number of shapes used to make it.

“It’s the brain doing a statistically optimal procedure,” Shadlen says. “It’s nothing less than a basis of rationality. The brain allows us to combine apples and oranges and lemons, so to speak, by assigning them the right kinds of weights so that when we put them together we reason according to the laws of probability.”

This statistical way of decision-making resembles a process used by Alan Turing’s team at Bletchley Park, England, to work out the settings of German Enigma machines. In order to make use of the large clicking machine (called ‘Christopher’ in the recent historical drama “The Imitation Game”), Turing’s team analyzed pairs of randomly intercepted German messages, aligning them one above the other to accumulate evidence from letter pairs (matched or not) until they reached a threshold level of certainty that the messages were, or were not, sent on identically set Enigma machines. Once the threshold was reached, the code breaker would either accept or reject the hypothesis.
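The accumulate-to-threshold strategy described above (often modeled as a sequential probability ratio test) can be sketched in a few lines of code. The shape weights and decision bound below are hypothetical values chosen for illustration, not figures from the study: each observed clue contributes a log-likelihood-ratio-style weight for one hypothesis or the other, and sampling stops as soon as the running total crosses a bound.

```python
def accumulate_to_threshold(log_likelihood_ratios, threshold):
    """Sum log-likelihood ratios sample by sample until the running
    total crosses +threshold (choose H1) or -threshold (choose H0).
    Returns (decision, number_of_samples_used)."""
    total = 0.0
    for i, llr in enumerate(log_likelihood_ratios, start=1):
        total += llr
        if total >= threshold:
            return "H1", i
        if total <= -threshold:
            return "H0", i
    return "undecided", len(log_likelihood_ratios)

# Hypothetical weights for four shapes; positive values favor H1.
shape_weights = {"square": 0.9, "circle": 0.5, "star": -0.5, "cross": -0.9}
observed = ["square", "circle", "square", "square"]
decision, n = accumulate_to_threshold(
    [shape_weights[s] for s in observed], threshold=2.0)
```

With the sample sequence above, the running total crosses the bound after the third shape, so a decision is reached without inspecting the fourth clue, mirroring how the monkeys stopped sampling once the evidence sufficed.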

Read more. 

Brains of cognitive SuperAgers have thicker anterior cingulate cortex, fewer tangles, and more von Economo neurons

From Northwestern University Newscenter:

SuperAger Brains Yield New Clues to Their Remarkable Memories

Brains of the cognitively elite look distinctly different from those of their elderly peers

February 3, 2015 | by Marla Paul

CHICAGO – SuperAgers, aged 80 and above, have brains that look distinctly different from those of normal older people, according to new Northwestern Medicine® research that is beginning to reveal why the memories of these cognitively elite elders don’t suffer the usual ravages of time.

SuperAgers have memories that are as sharp as those of healthy persons decades younger.

Understanding their unique “brain signature” will enable scientists to decipher the genetic or molecular source and may foster the development of strategies to protect the memories of normal aging persons as well as treat dementia.

Published Jan. 28 in the Journal of Neuroscience, the study is the first to quantify brain differences of SuperAgers and normal older people.

Cognitive SuperAgers were first identified in 2007 by scientists at Northwestern’s Cognitive Neurology and Alzheimer’s Disease Center at Northwestern University Feinberg School of Medicine.

Their unusual brain signature has three common components when compared with normal persons of similar ages: a thicker region of the cortex; significantly fewer tangles (a primary marker of Alzheimer’s disease); and a whopping supply of a specific neuron, the von Economo neuron, linked to higher social intelligence.

“The brains of the SuperAgers are either wired differently or have structural differences when compared to normal individuals of the same age,” said Changiz Geula, study senior author and a research professor at the Cognitive Neurology and Alzheimer’s Disease Center. “It may be one factor, such as expression of a specific gene, or a combination of factors that offers protection.”

The Center has a new NIH grant to continue the research.

“Identifying the factors that contribute to the SuperAgers’ unusual memory capacity may allow us to offer strategies to help the growing population of ‘normal’ elderly maintain their cognitive function and guide future therapies to treat certain dementias,” said Tamar Gefen, the first study author and a clinical neuropsychology doctoral candidate at Feinberg.

MRI imaging and an analysis of the SuperAger brains after death show the following brain signature:

1) MRI imaging showed the anterior cingulate cortex of SuperAgers (31 subjects) was not only significantly thicker than the same area in aged individuals with normal cognitive performance (21 subjects), but also larger than the same area in a group of much younger, middle-aged individuals (ages 50 to 60, 18 subjects). This region is indirectly related to memory through its influence on related functions such as cognitive control, executive function, conflict resolution, motivation and perseverance.

2) Analysis of the brains of five SuperAgers showed the anterior cingulate cortex had approximately 87 percent fewer tangles than age-matched controls and 92 percent fewer tangles than individuals with mild cognitive impairment. The neurofibrillary brain tangles, twisted fibers consisting of the protein tau, strangle and eventually kill neurons.

3) The number of von Economo neurons was approximately three to five times higher in the anterior cingulate of SuperAgers compared with age-matched controls and individuals with mild cognitive impairment.

“It’s thought that these von Economo neurons play a critical role in the rapid transmission of behaviorally relevant information related to social interactions,” Geula said, “which is how they may relate to better memory capacity.” These cells are present in such species as whales, elephants, dolphins and higher apes.

Read more.

———————–

See also:

Extreme fatigue from antipsychotic medications leaves many patients in a zombie-like state

From MedicalXpress:

Antipsychotic meds prompt zombie-like state among patients

by Rob Payne

Interviews with community members who are taking antipsychotic medication for mental health problems have added to growing concerns about how the drugs are administered, their effectiveness against placebo and the severity of their side-effects.

The recent research also touches on how stigma can lead individuals to ‘just put up with the drugs’ despite not believing they help.

While their sample size is limited, Murdoch University and the University of Queensland researchers say insight into lived experience is invaluable.

“People using antipsychotic medications experience adverse side-effects that reach into their physical, social and emotional lives, and cause a level of fear and suffering that is difficult for anyone else to fully comprehend,” Murdoch Professor Paul Morrison says.

“The proportion that experiences a disturbing side-effect has been estimated at between 50 and 70 per cent, and participants in our study reported on average between six and seven medication side-effects.

“It is difficult for an outsider to appreciate what this means to individual consumers, and how it impacts on their self-image and ability to cope.”

Side-effects can include Parkinsonism, akathisia (restlessness) and tardive dyskinesia (involuntary movements), as well as weight gain, hypersomnia, insomnia, sexual dysfunction, dry mouth, constipation and dizziness.

The most profound side-effect is extreme fatigue, which leaves many in a ‘zombie state’.

Participants engulfed by hopelessness

Disturbingly, researchers found participants often exhibited ‘a culture of hopelessness’ where acceptance was dominant, which they warn can destroy an individual’s will to recover.

“The issue here is the extent to which people with a mental illness have been conditioned into accepting the disabling effects of their medication without protest,” Prof Morrison says.

“The ability of mental health staff to forestall protest arises from the guilt communities thrust upon the sufferer.

“Without this guilt and shame, would consumers and their loved ones be so ready to accept that a life of zombie-like consciousness and physical discomfort is preferable to hearing voices, or would they be demanding more intensive efforts to develop ‘cleaner’ medications?”

The study advocates creating a standardised rating scale for assessing and monitoring side-effects and better communication between practitioners and those taking medications.

The research suggests psychosocial treatment methods should be explored, such as relaxation and distraction techniques, which have been proven to improve quality of life.

Read more.

Resveratrol improves learning and memory in aged rats and increases neurogenesis

From Texas A&M University news release:

Compound Found In Grapes, Red Wine May Help Prevent Memory Loss

Released: 4-Feb-2015 4:10 PM EST
Source Newsroom: Texas A&M University

 

Newswise — A compound found in common foods such as red grapes and peanuts may help prevent age-related decline in memory, according to new research published by a faculty member in the Texas A&M Health Science Center College of Medicine.

Ashok K. Shetty, Ph.D., a professor in the Department of Molecular and Cellular Medicine and Director of Neurosciences at the Institute for Regenerative Medicine, has been studying the potential benefit of resveratrol, an antioxidant that is found in the skin of red grapes, as well as in red wine, peanuts and some berries.

Resveratrol has been widely touted for its potential to prevent heart disease, but Shetty and a team that includes other researchers from the health science center believe it also has positive effects on the hippocampus, an area of the brain that is critical to functions such as memory, learning and mood.

Because both humans and animals show a decline in cognitive capacity after middle age, the findings may have implications for treating memory loss in the elderly. Resveratrol may even be able to help people afflicted with severe neurodegenerative conditions such as Alzheimer’s disease.

In a study published online Jan. 28 in Scientific Reports, Shetty and his research team members reported that treatment with resveratrol had apparent benefits in terms of learning, memory and mood function in aged rats.

“The results of the study were striking,” Shetty said. “They indicated that for the control rats who did not receive resveratrol, spatial learning ability was largely maintained but ability to make new spatial memories significantly declined between 22 and 25 months. By contrast, both spatial learning and memory improved in the resveratrol-treated rats.”

Shetty said neurogenesis (the growth and development of neurons) approximately doubled in the rats given resveratrol compared to the control rats. The resveratrol-treated rats also had significantly improved microvasculature, indicating improved blood flow, and had a lower level of chronic inflammation in the hippocampus.

“The study provides novel evidence that resveratrol treatment in late middle age can help improve memory and mood function in old age,” Shetty said.

Read more.

———————

Related:

 

Older adults with musical training during youth are faster in identifying speech sounds

From EurekAlert:

Public Release: 2-Feb-2015

More evidence that musical training protects the brain

Baycrest Centre for Geriatric Care

Toronto, CANADA – Scientists have found some of the strongest evidence yet that musical training in younger years can prevent the decay in speech listening skills in later life.

According to a new Canadian study led by the Rotman Research Institute (RRI) at Baycrest Health Sciences, older adults who had musical training in their youth were 20% faster in identifying speech sounds than their non-musician peers on speech identification tests, a benefit that has already been observed in young people with musical training.

The findings are published in The Journal of Neuroscience (Jan. 21).

Among the different cognitive functions that can diminish with age is the ability to comprehend speech. Interestingly, this difficulty can persist in the absence of any measurable hearing loss. Previous research has confirmed that the brain’s central auditory system – which supports the ability to parse, sequence and identify acoustic features of speech – weakens in later years.

Starting formal lessons on a musical instrument prior to age 14 and continuing intense training for up to a decade appears to enhance key areas in the brain that support speech recognition. The Rotman study found “robust” evidence that this brain benefit is maintained even in the older population.

“Musical activities are an engaging form of cognitive brain training and we are now seeing robust evidence of brain plasticity from musical training not just in younger brains, but in older brains too,” said Gavin Bidelman, who led the study as a post-doctoral fellow at the RRI and is now an assistant professor at the University of Memphis.

“In our study we were able to predict how well older people classify or identify speech using EEG imaging. We saw a brain-behaviour response that was two to three times better in the older musicians compared to non-musician peers. In other words, old musicians’ brains provide a much more detailed, clean and accurate depiction of the speech signal, which is likely why they are much more sensitive and better at understanding speech.”

Bidelman received a GRAMMY Foundation research grant to conduct the study and partnered with senior scientist Claude Alain, assistant director of Baycrest’s RRI and a leading authority in the study of age-related differences in auditory cortical activity.

The latest findings add to mounting evidence that musical training not only gives young developing brains a cognitive boost, but those neural enhancements extend across the lifespan into old age when the brain needs it most to counteract cognitive decline. The findings also underscore the importance of music instruction in schools and in rehabilitative programs for older adults.

In this study, 20 healthy older adults (aged 55-75) – 10 musicians and 10 non-musicians – put on headphones in a controlled lab setting and were asked to identify random speech sounds. Some of the sounds were single vowel sounds such as an “ooo” or an “ahhh”; others were more ambiguous, a mix of two sounds that posed a greater challenge to their auditory processing abilities for categorizing the speech sound correctly.

During the testing cycles, researchers recorded the neural activity of each participant using electroencephalography (EEG). This brain imaging technique measures to a very precise degree the exact timing of the electrical activity which occurs in the brain in response to external stimuli. This is displayed as waveforms on a computer screen. Researchers use this technology to study how the brain makes sense of our complex acoustical environment and how aging impacts cognitive functions.

According to Bidelman and Alain’s published paper, the older musicians’ brain responses showed “more efficient and robust neurophysiological processing of speech at multiple tiers of auditory processing, paralleling enhancements reported in younger musicians.”

Read more.
 
—————————–

Related:

More Evidence Music Training Boosts Brainpower

NFL players who played tackle football before age 12 more likely to have cognitive problems as adults

From Boston University Research:

Football: Child’s Play, Adult Peril?

MED study: NFL vets starting football under age 12 at increased cognitive impairment risk

Study finds that NFL players who participated in tackle football before age 12 were more likely to have memory and thinking problems as adults.

As the 100 million viewers tuning in to this Sunday’s Super Bowl can attest, Americans adore football. And for many, the love affair begins in childhood: Pop Warner Tiny-Mites start as young as age five, and many adults retain warm memories and friendships from their youth football days.

But a new study from BU School of Medicine researchers points to a possible increased risk of cognitive impairment from playing youth football. The National Institutes of Health–funded study, published online in the January 28, 2015, edition of the journal Neurology, finds that former National Football League players who participated in tackle football before the age of 12 are more likely to have memory and thinking problems as adults.

The study contradicts conventional wisdom that children’s more plastic brains might recover from injury better than those of adults, and suggests that they may actually be more vulnerable to repeated head impacts, especially if injuries occur during a critical period of growth and development.

“Sports offer huge benefits to kids, as far as work ethic, leadership, and fitness, and we think kids should participate,” says study lead author Julie Stamm (MED’15), a PhD candidate in anatomy and neurobiology. “But there’s increasing evidence that children respond differently to head trauma than adults. Kids who are hitting their heads over and over during this important time of brain development may have consequences later in life.”

“This is one study, with limitations,” adds study senior author Robert Stern, a MED professor of neurology, neurosurgery, and anatomy and neurobiology and director of the Alzheimer’s Disease Center’s Clinical Core. “But the findings support the idea that it may not make sense to allow children—at a time when their brain is rapidly developing—to be exposed to repetitive hits to the head. If larger studies confirm this one, we may need to consider safety changes in youth sports.”

In the study, researchers reexamined data from BU’s ongoing DETECT (Diagnosing and Evaluating Traumatic Encephalopathy Using Clinical Tests) study, which aims to develop methods of diagnosing chronic traumatic encephalopathy (CTE) during life. CTE is a neurodegenerative disease often found in professional football players, boxers, and other athletes who have a history of repetitive brain trauma. It can currently be diagnosed only by autopsy.


For this latest study, scientists examined test scores of 42 former NFL players, with an average age of 52, all of whom had experienced memory and thinking problems for at least six months. Half the players had played tackle football before age 12, and half had not. Significantly, the total number of concussions was similar between the two groups. Researchers found that the players exposed to tackle football before age 12 had greater impairment in mental flexibility, memory, and intelligence—a 20 percent difference in some cases. These findings held up even after statistically removing the effects of the total number of years the participants played football. Both groups scored below average on many of the tests.
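The phrase “statistically removing the effects of the total number of years” refers to covariate adjustment. One common approach, residualization, can be sketched as follows: regress the test score on years played, then compare group means of the residuals. The numbers below are made up for illustration, not study data, and the authors’ actual statistical model may differ.

```python
def simple_ols(x, y):
    """Fit y = a + b*x by ordinary least squares; return (a, b)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def adjusted_group_diff(scores, years, group):
    """Mean score difference (group 1 minus group 0) after
    regressing out years played via residualization."""
    a, b = simple_ols(years, scores)
    resid = [s - (a + b * yr) for s, yr in zip(scores, years)]
    g1 = [r for r, g in zip(resid, group) if g == 1]
    g0 = [r for r, g in zip(resid, group) if g == 0]
    return sum(g1) / len(g1) - sum(g0) / len(g0)

# Toy data: group 1 (played before age 12) scores lower even
# though years played also raises scores slightly.
diff = adjusted_group_diff(
    scores=[70, 72, 80, 82],
    years=[10, 12, 10, 12],
    group=[1, 1, 0, 0])
```

In this toy data both group membership and years played influence the score; residualizing on years isolates the remaining group difference, which is the kind of check the researchers describe.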

“We were surprised by how striking the results were,” says Stamm. “Every single test was significantly different, by a lot.”

Stamm says that the researchers were especially surprised by the scores on a reading test called the WRAT-4, which has participants read words of increasing difficulty. A person’s score depends on the ability to pronounce the words correctly, indicating the person’s familiarity with complex vocabulary. The low scores may be significant, she says, because they suggest that repeated head trauma at a young age might limit peak intelligence. She emphasizes, however, that there may be other reasons for a low score, and that more research is needed.

The authors chose age 12 as the cutoff because significant peaks in brain development occur in boys around that age. (This happens for girls a bit earlier, on average.) Around age 12, says Stern, blood flow to the brain increases, and brain structures such as the hippocampus, which is critical for memory, reach their highest volume. Boys’ brains also reach a peak in their rate of myelination—the process in which the long tendrils of brain cells are coated with a fatty sheath, allowing neurons to communicate quickly and efficiently. Because of these developmental changes, Stern says, this age may possibly represent a “window of vulnerability,” when the brain may be especially sensitive to repeated trauma.

“If you take just the hippocampus, that’s a really important part of your brain,” he says. “It may be that if you hit your head a lot during this important period, you might have significant memory problems later on.”

Stern adds that another group of researchers, using accelerometers in helmets to measure the number and severity of hits in football players aged 9 to 12, found that players received an average of 240 high-magnitude hits per season, sometimes with a force similar to that experienced by high school and college players.

With approximately 4.8 million athletes playing youth football in the United States, the long-term consequences of brain injury represent a growing public health concern. This study comes at a time of increasing awareness of the dangers of concussions—and subconcussive hits—in youth sports like football, hockey, and soccer. In 2012, Pop Warner football, the oldest and largest youth football organization in the country, changed its rules to limit contact during practices and banned intentional head-to-head contact. When reached by phone at the organization’s headquarters in Langhorne, Pa., a Pop Warner spokesman declined to comment on the study until they had more time to examine the results in detail.


“Football has the highest injury rate among team sports,” writes Christopher M. Filley, a fellow with the American Academy of Neurology, in an editorial accompanying the Neurology article. “Given that 70 percent of all football players in the United States are under the age of 14, and every child aged 9 to 12 can be exposed to 240 head impacts during a single football season, a better understanding of how these impacts may affect children’s brains is urgently needed.”

Filley’s editorial cautions that the study has limitations: because the researchers could not precisely determine the players’ lifetime number of head impacts, it may be the total number of hits—rather than the age of a player—that is the more critical measurement. In addition, because the study focuses on professional athletes, the results may not apply to recreational players who participated in youth football, but did not play beyond high school.

Read more.