Re-energizing the aging brain

Selected Coverage: Science Daily

The human brain has a prodigious demand for energy — 20 to 30% of the body’s energy budget. In the course of normal aging, in people with neurodegenerative diseases or mental disorders, or in periods of physiological stress, the supply of sugars to the brain may be reduced. This leads to a reduction in the brain’s energy reserves, which in turn can lead to cognitive decline and loss of memory.

But new research on mice shows that the brain’s energy reserves can be increased with a daily dose of pyruvate, a small energy-rich molecule that sits at the hub of most of the energy pathways inside the cell. These results need to be replicated in human subjects, but could ultimately lead to clinical applications.

“In our new study, we show that long-term dietary supplementation with pyruvate increases the energy reserves in the brain, at least in mice, in the form of the molecules glycogen, creatine and lactate,” says lead author Heikki Tanila, Professor of Molecular Neurobiology at the A. I. Virtanen Institute of the University of Eastern Finland.

What’s more, dietary supplementation with pyruvate didn’t only increase the brain’s energy stores: it also changed the behavior of the mice in positive ways, the researchers report.

“The mice became more energetic and increased their explorative activity. It appears that these behavioral changes are directly due to the effect of pyruvate on brain function, since we didn’t find that these mice had developed greater muscle force or endurance,” says Tanila.

For example, chronic supplementation with pyruvate facilitated the spatial learning of middle-aged (6- to 12-month-old) mice, made them more interested in the odor of unfamiliar mice, and stimulated them to perform so-called “rearing”, an exploratory behavior in which mice stand on their hind legs and investigate their surroundings.

The dose necessary to achieve these effects was about 800 mg of pyruvate per day, corresponding to roughly 10 g per day in humans, given to the mice in normal chow over a period of 2.5 to 6 months. A single large dose of pyruvate injected directly into the bloodstream had no detectable effect.

Interestingly, the positive response to dietary supplementation with pyruvate was also found in a strain of transgenic mice called APPswe/PS1dE9, often used as an animal model for the study of Alzheimer’s disease. These mice exhibit many of the same symptoms as people with Alzheimer’s, such as the deposition of protein plaques in the brain, neurodegeneration, and cognitive decline. These results raise hopes that pyruvate might also benefit people with neurodegenerative disorders such as Alzheimer’s and Parkinson’s.

“Pyruvate supplementation may prove beneficial as an activating treatment for the elderly and in therapies for alleviating cognitive decline due to aging, neurodegenerative disease, or mental disorders. It is well tolerated and warrants further studies in humans,” says Tanila.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2016-03/f-rta031016.php

Study: http://journal.frontiersin.org/article/10.3389/fnagi.2016.00041/full

 

Fifteen shades of photoreceptor in a butterfly’s eye

Selected coverage: ABC, Christian Science Monitor, Science Magazine, Süddeutsche Zeitung

When researchers studied the eyes of Common Bluebottles, a species of swallowtail butterfly from Australasia, they were in for a surprise. These butterflies have large eyes and use their blue-green iridescent wings for visual communication – evidence that their vision must be excellent. Even so, no-one expected to find that Common Bluebottles (Graphium sarpedon) have at least 15 different classes of “photoreceptors” — light-detecting cells comparable to the rods and cones in the human eye. Previously, no insect was known to have more than nine.

“We have studied color vision in many insects for many years, and we knew that the number of photoreceptors varies greatly from species to species. But this discovery of 15 classes in one eye was really stunning,” says Kentaro Arikawa, Professor of Biology at Sokendai (the Graduate University for Advanced Studies), Hayama, Japan and lead author of the study.

Having multiple classes of photoreceptors is indispensable for seeing color. Each class is stimulated by light of certain wavelengths, and less so, or not at all, by others. By comparing the information received from the different photoreceptor classes, the brain is able to distinguish colors.

Through physiological, anatomical and molecular experiments, Arikawa and colleagues were able to determine that Common Bluebottles have 15 photoreceptor classes, one stimulated by ultraviolet light, another by violet, three stimulated by slightly different blue lights, one by blue-green, four by green lights, and five by red lights.
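As a rough sketch of this comparison principle (a toy model with made-up Gaussian sensitivity curves and peak wavelengths, not the butterfly’s measured photoreceptors), the snippet below shows how two lights of equal overall energy produce different patterns of responses across receptor classes, which is what lets a downstream circuit tell colors apart independently of brightness:

```python
import numpy as np

# Toy model: three hypothetical receptor classes with Gaussian spectral
# sensitivities (peak wavelengths are illustrative, not measured values).
wavelengths = np.arange(300, 701)  # nm
peaks = {"uv": 360, "blue": 460, "green": 550}
sensitivity = {name: np.exp(-((wavelengths - peak) ** 2) / (2 * 40.0 ** 2))
               for name, peak in peaks.items()}

def receptor_responses(spectrum):
    """Integrate spectrum x sensitivity for each receptor class."""
    return {name: float(np.trapz(s * spectrum, wavelengths))
            for name, s in sensitivity.items()}

# Two lights normalized to the same total energy but with different spectra.
light_a = np.exp(-((wavelengths - 450) ** 2) / (2 * 30.0 ** 2))
light_b = np.exp(-((wavelengths - 560) ** 2) / (2 * 30.0 ** 2))
light_a /= np.trapz(light_a, wavelengths)
light_b /= np.trapz(light_b, wavelengths)

# The pattern of responses across classes differs between the two lights,
# so comparing classes distinguishes color even at equal total energy.
print("light A:", receptor_responses(light_a))
print("light B:", receptor_responses(light_b))
```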

Why do Common Bluebottles need so many classes of photoreceptor? After all, many other insects have only three classes of photoreceptor and yet have excellent color vision. Likewise, humans have only three classes of cones, enough to distinguish millions of colors.

Arikawa and his colleagues believe that Common Bluebottles use only four classes of photoreceptor for routine color vision, and use the other eleven to detect very specific stimuli in the environment, for example fast-moving objects against the sky or colorful objects hidden among vegetation. A similar division of labor is found in another butterfly previously studied by the same research group, the Asian swallowtail Papilio xuthus, which has six photoreceptor classes.

“Butterflies may have slightly lower visual acuity than we do, but in many respects they enjoy a clear advantage over us: they have a very large visual field, a superior ability to pursue fast-moving objects and can even distinguish ultraviolet and polarized light. Isn’t it fascinating to imagine how these butterflies see their world?” says Arikawa.

The results are published in the open-access journal Frontiers in Ecology and Evolution.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2016-03/f-fso030116.php

Study: http://journal.frontiersin.org/article/10.3389/fevo.2016.00018/full

Linguists discover the best word order for giving directions

Selected coverage: NY Times, Christian Science Monitor, The Telegraph, The Independent, Daily Mail

Good directions start — literally — with the most obvious

To give good directions, it is not enough to say the right things: saying them in the right order is also important, shows a study in Frontiers in Psychology. Sentences that start with a prominent landmark and end with the object of interest work better than sentences where this order is reversed. These results could have direct applications in the fields of artificial intelligence and human-computer interaction.

“Here we show for the first time that people are quicker to find a hard-to-see person in an image when the directions mention a prominent landmark first, as in ‘Next to the horse is the man in red’, rather than last, as in ‘The man in red is next to the horse’,” says Alasdair Clarke from the School of Psychology at the University of Aberdeen, the lead author of the study.

Clarke et al. asked volunteers to focus on a particular human figure within the visually cluttered cartoons of the ‘Where’s Wally?’ children’s books (called ‘Where’s Waldo?’ in the USA and Canada). The volunteers were then instructed to explain, in their own words, how to find that figure quickly — no trivial task, as each cartoon contained hundreds of items. As expected, the volunteers often opted to indicate the position of the human figure relative to a landmark object in the cartoon, such as a building.

Example of “Where’s Wally?” image used in the experiment

What was surprising, however, was that they tended to use a different word order depending on the visual properties of the landmark. Landmarks that stood out strongly from the background — as measured with imaging software — were statistically likely to be mentioned at the beginning of the sentence, while landmarks that stood out little were typically mentioned at its end. But if the target figure itself stood out strongly, most participants mentioned that first.
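The study’s imaging-software measure of salience is not described here; purely as an illustration of what “standing out from the background” could mean computationally, a crude stand-in might score how far a region’s mean color lies from the mean color of the rest of the image (an assumption for demonstration only, not the authors’ method):

```python
import numpy as np

def toy_salience(image, box):
    """Crude stand-in for a salience score: distance between a region's mean
    color and the mean color of the rest of the image (illustrative only)."""
    r0, r1, c0, c1 = box
    mask = np.zeros(image.shape[:2], dtype=bool)
    mask[r0:r1, c0:c1] = True
    region_mean = image[mask].mean(axis=0)
    background_mean = image[~mask].mean(axis=0)
    return float(np.linalg.norm(region_mean - background_mean))

# Hypothetical scene: a red patch on a mostly green background stands out,
# while a patch of background color does not.
scene = np.zeros((100, 100, 3))
scene[:] = [0.2, 0.6, 0.2]             # green background
scene[40:60, 40:60] = [0.9, 0.1, 0.1]  # red "landmark"
print(toy_salience(scene, (40, 60, 40, 60)))  # large score: stands out
print(toy_salience(scene, (0, 20, 0, 20)))    # near zero: blends in
```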

In a separate experiment, the researchers show that the most frequently used word order, landmark first and target second, is also the most effective: people who heard descriptions in this order needed, on average, less time to find the human figure in the cartoon than people who heard descriptions in the reverse order.

These results suggest that people who give directions keep a mental record of which objects in an image are easy to see, prefer to use these as landmarks, and treat them differently than harder-to-see objects when planning the word order of descriptions. This strategy helps listeners to find the target quickly.

“Listeners start processing the directions before they’re finished, so it’s good to give them a head start by pointing them towards something they can find quickly, such as a landmark. But if the target your listener is looking for is itself easy to see, then you should just start your directions with that,” concludes co-author Micha Elsner, Assistant Professor at the Department of Linguistics, Ohio State University.

These results could help to develop computer algorithms for automatic direction-giving. “A long-term goal is to build a computer direction-giver that could automatically detect objects of interest in the scene and select the landmarks that would work best for human listeners,” says Clarke.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2015-12/f-ldt120315.php

Study: http://journal.frontiersin.org/article/10.3389/fpsyg.2015.01793/full

 

Scientists date the origin of the cacao tree to 10 million years ago

Selected coverage: Science Magazine, The Times Scotland, Daily Mail, Tech Times, Live Science

With Shauna Hay, Royal Botanic Garden Edinburgh, UK

New research shows that cacao trees evolved around 10 million years ago, earlier than previously believed. Considerable genetic variation might remain to be discovered among wild cacao populations, which could be crossbred with cultivated cacao for greater resistance to disease and climate change.

Chocolate, produced from the seeds of the cacao tree Theobroma cacao, is one of the most popular flavors in the world, with sales of around $100 billion per year. Yet, as worldwide demand increases, there are fears the industry will fail to cope with the growing public hunger for the product. The main problem, common to many crops, is the lack of genetic variation in cultivated cacao, which makes it vulnerable to pests and blights. Lack of genetic variation also puts cacao trees at risk from climate change, jeopardizing the long-term sustainability of the industry.

Now, however, new research suggests the cacao tree is much older than previously realised — and may have close relations capable of sustaining our sweet-toothed appetites.

“Studies of the evolutionary history of economically important groups are vital to develop agricultural industries, and demonstrate the importance of conserving biodiversity to contribute towards sustainable development. Here we show for the first time that the source of chocolate, Theobroma cacao, is remarkably old for an Amazonian plant species,” says Dr James Richardson, a tropical botanist at the Royal Botanic Garden Edinburgh, UK, and lead author of the study.

Together with researchers from the University of Rosario and the University of the Andes in Colombia, the University of Miami, USA, and the United States Department of Agriculture (USDA), Richardson found that Theobroma cacao is one of the oldest species in the genus Theobroma, having evolved around 10 million years ago. At the time, the Andes were not yet fully elevated, which explains why cacao trees today occur on both sides of the Andes.

The species’ early evolutionary origin is good news: it suggests that cacao has had enough time to diversify genetically, with each wild population adapting to its local habitat. Wild populations of cacao across the Americas may therefore be treasure troves of genetic variation, which could be bred into cultivated strains to make the latter more resistant to disease and climate change, and perhaps even create new flavors of chocolate.

“After ten million years of evolution we should not be surprised to see a large amount of variation within the species, some of which might exhibit novel flavours or forms that are resistant to diseases. These varieties may contribute towards improving a developing chocolate industry,” says James Richardson.

The researchers already plan to return to South America to sample all species related to cacao and investigate the characteristics of their native populations.

“We hope to highlight the importance of conserving biodiversity so that it can be used to augment and safeguard the agricultural sector. By understanding the diversification processes of chocolate and its relatives we can contribute to the development of the industry and demonstrate that this truly is the Age of Chocolate,” says coauthor Dr Santiago Madriñán of the University of the Andes in Bogotá, Colombia.

The study is published in the open-access journal Frontiers in Ecology and Evolution.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2015-11/f-sdt110515.php

Study: http://journal.frontiersin.org/article/10.3389/fevo.2015.00120/full

Everyone has their own daily rhythm of digital activity, shows study

Selected coverage: Tech2, Vocativ, Videnskab, Geekly, Helsingin Sanomat

Over the past decade, there has been a surge of scientific studies on the digital activity of people, such as making mobile calls, texting, e-mailing, and posting on social media. Because nearly all human behavior leaves a digital footprint, scientists can use such digital activity as a proxy to track human activity in general, for example to study differences between cultures or communities in sleep patterns, work schedules, and leisure activities.

In a new study in the open-access journal Frontiers in Physics, researchers from Finland and Denmark use a radically new approach to study digital rhythms. In contrast to previous studies that focused on general patterns across large numbers of people, they search for pronounced, long-term differences in rhythm between individuals. They show that people tend to have their personal rhythm of digital activity — almost like a personal signature.

“Each individual follows their own distinctive and persistent daily rhythm”, says Doctoral Candidate Talayeh Aledavood, who performed the research together with Jari Saramäki, Associate Professor at Aalto University, and Sune Lehmann, Associate Professor at the Technical University of Denmark.

These personal rhythms could be detected in multiple datasets, and to a similar extent for e-mail, phone calls, and text messages.

“In almost every case, the individual patterns differ strongly from the average behavior, for example by increased calling frequency during mornings, mid-days, or evenings,” says Aledavood.
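One way to make “distinctive personal rhythm” concrete (a minimal sketch with invented data, not the authors’ actual pipeline or datasets) is to bin each person’s events by hour of day, normalize the counts into a distribution, and measure how far it lies from the population-wide distribution:

```python
from collections import Counter
import numpy as np

def hourly_pattern(event_hours, n_bins=24):
    """Normalized hour-of-day activity distribution from event hours (0-23)."""
    counts = Counter(event_hours)
    dist = np.array([counts.get(h, 0) for h in range(n_bins)], dtype=float)
    return dist / dist.sum()

def rhythm_distance(user_events, population_events):
    """Jensen-Shannon distance between one user's rhythm and the average rhythm."""
    p, q = hourly_pattern(user_events), hourly_pattern(population_events)
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a[a > 0] * np.log2(a[a > 0] / b[a > 0])))
    return float(np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m)))

# Invented example: an "evening caller" compared with a midday-heavy population.
evening_user = [19, 20, 20, 21, 21, 22, 22, 23]
population = [9, 10, 11, 12, 12, 12, 13, 13, 14, 20, 21, 22]
print(rhythm_distance(evening_user, population))  # clearly greater than zero
```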

What drives these individual differences is not yet clear. Geographical and cultural differences clearly play a role. The researchers believe that there could also be an effect of physiology, for example caused by the difference between morning and evening persons, or by highly individual patterns of alertness during the daylight hours.

“We see this research as a first step on the way to understanding how activity patterns and chronotype are related to other personal characteristics, such as personality or mobility behavior,” says Sune Lehmann.

Aledavood et al. further show that these personal digital rhythms persist over time, meaning they are truly characteristic of each individual. This finding could also have medical applications: digital rhythms could be selectively monitored in patients with mental health problems, suggest the authors. Sudden changes in a patient’s digital rhythm could be a sign that medical intervention may be necessary.

“Combining this research with Big Data may also open new avenues of research in sleep studies,” concludes Saramäki.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2015-10/f-eht100115.php

Study: http://journal.frontiersin.org/article/10.3389/fphy.2015.00073/full

Study sets ambitious new goals for nutrition science

How can nutrition science help to achieve healthy nutrition for everyone? An urgent question in a world where 795 million people are chronically undernourished (FAO) while 1.9 billion people are overweight or obese (WHO).

“To deliver successfully, nutrition research needs a bold dose of innovation,” writes an international team of researchers from across the Life Sciences in the open-access journal Frontiers in Nutrition. In their study – aptly termed a “Field Grand Challenge” – they reach out to their peers with an ambitious set of research goals for nutrition science for the period 2015-2020.

“This initiative by the Field Chief Editor of Frontiers in Nutrition deals with a long-overdue issue: to bring researchers from all the scientific disciplines working on nutrition-related questions together, to think and work on trans- and interdisciplinary topics,” says Professor Dietrich Knorr from the Department of Food Biotechnology and Food Process Engineering at the Technical University Berlin.

The experts identify questions that need to be answered, methods that need to be developed, and foundational data that need to be collected within the next five years along eight axes of research: (1) Sustainability in food and nutrition; (2) Identifying and mitigating methodological errors in nutrition science, to increase rigor, objectivity, reproducibility, and transparency; (3) Generation and analysis of high-dimensional “Big Data”, for example in nutrigenomics; (4) Authenticity and safety of foods; (5) Food-related human behavior; (6) The molecular and physiological link between nutrition and brain health; (7) The human microbiome; and (8) Nourishing the immune system and preventing disease, for example through medical nutrition and nutraceuticals.

“We feel the topics described represent the key opportunities, but also the biggest challenges in our field,” says Dr Johannes le Coutre, Senior Research Scientist and Head of Perception Physiology at the Nestlé Research Center, Lausanne, Switzerland, and Field Chief Editor of Frontiers in Nutrition. “Five years seemed long enough for a scientific program to bear measurable fruit — yet with a clear scope and focus.”

The authors stress the need for a transdisciplinary systems-science approach to nutrition research, generating and integrating data at all levels of complexity and from all relevant disciplines, including genomics, medical science, physiology, bioengineering, food science and technology, analytics, and biomathematics.

“Nutrition science is evolving from reductionist approaches centered around the study of single molecules and pathways to in-depth, quantitative, systems-wide analyses of massively interacting systems (i.e., nutrition, microbiome, immunological and metabolic networks) that delineate health outcomes. This article articulates the Grand Challenges in 21st Century Nutrition Research and Discovery and provides paradigm-shifting solutions such as informatics, data analytics and modeling approaches in combination with pre-clinical and clinical validation studies,” says Prof Bassaganya-Riera, Director of the Virginia Bioinformatics Institute at Virginia Tech.

The authors hope their Grand Challenge will provoke a lively discussion among their peers about how to improve nutrition as a science, allowing it to fulfil its potential and make meaningful, sustainable contributions to global nutrition.

“At Frontiers in Nutrition, we are excited to develop and share an open-science platform for this discussion. Healthy nutrition for all is an ambition too important to be handled by detached interest groups,” concludes le Coutre.

Misperception discourages girls from studying math-intensive science, shows study

Selected coverage: The Conversation, IFLScience

A misperception — the belief that the ability to do difficult mathematics is something that you either have or don’t — currently prevents many American girls from opting for a major in physics, engineering, mathematics, or computer science (PEMC), suggests a new study. The good news is that schools, families, and policy makers can help students shift these misperceptions.

“Our results indicate the potential for more women to move into PEMC if they perceive their mathematics ability as strong, and open to growth,” says Lara Perez-Felkner, Assistant Professor of Higher Education and Sociology at Florida State University and co-author of the study.

Together with doctoral students Samantha Nix and Kirby Thomas, Perez-Felkner set out to determine how the choice of college major is influenced by gender and perceptions about ability. They focused on a group of 4,450 students from 750 high schools across the USA, following them over the period 2002-2012 through the records of the Education Longitudinal Study of the US National Center for Education Statistics.

The results were revealing. Students’ self-perceived ability in mathematics, particularly in difficult and challenging tasks, matters. While boys in high school tend to overrate their abilities in mathematics, girls tend to underrate theirs. Girls in 12th grade who reported being convinced that they could do the most difficult and challenging mathematics were an estimated 3.3 times more likely to take a PEMC major. This held true even after correcting for other factors, for example the science courses they took in high school, ethnicity, college entrance exam scores, and the selectivity of the college.

Another important factor was the perception that mathematical ability can be developed through learning (a “growth mindset”). Girls in 10th grade who reported that they strongly believed this were an estimated 2.3 times more likely to take a PEMC major than girls who reported the opposite belief.
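Figures like “3.3 times more likely … after correcting for other factors” are odds ratios from adjusted regression models. The sketch below, with simulated data and a single made-up control variable rather than the study’s actual dataset or model specification, shows how such a number is obtained from a logistic regression:

```python
import numpy as np
import statsmodels.api as sm

# Simulated data: belief in one's ability under challenge (1/0), one control
# variable (e.g. a test score), and whether a PEMC major was chosen (1/0).
rng = np.random.default_rng(0)
n = 2000
belief = rng.integers(0, 2, n)
score = rng.normal(0, 1, n)
true_logit = -1.5 + 1.2 * belief + 0.5 * score      # assumed "true" effects
chose_pemc = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

X = sm.add_constant(np.column_stack([belief, score]))
result = sm.Logit(chose_pemc, X).fit(disp=False)

# Exponentiating a coefficient gives its odds ratio, read as "x times more
# likely to choose a PEMC major", holding the control variable fixed.
print(np.exp(result.params))  # [const, belief, score]
```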

“By focusing on students’ perceived ability under challenge, we are getting closer to the ‘real-world’ context, where mathematics anxiety may operate. Most people believe they can do some mathematics, such as splitting a dinner bill with friends, but fewer believe they can do mathematics they perceive as ‘difficult’. Here we show that this belief can influence the decision to specialize in mathematics-intensive fields, for both women and men,” says Samantha Nix.

These findings have direct implications for policy. They suggest that interventions that foster a growth mindset of mathematical ability could be effective in raising the number of women that pursue a career in PEMC fields. Currently, women are strongly underrepresented in these fields, as shown by recent reports by the OECD and the US National Science Foundation. This gender gap is bad news for everyone: science and society lose talent, while women miss out on potential careers with higher-than-average income and job stability.

“It is important for the US and other nations to continue to invest in interventions to end gender segregation in PEMC sciences. For instance, students may need to hear that encountering difficulty during classwork is expected and normal, and does not say anything about the ability to become a successful scientist. In addition, instructors may want to ask themselves if they are giving the same feedback to young women and men who deal successfully with a difficult mathematics problem in class,” says Perez-Felkner.

Lara Perez-Felkner, Samantha Nix, Kirby Thomas. Credit: Bill Lax/Florida State University

The research, which was funded by a National Science Foundation grant, is published in the open-access journal Frontiers in Psychology.

Other results include:

  • Girls were an estimated 3.7 times less likely to major in PEMC than their male peers.
  • Girls were an estimated 3.8 times more likely to major in Health Science than their male peers.
  • Girls and boys who had completed both Physics 1 and Chemistry 1 courses in high school were an estimated 1.9 times more likely to major in a PEMC field than their peers, and an estimated 2.5 times more likely if they had completed both Physics 2 and Chemistry 2.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2015-06/f-mdg060415.php

Study: http://journal.frontiersin.org/article/10.3389/fpsyg.2015.00530/full

Male fish dig pits and build sand castles at the bottom of Lake Malawi to attract females

New research shows that courtship rituals evolve very fast in cichlid fish in Lake Malawi. Whenever species evolve to feed at different depths, their courtship evolves as well. In the shallows where the light is good, males build sand castles to attract females. Males of deep-dwelling species dig less elaborate pits and compensate with longer swimming displays. The results are published in the open-access journal Frontiers in Ecology and Evolution.

“Lake Malawi cichlids are famous for the diversity and fast evolution of their feeding habits, body form, and sex determination system,” says Ryan York, a graduate student at Stanford University and lead author of the study. “Here we show for the first time that their courtship rituals also evolve exceptionally fast.”

The researchers made a DNA-based “family tree” for 75 species (out of over 500) of Lake Malawi cichlids, noting for each whether males build castles or dig pits. The tree looks like a messy patchwork: the closest relatives of species with castle-building males often have pit-digging males, and vice versa. York and colleagues conclude that individual species have repeatedly moved back and forth between castle building and pit digging during cichlid evolution.
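A simple way to see how “moving back and forth” can be read off such a tree (a toy sketch with an invented six-species tree, not the study’s actual phylogeny or reconstruction method) is Fitch parsimony, which counts the minimum number of trait changes needed to explain the states observed at the tips:

```python
# Toy Fitch parsimony: minimum number of pit <-> castle switches on a made-up tree.
# A node is either a species name (leaf) or a pair of two child nodes.
tree = ((("sp1", "sp2"), ("sp3", "sp4")), ("sp5", "sp6"))
trait = {"sp1": "castle", "sp2": "pit", "sp3": "castle",
         "sp4": "pit", "sp5": "pit", "sp6": "castle"}  # invented observations

def fitch(node):
    """Return (possible ancestral states, minimum changes in this subtree)."""
    if isinstance(node, str):                              # leaf
        return {trait[node]}, 0
    (l_states, l_cost), (r_states, r_cost) = (fitch(child) for child in node)
    shared = l_states & r_states
    if shared:                                             # no change needed here
        return shared, l_cost + r_cost
    return l_states | r_states, l_cost + r_cost + 1        # one switch at this node

states, changes = fitch(tree)
print("minimum number of state changes:", changes)  # > 1 implies repeated switching
```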

Lake Malawi is approximately 5 million years old, which means that all evolutionary changes in the cichlids’ ecology — including courtship behavior — have happened within this extremely short period.

The evolution of cichlid courtship seems to be driven by shifts in the average depth at which each species feeds. Castles require more effort to build but are more striking to females in clear, shallow waters. In species that live at greater depth where light is scarce, castle building does not pay off.

In support of their theory, the researchers show that castle-building species live at an average depth of 15 meters in Lake Malawi, compared to 30 meters for pit-digging species.

The bodies of pit-diggers are likewise better suited to life at greater depths. For example, females and males of pit-digging species can extend their upper jaw further towards prey, allowing them to catch fast-moving animal plankton on the murky lake bottom. Their retina is also less able to detect UV light, a wavelength too short to reach those depths.

Digging pits takes less effort than building castles, and pit-digging males seem to use the time and energy saved to good effect. Studying courtship in one castle-building and one pit-digging species in detail inside aquaria, the researchers found that males of the latter invest twice as much time in display behavior, for example swimming towards females or extending their fins and gill cover to look larger.

Pits and castles are only used during courtship and mating, and have no other function. If a female likes what she sees, she lays her eggs inside the pit or castle, to be fertilized by the male. She then keeps them in her mouth for several weeks, never eating until they hatch.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2015-03/f-mfd031615.php

Study: http://journal.frontiersin.org/article/10.3389/fevo.2015.00018/full

 

Identifying the war-afflicted teenagers most in need of mental health care

Selected coverage: NPR

A new study finds widespread post-traumatic stress disorder, depression, and suicidal ideation among teenagers in war-torn Northern Uganda, not only among former child soldiers. The authors argue that psychological support should be offered to all young people in the region through the education system.

In Northern Uganda, an estimated nine out of ten teenagers have been displaced at least once in their life. Around one in three have been abducted by the notorious rebel group Lord’s Resistance Army to serve as child soldiers. Approximately half of former child soldiers have been forced to commit violence, for example abducting other children or injuring people.

In a new controlled study, an international team of researchers surveyed mental health problems among teenagers in education programs in Northern Uganda. They found evidence of Post-Traumatic Stress Disorder (PTSD) in 32% of former abductees and 12% of other teenagers. Depression and suicidal ideation were less frequent, but still elevated and widespread.

The study was done by psychologists and psychiatrists from Germany and Uganda, in partnership with the mental health organization Vivo International. Further support came from the Windle Trust and the Norwegian Refugee Council.

“The rates of PTSD and depression were highest among former child soldiers. But we also found very high rates in other war-affected youth,” says lead author Nina Winkler from the University of Konstanz, Germany.

The results are drawn from clinical interviews with a controlled, representative sample of 843 teenagers from 12 secondary schools and 6 vocational training centers in the Gulu and Amuru districts of Uganda. Interviews were conducted by local trauma counselors in the Luo language, under close supervision by expert psychologists.

The researchers looked for symptoms of PTSD, depression, and suicidal ideation. They also asked interviewees about their exposure to different types of traumatic events, such as being abducted or displaced, assault with weapons, sexual violence, life-threatening injury, or witnessing violent death.

Rates of trauma exposure, PTSD, depression, and suicidal ideation were higher in former abductees than in non-abductees, and highest in former child soldiers who had been forced to commit violence. However, PTSD was widespread among teenagers exposed to many types of traumatic events, regardless of whether they had also been abducted.

These results suggest a stepwise “building block” mechanism for the development of PTSD, where the risk increases with each traumatic experience.
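A minimal numerical illustration of such a “building block” model (with parameters chosen only to show the shape of the relationship, not fitted to the study’s data) is a logistic risk curve over the number of traumatic event types experienced:

```python
import math

def ptsd_risk(n_event_types, baseline=-3.0, per_event_type=0.25):
    """Toy 'building block' curve: each additional type of traumatic event
    raises the risk. Both parameters are illustrative assumptions only."""
    return 1.0 / (1.0 + math.exp(-(baseline + per_event_type * n_event_types)))

for n in (0, 5, 10, 15, 20, 25):
    print(f"{n:>2} event types -> risk {ptsd_risk(n):.2f}")
```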

“All learners, abducted or not, with high trauma exposure appear at risk of developing symptoms of PTSD and depression. Our results suggest that there is no valid reason to focus mental healthcare exclusively on former child soldiers,” the researchers conclude.

The researchers are concerned that the widespread mental health problems among teenagers might make it easier for gangs and armed groups to find recruits, prolonging unrest and violence in the region.

They suggest that the local education system could help to deliver mental healthcare and psychosocial support to all war-affected young people, including former child soldiers.

The study is published in the open-access journal Frontiers in Psychiatry.

Findings include:

  • 84% of girls and 89% of boys had been displaced at least once
  • 30% of girls and 50% of boys had been abducted at least once
  • Abductions lasted on average 12 months, with a maximum of 11 years
  • Among former abductees, 34% of girls and 31% of boys were diagnosed with PTSD
  • Among non-abductees, 16% of girls and 8% of boys were diagnosed with PTSD
  • Among former abductees with PTSD, 30% of girls and 17% of boys were diagnosed with depression
  • Among former abductees with PTSD, 57% of girls and 34% of boys reported having current suicidal ideations
  • The highest rate of PTSD, 87%, was found in former abductees who had committed violence and experienced 25 or more types of traumatic events

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2015-03/f-itw022515.php

Study: http://journal.frontiersin.org/article/10.3389/fpsyt.2015.00002/full

Unlike people, monkeys aren’t fooled by expensive brands

Selected coverage: Le Monde, Yale News, L’Obs, Yahoo, Discover Magazine, Sydney Morning Herald, Herald Scotland, Daily Mail, Tech Times

In at least one respect, Capuchin monkeys are smarter than humans — they don’t assume a higher price tag means better quality, according to a new Yale study appearing in the open-access journal Frontiers in Psychology.

People consistently tend to confuse the price of a good with its quality. For instance, one study showed that people think a wine labeled with an expensive price tag tastes better than the same wine labeled with a cheaper price tag. In other studies, people thought a painkiller worked better when they paid a higher price for it.

The Yale study shows that monkeys don’t buy that premise, although they share other irrational behaviors with their human relatives.

“We know that capuchin monkeys share a number of our own economic biases. Our previous work has shown that monkeys are loss-averse, irrational when it comes to dealing with risk, and even prone to rationalizing their own decisions, just like humans,” said Laurie Santos, a psychologist at Yale University and senior author of the study. “But this is one of the first domains we’ve tested in which monkeys show more rational behavior than humans do.”

Rhia Catapano, a former Yale undergraduate who ran the study as part of her senior honors thesis, designed a series of four experiments with Santos and colleagues to test whether capuchins would prefer higher-priced but otherwise equivalent items. They taught monkeys to make choices in an experimental market and to buy novel foods at different prices. Control studies showed that the monkeys understood the differences in price between the foods. But when the researchers tested whether the monkeys preferred the taste of the higher-priced goods, they were surprised to find that the monkeys didn’t show the same bias as humans.

Santos and colleagues think that differences in the response of humans and capuchins could stem from the different experiences that monkeys and people have with markets and how they behave.

“For humans, higher price tags often signal that other people like a particular good,” Santos noted. “Our richer social experiences with markets might be the very thing that leads us — and not monkeys — astray in this case.”

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2014-12/yu-upm120114.php

Study: http://journal.frontiersin.org/article/10.3389/fpsyg.2014.01330/full