Probiotics improve cognition in Alzheimer’s patients

In a randomized, double-blind trial, scientists show for the first time that dietary supplementation with a daily dose of probiotic bacteria over a period of just 12 weeks is sufficient to yield a moderate but significant improvement in the cognitive performance of Alzheimer’s patients.

For the first time, scientists have shown that probiotics — beneficial live bacteria and yeasts taken as dietary supplements — can improve cognitive function in humans. In a new clinical trial, scientists show that a daily dose of probiotic Lactobacillus and Bifidobacterium bacteria taken over a period of just 12 weeks is enough to yield a moderate but significant improvement in the score of elderly Alzheimer’s patients on the Mini-Mental State Examination (MMSE) scale, a standard measure of cognitive impairment.

Probiotics are known to give partial protection against certain infectious diarrheas, irritable bowel syndrome, inflammatory bowel disease, eczema, allergies, colds, tooth decay, and periodontal disease. But scientists have long hypothesized that probiotics might also boost cognition, as there is continuous two-way communication between the intestinal microflora, the gastrointestinal tract, and the brain through the nervous system, the immune system, and hormones (along the so-called “microbiota-gut-brain axis”). In mice, probiotics have indeed been shown to improve learning and memory, and reduce anxiety and depression- and OCD-like symptoms. But prior to the present study there was very limited evidence of any cognitive benefits in humans.

Here, the researchers, from Kashan University of Medical Sciences, Kashan, and Islamic Azad University, Tehran, Iran, present results from a randomized, double-blind, controlled clinical trial on a total of 52 women and men with Alzheimer’s between 60 and 95 years of age. Half of the patients received a daily dose of 200 ml of milk enriched with four species of probiotic bacteria: Lactobacillus acidophilus, L. casei, L. fermentum, and Bifidobacterium bifidum (approximately 400 billion bacteria per species). The other half received untreated milk.

At the beginning and the end of the 12-week experimental period, the scientists took blood samples for biochemical analyses and tested the cognitive function of the subjects with the MMSE questionnaire, which includes tasks like giving the current date, counting backwards from 100 by sevens, naming objects, repeating a phrase, and copying a picture.

Over the course of the study, the average score on the MMSE questionnaire significantly increased (from 8.7 to 10.6, out of a maximum of 30) in the group receiving probiotics, but not in the control group (from 8.5 to 8.0). Even though this increase is moderate, and all patients remained severely cognitively impaired, these results are important because they are the first to show that probiotics can improve human cognition. Future research, on more patients and over longer time-scales, is necessary to test if the beneficial effects of probiotics become stronger after longer treatment.
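
To see how the headline numbers translate into change scores, here is a minimal sketch (in Python) that works only from the group means quoted above; the published study analyzed the individual patients’ scores, so this illustrates the arithmetic rather than the authors’ statistical analysis.

```python
# Illustrative only: recomputes the change in group-mean MMSE scores from the
# figures quoted in this article. The published study analyzed per-patient
# scores; only the group means reported above are used here.

MMSE_MAX = 30  # maximum possible score on the Mini-Mental State Examination

groups = {
    "probiotic": {"before": 8.7, "after": 10.6},
    "control":   {"before": 8.5, "after": 8.0},
}

for name, s in groups.items():
    change = s["after"] - s["before"]
    pct = 100 * change / s["before"]
    print(f"{name:>9}: {s['before']:.1f} -> {s['after']:.1f} "
          f"({change:+.1f} points, {pct:+.1f}%, out of a maximum of {MMSE_MAX})")
```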

“In a previous study, we showed that probiotic treatment improves the impaired spatial learning and memory in diabetic rats, but this is the first time that probiotic supplementation has been shown to benefit cognition in cognitively impaired humans,” says Professor Mahmoud Salami from Kashan University, the senior author of the study.

Treatment with probiotics also resulted in lower blood levels of triglycerides, very-low-density lipoprotein (VLDL), and high-sensitivity C-reactive protein (hs-CRP) in the Alzheimer’s patients, as well as a reduction in two common measures of insulin resistance and of the activity of the insulin-producing cells in the pancreas (the Homeostatic Model Assessment indices HOMA-IR and HOMA-B).

“These findings indicate that change in the metabolic adjustments might be a mechanism by which probiotics affect Alzheimer’s and possibly other neurological disorders,” says Salami. “We plan to look at these mechanisms in greater detail in our next study.”

Walter Lukiw, Professor of Neurology, Neuroscience and Ophthalmology and Bollinger Professor of Alzheimer’s disease at Louisiana State University, who reviewed the study but was not involved in the research, said: “This early study is interesting and important because it provides evidence for gastrointestinal (GI) tract microbiome components playing a role in neurological function, and indicates that probiotics can in principle improve human cognition. This is in line with some of our recent studies, which indicate that the GI tract microbiome in Alzheimer’s is significantly altered in composition when compared to age-matched controls, and that both the GI tract and blood-brain barriers become significantly more leaky with aging, thus allowing GI tract microbial exudates (e.g. amyloids, lipopolysaccharides, endotoxins and small non-coding RNAs) to access Central Nervous System compartments.”

The study is published in the open-access journal Frontiers in Aging Neuroscience.

Link to EurekAlert!: https://www.eurekalert.org/emb_releases/2016-11/f-pic110116.php

Link to study: http://journal.frontiersin.org/article/10.3389/fnagi.2016.00256/full

 

“Don’t hit your brother” – moms are strictest on their infants’ moral wrongdoing

Moms respond more strongly to moral faults by infants than to other types of misbehavior, regardless of the potential harm, shows a new study.

Research in the journal Frontiers in Psychology shows that mothers typically respond more strongly to any “moral” faults by their infants – that is, behavior that risks hurting other people or pets – than to any other type of misbehavior. Even misbehavior that puts the infant herself, but no-one else, at risk, for example running down the stairs, is generally disciplined less strongly by moms than moral wrongdoing. Conversely, infants are more ready to obey, and less likely to protest against, their mother’s prohibitions on moral faults than prohibitions on other types of misbehavior. These results indicate that mothers tend to treat moral wrongdoing as a special, more serious type of misbehavior, regardless of the potential harm.

“Mothers were more insistent on the moral prohibition against harming others than prohibitions against doing something dangerous or creating mess or inconvenience, as shown by their greater use of physical interventions and direct commands in response to moral transgressions,” says the author Audun Dahl, Assistant Professor at the Department of Psychology at the University of California at Santa Cruz.

Dahl and coworkers observed interactions between 26 mothers, their 14-month-old infants (girls and boys), and an older sibling below 8 years of age during 2.5-hour-long home visits, and repeated the visits 5 and 10 months later. Mothers were told to behave naturally, as the purpose was to study the everyday experience of infants. The observers scored each instance of naughty behavior by the infant, distinguishing between moral, “prudential” (dangerous to the infant herself, but to no-one else) and “pragmatic” faults (i.e. creating mess or inconvenience, but not harmful to the infant or anyone else). They also scored the mother’s response to each behavior, for example physical restraint; commands; distracting the infant from the unwanted behavior; softening, such as saying “I know you want to play with my phone” to acknowledge the infant’s wish, comforting him or her, or using terms of endearment; compromising on an earlier prohibition; or explaining why the infant’s behavior was wrong. Other variables were the infant’s response, for example compliance with their mother’s instructions, protest, or expressing negative emotions, and the seriousness of the actual or potential consequences of the behavior.

The results show that mothers consistently responded with high-intensity interventions such as physical restraint and commands, and not with gentler interventions, whenever their infants showed moral misbehavior. In contrast, mothers were more likely to respond to pragmatic or prudential transgressions with low-intensity interventions, especially distraction, softening or compromising. Furthermore, infants were significantly more likely to comply immediately with their mother’s commands when their original transgression had been moral, and less likely to protest verbally. Importantly, the greater insistence of mothers on moral rules couldn’t be attributed to moral transgressions having more severe potential consequences, since the observed prudential misbehaviors were on average more harmful – for example, putting the infant at risk of falling or choking.

Dahl concludes that mothers tend to treat the moral imperative to avoid harm to others as fundamentally different — more important to communicate and less open to negotiation — from prudential and pragmatic rules.

“Through their more insistent interventions on moral misbehavior, mothers appear to help their children make this distinction as well,” says Dahl. “Still, how parents react to misbehaviors is only one of many factors in early moral development. So an important question for future research is how precisely young children make use of their mother’s reactions, along with other experiences, to gradually develop their own notions of right and wrong.”

 

Link to study: http://journal.frontiersin.org/article/10.3389/fpsyg.2016.01448/full

EurekAlert! PR: https://www.eurekalert.org/pub_releases/2016-10/f-hy100616.php

 

“Gambling” wolves take more risks than dogs

Selected coverage: Science Magazine, Independent, Christian Science Monitor

Wolves pursue a high-risk, all-or-nothing strategy when gambling for food, while dogs are more cautious, shows a new study. This difference is likely innate and adaptive, reflecting the hunter versus scavenger lifestyle of wolves and dogs.

Would you rather get 100 euros for certain, or have a fifty-fifty chance of receiving either 200 euros or nothing? Most people choose the first option, as humans tend to be “risk-averse”, preferring a guaranteed pay-off over the possibility of a greater reward. It is thought that this human preference for “playing it safe” has evolved through natural selection: when you live precariously like our remote ancestors, losing all your food reserves might be catastrophic, while adding to them might make less difference to your chances of survival.
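
As a back-of-the-envelope illustration of this logic: both options have the same expected value, so a preference for the sure pay-off only emerges once gains are valued with diminishing returns. The square-root utility function below is our own assumption for the sake of the example, not something measured in the study.

```python
# Toy illustration of risk aversion: the sure 100 euros and the fifty-fifty
# gamble have the same expected value, but with a concave utility function
# (sqrt, assumed here purely for illustration) the sure pay-off wins.
from math import sqrt

sure_payoff = 100
gamble = [(0.5, 200), (0.5, 0)]  # fifty-fifty chance of 200 euros or nothing

expected_value = sum(p * x for p, x in gamble)                 # 100.0 euros
utility_sure = sqrt(sure_payoff)                               # 10.00
expected_utility_gamble = sum(p * sqrt(x) for p, x in gamble)  # ~7.07

print(f"Expected value of the gamble:   {expected_value:.1f} euros")
print(f"Utility of the sure pay-off:    {utility_sure:.2f}")
print(f"Expected utility of the gamble: {expected_utility_gamble:.2f}")
# A risk-averse chooser (concave utility) therefore prefers the certain 100 euros.
```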

Here, in one of the first studies on risk preferences in non-primates, scientists show through a series of controlled experiments that wolves are consistently more prone to take risks when gambling for food than dogs. When faced with the choice between an insipid food pellet and a fifty-fifty chance of either tasty meat or an inedible stone, wolves nearly always prefer the risky option, whereas dogs are more cautious.

“We compared the propensity to take risks in a foraging context between wolves and dogs that had been raised under the same conditions,” says Sarah Marshall-Pescini, a postdoctoral fellow at the University of Vienna and the Wolf Science Centre, Ernstbrunn, Austria, the study’s first author. “We found that wolves prefer the risky option significantly more often than dogs. This difference, which seems to be innate, is consistent with the hypothesis that risk preference evolves as a function of ecology.”

The study was done at the Wolf Science Centre, Ernstbrunn, Austria, a research institute where scientists study cognitive and behavioral differences and similarities between wolves and dogs. Here, wolves and dogs live in packs, under near-natural conditions within large enclosures.

Marshall-Pescini let each of 7 wolves and 7 dogs choose 80 times between two upside-down bowls, placed side-by-side on a moveable table-top. The animals had been trained to indicate the bowl of their choice with their paw or muzzle, after which they would receive the item that was hidden beneath it.

The researchers had taught the wolves and dogs that beneath the first bowl, the “safe” option, was invariably an insipid dry food pellet, while beneath the second bowl, the “risky” option, was either an inedible item, a stone, in a random 50% of trials, or high-quality food, such as meat, sausage, or chicken, in the other 50%. The side for the “safe” and “risky” option changed between trials, but the animals were always shown which side corresponded to which option; whether they would get a stone or high-quality food if they chose the “risky” option was the only unknown. Rigorously designed control trials confirmed that the animals understood this rule, including the element of chance.
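
To make the structure of the task concrete, here is a small simulation sketch of the protocol as described above. The probability of choosing the risky bowl is a placeholder parameter, not a value taken from the study’s data.

```python
# Sketch of the choice task described above: a "safe" bowl that always hides a
# dry food pellet, and a "risky" bowl that hides high-quality food in a random
# 50% of trials and an inedible stone in the rest. The choice probability and
# random seed are illustrative placeholders, not measured data.
import random

def run_session(p_risky: float, n_trials: int = 80, seed: int = 1) -> dict:
    rng = random.Random(seed)
    outcomes = {"pellet": 0, "meat": 0, "stone": 0}
    for _ in range(n_trials):
        if rng.random() < p_risky:      # animal picks the risky bowl
            outcomes["meat" if rng.random() < 0.5 else "stone"] += 1
        else:                           # animal picks the safe bowl
            outcomes["pellet"] += 1
    return outcomes

# Example: an animal that picks the risky bowl on 80% of its 80 trials
print(run_session(p_risky=0.80))
```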

Wolves are much more prone to take risks than dogs, show the results. Wolves chose the risky option in 80% of trials, whereas dogs only did so in 58% of trials.

The researchers believe that dogs evolved a more cautious temperament after they underwent an evolutionary shift from their ancestral hunter lifestyle to their current scavenger lifestyle, which happened between 18,000 and 32,000 years ago when humans first domesticated dogs from wolves. Previous research has suggested that species that rely on patchily distributed, uncertain food sources are generally more risk-prone. For example, chimpanzees, which feed on fruit trees and hunt for monkeys, have been shown to be more risk-prone than bonobos, which rely more on terrestrial vegetation, a temporally and spatially reliable food source.

“Wild wolves hunt large ungulates – a risky strategy, not only because hunts often fail, but also because these prey animals can be dangerous – whereas free-ranging dogs, which make up 80% of the world’s dog population, feed mostly by scavenging on human refuse, a ubiquitous, unlimited resource. So dogs no longer need to take risks when searching for food, and this may have selected for a preference to play it safe,” concludes Marshall-Pescini.

Freya and Etu, a dog and wolf from the Wolf Science Centre. Credit: RoooBert Bayer

Etu and Ela, wolf pups at the Wolf Science Centre. Credit: RoooBert Bayer

Geronimo chose the “risky” option in 78% of trials. Credit: RoooBert Bayer

Burnout is caused by mismatch between unconscious needs and job demands

Selected coverage: Le Figaro, CBS News

New research shows that burnout is caused by a mismatch between a person’s unconscious needs and the opportunities and demands at the workplace. These results have implications for the prevention of job burnout.

Imagine an accountant who is outgoing and seeks closeness in her social relationships, but whose job offers little scope for contact with colleagues or clients. Now imagine a manager, required to take responsibility for a team, but who does not enjoy taking center-stage or being in a leadership role. For both, there is a mismatch between their individual needs and the opportunities and demands at the workplace. A new study in the open-access journal Frontiers in Psychology shows that such mismatches put employees at risk of burnout.

Burnout is a state of physical, emotional, and mental exhaustion from work, which results in a lack of motivation, low efficiency, and a helpless feeling. Its health effects include anxiety, cardiovascular disease, immune disorders, insomnia, and depression. The financial burden from absenteeism, employee turnover, reduced productivity, and medical, legal, and insurance expenses due to burnout and general work-related stress is staggering: for example, the American Institute of Stress estimates the total cost to American enterprises at US$300 billion per year, while a 2012 study commissioned by the Health Programme of the European Union estimates the annual cost to EU enterprises at €272 billion.

In the new study, researchers from the Universities of Zurich and Leipzig show that the unconscious needs of employees – their so-called “implicit motives” – play an important role in the development of burnout. The researchers focus on two important motives: the power motive, that is, the need to take responsibility for others, maintain discipline, and engage in arguments or negotiation, in order to feel strong and self-efficacious; and the affiliation motive, the need for positive personal relations, in order to feel trust, warmth, and belonging. A mismatch between job characteristics and either implicit motive can cause burnout, the results show. Moreover, a mismatch in either direction is risky: employees can get burned out when they have too much or not enough scope for power or affiliation compared to their individual needs.

“We found that the frustration of unconscious affective needs, caused by a lack of opportunities for motive-driven behavior, is detrimental to psychological and physical well-being. The same is true for goal-striving that doesn’t match a well-developed implicit motive for power or affiliation, because then excessive effort is necessary to achieve that goal. Both forms of mismatch act as ‘hidden stressors’ and can cause burnout,” says lead author Veronika Brandstätter, Professor of Psychology at the University of Zurich, Switzerland.

Brandstätter and colleagues recruited 97 women and men between 22 and 62 years of age through the Swiss Burnout website, an information resource and forum for Swiss people suffering from burnout. Participants completed questionnaires about their physical well-being, degree of burnout, and the characteristics of their job, including its opportunities and demands.

To assess implicit motives – whose strength varies from person to person, but which can’t be measured directly through self-reports since they are mostly unconscious – Brandstätter et al. used an inventive method: they asked the participants to write imaginative short stories to describe five pictures, which showed an architect, trapeze artists, women in a laboratory, a boxer, and a nightclub scene. Each story was analyzed by trained coders, who looked for sentences about positive personal relations between persons (thus expressing the affiliation motive) or about persons having impact or influence on others (expressing the power motive). Participants who used many such sentences in their story received a higher score for the corresponding implicit motive.
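
To make the scoring idea concrete, here is a toy sketch of how coded sentences can be tallied into motive scores. In the study itself this coding was done by trained human coders using an established content-coding system, so the labels below are invented purely for illustration.

```python
# Toy sketch of turning sentence-level coder judgments into implicit-motive
# scores. The labels below are invented for illustration; in the study, trained
# human coders classified the sentences of each participant's five stories.
from collections import Counter

coded_sentences = [          # one label per coded sentence
    "affiliation", "power", "affiliation", "neither",
    "power", "power", "neither", "affiliation",
]

counts = Counter(coded_sentences)
scores = {
    "affiliation_motive": counts["affiliation"],  # more such sentences -> higher score
    "power_motive": counts["power"],
}
print(scores)
```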

The greater the mismatch between someone’s affiliation motive and the scope for personal relations at the job, the higher the risk of burnout, show the researchers. Likewise, adverse physical symptoms, such as headache, chest pain, faintness, and shortness of breath, became more common with increasing mismatch between an employee’s power motive and the scope for power in his or her job.

Importantly, these results immediately suggest that interventions that prevent or repair such mismatches could increase well-being at work and reduce the risk of burnout.

“A starting point could be to select job applicants in such a way that their implicit motives match the characteristics of the open position. Another strategy could be so-called “job crafting”, where employees proactively try to enrich their job in order to meet their individual needs. For example, an employee with a strong affiliation motive might handle her duties in a more collaborative way and try to find ways to do more teamwork,” says Brandstätter.

“A motivated workforce is the key to success in today’s globalized economy. Here, we need innovative approaches that go beyond providing attractive working conditions. Matching employees’ motivational needs to their daily activities at work might be the way forward. This may also help to address growing concerns about employee mental health, since burnout is essentially an erosion of motivation. To do so, we must increasingly take account of motivational patterns in the context of occupational stress research, and study person-environment fit across entire organizations and industries,” says Beate Schulze, a Senior Researcher at the Department of Social and Occupational Medicine of the University of Leipzig and Vice-President of the Swiss Expert Network on Burnout.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2016-08/f-bic080816.php

Study: http://journal.frontiersin.org/article/10.3389/fpsyg.2016.01153/full

Butterflies use differences in leaf shape to distinguish between plants

The preference of Heliconius butterflies for certain leaf shapes is innate, but can be reversed through learning. These results support a decades-old theory for explaining the evolution of the exceptional diversity of leaf shapes in passionflowers.

The tropical butterfly Heliconius erato distinguishes between shapes, and uses them as a cue for choosing the plants on which to feed and lay eggs, shows new research by scientists from the University of Cambridge and the Smithsonian Tropical Research Institute. The butterfly has an innate preference for passionflowers with particular leaf shapes, but can learn to overcome this preference in favor of other shapes, especially those that are the most abundant in the local flora. These preferences can promote the evolution of plant biodiversity.

Heliconius erato, the red passionflower butterfly, is a large (5 to 8 cm wingspan), white-red-black butterfly that occurs throughout Central America and tropical South America. Females lay their eggs on passionflowers (Passiflora), a genus of tropical vines with extreme variation in leaf shape, both between and within species. For example, related species can have triangular, elongated, elliptic, lobed, or spear-shaped leaves, while even on the same plant, leaf shape may vary between young and old leaves, or between sun-exposed and shaded leaves. Once caterpillars hatch from the eggs, they start feeding on the leaves and shoots of the host plant, often causing considerable damage.

“Here, we show for the first time that female Heliconius erato use shape as a cue for selecting the passionflowers on which they feed and lay eggs,” says Denise Dell’Aglio, a doctoral student at the Department of Zoology of the University of Cambridge.

“These findings have implications for ecological theory, because they support a decades-old hypothesis that the butterflies could drive so-called ‘negative frequency dependent selection’ on the leaf shape of passionflowers, that is, natural selection where the rarest forms always have a competitive advantage. This could explain the extraordinary diversity of leaf shapes found in passionflowers.”

According to this hypothesis, first formulated in 1975 but never tested until now, female Heliconius develop a learned preference – a “search image” – for passionflowers with common leaf shapes, and lay their eggs exclusively on these plants, which then suffer damage from the caterpillars. This would drive a cycle in which passionflowers with rare leaf shapes tend to do better and have more offspring – until over the next generations they become more common in turn, and lose their competitive advantage.

Here, Dell’Aglio and colleagues use artificial flowers and leaves, made out of foam sheet, to test the preferences of Heliconius erato females for particular shapes. They first show that the butterflies have an innate preference for feeding on star-like flowers with three and five petals over flowers with simpler shapes. But they can quickly learn to reverse this preference if the simpler flowers reliably contain a food reward, show the researchers. In a second experiment, Dell’Aglio et al. show that Heliconius erato prefer to lay eggs on leaves with a familiar shape, and tend to avoid laying on leaves with a shape that they have not previously encountered. These results indicate that the butterflies develop search images for familiar leaf and flower shapes, in support of the theory.

“Negative frequency dependence, where rare forms have an advantage, is thought to be a common process that promotes diversity in tropical plants. It is therefore exciting to think about how commonly this may be driven by behavioral flexibility in predators. Perhaps other insects might learn chemical signatures, textures or other physical cues and similarly promote diversity in their host plants,” says Chris Jiggins, Professor of Evolutionary Biology at the University of Cambridge, and one of the coauthors on the new study.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2016-07/f-bud072116.php

Study: http://journal.frontiersin.org/article/10.3389/fevo.2016.00081/full

 

Scientists use microchips to track “Ghosts of Gotham”

Selected coverage: Washington Post, Popular Science, STATNews, Motherboard

For the first time, scientists have tagged NYC rats with RFID microchips to study their individual behavior and potential for transmitting disease

For the first time, researchers are able to study the daily activity of some of the most abundant, most dangerous and secretive, and least known denizens of the world’s great cities: rats. In the open-access journal Frontiers in Public Health, scientists present a novel, cheap, and safe method to tag city rats with RFID (Radio-Frequency Identification) microchips and track their individual movements over several months. The new method, tried and tested on New York City rats – nicknamed “Ghosts of Gotham” because of their elusiveness – is expected to yield a wealth of data on the behavior of city rats and their potential for transmitting disease.

“We don’t know much about the behavior of city rats, or as much as we need to know about the organisms they can transfer to humans, either directly or indirectly through ticks and fleas,” says Michael H. Parsons, a researcher at the Department of Biology at Hofstra University, Hempstead, New York, and lead author of the new study.

“For example, there are currently no known routine surveillance programs for rats in the USA, nor are the population dynamics of rat pathogens systematically monitored in any part of the world. But it’s imperative that we study these subjects because by the year 2050, 75% of the world’s population will live in urban areas, and could therefore be exposed to rat pathogens. Even today, rats and other rodent pests cost the U.S. economy $19 billion per year from food loss, infrastructure damage, and disease.”

Known rat-borne diseases include rat-bite fever, Rocky Mountain spotted fever, Lyme disease, and cat-scratch disease, but it is certain that many other rat-borne pathogens remain to be discovered: for example, in a 2014 study, researchers found 18 viruses new to science in a sample of only 133 New York City rats.

Studying wild rats is difficult. Current knowledge about rats comes mainly from non-representative observations on migrating or exceptionally gregarious rats that are active during daylight. Rats are also dangerous to handle, and quickly learn to avoid recapture in traps, while traditional methods for animal tracking, such as radio-telemetry or GPS tagging, often fail in cities where signals are blocked by infrastructure. The new method is the first to overcome these challenges.

“We developed the first safe method – not only for researchers, but also for the rats – for collecting pathogens from the same rat individuals over time, while monitoring their individual behaviors and predispositions. We show that rats can be effectively monitored with RFID microchips, without a lot of funds – the total cost of our equipment was less than $15,000,” says Parsons.

Parsons et al. give a step-by-step description of the method, including how to locate rat colonies, trap rats, anesthetize them, take blood, fecal, and skin parasite samples for disease testing, and surgically implant an RFID chip – about the size of a rice grain – under their skin. Once tagged rats are released back into their original environment, their daily activity can be monitored through RFID reading stations, which are treated with natural rat scents – easily extracted from the soiled bedding of laboratory rats – to attract wild rats. Each time a tagged rat comes within a few inches of a station, its presence is registered, while a weighing scale and camera automatically collect weight data and video footage.
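
To give a rough idea of what the resulting data stream could look like, here is a hypothetical sketch of an RFID read log and a simple per-rat visit tally. The record format and field names are our own illustration, not the authors’ actual software or data schema.

```python
# Hypothetical sketch of processing RFID reads from a monitoring station.
# The record layout (tag ID, station ID, timestamp, weight) is an assumption
# made for illustration, not the authors' data format.
from collections import defaultdict
from datetime import datetime

reads = [
    ("RAT-001", "station-A", datetime(2016, 7, 1, 2, 15), 312.0),
    ("RAT-002", "station-A", datetime(2016, 7, 1, 3, 40), 268.5),
    ("RAT-001", "station-B", datetime(2016, 7, 1, 23, 5), 314.5),
]

visits = defaultdict(int)
latest_weight = {}
for tag, station, timestamp, weight_g in reads:
    visits[tag] += 1
    latest_weight[tag] = weight_g

for tag in sorted(visits):
    print(f"{tag}: {visits[tag]} station visits, latest weight {latest_weight[tag]:.1f} g")
```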

Initial results show that rats have different personalities: some are shy, and some are bold. There is also a pronounced sex difference in activity: female rats were most active during daylight hours, while males were active throughout the day and night.

“We’re looking forward to seeing others using or improving our protocol, so that rat populations and their pathogens can be systematically monitored to help protect against potential disease outbreaks,” concludes Parsons. “Even in our home city of New York, with more than 8 million people, there are fewer than 10 institutional researchers pursuing rat research – perhaps we can help change that.”

 

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2016-07/f-sum070816.php

Study: http://journal.frontiersin.org/article/10.3389/fpubh.2016.00132/full

Mixing cannabis with tobacco increases dependence risk, suggests study

Selected coverage: The Guardian, Culture Magazine, Daily Mail

People who mix tobacco with cannabis are less motivated to seek help to quit

Tobacco and cannabis are two of the world’s most popular drugs, used respectively by 1 billion and 182 million people worldwide (World Health Organization; United Nations Office on Drugs and Crime). The adverse health effects of tobacco are well known. Short-term effects of cannabis are transient impairments in motor function and working memory, planning, and decision-making, while possible long-term health effects of heavy cannabis use include physical and psychological dependence, permanent reductions in cognitive performance, cardiovascular and respiratory diseases, and some cancers (WHO).

Many users mix cannabis with tobacco, not only to save money but also because tobacco can increase the efficiency of cannabis inhalation. But such mixing can increase the risk of dependence, suggests a new study in Frontiers in Psychiatry.

“Cannabis dependence and tobacco dependence manifest in similar ways, so it is often difficult to separate these out in people who use both drugs,” says lead author Chandni Hindocha, a doctoral student at the Clinical Psychopharmacology Unit of University College London. “Cannabis is less addictive than tobacco, but we show here that mixing tobacco with cannabis lowers the motivation to quit using these drugs.”

Together with collaborators from University College London, the University of Queensland, King’s College London, and the South London and Maudsley NHS Trust, Hindocha analyzed responses from 33,687 cannabis users who participated in the 2014 Global Drug Survey, an anonymous online survey of drug use, conducted each year in partnership with international media such as Die Zeit, The Guardian, Libération, and the Huffington Post. Participants came from a total of 18 countries in Europe, North and South America, and Australasia. The new study is the first to survey the popularity of different methods of cannabis consumption – so-called routes of administration – around the world.

Routes of administration vary widely between countries, show Hindocha and colleagues. For example, tobacco routes for cannabis – such as joints, blunts, or pipes – are much more popular in Europe than elsewhere. Depending on the country, between 77.2% and 90.9% of European cannabis users use tobacco routes, while only 51.6% of Australian and 20.7% of New Zealand cannabis users do. Tobacco routes are least popular in the Americas, used by only 16% of Canadian, 4.4% of US, 6.9% of Mexican, and 7.4% of Brazilian cannabis users. In contrast, the use of cannabis vaporizers, a strictly non-tobacco route, is quite common in Canada (13.2% of cannabis users) and the United States (11.2%), but rare everywhere else (0.2 to 5.8%).

Importantly, preferences for routes of administration strongly affected the motivation to quit and to seek professional help for doing so. In particular, cannabis users who favor non-tobacco routes had 61.5% higher odds of wanting professional help to use less cannabis, and 80.6% higher odds of wanting help to use less tobacco, than users who prefer tobacco routes. Similarly, cannabis users who prefer non-tobacco routes had 10.7% higher odds of wanting to use less tobacco, and 103.9% higher odds of actively planning to seek help to use less tobacco.
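
To unpack what a figure like “61.5% higher odds” means, here is a small sketch converting an odds ratio back into probabilities. The baseline probability is an invented example value, not a number reported in the survey.

```python
# Sketch of interpreting "61.5% higher odds" as an odds ratio of 1.615.
# The baseline probability below is an invented example, not study data.

def prob_to_odds(p: float) -> float:
    return p / (1 - p)

def odds_to_prob(odds: float) -> float:
    return odds / (1 + odds)

baseline_prob = 0.20          # assumed probability of wanting professional help
odds_ratio = 1.615            # "61.5% higher odds"

new_prob = odds_to_prob(prob_to_odds(baseline_prob) * odds_ratio)
print(f"Baseline probability: {baseline_prob:.2f}")
print(f"With 61.5% higher odds: {new_prob:.2f}")
```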

These results suggest that people who regularly mix tobacco with cannabis are more at risk of psychological dependence than people who use cannabis and tobacco separately, without mixing them.

“Our results highlight the importance of routes of administration when considering the health effects of cannabis and show that the co-administration of tobacco and cannabis is associated with decreased motivation to cease tobacco use, and to seek help for ceasing the use of tobacco and cannabis,” says Michael T. Lynskey, Professor of Addictions in the National Addictions Centre of the Institute of Psychiatry, Psychology & Neuroscience at King’s College London. “Given a changing legislative environment surrounding access to cannabis in many jurisdictions, increased research focus should be given to reducing the use of routes of administration that involve the co-administration of tobacco.”

Other results include:

  • Worldwide, the most popular tobacco route of administration for cannabis is the joint, preferred by 93.4% of users of tobacco routes.
  • Pipes are the most popular non-tobacco route, preferred by 11.7% of users of non-tobacco routes.
  • Non-inhaled routes, such as bucket bongs, hot knives, or in food or drink, were uncommon in every country surveyed (2.4% of cannabis users worldwide).
  • Men are more likely (68.2% of surveyed male cannabis users) than women (63.8% of surveyed female cannabis users) to use tobacco routes.
  • Users of tobacco routes tend to be younger (mean 26.2 years) than users of non-tobacco routes (mean 30.8 years).
  • 16.3% of respondents had never tried smoking tobacco independently of cannabis.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2016-07/f-mcw070116.php

Study: http://journal.frontiersin.org/article/10.3389/fpsyt.2016.00104/full

 

Antiphonal singing in indris

Selected coverage: Christian Science Monitor, Der Spiegel, National Geographic, Slate, Sciences et Avenir

“How to get noticed as a singer?” isn’t only a concern for young people aspiring to a career in the music industry. Young indris, critically endangered lemurs from Madagascar, sing in antiphony with their choirmates to increase their chances of getting noticed by rival groups, according to a new study in Frontiers in Neuroscience.

Indris (Indri indri) are one of the few species of primates that sing. They live only in the eastern rainforests of Madagascar, a habitat threatened by illegal logging. They live in small groups, which generally consist of a dominant female and male, their immature offspring, and one or more low-ranking young adults. Both females and males sing, and their songs play an important role in territorial defense and group formation.

In the new study, researchers from Italy, Germany, and Madagascar recorded 496 indri songs and analyzed their timing, rhythm, and pitch. The research is part of a long-term study on the ecology of indris in the vicinity of Andasibe-Mantadia National Park and the Maromizaha Forest, eastern Madagascar.

Group members carefully coordinate their singing, show the researchers. As soon as one indri starts to sing, all group members over two years old typically join in. Indris also tend to copy each other’s rhythm, synchronizing their notes.

“The chorus songs of the indri start with roars that probably serve as an attention-getter for the other singers, and continue with modulated notes that are often grouped into phrases. In these phrases the indris give a high-frequency note at the beginning, and then the following ones descend gradually in frequency,” says Marco Gamba, a Senior Researcher at the Department of Life Sciences and Systems Biology of the University of Turin, Italy.

“Synchronized singing produces louder songs, and this may help to defend the group’s territory from rival groups. Singing is interpreted as a kind of investment, which may help to provide conspecifics with information on the strength of the pair bond and the presence of potential partners,” says doctoral student Giovanna Bonadonna, who is one of the co-authors.

There is one exception to this pattern, however: young, lower-ranking individuals have a strong preference for singing in antiphony rather than synchrony with the rest of the chorus, alternating their notes with those sung by the dominant pair. Gamba and colleagues propose that this is a tactic that lets low-ranking indris maximize their solitary singing and emphasize their individual contribution to the song.

“Synchronized singing doesn’t allow a singer to advertise his or her individuality, so it makes sense that young, low-ranking indris sing in antiphony. This lets them advertise their fighting ability to members of other groups and signal their individuality to potential sexual partners,” says Bonadonna.

“Indris are indeed good candidates for further investigations into the evolution of vocal communication. The next steps in our studies will be to understand whether the acoustic structure of the units allows individual recognition and whether genetics plays a role in the singing structure,” says Professor Cristina Giacoma from the Department of Life Sciences and Systems Biology, the study’s final author.

 

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2016-06/f-asi060716.php

Study: http://journal.frontiersin.org/article/10.3389/fnins.2016.00249/full

 

West African genes lower the risk of obesity in men, suggests study

Selected coverage: Medical Daily, Milwaukee Community Journal, Science Daily

African American men with a high degree of West African genetic ancestry have less central adiposity

The obesity epidemic affects women and men of every ethnic group in the United States, but strong gender and racial disparities in the risk of overweight and obesity exist. African American women are currently more at risk than any other group in the United States: 82.1% of African American women are overweight or obese (defined as having a BMI of 25 or higher), compared to 76.2% of Hispanic women and 64.6% of Caucasian women, according to the 2011-2012 National Health and Nutrition Examination Survey (NHANES). Socioeconomic factors, such as inequalities in access to healthcare, healthy food, and safe places to exercise, are known to be important causes of these and other racial disparities in health characteristics.

In contrast, “only” 69.1% of African American men are overweight or obese – a percentage that is still alarmingly high in absolute terms, but lower than the percentages for Caucasian men (73.2%), and Hispanic men (77.9%), according to the NHANES data. A similar pattern has been reported for type-2 diabetes, a disease strongly associated with overweight and obesity: according to a 2007 study in The American Journal of Public Health, the incidence of diabetes is higher among African American women (24.5%) than among Caucasian women (20.7%), but lower among African American men (16.7%) than among Caucasian men (19.6%).

Why do African American men have a relatively low risk of overweight, obesity, and diabetes, despite facing many of the same socioeconomic disadvantages as African American women? A new study in the open-access journal Frontiers in Genetics suggests that the cause may be partly genetic.

“Here we show that West-African genetic ancestry may afford protection against central adiposity in African American men, but not in African American women,” says Yann Klimentidis, Assistant Professor at the Epidemiology and Biostatistics Department of the University of Arizona, the study’s lead author.

Central adiposity (“belly fat”), the build-up of excess fat under the skin of the lower torso and around the internal organs, is a strong risk factor for obesity and diabetes, as well as for high blood pressure, high blood sugar, disease of the heart, liver, and pancreas, and some cancers.

In the new study, Klimentidis and colleagues analyze genetic data from 4,425 volunteers, all healthy African American women and men between the ages of 45 and 85. These data had been collected in the course of two prospective studies sponsored by the National Heart, Lung, and Blood Institute of the NIH: the Atherosclerosis Risk in Communities Study (ARIC) and the Multi-Ethnic Study of Atherosclerosis (MESA).

Klimentidis et al. focused on approximately 3,500 nucleotides (“letters” in the genome) that often differ between people from West Africa and Europe. In this way, they could estimate the degree of West African genetic ancestry – the fraction of the genome inherited from West African ancestors – for each participant in the study. A 2015 study in The American Journal of Human Genetics has shown that this fraction varies considerably among African Americans, due to differences in genetic contributions from ancestors from other ethnic groups, especially Europeans and Native Americans.
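
As a highly simplified illustration of the idea behind such an estimate: for each ancestry-informative marker, one can count how many of an individual’s two alleles carry the variant typical of the West African reference panel, and average across markers. The real study used established statistical estimators on thousands of markers; the numbers below are invented.

```python
# Toy illustration of estimating a West African ancestry fraction from
# ancestry-informative markers. Real analyses use model-based estimators on
# thousands of markers; the genotype counts below are invented for illustration.

genotypes = {   # per marker: count of "West African typical" alleles (0, 1, or 2)
    "marker_001": 2,
    "marker_002": 1,
    "marker_003": 2,
    "marker_004": 0,
}

ancestry_fraction = sum(genotypes.values()) / (2 * len(genotypes))
print(f"Estimated West African ancestry fraction: {ancestry_fraction:.2f}")
```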

The new results show that waist circumference and waist-hip ratio tend to be lower in African American men with a high degree of West African genetic ancestry, indicating that they have less central adiposity than African American men with a lower degree of West African genetic ancestry. In contrast, there was no association between these measures and the degree of West African genetic ancestry in African American women.

The researchers conclude that the gene pool of the African American population contains one or more gene variants – originally inherited from West African ancestors — that give partial protection against central adiposity, but only when present in men. Further research is needed to identify these gene variants and the physiological mechanisms through which they operate, to help prevent and treat central obesity.

“There are still many unanswered questions, including: What are the specific genes that afford protection against central adiposity in men of West African ancestry or, conversely, what are the genes that predispose individuals of other ancestries to greater central adiposity? What cultural, socio-economic, or other factors might explain the lack of protection in African American women?” says Klimentidis.

“The reasons for group differences in multifactorial traits like obesity remain difficult to understand, especially when simplistic explanations do not easily explain complex patterns, like group differences which are not constant across sexes. We believe these new analyses begin to shed light on the factors underpinning some ethnic disparities in obesity,” says David Allison, Distinguished Professor of Public Health and Director of the Nutrition Obesity Research Center at the University of Alabama at Birmingham, a coauthor on the study.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2016-06/f-wag052316.php

Study: http://journal.frontiersin.org/article/10.3389/fgene.2016.00089/full

Mind your busyness

Selected coverage: Smithsonian Magazine, The Independent, NPR, Huffington Post

Busy seniors have better cognitive function, shows study

Are you busy on an average day? Do you often have too many things to do to get them all done? Do you often have so many things to do that you go to bed later than your regular bedtime?

If you are over 50 and the answer to these questions is a weary yes, here is some good news: older adults with a busy daily lifestyle tend to do better on tests of cognitive function than their less busy peers, shows a new study in Frontiers in Aging Neuroscience. The research is part of the Dallas Lifespan Brain Study, one of the most comprehensive studies of age-related changes in cognition and brain function in healthy adults currently underway in the USA.

“We show that people who report greater levels of daily busyness tend to have better cognition, especially with regard to memory for recently learned information,” says Sara Festini, a postdoctoral researcher at the Center for Vital Longevity of the University of Texas at Dallas and lead author of the study.

“We were surprised at how little research there was on busyness, given that being too busy seems to be a fact of modern life for so many,” says Denise Park, University Distinguished Chair at the Center for Vital Longevity and Director of the Dallas Lifespan Brain Study.

The researchers surveyed 330 participants in the Dallas Lifespan Brain Study – healthy women and men between 50 and 89 from the Dallas/Fort Worth area, Texas, recruited through media advertisements and community notices – about their daily schedule. The participants also visited the Park Aging Mind laboratory at the Center for Vital Longevity, where they took part in a long series of neuropsychological tests to measure their cognitive performance.

The results show that at any age, and regardless of education, a busier lifestyle is associated with superior processing speed of the brain, working memory, reasoning, and vocabulary. Especially strong is the association between busyness and better episodic memory, the ability to remember specific events in the past.
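
A minimal sketch of what “associated at any age, and regardless of education” amounts to statistically is a regression of a cognitive score on busyness while controlling for age and education. The data below are generated at random purely to make the example run; this is not the authors’ model or their dataset.

```python
# Illustrative only: ordinary least-squares regression of a cognitive score on
# busyness, controlling for age and education, using invented synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n = 330
age = rng.uniform(50, 89, n)
education = rng.uniform(8, 20, n)            # years of education
busyness = rng.normal(0, 1, n)               # self-reported busyness (standardized)
# Invented data-generating process with a small positive busyness effect:
memory = 0.3 * busyness - 0.02 * age + 0.1 * education + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), busyness, age, education])
coefs, *_ = np.linalg.lstsq(X, memory, rcond=None)
print(f"Busyness coefficient, controlling for age and education: {coefs[1]:.2f}")
```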

Festini et al. warn that the present data do not allow the conclusion that being busy directly improves cognition. It is also possible that people with better cognitive function seek out a busier lifestyle, or that busyness and cognition reinforce each other, resulting in reciprocal strengthening. But one mediating factor accounting for the relationship might be new learning, propose the researchers. Busy people are likely to have more opportunities to learn as they are exposed to more information and encounter a wider range of situations in daily life. In turn, learning is known to stimulate cognition: for example, a recent study from the Center for Vital Longevity found that a sustained effort in learning difficult new skills, such as digital photography or quilting, boosts episodic memory.

“Living a busy lifestyle appears beneficial for mental function, although additional experimental work is needed to determine if manipulations of busyness have the same effect,” says Festini.

EurekAlert! PR: http://www.eurekalert.org/pub_releases/2016-05/f-myb051016.php

Study: http://journal.frontiersin.org/article/10.3389/fnagi.2016.00098/full