No Comment Diary

The News Without Comment


BPS Research Digest

When tears turn into pearls: Post-traumatic growth following childhood and adolescent cancer

By guest blogger Tomasz Witkowski

It’s hard to imagine a crueller fate than when a child receives a diagnosis of an illness as difficult as cancer. A young human being, still not fully formed, is suddenly and irrevocably thrown into a situation that many adults are unable to cope with. Each year, around 160,000 children and youngsters worldwide are diagnosed with cancer, and this trend is growing in industrialised societies. Faced with such facts, it is particularly important to understand how children cope. What traces of the experience remain in their psyche if they manage to survive?

Partial answers to these questions come from a trio of Australian researchers in their systematic review and meta-analysis of existing research into the psychological effects of cancer on children, published recently in Psycho-Oncology. Their findings give us reason for some optimism. It turns out children and adolescents affected by cancer are no more likely to develop post-traumatic stress symptoms than their healthy peers. In fact, several studies have found that children affected by cancer go on to experience greater than usual adjustment and quality of life and lower anxiety and post-traumatic stress symptoms. In psychology, we refer to this as the post-traumatic growth (PTG) effect, which can arise from the struggle with highly challenging life circumstances or trauma.

Results from the analysed body of research – 18 studies in all – indicated that participants who were older when surveyed, or older when diagnosed with cancer, were more likely to experience PTG. Most likely, this is a product of the development of abstract reasoning that occurs sometime after the 11th or 12th year of life, when adolescents begin to formulate their own value systems, take an interest in philosophical ideas, and think about the meaning of life – cognitive processes that are involved in the development of PTG.

The meta-analysis also revealed a small but statistically significant correlation between post-traumatic stress and PTG. By definition, the struggle with trauma is necessary for the development of PTG; it is during such struggles with disease that a teenager may experience obsessive thoughts about death, but may also begin to appreciate life more. The vision of losing the normalcy that healthy people take for granted can turn into an affirmation of that very normalcy. It is precisely these difficult experiences in youth that contribute to the formation of individuals whom psychologists refer to as “prematurely mature”.

The least surprising result was the positive link between PTG and having greater social support, as well as between PTG and being more optimistic. Unfortunately, these correlations don’t tell us whether social support and optimism lead to PTG, or if the reverse is true. Further studies may identify the causal relations between these factors. In turn this may help inform the development of support programmes targeting children with cancer and other difficult illnesses.

The new meta-analysis also looked for potential correlations between PTG and depression, anxiety, pessimism, and quality of life, but all were statistically nonsignificant. Jasmin Turner and her colleagues suspect that this could be caused by small sample sizes.

For decades, psychology has treated negative human experiences as unequivocally harmful to people, assuming that they lead to post-traumatic stress disorder, and poorer psychological and physical functioning. Regardless of their actual psychological state, people who have survived negative experiences have sometimes been treated like patients in need of help, and at times this help has even proved harmful to them. The finding that, following some traumatic situations, tears can turn into pearls is one of psychology’s more significant and promising discoveries. Understanding when and why this happens is a means for science to make a clear contribution to improving people’s well-being. And while the new results are not very strong, they may nevertheless help guide future research, potentially informing social support and clinical interventions for cancer patients. Also important is consideration of the factors leading to PTG and how to share this information appropriately and sensitively with people suffering illnesses. However, before we can label any such programmes as “evidence based”, further studies are necessary, particularly longitudinal research.

That said, it is not worth waiting passively for the results of such studies. With the knowledge that the experience of trauma can lead to PTG, we can begin providing intelligent support to people who are down on their luck – encouraging reflection on the experience of trauma, rather than mechanical consolation with exhortations to think positively. Intelligent support should be an unobtrusive presence, without encouraging the rejection of negative emotions, and without attempts at eliciting positive ones. Almost certainly, this kind of approach will be different from the offerings we have received for many years from some unreflective positive psychologists.

Correlates of post-traumatic growth following childhood and adolescent cancer: A systematic review and meta-analysis

Post written by Dr Tomasz Witkowski for the BPS Research Digest. Tomasz is a psychologist and science writer who specialises in debunking pseudoscience in the field of psychology, psychotherapy and diagnosis. He has published over a dozen books, dozens of scientific papers and over 100 popular articles (some of them in Skeptical Inquirer). In 2016 his latest book Psychology Led Astray: Cargo Cult in Science and Therapy was published by BrownWalker Press. He blogs at http://ift.tt/2futFR5.

http://ift.tt/2rashdY Source: http://ift.tt/2bxzvQM



Researchers say this 5-minute technique could help you fall asleep more quickly

By Christian Jarrett

You’ve had all day to worry, but your brain decides that the moment you rest your weary head upon your pillow is the precise instant it wants to start fretting. The result, of course, is that you feel wide awake and cannot sleep. There are two possible solutions: (1) spend five minutes before lights out writing about everything you have done, which might give you a soothing sense of achievement; or (2) spend five minutes writing a comprehensive to-do list, which could serve to off-load your worries, or perhaps only make them more salient. To find out which is the better strategy, a team led by Michael Scullin at Baylor University invited 57 volunteers to their sleep lab and had half of them try the first technique and half the second. Their findings are published in the latest issue of the Journal of Experimental Psychology: General.

The participants, aged 18 to 30, attended the sleep lab at about 9pm on a weekday night. They filled out questionnaires about their usual sleep habits and underwent basic medical tests. Once in their sound-proofed room and wired up to equipment that uses brain waves to measure sleep objectively, they were told that lights out would be 10.30pm. Before they tried to sleep, half of the participants spent five minutes “writing about everything you have to remember to do tomorrow and over the next few days”. The others spent the same time writing about any activities they’d completed that day and over the previous few days.

The key finding is that the participants in the to-do list condition fell asleep more quickly. They took about 15 minutes to fall asleep, on average, compared with 25 minutes for those in the “jobs already done” condition. Moreover, among those in the to-do list group, the more thorough and specific their list, the more quickly they fell asleep, which would seem to support a kind of off-loading explanation. Another interpretation is that busier people, who had more to write about, tended to fall asleep more quickly. But this is undermined by the fact that among the jobs-done group, those who wrote in more detail tended to take longer to fall asleep.

“Rather than journal about the day’s completed tasks or process tomorrow’s to-do list in one’s mind, the current experiment suggests that individuals spend five minutes near bedtime thoroughly writing a to-do list,” the researchers said.

Unfortunately, the experiment didn’t have a baseline no-intervention control group, so it’s possible that, rather than the to-do list writing speeding up sleep onset, journaling about completed jobs actually made it harder to fall asleep. Also, note that the current sample didn’t have any sleep problems. Scullin and his team say the next step is to conduct a longer-running randomised controlled trial of the to-do list intervention outside of the sleep lab, with people who do and don’t have sleep-onset insomnia.

The Effects of Bedtime Writing on Difficulty Falling Asleep: A Polysomnographic Study Comparing To-Do Lists and Completed Activity Lists

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest

http://ift.tt/2B1x2pZ Source: http://ift.tt/2bxzvQM



New findings pose more problems for the embattled concept of the microaggression

By Alex Fradera

“Microaggressions” are seemingly innocuous words or behaviour that supposedly communicate a bias toward minority groups, such as asking Asian Americans where they are from, implying that they are not really part of the USA. According to advocates of the concept, microaggressions cause real harm, even if unintended by the perpetrator. However, the theoretical and evidential support for the concept of microaggressions is far from clear, as detailed in Scott Lilienfeld’s recent thorough critique, which recommended the term be revised or at least re-examined. Now, Craig Harper, a psychologist at Nottingham Trent University, has published a study as a pre-print online at PsyArXiv that, he argues, reveals a further key problem with the concept of the microaggression.

As it’s usually defined, a microaggression only counts as such when a majority group member commits an act that a minority group member perceives as a slight. The term was coined to account for slights against Black people, and has since expanded to other groups. On his Psychology Today blog, Derald Sue at Columbia University, one of the originators of the microaggression concept, states that microaggressions are “everyday verbal, nonverbal, and environmental slights, snubs, or insults, whether intentional or unintentional, that … target persons based solely upon their marginalized group membership”. For his new study, Harper’s aim was to examine whether the experience of “being microaggressed at” really does flow in only one direction.

Harper recruited around 400 US participants online and split them into three groups according to their political beliefs: conservative, liberal, or moderate. They also answered questions about their pride in belonging to their political grouping.

The participants then read six fictional scenarios in which a professor made a contentious claim to his students. For example, in one, a male professor explained that the reason women are under-represented in certain professions is entirely due to their personal choice. A female student then responded in a frustrated manner, but was told by the professor to calm down and that this was not a place for “policy agendas.” This is a fairly typical example of what the literature considers a microaggression.

Crucially, the scenarios were balanced so only half the microaggressions targeted people who would normally be seen as victims: women, Black people, and liberals. The other three scenarios kept the topics (such as educational attainment and political diversity), but switched the social identities of the perpetrator and victim. For example, in one case, a female professor justified lower rates of men in certain professions and then showed impatience towards a male student who complained.

After reading each scenario, the participants said whether they thought the student was right to feel insulted or had been overly sensitive. They also said whether they thought the professor was bigoted.

Unsurprisingly perhaps, the participants’ own political orientation shaped their responses. Harper found that political liberals judged the fictional professors more harshly when they targeted racial minorities, women and those on the political left, compared to other targets. Participants with more conservative leanings were less bothered by the scenario content overall, but showed the opposite pattern, being more critical when majorities, men and conservatives were targeted.

Harper says this is important because it shows, contrary to the conventional definition of the microaggression concept, that microaggressions aren’t only experienced by those who fall into certain minority political categories. Instead it seems that anyone who thinks their in-group has been slighted by an out-group member may feel as if they’ve been “microaggressed”.

Possibly relevant to one’s sensitivity to potential microaggressions, according to Harper, is the concept of “collective narcissism” – how much we believe in the superiority of our in-group (typified by agreement with statements like “If people with my political views had a major say in the world, the world would be a much better place”). Based on the participants’ ratings of their pride in their political in-group, there wasn’t any evidence that this factor mattered for the liberal group’s perception of microaggressions. Among the conservative participants, however, it was those showing more collective narcissism who showed greater sensitivity to slighted right-wing targets.

This finding for collective narcissism is tentative at this stage. In fact, Harper’s paper has not yet been peer reviewed and there may be further interpretations of the results that Harper hasn’t considered. But it seems clear that once you are willing to put aside the lens of power to understand microaggressions, you can study the concept like other forms of motivated social cognition we understand quite well, such as, Harper says, the tendency “to shun or discredit those with whom we are ideologically opposed”. These new findings suggest that those who argue microaggressions are a societal concern specifically afflicting minorities may have to recognise that other interest groups – especially in spaces that lean left, not right, like most universities – will have a claim to play the same game.

Political Microaggressions Across the Ideological Spectrum

Alex Fradera (@alexfradera) is Staff Writer at BPS Research Digest

http://ift.tt/2Dp4lpE Source: http://ift.tt/2bxzvQM



New test of children’s Environmental Sensitivity identifies three groups: orchids, dandelions and tulips

By Christian Jarrett

It’s widely accepted children’s development reflects an interaction between their genes and the environment they are raised in. More tentative is the intriguing idea that the role of the environment is more consequential for some children than others. According to this view, a minority of children are environmentally sensitive “orchids” who suffer disproportionately in adversity, but who especially thrive in positive conditions.

To date, research into this idea has been stifled by the lack of a short, reliable test of children’s Environmental Sensitivity. As reported in Developmental Psychology, a team led by Michael Pluess at Queen Mary University of London has now developed a 12-item scale for measuring children’s Environmental Sensitivity. Preliminary work using the test supports the importance of the concept and suggests children fall into three groups: orchids; dandelions, who are relatively unaffected by the environment; and tulips, who are midway between the two.

The researchers started by asking over 300 children aged 11 to 14 years, at a school in East London, to complete an established adult measure of Environmental Sensitivity. Based on the children’s answers, the researchers removed unnecessary items and whittled the measure down to create a “Highly Sensitive Child” (HSC) test that asks children to rate how strongly they agree with the following items:

  • I find it unpleasant to have a lot going on at once
  • Some music can make me really happy
  • I love nice tastes
  • Loud noises make me feel uncomfortable
  • I am annoyed when people try to get me to do too many things at once
  • I notice it when small things have changed in my environment
  • I get nervous when I have to do a lot in little time
  • I love nice smells
  • I don’t like watching TV programs that have a lot of violence in them
  • I don’t like loud noises
  • I don’t like it when things change in my life
  • When someone observes me, I get nervous. This makes me perform worse than normal

Next, the researchers put their new test through its paces by giving it, together with other personality measures, to 258 more children, aged 11-12 years, also at a school in East London; to 104 children (aged 8-11 years) at two London primary schools, who took the new test twice, several weeks apart; and to nearly 1,500 teenagers from across England and Wales, aged 15-19 years.

Data from these tests showed that, like the adult version, the new test appears to tap three separate factors (Aesthetic Sensitivity; Ease of Excitation; and Low Sensory Threshold) that correlate with each other and contribute to a general trait of Environmental Sensitivity. Importantly, Environmental Sensitivity appeared to be distinct from other traits already established in psychology, such as trait Neuroticism (or emotional instability). There was also notable stability in children’s Environmental Sensitivity scores when they took the test twice, weeks apart.

In a final analysis, the researchers combined all their data and looked to see whether the distribution of Environmental Sensitivity scores supported the idea that a minority of children are orchids, while the majority are “dandelions” and less sensitive to the environment, positive or negative. In fact, the new data suggested that scores tend to fall into three groupings, with roughly 30 per cent of children matching the environmentally sensitive “orchid” category, 30 per cent being “dandelions” and showing a distinct lack of Environmental Sensitivity, and the remainder being “tulips”, midway between the two other groups.

More research is needed to test the reality of these categories and how they differ from each other, or whether it might actually be more accurate to see variation in Environmental Sensitivity as an unbroken continuum. It will also be interesting to see whether and how patterns of Environmental Sensitivity vary across cultures, and what factors influence a child’s score on the test.

What’s already exciting from a practical and theoretical perspective, though, is that early research (conducted separately from the current study) using the new Highly Sensitive Child test suggests that Environmental Sensitivity is a meaningful concept with important implications. For instance, a long-term study in the US of juvenile offenders (currently unpublished) found that risk of re-offending was more strongly related to the quality of the home environment, good and bad, among those who scored high on Environmental Sensitivity. Similarly, another study found that both positive and negative parenting styles were more consequential for children rated higher by their parents on Environmental Sensitivity (using an adapted version of the new HSC test).

“Environmental Sensitivity is an important individual characteristic that is related to, but largely distinct from, other common temperament and personality traits,” the researchers said. “The current study suggests that it is possible to measure Environmental Sensitivity reliably in children and adolescents with the Highly Sensitive Child scale, a 12-item self-report measure with good psychometric properties.”

— Environmental sensitivity in children: Development of the Highly Sensitive Child Scale and identification of sensitivity groups

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest

http://ift.tt/2DiFBPW Source: http://ift.tt/2bxzvQM



Researchers have tested ways to reduce the collective blaming of Muslims for extremism

By Emma Young

Terror attacks by Muslim extremists tend to provoke hate crimes in response. After the London Bridge and Borough Market attacks in 2017, and the Boston Marathon bombing in 2013, for example, there was a spike in the number of reports of verbal and physical attacks on innocent Muslims. Two weeks after the London Bridge attack, a British non-Muslim man even drove his van into worshippers leaving the Finsbury Park Mosque in London, killing one and injuring 11.

“People have a tendency to hold groups collectively responsible for the actions of individual group members, which justifies ‘vicarious retribution’ against any group member to exact revenge,” note the authors of a new paper in Personality and Social Psychology Bulletin that explores how to short-circuit this cycle of violence.

In what the researchers dub an “interventions tournament”, they tried out various methods of reducing the collective blaming of all Muslims for attacks by individual extremists. Most failed. But there was one clear winner: an intervention that encouraged non-Muslims to see the hypocrisy in blaming all Muslims for the appalling actions of a few individuals, but not all Christians for the violent actions of an extremist few.

Emile Bruneau at the University of Pennsylvania led the work, beginning with a study of the importance of collective blaming, involving 193 mostly white, Christian Americans recruited online. As predicted, the researchers found that those participants who held Muslims collectively responsible for the terror attacks in Paris in November 2015 were also more likely to endorse a suite of anti-Muslim attitudes and beliefs.

Next, the researchers selected eight two- to four-minute videos containing different psychological “elements” that might help reduce belief in collective blame. For example, one of the videos highlighted the diversity of the Muslim experience – challenging the idea that “all Muslims are the same”. Two videos depicted non-Muslim Americans demonstrating that they do not hold Muslims collectively responsible for terror incidents – by donating to a vandalised mosque, for example. This kind of “social proof” has been shown in other studies to strongly influence behaviour.

A total of 1,765 non-Muslim Americans watched one of these eight videos, or no video (as a control), or a “negative control” – a video interview with a Syrian-born woman who attacked Islam as backward and violent. This video was expected to increase collective blame, and the analysis showed that it did – highlighting, the researchers write, the potential danger of anti-Muslim rhetoric by politicians and others.

Only one video reduced collective blame. This was an Al Jazeera interview with a Muslim American woman who highlighted the hypocrisy of holding all Muslims collectively responsible for extremist violence. Specifically, she discussed the tendency by Christians to blame all Muslims for terror attacks, but not to blame all Christians for violent acts by extremist individual Christians. After watching this video, participants not only saw Muslims as less collectively responsible for extremism, but also showed less anti-Muslim prejudice.

To test whether the key ingredient of this video was the way it highlighted hypocrisy, the researchers asked more participants to reflect on their own (and White people’s) lack of collective responsibility for the actions of extreme individual group members, and they found this also lowered belief in the collective blame of Muslims, as well as reducing anti-Muslim prejudice, attitudes and behaviour.

It would be interesting, of course, to know how long the effects of a hypocrisy intervention might last in the real world. But, as the researchers note in their conclusion, the findings “contribute to a long tradition of research demonstrating that highlighting hypocrisy can drive behaviour change”.

Interventions Highlighting Hypocrisy Reduce Collective Blame of Muslims for Individual Acts of Violence and Assuage Anti-Muslim Hostility

Image: A Muslim protester holds a placard reading “Islam, Peace, Islam” during a demonstration outside Atocha Station against the recent Paris terrorist attacks on January 11, 2015 in Madrid, Spain (credit: Pablo Blazquez Dominguez / Stringer).

Emma Young (@EmmaELYoung) is Staff Writer at BPS Research Digest

http://ift.tt/2AMuJa4 Source: http://ift.tt/2bxzvQM



New insights into lifetime personality change from “meta-study” featuring 50,000 participants

By Christian Jarrett

It’s a question that goes to the heart of human nature – do our personalities change through life or stay essentially the same? You might think psychology would have a definitive answer, but this remains an active research question. This is partly because of the practical challenge of testing the same group of individuals over many years. Now a major new contribution to the topic has been made available online at the PsyArXiv repository. The researchers, led by Eileen Graham at Northwestern University, have compared and combined data from 14 previously published longitudinal studies, together involving nearly 50,000 participants from the US, Europe and Scandinavia. Their findings confirm and extend existing knowledge, showing how personality traits tend to change through life in predictable ways.

Graham and her colleagues started by looking for existing long-term studies into health and ageing that had captured data on at least one of the Big Five personality traits (Openness, Conscientiousness, Extraversion, Agreeableness, Neuroticism) on at least three separate occasions among the same sample of people. Although these long-term studies measured personality, previous investigators hadn’t necessarily looked at this aspect of their data.

Among the identified studies that met the required criteria were the Einstein Aging Study, the Midlife in the United States study, The Berlin Aging Study, The Lothian Birth Cohort and the Swedish Adoption/Twin Study of Aging. In some cases, personality data was collected from the same individuals over several decades. There was a bias in the included studies toward testing people late in life, but this helps counterbalance existing studies of lifetime personality change which have been skewed toward younger participants.

Combining data from all the studies showed that four of the five main personality traits changed to a statistically significant degree, on average, through life, contradicting William James’ famous assertion that personality is set like plaster after age 30. The exception was trait Agreeableness (related to warmth and empathy). In fact, this trait did change within each individual study, but in different directions in different studies (sometimes increasing through life, sometimes diminishing), such that it appeared stable when considered in aggregate.
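To see why Agreeableness could shift within every study yet look flat overall, here is a minimal sketch using entirely hypothetical per-decade slopes (not figures from the preprint): when roughly half the samples show increases of a similar size to the decreases shown by the other half, the pooled average lands near zero.

    # Hypothetical per-decade change in Agreeableness from six imaginary samples
    # (positive = the trait rises with age, negative = it declines).
    study_slopes = [0.15, -0.12, 0.20, -0.18, 0.10, -0.14]

    # Each sample on its own suggests change, but the simple pooled average is
    # close to zero, so the trait looks stable in aggregate.
    pooled = sum(study_slopes) / len(study_slopes)
    print(f"Pooled average change per decade: {pooled:+.3f}")  # roughly +0.002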

Putting Agreeableness aside, the overall pattern was for the other traits to decline across the lifespan by about 1-2 per cent per decade, such that participants became, on average, more emotionally stable (save for an uptick in Neuroticism at the very end of life), but generally less outgoing, less open-minded, and less orderly and self-disciplined. This is somewhat consistent with the previously described Dolce Vita (literally “sweet life”) effect, which describes how we change in late life in response to having fewer responsibilities.

However, it’s worth reiterating that there was a bias in the included studies towards older samples. Focusing on just those studies with younger participants also provided evidence for early life increases in Conscientiousness, Extraversion and Openness (with peaks occurring in mid-life), which is consistent with another established theory known as the “maturity principle”, which states that our personalities tend to improve through life as we adapt to the growing challenges of work and family responsibilities (the observed declines in Neuroticism also fit this picture).

It’s important to remember these findings relate to sample averages, based on how personality trait levels were seen to change when data from all participants was combined. Consistent with past research, results from 12 of the 14 analysed studies revealed a good deal of individual variation in these patterns. As Graham and her colleagues put it, “not everyone changes at the same rate, or in the same direction”.

There were also some intriguing hints at how these personality dynamics might vary across cultures. Specifically, the US samples showed more marked and consistent declines in Extraversion through life compared with European samples. The researchers interpreted this as US citizens becoming less inclined with age to conform to the strong pressure in American culture to behave like an extravert.

The new findings leave many questions about personality change unanswered, such as why some people show trait changes that don’t fit with the usual pattern. However, this paper’s most important contribution is probably to show how sub-fields like this, which depend on challenging, hard-to-repeat, long-term studies, can respond to the so-called replication crisis in psychology. It’s fairly easy to re-run a simple laboratory task with students; it’s much harder to try to replicate the findings of a decades-long personality change study.

“Any one of the 14 samples used here could have comprised a publication on its own,” Graham and her colleagues explained, “with some claiming a given increase, others claiming a decrease, and still others arguing that a trait does not change at all. This has the potential to sow great confusion … However, when considered as a group with other samples, a fuller picture emerges.”

A Coordinated Analysis of Big-Five Trait Change Across 14 Longitudinal Studies (Note this study is a preprint which has not yet been subjected to peer review)

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest and the author of a forthcoming book on personality change

http://ift.tt/2EpI22H Source: http://ift.tt/2bxzvQM



Facts aren’t everything – understanding parents’ moral reasons for avoiding vaccination

By Emma Young

Last year, so few people contracted measles in England and Wales that the disease was declared technically “eliminated”. The national MMR (measles, mumps and rubella) vaccination programme is to thank. But set against this welcome news were some imperfect stats: in England in 2016/17, only 87.6 per cent of children had received both of the required doses of the vaccine by their fifth birthday – a drop compared with the previous two years. At least part of the reason was a reluctance among some parents to have their children vaccinated. This is a problem that affects other countries, and other vaccines, too. And it’s troubling, because clusters of unvaccinated or under-vaccinated children are more susceptible to disease outbreaks – indeed, a measles outbreak in Leeds and Liverpool just last year affected unprotected children, a reminder of why all children should be vaccinated.

In a new paper, published in Nature Human Behaviour, a team led by Avnika Amin at Emory University, US, reveal a previously overlooked explanation for “vaccine hesitancy”, as it’s called – and it’s to do with parents’ basic moral values.

Recent interventions designed to improve childhood vaccination rates have focused on providing parents with facts about vaccines. But while educational campaigns of this kind can be successful in the short term, they can backfire, making some parents even less willing to vaccinate their children than they were before. This makes sense if you consider that parents’ decisions are not based only on the facts, but also on their moral values. By also addressing this influence on parents’ attitudes to vaccination, it may be possible to develop more effective interventions and help increase vaccination rates further.

Our moral values – the extent to which we prioritise loyalty or fairness, for example – are known to guide our decisions and behaviour in other areas, such as climate change and philanthropy. So the researchers reasoned that it made sense to look at how they might influence attitudes towards vaccination. To do this, Amin and her colleagues surveyed 1,007 US parents of children aged under 13 about their attitudes to vaccination and about their moral values, and looked to see if there was a connection between the two.

Almost three quarters of parents fell into the “low hesitancy” category (they didn’t have a problem with vaccines); about 11 per cent were classed as “medium hesitancy” (they expressed concerns about vaccinations and had perhaps delayed their child’s vaccinations as a result); and 16 per cent were classed as “high hesitancy” (they had seriously delayed or even refused vaccinations).

In terms of their moral values, both medium- and high-hesitancy parents were twice as likely as low-hesitancy parents to place a strong emphasis on “purity” (one of six “moral foundations”, purity is associated with disapproval of acts that are deemed “disgusting” or “unnatural”). The high-hesitancy parents were also twice as likely as low-hesitancy parents to place a strong emphasis on liberty, which involves valuing individual freedom, and they were half as likely to endorse obedience to authority. Belief in the importance of fairness (another moral foundation) was not related to vaccine hesitancy; nor were parents’ gender or political ideology.

A second study, using a separate group of 464 parents, validated these findings. But the data from both studies only show a correlation between particular moral standpoints and attitudes towards childhood vaccinations, not a causal link. Nonetheless, the researchers said their results suggest that “health decisions are, to some extent, linked with moral concerns” and that there was promise in exploring the impact of health messages “framed using the moral foundations associated with medium or high hesitancy”.

For example, they suggested an intervention framed in terms of purity might read, “Boost your child’s natural defences against diseases! Keep your child pure of infections – Vaccinate!” A message framed in terms of liberty might read, “Take personal control of your child’s health! Vaccinations can help your child and others be free to lead a happy and healthy life!”

Emma Young (@EmmaELYoung) is Staff Writer at BPS Research Digest

Association of moral values with vaccine hesitancy

http://ift.tt/2D6zwWY Source: http://ift.tt/2bxzvQM



What’s your stress mindset?

By Christian Jarrett

Do you see stress as helpful or harmful? If you recognise that it can have upsides – by sharpening your focus and boosting your motivation, and that stressful challenges can offer learning and achievement opportunities – then you have a positive stress mindset (conversely, if you see stress as unpleasant, debilitating and threatening, then you have a negative stress mindset).

A new diary study in the European Journal of Work and Organizational Psychology has explored the implications of stress mindset for the workplace – surprisingly, one of the first investigations to do so. The researchers, led by Anne Casper at the University of Mannheim, found that anticipating a large workload on a given day was associated with employees upping their performance that day, taking more proactive steps to meet the challenge, and ending the day feeling more energised, but only if they had a positive stress mindset.

The findings come from 171 employees in various occupations and industries, mainly education, health and social care, and IT. Their average age was 39 and just over half were men. They first completed a survey about their stress mindset. Then they completed an online diary three times a day – morning, after work, and before bed – for five working days. In the morning, the diary prompted them to answer questions about their expected workload that day. Later, it asked them about any constructive steps they’d taken to meet the day’s challenges, such as planning and scheduling, and seeing the challenge as a learning opportunity. In the evening, participants also indicated how well they’d performed, and how energetic they felt.

For employees with a positive stress mindset, there was an association between expecting a larger workload and taking more proactive steps to cope. In turn, these proactive, constructive behaviours (the researchers call this “approach coping”) were related to performing better and feeling more energised. For those with a negative stress mindset, the association was reversed – the more workload they anticipated, the less they engaged in constructive coping behaviours, which in turn was related to worse self-rated performance and less energy at the day’s end. This is consistent with the idea that people with a negative stress mindset try to cope through avoidance.

The new findings complement previous research that’s shown our stress mindset can influence how we respond to challenges. For instance, students with a positive stress mindset are more inclined to seek out feedback after completing a stressful task (presumably because they see it as a chance to learn).

Casper and her colleagues said their new results show the benefits that could come from raising people’s awareness of the concept of stress mindset. Promisingly, they said there is some evidence that people can be helped to develop a positive stress mindset. However, they also acknowledged some limitations of their research – for instance, they didn’t look at employees’ expectations about how difficult their work would be (only how much they had to do). Similarly, the measure of stress mindset lacked nuance: some people may think modest amounts of stress are helpful, for example, but that extreme stress is harmful.

I’d add another caveat – this study doesn’t provide strong causal evidence for the effects of stress mindset. That would require altering some employees’ stress mindset through a training programme and looking to see what effects this had on their behaviour and performance.

Finally, it’s worth noting, as the researchers do, that this study focused on effects within a single day. It’s well-established that a chronic excessive workload is associated with negative outcomes for health and performance (most likely regardless of mindset). This study shows that having a positive stress mindset may help us cope with – perhaps even benefit from – a particularly challenging, high intensity work day, but this shouldn’t be taken as a justification for bosses to overburden their staff long-term.

Mindset matters: the role of employees’ stress mindset for day-specific reactions to workload anticipation

Image is Figure 2 from Casper et al, 2017.

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest

http://ift.tt/2CUQU0B Source: http://ift.tt/2bxzvQM



8 studies and 2 podcasts to help you keep your New Year’s resolutions



Brain differences in avid players of violent video games suggest they are “callous, cool and in control”

By guest blogger Helge Hasselmann

Video games do not enjoy the best of reputations. Violent games in particular have been linked with aggression, antisocial behaviour and alienation among teens. For example, one study found that playing a mere 10 minutes of a violent video game was enough to reduce helping behaviour in participants.

However, some experts are sceptical about whether games really cause aggression and, even if the games are to blame, it remains unclear what drives their harmful effects. Earlier studies identified empathy as a key trait that may be affected by violent gameplay. Now a study by Laura Stockdale at Loyola University Chicago and her colleagues, published in Social Cognitive and Affective Neuroscience, has taken a closer look at how gamers and non-gamers differ at a neural level, uncovering evidence that suggests chronic violent gameplay may affect emotional brain processing, although more research is needed to confirm this.

Participants were classified as frequent or infrequent players of video games depending on weekly usage – at least 30 hours of screen time per week was considered frequent, while no more than five hours a week was considered infrequent. Next, the scientists looked at the top three games each participant played – more specifically, whether the majority were violent (for example, a shooter such as Call of Duty) or non-violent (for example, FIFA). This yielded a sample of 30 frequent players of violent video games (gamers) and 31 infrequent players of non-violent games (controls). Both groups had an average age of 21 and all were male.

Participants first completed an established empathy questionnaire, with the gamers scoring lower than the controls.

For the actual experiment, the researchers recorded participants’ brain waves using electroencephalography (EEG) while they completed a modified version of the “stop-signal task” (SST). The SST contained male and female faces with fearful or happy expressions and there were two types of trial: On Go trials, participants had to indicate as fast as possible whether the face was male or female by pressing a button. On No-go (stop) trials – indicated by a box around the face – participants had to withhold making any response. This version of the SST is generally considered an implicit measure of emotion processing because participants have to pay attention to the gender of the faces while trying to ignore the emotions on the faces.
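For readers unfamiliar with the paradigm, the trial logic can be sketched in a few lines of code. This is a hypothetical illustration of the task structure described above, not the researchers’ stimulus script; the trial count, the proportion of No-go trials and the function names are all assumptions.

    import random

    GENDERS = ["male", "female"]
    EMOTIONS = ["happy", "fearful"]   # task-irrelevant dimension

    def make_trials(n_trials=100, p_nogo=0.25, seed=0):
        # Build a randomised trial list; p_nogo is an assumed stop-trial rate.
        rng = random.Random(seed)
        return [{"gender": rng.choice(GENDERS),
                 "emotion": rng.choice(EMOTIONS),
                 "go": rng.random() > p_nogo}      # False = No-go (framed face)
                for _ in range(n_trials)]

    def score_trial(trial, response):
        # response is "male", "female", or None (no button press)
        if trial["go"]:
            return response == trial["gender"]     # Go: classify the face's gender
        return response is None                    # No-go: withhold any response

    # Example: a simulated participant who always answers correctly on Go trials
    # and successfully withholds on No-go trials scores 100 per cent.
    trials = make_trials(10)
    responses = [t["gender"] if t["go"] else None for t in trials]
    print(sum(score_trial(t, r) for t, r in zip(trials, responses)) / len(trials))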

The gamers and controls performed similarly on this task. Crucially, however, the brain activity of the two groups differed.

The researchers were particularly interested in three components of the EEG recordings during the stop-signal task, the so-called P100, N170 and N200/P300 (these positive or negative spikes of neural activity, known as “event-related potentials”, occur at different times after the stimuli). The P100 (a positive spike 100ms after the stimulus) is one of the earliest indices of processing of visual information and has been associated with attention to emotional information. The N170 is evoked by viewing a human face, especially if it displays a negative emotion (such as fear). Finally, the N200/P300 is generally believed to be associated with inhibiting responses.
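As a rough illustration of where such components come from: an event-related potential is obtained by cutting the continuous EEG into short epochs time-locked to each stimulus and averaging them, so that activity unrelated to the stimulus tends to cancel out. The sketch below uses made-up data and an assumed sampling rate; it is not the analysis pipeline used in the study.

    import numpy as np

    FS = 500                              # assumed sampling rate (Hz)
    N_SAMPLES = int(0.5 * FS)             # keep 500 ms after each stimulus onset

    def average_erp(eeg, onsets, n_samples=N_SAMPLES):
        # Average stimulus-locked epochs from a single-channel recording.
        epochs = np.stack([eeg[o:o + n_samples] for o in onsets
                           if o + n_samples <= len(eeg)])
        return epochs.mean(axis=0)

    rng = np.random.default_rng(0)
    eeg = rng.normal(0, 5, size=FS * 60)            # one minute of noisy signal
    onsets = np.arange(FS, len(eeg) - FS, FS)       # one stimulus per second
    for o in onsets:
        eeg[o + int(0.1 * FS)] += 5.0               # brief deflection ~100 ms post-stimulus

    erp = average_erp(eeg, onsets)
    peak_ms = 1000 * np.argmax(erp) / FS
    print(f"Averaged waveform peaks at ~{peak_ms:.0f} ms (a P100-like component)")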

On Go trials, gamers showed a reduced P100 in response specifically to happy faces, as compared to the controls. This suggests the gamers may have been paying less attention to positive facial stimuli. Interestingly, however, this was explained by differences in empathy rather than screen time or game content: when the analysis was restricted to gamers who scored high on the empathy questionnaire, they did not differ neurally from the controls. This indicates that empathy manifests in the earliest stages of emotional information processing, and this could be one way that violent video games affect perception. In line with this, the gamers showed an earlier N170 to happy than to fearful faces, which is the opposite of what normally happens (and the opposite of the pattern shown by the controls). This may indicate that gamers gave reduced neural priority to threatening faces compared with what is typical, perhaps due to their overexposure to threatening content in games.

Finally, gamers seemed to require fewer neural resources to inhibit their responses (as shown by a reduced N200/P300). It could be that video games train cognitive function, which is why gamers need fewer mental resources for this task. Alternatively, and in line with the finding of a reduced P100 amplitude, it could be that gamers simply pay less attention to emotional information, are less distracted by it, and thus need fewer mental resources for the task.

Taken together, the researchers said their results are consistent with the idea that the chronic playing of violent video games affects people’s empathy and the way their brains process emotional facial expressions and control their behavioural responses. In short, chronic violent gameplay may leave players “callous, cool and in control”, they said.

While interesting, this study is not free of limitations. Clearly, the biggest drawback is the cross-sectional design, which cannot clarify whether violent video gameplay causes lower empathy and reduced emotional processing or the other way around. Also, it is not easy to disentangle the effects of video game content (i.e. whether it is violent or not) and gaming duration, because there was no group of frequent players of non-violent video games or infrequent players of violent video games. Plausibly, the time gamers spend each week on their consoles could matter more than what the games are about. And since the sample consisted entirely of young Western men, any association between video gaming and empathy found here may not generalise beyond that context.

So more research is clearly needed. But if chronic exposure to violent video games really is associated with lower empathy and emotional callousness, this could have major implications for policymakers.

Cool, callous and in control: superior inhibitory control in frequent players of video games with violent content

Image: A visitor plays ‘Call of Duty’ at the eSports World Convention (ESWC), Porte de Versailles, Paris, February 2017 (Photo by Chesnot/Getty Images)

Post written for BPS Research Digest by Helge Hasselmann. Helge studied psychology and clinical neurosciences. Since 2014, he has been a PhD student in medical neurosciences at Charité University Hospital in Berlin, Germany, with a focus on understanding the role of the immune system in major depression.

http://ift.tt/2CPMLel Source: http://ift.tt/2bxzvQM



