You Don’t Need a Personal Genetics Test to Take Charge of Your Health

The online storefront for the consumer genetics company Orig3n features an image of a young woman facing toward a sepia horizon. Her tresses are wavy, her triceps enviably toned. Her determined stance complements the copy floating beside her: "Take charge of your future," it reads. "Orig3n DNA tests uncover the links between your genes and how you think, act, and feel. The more you know, the easier it is to reach your highest potential."

It's the promise of a growing number of services: Genetic insights you can act on. There are tests tailored to tell you about your diet, your fitness, your complexion—even your wine preference. Helix, another consumer genetics company, sells a Wine Explorer service that recommends wine "scientifically selected based on your DNA."

But researchers will tell you to approach lifestyle-tailored testing kits with extreme skepticism. "What you see in the consumer genetics market is that legitimate genetic findings, often from studies with very large sample sizes, are being turned around and marketed to people in a way that implies it's going to be actionable for individuals," says Harvard geneticist Robert Green, who's been researching direct-to-consumer genetic testing for close to 20 years. But in most cases, he says, the extent to which consumers can act upon their results "really remains to be proven."

Not that researchers aren't trying. On the contrary: This week, scientists led by Christopher Gardner, director of nutrition studies at the Stanford Prevention Research Center, published one of the most rigorous investigations to date into whether dieters can use personal DNA results to identify more effective weight-loss strategies. The researchers compared the effectiveness of low-fat and low-carbohydrate diets in a year-long randomized controlled trial involving more than 600 test subjects. And crucially, they also looked at whether the subjects' genes affected their results. Earlier studies, some led by Gardner, had suggested that a combination of mutations in PPARG, ADRB2, and FABP2, three genes linked to the metabolism of fat and carbohydrates, could predispose test subjects to lose more weight on one diet than the other.

But the results, which appear in this week's issue of the Journal of the American Medical Association (JAMA), found no association between test subjects' genetic profiles and their success with either program; on average, participants lost the same amount of weight regardless of which diet they were assigned. And participants who were assigned diets that "matched" their genetic profile fared no better than those who weren't.

"When I saw the results, this wave of disappointment washed over me," Gardner says. "It was like, wait, it didn't work? None of the genetic variants had an effect?"

Nope. The study was big, well-designed, and pricey (it received funding from the Nutrition Science Initiative, a non-profit devoted to funding rigorous nutrition research), yet it failed to replicate smaller, less carefully controlled studies. Such is science! It also illustrates why establishing the usefulness of home DNA kits will be so difficult and time-consuming. Research into the link between genetic variants and diet will probably continue; for the JAMA study, Gardner and his colleagues examined the predictive power of mutations in just three genes, but there are dozens to consider, in a staggering number of combinations. It's plausible—even likely, Gardner says—that some of these genetic signatures could lead people to more effective diets. "But nutrition is just so complex, it's not likely there's going to be an answer soon."

And that's for nutrition. The odds of somebody funding a rigorous, controlled investigation into the link between your DNA and your ideal exercise regimen are … well … let's just say that kind of research isn't very high on researchers' to-do lists.

"It's hard to make a case for studying anything in the lifestyle realm, because it's pretty low stakes," says geneticist Lawrence Brody, director of the Division of Genomics and Society at the National Human Genome Research Institute. Personalized cancer treatments, rare-disease diagnosis, reproductive health screening—you know, the urgent stuff—these are the types of genomic investigations that receive funding. Which is why, even in a field as large as nutrition research, Brody says there are few examples of studies examining the link between genetics and diet with the level of rigor you find in Gardner's JAMA study. "A lot of researchers don't think it's a high enough priority, or likely enough to show results, to conduct and fund a full randomized trial."

And just think: If it's that hard to conduct a solid study on actionable associations between DNA and diet, imagine how unlikely it is we'll see RCTs on personalized skin care plans, "what makes your child unique," or your "unique superhero traits."

You might expect consumer genetics companies to fund this kind of research themselves. Guess again. Most don't have the cash, and, even for those that do, it's risky to perform such studies in the first place, in the event they turn up results like Gardner's. “Most companies don’t feel they need those kinds of studies to sell a narrative that supports the purchase of their products,” Green says.

All of which should make consumers wary of lifestyle-oriented commercial DNA kits, which occupy a gray area somewhere between tests for ancestry and, say, cancer-associated mutations. The experts I spoke with were all optimistic about the long-term future of home DNA kits, and supportive of people's right to access their genetic information. But right now, for most tests, the evidence base just isn't there. Something to keep in mind the next time a personal genetics company's motivational ad implies its kit can point you toward a more effective diet or workout, or tell you "whether your genes have the raw potential football legend John Lynch looks for in a player."

On the upside, the participants in Gardner's JAMA study lost a combined 6,500 pounds, averaging 13 pounds of weight loss apiece, regardless of their genetic profile, and regardless of their assigned diet. A lot of people hear "genetics" and think "destiny," but the vast majority of the time, that's not how genes work. Which means that the vast majority of the time, you don't need a personal genetics test to take charge of your future.

More Consumer Genetics

Read more:

How to lose your love handles

(CNN)If you ask almost any fitness professional how to lose your love handles, they will tell you two things: No amount of abdominal crunches will make a difference if you don’t first improve your diet and exercise regimen.

And there is no such thing as “spot reducing” when it comes to fat loss, so you can’t target this area alone.
Love handles, the pinchable fat on both sides of your stomach that leads to a “muffin top” appearance, are especially stubborn. They do not respond as quickly to diet and exercise changes as the more dangerous visceral fat that is deep within your abdomen and leads to more of a “potbelly” appearance.
    If you are a woman and your waist is larger than 35 inches around or a man with a waist size greater than 40 inches, you will lose the deep belly fat first. This is actually better for your health, as visceral fat has been linked to numerous medical issues, including high blood pressure, high cholesterol, pre-diabetes and inflammation.
    To lose the subcutaneous fat that forms not-so-lovable love handles (or excess fat on your hips, thighs and buttocks), it’s going to take even more time and effort.
    What is the best diet for losing your love handles? The one you can stick with long-term, as it’s not going to be quick or easy for most people. This is especially true for men, as this is often the last place on their body where they lose weight.
    In terms of specific exercise recommendations, three top trainers shared the advice they give their A-list clients.
    Gunnar Peterson, who trains Hollywood actors and professional athletes including the NBA’s Los Angeles Lakers, advocates a comprehensive lifestyle approach that includes clean eating, adequate sleep, stress management, plenty of hydration, and sprint work mixed with steady-state cardio on alternate days to optimize fat burning.
    Sprint work, also known as high-intensity interval training, involves alternating periods of maximum effort with periods of recovery (not be confused with rest). The duration of sprints can increase and recovery periods can decrease as your fitness levels improve. Steady-state cardio involves working out at approximately 65% to 85% of your maximum heart rate for at least 30 minutes.
    One exercise Peterson does not recommend for reducing love handles is side bends with heavy weights. This type of exercise can actually exacerbate the appearance of love handles by increasing the size of the underlying external oblique muscles. Instead, he recommends working all of the abdominal muscles, not just the obliques, in multiple planes of motion. Exercises that reach those areas include wood chops, medicine ball rotations, medicine ball slams (at a variety of angles) and banded overhead extensions side to side and back to front.
    Ami Jampolis, a top San Francisco Bay Area trainer (and, full disclosure, my sister), recommends high-intensity circuit training several times a week to build or maintain calorie-burning muscle, along with several days of steady-state cardiovascular training.
    High-intensity circuit training involves alternating strength training exercises (such as a pushup) and short bouts of cardiovascular exercises (such as jumping jacks), with minimal rest in between. This type of training is gaining popularity for its fast pace and time efficiency and can be done at home or at a gym that specializes in this type of training.
    For homing in on the oblique muscles, Jampolis agrees with Peterson about avoiding side bends with weights. Instead, she suggests two very simple yet effective exercises that don’t seem to have the bulking effect on the oblique abdominal muscles: side planks and bicycles. Both can be done anywhere and don’t require any equipment.
    Celebrity trainer and nutritionist Harley Pasternak takes an even simpler approach to losing your love handles. In addition to cleaning up their diets, he gives all his clients fitness trackers to ensure that they meet his required 12,000 steps per day and seven hours of sleep per night. He finds that higher-intensity training can often increase appetite, making it harder to stick to a diet, so he prefers simply walking, along with regular strength training that includes working each major body part at least once per week.

    See the latest news and share your comments with CNN Health on Facebook and Twitter.

    Although their approaches may vary, professionals agree with the popular saying “abs are made in the kitchen, not the gym.” If you really want to lose your love handles, start by finding a diet plan that works for you and that you can stick with long-term, and then build in an exercise program that works with your current fitness level, exercise preference, budget and schedule.
    How to do a side plank: Lying on your side, lift your body onto your forearm, keeping hips, feet and shoulders stacked over one another. Aim to hold this position for as long as possible, increasing the time as you get stronger. Make sure not to collapse into your upper back. If your balance is a little off, you can place one foot in front of the other for more stability. Switch sides and repeat three times. As you advance, you can drop your hips to the floor and then bring them back up to the side plank position 15 to 20 times and repeat three sets per side.
    How to do bicycles: Lie on your back with both knees bent at a 90-degree angle and place your hands behind your head. Fully extend one leg out about 6 inches above the ground, bringing your opposite elbow to the opposite knee. With control, bring the extended leg in and extend the other leg out, simultaneously bringing the other elbow to the opposite knee. Make sure to twist from your stomach and not your neck. Control is key on this exercise; speed is not your friend. Aim for switching every two seconds. Aim for three sets of 60 (30 per leg).

    Read more:

    To Stop Climate Change, Educate Girls and Give them Birth Control

    Climate change is a ubiquitous hydra, a many-headed beast that affects everyone and everything in some form. Solutions to climate change range from the effective and the practical to the potentially catastrophically dangerous—but, in this somewhat heated debate, a potent weapon in our arsenal is falling by the wayside: the empowerment of women.



    Robin George Andrews (@squigglyvolcano and is a volcanologist and science writer whose work appears in IFLScience, Forbes, and elsewhere.

    Last year a coalition of scientists, economists, policymakers, researchers, and business people published Project Drawdown, a compendium of ways to prevent carbon dioxide from escaping skywards. Drawing from a plethora of peer-reviewed research, the document ranks 80 practical, mitigating measures—along with 20 near-future concepts—that could push back the oncoming storm.

    Ranked in order of carbon emissions locked down by 2050, the usual suspects made the list. A moderate expansion of solar farms (number 8), onshore wind turbines (number 2), and nuclear power (number 20) would all save tens of billions of tons of equivalent carbon dioxide emissions. Increasing the number of people on plant-rich diets (number 4) and using electric vehicles (number 26) are effective carbon-cutting measures often proposed by climate hawks, and rightly so. The top spot went to managing refrigerants like HFCs, which are incredibly effective at trapping heat within our atmosphere.

    But two lesser-known solutions also made this most practical of lists: the education of girls (number 6) and family planning (number 7). This is a stunning revelation, one that couldn’t be more pertinent, and yet, for the most part, discussions of mitigation and de-carbonization focus heavily on other matters, from the perceived perils and bona fide benefits of nuclear power, to just how quickly solar power is proliferating.

    The link between the education of girls and a smaller carbon footprint isn’t as intuitively obvious as, say, phasing out fossil fuels. But dig a little deeper, and the evidence is overwhelming. It’s clear that getting more girls into school, and giving them a quality education, has a series of profound, cascading effects: reduced incidence of disease, higher life expectancies, more economic prosperity, fewer forced marriages, and fewer children. Better educational access and attainment not only equips women with the skills to deal with the antagonizing effects of climate change, but it gives them influence over how their communities militate against it.

    Although the education of girls in a small number of countries is at, or approaching, parity with boys, for most of the planet, this remains distressingly elusive. Poverty, along with community traditions, tends to hold back girls as boys are prioritized.

    Then there's family planning, something that’s indivisible from the education of girls. The planet is overpopulated, and the demands of its citizens greatly exceed the natural resources provided by our environment.

    Contraception and prenatal care is denied to women across the world, from those in the United States, to communities in low-income nations. It’s either not available, not affordable, or social and/or religious motives ensure that it’s banned or heavily restricted. As a consequence, the world’s population will rise rapidly, consume ever more resources, and power its ambitions using fossil fuels. Carbon dioxide will continue to accumulate in the atmosphere.

    The education of girls and family planning can be considered as a single issue involving the empowerment of women in communities across the world. Drawdown calculated that, by taking steps toward universal education and investing in family planning in developing nations, the world could nix 120 billion tons of emissions by 2050. That’s roughly 10 years’ worth of China’s annual emissions as of 2014, and it’s all because the world's population won't rise quite so rapidly.

    It's farcical that this isn’t forming a major part of the debate over climate change mitigation. It’s not entirely clear why this is the case, but I’d suspect that regressive societal attitudes, along with the tendency of commentators to focus on the battle between different energy sectors, play suppressive roles in this regard.

    Project Drawdown isn't the only group that has recently tied population growth to climate change. A study published last summer also found that having just one fewer child is a far more effective way for individuals in the developed world to shrink their carbon footprint than, say, recycling or eating less meat. For women in wealthy countries, these decisions are often freely made, and fertility rates in those countries are already fairly low. In low-income countries, such individual agency—not to mention contraception—is frequently absent, and fertility rates remain high.

    Just as policymakers, climate advocates, and science communicators should pay attention to Drawdown’s findings, individuals should also do what they can to make sure such a solution comes to pass. Non-government organizations, like Hand In Hand International, Girls Not Brides, and the Malala Fund aren’t just uplifting women, but they’re helping to save the planet too, and they deserve support.

    It's a grim assessment of civilization that, in 2018, humans are still grappling with gender equality. The world would clearly benefit if women were on par with men in every sector of society. We shouldn’t need any more convincing, but the fact that the social ascension of women would deal a severe blow to anthropogenic warming should be shouted from the rooftops.

    Incidentally, the solutions in Drawdown also had associated economic benefits or costs associated with them. If 10 percent of the world's electricity was generated using solar farms, then it’d save $5 trillion by 2050, for example. No such value could be put to educating girls and family planning—two human rights with incalculable benefits.

    WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints. Read more opinions here.

    Our Warming Climate

    Read more:

    How Vampire Bats Survive And Even Thrive On Blood

    As well as being staples of horror movies since forever, vampire bats are evolutionary marvels. Now scientists are slowly starting to decode exactly how these animals survive by feasting on blood.

    Feeding exclusively on blood requires a high degree of specialization. Not only have the bats evolved the tools for the trade – teeth that let them pierce the skin of their meal and shave away any hairs or feathers beforehand, and the ability to distinguish individuals from their breathing patterns alone – but also the capability to live off blood itself.

    Blood is not a good food source. While it might be high in protein, it is low in pretty much all other essential nutrients, like carbohydrates and vitamins. In addition to that, such large volumes of liquid – vampire bats can drink up to half their weight in blood every night – also puts monumental strain on their kidneys and bladder. As if that wasn’t enough, blood also tends to be full of deadly bacteria and viruses.

    Yet despite all this, three species of bat have managed to overcome these issues and are now the only mammals known to dine exclusively on the red stuff. How exactly they have managed to evolve this feat is still something of a mystery, something that the authors of this latest study has attempted to solve by delving into not only their genome, but also their microbiome, publishing their results in Nature Ecology & Evolution.

    They found that the common vampire bat has key differences in the genes related to immunity and food metabolism compared to the 1,200 other species of bats that have diets filled with fruit, meat, or insects. This includes certain bits of DNA, known as transposable elements, which are able to shift their position on the chromosome, being concentrated in the region that dictates immunity, as well as differences in genes that process high levels of iron and nitrogen.

    These genetic adaptations are coupled with a highly distinctive gut microbiome. By studying the poop of the bats, they were able to identify up to 280 different bacteria that make up the ecosystem in the bats’ intestines, many of which would cause sickness if found in most other mammals.  

    “The data suggests that there is a close evolutionary relationship between the gut microbiome and the genome of the vampire bat for adaptation to sanguivory [feeding exclusively on blood],” study author Dr Marie Zepeda Mendoza, told the BBC.

    They think that these changes at both the genetic level and that of the microbiome might have worked in conjunction to allow the animals to make their shift towards feeding exclusively on blood. One suggestion is that perhaps the animals moved from feeding on insects to focusing on blood-sucking parasites such as ticks, before going fully sanguivorous some 4 million years ago.

    Read more:

    The Toxic Reality Of Living In Diet Culture And Why Im Doing My Best To Leave It Behind

    Element5 Digital / Unsplash

    Diet culture, better known as the disastrous amalgamation of pop culture, media, pseudoscience, and social constructs we are all systemically smothered by, is something I’d like to break up with. I’ve read and researched the topic, follow all of the #riotsnotdiets accounts, and talk a big game about body positivity. But despite my progressive exterior I secretly worry that I can’t completely break up with this destructive system. While it’s easy for me to champion healthy body and food beliefs for other women, I have a much more difficult time following through for myself.

    Diet culture is unquestionably toxic, but trying to untangle myself from its totality is complicated.

    The holy trinity (body “positive” statements, diet talk, and believing that all foods have a moral value attached to them) of diet culture is difficult to get away from. Individually these aspects can disguise themselves as innocuous, and they thrive in every corner of modern life, which makes escaping them seem impossible. It doesn’t help that engaging in some of these holy trinity behaviors feel good and connects us to other women in a seemingly positive way.

    Julie Klausner, creator of Difficult People, is a woman who has long expressed her separation from diet culture. On one episode of her podcast, How Was Your Week, she goes on a high energy tirade where she discusses how if you were to put two women in a room, one who has just earned her Ph.D. and another who has recently lost weight, that the women who lost weight will receive more acclaim and attention. Julie points out the idiocy in this and the sadness of its truth.

    Last year I found myself in an adjacent situation where I had lost a decent amount of weight around the same time that I started working a new job. My weight loss was a side effect of a serious bout of depression and anxiety, and it wasn’t something I was actively chasing. For months, anytime I met up with friends or colleagues my new, slimmer figure was the first thing they would comment on, and it was clearly the thing that people were most excited to ask me about. Julie K’s hypothesized musing was right. My new job was a significant promotion in my field, and yet it was my weight that those around me wanted to discuss. I knew all the weight loss talk was feeding into long-held, toxic beliefs about women and our bodies, but I also recognized that it still felt good to hear. People used phrases to describe me physically like well-rested, glowing, and most importantly good and skinny. 

    The words good and skinny are often heard together which has caused most of us to believe that skinny is good, and we all want to be good…and skinny.

    While I had never felt more mentally drained and down in my life, I had also never received so much positive praise. The experience felt conflicting.

    Losing weight has a hypnotic effect on others, especially women. People talk to you as if you know a secret and have accomplished a worthy feat. My diet at the time consisted of hardly eating and excessively working out in an attempt to boost my brains natural levels of serotonin/dopamine. During this time, I also developed a handful of mild irrational fears around meat, cheese, and processed food which likely occurred thanks to all the food shaming (vegan*) documentaries I was consuming on Netflix. At first, when questioned about my weight loss I would attempt a vague response like, “I’m just eating better and exercising more.” But after a while, I felt uncomfortable about being misleading.

    I didn’t want to give truth to the deranged diet culture notion of simply eating better and exercising more equating to weight loss equating to a glowing appearance equating to a new job and better life.

    Eventually, I mustered up a more truthful, “I just don’t eat a lot” or “depression,” which I would say adding a hard laugh. This laugh didn’t seem particularly helpful as people often looked uncomfortable upon hearing a closer version of the truth.

    While giving out skinny compliments seems like a kind exchange, it’s just encouraging the cycle of placing ultimate value on our bodies. I know it’s tough, it feels good to get and give these compliments. I still accidentally tell women they look skinny as an automatic comment. It feels nice to make other people light up, and nothing does it as quickly as telling a woman she looks thin. Most people want to make their friends feel confident and happy, but we have to find better ways of doing it. I do think we should be able to compliment one another when we’re looking well rested and glowing, but maybe these adjectives don’t need to be so related to the actual skin in which we live.

    As girls, I believe many of our first addictions were talking obsessively about diets and food. Some of my earliest memories of interacting with adult women as a child are sitting in the kitchen and talking about diets. Growing up my mom was a Weight Watchers discipline, and by age 12 I could rattle off the calorie count and point value to almost any food. Other neighborhood moms were impressed with my knowledge of points and calories and all things numerically related. Early on, I realized that talking about a diet was an essential part of being on a diet, and a great way to positively interact with other women.

    As adult women, talking about food and diet continues to be one of the quickest ways to bond with one another. Food noise is something we all have in common.

    I’ve yet to meet a woman who has never been affected by a cultural desire to lose weight and make her body “better.”

    Diet talk is a quick way to connect and empathize with one another even though by doing so we are continuing to agree with the idea that your body is your value. While I try to not engage, I still get caught up in it at times because I worry that opting out completely will cast me out as a social pariah, and truthfully something about the chatter is addicting. The morning back to work after the winter holiday the first thing I asked my co-worker was, “What are you drinking? Are you on a new cleanse?” I couldn’t help myself, something inside of me desperately wanted to know. We then proceeded to talk about juice cleanses for 10 minutes before coming around to ask one another how our holiday vacations went.

    Even more recently, I slipped up and found myself 20 minutes deep into a conversation about someone’s new life-altering diet at a family shiva. I sat with a plate full of bagel, kugel, and rainbow cake while a woman preached to me about the wonders of Keto. The woman explained how Keto focuses on our bodies natural ability to run solely on fats and proteins. Taking humiliating swallows of bagel and schmear, I actively listened as she continued rattling on about how since she started this new diet her body only needed to eat twice a day. The shame of eating more than two times a day immediately filled me.

    The shame feels right though—it’s an essential part of diet talk. We want the shame. We hope the shame will force us to be good. While I know it’s not good for me, diet chatter does light up part of my brain sending out excessive levels of a pleasure chemical. Maybe it’s the learning part that feels good. Perhaps my brain thinks it’s about to gain novel, life alerting information that will bring an undiscovered happiness to my life.

    Eating is one of the first behaviors we learn how to do on our own. It’s seemingly the simplest survival mechanism for humans.

    1. Ingest food 2. Don’t die 3. Repeat.

    We’ve managed to take this natural human need and turn it into an issue of morality.

    The idea that food is good or bad is beyond damaging to our self-worth, and in recent years this trend has only been getting worse as we’ve started doing it to even the tiniest of our species, babies! Breast milk is better than formal milk, organic vegetables blended on a free-range farm are better than pre-packaged vegetable blends, etc. Cognitively I know food is just food, but emotionally it’s become impossible to feel that way. It feels like eating the good things means that I’m good. This thought is exacerbated by the fact that everything around me is telling me that this is true. Labels declare what is good and even worse what foods can be consumed guilt free. We are so accustomed to adding morals to food choices that we don’t even hear how psychotic it sounds when someone says, “I’m so bad I just ate ____” In fact, we usually agree with them and say how bad we are too. (If you need a reminder of how absurd this actually sounds watch this.)  

    These days I’m trying my best to untangle myself from this food-self-worth mess by constantly reminding myself that food is just food and I’m attempting to disengage from diet chatter, but I still slip up and go back to my old ways like compulsively reconnecting with an ex. Even though I see that the totality is wrong, some parts of it feel right, and maybe I’m masochistic.

    Walking out on diet culture is complicated. We live here. The damage of a system that values body type and food choices over personality is clearly detrimental and interferes with a million aspects of our lives. Somedays I imagine a time and place where my girlfriends and I have become so evolved that we eat without shame and talk about ourselves in a kinder way. Other days, I’m annoyed if no one tells me I look skinny because that stupid phrase is still feeding something insatiable inside of me.

    I want to let go, but breaking up is hard to do.

    Read more:

    Mysterious Mass Grave In England Could Be Filled With The Viking Great Army’s War Dead

    Once upon a time, back in the 9th century, there was a predominantly Danish Viking horde of militants that led a ferocious campaign across England. Unlike other Viking groups at the time, they didn’t just set out to raid, but to conquer, gain land, and seize political power too.

    Referred to by the Christian Saxons as “heathens”, they briefly settled in Repton, Derbyshire, before disappearing into the weaves of time. It’s known that they fragmented shortly after their reign of terror, in 873-874 CE, but much about them remains enigmatic.

    So – what happened to the so-called Great Army? A new study published in the journal Antiquity, led by the University of Bristol’s Department of Anthropology and Archaeology, has at least a partial answer. Reexamining a mass grave first discovered in the 1980s, the researchers confirmed that it dates back to the Viking era, and there’s a chance that it is composed of fallen members of the long-lost invaders from the north.

    The Repton burial mound was analyzed shortly after its original excavation, and the bones of at least 264 people – around 80 percent of them men and 20 percent women, aged 18-45 – were found beneath it. A double grave containing two heavily injured men, a Viking sword, and a Thor’s hammer pendant, as well as a sacrificial grave containing the remains of four juveniles, were also found close by.

    Rather uncomfortably, one unfortunate occupant of the double grave seemed to have had his severed penis replaced with a boar’s tusk to presumably accompany him into the afterlife.

    One of the skulls from the Repton mass grave. Cat Jarman

    Controversially, academics at the time concluded that this mass grave was filled in over the course of several centuries. This was based on radiocarbon dating, which uses the decay of unstable carbon isotopes as a measure of time – and it revealed not a point in time, but an elongated chronology. That left researchers baffled: sedimentological and archaeological evidence clearly demonstrated that the mass grave was made and filled in just once.

    As time ticked on, it transpired that something may have thrown off the dating method, and Bristol bioarchaeologist Cat Jarman, a “Doctor of the Dead”, felt that a reassessment was required. As it turns out, that radiocarbon dating was thrown off by these mysterious fellows’ fish-heavy diet.

    Radiocarbon dating rests on the assumption that all living organisms take up carbon-14 at a roughly constant, known level – but this isn’t always the case. The isotope, which is continuously formed in the upper atmosphere as cosmic rays cascade into it, rains down from the sky and is absorbed by plants and animals in a variety of ways.

    Oceans, and anything living in them that assimilates carbon, are troublesome for researchers that use radiocarbon dating. Thanks to the way carbon-14 is distributed in those watery depths, a tree that’s the same age as a shellfish can actually appear to be 400 years younger.

    Or, as Jarman phrased it to IFLScience: “If Ivar only ate fish and his mate Halfdan only ate sheep, and both were killed by Alfred on the same day, it would look like Ivar died 400 years before Halfdan.”

    Those in the grave clearly ate a lot of seafood, and Jarman’s team suspected that threw off the original dating. Correcting for this, they found that the three main grave sites all date back to 872-885 CE, a relatively short period of time.
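    Jarman’s actual calibration is more involved than this, but the arithmetic behind the correction can be sketched in a few lines. A conventional radiocarbon age is computed from the measured fraction of modern carbon-14, and a marine reservoir offset – roughly 400 years for a fully marine diet – is subtracted in proportion to how much of the individual’s protein came from the sea. The numbers and function names below are illustrative assumptions, not figures from the study:

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; conventional ages use the Libby half-life of 5,568 years
MARINE_OFFSET = 400     # approximate years a fully marine diet makes remains appear too old

def conventional_age(f14c):
    """Conventional radiocarbon age (years BP) from the measured fraction of modern carbon."""
    return -LIBBY_MEAN_LIFE * math.log(f14c)

def diet_corrected_age(apparent_age, marine_fraction):
    """Subtract the marine reservoir offset in proportion to the marine share of the diet."""
    return apparent_age - marine_fraction * MARINE_OFFSET

# Remains measured at ~87% modern carbon read as roughly 1,120 radiocarbon years old...
apparent = conventional_age(0.87)
# ...but if half the individual's dietary protein was marine, about 200 of those
# years reflect old ocean carbon rather than time since death.
corrected = diet_corrected_age(apparent, marine_fraction=0.5)
```

    In practice the marine fraction of a diet is estimated from stable isotopes preserved in bone collagen, which is how a fish-heavy diet like Ivar’s hypothetical one can be detected in the first place.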

    Significantly, this is a time directly associated with the Great Army’s base in this part of England. Could this grave have belonged to their war dead, perchance?

    “We definitely can’t prove that the mass grave is that of the Viking war dead. A radiocarbon date doesn’t make you a Viking, and certainly not a warrior!” Jarman added.

    “All we can prove is that they date to the right time, and with all the contextual evidence, it makes the conclusion far more likely than it was before. But we are by no means certain.”

    Spot the grave. Mark Horton

    Either way, this revelatory research means that we’re one step closer to tracing the final days of one of the most mysterious military forces in human history – and it’s all thanks to cosmic rays and fishy diets.


    It’s Time For a Serious Talk About the Science of Tech “Addiction”

    To hear Andrew Przybylski tell it, the American 2016 presidential election is what really inflamed the public's anxiety over the seductive power of screens. (A suspicion that big companies with opaque inner workings are influencing your thoughts and actions will do that.) "Psychologists and sociologists have obviously been studying and debating about screens and their effects for years," says Przybylski, who is himself a psychologist at the Oxford Internet Institute with more than a decade's experience studying the impact of technology. But society's present conversation—"chatter," he calls it—can be traced back to three events, beginning with the political race between Hillary Clinton and Donald Trump.

    Then there were the books. Well-publicized. Scary-sounding. Several, really, but two in particular. The first, Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, by NYU psychologist Adam Alter, was released March 2, 2017. The second, iGen: Why Today's Super-Connected Kids are Growing Up Less Rebellious, More Tolerant, Less Happy – and Completely Unprepared for Adulthood – and What That Means for the Rest of Us, by San Diego State University psychologist Jean Twenge, hit stores five months later.

    Last came the turncoats. Former employees and executives from companies like Facebook worried openly to the media about the monsters they helped create. Tristan Harris, a former product manager at Google and founder of the nonprofit "Time Well Spent," spoke with this publication's editor in chief about how Apple, Google, Facebook, Snapchat, Twitter, Instagram—you know, everyone—design products to steal our time and attention.

    Bring these factors together, and Przybylski says you have all the ingredients necessary for alarmism and moral panic. What you're missing, he says, is the only thing that matters: direct evidence.

    Which even Alter, the author of that first bellwether book, concedes. "There's far too little evidence for many of the assertions people make," he says. "I've become a lot more careful with what I say, because I felt the evidence was stronger when I first started speaking about it."

    "People are letting themselves get played," says Przybylski. "It's a bandwagon." So I ask him: When WIRED says that technology is hijacking your brain, and the New York Times says it's time for Apple to design a less addictive iPhone, are we part of the problem? Are we all getting duped?

    "Yeah, you are," he says. "You absolutely are."

    Of course, we've been here before. Anxieties over technology's impact on society are as old as society itself; video games, television, radio, the telegraph, even the written word—they were all, at one time, scapegoats or harbingers of humanity's cognitive, creative, emotional, and cultural dissolution. But the apprehension over smartphones, apps, and seductive algorithms is different. So different, in fact, that our treatment of past technologies fails to be instructive.

    A better analogy is our modern love-hate relationship with food. When grappling with the promises and pitfalls of our digital devices, it helps to understand the similarities between our technological diets and our literal ones.

    Today's technology is always with you; a necessary condition, increasingly, of existence itself. These are some of the considerations that led MIT sociologist Sherry Turkle to suggest avoiding the metaphor of addiction, when discussing technology. "To combat addiction, you have to discard the addicting substance," Turkle wrote in her 2011 book Alone Together: Why We Expect More from Technology and Less from Each Other. "But we are not going to 'get rid' of the Internet. We will not go ‘cold turkey’ or forbid cell phones to our children. We are not going to stop the music or go back to the television as the family hearth."

    Food addicts—who speak of having to take the "tiger of addiction" out of the cage for a walk three times a day—might take issue with Turkle's characterization of dependence. But her observation, and the food addict's plight, speak volumes about our complicated relationships with our devices and the current state of research.

    People from all backgrounds use technology—and no two people use it exactly the same way. "What that means in practice is that it's really hard to do purely observational research into the effects of something like screen time, or social media use," says MIT social scientist Dean Eckles, who studies how interactive technologies impact society's thoughts and behaviors. You can't just divide participants into, say, those with phones and those without. Instead, researchers have to compare behaviors between participants while accounting for variables like income, race, and parental education.

    Say, for example, you’re trying to understand the impact of social media on adolescents, as Jean Twenge, author of the iGen book, has. When Twenge and her colleagues analyzed data from two nationally representative surveys of hundreds of thousands of kids, they calculated that social media exposure could explain 0.36 percent of the covariance for depressive symptoms in girls.

    But those results didn’t hold for the boys in the dataset. What's more, that 0.36 percent means that 99.64 percent of the group’s depressive symptoms had nothing to do with social media use. Przybylski puts it another way: "I have the data set they used open in front of me, and I submit to you that, based on that same data set, eating potatoes has the exact same negative effect on depression. That the negative impact of listening to music is 13 times larger than the effect of social media."

    In datasets as large as these, it's easy for weak correlational signals to emerge from the noise. And a correlation tells us nothing about whether new-media screen time actually causes sadness or depression. Which are the same problems scientists confront in nutritional research, much of which is based on similarly large, observational work. If a population develops diabetes but surveys show they’re eating sugar, drinking alcohol, sipping out of BPA-laden straws, and consuming calories to excess, which dietary variable is to blame? It could just as easily be none or all of the above.
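    A back-of-envelope calculation shows how easily that happens. An effect explaining 0.36 percent of the variance corresponds to a correlation of about 0.06, yet with hundreds of thousands of respondents the test statistic for even that tiny correlation is enormous, so it sails past any conventional significance threshold. (The sample size below is a made-up round number of the right order of magnitude, not the surveys’ actual count.)

```python
import math

def correlation_t_stat(r, n):
    """t statistic for testing whether a Pearson correlation r differs from zero."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r * r)

r = math.sqrt(0.0036)               # an R^2 of 0.36% is a correlation of just 0.06
t = correlation_t_stat(r, 500_000)  # hypothetical half-million-respondent survey
# t comes out around 42: wildly "significant" by any p-value cutoff,
# despite the effect itself being negligible in practical terms.
```

    Statistical significance at this scale says almost nothing about practical importance – which is exactly the gap Przybylski’s potato comparison is designed to expose.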

    Decades ago, those kinds of correlational nutrition findings led people to demonize fat, pinning it as the root cause of obesity and chronic illness in the US. Tens of millions of Americans abolished it from their diets. It's taken a generation for the research to boomerang back and rectify the whole baby-bathwater mistake. We risk similar consequences, as this new era of digital nutrition research gets underway.

    Fortunately, lessons learned from the rehabilitation of nutrition research can point a way forward. In 2012, science journalist Gary Taubes and physician-researcher Peter Attia launched a multimillion-dollar undertaking to reinvent the field. They wanted to lay a new epistemological foundation for nutrition research, investing the time and money to conduct trials that could rigorously establish the root causes of obesity and its related diseases. They called their project the Nutrition Science Initiative.

    Today, research on the link between technology and wellbeing, attention, and addiction finds itself in need of similar initiatives. It needs randomized controlled trials, to move beyond correlation and test whether the architecture of our interfaces actually causes the effects attributed to it; and funding for long-term, rigorously performed research. "What causes what? Is it that screen time leads to unhappiness or unhappiness leads to screen time?" says Twenge. "So that’s where longitudinal studies come in." Strategies from the nascent Open Science Framework—like the pre-registration of studies, and data sharing—could help, too.

    But more than any of that, researchers will need buy-in from the companies that control that data. Ours is a time of intense informational asymmetry; the people best equipped to study what's happening—the people who very likely are studying what's happening—are behind closed doors. Achieving balance will require openness and objectivity from those who hold the data; clear-headed analysis from those who study it; and measured consideration by the rest of us.

    "Don't get me wrong, I'm concerned about the effects of technology. That's why I spend so much of my time trying to do the science well," Przybylski says. He says he's working to develop a research proposal strategy by which scientists could apply to conduct specific, carefully designed studies with proprietary data from major platforms. Proposals would be assessed by independent reviewers outside the control of companies like Facebook. If an investigation shows the potential to answer an important question about a discipline or a platform, the researchers outside the company get paired with the ones inside.

    "If it’s team based, collaborative, and transparent, it’s got half a chance in hell of working," Przybylski says.

    And if we can avoid the same mistakes that led us to banish fat from our food, we stand a decent chance of keeping our technological diets balanced and healthy.

    Your Technology and You

    • WIRED's editor-in-chief Nick Thompson spoke with Tristan Harris, the prophet behind the "Time Well Spent" movement, which argues our minds are being hijacked by the technology we use.

    • One writer takes us through his extreme digital detox, a retreat that took him offline for a whole month.

    • Technology is demonized for making us distractible, but the right tech can help us form new, better digital habits—like these ones.


    It’s not just in the genes: the foods that can help and harm your brain

    Our diet has a huge effect on our brain and our mental wellbeing, even protecting against dementia. So, what should be on the menu?



    8 Scientific Conspiracies That Turned Out To Be True

    The past couple of years have seen the resurgence of the conspiracy theorist, but instead of the foil-hat-wearing, megaphone-wielding whack job of old, the conspiracy theorist of 2018 sits behind a keyboard and sets up GoFundMe pages raising money to launch homemade rockets.

    Vaccines cause autism. Man-made climate change isn’t real. Nibiru will crash into Earth, wiping out all life. And Stephen Hawking is an imposter. There are plenty to pick and choose from. Recent research has linked this sort of behavior to gullibility and the need to feel special, but conspiracy theorists will tell you this is simply another cover-up.

    There are some occasions, however, when real life is just as strange as fiction. From sinister research programs to health-related cover-ups, here are eight examples from the annals of history that sound so extraordinary they could have been cooked up by a conspiracy theorist.

    The CIA really did experiment with mind control and psychedelics

    The Central Intelligence Agency (CIA) crops up in many a conspiracy theory, but its shady reputation is not entirely undeserved. Papers released in the seventies revealed the secret service really had been dabbling in mind control, psychological torture, radiation, and electric shock therapy in a series of studies into behavioral modification known as “Project MK-ULTRA”.

    More than 150 human experiments took place between 1953 and 1964, many of which involved administering drugs to US citizens without their knowledge and consent, and under no medical supervision. The purpose of this research was to develop techniques and substances to use against the Soviet Union and its allies – think truth serums and Bourne-like super agents.

    But things get more sinister. Then-CIA Director Richard Helms ordered the destruction of all records relating to MK-ULTRA in 1973, which means little evidence of the agency’s nefarious activities survives today. We know the research was responsible for at least one hospitalization and two deaths, but the true cost could be much higher.

    MK-ULTRA has inspired several Hollywood blockbusters. overturefilms/Youtube

    Politicians and industry leaders purposefully misled the public over the health risks associated with smoking

    Smoking increases your risk of stroke, emphysema, infertility, and a whole host of cancers. But back in the day, Big Tobacco tried all it could to persuade consumers that cigarettes weren’t bad for you. It didn’t stop there. They even tried to convince the public that smoking was healthy. Just take a look at some of the dangerous, not to mention highly sexist, vintage ads from the sixties and earlier.

    Tobacco companies were major lobbyists and generous donors to political campaigns. Essentially, they were able to buy favor with politicians and others in positions of power, all the while disputing the science behind the health risks and claiming it was uncertain. It was not until the nineties – at which point the evidence against smoking was irrefutable – that corporations began to admit there were health risks associated with cigarette smoking.

    And in 2006, after a seven-year-long lawsuit, Judge Gladys E. Kessler found the tobacco companies guilty of conspiracy, having “suppressed research…destroyed documents…manipulated the use of nicotine so as to increase and perpetuate addiction”.

    Luckies get the physician’s stamp of approval, apparently. clotho98/Flickr CC BY-NC 2.0

    …and sugar

    But it wasn’t just the tobacco companies that were guilty of this kind of malevolent activity. The sugar industry also spent years hiding data and bribing scientists to keep inconvenient research under wraps, all the while advertising Lucky Charms and Kool-Aid on children’s TV.

    In 2016, a paper published in JAMA Internal Medicine revealed the sugar industry had funded research in the sixties underplaying the risks associated with eating the white stuff, instead pointing the finger of blame at fat. According to the article authors, sugar bigwigs have been attempting to control debate around the dangers and merits of sugar and fat consumption for the past five decades.

    For example, a 2011 study made the shocking (and entirely unscientific) claim that children who eat candy weigh less than those that do not. Dig a little deeper and it turns out that the research was funded by the National Confectioner’s Association, a trade group representing the companies behind brands like Hershey’s and Skittles. Then, in 2015, it was revealed that soda giant Coca-Cola had funded studies linking weight loss to exercise to undermine the important role poor diet plays in obesity.

    Companies like Coca-Cola have been covering up the health risks associated with sugar for the past 50 years. Physics_joe/Shutterstock

    The US government really did investigate UFOs

    So much for Area 51 being a fiction conceived by conspiracy theorist loons. Last year, the Pentagon confirmed that the US government had been investigating “anomalous aerospace threats”, or what you and I might refer to as UFOs.

    Between 2008 and 2011, the Advanced Aerospace Threat Identification Program received close to $22 million, which, admittedly, isn’t exactly a large slice of the Defense Department’s annual budget of $600 billion. Ultimately, the experiment came to nothing and the program was closed down – at least, that’s what official sources are saying.

    Area 51 might not be quite so crazy after all – UFO sighting 1937. Malcolm Dee/Flickr CC BY-NC-ND 2.0

    …and employed Nazi scientists after WW2

    For those who haven’t seen Doctor Strangelove, the film’s titular character is a former Nazi with a severe case of alien hand syndrome working as a scientific advisor to the US president, who he intermittently referred to as “Mein Fuhrer”.

    The premise seems farfetched but rumor has it he was modeled on Wernher von Braun, who was just one of the 1,600 or so Nazi scientists sent to work in the US following German defeat in World War II. The program, called Operation Paperclip, was exposed in media outlets like the New York Times in 1946.

    Some of these scientists were involved in Project MK-ULTRA. Von Braun, however, was put to work as director of the Development Operations Division of the Army Ballistic Missile Agency. He was heavily involved in the moon landing and developed the Jupiter-C rocket used to launch America’s first satellite. Before that, he had been involved in the V-2 rocket program, where he used prisoners from the concentration camps to assist him. Others in the program had similarly dodgy pasts, with some having even been tried at Nuremberg.

    Former Nazi, Wernher von Braun. NASA/Wikimedia Commons

    Water can affect the sex of frogs – sort of

    Alex Jones, the far-right radio host and conspiracy theorist of Infowars fame, claimed chemicals in the water are turning frogs gay. While there doesn’t seem to be a whole lot of evidence to back up his “theory” that water is affecting frogs’ sexuality, some studies seem to suggest man-made chemicals do have an effect on a frog’s sex.

    A 2010 paper from the University of California, Berkeley, found that as many as one in 10 male frogs exposed to atrazine, a common pesticide, experience a hormonal imbalance that effectively turns them female. They produce estrogen, mate with males, and even lay eggs. More recently, studies have shown that chemicals found in suburban ponds and road salts can also affect a frog’s sex.

    While the role of man-made chemicals in the environment may be problematic from a reproductive point of view, there is nothing unnatural about animals switching sex. Shrimps, clownfish, coral, and frogs all have the ability to do so. There is also no evidence to suggest that chemicals in the water can affect human sexuality or imply the US government is trying to make children gay with juice boxes, as Jones claims.

    Painted reed frog chilling on a leaf. feathercollector/Shutterstock

    The Public Health Service watched as black men died of syphilis unnecessarily, in the name of science

    Syphilis is a nasty disease that, if not treated, can result in blindness, paralysis, and/or death, and until the discovery of penicillin, there was no cure. Still, that does not excuse the unethical treatment of poor, black men who were recruited to take part in a program to record the natural progress of the disease.

    The “Tuskegee Study of Untreated Syphilis in the Negro Male” began operation in 1932, when 600 men from Macon County, a deprived region of Alabama, were enrolled. Of those, 399 had syphilis. The men were misled and told they would receive treatment for “bad blood”, which they did not receive.

    What’s worse, the researchers continued the experiment after penicillin became the accepted treatment for syphilis in 1945. The experiment, initially projected to last six months, carried on for 40 years. For some reason, doctors decided it was in medicine’s best interest to watch the men die a slow and painful death unnecessarily. The research came to an end in 1972, after The New York Times published a story on the study. During the trial, 28 men died of syphilis, 100 more from related causes, and 40 spouses contracted the disease.

    In the forties, similar experiments took place in Guatemala, where hundreds of men and women were purposefully infected with syphilis.

    Doctor injecting a subject with what is presumably a placebo. US government/Wikimedia Commons

    The US government did poison alcohol supplies, on purpose

    Politicians introduced Prohibition in 1920 to curb the nation’s drinking habit, but speakeasies multiplied and bootlegging (the illegal production and distribution of liquor) was widespread. Apparently, writing something into law is not enough to actually change people’s behavior, so the US government came up with a drastic, far more sinister solution: poison the illegal liquor supply.

    To do this, the government started adding toxins like benzene and mercury to alcohol in the mid-1920s. The most lethal, however, was methanol, otherwise known as wood alcohol, a substance normally found in industrial products like fuel and used to make formaldehyde. Drinking this stuff can cause paralysis, blindness, and even death. In total, it is estimated around 10,000 people died as a result of the government’s moral crusade.

    Ultimately, even poisoning the liquor supply wasn’t enough to dampen the country’s taste for alcohol and the practice died down. Prohibition ended in 1933, but the government adopted a similar tactic in the fight against marijuana use in the 1970s.

    New York City Deputy Police Commissioner John A. Leach, watches as an officer disposes of (probably poisonous) alcohol during Prohibition. Source unknown/Wikimedia Commons



    Avoiding A Compound Found In These Foods Could Slow The Spread Of Several Cancers

    A new study has found that a chemical in our food could affect the speed at which several different cancers grow and spread.

    Researchers from the University of Cambridge looked at the effect of the compound asparagine, commonly found in – you guessed it – asparagus, as well as things like poultry and seafood. Asparagine is just one of the many amino acids produced by the body to synthesize proteins.  

    When asparagine was removed from the diets of mice with an aggressive form of breast cancer, mice that would normally have died within a few weeks survived, and the spread of their tumors slowed dramatically.

    The research, carried out by Cancer Research UK, found that restricting the amount of asparagine in the mice’s diet or blocking it using a drug called L-asparaginase greatly reduced the spread of breast cancer in the mice.

    The research, published in Nature, found that the more asparagine in the diets of mice with breast cancer, the more likely the cancer was to spread around the body. The team also looked at data on patients with breast cancer and other types of malignant tumors, including kidney, head, and neck cancers, and found that the cancers of patients whose tumors naturally produce more asparagine were more likely to spread.

    They believe that, along with chemotherapy, reducing asparagine in the diet could be used as a way to slow the spread of the disease, and improve patient outcomes.

    “This finding adds vital information to our understanding of how we can stop cancer spreading,” Professor Greg Hannon, lead author of the study, said, “[which is] the main reason patients die from their disease.”

    “In the future, restricting this amino acid through a controlled diet plan or by other means could be an additional part of treatment for some patients with breast and other cancers.”

    At this stage, it’s important to remember that human trials have not yet been conducted, so if you do have cancer, you shouldn’t restrict your own diet until the evidence suggests you should. The researchers say that the next step will be to conduct a trial using human patients.

    “Research like this is crucial to help develop better treatments for breast cancer patients,” Martin Ledwick, Cancer Research UK’s head nurse, said.

    “At the moment, there is no evidence that restricting certain foods can help fight cancer, so it’s important for patients to speak to their doctor before making any changes to their diet while having treatment.”
