Boy robot passes agility tests

Robots are on their way to passing gym class.

The design of a new life-size bot named Kengoro closely resembles the anatomy of a teenage boy in body proportion, skeletal and muscular structure, and joint flexibility, researchers report online December 20 in Science Robotics. Previous humanoid robots have had more rigid, bulky bodies; Kengoro’s anatomically inspired design gives the bot a wide range of motion to perform humanlike, full-body exercises.

Constructed by Masayuki Inaba, an engineer at the University of Tokyo, and colleagues, Kengoro has a multi-jointed spine that allows the robot to curl into a sit-up or do back extensions. The bot’s arms are limber enough to execute various stretches or swing a badminton racket. And its artificial muscles are strong enough that Kengoro can stand on tiptoe or do push-ups. Batteries in each leg power Kengoro through about 20 minutes of exercise at a time, and water seeping from inside Kengoro’s metal skeleton like sweat keeps the motors of the artificial muscles cool while the bot works out.

Such a nimble robot that so closely imitates human movement and anatomy is “very unique,” says Luis Sentis, an engineer at the University of Texas at Austin not involved in the work. Building more humanlike robots could lead to the development of more sophisticated prosthetics or more realistic crash-test dummies that make humanlike reflexive movements during an accident.

Jazz improvisers score high on creativity

Improvisation may give jazz artists a creative boost not seen among musicians more likely to stick to the score. Jazz musicians’ brains quickly embrace improvisational surprises, new research on the neural roots of creativity shows.

Neuroscientist Emily Przysinda and colleagues at Wesleyan University in Middletown, Conn., measured the creative aptitudes of 12 jazz improvisers, 12 classical musicians and 12 nonmusicians. The researchers first posed creativity challenges to the volunteers, such as listing every possible use for a paper clip. Volunteers then listened to three different kinds of chord progressions — common ones, some that were a bit off and some that went in wild directions — as the team recorded the subjects’ brain waves with an electroencephalogram. Afterward, volunteers rated how much they liked each progression.

Jazz musicians, more so than the other participants, preferred the unexpected riffs, brain waves confirmed. And the improvisers’ faster and stronger neural responses showed that they were more attuned to unusual music and quickly engaged with it. Classical musicians’ and nonmusicians’ brains hadn’t yet figured out the surprising music by the time the jazz musicians had moved on, the researchers report in the December Brain and Cognition.

The jazz musicians’ striking responses to unexpected chords mirrored their out-of-the-box thinking on the creativity challenges. Training to be receptive to the unexpected in a specific area of expertise can increase creativity in general, says Harvard University cognitive neuroscientist Roger Beaty, who was not involved in the study.

‘Laid-back’ bonobos take a shine to belligerents

Despite a reputation as mellow apes, bonobos have a thing for bad guys.

Rather than latching on to individuals with a track record of helpfulness, adult bonobos favor obstructionists who keep others from getting what they want. The result may help explain what differentiates humans’ cooperative skills from those of other apes, biological anthropologists Christopher Krupenye of the University of St. Andrews in Scotland and Brian Hare of Duke University report online January 4 in Current Biology.

Previous investigations indicate that, by 3 months old, humans do the opposite of bonobos, choosing to align more frequently with helpers than hinderers. Humans, unlike other apes, have evolved to seek cooperative partnerships that make large-scale collaborations possible (SN: 10/28/17, p. 7), Krupenye and Hare propose.

“Conducting similar experiments with chimpanzees and other apes is a key next step,” Krupenye says. If chimps view hinderers as kindly as bonobos do, that finding would support the duo’s proposal about human cooperation, he says.

Bonobos may view those who impede others’ actions as socially dominant and thus worth grooming as allies, Krupenye says. Although bonobos readily share food, social pecking orders still affect the animals’ behavior.

The researchers showed 24 bonobos four animated videos featuring pairs of colored shapes, most depicted with a pair of eyes. In one video, a circle tries and fails to climb a hill until a “helper” triangle arrives and pushes the circle to the top. In a second video, a circle tries and fails to climb a hill before a “hinderer” square arrives and pushes the circle farther down the hill. In the other two videos, other shapes with eyes push an eyeless, unmoving circle up or down a hill.

After watching the first two videos, bonobos chose between paper cutouts of helper and hinderer shapes placed on top of small apple pieces. The same choice was presented for cutouts of shapes from the last two videos.

Snacks covered by hinderer shapes were chosen about 70 percent of the time by the 14 adult animals, ages 9 and older. Younger bonobos displayed no strong preference either way. Apes of all ages showed no partiality to either shape that had pushed inanimate circles.

Adult bonobos also reached more often for an apple piece offered by a human they had observed snatch a toy dropped by another person, versus a human they had seen return the toy.

In a final experiment, eight of 24 bonobos usually selected apple pieces covered by cutouts of an animated shape that the apes had seen win a contest with another shape to occupy a location. This result suggests that some bonobos’ strong preference for dominant individuals partly accounts for the newly reported fondness for hinderers, Krupenye says.

“The notion that bonobos approach the bully because they view that individual as more dominant is a very plausible interpretation,” says psychologist Felix Warneken of the University of Michigan in Ann Arbor. Warneken, who did not participate in the new study, studies cooperative behavior in human children and nonhuman apes.

A key virus fighter is implicated in pregnancy woes

An immune system mainstay in the fight against viruses may harm rather than help a pregnancy. In Zika-infected mice, this betrayal appears to contribute to fetal abnormalities linked to the virus, researchers report online January 5 in Science Immunology. And it could explain pregnancy complications that arise from infections with other pathogens and from autoimmune disorders.

In pregnant mice infected with Zika virus, fetuses with a docking station, or receptor, for immune system proteins called type I interferons either died or grew more poorly than fetuses lacking the receptor. “The type I interferon system is one of the key mechanisms for stopping viral infections,” says Helen Lazear, a virologist at the University of North Carolina at Chapel Hill, who coauthored an editorial accompanying the study. “That same [immune] process is actually causing fetal damage, and that’s unexpected.”

Cells infected by viruses begin the fight against the intruder by producing type I interferons. These proteins latch onto their receptor on the surfaces of neighboring cells and kick-start the production of hundreds of other antiviral proteins.

Akiko Iwasaki, a Howard Hughes Medical Institute investigator and immunologist at Yale School of Medicine, and her colleagues were interested in studying what happens to fetuses when moms are sexually infected with Zika virus. The researchers mated female mice unable to make the receptor for type I interferons to males with one copy of the gene needed to make the receptor. This meant that moms would carry some pups with the receptor and some without in the same pregnancy.

Pregnant mice were infected vaginally with Zika at one of two times — one corresponding to mid‒first trimester in humans, the other to late first trimester. Of the fetuses exposed to infection earlier, those that had the interferon receptor died, while those without the receptor continued to develop. For fetuses exposed to infection a bit later in the pregnancy, those with the receptor were much smaller than their receptor-lacking counterparts.

The fetuses without the receptor still grew poorly due to the Zika infection, which is expected given their inability to fight the infection. What was striking, Iwasaki says, is that the fetuses able to fight the infection were more damaged, and were the only fetuses that died.

It’s unclear how this antiviral immune response causes fetal damage. But the placentas — which, like their fetuses, had the receptor — didn’t appear to provide those fetuses with enough oxygen, Iwasaki says.

The researchers also infected pregnant mice that had the receptor for type I interferons with a viral mimic — a bit of genetic material that goads the body to begin its antiviral immune response — to see if the damage happened only during a Zika infection. These fetuses also died early in the pregnancy, an indication that perhaps the immune system could cause fetal damage during other viral infections, Iwasaki notes.

Iwasaki and colleagues next added type I interferon to samples of human placental tissue in dishes. After 16 to 20 hours, the placental tissues developed structures that resembled syncytial knots. These knots are widespread in the placentas of pregnancies with such complications as preeclampsia and restricted fetal growth.

The next step, says Iwasaki, is figuring out which of the hundreds of antiviral proteins made when type I interferon ignites the immune system can trigger placental and fetal damage. That could provide more understanding of miscarriage generally; other infections that cause congenital diseases, like toxoplasmosis and rubella; and autoimmune disorders that feature excessive type I interferon production, such as lupus, she says.

Hormone replacement makes sense for some menopausal women

Internist Gail Povar has many female patients making their way through menopause, some having a tougher time than others. Several women with similar stories stand out in her mind. Each came to Povar’s Silver Spring, Md., office within a year or two of stopping her period, complaining of frequent hot flashes and poor sleep at night. “They just felt exhausted all the time,” Povar says. “The joy had kind of gone out.”

And all of them “were just absolutely certain that they were not going to take hormone replacement,” she says. But the women had no risk factors that would rule out treating their symptoms with hormones. So Povar suggested the women try hormone therapy for a few months. “If you feel really better and it makes a big difference in your life, then you and I can decide how long we continue it,” Povar told them. “And if it doesn’t make any difference to you, stop it.”

At the follow-up appointments, all of these women reacted the same way, Povar recalls. “They walked in beaming, absolutely beaming, saying, ‘I can’t believe I didn’t do this a year ago. My life! I’ve got my life back.’ ”

That doesn’t mean, Povar says, that she’s pushing hormone replacement on patients. “But it should be on the table,” she says. “It should be part of the discussion.”

Hormone replacement therapy toppled off the table for many menopausal women and their doctors in 2002. That’s when a women’s health study, stopped early after a data review, published results linking a common hormone therapy to an increased risk of breast cancer, heart disease, stroke and blood clots. The trial, part of a multifaceted project called the Women’s Health Initiative, or WHI, was meant to examine hormone therapy’s effectiveness in lowering the risk of heart disease and other conditions in women ages 50 to 79. It wasn’t a study of hormone therapy for treating menopausal symptoms.

But that nuance got lost in the coverage of the study’s results, described at the time as a “bombshell,” a call to get off of hormone therapy right away. Women and doctors in the United States heeded the call. A 2012 study in Obstetrics & Gynecology showed that use plummeted: Oral hormone therapy, taken by an estimated 22 percent of U.S. women 40 and older in 1999–2000, was taken by fewer than 12 percent of women in 2003–2004. Six years later, the number of women using oral hormone therapy had sunk below 5 percent.

Specialists in women’s health say it’s time for the public and the medical profession to reconsider their views on hormone therapy. Research in the last five years, including a long-term follow-up of women in the WHI, has clarified the risks, benefits and ideal ages for hormone therapy. Medical organizations, including the Endocrine Society in 2015 and the North American Menopause Society in 2017, have released updated recommendations. The overall message is that hormone therapy offers more benefits than risks for the relief of menopausal symptoms in mostly healthy women of a specific age range: those who are under age 60 or within 10 years of stopping menstruation.

“A generation of women has missed out on effective treatment because of misinformation,” says JoAnn Pinkerton, executive director of the North American Menopause Society and a gynecologist who specializes in menopause at the University of Virginia Health System in Charlottesville. It’s time to move beyond 2002, she says, and have a conversation based on “what we know now.”

End of an era
Menopause, the final menstrual period, signals the end of fertility and is confirmed after a woman has gone 12 months without having a period. From then on she is postmenopausal. Women reach menopause around age 51, on average. In the four to eight years before, called perimenopause, the amount of estrogen in the body declines as ovarian function winds down. Women may have symptoms related to the lack of estrogen beginning in perimenopause and continuing after the final period.

Probably the best-known symptom is the hot flash, a sudden blast of heat, sweating and flushing in the face and upper chest. These temperature tantrums can occur at all hours. At night, hot flashes can produce drenching sweats and disrupt sleep.

Hot flashes arise because the temperature range in which the body normally feels comfortable narrows during the menopause transition, partly in response to the drop in estrogen. Normally, the body takes small changes in core body temperature in stride. But for menopausal women, even a slight rise in core temperature can trigger blood vessels to dilate, which increases blood flow and sweating.

About 75 to 80 percent of menopausal women experience hot flashes and night sweats, on and off, for anywhere from a couple of years to more than a decade. In a study in JAMA Internal Medicine in 2015, more than half of almost 1,500 women enrolled at ages 42 to 52 reported frequent hot flashes — occurring at least six days in the previous two weeks — with symptoms lasting more than seven years.

A sizable number of women have moderate or severe hot flashes, which spread throughout the body and can include profuse sweating, heart palpitations or anxiety. In a study of 255 menopausal women, moderate to severe hot flashes were most common, occurring in 46 percent of women, during the two years after participants’ last menstrual period. A third of all the women still experienced heightened hot flashes 10 years after menopause, researchers reported in 2014 in Menopause.

Besides hot flashes and night sweats, roughly 40 percent of menopausal women experience irritation and dryness of the vulva and vagina, which can make sexual intercourse painful. These symptoms tend to arise after the final period.

Alarm bells
In the 1980s and ’90s, researchers observed that women using hormone therapy for menopausal symptoms had a lower risk of heart disease, bone fractures and overall death. Some doctors began recommending the medication not just for symptom relief, but also for disease prevention.

Observational studies of the apparent health benefits of hormone therapy spurred a more stringent study, a randomized controlled trial, which tested the treatment’s impact by randomly assigning hormones to some volunteers and not others. The WHI hormone therapy trials assessed heart disease, breast cancer, stroke, blood clots, colorectal cancer, hip fractures and deaths from other causes in women who used the hormones versus those who took a placebo. Two commonly prescribed formulations were tested: a combined hormone therapy — estrogen sourced from horses plus synthetic progesterone — and estrogen alone. (Today, additional U.S. Food and Drug Administration–approved formulations are available.)

The 2002 WHI report in JAMA, which described early results of the combined hormone therapy, shocked the medical community. The study was halted prematurely because after about five years, women taking the hormones had a slightly higher risk of breast cancer and an overall poor risk-to-benefit ratio compared with women taking the placebo. While the women taking hormones had fewer hip fractures and colorectal cancers, they had more breast cancers, heart disease, blood clots and strokes. The findings were reported in terms of the relative risk, the ratio of how often a disease happened in one group versus another. News of a 26 percent increase in breast cancers and a 41 percent increase in strokes caused confusion and alarm.

Women dropped the hormones in droves. From 2001 to 2009, the use of all hormone therapy among menopausal women, as reported by physicians based on U.S. office visits, fell 52 percent, according to a 2011 study in Menopause.

But, researchers say, the message that hormone therapy was bad for all was unwarranted. “The goal of the WHI was to evaluate the balance of benefits and risks of menopausal hormone therapy when used for prevention of chronic disease,” says JoAnn Manson, a physician epidemiologist at Harvard-affiliated Brigham and Women’s Hospital in Boston and one of the lead investigators of the WHI. “It was not intended to evaluate its role in managing menopausal symptoms.”

Along with the focus on prevention, the WHI hormone therapy trials were largely studies of older women — in their 60s and 70s. Only around one-third of participants started the trial between ages 50 and 59, the age group more likely to be in need of symptom relief. Hormone therapy “was always primarily a product to use in women entering menopause,” says Howard Hodis, a physician scientist who focuses on preventive medicine at the University of Southern California’s Keck School of Medicine in Los Angeles. “The observational studies were based on these women.”

Also lost in the coverage of the 2002 study results was the absolute risk, the actual difference in the number of cases of disease between two groups. The group on combined hormone therapy had eight more cases of breast cancer per 10,000 women per year than the group taking a placebo. Hodis notes that that absolute risk translates to less than one extra case for every 1,000 women, which is classified as a rare risk by the Council for International Organizations of Medical Sciences, a World Health Organization group. There was also less than one additional case for every 1,000 women per year for heart disease and for stroke in the hormone-treated women compared with those on placebo.
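The relative-versus-absolute distinction is simple arithmetic. The sketch below uses only the two figures quoted in this story, a 26 percent relative increase and 8 extra cases per 10,000 women per year; the baseline rate it derives and the helper function are illustrative, not taken from the WHI report:

```python
# Relative risk is a ratio between two groups' rates; absolute risk is
# the difference between them. The same finding can sound alarming in
# one framing and rare in the other.

def extra_cases(baseline_per_10k, relative_increase):
    """Extra cases per 10,000 women per year, given a baseline rate
    and a fractional relative increase (hypothetical helper)."""
    return baseline_per_10k * relative_increase

# Working backward from the article's figures: if a 26 percent relative
# increase equals 8 extra cases per 10,000, the placebo group's baseline
# must have been roughly 8 / 0.26, or about 31 cases per 10,000.
baseline = 8 / 0.26
extra = extra_cases(baseline, 0.26)

print(f"baseline: {baseline:.0f} per 10,000 women per year")
print(f"extra cases: {extra:.0f} per 10,000, i.e. {extra / 10:.1f} per 1,000")
```

Framed this way, the same result reads both as "a 26 percent increase" and as "fewer than one extra case per 1,000 women per year," which is the gap the absolute-risk figures were meant to close.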

In 2004, researchers published results of the WHI study of estrogen-only therapy, taken for about seven years by women who had had their uteruses surgically removed. (Progesterone is added to hormone therapy to protect the uterus lining from a risk of cancer seen with estrogen alone.) The trial, also stopped early, reported a decreased risk of hip fractures and breast cancer, but an increased risk of stroke. The study didn’t change the narrative that hormone therapy wasn’t safe.

Timing is everything
Since the turn away from hormone therapy, follow-up studies have brought nuance not initially captured by the first two reports. Researchers were finally able to tease out the results that applied to “the young women — and I love saying this — young women 50 to 59 who are most apt to present with symptoms of menopause,” says Cynthia Stuenkel, an internist and endocrinologist at the University of California, San Diego School of Medicine in La Jolla.

In 2013, Manson and colleagues reported data from the WHI grouped by age. It turned out that absolute risks were smaller for 50- to 59-year-olds than they were for older women, especially those 70 to 79 years old, for both combined therapy and estrogen alone. For example, in the combined hormone therapy trial, treated 50- to 59-year-olds had five additional cases of heart disease and five more strokes per 10,000 women annually compared with the same-aged group on placebo. But the treated 70- to 79-year-olds had 19 more heart disease cases and 13 more strokes per 10,000 women annually than women of the same age taking a placebo. “So a lot more of these events that were of concern were in the older women,” Stuenkel says.

A Danish study reported in 2012 of about 1,000 recently postmenopausal women, ages 45 to 58, also supported the idea that timing of hormone treatment matters. The randomized controlled trial examined the use of different formulations of estrogen (17β-estradiol) and progesterone than the WHI. The researchers reported in BMJ that after 10 years, women taking hormone therapy — combined or estrogen alone — had a reduced risk of mortality, heart failure or heart attacks, and no added risk of cancer, stroke or blood clots compared with those not treated.

These findings provide evidence for the timing hypothesis, also supported by animal studies, as an explanation for the results seen in younger women, especially in terms of heart disease and stroke. In healthy blood vessels, more common in younger women, estrogen can slow the development of artery-clogging plaques. But in vessels that already have plaque buildup, more likely in older women, estrogen may cause the plaques to rupture and block an artery, Manson explains.

Recently, Manson and colleagues published a long-term study of the risk of death in women in the two WHI hormone therapy trials — combined therapy and estrogen alone — from the time of trial enrollment in the mid-1990s until the end of 2014. Use of either hormone therapy was not associated with an added risk of death during the study or follow-up periods due to any cause or, specifically, death from heart disease or cancer, the researchers reported in JAMA in September 2017. The study provides reassurance that taking hormone therapy, at least for five to seven years, “does not show any mortality concern,” Stuenkel says.

Both the Endocrine Society and the North American Menopause Society state that, for symptom relief, the benefits of FDA-approved hormone therapy outweigh the risks in women younger than 60 or within 10 years of their last period, absent health issues such as a high risk of breast cancer or heart disease. The menopause society position statement adds that there are also benefits for women at high risk of bone loss or fracture.

Today, the message about hormone therapy is “not everybody needs it, but if you’re a candidate, let’s talk about the pros and cons, and let’s do it in a science-based way,” Pinkerton says.

Hormone therapy is the most effective treatment for hot flashes, night sweats and genital symptoms, she says. A review of randomized controlled trials, published in 2004, reported that hormone therapy decreased the frequency of hot flashes by 75 percent and reduced their severity as well.

More than 50 million U.S. women will be older than 51 by 2020, Manson says. Yet today, many women have a hard time finding a physician who is comfortable prescribing hormone therapy or even just managing a patient’s menopausal symptoms, she says.

Stuenkel, who says many younger doctors stopped learning about hormone therapy after 2002, is trying to play catch up. When she teaches medical students and doctors about treating menopausal symptoms, she brings up three questions to ask patients. First, how bothersome are the symptoms? Some women say “fix it, get me through the day and the night, put me back in order,” Stuenkel says. Other women’s symptoms are not as disruptive. Second, what does the patient want? Third, what is safe for this particular woman, based on her health? If a woman’s health history doesn’t support the use of hormone therapy, or she just isn’t interested, there are nonhormonal options, such as certain antidepressants, and also nondrug lifestyle approaches.

Menopause looms large for many women, Povar says, and discussing a patient’s expectations as well as whether hormone therapy is the right approach becomes a unique discussion with each patient, she says. “This is one of the most individual decisions a woman makes.”

50 years ago, IUDs were deemed safe and effective

In 1929, the German scientist Ernst Grafenberg inserted silver rings into the uteri of 2,000 women, and reported a pregnancy rate of only 1.6 percent. Despite this history, the use of intrauterine devices, or IUDs, was not generally accepted.… A report made public last week by the FDA’s Advisory Committee on Obstetrics and Gynecology concludes that while it doesn’t know how they work, it finds IUDs to be safe and effective in blocking conception. — Science News, February 3, 1968.

Update
Early intrauterine devices came in myriad shapes, including a double-S, loops and spirals. One IUD, the spiked Dalkon Shield, was taken off the market in 1974 amid complaints of severe infections. Consumers quickly lost interest. But after companies redesigned the devices in the 1990s, use rose. From 1988 to 2002, just 1.5 percent of U.S. women ages 15 to 44 used an IUD; from 2011 to 2013, use was as high as 7.2 percent.

Scientists now know how IUDs prevent pregnancy. Hormonal IUDs thin the lining of the uterus and thicken the mucus on the cervix, preventing sperm from swimming. The devices can also reduce how frequently women ovulate. Copper IUDs and others without hormones prevent pregnancy by releasing ions that create a sperm- and egg-killing environment in women’s reproductive tracts. IUDs and other long-acting contraceptives are currently the most reliable reversible forms of birth control (SN: 6/30/12, p. 9).

Massive dust storms are robbing Mars of its water

Storms of powdery Martian soil are contributing to the loss of the planet’s remaining water.

This newly proposed mechanism for water loss, reported January 22 in Nature Astronomy, might also hint at how Mars originally became dehydrated. Researchers used over a decade of imaging data taken by NASA’s Mars Reconnaissance Orbiter to investigate the composition of the Red Planet’s frequent dust storms, some of which are vast enough to circle the planet for months.

During one massive dust storm in 2006 and 2007, signs of water vapor were found at unusually high altitudes in the atmosphere, nearly 80 kilometers up. That water vapor rose within “rocket dust storms” — storms with rapid vertical movement — on convection currents similar to those in some storm clouds on Earth, says study coauthor Nicholas Heavens, an astronomer at Hampton University in Virginia.

At altitudes above 50 kilometers, ultraviolet light from the sun easily penetrates the Red Planet’s thin atmosphere and breaks down water’s chemical bonds between hydrogen and oxygen. Left to its own devices, hydrogen slips free into space, leaving the planet with less of a vital ingredient for water.

“Because it’s so light, hydrogen is lost relatively easily on Mars,” Heavens says. “Hydrogen loss is measurable from Earth, too, but we have so much water that it’s not a big deal.”
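As a rough illustration of why hydrogen is the atom that gets away, the sketch below compares planetary escape speeds with the mean thermal speeds of hydrogen and oxygen atoms. The planetary masses and radii and the 300 kelvin temperature are standard textbook values, not figures from the study:

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23  # Boltzmann constant, J/K

def escape_velocity(mass_kg, radius_m):
    """Speed needed to escape a body's gravity: v = sqrt(2GM/r)."""
    return math.sqrt(2 * G * mass_kg / radius_m)

def thermal_speed(atom_mass_kg, temp_k):
    """Mean thermal speed of a gas atom: v = sqrt(3kT/m)."""
    return math.sqrt(3 * k_B * temp_k / atom_mass_kg)

v_mars = escape_velocity(6.42e23, 3.39e6)   # about 5 km/s
v_earth = escape_velocity(5.97e24, 6.37e6)  # about 11 km/s

# At 300 K, a hydrogen atom (1.67e-27 kg) already moves at more than
# half of Mars' escape speed, so the fast tail of the thermal
# distribution escapes readily; an oxygen atom, 16 times heavier, moves
# at a quarter of hydrogen's speed and stays bound.
v_h = thermal_speed(1.67e-27, 300)   # about 2.7 km/s
v_o = thermal_speed(2.66e-26, 300)   # about 0.7 km/s

print(f"escape: Mars {v_mars/1e3:.1f} km/s, Earth {v_earth/1e3:.1f} km/s")
print(f"thermal at 300 K: H {v_h/1e3:.1f} km/s, O {v_o/1e3:.2f} km/s")
```

With Earth’s escape speed more than twice Mars’, far less of the hydrogen tail clears the bar there, consistent with Heavens’ remark above.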

Previous studies have indicated that Mars, which was once covered in an ocean about 100 meters deep, lost the bulk of its water through hydrogen escape (SN Online: 10/15/14). But this is the first study to identify dust storms as a mechanism for helping the gas break away. The total effect of all dust storms could account for about 10 percent of Mars’ current hydrogen loss, Heavens says.

Whether that was true in the past is up in the air. Extrapolating back billions of years, to when Mars was warm and wet, isn’t so easy. Scientists don’t know how dust storms would have worked in a wetter climate or a thicker atmosphere.

“Variations over weeks or months don’t really tell you anything about the 1,000-year timescale that governs hydrogen,” notes Kevin Zahnle, an astronomer at NASA’s Ames Research Center in Moffett Field, Calif., who was not involved in the study.

But Zahnle, an expert on atmospheric escape of gases, agrees with the main thrust of the study: Right now, dust storms are helping to bleed Mars dry.

Life may have been possible in Earth’s earliest, most hellish eon

Maybe Earth’s early years weren’t so hellish after all.

Asteroid strikes repeatedly bombarded the planet during its first eon, but the heat released by those hits wasn’t as sterilizing as once thought, new research suggests. Simulations indicate that after the first few hundred million years of bombardment, the heat from the impacts had dissipated enough that 10 to 75 percent of the top kilometer of the subsurface was habitable for mesophiles — microbes that live in temperatures of 20° to 50° Celsius. If so, the planet may have been habitable much earlier than previously believed.

Earth’s earliest eon, the Hadean, spans the period from about 4.6 billion years ago, when the planet was born, to 4 billion years ago. The name, for the Greek god of the underworld, reflects the original conception of the age: dark and hellish and inhospitable to life. But little direct evidence of Hadean asteroid impacts still exists, limiting scientists’ understanding of how those collisions affected the planet’s habitability.

“There has been an assumption that the Hadean was mostly an uninteresting slag heap until the sky stopped falling and life could take hold,” says Stephen Mojzsis, a geologist at the University of Colorado Boulder. That’s not to say that all of the Hadean was pleasant; the first 150 million years of Earth’s history, which included the giant whack that formed the moon, were pretty dramatic. But after that, things settled down considerably, says Mojzsis, who was not an author of the new study.

For example, scientists have found signs of liquid water and even faint hints of possible life in zircon crystals dating back 4.1 billion years (SN: 11/28/15, p. 16). Other researchers have contested the idea that Earth was continually bombarded by asteroids through much of the Hadean, or that a last barrage of asteroids shelled the planet 3.9 billion years ago in what has been called the Late Heavy Bombardment, killing any incipient life (SN Online: 9/12/16).

In the new study, geophysicist Robert Grimm and planetary scientist Simone Marchi, both of the Southwest Research Institute in Boulder, Colo., estimated how hot it would have been just a few kilometers beneath the planet’s surface during the Hadean. Using an estimated rate of asteroid bombardment, plus estimates of how much heat the projectiles would have added to the subsurface and how quickly that heat would have dissipated, the scientists simulated how hot it got — and whether microbial life could have withstood those conditions. The research built on earlier work, including Marchi’s 2014 finding that asteroid impacts became smaller and less frequent with time (SN: 8/23/14, p. 13).

Asteroid impacts did heat the subsurface, according to the simulations, but even the heaviest bombardment scenarios were not intense enough to sterilize the planet, the researchers report March 1 in Earth and Planetary Science Letters. And if the rate of bombardment did decrease as the eon progressed, the heat the asteroids delivered to Earth’s subsurface would also have had time to dissipate. As a result, that habitable zone would have increased over time.

A Late Heavy Bombardment, if it occurred, would have been tougher for the microbes, because the heat wouldn’t have had time to dissipate with such a rapid barrage. But that just would have meant the habitable zone didn’t increase, the team reports; mesophiles could still have inhabited at least 20 percent of the top kilometer of subsurface.

Mojzsis says he’s come to similar conclusions in his own work. “For a long time people said, with absolutely no data, that there could be no biosphere before 3.9 billion years ago,” he says. But “after the solar system settled down, the biosphere could have started on Earth 4.4 billion years ago.”

That’s not to say that there was definitely life, Grimm notes. Although the heat from impacts may not have been a limiting factor for life, asteroid bombardment introduced numerous other challenges, affecting the climate, surface or even convection of the mantle. Still, the picture of Earth’s earliest days is undergoing a sea change. As Grimm says, “An average day in the Hadean did not spell doom.”

Clumps of dark matter could be lurking undetected in our galaxy

Clumps of dark matter may be sailing through the Milky Way and other galaxies.

Typically thought to form featureless blobs surrounding entire galaxies, dark matter could also collapse into smaller clumps — similar to normal matter condensing into stars and planets — a new study proposes. Thousands of collapsed dark clumps could constitute 10 percent of the Milky Way’s dark matter, researchers from Rutgers University in Piscataway, N.J., report in a paper accepted in Physical Review Letters.

Dark matter is necessary to explain the motions of stars in galaxies. Without an extra source of mass, astronomers can’t explain why stars move at the speeds they do. Such observations suggest that a spherical “halo” of invisible, unidentified massive particles surrounds each galaxy.

But the halo might be only part of the story. “We don’t really know what dark matter at smaller scales is doing,” says theoretical physicist Matthew Buckley, who coauthored the study with physicist Anthony DiFranzo. More complex structures might be hiding within the halo.

To collapse, dark matter would need a way to lose energy, slowing particles as gravity pulls them into the center of the clump, so they can glom on to one another rather than zipping right through. In normal matter, this energy loss occurs via electromagnetic interactions. But the most commonly proposed type of dark matter particles, weakly interacting massive particles, or WIMPs, have no such way to lose energy.

Buckley and DiFranzo imagined what might happen if an analogous “dark electromagnetism” allowed dark matter particles to interact and radiate energy. The researchers considered how dark matter would behave if it were like a pared-down version of normal matter, composed of two types of charged particles — a dark proton and a dark electron. Those particles could interact — forming dark atoms, for example — and radiate energy in the form of dark photons, a dark matter analog to particles of light.

The researchers found that small clouds of such dark matter could collapse, but larger clouds, the mass of the Milky Way, for example, couldn’t — they have too much energy to get rid of. This finding means that the Milky Way could harbor a vast halo, with a sprinkling of dark matter clumps within. By picking particular masses for the hypothetical particles, the researchers were able to calculate the number and sizes of clumps that could be floating through the Milky Way. Varying the choice of masses led to different levels of clumpiness.

In Buckley and DiFranzo’s scenario, the dark matter can’t squish down to the size of a star. Before the clumps get that small, they reach a point where they can’t lose any more energy. So a single clump might be hundreds of light-years across.

The result, says theoretical astrophysicist Dan Hooper of Fermilab in Batavia, Ill., is “interesting and novel … but it also leaves a lot of open questions.” Without knowing more about dark matter, it’s hard to predict what kind of clumps it might actually form.

Scientists have looked for the gravitational effects of unidentified, star-sized objects known as massive compact halo objects, or MACHOs, which could be made of either normal matter or dark matter. But such objects turned out to be too rare to make up a significant fraction of dark matter. On the other hand, says Hooper, “what if these things collapse to solar system‒sized objects?” Such larger clumps haven’t been ruled out yet.

By looking for the effects of unexplained gravitational tugs on stars, scientists may be able to determine whether galaxies are littered with dark matter clumps. “Because we didn’t think these things were a possibility, I don’t think people have looked,” Buckley says. “It was a blind spot.”

Up until now, most scientists have focused on WIMPs. But after decades of searching in sophisticated detectors, there’s no sign of the particles (SN: 11/12/16, p. 14). As a result, says theoretical physicist Hai-Bo Yu of the University of California, Riverside, “there’s a movement in the community.” Scientists are now exploring new ideas for what dark matter might be.

When it’s playtime, many kids prefer reality over fantasy

Young children travel to fantasy worlds every day, packing just imaginations and a toy or two.

Some preschoolers scurry across ocean floors carrying toy versions of cartoon character SpongeBob SquarePants. Other kids trek to distant universes with miniature replicas of Star Wars robots R2-D2 and C-3PO. Throngs of youngsters fly on broomsticks and cast magic spells with Harry Potter and his Hogwarts buddies. The list of improbable adventures goes on and on.

Parents today take for granted that kids need toys to fuel what comes naturally — outlandish bursts of make-believe. Kids’ flights of fantasy are presumed to soar before school and life’s other demands yank the youngsters down to Earth.

Yet some researchers call childhood fantasy play — which revolves around invented characters and settings with little or no relationship to kids’ daily lives — highly overrated. From at least the age when they start talking, little ones crave opportunities to assist parents at practical tasks and otherwise learn how to be productive members of their cultures, these investigators argue.

New findings support the view that children are geared more toward helping than fantasizing. Preschoolers would rather perform real activities, such as cutting vegetables or feeding a baby, than pretend to do those same things, scientists say. Even in the fantastical realm of children’s fiction books, reality may have an important place. Young U.S. readers show signs of learning better from human characters than from those ever-present talking pigs and bears.

Studies of children in traditional societies illustrate the dominance of reality-based play outside modern Western cultures. Kids raised in hunter-gatherer communities, farming villages and herding groups rarely play fantasy games. Children typically play with real tools, or small replicas of tools, in what amounts to practice for adult work. Playgroups supervised by older children enact make-believe versions of what adults do, such as sharing hunting spoils.

These activities come much closer to the nature of play in ancient human groups than do childhood fantasies fueled by mass-produced toys, videos and movies, researchers think.

Handing over household implements to toddlers and preschoolers and letting them play at working, or allowing them to lend a hand on daily tasks, gains little traction among Western parents, says psychologist Angeline Lillard of the University of Virginia in Charlottesville. Many adults, leaning heavily on adult-supervised playdates, assume preschoolers and younger kids need to be protected from themselves. Lillard suspects that preschoolers, whose early helping impulses get rebuffed by anxious parents, often rebel when told to start doing household chores a few years later.

“Kids like to do real things because they want a role in the real world,” Lillard says. “Our society has gone overboard in stressing the importance of pretense and fantasy for young children.”

Keep it real
Lillard suspects most preschoolers agree with her.

More than 40 years of research fails to support the widespread view that playing pretend games generates special social or mental benefits for young children, Lillard and colleagues wrote in a 2013 review in Psychological Bulletin. Studies that track children into their teens and beyond are sorely needed to establish any beneficial effects of pretending to be other people or acting out imaginary situations, the researchers concluded.

Even the assumption that kids naturally gravitate toward make-believe worlds may be unrealistic. When given a choice, 3- to 6-year-olds growing up in the United States — one of many countries saturated with superhero movies, video games and otherworldly action figures — preferred performing real activities over pretending to do them, Lillard and colleagues reported online June 20 in Developmental Science.

One hundred youngsters, most of them white and middle class, were tested either in a children’s museum, a preschool or a university laboratory. An experimenter showed each child nine pairs of photographs. Each photo in a pair featured a boy or a girl, to match the sex of the youngster being tested. One photo showed a child in action. Depicted behaviors included cutting vegetables with a knife, talking on a telephone and bottle-feeding a baby. In the second photo, a different child pretended to do what the first child did for real.

When asked by the experimenter whether they would rather, say, cut real vegetables with a knife like the first child or pretend to do so like the second child, preschoolers chose the real activity almost two-thirds of the time. Among the preschoolers, hard-core realists outnumbered fans of make-believe, the researchers found. Whereas 16 kids always chose real activities, only three wanted to pretend on every trial. Just as strikingly, 48 children (including seven of the 26 3-year-olds) chose at least seven real activities of the nine depicted. Only 14 kids (mostly the younger ones) selected at least seven pretend activities.

Kids often said they liked real activities for practical reasons, such as wanting to learn how to feed babies to help mom. Hands-on activities also got endorsed for being especially fun or novel. “I’ve never talked on the real phone,” one child explained. Reasons for choosing pretend activities centered on being afraid of the real activity or liking to pretend.

In a preliminary follow-up study directed by Lillard, 16 girls and boys, ages 3 to 6, chose between playing with 10 real objects, such as a microscope, or toy versions of the same objects. During 10-minute play periods, kids spent an average of about twice as much time with real items. That preference for real things increased with age. Three-year-olds spent nearly equal time playing with genuine and pretend items, but the older children strongly preferred the real deal.

Lillard’s findings illustrate that kids want and need real experiences, says psychologist Thalia Goldstein of George Mason University in Fairfax, Va. “Modern definitions of childhood have swung too far toward thinking that young children should live in a world of fantasy and magic,” she maintains.

But pretend play, including fantasy games, still has value in fostering youngsters’ social and emotional growth, Goldstein and Matthew Lerner of Stony Brook University in New York reported online September 15 in Developmental Science. After participating in 24 play sessions, 4- and 5-year-olds from poor families were tested on empathy and other social skills. Those who played dramatic pretend games (being a superhero, animal or chef, for instance) were less likely than kids who played with blocks or read stories to become visibly upset upon seeing an experimenter who the kids believed had hurt a knee or finger, the researchers found. Playing pretend games enabled kids to rein in distress at seeing the experimenter in pain, the researchers proposed.

It’s not known whether fantasy- and reality-based games shape kids’ social skills in different ways over the long haul, Goldstein says.

True fiction
Even on the printed page, where youngsters gawk at Maurice Sendak’s goggle-eyed Wild Things and Dr. Seuss’ mustachioed Lorax, the real world exerts a special pull.

Consider 4- to 6-year-olds who were read either a storybook about a little raccoon that learns to share with other animals or the same storybook with illustrations of human characters learning to share. Both versions told of how characters felt better after giving some of what they had to others. A third set of kids heard an illustrated storybook about seeds that had nothing to do with sharing. Each group consisted of 32 children.

Only kids who heard the realistic story displayed a general willingness to act on its message, reported a team led by psychologist Patricia Ganea of the University of Toronto in a paper published online August 2 in Developmental Science. On a test of children’s willingness to share any of 10 stickers with a child described as unable to participate in the experiment, listeners to the tale with human characters forked over an average of nearly three stickers, about one more than the kids had donated before the experiment.

Children who heard stories with animal characters became less giving, sharing an average of 1.7 stickers after having originally donated an average of 2.3 stickers. Sticker sharing declined similarly among kids who heard the seed story. These results fit with several previous studies showing that preschoolers more easily apply knowledge learned from realistic stories to the real world, as opposed to information encountered in fantasy stories.

Even for fiction stories that are highly unrealistic, youngsters generally favor realistic endings, say Boston University psychologist Melissa Kibbe and colleagues. In a study from the team published online June 15 in Psychology of Aesthetics, Creativity and the Arts, an experimenter read 90 children, ages 4 to 6, one of three illustrated versions of a story. In the tale, a child gets lost on the way to a school bus. A realistic version was set in a present-day city. A futuristic science fiction version was set on the moon. A fantasy version occurred in medieval times and included magical characters. Stories ended with descriptions and illustrations of a child finally locating either a typical school bus, a futuristic school bus with rockets on its sides or a magical coach with dragon wings.

When given the chance, 40 percent of kids inserted a typical school bus into the ending for the science fiction story and nearly 70 percent did so for the fantasy tale. “Children have a bias toward reality when completing stories,” Kibbe says.

Hands on
Outside Western cultures, children’s bias toward reality takes an extreme turn, especially during play.

Nothing keeps it real like a child merrily swinging around a sharp knife as adults go about their business. That’s cause for alarm in Western households. But in many foraging communities, children play with knives and even machetes with their parents’ blessing, says anthropologist David Lancy of Utah State University in Logan.

Lancy describes reported instances of youngsters from hunter-gatherer groups playing with knives in his 2017 book Raising Children. Among Maniq foragers inhabiting southern Thailand’s forests, for instance, one researcher observed a father looking on approvingly as his baby crawled along holding a knife about as long as a dollar bill. The same investigator observed a 4-year-old Maniq girl sitting by herself cutting pieces of vegetation with a machete.

In East Africa, a Hadza infant can grab a knife and suck on it undisturbed, at least until an adult needs to use the tool. On Vanatinai Island in the South Pacific, children freely experiment with knives and pieces of burning wood from campfires.

Yes, accidents happen. That doesn’t mean hunter-gatherer parents are uncaring or indifferent toward their children, Lancy says. In these egalitarian societies, where sharing food and other resources is the norm, parents believe it’s wrong to impose one’s will on anyone, including children. Hunter-gatherer adults assume that a child learns best through hands-on, sometimes risky, exploration on his or her own and in groups with other kids. In that way, the adults’ thinking goes, youngsters develop resourcefulness, creativity and determination. Self-inflicted cuts and burns represent learning opportunities.

In many societies, adults make miniature tools for children to play with or give kids cast-off tools to use as toys. For instance, Inuit boys have been observed mimicking seal hunts with items supplied by parents, such as pieces of sealskin and miniature harpoons. Girls in Ecuador’s Conambo tribe mold clay balls provided by their mothers into various shapes as a first step toward becoming potters.

Childhood games and toys in foraging groups and farming villages, as in Western nations, reflect cultural values. Hunter-gatherer kids rarely engage in rough-and-tumble or competitive games. In fact, competition is discouraged. These kids concoct games with no winners, such as throwing a weighted feather in the air and flicking the feather back up as it descends. Children in many farming villages and herding societies play basic forms of marbles, in which each player shoots a hard object at similar objects to knock the targets out of a defined area. The rules change constantly as players decide among themselves what counts and what doesn’t.

Children in traditional societies don’t invent fantasy characters to play with, Lancy says. Consider imaginative play among children of Aka foragers in the Central African Republic. These kids may pretend to be forest animals, but the animals are creatures from the children’s surroundings, such as antelope. The children aim to take the animals’ perspective to determine what route to follow while exploring, says anthropologist Adam Boyette of Duke University. Aka youngsters sometimes pretend to be spirits that adults have told the kids about. In this way, kids become familiar with community beliefs and rituals.

Aka childhood activities are geared toward adult work, Boyette says. Girls start foraging for food within the first few years of life. Boys take many years to master dangerous tasks, such as climbing trees to raid honey from bees’ nests (SN: 8/20/16, p. 10). By around age 7, boys start to play hunting games and graduate to real hunts as teenagers.

In 33 hunter-gatherer societies around the world, parents typically take 1- to 2-year-olds on foraging expeditions and give the youngsters toy versions of tools to manipulate, reported psychologist Sheina Lew-Levy of the University of Cambridge and her colleagues in the December Human Nature. Groups of children at a range of ages play make-believe versions of what adults do and get in some actual practice at tasks such as toolmaking. Youngsters generally become proficient food collectors and novice toolmakers between ages 8 and 12, the researchers conclude. Adults, but not necessarily parents, begin teaching hunting and complex toolmaking skills to teens. For the report, Lew-Levy’s group reviewed 58 papers on childhood learning among hunter-gatherers, most published since 2000.

“There’s a blurred line between work and play in foraging societies because children are constantly rehearsing for adult roles by playing,” Boyette says.

Children in Western societies can profitably mix fantasy with playful rehearsals for adult tasks, observes George Mason’s Goldstein, who was a professional stage actor before opting for steadier academic work. “My 5-year-old son is never happier than when he’s helping to check us out at the grocery store,” she says. “But he also likes to pretend to be a robot, and sometimes a robot who checks us out at the grocery store.”

Not too far in the future, preschoolers pretending to be robots may encounter real robots running grocery-store checkouts. Playtime will never be the same.