Hormone replacement makes sense for some menopausal women

Internist Gail Povar has many female patients making their way through menopause, some having a tougher time than others. Several women with similar stories stand out in her mind. Each came to Povar’s Silver Spring, Md., office within a year or two of stopping her period, complaining of frequent hot flashes and poor sleep at night. “They just felt exhausted all the time,” Povar says. “The joy had kind of gone out.”

And all of them “were just absolutely certain that they were not going to take hormone replacement,” she says. But the women had no risk factors that would rule out treating their symptoms with hormones. So Povar suggested the women try hormone therapy for a few months. “If you feel really better and it makes a big difference in your life, then you and I can decide how long we continue it,” Povar told them. “And if it doesn’t make any difference to you, stop it.”
At the follow-up appointments, all of these women reacted the same way, Povar recalls. “They walked in beaming, absolutely beaming, saying, ‘I can’t believe I didn’t do this a year ago. My life! I’ve got my life back.’ ”

That doesn’t mean, Povar says, that she’s pushing hormone replacement on patients. “But it should be on the table,” she says. “It should be part of the discussion.”

Hormone replacement therapy toppled off the table for many menopausal women and their doctors in 2002. That’s when a women’s health study, stopped early after a data review, published results linking a common hormone therapy to an increased risk of breast cancer, heart disease, stroke and blood clots. The trial, part of a multifaceted project called the Women’s Health Initiative, or WHI, was meant to examine hormone therapy’s effectiveness in lowering the risk of heart disease and other conditions in women ages 50 to 79. It wasn’t a study of hormone therapy for treating menopausal symptoms.

But that nuance got lost in the coverage of the study’s results, described at the time as a “bombshell,” a call to get off of hormone therapy right away. Women and doctors in the United States heeded the call. A 2012 study in Obstetrics & Gynecology showed that use plummeted: Oral hormone therapy, taken by an estimated 22 percent of U.S. women 40 and older in 1999–2000, was taken by fewer than 12 percent of women in 2003–2004. Six years later, the number of women using oral hormone therapy had sunk below 5 percent.
Specialists in women’s health say it’s time for the public and the medical profession to reconsider their views on hormone therapy. Research in the last five years, including a long-term follow-up of women in the WHI, has clarified the risks, benefits and ideal ages for hormone therapy. Medical organizations, including the Endocrine Society in 2015 and the North American Menopause Society in 2017, have released updated recommendations. The overall message is that hormone therapy offers more benefits than risks for the relief of menopausal symptoms in mostly healthy women of a specific age range: those who are under age 60 or within 10 years of stopping menstruation.

“A generation of women has missed out on effective treatment because of misinformation,” says JoAnn Pinkerton, executive director of the North American Menopause Society and a gynecologist who specializes in menopause at the University of Virginia Health System in Charlottesville. It’s time to move beyond 2002, she says, and have a conversation based on “what we know now.”

End of an era
Menopause, the final menstrual period, signals the end of fertility and is confirmed after a woman has gone 12 months without having a period. From then on she is postmenopausal. Women reach menopause around age 51, on average. In the four to eight years before, called perimenopause, the amount of estrogen in the body declines as ovarian function winds down. Women may have symptoms related to the lack of estrogen beginning in perimenopause and continuing after the final period.

Probably the best-known symptom is the hot flash, a sudden blast of heat, sweating and flushing in the face and upper chest. These temperature tantrums can occur at all hours. At night, hot flashes can produce drenching sweats and disrupt sleep.

Hot flashes arise because the temperature range in which the body normally feels comfortable narrows during the menopause transition, partly in response to the drop in estrogen. Normally, the body takes small changes in core body temperature in stride. But for menopausal women, even a slight rise in core temperature can trigger blood vessels to dilate, which increases blood flow and sweating.

About 75 to 80 percent of menopausal women experience hot flashes and night sweats, on and off, for anywhere from a couple of years to more than a decade. In a study in JAMA Internal Medicine in 2015, more than half of almost 1,500 women enrolled at ages 42 to 52 reported frequent hot flashes — occurring at least six days in the previous two weeks — with symptoms lasting more than seven years.

A sizable number of women have moderate or severe hot flashes, which spread throughout the body and can include profuse sweating, heart palpitations or anxiety. In a study of 255 menopausal women, moderate to severe hot flashes were most common, occurring in 46 percent of women, during the two years after participants’ last menstrual period. A third of all the women still experienced heightened hot flashes 10 years after menopause, researchers reported in 2014 in Menopause.

Besides hot flashes and night sweats, roughly 40 percent of menopausal women experience irritation and dryness of the vulva and vagina, which can make sexual intercourse painful. These symptoms tend to arise after the final period.

Alarm bells
In the 1980s and ’90s, researchers observed that women using hormone therapy for menopausal symptoms had a lower risk of heart disease, bone fractures and overall death. Some doctors began recommending the medication not just for symptom relief, but also for disease prevention.

Observational studies of the apparent health benefits of hormone therapy spurred a more stringent study, a randomized controlled trial, which tested the treatment’s impact by randomly assigning hormones to some volunteers and not others. The WHI hormone therapy trials assessed heart disease, breast cancer, stroke, blood clots, colorectal cancer, hip fractures and deaths from other causes in women who used the hormones versus those who took a placebo. Two commonly prescribed formulations were tested: a combined hormone therapy — estrogen sourced from horses plus synthetic progesterone — and estrogen alone. (Today, additional U.S. Food and Drug Administration–approved formulations are available.)
The 2002 WHI report in JAMA, which described early results of the combined hormone therapy, shocked the medical community. The study was halted prematurely because after about five years, women taking the hormones had a slightly higher risk of breast cancer and an overall poor risk-to-benefit ratio compared with women taking the placebo. While the women taking hormones had fewer hip fractures and colorectal cancers, they had more breast cancers, heart disease, blood clots and strokes. The findings were reported in terms of the relative risk, the ratio of how often a disease happened in one group versus another. News of a 26 percent increase in breast cancers and a 41 percent increase in strokes caused confusion and alarm.

Women dropped the hormones in droves. From 2001 to 2009, the use of all hormone therapy among menopausal women, as reported by physicians based on U.S. office visits, fell 52 percent, according to a 2011 study in Menopause.

But, researchers say, the message that hormone therapy was bad for all was unwarranted. “The goal of the WHI was to evaluate the balance of benefits and risks of menopausal hormone therapy when used for prevention of chronic disease,” says JoAnn Manson, a physician epidemiologist at Harvard-affiliated Brigham and Women’s Hospital in Boston and one of the lead investigators of the WHI. “It was not intended to evaluate its role in managing menopausal symptoms.”

Along with the focus on prevention, the WHI hormone therapy trials were largely studies of older women — in their 60s and 70s. Only around one-third of participants started the trial between ages 50 and 59, the age group more likely to be in need of symptom relief. Hormone therapy “was always primarily a product to use in women entering menopause,” says Howard Hodis, a physician scientist who focuses on preventive medicine at the University of Southern California’s Keck School of Medicine in Los Angeles. “The observational studies were based on these women.”

Also lost in the coverage of the 2002 study results was the absolute risk, the actual difference in the number of cases of disease between two groups. The group on combined hormone therapy had eight more cases of breast cancer per 10,000 women per year than the group taking a placebo. Hodis notes that that absolute risk translates to less than one extra case for every 1,000 women, which is classified as a rare risk by the Council for International Organizations of Medical Sciences, a World Health Organization group. There was also less than one additional case for every 1,000 women per year for heart disease and for stroke in the hormone-treated women compared with those on placebo.
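The gap between the two ways of reporting risk can be sketched in a few lines of arithmetic. The per-10,000 rates below are assumed round numbers chosen to be consistent with the figures in the text (eight more breast cancer cases per 10,000 women per year, and a roughly 26 percent relative increase); they are illustrative, not the trial's exact tallies.

```python
# Illustrative arithmetic: the same trial numbers expressed as a
# relative risk increase versus an absolute risk increase.
# Rates are assumed annual rates per woman, chosen to match the
# article's "8 more cases per 10,000 women per year" figure.

placebo_rate = 30 / 10_000   # assumed annual breast cancer rate, placebo group
hormone_rate = 38 / 10_000   # assumed annual rate, combined hormone therapy group

relative_increase = (hormone_rate - placebo_rate) / placebo_rate
absolute_increase = hormone_rate - placebo_rate

print(f"Relative increase: {relative_increase:.0%}")                      # ~27%
print(f"Extra cases per 1,000 women per year: {absolute_increase * 1_000:.1f}")  # 0.8
```

The same eight extra cases read as an alarming double-digit percentage in relative terms, but as less than one additional case per 1,000 women per year in absolute terms, which is why the two framings produced such different public reactions.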

In 2004, researchers published results of the WHI study of estrogen-only therapy, taken for about seven years by women who had had their uteruses surgically removed. (Progesterone is added to hormone therapy to protect the uterus lining from a risk of cancer seen with estrogen alone.) The trial, also stopped early, reported a decreased risk of hip fractures and breast cancer, but an increased risk of stroke. The study didn’t change the narrative that hormone therapy wasn’t safe.

Timing is everything
Since the turn away from hormone therapy, follow-up studies have brought nuance not initially captured by the first two reports. Researchers were finally able to tease out the results that applied to “the young women — and I love saying this — young women 50 to 59 who are most apt to present with symptoms of menopause,” says Cynthia Stuenkel, an internist and endocrinologist at the University of California, San Diego School of Medicine in La Jolla.

In 2013, Manson and colleagues reported data from the WHI grouped by age. It turned out that absolute risks were smaller for 50- to 59-year-olds than they were for older women, especially those 70 to 79 years old, for both combined therapy and estrogen alone. For example, in the combined hormone therapy trial, treated 50- to 59-year-olds had five additional cases of heart disease and five more strokes per 10,000 women annually compared with the same-aged group on placebo. But the treated 70- to 79-year-olds had 19 more heart disease cases and 13 more strokes per 10,000 women annually than women of the same age taking a placebo. “So a lot more of these events that were of concern were in the older women,” Stuenkel says.

A Danish study reported in 2012 of about 1,000 recently postmenopausal women, ages 45 to 58, also supported the idea that timing of hormone treatment matters. The randomized controlled trial examined the use of different formulations of estrogen (17β-estradiol) and progesterone than the WHI. The researchers reported in BMJ that after 10 years, women taking hormone therapy — combined or estrogen alone — had a reduced risk of mortality, heart failure or heart attacks, and no added risk of cancer, stroke or blood clots compared with those not treated.

These findings provide evidence for the timing hypothesis, also supported by animal studies, as an explanation for the results seen in younger women, especially in terms of heart disease and stroke. In healthy blood vessels, more common in younger women, estrogen can slow the development of artery-clogging plaques. But in vessels that already have plaque buildup, more likely in older women, estrogen may cause the plaques to rupture and block an artery, Manson explains.

Recently, Manson and colleagues published a long-term study of the risk of death in women in the two WHI hormone therapy trials — combined therapy and estrogen alone — from the time of trial enrollment in the mid-1990s until the end of 2014. Use of either hormone therapy was not associated with an added risk of death during the study or follow-up periods due to any cause or, specifically, death from heart disease or cancer, the researchers reported in JAMA in September 2017. The study provides reassurance that taking hormone therapy, at least for five to seven years, “does not show any mortality concern,” Stuenkel says.

Both the Endocrine Society and the North American Menopause Society state that, for symptom relief, the benefits of FDA-approved hormone therapy outweigh the risks in women younger than 60 or within 10 years of their last period, absent health issues such as a high risk of breast cancer or heart disease. The menopause society position statement adds that there are also benefits for women at high risk of bone loss or fracture.

Today, the message about hormone therapy is “not everybody needs it, but if you’re a candidate, let’s talk about the pros and cons, and let’s do it in a science-based way,” Pinkerton says.

Hormone therapy is the most effective treatment for hot flashes, night sweats and genital symptoms, she says. A review of randomized controlled trials, published in 2004, reported that hormone therapy decreased the frequency of hot flashes by 75 percent and reduced their severity as well.

More than 50 million U.S. women will be older than 51 by 2020, Manson says. Yet today, many women have a hard time finding a physician who is comfortable prescribing hormone therapy or even just managing a patient’s menopausal symptoms, she says.

Stuenkel, who says many younger doctors stopped learning about hormone therapy after 2002, is trying to play catch-up. When she teaches medical students and doctors about treating menopausal symptoms, she brings up three questions to ask patients. First, how bothersome are the symptoms? Some women say “fix it, get me through the day and the night, put me back in order,” Stuenkel says. Other women’s symptoms are not as disruptive. Second, what does the patient want? Third, what is safe for this particular woman, based on her health? If a woman’s health history doesn’t support the use of hormone therapy, or she just isn’t interested, there are nonhormonal options, such as certain antidepressants, as well as nondrug lifestyle approaches.

Menopause looms large for many women, Povar says, and discussing a patient’s expectations as well as whether hormone therapy is the right approach becomes a unique discussion with each patient, she says. “This is one of the most individual decisions a woman makes.”

When it’s playtime, many kids prefer reality over fantasy

Young children travel to fantasy worlds every day, packing just imaginations and a toy or two.

Some preschoolers scurry across ocean floors carrying toy versions of cartoon character SpongeBob SquarePants. Other kids trek to distant universes with miniature replicas of Star Wars robots R2-D2 and C-3PO. Throngs of youngsters fly on broomsticks and cast magic spells with Harry Potter and his Hogwarts buddies. The list of improbable adventures goes on and on.

Parents today take for granted that kids need toys to fuel what comes naturally — outlandish bursts of make-believe. Kids’ flights of fantasy are presumed to soar before school and life’s other demands yank the youngsters down to Earth.
Yet some researchers call childhood fantasy play — which revolves around invented characters and settings with no or little relationship to kids’ daily lives — highly overrated. From at least the age when they start talking, little ones crave opportunities to assist parents at practical tasks and otherwise learn how to be productive members of their cultures, these investigators argue.

New findings support the view that children are geared more toward helping than fantasizing. Preschoolers would rather perform real activities, such as cutting vegetables or feeding a baby, than pretend to do those same things, scientists say. Even in the fantastical realm of children’s fiction books, reality may have an important place. Young U.S. readers show signs of learning better from human characters than from those ever-present talking pigs and bears.
Studies of children in traditional societies illustrate the dominance of reality-based play outside modern Western cultures. Kids raised in hunter-gatherer communities, farming villages and herding groups rarely play fantasy games. Children typically play with real tools, or small replicas of tools, in what amounts to practice for adult work. Playgroups supervised by older children enact make-believe versions of what adults do, such as sharing hunting spoils.
These activities come much closer to the nature of play in ancient human groups than do childhood fantasies fueled by mass-produced toys, videos and movies, researchers think.
Handing over household implements to toddlers and preschoolers and letting them play at working, or allowing them to lend a hand on daily tasks, gains little traction among Western parents, says psychologist Angeline Lillard of the University of Virginia in Charlottesville. Many adults, leaning heavily on adult-supervised playdates, assume preschoolers and younger kids need to be protected from themselves. Lillard suspects that preschoolers, whose early helping impulses get rebuffed by anxious parents, often rebel when told to start doing household chores a few years later.

“Kids like to do real things because they want a role in the real world,” Lillard says. “Our society has gone overboard in stressing the importance of pretense and fantasy for young children.”

Keep it real
Lillard suspects most preschoolers agree with her.

More than 40 years of research fails to support the widespread view that playing pretend games generates special social or mental benefits for young children, Lillard and colleagues wrote in a 2013 review in Psychological Bulletin. Studies that track children into their teens and beyond are sorely needed to establish any beneficial effects of pretending to be other people or acting out imaginary situations, the researchers concluded.

Even the assumption that kids naturally gravitate toward make-believe worlds may be unrealistic. When given a choice, 3- to 6-year-olds growing up in the United States — one of many countries saturated with superhero movies, video games and otherworldly action figures — preferred performing real activities over pretending to do them, Lillard and colleagues reported online June 20 in Developmental Science.
One hundred youngsters, most of them white and middle class, were tested in a children’s museum, a preschool or a university laboratory. An experimenter showed each child nine pairs of photographs. Each photo in a pair featured a boy or a girl, to match the sex of the youngster being tested. One photo showed a child in action. Depicted behaviors included cutting vegetables with a knife, talking on a telephone and bottle-feeding a baby. In the second photo, a different child pretended to do what the first child did for real.

When asked by the experimenter whether they would rather, say, cut real vegetables with a knife like the first child or pretend to do so like the second child, preschoolers chose the real activity almost two-thirds of the time. Among the preschoolers, hard-core realists outnumbered fans of make-believe, the researchers found. Whereas 16 kids always chose real activities, only three wanted to pretend on every trial. Just as strikingly, 48 children (including seven of the 26 3-year-olds) chose at least seven real activities of the nine depicted. Only 14 kids (mostly the younger ones) selected at least seven pretend activities.

Kids often said they liked real activities for practical reasons, such as wanting to learn how to feed babies to help mom. Hands-on activities also got endorsed for being especially fun or novel. “I’ve never talked on the real phone,” one child explained. Reasons for choosing pretend activities centered on being afraid of the real activity or liking to pretend.

In a preliminary follow-up study directed by Lillard, 16 girls and boys, ages 3 to 6, chose between playing with 10 real objects, such as a microscope, or toy versions of the same objects. During 10-minute play periods, kids spent an average of about twice as much time with real items. That preference for real things increased with age. Three-year-olds spent nearly equal time playing with genuine and pretend items, but the older children strongly preferred the real deal.

Lillard’s findings illustrate that kids want and need real experiences, says psychologist Thalia Goldstein of George Mason University in Fairfax, Va. “Modern definitions of childhood have swung too far toward thinking that young children should live in a world of fantasy and magic,” she maintains.

But pretend play, including fantasy games, still has value in fostering youngsters’ social and emotional growth, Goldstein and Matthew Lerner of Stony Brook University in New York reported online September 15 in Developmental Science. After participating in 24 play sessions, 4- and 5-year-olds from poor families were tested on empathy and other social skills. Those who played dramatic pretend games (being a superhero, animal or chef, for instance) were less likely than kids who played with blocks or read stories to become visibly upset upon seeing an experimenter who the kids believed had hurt a knee or finger, the researchers found. Playing pretend games enabled kids to rein in distress at seeing the experimenter in pain, the researchers proposed.

It’s not known whether fantasy- and reality-based games shape kids’ social skills in different ways over the long haul, Goldstein says.

True fiction
Even on the printed page, where youngsters gawk at Maurice Sendak’s goggle-eyed Wild Things and Dr. Seuss’ mustachioed Lorax, the real world exerts a special pull.

Consider 4- to 6-year-olds who were read either a storybook about a little raccoon that learns to share with other animals or the same storybook with illustrations of human characters learning to share. Both versions told of how characters felt better after giving some of what they had to others. A third set of kids heard an illustrated storybook about seeds that had nothing to do with sharing. Each group consisted of 32 children.

Only kids who heard the realistic story displayed a general willingness to act on its message, reported a team led by psychologist Patricia Ganea of the University of Toronto in a paper published online August 2 in Developmental Science. On a test of children’s willingness to share any of 10 stickers with a child described as unable to participate in the experiment, listeners to the tale with human characters forked over an average of nearly three stickers, about one more than the kids had donated before the experiment.

Children who heard stories with animal characters became less giving, sharing an average of 1.7 stickers after having originally donated an average of 2.3 stickers. Sticker sharing declined similarly among kids who heard the seed story. These results fit with several previous studies showing that preschoolers more easily apply knowledge learned from realistic stories to the real world, as opposed to information encountered in fantasy stories.

Even for fiction stories that are highly unrealistic, youngsters generally favor realistic endings, say Boston University psychologist Melissa Kibbe and colleagues. In a study from the team published online June 15 in Psychology of Aesthetics, Creativity and the Arts, an experimenter read 90 children, ages 4 to 6, one of three illustrated versions of a story. In the tale, a child gets lost on the way to a school bus. A realistic version was set in a present-day city. A futuristic science fiction version was set on the moon. A fantasy version occurred in medieval times and included magical characters. Stories ended with descriptions and illustrations of a child finally locating either a typical school bus, a futuristic school bus with rockets on its sides or a magical coach with dragon wings.
When given the chance, 40 percent of kids inserted a typical school bus into the ending for the science fiction story and nearly 70 percent did so for the fantasy tale. “Children have a bias toward reality when completing stories,” Kibbe says.
Hands on
Outside Western cultures, children’s bias toward reality takes an extreme turn, especially during play.

Nothing keeps it real like a child merrily swinging around a sharp knife as adults go about their business. That’s cause for alarm in Western households. But in many foraging communities, children play with knives and even machetes with their parents’ blessing, says anthropologist David Lancy of Utah State University in Logan.

Lancy describes reported instances of youngsters from hunter-gatherer groups playing with knives in his 2017 book Raising Children. Among Maniq foragers inhabiting southern Thailand’s forests, for instance, one researcher observed a father looking on approvingly as his baby crawled along holding a knife about as long as a dollar bill. The same investigator observed a 4-year-old Maniq girl sitting by herself cutting pieces of vegetation with a machete.

In East Africa, a Hadza infant can grab a knife and suck on it undisturbed, at least until an adult needs to use the tool. On Vanatinai Island in the South Pacific, children freely experiment with knives and pieces of burning wood from campfires.

Yes, accidents happen. That doesn’t mean hunter-gatherer parents are uncaring or indifferent toward their children, Lancy says. In these egalitarian societies, where sharing food and other resources is the norm, parents believe it’s wrong to impose one’s will on anyone, including children. Hunter-gatherer adults assume that a child learns best through hands-on, sometimes risky, exploration on his or her own and in groups with other kids. In that way, the adults’ thinking goes, youngsters develop resourcefulness, creativity and determination. Self-inflicted cuts and burns represent learning opportunities.

In many societies, adults make miniature tools for children to play with or give kids cast-off tools to use as toys. For instance, Inuit boys have been observed mimicking seal hunts with items supplied by parents, such as pieces of sealskin and miniature harpoons. Girls in Ecuador’s Conambo tribe mold clay balls provided by their mothers into various shapes as a first step toward becoming potters.
Childhood games and toys in foraging groups and farming villages, as in Western nations, reflect cultural values. Hunter-gatherer kids rarely engage in rough-and-tumble or competitive games. In fact, competition is discouraged. These kids concoct games with no winners, such as throwing a weighted feather in the air and flicking the feather back up as it descends. Children in many farming villages and herding societies play basic forms of marbles, in which each player shoots a hard object at similar objects to knock the targets out of a defined area. The rules change constantly as players decide among themselves what counts and what doesn’t.

Children in traditional societies don’t invent fantasy characters to play with, Lancy says. Consider imaginative play among children of Aka foragers in the Central African Republic. These kids may pretend to be forest animals, but the animals are creatures from the children’s surroundings, such as antelope. The children aim to take the animals’ perspective to determine what route to follow while exploring, says anthropologist Adam Boyette of Duke University. Aka youngsters sometimes pretend to be spirits that adults have told the kids about. In this way, kids become familiar with community beliefs and rituals.
Aka childhood activities are geared toward adult work, Boyette says. Girls start foraging for food within the first few years of life. Boys take many years to master dangerous tasks, such as climbing trees to raid honey from bees’ nests (SN: 8/20/16, p. 10). By around age 7, boys start to play hunting games and graduate to real hunts as teenagers.

In 33 hunter-gatherer societies around the world, parents typically take 1- to 2-year-olds on foraging expeditions and give the youngsters toy versions of tools to manipulate, reported psychologist Sheina Lew-Levy of the University of Cambridge and her colleagues in the December Human Nature. Groups of children at a range of ages play make-believe versions of what adults do and get in some actual practice at tasks such as toolmaking. Youngsters generally become proficient food collectors and novice toolmakers between ages 8 and 12, the researchers conclude. Adults, but not necessarily parents, begin teaching hunting and complex toolmaking skills to teens. For the report, Lew-Levy’s group reviewed 58 papers on childhood learning among hunter-gatherers, most published since 2000.

“There’s a blurred line between work and play in foraging societies because children are constantly rehearsing for adult roles by playing,” Boyette says.

Children in Western societies can profitably mix fantasy with playful rehearsals for adult tasks, observes George Mason’s Goldstein, who was a professional stage actor before opting for steadier academic work. “My 5-year-old son is never happier than when he’s helping to check us out at the grocery store,” she says. “But he also likes to pretend to be a robot, and sometimes a robot who checks us out at the grocery store.”

Not too far in the future, preschoolers pretending to be robots may encounter real robots running grocery-store checkouts. Playtime will never be the same.

The mystery of Christiaan Huygens’ flawed telescopes may have been solved

The 17th century scientist Christiaan Huygens set his sights on faraway Saturn, but he may have been nearsighted.

Huygens is known, in part, for discovering Saturn’s largest moon, Titan, and deducing the shape of the planet’s rings. But by some accounts, the Dutch scientist’s telescopes produced fuzzier views than others of the time despite having well-crafted lenses.

That may be because Huygens needed glasses, astronomer Alexander Pietrow proposes March 1 in Notes and Records: the Royal Society Journal of the History of Science.
To make his telescopes, Huygens combined two lenses, an objective and an eyepiece, positioned at either end of the telescope. Huygens experimented with different lenses to find combinations that, to his eye, created a sharp image, eventually creating a table to keep track of which combinations to use to obtain a given magnification. But when compared with modern-day knowledge of optics, Huygens’ calculations were a bit off, says Pietrow, of the Leibniz Institute for Astrophysics Potsdam in Germany.

One possible explanation: Huygens selected lenses based on his flawed vision. Historical records indicate that Huygens’ father was nearsighted, so it wouldn’t be surprising if Christiaan Huygens also suffered from the often-hereditary affliction.

Assuming that’s the reason for the mismatch, Pietrow calculates that Huygens had 20/70 vision: What someone with normal vision could read from 70 feet away, Huygens could read only from 20 feet. If so, that could be why Huygens’ telescopes never quite reached their potential.

50 years ago, atomic testing created otter refugees

Sea otters restocked in old home

When the [Atomic Energy Commission] first cast its eye on the island of Amchitka as a possible site for the testing of underground nuclear explosions, howls of anguish went up; the island is part of the Aleutians National Wildlife Refuge, created to preserve the colonies of nesting birds and some 2,500 sea otters that live there…— Science News, November 9, 1968

Update
The commission said underground nuclear testing would not harm the otters, but the fears of conservationists were well-founded: A test in 1971 killed more than 900 otters on the Aleutian island.

Some otters remained around Amchitka, but 602 otters were relocated from 1965 to 1972 to Oregon, southeast Alaska, Washington and British Columbia — areas where hunting had wiped them out. All but the Oregon population thrived, and today more than 25,000 otters live along coastal shores where they had once disappeared.

“They were sitting on the precipice,” says James Bodkin, a coastal ecologist at the U.S. Geological Survey. “It’s been a great conservation story.”

Martian soil may have all the nutrients rice needs

THE WOODLANDS, TEXAS — Martian dirt may have all the necessary nutrients for growing rice, one of humankind’s most important foods, planetary scientist Abhilash Ramachandran reported March 13 at the Lunar and Planetary Science Conference. However, the plant may need a bit of help to survive amid perchlorate, a chemical that can be toxic to plants and has been detected on Mars’ surface (SN: 11/18/20).

“We want to send humans to Mars … but we cannot take everything there. It’s going to be expensive,” says Ramachandran, of the University of Arkansas in Fayetteville. Growing rice there would be ideal, because it’s easy to prepare, he says. “You just peel off the husk and start boiling.”

Ramachandran and his colleagues grew rice plants in a Martian soil simulant made of Mojave Desert basalt. They also grew rice in pure potting mix as well as several mixtures of the potting mix and soil simulant. All pots were watered once or twice a day.

Rice plants did grow in the synthetic Mars dirt, the team found. However, the plants developed slighter shoots and wispier roots than the plants that sprouted from the potting mix and hybrid soils. Even replacing just 25 percent of the simulant with potting mix helped heaps, they found.

The researchers also tried growing rice in soil with added perchlorate. They sourced one wild-type rice variety and two cultivars with a genetic mutation — modified for resilience against environmental stressors like drought — and grew them in Mars-like dirt with and without perchlorate (SN: 9/24/21).

No rice plants grew amid a concentration of 3 grams of perchlorate per kilogram of soil. But when the concentration was just 1 gram per kilogram, one of the mutant lines grew both a shoot and a root, while the wild variety managed to grow a root.

The findings suggest that by tinkering with the successful mutant’s modified gene, SnRK1a, humans might eventually be able to develop a rice cultivar suitable for Mars.

Biologists are one step closer to creating snake venom in the lab

SAN DIEGO — Labs growing replicas of snakes’ venom glands may one day replace snake farms.

Researchers in the Netherlands have succeeded in growing mimics of venom-producing glands from multiple species of snakes. Stem cell biologist Hans Clevers of the Hubrecht Institute in Utrecht, the Netherlands, reported the creation of these organoids on December 10 at a joint meeting of the American Society for Cell Biology and the European Molecular Biology Organization.

If scientists can extract venom from the lab-grown glands, that venom might be used to create new drugs and antidotes for bites, including bites from snakes that aren’t currently raised on farms.

Up to 2.7 million people worldwide are estimated to be bitten by venomous snakes each year. Between about 81,000 and 138,000 of them die as a result, and as many as roughly 400,000 may lose limbs or suffer other disabilities, according to the World Health Organization.

Antivenoms are made using venom collected from snakes usually raised on farms. Venom is injected into other animals, which make antibodies to the toxins. Purified versions of those antibodies can help a bitten person recover, but they must be specific to the species of snake that delivered the bite. “If it’s a fairly rare or local snake, chances are there would be no antidote,” Clevers says.

Three postdoctoral researchers in Clevers’ lab wanted to know if they could make organoids — tissues grown from stem cells to have properties of the organs they mimic — from snakes and other nonmammalian species. The researchers started with Cape coral snakes (Aspidelaps lubricus) that were dissected from eggs just before hatching. Stem cells taken from the unhatched snakes grew into several different types of organoids, including some that make venom closely resembling the snake’s normal venom, Clevers reported at the meeting.

His team has produced venom-gland organoids from at least seven species of snakes. The organoids have survived in the lab for up to two years so far.

Clevers and colleagues hope to harvest venom from the organoids, which produce more highly concentrated venom than snakes usually make. “It’s probably going to be easier than milking a snake,” he says.

Satellites make mapping hot spots of ammonia pollution easier

Satellites may be a more accurate way to track smog-producing ammonia.

It’s notoriously tricky to pinpoint accurate numbers for ammonia gas emissions from sources such as animal feedlots and fertilizer plants. But new maps, generated from infrared radiation measurements gathered by satellites, reveal global ammonia hot spots in greater detail than before. The new data suggest that previous inventories have understated the magnitude of these emissions, researchers report December 5 in Nature.

In the atmosphere, ammonia, which contains nitrogen, can help form tiny particles that worsen air quality and harm human health. The research could help keep tabs on who’s emitting how much, to make sure that factories and farms are meeting environmental standards.

Emissions are usually estimated by adding up output from individual known sources of activity, but those calculations are only as good as the data that go into them. Ammonia sticks around only hours to a few days in the atmosphere, so on-the-ground measurements vary a lot even in the same place, says coauthor Martin Van Damme, an atmospheric scientist at the Université Libre de Bruxelles in Belgium.

“There’s so much uncertainty in ammonia emissions,” says Daven Henze, a mechanical engineer at the University of Colorado Boulder who wasn’t part of the research. Other scientists, including his research group, have estimated ammonia releases using satellite data before. But these new maps rely on a more detailed dataset and have substantially better resolution, Henze says — fine enough that the study authors were able to link areas of high emissions to specific factories or farms.

The new maps show 248 ammonia emission hot spots across the globe at a resolution of about a kilometer. Eighty-three of those hot spots arose from agricultural activity involving high numbers of cows, pigs and chickens, such as a site in Colorado that overlapped on satellite imagery with two big cattle feedlots. Ammonia emissions from feedlots come largely from livestock waste. Another 158 sites were tied to industrial emissions — mostly from plants that produce ammonia-based fertilizer, such as in Marvdasht, Iran. Six hot spots couldn’t be pinned to specific activity.

Ammonia is also emitted naturally, from volcanoes or seabird colonies. But most of those sources were too weak or too diffuse to show up as hot spots in the data. Lake Natron in Tanzania is the one exception — its mud flats register as an ammonia-releasing hot spot, perhaps due to decaying algae. It’s not clear why other lakes with similar mud flats didn’t show up. Some natural sources may have gone undetected because of where they are located — in places with heavy cloud cover that obscured the data, or where turbulent air dissipated ammonia especially quickly, Van Damme suggests.

Some areas with particularly high overall ammonia emissions from biomass burning or fertilizer, such as West Africa and the Indus Valley in Pakistan and northern India, didn’t reveal specific hot spots, either, the researchers report.

U.S. fentanyl deaths are rising fastest among African-Americans

Since people in the United States began dying in the fentanyl-related drug overdose epidemic, whites have been hit the hardest. But new data released March 21 by the Centers for Disease Control and Prevention show that African-Americans and Hispanics are catching up.

Non-Hispanic whites still experience the majority of deaths involving fentanyl, a synthetic opioid. But among African-Americans and Hispanics, death rates rose faster from 2011 to 2016. Whites experienced a 61 percent annual increase, on average, while the rate rose 140.6 percent annually for blacks and 118.3 percent per year for Hispanics. No reliable data were available for other racial groups.

Overall, the number of U.S. fentanyl-related deaths in 2011 and 2012 hovered just above 1,600. A sharp increase began in 2013, reaching 18,335 deaths in 2016. That’s up from 0.5 deaths per 100,000 people in 2011 to 5.9 per 100,000 in 2016.
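Rates like “5.9 per 100,000” come from dividing a death count by the population and scaling. A minimal sketch of that arithmetic (the population figure below is an assumption for illustration, and the CDC’s published rates are age-adjusted, so the raw quotient only approximates them):

```python
def deaths_per_100k(deaths: int, population: int) -> float:
    """Crude death rate per 100,000 people."""
    return deaths / population * 100_000

# 18,335 fentanyl-related deaths in 2016, against an assumed U.S.
# population of about 323 million, gives a crude rate near the
# reported (age-adjusted) 5.9 per 100,000.
print(round(deaths_per_100k(18_335, 323_000_000), 1))  # 5.7
```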

In the first three years of the data, men and women died from fentanyl-related overdoses at similar rates, around 0.5 per 100,000. But in 2013, those paths diverged, and by 2016, the death rate among men was 8.6 per 100,000; for women it was 3.1 per 100,000. Overdose death rates rose most sharply along the East Coast, including in New England and the middle Atlantic, and in the Great Lakes region.

One of the most powerful opioids, fentanyl has been around for decades and is still prescribed to fight pain. But it has emerged as a street drug that is cheap to make and is found mixed into other drugs. In 2013, fentanyl was the ninth most common drug involved in overdose deaths, according to the CDC report; in 2016, it was number one. Just a little bit can do a lot of damage: The drug can quickly kill a person by overwhelming several systems in the body (SN: 9/3/2016, p. 14).

50 years ago, scientists were unlocking the secrets of bacteria-infecting viruses

Unusual virus is valuable tool —

Viruses, which cannot reproduce on their own, infect cells and usurp their genetic machinery for use in making new viruses…. But just how viruses use the cell machinery is unknown.… Some answers may come from work with an unusual virus, called M13, that has a particularly compatible relationship with … [E. coli] bacteria. — Science News, April 5, 1969

Update
M13 did help unlock secrets of viral replication. Some bacteria-infecting viruses, called bacteriophages or simply phages, kill the host cell after hijacking the cell’s machinery to make copies of themselves. Other phages, including M13, leave the cell intact. Scientists are using phage replication to develop drugs and technologies, such as virus-powered batteries (SN: 4/25/09, p. 12). Adding genetic instructions to phage DNA for making certain molecules lets some phages produce antibodies against diseases such as lupus and cancer. The technique, called phage display, garnered an American-British duo the 2018 Nobel Prize in chemistry (SN: 10/27/18, p. 16).

Toddlers tend to opt for the last thing in a set, so craft your questions carefully

My youngest child, now just over a year old, has started to talk. Even though I’ve experienced this process with my older two, it’s absolutely thrilling. He is putting words to the thoughts that swirl around in his sweet little head, making his mind a little less mysterious to the rest of us.

But these early words may not mean what we think they mean, a new study hints. Unsurprisingly, when 2-year-olds were asked a series of “this or that” questions, the toddlers showed strong preferences — but not for the reasons you’d think. Overwhelmingly, the toddlers answered the questions with the last choice given.

That bias, described in PLOS ONE on June 12, suggests that young children’s answers to these sorts of questions don’t actually reflect their desires. Instead, kids may simply be echoing the last thing they heard.

This verbal quirk can be used by parents to great effect, as the researchers point out in the title of their paper: “Cake or broccoli?” More fundamentally, the results raise questions about what sort of information a verbal answer actually pulls out of a young child’s mind. This murkiness is especially troublesome when it comes to questions whose answers call for adult action, such as: “Did you hit your sister on purpose or on accident?”

In the first series of experiments, researchers led by Emily Sumner at the University of California, Irvine, asked 24 1- and 2-year-olds a bunch of two-choice questions, some of which involved a polar bear named Rori or a grizzly bear named Quinn. One question, for example, was, “Does Rori live in an igloo or a tepee?” Later, the researchers switched the bear and the order of the options, asking, for example, “Does Quinn live in a tepee or an igloo?”

The toddlers could answer either verbally or, for reluctant speakers, by pointing at one of two stickers that showed the choices. When the children answered the questions by pointing, they chose the second option about half the time, right around chance. But when the toddlers spoke their answers, they chose the second option 85 percent of the time, regardless of the bear.

SECOND BEST A toddler taking part in the study selects the second option in three either-or questions. This tendency, called the recency bias, may reflect kids’ inability to juggle several choices in their minds simultaneously. Credit: E. Sumner et al/PLOS ONE 2019

This preference for the second option — a habit known as the recency bias — might arise because young children have trouble holding the first option in mind, the researchers suspect. Other experiments showed that children’s tendency toward the second option got stronger when the words got longer.

Adults actually have the opposite tendency: We’re more inclined to choose the first option we’re given (the primacy bias). To see when this shift from last to first occurs, the researchers studied transcripts of conversations held between adults and children ages 1.5 to 4. In these natural conversations, 2-year-olds were more likely to choose the second option. But 3- and 4-year-olds didn’t show this bias, suggesting that the switch happens around age 3.

The results hold a multitude of delightful parenting hacks: “Would you like to jump on the bed all night, or go to sleep?” But more importantly, the study serves as a reminder that the utterances of small children, while fascinating, may not carry the same meanings as those that come from more mature speakers. If you really want a straight answer, consider showing the two options to the toddler. But if you go that route, be prepared to hand over the cake.