
Better playground design could help kids get more exercise

The playground at Lake County Intermediate School in Leadville, Colo., was in desperate need of a makeover. The schoolyard didn’t offer much — just a few swings, some rusty climbing equipment, a cracked basketball court and a play area of dirt and gravel.

In the spring of 2014, the community replaced the run-down equipment, installing a spider web–like climbing net, twisting slides and colorful swings. A new basketball court went in, along with a grassy play area and walking paths. Kids got access to balls, Hula-Hoops and other loose equipment.

The overhaul did more than improve how the playground looked; it turbocharged the kids’ recess activity. When researchers observed the playground that November, they found that the share of children participating in vigorous physical activity had tripled. And the changes appeared to last — a year after the overhaul, the students were still more active than they’d been before, the researchers reported in 2018 in the American Journal of Preventive Medicine.

“A lot of things, when they’re new and shiny, lead to increased physical activity, but it’s not always sustained,” says Elena Kuo, a senior evaluation and learning consultant at Kaiser Permanente Washington Health Research Institute in Seattle, who coauthored the study. “That’s why it’s a pretty exciting finding.”

Being physically active has many benefits for kids: It reduces obesity risk and improves overall physical and mental health, fosters social and emotional development and boosts academic performance. The World Health Organization recommends that schoolchildren get 60 minutes of moderate to vigorous activity every day. Most kids fall far short of that goal. Globally, 81 percent of 11- to 17-year-olds fail to hit that threshold, according to an analysis reported in January in the Lancet.

Playgrounds offer a chance to encourage kids to be more active in their everyday lives. “You’ve got a captive audience and a lot of kids,” says Kuo, calling outdoor play areas “an opportunity to have a high impact.” Overhauling playgrounds to encourage active play is gaining momentum, she says.

Scientists across the globe are studying how to maximize the opportunity that playgrounds provide. Research teams are using accelerometers, GPS tags and other wearable technology to probe how kids behave on playgrounds and are conducting randomized controlled trials to assess whether certain playground features, programs and designs can encourage kids to move more.

Colorado playground
The Lake County Intermediate School in Leadville, Colo., transformed its mostly barren schoolyard into a colorful, active playground and saw lasting activity changes among users.
Great Outdoors Colorado

The results so far suggest that there are ways to subtly nudge children into being more active on playgrounds. And scientists say that there’s now enough evidence to begin making some specific recommendations to cities and schools that want to create playgrounds that foster movement. “When they come to us, we are now able to give them some pointers,” says Jasper Schipperijn, a sports scientist at the University of Southern Denmark in Odense.

While “evidence-based” playgrounds and playground-based programs won’t be a cure-all, they could make a real difference for some kids. “You’ll have children that will be active regardless of how their school or playground looks,” Schipperijn says. “But then there’s another group of kids that needs a bit more help.”

Sparking play

By the time Leadville embarked on its playground renovation, scientists had already identified several strategies for boosting playground activity. One of the first interventions to amass considerable research support used nothing more than some cans of colorful paint.

In the late 1990s, Gareth Stratton, a sports and exercise scientist then at Liverpool John Moores University in England, launched a pilot study at a local primary school. Stratton worked with the young students to develop a set of fun, brightly colored designs — including a castle, pirate ship, dragon, clockface, hopscotch board and maze — to paint on the playground surface.

The markings seemed to spark active, imaginative play and changed how students used the space, reducing the dominance of soccer and creating new play areas and opportunities for kids who might otherwise just opt out, says Stratton, now at Swansea University in Wales. “There’s no sitting on the sidelines anymore because there aren’t sidelines as such.”

When Stratton attached heart monitors to 36 of the schoolchildren, he discovered another benefit of the markings. During the month before markings were added, kids spent an average of 27 minutes of their daily recess time — which totaled about an hour a day, divided into three play sessions — engaged in moderate to vigorous physical activity, he reported in 2000 in Ergonomics. In the month after the designs were painted, that number jumped to 45 minutes a day. The children’s playtime heart rates also increased by an average of seven beats per minute. (The activity of children at a nearby school without the markings increased more modestly over the study period, from 29 minutes to 36, and there was almost no change in their heart rates.)

Chutes and ladders design
Studies dating back two decades reveal that painting colorful markings on playgrounds, like this chutes and ladders design at a school in Liverpool, England, can boost recess activity.
N. Ridgers

In the years since, Stratton and other researchers have confirmed and expanded upon these findings — and schools have put the lessons into practice. “This is something that’s actually had traction and is actually really useful in a real-world setting,” Stratton says.

In addition to playground markings, loose play equipment, like the balls and Hula-Hoops added at Leadville, can encourage kids to move more. Two studies published in 2019 in the Journal of School Health used a popular observational tool known as the System for Observing Play and Leisure Activity in Youth, or SOPLAY, to demonstrate the power of these supplies. To use SOPLAY, researchers systematically scan the play area, counting the number of children who are sitting, walking or engaged in higher-intensity pursuits.
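The activity shares these studies report fall out of simple arithmetic on SOPLAY scan tallies. A minimal sketch (the three-category coding, with walking counted as moderate activity, follows common SOPLAY practice, but the function and the numbers here are illustrative, not taken from the studies):

```python
def mvpa_share(sedentary, walking, vigorous):
    """Fraction of observed children in moderate-to-vigorous physical activity,
    treating walking as moderate, as SOPLAY scans typically do."""
    total = sedentary + walking + vigorous
    return (walking + vigorous) / total if total else 0.0

# One hypothetical playground scan: 12 sitting, 6 walking, 6 very active
print(mvpa_share(12, 6, 6))  # 0.5
```

Comparing this share between play areas with and without loose equipment gives the percentage-point differences described below.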

In the first study, the proportion of students engaged in moderate to vigorous physical activity at 19 schools in Los Angeles County was roughly 10 to 20 percentage points higher in play areas with loose equipment.  

The second study, based in two school districts in Colorado, demonstrated that the more play equipment schools provided, the bigger the activity gains.

“If there’s something fun to do, kids will do it,” Kuo says. “Even something as simple as having a bunch of balls available and having Frisbees around — it’s just more fun than a random open field.”

Unconventional equipment

And the options go beyond traditional sports equipment. For Australia’s Sydney Playground Project, which began in 2009, researchers enrolled 12 inner-city primary schools and randomly assigned half to receive a two-part intervention.

The playgrounds at the intervention schools were stocked with an assortment of recycled “loose parts” — hay bales, tires, crates and foam pool noodles. “We just put them on the playground with no instruction to the children about what you should do with them, and we asked the adults to try to step back from the children and let them do whatever they wanted,” says Anita Bundy, a play researcher who led the Sydney Playground Project at the University of Sydney. (She has since moved to Colorado State University in Fort Collins.)

To encourage adults to keep their distance, Bundy and colleagues put parents, teachers and school staff through “risk reframing” workshops designed to reinforce the idea that active, independent free play — even seemingly dangerous kinds of play, such as climbing trees or running down hills — has myriad benefits for kids.

Crates at Sydney playground
Unconventional play materials, like these crates at a primary school in Sydney, can prompt creative, active play among children.
Sydney Playground Project

The intervention was a hit. Many teachers and parents reported that the workshops gave them a new perspective on potentially perilous play, and the kids embraced the strange playground supplies. The children combined and recombined those materials “in zillions of different ways, and they loved that,” Bundy says. At one school, some kids created an imaginary amusement park with the loose parts. At another, students invented new sports. “They’d have a whole team full of people playing pool noodle hockey,” Bundy says.

Indeed, an in-depth analysis of one Sydney primary school revealed that the loose parts prompted kids to play more than they had before and to engage in more creative play and a wider variety of activities, the researchers reported in 2018 in the Journal of Adventure Education and Outdoor Learning.

In a separate analysis, Bundy and colleagues tracked students’ activity levels using accelerometers fastened to the kids’ clothes. After the intervention, students at the experimental schools spent 12 percent more recess time participating in moderate to vigorous physical activity than children at the control schools, the researchers reported in 2013 in Preventive Medicine.

However, the absolute increases were modest. The amount of time that kids at the intervention schools spent engaged in moderate to vigorous physical activity increased by just two minutes, on average. It’s possible that the data from the accelerometers — which are better at picking up on when children start running than on the lifting, pushing, pulling, climbing and building that many of the kids were engaged in — underestimated the benefits of the loose materials.

But the results also highlight the limitations of relying on playgrounds as the singular secret to better child health. Recess tends to be short, and even “successful” interventions often add just a few extra minutes of activity to kids’ days.

So some experts caution that while creating more active, appealing playgrounds may be a step in the right direction, getting kids to move more requires a multipronged approach. “I wouldn’t put all my eggs in that basket,” says Mark Tremblay, who directs the Healthy Active Living and Obesity research group at the Children’s Hospital of Eastern Ontario Research Institute in Ottawa. Playgrounds, he says, are “one option, and I think there needs to be many, many options.”

Playground researchers argue that even small increases in activity matter, that a few extra minutes a day add up. “Is it enough? No it’s not, but it’s definitely a good step along the way,” says Schipperijn, in Denmark.

And studies, which tend to report averages, might miss positive effects. A playground overhaul may prompt some kids to be much more active while having no effect whatsoever on others. “The chances that it will work for all children is very small,” Schipperijn says. “Simply because different children need different things.”

Though individual kids vary enormously, in general, boys tend to be more active than girls. Schipperijn’s research suggests that at least some of the gender difference results from a social hierarchy in which boys tend to claim the most desirable amenities. Providing more play spaces could boost girls’ participation.

Varied landscapes

Different kids are also attracted to different activities, so it’s important to provide not just plenty of play spaces, but also varied ones.

In a study published in November 2019 in Landscape and Urban Planning, Schipperijn tracked students at three Danish schools before and after major playground renovations. Though each renovation was unique, in all three cases, the paved, mostly featureless schoolyards were converted into rich and varied playscapes. Each had some combination of sports courts, swings, four square and hopscotch markings, climbing structures, balance bars, trampolines, an obstacle trail, a climbing wall, hills, tree stumps and dedicated dancing areas with mirrors, loudspeakers and video screens. After the redesigns, there were more physical activity “hot spots” — for both boys and girls — than before, the researchers found.

Danish playground
Different children have different desires and needs. This Danish school created a wide variety of play spaces to engage more kids.
Leif Tuxen

More structured play programs could also help pull the most sedentary kids into the action. That’s what researchers have found in studies of Playworks, a nonprofit that sends full-time coaches into low-income American schools to organize group games and activities during recess.

Researchers studied the effects of the program in various demographic groups in a randomized controlled trial that enrolled 29 schools spread across six American cities. Seventeen of the schools were randomly assigned to receive Playworks, while the other 12 schools served as controls. In the control group, black students spent an average of 14.1 percent of their recess time engaged in moderate to vigorous physical activity, while white students spent 19.2 percent of recess being similarly active.

Playworks appeared to close this gap, prompting black students to spend 20.4 percent of recess in moderate to vigorous physical activity, while the activity levels of white students remained essentially steady, at 19.7 percent, scientists reported in 2016 in the Journal of Physical Activity and Health. In a separate study, researchers also discovered that Playworks prompted girls, but not boys, to move substantially more.

The program may have leveled the proverbial playing field, making it easier for less active kids to participate in schoolyard games. “The Playworks coaches … were equally getting everyone involved and teaching kids games at the same time,” says Martha Bleeker, a senior researcher involved in both studies at the policy research firm Mathematica. “It’s not like one group had ownership over that activity.”

It’s also possible that any effort to remake playgrounds yields the biggest dividends for the kids who are the least active, and who thus have the most room for improvement.  

Danish girls on climbing equipment
Girls, who tend to be less active during recess than boys on average, play on climbing equipment in a Danish schoolyard.
Leif Tuxen

To get the most out of playground redesigns and programs, schools may also need to rethink certain practices and policies. In many schools in Australia, for instance, kids aren’t allowed to play at recess if they don’t have a hat to protect them from the blazing sun, says Anne-Maree Parrish, a childhood physical activity researcher at Australia’s University of Wollongong.

In a randomized controlled trial published in 2016 in the Journal of Science and Medicine in Sport, Parrish found that providing loose play equipment alongside policy changes, including one that allowed hatless kids to play in the shade, boosted the share of break time that students spent in moderate to vigorous physical activity by 9 to 13 percentage points.

And even the best-designed playgrounds won’t make much difference if kids don’t get time to play on them. Although the share of U.S. school districts that mandate regular recess for elementary school students is up from 46 percent in 2000 to 65 percent in 2016, the average amount of daily recess time has actually ticked down slightly, from 30 minutes in 2000 to 27 in 2014.

“The biggest challenge at the moment is that time allocated to recess and lunchtime is decreasing in schools,” says Nicola Ridgers, a researcher at the Institute for Physical Activity and Nutrition at Deakin University in Australia. “So it’s really [about] trying to protect that time and make sure that kids have the opportunity to play.”

While playgrounds won’t single-handedly remedy the problem of childhood inactivity, they can be part of the solution — instilling a love of movement and setting the stage for a lifetime of healthy habits. As Parrish notes, “Any opportunity to try and increase their physical activity somewhere is always a bonus.”


Emissions dropped during the COVID-19 pandemic. The climate impact won’t last

To curb the spread of COVID-19, much of the globe hunkered down. That inactivity helped slow the spread of the virus and, as a side effect, kept some climate-warming gases out of the air.

New estimates based on people’s movements suggest that global greenhouse gas emissions fell roughly 10 to 30 percent, on average, during April 2020 as people and businesses reduced activity. But those massive drops, even in a scenario in which the pandemic lasts through 2021, won’t have much of a lasting effect on climate change, unless countries incorporate “green” policy measures in their economic recovery packages, researchers report August 7 in Nature Climate Change.

“The fall in emissions we experienced during COVID-19 is temporary, and therefore it will do nothing to slow down climate change,” says Corinne Le Quéré, a climate scientist at the University of East Anglia in Norwich, England. But how governments respond could be “a turning point if they focus on a green recovery, helping to avoid severe impacts from climate change.” 

Carbon dioxide lingers in the atmosphere for a long time, making month-to-month changes in CO2 levels difficult to measure as they happen. Instead, the researchers looked at what drives some of those emissions — people’s movements. Using anonymized cell phone mobility data released by Google and Apple, Le Quéré and colleagues tracked changes in energy-consuming activities, like driving or shopping, to estimate changes in 10 greenhouse gases and air pollutants. 

“Mobility data have big advantages” for estimating short-term changes in emissions, says Jenny Stavrakou, a climate scientist at the Royal Belgian Institute for Space Aeronomy in Brussels who wasn’t involved in the study. Since those data are continuously updated, they can reveal daily changes in transportation emissions caused by lockdowns, she says. “It’s an innovative approach.”

Google’s mobility data revealed that 4 billion people reduced their travel by more than 50 percent in April alone. By adding more traditional emissions estimates to fill in gaps (SN: 5/19/20), the researchers analyzed emissions trends across 123 countries from February to June. The researchers found that the peak drop occurred in April, when globally averaged CO2 emissions and nitrogen oxides fell by roughly 30 percent from baseline, mostly due to reduced driving.

Fewer greenhouse gases should result in some cooling of the atmosphere, but the researchers found that effect will be largely offset by the roughly 20 percent fall in sulfur dioxide emissions in April. These industrial emissions turn into sulfur aerosol particles in the atmosphere that reflect sunlight and thus have a cooling effect. With fewer shading aerosols, more of the sun’s energy can heat the atmosphere, causing warming. On the whole, the stark drop in emissions in April alone will cool the globe a mere 0.01 degrees Celsius over the next five years, the study finds.

In the long term, the massive but temporary shifts in behavior caused by COVID-19 won't change our current warming trajectory. But large-scale economic recovery plans offer an opportunity to enact climate-friendly policies, such as investing in low-carbon technologies, that could avert the worst warming (SN: 11/26/19). That could help reach a goal of cutting total global greenhouse gas emissions by 52 percent by 2050, limiting warming to 1.5 degrees Celsius above preindustrial levels, the researchers say.


A new experiment hints at how hot water can freeze faster than cold

In physics, chilling out isn’t as simple as it seems.

A hot object can cool more quickly than a warm one, a new study finds. When chilled, a warmer system cooled off in less time than it took a cooler system to reach the same low temperature. And in some cases, the speedup was even exponential, physicists report in the Aug. 6 Nature.

The experiment was inspired by reports of the Mpemba effect, the counterintuitive observation that hot water sometimes freezes faster than cold. But experiments studying this phenomenon have been muddled by the complexities of water and the freezing process, making results difficult to reproduce and leaving scientists disagreeing over what causes the effect, how to define it and if it is even real (SN: 1/6/17).

To sidestep those complexities, Avinash Kumar and John Bechhoefer, both of Simon Fraser University in Burnaby, Canada, used tiny glass beads, 1.5 micrometers in diameter, in lieu of water. And the researchers defined the Mpemba effect based on cooling instead of the more complicated process of freezing.

The result: “This is the first time that an experiment can be claimed as a clean, perfectly controlled experiment that demonstrates this effect,” says theoretical chemist Zhiyue Lu of the University of North Carolina at Chapel Hill.

In the experiment, a bead represented the equivalent of a single molecule of water, and measurements were performed 1,000 times under a given set of conditions to produce a collection of “molecules.” A laser exerted forces on each bead, producing an energy landscape, or potential. Meanwhile, the bead was cooled in a bath of water. The effective “temperature” of the beads from the combined trials could be derived from how they traversed the energy landscape, moving in response to the forces imparted by the laser.

To study how the system cooled, the researchers tracked the beads’ motions over time. The beads began at either a high or a moderate temperature, and the researchers measured how long it took for the beads to cool to the temperature of the water. Under certain conditions, the beads that started out hotter cooled faster, and sometimes exponentially faster, than the cooler beads. In one case, the hotter beads cooled in about two milliseconds, while the cooler beads took 10 times as long.

Laser experiment set up
In a new experiment (shown, with researcher Avinash Kumar), a laser exerted forces on tiny glass beads to demonstrate that a hot system of beads could cool down faster than a cold one.
Prithviraj Basak

It might seem sensible to assume that a lower starting temperature would provide an insurmountable head start. In a straightforward race down the thermometer, the hot object would first have to reach the original temperature of the warm object, suggesting that a higher temperature could only add to the cooling time.

But in certain cases, that simple logic is wrong — specifically, for systems that are not in a state of thermal equilibrium, in which all parts have reached an even temperature. For such a system, “its behavior is no longer characterized just by a temperature,” Bechhoefer says. The material’s behavior is too complicated for a single number to describe it. As the beads cooled, they weren’t in thermal equilibrium, meaning their locations in the potential energy landscape weren’t distributed in a manner that would allow a single temperature to describe them.

For such systems, rather than a direct path from hot to cold, there can be multiple paths to chilliness allowing for potential shortcuts. For the beads, depending on the shape of the landscape, starting at a higher temperature meant they could more easily rearrange themselves into a configuration that matched a lower temperature. It’s like how a hiker might arrive at a destination more quickly by starting farther away, if that starting point allows the hiker to avoid an arduous climb over a mountain.

Lu and physicist Oren Raz had previously predicted that such cooling shortcuts were possible. “It’s really nice to see that it actually works,” says Raz, of the Weizmann Institute of Science in Rehovot, Israel. But, he notes, “we don’t know whether this is the effect in water or not.”

Water is more complex, including the quirks of impurities in the water, evaporation and the possibility of supercooling, in which the water is liquid below the normal freezing temperature (SN: 3/23/10).

But the simplicity of the study is part of its beauty, says theoretical physicist Marija Vucelja of the University of Virginia in Charlottesville. “It’s one of these very simple setups, and it already is rich enough to show this effect.” That suggests the Mpemba effect could go beyond glass beads or water. “I would imagine that this effect appears quite generically in nature elsewhere, just we haven’t paid attention to it.”

Predictions for the 2020 Atlantic hurricane season just got worse

Chalk up one more way 2020 could be an especially stressful year: The Atlantic hurricane season now threatens to be even more severe than preseason forecasts predicted, and may be one of the busiest on record.

With as many as 25 named storms now expected — twice the average number — 2020 is shaping up to be an “extremely active” season with more frequent, longer and stronger storms, the National Oceanic and Atmospheric Administration warns. Wind patterns and warmer-than-normal seawater have conspired to prime the Atlantic Ocean for a particularly fitful year — although it is not yet clear whether climate change had a hand in creating such hurricane-friendly conditions. “Once the season ends, we’ll study it within the context of the overall climate record,” Gerry Bell, lead seasonal hurricane forecaster at NOAA’s Climate Prediction Center, said during an Aug. 6 news teleconference.

The 2020 hurricane season is already off to a rapid start, with a record-high nine named storms by early August, including two hurricanes. The average season, which runs June through November, sees two named storms by this time of year.

“We are now entering the peak months of the Atlantic hurricane season, August through October,” National Weather Service Director Louis Uccellini said in the news teleconference. “Given the activity we have seen so far this season, coupled with the ongoing challenges that communities face in light of COVID-19, now is the time to organize your family plan and make necessary preparations.”

Storms get names once they have sustained wind speeds of at least 63 kilometers per hour. In April, forecasters predicted there would be 18 named storms, with half reaching hurricane status (SN: 4/16/20). Now, NOAA anticipates that 2020 could deliver a total of 19 to 25 named storms. That would put this year in league with 2005, which boasted over two dozen named storms including Hurricane Katrina (SN: 8/23/15).

Seven to 11 of this year’s named storms could become hurricanes, including three to six major hurricanes of Category 3 or higher, NOAA predicts. By contrast, the average season brings 12 named storms and six hurricanes, including three major ones.

Given that heightened activity, NOAA projects that 2020 will have an Accumulated Cyclone Energy, or ACE, value between 140 and 230 percent of the norm. That value accounts for both the duration and intensity of all of a season's named storms; seasons that exceed 165 percent of the average ACE value qualify as "extremely active."
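The article doesn't spell out how ACE is computed, but NOAA's standard definition is simple: sum the squares of each storm's maximum sustained wind (in knots) at six-hour intervals while it is at tropical-storm strength or stronger, then scale by 10⁻⁴. A minimal sketch, with a made-up wind record:

```python
def accumulated_cyclone_energy(six_hourly_winds_kt):
    """ACE: sum of squared six-hourly maximum sustained winds (in knots),
    counting only readings at tropical-storm strength (>= 34 kt),
    scaled by 1e-4."""
    return sum(v ** 2 for v in six_hourly_winds_kt if v >= 34) / 1e4

# Hypothetical storm: six-hourly peak winds over two days
storm = [30, 35, 45, 60, 75, 70, 50, 40]
print(accumulated_cyclone_energy(storm))  # 2.1475
```

A season's ACE is this quantity summed over all of its named storms, which is why a year with many long-lived, intense storms scores so high.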

Researchers at Colorado State University released a similar prediction on August 5. They foresee 24 named storms in total, 12 of which could be hurricanes, including five major ones. The probability of at least one major hurricane making landfall in the continental United States before the season is up is 74 percent — compared with the average seasonal likelihood of 52 percent, the Colorado State researchers say.

It’s hard to know how many storms in total will make landfall. But “when we do have more activity, there is a [trend] of more storms coming towards major landmasses — coming towards the U.S., coming towards Central America, and the Caribbean, and even sometimes up towards Canada,” says meteorologist Matthew Rosencrans of NOAA’s Climate Prediction Center in College Park, Md.

Two main climate patterns are setting the stage for an extremely intense hurricane season, says Jhordanne Jones, an atmospheric scientist at Colorado State in Fort Collins. Warmer-than-normal sea surface temperatures in the tropical Atlantic are poised to fuel stronger storms. What’s more, there are hints that La Niña may develop around the height of Atlantic hurricane season. La Niña, the flip side of El Niño, is a naturally occurring climate cycle that brings cooler waters to the tropical Pacific, changing wind patterns over that ocean (SN: 1/26/15). The effects of that disturbance in air circulation can be felt across the globe, suppressing winds over the Atlantic that might otherwise pull tropical storms apart.

How understanding nature made the atomic bomb inevitable

Atomic bombs hastened the end of World War II. But they launched another kind of war, a cold one, that threatened the entire planet with nuclear annihilation. So it’s understandable that on the 75th anniversary of the atomic bomb explosion that devastated Hiroshima (August 6, 1945), reflections tend to emphasize the geopolitical dramas during the decades that followed.

But it’s also worth reflecting on the scientific story of how the bombs came to be.

It’s not easy to pinpoint that story’s beginning. Nuclear fission — the source of the bomb’s energy — was discovered in 1938, less than seven years before Hiroshima. But the science behind nuclear energy originated decades earlier. You could say 1905, when Einstein revealed to the world that E = mc2. Or perhaps it’s better to begin with Henri Becquerel’s discovery of radioactivity in 1896. Radioactivity revealed a new sort of energy, of vast quantity, hidden within the most minuscule components of matter — the parts that made up atoms.

In any case, once science began to comprehend the subatomic world, no force could stop the eventual revelation of the atom’s power.

But the path from basic science to the bomb was not straightforward. There was no clear clue to how subatomic energy could be tapped for any significant use, military or otherwise. Writing in Science News Bulletin (the original Science News precursor) in 1921, physicist Robert Millikan noted that a gram of radium, in the process of disintegrating into lead, emits 300,000 times as much energy as burning a gram of coal. That wasn’t scary, Millikan said, because there wasn’t even enough radium in the world to make very much popcorn. But, he warned, “it is almost a foregone conclusion that similar stores of energy are also possessed by the atoms which … are not radioactive.”

In 1923 editor Edwin Slosson of Science News-Letter (the immediate precursor to Science News) also remarked that “all the elements have similar stores of energy if we only know how to release it.” But so far, he acknowledged, “scientists have not been able to unlock the atomic energy except by the employment of greater energy from another source.”

By then, physicists realized that the atom’s wealth of energy was stored in a nucleus — discovered by Ernest Rutherford in 1911. But accessing nuclear energy for practical use seemed unfeasible — at least to Rutherford, who in 1933 said that anyone planning to exploit nuclear energy was “talking moonshine.” But just the year before, the tool for releasing nuclear power had been discovered by James Chadwick, in the form of the subatomic particle known as the neutron.

Having no electric charge, the neutron was the ideal bullet to shoot into an atom, able to penetrate the nucleus and destabilize it. Such experiments in Italy by Enrico Fermi in the 1930s did actually induce fission in uranium. But Fermi thought he had created new, heavier chemical elements. He had no idea that the uranium nucleus had split. He concluded that he had produced a new element, number 93, heavier than uranium (element 92).

Not everyone agreed. Ida Noddack, a German chemist-physicist, argued that the evidence was inconclusive, and Fermi might have produced lighter elements, fragments of the uranium nucleus. But she was defying the prevailing wisdom. As the German chemist Otto Hahn wrote years later, the idea of breaking a uranium nucleus into smaller pieces was “wholly incompatible with the laws of atomic physics. To split heavy atomic nuclei into lighter ones was then considered impossible.”

Nevertheless Hahn and Lise Meitner, an Austrian physicist, continued bombarding uranium with neutrons, producing what they too believed to be new elements. Soon Meitner had to flee Germany for Sweden to avoid Nazi persecution of Jews. Hahn continued the work with chemist Fritz Strassmann; in December 1938 they found that an element they thought was radium could not be chemically distinguished from barium — apparently because it was barium. Hahn and Strassmann couldn’t explain how that could be.

Hahn wrote of this result to Meitner, who discussed it with her nephew Otto Frisch, a physicist studying at Niels Bohr’s institute in Copenhagen. Meitner and Frisch figured out what had happened — the neutron had induced the uranium nucleus to split, and barium was one of the leftover chunks. Frisch told Bohr, who was about to board a ship to America and realized instantly that fission confirmed his belief that an atomic nucleus behaved analogously to a drop of liquid. Upon arrival in the United States, Bohr began collaborating with John Archibald Wheeler at Princeton to explain the fission process. They quickly found that fission occurred much more readily in uranium-235, the rare form, than in the more common uranium-238. And their analysis revealed that an as yet undiscovered element, number 94, would also be especially efficient at fissioning. Their paper appeared on September 1, 1939, the day Germany invaded Poland to begin World War II.

Bohr and Wheeler
Niels Bohr (left) and John Archibald Wheeler (right) collaborated to explain fission, the source of the atomic bomb’s energy. From left: Photograph by Paul Ehrenfest Jr., courtesy AIP Emilio Segrè Visual Archives, Weisskopf Collection; AIP Emilio Segrè Visual Archives

Between Bohr’s arrival in America in January 1939 and the publication of his paper with Wheeler, news of fission’s reality spread, stunning physicists and chemists around the world. At the end of January, for instance, word of fission reached Berkeley, where the leading physicist was J. Robert Oppenheimer, who eventually led the Manhattan Project’s scientific effort to build the bomb.

Among the attendees at the Berkeley seminar introducing fission was Glenn Seaborg, a young chemistry instructor (who in 1941 discovered the unknown element 94 predicted by Bohr and Wheeler, naming it plutonium). Seaborg recalled that at first Oppenheimer didn’t believe fission happened. But, “after a few minutes he decided it was possible,” Seaborg said in a 1997 interview. “It just caught everybody by surprise.”

After the initial surprise, physicists quickly established that fission was the key to unlocking the atom’s energy storehouse. “Lots of people verified that indeed when uranium is bombarded by neutrons, slow neutrons in particular, a process occurs which releases tremendous amounts of energy,” physicist Hans Bethe said in a 1997 interview. Soon the implications for warfare occupied everybody’s attention.

“The threat of war was getting closer and closer,” Wheeler said in an interview in 1985. “It was impossible not to think about what this business (fission) could mean in the event of war.” In early 1939, physicists meeting to discuss fission concurred that a fission bomb was thinkable. “Everybody agreed that it was perfectly possible to make a nuclear explosive,” Bethe remembered.

Concerns that Germany might develop a nuclear bomb prompted Albert Einstein’s famous letter to President Franklin Roosevelt, sent in August 1939, that eventually led to the Manhattan Project. It became clear that building a fission bomb would require generating a “chain reaction” — the fission process itself would need to release neutrons capable of inducing further fission. In December 1942, Fermi led the team at the University of Chicago that demonstrated a sustained chain reaction, after which work on the bomb proceeded in Los Alamos, N.M., under Oppenheimer’s direction.

At first some physicists thought a bomb could not be developed rapidly enough to be relevant to the war. Bethe, for instance, preferred to work on radar.

“I had considered the whole enterprise a boondoggle,” he said. “I thought this had nothing to do with the war.” But by April 1943 Oppenheimer succeeded in recruiting Bethe to Los Alamos. By that time the science was in place, and the path to designing and building a bomb was straightforward. “All we had to do was to find out that there were no unforeseen difficulties,” Bethe said.

Ultimately the prototype was exploded at Alamogordo in July 1945, about three weeks before the bomb’s use against Japan.

Trinity test site
The prototype atomic bomb was exploded at the Trinity test site, in Alamogordo, N.M., in July 1945. United States Department of Energy

It was a weapon more horrifying than anything humankind had ever encountered or imagined. And science was responsible. But only because science succeeded in understanding nature more deeply than before. Nobody knew at first where that understanding would lead.

There was absolutely no way to foresee that the discovery of radioactivity, or the atomic nucleus, or even the neutron would eventually enable the construction of a weapon of mass destruction. Yet once it was known that a bomb was possible, it was inevitable.

After Germany’s surrender in World War II, the Allies detained several top German scientists, including Werner Heisenberg, leader of the Nazi bomb project, and eavesdropped on their conversations. It was clear that the Germans failed to build a bomb because they did not think it was practically possible. But after hearing of the bombing of Hiroshima, Heisenberg was quickly able to figure out how the bomb had, in fact, been feasible. Once scientists know for sure something is possible, it’s a lot easier to do it.

In the case of the atomic bomb, basic research seeking nature’s secrets initiated a chain reaction of new knowledge, impossible to control. So the mushroom cloud that resulted symbolizes one of science’s most disturbing successes.