That is a
non-paywalled, free-to-read link courtesy of the WSJ.
It's lengthy but an excellent read. I encourage everyone to take the time to read it in full.
The dire predictions went out the window, seemingly unanimously.
But there is plenty in the article for both the fearmongers and the sceptics to
say "I told you so".
Italic emphasis in the snips
below is mine.
For almost five years, an international consortium of scientists
was chasing clouds, determined to solve a problem that bedeviled climate-change
forecasts for a generation: How do these wisps of water vapor affect global warming?
They reworked 2.1 million lines of supercomputer code used to
explore the future of climate change, adding more-intricate equations for
clouds and hundreds of other improvements. They tested the equations, debugged
them and tested again.
The scientists would find that even the best tools at hand can’t model climates
with the sureness the world needs as rising temperatures impact almost every
part of the world.
When they ran the updated simulation in 2018, the conclusion
jolted them: Earth’s atmosphere was much more sensitive to greenhouse gases
than decades of previous models had predicted, and future temperatures could be
much higher than feared—perhaps even beyond hope of practical remedy.
“We thought this was really strange,” said Gokhan Danabasoglu,
chief scientist for the climate-model project at the Mesa Laboratory in Boulder
at the National Center for Atmospheric Research, or NCAR. “If that number was
correct, that was really bad news.”
The scientists soon concluded their new calculations had been
thrown off kilter by the physics of clouds in a warming world, which may
amplify or damp climate change. “The old way is just wrong, we know that,” said
Andrew Gettelman, a physicist at NCAR who specializes in clouds and helped
develop the CESM2 model. “I think our higher sensitivity is wrong too. It’s probably
a consequence of other things we did by making clouds better and more
realistic. You solve one problem and create another.”
Playing Down Extreme Forecasts
“We have a situation where the models are behaving strangely,”
said Gavin Schmidt, director of the National Aeronautics and Space
Administration’s Goddard Institute for Space Studies, a leading center for
climate modeling. “We have a conundrum.”
In November 2021, as leaders met in Glasgow to
negotiate limits on greenhouse gases under the auspices of the
2015 Paris Accords, there were more than 100 major global climate-change models
produced by 49 different research groups, reflecting an influx of people into
the field. In its guidance to governments last year, the U.N. climate-change panel for the first
time played down the most extreme forecasts.
Before making new climate predictions for policy makers, an
independent group of scientists used a technique called “hind-casting,” testing
how well the models reproduced changes that occurred during the 20th century
and earlier. Only models that re-created past climate behavior accurately were
used to inform the new predictions.
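The screening step described above can be put in rough code. A minimal sketch of the hind-casting idea: run each model over a period with known observations, score it, and keep only models under some error threshold. The model names, anomaly values, and cutoff here are all made up for illustration.

```python
import math

# Hypothetical decadal global temperature anomalies (deg C) for a past period.
observed = [0.0, 0.1, 0.15, 0.3, 0.5]

# Made-up hindcast output from two candidate models over the same period.
models = {
    "model_a": [0.05, 0.12, 0.20, 0.28, 0.55],
    "model_b": [0.30, 0.60, 0.90, 1.20, 1.60],  # runs far too hot
}

def rmse(pred, obs):
    """Root-mean-square error between a model run and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

THRESHOLD = 0.2  # arbitrary skill cutoff in deg C

# Keep only the models that reproduce the past within tolerance.
accepted = [name for name, run in models.items() if rmse(run, observed) <= THRESHOLD]
print(accepted)
```

In this toy setup only `model_a` survives the screen; the overheated `model_b` is discarded, which is the same logic used to downweight the "wolf pack" of hot models discussed later in the article.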
Because clouds can both reflect solar radiation into space and
trap heat from Earth’s surface, they are among the biggest challenges for
scientists honing climate models.
At any given time, clouds cover more than two-thirds of the
planet. Their impact on climate depends on how reflective they are, how high
they rise and whether it is day or night. They can accelerate warming or cool
the planet. They operate at a scale as broad as the ocean, as small as a hair’s
width. Their behavior can be affected, studies show, by factors ranging from
cosmic rays to ocean microbes, which emit sulfur particles that become the
nuclei of water droplets or ice crystals.
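The dual role described above, reflecting sunlight while trapping surface heat, can be put in rough numbers. The global-mean values below are commonly cited approximations, not figures from the article:

```python
# Net cloud radiative effect = shortwave (reflective, cooling) term
# plus longwave (heat-trapping, warming) term. Approximate global means:
shortwave_effect = -45.0  # W/m^2, sunlight reflected back to space
longwave_effect = +27.0   # W/m^2, surface heat trapped below cloud tops

net = shortwave_effect + longwave_effect
print(f"net cloud radiative effect: {net:+.1f} W/m^2")  # negative = net cooling
```

Because the net effect is a small difference between two large opposing terms, a model that slightly misjudges either one can flip the sign regionally, which is why cloud feedback dominates the spread in climate-sensitivity estimates.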
“If you don’t get clouds right, everything is out of whack,” said
Tapio Schneider, an atmospheric scientist at the California Institute of
Technology and the Climate Modeling Alliance, which is developing an
experimental model. “Clouds are crucially important for regulating Earth’s
energy balance.”
In an independent assessment of 39 global-climate models last year, scientists found
that 13 of the new models produced significantly higher estimates of the global
temperatures caused by rising atmospheric levels of carbon dioxide than the
older computer models did; scientists called them the “wolf pack.” Weighed
against historical evidence of temperature changes, those estimates were deemed
unrealistic.
In their initial upgrade, Dr. Gettelman and his colleagues added better ways to
model polar ice caps and how carbon and nitrogen cycle through the environment.
To make the ocean more realistic, they added wind-driven waves. They fine-tuned
the physics in its algorithms and made its vintage Fortran code more efficient.
Even the simplest diagnostic test is challenging. The model
divides Earth into a virtual grid of 64,800 cubes, each 100 kilometers on a
side, stacked in 72 layers. For each projection, the computer must calculate
4.6 million data points every 30 minutes. To test an upgrade or correction,
researchers typically let the model run for 300 years of simulated time.
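The figures quoted above can be checked with quick arithmetic. A sketch, assuming the "4.6 million data points" is simply surface cells times vertical layers:

```python
# Back-of-the-envelope check of the CESM2 grid arithmetic from the article.
SURFACE_CELLS = 64_800   # cubes in the horizontal grid, each ~100 km on a side
LAYERS = 72              # vertical layers stacked over each cell
STEPS_PER_DAY = 48       # one calculation every 30 minutes

grid_points = SURFACE_CELLS * LAYERS
print(f"data points per step: {grid_points:,}")  # roughly the 4.6 million cited

# Cost of a typical 300-year test run at 48 steps per simulated day:
steps = 300 * 365 * STEPS_PER_DAY
print(f"steps in a 300-year run: {steps:,}")
print(f"point-updates in the run: {grid_points * steps:.2e}")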
In their initial analysis, scientists discovered a flaw in how
CESM2 modeled the way moisture interacts with soot, dust or sea-spray particles
that allow water vapor to condense into cloud droplets. It
took a team of 10 climate experts almost five months to trace it to an error in
their data and correct it, the scientists said.
The NCAR scientists in Boulder would like to delve more deeply
into the behavior of clouds, ice sheets and aerosols, but they already are
straining their five-year-old Cheyenne supercomputer, according to NCAR
officials. A climate model able to capture the subtle effects of
individual cloud systems, storms, regional wildfires and ocean currents at a
more detailed scale would require a thousand times more computer power, the
officials said. Better models would need to link rising temperatures on a global
scale to changing conditions in a local forest, watershed, grassland or
agricultural zone, say NCAR forest ecologist Jacquelyn Shuman and NCAR
scientist Gerald Meehl.
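The thousand-fold figure is consistent with a standard resolution-scaling argument: refining horizontal grid spacing by a factor r multiplies the cell count by r², and numerical stability forces roughly r times more (shorter) time steps, so cost grows about as r³. A sketch, where the 10× refinement target (100 km down to 10 km cells) is my assumption, not the article's:

```python
# Rough scaling argument for the "thousand times more computer power" figure.
current_dx_km = 100   # grid spacing quoted in the article
target_dx_km = 10     # hypothetical storm/cloud-system-resolving spacing

r = current_dx_km / target_dx_km
cost_factor = r ** 3  # r^2 more columns times ~r more time steps
print(f"~{cost_factor:.0f}x more compute")
```

A 10× refinement thus lands right at the ~1000× compute estimate the NCAR officials gave.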
“Computer models that contain both large-scale and small-scale
models allow you to really do experiments that you can’t do in the real world,”
she said. “You can really ramp up the temperature, dial down the precipitation
or completely change the amount of fire or lightning strikes that an area is
seeing, so you can really diagnose how it all works together. That’s the next
step. It would be very computationally expensive.”
“I think the climate models are the best tool we have to
understand the future, even though they are far from perfect,” said Dr.
Gettelman. “I’m not worried that the new models might be wrong. What
scares me is that they might be right.”
Models Will Get Better
Scientists need to keep doing what they are doing. The models
surely will get better.
Despite being wrong, the models appear to be better than I expected.
Yet, had we listened to the dire forecasts from Al Gore,
globetrotting Greta, President Biden, and media darling AOC, where would we be?
Al Gore wanted to spend $90 trillion to fight climate change.
"New Green Deal" Stunningly Absurd: Far More Ridiculous Than Expected