A ‘Phantom’ Climate Preprint Spreads Online, Highlighting the Stakes of Sensitivity Estimates

A string of numbers — 2603.13766 — looked like the key to a bit of cautiously good climate news.

Circulating in online chats and internal briefings this month, the code was described as the identifier of an arXiv preprint: a not-yet-peer-reviewed study claiming Earth’s climate is somewhat less sensitive to greenhouse gases than many models suggest. The paper supposedly introduced a method called “Time-Averaged Ordinary Least Squares,” or TAOLS, and put equilibrium climate sensitivity, the long-term warming from a doubling of carbon dioxide, in the range of about 2.1 to 2.5 degrees Celsius.

There is one problem: that paper, at least under that identifier and description, cannot be found.

Searches of the arXiv.org repository and mirrored databases turn up no record of a climate paper with the identifier 2603.13766, no preprint matching the reported title, and no trace of “Time-Averaged Ordinary Least Squares” as a named method in the scientific literature.

The phantom study may turn out to be a simple mis-citation or a draft posted under another number. But the episode offers a glimpse into a higher-stakes fight: how climate scientists estimate equilibrium climate sensitivity, why some statistical methods tend to produce lower numbers, and how even the hint of reassuring results can race ahead of verification.

Why climate sensitivity matters

Equilibrium climate sensitivity (ECS) is a cornerstone of climate science. It measures how much the planet’s average surface temperature will eventually rise once atmospheric carbon dioxide has doubled and the climate system has had time to adjust, taking into account fast feedbacks such as water vapor, clouds, and sea ice.

In its latest comprehensive assessment, released in 2021, the U.N. Intergovernmental Panel on Climate Change (IPCC) concluded that ECS is “likely” between 2.5°C and 4.0°C and “very likely” between 2.0°C and 5.0°C. The panel’s best estimate is 3.0°C. Those numbers reflect multiple strands of evidence: instrumental temperature records since the late 19th century, simulations from climate models, and reconstructions of ancient climates such as the last Ice Age.

“We assess ECS to be 3 °C, with a very likely range of 2 to 5 °C, based on a thorough assessment of all lines of evidence,” the IPCC’s Working Group I report stated. The authors added that values below 2°C “are difficult to reconcile with observed warming and paleoclimate evidence.”

A claimed range of 2.1 to 2.5°C, if supported by robust analysis, would sit at or below the low end of that assessed range. It would not overturn decades of research, but it would nudge estimates toward the cooler side and modestly lower the high-end warming risks for any given path of emissions.

That makes such numbers politically and psychologically potent. Studies hinting at lower climate sensitivity have repeatedly been promoted by groups opposed to stringent climate policies as evidence that projections are overblown.

In past cases, the underlying work was real, and its conclusions were later revised as new observations, better aerosol data, and improved methods emerged. Here, the difficulty is more basic: the TAOLS study, as described, is not yet part of the public scientific record.

“It’s a red flag when you’re given a specific preprint ID and a method name and you just can’t find them,” said one senior climate scientist involved in past IPCC reports, who spoke generally about verification practices and not about any specific leak. “At a minimum, it means you should hit pause before using those results in any public or policy context.”

The statistical dispute behind the missing paper

Behind the hunt for the missing paper lies a more technical dispute over how to use statistics to infer how sensitive the climate is.

One common strategy relies on regression — a tool familiar to economists and social scientists — to link historical changes in global temperature with changes in radiative forcing, the net energy imbalance caused by greenhouse gases, aerosols, solar variations, and other factors. The simplest form, ordinary least squares (OLS), fits a straight line through noisy data and uses the slope to estimate how much temperatures respond to a given increase in forcing.
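
To make the idea concrete, here is a minimal Python sketch of such a regression. Every number in it is synthetic and illustrative, not drawn from any dataset or study; the only standard ingredient is the conversion at the end, which uses the canonical figure of roughly 3.7 watts per square meter of forcing for a doubling of CO₂.

```python
import numpy as np

# Toy stand-ins for historical data; every number here is illustrative.
rng = np.random.default_rng(0)
years = np.arange(1880, 2024)
forcing = 2.5 * (years - 1880) / (2023 - 1880)   # W/m^2, a smooth ramp
true_sensitivity = 0.8                            # K per W/m^2 (assumed)
temp = true_sensitivity * forcing + rng.normal(0, 0.1, years.size)

# Ordinary least squares: the slope links temperature to forcing.
slope, intercept = np.polyfit(forcing, temp, 1)

# Scale the slope by the canonical forcing for doubled CO2 (~3.7 W/m^2)
# to express it as warming per doubling.
F_2X = 3.7
print(f"fitted slope: {slope:.2f} K per W/m^2")
print(f"implied warming per CO2 doubling: {slope * F_2X:.2f} K")
```

A real analysis is far more involved. Among other pitfalls, the planet has not finished warming in response to past emissions, so a slope fitted to historical records reflects transient change and ocean heat uptake rather than equilibrium sensitivity itself.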

That simplicity is part of the appeal, but it also hides pitfalls.

“The problem is that the independent variable in these regressions — the forcing — is itself uncertain and noisy,” said an econometrician who has worked on climate time series. “Standard OLS assumes the regressors are measured without error. When they’re not, it tends to bias the slope toward zero, which in this context means biasing climate sensitivity downward.”
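
That pull toward zero, known to statisticians as regression dilution or attenuation bias, is easy to reproduce. The sketch below, again with made-up numbers, fits the same line twice: once against an error-free forcing series and once against a noisy estimate of it.

```python
import numpy as np

rng = np.random.default_rng(1)
n, true_slope = 2000, 0.8

forcing_true = rng.uniform(0.0, 2.5, n)            # "true" forcing (toy)
temp = true_slope * forcing_true + rng.normal(0, 0.1, n)

# The analyst never sees the true forcing, only a noisy estimate of it.
forcing_obs = forcing_true + rng.normal(0, 0.5, n)

slope_clean = np.polyfit(forcing_true, temp, 1)[0]
slope_noisy = np.polyfit(forcing_obs, temp, 1)[0]

print(f"slope, error-free regressor: {slope_clean:.2f}")
print(f"slope, noisy regressor:      {slope_noisy:.2f}  (pulled toward zero)")
# Classical result: the expected shrinkage factor is
# var(true forcing) / (var(true forcing) + var(measurement noise)).
```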

Aerosol pollution is a particular source of uncertainty. Sulfate particles from burning coal and other fuels reflect sunlight and change cloud properties, offsetting some greenhouse warming. But the magnitude and evolution of that cooling effect remain imperfectly known. Because greenhouse gases and aerosols rose and then, in some regions, leveled off together over the 20th century, their combined imprint on temperature is hard to disentangle.

“Similar time evolution of greenhouse gas forcing and aerosol-induced cooling makes it statistically challenging to separate them,” one recent paper on historical constraints concluded, warning that simple regressions could give a misleadingly narrow range of ECS values.

To address these issues, researchers have developed variations on basic OLS: total least squares, which allows for errors in both variables; dynamic and “fully modified” OLS methods that account for autocorrelation and non-stationarity in time series; and Bayesian frameworks that combine temperature records with data on ocean heat content and other variables.
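
The first of those alternatives is compact enough to sketch. Total least squares fits the line that minimizes perpendicular distances rather than vertical ones, which undoes the attenuation when the measurement errors in the two variables are comparable in size; the data below are again invented for illustration.

```python
import numpy as np

def tls_slope(x, y):
    """Total least squares (orthogonal regression) slope.

    OLS minimizes vertical residuals and treats x as exact; TLS
    minimizes perpendicular distances, allowing error in both
    variables (assuming their error variances are similar).
    """
    centered = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]               # leading right singular vector
    return direction[1] / direction[0]

rng = np.random.default_rng(2)
x_true = rng.uniform(0.0, 2.5, 2000)
x = x_true + rng.normal(0, 0.3, 2000)        # noise in the regressor
y = 0.8 * x_true + rng.normal(0, 0.3, 2000)  # equal noise in the response

print(f"OLS slope: {np.polyfit(x, y, 1)[0]:.2f}  (attenuated)")
print(f"TLS slope: {tls_slope(x, y):.2f}  (close to the true 0.8)")
```

When the two error variances differ substantially, plain TLS can overcorrect, and a weighted variant such as Deming regression is the usual remedy.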

Even within the climate modeling community, debates have erupted over how best to fit linear relationships and what they really say about long-term sensitivity. A widely used technique known as the Gregory method, for example, regresses changes in top-of-atmosphere energy balance on surface temperature in models subjected to an abrupt quadrupling of CO₂, then extrapolates to infer ECS. Recent methodological papers have shown that choices about time windows, averaging, and how to handle model drift can move individual model estimates up or down.
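
The Gregory method itself fits in a few lines. The sketch below fabricates a model-like abrupt-4×CO₂ run (the forcing and feedback numbers are arbitrary), fits the energy balance N = F − λΔT, and reads equilibrium warming off the point where the imbalance would reach zero; halving that value approximates ECS.

```python
import numpy as np

# A fabricated abrupt-4xCO2 experiment: annual global means of warming
# (dT, in K) and top-of-atmosphere imbalance (N, in W/m^2). Real
# analyses use ~150 years of climate-model output.
rng = np.random.default_rng(3)
F_4X, LAM = 7.4, 1.0                   # toy forcing and feedback values
t = np.arange(150)
dT = (F_4X / LAM) * (1 - np.exp(-t / 30.0)) + rng.normal(0, 0.1, t.size)
N = F_4X - LAM * dT + rng.normal(0, 0.3, t.size)

# Gregory regression: fit N against dT, then extrapolate to N = 0.
slope, intercept = np.polyfit(dT, N, 1)
warming_at_equilibrium = -intercept / slope     # for quadrupled CO2
ecs = warming_at_equilibrium / 2.0              # approximate 2xCO2 value
print(f"estimated ECS: {ecs:.2f} K")
# Changing the fitted time window (e.g., early vs late years) is one
# of the choices that moves estimates around, as the text notes.
```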

Against that backdrop, the name “Time-Averaged Ordinary Least Squares” sounds more like a particular choice about how to smooth data than a radical new econometric invention.

Time-averaging — taking multi-year means instead of annual values — is already a standard way to reduce weather-driven noise and highlight the slower climate response. But it has trade-offs: overlapping averages can create complicated error structures, and the effective number of independent data points shrinks, which can make confidence intervals look tighter than they really are if handled incorrectly.
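
The shrinkage can be quantified with a standard lag-1 autocorrelation correction for the effective sample size. The sketch below applies it to a purely random annual series before and after overlapping ten-year averaging, isolating the effect of the smoothing itself.

```python
import numpy as np

def effective_n(series):
    """Effective independent sample count under the common AR(1)
    approximation: n_eff = n * (1 - r1) / (1 + r1), where r1 is
    the lag-1 autocorrelation."""
    x = series - series.mean()
    r1 = (x[:-1] @ x[1:]) / (x @ x)
    return len(series) * (1 - r1) / (1 + r1)

rng = np.random.default_rng(4)
annual = rng.normal(size=144)          # toy annual anomalies, 1880-2023

# Overlapping 10-year running means: a much smoother series, but the
# overlap induces strong autocorrelation between neighboring points.
window = 10
smoothed = np.convolve(annual, np.ones(window) / window, mode="valid")

print(f"annual:   {annual.size} points, ~{effective_n(annual):.0f} independent")
print(f"smoothed: {smoothed.size} points, ~{effective_n(smoothed):.0f} independent")
```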

“If someone is doing OLS on time-averaged climate data, that’s not inherently wrong,” the econometrician said. “The question is how they treat the induced autocorrelation and uncertainty, and whether they compare their result honestly to the broader evidence base.”

Why a lower number wouldn’t mean ‘safe’

That broader evidence does not hinge on any one regression.

A 2020 assessment led by Steven Sherwood of the University of New South Wales and published in Reviews of Geophysics synthesized observations, paleoclimate records, and detailed studies of individual feedbacks such as clouds. The authors concluded that ECS is “likely” between 2.6°C and 3.9°C and “very unlikely” to be below 2°C, arguing that the lowest values were inconsistent with multiple lines of evidence taken together.

The IPCC adopted similar ranges in its Sixth Assessment Report and stressed that uncertainty about ECS does not change the basic conclusion that greenhouse gas emissions must fall sharply to avoid the worst impacts.

“Every increment of global warming will intensify multiple and concurrent hazards,” the IPCC wrote in its 2023 Synthesis Report. “Deep, rapid and sustained reductions in greenhouse gas emissions would lead to a discernible slowdown in global warming within around two decades.”

Even if a well-vetted study ultimately solidified an ECS estimate near 2.2°C, climate scientists say the practical message would not be one of safety, but of slightly less dangerous trajectories.

At that sensitivity, unchecked fossil fuel use would still push temperatures far beyond the Paris Agreement’s 1.5°C and 2°C targets, and into ranges associated with more extreme heat waves, heavier downpours, continued sea-level rise, and heightened risks to ecosystems such as coral reefs and tropical forests. Many potential tipping elements, from parts of the Greenland and West Antarctic ice sheets to thawing permafrost, are thought to be vulnerable in roughly the 1.5°C to 3°C band.

Economists who model the social cost of carbon — the damage from emitting an additional ton of CO₂ — already treat ECS as a distribution, not a fixed input. Lowering the central estimate by a few tenths of a degree would shift those damage curves, but not eliminate them.
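
In code, treating ECS as a distribution rather than a point value looks something like the toy calculation below. The lognormal shape and its parameters are invented to loosely span the assessed range; they do not represent any published posterior.

```python
import numpy as np

rng = np.random.default_rng(5)

# A made-up right-skewed ECS distribution, centered near 3 K.
ecs_samples = rng.lognormal(mean=np.log(3.0), sigma=0.25, size=100_000)

# Downstream quantities inherit the spread. As a stand-in, convert ECS
# to equilibrium warming for a hypothetical 5 W/m^2 forcing scenario,
# using ~3.7 W/m^2 as the forcing of one CO2 doubling.
warming = ecs_samples * (5.0 / 3.7)

for q in (5, 50, 95):
    print(f"{q:2d}th percentile warming: {np.percentile(warming, q):.1f} K")
```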

A lesson in verification

For journalists, policymakers, and the public, the elusive TAOLS paper underscores a more immediate lesson: how to treat scientific claims that have not yet cleared basic hurdles of transparency and review.

Preprint servers such as arXiv and EarthArXiv have become vital components of scientific communication, allowing researchers to share results quickly. But they also mean that unvetted or preliminary work is easier to cite — or mis-cite — in political debate.

Simple checks can help: verifying that an identifier exists on a repository, scanning for independent expert commentary, and looking at how a new result lines up with prior assessments from bodies such as the IPCC, the U.S. National Academies, or major peer-reviewed syntheses.
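
The first of those checks can even be scripted. arXiv exposes a public query API, and a few lines of Python (shown here with the identifier from this story) will ask for a record and report whether anything usable comes back. The API’s response to a missing identifier varies between an empty feed and a placeholder error entry, so the sketch checks for both.

```python
import urllib.request
import xml.etree.ElementTree as ET

ARXIV_ID = "2603.13766"
URL = f"https://export.arxiv.org/api/query?id_list={ARXIV_ID}"

with urllib.request.urlopen(URL, timeout=30) as resp:
    feed = ET.fromstring(resp.read())

ns = {"atom": "http://www.w3.org/2005/Atom"}
titles = [entry.findtext("atom:title", default="", namespaces=ns).strip()
          for entry in feed.findall("atom:entry", ns)]

# A real record carries a paper title; a phantom one yields nothing usable.
if not titles or all(t.lower().startswith("error") for t in titles):
    print(f"No arXiv record found for {ARXIV_ID}")
else:
    print(f"Found: {titles[0]}")
```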

“The first questions I ask are: Can I read the paper? Does it engage seriously with everything we already know?” the senior climate scientist said. “If the answer to either is no, then it shouldn’t be driving decisions.”

Whether the TAOLS analysis appears under a different number tomorrow or remains a ghost, the underlying physics is not waiting. The planet has already warmed by about 1.1°C since the late 19th century, according to the IPCC, and global greenhouse gas emissions hit a record high in 2022.

The precise value of climate sensitivity will continue to be refined in journal pages and conference halls. In the meantime, the search for that one reassuring number — and the temptation to seize it when it seems to appear — may say as much about human psychology as it does about Earth’s response to carbon dioxide.

Tags: #climatechange, #arxiv, #misinformation, #statistics, #ipcc