Actually, I have thought about this. This is the source of my hangup. In recent history, we have hourly temp data. We don't have that granular of data even 10,000 years ago. I assume that's why your graph is logarithmic. Our temp measurements in the past are neither temporally precise nor close enough together to even compare with today's temp spikes. Do we have annual measurements 350,000 years ago? Measurements to the decade? The century? Are the measurements close enough together to even register a spike like today? My main question is this: how can we compare rates of change today to rates of change in the past when we don't have the past data? *I could totally be wrong about our past measurements...maybe they ARE more granular and precise than I think.
Again, do we have measurements precise to 1000 year intervals?
This chart is limited to about 1500 years. According to your long term chart, it does appear that there are massive spikes and drops in temp. But I don't really trust those because again, what granularity are we talking about with data points?
I'm not trying to argue man made climate change. I just want to know from a logical/data/math perspective, do we have the data to say things like "this rate of change has never happened before".
OK, I have some time now. What I was trying to show was that it isn't necessary to use historical data to demonstrate what's happening today. Historical data can, however, help confirm what we know and bound possible scenarios.
Fundamentally, all temperature data from before about 1880 relies on proxy records and analysis to estimate what the temperature profile and atmospheric composition were.
There are a multitude of different methods used to estimate the temperature record, both in magnitude and in time. Where methods overlap, they can be used to narrow both the temperature range and the time range of a given point. The uncertainties of these methods are quantified and included in the datasets.
Now to be clear the claim I was referring to was from here:
https://earthobservatory.nasa.gov/Features/GlobalWarming/page3.php
- In the last million years, the temperature increase after an ice age averaged 4-7C over about 5,000 years
- Global temperatures have increased by 0.7C in the past century, roughly 10 times faster than that average. (This was written in 2010; the figure is up to about 1C now.)
- Predicted scenarios for the 21st century could see warming at least 20 times faster than that historical average
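That "ten times faster" claim is just arithmetic on the two numbers above. Here's the calculation spelled out as a quick sanity check, using only the figures from the NASA page:

```python
# Post-ice-age warming: 4-7 C spread over roughly 5,000 years.
slow_low = 4.0 / 5000 * 100    # ~0.08 C per century
slow_high = 7.0 / 5000 * 100   # ~0.14 C per century

# Modern warming: ~0.7 C over the past century (the 2010 figure).
modern = 0.7                   # C per century

print(f"ice-age recovery rate: {slow_low:.2f}-{slow_high:.2f} C/century")
print(f"modern rate is {modern / slow_high:.1f}x to {modern / slow_low:.1f}x faster")
```

The ratio works out to roughly 5x to 9x on the 2010 figure; with today's ~1 C per century it lands around 7x to 12x, so "about ten times faster" is a fair summary rather than an exaggeration.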
So how do we know when things happened?
This article has many examples of the methods used to date fossils, artifacts, or excavated strata:
- Biostratigraphy
- Paleomagnetism
- Tephrochronology
- Radiometric Dating (C14, U, etc)
- Single crystal fusion
- Thermoluminescence
- Optically stimulated luminescence
- Electron spin resonance
Each of these methods has an uncertainty that is known and quantified when used to build a paleoclimate dataset.
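To see why the overlap matters, here's a minimal sketch of the standard way independent age estimates get combined (inverse-variance weighting), using made-up numbers rather than real data. The key point is that the combined date is tighter than any single method's:

```python
# Combine independent age estimates, each with a 1-sigma uncertainty,
# using inverse-variance weighting. All numbers are hypothetical.

def combine(estimates):
    """estimates: list of (age, sigma) pairs from independent methods."""
    weights = [1.0 / s**2 for _, s in estimates]
    age = sum(a * w for (a, _), w in zip(estimates, weights)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return age, sigma

# Hypothetical dates for one stratum from three methods (years before present):
dates = [(120_000, 5_000),   # e.g. radiometric
         (118_000, 8_000),   # e.g. luminescence
         (123_000, 6_000)]   # e.g. tephrochronology

age, sigma = combine(dates)
print(f"combined: {age:.0f} +/- {sigma:.0f} years")
```

Here the combined uncertainty (about +/-3,500 years) is smaller than the best single method's +/-5,000, which is exactly why paleoclimate work leans on multiple overlapping dating techniques.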
On top of these methods for determining the age of a layer of sediment, rock, or fossil, there are regular cycles of climate change, driven by orbital mechanics and ocean currents, that we know well.
http://www.antarcticglaciers.org/glaciers-and-climate/climate-change/
”Throughout the last 2.6 million years (the “Quaternary Period”), the earth’s climate has oscillated many times, swinging between glacial and interglacial states (Figure 1). Over the last ~1 million years, we have experienced large ice ages and interglacials with a periodicity of around 100,000 years. We are currently in an interglacial state, which began at the start of the Holocene, ~11,500 years ago. About 104 stages of these cold and temperate cycles have been recognised in deep ocean marine sediment cores (Figure 1) [1]. During glacials, large ice sheets developed in mid- to high-latitudes, including over Britain and North America. These large changes are driven by changes in the earth’s orbit around the sun – see The Quaternary Period (Table 1) [2]. Glacials and interglacials can be further divided into stadials and interstadials, and within these we have smaller scale Dansgaard-Oeschger cycles, and then even smaller cycles, such as El-Nino and ENSO. Climate data is therefore very noisy, and climate scientists must determine patterns in this data using complex statistical techniques. Throughout this time, carbon dioxide has mirrored temperature variations, which have formed a regular pattern.”
Those cycles help narrow down the century, decade, or year where a data point goes on the timeline, but they too have a known uncertainty.
Just as there are multiple overlapping ways to estimate the date of something, there are multiple overlapping paleoclimate datasets.
NOAA hosts 18 different paleoclimate datasets of varying lengths that you can take a look at:
https://www.ncdc.noaa.gov/data-access/paleoclimatology-data/datasets
Now, you were specifically interested in ice cores. Hunting through that NOAA site, I found the following, which you might find interesting:
ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/ice-cores.pdf
Sources of uncertainty in ice core data:
”...Timescale uncertainty is an obvious source of error in ice-core based reconstructions. In general, for ice core timescales based on counting of seasonal cycles (in δ18O, sulfate, etc.), uncertainty will increase with depth (i.e. time) in an ice core. However, near volcanic marker horizons of independently-known age (e.g. the Tambora eruption of 1815) this uncertainty will be reduced. The magnitude of the uncertainty depends on the degree of ambiguity in identifying seasonal markers, and the likelihood of missing layers; both are functions of snow accumulation rate and, to a lesser extent, location. In general, where snow accumulation rates are <10 cm (ice equivalent)/year, identification of annual layers begins to be problematic. Steig et al., (2005) emphasized the need to distinguish absolute accuracy from relative accuracy. In the 200-year-long U.S. ITASE ice cores from West Antarctica, they showed that while the absolute accuracy of the dating was ±2 years, the relative accuracy among several cores was <±0.5 year, due to identification of several volcanic marker horizons in each of the cores. In this case the cores can be averaged together without creating additional timescale uncertainty, since any systematic errors in the timescale would affect all the cores together...”
So for this particular ice core data, the absolute dating accuracy is ±2 years, while the relative accuracy between cores is better than ±0.5 years.
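That point about systematic errors is worth making concrete. A tiny toy example (illustrative numbers only, not real core data): an error shared by every core shifts them all identically, so averaging the cores cancels the independent per-core noise without introducing any new uncertainty from the shared part:

```python
# Toy illustration of the Steig et al. point: a systematic error common to
# every core affects them all equally, so stacking (averaging) cores reduces
# only the independent per-core noise, not the shared offset.

true_value = 0.0                 # "true" anomaly at one dated layer (hypothetical)
shared_offset = 0.3              # systematic error common to every core
per_core_noise = [0.4, -0.5, 0.1, -0.2, 0.3, -0.1]   # independent errors

cores = [true_value + shared_offset + n for n in per_core_noise]
stacked = sum(cores) / len(cores)

# The independent noise averages toward zero (here it sums to exactly 0),
# leaving just the shared 0.3 offset in the stack.
print(f"stacked estimate: {stacked:.3f}")
```

That's why the cores can be averaged "without creating additional timescale uncertainty": the averaging attacks the random part of the error, and the systematic part, being common, neither grows nor shrinks.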
Using these overlapping records and modeling, we can get a fairly accurate historical temperature plot for the Earth, with error bars showing the remaining uncertainty.
(Example: note how, 10,000 years back, there is an uncertainty of about 0.4C, which disappears once you hit modern instrumental measurements.)
If you want to understand how the physics behind the modeling works, I suggest this excellent primer on climate physics by the American Chemical Society:
https://www.acs.org/content/acs/en/climatescience.html
If you’d like an example from Earth’s history that is somewhat close to what the current climate is doing, check this out:
The Permian Mass Extinction
http://www.livescience.com/41909-new-clues-permian-mass-extinction.html
Massive volcanism in Siberia likely burned enormous coal deposits, dumping large amounts of greenhouse gases into the atmosphere over thousands of years, raising global temperatures by 8-10C and causing the worst extinction event so far.
Finally, while I’d like to link you more of the uncertainty analyses, much of the promising peer-reviewed research is behind paywalls, plus writing this up has taken a while (again). But hopefully this gives you an idea of how much corroborating evidence there is.