Raw Climate Data - a challenge

This is comical.

Does anyone here even understand the issue? NONE of the data released is actual raw data. It's averages, anomalies, and adjustments.


Cannot any of you climate science experts post some actual temperature readings? Just the readings, locations, and time/date of those readings. These averages are just that, averages based on arbitrary calculations.

What you're seeing here is the "deer in the headlights" expression on the faces of the AGW nutburgers. They are too stupid to even understand why an average isn't a temperature reading.
 

I have been saying for over a decade that the historical temperature data (right up to yesterday) have been so heavily massaged, manipulated, and mangled that an actual temperature data set either no longer exists or is known to a very few individuals.

Good to see someone else with the same concerns. I believe that most of these climate idiots really don't have any idea that the warming they are so concerned about is really an artifact that bears little, if any, resemblance to the actual temperature data, and certainly hasn't been openly compared with it.
 

I understand.

:)
 
Through other conversations here and elsewhere, I've made the point that the raw data has either been hidden or destroyed through incompetent data management. While the Climategate folks had every intention of keeping their data secret, eventually forces beyond their control caused them to relent and they released everything they said they had.

OK, climate sceptics: here's the raw data you wanted - environment - 28 July 2011 - New Scientist

The only problem I see with this is that in none of those links, supporting links, or any of the other files released is there actual raw data. EVERY dataset has averages, average anomalies, and data that has been adjusted.

Here's the Met Office (UK's National Weather Service) website, where the raw data is said to reside:

Land surface climate station records - Met Office

I challenge anyone to find a file with a station's actual temperature record.

An actual temperature record would be something denoting the station, the temperature at a given time, and the time and date.

Any takers?

From the page to which you linked:

HadCRUT3 is one of the global temperature records that have underpinned Intergovernmental Panel on Climate Change (IPCC) assessment reports and numerous scientific studies. The data subset consists of a network of individual land stations that has been designated by the World Meteorological Organization (WMO) for use in climate monitoring and other data that the Met Office has gained permission from the owners to make available. The data contain monthly average temperature values for more than 3,000 land stations.
***************************
Are you suggesting that by averaging the temperature readings over a month, they were able to make them look warmer than they actually are? You'll have to explain in what way you believe averaging distorts the data.

BTW, if you wanted it daily, the files, the download time and the storage space required all go up 30-fold. If you wanted it hourly, they all go up 720-fold.

Have you tried contacting them and explaining your need? I'm quite sure they can satisfy your needs. After all, they've got nothing else to do.

You trying to intimidate me with the DATA SIZE of a temperature record?? I've got RING TONES that are larger than a century of daily surface station records..

So --- what if I'm studying the DIURNAL temp diffs (day night) and I want the original DAILIES book-keeping? The old paper notebooks that the temps in the 1920s were written on.. Where do you suppose THEY WOULD BE?

And why is this so hard?? If it wasn't for internet fanatics squirreling stuff away, we wouldn't KNOW that the peak in the 30s and 40s was higher than the OFFICIAL numbers.

Averaging distorts when the algorithms being used "homogenize" surface readings between adjacent stations. Or when stations are dropped in favor of nearby locations.
Or when gaps in data are interpolated.. Or when attempts to nullify "urban heating" are applied..

If you're only looking for a 0.15deg change from year to year -- it doesn't take much to raise artifacts and processing effects to that order of significance.
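To make that concrete, here's a minimal, made-up sketch: two invented stations, one missing month, and two different gap-handling rules. The station values and both rules are assumptions for illustration only, not anyone's actual procedure; the point is that a single processing choice can move a multi-station annual mean by more than the 0.15deg signal being chased.

# Illustrative only: two made-up stations with a seasonal cycle. Station B is
# missing its February reading. The numbers and the two gap-handling rules are
# invented for this sketch; they are not taken from any real processing chain.

station_a = [2.0, 3.5, 8.0, 13.0, 18.0, 22.0, 25.0, 24.0, 19.0, 13.0, 7.0, 3.0]
station_b_full = [1.5, 3.0, 7.5, 12.0, 17.0, 21.5, 24.5, 23.5, 18.5, 12.5, 6.5, 2.5]
station_b_gap = station_b_full[:1] + [None] + station_b_full[2:]   # February lost

def mean(xs):
    return sum(xs) / len(xs)

def annual_mean_drop(monthly):
    """Average only the months actually observed."""
    return mean([t for t in monthly if t is not None])

def annual_mean_interp(monthly):
    """Fill an interior gap with the mean of its two neighbours, then average."""
    filled = list(monthly)
    for i in range(1, len(filled) - 1):
        if filled[i] is None:
            filled[i] = (filled[i - 1] + filled[i + 1]) / 2.0
    return mean(filled)

truth  = mean([mean(station_a), mean(station_b_full)])
drop   = mean([annual_mean_drop(station_a),  annual_mean_drop(station_b_gap)])
interp = mean([annual_mean_interp(station_a), annual_mean_interp(station_b_gap)])

print(f"two-station mean, nothing missing:     {truth:.3f}")
print(f"same, February dropped from station B: {drop:.3f}  (shift {drop - truth:+.3f})")
print(f"same, February interpolated:           {interp:.3f}  (shift {interp - truth:+.3f})")

The point isn't that either choice is fraudulent; it's that the processing decisions sit in the same order of magnitude as the trend, so they have to be documented and checkable against the raw readings.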
 
But you've just told us that averaged data were garbage. Daily will be averaged. Hourly will be averaged. They're all garbage.

Read what I said again. I said daily, as long as the time is noted. That's not an average, it's a reading of temperature at a certain date and a certain time. Hourly is the same, a temperature reading at a certain time of a certain day.

Do you understand how thermometers work?



The best available, as long as the raw data is provided, not data that has been run through a computer program. That's the crux of my skepticism: the people who wrote these computer programs have neither formal education nor experience in software development.

And, with all that technical experience, would you mind explaining to us how averaging turns good data into garbage?

If sensor readings are taken at different parts of the day and that information is not captured, it produces an inaccurate trend. If sensor readings over the course of a month are not taken consistently at the same time of day, every day, those readings are inconsistent. If the equations used to average a monthly set of readings are not true averages but weighted averages meant to compensate for missing data, those averages are a guess at best.
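To illustrate the time-of-reading problem, here's a toy example: the same idealized diurnal cycle sampled at two different fixed clock times. The sine-shaped cycle and the 07:00/16:00 reading times are invented for the sketch; the only point is that a once-a-day reading depends heavily on when it is taken, so the observation time has to travel with the number.

# Illustrative only: a fixed, invented diurnal cycle sampled at two different
# times of day. The "station" and the sine-shaped cycle are assumptions for
# this sketch, not real observations.
import math

def temp_at(hour, base=15.0, swing=6.0):
    """Idealised diurnal cycle: coolest near 05:00, warmest near 17:00."""
    return base + swing * math.sin(2 * math.pi * (hour - 11) / 24)

days = 30
true_daily_mean = sum(temp_at(h) for h in range(24)) / 24

morning_readings   = [temp_at(7)  for _ in range(days)]   # observer reads at 07:00
afternoon_readings = [temp_at(16) for _ in range(days)]   # or at 16:00

print(f"true 24-hour mean:            {true_daily_mean:.2f}")
print(f"monthly mean of 07:00 reads:  {sum(morning_readings)/days:.2f}")
print(f"monthly mean of 16:00 reads:  {sum(afternoon_readings)/days:.2f}")

Shift the reading time partway through a record and the monthly means shift with it, which is exactly the kind of artifact that gets folded into a trend if the observation times weren't recorded.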

Read some of the comments noted in "Botch after Botch" cited above. Then read the "HARRY_READ_ME.txt" file cited above.

This is not the work of professionals.

Isn't it interesting that the experts always seem to be lacking in the areas where their critics feel strongest?

We're talking about reading a thermometer and writing down the numbers.. What do you suspect the pay grade is for getting that job done??

What kind of "expert" is required to MAINTAIN a historical temperature record as RAW DATA??

Late Edit --- BTW ---- Having papers and studies reviewed by a MULTIDISCIPLINARY audience is a GOOD idea. There is nothing wrong with WELCOMING folks who are stronger in SMALL aspects of the problem to generate comments and suggestions.
There are GLARING similarities in all corners of science and technology. Any discipline that attempts to cloister itself beyond general scrutiny --- is doing something devious...
 

We're not even at the point where this can be discussed honestly because nobody has released the original data. There are all sorts of issues and problems that could arise simply from casting variables wrong, using stock functions that contain rounding errors, compiling the same source on different architectures and (unknowingly) producing different results, etc.

Without the input data no independent analysis can be done.

For those of you who think this is far fetched, there is still a quite glaring set of rounding errors in Excel, stemming from its use of IEEE 754 binary floating point.

Fortran has rounding errors also.

What might seem like a simple fix really isn't that simple when using multiple systems (and in many cases multiple sources).

Here's an example: perl - Does Fortran have inherent limitations on numerical accuracy compared to other languages? - Stack Overflow

So even without suggesting any monkey business, the potential for errors exists.
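For anyone who hasn't bumped into this, here's a minimal Python illustration of the kind of IEEE 754 behaviour being referred to. None of it comes from any climate code base; it only shows that binary representation, narrowing casts, and summation order all produce small, reproducible discrepancies, which is why the inputs and the code both need to be published.

import math
import struct

# 0.1 and 0.2 have no exact binary representation, so the sum is not exactly 0.3.
print(0.1 + 0.2 == 0.3)            # False
print(f"{0.1 + 0.2:.20f}")         # 0.30000000000000004441...

# Summing the same values in a different order gives a slightly different answer.
vals = [1e8, 0.1, -1e8, 0.1] * 1000
print(sum(vals))                   # plain left-to-right accumulation
print(sum(sorted(vals)))           # same numbers, different order, different error
print(math.fsum(vals))             # error-compensated sum: correctly rounded result

# Narrowing a double to single precision (e.g. storing into a 4-byte REAL) loses digits.
x = 0.1
x32 = struct.unpack('f', struct.pack('f', x))[0]
print(f"{x:.17g}  vs  {x32:.17g}")

These effects are tiny compared with thermometer precision, but they are exactly why two compilations of the same source can disagree in the last digits, and why independent reruns need the original inputs.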
 
How do you define comprehensive? Was there some particular, DIFFERENT data format you were expecting to see? Date and temperature fields are spelled out in the format description below.

Here's your daily data, hotshot.

The United States Historical Climatology Network (USHCN) Main Page

Did you even look for it?

Yes I did.

Show me where they provide comprehensive (even if it's incomplete) data for temperature readings.

None are there.

USHCN daily data are available as ASCII files.
The format of each record in an ASCII data file, be it a state-level
file (e.g., state01_AL.txt) or the file for the entire U.S. (us.txt)
is as follows. (Each record in a file contains one month of daily data.)

Variable Columns Type
COOP ID 1-6 Character
YEAR 7-10 Integer
MONTH 11-12 Integer
ELEMENT 13-16 Character
VALUE1 17-21 Integer
MFLAG1 22 Character
QFLAG1 23 Character
SFLAG1 24 Character
VALUE2 25-29 Integer
MFLAG2 30 Character
QFLAG2 31 Character
SFLAG2 32 Character
. . .
. . .
. . .
. . .
VALUE31 257-261 Integer
MFLAG31 262 Character
QFLAG31 263 Character
SFLAG31 264 Character

These variables have the following definitions:

COOP ID is the U.S. Cooperative Observer Network station identification
code. Note that the first two digits in the Coop Id correspond to the state.

YEAR is the year of the record.

MONTH is the month of the record.


ELEMENT is the element type. There are five possible values:
PRCP = precipitation (hundredths of inches)
SNOW = snowfall (tenths of inches)
SNWD = snow depth (inches)
TMAX = maximum temperature (degrees F)
TMIN = minimum temperature (degrees F)


VALUE1 is the value on the first day of the month (missing = -9999).

MFLAG1 is the measurement flag for the first day of the month. There are
ten possible values:

Blank = no measurement information applicable
B = precipitation total formed from two 12-hour totals
D = precipitation total formed from four six-hour totals
H = represents highest or lowest hourly temperature
K = converted from knots
L = temperature appears to be lagged with respect to reported
hour of observation
O = converted from oktas
P = identified as "missing presumed zero" in DSI 3200 and 3206
T = trace of precipitation, snowfall, or snow depth
W = converted from 16-point WBAN code (for wind direction)

QFLAG1 is the quality flag for the first day of the month. There are
fourteen possible values:

Blank = did not fail any quality assurance check
D = failed duplicate check
G = failed gap check
I = failed internal consistency check
K = failed streak/frequent-value check
L = failed check on length of multiday period
M = failed megaconsistency check
N = failed naught check
O = failed climatological outlier check
R = failed lagged range check
S = failed spatial consistency check
T = failed temporal consistency check
W = temperature too warm for snow
X = failed bounds check
Z = flagged as a result of an official Datzilla
investigation

SFLAG1 is the source flag for the first day of the month. There are
nineteen possible values (including blank, upper and lower case letters):

Blank = No source (i.e., data value missing)
0 = U.S. Cooperative Summary of the Day (NCDC DSI-3200)
6 = CDMP Cooperative Summary of the Day (NCDC DSI-3206)
7 = U.S. Cooperative Summary of the Day -- Transmitted
via WxCoder3 (NCDC DSI-3207)
A = U.S. Automated Surface Observing System (ASOS)
real-time data (since January 1, 2006)
B = U.S. ASOS data for October 2000-December 2005 (NCDC
DSI-3211)
F = U.S. Fort data
G = Official Global Climate Observing System (GCOS) or
other government-supplied data
H = High Plains Regional Climate Center real-time data
K = U.S. Cooperative Summary of the Day data digitized from
paper observer forms (from 2011 to present)
M = Monthly METAR Extract (additional ASOS data)
N = Community Collaborative Rain, Hail,and Snow (CoCoRaHS)
R = NCDC Reference Network Database (Climate Reference Network
and Historical Climatology Network-Modernized)
S = Global Summary of the Day (NCDC DSI-9618)
NOTE: "S" values are derived from hourly synoptic reports
exchanged on the Global Telecommunications System (GTS).
Daily values derived in this fashion may differ significantly
from "true" daily data, particularly for precipitation
(i.e., use with caution).
T = SNOwpack TELemtry (SNOTEL) data obtained from the Western
Regional Climate Center
U = Remote Automatic Weather Station (RAWS) data obtained
from the Western Regional Climate Center
W = WBAN/ASOS Summary of the Day from NCDC's Integrated
Surface Data (ISD).
X = U.S. First-Order Summary of the Day (NCDC DSI-3210)
Z = Datzilla official additions or replacements

When data are available for the same time from more than one source,
the highest priority source is chosen according to the following
priority order (from highest to lowest):
Z,R,0,6,X,W,K,7,F,B,M,r,E,z,u,b,a,G,Q,I,A,N,H,S

VALUE2 is the value on the second day of the month

MFLAG2 is the measurement flag for the second day of the month.

QFLAG2 is the quality flag for the second day of the month.

SFLAG2 is the source flag for the second day of the month.

... and so on through the 31st day of the month. Note: If the month has less
than 31 days, then the remaining variables are set to missing (e.g., for
April, VALUE31 = -9999, MFLAG31 = blank, QFLAG31 = blank, SFLAG31 = blank).

http://cdiac.ornl.gov/ftp/ushcn_daily/data_format.txt
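To show what can be pulled out of a file in that layout, here's a small Python sketch that walks one record and prints station, date, element, and value. The column offsets come straight from the spec quoted above; the sample line and its temperatures are fabricated so the snippet runs on its own, and in practice you'd feed it lines from the files behind the FTP link instead.

# A minimal reader for the USHCN daily fixed-width format described above.
# Field positions follow data_format.txt; the sample record below is synthetic
# (made-up station ID and values), only there so the script runs without a download.

def parse_ushcn_daily_record(line):
    """Return (coop_id, year, month, element, [31 day values]) for one record.

    Values are left in the units given by ELEMENT (e.g. whole degrees F for
    TMAX/TMIN); -9999 marks a missing day and is returned as None.
    """
    coop_id = line[0:6]
    year    = int(line[6:10])
    month   = int(line[10:12])
    element = line[12:16].strip()
    values = []
    for day in range(31):
        start = 16 + day * 8
        raw = int(line[start:start + 5])
        values.append(None if raw == -9999 else raw)
        # columns start+5 .. start+7 hold MFLAG/QFLAG/SFLAG; ignored here
    return coop_id, year, month, element, values

# Synthetic TMAX record: station "011084", June 1923, 30 days of fabricated data.
sample = "0110841923 6TMAX"
for day in range(31):
    value = -9999 if day >= 30 else 70 + (day % 5)   # fabricated temperatures
    sample += f"{value:5d}   "                        # blank M/Q/S flags

coop, yr, mo, elem, vals = parse_ushcn_daily_record(sample)
for day, v in enumerate(vals, start=1):
    if v is not None:
        print(f"station {coop}  {yr}-{mo:02d}-{day:02d}  {elem} = {v} F")

Note these are the daily TMAX/TMIN observations as reported -- station, value, and date -- which is the form being asked for.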
 
That's the crux of my skepticism, that the people who wrote these computer programs have neither formal education nor experience in software development.

And that would be why they don't screw it up. It requires an "expert" to do that.

This isn't chaos theory. Averaging is well-behaved, and tiny rounding errors mean ... nothing. When the rounding error is a million times smaller than the thermometer error, it's not an issue, no matter how much anyone wants it to be.

What is the issue? Cultists and their conspiracy theories declaring all the data is forged, solely because the data is inconvenient for the cult.

Oh, if people with "formal education in software development" had been assigned to the problem, it would have required a staff of 100 to do it, and it would have taken 5 years. I've seen how "software development" takes a year and 500 man-hours to implement my one-digit code fix. TheProcess (all bow heads reverently) must be followed, you know.
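For what it's worth, the relative sizes are easy to check. Here's a back-of-envelope Python comparison using invented whole-degree readings: the double-precision rounding error in a monthly mean versus the half-degree quantisation of the reports themselves (actual instrument error would be larger still).

from fractions import Fraction

# 31 invented whole-degree readings standing in for a month of TMAX reports.
readings = [61, 63, 64, 66, 65, 67, 70, 71, 69, 68, 66, 64, 63, 62, 60,
            59, 61, 62, 64, 66, 68, 69, 71, 72, 70, 68, 66, 64, 62, 61, 60]

float_mean = sum(readings) / len(readings)            # ordinary double-precision division
exact_mean = Fraction(sum(readings), len(readings))   # exact rational value
rounding_error = float(abs(Fraction(float_mean) - exact_mean))

print(f"monthly mean (float):         {float_mean}")
print(f"rounding error of that mean:  {rounding_error:.2e}")   # on the order of 1e-15
print(f"reporting step of the data:   0.5")                    # whole-degree reports
if rounding_error:
    print(f"ratio:                        {0.5 / rounding_error:.1e}")

On those numbers the float rounding error sits roughly fourteen orders of magnitude below the reporting resolution; whatever one thinks of homogenisation and station adjustments, they are a different and much larger class of effect than arithmetic rounding.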
 
And for those not into conspiracy theories ...

The WMO (World Meteorological Organization) established, around 1957, a communication protocol for worldwide sharing of weather data. There are World Data Centers that collate the data in several locations. One is in the USA (Asheville, NC), with counterparts elsewhere in the world.

Using WDC Asheville as the example, they receive the data from around the world. When almost all of it comes in, it gets autoprocessed down into a more digestible lump, which can be downloaded here:

NOAA: WMO Resolution 40

If you actually want raw station data, WDC Asheville will send it to you, but it's not a website download. You have to request specifically what you want, and pay them for the additional processing and handling.

Some weather station data is declared as proprietary by the country of origin. If someone has gotten it from WDC Asheville, they are not allowed to redistribute it by publicly posting it on the internet. Hence why you don't find worldwide raw station data on the internet.

And if that's the vast conspiracy, it's been going on since 1957 or so. Crafty, how those warmers planned ahead like that.
 
So, data is not destroyed and is available for examination.

What are the chances that all will remember that point?
 
The proprietary part also comes into play with FOIA and similar-type requests. You can't send proprietary data to a third party just because they ask for it with an FOIA. And thus, when the crazies don't get the proprietary data, they declare it's because of a vast conspiracy to hide data. Data which they could get themselves, if they'd ask the right sources.
 
The proprietary part also comes into play with FOIA and similar-type requests. You can't send proprietary data to a third party just because they ask for it with an FOIA. And thus, when the crazies don't get the proprietary data, they declare it's because of a vast conspiracy to hide data. Data which they could get themselves, if they'd ask the right sources.

What "proprietary data?" Wasn't it all collected by stooges on the government payroll?
 
What is the issue? Cultists and their conspiracy theories declaring all the data is forged, solely because the data is inconvenient for the cult.

There is no data, turd, so what is there to discuss? Genuine scientists are willing to produce the data they used to reach their conclusions. Keeping your data hidden is the sign of a fraud or a con artist.
 
Oh, please do pull me the national DAILY record from 1938 through 1941...
That would please me no end..

Why is it that the Feds can ask us how many toilets every citizen has and publish that data --- but we can't get the temperature dailies from 80 years ago??

Proprietary my ass..
If it's old enough for the Smithsonian --- it's not proprietary anymore...
 
There is no data, turd, so what is there to discuss?

I'd like to discuss how the kooks are melting down so hilariously after yet another of their retard conspiracy theories has bitten the dust.

If you had a pair, you'd go after the people who lied to you. But you don't, so you'll attack the messenger, and then run back to your masters to beg for more lies. I'd tell you to enjoy life on your knees, if it wasn't already so clear how much you do.

Anyways, kooks, don't be discouraged. You've bounced back from worse beatings than this. Haven't you? Maybe not. Whatever. Just buck up, crack another 40 open, guzzle, and think up a new conspiracy. Try to make this one a little less retarded.
 
The daily temperature records being stored...for safekeeping:

Raiders of the Lost Ark: top men + warehouse scene - YouTube (http://www.youtube.com/watch?v=q6-rQ6Jay6w)
 
Here's your daily data, hotshot.

The United States Historical Climatology Network (USHCN) Main Page

http://cdiac.ornl.gov/ftp/ushcn_daily/data_format.txt

I'm looking for temperature readings, not monthly averages.

Or do you think that these are actual temperature readings and they only take one per month?
 
What's that cooking in the GISS kitchen?? Hey, it's the temps all the way back to the 1930s..

Prepared again and again, every day.. From stuff they have lying around..

[Attached chart: 1998changesannotated-1.gif]
 
Oh, if people with "formal education in software development" had been assigned to the problem, it would have required a staff of 100 to do it, and it would have taken 5 years. I've seen how "software development" takes a year and 500 man-hours to implement my one-digit code fix. TheProcess (all bow heads reverently) must be followed, you know.

So you've "seen" huh?

How many people wrote the original code for Facebook?

So rounding errors don't matter? I'm not sure how I can explain it to you if you think this is just a "cult" thing. But it's obvious that you haven't investigated this issue; you're just accepting what you're told. So data analysts and programmers can't critique code, but amateur programmers can write climate models?

Explain that one. What makes Phil Jones an expert in technology? What specific classes in his Hydrology and Environmental Sciences degrees qualify him to write software?
 
