To be an AGW denier is to be paranoid

It was done with thermometers Frank. And did you have some reason to quote as much material as you did to ask that question? I don't see that you needed to quote anything.
 
It was done with thermometers Frank. And did you have some reason to quote as much material as you did to ask that question? I don't see that you needed to quote anything.

We had thermometers accurate to a tenth of a degree in 1880? Really?

Show us.

BTW, this is a great opportunity for you to repost the chart with no temperature axis, yanno
 
Give some thought as to how someone might CALIBRATE a thermometer Frank. Think of water and phase changes. Think about what is happening in this graph.

[attached image: phase.gif — water phase-change graph]
 
Give some thought as to how someone might CALIBRATE a thermometer Frank. Think of water and phase changes. Think about what is happening in this graph.

[attached image: phase.gif — water phase-change graph]


hahahahaha, I would like to see crick calibrate a liquid-in-glass thermometer using the freezing point and the boiling point of water. do you think he could get within 2 degrees for the range?
 
Give some thought as to how someone might CALIBRATE a thermometer Frank. Think of water and phase changes. Think about what is happening in this graph.

[attached image: phase.gif — water phase-change graph]

Tell us again how we got Deep Ocean temperature accurate to a tenth of a degree back in 1880

I'd be remiss if I didn't congratulate you for finally posting a chart with a temperature axis
 
hahahahaha, I would like to see crick calibrate a liquid-in-glass thermometer using the freezing point and the boiling point of water. do you think he could get within 2 degrees for the range?

Mercury thermometers were invented by Fahrenheit in 1714, and Celsius proposed using the melting and vaporization points of water as the defined ends of a scale in 1742. The clinical mercury thermometer we all used before the rise of digital versions was developed in 1866. I guarantee you that by 1880 there were thermometers that could reliably measure ambient temperatures to a tenth of a degree.

And this comment surprises me. Calibrating thermometers with slush and boiling water is standard basic chemistry and physics lab work. I was under the impression you have a college education in some sort of science, Ian. Is that not true?
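[Editor's note: the two-fixed-point calibration being argued about here is just linear interpolation between the ice point and the boiling point. A minimal sketch — the readings and names below are invented for illustration, not taken from any poster:]

```python
def calibrate_two_point(raw_ice, raw_boil, boil_temp_c=100.0):
    """Return a function mapping a raw reading (e.g. mercury column
    height in mm) to degrees C, using the ice point (0 C) and the
    boiling point as the two fixed calibration marks."""
    scale = boil_temp_c / (raw_boil - raw_ice)
    return lambda raw: (raw - raw_ice) * scale

# Example: the column reads 12.0 mm in an ice bath and 212.0 mm in
# boiling water; a reading halfway between them maps to 50 C.
to_celsius = calibrate_two_point(12.0, 212.0)
print(to_celsius(112.0))  # -> 50.0
```

Note the `boil_temp_c` parameter: the upper fixed point depends on barometric pressure (boiling water is below 100 C at altitude), which is one real-world source of calibration error this sketch ignores.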
 
hahahahaha, I would like to see crick calibrate a liquid-in-glass thermometer using the freezing point and the boiling point of water. do you think he could get within 2 degrees for the range?

Mercury thermometers were invented by Fahrenheit in 1714, and Celsius proposed using the melting and vaporization points of water as the defined ends of a scale in 1742. The clinical mercury thermometer we all used before the rise of digital versions was developed in 1866. I guarantee you that by 1880 there were thermometers that could reliably measure ambient temperatures to a tenth of a degree.

And this comment surprises me. Calibrating thermometers with slush and boiling water is standard basic chemistry and physics lab work. I was under the impression you have a college education in some sort of science, Ian. Is that not true?

So that's how they measured the temperature of the deep Pacific Ocean, with mercury thermometers.

How'd they do that in 1880, Crick? I mean, accurate to a tenth of a degree no less
 
hahahahaha, I would like to see crick calibrate a liquid-in-glass thermometer using the freezing point and the boiling point of water. do you think he could get within 2 degrees for the range?

Mercury thermometers were invented by Fahrenheit in 1714, and Celsius proposed using the melting and vaporization points of water as the defined ends of a scale in 1742. The clinical mercury thermometer we all used before the rise of digital versions was developed in 1866. I guarantee you that by 1880 there were thermometers that could reliably measure ambient temperatures to a tenth of a degree.

And this comment surprises me. Calibrating thermometers with slush and boiling water is standard basic chemistry and physics lab work. I was under the impression you have a college education in some sort of science, Ian. Is that not true?


Again, I would love to see crick take an unmarked liquid-in-glass tube, calibrate it with the freezing and boiling points, and then see if he could measure 22C and 37C accurately. Better yet, 70F and 98.6F.

What are the chances he would come within 2 degrees?

I have a handful of Fisher Scientific LIG thermometers in a drawer here. They show a spread of half a degree higher and lower than the central value.

The bimetal ones are close to twice that, and most have been recalibrated recently.

Last night I compared a LIG against a Fluke digital and the difference was 0.6C.

I assume the precision is good, and the accuracy OK.

Perhaps the thermometers in the olden days were high quality and regularly calibrated. Perhaps not.

Perhaps the thermometer enclosures were well maintained, perhaps not. If a Stevenson screen goes ten years without being repainted it reads higher and higher. Once repainted it abruptly drops, which may trigger a breakpoint in the record. All the artificial warming would be incorporated into the trend, and the cycle repeats.

UHI is similar. It just keeps adding to the trend. Low-trend rural stations are seen as the outliers and are adjusted towards the contaminated, higher-trend urbanized ones.
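[Editor's note: the repaint-cycle mechanism described above can be sketched numerically. This toy simulation uses invented numbers throughout — a flat "true" temperature, a sawtooth weathering bias, and a deliberately naive break adjustment — to show how removing each repaint drop as a "break" converts the sawtooth into a spurious warming trend:]

```python
import numpy as np

years = np.arange(1900, 1960)
true_temp = np.full(years.size, 15.0)  # flat: no real warming at all

# Sawtooth bias: the screen weathers at +0.05 C/yr, repainted every 10 years.
age = (years - years[0]) % 10
raw = true_temp + 0.05 * age

# Naive break "adjustment": treat each sudden drop (the repaint) as a
# station break and shift everything after it up to make the series continuous.
adjusted = raw.copy()
steps = np.diff(raw)
for i, step in enumerate(steps):
    if step < -0.3:                  # repaint shows up as a spurious break
        adjusted[i + 1:] -= step     # step is negative, so this shifts up

raw_trend = np.polyfit(years, raw, 1)[0]       # near zero: sawtooth averages out
adj_trend = np.polyfit(years, adjusted, 1)[0]  # inherits the weathering rate
print(f"raw trend:      {raw_trend * 100:+.2f} C/century")
print(f"adjusted trend: {adj_trend * 100:+.2f} C/century")
```

Whether real homogenization algorithms behave this way on real station data is exactly the point in dispute; the sketch only shows that the mechanism is arithmetically possible when every break is aligned in one direction.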
 
Clearly your thermometers are DENIERS!! who refuse to accurately measure the devastating effects of AGW and must be adjusted accordingly
 
So, in 1880-1930 they accurately measured the excess heat, 93% of which is absorbed by the ocean and then subducted down into the deep, to a tenth of a degree.

2+2=5
 
Tell us again how we got Deep Ocean temperature accurate to a tenth of a degree back in 1880

We didn't, and you're a moron for thinking we did.

The world isn't faking data. You're completely clueless on the topic of statistics. And the whole world knows that, which is why you're laughed at and ignored.
 
so tooth, I think the issue is then, if there weren't records, how can one go back and make changes to that historical information? I'm just saying. That was the point of the post.
 
so tooth, I think the issue is then, if there weren't records,

There were records. What are you babbling about?

how can one go back and make changes to that historical information? I'm just saying. That was the point of the post.

In post #568, I showed you how doing what you want -- that is, simply averaging the raw data -- results in showing _more_ warming.

The adjustments reduce the calculated warming. You claim the adjustments increase the calculated warming. Being that you're just lying openly, there's no reason to pay attention to your ravings.
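[Editor's note: the statistical point being argued — that naively averaging raw station data gives a different trend than the adjusted product — comes from the station network changing over time. A toy example with invented stations and dates: if a cold station only reports for part of the record, the naive average of raw readings shows a spurious trend even though no station warms or cools at all, while averaging anomalies (each station relative to its own baseline) stays flat. Depending on which stations enter or leave, the naive bias can point either way:]

```python
import numpy as np

years = np.arange(1900, 2000)

# Two stations with constant temperature -> zero real change anywhere.
warm_station = np.full(years.size, 25.0)   # reports the whole century
cold_station = np.full(years.size, -5.0)   # only reports from 1950 on
cold_station[years < 1950] = np.nan

raw = np.vstack([warm_station, cold_station])

# Naive average of raw readings: jumps when the cold station appears.
naive = np.nanmean(raw, axis=0)

# Anomaly method: subtract each station's own mean over a common baseline,
# then average the anomalies.
baseline = (years >= 1950) & (years < 1980)
anoms = raw - np.nanmean(raw[:, baseline], axis=1, keepdims=True)
anomaly_avg = np.nanmean(anoms, axis=0)

print(naive[0], naive[-1])              # -> 25.0 10.0 (a fake 15 C swing)
print(anomaly_avg[0], anomaly_avg[-1])  # -> 0.0 0.0 (correctly flat)
```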
 
Tell us again how we got Deep Ocean temperature accurate to a tenth of a degree back in 1880

We didn't, and you're a moron for thinking we did.

The world isn't faking data. You're completely clueless on the topic of statistics. And the whole world knows that, which is why you're laughed at and ignored.

Weren't the oceans absorbing 93% of the excess heat just like AR5 said they do today? Did you read AR5? Can you count to 5?
 
hindcasts are obviously tuned to get the best result.

Wrong. Models are tuned to give the best hindcasts. It's how you initially judge how well your model is working.


reduced to being the grammar police?

I bet you were hall monitor in elementary school too.

You know quite well you phrased that to give the impression that there was no value in hindcast performance and that they were intended as a deception regarding the accuracy of models. Don't bitch when you get caught trying to be dishonest.
 
How am I being dishonest? I posted a graph showing the training period of one model, roughly 1975-2000, and how the hindcast before that was bad and the projection after was worse.

I will admit I should know more about the climate models than I do.

Do they all use a standardized set of inputs to initialize their runs? That would seem to be the most basic of first steps. Do they? I was under the impression that there is a fair amount of leeway in choosing the initial parameters although they are basically grouped under RCPs for the forecasts. Am I wrong? Link me up to the site that defines the parameters that all the models use.
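[Editor's note: the tuning-versus-validation distinction the two posters are circling is ordinary train/test methodology. A bare-bones sketch with synthetic "observations" (all numbers invented, and a straight line standing in for a model): fit only on a 1975-2000 window, then score the fit inside the window, in hindcast, and in projection. Skill inside the training window says little by itself; the out-of-window errors are the real check:]

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2025)
# Synthetic "observations": a slow trend plus noise (all numbers invented).
obs = 0.008 * (years - 1900) + rng.normal(0.0, 0.15, years.size)

train = (years >= 1975) & (years < 2000)

# "Tune" the stand-in model on the training window only.
coef = np.polyfit(years[train], obs[train], 1)
model = np.polyval(coef, years)

def rmse(mask):
    return float(np.sqrt(np.mean((model[mask] - obs[mask]) ** 2)))

print(f"training-window RMSE : {rmse(train):.3f}")
print(f"hindcast RMSE (<1975): {rmse(years < 1975):.3f}")
print(f"forecast RMSE (>2000): {rmse(years >= 2000):.3f}")
```

Errors grow outside the window because any misfit in the tuned slope is amplified by extrapolation — which is why hindcast and forecast skill, not training-window fit, is the meaningful comparison.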
 
