Global Warming Actually Still Accelerating - no "lull"

I've BUILT models for earth resource estimation. Some are still in use. NONE of them purported to map an entire GLOBAL volume equivalent to ALL THE WATER IN THE OCEANS to several THOUSAND meters of depth, hunting for a couple of joules of energy per meter of depth. I'd be embarrassed to show myself at the next conference.

Built or programmed?

If programmed, when?

How big was your address space? That is, how much memory did you expect to have available to people running your code?

Uh oh, spaghetti-O. You're in trouble now... you see, unlike you, flac really does do this stuff. What are you, an undergrad?
 
From the Von Storch et al. paper to which you linked:

What do these inconsistencies imply for the utility of climate projections of anthropogenic climate change? Three possible explanations of the inconsistencies can be suggested: 1) the models underestimate the internal natural climate variability; 2) the climate models fail to include important external forcing processes in addition to anthropogenic forcing; or 3) the climate model sensitivities to external anthropogenic forcing are too high. The first explanation is simple and plausible. Natural climate variability is an inevitable consequence of a slow system (climate) interacting with a fast system (weather) (10).

The forcing of the slow system by the (white noise) low-frequency components of the fast system produces a “Brownian motion” of the slow system, represented by a red variance spectrum - in qualitative agreement with observations. However, the details of the response depend strongly on the internal dynamics of the slow system in the time scale range of interest - in the present case, on decadal time scales. It is long known, from successive reports of the Intergovernmental Panel on Climate Change (4), that contemporary global climate models have only limited success in simulating many such processes, ranging from the variability of the ocean circulation, ENSO events, and various coupled ocean-atmosphere oscillation regimes, to changes in sea ice, land surface, atmospheric chemistry and the biosphere. The inability to simulate the statistical internal climate variability may have been artificially compensated in the past by tuning the models to prescribed external forcings, such as volcanic eruptions and tropospheric aerosols. This would explain why simulations with historical forcing by different GCMs tend to be very similar and follow closely the observed record. This artificial “inflation” (11) of forced variability at the expense of unpredictable natural variability works, however, only in the period of tuning, and no longer in the post-tuning phase since about 2000. The net effect of such a procedure is an underestimation of natural variability and an overestimation of the response to forced variability.

Do you understand what the man is saying? Your attempted characterization does not indicate that you do.
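The "red spectrum" mechanism in the quoted passage can be sketched numerically: integrating white-noise "weather" forcing gives a random-walk "climate" response whose variance piles up at low frequencies. This is a toy illustration, not code from the paper; the series length and frequency bands are arbitrary choices.

```python
# Toy sketch: white-noise forcing of a slow system yields a red spectrum.
import cmath
import random
from math import pi

random.seed(42)

N = 4096
white = [random.gauss(0.0, 1.0) for _ in range(N)]

# Slow system driven by fast white noise: a simple random walk
# (discrete Brownian motion).
slow, s = [], 0.0
for w in white:
    s += w
    slow.append(s)

def power(x, k):
    """Periodogram power of series x at DFT frequency index k."""
    n = len(x)
    coeff = sum(x[t] * cmath.exp(-2j * pi * k * t / n) for t in range(n))
    return abs(coeff) ** 2 / n

low = sum(power(slow, k) for k in range(1, 6))       # low-frequency band
high = sum(power(slow, k) for k in range(200, 205))  # higher-frequency band
print(low > high)  # red spectrum: low frequencies dominate the variance
```

A white-noise input has a flat spectrum; running it through the integrator tilts the spectrum toward low frequencies, which is all "red" means here.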
 

Indeed, I was going to comment on that paper as well. Westwall claims that climate models are useless, and he has tried to support that claim by posting papers that don't actually support it. What Von Storch is saying is that although the models worked well in the past, conditions developed in recent years that the models didn't predict, so they need to be tweaked to better reflect the conditions they are meant to model. Everyone knows this. All scientific models work that way: you punch in the data for the parameters you are testing, then compare the results with the real world. If there is variance, you change the model parameters and/or refine the data until the model more precisely reflects real-world conditions. In this way you find what works and what doesn't, and you can reveal unforeseen conditions that need to be accounted for, refining the model further.

So to say that the models are useless is a meaningless statement. Models of the 1970s were less robust than models of the 1980s, which were less robust than models of the 1990s, which were less robust than the models of ten years ago, which were less robust than the models of today, which will be less robust than the models of the future. And none of the assessments and reassessments of these models refutes the fact that global warming is occurring and is, in fact, ongoing.
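The tune-compare-refine cycle described above can be sketched in a few lines. This is purely illustrative: the toy model, the forcing series, and the "observations" are all made up, and a crude parameter sweep stands in for the refinement step.

```python
# Hedged sketch of model calibration: run the model, compare against
# observations, and keep the parameter value with the smallest misfit.

def toy_model(forcing, sensitivity):
    """Toy 'model': response is sensitivity times forcing."""
    return [sensitivity * f for f in forcing]

def rmse(a, b):
    """Root-mean-square error between two equal-length series."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

forcing = [0.1 * t for t in range(20)]        # prescribed external forcing
observed = [0.75 * f for f in forcing]        # synthetic "real world" record

# Sweep candidate sensitivities; keep the one that best matches reality.
best = min((rmse(toy_model(forcing, s), observed), s)
           for s in [0.25, 0.5, 0.75, 1.0, 1.25])
print(round(best[1], 2))  # -> 0.75, the value that matches observations
```

In real practice the "sweep" is a far more sophisticated optimization over many coupled parameters, but the compare-and-adjust loop is the same idea.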
 

The company paying my salary was one of the world's largest suppliers of image array processors. At that point in time, that was RACKS of equipment, with boxes that held 10 or 20 FRAMES of image data memory plus the ALUs and digital signal processing to accelerate the work. So it was NOT a general computing application, though some smart cookies have PORTED our work to GP platforms in later years.

There was generally a row of GP computers with direct DMA access to the image memory. Later on, we pioneered "memory centered" digital processing with multiple PCs, and also pioneered work on applying neural network array hardware to image feature extraction and classification. Neural nets and machine learning are the CRUX of understanding modeling from a data-centered point of view.

That good enough for ya? I'm the Forrest Gump of image/signal processing: been literally everywhere, done more than my share.

Now what distinction did you expect between "built or programmed"? :lol: Do you really think the semantics there are important?

Build a model, program a model. What's the big diff?
 
Show me a model that can recreate the weather we had yesterday. It should be easy peasy: you have perfect knowledge of every variable in play. A model that can't do a simple hindcast is worthless.

The most sophisticated models in use today are the CFD models used to design aircraft and F1 race cars. They cost millions of dollars to build and operate, and they STILL need to be checked in wind tunnels. They are focused on ONE thing, aerodynamics, and they still make basic mistakes. Get one variable wrong and you are losing 0.5 seconds per lap.

You claim that "simple computer models" can tell us what the long-term climate is going to be like, and I can guarantee you that they can't predict what will happen tomorrow. You really think those laughable pieces of dog shit are useful?

Get real. Spend some real money, hire some of those CFD people, and see what they can do. In ten years they would probably have something useful. What you have is a pathetic joke.
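For what it's worth, hindcasts are normally judged against a baseline rather than demanded to be perfect: a common measure is the skill score 1 - MSE(model)/MSE(climatology), which is positive when the hindcast beats a naive "just predict the mean" forecast. The numbers below are synthetic, purely to show the arithmetic.

```python
# Skill score of a hindcast versus a climatology baseline (synthetic data).

obs = [14.2, 14.8, 15.1, 14.6, 15.4, 15.0]       # "observed" temperatures
hindcast = [14.0, 14.9, 15.3, 14.4, 15.2, 15.1]  # model hindcast
climatology = [sum(obs) / len(obs)] * len(obs)   # naive baseline: the mean

def mse(a, b):
    """Mean squared error between two equal-length series."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

skill = 1.0 - mse(hindcast, obs) / mse(climatology, obs)
print(skill > 0)  # positive skill: the hindcast beats the naive baseline
```

A skill of 1 would be a perfect hindcast; anything above 0 adds information over climatology, which is the standard (and weaker) bar actual forecast systems are measured against.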
 
This reminds me of a lecture I heard during a course on Engineering Creativity (fun course).

The Navy in the '30s was trying to model hydrodynamics for torpedo design. Lots of slide-rule action, miles of chalkboard, little to show for it.

Some Navy admiral got wind of this and got impatient. He ordered a 400 lb bar of soap made and delivered to Pearl Harbor, monkeyed around with stabilizers and weights until it towed straight, tied it to the back of a cruiser, and took it out on exercises for a week.

They got back to port with the ideal torpedo shape. Urban legend? Maybe. I don't know. It sounds plausible for the '30s.

Maybe F1 needs to make a sheddable tissue chassis and put it in a wind tunnel. Chuck the models...
 

No one (least of all me) made any claim about these models being simple. If they were simple, they wouldn't need mainframe time. Play the conspiracy card if that helps you sleep at night, but the fact is that nothing in your response refutes mine. So sorry for you.
 
They actually (well, the top-level teams with loads of cash) make up multiple parts and check each one in the wind tunnel. Ferrari and McLaren were famous for running their full-sized moving-roadway wind tunnels 24/7. They tested every component of the car before and after assembly. The F1 teams are so far ahead tech-wise that some have partnered with aviation companies to develop new aircraft.
 
I never did. The pushers of the fraud use that terminology ALL THE TIME. THEY are the ones saying the models are simple... and they are. They are ridiculously simple, and yet you clowns fall all over yourselves saying how profound they are.

You guys are jokes.


C3: Simple Climate Model Continues To Embarrass "Experts" and The IPCC's Billion-Dollar Computer Climate Simulations

Simple Models of Climate

Using a Very, Very Simple Climate Model in the Classroom

http://cybele.bu.edu/courses/gg312fall02/documents/lab02.pdf

Climate Models | WMO

UCAR E&O - Randy Russell - Very Simple Climate Model interactive
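The "very simple climate model" in links like these usually boils down to two lines of arithmetic: the standard logarithmic CO2 forcing approximation dF = 5.35 * ln(C/C0) W/m² (Myhre et al. 1998) multiplied by a climate-sensitivity parameter. The sensitivity value below is an illustrative assumption, not a measured constant.

```python
# Minimal sketch of a zero-dimensional "very simple climate model".
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W/m^2) from a CO2 change relative to c0_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def equilibrium_warming(c_ppm, sensitivity=0.8):
    """Equilibrium temperature change (K); sensitivity in K per W/m^2
    is an assumed, illustrative value here."""
    return sensitivity * co2_forcing(c_ppm)

# Doubling CO2 gives ~3.7 W/m^2 of forcing in this approximation.
print(round(co2_forcing(560.0), 1))  # -> 3.7
```

Models like this capture the forced response only; everything the Von Storch excerpt calls internal variability (ENSO, ocean circulation, and so on) is outside their scope, which is exactly why both sides of this thread can point at them.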
 
The pushers of the fraud use that terminology ALL THE TIME

Like I said, play the conspiracy card if that helps you sleep at night. You can be certain it puts the rest of us fast asleep.
 
Yet another pathetic tactic pulled from the Saul Alinsky playbook. You really do suck at this, oltrakartrollingorogenicblunderfraud...
 

I was wondering whether or not you might be talking about a papier-mâché globe.

Okay, you've got racks and racks of memory, but you didn't answer the question. How much memory was it? What was your tally in mallocs?

And what climate modeling did you do with it?

Your point here is unfounded. You're trying to claim that the results Balmaseda, Trenberth and Källén obtained from the ORAS4 model are inaccurate because they could not have modeled every cubic meter of the ocean. Is that a fair statement?

If so, please explain for us why they would need to model every cubic meter. I know you know they don't. So why'd you say so?

And that "earth resource estimation": that wasn't oil, was it?
 

Saul Alinsky? Really? What does that have to do with anything I've posted? Wow, you truly have gone fishing, dude. :cuckoo:
 

You GoreButts are worse than Joseph McCarthy. YES, I am so deep into the pockets of Big Oil that I exhale lint. I own 3 NASCAR caps with Pennzoil, Quaker State, and Jiffy Lube on them. My entire education was paid for by Al Gore's daddy and his Occidental Petroleum holdings, plus a couple of research grants from that great LEFTIST oilman George Soros.

No climate modeling. Mostly merging multi-spectral satellite data with radar and hi-res imagery to monitor vegetation and drought conditions. LOTS of spatial "warping," resampling, and filtering before we even started to extract useful data.

So, did you get a decent mapping of the Gulf Stream out of the B, T & K study? Can I see it? Did they find any new major heat pathways previously unmapped? Can I see those?

BTK is so easy to remember. Just like in Bind, Torture, Kill...
 
The IPCC is the commissioned source of science on climate change, meant to advise legitimate political entities on the consequences of various paths forward.

What's represented here are illegitimate political entities who have given up their seats at the table of government by pandering to the interests of the few at the cost of the many, including future generations. They have no sense of responsibility, so they can only attack, futilely, science that they're incapable of understanding.

Nobody should be the least bit surprised. They believe only in the power to impose what they believe is best for them on everyone else.

That has nothing at all to do with science or legitimate government.
 
So what flac says is how he was focused on one specialty, and thus assumes a totally different specialty has to work exactly the same way.

It's often called "The engineer's fallacy".
 
Computer models are a great tool to test current understanding. They are a failure for predicting the future of chaotic systems.
 
Chaos, by definition, cannot be predicted in detail. However, science sometimes mistakes systems that are merely very complex for truly chaotic ones.

As I've said, predicting the dynamic response would require multi-year weather forecasts involving land, water, ice, and atmosphere. That's decades away. Maybe forever away.

But no more science is required to know that increasing atmospheric concentrations of GHGs require the earth system to warm, and that history gives good clues as to what will happen when conditions of the past are recreated.
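The weather/climate distinction being argued here has a classic toy demonstration: the Lorenz (1963) system with the standard parameters (sigma=10, rho=28, beta=8/3) is chaotic, so two runs differing by one part per million diverge completely (individual "weather" is unpredictable), yet both remain on the same bounded attractor (the "climate" statistics persist). The step size and integration horizon below are arbitrary choices for a forward-Euler sketch.

```python
# Sensitivity to initial conditions in the Lorenz system: tiny perturbation,
# total divergence of trajectories, but both runs stay bounded.

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz '63 equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)     # perturbed by one part per million
max_sep = 0.0
for _ in range(8000):        # integrate out to t = 40
    a, b = lorenz_step(a), lorenz_step(b)
    d = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    max_sep = max(max_sep, d)

print(max_sep > 1.0)                        # the two runs diverged completely
print(abs(a[0]) < 100 and abs(b[0]) < 100)  # ...yet both remain bounded
```

This is exactly the sense in which forecasting a specific trajectory fails while the statistics of the system remain stable and modelable.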
 
EAT ME....

I've worked with marine mammals.
I've helped blind people to "see".
I've sat in on 30 or more cardiac cath lab procedures.
I've worked with many of America's three-letter spy agencies.
I've developed biometric ID and RFID techniques.
I've done optical computing, Neural Network classifiers/detectors, and radar/sonar processing.
I've got equipment in almost every major hospital and medical research facility.
I've got clients for my services around the world.
I've got papers in image, signal, and display processing journals.
I've worked at Kennedy Space Center.
I've done several oceanography related projects involving 3D mapping and thermoclines.
I've processed and enhanced almost every type of signal and image on earth.
I have advanced degrees in Biomedical and Electrical Engineering.
I've taken all the pre-med reqs PLUS advanced courses in chemistry, biology and physics.

Tell me about "the engineer's fallacy".
I've been blessed with an extraordinary career. I'm not scared of a temperature graph.
 
I am not trying to be dismissive or insulting here, but have you ever considered that your interpretation of how CO2 affects surface temperatures is just as simplistic and wrongly framed as SSDD's interpretation of the SLoT? People who suffer from D-K seldom seem to recognize it in themselves.
 
