But here we have an entire group of idiots who figure that there must be more photons per m^2 if they can read a label better, and if they can't then there are fewer photons.
That would be because it's true. Anyone with eyes understands that having more light makes it easier to read. It's funny watching you go into these idiot contortions, trying to deny something that everyone else learned well before kindergarten. You could just say "oops, I was wrong", but you're emotionally incapable of that, so you dig ever deeper into the stupid hole.
Lose the red herrings. Spectrum has zilch to do with anything we're discussing here. It doesn't matter what wavelength you use. If the walls reflect whatever wavelength you're using, then the light level of that wavelength in the room will increase.

According to them, photons disappear when you shine a red light on red text and have trouble reading it... or when you place a blue object in front of a blue wall and have to look twice before you can make it out.
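The reflection argument above can be sketched as a geometric series: each wall bounce returns a fraction R of the light, so the steady-state level is the source flux times 1/(1 - R). This is a minimal sketch, assuming an idealized enclosure with uniform wall reflectivity; the flux numbers and reflectivities are hypothetical.

```python
# Idealized sketch (hypothetical numbers): steady-state light level in a room
# whose walls return a fraction R of incident light on each bounce.
# Total = S0 * (1 + R + R^2 + ...) = S0 / (1 - R).

def steady_state_flux(source_flux, reflectivity):
    """Total flux after infinitely many wall bounces (0 <= reflectivity < 1)."""
    return source_flux / (1.0 - reflectivity)

white = steady_state_flux(100.0, 0.80)  # white wall: 80% reflected per bounce
black = steady_state_flux(100.0, 0.05)  # black wall:  5% reflected per bounce
print(white)  # 500.0
print(black)  # ~105.3 -- more reflective walls, more light in the room
```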
(Notice how even your loyal lickspittles don't want to jump up on this stupid wagon with you? That should give you a clue about how stupid you look.)
And by the way, sniveling about how mean I am won't get me to stop tearing you to pieces. That just encourages me.
Unfortunately, the Crusty Crab caught me and you being inexact, and of course used that opportunity to pummel us instead of engaging in anything truly useful.
The problem is our sloppy accounting for photons. In your example, if 20% of photons are absorbed by a white wall, we neglect to account for the wavelength conversion that occurs when the energy is re-emitted by the white wall as longwave IR photons.
The deal is: a black wall converts EM radiation to heat faster than a white wall, and as such removes visible photons (and indeed the sum total of all photon energy) from the area faster. For all practical purposes, it changes the distribution of visible photons in the area.
So Mr. Krabbe -- we all know this. There is no advantage in trying to trip us up on not being entirely rigorous. I agree, I was somewhat sloppy, but not incapable of doing it completely correctly.
Mammoth is happy to simply have more visible light, which is true, but what that has to do with back-radiated IR is beyond me.
Now can we get to why you want to deny that the atmosphere is exchanging photon energy with the surface even IF it's usually cooler? If the BLACK WALL is emitting long wave energy to ALL objects in the room --- are some of them IMMUNE from being heated because they are warmer than the wall?
Now look, I'm not in the habit of splitting hairs like a lawyer, but in an exact science like physics there is no room for ambiguity.
Especially when we are talking about temperature increases of just a fraction of one degree over several decades, and discussing the last 15 years, where there was no temperature increase at all despite ppm CO2 going up. In fact, the trend reversed more often than it held steady or increased.
I highlighted in red where you are either dead wrong or phrased it so that it came out dead wrong for anyone who made their (professional) living using physics.
That is fundamentally wrong!... unless you want to switch the subject from the radiative transfer inside that room to heat conduction from the walls to the outside. Then I'll counter, as Roy Spencer etc. would, with "heat insulation", essentially denying you your heat-conduction-to-the-outside argument and confining your thoughts to the inside of that room.

"...(and indeed removes the sum total of all photon energy) faster from the area."
Have you forgotten that the sum of the energy quanta the photons carried in the visual range (shorter wavelength λ1) is still the same total energy after a black body (or wall) converts it to "heat energy" and re-emits it as the now more numerous, longer-wavelength (λ2 > λ1) IR photons? The wavelength swap changes the photon count, not the energy sum.
There is no such process that can make energy disappear...or "magnify" it (like IanC would have it). You can only convert it to another form of energy. There is no question concerning that most fundamental principle in physics.
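The bookkeeping in that wavelength swap can be sketched numerically. This is a minimal example; the photon count and the two wavelengths (0.5 µm visible in, 15 µm IR out) are hypothetical, and only the standard relation E = hc/λ is assumed.

```python
# Energy conservation across a wavelength swap: a wall absorbs visible
# photons and re-emits the SAME total energy as longer-wavelength IR.
# Since each photon carries E = h*c/lambda, the photon count rises by
# the wavelength ratio while the energy sum stays fixed.

h = 6.62607015e-34  # Planck constant, J*s
c = 2.99792458e8    # speed of light, m/s

def photon_energy(wavelength_m):
    return h * c / wavelength_m

n_visible = 1e20                  # hypothetical number of absorbed photons
lam_vis, lam_ir = 0.5e-6, 15e-6   # hypothetical: 0.5 um in, 15 um out

e_total = n_visible * photon_energy(lam_vis)   # total absorbed energy, J
n_ir = e_total / photon_energy(lam_ir)         # IR photons carrying it back out

print(n_ir / n_visible)  # ~30: thirty IR photons per absorbed visible photon
```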
But there are many questions about how much of the incoming sunlight can be re-emitted by a less-than-perfect blackbody Earth, using a crudely estimated albedo, and about "calculating" how much 15 µm IR is emitted and redirected down by CO2 instead of up.
Heinz Hug from the Max Planck Institute measured and calculated that; his numbers are quoted below.

IanC has a lot of trouble understanding the difference between "heat" as in temperature and "heat energy", and goes ballistic every time I quote Heinz or anybody else from the Max Planck Institute (which racked up the most Nobel Prizes in physics), so I thought I'd humor him and redid Heinz's calculations using "effective temperature" instead of watts or watt-seconds/m^2 of heat energy:
Crucial is the relative increment of the greenhouse effect. This is equal to the difference between the sum of slope integrals for 714 and 357 ppm, related to the total integral for 357 ppm. Considering the n3 band alone (as IPCC does) we get

(9.79×10^-4 cm^-1 - 1.11×10^-4 cm^-1) / 0.5171 cm^-1 = 0.17 %

Conclusions

It is hardly to be expected that for CO2 doubling an increment of IR absorption at the 15 µm edges by 0.17% can cause any significant global warming or even a climate catastrophe.

The radiative forcing for doubling can be calculated by using this figure. If we allocate an absorption of 32 W/m^2 [14] over 180° steradian to the total integral (area) of the n3 band as observed from satellite measurements (Hanel et al., 1971) and applied to a standard atmosphere, and take an increment of 0.17%, the absorption is 0.054 W/m^2 - and not 4.3 W/m^2.

This is roughly 80 times less than IPCC's radiative forcing.
http://www.usmessageboard.com/envir...the-atmosphere-is-what-we-28.html#post7305734
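The arithmetic quoted above can be checked mechanically. This sketch uses only the numbers as quoted (the spectroscopic values themselves are Hug's, not independently verified here):

```python
# Re-running the quoted arithmetic: relative increment of the n3-band
# slope integrals, and the resulting forcing if the quoted 32 W/m^2
# band absorption is scaled by that increment.

slope_714 = 9.79e-4   # cm^-1, slope integral at 714 ppm (as quoted)
slope_357 = 1.11e-4   # cm^-1, slope integral at 357 ppm (as quoted)
total_357 = 0.5171    # cm^-1, total integral at 357 ppm (as quoted)

increment = (slope_714 - slope_357) / total_357
print(round(increment * 100, 2))   # 0.17 (%)

band_absorption = 32.0             # W/m^2, quoted satellite-derived value
forcing = band_absorption * increment
print(round(forcing, 3))           # 0.054 (W/m^2)
print(round(4.3 / forcing))        # 80 -- the "roughly 80 times less" ratio
```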
After that, IanC went even more ballistic and there was a barrage of cat-shit crap posts from the Siamese cat in a white-walled kitty litter box where you get more visible photons.

I guess it all boils down to the simple fact that the CO2 15 µm absorption band absorbs more incoming 15 µm solar IR than the earth can produce at a comfortable temperature... which the CO2 is supposed to absorb and "back radiate". At the distance we are from the sun, the CO2 in the atmosphere shields us from about 20 times more IR watts per m^2 @ 15 µm than what a 20 to 30 C warm earth could possibly produce as IR energy in that wave band from the rest of the solar radiation that made it down to the surface.
The peak IR at +20 C is nowhere near the 15 µm CO2 absorption band; it is at 9.88 µm, and it gets shorter the warmer it gets... in other words, even farther away from the absorption band. If you integrate from 14 to 16 µm, straddling the 15 µm peak, all you get is a total band radiance of 12.3786 W/m^2/sr, of which only 6.2 W/m^2/sr is in the center of the CO2 absorption spectral line.
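Both of those figures follow from the standard formulas, and can be reproduced to a close approximation. A sketch, assuming "+20 C" means 293.15 K and using a simple trapezoidal integral (the step count is arbitrary):

```python
import math

# Wien's displacement law for the peak wavelength at +20 C, and a
# numerical integral of the Planck spectral radiance over 14-16 um.

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(lam, temp):
    """Spectral radiance B_lambda, W/m^2/sr per metre of wavelength."""
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * K * temp))

T = 293.15                 # +20 C in kelvin
peak_um = 2897.77 / T      # Wien's law, b = 2897.77 um*K
print(round(peak_um, 2))   # ~9.88 um

# Trapezoidal integral of B_lambda from 14 um to 16 um
n = 2000
step = (16e-6 - 14e-6) / n
lams = [14e-6 + i * step for i in range(n + 1)]
vals = [planck_radiance(l, T) for l in lams]
band = sum((vals[i] + vals[i + 1]) / 2 for i in range(n)) * step
print(round(band, 1))      # ~12.4 W/m^2/sr, close to the quoted 12.3786
```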
But let`s be generous and give them the whole band.
We can also drop the "sr", the solid angle, because the IPCC says it does not matter: all of it is absorbed because the surface is surrounded by CO2.
But they also say that 50% of that goes up and out and the other 50 % radiate back.
That leaves us with 3.1 watts/m^2 "back radiation" from CO2 compared to ~250 watts/m^2 that were shielded by the CO2 in the upper part of our atmosphere.
Next, let's put the amount of energy which is absorbed in the first 10 meters above ground with over 300 ppm CO2 into a temperature perspective. Energy is not necessarily heat as in "hot", but can be expressed as an equivalent black body temperature. Anyway, if you do that conversion, the result is called the "effective temperature"... it is how we estimate how hot distant stars are, by comparing them with a black body that has the same radiation energy profile and a matching peak wavelength.
A black body that has its peak emission at 15 µm, like the evil "man made CO2" that "re-emits" at 15 µm, happens to have an "effective temperature" of -80 C. The ice cubes in my freezer are "effectively" 8 times warmer than CO2 that just absorbed all the IR it could and "back radiates" it.
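The "effective temperature" conversion used here is just Wien's displacement law inverted: given a peak emission wavelength, solve for the blackbody temperature. A minimal sketch:

```python
# Invert Wien's displacement law (lambda_peak = b / T) to get the
# blackbody temperature whose emission peaks at a given wavelength.

WIEN_B = 2897.77  # um*K, Wien displacement constant

def effective_temperature_c(peak_um):
    """Blackbody temperature (deg C) with peak emission at peak_um."""
    return WIEN_B / peak_um - 273.15

print(round(effective_temperature_c(15.0)))  # -80 (deg C), as claimed above
```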
IanC then made the claim that he never denied back radiation and challenged me to dig up the posts where he said there was no such thing; then he changed his song and dance around this subject and claimed he always said there was, but had no idea how much.
I obliged him anyway, and it was a hoot to stack up his own posts: back radiation exists, then no, there is no such thing, then there it was and existed again, but then he called it "radiation impedance".
I did that because it was funny, but I'm a little too busy with something else right now, something my 4-year-old has been waiting on long enough:
DIY E Trike made from old junk - YouTube: http://www.youtube.com/watch?v=_3te8XH1QNo&feature=youtu.be
Like I already said, this forum is full of idiots who make my 4-year-old great-grandson look like a genius. He already knows how to set my multimeter and dial it into the right voltage, current or ohm ranges; he caught me when I tried to cheat him with 12 volts instead of giving him the 24 VDC for his test drive.