moralist

Fiddled Data


Just goes to show, nothing can get between a parasite and their government funding...

THE US HAS ACTUALLY BEEN COOLING SINCE THE THIRTIES, THE HOTTEST DECADE ON RECORD

By Christopher Booker

4:04PM BST 21 Jun 2014

When future generations try to understand how the world got carried away around the end of the 20th century by the panic over global warming, few things will amaze them more than the part played in stoking up the scare by the fiddling of official temperature data. There was already much evidence of this seven years ago, when I was writing my history of the scare, The Real Global Warming Disaster. But now another damning example has been uncovered by Steven Goddard's US blog Real Science, showing how shamelessly manipulated has been one of the world’s most influential climate records, the graph of US surface temperature records published by the National Oceanic and Atmospheric Administration (NOAA).

Goddard shows how, in recent years, NOAA’s US Historical Climatology Network (USHCN) has been “adjusting” its record by replacing real temperatures with data “fabricated” by computer models. The effect of this has been to downgrade earlier temperatures and to exaggerate those from recent decades, to give the impression that the Earth has been warming up much more than is justified by the actual data. In several posts headed “Data tampering at USHCN/GISS”, Goddard compares the currently published temperature graphs with those based only on temperatures measured at the time. These show that the US has actually been cooling since the Thirties, the hottest decade on record; whereas the latest graph, nearly half of it based on “fabricated” data, shows it to have been warming at a rate equivalent to more than 3 degrees centigrade per century.
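Stripped of the rhetoric, the comparison described above is just fitting linear trends to two time series, raw versus adjusted. A minimal sketch with invented illustrative numbers (not actual USHCN data):

```python
import numpy as np

def trend_per_century(years, temps):
    """Least-squares slope of a temperature series, scaled to degrees per century."""
    slope, _intercept = np.polyfit(years, temps, 1)
    return slope * 100.0

years = np.arange(1930, 2010)
# Hypothetical series: the "raw" record cools slightly, the "adjusted" one warms.
raw = 12.0 - 0.002 * (years - 1930)
adjusted = 11.5 + 0.03 * (years - 1930)

print(round(trend_per_century(years, raw), 2))       # slight cooling trend
print(round(trend_per_century(years, adjusted), 2))  # ~3 C/century warming trend
```

Fitting both series the same way reduces the disagreement to a single number per record, which is easier to argue about than two overlaid graphs.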

When I first began examining the global-warming scare, I found nothing more puzzling than the way officially approved scientists kept on being shown to have finagled their data, as in that ludicrous “hockey stick” graph, pretending to prove that the world had suddenly become much hotter than at any time in 1,000 years. Any theory needing to rely so consistently on fudging the evidence, I concluded, must be looked on not as science at all, but as simply a rather alarming case study in the aberrations of group psychology.


It is mathematically possible for the U.S. to cool and the world at large to heat up.
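That is just area weighting: the contiguous U.S. covers under two percent of the Earth's surface, so its regional average can fall while the area-weighted global mean rises. A toy illustration with invented numbers:

```python
# Hypothetical temperature changes (deg C) paired with surface-area fractions.
regions = {
    "contiguous US": (-0.3, 0.016),  # cools; roughly 1.6% of Earth's surface
    "rest of world": (+0.5, 0.984),  # warms
}

# Area-weighted global mean change: a small cooling region barely dents it.
global_change = sum(delta * frac for delta, frac in regions.values())
print(global_change > 0)  # True: the globe warms while the US cools
```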

For example, if the Atlantic thermohaline conveyor fails, the northeastern part of North America and the northwestern part of Europe are in for very cold winters. Hans Brinker will have to dig up his silver skates to get around the canals in Amsterdam and Rotterdam.

The melting of the Arctic ice is the very thing that will shut down the Atlantic ocean conveyor.

Ba'al Chatzaf


The real issue is whether or not you drink the leftist Kool Aid that it is catastrophic and human caused. :wink:

Greg


Again though, if these employees of the Government fraudulently changed data for personal gain, does that not constitute a crime?


It is a crime that 98% of these government "jobs" even exist. Not only are they another entitlement group sucking the life out of the economy of productive people but they actively interfere in the economic actions of same. These "employees" are parasites, their lies are to protect themselves and their privileges.

-[grumpy today]


if these employees of the Government fraudulently changed data for personal gain, does that not constitute a crime?

Lemme think. There are three thousand expert climate scientists funded by five dozen governments, appointed to IPCC, a UN agency.

A two-term U.S. president, his cabinet and party leadership, the National Science Foundation, the Pentagon, NASA, EPA, Natural Resources Defense Council, and a former vice-president are co-conspirators. Global warming has been trumpeted by 5,000 American environmental science professors, 50,000 U.S. reporters, correspondents, and broadcast news analysts, one million U.S. public high school science and social studies teachers, plus two million government educators and official spokesmice in Europe and Asia.

State-funded science and "education" is a crime against humanity.

Ayn Rand had the right idea. The guiltiest of men are the natural oligarchs, who abdicated their leadership of an anarcho-capitalist revolution. Instead of giving Harry Truman the atomic bomb, it could have and should have been developed in a laboratory at Galt's Gulch. This is the moral meaning of inequality. When the men of brains collaborate with a mob of dullards, it's unfair to blame the resultant calamity on a crowd of pickpockets and cheerleaders.

The Constitution of Government in Galt's Gulch, p.146


I thought the 'mechanism' that contributed the most to the Atlantic ocean conveyor was the formation of the Antarctic ice shelf. The surface seawater freezes, 'squeezing' out the salt and creating denser water subsurface that falls to the seabed and then follows the terrain, contributing to the overall flow of the ocean currents.
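The brine-rejection effect described here can be sketched with a simplified linear equation of state; the coefficients below are rough textbook-style values chosen for illustration, not the full TEOS-10 formulation oceanographers actually use:

```python
def seawater_density(temp_c, salinity_psu):
    """Approximate seawater density (kg/m^3) from a linear equation of state.

    rho0, alpha, beta are illustrative reference values only.
    """
    rho0 = 1027.0  # reference density near 10 C and 35 PSU
    alpha = 0.15   # density drop per deg C of warming
    beta = 0.78    # density gain per PSU of added salt
    return rho0 - alpha * (temp_c - 10.0) + beta * (salinity_psu - 35.0)

# Freezing rejects salt into the water left behind, raising its salinity.
before = seawater_density(-1.8, 34.0)
after = seawater_density(-1.8, 36.0)
print(after > before)  # True: the saltier water is denser and sinks
```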


Understood.

However, using the Respondeat Superior:

A legal doctrine, most commonly used in tort, that holds an employer or principal legally responsible for the wrongful acts of an employee or agent, if such acts occur within the scope of the employment or agency.

premise, should indictments be issued for the heads of each American government agency funded from the general fund or from specific project funds?

As for trying to track the money the US pisses away every year, it would probably be difficult to trace in the UN financial cesspool.

Does Respondeat Superior apply criminally?

A...


Does Respondeat Superior apply criminally?

You can't sue the government without its permission and individuals lack standing unless they were uniquely harmed. United States v. Richardson, 418 U.S. 166 (1974). It must be a distinct and palpable personal injury. Gladstone, Realtors v. Village of Bellwood, 441 US 91 (1979). It is not the role of courts, but that of the political branches, to shape the institutions of government in such fashion as to comply with the laws and the Constitution. Lewis v. Casey, 518 US 343 (1996). Citizens cannot challenge the Executive Branch. Schlesinger v. Reservists Comm. to Stop the War, 418 US 208 (1974). The federal courts were not constituted as ombudsmen of the general welfare. Valley Forge Christian College v. Americans United for Separation of Church and State, Inc., 454 US 464 (1982).

However...

US Code, Title 18, Part I, Chapter 47, Section 1001. ... whoever, in any matter within the jurisdiction of the executive, legislative, or judicial branch of the Government of the United States, knowingly and willfully (1) falsifies, conceals, or covers up by any trick, scheme, or device a material fact; (2) makes any materially false, fictitious, or fraudulent statement or representation; or (3) makes or uses any false writing or document knowing the same to contain any materially false, fictitious, or fraudulent statement or entry; shall be fined under this title or imprisoned not more than 5 years, or both.

US Code, Title 18, Part I, Chapter 19, Section 371. If two or more persons conspire either to commit any offense against the United States, or to defraud the United States, or any agency thereof in any manner or for any purpose, and one or more of such persons do any act to effect the object of the conspiracy, each shall be fined under this title or imprisoned not more than five years, or both.

You'd have to get a U.S. Attorney to file a complaint. Rotsa ruck with that.


You'd have to get a U.S. Attorney to file a complaint. Rotsa ruck with that.

I have researched Qui Tam and if the Justice Department refuses to move on it, citizens can...

False Claims Act and Qui Tam

The False Claims Act, sometimes referred to as the FCA, was enacted in 1863 and was amended most recently in 1986. The FCA contains an ancient legal device called the "qui tam" provision, which is shorthand for the Latin phrase:

qui tam pro domino rege quam pro se ipso in hac parte sequitur

he who brings a case on behalf of our lord the King, as well as for himself

The False Claims Act allows a private individual with knowledge of past or present fraud on the federal government to sue on the government’s behalf to recover compensatory damages, civil penalties, and triple damages.

With Congress' 1986 amendment to strengthen this Civil War-era statute, a powerful public-private partnership was put in play to uncover fraud against the federal government and obtain the maximum recovery for the U.S. Treasury. The FCA has become an important tool for uncovering fraud and abuse of government programs. The FCA compensates the private whistleblower, known as the relator, if his or her efforts are successful in helping the government recover fraudulently obtained government funds.
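The arithmetic of an FCA recovery is simple to sketch. The per-claim penalty and the relator's percentage below are placeholders -- the statutory penalty range and the relator share (which depends on whether the government intervenes) vary, so treat this as illustration only:

```python
def fca_recovery(actual_damages, num_false_claims,
                 penalty_per_claim=5_500.0, relator_share=0.20):
    """Illustrative FCA math: treble damages plus per-claim civil
    penalties, with the relator (whistleblower) taking a share."""
    treble_damages = 3 * actual_damages
    penalties = num_false_claims * penalty_per_claim
    total = treble_damages + penalties
    return total, total * relator_share

total, relator_cut = fca_recovery(actual_damages=1_000_000, num_false_claims=40)
print(total)  # 3,000,000 in trebled damages plus 220,000 in penalties
```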

A...


Qui tam was used to harass and nearly ruin one of the best deepwater operators in the Gulf, W&T Offshore.

Here's a law review discussion of malice, intent, knowledge and particularity in FCA pleadings, the upshot of which is:

Rule 9(b) of the Federal Rules of Civil Procedure provides that “[i]n alleging fraud or mistake, a party must state with particularity the circumstances constituting fraud or mistake. Malice, intent, knowledge, and other conditions of a person’s mind may be alleged generally.” Rule 9(b)’s particularity requirement is justified by “the desire to protect the reputation of the defendant” and “the need to afford an opponent adequate notice in order to prepare a responsive pleading.” Due to the minimal legislative history that explains the precise instances in which Rule 9(b) should be applied, courts have developed its purpose over time. In Odom v. Microsoft Corp., the Ninth Circuit noted that the relator must identify the fraudulent party and state the time, place, and content of the fraud to comply with Rule 9(b)...


I know of that situation; Qui Tam can be used as a sword or a shield, just like bankruptcy, and by either side.


Just a question... and please don't get political or philosophical...

What does he mean by "fabricated by computer models"?

Does this mean that the supercomputers in use work properly but are being fed garbage data, or does it mean that the computers simply don't work?


What does he mean by "fabricated by computer models"?

It means that computer programs were created to get the results that were wanted by those who created them, rather than to recognize and accept the results that actually happened.
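Concretely, the "fabricated" values are infilled estimates: when a station report is missing, software generates a replacement from model or neighboring-station values. A toy sketch of neighbor-mean infilling -- hypothetical data and a deliberately simple rule, not NOAA's actual algorithm:

```python
# Monthly temperature anomalies for three nearby stations; None = missing report.
stations = {
    "A": [0.1, 0.3, None, 0.2],
    "B": [0.2, 0.4, 0.5, 0.3],
    "C": [0.0, 0.2, 0.3, 0.1],
}

def infill(stations):
    """Fill each gap with the mean of the other stations for that month,
    tagging every value so estimates stay distinguishable from raw data."""
    filled = {}
    for name, series in stations.items():
        out = []
        for month, value in enumerate(series):
            if value is None:
                neighbors = [s[month] for other, s in stations.items()
                             if other != name and s[month] is not None]
                out.append((sum(neighbors) / len(neighbors), "estimated"))
            else:
                out.append((value, "raw"))
        filled[name] = out
    return filled

result = infill(stations)
print(result["A"][2])  # station A's gap, filled from B and C and flagged
```

The point of the flag is that a series can be audited afterward: count the "estimated" entries and you know how much of the record is measurement versus model.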

J


Woolf:

I am looking for a particular Qui Tam decision which was quite revealing; I just can't figure out where I filed it yet.

FYI:

http://www.natlawreview.com/article/qui-tam-update-recent-developments-unsealed-cases

Apparently, the powers that be are attempting to:

1) restrict the use of Qui Tam through the False Claims Act; and

2) remain moot on its generic common-law application to the Constitution and Amendments...

http://en.wikipedia.org/wiki/Qui_tam

A....


I thought the 'mechanism' that contributed the most to the Atlantic ocean conveyor was the formation of Antarctic ice shelf. The surface seawater freezes 'squeezing' out the salt and creating denser water subsurface that falls to the seabed and then follows the terrian contributing to the over all flow of the ocean currents.

The melting of the Arctic ice will kill the Gulf Stream, which is what keeps northwestern Europe from freezing its collective arse off. If the Gulf Stream shuts down, the northeastern parts of North America -- New England and Eastern Canada -- are in for very cold winters too. The cold snap could conceivably make winters in the mid-Atlantic states nasty. Think of New Jersey turning into New York or Massachusetts.

Ba'al Chatzaf


What does he mean by "fabricated by computer models"?

Temperature data have been collected for a long time, reliably for several centuries. East Anglia looked at the past five decades without considering the fact that cities had expanded, which raised the temperature of certain urban/suburban localities; on that basis they produced the "hockey stick" global warming model, which had to be abandoned because it was proven hilariously wrong.

[Image: real_hockey_stick.jpg]

Others began to study ice cores and geological rock records, theoretical carbon inputs and carbon sinks, volcanic eruptions, cloud cover, cow farts, etc., to arrive at a more comprehensive and coherent climate model. However, there were two series of data that continued to bedevil all the statistical and theoretical work. One was the Medieval Warm Period, when Greenland was ice-free and green. The other was a warming trend in the 1930s that reversed to a very chilly climate in the 1970s. In fact, the East Anglia Climatic Research Unit was founded in the 1970s and funded by UK oil companies to study global cooling.

Confronted with awkward data that explained nothing, the U.S. Climatology Network "adjusted" the data.

[Image: screenhunter_235-jun-01-15-261.gif]

Of course, last winter's "Arctic Vortex" and very deep snowpack in Siberia rubbished their model once again. Reluctantly, IPCC has had to acknowledge that there has been no global warming for the past 15 years, and that maybe -- just maybe -- climate has something to do with solar cycles.

NASA's 2008 prediction of Solar Cycle 24 in red, and what actually happened traced in green. Yep, global cooling.

[Image: screenhunter_35-jan-26-14-05.jpg]

Climate has nothing to do with anthropogenic (human) industry. We emit about 3% of the annual CO2 flux, and CO2 is only a trace fraction of the atmosphere.

The big "climate change" culprit, aside from solar cycles, is -- ta da! -- clouds, over which human beings have no control.

As a human being I'm in favor of global warming. It's easier to keep everybody alive, instead of burning fuel to stay warm.

However, we are completely screwed by EPA regulation of "greenhouse gases" (GHG). This landed in my inbox today:

The Supreme Court "Tailors" EPA's GHG Permitting ProgramYesterday, Justice Scalia, writing for a majority of the United States SupremeCourt, invalidated EPA's greenhouse-gas (GHG) regulations to the extent they requirestationary sources to obtain a Prevention of Significant Deterioration (PSD) and/orTitle V major source permit based solely on the source's GHG emissions. The Court,however, also validated EPA's extension of "best available control technology"(BACT) requirements to GHG emissions at sources already subject to PSD requirementsbased on criteria pollutant emissions (so-called "PSD-anyway" sources). Thus, whileEPA's authority to require BACT controls for GHGs at so-called PSD-anyway sourceswas upheld, the broad scope of authority claimed by EPA was significantly reduced.The case, Utility Air Regulatory Group v. EPA ("UARG"), No. 12-1146, is asignificant development in EPA's efforts at regulating GHGs in the absence ofCongressional action and, as discussed below, raises a number of important issuesand questions.BACKGROUNDTo put this issue into proper context, it is necessary to reach back toMassachusetts v. EPA, 549 U.S. 497 (2007), where the Court held that the mobilesource provisions in Title II of the CAA authorized EPA to regulate GHG emissionsfrom new motor vehicles if EPA found that such emissions contributed to climatechange. Following Massachusetts, EPA officially determined that GHGs from new motorvehicles contribute to climate change (i.e., the Endangerment Finding). After thisthreshold determination, EPA announced that the upcoming GHG standards for motorvehicles would, as a matter of law under the CAA, also trigger PSD and Title Vpermitting requirements for certain stationary sources under Titles I and V of theCAA based solely on those sources potential to emit GHGs (i.e., the TriggeringRule). 
Because strict application of the CAA's numerical thresholds to stationarysources (i.e., 100/250 tpy for PSD, 100 tpy for Title V) would result in a radicalexpansion of the PSD and Title V programs to millions of smaller sources, EPA, inthe Tailoring Rule, wrote into the statute a new major source threshold of 100,000tpy for GHGs.Numerous industries and several states challenged EPA's suite of GHG-related actionsin the United States Court of Appeals for the District of Columbia on severalgrounds. The D.C. Circuit upheld EPA's actions, including EPA's interpretation thatthe CAA mandated application of the PSD and Title V programs to "any regulated airpollutant," including GHGs. The Supreme Court granted certiorari on only onequestion: "Whether EPA permissibly determined that its regulation of greenhouse gasemissions from new motor vehicles triggered permitting requirements under the CleanAir Act for stationary sources that emit greenhouse gases." The Court broke thatquestion down into the following two sub-questions: (1) whether EPA permissiblydetermined that a stationary source may be subject to PSD and Title V permittingrequirements based solely on the source's potential to emit GHGs; and (2) whetherEPA permissibly determined that PSD-anyway sources may be required to limit its GHGemissions by employing "best available control technology" (BACT) for GHGs.COURT'S HOLDING/ANALYSISIn regards to the first sub-question, the Court invalidated EPA's approach,generally finding that EPA had overstepped its CAA authority. The Court's key pointson this first question are identified below:The CAA does not compel or permit EPA to apply the PSD and Title V programs to GHGsin the manner the Agency chose. 
Although Massachusetts held that the general, statute-wide definition of "air pollutant" includes GHGs, the Court here found that the broad definition of "air pollutant" in the CAA "is not a command to regulate, but a description of the universe of substances EPA may consider regulating under the Act's operative provisions." The relevant question is whether EPA's chosen method is reasonable in the context of the CAA. The Court cited numerous instances under various CAA programs (e.g., NSPS, PSD, and Title V) stretching back to 1971 where EPA has narrowed its definition of "air pollutant" to ensure sensible regulation.

EPA's approach is unreasonable because, among other things, "it would bring about an enormous and transformative expansion in EPA's regulatory authority without clear congressional authorization."

Even though the Court had not expressly agreed to review the Tailoring Rule, it nonetheless invalidated that rule, finding that the Agency "lacked authority to 'tailor' the Act's unambiguous numerical thresholds to accommodate its greenhouse-gas-inclusive interpretation of the permitting triggers."

The Court's holding with respect to this first sub-question represents a stunning rebuke of EPA's GHG program. Justice Scalia characterized EPA's approach as "an agency laying claim to extravagant statutory power over the national economy while at the same time strenuously asserting that the authority claimed would render the statute unrecognizable to the Congress that designed it." The Court struck down EPA's approach by reaffirming "the core administrative-law principle that an agency may not rewrite clear statutory terms to suit its own sense of how the statute should operate."
As discussed below, there are significant practical implications stemming from this part of the decision.

Regarding the second sub-question, the Court found that EPA reasonably interpreted the CAA to require sources already subject to PSD requirements based on non-GHG criteria pollutants (i.e., PSD-anyway sources) to comply with BACT emission standards for GHGs. The Court reasoned that subjecting PSD-anyway sources to GHG BACT would not result in a "dramatic expansion" of EPA's jurisdiction, nor would it result in the regulation of millions of previously unregulated sources. The Court noted, however, that EPA can require PSD-anyway sources to comply with BACT for GHGs only if the source emits GHGs above de minimis thresholds, which have not yet been identified by EPA. In this respect, EPA's GHG program for PSD/Title V sources will move forward, albeit in a much narrower fashion.

POTENTIAL IMPLICATIONS

The Court's holding in UARG, particularly its striking down of EPA's approach to the regulation of GHGs from PSD/Title V sources, raises a multitude of questions and potential implications. Prominent among them: how will EPA and the states address stationary sources for which PSD and/or Title V permits have already been issued based solely on GHG emissions? It is clear after this decision that such permits are not authorized under the CAA. Similarly, now that the Tailoring Rule has been held unlawful, what are the implications for the numerous states that have incorporated the rule into their State Implementation Plans (SIPs)?

The decision may also have implications for EPA's efforts to regulate coal-fired power plants under Section 111 of the CAA. UARG is unlikely to have a direct effect on President Obama's recent proposal to curb greenhouse gas emissions from existing power plants (i.e., the Clean Power Plan).
Put simply, the Clean Power Plan involves an entirely separate CAA provision (NSPS) from that at issue in UARG (PSD/Title V). Indirectly, however, the UARG decision may be an important indicator regarding how much leeway the Court is willing to give EPA in addressing GHGs. In particular, the opinion could have implications for the Clean Power Plan's "beyond the fence line" approach to reducing GHG emissions, including its inclusion of demand-side energy efficiency. Even though the Court upheld EPA's authority to regulate GHGs from PSD-anyway sources, it remained cautious about limits on EPA's authority to regulate "energy use." The Court noted that BACT has traditionally been used for "end-of-stack controls" and acknowledged "important limitations" on the CAA's definition of BACT. Specifically, the Court reasoned that it "has long been held that BACT cannot be used to order a fundamental redesign of the facility." This part of the opinion could have important implications should EPA's approach under the Clean Power Plan drift too far from regulating air pollutants into regulating energy use through facility re-design requirements or "beyond the fence" energy-efficiency mandates.

UARG also raises questions regarding the future of EPA's GHG program. During oral argument, the industry petitioners seemed to concede that the CAA grants EPA ample authority under the New Source Performance Standards (NSPS) provisions to regulate GHGs for individual sources; that is what EPA is currently undertaking with its 111(b) and 111(d) Clean Power Plan rules. As the Court acknowledged, however, this first requires EPA to identify pollutants to be regulated on a source-by-source basis. Heretofore, EPA has only begun this process for GHGs, and then only for the largest source category, coal-fired power plants. What does this decision mean for other major sources (e.g., refineries, chemical manufacturing plants), and how far down the line in terms of source sizes and categories can or will EPA go?
The Court acknowledged "the potential for greenhouse-gas BACT to lead to an unreasonable and unanticipated degree of regulation," and cautioned that "our decision should not be taken as an endorsement of all aspects of EPA's current approach, nor as a free rein for any future regulatory application of BACT in this distinct context."

Finally, the UARG decision continues to highlight the significant obstacles inherent in addressing GHG emissions, and climate change more broadly, via executive action under the CAA. In the absence of Congressional legislation placing a price on carbon, it seems inevitable that any EPA effort to address GHG emissions will be challenged. Ultimately, such challenges take significant time and effort, delay important air quality benefits, and create substantial uncertainty for the regulated community. The road ahead hardly seems straight or smooth.

These are just several of the more prominent potential implications stemming from this landmark decision. If you have any questions or would like to discuss what the Court's decision in UARG may mean for your company, please do not hesitate to contact any of the authors of this client alert.


Just a question... and please don't get political or philosophical...

What does he mean by "fabricated by computer models"?

Does this mean that the supercomputers in use work properly but are being fed garbage data, or does it mean that the computers simply don't work?

The story (by Christopher Booker, historian, author and journalist) at the Telegraph contains a link to the blog home of "Steve Goddard," a pseudonymous[1] consultant. It was Goddard's work, not Booker's, that is the original source of the 'fabricated' data claim. Several recent postings at the Goddard blog inspired the Telegraph column, but Booker only mentioned this one, without providing a direct link: "Data Tampering At USHCN/GISS."

Here is the image from that posting that illustrates the main argument[2]:

[Image: 1998changesannotated.gif]

In a nutshell, Goddard prefers raw temperature data over any 'adjustments.' Though there might conceivably be good reasons to 'adjust' raw temperature data -- in the sense of reducing error, bias, poor sampling, and other concerns[3] -- Goddard is having none of that. Adjustment is manipulation is fabrication. We should note he has been touting 'fabrication' for at least a couple of years. Here's a link to and an excerpt from a 2012 story at Forbes that helps explain what he sees in the data:

The bureaucracy at NOAA and NASA who report the U.S. temperature data undertake what they term “correcting” the raw data. These corrections are not just one-time affairs, either. As time goes by, older temperature readings are systematically and repeatedly made cooler, and then cooler still, and then cooler still, while more recent temperature readings are made warmer, and then warmer still, and then warmer still.

Science blogger Steven Goddard at Real Science has posted temperature comparison charts (available here, and here) showing just how dramatically the NOAA and NASA bureaucrats have doctored the U.S. temperature data during the past several decades. As the before-and-after temperature charts show, government bureaucrats with power and funding at stake have turned a striking long-term temperature decline (as revealed by the real-world data), into a striking long-term temperature increase.

It is, of course, possible that certain factors can influence the real-world temperature readings such that a correction in real-world temperature data may be justified.
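To make concrete what such a justified 'correction' might look like, here is a minimal sketch of one common idea behind homogenization: a station move or instrument change shows up as a step in the difference series against a stable neighboring station, and subtracting the estimated step removes it. This is a toy illustration with invented data and a single known breakpoint, not NOAA's actual adjustment algorithm:

```python
# Toy homogenization sketch: NOT NOAA's actual method.
# An instrument change adds a constant offset partway through one record;
# differencing against a stable neighbor isolates and removes the step.
import random

random.seed(0)
n = 120  # months of invented data
climate = [0.002 * t + random.gauss(0, 0.3) for t in range(n)]  # shared signal

neighbor = [c + random.gauss(0, 0.1) for c in climate]
station = [c + random.gauss(0, 0.1) for c in climate]
BREAK = 60
for t in range(BREAK, n):            # instrument change adds a +1.0 C bias
    station[t] += 1.0

# The difference series cancels the shared climate signal, leaving the step
diff = [s - nb for s, nb in zip(station, neighbor)]
before = sum(diff[:BREAK]) / BREAK
after = sum(diff[BREAK:]) / (n - BREAK)
offset = after - before              # estimated artificial jump (~1.0)

# Homogenized record: subtract the estimated offset after the break
homogenized = station[:BREAK] + [x - offset for x in station[BREAK:]]
print(f"estimated offset: {offset:.2f} C")
```

The point of the sketch is only that an adjustment of this kind is an error-reduction step with a physical rationale, which is the category of 'correction' the Forbes excerpt above disputes.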

I personally find Booker to be unreliable and a bit kooky (he is an "intelligent design" creationist), and he has been wrong before in earlier articles cribbing from Goddard.

I kinda gotta file the Telegraph story under 'not news,' but advise those who take it under advisement to read at least a little bit of the 'pushback,' both in the comments at Goddard's blog and elsewhere. Here's an eye-opening article published yesterday at Reason that challenges the Telegraph article and Goddard, and quotes well-known climate skeptic Anthony Watts. (There is of course more critical material, though it can be a hard slog to identify and integrate both 'sides' of this particular question, as with any other polarized schmozzle. Bottom line is that we can be easily convinced that fraud, manipulation, hoaxing and conspiracy surrounds and infects the business/science of Climate Change -- if we want to. Measuring and accurately reporting the degree of the conspiracy is more tricky. Sifting the gold from the dross more tricky still.)

I post the whole dang thing, since it contains links to the important errors and other analyses.

Some segments of the Internet are abuzz with the claim by climate change skeptic Steven Goddard (Tony Heller) over at his Real Science blog that NASA/NOAA have been jiggering the numbers so that they can claim that the warmest years in the continental United States occurred recently, not back in the 1930s. Folks, please watch out for confirmation bias.

Via email, I asked Anthony Watts, proprietor of WattsUpWithThat, what he thinks of Goddard's claims. He responded...

...while it is true that NOAA does a tremendous amount of adjustment to the surface temperature record, the word “fabrication” implies that numbers are being plucked out of thin air in a nefarious way when it isn’t exactly the case.

“Goddard” is wrong in his assertions of fabrication, but the fact is that NCDC isn’t paying attention to small details, and the entire process from B91’s to CONUS creates an inflated warming signal. We published a preliminary paper two years ago on this which you can read here: http://wattsupwiththat.com/2012/07/29/press-release-2/

About half the warming in the USA is due to adjustments. We received a lot of criticism for that paper, and we’ve spent two years reworking it and dealing with those criticisms. Our results are unchanged and will be published soon.

In his email, Watts also cites the strong criticisms of Goddard's earlier claims over at the Blackboard blog:

Goddard made two major errors in his analysis, which produced results showing a large bias due to infilling that doesn’t really exist. First, he is simply averaging absolute temperatures rather than using anomalies. Absolute temperatures work fine if and only if the composition of the station network remains unchanged over time. If the composition does change, you will often find that stations dropping out will result in climatological biases in the network due to differences in elevation and average temperatures that don’t necessarily reflect any real information on month-to-month or year-to-year variability. Lucia covered this well a few years back with a toy model, so I’d suggest people who are still confused about the subject to consult her spherical cow.

His second error is to not use any form of spatial weighting (e.g. gridding) when combining station records. While the USHCN network is fairly well distributed across the U.S., it's not perfectly so, and some areas of the country have considerably more stations than others. Not gridding also can exacerbate the effect of station drop-out when the stations that drop out are not randomly distributed.
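The first error described above, averaging absolute temperatures under a changing station mix, can be reproduced with a toy model in the spirit of Lucia's "spherical cow." This is an invented two-station example, not real USHCN data: a cold high-elevation station drops out halfway through the record, and the plain average of absolute temperatures jumps even though neither station warmed, while the average of anomalies does not:

```python
# Toy model: station drop-out biases averages of absolute temperatures,
# but not averages of anomalies. All numbers are invented.
warm_station = [15.0] * 10    # constant 15 C, reports all 10 years
cold_station = [5.0] * 5      # constant 5 C, drops out after year 5

def network_mean_absolute(year):
    temps = [warm_station[year]]
    if year < len(cold_station):
        temps.append(cold_station[year])
    return sum(temps) / len(temps)

absolute = [network_mean_absolute(y) for y in range(10)]
# Apparent "warming" of 5 C, though neither station changed:
print(absolute[0], absolute[-1])   # 10.0 then 15.0

# Anomalies: each station relative to its own early-period baseline
warm_base = sum(warm_station[:5]) / 5
cold_base = sum(cold_station[:5]) / 5

def network_mean_anomaly(year):
    anoms = [warm_station[year] - warm_base]
    if year < len(cold_station):
        anoms.append(cold_station[year] - cold_base)
    return sum(anoms) / len(anoms)

anomaly = [network_mean_anomaly(y) for y in range(10)]
print(anomaly[0], anomaly[-1])     # 0.0 then 0.0
```

The spurious 5 C "trend" comes entirely from the change in network composition, which is exactly the climatological bias the Blackboard quote describes.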
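The second error, skipping spatial weighting, can be sketched the same way with invented numbers: if one grid cell contributes four stations and another only one, an unweighted station average over-weights the dense cell, while averaging grid-cell means first gives each region equal weight:

```python
# Toy spatial weighting: average grid-cell means instead of raw stations,
# so a densely sampled cell doesn't dominate. All values are invented.
# (cell_id, temperature): cell "east" has 4 stations, "west" has 1.
stations = [
    ("east", 20.0), ("east", 21.0), ("east", 19.0), ("east", 20.0),
    ("west", 10.0),
]

# Unweighted station average: dominated by the eastern cell
flat_mean = sum(t for _, t in stations) / len(stations)    # 18.0

# Gridded average: mean within each cell, then mean across cells
cells = {}
for cell, t in stations:
    cells.setdefault(cell, []).append(t)
cell_means = {c: sum(ts) / len(ts) for c, ts in cells.items()}
gridded_mean = sum(cell_means.values()) / len(cell_means)  # 15.0

print(flat_mean, gridded_mean)
```

If the lone western station then dropped out, the unweighted mean would jump toward the eastern values, which is how non-gridding exacerbates the station drop-out effect described above.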

I note that Watts commented on the, hmmm, accuracy of Goddard's work over at the Blackboard as well:

Anthony Watts (Comment #130003)

June 6th, 2014 at 8:00 am

I took Goddard to task over this as well in a private email, saying he was very wrong and needed to do better. I also pointed out to him that his initial claim was wronger than wrong, as he was claiming that 40% of USHCN STATIONS were missing.

Predictably, he swept that under the rug, and then proceeded to tell me in email that I don’t know what I’m talking about.

Fortunately I saved screen caps from his original post and the edit he made afterwards.

See:

Before: http://wattsupwiththat.files.w.....before.png

After: http://wattsupwiththat.files.w....._after.png

Note the change in wording in the highlighted last sentence.

In case you didn’t know, “Steve Goddard” is a made up name. Supposedly at Heartland ICCC9 he’s going to “out” himself and start using his real name. That should be interesting to watch, I won’t be anywhere near that moment of his.

This, combined with his inability to openly admit to and correct mistakes, is why I booted him from WUWT some years ago, after he refused to admit that his claim about CO2 freezing on the surface of Antarctica couldn’t be possible due to partial pressure of CO2.

http://wattsupwiththat.com/200.....a-at-113f/

And then when we had an experiment done, he still wouldn’t admit to it.

http://wattsupwiththat.com/200.....-possible/

And when I pointed out his recent stubbornness over the USHCN issues was just like that…he posts this:

http://stevengoddard.wordpress.....reeze-co2/

He’s hopelessly stubborn, worse than Mann at being able to admit mistakes IMHO.

In his email to me, Watts details the sort of bureaucratic bungling that produces what he thinks is a significant artificial warming signal in the lower 48 temperature records from which he concludes:

It is my view that while NOAA/NCDC is not purposely “fabricating” data, their lack of attention to detail in the process has contributed to a false warming signal in the USA, and they don’t much care about it because it is in line with their expectations of warming. The surface temperature record thus becomes a product of bureaucracy and not of hard science...Never ascribe malice to what can be explained by simple incompetence.

See my earlier reporting on Watts et al.'s U.S. temperature data paper in my article, "Everyone Freaks Out About Two New Climate Change Studies." In response to criticism of that paper, Watts and his colleagues have, as noted above, recrunched the data and will release a new paper soon.

________________________

[1] -- from a Heartland Institute conference speakers list: "Tony Heller has spent much of the past seven years studying the history of extreme weather, as well as the history and methodology behind the reported NOAA/NASA temperature record. Tony is an expert in computer graphics and high performance computing. He has a B.S. in Geology from ASU, and a Masters in Electrical Engineering from Rice University. He lives in Fort Collins, Colorado and blogs under the pen name of Steve Goddard."

[2] -- two links to explanations of 'corrections'/systematic fraud: National Temperature Trends: The Science Behind the Calculations and Quality Control, Homogeneity Testing, and Adjustment Procedures

[3] -- the Forbes article cited above resulted in some "pushback" and in the process suggested the 'might be smart' reasons for adjustments: Heartland’s James Taylor hits new low with defamatory false accusations against NOAA

Edited by william.scherk


Again though, if these employees of the Government fraudulently changed data for personal gain, does that not constitute a crime?

While true... your point is irrelevant because the legal bureaucracy is impotent in matters like this. Catastrophic human caused global warming is the official leftist secular religion of the State. Your comment constitutes blasphemy against the State and against the political majority who created the dominant secular political religion in their own image. And as such, it would never be recognized in a court of law.

No one can get between a global warming parasite and their government funding... no one.

Greg


Just a question... and please don't get political or philosophical...

What does he mean by "fabricated by computer models"?

Does this mean that the supercomputers in use work properly but are being fed garbage data, or does it mean that the computers simply don't work?

All computer models and scientific theories are fabricated. They do not come into existence by themselves. The question is how well they describe and predict actual events, processes, and measurements.

Ba'al Chatzaf


You're a scoundrel, Scherk.

Climate models all wrongly predicted warming

http://business.financialpost.com/2014/06/16/the-global-warming-hiatus/

Record shows that there has been no warming, either greenhouse or any other kind, for the last 17 years. Now that is a scientific observation of nature that this political charade is keeping a deep, dark secret from the populace, lest they start doubting their party line. Warming advocates today are looking all over for their “lost heat”, even in the ocean bottom. After all, the Arrhenius greenhouse theory tells them that there should be warming because carbon dioxide in the air is constantly increasing. And here we come up against the rules of the scientific method. If your theory tells you to expect warming and nothing happens for 17 years that theory is wrong and should be discarded. It belongs in the waste basket of history. [Arno Arrak]

Failure to reproduce past observations highlights model inadequacies and motivates model improvement, but success in reproducing past states provides only a limited kind of confidence in simulation of future states ... No model can be exactly isomorphic to the actual system ... In terms of scientists involved in writing the actual IPCC reports, very few of them are actual climate model developers, and few of these are actually involved in issues related to the dynamical core and predictability. People outside this core group (the users of climate model results) seem to have more confidence in the models than the model developers themselves. [Judith Curry, Chair, Dept. of Earth & Atmospheric Sciences, Georgia Tech]

