Thursday, June 30, 2011

Despite a marked increase in California grape production, alarmist study predicts collapse of wine industry

A study released today from Stanford University, aka Global Warming Central, claims the California wine industry will collapse due to global warming. This despite marked increases in California grape production over the past 30 years of global warming: approximately 400% along the North Coast, 2,500% along the Central Coast, 1,500% in the Central Valley, and 800% in the Southern Valley of California.


Warmer temperatures threaten Northern California vineyards

In the next 30 years, high-value vineyards in Northern California could shrink by 50% because of global warming, according to a new Stanford University study released Thursday.
Applying scenarios from the Intergovernmental Panel on Climate Change, scientists used a climate system computer model and found that Napa and Santa Barbara counties could experience more very hot days during the growing season, with temperatures reaching 95 degrees Fahrenheit or higher. The number of such sweltering days, they say, would grow by about 10 per season.
As a result, the amount of grape-growing land is projected to decline over the next three decades, the authors wrote.
"There will likely be significant localized temperature changes over the next three decades," said Noah Diffenbaugh, coauthor of the study and a center fellow at the Woods Institute for the Environment at Stanford. "One of our motivations for the study was to identify the potential impact of those changes, and also to identify the opportunities for growers to take action and adapt."
High-value growers in California may need to take warmer weather into account and integrate climate information into their cultivation practices, Diffenbaugh said. Two counties the study found would have cooler temperatures, Yamhill County in Oregon and Walla Walla County in Washington, can prepare for more favorable growing seasons.
"It's risky for a grower to make decisions that consider climate change, because those decisions could be expensive and the climate may not change exactly as we expect," Diffenbaugh said. "But there's also risk in decisions that ignore global warming, because we're finding that there are likely to be significant localized changes in the near term."
The peer-reviewed study, which has yet to undergo the scrutiny of the larger scientific community, is based on the Copenhagen Accord greenhouse gas target of 2 degrees Celsius above the pre-industrial baseline, or about a 23% increase in atmospheric greenhouse gases by 2040. This could raise the average global temperature by 1.8 degrees Fahrenheit, a conservative scenario, said Diffenbaugh.
Researchers compared the computer model’s simulations with actual weather data collected between 1960 and 2010 to see if their model could accurately replicate past temperatures. They combined new and historical data and found that all four counties were likely to experience higher average temperatures during growing seasons.
Certain varietals, such as Pinot Noir and Cabernet Sauvignon in Napa Valley, grow at average temperatures of 68 F with fewer than 30 hot days per season. But with temperatures projected to rise by 2 degrees and hot days to increase by about 10, hospitable growing conditions would shrink.
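As an aside, the study's suitability bookkeeping reduces to two numbers per growing season: the count of days at or above 95 F, and the season-average temperature. Here is a minimal Python sketch of that arithmetic; the thresholds are the ones quoted in the article, while the synthetic season and the use of daily highs as the temperature measure are illustrative assumptions.

```python
import numpy as np

def classify_season(daily_max_f):
    """Count hot days (>= 95 F) and the season average, then compare
    against the varietal thresholds quoted in the article."""
    t = np.asarray(daily_max_f, dtype=float)
    hot_days = int((t >= 95.0).sum())
    avg = t.mean()
    if hot_days < 30 and avg <= 68.0:
        label = "premium varietals (e.g., Pinot Noir, Cabernet Sauvignon)"
    elif hot_days <= 45 and avg <= 71.0:
        label = "heat-tolerant varietals only"
    else:
        label = "marginal for wine grapes"
    return hot_days, round(avg, 1), label

# Hypothetical 214-day (April-October) season, then the same season
# shifted 2 degrees warmer as the study projects.
rng = np.random.default_rng(0)
season = 68 + 10 * rng.standard_normal(214)
print(classify_season(season))
print(classify_season(season + 2.0))
```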
On the other hand, Yamhill and Walla Walla counties will see more land suitable for high-value varietals.
These projections could have a large effect on California's $16.5-billion wine industry, which, with more than 500,000 acres of vineyards, produces on average more than 5 million gallons per year and accounts for nearly 90% of the nation's total wine production, according to the Wine Institute, a state winemakers trade organization.
Diffenbaugh suggests winemakers adapt to warmer conditions by planting heat-tolerant vines that can survive up to 45 very hot days and average temperatures of 71 F, but these varietals can lower the quality of wine. Growers can also use trellis systems to shade vines, use irrigation to cool plants, and adjust fermentation processes.
http://latimesblogs.latimes.com/greenspace/2011/06/global-warming-wine-vineyards-california-napa-valley-santa-barbara-1.html

Tuesday, June 28, 2011

Paper shows Greenland temperatures were higher in the 1930s and 1700s

A paper published last month in the journal Climate of the Past presents three different temperature reconstructions of southwest Greenland, which show temperatures were higher than the present in the 1930s, 1940s, 1700s, and at multiple other times over the past 700 years, when CO2 levels were extremely safe.
SW Greenland temperature reconstruction, 1245-1970 (top graph)
SW Greenland temperature reconstruction, 1908-2005 (middle graph)
SW Greenland summer temperature reconstruction, 1784-2005 (bottom graph)

New paper finds significant temperature response to the 11-year solar cycle

A paper published last week in the Journal of Geophysical Research examines long-term temperature trends over the past 30 years at three northern hemisphere observation sites and finds a "significant temperature response to the 11 year solar cycle." The paper also finds a cooling trend in the middle atmosphere over the period 1981-2009 at 2 of the 3 sites, and "near zero" change at the third. Meanwhile, the IPCC claims the 0.1% change in total solar irradiance over a typical solar cycle has no significant effect on temperature, while ignoring potential amplification by large changes in solar UV over solar cycles [which secondarily affect ozone production], potential secondary cloud effects, and a long-term increase in solar activity over the past several millennia.



JOURNAL OF GEOPHYSICAL RESEARCH, VOL. 116, D00P05, 11 PP., 2011
doi:10.1029/2010JD015275
Key Points
  • The temperature trend and solar cycle were studied with Rayleigh lidar data sets
  • The temperature cooling trend was found at TMF and OHP and was near zero at MLO
  • Positive response was found with negative response in winter stratosphere at OHP
Tao Li
School of Earth and Space Sciences, University of Science and Technology of China, Hefei, China, State Key Laboratory of Space Weather, Chinese Academy of Sciences, Beijing, China
Thierry Leblanc
Table Mountain Facility, Jet Propulsion Laboratory, California Institute of Technology, Wrightwood, California, USA
I. Stuart McDermid
Table Mountain Facility, Jet Propulsion Laboratory, California Institute of Technology, Wrightwood, California, USA
Philippe Keckhut
Laboratoire Atmosphères, Milieux, Observations Spatiales, Institut Pierre-Simon Laplace, Guyancourt, France
Alain Hauchecorne
Laboratoire Atmosphères, Milieux, Observations Spatiales, Institut Pierre-Simon Laplace, Guyancourt, France
Xiankang Dou
School of Earth and Space Sciences, University of Science and Technology of China, Hefei, China
The long-term temperature profile data sets obtained by Rayleigh lidars at three different northern latitudes within the Network for the Detection of Atmospheric Composition Change were used to derive the middle atmosphere temperature trend and response to the 11 year solar cycle. The lidars were located at the Mauna Loa Observatory, Hawaii (MLO, 19.5°N); the Table Mountain Facility, California (TMF, 34.4°N); and the Observatoire de Haute Provence, France (OHP, 43.9°N). A stratospheric cooling trend of 2–3 K/decade was found for both TMF and OHP, and a trend of ≤0.5 ± 0.5 K/decade was found at MLO. In the mesosphere, the trend at TMF (3–4 K/decade) was much larger than that at both OHP and MLO (<1 K/decade). The lidar trends agree well with earlier satellite and rocketsonde trends in the stratosphere, but a substantial discrepancy was found in the mesosphere. The cooling trend in the upper stratosphere at OHP during 1981–1994 (∼2–3 K/decade) was much larger than that during 1995–2009 (≤0.8 K/decade), coincident with the slightly increasing upper stratospheric ozone density after 1995. Significant temperature response to the 11 year solar cycle was found. The correlation was positive in both the stratosphere and mesosphere at MLO and TMF. At OHP a wintertime negative response in the upper stratosphere and a positive response in the middle mesosphere were observed during 1981–1994, but the opposite behavior was found during 1995–2009. This behavior may not be a direct solar cycle response at all but is likely related to an apparent response to decadal variability (e.g., volcanoes, modulated random occurrence of sudden stratospheric warmings) that is more complex.
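For readers curious how a "trend plus solar response" is extracted from a lidar temperature record, the standard approach is a multiple linear regression of temperature against time and a solar activity proxy. The sketch below uses synthetic monthly data and an idealized 11-year sinusoid as the proxy; it illustrates the method only and is not the authors' code or data.

```python
import numpy as np

# Synthetic monthly series spanning 1981-2009, for illustration only.
rng = np.random.default_rng(1)
t = np.arange(29 * 12) / 12.0                    # years since 1981
solar = np.sin(2 * np.pi * t / 11.0)             # idealized 11-year cycle
temp = -0.25 * t + 1.2 * solar + rng.standard_normal(t.size)  # K

# Least-squares fit of T(t) = a + b*t + c*solar(t)
X = np.column_stack([np.ones_like(t), t, solar])
(a, b, c), *_ = np.linalg.lstsq(X, temp, rcond=None)
print(f"trend: {10 * b:+.2f} K/decade, solar response: {c:+.2f} K per cycle amplitude")
```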

New solar energy technology uses magnetic properties of light

Off-topic but interesting...
A new device called an "optical capacitor" can amplify the normally weak magnetoelectric effect of light by a factor of a million to generate electricity, while the remaining light can be passed to conventional solar cells, which use the photoelectric effect. While the device is still on the drawing board and the calculations are theoretical, a 10% solar energy conversion efficiency using concentrating optics is predicted.

OPTOELECTRONIC THEORY: Optical capacitor leverages light's magnetic field

Jun 1, 2011  Laser Focus World

Imagine converting the Sun’s energy to electricity without the need for expensive semiconductor materials or the usual absorption-exciton-drift-current conversion steps. This is precisely what researchers at the University of Michigan (Ann Arbor, MI) are doing: developing a capacitive energy source, or “optical capacitor,” based on their discovery that the heretofore-considered weak magnetic component of light can—under the right conditions—be exploited to generate an intense magnetoelectric effect [1]. Not only is efficient optical charge separation possible, but the effect can also produce magnetically induced terahertz radiation.

The right conditions
The better known but weak inverse Faraday effect produces a small, quasi-static magnetic field along the direction of propagation of light (perpendicular to the magnetic component of its electromagnetic field) when the light is circularly polarized. But when linearly polarized light of an intensity as low as 10^7 W/cm^2 travels in certain transparent dielectrics, charge oscillations driven by the electric and weak magnetic components of the light can drive a dipolar magnetization parallel to the optical magnetic field in bound electron systems, creating intense magnetic dipole radiation—a magnetoelectric effect a million times more intense than the weak inverse Faraday effect.


 In this magnetoelectric interaction, each illuminated atom or molecule in the dielectric medium acquires a large static electric dipole moment—a static polarization that is sustained by a complex sequence of events: 1) the electric field initiates electron motion parallel to the electric field; 2) the magnetic field component causes a deflection of the electron (from the Lorentz force) that grows in amplitude due to a coupling between electric and magnetic-field driven motions (analogous to the coupled mechanical vibrations that destroyed the Tacoma Narrows Bridge in Washington State in 1940); and 3) the average position of the electron shifts away from the nuclei of the surrounding atoms and molecules, effectively accomplishing charge separation and creating an optically charged capacitor that could be used as an electrical energy source.


The trajectory of electron motion (away from its nucleus at x = y = z = 0) in a dielectric material illuminated by an incident electric field strength of (a) 1 V/m is compared to illumination with a strength of (b) 10^8 V/m. For the low-intensity light field, the x and z axes differ by nine orders of magnitude and the electron moves along the electric field x. At a higher light intensity, motion shifts to z, the direction of light propagation, and is much larger than expected. (Courtesy of the University of Michigan)

This magnetoelectric effect is quantified by the researchers through a classical Lorentz oscillator model (see figure). When electron motion is compared using low-intensity light incident on the dielectric (1 V/m) versus high-intensity light (10^8 V/m), the electron motion switches from being along the electric field (x) as expected to having a large component along the direction of motion of the light beam (z). At higher light intensities, this magnetically driven response attains its maximum amplitude with respect to the linear electric polarization of the dielectric medium, and the electron motion is sustained in a curved path beyond the nucleus. By applying magnetic-susceptibility equations, the surface-charge density possible in a plane-parallel dielectric slab illuminated by a known intensity of coherent light can be calculated.
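The geometry of that Lorentz-oscillator picture can be reproduced qualitatively with a few lines of numerical integration: a bound electron driven by the optical E field along x also feels a v x B force that pushes it along the propagation direction z. The sketch below is a linear toy version with illustrative parameter values (the resonance frequency, damping, and field strength are assumptions); it shows the geometry of the coupling only, not the resonant million-fold enhancement the researchers report, which requires conditions beyond this simple model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Bound electron driven by light: restoring force, damping, electric
# force along x, and the magnetic (v x B) force that couples motion
# into the propagation direction z. Parameter values are illustrative.
q, m, c = -1.602e-19, 9.109e-31, 2.998e8
w0 = 9.4e15    # bound-electron resonance, rad/s (assumed, UV-range)
g = 1e13       # damping rate, 1/s (assumed)
w = 2.35e15    # drive frequency for 800 nm light, rad/s
E0 = 1e8       # optical field strength, V/m (the figure's "high" case)

def rhs(t, s):
    x, y, z, vx, vy, vz = s
    E = E0 * np.cos(w * t)         # E along x
    B = (E0 / c) * np.cos(w * t)   # B along y, magnitude |E|/c
    ax = -w0**2 * x - g * vx + (q / m) * (E - vz * B)
    ay = -w0**2 * y - g * vy
    az = -w0**2 * z - g * vz + (q / m) * (vx * B)   # pushes along z
    return [vx, vy, vz, ax, ay, az]

T = 50 * 2 * np.pi / w   # integrate over fifty optical cycles
sol = solve_ivp(rhs, (0, T), [0.0] * 6, max_step=T / 5000)
print(f"max |x| = {np.abs(sol.y[0]).max():.2e} m")
print(f"max |z| = {np.abs(sol.y[2]).max():.2e} m")
```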
 If ultrashort pulses are used as the illumination source, an additional capability beyond charge separation is possible: the generation of terahertz radiation. When any electric dipole induced by light is short-lived (transient), it becomes an antenna. If the inducing pulses are short enough, the frequency of the radiation emitted by the antenna has a breadth Δν, as determined by the uncertainty principle (Δν is approximately 1/Δt), that can extend to the terahertz range.
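The bandwidth estimate is simple arithmetic; a quick numeric check with a few assumed pulse durations shows how sub-picosecond pulses reach terahertz bandwidths.

```python
# Uncertainty-principle estimate from the text: bandwidth ~ 1 / pulse width.
for dt in (1e-12, 100e-15, 10e-15):   # assumed pulse durations, seconds
    print(f"{dt:.0e} s pulse -> bandwidth ~ {1 / dt / 1e12:.0f} THz")
```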


Optical-to-electrical conversion

 In theory, if a 1 kW, 800 nm Gaussian source with a pulse-repetition rate of 25 MHz is focused to a spot size of 50 µm in a 10-m-long sapphire optical fiber, the calculated optical-to-electrical conversion efficiency of the magnetoelectric system would be around 30%; additional calculations show that ordinary sunlight focused into a large-core optical fiber through concentrating optics could result in 10% conversion efficiency values, despite the incoherent nature of sunlight. While the physical experiment has not yet been completed, the researchers are confident of the result.
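A back-of-envelope check, under the assumption that the 50 µm spot size is a diameter, shows the proposed geometry comfortably clears the ~10^7 W/cm^2 threshold quoted above, even before accounting for the higher peak intensity of the individual pulses.

```python
import numpy as np

P_avg = 1e3              # W, average power from the article
spot_diameter = 50e-6    # m (assumed to be a diameter, not a radius)
rep_rate = 25e6          # Hz, pulse repetition rate from the article

area_cm2 = np.pi * (spot_diameter / 2) ** 2 * 1e4    # m^2 -> cm^2
print(f"average intensity ~ {P_avg / area_cm2:.1e} W/cm^2")   # ~5e7
print(f"energy per pulse = {P_avg / rep_rate * 1e6:.0f} uJ")  # 40 uJ
```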
 “Since the light is not absorbed during the magnetoelectric interaction in transparent conversion media, much less heat is expected than in photovoltaic technology and light will simply pass through the converter when electrical power is not actively being drawn from it,” says professor Stephen Rand at the University of Michigan. “This is potentially more eco-friendly. Any light not directly converted to electricity could be used for growing crops or passed through to solar cells to boost the overall conversion efficiency of hybrid systems. Magnetoelectric technology is also an interesting contender for more efficient, transportable power generation in beamed laser-power applications or for solar conversion in harsh environments like space, where semiconductor circuits degrade more rapidly than radiation-hard dielectric materials.” —Gail Overton

REFERENCE:   1. W.M. Fisher and S.C. Rand, J. Appl. Phys., 109, 064903 (2011).



However, deployment of solar energy technology is likely to decline as government subsidies are eliminated

An inconvenient cooling

The Washington Times picks up on the Hockey Schtick post Solar Physicist Dr. C. de Jager predicts Grand Solar Minimum will last until 2100

Sun’s coming quietude burns global warmists

Reports of imminent climatic catastrophes are turning out to be rather anticlimactic. That’s because rather than heating up to life-threatening levels, new scientific findings indicate it’s more likely the Earth will cool in coming years. That’s bad news for a global-warming industry heavily invested in a sultry forecast.

Cornelis de Jager, a solar physicist from the Netherlands and former secretary-general of the International Astronomical Union, announced that the sun is about to enter a period of extremely low sunspot activity, which historically is associated with cooling trends. Backed by other scientists, he predicted the “grand solar minimum” is expected to begin around 2020 and last until 2100.

The ebb of solar activity is shaping up to resemble what occurred during the Little Ice Age, the period from 1620 to 1720 when sunspot activity diminished and temperatures dropped an estimated 3 degrees Celsius. The era was noted for colder-than-usual winters in North America and Europe, when rivers and canals froze over, allowing for ice-skating and winter festivals. It also resulted in crop failure and population displacement in northern regions such as Iceland. To characterize the impending grand solar minimum as an “ice age” - with glaciers forming at temperate latitudes - would be an exaggeration. The correlation between decreased sunspot activity and falling temperatures means it’s likely to get colder when the sunspots begin to disappear.

Global-warming zealots are steamed. They’ve already cleverly rebranded their movement as “climate change” in order to appear relevant no matter what the thermometer reads, but the recent findings could undermine the basis for their cause. They assert that man-made greenhouse gases - not that big fireball in the sky - are responsible for heating up the Earth and threaten to end life as we know it. After nearly a generation of politically driven growth, countless careers and billions of dollars have been sunk into this fairy tale. Nothing would discredit the story more quickly than tumbling mercury.

Consequently, those invested in global warming have vigorously assaulted the news of an approaching solar minimum with hammer and ice pick. Correlation does not imply causation, they argue, insisting that the Little Ice Age was not caused by diminished sunspot activity but by volcanoes that spewed sun-reflecting clouds into the atmosphere during that era. Tellingly, warmists dismiss skeptics who suggest elevated atmospheric carbon dioxide is the result rather than the cause of rising temperatures.

Uncertainty about whether human activity could cause the Earth to heat up warrants healthy skepticism, but warmism is rooted in faith, not fact. Peeling back its green disguise, the movement wants to impose globe-spanning environmental regulations with a view toward turning the clock back to the time before the Industrial Revolution. As Al Gore preached in his 1993 book, “Earth in the Balance,” “We must make the rescue of the environment the central organizing principle for civilization.”

The goal of society has always been to improve the human condition and for one generation to leave a better world for the next. Crippling the engines of progress, particularly in the production of affordable energy, will lead not to paradise on Earth, but to poverty and squalor. If it takes a chilly breeze to silence the retrograde movement, Mr. de Jager’s news is welcome indeed.

Monday, June 27, 2011

Single whale spotting leads to scientific hissy fit

The gray whale, a highly endangered species, had been hunted to extinction in the Atlantic Ocean by the mid-1700s. Yet one appeared off the Israeli coast in May 2010.
Wandering gray whale points to climate change  
Detroit Free Press 6/27/11

AMSTERDAM, Netherlands -- When a 43-foot gray whale was spotted off the Israeli town of Herzliya last year, scientists came to a startling conclusion: It must have wandered across the normally icebound route above Canada, where warm weather briefly opened a clear channel three years earlier.

On a microscopic level, scientists also have found plankton in the North Atlantic where it had not existed for at least 800,000 years. [Why was it there 800,000 years ago?]

The whale's odyssey and the surprising appearance of the plankton indicate a migration of species through the Northwest Passage, a worrying sign of how global warming is affecting animals and plants in the oceans as well as on land.

"The implications are enormous. It's a threshold that has been crossed," said Philip Reid of the Alister Hardy Foundation for Ocean Science in Plymouth, England.

"It's an indication of the speed of change that is taking place in our world in the present day because of climate change," he said in a telephone interview Friday. [that speed of change would be 0.6C in the past 160 years and no change over the past decade]

Reid said the last time the world witnessed such a major incursion from the Pacific was 2 million years ago [which is it 800,000 or 2 million?], which had "a huge impact on the North Atlantic," driving some species to extinction as the newcomers dominated the competition for food.

Reid's study of plankton and the research on the whale, co-authored by Aviad Scheinin of the Israel Marine Mammal Research and Assistance Center, are among nearly 300 scientific papers written during the last 13 years that are being synthesized and published this year by Project Clamer, a collaboration of 17 institutes on climate change and the oceans.

Changes in the oceans' chemistry and temperature could have implications for fisheries, as species migrate northward to cooler waters, said Katja Philippart of the Royal Netherlands Institute of Sea Research, who is coordinating the project funded by the European Union.

"We don't say whether it's bad or good. We say there is a high potential for change," she said. [yes, there is a high potential the climate will continue to change as it has since the beginning of time]

The Northwest Passage, the route through the frigid archipelago from Alaska across northern Canada, has been ice-free from one end to the other only twice in recorded history, in 1998 and 2007. [false] But the ice pack is retreating farther and more frequently in summer. [false]

Plankton that previously had been found only in Atlantic seabed cores from 800,000 years ago [why were they there 800,000 years ago?] appeared in the Labrador Sea in 1999 -- and then in massive numbers in the Gulf of St. Lawrence two years later. Now it has established itself as far south as the New York coast, Reid said.

The endangered gray whale sighted off the Israeli coast in May 2010 belonged to a species that was hunted to extinction in the Atlantic by the mid-1700s. [are you certain a few didn't survive in the Atlantic? on what basis do you assume gray whales didn't traverse the Northwest Passage anytime prior to the mid-1700's?]

Sunday, June 26, 2011

When climate science was a science instead of a political grandstand

One year before James Hansen testified to Congress in 1988 that he was 99% certain man-made global warming was occurring, a paper published in Nature indicated that global warming would cause a negative feedback from increased cloud cover, cooling the Earth by more than the alleged warming effect of CO2.

Buying carbon offsets to assuage your green guilt? Study says don't bother

Study: trees not cure for global warming

BY MARGARET MUNRO, POSTMEDIA NEWS  JUNE 18, 2011

Planting trees may help appease travellers' guilt about pumping carbon into the atmosphere.

But new research suggests it will do little to cool the planet, especially when trees are planted in Canada and other northern countries, says climatologist Alvaro Montenegro, at St. Francis Xavier University in Nova Scotia.

"There is no magic bullet" for global warming, says Montenegro, "and trees are certainly not going to be providing it."

He assessed the impact of replanting forests on crop and marginal lands with Environment Canada researcher Vivek Arora. Their study, published Sunday in Nature Geoscience, concludes "afforestation is not a substitute for reduced greenhouse-gas emissions."

The United Nations, environmental groups and carbon-offset companies are invested heavily in the idea that planting trees will help slow climate change and global warming. International authorities have long described "afforestation" as a key climate-change mitigation strategy.

But the study says the benefits of tree planting are "marginal" when it comes to stopping the planet from overheating.

Trees do suck carbon [dioxide] out of the air, but the study highlights that their dark leaves and needles also decrease the amount of solar radiation that gets reflected by the landscape, which has a warming effect.

Cropland - especially snow-covered cropland - has a cooling effect because it reflects a lot more solar energy than forests, the scientists say. This so-called "albedo effect" is important and needs to be incorporated into assessments of tree planting programs and projects, the researchers say.

Montenegro and Arora stress that planting forests has many benefits - trees provide habitat for wildlife and prevent soil erosion. And planting forests does help reduce atmospheric levels of carbon dioxide because carbon is locked into wood as trees grow.

But planting trees will have only a modest effect on the global temperature, according to their study, which used a sophisticated climate modelling system developed by Environment Canada. [see Top 10 Reasons Why Climate Model Predictions are False]

CO2 levels have risen at the same rate for past 18,000 years

While climate alarmists claim CO2 levels have risen at an unprecedented rate since the industrial revolution, extrapolation from ice core data shows the rate of rise has been steady since the peak of the last ice age about 18,000 years ago. This shows that man-made CO2 (~3% of total emissions) has had little effect on atmospheric levels of CO2, which instead are driven by outgassing from the oceans during interglacial periods; interglacial periods are driven by solar insolation, not CO2.
Atmospheric CO2 from 3 ice-core studies shown on vertical axis, thousands of years ago shown on horizontal axis. Linear extrapolation from the peak of the last ice age ~18,000 years ago shows that the rate of rise in CO2 has not changed over the past 18,000 years. Notations in red added. Graph source.
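Reduced to arithmetic, the post's extrapolation argument looks like this; the endpoint values are rounded figures implied by the graph (about 190 ppm at the glacial maximum, about 390 ppm today) and are assumptions for illustration.

```python
# Linear-extrapolation arithmetic behind the post's argument.
co2_glacial_ppm = 190.0    # assumed value at the last glacial maximum
co2_modern_ppm = 390.0     # assumed value for ~2011
age_glacial_yr = 18_000

rate = (co2_modern_ppm - co2_glacial_ppm) / age_glacial_yr
print(f"implied average rise: {rate * 1000:.1f} ppm per thousand years")
```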

Saturday, June 25, 2011

The IPCC and Greenpeace: Renewable Outrage

The Economist  6/17/11

THE release of the full text of the Intergovernmental Panel on Climate Change’s Special Report on Renewable Energy this week has led to a new set of questions about the panel’s attitudes, probity and reliability: is it simply a sounding board for green activists? The answer is no—but that doesn’t mean it’s without serious problems. For what’s worst about the affair, and for comments by IPCC chair Rajendra Pachauri, scroll down to the lower bits of the post.

When the summary of the report was released last month (IPCC summaries, agreed line by line by governments at often quite fractious plenary meetings, come out before the report they are summarising, in part because the report may need a little tweaking to reflect the plenary’s summary judgements) it came with a press release proclaiming that the world could get 80% of its energy from renewables by 2050 if it just had the right policies and paid the right amount. This figure was subsequently trumpeted by those parts of the world’s press paying attention, which tended to be the parts that have readers keen on more environmental action.

The full report shows where the number came from, and that’s why its publication sparked a fuss. One of the report’s 11 chapters is an analysis of 164 previously published scenarios looking at the energy mix over the next four decades under various assumptions. The scenario which had the highest penetration of renewables put the total at 77% by 2050. The research involved was done by the German space-research institute, which has long worked on energy analysis, too; its experts were commissioned to do the work by Greenpeace, and a Greenpeace staff member with an engineering background, Sven Teske, was the scenario’s lead author when it was published in a couple of different forms in peer-reviewed journals. It has also been published, in bigger, glossier format, by Greenpeace itself under the grating and uncharacteristically fence-sitting title Energy [R]evolution.

Mr Teske was also one of the authors of the chapter of the IPCC report that looked at those 164 scenarios, and that chose Energy [R]evolution as one of four scenarios to explore in more detail. That, say critics, looks like a fix. And one with big consequences. That one scenario’s claim that the world could get call-it-80% of its energy from renewables managed, thanks to the press release, to shape perceptions of the report when it was originally released, making it look like a piece of renewables boosterism. Worse: who wrote the foreword to Greenpeace’s glossy publication of its scenario? Rajendra Pachauri, the chair of the IPCC. (Disclosure: at the request of IPCC authors, this avatar of Babbage chaired a debate on the summary of the special report when it was launched in May, and his brother is a “co-ordinating lead author” on the panel’s forthcoming “fifth assessment report”, though not in an area associated with renewable energy.)

Steve McIntyre, who runs a blog on which he tries to hold climate science to higher standards than he sees it holding itself, picked up all these IPCC/Greenpeace connections and posted on them angrily, calling for all involved to be sacked. “As a citizen,” he says, “I would like to know how much weight we can put on renewables as a big-footprint solution. Prior to the IPCC report, I was aware that Greenpeace—and WWF—had promoted high renewable scenarios. However, before placing any weight on them, the realism of these scenarios needs to be closely examined. IPCC has a mandate to provide hard information but did no critical evaluation of the Greenpeace scenario."

His desire for solid, honest answers is plainly one to be shared. But the authors of the IPCC chapter involved declined to evaluate the scenarios they looked at in terms of whether they thought they were plausible, let alone likely. Ottmar Edenhofer, a German economist who was one of those in overall charge of the report, gives the impression that he would have welcomed a more critical approach from his colleagues; but there is no mechanism by which the people in charge can force an author team to do more, or other, than it wants to. (The same goes for authors on the team, Mr Teske says; he was one of twelve authors on the relevant chapter, and over 120 authors overall, and had no peculiar Greenpeace lantern with which to bend them all to his will.)

read remainder at economist.com

The Facts About Fracking

The real risks of the shale gas revolution, and how to manage them

WSJ.com Review & Outlook 6/25/11

The U.S. is in the midst of an energy revolution, and we don't mean solar panels or wind turbines. A new gusher of natural gas from shale has the potential to transform U.S. energy production—that is, unless politicians, greens and the industry mess it up.

Only a decade ago Texas oil engineers hit upon the idea of combining two established technologies to release natural gas trapped in shale formations. Horizontal drilling—in which wells turn sideways after a certain depth—opens up big new production areas. Producers then use a 60-year-old technique called hydraulic fracturing—in which water, sand and chemicals are injected into the well at high pressure—to loosen the shale and release gas (and increasingly, oil).

***
The resulting boom is transforming America's energy landscape. As recently as 2000, shale gas was 1% of America's gas supplies; today it is 25%. Prior to the shale breakthrough, U.S. natural gas reserves were in decline, prices exceeded $15 per million British thermal units, and investors were building ports to import liquefied natural gas. Today, proven reserves are the highest since 1971, prices have fallen close to $4 and ports are being retrofitted for LNG exports.

The shale boom is also reviving economically suffering parts of the country, while offering a new incentive for manufacturers to stay in the U.S. Pennsylvania's Department of Labor and Industry estimates fracking in the Marcellus shale formation, which stretches from upstate New York through West Virginia, has created 72,000 jobs in the Keystone State between the fourth quarter of 2009 and the first quarter of 2011.

The Bakken formation, along the Montana-North Dakota border, is thought to hold four billion barrels of oil (the biggest proven estimate outside Alaska), and the drilling boom helps explain North Dakota's unemployment rate of 3.2%, the nation's lowest.

All of this growth has inevitably attracted critics, notably environmentalists and their allies. They've launched a media and political assault on hydraulic fracturing, and their claims are raising public anxiety. So it's a useful moment to separate truth from fiction in the main allegations against the shale revolution.

• Fracking contaminates drinking water. One claim is that fracking creates cracks in rock formations that allow chemicals to leach into sources of fresh water. The problem with this argument is that the average shale formation is thousands of feet underground, while the average drinking well or aquifer is a few hundred feet deep. Separating the two is solid rock. This geological reality explains why EPA administrator Lisa Jackson, a determined enemy of fossil fuels, recently told Congress that there have been no "proven cases where the fracking process itself has affected water."

Friday, June 24, 2011

Jay Leno mocks Al Gore's extremist views that nobody listens to anymore

From the Tonight Show with Jay Leno monologue 6/23/11, second & third jokes:

Today President Obama released 30 million barrels of oil from the Strategic Petroleum Reserve...he said it was in response to what he called a real emergency - his poll numbers.

Even Al Gore is attacking President Obama...Gore said Obama has failed to stand up for bold action and has made little progress on global warming...and then the girl said, "Sir, if you could just pay for your ice cream - we've got other customers waiting."

(applause & laughter)



Related: The Boy Who Cried Wolf

The White House Oil Epiphany

Obama has epiphany that the skyrocketing energy prices he called for previously are not good for re-election...

WSJ.com Review & Outlook 6/24/11

It wasn't long ago that the Obama Administration was trying to drive up the price of fossil fuels to reduce carbon emissions, promote "green jobs" and save the planet from global warming. Gasoline at $3.50 or $4 a gallon has ended that. And yesterday the White House went so far as to join a global effort to release 60 million barrels from oil stockpiles to further reduce prices.

The U.S. will release one million barrels a day for 30 days from the Strategic Petroleum Reserve—the nation's 727 million barrel oil stockpile located in salt domes in Texas and Louisiana. The spot price of oil dropped about $5 a barrel on the news, and if that decrease holds it could be the equivalent of a 10 cent a gallon reduction in gas prices.
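The arithmetic behind those figures is easy to verify; at 42 gallons per barrel, the crude-price share of a $5 drop works out to roughly 12 cents a gallon, in the same ballpark as the article's 10-cent estimate.

```python
# 1 million barrels/day for 30 days, and a $5/barrel drop per gallon.
barrels = 1_000_000 * 30
print(f"total release: {barrels / 1e6:.0f} million barrels")
print(f"$5/bbl over 42 gal/bbl ~ {5.0 / 42 * 100:.0f} cents/gallon")
```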

The White House says it is taking this action because of "supply disruptions" in Libya and other countries which pose a threat to global economic recovery. But the Libyan conflict is now four months old, so Mr. Obama's falling approval ratings no doubt also provided motivation.

The SPR was created in 1975 to cushion the impact of major supply disruptions. George W. Bush drew on the reserves after Hurricane Katrina when domestic oil supplies from the Gulf of Mexico were curtailed. As a pure business decision, selling oil from the SPR when the price is high, and then replenishing the oil when the price falls, isn't a bad idea. But the effect on gas prices is temporary, as global supply and demand adjust.

One irony is that a million barrels a day is about how much oil experts believe we could be producing from the vast oil fields in Alaska's wildlife reserve. President Obama has said that tapping Alaska wouldn't affect oil prices but now says a temporary spurt will do so. How about opening up Alaska, and dropping the de facto Gulf moratorium too?

Wednesday, June 22, 2011

Study shows modern oceans are more alkaline than past 250 million years

While eco-alarmists would have you believe the oceans have "acidified" to dangerous pH levels, a paper published in Nature finds that the modern ocean pH of about 8.1-8.2 is actually the most alkaline the oceans have been over the past 250 million years. During this time corals, phytoplankton, and indeed most of the ocean biomass have evolved. The paper shows a mean pH of about 7.7 over the past 250 million years, whereas the alarmist and frequently incorrect IPCC predicts ocean pH will drop to 7.88 (a decline of about 0.2 pH units) under a "business as usual" scenario by 2100.
Mean ocean surface pH is shown in the second graph, with the modern pH of 8.1-8.2 at the left edge and 250 million years ago at the right; pH above 7.0 is alkaline. The top graph shows the diversity of various species of phytoplankton. The lower graph shows little change in calcification over the entire period.
The fact is, modern sea life copes perfectly well with pH levels that vary by 0.4 units over a period of less than one year, as shown in this graph from the Monterey Bay Aquarium of incoming seawater:
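Because pH is a logarithmic scale, these changes are easiest to compare as ratios of hydrogen ion concentration; a quick calculation using the figures above shows the projected century-scale change is smaller than the seasonal swing.

```python
def h_ratio(ph_from, ph_to):
    """Factor by which [H+] rises going from ph_from to ph_to."""
    return 10 ** (ph_from - ph_to)

print(f"8.10 -> 7.88 (IPCC 2100 scenario): [H+] x {h_ratio(8.10, 7.88):.2f}")
print(f"8.20 -> 7.80 (0.4-unit seasonal swing): [H+] x {h_ratio(8.20, 7.80):.2f}")
```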

Tuesday, June 21, 2011

Incredible! Mann uses upside down data again!

One day after its release, Michael Mann's latest paper on sea levels has been found to use upside-down proxy data, despite the exact same error having been pointed out by Steve McIntyre 3 years ago! Mann has never acknowledged this grave error and, incredibly, continues to use the same trick.

From ClimateAudit.org:


Upside Down Mann Lives on in Kemp et al 2011


contributor AMac:

Yesterday, Kemp et al. 2011 was published in PNAS, relating sea-level variation to climate over the past 1,600 years (UPenn press release). Among the authors is Prof. Mann. (Kemp11 is downloadable from WUWT.) Figs. 2A and 4A are “Composite EIV global land plus ocean global temperature reconstruction, smoothed with a 30-year LOESS low-pass filter”. This is one of the multiproxy reconstructions in Mann et al. (2008, PNAS). The unsmoothed tracing appears as the black line labelled “Composite (with uncertainties)” in panel F of Fig. S6 of the “Supporting Information” supplement to Mann08 (downloadable from pnas.org).

This is one of the Mann08 reconstructions that made use of the four (actually three) uncalibratable Tiljander data series.

As scientist/blogger Gavin Schmidt has indicated, the early years of the EIV Global reconstruction rely heavily on Tiljander to pass its “validation” test: “…it’s worth pointing out that validation for the no-dendro/no-Tilj is quite sensitive to the required significance, for EIV NH Land+Ocean it goes back to 1500 for 95%, but 1300 for 94% and 1100 AD for 90%” (link). Also see RealClimate here (Gavin’s responses to comments 525, 529, and 531).

The dependence of the first two-thirds of the EIV recon on the inclusion of Tiljander’s data series isn’t mentioned in the text of Kemp11. Nor is it discussed in the SI, although it is an obvious and trivial explanation for the pre-1100 divergence noted in the SI’s Figures S3, S4, and S5.

Peer review appears to have been missing in action on this glaring shortcoming in Kemp11′s methodology.
More than anything, I am surprised by this zombie-like re-appearance of the Tiljander data series — nearly three years after the eruption of the controversy over their misuse as temperature proxies!
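For readers new to the controversy, the mechanism is easy to demonstrate: a calibration that screens proxies on the absolute value of their correlation and then fits a coefficient of unconstrained sign will happily use a series in the orientation opposite to its physical interpretation. The toy sketch below illustrates that mechanism only; it is not Mann's algorithm or the Tiljander data.

```python
import numpy as np

rng = np.random.default_rng(2)
temp = rng.standard_normal(200)                  # target "temperature"
proxy = -temp + 0.5 * rng.standard_normal(200)   # physically inverted proxy

r = np.corrcoef(temp, proxy)[0, 1]
slope = np.polyfit(proxy, temp, 1)[0]
print(f"correlation = {r:+.2f} (passes an |r| screening test)")
print(f"fitted slope = {slope:+.2f} (the fit silently flips the series)")
```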

Top 10 Reasons Why Climate Model Predictions are False

The IPCC predictions of catastrophic global warming climate change are entirely based upon computer models programmed on the basis of unverified, and in most cases, false premises. Unlike any other area of science, climate computer model results are considered gospel without verification by empirical data or proper consideration of the huge uncertainties and limitations of modeling a chaotic system in which almost all of the variables are poorly understood. Climate science has become perverted to the point of considering models to supplant empirical data:
“People underestimate the power of models. Observational evidence is not very useful,” said John Mitchell, principal research scientist at the UK Met Office, adding, “Our approach is not entirely empirical.”
somehow forgetting the scientific method, as succinctly stated by physicist Richard Feynman:
It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.
The recent empirical observations showing that the Sun is entering an exceptionally low period of activity were immediately 'countered' by the climate alarmist community trumpeting a computer model to supposedly prove negligible effect of a Grand Solar Minimum upon climate. This prompts a list of the top 10 reasons why climate model predictions are false:

1. Even the IPCC admits the climate models have not been verified by empirical observations to assess confidence. The fine print of the IPCC 2007 Report contains this admission:
Assessments of our relative confidence in climate projections from different models should ideally be based on a comprehensive set of observational tests that would allow us to quantify model errors in simulating a wide variety of climate statistics, including simulations of the mean climate and variability and of particular climate processes. 
In the final paragraph of this critical section (AR4 WG1, Chapter 8, page 52), the IPCC states that
a number of diagnostic tests [of the models] have been proposed, but few of them have been applied to the models currently in use. 
In fact, the models have performed poorly in comparison to observations, with global temperatures failing to even remain above the lower bound predicted by the IPCC, despite the steady rise in CO2 levels:
2. Furthermore, the IPCC even admits it "isn't clear which [diagnostic] tests are critical" for verifying and assessing confidence in the models. The 2007 Report, Chapter 8, page 52, states that the diagnostic tests to assess confidence in feedbacks simulated by different models have "yet to be developed." In other words, the IPCC can't begin to make any assessment whatsoever of confidence in the models at the heart of the IPCC "consensus" on anthropogenic global warming. If the IPCC is unable to verify and determine confidence in the models, no other publication in climate science can rightfully claim that the models have been verified, or determine confidence limits on their results.


3. Of 16 climate forcings identified by the IPCC, only 2 are stated by the IPCC to have a "high level" of understanding (CO2 and other greenhouse gases). Most of the other forcings have a "low level" of understanding, with a few stated to be "low to medium." It is impossible to create a model with any validity without a high level of understanding of the effect of each of the input variables. The variables also interact in a chaotic manner, which by definition cannot be modeled. 


4. The 2 forcings claimed by the IPCC to have a "high level" of understanding (man-made CO2 and other greenhouse gases plus unproven positive feedbacks) are in fact not well understood, with empirical satellite data showing the sensitivity to doubled CO2 with feedbacks is only about 0.7C  (Lindzen & Choi 2009, 2011 and others), a factor of 4 less than assumed by IPCC climate models. 
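To see what the factor-of-4 dispute amounts to numerically, one can use the standard simplified expression for CO2 forcing, dF = 5.35 ln(C/C0) W/m^2 (Myhre et al. 1998), and convert each claimed warming for doubled CO2 into an implied sensitivity parameter. A minimal sketch:

```python
import numpy as np

# dF = 5.35 * ln(C/C0) is the standard simplified CO2 forcing expression.
dF_2x = 5.35 * np.log(2)   # ~3.7 W/m^2 per CO2 doubling
for label, dT_2x in [("empirical (Lindzen & Choi)", 0.7),
                     ("IPCC model-typical (4x larger)", 2.8)]:
    lam = dT_2x / dF_2x    # implied sensitivity, K per W/m^2
    print(f"{label}: dT(2xCO2) = {dT_2x} C -> lambda = {lam:.2f} K/(W/m^2)")
```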


5. The climate models falsely assume infrared "back-radiation" from greenhouse gases can heat the oceans (71% of the Earth's surface area). In fact, IR wavelengths are only capable of penetrating the surface of the ocean by a few microns (millionths of a meter), with all of the absorbed energy used up in the phase change of evaporation (which actually cools the sea surface) and none remaining to heat the ocean bulk. This fact alone completely invalidates the assumed radiative forcing from greenhouse gases incorporated in the models.
Long Wave Infrared from greenhouse gases has a wavelength of ~8-14 microns. Penetration depth into water shown on right scale.
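The micron-scale penetration claim follows from Beer-Lambert attenuation: the e-folding depth is the reciprocal of the absorption coefficient, and water absorbs very strongly in the 8-14 micron band. The coefficients below are order-of-magnitude assumptions for illustration, not values read from the graph.

```python
# Beer-Lambert e-folding depth: depth = 1 / absorption coefficient.
for alpha_per_cm in (1e3, 1e4, 1e5):       # assumed orders of magnitude
    depth_um = (1.0 / alpha_per_cm) * 1e4  # cm -> micrometers
    print(f"alpha = {alpha_per_cm:.0e} /cm -> e-folding depth ~ {depth_um:.1f} um")
```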
6. In contrast to IR "back-radiation," visible and especially UV radiation from the Sun is capable of penetrating the oceans to a depth of several meters to heat the oceans. Solar UV activity varies by up to 10% over solar cycles, unlike the total solar irradiance (TSI), which only varies by 0.1%. The IPCC climate models only consider changes in TSI and ignore the large changes in solar UV which heat the oceans. Solar UV also affects ozone levels, which in turn have large, poorly understood effects on climate.


7. Clouds are one of the most important yet most poorly understood variables, with the IPCC not even certain whether clouds have a net warming or cooling effect. The empirical data show cloud albedo declined over the past few decades, a decline that accounts for at least 3 times as much warming as greenhouse gases. Whether the cloud changes are due to the cosmic ray theory of Svensmark et al or not, this remains an unexplained huge factor not incorporated in the models. As pointed out by Dr. Roy Spencer, a mere 1-2% change in global cloud cover alone can account for either global warming or cooling. The changes in cloud cover secondarily related to solar activity noted by Svensmark et al have an amplitude of about 4%:
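Spencer's point is checkable on the back of an envelope: if clouds reflect roughly 50 W/m^2 of sunlight globally (an often-cited shortwave cloud forcing figure, taken here as an assumption), then small fractional changes in cloudiness rival the ~3.7 W/m^2 forcing of a full CO2 doubling.

```python
# Forcing from a fractional change in cloud reflection.
sw_cloud_forcing = 50.0            # W/m^2 reflected by clouds (assumed)
for frac in (0.01, 0.02, 0.04):    # 1%, 2%, 4% changes in cloud cover
    print(f"{frac:.0%} cloud change -> ~{sw_cloud_forcing * frac:.1f} W/m^2 forcing")
```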


8. Ocean oscillations, which can have a periodicity of up to 60 years (e.g. the Pacific Decadal Oscillation) and huge effects upon worldwide climate, are not incorporated in the climate models. Ocean oscillations alone could account for the warming of the latter 20th century that the IPCC chooses to ascribe to man-made CO2, while claiming there is no other explanation.


9. As well stated by solar physicist Dr. Nicola Scafetta,
...the traditional climate models also fail to properly reconstruct the correct amplitudes of the climate oscillations that have clear solar/astronomical signature...Given the above, there is little hope that the traditional climate models correctly interpret climate change and nothing concerning the real climate can be inferred from them because from a false premise everything can be concluded.
10. The latest climate models continue to greatly exaggerate sensitivity to CO2 by 67%. Despite admitting this, the model authors were unwilling or unable to tweak the models to match observed temperatures, allowing the exaggerated effects of CO2 to remain in the world's most commonly used climate model. How hard could it have been to correct the sensitivity to CO2, given that the supposedly sophisticated models can be replicated with a small handful of arbitrary and artificially linear forcing factors on a laptop PC?
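On that last point, here is what such a laptop-scale replication looks like in outline: ordinary least squares fitting a temperature series to a handful of linear forcing factors. Every series below is a synthetic placeholder (an assumption for illustration), but the structure, a weighted sum of forcing shapes, is the whole "model."

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1900, 2011)

# Synthetic stand-ins for forcing series (illustrative shapes only).
ghg = np.log(280 + 1.5 * (years - 1900))                  # greenhouse ramp
solar = 0.1 * np.sin(2 * np.pi * (years - 1900) / 11.0)   # 11-year cycle
eruption = (rng.random(years.size) < 0.05).astype(float)
volcanic = -eruption * rng.random(years.size)             # cooling spikes

# Synthetic "observed" temperature built from those shapes plus noise.
temp_obs = (0.5 * (ghg - ghg.mean()) + 0.3 * solar + 0.2 * volcanic
            + 0.1 * rng.standard_normal(years.size))

# The "model": a least-squares weighted sum of the forcing factors.
X = np.column_stack([np.ones(years.size), ghg, solar, volcanic])
coef, *_ = np.linalg.lstsq(X, temp_obs, rcond=None)
fit = X @ coef
r2 = 1 - ((temp_obs - fit) ** 2).sum() / ((temp_obs - temp_obs.mean()) ** 2).sum()
print(f"fitted weights: {np.round(coef[1:], 2)}, R^2 = {r2:.2f}")
```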

Related: New Paper “Validation And Forecasting Accuracy In Models Of Climate Change”

Monday, June 20, 2011

The Climate Tort is Finished

WSJ.com  Review & Outlook 6/21/11

Yesterday's other important Supreme Court decision came in a case that joined the green lobby and the trial bar, if that isn't redundant. The Court unanimously struck down one of the legal left's most destructive theories, and not a moment too soon.

In American Electric Power v. Connecticut, eight states and various other environmental activists sued a group of utilities, claiming that their carbon emissions were a "nuisance" under federal common law and that therefore the courts should set U.S. global warming policy. Yet this is a fundamentally political question, one the Constitution reserves to Congress and the executive, as Justice Ruth Bader Ginsburg wrote for the 8-0 majority.

The Court "remains mindful that it does not have creative power akin to that vested in Congress," Justice Ginsburg observed, in an all-too-rare vindication of legal restraint. "It is altogether fitting that Congress designated an expert agency, here, EPA, as best suited to serve as primary regulator of greenhouse gas emissions. The expert agency is surely better equipped to do the job than individual district judges issuing ad hoc, case-by-case injunctions. Federal judges lack the scientific, economic, and technological resources an agency can utilize in coping with issues of this order."

We'd go further and point out that Congress never granted the Environmental Protection Agency the power to regulate CO2. The EPA has merely asserted that power with an assist from the pure policy invention of the Court itself in 2007's 5-4 Mass. v. EPA ruling. Still, the fact that every Justice rejected the new climate tort theory, and that the opinion was delivered by the most liberal Justice, shows how abusive it really was.

The Court dismissed the case under the "political question doctrine," but we wish it had resolved the technical issue of Article III standing, which determines when a plaintiff has a right to sue. The Justices were split four to four, and thus did not rule; Justice Sonia Sotomayor recused herself because she heard the case on the Second Circuit. Yet standing is one of the few restraints on the power of the federal courts, and the litigants didn't have it by a mile here.

Under the traditional legal reading of standing, plaintiffs have to show that the defendants caused their injuries and that the courts can meaningfully redress those injuries. But climate change is a world-wide phenomenon to which the group of utilities barely contributed even under the most aggressive global warmist theories. And even if the courts shut down those plants tomorrow, it would have no effect whatsoever on atmospheric CO2 concentrations.

The climate tort is nonetheless finished, and the Court's decision should make it impossible to advance the same claims in state courts. Anyone who cares about the economy and the Constitutional balance of power can breathe a little easier.

Sunday, June 19, 2011

Solar Physicist Dr. C. de Jager predicts Grand Solar Minimum will last until 2100

Dr. Cornelis de Jager is a renowned Netherlands solar physicist, past General Secretary of the International Astronomical Union, and author of several peer-reviewed studies examining the solar influence upon climate. In response to the recent press release of three US studies indicating the Sun is entering a period of exceptionally low activity, Dr. de Jager points to his publications from 2010 and earlier, which indicate that this Grand Solar Minimum will be similar to the Maunder Minimum that caused the Little Ice Age, and to his prediction that this "deep minimum" will last until approximately the year 2100.
"The new episode is a deep minimum. It will look similar to the Maunder Minimum, which lasted from 1620 to 1720...This new Grand Minimum will last until approximately 2100."
A lecture by Dr. de Jager at UCAR shows that solar activity during the 20th century was at the highest levels of the past 900 years:
and shows solar UV activity (bottom graph below) was at the highest levels of the past 400 years in the latter portion of the 20th century: (UV is the most energetic portion of the solar spectrum, and varies much more than the Total Solar Irradiance (TSI). The IPCC and computer models only consider changes in TSI, ignoring the much more significant changes in UV)
and shows the amplification of solar variation via the cosmic ray theory of Svensmark et al:
leading to two possible mechanisms accounting for amplified solar effects upon the climate, neither of which is considered by the IPCC:

Recommended: Dr. de Jager's peer-reviewed paper Solar Activity and Its Influence on the Climate

Thursday, June 16, 2011

Oh...the Irony: Environmental concerns derail billions of dollars in solar energy projects

Spot The Tortoise?

Todd Woody, 06.08.11, 06:00 PM EDT 
Forbes Magazine dated June 27, 2011

More than $10 billion in solar projects are riding on the shell of an iconic desert reptile.



BrightSource broke ground on this 370MW solar plant. Cost: $2.2 billion. Tortoises: 700.


Last October BrightSource Energy began construction on the first large-scale solar thermal power plant to be built in the U.S. in two decades. After an arduous three-year environmental review, a $1.6 billion federal loan guarantee and more than a half-billion dollars in investment from the likes of Google (GOOG), Morgan Stanley (MS) and NRG Energy (NRG), Interior Secretary Ken Salazar and then California Governor Arnold Schwarzenegger appeared at a sunny groundbreaking ceremony in Nipton, Calif., in the Mojave Desert. The 370-megawatt Ivanpah Solar Electric Generating System, they proclaimed, heralded a clean, green energy future.
But as the dignitaries speechified, biologists were discovering the creosote-bush-studded landscape was crawling with some uninvited guests: desert tortoises. Years of surveys had estimated that, at most, 32 of the iconic, imperiled animals called the 5.6-square-mile site home. But as giant road graders moved in, biologists had already found nearly that many tortoises just in the project's first, 914-acre phase.
"The big mystery question is, why are there more animals than expected?," said Mercy Vaughn, a respected desert tortoise biologist who's leading the company's roundup and relocation of the long-lived reptiles, as she stood outside a tortoise holding pen in October.
Today those pens have expanded to hold even more tortoises. Federal officials in April ordered construction temporarily halted on part of the project until a new environmental review could be conducted. The reason: Government biologists now predict that between 86 and 162 adult tortoises and 608 juveniles roam the site, some 40 miles southwest of Las Vegas. Biologists with the U.S. Bureau of Land Management, which leases the land to BrightSource, concluded that the project would "harass" 2,325 mostly juvenile tortoises living within a 2-kilometer radius outside the site in the Ivanpah Valley, where another company, First Solar (FSLR), intends to construct two huge generating stations.
Wildlife has emerged as the wild card in plans to build more than a dozen multibillion-dollar solar projects in the desert Southwest. Earlier this year German developer Solar Millennium's U.S. venture abandoned a 250-megawatt solar project after 16 months of environmental review because of concerns over its impact on the Mohave ground squirrel. The renewed scrutiny of other big solar projects raises the stakes for the Obama Administration, which has offered more than $8 billion in loan guarantees for solar construction, and for developers and investors making bets on Big Solar.
read remainder at Forbes.com