40% Efficiency Solar Cells Developed
Topic: Technology 5:42 am EDT, Jun 2, 2007

If solar is less expensive than the available clean conventional sources then this might make sense. Otherwise, why bother? It's only in situations where you're already near existing daytime conventional capacity and the deployment of solar is much faster/cheaper in the short term than deployment of another clean conventional source that it might make sense. But if solar is expensive and/or time-consuming to deploy (relative to deploying another clean conventional source) then it simply doesn't make sense to use it even if it's only for dealing with peak load.

Forgive me, but you are completely wrong about this. Peak periods are exactly when things like solar really "shine." There are a couple of things you must understand about the interstate electricity grid:

First, it is over-designed on purpose. Most major utilities maintain operating reserves of between 12 and 18 percent of the day's anticipated peak demand. On any given day, the system operator will have tens or hundreds of generation sources that it never dispatches (i.e., uses to produce power), but that are there "just in case." This means that utilities have multiple dispatch solutions available to meet load (load being the total amount of electricity customers are demanding at a given moment).
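To put rough numbers on that reserve margin, here is a quick Python sketch; the figures are invented for illustration, not taken from any particular utility:

# Back-of-the-envelope operating reserve, using made-up numbers.
forecast_peak_mw = 2000        # the day's anticipated peak demand
reserve_fraction = 0.15        # somewhere in that 12-18% band

capacity_needed_mw = forecast_peak_mw * (1 + reserve_fraction)
print(f"{capacity_needed_mw:.0f} MW must be available to cover a {forecast_peak_mw} MW peak")
# -> 2300 MW must be available to cover a 2000 MW peak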

The second key principle is that utilities select their generation resources on a "least-cost dispatch" basis. While in practice this gets incredibly complicated (and also includes environmental factors), the utility will pick the least expensive generators that can produce enough power to adequately supply the day's demand. In practical terms, this means that the utility will dispatch the dirtiest and most expensive-to-operate (on an incremental-cost basis) generating facilities last.
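Here is a minimal Python sketch of the idea; the generator names and $/MWh costs are hypothetical, and the point is just the ranking:

# Least-cost ("merit order") dispatch in miniature: rank generators by
# incremental operating cost and call on them cheapest-first.
incremental_cost = {          # $/MWh to run each unit; values are invented
    "hydro":        5,
    "nuclear":     10,
    "coal":        30,
    "gas peaker": 150,
    "old oil":    500,        # dirtiest and priciest -- dispatched last
}

merit_order = sorted(incremental_cost, key=incremental_cost.get)
print(merit_order)
# -> ['hydro', 'nuclear', 'coal', 'gas peaker', 'old oil']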

The third principle is an outgrowth of the first two. On peak demand days (think middle of summer, air conditioners running at full blast, etc.), the number of dispatch options available to the utility shrinks further and further as it commits an ever-greater share of its total generating capacity to meet demand. This means that your nastiest, dirtiest, foulest, most expensive generating facilities get dispatched on exactly those days.

Imagine this scenario. You are Utility X. You have the following five generating facilities at your disposal:

1000 MW nuke.

500 MW cheaper, clean(er) coal.

500 MW slightly less cheap dirty coal.

100 MW incredibly expensive natural gas.

20 MW aging oil burner that spews out more toxics than Paris Hilton on a breathalyzer AND costs more than the GDP of small nations to operate.

Total installed capacity (a fancy term for the total amount of generation): 2120 MW.

Now imagine that hellishly hot day. Demand immediately soars to 1500 MW -- and it's not even 11 am yet. You commit your nuke and your clean coal facility. Now it's 2 pm and demand hits 2000 MW. Throw in the dirty coal. Four pm rolls around and demand hits 2040 MW. Throw in that expensive natural gas peaker! (Don't worry -- the ratepayers will just end up eating the extra -- your investors are safe.)

Now it's 4:47 in the afternoon. The peak of the peak. You're at 2099 MW and still rising... You are getting ready to commit the oil burner at a cost of several million dollars and countless hazy days. Do you need it?

Well, maybe not. If you were a smart utility executive, you invested in demand response and paid some of your customers to go off-grid on days like this. Additionally, you've been encouraging customers to install solar panels, which are all furiously generating power right when it's needed most.

This is the moment where solar pays for itself. By reducing the peak demand by only a smidge, you reduce energy costs substantially. (This is even more true when you factor in the capital cost of building a new $1 billion generating facility just to meet peak.) Solar is also one of the few alternative/clean sources of energy whose output peaks along with demand; wind, for example, tends to blow off-peak. It's also why it's sometimes cheaper to pay customers not to use power on a given summer day.
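To tie the scenario together, here is a rough Python sketch of Utility X's peak-hour dispatch. The capacities come from the list above; the per-MWh costs, the assumed 2110 MW uncontrolled peak, and the 80 MW shaved off by rooftop solar plus demand response are all invented for illustration:

# Utility X's stack, already listed cheapest to most expensive to run.
units = [
    # (name, capacity in MW, incremental cost in $/MWh -- costs are invented)
    ("nuke",       1000,  10),
    ("clean coal",  500,  25),
    ("dirty coal",  500,  35),
    ("gas peaker",  100, 150),
    ("oil burner",   20, 500),
]

def commit(load_mw):
    """Commit units cheapest-first until the load is covered."""
    running, supplied = [], 0
    for name, capacity_mw, _cost in units:
        if supplied >= load_mw:
            break
        running.append(name)
        supplied += capacity_mw
    return running

peak_without_help = 2110      # hypothetical: "2099 MW and still rising"
shaved_mw = 80                # hypothetical solar + demand response at the peak

print(commit(peak_without_help))              # the oil burner has to run
print(commit(peak_without_help - shaved_mw))  # oil burner stays off; gas peaker still runs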

There is a handy little graph you can plot -- and it holds for almost all utilities. After demand exceeds a certain point, the cost of each additional MW soars. Brownouts are generally not allowed, so the utility is forced to turn to suppliers of last resort, at a cost of billions (think California in 2000/2001). You can also chart the cost of solar against this. (Although honestly, wind is a better example, since with tax breaks it approaches conventional generation costs; solar is still often off the charts.)
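A crude Python sketch of that curve, reusing the hypothetical stack from the scenario above (the $/MWh figures are still invented):

# Price of the marginal unit as demand rises: nearly flat all day, then a hockey stick.
supply_stack = [
    # (cumulative capacity in MW, $/MWh of the unit serving that slice of demand)
    (1000,  10),   # nuke
    (1500,  25),   # + clean coal
    (2000,  35),   # + dirty coal
    (2100, 150),   # + gas peaker
    (2120, 500),   # + oil burner
]

def marginal_cost(demand_mw):
    """$/MWh of the most expensive unit needed at this demand level."""
    for cumulative_mw, cost in supply_stack:
        if demand_mw <= cumulative_mw:
            return cost
    return float("inf")        # past installed capacity: suppliers of last resort

for demand in (800, 1800, 2050, 2110, 2200):
    print(f"{demand} MW -> {marginal_cost(demand)} $/MWh")
# The curve stays nearly flat for most of the day, then soars near the peak.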

Finally, what is this "clean conventional" source of energy you refer to? And no, I'm not some eco-nut trying to send us back to the Stone Age. Perhaps you're simply using the term "conventional" incorrectly. In the energy world, "conventional" sources are . . . well, conventional. They include coal, natural gas, oil, etc. Nuclear, wind, solar, landfill gas, and biomass are all classified as non-conventional.
