The above (Fig 1.2a of Terrorism and the Electric Power Delivery System) shows a measure of total electric grid outage duration for a sample of countries. The measure is SAIDI, the System Average Interruption Duration Index: the total number of minutes of outage per year experienced by the average customer, explicitly excluding major events (hurricanes, earthquakes). Most utility outage statistics are reported exclusive of major events. This is irritating if one is interested in the climate change signal, but understandable for utilities trying to measure their endogenous performance rather than exogenous events they can't control in their own operations.
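For reference, the standard formula (per IEEE 1366, if I recall correctly) is SAIDI = sum over all interruptions of (restoration time x customers affected), divided by the total number of customers served, giving minutes of outage per customer per year.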
The data are for 1992-2001, but this measure is fairly stable over time, so it's probably not too different now. Japan is the best, with only 6 minutes of average outage per year. That's roughly five-nines reliability (about 99.999% uptime); six nines would allow only about half a minute of downtime a year. By contrast, the US experience is over three hours (about 99.96% uptime). The developing countries in the sample range from a few hours (Argentina) to a few days (Colombia) of outage per year.
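To check the arithmetic, here's a minimal Python sketch. Only Japan's 6 minutes comes from the chart; the US value of 210 minutes is my assumption, back-calculated from the 99.96% figure:

```python
# Convert annual SAIDI minutes into an availability percentage.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def availability(saidi_minutes_per_year: float) -> float:
    """Fraction of the year the average customer has power."""
    return 1.0 - saidi_minutes_per_year / MINUTES_PER_YEAR

for country, saidi in [("Japan", 6), ("US", 210)]:
    print(f"{country}: {availability(saidi):.4%} uptime")

# Japan: 99.9989% uptime  (about five nines, not six)
# US:    99.9600% uptime
```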
I don't have any big point to make about this, but it's interesting background about how reliable the electricity supply is in industrial civilization.
Again, exclusive of major events, SAIDI has not been increasing lately in the US:
One occasionally hears some extreme "peak everything" writers forecasting the collapse of the electrical grid due to lack of fossil fuel. No hint of that so far.
Wednesday, May 29, 2013
6 comments:
It appears that Singapore is even better than Japan (~2.2 minutes SAIDI vs. ~6).
Quad - good point.
Your conclusion that this disproves "peak everything" does not follow, because the expectation is that the system would collapse under stress. In many localities the electricity goes out because the power-line rights-of-way are not being properly trimmed; I have even seen lines just stapled to trees in my own locality. Cost cutting is an outcome of peak everything.
An issue with excluding major events is that in some years, as with the east coast blackout of 2003, about a week of data is excluded even though it was a clear case of improper grid maintenance. That means 55 million people aren't counted statistically for a week. The same probably goes for large-scale grid events in India and China, where the electric companies are the culprits. The black swans that matter will always be excluded.
Interesting that Germany - the country most enamored with solar and wind energy - is excluded, as well as its neighbors. Might such grids be less reliable? I hear there are blackout issues associated with peak production hours.
An interesting point is how outage minutes per year are counted. Suppose there are 100 feeders, each carrying a different MW power flow. Is the amount of electricity unserved by the utility during an outage considered in the calculation? A 2,000 kW customer out for 2 hours and a 5 kW customer out for 10 hours cannot simply be averaged unless their loads are also accounted for.
Can anyone offer guidance on what the methodology is?
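As far as I can tell, SAIDI under IEEE 1366 weights every customer equally rather than by load; unserved energy is tracked by separate metrics such as Energy Not Supplied (ENS). A minimal sketch of the distinction, using the example above (the total customer count is made up for illustration):

```python
# Contrast customer-weighted SAIDI with a load-weighted alternative.
outages = [
    # (customers_affected, load_kw, outage_hours)
    (1, 2000, 2),   # large customer, short outage
    (1, 5, 10),     # small customer, long outage
]
TOTAL_CUSTOMERS = 1000  # assumed system size, for illustration only

# SAIDI (IEEE 1366): customer-minutes interrupted / customers served.
# Every customer counts equally, regardless of load.
saidi_minutes = sum(n * hours * 60 for n, _, hours in outages) / TOTAL_CUSTOMERS

# Energy not supplied: weights each outage by the load it interrupted.
ens_kwh = sum(kw * hours for _, kw, hours in outages)

print(f"SAIDI: {saidi_minutes:.2f} minutes per customer")  # 0.72
print(f"ENS:   {ens_kwh:.0f} kWh unserved")                # 4050
```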