The above (Fig 1.2a of Terrorism and the Electric Power Delivery System) shows a measure of total electric grid outage duration across a sample of countries in the world. The measure is SAIDI = System Average Interruption Duration Index. This is the total number of minutes per year of outage experienced by an average customer, but it explicitly excludes major events (hurricanes, earthquakes). Most utility outage measurements are reported exclusive of major events. This is irritating if one is interested in the climate change signal, but understandable for utilities trying to measure their endogenous performance, rather than exogenous events they can't control.
The data are for 1992-2001, but this measure is fairly stable over time, so it's probably not too different now. Japan is the best with only 6 minutes of average outage. That's roughly five-nines reliability (about 99.999% uptime). By contrast, the US experience is over three hours (99.96% uptime). The developing countries in the sample range from a few hours (Argentina) to a few days (Colombia) of outage per year.
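Converting annual outage minutes into "nines" of uptime is simple arithmetic; here is a quick sketch (the SAIDI values are the approximate figures quoted above, not exact data, and the "few days" for Colombia is taken as three days for illustration):

```python
# Convert annual SAIDI outage minutes into an uptime percentage.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def uptime_pct(saidi_minutes: float) -> float:
    """Availability implied by a given annual outage duration."""
    return 100.0 * (1 - saidi_minutes / MINUTES_PER_YEAR)

# Approximate figures from the text above.
for country, saidi in [("Japan", 6), ("US", 180), ("Colombia", 3 * 24 * 60)]:
    print(f"{country}: {uptime_pct(saidi):.4f}% uptime")
```

This makes the comparison concrete: 6 minutes works out to about 99.9989% uptime (just shy of five nines, which would be about 5.3 minutes), while three hours is roughly 99.97%.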
I don't have any big point to make about this, but it's interesting background about how reliable the electricity supply is in industrial civilization.
Again, exclusive of major events, SAIDI has not been increasing lately in the US:
One occasionally hears some extreme "peak everything" writers forecasting the collapse of the electrical grid due to lack of fossil fuel. There's no hint of that so far.