Sunday, May 2, 2010

Is Flash costing the American public money in electricity?

There has been much furor about the lack of Flash on the iPhone, iPod Touch, and iPad, with Apple citing a number of reasons, a key one being performance and resource utilization.  I previously wrote about my experience with Flash and the minor improvements a beta of 10.1 seems to bring, but I've also observed that, even when I'm not watching video, the number of sites that use Flash for other things keeps the CPU surprisingly busy at times when it should be idle (i.e., when I'm not actively browsing).

To test this out, on a Mac I opened several windows, each with several tabs open to sites I typically have open during the day, including Gmail, Google Calendar, Blogger, ESPN.com, StatCounter, ZDNet, and a handful of others.  I also watched a Dodger game on MLB.com for a bit, then closed that window and left the computer basically idle with the other tabs still open in my browser.

At that point the Flash (Shockwave) plug-in was using about 2-3% of the CPU, which isn't bad.  But after 20 minutes of the machine just sitting otherwise idle, the plug-in was using 10% of the CPU.  I didn't wait much longer during this test, but I've previously seen the plug-in using 15-20% of the CPU while the machine was seemingly idle.

So, even with a conservative 10% of the CPU being burned unnecessarily by Flash on an idle machine, one has to wonder what that does to the machine's power consumption.  A little quick research suggests that moderate use of a desktop draws 30-50 watts above idle, and a laptop 10-15 watts above idle.  If moderate use corresponds to 20-30% CPU, then our extra 10% of CPU works out to roughly 10 watts on average (more on a desktop, less on a laptop).
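Here's a quick sanity check of that figure as a small Python sketch.  The wattage and CPU ranges are the rough numbers above, not measurements, so treat the output as an estimate only:

```python
# Rough sanity check: watts per percent of CPU, using the ranges assumed above.
desktop_watts = (30 + 50) / 2   # moderate use, watts above idle (assumed)
laptop_watts = (10 + 15) / 2    # moderate use, watts above idle (assumed)
moderate_cpu = (20 + 30) / 2    # moderate use as % of CPU (assumed)

flash_cpu = 10                  # CPU % burned by Flash on an "idle" machine

desktop_draw = desktop_watts / moderate_cpu * flash_cpu   # ~16 W
laptop_draw = laptop_watts / moderate_cpu * flash_cpu     # ~5 W

print(f"Desktop: ~{desktop_draw:.0f} W, laptop: ~{laptop_draw:.0f} W")
# Splitting the difference between desktops and laptops,
# ~10 W per machine is a reasonable blended figure.
```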

Let's keep the math going and see where it takes us.  The current US population is just over 307 million, and at 76.2 computers per 100 people that is roughly 234 million computers.  If only 10% of those, or 23.4 million, are used on a daily basis, and only 10% of those, or 2.34 million, spend an hour each day on Flash sites with this CPU waste, that works out to 23,400 kilowatt-hours each day, or about 8.54 million kilowatt-hours per year.
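The same back-of-envelope energy calculation in Python; every percentage and the one-hour figure are the assumptions stated above, not measured data:

```python
population = 307_000_000            # US population, 2010 (approx.)
computers_per_100 = 76.2
computers = population * computers_per_100 / 100   # ~234 million computers

used_daily = computers * 0.10        # assume 10% see daily use
on_flash_sites = used_daily * 0.10   # assume 10% of those hit wasteful Flash sites
hours_per_day = 1                    # assumed hour per day on those sites
watts_wasted = 10                    # per-machine draw from the estimate above

kwh_per_day = on_flash_sites * watts_wasted * hours_per_day / 1000
kwh_per_year = kwh_per_day * 365

print(f"{kwh_per_day:,.0f} kWh/day, {kwh_per_year/1e6:.2f} million kWh/year")
# -> roughly 23,400 kWh/day and 8.54 million kWh/year
```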

To make that number more meaningful: at an electricity price of $0.10 per kilowatt-hour, this use of Flash is costing the American public about $854K per year.  The 10% figures and the single hour I used above are likely on the low end of the actuals, and my analysis ignores computers used at work, so the real cost is probably well into the millions.  In the grand scheme of things, $854K isn't that much for an entire country, but it is still sobering to think about.
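Carrying the sketch through to dollars under the same assumptions, including the per-machine figure that the "pennies" remark further down refers to:

```python
kwh_per_year = 8.54e6      # national total from the sketch above
price_per_kwh = 0.10       # assumed average retail electricity price, $/kWh

national_cost = kwh_per_year * price_per_kwh
print(f"National cost: ${national_cost:,.0f} per year")    # ~$854,000

# Per affected machine: 10 W for one hour a day is tiny --
# well under a dollar a year at this electricity price.
per_machine = 10 / 1000 * 1 * 365 * price_per_kwh
print(f"Per machine: ~${per_machine:.2f} per year")        # ~$0.37
```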

Now, is this all Adobe's fault?  If their software is indeed buggy and inefficient they shoulder some of the blame, but I would argue that Flash is used in many situations where it isn't needed, and that poorly written Flash apps are a big factor too.  Poorly written AJAX apps running in your browser could cause the exact same issues.

So what can you do about it?  You can choose to do nothing, since it is really only costing you pennies in extra electricity (though consider the wear and tear on your computer from the heat and the fan running).  Your other alternatives are to not install the Flash plug-in at all and forgo the sites that use it, or to install a Flash blocker.  I've done the latter: it blocks all Flash applets by default but lets you white-list sites or selectively enable specific applets.

With the blocker in place it is interesting to see which sites use Flash; the list includes Google Mail, ESPN.com, ZDNet, java.sys-con.com, StatCounter, and more.  Does each of these really need to use Flash?
