Yesterday, while driving to work, I heard a radio commentary on the price of gas. I wondered whether anyone has ever calculated how much it costs to run a data center with a hundred TB of data on systems that are only 25 percent utilized.
I wonder how big a dent we could make in our energy bills by increasing the capacity utilization of computer storage resources. I am not just talking about the efficiency of the equipment; I am also interested in whether there is any appetite for getting storage resources up to perhaps 50 percent utilization. That does not seem like much to ask. Imagine how much energy could be saved if fewer disks were spinning, each holding more data.
Tom West writes…
Rising oil and gasoline prices, the return of rolling blackouts in California, and Intel's latest talk of "low-power chips" all suggest a renewed interest in energy efficiency. You raise an interesting question about the cost savings to be had from increasing the capacity utilization of disk storage.
Of course, one countervailing concern is the impact that consolidation can have on performance (e.g., resource contention). In any case, I'm certainly an advocate for better efficiency in disk capacity utilization, and rising energy costs may well draw more attention to the issue. Having some empirical metrics in this regard would be a big help in evaluating the various tradeoffs involved.
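The saving the question hints at can be sketched with a quick back-of-envelope calculation. All of the per-drive figures below (capacity per drive, watts per spinning drive) are illustrative assumptions, not measured values; the point is only that, for a fixed amount of data, power scales inversely with capacity utilization:

```python
import math

# Back-of-envelope estimate: power draw of a 100 TB store at two
# capacity-utilization levels. Per-drive numbers are assumptions
# chosen for illustration, not vendor specifications.

DATA_TB = 100            # total data stored (from the question above)
DRIVE_CAPACITY_TB = 0.5  # assumed capacity per disk drive
DRIVE_POWER_W = 12.0     # assumed average draw per spinning drive

def drives_needed(data_tb: float, utilization: float) -> int:
    """Drives required when each drive holds `utilization` of its capacity."""
    return math.ceil(data_tb / (DRIVE_CAPACITY_TB * utilization))

def annual_kwh(n_drives: int) -> float:
    """Energy for n drives spinning 24x7 for a year, in kWh."""
    return n_drives * DRIVE_POWER_W * 24 * 365 / 1000

for util in (0.25, 0.50):
    n = drives_needed(DATA_TB, util)
    print(f"{util:.0%} utilization: {n} drives, ~{annual_kwh(n):,.0f} kWh/yr")
```

Under these assumptions, moving from 25 to 50 percent utilization halves the drive count and the drive-spinning energy; any real estimate would also need to account for controller, cooling, and performance overheads.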