ESJ has an interesting article that I suggest folks looking into cloud computing read. The author, Henry Newman, lays out quite a few issues with bandwidth and recovery time using a cloud storage provider.
The bandwidth problem isn’t limited to enterprises. In the next 12 to 24 months, most of us will have 10Gbit/sec network connections at work (see Falling 10GbE Prices Spell Doom for Fibre Channel), while the current backbone of the Internet runs at OC-768 (roughly 40Gbit/sec). That means each of us internally will have a connection equal to about 25 percent of an OC-768 link. Our reach will be limited, of course, by our DSL and cable connections, but their performance is going to grow and consume backbone bandwidth. This is sobering when you consider how much data we create and how long it takes to move it around. I have a home Internet backup service and about 1TB of data at home. It took me about three months to copy all of that data off site via my cable connection, which was the bottleneck. If I had suffered a crash before the off-site copy was complete, I would have lost data.
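The three-month figure is easy to sanity-check. Here is a minimal sketch of the arithmetic; the link speeds are illustrative assumptions, not figures from the article (a roughly 1 Mbit/sec cable upload is consistent with my three-month experience for 1TB):

```python
# Back-of-the-envelope transfer time for an off-site copy.
# Link speeds below are hypothetical examples for illustration.

def transfer_days(data_bytes: float, link_bits_per_sec: float) -> float:
    """Days needed to move data_bytes over a link that sustains
    link_bits_per_sec end to end (no protocol overhead assumed)."""
    seconds = (data_bytes * 8) / link_bits_per_sec
    return seconds / 86_400  # seconds per day

ONE_TB = 1e12  # decimal terabyte, in bytes

for label, bps in [("1 Mbit/s cable upload", 1e6),
                   ("10 Mbit/s uplink", 10e6),
                   ("1 Gbit/s LAN", 1e9)]:
    print(f"{label}: {transfer_days(ONE_TB, bps):.1f} days")
# 1 Mbit/s cable upload: 92.6 days -- about three months
```

Note that even a tenfold faster uplink still means better than a week of continuous transfer for a single terabyte.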
The issue our customers are running into is managing backup and archiving requirements, and data restoration time, over the web. Henry Newman touches on these issues. Every organization values its data assets differently and needs an affordable strategy for archiving data. Although cloud storage may be affordable, it may not provide the data recovery access that your organization needs to satisfy your clients.
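One way to frame "recovery access" concretely is to check whether pulling your data back over the WAN fits inside your recovery time objective. A minimal sketch, using entirely hypothetical numbers (500GB of critical data, a 100 Mbit/sec link, a 24-hour objective):

```python
# Sketch: does a cloud restore fit a recovery time objective (RTO)?
# All figures are hypothetical assumptions for illustration.

def restore_hours(data_bytes: float, link_bits_per_sec: float) -> float:
    """Hours to pull data_bytes back over a sustained WAN link."""
    return (data_bytes * 8) / link_bits_per_sec / 3600

def meets_rto(data_bytes: float, link_bits_per_sec: float,
              rto_hours: float) -> bool:
    """True if the estimated restore fits within the objective."""
    return restore_hours(data_bytes, link_bits_per_sec) <= rto_hours

# 500GB over a 100 Mbit/s link against a 24-hour RTO:
hours = restore_hours(500e9, 100e6)
print(f"restore takes about {hours:.1f} h; "
      f"meets 24h RTO: {meets_rto(500e9, 100e6, 24)}")
# restore takes about 11.1 h; meets 24h RTO: True
```

Run the same check against your real data volumes and link speeds; if the answer comes back False, the cloud provider's price per gigabyte is only part of the story.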
Where is your bottleneck, and what can you do to accelerate your data recovery to provide your customers the service they need? If you can’t help your customers when they call because your data is unavailable, you will be providing your competitors an opportunity to take some business away.