From The Expert Feature Article
May 29, 2013

Against the Trend: What Cloud Service Providers Need to Change


By TMCnet Special Guest
Stefan Bernbo, Founder and CEO of Compuverde

The number of users storing data in the cloud grows daily as cloud computing expands across the range of online services. With this trend, changes in big data infrastructure are giving service providers the opportunity to reduce costs while improving performance.

It is essential that online service providers investigate cost-effective ways to enhance the performance of their services. Free cloud services will become increasingly difficult to find if current industry practices, such as data centralization, remain unchanged. Left on its current path, the industry is headed toward capping storage per user, raising prices and consuming more energy.

How Much is Your Data Worth?

One method many data centers use to save money and energy is centralizing data in a single location and making it available from anywhere via the Internet. The cited benefits of data centralization include lower expenses, better-quality Internet connections and enhanced performance.

While centralizing data can improve performance, scaling such a system to process ever-growing volumes of data is harder and more expensive. Increasing performance through data centralization requires purchasing higher-performance machines, a significant upfront cost. These machines also consume more energy, making power costs harder to control as the system grows.

How the Cloud Differs

The average enterprise user demands high performance, but an enterprise serves far fewer users than a cloud service does. Enterprise users access the files they need directly through the network, and they typically open, send and save relatively small files, such as Word documents and spreadsheets, that consume little storage and place a light load on the system.

Cloud service providers, by contrast, must resolve performance issues such as data bottlenecks, since they serve far more clients and face higher performance demands than enterprises do. A high volume of users accessing the cloud over the Internet is itself a performance bottleneck. The provider's storage system must not only scale for each added user but also sustain reliable performance across all of them. Additionally, the typical cloud user accesses and stores much larger files, such as music, photos and videos, than the typical enterprise user.

The economic impact of these storage demands is extensive. Service providers must be able to adjust quickly to accommodate growing demand for data storage. Users have come to expect free cloud storage and won't hesitate to abandon providers that put up paywalls. To stay cost-effective, service providers need inexpensive storage that both scales effortlessly and performs well.

Best Practices for the Cloud

For cloud providers seeking an ideal combination of performance, scalability and cost-effectiveness, these best practices can help achieve those goals:

1.      Decentralize storage centers

Although data centers trend toward centralization, distributed storage is the most effective way to achieve scalability. Decentralization improves performance at the software level, offsetting the performance advantage of centralized data storage.
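One way to see what decentralization means in software is that every node can compute where a piece of data lives on its own, so no central metadata server sits in the request path. The sketch below is purely illustrative (it is not Compuverde's actual algorithm, and the node names are hypothetical): any node hashing the same key reaches the same placement decision.

```python
import hashlib

def node_for(key: str, nodes: list) -> str:
    """Pure placement function: every node in the cluster runs the
    same computation, so any node can locate data for a key without
    asking a central lookup service."""
    digest = int(hashlib.md5(key.encode()).hexdigest(), 16)
    return nodes[digest % len(nodes)]

nodes = ["node-a", "node-b", "node-c", "node-d"]  # hypothetical cluster

# Any machine, given the same key and node list, computes the same answer,
# so there is no single lookup server to saturate or lose.
owner = node_for("user42/photo.jpg", nodes)
```

Because placement is a deterministic function of the key, reads and writes can be served by whichever node the client contacts first, which is the property that lets a decentralized design scale out instead of up.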

2.      Use low-energy hardware

Low-energy hardware is common sense. Inexpensive commodity-component servers significantly reduce both setup and operating costs, and they are also more energy-efficient.

3.      Avoid bottlenecking

With the demands of cloud computing weighing heavily on big data storage, a single point of entry can quickly become a single point of failure. Adding caches to ease performance bottlenecks, as many cloud providers do, rapidly adds cost and complexity to a system. Choosing a horizontally scalable data system that distributes data among all nodes makes it easier to select less expensive, lower-energy hardware.
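A common technique for distributing data among all nodes without a bottleneck is consistent hashing, which the article's horizontal-scaling argument implicitly relies on: when a commodity node is added, only a small fraction of keys move, whereas naive modulo placement would reshuffle most of them. The sketch below is a minimal, assumed implementation for illustration, not any particular vendor's system.

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Minimal consistent-hash ring: each node owns several virtual
    points on the ring, and a key maps to the next point clockwise."""

    def __init__(self, nodes, vnodes=100):
        self.ring = []  # sorted list of (hash, node) pairs
        for node in nodes:
            for i in range(vnodes):
                h = self._hash("%s#%d" % (node, i))
                bisect.insort(self.ring, (h, node))

    @staticmethod
    def _hash(s):
        return int(hashlib.sha1(s.encode()).hexdigest(), 16)

    def node_for(self, key):
        h = self._hash(key)
        # First ring point at or after the key's hash, wrapping around.
        idx = bisect.bisect(self.ring, (h, ""))
        return self.ring[idx % len(self.ring)][1]

# Hypothetical clusters before and after adding one commodity node.
ring = ConsistentHashRing(["node-a", "node-b", "node-c"])
bigger = ConsistentHashRing(["node-a", "node-b", "node-c", "node-d"])

keys = ["file-%d" % i for i in range(1000)]
moved = sum(ring.node_for(k) != bigger.node_for(k) for k in keys)
# Only a minority of keys change owner; with modulo placement,
# roughly three quarters would have moved.
```

The design point is that scaling out stays cheap: capacity grows by adding another inexpensive node, and the rebalancing cost is proportional to the data one node holds, not to the whole cluster.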

Conclusion

In today’s cloud landscape, big data storage consists mainly of high-performance appliances and vertically scaled online data systems. These architectures are extremely expensive and can scale only to about one petabyte, meaning that in the long run they are less cost-effective and less able to meet the needs of the service provider. Shifting from a vertically scaled cloud system to a horizontally scaled one distributes data evenly across low-energy hardware, improving performance and decreasing costs. By applying the best practices listed above, service providers can increase the efficiency, performance and scalability of their big data centers.

About the Author:

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, Stefan has designed and built numerous enterprise-scale data storage solutions engineered to store huge data sets cost-effectively. From 2004 to 2010 he worked in this field at Storegate, a wide-reaching Internet-based storage service for consumer and business markets with the highest availability and scalability requirements. Previously, Stefan worked on system and software architecture for several projects at the Swedish giant Ericsson, a world-leading provider of telecommunications equipment and services to mobile and fixed network operators.




Edited by Rich Steeves



