
The Internet of Things Is Killing Your Storage: Best Practices for the Brave New World

October 23, 2015

Featured article by Stefan Bernbo, CEO and founder of Compuverde

IDC points to several factors that are enabling the rise of the Internet of Things (IoT), including a global culture that has embraced and now expects greater connectivity; ongoing interest in and development of smart cars, homes and cities; and advances in connectivity infrastructure. Consequently, the analyst firm anticipates that IoT technology and services spending will generate global revenues of $8.9 trillion by 2020.

The IoT’s billions of “things” are generating unprecedented volumes of data, all of which must be stored. This is already putting a strain on current storage solutions – and the IoT is just getting started. Service providers will need to accommodate ever-increasing demands for storage; to remain competitive, many are exploring new options in data center architecture that permit greater flexibility and control over hardware costs.

A Recipe for Bottlenecks

Appliances—server hardware that comes with proprietary, mandatory software—form the spine of most data centers. The software is designed for the hardware and vice versa, and the two come tightly wedded together as a package. The benefits of this configuration include convenience and ease of use.

Traditional storage relies on a single point of entry. Because hardware inevitably fails at some point, appliances typically include redundant copies of expensive components to guard against failure at that single point of entry. These redundant components bring with them higher hardware costs, greater energy usage and additional layers of complexity. When companies, in anticipation of growth events like the IoT, begin to consider how to scale out their data centers, costs for this traditional architecture skyrocket.

Vertical construction is another problem. In an architecture based on traditional appliances, requests come in via a single point of entry and are then re-routed. Think about a million users connected to that one entry point at the same time. That’s a recipe for a bottleneck, which prevents service providers from being able to scale to meet the capacity needed to support the Internet of Things.

Software-Defined: The Horizontal Alternative

An alternative to traditional vertical construction is software-defined storage. By taking features typically found in hardware and moving them to the software layer, a software-defined approach to data center architecture eliminates the dependency on server “appliances” with software hard-wired into the system. This option provides the scalability and speed that the IoT demands.

Most people don’t realize that many everyday electronic devices have been “software-defined” for years. Take the PC, for example: software can be installed on any hardware platform, allowing the user to custom-tailor both the hardware and the software according to his or her needs. The average PC can use Linux as an operating system if the owner so chooses. This gives the user greater freedom to allocate his or her budget precisely as needed for the task at hand – whether towards a high-powered graphic design set-up, for example, or a lightweight Web browser.

A software-defined storage architecture lets administrators choose inexpensive commodity servers, because the software is no longer tied to particular hardware. Coupled with lightweight, efficient software, the use of commodity servers can result in substantial cost savings for online service providers seeking ways to accommodate their users’ growing demand for storage.

Because not all data centers are created equal, software-defined storage also provides scalability. A telco servicing one particular area will have different storage needs than a major bank with branches in several countries, and a cloud services provider will have different needs still. While appliances might be good enough for most of these needs, fully uncoupling the software from the hardware can extract substantial gains in economy of scale.

IT admins now have the luxury of examining the needs of their business and hand-picking the specific components and software that best support their growth goals. While this approach does require more technically trained staff, the flexibility afforded by software-defined storage delivers a simpler, stronger and more tailored data center for the company’s needs.

In addition, because of the horizontal approach that software-defined storage takes, data is streamlined and redistributed, which eliminates the potential bottlenecking problems of vertical, single-entry-point models. Data is handled faster and more efficiently, and this non-hierarchical construction can be scaled out easily and cost-effectively.
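One common way to realize this kind of non-hierarchical placement is consistent hashing, in which every node can compute an object’s location independently, so any node can accept a request and there is no central router to bottleneck. The sketch below is a minimal Python illustration of that general technique under assumed node names – it is not Compuverde’s actual implementation.

```python
import hashlib
from bisect import bisect_right

class ConsistentHashRing:
    """Minimal consistent-hash ring: every node computes the same
    key-to-node placement independently, so requests can enter the
    cluster at any node -- no single point of entry."""

    def __init__(self, nodes, vnodes=100):
        # Place `vnodes` virtual points per node on the ring so data
        # spreads evenly and rebalancing stays incremental.
        self._ring = []  # sorted list of (hash, node)
        for node in nodes:
            for i in range(vnodes):
                self._ring.append((self._hash(f"{node}#{i}"), node))
        self._ring.sort()

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def node_for(self, key):
        """Return the node responsible for `key`: the first virtual
        point at or after the key's hash, wrapping around the ring."""
        h = self._hash(key)
        idx = bisect_right(self._ring, (h, "")) % len(self._ring)
        return self._ring[idx][1]

# Hypothetical three-node cluster: any of the three nodes can answer
# this lookup and will all agree on the owner.
ring = ConsistentHashRing(["node-a", "node-b", "node-c"])
owner = ring.node_for("sensor-42/reading-001")
```

Because placement is a pure function of the key, adding a fourth node moves only the keys that now hash to it, rather than reshuffling everything through a central index.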

Another likely beneficiary of the horizontal approach is the Internet of Things. With millions of devices needing to access storage, the current storage model that uses a single point of entry cannot scale to meet the demand. To accommodate the ballooning ecosystem of storage-connected devices all over the world, service providers, enterprises and telcos need to be able to spread their storage layers over multiple data centers in different locations worldwide. It’s becoming increasingly clear that one data center is not enough to meet the storage needs of the Internet of Things; storage must instead be distributed in a way that lets it run in several data centers globally.
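The multi-data-center idea can be sketched simply: place each replica of an object in a distinct site before any site holds a second copy, so losing one entire data center never removes every copy. The helper below is a hypothetical illustration of that placement rule, not any vendor’s API; the site names are invented.

```python
from itertools import cycle

def place_replicas(datacenters, num_replicas):
    """Assign each replica to a distinct data center first, wrapping
    around only if more replicas than sites are requested (sketch)."""
    if not datacenters:
        raise ValueError("need at least one data center")
    sites = cycle(datacenters)
    return [next(sites) for _ in range(num_replicas)]

# Hypothetical sites: three replicas land in three different regions,
# so the storage layer survives the loss of a whole data center.
placement = place_replicas(["eu-north", "us-east", "apac-1"], 3)
```

Real distributed stores layer failure-domain awareness, consistency and repair on top of this rule, but the core intuition – spread before you stack – is the same.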

Brave New Storage

All indications are that the massive proliferation of data created by the IoT will not slow down any time soon. Rather, it will only increase, and enterprises will need storage solutions that can be scaled quickly without breaking the bank. Hardware-bound appliance options simply cannot meet these criteria, but software-defined storage can. It offers the scalability, flexibility and distributed model that today’s enterprises need to keep pace with the ever-growing Internet of Things.


About the Author:

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, Stefan has designed and built numerous enterprise-scale data storage solutions designed to be cost-effective for storing huge data sets. From 2004 to 2010, Stefan worked in this field at Storegate, the wide-reaching Internet-based storage solution for consumer and business markets, with the highest possible availability and scalability requirements. Previously, Stefan worked on system and software architecture on several projects with Swedish giant Ericsson, the world-leading provider of telecommunications equipment and services to mobile and fixed network operators.
