Benchmarking For a Better Cloud

January 21, 2013

Featured article by Kurt Hagerman and Todd Gleason

In the not-so-distant past, evaluating cloud infrastructure options was a simple task. There wasn’t much variety in the cloud, so the choice came down to cloud vs. dedicated, and in pretty much every category but price, dedicated won. Even just five years ago, dedicated servers were the only option that offered high performance and strong security. This allowed the consumer to make an apples-to-apples comparison between providers on price, resources, and reputation.

Now the cloud has grown up and is no longer just the cost-control option. Cloud infrastructure can be sophisticated enough to go head to head with even the most robust dedicated infrastructure. That is why the current transition from dedicated server farms to cloud infrastructures has expanded the conversation. Public cloud. Private cloud. Hybrid cloud. Production cloud. It’s confusing. Choosing becomes especially critical when evaluating cloud services for something very specific, such as storage for particular data types, demanding applications, or raw performance. Additionally, non-standard terminology for resource allocation can make it very difficult for enterprises to anticipate cloud performance. IT pros have relied heavily on benchmarking studies for years to get the information they need to make good decisions. Isn’t it time for the cloud to be benchmarked too?

Why Benchmark the Cloud Now?

It’s important to understand that every business case calls for a different cloud stack. A commodity cloud delivering average performance may be sufficient for development and staging, but not for organizations where performance and security are paramount in production. Likewise, archival storage capacity might outweigh performance for one project, while another use case might need high IOPS and low latency. Enterprises now live in the world of big data and need tools and education to help them compare apples to apples when choosing the right cloud IaaS for their needs.

Further, most cloud vendors share compute and networking resources among clients in their multi-tenant environments, and generally do not disclose the technical details of the infrastructure. But if businesses are to make heads or tails of the clouds they are comparing, cloud companies need to be more transparent about their infrastructure, topology, services, and the specific kinds of challenges they help businesses meet.

IT decision makers are looking at the cloud very critically, and not just cloud vs. dedicated but cloud vs. cloud. Benchmarking the cloud could not be more appropriate right now, as more IT pros demand to know what they can expect from a cloud provider.

What to Benchmark?

The question then becomes: what should we look at when benchmarking the cloud, and what should be considered the industry-standard process? One of the most common comparisons is performance, so let’s use that as the example. Here are the critical comparison metrics (a rough measurement sketch follows the list):

  • Compute performance
  • Storage performance
  • Memory performance
  • Overall server performance, combining all of the above
  • Networking
  • Uptime
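
For illustration only, here is a minimal Python sketch of what measuring the first three metrics separately might look like. The workloads are crude, hypothetical stand-ins (a naive arithmetic loop, a memory touch pass, a sequential file write and read), not the proprietary suites a third-party benchmarking firm would actually run:

    import os
    import statistics
    import tempfile
    import time

    def time_it(fn, iterations=5):
        """Run fn several times; return per-iteration wall-clock seconds."""
        samples = []
        for _ in range(iterations):
            start = time.perf_counter()
            fn()
            samples.append(time.perf_counter() - start)
        return samples

    def compute_bench():
        # CPU-bound stand-in: a naive summation loop.
        total = 0
        for i in range(5_000_000):
            total += i * i
        return total

    def memory_bench():
        # Allocate ~200 MB and touch one byte per 4 KB page.
        block = bytearray(200 * 1024 * 1024)
        for i in range(0, len(block), 4096):
            block[i] = 1

    def storage_bench():
        # Sequential write then read of a 100 MB temporary file.
        data = os.urandom(100 * 1024 * 1024)
        with tempfile.TemporaryFile() as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())
            f.seek(0)
            f.read()

    for name, bench in [("compute", compute_bench),
                        ("memory", memory_bench),
                        ("storage", storage_bench)]:
        samples = time_it(bench)
        print(f"{name}: median {statistics.median(samples):.3f}s "
              f"over {len(samples)} runs")

A real suite would also exercise networking (throughput and latency between nodes) and track uptime over a long observation window rather than a single run.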

To accurately determine cloud performance, you have to measure these attributes separately and then measure how they work together. A good third-party organization will typically set a baseline performance standard by testing these elements on a dedicated environment (see, dedicated servers still serve a purpose). Then they’ll run the same tests, over multiple iterations, on a cloud environment and compare the cloud against the dedicated environment. By doing this, IT decision makers can see clearly how the cloud options they are considering stack up against the dedicated infrastructure they are thinking about migrating away from. It’s the most honest and straightforward comparison, and it’s a critical resource for CIOs and other IT decision makers right now.
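The comparison step itself can be very simple. Below is a minimal sketch, with entirely made-up timings, of how per-metric results from multiple iterations might be reduced to a single cloud-vs.-dedicated ratio:

    import statistics

    # Hypothetical per-iteration timings in seconds; real numbers would come
    # from running an identical benchmark suite on both environments.
    dedicated = {"compute": [4.1, 4.2, 4.0], "storage": [2.0, 2.1, 2.0]}
    cloud = {"compute": [4.9, 5.3, 5.0], "storage": [2.6, 2.4, 2.5]}

    for metric in dedicated:
        baseline = statistics.median(dedicated[metric])
        candidate = statistics.median(cloud[metric])
        # A ratio above 1.0 means the cloud run was slower than the baseline.
        print(f"{metric}: cloud/dedicated = {candidate / baseline:.2f}x")

Medians are used here rather than means because a single noisy iteration on shared, multi-tenant infrastructure can badly skew an average.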

Get Ready to Dig for Data

Cloud benchmarking is not quite the norm yet, nor will it be for some time. It is imperative that enterprises seek out third-party benchmarking reports from credible research organizations. Most IT decision makers already engage cloud research organizations for reports and consultation. Now ask them for cloud benchmarking and see what comes up. While some organizations will admit they aren’t there yet, you might be surprised how many are diving in and gathering invaluable data to help you make your next cloud buying decision.

Businesses have a lot of cloud options in front of them, and they should have the right data available to select a cloud infrastructure (or multiple cloud infrastructures). Benchmarking can establish reasonable expectations around performance quality and other key factors, and help accurately compare one cloud to another. It’s time to compare apples to apples and benchmark the cloud.

Todd Gleason, Director of Innovation

As the director of innovation at FireHost, Todd Gleason is responsible for driving key new offerings in FireHost’s secure cloud hosting and for ensuring the company stays a step ahead of the market. He oversees research and development of new initiatives that keep customers safe, compliant, and performing at their best.

Kurt Hagerman, Director of Information Security

As the director of information security at FireHost, Kurt Hagerman oversees all compliance and security initiatives. Hagerman is responsible for helping FireHost attain ISO, PCI, HIPAA, and other certifications, which allows FireHost customers to more easily meet the compliance requirements of their own businesses. His position also includes merging information security and compliance into one organization and enacting a strong security program in which levels of compliance are by-products.

