
How to Navigate Data into the Cloud

October 31, 2014

Featured article by Dave Wagner, TeamQuest Director of Market Development

The increase in businesses moving their data into the cloud has created several new challenges for data center managers. First, many departments and employees are increasingly using the public cloud to host unpredictable applications with unknown resource requirements. Second, ongoing operational management considerations must be fleshed out for the period after the cloud move. Both challenges are further complicated by information privacy and security requirements.

It is intuitively understood that multi-tenancy, self-service portals, and virtualization (shared resources) can affect information privacy and security. But the rapid growth of dynamically changing virtual and cloud configurations is upsetting the heretofore well-understood management approaches for security in multi-tenant and self-service environments, which could be handled with relatively simple policies and procedures. It is the additional complexity of the latest highly dynamic configurations for virtualized “everything” and cloud environments, in which IaaS, PaaS, and even AaaS resources are made available, used, reconfigured, and returned, that has upset the complacent security-management applecart.

These additional dynamic complexities also significantly impact IT’s ability to analyze, report, and plan the resources needed to deliver expected levels of service cost-effectively. Combining performance data from end-user experience (response times, throughput of business transactions, etc.) with data from the dynamically changing, shared IT resources underpinning the services themselves is hugely challenging.
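As a minimal sketch of what combining those two data feeds can look like, the snippet below joins hypothetical end-user response-time samples with hypothetical host CPU metrics by timestamp, then flags the intervals where slow transactions coincide with saturated shared resources. All sample values and thresholds here are invented for illustration, not drawn from any particular monitoring product.

```python
# Hypothetical samples: (timestamp_seconds, response_time_ms) from end-user monitoring
response_samples = [(0, 120), (300, 135), (600, 480), (900, 140)]

# Hypothetical samples: timestamp -> cpu_utilization_pct from the shared hosts
cpu_samples = {0: 55, 300: 60, 600: 97, 900: 58}

# Join the two feeds on timestamp so each interval carries both the
# user-facing metric and the underlying resource metric.
correlated = [
    (ts, rt, cpu_samples[ts]) for ts, rt in response_samples if ts in cpu_samples
]

# Flag intervals where a slow transaction coincides with a saturated host
# (illustrative thresholds: > 300 ms response time, > 90% CPU).
slow_and_saturated = [
    (ts, rt, cpu) for ts, rt, cpu in correlated if rt > 300 and cpu > 90
]
print(slow_and_saturated)
```

In this toy data set only the 600-second interval is flagged; in practice the join would run over streams from a real end-user monitoring probe and a resource collector.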

With that said, data center managers are now trying to navigate these challenges. According to a recent IDC whitepaper, “Cloud computing represents a new set of challenges for performance and capacity management professionals. Cloud infrastructures are typically based on shared, pooled, highly virtualized hardware and operating environments. Clouds require extensive management software for enabling such functions as resource allocation, self-service catalogs, automated provisioning, service-level management, usage-based metering and billing, and support for dynamic expansion and contraction of resources or ‘elasticity.’ As such, clouds represent technology evolution for the delivery of business services and workloads, but cloud infrastructures still require the same performance and capacity management functions as more conventional infrastructures.”

Planning for the Move

Like anything in IT, planning and transparency are necessary for moving data into the cloud. Provisioning in the cloud too quickly can lead to many pitfalls: losing control of your IT processes, overestimating your cloud capabilities, and exponentially increasing the cost of your cloud initiative, all with no guarantee of acceptable service delivery. Planning will also speed the release of new IT services into the cloud. As cloud environments become more dynamic and more abstracted, the key to success is transparency.

Moving the data itself requires a security “audit” up front, i.e., organizational approval of which data is allowed to be placed in a public cloud and which is not. This is critical if the cloud strategy is public or hybrid. Private-only clouds have fewer issues because the data has not been externalized from the organization, though there may still be concerns within it (e.g., compartmentalization of data for privacy).
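One way to make such an audit concrete is to partition datasets by a classification policy before any migration begins. The sketch below assumes a hypothetical policy in which only "public" and "internal" data may leave the organization; the dataset names and classification labels are invented for illustration.

```python
# Hypothetical policy: classifications approved to leave the organization.
CLOUD_ALLOWED = {"public", "internal"}

datasets = [
    {"name": "marketing_assets", "classification": "public"},
    {"name": "customer_pii", "classification": "regulated"},
    {"name": "build_artifacts", "classification": "internal"},
]

def audit_for_cloud(datasets):
    """Partition datasets into those approved for a public cloud
    and those that must be retained on premises."""
    approved, retained = [], []
    for ds in datasets:
        target = approved if ds["classification"] in CLOUD_ALLOWED else retained
        target.append(ds["name"])
    return approved, retained

approved, retained = audit_for_cloud(datasets)
```

The point of the exercise is that the approval decision is recorded as policy, not made ad hoc during the migration itself.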

Independent of security aspects, and whether you are in a private cloud or using a hosting provider, several key requirements exist:

- You will have to monitor and report on your agreed service levels back to your constituencies. You therefore need some visibility into the IT service or application, or at least information about response time from the user’s perspective. If you don’t measure these, you are not demonstrating business value, because you are not aligning with what the business cares about.

- Because service performance requires timely access to acceptably performing IT resources, you need to understand your applications and their computing resources in the cloud. Collecting performance data lets you understand resource consumption and estimate the initial sizing of your cloud environment. Toolsets to monitor end-user experience are a must, not only for tracking SLA adherence but also as a baseline for performance and capacity optimization. You can automate the measurement of end-user response time with a spectrum of approaches: at one end, less costly, easier sampling of representative transactions at low granularity (for example, every 5 minutes); at the other, high-granularity, high-fidelity transactional decomposition tools that measure each and every real transaction. The price for that accuracy is typically a complex, invasive, and potentially expensive solution.

- Application workloads need their performance continuously analyzed against actual resource requirements to ensure optimal usage of the IT resources provided by IaaS and PaaS cloud platforms. On a private cloud you can monitor these resources and create alerts that speed restoration by letting you drill down to pinpoint the causes of incidents. In public cloud environments, access to these types of metrics is typically not possible, since cloud providers do not expose this level of detail about the underlying infrastructure. There is, after all, a conflict of interest here: such providers make their margins by effectively having you pay for more than you are using. Sometimes this will matter to you, and sometimes it will not.
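The sampled-transaction approach described above can be sketched in a few lines: time a representative transaction with a high-resolution clock, collect those samples on a schedule, and report what fraction of them met the agreed response-time ceiling. The 500 ms SLA and the sample values are hypothetical.

```python
import time

SLA_MS = 500  # hypothetical agreed response-time ceiling, in milliseconds

def timed_transaction(fn, *args):
    """Run one representative (synthetic) transaction and
    return its response time in milliseconds."""
    start = time.perf_counter()
    fn(*args)
    return (time.perf_counter() - start) * 1000.0

def sla_compliance(samples_ms, sla_ms=SLA_MS):
    """Return the share of sampled transactions that met the SLA."""
    met = sum(1 for s in samples_ms if s <= sla_ms)
    return met / len(samples_ms)

# Simulated 5-minute samples (ms); in practice each value would come
# from timed_transaction() run against the live service.
samples = [120.0, 180.0, 650.0, 140.0]
compliance = sla_compliance(samples)
```

Here three of the four simulated samples meet the ceiling, giving 75% compliance; the same numbers also serve as the baseline for later capacity-optimization work.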

To stay competitive, your cloud initiatives should continually improve and reinvent your processes and service offerings. Done well and transparently, they let you drive the resource demand curve, making it easier to provision the appropriate resources at the right time and cost while ensuring that service levels meeting business goals are achieved. Service improvement should start immediately after your cloud implementation is done, not nine months later. Measuring SLAs, conducting gap analyses, and identifying potential risks and the efficiency of services and processes must be ongoing. This continuous process is how IT maturity is gained.


About Dave Wagner

TeamQuest Director of Market Development Dave Wagner has more than 30 years of experience in the performance and capacity management space. He currently leads the cross-company Innovation Team and serves as worldwide business and technology thought leader to the market, analysts and media.



