
Selecting Platforms to Optimize IT Operations

January 9, 2017

Featured article by Jason Anderson, Chief Architect at Datalink

These are the best of times and the worst of times for IT managers. On the one hand, newer technologies like cloud, flash storage, and converged and hyper-converged systems provide compelling on- and off-premise choices for optimizing IT infrastructure and workloads. On the other, the number of choices and the complexity of the decision-making process present a variety of hurdles for busy IT teams, frustrating efforts to use these new platforms to create leaner and more agile IT environments.

One result is that relatively few IT organizations are operating at peak efficiency. According to a recent study commissioned by Datalink, just 9% of IT managers consider their companies’ data centers and related technology operations to be fully optimized for efficient and cost-effective management. Nearly half ranked their optimization level between 1 and 6 on a scale of 1 to 10, demonstrating how far IT teams need to go to close the gap.

Other survey findings show the struggles that IT organizations are facing in the optimization arena. Among them:

– More than 60% reported that defining a data center platform strategy is one of their top IT challenges, requiring complex evaluations and analyses to determine the optimal mix of technologies to meet each organization’s specific IT and business needs.

– Multiple time and resource obstacles make it difficult to undertake optimization projects, including day-to-day IT responsibilities, the effort required to assess new technology choices, skills gaps, rapid growth in both data and applications, and the need to demonstrate that IT investments will deliver benefits to the business.

– Nearly 40% of the IT teams that have deployed applications to a public cloud as part of their optimization campaign have brought at least some of them back in-house, primarily because of security concerns, unexpected costs or both. Other reasons cited include dissatisfaction with manageability, reliability/performance, lack of flexibility or customization, support/service, and lack of control over data or resources.

This retreat from the public cloud is likely temporary, given other survey results indicating that the proportion of IT infrastructure and application workloads residing in the public cloud will climb from 14% to 23% over the next two years. But it highlights a fundamental flaw in many IT organizations’ approach to platform selection in today’s IT landscape.

The problem is that IT personnel frequently fail to perform a thorough analysis of workload and business requirements before making cloud or other platform decisions. That leads to mistakes as well as mid-course corrections that require the time and expense of re-platforming.

In fact, according to the survey, only 33% of IT managers who report poor optimization in their IT environments have completed an application inventory, just 24% have conducted an interdependency analysis, and a mere 12% have defined the workload requirements for the platforms that run business applications. In contrast, those with highly optimized operations are two to three times more likely to have done their due diligence in all three areas.
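The three due-diligence steps the survey highlights (application inventory, interdependency analysis, and defined workload requirements) can be captured in a minimal data model. The sketch below is illustrative only; the field names, example applications, and readiness rule are assumptions, not anything Datalink prescribes:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Application:
    """One row of an application inventory (all fields illustrative)."""
    name: str
    # None means the interdependency analysis has not been done yet;
    # an empty list means it was done and found no dependencies.
    depends_on: Optional[list[str]] = None
    # Workload requirements, e.g. {"iops": 8000, "latency_ms": 10}.
    requirements: Optional[dict[str, float]] = None

def platform_ready(app: Application) -> bool:
    """An app is ready for a platform decision only when both the
    interdependency analysis and the workload requirements exist."""
    return app.depends_on is not None and app.requirements is not None

inventory = [
    Application("erp", depends_on=["db"], requirements={"iops": 8000}),
    Application("db", depends_on=[], requirements={"iops": 20000}),
    Application("legacy-batch"),  # inventoried, but no further analysis
]

not_ready = [a.name for a in inventory if not platform_ready(a)]
print(not_ready)  # ['legacy-batch']
```

Even a simple gate like this makes the gap visible: any application still flagged here has no business being moved to a new platform yet.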

And those are just the first steps in determining the best workload optimization strategy for your organization. Once you have gathered this information and established your baseline, you need to explore questions such as:

– How do I determine the optimal platform for a specific workload?

– How do cloud costs compare to current enterprise costs? Private? SaaS? Public?

– How will I transform or manage legacy systems?

– What are my security and regulatory compliance requirements?

– What will my service provider strategy be?

– What operational processes can I streamline?

– How do I improve my agility to deliver services more efficiently to the business?

– What IT roles will I need to deliver my cloud strategy? Service architects? Orchestration & automation engineers? IT broker consultants?

– Can I consolidate or eliminate data centers?

– What’s my current IT total cost of ownership and how will that change?
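The cost questions in this list (cloud versus enterprise costs, current and future TCO) reduce to straightforward arithmetic once per-workload costs are known. A minimal sketch with purely illustrative numbers and a deliberately simplified cost model, not a figure from the survey:

```python
def annual_tco(capex: float, years: int, opex_per_year: float) -> float:
    """Simplified annual TCO: straight-line amortized capex plus opex.
    A real model would also cover power, staffing, licensing,
    migration effort, and so on."""
    return capex / years + opex_per_year

# Illustrative numbers only (not from the article).
on_prem = annual_tco(capex=500_000, years=5, opex_per_year=120_000)
public_cloud = annual_tco(capex=0, years=5, opex_per_year=260_000)

print(f"on-prem: ${on_prem:,.0f}/yr")       # on-prem: $220,000/yr
print(f"cloud:   ${public_cloud:,.0f}/yr")  # cloud:   $260,000/yr
```

Run per workload rather than in aggregate, this kind of comparison is what lets each application land on the platform that is actually cheapest for its requirements, instead of a single blanket decision.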

Any decisions must also be shaped by the business goals and demands articulated by the C-suite. The days of basing IT purchases on three- to five-year infrastructure upgrade cycles are gone, replaced by a need to tie IT investments to the strategic business value that a specific technology can contribute. IT leaders are now expected to help their employers drive growth, mitigate risk, cut costs, accelerate time to market for new initiatives, and more. These business concerns must be factored into the equation.

It’s a complex, time-consuming process, but ultimately it will make it possible to determine which mix of platforms is right for you as well as which workloads should go where. You’ll likely wind up with a combination of public cloud, private cloud, all-flash converged technology stacks, hyper-converged systems, and the legacy infrastructure already in your data center – all carefully aligned to the workload requirements of each application in your ecosystem.

The upshot: a modernized IT operation that minimizes server and storage capacity, maximizes energy efficiency, improves the customer experience through higher service levels, enables your IT operation to scale cost-effectively, and helps your in-house IT team do more with less. By any measure, that’s a successful IT makeover.


Jason Anderson is the Chief Architect at Datalink, an IT services provider that helps mid- and large-size enterprises align IT with intended business results. Jason consults with clients on their business needs and recommends data center transformation strategies that leverage next-generation technology for greater efficiency, risk mitigation, and an enhanced customer experience. He also oversees the Datalink architecture practice and researches, monitors, and recommends emerging technologies for Datalink’s portfolio.
