IT Briefcase Exclusive Interview: How Smart Availability Can Help Organizations to Reap the Full Benefit of Analytics, When Applied to IT and Business Processes

September 20, 2018

The bottom-line value associated with data-driven processes has become increasingly intertwined with analytics. In many instances, analytics has become the key focus of business-critical apps, whereas it was once simply regarded as a supplement. Today, we speak with Don Boxley, CEO and Co-Founder of DH2i, about the criticality of analytics and how organizations can put themselves in the best position to plan, deploy and reap its full benefits.

  • Q: Today, it seems like applying analytics to business processes is being offered as a panacea for every IT woe – the bottom-line answer for creating competitive advantage. What do you feel must be considered, and/or what hurdles must be overcome before you can reap its full benefit?

DH2i: As the objectives for applying analytics to business processes have grown, so has the complexity of deployments. Organizations regularly confront situations in which data is dispersed across an abundance of environments, making it difficult and time-consuming to centralize for a single use case. Perhaps even more prevalent is the situation in which deploying in other settings (such as on Linux platforms, in the cloud, or with containers) would be beneficial, but technological or financial constraints are prohibitive.

The truth in today’s ever-changing data space is that enterprises need agility for analytics as much as for any other aspect of competitive advantage.

  • Q: Processing is optimized by performing analytics as close to data as possible. Are organizations finding that they need to switch locations for disaster recovery, scheduled downtime, or situations such as limited-time pricing offers, in the cloud?

DH2i: By implementing an agile approach based on what is now being called “Smart Availability,” as opposed to traditional high availability (HA), organizations can dynamically provision analytics in numerous environments to satisfy business use cases, seamlessly migrating data between on-premises settings (including both Windows and Linux machines), the cloud and containers.

Consequently, they reap decreased infrastructure costs, effective disaster recovery, and an overall greater yield from analytics, and from their data in general.

  • Q: Can you explain how Smart Availability improves analytics in cloud deployments?

DH2i: There are several advantages to going to the cloud for analytics, not the least of which are the pay-per-use pricing model, decreased infrastructure, and elastic scalability of cloud resources. There are also a number of software as a service (SaaS) and platform as a service (PaaS) options. Some of these involve advanced analytics capabilities for machine learning and neural networks for users without data science teams. Nevertheless, the most persuasive reason for running analytics in the cloud is the alternative: attempting to scale on premises.

Historically, scaling in physical environments involved an exponential curve with a number of immutable costs which oftentimes limited enterprise agility. However, by scaling in the cloud and with other contemporary measures, organizations can experience a far more affordable linear curve.
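The cost-curve contrast described above can be made concrete with a toy model. The numbers below are hypothetical, chosen only to show the shape of the argument: on-premises capacity must be bought in large fixed increments (whole servers, plus licenses), while pay-per-use cloud pricing grows roughly linearly with demand.

```python
def on_prem_cost(units, server_capacity=100, server_price=50_000):
    """Capacity is bought in whole-server increments (hypothetical prices)."""
    servers = -(-units // server_capacity)  # ceiling division: no partial servers
    return servers * server_price

def cloud_cost(units, price_per_unit=120):
    """Pay-per-use pricing grows linearly with demand (hypothetical rate)."""
    return units * price_per_unit

# A small workload already costs a full server on premises,
# while cloud spend stays proportional to actual usage.
print(on_prem_cost(10), cloud_cost(10))
```

The step function on the on-premises side is what produces the “immutable costs” mentioned above: crossing a capacity boundary by a single unit triggers the full price of another server.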

This point is ideally illustrated by a healthcare example in which a large Australian healthcare group was using SQL Server on premises for its OLTP, yet the organization wanted to deploy a cloud model for Business Intelligence (BI). The choice was clear: either defy budget constraints by splurging on added physical infrastructure (with all the obligatory costs for licenses and servers) or deploy to the cloud for real-time data access of their present IT assets. The latter option maximized operational efficiency and decreased costs, as do the majority of well-planned and implemented cloud analytics solutions.

  • Q: Does optimizing cloud analytics always involve continually replicating on-premises data to the cloud? How might you minimize costs?

DH2i: Shrewd organizations minimize these costs by opting for asynchronous replication; the aforementioned healthcare entity did so with approximately one second of latency, giving near real-time access to its healthcare data. Replication to the cloud is often inexpensive or even free, making the data transfer component highly affordable. By making this data available for BI in the cloud, this customer enjoyed several advantages. The most prominent was the reproducibility of a single dataset for multiple uses. Business users—in this case physicians, clinicians, nurses, etc.—are now able to access this read-only data for intelligence to impact diagnosis or treatment options. Furthermore, they do so while the original data remains accessible to other users on premises for OLTP functions.
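The asynchronous pattern described above can be sketched in a few lines. This is illustrative only: real systems such as SQL Server replicate at the transaction-log level, but the essential property is the same, namely that writers commit locally and never wait on the network, while a background process ships changes to a read-only replica.

```python
import queue
import threading

class AsyncReplicator:
    """Minimal sketch of asynchronous replication to a read-only replica.

    Illustrative stand-in for log shipping: `primary` plays the on-premises
    OLTP store, `replica` the read-only cloud copy used for BI.
    """

    def __init__(self):
        self.primary = {}           # authoritative on-premises store (OLTP)
        self.replica = {}           # read-only cloud copy (BI)
        self._log = queue.Queue()   # in-memory stand-in for a replication log
        self._worker = threading.Thread(target=self._ship, daemon=True)
        self._worker.start()

    def write(self, key, value):
        # Commit locally and return immediately; shipping is asynchronous,
        # so OLTP throughput is unaffected by network latency.
        self.primary[key] = value
        self._log.put((key, value))

    def _ship(self):
        # Background thread applies queued changes to the replica:
        # near real-time, not synchronous.
        while True:
            item = self._log.get()
            if item is None:
                self._log.task_done()
                return
            key, value = item
            self.replica[key] = value
            self._log.task_done()

    def flush(self):
        # Block until every queued change has reached the replica.
        self._log.join()

    def close(self):
        self._log.put(None)
        self._worker.join()
```

BI readers would query only `replica`, so reporting load never contends with the writers feeding `primary`, which is precisely the separation the next question raises.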

  • Q: With this paradigm, it sounds like there are no performance issues compromising the work of those using on-premises data because of reporting—which might occur if each group was provisioning the same copy of the data for their respective uses.

DH2i: Correct. With this model, each party enjoys equal benefit. The healthcare organization benefits from the primary data being stored on premises, which is important for compliance measures in this highly regulated industry. It’s also vital to note the flexibility of this architecture, which most immediately affects cloud users. Organizations can establish clusters in any of the major cloud providers, such as Amazon Web Services (AWS) or Microsoft Azure, or in any private or hybrid clouds they choose. They can also easily shift resources between these providers as they see fit, such as according to use case or for discounted pricing. Moreover, when they no longer require those analytics they can immediately halt those deployments, or simply transfer them to other environments involving containers, for example.

The aforementioned healthcare organization also enjoys a third advantage of the Smart Availability approach to running analytics in the cloud: automatic failover. Should it experience any type of downtime for on-premises infrastructure (whether scheduled maintenance or a catastrophic event), its active workloads automatically fail over to the cloud. The ensuing continuity enables all users to keep accessing data, with no downtime ramifications: primary workloads simply transfer to cloud servers and keep running. This is a prime example of the agility of the Smart Availability approach. Workloads run continuously despite downtime situations, and they run wherever users specify in order to create the most meaningful competitive advantage. Most high availability methods don’t give users the flexibility of choosing between Linux and Windows settings. Smart Availability solutions also simplify management and improve resiliency for Availability Groups, provisioning resources where they’re needed without downtime.
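The failover behavior described above boils down to a health-check loop: probe the on-premises primary, tolerate transient misses, and promote the cloud standby only after repeated consecutive failures. The sketch below is a hypothetical simplification; Smart Availability products orchestrate this at the workload level rather than with a single controller.

```python
class FailoverController:
    """Minimal sketch of automatic failover to a cloud standby.

    `probe` is any callable returning True while the on-premises primary
    is healthy. Requiring several consecutive misses (`grace_checks`)
    avoids failing over on a single dropped health check.
    """

    def __init__(self, probe, grace_checks=3):
        self.probe = probe
        self.grace = grace_checks
        self.active = "on-prem"     # where workloads currently run
        self._misses = 0

    def check(self):
        if self.probe():
            self._misses = 0        # healthy: reset the miss counter
        else:
            self._misses += 1
            if self._misses >= self.grace and self.active == "on-prem":
                # Primary is considered down: promote the cloud standby
                # so workloads keep running off-site.
                self.active = "cloud"
        return self.active
```

With `grace_checks=3`, a primary that stops responding is failed over on the third consecutive missed check, while a single transient miss leaves workloads on premises.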

  • Q: So, do Smart Availability methods enable users to maximize analytic output by creating recurring advantages from what is essentially the same dataset?

DH2i: Exactly. Users can move copies of that data to and between cloud providers for low latency analytics capabilities. Moreover, this approach enables users to do so while maintaining critical governance and performance requisites for on-premises deployments. Best of all, these benefits are sustained while automatically failing over to offsite locations, which in turn preserves the continuity and total dependability of workflows. All of this in an era in which information technology is anything but predictable.

About Don Boxley, CEO and Co-Founder, DH2i (www.dh2i.com)

Don Boxley Jr. is co-founder and CEO of DH2i (www.dh2i.com). Prior to DH2i, Don held leadership roles at Hewlett-Packard, where he was instrumental in sales and marketing strategies that resulted in significant revenue growth in the scale-out NAS business. Boxley spent more than 20 years in management positions at leading technology companies, including Hewlett-Packard, CoCreate Software, Iomega, TapeWorks Data Storage Systems and Colorado Memory Systems. Boxley earned his MBA from the Johnson School of Management, Cornell University.

About DH2i

DH2i Company is the leading provider of multi-platform Smart Availability™ software for Windows Server and Linux Server databases and stateful Docker containers. Its flagship product, DxEnterprise®, drastically reduces IT management complexity, enables nearest-to-zero planned and unplanned downtime, unlocks 30-60% cost savings and can reduce the number of OSes under management by 8-15x. DxEnterprise gives you data tier portability from any host, to any host, anywhere. Intelligent automation ensures that workloads and containers only come online where they can perform at an optimal level, compliant with business requirements and SLAs. To learn more, please visit: www.dh2i.com, call: 800-380-5405 or +44 20 3318 9204, or email: info@dh2i.com.
