
IT Briefcase Exclusive Interview: The Key to Optimal Virtualization with Augie Gonzalez, DataCore Software

December 21, 2012

Rapid change is nothing new to data management. As virtualization expands and transforms, users are demanding flexibility and agility that is best delivered within a cloud environment.

In the interview below, Augie Gonzalez of DataCore Software discusses the challenges that arise when moving third-party applications to a virtualized environment, and offers expert advice on how businesses can operate more efficiently within private, public, and hybrid clouds.

  • Q. How do you see Virtualization and Cloud Computing as having evolved and changed over the last 10 years?

A. They’ve gone from skepticism to mainstream.

It seems like only a few years ago that virtualization was met with cynicism by users and manufacturers alike. Application Specific Integrated Circuits (ASICs) were the closest anyone got to programmable logic, and the only form of hardware emulation anyone trusted. Mainframe and UNIX partitioning techniques were getting attention, but it wasn’t until Dell started promoting VMware on x86 machines that server virtualization was taken seriously.
The attention then shifted from core machine emulation to the management tools and utilities responsible for provisioning, resource balancing, clustering and failover.

When DataCore first introduced storage virtualization software for open systems in ‘98, there was no such point of reference. It, too, was greeted with disbelief. How could an off-the-shelf software stack running on commodity Windows servers possibly improve on a seven-foot-tall, two-ton, $3.5 million storage behemoth enclosed in hardened steel? Despite convincing benchmarks, a superset of features, and early adopter testimonials, it wasn’t until server hypervisors took hold that storage emulation was given any meaningful consideration.

Over the last three years, all of that has changed. The pundits now project software-defined storage to be the next inevitable virtualization wave, shaping how current and future generations of purpose-built hardware are pooled and managed.

Cloud computing, on the other hand, enjoyed a brisker takeoff, even though talk of cloud computing just six years ago mostly brought on giggles, absent the friendly, yet-to-be-invented “l.o.l.” It quickly blossomed with the success and widespread use of SalesForce.com.

The new breed of experts wove these concepts together around 2010. They finally understood the advantages of using storage virtualization software to build out scalable, robust hybrid clouds spanning on-premises gear and offsite resources, clouds whose hardware makeup no one knows for sure, since they morph incessantly in many directions.

  • Q. How can businesses today begin to address the challenges that arise when moving mission critical applications over to a virtualized environment?

A. Begin by flipping your architectural design priorities around. Instead of paying so much attention to processor loading, where there is plenty of headroom, direct more attention to the I/O pile-ups that occur when consolidated apps collide at intersections trying to reach shared disk resources. Study after study reveals how contention for storage, and the paths to it, dominates the behavior of mission-critical apps in virtualized environments.

This explains why storage virtualization software has entered the spotlight. Among its major roles is playing traffic cop: preventing the pile-ups and fulfilling I/O requests rapidly, without collisions.

  • Q. What pointers can you offer to help businesses function more efficiently within private, public, and hybrid cloud environments?  

A. Start by altering your image of data and apps. Rather than static objects confined to a machine, think of them as transient shipping containers free to move where processing / safeguarding costs are most appropriate. Sometimes you’re willing to incur longer shipping delays to get a really low price (public cloud). Other times, urgency dictates expediting through the speediest facility which charges a premium (private cloud).

Since you are unaccustomed to making these decisions, and likely unable to make them quickly enough given the volume of traffic, we suggest you hire a seasoned shipping broker, one with a track record for optimizing each load against specific criteria.

In the 21st century, the broker is a piece of software – actually, a combination of software. Advanced utilities spanning clustered servers, such as dynamic resource schedulers, take care of choosing the best processing location. Advanced auto-tiering software, like that from DataCore, dynamically selects the best storage location. Together, they balance performance and security against cost and convenience.
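As a rough illustration of this broker idea (a sketch only, not DataCore’s or any vendor’s actual logic), a placement policy can weigh a workload’s latency requirement against storage cost when deciding where its data should live. The tier names, costs, and latencies below are hypothetical.

```python
# Hypothetical sketch of a "shipping broker" placement policy: it weighs
# urgency (latency requirement) against cost to pick where data should live.
# Tier names, costs, and latencies are illustrative, not any vendor's figures.
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    cost_per_gb: float      # relative monthly cost
    latency_ms: float       # typical access latency

TIERS = [
    Tier("on_prem_ssd",  cost_per_gb=0.50, latency_ms=1),
    Tier("on_prem_disk", cost_per_gb=0.10, latency_ms=10),
    Tier("public_cloud", cost_per_gb=0.02, latency_ms=80),
]

def place(max_latency_ms: float) -> Tier:
    """Pick the cheapest tier that still meets the latency requirement."""
    eligible = [t for t in TIERS if t.latency_ms <= max_latency_ms]
    if not eligible:
        # Nothing is fast enough, so fall back to the fastest tier available.
        return min(TIERS, key=lambda t: t.latency_ms)
    return min(eligible, key=lambda t: t.cost_per_gb)

# A latency-sensitive database volume stays on premises; an archive can ship to the cloud.
print(place(max_latency_ms=5).name)     # -> on_prem_ssd
print(place(max_latency_ms=200).name)   # -> public_cloud
```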

  • Q. How is DataCore currently working to help organizations integrate different storage levels and different devices at a reasonably low cost?

A. First, by taking the difficulty out of it. And, just as importantly, by re-introducing buyer choice into the selection process.

These days it’s clear that no single device or manufacturer can meet the foreseeable storage needs of an organization. There’s a healthy competitive environment among the popular suppliers for your disk spending, not to mention numerous startups inventing new gear to cover the gaps left unanswered by the established vendors.

But a lack of backward compatibility, both at the device level and within the management software, makes it difficult for many buyers to entertain different suppliers or migrate to newer models. They are essentially locked out from putting these breakthrough technologies into their workflows.

DataCore breaks the stranglehold, enabling IT organizations to shop for the best value among competitive alternatives, easily accommodating state-of-the-art solutions alongside contemporary and legacy equipment. I say solutions, because sometimes you will be subscribing to services rather than acquiring equipment in order to strike the ideal balance between on-premises disks and cloud storage. 

  • Q. How do DataCore’s Auto Tiering and High Speed Caching solutions work together to help businesses optimize across different types of storage?

A. It’s a two-pronged, automated approach, necessary to respond swiftly enough to the dynamic nature of today’s mixed workloads and the variability among the many classes of storage resources. The moment apps moved out of dedicated private machines and into consolidated virtual servers, all bets were off. No longer could we rely on manual tuning to keep response times satisfactory.

DataCore designed smart software to quickly adjust how the infrastructure responds to competing demands for shared resources. The algorithms have been refined over the past 14 years. Essentially, auto-tiering directs high-priority, time-sensitive requests to the fastest storage assets (flash memory / SSDs), while assigning lower-cost storage to jobs where turnaround time is less critical. That could be bulk scratch storage on Amazon S3 in a hybrid cloud configuration.
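As a minimal sketch of the general technique described here (frequency-based tier placement, not DataCore’s actual algorithm), one could track per-block access counts and promote the hottest blocks to the fast tier on each rebalance pass. The class, names, and capacity figures below are hypothetical.

```python
# Minimal sketch of access-frequency-based auto-tiering: blocks touched most
# often are promoted to the fast (flash/SSD) tier, the rest stay on bulk storage.
# This illustrates the general idea only, not any vendor's implementation.
from collections import Counter

class AutoTier:
    def __init__(self, fast_capacity_blocks: int):
        self.fast_capacity = fast_capacity_blocks
        self.access_counts = Counter()

    def record_io(self, block_id: int) -> None:
        """Called on every read/write to track how hot each block is."""
        self.access_counts[block_id] += 1

    def rebalance(self):
        """Return (blocks for the fast tier, blocks for the bulk tier)."""
        hottest = [b for b, _ in self.access_counts.most_common(self.fast_capacity)]
        fast = set(hottest)
        slow = set(self.access_counts) - fast
        return fast, slow

tiering = AutoTier(fast_capacity_blocks=2)
for block in [1, 1, 1, 2, 2, 3, 4, 1, 2]:
    tiering.record_io(block)
fast, slow = tiering.rebalance()
print(fast, slow)   # blocks 1 and 2 earn flash; 3 and 4 stay on bulk storage
```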

The infrastructure-wide caching techniques built into SANsymphony-V complement auto-tiering by keeping frequently accessed data in large, high-speed memory close to the apps. The software speeds up response, particularly for Tier-1 apps like Oracle, SAP, SQL Server and Exchange, by learning and adjusting in real time. It anticipates the next read requests and preloads them from disk into inexpensive but lightning-fast RAM. It also gathers random writes in RAM when the disks are too busy, then streams them out in a more efficient sequential transfer to reduce update latency.
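To make those two caching behaviors concrete, here is a toy sketch assuming a simple dict-backed block device (it is not SANsymphony-V’s implementation): reads trigger a read-ahead of the following blocks into RAM, and random writes are gathered in a RAM buffer and later flushed as one sorted, sequential pass.

```python
# Toy sketch of read-ahead caching and write coalescing over a dict-backed
# "disk". Illustrative only; real caching layers work at the block-device level.

class CachingLayer:
    def __init__(self, backend, prefetch_depth=4):
        self.backend = backend            # dict-like block store (the "disk")
        self.cache = {}                   # RAM cache: block_id -> data
        self.write_buffer = {}            # pending writes gathered in RAM
        self.prefetch_depth = prefetch_depth

    def read(self, block_id):
        if block_id in self.cache:
            return self.cache[block_id]
        data = self.backend.get(block_id)
        # Anticipate sequential access: preload the next few blocks into RAM.
        for nxt in range(block_id, block_id + self.prefetch_depth + 1):
            if nxt in self.backend:
                self.cache[nxt] = self.backend[nxt]
        return data

    def write(self, block_id, data):
        self.cache[block_id] = data
        self.write_buffer[block_id] = data   # acknowledge from RAM, defer disk I/O

    def flush(self):
        # Stream buffered random writes to disk in block order (one sequential pass).
        for block_id in sorted(self.write_buffer):
            self.backend[block_id] = self.write_buffer[block_id]
        self.write_buffer.clear()

disk = {i: f"block-{i}" for i in range(16)}
cache = CachingLayer(disk)
cache.read(0)                    # also prefetches blocks 1-4 into RAM
cache.write(9, "new"); cache.write(3, "new")
cache.flush()                    # buffered writes land in sorted order: 3, then 9
```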

  • Q. Do you think that there will be an eventual paradigm shift to Cloud Computing and Virtualization being the primary solutions supporting modern business functionality?

A. The paradigm shift to virtualization is well underway, so I don’t need to pull out my crystal ball for that one. Among our clients worldwide, we’re also seeing cloud computing selectively employed today, mostly in hybrid scenarios that mix private, on-premises data center properties with rented disk space maintained by third parties on public clouds.

The transition need not be as abrupt and painful as you might suspect. Nor is it binary, definitely not on the storage side of the equation. DataCore simultaneously provisions virtual disks to classic, siloed servers and newly virtualized machines with equal ease. We effectively help customers cut over from past ways into the future on their chosen timetable, without the stress of uncertainty.


Augie Gonzalez is the Director of Product Marketing for DataCore Software Corp. He has more than 25 years of experience developing, marketing and managing advanced IT products.  Before joining DataCore, he led the Citrix team that introduced simple, secure, remote access solutions for SMBs. Prior to Citrix, Gonzalez headed Sun Microsystems Storage Division’s Disaster Recovery Group. He’s held marketing / product planning roles at Encore Computers and Gould Computer Systems, specializing in high-end platforms for vehicle simulation and data acquisition.

Gonzalez holds a bachelor of science degree in civil engineering from the University of Florida.

Gonzalez also contributed to the Space Launch and Cruise Missile programs as Lead Engineer for General Dynamics mechanical test facilities and had a hand in the introduction of Computer Aided Manufacturing to the company.

 
