
IT Briefcase Exclusive Interview: Maximizing Business Value through Data Management with Robert Eve, Composite Software

January 25, 2013

With Big Data on the rise, enterprises today have access to a multitude of new tools and technologies, and therefore, massive amounts of data.

In the below interview, Robert Eve from Composite Software offers expert advice for organizations looking to effectively integrate and analyze data to increase overall business value.

  • Q. What key changes in data management and data storage do you think have had the greatest impact over the last 10 years?

A. The question should be what hasn’t changed!  Ten years ago:

– We consumed information on our desktops at work.  Today we get our information on our phones, tablets and desktops 24×7.

– We went to IT to get our information.  Today we are more self-service.  We just want the data from IT and we will analyze it ourselves.

– Everything we needed was on our internal corporate IT systems.  Today we are just as interested in external information.  Further, many of the IT systems have moved to the cloud.

– We focused on trying to understand what had happened historically.  Today we want to predict what will happen and even change that to an outcome that will be better for our organization.

– All the data we needed was derived from business process transactions of some nature, typically keyed in by somebody at some point.  Today, machines generate the majority of the data in a trend we now call Big Data.

There are revolutionary changes everywhere.  And all of them interweave to create fabulous opportunities and interesting challenges.

  • Q. How can organizations today begin to reconcile the increased demand for information from a “business” perspective, and the need to effectively manage data on the “IT” front?

A. When I was at the Center for Information Systems Research at MIT over 25 years ago, my thesis advisor, Dr. Jack Rockart, taught me that business and IT needed to work in partnership, and that this partnership must evolve in accordance with new business and technology opportunities.  With the explosion of data today, his prescient guidance is more valuable than ever.

So the first step, the first principle so to speak, is for the business and IT to treat each other like partners with a shared mission, but diverse skills.    Then together they can align the priorities and funding needed for joint success.

The second step is for IT to recognize that the business side has more skills and tools to analyze data themselves.  So IT needs to become more of a data delivery provider to the business, focusing on data quality, security, reliability, and the like.  The analogy is an electric utility, with data flowing across the grid to the end users, who will use it in diverse and unpredictable ways.  This allows each side to optimize its own performance as well as the organization’s overall performance.

The third step is for the business and IT to jointly experiment on new business and technology initiatives as a way to pioneer and successfully exploit new opportunities.   We are seeing a lot of this kind of experimentation in the Big Data arena today.

  • Q. What advice can you offer to businesses trying to juggle the increased speed, volume, and variety of data that Big Data brings to the table?

A. Speed (velocity), volume, and variety seem to be the 3 V’s that everyone talks about when it comes to Big Data.    What happened to the most important V, Value?

So my primary advice about Big Data is “Get your business case in order.”  Once the value to the business is understood, juggling higher data velocity, volume and/or variety becomes an engineering problem.

I am the first to admit that there might be a new class of engineering problem, requiring new technologies or skills, but it is a fully solvable engineering problem nonetheless.

In other words, don’t get knocked off guard by the Big Data buzzwords.  Go back to business and technology basics, and you’ll be fine.

  • Q. What steps should organizations be taking to help prepare their IT departments for managing and processing Big Data?

A. For IT, Big Data is as much an organizational change challenge as a technology challenge.  Practical first steps that seem to work well include:

– Experimenting with a small “SWAT” team on a selected set of projects is a great way to introduce something new.

– Going for some quick and easy wins, rather than boiling the ocean with large-scale initiatives, is a proven technique for gaining momentum.

– Implementing a solution with revenue impact, such as a next-best-offer analytic to improve upsell performance or a predictive churn analytic that helps reduce customer defection, can ease business funding challenges and improve executive visibility and sponsorship (a minimal churn-scoring sketch follows below).
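To make that kind of revenue-impact analytic concrete, here is a minimal, purely illustrative churn-scoring sketch in Python. The feature names, account IDs, and training data are hypothetical, and this is not Composite’s product; it simply shows the general technique of ranking customers by predicted defection risk so retention offers can target the riskiest accounts first.

# Minimal, illustrative churn-scoring sketch (hypothetical features and
# accounts; not Composite's product): rank customers by predicted
# defection risk so retention offers can target the riskiest accounts.
from sklearn.linear_model import LogisticRegression

# Toy training data: [months_as_customer, support_tickets_last_quarter]
X_train = [[24, 0], [3, 5], [36, 1], [2, 7], [18, 2], [1, 6]]
y_train = [0, 1, 0, 1, 0, 1]  # 1 = churned, 0 = retained

model = LogisticRegression()
model.fit(X_train, y_train)

# Score current customers and rank them by predicted churn risk.
current = {"acct-101": [4, 4], "acct-102": [30, 0], "acct-103": [6, 3]}
risk = {acct: model.predict_proba([features])[0][1]
        for acct, features in current.items()}
for acct, p in sorted(risk.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{acct}: churn risk {p:.2f}")

In practice the input features would come from the integrated customer data discussed in the following answers, and the scores would feed a retention or next-best-offer workflow.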

  • Q. With so many new tools and technologies available today, enterprise data frequently ends up residing within many silos.  How is Composite Software working to help companies effectively integrate and analyze this data to increase overall business value?

A. Cloud sources, analytic appliances, Big Data stores, and more have resulted in a landscape of data processing silos.  The good news is that each of these silos optimizes its specialized function.  The bad news is that many other critical business functions – maximizing revenue, synchronizing a supply chain, accelerating new product development, managing risk, or meeting compliance requirements – require data from across these proliferating silos.

This is where Composite Software’s data virtualization offerings have proven valuable for our large enterprise customers, such as Comcast, Pfizer, and the New York Stock Exchange.  We have an agile data integration approach that easily leverages existing silos without generating even more silos, as organizations might have done with a data warehouse in the past.

By simplifying information access in this way, the business dramatically improves its information agility, and therefore its business agility.  For IT, it means greater flexibility to use whatever technology is optimal for each silo, while still providing the business with a common, “virtualized” place to get whatever data is required, whenever it is needed.  And for everyone, it means lower costs and greater business success.
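As a rough sketch of the data virtualization idea, the Python snippet below joins a simulated cloud CRM silo with a simulated warehouse silo through a single “virtual view.” The sources, tables, and column names are hypothetical, and a real data virtualization server would federate live systems and optimize the distributed query rather than copy rows into Python.

# Rough sketch of a "virtual view" over two data silos. The sources,
# tables, and columns are hypothetical; a real data virtualization
# server would federate live systems and optimize the distributed query
# instead of pulling rows into Python.
import sqlite3

def make_crm_silo():
    # Simulate a cloud CRM silo holding customer records.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT, region TEXT)")
    db.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                   [(1, "Acme Corp", "East"), (2, "Globex", "West")])
    return db

def make_warehouse_silo():
    # Simulate an on-premises warehouse silo holding order facts.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
    db.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, 1200.0), (1, 300.0), (2, 950.0)])
    return db

def revenue_by_customer(crm, warehouse):
    # The "virtual view": join data from both silos on demand,
    # without building yet another physical data store.
    customers = crm.execute("SELECT id, name FROM customers").fetchall()
    totals = dict(warehouse.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"))
    return [(name, totals.get(cid, 0.0)) for cid, name in customers]

crm, warehouse = make_crm_silo(), make_warehouse_silo()
for name, total in revenue_by_customer(crm, warehouse):
    print(f"{name}: {total:.2f}")

The point of the sketch is the shape of the solution: the consumer queries one logical view, while each silo keeps whatever technology is optimal for it.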

  • Q. In the process of analyzing data, the gathering of data frequently takes longer than it should. Is Composite Software currently offering any solutions to help organizations increase the speed with which they gather and compile their data?

A. It is very interesting to talk to data scientists and business analysts about what they actually do when performing a new analysis of some nature.

After they think a bit about the problem, the next thing they do is go after the data they think they might need.   This means determining what data is actually available.  Then they work with IT to get access to that data.  And finally they pull the data together into some form of a sandbox.  They do all of this data preparation work before they start building the analytic model, statistically analyzing the results, interpreting what the results mean for the business and communicating these insights.

The data scientists and business analysts will say they spend over half their time addressing these data related activities.  This means they spend less than half their time actually doing analysis!  Does that make any sense?

Composite is an expert at helping enterprises simplify and accelerate access to data.  Out of the box today, we have products for automatically introspecting data sources, discovering relationships, and then modeling them in friendly entity-relationship diagrams that are easy for analysts to understand.

Once the data is identified, our development studio simplifies the building of easy-to-understand views of the data.  Next, our powerful information server automatically optimizes queries against the required data sets.  And then, depending on the sandbox strategy (physical, virtual, or hybrid), our server can also manage these data sets.  All of this can be done in hours or days, rather than the weeks or months of the “old way” using ETL, data replication tools, and/or hand-coding.
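A hedged illustration of that introspect-then-publish workflow, assuming a single hypothetical SQLite source (real tooling does this graphically, across many heterogeneous sources, and pushes query optimization down to the engine):

# Hedged sketch of the introspect-and-publish workflow described above.
# The schema and names are hypothetical, and a data virtualization studio
# would do this graphically across many heterogeneous sources.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (order_id INTEGER, customer_id INTEGER, "
           "amount REAL, order_date TEXT)")

# Step 1: introspect the source to discover what data is available.
tables = [row[0] for row in
          db.execute("SELECT name FROM sqlite_master WHERE type='table'")]
for table in tables:
    columns = [col[1] for col in db.execute(f"PRAGMA table_info({table})")]
    print(f"{table}: {columns}")

# Step 2: publish an easy-to-understand view the analyst can query
# directly, leaving optimization of the underlying query to the engine.
db.execute("""
    CREATE VIEW monthly_revenue AS
    SELECT substr(order_date, 1, 7) AS month, SUM(amount) AS revenue
    FROM sales
    GROUP BY month
""")
db.execute("INSERT INTO sales VALUES (1, 1, 500.0, '2013-01-15')")
print(db.execute("SELECT * FROM monthly_revenue").fetchall())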

The result is a 2-10x acceleration of time-to-analytic results, which pays off handsomely when analyzing revenue optimization, risk management and/or compliance opportunities.

In addition, the data scientists and business analysts are not only more productive, they are much happier because they get to do more modeling and analyzing and less data chasing.  And happier analysts are easier to retain, a key issue given the shortage of analysts today.

Further, all of this works with Big Data, traditional enterprise data, external or cloud data, desktop data, and more.

Simple, yet powerful, and it works in any organization’s IT environment.  There is lots of value-add, and the users like it too.  I think any organization’s data scientists and business analysts will find Composite a great solution for their data challenges.

Robert Eve
Executive Vice President, Marketing
Composite Software, Inc.

Bob leads the marketing for Composite Software.  Bob’s experience includes executive-level marketing and business development roles at leading enterprise software companies such as Oracle, PeopleSoft, Mercury Interactive (Kintana), and Right Hemisphere, as well as management roles at Ernst & Young and Intel.  Bob is the co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.  Bob holds an MS degree in Management from the Massachusetts Institute of Technology and a BS degree in Business Administration from the University of California at Berkeley.
