
Taking a Data First Approach to Real Time Applications

June 30, 2016

Featured article by Jack Norris, Senior Vice President, Data and Applications at MapR

Leading companies that are getting the most out of their data are not focusing on queries and data lakes; they are actively integrating analytics into their operations. Real-time adjustments to improve revenues, reduce costs, or mitigate risk depend on applications that minimize latency and draw on a variety of data sources. In fact, real time is at the foundation of many transformational applications.

In this interview, Jack Norris, senior vice president of data and applications at MapR, speaks with IT Briefcase about how a data-first approach simplifies and speeds the development of applications, resulting in real-time applications with significant impact. Jack takes a closer look at what real time really means and why it is required across the entire process.

How is a data-first approach different from traditional applications?

Traditionally, we’ve taken an “application first” approach: you start with the application and determine its data requirements. You then prepare the data into specialized schemas to serve that application. Each of these applications has its own dedicated silo, and the result is a proliferation of silos. In fact, the average company has hundreds of data silos throughout its organization; Gartner refers to this as the biggest challenge for data management in organizations. The promise of big data is to centralize this into a data lake and bring the processing to the data.

By focusing on real-time data streams, companies can transform the development, deployment, and future agility of applications.
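To make the contrast concrete, here is a minimal, purely illustrative sketch of the data-first pattern: events are written once to a shared, append-only stream, and each application reads from that stream rather than maintaining its own silo. The class, event fields, and rules below are hypothetical assumptions and are not MapR's API.

```python
# Minimal, illustrative sketch of the data-first pattern: events are written
# once to a shared, append-only stream, and each application reads from it
# instead of maintaining a private silo.
# All names and fields here are hypothetical; this is not MapR's API.

from dataclasses import dataclass, field
from typing import Any, Dict, List


@dataclass
class EventStream:
    """Append-only log shared by every downstream application."""
    _log: List[Dict[str, Any]] = field(default_factory=list)

    def publish(self, event: Dict[str, Any]) -> None:
        self._log.append(event)

    def read_from(self, offset: int = 0) -> List[Dict[str, Any]]:
        return self._log[offset:]


stream = EventStream()

# Producers write raw events once, with no application-specific schema work.
stream.publish({"type": "purchase", "amount": 42.0, "user": "u1"})
stream.publish({"type": "purchase", "amount": 999.0, "user": "u2"})

# Application 1: an operational fraud check reads the shared stream...
for event in stream.read_from():
    if event["amount"] > 500:
        print("flag for review:", event)

# Application 2: ...and a revenue dashboard reads the very same stream,
# with no second copy of the data and no dedicated extract pipeline.
revenue = sum(e["amount"] for e in stream.read_from() if e["type"] == "purchase")
print("revenue so far:", revenue)
```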

How do you define high-frequency decisioning?

The reality is that real time is required across the entire process: it begins at the time data is collected, and continues until the business action is taken. If we can compress that data-to-action time frame, it can form the foundation for some truly transformational applications.

Companies use high-frequency decisioning applications to make small, automated adjustments that increase revenues, reduce costs, and mitigate risks.
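As a rough illustration of that data-to-action loop, the sketch below scores each incoming event, takes an automated action, and measures the elapsed time per event. The scoring rule, threshold, and event fields are assumptions made purely for illustration.

```python
# A rough sketch of the data-to-action loop behind high-frequency decisioning:
# each event is scored and acted on immediately, and the per-event latency is
# measured. The scoring rule, threshold, and event fields are assumptions
# made purely for illustration.

import random
import time


def score(event: dict) -> float:
    """Stand-in for a model or rules engine; returns a risk score in [0, 1]."""
    return min(event["amount"] / 1000.0, 1.0)


def act(event: dict, risk: float) -> str:
    """Small, automated adjustment: approve, review, or decline."""
    if risk > 0.9:
        return "decline"
    return "review" if risk > 0.6 else "approve"


for _ in range(5):
    event = {"amount": random.uniform(1, 1200), "card": "xxxx"}
    start = time.perf_counter()          # data collected
    decision = act(event, score(event))  # business action taken
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{decision:8s} amount={event['amount']:7.2f} latency={elapsed_ms:.3f} ms")
```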

Can you provide examples of how customers are using high-frequency decisioning applications?

Sure, there are many examples across different industries. For example, one of our customers has developed an application that processes electronic medical records as a secure data stream, simplifying the deployment of real-time applications for hospitals, clinics, and insurance companies. American Express leverages big data to identify potential fraud whenever an American Express Card is used anywhere in the world. Their platform protects $1 trillion in charge volume every year, determining in less than 2 milliseconds whether a charge is fraudulent.

Another example is Altitude Digital, one of the fastest-growing video advertising platforms in the industry. With nearly seven billion transactions per day, Altitude Digital is able to select, in real time, the best video advertisement to play at the right time for the right person.

Is the complexity of big data environments starting to decrease?

Hadoop enables organizations to collect data into a centralized data lake. However, with the growing complexity of big data, we’re actually seeing the separation of data into specialized clusters: a cluster for ingest, a cluster for streaming analytics, a cluster for database operations, and another for deep analytics. We’re starting to recreate the same silo problem, only with different technologies.

In order to eliminate data silos and enable these real-time, transformational applications, you need to focus on two areas.

The first is a Converged Data Platform, which eliminates separate clusters and enables applications to benefit from all data. With silos eliminated, all of your data is available to your application for a wide variety of manipulations. Every piece of data can be considered a “first-class citizen”: structured, unstructured, data-in-motion, and data-at-rest.
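The following sketch is meant only to illustrate the idea of treating data-at-rest and data-in-motion as first-class citizens within a single application: a stream of events is enriched against a reference table loaded from a file, with no separate cluster or extra copy of the data. The table contents, field names, and in-memory join are assumptions, not the platform's actual interface.

```python
# Illustration only: one application enriching data-in-motion (incoming
# events) with data-at-rest (a reference table) in the same process, with
# no separate cluster or extra copy of the data. The table contents, field
# names, and in-memory join are assumptions, not a platform API.

import csv
import io

# Data-at-rest: a customer reference table (inlined here to stay self-contained).
REFERENCE_CSV = """customer_id,segment
u1,platinum
u2,standard
"""
segments = {row["customer_id"]: row["segment"]
            for row in csv.DictReader(io.StringIO(REFERENCE_CSV))}

# Data-in-motion: events arriving on a stream.
events = [
    {"customer_id": "u1", "amount": 120.0},
    {"customer_id": "u2", "amount": 860.0},
]

for event in events:
    enriched = {**event, "segment": segments.get(event["customer_id"], "unknown")}
    print(enriched)
```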

What’s the key to real-time data success?

The winners will not necessarily be the companies with the most data; the winners will be those companies that demonstrate the most data agility — the ability to generate the fastest and most appropriate response to changes in customer demand, competitive pressures, and market events.

When it comes to big data, it truly is time to get real.

Author Bio

Jack Norris, Senior Vice President, Data and Applications at MapR, drives understanding and adoption of new applications enabled by data convergence. With over 20 years of enterprise software marketing experience, Jack’s broad experience includes launching and establishing analytic, virtualization, and storage companies and leading marketing and business development for an early-stage cloud storage software provider. Jack has also held senior executive roles with EMC, Brio Technology, SQRIBE, and Bain and Company.
