
The Big Data Necessity: It’s Here to Stay and You Need to Understand

April 19, 2016

Featured article by Jeremy Sutter, Independent Author


One of the biggest challenges at NASA in the late 1960s was figuring out how to store all the telemetry from its spacecraft. Scientists and astronauts knew this information would be crucial not only to the current mission but to future missions. The problem was that they simply didn't have the storage capacity to capture it all.

Almost fifty years later, the technology to capture almost every digit of measurement, every transaction, transcripts of all speech, translations of those transcripts into 100 languages, plus sound and video of every location on Earth not only exists; it's ridiculously cheap and produces relatively high-quality records of pretty much everything that needs to be recorded.

Now, like those NASA scientists decades ago, we face a problem. We've got a Himalayan mountain range of data. How do we use it?

What is Big Data? 

It has been famously noted that every time someone parks a car in a downtown garage, five new entries are made in a database somewhere. Somewhere along the line, those entries will be incorporated into a report describing the overall statistics of that parking garage, if for no other reason than to figure out the taxes that need to be paid.

Now suppose some enterprising car parking company has garages in 40 cities and they want to run a global report for all of them at once. They’re going to have a lot of records to process for that report.

Now suppose they want to run that report for the last 12 years. That's a lot of records, twelve times over. Even with modern computers, that gets into some rather large datasets.
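To make that concrete, here is a minimal sketch in Python of the kind of counting job such a report implies. The record layout (city, entry_time) is an assumption invented for illustration; the article doesn't specify a schema. The point is only that the 40-city, 12-year report is the same aggregation run over vastly more records.

    # Minimal sketch of a cross-city, multi-year parking report.
    # The CSV schema (city, entry_time) is a hypothetical assumption.
    import csv
    from collections import Counter
    from datetime import datetime

    def yearly_counts(path):
        """Count parking events per (city, year) in a flat record file."""
        counts = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                year = datetime.fromisoformat(row["entry_time"]).year
                counts[(row["city"], year)] += 1
        return counts

    # The 40-city, 12-year version is the same loop over far more rows,
    # which is exactly where a single machine starts to strain.
    for (city, year), n in sorted(yearly_counts("parking_records.csv").items()):
        print(f"{city} {year}: {n} parking events")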

It is these larger datasets that have presented IT firms and small businesses with the need for technologies like topological data analysis, distributed file systems and integrated technology like Tableau on Hadoop.

These exponentially larger datasets are what “big data” encompasses: using modern processing capabilities to analyze many times the largest dataset traditionally available to an enterprise computing platform.

Why Big Data?

When such enormous volumes of data can be analyzed in a single context, certain kinds of trends become visible that would otherwise remain hidden. If it is possible to compare, for example, the pattern of parking garage usage 12 years ago to the pattern today, the results might reveal a way to save operating costs that would have been impossible to find otherwise.

An older system would have to be pointed at exactly the right comparison to produce the correct report. A big data system includes it by default, because in the context of the newer system and the larger dataset, the comparison is just part of the report.
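A toy illustration of that difference, using an in-memory SQLite table with made-up rows: once every year lives in the same dataset, the 12-years-ago-versus-today comparison is just one more grouped query, not a specially constructed report.

    # Toy illustration: with all years in one table, comparing 2004 to
    # 2016 falls out of an ordinary grouped report. The table name and
    # rows are invented for the example.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE parking_events (city TEXT, year INTEGER);
        INSERT INTO parking_events VALUES
            ('Boston', 2004), ('Boston', 2004), ('Boston', 2004),
            ('Boston', 2016), ('Boston', 2016);
    """)

    # One query yields every year's usage; any pair of years can then be
    # compared after the fact without building a new report.
    for city, year, events in conn.execute("""
        SELECT city, year, COUNT(*) FROM parking_events
        GROUP BY city, year ORDER BY year
    """):
        print(city, year, events)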

The theory of big data is that the sheer magnitude of information will contain patterns that can be identified, studied and utilized to produce better strategies.

What those strategies are and how they might be constructed remains to be seen. Big data is too new a concept for programmers, database administrators and analysts to have developed a consistent methodology for deriving patterns and reports yet, but that work is in progress.

When Data Alone Isn’t Enough

Having a big pile of data is interesting, and might even be an attraction on its own, kind of like the world’s largest ball of twine, or the Baker, California thermometer. It isn’t until the correct analytics and reporting technologies are brought to bear that big data becomes something useful to the enterprise.

Reporting is simply a way to convert database records into data. While that might sound circular or confusing, there is a difference between records in a database and data. The two terms should remain distinct because there isn't much that can be accomplished with the information contained in a single database record, except possibly making a phone call.

With a hundred or a thousand records, however, a business has data that can be tabulated, compiled and analyzed to produce reports telling it how many of those thousand people bought a warranty. That's useful data. The fact that one particular customer did not buy a warranty isn't.
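As a small worked example of that distinction (the record fields and the 30% rate here are invented for illustration), a thousand records reduce to a single figure a business can act on:

    # Records vs. data: one record says little; a thousand yield a rate.
    # Fields and the 30% warranty probability are assumptions.
    import random

    random.seed(1)
    records = [{"customer_id": i, "bought_warranty": random.random() < 0.3}
               for i in range(1000)]

    sold = sum(r["bought_warranty"] for r in records)
    print(f"{sold} of {len(records)} customers bought a warranty "
          f"({sold / len(records):.1%})")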

Programming frameworks like Hadoop, HPCC and the Quantcast File System, techniques like signal processing, data fusion and time series analysis, and commercial applications from companies like IBM, Oracle, Microsoft and Dell give developers tools such as distributed file systems for reaching the data they are trying to analyze. This is vitally important because without that enabling technology they can't reach the data, and without the data, you have big costs, not big data.
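As a rough sketch of what "reaching the data" can look like in practice, here is a report reading records off a Hadoop distributed file system through Spark's Python API. The hdfs:// path, column names and cluster setup are assumptions for illustration, not prescriptions from the article; running it requires pyspark and an actual Hadoop cluster.

    # Hedged sketch: scanning records stored in HDFS with PySpark.
    # The path and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("parking-report").getOrCreate()

    # The distributed file system is what puts the whole dataset in
    # reach; Spark spreads the scan across the cluster's workers.
    events = spark.read.csv("hdfs://namenode:8020/parking/events.csv",
                            header=True, inferSchema=True)

    (events.groupBy("city", "year")
           .agg(F.count("*").alias("events"))
           .orderBy("city", "year")
           .show())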

This technology is still developing, but it is doing so rapidly. Soon it will be possible for businesses of all sizes to utilize strategies that only ten years ago would have required an impractical level of expense and expertise.

 
