
1.8 Trillion Gigabytes of Data, and Growing

September 19, 2012

SOURCE: PCMAG.COM

There’s a common bond among all technology users: the accumulation of data. Recent data from analytics firm Infobright provides some astonishing numbers about the amount of data every user collectively stores and details common issues that businesses, even the SMB, must tackle when data starts to snowball.

For example, did you know that if all digital data were broken into individual bits of information, there would be more data particles than stars in the physical universe? Or that we currently store 1.8 trillion GB of data across 500 quadrillion files?

There’s no stopping this data agglomeration, and many businesses and IT departments are faced with the daunting task of analyzing this information. The analysis of business data is crucial for running a successful business, and analytics can range from simply being able to accurately calculate profit and loss to having a good grasp on the demographic information of your most loyal customer base.

The challenge many are facing is how to tame and analyze that data as it grows at an astounding rate; left unmanaged, it becomes complicated to administer and can cause performance issues on the network.

According to Infobright, when it comes to dealing with the complexities and performance issues of analyzing big data, most IT managers tend to:

  • Tune and upgrade databases, which can help but can also increase administration costs and licensing fees.
  • Upgrade hardware processing capabilities, which often increases overall Total Cost of Ownership.
  • Expand storage systems, which again raises costs.
  • Archive old data, which is particularly problematic because it reduces the amount of data available to analyze at any one time and can lead to less accurate analytical reports.
  • Upgrade network infrastructure, which increases both cost and network complexity.

Managing growing data in a business environment while keeping cost and complexity low is a problem many businesses are trying to tackle. Infobright and many other data-analysis vendors suggest the answer lies not so much in adding hardware as in storing data differently, such as in columnar databases rather than traditional row-based databases. Because a query against a columnar database reads only the columns it needs rather than entire rows, analytics and searches can be performed much faster than with row databases.
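
The toy Python sketch below is a minimal illustration of that difference, not any vendor's actual engine: the same small sales table is laid out once as whole records (row-oriented) and once as per-field arrays (column-oriented), and an aggregate over a single column needs to touch only two of the four fields in the columnar layout. The table contents and field names are invented for the example.

    # Row-oriented layout: each record keeps every field together,
    # so any scan reads whole records.
    rows = [
        {"order_id": 1, "customer": "Acme", "region": "East", "amount": 120.0},
        {"order_id": 2, "customer": "Globex", "region": "West", "amount": 75.5},
        {"order_id": 3, "customer": "Initech", "region": "East", "amount": 310.0},
    ]

    # Column-oriented layout: each field is stored as its own contiguous array.
    columns = {
        "order_id": [1, 2, 3],
        "customer": ["Acme", "Globex", "Initech"],
        "region": ["East", "West", "East"],
        "amount": [120.0, 75.5, 310.0],
    }

    # Analytic query: total sales amount per region.
    # Row store: every full record is read, including fields the query ignores.
    totals_from_rows = {}
    for record in rows:
        region = record["region"]
        totals_from_rows[region] = totals_from_rows.get(region, 0.0) + record["amount"]

    # Column store: only the "region" and "amount" arrays are touched.
    totals_from_columns = {}
    for region, amount in zip(columns["region"], columns["amount"]):
        totals_from_columns[region] = totals_from_columns.get(region, 0.0) + amount

    assert totals_from_rows == totals_from_columns == {"East": 430.0, "West": 75.5}

A real column store adds compression on top of this layout; because values of the same type sit next to each other within a column, the amount of data actually read per query shrinks even further.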

Large data sets are posing new data management issues, even in small businesses. IT and data owners must find creative and cost-effective ways for their companies to work with large amounts of data.

