Inside the Briefcase

Driving Better Outcomes through Workforce Analytics Webcast
Find out what’s really going on in your business...

Legacy Modernization: Look to the Cloud and Open Systems
On the surface, mainframe architecture seems relatively simple: A...

Still keeping your hybrid power systems indoors? It’s time for change.
Mobile telecommunications network equipment is expected to work without...

As the Network Changes, Engineers Are Embracing the DevOps Model
Businesses that have embraced digital transformation with a clear...

The 5 Most Common Application Bottlenecks
Application bottlenecks can lead an otherwise functional computer or...

Big Data Requires Big Storage

February 9, 2012

SOURCE: Bank Systems and Technology

Banks have long been built on massive amounts of data. But as “big data” gets even bigger, legacy storage systems are growing increasingly inefficient and even obsolete, according to industry experts.

In fact, financial institutions will have to completely rethink and recreate the way they store data to effectively deal with the crush of information they possess, says Barbara Murphy, CMO of Panasas, a Sunnyvale, Calif.-based data storage provider. “The challenge that the banking industry has in consolidating different data types is, can you take all the different file types and have them rest in a single location?” she notes. “You need the scalability that can handle that, which traditional systems don’t have. Infinite scale is now a requirement. There needs to be an entirely different architecture.”
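Murphy’s point about a different architecture — many file types resting in one namespace, with capacity that grows by adding nodes rather than replacing a monolithic array — can be illustrated with a toy scale-out sketch. All names here are hypothetical; this is not Panasas’s actual system, which is a commercial parallel file system:

```python
import hashlib

class ScaleOutStore:
    """Toy scale-out store: files of any type share one namespace,
    and capacity grows by adding nodes (illustrative only)."""

    def __init__(self, nodes):
        self.nodes = list(nodes)
        self.data = {node: {} for node in self.nodes}

    def _node_for(self, key):
        # Hash the key to pick a node; production systems use
        # consistent hashing or metadata servers to limit rebalancing.
        digest = hashlib.sha256(key.encode()).hexdigest()
        return self.nodes[int(digest, 16) % len(self.nodes)]

    def put(self, key, blob):
        self.data[self._node_for(key)][key] = blob

    def get(self, key):
        return self.data[self._node_for(key)][key]

    def add_node(self, node):
        # Scale out: add a node, then rehash existing keys
        # onto the larger cluster.
        self.nodes.append(node)
        self.data[node] = {}
        old = [(k, v) for d in self.data.values() for k, v in d.items()]
        for d in self.data.values():
            d.clear()
        for k, v in old:
            self.put(k, v)

store = ScaleOutStore(["node-a", "node-b"])
store.put("trades.csv", b"rows")   # structured and
store.put("scan.pdf", b"pages")    # unstructured data, one namespace
store.add_node("node-c")           # capacity grows without a forklift upgrade
assert store.get("trades.csv") == b"rows"
```

The contrast with a legacy array is the `add_node` call: growth is incremental, and every file stays reachable through the same interface regardless of type.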



