
Back to The Roots: Information as a Strategic Asset

December 18, 2012

Featured Blog By Yves de Montcheuil, Vice President of Marketing, Talend

In the early days of Information Technology, IT wasn’t called IT. It was called Data Processing. Its purpose was to store data records (about customers, products, accounts, etc.) and to process these records. This was achieved by gigantic mainframe computers that were initially ingesting and spitting out punched cards, then spinning reels of magnetic tape, until magnetic disks and optical media became mainstream storage mechanisms. In these times, computers managed data. And all programs that were written to run on these computers had a specific purpose linked to the recording, retrieval, or manipulation of that data.

As Information Technology became more pervasive and strategic, systems evolved. Mainframes started to be supplanted by open systems — as Unix servers were called in the 1990s, because they were less closed than mainframes. The client-server revolution arrived, and soon everyone was building 2-tier, 3-tier, or even n-tier systems, with a data layer, an application layer, a presentation layer, and so on, using modern 3GL and 4GL languages that focused more on application logic than on data. The advent of the Web thin client and of service-oriented architectures completed the shift of IT's focus away from data and onto applications.

Why this shift of focus? For one thing, data was considered a given, and data management technologies were stable and well understood. The application layer, on the other hand, was becoming too complex to handle: too many competing languages and technologies, incompatible platforms, and little reusability. Applications needed governance, which was a major focus of IT in the first decade of the 21st century.

During this time, data came to be viewed as a by-product of applications: to run, an application needs a server, a runtime, and a place to store its data. Whether that data was core to the task the application performed (such as customer records for an order management application) or a true by-product (navigation logs, for example) was largely irrelevant.

Viewing data as a by-product of applications has created several issues. For one, governance of data was largely nonexistent, and was certainly not viewed as a critical activity. Some organizations have matured faster than others, through initiatives often driven by lines of business, but information governance remains the privilege of leading-edge organizations. In addition, because of this notion that data somehow "belonged" to applications, consistency of data across systems was not covered "by design". When it was implemented, it was often as an afterthought: business issues created by data inconsistencies prompted a patch-type response.

Data integration, born as ETL and originally devised to load data warehouses, quickly became a key element of this "patching" of consistency issues, simply because it was the most suitable technology at hand to deal with the problem at the time. By joining forces with the nascent domain of Master Data Management (MDM) at the end of the 2000s, it became the foundation of a new era of information governance: information as an asset.
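To make the ETL-as-patch idea concrete, here is a minimal sketch (all record shapes and field names are hypothetical, not drawn from any real system) of a batch job reconciling inconsistent customer records held by two application silos — the kind of after-the-fact consistency fix the paragraph above describes:

```python
def extract():
    """Pull overlapping customer data from two silos with different schemas.

    Hypothetical in-memory stand-ins for a CRM and a billing system.
    """
    crm = [{"cust_id": 1, "name": "ACME Corp.", "email": "sales@acme.example"}]
    billing = [{"customer": 1, "company": "Acme Corporation", "email": None}]
    return crm, billing

def transform(crm, billing):
    """Normalize field names and merge records, letting the CRM win conflicts."""
    merged = {}
    for rec in billing:
        merged[rec["customer"]] = {"id": rec["customer"],
                                   "name": rec["company"],
                                   "email": rec["email"]}
    for rec in crm:
        row = merged.setdefault(rec["cust_id"], {"id": rec["cust_id"]})
        row["name"] = rec["name"]                       # CRM is the trusted source
        row["email"] = rec["email"] or row.get("email")  # fill gaps from either side
    return list(merged.values())

def load(rows, warehouse):
    """Write reconciled rows into the consolidated store (here, a plain list)."""
    warehouse.extend(rows)

warehouse = []
crm, billing = extract()
load(transform(crm, billing), warehouse)
print(warehouse)  # one reconciled record per customer
```

The point is not the ten lines of merge logic but where the logic lives: consistency rules end up encoded in a pipeline bolted on after the fact, rather than designed into the systems that own the data — exactly the gap that MDM later set out to close.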

Information does not belong to applications, nor is it a by-product of applications. IT organizations manage a number of assets, including application assets and information assets. Both asset classes are strategic to the business, and should be managed and governed as such.

This new era of governance requires a shift in mentalities. Governance practices that have been successfully applied to application code or SOA services need to be created for information assets. Application architects need to learn to work closely with information architects. It's no longer a one-way street.

And after all, even if IT is no longer called Data Processing, it is still called Information Technology — which, I think, is very telling. Information has always been at the core. Even if that was concealed for a few years, the 2010s will see its return as a strategic asset.


Yves de Montcheuil is the Vice President of Marketing at Talend, the recognized leader in open source integration. Yves holds a master's degree in electrical engineering and computer science and has 20 years of experience in software product management, product marketing, and corporate marketing. He is also a presenter, author, blogger, and social media enthusiast, and can be followed on Twitter: @ydemontcheuil.


DATA and ANALYTICS, Fresh Ink
