Cutting Your Cloud Database Costs

October 17, 2022

Article by Vinay Samuel, founder and CEO, Zetaris

Cloud data storage has proven to be a boon to corporate enterprises. Cloud computing offers elastic data storage virtually on-demand, and it saves on-premise hardware and software costs. It also makes it easier to provide global data access without a dedicated network infrastructure. Hosted data storage also simplifies systems backup and data collaboration. Despite its popularity and the growth of software-as-a-service (SaaS) adoption, cloud data storage carries its own expenses, which can dramatically impact the total cost of ownership (TCO).

Cloud data storage isn’t free. Storage can cost up to $0.20 per gigabyte, and downloading data can cost $0.12 per gigabyte or more. When you start archiving petabytes of data, those costs add up. Much of the expense comes from data egress: downloading data for on-site processing. Gartner expects public cloud spending to reach nearly $500 billion in 2022, up 20.4% year over year. More than half of all corporate data is stored in the cloud, and cloud services will continue to see up to 30% growth as more organizations embrace cloud computing.

Much of the cost of cloud data services comes from data egress and analytical workloads. As the amount of stored data grows, moving terabytes of data for analysis becomes less cost-effective. Caching frequently used queries improves analytics performance by reducing access time, eliminating the need to hit the slower, underlying storage layer. Querying data where it is stored, rather than downloading it for processing, is faster, more efficient, and provides real-time insights.

The Cost of Migrating Cloud Data

Data egress, moving data out of the cloud, is one of the highest hidden costs of cloud data storage. Most cloud providers offer free data ingress, i.e., no cost to upload data for storage. However, they charge network fees to move data out of the cloud. Fees can range from $0.05 to $0.20 per gigabyte every time you move data out of storage for processing. For a business moving terabytes of data each month, the fees add up quickly.

For example, at $0.10 per gigabyte, transferring 1 TB of data costs $100 per transfer. If you transfer that amount of data 10 times daily, you will be paying roughly $30,000 in monthly fees.
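As a rough sketch of that arithmetic (the per-gigabyte rate is illustrative; real rates vary by provider and volume tier):

```python
# Back-of-the-envelope egress cost estimate. The default $0.10/GB rate
# is illustrative, not any provider's published price.
def monthly_egress_cost(tb_per_transfer, transfers_per_day,
                        rate_per_gb=0.10, days=30):
    """Return the estimated monthly egress bill in dollars."""
    gb = tb_per_transfer * 1000  # decimal gigabytes, as providers bill
    return gb * rate_per_gb * transfers_per_day * days

# 1 TB moved 10 times a day at $0.10/GB:
print(monthly_egress_cost(1, 10))  # 30000.0
```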

Numerous factors dictate the cost of data transfer:

– The amount of data being transferred

– Whether data is transferred within or across regions

– Whether data is transferred within or outside the availability zone

– Whether the data is transferred within the same cloud or to another provider
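The factors above can be modeled as a simple rate lookup. The rates below are placeholders for illustration, not any provider's published prices:

```python
# Illustrative egress rates by transfer path. Real pricing varies by
# provider and is usually tiered by volume; these are placeholders.
RATES_PER_GB = {
    "same_zone": 0.00,      # often free within an availability zone
    "cross_zone": 0.01,
    "cross_region": 0.05,
    "other_provider": 0.09, # out to another cloud or on-premises
}

def transfer_cost(gb: float, path: str) -> float:
    """Cost in dollars for moving `gb` gigabytes along a transfer path."""
    return gb * RATES_PER_GB[path]

print(transfer_cost(500, "cross_region"))  # 25.0
```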

The Cost of Analytics Workloads

Analytics is a vital business component, revealing insights about operations, inventory, revenue, customer retention, etc. Unfortunately, analytics workloads require a lot of computing, data storage, and memory resources, which are reflected as line items on your monthly cloud services bill. They also can add to network bandwidth costs.

The challenge is that cloud services use a brute-force approach to performance problems. Rather than striving for efficiency, the cloud adds elastic resources on demand, which increases customer costs.

Data specialists try to cut costs by optimizing queries, but complex queries require several computing steps. For example, consider the requirements for a query such as identifying all male customers between ages 25 and 45 who have spent more than $100 on goods in the first quarter. The query structure and the sequence of steps dictate the number of input-output operations, the CPU path length, the amount of data buffer space needed, and the types of data to be joined.
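To make the example concrete, here is that query run against a tiny in-memory table; the schema and column names are assumptions for illustration:

```python
import sqlite3

# The query described in the text, run against a small in-memory table.
# Table and column names (customers, gender, age, q1_spend) are assumed.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE customers (
    name TEXT, gender TEXT, age INTEGER, q1_spend REAL)""")
con.executemany("INSERT INTO customers VALUES (?, ?, ?, ?)", [
    ("Ann",  "F", 30, 250.0),
    ("Bob",  "M", 35, 180.0),
    ("Carl", "M", 52, 300.0),
    ("Dee",  "F", 28,  90.0),
])

# Male customers aged 25-45 who spent more than $100 in Q1:
rows = con.execute("""
    SELECT name FROM customers
    WHERE gender = 'M' AND age BETWEEN 25 AND 45 AND q1_spend > 100
""").fetchall()
print(rows)  # [('Bob',)]
```

Even this simple query involves a scan or index lookup, two range predicates, and a comparison per row; the optimizer's plan determines how much I/O and CPU each step consumes.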

Query optimization requires creating the most elegant and efficient plan for a specific query. Optimization also requires sophisticated algorithms and a skilled database administrator. The process is very labor-intensive and particular to the database.

Eliminating the Need to Move Data

As the amount of data increases and the enterprise infrastructure becomes more complex, the cost of data egress and analytics workloads continues to rise. To cut costs, you must stop moving terabytes of data around for processing.

Zetaris: The Networked Data Platform addresses the problem with an Intelligent Adaptive Cache. Rather than moving data, Zetaris delivers speed and scalability using virtualization over metadata, which enables views of any data combination without actually moving the data.

Zetaris ingests metadata from data sources located throughout the enterprise to construct a virtual schema. Ingesting metadata from multiple sources creates a Virtual Data Warehouse (VDW). The schema uses the metadata to query the data sources as if they were physically connected in a data warehouse. The system minimizes traffic between data sources since the data itself doesn’t have to be moved and joined for collective interrogation.
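To picture the general idea, here is a generic federation sketch (not Zetaris's actual engine): each filter is pushed down to the source where the data lives, so only the reduced result sets move and join:

```python
import sqlite3

# Generic federated-query sketch: two independent "sources", queried in
# place, with only the filtered rows combined centrally.
sales = sqlite3.connect(":memory:")
sales.execute("CREATE TABLE orders (cust_id INTEGER, total REAL)")
sales.executemany("INSERT INTO orders VALUES (?, ?)",
                  [(1, 120.0), (2, 40.0), (3, 300.0)])

crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (cust_id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Carl")])

# Push the filter down to the sales source; only matching IDs move.
ids = [row[0] for row in sales.execute(
    "SELECT cust_id FROM orders WHERE total > 100")]

# Fetch only the needed customer rows from the second source, then join.
placeholders = ",".join("?" * len(ids))
names = crm.execute(
    f"SELECT name FROM customers WHERE cust_id IN ({placeholders})",
    ids).fetchall()
print(sorted(n for (n,) in names))  # ['Ann', 'Carl']
```

The point of the sketch: neither table is copied in full; each source answers its own predicate, and only a handful of rows cross the network.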

To reduce the overhead from analytics workloads, Zetaris optimizes queries beyond the interrogation of a single data source to simultaneously query multiple databases registered in the VDW. Every query is optimized at run time, and the process is both dynamic and automatic, improving performance and ensuring access to real-time data.

A user-controlled cache further reduces the drain on resources. Frequently run complex queries can be cached, minimizing network and memory overhead, increasing speed, and reducing cloud costs.
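A minimal sketch of result caching, using Python's `functools.lru_cache` as a stand-in; the `run_query` function and its cost are hypothetical:

```python
from functools import lru_cache
import time

# Sketch of a user-controlled result cache: the first run pays the full
# query cost; repeats are served from memory. `run_query` is hypothetical.
@lru_cache(maxsize=32)
def run_query(sql: str):
    time.sleep(0.1)           # stand-in for an expensive federated query
    return ("row1", "row2")   # placeholder result set

start = time.perf_counter()
run_query("SELECT * FROM big_table")   # cold: hits the sources
cold = time.perf_counter() - start

start = time.perf_counter()
run_query("SELECT * FROM big_table")   # warm: served from the cache
warm = time.perf_counter() - start
print(warm < cold)  # True
```

A real system would also need cache invalidation when the underlying data changes, which is why the text describes the cache as user-controlled.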

Research shows that many companies abandon the cloud after seeing the cost of hosted data warehouses rise. Using the Networked Data Platform to create a “drill through” path to cloud data dramatically reduces the load on the cloud database CPU. As a result, organizations can see savings of a third or more of their cloud data warehouse costs.
