8. Unlock the Potential

Data is the currency of the new digital economy. The hallmark of data-driven success, however, lies in each organization’s ability to harness information effectively to improve business outcomes – from increasing customer satisfaction to reducing risk.

IT leaders are expecting big things from big data. Many view analytics as the technology that’s most likely to have an impact on their organizations over the next three to five years, according to the Computerworld Tech Forecast 2017 survey.

Which of these disruptive technologies is most likely to have an impact on your organization over the next three to five years?

  • AI/knowledge-based systems
  • Big data/analytics
  • Cloud/SaaS
  • Internet of Things
  • Mobile payments
  • Self-service IT

However, as CIOs steer their organizations toward data-driven enterprises, many are finding their efforts thwarted by inflexible IT infrastructure and legacy data warehouses. A growing “app-data gap” between applications and the data that business users need daily can inhibit productivity. The rising volume, velocity, and variety of data create challenges for companies to easily and cost-effectively scale storage and compute capacity.

Organizations need large quantities of both storage and compute capacity to fuel faster decision-making and react in real time to the ebbs and flows of business. They also need storage that’s smart enough to predict and prevent downtime and other issues that can impact the business. For example, HPE has demonstrated that its predictive all-flash solution automatically predicts and prevents 86% of storage-related problems. The goal is a data center that is self-healing, self-managing, and self-optimizing.

41% of enterprise CIOs expect to experience a shortage in data science/analytics skills over the next 12 months.

In data-heavy industries such as biomedicine, many organizations are accelerating the move to public cloud as they bump up against the limits of on-premises storage and processing capabilities.

“The power of genomics is a statistics game. You need big data sets,” says William Mayo, CIO at Broad Institute, a biomedical research organization. “I can’t do petabytes of data in a data center. That needs to start outside [in the public cloud] and stay there.”

The challenge becomes even more daunting as CIOs begin to integrate new data sources from Internet of Things (IoT) and other “edge” devices. Much of this data will stay where it’s created for practical reasons, such as data size, response time, and regulations regarding data privacy and sovereignty.

Therefore, as organizations use the public cloud to perform advanced analytics on massive amounts of data, they also must prepare for a future characterized by edge analytics, which utilizes machine learning to transform data wherever it resides and send insights, not just raw data, back to the core.
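
The "insights, not raw data" idea can be illustrated with a minimal sketch: an edge node reduces its raw readings to a small summary payload before anything crosses the network. The field names and threshold below are illustrative assumptions, not any vendor's schema.

```python
import statistics

def summarize_readings(readings, threshold):
    """Reduce raw edge sensor readings to a compact insight payload.

    Instead of shipping every raw reading back to the core, the edge
    node sends only summary statistics and a breach count. The fields
    and the alert threshold are purely illustrative.
    """
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        "breaches": sum(1 for r in readings if r > threshold),
    }

# The raw temperature samples stay on the device; only this small
# dictionary travels over the network to the core.
payload = summarize_readings([21.3, 22.1, 35.9, 21.8], threshold=30.0)
```

A day of sensor data stays where it was created, while the core still learns what it needs: how many readings arrived, their average, and how often a limit was breached.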

Real-time Insights with Advanced Data Analytics

A 3- to 4-year effort to modernize SunTrust Banks’ data infrastructure around a Hadoop-based data lake platform has helped the financial services organization streamline data management, improve data governance, and add more real-time operational data to its analytics capabilities, says Anil Cheriyan, SunTrust CIO.

“Analytics is a core part of what we do, whether it’s risk underwriting, fraud, or many other components,” says Cheriyan. Modernizing hundreds of traditional warehouses into a data lake has dramatically increased the speed at which the bank can process and make information relevant to people throughout the organization.

“Our digital platforms are now closely tied to our data lake, which helps us to know our clients better,” he explains. An integrated view of prior interactions, both online and in branches, “enables us to serve our clients in a better way,” he adds.

One-third of IT leaders believe big data/analytics are driving the biggest IT investments — but just 16% of LOB leaders agree.

Using Data and Analytics to Eliminate CIO Pain Points

At AmeriPride, CIO Steven John and his team made a decision two and a half years ago to gut the company’s existing information stack, replacing a data warehouse that John admits never had the vision or financial support needed to work effectively. In its place, the company implemented three core components: Tableau visualization software, Micro Focus’s Vertica distributed analytical database, and Informatica’s data management platform.

At a basic level, the simplified stack helped John’s team eliminate one significant pain point: the time required to produce daily operational reports. A process that once took 36 hours now runs in fewer than 15 minutes.

Streamlined reporting has enabled AmeriPride to expand its analytics effort across the business. John’s team has built 20 dashboards by which employees now live and lead, providing real-time visibility into everything from safety standards to fleet management.

“We have 60 plants in North America, and each is essentially a small business,” says John. “They need information on a daily basis that allows them to serve their customers and compare against our metrics to see if they’re winning or losing.”

The analytics infrastructure has become a foundational piece of AmeriPride’s broader vision to simplify, standardize, and automate across the business.

“We had a three-legged stool with two legs: good judgment and good leadership,” says John. “We added good data, which gives us a solid base to make better decisions. It has changed the entire dynamic of our company.”

“One of the things I particularly enjoy about the public hyper-scale platforms is they have great analytics, machine learning and AI that comes with them.”

– Alan Crawford
CIO, City and Guilds Group

Machine Learning Drives Insights

A flexible Hybrid IT infrastructure that can scale storage and computing capacity as needed opens the door to an influx of data that can overwhelm traditional data management methods, tools, and analytics teams. That’s where machine learning plays an increasingly important role. Organizations are increasingly evaluating solutions that utilize machine learning and predictive analytics for many infrastructure-related activities such as downtime prediction, prescriptive resolution, root-cause analysis, and even analytics-driven tech support.

“A lot of data can be a liability if you don’t know what to do with it,” says Jeff Wike, CTO at DreamWorks Animation, creators of blockbuster movie franchises such as "Shrek" and "How to Train Your Dragon". DreamWorks Animation certainly produces a lot of data: A single 90-minute, computer-generated animated feature film, at 24 frames per second, comprises 130,000 individual frames – approximately 500 million digital files. The studio’s image-rendering operation processes up to 112,000 transactions per second during image creation while collecting close to 1 million artifacts about that information daily, according to Wike.

As part of a broader transformation of its data center and data management architecture, DreamWorks has optimized its rendering operations with advanced analytics. The approach helps determine how long it takes to make a particular image and exactly what resources are needed for specific tasks based on past performance.

Automating this analysis is a big step forward from manually examining log files in search of patterns. “Guessing was not very reliable,” Wike says. “By actually capturing those artifacts and predicting their patterns through machine learning, when a similar job comes in we know precisely how many resources to devote to it.” The system has improved rendering performance by about 15%.
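
The match-against-past-jobs approach Wike describes can be sketched as a nearest-neighbor lookup over captured artifacts. The features below (frame count, a complexity score) and the helper names are hypothetical stand-ins, not DreamWorks' actual schema or method.

```python
def predict_resources(history, job):
    """Estimate resources for a new render job by finding the most
    similar past job among the captured artifacts.

    `history` holds (features, cores_used) pairs. In practice the
    features would be normalized and far richer; this is a sketch.
    """
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    _, cores = min(history, key=lambda rec: distance(rec[0], job))
    return cores

history = [
    ((1000, 0.2), 64),    # small, simple scene
    ((3000, 0.5), 128),
    ((5000, 0.8), 256),   # large, complex scene
]
estimate = predict_resources(history, (4800, 0.7))  # nearest: the 5000-frame job
```

The point is the workflow, not the model: because past jobs are captured with their outcomes, a new job's resource needs become a lookup rather than a guess.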

The analytics tools also use anomaly-detection methods to help identify problems with rendering processes. “Anomalies can throw a wrench in your plans to complete a project,” Wike says. “The ability to detect problems as they are happening — or before they start — and triage them has a huge impact.”
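
A minimal form of such anomaly detection is a z-score test against a historical baseline: flag any metric sample that drifts several standard deviations from the norm. This simple rule stands in for whatever method the studio actually uses; the numbers are illustrative.

```python
import statistics

def is_anomalous(baseline, value, z_threshold=3.0):
    """Flag a metric sample (e.g., minutes per rendered frame) that
    deviates sharply from its historical baseline.

    A plain z-score test is the simplest possible detector; the
    threshold of 3 standard deviations is an assumed default.
    """
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return abs(value - mu) > z_threshold * sigma

baseline = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]  # recent per-frame render times
is_anomalous(baseline, 10.1)   # in line with the baseline
is_anomalous(baseline, 18.5)   # flagged for triage
```

Run continuously against live metrics, even a detector this simple surfaces a stalled or misbehaving render before it derails a production deadline.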

“That’s really the biggest advantage. It saves an enormous amount of artists’ time and engineering resources trying to solve problems when we’re in the heat of production,” he says. “That’s priceless.”

Machine learning also plays a growing role in SunTrust’s efforts to protect its customers against fraudulent activities.

“There are lots of active bad guys out there who are trying to leverage financial institutions to steal money,” says Cheriyan. “More modern, real-time analytics tools that leverage machine learning enable us to connect a lot of core analytics into our business processes and core operational systems.”

Connecting risk management systems with transactional systems lets SunTrust more quickly flag potentially inappropriate behavior and take action proactively. “The speed at which we can process and make information relevant is where the real benefits are coming from,” he says.
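
The architectural point, scoring risk inside the transactional path rather than in a nightly batch, can be sketched as follows. The scoring rules, field names, and threshold are invented for illustration; a real bank would call a trained model.

```python
def score_risk(txn):
    """Toy risk score. A production system would invoke a trained
    machine-learning model; these two rules are purely illustrative."""
    score = 0.0
    if txn["amount"] > 10_000:
        score += 0.5
    if txn["country"] != txn["home_country"]:
        score += 0.4
    return score

def process_transaction(txn, flag_threshold=0.7):
    """Score the transaction inside the transactional path itself, so
    suspect activity is flagged before settlement rather than hours
    later in a batch job."""
    return "FLAGGED" if score_risk(txn) >= flag_threshold else "APPROVED"

txn = {"amount": 12_500, "country": "RO", "home_country": "US"}
status = process_transaction(txn)  # both rules fire: 0.9 >= 0.7
```

The design choice is where the scoring happens: embedding it in the transaction flow is what turns analytics from after-the-fact reporting into proactive intervention.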

Key Play

8. The 3 Cs

Organizations are using edge computing to create smarter buildings, cities, work spaces, retail experiences, factory floors, and more. Here’s how to capture deeper insights from the “Intelligent Edge.”

  1. Network or direct connect the devices and things at the edge: cars, tools, gadgets, appliances, people, power grids, robots, street lights, pumps, buildings, etc.

  2. Configure, actuate, or orchestrate the things and equipment at the edge.

  3. Analyze data from the edge to reveal new business, engineering, or scientific insights.

Take a Deeper Dive into Hybrid IT

The Strategic CIO’s Playbook

Create a game plan for accelerating digital transformation with the right mix of Hybrid IT.

Technology Insights

Sign up for enterprise.nxt to get IT news, insights and resources delivered to your inbox.