Data In The Era Of Disruption

The word “disruption” has become a cliché in the buzzword-laden world of Silicon Valley. Buzzword or not, the reality — move fast or die — is something companies are wrestling with every day. Businesses must innovate through new experiences, business models and services that drive value for today’s consumers. After all, 15 years ago, would you have thought that a social media giant would be working to “transform the global economy” by championing a new digital currency?

Threats Are Coming From All Sides

It’s often cited that half of the companies on the Fortune 500 list in 2000 have since disappeared from it through mergers, acquisitions or bankruptcy. Twenty years ago, Apple was a computer company. Amazon sold books. They didn’t survive by getting better at selling computers and books; they survived by relentlessly reinventing themselves to become media companies, cloud providers, grocery stores and more.

Prior to the digital era, we used to say, “The big eat the small,” where larger companies would snap up smaller companies to add to their portfolio. Today, it’s “The fast beat the slow.” Whoever can move and innovate more quickly builds sustainable advantages in these new industries. Speed, not size, determines the winners.

You may not realize it, but Piggly Wiggly was an innovator that forever changed the world of grocery shopping. In an era when customers had to present their orders to a clerk, Piggly Wiggly changed the game by pioneering a self-service model and introducing the notion of check-out clerks, shopping baskets, price tags and refrigerated cases, ultimately refashioning the way people shopped.

For all the work the Safeways and Krogers of the world do to optimize their shopper experience, at the end of the day, their job is to put food in the homes of their customers. What happens when those customers stop coming to the store to fulfill that need, turning instead to Instacart, HelloFresh and UberEats? Big grocers are forced to disrupt themselves and rethink how consumers want to get food.

Data Is the Key

In today’s digital era, software has eaten the world. Whether you’re building a new mobile app or web service or making changes to the supply and distribution chains, innovation is delivered through software. But companies are finding that data is an increasingly critical part of these new digital experiences. To stay current in today’s world of software, businesses must master the ability to get relevant data to the people who need it, quickly and securely.

Automation has long been critical to helping organizations move faster at a lower cost. Many of the world’s leading companies have been successful in embracing the cloud, deploying continuous integration and delivery tool chains and adopting new DevOps models that tighten the feedback between development and operations. Teams can spin up and tear down servers in minutes instead of weeks or months. This leads to wholesale reinvention of their software development life cycle, which enables them to deliver innovation orders of magnitude faster than before.

But many enterprise data teams struggle to provision a new data environment in less than a day. According to a 451 Research report commissioned by Delphix, 47% of global enterprises say it takes four to five days to provision a new data environment. When your infrastructure and your systems development life cycle (SDLC) tooling are fast, the data gap becomes painfully visible.

Imagine you’re trying to do continuous testing: your server is ready to go and you’re about to run your tests. Then you stop and realize you need fresh data. At that point, you’ll likely have to file an IT request with the database and storage administrators, and suddenly, five days have passed since you put the project in motion.

If you’re doing weekly sprints and it takes two weeks to provision data, are you really doing a weekly sprint? It’s not CI/CD (continuous integration and continuous delivery) if there’s suddenly a five-day delay in the middle of your workflow. Maybe you try to work around it by turning to synthetic data, but the less realistic your data, the lower the quality of your testing, and the slower you move as a result. You spend more time debugging data-related defects later in the development cycle instead of shifting that testing left so developers and testers catch those issues while they’re quick and easy to fix.
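The cadence math above is worth making concrete. Here is a minimal back-of-the-envelope sketch (the five-day data wait comes from the report cited above; the other numbers are illustrative assumptions):

```python
# Back-of-the-envelope sprint math: how much of a sprint survives
# a multi-day wait for a fresh data environment?

SPRINT_DAYS = 5                # one working week
SERVER_PROVISION_DAYS = 0.01   # minutes, thanks to cloud automation
DATA_PROVISION_DAYS = 5        # the five-day figure cited above

def days_left_for_testing(sprint_days, env_wait, data_wait):
    """Working days remaining after waiting on environments and data."""
    return max(0, sprint_days - env_wait - data_wait)

# With fast infrastructure but slow data, the whole sprint is consumed:
print(days_left_for_testing(SPRINT_DAYS, SERVER_PROVISION_DAYS, DATA_PROVISION_DAYS))

# With self-service data delivered in, say, an hour, nearly the
# full sprint remains for actual testing:
print(days_left_for_testing(SPRINT_DAYS, SERVER_PROVISION_DAYS, 0.125))
```

However you pick the numbers, the pattern is the same: once servers arrive in minutes, the data wait becomes the term that zeroes out the sprint.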

Getting the Most out of Your Data

Today, ignoring digital disruption is not an option. Any incremental improvement you make will be wiped out by a competitor who creates a new offering or an entirely new market that turns your company into an also-ran. The truth is that you’re only able to innovate in today’s market when you can unlock the data you need. In a world where every company is a data company, teams must be able to:

• Manage, change and collaborate with data like code.

• Make data available in a self-service fashion.

• Become data source agnostic to work with any data, anywhere.

• Intelligently anonymize data for privacy and compliance.
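To give the last capability some shape, here is a minimal sketch of one common anonymization technique, deterministic pseudonymization via keyed hashing. This is an illustrative assumption, not Delphix’s implementation; all names, fields and the key below are hypothetical. The point is that masking the same value to the same token everywhere preserves referential integrity across tables, so test data stays realistic without exposing real identities:

```python
import hmac
import hashlib

# Hypothetical masking key -- in practice this would live in a
# secret store, never in source control.
MASKING_KEY = b"rotate-me-outside-source-control"

def pseudonymize(value: str, key: bytes = MASKING_KEY) -> str:
    """Return a stable, irreversible token for a sensitive value."""
    return hmac.new(key, value.encode(), hashlib.sha256).hexdigest()[:12]

# Two related tables sharing a customer key:
customers = [{"id": "C-1001", "email": "jane@example.com"}]
orders = [{"customer_id": "C-1001", "total": 42.50}]

masked_customers = [
    {"id": pseudonymize(c["id"]),
     "email": pseudonymize(c["email"]) + "@masked.test"}
    for c in customers
]
masked_orders = [
    {"customer_id": pseudonymize(o["customer_id"]), "total": o["total"]}
    for o in orders
]

# The join key still lines up after masking, so tests that join
# customers to orders behave exactly as they would on real data.
assert masked_customers[0]["id"] == masked_orders[0]["customer_id"]
```

Keyed (HMAC) hashing rather than a plain hash is the usual choice here: without the key, tokens cannot be reversed by brute-forcing known identifiers.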

This is exactly the function of DataOps, a new discipline that aims to do for data what DevOps did for software development and deployment. A successful DataOps practice is one that addresses the full breadth of data-driven initiatives in the modern enterprise, requiring a robust toolbox of capabilities across data security, self-service data access and control, speedy data delivery and the ability to work in a complex IT environment.

Enterprises that master DataOps will be able to unlock new insights, bring new digital services to market faster and build better customer experiences so they can avoid being disrupted and win in today’s market.

*** This is a Security Bloggers Network syndicated blog from Resources - Blog authored by Delphix. Read the original post at: