Data Privacy Regulations’ Implications on AI

Investment in artificial intelligence (AI) is growing, with 60% of adopters raising their budgets 50% year over year, according to Constellation Research. But working with AI under emerging privacy standards is complex, requiring businesses to balance continued innovation against regulatory requirements. Under privacy regulations, businesses are responsible for gaining consent to use personal data and for being able to explain what they are doing with that data. There is a real concern that black-box automation systems that offer no explanations and require the long-term storage of large customer data sets will simply not be permitted under these regulations.

Data regulations often carry a negative connotation for companies, but with AI, they can have the opposite effect. Despite the concerns, companies of all sizes are improving their predictive analytics, machine learning and AI capabilities because of enforced data regulations. Data privacy creates genuine opportunities for better AI and better outcomes from all the hard work that goes into building these systems.

Implications

Everyone wants to automate, but doing this well depends on having quality data, and that raises the question of whether CIOs are getting the full ROI out of their investments in these technologies. Many people talk about how to fix the “garbage in, garbage out” scenario, but few are actually making moves to curb it. That is, until data privacy regulations such as GDPR were introduced.

Privacy regulations help companies drive a data quality program to leverage more advanced technology. For most companies, data quality is the one thing that they never get around to. Regulations force them to make it a priority, helping them to better understand their data. CIOs are taking advantage of this opportunity to better map and find quality data that will help build their AI programs.

Some IT leaders fear that this effort will be all for nothing as more regulations build and shift in the U.S. and across the globe. However, AI likely will not become regulated under privacy legislation because it will always go back to the foundation of data. Whether you’re collecting it through voice assistants or outbound marketing, the regulations are meant to protect what you collect, not what you build. In fact, AI programs are being built to help address compliance and data sensitivity issues, taking advantage of this new opportunity to help overcome a massive challenge.

Outcomes

As AI becomes more advanced on the strength of the data available to it, machines are making more and bigger decisions, which makes data quality more important than ever. For example, just a few years ago, people would have used a financial adviser to help make investment decisions. Now, they can simply log into an app or web-based tool, such as Betterment, which uses automation to make investment decisions in 15 minutes, all while taking into consideration things such as tolerance for risk, financial goals and timeline. But Betterment couldn’t make these big decisions effectively—or take over for a human financial adviser—without quality data. And for the many companies trying to create tools like Betterment, complying with privacy regulations will be the turning point to get it right.

Data governance and data management are key components of any AI program, and privacy regulations enforce that. There are certain things you have to know about your data—such as who has access to it, what they’re doing with the data and how clean and trustworthy it is—to be compliant. This forces companies to be more diligent and thoughtful about their data management, which, in turn, creates positive outcomes for AI, as companies put more good data to use, creating higher quality outcomes.
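As a minimal sketch of what those governance checks can look like in practice, the snippet below screens customer records for the properties the paragraph above names—completeness and recorded consent—before they are handed to an AI pipeline. The field names and rules are illustrative assumptions, not any specific company's schema or a prescribed GDPR procedure.

```python
# Illustrative data-quality gate for a privacy-driven governance program.
# Field names ("customer_id", "email", "consent_given") are hypothetical.

REQUIRED_FIELDS = {"customer_id", "email", "consent_given"}

def quality_report(records):
    """Count records that fail basic completeness and consent checks."""
    report = {"missing_fields": 0, "no_consent": 0, "clean": 0}
    for rec in records:
        if not REQUIRED_FIELDS.issubset(rec):   # incomplete record
            report["missing_fields"] += 1
        elif not rec.get("consent_given"):      # consent never captured
            report["no_consent"] += 1
        else:
            report["clean"] += 1
    return report

records = [
    {"customer_id": 1, "email": "a@example.com", "consent_given": True},
    {"customer_id": 2, "email": "b@example.com", "consent_given": False},
    {"customer_id": 3},  # missing email and consent fields
]
print(quality_report(records))  # {'missing_fields': 1, 'no_consent': 1, 'clean': 1}
```

In a real program, only the “clean” records would flow into model training, which is exactly the mechanism by which compliance work raises the quality of AI inputs.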

GDPR has been the catalyst for smoothing out data quality issues and seeing better AI outputs. While the regulation is only a little more than a year old, we can expect more success stories of privacy compliance powering next-gen technology use cases to crop up over the next few years.

The future of AI is critically dependent on data quality, and for many companies, data quality is dependent on complying with privacy regulations. While most companies see compliance as overhead, a tax or extra work, it’s actually the beginning of next-gen technology programs. It’s important not to underestimate the long-term positive impact of compliance. You may get more value out of it than you think.

Richa Dhanda

Richa is responsible for global product and solutions marketing at Talend. She has extensive experience in driving business growth, increasing customer adoption and retention, and establishing and communicating market leadership with modern technology marketing initiatives. She joined Talend from Dell EMC where she led product marketing for a multi-billion dollar product portfolio selling data protection products. Before that, Richa held a series of roles at leading technology companies including Nutanix, VMware and IBM.