
Debbie "The Data Diva" Reynolds: Data Privacy 101 for Data Teams

EPISODE 2


Meet the Guest

Debbie "The Data Diva" Reynolds

Debbie Reynolds is the Founder, CEO, and Chief Data Privacy Officer of Debbie Reynolds Consulting LLC. Known as “The Data Diva,” she is a world-renowned technologist, thought leader, and advisor working at the intersection of Data Privacy, Technology, and Law. Ms. Reynolds is an internationally published author, highly sought-after speaker, and a top media presence on Global Data Privacy, Data Protection, and Emerging Technology issues.

Watch the Episode

Meet Your Host

Alice Zhao

Alice Zhao is a data science instructor with Maven, an author, and is best known for her YouTube tutorials, which have generated over 1.5 million views!

Top 3 Insights from Debbie Reynolds

  1. Be great with data, but don’t trample on the rights of humans

  2. Companies should be intentional about the data they keep and its business purposes

  3. Everyone needs to be educated on the data risks that their companies have

——————

READ THE TRANSCRIPT

What is data privacy?

Data privacy is about protecting the rights of humans. Humans accumulate and put out data. I work with companies to help them use data in ways that protect the rights of humans, which differ depending on what jurisdiction you’re in, what you’re doing with the data, etc. The main idea is – be great with data, but don’t trample on the rights of humans.

At what point should companies start thinking about data privacy?

Data privacy really should be fundamental to a company’s operations. Just like companies think about cybersecurity, they should be thinking about data privacy. If you’re using data of humans and creating processes and procedures to do so, you may be making missteps throughout the process, which could be hard to go back and change. That said, it’s never too late to start thinking about privacy.

I’ve had a lot of companies come to me and say, “This grant organization is about to give me some money, and I have to send them documentation or information about our privacy programs.” The grant organization wants to know how companies protect data because they don’t want to take on the risk. It’s an eye-opening experience for companies to have those requirements placed on them.

What are some first steps companies should take in terms of data privacy?

The first step that I recommend for companies is to identify what type of data they have about people that is personally identifiable. Is it just people’s phone numbers and emails, or is it biometric data? Where are those people located? Data travels, and people’s rights travel with them. The laws differ between states, and there are certain things you can do in some places that you can’t do in others. Looking at the data that you have and understanding your risk profile will help you figure out what you need to do next.
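A first pass at the identification step described here can be sketched in a few lines. The column names and PII categories below are purely illustrative, not from the episode:

```python
# Illustrative mapping of column-name hints to PII categories;
# a real inventory would also inspect values, not just names
PII_HINTS = {
    "email": "contact",
    "phone": "contact",
    "ssn": "sensitive",
    "fingerprint": "biometric",
    "dob": "identity",
}

def classify_columns(columns):
    """Flag columns whose names suggest personally identifiable data."""
    findings = {}
    for col in columns:
        for hint, category in PII_HINTS.items():
            if hint in col.lower():
                findings[col] = category
    return findings

print(classify_columns(["user_email", "signup_date", "phone_number", "DOB"]))
# → {'user_email': 'contact', 'phone_number': 'contact', 'DOB': 'identity'}
```

A real audit would also record where the data subjects reside, since, as noted above, the applicable laws follow the person.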

What role is typically responsible for knowing about data privacy — the data team or legal team?

Data privacy is multidisciplinary. The law only talks about what’s on the books, but you really need to get more into the fundamentals – that’s when you see companies getting fined for privacy issues. It’s not because they don’t know the law, it’s because they don’t know how to change the way they operate.

——————

“When you see companies getting fined for privacy issues, it is not because they don’t know the law, it's because they don’t know how to change the way that they operate.”

——————

I work with people on operations – how do you do your day-to-day work with data? What are your risks? Which ones do you want to take on? I can advise clients, but they have to be able to decide what they want to do with that and what their risk appetite is with data.

So companies can choose their risk? I thought there was a hard line between what companies can and can’t do.

Yes and no. There should be a hard line. But companies can kind of do whatever they want. Let’s say Facebook gets fined a ton for something they chose to do with their data. Maybe they thought the risk was worth it because they could afford it. On the other hand, a small mom-and-pop shop may not want to take on the fine.

A lot of big companies take data privacy very seriously though. Let’s say you’re a small business owner and you’re only aware of local laws. If you’re a third party to Amazon, Google, or some other big company, they’ll say that while the small business is not subject to these laws, they are, so they want the small business to align with them. That’s what I help a lot of companies do, align with those laws.

So let’s say I create an app and I’m collecting names and emails. What do I need to do from a data privacy perspective?

One thing you need to figure out is where the people reside and what state they’re in to know what laws apply to them. Are you collecting data about people that’s considered sensitive information, meaning anything that can be used to discriminate against them, such as their race, health, or biometric data? These things have special protections beyond what you would have for someone’s email and password. That’s personal data, but not sensitive data.

When I’m working with clients on operations, I go through an exercise. Do we need this data or not? Is it at risk or not? Should we collect it or not? A lot of it is let’s figure out what we need to collect and what we want to do. Let’s tie the data to a purpose. And then you’ll have less risk down the line when you use the data. A lot of companies get in trouble because they collect more data than they need.
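One lightweight way to run the exercise described above is a simple inventory that ties each collected field to an explicit business purpose. The fields and purposes here are hypothetical:

```python
# Hypothetical data inventory: every field either has a documented
# business purpose or becomes a candidate to stop collecting
inventory = [
    {"field": "email", "purpose": "account login and receipts"},
    {"field": "shipping_address", "purpose": "order fulfillment"},
    {"field": "birthdate", "purpose": None},  # no business purpose recorded
]

# Fields with no purpose carry privacy risk without business value
to_drop = [item["field"] for item in inventory if item["purpose"] is None]
print(to_drop)  # → ['birthdate']
```

The point is not the code but the discipline: if a field can’t be tied to a purpose, it’s risk without value.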

A good example is a recent T-Mobile data breach, where some of its legacy data was exposed. There was data in spreadsheets of people who applied for accounts and whether they got them or not. This data was clearly very old and it didn’t have a high business value, but it had a high privacy risk. That’s why they got fined. If they had just deleted that data, they wouldn’t have had any problems.

This is a problem that a lot of companies have where they want more data and can’t get rid of anything. But truly, the data you keep should help you run your business. It should help you get insights into what you’re doing with your business. If not, then you’re just hoarding it like a bear.

——————

“The data you keep should help you run your business. It should help you get insights into what you’re doing with your business.”

——————

From a data analyst standpoint, collecting a lot of data seems like a good thing. This is a new perspective for me.

A lot of laws they’re putting into place now are basically saying if you don’t have a purpose for the data, then you should get rid of it. This hasn’t happened before. Normally companies collect data, keep it forever and do whatever they want. Now these laws are saying if you don’t have a business relationship with the person or you don’t have legal requirements to keep it, then you really shouldn’t keep the data.

The way that you can keep data forever is to de-identify it. So once you take the personally identifiable information (PII) out of the data, you can keep it forever. But a lot of companies want to either keep things forever or they don’t have a strategy for their data.

Data should have an end-of-life strategy. The end-of-life should be the end of your business purpose for using the data. Once that purpose has expired, the data should be either gotten rid of or anonymized in some way where you take the PII or sensitive information out of it. Once de-identified, you can still analyze the data and link it to other data sets, but what they don’t want you to do is try and re-identify the person.
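As a minimal sketch of the de-identification step described here, assuming hypothetical field names, a data team might drop direct identifiers and keep only a salted one-way hash as a join key:

```python
import hashlib

# Hypothetical records mixing PII with analytics fields
records = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "state": "IL", "purchases": 3},
    {"name": "Alan Turing", "email": "alan@example.com", "state": "CA", "purchases": 7},
]

# Fields treated as directly identifying and removed before retention
PII_FIELDS = {"name", "email"}

def deidentify(record, salt="rotate-me"):
    """Strip direct identifiers, keeping a salted one-way hash so
    de-identified rows can still be linked across datasets."""
    clean = {k: v for k, v in record.items() if k not in PII_FIELDS}
    clean["person_key"] = hashlib.sha256(
        (salt + record["email"]).encode()
    ).hexdigest()[:16]
    return clean

deidentified = [deidentify(r) for r in records]
```

Note that a salted hash like this is pseudonymization rather than full anonymization: as long as the key can be linked back to a person, many privacy laws still treat the data as personal, so the salt would need to be protected or the key dropped entirely.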

How do people learn about data privacy?

Universities are now starting to offer classes, certificates, and other programs on data privacy. My learning of data privacy has just been reading for 20 years. I’m a nerd in that way, and it’s a topic that I’ve been very interested in, so I kind of marry it with my technical skills. That’s how I got into data privacy originally. I would say read up on it.

A lot of companies have been put into situations where they never had to think about privacy but then they want to expand to a different jurisdiction and they run into problems or they can’t sign a contract unless they share what they’re doing operationally with their data. Without thinking about data privacy, these companies are taking on risks.

In the past, if a big company gave their data to a third party to do something for them, they would say it’s not my problem what they do with their data. That’s the Cambridge Analytica story. But it doesn’t work that way anymore. Since that happened, laws are saying that as a first party data holder, if you give your data to a third party, you both have skin in the game. So from a contract perspective, a lot of those requirements are being pushed down to those third parties.

How can companies train their employees on data privacy?

Just like you have companies doing annual cybersecurity training, I think companies need to do that with data privacy as well. Some teams will have more access to data than others, but everyone needs to be educated on the data risks that companies have. Even if someone doesn’t have direct access to something, maybe there’s something that they know or see that may create a risk for the company and they can let someone know so they can take care of it.

Companies should take a more risk-based approach, meaning they should be doing things in more proactive ways. You should keep documentation over time to track your maturity instead of waiting for something bad to happen.

——————

“Just like you have companies doing annual cybersecurity training, I think companies need to do that with data privacy as well. Some teams will have more access to data than others, but everyone needs to be educated on data and data risks that companies have.”

——————

In the cybersecurity space, there are some laws being created in the US where someone needs to be assigned responsibility if something goes wrong. They’re tired of having CEOs going to Capitol Hill talking about how they don’t know what happened.

It’s a different world these days. If people aren’t talking about data privacy, they will be very soon. And a lot of companies now are required by law in some jurisdictions to give you a data processing or privacy addendum around how they handle data and what they expect of others when they give out their data to handle on their behalf.

Are there particular tools for data privacy?

There are tools, but unfortunately there’s no magic bullet or easy button here. It depends on what companies want to do. I think of tools in two different buckets. One bucket contains things that touch data, that actually help you with the operations of how you work with data. An example is BigID, which will tell you where your PII is and help you pinpoint those risks to figure out what to do from a data perspective.

The other bucket contains things that don’t touch data but talk about it, so things like creating a policy, procedure or workflow. There are reasons why companies collect information, like personally identifiable information, but that purpose needs to be very clear and needs to align with not only the regulations but also your business practices.

It seems like data privacy is less about the tools and more about the strategy.

A lot of times people think about privacy only in a legal way, so the laws tend to be reactive. Typically, especially in the US, something bad happens and we pass a law. A lot of what needs to happen in data privacy is not just about the law. Focusing on the law isn’t going to get us where we need to be. You have to operationally and fundamentally change the way that you think and strategize about data so that you don’t have issues with laws.

What are some small wins or practical things that companies should do after hearing this?

First of all, if your company doesn’t have a privacy program, definitely reach out to someone to talk you through what you need. It’s not one size fits all. Not all companies need as much. Some companies may be stronger on operations versus legal or vice versa, and there’s a balance they need to strike there. But just educate yourself. If you haven’t received something like a data processing addendum from a company that you work with, you need to be on the lookout for that because it’s going to happen.

What we’re entering into now is unprecedented transparency. There are a lot of things about the way that companies handled data in the past that did not have to be as transparent and they do have to now. It’s kind of sending a shock to companies that used to want to keep data forever and not tell anyone anything. That’s not the future.

In the future, you have to be transparent, and in many situations you have to get consent. In almost all instances where you have people’s PII, you have to give them an option to opt out. They need to have some avenue where they can say, “I don’t want you to collect my data.”

I’ve personally noticed more options to opt-out of things recently and I feel like I have more control of my data.

That’s the idea. Some people think of privacy as a tax on business. It really is a way to create a competitive advantage. Apple has been really pushing privacy. They made a radical change in the way that they handle data, where marketers have to ask for your permission before they start tracking your information. It’s been such a huge thing. Other companies who make money off of collecting data weren’t happy about that and lost a lot of money. But Apple, as a result of that change, has had some of their most successful quarters. I think that it’s indicative of the fact that users want to have more control and they want help with their data.

I think that if people trust your company, they don’t have problems giving you data. But if they don’t trust you, they don’t want to give you data or they give you bad data. That’s what you don’t want. As you know in analytics, you want the best data that you can get because you can’t get a good insight out of bad data.

So it sounds like data privacy is going in the right direction.

I think so. I hope so!


Subscribe for Updates

Get new episodes and key insights from The Mavens of Data series sent to your inbox.


MAVEN FOR BUSINESS

Empower your team to make smart, data-driven decisions

Assess your team's skills, discover project-based courses to close the gaps, and create custom learning plans to build the data skills you need most.


READY TO GET STARTED?

Request a Free Team Trial & Platform Demo