Data privacy has moved beyond mere legal checkboxes into a realm where ethical considerations and user trust define the success of digital projects. With the rapid advancement of technology, the handling of user data has become a complex web of responsibility that project managers cannot afford to overlook.
Galen Low is joined by Mackenzie Dysart, Delivery Principal at Thoughtworks, to give listeners a comprehensive roadmap to navigate the often murky waters of digital privacy.
Interview Highlights
- Mackenzie Dysart’s Journey into Privacy Advocacy [01:36]
- She fell into privacy work unexpectedly. Her client needed help with privacy after she finished her previous project.
- Privacy is the right thing to do. While it can be challenging to implement, it’s important to protect users.
- Focus on the benefits, not just the burdens. Privacy isn’t just about following laws; it’s about respecting people.
- Privacy builds better products. Considering privacy during the design process leads to better experiences for end users.
- Data Privacy Today vs. 10 Years Ago [03:55]
- Focus:
- Then: Primarily on preventing data breaches and protecting Personally Identifiable Information (PII) like name and address.
- Now: A broader concern about how data is collected, used, and shared, including:
- User tracking and online activity.
- Sensitive information like race, gender, and health data.
- Biometric data like facial recognition.
- Regulations:
- Then: Limited data privacy laws.
- Now: Increased regulations like GDPR (Europe), CCPA (California), and PIPEDA (Canada).
- Reasons for the Shift:
- Increased use of data by companies and governments.
- Advancements in technology that collect more data (e.g., connected cars).
- Public awareness of potential misuse of data (e.g., Cambridge Analytica scandal).
- Overall Goal – to give users more control over their personal data.
- Integrating Data Privacy Into Digital Projects [08:43]
- Why data privacy is important for digital projects:
- Users are becoming more aware of how their data is collected and used.
- Data privacy regulations are increasing around the world.
- Not considering data privacy can lead to legal risks.
- What to consider when integrating data privacy into digital projects:
- Data collection:
- Identify what data is being collected from users (authenticated and unauthenticated).
- Understand the concept of “sale of data” (transferring data for financial gain).
- User consent:
- Follow GDPR regulations for cookie consent pop-ups.
- Obtain informed consent from users before collecting or using their data.
- Data storage and movement:
- Securely store user data (encryption, anonymization).
- Understand where data is transferred and how it’s secured.
- User control:
- Provide users with control over their data (opt-out mechanisms, access requests).
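The considerations above can be sketched as a minimal data inventory. This is a hypothetical illustration, not any specific platform's API; the field names, systems, and the `flag_review` helper are all assumptions for the example.

```python
# Hypothetical sketch of a minimal data inventory for a digital project.
# System names ("user_db", "crm") are illustrative, not real integrations.
from dataclasses import dataclass, field

@dataclass
class DataPoint:
    name: str                  # e.g. "email", "session_cookie"
    collected_from: str        # "authenticated" or "unauthenticated" users
    purpose: str               # why it is collected
    stored_in: list = field(default_factory=list)    # systems holding it
    shared_with: list = field(default_factory=list)  # onward transfers
    encrypted: bool = False

def flag_review(inventory):
    """Return data points that transfer data onward. Under laws like the
    CCPA, a transfer for financial benefit can count as a 'sale of data'
    even between your own systems, so these warrant legal review."""
    return [d.name for d in inventory if d.shared_with]

inventory = [
    DataPoint("email", "authenticated", "account login",
              stored_in=["user_db"], shared_with=["crm"], encrypted=True),
    DataPoint("session_cookie", "unauthenticated", "analytics",
              stored_in=["analytics_db"]),
]
print(flag_review(inventory))  # → ['email']
```

Even a lightweight inventory like this makes it easier to answer the questions above: what is collected, where it lives, and where it goes.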
- Key data privacy principles:
- Notice: Inform users about data collection practices.
- Consent: Obtain user consent for data collection and use.
- Control: Allow users to control their data (access, correction, deletion).
You can’t have consent without notice. You don’t always need to have control, though control is generally the best practice. Keeping these three principles in mind is crucial throughout any project work.
Mackenzie Dysart
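The dependency between the three principles can be sketched in a few lines. This is an illustrative toy, not a real consent-management API; the function names and the dictionary-based state are assumptions for the example.

```python
# Toy sketch of the notice → consent → control relationship:
# consent recorded without prior notice is invalid; control means
# the user can reverse the decision later.
def record_consent(user_state: dict, purpose: str) -> dict:
    # You can't have consent without notice.
    if purpose not in user_state.get("notices_shown", set()):
        raise ValueError(f"no notice shown for '{purpose}'; consent invalid")
    user_state.setdefault("consents", set()).add(purpose)
    return user_state

def withdraw_consent(user_state: dict, purpose: str) -> dict:
    # Control: the user can change their decision at any time.
    user_state.get("consents", set()).discard(purpose)
    return user_state

state = {"notices_shown": {"marketing_emails"}}
record_consent(state, "marketing_emails")    # ok: notice was shown first
withdraw_consent(state, "marketing_emails")  # control: user opts back out
```

The ordering constraint is the point: notice must exist before consent can be captured, and control, while not always legally required, is the best practice.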
- The Cost of Data Privacy [14:59]
- Considering data privacy early can save money. Building data privacy practices into a project from the beginning is cheaper than fixing them later.
- Platforms can help manage data privacy. There are platforms that can help automate tasks like cookie consent management and user data deletion requests.
- The cost depends on your existing systems. If your systems are complex and not integrated, managing data privacy might require more manual work and custom development.
- Focus on building a sustainable process. It’s important to establish a process for handling data privacy that can be continuously improved upon. This is more cost-effective in the long run than a one-time fix.
- The Role of Digital Project Managers in Data Privacy [20:49]
- Raise awareness: The most important thing is to be aware of data privacy considerations and raise them early in the project.
- Involve stakeholders: Partner with legal and privacy teams early in the project to discuss data collection practices and potential risks.
- Identify data triggers: Be aware of situations that require privacy considerations, such as collecting new user data or using data for new purposes.
- Plan for compliance: Work with legal and privacy teams to plan for how to comply with data privacy regulations.
- Shifting Left
- Introduce legal and privacy considerations early in the project lifecycle to avoid delays and rework later.
- This collaborative approach is more efficient than waiting until later stages of the project.
- Challenges
- Project managers may face pushback from team members who are concerned about the cost of involving legal and privacy teams early.
As a project manager, you don’t need to be an expert in reading code, selecting plugins, or choosing a test suite. You just need to understand that these aspects matter and need to be considered.
Mackenzie Dysart
- How to Convince Stakeholders to Prioritize Data Privacy [26:37]
- Reduced Costs:
- Building privacy considerations into projects early is cheaper than fixing problems later.
- Avoiding lawsuits for non-compliance with data privacy regulations can save businesses significant amounts of money.
- Improved Reputation:
- Consumers are becoming more aware of how businesses handle their data and may choose to do business with organizations that prioritize privacy.
- Future-Proofing:
- Data privacy laws are constantly evolving, and building products with privacy in mind from the beginning makes it easier to adapt to new regulations.
- Competitive Advantage:
- By demonstrating a commitment to data privacy, businesses can differentiate themselves from competitors and attract privacy-conscious customers.
- Shifting the Mindset:
- Data privacy should be considered a core business requirement, not an optional add-on.
- Project managers should raise privacy concerns early and involve legal and privacy teams in the planning process.
- When selecting vendors, consider their approach to data privacy as part of the evaluation criteria.
- Building a culture of data privacy is not just the right thing to do, it’s also a smart business decision.
- Building a Culture of Privacy and Ethical Responsibility [34:56]
- Project Manager as a Role Model: Project managers should lead by example, asking questions about data ethics and encouraging open discussions.
- Psychological Safety: Create a team environment where team members feel comfortable raising concerns about unethical data practices.
- Transparency and Honesty: Be upfront about limitations and potential risks associated with data collection practices.
- Documentation: Document concerns and recommendations provided to clients.
- Risk Management: Clearly communicate the risks associated with unethical data practices to clients.
- Client Choice: Ultimately, the client makes the final decision, but the project manager has a responsibility to inform them of the risks.
- Consider the organizational culture: If the culture is very client-centric and doesn’t encourage questioning, it may be more challenging to promote ethical data practices.
- Take small steps: If major changes are not possible initially, identify a small, positive step that can be taken.
- Navigating Client Relationships and Privacy Concerns [39:08]
- The Impact of Deadlines: Organizations that wait until the last minute to address data privacy compliance face significant challenges and higher costs.
- Legacy Systems: Organizations with older systems not built with privacy in mind will find it more difficult and expensive to achieve compliance.
- Grace Period and Improvement Plans: Regulatory bodies are often more understanding if organizations have a plan for achieving compliance over a reasonable timeframe.
- Private Lawsuits: Organizations that are not actively working towards compliance are more likely to face lawsuits from private individuals or firms specializing in privacy law.
- Retroactive Application of Laws: Existing laws may be reinterpreted to apply to new technologies, creating additional compliance challenges.
- Compliance with data privacy regulations is an ongoing process, not a one-time fix.
- Small, incremental changes can be made over time to improve data privacy practices.
- The Unseen Dangers of Data Sharing and Contact Syncing [45:21]
- The conversation around data privacy has changed significantly over time. In the past, concerns focused on personal information like video rental history. Today’s concerns encompass aspects of our lives like location tracking through connected devices.
- There is a growing awareness of data privacy issues and the potential consequences of unchecked data collection.
- Contact syncing with apps, a common feature, raises privacy concerns as it grants access to user contacts without explicit consent.
Meet Our Guest
Mackenzie is a PMP and CSM certified project manager with over a decade of experience. She’s a bit of a unicorn as she actually chose to be a PM as a career path. Her focus has been on digital projects, but she has also worked in print and app development. Her experience spans both the client and agency sides.
Ask the weird question. It helps make the team more comfortable. Creating a space of psychological safety is crucial because it encourages people to feel okay with asking questions.
Mackenzie Dysart
Resources From This Episode:
- Join DPM Membership
- Subscribe to the newsletter to get our latest articles and podcasts
- Connect with Mackenzie on LinkedIn and X
- Check out Thoughtworks
Related Articles And Podcasts:
- About the podcast
- What is a Digital Project? 4 Dimensions, Types & Examples
- 5 Ways to Improve Your Digital Project Management Process
- Managing Tasks As A Project Manager: 3 Key Strategies
- Your Guide To A Career In Project Management & Example Paths
- What Is A Project Manager & What Do They Do All Day?
Read The Transcript:
We’re trying out transcribing our podcasts using a software program. Please forgive any typos as the bot isn’t correct 100% of the time.
Galen Low: Hey folks, thanks for tuning in. My name is Galen Low with The Digital Project Manager. We are a community of digital professionals on a mission to help each other get skilled, get confident, and get connected so that we can amplify the value of project management in a digital world. If you want to hear more about that, head on over to thedigitalprojectmanager.com/membership.
Okay, today we are talking about privacy and how we as digital project managers need to account for recent changes in technology and legislation in our projects as it relates to privacy. Joining me today is one of our very own digital project management community experts and Delivery Principal at Thoughtworks, Ms. Mackenzie Dysart.
Mackenzie, thanks for joining me today.
Mackenzie Dysart: Thank you for having me. Really excited for our chat today.
Galen Low: I'm excited too. For our listeners, Mackenzie and I have been working together for years and I had this moment, I'm like, I don't think I've ever had Mackenzie on the podcast under my watch. I'm like, let's change that.
And today we are changing that. And I guess for listeners who don't know you, like, we were just talking about this in the green room, you always pick up a mantle that's like rolling a ball uphill, the really tough challenges. You've been a champion of accessibility, and now you've fallen into this role of being a champion of privacy across digital projects, where you work, but also like in our community as well.
So your brand is kind of like difficult challenges to convince and persuade people.
Mackenzie Dysart: Difficult, difficult, lemon difficult. That's my approach.
Galen Low: The least sexy parts of project management, champion that thing please.
Mackenzie Dysart: I am here for it.
Galen Low: I love that.
You were telling me that like, privacy just crept up. It's just come into the fray and you've kind of, you didn't actively pursue it?
Mackenzie Dysart: We'll say I jumped into it. Yeah, so I've been working with my client for the last two and a half years. And so, about a year ago, they were like, okay, the project you've been working on, the program stream, we're going to swap it out now, ready for you to move on to the next thing, which is always exciting.
But they went, privacy needs some support. They've got a lot of stuff going on. They have no process. They don't know how to work with product. They need some help. So I went, well, okay. And so it's been a year of me learning all things privacy and it's daunting, but it's also just like accessibility. It's just the right thing to do for people.
And I think, it's important that we understand the value and the benefits, and it's not just laws, right? It's doing right by people. It's protecting people. And I think sometimes we get bogged down by 'it's hard' versus the anecdotes and the stories of why these things matter and why it's important. And so, yeah, it's been a year.
I've learned a lot. I get to work with really smart people, obviously some challenging people as well, but like tough conversations are had, but at the end, you're always building a better product, a better project, better service for your end users. And I think that's generally the hill that I end up dying on is let's do right by end users and right by people.
So of course it just snuck right in, like accessibility did. So here we are. I'm ready to talk about privacy, why it matters, and why you should care.
Galen Low: Honestly, I love that because I mean, it is a noble cause really. And certainly I've been in projects where it is very much this box ticking exercise.
It's a big pain in the bottom. Nobody really wants to do it. And they just want to like, just get rid of that liability. Right? You know, not so much a user experience so much as it is reducing liability. But I love that framing that yes, all of these things are to like create better experiences do right by our users.
And I think that's really, really interesting. I guess the other thing is that in the conversations that I've been a part of, and it sounds like for you as well, data privacy is more at the forefront now. I mean, storing data is nothing new, personally identifiable information, that tongue twister, nothing too terribly new. But I've sensed this sort of arc in the importance of privacy, the importance of getting it right, the importance of having process.
The importance of working things into the things that we're creating, especially our digital solutions. But beyond that, I'm ignorant.
So I wonder if you could just tell us like how something like data privacy is different today than it was, say like 5 or 10 years ago? Like what are the key things driving those differences?
Mackenzie Dysart: Yeah, so, I mean, 10 years ago, all we were concerned about was data breaches, right? We were concerned about where that PII was, where we were storing it, and making sure it was protected. And that's really all the data privacy laws were, right? That's where it ended. And then, I guess it was 2016 when GDPR came out, right? And that was the first legislation, and I should remember what GDPR stands for. I don't.
Galen Low: General Data Protection Regulation.
Mackenzie Dysart: I should know, but it was the first kind of piece to say like people should have control over what's being tracked about them online. Right? And this primarily affected websites.
And that was the start of those cookie banners that we all know and love, and, like, aren't bothered by anymore because they're just so common, because everyone has cookies and we need to consent to those cookies being tracked. And that was the first move towards like, okay, what people are doing and what their activity is matters, and they should have control and consent and really, like, know what's happening with their data and have choices.
Then we started to see the California Consumer Privacy Act come in 2020. 2021 had the CPRA, another act in the U.S. And really what we're starting to see is, state by state, there's new legislation coming out in the U.S. In the U.K. and the E.U., they're further ahead in terms of protection, with more federal-level rules governing across the E.U., which is easier to understand and apply to. Australia's got its own laws. Canada's coming up with some of its own laws as well. I think in Canada, it's called PIPEDA.
Galen Low: PIPEDA, yeah.
Mackenzie Dysart: Right? All these fun little acronyms. But there's just so many new laws coming out, like, regularly to protect people's data. And it's all because of how we, as a society, are using data now, right? So before, we were just concerned about individuals' data getting breached because of identity theft, right? That was the main purpose of it. Now there's so many other reasons to be careful about your data.
There's a pretty common, like, anecdote from CVS about the fact that there was a teenage girl, she'd been looking some stuff up online, and then CVS sent her a coupon for prenatal stuff or baby things, and basically outed her as pregnant to her dad. Because they just went, oh, a person in this household is looking for this, it's not sensitive, and just sent it. And that's not okay, right?
And that's the anecdote everybody thinks of, but now we start thinking of all the tracking technology and things that we have in tools. And there's an anecdote and story about a car manufacturer, I want to say Mercedes, but basically there was a divorce. She got the car and he still has access to tracking that vehicle and they can't shut it down. And then there's that safety aspect of, she can't be anywhere without her ex knowing where she is, because she can't have his access removed from the vehicle because of how they built their systems.
So there's all these things of like, as we're building smarter technology, we're gathering all this extra data about people, and then what are we doing with that data? And then we look at health data and what that means for people, and understanding, like, the implications if their health data is shared to different states, for example, because things can be shared. And then even the concept of PII has changed, right? PII initially was like name, address, phone number, postal code, whatever.
Now there's the broader term that is sensitive information, so that is anything pertaining to who you are as an individual, race, gender, and then there's the biometric aspect of it too, right? Like, a photo of your face is biometric information now. Something that I had never thought about before, of like, uploading your profile photo to a platform, that's now biometric information that they can then use to like, match your face to things on the internet.
I've never been an overly like concerned about privacy person on the internet because I grew up in that age of like, at first I was very hesitant and had so many fake names on Hotmail and then just committed myself to no, just going to be my real self on the internet from now on and lost all that concern for whatever reason.
And I think there's just so much information out there and then people can connect the dots and find all this information and now I'm just more aware of how systems use it, how it can be like shared, sold. I think the biggest thing that we think about when we get like icky about what's done with our data is you think Cambridge Analytica and all those things that happen where it's like, Oh, your activity on Facebook, we're going to manipulate that to do these other things and figure out this information about you.
And there's a lot of laws coming out trying to stop that or protect that and allow you to opt in and control your experience. And I think, we'll talk about this probably multiple times today, but ultimately what I think the intent of all of these laws is just to put control back in the end user's hands versus us making decisions arbitrarily for people.
Galen Low: That's what I find really interesting. And I think, yeah, you hit on two really big points. So a) the technology it's progressing, it's personalized, it's convenient. And the price for all of that is generally data, right? Knowing things about you. I think the other piece is just this awareness, right? You said like 10 years ago, data breaches, it could be like, a credit card company.
Yeah, like someone got access to all your credit card numbers and that's a thing. And we're like, Oh yeah, I guess that can happen. And now it's just much more part of the conversation. We have this like deeper awareness of how our data is used. And then the next step beyond that is exactly as you say, right?
It's like, how can we put that control back into the hands of the people who are the sources of that data? And then how can we, you know, continue to drive transparency and awareness around how that data is being used? And I do agree that like, yeah, that ickiness of like, having your data sold. That's like the most, I guess, common conversation I have around privacy, at least.
But, the thing you mentioned about pictures, like, I've come across tools, and, not saying one way or the other whether they're ethical or not, but they're sort of connecting your social profiles to verify whether it is actually you and not some other Mackenzie Dysart.
You're like, Oh, a little bit of facial recognition. Like it could be a different photo of you there and here, but you know, is it the same person? And they're using it in that way. And it gets pretty deep in terms of how this is being used. Not the economics, the simple economics of like buying data and selling it, but the sort of economics of baking data processing into features that are abstracted and blended into a broader experience and that awareness is a spectrum, right?
And like, people are going to be like, okay, I'm aware that if I give people my email address, I'm probably going to get some, promotional newsletters from that person, maybe from a third party, but then it can go much, much deeper, much deeper.
Mackenzie Dysart: So deep. Yeah. And like the amount of data you give away about other people on it, like is also really interesting. And that could be like a whole other conversation, but it's interesting.
Galen Low: I mean, I suppose it is one of those, more 'you know' things. But then I guess, from a project perspective, the economics of a project, I guess, I mean, we don't have infinite time and infinite budget to know everything we need to know about data privacy.
But we were talking earlier in the green room about like process, right? Even just like having a bit of a framework around it. And I guess, with all of this, I'm thinking of digital projects. I'm thinking of things like website builds. I'm thinking of apps. I'm thinking of augmented reality, heck, video games, all these things. Like, does every digital project now need to consider data privacy? And also with all the things that you mentioned, all the acronyms we talked about.
Mackenzie Dysart: So many acronyms.
Galen Low: What is the risk of not having privacy baked into a digital project?
Mackenzie Dysart: Short answer, yes. Right? We can just stop there and just say, yes, everything should consider data privacy. Long answer is, it's so critical to understand what data points you're capturing about people.
And that's, if we're going to think websites, just to make this easier, there's the two sides of it, right? First, there's the non authenticated, non logged in, where basically we're just tracking cookies and sessions. And that's really the GDPR experience, right? That we're mostly familiar with and understanding what consents can happen by default.
So an interesting piece to understand is for GDPR, like that cookie banner pop up, every country has a different set of rules about what can happen if someone doesn't click on it. So in the U.S., for example, if someone clicks outside that banner and just dismisses it and hits the X, you can assume implied consent and collect the cookies.
That is not the case under the GDPR. That is not the case in Canada. So it's so important to have an understanding of all the different regions that you're operating in, what the impacts are, and what rules and legal processes you need to be following to make sure that you're as compliant as possible. And then that also speaks to, what is your internal process every time you add a new pixel or tracking cookie to your website? How do you check and make sure that it's in your cookie tracking system so that it is either defined as required, or it's not necessary? So it can be like just marketing or whatever, and then it's filed correctly so that you're behaving as expected.
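The region-dependent dismissal rule Mackenzie describes could be sketched like this. The region table here is purely illustrative, based only on the examples she gives; actual rules per jurisdiction must come from your legal and privacy teams.

```python
# Illustrative sketch only: whether dismissing a cookie banner (clicking
# the X or outside it) counts as implied consent varies by jurisdiction.
# These entries reflect the examples in the conversation, not legal advice.
IMPLIED_CONSENT_ON_DISMISS = {"US": True, "EU": False, "CA": False}

def cookies_allowed(region: str, banner_action: str) -> bool:
    """banner_action is 'accept', 'reject', or 'dismiss'."""
    if banner_action == "accept":
        return True
    if banner_action == "reject":
        return False
    # Dismissal: only some jurisdictions treat this as implied consent.
    # Unknown regions default to the strictest interpretation.
    return IMPLIED_CONSENT_ON_DISMISS.get(region, False)

print(cookies_allowed("US", "dismiss"))  # → True
print(cookies_allowed("EU", "dismiss"))  # → False
```

Defaulting unknown regions to the strictest rule is one way to stay on the safe side while new state and national laws keep appearing.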
So I think that's the easiest kind of simplest place where it's quite clear. Then if you have authenticated users and like a logged in state, what are you doing with all that data that you have about those people? And then on top of that, where's the data going? So when we talk about selling and sharing of data, a lot of people think of like the sale of data as, Oh, I've taken your data and I've sold it to another company for money.
That is not what the legal definition of sale of data is, which I did not know until a year ago. So what it actually means is transferring data from one system to another for financial benefit at some point. So it could be you going from like your database to Amplitude or Segment or some sort of tool that you use to figure out how you're going to market to them, and then like your CRM tool, and then you're sending them marketing emails and then they buy something.
That is a sale of data process. Even though it's all your own experiences and like your own systems, it's still a transfer of data. And so it's really interesting and something that we also need to think about is where is the data being housed? Where is the data going? And how are we moving that data? Is it anonymized?
Is it encrypted? Are we making sure that, once it goes from our system to the next system, we don't actually need the individual PII or whatever identifiable information, and we can just send aggregates? Okay, that's a little bit safer. So it's understanding what data points you're capturing, how you're capturing consent, and then what you're doing with the data. There's like three key terms that I learned that apply to most things in privacy, and it's really notice, so that's giving the information about what is happening and what will happen if you make this choice. There's consent, so that's the actual capture of the decision that the person's making.
And then there's the control, and that's the ability to, like, change your decision, right? So there's those three pieces. You can't have consent without notice. You don't always have to have control. The control is the best practice, generally. And those are the three things that are always good to keep top of mind throughout any kind of project work.
Galen Low: You know, it's funny because as you're going through this, there's this cog turning in the back of my brain going, this sounds expensive, man. Like, I'm imagining, and I'm coming from a place of ignorance here, I'm imagining there's platforms to help organizations manage data and categorize data, and even like sort of bucket data in terms of where it's going as PII or where it's going as anonymized data, and then helping to craft, I guess, the messaging, the notice aspect of things, and the control aspect of things. Because there is that, oh, please delete my data. Is it just someone's job to go in and find that in all the systems and delete it?
I'm imagining this platform has to manage that. But even if there is, sounds like a line item in a budget now, is it?
Mackenzie Dysart: It can be for sure, right? So that's the other thing that you have to decide in your project and in your organization as a whole. If you are a product organization and you're dealing with a lot of different aspects of data, you have to figure out how you're going to manage it.
And that's where thinking about privacy and legal and compliance and security, arguably, like all of that early is less costly because you're building it with that in mind so that you're building the processes as you're going versus retroactively having to go, Oh, I now need to figure out where all the data is going and what data is going where, like how to set a filter or like where do I cut it off because a person's opted out of something. And so it's obviously much easier to do earlier on.
For like GDPR, the cookie consent stuff, there are some, OneTrust I think is probably the best known large company that does that as a service. And it's pretty out of the box, like, oh, these are the things I need. You get to write the copy, but it does all the heavy lifting for you on your website. That's one of the ones. There's a couple other companies and platforms that are coming out, but arguably it all depends on your individual tech stack and how you've built stuff, whether you can have a layer on, or if you need to build stuff internally. With my current client and my project, it's a combination, right?
We've got some stuff that can happen automatically, but then we've got a lot of our own data. So we had to build our own automated process for if a person submits a data subject access request or deletion request, then we can automatically go through these systems. But then these other systems, we have a manual list where we have to go and manually delete them so that we adhere to that requirement.
But also, it's just because they're not all connected. And so it's a challenge. It is manual labor. It's not easy, but it can be easier if you build it with that in mind. Right? So I think the thing that I've always learned is, you might not be able to build it a hundred percent compliant for launch date, but can you build it in a way that makes getting compliant and future-proofing it easier? That matters.
And I, this is the same thing that I think I said in like our accessibility talks, just think about it early. Continue to work on it and it's fine. Right? And as long as you're working towards it and you build it with that in mind, it's so much easier to adhere to those privacy compliance laws, especially given that there's new laws coming out all the time and we don't know what they're going to be.
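The mixed automated-plus-manual deletion flow Mackenzie describes could look something like this. Everything here is a hypothetical sketch: the system names, the ticket queue, and the dispatcher function are all assumptions, stand-ins for whatever delete APIs and manual checklists a real organization has.

```python
# Hypothetical sketch of handling a data subject deletion request when
# some systems support automated deletion and others need manual work.
# System names are illustrative placeholders, not real integrations.
AUTOMATED_SYSTEMS = {"user_db", "analytics"}
MANUAL_SYSTEMS = {"legacy_crm", "print_vendor"}

def handle_deletion_request(user_id: str):
    deleted, manual_queue = [], []
    for system in AUTOMATED_SYSTEMS:
        # Stand-in for a real per-system delete API call.
        deleted.append(system)
    for system in MANUAL_SYSTEMS:
        # Disconnected systems get a ticket for manual follow-up,
        # so the request is still honored end to end.
        manual_queue.append((system, user_id))
    return deleted, manual_queue

done, todo = handle_deletion_request("user-123")
```

The point of the dispatcher is that the process, not full automation, is what makes the requirement sustainable: automated where systems allow it, tracked manual work where they don't.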
Galen Low: Yeah, absolutely. I think you raised a good point with accessibility because I mean, I said, is it a line item? And I, you know, I literally, the picture in my head is, this estimate that says whatever privacy stuff, right? Like 25% of the budget, but like, we wouldn't do that for accessibility and we didn't do that for sort of, like back in the day, like responsive websites, right?
It wasn't like, oh yeah, here's that extra tax at the bottom, this line item for making it responsive. Like, we had to blend it into everything we did. We had to blend accessibility into everything we did. It was just quality attributes, like what are we gunning for? Are we gunning for AA, okay, accessibility, and then do we need to pave a path to AAA in the future?
And it, you know, it wasn't just as bucket, which it had to go in, it was part of design. It was part of the technology. It was part of the way we tested. And, we ended up like where I was, it was our clients were public sector. And so accessibility did matter. And as a result, it was also a sort of competitive selling feature, right?
That we were an organization that understood accessibility and everyone, who was a client needed to pay attention to that. There was that pairing. And I'm imagining now that that's the case with privacy, where an organization who might go out to an agency or a consultancy or any kind of third party, it will be attractive to them to be like, okay, well, we understand privacy.
Yes, it costs more. Yes, we weave it into everything. Yes, it's difficult, but guess what? A) it's a noble cause, right? This is for the benefit of users, the benefit of humankind, if you want to go that far.
Mackenzie Dysart: Benefit, yes.
Galen Low: Yeah. Yeah. But, also, it's something that you will want to build for. It is normal to build for now.
This is not extra. It is part of the conversation now. It is an expectation. And frankly, we'll get there later, but some organizations are like, yeah, I'll just feign ignorance when somebody comes after me with a lawsuit. Those people are going to be falling behind pretty quickly in the privacy zeitgeist?
Is that a correct use of the word? Anyways, the sort of, the culture around privacy. Like you see it with Apple, right? Apple is leading with the front foot on privacy being a selling feature in their billboards and stuff, and I don't know one way or another, how much that's baked into the platform, but it's clearly a marketable thing now to be like, we will help protect your data. Don't worry. So it's very much part of the conversation.
I guess if I was to tie it back, I'm thinking now I'm in project manager mode. I'm like, okay, we got a plan for this. And even if we plan from the beginning, like, I imagine there's a lot of organizations that don't yet have like a dedicated sort of privacy champion on every project, looking at every aspect of the project and thinking through what the requirements are. And where I've worked in the past, that would probably fall wherever the buck stops. And sometimes the buck stops with the project manager.
So I was wondering, like, what is a digital project manager's role in data privacy? Like, are we aspiring to become chief privacy officers of our projects? Or is it more of that sort of awareness that you mentioned, right, the notice, the consent and the controls? How deep do we need to go?
Mackenzie Dysart: Yeah, I think being a chief privacy officer would be like the most terrifying and stressful job ever, because you'd have to make so many risk-based decisions. I'm not that person. Well, I mean, I guess I am, but I like to defer sometimes, and it's nice having one of those people to defer to. As for a DPM, right?
You're a project manager. Just like with building the product itself, you do not need to be an expert in reading code or what plugins you're going to use or what test suite you're going to run. It's just understanding that it matters and that it needs to be considered. And bringing it up early, I think, is the most important part.
So it's a matter of making sure that you've got processes in place. If you have a legal team, or a subset of that legal team that handles privacy, are you involving them early, especially if there are questionable things or new pieces of user data that you've never captured before? It's really important to have that conversation and say, okay, hey, we're going to start collecting height and weight about our customers because we're delving into fitness.
And so what does that mean for us? And then you start to learn that like, oh, there's all these laws that govern health data and that can be considered health data. Do we really want to do that and make those decisions? So I think it's important to have a partner that you can check in with. And most importantly, I think it's understanding what those guardrails are or what those key items that are like, you hear something, you're like, Oh, I should probably talk to someone about that.
And for those, my tidbit would be: anything about new user data. So if you are capturing any new data point about your users, check with privacy. Figure out what the implications are. And then it's also understanding: where is that data going to go? How do we add it to our deletion process, right?
Circling back to what you mentioned about, is it going to be another manual thing? Can we make it automated? How do we protect that data if we need to? And then really just understanding, if there are compliance requirements, how do we move forward? So it's really: is there new user data, or are we doing something new with the data that we already have?
Those are the two things that you really want to consider. As soon as anything like that is said, my little alarm bells go off and go, oh no, privacy. You can imagine a little privacy light glowing, right? Yeah. I think those are the two things that you really want to understand.
And then the obvious big elephant in the room, which ties to new things with data is AI, right? Like if you're going to start leveraging AI or machine learning based on this user data, what does that look like? And go from there. I think those are the two questions, right? New data, or are we doing something new with the existing data?
And those are the two that can really be your triggers to say, okay, I really need to talk to privacy now and move forward. Obviously you should always have security, legal, and privacy reviews on stuff. Always. If you don't have that process in your organization, maybe find out why. Find out who to partner with.
As agencies, sometimes we don't have that stuff in house, but we should be partnering with our clients to figure out, hey, who needs to review this? Who needs to validate our assumed flow of data? So I think it's also important to make sure that you've got that partner on the other side. I've had the opportunity of working on a number of different projects over my career that have always involved legal, and my takeaway is: always involve legal early.
Involve them as soon as possible so they know what's going on, so they can feel part of the conversation and then they can ask the questions early before they're like a deterrent to your project, right? Or before they're a big problem. If they're just a we're at point A and like at point C we're going to need to consider this thing for legal. We can plan for it. We can plan that detour. It's less of a veer than, oh, we're at point D, we need to go back to point C, right?
So I think it's a matter of planning and understanding and making sure that they're at the table early. The phrase shift left comes to mind here, just like it does for a lot of other things. The earlier on and the more engaged folks are, the simpler it will be in the long term.
Galen Low: Which is a dark art in and of itself, because I know I've experienced a lot of pushback on getting people involved early, especially in an agency context where you're like, you know what, you're going to have these people billing on the project, but we don't do any of that security stuff until later. We don't do the privacy stuff until later. And I'm like, yeah, but it'll be less expensive this way.
Mackenzie Dysart: We're just gonna kick it off. Bring everybody to the table to start. They don't have to be there the entire time. And you just, you make it a, we're building a team. We're just getting everybody on board.
They can go, or if we're all on a boat together, they can just go hang out by the pool deck for the next little while while we're working. But, we all need to be on the same boat, on the same ship, going somewhere, I think is the way to look at it.
Galen Low: And even that, like, I love that framing also because the thing that strikes me is that, all of these things are ecosystems.
Like, we just painted a picture of an ecosystem of people and technology that may not work for the same organization even. So, we do have this notion that you have to choose vendors who also care about privacy. You have to, understand how your client feels about privacy. You have to have the internal culture of privacy, and those all need to plug together, otherwise you still have this sort of liability.
And yet, I've probably been in two or three conversations in my entire career where we're bringing all of the parties together for that simple-ish conversation of: what are we doing about this thing that matters? From a legal standpoint, from a security standpoint, from an accessibility standpoint, from a privacy standpoint.
Because we all need to be rowing in the same direction. Nobody can find out at point D about something they were supposed to be thinking about at point C but didn't get the memo about at point A. That's never good. And there are lots of folks involved.
In a perfect world, everybody is on the same page about privacy. But like many things that are compliance related sometimes the temptation from our teams, from our clients, from our sponsors, sometimes the temptation will be to take shortcuts on things like privacy, to keep the cost down, or to deliver on time, like classic, project management stuff. Deliver sooner, deliver cheaper.
What arguments can a digital project manager arm themselves with to, like, navigate those conversations?
Mackenzie Dysart: So first, and this is the most common thing I say about accessibility or anything like it: bugs and problems that are found later are exponentially more expensive to fix.
So if you don't build it with that in mind, it's going to be so much harder to retroactively make it work. You're going to do something jankier, and it's not going to be as clean. And then there's that risk of: oh, the person who built this left, there was some nuance nobody else knows about, and there's just a lot of business risk associated with doing it late, right? That's the easiest argument and the one you want to start with. Also, it's just the right thing to do; again, trying to do the right thing for people. Now, understanding that businesses are businesses, sometimes they just want to hit a timeline and that's it.
That's when you can start to use the lawsuit aspect of things, right? So GDPR, I believe, is €20 million or 4% of total annual global revenue, whichever is higher, as the max you can be charged for being non-compliant with GDPR.
Galen Low: Oh, wow.
Mackenzie Dysart: So it's a pretty big financial burden if they deem you non-compliant to the maximum degree. I don't know exactly how they determine how much you're going to get charged, but that's a pretty big risk.
And then if we look at U.S. state laws, most of them are around $7,500 per individual violation. That comes into play when you start thinking about class action lawsuits. And this is all legal jargon I've recently learned, but if there is a private right of action, meaning individuals can sue a company directly rather than a regulator or governing body going after it, then lawyers can bring class action lawsuits.
So I've seen stuff on Instagram where they're like, hey, have you used this product before? You might be eligible for X, right? We used to see these commercials on TV all the time: were you exposed to this? Join our class action lawsuit. Basically the same thing's still happening now with a privacy lens, and that's $7,500 per individual violation or individual complaint, right?
So that can add up pretty quick. And then there was a new law that just came out in Washington state around health data. And again, this is the one that is top of mind for me, but it was $25,000 per individual if they could deem damages. So if you think that you've just even got like a couple hundred people who could prove damages, that's a lot of money, right?
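The exposure math Mackenzie describes can be sketched in a few lines. The statutory figures below are the ones cited in the episode; the company revenue and claimant counts are hypothetical, purely for illustration:

```python
# Back-of-the-envelope privacy fine exposure (illustrative figures only).

GDPR_FLAT_CAP_EUR = 20_000_000      # statutory maximum: €20M or 4% of revenue, whichever is higher
GDPR_REVENUE_RATE = 0.04
US_STATE_PER_VIOLATION_USD = 7_500  # typical per-violation figure under several U.S. state laws
WA_HEALTH_PER_PERSON_USD = 25_000   # Washington health-data figure cited in the episode

def gdpr_max_fine(annual_global_revenue_eur: float) -> float:
    """GDPR's cap is the greater of the flat cap and 4% of global revenue."""
    return max(GDPR_FLAT_CAP_EUR, GDPR_REVENUE_RATE * annual_global_revenue_eur)

def class_action_exposure(claimants: int, per_violation_usd: float) -> float:
    """Simple linear exposure for a private-right-of-action class."""
    return claimants * per_violation_usd

# A hypothetical company with €1B annual revenue: 4% beats the flat cap.
print(gdpr_max_fine(1_000_000_000))                            # 40,000,000
# A hypothetical class of 10,000 claimants under a $7,500-per-violation state law:
print(class_action_exposure(10_000, US_STATE_PER_VIOLATION_USD))  # 75,000,000
# Two hundred claimants who can prove damages under the Washington figure:
print(class_action_exposure(200, WA_HEALTH_PER_PERSON_USD))       # 5,000,000
```

Even a small proven class adds up to millions, which is the point being made here.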
And so the financial implications and the financial risk matters. There's also the business risk. People are becoming more considerate and aware of privacy. Maybe they're not necessarily someone who's going to be like, Oh, I opt out of everything. But they might be someone who is conscientious of what businesses they're working with because of it being something that matters.
And for that, there's reputational risk associated with it as well. And then I think it's also really important to understand that laws are changing all the time. So it's even unclear what's going to be happening in the next six months. There's tons of new laws coming out about teenagers and social media.
But what is deemed social media? If you have any sort of community within your website or digital app, that could be considered social media. Are you compliant with all of that? So it's interesting. There's a whole bunch of different costs associated with it. And I think if you're not building with privacy in mind, then privacy by design, accessible by design, secure by design, all of those concepts are really critical to success here.
And they are going to be a little bit more upfront work, but significantly less costly in the long run, either from a rework perspective or a legal perspective, right? So it's applying that principle and knowing that you might not be able to build it to be compliant on day one. But if you start building it right, even if there's not yet a law that governs the thing you're doing, you can think: okay, if I'm thinking privacy, I should be able to corner off this section of data. Another piece is members or users being able to change their mind.
So that's the big piece: how are we going to give people control to opt in or opt out of this thing, and also change their mind again later on? We don't necessarily have to be able to do that on day one, but can we build that backend to support it later on, when it may be necessary?
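One common way to build a backend that lets users change their mind later is an append-only consent ledger: instead of overwriting a single yes/no flag, keep every decision with a timestamp, so the latest one wins and there's a record of what was consented to, and when. A minimal sketch, with hypothetical purpose names:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentLedger:
    """Append-only log of consent decisions; never overwrite, only append."""
    events: list = field(default_factory=list)

    def record(self, user_id: str, purpose: str, granted: bool) -> None:
        self.events.append({
            "user_id": user_id,
            "purpose": purpose,  # e.g. "marketing_email", "contact_sync" (hypothetical)
            "granted": granted,
            "at": datetime.now(timezone.utc).isoformat(),
        })

    def current(self, user_id: str, purpose: str) -> bool:
        """Latest decision wins; default to no consent if none was ever recorded."""
        for event in reversed(self.events):
            if event["user_id"] == user_id and event["purpose"] == purpose:
                return event["granted"]
        return False

ledger = ConsentLedger()
ledger.record("u1", "marketing_email", True)
ledger.record("u1", "marketing_email", False)   # the user changed their mind
print(ledger.current("u1", "marketing_email"))  # False
print(ledger.current("u1", "contact_sync"))     # False: never asked, so no consent
```

The design choice here is that "change your mind" costs nothing extra: it's just one more appended event, and the history doubles as an audit trail.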
Galen Low: It's really interesting because when you think about maybe all things, right, but definitely privacy, it's sort of part of the conversation culturally now. There is higher awareness, which is going to drive, progress in law and legislation, which is then going to combine into the sort of people, voting with your dollars, right?
We say that when we're talking about, whatever, free-run eggs, but we also mean it as: okay, you are the client, from my perspective. You're an organization that is going to hire vendors or engage existing vendors on a thing, and in that selection process, if privacy is part of the conversation and top of mind, and there is this sort of ROI on just not getting sued, then you will start selecting people to work with who believe in those things. Because I've been on the other side of it too, where it's like: let's not say anything about it now so that we can come in at the budget, and when it does become a thing, we'll bill the client for it. But you're looking at dollars here, and the cost of switching vendors is far less.
So even if you're running this business and you're like, okay, well, we're getting away with it. Let's just not get caught. I think a savvy organization, a client organization should be able to say, okay, well, listen, like if you guys haven't mentioned privacy in five years, what are you doing about it? And if I don't like the answer, it's cheaper for me to switch to another agency than to get sued and have someone take 4% of my total global revenue.
It's a no brainer. And that kind of forces, it galvanizes the entire industry towards better privacy behavior. Even if players along the way don't really believe in it or care about it, it becomes part of the marketplace and becomes part of being competitive.
Mackenzie Dysart: Yeah, it's an absolute value add, right? You're looking at that skill set, that ability to adapt and drive what the product or the environment needs, right?
Everyone needs to be considering this now. This is a business requirement, whether or not it's being discussed in the RFP, you should be responding to that RFP with, Hey, by the way, this is what we do for privacy. This is how we are going to make sure that we build it in. What you do with that is your own choice as our client.
But this is what we will do to provide you with the steps that you need and the best practices, and then you can do with it what you want, and pass that risk assessment and that choice over to them. It's just a huge value add, and I think it's not hard to start to understand it. Like I said earlier, I've only been doing this for a year, and I happen to work really closely with some privacy experts who know what they're doing and are very patient with me.
But also, it's not that hard to start to understand the themes, right? I don't have to decide how to build it properly. I don't have to know. I just need to know how to speak to it intelligently, how to ask the right questions, and how to make sure it's being considered along the way. Otherwise, it can just get missed.
And then it's so costly. And those hidden line items on client budgets? Clients don't always like those, right? I know it's been a business strategy for a while: let's just hit that budget number. Okay, but we can't hit that budget number unless we don't do this work. And that's not fair.
So I think it's really important to be honest and transparent about it, but also start building it into our systems so that it's just part of the budget, right? The budget just includes privacy. It's not an add-on. It's not a feature that you get to choose. It just is, just like building mobile-first websites, right?
That responsive design, I think is a great example of, we just figured out a way to do it. We didn't upcharge for it. It just was a thing that we had to do as an industry. And that's the way forward.
Galen Low: I love that sort of education piece, right? People willing to share, and people curious enough to ask, equals, yeah, sort of lifts most boats, but maybe not all.
Maybe we can dive into the tough stuff. Because we've been talking about, ethics. We've been talking about this noble cause. We've also been talking about organizations that are tempted to take shortcuts. And yeah, they're probably not alone. Yeah, I guess thinking through some of those sort of tough questions and the role of a digital project manager, but also like anyone on the team.
We are talking about some solutions, digital projects where, we are capturing data, we're doing lead capture, running digital marketing campaigns that are personalized, and it involves a lot of data collection. And sometimes, in ways that a team might not feel is all that ethical, even if it's legal. So, how can we as project managers, like, create a team culture where our team feels safe to blow the whistle when they see something concerning?
Mackenzie Dysart: First and foremost, you need to be the person leading by example whenever possible. I think that's the most important piece. And it's not necessarily blowing the whistle, sometimes, right? It's just: we're doing what? Why are we doing this? We hear that a lot in any sort of DE&I training, where someone says something inappropriate and you call it in a little bit, or ask a clarifying question to draw attention to it, like, that was a weird thought.
And it's the same process, right? So if someone brings up, oh, we're going to do this with their data, you can just say: do we have consent to do that? Should we be doing that? Do we need that piece of data to make this decision, or could we do it with less data? It's asking those questions.
I always think that we, as project managers are the head of the team, right? So if we aren't leading by example and asking questions that sometimes like might be uncomfortable, might not make us look the most intelligent sometimes, like, I ask a lot of dumb questions, because sometimes you just want to open it up for other people to ask questions, right?
So if you ask the weird question, then everybody else will, right? I think there's like a meme or a reel going around right now that's like, say the weird thing. Ask the weird question. At least it sets the team up to be more comfortable. Creating that space of psychological safety is so important because you need people to understand and be okay with asking about it.
And that is an organizational challenge. If you're working in a place that is very client centric and just do whatever they want, right? Like do what the client says, doesn't matter. Don't think about it. Just give them what they want. And it's like, okay, but should we? Right, like, should we do that? Can we do it a different way?
Can we provide them with what they're asking for in a more privacy centric way? So I think it's really just a matter of leading by example whenever possible. Being honest and transparent about the limitations, too. Sometimes that's just hard, right? Like sometimes you're in a place where the systems don't exist to support it, and that sucks.
And it's like, okay, how do we right that ship? But also, it's going to take us a couple of years. So what is the first right step that we can take? I think what's really important in that aspect is to just provide the information. And then there's also that piece where maybe you're providing the recommendation to your client and they just aren't hearing you, right?
That is a very real possibility. I've seen it in a number of different cases where you're like, but this is the way to do it. And they're like, no, I don't want to do the extra three hours' worth of work. I want all the data. All you can really do is say, you know what? Here's the risk. I have documented it.
I've given you my recommendation. What you do with it is what you want, right? And that's a risk based decision that a business has to take on. And all you can do is provide them with the risks associated with it and hope that they do the right thing. And then be grateful that you don't work for that client as an employee, because they're not doing nice things, right?
So that's the other way to look at it is when your client does sketchy things, at least you don't work there. It's probably not a great selling feature, but you know, it's choice.
Galen Low: It's choice, making decisions, not solely about where are you going to make the most money, but also where you're going to, yeah, have the most valuable impact to users, to humanity?
I don't know, I keep going there.
Mackenzie Dysart: Purpose driven, right?
Galen Low: Privacy superhero.
Mackenzie Dysart: Can I get a cape that just says privacy? Yeah, it's going to be great.
Galen Low: That seems like an Amazon search that, like, could go wrong. Privacy cape.
Mackenzie Dysart: Yeah, that could go horribly, horribly wrong. Yeah. I don't want to know what that's going to give me.
Galen Low: Coming back to that conversation with, clients and organizations who just don't seem to care. I think the other thing is, like, the conversation is being driven as well by things like deadlines, right? We saw it with accessibility. I believe there are some or many on the privacy front, like deadlines to be compliant.
I think, governments regionally understand that this doesn't happen overnight and they can't, make everyone turn on a dime, but it has to happen and these deadlines creep up. But I have definitely been in projects where the client organization has waited till pretty much the last minute where there isn't really enough time to like, get something done to be compliant.
But it brought into sharp relief this notion of like, being too late. You know what I mean? Where it's like, yes, start all these things, yes, they're good, yes, they take time, and no, you don't have to do it right away, but actually you do need to do it at some point, otherwise you're going to, be you know, up the creek or obsolete. Like if there's an organization on the privacy front, who hasn't even started thinking about how to be compliant from a data privacy standpoint, or if they're trying to dodge it until the last minute, are they actually already too late?
If they're listening to this going like, yeah, I should probably start doing that. Is that very concerning or is that sort of a boat that many people are in?
Mackenzie Dysart: It's concerning, but I think it is a boat a lot of folks are in. I think it truly depends on the nature of the data that you have about people too, right?
Because different levels of data, different sensitivities, but it's ultimately like making these changes is hard, especially when you have older systems that weren't built with it in mind. We've talked about this before and I'll hammer it home again. The later you do it, the more expensive it is to fix, and it's just hard sometimes.
And this isn't necessarily a choice, right? You might have built something 10 years ago when we didn't have these laws. So it's not anybody's fault at that point. But now you're looking at, I'll use GDPR as an example, Data Subject Access Requests, or DSARs. That's your right to find out all the data that a company has about you.
You can also ask to have it rectified if it's incorrect. And it has to be readable: you have to give it to the user in a machine-readable format, so usually it's a CSV or something, but they also have to be able to understand the data as a human. And then there's the right to deletion, which is the removal of all your data from their system.
That's a tricky process for any company that's been around for 10, 15, 20 years, that's been collecting all this data about people, and that has systems that aren't necessarily connected well. Right? So there's a lot in play. I don't remember exactly when that became effective, but a lot of companies are still trying to become compliant with it because it's hard.
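To make the mechanics concrete: a DSAR export essentially has to gather a user's data from every system that holds it and flatten it into one machine-readable file a human can also read. This is a minimal sketch; the system names, fields, and fetcher functions are hypothetical stand-ins for real backends:

```python
import csv
import io

# Hypothetical per-system fetchers; in practice each would query a real backend.
def fetch_crm(user_id):     return {"name": "Jane Doe", "email": "jane@example.com"}
def fetch_orders(user_id):  return {"order_count": 12, "last_order": "2024-01-15"}
def fetch_support(user_id): return {"tickets": 3}

# The hard part in real life: knowing every system that holds user data.
SYSTEMS = {"crm": fetch_crm, "orders": fetch_orders, "support": fetch_support}

def build_dsar_export(user_id: str) -> str:
    """Aggregate one user's data from every known system into a single CSV
    of (system, field, value) rows: machine-readable, but human-scannable."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["system", "field", "value"])
    for system, fetch in SYSTEMS.items():
        for field_name, value in fetch(user_id).items():
            writer.writerow([system, field_name, str(value)])
    return buf.getvalue()

print(build_dsar_export("user-123"))
```

The sketch also shows why the "one dev team's unmapped extra tool" problem Mackenzie mentions is so dangerous: any system missing from that registry is silently missing from every export and every deletion.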
It's heavy. It's not as simple as we would like it to be because we use all these different tools. And sometimes there's that one dev team that decided to use an extra tool that nobody knows about. It's not mapped and documented, but it's housing some user data. So it's hard. And I think we need to give ourselves grace in understanding that, but we need to find ways to chip away at that because it's really easy to look at that and go, Oh my gosh, this is so daunting.
I don't even know where to begin. I'm so like, you get overwhelmed, analysis paralysis, and then you just ostrich your head in the sand and you're like, it's not happening. It's fine. That's not the way forward. The way forward is to figure it out bit by bit, make small incremental changes, and make those improvements.
For the most part, the governing bodies, the regulators, are going to be on your side. If you can prove that you have a plan to get yourself into a better place in the next 6, 12, 24, 36 months to figure out all your systems, that's a better place. You have to have that plan. If you don't have that plan when the regulators come, then it's like, oh, you weren't even trying.
That's an entirely different conversation, and that's going to open up litigation really differently. And then there's that concept of ambulance-chasing lawyers. They exist in privacy. I can't remember their names, but there's one guy specifically who, anytime a new privacy law comes out, immediately submits lawsuits against Meta, Apple, all the big names. Because he knows they're probably not compliant on day one, but they're big companies with huge legal teams, and they deal with it regularly. But there are new laws coming out every six months or so, probably even more often than that in the U.S. right now, because each state is coming up with its own thing.
They're not all on the same timeline. Some of them are similar, but slightly different, because why would we be consistent? So it's hard, and the governing bodies, the regulators, understand that and are a bit more understanding, but the private rights of action are really complicated. There is one law called the VPPA, the Video Privacy Protection Act. It exists from, I want to say, the eighties. And the intent is that your video rental history cannot be shared with another person without your consent. There's a reason: with certain types of videos, people might not want other people to know they're watching them.
That was the intent behind this law, back in the Blockbuster days. I'm sure there's a bunch of people watching or listening to this who don't even know what I'm talking about.
Galen Low: What do you mean rent a video?
Mackenzie Dysart: Rent a video? Rent a VHS. We're not even talking DVDs. We're talking VHS at this point.
Galen Low: The Didn't Rewind Legislation Act of 1981.
Mackenzie Dysart: Right. But some lawyers figured out that that law could apply to pixels that track the videos people are watching on Facebook and on YouTube. And with that sort of tracking happening, I can't remember who exactly, but a bunch of different companies actually got sued under the VPPA because they hadn't disclosed that your streaming history, or your video watching history, was being shared with other systems.
And so that's another aspect of privacy: there are laws that weren't created for the digital age that can be adjusted and pivoted to the new world, the new age, and the new technology that we have. So on top of new laws coming out, there are pre-existing ones that we hadn't even thought could apply.
And so this was, I think, all those lawsuits were last summer, but it was really interesting to think about, right? Because I wouldn't have thought of, like, what I watched on Netflix being, like, of concern. Like, anyone in my household can see my watching history. So, yeah.
Galen Low: It's fascinating. It's just, again, it's fascinating the sort of, like, application of law to things that are new.
It's also fascinating that, yeah, actually, like, privacy, in that decade was like, who can see my video rental history? And now it's you know, can my ex spouse track my whereabouts because they have access to the car that we bought together years ago? Like, the conversation has changed.
I'm happy that the conversation is changing. The awareness is, rising and the action is being taken to make sure that we are not making silly decisions or leaving these things unchecked because they could go to pretty dangerous places.
Mackenzie Dysart: Yeah, I would say the one thing that I never considered the implications of from like a privacy perspective that I will now probably never do or very selectively do is contact syncing with an app.
And this is where every digital product person is going to hate me, because it's a very great feature. But the concept is: I sync my contacts with an app that you, Galen, have nothing to do with. All of a sudden, it's got Galen's name, phone number, email, all of your data, without your consent, without you being involved with that app at all.
And Meta's actually got a great section of their support content about how they get that data and how you can have it removed, but not a lot of companies are as sophisticated as Meta or have those systems. But it's the one thing that I never really thought about. Syncing contacts to any Bluetooth device?
Why does that car now have all of your contacts, always? This is the one that got under my skin. If there's anything that I've been like, oh, that's, yeah, contact syncing. That's the one you'll think about now forever.
Galen Low: Oh my goodness, yes. Nightmare fuel.
Mackenzie Dysart: Yep, sorry.
Galen Low: I don't, but I don't even know how, like, it's just like, my information is out there in a Bluetooth device.
Mackenzie Dysart: All the places your name exists that you didn't consent to, yeah, your name is in who knows how many cars.
Galen Low: Mackenzie, this has been super fascinating. I've learned a ton. I think our listeners have as well. We could probably fill a book with some of this stuff. Maybe we ought to. Maybe we ought to do a part two.
But honestly, there's just so much good stuff there in terms of questions to ask, things to be aware of, and the role that we play as digital project managers. Or if you're listening and you're not a digital project manager, just folks involved in digital stuff, there is a role to be played in what is not just a box-ticking exercise, but actually a noble cause.
It's a greater good. It takes some convincing sometimes, but it's not a difficult argument when you're talking about the safety of children's personal information, for example, or lawsuits, right?
Mackenzie Dysart: Exactly.
Galen Low: That's an easy one, too.
Mackenzie Dysart: Think of the children.
Galen Low: Think of the children and not giving out 4% of your global revenue.
Mackenzie, thank you so much again for spending the time with me today. This has been a lot of fun.
Mackenzie Dysart: Thanks for having me. This was great.
Galen Low: All right folks, there you have it. As always, if you'd like to join the conversation with over a thousand like-minded project management champions, head on over to thedigitalprojectmanager.com/membership to learn more. And if you liked what you heard today, please subscribe and stay in touch on thedigitalprojectmanager.com.
Until next time, thanks for listening.