In digital marketing we get so fixated on conversion rates and exponential growth curves that we often don’t have time to worry about data privacy and data ethics. What are the long-term business impacts of unethical use of personal data?
Join hundreds of practitioners and leaders like you and get episode insights straight to your inbox.
Check out our brands and sponsors page to see if you are a match. We publish conversations with industry leaders to help data practitioners maximise the impact of their work.
Marketing and sales are all about relationship building and trust. But in digital marketing we often forget that behind email lists there are actual people. We get so fixated on conversion rates and exponential growth curves that our teams are forced into corners and have no space to worry about data privacy and data ethics. But what are the long-term business impacts of unethical use of personal data? What should brands do to build trust at scale? Today I learn about the future of digital marketing from my guest Stéphane Hamel.
Stéphane is a seasoned independent consultant and distinguished thought leader in the field of digital marketing and analytics. His primary interest is in the ethical use of data, especially in the context of digital marketing. Stéphane advises global clients, agencies, start-ups, and vendors on the ethical use of data. He is an investor and advisor to a number of privacy-tech startups and teaches a course on marketing & analytics for MBA students.
You can follow Stéphane on LinkedIn: https://www.linkedin.com/in/shamel/
Your ideas help us create useful and relevant content. Send a private message or rate the show on Apple Podcasts or Spotify!
Loris Marini: Those of you who follow me know that we’ve touched on many, many topics in the last 30-odd episodes, but we've rarely talked about ethics or privacy, and I’ve been trying to reflect on why that is.
Part of the reason is that I felt imposter syndrome. I didn't feel I knew enough to have a meaningful conversation on the topic. Spoiler alert: I still feel that way, which tells you one reason why I decided to do it anyway. The second reason is that, to be honest, I had never found someone with the right experience who could weigh in on the topic.
Just to put the conversation in a bit of context, what my guest and I are hoping to achieve with this is to really zoom into that moment where ethics and privacy conversations meet practical business needs. When you work in data teams, arguably the first application of any database or data warehouse, data mart, whatever you want to call it, is marketing. It's generally accepted that predictive analytics has a sizeable return on investment when you apply it to connecting people with products.
Marketing is the focus today, but it's not just marketing. It's about how we ethically use data assets to improve the key performance indicators we track in marketing to help the business as a whole. Doing it in a way that is respectful of people's data and people's privacy. My guest today is Stéphane Hamel.
Stéphane is a seasoned independent consultant and distinguished thought leader in the field of digital marketing and analytics. He advises global clients, agencies, startups, and vendors in the digital analytics space on the ethical use of data and how to grow a data-informed culture, which I'm super excited about. Right now, he's interested in the ethical use of data for marketing purposes and is teaching marketing and analytics to MBA students. He's an investor and advisor to a number of privacy-tech startups, so he's really got that view on what's happening, what's hot in the industry right now.
Stéphane, I'm super excited to have you here. Thank you for taking the time. Thank you for being with me.
Stéphane Hamel: Thank you for inviting me. I'm always delighted to share my thoughts and experience and the way I see things in this field.
Loris Marini: Awesome. Let’s start from the basics. Give me your intro to the world of privacy ethics and what's happening. What has happened in the last 15, 20 years?
Stéphane Hamel: My initial background was in software engineering. Back in 1987, my internship was all about using healthcare data to analyze the different types of care being offered to patients in the hospital. Even at the time, we had very strict rules about what we could and couldn't do with the data.
The names, the age, the gender, and all kinds of attributes were available. We could see that on Monday at 3:00 PM this person had, I don't know, a bath, for example. Even though we had all of that information, we had very strict rules about what we could or could not do in that research project.
I think someone who's grown up with a computer science or data science background will appreciate that. When the web came out, the natural thing that happened was a replication of what was known in the marketing environment: the advertising model of newspapers, TV, and radio was transposed to the web by default.
Naturally, we started to see banners and advertising on the web, and the model was the one from newspapers and TV, where you view some ads in order to get something for free, which is the content that you want. That's what eventually led us to where we stand today, where the whole economy of the web, and probably the internet itself, stems from the idea that in order to get something, you need to see some ads and you need to sacrifice some data. Except that the data became so easy to collect and to store.
It's very cheap. You can collect so much information, and that erodes the notion of privacy. What is privacy? The simplest way to see it is what I, as an individual, decide to share or not: what I want to protect because I feel it's a little piece of information I don't want to share. It's not for others to decide what your privacy is. It's up to each individual to decide what they want to keep to themselves, or whether they are willing to say, “Okay, let's tell the world that I have such and such attributes.” It’s very, very personal.
Loris Marini: Yeah, definitely. It's interesting because inside that you spoke about two elements. One is visibility: having people be able to see which of the information they put out there is used. The other is control: having toggles so you can say, "I don't want this, this, and this to be used.”
I want to dive into that gut feeling I had and try to build a more solid foundation to understand: what are the legal and practical implications for the business?
Stéphane Hamel: There are really two ways to see it. Of course, we hear a lot about privacy or data protection. In Europe, there's more emphasis on data protection. In North America, there's more emphasis on the word privacy.
At the end of the day, if you take it from a legal angle for the business, it's all about risk mitigation. How can I make sure that I'm compliant with the GDPR and privacy law and all those aspects? What you think about is, “What do I need to do so that I don't get into any trouble as a brand?” You go, “Oh, I need to ask for consent,” or, “I need to ask for cookie consent,” and, “I need to do this and name a DPO,” and all that stuff.
It's all about risk mitigation for the business. As marketers, we like to say that we are customer-centric because we want to build trust with our customers. That's what marketing is all about, right? It's about building trust so the customer understands your value proposition. They embrace it and they say, “Oh, I like you so much. I'm going to buy your product or subscribe to your service,” and so on. When you think of a more ethical approach to it, what you constantly ask yourself is exactly what you said earlier, “Am I doing the right thing for my customers? If I was in the shoes of my customers, would I appreciate what the brand is doing for me?”
Would I say, “Oh, wait a minute, why do you ask for my birthday when you don't need it?” That's a funny one, because how many times do you go on a website and need to fill in all kinds of information just to get a so-called white paper, which in reality is just a marketing piece? There's no research, there's no scientific background. It's only called a white paper because it sounds better.
It's just a marketing piece, and we know perfectly well they are asking for all of that information because they want our contact details. They want to know who we are. But I'm sorry, if you want me to read your marketing piece, give it to me. If I like it, I will know how to reach you, and I will be a much better prospect. Instead, what do we do? We just fill those forms with crap. I'm always nobody living nowhere, and my phone number is 1-1-1-1-1-1. What's the marketing value of that? There is absolutely no value.
I think we need to rethink the idea that in order to do good marketing, we actually need more data. We don't need more data. What we need is better data. In order to do that, we need to build that trusting relationship with those prospective customers at the right time. If they want to reach us, then yes, they will know how to do it and we will be in a better position to serve them.
Loris Marini: Yeah, definitely. That resonates with me. As you say, it's trust. It's such a simple human thing. We know how our real-world relationships work. It's all about being there for each other and demonstrating upfront why someone should talk to us.
Stéphane Hamel: The funny thing about ethics versus the law is that the law is a set of rules. You have to follow the rules, otherwise you get in trouble, but ethics is about values. It's about personal values and, by extension, brand values too. When we interact with a brand, there needs to be a match between our personal values and the values exposed by the brand.
It's very personal. It's very cultural also, because we see differences. From a legal standpoint, of course, we see differences between the laws in the U.S., Europe, Canada, Australia, and so on. But even from a cultural standpoint, our acceptance of some marketing practices is very different. In the U.S., it’s much more, I would say, relaxed, which is not a good thing. In Europe, it's stricter; maybe it's getting in the way of doing business. That's one thing I hear very often: “Yeah, fine, doing it ethically is the right way, but all of my competitors are not doing it. I'm going to lose my edge.”
In the long run, I don't think so, because again, it's a matter of values. In the long run, the brands that are able to demonstrate that they care about the data they collect and about their customers are the ones who are going to win. Saying “I'm following the law, I checked all the boxes” is one thing, but brand values relating to ethics, that's something different.
Loris Marini: Let's dive into that ideal state of privacy and ethics for you. As someone that has that expertise, what should brands do in that regard?
Stéphane Hamel: When you think about it, the GDPR is almost like the baseline of what we should be doing, right? It's all about transparency. Don't be afraid of disclosing what you're doing in a language that makes sense: not necessarily legal terminology, but language that your customers are going to understand.
Transparency, consent, control: super, super important. I want to have control over what I give you. Just this morning I was filling out a form that asked for my birthday, my zip code, my phone number, my email. Why do you need that? All of those fields were required. It's like, “Okay, screw that. I'm not doing it. I'm just moving on to your competitor, who maybe is doing it right.”
Control, I think, is also super important. If I go back to what I said earlier about the early days of the web, the default model was, “Okay, we're going to show you ads, we're going to collect data, and you're going to get something for free.” That's a value proposition, but we were never really asked about it. Imagine how the web would be today if, from the start, each and every one of us had had control over the data we want to share with brands, instead of brands deciding for us what they want to know about us. If we were in control, that's the zero-party data we hear a lot about these days.
Loris Marini: Yeah. Fundamental keywords, I think you mentioned a few, but you said long-term. That is something that everyone in data management struggles with, because we seem to be living in this world where all that matters is short-term goals. Nobody seems to be looking at the horizon in two, three, five, 10 years.
How do we talk to the business about this? How do I demonstrate that there is a product-market fit and people want us? It can be a tricky thing to navigate.
Stéphane Hamel: One of the interesting aspects is that we're moving away from a world where the general idea was, “We're going to collect everything just in case we need it.” Now the party is over: regulations are coming in and saying, “Okay, stop that.” When you ask for a piece of data, you need to be able to demonstrate why you actually collected it and how you want to use it.
I don't think it's a bad thing. It's funny, because I was so active in the digital analytics community for 20 years, advocating for digital analytics and so on. Now that I'm talking about privacy, people think that I'm against tracking, for example, or against data collection. I'm not against it. On the contrary, I’m for it, but re-imagined in a way that is much, much more respectful, much more customer-centric than what was done until now.
How do you convince your managers that it's fine to ask for consent, and fine if only 30% of people say, “Okay, yes, I agree, you can track me”? There's the legal aspect: if you don't do it, you can be fined. Usually that's a pretty strong incentive. Then there's the other aspect, which is the cost of a data leak. And there's what we hear a lot: “Oh, third-party cookies are disappearing.” Some people call it the cookie apocalypse. I laugh at that. There's no apocalypse.
I've been using Brave. My primary browser is Brave. It blocks third-party cookies. Nothing exploded. I can still use the web, I can still navigate the internet, it still works. There's no apocalypse. Those for whom it will be an apocalypse, maybe they need to look at what they are doing. If losing third-party cookies is so detrimental to what they are doing, maybe they're not doing the right thing.
So there's the legal side: the fines and all those aspects. Then there's the risk of a data leak. Obviously, the more information you collect, the riskier a leak becomes. Now we're talking about security and all those aspects, but what is the cost of a data leak? It can be super expensive. I read that the average cost was $4 million to $5 million, sometimes $9 million, depending obviously on the type of business and how big the leak is. On average, it was taking almost a year, around 280 days, to recover from a data leak. That's another pretty strong incentive.
The third one is the cost in terms of brand value in the long run. If a business has a data leak, how many customers are they going to lose? How many new customers won't they get, because people will go, “Oh, wait a minute, this business had a data leak. I don't trust them anymore”? They will go somewhere else, and maybe that somewhere else won't be any better. The point is that there's a cost to a bad reputation in terms of data management, and that cost is really hard to quantify. I read that something like 70% of people would abandon a brand if they don't trust how it uses their data.
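Stéphane's incentives can be framed as a rough expected-cost estimate. The sketch below is back-of-the-envelope only: the breach probability, customer count, and per-customer revenue are invented for illustration, while the direct-cost and abandonment figures echo the averages he quotes above.

```python
# Back-of-the-envelope framing of the breach-cost incentive discussed
# above. The breach probability and customer figures are made up for
# illustration; the cost and churn numbers echo the quoted averages.

def expected_annual_breach_cost(p_breach, direct_cost, customers,
                                revenue_per_customer, churn_rate):
    """Expected yearly cost of a breach: direct cost plus the revenue
    lost from customers who abandon the brand afterwards."""
    lost_revenue = customers * churn_rate * revenue_per_customer
    return p_breach * (direct_cost + lost_revenue)

# Hypothetical mid-size brand: 5% chance of a breach in a given year,
# $4.5M average direct cost, 200k customers worth $50/year each, and
# 70% of them walking away if trust is broken.
cost = expected_annual_breach_cost(0.05, 4_500_000, 200_000, 50, 0.70)
print(f"${cost:,.0f}")  # the fewer records you hold, the smaller this gets
```

Nothing here is a real actuarial model; the point is simply that the hard-to-quantify reputation term can dwarf the direct cleanup cost.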
Loris Marini: Well, I'm definitely in that cohort, for sure. I have very little patience for that, mostly because it connects to the attention economy and the fact that we are overwhelmed today. There's too much of everything. The last thing we need is those automated marketing funnels shooting out content that may not necessarily be useful to us.
Stéphane Hamel: Well, again, we just have to think about how we behave in our personal lives. When we get an email once a year that says, “Oh, it's Black Friday. We really, really love you,” and the rest of the year we don't hear from them, is that building trust? Is that building the brand? Absolutely not.
Maybe sometimes it's about getting back to the basics of what marketing is. Marketing is not a chatbot that says, “Oh, we really like you. You can talk to us,” and then you discover the chatbot is actually crappy and doesn't understand anything, and when you try to reach an actual person, you can't. Tell me how the chatbot is so miraculous that it will solve anything. I'm not saying it's not useful. What I'm saying is we need to be more careful, because it's easy to implement. It's machine learning or AI, or it'll be smart about the data or whatever, but there's still no magic.
It's not the number of tools and widgets you include in your MarTech stack that makes the difference. It's how much you really care about your customers. Caring is not always automated. Sometimes yes, but not always.
Loris Marini: Yeah, I love that. It’s a realization for many, many people, especially looking at the state of digital marketing. I was looking at an interesting report recently, which I should include in the show notes. I'm wondering if anyone in the audience knows of a better or alternative report, because it's always good to double-check with two or three data points.
This one in particular was talking about the effectiveness of advertising, digital advertising in particular: the return on investment has been steadily going down. You can come up with all the strategies you want, but we shouldn't forget that on the other side is a human being. We're very good at telling whether someone has genuine intentions or is just trying to get that, I think they call it a lead magnet, if I'm correct, in the industry. I hate and love the term. I love it because it really explains what's happening: it's a magnet for people to provide their details. I hate it because, as you said, 90% of the time there's no real value behind it.
Maybe we should dive into the tech for a moment. What are you seeing in the space in terms of privacy tech startups and players that are trying to change the status quo?
Stéphane Hamel: Yeah, there's a lot of innovation going on, definitely. Part of that is because of regulations like the GDPR: there are a lot of solutions emerging to solve specific issues related to the legal aspect. Those are important, of course, but what I like to look at are things that are maybe more challenging or more ambitious. One of the startups that I'm involved with is called Caden, in New York.
They are working on zero-party data, but with a slight difference from the definition Forrester gave a few years ago. When Forrester defined zero-party data, they created the term: it's data that is willingly given by the consumer to a brand. But they forgot the most important aspect: who has control over this data?
Even if I'm willingly giving you my email address and so on, if the brand then runs away with it and I don't have any control, that’s still first-party data. Zero-party data, to me, means that as the customer, I keep total control over what I want to share, with which brands, for which purpose, for how long, and so on.
It's a huge challenge, because that's not how the internet and the web evolved over the years. Shifting the power back to the customer is probably the biggest change that can happen, because it means you need authentication to know: is this the real person?
You need authentication, you need security, you need control, you need transparency. In such a scenario, if I'm the customer, I might say, “Oh yes, I like this brand. They're asking politely and giving me the right value proposition. I'm willing to give them my email, my name, my address, my phone, my date of birth, gender, and all kinds of interests, because I trust the brand.” But I still retain control. If one day I decide that I don't want to do business with them anymore, I can pull that back and say, “I'm sorry, but for whatever reason, I don't want to be a customer of your brand anymore.”
Loris Marini: We need to break up.
Stéphane Hamel: It's a breakup. Yeah. Again, it goes back to building trust and managing the value for the brand.
Imagine the value of this data. The person who willingly gave it to you has absolutely no reason to put crap in there, because they retain control over it. The value of the data is going to be much, much better than when you try to force people, because what we hear is, “Oh, you need to collect more first-party data. Let's find all kinds of ways to collect more of it.”
Loris Marini: It’s a race to the bottom.
Stéphane Hamel: Doing contests or whatever just to get some crappy data.
Loris Marini: Yeah, it goes back to measuring the wrong thing, right? We're so fixated on numbers. We want a big number to show that we're doing our job, but the reality is that trust cannot really be encapsulated in a number. It's not that by driving that number up you increase trust. The causal chain goes the other way: you first build trust, and then the numbers go up, not the other way around.
With zero-party data, the idea is basically that we are in complete control of our personal information. But from a technical standpoint, how does it work? Is it another tool? Do we create our identity in one central database that ticks all the boxes, federate it, and then sort of link that up?
Stéphane Hamel: Yeah, there are a couple of important points. One is the authentication aspect. When you go to a website, typically you either provide your email and a password, or you authenticate with one of the big players like LinkedIn, Microsoft, Google, Twitter, or Facebook. Why does it have to be that way? What if the authentication mechanism were a trusted escrow, for example, whose mission is really to protect your authentication? If you use something like 1Password, we know there are ways to have secure data that sits encrypted on your device while also being synchronized somewhere in the cloud that is also secured, and they don't even have the password. They simply have no means; it's impossible for them to decrypt what is there.
To use Caden’s terminology, they call it the vault. You have your own vault, a bit like a bank. You have the control, you have the key to it, and you decide what you want to share. But it needs to be done in a way that is transparent enough that there's no difficulty, no challenge using it on your phone or your desktop.
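The vault model described here, per-brand, per-purpose, time-limited grants that the customer can revoke at will, can be sketched as a small data structure. This is an illustrative sketch under those assumptions, not Caden's actual design; the class, method, and brand names are all invented.

```python
# Illustrative sketch of a customer-controlled "vault" in the spirit of
# the zero-party-data model described above: the customer grants access
# per brand, per purpose, and for a limited time, and can revoke it at
# any point. All names are invented; this is not a real product's API.
import time

class Vault:
    def __init__(self):
        self._data = {}     # attribute -> value, held by the customer
        self._grants = {}   # (brand, attribute) -> (purpose, expires_at)

    def store(self, attribute, value):
        self._data[attribute] = value

    def grant(self, brand, attribute, purpose, ttl_seconds):
        self._grants[(brand, attribute)] = (purpose, time.time() + ttl_seconds)

    def revoke(self, brand, attribute):
        # The "breakup": the customer pulls access back unilaterally.
        self._grants.pop((brand, attribute), None)

    def read(self, brand, attribute, purpose):
        # A brand only sees data for the purpose and window it was granted.
        grant = self._grants.get((brand, attribute))
        if grant is None:
            raise PermissionError("no grant for this brand and attribute")
        granted_purpose, expires_at = grant
        if purpose != granted_purpose or time.time() > expires_at:
            raise PermissionError("grant expired or purpose mismatch")
        return self._data[attribute]

vault = Vault()
vault.store("email", "me@example.com")
vault.grant("acme-shoes", "email", purpose="order-updates", ttl_seconds=3600)
print(vault.read("acme-shoes", "email", purpose="order-updates"))
vault.revoke("acme-shoes", "email")  # after this, reads raise PermissionError
```

The design point is that the brand never holds the record; it asks the vault each time, so revocation is immediate rather than a deletion request.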
Loris Marini: Frictionless, as well.
Stéphane Hamel: Frictionless, user-friendly. You can put all that on a blockchain and so on to make sure that it's secured, and you can follow the data so that it doesn't get replicated and doesn't get reused in a context that you didn't agree to.
Loris Marini: I love that. In the conversation with Doug Laney on Infonomics, we talked about the non-depletable aspect of data assets. With digital assets, we can make copies of pictures, we can make copies of data sets, and unless we take the necessary precautions, nobody can tell whether it's A or B, which one’s the original.
Stéphane Hamel: I think there is a more interesting way to think about NFTs than simply an exclusive picture that you're the only one in the world who has the rights to. What if NFTs were attached to the data that you share, so that it cannot be replicated or used in another context that you didn't agree to?
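The idea of following shared data so it cannot be reused outside the agreed context can be illustrated without a full NFT or blockchain stack. The sketch below is a hypothetical simplification: the customer's vault signs a fingerprint of the data together with the brand and purpose, so any use outside that agreement fails verification. Key management is drastically simplified and every name is invented.

```python
# Minimal sketch of purpose-bound data provenance, in the spirit of
# attaching a token to shared data: the customer's side signs a
# fingerprint of (data, brand, purpose), and any later use for a
# different purpose, brand, or altered record fails verification.
# Key handling is simplified; a real system needs far more than this.
import hashlib
import hmac

VAULT_KEY = b"customer-held secret key"  # hypothetical; kept by the customer

def issue_token(data: bytes, brand: str, purpose: str) -> str:
    fingerprint = hashlib.sha256(data).hexdigest()
    message = f"{fingerprint}|{brand}|{purpose}".encode()
    return hmac.new(VAULT_KEY, message, hashlib.sha256).hexdigest()

def verify_use(data: bytes, brand: str, purpose: str, token: str) -> bool:
    return hmac.compare_digest(issue_token(data, brand, purpose), token)

record = b"email=me@example.com"
token = issue_token(record, "acme-shoes", "order-updates")

print(verify_use(record, "acme-shoes", "order-updates", token))  # True
print(verify_use(record, "acme-shoes", "retargeting", token))    # False
```

A ledger or NFT would add public, transferable proof of the grant; the core mechanism, binding data to an agreed context, is just a signature like this.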
I think that it's a big challenge. Of course, Caden is not the only one looking into it, but it's exciting. I don't think I've been excited like that since I've started working 35 years ago.
Loris Marini: Well, it's absolutely good to know. The only use of NFTs that seems to be making it through my information bubble is around getting swag: things and symbols that tell the world you belong to a certain tribe and believe in certain values. That can be really helpful and useful, but surely there's got to be more to it.
Stéphane Hamel: I hope so.
Loris Marini: What are people doing with NFTs? What kind of world can we build if we use that technology in a smart way?
Stéphane Hamel: You asked what businesses can do to embrace a more ethical approach. When you go to a website, there are those famous consent pop-ups. I don't know if people realize that when you say “reject all”, they can still track you. Legally, they can still track you. Why is that? Because they track you without cookies, without any personal information.
I've even seen websites, more than once, with a nice consent dialogue where regardless of what you choose, exactly the same thing happens. Whether you say yes or no, they are still tracking you. On the surface, it's like, “Oh yeah, we care about your privacy.”
Loris Marini: The illusion of agency.
Stéphane Hamel: It's the illusion.
Again, it's totally fake. Is it customer-centric? Is it building trust? Is it good marketing? Absolutely not. And there's ego on top of that.
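By contrast, a consent dialogue that actually means something changes what the tracking code does. Here is a minimal sketch of the honest version, with invented function and category names (this is not any real consent platform's API): events are simply dropped unless their category was explicitly accepted.

```python
# Sketch of a consent gate that actually honors "reject all", in
# contrast with the fake dialogues described above. Function and
# category names are illustrative, not a real consent platform's API.

CONSENT = {"analytics": False, "advertising": False}  # default: rejected

def record_consent(choices):
    """Called when the visitor interacts with the consent dialogue."""
    for category, allowed in choices.items():
        if category in CONSENT:
            CONSENT[category] = bool(allowed)

def track(event, category="analytics"):
    """Send an event only if its category was explicitly accepted."""
    if not CONSENT.get(category, False):
        return None  # honest reject: the event is dropped, not rerouted
    return {"event": event, "category": category}

print(track("page_view"))                         # None: nothing consented yet
record_consent({"analytics": True})
print(track("page_view"))                         # now the event goes through
print(track("ad_click", category="advertising"))  # still None: not accepted
```

The key property is that rejection changes behavior: if saying no and saying yes produce the same network traffic, the dialogue is theater.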
Loris Marini: Yeah. We're back to square one, right? Ethics pays in the long term. Nobody gets a bonus tomorrow because they used data ethically. But the organization, the brand as a whole, the way the brand is perceived, can be enormously impacted by these micro-decisions we take every day.
I want to dive into that. Imagine that we're looking at a company with a marketing function. The business understands the importance of the long term. They're not in the game of exploiting the relationships that make up their business. They want to protect them, nurture them, and eventually convert them, when and if that conversion is supposed to happen, without forcing anyone to do anything.
If that's the vibe in the organization and we're looking at the marketing function, what do we need as data leaders? What kind of initiatives can we take to ensure that people across the organization understand the impact?
Stéphane Hamel: There are some well-established financial practices governed by law in each state. There are certain things in accounting, for example, that you simply cannot do, right? If you do them, eventually, in the short term or the long term, it's going to be known and it's going to hurt your business. People have learned, through education and through slaps on the wrist, that there are certain things that shouldn't be done. Why would data be any different? We're going to learn over time what is acceptable and what is not.
One of the stories I like to tell: when I speak at conferences, I ask the audience, who would like to work with 30 million or 50 million detailed customer records, with all kinds of attributes? Of course, everybody I ask finds it super exciting. Who wouldn’t want to use this data to create a wonderful campaign for selling shoes? Marketers will say, “Oh yes, that would be awesome. That's a nice playground.” For data scientists, it's like, “Oh yeah, bring it on. I want the data. It's a nice playground. I will cut my teeth creating awesome models.” Then you realize that it actually works and you can sell more shoes to a certain audience. Who would say no to that?
Well, let me tell you a slightly different way of seeing it. Cambridge Analytica had smart people. They had smart data scientists who were able to create awesome models, and they did marketing that was super-efficient, well targeted, and successful. It's just that instead of selling shoes, they were selling an election. And who wouldn't want to do that?
That's the way it works. What was the real problem with Cambridge Analytica? Was it how they collected all the data? They cheated a little bit here and there, but it was not illegal. Was it because they had smart data scientists? No, actually not. Who wouldn't want a team of data scientists with the right expertise? They had the right amount of data, they ran marketing campaigns, and it worked. So what was the problem? The problem is that people found out about it.
Propaganda is not an issue until people realize that they were the victim of propaganda. Where is the tipping point between doing marketing and doing manipulation, deceit and propaganda? That's the ethical question is, where's the limit? How far can I go and be justified and perfectly legitimate to do it from a marketing standpoint?
Obviously, nobody wakes up in the morning and says, “Oh, I'm going to really exploit people.” Nobody does that in marketing. We think that what we're doing is okay, the right thing to do. Recently, I wrote about an experience I had with a very small agency that uses a questionable marketing tactic. I decided not to give the name of that small business. When I see something good, I give the name; when I see something bad, I don't.
For you as a marketer, I think the easiest ethical question is: if somebody found out about your marketing tactics and talked about them publicly, would you be ashamed? If the answer is yes, don't do it. If you'd be ashamed of your marketing tactics going public, it's not a good marketing tactic. If you purchase email lists to spam people and it gets to the front page of the news, it's not a good marketing tactic.
The other ethical test is: when you do something, imagine your mother is behind your shoulder, watching what you're doing. If she understood it, would she agree? Would she say, “Oh yeah, you're a good boy”?
Loris Marini: So proud of you.
Stéphane Hamel: Yeah, she’s proud of me. If not, it’s not the right thing to do.
Loris Marini: Yeah. I've seen and heard arguments of the type, “Everybody's doing what we're doing.” It's very easy to fall into that trap. Sure, human behavior is very complicated and covers a large range: people kill people, people donate to charity. The tails of that bell curve are pretty long. If you look hard enough, you'll find someone who does anything, so you can justify it that way. Or you can take responsibility, which is what data leaders should do.
Stéphane Hamel: Yeah. The objections we hear most of the time are, “Everybody's doing it,” and, “They already know everything.” “They” being, I don't know, some spirit somewhere that knows everything. It's God. Then there's “I have nothing to hide,” from the customer standpoint. I hear that very often: “Oh, I have nothing to hide.”
A few days ago, someone said, “I have nothing to hide.” It took me 20 minutes, and then I sent him a personal message saying, “Well, you live at this address. Here's a picture of your house. I know you live with that person.” All of that information was public. Do you still think you have nothing to hide? It gets pretty scary, and all of it was done in less than 20 minutes using just publicly available information.
One analogy I find pretty interesting, it's a little bit crude: privacy is when you go to the bathroom. Everyone knows what you're doing, but you still close the door. It's not true that you have nothing to hide. You always have something to hide. It's impossible for everything to be disclosed publicly to whoever.
The worst, obviously, is people harvesting this data and making money out of it without your knowledge, without your control, without your consent. That's probably the worst thing that can happen. And sure, competitors might be doing it. They might also be falsifying their revenues, playing with their financials and stuff like that. Does that mean you're going to do it? Absolutely not.
Loris Marini: Yeah. I don't know. I wish I had more faith in the goodwill and good intentions of people, to just expect that with the right education the problem can be mitigated. I am absolutely sure that literacy is a critical piece of the puzzle, because if you don't even know what you're doing, how can you be expected to even ask the question?
Everything starts from a doubt. When you're doubting what you're doing, that's the beginning of transformation, of change. We need that, but we also need incentives. When the ethics rubber hits the road, that point of contact is incredibly interesting to me from a psychological standpoint, a legal standpoint, and a trust, branding, and marketing angle.
We've got to have the right incentives, right? We need to create a structure where people say, “This is not the right thing to do. My mum would not be proud of me,” but also know, in a tangible, concrete way, that if they keep doing what they're doing, it's going to end badly, and it can only get worse.
Use those arguments to go and educate, perhaps even outline a report and say, “Hey, you asked me to do this. We've got two options here. We can go one way and these are the consequences, short-term or long-term, or we can go the other way. This is the scenario.” Even just painting these scenarios could dramatically increase the probability that someone then takes action. But how do we do that?
Stéphane Hamel: The interesting thing also is that education and literacy and those aspects are super important. There was an article essentially saying people are too dumb or stupid or irresponsible to manage their own data so don't make it available to them. I was like, “What a stupid thing to say.”
The interesting thing is you don't actually need to do that for 100% of the people. If there's a sufficiently large number of people who are educated and know how important it is, and brands gradually embrace more of a customer-centric, data-responsible culture, it will benefit 100% of the people, even those who don't care about it. Even those who say, “I don't have anything to hide. They know everything anyway.” If there are enough people who care about that, it's going to benefit everyone, including the brands. Maybe over a bit of a longer run, but it will be beneficial to everyone. I think we're living in a very interesting time.
Of course, not everybody agrees that having more of an ethical approach is worth it. Maybe they're right, but we've seen that it takes a long time to change corporate culture and to educate people about the importance of data privacy, security, protection, and so on.
Loris Marini: This is actually perhaps one of the cases where I would say technology can make the whole process a lot smoother and faster. I am a bit critical of the tech religion, the people who expect that with the right tool, all the shortcomings of human nature will magically be fixed.
I don't think that's ever going to happen, but it's true that with the right technology, we can change our habits in a more systematic way, or at least track our progress. It's about reeducating. When you dive into the data team, it gets more complicated because you have historical data and your responsibility is to decide what you want to do with it. Do you want to, I don't know, anonymize it? Who's going to consume it?
The closer you are to the source, the higher the responsibility, because then the tree branches out. If you're looking at a master data set in master data management, that by definition is going to be data used by pretty much every single process and person within the enterprise. You're looking at an address. You want to anonymize it. Great, but how do you do it? Can you reverse it? Can you track who's using it?
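[Editor's note: the reversibility and access-tracking questions raised here are commonly answered with tokenization via a "token vault". The sketch below is a minimal, hypothetical illustration in Python, not a production design: the class name, in-memory storage, and audit format are all assumptions for illustration.]

```python
import secrets
from datetime import datetime, timezone

class TokenVault:
    """Pseudonymize PII with random tokens, keep a reversible
    mapping, and log every de-identification request."""

    def __init__(self):
        self._forward = {}   # original value -> token
        self._reverse = {}   # token -> original value
        self.audit_log = []  # (requester, token, timestamp) entries

    def tokenize(self, value: str) -> str:
        # Random tokens (unlike plain hashes) cannot be guessed
        # by hashing candidate values and comparing.
        if value not in self._forward:
            token = secrets.token_hex(8)
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def detokenize(self, token: str, requester: str) -> str:
        # Reversal is possible, but only through the vault,
        # and every request leaves an audit trail.
        self.audit_log.append((requester, token, datetime.now(timezone.utc)))
        return self._reverse[token]

vault = TokenVault()
t = vault.tokenize("42 Example Street")        # hypothetical address
assert vault.tokenize("42 Example Street") == t  # stable per value
original = vault.detokenize(t, requester="analyst_01")
```

This answers all three questions in one structure: the token replaces the address downstream, the vault alone can reverse it, and the audit log shows who asked.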
Stéphane Hamel: Sometimes it's very simple things. Throughout the years, when I coached agencies and worked with clients on digital analytics and so on, sometimes they were giving me full admin access when I didn't need it. With the advent of the GDPR, people quickly understood that. No, you don't give admin access to anyone. No, you don't send customer records with all the personal information left and right to a bunch of people through email. Those are simple things that anyone can easily understand.
I think we're going to get there. It doesn't have to be super complex. It's just elevating the bar so that we're more aware of the value of the data we have and the risks that come with it. I think it's going to get better.
Loris Marini: Yeah. I'm a relentless optimist too. I just think that this is something we really need to focus on. It's going to be an interesting one for sure.
Stéphane Hamel: Yeah. I think we're going to be busy in the coming years. That's for sure.
Loris Marini: Well, in terms of sources, websites, or communities: where can someone who resonates with the things we talked about today go to learn more and become an agent of change within their organization?
Stéphane Hamel: I think that's a good question, because again, we're at the intersection of marketing, IT, and legal. Overlapping those three aspects is pretty difficult, but I think one starting point might be the IAPP. From a legal perspective, they cover a vast array of information. IAPP, it's the International Association of Privacy Professionals. They have training. They have conferences and articles and all kinds of stuff. That's a really good starting point.
It goes in different directions. One that I like a lot, The Wired Wig. It's funny because it was started by Annabel Pemberton. She's a lawyer and she gets the technology. She provides the overlap between the two. She has interesting conversations about the overlap of the legal aspects and the technical aspects. That's a good one also.
Maybe you can share that with your audience. There was a nice Twitter list with a bunch of interesting accounts to follow. A lot of them talk about the legal aspects. When you're talking about IT, oftentimes it goes into the security aspects.
There's not that many people talking about it, I think, not enough.
Loris Marini: Cool. Now that we are slowly approaching the end of our hour, I wanted to ask you two broad questions. The first one is, what would you like to see more of in the industry? The second, what would you like to see less of?
Stéphane Hamel: Yeah. I think I can probably answer both of those questions using a term that I've started to use that simplifies everything. I use the hashtag #noconsentnotracking, especially in the marketing aspects and the digital analytics aspect to mean that before you implement something to collect data, think about the transparency, consent and control. Just start with that instead of rushing to install the nicest, shiny object that just came out. Step back and think about it.
If we think about that, we're going to see more marketers raising the flag when they feel that something might be wrong, and going beyond just legal compliance. We're going to see less marketing behavior that is questionable or not aligned with our own personal values. I think that answers both sides of the question.
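[Editor's note: the "#noconsentnotracking" principle Stéphane describes, checking consent before any data collection runs, can be sketched in code. The snippet below is a minimal, hypothetical illustration in Python; the registry, purpose names, and decorator are illustrative assumptions, not any real analytics API.]

```python
from functools import wraps

class ConsentRegistry:
    """Hypothetical store mapping user IDs to the purposes
    they have explicitly consented to."""

    def __init__(self):
        self._grants = {}  # user_id -> set of consented purposes

    def grant(self, user_id: str, purpose: str) -> None:
        self._grants.setdefault(user_id, set()).add(purpose)

    def has_consent(self, user_id: str, purpose: str) -> bool:
        return purpose in self._grants.get(user_id, set())

def requires_consent(registry: ConsentRegistry, purpose: str):
    """Decorator: if consent is missing, the event is simply
    never collected -- no consent, no tracking."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(user_id, *args, **kwargs):
            if not registry.has_consent(user_id, purpose):
                return None
            return fn(user_id, *args, **kwargs)
        return wrapper
    return decorator

registry = ConsentRegistry()

@requires_consent(registry, "analytics")
def track_page_view(user_id, page):
    return {"user": user_id, "page": page}

assert track_page_view("u1", "/home") is None      # no consent yet
registry.grant("u1", "analytics")
assert track_page_view("u1", "/home") is not None  # consent granted
```

The point of the pattern is ordering: the consent check is the gate in front of collection, rather than an opt-out applied after the data already exists.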
Loris Marini: Setting the intention. I love that. I absolutely love that. Perhaps before we close: I just remembered that recently, with Debbie Botha, we talked about privacy. I didn't do any extended research on the topic, but I'm wondering, because not everybody understands this the same way. We need diversity of thought, but some people have kind of lost any hope that this, the problem of control and transparency, can indeed be solved. Others demand it, like you and me having this conversation. In your experience, what does the demographic look like? Who are the people that are most receptive to this?
Stéphane Hamel: It's a super wide spectrum, ranging from those who don't care about it. They don't have anything to hide. “They” know everything. It's like, “Okay, game over. It's the end of the world. We cannot do anything about it.” Then there's the other extreme, which says that as soon as you go to a website that uses any third-party resource, you need to have consent. In fact, because there's a cached version of the webpage stored on your device, even that should require consent.
It doesn't make any sense. You won't be able to surf the web and benefit from all the good things there are on it if there are too many roadblocks for the casual user. I hope that I'm somewhere in between: I care enough, but without going to the extremes. We need more people who will see it in maybe slightly different ways, but we need more people to just have a conversation and discuss it.
I think it's going to continue to evolve quickly. I think in a year, two years from now, it's going to be very different. You probably heard that in Europe, Google Analytics would be “illegal” because it's sending IP addresses to the U.S., and the U.S. doesn't offer the same protection that Europe has.
Beyond that simple statement, of course, there are big discussions way above our heads. When you think of it, it's the foundation of the internet: having two devices that can communicate with one another using an IP address. They have to exchange that IP address.
There's no way around it. That's how it works. Saying, “oh, no, you can't do that.” It's like, “okay. Let's close everything and don't ever go on the internet. Stop using all of the technology that we have today”, because that's how it works.
Loris Marini: We don't really have an option. Either we fix it, or we stick with the status quo and let anyone do whatever they want with that data, which is not a good future.
Stéphane Hamel: Nope, exactly.
Loris Marini: Let's go off and fix it.
Stéphane Hamel: It's better to fix it and improve it and understand it.
Loris Marini: Yeah. First step for our listeners is definitely to follow you, I think. Are you active on Twitter as well as LinkedIn?
Stéphane Hamel: A little bit on Twitter, but mostly on LinkedIn. When I publish something, usually I echo it on Twitter also.
Loris Marini: Nice.
Stéphane Hamel: All the channels. On Medium also, but the easiest way, just look me up on LinkedIn.
Loris Marini: It's definitely worthwhile, visiting Stéphane Hamel’s page and hitting that bell if you want to keep learning about these topics, that's for sure. Stéphane, thank you so much for your time. I really enjoyed this conversation and I have a feeling that we're going to do so many more things together in the future.
Stéphane Hamel: Thank you.
Loris Marini: Cheers.