The Enchanted House: a discussion about IoT and privacy

Thibault Schrepel: Welcome to this ALTI special edition with my friend and colleague, Silvia de Conca. Today, we are set to discuss Silvia’s Ph.D., which she defended in June of 2021. I’m delighted to be with you, Silvia. Welcome to this talk!

Silvia de Conca: Thank you, I’m very happy we’re doing this.

Thibault: This is actually the very first ALTI conversation, so we have to set the standard [laugh]. I thought what we could do is first give a sense of what your thesis is all about. I think it would be great for you to explain how it may impact legal scholarship and how it fits into the broader debate regarding privacy and smart homes. So if you could take us through your Ph.D. journey and explain the main idea to start with, that would be great.

Silvia: Yes, of course. My Ph.D. research focused on smart speakers, or voice assistants as everyone calls them: the typical Alexa, Google Home, or even Siri, which are becoming more and more popular in households in North America and Europe. It looked at how these smart speakers affect the way individuals conceptualize their private sphere, how their privacy, their private dimension, is affected by the presence of these talking electronic devices, and also how the GDPR and the proposed e-Privacy Regulation can tackle some undesired effects that might come from the way smart speakers function and collect and process data.

So it all revolves around privacy, data protection, and smart speakers. In terms of contribution, what I’ve done, first of all, is to offer, as far as I know, one of the few complete overviews of smart speakers from the perspective of the application of the GDPR and of some provisions of the incoming e-Privacy Regulation. From the legal perspective, I offer an overview that systematically looks at how the GDPR concretely takes form when we discuss these devices, because the GDPR is technologically neutral: sometimes, the interpretation of certain provisions needs to be adapted to the technological reality. From a non-legal perspective, or what I call a meta-legal perspective, the private sphere and privacy are concepts that originate outside of the law; they are socio-cultural constructs. This means that they are multifaceted, and analyzing them from different disciplines can give different results and put the focus on different aspects. But these aspects are all closely intertwined with personal identity and with our daily lives.

Because tools like the GDPR and the e-Privacy Regulation are legal instruments that protect this socio-cultural construct, and necessarily also must simplify it a little bit, translate it, if you want, into legal language, I have also looked at several disciplines: history, philosophy, psychology and the behavioral sciences, besides, obviously, computer design and the behavioral design of digital products. The goal was to see what attributes, what fundamental understandings, we associate with our home as, you know, the container of private life and the private sphere, and how the smart speaker interacts with these attributes. Then I have discussed the legal part in the light of these non-legal, behavioral, social, cultural, and philosophical concepts, connecting them in a very organic and, if you want, holistic way. That is definitely also an original contribution, because it is not very common in law.

Thibault: That triggers so many questions already. The first one is: when did you start your Ph.D.? Did we have the GDPR at the time? We surely have some Ph.D. candidates listening to us, and this is kind of the ultimate fear, right, that a big chunk of regulation will come in during their Ph.D. and change the entire landscape.

Silvia: I drafted the project and did the preliminary research towards the end of 2014 and through 2015. At that time, only Alexa and the Amazon Echo had been announced, but there were some experimental projects, some of which never saw the light of day; Jibo was the one that attracted a lot of attention. It was a very small niche, it was just beginning, and it wasn’t even clear whether it was going to actually take off or not.

Thibault: You could have chosen to study Google Glass, in which case the Ph.D. would have been of no direct use.

Silvia: Yes, this is important. It was like that in part, and I really, really dug. I looked into market projections for smart speakers. Throughout 2015, there were already several projections saying we might reach something like saturation of the European and North American markets by 2022-2023. And we are kind of going in that direction, slower in Europe than was probably forecasted. If we look outside Western countries, at China or Russia, the numbers are extremely high; the Chinese voice assistant market has three dominant products, one leading and two slightly behind, all voice assistants, for a total of over half a billion devices. So I think that looking at the market projections helped me, because I wasn’t just looking at the hype; I was looking at where the markets were investing.

Thibault: Yeah. And what about the GDPR then, was it in force at the time?

Silvia: It was not yet in force. The draft was being discussed.

Thibault: Already?

Silvia: Yes, it started taking an almost definitive form when I submitted my proposal. So what I did with the proposal was to assume that the GDPR would enter into force during my Ph.D., which is exactly what happened, halfway through it, and I immediately wrote the proposal focusing on the GDPR. I actually had that problem again with the e-Privacy Regulation, because that one was supposed to enter into force almost together with the GDPR, and then it didn’t happen. A more concrete draft of the e-Privacy Regulation was published, I think, two months after I sent my definitive thesis to my committee for evaluation. And I received some criticism about that, because some people who saw my thesis thought, “Why aren’t you focusing on the e-Privacy Directive?” But it was a deliberate choice, because I knew that the Directive would not cover the technological reality of these devices. There were too many uncertainties concerning whether these devices would fall under the Directive’s scope, while I knew that whatever form the Regulation was going to take, its scope would include IoT. So I took a risk. That is probably what a Ph.D. is about.

Thibault: Now, how did you approach the other fields? You are a lawyer by training, so how did you approach sociology and the other sciences? Did you read a lot? Did you interview people?

Silvia: I definitely read a lot. That was also a difficult aspect of my research, but it was the enriching one, and personally, I’m very happy I did it. Luckily enough, there are some publications, some sources, that are considered fundamental; they’re keystones in these disciplines. For example, there is the History of Private Life written by French historians. It’s several volumes, it’s extremely fascinating, and it is the starting point of any research, even for historians. So I moved from the keystones, the important texts, and I found them by talking to colleagues, including colleagues from other disciplines. I wouldn’t necessarily call those interviews, because I didn’t follow a protocol and it wasn’t as structured as, you know, a qualitative interview should be; it was more networking, comparing, and asking experts in those fields. Same for the behavioral sciences and psychology: I used Irwin Altman’s foundational research on privacy from a behavioral perspective. And I got a little more secure and comfortable in my steps when I saw that other people, bigger, more famous, and more authoritative than me, were also using the same authors from other disciplines in their papers (that were coming out while I was writing) or in previous books.

Thibault: Would you say your thesis is relevant outside of Europe?

Silvia: I mostly used a European perspective and European authors, but I also used quite a lot of the North American perspective, because up to a point the disciplines of privacy and data protection intertwine a lot between the two sides of the pond. What I would say is that it is relevant outside of Europe. My idea is that the GDPR is becoming an inspirational standard for a lot of privacy systems outside of Europe, because of the relations that many countries have with the European Union, and because the GDPR has a geographical scope that inevitably extends outside of Europe: if companies want to do business with Europe and want to collect the data of European citizens, they need to comply with the GDPR in one form or another. This has generated quite an interest in the GDPR. Many countries have adequacy decisions, so they have regimes that are similar enough, or they are taking the GDPR as a starting point to develop their own regimes; even in the United States, some states are now looking at the GDPR and the principles contained in it, and integrating them within their tradition. So from this perspective, looking at how the GDPR applies to these technologies certainly becomes relevant. At the same time, there is a cultural limitation, because what falls within the private sphere and what falls within the public sphere, and the acceptable norms for respecting someone’s private sphere, are all culturally connected. We see that even within Europe: coming from Italy and living in the Netherlands, I see that even just the way we use curtains in houses is radically different. And that is a tool: the curtains are a tool, respecting your neighbors’ privacy is the social norm. These are differences. So from that perspective, I do think it can be interesting to look, at a more general level, at a European tradition, but some nations might not necessarily feel that it reflects their own cultural landscape.

Thibault: I’m still unsure about the relationship Dutch people have with their privacy. If you walk around Amsterdam, you can literally look into people’s homes. I thought, well, everything is probably left wide open because people don’t look through the windows. But then I was told that… people actually do look. So, I’m a bit confused. Anyway, going back to your Ph.D., the title is “The Enchanted House”. That’s a wonderful title, very creative. How did you pick it? Did you have any doubts as to whether it was a good idea to have such a creative title?

Silvia: Thank you. Well, if I have to be completely honest, what comes to my mind when I read the title is Beauty and the Beast, because the Beast lives in a castle where all the personnel have been turned into furniture; they move and talk and dance and entertain Belle. I think the title is evocative in this sense, because what we want is a house that we can talk to and that is going to do things for us. This is why the IoT is developing. This is the dream of having a smart home, like the Jetsons’. We still dream of, you know, the robot butler; we still want one. But at the same time, like in every good fairy tale, magic comes at a cost. Sometimes it’s a curse, sometimes the price is quite steep. And it is the same with the smart home, and it is the thing with smart speakers. It does appear like magic to us because, as a famous writer once said, any sufficiently complex technology appears like magic to someone who doesn’t know how it works. But at the same time, it has a cost, and that cost is invisible, or might all of a sudden become visible and then make us uncomfortable. That is what happens with certain data collection inside the home with the IoT, the Internet of Things. So that’s why the enchanted house.

Thibault: That takes me to what you’ve done in the first part of your Ph.D., in which you explore the way privacy has been seen throughout Western history. What are the most surprising things you learned there? How did it impact your legal outcomes?

Silvia: Well, first of all, it’s pretty obvious to all of us today that what was considered private 500 years ago, 200 years ago, or even simply 50 years ago is not the same as what is considered private today; or, as we have just seen, what’s considered private in one country is not the same in another. So the idea that privacy changes within a community and over time is intuitive, if you want.

Thibault: But if I may, it is also really hard to measure, right? This reminds me of the two-fish joke. They’re in the sea, and one says to the other: “Wow, the water is quite chilly today.” And the other fish answers: “But what is water?” Obviously, they have no idea, because they have always lived in it. And the same is true for us with culture and the way we see privacy: we live within it. So it’s really hard to take a step back, which is what the first part of your Ph.D. achieves.

Silvia: No, you’re perfectly right. It’s exactly that; these are things that somehow we recognize when we see them, right? We know intuitively what we want and what it is, and we’ve got that intuitive idea that privacy changes with the context. What has really emerged from looking at the behavioral sciences and philosophy (and also at some philosophy of technology and STS, science and technology studies) is that privacy is part of a negotiation between the individual and the community. And this is not something that we constantly do with intention. There are individual preferences, subjective constructions, subjective understandings of what is private and what should be respected, that somehow create values; at some point, these values come to the surface in a community, in our society, and they also become an objective threshold. So there is this constant tension between the subjective and the communal: the objective dimension, the collective dimension that is recognized, the values, and the clashes of values that are constantly negotiated with the individual.

Definitely one of the things that I found most interesting is connected to this relational aspect of the private-public dimension, and it is how individuals enact their privacy. This is pure behavioral science, Irwin Altman’s theory and that of all the scientists who worked with him: we create a series of mechanisms and actions to have the social norms that implicitly exist in a community respected. One of those actions is territorial: we cut off territory, we put up a fence, we build walls, to protect private life and the private sphere. Another one is purely psychological. It’s called crowding, and it was definitely one of the most interesting things that I discovered in other disciplines. Crowding is the psychological perception that we have less control over our private sphere following the presence of someone else. It is independent of the actual presence: it doesn’t matter whether someone is physically there, or whether we can see them. If we perceive that there was a presence, if we perceive its traces, that makes us more uncomfortable, and this is going to trigger our reactions: we might close the curtains, we might clean the house, we might go around and lock the doors, and so on. So it has a cost, and not only the cost of the curtains and the locks, not only economic: it is costly at the level of mental energy and psychological comfort. And finally, there is a cost in terms of identity, of being comfortable enough to disclose our identity, which is something that you should be able to do in your private sphere, right? You should relax and let your true self out, so that when you go back outside, you have enough energy to interact with the community in a desirable way. So all these things that other disciplines tell the law are, I think, extremely exciting and interesting, because they also tell us what the law should protect.

Thibault: So let me ask you the following question, which relates to the relationship between our perception of privacy and the law; I think it is fascinating. Would you say that we should not enact privacy regulations if they go against the norms, the desires, and the way people perceive privacy? In a sense, would you say that the law should be ahead of society?

Silvia: That is very interesting. And it is a little bit of a chicken-and-egg question for legal scholars and lawyers in general. I do think that the law is an expression of a certain culture at a certain time, like literature and art. In this sense, I think the law is the result of what has been boiling underneath the surface in the community and what has emerged. And technologies actually become very important in a dialectic process. Here I am very much following Habermas, I’m a fan of Habermas: we have the public debate, where all the individual subjective preferences are discussed and evaluated and clash with each other, and then eventually something is synthesized out of this. That becomes the new threshold, the new norm, and it is going to be put into discussion again, in a cycle. It is very dialectic. I believe that this is mostly how the law, and regulatory interventions in general, come to be; and the law is only one of the possible tools we can use to protect an interest that we think deserves protection at a societal level. In this cycle, I think technology becomes very interesting because it can be a catalyst that accelerates the clashes of values that were dormant under the surface. We’ve seen that even with non-digital technologies, with a lot of debatable and difficult choices connected to technology. Take pre-birth screening, or contraception: these technologies already brought to the surface different values that were clashing. From that clash, something new is synthesized, and then the law protects this new creation. Now, I do think that the law necessarily reflects these debates, this constant dialectic. And it doesn’t always mean protecting the majority either, because obviously, by now, based on the human rights framework, we know that a value in a society can also be the value of a minority, for example; or it can mean creating something new that includes both perspectives, at least in part.

Thibault: Yeah. And, you know, I was thinking of going back to Lessig’s framework: we have the law, the norms, the architecture (or technology), and markets. It seems that markets could play an important role. To this day, I am not aware of companies saying, “if you buy this smart speaker, then we will protect your privacy.” Have you seen those market forces playing out and trying to better protect privacy? If not, do you think it will come? Or do you think we will never see it because protecting privacy and making IoT function well are not compatible?

Silvia: There are experiments being carried out, but not really by private companies yet. They’re mostly experiments, research and development from universities, or projects by NGOs that are trying to create home assistants, voice assistants, or smart speakers that are a little more privacy-friendly. One is called Candle, which is in the Netherlands, while Germany has Magnolia, a smaller company, and Magnolia is the name of the assistant they make.

Thibault: So you would say it’s coming.

Silvia: I think there is an attempt, but I don’t think it is coming, because of the way these markets are structured: the winner-takes-all mechanism is very strong, like in many digital products. I don’t see them becoming relevant players outside of a niche of people, maybe like me, who are lazy enough to really want smart speakers but who also work with privacy or are very privacy-aware, and so would not necessarily go for the mainstream ones.

Thibault: Yeah, so your prediction is that it will be more like search engines, where the main players are not super privacy-friendly, and we do have DuckDuckGo and a few others, but they represent about 2% of the market. You would say the same will most likely be true for smart speakers.

Silvia: I think so. And that, I believe, is not just based on my own impression, but is informed by how the technology works, which obviously is another thing outside of the law that I had to dive into and make myself familiar with, and also by the various markets, meaning that I have looked at the business models behind these devices. For example, it’s very clear in the case of Apple, because Apple has a tendency to lock customers and users into its ecosystem of products, and obviously they don’t want to be left out of the smart speaker market, because that might mean some users get outside of their ecosystem through the voice assistant. It’s very clear in the case of Google, because Google works with data and Google Assistant complements their search engine, so it still goes in that direction. As a matter of fact, no one knows what the business model of Alexa is, which I think is incredibly interesting. It was developed by chance: Alexa is what’s left of a previous project that didn’t actually work out, a virtual reality project. They took Alexa, separated it from the virtual reality project, and created this smart speaker; they created Echo, the device. However, it also bridged a little the distance between Google and Amazon with regard to access to data, because Google had incredible access to data thanks to the search engine, which made their AI, their machine learning, better. Now imagine with Alexa: it has bridged that gap, because they have our voice data, so their AI is getting smarter and smarter. So we know it bridges that data gap a little, and we know that somehow it helps with online shopping on the Amazon webshop. But the sale of the device? Even the cost of the device is not covered by its price. So it is not yet clear exactly what business model they’re attaching to it, and that is going to affect a lot how much privacy is protected there.

Thibault: Indeed, and that is where the law comes in, of course. The DMA will play an important part. If it stays as it is, big tech companies won’t be able to take personal data coming from one service, such as the Echo, to improve another service, such as Amazon.com. This will change the business model entirely. As to the GDPR, what are the main takeaways from your research? Are you satisfied with it? What can we improve?

Silvia: Well, in terms of the GDPR, I do think that in general it offers a good threshold of protection. Of course, everything can be improved. And I am taking here the perspective of protecting personal data as a fundamental right, because this is the reality of the law in the European Union. Here personal data are not a commodity, they’re not goods; their protection is a fundamental right, like privacy. So the threshold of protection needs to be very carefully established. In terms of shortcomings, I think there are definitely some things that can be improved in the way the GDPR is interpreted or applied concretely to these devices. One of them is more general and connects to what you were saying about why companies can use the data from the voice recordings of Alexa to improve certain services on the Amazon webshop. Technically, they can, and they’re already doing it within the framework of the GDPR. They’re already doing it because the main actors in this market, which together account for over 75-78% of it, Alexa and Google, have umbrella privacy policies that cover all their services. So whether you’re signing up for the email, the webshop, or the voice assistant, it’s the exact same privacy policy you’re agreeing to. It’s general, and it covers up to 62 services. And these are the two biggest companies in this market, so it is quite wide.

Thibault: And of course, we’ve all been reading all of those privacy terms, right? [laugh].

Silvia: I think it was at the end of 2020 that the French Data Protection Authority sanctioned Google for the way its umbrella policy is formulated, because it’s a website with a lot of short paragraphs and hyperlinks to other pages with more explanation. The explanation is very generic, and it’s very diluted and fragmented across a lot of web pages, almost as if it were some dynamic forum. And I have actually found myself that if you go into the real forums, like the developer support or user support forums, you sometimes find better, more detailed information than in the privacy policy. So the French Data Protection Authority said: this is too fragmented, you’re actually hiding information this way. Plus, you’re mixing together information for over 60 products and services. What can the user possibly understand from this? It’s difficult; it’s not working very well.

Thibault: And did they say this was a problem regarding GDPR?

Silvia: Yes

Thibault: Okay.

Silvia: It is clear from a GDPR perspective, because the GDPR imposes an obligation of transparency on the controller: the entity in charge of the processing must inform the users, so that the users can make an informed choice. That is the starting point, well, to the extent that a simple user can actually make an informed choice about very complex technology, because there’s also that. And there is an episode that really makes you understand the tension created between these devices, the GDPR, the practices of the sector, and the fact that we’re talking about our homes, so a special place. Towards the middle of 2019, some contractors of Amazon (people who were working for another company, which in turn was working for Amazon) and some former employees of Google said: listen, our job was to listen to snippets of conversations recorded by the smart speakers. On some occasions, the smart speaker did not understand the language, so there was a conflict to solve in the natural language processing, which is the function that allows the smart speaker to talk to us and understand what we’re saying. They needed human intervention to solve that conflict, so that the next time the speaker would perform better and understand.

A lot of companies do that when there is a conflict in understanding and they want to improve the service, and the privacy policy clearly says that the data can be used to improve the service. However, these snippets, which were up to 30 seconds long, in some cases recorded fights and possible aggression, like sexual harassment. In some cases they contained names and addresses and, apparently, and this was not fully verified, they were not fully anonymized for some of the contractors. So the contractors had access to the address, down to the house number in some cases; they had the full profile of the user. This, I argue, is not in line with the GDPR, because the GDPR has a provision that says that technical and organizational measures must be taken to minimize the data processing. These people should not have had access to the address: it’s not the minimum amount of data necessary for that processing, and it’s not compliant with data protection by design. However, even if we want to stretch it and say that the addresses are necessary to understand better, for example, and we want to say, “okay, we did inform users that we might use the data to improve the service, and this is what we meant”, this doesn’t remove the fact that, upon learning this, the users were not happy and a scandal erupted. People complained to the newspapers because their private sphere, their home, felt violated; it was an interference. Even if they had some information from the privacy policy, that information was not specific enough: they couldn’t understand the implications of “improving the service”. And even if we assume that we can stretch the GDPR and say that this is compatible, that it’s a sector practice, that does not remove the implications for our private sphere, for our home. So you see here how, with these devices, the tension is inevitable on certain things. The GDPR sets a standard and a threshold, the sector practices might willingly or unwillingly not line up with that threshold, and at stake is the expectation of privacy that citizens have inside their homes.
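
To make the data-minimization point concrete, here is a minimal sketch in Python of what stripping direct identifiers from a snippet before human review could look like. The record fields and function are hypothetical illustrations, not any vendor’s actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class SnippetRecord:
    """A recorded voice snippet as it might sit in a backend (hypothetical schema)."""
    audio_ref: str         # pointer to the 30-second audio clip
    transcript_guess: str  # what the NLP model thought it heard
    user_name: str         # identifying fields a reviewer does not need
    street_address: str
    account_id: str

def minimize_for_review(record: SnippetRecord) -> dict:
    """Keep only what a human reviewer needs to resolve an NLP conflict,
    dropping direct identifiers (data minimization, GDPR Art. 5(1)(c))."""
    return {
        "audio_ref": record.audio_ref,
        "transcript_guess": record.transcript_guess,
        # name, address, and account id are deliberately omitted
    }

record = SnippetRecord("clip/123.wav", "play jazz??", "Jane Doe", "1 Main St", "acct-42")
print(minimize_for_review(record))  # reviewer sees audio + guess, no identity
```

The point is simply that a reviewer can resolve the language conflict without ever seeing who was speaking.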

Thibault: Indeed, and so again, the tension between the norms and the actual law. And eventually, if people expect privacy, then the market will come into play. I was smiling because some members of my family do have such, I won’t give names, but home speakers, and sometimes, when the discussion turns to certain topics, again, I won’t disclose the subject, and it’s nothing criminal, they will talk [whispers] like that, so that, you know, the speakers cannot hear what they have to say. I find it fascinating how, again, norms translate into actual behaviors in a very practical way.

Silvia: If I may, what your family members are doing is one of Altman’s mechanisms, the strategies to reinstate a desired level of privacy. That mechanism looks trivial; they’re just lowering their voice. But lowering your voice in front of a smart TV or a smart fridge or a smart speaker in your own house means that you’re wasting mental energy, and you are losing something in terms of feeling comfortable and emotionally connected to your own home. It has a cost. It’s not trivial.

Thibault: And still, they keep the thing, and they use it.

Silvia: Yes!

Thibault: Again, should the law always follow people’s desires? This brings me to my very final question: what are the main open questions you were left with when finishing your Ph.D.? Are there any questions you will address in the coming months and years, or other questions that researchers should address?

Silvia: I had a couple of open questions at the end of the research, and to be completely honest, those are probably the ones that get me most excited about my research in general. I was very happy that I found those questions. One is connected to the GDPR, and it’s pretty legal, but it also has an element of reflection at a more societal level. The more we live in environments, even familiar environments like our home, that are embedded with these digital technologies (the smart home, or even the smart city), the more we are going to be constantly exposed to a series of optimized, automated decisions made by these devices. We might give some inputs, but their algorithms are going to choose the optimal way to carry out certain tasks or to achieve a certain goal. And they are going to profile us in a certain way.

Now, the GDPR offers protection against these kinds of automated decisions and profiling when they have either legal effects, so they might lock us out of a mortgage, or, you know, lead to our citizenship application or a job being rejected because an algorithm selected our profile, or a “similarly significant effect”, meaning they affect an important interest or an important right, like not being discriminated against. Think of the case of Asian families being quoted a higher price for tutoring based on the stereotype that Asian families want their kids to go to university more; it was happening in the United States a few years ago. Now, take an environment where a lot of teeny tiny, trivial daily decisions are automated and optimized, and optimized also in the interest of the company. Think of price discrimination: a few cents more on the groceries we order through our smart speakers can pile up over the span of a year, or two, or five, and a few cents on every purchase becomes a lot in the end. Or they might profile us as more vulnerable to purchasing certain goods and start serving us more ads for those, which again makes us waste money, or discriminate against us for other services based on what we did inside our house.
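
As a back-of-the-envelope illustration of how those few cents pile up, here is a minimal Python sketch; all numbers are hypothetical:

```python
def cumulative_markup(cents_per_item: float, items_per_week: int, years: int) -> float:
    """Total extra spend in euros, assuming a constant hidden per-item markup."""
    return cents_per_item * items_per_week * 52 * years / 100

# Hypothetical: 5 extra cents on 10 grocery items a week
for years in (1, 2, 5):
    print(f"after {years} year(s): EUR {cumulative_markup(5, 10, years):.2f}")
# after 1 year(s): EUR 26.00
# after 2 year(s): EUR 52.00
# after 5 year(s): EUR 130.00
```

Each individual decision is trivially small, which is exactly why it escapes the GDPR’s framing of a single significant decision.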

Now, the starting point of the GDPR is that you have one decision that creates this big, nasty effect on our lives. But what about when these big, nasty effects are the result of the accumulation of teeny tiny daily decisions? There is no answer to that yet. There are a couple of interesting cases, especially from the Court of Amsterdam, concerning Uber and the profiling of its drivers and how they might be fired based on that profile, that might go in that direction. I am following them closely, and I hope the outcomes of those judicial cases will give more information in this sense. But so far that remains an open question, and one that I probably would not have anticipated at the start.

Thibault: Well, I can’t resist talking about competition law here, because I think we should definitely cooperate on those questions. When you mentioned that sometimes it is not in your best interest but in the best interest of the company, I think this is where you could see competition law also playing a role. But on some occasions it is actually in your best interest, right? If you just ask any home assistant, “can you please order a pizza?”, the choice reduction is very convenient. The fact that you don’t even have to choose anymore might be good for a consumer, but it might be bad if it simply benefits the company. In the space of competition law, agencies assume that “more” choice is always in the consumer’s interest, when in fact home assistants can increase consumer welfare by reducing choice and serving what is best for the consumer. But, of course, that triggers the question: is that really what’s happening? Who are they serving? This is where the interplay between privacy and competition law becomes central.

Silvia: I absolutely agree. And sometimes we see privacy tools used with an eye to competition, and vice versa, right? The other open question I had actually also has an element of that, where we have to look at consumer protection and competition law, I believe, to find some answers. What I’ve looked at in my non-legal part, and that was another really fun discovery, is that all digital products are designed to optimize our reactions and persuade us. Some may say “nudge”, but the nudge is a very specific concept in behavioral economics. They might persuade us towards a certain action or behavior so that we engage with them more. We intuitively already know that, because we say that some of these things are addictive. It is actually not necessarily an addiction; it’s what they call a long-term relationship with the customer. There are very simple tricks, like a certain combination of colors. Or offering a tip, a suggestion: the assistant might offer you a suggestion or offer to do something for you, and then right after, when you’re well disposed towards the assistant, ask you to do something else, almost as if in return. And by doing those things, you’re going to give them your data, so they did get something: you contribute to them. This is a mix of marketing, psychology, computer science, and, you know, user interface design.

Thibault: And of course, they could implement certain strategies for just some consumers, and compare the results.

Silvia: Exactly. With machine learning and personalization profiling, this can be further refined, but honestly, I have looked into it a lot, and even very basic mechanisms work; the simpler, the more effective. If they put the banner of your favorite soccer team on a website, you’re going to be more prone to answer more questions in a survey, for example. Things like that, it’s extremely easy. But definitely, AI or machine learning becomes an enhancer of this. And what happens when this persuasive effect is embedded in our home, in our digital and physical environment? For me, it becomes incredibly interesting, because the law has not dealt with that, unless we’re talking about consumer protection in very clear-cut situations, like the true subliminal messages that were much discussed in the 90s. When I was little, we assumed that advertising was going to show us a single hidden frame, and that would push us to buy soda or something. That is kind of a clear-cut situation. But this is less clear-cut (note: what happens with the design of smart speakers).
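
As Thibault suggested above, a platform can show a persuasive element, like that soccer-team banner, to only some users and compare the results. Here is a minimal, self-contained sketch of such an A/B comparison, with simulated users and purely hypothetical effect sizes:

```python
import random

random.seed(0)  # reproducible simulation

def answers_survey(sees_banner: bool) -> bool:
    """Hypothetical: a user shown their favorite team's banner answers
    a survey question with probability 0.30 instead of 0.25."""
    return random.random() < (0.30 if sees_banner else 0.25)

def ab_test(n_per_arm: int = 10_000) -> tuple[float, float]:
    """Run both arms and return the observed response rates."""
    treated = sum(answers_survey(True) for _ in range(n_per_arm)) / n_per_arm
    control = sum(answers_survey(False) for _ in range(n_per_arm)) / n_per_arm
    return treated, control

treated, control = ab_test()
print(f"banner: {treated:.3f}  no banner: {control:.3f}  lift: {treated - control:+.3f}")
```

With enough users, even a tiny lift is measurable, which is what makes these design tweaks so attractive at scale.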

Thibault: And what about the AI Act? There are some provisions about that and manipulation. One way of reading it is that advertisement will be prohibited in Europe. Of course, this wasn’t the intention, but advertisement is about manipulating people so that they buy things. So I’m very curious about how they will frame that in a better way. Have you ever looked into the AI Act? What’s the interplay with the GDPR?

Silvia: The Act considers “subliminal techniques” used to “distort behavior”. These are very vague words, so I am really looking forward to understanding better how the European legislator intends those words, what they mean to them, and how they want to apply them. And what you’re saying is correct. We even have shops that place products in a certain way so that we buy them in the store, the physical store, right? And that is allowed. So, where do we draw the line between manipulative, or persuasive (I prefer this more neutral term for now, because what it actually is remains to be understood), practices that are desirable and those that are undesirable? This is exactly the question that I want to answer in my next research endeavor, so I really hope to have an answer in a few years’ time. And I also think that it’s going to affect competition law, as you were saying, because at some point it can give quite an advantage to some companies.

Thibault: Of course. So people should expect papers from us both. In the meantime, I believe your Ph.D. thesis will eventually be published. But for now, I’m sure we can find some of your open access articles on the subject.

Silvia: Absolutely. And they can contact me on Twitter anytime, and I’m happy to talk about smart speakers a lot, as you can see from how long this interview was.

Thibault: Of course, well, thank you very much. It’s been a fascinating discussion. You can expect more talks coming from the ALTI team in the coming weeks and months. Silvia, I’ll see you in a couple of hours. Thanks again, and take care.

Silvia: Thank you. Bye.

Thibault: Bye-bye.
