IoT and technology-facilitated abuse with Dr Leonie Tanczer


The podcast which talks to the maverick renegades of the internet.
WeDont.Stream

You can’t listen to this anywhere but here.

About the guest

Dr Leonie Tanczer is Lecturer in International Security and Emerging Technologies at University College London’s Department of Science, Technology, Engineering and Public Policy.

Leonie is the founder of the Gender and IoT project, which looks at how IoT impacts gender-based domestic violence and abuse, and what socio-technical measures need to be implemented in order to mitigate those risks.

Donation choice

Each guest is asked to nominate an open source project or charity for donations for their episode. Leonie has chosen to highlight a number of UK-based organisations which she recommends donating to.

Transcript

Intro

James Mullarkey

Hello and welcome to We Don’t Stream, the podcast which talks to the maverick renegades of the internet. I’m your host, James Mullarkey, and yes, that is my real name.

This is episode 10, and today we are talking to Dr Leonie Tanczer. Leonie is a Lecturer in International Security and Emerging Technologies at University College London’s Department of Science, Technology, Engineering and Public Policy.

Her research focuses on internet security and is centred on the intersection points of gender, technology and security. She is currently researching technology-facilitated abuse through her Gender and IoT project, which studies the implications of the internet of things for victims and survivors of gender-based domestic and sexual violence. Some of Leonie’s prior research projects included studies of gender stereotypes within the hacktivist community and sexism online.

I was extremely grateful for the time Leonie gave me to discuss technology-facilitated abuse and the internet of things.

I began our chat by asking Leonie to introduce herself and explain if she had a superhero story.

Interview


Leonie Tanczer

My name is Dr Leonie Tanczer and I’m a lecturer at UCL. The project that I’m working on is called Gender and IoT. I don’t really have a superhero story about how I came to study the impact that technology-facilitated abuse has on victims of gender-based violence, although I did have an ‘a-ha moment’ which brought me there. I was researching smart technologies, which I normally refer to as IoT, internet of things, technologies. I was part of a research network and we were doing all these amazing things, and also assessing risks and governance issues.

But I felt we weren’t looking specifically at some of the most vulnerable groups in society, which I think are victims of domestic abuse and intimate partner violence. So, when the chance arrived to submit a small funding bid, I just went for it and thought: hopefully that contributes a bit to the discourse that I felt was missing back in 2017.

James Mullarkey

You’ve mentioned there a phrase which perhaps not everyone is entirely familiar with – technology-facilitated abuse. So, could you give us a bit of a guide to what exactly you mean by that, and the ways in which this kind of technology can be used to abuse victims?

Leonie Tanczer

The term technology-facilitated abuse – I would normally call it a really big bucket, and the reason is that, as you can envision if you break up that term, it technically could be anything right now. When I speak about it, I particularly refer to intimate partner violence, but other people could think about it in terms of cyberbullying or work-based abuse or whatnot. But with regard to intimate partner violence and gender-based violence situations, technology-facilitated abuse describes the misuse of technologies – and I think here specifically digital technologies – for the purpose of coercing, controlling, harming or monitoring victims and survivors.

So, to contextualise this a bit better, this ranges from anything such as excessive text messaging to, for example, stalkerware, which is malicious software that can be installed on devices such as smartphones, but also tablets and laptops, to monitor a person and their activities on that system.

I’m really interested in smart internet-connected devices – so Alexa, Google Home, smart thermostats, any products and services that we used to have in our households that are now becoming internet-connected. To give some concrete examples: as a consequence of our research over the last three years, we have now worked extensively with frontline organisations, both police services and charities that work with victims and survivors but also perpetrators, and have observed that devices such as smart speakers (that is, Alexa, Google Home) are being used to drop in on or monitor what people are asking these devices.

Other systems that are quite widespread are things that have been in our households for longer than the cutting-edge technology, such as the smart thermostat, which we have had for a while now too. These allow for remote changes, so the perpetrator doesn’t have to be physically present with the victim to change the heating in an environment. And the other thing is minor things that can be quite scary, like light bulbs – I always give the example that some of them have really poor security features.

The best concrete example I can give is: let’s say you have a one-night stand. You bring this person home, they ask you for your wi-fi details and you give them, right, because most of us have them displayed somewhere on the fridge. Then they realise this person has some smart connected devices in the household. I don’t want to say that all these devices have really bad authentication requirements, but some are really poorly configured. All that is necessary is that you download the app associated with that device and are on the same wi-fi network in order for it to be controllable. Now, considering that a wi-fi router can reach up to 30 metres, this person – and they might even think of it as a joke – could just stand 30 metres outside of your house and the next day change your lighting system. These are some of the examples that we have been seeing in our research.
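To make that weakness concrete, here is a minimal sketch in Python of the trust model being described: a toy ‘smart bulb’ that accepts commands from anyone on the same wi-fi network, with no pairing or authentication step. It is invented purely for illustration and is not modelled on any real vendor’s protocol.

```python
# Toy "smart bulb" (illustration only): it executes any command sent to it
# over the local network, so being on the wi-fi is the only "authentication".
import socket

def run_bulb(host: str = "0.0.0.0", port: int = 9000) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, addr = srv.accept()
            with conn:
                cmd = conn.recv(64).decode(errors="replace").strip()
                # No check of WHO is asking: any device on the LAN succeeds.
                if cmd in ("on", "off"):
                    print(f"bulb turned {cmd} by {addr[0]}")

if __name__ == "__main__":
    run_bulb()
```

A safer design would add an explicit pairing step – for example, a one-time code shown on the device that the app must present and that the bulb verifies with a constant-time comparison – so that wi-fi access alone is not enough to take control.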

James Mullarkey

You’ve mentioned there that we’ve got some of the internet of things, these relatively new devices on the scene, ranging from hubs like Alexa – sorry, Amazon Echo – and Google Home and those types of things. But then you’ve got a range of other things orbiting around those, from something which might be useful, like a smart bulb where you can turn the light on remotely – I’m not quite sure why you’d want to do that, but some people think it’s a great idea.

During this Covid lockdown we’ve seen people buying Ring doorbells because that means they can see who’s come to the door, to reduce the chances of transmitting the virus and things like that.

Up to the ridiculous, where you get the smart oven. I’ve never really understood why anybody wants to turn the oven on and off while they’re out of the house, but there you go.

So basically everything is going to have a computer in it sooner or later – well, that’s the narrative of the PR companies, of Amazon’s and Google’s PR. But do you think a smarter home, if we’ve got all of these devices in our home, is going to be a safer home, or do you think that these devices make us fundamentally unsafe? Because to me – I mean, I’ve not got an Amazon Echo or any of these – the second I plug them in, I think they’re unsafe, or kind of abusive in a way.

Leonie Tanczer

So, whether or not a smarter home is a safer home is a really, really difficult question to answer because it’s so dependent on a person’s risk level. I think there’s a couple of things to consider here.

Number one is that the people developing these systems are often very privileged people; they may not think about all the groups and communities that are going to use these devices, and that those groups may not have the same privileges that they have.

That’s problem number one: we need to remember that, just because I’m a white, educated woman who would like to have a smart doorbell, it doesn’t follow that everybody would like that, or that everybody would have the same functionality and profit from it as I would. So that’s number one.

Number two – and I really want to stress this – is that smart technologies, and any technologies for that matter, when it comes to domestic abuse and intimate partner violence, need to be assessed on the spectrum of abuse that is happening.

We normally talk about three stages, and functionalities like remote control, connectivity between devices, or even accounts such as social media linked with IoT devices, can easily be abused in the first two stages. The reason is that, if you think about a coercive, controlling environment, there will be one person that potentially physically, emotionally or psychologically abuses someone. They may be in charge of these devices and these accounts too.

They may even have purchased these devices, and that’s one thing we see quite frequently: the legal owner is actually the perpetrator, because they purchased them, and there’s a gender dimension here that comes into play too.

If you think about just your friends and family – how often is it that, in a household, it is the man who is in charge of the gadgets?

There are multiple levels here. Coming back to these three stages: in the first stage we often see people sharing a household, so there’s physical access to devices, but also physical access to the victim and survivor, and even their children for that matter.

In that situation, IoT functionalities and devices are often abused against victims and survivors because there is this power dynamic that comes into play. The second phase is the high-risk phase. This is where a victim and survivor realises: okay, I want to leave. I might go to friends or family, or I might go to a refuge. In that situation, it is extremely important that they think about all the ways that a perpetrator can trace them.

And how a perpetrator can still connect with them – in this phase, for example, forgetting about one still-connected device can matter enormously. But again, here there’s a risk that these smart features can still be abused.

However, if they are, for example, in an environment where they have control, it might allow them to remain in control of the environment by, for example, using smart CCTV cameras.

The third stage is where the victim and survivor has left the abusive situation. Perhaps the perpetrator has been prosecuted, and they feel like they are in charge of their life again – or as far as one can be after such a situation. Here we actually see a lot of victims and survivors telling us that they would prefer smart features, because then they are in charge of the accounts and can control how they are being used, so the things that have been used against them in the past can now be used to their advantage, by for example using CCTV cameras or smart locks.

But again, there is still often this anxiety: am I safe, you know? With, for example, a physical lock, I know for sure this is locked now. That is not unbreakable either, but with digital devices, having been in the cyber security space a while, we know nothing is unhackable. Most perpetrators are just average: they use the features that are out there; they’re not going to write malicious code. But there are services out there that help with that, and there are bugs once in a while.

Then there are also the really emotionally invested perpetrators, who will go far beyond anything you would envision. So, in this third stage, where they may feel in control because they have these features, there’s still this caveat: can you fully trust them all the time?

But to come back to your original question: as you can see from these stages, it’s relatively hard to say whether it’s safer or not. It depends on what you feel comfortable with in this environment, who’s in charge of these devices, and how trustworthy you can feel about the systems being used in your environment.

James Mullarkey

It’s quite chilling, actually, to hear how, even at the point of wanting to exit a relationship – which is somewhat simpler to do physically, although that can also be tricky at times – you’re not really leaving in a sense, because of the access and the technology side of your life which is being left behind, and which seems to take considerably longer to remove yourself from.

So, what are the design things that are missing here? Because, as you’ve said before, it’s a system – and, let’s not kid ourselves, it’s white guys in hoodies who design it. White guys in suits then approve it, and the white guys in black t-shirts are the ones that you see: the CEOs, the ones on Twitter saying how great it is. They’re definitely not thinking about other people, that’s for sure. But is there anything that can be done in the design?

Leonie Tanczer

I think definitely. One thing is, I’m really worried that one day I will get a call from Amazon or Google saying: you need to stop talking about this, you’re really bad for business. I do understand that I only home in on the negative aspects, and that’s point number one. I do see advantages, as I say, on this spectrum, where I think we could do better, and then perhaps it will be better to use these systems also for victims and survivors of intimate partner violence.

From our research, there are a couple of things we feel would be easy to implement and are kind of the low-hanging fruit.

So, to start: number one, authentication requirements. As I said earlier, don’t just allow anyone to connect with a device simply by being on the same wi-fi network. Again, good product vendors thankfully already don’t do this, but still.

Number two is prompts. I often relate this back to my own behaviour: how often have I had friends over and given them access to, for example, my wi-fi network, or used my accounts, such as Spotify, on their laptops, and then forgotten that the credentials are still there? So I think, just from natural human behaviour, it would be super important to get prompts once in a while saying: hey, you’re still connected to these systems – do you still want person x to have access to this, or do you still want product y connected to this? That would be something where awareness could be raised that is not there at the moment, because you forget how many devices are connected.

Now, I do want to stress that this could also cause risk and tension in intimate partner violence situations, because the perpetrator might have secretly connected devices that the victim was not aware of. The situation could escalate if the victim and survivor is prompted and realises: okay, the perpetrator is silently monitoring me. But then they at least have a choice; currently they might not even know. So, I think prompts would be something absolutely fundamental, and they’re not hard to implement. I just don’t understand why this is not something that’s already there.
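To make the suggestion concrete, here is a minimal sketch in Python of how such prompts could work, assuming a vendor keeps a record of each person or device granted access and when the owner last re-confirmed it. The data model and the 90-day review interval are hypothetical choices for illustration, not any vendor’s actual implementation.

```python
# Sketch of periodic access-review prompts: surface every access grant the
# account owner has not re-confirmed within the review interval.
from dataclasses import dataclass
from datetime import datetime, timedelta

REVIEW_INTERVAL = timedelta(days=90)  # hypothetical cadence

@dataclass
class Grant:
    subject: str               # person or device with access
    last_confirmed: datetime   # when the owner last re-approved it

def due_for_review(grants: list[Grant], now: datetime) -> list[Grant]:
    """Return every grant the owner should be asked about again."""
    return [g for g in grants if now - g.last_confirmed > REVIEW_INTERVAL]

if __name__ == "__main__":
    now = datetime(2021, 1, 1)
    grants = [
        Grant("guest laptop (Spotify login)", now - timedelta(days=200)),
        Grant("partner phone (speaker app)", now - timedelta(days=10)),
    ]
    for g in due_for_review(grants, now):
        print(f"Prompt: {g.subject} still has access - keep or revoke?")
```

As discussed above, such a prompt is double-edged: surfacing a secretly connected device informs the victim, but could also escalate the situation, which is why the design would need input from frontline specialists.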

The other thing is that there’s a discrepancy between the functionalities that are on the device and the things that are happening in the cloud and in the web features of the system. If you think about it, if you’re the owner of a smart speaker – a Google Home, an Echo – the device in itself has not many buttons; there’s not much to see. You have the connected app, but you can also access information on the web interface. Very often, none of us really logs into that system for a while, right?

So again, I think frequent prompts that are connected, for example, with my email account would be something that could help here: to once in a while do a clean sweep, so to say, of what’s happening.

The other thing, which is actually more human-related than technological: for a lot of victims and survivors, their first point of call when they are worried about something is actually tech shops. But if you’re living in London and you have ever gone into the Apple Store on Oxford Street, you will know that the chances of you getting proper help with something you’re worried about are very limited.

That’s not to bash the shop, but there are so many people, and so many requirements that staff have to fulfil, that they will not take an hour to sit down with you and check if you have stalkerware installed or if something is happening with your phone.

We feel it’s really important that vendors have some form of triage, so that requests that come in with an intimate partner violence dimension are targeted specifically and handled differently from the rest of the requests they may receive. A lot of the information that is on our devices can actually be used for evidence collection, and if that’s not handled appropriately, it may be accidentally or deliberately deleted – because someone just says to reinstall the device – and then the victim and survivor may not have any form of evidence to go and prosecute the perpetrator.

So, people that give advice around these issues need to be trained and think about this implication, because of course the first thing we’re going to tell everyone is: have you tried switching it off and on again, have you restarted it, have you reset it? Which in this context is actually potentially risky. So, for example, this would be something that vendors could invest in: having a triage system to deal with that.

James Mullarkey

I know you’ve got a very interesting background because you’ve got a lot of public policy related knowledge and experience as well as on the technology side.

I wanted to ask you about the policy and legislative changes that could be made to improve this. I’m sure you’ve had some discussions with people in the government and that surrounding world, so I wanted to ask what changes can be made there – and also, realistically, do you think this is almost, in a way, an American problem, because these companies are all American?

Will we actually be able to do anything here in little old Britain, especially now we’ve been cut off from Europe and we’re floating on our own? Do you think there are still things we can do ourselves to change the situation?

Leonie Tanczer

So, really good question. I do certainly believe that the UK government has an opportunity to direct the tech sector and orient it towards something. Knowing what the UK government has done in this space in recent years – we have the 13 principles of IoT security best practice. I think they’re great, but they’re really the low-hanging fruit again: from embedding encryption into systems to, again, authentication requirements. So it’s great, but, number one, they’re voluntary, so a vendor can still access the market without implementing them.

Number two, they are not as far-reaching as they could be, because they are often very system-specific, whereas what I’m describing, and what I’m seeing a lot, is the functionalities that are being abused. Nobody exploits the encryption level of a device; it’s the top of the system that is being used, the functionalities that are embedded in there. And for a regulator – whether it’s the UK government, the European Union or the US – to address this is going to be harder.

Because, as I explained earlier, some people want this remote control feature. You may not know why you would want to switch on your oven from afar, but some people want to, and they are willing to pay for it. I believe we’re in this really unique situation now where we can still intervene, and that’s why I’m banging on about this topic so much, because I think we’ve missed the train with laptops, with the internet generally, and with other functions of systems.

But we now have this unique opportunity with IoT devices, because they’re not yet so widespread that it is hard to retroactively change anything. There’s certainly an appetite for these systems to become widespread, and I think we’re also too far in to basically put them back in a box.

So I really, really want to stress the importance and the criticality of intervening now, because once it’s gone too far it will be impossible to change these systems. We know from the research, and from the way the market is going, that these systems are not just gadgets; they are tiny sensors that will be built into the built environment. How are you going to get those changed, to rectify that, if someone has now built a factory or an office?

So I really want to stress the timeliness of this. Whether it’s an American problem or a British problem or a European or global problem, I’d say it is something we all have to address, because we know the same problems remain in the US. But I guess the question is how comfortable you are with the carrot and the stick, as we would call it in the policy domain – the carrot being incentivising better behaviour, the stick being regulation, legislation and basically telling the vendors what to do.

So, to come to your question around what the policy space could do: in the UK we’re just in the process of getting the new Domestic Abuse Bill, which is great. There’s some acknowledgement of tech abuse, although it’s more related to online abuse. In many ways we’re thinking again about old things, from my perspective, like social media, and I think this will remain important; it will transform and change.

There will be new products and new services that come on the market, but I am worried that we are forgetting the device level – that it’s not just the platforms we should be concerned about, but also the internet functionality in the devices that we’re seeing.

I think we have the Online Harms White Paper coming and being discussed in the UK. I know that the digital rights community has its opinions about it, but I sit between that space and the domestic abuse space, and I think both fields need to talk to each other. Because you can’t just tell a frontline worker that free speech is so important when they have to help an 18-year-old woman who has just fled a two-year, really horrible domestic abuse environment with a partner and just doesn’t know what to do, because she doesn’t want to be cut off from all the online environments that help her to, for example, do her business, do her schoolwork, do university work, but also just stay connected and be a human being. Then I feel we can’t just talk about how important it is to keep everything open. I think we need to find a middle ground, and I’m really not saying this lightly. I think it’s really important that both communities talk, because online harms are something that will not go away and will only become more worrying.

James Mullarkey

Okay, that’s really good – you’ve given me a bit of hope, and I didn’t start the question with a lot of hope. Frankly, I was very unaware of these types of problems until I started reading about them relatively recently, which I guess probably partly answers my question. But what do you think someone like myself can do to improve the situation for victims of abuse? I’ll write to my MP if that helps, but what are the things I should be doing?

Leonie Tanczer

It’s funny, that would have been one of my points. Definitely write to your MP, because one thing that we’re seeing, working with the UK government – which is open and receptive – is that political drive is needed. And that’s not civil servants; that’s political, parties and officials. So definitely, if a whole constituency speaks up about the fact that this is an issue – not just the small charity that works, I don’t know, in Reading and sees that problem, but a whole constituency – I think it will be important for that person to pick it up.

The other thing I was thinking of, especially if you are a bit tech-savvy, is volunteering – but I’m saying this with a big, big caveat.

So, it is really important to be led by the people that work with victims and survivors. I have heard a lot of great, ambitious ideas from people. But if you break them down, and if you know the reality of the people that work with victims and survivors, and of victims and survivors themselves, you know that this new app, or this and that, simply doesn’t work.

So, when it comes to volunteering, I think it’s important to go in and offer one’s technical skill set, to for example help with certain things that are happening, but also to be conscious that giving advice is something that potentially risks a person’s life, so nobody should take it lightly.

The reason why I nevertheless say help support frontline organisations is that my worry is we’re asking frontline organisations – maybe charities, maybe even the police – to not just be legal experts, to not just be social workers who have a psychology background and know how to deal with people with incredible trauma, but also to be housing experts and banking experts and so on. And now we ask them also to be tech experts, and let’s be honest, that is unrealistic.

There’s a need, there’s a gap, to help the sector deal with questions around tech. I honestly know that many frontline workers are trying to read up on things, are trying to Google stuff, but then they have the same worries: have I understood this correctly, and what if I’m wrong?

So, we have organised a CryptoParty for the frontline services in the UK, together with digital rights groups but also the National Cyber Security Centre, which brought together tech experts and the frontline services, and we collected the questions that they have. Many questions don’t have a simple answer, but at least the techies in the room were prompted to think about them, and now, when they come up with solutions, they will have to think about all the implications that were shared with them in that room. I think that’s really important, because otherwise you’re just sitting in your fancy office – well, no longer in the fancy office, because you’re probably sitting at home somewhere – thinking in a bubble about what the needs and the risks are, and you don’t think about the consequences. So, volunteering in that space can help you to, for example, see the implications and learn the questions that are out there.

Also, if you’re designing products, that can help you. But just generally, help the sector reflect on questions like how to reset an Android phone, or how to navigate certain online platforms better and change privacy settings. They might be super low-hanging questions, but having someone there who you can bounce ideas off can be very helpful.

James Mullarkey

Is this a gap, then? Is this the sort of area that your project might be able to assist with, and what are the next steps for your project? Is this where you fit into the piece?

Leonie Tanczer

Well, I’d love to think the uniqueness of this project is that it is influenced both by my digital rights, technical... well, my technical background? Jesus Christ, I’m a social scientist, full stop. But I’ve worked with techies for a very long time, and I’d say I know the mentality of the sector quite well. The other thing is my strong passion and my interest in domestic abuse, but also gender issues. I do want to stress it is not just a women’s issue, so to say; we know that men are being abused through technology as well, but it’s also an LGBTQ issue, you know? So I want to stress this. We definitely have a couple of pleas or requests that come out of our research, and we’ve written them up in a report that is publicly available for anyone who is interested.

There are a couple of things that we think can be addressed in the technical space; in the social space – for example, the practical space of frontline workers, maybe police or charities; in the policy space; in the research space; but also at the individual level.

I’ve already addressed the industry aspects, which require vendors to think about prompts and connectivity issues in regard to their technologies.

In the frontline space, we see a need, for example, for the explicit collection of tech abuse data. The reason being, in the UK we use a dedicated risk assessment; it has around 27 questions, and there are questions that hint at technology. But they could be more explicit, because I think it’s important to no longer just think about laptops and smartphones, but also about wearable devices, about smart speakers, or built-environment things like thermostats and smart locks. That’s important because then, if you go to a person that is alert to this, they will still ask you questions around this technology and think about it in the risk assessment.

That brings me to the second point. The risk assessment used when working with victims and survivors needs to be updated, but also the safety plan. A safety plan is based on the risk that a person faces, and that’s so dependent on the individual circumstances. Their safety plan needs to account for: if they want to leave, what would the steps be to change, for example, settings? Or, if they want to remain with that perpetrator, how can they safely use these devices on an ongoing basis?

So that would be one, and the data collection is needed because then we would be able to see the extent of it. Currently I’m getting questions from industry that say: well, it’s not our devices that do these things; we never designed them to be abused. And policy officials are telling us that without stats – and we have some figures, but I think they could be better in this regard – it’s hard to make changes.

Then, in the academic space, I would really love this project, this community that we’ve built – the Gender and IoT project at UCL – to set the tone and the narrative for future research in this space.

I’m really dedicated to transforming the computer science profession in regard to making students alert to this issue early on. But also, what I’d really love – and this is a very early idea – is: we have clinics, like legal clinics, where law students give advice to people. We could apply a similar model to computer science and engineering as well. There is a similar project in the US, CETA, the Clinic to End Tech Abuse, and they’re doing something similar. But it still operates at a very small scale, and I would like to test a similar approach in the UK.

Bringing tech students into contact with the support sector hopefully also means that, once they graduate, they’re more alert to all the implications that the technologies and devices they produce will have. So, the next steps for us are certainly to continue to assess the risk level of the tech landscape.

So, testing devices; continuing to work with the charitable space to better assess their needs; and looking at how we could examine the devices that we’re testing to search for certain aspects that they are seeing in this space.

We’re also very interested, of course, in quantifying the tech abuse landscape. So, we have a project on the go to hopefully use machine learning approaches – although, I have to concede, we’re facing some challenges – to go through reports and assess tech abuse in them. But most foundationally, I also think we need to look at perpetrators a bit more, and that’s one thing I am really keen on, because I think the space at this moment in time is primarily focused on victims and survivors and what they have to change and do, and also on what support services have to do. I raise my hand: that’s what we’ve done in the last two to three years. But I think it’s important to turn our attention to the tech space and to the perpetrators that are doing these things, and basically prevent abuse from the start.
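To illustrate the kind of approach being described, here is a minimal sketch in Python of a text classifier that flags case reports which mention tech abuse. The reports and labels are invented toy data, and the modelling choices are assumptions for illustration; a real project of this sort would need properly collected, consented data and far more careful definition and evaluation, as the interview itself stresses.

```python
# Toy sketch: flag free-text case reports that look like tech abuse so a
# human reviewer can take a closer look. Data and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reports = [
    "partner tracked her location through a shared phone account",
    "he changed the smart thermostat remotely after she moved out",
    "verbal abuse reported by a neighbour, no devices mentioned",
    "financial control exercised over the joint bank account",
]
labels = [1, 1, 0, 0]  # 1 = tech abuse mentioned, 0 = not (toy labels)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(reports, labels)

new_report = ["kept logging into her smart speaker to listen in on the flat"]
print(model.predict(new_report))  # e.g. [1] -> flag for a human reviewer
```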

James Mullarkey

I can see what you’re saying there, because it’s very much about the steps victims need to take to protect themselves, and not about actually preventing the problem in the first instance.

I’m quite astounded by – and I know this is sort of an outcome of your project – the lack of data. That’s the one thing that just blows me away, really. I know we’re talking in the context of tech abuse versus other types of abuse, and of being able to define it as a category and say how much of it is happening. But the whole point of these devices is to collect massive amounts of data, and yet we don’t have the data available to understand properly what’s happening.

If we’ve got the same and similar companies, for example, creating apps about women’s menstrual cycles so they can predict things like mood, or perhaps when they become pregnant before they know it themselves – using data like that not only to commodify humans but also as a means of understanding things that are happening in their day-to-day lives – it seems to me that they might have some of the answers tucked away somewhere around this?

They might know that there are things happening in people’s houses – there might be abusive situations going on – because there’s so much data being captured that they might suspect it. Do you think I’m being a bit crazy by saying that? Have you had any discussions with them about this?

Leonie Tanczer

So, number one is exactly what you said: the main problem with assessing and quantifying this space is, what is tech abuse? As I said earlier, it’s a very big category, and even in the academic space we haven’t agreed whether excessive text messaging counts, or whether it needs to be more sophisticated, so to say, to be tech abuse. That’s one of the core problems. I think we first need a clear definition of what it is to be able to quantify it.

The second one being, the data is spread across many places. So, the police: police forces have data from records, but that’s only a limited amount, insofar as not all victims and survivors will escalate things to the police level.

Then there are charities – again, not all victims and survivors will go to charities either, so we have a black hole in the data there as well. And of course, as you said, the third one is industry. Industry is an interesting one. You’re asking me if I’ve had encounters, and I would say yes, I’ve had encounters, but I’m not sure they were always super fruitful. I think the problem is, as you can envision, that no company wants to associate itself with the fact that its devices are being abused, especially in something so horrendous as intimate partner violence.

So, I think there’s almost a worry about talking about this or bringing awareness to it, because that could escalate into a realm that they may not want for PR purposes or other reasons. There is this reluctance to engage with the topic, because you open a box that you may not be able to close.

Equally, I think for industry actors there are so many other things that come into play: they have other topics that they are legally required to tackle, like terrorist content and whatnot. So this probably falls under the radar, and then there’s the lack of awareness as well, and the question of who’s responsible. I think that’s the biggest challenge I face as a researcher, because most of these corporations are so big that I might talk to one person, but they are not responsible for this; they may refer me to another, and then it dries up and nobody does anything. That’s why, coming back to the carrot and the stick: sometimes the stick is quite handy.

So yes, the data would definitely be there in industry. But again, I think it boils down to what you are looking for – the question of category. I’m trying to envision it here: let’s say we had the pleasure of receiving smart speaker data. There’s a lot of personal information there, which in itself would be tricky to navigate as a university, but I’m sure we could work around that. Then that’s where machine learning comes in: what patterns need to be there for something to be abusive?

I want to stress that there has actually been a patent – I believe the holders are based in San Diego – for analysing whether smart speaker data is abusive or not. Unfortunately, I haven’t managed to reach out to those researchers yet, but that’s the other route that industry actors may go down.

Okay, if they’re alerted to it, they may use that data and say ‘Leonie is in an abusive relationship’ or whatnot, and if it reaches a threshold then they may alert me. But I’m actually worried about these technical solutions to an inherently societal problem as well. Because, again, what does the device do then, or what does the company do then? That’s why I really want to stress that this is not a simple problem that will be solved through technical means alone. Sometimes it requires a trade-off: not using certain features, or using certain features and just living with the fact that they may be abused. I don’t know.

I have not found the full answer yet. Yes, industry actors have the data, but I’m not sure they would come up with the smartest solution and evaluation as a consequence of looking into it.

James Mullarkey

I know you don’t have any IoT devices yourself, so I was just wondering: do you think it will be easy, or even possible, to keep that going in the future, given the drive to put a computer into literally everything? Will I be able to insure my house if I don’t have a smart lock on it, or will it be forced upon me?


Leonie Tanczer

That’s the biggest question for society in the next years, you know? Funnily enough, I have a smart speaker now. It was a joke gift from my partner, who thought it would be funny. But I think the best example is cars. There are still a lot of cars on the market that are not – I won’t even say smart – that don’t have software in them, and it’s getting harder and harder; I think soon you will probably not get an MOT with an old car.

So, with cars, I can definitely see that if you don’t want software in your car, you’re going to struggle in the future – and already now. I think it will probably be the same elsewhere, and that’s why I’m saying it’s a time-critical moment. The best example I can give is Spotify: if you signed up for Spotify, you got a Google smart speaker. There will be giveaways, you know? And one thing – this is super Black Mirror-y and futuristic – but we also know of social housing providers who have started to embed IoT functionality in the built environment.

It must be the same for you: of course you can carry on not connecting things, but it will start to get really inconvenient, and sooner or later you just give in, because there are already the right plugs for something, or you can’t find the necessary cable, or god knows what. And convenience often trumps privacy and security, let’s be honest. I’m not saying it has to be like that; it’s just historically how it is. Because there are not many people out there who would go to all the lengths that you seem to go to, people will just slowly be eased into this, and either we just have to accept it or we push back now, before it’s too late.

I see this again with social media now. People are tearing their hair out: what are we going to do? We have all these documentaries about how awful things are, but everybody is just like: so what, we can’t do anything, it’s too late, you know? With IoT, I think we still have a chance, and I really, really think we need to use it to push back and say certain things are not allowed. A big thing for me is also interoperability, which has not so much to do with the abuse aspects, although it can be functionally important. I see a development where certain smart providers have agreements with certain other smart providers, and you have to buy into the full system, just like you currently buy into the full Windows system or the full iOS system. I’m really worried this will mean that if you move house and suddenly everything is set up for Google Home, and I have bought into Alexa or whatnot, it’s going to be inconvenient.

So, there are so many things that I think we need to think about, and that we aren’t publicly speaking about because there are so many other things on our radar at this moment in time in society. I always joke – and this is actually not my idea; we did a workshop and someone came up with it – that we need a kind of David Attenborough for tech. We need someone who educates society about these issues. We just take it for granted that everybody is on top of everything, but let’s be honest, we have already left so many people behind in moving forward. An educational programme on tech, equal to Blue Planet – that is what we need, and hopefully it gets the discussion rolling.

James Mullarkey

I just don’t know who that person would be – I think it’s a great idea though. I was wondering, do you want to mention any organisations or charities that you think are important to support?


Leonie Tanczer

I thought about this question a lot, because of course I want to give a shout-out to all the charities working in the domestic abuse space. So, the first point of call for me, for anyone interested in supporting an organisation, would be to support your local charity in this space. Covid has had such a huge impact; a lot of small organisations are struggling, and they don’t get enough funding.

But they have a huge increase in calls, or in requests from people even to get a bed space. So start with, for example, your local charity, and if you don’t know it, just Google it. The other thing, as an alternative to the local element perhaps, is that there are specialist organisations, for example Jewish Women’s Aid or the Chinese Information and Advice Centre. You can go through those. And then of course there are the big organisations that are really important in the UK – to mention only a few, and I really want to stress this:

I definitely think Refuge, when it comes to tech abuse, is ahead of the curve in the UK.

Respect, when it comes to perpetrator intervention and male-centred advice.

The Suzy Lamplugh Trust, which is very much focused on stalking victims and survivors.

Women’s Aid and SafeLives.

These are just some of the organisations I think it would be worthwhile supporting, but I really want to stress that anything can help in this space. Start with your preference: your local one, or big organisations like these. There’s also the London Violence Against Women and Girls Consortium; while it has London in the name, it comprises many organisations that operate across the UK, so it’s definitely a point of call, because it has 28 organisations that work on specific issues. If people are interested, our research project, the Gender and IoT project, runs a monthly newsletter, and we basically use that newsletter to inform the public, but also the charitable space, about the latest developments in this space: the latest research and publications, the latest legislative and regulatory developments, and resources, but also news, and we also give project updates of what we are doing. If anyone is interested and wants to keep on top of the tech abuse landscape, please feel free to sign up.

Outro

James Mullarkey

Well, there you have it. I’m very grateful to Leonie, who gave up quite a bit of additional time to have a longer discussion with me. It’s the longest episode of this podcast that I’ve recorded, and the most important.

As is customary, I asked my guest to nominate an open source project or charity for donations for their episode, and as we heard, Leonie has actually chosen a few organisations to highlight. These are UK-based organisations, but if you hunt around online, you should be able to find equivalents in your country if you don’t live in the UK.

Just as a quick recap they are:

  • Refuge
  • Respect
  • The Suzy Lamplugh Trust
  • Women’s Aid
  • SafeLives

I’ll include links to all of those in the show notes, as well as some more information about Leonie’s work and a link to her monthly Gender and IoT newsletter, which is definitely worth signing up to.

All that’s left for me to say is: keep watching the skies, don’t use Google, and enjoy your life.

We Don't Stream

This podcast series looks at the maverick renegades of the internet who are pushing back against big tech monopolies, surveillance capitalism and climate change. People who reject mainstream technology narratives to try and forge a slightly better, less dystopian future for all of us.