The Hacker Mind Podcast: Surviving Stalkerware
What role does technology play in facilitating intimate partner abuse? What role might the security industry have in identifying or even stopping it?
Lodrina Cherne and Martijn Grooten join The Hacker Mind podcast to discuss their Black Hat USA 2021 presentation. They discuss how software and IoT companies can avoid becoming the next Black Mirror episode, and they share resources that can help survivors (and those who want to help them) deal with the technology issues that can accompany technologically facilitated abuse.
Vamosi: Ever get the feeling that someone is watching you? It’s natural. I mean, we’ve all experienced it, and usually it just lasts a moment, as when someone across the street stares before they move on. But what if the feeling that someone was watching you was persistent? And what if that unease was coming from your mobile device?
In early September 2021, the Federal Trade Commission in the United States banned an app called SpyFone, and its CEO Scott Zuckerman, from operating in the surveillance industry. The FTC claims that SpyFone secretly harvested and shared data on people’s physical movements, phone use, and online activities through a hidden hack. It says that SpyFone sold real-time access to that information, which could have enabled domestic abusers and stalkers to track their targets. Some of those who bought the spyware were allegedly able to see live locations of the devices, view the targets’ emails, photos, web browsing history, text messages, video calls, etc. So here’s the thing: SpyFone is not an isolated incident. There are literally dozens of other examples, apps that haven’t yet been flagged as such but do the same thing.
Worse, stalkerware is just the tip of the iceberg. There are other ways abusive partners use technology to gaslight or even physically harm someone. In a moment we’ll hear from two hackers who spoke at Black Hat USA 2021. It’s an important topic, with real human consequences. So I hope you’ll stick around.
Welcome to The Hacker Mind, an original podcast from ForAllSecure. It’s about challenging our expectations about the people who hack for a living. I’m Robert Vamosi, and in this episode I’m discussing something that’s uncomfortable for a lot of people: the use of technology to spy on loved ones, and the responsibility of technology vendors to disclose, if not actively consider, how the features and services they provide might be misused to hurt others.
Vamosi: Technology, and how people use it, is the subject of a lot of discussion. I’m interested in how technology developed one way is used by people in another. For example, the way mobile phones ping the local cell tower is used by the Transportation Department to report traffic conditions on major roads and highways. But what about the unintended uses of technology that don’t get discussed as much? In this episode, we’re going to talk about two terms in particular. Intimate partner violence, or IPV, is akin to domestic abuse, where someone you live with may be causing emotional or physical violence. Then there’s the broader issue of technology-facilitated abuse, in which a device such as a smart device in a home, or an application on a mobile device, can be used to inflict emotional or even physical harm. Both involve people getting hurt. Both involve technology. Currently there’s little guidance around this. Fortunately, there are those in the InfoSec world who are actively looking at the subject and speaking out at conferences such as Black Hat.
Cherne: My name is Lodrina Cherne. I’m a principal on the security team at Cybereason, and I’m also a digital forensics instructor at the SANS Institute.
Grooten: Martijn Grooten. I am a coordinator at the Coalition Against Stalkerware.
Vamosi: Lodrina and Martijn presented a talk in a very important time slot: the 10 a.m. slot immediately following the opening keynote speech at Black Hat.
Cherne: The name of our talk at Black Hat this year is “Survivor-centric, Trauma-informed Approach to Stalkerware.”
Vamosi: And on that first day of Black Hat, not everybody was impressed. A celebrity researcher tweeted that he didn’t think human factor talks, such as Martijn’s and Lodrina’s, should even be presented at Black Hat. The researcher quickly apologized, but the fact that someone so well known tweeted it exposed something that Martijn has been saying for a while on Twitter.
Grooten: Security is a social science with small technical components, and I think we forget that. I think we as a community as a whole forget that. We value technical knowledge very, very highly compared to an understanding of how organizations and people work, and that’s actually far more important. So I would say it’s the human factor talks at conferences like Black Hat that are the most important ones. I mean, I agree that there’s some technical research at Black Hat that is useful, that helps the community further; that’s also important. But I think in the end it is far less important than us having a good understanding of why people and organizations do the things they do, or in this case, don’t do the things that we think they should be doing.
Vamosi: If the human factor is so important, then why haven’t we seen more talks on the social science aspect of security?
Grooten: My personal motivation for speaking at Black Hat in particular was that, I think, if you are a technology expert, a security expert, and you’re going to do some work on stalkerware because someone contacts you about something they think they have on their phone, and all you bring is your technical knowledge, I think it’s very likely to go wrong. This is not just a technical problem; it’s a form of abuse, and I think it’s important that people put it into context. I personally wanted to make that point in a 25-minute Black Hat presentation.
Vamosi: That talk focused on the fact that there are InfoSec hackers openly working to address this problem.
Cherne: On the topic of stalkerware and technologically facilitated abuse, there are quite a few people who’ve done amazing work in this space. Just two of many who come to mind are Eva Galperin at the Electronic Frontier Foundation and Tara Hairston at the Coalition Against Stalkerware. The Coalition Against Stalkerware is really this great organization that builds a network of corporations like Cybereason and others, including Kaspersky, who’s done a lot of work in this space, along with organizations that are fighting domestic violence and technologically facilitated abuse, and other nonprofits and educational institutions.
Vamosi: So let’s first ground ourselves in a common definition of what is and what is not stalkerware.
Cherne: One of the greatest things that I think the Coalition Against Stalkerware has done is really define what stalkerware is. It’s something that a lot of people have this sneaking suspicion about, but I think this definition is really important to think about. So, according to the Coalition Against Stalkerware, stalkerware is software, made available directly to individuals, that enables a remote user to monitor the activities on another user’s device without that user’s consent, and without explicit, persistent notification to that user, in a manner that may facilitate intimate partner surveillance, harassment, abuse, stalking, and/or violence.
Vamosi: This definition is precise and well suited for apps that record or transmit data in secret. But what about IoT devices? What about social media? It seems that when only talking about stalkerware, we might be avoiding a larger problem.
Grooten: I mean, stalkerware in particular, as a very narrow topic, is part of a much broader topic called technology abuse, which is the use of technology in intimate partner violence and abusive relationships. There’s been quite a bit of attention to stalkerware within the tech community, within the security community, and it’s partly because people can relate to it. I mean, I’ve done work directly with survivors, and the issues that they face, the technologies that they face, are often far more mundane and far less technical than something like stalkerware. But stalkerware is something people can relate to; it’s something we understand more, and I think that’s partly why there’s so much attention. It’s a form of abuse, and it usually happens in a partner relationship, in the form of intimate partner abuse. We have quite a big group of people who have been working on stalkerware for at least two years and have been trying to raise awareness among many groups, but in this particular case the security community.
Vamosi: To further define this, let’s be clear about how you get stalkerware. You might think, for example, that you could be targeted from afar, that somebody remotely installed this on your mobile phone.
Grooten: Generally, when we talk about stalkerware, or anything actually being installed on a device like this, it’s typically going to involve some kind of physical access.
Vamosi: Physical access means the abuser has access to the device. So that person either works with you, lives with you, or otherwise has access. It doesn’t take leet hacking skills. It also means that just about anyone can do this: download the stalkerware onto another device.
Cherne: One of the points in our talk is that, in general, this isn’t going to be some novel zero-day exploit. You don’t need to have hacker or criminal connections; you don’t need to be on the dark web. This could be something as simple as somebody having physical access to your device, allowing them to put these applications on there.
Vamosi: That said, Apple recently patched a zero-click vulnerability through which a text message could install listening capabilities on your iPhone. Yeah, but aren’t there examples of stalkerware that didn’t require physical contact with the device?
Grooten: I would have to qualify my no. The reason is that, except in limited cases, the bar for this is a lot higher. The price is a lot higher: you need to pay someone a lot of money to do this, or you need to have very good technical skills, which means it simply can’t be executed by most people.
Vamosi: There are extreme exceptions to this, of course.
Grooten: If someone’s ex-partner is active inside organized crime, these things happen, and that would be something to consider. But in general, I would rule it out. I think it’s important to rule it out, because if you believe that someone could access your device remotely, there’s basically no feeling of safety, and I think it’s important for people to be safe but also to feel safe. I should also say that when this does happen, and again it’s extremely, extremely rare, it almost always involves some kind of social engineering.
Vamosi: There’s also Pegasus, a type of surveillance software created by NSO Group, an Israeli security company. In this episode we’re not talking about Pegasus, which has been sold to nation states and used by some of them to target journalists, human rights workers, and political opposition. We’re not talking about external threats such as Pegasus; we’re talking about internal threats that come from inside the house.
Grooten: This is probably not relevant for your audience, because I think they have some technical understanding, but there is a belief among many people that you can hack a phone through its phone number. It’s simply not possible. Like, if I have your phone number, I can’t hack your phone, unless you have a million dollars and can hire someone to do that. So, in general, the answer’s no. People shouldn’t worry about this.
Vamosi: There seems to be a simple way to define technologically facilitated abuse, as distinct from legitimate services and apps that we use with consent: did you, as the product owner, understand and give consent to the scope of what the product or app will do? And does that product or app periodically notify you that it may be recording what you say or do?
Cherne: Consent in monitoring is where it all starts. One of the terms in that definition of stalkerware is that there is no explicit, persistent notification to that user. And that’s so important, because somebody who is an abuser, who is trying to control somebody, shouldn’t be able to install this kind of software just by clicking once on “install” on some kind of pop-up that asks, do you allow your location to be monitored, do you allow SMS and all of these things to be accessed on this device. Clicking on that once does not mean that the user of that device, who is using it, who is going out in the world, has consented to all of their activity being recorded and reported to somebody else. So again, it’s outside of anything we can cover in this podcast, but consent is really one of the big issues here. And the absence of that consent from the user, from the person being monitored, is one of the key features of stalkerware.
Vamosi: That’s a gray area with the Internet of Things. Sometimes you consent once at installation, and then you forget that something is actively listening to your conversations, waiting for you to say the magic word to turn on the TV, turn off the lights, or order something online. But what is it doing in the meantime?
Grooten: The most obvious one, I think, is something that people can relate to: cameras that can be remotely controlled. You know, cameras that people install in their homes, or that a partner installs in someone’s home, to monitor the home while they’re away. But of course someone can monitor this remotely. Say the relationship has ended; the same partner can still monitor these things remotely, and then it becomes a form of abuse. That’s the thing people can relate to, because we understand privacy. But there are also instances where an abusive partner or an ex-partner was able to remotely control the lights, or the thermostat, within a house, using that only to show they have power over someone, because a lot of abuse is about power. So there’s no obvious security issue; you can’t do any harm in a traditional security sense by turning on the lights, but you can do a lot of emotional harm.
Vamosi: And then there are the kids, who really don’t know any better. Another gray area. Your ex gives your child a toy that can call out, maybe to the ex directly. Or, if the toy doesn’t explicitly say it can call out, maybe the ex can go to a site and listen in on conversations in the home where the child now lives. Chances are the toy is not informing the child or the parent that any of this activity is happening. There’s no notification, no consent.
Grooten: Some smart toys that children get have microphones, and some have cameras as part of that, and these toys can be controlled remotely. Something that I know has happened sometimes is that an abusive partner gives these devices as presents to children, and the children use them. This is a way they can monitor remotely. Again, I should say, most abuse is far more mundane. Just as with stalkerware, most abuse, even if it seems technical, is often not particularly technical. But IoT abuse does happen, and it’s something we should be aware of, and I think IoT manufacturers should be aware of.
Vamosi: What about the parents who just want to monitor their children’s internet use? There are legitimate apps for that, right?
Grooten: The most important thing is that if a parent wants to monitor a child, it should not be hidden from the child. The child should get a notification on their phone. If they’re okay with being monitored, that’s fine. I think it’s a normal situation that a 10-year-old or 12-year-old knows that their parents are monitoring them, and there’s a point in their life where that should stop, and that makes it a complex issue. But for young children, I think that’s okay, at least if you use monitoring software, parental control software, that is clearly visible on the device, so the child knows about it. And even if a child is deemed too old for this kind of thing, they know that the phone is being monitored, and they can maybe talk to a friend privately while leaving the phone somewhere else, or just not send that message on the phone; at least they’re aware of it. Stalkerware is different. Even though a lot of stalkerware, if you go to the websites where these things are sold, claims that it’s for parental monitoring, they emphasize that it’s hidden on the device. And I don’t think there’s any reason why parental control software should be hidden. And as you say, that’s not just my opinion; within the Coalition Against Stalkerware, we wrote the definition of stalkerware, and one of the bullet points, which Lodrina mentioned in our talk, is that there’s no notification of the software being active on the device.
Vamosi: Given that we have a reasonably good model of stalkerware, it seems to me that the app stores could do so much more. I mean, can’t they filter out, or at least force vendors to disclose, certain activities from these apps that might be considered stalkerware?
Grooten: Yes, there’s potential there, but with all the usual caveats, including the controversy surrounding Apple. I mean, I wouldn’t recommend that they install new monitoring onto devices; I think that’s creepy and there’s a lot of potential harm there. These companies see quite a bit already, and they should be able to have some understanding of what’s going on. It’s something you have to be very careful with, because I’m not sure they can always be certain that something is stalkerware, and even if it’s just a warning, I think a false positive can be very harmful. You know, if you suddenly get a message that maybe an ex-partner is monitoring your device, your phone, that can be very intimidating, even traumatizing. So, yeah, I would recommend, if Apple and Google and others are listening: make sure you keep conversations going with people working in the anti-abuse space, which I know is already happening.
Vamosi: So stalkerware remains very much caveat emptor, or buyer beware. It’s hard for the tech giants to screen for it. The stalkerware apps will always have lawyers who wordsmith things in such a way that it all appears to be a gray area.
Grooten: There are some conversations going on around that. Google has banned stalkerware from being advertised using Google Ads; in practice we’re not seeing a lot of it being fixed there. It’s kind of a cat-and-mouse game, because, you know, for me it’s easy to say these companies are bad, but they also have lawyers, and their websites are written very carefully to make it appear that what’s happening is legitimate. So it’s probably not as simple as blocking them, as much as I would personally like for Google and other platforms to do more, and I’m sure they can do more. But it’s not as simple as, okay, we’ve done it today, and then it stops happening.
Vamosi: One of the purposes of their talk at Black Hat was to open the discussion in the security community and get technical people talking about abuse awareness. There’s a recognition that software engineers can be part of the solution. For example, software engineers now include accessibility features so that people with disabilities can use and benefit from their work. Perhaps software engineers can also play a role in designing more safeguards against surreptitious spying.
Grooten: Exactly. That’s very important. I think these two are possibly somewhat linked: making our technology accessible, and building this in, just like security, so these things are built in right from the start and not fixed later on. Yeah, it’s definitely something that engineers should consider: the human impact of any device, or any piece of software, that they build. Could this be abused in an IPV situation?
Cherne: For the vast majority of people in attendance, or maybe who are listening to this podcast, there might be people who are in product organizations, and what I would challenge you to do in those organizations is think about what some of the unintended uses of your software are. I’ve seen this put a number of different ways. My friend C. Todd Lombardo, who does a lot of writing around product management and product development, said it very succinctly: who does your product hurt? That’s a really powerful statement.
Vamosi: We’ve seen early examples of this, where landlords with IoT-enabled furnaces and air conditioning have changed the settings on tenants who are behind on their rent. What if we extrapolate that out? What if a single app cut off your access to social media, or work, simply because a former partner wanted some form of revenge? Or what if all of your Facebook posts ended up in the hands of a political operative who could further target you with their messages? That really happened with Facebook. In reaction to this, in 2018, Aaron Z. Lewis, a young designer, wrote in his Twitter feed: in light of the latest Facebook scandal, here’s my proposal for replacing design sprints: Black Mirror brainstorms, a workshop in which you create Black Mirror episodes. The plot must revolve around the misuse of your team’s product.
Cherne: I like to think about that quite a bit. You know, not just who would be hurt, but what if your product was used in a Black Mirror episode? Like, how dark would it get? What are those unintended outcomes, and how could it be abused in ways that are not central to what you’re trying to do?
Vamosi: Some of the feedback that Lewis received on his tweet included the idea of Black Mirror brainstorms being a pre-mortem technique, used by some design teams to imagine everything that could go wrong in a project before it even starts. Another suggestion was a traditional red team/blue team design exercise, where the Black Mirror brainstorm would create a red team with an adversarial role, challenging the blue product team to mitigate the possible risks.
Cherne: What if you were in a Black Mirror episode? Or what if you were in the news for something that is not your main line of business, because somebody unintentionally used your product the wrong way? These are some of the things I like to think about.
Vamosi: My problem with technology is that every time Windows issues an update, of course I do the update, but then later I have to go back to my privacy settings to see what Microsoft changed. Often nothing has, but sometimes it’s like, oh hell no, I don’t want all my data automatically saved to OneDrive, or no, I don’t want Microsoft listening to all my calls and text messages and emails. I shut all that off, and it seems like they creep back in at some point, at which point I shut them off again. And it’s not just Microsoft. The same with social media apps: I have to go through various settings to make sure that I’m not tagged without my permission, or in some cases even block a person now and again. I know these situations go against the free and open flow of information that hackers and designers may have intended, but I also know there have been egregious abuses in the past.
Grooten: I think everyone working in security should understand, everyone working in tech in any capacity should understand, the impact their work has on society in general, and on vulnerable groups in particular. One of those groups is abuse survivors, and I think we have a responsibility to understand the impact of our technology. I’m not saying that everyone should volunteer their time working on it; if you do, that would be great, but there are other things in the world to focus on. But I think there should be some awareness. As Lodrina put it, and I’m paraphrasing: you don’t want your technology to be used in a Black Mirror script. And you don’t want your technology to be used by an abusive partner. It’s the same with not wanting your technology to be used by a government to spy on human rights activists; that’s a different, somewhat related topic that I’m also very passionate about. These things, I think, you should be aware of. And if you are in a team and you build a product and you think you just don’t have that understanding, reach out and try to build contacts; there are a lot of people working in this space. Have a brainstorming session to say, hey, we have this prototype; do you think someone can abuse it?
Vamosi: In security, we talk a lot about threat models. There’s the corporate threat model, which helps organizations determine what security services and tools are needed. And then there’s the personal threat model. Everyone’s threat model is different, because we each have different skill sets, occupations, or life experiences. Threat modeling also applies to technologically facilitated abuse, and it’s just as complex. It’s not just the tech that needs to be considered; it’s the whole environment that the survivor finds themselves in.
Cherne: So, while in the talk we do cover threat modeling for the survivor, it’s something we probably could spend a lot more time on if we had longer on stage. It’s something that I’ve talked about with a lot of either small businesses, completely separate from this, or some student groups, and we talk about threat modeling as, you know, when you plan things out: what is the worst thing you could see happening to you? Okay. Now, how are you going to mitigate it? And maybe that threat model you created three years ago is no longer valid today; maybe it changes from day to day.
Vamosi: People change, and by that I don’t mean that maybe the stalker won’t stalk anymore. Hopefully they stop stalking. But what I mean is that as we get more information, we change. We change the way we view our friends, our enemies, our world. And so our threat model needs to change as well.
Grooten: I’m happy to answer this, but I want to make it clear that these answers kind of come from a paper by Karen Levy that we link to in our talk. One thing to keep in mind is that most of our security models are about a remote adversary: you know, a cybercriminal in a foreign country or something like that, or even someone on the same public Wi-Fi. Like I said, that’s a bit rare, but people are thinking about it. It’s different when the adversary lives in the same house and has physical access to the device.
Vamosi: There is no definition of a perfectly healthy relationship. At least, I don’t think such a model exists. But in the absence of that, there are near-perfect relationships with other people, and that’s what I think most of us strive for. Sure, we had a fight, but fights, too, can be healthy. The absence of fighting in a relationship can certainly be unhealthy. It means that there are topics or situations that just aren’t being discussed head-on, one to one, with each other, and the other party might be harboring ill will or bad feelings about things that could be better addressed if they just talked about them out loud. This isn’t to say that you should avoid all relationships. No, not at all. Rather, you should enter into a relationship with open eyes.
Grooten: Another very important thing to keep in mind, also from the paper, is that relationships change. Someone may be in a relationship with someone, and they trust them, and they may give them access to a device or to an account. That can be a good thing; I mean, having a backup person in case someone might die or something like that, that’s not a bad idea. But it should also be possible at any time to withdraw that, and people should be made aware of who else has access to something. That’s not uncommon. We heard, for example, of someone who still shared Facebook passwords with someone. It’s not something I would recommend, but people get into relationships, and then the relationship ends, and it turns out they never changed their password, so the ex can still read messages and things like that.
Vamosi: So we’ve focused a lot on devices and apps, but there’s also the social media component. For example, I have followed Martijn for years as a security expert on Twitter. But I also know a few other things about him, the public things that he’s chosen to share online, such as that he lives in Greece, or that Martijn and I both run marathons. So I could probably strike up an online conversation with him around that. In most cases, it would be harmless, even healthy, to reach out to another person with a shared interest. I would assume Martijn would look me up on Twitter and see who I am before continuing in any further detail, or Martijn could just lock down his account so that only a few people, the people he knows and trusts, can see his social media posts. So, there are privacy settings, but do they really work, and how effective are they against stalkerware?
Grooten: It’s a very good question. In general, social media privacy settings are very important, and very complex, but most social media apps do have pretty good privacy controls. I mean, we may think the social media apps are invading our privacy more than they should, and I definitely think that, but from an IPV point of view, their controls tend to be reasonable, though not always the right ones by default. So if this is a concern for you, go through all the social media apps you use, your Facebook, your Instagram, your Twitter, your TikTok, which I recently installed for this very reason, to understand their privacy controls. They tend to be quite good, and you need to think about what’s possible. For example, on Facebook you can stop people from tagging you. It’s not uncommon that an abusive relationship ends, but the two people still have common friends. They keep their profiles private so the ex doesn’t see anything, but then they go to a party and a friend tags them, and that gives the ex-partner information about them. Or, for example, you mentioned running: they may run together, and somebody shares that they were part of that run and tags them without approval. That’s definitely something people may want to turn off.
Vamosi: There are innocent scenarios that you might find yourself in. There’s the subtle, like having your friends tag you in a photo, but then having someone else, who might be threatening, see that tag and somehow take action against you. Things like tagging others seem like a great idea at the time, but later, maybe not so much.
Grooten: Yeah, it may not be in your personal threat model, but in this case Facebook makes it easy to tag someone. So even if you don’t post an activity yourself, one of your Facebook friends may post about an event you’re involved with and tag you, and then all of their friends know that you were there. That may be fine for most people in most relationships, but it probably isn’t in some, and definitely isn’t in some cases, and that’s why it’s important that these controls exist. As a general rule, there’s quite a lot that you can control, but the defaults are not always the best from an IPV point of view.
Vamosi: So how do personal threat models work for security people? I would think that we would be at maximum security level, given that we know all too well what could go wrong. The reality is that we probably aren’t doing everything that we advise. We can, in certain situations, let our defenses down a little bit. And we have to weigh those pros and cons when we do.
Cherne: Survivors are going to have different threat models. Take myself, someone I would like to consider a really adept security practitioner: I might do things in my life that other people would never think of. For example, I share my location with my husband. Now, this is not something that I woke up one day and decided to do. If you had asked me a few years ago, would you just leave your location turned on and shared with somebody else, even somebody you’ve been married to for years, I would have thought you’re crazy. But it turns out there was one time when we were moving between two different houses. We were taking different vehicles, shuttling stuff back and forth all the time, and just really needed to know where each other were. After that experience, which lasted a couple of days as I commuted and did all these weird multimodal transportation things to deal with the traffic of Boston, I just left that location sharing on. And, you know, if you had asked me a few years ago, is this something that you would do, would you share your location with other people, even people in your family, I would have gone, no, that’s a horrible idea, why would I turn that on? But in my circumstance, it’s something that worked for me and has great utility to me.
Vamosi: Lodrina, in this case, had an awareness. She knew what she was doing, she knew the person she was choosing to share her information with, and she weighed the pros and the cons. Most of us, however, don’t ask all of these questions, or we don’t ask enough. Most of us side with convenience: why go through all the extra steps of locking down our apps and our devices? And there really is no one answer that fits every scenario; you have to work out each of these situations one by one.
Cherne: When I teach my forensics classes, and I teach a six-day Windows forensics class, whenever my students ask me questions about this case or that scenario, I always say, it depends. I know this answer is going to frustrate you, but it depends. Just knowing that this issue is not black and white, that it really is many shades of grey, and getting these ideas across, getting them out in the world, having these conversations, already feels like such a win.
Vamosi: One area where women in particular find themselves trapped is that the man in the relationship typically owns all the property, or has the bank accounts, or has the credit rating. Maybe that’s something they entered into unintentionally. In fact, it wasn’t until the Equal Credit Opportunity Act of 1974 that women in the US were able to get their own personal credit cards. From this, we can see both good scenarios, where a couple today can share their accounts openly, and bad ones, where one person uses a shared account as leverage or as a means to spy on the other party. So finances should definitely influence one’s personal threat model.
Grooten: Finances and banking are very important, and this is not trivial to develop protocols for.
Vamosi: Related to this, but not always thought of as such, are our mobile phone plans. You have to demonstrate all this financial stuff upfront just to get a cellular plan. Also, it doesn’t always make sense for everyone to have separate mobile plans; family plans are far less expensive. But again, if a person needs to leave an abusive relationship, how do they sever that connection in terms of technology and services?
Grooten: There’s some legislation happening in the US at the moment around shared phone plans. It’s often hard to escape a shared plan, and people are working hard, including members of Congress, to make it easier for someone to get off a shared plan if they’re trying to leave an abusive relationship.
Vamosi: You might be thinking that all this advice is just common sense, or that there are probably some universal best practices that can be applied. But really, in reality, it’s much more nuanced and complex.
Grooten: Abusive relationships are complex. They are not continually abusive; people think, okay, maybe he’s changed. People have invested in relationships. There are lots of reasons why people don’t leave. And if you don’t work in security, this may seem hard to understand, because we’re so focused on security and privacy in our minds, but for most people, if your partner says, oh come on, you can give me the password for your phone, you trust me, don’t you, most people will do that. They would see not giving it as a sign of not trusting someone. So this is how a lot of people have access to their partner’s device. Things are complicated. These relationships are complicated, and I think that advice like, just don’t share your phone, don’t share your password with your partner, without acknowledging the complications, isn’t very helpful.
Vamosi: There’s another layer here that we haven’t really talked about: there aren’t easy prescriptive answers to technologically facilitated abuse questions, nor can you apply the typical security best practices either.
Cherne: This might seem counterintuitive to a lot of technical folks, but some of the measures that we consider basic security hygiene, things like set a unique password on a device and do not give it to anybody, do not share it, just don’t work for people who are in these survivor situations. If you are in a situation of intimate partner violence, domestic abuse, gender-based violence, we’re not just talking about somebody who’s being physically hurt, being hit, that kind of thing. This can include things like financial abuse, controlling behaviors, emotional abuse, and it could be that somebody is in a situation where they do not feel empowered to leave or don’t feel safe leaving. And in that situation, something as simple as uninstalling the spyware app can escalate abuse.
Vamosi: Think about that: just changing a password, or better yet removing the app, could escalate into physical violence. It could enrage the abuser and threaten the survivor in that relationship.
Grooten: There are two main reasons why you may not want to remove it in the first place. The first, as I mentioned, is the risk of escalation. Someone will discover that the stalkerware was removed, because it stops working, and if they used it to control you, they may look for other ways to control you, ways that may be worse. The second reason is that if you do want to bring something to court at some point, then removing the software removes evidence, so that’s also something to keep in mind. And maybe a third reason that I’ve heard sometimes, which is also about evidence: when the abuse is really put behind you, when the relationship is really finished and contact is completely broken off, some survivors just want to know what happened back when all this weird stuff happened on their phone. For their own peace of mind, it may be helpful for the stalkerware not to have been removed. But yeah, that depends on the individual situation.
Vamosi: So again, the advice must be customized to the individual in that situation; their threat model must include the possibility of escalation.
Cherne: Well, think of somebody who’s in a vulnerable relationship who suddenly one day has a new phone password and won’t give it to their abuser. If that behavior is different from what has been the norm in that relationship, frankly, it’s just not something that’s going to fly. So, digging further into the idea of this very specific threat model: I don’t know how many people out there in our world, in the tech space, have thought about, wow, I might have friends, I might have colleagues, who can’t have a safe and private password on their own device, who can’t have a private account on their own computer. Really opening people’s minds to some of these concepts, you know, in the 30 minutes that we have onstage, if we can open people’s minds and educate them on some of these things that survivors are experiencing, I would consider that a win.
Vamosi: Beyond sheer technical solutions to stalkerware and technologically facilitated abuse, there are human beings, and by that I mean there are organizations run by human beings, that can help.
Cherne: One of the other important things to know about this situation is that there are resources out there to help people. There are domestic violence hotlines, and even beyond the technical aspect, some hotlines may be able to deal with technology more than others. I think the important thing to know is that not only can these hotlines be useful if you are somebody who finds themselves a survivor in these situations, they can also be helpful for people who might have friends or family members in these situations. And lastly, if somebody thinks that they themselves might be exhibiting abusive or controlling behaviors, oftentimes these hotlines can help, or recommend who can help, in those situations. The biggest thing is really knowing that if you feel you are being surveilled and you’re in an unsafe situation, you are not alone, and there are resources out there that can help.
Grooten: There’s the Clinic to End Tech Abuse (CETA), which is part of the clinical and community support work at Cornell Tech in New York. Slight disclaimer: I’m doing some volunteer work for them, but they have some great resources on their website, with a lot of guides on things like securing your Google account, your Facebook account, and so on. They are very good guides on tech abuse. And there are similar ones in Australia and from Refuge in the UK. We have a website with resources, including links to the things we mentioned. There is a very brief guide, along the lines of, hey, I worry I have stalkerware, what should I do, which we wrote earlier this year with a lot of help from people from all backgrounds, from technology experts to IPV advocates.
Vamosi: Third-party organizations are great. They have experts, they have good advice. But what happens when your best friend confides in you, the local security expert? Or maybe you notice something that seems off in a friend’s relationship before they even notice. I’m not a human factors expert, but if someone came to me in that situation needing my help, what should I do?
Grooten: Well, I don’t think I mentioned this in the talk, but I’ve given talks on this subject in various settings, and people sometimes come to me afterwards saying, I find it interesting and I want to help, and I’m pretty good at reverse engineering and stuff like that, I want to reverse engineer stalkerware apps. It’s always nice if people want to help, but I think the best thing you can do if you want to help is read up on the other part of the problem. Read a book, watch videos, read websites from the likes of NNEDV to understand how abuse works, to get some kind of understanding of things that may not be obvious, like the fact that some people can’t leave a relationship even if they know it’s abusive, even if they would like to leave, for a number of reasons.
Vamosi: True. If you want to get good at this, like anything else in security, you should probably read up on the subject, or attend some workshops, before even trying to help.
Grooten: And these things need to become a kind of intuition, if you want to work in this space.
Vamosi: If you’re serious about helping out, Martijn has some recommendations.
Grooten: There are a few things to keep in mind. The first is to make sure that at every step they control what you do and what happens. So if you do happen to find stalkerware, say you run an antivirus scan on the phone, even if the scanner flags something, it’s up to them to decide whether they actually want to remove it. Because, as we mentioned, abuse sometimes escalates when stalkerware that is present is removed. It’s important that the decision is theirs, not yours. The second thing to keep in mind is that people often self-identify as having stalkerware. They know that they are, or were, in an abusive relationship, they’ve read about stalkerware because it’s been in the news quite a bit, and they think, it must be happening to me. In practice, it’s usually something more mundane, like a shared password, or just someone who occasionally has access to the device. You don’t need stalkerware to read someone’s messages if twice a week you can pick up their phone. So make sure your focus is broad, not narrow. We share some resources on tech abuse. And very importantly, understand that someone this happens to, usually someone in an abusive relationship or who was in such a relationship, is often traumatized in some way. Which means that sometimes these people can be extremely trusting, like you’re the one person who can solve all their problems, or they make you feel this way, and sometimes they can be extremely distrustful.
Vamosi: No matter how well you think you’re helping someone out, always remember that there are professionals and experts out there.
Grooten: And that’s both completely normal and something you need to keep in mind when you are dealing with someone with some kind of trauma; that’s what you need to be aware of. We shared some helpline numbers in our talk, primarily for survivors. But if you’re helping someone, even if you’re only being asked for help with a technical issue, and you’re just not sure, like, is what I’m doing really in their interest, what should I do, call these numbers and ask for advice.
Vamosi: And traditional abuse organizations have come into the 21st century and now recognize the difficulties that technology can bring to any relationship.
Cherne: As somebody who has interacted with people in the domestic violence space, in the survivor support space, it’s my impression that a lot of these organizations are getting better at dealing with technological issues. Certainly there are organizations at scale who do this work, and NNEDV is one that we point people to in the talk. The Coalition Against Stalkerware does some work at the corporate level. There’s also Operation Safe Escape, an organization that helps people through the process of recovering from these abusive situations, helping people get safe with their devices.
Vamosi: Whoa, so there are organizations that can help individual survivors of abuse clean up their devices so they can use them again. That’s really good to know.
Cherne: That being said, the help that is available to you is really going to vary, potentially based on your geographic location. And I’m speaking right now maybe to those people who are outside of the United States, who maybe can’t contact NNEDV and their 1-800 hotline for advice, which is really excellent. If you are in a different geographic location, maybe resources in your language aren’t available, maybe your police department doesn’t understand that somebody tracking you on your phone is a physical safety issue. And, you know, in our talk we mention statistics on domestic abuse according to the CDC, which brings up another important facet: this being a health issue. This is not only a safety issue, this is a health issue. I just have to say, keep trying, whether it is a support line, whether it is a police officer, whether it’s a federal agency or a friend. Know that you are not alone. There are people who care. And while really, in my mind, I’m speaking to technologists, I also want to acknowledge that there might even be people in the room when I speak this week who are going through this situation and looking for help. So, you’re not alone; there is help out there.
Vamosi: Intimate partner violence is very hard to discuss in public, and intimate partner abuse includes not just physical abuse, which others might see, but also emotional abuse, which others might not see. Part of our reticence in talking about all of this in public, I think, is that some of us simply don’t want to know about it. And part of it is, I think, that if some of us are open to hearing about it, a few of us might come to identify with it. If we do, what’s the next step? How do we get the support we need to get out of that situation and begin to heal?
That’s why talks like the one given by Lodrina and Martijn at Black Hat and other conferences, despite what celebrity security researchers might think, are so important: they can at least break the ice and get more people, people in the security community, comfortable with admitting that these messy, sometimes ugly interpersonal human situations exist. And perhaps by recognizing that, as we’ve heard, we can start to see how our technology might facilitate, even escalate, that abuse.
In the show notes, I’ve shared a resources link from Martijn and Lodrina’s presentation, or you can find it directly at lapsedordinary.net/blackhat2021. I think it goes without saying that in security, it’s always important to make sure technology, any technology, isn’t used to hurt someone. That’s a goal I think we can all work toward together.
As I produced this episode, the United States government imposed fines on three individuals who were part of Project Raven, a story that the Reuters news service first broke a few years ago, about former NSA employees who went to work for the United Arab Emirates in order to spy on US citizens. This case is an example of a very specific application of technologically facilitated abuse. And while in this episode we only talked about the more common, more intimate abuses among domestic partners, these state-sponsored abuses are along the same spying and surveillance spectrum as stalkerware. I think the more light we can shine on any of these surveillance abuses, either state-sponsored or domestic, the closer we’ll be to ending technologically facilitated abuse in general.
Let’s keep this conversation going. DM me @RobertVamosi on Twitter, or join me on our subreddit or Discord. You can find the deets at thehackermind.com.
The Hacker Mind is brought to you every two weeks, commercial-free, by ForAllSecure.
For The Hacker Mind, I remain the “do no harm to others” Robert Vamosi.