Nothing Is Unhackable
From “smart” homes to educational apps for remote schooling, from employee-monitoring software to the geolocation devices we carry in our pockets, the extent to which our intensely surveilled life has become normal would have shocked even the most future-focused mid-twentieth-century creative. Consider that the setting for Philip K. Dick’s 1956 short story The Minority Report—made into a 2002 film by Steven Spielberg—was Washington, D.C., in 2054. In Spielberg’s version, art director Alex McDowell envisioned a city where computer interfaces controlled by gesture float in mid-air, where robots perform retina scans to inventory humans, and where the authorities profess to “know” what you're going to do before you do. To date such a world to 2054 now seems quaint, because it has all already happened, and capabilities formerly reserved for deep-pocketed state actors are now commercially available to your next-door neighbor.
Should we be afraid? Is any of this reversible, and would we want it to be? We were happy to discuss these and other hard questions this past week with the extraordinary Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, who will join us virtually at Concrete Love this October. Eva has extensive experience in IT systems and security in Silicon Valley, and is laser-focused on targeting malware, as well as organizing and educating citizens about digital privacy, security, and why we all could stand to get smarter about it.
Tim Leberecht: Eva, thanks so much for your time. Tell me a bit about your work with the Electronic Frontier Foundation. How long have you been at the foundation? What are the accomplishments you’re most proud of?
Eva Galperin: I’m the director of cybersecurity at the Electronic Frontier Foundation, and lead a team called the Threat Lab, where we research threats to vulnerable populations. For most of the last 14 years at the Electronic Frontier Foundation I’ve been working on threats to journalists and activists, tracking APTs—Advanced Persistent Threats—which is just a fancy security way of saying state actors that spy on people. More recently I moved into the realm of intimate partner abuse and helped found an organization called the Coalition Against Stalkerware, which has been helping to increase the detection of stalkerware and to secure devices against this type of commercially available software that people use for spying.
And most recently, I’ve been working on threats from physical trackers. So Apple put out this thing called the AirTag, which is supposed to make it easier for you to locate things you’ve lost or that have been stolen. But it is extremely easy to abuse, or to use as a physical tracker to spy on people. And the AirTag is not a particularly new technology. For example, before the AirTag, there was a product called Tile that does much the same thing. So I’ve been working on getting those companies to change the way their products are built so that it’s harder to use them for stalking.
TL: What exactly is an Advanced Persistent Threat?
EG: Well, most of the time, if you are a systems administrator, or you work in information security, and you’re trying to lock down a network or a corporation, the kind of threat you are facing is usually limited to criminals and hackers and scammers—people whom you can deter by simply making it hard enough to break into your system that they move on to the next guy. An advanced persistent threat, which is often a government, is somebody who will keep trying, who will essentially make it their job 9-5 to compromise a specific person or a specific organization.
And that takes a very different set of mitigations to defend against.
TL: You already mentioned state actor surveillance, nation state surveillance, and of course, that’s in the news right now because of the Pegasus scandal, revealed by a coalition of journalists. Were you surprised? What do you make of the whole scandal? What’s going to be the fallout?
EG: I’m not even slightly surprised. I have been working on nation-state spying, and the increasing commodification of the kind of software that is being sold specifically to governments and law enforcement, and that gets sold to authoritarian states and used as a tool of abuse, for the better part of a decade. And the NSO Group is not even the only company that does this. Prior to this we put a bunch of pressure on Gamma Group, which spun off into FinFisher; on Hacking Team, which is located in Italy; and also on NSO Group. So this is not new. This is not even the first report on abuses by this one particular company. What’s really different is the sheer number of possible targets—the list that Amnesty International has. That really gave rise to a nearly endless supply of stories about people who have been targeted by this very scary and powerful software. And that’s what allowed it to catch on in the public imagination.
TL: It’s fascinating that the technology exists. So what do you think of NSO’s response, basically blaming the BDS—Boycott, Divestment and Sanctions—initiative? How do you make sense of a company like NSO Group being allowed to operate?
EG: Not only is NSO Group perfectly legal and allowed to operate, but it essentially operates with the blessing of the Israeli government. This was also true of the Gamma Group in Germany and Hacking Team in Italy. After their abuses were documented and reports about them came out, they still got permission from their governments to export their products to governments that we knew engaged in human rights abuses.
And that is particularly true of the NSO Group. It would be extremely weird if the only possible motivation for opposing NSO Group’s enabling human rights abuses was support for BDS. Because as far as I know, BDS does not include Germany, or Italy, or any of the other countries that have had similar companies with similarly cozy relationships with government actors. I think Citizen Lab, Amnesty International, and, for that matter, the Electronic Frontier Foundation have been extremely consistent in their opposition to companies that provide a product to governments and law enforcement that allows them to engage in unbridled human rights abuses. And that’s what all of these companies have in common.
Photo by Natalya Letunova
TL: Help us understand how it’s possible to spy on mobile phones. Apple, for example, prides itself on being a safe, protected, privacy-based environment—unhackable, basically. Are companies not doing enough to protect the phones? Is there anything they could do to better protect users?
EG: Nothing is unhackable. It is extremely important for everyone to understand that nothing is unhackable. The more complicated the device, and the more complicated the software, or the more open it is to interaction with other applications, or research by security researchers or hackers, the more likely it is that you have created an additional attack surface.
Playing offense is easy, because all you need to do is find a vulnerability. And playing defense is hard, because you need to defend yourself on all fronts all the time.
In fact, we are fairly certain that the kind of capability NSO Group has is a capability all of the Five Eyes countries have had for many years. We’re fairly certain the U.S. has it, the UK has it, that Canada and Australia have it, and that China and Russia may have it. What is novel is that in the last 10 years or so, companies have come up that take advantage of these vulnerabilities and sell this capability to any tin-pot dictator with a million dollars to spend.
So suddenly we’re not just looking at the U.S. or Russia or China having these capabilities, but also Kazakhstan and Azerbaijan, and Mexico and Ethiopia and Saudi Arabia and UAE. And that is particularly disturbing, because we have human rights abuses linked directly to this kind of surveillance, over and over and over again.
TL: So how is this gonna play out? The Moroccan government is suing Amnesty International and other players involved in revealing the Pegasus software.
EG: There are a lot of lawsuits flying fast and furiously, and I’m very excited about them. The Morocco lawsuit is largely baseless. It is a vexatious lawsuit that exists largely to scare Amnesty International into silence and to keep them from publishing some more. And I hope it is dismissed, summarily and quickly.
On the other hand, some really interesting lawsuits have been launched against NSO Group over the last few years, including the lawsuit launched by WhatsApp for NSO’s having taken advantage of vulnerabilities and spied on people using WhatsApp. There were some very interesting lawsuits fairly recently in French court, against companies like Nexa Technologies and Amesys, both of which provided technologies to Egypt and Libya before the Arab Spring, and which helped to enable many well-documented human rights abuses.
And we’ve actually seen the French court take action, specifically against the executives in charge of those companies at the time. So I’m hoping that sets a precedent, and it makes company executives think twice before selling to any company or country with a long history of human rights abuse.
Photo by Sigmund
TL: That may be one positive outcome from the whole thing. Do you think it will raise awareness of this issue, or lead to different policies? Is this a seminal moment in the history of spyware, or just a blip?
EG: I think this is just another blip. I mean, I want to believe that soon we will have a moment where we put our foot down and say, “No more. We’re not taking any more of this.” But I think the temptation to spy is simply too strong, especially for authoritarian governments. And the best we can do is put an end to companies that sell a turnkey service that makes it simple, cheap, and easy to take advantage of this kind of capability. That’s probably the best we can hope for. But that is a situation that leaves us much safer than where we are now.
TL: So we’ve been talking about state-actor surveillance, but on a domestic level, we spy on our partners or friends or colleagues, right? You gave a TED talk about stalkerware and founded the Coalition Against Stalkerware. Can you tell us more about stalkerware, and what aspects of it we need to be aware of?
EG: Stalkerware is commercially available software that a person can buy and covertly install on someone else’s phone that gives them full access to whatever is going on on that phone. This is very tempting for abusers, for the same reason it’s very tempting for governments to do this kind of spying. The tracking devices we carry around in our pockets give away a whole lot of information about our lives: our browsing history, all of our text messages, including our end-to-end encrypted messages, all of our pictures, all of our new posts to social media. Essentially, access to somebody’s phone is the next best thing to access to somebody’s mind. It’s incredibly powerful.
And this kind of power is available not only to governments, but to unscrupulous individuals who use this power in order to control people that they’re either abusing or stalking. There’s an entire industry of software that does this. And so a couple of years ago I brought the antivirus industry together to treat this kind of software as malicious, so that now, if you install an antivirus program, it is much more likely to actually find this software on your device, let you know that it’s there, and give you the option of removing it. That really moved the needle on stalkerware in an important way.
TL: It’s a blurry boundary, perhaps, between stalking and curiosity. I mean, we are all Googlers. And we maybe follow people on social media and look them up. But stalkerware presents a whole different dimension. Where do you draw the line? What behaviors cross the line?
EG: Well, the good news is that the line is not blurry at all: it stops at consent. The posts you make to your social media profiles are things you have chosen to share with the world. Even if somebody you don’t like sees it, if you have chosen to share it with everybody, it is not unreasonable for somebody to go look at it. If you need to get your physical hands on somebody’s phone and covertly install software on their phone in order to track them and see what they are doing, this is absolutely a circumvention of their consent. This is getting all kinds of information they would not give you otherwise, or make public. And it is absolutely not okay—no matter how bad they are, no matter how much you feel the need to control them, no matter how justified you feel. In most places this is illegal, and certainly everywhere it is unethical, and it is abuse.
TL: And with the Electronic Frontier Foundation, you’re instigating policy changes, but also trying to raise awareness for the individual user, is that right? It’s a two-pronged approach.
EG: Yeah, the technical approach has, of course, been to talk to the companies. The policy approach has really revolved around a couple of different things. The first is definitely raising awareness, talking to the people who develop new tools in order to keep them from building things that can be used as stalkerware. And also making it more difficult for stalkerware vendors to make their products available to ordinary users, and making stalkerware easier to detect. And then the last thing I have been working on from a policy perspective has been awareness-raising for law enforcement. Because once we have all of these tools that allow you to be aware of stalkerware and to know when it’s present on your device, a lot of people wonder what to do next. Sometimes they will go to law enforcement, and it’s essentially a crapshoot whether or not you get someone in law enforcement who even understands what is happening, much less how to investigate it, or create some sort of consequences.
And I am sick of hearing the stories of people who come to law enforcement and then get treated badly—they get gaslighted, they get treated like they’re crazy. They get treated like this is somehow not illegal or not worth investigating. I don't want law enforcement to have any excuse, saying that they simply didn’t know what it was, or they didn’t know what to do next.
TL: Do you think we live in a surveillance society? I’m thinking of Shoshana Zuboff’s book The Age of Surveillance Capitalism. Is it a reality that we can no longer reverse, just the world we live in—and we can only try to manage it or navigate it and behave accordingly? Or do we still have a chance for a fundamental reset?
EG: Yes and no. I think to some extent, the horse is out of the barn. A lot of our lives are now digital. And controlling that digital life is increasingly complicated and difficult. But at the same time, I don’t believe privacy is dead. If privacy were really dead, governments and law enforcement wouldn’t be trying to kill it all the time. I would have much less work to do if privacy were already lying on the floor twitching. There is a lot we can do to take back our privacy. And it’s incredibly important that we do so, and that we don’t give up.
When I do my security training for vulnerable populations, when I talk to journalists, and when I talk to activists, one of the most disturbing bits of pushback I sometimes get is, “Why should I keep anything private, because the government can already see everything all the time anyway?” And that’s simply not true. If the government could already see everything you do all the time, they would not be pushing policies that allow them to see more. They would not be pushing for laws that give them more power. And the impression that they can already see everything and know everything means that they don’t actually have to do it, because that impression is what causes you to self-censor. And as far as I’m concerned, that is the worst possible outcome for making change happen in the world.
TL: Let’s talk a little bit about technology platforms. Some people say that blockchain encryption might protect privacy. And then there are people saying, well, that’s until quantum renders blockchain obsolete. What’s your take on the race towards the ultimate protected, decentralized technology, for example?
EG: It really depends on what you mean by blockchain and what you mean by encryption. Encryption is simply a process. And when somebody says something is encrypted, you should ask: What is encrypted? How? Why is it encrypted? What is it encrypted against? Simply saying “encryption” tells me absolutely nothing about what is happening or how safe or secure it is.
But what we have seen is a lot of push from governments and law enforcement to allow them backdoors to end-to-end encrypted conversations. They say they need to be able to see WhatsApp messages or Facebook Messenger messages or Signal messages. And they need to be able to get in if they think it’s really important—in order to stop terrorists and pedophiles. And so they need to have this power. The FBI, for example, every couple of years, starts a campaign talking about how their surveillance capabilities are going dark, so they can’t prevent bad things from happening, and so they need these backdoors. This is a bullshit argument. You cannot create a backdoor to end-to-end encrypted communications that only good guys can go through. Once you create a backdoor, it is inevitable that hackers will find it, that people like NSO Group will find it, and that they will sell it to people like the Saudi government, and they will use it for things like spying on Jamal Khashoggi before killing him in Turkey.
This is the inevitable outcome of backdooring end-to-end encrypted communications. And we simply cannot allow it to happen no matter how much law enforcement waves the scary prospect of terrorists and pedophiles around. Law enforcement has many alternative methods to track both terrorists and pedophiles without engaging in this increasingly risky endeavor that would destroy privacy and security for everyone, including you and me.
TL: So as professionals and citizens, what can we do? What is safe? What should we not do? And then also, what is our responsibility and duty? What can we do to basically prevent further spying?
EG: There are a couple of things almost everyone can do. Telling people how to maintain their privacy and security online is very similar to teaching basic hygiene. And so while there are super-specific cases where there are very particular things you need to do, there are a few things which are good for everybody to do, and will prevent 90 percent of all compromises. That includes securing your accounts. The way you secure your accounts is to make sure you’re using a password manager, and that your password manager is creating a strong and secure password that is different for every one of your accounts. Turning on two-factor authentication so that even if somebody manages to steal your password, they still need that second method of authentication to break into your account, is also very important. Not all two-factor authentication is created equal. For example, SMS two-factor authentication (when the second factor is sent to you in an SMS message) is less secure than when the second factor is generated in an app like Google Authenticator or Authy. But having some level of two-factor authentication turned on is definitely better than having nothing.
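Eva’s distinction between SMS codes and authenticator apps is easy to make concrete. Apps like Google Authenticator and Authy generate time-based one-time passwords (TOTP, standardized in RFC 6238): the service and your phone share a secret key, and both independently derive a short code from that secret plus the current 30-second time window, so no code ever travels over the phone network for an attacker to intercept. A minimal sketch in Python, using only the standard library (the Base32 secret shown is the RFC 6238 test vector, not a real credential):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, interval=30, digits=6):
    """Derive a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    # Shared secrets are usually distributed as Base32 (the QR-code payload).
    key = base64.b32decode(secret_b32.upper())
    # The moving factor is the number of 30-second windows since the epoch.
    counter = int((time.time() if at is None else at) // interval)
    # HMAC the counter, big-endian 64-bit, with the shared secret.
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: pick 4 bytes at an offset given by the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    # Keep the low `digits` decimal digits, zero-padded.
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" in Base32), at Unix time 59.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59))  # → 287082
```

Because the code depends only on the shared secret and the clock, intercepting your SMS gains an attacker nothing; they would need the secret itself, which never leaves the device.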
The other thing I recommend to people is to take your security updates. You will not benefit from the tireless efforts of people working to secure software and platforms if you do not take your updates. And the majority of compromises that happen as a result of security vulnerabilities happen not as a result of unknown vulnerabilities, but from vulnerabilities that are known and have been patched, and that happen between the time they become known and the time you actually patch the vulnerability yourself on your device.
TL: Last question for you: You’re obviously spending a lot of time on the dark side. In the digital world we live in, what lifts your spirits? What spurred you to enter this field in the first place?
EG: I understand that sometimes it’s really depressing when you just talk about people being kidnapped and tortured and spied upon. It can get really bleak. But it’s tremendously satisfying to be able to make a dent in that, to wake up every morning and know that you have concretely helped someone. And that you’ve changed the conversation around these tools and these practices in such a way that some people will not be attacked in the first place, and they can protect themselves from these abuses. I love my job. I’m amazed I get to do this every day. Because I can track the ways in which I make a difference in this world. And most other jobs simply don’t have that; you can’t see the good that you are doing. I get to see the good that I’m doing every day.
Photo by Maxim Hopman
Do you sometimes wish to be a fly on the wall?
Who are you spying on right now and why?
Have you ever recorded someone without their consent?
Have you ever been recorded without your consent?
Have you been the victim of malware, spyware, or stalkerware?
Do companies have a right to monitor remote workers’ work activity?
James Bond or Mata Hari?
Do you know what we’re doing with your data?
Would you like to know?
Who have you Googled and don’t want anybody to know you Googled?
You do know that “incognito” browsing, at least on Chrome, does not mean “invisible,” right?
Who’d be the first team member you’d spy on? Why?
Would you vote for a social credit system if you could be part of shaping the rules for it?
Have you ever read the full “Terms and Conditions” of any digital product?
Do you believe your vaccination status should be made mandatorily visible to public institutions in the interest of public health? What about known to your employer? Your neighbor?
The most commonly used passwords remain "iloveyou" and "sunshine." Wow.