Tracking Assassins – How Weakness in Data Privacy Helped Find the Navalny Poisoners
Carole Switzer
Co-Founder of OCEG, a global nonprofit think tank that provides standards, guidelines, and online resources to help organizations achieve Principled Performance.
I’ve been deep into the topics of data privacy and data breaches recently, following OCEG’s release of our new Integrated Data Privacy Model (IDPM) and a playbook on Preparing for Improvement in Data Privacy. I’ve also been reading “99 Privacy Breaches To Be Aware Of”, a somewhat scary book by our training partners at Straits Interactive (Kevin Shepherdson, William Hioe and Lyn Boxall), who were also primary contributing authors of the IDPM. It’s an eye-opening view into the many ways that data handling has gone wrong, harming both individuals and organizations, and I recommend it wholeheartedly – just don’t make it bedtime reading or you won’t get much sleep.
But as scary as the scenarios in the book are, they pale in comparison to what I learned about black market databases a few nights ago, when I took a break from my reading to watch Navalny, the new documentary film about the poisoning of Putin opponent Alexey Navalny (who survived the assassination attempt but is now in a Russian prison). At one key point in the film, Navalny discusses how an investigative team tracked down the chemical weapons experts involved in developing Novichok, the poison used in the attack on him, and the agents who carried out the poisoning. Here I was again, hearing about how personal information can be used in ways we never imagine.
Black market data
Navalny notes that Russian email providers and social networks are not very secure or privacy-focused, resulting in frequent data leaks and making it easy for data merchants and automated services to accumulate and mine sensitive personal data at very low cost. He also notes that much of this data is simply floating around the internet and can be googled, and that even more personal information is sold to data merchants by individuals in low-level jobs at banks, telephone companies, police departments and the like. These data merchants build databases that are then sold or accessed on the black market, often for only a few hundred dollars.
Phone numbers and flight records and fake names… oh my
Against that background, part of the investigation looked like this: the investigative team, Bellingcat, had already been looking into Russia’s chemical weapons program and had obtained the phone numbers of two senior executives at SC Signal, an entity they had previously determined was directly involved in developing new application methods for nerve agents. Bellingcat obtained and analyzed the metadata of those phone records and found significant surges in calls to several numbers suspected of being linked to the FSB (Russia’s federal security service) in the days surrounding the poisoning. Then, by searching databases built from the crowdsourced contact book entries of vast numbers of people, they were able to find names associated with those numbers and further information verifying that they were used by specific FSB officers.
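Purely to make the general technique concrete – this is not Bellingcat’s actual tooling or data – here is a minimal Python sketch of how call-metadata records might be aggregated to spot a surge in contact with particular numbers around a date of interest, and how those numbers might then be looked up in a crowdsourced contacts dataset. Every record, number and label below is a hypothetical placeholder.

```python
from collections import Counter
from datetime import date, timedelta

# Hypothetical call-metadata records: (caller, callee, call_date).
# In reality such records would come from leaked or purchased phone data.
call_records = [
    ("+7900000001", "+7900000111", date(2020, 8, 18)),
    ("+7900000001", "+7900000111", date(2020, 8, 19)),
    ("+7900000001", "+7900000222", date(2020, 8, 19)),
    ("+7900000001", "+7900000111", date(2020, 8, 20)),
    ("+7900000001", "+7900000111", date(2020, 5, 1)),
]

# Hypothetical crowdsourced contact-book entries: number -> how others saved it.
contact_books = {
    "+7900000111": ["security service acquaintance", "work contact"],
    "+7900000222": ["driver"],
}

def call_surge(records, around, window_days=3):
    """Count calls per callee inside and outside a window around a date."""
    lo, hi = around - timedelta(days=window_days), around + timedelta(days=window_days)
    inside, outside = Counter(), Counter()
    for _, callee, when in records:
        (inside if lo <= when <= hi else outside)[callee] += 1
    return inside, outside

event_date = date(2020, 8, 20)            # date of interest
inside, outside = call_surge(call_records, event_date)
for number, n_near in inside.most_common():
    n_baseline = outside.get(number, 0)
    if n_near > n_baseline:               # crude "surge" test, for illustration only
        labels = contact_books.get(number, ["<no crowdsourced entry>"])
        print(f"{number}: {n_near} calls near {event_date} vs {n_baseline} otherwise; saved as {labels}")
```

The point is not the code itself but how little of it there is: once the metadata and contact books exist in leaked or purchased form, connecting a number to a name is a trivial lookup.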
Next, the team examined passenger manifests (again, data that should have been better protected) and found those officers flying in alignment with Navalny’s own itinerary, along with several other operatives. Then, by searching databases of taxpayer numbers, vehicle ownership and registered residences, they determined that two of the men were travelling under false names. By further examining the men’s travel records and credit card transactions (one of the men had actually been listed as FSB in one of the searched contact books), they built a case that the men had been trailing Navalny for some time. Finally, by looking at residence records (including spouses’ maiden names) and phone records, they were able to determine the real identities of the men travelling under false names.
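In the same illustrative spirit – and again not the investigators’ actual method or data – a few dictionary joins are enough to show the kind of cross-referencing described above: combining passenger manifests with a target’s known itinerary, then checking co-travellers’ names against taxpayer, vehicle and residence registries to flag identities that leave no trail. All names and records here are invented placeholders.

```python
from collections import Counter

# Hypothetical passenger manifests: (city, date) flight leg -> names on board.
manifests = {
    ("CITY_A", "2020-08-14"): ["TARGET", "TRAVELLER_X", "TRAVELLER_Y"],
    ("CITY_B", "2020-08-19"): ["TARGET", "TRAVELLER_X"],
    ("CITY_C", "2020-07-01"): ["TRAVELLER_Y"],
}

# Legs the target is known to have flown (from his own itinerary).
target_itinerary = {("CITY_A", "2020-08-14"), ("CITY_B", "2020-08-19")}

# Hypothetical registry data keyed by name: taxpayer ID and registered residence.
# A passport name with no trail in these registries hints at a cover identity.
registry = {
    "TARGET": {"taxpayer_id": "111", "residence": "CITY_A"},
    "TRAVELLER_Y": {"taxpayer_id": "222", "residence": "CITY_C"},
    # "TRAVELLER_X" deliberately absent: no registry trail at all.
}

# 1) Count how often each passenger shares a flight leg with the target.
co_travel = Counter()
for leg, passengers in manifests.items():
    if leg in target_itinerary:
        for name in passengers:
            if name != "TARGET":
                co_travel[name] += 1

# 2) Flag frequent co-travellers whose names leave no registry trail.
for name, shared in co_travel.most_common():
    note = "registry record found" if name in registry else "no registry trail (possible false name)"
    print(f"{name}: shared {shared} legs with the target; {note}")
```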
It goes on and on, with detailed explanations of investigative techniques built on the porosity of personal information databases, raising many questions in my mind. For example, you know how every time you sign up for a new app there is a quick note saying the app wants access to your contacts? Sometimes you can say no, but most people don’t even read the notice and just complete the installation. Is that one way these black market databases of contact books get built?
What about the location services on your phone that many apps ask to use even though they seem to have no valid reason for needing that information? Part of the Bellingcat team’s investigation relied on location data compiled from many such sources to track the movements of the suspected poisoners and further solidify suspicions about who the men travelling under false names really were.
While the availability of all of this data was put to good use in tracing and identifying the poisoners who attempted to assassinate Navalny, we must recognize that it could also be put to more nefarious use, and usually is. Just as the Bellingcat team was able to trace the movements of the FSB operatives, so too could the FSB use such data to track Navalny and his colleagues.
Safekeeping only starts with personal responsibility
Most people aren’t very careful about what they do online, what information they allow apps to access, or how often they fill out forms (including paper ones) to enter contests or take fun quizzes – so their personal information gets swept up into black market databases along with public records and other sources of information about them. We must acknowledge that there is a level of personal responsibility for one’s own data and think harder before handing it out willy-nilly.
But those who hold our data – in a relationship similar to the “bailment” of personal property entrusted to another party’s safekeeping – also have a responsibility to protect it. Data leaks from companies that hold the personal data of thousands, hundreds of thousands, or millions of employees and customers are on the rise. Much of that data isn’t needed by the company that holds it, or can be inappropriately accessed by business units or third-party vendors who have no legitimate need for it. It is these cumulative company databases that present the greatest risk.
Data privacy doesn’t really exist…much or in many places
Employees, customers and regulators are increasingly aware that a leak of personal data can lead not only to direct harms such as identity theft, but also to inappropriate uses of that data that the individual may never know about or feel as a direct impact. And it doesn’t take much probing into the data management schemes of many organizations to see that they are just a house of cards. Not much poking is needed for data privacy controls to collapse when their foundation is poorly constructed. At the same time, though, that means you can only make things better.
We can help
To help address this problem, OCEG developed the free, open source Integrated Data Privacy Capability Model, but even those who want to build a better data control structure must first understand how the personal information they hold today is actually managed. That is where OCEG’s newest playbook, sponsored by ServiceNow, comes in. The playbook, entitled Preparing for Improvement in Data Privacy, outlines key steps to take and questions to ask that will help you discover how and where personal information is collected, stored, used, processed, and transferred within and outside of your business. I am certain that the answers, once you find them, will surprise and shock you.
I don’t say this often, but this is one OCEG resource that every member should download and share across their organization with anyone who has any role in managing data. While it isn’t the only tool you need in your data privacy toolbox, it is an essential manual to help you start making valuable improvements right away.