Demographics. Geopolitics. Innovation.
The following interview is used with permission from its author. The link and permission note can be found in the original interview – Seeing the Big Picture of Cybersecurity With Milena Rodban.
In context of the United States and geopolitical risks in the cyber world, who do you see as our major challenges? How do you think we’re currently handling these challenges?
Put simply, the major challenge is complexity and the large number of actors looking to target our vulnerabilities, from state-sponsored hackers to intelligence agencies, criminals, and terrorists. We’re spending record sums on cybersecurity, but the breaches are still stunning. With each new development, even convenient ones like streamlined and simplified log-in APIs, we’re adding extra complexity to the situations we face in cyberspace and increasing the likelihood that a breach will have far-reaching and catastrophic consequences. Each new device adds new points of vulnerability, new ways to collect private information, and new ways for bad actors to hijack poorly secured devices to wreak havoc.
Furthermore, we are not prioritizing the need to understand likely immediate consequences, let alone second- and third-order externalities. Some things that seem to make our lives easier or simpler demand tremendous sacrifices of privacy and security, and increase our vulnerability. Tech companies need to collect and sell data in order to offer platforms or services to users for free, so it’s a stretch to imagine fully ethical firms that don’t charge high fees to make up for forgoing profits from data. The reason an ancestry kit is a bargain is that the company then sells your data (anonymized, they claim) to drug companies to develop new medicines. Either we pay a premium to keep our data safe, or we acknowledge that “free” isn’t really free: you and your data are the product.
Additionally, the rapid pace at which we update systems and unevenly adopt new tech, particularly security measures, along with rampant tech illiteracy, leaves little time for people to consider the potential interactions and nefarious uses of the innumerable gadgets we use daily. Look at the way wearable fitness trackers uncovered secret military bases: to me, as someone who helps clients explore their potential vulnerabilities and the actors likely to target them, the connection seems obvious. To the average person who wants to stay in shape in a stressful job, it may not be. The most important challenge is that most tech firms still do not appreciate the extent to which they are vulnerable to geopolitical developments, and how they actively raise their own exposure. There is someone, somewhere, looking to use every development for unintended or nefarious purposes – whether criminal, activist, terrorist, or state-sponsored hacker.
On the point about spending record sums while breaches get worse – why aren’t we getting better results if we’re spending more money on cybersecurity?
Many of the things we’re spending money on are efforts to bypass humans – password generators, consultants, ways to automate cybersecurity so that we work around humans – because these are easier to find, buy, and implement. Yet the most successful breaches are the ones that focus on bypassing those automations. For example, CEO fraud – sophisticated phishing that impersonates executives – has snared even savvy targets. Social engineering and physical penetration of facilities are also very effective: personnel see someone with an ID card that looks right and automate their thinking, letting the person through instead of pausing to determine whether they actually belong there. We need everyone to play an active role in cybersecurity, not abdicate responsibility to programs that promise to protect us and our data.
Some companies store data on their customers, ranging from personal to behavioral data, but don’t see any geopolitical risks involved with that data. Why is this assumption incorrect?
As we see with fitness data, and with genetic (such as ancestry) testing data sold to for-profit firms or shared with law enforcement, any information that can be obtained and stored will be used. It will be sold to advertisers. Third-party researchers might gain access. It’ll be used to raise your life insurance premiums. Intelligence agencies will target health insurers used by government employees. If your device can be adapted by activists or terrorists to avoid law enforcement, it will be. There are companies that honestly believe the advanced communications tech they sell to mountain climbers to stay in touch and climb safely can’t be used by anyone else. They think criminals who want to communicate in remote areas with poor infrastructure, or terrorists operating in rural areas, won’t use it simply because the makers only market their products in mountain-climbing magazines. These naive attitudes need to evolve. We – both in the tech community and the public at large – know the basics of what we need to anticipate, and recognize at least some of the dangers of using poorly secured devices that can be hijacked, or poorly secured systems where data can be stolen or held for ransom. We’ve all seen major cities and hospitals hit with ransomware attacks, and internet-connected devices drafted into malicious botnets.
Some developers have advised the approach of “move fast and break things,” which may work safely in development contexts where security isn’t important. In environments where cybersecurity is a very high concern, what do you think is a superior approach to development, and why?
We can create controlled environments to experiment with things before they are launched in the real world. Many companies also launch in a small market before expanding to major ones. These are good ways to work out the kinks, but there should be more emphasis on the security pieces alongside the fancy new features. Security is largely seen as limiting rather than empowering. Many people don’t realize that for many devices to work, they need a wealth of data. This is why tech firms don’t want opt-in data collection measures – they want to collect by default and let people opt out if desired, though most don’t even know that’s an option. When a user withholds data from a device or platform, the experience changes substantially, and some features stop working.
We need to treat everything that captures and stores sensitive data like a baby product. People want to know that baby products are made with good ingredients, tested extensively, and safe to use. No product can be released without meeting certain tough requirements, and many companies even compete to surpass those basic criteria and meet even higher standards, believing this to be good for both babies and bottom lines, as demanding parents choose higher-quality goods over basic options. With tech, the security and safety aspects are often advertised far less than splashy redesigns, sleek hardware, or cutesy features like Apple’s Memoji. The tech community must realize that it is on the front lines of securing users, who, like babies, are often much less tech literate and lax about sticking to best practices. We need to acknowledge limitations and create ways for users to learn how secure new tech is, and how using bad tech exposes them to potential attacks and data breaches. We’ve known for years that Chinese firms must assist Chinese intelligence agencies, and so all Chinese-made tech has backdoors and its data goes to Chinese intel. It’s stunning that we had to wait so many years for the US government to ban the use of Chinese-made phones from Huawei and ZTE by government employees and contractors. This was an open secret: how much was compromised before the US government came to its senses?
For the few individuals who value security over convenience (very rare at the moment), do you think it may be possible to convince companies to offer a choice?
For-profit companies don’t often voluntarily choose to give people choices that will drive them away from their business. This is where we need the government to come in and introduce smart regulations to protect consumers, privacy, data, etc. We’ve seen some popular efforts (like net neutrality) get overturned in the name of freedom, and we’ve seen some truly mind-boggling regulations introduced, like the EU’s GDPR, which is at best a knee-jerk reaction to address fringe problems with expansive, expensive, and likely ineffective compliance regimes. We need more tech literate legislators, and tech literate voters who can bring them to power, so we can protect users without stifling innovation.
I’ve met people who want to transition all we do into the digital world. From IoT devices to money, they envision a world where everything we do is tied to the digital world. Given your understanding of geopolitical risks, do you think this is a wise path to follow for our future – assuming it’s possible?
We already have many separate digital worlds, and we’ve seen the problems they cause, so we know a bit of what to expect as our lives become increasingly digital. It is because of them – their poor protections and uneven security – that we’ve seen recent crackdowns on the digital realm, from GDPR in the EU to a recent ruling on Aadhaar, India’s massive biometric identity verification system. In China, every interaction, from a credit card swipe at a lingerie store to a message sent on Weibo, is used to calculate a person’s loyalty score, affecting their job prospects, chances of getting a loan, and more. As the digital world keeps encroaching on daily life, we’ve seen efforts to push back. With GDPR, a largely tech-illiterate legislature passed sweeping measures that don’t reflect the reality of what protections are needed, what punitive actions are possible, and what is technically feasible. If tech wants to prevent more such measures, which impose tremendous compliance costs on firms without actually making user privacy a priority, then it needs to educate the voters who decide elections, and it needs to educate the people who make the laws. The entire world won’t go digital at once, if ever. Just as industrialization did not happen overnight everywhere, so too will efforts to transition to a digital world come in fits and starts – and many will work to combat its pervasiveness and limit the ways it can be turned against private citizens, something we’ve already seen in places like Xinjiang, China.
What is something that you believe that no one, or few people, agree with you on?
Milena Rodban is a geopolitical risk consultant and interactive simulation designer. She advises private firms, with a particular emphasis on tech companies, to help them successfully navigate complex business and security environments. Milena designs and facilitates interactive simulations that are customized to allow clients to diagnose problems, analyze major decisions, and integrate more effective communication, collaboration, and crisis response protocols. Ms. Rodban received her MA in Security Studies at the School of Foreign Service at Georgetown University.