Satellite systems are among the most crucial technologies for people on earth. They are also vulnerable to terrorist manipulation.
Remember Ronald Reagan’s plan for “Brilliant Pebbles,” swarms of small orbiting interceptors that could take out satellites and disrupt future enemy communications? Something remarkably similar is being launched en masse by SpaceX. The only problem is that Elon Musk says his satellites are for internet connectivity, when they are also potential weapons of terror.
Did I get your attention?
Last week, SpaceX launched another 60 satellites for its Starlink internet network, bringing the total to 180 in Low Earth Orbit (LEO), just about 500 km above our heads. By the time it is complete, the network will consist of thousands of these small (260 kg) satellites surrounding the globe (kinda like the ones the Vogon constructor fleets put around the Earth, but much smaller). SpaceX claims an important feature of these satellites is a propulsion system that can move them out of the path of other orbital systems, like the International Space Station, to avoid catastrophic collisions. It also allows operators to take malfunctioning satellites out of orbit and guide them back to Earth. Most of a Starlink satellite burns up during reentry, but the rest crashes somewhere on the surface; with thousands of tiny parts falling over the years, we can already prepare ourselves for a kind of space rain. I’m a bit skeptical of those claims, because a few months ago there was a close call between a European Space Agency (ESA) satellite and one of SpaceX’s satellites in which SpaceX did not respond to ESA’s urgent alerts. But that is another story.
This feature was announced at a conference last year that Patrick O’Keeffe attended. The audience applauded, but O’Keeffe did not. A high-level military officer saw his reaction and wanted to know why.
O’Keeffe knows more about security, satellites and military technology than perhaps 99 percent of the engineering world, which means he probably knows more than anyone you and I know. He studied aerospace engineering at the University of the German Armed Forces, Munich, and multinational operations and international project management at various NATO and United Nations entities. With a career as both a naval aviator and a NATO legal advisor for modern technologies, he currently serves as a security policy advisor for a number of public and private organizations. Among other activities, he has helped integrate autonomous and artificial intelligence systems for military use and supported the design of a Master’s in Cybersecurity at the California State University Maritime Academy.
Before our meeting, several people referred to him as the German “James Bond,” a sobriquet he really hates; but he is also a straight-arrow type with four children, so a better descriptor might be a German “Jack Ryan.” Our conversation began with the implementation, expansion and success of the European Union’s General Data Protection Regulation (GDPR), which you can hear at my Crucial Tech podcast. After the recorder went off, we had an on-the-record chat about the state of satellite security. Boy, did I get an education. This is the story he told me and the military officer last year.
In 2017, approximately 40 ships in the Black Sea were hit by a GPS spoofing attack that placed them far inland. One tanker registered a location 32 kilometers inland, at Gelendzhik Airport. Essentially, these ships were blinded: they had no idea where they were other than somewhere in the Black Sea. Luckily, the attack came during the day and visibility was good, so there were no collisions. It appeared the attack was an experiment to see whether it could be done.
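A spoofed fix like the one that put a tanker at Gelendzhik Airport is detectable onboard with a simple plausibility check: compare each new GPS position against the previous one and flag any fix whose implied speed-over-ground is physically impossible for the vessel. The sketch below illustrates the idea; the coordinates, timestamps and the 30-knot speed ceiling are hypothetical values I chose for illustration, not data from the actual incident.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in kilometers.
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_spoofed_fix(prev_fix, new_fix, max_speed_knots=30.0):
    """Flag a GPS fix whose implied speed is physically impossible.

    Each fix is (latitude_deg, longitude_deg, unix_time_s). A tanker
    cannot plausibly exceed ~30 knots, so a fix implying a much higher
    speed-over-ground is almost certainly spoofed or corrupted.
    """
    lat1, lon1, t1 = prev_fix
    lat2, lon2, t2 = new_fix
    dt_h = (t2 - t1) / 3600.0
    if dt_h <= 0:
        return True  # a timestamp going backwards is itself suspect
    speed_kmh = haversine_km(lat1, lon1, lat2, lon2) / dt_h
    return speed_kmh > max_speed_knots * 1.852  # knots -> km/h

# A ship off the Russian Black Sea coast "jumps" ~17 km toward the
# Gelendzhik area within one minute -- an impossible speed for a tanker.
at_sea = (44.60, 37.80, 0)
inland = (44.57, 38.01, 60)
print(flag_spoofed_fix(at_sea, inland))  # → True
```

A real receiver would cross-check further inputs (inertial sensors, signal strength, multiple constellations), but even this single test would have caught a position leaping tens of kilometers between fixes.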
Most experts at the time determined the disruption came from the Kremlin, but O’Keeffe was skeptical because some Russian shipping was also affected. “It would have been stupid for Russia to do such a thing so openly. It didn’t make sense,” he explained.
He did his own research (see photo above) about a year after the incident, gathering records of satellite transmissions and tracing the attack back to the precise moment it occurred. He found that the source of the malicious signal was the oldest GPS satellite in Medium Earth Orbit (MEO), above Iran at the time. He did not, however, place the blame on Iran, because the signals that hacked the satellite were untraceable. “I’m not willing to assume anything without proof, but the reality is that anyone with a satellite uplink and basic programming skills can do this. The satellite had no effective security.”
His next statement kind of floored me.
“Satellites are technically outdated, unpatched and always connected – and still cybersecurity is not a topic for either satellite manufacturers or operators.”
Does that include SpaceX’s satellite network?
That’s what he told the military officer, as well.
“When I talk to companies building satellites I point out the security flaws of their design in both hardware and software,” he explained. “They tell me I need to talk to the cybersecurity team about the problem. When I go to the cyber team, they tell me to go talk to the satellite designers.”
And there you have the problem. Cybersecurity and system design are still, in this day and age, separate disciplines with little to no interaction. It reminds me of the days when hardware and software designers would toss off responsibility, each insisting the bug was on the other side.
Today we have hacks of GPS satellites, but most of those satellites lack propulsion systems like SpaceX’s. Soon there will be thousands of satellites with systems that could be used far more destructively, and they are not secure.
In a world where countries like North Korea are funding nuclear weapon technology by selling ransomware kits to organized crime, how much could they make by selling satellite hacking kits to terror groups? And how long before one of SpaceX’s satellites is taken over and directed into a controlled descent to crash into the Capitol Building during the State of the Union address at hypersonic speed?
It is no longer acceptable for the engineering community to accept “good enough” for systems security, nor to believe it has done all it can. It is time to make security a gating factor in product release.