The Nanophage: Can a Signal Destroy Hardware?

Gamer_152
12 min read · Nov 12, 2022

Note: The following article contains minor spoilers for Observer.

A colourful assemblage of wires and electronic parts.

In Bloober Team’s sci-fi horror game Observer, there’s a disease that is transmitted electronically but still physically degrades people. At first, I thought the idea of this illness, “the nanophage”, was ridiculous. How does a bunch of abstract computer commands materially damage something? But then I understood that the disease causes nanobots to turn against their hosts: they’re executing code that tells them to wreck the parts around them. And I got to thinking, “Could something like this happen in real life?” Could you destroy components just by sending data to a machine? The truth is, if you’ve owned a lot of video game consoles, it’s probably happened in one of your devices without you ever knowing about it.

Some terminology before we go any further: In lay conversations, we often use the word “virus” to refer to any program that intentionally makes a computer malfunction. But in the field of cybersecurity, that term has a more specific meaning. It refers to malicious software that reproduces by writing itself over the code of other programs. The generic word for software that attacks a computer is “malware”, so that’s the jargon we’ll be using here.

If you Google for malware that can damage computer parts, you get a lot of results about nasty little blighters that ruin firmware. Firmware is the software that lives in a device’s permanent memory and tells it how to run. For example, your hard drive’s firmware tells it how to read and write data, and your motherboard’s firmware tells it how to shuttle data between the other circuit boards. You’ve probably heard of “corruption”: it’s when any data, including program code, is turned from a coherent set of signifiers into a useless mess. Malware can be written to corrupt a hard drive’s or motherboard’s firmware. This is one way to brick a PC, because all software and hardware rely on these parts functioning. But these programs wouldn’t fundamentally damage hardware. The platters in those hard drives and the transistors on those boards would be as fit as they were before the infection. They just wouldn’t be receiving the right instructions anymore.
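To make “corruption” concrete, here’s a minimal sketch in Python (purely illustrative; real firmware is machine code verified by the device itself, not a string of text like this). Scrambling a few bytes doesn’t harm the chip the data lives on, but it does stop the data meaning anything and makes it fail any integrity check the device performs:

```python
import hashlib
import random

# A stand-in for a firmware image: just a block of bytes with a known checksum.
firmware = bytearray(b"read_sector; spin_motor; move_head; write_sector; " * 50)
good_checksum = hashlib.sha256(firmware).hexdigest()

# "Corruption": overwrite a handful of bytes with garbage.
random.seed(0)
for _ in range(16):
    firmware[random.randrange(len(firmware))] = random.randrange(256)

# The bytes are still bytes, and the memory holding them is physically fine,
# but the data no longer means what it used to, and the integrity check fails.
print(hashlib.sha256(firmware).hexdigest() == good_checksum)  # False
```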

We want signals that can physically change the computer’s body, and that’s a decent description of what happens in a Field-Programmable Gate Array, or FPGA. FPGAs are often shortcuts around compatibility issues. Using a PC, you might get the impression that any software can plug into any hardware and that the only limitation is raw computing power. When you download a PDF reader, you don’t have to worry about whether it agrees with your model of CPU. Developers don’t ship one version of a game for AMD graphics cards and another for NVIDIA. Yet a PS5 game won’t run on an Xbox One or vice-versa, and titles must be “ported” between platforms. So, what gives?

Software must be compatible with hardware. That’s non-negotiable. Code is a series of instructions for hardware to carry out. Yet circuits speak different languages, and their components, and the relationships between those components, are not likely to be identical. So, you can’t have a ZX Spectrum program run on a Genesis any more than you can hand someone a manual for a dishwasher and tell them to use it to operate their oven. The reason you can run almost any program you’re likely to encounter on any modern PC is that PC hardware became standardised. If the circuitry and the low-level programming language stay roughly the same, and software is built for that circuitry and that language, you don’t have to worry about as many compatibility issues. Consumer-grade processors in PCs, the part of the computer carrying out the instructions, are almost all built to one of two rough blueprints or “architectures”: x86 or its 64-bit extension, x64.
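As a toy illustration of why that is (the opcode tables below are invented, not real architectures), the bytes of a program only mean something relative to the blueprint they were written for. Feed the same bytes to a processor built to a different blueprint and you get nonsense:

```python
# Two made-up instruction sets: each maps an opcode byte to an operation.
machine_a = {0x01: "load", 0x02: "add", 0x03: "store"}
machine_b = {0x01: "jump", 0x02: "halt", 0x03: "load"}

# A program written with machine A in mind: load, add, store.
program = [0x01, 0x02, 0x03]

print([machine_a.get(op, "???") for op in program])  # ['load', 'add', 'store']
print([machine_b.get(op, "???") for op in program])  # ['jump', 'halt', 'load'], nonsense here
```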

Older computers were built before these standards were codified, and many non-PC computers do not prioritise software compatibility. Console and phone manufacturers instead design their machines to achieve a balance between high computing power and low parts cost. They have competing ideas about which components would be best for those tasks, and the answers change as technology advances and depending on the exact device they’re developing. Different goals, different mechanisms. Making sure only their hardware can run their software also gives platform holders power over the customer experience and the market for their products. If you could play any PS5 game on the Series X, there’d be no PS5 exclusives.

The pickiness of computer hardware creates headaches for anyone who wants to run software for older machines or devices they can’t obtain. The favoured solution to this problem is the software emulator: an application that simulates the hardware as it exists from the software’s perspective. By recreating the environment the code was written to be executed in, emulators recreate a comparable software-hardware relationship. Yet running software under emulation requires more memory and processing power than running it on the original hardware does. The original hardware just has to put in the elbow grease to run the game. The PC emulating that hardware has to commit resources to running the emulator and then run the game on top of that. Realistically, it’s probably also handling the instructions for a lot of other programs. Software, especially video games, which tend to be demanding of hardware, may run perceptibly slower than on its initial platform.
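Here’s a minimal sketch of the idea in Python, using a made-up instruction set rather than any real console’s: an emulator is just a host program that fetches and carries out the guest machine’s instructions one at a time, which is why every guest instruction costs several host instructions.

```python
# A toy "guest machine" with two registers and three instructions.
def run(program):
    registers = [0, 0]   # the pretend machine's registers
    pc = 0               # program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "set":      # set register 0 to a value
            registers[0] = arg
        elif op == "add":    # add a value to register 0
            registers[0] += arg
        elif op == "copy":   # copy register 0 into register 1
            registers[1] = registers[0]
        pc += 1              # every guest instruction takes host work to interpret
    return registers

print(run([("set", 2), ("add", 3), ("copy", 0)]))  # [5, 5]
```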

Here’s where FPGAs can swoop in and save the day. Fundamental to computer circuits are logic gates: circuit elements that perform mathematical operations, but on bits instead of the numbers we usually work with.[1] Bits are 1s and 0s, and it’s common to represent a 1 with a burst of electricity in a circuit and a 0 with no electrical signal. An AND logic gate takes multiple bits in and outputs a 1 only if all of its inputs are 1s. Otherwise, it outputs a 0. An OR gate outputs a 1 if any of its inputs are 1. Else, it outputs a 0.[1] Most people find this information easier to digest in table form:

Input A | Input B | A AND B | A OR B
0 | 0 | 0 | 0
0 | 1 | 0 | 1
1 | 0 | 0 | 1
1 | 1 | 1 | 1

These building blocks might seem too disconnected from the maths we know to stack into anything useful, but look at the circuit linked here. It combines logic gates, including AND and OR gates, to add numbers, an indispensable operation for any computer. The parts in the wires with rounded fronts are AND gates, and the ones with pointed fronts are OR gates. If we wire together enough gates with other components, we can get whole processors and memory banks. Field-Programmable Gate Arrays are bundles of logic cells, each of which contains a small amount of memory and a circuit for processing data called a lookup table.
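If you’d like to see the principle without the diagram, here’s a rough sketch of a one-bit adder (a “half adder”) written as code, with AND, OR, and NOT functions standing in for gates. The linked circuit may be wired differently, but the idea is the same: stack simple gates and you get arithmetic.

```python
# Logic gates as functions on bits (0 or 1).
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# A half adder: adds two one-bit numbers, producing a sum bit and a carry bit.
# The sum is "a or b, but not both", built here from AND, OR, and NOT.
def half_adder(a, b):
    sum_bit = OR(AND(a, NOT(b)), AND(NOT(a), b))
    carry = AND(a, b)
    return sum_bit, carry

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))  # 1 + 1 gives sum 0, carry 1
```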

We can think of circuits as specific configurations of processing and memory components. So, by routing signals between an FPGA’s memory and processing components in different arrangements, we can create billions of different circuits. As the name Field-Programmable Gate Array suggests, we can use data to set how those signals are routed, effectively deciding the circuitry. We can even import those designs from the internet. With the FPGA, we see that it’s possible for pure signals sent to our computers to transform the hardware on our desks. Against all common sense, if you need an NES or an Intellivision, you can just download one. But it’s reasonable if you don’t consider transforming the internal wiring of a circuit to be physically changing hardware, and I did promise you carnage.
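Here’s a sketch of how that works at the level of a single, simplified logic cell (real FPGA cells have more inputs and sit in a vast routing fabric, but the principle holds): the configuration data you load is the circuit.

```python
# A simplified FPGA logic cell: a two-input lookup table (LUT).
# The four configuration bits list the output for each input combination
# (00, 01, 10, 11), so the bits you load decide which gate the cell becomes.
def make_lut(config_bits):
    def cell(a, b):
        return config_bits[(a << 1) | b]
    return cell

and_gate = make_lut([0, 0, 0, 1])   # configured to behave as AND
or_gate  = make_lut([0, 1, 1, 1])   # configured to behave as OR
xor_gate = make_lut([0, 1, 1, 0])   # load different bits and the same cell is XOR

print(and_gate(1, 1), or_gate(1, 0), xor_gate(1, 1))  # 1 1 0
```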

Following our current thread, we could dive into overclocking and overvolting: respectively, methods to increase the speed at which a processor executes instructions and the voltage fed through it. Both can be configured at the software level. Pushing a Central Processing Unit beyond its default parameters can cause it to wear prematurely. There are plenty of sources on the web talking about malware hypothetically overclocking or overvolting processors, but I’ve never been able to find any records of it happening. It’s the kind of program that gets described with the word “theoretically” a lot. I’m also not sure that wearing down a part at a higher rate than usual is what we’re after when we talk about “destruction”.

There is, however, one piece of malware I know of that wrecked a mechanism by ripping off its safety harness, and it takes us far from home to Iran. In June 2010, the Belarusian security firm VirusBlokAda found a piece of malware residing in Iranian computer systems.[2] This 500KB program would later be named “Stuxnet”, and by October of the same year, Forbes reported that it had infiltrated more than 50,000 Windows computers worldwide. The program was highly contagious and record-breaking in its complexity, and yet no one knew who had developed it or why.[2] Despite its viral infectiousness, the malware had gone undetected for over a year, partly because it didn’t activate its harmful payload unless it found itself in one special type of computer: an industrial control system. Stuxnet’s developers were setting it up to sabotage uranium processing plants.[2]

See, the allied countries of the US and Israel were pulling at their collars as their mutual enemy, Iran, was refining uranium. That fissile material could find a purpose in nuclear power stations or, potentially, atom bombs.[2] So, the two initiated a project called Operation Olympic Games to sabotage the Natanz enrichment facility in central Iran.[3] Hacking in through the internet wasn’t an option because the centrifuges used to separate radioactive isotopes weren’t connected to the internet: a common security precaution.[2] So, the attacking nations developed a worm, a type of malware that spreads from machine to machine on its own, and designed it to jump between computers via USB flash drives.[4] While Stuxnet took advantage of a number of weaknesses in Siemens and Windows systems, it also had another edge against the enemy’s security.

Legitimate software comes with security certificates: digital signatures that are effectively impossible to forge if everything is working as intended. Some malware tunnels into machines by bypassing those certification checks or by relying on careless humans to run software despite the lack of certificates. Stuxnet didn’t bank on either of those methods; it was travelling with authentic certificates stolen from the hardware manufacturers Realtek and JMicron. The hackers had found the best back door of all: the front door. The worm made its way to the Natanz facility from the computers of companies that had contracts with it, unwittingly transported via memory sticks.[3] The evidence suggests it did its job, with 984 Natanz centrifuges falling offline in the space of five months.[4] Program code was the incantation that destroyed hardware. But because the malware could travel by memory stick, it didn’t just infiltrate the processing facility: it began to spread far and wide.

I’m including Stuxnet in this article because it’s the least ambiguous and most visceral example I have of code corporeally altering the world. It doesn’t get much more tangible than taking a crowbar to a uranium refinement plant. Maybe you’ve been worried about something like this happening to your uranium refinement plant. Although, when I talk about signals wrecking hardware, odds are you’re thinking of something more domestic. In our final case study, we’re going to see elements of both the FPGA and Stuxnet arise in destruction happening right under our noses.

The piracy of commercial video games is almost as old as the products themselves. The ongoing war between pirates and intellectual property holders is a history of assaults and counter-assaults. The complexity inherent to computers, even individual pieces of software, means that there’s usually some oversight made in their design that you can exploit if you dig hard enough. Console manufacturers have some freakily talented engineers. However, the communities that hack these consoles represent many more people working together and can spend years looking for the chinks in devices’ armour. The internet made it easier for pirates to communicate to develop and distribute hacks, but it also became a backchannel for manufacturers to patch security holes in devices.

Patches didn’t always stop the pirates. They’d just find a security flaw in an older version of the console’s software and install that version on their machines. But in the 2000s, Microsoft began to jeopardise that approach. Looking to make their upcoming console, the Xbox 360, an impenetrable fortress, the manufacturer got their hands on some e-fuses. E-fuses work on the same principle as the fuse in your plug: as long as you pass a low enough current through them, they act like any other piece of conductive wire, but if that current exceeds a predetermined amperage, they “blow”, breaking the circuit. Unlike the fuse in your plug, e-fuses don’t melt. Electrical currents are rivers of electrons, and when electrons hit an atom, they transfer some kinetic energy to it, nudging it. E-fuses are constructed in such a way that the momentum from a strong current of electrons will push their atoms apart, like someone blowing away iron filings. This phenomenon is called electromigration.

For their sophomore console, Microsoft used e-fuses to create a permanent record of each console’s current software version. This data allowed the machine to check whether the user was running an outdated operating system, one which might have had compromised security. When the 360 receives a major update, it blows a set number of e-fuses inside it. Each version of the console’s operating software is associated with a certain number of intact fuses.[5] On boot, the console checks how many intact fuses it has, and then continues according to these rules:

  • If it has more fuses than it should for its current version, it will destroy enough that it has the correct number and then reboot.
  • If the console has the right number of fuses for its version, it boots normally.
  • If it has fewer than it should, it will refuse to start up.[6]

Here’s a simplified version of how a system like this might work in practice: Let’s say we make a console with four e-fuses in it and roll it out with software version 1.0. For 1.0, the “valid” number of fuses will be four. Every time it tries to boot, it will check its fuse count, see that it has four, and boot as usual. Unfortunately, we discover a fatal bug in 1.0 that lets users run pirated discs on it. So, we use the internet to send out a version 2.0 with the bug fixed. For 2.0, we decide that the correct number of fuses is three. When the console first starts after getting the 2.0 update, it will see it has one more fuse than it should have, blow that fuse, and then restart. After restarting, and on every subsequent power-on (without tampering), it will check if it has three fuses, see that it does, and work as normal.

Now, let’s imagine word of that fatal bug in version 1.0 gets out, and people who have already downloaded 2.0 try to reinstall version 1.0 on their devices to take advantage of the glitch. This rollback will fail because their console now has three working fuses, which is fewer than 1.0 wants to see upon boot. 1.0 will start up on these machines, look for the four fuses, find it’s one short, and refuse to pass Go.
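As a toy model of the scheme described above (the numbers and names come from our hypothetical four-fuse console, not Microsoft’s actual boot code), the whole thing fits in a few lines:

```python
# How many intact fuses each software version expects to see.
REQUIRED_FUSES = {"1.0": 4, "2.0": 3}

class Console:
    def __init__(self):
        self.intact_fuses = 4
        self.version = "1.0"

    def install(self, version):
        self.version = version

    def boot(self):
        required = REQUIRED_FUSES[self.version]
        if self.intact_fuses > required:
            # Too many fuses for this version: blow the excess and restart.
            self.intact_fuses = required
            return self.boot()
        if self.intact_fuses == required:
            return "boots normally"
        return "refuses to start"   # fewer fuses than the version expects

console = Console()
print(console.boot())        # boots normally: 1.0 sees 4 fuses
console.install("2.0")
print(console.boot())        # blows a fuse, reboots, then boots normally
console.install("1.0")       # attempted rollback to the buggy version
print(console.boot())        # refuses to start: only 3 fuses left, 1.0 wants 4
```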

These e-fuses are not built for replacement like the ones in your plug. They are microscopic, and they are all inside the processor chip. Opening up the Central Processing Unit, let alone touching any part of the circuit, is a delicate process that risks irreparably bricking the whole system. You also couldn’t just swap in another CPU because each one had a unique digital signature that tied it to the Xbox it came in.[5] Users can retain older versions of the 360 firmware by not taking their console online and so never receiving the updates that blow the fuses. Still, many consoles went through plenty of updates before the fuse mechanism was discovered, so unblown systems became niche hardware, and it’s difficult to verify that you’re getting one before you buy it. A lot of players would also prefer not to keep their console permanently offline, especially a machine like the Xbox 360, which was prized for its online multiplayer experience. E-fuses are also now used in the Nintendo Switch.[6]

The e-fuse method does not prohibit piracy. We can’t say any technology conclusively has. In fact, the Xbox 360 hackers eventually found a way to bypass even e-fuses, but anti-piracy measures don’t have to prevent 100% of piracy cases forever to be helpful. If you can make piracy so arduous that a lot of customers decide it’s not worth it, and if you can keep people paying for genuine games for years, it can save enormous sums of money. It’s that profit motive that has meant people miles away from us have flipped switches and sent signals to our belongings to destroy little pieces of them.

Consoles using e-fuses are like FPGAs in that they take electronic instructions that transform their circuits, and, therefore, their operations. They are like Stuxnet in that the transformation destroys something and relies on a genuine security clearance. Yet, unlike the FPGA or the uranium centrifuges, the e-fuses were made to be destroyed. While we might think of updates to our systems as necessarily upgrading them, the e-fuse method shows that updates can also limit functionality. They can be our Xbox 360 or Switch’s very own nanophage. Thanks for reading.

Notes

  1. Basic Gates and Functions by Wale Sangosanya, David Belton, and Richard Bigwood (1997–2005), University of Surrey.
  2. Computer security: Is this the start of cyberwarfare? by S. Weinberger (2011), Nature 474, pp. 142–145.
  3. Confirmed: US and Israel created Stuxnet, lost control of it by Nate Anderson (June 1, 2012), Ars Technica.
  4. An Unprecedented Look at Stuxnet, the World’s First Digital Weapon by Kim Zetter (November 3, 2014), Wired.
  5. How the Xbox 360 Security was Defeated by Dimitris Giannakis (June 10, 2019), YouTube.
  6. How the Nintendo Switch prevents downgrades by irreparably blowing its own fuses by JonLuca De Caro (April 24, 2018), HackerNoon.

All other sources are linked at relevant points in the article.
