You spend a thousand dollars on a supercomputer that fits in your pocket. You hold it in your hand. It feels like yours, but the moment you try to change the software or modify the hardware to your liking, you hit a wall. It turns out you’re just renting space in a chassis owned by a corporation, and the landlord is increasingly strict about what you’re allowed to do inside.
This tension between what we buy and what we’re allowed to control is reaching a breaking point. From the debate around bootloader locking to the opaque systems of “Play Integrity,” we are watching a shift where security is used as the excuse to strip away ownership.
Questions We Should Ask
We need more developers who don’t care about PR
There is a specific kind of relief in hearing a developer say exactly what they think, stripped of corporate politeness. When the lead behind GrapheneOS speaks, people often call the tone unhinged or abrasive, but they rarely prove the technical analysis wrong. It’s refreshing to encounter someone who prioritizes being technically correct over being media-friendly. In an ecosystem saturated with PR-friendly bullshit, a blunt truth-teller is worth their weight in gold, even if they make you uncomfortable.
The “EU made me do it” excuse is a lie
There is a pervasive myth circulating that new European regulations like the Radio Equipment Directive (RED) or the Cyber Resilience Act (CRA) force manufacturers to lock bootloaders shut. This is gross misinformation. These laws require secure boot and the isolation of radio firmware to prevent interference, but they do not demand that users be permanently locked out of their own devices. OEMs are using compliance as a convenient scapegoat to do what they’ve always wanted: kill the right to repair and the right to tinker.
The radio spectrum is not your personal playground
Here is where we have to be careful about the “it’s my device” argument. You should absolutely have the freedom to modify the operating system, the kernel, and the userspace. But you should not be able to reprogram the radio firmware to broadcast on unauthorized frequencies or power levels. The airwaves are a shared public resource; letting billions of devices run rogue software could knock out cell towers, emergency networks, and aviation communications. There is a massive difference between securing the modem against dangerous transmissions and locking the user out of the main processor.
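The separation argued for here can be sketched as a toy model: the regulatory check lives inside the isolated modem firmware, so swapping the host OS never changes it. Every name and number below is illustrative, not a real baseband API or a real allocation chart.

```python
# Toy model of modem isolation: the modem, not the host OS, enforces
# regulatory limits. Bands and power levels here are made-up examples.

# Regulatory table burned into the modem: (start MHz, end MHz, max power dBm).
ALLOWED_BANDS = [
    (1710.0, 1785.0, 23.0),  # example uplink band
    (2500.0, 2570.0, 23.0),
]

class IsolatedModem:
    """Accepts transmit requests from the host OS but refuses any request
    outside its regulatory table. Because the check lives in the modem
    firmware rather than the application processor, installing a custom
    OS cannot bypass it."""

    def request_tx(self, freq_mhz: float, power_dbm: float) -> bool:
        for lo, hi, max_dbm in ALLOWED_BANDS:
            if lo <= freq_mhz <= hi and power_dbm <= max_dbm:
                return True  # within the regulatory envelope: transmit
        return False         # rogue request: dropped by the modem

modem = IsolatedModem()
assert modem.request_tx(1750.0, 20.0) is True   # legal request goes through
assert modem.request_tx(121.5, 30.0) is False   # aviation frequency: refused
assert modem.request_tx(1750.0, 40.0) is False  # over the power cap: refused
```

The point of the sketch is the boundary, not the table: the user keeps full control of everything above `request_tx`, while the shared spectrum stays protected below it.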
Play Integrity is a system of control, not security
Google Play Integrity and similar attestation frameworks are fundamentally flawed. They purport to verify a device’s security posture, but their actual purpose is to enforce Google’s specific definition of “safe.” If you choose to run a privacy-focused OS like GrapheneOS, these systems flag you as a risk and deplatform you. Banks and government agencies blindly follow this lead, punishing users who dare to prioritize their own privacy over Google’s defaults. It’s a mechanism to enforce conformity, and calling it malware isn’t hyperbole—it’s a description of what it does to your autonomy.
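The gap between measuring security and enforcing conformity can be made concrete with a toy model. This is not the real Play Integrity API; the functions and the "Google-certified" label are stand-ins for the policy being criticized. The underlying fact it models is real: GrapheneOS on supported Pixels runs with verified boot and a locked bootloader, yet still fails vendor attestation checks.

```python
# Toy model contrasting a genuine security check with a vendor-approval
# check. Names here are illustrative, not the real Play Integrity API.
from dataclasses import dataclass

@dataclass
class Device:
    verified_boot: bool      # OS image cryptographically verified at boot
    bootloader_locked: bool  # bootloader relocked, tamper-resistant
    os_vendor: str           # who signed the installed OS

def security_check(d: Device) -> bool:
    # What attestation COULD measure: the device's actual security posture.
    return d.verified_boot and d.bootloader_locked

def integrity_verdict(d: Device) -> bool:
    # What the modeled vendor attestation enforces instead: the same
    # security bar PLUS a requirement that the vendor approved the OS.
    return security_check(d) and d.os_vendor == "Google-certified"

graphene = Device(verified_boot=True, bootloader_locked=True,
                  os_vendor="GrapheneOS")
assert security_check(graphene) is True      # objectively secure
assert integrity_verdict(graphene) is False  # still deplatformed
```

The second function is strictly a subset of the first plus a brand check, which is the whole objection: the extra condition filters on conformity, not on risk.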
Hardware fixes for hardware problems
The debate often conflates software modification with hardware identity theft. Critics argue that if we don’t lock everything down, people will change IMEIs to hide stolen phones. That is a legitimate concern, but it has a technical solution that doesn’t require ruining the device for everyone. Identifiers like IMEIs should be burned into read-only memory or eFuses at the factory. If the ID is immutable in hardware, there is no need to enforce a software lockdown that prevents you from installing the operating system of your choice.
Security without ownership is just containment
We are rushing toward a future where our devices are incredibly secure, but only against their owners. The technology to verify integrity and protect the radio spectrum exists, but it is being implemented in a way that centralizes power in the hands of manufacturers and regulators. If we cannot audit, modify, and control the technology we rely on, we aren’t users anymore. We are just the product being managed by a system that views our curiosity as a vulnerability.
We need to stop accepting the false choice between a secure device and a controllable one. The goal shouldn’t be to lock down the machine to prevent any possible misuse; it should be to architect the system so that when you break the rules, you only hurt yourself, not the network. Until we separate the radio from the computer and the identity from the software, we are just building prettier prisons.
