Meta’s removal of end-to-end encryption from Instagram direct messages, confirmed for May 8, 2026 through a quiet help page update, prompts reflection on the path not taken. At every stage of the feature’s troubled history, alternative choices were available that could have produced a very different outcome.
The path not taken at the design stage: making encryption the default rather than opt-in. WhatsApp, which has encrypted all messages by default since 2016, shows the approach works at scale. Had Instagram shipped default encryption in 2023, adoption would have been far higher, and the low-uptake justification for removal would not have been available.
The path not taken at the promotion stage: actively encouraging users to enable encryption through user interface design, notifications, and public communication. None of this was done. The feature existed quietly, known only to users who sought it out.
The path not taken at the safety stage: investing in detection tools that work within encrypted systems. Security researchers have developed approaches, such as metadata analysis and user-reporting pipelines, that can surface harmful content without breaking encryption. Meta chose not to pursue this path. Law enforcement agencies, including the FBI, Interpol, and national bodies in Australia and the UK, had pushed for the easier path of removing encryption. Australia reportedly saw the feature deactivated before the global deadline.
Digital Rights Watch argued that the path not taken is as important as the path taken. Tom Sulston maintained that at every decision point, Meta had the option to choose privacy. The company chose otherwise. He and others are committed to ensuring that future platform decisions are made differently.