The revelation that a major FaceTime bug can effectively turn your Apple devices into a hot mic, allowing a caller to hear or even see you before you pick up, would be a massive embarrassment no matter which company was involved. It’s an absolutely crazy security fail.

But when that company is Apple – which has been ceaselessly pushing privacy of late – it becomes so cringeworthy we’re going to have to invent a whole new scale just to measure it …

I mean, I get it. Bugs happen. No-one intends them, but coding is complex, and software engineers are human. It’s just a fact of life that some bugs will make it through, and that this will include security vulnerabilities.

Software testing is also complex, given the massive number of variables involved. This particular FaceTime bug occurs only when someone does something completely illogical and unexpected: adds themselves to a call they initiated. I appreciate this would have been a tricky scenario to anticipate and include in testing.

But when you are Apple, a company which has talked of little other than privacy over the past few months, then you don’t get a pass on this. And if you think I’m holding Apple to too high a standard, let’s take a look at some examples.

FaceTime Bug vs. Privacy

October 2, Tim Cook talks privacy to Vice.

October 23, Cook gives a keynote address at the International Conference of Data Protection and Privacy Commissioners in Brussels.

The way we go into product design is we challenge ourselves to collect as little as possible. We challenge ourselves to make it not identifiable. We don’t read your email, your messages. You are not our product. It’s not the business we’re in.

October 24, Cook says many companies can’t be trusted on privacy, and that federal regulation is needed.

November 18, Cook talks privacy with HBO.

January 5, an Apple billboard in Vegas claims ‘What happens on your iPhone, stays on your iPhone.’

January 24, Cook writes an op-ed for Time in which he says that ‘data breaches seem out of control.’

Apple Standards

The standard to which I’m holding Apple today is one the company set for itself, very loudly and very frequently.

Difficult or not, the testing work to prevent a security vulnerability of this magnitude has to be done. Every variable has to be tested, whether it’s someone adding themselves to a call they made, adding contacts in reverse alphabetical order, or asking Siri to initiate a call while standing on your head in a west-facing room on a Thursday evening.

Apple has responded by disabling group FaceTime calls. That’s a responsible course of action. And I have no doubt that it will quickly release an update to fix the bug.

But this FaceTime bug is an absolutely massive fail. Apple either needs to overhaul its software development and testing regime so that it can be certain nothing this serious ever occurs again, or it needs to stop throwing quite so many stones from what turns out to be a glass house.