Put your hand up if you would like the information on your mobile phone to be secure. Photos of your kids kept private. E-mails from work about new product launches to remain just with employees. Posts on social media limited to the audience you choose. That type of thing.
I would be surprised if your hand is not currently in the air. There aren't too many people who would be happy if everything they have on their phone - their bank accounts, contacts, everything - ended up in the hands of a random person somewhere in the world. Consequently, phone manufacturers place a high level of importance on security. Seems sensible.
Now jump to a separate conversation. With an average of 1.2 mass shooting events a day in the US, you can be excused for not remembering one specific event. On December 2, 2015, 14 people were killed and 22 injured in an Islamic terrorist attack in San Bernardino, California. The two shooters were killed in a shootout with police but, in investigating the case, the FBI collected 400 pieces of evidence, including an iPhone.
Exploring the prospect that the phone might hold crucial evidence - including information on a possible third shooter - the FBI politely asked Apple to unlock the phone of a known terrorist. Apple politely declined. By the end of 2016, Apple had received - and objected to - at least eleven orders issued by United States district courts under the All Writs Act of 1789.
I am not Apple but if the FBI knocked on my door and asked for some help in the investigation of a known terrorist, I would be pretty happy to bend over backwards. Most people would be the same - but in this case, go back to my original point. People want security in their phones.
Apple argued that if they built a phone with a secret backdoor that allowed them to access information, that same backdoor could be used by people with nefarious intent.
Unlike Meghan and Harry, you can't have your cake and eat it too. As tragic as the loss of 14 lives was, Apple argued that the loss of lives would potentially be greater if Apple deliberately built a phone that was able to be hacked.
This case is ancient history in technology terms but the issue is once again at the centre of technology discussions after a lone shooter killed three and injured eight in Pensacola, Florida on December 6, 2019. The incident has been officially classified as an act of terrorism. Two iPhones have been recovered by the FBI as part of its investigation.
In a case of history repeating itself, the US Attorney-General has urged the tech giant to unlock the two mobiles used by the gunman but, no surprises here, Apple have refused. They have explained that they "have always maintained there is no such thing as a backdoor just for the good guys," and further that "backdoors can also be exploited by those who threaten our national security and the data security of our customers."
It would seem that the logic from Apple is that the two things we desire in our society - security for our own mobiles but no security for the mobiles of 'bad guys' - are mutually exclusive. On the plus side for Apple, the ongoing vitriol from Trump and the US government is the best advertising possible for its security!
Let me know where you stand on Apple's stance. Security for all or access for the government? firstname.lastname@example.org
- Mathew Dickerson is the founder of regional tech and communications company Axxis Technology.