Opinions

The Bad Apple

Reading Time: 3 minutes

The F.B.I. might be able to read the messages you send to your friends. But it’s not as bad as it sounds.

Law enforcement wants Apple to unlock and decrypt the messages on an iPhone that belonged to one of the two attackers who shot 14 people in San Bernardino, California last December. Because ten incorrect passcode attempts wipe the data on the phone, the F.B.I. cannot brute-force its way in the way a computer otherwise could; instead, it needs a backdoor.
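To see why the auto-erase setting matters, here is a rough back-of-the-envelope sketch in Python; the 4-digit passcode length is an assumption for illustration, not a detail reported in this case:

    # Rough arithmetic, assuming a 4-digit passcode (illustrative only).
    passcode_space = 10 ** 4        # 0000 through 9999
    allowed_attempts = 10           # misses allowed before the phone erases itself
    chance_before_wipe = allowed_attempts / passcode_space
    print(f"Odds of guessing before the wipe: {chance_before_wipe:.2%}")  # 0.10%

A computer could try all 10,000 combinations in seconds, but with only ten misses allowed, guessing is a gamble the F.B.I. cannot afford.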

By creating a backdoor, the F.B.I. would gain access not only to the information on this specific iPhone but also to the data on at least nine other smartphones currently identified as belonging to criminals. It is also entirely possible that the same code could be used to unlock almost every smartphone in existence. The dispute has been framed as a battle between consumer privacy and the security that comes with pursuing the greater good.

Apple sees it this way: it does not want to create a master key that could put its customers’ privacy at risk. In response to the F.B.I.’s demand, a letter from Tim Cook, the chief executive officer of Apple Inc., says, “No reasonable person would find that acceptable.”

Apple claims that its security measures exist to keep its users safe, and that business has nothing to do with its decision.

But nothing could be further from the truth.

Apple claims that it does not have the technology to spy on its customers, and that this case is unprecedented. However, in a report cited by the Wall Street Journal covering only the first half of 2015, Apple disclosed that it received “nearly 11,000 requests from government agencies worldwide for information on roughly 60,000 devices, and it provided some data in roughly 7,100 instances.” Not only does Apple have the ability to hand sensitive data to law enforcement, but its security is not as impenetrable as it claims. Apple has done it before and can do it again.

Furthermore, by refusing to unlock the phones, Apple is putting its sales ahead of safety. In response to Apple’s refusal, the F.B.I. shot back that the decision was simply “a marketing strategy.” This is true: Apple realizes that its consumers do not want their privacy compromised, and by upholding its reputation as a company that prioritizes security, it draws in loyal customers.

But these loyal customers are not limited to teenagers and their parents. Apple’s products will also attract extremists all over the world, including in the Middle East, where encryption services such as CryptoCat are popular. Apple encrypts the information on your phone in much the same way (encryption is essentially a way for a message to be viewed only by the sender and the recipient, using “keys” that scramble and unscramble the text), and these services attract extremists as well as activists. CryptoCat was created in 2011 by Nadim Kobeissi, a Lebanese activist and coder, who says that it “became popular among Middle Eastern activists during that year’s Arab Spring rebellions.”
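For readers curious what that parenthetical looks like in practice, here is a minimal sketch of public-key message encryption using the PyNaCl library; the key names and message are illustrative, and Apple’s iMessage uses its own, more elaborate protocol:

    # Minimal sketch of "keys that scramble and unscramble" a message,
    # using the PyNaCl library (illustrative; not Apple's actual protocol).
    from nacl.public import PrivateKey, Box

    # Each party keeps a private key and shares only the public half.
    sender_key = PrivateKey.generate()
    recipient_key = PrivateKey.generate()

    # The sender scrambles the message with the recipient's public key.
    ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"Meet at noon.")

    # Only the matching private key can unscramble it.
    plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
    assert plaintext == b"Meet at noon."

The point is simply that anyone with such keys, activist or extremist, can exchange messages that no one in the middle can read.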

The Arab Spring rebellions themselves cannot be seen as a completely negative series of events (although they did make it easier for ISIS to swoop in and fill the power vacuum), but they do send the message that people with the wrong intentions can put these services to use, and they do.

Since none of this is news, Apple is essentially encouraging, or at least tolerating, the use of its services to plan attacks and mass shootings. If Tim Cook and the rest of Apple truly stood behind their “deepest respect for American democracy and a love of [their] country,” as they state on their website, they would not stand for terrorists taking advantage of their so-called unhackable phones to kill innocent civilians. The F.B.I. should not let Apple undermine its authority or let this San Bernardino case become a source of hope and a green light for terrorists.

No one wants their text messages, photos, health records, or any other personal data broadcast to the world or even scanned by government officials. But if you purchase a device from a third party, it is unrealistic to believe that your data is perfectly secure. In an age when digital theft is becoming more prevalent than physical theft, protecting consumer data matters more and more. That protection, however, cannot come at the cost of being unable to gather the evidence needed to put terrorists in jail. The U.S. government needs to strike a delicate balance between protecting individual privacy and maintaining national security. Are more innocent civilian deaths really the price we have to pay for individual privacy?