Apple, Security and Your Passcode



There’s one simple thing, just one thing, that stands between strangers and virtually everything about you. From personal health and financial information to your email and social media accounts, just one thing keeps people out of your most private information. What is that simple thing? It’s your iPhone’s passcode. Whether it’s a simple 4-digit code, a longer password or even your fingerprint, when you’re not using your iPhone, your passcode is the last line of defense that keeps your information safe and sound.

Your passcode lets you control what people see and what they don’t, but most of us never think about it. Unlocking our iPhones before we use them has become second nature, and we depend on that security feature to protect us. Now, though, the secure feeling we get from that passcode is being called into question.

Apple has famously maintained that there is no backdoor into its iOS software, that is, no way for the company to access your information without knowing your passcode. Even Apple cannot read the information you store on your iPhone. The company argues that designing a backdoor into the software would, obviously, introduce a new security risk: the backdoor itself. If Apple could access your information, what would stop a hacker, a scorned spouse or even a parent from finding a way (or finding the right person) to break into an iPhone and see everything stored on it? According to Apple, nothing. Introducing a backdoor would compromise the software, and it would only be a matter of time before someone found it in the code and took advantage of it.


Unfortunately, that’s exactly what Apple has been asked to do.

Deep in your iPhone’s settings, you can turn on a passcode feature that makes your iPhone automatically erase all of its data if the passcode is entered incorrectly 10 times in a row. That might sound like an extreme step, but your information can still be recovered if you’ve done an iCloud backup. And if your phone is stolen, then unless the thief happens to know your passcode (or you’ve picked something like 0000, 1234 or 1111), you’ll be happy to know that your data will be wiped after the tenth failed attempt. Safe. Secure. Awesome.
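To make the idea concrete, here is a purely illustrative Swift sketch of how an “erase after 10 failed attempts” counter could work in principle. This is not Apple’s actual code; the names PasscodeGate, hash(_:) and eraseAllData() are hypothetical stand-ins.

```swift
// Illustrative sketch only, NOT Apple's implementation of the erase-after-10 feature.
struct PasscodeGate {
    let storedPasscodeHash: String
    let maximumFailedAttempts = 10
    var failedAttempts = 0

    mutating func attemptUnlock(with passcode: String) -> Bool {
        if hash(passcode) == storedPasscodeHash {
            failedAttempts = 0              // a correct guess resets the counter
            return true
        }
        failedAttempts += 1
        if failedAttempts >= maximumFailedAttempts {
            eraseAllData()                  // the safeguard at the center of the FBI dispute
        }
        return false
    }

    private func hash(_ passcode: String) -> String {
        // Stand-in only: a real device derives keys in dedicated hardware.
        return String(passcode.reversed())
    }

    private func eraseAllData() {
        print("All user data erased after \(maximumFailedAttempts) failed attempts.")
    }
}

var gate = PasscodeGate(storedPasscodeHash: String("2580".reversed()), failedAttempts: 0)
_ = gate.attemptUnlock(with: "1234")        // wrong guess: the counter moves to 1
```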

But this kind of technology also raises some eyebrows where national security is concerned. Your passcode doesn’t just keep your information away from whoever finds your iPhone; it also protects criminals who use iPhones to store evidence of the crimes they have committed or contact details for other persons of interest. That is exactly the situation with the San Bernardino shooters from the December 2015 terrorist attack in California.

One of the deceased shooters, Syed Rizwan Farook, had an iPhone 5c, and investigators believe it may hold information about his connections to the terrorist organization ISIS or even other potential lone-wolf attacks in the United States. But since recovering the iPhone late last year, they have been unable to get any information from it because of Farook’s passcode.

A federal judge in California has ordered Apple to give the FBI special firmware that would suspend one of the safety precautions built into the iPhone’s passcode lock: the feature that erases everything stored on the device after 10 failed passcode attempts. Without that safeguard in place, the FBI could try combination after combination until it found the correct passcode and gained access to the suspect’s iPhone (which is reportedly running iOS 9).
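For a rough sense of why removing that safeguard matters, here is a back-of-the-envelope Swift calculation. A 4-digit passcode has only 10,000 possible combinations; the roughly 80 milliseconds per guess used below is an assumed hardware-enforced delay, not a figure from this article.

```swift
// Back-of-the-envelope estimate, not a measurement: brute-forcing a 4-digit
// passcode once the 10-attempt erase limit is removed.
let combinations = 10_000.0          // possible 4-digit passcodes (0000 through 9999)
let secondsPerGuess = 0.08           // assumed per-attempt delay (hypothetical figure)
let worstCaseMinutes = combinations * secondsPerGuess / 60.0
print("Worst case: roughly \(Int(worstCaseMinutes)) minutes to try every 4-digit passcode.")
```

Under those assumptions, every combination could be tried in well under an hour, which is why the erase-after-10 limit is doing so much of the security work here.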

Apple has decided to fight the FBI over the federal judge’s ruling, refusing to help the government unlock Farook’s iPhone, and while this may be an extremely popular stance for Apple to take, what matters is that the brand stands by its customers’ privacy. In the tech world, if you can’t protect your users’ data, your users won’t trust you and, therefore, won’t use your service. In Apple’s case, helping the FBI unlock a terrorist’s iPhone might seem like a no-brainer, but it would set a dangerous precedent. In the end, Apple will come out on top, having protected the rights of its customers even in the face of the United States federal government.


In an open letter to Apple’s customers, Tim Cook plainly explains the need for data encryption and why the company cannot build the backdoor that the government is asking for. In short, he writes that though the FBI’s intentions are just, helping the government unlock Farook’s iPhone would give the government a lot of power to take a peek into something as intimate as our personal iPhones.

What do you think about Apple fighting the United States government over its users’ privacy? Should the company unlock the phone or keep fighting against adding a backdoor to its software? Let us know in the comments below.



Kokou Adzo

Kokou Adzo is a stalwart of the tech journalism community who has been chronicling the ever-evolving world of Apple products and innovations for over a decade. As a Senior Author at Apple Gazette, Kokou combines a deep passion for technology with an innate ability to translate complex tech jargon into relatable insights for everyday users.

3 Comments


  1. I truly believe that dead terrorists or other such criminals do not have the same rights as I have, and I strongly believe that the FBI should be allowed access to the information, especially from a company phone disconnected from the cloud, in order to save many, many more lives from terrorist acts or domestic violence.
    Quit your posturing and marketing! Save lives.
    Erin Mahan Brainerd