Recently there has been a conflict between the FBI and Apple over encryption. The FBI wanted Apple to build a backdoor into its flagship operating system, iOS 9, so that it could view the messages sent from a suspect's phone to other iPhones using iMessage. iMessage is an end-to-end encrypted messaging service that works on all iPhones, iPads, and Macs. End-to-end encryption means that anything sent from one device to another is secured so thoroughly that no one in between can read it, not even the company that runs the service. In the FBI v. Apple case, the FBI wanted Apple to hand over the messages sent through iMessage to help find and stop terrorists; the problem with the request is that Apple cannot actually view those messages, or even unlock the phone itself. Apple was able to unlock devices running older versions of iOS, but on iOS 9 it no longer can. Apple refused to help the FBI and agreed to go to court over the matter; before the court date, however, a third-party company approached the FBI and was able to unlock the iPhone. The FBI later revealed that the exploit only works on the iPhone 5c, and it dropped the case once the exploit succeeded. But just because the FBI dropped the case does not mean that people have stopped worrying about encryption and what the government can ask of us.
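To make the idea of end-to-end encryption concrete, here is a minimal sketch in Python using the PyNaCl library. This illustrates the general technique only; it is not Apple's actual iMessage protocol, and the names are placeholders:

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each device generates its own key pair; the private key never leaves the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
alice_box = Box(alice_key, bob_key.public_key)
ciphertext = alice_box.encrypt(b"Meet at noon")

# The service provider only ever handles this ciphertext in transit.
# Only Bob, holding his own private key, can decrypt it.
bob_box = Box(bob_key, alice_key.public_key)
plaintext = bob_box.decrypt(ciphertext)
assert plaintext == b"Meet at noon"
```

Because the provider never holds either private key, even a court order served on the provider cannot produce the plaintext; that is exactly the position Apple was in.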
On April 7, 2016, a new bill was introduced by Senators Richard Burr and Dianne Feinstein, called the "Compliance with Court Orders Act of 2016." According to its draft, the bill would effectively render encryption useless: to comply, companies would have to build either a 'backdoor' or some other mechanism through which decrypted data could be handed to a court of law. At first this may seem reasonable, even good, as in the case of the terrorist's iPhone, but if there is a way to render an operating system's encryption useless, what is to prevent a hacker from recreating that software? Hackers are always looking for exploits that get them into the kernel, and they constantly probe different operating systems; some companies, including Google, Facebook, Microsoft, and Uber, even pay them for it. These companies will often pay thousands of dollars to anyone who is able to hack their products, with the catch that the hacker must explain to the company how it was done so the hole can be closed.
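To see why a mandated backdoor weakens everyone's security, consider a hypothetical key-escrow variant of the sketch above (an assumption for illustration, not the bill's actual text): every message is also encrypted to a single escrow key, so whoever obtains that one private key, whether a court or a hacker who steals it, can read all traffic.

```python
from nacl.public import PrivateKey, SealedBox

# Hypothetical escrow key pair mandated by law; its private half
# would have to be guarded forever against theft.
escrow_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

message = b"Meet at noon"

# The sender must now produce two ciphertexts: one for the recipient...
to_bob = SealedBox(bob_key.public_key).encrypt(message)
# ...and one for the escrow authority.
to_escrow = SealedBox(escrow_key.public_key).encrypt(message)

# Anyone who steals the single escrow private key can decrypt
# every escrowed message ever sent, not just one suspect's.
stolen = SealedBox(escrow_key).decrypt(to_escrow)
assert stolen == message
```

The design flaw is that the escrow key becomes a single point of failure for every user at once, which is precisely the risk the bill's critics point to.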
A company's encryption software, especially at international companies, ships on many different devices, and a device that goes without updates for more than a few years becomes susceptible to intrusion. Even old devices that a person no longer uses retain remnants of sensitive data: banking information, contact information, personal photos, and locally stored passwords. Most people cannot get at these files, but to the skilled eye of a hacker they are easily accessed and exploited. The bill also fails to consider that if everything had a 'backdoor' for the government, none of the following would remain secure: doctor-patient confidentiality, defense schematics, security cameras, the itineraries of people in authority, and even banking and credit card information.
Now I know that the bill is currently in its draft stage, but even if it is refined to cover only certain businesses or branches in order to secure them for the good of the public, who can make choices for the good of the public other than the public itself? What gives the government the right to choose specifically which software is covered by, and held responsible under, this mandate? As it currently stands, I see this bill as an affront to our right to privacy, and it is not the correct answer to terrorism or to cybersecurity. I do not condone terrorism in any way, but this bill could cause serious, irreversible problems.