The FBI's recent request that Apple assist in unlocking an iPhone belonging to a suspected domestic terrorist has generated intense interest and opinion among pundits, lawmakers, and technology providers. The basic debate is about digital privacy and the obligation that companies do or do not have to modify their proprietary products for the so-called "public good".
The US Government's request that Apple create software to bypass the iPhone's security capabilities, in effect weakening the very security protections that define Apple products, is a big deal. Protecting the American public from terrorism is a very real and difficult task, made more so by the very technology at the center of this debate, but the solutions are not clear-cut.
As society increasingly relies on digital platforms for commerce and communication, economic growth depends more and more on the inherent trust that we put in modern technology and the institutions that provide it. The growth of the world economy, and social advancement generally, is to a great degree directly connected to our trust that technological processes keep our data safe: safe from criminals and safe from prying eyes. We expect our technology to keep us secure.
When tech companies or the government do things to break this trust, there is a domino effect, and all systems and tools become suspect. This case will set precedents for the foreseeable future. The intentional weakening of protections created to guard our personal privacy opens Pandora's box.
This is an incredibly complex topic. That the government wants Apple to write code (in effect creating a new product) in order to undermine an existing product is troubling in and of itself. Where will the line be drawn regarding a company's ability to create, or decline to create, a new product? Is that not the basis of free enterprise? What social obligations do large technology companies have to serve and protect the public?