This is my proposal for how Apple could help the FBI unlock a phone, while ensuring security for all its users. I’d love your thoughts in the comments.
The FBI want Apple to create a custom version of iOS, the operating system the iPhone runs, that allows unlimited unlock attempts without delays between retries. I see the sense in being able to unlock a device for law enforcement purposes. My concerns are with insecure code getting into the wrong hands, and with law enforcement agencies unlocking devices that perhaps don’t pass the test for unlocking candidacy.
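To see why those retry delays matter, here’s some back-of-the-envelope arithmetic. The delay figures below are my own illustrative assumptions, not Apple’s actual lockout schedule:

```python
# Illustrative only: rough brute-force estimates for a 4-digit passcode.
# The per-attempt timings are assumptions for the sake of argument,
# not Apple's real lockout behaviour.

CODES = 10_000            # 4-digit passcodes: 0000 through 9999
SECONDS_PER_TRY = 0.1     # assumed time to submit one guess electronically

# Worst case with no retry delays: try every code back to back.
no_delay_hours = CODES * SECONDS_PER_TRY / 3600

# With an assumed flat one-hour delay after each failed attempt,
# exhausting the whole keyspace takes over a year.
delay_years = CODES * 1 / 24 / 365

print(f"no delays: about {no_delay_hours:.1f} hours")
print(f"one-hour delay per attempt: about {delay_years:.1f} years")
```

Remove the delays and the phone falls in well under a day; keep them and brute force is impractical, which is exactly why the FBI want them gone.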
If the FBI, or any other law enforcement agency (LEA), required a locked device to be unlocked with the help of the device manufacturer, there must be safeguards in place to make sure it only happens when it is absolutely certain that the device may contain material evidence for a severe crime. I won’t define severe here, but I wouldn’t feel comfortable if Apple facilitated the unlocking of an iPhone in the case of minor infringements. A warrant for a specific and named device (not a general warrant) would suffice for this purpose, and would need to be provided by the LEA to the manufacturer.
The manufacturer would then be afforded a reasonable amount of time to confer internally, and consult with relevant external parties such as legal advisors and civil liberties organisations. There may need to be some confidentiality clause around these conversations.
The manufacturer would then have the ability to appeal the request if they felt the device wasn’t a valid candidate for unlocking, or to proceed with the request.
In the event that software needs to be altered in order to facilitate the unlock process, this must be done in a secure environment, specifically for the purpose of unlocking that one device, and the altered software destroyed thereafter. Further, the software must never leave the secure environment, including on the device that is required to be unlocked. Here’s a sample process for the particular case outlined in the preamble.
Assuming Apple proceed with the FBI’s request, the FBI would submit the device to Apple. The FBI would have the right to have an employee witness the end-to-end unlocking process, should they feel it appropriate. For example, they might not want the phone to leave their sight.
Apple would assign a room in their office to carry out the unlocking process. This room would be locked and access restricted only to those taking part in the unlock process.
In this room, there would be a computer without internet access. Any internet ports in the room would be disabled, and a WiFi blocker installed so no data can enter or leave the room.
A software developer would take the code for iOS, alter it to remove the unlock restrictions, compile it, and install it directly on the iPhone in question. Once successfully installed, the altered software would be securely deleted from the computer.
Now that the iPhone has a less secure operating system on it, the FBI employee, or other nominated person, can begin the unlock attempt process. In this particular case, it’s a manual process of typing in all the codes. If a software process is used, the device that unlocks the phone would need to be wiped of all data before leaving the secure room again.
Once the phone is unlocked, the contents of the phone, not including the operating system software, are extracted and written to a write-once medium. A checksum for the contents is also made and recorded in a write-once fashion. (Write-once basically means in a way that modification is impossible or detectable, such as on to a non-rewritable CD.)
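The checksum step can be sketched in a few lines. I’m using SHA-256 here as one example of a suitable checksum, and the payload is a stand-in for the real extracted contents:

```python
import hashlib

def checksum(data: bytes) -> str:
    """Return a SHA-256 hex digest acting as a tamper-evidence seal."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for the extracted phone contents (messages, photos, etc.).
extracted = b"example extracted contents"

# Recorded once, in a write-once fashion, at extraction time.
recorded = checksum(extracted)

# Later verification: the digest matches only if the data is untouched,
# so any modification of the copy is detectable.
assert checksum(extracted) == recorded
assert checksum(extracted + b"tampered") != recorded
```

Because changing even one byte of the data changes the digest, anyone holding the recorded checksum can verify that the extracted copy hasn’t been altered since it left the room.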
With the data now securely extracted, the phone is factory reset to the default operating system. The phone may now leave the room, along with the write-once media containing its contents. All other computers that have entered the room must be securely wiped and reinstalled from scratch.
The room can now have its security measures lifted.
Oversight and Room Access
Unless the room is empty, there must be at least one member of the LEA, one member of the manufacturer’s staff, and one independent auditor present in the room. If any person leaving the room causes this minimum requirement to fail, all persons must exit the room, ensuring no computing devices are taken with them. There must be a security guard on the door 24 hours a day, who ensures only authorised persons are admitted, and that the minimum requirement of those present is adhered to.
The independent auditor would be, for example, a representative of a recognised civil liberties organisation.
This process would give me assurance that only the absolutely essential cases are processed (who’s going to pay for a 24/7 security guard to get SMSs off a phone from a shoplifter?), that every interested party has representation, and that any code written to reduce the security of a device never leaves a contained and siloed environment. Would this work for you?