This is all a distraction, as operating system configuration and patching is not a "backdoor".
The best response to the FBI's request I've read thus far comes from the noted iOS forensics and security guru Jonathan Zdziarski, who wrote the following:
An instrument is the term used in the courts to describe anything from a breathalyzer device to a forensics tool, and in order to get judicial notice of a new instrument, it must be established that it is validated, peer reviewed, and accepted in the scientific community. It is also held to strict requirements of reproducibility and predictability, requiring third parties (such as defense experts) to have access to it. I've often heard Cellebrite referred to, for example, as the Cellebrite instrument in courts. Instruments are treated very differently from a simple lab service, like dumping a phone. I've done both of these for law enforcement in the past: provided services, and developed a forensics tool. Providing a simple dump of a disk image only involves my giving testimony of my technique. My forensics tools, however, required a much more thorough process that took significant resources, and they would for Apple too.
The tool must be designed and developed under much more stringent practices that involve reproducible, predictable results, extensive error checking, documentation, adequate logging of errors, and so on. The tool must be forensically sound and not change anything on the target, or document every change it makes in the process. Full documentation must be written that explains the methods and techniques used to disable Apple's own security features. The tool cannot simply be some throw-together to break a PIN; it must be designed in a manner in which its function can be explained and its methodology reproduced by independent third parties. Since the FBI is supposedly the one to provide the PIN codes to try, Apple must also design and develop an interface / harness to communicate PINs into the tool, which means added engineering for input validation, protocol design, more logging, error handling, and so on. The FBI has asked to do this wirelessly (possibly remotely), which also means transit encryption, validation, certificate revocation, and so on.
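The input-validation and logging requirements described above can be illustrated with a minimal sketch. Everything here is hypothetical: the 4-to-6-digit PIN assumption, the `validate_pin` name, and the logger all stand in for whatever harness Apple would actually design, and none of it comes from any real tool.

```python
import logging
import re

# Hypothetical harness sketch: validate candidate PINs before they reach
# the unlock routine, and log every decision so the process can later be
# reconstructed and explained in court.
logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pin_harness")

# Assumption for this sketch: PINs are 4 to 6 numeric digits.
PIN_PATTERN = re.compile(r"\d{4,6}")

def validate_pin(candidate: str) -> bool:
    """Return True only for well-formed PIN candidates; log both outcomes."""
    if not PIN_PATTERN.fullmatch(candidate):
        log.warning("rejected malformed PIN candidate: %r", candidate)
        return False
    log.info("accepted PIN candidate of length %d", len(candidate))
    return True
```

The point of the sketch is that even this trivial gatekeeping step carries engineering obligations: the rejection path, the log format, and the assumptions baked into the pattern all become things that must be documented and defended.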
Once the tool itself is designed, it must be tested internally on a number of devices with exactly matching versions of hardware and operating system, and peer reviewed internally to establish a pool of peer-review experts that can vouch for the technology. In my case, it was a bunch of scientists from various government agencies doing the peer-review for me. The test devices will be imaged before and after, and their disk images compared to ensure that no bits were changed; changes that do occur from the operating system unlocking, logging, etc., will need to be documented so they can be explained to the courts. Bugs must be addressed. The user interface must be simplified and robust in its error handling so that it can be used by third parties.
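The bit-for-bit before/after comparison described above can be sketched with cryptographic hashes. The file names and synthetic data here are placeholders; a real validation would also diff the images block by block so that any changes (from unlocking, logging, etc.) can be located and documented.

```python
import hashlib
import tempfile

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a disk image in chunks so arbitrarily large images fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Demo on two tiny synthetic files standing in for real before/after images.
with tempfile.NamedTemporaryFile(delete=False) as before, \
     tempfile.NamedTemporaryFile(delete=False) as after:
    before.write(b"\x00" * 4096)  # image taken before the tool ran
    after.write(b"\x00" * 4096)   # image taken after; identical in this demo
    before_path, after_path = before.name, after.name

identical = sha256_of(before_path) == sha256_of(after_path)
print("forensically identical" if identical else "differences must be documented")
```

Matching digests show no bits changed; a mismatch triggers the documentation requirement Zdziarski describes.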
Once the tool is ready, it must be tested and validated by a third party. In this case, it would be NIST/NIJ (which is where my own tools were validated). NIST has a mobile forensics testing and validation process by which Apple would need to provide a copy of the tool (which would have to work on all of NIST's test devices) for NIST to verify. NIST checks to ensure that all of the data on the test devices is recovered. Any time the software is updated, it should go back through the validation process. Once NIST tests and validates the tool, it would be cleared for the FBI to use on the device. Here is an example of what my tools' validation from NIJ looks like: https://www.ncjrs.gov/pdffiles...
During trial, the court will want to see what kind of scientific peer review the tool has had; if it is not validated by NIST or some other third party, or has no acceptance in the scientific community, the tool and any evidence gathered by it could be rejected.
Apple must be prepared to defend their tool and methodology in court; no really, the defense / judge / even juries in CA will ask stupid questions such as, why didn't you do it this way, or is this jailbreaking, or couldn't you just jailbreak the phone? (I was actually asked that by a juror in CA's broken legal system that lets the jury ask questions). Apple has to invest resources in engineers who are intimately familiar with not only their code, but also why they chose the methodology they did as their best practices. If certain challenges don't end well, future versions of the instrument may end up needing to incorporate changes at the request of the FBI.
If evidence from a device ever leads to a case in a court room, the defense attorney will (and should) request a copy of the tool to have independent third party verification performed, at which point the software will need to be made to work on another set of test devices. Apple will need to work with defense experts to instruct them on how to use the tool to provide predictable and consistent results.
In the likely event that FBI compels the use of the tool for other devices, Apple will need to maintain engineering and legal staff to keep up to date on their knowledge of the tool, maintain the tool, and provide testimony as needed.
In other words, developing an instrument is far more involved than the simple phone dump the FBI could have ordered instead. Apple would have to:
Develop the tool to forensically sound standards
Have it validated and peer reviewed
Test and run it on numerous test devices
Get it accepted in court
Provide it to third-party forensics experts (testing)
Provide it to defense experts (defense)
Defend it against challenges
Explain it on the stand
Possibly hand over source code if ordered
Maintain the tool and report on issues
Defend lawsuits from those convicted
Legally pursue any agencies, forensics companies, or hackers that steal parts of the code
Maintain legal and engineering staff to support it
On appeals, go through much of the process all over again
The risks are significant too:
Ingested by an agency, reverse engineered, then combined with in-house or purchased exploits to bypass code signing.
Ingested by private forensics companies, combined with other tools / exploits, then sold as a commercial product.
Leaked to criminal hackers, who reverse engineer and find ways to further exploit devices, steal personal data, or use it as an injection point for other ways to weaken the security of the device.
The PR nightmare of demonstrating, in a very public venue, how the company's own products can be backdoored.
The judicial precedent set, which would allow virtually any agency to compel the software's use on any other device.
The international ramifications of other countries following in our footsteps, many of which have governments that oppress civil rights.
This far exceeds the realm of reasonable assistance, especially considering that Apple is not a professional forensics company and has no experience in designing forensic methodology, tools, or forensic validation. The FBI could attempt to circumvent proper validation by issuing a deviation (as they had at one point with my own tools); however, this runs the risk of causing the house of cards to collapse if challenged by a defense attorney.