This was my first thought -- it's not an audit of the devices' security so much as a search for exploitable flaws in them, and/or some form of industrial espionage.
But I wonder -- can Apple set the terms of the audit? I.e., you examine the code in our office, on systems we provide that aren't connected to the Internet. You may not bring any electronic devices into the audit facility. You may not reproduce any code you review in our facility by any means, including notes, pseudocode, block diagrams, etc.
I suppose there's still some risk -- i.e., deliberate subterfuge involving copying in some form, the use of a memory savant, or an error so obvious that they can attack it without exfiltrating any information.
I don't know, but I also assume that a truly thorough security audit of a large, novel (i.e., you didn't write it) code base is hard and may depend on second-order effects, like the actual generated object code. Which may make it extremely time-consuming -- didn't the funded audit of TrueCrypt take an extremely long time just to complete the initial phase?