The odd thing about the way that Apple handles its security business is that there’s no real way to tell how Apple handles its security business. The company’s motives and reasoning are unknowable, thanks to its near-total silence on security matters, and that attitude is beginning to border on the absurd.
The most recent entry in the growing timeline of security weirdness from Apple came earlier this week when the company summarily terminated researcher Charlie Miller from the Apple iOS Developer Program. The action came after news stories revealed that Miller had succeeded in placing an app in the iTunes App Store that was used to demonstrate a vulnerability he had identified in iOS. The flaw allowed him to bypass Apple’s restriction on unsigned code running on iOS devices and enabled him to take arbitrary actions on the device, including reading the contact list. He reported the vulnerability to Apple on Oct. 14.
Although Miller went through the proper App Store process, submitting the app for review and approval, it had a hidden capability that enabled it to call home to Miller’s Web server and download unsigned code. Once word got out about Miller’s app, Apple pulled it from the store and then sent the researcher a letter telling him that he was barred from the developer program for a year for violating the program’s terms of service.
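The trick described above can be illustrated in miniature: the code that passes review is benign, because the interesting behavior only arrives at runtime. The sketch below is a toy illustration in Python, not Miller's actual exploit; `fetch_remote_payload` is a hypothetical stand-in for a network call to the researcher's server.

```python
# Toy sketch of "review-safe" code with hidden dynamic behavior.
# The reviewed source contains nothing suspicious; the real logic
# is fetched and executed only after the app is running.

def fetch_remote_payload() -> str:
    # Hypothetical stand-in for downloading code from a remote server.
    # A reviewer inspecting the shipped app never sees this string.
    return "result = contacts[:1]"

def run_app(contacts: list) -> list:
    payload = fetch_remote_payload()
    scope = {"contacts": contacts}
    exec(payload, scope)        # unsigned, unreviewed logic runs here
    return scope["result"]
```

The point of the sketch is that static review of the submitted code cannot see the payload at all, which is why platforms lean on runtime restrictions (code signing, sandboxing) rather than review alone.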
There’s no disputing that Miller did in fact violate the terms of service, something he freely admits, and Apple certainly had every right to take the action it did, heavy-handed as it was. But the company could just as easily have handled the matter by telling Miller it didn’t appreciate the way he’d gotten the app into the store, warning him not to do it again, and moving along. That’s not the way Apple does things when it comes to security, though. It goes the other way.
One reason for that is the company’s reliance on its control of the iTunes App Store, and by extension, the software that runs on iPhones and iPads, for the security of those devices. Apple’s policy of reviewing every app before it’s approved for inclusion in the app store is designed to ensure, at least in part, that no malicious or compromised apps wind up on users’ devices. What Miller demonstrated wasn’t just that there is a serious flaw in iOS, but also that the app-review process itself is likely flawed.
Reviewing the apps before approval is the right idea and it’s served the company well thus far, but the huge number of apps submitted by developers every day makes it impractical, if not impossible, for Apple to review them manually. That means automation and it means that the process needs to move quickly, so things may slip through. The process is still light years ahead of what’s in place for the Android Market, which has seen a number of malware-laced apps get through, as well as proof-of-concept apps submitted by security researchers.
It’s fairly simple to construct an argument that Apple is giving the proper attention to the security of its products. The app-review process is a major plus, as is the requirement that all of the software running on iOS devices be signed. The iPhone has had a sandbox included in its operating system for some time now, and recently Apple informed developers that any app submitted to the Mac App Store must be sandboxed, as well, starting in March 2012.
But it’s also pretty easy to construct an argument that Apple has a long way to go on security. The company maintains total silence on vulnerabilities in its software, even when they’re already publicly known. It continues to patch products on an ad hoc basis, issuing fixes on its own timetable rather than moving to a regular schedule as many major vendors have. And Apple holds itself apart from the rest of the industry in many ways. It’s not part of SAFECode or any of the other software security initiatives, and its engineers and security specialists don’t do much public speaking on security topics.
Some of this is superficial stuff, but it all matters. It adds up. Even the way the company handled the flap with Miller isn’t necessarily all that important when viewed in isolation. Lots of software companies treat researchers badly, or with outright hostility. But it suggests a kind of indifference to how customers and others outside of Cupertino view the company’s security practices. Instead of pushing people away, Apple should be listening to what they have to say and inviting their feedback. Maybe that will change sometime soon, but the available evidence doesn’t support that conclusion.