For the past decade Apple has tried to make the iPhone one of the most secure devices on the market. By locking down its software, Apple keeps its more than one billion iPhone owners safe. But security researchers say that lockdown makes it impossible to look under the hood to figure out what happened when things go wrong.
Once the company that claimed its computers don’t get viruses, Apple has in recent years begun to embrace security researchers and hackers in a way it hadn’t before.
Last year at the Black Hat security conference, Apple's head of security, Ivan Krstic, told a crowd of security researchers that the company would give its most trusted researchers a “special” iPhone with unprecedented access to the device’s underbelly, making it easier to find and report security vulnerabilities for Apple to fix, under what it calls the iOS Security Research Device program.
Starting today, the company will loan these special research iPhones to skilled and vetted researchers who meet the program’s eligibility requirements.
These research iPhones will come with custom-built iOS software that has features ordinary iPhones don’t, like SSH access, a root shell for running custom commands with the highest level of access to the software, and debugging tools that make it easier for security researchers to run their code and better understand what’s going on under the surface.
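For illustration only, here is a minimal sketch of what that kind of access could look like in practice, assuming the device is reachable over SSH at a placeholder address. The hostname, port, username and password below are hypothetical, not details Apple has published, and the snippet uses the third-party paramiko library to open a connection and run a single command as root.

```python
# Hypothetical sketch of using SSH access on a research device.
# "research-iphone.local", port 22, the "root" user and the password are
# placeholder assumptions; Apple has not published the device's interface.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("research-iphone.local", port=22, username="root", password="example")

# Run one command in the root shell and print its output.
_, stdout, _ = client.exec_command("uname -a")
print(stdout.read().decode())

client.close()
```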
Apple told TechCrunch it wants the program to be a collaboration rather than a matter of shipping out a device and calling it a day. Hackers in the research device program will also have access to extensive documentation and a dedicated forum where Apple engineers answer questions and offer feedback.
These research devices are not new per se, but they have never before been made directly available to researchers. Some researchers are known to have tested their exploits on these internal, so-called “dev-fused” devices, which have found their way onto underground marketplaces. Those out of luck had to rely on “jailbreaking” an ordinary iPhone first to get access to its internals. But jailbreaks are rarely available for the most recent iPhones, making it harder for hackers to know whether the vulnerabilities they find can still be exploited or have already been fixed.
By effectively giving its best hackers an up-to-date, pre-jailbroken iPhone with some of its usual security restrictions removed, Apple wants to make it easier for trusted researchers to find vulnerabilities buried deep in the software that haven’t been discovered before.
But even though these research phones are more open to hackers, Apple said the devices don’t pose a risk to any other iPhone’s security if they are lost or stolen.
The new program is a huge leap for a company that only a year ago opened its once-private bug bounty program to everyone, a move seen as long overdue and one that came far later than at most other tech companies. For a time, some well-known hackers, frustrated with Apple’s once-restrictive bug bounty terms, would publish their bug findings online without first alerting Apple, a practice hackers call dropping a “zero-day” because it gives the company no time to patch.
Now, under its bounty program, Apple asks hackers to privately submit bugs and security issues for its engineers to fix, helping to harden its iPhones against nation-state attacks and jailbreaks. In return, hackers are paid on a sliding scale based on the severity of the vulnerability.
Apple said the research device program will run parallel to its bug bounty program. Hackers in the program can still file security bug reports with Apple and receive payouts of up to $1 million — and up to a 50% bonus on top of that for the most serious vulnerabilities found in the company’s pre-release software.
The new program shows Apple is less cautious and more welcoming of the hacker community than it once was, even if it’s a case of better late than never.