Apple has opened up its Private Cloud Compute to researchers, offering up to $1 million to anyone who finds a hole in the secure cloud platform that supports its Apple Intelligence features.
The move comes as Apple Intelligence is set to launch on iPhones next week with the arrival of iOS 18.1, a major point upgrade. The release will bring AI features to the iPhone for the first time, including enhancements to the Siri voice assistant.
Apple is widely seen as offering a more secure and private AI option than smartphone makers in the Google Android ecosystem, such as Samsung, which offer so-called “hybrid AI.” That’s because Apple Intelligence processes as much data as possible on the device itself.
For more complex requests, Apple hands off processing to its Private Cloud Compute (PCC), which runs on the company’s own servers built with custom Apple silicon and a hardened operating system designed for privacy. Apple calls PCC “the most advanced security architecture ever deployed for cloud AI compute at scale.”
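To make that split concrete, here is a minimal, purely illustrative Swift sketch of the hybrid routing idea: keep a request on the device when a local model can handle it, and escalate only more complex requests to a hardened cloud back end such as PCC. The type and function names (`ProcessingTarget`, `AIRequest`, `route`) are hypothetical and are not Apple APIs.

```swift
import Foundation

// Illustrative only: none of these types are Apple APIs.
enum ProcessingTarget {
    case onDevice          // default: data never leaves the phone
    case privateCloud      // complex requests go to PCC servers
}

struct AIRequest {
    let prompt: String
    let estimatedComplexity: Int   // assumed heuristic, e.g. token count
}

func route(_ request: AIRequest, onDeviceLimit: Int = 512) -> ProcessingTarget {
    // Prefer local processing; fall back to the private cloud only when
    // the request exceeds what the on-device model can handle.
    request.estimatedComplexity <= onDeviceLimit ? .onDevice : .privateCloud
}

let request = AIRequest(prompt: "Summarize my last ten emails", estimatedComplexity: 2048)
print(route(request))   // privateCloud
```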
And Apple is confident in that security. When it launched PCC, the company said it would give security researchers the chance to find vulnerabilities in the private cloud platform.
“In the weeks after we announced Apple Intelligence and PCC, we provided third-party auditors and select security researchers early access to the resources we created to enable this inspection, including the PCC Virtual Research Environment,” Apple explained in a new blog post titled “Security research on Private Cloud Compute.”
On Oct. 24, Apple made those resources publicly available, inviting all security and privacy researchers, or “anyone with interest and a technical curiosity,” to find holes in the platform.
The firm said the aim is “to learn more about PCC and perform their own independent verification of our claims.”
At the same time, the iPhone maker is expanding its Apple Security Bounty to include PCC, with “significant rewards” for reports of issues with its security or privacy claims.
Apple’s $1 Million Bug Bounty
Apple’s bug bounty for PCC is generous. For the most serious holes, which it categorizes as allowing a “remote attack on request data,” it is offering $1 million for arbitrary code execution flaws. Remote access to a user’s request data or other sensitive information outside the trust boundary carries a still-substantial $250,000 reward.
For attacks that require a “privileged position,” such as a privileged position on the network between a user’s device and PCC, Apple is offering $150,000 for flaws allowing access to a user’s request data or other sensitive information about the user outside the trust boundary.
“Because we care deeply about any compromise to user privacy or security, we will consider any security issue that has a significant impact to PCC for an Apple Security Bounty reward, even if it doesn’t match a published category,” Apple said.
It added that it will “evaluate every report according to the quality of what’s presented, the proof of what can be exploited and the impact to users.”
Security researchers interested in the program can visit the Apple Security Bounty page to learn more and to submit their research.