Apple in June introduced Private Cloud Compute (PCC), a cloud intelligence system that brings the advanced security and privacy protections of the vendor's personal devices to the cloud to run private AI workloads.

As the device maker wrote, the goal is to make sure that “personal user data sent to PCC isn’t accessible to anyone other than the user — not even to Apple. Built with custom Apple silicon and a hardened operating system designed for privacy, we believe PCC is the most advanced security architecture ever deployed for cloud AI compute at scale.”

Apple has long offered users ways to process workloads on devices such as Macs, iPhones and iPads while ensuring the security and privacy of the data. However, larger AI foundation models and the data they use demand more than those devices can handle on their own. With PCC, users can offload their private AI jobs to the cloud in a secure manner that protects their privacy.

To prove PCC's bona fides and build trust among the system's users, Apple gave researchers and third-party auditors early access to the system to inspect and verify its end-to-end security and privacy claims. That included testing the PCC Virtual Research Environment (VRE), a set of tools that lets researchers run their own security analysis of PCC directly from a Mac.

Now the company is opening up that opportunity to everyone.

Everybody’s Invited

“We’re making these resources publicly available to invite all security and privacy researchers — or anyone with interest and a technical curiosity — to learn more about PCC and perform their own independent verification of our claims,” Apple’s Security Engineering and Architecture (SEAR) team wrote in a recent blog post.

That includes expanding the company's bug bounty program to cover PCC, with rewards that can reach as much as $1 million, and publishing the Private Cloud Compute Security Guide, a detailed look at how PCC's architecture was designed. The guide provides technical details about the system's components and how they work together to deliver a high level of security and privacy for processing AI workloads in the cloud.

The guide covers a range of topics, from how the features are implemented in hardware and how PCC requests are authenticated and routed so they can't be targeted, to how users can inspect the software that runs in Apple's data centers. It also shows how the system's privacy and security features hold up under multiple attack scenarios.
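One of the guide's points, that requests are routed so individual users can't be targeted, rests on separating who is asking from what is asked: Apple has said requests travel through third-party relays so a PCC node never learns the client's identity, while the relay never sees the plaintext request. Below is a minimal conceptual sketch of that separation. It is not Apple's protocol or code: the `Relay` and `Node` classes and the toy XOR cipher are illustrative assumptions, standing in for real encrypted transport.

```python
# Conceptual sketch of non-targetable routing: the relay sees the client's
# identity but only ciphertext; the node sees plaintext but no identity.
# The toy XOR keystream cipher stands in for real encryption.
import hashlib


def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic keystream from a key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream; applying it twice recovers the input."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))


class Node:
    """A compute node that handles requests without learning identities."""

    def __init__(self, key: bytes):
        self.key = key
        self.seen_identities = []  # stays empty: identity never reaches us

    def handle(self, ciphertext: bytes) -> bytes:
        request = xor_crypt(self.key, ciphertext)  # XOR is symmetric
        return xor_crypt(self.key, b"response to " + request)


class Relay:
    """Forwards ciphertext to the node, stripping the client identity."""

    def __init__(self, node: Node):
        self.node = node

    def forward(self, client_ip: str, ciphertext: bytes) -> bytes:
        # The relay holds no decryption key, so it cannot read the request;
        # client_ip is dropped here and never passed to the node.
        return self.node.handle(ciphertext)


# Usage: the client encrypts to the node's key, then hands the blob to
# the relay along with its (otherwise unused) network identity.
node_key = b"session key established with the node"
node = Node(node_key)
relay = Relay(node)
blob = xor_crypt(node_key, b"private AI request")
reply = xor_crypt(node_key, relay.forward("203.0.113.7", blob))
```

The point of the sketch is the trust split: compromising the relay yields identities but no content, while compromising the node yields content but no identities to target.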

Going on the Bug Hunt

The bug bounty program outlines PCC vulnerability categories, including unintended data disclosure caused by configuration flaws or problems with the system design, and vulnerabilities that let bad actors exploit user requests to gain unauthorized access to PCC. Another category covers security flaws in which access to internal interfaces in the cloud system can lead to its compromise.

The rewards in the program can run as high as $1 million for an attack on request data that leads to remote code execution (RCE). Threat hunters who detect remote attacks that access user request data or other sensitive information will be rewarded with $250,000.

Apple also lists attacks on request data mounted from a privileged network position, which can bring investigators between $50,000 and $150,000.

Detailing the VRE

According to SEAR, the VRE is used to run the PCC node software in a virtual machine with few modifications. It’s available in the macOS Sequoia 15.1 Developer Preview and requires a Mac with an Apple chip and 16GB or more of unified memory.

“Userspace software runs identically to the PCC node, with the boot process and kernel adapted for virtualization,” the team wrote. “The VRE includes a virtual Secure Enclave Processor (SEP), enabling security research in this component for the first time — and also uses the built-in macOS support for paravirtualized graphics to enable inference.”

Through the VRE, users can list and inspect PCC software releases, download the binaries for each release, boot a release in a virtualized environment, and run AI inference work against demonstration models. In addition, they can modify and debug the software to perform more investigations.

Accessing the Source Code

Apple is making the source code for some PCC components, those used to implement the security and privacy requirements, available under a limited-use license, enabling users and investigators to run a deeper analysis of the system. The code includes Apple's Cloud Attestation project for building and validating the PCC node's attestations, the Thimble project for verifiable transparency, and the splunkd daemon for filtering the logs a PCC node can emit to protect against accidental data exposure.
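The attestation and transparency pieces fit together in a simple pattern: before a client sends request data to a node, it checks that the software measurement the node attests to matches a release published in a public, append-only log, which is exactly what makes the binaries inspectable by researchers. The sketch below illustrates that pattern only; the `TransparencyLog` class, `measure` function, and digests are hypothetical stand-ins, not Apple's Cloud Attestation or Thimble code.

```python
# Conceptual sketch of attestation checked against a transparency log:
# a client only trusts a node running software whose measurement was
# publicly logged (and is therefore available for inspection).
import hashlib


def measure(release_image: bytes) -> str:
    """Stand-in for a software measurement: a SHA-256 digest."""
    return hashlib.sha256(release_image).hexdigest()


class TransparencyLog:
    """Append-only record of published release measurements."""

    def __init__(self):
        self._entries = []

    def publish(self, release_image: bytes) -> str:
        digest = measure(release_image)
        self._entries.append(digest)  # entries are never removed
        return digest

    def contains(self, digest: str) -> bool:
        return digest in self._entries


def verify_node(attested_digest: str, log: TransparencyLog) -> bool:
    # Refuse to send request data unless the node's attested measurement
    # corresponds to a publicly logged release.
    return log.contains(attested_digest)


log = TransparencyLog()
logged = log.publish(b"PCC release image bytes")
ok = verify_node(logged, log)                        # logged release
rejected = verify_node(measure(b"tampered"), log)    # unlogged build
```

A node running a modified, unlogged build produces a measurement absent from the log, so clients refuse it; the real system adds cryptographic signing and hardware-rooted attestation on top of this membership check.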

There is also the SRD tools project, which lets users understand how the VRE enables running the PCC code. All of the source code is published on GitHub.
