
Apple offers US$1M reward for hacking its AI cloud

Apple is offering a US$1 million reward to anyone who can hack its AI cloud and inviting researchers to test the security of its Private Cloud Compute.

Apple has raised the stakes in digital security, offering up to US$1 million to anyone who can hack into its AI cloud, Private Cloud Compute (PCC). This bold move underscores Apple’s commitment to protecting user privacy and ensuring its systems remain secure and trustworthy. The PCC will handle tasks requiring greater processing power than is achievable on devices, but that comes with a heightened need for robust security.

Apple opens its AI cloud to the public for testing

In a recent Apple Security blog post, the company announced a new public research initiative, inviting developers, security researchers, and the tech-savvy public to examine PCC’s security. Previously, the PCC was only accessible to select security researchers and auditors. Now, by expanding access, Apple hopes to uncover any vulnerabilities in its AI cloud, offering hackers and security experts a chance to find and report flaws.

The PCC operates as Apple’s AI powerhouse, handling complex tasks when on-device capabilities, such as those on iPhones and Macs, fall short. Apple stresses that much of its AI processing is managed on-device to ensure data privacy. However, in cases where greater processing is necessary, data is transferred to the PCC, where Apple employs end-to-end encryption to keep user information secure. Still, having personal data leave one’s device can be unsettling to some users, and Apple’s bug bounty aims to reassure them by proving the robustness of its cloud security.
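For readers who want a concrete picture of the routing the article describes, the minimal Swift sketch below models the idea: run a request on-device when possible, and encrypt the payload before handing it to a remote compute service otherwise. The `PrivateCloudClient` type, `canRunLocally` flag, and `runLocalModel` helper are hypothetical stand-ins and not Apple APIs; only the CryptoKit calls are real, and the key-agreement step a production system would need is omitted.

```swift
import CryptoKit
import Foundation

// Hypothetical stand-in for a remote AI compute endpoint; not an Apple API.
struct PrivateCloudClient {
    func send(_ encryptedRequest: Data) { /* network call would go here */ }
}

// Placeholder for on-device inference.
func runLocalModel(on payload: Data) { /* process locally */ }

// Illustrative routing logic: prefer on-device processing, fall back to the cloud.
func handleAIRequest(_ payload: Data, canRunLocally: Bool) throws {
    if canRunLocally {
        // On-device path: data never leaves the device.
        runLocalModel(on: payload)
    } else {
        // Cloud path: encrypt before transmission so only the intended
        // endpoint can read the request. (Key distribution is omitted.)
        let key = SymmetricKey(size: .bits256)
        let sealed = try ChaChaPoly.seal(payload, using: key)
        PrivateCloudClient().send(sealed.combined)
    }
}
```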

Up to US$1 million on offer for critical vulnerabilities

Apple’s highest bug bounty payout is set at US$1 million for those able to execute malicious code on the PCC servers, the attack posing the greatest potential threat to security. This reward aims to surface any vulnerabilities that could compromise user data or cloud functionality. A second substantial bounty of US$250,000 will go to those who manage to extract user data from the AI cloud, with smaller rewards starting at US$150,000 for accessing data from a “privileged network position” within Apple’s network.

The tiered structure of Apple’s bounty program encourages ethical hackers and security professionals to discover a wide range of vulnerabilities, from critical issues that could allow malicious software to infiltrate the PCC to less severe but still concerning weaknesses. By providing rewards proportional to the risks posed, Apple is protecting its system and fostering a collaborative approach to security.
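As a quick reference, the tiers reported above can be summarised as data, as in the hedged sketch below. The category labels are informal descriptions drawn from this article, not Apple’s official bounty classifications.

```swift
// Informal summary of the reported Private Cloud Compute bounty tiers
// (labels are descriptive, not Apple's official category names).
enum PCCBountyTier: CaseIterable {
    case remoteCodeExecution      // running malicious code on PCC servers
    case userDataExtraction       // extracting user data from the AI cloud
    case privilegedNetworkAccess  // accessing data from a privileged network position

    var maximumPayoutUSD: Int {
        switch self {
        case .remoteCodeExecution:     return 1_000_000
        case .userDataExtraction:      return 250_000
        case .privilegedNetworkAccess: return 150_000
        }
    }
}
```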

Security and privacy are Apple’s priorities

Apple has a history of offering rewards for bug identification and has successfully used such programs to head off security threats. For example, two years ago, Apple paid a university student US$100,000 for identifying a vulnerability in its Mac operating system. The stakes have increased with Apple’s AI cloud, and the company hopes that the added incentive will attract top security talent to help safeguard its latest technological advancements.

The PCC will be critical in delivering seamless AI capabilities as Apple Intelligence becomes more widely integrated into products and services. While Apple assures users that data handled by the PCC remains private and is only accessible to the user, it is taking no chances regarding security. By opening its code to public scrutiny and incentivising discoveries, Apple is betting on the expertise of the tech community to identify and address any issues before they become real risks.

With its new AI cloud bug bounty, Apple sets a high standard for transparency and security in the tech industry. If successful, the initiative could strengthen user confidence in Apple’s commitment to privacy and secure data handling.

