Apple plans to use in-house server chips for AI tools this year
Apple Inc. will deliver some of its upcoming artificial intelligence features this year via data centers equipped with its own in-house processors, part of a sweeping effort to infuse its devices with AI capabilities.
The company is placing high-end chips — similar to ones it designed for the Mac — in cloud-computing servers built to process the most advanced AI tasks coming to Apple devices, according to people familiar with the matter. Simpler AI-related features will be processed directly on iPhones, iPads and Macs, said the people, who asked not to be identified because the plan is still under wraps.
The move is part of Apple’s much-anticipated push into generative artificial intelligence — the technology behind ChatGPT and other popular tools. The company is playing catch-up with Big Tech rivals in the area but is poised to lay out an ambitious AI strategy at its Worldwide Developers Conference on June 10.
Apple’s plan to use its own chips and process AI tasks in the cloud was hatched about three years ago, but the AI craze — fueled by OpenAI’s ChatGPT and Google’s Gemini — forced the company to accelerate its timeline.
The first AI server chip will be the M2 Ultra, which launched last year in the Mac Pro and Mac Studio computers, though the company is already eyeing future versions based on the M4 chip.
A representative for Cupertino, California-based Apple declined to comment.
Relatively simple AI tasks — like providing users a summary of their missed iPhone notifications or incoming text messages — could be handled by the chips inside Apple devices. More complicated jobs, such as generating images, summarizing lengthy news articles and creating long-form responses in emails, would likely require the cloud-based approach — as would an upgraded version of Apple’s Siri voice assistant.
The move, coming as part of Apple’s iOS 18 rollout in the fall, represents a shift for the company. For years, Apple prioritized on-device processing, touting it as a better way to ensure security and privacy. But people involved in the creation of the Apple server project — code-named ACDC, or Apple Chips in Data Centers — say that components already inside its processors can safeguard user privacy. The company relies on a technology called the Secure Enclave, which can isolate user data in the event of a security breach.
For now, Apple is planning to use its own data centers to operate the cloud features, but it will eventually rely on outside facilities — as it does with iCloud and other services. The Wall Street Journal reported earlier on some aspects of the server plan.
Luca Maestri, Apple’s chief financial officer, hinted at the approach on an earnings call last week. “We have our own data center capacity and then we use capacity from third parties,” he said after being asked about the company’s AI infrastructure. “It’s a model that has worked well for us historically, and we plan to continue along the same lines going forward.”
Handling AI features on devices will still be a big part of Apple’s AI strategy. But some of those capabilities will require its most recent chips, such as the A17 Pro launched in last year’s iPhone 15 Pro and the M4 chip that debuted in the iPad Pro earlier this week. Those processors include significant upgrades to the so-called neural engine, the part of the chip that handles AI tasks.
Apple is rapidly upgrading its product line with more powerful chips. In a first, it’s bringing a next-generation processor — the M4 — to its entire range of Mac computers. The Mac mini, iMac and MacBook Pro will get the M4 later this year, and the chip will go into the MacBook Air, Mac Studio and Mac Pro next year, Bloomberg News reported in April.
Taken together, the plans lay the groundwork for Apple to weave AI into much of its product line. The company will focus on features that make life easier for users as they go about their day — say, by making suggestions and offering a customized experience. Apple isn’t planning to roll out its own ChatGPT-style service, though it’s been in discussions about offering that option through a partnership.
Just last week, Apple said the ability to run AI on its devices will help it stand out from rivals.
“We believe in the transformative power and promise of AI, and we believe we have advantages that will differentiate us in this new era, including Apple’s unique combination of seamless hardware, software and services integration,” Chief Executive Officer Tim Cook said during the earnings call.
Without getting into specifics, Cook said that Apple’s in-house semiconductors would give it an edge in this still-nascent field. He added that the company’s privacy focus “underpins everything we create.”
The company has invested hundreds of millions of dollars in the cloud-based initiative over the past three years, according to the people. But there are still gaps in its offerings. For users who want a chatbot, Apple has held discussions with Alphabet Inc.’s Google and OpenAI about integrating one into the iPhone and iPad.
Talks with OpenAI have recently intensified, suggesting that a partnership is likely. Apple also could offer a range of options from outside companies, people familiar with the discussions have said.