Apple releases new preview of its AI, including ChatGPT integration
Apple on Wednesday released a beta version of a slew of Apple Intelligence features, including its long-awaited ChatGPT integration.
The company announced its answer to the artificial intelligence boom this summer, but is slowly rolling out its features to users. Investors hope AI features will spur a wave of iPhone upgrades because the tools are only available on newer devices.
Apple Intelligence has been available in previews for developers and early adopters, but the public release will come next week as part of iOS 18.1, Apple said. This latest batch of features is included in a beta version of iOS 18.2 for software developers that was released Wednesday. Apple developer betas typically go through a cycle of weeks before they are released to the public.
The preview included with iOS 18.2 contains:
- The ability for users to describe how they want Apple Intelligence to rewrite a chunk of text.
- Genmoji, Apple’s image generator for new emojis.
- Image Playground, Apple’s AI image generator.
- Image Wand, a feature that turns rough sketches into polished images in the Notes app.
- Integration with OpenAI’s ChatGPT.
However, the long-awaited ability for Siri to take actions inside apps isn’t included in this update; it is expected to arrive later.
Answers from ChatGPT
In June, Apple announced its integration with ChatGPT. Although Apple Intelligence and Siri mostly rely on Apple’s chips inside its devices, the company said at the time that for more sophisticated problems or questions, users can get responses from OpenAI’s chatbot instead.
At the company’s developer conference, Apple showed how the ChatGPT integration will work. When Siri is asked a question it identifies as better suited to ChatGPT, it will ask the user for permission before sending the query to ChatGPT. The user doesn’t need an OpenAI account. Users will also be able to use ChatGPT in text fields to generate text.
ChatGPT will also be used in part of a feature that Apple calls Visual Intelligence, where the phone’s camera can identify text or objects and even translate signs in real time.
The partnership between the two companies was a coup for OpenAI, which is now valued at $157 billion after a financing round announced earlier this month.
OpenAI CEO Sam Altman was spotted on Apple’s campus when the integration was announced. The deal came after Microsoft deeply integrated OpenAI’s models into its own products.
However, neither Apple nor OpenAI has commented publicly on the financial details of the partnership, and Apple was not an investor in OpenAI’s fundraising round. Apple execs have also suggested that other AI models, such as those from Google, may also integrate with Apple Intelligence in the future.
Some Apple Intelligence features are already being tested by the public and will be released next week as part of iOS 18.1. The first wave of tools includes the ability to rewrite text, a new look for Siri, and notification summaries that condense a stack of push notifications into a few sentences.