The first time Apple has formally wired a third-party AI into Siri's core flow.
A second release candidate for iOS 18.2 landed on developer and public beta tester devices Monday, one week after the first version made its rounds — and if history holds, the public release is days away. The build number ticked from 22C150 to 22C151, a small increment that signals Apple is satisfied enough with the software to push it toward the finish line. Alongside the iPhone update, Apple also seeded matching release candidates for iPadOS 18.2, macOS Sequoia 15.2, visionOS 2.2, tvOS 18.2, and HomePod Software 18.2.
What makes this particular update consequential is what's inside it. iOS 18.2 represents the second major wave of Apple Intelligence features — the AI initiative Apple has been building toward since the summer. The first wave, which arrived with iOS 18.1, was relatively modest: writing assistance, notification summaries, a smarter Siri. This round is considerably more ambitious.
The headlining addition is Image Playground, a standalone app that generates images from text prompts. Users can describe what they want, or lean on Apple's built-in suggestions for costumes, locations, and objects. The app can also draw inspiration from a Messages conversation or a note — so if you've been planning a trip to Tokyo with a friend, it can pull that context in. Images can be modeled on real people using photos as reference, and there's a version history so you can step backward through your edits. One important constraint: Image Playground produces only animated or illustrated images, not photorealistic ones. Apple has made that a deliberate boundary.
Tied to Image Playground is a feature called Image Wand, which lives inside the Notes app. Draw a rough circle around a blank area or a phrase in a note with an Apple Pencil, and Image Wand will generate an image to fill that space. It's a small feature in footprint but a meaningful one for anyone who uses Notes as a working canvas.
Genmoji rounds out the image-generation trio. These are custom emoji built from text descriptions, and like Image Playground, they can be modeled on people in your Photos library. You get multiple options to choose from, and the whole thing is accessible directly from the emoji keyboard. Apple is candid that all three image features — Genmoji, Image Wand, and Image Playground — can produce unexpected results, and the company is actively collecting feedback to improve them. Access to these three features is being rolled out gradually over the coming weeks through a secondary waitlist.
The Siri-ChatGPT integration is perhaps the most structurally interesting addition. When Siri can't answer a question on its own, it can hand the query to ChatGPT — but only after the user explicitly approves the handoff. ChatGPT can also generate text and images from scratch through this channel. No account is required, and Apple says neither it nor OpenAI retains any record of those requests.
For iPhone 16 owners specifically, iOS 18.2 also brings Visual Intelligence into fuller form. Point the camera at a restaurant and get its hours and reviews. Point it at a product and search Google for where to buy it. The camera can also read text aloud, detect phone numbers and addresses for easy saving to Contacts, and pull additional context from ChatGPT about whatever it's looking at.
Writing Tools, which debuted in iOS 18.1, gets a meaningful upgrade as well. Previously, users could only shift the tone of their writing to preset options — friendly, professional, simplified. With 18.2, the tool accepts open-ended instructions: turn this email into a poem, add more active verbs, shift the register entirely. It's a small change in interface terms but a significant one in practical flexibility.
Apple Intelligence is also expanding its language reach with this update, adding localized English support for Australia, Canada, New Zealand, South Africa, Ireland, and the United Kingdom — previously, only U.S. English was supported.
Compatibility requirements remain unchanged. Apple Intelligence needs an iPhone 15 Pro or any iPhone 16 model, an iPad with an M-series or A17 Pro chip, or an M-series Mac. The updates themselves will install on any device that supports iOS 18, but the AI features simply won't appear on older hardware. The full public release is expected before the week is out.
Notable Quotes
"Genmoji, Image Wand, and Image Playground can sometimes give results you weren't expecting — Apple is collecting feedback and will refine them over time."
— Apple, via product documentation accompanying the release candidate
The Hearth Conversation
Another angle on the story
Why does a second release candidate matter? Isn't the first one usually the final version?
Usually, yes — the first RC is often what ships. A second one means Apple found something worth fixing, even if it's minor. It's a small signal that they're being careful.
What's the most significant thing arriving in this update, in your view?
Probably the Siri-ChatGPT handoff. It's the first time Apple has formally wired a third-party AI into Siri's core flow. That's a structural shift, not just a feature addition.
The no-storage promise from Apple and OpenAI — how much does that actually matter to users?
It matters a lot to the people it matters to. Privacy has been Apple's differentiator for years. Saying requests aren't stored is them defending that identity even as they open the door to OpenAI.
Why limit Image Playground to illustrations and animations? Why not photorealism?
Partly technical, partly political. Photorealistic AI faces are where deepfakes live. Keeping it stylized is a way of drawing a line before regulators or critics draw it for them.
The waitlist for Genmoji and Image Playground — is that unusual for Apple?
It is. Apple typically flips a switch and the feature is there. A rolling waitlist suggests they're not fully confident in the infrastructure yet, or they want to manage feedback volume.
Visual Intelligence sounds like Google Lens. Is Apple just catching up?
Functionally, yes — Google has had this for years. But Apple's angle is that it's integrated into the camera button on iPhone 16 hardware, and it routes through their privacy framework. The feature isn't new; the packaging is.
What should people who don't have an iPhone 15 Pro or 16 take away from all this?
That the gap between supported and unsupported devices is widening fast. iOS 18.2 installs everywhere, but the experience on older phones is increasingly a different product.