Apple Intelligence expands in iOS 18.2 developer beta, adding Genmoji, Visual Intelligence and ChatGPT


Apple Intelligence’s rollout has been slow and at times confusing since the company first announced its take on AI at WWDC this year. That continues today with the release of the latest developer betas for iOS 18, iPadOS 18 and macOS Sequoia. The iOS 18.2, iPadOS 18.2 and macOS Sequoia 15.2 updates bring long-awaited features like Genmoji, Image Playground, Visual Intelligence and ChatGPT integration to beta testers, along with Image Wand for iPads and additional writing tools.

This follows the announcement that iOS 18.1 will arrive as a stable public release next week, bringing features like writing tools, notification summaries and Apple’s hearing test to the masses.

That will be the first chance for people who haven’t joined the beta program to check out Apple Intelligence, which the company introduced as a headline feature for the devices it launched this year. The iPhone 16 series, for example, was billed as designed for Apple Intelligence, even though the phones shipped without those features.

Now that the next set of tools is ready for developers to test, it looks like they’re weeks away from reaching the public. For those already on the developer beta, the update will download automatically. As always, a word of caution: beta software exists so users can test new features and check for compatibility issues. It can be buggy, so always back up your data before installing a preview. In this case, you’ll also need an Apple developer account to gain access.

Today’s updates bring Genmoji, which lets you create custom emoji from your keyboard. Go to the emoji keyboard, tap the Genmoji button next to the description or search input field, then type what you want to create. Apple Intelligence will generate several options; swipe through and select one to send. You’ll also be able to use them as Tapback reactions to other people’s messages. Plus, you can create Genmoji based on photos of your friends, producing more accurate likenesses of them. Since these are all rendered in emoji style, there’s little risk of mistaking them for real pictures.

Apple is also releasing a Genmoji API today so third-party messaging apps can render and display Genmoji, meaning people messaging on WhatsApp or Telegram will be able to see the new creations.

Other previously announced features, such as Image Playground and Image Wand, are also available today. The former is both a standalone app and something you can access from the Messages app via the Plus button. If you go through Messages, the system will quickly generate some suggestions based on your conversations. You can also type descriptions or select photos from your gallery as a reference, and the system will produce an image that you can then edit. To avoid confusion with real photos, only two art styles are available: Animation and Illustration. You won’t be able to generate photorealistic images of people.

Image Wand is also arriving today as an addition to the Apple Pencil tool palette, helping you turn rough sketches into more polished works of art.

As announced at WWDC, Apple is bringing ChatGPT to Siri and Writing Tools, and whenever your request might be better served by OpenAI’s tools, the system will suggest handing it off. For example, if you ask Siri to create an itinerary, exercise program or even a meal plan, the assistant may tell you it needs to use ChatGPT for that and ask for your permission. You can choose to have the system ask every time it hands off to ChatGPT, or to make these prompts less frequent.

It’s worth repeating that you don’t need a ChatGPT account to use these tools, and Apple has an agreement with OpenAI that when you use the latter’s services this way, data such as your IP address won’t be stored or used to develop models. However, if you connect your ChatGPT account, your content will be covered by OpenAI’s policies.

Elsewhere, Apple Intelligence will also let you compose with ChatGPT within Writing Tools, where you’ll find options like Rewrite, Summarize and Proofread. This is another area that got an update with the developer beta: a new tool called “Describe Your Changes.” It’s basically a command bar that lets you tell Apple what you want done to your writing, such as “Make it sound more enthusiastic” or “Check this for grammatical errors.” This should make it a little easier to have the AI edit your work, since you won’t have to dig into separate sections for, say, Proofread or Summarize. You can also ask for things like “Turn this into a poem.”

Finally, if you have an iPhone 16 or iPhone 16 Pro and are running the developer beta, you’ll be able to test Visual Intelligence. It lets you point your camera at objects around you and get answers to questions, like a math problem in your textbook or the menu at a restaurant you pass on your commute. It can also tap third-party services such as Google and ChatGPT.

Outside of the iPhone 16 series, you’ll need a compatible device to test any Apple Intelligence features. That means an iPhone 15 Pro or newer, or an M-series iPad or Mac.
