With the iOS 18.3 update, Apple is adding new Visual Intelligence features for the iPhone 16 models. After installing the update, iPhone 16 users will be able to add an event to the Calendar app when using the Camera Control Visual Intelligence option to view a poster or flyer.
Using Visual Intelligence to add an event to Calendar was a function Apple promised when it introduced Camera Control, and it becomes usable in iOS 18.3. To add an event, view a document such as a poster with a date on it, and then tap on the date when it pops up in the Visual Intelligence interface.
The update also adds a feature for easily identifying plants and animals with Visual Intelligence. The Photos app can already provide insight into plants, animals, and insects when you opt to view additional information in the editing interface, but Visual Intelligence via Camera Control will now show these details in real time.
When viewing an animal or a plant with Visual Intelligence, you may see a tappable bubble that lets you know what you're looking at. Tapping it provides more information.
Visual Intelligence is an Apple Intelligence feature that is exclusive to the iPhone 16 models. It can be activated by long pressing on the Camera Control button.
They need to surface this in the Camera app. Even on my iPhone 16 Pro, I rarely think to use it by long pressing the Camera Control button to get into this feature.
I don't expect to use it a lot, though. Google Lens was always neat, but it didn't change the world for most people.
So if you point the camera at a tabby cat, you'll get a tabby cat button up top, but then what are "Ask" and "Search" for, and what's the difference? So many buttons, always.