
iOS 18 and integrating our apps with Siri and AI

Written by Brendon, Principal Consultant - Security


[Image: Apple tablet, iPhone and earbuds. Photo by Xiong Yan on Unsplash]
  • Integrate your iOS 18 apps more deeply into the Apple ecosystem to support Siri and AI
  • Share content in standard formats between apps and services
  • Apple are releasing App Intents schemas for over 100 actions across a dozen domains

With the introduction of iOS 18 we gain some welcome new ways to integrate your mobile app with the Apple Intelligence features rolling out across 2024 and 2025.

Apple has extended the integration points between its OS-level AI and Siri features and third-party apps, laying the groundwork for AI to both find and interact with your custom apps and their content.
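
To make that concrete, here is a minimal sketch of a custom action exposed through the existing AppIntents framework. The intent, its parameter and its dialog are hypothetical names for illustration, not Apple sample code:

    import AppIntents

    // A minimal custom action that Siri and Shortcuts can discover and run.
    struct CreateNoteIntent: AppIntent {
        static var title: LocalizedStringResource = "Create Note"

        @Parameter(title: "Text")
        var text: String

        func perform() async throws -> some IntentResult & ProvidesDialog {
            // A real app would persist the note here; this sketch only replies.
            return .result(dialog: "Created a note containing \"\(text)\".")
        }
    }

Once an intent like this ships in your app, it appears in Shortcuts automatically and gives Siri and Apple Intelligence a typed, discoverable entry point into your functionality.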

Apple are releasing App Intents schemas covering over 100 actions across a dozen domains, which set the pattern for exposing your application and its content through templates that AI and Siri can consume and interact with. On day one, iOS 18 supports the Photos and Mail domains, but many more will follow through 2024 and 2025.

Apple have also provided an IndexedEntity protocol for surfacing searchable content from inside your app directly to iOS, along with the Transferable protocol for exporting content out of your application as PDF, PNG or RTF. This allows your content to move beyond the boundaries of your application and be shared through the ecosystem. Finally, Apple are improving universal deep linking, so that when Siri or an AI capability discovers your content it can link straight into your app. The sketches below show roughly what each of these pieces looks like in code.
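
First, indexing. As a rough sketch, here is what conforming a hypothetical NoteEntity to IndexedEntity and donating it to Spotlight can look like; the entity, query and indexing call are written from the WWDC material and may need adjusting against the final APIs:

    import AppIntents
    import CoreSpotlight

    // A hypothetical entity representing a note in the app.
    struct NoteEntity: IndexedEntity {
        static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
        static var defaultQuery = NoteEntityQuery()

        var id: String
        var title: String

        var displayRepresentation: DisplayRepresentation {
            DisplayRepresentation(title: "\(title)")
        }
    }

    // The query the system uses to resolve note identifiers back to entities.
    struct NoteEntityQuery: EntityQuery {
        func entities(for identifiers: [NoteEntity.ID]) async throws -> [NoteEntity] {
            // Look the notes up in the app's own store; stubbed for illustration.
            []
        }
    }

    // Donate the entities so Spotlight, Siri and AI features can find them.
    func indexAllNotes(_ notes: [NoteEntity]) async throws {
        try await CSSearchableIndex.default().indexAppEntities(notes)
    }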
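
Exporting that same hypothetical entity in standard formats is a matter of adopting Transferable. The sketch below offers RTF with a plain-text fallback; makeRTFData() is a stand-in for whatever conversion your app's content model needs:

    import CoreTransferable
    import UniformTypeIdentifiers
    import UIKit

    extension NoteEntity: Transferable {
        static var transferRepresentation: some TransferRepresentation {
            // Richer representation first; plain text as the fallback.
            DataRepresentation(exportedContentType: .rtf) { note in
                try note.makeRTFData()
            }
            ProxyRepresentation(exporting: \.title)
        }

        // Placeholder conversion; a real app would render its full content model.
        private func makeRTFData() throws -> Data {
            let attributed = NSAttributedString(string: title)
            return try attributed.data(
                from: NSRange(location: 0, length: attributed.length),
                documentAttributes: [.documentType: NSAttributedString.DocumentType.rtf]
            )
        }
    }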
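
And for deep linking, the iOS 18 App Intents additions include a URLRepresentableEntity protocol that lets an entity describe its own universal link, so the system can route from Siri or Spotlight straight to the right screen. The link format below is made up for the hypothetical NoteEntity:

    import AppIntents

    extension NoteEntity: URLRepresentableEntity {
        static var urlRepresentation: URLRepresentation {
            "https://notes.example.com/note/\(.id)"
        }
    }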

So, starting with iOS 18, we will begin to see custom apps expose their content into the broader iOS ecosystem of Siri, Spotlight and AI.

If you want to learn more, refer to Apple's WWDC 2024 videos “What’s new in App Intents” and “Bring your app to Siri”.

