
- Our WWDC correspondents, developer Brendon and designer Georgia, share their views on the most interesting updates from Apple for 2023.
- The huge news was the launch of the Apple Vision Pro headset, which will bring big changes to the future of computing.
- But there are a bunch of other nifty new developments that we're loving too…
Brendon's take:
The big news
So Apple have finally revealed the AR headset they’ve been working on for the past seven years. And while the US$3,500 Apple Vision Pro is not going to suit everyone (or their budget), it’s going to create fundamental changes in the computing and design space for years to come.
In particular, it will impact how we think about computing in a spatial realm and advance how we interact with interfaces - using our eyes, hand gestures and speech to navigate our devices. These new approaches to interaction will have accessibility and inclusion benefits too.
A first iteration
Right now the product is a large, bulky headset with an external battery pack, but that will become less of an issue as the technology advances. Because let’s not forget that this is still a first-generation product. Apple will continue to enhance it over the years, and the experience and capabilities will be improved and refined (something Apple is very good at doing).
Human touch
Yes, there’s still the issue that when we’re strapping a computer to our faces, we’re separating ourselves from the humanity around us; but Apple have tried to address this to some degree by projecting the user’s eyes onto the outside of the headset, so the wearer appears to still be engaging with the real world around them. Only time will tell if this is enough, or whether more needs to be done in this area.
Tools for the future
All in all, it has been a big journey for Apple to get to this point. They are not usually the first players to enter a new market segment, but they do learn from the failures of others in the space before them. Over the past seven years, they have been patiently building out the tools and frameworks needed to support this new spatial computing model.
These will now be used by Apple’s massive developer community to build innovative new apps and interactions, and that will in turn draw more users to the platform. Over time the cost of the hardware will come down and the appeal will increase.
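To make that a bit more concrete, here’s a minimal sketch of what a spatial app can look like in code. It’s an illustration under our own assumptions (the app and view names are invented, and real spatial apps will usually add 3D content on top), but the point stands: the same SwiftUI that powers iPhone and iPad apps carries over to visionOS largely unchanged.

```swift
import SwiftUI

// A minimal sketch of a spatial-computing app built with the same SwiftUI
// building blocks used on iPhone, iPad and Mac.
// (The app and view names here are our own, purely for illustration.)
@main
struct HelloSpatialApp: App {
    var body: some Scene {
        // On visionOS a WindowGroup appears as a window floating in the
        // user's space; the SwiftUI code itself is unchanged.
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    @State private var tapCount = 0

    var body: some View {
        VStack(spacing: 20) {
            Text("Hello, spatial computing")
                .font(.largeTitle)
            // On the headset this button is "tapped" by looking at it and
            // pinching, but the code is an ordinary SwiftUI Button.
            Button("Tapped \(tapCount) times") {
                tapCount += 1
            }
        }
        .padding()
    }
}
```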
The eyes have it
I also love that they have implemented Optic ID, an iris scan that unlocks the device with the user’s unique eye print. In time this may also come to iPhone and iPad. To do it securely, they have built on their Secure Enclave approach, which protects the user’s privacy and personal information.
Onwards and outwards
You can tell from watching the Apple Design talk that they have obsessed over the design of the hardware and software for years - to the point where many aspects of the product are already very polished and considered.
Overall, I think it has the potential to change the way we use computers now and into the future, bringing new ways of interacting with and viewing content.
More to watch
Outside of the Apple Vision Pro, Apple have also made plenty of smaller improvements across their other platforms, as they do year on year. This time the changes feel small and practical, with some real love applied to watchOS 10 to enhance the way Apple Watch apps are designed and function.
Machine learning in the mix
One other thing to note: Apple have not jumped on the AI bandwagon in the way they talk about their products and features. Having said that, they are still weaving on-device machine learning capabilities into their frameworks in subtle but useful ways. For instance, they now allow for subject lifting: the ability to pull foreground items out of an image and separate them from the background. The initial use case might be as simple as making stickers, but now that the capability is exposed as an API, it could become even more useful in future apps.
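For the developers out there, here’s roughly what that looks like. This is a hedged sketch based on the Vision framework’s new foreground instance mask request in iOS 17 / macOS 14; the function name and image URL are our own placeholders, and a real app would want more careful error handling.

```swift
import Vision

// Sketch: lift the foreground subject(s) out of an image using Vision's
// iOS 17 / macOS 14 instance mask request. `imageURL` is a placeholder -
// point it at any local image file.
func liftSubject(from imageURL: URL) throws -> CVPixelBuffer? {
    let request = VNGenerateForegroundInstanceMaskRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])

    // Each observation carries a mask covering the detected foreground instances.
    guard let observation = request.results?.first else { return nil }

    // Produce a new image containing only the subjects, cropped to their
    // extent - ideal for sticker-style cutouts.
    return try observation.generateMaskedImage(
        ofInstances: observation.allInstances,
        from: handler,
        croppedToInstancesExtent: true
    )
}
```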
Georgia's take:
Keeping in touch just got easier!
Improving how Apple users connect and communicate has been a big focus for iOS 17, and happily this includes a bunch of cool new features:
- AirDrop is expanding to include NameDrop, which allows you to easily swap selected contact info with someone just by bringing your iPhones or Apple Watch close together.
- Catch Up is a new feature in Messages that lets you jump to the top of group chats, so it’s easier to read through a conversation starting with the first message you missed (instead of the last one).
- In FaceTime, if the person you are calling is unavailable or doesn’t answer, you can now leave them a video message.
- Autocorrect has been improved and now offers better inline predictive text.
- Voice transcription has been introduced to voicemail and voice messages. Live Voicemail writes out the message in real time so you can see what the caller is saying as they’re recording it. And voice messages will now be transcribed, so you should be able to read through long voice messages (big YAY).
- As part of this release you can now also customise people’s contact posters, which will be displayed when they call you, in a similar way to the iOS 16 Lock Screen.

Next-level life tracking
There is a new Journal app where you can write down your thoughts and track your mood like a typical journaling app, but it can also track other information, like where you go and who you hang out with. This app takes life tracking to the next level and could reveal some fascinating insights into how you choose to spend your time.
And finally, a few other faves
Interactive widgets are finally here, woohoo! You shouldn’t need to go into the app anymore to adjust your lights - you can do that straight from the Home Screen.
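For a peek at how that works under the hood, here’s a hedged sketch of an interactive widget button built on the new App Intents support in iOS 17. The ToggleLightsIntent and its light-toggling behaviour are hypothetical placeholders of ours; the real mechanism is the AppIntent protocol plus the new Button(intent:) initialiser for widgets.

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Sketch: an interactive widget action built on iOS 17's App Intents.
// `ToggleLightsIntent` and its behaviour are hypothetical placeholders -
// a real app would call into its own lighting / HomeKit logic here.
struct ToggleLightsIntent: AppIntent {
    static var title: LocalizedStringResource = "Toggle Lights"

    func perform() async throws -> some IntentResult {
        // The app's code to switch the lights would run here,
        // without the user ever opening the app.
        return .result()
    }
}

// The widget's view: tapping the button runs the intent in place,
// straight from the Home Screen or Lock Screen.
struct LightsWidgetView: View {
    var body: some View {
        Button(intent: ToggleLightsIntent()) {
            Label("Lights", systemImage: "lightbulb")
        }
    }
}
```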
You can now take maps offline: just select a specific area in Maps - it could be a region or just a neighbourhood - and download it directly onto your iPhone.
Apple have introduced ‘StandBy’, a new mode for iPhone that activates when your device is charging and in landscape orientation. It displays a clock face by default, but you can customise it with widgets of your choice as well.
And last but not least, Apple has dropped the ‘Hey’ from Hey Siri so you can now just say ‘Siri’ to use it. Too easy!