WWDC 2018: Takeaways & New Mobile Technology

Apple recently held its WWDC 2018 event and unveiled new technology across the mobile spectrum. Our team has been diving into the tech to understand how it will change the future of applications. Here we’ve outlined a few of the technologies we’re most excited about and how these changes will begin to show up in some of the mobile applications we’re building.

Machine Learning

Apple has updated Core ML, its machine learning framework. This makes it much easier for developers to incorporate artificial intelligence into mobile applications; the most practical uses include recognizing objects and categorizing data. With the new updates, it’s easier to deploy AI in typical apps, and the smaller memory footprint makes it more broadly practical for older devices.
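As a rough illustration, here’s what on-device object recognition looks like when a Core ML model is run through the Vision framework. This is a minimal sketch: `FlowerClassifier` is a hypothetical model name standing in for whatever `.mlmodel` file you add to an Xcode project (Xcode generates a Swift class for it automatically).

```swift
import CoreML
import Vision

// Minimal sketch: classify an image with a bundled Core ML model.
// "FlowerClassifier" is a hypothetical model class generated by Xcode
// from an .mlmodel file added to the project.
func classify(_ image: CGImage) {
    guard let model = try? VNCoreMLModel(for: FlowerClassifier().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // The top result carries a label and a confidence score.
        if let best = request.results?.first as? VNClassificationObservation {
            print("\(best.identifier): \(best.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Vision handles the image scaling and format conversion the model expects, which is a large part of why deploying AI in a typical app has become so much easier.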

The new Natural Language framework allows for much richer text recognition, parsing, and interpretation. This can power Siri-like voice and chat experiences, even when users include jargon or words that take multiple forms. Application developers can build in aspects of text interpretation or even voice recognition, and tools that interpret either voice or text will continue to grow in sophistication and comprehension without being bound to Alexa or Siri experiences.
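To give a feel for the framework, here’s a minimal sketch that tags each word in a sentence with its part of speech using `NLTagger`, one of the building blocks announced with the Natural Language framework:

```swift
import NaturalLanguage

// Minimal sketch: label each word in a string with its lexical class
// (noun, verb, etc.) using the Natural Language framework.
let text = "Apple unveiled new machine learning tools in San Jose."
let tagger = NLTagger(tagSchemes: [.lexicalClass])
tagger.string = text

tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                     unit: .word,
                     scheme: .lexicalClass,
                     options: [.omitPunctuation, .omitWhitespace]) { tag, range in
    if let tag = tag {
        print("\(text[range]): \(tag.rawValue)")
    }
    return true // keep enumerating
}
```

The same tagger supports other schemes, such as identifying names of people and places, which is the kind of parsing that underpins richer chat experiences.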

Augmented Reality and Vision

During WWDC 2018, Apple also announced a significantly improved Augmented Reality library, ARKit 2, which makes it easier to save, restore, and share experiences anchored to the same position in the real world. This will make it possible to play AR games collaboratively, do space planning, or drop digital notes and objects in the real world for others to find, with multiple users interacting with the same virtual AR environment in real time.
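The save-and-restore piece centers on ARKit 2’s new `ARWorldMap`. A minimal sketch, assuming you already have a running `ARSession`: capture the world map, archive it to disk, and later feed it back into a new session so anchors reappear in the same physical spot (sending the same data to another device is how shared experiences work).

```swift
import ARKit

// Minimal sketch: persist an ARKit 2 world map and restore it later
// so virtual content reappears in the same real-world position.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: url)
    }
}

func restoreWorldMap(from url: URL, into session: ARSession) {
    guard let data = try? Data(contentsOf: url),
          let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data)
    else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map // relocalize against the saved map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```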

Apps can now recognize arbitrary images and even 3D objects, after some training with a sample object or image. Example uses include bringing a toy to life, or letting a physical object such as a car or appliance show you what all the buttons do through an AR owner’s manual.
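Setting up that recognition is mostly configuration. A minimal sketch, where "AR Resources" is an assumed name for an asset-catalog group holding the trained reference images:

```swift
import ARKit

// Minimal sketch: tell ARKit which reference images to watch for.
// When one appears in the camera feed, the session reports an ARImageAnchor.
func runImageDetection(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if let references = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                         bundle: nil) {
        configuration.detectionImages = references
    }
    // ARKit 2 handles 3D objects the same way, via detectionObjects and
    // ARReferenceObject.referenceObjects(inGroupNamed:bundle:).
    session.run(configuration)
}
```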

Such experiences can also overlay other live data. Imagine an AR experience in a basketball arena: today the at-home broadcast shows scores and info, but the new ARKit can put that information in the hands of spectators, overlaid on plays and players.

Siri Shortcuts

Apps can now expose “shortcuts” for common actions. This can allow you to order your favorite coffee with a catch-phrase and it will enable app developers to expose actions that can be easily launched. You could have an action for opening your garage and starting your car, filing a report, or controlling your home. Siri will also infer frequent actions and recommend shortcuts based on time and location.
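The simplest way to expose an action is to donate an `NSUserActivity` each time the user performs it; Siri then learns when to suggest it. A minimal sketch, where the activity type and title are assumptions (the type must also be declared under `NSUserActivityTypes` in the app’s Info.plist):

```swift
import Intents
import UIKit

// Minimal sketch: donate a "shortcut" to the system so Siri can
// suggest it and users can attach a phrase to it.
func donateOrderCoffeeShortcut(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.order-coffee")
    activity.title = "Order my usual coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true // lets Siri proactively suggest it

    viewController.userActivity = activity
    activity.becomeCurrent() // the donation: records that this action happened
}
```

Donating on every real use is what lets Siri infer the time- and location-based patterns described above.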

Users can provide their own activation phrase, making it even more personalized. Siri can return a lightweight interface and provide confirmations to prevent accidental actions.

Other Updates

“Free demo” options will let you try certain apps before you buy them, with a limited-time trial.

With its new Screen Time feature, Apple is starting a campaign to reduce our screen addiction by making it easier for users to control their notifications. iOS 12 will also show users where they spend the most time on their phones and let them set time limits and reminders to leave an app that’s particularly sticky.

Along with the items outlined above, there are hundreds of other details from WWDC 2018 still being explored. Together they create a rich canvas for developers to build thoughtful, creative, and innovative applications.

Do you have a project in mind? We’d love to work with you. If you’d like an opportunity to work on projects with us, check out our Careers page. We’re hiring!