Highlights from WWDC: Apple Keynote


The 2018 WWDC Apple Developer Conference was held at the McEnery Convention Center in San Jose, CA from June 4-8. We had a couple of our developers attend the conference – as we do every year. During its Keynote on June 4 at 10 am Pacific Time, Apple revealed many exciting updates.

What was Float most excited about? Here are some of the highlights from the Keynote, focusing on iOS 12 and its AR capabilities:

 

Performance:

iOS 12 is designed specifically to make your iPhone and iPad faster and more responsive. Everyday actions will be quicker: bringing up your camera, typing a message, even launching your apps. The new iOS supports devices dating back to the iPad Air and iPhone 5s (allowing us to build great apps even for customers who haven't bought the latest hardware!).

Apple posted statistics saying that the update will speed up common actions by as much as:

  • 70% faster swipe to Camera
  • 50% faster keyboard display
  • 2x faster app launch

Whoa, right? We’re excited about the performance improvements.

 

FaceTime:

With iOS 12, you will now be able to FaceTime with up to 32 people at once. Small tiles showing each participant grow on screen as they talk, so you can always follow the conversation. You will be able to start a Group FaceTime from Messages and join any FaceTime call that is active within the chat.

 

Memoji and Animoji:  

Apple was very excited to showcase Memoji, its new customizable emoji, alongside new Animoji, both coming to the iPhone X. These emoji detect your facial expressions and mirror them in the messages you send.

 

Augmented Reality:

Apple’s update to ARKit, ARKit 2, will make it possible for developers to create fully immersive AR experiences. This update lets multiple people experience the same AR content together. Messages and Mail will also make it possible to send AR objects that can be viewed in the real world on the recipient’s device.
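The shared-experience piece of ARKit 2 centers on ARWorldMap, which captures a device's understanding of the space so it can be handed to another device. Here is a minimal sketch of that capture step; the transport (for example MultipeerConnectivity) and error handling are assumptions left out for brevity:

```swift
// Sketch: capture the current ARWorldMap so it can be sent to another device
// for a shared AR session. The `send` closure is a placeholder for whatever
// transport the app uses (e.g. MultipeerConnectivity).
import ARKit

func shareCurrentWorldMap(from session: ARSession, send: @escaping (Data) -> Void) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("World map unavailable: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        // ARWorldMap supports secure coding, so it can be archived to Data.
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            send(data)
        }
    }
}
```

The receiving device would unarchive the map and pass it to its own session configuration to join the same world.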

There will also be a new app called “Measure” that allows you to measure real-world objects just by pointing your camera at them. Although Google’s Tango devices did this some time ago, we’re excited to see this feature come to iOS.
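The same idea is easy to prototype with ARKit itself. This is a rough sketch, not Apple's implementation, of how an app can approximate the distance between two tapped points using hit testing; `sceneView` is assumed to be an ARSCNView already running a world-tracking session:

```swift
// Sketch: turn a screen tap into a 3D point via ARKit hit testing,
// then measure the distance between two such points (in meters).
import ARKit
import SceneKit
import simd

func worldPosition(at screenPoint: CGPoint, in sceneView: ARSCNView) -> simd_float3? {
    // Prefer detected planes, fall back to feature points.
    guard let hit = sceneView.hitTest(screenPoint,
                                      types: [.existingPlaneUsingExtent, .featurePoint]).first else {
        return nil
    }
    let t = hit.worldTransform.columns.3
    return simd_float3(t.x, t.y, t.z)
}

func distance(from a: simd_float3, to b: simd_float3) -> Float {
    return simd_distance(a, b)  // ARKit units are meters
}
```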

 

CoreML 2:

Apple’s machine learning framework, used across many of its products like Siri, Camera, and QuickType, is getting updated! Its fast performance lets developers build intelligent apps with just a few lines of code. You can now also train your own models on a Mac using Create ML and the playgrounds within Xcode 10.

This is a topic that excites us!
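To give a sense of how little code Create ML requires, here is a minimal sketch of training an image classifier in an Xcode 10 macOS playground. The folder layout (one subfolder per label) and the file paths are assumptions for illustration:

```swift
// Sketch: train an image classifier with Create ML in a macOS playground.
// TrainingData/<label>/<image>.jpg — each subfolder name becomes a class label.
import CreateML
import Foundation

let trainingURL = URL(fileURLWithPath: "/Users/me/TrainingData")

// Train a classifier from the labeled directories.
let classifier = try MLImageClassifier(trainingData: .labeledDirectories(at: trainingURL))

// Export a .mlmodel file that can be dropped into an iOS app target.
try classifier.write(to: URL(fileURLWithPath: "/Users/me/ProductClassifier.mlmodel"))
```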

 

At Float, we are using CoreML to build iOS applications that can identify products in a company’s catalogue through the device camera. CoreML is the best choice for ML on iOS because Apple has made significant investments to ensure it takes full advantage of the Graphics Processing Unit (GPU) and returns results as quickly as possible.
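On the app side, running a classifier against a camera frame follows a standard CoreML + Vision pattern. This sketch assumes a hypothetical model named `ProductClassifier` (the auto-generated class Xcode produces when a .mlmodel file is added to the project); it is not Float's actual implementation:

```swift
// Sketch: classify a camera frame with CoreML via the Vision framework.
import Vision
import CoreML
import CoreGraphics

func classify(_ image: CGImage) throws {
    // Wrap the (hypothetical) CoreML model for use with Vision.
    let model = try VNCoreMLModel(for: ProductClassifier().model)

    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("Best match: \(top.identifier) (confidence \(top.confidence))")
    }

    // Vision handles scaling/cropping and runs the model on the GPU where possible.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
}
```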

 

GPUs are especially useful for machine learning because they can perform a large number of operations quickly and in parallel. One model that Float has developed involves over three trillion mathematical calculations. CoreML is capable of performing these operations in about a tenth of a second.

 

In loose terms, Float is adding a new feature to an existing app that will allow users to take inventory visually: the device camera looks up a product, instead of requiring users to visually inspect a Stock Keeping Unit (SKU) and manually enter it with the device keyboard. This lets users take inventory in much less time, with far fewer data-entry errors.

 

Other things we were happy to see at the event included demos of third-party apps that showed off some amazing capabilities built on iOS 12. Talking about the technology is fun of course, but seeing it in action is even better. Here are some of our favorite examples shared on stage at WWDC 2018:

 

Augmented Reality (Continued):

ARKit 2’s impressive localization and spatial awareness will allow you to place furniture in your home to see how it looks and whether it fits in your space. It will also allow you to navigate places without ever needing to glance at a map. IKEA is taking advantage of this in its app, IKEA Place, which allows you to try out furniture without ever going to the store.
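The core pattern behind furniture-placement apps like IKEA Place is straightforward: hit-test a tap against a detected plane and drop a 3D model there. A minimal sketch, with `chair.scn` as a placeholder asset name:

```swift
// Sketch: place a virtual object on a detected plane at a tapped screen point.
import ARKit
import SceneKit

func placeObject(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    guard let hit = sceneView.hitTest(screenPoint, types: .existingPlaneUsingExtent).first,
          let scene = SCNScene(named: "chair.scn"),            // placeholder model
          let node = scene.rootNode.childNodes.first else { return }
    let t = hit.worldTransform.columns.3
    node.position = SCNVector3(t.x, t.y, t.z)                  // drop onto the plane
    sceneView.scene.rootNode.addChildNode(node)
}
```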

In addition, American Airlines developed an AR prototype that overlays real-time information on your surroundings at airport terminals, letting you easily navigate to your gate, a coffee shop, or the closest restroom. Float has worked on real-time wayfinding and navigation apps in the past, and we can’t wait to try out any tools that make experiences like this easier to create.

 

As you have learned in this post, Apple’s keynote revealed a lot of exciting things coming in iOS 12! From faster phones, to giant Group FaceTime calls, to AR and on-device image classification, Float is in awe.

 

What are your thoughts? Which updates are you most excited about? Let us know in the comments below.
