Google I/O is probably the most famous developer conference in the world. This year's news revolved around ARCore, Android, Flutter, Firebase, Machine Learning and much more. Our developers Martin and Raimund flew to California to witness it all and summarized their highlights.
AR News: Navigation and ARCore
Visitors were able to use AR-based navigation on their smartphones to find their way around the large Google I/O site. After scanning markers installed around the venue, the camera image showed overlays for the different stages and tents, together with the distance to each.
A new Google Maps feature also allows AR navigation in the city: arrows and other cues are overlaid on the camera image to show the way. The user's position is determined by comparing the camera image with Street View footage, which is amazingly accurate. Google Local Guides can already use AR navigation; all other users will get access once the open test phase is completed.
In the AR Sandbox, we were also able to test some apps that already use the ARCore version announced for this summer. The applications ran smoothly and robustly. For example, we were able to look inside a coffee machine via AR, with animations illustrating how the grinder and the water heater work. The demos clearly showed how well the new ARCore version estimates the direction of light, making more realistic light and shadow effects possible.
Android inspired by iOS
We got the impression that the Android team continues to draw a lot of inspiration from iOS, and more recently also from Flutter, to make developing Android apps even more pleasant. Layout constraints for designing views were already adopted in the past. There is now also a navigation graph, which resembles the storyboards in Xcode's Interface Builder. The navigation graph has the decisive advantage, however, that it strictly separates navigation from layout, preventing a monolithic storyboard and the merge conflicts that come with it. Furthermore, Android Studio can now show an exploded view of the view hierarchy for debugging purposes.
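To give an idea of this separation, here is a minimal sketch of what such a navigation graph resource might look like. The fragment classes, IDs, and package name are hypothetical examples, not taken from a real app:

```xml
<!-- res/navigation/nav_graph.xml: navigation lives here, not in the layouts -->
<navigation xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:id="@+id/nav_graph"
    app:startDestination="@id/homeFragment">

    <fragment
        android:id="@+id/homeFragment"
        android:name="com.example.HomeFragment">
        <!-- An action declares a navigation path to another destination -->
        <action
            android:id="@+id/action_home_to_detail"
            app:destination="@id/detailFragment" />
    </fragment>

    <fragment
        android:id="@+id/detailFragment"
        android:name="com.example.DetailFragment" />
</navigation>
```

Because each screen's layout stays in its own file and only the graph describes the flow between them, several developers can work on different screens without touching one shared, monolithic file.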
More news: Portals & Web Perception Toolkit
Portals are an interesting mixture of iframes and links. Like an iframe, they embed a fully rendered page, but the embedded page initially serves only as a preview that the user can then navigate into. This makes it possible, for example, to create seamless or animated page transitions.
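As a rough sketch, assuming the experimental Portals API available behind a flag in Chrome at the time (the URL is a placeholder):

```javascript
// Embed a fully rendered preview of another page, like an iframe.
function embedPortal(src) {
  const portal = document.createElement('portal');
  portal.src = src;
  // On activation, the previewed page seamlessly becomes the top-level
  // page -- this is what enables animated navigations.
  portal.addEventListener('click', () => portal.activate());
  document.body.appendChild(portal);
  return portal;
}

embedPortal('https://example.com/next-page');
```

Since the API was experimental, feature detection (e.g. checking `'HTMLPortalElement' in window`) and a plain-link fallback would be advisable in practice.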
The Web Perception Toolkit is a feature in development that lets you extend websites with computer vision. With it, barcodes, QR codes, and other images can be recognized directly on a website – and in the future, the results of ML-supported object recognition as well.
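One of the building blocks involved here is the browser's Shape Detection API. As a hedged sketch (browser support was still experimental, and `detectCodes` is our own hypothetical helper name):

```javascript
// Detect barcodes/QR codes in an image element using the experimental
// BarcodeDetector from the Shape Detection API. Falls back to an empty
// result in environments without the API.
async function detectCodes(imageElement) {
  if (typeof BarcodeDetector === 'undefined') {
    // Graceful fallback for browsers without the Shape Detection API.
    return [];
  }
  const detector = new BarcodeDetector({ formats: ['qr_code', 'ean_13'] });
  const barcodes = await detector.detect(imageElement);
  // Each detection carries the decoded value and a bounding box.
  return barcodes.map((code) => code.rawValue);
}
```

A page could feed this a `<video>` frame or an uploaded image and react to the decoded values, which is essentially what the toolkit automates.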