
Google’s ARCore brings augmented reality to millions of Android devices


Google is taking a second swing at augmented reality with a new SDK called "ARCore." The SDK is available for download today, alongside a Google blog post and a set of ARCore demos. After experimenting with Project Tango, an AR initiative launched in 2014 that loaded a smartphone up with custom sensors, Google's AR reboot brings most of that functionality to regular old Android phones through the magic of software. If you're drawing mental comparisons to Apple's ARKit, you're on the right track.
 
We're not just working off the blog post here, as I was lucky enough to have this project explained to me by some of the Googlers in charge of it. Let's start with the basics.
 
Google is again doing a bit of internal product competition with Tango and ARCore. Google is best thought of as a group of individual product divisions rather than a unified company, and it's the teams behind these products that tell the biggest story about where these two projects are headed. Project Tango was cooked up by Google's ATAP group, a small skunkworks group that doesn't have a ton of resources. ARCore is a collaboration between the Android Team and the VR group, two groups with a ton of resources.
 
Of course, the Android Team's track record speaks for itself, and it has a ton of OEM contacts and expertise shipping a product to hundreds of millions of people. The VR group is just getting started, but it's easily the most exciting new team inside Google. It has been snatching up high-profile Googlers over the past two years and has already shipped the VR creation apps Tilt Brush and Blocks, plus the Daydream VR smartphone platform. The VR team even has a standalone VR headset hardware platform due out by the end of the year.
 
While Tango answered the question of "can we do AR on a smartphone?" ARCore is about bringing some of that functionality to as many devices as possible. Tango was limited to a mere two phones with special hardware, but, with ARCore, Google says it is eventually targeting "hundreds of millions" of Android handsets using nothing other than the existing camera and motion sensors.
 
Daydream offers a good framework for thinking about ARCore device support: just as VR is a high-end feature that requires individual "Daydream ready" device certification for support, AR will also be limited to certain "certified" device models. Google will be working with OEMs on ARCore support to ensure a consistent experience across devices. So just as Daydream has a laundry list of hardware requirements in the Android Compatibility Definition Document (CDD), we'd imagine there will eventually be a similar list in the CDD for ARCore.
 
 
While we don't yet know the hardware requirements, there's no reason for ARCore's requirements to be as stringent as Daydream VR's. Because VR is strapped to your face, any frame drop or stutter can make the user feel physically ill. To combat "VR sickness," Daydream has a load of requirements designed to minimize "motion-to-photon" latency, like banning all LCD-equipped phones from the program, because LCDs are just too slow. For ARCore, because you're just looking at a phone in your hand, frame drops won't make anyone sick, and many of these tough VR requirements shouldn't apply.
 
For this SDK launch, there are just two supported ARCore devices: the Google Pixel and Samsung Galaxy S8. Going forward, Google's blog post says it's "working with manufacturers like Samsung, LG, Huawei, ASUS, and others." When ARCore hits version 1.0 (I'm told that will be "this winter"), more than 100 million users should have access to ARCore apps. Google isn't just aiming ARCore at Android devices; it's also releasing "prototype browsers for Web developers" that support the SDK, and the company says "these custom browsers allow developers to create AR-enhanced websites and run them on both Android/ARCore and iOS/ARKit."
 
Google's blog post runs down the SDK's capabilities:
 
ARCore works with Java/OpenGL, Unity, and Unreal and focuses on three things:
 
Motion tracking: Using the phone’s camera to observe feature points in the room and IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves. Virtual objects remain accurately placed.
 
Environmental understanding: It is common for AR objects to be placed on a floor or a table. ARCore can detect horizontal surfaces using the same feature points it uses for motion tracking.
 
Light estimation: ARCore observes the ambient light in the environment and makes it possible for developers to light virtual objects in ways that match their surroundings, making their appearance even more realistic.
 
Motion tracking and environmental understanding were both things Tango could do: you'll basically be able to slap an item on a surface and walk around it, using the device as a virtual camera. The built-in "light estimation" capability is new, though. Out of the box, ARCore will try to apply real-life lighting to the virtual object, which should help virtual objects blend in more.
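To give a sense of how those three capabilities surface to developers, here is a minimal sketch against ARCore's Java API. Treat it as illustrative rather than authoritative: the class names (Session, Frame, Plane, Anchor, LightEstimate) come from the shipping Java SDK, but exact signatures in this preview release may differ.

import com.google.ar.core.Anchor;
import com.google.ar.core.Frame;
import com.google.ar.core.HitResult;
import com.google.ar.core.LightEstimate;
import com.google.ar.core.Plane;
import com.google.ar.core.Session;
import com.google.ar.core.Trackable;
import com.google.ar.core.TrackingState;

// Called once per rendered frame, e.g. from a GLSurfaceView renderer.
void onFrame(Session session, float tapX, float tapY) throws Exception {
    // Motion tracking: update() advances the camera pose, computed from
    // camera feature points fused with IMU data.
    Frame frame = session.update();
    if (frame.getCamera().getTrackingState() != TrackingState.TRACKING) {
        return; // the pose isn't reliable yet
    }

    // Environmental understanding: hit-test a screen tap against the
    // horizontal planes ARCore has detected, and pin an anchor there.
    for (HitResult hit : frame.hitTest(tapX, tapY)) {
        Trackable trackable = hit.getTrackable();
        if (trackable instanceof Plane
                && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
            Anchor anchor = hit.createAnchor();
            // Draw the virtual object at anchor.getPose(); ARCore keeps
            // the anchor fixed in the world as the phone moves.
            break;
        }
    }

    // Light estimation: match the virtual object's shading to the
    // ambient light ARCore measures from the camera image.
    LightEstimate light = frame.getLightEstimate();
    if (light.getState() == LightEstimate.State.VALID) {
        float ambient = light.getPixelIntensity(); // roughly 0.0 to 1.0
        // Feed "ambient" into the renderer's lighting term.
    }
}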
 
The key difference between ARCore and a sensor-loaded Tango device is that you're not getting depth sensing. Tango could actually see the world in 3D by sending out a blast of IR light and measuring the return time using a time-of-flight camera. Depth sensing in Tango led to my favorite Tango app: Matterport Scenes, which was basically a 3D scanner app. You could wave a Tango phone around a room or object and quickly get a (low-resolution) color 3D model of it. While that was really amazing, the vast majority of Tango apps just wanted to use the "place stuff on a flat surface" feature, which doesn't need the depth sensor and will work just fine on ARCore.
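The time-of-flight arithmetic itself is simple, which shows just how fast these sensors have to be: light covers a meter in about 3.3 nanoseconds, and the measured round trip gets halved to give a one-way depth. A toy illustration of the underlying math, not Tango's actual pipeline:

// Toy time-of-flight depth calculation: the one-way distance is half the
// round trip of an IR pulse traveling at the speed of light.
static double depthMeters(double roundTripSeconds) {
    final double SPEED_OF_LIGHT = 299_792_458.0; // meters per second
    return SPEED_OF_LIGHT * roundTripSeconds / 2.0;
}
// Example: a 20-nanosecond round trip works out to about 3 meters:
// depthMeters(20e-9) ≈ 2.998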
 
ARCore will also have the benefit of what's essentially a "multiplayer mode," where data can be synced across devices thanks to the VR group's "VPS" (Visual Positioning System). This works by narrowing your location down with GPS and then having the phone's AR capabilities recognize where you are in a room, a kind of "indoor GPS." ARCore users in the same room can then share items in an environment, and one person's object manipulations will be visible on the other person's device.
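Google hasn't published a VPS API yet, but the core idea of a shared AR session can be sketched: once both phones have localized the same physical reference point (the job VPS would handle), an object only needs its pose expressed relative to that reference to be meaningful on both devices. A hypothetical sketch using ARCore's Pose math, with the networking left to the app:

import com.google.ar.core.Pose;

// Hypothetical sketch of cross-device sharing; not a published Google API.
// Both devices are assumed to have anchored the same real-world reference.

// Device A: express the object's world pose relative to the shared reference.
static Pose toShared(Pose referenceInWorldA, Pose objectInWorldA) {
    return referenceInWorldA.inverse().compose(objectInWorldA);
}

// Device B: compose the received relative pose onto its own copy of the
// reference to recover the object's position in its world coordinates.
static Pose fromShared(Pose referenceInWorldB, Pose objectInReference) {
    return referenceInWorldB.compose(objectInReference);
}

// In between, the app only needs to transmit seven floats per object:
// three for translation and four for the rotation quaternion.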



