Posted by Anuj Gosalia
A little over a year ago, we introduced ARCore: a platform for building augmented reality (AR) experiences. Developers have been using it to create thousands of ARCore apps that help people with everything from fixing their dishwashers, to shopping for sunglasses, to mapping the night sky. Since last I/O, we've quadrupled the number of ARCore-enabled devices to an estimated 400 million.
Today at I/O, we introduced updates to Augmented Images and Light Estimation, features that let you build more interactive and realistic experiences. And to make it easier for people to experience AR, we introduced Scene Viewer, a new tool that lets users view 3D objects in AR right from your website.
To make experiences appear realistic, we need to account for the fact that things in the real world don’t always stay still. That’s why we’re updating Augmented Images — our API that lets people point their camera at 2D images, like posters or packaging, to bring them to life. The updates enable you to track moving images and multiple images simultaneously. This unlocks the ability to create dynamic and interactive experiences like animated playing cards where multiple images move at the same time.
An example of how the Augmented Images API can be used with moving targets by JD.com
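For a sense of what this looks like in code, here is a minimal sketch of the pattern, assuming you already have a running Session, a Frame from the current update, and a Bitmap for each target (the "playing_card" name and cardBitmap variable are hypothetical):

// Build a database of target images and enable it in the session config.
AugmentedImageDatabase imageDatabase = new AugmentedImageDatabase(session);
imageDatabase.addImage("playing_card", cardBitmap);  // hypothetical name and bitmap

Config config = new Config(session);
config.setAugmentedImageDatabase(imageDatabase);
session.configure(config);

// Each frame, query every image ARCore updated -- including ones in motion.
for (AugmentedImage image : frame.getUpdatedTrackables(AugmentedImage.class)) {
  if (image.getTrackingState() == TrackingState.TRACKING) {
    Pose center = image.getCenterPose();  // follows the image as it moves
    // Position or update your content at 'center' here.
  }
}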
Last year, we introduced the concept of light estimation, which provides a single ambient light intensity to extend real world lighting into a digital scene. In order to provide even more realistic lighting, we’ve added a new mode, Environmental HDR, to our Light Estimation API.
Before and after Environmental HDR is applied to the digital mannequin on the left, featuring 3D printed designs from Julia Koerner
Environmental HDR uses machine learning with a single camera frame to understand high dynamic range illumination in 360°. It takes in available light data, and extends the light into a scene with accurate shadows, highlights, reflections and more. When Environmental HDR is activated, digital objects are lit just like physical objects, so the two blend seamlessly, even when light sources are moving.
Digital mannequin on left and physical mannequin on right
Environmental HDR provides developers with three APIs to replicate real world lighting:

- Main Directional Light: helps with placing shadows in the right direction
- Ambient Spherical Harmonics: helps model ambient illumination from all directions
- HDR Cubemap: provides specular highlights and reflections
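As a rough sketch of how these come together (the mode and getter names below come from ARCore's Config and LightEstimate classes; wiring the values into your renderer is up to you):

// Enable Environmental HDR light estimation on an existing session.
Config config = new Config(session);
config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
session.configure(config);

// Each frame, read the three lighting components.
LightEstimate estimate = frame.getLightEstimate();
if (estimate.getState() == LightEstimate.State.VALID) {
  float[] direction = estimate.getEnvironmentalHdrMainLightDirection();
  float[] intensity = estimate.getEnvironmentalHdrMainLightIntensity();
  float[] harmonics = estimate.getEnvironmentalHdrAmbientSphericalHarmonics();
  Image[] cubeMap = estimate.acquireEnvironmentalHdrCubeMap();  // for reflections
  // Feed these into your lighting model, then release the cube map images.
  for (Image face : cubeMap) {
    face.close();
  }
}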
We want to make it easier for people to jump into AR. That's why today we're introducing Scene Viewer, which lets AR experiences launch right from your website, without users having to download a separate app.
To make your assets accessible via Scene Viewer, first add a glTF 3D asset to your website with the <model-viewer> web component, and then add the “ar” attribute to the <model-viewer> markup. Later this year, experiences in Scene Viewer will begin to surface in your Search results.
<script type="module" src="https://unpkg.com/@google/model-viewer/dist/model-viewer.js"></script>
<script nomodule src="https://unpkg.com/@google/model-viewer/dist/model-viewer-legacy.js"></script>

<model-viewer ar
    src="examples/assets/YOURMODEL.gltf"
    auto-rotate
    camera-controls
    alt="TEXT ABOUT YOUR MODEL"
    background-color="#455A64">
</model-viewer>
NASA.gov enables users to view the Curiosity Rover in their space
These are a few ways that improving real world understanding in ARCore can make AR experiences more interactive, realistic, and easier to access. Look for these features to roll out over the next two releases. To learn more and get started, check out the ARCore developer website.
Posted by Evan Hardesty Parker, Software Engineer
ARCore and Sceneform give developers simple yet powerful tools for creating augmented reality (AR) experiences. In our last update (version 1.6) we focused on making virtual objects appear more realistic within a scene. In version 1.7, we're focusing on creative elements like AR selfies and animation as well as helping you improve the core user experience in your apps.
Example of 3D face mesh application
ARCore's new Augmented Faces API (available on the front-facing camera) offers a high quality, 468-point 3D mesh that lets users attach fun effects to their faces. From animated masks, glasses, and virtual hats to skin retouching, the mesh provides coordinates and region specific anchors that make it possible to add these delightful effects.
You can get started in Unity or Sceneform by creating an ARCore session with the "front-facing camera" and Augmented Faces "mesh" mode enabled. Note that other AR features such as plane detection aren't currently available when using the front-facing camera. AugmentedFace extends Trackable, so faces are detected and updated just like planes, Augmented Images, and other trackables.
// Create an ARCore session that supports Augmented Faces for use in Sceneform.
public Session createAugmentedFacesSession(Activity activity) throws UnavailableException {
  // Use the front-facing (selfie) camera.
  Session session = new Session(activity, EnumSet.of(Session.Feature.FRONT_CAMERA));
  // Enable Augmented Faces.
  Config config = session.getConfig();
  config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
  session.configure(config);
  return session;
}
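Once the session is configured, faces show up as trackables each frame. Here is a hedged sketch of reading the mesh and a region pose (method and region names from the AugmentedFace class; the rendering step is yours):

// Iterate detected faces and read mesh data plus a region anchor pose.
for (AugmentedFace face : session.getAllTrackables(AugmentedFace.class)) {
  if (face.getTrackingState() == TrackingState.TRACKING) {
    FloatBuffer vertices = face.getMeshVertices();  // 468 x/y/z vertices
    Pose noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP);
    // Attach glasses, hats, or other effects to region poses like 'noseTip'.
  }
}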
Another way version 1.7 expands the AR creative canvas is by letting your objects dance, jump, spin and move around with support for animations in Sceneform. To start an animation, initialize a ModelAnimator (an extension of the existing Android animation support) with animation data from your ModelRenderable.
void startDancing(ModelRenderable andyRenderable) {
  AnimationData data = andyRenderable.getAnimationData("andy_dancing");
  animator = new ModelAnimator(data, andyRenderable);
  animator.start();
}
In ARCore version 1.7 we also focused on helping you improve your user experience with a simplified workflow. We've integrated "ARCore Elements" -- a set of common AR UI components that have been validated with user testing -- into the ARCore SDK for Unity. You can use ARCore Elements to insert AR interactive patterns in your apps without having to reinvent the wheel. ARCore Elements also makes it easier to follow Google's recommended AR UX guidelines.
ARCore Elements includes two AR UI components that are especially useful:

- Plane Finding, which streamlines the key steps involved in detecting a surface
- Object Manipulation, which uses intuitive gestures to rotate, elevate, move, and resize objects
We plan to add more to ARCore Elements over time. You can download the ARCore Elements app available in the Google Play Store to learn more.
ARCore version 1.7 also includes UX enhancements for the smartphone camera -- specifically, the experience of switching in and out of AR mode. Shared Camera access in the ARCore SDK for Java lets users pause an AR experience, access the camera, and jump back in. This can be particularly helpful if users want to take a picture of the action in your app.
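At a high level, the flow looks like the sketch below, following the pattern in the Java sample (the app-side Camera2 state callback and background handler are assumed to exist in your code):

// Create a session that shares the camera with the app's own Camera2 code.
Session session = new Session(this, EnumSet.of(Session.Feature.SHARED_CAMERA));
SharedCamera sharedCamera = session.getSharedCamera();
String cameraId = session.getCameraConfig().getCameraId();

// Wrap the app's state callback so ARCore stays in sync when you open the camera.
CameraDevice.StateCallback wrappedCallback =
    sharedCamera.createARDeviceStateCallback(appStateCallback, backgroundHandler);
CameraManager cameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
cameraManager.openCamera(cameraId, wrappedCallback, backgroundHandler);  // throws CameraAccessException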
More details are available in the Shared Camera developer documentation and Java sample.
For AR experiences to capture users' imaginations they need to be both immersive and easily accessible. With tools for adding AR selfies, animation, and UI enhancements, ARCore version 1.7 can help with both these objectives.
You can learn more about these new updates on our ARCore developer website.
Posted by Ashish Shah, Product Manager, Google AR & VR
The magic of augmented reality is in the way it blends the digital and the physical worlds. For AR experiences to feel truly immersive, digital objects need to look realistic -- as if they were actually there with you, in your space. This is something we continue to prioritize as we update ARCore and Sceneform, our 3D rendering library for Java developers.
Today, with the release of ARCore 1.6, we're bringing further improvements to help you build more realistic and compelling experiences, including better plane boundary tracking and several lighting improvements in Sceneform.
With 250M devices now supporting ARCore, developers can bring these experiences to an even larger and growing user base.
Previous versions of Sceneform defaulted to rendering ambient light with a yellow tint. Version 1.6 defaults to neutral white. This aligns more closely with the way light appears in the real world, making digital objects look more natural. You can see the differences below.
This change will also make objects rendered with Sceneform look as if they're affected more naturally by color and lighting in the surrounding environment. For example, if you're viewing an AR object at sunset, it would appear to be illuminated by the red and orange hues, just like real objects in the scene.
In addition, we've updated Sceneform's built-in environmental image to provide a more neutral scene for your app. This will be most noticeable when viewing reflections in smooth metallic surfaces.
To help you further improve quality and engagement in your AR apps, we're adding screen capture and recording to Sceneform. This is something a number of developers have requested to help with demo recording and prototyping. It can also be used as an external facing feature, allowing your users to share screenshots and videos on social media more easily, which can help get the word out about your app.
You can access this functionality through the surface mirroring API for the SceneView class. The API allows you to display the Sceneform view on a device's screen at the same time it's being rendered to another surface (such as the input surface for the Android MediaRecorder).
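For example, a sketch of recording the scene with Android's MediaRecorder might look like the following (startMirroringToSurface and stopMirroringToSurface are the SceneView calls; the recorder settings and videoFile variable are illustrative):

// Configure a MediaRecorder whose input surface will receive the mirrored scene.
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setOutputFile(videoFile.getAbsolutePath());  // hypothetical output file
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoSize(1280, 720);
recorder.prepare();
recorder.start();

// Mirror the SceneView into the recorder while it keeps rendering on screen.
sceneView.startMirroringToSurface(recorder.getSurface(), 0, 0, 1280, 720);

// ...later, when recording is done:
sceneView.stopMirroringToSurface(recorder.getSurface());
recorder.stop();
recorder.release();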
The new updates to Sceneform and ARCore are available today. These new versions also bring support for new devices, such as the Samsung Galaxy A3 and the Huawei P20 Lite, which will join the list of ARCore-enabled devices. More information is available on the ARCore developer website.
Posted by Clayton Wilkinson, Developer Platforms Engineer
Today, we're releasing updates to ARCore, Google's platform for building augmented reality experiences, and to Sceneform, the 3D rendering library for building AR applications on Android. These updates include algorithm improvements that will let your apps consume less memory and CPU during longer sessions. They also include new functionality that gives you more flexibility over content management.
Here's what we added:
Sceneform now includes an API that enables apps to load glTF models at runtime. You'll no longer need to convert glTF files to SFB format before rendering. This will be particularly useful for apps that have a large number of glTF models (like shopping experiences).
To take advantage of this new function -- and load models from the cloud or local storage at runtime -- use RenderableSource as the source when building a ModelRenderable.
private static final String GLTF_ASSET =
    "https://github.com/KhronosGroup/glTF-Sample-Models/raw/master/2.0/Duck/glTF/Duck.gltf";

// When you build a Renderable, Sceneform loads its resources in the background while returning
// a CompletableFuture. Call thenAccept(), handle(), or check isDone() before calling get().
ModelRenderable.builder()
    .setSource(this, RenderableSource.builder().setSource(
        this, Uri.parse(GLTF_ASSET), RenderableSource.SourceType.GLTF2).build())
    .setRegistryId(GLTF_ASSET)
    .build()
    .thenAccept(renderable -> duckRenderable = renderable)
    .exceptionally(throwable -> {
      Toast toast = Toast.makeText(this, "Unable to load renderable", Toast.LENGTH_LONG);
      toast.setGravity(Gravity.CENTER, 0, 0);
      toast.show();
      return null;
    });
Sceneform has a UX library of common elements like plane detection and object transformation. Instead of recreating these elements from scratch every time you build an app, you can save precious development time by taking them from the library. But what if you need to tailor these elements to your specific app needs? Today we're publishing the source code of the UX library so you can customize whichever elements you need.
An example of interactive object transformation, powered by an element in the Sceneform UX Library.
Several developers have told us that when it comes to point clouds, they'd like to be able to associate points between frames. Why? Because when a point is present in multiple frames, it is more likely to be part of a solid, stable structure rather than an object in motion.
To make this possible, we're adding an API to ARCore that will assign IDs to each individual dot in a point cloud.
These new point IDs have two important properties: each ID is unique, and a point keeps its ID for as long as ARCore tracks it, so the same ID appearing in two frames refers to the same physical point.
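A minimal sketch of reading them, assuming a Frame from the current update (getIds is the new PointCloud accessor):

// Acquire the point cloud and walk its points alongside their stable IDs.
PointCloud pointCloud = frame.acquirePointCloud();
FloatBuffer points = pointCloud.getPoints();  // x, y, z, confidence per point
IntBuffer ids = pointCloud.getIds();          // one ID per point
for (int i = 0; i < ids.limit(); i++) {
  int id = ids.get(i);
  float confidence = points.get(i * 4 + 3);
  // IDs that recur across frames likely belong to solid, stable structure.
}
pointCloud.release();  // give the buffer back to ARCore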
Last but not least, we continue to add ARCore support to more devices so your AR experiences can reach more users across more surfaces. These include smartphones as well as -- for the first time -- a Chrome OS device, the Acer Chromebook Tab 10.
You can get the latest information about ARCore and Sceneform on https://developers.google.com/ar/develop
Ready to try out the samples, or have an issue to report? Visit our projects hosted on GitHub.
Posted by Karin Levi, Product Marketing, ARCore
A few weeks ago at Google I/O we released a major update to ARCore, Google's AR development platform. We added new APIs like Cloud Anchors, which enable multi-user, collaborative AR experiences, and Augmented Images, which enables you to bring 2D images to life with 3D content. All of these updates are going to change the way we use AR today and enable developers to create richer, more immersive AR apps.
With these new capabilities, we decided to put our platform to the test. So we built real experiences to showcase how they all come to life. All demos were presented at the I/O AR & VR sandbox area, and we open sourced them so you can see how simple it is to build these experiences. We're pretty happy with how they turned out and would love to share some learnings and insights from behind the scenes.
Light Board is an AR multiplayer tabletop game where two players on floating game boards launch colored projectiles at each other.
While building Light Board, it was important for us to keep in mind who the end users are. We wanted it to be a simple, fun game that developers could try out while visiting the I/O sandbox. Attendees would only have a couple of minutes to play while passing through, so the game needed to let players (even non-gamers) pick it up with very little setup.
The artwork for Light Board was a major focus. Our mission for the look of the game was to align with the design and decor of I/O 2018. This way, our app would feel like an extension of everything the attendees saw around them. As a result, our design philosophy had three goals: bright accent colors, simple graphic shapes, and natural physical materials.
Left: Design for AR/VR Sandbox at I/O 2018. Right: Key art for Light Board game boards
The artwork was created in Maya and Cinema 4D. We created physically based materials for our models using Substance Painter. Just as continuous iteration is crucial for engineering, it is also important when creating art assets. With that in mind, we kept careful track of our content pipeline, even for this relatively simple project. This allowed us to quickly try out different looks and board styles before settling on our final design.
On the engineering front we selected the Unity game engine as our dev environment. Unity gives us a couple of important advantages. First, it is easy to get great looking 3D graphics up and running right away. Second, the engine component is already complete, so we could immediately start iterating on gameplay code. As with the artwork, this allowed us to test gameplay options before we made a final decision. Additionally, Unity gave us support for both Android and iOS with only a little extra work.
To handle the multiplayer aspect we used Firebase Realtime Database. We were concerned with network performance at the event, and felt that the persistent nature of a database would make it more tolerant of poor networks. As it turned out, it worked very well and we got the ability to quit and rejoin games for free!
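The demo itself was written in Unity, but the underlying pattern is easy to see in Android Java. This is a hypothetical sketch of persisting shared game state; the database path and field names are illustrative, not from the Light Board source:

// A database reference holding the shared state for one game session.
DatabaseReference gameRef = FirebaseDatabase.getInstance()
    .getReference("games/" + gameId);

// Writes survive flaky networks; Firebase retries them in the background.
gameRef.child("players/" + playerId + "/score").setValue(score);

// On rejoin, the listener replays the current state before live updates arrive.
gameRef.addValueEventListener(new ValueEventListener() {
  @Override
  public void onDataChange(DataSnapshot snapshot) {
    // Rebuild the game board from 'snapshot' here.
  }

  @Override
  public void onCancelled(DatabaseError error) { }
});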
We had a lot of fun building Light Board and we hope people can use it as an example of how easy it can be to not only build AR apps, but to use really cool features like Cloud Anchors. Please check out our open source repo and give Light Board a try!
In March, we released Just a Line, an Android app that lets you draw in the air with your phone. It's a simple experiment meant to showcase the power of ARCore. At Google I/O, we added Cloud Anchors to the app so that two people can draw at once in the same space, even if one of them is using Android and the other iOS.
Both apps were built natively: the Android version was written in Android Studio, and the iOS version was built in Xcode. ARCore's Cloud Anchors enable Just a Line to pair two phones, allowing users to draw simultaneously in a shared space. Pairing works across Android and iOS devices, and drawings are synchronized live through a Firebase Realtime Database. You can find the open-source code for iOS here and for Android here.
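Under the hood, the Cloud Anchors handshake is small. Here is a rough sketch using the ARCore Java API (the anchor ID travels over your own channel, such as the Firebase database mentioned above; polling and error handling are omitted):

// Enable Cloud Anchors on the session.
Config config = new Config(session);
config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED);
session.configure(config);

// Host: upload a local anchor, then share its cloud anchor ID once hosting succeeds.
Anchor cloudAnchor = session.hostCloudAnchor(localAnchor);
// ...poll until cloudAnchor.getCloudAnchorState() == Anchor.CloudAnchorState.SUCCESS...
String anchorId = cloudAnchor.getCloudAnchorId();

// Resolve: the second device (Android or iOS) recreates the anchor from the shared ID.
Anchor resolved = session.resolveCloudAnchor(anchorId);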
"Illusive Images" demo is an augmented gallery consisting of 3 artworks, each exploring a different augmented image use case and user experience. As one walks from side to side, around the object, or gazes in a specific direction, 2D artworks are married with 3D, inviting the viewer to enter into the space of the artwork spanning well beyond the physical frame.
Because our augmented images were driven by visual design, we experimented a lot with creating databases whose images had varying numbers of trackable features. To get the best results, we iterated quickly by resizing the canvas for the artwork and by adjusting and stretching the brightness and contrast levels. These variations helped us achieve an optimal image without compromising design intent.
The app was built in Unity with ARCore, with the majority of assets created in Cinema 4D. Mograph animations were imported into Unity as FBX files and driven entirely by the position of the user in relation to the artwork. An example project can be found here.
To make your development experience easier, we open sourced all the demos our team built. We hope you find this useful! You can also visit our website to learn more and start building AR experiences today.