In our blog post about the Unity Unite event in Berlin we gave a short overview of our experience at the event. The keynote itself deserves a post of its own, so in this blog post we will cover all the topics presented during the keynote.
Project MARS (Mixed and Augmented Reality Studio) is a set of tools that helps developers create robust AR/MR apps that live in and react to the real world. It will become available in the 2018.3 release.
Some improvements in Project MARS are:
- AR face masks that you can extend with face tracking technology;
- a Simulation View to try out content in the real world, straight from the editor;
- templates and simulated rooms to try out content;
- an AR Object script that allows objects to be automatically placed in the real world based on given requirements.
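The last point, placing objects based on requirements, can be sketched in plain Python. This is only an illustration of the idea, not the MARS API: the app declares constraints (e.g. a horizontal surface of a minimum size), and the system matches them against surfaces detected in the room.

```python
# Illustrative sketch (plain Python, not the MARS API): match declared
# placement requirements against surfaces detected in the real world.
from dataclasses import dataclass

@dataclass
class Surface:
    width: float          # metres
    depth: float          # metres
    is_horizontal: bool

@dataclass
class Requirements:
    min_width: float
    min_depth: float
    horizontal: bool = True

def find_placement(surfaces, req):
    """Return the first detected surface that satisfies all requirements."""
    for s in surfaces:
        if (s.is_horizontal == req.horizontal
                and s.width >= req.min_width
                and s.depth >= req.min_depth):
            return s
    return None  # nothing yet; keep scanning as more of the room is mapped

# A small table and a large table detected in the room:
tables = [Surface(0.4, 0.4, True), Surface(1.2, 0.8, True)]
spot = find_placement(tables, Requirements(min_width=1.0, min_depth=0.6))
# Only the second table is big enough for this piece of content.
```

The same content can then be dropped into any room whose surfaces satisfy the requirements, which is what makes such apps robust in unknown environments.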
For the demo, the 3D Toolkit was used: a collection of mechanics, tools, systems and assets for hooking up gameplay without writing any code, available on the Asset Store. Two tables were set up in the real world, onto which the level was projected. A bridge then formed between the platforms and the AI moved across it.
AR Facial Performance for Animation
XR can also be used for animating characters in movies. In this demo, the face of a girl was animated using a real person's face, captured with an iPhone X. The animations can be recorded and used with the Cinemachine component.
Setting up animation trees can be time-consuming, so why not use machine learning to set up animations? You tell the machine what to achieve, not how to achieve it, and you no longer have to worry about transition points. Kinematica is an AI animation system that uses machine learning to animate characters. It is expected in the 2018.3 release.
To demonstrate it, Unity hired a stuntman who performed all the needed animations in a motion capture studio. Kinematica keeps these animations in a single library and decides in real time how to combine fragments into a sequence that matches the controller input, the environment context and the gameplay requests.
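The core idea behind such systems, often called motion matching, can be sketched in a few lines of plain Python. This is not the Kinematica API, just an illustration: every frame, each fragment in the library is scored against the character's current pose and the direction the player wants to go, and the cheapest fragment wins.

```python
# Illustrative motion-matching sketch (plain Python, not the Kinematica API):
# pick the animation fragment whose pose and trajectory best match the
# current state and the controller's desired direction.

def cost(fragment, current_pose, desired_dir, w_pose=1.0, w_traj=1.0):
    """Weighted squared distance between a fragment and the current state."""
    pose_cost = sum((a - b) ** 2 for a, b in zip(fragment["pose"], current_pose))
    traj_cost = sum((a - b) ** 2 for a, b in zip(fragment["dir"], desired_dir))
    return w_pose * pose_cost + w_traj * traj_cost

def best_fragment(library, current_pose, desired_dir):
    """Chosen every frame, so transitions emerge without hand-built trees."""
    return min(library, key=lambda f: cost(f, current_pose, desired_dir))

# Toy library: pose and trajectory features reduced to two numbers each.
library = [
    {"name": "run_fwd",   "pose": [0.1, 0.9], "dir": [0.0, 1.0]},
    {"name": "turn_left", "pose": [0.2, 0.8], "dir": [-1.0, 0.2]},
]
chosen = best_fragment(library, current_pose=[0.1, 0.85], desired_dir=[0.0, 1.0])
# "run_fwd" wins: its pose and trajectory are both closest to the input.
```

Because the selection runs continuously, the developer never authors transition points; they fall out of the cost function, which is exactly the "what, not how" framing from the keynote.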
As a demo, a game similar to Flappy Bird was built on the small runtime. After building, the compressed size, including the runtime and all the assets, was only 200 KB. In the future, the small runtime will also include native deployment and 3D support.
Creating connected games requires expertise, infrastructure and operations, and this is where Google Cloud wants to help Unity developers. Unity is migrating to Google Cloud, where virtual machines run the game servers with 99.99% uptime and very low congestion.
An open-source matchmaking project was also announced by Unity and Google Cloud, which allows connected games to scale. In the demo, a connected game was deployed and pushed to a new virtual machine, which started the game server and allowed people to join the game. In Google Cloud's operations dashboard, you could see the new servers spin up in real time as the required capacity increased.
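The scaling behaviour shown in that demo can be sketched in plain Python. This is not the announced matchmaking project's API, just an illustration of the loop it automates: drain the queue of waiting players into game servers, and allocate a fresh server whenever the current one fills up.

```python
# Illustrative sketch (plain Python, not the announced matchmaking API):
# assign queued players to game servers, spinning up a new server
# whenever the current one is full.
from collections import deque

PLAYERS_PER_SERVER = 4  # assumed match size for this toy example

def matchmake(queue, servers):
    """Drain the player queue into servers, allocating new ones on demand."""
    while queue:
        if not servers or len(servers[-1]) >= PLAYERS_PER_SERVER:
            servers.append([])          # scale up: start a fresh server
        servers[-1].append(queue.popleft())
    return servers

waiting = deque(f"player{i}" for i in range(6))
fleet = matchmake(waiting, [])
# Six players produce two servers: one full with four, one with the
# remaining two, mirroring the capacity-driven spin-up from the demo.
```

In production the "append a fresh server" step is where a cloud provider allocates a new virtual machine, which is what the dashboard in the demo was visualising.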
New Prefab Workflows
New prefab workflows were announced at the end of the keynote. The problem with the existing workflow was that you did not have enough control, so productivity suffered. A new feature in Unity is editing in "Prefab Mode": a scene editor that lets you edit only the prefab and save the changes directly to the prefab file. Another amazing addition is "Nested Prefabs".
This is something Unity developers have long been looking forward to. Currently, saving a prefab that contains a child prefab does not save the child as a prefab: nested prefabs are not treated as prefabs. With the new workflows, they will be. The new prefab workflows are currently in preview and will be released at the end of 2018 in version 2018.3.