Experimenting with Multiplayer ARCore and ARKit: Jump in with Sample Code

One of the aims of our talk at Unite Austin 2017 was to provide an introduction to AR and show people what ARKit and ARCore can do. But most importantly, we wanted developers who were using those platforms to be excited about creating on them. To that end, we built the talk around some major requests from AR developers.

Note that this particular implementation and sample code are meant for experimental purposes. The provided package is not officially supported by Unity, and any upcoming changes to ARCore or ARKit can break it.

The first request was the ability to write cross-platform code, so that developers could target both ARKit and ARCore (and other platforms as they come out). This request tied in very well with Unity’s core value of democratizing development by allowing creators to write once and deploy anywhere. For this request, we created the quick experimental API described below, which we hope to evolve into our integrated cross-platform API.

The second common request we got was to deal with the issues around displaying scaled content in AR. The third common request we decided to handle was to show how we could enable shared multiplayer experiences using existing building blocks provided by Unity.

During the talk we showed demos that answered all three of these major requests. In this blog, we hope to give you some more details about those answers and help you use the code associated with the talk.

Source code release

You can download the code associated with our talk here.

Prerequisites

ARInterface

We would like to emphasize that the ARInterface that we demonstrated at Unite Austin is experimental: we developed it within two weeks with the explicit aim of developing some cross-platform demos for the talk, and as such it has not taken into account any of the underlying Unity requirements or existing APIs. At Unity, we are working on native cross-platform bindings for multiple AR platforms for next year. Until that is officially released, this C# API may be a useful stopgap for developing ARCore and ARKit applications.

Developing an AR application for ARKit or ARCore usually means interacting with platform-specific APIs and components. ARInterface abstracts several of the commonalities into a separate layer, which means if the developer writes code that talks to ARInterface, then it will filter down to the correct platform-specific call.

Initially this interface allows cross-platform work across ARKit and ARCore, but it should be easy enough to implement for other AR platforms with similar functionality.

Current API

To take a look at the API, examine ARInterface.cs in the project. What follows is a detailed description of the methods in this interface.

Firstly we have the calls to start and stop the AR session:

public abstract bool StartService(Settings settings);
public abstract void StopService();

The Settings parameter lets you choose to enable any one or more of point cloud creation, light estimation and plane detection in the session.

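As a minimal sketch of starting a session (assuming the UnityARInterface namespace used by the project scripts; the Settings field names and the static GetInterface() helper match this code drop but may change in future revisions):

using UnityEngine;
using UnityARInterface;

public class ARSessionStarter : MonoBehaviour
{
    void Start()
    {
        var settings = new ARInterface.Settings
        {
            enablePlaneDetection = true,
            enableLightEstimation = true,
            enablePointCloud = true
        };

        // GetInterface() returns the implementation for the current
        // platform (ARKit, ARCore, Editor or Remote).
        if (!ARInterface.GetInterface().StartService(settings))
            Debug.LogError("Failed to start the AR session.");
    }
}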

Next we have the basic AR functionality of world tracking by keeping track of the AR device position and orientation in the real world:

public bool TryGetPose(ref Pose pose);

Pose here describes the position and the rotation of the device camera, and is usually used to move a Unity camera just like the device.

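As a sketch of that pattern (reusing the GetInterface() helper from the sketch above; SetupCamera, described below, configures the video background on the same camera):

using UnityEngine;
using UnityARInterface;

[RequireComponent(typeof(Camera))]
public class ARCameraDriver : MonoBehaviour
{
    private ARInterface m_Interface;

    void Start()
    {
        m_Interface = ARInterface.GetInterface();
        // Let the platform render the camera video behind this camera.
        m_Interface.SetupCamera(GetComponent<Camera>());
    }

    void Update()
    {
        var pose = new Pose();
        if (m_Interface.TryGetPose(ref pose))
        {
            // Move the Unity camera exactly like the physical device.
            transform.localPosition = pose.position;
            transform.localRotation = pose.rotation;
        }
    }
}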

Then we have the plane event delegates:

public static Action<BoundedPlane> planeAdded;
public static Action<BoundedPlane> planeUpdated;
public static Action<BoundedPlane> planeRemoved;

These allow you to keep track of the planes that are detected during the AR session, whenever they are added, updated or removed.

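For example, a component can subscribe in OnEnable and unsubscribe in OnDisable to stay in sync with plane detection (the id field on BoundedPlane is an assumption based on this code drop):

using UnityEngine;
using UnityARInterface;

public class PlaneEventLogger : MonoBehaviour
{
    void OnEnable()
    {
        ARInterface.planeAdded += OnPlaneAdded;
        ARInterface.planeUpdated += OnPlaneUpdated;
        ARInterface.planeRemoved += OnPlaneRemoved;
    }

    void OnDisable()
    {
        ARInterface.planeAdded -= OnPlaneAdded;
        ARInterface.planeUpdated -= OnPlaneUpdated;
        ARInterface.planeRemoved -= OnPlaneRemoved;
    }

    void OnPlaneAdded(BoundedPlane plane)   { Debug.Log("Plane added: " + plane.id); }
    void OnPlaneUpdated(BoundedPlane plane) { Debug.Log("Plane updated: " + plane.id); }
    void OnPlaneRemoved(BoundedPlane plane) { Debug.Log("Plane removed: " + plane.id); }
}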

To get the real scene that you want to augment in AR, you need to render the video from the camera as the background for a camera in the scene:

public abstract void SetupCamera(Camera camera);

You pass in the Unity camera that you want to have display the video background and the implementation will set it up for you.

For the background rendering, we also need the display transform, which allows the shader to rotate the resulting background texture according to the orientation and screen size of the device:

public abstract Matrix4x4 GetDisplayTransform();

There is a call to get the detected point cloud:

public abstract bool TryGetPointCloud(ref PointCloud pointCloud);

PointCloud has a list of Vector3 that describes where the points are in world coordinates.

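A quick polling sketch (the points field name is an assumption based on this release):

using UnityEngine;
using UnityARInterface;

public class PointCloudLogger : MonoBehaviour
{
    private PointCloud m_PointCloud = new PointCloud();

    void Update()
    {
        if (ARInterface.GetInterface().TryGetPointCloud(ref m_PointCloud) &&
            m_PointCloud.points != null)
        {
            // points is a list of Vector3 positions in world coordinates.
            Debug.Log("Point cloud contains " + m_PointCloud.points.Count + " points.");
        }
    }
}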

AR platforms usually return an estimate of the lighting in the scene:

public abstract LightEstimate GetLightEstimate();

LightEstimate contains both an ambientIntensity and an ambientColorTemperature.

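For instance, you could drive a scene light from the estimate every frame (a sketch; how the raw values should map onto a Unity light is platform- and content-dependent, and colorTemperature only takes effect when Unity's color-temperature lighting mode is enabled):

using UnityEngine;
using UnityARInterface;

public class LightEstimateDriver : MonoBehaviour
{
    public Light sceneLight;

    void Update()
    {
        var estimate = ARInterface.GetInterface().GetLightEstimate();
        // Direct assignment is for illustration only; you will likely
        // want to rescale these values for your content.
        sceneLight.intensity = estimate.ambientIntensity;
        sceneLight.colorTemperature = estimate.ambientColorTemperature;
    }
}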

In some cases, you want to actually read the values of the pixels from the camera-captured image (of your real surroundings), and for that you would use:

public abstract bool TryGetCameraImage(ref CameraImage cameraImage);

CameraImage contains the actual pixels captured in a format accessible from the CPU.

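A polling sketch follows; the y/uv/width/height field names reflect this release, which hands the image back as CPU-side YUV planes, and should be treated as assumptions on other revisions:

using UnityEngine;
using UnityARInterface;

public class CameraImageReader : MonoBehaviour
{
    private CameraImage m_Image = new CameraImage();

    void Update()
    {
        if (ARInterface.GetInterface().TryGetCameraImage(ref m_Image) &&
            m_Image.y != null)
        {
            Debug.Log(string.Format("Camera image: {0}x{1}, {2} luma bytes",
                m_Image.width, m_Image.height, m_Image.y.Length));
        }
    }
}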

ARKitInterface and ARCoreInterface

These are the concrete implementations of the ARInterface to handle the ARKit and ARCore platforms respectively. Their implementations use the underlying plugin to carry out the functionality they require. You can take a look at these to see how you would possibly extend this to other platforms.

AREditorInterface: ARInterface

One new thing we can achieve with ARInterface is the ability to derive an editor interface from it and program it to replicate an AR environment in the Editor. In our example, our simulated AR environment generates two planes in specific places in the scene: one after one second and the other a second later. It also generates a point cloud of 20 random points in the scene. The code for this is in AREditorInterface.cs, and you can make your simulated environment as elaborate or detailed as needed.

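To give a flavor of the idea (a stand-in illustration, not the shipped AREditorInterface code), a simulated environment boils down to fabricating scene data on a schedule:

using System.Collections;
using UnityEngine;

public class FakeAREnvironment : MonoBehaviour
{
    IEnumerator Start()
    {
        // Mimic plane detection: one plane after a second, another a second later.
        yield return new WaitForSeconds(1f);
        SpawnFakePlane(new Vector3(0f, 0f, 1.5f), new Vector2(2f, 2f));
        yield return new WaitForSeconds(1f);
        SpawnFakePlane(new Vector3(1f, 0.5f, 2f), new Vector2(1f, 1f));
    }

    void SpawnFakePlane(Vector3 center, Vector2 size)
    {
        var go = GameObject.CreatePrimitive(PrimitiveType.Plane);
        go.transform.position = center;
        // Unity's built-in plane primitive is 10x10 units at scale 1.
        go.transform.localScale = new Vector3(size.x / 10f, 1f, size.y / 10f);
    }
}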

Now we can interact with this environment as if we were moving the device through it, and because it all runs in the Editor, we can debug and iterate on any interactions with it. To move the camera through the environment, use the WASD keys. To orient the camera, move the mouse while holding the right mouse button.

Since we are within the Editor, we can change any GameObject parameters via the Inspector and see instant changes. We can also change the code and rerun immediately, since recompiling a change to a single file is almost instantaneous in the Editor. In addition, we can set breakpoints in our code and step through it to figure out why something is not working as expected.

This tool has proven invaluable to us while developing new examples and porting examples from other platforms: use it to iterate quickly without even the need for an AR device!

ARRemoteInterface: ARInterface

One of the most popular tools that we released in conjunction with the ARKit plugin is the ARKit Remote. This tool has two parts: the actual remote app that you install on your AR device, and a component you place in your scene so that it gets the ARKit data from the remote app, providing a simulation of it in the Editor. This allowed developers to iterate and debug in the Editor, similar to the EditorInterface, but in this case they were getting the actual data of their real world surroundings.

The popularity of the ARKit Remote meant that when the ARCore preview was released, many developers asked for a similar tool for ARCore. With the cross-platform ARInterface, creating the tool for ARCore was made much easier by implementing functionality similar to the existing ARKit Remote via the interface.

To use this workflow, you first build the RemoteDevice scene (found in the Assets/UnityARinterface/ARRemote folder) and install it to an actual device that supports either ARKit or ARCore. Make sure that you check the “Development Build” checkbox in the Build Settings dialog when building (this enables the PlayerConnection mechanism that we use to communicate between the remote app and the Editor).

Next you take the AR scene you want to iterate on in the editor and create an ARRemoteEditor component on the ARRoot GameObject. Disable the ARController component on the same GameObject if one exists. Now run the remote app that you installed on the device, and connect to it from the Console “Connected Player” menu. Now press “Play” in the Editor and you should get a Game View like this:

Press the “Start Remote AR Session” button on the top of the screen and you should now have the AR device sending its AR data across to the Editor. You are now ready to iterate in the Editor with real AR data from device.

Ported examples

We ported a number of examples from the ARKit plugin over to use ARInterface, which allows us to use them on ARCore as well. In addition, it gives us the ability to use AREditorInterface and ARRemoteInterface. In fact, porting these examples was made much simpler and faster because we could iterate on them in the Editor using AREditorInterface.

Focus Square example

A key component of Apple’s Human Interface Guidelines for AR is an indicator of where you can place your virtual content in your environment. This was our implementation of that indicator, which we called Focus Square.

Porting this example to ARInterface was pretty straightforward as it did not have much in the way of platform specific code, except for the HitTest. We decided to use a raycast against the generated plane GameObjects in the scene instead of a HitTest: this would effectively give us a HitTest against a plane considering its extents, and would also work on Editor as a bonus.

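A sketch of that editor-friendly hit test (the generated plane GameObjects need colliders, and the layer mask depends on your project setup):

using UnityEngine;

public static class EditorFriendlyHitTest
{
    // Raycast from a screen point against the generated plane colliders.
    // Works identically on device and in the Editor.
    public static bool TryHitPlane(Camera camera, Vector2 screenPoint,
                                   LayerMask planeLayers, out Vector3 hitPoint)
    {
        Ray ray = camera.ScreenPointToRay(screenPoint);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, 100f, planeLayers))
        {
            hitPoint = hit.point;
            return true;
        }
        hitPoint = Vector3.zero;
        return false;
    }
}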

The rest of the AR session and camera is set up by the ARController, which is part of the code drop and helps you set up your scene for use in AR.

AR Ballz example

The Unity ARBallz example was a fun demo used to show physics interactions with flat surfaces in AR. The example has two modes. In the first, touching a detected plane through the screen creates balls on that plane. In the second, touching the screen moves the balls away from the point you are touching on the plane.

In porting this example, we replaced the HitTests that were used to place and move the balls with the editor-friendly version described above, since we were only interested in doing the HitTest against the extents of the planes you had already detected.

Another change we made, which was not strictly needed for the port, was to make the ball-moving interaction work much better by using force vectors derived from your finger position rather than collision dynamics. This was an instance where the EditorInterface came in handy for iterating on the parameters in the Inspector.

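A simplified sketch of that force-based interaction (assuming each ball has a Rigidbody and the touched point on the plane comes from the hit test above):

using UnityEngine;

public class BallPusher : MonoBehaviour
{
    public float pushStrength = 5f;

    // Push a ball directly away from the finger's position on the plane.
    public void PushAwayFrom(Vector3 touchPointOnPlane, Rigidbody ball)
    {
        Vector3 away = ball.position - touchPointOnPlane;
        away.y = 0f; // keep the push parallel to the plane
        if (away.sqrMagnitude < 1e-4f)
            return;
        // Weaken the push as the ball gets farther from the finger.
        Vector3 force = away.normalized * (pushStrength / (1f + away.magnitude));
        ball.AddForce(force, ForceMode.Force);
    }
}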

You can try both of these examples in the project in Editor, on Remotes or on actual ARKit and ARCore devices. We look forward to porting more of the examples over to this interface as an exercise.

Scaled content

In our Unite session, we talked about scaled content and why you would want it. We also discussed why you would not want to scale the content itself, but use “camera-tricks” to do the scaling. We showed two options for scaling content – one of them uses one camera, while the other uses two cameras. This topic is important enough to warrant a separate blog post, and we’ll be publishing one soon with a more detailed explanation of how to best scale content. Be sure to check back soon. (Update: The blog on Dealing with Scale in AR is now available to read.)

Within this code release, you can look at the implementation of the one camera solution under Assets/UnityARInterface/Examples/ScaledContent.

Shared multiplayer experience

Another common request from game developers was to allow people with different devices to be able to play the same game in the same space. To cater to this request, we wanted to create a shared multiplayer experience to demonstrate how it could be done. ARInterface, along with the other utilities we created, helped smooth out this development experience.

We started with the TANKS! Networking Demo project that is available from the Asset Store. This turned out to be a good start since the project is a simple one which allows you to play a multiplayer game between various mobile devices. The main thing missing was to make it work as an AR experience, and share that experience across multiple devices.

The existing project worked by using this flow:

This flow used the Lobby scene to do matchmaking of the users on the different devices, and then forwarded the matched users to play together in the main scene.

To make our modifications, we first got rid of the CameraRig that the main scene had, since we were going to create a camera that was going to be controlled by the AR device and not by the movement of the tanks. Then we changed the flow of the game by adding a new scene at the beginning of the flow:

In this case, we set up the AR session and camera in the AR setup scene. Then we pass control over to the Lobby scene, making sure that we make the AR session and camera stay in the scene when we load the subsequent scenes.

The AR setup scene itself uses plane detection to find a good place to create the game. Both devices will look at the same table surface, and use its center to show where the level should appear. Since both use a real world table center and orientation to sync up their coordinate systems, the shared level is created in the same spot according to both devices.

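Conceptually (a sketch, not the shipped Tanks! code), each device derives its level root from the physical table it detected, so both agree on the same real-world spot:

using UnityEngine;

public class SharedAnchor : MonoBehaviour
{
    // Called on each device with the center and orientation of the plane
    // it detected. Both devices observe the same physical table, so the
    // resulting root refers to the same real-world pose on both.
    public Transform CreateLevelRoot(Vector3 planeCenter, Quaternion planeOrientation)
    {
        var root = new GameObject("LevelRoot").transform;
        root.SetPositionAndRotation(planeCenter, planeOrientation);
        return root;
    }
}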

We then consider the size of the plane and the size of the Tanks! level to figure out what scale to use for the level content, using the mechanism described in the scaled content section. We then use this scale for the Tanks! level we display for both players. We also made some changes to allow direct interaction with the tanks by touching and dragging on the screen.

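The scale choice itself reduces to simple arithmetic, sketched here with hypothetical parameter names (levelSize being the level's footprint in Unity units, planeSize the plane's usable extent in meters):

public static class ContentScaleUtil
{
    // E.g. a 40-unit-wide level on a 0.8 m table gives 0.8 / 40 = 0.02,
    // i.e. the content is displayed at 1/50th scale.
    public static float FitLevelToPlane(float levelSize, float planeSize)
    {
        return planeSize / levelSize;
    }
}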

Keep on augmenting

We hope the talk and this release of the code associated with it will inspire you to look at developing some exciting new features and interactions for your own AR applications. Please show us what you do with it @jimmy_jam_jam. Any questions? Please contact us on the Unity forums.

Translated from: https://blogs.unity3d.com/2017/11/01/experimenting-with-multiplayer-arcore-and-arkit-jump-in-with-sample-code/
