Spawn a Player using AR Foundation

Code and workflows for using AR Foundation to spawn an AR Player

What you'll develop on this page

Spawn an AR Player that can interact with desktop players by shooting and navigating

Your project will be able to make use of AR Player input configurations when deployed on ARKit-enabled platforms.

GitHub branch link: https://github.com/moetsi/Unity-DOTS-Multiplayer-XR-Sample/tree/Updating-to-AR-Player

Updating controls

Sampling AR pose

Almost all AR APIs are able to provide the device's "pose" (translation + rotation). AR Foundation provides this data through its "Pose Driver."

AR Pose Driver

The AR Pose Driver drives the local position and orientation of the parent GameObject according to the device's tracking information. The most common use case for this would be attaching the ARPoseDriver to the AR Camera to drive the camera's position and orientation in an AR scene.


From AR Foundation's AR Pose Driver documentation

In order to spawn an AR Player in our game, we need to grab AR pose data from AR Foundation and provide it to our ECS system. AR Foundation runs on MonoBehaviours, so we're going to need to use a pattern of dropping data "into" ECS (using the Entity Manager), and then pulling it back "out" when we need it. Currently, ECS does not provide a way to "push" the information out (i.e. by calling a MonoBehaviour method within a system).

The device's movement will control our AR player's movement. So if the device moves to the left, the player moves to the left. If the device moves to the right, the player moves to the right. On the client, we will sample the pose and use NetCode to send it to the server as ICommandData.

  • Let's create ARPoseComponent in the Client/Components folder

  • Paste the code snippet below into ARPoseComponent.cs:

Creating ARPoseComponent

This is going to be the component that will update every time we sample pose data.
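
The snippet itself lives in the branch linked above. As a rough sketch, assuming we store the pose with Unity.Mathematics types, the component could look like this (field names are illustrative):

```csharp
using Unity.Entities;
using Unity.Mathematics;

// Client-side singleton holding the most recently sampled AR pose.
// Field names here are illustrative; the branch linked above has the real component.
public struct ARPoseComponent : IComponentData
{
    public float3 translation;   // device position reported by the AR Pose Driver
    public quaternion rotation;  // device orientation reported by the AR Pose Driver
}
```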

Now we need to actually create the MonoBehaviour that will grab the pose data and then set ARPoseComponent. From the Unity AR Foundation documentation above, we know that the AR Pose Driver script updates the translation and rotation of the parent GameObject. We are going to add our ARPoseSampler script to the same GameObject and pull its translation and rotation.

  • In MainScene, select the AR Camera that is nested in AR Session Origin, click "Add Component" in Inspector and create a new script called ARPoseSampler, and move the script to the Client folder

Creating ARPoseSampler
  • Paste the code snippet below into ARPoseSampler.cs:

Updating ARPoseSampler
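
The authoritative script is in the branch linked above; the sketch below shows the general shape, assuming the ECS data lives in the client world created by NetCode (the world lookup is our assumption):

```csharp
using Unity.Entities;
using Unity.Mathematics;
using Unity.NetCode;
using UnityEngine;

// Attached to the AR Camera (the same GameObject the AR Pose Driver drives).
// Every frame it copies the driven local transform "into" ECS via the Entity Manager.
public class ARPoseSampler : MonoBehaviour
{
    private EntityManager m_EntityManager;
    private Entity m_PoseEntity;

    void Start()
    {
        // NetCode keeps client and server in separate worlds, so we look for the
        // world that contains the ClientSimulationSystemGroup.
        foreach (var world in World.All)
        {
            if (world.GetExistingSystem<ClientSimulationSystemGroup>() != null)
            {
                m_EntityManager = world.EntityManager;
                m_PoseEntity = m_EntityManager.CreateEntity(typeof(ARPoseComponent));
                break;
            }
        }
    }

    void Update()
    {
        if (m_PoseEntity == Entity.Null)
            return; // no client world found (for example, a server-only build)

        // The AR Pose Driver has already updated this GameObject's local transform,
        // so we read it back and drop it into our ARPoseComponent singleton.
        var position = transform.localPosition;
        var rotation = transform.localRotation;
        m_EntityManager.SetComponentData(m_PoseEntity, new ARPoseComponent
        {
            translation = new float3(position.x, position.y, position.z),
            rotation = new quaternion(rotation.x, rotation.y, rotation.z, rotation.w)
        });
    }
}
```

We read the local position and rotation because that is what the AR Pose Driver drives on this GameObject.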

Updating PlayerCommand

We are going to use PlayerCommand to send the pose data, so we'll need to update the script in order to do so.

We also need to add a boolean to PlayerCommand to signify that the command was sent from an AR player, because the server now needs to differentiate between AR and desktop players.

  • Paste the code snippet below into PlayerCommand.cs:

Updating PlayerCommand
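
The real command layout is in the branch linked above. Assuming the non-generic ICommandData interface (older NetCode versions instead used ICommandData<T> with hand-written serialization), the updated command might look roughly like this:

```csharp
using Unity.Mathematics;
using Unity.NetCode;

// Client-to-server command, extended with the sampled AR pose and a flag that tells
// the server the command came from an AR player. The desktop input fields shown here
// are illustrative stand-ins for the ones the sample already defines.
public struct PlayerCommand : ICommandData
{
    public uint Tick { get; set; }

    // Existing desktop inputs (navigation, shooting, self-destruct).
    public byte right;
    public byte left;
    public byte thrust;
    public byte selfDestruct;
    public byte shoot;

    // New AR fields. Assumption: the command serializer handles float3/quaternion;
    // if not, these could be split into individual floats.
    public byte isAR;
    public float3 arTranslation;
    public quaternion arRotation;
}
```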

Now PlayerCommand can carry the pose data to the server.

Create ARInputSystem

Rather than adding if(ar) statements to InputSystem, we are going to create a new input system in Client/Systems just for AR players, named "ARInputSystem."

  • Create ARInputSystem in the Client/Systems folder

  • Paste the code snippet below into ARInputSystem.cs:
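
The sketch below follows NetCode's usual client input-sampling pattern; the touch mapping mirrors the gestures described later on this page (tap to shoot, three fingers to self-destruct), and the field names match the hedged PlayerCommand sketch above rather than the exact code in the branch:

```csharp
using Unity.Entities;
using Unity.NetCode;
using UnityEngine;

// Client-only input system for AR players: instead of keyboard input, it packs the
// sampled AR pose (plus touch gestures) into PlayerCommands every frame.
[UpdateInGroup(typeof(GhostInputSystemGroup))]
public class ARInputSystem : SystemBase
{
    private ClientSimulationSystemGroup m_ClientSimulationSystemGroup;

    protected override void OnCreate()
    {
        // Only run once we are connected and ARPoseSampler has started publishing a pose,
        // which only happens on AR devices.
        RequireSingletonForUpdate<NetworkIdComponent>();
        RequireSingletonForUpdate<ARPoseComponent>();
        m_ClientSimulationSystemGroup = World.GetExistingSystem<ClientSimulationSystemGroup>();
    }

    protected override void OnUpdate()
    {
        var commandTarget = GetSingleton<CommandTargetComponent>().targetEntity;
        if (commandTarget == Entity.Null)
            return; // no player entity to send commands for yet

        var arPose = GetSingleton<ARPoseComponent>();

        var command = default(PlayerCommand);
        command.Tick = m_ClientSimulationSystemGroup.ServerTick;
        command.isAR = 1;
        command.arTranslation = arPose.translation;
        command.arRotation = arPose.rotation;

        // Touch mapping: any touch shoots, three or more fingers self-destruct.
        if (Input.touchCount > 0)
            command.shoot = 1;
        if (Input.touchCount >= 3)
            command.selfDestruct = 1;

        // Append the command to the buffer NetCode sends to the server.
        var commandBuffer = EntityManager.GetBuffer<PlayerCommand>(commandTarget);
        commandBuffer.AddCommandData(command);
    }
}
```

Requiring the ARPoseComponent singleton conveniently gates this system to AR clients, since only ARPoseSampler creates that singleton.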

Update response system

Next, we need to update one of our response systems. We will only need to update the InputResponseMovementSystem (and not InputResponseSpawnSystem) because the spawning of bullets does not need to be altered. If the PlayerCommand has shoot = 1, then the bullet will spawn.

  • Paste the code snippet below into InputResponseMovementSystem.cs:

Updating InputResponseMovementSystem

In InputResponseMovementSystem, we hardcoded the camera's offset from the player as (0, 2, -10). We could have included this offset in GameSettingsComponent instead, but we left it hardcoded for the sake of simplicity.
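
To make that offset concrete: if the desktop camera sits at (0, 2, -10) in the player's local space, then on an AR client the movement response can invert that relationship and place the player relative to the sampled device pose. A hedged sketch of the calculation (the exact math the sample uses is in the branch above):

```csharp
using Unity.Mathematics;

// Illustrative helper: given the device pose carried by a PlayerCommand, compute where
// the player should sit so that the AR camera ends up at the usual (0, 2, -10) offset.
public static class ARPlayerPlacement
{
    // The camera offset hardcoded in InputResponseMovementSystem.
    static readonly float3 k_CameraOffset = new float3(0f, 2f, -10f);

    public static float3 PlayerPositionFromPose(float3 poseTranslation, quaternion poseRotation)
    {
        // Camera = player + rotate(rotation, offset), so player = camera - rotate(rotation, offset).
        // Assumption: the player shares the device pose's rotation.
        return poseTranslation - math.mul(poseRotation, k_CameraOffset);
    }
}
```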

Update spawning classification for AR players

Now we need to update our PlayerGhostSpawnClassificationSystem so it does not add the camera to an AR Player (remember that in PlayerGhostSpawnClassificationSystem we add a Camera to the player after spawning).

We are going to check for the IsARPlayerComponent singleton and if it exists we will not add the Camera to the player.

  • Paste the code snippet below into PlayerGhostSpawnClassificationSystem.cs:

Updating PlayerGhostSpawnClassificationSystem
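
The full system is in the branch above; the shape of the change is simply a guard around the camera-attach step, keyed off the IsARPlayerComponent singleton. A hedged illustration (how the singleton itself gets created, presumably by client code running on AR-capable devices, is not shown here):

```csharp
using Unity.Entities;

// Marker singleton: present on clients that are running as AR players.
public struct IsARPlayerComponent : IComponentData
{
}

// Illustration only: in PlayerGhostSpawnClassificationSystem, a check like this wraps
// the step that attaches the Camera to the freshly spawned player ghost.
public static class CameraAttachGuard
{
    public static bool ShouldAttachCamera(EntityManager entityManager)
    {
        // If the IsARPlayerComponent singleton exists, the AR camera already follows the
        // device pose, so we skip attaching a camera to the player entity.
        using (var arQuery = entityManager.CreateEntityQuery(typeof(IsARPlayerComponent)))
        {
            return arQuery.CalculateEntityCount() == 0;
        }
    }
}
```
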
  • Let's hit play, host a game, and spawn a player

Hitting play and our camera not appearing

If you remember back to our "DOTS Entity Component System ECS" section, we added HYBRID_ENTITIES_CAMERA_CONVERSION to our "Scripting Define Symbols". Now that we are building for iOS, we also need to add it to our iOS Player Settings. That way, when we hit "play" in the Unity Editor, we can still render our camera without constantly switching back and forth between the Desktop and iOS builds.

  • Go to "Player Settings..." (File > Build Settings > Player Settings button in the bottom left) and add HYBRID_ENTITIES_CAMERA_CONVERSION to the "Scripting Define Symbols" field when Player is selected, after "UNITY_XR_ARKIT_LOADER_ENABLED;"

    • Hit apply

  • Save!

Updating "Player Settings..." for our iOS build with HYBRID_ENTITIES_CAMERA_CONVERSION
  • Ok, now you need to restart your computer (yup, we're being serious; in our testing, this was a necessary step in order for the camera to work in the editor again)

  • Go to the BuildSettings folder, choose your development platform, and hit "Build and Run" (this will cause the Editor to switch)

    • The top of the Editor will change from "iOS" to "PC, Mac & Linux"

  • Hit play, host a game, and spawn a player

After switching back to development platform player camera works again
  • Now let's try something different. Exit Play Mode, go back to the BuildSettings folder, select iOS-Build, and hit Build and Run (this will cause the Editor to switch)

Selecting our iOS-Build and hitting Build and Run to switch back
  • Once the Editor resets to "iOS" hit play, host a game, and spawn a player

Spawning a player now generates the camera again
  • Now click "Build and Run" again to build to Xcode

Hitting Build and Run after the Editor reset to iOS
  • In the deployed app, host a game

  • Tap on the screen to spawn and shoot, use 3 fingers to self-destruct, and move the device to move the player

Updated AR Player is able to spawn, navigate, and fire

Updating spawning

Currently, InputResponseMovementSystem places the player ahead of the AR pose, regardless of where our server spawns our AR player. This isn't great, because as soon as a player is destroyed (by self-destructing or by being shot by another player), they respawn right where they were, which is not very fun.

We need to update our flow so that when our AR Player is spawned, we can tell our AR Pose Driver to move the camera to behind the spawn location. We need to communicate to the Pose Driver and say, "you are now at this translation + rotation."

A good place to do this is PlayerGhostSpawnClassificationSystem. When we receive our player ghost on the client, we will create a new singleton called "SpawnPositionForARComponent" which will hold the location where our server spawned our player.

Within ARPoseSampler we will add logic to check for this component, and if it exists (which means there was a new spawn), we will update the pose to be behind the player and then delete the singleton.

  • Create SpawnPositionForARComponent in the Client/Components folder

  • Paste the code snippet below into SpawnPositionForARComponent.cs (a sketch of this component and its creation appears after these steps):

  • Now we need to update PlayerGhostSpawnClassificationSystem in order to create a singleton with the spawn position

  • Paste the code snippet below into PlayerGhostSpawnClassificationSystem.cs:

Updating PlayerGhostSpawnClassificationSystem to provide spawn location
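
Both additions are small. As a hedged sketch (field and helper names are illustrative; the branch above has the real code), the component just carries the spawn pose, and the classification change boils down to creating it as a singleton once the locally owned player ghost arrives on an AR client:

```csharp
using Unity.Entities;
using Unity.Mathematics;

// Client-side singleton recording where the server spawned our player.
// ARPoseSampler consumes it (and then destroys it) to re-base the AR pose.
public struct SpawnPositionForARComponent : IComponentData
{
    public float3 spawnTranslation;
    public quaternion spawnRotation;
}

// Hypothetical helper showing the singleton-creation step the classification system
// performs once it knows where the server placed our player.
public static class ARSpawnRecorder
{
    public static void RecordSpawn(EntityManager entityManager,
        float3 spawnTranslation, quaternion spawnRotation)
    {
        var entity = entityManager.CreateEntity(typeof(SpawnPositionForARComponent));
        entityManager.SetComponentData(entity, new SpawnPositionForARComponent
        {
            spawnTranslation = spawnTranslation,
            spawnRotation = spawnRotation
        });
    }
}
```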

Now we will update ARPoseSampler to check for a SpawnPositionForARComponent, and if it exists we update the pose to be behind the player spawn. We will use ARSessionOrigin's MakeContentAppearAt method, explained in Unity's AR Foundation documentation below:

MakeContentAppearAt(Transform, Quaternion)

Makes content appear to have orientation rotation relative to the Camera.

Declaration

public void MakeContentAppearAt(Transform content, Quaternion rotation)

Parameters

  • content (Transform): The Transform of the content you wish to affect.

  • rotation (Quaternion): The rotation the content should appear to be in, relative to the Camera.

Remarks

This method does not actually change the Transform of content; instead, it updates the ARSessionOrigin's Transform so that the content appears to be in the requested orientation.

From AR Foundation's MakeContentAppearAt documentation

Because we don't want to actually move content, but instead want to move our AR Pose, we are going to provide the inverse of our translation and rotation to the MakeContentAppearAt method.

We also need to save the updates we apply to AR Session Origin so that we can "undo" them before applying the next one. MakeContentAppearAt is additive and does not reset on each call, so if we never "undid" the previous correction, each respawn would push our pose further and further away.

  • Paste the code snippet below into ARPoseSampler.cs:

Updating ARPoseSampler to ingest SpawnPositionForARComponent
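
The branch above has the authoritative script. The sketch below extends the earlier ARPoseSampler sketch with the two ideas from this section: feed MakeContentAppearAt the inverse of the desired pose, and remember the previous correction so it can be undone before the next one. Which Transform to pass as the content argument, and the exact sign conventions, are assumptions here.

```csharp
using Unity.Entities;
using Unity.Mathematics;
using Unity.NetCode;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARPoseSampler : MonoBehaviour
{
    // Drag AR Session Origin into this field in the Inspector.
    public ARSessionOrigin sessionOrigin;

    private EntityManager m_EntityManager;
    private Entity m_PoseEntity;
    private EntityQuery m_SpawnQuery;

    // The last correction we applied, kept so we can undo it before applying a new one
    // (MakeContentAppearAt is additive; it does not reset between calls).
    private Vector3 m_LastOffsetPosition = Vector3.zero;
    private Quaternion m_LastOffsetRotation = Quaternion.identity;

    void Start()
    {
        // Find the NetCode client world, as before.
        foreach (var world in World.All)
        {
            if (world.GetExistingSystem<ClientSimulationSystemGroup>() != null)
            {
                m_EntityManager = world.EntityManager;
                m_PoseEntity = m_EntityManager.CreateEntity(typeof(ARPoseComponent));
                m_SpawnQuery = m_EntityManager.CreateEntityQuery(typeof(SpawnPositionForARComponent));
                break;
            }
        }
    }

    void Update()
    {
        if (m_PoseEntity == Entity.Null)
            return;

        // Did PlayerGhostSpawnClassificationSystem record a fresh spawn position?
        if (m_SpawnQuery.CalculateEntityCount() == 1)
        {
            var spawn = m_SpawnQuery.GetSingleton<SpawnPositionForARComponent>();

            // Undo the previous correction so repeated respawns do not accumulate.
            // Assumption: we pass the session origin's own Transform as "content";
            // the branch has the real call.
            sessionOrigin.MakeContentAppearAt(sessionOrigin.transform,
                -m_LastOffsetPosition, Quaternion.Inverse(m_LastOffsetRotation));

            // We want the camera behind the spawn point (mirroring the desktop (0, 2, -10)
            // camera offset; applied in world space here for simplicity), and we pass the
            // inverse because we want to move the AR rig, not the content.
            var t = spawn.spawnTranslation;
            var targetPosition = new Vector3(t.x, t.y + 2f, t.z - 10f);
            var q = spawn.spawnRotation.value;
            var targetRotation = new Quaternion(q.x, q.y, q.z, q.w);
            sessionOrigin.MakeContentAppearAt(sessionOrigin.transform,
                -targetPosition, Quaternion.Inverse(targetRotation));

            m_LastOffsetPosition = targetPosition;
            m_LastOffsetRotation = targetRotation;

            // Consume the singleton so we only re-base once per spawn.
            m_EntityManager.DestroyEntity(m_SpawnQuery.GetSingletonEntity());
        }

        // Publish the current (driven) camera pose into ECS, as before.
        var position = transform.localPosition;
        var rotation = transform.localRotation;
        m_EntityManager.SetComponentData(m_PoseEntity, new ARPoseComponent
        {
            translation = new float3(position.x, position.y, position.z),
            rotation = new quaternion(rotation.x, rotation.y, rotation.z, rotation.w)
        });
    }
}
```
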
  • Make sure MainScene is open, and then drag AR Session Origin from the Hierarchy into the appropriate field of the ARPoseSampler component on the AR Camera (which is nested under AR Session Origin in the Hierarchy)

Updating ARPoseSampler's public field with AR Session Origin
  • Let's build and run for iOS and check it out

GitHub branch link: https://github.com/moetsi/Unity-DOTS-Multiplayer-XR-Sample/tree/Updating-to-AR-Player

git clone https://github.com/moetsi/Unity-DOTS-Multiplayer-XR-Sample/
git checkout 'Updating-to-AR-Player'
