High-Fidelity Environments with Unity

This chapter covers creating high-fidelity environments in Unity for digital twin applications in humanoid robotics. Unity provides powerful real-time rendering and realistic visual simulation that complement the robotics-specific physics simulation provided by Gazebo.

Introduction to Unity for Digital Twins

Unity is a versatile game engine that has found significant applications in robotics simulation and digital twin development. For humanoid robotics, Unity offers:

  • High-quality real-time rendering with advanced lighting and materials
  • Physics simulation capabilities (though often paired with Gazebo for robotics-specific physics)
  • Extensive asset library and 3D modeling tools
  • Cross-platform deployment options
  • Integration capabilities with ROS/ROS2 and other robotics frameworks

Unity in Digital Twin Architecture

Unity typically serves as the visualization layer in digital twin systems:

  • Visual rendering: High-fidelity graphics for realistic perception simulation
  • User interface: Control panels and visualization tools for human operators
  • Environment creation: Complex 3D environments with detailed textures and lighting
  • Integration hub: Connecting various simulation components

Creating Realistic 3D Environments

Environment Design Principles

When creating environments for humanoid robot simulation, consider these key principles:

  1. Scale accuracy: Ensure environments match real-world dimensions
  2. Material properties: Use realistic materials that affect sensor simulation
  3. Lighting conditions: Include various lighting scenarios for perception testing
  4. Collision geometry: Balance visual detail with performance requirements (see the collider sketch after this list)
  5. Interactable elements: Include objects the robot can interact with
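
To make the collision-geometry principle concrete, here is a minimal sketch (using a hypothetical SimplifiedCollisionSetup component with placeholder fields) that renders a detailed mesh while colliding against a cheap primitive instead of a per-triangle mesh collider:

using UnityEngine;

// Sketch of principle 4: pair a detailed visual mesh with a simplified
// collider so rendering stays rich while physics stays cheap.
public class SimplifiedCollisionSetup : MonoBehaviour
{
    public Mesh detailedVisualMesh; // High-poly mesh used only for rendering

    void Start()
    {
        // Render the detailed mesh
        MeshFilter filter = gameObject.AddComponent<MeshFilter>();
        filter.sharedMesh = detailedVisualMesh;
        MeshRenderer meshRenderer = gameObject.AddComponent<MeshRenderer>();
        meshRenderer.material = new Material(Shader.Find("Standard"));

        // Collide with a cheap box fitted to the mesh bounds instead of
        // an expensive per-triangle MeshCollider
        BoxCollider box = gameObject.AddComponent<BoxCollider>();
        if (detailedVisualMesh != null)
        {
            box.center = detailedVisualMesh.bounds.center;
            box.size = detailedVisualMesh.bounds.size;
        }
    }
}

A single box is the extreme case; in practice each robot link or furniture piece would be approximated by a small set of primitive or convex colliders.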

Basic Scene Setup

Here's a template for setting up a Unity scene for robotics simulation:

using UnityEngine;

public class RobotEnvironment : MonoBehaviour
{
    [Header("Environment Settings")]
    public float simulationScale = 1.0f; // 1 Unity unit = 1 meter
    public bool enableRealisticLighting = true;
    public Material defaultRobotMaterial;

    [Header("Physics Settings")]
    public float gravity = -9.81f;
    public PhysicMaterial robotMaterial;

    void Start()
    {
        // Set up gravity
        Physics.gravity = new Vector3(0, gravity, 0);

        // Configure environment scale
        transform.localScale = Vector3.one * simulationScale;

        // Apply lighting settings
        SetupLighting();
    }

    void SetupLighting()
    {
        if (enableRealisticLighting)
        {
            // Configure the main directional light to simulate the sun
            Light sunLight = FindObjectOfType<Light>();
            if (sunLight != null)
            {
                sunLight.type = LightType.Directional;
                sunLight.intensity = 1.0f;
                sunLight.color = Color.white;
                sunLight.transform.rotation = Quaternion.Euler(50, -30, 0);
            }
        }
    }
}

Terrain and Ground Plane Creation

Creating realistic ground planes and terrain:

using UnityEngine;

public class TerrainGenerator : MonoBehaviour
{
    public int terrainWidth = 200;
    public int terrainLength = 200;
    public float terrainHeight = 20f;
    public int resolution = 257; // Heightmap resolution must be 2^n + 1

    void Start()
    {
        CreateTerrain();
    }

    void CreateTerrain()
    {
        // Create one TerrainData and configure it before creating the terrain.
        // The heightmap resolution must be set before the size and before
        // SetHeights, otherwise the heights array will not match.
        TerrainData terrainData = new TerrainData();
        terrainData.heightmapResolution = resolution;
        terrainData.size = new Vector3(terrainWidth, terrainHeight, terrainLength);

        // Generate the heightmap
        GenerateHeightmap(terrainData);

        // Create the terrain object from the configured data
        Terrain terrain = Terrain.CreateTerrainGameObject(terrainData).GetComponent<Terrain>();

        // Apply a simple default terrain material
        terrain.materialTemplate = new Material(Shader.Find("Standard"));
        terrain.materialTemplate.color = Color.green;
    }

    void GenerateHeightmap(TerrainData terrainData)
    {
        float[,] heights = new float[resolution, resolution];

        for (int x = 0; x < resolution; x++)
        {
            for (int y = 0; y < resolution; y++)
            {
                // Normalized coordinates across the heightmap
                float xPercent = (float)x / (resolution - 1);
                float yPercent = (float)y / (resolution - 1);

                // Perlin noise for natural variation; heights are normalized 0..1
                heights[x, y] = Mathf.PerlinNoise(xPercent * 10f, yPercent * 10f) * 0.5f;
            }
        }

        terrainData.SetHeights(0, 0, heights);
    }
}

Indoor Environment Setup

For indoor humanoid robot environments:

using UnityEngine;

public class IndoorEnvironment : MonoBehaviour
{
    [Header("Room Dimensions")]
    public Vector3 roomSize = new Vector3(10f, 3f, 10f);
    public float wallThickness = 0.2f;

    [Header("Furniture")]
    public GameObject[] furniturePrefabs;

    void Start()
    {
        CreateRoom();
        AddFurniture();
    }

    void CreateRoom()
    {
        // Create floor
        CreatePlane("Floor", new Vector2(roomSize.x, roomSize.z), new Vector3(0, -wallThickness / 2, 0));

        // Create walls
        CreateWall("WallFront", new Vector2(roomSize.x, roomSize.y), new Vector3(0, roomSize.y / 2, roomSize.z / 2));
        CreateWall("WallBack", new Vector2(roomSize.x, roomSize.y), new Vector3(0, roomSize.y / 2, -roomSize.z / 2));
        CreateWall("WallLeft", new Vector2(roomSize.z, roomSize.y), new Vector3(-roomSize.x / 2, roomSize.y / 2, 0), true);
        CreateWall("WallRight", new Vector2(roomSize.z, roomSize.y), new Vector3(roomSize.x / 2, roomSize.y / 2, 0), true);
    }

    GameObject CreatePlane(string name, Vector2 size, Vector3 position)
    {
        GameObject plane = GameObject.CreatePrimitive(PrimitiveType.Plane);
        plane.name = name;
        plane.transform.position = position;
        // Unity's default plane primitive is 10 x 10 units, so scale accordingly
        plane.transform.localScale = new Vector3(size.x / 10f, 1, size.y / 10f);
        return plane;
    }

    GameObject CreateWall(string name, Vector2 size, Vector3 position, bool rotate = false)
    {
        GameObject wall = GameObject.CreatePrimitive(PrimitiveType.Cube);
        wall.name = name;
        wall.transform.position = position;
        wall.transform.localScale = new Vector3(size.x, size.y, wallThickness);

        // Side walls are rotated 90 degrees around the vertical axis
        if (rotate)
        {
            wall.transform.Rotate(0, 90, 0);
        }

        return wall;
    }

    void AddFurniture()
    {
        if (furniturePrefabs != null && furniturePrefabs.Length > 0)
        {
            // Place each furniture prefab at a random position inside the room,
            // leaving a 1 m margin from the walls
            foreach (GameObject furniture in furniturePrefabs)
            {
                Vector3 randomPos = new Vector3(
                    Random.Range(-roomSize.x / 2 + 1, roomSize.x / 2 - 1),
                    0,
                    Random.Range(-roomSize.z / 2 + 1, roomSize.z / 2 - 1)
                );

                Instantiate(furniture, randomPos, Quaternion.identity);
            }
        }
    }
}

Lighting and Material Optimization

Realistic Lighting Setup

Proper lighting is crucial for high-fidelity environments:

using UnityEngine;

public class LightingSetup : MonoBehaviour
{
    [Header("Lighting Configuration")]
    public LightType lightType = LightType.Directional;
    public Color lightColor = Color.white;
    public float intensity = 1.0f;
    public bool enableShadows = true;

    [Header("Ambient Lighting")]
    public Color ambientColor = new Color(0.212f, 0.227f, 0.259f, 1f);
    public float ambientIntensity = 1.0f;

    // Keep a reference so SetTimeOfDay can rotate the light itself
    private Light mainLight;

    void Start()
    {
        SetupLighting();
    }

    void SetupLighting()
    {
        // Create the main light
        GameObject lightObj = new GameObject("MainLight");
        mainLight = lightObj.AddComponent<Light>();

        mainLight.type = lightType;
        mainLight.color = lightColor;
        mainLight.intensity = intensity;
        mainLight.shadows = enableShadows ? LightShadows.Soft : LightShadows.None;
        mainLight.transform.rotation = Quaternion.Euler(50, -30, 0);

        // Set ambient lighting
        RenderSettings.ambientLight = ambientColor * ambientIntensity;
        RenderSettings.ambientIntensity = ambientIntensity;
    }

    // Dynamic lighting for day/night cycles
    public void SetTimeOfDay(float timeOfDay)
    {
        if (mainLight == null) return;

        // timeOfDay: 0.0 = midnight, 0.5 = noon, 1.0 = midnight again
        float rotation = timeOfDay * 360f - 90f; // Start at dawn
        mainLight.transform.rotation = Quaternion.Euler(rotation, 0, 0);

        // Simple heuristic: dim the ambient light as the sun drops below the horizon
        float intensityFactor = Mathf.Clamp01(Mathf.Cos(rotation * Mathf.Deg2Rad) + 0.5f);
        RenderSettings.ambientIntensity = ambientIntensity * intensityFactor;
    }
}

Material Configuration for Sensor Simulation

Materials that affect sensor simulation need special consideration:

using UnityEngine;

public class SensorMaterials : MonoBehaviour
{
    [Header("Material Properties for Sensors")]
    public Material reflectiveMaterial;  // High reflectivity for LiDAR
    public Material absorptiveMaterial;  // Low reflectivity
    public Material transparentMaterial; // Transparent for cameras
    public Material texturedMaterial;    // Complex textures

    void Start()
    {
        ConfigureMaterials();
    }

    void ConfigureMaterials()
    {
        // Note: the built-in Standard shader exposes smoothness as "_Glossiness";
        // URP/Lit uses "_Smoothness". Adjust the property name to your pipeline.

        // Reflective material for LiDAR simulation
        if (reflectiveMaterial != null)
        {
            reflectiveMaterial.color = Color.gray;
            reflectiveMaterial.SetFloat("_Metallic", 0.9f); // High reflectivity
            reflectiveMaterial.SetFloat("_Smoothness", 0.9f);
        }

        // Absorptive material
        if (absorptiveMaterial != null)
        {
            absorptiveMaterial.color = Color.black;
            absorptiveMaterial.SetFloat("_Metallic", 0.1f);
            absorptiveMaterial.SetFloat("_Smoothness", 0.1f);
        }

        // Transparent material for depth cameras
        if (transparentMaterial != null)
        {
            transparentMaterial.color = new Color(1, 1, 1, 0.5f); // Semi-transparent
            transparentMaterial.SetFloat("_Metallic", 0f);
            transparentMaterial.SetFloat("_Smoothness", 0.5f);
            transparentMaterial.renderQueue = 3000; // Transparent render queue
        }

        // Textured material
        if (texturedMaterial != null)
        {
            texturedMaterial.color = Color.white;
            texturedMaterial.SetFloat("_Metallic", 0.2f);
            texturedMaterial.SetFloat("_Smoothness", 0.3f);
        }
    }
}

Integration with Gazebo Simulation

ROS# Integration

Unity can integrate with ROS/ROS2 using packages like ROS# (ROS Sharp):

using UnityEngine;
using RosSharp.RosBridgeClient;
using RosSharp.RosBridgeClient.Protocols;

public class GazeboIntegration : MonoBehaviour
{
    [Header("ROS Connection")]
    public string rosBridgeServerUrl = "ws://192.168.1.1:9090";
    public int robotJointCount = 6;

    private RosSocket rosSocket;
    private JointStatePublisher jointStatePublisher;

    void Start()
    {
        ConnectToROS();
    }

    void ConnectToROS()
    {
        // Note: exact ROS# signatures vary between releases; treat this as a
        // sketch of the connection flow rather than a drop-in script.
        WebSocketNetProtocol protocol = new WebSocketNetProtocol(rosBridgeServerUrl);
        rosSocket = new RosSocket(protocol);
        Debug.Log("Connecting to ROS bridge at " + rosBridgeServerUrl);

        // Set up publishers and subscribers; topic names and message types are
        // configured on the component (here: joint states for the robot's joints)
        jointStatePublisher = gameObject.AddComponent<JointStatePublisher>();
    }

    void Update()
    {
        // Synchronize Unity transforms with ROS joint states
        SyncTransformsWithROS();
    }

    void SyncTransformsWithROS()
    {
        // Example: update Unity objects based on ROS joint states.
        // This would typically read joint angles received from ROS and
        // apply them to the corresponding Unity transforms.
    }

    void OnDestroy()
    {
        if (rosSocket != null)
        {
            rosSocket.Close();
        }
    }
}
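
On the ROS side, this assumes a rosbridge server is listening at the configured URL; for ROS 1 it is typically started with "roslaunch rosbridge_server rosbridge_websocket.launch". Also note that Unity uses a left-handed, Y-up coordinate system while ROS uses a right-handed, Z-up convention, so positions and orientations must be converted when exchanging transforms; ROS# ships Unity2Ros/Ros2Unity extension methods for this purpose.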

Physics Synchronization

Synchronizing physics between Unity and external simulators:

using UnityEngine;

public class PhysicsSync : MonoBehaviour
{
    [Header("Synchronization Settings")]
    public float syncRate = 60f; // Hz
    public bool useFixedUpdate = true;

    private float lastSyncTime;
    private Rigidbody[] robotRigidbodies;

    void Start()
    {
        // Rigidbodies driven by an external physics engine should usually be kinematic
        robotRigidbodies = GetComponentsInChildren<Rigidbody>();
        lastSyncTime = Time.time;
    }

    void FixedUpdate()
    {
        if (useFixedUpdate)
        {
            SyncPhysics();
        }
    }

    void Update()
    {
        if (!useFixedUpdate && Time.time - lastSyncTime >= 1f / syncRate)
        {
            SyncPhysics();
            lastSyncTime = Time.time;
        }
    }

    void SyncPhysics()
    {
        // Synchronize positions and rotations with the external physics engine
        foreach (Rigidbody rb in robotRigidbodies)
        {
            // Get the pose from the external physics engine,
            // e.g. from ROS joint state or TF messages
            Vector3 externalPosition = GetExternalPosition(rb.name);
            Quaternion externalRotation = GetExternalRotation(rb.name);

            // Apply to the Unity rigidbody
            rb.MovePosition(externalPosition);
            rb.MoveRotation(externalRotation);
        }
    }

    Vector3 GetExternalPosition(string jointName)
    {
        // This would typically read data from ROS or another external source;
        // as a placeholder, return the current position
        return transform.position;
    }

    Quaternion GetExternalRotation(string jointName)
    {
        // This would typically read data from ROS or another external source;
        // as a placeholder, return the current rotation
        return transform.rotation;
    }
}

Visual Perception Testing

Camera Simulation Setup

Setting up cameras for perception testing:

using UnityEngine;

public class PerceptionCamera : MonoBehaviour
{
    [Header("Camera Configuration")]
    public float fieldOfView = 60f;
    public int resolutionWidth = 640;
    public int resolutionHeight = 480;
    public float nearClip = 0.1f;
    public float farClip = 100f;

    [Header("Sensor Simulation")]
    public bool simulateDepth = true;
    public bool simulateRGB = true;
    public float depthNoiseLevel = 0.01f;

    private Camera perceptionCam;
    private RenderTexture rgbTexture;
    private RenderTexture depthTexture;

    void Start()
    {
        SetupCamera();
    }

    void SetupCamera()
    {
        perceptionCam = GetComponent<Camera>();
        if (perceptionCam == null)
        {
            perceptionCam = gameObject.AddComponent<Camera>();
        }

        perceptionCam.fieldOfView = fieldOfView;
        perceptionCam.nearClipPlane = nearClip;
        perceptionCam.farClipPlane = farClip;

        // Create render textures for simulation
        CreateRenderTextures();
    }

    void CreateRenderTextures()
    {
        if (simulateRGB)
        {
            rgbTexture = new RenderTexture(resolutionWidth, resolutionHeight, 24);
            rgbTexture.name = "RGB_Texture";
            perceptionCam.targetTexture = rgbTexture;
        }

        if (simulateDepth)
        {
            // Single-channel float texture to hold depth values. In a full
            // pipeline, depth would be rendered into this texture by a depth
            // shader or a second camera; here it is allocated as a placeholder.
            depthTexture = new RenderTexture(resolutionWidth, resolutionHeight, 24, RenderTextureFormat.RFloat);
            depthTexture.name = "Depth_Texture";
        }
    }

    // Read back the camera's RGB image as a Texture2D
    public Texture2D GetRGBImage()
    {
        if (rgbTexture == null) return null;

        RenderTexture.active = rgbTexture;
        Texture2D image = new Texture2D(resolutionWidth, resolutionHeight, TextureFormat.RGB24, false);
        image.ReadPixels(new Rect(0, 0, resolutionWidth, resolutionHeight), 0, 0);
        image.Apply();
        RenderTexture.active = null;

        return image;
    }

    // Read back depth data, assuming normalized linear depth in the red channel
    public float[,] GetDepthData()
    {
        if (depthTexture == null) return null;

        RenderTexture.active = depthTexture;
        Texture2D depthImage = new Texture2D(resolutionWidth, resolutionHeight, TextureFormat.RFloat, false);
        depthImage.ReadPixels(new Rect(0, 0, resolutionWidth, resolutionHeight), 0, 0);
        depthImage.Apply();
        RenderTexture.active = null;

        // Convert normalized depth to meters and add sensor noise
        float[,] depthData = new float[resolutionWidth, resolutionHeight];
        for (int x = 0; x < resolutionWidth; x++)
        {
            for (int y = 0; y < resolutionHeight; y++)
            {
                Color pixel = depthImage.GetPixel(x, y);
                float depthValue = Mathf.Lerp(nearClip, farClip, pixel.r);

                // Add noise to simulate real sensor behavior
                depthValue += Random.Range(-depthNoiseLevel, depthNoiseLevel);
                depthData[x, y] = Mathf.Clamp(depthValue, nearClip, farClip);
            }
        }

        return depthData;
    }
}

Perception Pipeline Integration

Integrating perception testing into the digital twin:

using UnityEngine;
using System.Collections.Generic;

public class PerceptionPipeline : MonoBehaviour
{
    [Header("Perception Modules")]
    public PerceptionCamera mainCamera;
    public List<PerceptionCamera> sensorCameras = new List<PerceptionCamera>();

    [Header("Perception Algorithms")]
    public bool enableObjectDetection = true;
    public bool enableSLAM = true;
    public bool enableFeatureExtraction = true;

    void Start()
    {
        InitializePerceptionPipeline();
    }

    void InitializePerceptionPipeline()
    {
        if (mainCamera == null)
        {
            mainCamera = FindObjectOfType<PerceptionCamera>();
        }

        if (sensorCameras.Count == 0)
        {
            sensorCameras.AddRange(FindObjectsOfType<PerceptionCamera>());
        }
    }

    void Update()
    {
        if (enableObjectDetection)
        {
            ProcessObjectDetection();
        }

        if (enableSLAM)
        {
            ProcessSLAM();
        }

        if (enableFeatureExtraction)
        {
            ProcessFeatureExtraction();
        }
    }

    void ProcessObjectDetection()
    {
        // Simulate object detection on camera images
        Texture2D cameraImage = mainCamera.GetRGBImage();
        if (cameraImage != null)
        {
            // Apply object detection algorithms;
            // this would typically use computer vision libraries
            List<DetectedObject> objects = DetectObjects(cameraImage);

            // Publish results for visualization or further processing
            VisualizeDetections(objects);
        }
    }

    void ProcessSLAM()
    {
        // Simulate SLAM processing, typically using
        // point cloud data from depth cameras
        float[,] depthData = mainCamera.GetDepthData();
        if (depthData != null)
        {
            ProcessSLAMFromDepth(depthData);
        }
    }

    void ProcessFeatureExtraction()
    {
        // Extract visual features from camera images
        Texture2D image = mainCamera.GetRGBImage();
        if (image != null)
        {
            // Extract features like edges, corners, etc.
            List<FeaturePoint> features = ExtractFeatures(image);

            // Use features for mapping, localization, etc.
        }
    }

    List<DetectedObject> DetectObjects(Texture2D image)
    {
        // Placeholder for an object detection algorithm.
        // A real implementation would use a trained neural network
        // or classical computer vision techniques.
        return new List<DetectedObject>();
    }

    void VisualizeDetections(List<DetectedObject> objects)
    {
        // Visualize detected objects in the Unity scene
        foreach (DetectedObject obj in objects)
        {
            CreateDetectionVisualization(obj);
        }
    }

    List<FeaturePoint> ExtractFeatures(Texture2D image)
    {
        // Placeholder for feature extraction
        return new List<FeaturePoint>();
    }

    void ProcessSLAMFromDepth(float[,] depthData)
    {
        // Placeholder for a SLAM algorithm using depth data
    }

    void CreateDetectionVisualization(DetectedObject obj)
    {
        // Draw a short-lived marker for the detected object
        GameObject viz = GameObject.CreatePrimitive(PrimitiveType.Cube);
        viz.transform.position = obj.position;
        viz.transform.localScale = obj.size;
        viz.GetComponent<Renderer>().material.color = Color.red;
        Destroy(viz, 0.1f); // Temporary visualization
    }
}

[System.Serializable]
public class DetectedObject
{
    public string label;
    public Vector3 position;
    public Vector3 size;
    public float confidence;
}

[System.Serializable]
public class FeaturePoint
{
    public Vector2 pixelPosition;
    public Vector3 worldPosition;
    public float response;
}

Performance Considerations

Optimization Strategies

When creating high-fidelity environments for digital twins:

  1. Level of Detail (LOD): Use different detail levels based on distance (see the sketch after this list)
  2. Occlusion Culling: Don't render objects not visible to cameras
  3. Texture Compression: Use appropriate compression for textures
  4. Light Baking: Bake static lighting to reduce runtime calculations
  5. Object Pooling: Reuse objects instead of creating/destroying frequently
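
As an illustration of the LOD strategy, the following sketch (an illustrative setup, assuming a detailed and a simplified child renderer already exist) wires them into a LODGroup at runtime:

using UnityEngine;

// Illustrative LOD setup: show a detailed renderer up close and a
// simplified one at distance, culling the object when very small.
public class LODSetup : MonoBehaviour
{
    public Renderer highDetailRenderer; // e.g. full robot or furniture mesh
    public Renderer lowDetailRenderer;  // simplified stand-in mesh

    void Start()
    {
        LODGroup lodGroup = gameObject.AddComponent<LODGroup>();

        // Each LOD is active while the object's screen height stays above
        // its transition fraction: high detail down to 50%, low detail
        // down to 10%, culled entirely below that.
        LOD[] lods = new LOD[2];
        lods[0] = new LOD(0.5f, new Renderer[] { highDetailRenderer });
        lods[1] = new LOD(0.1f, new Renderer[] { lowDetailRenderer });

        lodGroup.SetLODs(lods);
        lodGroup.RecalculateBounds();
    }
}

The transition fractions of 0.5 and 0.1 are arbitrary starting points; tune them per object against your perception cameras, since aggressive LOD switching can change what simulated sensors see.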

Performance Monitoring

Monitor performance metrics:

using UnityEngine;

public class PerformanceMonitor : MonoBehaviour
{
    [Header("Performance Settings")]
    public float updateInterval = 1f;
    public bool logPerformance = true;

    private float lastUpdateTime;
    private int frameCount;
    private float accumulatedFrameTime;

    void Start()
    {
        lastUpdateTime = Time.time;
    }

    void Update()
    {
        frameCount++;
        accumulatedFrameTime += Time.unscaledDeltaTime;

        if (Time.time - lastUpdateTime >= updateInterval)
        {
            float fps = frameCount / (Time.time - lastUpdateTime);
            float avgFrameTime = (accumulatedFrameTime / frameCount) * 1000f; // ms

            if (logPerformance)
            {
                Debug.Log($"FPS: {fps:F1}, Avg Frame Time: {avgFrameTime:F1}ms, " +
                          $"Triangles: {GetTriangleCount()}, Renderers: {GetRendererCount()}");
            }

            // Reset counters
            frameCount = 0;
            accumulatedFrameTime = 0f;
            lastUpdateTime = Time.time;
        }
    }

    int GetTriangleCount()
    {
        // The triangles array stores three indices per triangle
        int triangles = 0;
        MeshFilter[] meshFilters = FindObjectsOfType<MeshFilter>();
        foreach (MeshFilter mf in meshFilters)
        {
            if (mf.sharedMesh != null)
                triangles += mf.sharedMesh.triangles.Length / 3;
        }

        return triangles;
    }

    int GetRendererCount()
    {
        // A rough proxy for draw calls; the actual number depends on
        // materials, batching, instancing, etc.
        Renderer[] renderers = FindObjectsOfType<Renderer>();
        return renderers.Length;
    }
}

Example Unity Scene and Rendering Configurations

Complete Scene Configuration

Here's a complete example of a Unity scene configuration for humanoid robot simulation:

using UnityEngine;

public class CompleteRobotScene : MonoBehaviour
{
    [Header("Environment Configuration")]
    public GameObject robotPrefab;
    public Transform spawnPoint;
    public IndoorEnvironment indoorEnv;
    public LightingSetup lightingSetup;

    [Header("Simulation Settings")]
    public float simulationSpeed = 1.0f;
    public bool enablePhysicsSync = true;
    public bool enablePerception = true;

    void Start()
    {
        InitializeScene();
    }

    void InitializeScene()
    {
        // Create environment
        CreateEnvironment();

        // Spawn robot
        SpawnRobot();

        // Set up lighting
        SetupLighting();

        // Initialize perception systems
        if (enablePerception)
        {
            InitializePerception();
        }

        // Set up physics synchronization
        if (enablePhysicsSync)
        {
            SetupPhysicsSync();
        }
    }

    void CreateEnvironment()
    {
        // Create indoor environment
        GameObject envObj = new GameObject("Environment");
        indoorEnv = envObj.AddComponent<IndoorEnvironment>();
        indoorEnv.roomSize = new Vector3(15f, 4f, 15f);
    }

    void SpawnRobot()
    {
        if (robotPrefab != null && spawnPoint != null)
        {
            GameObject robot = Instantiate(robotPrefab, spawnPoint.position, spawnPoint.rotation);
            robot.name = "HumanoidRobot";
        }
    }

    void SetupLighting()
    {
        GameObject lightObj = new GameObject("SceneLighting");
        lightingSetup = lightObj.AddComponent<LightingSetup>();
        lightingSetup.intensity = 1.2f;
        lightingSetup.enableShadows = true;
    }

    void InitializePerception()
    {
        // Add a perception camera to the robot
        GameObject robot = GameObject.Find("HumanoidRobot");
        if (robot != null)
        {
            PerceptionCamera mainCam = robot.AddComponent<PerceptionCamera>();
            mainCam.fieldOfView = 70f;
            mainCam.resolutionWidth = 640;
            mainCam.resolutionHeight = 480;
        }
    }

    void SetupPhysicsSync()
    {
        // Add physics synchronization to the robot
        GameObject robot = GameObject.Find("HumanoidRobot");
        if (robot != null)
        {
            PhysicsSync sync = robot.AddComponent<PhysicsSync>();
            sync.syncRate = 100f; // 100 Hz sync
        }
    }
}

Rendering Settings for Different Use Cases

Configure rendering settings based on the use case:

using UnityEngine;

public enum RenderingMode
{
    Realistic,   // High-quality rendering for visualization
    Performance, // Optimized for real-time performance
    Perception,  // Optimized for sensor simulation
    Training     // Optimized for ML training data generation
}

public class RenderingConfiguration : MonoBehaviour
{
    public RenderingMode currentMode = RenderingMode.Realistic;

    [Header("Rendering Settings")]
    public int targetFrameRate = 60;
    public ShadowQuality shadowQuality = ShadowQuality.All;
    [Range(0, 3)]
    public int textureQualityLimit = 0; // Mipmap limit: 0 = full-resolution textures
    public float lodBias = 1.0f;

    void Start()
    {
        ApplyRenderingConfiguration();
    }

    public void SetRenderingMode(RenderingMode mode)
    {
        currentMode = mode;
        ApplyRenderingConfiguration();
    }

    void ApplyRenderingConfiguration()
    {
        // Apply the shared defaults first, then the mode-specific overrides
        QualitySettings.shadows = shadowQuality;
        QualitySettings.masterTextureLimit = textureQualityLimit;
        QualitySettings.lodBias = lodBias;

        switch (currentMode)
        {
            case RenderingMode.Realistic:
                ApplyRealisticSettings();
                break;
            case RenderingMode.Performance:
                ApplyPerformanceSettings();
                break;
            case RenderingMode.Perception:
                ApplyPerceptionSettings();
                break;
            case RenderingMode.Training:
                ApplyTrainingSettings();
                break;
        }
    }

    void ApplyRealisticSettings()
    {
        QualitySettings.shadowResolution = ShadowResolution.High;
        QualitySettings.shadowDistance = 50f;
        QualitySettings.shadowCascades = 4;
        QualitySettings.anisotropicFiltering = AnisotropicFiltering.Enable;
        QualitySettings.lodBias = 2.0f;
        Application.targetFrameRate = targetFrameRate;
    }

    void ApplyPerformanceSettings()
    {
        QualitySettings.shadowResolution = ShadowResolution.Low;
        QualitySettings.shadowDistance = 20f;
        QualitySettings.shadowCascades = 1;
        QualitySettings.anisotropicFiltering = AnisotropicFiltering.Disable;
        QualitySettings.lodBias = 0.5f;
        Application.targetFrameRate = 30; // Lower target for performance
    }

    void ApplyPerceptionSettings()
    {
        QualitySettings.shadowResolution = ShadowResolution.Medium;
        QualitySettings.shadowDistance = 30f;
        QualitySettings.shadowCascades = 2;
        QualitySettings.anisotropicFiltering = AnisotropicFiltering.Enable;
        QualitySettings.lodBias = 1.0f;
        Application.targetFrameRate = 30; // Match typical camera frame rates
    }

    void ApplyTrainingSettings()
    {
        QualitySettings.shadowResolution = ShadowResolution.Medium;
        QualitySettings.shadowDistance = 40f;
        QualitySettings.shadowCascades = 2;
        QualitySettings.anisotropicFiltering = AnisotropicFiltering.Enable;
        QualitySettings.lodBias = 1.0f;
        Application.targetFrameRate = 60; // Higher frame rate for faster data generation
    }
}

Prerequisites

To effectively work with Unity for high-fidelity digital twin environments, you should have:

  • Basic understanding of Unity development environment
  • Familiarity with C# programming
  • Knowledge of 3D graphics concepts (vertices, materials, lighting)
  • Understanding of robotics concepts (coordinate systems, transformations)
  • Experience with physics simulation (helpful but not required)

This chapter provides the foundation for creating high-fidelity environments in Unity that can be integrated with physics simulation from Gazebo and sensor simulation for comprehensive digital twin applications in humanoid robotics.