
Coordinate System(s)

Experior's Global Coordinate System

Experior implements a left-handed, y-up coordinate system for its 3D environment, as illustrated in the figure below, which shows the view presented to the user when Experior is opened.
The x-axis is considered the forward vector in Experior.
The y-axis is considered the up-vector in Experior.
For some models implemented in Experior, the axes are also named as follows:
  • x-axis = “Length”
  • y-axis = “Height”
  • z-axis = “Width”
This coordinate system is never moved.
Experior's global coordinate system - left-handed, y-up.
Assemblies are placed in the global coordinate system, whereas individual parts in an Assembly are placed relative to the Assembly’s local coordinate system.
An Assembly's global position in Experior (Orange dot - (1000mm, 410mm, 1000mm)). Conveyor sides (Boundaries) are objects added to the Assembly, hence they have a local position relative to the Assembly's global position.
To illustrate the concept in another way, the following images show an Assembly with two Parts (a Red and a Blue box).
In the left image, the Assembly is placed at global coordinate (0, 0, 0). The Parts are placed at local coordinates – the Red box at (0, 0, 0) and the Blue box at (0, 2, 0). The Blue box also has a local Pitch rotation applied.
In the right image, the Assembly is placed at global coordinate (4, 0, 3), and the individual Parts retain their local coordinate offsets.
An Assembly with 2 Parts (Red/Blue Boxes) placed at global position (0, 0, 0).
An Assembly with 2 Parts (Red/Blue Boxes) placed at global position (4, 0, 3), retaining their local offset positions.
Each Part in an Assembly also has its own local coordinate system, which is mostly relevant for rotations/orientations of the Part around its local center point, as illustrated above with local Pitch rotation applied to the blue Box Part.
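To get a feeling for how this looks in model code, here is a minimal sketch of an Assembly with two box Parts. It assumes an Experior.Core.Parts.Box type with a (color, length, height, width) constructor and LocalPosition/LocalPitch members on Parts; these details, and the class name TwoBoxes, are assumptions for illustration rather than verified API signatures.
using System.Numerics;
using Experior.Core.Assemblies;

public class TwoBoxes : Assembly
{
    // The Box type, its constructor and the LocalPosition/LocalPitch members are assumptions for this sketch.
    private Experior.Core.Parts.Box redBox, blueBox;

    public TwoBoxes(AssemblyInfo info) : base(info)
    {
        redBox = new Experior.Core.Parts.Box(System.Windows.Media.Colors.Red, 0.5f, 0.5f, 0.5f);
        blueBox = new Experior.Core.Parts.Box(System.Windows.Media.Colors.Blue, 0.5f, 0.5f, 0.5f);

        // Parts are positioned relative to the Assembly's local coordinate system.
        Add(redBox);                                   // Red box stays at local (0, 0, 0)
        Add(blueBox);
        blueBox.LocalPosition = new Vector3(0, 2, 0);  // Blue box offset to local (0, 2, 0)
        blueBox.LocalPitch = 0.4f;                     // Local Pitch rotation (radians), as in the figure
    }
}
Moving the Assembly itself, for instance to global coordinate (4, 0, 3) via its Position property (again an assumption about the Assembly API), moves both boxes while preserving their local offsets.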

Custom Coordinate System

It is also possible to add additional coordinate systems to an Assembly, which allows for advanced control of machinery. Note that the left-handed axis convention remains the same as in the global coordinate system and cannot be changed.
The following section refers to the “Robot” Assembly example in the DeveloperSamples Catalog, which can be obtained here \\TODO: GitHub repository link. The example contains interactions to manipulate each individual mesh Part. \\TODO: Insert ref link to DeveloperSamples Catalog
Robot picker with additional coordinate systems added for each joint, allowing for individual control of each part of the robot.
The CoordinateSystem class resides in the Experior.Core.Assemblies namespace. Adding a coordinate system to an Assembly is fairly simple, as shown here:
using Experior.Core.Assemblies; // Needed for CoordinateSystem and AssemblyInfo refs
using System.Numerics; // Needed for Vector3 ref

public class Robot : Assembly
{
    public CoordinateSystem Coordinate0, Coordinate1, Coordinate2, Coordinate3, Coordinate4;
    private Experior.Core.Parts.Mesh link0, link1, link2, link3, link4, link5;
    
    public Robot(AssemblyInfo info) : base(info)
    {
        // Signature: CoordinateSystem(CoordinateSystem subSystem, Vector3 localPosition)
        // Defines a new coordinate system with another coordinate system as its child object (null when there is no child)
        // and places it at the given local position.
        // Local position here means relative to the parent CoordinateSystem,
        // meaning Coordinate1's localPosition (0.32f, 0.75f, 0.0f) is relative to Coordinate0's position, and so on.
        Coordinate4 = new CoordinateSystem(null, Vector3.Zero);
        Coordinate3 = new CoordinateSystem(Coordinate4, new Vector3(1.16f, 0.2f, -0.02f));
        Coordinate2 = new CoordinateSystem(Coordinate3, new Vector3(0.0f, 1.1f, 0.0f));
        Coordinate1 = new CoordinateSystem(Coordinate2, new Vector3(0.32f, 0.75f, 0.0f));
        Coordinate0 = new CoordinateSystem(Coordinate1, new Vector3(0.0f, 0.0f, 0.0f));
        
        // CoordinateSystem.AddSubSystem() method can also achieve addition of coordinate subsystems.
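        // For example (assumed usage, equivalent to passing the child in the constructor above):
        // Coordinate3.AddSubSystem(Coordinate4);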
        
        // Create new instances of a robot "link" mesh
        link0 = new Experior.Core.Parts.Mesh(Common.EmbeddedResourceLoader.Get("link0Centered"));
        link1 = new Experior.Core.Parts.Mesh(Common.EmbeddedResourceLoader.Get("link1Centered"));
        link2 = new Experior.Core.Parts.Mesh(Common.EmbeddedResourceLoader.Get("link2Centered"));
        link3 = new Experior.Core.Parts.Mesh(Common.EmbeddedResourceLoader.Get("link3Centered"));
        link4 = new Experior.Core.Parts.Mesh(Common.EmbeddedResourceLoader.Get("link4Centered"));
        link5 = new Experior.Core.Parts.Mesh(Common.EmbeddedResourceLoader.Get("link5Centered"));
        
        // Add the mesh parts to the assembly
        Add(link0);
        Add(link1);
        Add(link2);
        Add(link3);
        Add(link4);
        Add(link5);
        
        // Signature: Add(RigidPart part, Vector3 localPosition)
        // Adds the Parts to the right coordinate system, and positions them using local coordinates
        Coordinate0.Add(link0, new Vector3(-0.17f, 0.15f, 0.0f));
        Coordinate1.Add(link1, new Vector3(0.025f, 0.6f, -0.025f));
        Coordinate2.Add(link2, new Vector3(0, 0.6f, -0.2f));
        Coordinate3.Add(link3, new Vector3(-0.03f, 0.10f, 0));
        Coordinate3.Add(link4, new Vector3(0.75f, 0.2f, -0.03f));
        Coordinate4.Add(link5, new Vector3(0, 0, 0));
        
        //Add the coordinate systems to the assembly
        Add(Coordinate4);
        Add(Coordinate3);
        Add(Coordinate2);
        Add(Coordinate1);
        Add(Coordinate0);
        // The coordinate system with no parent system is also the coordinate system for the Assembly.
        // It must be added last because adding it updates all the Parts and subsystems.
    }
}
Since each Part has been added to its own coordinate system, each Part can be transformed individually. Transforming a coordinate system also transforms all of its child objects. For instance, a transformation applied to Coordinate0 affects all mesh Parts in its child coordinate systems, whereas a transformation applied to Coordinate3 only affects the final joint (the yellow “magnet”), similar to how a robot of this type would move in the real world.
Coordinate systems added and individually transformed to create a robot with joint movements.
To move the Robot into a position as illustrated here, the CoordinateSystem class has a RotateRelativeParent() method. In the interactive example, pressing “A” calls RotateRelativeParent() on Coordinate1, rotating it relative to its parent coordinate system (Coordinate0). The Assembly class’ KeyDown() method is overridden to provide this interactive functionality.
public override void KeyDown(System.Windows.Input.KeyEventArgs e)
{
    float delta = 0.02f;
    
    if(e.Key == System.Windows.Input.Key.A)
        Coordinate1.RotateRelativeParent(delta, 0, 0);
}
Specifically, this method call affects the model as follows:
Robot picker as it is placed in Experior.
How the "RotateRelativeParent" method affects the Robot when "A" is pressed.
While not directly related to coordinate systems, it is worth mentioning that movement of the Robot can also be initiated with the methods LocalAngularMovement() for rotational movement and LocalMovement() for linear movement.
  • LocalAngularMovement(Vector3 Velocity, Vector3 radiansToRotate) – takes a velocity vector and the number of radians to rotate at that speed.
  • LocalMovement(Vector3 Velocity, float lengthToMove) – takes a velocity vector and how far to move in that direction.
// Rotates Coordinate1 one PI around the Y-axis (Yaw rotation) when the M key is pressed,
// and moves Coordinate0 linearly along its velocity direction when the P key is pressed.

public override void KeyDown(System.Windows.Input.KeyEventArgs e)
{
    // Additional e.Key == [...] inputs
    if(e.Key == System.Windows.Input.Key.M)
        Coordinate1.LocalAngularMovement(new Vector3(2, 0, 0), new Vector3((float)Math.PI, 0, 0));
    if(e.Key == System.Windows.Input.Key.P)
        Coordinate0.LocalMovement(new Vector3(2, 0, 0), 2.0f);
}
Result of initiating LocalAngularMovement on the Robot (M key for clockwise direction, N key for counter-clockwise direction).
Result of initiating LocalMovement on the Robot (P key for forward movement, O key for backward movement).
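The snippet above only wires the M and P keys. A plausible way to handle the opposite directions mentioned in the captions (N and O keys) is to negate the rotation and velocity arguments; the sketch below shows that idea under this assumption and is not a copy of the DeveloperSamples implementation.
public override void KeyDown(System.Windows.Input.KeyEventArgs e)
{
    if (e.Key == System.Windows.Input.Key.M)
        Coordinate1.LocalAngularMovement(new Vector3(2, 0, 0), new Vector3((float)Math.PI, 0, 0));
    if (e.Key == System.Windows.Input.Key.N)
        // Assumption: negating the rotation vector reverses the direction of rotation.
        Coordinate1.LocalAngularMovement(new Vector3(2, 0, 0), new Vector3(-(float)Math.PI, 0, 0));
    if (e.Key == System.Windows.Input.Key.P)
        Coordinate0.LocalMovement(new Vector3(2, 0, 0), 2.0f);
    if (e.Key == System.Windows.Input.Key.O)
        // Assumption: negating the velocity vector moves the coordinate system backwards.
        Coordinate0.LocalMovement(new Vector3(-2, 0, 0), 2.0f);
}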