Platform One – Technical Information and Download

T.E.A.M. is a research project created to facilitate the design of kinetic projects and components through a computational design process. Time is the ingredient that allows dynamism. The Platform One application, the core of this research project, was envisioned around two key features: first, everything modified and developed in the VE (Virtual Environment) retains its geometric characteristics, allowing the user to obtain an informed 3D model at the end of the process; second, the ease and enjoyment with which the user manipulates complex dynamic geometries in the three-dimensional environment, through a natural interface design approach focused on direct manipulation of architectural objects and components. The simulator is designed to be used in a 6DOF virtual environment with a commercial VR headset. It currently ships with several archetypal test architectures and is now available to designers who want to test their work with it.

1. Built with

“Platform One” is built with Unity as its game engine. This choice was driven mainly by the availability of ready-to-use VR technologies, which helped us in the early prototyping phase of the research. As development progressed, we developed our own custom Components and built a custom framework on top of Oculus’, especially for the hand-tracking feature, which was still experimental when the research project started.

Another framework that helped us in the early stage, and from which we borrowed some elements, is Microsoft’s open-source Mixed Reality Toolkit (MRTK): the high customizability of each Component allowed us to further change and adapt some scripts to our own needs. A key role was played by the lightweight yet customizable shaders of the MRTK: the abstract feeling of an ethereal virtual world beyond this reality was achievable thanks to the framework’s custom shaders. They also helped us keep performance high, given the limited resources of the first-generation Oculus Quest.

2. Getting started

Prerequisites

“Platform One” requires at least Unity 2020.3.9f to build and run. The Android build support module (both Android SDK and OpenJDK) must be installed with Unity in order to build for the Oculus Quest platform (which is an Android-based device). All other required frameworks are already included in the project.

Installation

  1. Download or clone the “PlatformOne” repository.
  2. Open the Unity Hub launcher. Select the Projects tab and choose Open > Add project from disk.
  3. Select the folder called “PlatformOne” containing the entire Unity project and open it.
  4. The first opening may take a while, since Unity has to build the local Library cache of assets, compiled shaders, and so on.
  5. After recompiling, the project opens directly with the PlatformOne scene as the primary scene in the editor.

3. Usage

Add new custom geometries

Any geometry in the Platform One environment is the representation of a Rhino/Grasshopper parametrized mesh in a form that Unity can visualize as a pseudo-animated mesh. The detailed process is described on the T.E.A.M. project website.

To add your own “animated” mesh, export each frame of the Rhino animated mesh as a separate .fbx file and import the files into Unity. After that, follow the steps below:

  1. Create a new empty Prefab: you can name it whatever you want, but be sure to append the word “Sequence” to its name.
  2. Add each imported .fbx as child of the newly created prefab.
  3. Each .fbx should appear as a compound GameObject: a parent with a Transform component only, and a child called “Mesh” with the actual Mesh Filter and Mesh Renderer components.
  4. At this point your prefab should be structured as follows: the prefab parent, with a number of children equal to the number of animation frames of your custom mesh, each with a sub-child called “Mesh”. Add to each direct child of the main prefab an Animator component and a Bolt Flow Machine component: set the Controller property of the Animator to “DockableCone”, and the Macro of the Flow Machine to “MeshSequenceAnimationListener”.
  5. To see what a final prefab should look like, open the Prefab folder inside Assets: the Sequences folder inside it contains our built example prefabs. Use them as a reference to replicate the exact structure for your own prefab.
  6. In order for your custom prefab to appear at runtime, select the ImportModule GameObject in the Hierarchy panel. In the Geometry Mesh Sequence Set Module component, add a new entry to the Mesh Sequence Geometry Prefabs list and select your custom prefab.
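The structure the steps above describe can be sketched as a small validation routine. This is an illustrative Python sketch of the naming and hierarchy conventions, not Unity code: the dictionary keys and the `validate_sequence_prefab` helper are assumptions made for this example.

```python
def validate_sequence_prefab(prefab):
    """Check a prefab (modeled as a dict) against the structure in steps 1-4."""
    # Step 1: the prefab name must end with "Sequence".
    if not prefab["name"].endswith("Sequence"):
        return False
    # Step 4: each direct child is one animation frame.
    for frame in prefab["children"]:
        # Each frame needs an Animator with the "DockableCone" controller and
        # a Bolt Flow Machine with the "MeshSequenceAnimationListener" macro.
        if frame.get("animator") != "DockableCone":
            return False
        if frame.get("flow_machine") != "MeshSequenceAnimationListener":
            return False
        # Step 3: each frame has a sub-child called "Mesh".
        if not any(c["name"] == "Mesh" for c in frame["children"]):
            return False
    return True

# A minimal one-frame example (names are illustrative).
example = {
    "name": "MyFacadeSequence",
    "children": [
        {"name": "Frame_000", "animator": "DockableCone",
         "flow_machine": "MeshSequenceAnimationListener",
         "children": [{"name": "Mesh"}]},
    ],
}
print(validate_sequence_prefab(example))  # True
```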

Customize the environment

Many aspects of the experience can be customized, since many features have been designed to be highly configurable:

  1. Some parameters can be changed directly in Unity, since they have been exposed as public in their related scripts and can be found under the corresponding script component.
  2. Other parameters can be changed only in code, since they build up a more complex structure related to many different behaviors (each relevant script has its own documentation).

Build your package

Any running build packaged from Unity has to be exported as an Android Package (.apk). The project should already be properly configured. Otherwise, check in Build Settings that:

  1. Android is selected as target platform. If not, select Android.
  2. Under the Android settings on the right side of the panel, Texture Compression is set to ASTC.
  3. If any setting has been changed, confirm the changes by selecting Switch Platform. An asset-conversion process starts; it may take some time depending on your computer’s hardware.

If the Android platform is properly set, you should be able to build your own package: open the Build Settings panel, select Build (or Build And Run if your Oculus is connected to your computer and configured in the Oculus app), choose an output location on your computer, and wait for the build process to complete. After that, you can upload the .apk to your Oculus Quest using the Command Prompt (on Windows) or Terminal (on Mac), or apps such as SideQuest or the Oculus Developer Hub.

IMPORTANT: in order to upload your packages, Developer Mode for your Oculus Quest must be enabled!
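The command-line upload mentioned above typically uses `adb install`. A minimal sketch, assuming a connected Quest with Developer Mode enabled (the .apk path below is illustrative); the helper only builds the command, so it can be inspected before running:

```python
def adb_install_command(apk_path, reinstall=True):
    """Build the adb command used to sideload an .apk onto the Quest."""
    cmd = ["adb", "install"]
    if reinstall:
        cmd.append("-r")  # replace an already-installed package
    cmd.append(apk_path)
    return cmd

# Dry run: print the command instead of executing it.
print(" ".join(adb_install_command("Builds/PlatformOne.apk")))
# To actually run it with a Quest connected:
# import subprocess
# subprocess.run(adb_install_command("Builds/PlatformOne.apk"), check=True)
```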

4. Implementation

The VE experience is made up of many different components and specialized managers that handle every aspect of the experience. Because of the research aim of the project, some initial classes have been extended or replaced with more specialized versions, sometimes created from scratch, sometimes built on the existing ones. These classes are still present in the project because they are the building blocks of the final version, although some of them may no longer be used.

In this section we explore the general execution process of a running build of Platform One, in order to understand how the entire application works and which components play a central role in the key features of the experience.

When the application starts its execution, all Manager GameObjects are activated and start their processes:

  • GameManager: coordinates with the TimeManager to control time inside the experience, to properly set up the camera, and to manage the general execution of the application.
  • UIManager: controls any interaction with the buttons of the control panel (such as showing the Dock, changing the reality mode, showing the Navigation panel or the Grid visualization), recording events fired by the user’s hands and calling methods on specific objects. The Label Visualization mode, if active, reads values stored in specific components and prints them near the corresponding handle or interactive element in a diegetic UI text element.
  • TimeManager: composed of many other components, each related to a specific aspect of the environment (such as fog color, cloud opacity, skybox material), and controlling how these elements behave over time. The TimeManager also activates the digital clock: it sets the light position and rotation according to the current time and can modify its update frequency when the user interacts with the clock, increasing or reducing the light motion speed.
  • ImportModule: loads any specified geometry and places it in the Selection Dock. Any geometry passed to the ImportModule is created inside a specific structure (which we called the Centroid) that allows interaction with hands, the ability to be docked, projected or manipulated, and the use of an encapsulated reference system.
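The TimeManager's mapping of clock time to light rotation can be sketched with simple arithmetic. This is an assumption made for illustration (a linear mapping of 24 hours onto a full 360-degree rotation around one axis), not the project's actual implementation:

```python
def sun_rotation_degrees(hours, minutes=0):
    """Map a time of day to a light rotation angle (linear day/angle mapping)."""
    day_fraction = (hours + minutes / 60.0) / 24.0
    return (day_fraction * 360.0) % 360.0

print(sun_rotation_degrees(12))  # 180.0 (midday -> half a rotation)
print(sun_rotation_degrees(6))   # 90.0
```

Speeding up or slowing down the light motion, as the clock interaction does, amounts to advancing this input time faster or slower than real time.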

After the necessary elements of the environment are set and start working in the background, every other feature is triggered by specific user actions.

When the user chooses to visualize the Dock, the Dock object appears and follows the user. The Dock is composed of different Plate types, each of which defines how the custom geometry structure (the Centroid) interacts with it:

  • The leftmost Plate is the Loading Plate: here all pre-loaded geometries are visualized in a circular buffer structure. From this Plate, geometries can be grabbed and dragged onto a Preparation Plate, the Projection Plate, or the TrashBin Plate.
  • The central Plates are the Preparation Plates: geometries can be released there and instantiated as new GameObjects, preserving the original in the Loading Plate so multiple copies can be made. This kind of Plate simply holds a geometry, waiting for it to be dragged onto another Plate.
  • The rightmost Plate is the TrashBin Plate: it properly destroys any instantiated geometry.
  • The biggest Plate on the right is the Projection Plate: when a geometry is released onto it, the Plate communicates with the ObjectPlaceholder object and instantiates a 1:1-scale copy of the docked geometry. A specific structure is built around the docked geometry that allows real-time transformation: any change to the rotation, scale or location of the object is copied to the actual-size projection. As soon as a new geometry is placed on the Projection Dock, the SliderManager activates and creates a new slider directly connected to the animation of that specific geometry (the slider shows only the mesh at the specified index and hides all the others, giving the illusion that the geometry is animated).
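The slider-driven "animation" illusion described in the last bullet can be sketched as follows. This is an illustrative Python model (the `MeshSequence` class and its names are assumptions, not the project's actual API): only the mesh at the requested frame index is visible at any time.

```python
class MeshSequence:
    """Models a sequence of per-frame meshes, one visible at a time."""

    def __init__(self, frame_count):
        self.visible = [False] * frame_count
        self.show_frame(0)  # start on the first frame

    def show_frame(self, index):
        # Hide every mesh, then enable only the one at the slider index.
        for i in range(len(self.visible)):
            self.visible[i] = (i == index)

seq = MeshSequence(4)
seq.show_frame(2)          # slider moved to frame 2
print(seq.visible)         # [False, False, True, False]
```

Dragging the slider simply calls `show_frame` with successive indices, which reads as motion even though no mesh is actually deforming.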

Pressing the Freeze Button, which locks the projected geometry in the desired position, activates a series of components and systems handled by the FrozenGeometriesManager, which:

  • Instantiates a new copy of the projected geometry with the exact same Transform values
  • Copies the Transform values of the docked geometry into a self-handled structure, then undocks and hides the docked geometry in the Projection Plate
  • Enables the unfreeze pointer (handled by the MATKPointerManager), which recognizes whenever it touches a frozen geometry (using a specific tag)
  • If the user triggers the specific gesture while the pointer is selecting a frozen geometry, the original geometry is enabled and docked in the Projection Dock using the saved Transform values, and the projected geometry automatically comes back, replacing the frozen one, which is destroyed

The Centroid is a custom prefab which encapsulates a custom reference system. When the geometry is docked onto the Projection Plate, it considers the center of the mini platform as its new origin (0,0,0), and any change to its position, rotation or scale must be referenced to that center. The Centroid becomes the geometry’s pivot point and measures offsets from the mini-scene center as if they were offsets of a GameObject’s pivot point from the World origin. Any change in this mini-scene system is then mirrored to the projected geometry, which uses the World origin as its actual origin point.
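The Centroid's mapping can be sketched with simple vector arithmetic: an offset from the mini-scene center becomes the same offset, scaled, from the World origin. The 1:10 dock-to-world scale factor below is an assumption made for illustration, not the project's actual value.

```python
def mirror_to_world(local_pos, mini_center, dock_to_world_scale=10.0):
    """Map a docked geometry's position to its full-scale projected position."""
    # Offset from the mini-platform center, scaled up and re-based on the
    # World origin (which the projection uses as its actual origin point).
    return tuple(
        (p - c) * dock_to_world_scale
        for p, c in zip(local_pos, mini_center)
    )

# A geometry 0.1 m to the right of the mini-platform center projects
# 1 m to the right of the World origin at an assumed 1:10 scale.
print(mirror_to_world((0.1, 0.0, 0.0), (0.0, 0.0, 0.0)))  # (1.0, 0.0, 0.0)
```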

5. License

Copyright (c) 2022 Poplab srl

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

6. Download

Here is the link to GitHub. You can clone or fork the project, but we won’t accept any pull requests. Feel free to explore the source and implement your own solutions:

7. Research Paper

T.E.A.M. presented part of its research results in an oral presentation at CISBAT, a scientific conference held at the École polytechnique fédérale de Lausanne (EPFL) in September 2021. The published paper, “Time Enhanced Architectural Modelling (T.E.A.M.): Virtual reality project for the planning and visualization of kinetic architecture and dynamic design”, is available here: https://iopscience.iop.org/article/10.1088/1742-6596/2042/1/012072

For any information regarding the projects and their implementation, please write to us at team@poplab.cc