HXRC Trainee Lens: Espoonlahti Health Centre Visualization in Cluster

Jan 8, 2026

Last autumn, Helsinki XR Center developed a Proof of Concept (PoC) for the Espoonlahti Health Centre in Cluster’s Metaverse platform, as part of the AI Start project in collaboration with the City of Espoo and Cluster in Japan. The goal was to explore how urban digital twins, XR, and AI-supported workflows can turn safety communication and citizen participation into an engaging, social experience. 

In this blog post, our HXRC trainees Ella, Meiju, and Andrii share their own perspective on how the project came together and what they learned along the way.

Text by Ella Räntilä, Meiju Alihaanperä and Andrii Deshko.

This project is a 3D visualization of the newly renovated Espoonlahti Health Centre. Our client, the City of Espoo, wanted an easy and approachable way to view the health centre, with the additional goal of visualizing the building’s emergency exits. The building was uploaded to the Cluster platform, where you can explore the centre using a mobile device, PC, or VR headset. We had only a month to work on this project, but we managed to create a smooth experience with some extra fun features for the players. The team included three trainees: Meiju Alihaanperä and Ella Räntilä as 3D Artists and Andrii Deshko as the Coder.

Trainee introduction: Ella Räntilä

I’m Ella Räntilä, and I’m a 3D Artist trainee at Helsinki XR Center. I study 3D Animation and Visualization at Metropolia UAS and will hopefully graduate next spring. This AI Start – Cluster – City of Espoo collaboration was my first project at HXRC, and my role was to optimize the original CAD models, model new props, create textures and shaders, and combine them in Unity.

Trainee introduction: Meiju Alihaanperä

I’m Meiju, a fellow 3D Artist trainee and rigging enthusiast. Alongside working on my thesis project, I started at Helsinki XR Center during my 4th year at Metropolia University of Applied Sciences. I mostly specialize in Blender, but for rigging I use Maya. My first project task was to clean up the building mesh by deleting unnecessary faces and objects. The cleanup was time-consuming but simple, and I enjoy these kinds of tasks very much. Since this was my first time working in a professional environment, this project was a great way for me to test the waters and slowly learn my part in a work environment.

Trainee introduction: Andrii Deshko

Hi, I am Andrii Deshko. I am a Software Developer trainee here at Helsinki XR Center, and a 4th year Software Engineering student at Metropolia UAS. This was my first project at HXRC and my first experience working on a full-scale Unity project for the Metaverse. I was responsible for adding functionalities to the scene and making them work properly in Cluster.

What is Cluster?

Cluster is one of the biggest Metaverse platforms in Japan, where people can create virtual worlds for different purposes. People from all over the world can gather in a digital space to play games and socialize. It’s easy to use on multiple devices, such as smartphones, PCs, and VR headsets, and anyone can create and upload their own world there. The platform was new to all of us and took some time to explore, yet the process of uploading a new world for others to enjoy was very simple. One big difficulty was that most of the documentation and forum discussions were in Japanese.

A view of the Cluster homepage.

Optimizing and creating props

There was already an existing BIM (Building Information Model) of the Espoonlahti Health Centre that we used, including terrain and a lot of furniture and props. However, it was not created for use in game engines or VR. It was quite heavy and disorganized, so it could not be used in Cluster without optimization. We started by deleting unnecessary props and cleaning up the meshes of the ones we decided to keep. All models needed a low triangle count so that the world would run smoothly in Cluster. The work was quite straightforward, with no big problems. Cleaning up the old mesh was a time-consuming puzzle, but relaxing in a way. For 3D modeling, we used Blender.

Next, Meiju will walk us through the main building optimization and reflect on her experience working on the project. 

At first, the face count was over 490 000, but after a thorough cleanup it dropped to a little over 190 000. The mesh had many unnecessary layers of faces and edges – not to mention everything was connected into a single mesh – making it difficult to clean. I started by breaking it down into sections by rooms and walls, then slowly cleaned my way from the lowest floor to the highest.

The task itself was simple but time-consuming; personally, I found it enjoyable, like piecing a puzzle together. I learned more efficient ways to clean up meshes. For a first-timer on an official project like this, I’m proud of how much I got done in a little over a month.

Before and after: a comparison of the original and optimized building models.

After testing that the optimized model ran smoothly in Cluster, we were able to add some furniture back and create some new pieces. When creating new props, I had to keep the models fairly simple and avoid adding much detail, since detail would make the virtual world run slower. Some props were necessary to help players navigate the space, such as information signs, while others were added purely for visual appeal, like indoor plants and seating areas.

 

Additional props created or cleaned up from the old file.

For textures, we used a texture atlas that was applied to almost all of the models, minimizing the number of draw calls. It is a convenient technique if you don’t need anything detailed, because you can make one texture and apply it to everything. Using a texture atlas of mainly solid colors was a technique I hadn’t used before. I created a 1024 × 1024 texture in Photoshop containing all the needed colors, and when UV unwrapping the models, we placed each face over the correct color. I also created texture maps for the different roughness values and emissive parts.

 

Texture atlas used for the majority of props and environments from left to right: color, roughness, and emission.
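To sketch how a solid-color atlas is addressed: if the 1024 × 1024 texture is divided into a uniform grid of color tiles (an assumption for illustration; the actual atlas layout may be freeform), the UV center of each tile can be computed like this:

```javascript
// Map a tile in a uniform N x N color atlas to normalized UV coordinates.
// Tile (0, 0) is the bottom-left corner, matching Blender's UV origin.
function atlasTileUV(col, row, gridSize) {
  const tile = 1 / gridSize;   // size of one tile in UV space (0..1)
  return {
    u: (col + 0.5) * tile,     // horizontal center of the tile
    v: (row + 0.5) * tile,     // vertical center of the tile
  };
}
```

Snapping all UVs of a face to a tile center like this keeps every model on a single material, which is what cuts the draw calls down.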

Working in Unity

Then we started combining all the components in Unity. All models had to be imported and assigned the correct materials. The 3D models were turned into prefabs, to which we also added colliders and other necessary components. Next, I added lights and reflection probes. The lights were baked, because real-time lighting is heavier. Everyone imported the items they had worked on, and finally everything was combined into the master scene.

This was a nice project to start with. Communication with the other team members went very well. I got to do a variety of familiar tasks and improve at them, but also learned new things about optimization and VR. I think I did pretty well in this project considering the time limit.

Adding features

As mentioned previously, the main goal was to visualize the centre and create an emergency exit simulation. Players can explore the building and start the simulation by pressing the fire alarm buttons located on the walls. When a button is triggered, guiding arrows leading to the nearest exit become visible. All players must exit the building and gather at the meeting point sign. After implementing the simulation, we had time to add some more fun features. The doors in the accessible area were made usable: players can click doors to open and close them, and the sliding doors react when a player is close enough. Lastly, we added some functionalities to the chairs and the X-ray room.

 

Pictures from the experience in Cluster. On the left: visualizing the emergency exit. On the right: players socializing in the cafeteria.
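The sliding doors’ proximity behaviour boils down to a distance check. A minimal sketch in plain JavaScript, assuming a 2 m trigger radius (the actual value used in the project is not documented here):

```javascript
const OPEN_RADIUS = 2.0; // metres; assumed trigger distance

// A sliding door opens if any player is within OPEN_RADIUS of it,
// measured on the horizontal (x/z) plane.
function shouldSlideOpen(playerPositions, doorPosition) {
  return playerPositions.some((p) =>
    Math.hypot(p.x - doorPosition.x, p.z - doorPosition.z) <= OPEN_RADIUS
  );
}
```

In Cluster itself this check would be driven by the Creator Kit’s trigger components rather than polled by hand.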

Next, Andrii will share more about the coding side, uploading the world to the Cluster platform, and key takeaways from his work on the project.

Coding the experience and uploading to Cluster

Cluster has its own constraints: you cannot simply attach C# scripts to objects. Instead, it uses JavaScript-based ClusterScript, along with Triggers and Gimmicks from the Cluster Creator Kit (CCK), to configure interactions via Unity components.

My tasks included opening doors (manual and automatic), enabling users to sit on chairs, allowing players to lie on the X-ray bed to take images, and activating the fire alarm. Implementing the door and sitting logic was straightforward but required individual setup for each object.

Implementing the sitting logic.

Moreover, I downloaded animations from the Mixamo service to add variation to the sitting positions and to create the lying animation for the X-ray table.

Sitting and lying animations used in the simulation.

 

The X-ray functionality used CCK’s Gimmicks, Triggers, and Animations from Unity to show images when the scan button was pressed.

The most important and complex task was implementing the fire alarm in the building. The goal consisted of multiple tasks:

  • The alarm sound turns on when the alarm button is pressed
  • Guideline arrows appear on the floor while the alarm is ringing
  • The alarm sound can be turned off manually by clicking the alarm button
  • A timeout prevents the alarm from being turned back on for 5 seconds after being turned off
  • The alarm sound stops ringing when all avatars leave the building
  • Guideline arrows disappear when the alarm sound is off

To implement all of these functionalities, I had to figure out how to combine multiple CCK Global Logic components, Triggers, and a Global Timer so that they toggled all the necessary values correctly. To turn off the alarm automatically when everyone has left the building, I used a Box Collider, CCK’s Overlap Detector, and a Global Logic component, combined with a custom ClusterScript to track avatars and toggle the alarm and guideline arrows.

Implementing the fire alarm functionalities.
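The alarm requirements amount to a small state machine. The following plain-JavaScript sketch mirrors that logic outside of Cluster (in the actual world it is spread across CCK Global Logic components, a Global Timer, and a ClusterScript); the class, its names, and the timestamp-based lockout are illustrative assumptions:

```javascript
const LOCKOUT_MS = 5000; // alarm cannot be re-armed for 5 s after turning off

class FireAlarm {
  constructor() {
    this.ringing = false;
    this.lockedUntil = 0; // timestamp until which re-arming is blocked
  }

  // Called when a wall button is pressed; `now` is a millisecond timestamp.
  // Returns whether the alarm (and the guideline arrows) should be active.
  pressButton(now) {
    if (this.ringing) {
      // Manual turn-off starts the 5-second lockout.
      this.ringing = false;
      this.lockedUntil = now + LOCKOUT_MS;
    } else if (now >= this.lockedUntil) {
      this.ringing = true;
    }
    return this.ringing;
  }

  // Called whenever the overlap detector reports the avatar count inside.
  onAvatarCount(count) {
    if (count === 0) this.ringing = false; // everyone left: stop the alarm
    return this.ringing;
  }
}
```

Keeping the arrows tied to the same boolean as the sound guarantees the last two bullet points for free: the arrows are visible exactly while the alarm is ringing.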

Finished simulation on Cluster.

All in all, this project was an extremely valuable learning experience. It was my first time collaborating directly with 3D artists and working with Unity in the Metaverse. Collaboration with non-coding team members went smoothly, thanks to clear communication and mutual support throughout the project.

Despite obstacles such as most of the documentation being in Japanese, I successfully implemented all the functionalities and felt a strong sense of accomplishment when the project was completed.

And now, you can try out the world yourself here! The platform supports smartphones, PCs, and VR headsets. 

 

Final thoughts

This was a nice first project for all of us interns. We got to do some familiar tasks but also learned some new things. We also filmed a demo video to showcase the finished product that you can see below. Take a look!

Key Stakeholders and Collaboration

The AI Start Incubator is an EU co-funded program led by Haaga-Helia UAS, with the cities of Helsinki, Espoo, and Vantaa and Metropolia UAS (Helsinki XR Center) as partners. It helps SMEs in the Helsinki region adopt AI through hands-on support, tailored training, and access to diverse testing platforms, and forms part of the wider HEVinnovations program to strengthen regional innovation. The Espoonlahti Health Centre project was initiated with the City of Espoo in collaboration with Cluster in Japan.


For more information about the project, contact Janina Rannikko (janina.rannikko@metropolia.fi) or Janset Shawash (janset.shawash@metropolia.fi). 

AI Start Incubator is a collaboration of:

To see previous news about our trainees’ projects, head over to the Trainee news section.

Follow us on social media for more posts: Facebook | LinkedIn | Twitter | Instagram