AR / VR / XR Story

Revolutionizing Education: Virtual Lab for applied science


In today’s constantly evolving education landscape, there is a growing demand for innovative solutions driven by the integration of cutting-edge technologies. This demand sparked the inception of the Virtual Lab for applied science by Bijoy Bera, an assistant professor in the Transport Phenomena group of the Applied Sciences Faculty at Delft University of Technology. The project aims to revolutionize how students learn and engage with concepts in transport phenomena through immersive virtual experiences. The original goal was to give students the chance to test physics theories in lab sessions, but the large class size made it unfeasible to conduct real-life experiments for everyone. The VR project therefore began as a pilot initiative for Professor Bera to explore virtual simulations of scientific experiments. To achieve this, he collaborated with developers from the XR Zone of the NewMedia Centre at TU Delft on a mission to enhance the learning experience through immersive technology.

The Virtual Lab idea began with the goal of offering multimodal learning, where students can learn in different ways. Professor Bera’s past methods, like using videos and live demonstrations, laid the foundation for this approach. Now, the Virtual Lab aims to facilitate the exploration of physics transport phenomena through hands-on experiences, implementing the “learning by doing” approach to enable active student engagement while mitigating the risk of laboratory accidents. With the help of developers from the XR Zone, a virtual lab environment was created, enabling students to verify scientific theories from lectures through interactive experiments.



Starting with eight sets in 2022, the project quickly expanded into an interactive platform for hands-on learning. As interest grew, more professors from the Faculty of Applied Sciences became convinced of the possibilities of scientific simulation in VR after seeing the pilot project. Thus, additional experiment sets, including those in biotechnology, were integrated, reflecting a broader trend of applied science faculties embracing cutting-edge technologies.

The impact of the Virtual Lab on student learning was profound. A course survey revealed heightened comprehension of theoretical concepts and an enriched learning experience. Immersed in VR experiments, students bridged the gap between theory and practice, gaining a deeper understanding of the subject matter. Reflecting on the project’s success, Professor Bera noted the transformative potential of VR in education. Despite initial uncertainties, the project proved to be a resounding success, highlighting the accuracy and adaptability of VR simulations.
The students themselves shared positive impressions, finding clarity and connection to the theory they learned through immersive VR experiences. As one student remarked, he could now see where each concept had been mentioned in the lecture, bringing everything into focus. The Virtual Lab for Physics Transport Phenomena integrates VR and 3D technology with education. It marks a shift towards experiential learning, where virtual environments facilitate discovery and interaction.

Following its success, the Virtual Lab stands as proof of Professor Bera’s vision and motivates educators. As we venture further into virtual education, one thing is certain: learning’s potential is boundless, limited only by our imagination.

Initiator: Dr. Bijoy Bera
Coordinator: Arno Freeke
Developer: Yosua Adisapta Pranata Andoko

3D Animation AR / VR / XR Story

Teacher – Classroom Simulator using AI


In collaboration with Hogeschool Fontys, SEC from TU Delft, and XR Zone, we embarked on creating an AI-driven Teacher Classroom Simulator for didactics courses. Our aim was to explore the potential of VR in teaching and provide educators with a virtual training ground. Lecturers apply VR capabilities to implement innovative didactic methodologies for teaching students. Additionally, XR Zone offers an alternative pedagogical solution, leveraging cutting-edge technology to facilitate knowledge transfer within the classroom.

Their didactic approach prioritises practical applications, applying VR to enhance learning experiences.


The project grew from a shared passion for improving teaching through VR. Initially, we focused on education and AI.

The Educational Environment provides immersive classroom experiences with helpful feedback. We tested text prompts and verbal cues to help teachers, using VR to make learning more engaging.

On the AI side, virtual students act like real ones, evaluating teaching skills with voice and behaviour tracking. Two years of development, including a year of refinement, taught us a lot through trials with new teachers. Our goal is to make our platform accessible for others and adaptable to different teaching situations to provide immersive education.

Initiator: Margreet Docter 

Researchers: Tamara De Vries, Alfin Thomas
Developer: Huu Dat Nguyen

3D Animation AR / VR / XR Story Video

Sarah – AI driven MetaHuman

Introducing Sarah, the remarkable AI-driven MetaHuman born from the innovative fusion of diverse projects and cutting-edge AI experimentation. Conceptualised within the dynamic confines of the XR Zone, Sarah represents a groundbreaking leap in technology-human interaction. Her genesis was marked by the challenge of infusing her with unparalleled realism and interactivity, leveraging state-of-the-art tools such as realistic avatar technology and OpenAI’s large language models (LLMs). The goal was to capture and articulate her voice, bridging it with lifelike conversation while ensuring swift responsiveness – where speed emerged as the ultimate frontier.
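The conversational loop described above – capture speech, generate a reply with a language model, then voice it – can be pictured as a chain of pluggable stages. The sketch below is only an illustration, not the XR Zone’s actual implementation; the stage names and the stub engines are hypothetical:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ConversationLoop:
    # Each stage is injected as a callable so real engines (speech-to-text,
    # an LLM API, text-to-speech) could be swapped in; stubs are used below.
    transcribe: Callable[[bytes], str]
    generate: Callable[[list], str]
    speak: Callable[[str], None]
    history: list = field(default_factory=list)

    def handle(self, audio: bytes) -> str:
        """Run one turn: audio in, spoken reply out, history updated."""
        text = self.transcribe(audio)
        self.history.append({"role": "user", "content": text})
        reply = self.generate(self.history)
        self.history.append({"role": "assistant", "content": reply})
        self.speak(reply)
        return reply
```

Injecting each stage as a callable makes it easy to swap engines, and keeping the running history is what lets follow-up questions stay in context; the article’s point about speed suggests the real work lies in minimising latency across these stages.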

But Sarah’s development extended far beyond mere functionality. Every aspect of her physicality was crafted – from her animated lips to her expressive gaze and nuanced gestures – all aimed at fostering authentic engagement. The culmination of this effort saw Sarah take centre stage, showcased across screens in four distinct iterations, captivating audiences and inspiring her development team. Her debut marked a milestone as she volunteered her talents at XR events, effortlessly hosting and engaging attendees. Her success paved the way for a variety of different opportunities, including hosting the opening of the Faculty of Mechanical Engineering, where her presence commanded recognition.

Subsequent engagements saw Sarah evolve further, adapting her appearance to suit diverse contexts while retaining her core essence of interactivity. From serving as a gracious host at a philosophy event at Theatre De Veste in Delft to exploring novel avenues of engagement, Sarah’s potential knows no bounds.

As we continue to refine and expand Sarah’s capabilities, we invite inquiries and collaborations, eager to explore new horizons alongside our exceptional AI-driven MetaHuman host. Join us as we embark on this journey of innovation and discovery with Sarah leading the way.

Initiator: NMC Media Lab

Developer: Huu Dat Nguyen

AR / VR / XR

XRScaleKit



The XRScaleKit is a collection of projects and concepts that resulted in a toolkit for scalable XR development. While building and scaling up XR applications, the NewMedia Centre XR Zone frequently ran into issues with networking, authentication, hosting and more, which led to the initiation of various MediaLab projects to address these challenges.

The journey began in 2019 with a project named “Classroom Controller,” which was designed to empower teachers with better control over larger groups in virtual reality (VR) environments. This was followed in 2022 by the “App-Manager” project, which focused on managing XR applications on a server to streamline application installation and version control.

Between these two key projects, several other solutions were developed, gradually leading to the concept of creating a reusable toolkit. We are now excited to share our progress on the XRScaleKit, a comprehensive toolkit designed to accelerate our XR development and enhance its scalability in education.



A quick overview of our current toolkit in development


Manage and remotely install XR applications from a centralized location with support for mass device management.

Remote Control

Monitor, control, and screen share with XR applications remotely, complete with progress tracking and minimap views.


Implement secure Single Sign-On for XR environments, streamlining user authentication and access.


Securely upload, store, and manage application data, with tools for easy viewing and analysis.


Host multiplayer servers and support services on demand in the cloud, integrated within the Library.


A comprehensive set of accessible widgets and UI components designed specifically for XR environments.

Tutorial Level

A tutorial level designed to help users get comfortable with XR controls and interfaces before diving into the main application.


Enhance user experience by adding intuitive labels to XR controller buttons, indicating their functions.

Web Application

Desktop Client

Unity Plugin

Unreal Plugin
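A feature like Remote Control implies a small command protocol between the controlling client and a headset. As a hedged sketch only – the real XRScaleKit protocol is not published here, and the command names below are invented for illustration – such a client could route incoming JSON messages to registered handlers:

```python
import json

def make_dispatcher(handlers):
    """Build a function that routes JSON command messages to handlers.

    `handlers` maps a command name to a function taking an `args` dict.
    The command names used below are purely illustrative.
    """
    def dispatch(raw: str) -> dict:
        msg = json.loads(raw)
        handler = handlers.get(msg.get("command"))
        if handler is None:
            return {"status": "error", "reason": "unknown command"}
        return {"status": "ok", "result": handler(msg.get("args", {}))}
    return dispatch

# Hypothetical headset-side handlers: report progress, load a scene.
dispatch = make_dispatcher({
    "get_progress": lambda args: {"step": 3, "of": 10},
    "load_scene": lambda args: f"loading {args['scene']}",
})
```

A dispatcher like this keeps the wire format engine-agnostic, which matters when the same protocol must be spoken by a web application, a desktop client, and Unity and Unreal plugins.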

Building and testing the applications

The XRScaleKit is an ambitious and evolving project currently under active development. We are excited to share our progress and vision for a comprehensive toolkit designed to speed up our XR development. We invite you to stay engaged as we continue to innovate and expand the possibilities of XR with new tools.

Below you can see the video where Luuk Goossen is testing the Remote Control application in a classroom with 25 students. The test was done using the “Virtual Lab for applied science” application that was built for TU Delft Assistant Professor Bijoy Bera by NMC developer Yosua Adisapta Pranata Andoko.

Stay in touch!

We believe this project might have great potential for other universities and institutions as well. If you have any questions about the project, in whole or in part, do not hesitate to contact us: please fill out the contact form to reach the MediaLab or XR Zone.

Initiators:
Arno Freeke,
Luuk Goossen

Developers:
Luuk Goossen (Lead),
Yosua Pranata Andoko,
Sharif Bayoumy

Project Coordinator:
Roland van Roijen

AR / VR / XR Story

IoT Bridge


This MediaLab project aimed to connect IoT sensors based on the SeeedStudio Grove platform to the most used game engines using a socket library, and proved successful, providing students and researchers with an easy way to use IoT sensors in their projects at the NewMedia Centre XR Zone. IoT sensors can measure things like pressure, light, rotation and acceleration, temperature and humidity, proximity and motion, and more.

This technology can be used not only by game developers but also by students and researchers. By incorporating real-world data into game engines, students and researchers can explore new possibilities for immersive and interactive experiences.



The IoT Bridge is separated into three main components:

•    a Raspberry Pi SD image
•    an Unreal Engine plugin
•    a Unity plugin

The Raspberry Pi SD image contains all the necessary software and configurations to run the Grove sensors and enable communication between the sensors and the game engines. This image simplifies the setup process, making it easy for developers to incorporate real-world data and physical elements into their projects.

The Unreal Engine and Unity plugins provide seamless integration between the Grove sensors and the respective game engines; together with the Raspberry Pi SD image, they allow for real-time data transfer and control of actuators. Developers can use the data from the sensors to drive various elements within the game engines, such as animations, particle effects, and physical simulations, creating more interactive and immersive experiences.
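The socket link between the Pi image and the engine plugins can be pictured as a stream of newline-delimited JSON readings. This is a simplified sketch under assumptions – the actual IoT Bridge wire format is not documented here, and the sensor names and message shape are invented for illustration:

```python
import json
import socket
import threading

def stream_readings(conn: socket.socket, readings):
    """Pi side (illustrative): send each sensor reading as one JSON line."""
    with conn:
        for reading in readings:
            conn.sendall((json.dumps(reading) + "\n").encode())

def receive_readings(host: str, port: int, count: int):
    """Engine side (illustrative): read newline-delimited JSON readings."""
    with socket.create_connection((host, port)) as sock:
        stream = sock.makefile("r")
        return [json.loads(stream.readline()) for _ in range(count)]
```

Newline-delimited JSON keeps the protocol trivial to parse from both C++ (Unreal) and C# (Unity), at the cost of some overhead compared with a binary format.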

The three components work together to bridge Grove-based IoT sensors with both game engines.

This technology provides a platform for experimentation and innovation in the field of serious-game development and other XR applications, making it an ideal tool for students and researchers looking to push the boundaries of what is possible in this field. The integration of Grove-based IoT sensors with game engines through the IoT Bridge provides easier access to a powerful solution for exploring the integration of real-world data and physical elements in XR projects.
To access this technology, visit the XR Zone located in the TU Delft Library.

Luuk Goossen
Roland van Roijen

Luuk Goossen
Yosua Adisapta Pranata Andoko

AR / VR / XR Story

Virtual PVLAB



The availability of physical university labs may be limited due to their capacity or other factors, like the mandatory requirement to work from home during the pandemic peaks. TU Delft set out to increase the capacity and extend the availability of the physical Photovoltaic Lab.

To still provide sustainable and uninterrupted access to learning, the NewMedia Centre, together with the research group Photovoltaic Materials and Devices (PVMD), created the Virtual PVLAB.

The Virtual PVLAB is the digital twin of the on-campus PV Laboratory. Each task and each piece of equipment is simulated in a 3D environment that resembles the actual PV Laboratory. The didactical approach pursued in the Virtual PVLAB is the same as in its on-campus version: students access the laboratory, study the guide in advance and execute a certain task according to a schedule. In the virtual lab students gain practical experience, conducting experiments with light, solar cells, modules, batteries, power electronic components, and system design. They also test the impact of various realistic situations and configurations on the performance of PV systems and all their components.

Through practical work the students get a “hands on” experience with solar equipment, thus gaining a more pragmatic understanding of all the processes.

“Gameplay” images from the application

Instead of analyzing the aforementioned experiments through a mathematical representation in theory, the students can actively manipulate objects in a virtual, live setting and challenge themselves by arranging the measurement setup in a virtual lab.

The opportunity to follow this course in a virtual format also enables students to do it at any time, facilitating their study progress through the MSc programme.

How does it work?

Students can access the virtual lab through their laptops. They do not need any VR headsets, as the Web Graphics Library (WebGL) provides access to a 3D environment through a web browser. To create the web application, the NewMedia Centre used Unreal Engine to build the 3D environment and equipment and to simulate the physics behind the interaction of all objects, and used WebGL to deploy the tasks on all platforms and all major browsers.

All 3D objects created by the NewMedia Centre’s XR Zone are of high quality and optimized for future re-use in similar or other XR applications.

Initiative: O. Isabella MSc
Faculty of Electrical Engineering, Mathematics & Computer Science

Didactics, Technical input & knowledge:
Dr. René van Swaaij, R. Vismara MSc

Developer: Arend-Jan Krooneman
3D modeling: Arno Freeke, Roland van Roijen

AR / VR / XR Story

VR Maritime


Learning the procedures required for working on a ship wharf is usually a difficult and costly process due to limited access to an actual location and the many risks involved. Still, students of the Faculty of 3mE (Mechanical, Maritime and Materials Engineering) have to practice some ship assembly and logistics operations.

In order to help students learn easily and safely, the NewMedia Centre created a multiplayer VR application where they can learn multiple disciplines on a ship wharf in a virtual environment. Once in VR, the students perform different tasks from identifying and locating the required parts of the ship to transporting them and assembling the hull of the ship with a crane. During the whole experience they work in a team and perform these practical tasks while learning to navigate through the ship together. All the team members communicate through virtual walkie talkies, created specifically to increase the realism of their communication in VR.

For this project a ship and a ship wharf have been created in 3D, using Unreal Engine, and optimised for VR. The application features a multiplayer environment.

How does it work?

Students use VR headsets, which provide full immersion into the environment.

At some point the students have to carry a pipe through the ship. As VR does not allow us to replicate the weight of the object, we have added a real physical pipe about 150 cm long with weights attached to it. The object is fitted with a VR tracking puck that maps it into VR, so students can see and feel the object in VR and learn to communicate and navigate safely while physically carrying a weight. This enables a more realistic perception of the virtual experience and results in more effective learning.

The maritime project has been run successfully with students from the faculty of 3mE and is scheduled to be expanded over time through the addition of more functionality and tasks.

Initiative: J.F.J. (Jeroen) Pruijn
Faculty 3mE

Arend-Jan Krooneman
Luuk Goossen
Max van Schendel
Arno Freeke
Huu Dat Nguyen

AR / VR / XR Story Video

Virtual Production 4 Education


Footage from Mars rovers and images of the vehicles have been out there for years, but getting a feel for the scale of these objects in comparison to a human has always been difficult, as there are usually no people in the scene for reference. Sebastiaan de Vet, lecturer at Astrodynamics & Space Missions, TU Delft Aerospace Engineering, wanted to show the scale differences between the Mars vehicles created over time by NASA.

To solve this issue the NewMedia Centre of TU Delft has created a virtual production studio that enables teachers and researchers to blend into their 3D virtual content, creating a real-time augmented video.

Sebastiaan had the idea to walk past a chronological lineup of all the vehicles, where he would stop to talk about each vehicle and then move on to the next one with a final shot showing all vehicles in one scene.

Thanks to this studio, dynamic content can be made at a scale affordable for education, enabling teachers and researchers to step into their 3D virtual content.

The virtual studio features free-moving cameras where the foreground and background will move accordingly in scale, position, rotation and focus with the moving cameras.

How does it work?

The students see the professor in the video as if he were part of a preset environment: in this case, walking on the Mars surface. This does require preparation, however: a powerful computer renders the dynamic foreground and background in real time, keeping them in sync with the camera.

The Mars landscape was created from scratch using Quixel Megascans. Everything was put together, and after a bit of tweaking a realistic Mars landscape with rover vehicles was created. The 3D models of the rovers are provided by NASA but needed to be optimized by the NMC before they could be used in VR.

“We used three types of cameras: a static camera, a camera on a slider and a free-moving cameraman. We also wanted to shoot close-ups as well as wide scenery. The virtual studio allows rebuilding shots as you like and it’s relatively easy to sync the fore- and background on the fly,” says Roland van Roijen, Coördinator Media Lab | Media / XR / 3D Designer.

VR tech: Arend-Jan Krooneman
3D modeling: Arno Freeke, Roland van Roijen
Director: Maik Helgers
Camera: Boris Swaen, Geraldo Solisa, Julia Zomer
Mars models: NASA