AR / VR / XR Story

Revolutionizing Education: Virtual Lab for applied science

In today’s constantly evolving education landscape, there is a growing demand for innovative solutions driven by the integration of cutting-edge technologies. This demand sparked the inception of the Virtual Lab for applied science by Bijoy Bera, an assistant professor in the Transport Phenomena group within the Faculty of Applied Sciences at Delft University of Technology. The project aims to revolutionize how students learn and engage with transport phenomena concepts through immersive virtual experiences. The original aim was to give students a chance to test physics theories in lab sessions, but the large class size made it unfeasible to conduct real-life experiments for everyone. The VR project therefore began as a pilot initiative for Bijoy to explore virtual simulations of scientific experiments. To achieve this, he collaborated with developers from the XR Zone of the NewMedia Centre at TU Delft, embarking on a mission to enhance the learning experience through immersive technology.

The Virtual Lab idea began with the goal of offering multimodal learning, where students can learn in different ways. Professor Bera’s past methods, like using videos and live demonstrations, laid the foundation for this approach. Now, the Virtual Lab aims to facilitate the exploration of physics transport phenomena through hands-on experiences, implementing the “learning by doing” approach to enable active student engagement while mitigating the risk of laboratory accidents. With the help of developers from the XR Zone, a virtual lab environment was created, enabling students to verify scientific theories from lectures through interactive experiments.



Starting with eight experiment sets in 2022, the project quickly expanded into an interactive platform for hands-on learning. As interest grew, more professors from the Faculty of Applied Sciences were convinced of the possibilities of scientific simulation in VR after seeing the pilot project. Additional experiment sets, including several in biotechnology, were then integrated, reflecting a broader trend of applied science faculties embracing cutting-edge technologies.

The impact of the Virtual Lab on student learning was profound. A course survey revealed heightened comprehension of theoretical concepts and an enriched learning experience. Immersed in VR experiments, students bridged the gap between theory and practice, gaining a deeper understanding of the subject matter. Reflecting on the project’s success, Professor Bera noted the transformative potential of VR in education. Despite initial uncertainties, the project proved to be a resounding success, highlighting the accuracy and adaptability of VR simulations.
The students themselves shared their impressions. They found clarity and connection to the theory they had learned through the immersive VR experiences. As one student remarked, he finally understood where a concept had been mentioned in the lecture, bringing everything into focus. The Virtual Lab for Physics Transport Phenomena integrates VR and 3D technology with education. It marks a shift towards experiential learning, where virtual environments facilitate discovery and interaction.

Following its success, the Virtual Lab stands as proof of Professor Bera’s vision and motivates educators. As we venture further into virtual education, one thing is certain: learning’s potential is boundless, limited only by our imagination.

Initiator: Dr. Bijoy Bera
Coordinator: Arno Freeke
Developer: Yosua Adisapta Pranata Andoko

3D Animation AR / VR / XR Story

Teacher – Classroom Simulator using AI

In collaboration with Hogeschool Fontys, SEC from TU Delft, and the XR Zone, we embarked on creating an AI-driven Teacher Classroom Simulator for didactics courses. Our aim was to explore the potential of VR in teaching and provide educators with a virtual training ground. Lecturers apply VR capabilities to implement innovative didactic methodologies for teaching students. Additionally, the XR Zone offers an alternative pedagogical solution, leveraging cutting-edge technology to facilitate knowledge transfer within the classroom.

Their didactic approach prioritises practical applications, applying VR to enhance learning experiences.


The project grew from a shared passion for improving teaching through VR. Initially, we focused on education and AI.

The Educational Environment provides immersive classroom experiences with helpful feedback. We tested text prompts and verbal cues to help teachers, using VR to make learning more engaging.

On the AI side, virtual students act like real ones, evaluating teaching skills with voice and behaviour tracking. Two years of development, including a year of refinement, taught us a lot through trials with new teachers. Our goal is to make the platform accessible to others and adaptable to different teaching situations, providing immersive education.

Initiator: Margreet Docter 

Researchers: Tamara De Vries, Alfin Thomas
Developer: Huu Dat Nguyen

3D Animation AR / VR / XR Story Video

Sarah – AI-driven MetaHuman

Introducing Sarah, the remarkable AI-driven MetaHuman born from the innovative fusion of diverse projects and cutting-edge AI experimentation. Conceptualised within the dynamic confines of the XR Zone, Sarah represents a groundbreaking leap in technology-human interaction. Her genesis was marked by the challenge of infusing her with unparalleled realism and interactivity, leveraging state-of-the-art tools such as realistic avatars and OpenAI’s large language models (LLMs). The goal was to capture and articulate her voice, bridging it with lifelike conversation while ensuring swift responsiveness – where speed emerged as the ultimate frontier.

But Sarah’s development extended far beyond mere functionality. Every aspect of her physicality was crafted – from her animated lips to her expressive gaze and nuanced gestures – all aimed at fostering authentic engagement. The culmination of this effort saw Sarah take centre stage, showcased across screens in four distinct iterations, captivating audiences and inspiring her development team. Her debut marked a milestone as she volunteered her talents at XR events, effortlessly hosting and engaging attendees. Her success paved the way for a variety of different opportunities, including hosting the opening of the Faculty of Mechanical Engineering, where her presence commanded recognition.

Subsequent engagements saw Sarah evolve further, adapting her appearance to suit diverse contexts while retaining her core essence of interactivity. From serving as a gracious host at a philosophy event at Theatre De Veste in Delft to exploring novel avenues of engagement, Sarah’s potential knows no bounds.

As we continue to refine and expand Sarah’s capabilities, we invite inquiries and collaborations, eager to explore new horizons alongside our exceptional AI-driven MetaHuman host. Join us as we embark on this journey of innovation and discovery with Sarah leading the way.

Initiator: NMC Media Lab

Developer: Huu Dat Nguyen

AR / VR / XR Story

IoT Bridge

This MediaLab project aimed to connect IoT sensors based on the SeeedStudio Grove platform to the most widely used game engines using a socket library, and it proved successful. It provides students and researchers with an easy way to use IoT sensors in their projects at the NewMedia Centre XR Zone. IoT sensors can measure things like pressure, light, gyroscopic motion and acceleration, temperature and humidity, proximity and motion, and more.

This technology can be used not only by game developers but also by students and researchers. By incorporating real-world data into game engines, students and researchers can explore new possibilities for immersive and interactive experiences.



The IoT Bridge is separated into three main components:

•    a Raspberry Pi SD image
•    an Unreal Engine plugin
•    a Unity plugin

The Raspberry Pi SD image contains all the necessary software and configuration to run the Grove sensors and enable communication between the sensors and the game engines. This image simplifies the setup process, making it easy for developers to incorporate real-world data and physical elements into their projects.

The Unreal Engine and Unity plugins provide seamless integration between the Grove sensors and the respective game engines; together with the Raspberry Pi SD image, they allow for real-time data transfer and control of actuators. Developers can use the data from the sensors to drive various elements within the game engines, such as animations, particle effects, and physical simulations, creating more interactive and immersive experiences.

The three components work together to bridge Grove-based IoT sensors and both game engines.
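The link between the Pi image and the engine plugins can be pictured with a small sketch. The Python below is a minimal, hypothetical example rather than the actual plugin code: it assumes the bridge streams newline-delimited JSON packets over TCP, which is an illustrative guess at the wire format, not the documented protocol.

```python
import json
import socket

def parse_sensor_packet(line: str) -> dict:
    """Parse one newline-delimited JSON packet from the bridge.

    The real IoT Bridge wire format is not documented here; this assumes
    a simple {"sensor": ..., "value": ...} JSON-lines scheme.
    """
    packet = json.loads(line)
    return {"sensor": packet["sensor"], "value": float(packet["value"])}

def stream_readings(host: str, port: int = 5000):
    """Yield sensor readings from a hypothetical bridge endpoint."""
    with socket.create_connection((host, port)) as conn:
        buffer = b""
        while True:
            chunk = conn.recv(1024)
            if not chunk:
                break
            buffer += chunk
            # Split complete lines off the buffer; keep the remainder.
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                yield parse_sensor_packet(line.decode("utf-8"))

# A game engine script would consume readings like this one and map
# them onto animations, particles, or physics parameters:
reading = parse_sensor_packet('{"sensor": "temperature", "value": 21.5}')
```

Inside Unreal or Unity, the equivalent logic would live in the plugin; the sketch only shows the shape of the data flow.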

This technology provides a platform for experimentation and innovation in serious-game development and other XR applications, making it an ideal tool for students and researchers looking to push the boundaries of what is possible in this field. The integration of Grove-based IoT sensors with game engines through the IoT Bridge provides easy access to a powerful solution for exploring the integration of real-world data and physical elements in XR projects.
To access this technology, visit the XR Zone located in the TU Delft Library.

Luuk Goosen
Roland van Roijen

Luuk Goosen
Yosua Adisapta Pranata Andoko

3D Story

3D Scanner

In the Media Lab we are setting up a 3D scanner service; we are currently testing and investigating how to implement this service. This includes best practices, optimization, and a solution to export and/or publish your model in a format or formats suitable for several applications.

A handheld 3D scanner is a device that uses lasers and cameras to capture the shape and texture of an object, and then creates a 3D digital model of it. These scanners have become increasingly popular in recent years due to their ability to quickly and easily capture detailed 3D data of a wide range of objects, from small figurines to large industrial parts. Handheld 3D scanners are used in a variety of industries, including manufacturing, engineering, and design. They can also be used by artists who want to create digital models of their creations.

Handheld 3D scanners have a wide range of applications, such as:

  • Reverse engineering
  • Quality control and inspection
  • Rapid prototyping
  • Archiving historical artifacts
  • Cultural heritage and conservation
  • Artistic creation
  • Education

The Scanner available is the Creaform Go Scan.

This is a fast, user-friendly handheld 3D scanner that works with lights and cameras. Ideally, it is used for objects from approximately 5 cm up to roughly 2 meters, because its cord is a little over 2 meters long.


Having the scanner is the easy part, but creating instructions to make it user friendly is a challenge that requires a lot of work. Several sessions have been held with test users, and after a few paper versions the first instruction video is being created for the next run.

Scanning produces a huge amount of data, and Sharif is finalizing an application build to perform automated optimization, converting the model to more usable formats like glTF and FBX, accompanied by PBR materials.
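As an illustration of the kind of clean-up step such an optimization pass performs, the sketch below welds near-duplicate vertices in pure Python. This is not the Media Lab's actual pipeline; the function name and tolerance are assumptions chosen for illustration.

```python
def weld_vertices(vertices, triangles, tolerance=1e-5):
    """Merge near-duplicate vertices, a typical first step when
    simplifying dense raw scan data (illustrative sketch only).

    vertices:  list of (x, y, z) tuples
    triangles: list of (i, j, k) index triples
    Returns (new_vertices, new_triangles).
    """
    merged = []   # unique vertices kept
    remap = []    # old vertex index -> new vertex index
    seen = {}     # quantised position -> new vertex index
    for v in vertices:
        # Quantise each coordinate so positions within `tolerance`
        # collapse onto the same key.
        key = tuple(round(c / tolerance) for c in v)
        if key not in seen:
            seen[key] = len(merged)
            merged.append(v)
        remap.append(seen[key])
    new_triangles = [(remap[i], remap[j], remap[k]) for i, j, k in triangles]
    return merged, new_triangles
```

A real tool would follow this with decimation and texture baking before exporting to glTF or FBX, but the remapping idea is the same.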

If you have questions, ideas or suggestions do not hesitate to contact us.

Initiator: Vincent Cellucci
Coordinator: Roland van Roijen
Developer: Sharif Bayoumy
Use Case Research: Geertje van Achterberg

Story Video

Holographic projection

Together with the Monterrey Institute of Technology in Mexico, the NewMedia Centre is investigating possibilities for holographic projections in the classroom. Currently a pilot is running in which a teacher from Mexico teaches students in Delft and, in turn, a Delft teacher teaches students in Mexico.

Holographic projection technology is revolutionizing the way we teach and learn. With this technology, teachers are no longer limited to the physical space of the classroom and can now reach students from anywhere in the world.

The technology works by using a combination of cameras, projectors, and special software to create a true-sized image of the teacher that is projected onto a transparent screen in the classroom. The image is so realistic that it appears as if the teacher is actually in the room with the students.

One of the biggest advantages of holographic projection technology is that it allows for remote teaching. This means that a teacher can teach a class from a different location, whether it be from their home or another school. This is particularly useful in cases where a teacher is unable to be physically present in the classroom. 

Overall, holographic projection technology is considered a game-changer in the field of education. It opens up new opportunities for teaching and learning, making it possible for teachers to reach students in ways that were previously not possible. As this technology continues to evolve, we can expect to see even more innovative and immersive ways of teaching and learning in the future.




This screen can be used not only for teaching but also for presentations and meetings. Annoesjka Cabo, Academic Director of the TU Delft Teaching Academy, was the first to use our screen in the Netherlands, during the Education Day in 2022, where she spoke to a crowd of approximately 150 people and responded to questions from the audience. Overall, the response to this performance was very positive.

We are still experimenting with the implementation and possibilities so do not hesitate to contact us if you are interested.

Initiative: Monterrey Institute of Technology

Developer: Roland van Roijen

AR / VR / XR Story

Virtual PVLAB

The availability of the physical university labs may be limited by their capacity or by other factors, such as the mandatory requirement to work from home during the pandemic peaks. TU Delft had the goal to increase the capacity and extend the availability of the physical Photovoltaic Lab.

To still provide sustainable and uninterrupted access to learning, the NewMedia Centre, together with the Photovoltaic Materials and Devices (PVMD) research group, has created the Virtual PVLAB.

The Virtual PVLAB is the digital twin of the on-campus PV Laboratory. Each task and each piece of equipment is simulated in a 3D environment that resembles the actual PV Laboratory. The didactical approach pursued in Virtual PVLAB is the same as in its on-campus version: students access the laboratory, preemptively study from the guide and execute a certain task according to a schedule. In the virtual lab students gain practical experience, conducting experiments with light, solar cells, modules, batteries, power electronic components, and system design. They also test the impact of various realistic situations and configurations on the performance of PV systems and all their components.


“Gameplay” images from the application

Instead of analyzing the aforementioned experiments through a mathematical representation in theory, the students can actively manipulate objects in a virtual, live setting and challenge themselves by arranging the measurement setup in a virtual lab. Through practical work the students get a “hands on” experience with solar equipment, thus gaining a more pragmatic understanding of all the processes.

The opportunity to follow this course in a virtual format also enables students to do it at any time, facilitating their study progress through the MSc programme.

How does it work?

Students can access the virtual lab through their laptops. They do not need VR headsets, as the Web Graphics Library (WebGL) provides access to a 3D environment through a web browser. To create the web application, the NewMedia Centre used Unreal Engine to build the 3D environment and equipment and to simulate the physics behind the interaction of all objects, and used WebGL to deploy the tasks on all platforms in all major browsers.

All 3D objects created by the NewMedia Centre’s XR Zone are high quality and optimized for future re-use in similar or other XR applications.

Initiative: O. Isabella MSc
Faculty of Electrical Engineering, Mathematics & Computer Science

Didactics, Technical input & knowledge:
Dr. René van Swaaij, R. Vismara MSc

Developer: Arend-Jan Krooneman
3D modeling: Arno Freeke, Roland van Roijen

AR / VR / XR Story

VR Maritime

Learning the procedures required for working on a ship wharf is usually a difficult and costly process due to limited access to an actual location and the considerable risks involved. Still, students of the faculty of 3mE (Mechanical, Maritime and Materials Engineering) have to practice certain assembly and logistics ship operations.

In order to help students learn easily and safely, the NewMedia Centre created a multiplayer VR application where they can learn multiple disciplines on a ship wharf in a virtual environment. Once in VR, the students perform different tasks from identifying and locating the required parts of the ship to transporting them and assembling the hull of the ship with a crane. During the whole experience they work in a team and perform these practical tasks while learning to navigate through the ship together. All the team members communicate through virtual walkie talkies, created specifically to increase the realism of their communication in VR.


How does it work?

Students use VR headsets which provide full immersion into the environment. For this project a ship and a ship wharf have been created in 3D, using Unreal Engine, and optimised for VR. The application features a multiplayer environment.

At some point the students have to carry a pipe through the ship. As VR does not allow us to replicate the weight of the object, we added a real physical pipe, about 150 cm long, with weights attached to it. The object is fitted with a VR puck that translates its position into VR, so students can see and feel the object in VR and learn to communicate and navigate safely while physically carrying a weight. This enables a more realistic perception of the virtual experience and results in more effective learning.
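The idea of turning the tracked puck's pose into virtual pipe geometry can be sketched as follows. This is a hypothetical reconstruction, not the project's code: it assumes the puck sits at the pipe's midpoint and the pipe runs along the puck's local x axis, which are guesses about the rig.

```python
def cross(a, b):
    """Cross product of two 3-vectors (tuples)."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, qv = q[0], q[1:]
    t = tuple(2 * c for c in cross(qv, v))
    u = cross(qv, t)
    return tuple(v[i] + w * t[i] + u[i] for i in range(3))

def pipe_endpoints(puck_pos, puck_rot, length=1.5):
    """Endpoints of a pipe whose midpoint carries the tracked puck.

    puck_pos: (x, y, z) position reported by the tracker, in meters
    puck_rot: (w, x, y, z) unit quaternion orientation
    length:   pipe length in meters (the article mentions ~150 cm)
    """
    half = rotate(puck_rot, (length / 2, 0.0, 0.0))
    a = tuple(p - h for p, h in zip(puck_pos, half))
    b = tuple(p + h for p, h in zip(puck_pos, half))
    return a, b
```

Each frame, the engine would call something like this with the puck's latest pose and render the virtual pipe between the two endpoints, so the visual object stays glued to the physical one.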

The maritime project has been run successfully with students from the faculty of 3mE and is scheduled to be expanded over time through the addition of more functionality and tasks.

Initiative: J.F.J. (Jeroen) Pruijn
Faculty 3mE

Arend-Jan Krooneman
Luuk Goosen
Max van Schendel
Arno Freeke
Huu Dat Nguyen

AR / VR / XR Story Video

Virtual Production 4 Education

Footage from Mars rovers and images of the vehicles have been available for years, but getting a feel for the scale of these objects in comparison to a human has always been difficult, as there are usually no people in the scene for reference. Sebastiaan de Vet, lecturer at Astrodynamics & Space Missions, TU Delft Aerospace Engineering, wanted to show the scale differences between the Mars vehicles NASA has created over time.

To solve this issue, the NewMedia Centre of TU Delft has created a virtual production studio that enables teachers and researchers to blend into their 3D virtual content, creating a real-time augmented video.

Sebastiaan had the idea to walk past a chronological lineup of all the vehicles, where he would stop to talk about each vehicle and then move on to the next one with a final shot showing all vehicles in one scene.

Thanks to this studio, dynamic content can be made on a scale affordable for education.


How does it work?

The students are able to see the professor on the video as if he is a part of a preset environment: in this case he is walking on the Mars surface. This project requires preparation, however. The virtual studio features free-moving cameras where the foreground and background will move accordingly in scale, position, rotation and focus with the moving cameras. A powerful computer renders the dynamic foreground and background in real time. 
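The heart of keeping foreground and background registered is re-projecting the virtual scene from the tracked camera's pose every frame. The sketch below uses a simplified pinhole model with a translating but non-rotating camera, a deliberate simplification of the studio's freely moving, rotating cameras; all names and numbers are illustrative.

```python
def project(point, cam_pos, focal_px, image_center):
    """Pinhole projection of a world point into the tracked camera.

    point:        (x, y, z) world position of a virtual object, meters
    cam_pos:      (x, y, z) tracked physical camera position, meters
    focal_px:     focal length expressed in pixels
    image_center: (cx, cy) principal point in pixels
    Returns (u, v) pixel coordinates.
    """
    # Express the point in camera space (camera looks down +z here).
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z <= 0:
        raise ValueError("point is behind the camera")
    # Perspective divide: nearer points move more when the camera moves,
    # which is exactly the parallax that keeps fore- and background in sync.
    u = image_center[0] + focal_px * x / z
    v = image_center[1] + focal_px * y / z
    return u, v
```

Running this for every virtual object with the camera's live pose reproduces, in miniature, what the render engine does when the cameraman walks through the set.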

The Mars landscape was created from scratch using Quixel Megascans assets. After some tweaking, a realistic Mars landscape with the rover vehicles was assembled. The 3D models of the rovers are provided by NASA but needed to be optimized by the NMC before they could be used in VR.

“We used three types of cameras: a static camera, a camera on a slider and a free moving cameraman. We also wanted to shoot close-up as well as a wide scenery. The virtual studio allows rebuilding shoots as you like and it’s relatively easy to sync the fore and background on the fly,” says Roland van Roijen, Coördinator Media Lab | Media / XR / 3D Designer.

VR tech: Arend-Jan Krooneman
3D modeling: Arno Freeke, Roland van Roijen
Director: Maik Helgers
Camera: Boris Swaen, Geraldo Solisa, Julia Zomer
Mars models: NASA