
Sarah – AI driven MetaHuman

Introducing Sarah, a remarkable AI-driven MetaHuman born from the fusion of diverse projects and cutting-edge AI experimentation. Conceptualised within the dynamic confines of the XR Zone, Sarah represents a groundbreaking leap in technology-human interaction. Her genesis was marked by the challenge of infusing her with unparalleled realism and interactivity, leveraging state-of-the-art tools such as realistic avatar technology and OpenAI’s large language models (LLMs). The goal was to capture and articulate her voice, bridging it with lifelike conversation while ensuring swift responsiveness – where speed emerged as the ultimate frontier.
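A conversational avatar like this is essentially a hear–think–speak loop in which the end-to-end latency decides how natural the exchange feels. The sketch below illustrates that loop with stub stages; only the use of an OpenAI LLM is stated in the text above, so the stage names, function signatures, and timing scheme are illustrative assumptions, not Sarah's actual implementation.

```python
import time

def transcribe(audio: str) -> str:
    """Stub speech-to-text; a real system would call an ASR service here."""
    return audio  # pretend the audio has already been recognised as text

def generate_reply(text: str) -> str:
    """Stub LLM call; a real system would query a large language model."""
    return f"Sarah: thanks for asking about {text!r}."

def synthesize(reply: str) -> str:
    """Stub text-to-speech; a real system drives the avatar's voice and lips."""
    return f"<speech>{reply}</speech>"

def conversation_turn(audio: str) -> tuple[str, dict[str, float]]:
    """One full turn, timing each stage: total latency is what the
    visitor experiences as responsiveness."""
    timings: dict[str, float] = {}

    start = time.perf_counter()
    text = transcribe(audio)
    timings["asr"] = time.perf_counter() - start

    start = time.perf_counter()
    reply = generate_reply(text)
    timings["llm"] = time.perf_counter() - start

    start = time.perf_counter()
    speech = synthesize(reply)
    timings["tts"] = time.perf_counter() - start

    timings["total"] = sum(timings.values())
    return speech, timings

speech, timings = conversation_turn("What is the XR Zone?")
```

In practice each stage would stream its output into the next rather than run to completion, which is how such pipelines push latency down far enough for live hosting.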

But Sarah’s development extended far beyond mere functionality. Every aspect of her physicality was crafted – from her animated lips to her expressive gaze and nuanced gestures – all aimed at fostering authentic engagement. The culmination of this effort saw Sarah take centre stage, showcased across screens in four distinct iterations, captivating audiences and inspiring her development team. Her debut marked a milestone as she volunteered her talents at XR events, effortlessly hosting and engaging attendees. Her success paved the way for a variety of opportunities, including hosting the opening of the Faculty of Mechanical Engineering, where her presence commanded recognition.

Subsequent engagements saw Sarah evolve further, adapting her appearance to suit diverse contexts while retaining her core essence of interactivity. From serving as a gracious host at a Philosophy event at Theatre de Veste in Delft to exploring novel avenues of engagement, Sarah’s potential knows no bounds.

As we continue to refine and expand Sarah’s capabilities, we invite inquiries and collaborations, eager to explore new horizons alongside our exceptional AI-driven MetaHuman host. Join us as we embark on this journey of innovation and discovery with Sarah leading the way.

Holographic projection

Together with the Mexican Monterrey Institute of Technology, the NewMedia Centre is investigating possibilities for holographic projections in the classroom. A pilot is currently running in which a teacher from Mexico teaches students in Delft and, in turn, a Delft teacher teaches students in Mexico.

Holographic projection technology is revolutionizing the way we teach and learn. With this technology, teachers are no longer limited to the physical space of the classroom and can now reach students from anywhere in the world.

The technology works by using a combination of cameras, projectors, and special software to create a life-sized image of the teacher, projected onto a transparent screen in the classroom. The image is so realistic that it appears as if the teacher is actually in the room with the students.
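The transparent screen does the heavy lifting: projected black emits no light and so reads as see-through, which means the software only has to make sure everything except the brightly lit teacher is black. A minimal luminance key illustrates the idea; this is an assumption about how such isolation could work in principle, not the actual software used in the pilot.

```python
BLACK = 0  # projected black is invisible on a transparent screen

def luma_key(frame, threshold=40):
    """Replace dim background pixels with black so that only the
    brightly lit subject survives projection. `frame` is a 2D list of
    0-255 grayscale values; a real system works on full-colour video
    in real time."""
    return [
        [pixel if pixel >= threshold else BLACK for pixel in row]
        for row in frame
    ]

# A toy 2x3 frame: dark studio background (~10) around a lit subject (~200).
frame = [[10, 200, 10],
         [10, 220, 15]]
keyed = luma_key(frame)  # background goes to 0, subject is kept
```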

One of the biggest advantages of holographic projection technology is that it allows for remote teaching. This means that a teacher can teach a class from a different location, whether it be from their home or another school. This is particularly useful in cases where a teacher is unable to be physically present in the classroom. 

Overall, holographic projection technology is considered a game-changer in the field of education. It opens up new opportunities for teaching and learning, making it possible for teachers to reach students in ways that were previously not possible. As this technology continues to evolve, we can expect to see even more innovative and immersive ways of teaching and learning in the future.

 



Not only can this screen be used for teaching, it also allows people to give presentations or hold meetings. Annoesjka Cabo, Academic Director of the TU Delft Teaching Academy, was the first in the Netherlands to use our screen, during the Education Day in 2022, where she spoke to a crowd of approximately 150 people and responded to questions from the audience. Overall, the response to this performance was very positive.

We are still experimenting with the implementation and possibilities so do not hesitate to contact us if you are interested.

Initiative: Monterrey Institute of Technology

Developer: Roland van Roijen

Virtual Production 4 Education

Footage from Mars rovers and images of the vehicles have been out there for years, but getting a feel for the scale of these objects in comparison to a human has always been difficult, as there are usually no people in the scene for reference. Sebastiaan de Vet, lecturer at Astrodynamics & Space Missions, TU Delft Aerospace Engineering, wanted to show the scale differences between the Mars vehicles NASA has created over time.

To solve this issue, the NewMedia Centre of TU Delft has created a virtual production studio that enables teachers and researchers to blend into their 3D virtual content, thus creating a real-time augmented video.

Sebastiaan had the idea to walk past a chronological lineup of all the vehicles, where he would stop to talk about each vehicle and then move on to the next one with a final shot showing all vehicles in one scene.

Thanks to this studio, such dynamic content can be produced on a scale that is affordable for education.


How does it work?

The students see the professor in the video as if he were part of a preset environment: in this case, walking on the Mars surface. This project requires preparation, however. The virtual studio features free-moving cameras, and the foreground and background move accordingly in scale, position, rotation and focus with the moving cameras, while a powerful computer renders both in real time.
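The core of that camera sync can be sketched as follows: each frame, the physical camera's tracked pose is mirrored onto the render engine's virtual camera so the virtual fore- and background stay locked to the real shot. The pose fields, the `world_scale` factor, and the function names below are illustrative assumptions; the studio's actual tracking and rendering setup is not described in this article.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """Position (m), rotation (deg: pitch, yaw, roll), focal length (mm)."""
    position: tuple[float, float, float]
    rotation_deg: tuple[float, float, float]
    focal_mm: float

def sync_virtual_camera(tracked: CameraPose, world_scale: float = 1.0) -> CameraPose:
    """Mirror the tracked studio camera onto the virtual camera so the
    rendered scene shifts in scale, position, rotation and focus together
    with the physical camera. `world_scale` lets a few metres of studio
    movement stand for a larger stretch of virtual Mars terrain."""
    x, y, z = tracked.position
    return CameraPose(
        position=(x * world_scale, y * world_scale, z * world_scale),
        rotation_deg=tracked.rotation_deg,  # orientation is copied 1:1
        focal_mm=tracked.focal_mm,          # zoom/focus follows the lens
    )

# Per video frame: read the tracker, update the virtual camera, render.
studio_cam = CameraPose(position=(0.5, 1.7, -2.0),
                        rotation_deg=(0.0, 15.0, 0.0),
                        focal_mm=35.0)
mars_cam = sync_virtual_camera(studio_cam, world_scale=4.0)
```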

The Mars landscape was created from scratch using Quixel Megascans. Everything was put together and, after a bit of tweaking, a realistic Mars landscape with rover vehicles was created. The 3D models of the rovers were provided by NASA but needed to be optimized by the NMC before they could be used in VR.

“We used three types of cameras: a static camera, a camera on a slider and a free-moving cameraman. We also wanted to shoot close-ups as well as wide scenery. The virtual studio allows rebuilding shoots as you like, and it’s relatively easy to sync the fore- and background on the fly,” says Roland van Roijen, Coordinator Media Lab | Media / XR / 3D Designer.

VR tech: Arend-Jan Krooneman
3D modeling: Arno Freeke, Roland van Roijen
Director: Maik Helgers
Camera: Boris Swaen, Geraldo Solisa, Julia Zomer
Mars models: NASA