Made by a student, now an XRLab technician, as a final-year project, the Cycling Safety application lets people practise cycling safety and experience the dangers of the road like never before. Using the immersive and interactive technologies of virtual reality, users take control of a cyclist and are tasked with navigating a small section of a busy city. The dangers change based on several factors, such as the time of day and the weather, which affect car behaviour and pedestrian numbers. The application monitors the user's actions, judging whether they travelled safely, before informing the user of any dangers or lapses in safety on their part, along with suggestions for improvement. Plans to develop the project over the next year include a wider range of conditions, more interactions, more dynamic artificial intelligence and more responsive, detailed feedback for the user, as well as graphical updates and improvements to give the city a much more realistic look.
PAL - Learn and Play is a project developed for the HoloLens: an educational, interactive game, aimed especially at children, that teaches subjects such as maths, anatomy and art in a fun way through mini-games. The user stands in front of a classroom containing a robot avatar who plays the role of playmate and works alongside them throughout the game; there are also secondary objects with which the user can interact. The controls work by moving the head freely to point at objects, and by using gestures or voice to interact with them. The project was carried out to break down the limits of virtual reality, where a headset must be connected to a secondary GPU, as well as the sense of isolation a fully virtual headset brings, given that with the HoloLens the real environment remains visible through the lenses. Future plans are to improve the existing mini-games and interactions, improve rendering capacity, create a stronger connection between the game and the surrounding environment, and build a compatible version for the HoloLens 2 with more interactions.
This project is the result of a collaboration between the XRLab and the Department of Psychology at the University of Westminster. It was created with the idea of breaking down the distance barrier and making it possible to meet in the virtual world. To do so, within the networked VR software VRChat, students have been able to enter a world specifically created for their needs, to meet, interview each other and collaborate in real time. As part of the project, a set of avatars with different ethnic characteristics was also created, in order to best represent each individual. Development included the use of the VRChat SDK in Unity for the creation of the world, as well as Adobe Fuse for the avatars. The realisation of the project has confirmed the power of virtual reality technology and its many uses in fields such as psychology. The project will remain available to future psychology students, or to anyone who wants to try it or improve it for the same or other purposes.
The Projection Mapping Rig was created in collaboration with the Fabrication Lab. It is used for exploring and presenting architectural proposals in an interactive and immersive fashion. In particular, a physical model of a proposed building and its surroundings has been overlaid with animated cross-sections taken from 3D scan point-cloud data. Alongside this, the building's exterior has been projected as a texture onto the concrete model through projection mapping, and the map of the designated area has been converted into a 3D version from authentic Google Maps data.
The project set out to demonstrate how different technologies can be brought together in the field of architecture, the accurate results that can be obtained through new 3D scanning and projection methods, and different ways of visualising models.
The Airship project is a small demo game created to explore the features and basic mechanics of the VRTK toolkit. The user starts facing two platforms that can be reached by teleporting. On the first floats a miniature 3D airship model made of Lego bricks, which the user can rotate or scale; on the second is the same model at full scale, onto which the user can teleport and explore the interior. Inside is a lever that acts as a switch, sending the vehicle flying into the sky. The project is available to students who want to learn the basic mechanics of VR, and it also serves as a demonstration of a possible project idea.
This project created an AR overlay that can be applied to an architectural presentation model in order to add active elements. These could be, for example, the structure's uses (e.g. animations of cars and people), decorative elements (applied textures or populated objects) or state changes that help broaden understanding of the structure's use (e.g. placing it among neighbouring structures).
AR Overlay has been developed to be shared internally as a demonstrable digital product that helps others understand the capabilities of the XRLab, but also to inspire students to think about how they can integrate this technology into current practice and to create further AR functionality with a specific architectural application.
At the moment the application runs on phones and tablets, but in the future it could be ported to the HoloLens to demonstrate the many ways and circumstances in which it can be used.
ARIA is a networked VR application capable of communicating, across multiple devices, the positional information of a VR user in relation to a 3D model, and displaying it as an AR overlay on top of a duplicate physical model. The project includes features such as natural persistence in the AR avatar as the VR user teleports around, achieved by having the model move towards the player rather than being attached to it; animation cycles that change so the avatar appears to walk as it moves between locations; and avatar arms that move with the VR user's controllers.
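The "natural persistence" behaviour described above, where the AR avatar walks towards the VR user's latest position instead of snapping to it, comes down to moving a point towards a target by a capped distance each frame (the same idea as Unity's Vector3.MoveTowards). A minimal Python sketch of that logic, with illustrative speeds and positions:

```python
import math

def move_towards(current, target, max_step):
    """Move a point towards a target by at most max_step units."""
    delta = [t - c for c, t in zip(current, target)]
    dist = math.sqrt(sum(d * d for d in delta))
    if dist <= max_step or dist == 0.0:
        return list(target)          # close enough: snap to the target
    scale = max_step / dist
    return [c + d * scale for c, d in zip(current, delta)]

# The VR user teleports far away; the AR avatar walks there over several frames.
avatar = [0.0, 0.0, 0.0]
vr_user = [10.0, 0.0, 0.0]
for _ in range(5):
    avatar = move_towards(avatar, vr_user, 1.5)  # 1.5 units per frame
print(avatar)  # avatar is roughly 7.5 units along x, still walking
```

Because the step size is capped, a sudden teleport produces a visible walk across the intervening space, which is what makes the avatar read as a persistent character rather than a marker glued to the player.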
Developing the project was a very useful learning experience for getting a better understanding of the Unity networking module, and it demonstrated the many approaches new technologies allow, as well as the different ways technologies can be combined to create innovative projects. With additional training on the networking module updates in newer Unity versions, this project can be continued and improved.
This project is the result of a collaboration between the XRLab and the University of Westminster Law Department on Project Swift, which brings in law students from the Université de Lorraine. Students were divided into teams to meet each other in the virtual world, represented by avatars, in order to discuss two previously chosen legal cases. The platform used is the networked VR software Sansar. The laboratory made its equipment available, supporting and helping students to create avatars and a virtual space in which to connect. This experience can be repeated in the future, in the same form or using other software or technologies, and the project remains open to further experiments by law students or other departments.
MetaCampus Virtual Tour is an XRLab project that allows prospective, current and alumni students, as well as University of Westminster staff, to get a better view of what the university has to offer, to easily discover useful information about classrooms and laboratories, and to make full use of the services on site.
The project enables the user to visit the University of Westminster Cavendish campus via a browser or in virtual reality. The browser version uses WebGL, whereas the virtual reality version requires a VR headset. The tour uses point-to-point views, much like Google Maps, with UI elements inserted to provide useful information about the space being viewed. The project offers the virtual tour, with accurate and up-to-date information, to students who cannot attend open days. Future plans for MetaCampus include optimisation, a higher level of security and an enhanced user interface, together with the addition of other useful services and the creation of virtual tours for other campuses.
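A point-to-point tour of the kind described can be modelled as a small graph: each node is a viewpoint with links to its neighbours and an attached info element. The sketch below is purely illustrative (the room names and data layout are invented, not taken from the actual MetaCampus project):

```python
# Hypothetical data model for a point-to-point tour: each node is a viewpoint
# with links to neighbouring viewpoints and a UI info string.
tour = {
    "entrance": {"links": ["foyer"], "info": "Cavendish campus main entrance"},
    "foyer":    {"links": ["entrance", "lab"], "info": "Reception and lifts"},
    "lab":      {"links": ["foyer"], "info": "XR Lab"},
}

def reachable(tour, start):
    """Return every viewpoint reachable from `start` by following links."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(tour[node]["links"])
    return seen

print(sorted(reachable(tour, "entrance")))  # all three viewpoints connect
```

A check like `reachable` is useful while authoring a tour, since a viewpoint with no inbound link would otherwise be silently unreachable from the starting point.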
This project was created to demonstrate how to handle the Unity lighting system correctly in VR. We created a Unity scene of a house interior with a set of realistic light outputs that reflect reality as closely as possible, then produced a video demonstration of the final result and an accompanying workflow in which the whole process is explained step by step. Students can benefit from this project by learning from it and applying it in their own work. Beyond supporting students in the learning process, the project was also developed to find ways around the issues encountered when implementing lighting in virtual reality, and to improve the performance of our next applications.
This project was made by college students during a three-day work experience at the XRLab. Its concept is inspired by the existing VR game Beat Saber. As in that game, the player stands inside an arena where boxes come towards them, each with an HTML tag on it. The player's task is to recognise whether each box's tag is correct or not, and destroy it to earn points. HTMLSaber was created first of all to put into practice the knowledge gained during the work experience, but also to use VR to build an application from which the player can benefit by learning the basics of a coding language while having fun. In the future the project could be extended with multiple levels and difficulties, and with improved mechanics.
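The core judgement the game asks of the player, whether a box's HTML tag is correct, can be sketched as a pattern check plus a lookup against a whitelist of real tags. A hedged Python sketch (the tag list here is a small illustrative subset, not the game's actual data):

```python
import re

# Small illustrative subset of real HTML tags; a fuller game would cover more.
VALID_TAGS = {"a", "div", "span", "p", "h1", "ul", "li", "img", "body", "head"}

def is_valid_tag(label):
    """True if the label looks like <tag> or </tag> and the tag exists."""
    match = re.fullmatch(r"</?([a-z][a-z0-9]*)>", label)
    return bool(match) and match.group(1) in VALID_TAGS

# The game awards points for judging each box's label correctly.
print(is_valid_tag("<div>"))   # True: a real tag
print(is_valid_tag("<divv>"))  # False: misspelt
```

Labels that are malformed (missing brackets) or misspelt both fail the check, so the same function covers every kind of "incorrect" box the game might throw at the player.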
The SLoAS project was carried out in partnership with University of the Arts London. It set out to demonstrate the different forms in which an individual can be represented in the virtual world, in this case through augmented reality holograms layered over the real world.
The first phase of the project was to 3D scan each participant with a Kinect; the second was to motion capture actions performed by the participants in real time in the mocap space; the third was to bring the 3D models into MotionBuilder alongside the captured animations and combine the two; the last was to transfer everything into Unity to build an application that could overlay the models onto the real world using the Vuforia Engine. Two versions of the application were made: one using ground-plane recognition, broadcast through a projector, and a second using image-target recognition, placed at different locations around the room. Both versions were used during the UAL show.
Created using UNET, the networking system currently available in Unity, its primary purpose was to demonstrate to students several functions and features: virtual full-body representation with IK tracking of the head and hands, networked connection setup and online interactions. The project is fully featured, past the alpha stage, and now just requires additional testing and bug fixing in beta. Once the bugs have been fixed, the features will be expanded to include customisable avatars, a range of environments and in-application games to play with friends within these VR worlds.
Made by an XRLab technician as a fun, compact demonstration for students struggling to create certain features in Unity, the planet contains animated inhabitants, particle systems, simulated buoyancy, mathematically generated ocean waves and custom shaders crafted with ShaderGraph. Available for students to download and alter, the application's purpose is to demonstrate a variety of concepts and their implementation in the Unity engine. It will be developed further to demonstrate additional concepts on additional planets, such as advanced physics, dynamic lighting and procedural animation.
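"Mathematically generated ocean waves" with "simulated buoyancy" typically come down to sampling a sum of sine waves for the surface height and pushing a floating object upward in proportion to how far it sits below that surface. A hedged Python sketch of the idea (the wave parameters and force constant are made up for illustration, not taken from the actual project):

```python
import math

# Illustrative wave parameters: (amplitude, wavelength, speed, phase)
WAVES = [(0.5, 4.0, 1.0, 0.0), (0.2, 1.5, 2.0, 1.3)]

def ocean_height(x, t):
    """Surface height at position x and time t: a sum of travelling sine waves."""
    return sum(a * math.sin(2 * math.pi / l * (x - s * t) + p)
               for a, l, s, p in WAVES)

def buoyancy_force(y, x, t, strength=9.81):
    """Upward force proportional to how deep the object sits below the surface."""
    depth = ocean_height(x, t) - y
    return max(0.0, depth * strength)   # no upward force above the surface

print(buoyancy_force(-1.0, 0.0, 0.0))  # submerged object is pushed upward
```

Because the surface is a pure function of position and time, the buoyancy code can evaluate it at any point without touching the rendered mesh, which is why this approach stays cheap even with many floating objects.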
One example of the helpful projects the XRLab provides to students: a Unity project and workflow developed to teach students how to create their own gravity and physics. The standard physics in a Unity application is often poorly suited to interactions such as adding physics to augmented reality or building complex virtual reality physics. This led to the development of a project containing and demonstrating simple scripts and assets that should allow any student who uses it to achieve what they are after.
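Custom gravity of the kind this project teaches, for instance pulling objects towards a point rather than straight down, is usually just a per-frame acceleration along the normalised direction to the attractor, followed by an integration step. A minimal Python sketch using simple Euler integration (the names and constants are illustrative, not the project's actual scripts):

```python
import math

def gravity_step(position, velocity, centre, g=9.81, dt=0.02):
    """One physics step: accelerate towards `centre`, then integrate position."""
    direction = [c - p for p, c in zip(position, centre)]
    dist = math.sqrt(sum(d * d for d in direction))
    accel = [d / dist * g for d in direction]            # unit vector times g
    velocity = [v + a * dt for v, a in zip(velocity, accel)]
    position = [p + v * dt for p, v in zip(position, velocity)]
    return position, velocity

# An object above a planet at the origin falls towards it, whichever side it starts on.
pos, vel = [0.0, 10.0, 0.0], [0.0, 0.0, 0.0]
for _ in range(50):
    pos, vel = gravity_step(pos, vel, [0.0, 0.0, 0.0])
print(pos[1] < 10.0)  # True: the object has fallen towards the centre
```

In Unity the same effect is commonly achieved by disabling the Rigidbody's built-in gravity and applying the computed force each FixedUpdate; the sketch above shows only the underlying maths.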
This project explores how virtual reality can be delivered over the web by creating a networked environment where multiple users can interact with one another virtually. Its aim is to create a virtual environment that people can access directly through an HTML link, without installing any additional plugins or software, from a smartphone, tablet, PC or virtual reality headset. So far, basic interactions have been implemented, such as picking up objects or resizing them, with those actions synchronised in real time between all users.
The project is at an early stage, but many features could be added in future: audio/video communication, to allow effective, richer communication amongst users; hosting on a server so it can be accessed anywhere through a web link, as it currently runs only on a local server; and use as an educational platform, with simple tutorials that let students complete exercises in virtual reality, making lessons easier to learn and remember.
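Real-time synchronisation of the kind described, where pick-up and resize actions are mirrored to every connected user, typically reduces to broadcasting small state-update messages and applying them to each client's local copy of the scene. A hypothetical Python sketch of the apply-update step (the message format and object names are invented for illustration):

```python
import json

# Local copy of the shared scene: object id -> transform state.
scene = {"cube1": {"position": [0, 1, 0], "scale": 1.0, "held_by": None}}

def apply_update(scene, message):
    """Apply one JSON state-update message from the network to the local scene."""
    update = json.loads(message)
    obj = scene[update["id"]]
    for key in ("position", "scale", "held_by"):
        if key in update:
            obj[key] = update[key]

# Another user grabs the cube and resizes it; every client applies the same message.
apply_update(scene, '{"id": "cube1", "held_by": "user42", "scale": 2.0}')
print(scene["cube1"]["scale"])  # 2.0
```

Keeping messages partial, so only the fields that changed are sent, keeps the bandwidth low enough for phone and headset clients on the same session.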
This project explores the use of VRTK to create an environment that acts as a playground for students learning Unity, VR and VRTK. It uses various means of interaction to let users add, remove and modify components as seen in Unity, so that they gain a better understanding of how components work when applied in games. VR Editor (VRTK) helps not only students new to Unity or VR, but also developers, who can benefit from seeing programming concepts visualised concretely. Future plans are to make the representation of Unity components more accurate and user-friendly, and to implement challenges designed to stimulate users and help them understand components, for example: 'Create a vehicle' or 'Add/modify the components on this cube to enable interaction'.
Developed because recently hosted events and student projects needed to cover this topic, this project demonstrates how two augmented reality gameObjects can interact, how this can be set up and how they function. AR interactions are more complicated than regular ones, as your project must recognise when two AR objects are supposed to be interacting with each other, or are in a position to do so. Within Unity there are immediate problems with this, due to the way Unity handles augmented reality and the gameObjects in its space. This simple project helps students get started, or can be used as a base project to build on top of. One student, who is using augmented reality for his final-year project, has already used it as a starting point.
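The core difficulty mentioned above, knowing when two AR objects are "in a position" to interact, is often handled with a simple proximity test between the tracked objects' world positions. A minimal Python sketch of that test (the threshold is an illustrative value, not one from the actual project):

```python
import math

def can_interact(pos_a, pos_b, threshold=0.3):
    """True when two tracked AR objects are within `threshold` metres."""
    return math.dist(pos_a, pos_b) <= threshold

# Two tracked objects: the interaction triggers only once they are close enough.
print(can_interact([0.0, 0.0, 0.0], [1.0, 0.0, 0.0]))   # False: too far apart
print(can_interact([0.0, 0.0, 0.0], [0.2, 0.0, 0.0]))   # True: within range
```

A distance check like this is a common substitute for physics-engine collisions in AR, where tracked objects are repositioned by the tracking system every frame and ordinary collider events can be unreliable.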
Tasked by the Head of the School of Computer Science & Engineering, and given only two weeks to complete the task, the XRLab successfully submitted a redesign of the fifth floor to give it a more modern look and expanded functionality. These designs were also requested to be presented in a Unity application supporting both VR and non-VR. Both versions of the application were created and developed as the designs were altered and finalised, with the VR version eventually optimised for the Oculus Quest to allow a highly interactive, immersive demonstration on the go. Happy with the results, the Head of School has proceeded to show the designs to senior management, to create excitement and enthusiasm for the plans and hopefully bring them one step closer to reality.
The Projection Mapping Rig was created in collaboration with the Fabrication Lab.
It is used for exploring and presenting architectural proposals in an interactive and immersive fashion. A physical model of the proposed building and its surroundings is overlaid with animated cross-sections taken from 3D scan point-cloud data.
The XRLab at the University of Westminster is proud to support the Psychology department in their experiments using Virtual Reality.
Virtual avatars are used in the networked VR software VRChat.
PAL: Learn & Play is a project under development for the Microsoft HoloLens AR headset. It is an educational game in which the user interacts with a robot avatar who supports them in learning subjects such as mathematics, art and anatomy through mini-games. Controls are given through gestures and voice input.
Using the immersive and interactive technologies of virtual reality, users are placed in control of a cyclist and given the simple task of navigating a small section of a busy city. The dangers change based on several factors, such as the time of day or the weather, affecting aspects such as car behaviour or pedestrian numbers. The application monitors the user's actions, judging whether they travelled safely, before informing the user of any dangers or lapses in safety on their part, along with suggestions for improvement.