key: cord-0051580-6r37e6if
authors: Huang, Lihui; Taib, Siti Faatihah Binte Mohd; Aung, Ryan; Goh, Zhe An; Xu, Mengshan
title: Virtual reality research and development in NTU
date: 2020-10-14
journal: Virtual Reality & Intelligent Hardware
DOI: 10.1016/j.vrih.2020.06.002
sha: d4c8b87c6bac07baad3a60ed7f6ed38ef9be3542
doc_id: 51580
cord_uid: 6r37e6if

In 1981, the Nanyang Technological Institute was established in Singapore to train engineers and accountants to keep up with the country's fast-growing economy. In 1991, the institute was upgraded to Nanyang Technological University (NTU). NTU has been ranked the world's top young university for six consecutive years in the Quacquarelli Symonds (QS) world university rankings. Virtual reality (VR) research began at NTU in the late 1990s, and NTU's colleges, schools, institutes, and centers have all contributed to the excellence of its VR research. This article briefly describes the VR research directions and activities at NTU.

Located in western Singapore, Nanyang Technological University (NTU) comprises a 500-acre campus and is listed among the top 15 most beautiful university campuses in the world. In 2019, NTU [1] had 25,088 students (including 6,719 international students), 935 research-only staff, and 1,582 academic staff (including 853 international staff); furthermore, 6,443 undergraduate and 649 doctoral degrees were awarded, and 5,868 new undergraduate and 550 new doctoral students enrolled. As an international university, NTU brings together some 100 nationalities, and with more than 300 academic partners worldwide, international exchange students are ubiquitous on the green campus.

NTU comprises four colleges: the College of Business, the College of Engineering, the College of Science, and the College of Humanities, Arts, and Social Sciences. In addition, NTU houses the Lee Kong Chian School of Medicine (jointly established with Imperial College London), the Graduate College, and five autonomous institutes, including the National Institute of Education and the S. Rajaratnam School of International Studies. NTU also hosts a number of research centers of excellence, such as the Institute for Media Innovation, the Energy Research Institute @ NTU, the NTU Institute of Science and Technology for Humanity, the Nanyang Institute of Technology in Health and Medicine, the Singapore Centre for Environmental Life Sciences Engineering, the Nanyang Environment & Water Research Institute, and the Earth Observatory of Singapore. Furthermore, the university collaborates with major corporations such as Alibaba, Rolls-Royce, and Dyson to establish joint laboratories on campus that pursue research objectives relevant to society.

As a research-intensive university, NTU has contributed significantly to VR research, with a focus on fundamental work including, but not limited to, geometric modeling, image-based three-dimensional (3D) reconstruction, and digital geometry processing. Furthermore, NTU has partnered with various corporations to develop VR applications in biomedical sciences, engineering, and education. In this paper, Section 2 presents NTU's VR research and application development in biomedical sciences, Section 3 in engineering, and Section 4 in education. Section 5 focuses on NTU's VR research and development in the humanities and social sciences. Finally, Section 6 concludes the article. Over the years, NTU has developed a number of VR projects.
Basic research was conducted first and then applied to various applications. In this section, we discuss the VR research conducted in the Strategic Research Program of VR and Soft Computing (srp VR & SC) and in the School of Mechanical & Aerospace Engineering in collaboration with local and international corporations.

The Strategic Research Program of VR and Soft Computing was established at NTU in the late 1990s to promote education, research, and industrial collaboration in VR and artificial intelligence (AI). Several VR projects pertaining to biomedical science have been developed through the program. In collaboration with Gleneagles Hospital in Singapore [2], a project was conducted to design and develop a VR-enhanced system for cardiac intervention using patient-specific tagged magnetic resonance imaging (MRI) data. The srp VR & SC program focuses on several areas, including geometric modeling, digital geometry processing, and image processing, and several algorithms have been developed for 3D progressive reconstruction [3], cardiac motion vector extraction from tagged MRI data [4], and geometry-based interaction modeling [5]. Figure 1 shows the VR-enhanced simulation system for intracardiac intervention using tagged MRI data [6]. The simulator was designed to train medical students and junior surgeons to perform minimally invasive cardiac intervention. For example, the red-white and blue-white striped wires (Figure 1b) simulate catheters, which are inserted into the heart to search for a location at which to inject stem cells. The injection site cannot be determined easily, however, because the heart beats and the heart wall moves continuously; the aim is to find a non-slip contact for the injection.

The School of Mechanical & Aerospace Engineering at NTU (Singapore) and the School of Medicine at the University of Toronto (Canada), particularly Opas's Lab of Medicine and Pathology [7], have a long-term partnership in developing innovative solutions for the interactive visualization and quantification of cellular images. Apart from joint PhD student training, the two parties have collaborated on both fundamental and applied problems associated with cellular structures. Through this joint effort, several algorithms have been designed and developed for a better understanding of volumetric cellular structures, with a focus on 3D boundary extraction and cell clustering from laser fluorescent confocal microscopic image stacks [8-10] and other cellular image processing techniques [11, 12]. Building on the algorithms and techniques mentioned above, a VR-enabled system, CellStudio, was developed for the visualization and analysis of volumetric cellular image data from 3D confocal microscopy [13]. The system grew out of NTU's earlier VR efforts; its customized setup was based on a CRT display, an interactive stylus, a pair of active shutter glasses, and an emitter, as shown in Figure 2.
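The boundary-extraction and cluster-splitting algorithms themselves are described in [8-10]. Purely as a simplified illustration of this style of volumetric processing, and not of the published methods, the sketch below thresholds the 3D gradient magnitude of a confocal stack and labels the resulting connected regions; the smoothing scale and percentile cut-off are assumed values.

```python
# Illustrative sketch only: gradient-magnitude-based boundary detection on a
# 3D confocal image stack. This is not the algorithm of refs [8-10]; the
# smoothing scale and percentile threshold below are assumptions.
import numpy as np
from scipy.ndimage import gaussian_gradient_magnitude, label

def extract_boundaries(stack: np.ndarray, sigma: float = 1.5,
                       percentile: float = 90.0) -> np.ndarray:
    """Return a boolean mask of candidate cell-boundary voxels."""
    # Smooth and differentiate in one step; sigma sets the spatial scale.
    grad = gaussian_gradient_magnitude(stack.astype(np.float32), sigma=sigma)
    # Keep only the strongest gradients (assumed percentile cut-off).
    return grad >= np.percentile(grad, percentile)

def split_clusters(mask: np.ndarray):
    """Label connected components as a crude stand-in for cell-cluster splitting."""
    labels, count = label(mask)
    return labels, count

if __name__ == "__main__":
    # Synthetic 64 x 256 x 256 stack standing in for a confocal acquisition.
    stack = np.random.default_rng(0).random((64, 256, 256))
    labels, count = split_clusters(extract_boundaries(stack))
    print(f"{count} connected boundary regions found")
```

A real pipeline would, among other refinements, select the gradient threshold automatically and account for the anisotropic axial resolution of confocal stacks.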
NTU and the Institute of Mental Health [14] in Singapore collaborated to create a game-based learning application for children with autism spectrum disorder (ASD) to train executive function skills such as planning and sorting. As computer-assisted learning has proven effective for children with ASD [15], game-assisted learning could prove effective for their executive function training as well. An iPad application comprising three different games was developed for children aged nine to twelve years (Primary 3 to Primary 6) to train their planning and sorting skills. Each game is set in a different location (home, supermarket, or school) and has ten levels that increase in difficulty as the player progresses. The home and supermarket games train planning skills, whereas the school game trains sorting skills. The supermarket game, in particular, is designed to improve route planning, as it requires players to collect the items on a shopping list over the shortest possible distance (Figure 3a) [16]. Furthermore, each game contains three parent-component levels so that parents can participate in their children's development by having them carry out the corresponding tasks in real life, for example, by searching for the items on a parent-customized shopping list in a physical supermarket. As shown in Figure 3a [16], while the players are playing the games, data are collected, pushed, and stored in real time using the Firebase Realtime Database, a cloud-hosted database. The data include each player's performance (number of stars received), the time required for completion, and the errors made in each game level, allowing the players' progress to be measured without explicitly imposing a test setting on them.

NTU and PEC Limited [17] in Singapore have collaborated to develop an innovative solution for heavy crane lifting under two research collaboration agreements. In this industrial collaboration project, heavy lifting in highly complex industrial environments was investigated. Safety and productivity are the two main concerns in lifting tasks, particularly when large and heavy loads are involved. Traditionally, lift planning is performed manually, which is error prone and time consuming. The goal of the research was to develop an automatic and intelligent path-planning system that generates, in real time or near real time, safe and optimized lifting paths for highly complex environments using AI and VR technology. Innovative path-planning solutions that exploit the advantages of GPU programming [18] have been developed [19-22].

A simulator (Figure 4a) was also designed [23] for the vocational training of heavy crane operations. Tower cranes, mobile cranes, and crawler cranes were digitally modeled and rigged. Building information models (BIMs) [24], plant design management systems, or discretized environments captured by a laser scanner in the form of point clouds [25] serve as input to the lifting simulator [26]. Trainees can use interactive devices, such as a joystick and a steering wheel, to mimic the control of cranes, and they can immerse themselves in a virtual industrial environment through stereoscopic visualization. Safety warnings are triggered if the lifted load is about to collide with the environment. Moreover, the operation steps performed by trainees during a session can be recorded by the crane simulator for debriefing purposes, and an optimal lifting path (Figure 4b) is used as a reference to evaluate the lifting operations during training with respect to safety and productivity.
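The planners cited above [19-22] are GPU-parallel, genetic-algorithm-based methods and are well beyond a short example. Purely to illustrate the underlying idea, searching a discretized site model for a collision-free lifting path that keeps a safety margin, the following sketch runs a breadth-first search over a 3D occupancy grid such as one voxelized from a point cloud. The grid resolution, 6-connectivity, and clearance value are assumptions and not parameters of the actual system.

```python
# Minimal illustration of collision-aware path search on a voxelized site model.
# The planners in [19-22] use GPU-parallel genetic algorithms and model crane
# kinematics; the 6-connected grid and voxel clearance here are assumptions.
from collections import deque
import numpy as np
from scipy.ndimage import binary_dilation

def plan_lift_path(occupied: np.ndarray, start, goal, clearance_voxels: int = 2):
    """Breadth-first search for a shortest collision-free sequence of grid cells."""
    # Inflate obstacles so the path keeps a safety margin around the load.
    inflated = binary_dilation(occupied, iterations=clearance_voxels)
    if inflated[start] or inflated[goal]:
        return None
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        for dx, dy, dz in moves:
            nxt = (cell[0] + dx, cell[1] + dy, cell[2] + dz)
            if all(0 <= nxt[i] < occupied.shape[i] for i in range(3)) \
                    and not inflated[nxt] and nxt not in parent:
                parent[nxt] = cell
                queue.append(nxt)
    return None  # no safe path at this resolution
```

A real lift planner would additionally encode crane kinematics (slewing, luffing, and hoisting), multi-crane coordination, and objectives such as lift time rather than simple path length.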
The SJ-NTU Corporate Laboratory is a joint venture between NTU and Surbana Jurong Consultants Private Limited [27] in Singapore, with funding support from Singapore's National Research Foundation. One project under the corporate laboratory aims to develop intelligent solutions for reconstructing BIM representations from multimodal images (Figure 5). The fundamental research in this new industry project includes innovations in AI algorithms, point-cloud-based segmentation and classification, digital geometry processing, and 3D reconstruction, and its goal is a pipeline for the automatic and intelligent recognition and reconstruction of BIMs from multimodal images. Multimodal images of the environment will be captured using smart scanning techniques, including LiDAR and photogrammetry. Image-fusion algorithms will be investigated, after which building components such as walls and windows, as well as mechanical and electrical pipes, will be recognized automatically or semi-automatically to form their corresponding BIM representation by leveraging the latest AI techniques. VR technology is seamlessly integrated into the workflow for the interactive visualization and modification of the digital geometry and 3D BIM models.

In a project sponsored by the National Research Foundation, the School of Computer Science and Engineering [28] collaborated with GovTech [29] to investigate LiDAR-based geometry processing and machine learning for the virtualization and semantic enrichment of 3D city models. CityGML, used for all models in Virtual Singapore, is an open data model in an XML-based format for the storage and exchange of virtual 3D city models. It is an application schema for the Geography Markup Language version 3.1.1 (GML3) and an extensible international standard for spatial data exchange [30] issued by the Open Geospatial Consortium and ISO TC211. CityGML was developed to establish a common definition of the basic entities, attributes, and relations of a 3D city model.

Virtual Singapore [31] is a large-scale city model of Singapore that enables the public and private sectors to develop solutions for business and research applications. It makes the maintenance of 3D city models cost effective and sustainable by allowing the same data to be reused in different application fields. Owing to its rich and dynamic data environment, Virtual Singapore provides a collaborative platform for virtual experimentation and test-bedding, research and development, simulation, planning, and decision-making, with applications in construction, building maintenance, infrastructure and resource management, urban planning, and beyond.
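As a small, version-agnostic illustration of what a CityGML document looks like to software, and not of the Virtual Singapore or GovTech toolchain, the following sketch lists the building objects in a CityGML file together with their gml:id attributes; the input file name is a placeholder.

```python
# Toy CityGML reader: lists building objects and their gml:id attributes.
# Illustrative only; production pipelines use dedicated CityGML libraries and
# streaming parsers for city-scale files.
import xml.etree.ElementTree as ET

GML_ID = "{http://www.opengis.net/gml}id"  # gml:id attribute (GML 3.1.1 namespace)

def list_buildings(citygml_path: str):
    """Yield (tag, gml:id) for every element whose local name is 'Building'."""
    tree = ET.parse(citygml_path)
    for elem in tree.iter():
        # Strip the namespace so the sketch works across CityGML versions.
        local_name = elem.tag.rsplit("}", 1)[-1]
        if local_name == "Building":
            yield elem.tag, elem.attrib.get(GML_ID, "<no id>")

if __name__ == "__main__":
    for tag, gml_id in list_buildings("city_model.gml"):  # placeholder file name
        print(gml_id, tag)
```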
Fraunhofer Singapore [32] was jointly founded by the Fraunhofer-Gesellschaft and Fraunhofer IGD in Europe together with NTU, with funding support from the National Research Foundation under Singapore's RIE2020 initiative. Fraunhofer Singapore is hosted by NTU's School of Computer Science and Engineering in partnership with Technische Universitaet Darmstadt (Germany) and Graz University of Technology (Austria), and it focuses on four research areas: VR and augmented reality (AR), intuitive human-machine interaction, digital content generation, and 3D reconstruction. In particular, AR/VR and mixed-reality technologies have been investigated to develop solutions for smart industrial engineering, maintenance, and industrial applications in Industry 4.0 and beyond. Fraunhofer Singapore and the School of Civil and Environmental Engineering [33] have also collaborated to investigate the psychophysiological evaluation of seafarers to improve training in a maritime virtual simulator [34].

The first CAVE was invented by Cruz-Neira et al. in the Electronic Visualization Laboratory at the University of Illinois at Chicago [35]. Two CAVEs were installed in Singapore, at the Science Centre [36] and at the Institute of High Performance Computing (IHPC) [37], in the late 1990s and early 2000s. In the early 2000s, the School of Mechanical and Aerospace Engineering offered a Year 4 course on visualization and VR for product design (course code: M498), with the objective of teaching product design using visualization and VR technology [38]. The IHPC hosted course M498 students for CAVE walkthroughs (Figure 6a) of a virtual hostel that they had designed in the course (Figure 6b).

The immersive room at the Institute for Media Innovation (IMI) is a 320° circular-projection VR environment with five high-end active projectors. Each projector is mirror-projected onto a panel of the circular screen, and the overlapping area of two neighboring projectors is edge-blended to form a smooth transition [40]. A virtual dolphinarium was designed in this environment for children with autism, enabling them to interact with virtual pink dolphins to improve their communication skills [40]. Figure 7 shows a child with autism playing with the virtual pink dolphins in the immersive room. This project [41, 42] is a collaboration between the IMI, Underwater World Singapore [43], the AWWA Special School [44] in Singapore, the Suzhou Industrial Park Renai Special School [45] in China, as well as Utrecht University [46] and Windesheim University of Applied Sciences [47] in the Netherlands.

VR is also an excellent tool for public education. Three interactive VR games developed at NTU for public education about viruses are presented below; their development drew on fundamental research [48, 49] as well as protein surface and structure modeling [50]. The first is an interactive VR game designed at NTU for SARS virus education (Figure 8a). It was exhibited in Gallery #10 of the Singapore Art Museum [51] from September 2003 to October 2004 [52]. The purpose of the exhibition was to promote the convergence of art, science, and technology, with a protein roller coaster serving as an interactive medium to help the public, particularly young students, better understand the primary, secondary, and tertiary structures of SARS virus proteins. The second game, a virus theme park, was exhibited at the Tower [53] in May 2005 (Figure 8b); it featured 12 viruses, including HIV and SARS [54], and the week-long public exhibition attracted over 70,000 visitors. The third is an interactive VR serious game designed at NTU for learning protein secondary structures through X gaming [55]. Several virus proteins were modeled in the form of the Great Wall of China to represent the protein backbone structure (Figure 8c), and players ride a motorbike along the Great Wall, performing X-game actions such as 360° spins, to learn protein amino acid sequences as well as protein secondary and tertiary structures.
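The article does not describe how the exhibit geometry for these games was constructed. Purely as a hypothetical illustration of one ingredient such a protein ride could start from, the sketch below extracts the C-alpha backbone trace of one chain from a PDB file as an ordered polyline along which a track could be swept; the input file name and chain choice are placeholders.

```python
# Hypothetical helper: read the C-alpha backbone of one chain from a PDB file
# as an ordered polyline (the kind of curve a protein "ride" track could follow).
# Not the pipeline used for the exhibits described above.
from typing import List, Tuple

def backbone_trace(pdb_path: str, chain: str = "A") -> List[Tuple[float, float, float]]:
    """Return the ordered C-alpha coordinates of the requested chain."""
    points = []
    with open(pdb_path) as handle:
        for line in handle:
            # Fixed-column PDB format: atom name in columns 13-16, chain ID in column 22.
            if (line.startswith("ATOM") and line[12:16].strip() == "CA"
                    and line[21] == chain):
                x = float(line[30:38])
                y = float(line[38:46])
                z = float(line[46:54])
                points.append((x, y, z))
    return points

if __name__ == "__main__":
    trace = backbone_trace("virus_protein.pdb")  # placeholder input file
    print(f"{len(trace)} residues on the ride track")
```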
The Hive (Figure 9a) is a new student learning hub on the NTU campus that supports technology-enhanced learning. Of the 56 classrooms in the Hive, only one is a traditional lecture theater; the remainder are designed for team-based and inquiry-based learning. Figure 9b shows a flipped classroom in the Hive. The classroom is designed to host approximately 30 students for team-based learning, and each team of up to six students is supported by a VR-ready system. All six VR systems in the classroom can operate either independently or in a synchronized manner. The VR flipped classroom is an immersive and interactive learning environment [56] with integrated hardware, software, pedagogy, and content. The hardware at each table includes a high-end VR-ready computer, a head-mounted display, a stereoscopic TV display, and other interactive devices; the software comprises system integration and GPU-accelerated optimization. Content is created to support curriculum-based learning in engineering, and an in-depth learning pedagogy is being developed for inquiry-based and team-based learning. Currently, the School of Mechanical and Aerospace Engineering [57], the School of Chemical and Biological Engineering [58], the Centre for IT Services [59], and the National Institute of Education [60] are collaborating to develop VR technology-enhanced learning for NTU students.

In addition to biomedical sciences, engineering, and education, VR is useful in many other fields, and researchers at NTU have investigated VR applications in the humanities, social sciences, and heritage. With support from the Singapore Tourism Promotion Board, Madam Snake White, Jiang Taigong, the Smiling Buddha, the Eight Immortals, the Ten Courts of Hell, and other exhibits at Haw Par Villa were digitized using a laser scanner, 3D reconstruction, and 3D printing before an AR application for heritage education was produced. Laser scanning (Figure 10a) was applied to digitize the identified items in the villa. Subsequently, 3D mapping was conducted to reconstruct digital 3D models (Figure 10b) from the captured point clouds through point-cloud data fusion, segmentation, and related processes [61]. From the digitized 3D models, 3D prints were created and then used as triggers for AR applications. The applications developed enable users to learn about the culture and heritage of Haw Par Villa (Figure 10c). Part of this work was published in a special issue of the journal Presence [62].

At the School of Art, Design, and Media [63], heritage visualization and potential speculative reconstructions in digital space were investigated through a case study of the medieval Church of St. Anne in Famagusta, Cyprus. The findings were published in DISEGNARECON, in a special issue on advanced technologies for historical city visualization [64].

The Centre for Augmented and Virtual Reality [65] was jointly established by the College of Science and the College of Engineering in partnership with EON Reality [66] for education, training, and research. The center is equipped with the latest AR/VR technology and leading infrastructure for course development and training. A walkthrough application of NTU's Yunnan Garden was developed at the center based on an efficient resource-scheduling scheme for the out-of-core dynamic streaming of a 3D scene: the entire scene is stored in the cloud, and the relevant scene data are streamed to a client's Android mobile device in real time.
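The resource-scheduling scheme itself is not detailed in this article. As a rough sketch of the general idea behind out-of-core streaming, prioritizing scene tiles by their distance to the viewer and evicting the least recently used tiles under a fixed memory budget, consider the following; the tile granularity, budget, and distance-only scoring rule are assumptions rather than the scheme actually used for the Yunnan Garden walkthrough.

```python
# Rough sketch of distance-prioritized, budgeted streaming of scene tiles.
# The tile size, memory budget, and scoring rule are assumptions, not the
# scheme used in the Yunnan Garden application.
import heapq
import math
from collections import OrderedDict

class TileStreamer:
    def __init__(self, tile_centers, max_resident_tiles=64):
        self.tile_centers = tile_centers      # {tile_id: (x, y, z) world position}
        self.max_resident = max_resident_tiles
        self.resident = OrderedDict()         # tile_id -> loaded data, in LRU order

    def _distance(self, viewer_pos, tile_id):
        return math.dist(self.tile_centers[tile_id], viewer_pos)  # nearer = higher priority

    def update(self, viewer_pos, fetch):
        """Keep the nearest tiles resident; `fetch(tile_id)` stands in for the cloud download."""
        wanted = heapq.nsmallest(self.max_resident, self.tile_centers,
                                 key=lambda t: self._distance(viewer_pos, t))
        for tile_id in wanted:
            if tile_id in self.resident:
                self.resident.move_to_end(tile_id)        # mark as recently used
            else:
                self.resident[tile_id] = fetch(tile_id)   # stream the missing tile
            while len(self.resident) > self.max_resident:
                self.resident.popitem(last=False)         # evict least recently used
        return list(self.resident)
```

In a real client, the download would be asynchronous and the distance-only priority would be refined with visibility and level-of-detail tests.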
In the early 2000s, a VR project was carried out with the School of Computer Science and Engineering to develop a virtual NTU campus [67]. The VR model of the campus was developed based on MultiGen-Paradigm and Vega as well as 3D web visualization technology, with VR walkthrough applications built on the Virtual Reality Modeling Language (VRML).

The BeingThere Centre at the Institute for Media Innovation [39] is a three-party international collaboration involving NTU, ETH Zurich, and the University of North Carolina at Chapel Hill. The three active research areas at the center are virtual humans, social robots, and telepresence. After the completion of its first phase, the BeingThere Centre was renamed the BeingTogether Centre [68]. Substantial effort has been devoted to modeling multi-party interactions among virtual characters, social robots, and humans [69, 70]; in particular, the social humanoid robot Nadine, developed at the IMI, has attracted significant interest worldwide [71]. Telepresence is a major research topic at both the BeingThere and BeingTogether Centres: immersive 3D telepresence [72], auto-focus AR eyeglasses for both real-world and virtual imagery [73], and extended depth-of-field volumetric near-eye AR displays [74] have been developed. Owing to COVID-19, virtual humans and social robots are expected to play a more active role in virtual meetings and telepresence.

At the School of Social Sciences [75] within the College of Humanities, Arts, and Social Sciences, researchers are investigating visual perception and neuroscience through a multi-disciplinary approach combining psychophysics, electrophysiology, eye tracking, and VR [76]. In addition, computational models of hierarchical information processing have been developed. At the Wee Kim Wee School of Communication and Information [77] within the same college, game-based social interactions are studied with a focus on cyber well-being and cyber wellness, as well as the effects of digital games on adolescents' social and psychological development [78]. This research will have potential applications in the post-pandemic era, when working from home becomes the new norm.

VR can also be used for virtual try-on and fashion simulation. A system [79] developed by the School of Computer Science and Engineering and the IMI comprises data extraction, animated body adaptation, and garment pre-positioning and simulation. A Kinect sensor is used to capture a customer's data through its RGB, depth, and motion sensors. To generate a customized 3D body from a template model, a statistical analysis method was proposed to estimate anthropometric measurements from the partial information extracted from the Kinect sensor. A constrained Laplacian-based deformation algorithm is then applied to deform the template model to match the estimated measurements, before shape refinement is carried out based on contours. The fast-growing trend of online shopping calls for such VR applications [80]: fundamental research on modeling, visualization, and animation can be used to customize fashion designs, and Kinect-based virtual try-on simulations can support online fashion shopping. Ideally, this research should help the fashion industry provide appropriate customization services to end users.
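The constrained Laplacian-based deformation used in the try-on system is described in [79]. As a greatly simplified, generic illustration of the same family of techniques, with uniform Laplacian weights, soft positional constraints, and a least-squares solve all being assumptions rather than the published formulation, the following sketch deforms a mesh so that a few anchor vertices reach target positions (e.g., derived from anthropometric measurements) while local surface detail is approximately preserved.

```python
# Simplified soft-constrained Laplacian deformation; not the formulation of [79].
# Assumes uniform (umbrella) Laplacian weights and that every vertex has neighbors.
import numpy as np
from scipy.sparse import lil_matrix, vstack
from scipy.sparse.linalg import lsqr

def laplacian_deform(vertices, neighbors, anchors, anchor_weight=10.0):
    """vertices: (n, 3) array; neighbors: list of neighbor-index lists per vertex;
    anchors: dict {vertex_index: target (x, y, z)} of soft positional constraints."""
    n = len(vertices)
    L = lil_matrix((n, n))
    for i, nbrs in enumerate(neighbors):
        L[i, i] = 1.0
        for j in nbrs:
            L[i, j] = -1.0 / len(nbrs)
    # Differential coordinates of the template: the surface detail to preserve.
    delta = np.asarray(L @ np.asarray(vertices, dtype=float))
    # Soft positional constraints appended as extra weighted rows.
    C = lil_matrix((len(anchors), n))
    targets = np.zeros((len(anchors), 3))
    for row, (idx, pos) in enumerate(anchors.items()):
        C[row, idx] = anchor_weight
        targets[row] = anchor_weight * np.asarray(pos, dtype=float)
    A = vstack([L.tocsr(), C.tocsr()])
    rhs = np.vstack([delta, targets])
    # Least-squares solve, one spatial coordinate at a time.
    return np.column_stack([lsqr(A, rhs[:, k])[0] for k in range(3)])
```

In the actual system, the constraints would come from the estimated anthropometric measurements, and the result would be further refined using body contours as described above.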
This article has provided an overview of the VR research and development performed at NTU. Various VR projects developed in different schools, research centers, laboratories, and institutes were presented, with emphasis placed on fundamental research and application development in VR across the NTU campus in four areas: biomedical sciences; engineering; education; and the humanities, social sciences, and heritage. As VR technology is evolving rapidly, it was not possible to include all related research activities at NTU herein.

The authors declare no competing financial interest.

References

Progressive surface reconstruction for heart mapping procedure
A B-spline approach to phase unwrapping in tagged cardiac MRI for motion tracking
A geometric approach to the modeling of the catheter-heart interaction for VR simulation of intra-cardiac intervention
A VR simulator for intracardiac intervention
Opas's Lab of Medicine and Pathology
An automatic method for identifying appropriate gradient magnitude for 3D boundary detection of confocal image stacks
3D boundary extraction of confocal cellular images using higher order statistics
An automatic segmentation algorithm for 3D cell cluster splitting using volumetric confocal images
Adaptive-weighted cubic B-spline using lookup tables for fast and efficient axial resampling of 3D confocal microscopy images
Adaptive correction technique for 3D reconstruction of fluorescence microscopy images
A VR enhanced collaborative system for 3D confocal microscopic image processing and visualization
Use of technology in interventions for children with autism
Supermarket route-planning game: a serious game for the rehabilitation of planning executive function of children with ASD
CUDA by Example: An Introduction to General-Purpose GPU Programming
Parallel genetic algorithm based automatic path planning for crane lifting in complex environments
Automatic path planning for dual-crane lifting in complex environments using a prioritized multiobjective PGA
Accurate and efficient approximation of clothoids using Bézier curves for path planning
Automatic re-planning of lifting paths for robotized tower cranes in dynamic BIM environments. Automation in Construction
Simulation-enabled vocational training for heavy crane operations
Building information modeling (BIM) for existing buildings: literature review and future needs
Point cloud representation
Point cloud based path planning for tower crane lifting
Psychophysiological evaluation of seafarers to improve training in maritime virtual simulator
The CAVE: audio visual experience automatic virtual environment
Institute for Media Innovation
Design and development of a virtual dolphinarium for children with autism
Simulations and serious games for education
Simulation and serious games for education
Triangular Bézier sub-surfaces on a triangular Bézier surface
Interpolation over arbitrary topology meshes using a two-phase subdivision scheme
Kernel modeling for molecular surfaces using a uniform solution
Proteins, immersive games and music
Immersive protein gaming for bio edutainment
Bio-edutainment: learning life science through X gaming
3D immersive and interactive learning
Madam Snake White: a case study on virtual reality continuum applications for Singaporean culture and heritage at Haw Par Villa
Special issue on VR for culture and heritage: the experience of cultural heritage with virtual reality (part II): guest editors' introduction
Heritage visualization and potential speculative reconstructions in digital space: the medieval Church of St. Anne
Centre for Augmented and Virtual Reality
Nanyang Technological University virtual campus virtual reality project
Modelling multi-party interactions among virtual characters, robots, and humans. Presence: Teleoperators and Virtual Environments
A user study of a humanoid robot as a social mediator for two-person conversations
A social robot that can localize objects and grasp them in a human way
FocusAR: auto-focus augmented reality eyeglasses for both real world and virtual imagery
An extended depth-of-field volumetric near-eye augmented reality display
Ensemble statistics shape face adaptation and the cheerleader effect
Wee Kim Wee School of Communication and Information
Relating video game exposure, sensation seeking, aggression and socioeconomic factors to school performance
Cloth simulation and virtual try-on with Kinect based on human body adaptation
Modeling and simulating bodies and garments