Abstract
The Metaverse is a new medium that effectively fuses digital and physical worlds into a seamless experience. Metaverse platforms are limited in their current scope of applications and require further exploration to expand their capabilities beyond social networking. In this paper, we present a system framework that aims to realize the concept of the industrial Metaverse for engineering product demonstration using extended reality (XR) technologies. The framework incrementally integrates virtual and real environments throughout three phases with a focus on transitioning from business to engineering aspects. A virtual reality (VR) exhibition hall, an augmented reality (AR) catalog, and a mixed reality (MR) on-site assessment tool have been developed to support technical marketing in each phase, respectively. Test results show that these XR applications enable real-time interaction for potential customers to evaluate industrial coolers without geographic and temporal restrictions. The presented framework may serve as a foundation for the implementation of the industrial Metaverse in the post-pandemic era.
1 Introduction
The concept of the Metaverse has gained significant attention in both the business and technology sectors since 2021 [1,2]. The Metaverse is a digital realm where people and computer-generated elements interact in real time within a virtual space or a real environment. This technology offers users novel and immersive experiences with applications in various areas, including entertainment, gaming, and social networking. The Metaverse breaks down physical barriers and temporal limitations by enabling users to explore 3D spaces in ways that go beyond traditional web browsing and real-life interactions [3]. The recent COVID-19 pandemic has been a driving force behind the rise of such revolutionary technologies. Due to the widespread impact of social distancing and travel restrictions, both individuals and companies are actively seeking innovative methods for remote interaction and networking in the digital space. The Metaverse has emerged as one of the early attempts to address this need.
Despite its potential, the Metaverse is still at an early stage of technical development and deployment. Currently, the practical applications of the Metaverse are mainly limited to social networking and remote conferencing. Yao et al. [4] proposed a conceptual framework that advocates the idea of ‘Industrial Internet + Metaverse = Industrial Metaverse’. In this framework, networks and the industrial internet of things (IIoT) will continue to develop and result in a seamless integration of the real and virtual worlds. Lee and Kundu [5] presented a conceptual framework of the industrial Metaverse designed for remote machine maintenance in the manufacturing industry. An ongoing project on health monitoring of a ball screw was used to highlight the challenges that may arise in implementing the industrial Metaverse in manufacturing. Deployment of the Metaverse in manufacturing or other industries is currently in its early stages due to a lack of successful use cases, limited understanding, and slow adoption [6].
The implementation of a Metaverse often involves the use of extended reality (XR) technologies, which encompass virtual reality (VR), augmented reality (AR), mixed reality (MR), and other immersive technologies. Fig. 1 illustrates the virtuality-reality continuum, which represents the progression from a virtual world to a real environment. XR technologies allow for real-time user interaction through a variety of sensory modalities, including visual, auditory, haptic, and their combinations. This interaction creates an immersive environment that mimics the real world along the virtuality-reality continuum [7,8]. In this continuum, VR offers a fully immersive experience, allowing users to interact with a computer-generated 3D virtual environment. AR augments the user’s perception and understanding of their environment by overlaying virtual content onto real-world scenes, thereby enhancing human situation awareness. MR represents an advancement beyond AR by integrating physical reality and digital content, allowing for real-time interaction and communication between both real and virtual elements. VR/AR/MR technologies offer different levels of immersion, ranging from high immersion achieved through specialized devices like head-mounted displays (HMDs) and 3D projection [9], to lower levels of presence experienced when using computer screens and smartphones. In XR applications, user interfaces incorporate a variety of input devices, ranging from familiar options like keyboards, mice, and touch screens, to more advanced functionalities such as voice commands, eye-tracking, and gesture recognition [7,8,10].
XR technologies provide an effective solution for technical marketing and product evaluation, as they utilize ubiquitous digital media to engage users, create immersive experiences, and facilitate interaction with products. A previous study [11] examined how exposure to online information (stimulus) during the initial phase of the COVID-19 pandemic influenced and reinforced purchase intent (response). Marketing activities should strive to stimulate purchase intentions and capture customers’ interest. Product demonstration is considered a powerful marketing strategy that can enhance customer purchase intentions, especially for engineering products or technical services [12–14]. Empirical studies have shown that the use of VR/AR/MR can provide both hedonic values, such as an enjoyable shopping experience, and utilitarian values, such as product knowledge [15,16]. However, there is a lack of solutions for how the Metaverse can support the demonstration of complex industrial products using XR technologies.
This study presents a system framework for conducting industrial product demonstrations in a Metaverse to fulfill this requirement. The framework comprises three phases that incrementally merge virtual and real environments along the virtuality-reality continuum using XR technologies, advancing customers’ understanding from business to engineering perspectives. The Metaverse thus constructed provides a new communication medium for demonstrating complex engineering products, enabling potential customers to remotely access and evaluate their technical details. A VR exhibition hall, an AR catalog, and an MR on-site assessment tool have been developed to support the technical marketing of industrial coolers in each respective phase. The VR exhibition hall provides virtual showrooms, 3D product models, and instant communication with sales representatives for customers to explore in a virtual environment. The AR catalog shows how each product is composed of individual components in an exploded view when a paper brochure is scanned with a smartphone. The MR assessment tool enables potential customers to download 3D product models and place them in a real environment to evaluate potential interference with other existing objects. The framework can be further extended by incorporating NFTs (non-fungible tokens) from the business perspective and digital twins from the engineering perspective. Test results from these XR applications highlight the potential of the proposed framework to realize the Metaverse in the industrial sector, especially for demonstrating complex products. This work suggests the possibility of extending the concept of the Metaverse beyond social networking and into engineering fields.
2 Industrial Metaverse Framework
The main objective of this paper is to introduce a system framework for technical marketing of industrial products in the Metaverse. The framework consists of three phases that progressively integrate virtual and physical environments using various XR technologies to increase customers’ understanding of products and their purchase intention. Figure 2 illustrates the suggested framework and the XR applications employed in each phase: a VR exhibition hall, an AR catalog, and an MR product layout. These applications provide product information progressively from marketing to engineering perspectives across the virtuality-reality continuum.
The first phase of the Metaverse is a 3D VR showroom that replicates a real-world exhibition hall and presents a range of industrial coolers as a product family. Users can access the VR showroom via a URL in a browser on a computer or mobile device that supports WebGL and JavaScript. Alternatively, potential customers can obtain the application from the SteamVR community and, wearing VR goggles, explore the exhibition space in an immersive manner by virtually walking through it. Either way, they gain general information about the product company, including its history, mission statement, technological advancements, and services. In addition, they can view multimedia presentations that display 3D models, functional capabilities, and potential applications of each product by clicking links embedded in the space. The showcased 3D models are converted from CAD software and imported as meshes into the VR application, which is developed using C# in the Unity 3D engine. User verification is not required for the VR showroom, and this open design may increase its exposure to the public.
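As a rough illustration of how such embedded links could be wired up in Unity, the sketch below shows a clickable booth element that either plays a multimedia presentation or opens an external page. The class and field names are hypothetical and do not come from the actual implementation.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.Video;

// Hypothetical sketch: a clickable booth element that either plays an
// embedded multimedia presentation or opens an external web page
// (e.g., a product page or the AR catalog download link).
public class BoothInfoLink : MonoBehaviour, IPointerClickHandler
{
    [SerializeField] private string targetUrl;          // external link (assumed)
    [SerializeField] private VideoPlayer presentation;  // optional multimedia clip

    public void OnPointerClick(PointerEventData eventData)
    {
        if (presentation != null)
        {
            // Toggle the embedded multimedia presentation.
            if (presentation.isPlaying) presentation.Pause();
            else presentation.Play();
        }
        else if (!string.IsNullOrEmpty(targetUrl))
        {
            // Open the linked content in the browser (a new tab in WebGL builds).
            Application.OpenURL(targetUrl);
        }
    }
}
```

In a scene, such a component would sit on a booth object with a collider, together with Unity's standard event system, so that a click or a VR controller ray can trigger it.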
In the second phase, an AR application offers a 3D interactive product catalog. This catalog is intended for customers who have expressed interest in a product and want to learn about its technical details. The 3D catalog can be accessed in two ways: by clicking on the corresponding URL link for each product model in the VR exhibition space or by scanning the product image on a traditional product flyer using a smartphone. This phase emphasizes real-time interactivity with each product model and provides detailed specifications for technical assessment. The AR application is developed using C# in Unity and performs image scanning and model recognition on a product flyer using Vuforia after a proper training process. It integrates the Lean Touch package in Unity, enabling users to manipulate 3D product models directly through finger touch on mobile devices. The exploded view feature shows how each model is assembled from its components in 3D space through an animation supported by the Post Processing and DOTween packages. A pop-up window also displays detailed information about each product model and its components. The AR application runs on iOS 14.0 and higher.
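Purely as a sketch of how the exploded-view animation could be driven with DOTween, the code below moves each component of a model from its assembled position to an offset position and back. The component list, offsets, and class name are illustrative assumptions rather than the catalog's actual data.

```csharp
using DG.Tweening;
using UnityEngine;

// Hypothetical sketch: animate an exploded view by displacing each
// component of the cooler model from its assembled position.
public class ExplodedViewAnimator : MonoBehaviour
{
    [SerializeField] private Transform[] parts;        // child parts of the cooler model
    [SerializeField] private Vector3[] explodeOffsets; // per-part displacement in local space
    [SerializeField] private float duration = 1.5f;

    private Vector3[] assembledPositions;

    private void Awake()
    {
        // Remember the assembled layout so the animation can be reversed.
        assembledPositions = new Vector3[parts.Length];
        for (int i = 0; i < parts.Length; i++)
            assembledPositions[i] = parts[i].localPosition;
    }

    public void Explode()
    {
        Sequence seq = DOTween.Sequence();
        for (int i = 0; i < parts.Length; i++)
            seq.Join(parts[i].DOLocalMove(
                assembledPositions[i] + explodeOffsets[i], duration).SetEase(Ease.OutCubic));
    }

    public void Assemble()
    {
        Sequence seq = DOTween.Sequence();
        for (int i = 0; i < parts.Length; i++)
            seq.Join(parts[i].DOLocalMove(assembledPositions[i], duration).SetEase(Ease.InCubic));
    }
}
```

In such a design, the catalog's play button would call Explode(), and Assemble() would restore the assembled layout when the view is closed.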
The final phase of the Metaverse involves an MR application for product layout in a real-world environment. The application is accessible only to serious buyers who wish to ensure the optimal spatial arrangement of a specific product on their shop floor. One essential function of the application is to detect collisions between the chosen product model and existing objects in the real environment in real time during the layout process. Note that an industrial cooler is specifically designed to reduce the temperature of manufacturing equipment that generates excessive heat during use, such as machine tools and surface mounting technology (SMT) machines. In many cases, it is necessary to adjust the position of the cooler in relation to the heat source to achieve an optimal spatial arrangement. One way to achieve a suitable arrangement is by utilizing collision detection between the virtual cooler and existing real-world objects. The MR application visually represents the machine’s final appearance, including its actual dimensions, such as height, width, and length. It is developed using Swift in Xcode (version 12.4) and utilizes Apple’s ARKit v4 for depth sensing and 3D scene construction from the real environment. The user interface is designed with SwiftUI components. The application requires a LiDAR sensor, which is available on the iPad Pro v4 or later, running iOS 14.0 or higher.
Recent advancements in Ethereum’s protocol offer the ability to prove ownership of NFTs in cyberspace. NFTs are unique and can act as cryptographic proof of ownership of physical or digital assets. NFTs have been successfully applied to prove IP ownership of 3D models in Industry 4.0 [17]. Following this idea, purchase transactions of the products displayed in the VR hall can be automatically processed in an NFT marketplace, extending the framework from the business perspective. NFTs are implemented on the Ethereum blockchain through smart contracts, which authenticate the ownership of a product model and enable the transfer of ownership. On the other hand, incorporating digital twin (DT) technology into the framework represents an important extension from the engineering perspective, as it allows for a more precise evaluation of a product’s performance in real use [18]. A digital twin model can instantly simulate the heat transfer process involved in an industrial cooler operating in a real environment. This capability allows for real-time adjustment of the installation position of the cooler with respect to the machine it is intended to cool down. As a result, customers can select the specific cooler that best fulfills their needs before installing it on-site.
3 Implementation Results
3.1 VR Exhibition Hall
A traditional way of demonstrating industrial products is through tradeshows, exhibitions, or similar marketing events. Due to the COVID-19 pandemic, travel and face-to-face meetings were heavily restricted worldwide. Transforming physical activities into virtual but realistic forms has proven to be an effective response to this situation. XR and other immersive technologies are crucial in developing new approaches for technical marketing and product demonstration with minimal geographic and temporal restrictions during and after the pandemic. The following sections present the outcomes of implementing the proposed framework in this regard.
The VR application provides users with a highly realistic 3D exhibition hall that they can freely explore in a panoramic view. As shown in Fig. 3, a common use scenario is for users to wear a VR headset and immerse themselves in walking through the hall while discovering the content offered by each exhibition booth. Users can zoom in and out of the virtual space by clicking on the view button (as seen in Fig. 3) for an enlarged or reduced view of the information presented. For example, Fig. 4 shows the company’s milestones related to product development. Additionally, multimedia data is embedded into several clickable links within the virtual scene, which comprehensively introduce the company’s background, organization, and technical capabilities.
Users are allowed to examine each cooler model on display in detail, observing it from various angles and distances. They can click on two buttons associated with each model, “play” and “information”, to access general product information and engage with interactive technical features. Fig. 5 presents how the cooler’s synchronous dual-temperature control system functions in an animation. This feature enables potential customers to understand the capability of monitoring both room and liquid temperatures while the cooler works with a machine tool. Learning the practical use and working principle of a complex product in this manner can create a better user experience.
3.2 AR Product Catalog
Once customers become interested in a product, they typically want to collect its technical details before making a purchase decision. Information about a product has traditionally been provided through means such as product flyers, websites, or emails. However, these methods tend to be limited in interactivity and effectiveness. XR technologies serve as a novel communication medium that addresses these problems by improving information sharing. The proposed AR catalog is an example of this approach. The catalog can be accessed by scanning the machine image on a product flyer using a mobile AR application that must be downloaded and installed on a handheld device. The download URL is embedded in the VR exhibition hall. The AR catalog provides high flexibility for updating product information without the need to change the link.
The AR catalog provides four major functions for users: (1) scanning and tracking machine images, (2) manipulating 3D product models by rotating, translating, and scaling, (3) displaying the assembly process of product models through animation, and (4) presenting the technical specifications of machines. These functions are developed in response to user feedback and in alignment with the product company’s marketing strategy. After selecting a target machine, the AR application displays a message “READY TO SCAN” (Fig. 6(a)), instructing the user to focus their mobile device’s camera on the corresponding machine image on the product flyer. A green focus box then appears to indicate that image scanning and tracking are ready. The machine’s name also appears to confirm the subsequent information retrieval (Fig. 6(b)), which brings up a 3D product model above the image on the product flyer. The user interface of the AR catalog is shown in Fig. 7.
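The sketch below illustrates one way the scan feedback described above could be driven from Vuforia's target-status callback (Vuforia 10-style API). The UI object names are hypothetical and only indicate where the prompt, focus box, and 3D model would be toggled.

```csharp
using UnityEngine;
using Vuforia;

// Hypothetical sketch (Vuforia 10-style API): show the scan prompt, the
// focus box, and the 3D product model depending on whether the trained
// flyer image is currently being tracked.
public class ScanStatusIndicator : MonoBehaviour
{
    [SerializeField] private ObserverBehaviour imageTarget; // trained flyer image
    [SerializeField] private GameObject scanPrompt;         // "READY TO SCAN" message
    [SerializeField] private GameObject focusBox;           // green focus box
    [SerializeField] private GameObject productModel;       // 3D cooler model

    private void OnEnable()  { imageTarget.OnTargetStatusChanged += HandleStatusChanged; }
    private void OnDisable() { imageTarget.OnTargetStatusChanged -= HandleStatusChanged; }

    private void HandleStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;

        scanPrompt.SetActive(!tracked);   // keep prompting until the image is found
        focusBox.SetActive(tracked);      // confirm scanning/tracking is ready
        productModel.SetActive(tracked);  // bring up the 3D model above the flyer

        if (tracked)
            Debug.Log($"Recognized target: {behaviour.TargetName}");
    }
}
```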
After clicking on the rotate button, the product model quickly rotates 45 degrees clockwise, as shown in Fig. 8(a). Scaling up and down is achieved by dragging two fingers on the screen, which corresponds to zooming in and out, respectively (see Fig. 8(b)). Clicking on the play button starts an animation that shows the assembly process of the product from individual components in an exploded view. The animation dynamically reveals the product’s interior details, as shown in Fig. 8(c). The information button displays the technical specifications of the machine, including dimensions, weight, cooling capabilities, power consumption, and installation procedure (see Fig. 8(d)). The user can close the catalog by clicking on the exit button.
The AR catalog underwent a series of tests to identify the ideal use conditions, which are affected by environmental factors such as lighting, focus distance, and viewing angle. The test results showed that the optimal conditions for the image scanning and recognition functions are obtained when the mobile device’s camera is positioned 50 centimeters away from the machine image on the product flyer, viewed at a 45-degree angle from the top front side, and placed under spot or directional lighting. The tests also disclosed some limitations. First, the object recognition function is trained on a specific image for each product model using Vuforia. Therefore, any change to the target image may affect the stability of the recognition results. Second, the AR catalog can only recognize and scan one image at a time; in a scene containing multiple machine images, only one will trigger recognition. The AR application is developed for iOS and is compatible with the iPhone 12 and iPad 4 or newer models.
3.3 MR Assessment Tool
Companies typically evaluate the technical specifications and compatibility of an engineering product with their work environment before making purchasing decisions. In the traditional approach, the product manufacturer may prefer to dispatch field engineers or sales representatives to conduct on-site assessments and hold discussions with the buyers. Due to the recent pandemic, physical visits have become nearly impossible, especially when they involve traveling to other countries. In addition, it is necessary to have tools available on-site to assess whether a new product operates correctly with the existing equipment on the shop floor. For auxiliary or peripheral machines such as industrial coolers, the outcome of this evaluation often plays a significant role in determining the final purchasing decision. The proposed MR assessment tool is specifically designed to fulfill such needs.
The assessment tool, implemented as a mobile app, accurately superimposes a selected cooler model onto a real environment. The model is a highly realistic 3D representation of the actual product with identical appearance and dimensions. By freely placing the 3D model in space, users can determine if there are any potential collisions with existing real objects at a given location. Real-time collision detection is applied to the model as it moves in the real environment. Users receive a visual cue to continuously adjust the model’s position as the collision location is highlighted with a bright color in the MR scene. This functional feature provides instant feedback that enhances human spatial reasoning during product installation on the shop floor.
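Although the assessment tool itself is implemented in Swift with ARKit (Section 2), the following Unity-style C# sketch, kept in the same language as the earlier examples, illustrates the kind of bounding-box collision check and color highlighting described here. All names are hypothetical, and the actual app's logic may differ.

```csharp
using UnityEngine;

// Illustrative sketch only: the MR tool is written in Swift/ARKit. Here,
// meshes reconstructed from the real environment are assumed to carry
// colliders, and the placed cooler model is tinted while its bounding box
// overlaps any of them.
public class CollisionHighlighter : MonoBehaviour
{
    [SerializeField] private Renderer[] modelRenderers;     // renderers of the cooler model
    [SerializeField] private Collider[] environmentMeshes;  // colliders from scene reconstruction
    [SerializeField] private Color collisionColor = Color.red;

    private Color[] originalColors;
    private Collider modelCollider;

    private void Start()
    {
        modelCollider = GetComponent<Collider>();
        originalColors = new Color[modelRenderers.Length];
        for (int i = 0; i < modelRenderers.Length; i++)
            originalColors[i] = modelRenderers[i].material.color;
    }

    private void Update()
    {
        // Axis-aligned bounding-box test against every reconstructed surface.
        bool colliding = false;
        foreach (Collider env in environmentMeshes)
        {
            if (modelCollider.bounds.Intersects(env.bounds))
            {
                colliding = true;
                break;
            }
        }

        // Visual cue: tint the model while it overlaps a real object.
        for (int i = 0; i < modelRenderers.Length; i++)
            modelRenderers[i].material.color = colliding ? collisionColor : originalColors[i];
    }
}
```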
The MR assessment tool enables users to choose from different models for collision detection through the home menu (see Fig. 9(a)). Once a particular machine is selected, a gesture-based guide appears to provide instructions on how to operate the function (see Fig. 9(b)). Fig. 9(c) shows that users can interactively adjust the position of the machine in the physical environment by simply tapping with their fingers. The adjustment may continue until a collision-free location is found, as shown in Fig. 10. The MR application is designed to run on mobile devices with iOS 14.0 or newer. According to preliminary test results, users are recommended to hold the device within a range of 0.5 to 1.5 meters from real objects in indoor environments. The application may not perform effectively in outdoor environments because bright lighting conditions can cause the depth sensing to malfunction. Note that real-time interaction between the user and virtual machines supported by digital twins demands substantial computing resources. Two solutions are available to meet this requirement on the shop floor: 5G networks provide access to superior computing power residing in the cloud, or edge/fog computing devices can be installed on-site to enable IoT operations.
4 Conclusion
The COVID-19 pandemic has significantly impacted the global manufacturing industry. Besides the disruption to shop floor activities, traditional marketing and product promotion have become challenging due to travel restrictions. Companies are thus exploring innovative methods to overcome these limitations and maintain competitiveness. The Metaverse presents a novel approach to social networking and human interactions, which could potentially address this challenge. This study presented a system framework that enables the implementation of the industrial Metaverse for technical marketing and engineering product demonstrations, using industrial coolers as a case study. The framework was designed to incrementally integrate virtual and real environments using XR technologies across three phases that transition from business to engineering aspects.

In the first phase, a VR exhibition hall creates an immersive environment for customers to view product models. Customers can navigate the virtual hall freely and obtain interactive business information about both the products and the manufacturer. In the second phase, an AR catalog allows existing customers to access up-to-date technical specifications of individual products by scanning product images on a flyer. 3D product models appear in a real scene, and an animation simulates the assembly process of a product from its components. The third phase of the framework involves the use of an MR assessment tool to assist users in determining whether a product model can fit into an existing work environment with real objects. Users can move the model freely within the environment, receiving instant feedback on collision detection for arranging the installation location on the shop floor. Several use cases were presented to highlight how the three XR applications assist potential customers in conducting technical assessments remotely.

Purchase transactions for the products displayed in the VR hall can be automatically processed through an NFT marketplace. On the other hand, a digital twin model allows for a more precise evaluation of a product’s performance in real use. This study represents one of the earliest attempts to implement an industrial Metaverse for engineering product demonstrations. It could potentially expand the concept of the Metaverse from its current focus on social networking to applications in engineering fields. In future work, it would be important to verify the extension of the NFT marketplace in the business context. Realizing the idea of optimizing product performance through digital twins is also worth considering for further expansion of the industrial Metaverse.
Declarations
Funding This work was financially supported by the Ministry of Science and Technology of Taiwan under grant No. 109-2221-E-007-064-MY3.

References
1. Kim, J. (2021). Advertising in the metaverse: Research agenda. Journal of Interactive Advertising, 21(3), 141–144.
2. Park, S.M., & Kim, Y.G. (2022). A metaverse: taxonomy, components, applications, and open challenges. IEEE Access, 10, 4209–4251.
3. Choi, S., Yoon, K., Kim, M., Yoo, J., Lee, B., Song, I. & Woo, J. (2022). Building Korean DMZ Metaverse Using a Web-Based Metaverse Platform. Applied Sciences, 12(15), 7908.
4. Yao, X., Ma, N., Zhang, J., Wang, K., Yang, E. & Faccio, M. (2022). Enhancing wisdom manufacturing as industrial metaverse for industry and society 5.0. Journal of Intelligent Manufacturing, 1–21.
5. Lee, J., & Kundu, P. (2022). Integrated cyber-physical systems and industrial metaverse for remote manufacturing. Manufacturing Letters, 34, 12–15.
6. Zhao, Y., Jiang, J., Chen, Y., Liu, R., Yang, Y., Xue, X. & Chen, S. (2022). Metaverse: Perspectives from graphics, interactions and visualization. Visual Informatics.
7. Suh, A., & Prophet, J. (2018). The state of immersive technology research: A literature analysis. Computers in Human Behavior, 86, 77–90.
8. Santoso, H.B., Baroroh, D.K. & Darmawan, A. (2021). Future Application of Multisensory Mixed Reality in the Human Cyber-Physical System. South African Journal of Industrial Engineering, 32(4), 44–56.
9. Carmigniani, J., Furht, B., Anisetti, M., Ceravolo, P., Damiani, E. & Ivkovic, M. (2011). Augmented reality technologies, systems and applications. Multimedia Tools and Applications, 51, 341–377.
10. Papadopoulos, T., Evangelidis, K., Kaskalis, T.H., Evangelidis, G. & Sylaiou, S. (2021). Interactions in Augmented and Mixed Reality: An Overview. Applied Sciences, 11(18), 8752.
11. Laato, S., Islam, A.N., Farooq, A. & Dhir, A. (2020). Unusual purchasing behavior during the early stages of the COVID-19 pandemic: The stimulus-organism-response approach. Journal of Retailing and Consumer Services, 57, 102224.
12. Chang, H.J., Eckman, M. & Yan, R.N. (2011). Application of the Stimulus-Organism-Response model to the retail environment: the role of hedonic motivation in impulse buying behavior. The International Review of Retail, Distribution and Consumer Research, 21(3), 233–249.
13. Chen, C.C., & Yao, J.Y. (2018). What drives impulse buying behaviors in a mobile auction? The perspective of the Stimulus-Organism-Response model. Telematics and Informatics, 35(5), 1249–1262.
14. Moghaddasi, M., Marín-Morales, J., Khatri, J., Guixeres, J., Chicchi Giglioli, I.A. & Alcañiz, M. (2021). Recognition of customers’ impulsivity from behavioral patterns in virtual reality. Applied Sciences, 11(10), 4399.
15. Choi, U., & Choi, B. (2020). The effect of augmented reality on consumer learning for search and experience products in mobile commerce. Cyberpsychology, Behavior, and Social Networking, 23(11), 800–805.
16. Moriuchi, E., Landers, V.M., Colton, D. & Hair, N. (2021). Engagement with chatbots versus augmented reality interactive technology in e-commerce. Journal of Strategic Marketing, 29(5), 375–389.
Biography
Chih-Hsing Chu is a Professor in the Department of Industrial Engineering and Engineering Management, National Tsing Hua University, Taiwan. He received his B.S. and M.S. degrees in Mechanical Engineering from National Taiwan University and earned his Ph.D. degree in Mechanical Engineering from the University of California at Berkeley, USA. Dr. Chu has been involved in the editorial work of several international journals: IEEE Transactions on Automation Science and Engineering, International Journal of Computer Integrated Manufacturing, International Journal of Precision Engineering and Manufacturing, International Journal of Precision Engineering and Manufacturing: Green Technology, Journal of Industrial and Production Engineering, and ASME Journal of Computing and Information Science in Engineering. He has received numerous research awards, including the Best Paper Award from the ASME International Design Engineering Technical Conferences, the Best Paper Award from the IEEE International Conference on Industrial Engineering and Engineering Management, and the Highly Recommended Paper Award from the International Conference on Manufacturing Automation. His research interests include smart manufacturing, augmented reality, CAD/CAM, and interactive design. He is a member of IEEE, ASME, and SME.
Biography
Dawi Karomati Baroroh received her B.S. and M.S. degrees in Industrial Engineering from Universitas Gadjah Mada, Indonesia. Currently, she is a Ph.D. candidate in the Department of Industrial Engineering and Engineering Management at National Tsing Hua University, Taiwan. She is also an assistant professor in the Department of Mechanical and Industrial Engineering at Universitas Gadjah Mada, Indonesia. Her research interests are optimization, manufacturing processes and systems, and augmented reality.
Biography
Jie-Ke Pan is a Ph.D. candidate in the Department of Industrial Engineering and Engineering Management, National Tsing Hua University, Taiwan. He received his B.S. degree in Industrial Engineering from National Tsing Hua University, Taiwan. His research interests include augmented reality, virtual reality, interdisciplinary integration, and interactive design. He received the Honorable Mention Paper Award at the 20th IEEE International Conference on Electronic Commerce.
Biography
Shau-Min Chen received his B.S. and M.S. degrees in Industrial Engineering and Engineering Management from National Tsing Hua University, Taiwan. His research interests include augmented reality applications in industrial fields. He is also a co-inventor of the patent “Method of identifying flange specification based on augmented reality interface” (Chu, C.H., Lee, M.H., Chen, Y.R., & Chen, S.M., US Patent 17/218,838, 2022).