
AN INTERACTIVE AUGMENTED REALITY FURNITURE CUSTOMIZATION SYSTEM

Tzu-Chien Young (楊子權)1, Shana Smith (陳湘鳳)2, Chih-Kai Yang (楊智凱)3, Chiou-Shann Fuh (傅楸善)4

Department of Mechanical Engineering, National Taiwan University

Taipei, Taiwan

1,2 ssmith@ntu.edu.tw

3 r04522627@ntu.edu.tw

4 fuh@csie.ntu.edu.tw

ABSTRACT

The Internet is making e-commerce a primary means by which shoppers purchase new products. However, shopping on the Internet has some disadvantages and provides a limited user experience. Shopping for furniture is especially difficult: furniture products are generally large and heavy. As a result, users cannot view and change the materials and dimensions of real furniture products within the context of their real environments.

Previous augmented reality furniture display systems do not give users the ability to view virtual furniture models with accurate positions, orientations, and dimensions, and without occlusion errors.

This paper presents an augmented reality furniture customization system. A Kinect is used to track human body motions. Occlusions between real and virtual objects at different depths are handled to increase the realism of the system. A calibration algorithm is developed to match the Kinect's depth, infrared (IR), and RGB image information, improving the image quality in the augmented reality environment.

Furniture customization functions are provided to create an improved user experience. The system gives users the ability to view and change the dimensions and materials of three-dimensional virtual furniture models, within the context of real environments and with correct occlusions.

User-test results show that users consider the augmented reality furniture customization system more realistic and natural to use than previous furniture display systems.

Keywords: Augmented reality, occlusion, furniture display.

1. INTRODUCTION

Augmented reality (AR) technology combines virtual objects with a real scene captured from a camera, using computer graphics rendering and camera-pose estimation. Applications of augmented reality are varied. For example, Carozza et al. (2014) [4] use augmented reality in urban planning, providing visualizations of city views to support the design process. To improve the realism of augmented reality, virtual and real objects must be placed in the right positions with the right front-to-back relationships. However, virtual objects are usually drawn in the top layer of the image, which confuses users. This occlusion problem can mislead users' perception and cause decision failures, so solving it has become an important research topic. For example, Hayashi et al. (2005) [12] store the transformation matrix and combine it with a stereo vision system to find the depth of moving objects. Fortin and Hebert (2006) [10] fix the experimental scene and camera to find object depths and then resolve occlusions.

Interaction in augmented reality is communication between the user and virtual objects. An interactive system can provide information and assistance to the user. Seo and Lee (2013) [24] track the position of the hand, interacting with virtual objects when the user clicks a plain board. Radkowski and Stritzke (2012) [23] use a Kinect device to track the user's body and control a virtual cursor, which the user employs for assembly/disassembly training.

Occlusion handling improves the realism of augmented reality, and interaction provides helpful information to the user. Both are needed, so this study combines occlusion handling and interaction. Our method lets the user interact with virtual objects directly through a natural user interface.

This study uses the Kinect v2.0, developed by Microsoft, to obtain depth information about the real world. We calibrate the Kinect with a two-step method so that the depth image and the RGB image are aligned. Our method uses the Z-buffer to solve the occlusion problem: we overwrite the Z-buffer with the calibrated depth image and compare depth values as each virtual object is drawn. If the virtual object is nearer than the real object, it is drawn; otherwise nothing is drawn.

This study aims to make interaction in the augmented reality system intuitive and direct, meaning that the user can interact with virtual objects directly through his or her body. To achieve this goal, this study proposes a 3-D virtual object drawing process with a Kinect-based tracking technique. Our method transforms the user's skeleton coordinates into the model coordinate system; once the user and the virtual objects share the same coordinate system, they can interact with each other.
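To make this coordinate unification concrete, the following is a minimal Python/numpy sketch, with assumed matrix and function names rather than the system's actual code: a joint reported by the Kinect in its camera-space skeleton frame is mapped into the marker/model frame by inverting the marker pose.

```python
import numpy as np

def skeleton_to_model(joint_camera, T_marker_to_camera):
    """Map a Kinect joint position (camera frame) into the marker/model frame.

    joint_camera: (3,) joint position in the Kinect camera frame (assumed units).
    T_marker_to_camera: 4x4 marker pose from the tracker (assumed name).
    """
    T_camera_to_marker = np.linalg.inv(T_marker_to_camera)   # invert the marker pose
    homo = np.append(joint_camera, 1.0)                      # homogeneous coordinates
    return (T_camera_to_marker @ homo)[:3]                    # joint in model coordinates
```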

This study builds an augmented reality furniture display system to verify our methods and conducts user tests to assess users' actual impressions.

2. METHOD

2.1. Calibration for Kinect

Although the Kinect is good at providing depth images, there are deviations between the depth image and the color image (Fig. 1). Because of the different angles and locations of the infrared camera and the color camera, the Kinect should be calibrated before these images are used. The calibration includes calibrating the infrared and color cameras, and a translation correction between the depth image and the infrared image. The process by which an object is displayed on the monitor is shown in Fig. 2: an object in world coordinates is projected into camera coordinates by the extrinsic camera parameters, and then projected into pixel coordinates by the intrinsic parameters.
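As an illustration of this projection pipeline, the following Python/numpy sketch (with assumed intrinsic and extrinsic values, not the Kinect's actual parameters) maps a world-coordinate point to camera coordinates with [R | t] and then to pixel coordinates with the intrinsic matrix K.

```python
import numpy as np

K = np.array([[525.0,   0.0, 319.5],    # assumed focal lengths and principal point
              [  0.0, 525.0, 239.5],
              [  0.0,   0.0,   1.0]])

R = np.eye(3)                           # extrinsic rotation (camera aligned with world here)
t = np.array([0.0, 0.0, 1.5])           # extrinsic translation: camera 1.5 m from the origin

def project(point_world):
    """Project a 3-D world point to 2-D pixel coordinates."""
    point_cam = R @ point_world + t     # world -> camera coordinates (extrinsics)
    uvw = K @ point_cam                 # camera -> homogeneous pixel coordinates (intrinsics)
    return uvw[:2] / uvw[2]             # perspective division

print(project(np.array([0.1, -0.05, 0.0])))   # e.g. one corner of the marker board
```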

Fig. 1. Deviation in the superposition of the RGB and depth images.

Fig. 2. Coordinates system (Collins, 2007 Computer Vision Course) [5].

Stereo calibration. Kinect camera calibration (stereo calibration) estimates the extrinsic parameters between the color camera and the infrared camera, i.e., the rotation and translation relating the two cameras. This paper uses the intrinsic parameters of the color and infrared cameras, together with image pairs of a calibration board shot simultaneously, to perform stereo calibration. The calculation is done with the Camera Calibration Toolbox for Matlab (Bouguet, 2013) [15].
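For readers working outside Matlab, an equivalent stereo-calibration step could be sketched with OpenCV as below; the variable names (obj_points, ir_points, rgb_points, K_ir, and so on) are assumptions for illustration, not the paper's actual code.

```python
import cv2

def stereo_calibrate(obj_points, ir_points, rgb_points,
                     K_ir, dist_ir, K_rgb, dist_rgb, image_size):
    """Estimate rotation R and translation T from the IR (depth) camera to the color camera.

    obj_points / ir_points / rgb_points: lists of calibration-board corners, one entry per
    simultaneously captured image pair; K_* / dist_*: intrinsics from per-camera calibration.
    """
    flags = cv2.CALIB_FIX_INTRINSIC            # keep the already-estimated intrinsics fixed
    ret, _, _, _, _, R, T, E, F = cv2.stereoCalibrate(
        obj_points, ir_points, rgb_points,
        K_ir, dist_ir, K_rgb, dist_rgb,
        image_size, flags=flags)
    return R, T                                # pose of the IR camera in the color camera frame
```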

Fig. 3. Flow chart for stereo calibration

Translation correction between the infrared image and the depth image. The correction in this paper is performed with a home-made star calibration board and an ellipse-fitting system; the process is shown in Fig. 4. With depth images and infrared images shot simultaneously, the system detects ellipse edges and finds corresponding points to compute the least-squares solution of the translation deviation (Fig. 5). Fig. 6 shows the difference between the calibrated and uncalibrated images. The method's performance meets our expectations and works effectively.
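A minimal sketch of the least-squares step, assuming the corresponding points have already been extracted from the ellipse fits (the arrays below are made-up values): for a pure-translation model, the least-squares shift is simply the mean of the point-wise differences.

```python
import numpy as np

def translation_ls(pts_depth, pts_ir):
    """Least-squares translation aligning depth-image points to infrared-image points.

    pts_depth, pts_ir: (N, 2) arrays of corresponding pixel coordinates.
    """
    d = pts_ir - pts_depth            # residuals under a pure-translation model
    return d.mean(axis=0)             # least-squares estimate of (dx, dy)

pts_depth = np.array([[102.0,  84.0], [230.5,  86.2], [101.8, 201.4]])   # hypothetical points
pts_ir    = np.array([[104.1,  80.9], [232.7,  83.0], [103.9, 198.2]])
print(translation_ls(pts_depth, pts_ir))   # approximately [ 2.13, -3.17 ] pixels
```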

Fig. 4. Flow chart for translation correction

Fig. 5. Ellipse corresponding points.

Fig. 6. Result before and after calibration.

2.2 Augmented Reality and Occlusions

Augmented reality. AR is an interactive environment that combines a real environment with virtual objects. The key to constructing the AR environment is the relationship between the camera coordinate system and the world coordinate system. This paper uses the NyARToolkit API, an open-source toolkit based on marker tracking, as the development toolkit. The process of the augmented reality development is shown in Fig. 7. This paper sets the size of the virtual object to 160 mm and removes half of its length for easier observation. As Fig. 8 shows, the length of the black marker is 160 mm, which confirms that the virtual object is set to the correct size.
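The following Python/numpy sketch (illustrative only, not the NyARToolkit API) shows how a 4x4 marker pose places a virtual object modelled at marker scale, e.g. a 160 mm square base, into camera coordinates with the correct size and orientation.

```python
import numpy as np

def transform(T_marker, vertices_mm):
    """Apply a 4x4 pose to (N, 3) model vertices given in marker coordinates (mm)."""
    homo = np.hstack([vertices_mm, np.ones((len(vertices_mm), 1))])   # homogeneous coords
    return (T_marker @ homo.T).T[:, :3]                               # camera-coordinate vertices

# Example: an assumed pose 600 mm in front of the camera, with no rotation.
T_marker = np.eye(4)
T_marker[2, 3] = 600.0
base = np.array([[0, 0, 0], [160, 0, 0], [160, 160, 0], [0, 160, 0]], dtype=float)
print(transform(T_marker, base))
```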

Fig. 7. Flow chart for AR development


Fig. 8. Verification of the size of the virtual object against the marker.

Occlusions. The Z-buffer is used in our method to solve the occlusion problem. As the left side of Fig. 9 shows, the Z-buffer is a memory that stores the nearest depth value pixel by pixel. We use the calibrated depth image to overwrite the Z-buffer, so that the Z-buffer represents the real world's depth. When virtual objects are about to be drawn, their depths are compared against the Z-buffer. Common Z-buffer resolutions are 8-bit, 16-bit, 24-bit, and 32-bit, indicating how many bits are used to store each pixel's depth; the higher the resolution, the better the accuracy. In this paper, we use a 32-bit Z-buffer and DirectX 9.0c to solve the occlusion problem, and the result is effective (Fig. 9, right).
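The occlusion test can be summarized by the following per-pixel sketch, written in plain numpy as a stand-in for the DirectX depth test; the function and variable names are assumptions for illustration.

```python
import numpy as np

def compose(rgb_frame, depth_real_mm, virtual_rgb, virtual_depth_mm):
    """Composite a rendered virtual layer over the camera image with occlusion handling.

    rgb_frame: (H, W, 3) camera image; depth_real_mm: (H, W) calibrated real-world depth;
    virtual_rgb / virtual_depth_mm: rendered virtual layer and its depth
    (np.inf where no virtual fragment exists).
    """
    z_buffer = depth_real_mm.astype(float)          # seed the Z-buffer with real depth
    visible = virtual_depth_mm < z_buffer           # per-pixel depth test
    out = rgb_frame.copy()
    out[visible] = virtual_rgb[visible]             # draw only unoccluded virtual fragments
    return out
```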

Fig. 9. Z-buffer schematic diagram (a) and application diagram (b).

2.3 Interaction with Augmented Reality

This study provides a set of processes (Fig. 10) that place the user and the virtual objects in a unified coordinate system. Through a cube trigger that fires events, together with Kinect body tracking, users interact with the virtual objects directly. To increase stability while the user controls the virtual objects, we add a larger cube, 200 mm beyond the trigger cube, as a dividing boundary. When the user's hands enter the trigger cube, an interaction event starts; the event ends only after the user's hands leave the dividing cube. As Fig. 10 shows, this paper provides four kinds of basic interaction: resize, translation, rotation, and animation. These interactions are triggered with both hands. With Kinect body tracking, we obtain the positions of both hands to control the virtual objects. As Fig. 11 shows, the virtual object is first created on the marker board; when the user's hands approach closely enough, a transparent red area appears to remind the user of the interaction.
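A minimal sketch of the trigger logic, with assumed cube sizes (only the 200 mm margin comes from the text; the 80 mm half-size is illustrative): a hand position already expressed in model coordinates is tested against the trigger cube, and an active event ends only once the hand leaves the larger dividing cube.

```python
import numpy as np

def in_cube(p, center, half_size):
    """True if point p lies inside an axis-aligned cube of the given half-size."""
    return bool(np.all(np.abs(p - center) <= half_size))

def update_state(active, hand_model_mm, cube_center, cube_half=80.0, margin=200.0):
    """Return the new interaction state for one tracked hand (model coordinates, mm)."""
    if not active:
        return in_cube(hand_model_mm, cube_center, cube_half)          # event starts on entry
    return in_cube(hand_model_mm, cube_center, cube_half + margin)     # event ends on exit
```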

Fig. 10. Interaction process.

Fig. 11. User interacts with virtual objects.


3. EXPERIMENTS

Experiments were performed on the augmented reality furniture display system, constructed with open-source furniture models from TF3DM (http://tf3dm.com/3d-models/furniture) to make the interaction realistic. Two user experiments were conducted with 12 men and 11 women; the experimental environment is shown in Fig. 12. Users stand beside the marker and face the table. The screen shows the augmented reality furniture system and the interaction between the users and the virtual objects at the same time.

Fig. 12. User experimental environment

Experiment design and results. The first experiment estimated the difference between the augmented reality furniture system and the IKEA virtual reality system (Fig. 13). First, we explained both systems to the users and let them place certain pieces of furniture wherever they liked. The users then filled out a questionnaire. The t-test of the questionnaire results is shown in Table 1.
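For illustration only, such a t-test on questionnaire ratings could be computed as follows; the scores below are made up, and the paper does not state whether a paired or independent test was used, so a paired test is only one possible choice.

```python
from scipy import stats

vr_scores = [3, 4, 3, 5, 4, 3, 4, 2, 3, 4, 3, 4]   # hypothetical Likert ratings, VR system
ar_scores = [4, 5, 4, 5, 5, 4, 5, 3, 4, 5, 4, 5]   # hypothetical Likert ratings, AR system

t_stat, p_value = stats.ttest_rel(vr_scores, ar_scores)   # paired t-test across the same users
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```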

Table 1. Difference between the virtual reality and the augmented reality system.

The second experiment examined the influence of occlusion in augmented reality. First, we explained occlusion to the users and let them experience the furniture arrangement from the first experiment. The users then followed instructions and moved to certain positions with the system both without and with occlusion handling. After this process, the users again filled out a questionnaire. The t-test of the results is shown in Table 2.

Table 2. Difference in user experience without and with occlusion handling.

This paper applies camera calibration to solve the occlusion problem in augmented reality. The results show that the method is effective and fluent. Furthermore, the occlusion produced by this method looks more realistic than that of the stratification method and the interpolation-depth method. For interaction in augmented reality, this paper provides a trigger function and a shared coordinate system for users and virtual objects, enabling direct interaction. Compared with virtual reality, the augmented reality system in the test is more intuitive and realistic for estimating the size of furniture.

4. CONCLUSIONS

With the development of computer hardware and software, augmented reality applications have sprung up. With Microsoft and Google each having announced their own augmented reality glasses, we can expect the development of augmented reality to become even more vigorous.

The following points summarize the present research and the test results:

1. A translation correction between the depth image and the infrared image was developed.

2. Camera calibration was applied to solve the occlusion problem in augmented reality.

3. A direct interaction method and process for augmented reality systems was proposed.

4. A furniture display system with occlusion handling in augmented reality was established.

5. Verification showed that occlusion handling is required for augmented reality applications.

Based on the above conclusions, it can be deduced that an augmented reality system can handle user interaction together with the occlusion problem fluently. It can be expected that the development of augmented reality will build on the extension of both occlusion handling and the interaction between users and machines. Users can immerse themselves in the situation with augmented reality, which is something purely virtual reality cannot achieve. From the user tests, we conclude that augmented reality has great potential and can be used in many situations beyond furniture display.


REFERENCES

[1] Azuma, R. T., et al. (1997). A survey of augmented reality. Presence, 6(4), 355–385.

[2] Brade, M., Keck, M., Gründer, T., Müller, M., & Groh, R. (2013). Natural interface exploration. Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction, New York, 427–430.

[3] Canny, J. (1986). A computational approach to edge detection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 8(6), 679–698.

[4] Carozza, L., Tingdahl, D., Bosché, F., & Gool, L. (2014). Markerless vision-based augmented reality for urban planning. Computer-Aided Civil and Infrastructure Engineering, 29(1), 2–17.

[5] Collins, R. (2007). Coordinate system diagram. Retrieved June 2, 2015, from Computer Vision Course.

[6] Corbett-Davies, S., Dunser, A., & Clark, A. (2012). An interactive augmented reality system for exposure treatment. IEEE International Symposium on Mixed and Augmented Reality, Atlanta, Georgia, USA, 95–96.

[7] Corbett-Davies, S., Green, R., & Clark, A. (2012). Physically interactive tabletop augmented reality using the Kinect. Proceedings of the 27th Conference on Image and Vision Computing, Dunedin, New Zealand, 210–215.

[8] De Pra, Y., Spoto, F., Fontana, F., & Tao, L. (2014). Infrared vs. ultrasonic finger detection on a virtual piano keyboard. Proceedings of ICMC/SMC, Athens, Greece, 654–658.

[9] Feng, Y., Du, W., Guan, X., Gao, F., & Chen, Y. (2006). Realization of multilayer occlusion between real and virtual scenes in augmented reality. The 10th International Conference on Computer Supported Cooperative Work in Design, Nanjing, 1–5.

[10] Fortin, P.-A., & Hebert, P. (2006). Handling occlusions in real-time augmented reality: dealing with movable real and virtual objects. The 3rd Canadian Conference on Computer and Robot Vision, 54–54.

[11] Fransen, B., Morariu, V., Martinson, E., Blisard, S., Marge, M., Thomas, S., & Perzanowski, D. (2007). Using vision, acoustics, and natural language for disambiguation. Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, New York, USA, 73–80.

[12] Hayashi, K., Kato, H., & Nishida, S. (2005). Occlusion detection of real objects using contour based stereo matching. Proceedings of the 2005 International Conference on Augmented Tele-Existence, New York, USA, 180–186.

[13] HoloLens. Retrieved June 2, 2015, from https://www.microsoft.com/microsoft-hololens/en-us.

[14] Hsieh, Y.-A. (2014). Use dynamic markers for an augmented reality system based on multi-point color tracking. National Taiwan University Master Thesis, Taipei, Taiwan.

[15] Jean, Y. B. (2010). Camera calibration toolbox for Matlab. Retrieved June 1, 2015, from http://www.vision.caltech.edu/bouguetj/calib_doc/index.html.

[16] Kanbara, M., Okuma, T., Takemura, H., & Yokoya, N. (2000). A stereoscopic video see-through augmented reality system based on real-time vision-based registration. Proceedings of IEEE Virtual Reality, New Brunswick, NJ, 255–262.

[17] Leal-Meléndrez, J. A., Altamirano-Robles, L., & Gonzalez, J. A. (2013). Occlusion handling in video-based augmented reality using the Kinect sensor for indoor registration. Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications, Berlin Heidelberg, 447–454.

[18] Lee, T., & Höllerer, T. (2007). Handy AR: Markerless inspection of augmented reality objects using fingertip tracking. The 11th IEEE International Symposium on Wearable Computers, Boston, MA, 83–90.

[19] Lin, C.-H. (2013). Integrating stereo vision and occlusion function into an augmented reality furniture customization system. National Taiwan University Master Thesis, Taipei, Taiwan.

[20] Lu, Y., & Smith, S. (2009). GPU-based real-time occlusion in an immersive augmented reality environment. Journal of Computing and Information Science in Engineering, 9(2), 024501.

[21] Moré, J. J. (1978). The Levenberg-Marquardt algorithm: implementation and theory. Proceedings of the Biennial Conference, Dundee, 105–116.

[22] Nee, A. Y. C., & Ong, S. K. (2013). Virtual and augmented reality applications in manufacturing. Springer Science & Business Media.

[23] Radkowski, R., & Stritzke, C. (2012). Interactive hand gesture-based assembly for augmented reality applications. The 5th International Conference on Advances in Computer-Human Interactions, Valencia, Spain, 303–308.

[24] Seo, D. W., & Lee, J. Y. (2013). Direct hand touchable interactions in augmented reality environments for natural and intuitive user experiences. Expert Systems with Applications, 40(9), 3784–3793.

[25] Staranowicz, A., & Mariottini, G.-L. (2012). A comparative study of calibration methods for Kinect-style cameras. Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments, New York, USA, 49–49.

[26] Vera, L., Gimeno, J., Coma, I., & Fernández, M. (2011). Augmented mirror: interactive augmented reality system based on Kinect. Human-Computer Interaction – INTERACT 2011, Lisbon, Portugal, 483–486.

[27] Zhang, C., & Zhang, Z. (2014). Calibration between depth and color sensors for commodity depth cameras. Computer Vision and Machine Learning with RGB-D Sensors, Barcelona, 47–64.

[28] Smisek, J., Jancosek, M., & Pajdla, T. (2013). 3D with Kinect. Consumer Depth Cameras for Computer Vision, Barcelona, 3–25.
