VR Project Orientation
Institute of Information Science, Academia Sinica
Department of Computer Science and Information Engineering, National Taiwan University
Current Project and Working Items
◦ Real-time Room fusion
◦ Perspective Changing
What is VR
Virtual Reality (VR) replicates an environment that simulates physical presence in places in the real world or an imagined world, allowing the user to interact with that world.
◦ Artificially create sensory experiences, which can include sight, touch, hearing, and smell.
VR: virtual objects + virtual environment
AV: real objects + virtual environment
AR: virtual objects in real environment
Head-Mounted Display (HMD):
◦ Oculus Rift
◦ Samsung Gear VR
◦ Sony PlayStation VR
◦ HTC Vive
◦ Google Cardboard
Position trackers, controllers, joysticks, etc.
◦ FPS, adventure, music, etc.
◦ 3D movies, 360° videos
◦ VR movies
Improve user experiences
◦ Make the VR/AR effect more realistic.
◦ Reduce dizziness (motion sickness).
◦ Reduce random wobble of the picture.
◦ Find a good balance between QoE and battery life of a mobile device.
◦ Dynamically scale resolution and refresh rate to reduce mobile GPU power consumption.
◦ Use user head movement information to decide the appropriate QoE metrics.
◦ Offload computation from the local device to a server cluster, so that users without high-end hardware can also experience VR fluently.
◦ How to allocate sufficient computing resources to different VR applications in the cluster?
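The resolution/refresh-rate scaling idea above could be sketched as a small heuristic. This is purely illustrative: the function name, thresholds, and quality levels are assumptions, not values from the project.

```python
# Hypothetical QoE/power heuristic: pick a render scale and refresh rate
# from head movement speed and remaining battery. All numbers are
# illustrative placeholders, not measured values from the project.
def choose_quality(head_speed_dps: float, battery_frac: float):
    """Return (resolution_scale, refresh_hz) trading QoE against power.

    head_speed_dps: head angular speed in degrees/second.
    battery_frac:   remaining battery charge in [0, 1].
    """
    # Fast head motion hides resolution loss, so render at a lower scale.
    scale = 1.0 if head_speed_dps < 30 else 0.75 if head_speed_dps < 120 else 0.5
    # Drop the refresh rate only when the battery is running low.
    refresh = 60 if battery_frac > 0.2 else 45
    return scale, refresh
```

A slow-moving head with a full battery keeps full quality, e.g. `choose_quality(10, 0.9)` returns `(1.0, 60)`, while fast motion on a low battery returns `(0.5, 45)`.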
Develop new VR/AR applications.
Live Reality Fusion
◦ Combine the real-time images from two different locations into one.
◦ ex: fuse two seminar rooms for oversea joint meeting; “wall removing” for Interior design.
◦ Cooperate with Dr. Wang’s group.
Only one 360° camera in the remote room.
A server captures and processes the stream from the camera, then fuses the processed stream into the video stream captured in the observation room.
Break the video captured by the 360° camera into frames.
◦ Each frame is a panoramic image.
Construct a cuboid model of the remote room.
◦ Transform the panoramic frame into a cube map.
◦ Transform the cube map into a cuboid.
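The panorama-to-cube step can be sketched in a few lines of NumPy. This is a minimal example of my own (nearest-neighbour sampling, front face only; the function name and axis conventions are assumptions, not taken from 祖詒's MATLAB code):

```python
import numpy as np

def equirect_to_cube_face(pano: np.ndarray, face_size: int) -> np.ndarray:
    """Sample the front (+x) cube face from an equirectangular panorama.

    pano: H x W x 3 image; rows span latitude +90° (top) to -90° (bottom),
    columns span longitude -180° to +180°. Nearest-neighbour for brevity.
    """
    h, w = pano.shape[:2]
    # Pixel-centre grid on the face, mapped to [-1, 1].
    u = (np.arange(face_size) + 0.5) / face_size * 2.0 - 1.0
    uu, vv = np.meshgrid(u, u)                         # right / down on the face
    # Ray through each face pixel: the front face sits at x = 1.
    x, y, z = np.ones_like(uu), uu, -vv
    lon = np.arctan2(y, x)                             # [-pi, pi]
    lat = np.arcsin(z / np.sqrt(x * x + y * y + z * z))  # [-pi/2, pi/2]
    # Map angles to panorama pixel indices.
    col = ((lon / (2 * np.pi) + 0.5) * w).astype(int) % w
    row = ((0.5 - lat / np.pi) * h).astype(int).clip(0, h - 1)
    return pano[row, col]
```

The other five faces follow by permuting/negating the ray axes; a cuboid face differs only in how far the sampling plane sits from the centre.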
Compute what a user should see from the current viewing angle.
◦ By projection or other methods.
Replace the wall in the observation room with the projected image of the remote room.
Fuse R106 and R107 in IIS.
◦ The wall between the two rooms can be removed to achieve real “room fusion”.
◦ Take picture/video as ground-truth.
Dr. Wang’s RA, 祖詒, has completed the construction of the cuboid model of the remote room.
Speed up the model construction process.
◦ The current time required to process one image is a few seconds.
Must take less than 1/33 second per frame to achieve real time.
◦ Study the Matlab functions 祖詒 used in his code.
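While profiling the speedup, the per-frame deadline above can be checked with a small timing harness. A sketch; the helper name and interface are hypothetical:

```python
import time

FRAME_BUDGET_S = 1.0 / 33.0   # per-frame deadline quoted in the slides

def meets_realtime(process_frame, frame, budget_s=FRAME_BUDGET_S):
    """Time one call to process_frame and report whether it fits the budget.

    Returns (within_budget, elapsed_seconds).
    """
    t0 = time.perf_counter()
    process_frame(frame)
    elapsed = time.perf_counter() - t0
    return elapsed <= budget_s, elapsed
```

Running this against the current model-construction code (reported as a few seconds per image) would immediately show how far it is from the ~30 ms budget.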
Continue 禎佑's work on multi-camera