Augmented Reality is a way to mimic the properties of a hologram. Project SugarCube uses devices such as the Oculus Rift, the Leap Motion controller, and cameras to create an immersive, intuitive experience that could one day replace the conventional desktop setup. It is a prototype intended as a proof of concept.
Project SugarCube comprises a set of sub-projects, each implemented to understand one of the modules required to build the final system. Each sub-project is explained through the videos and descriptions provided below.
Marker-based Augmented Reality with the Oculus Rift
A demonstration of how marker-based Augmented Reality can be implemented using the Oculus Rift. In such a system, the virtual objects retain their positions with respect to the real world, unlike the static AR implementation described below, where objects are fixed to the screen. This is a proof-of-concept project for a desktop-based AR system.
The project is implemented in the Unity 3D game engine and uses the NyARToolkit for marker-based tracking. Gesture control is provided by the Leap Motion controller.
Augmented Reality with the Oculus Rift
A video demonstrating how AR can be implemented using the Oculus Rift. The project was developed in Unity, with 3D models taken from BlendSwap. In this implementation, objects can be switched with a keyboard button press, and interaction with them is implemented using the Leap Motion controller.
This project implements static AR, i.e. the objects retain their positions on the user's screen, irrespective of where the user looks. This is a proof of concept for Augmented Reality interfaces in portable devices, such as smart glasses.
Oculus Rift with stereo WebCam
A demonstration of how a stereo camera feed can be used with the Oculus Rift. A standard WebCam (Logitech C310) is attached to the outer body of the Rift. The project was created in the Unity 3D game engine; the image feed from the camera is applied as a Texture to a plane GameObject in the scene. For a better experience, two WebCams can be used, so that each eye receives a video feed of its own.
Also note that objects in the video feed appear roughly doubly magnified. This is due to the difference in Field of View (FOV): the human eye has a FOV of about 120°, while the WebCam provides only about 60°. The distortion can be corrected by fitting the cameras with wide-angle lenses. (Camera frame rate: ~60 fps.)
The project can be downloaded here. The important assets in the project are the WebCam.cs script and the OVR GameObject.
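The project itself builds the camera feed inside Unity, but the core idea of feeding one camera per eye can be sketched generically. Below is a minimal Python/OpenCV illustration, assuming two identical webcams at device indices 0 and 1; it is not part of the project, just a sketch of the stereo pairing.

```python
# Illustrative only: pair two webcam feeds side by side, one half per eye,
# the way a stereo Rift view would. Assumes two identical cameras at
# device indices 0 and 1; requires opencv-python and numpy.
import cv2
import numpy as np

left = cv2.VideoCapture(0)   # left-eye camera
right = cv2.VideoCapture(1)  # right-eye camera

while True:
    ok_l, frame_l = left.read()
    ok_r, frame_r = right.read()
    if not (ok_l and ok_r):
        break
    # Identical cameras give equal frame heights, so hstack is valid.
    stereo = np.hstack((frame_l, frame_r))
    cv2.imshow('stereo preview', stereo)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

left.release()
right.release()
cv2.destroyAllWindows()
```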
Interaction with Marker-based AR using the LEAP
This project demonstrates interaction with a marker-based augmented reality system using the Leap Motion controller. The project was built using the Unity game engine, with the NyARToolkit for Unity providing the marker-based tracking. A standard WebCam is positioned such that it has a complete view of the marker. The video demonstrates interaction with objects (cubes) positioned on the marker; the position and orientation of the objects vary with movement of the marker or changes in camera position. This serves as a proof of concept for a realistic desktop-based augmented reality interface, where the cubes could represent files, virtual screens, or objects within 3D design applications.
Marker-based Augmented Reality
A demonstration of how the NyARToolkit, an augmented reality toolkit, can be used in Unity. We show how various objects can be placed on the marker and given coordinates in the real world, like any physical object. The low-poly horse model used was downloaded from BlendSwap.
For a tutorial on how to use the NyARToolkit, refer to this video.
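The project performs this tracking with NyARToolkit inside Unity. For a rough feel of what marker pose estimation involves, here is a hedged Python sketch using OpenCV's ArUco module instead (a different toolkit, shown purely for illustration). It assumes opencv-contrib-python with the pre-4.7 ArUco API; the camera intrinsics below are placeholders, and real use needs a camera calibration.

```python
# Illustrative sketch, not the project's code. Detects ArUco markers and
# estimates their pose in camera coordinates; a virtual object rendered at
# that pose stays "attached" to the marker, as in the video.
import cv2
import numpy as np

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])   # placeholder intrinsics
dist_coeffs = np.zeros(5)                     # placeholder distortion

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is not None:
        # Rotation + translation of each marker relative to the camera.
        rvecs, tvecs = cv2.aruco.estimatePoseSingleMarkers(
            corners, 0.05, camera_matrix, dist_coeffs)[:2]  # 5 cm marker
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow('marker tracking', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```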
Interaction with Augmented Reality
This project demonstrates interaction with virtual objects augmented onto the real world. Interaction with these objects is done using the Leap Motion controller. Built using Unity, this project is an extension of the VR Interface project (below). Virtual objects like cubes and spheres can be added to the scene. Each of these objects can be interacted with. This is a proof of concept for an Augmented Reality interface for portable devices, such as smart glasses.
Interaction in Virtual Reality
Interacting with a virtual environment created in Unity, using the Leap Motion controller. This video shows how the LEAP can be integrated with the Unity 3D game engine. Virtual objects like cubes and spheres can be added to the scene. Each of these objects can be interacted with. This is a proof of concept for a Virtual Reality interface.
Interaction through tracking
A few methods for implementing interaction with a virtual environment, ranging from gesture tracking to head tracking.
LEAP Interaction demo - Python, LeapSDK
A demonstration of how to work with the LeapSDK using Python. It shows how data from the LEAP controller can be used for various interactions, including tracking individual fingers and palm tracking. Using this tracking data, virtual objects can be controlled and various operations performed. ‘LEAP_fingers_opengl.py’ demonstrates how individual fingers can be tracked and used to rotate a 3D cube rendered in OpenGL. ‘LEAP_palm_VTK.py’ demonstrates navigation in a virtual environment created with VTK (Visualization Toolkit), using palm control. ‘LEAP_palm_opengl.py’ rotates a 3D cube rendered in OpenGL using palm control. Demonstrations of all three are available in the video; code links are given below, followed by a minimal tracking sketch.
LEAP fingers with OpenGL code: LEAP_fingers_opengl.py
LEAP palm with VTK code: LEAP_palm_VTK.py
LEAP palm with OpenGL code: LEAP_palm_opengl.py
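As a rough illustration of what these scripts read from the device, here is a minimal polling sketch against the Leap Motion Python SDK of that era (LeapSDK v1/v2); the polling loop is illustrative and not taken from the scripts above.

```python
# Illustrative polling sketch against the Leap Motion Python SDK
# (LeapSDK v1/v2 era). Prints the palm and fingertip positions that the
# demo scripts above map onto 3D objects.
import time
import Leap  # provided by the LeapSDK

controller = Leap.Controller()  # connects to the Leap service

while True:
    frame = controller.frame()  # most recent tracking frame
    for hand in frame.hands:
        pos = hand.palm_position  # Leap.Vector, in millimetres
        print('palm: (%.1f, %.1f, %.1f)' % (pos.x, pos.y, pos.z))
        for finger in hand.fingers:
            tip = finger.tip_position
            print('  finger %d tip: (%.1f, %.1f, %.1f)'
                  % (finger.id, tip.x, tip.y, tip.z))
    time.sleep(0.05)  # ~20 polls per second
```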
Color-based tracking - Python, OpenCV
A modified version of the CAMSHIFT (Continuously Adaptive Mean Shift) algorithm for tracking marker objects. The video demonstrates how any object of a distinct color can be used as a marker to be tracked. Based on gestures performed with the marker, the camera in a virtual environment can be controlled and rotated about virtual objects. The virtual environment is generated using VTK (Visualization Toolkit), and the module is implemented in Python using OpenCV. A simplified tracking sketch follows the code link below.
Color-based tracker code: camshift_mod.py
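camshift_mod.py modifies the standard algorithm; the unmodified scheme it starts from looks roughly like the following sketch, which assumes the camera opens successfully and the colored object initially sits inside a hard-coded start window.

```python
# Simplified CAMSHIFT color tracking with OpenCV (the project's
# camshift_mod.py modifies this basic scheme).
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
track_window = (200, 150, 100, 100)  # initial (x, y, w, h) over the object

# Build a hue histogram of the region to track.
ok, frame = cap.read()
x, y, w, h = track_window
hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
# Ignore dim or desaturated pixels when building the color model.
mask = cv2.inRange(hsv_roi, np.array((0., 60., 32.)),
                   np.array((180., 255., 255.)))
roi_hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Back-project the hue histogram into a probability image, then let
    # CAMSHIFT adapt the window's position, size and orientation.
    prob = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    rot_box, track_window = cv2.CamShift(prob, track_window, term_crit)
    pts = np.int32(cv2.boxPoints(rot_box))
    cv2.polylines(frame, [pts], True, (0, 255, 0), 2)
    cv2.imshow('camshift', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```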
Skin-color-based tracking - Python, OpenCV
This video shows how skin-color detection can be used to detect and track a hand in the scene. Skin-color detection is performed by isolating the range of skin tones using HSV color filtering. After color filtering, contours are detected and the hand is marked using the largest contour (assuming that only the hands are visible in the scene). The module is implemented in Python using OpenCV; a minimal sketch follows the code link below.
Skin-color-based tracker code: skin_tracking.py
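A minimal sketch of the HSV-filter-plus-largest-contour scheme described above. The HSV bounds are illustrative assumptions; skin_tracking.py may use different thresholds.

```python
# Minimal HSV skin-color hand tracker (OpenCV 4.x findContours signature).
# The HSV bounds are rough illustrative values; real thresholds vary with
# lighting and skin tone.
import cv2
import numpy as np

LOWER = np.array([0, 48, 80], dtype=np.uint8)     # assumed lower HSV bound
UPPER = np.array([20, 255, 255], dtype=np.uint8)  # assumed upper HSV bound

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)  # keep skin-colored pixels
    mask = cv2.medianBlur(mask, 5)         # suppress speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        # Assume the largest skin-colored blob is the hand.
        hand = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(hand)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow('skin tracking', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```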
Head tracking in a virtual environment - Python, OpenCV, VTK
A demonstration of how head tracking can be used within a virtual environment. The environment is generated using VTK (Visualization Toolkit). The head-tracking module (demonstrated below) is used for moving within the environment and changing perspectives. The tracking allows motion with three degrees of freedom: horizontal (X-axis), vertical (Y-axis), and depth (Z-axis). A sketch of how the tracked head position can drive a VTK camera follows the code links below.
Head tracking in VR:
Dynamic origin tracking code - VR_Headtrack_dynamicOG.py
Static origin tracking code - VR_Headtrack_staticOG.py
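As a sketch of the underlying idea, the following maps a tracked head offset onto a VTK camera. get_head_offset() is a hypothetical stand-in for the face-tracking module below, and the scale factors are illustrative, not taken from the project code.

```python
# Sketch of head-coupled viewing in VTK: the camera follows the user's
# head so the view shifts with head movement.
import vtk

# A simple scene: one cone to look at.
cone = vtk.vtkConeSource()
mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(cone.GetOutputPort())
actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)

def get_head_offset():
    """Hypothetical: (dx, dy, dz) of the head from its rest position,
    in pixels, as a face tracker like the one below would report."""
    return 0.0, 0.0, 0.0

camera = renderer.GetActiveCamera()
for _ in range(300):  # in the real module this runs once per camera frame
    dx, dy, dz = get_head_offset()
    # Couple the camera to the head: X/Y shift the viewpoint sideways and
    # vertically, Z moves it toward or away from the scene.
    camera.SetPosition(dx * 0.01, dy * 0.01, 5.0 - dz * 0.01)
    camera.SetFocalPoint(0.0, 0.0, 0.0)
    window.Render()
```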
Head tracking - Python, OpenCV
A demonstration of how to implement head tracking using Python and OpenCV. Face detection is done using the Haar cascade classifier provided with OpenCV. Running the detector once every 5 frames on a WebCam feed at 30 fps yields reasonably accurate face tracking. Lighting conditions and face orientation affect the module, owing to the nature of the Haar cascade algorithm. A minimal sketch follows the code link below.
Head tracking code: basic_facetrack.py
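A minimal sketch of the approach described above: Haar cascade detection run once every 5 frames, reusing the last result in between. The cv2.data cascade path follows the modern opencv-python convention; basic_facetrack.py may locate the XML file differently.

```python
# Minimal sketch: Haar cascade face detection on every 5th frame only,
# reusing the last detection in between for speed.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

cap = cv2.VideoCapture(0)
faces = []
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % 5 == 0:  # detect on every 5th frame only
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.3,
                                         minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
    cv2.imshow('face tracking', frame)
    frame_idx += 1
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```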