Sensor fusion (AR markers) for multi-touch interaction

The purpose of this project is to incorporate Augmented Reality (AR) markers into an existing vision-based multi-touch system capable of sensing multiple fingers. The scope covers the development of a multi-touch table suited to an application that uses both AR markers and multiple fingers for interaction. Development is divided into hardware and software components. The hardware work includes the design and construction of a first-generation table for testing and learning, as well as a second-generation table intended for actual use. At the software level, various existing computer vision toolkits are tested, utilised and combined to sense the presence of AR markers and to interpret the data into useful information. The hardware and software design methodology of the multi-touch table is presented in the report, and the table is designed to provide maximum functionality so that its use can be extended to future applications. The project successfully implements a vision-based multi-touch table capable of sensing both fingers and AR markers by combining the Frustrated Total Internal Reflection (FTIR) and Diffused Illumination (DI) technologies, a combination of methods that does not currently exist elsewhere. A virtual music application named DeeJay Touch, which uses AR markers and multiple fingers for interaction, is developed as a companion application.
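The abstract describes fusing two detection streams from a single camera view of the table surface: AR markers for tangible objects and bright FTIR/DI blobs for finger touches. As a rough illustrative sketch of that idea only (the report does not name its toolkits, so OpenCV's ArUco module and SimpleBlobDetector stand in here, and the camera index and tuning values are assumptions):

    # Illustrative sketch, not the report's actual toolchain.
    import cv2

    # AR-marker detection (OpenCV >= 4.7 ArUco API)
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    marker_detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

    # Finger detection: bright blobs produced by FTIR/DI illumination
    blob_params = cv2.SimpleBlobDetector_Params()
    blob_params.filterByColor = True
    blob_params.blobColor = 255        # touches show up as bright spots
    blob_params.filterByArea = True
    blob_params.minArea = 30           # roughly fingertip-sized; tune per camera
    blob_detector = cv2.SimpleBlobDetector_create(blob_params)

    cap = cv2.VideoCapture(0)          # IR camera mounted under the table surface
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Tangible objects tagged with AR markers placed on the surface
        corners, ids, _rejected = marker_detector.detectMarkers(gray)

        # Finger touches from the same camera image
        touches = blob_detector.detect(gray)

        # Fuse both streams into a single set of interaction events
        events = {
            "markers": [] if ids is None else ids.flatten().tolist(),
            "touches": [kp.pt for kp in touches],
        }
        print(events)

    cap.release()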

Bibliographic Details
Main Author: Hanafiah Yusof
Other Authors: Louis-Philippe Demers
School: School of Computer Engineering, Nanyang Technological University
Degree: Bachelor of Engineering (Computer Engineering)
Format: Final Year Project (FYP)
Language: English
Published: 2009
Physical Description: 86 p.
Subjects: DRNTU::Engineering::Computer science and engineering::Computer applications::Computers in other systems
Online Access: http://hdl.handle.net/10356/17032