Computer vision-based articulated human motion tracking is attractive for many applications since it allows unobtrusive and passive estimation of people's activities. Although much progress has been made on human-only tracking, the visual tracking of people who interact with objects such as tools, products, packages, and devices is considerably more challenging. The wide variety of objects, their varying visual appearance, and their varying (and often small) size make a vision-based understanding of person-object interactions very difficult. To alleviate this problem for at least some application domains, we propose a framework that combines visual human motion tracking with RFID-based object tracking. We customized commonly available RFID technology to obtain orientation estimates of objects in the field of RFID emitter coils. The resulting fusion of visual human motion tracking and RFID-based object tracking enables the accurate estimation of high-level interactions between people and objects for application domains such as retail, home care, workplace safety, manufacturing, and others.
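For illustration, the sketch below shows one simple way such a fusion could be expressed in code: a vision-based hand position estimate is combined with an RFID-derived object orientation estimate to flag a likely manipulation event. All names, fields, and thresholds here are illustrative assumptions, not the authors' actual system or API.

from dataclasses import dataclass
import math


@dataclass
class HandEstimate:
    # 3D hand position from the visual articulated-motion tracker (metres, world frame).
    x: float
    y: float
    z: float


@dataclass
class RfidReading:
    # Reading for one tagged object near an emitter coil (assumed fields).
    tag_id: str
    coil_x: float           # known coil position in the same world frame
    coil_y: float
    coil_z: float
    orientation_deg: float  # orientation estimate of the tag in the coil's field


def is_interacting(hand: HandEstimate,
                   reading: RfidReading,
                   max_dist_m: float = 0.35,
                   min_rotation_deg: float = 20.0,
                   rest_orientation_deg: float = 0.0) -> bool:
    # Flag a likely pick-up/manipulation event when the tracked hand is close
    # to the coil that sees the tag AND the tag's orientation has changed
    # notably from its resting orientation (both thresholds are illustrative).
    dist = math.dist((hand.x, hand.y, hand.z),
                     (reading.coil_x, reading.coil_y, reading.coil_z))
    rotated = abs(reading.orientation_deg - rest_orientation_deg) >= min_rotation_deg
    return dist <= max_dist_m and rotated


if __name__ == "__main__":
    hand = HandEstimate(x=1.02, y=0.80, z=0.45)
    tag = RfidReading(tag_id="box-17", coil_x=1.00, coil_y=0.75,
                      coil_z=0.40, orientation_deg=35.0)
    print(is_interacting(hand, tag))  # True: hand near the coil and tag rotated

In practice the fusion described in the abstract would operate on full articulated pose estimates and richer RFID signals over time; this sketch only conveys the basic idea of combining the two modalities to infer a high-level interaction.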

Original publication

DOI

10.1109/ACVMOT.2005.17

Type

Conference paper

Journal

Proceedings - Seventh IEEE Workshop on Applications of Computer Vision, WACV 2005

Publication Date

01/01/2005

Pages

494–500