Sept. 25, 2012—A University of Utah team won two awards today at EvAAL, a prestigious international competition on location-sensing technologies designed to ensure the health and well-being of people needing assistance, such as the elderly or those requiring long-term home-based care. These technologies, called ambient assisted living technologies, may enable such individuals to stay in their homes longer rather than moving to assisted living facilities. Location sensing could, for example, alert a caretaker if a person has fallen and cannot get up.

Led by Department of Electrical and Computer Engineering post-doctoral fellow Maurizio Bocca and faculty member Neal Patwari, the University of Utah team won first place in the tracking accuracy category, which recognized the system with the best tracking performance, and second place overall. The team, called the cyber-physical systems (CPS) team, included School of Computing faculty Suresh Venkatasubramanian and Sneha K. Kasera.

Although localization and home surveillance can have an ominous Big Brother tone, this technology is meant to be used only with a person’s permission and only when the person requires it. The University of Utah team developed a unique tracking system that doesn’t require the person to wear an active badge, a radio transmitter that periodically sends the system a message so the system can determine where the message originated. Badge-based systems have a clear disadvantage: the user must remember to wear the badge. Using a U of U technology based on radio tomography, the CPS team could determine where people are moving with a high level of accuracy. Indeed, the team’s tests in a Salt Lake City apartment prior to the competition showed a person could be located within one foot (30 cm) of their actual position. No other team in the competition used radio tomography as the basis for its sensing system.

Each team in the competition developed and tested a system designed to locate and track a person in his or her own home. The systems were tested during the competition, held in the Smart House Living Lab at the Universidad Politecnica de Madrid in Madrid, Spain. During the test, a person walked along a path unknown to any of the teams. Each team’s system continuously reported its best estimate of where the person was. The evaluators then compared these estimates to the person’s actual path; the difference between the estimated path and the actual path was the team’s localization error. The University of Utah team had the lowest localization error.
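The article does not spell out how that error is scored, but a common approach is to average the distance between matched samples of the true and estimated paths. Here is a minimal sketch of that idea, assuming the two paths are sampled at matching times; the function name and the coordinates are hypothetical examples, not the competition’s scoring code.

```python
import math

def mean_localization_error(true_path, estimated_path):
    """Average Euclidean distance (in meters) between matched
    true and estimated (x, y) positions along a walked path."""
    errors = [
        math.hypot(tx - ex, ty - ey)
        for (tx, ty), (ex, ey) in zip(true_path, estimated_path)
    ]
    return sum(errors) / len(errors)

# Hypothetical samples of the walker's actual path and one system's estimates.
actual = [(1.0, 2.0), (1.5, 2.0), (2.0, 2.5)]
estimated = [(1.1, 2.1), (1.4, 2.2), (2.2, 2.4)]
print(f"mean error: {mean_localization_error(actual, estimated):.2f} m")
```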

Overall scores were based on additional factors, including “installation complexity” and “user acceptance.” The competition, organized by the Ambient Assisted Living Open Association, involved ten research teams from around the world, including teams from Canada, Spain, Germany, Switzerland, France and the United States.

The CPS group is funded by the National Science Foundation.

Contacts:
Maurizio Bocca, post-doctoral fellow, maurizio.bocca@utah.edu
Neal Patwari, assistant professor of electrical and computer engineering, npatwari@ece.utah.edu

Media

This video shows the experimental results from a radio tomography (RT)-based two-person tracking experiment in an apartment. The people are not wearing any radio tags. Instead, a wireless network of 33 IEEE 802.15.4 transceivers deployed in the apartment measures changes in received signal strength (RSS), informally known as “the number of bars,” caused by the people. Based on which links experience changes, the RT algorithm produces an image (shown at left) with the highest values (red) where it estimates a person is located and the lowest values (blue) where it estimates no person is located. A multi-target tracking algorithm developed by Dr. Maurizio Bocca at the University of Utah identifies where the “blobs” are in the image and what path they are taking through the apartment, using computer vision methods adapted to the RT problem. On the left, the video shows the apartment with black indicating walls, grey indicating furniture, a white circle indicating a person’s actual location, and a white X indicating the current estimate of that location. Dr. Bocca’s algorithm tracks the two people to within an average error of about 30 cm (1 foot).
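The description above captures the core radio tomographic imaging idea: a link whose RSS changes suggests a person near the straight line between its two transceivers, and summing those per-link changes over the pixels each link crosses yields an image that peaks where a person is likely standing. The sketch below illustrates that image-forming step under a simple ellipse-based weight model; it is an illustration only, not the team’s actual algorithm, and the node positions, link measurements, and parameter values are hypothetical.

```python
import numpy as np

def rti_image(nodes, links, rss_change, grid, lambda_width=0.3):
    """Form a simple radio tomographic image.
    nodes: (N, 2) transceiver positions in meters
    links: list of (i, j) node-index pairs
    rss_change: per-link |RSS change| in dB
    grid: (P, 2) pixel-center coordinates"""
    image = np.zeros(len(grid))
    for (i, j), dz in zip(links, rss_change):
        a, b = nodes[i], nodes[j]
        d = np.linalg.norm(a - b)
        # Excess path length of pixel -> node i -> node j versus the direct path.
        excess = (np.linalg.norm(grid - a, axis=1)
                  + np.linalg.norm(grid - b, axis=1) - d)
        # Pixels inside a thin ellipse around the link contribute, scaled by 1/sqrt(d).
        weight = (excess < lambda_width) / np.sqrt(d)
        image += weight * dz
    return image

# Hypothetical 4-node network around a 5 m x 5 m room; only one link is obstructed.
nodes = np.array([[0.0, 0.0], [5.0, 0.0], [5.0, 5.0], [0.0, 5.0]])
links = [(0, 2), (1, 3), (0, 1), (2, 3)]
rss_change = [6.0, 0.5, 0.3, 0.2]  # dB drops; the (0, 2) diagonal is blocked
xs, ys = np.meshgrid(np.linspace(0, 5, 26), np.linspace(0, 5, 26))
grid = np.column_stack([xs.ravel(), ys.ravel()])
img = rti_image(nodes, links, rss_change, grid)
print("Image peak (likely person location):", grid[np.argmax(img)])
```

A tracker like the one described in the video would then run blob detection and data association on successive images of this kind to follow each person’s path over time.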