
Teaching computers the meaning of sensor names in smart homes



The UPV/EHU’s IXA group has used natural language processing techniques to overcome one of the major difficulties associated with smart homes, namely that the systems developed to infer activities in one environment do not work when they are applied to a different one, because both the sensors and the activities are different. The group has come up with the innovative idea of using words to represent the activation of both sensors and human activity.

The aim of smart homes is to make life easier for those living in them. Applications for environment-aided living may have a major social impact, fostering active aging and enabling older adults to remain independent for longer. One of the keys to smart homes is the system’s ability to deduce the human activities taking place. To this end, different types of sensor are used to detect the changes triggered by inhabitants in this environment (turning lights on and off, opening and closing doors, etc.).

Normally, the information generated by these sensors is processed using data analysis methods, and the most successful systems are based on supervised learning techniques, in which someone labels the data and an algorithm automatically learns its meaning. Nevertheless, one of the main problems with this approach is that a system trained in one environment is not valid in another one: “Algorithms are usually closely linked to a specific smart environment, to the types of sensor existing in that environment and their configuration, as well as to the concrete habits of one individual. The algorithm learns all this easily, but is then unable to transfer it to a different environment,” explains Gorka Azkune, a member of the UPV/EHU’s IXA group.

Giving sensors names

To date, sensors have been identified using numbers, meaning that “they lose any meaning they may have had,” continues Dr. Azkune. “We propose using sensor names instead of identifiers, to enable their meaning, their semantics, to be used to determine the activity to which they are linked. Thus, what the algorithm learns in one environment may be valid in a different one, even if the sensors are not the same, because their semantics are similar. This is why we use natural language processing techniques.”
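The intuition can be sketched in a few lines of code. The toy word vectors and sensor names below are invented for illustration and are not from the paper; a real system would use pretrained word embeddings learned from large text corpora. The point is simply that a sensor named "kitchen_lamp" in a new home lands close, in embedding space, to a "kitchen_light" sensor seen during training, whereas a numeric identifier would carry no such signal.

```python
import math

# Toy word vectors (invented for illustration; a real system would use
# pretrained embeddings such as word2vec or GloVe).
WORD_VECTORS = {
    "kitchen": [0.9, 0.1, 0.0],
    "light":   [0.1, 0.9, 0.1],
    "lamp":    [0.2, 0.8, 0.1],
    "door":    [0.0, 0.1, 0.9],
}

def embed(sensor_name):
    """Represent a sensor by averaging the vectors of the words in its name."""
    vecs = [WORD_VECTORS[w] for w in sensor_name.lower().split("_")
            if w in WORD_VECTORS]
    return [sum(component) / len(vecs) for component in zip(*vecs)]

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# A sensor from a new home ("kitchen_lamp") scores higher against the
# semantically related training sensor ("kitchen_light") than against an
# unrelated one ("kitchen_door"), even though the identifiers differ.
sim_related   = cosine(embed("kitchen_light"), embed("kitchen_lamp"))
sim_unrelated = cosine(embed("kitchen_light"), embed("kitchen_door"))
print(sim_related > sim_unrelated)  # prints True
```

Because similarity comes from the words themselves, an activity model trained against one home's sensor vocabulary can still score sensors from another home without any retraining on that environment.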

The researcher also explains that the techniques used are totally automatic. “At the end of the day, the system learns the words first and then the representation that we develop using those words. There is no human intervention. This is important from the perspective of scalability, since it has been proven to overcome the aforementioned difficulty.” Indeed, the new approach has achieved similar results to those obtained using the knowledge-based method.




More information:
Gorka Azkune et al. Cross-environment activity recognition using word embeddings for sensor and activity representation, Neurocomputing (2020). DOI: 10.1016/j.neucom.2020.08.044

Citation:
Teaching computers the meaning of sensor names in smart homes (2020, December 1)
retrieved 3 December 2020
from https://techxplore.com/news/2020-12-sensor-smart-homes.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

