Garments that can help people with deafblindness to perceive the world
2021-06-30
At the beginning of 2018, a very interesting EU project commenced, coordinated by the University of Borås. The researchers were to develop a garment that would serve as a communicative contact surface for people with deafblindness.
“For a person who cannot see or hear, everyday life is challenging in our society. We wanted to combine the knowledge of universities and companies across Europe, each with different expertise, to contribute to a technical aid for increased independence and perception of the environment,” said Nasrine Olson, Project Coordinator for SUITCEYES and Senior Lecturer in Library and Information Science at the University of Borås.
Three and a half years later, it is time to conclude the project. Nasrine Olson feels both pressed for time and expectant: only a few weeks remain until the project is presented to and evaluated by the European Commission, and there are many successful results to show.
Algorithms that recognise
Both cameras and sensors are integrated into the garment to perceive the user’s surroundings. The researchers have created algorithms that, with the help of the camera, can detect and recognise faces, objects, distances, and scenes. This information is then translated via a haptic interface into a form of communication: vibrations felt by the user.
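How such a pipeline could be wired together can be sketched roughly as follows. This is an illustrative outline only: the class names, the distance-based intensity rule, and the vibration call are assumptions made for the example, not the SUITCEYES implementation.

```python
# Illustrative sketch only: names and rules here are assumptions, not SUITCEYES code.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str        # e.g. "face:known", "cup", "doorway" from the camera's recogniser
    distance_m: float  # estimated distance to the detected object


class HapticInterface:
    """Stand-in for the garment's vibro-tactile output."""

    def vibrate(self, pattern: str, intensity: float) -> None:
        print(f"vibrating pattern {pattern!r} at intensity {intensity:.1f}")


def frame_to_haptics(detections: list[Detection], haptics: HapticInterface) -> None:
    """Translate one camera frame's detections into vibration messages."""
    for det in detections:
        # Illustrative rule: closer objects are signalled more strongly.
        intensity = max(0.2, 1.0 - det.distance_m / 5.0)
        haptics.vibrate(pattern=det.label, intensity=intensity)


frame_to_haptics([Detection("face:known", 1.5), Detection("cup", 0.4)], HapticInterface())
```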
“For example, the system can tell you that you are in the kitchen, there are five people in the room, and two of them are known to you. The system can now also recognise if someone is wearing a face mask, due to the pandemic,” said Nasrine Olson.
“It is also possible to ask questions of the system, such as ‘where is my mobile?’ Then the garment can guide the user in searching the room – it asks you, for example, to walk forward or to the left. But it can also remember things and tell the user ‘the last time you had it was in the bedroom,’” she continued.
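The mobile-phone example implies some form of object-location memory behind the scenes. A minimal sketch of such a memory is shown below; the function names and storage format are hypothetical and chosen only to illustrate the idea.

```python
# Hypothetical sketch of an object-location memory; not the project's implementation.
from datetime import datetime

last_seen: dict[str, tuple[str, datetime]] = {}  # object -> (room, time of last sighting)


def record_sighting(obj: str, room: str) -> None:
    """Update the memory whenever the camera recognises an object in a room."""
    last_seen[obj] = (room, datetime.now())


def answer_where_is(obj: str) -> str:
    """Answer a question such as 'where is my mobile?' from the memory."""
    if obj not in last_seen:
        return f"I have not seen your {obj} yet."
    room, when = last_seen[obj]
    return f"The last time you had your {obj} was in the {room} (at {when:%H:%M})."


record_sighting("mobile", "bedroom")
print(answer_where_is("mobile"))
```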
Software for haptograms
Nasrine Olson spoke about the innovation known as a "haptogram": a pattern that carries a meaning (e.g. happy, cup, room) and is created by activating "vibro-tactile" actuators – small electronic devices that vibrate. It is with the help of haptograms that important messages are formed and conveyed to the wearer of the garment.
Haptograms are communicated with these "vibro-tactile" actuators, which are organised in a grid of rows and columns in the garment; they can vibrate in sequence or simultaneously. Each haptogram is unique, with its own pattern: the order and number of nodes involved, the intensity of the vibration, and the number of times each node is activated. The actuators are integrated into the textile and can be positioned against different parts of the body, e.g. the back, shoulders, upper arms and forearms, waist, and upper and lower legs. In theory, there is no limit to how large the grid can be, but the largest one used in the project consisted of 96 actuators.
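Based on this description, a haptogram can be thought of as a small piece of data: an ordered sequence of grid nodes, each with an intensity and a repeat count. The sketch below shows one way such a structure could be expressed; the field names and the example pattern are assumptions, not the project's actual format.

```python
# Assumed representation of a haptogram for illustration; not the SUITCEYES format.
from dataclasses import dataclass


@dataclass
class Activation:
    row: int          # actuator position in the grid
    col: int
    intensity: float  # vibration strength, e.g. 0.0-1.0
    repeats: int      # how many times this node is pulsed


@dataclass
class Haptogram:
    meaning: str                # e.g. "happy", "cup", "room"
    sequence: list[Activation]  # nodes activated in this order (or grouped to fire together)


# Example: a made-up two-node pattern meaning "happy" on an 8 x 12 grid (96 actuators).
happy = Haptogram(
    meaning="happy",
    sequence=[
        Activation(row=0, col=0, intensity=0.8, repeats=2),
        Activation(row=3, col=5, intensity=0.5, repeats=1),
    ],
)
```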
“We have created several different variants of both hardware and software that make it possible to program the pattern and its meaning via a screen, for example a tablet. This then forms a library of haptograms corresponding to words such as happy, sad, one, two, three, and so on.”
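Continuing the sketch above, such a library could be little more than a mapping from words to stored patterns that the tablet editor fills in; again, this is purely illustrative.

```python
# Purely illustrative continuation of the Haptogram sketch above.
library: dict[str, Haptogram] = {
    "happy": happy,
    "sad": Haptogram("sad", [Activation(1, 1, 0.4, 3)]),
    "one": Haptogram("one", [Activation(0, 0, 1.0, 1)]),
}


def pattern_for(word: str) -> Haptogram:
    """Look up the pattern to play on the garment for a given word."""
    return library[word]
```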
This technology can also be used by interpreters. An interpreter can draw on the tablet instead of directly onto the body, as is done today. The interpreter can send, for example, "everyone laughs," and this can then be communicated with a haptogram via the garment to one or more people at the same time. These people do not have to be present in the same room as the interpreter; during testing, users were even located in different countries.
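One way to picture the remote-interpretation scenario is an interpreter's device sending a word over the network to every connected garment, which then plays the matching haptogram from its own library. The transport below (a small JSON message over a TCP connection) and the host names are assumptions made for illustration.

```python
# Assumed transport for the remote-interpretation idea; hosts and format are made up.
import json
import socket


def broadcast_word(word: str, recipients: list[tuple[str, int]]) -> None:
    """Send a word such as 'everyone laughs' to every connected garment."""
    message = json.dumps({"type": "haptogram", "word": word}).encode("utf-8")
    for host, port in recipients:
        with socket.create_connection((host, port), timeout=5) as conn:
            conn.sendall(message)


# The interpreter and the users can be in different countries:
# broadcast_word("everyone laughs",
#                [("garment-1.example.org", 9000), ("garment-2.example.org", 9000)])
```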
In addition, the system includes prototype games, intended partly to help users learn the technology and partly as entertainment. Users can also program their own haptograms.
All integrated into one garment
Researchers from Smart Textiles at Science Park Borås, University of Borås, have also been involved in the project, working on integrating the technology into textiles, as we humans almost always have fabric close to our bodies. This has resulted in several different prototypes. The current variations include a number of vests that range from fully adjustable (designed for experimental purposes and to suit the different sizes and body shapes of the participants) to tailored (designed to fit a specific user). There are also dresses, to show the potential for creating aesthetically pleasing, fashionable clothes that can simultaneously accommodate sophisticated technical components. In addition to these, there is another wearable garment with multiple panels that is intended for conveying semantic content to various parts of the body. Not all versions are designed as clothes: for example, a version called "chairable" is designed to be mounted onto the back of a typical office chair in order to convey messages to the person sitting in it.
“Like all other aspects of life, the pandemic has also affected the implementation of our project on various fronts. SUITCEYES is by its very nature a user-centred innovation project, and therefore access to both users and technical components is at the core of the project. Access to both of these has been severely affected due to COVID-19. Yet despite the challenges we have faced, we are happy to see a strong conclusion to this project,” said Nasrine Olson.
Many ideas for the future
A final symposium for the project was held a few weeks ago. There were over 400 participants from 39 countries, including representatives of organisations and companies that could be interested in commercialising the product.
“What we have developed are prototypes. There is a long way to go before these can be sold. But while the market is small, it does exist, and we will continue to collaborate via the networks we have built up,” she said.
“All our articles and research results are open to other researchers to build upon. We have also applied for funding for new projects that we hope will be granted. So even if this project is now completed, we will continue to work on our ideas,” concluded Nasrine Olson.
Read more
See a video about the project on YouTube
Anna Kjellsson, Translation Eva Medin