Embedded computing

Sensing and communicating our ups/downs and many needs

The power of computing grows more pervasive as it is increasingly integrated into everyday items in our immediate environment – a smart toothbrush, for example, or a vortex whistle for managing chronic lung function via a smartphone. In September 2016, scientists, developers and designers from around the world met in Heidelberg, Germany, for two parallel events: the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp) and the International Symposium on Wearable Computers (ISWC) 2016. What was their aim? To compare notes on how computing can be embedded into daily life.

Report: Cornelia Wels-Maug

Google Glass is a wearable computer. Source: Shutterstock/Hattanas

Sensors everywhere?

The two conferences showed that, alongside smartphones, sensors embedded in everyday objects such as watches, glasses, fabric and jewellery permit new ways of collecting and exchanging data. By bringing artificial intelligence (AI), cognitive computing and deep learning into the mix, this development allows the healthcare and wellness industries not only to monitor individuals with more sophistication but also to predict behaviour patterns and trajectories of diseases, among other things. Research in the overall healthcare field featured prominently at both events.

While some solutions were clearly at an early development stage and others were closer to market launch, European Hospital was intrigued by innovations such as:

  • Technology enabling users to make up for physical disabilities when communicating with the outside world. For example, ‘mood lenses’ are glasses that enable people with a neurodegenerative disease, who have lost control of their facial muscles, to express their emotions via physiological signals captured by wearable technology. The tool restores a form of non-verbal communication with the help of an in-lens fibre-optic display.
  • Improving note-taking during surgical procedures. Operations need to be documented, yet surgeons struggle to find the most accurate and time-efficient way to do so. Waiting until the procedure is over risks inaccuracy, which can be compounded by inconsistencies in note-taking because there is no standardised format for paper-based systems. With the surgeon’s hands busy, documenting during a procedure is difficult. Developments are under way to establish whether Google Glass can help and what the best input modalities would be – perhaps speech, head movement or discrete head gestures?
  • Advancing the measurement of body and mental functions. This covers a wide array of development and research efforts, including, for example, the design of a non-invasive blood screening system. By exploiting the optical properties of blood and applying machine-learning algorithms, haemoglobin concentration could be estimated from the blood’s colour when a finger is illuminated and imaged with a smartphone camera (see the first sketch after this list).
  • Predicting patient behaviour: Hospitals have a vital interest in cutting re-admission rates to avoid penalty payments. Given the positive correlation between cancer patients’ physical activity after surgery and their likelihood of not being re-admitted, why not use a step counter or another wireless-enabled wearable to document patients’ mobility as an indicator of re-admission risk (see the second sketch after this list)? This would help to trigger timely interventions such as changes in medication or arranging home monitoring.
  • Another fascinating application is work on predicting the onset of depression by combining a person’s wearable and smartphone data with social-network activity and geo-location information, and analysing it all with machine learning. The outcome is then fed back to the individual, who is often unaware of any mood changes. By making the person aware of these factors, it is hoped to prevent depression or even suicide.
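
The haemoglobin estimation mentioned above was presented only at a conceptual level; the sketch below is an illustrative guess at how such a pipeline might look, not the researchers’ actual method. It assumes a hypothetical calibration set of fingertip photographs paired with laboratory-measured haemoglobin values, and fits a simple least-squares regression from average colour-channel intensities to concentration.

```python
# Illustrative sketch only: estimate haemoglobin (g/dL) from the average
# colour of a smartphone fingertip photo. The file names, calibration
# values and the linear model are all hypothetical assumptions.
import numpy as np
from PIL import Image

def colour_features(path):
    """Return the mean red, green and blue intensity of a fingertip photo."""
    pixels = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    return pixels.reshape(-1, 3).mean(axis=0)  # [mean_R, mean_G, mean_B]

# Hypothetical calibration set: photos paired with lab-measured haemoglobin.
train_photos = ["finger_01.jpg", "finger_02.jpg", "finger_03.jpg", "finger_04.jpg"]
lab_hb = np.array([13.5, 11.2, 9.8, 14.1])           # g/dL, invented values

X = np.array([colour_features(p) for p in train_photos])
X = np.hstack([X, np.ones((len(X), 1))])              # add intercept term
coeffs, *_ = np.linalg.lstsq(X, lab_hb, rcond=None)   # least-squares fit

def estimate_hb(path):
    """Predict haemoglobin concentration for a new fingertip photo."""
    features = np.append(colour_features(path), 1.0)
    return float(features @ coeffs)

print(f"Estimated haemoglobin: {estimate_hb('finger_new.jpg'):.1f} g/dL")
```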

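The step-count idea for re-admission prediction was likewise described only as a concept. The second sketch, again entirely hypothetical, shows how average daily step counts from a wearable could feed a simple logistic-regression classifier that flags patients whose low mobility suggests elevated re-admission risk; the patient data and alert threshold are invented for illustration.

```python
# Illustrative sketch only: flag elevated re-admission risk from daily step
# counts recorded by a wearable. Training data and threshold are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: average daily steps in the first week after
# surgery, and whether the patient was re-admitted within 30 days.
avg_daily_steps = np.array([[420], [650], [900], [1500], [2300], [3100], [4000], [5200]])
readmitted = np.array([1, 1, 1, 0, 0, 0, 0, 0])

model = LogisticRegression().fit(avg_daily_steps, readmitted)

def readmission_alert(steps_per_day, threshold=0.5):
    """Return (alert, probability): alert is True if risk exceeds the threshold."""
    prob = model.predict_proba([[steps_per_day]])[0, 1]
    return prob > threshold, prob

alert, prob = readmission_alert(800)
print(f"Predicted re-admission risk {prob:.0%} -> trigger intervention: {alert}")
```
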
Outlook

Considerable work still needs to go into measuring and improving the user experience. Work presented at the Heidelberg events made it clear that monitoring alone is a welcome first step, but by no means enough. More intelligence is called for. Individuals not only want to read their vital signs or blood glucose levels by, for example, simply holding their smartphone, but also want advice on how to stay healthy, or become healthy faster, based on the current readings. User experience and design will become more important. Consumers increasingly expect customised interactions that take into account their personal communication preferences as well as their circumstances.

And, yes, these are all projects in the making, shining a light on what might be possible. It will take years before some of them are mature enough to become actual products – like that electronic toothbrush, equipped with a water-resistant camera, that informs the user about the level of plaque on her teeth using different sounds as feedback. There is just one little drawback – it does not work well with toothpaste!

15.12.2016
