Directing a smarter future with seamless human-machine interactions
With the rapid advance and proliferation of artificial intelligence (AI), innovators have begun developing smarter machines to bolster a wide array of human activities.
Whenever you push the buttons of an ATM to withdraw cash, tap on your smartphone to send a text, or turn a knob to control the volume of your favourite song, you are interacting with a machine that is coded to respond to your actions. Such interactions take place through a human-machine interface (HMI). In particular, physically interacting with technology through tools like a mouse, keyboard or touch screen is known as tangible HMI.
Now, imagine being able to control digital devices with just a wave of your hand, a dart of your eyes or a flicker of thought. These interactions, known as intangible HMI, may sound like the stuff of science fiction, but thanks to advancements in materials science and AI, they are entering reality.
In this feature, we explore some exciting and innovative technologies that could take HMI to the next level and further enhance our control over everyday technology as we know it.
Contactless solutions are just a gesture away
Over the past two years, many of our interactions have had to adapt quickly to safe-distancing rules and contactless environments. To keep communities prepared and safe, innovators can harness one simple everyday action, gesturing, to change the way we communicate with digital devices.
Touchless gesture recognition for electronic devices is a type of motion-sensing HMI technology, underpinned by tracking the user’s hand movements. Powered by next-generation materials, a new gesture sensor designed by Singapore-based innovators takes the form of a thin film. These sensors are highly responsive, sensitive, energy-efficient and easier to deploy than their conventional camera-based counterparts, which struggle to work effectively at close range and quickly drain the battery of the electronic device.
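The article does not describe how such a thin-film sensor exposes its readings, but the hypothetical Python sketch below illustrates the general idea of touchless gesture recognition: a sensor with two electrodes streams signal amplitudes, and a swipe direction is inferred from which electrode responds first. The SensorFrame structure, field names and threshold are assumptions made for illustration, not the actual device's interface.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SensorFrame:
    """One reading from a hypothetical two-electrode thin-film gesture sensor.

    left/right are normalised signal amplitudes (0.0 = no hand nearby,
    1.0 = hand directly over that electrode).
    """
    timestamp_ms: int
    left: float
    right: float

def classify_swipe(frames: List[SensorFrame], threshold: float = 0.5) -> Optional[str]:
    """Return 'swipe_left' or 'swipe_right' based on which electrode
    crosses the threshold first, or None if no clear gesture is seen."""
    left_hit = next((f.timestamp_ms for f in frames if f.left > threshold), None)
    right_hit = next((f.timestamp_ms for f in frames if f.right > threshold), None)
    if left_hit is None or right_hit is None:
        return None
    # A hand sweeping left-to-right triggers the left electrode first.
    return "swipe_right" if left_hit < right_hit else "swipe_left"

# Example: a hand passing from left to right over the film.
frames = [
    SensorFrame(0, 0.1, 0.0),
    SensorFrame(40, 0.7, 0.2),   # left electrode peaks first
    SensorFrame(80, 0.3, 0.8),   # then the right electrode
]
print(classify_swipe(frames))    # -> 'swipe_right'
```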
As hygiene awareness takes centre stage amid the COVID-19 pandemic, such inventions could prove useful in public spaces. For instance, instead of physically interacting with electronic screens like digital directories at shopping malls, shoppers could interact with a gesture-enabled screen to reduce the risk of disease transmission.
Eyeing ways to make decisions
While the eyes are the window to the soul, they can also serve as highly accurate decision-making tools. Capitalising on the agility of the human eye, some machines can now track eye movements to help us carry out a plethora of tasks.
Most gaze-based HMIs work by directing light towards the user’s pupils, producing reflections on both the pupil and the cornea. These reflections then provide information about the movement of the user’s eyes. However, such systems typically rely on infrared-based sensors that can become uncomfortable after long periods of wear.
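As a simplified, hypothetical sketch of the pupil-corneal-reflection principle described above, the Python snippet below maps the vector between the pupil centre and the corneal glint (in camera pixels) to a point on the screen, using an affine calibration fitted from a few known targets. Real eye-trackers use far more sophisticated models; the values here are made up for illustration.

```python
import numpy as np

def fit_gaze_mapping(pupil_glint_vectors, screen_points):
    """Fit an affine map from pupil-minus-glint vectors (camera pixels)
    to screen coordinates, using a handful of calibration targets."""
    X = np.hstack([pupil_glint_vectors, np.ones((len(pupil_glint_vectors), 1))])
    # Least-squares solve for a 3x2 affine transform.
    A, *_ = np.linalg.lstsq(X, screen_points, rcond=None)
    return A

def estimate_gaze(A, pupil_centre, glint_centre):
    """Map the current pupil and glint positions to an on-screen gaze point."""
    v = np.append(np.asarray(pupil_centre) - np.asarray(glint_centre), 1.0)
    return v @ A

# Calibration: the user looks at four known screen targets while the camera
# records pupil and glint positions (illustrative numbers only).
vectors = np.array([[-12.0, -8.0], [11.0, -7.5], [-11.5, 9.0], [12.5, 8.5]])
targets = np.array([[100, 100], [1820, 100], [100, 980], [1820, 980]])
A = fit_gaze_mapping(vectors, targets)
print(estimate_gaze(A, pupil_centre=(412.0, 300.0), glint_centre=(412.0, 300.5)))
```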
Recently, innovators have come up with solutions such as green light-based cameras that are more comfortable for extended use. One such innovation, a head-mounted eye-tracker, adapts traditional virtual reality headsets to detect and interpret eye gaze and movement.
For instance, users can make selections on an electronic screen by gazing instead of physically clicking a mouse or tapping the screen.
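One common way to turn gaze into a "click" is dwell-time selection: an element is selected once the user's gaze rests on it for long enough. The sketch below is a generic, hypothetical illustration of that idea rather than the actual device's software; get_gaze_target stands in for whatever function reports the element currently under the user's gaze.

```python
import time

def dwell_select(get_gaze_target, dwell_seconds: float = 1.0, poll_hz: int = 30):
    """Return the on-screen element the user has gazed at continuously
    for `dwell_seconds`, emulating a click without any physical input.

    `get_gaze_target` is a caller-supplied function that returns the ID of
    the UI element currently under the user's gaze (or None).
    """
    current, since = None, None
    while True:
        target = get_gaze_target()
        if target != current:
            # Gaze moved to a new element (or off-screen): restart the timer.
            current, since = target, time.monotonic()
        elif current is not None and time.monotonic() - since >= dwell_seconds:
            return current
        time.sleep(1.0 / poll_hz)
```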
Similar to gesture-based HMI, gaze-based HMI has both business and healthcare applications. From monitoring remote environments and controlling machines in tele-operations to improving retail sales by understanding customer behaviour, many industries could benefit from eye-tracking devices.
Such an invention could also be a boon to the healthcare sector. For people living with motor neuron diseases and disorders, being able to exert control over their environment and communicate with others using only their eyes could allow them to regain independence.
It’s all in your head
For decades, the prospect of mind control has been tantalising neuroscientists and science-fiction fans alike. Recently, with the rapid advancement of neuroscience and technology, the line between fantasy and reality has become increasingly blurred. For example, bio-based HMI technology can pick up electrical signals from different parts of the human body and transform these signals into action on a machine.
Electrical signals are generated by the flow of calcium, potassium, sodium and chloride ions across neuronal membranes. These signals can then be analysed by an AI platform to monitor fatigue, manage cognitive concerns and even command machines.
Coupled with AI-driven platforms, wearable electroencephalography (EEG) devices can also be used to reveal a person’s thoughts, emotions and attention. For example, an AI platform developed in Singapore works in tandem with a low-powered, portable and Bluetooth-enabled EEG wearable. The wearable’s six EEG sensors pick up signals, and the AI platform interprets them as various mental states, from focus and fatigue to attention and relaxation.
To allow for further innovation, the platform also includes software development kits that healthcare professionals, researchers and third-party developers can use to create their own solutions for brain health.
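The actual models behind such platforms are not described here, but as a generic, hypothetical illustration of how EEG channels might be turned into a mental-state label, the sketch below computes average theta, alpha and beta band power across six channels and applies a crude heuristic (for instance, a high theta-to-beta ratio as a rough sign of fatigue). A production system would rely on trained machine-learning models rather than fixed thresholds like these.

```python
import numpy as np

def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Average spectral power of one EEG channel in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return float(psd[mask].mean())

def mental_state(channels: np.ndarray, fs: float = 250.0) -> str:
    """Crude heuristic label from six EEG channels (shape: 6 x samples).

    High theta-to-beta power suggests fatigue or drowsiness; high alpha
    power suggests relaxation; otherwise label the segment as focused.
    """
    theta = np.mean([band_power(ch, fs, 4, 8) for ch in channels])
    alpha = np.mean([band_power(ch, fs, 8, 13) for ch in channels])
    beta = np.mean([band_power(ch, fs, 13, 30) for ch in channels])
    if theta / (beta + 1e-9) > 2.0:
        return "fatigued"
    if alpha > beta:
        return "relaxed"
    return "focused"

# Example: two seconds of synthetic data dominated by a 10 Hz (alpha) rhythm.
fs = 250.0
t = np.arange(0, 2, 1 / fs)
channels = np.vstack([np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(len(t))
                      for _ in range(6)])
print(mental_state(channels, fs))   # most likely 'relaxed'
```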
Besides the brain, electrical signals can also be gathered from muscles and eyes via electromyography and electrooculography sensors, respectively. Through electro-sensing wearables, AI platforms can harness information from such electrical impulses and empower users to execute tasks with ease. From shuffling a playlist to changing the slides on a presentation and piloting a drone from afar, many tasks can be accomplished with a simple twitch of the finger.
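As a hypothetical sketch of how such a wearable might translate muscle activity into commands, the snippet below counts finger twitches in a window of rectified EMG amplitudes and maps the twitch count to an action such as advancing a slide or shuffling a playlist. The threshold, refractory period and command mapping are illustrative assumptions, not the actual product's design.

```python
from typing import Callable, Dict, List

def detect_twitches(amplitudes: List[float], threshold: float = 0.6,
                    refractory: int = 3) -> int:
    """Count finger twitches in a window of rectified EMG amplitudes.

    A twitch is a threshold crossing; `refractory` samples are skipped
    after each detection so a single twitch is not counted twice.
    """
    count, skip = 0, 0
    for a in amplitudes:
        if skip > 0:
            skip -= 1
        elif a >= threshold:
            count += 1
            skip = refractory
    return count

def dispatch(amplitudes: List[float], actions: Dict[int, Callable[[], None]]) -> None:
    """Run the action registered for the detected twitch count, if any."""
    twitches = detect_twitches(amplitudes)
    if twitches in actions:
        actions[twitches]()

# Example: a window containing two distinct twitches.
window = [0.1, 0.2, 0.9, 0.3, 0.1, 0.1, 0.1, 0.8, 0.2, 0.1]
dispatch(window, {1: lambda: print("next slide"),
                  2: lambda: print("shuffle playlist")})   # prints 'shuffle playlist'
```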
While each interface currently addresses specific applications, each also faces unique limitations. As innovators continue to advance HMI, limitations in accuracy and ease of adoption can be overcome with newly developed technologies.
Whether enhancing health outcomes, generating valuable business insights or simply making daily life more convenient—HMIs, powered by human ingenuity, have the potential to connect machines and humans seamlessly.