People who rely on eye tracking every day to interact with their devices now have a powerful new tool in their arsenal: Google Assistant. The Assistant's many integrations and communication tools, normally driven by voice, can now be controlled by gaze instead, which should extend the functionality of the Tobii Dynavox devices on which it now works.
The Assistant can now be added as a tile on Tobii's eye-tracking tablets and mobile apps, which present a large, customizable grid of frequently used items that the user activates by looking at them. It acts as an intermediary for the large collection of other software and hardware interfaces Google supports.
For example, smart home devices, which can be incredibly useful for people with certain disabilities, may not have an interface that's easily accessible from a gaze-tracking device, forcing users to rely on other means or potentially limiting what they can do. Google Assistant works with tons of this stuff out of the box.
"The ability to control the things and the world around you is central for many of our users," said Fredrik Ruben, CEO of Tobii Dynavox. "The Google Assistant ecosystem offers almost unlimited possibilities – and offers our user community a lot of normalcy."
Users can set up Assistant tiles to automate commands, apps, and requests like "What's on my calendar today?" All the setup requires is a Google account. Then the eye-tracking device (in this case running Tobii Dynavox's mysteriously named Snap Core First app) must be added to the Google Home app as a smart speaker/display. Assistant tiles can then be added to the user interface and customized with the commands that would normally be spoken.
Ruben said integrating the Google software was "technically straightforward." "Because our software itself was designed for a wide variety of access needs and for rolling out third-party services, there was a natural match between our software and the Google Assistant services," he said.
Tobii's built-in symbol library (e.g. a light with an up arrow, a door opening or closing, and other visual representations of actions) can also easily be applied to the Assistant shortcuts.
For Google, this is just the latest in a number of interesting accessibility services the company has developed, including live transcription, detecting when sign language is being used in group video calls, and speech recognition that accounts for non-standard voices and people with speech impairments. Much of the tech world remains inaccessible, but at least the big tech companies are doing good work here and there to help.