Making machines adaptive

MISSION

We believe that true digital inclusion means that everybody, regardless of their needs, abilities or physicality, must be able to operate computers, workstations and smart devices, and to navigate digital environments (augmented reality, virtual reality, the metaverse).

Ultimately, Inclusive Brains’ proprietary generative AI aims to turn any connected device with sensors into an adaptive neural interface. Our innovation therefore empowers machines and digital environments to adapt, in real time, to who their users are and to how they feel.

Our solutions open new possibilities for people with disabilities to gain unrestricted access to education and to (re)join the workforce. And, of course, they open possibilities for EVERYBODY, regardless of their physicality, needs and abilities, to be assisted by adaptive machines that optimize decision-making and improve performance while preserving physical and mental health.

INSPIRATION

Olivier and Paul dream big. But their dreams are a bit different from most people’s. They do not dream about going to space or becoming movie stars. They dream about the remote control. First introduced as an assistive technology to make life easier for people with disabilities, the remote control made TV sets more inclusive. It then became a pervasive product (and a very profitable business) that changed the way billions of people interact with technology.

Inclusive Brains uses the technological and business trajectory of the remote control (and of other innovations initially developed for people with disabilities that later gained mass adoption, such as voice assistants) as an inspiration and a benchmark for its roadmap: from serving people with special needs to serving a mass market.

VISION - IMPROVING HUMAN-MACHINE INTERACTIONS

Large Language Models (LLMs, such as GPT-4 or Llama 2) are a first step in empowering machines to better understand how people function, thanks to an unprecedented command of human language, and they are paving the way for new human-machine interactions. However, language alone cannot capture the complexity of human cognition, intentions and behaviors, which are rooted in neurophysiological processes that language barely touches. To address this gap, the first generation of smart wearables (e.g. wristbands and watches), launched more than a decade ago, was equipped with motion and heart-rate sensors in an attempt to identify and track physiological markers of emotional responses.
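To make that attempt concrete: one of the most common heart-rate-derived markers is heart-rate variability, often summarized as RMSSD, which tends to drop under stress. The sketch below is purely illustrative; the RR-interval values are invented, and it does not describe any specific product.

```python
# Illustrative sketch: a heart-rate-variability stress proxy of the kind
# first-generation wearables compute. All values are hypothetical.
import numpy as np

def rmssd(rr_intervals_ms: np.ndarray) -> float:
    """Root mean square of successive differences between RR intervals (ms).
    Lower RMSSD broadly correlates with higher stress/arousal."""
    diffs = np.diff(rr_intervals_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

# Hypothetical beat-to-beat intervals from an optical heart-rate sensor.
rr_rest = np.array([812.0, 830.0, 801.0, 845.0, 820.0, 838.0])
rr_stressed = np.array([640.0, 645.0, 642.0, 648.0, 641.0, 644.0])

print(f"RMSSD at rest:      {rmssd(rr_rest):.1f} ms")      # higher variability
print(f"RMSSD under stress: {rmssd(rr_stressed):.1f} ms")  # lower variability
```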

Despite their commercial success, first-generation smart devices cannot offer rigorous insights into complex affective and cognitive states such as stress, attention, cognitive load or mind wandering.

These limitations did not go unnoticed by Big Tech companies, which have recently invested in, and/or acquired, technologies to launch a second generation of smart wearables equipped with additional sensors (electrodermal activity, eye movements and even brainwaves) that offer better insights into emotions and cognition. These second-generation smart devices aim to provide not only monitoring of users’ cognitive and affective functions but also predictive analytics to improve their physical and mental health. This is a testament to where the human-machine interface market is going: leveraging neurophysiology so that machines can offer multisensory experiences, better sense how people feel in real time, and introduce new ways for users to control smart devices and digital environments (AR, VR, the metaverse) through the silent, motionless and touchless control afforded by mental commands.
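For readers curious about how mental commands typically work under the hood, the sketch below shows the textbook EEG recipe: band-pass filter the signal, extract frequency-band power features, and hand them to a trained classifier. This is a generic illustration, not Inclusive Brains’ proprietary pipeline; the sampling rate, frequency bands and names are assumptions.

```python
# Generic EEG "mental command" feature pipeline (illustrative only).
import numpy as np
from scipy.signal import butter, sosfiltfilt, welch

FS = 256  # hypothetical sampling rate in Hz

def bandpass(x: np.ndarray, low: float, high: float, fs: int = FS) -> np.ndarray:
    # 4th-order Butterworth band-pass, applied forward-backward (zero phase).
    sos = butter(4, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def band_power(x: np.ndarray, low: float, high: float, fs: int = FS) -> float:
    # Integrate the Welch power spectral density over [low, high] Hz.
    freqs, psd = welch(x, fs=fs, nperseg=fs)
    mask = (freqs >= low) & (freqs <= high)
    return float(psd[mask].sum() * (freqs[1] - freqs[0]))

def features(window: np.ndarray) -> np.ndarray:
    """Alpha (8-12 Hz) and beta (13-30 Hz) band power: a common minimal
    feature set for attention- and motor-imagery-based commands."""
    clean = bandpass(window, 1.0, 40.0)
    return np.array([band_power(clean, 8, 12), band_power(clean, 13, 30)])

# A classifier trained on labeled windows (e.g. scikit-learn's
# LinearDiscriminantAnalysis) would map these features to discrete
# commands such as "select" or "scroll".
window = np.random.randn(2 * FS)  # 2 s of hypothetical single-channel EEG
print(features(window))
```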

MACHINES MUST BE INCLUSIVE AND ADAPTIVE

As advanced as this second generation of physiology-powered human-machine interfaces is, it does not (yet) capture the true essence of what makes an interaction effective: adaptiveness.

First, most smart devices and digital environments are operated via interfaces designed for the great majority of people, that is, people who can use their hands to type on a keyboard, move a mouse or touch a screen. People who are paralyzed, who have lost limbs, or who cannot speak because of a neurodegenerative disease face digital exclusion, and therefore often have limited opportunities to study and/or to work. Digital inclusion must improve. This is why the next generation of human-machine interfaces must also offer alternate ways to operate smart devices and navigate digital environments, starting with silent, touchless and motionless control, so that even people who cannot use their hands can operate computers and workstations.

Second, people still interact with their computers, phones and connected devices in a somewhat old-school way: users give orders (via text, voice, gestures or eye movements) and machines respond. There is no denying that machines respond in ways that are faster, more accurate and more creative than ever. Yet this remains a one-way, stimulus-response exchange, whereas true interaction requires information to circulate both ways: a genuine dialogue between the two parties, be they two humans or a human and a machine. What second-generation human-machine interactions still lack is the ability to adapt in real time to how we, their users, feel and to what makes each and every one of us unique.

This is where Inclusive Brains’ innovation enters the game. We are building neural human-machine interfaces that can adapt to:

- How different each of us is. Whether or not you can use all your limbs, and whether or not you can speak, mental commands allow everybody to operate smart devices and to navigate digital environments.
- How we feel and how this changes over time. Sensing whether a user is distracted, stressed, fatigued or experiencing high cognitive load is key to improving their performance, as well as to helping preserve and improve their physical and mental health; a minimal sketch of this sense-and-adapt loop follows this list.
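Below is a minimal sketch of that sense-and-adapt loop, with invented state names, thresholds and actions; it illustrates the concept only, not Inclusive Brains’ actual system.

```python
# Conceptual sketch of an adaptive interface loop (all names and
# thresholds are invented for illustration).
from dataclasses import dataclass

@dataclass
class UserState:
    cognitive_load: float  # 0..1, e.g. derived from EEG band-power ratios
    stress: float          # 0..1, e.g. derived from HRV and electrodermal data

def adapt_interface(state: UserState) -> list[str]:
    """Return interface adjustments for the current user state."""
    actions = []
    if state.cognitive_load > 0.7:
        actions.append("simplify layout and defer non-urgent notifications")
    if state.stress > 0.6:
        actions.append("slow the content pace and suggest a short break")
    if not actions:
        actions.append("no change: user is in a comfortable operating zone")
    return actions

# Example: a user under high cognitive load but moderate stress.
print(adapt_interface(UserState(cognitive_load=0.82, stress=0.4)))
```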
