BrainAble will conceive, research, design, implement and validate an ICT-based human-computer interface (HCI) composed of Brain/Neural Computer Interface (BNCI) sensors combined with affective computing and virtual environments. This combination will dramatically improve the quality of life of people with disabilities by addressing the two main forms of exclusion they suffer - exclusion from home activities and exclusion from social activities - by providing inner functional independence for daily-life activities and autonomy (an HCI connected to accessible, interoperable home and urban automation) and outer social inclusion (an HCI connected to advanced, adapted social network services).
In terms of HCI, BrainAble will improve both direct and indirect interaction with computers. Direct control will be upgraded by creating tools that allow people to control these inner and outer environments using a “hybrid” Brain-Computer Interface (BCI) system that combines BCI signals with the electrooculogram (EOG), electromyography (EMG) and heart rate.
Furthermore, BNCI information will be used for indirect interaction, for example by changing interface or overall system parameters based on measures of boredom, confusion, frustration or information overload. These self-adaptive tools will increase effective bandwidth, both because users will be able to draw on several signals to effect control and because adaptation will reduce errors and help provide the user with the desired control.
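The two mechanisms above - fusing several biosignal channels into one direct command, and indirectly adapting system parameters from an affective-state estimate - can be sketched as follows. This is an illustrative sketch only, not BrainAble's actual implementation; all names, weights and thresholds are hypothetical.

```python
# Hypothetical sketch of hybrid-BCI fusion plus affective self-adaptation.
from dataclasses import dataclass

@dataclass
class ChannelReading:
    channel: str       # e.g. "BCI", "EOG", "EMG", "heart_rate"
    command: str       # command this channel's classifier voted for
    confidence: float  # classifier confidence in [0, 1]

def fuse_channels(readings, threshold):
    """Direct control: weighted vote across channels. A command is issued
    only when its summed confidence clears the threshold; otherwise the
    system withholds output rather than risk a wrong action."""
    scores = {}
    for r in readings:
        scores[r.command] = scores.get(r.command, 0.0) + r.confidence
    command, score = max(scores.items(), key=lambda kv: kv[1])
    return command if score >= threshold else None

def adapt_threshold(threshold, frustration):
    """Indirect interaction: raise the decision threshold when the user
    appears frustrated (a sign of misclassifications), lower it when the
    user is calm, so adaptation trades speed against error rate."""
    if frustration > 0.7:
        return min(threshold + 0.1, 1.5)
    if frustration < 0.3:
        return max(threshold - 0.1, 0.5)
    return threshold

readings = [
    ChannelReading("BCI", "lights_on", 0.6),
    ChannelReading("EOG", "lights_on", 0.5),
    ChannelReading("EMG", "tv_on", 0.4),
]
print(fuse_channels(readings, threshold=1.0))  # lights_on
```

Here two channels agreeing on a low-confidence command outweigh one dissenting channel, which is the point of a hybrid system: no single signal needs to be reliable on its own.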
BrainAble’s HCI will be complemented by an intelligent Virtual Reality-based user interface with avatars and scenarios that will help disabled people to move around in their wheelchairs, interact with all sorts of devices, create self-expression assets using music, pictures and text, communicate online and offline with other people, play games to counteract cognitive decline, and train in new functionalities and tasks.
BrainAble’s HCI will therefore allow the disabled user to manage two types of applications:
Inner virtual environment connected to home automation to augment the user’s functional capabilities and autonomy:
Apply control and automation techniques for the autonomy, comfort and security of the user in his living environment, modifying it as little as possible, by creating a wireless, lightweight and open home and urban automation network empowered by the Universal Remote Console (URC) standard for accessibility and interoperability of devices. Through a “BNCI + affective computing” interface to an audio-visual virtual environment, the user will be able to interact with and manage his accessible living environment. This will include his wheelchair, assistive robot, Heating, Ventilating and Air Conditioning (HVAC), artificial and natural lighting, audio, video, security (intrusion, fire, gas, water), intercoms, home robotics and other home devices (garage door, front door, security camera, television, pet and plant feeding and watering), as well as URC-compliant urban devices such as Automatic Teller Machines (ATMs), traffic lights or information displays.
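The URC standard (ISO/IEC 24752) achieves device interoperability by having each target device expose an abstract “user interface socket” of variables and commands, so that any controller - here, a BNCI-driven HCI - can operate it without knowing the device's native protocol. A minimal sketch of that idea follows; the class and device names are illustrative, not taken from the URC specification or from BrainAble.

```python
# Illustrative sketch of the URC "user interface socket" concept:
# a device-neutral surface of named variables and invokable commands.
class UISocket:
    """Abstract socket for one target device."""
    def __init__(self, target, variables, commands):
        self.target = target
        self.variables = dict(variables)  # variable name -> current value
        self.commands = dict(commands)    # command name -> callable

    def set_variable(self, name, value):
        # Controllers may only touch variables the socket declares.
        if name not in self.variables:
            raise KeyError(f"{self.target} has no variable {name!r}")
        self.variables[name] = value

    def invoke(self, name):
        return self.commands[name](self)

# Hypothetical lighting target; a wheelchair, HVAC unit or ATM would
# expose the same kind of socket with its own variables and commands.
lighting = UISocket(
    target="living_room_lighting",
    variables={"power": False, "level": 0},
    commands={"toggle": lambda s: s.set_variable(
        "power", not s.variables["power"])},
)

lighting.invoke("toggle")       # HCI issues an abstract command
lighting.set_variable("level", 80)
print(lighting.variables)       # {'power': True, 'level': 80}
```

The design point is that the HCI layer (BNCI, virtual environment, avatars) only ever talks to sockets, so adding a new URC-compliant home or urban device requires no change on the control side.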
Outer virtual environment connected to social network services to augment the user’s social inclusion:
Enable access to online communities of people who share interests and activities. A new trend is emerging of social networks created to help their members with various physical and mental ailments share self-expression assets, experiences and knowledge, as well as to provide tools for responsible health tracking. Social interaction between users will be synchronous/online, as in a conversation, and asynchronous/offline, as in a mail exchange. It is crucial to involve the scientific and industrial communities in order to avoid false or unproven expectations.