IIT-Madras builds AI tech to convert brain signals into language

By IANS | Published: February 3, 2020 04:30 PM IST

Researchers at the Indian Institute of Technology Madras (IIT-Madras) have developed an Artificial Intelligence (AI) technology to convert brain signals of speech impaired humans into language, the Institute said on Monday.


The researchers say the approach could also potentially interpret signals from nature, such as those produced during plant photosynthesis or a plant's response to external forces.

Electrical signals, brain signals, or any signals in general, are waveforms that are decoded into meaningful information using physical laws or mathematical transforms such as the Fourier Transform or the Laplace Transform.
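The idea of decoding a waveform with a mathematical transform can be illustrated with a short sketch (this is an illustration only, not the Institute's code): the Fourier Transform takes a signal recorded over time and reveals which frequencies it contains.

```python
import numpy as np

# Sample a synthetic signal made of two tones: 5 Hz and 12 Hz.
fs = 100                      # sampling rate in Hz
t = np.arange(0, 2, 1 / fs)   # 2 seconds of samples
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

# The Fourier Transform decomposes the waveform into frequency components.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The two strongest components recover exactly the tones we put in.
peaks = sorted(freqs[np.argsort(np.abs(spectrum))[-2:]])
print(peaks)  # [5.0, 12.0]
```

The same principle applies to any recorded waveform: once the signal is expressed in terms of its frequency content, patterns in it become much easier to analyse.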

These physical laws and mathematical transforms are science-based languages discovered by renowned scientists such as Isaac Newton and Jean-Baptiste Joseph Fourier.

"The output result is the ionic current, which represents the flow of ions which are charged particles. These electrically driven ionic current signals are worked on to be interpreted as human language meaning speech. This would tell us what the ions are trying to communicate with us," said study researcher Vishal Nandigana, Assistant Professor, Fluid Systems Laboratory, Department of Mechanical Engineering, IIT Madras.

"When we succeed with this effort, we will get electrophysiological data from the neurologists to get brain signals of speech impaired humans to know what they are trying to communicate," Nandigana added.

The researchers are working on how these real data signals can be decoded into human languages such as English, and whether they can be interpreted as a simple human language that all human beings can understand.

Brain signals are typically electrical signals. These are wave-like patterns with spikes, humps and crests that can be converted into simple human language, meaning speech, using Artificial Intelligence and Deep Learning algorithms.
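Before any learning algorithm can label such wave-like patterns, the spikes first have to be located in the recording. A minimal sketch of that preprocessing step, using a simple noise-threshold rule on a synthetic waveform (a hypothetical example, not the researchers' method):

```python
import numpy as np

# Hypothetical waveform: baseline noise with three injected "spikes"
# standing in for the sharp excursions seen in electrical recordings.
rng = np.random.default_rng(0)
wave = 0.1 * rng.standard_normal(1000)
for position in (150, 420, 800):
    wave[position] += 3.0

# Flag samples that rise well above the noise floor.
threshold = 5 * wave.std()
detected = np.flatnonzero(wave > threshold)
print(detected)  # [150 420 800]
```

In practice the detected spike times and shapes would then be fed as features into a trained model; this sketch only shows the signal-side starting point.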

This approach enables the researchers to read electrical signals directly from the brain.

They tested this concept in laboratory experiments by acquiring electrical signals from nanofluidic transport inside nanopores.

The nanopores were filled with saline solution and mediated using an electric field, the Institute said in a statement.

(With inputs from IANS)
