27 Apr 2016


What if we told you we could read your mind? You’d probably think we were messing with you…which, yeah, is a fair assumption. But believe it or not, this is totally possible, and one of our SXSW projects, Lari—featured in Engadget and Fortune—demonstrates how…


When you think of words, your neurons stimulate the area around your larynx, creating micro vibrations that can be picked up and recognized. This means we can tell what you’re thinking, even when you don’t say it out loud. It’s not mind reading—it’s subvocalization.


Imagine telepathy…

This is kind of close.

See, when thinking, people naturally vocalize spoken words in their minds, and—even when we don’t speak out loud—our neurons still stimulate the area around our larynx, creating micro vibrations. Using a combination of hardware, trainer software, and a machine learning algorithm, these micro vibrations can be captured and translated into text, allowing for vocal control without sound emission.

In other words, those things you say in your head aren’t as sacred or secret as you think, and when properly developed, this technology could allow for an utterly unbelievable form of Human UI: communication and commands via thought alone. Lari (named for larynx) is our exploration into harnessing this tech…and taking the words right out of your mouth.


  • Electromyography – Used to monitor the electrical impulses being sent to certain muscles when a user is speaking.
  • Trainer software – Captures these signals and analyzes them using a machine learning algorithm. Models of the data can be created and classified according to the words that were spoken. This allows a direct correlation to be made between the signals being sent and the words they align to, essentially forming a silent language of sorts. Once this has been established, when the user thinks of a word that has been classified, the software will recognize the signal they’re creating and return a match. That word can then be displayed or used in conjunction with other technology to serve as a voice command—without the voice.
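To make the train-then-match loop above concrete, here’s a minimal sketch in Python. Everything in it is an assumption for illustration—the feature choices (mean absolute value and zero crossings, two common EMG features), the nearest-centroid matcher, and all function names are stand-ins, not Lari’s actual implementation:

```python
import math

def features(window):
    """Reduce a raw EMG sample window to two simple features:
    mean absolute value and zero-crossing count."""
    mav = sum(abs(x) for x in window) / len(window)
    zc = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    return (mav, zc)

def train(labelled_windows):
    """labelled_windows: list of (word, window) pairs recorded while the
    user speaks each word. Returns a model: word -> average feature vector."""
    sums = {}
    for word, window in labelled_windows:
        mav, zc = features(window)
        s = sums.setdefault(word, [0.0, 0.0, 0])
        s[0] += mav
        s[1] += zc
        s[2] += 1
    return {w: (s[0] / s[2], s[1] / s[2]) for w, s in sums.items()}

def recognize(model, window):
    """Return the trained word whose feature centroid is nearest to
    this (possibly silently produced) window."""
    f = features(window)
    return min(model, key=lambda w: math.dist(model[w], f))
```

Once a model is trained on spoken examples, `recognize` can be pointed at signals captured during subvocalization, and the returned word used as a silent command.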


Besides blowing people’s minds, this technology could be utilized as an entirely new form of communication or command. For instance, imagine traveling to a foreign country. Instead of having to fumble over your sub-par Spanish in Madrid or flip frantically through a Spanish-English dictionary, you could simply think of what you wanted to say in your native tongue. The signals you emit by thinking could be recognized by Lari, translated into words, and then run through a translation application—no fumbling required.
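The chain in that scenario—recognized word in, translated word out—could be sketched like this. The phrasebook and function name here are hypothetical placeholders; a real system would hand the recognized word to an actual translation service:

```python
# Stand-in for a real translation service: a tiny English -> Spanish phrasebook.
PHRASEBOOK = {"water": "agua", "where": "dónde", "bathroom": "baño"}

def translate_thought(recognized_word, phrasebook=PHRASEBOOK):
    """Take a word returned by the recognizer and look up its translation,
    falling back to the original word if no entry exists."""
    return phrasebook.get(recognized_word, recognized_word)
```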

On a simpler note, imagine the last time you asked Alexa to do something for you. What if you could just think of a command instead? Lari eliminates the need for screens, movement, and sound—and this technology might just make for the most unbelievable UI ever.

For more Chaotic Moon magic, check out Sentiri, UnderCurrents and Noti-FLY!