Here’s a short list of applications that my friends and I would like to develop using the SDK:
Auto hunger detection
You are at work and you start to feel hungry. You’re not even consciously aware of the hunger at this stage. The hunger is automatically detected, and your lunch is ordered from your restaurant of choice. You know those days when you just crave some sushi, and the next day you don’t want to hear about sushi? Your meal choice is correlated with your neuro-signals, and a program will be able to “feel” and guess what you would like to have for lunch.
Temperature auto-tuning
You won’t need the air conditioner’s remote control when your new air conditioner automatically adjusts the temperature as you start to feel cold or hot.
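The adjustment loop behind this idea can be sketched as a simple controller. The comfort score (here −1.0 = too cold, +1.0 = too hot) is a hypothetical value that a BCI SDK would stream; in this sketch it is just a function argument.

```python
def adjust_setpoint(current_setpoint: float, comfort: float,
                    step: float = 0.5, deadband: float = 0.2) -> float:
    """Nudge the target temperature against the direction of discomfort.

    `comfort` is a hypothetical decoded signal in [-1, 1]:
    negative = too cold, positive = too hot.
    """
    if comfort > deadband:        # feels too hot -> cool down a step
        return current_setpoint - step
    if comfort < -deadband:       # feels too cold -> warm up a step
        return current_setpoint + step
    return current_setpoint       # comfortable: leave it alone
```

The deadband keeps the air conditioner from chasing every small fluctuation in the decoded signal.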
Capturing emotional memories
Your cognitive and mental states will one day become as priceless as the pictures you took on your vacation. Captured cognitive and emotional signals, combined with GPS location data and recorded audio, will let us learn important correlations between places, activities, events and mental states. These metrics will be the next stage in the revolution started by bio-feedback, and will allow us to reach new levels of self-control. Once we learn to recreate emotions, we could augment movies and pictures with the emotional state you were in when you took them, and create a more comprehensive capture of reality.
Emotion-based interactive movies
An interactive movie experience in which you implicitly navigate the plot using your emotions. Your emotional state is constantly fed into the playing device, which chooses the plot branch that will create the most enjoyable experience for you. The movie literally learns your taste and lets your hidden emotions steer the plot.
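The selection step above can be sketched as matching the viewer’s current emotion vector against emotion tags on each candidate scene. The enjoyment model is the hard, entirely hypothetical part; here it is reduced to a dot product, and both the emotion names and the scene ids are made up for illustration.

```python
def pick_branch(viewer_emotions: dict, candidates: dict) -> str:
    """Return the candidate scene id whose emotion tags best match
    the viewer's current (hypothetical, decoded) emotional state."""
    def score(scene_tags: dict) -> float:
        return sum(viewer_emotions.get(k, 0.0) * v
                   for k, v in scene_tags.items())
    return max(candidates, key=lambda scene_id: score(candidates[scene_id]))

# Illustration: a viewer currently high on suspense gets the chase branch.
viewer = {"suspense": 0.9, "humor": 0.1}
branches = {
    "chase_scene":  {"suspense": 1.0},
    "comic_relief": {"humor": 1.0},
}
```

A real player would re-run this at every branch point as fresh emotion samples arrive, rather than once per movie.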
Smart brake light
The brake light turns on automatically as you think of braking, before you even start to move your foot toward the brake pedal. The extra 400 ms could prevent the cars behind you from hitting your car during sudden braking.
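A minimal sketch of the triggering logic, assuming a motor-intent classifier (purely hypothetical) that emits a per-sample probability that the driver is about to brake. Requiring a short run of confident samples filters out noise spikes that would flash the light falsely.

```python
def brake_light_states(intent_stream, threshold: float = 0.8, hold: int = 3):
    """Yield the brake-light state for each intent sample.

    The light switches on only after `hold` consecutive samples exceed
    `threshold`, trading a few samples of latency for noise robustness.
    """
    run = 0
    for p in intent_stream:
        run = run + 1 if p >= threshold else 0
        yield run >= hold
```

With samples arriving every few milliseconds, the hold window costs far less than the ~400 ms head start the decoded intent provides.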
Listening to music
You are in the middle of listening to a great song or podcast when you realize you need to take something from another room. As you move farther from the speakers, the sound system senses that you can’t hear it well enough and adjusts the volume so that your listening experience continues uninterrupted. You can also think “pause” from the other room, and the song will wait for you to come back.
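The volume compensation can be sketched from basic acoustics, assuming the system can estimate the listener’s distance from the speakers (how it does so is left open: headset signal strength, indoor positioning, and so on). Perceived level from a point source falls roughly 6 dB per doubling of distance, so the gain makes up the difference relative to a reference listening distance.

```python
import math

def compensating_gain_db(distance_m: float, reference_m: float = 2.0) -> float:
    """Extra gain (dB) needed to keep perceived loudness roughly constant
    as the listener moves from `reference_m` to `distance_m` away."""
    # Inverse-square law: level drops 20*log10(d / d_ref) dB.
    return 20.0 * math.log10(distance_m / reference_m)
```

Doubling the distance from 2 m to 4 m calls for about 6 dB of extra gain; in a real room, reflections would make the true figure somewhat smaller.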
Emotion-driven drawing
Could we reach a new level of expressiveness if we could transform our emotional-cognitive state into colors and shapes? Perhaps common drawing parameters such as color, brush style, width and opacity could be controlled faster, allowing finer, more detailed drawings with far less effort.
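One possible mapping, sketched here with made-up conventions: a valence/arousal reading (both assumed normalized to [0, 1], and entirely hypothetical as an SDK output) drives hue, stroke width and opacity.

```python
def emotion_to_brush(valence: float, arousal: float) -> dict:
    """Translate one emotion sample into brush parameters.

    Assumed convention: valence 0 = negative (cold blue hue),
    valence 1 = positive (warm red hue); arousal scales intensity.
    """
    return {
        "hue_deg":  240.0 - 240.0 * valence,   # 240 (blue) .. 0 (red)
        "width_px": 1.0 + 9.0 * arousal,       # calmer -> thinner strokes
        "opacity":  0.3 + 0.7 * arousal,       # calmer -> more transparent
    }
```

The exact ranges are arbitrary; the point is that the painter never touches a slider.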
Emotion-augmented chat
The emotions of your chat partner are expressed through background music, sounds, colors and images, adding an emotional layer to the communication.
Emotion-triggering game
The computer draws a random emotion: excitement, fear, happiness, anxiety…
You get a point when you trigger the drawn emotion in your opponent first.
Brain control headset + headphones + smartphone
A fully interactive, brain-controlled, hands-free experience that allows you to think of “email” or “calendar” and have your new messages and meetings read aloud. This solution is great for situations when your hands are not free and there’s too much noise for speech recognition. You can also think of “answer call” or “call mom”.
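The control flow here is a straightforward command dispatcher. The decoded commands arrive as strings in this sketch; in reality they would be the output of a hypothetical SDK thought-command classifier, and the handler names below are invented for illustration.

```python
def dispatch(command: str, handlers: dict) -> str:
    """Route a decoded thought-command to its handler.

    Unknown commands are reported rather than raising, since a noisy
    classifier will inevitably emit strings with no registered action.
    """
    handler = handlers.get(command)
    return handler() if handler else "unrecognized"

# Hypothetical command set for the headset + headphones + smartphone combo.
handlers = {
    "email":       lambda: "reading new messages",
    "calendar":    lambda: "reading today's meetings",
    "answer call": lambda: "call answered",
}
```

In practice the dispatcher would also want a confidence threshold on the classifier output before acting, for the same false-trigger reasons as the brake-light idea.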