This kind of system, along with things that react to facial expressions, eye movement, or brain activity, is cool and everything, but what kind of usability gains are we actually going to get here? Is it really an advantage to have your computer know you're pissed, for example, or sad? Oh sure, we could get a Google-ads-type response to your mood: "Looks like you're crying; click here to FedEx Kleenex," or maybe alert security that you've become enraged in your cubicle and are an imminent threat to your coworkers and company property. Great. Conversely, if this does do something useful, you'd doubtless end up in situations where you're second-guessing the algorithms, trying to accomplish with twitches and contortions what you could otherwise do with a few clicks, or not bother with at all.
I'd be much, much happier if we got some software out there that could tell I should have used "first" instead of "firs" in a sentence that currently passes grammar and spell checking, instead of criticizing me for using the passive voice, but that's all less cool.
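For what it's worth, that "firs"/"first" case is a known problem called real-word error detection: the typo is itself a dictionary word (plural of fir), so a plain spell checker passes it, and only context gives it away. A minimal sketch of the idea, using a tiny hand-built bigram table and a hypothetical confusable-word list (a real checker would derive both from a large corpus):

```python
# Toy real-word error detection: flag a word when an edit-distance-1
# alternative fits the surrounding context far better.
# The bigram counts and confusable pairs below are made up for illustration.

BIGRAMS = {
    ("the", "first"): 500,
    ("the", "firs"): 1,   # "firs" is a real word, so spell check alone passes it
    ("first", "time"): 300,
    ("firs", "time"): 0,
}

# Pairs of real words one typo apart that are easily swapped.
CONFUSABLES = {"firs": ["first"], "first": ["firs"]}

def flag_real_word_errors(words, threshold=50):
    """Return (index, word, suggestion) where an alternative word is
    at least `threshold` times more likely given the preceding word."""
    flags = []
    for i in range(1, len(words)):
        prev, word = words[i - 1], words[i]
        here = BIGRAMS.get((prev, word), 0)
        for alt in CONFUSABLES.get(word, []):
            if BIGRAMS.get((prev, alt), 0) > here * threshold:
                flags.append((i, word, alt))
    return flags

print(flag_real_word_errors("the firs time".split()))
# [(1, 'firs', 'first')]
```

Obviously a toy: real systems use much larger language models and smarter candidate generation, but the principle is exactly the "should have used first, not firs" check the comment asks for.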