The need to communicate with others is primordial: to tell others, or society at large, how we feel. I’ve been spending some time recently musing over what it would be like to communicate these feelings. One idea I’ve had is using sensors to send signals about our emotional state.
Communicating feelings is hard. Adam Smith’s treatise The Theory of Moral Sentiments makes this point quite pointedly: some things are hard to communicate or translate, like the feeling of dread in the pit of your stomach, or, more lightly, flatulence. These things happen internally, so one’s discomfort tends to manifest in visual cues and sounds: a gurgling stomach, sweaty palms. On the flip side of discomfort there’s also joy, happiness, excitement, and so on. Our bodies produce these signals, but they’re hard to catch unless you’re physically there, and they’re hard to empathize with because they aren’t physically happening to us.

Capturing extra information about one’s state of being could help you communicate these things. Going in a different direction, we could also use that personal sensor data to add context and annotate the world around us, with a very low bar to participation. For the first time we might have signals into the way people feel, actually physically feel, in the world. We could say whether or not the crowd at a concert was high energy by aggregating many people’s co-located signals to get a sense of the vibe of a place. Or we could translate those signals into abstract feelings: this music feels high energy, this movie is boring. All of these things could be effortlessly, softly communicated through the physical signal. Instead of hitting the like button, comment button, or emoji button, you could simply hit the signal/sensor button and broadcast your sentiments in the form of sensor data.
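To make the crowd-aggregation idea concrete, here is a minimal sketch. Everything in it is a hypothetical assumption for illustration: the function name, the resting baseline, and the thresholds are invented, not any real sensor API.

```python
# Hypothetical sketch: estimate a venue's "vibe" by aggregating
# co-located heart-rate readings. Names and thresholds are illustrative
# assumptions, not a real API.

from statistics import mean

def crowd_energy(heart_rates_bpm, resting_bpm=65):
    """Return a coarse energy label for a set of co-located readings."""
    if not heart_rates_bpm:
        return "unknown"
    # How far the crowd's average pulse sits above an assumed resting rate.
    elevation = mean(heart_rates_bpm) - resting_bpm
    if elevation > 40:
        return "high energy"
    if elevation > 15:
        return "lively"
    return "calm"

print(crowd_energy([120, 135, 128, 140]))  # a pumped-up concert crowd -> "high energy"
print(crowd_energy([68, 72, 70]))          # a quiet gallery -> "calm"
```

The interesting design question is the baseline: a fixed resting rate is crude, and a real version would presumably normalize per person before aggregating.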
I once attended a lecture where the first cyborg described sticking a chip in his arm and, over a network, communicating feeling to his wife by attaching the chips to his nervous system and hers. If I recall correctly, the experience was described as eye-opening. He claimed to have a better understanding of his wife’s feelings, or rather sensations, after being wired together.
A simple starting point: write software to hook into the pulse metric from an Apple Watch, translate the pulse into a feeling, then sync it with what you know the user is doing, such as listening to music or browsing a webpage. Then you could potentially have deeper information on how that content is making people feel.
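As a hedged sketch of that pipeline, assuming we can pull heart-rate samples off a watch (the sampling API itself is elided here), the translation and tagging steps might look like this. The thresholds, function names, and activity strings are all hypothetical.

```python
# Sketch: translate pulse samples into a coarse feeling, then tag it
# with the user's current activity. Thresholds are invented assumptions.

from statistics import mean

def pulse_to_feeling(samples_bpm):
    """Map a window of heart-rate samples to a coarse feeling label."""
    avg = mean(samples_bpm)
    if avg >= 110:
        return "excited"
    if avg >= 85:
        return "engaged"
    return "relaxed"

def annotate_activity(samples_bpm, activity):
    """Pair the inferred feeling with the current context."""
    return {"activity": activity, "feeling": pulse_to_feeling(samples_bpm)}

print(annotate_activity([112, 118, 121], "listening to music"))
# → {'activity': 'listening to music', 'feeling': 'excited'}
```

A real version would need windowing, per-user calibration, and activity detection rather than a hand-passed string, but the shape of the idea is just this: signal in, feeling out, context attached.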
This is kind of a weird, dystopian step, where you are constantly monitored from your head down to your toes, broadcasting your every emotional state. You can’t ever really hide from your pulse or your other bodily signals.
You’d want to make everything opt in, and the actual signal correlation sampled. It would be interesting, instead of hitting the like button, to hit a “here’s a sketch of my bio-signal” button.