
How Affective Computing can forever change the way we look at Computers

What is Affective Computing? 

Affective Computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects. It is an interdisciplinary field spanning computer science, psychology, and cognitive science.

So, in simple terms, it is the study of human emotions. Most work in the field is restricted to the first three keywords: recognizing, interpreting, and processing human emotions. But how about developing a program that actually has emotions?

If we are talking about simulating human emotions, that would just be an emulation of the emotions the majority of humans feel in certain common situations: you give the program an input situation, it receives it, processes it, maps it, and outputs the emotion "it is feeling". Is it the program's emotion? No; that program is simply apathetic towards the situation.
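
To make this concrete, here is a minimal sketch of that lookup-based emulation; the situations, table entries, and function name are hypothetical illustrations of the approach, not a proposed design:

```python
# A minimal sketch of the "emulation" approach: the program does not feel
# anything; it merely maps an input situation to the emotion most humans
# would report in it. All entries here are hypothetical placeholders.

SITUATION_TO_EMOTION = {
    "received a compliment": "joy",
    "lost a personal item": "sadness",
    "was interrupted repeatedly": "irritation",
}

def simulated_emotion(situation: str) -> str:
    """Return the emotion the program claims 'it is feeling'.

    Pure table lookup: the program stays apathetic and only reports the
    statistically common human response for the situation.
    """
    return SITUATION_TO_EMOTION.get(situation, "neutral")

print(simulated_emotion("lost a personal item"))  # -> sadness
```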

What I propose is to make use of a Neural Network structure to actually emulate the emotion-generating region of the brain. This implies that Affective Computing would then no longer be just a mixture of Computer Science and Psychology, but would need a major neurobiological perspective to build a structure that could actually "generate" emotions.

Emotion Generating Region of the Brain

Let's look at the process of emotion generation psychologically first, before approaching it biologically. Emotion and knowledge cannot be treated as two mutually exclusive sets within our memory. A given stimulus can trigger a memory (knowledge) with which an emotion is associated. That emotion need not be commonly associated with that memory in everyone's mind, just as the same stimulus does not trigger the same memory in everyone's mind.
Everyone has different memories (experiences, knowledge) and different sets of emotions associated with them.
Thus, emotions are memory- and knowledge-dependent.

Starting the program with certain given knowledge/memory is a practical option, since software has no sensory reception of stimuli. All our inputs will therefore be given digitally.
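
A minimal sketch, under my own assumptions, of what such seeding could look like: the Memory structure, the seed entries, and the word-matching trigger rule below are all hypothetical placeholders for the real mechanism.

```python
# Seeding an agent with initial memories and agent-specific emotion
# associations. The "stimulus" is a digital input string, since the
# program has no sensory organs. All values here are hypothetical.

from dataclasses import dataclass

@dataclass
class Memory:
    content: str      # the knowledge/experience itself
    emotion: str      # the emotion THIS agent associates with it
    strength: float   # how strongly the association is held (0..1)

# A different agent could be seeded differently, so the same stimulus
# would evoke different emotions in each agent.
seed_memories = [
    Memory("long walks by the river", "calm", 0.8),
    Memory("being asked the same question twice", "irritation", 0.6),
]

def react(stimulus: str, memories: list[Memory]) -> str:
    """Return the emotion of the strongest memory the stimulus triggers."""
    triggered = [m for m in memories
                 if any(word in m.content for word in stimulus.lower().split())]
    if not triggered:
        return "neutral"  # no memory triggered, so no emotion generated
    return max(triggered, key=lambda m: m.strength).emotion

print(react("an unexpected question", seed_memories))  # -> irritation
```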

Testing of the Program

How would we test whether the program is building its own emotions based on the events happening around it? The initial, and so far only, way to interact with such a program would be to shape it as a chatbot. Some expectations from this program (see the sketch after this list) could be:
  • Visible irritation in its replies when it is asked the same question repeatedly, or asked inappropriate questions (provided it has been built with, or has gathered, such specific knowledge).
  • A delighted response when a user with whom the program previously had a constructive discussion returns to interact with it.
(This rough idea is given so the reader can visualize the kind of program I am talking about; a reflective reader will have realized that the possibilities for such a program are truly endless.)
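
A rough sketch of how such a bot could exhibit both behaviours by keeping state across interactions; the class name, thresholds, and canned replies are hypothetical, and a real implementation would generate these emotions rather than hard-code them:

```python
# A stateful chatbot sketch for the two test expectations above.

from collections import Counter

class EmotiveChatBot:
    def __init__(self):
        self.question_counts = Counter()  # how often each question was asked
        self.good_discussions = set()     # users from past constructive chats

    def mark_constructive(self, user: str):
        self.good_discussions.add(user)

    def reply(self, user: str, question: str) -> str:
        self.question_counts[question] += 1
        if self.question_counts[question] > 2:
            # Expectation 1: visible irritation at repeated questions.
            return "You have already asked me that. Please move on."
        if user in self.good_discussions:
            # Expectation 2: delight when a valued user returns.
            return f"Good to see you again, {user}! Our last chat was great."
        return "Let's talk about that."

bot = EmotiveChatBot()
bot.mark_constructive("alice")
print(bot.reply("alice", "How do emotions work?"))  # delighted greeting
```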

Emotion Generation Continued

Now let's return to this discussion from a neurobiological perspective. ML programs successfully emulate neural networks; however, such networks are far from being actual simulations of the brain.
To truly understand the hows of emotion in the brain, we need a neurobiology expert, and I am far from even understanding the wheres of emotion in the brain.
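
To make that gap concrete: an artificial "neuron" is just a weighted sum passed through a nonlinearity, a drastic simplification of a biological neuron. A minimal two-layer forward pass (illustrative only, with random weights):

```python
# Nothing in this "network" resembles neurochemistry: each layer is a
# matrix multiply plus a tanh, which is the whole artificial "neuron".

import numpy as np

def forward(x, W1, b1, W2, b2):
    """Two-layer feedforward pass."""
    hidden = np.tanh(W1 @ x + b1)     # layer 1: weighted sums + nonlinearity
    return np.tanh(W2 @ hidden + b2)  # layer 2: same again

rng = np.random.default_rng(0)
x = rng.normal(size=4)  # a digital "stimulus" vector
out = forward(x,
              rng.normal(size=(8, 4)), rng.normal(size=8),
              rng.normal(size=(2, 8)), rng.normal(size=2))
print(out)  # two output activations
```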

Given this hurdle, I will have to pause this discussion until I can accumulate more facts about the brain and emotions. Nevertheless, the psychological analysis will still need to be extended for implementation purposes: for instance, what parameters and what quantity of knowledge should we provide to the initial program, and on what basis should that information be retrieved?

I hope to continue this discussion further, and I would really appreciate more insights into this project.
