
What are Emotional Neural Networks? - I

We know that Neural Networks were built in an attempt to mimic the structure of the human brain.

The brain is composed of neurons, all interconnected, sending and receiving impulses across multiple connections. A Neural Network is similar: multiple nodes, all interconnected, propagating information forward and errors backward, trying to correct itself with every attempt.

But have you ever wondered whether we could emulate human emotions? Or replicate them in a Neural Network?

Well, part of this has already been implemented, and although it sounds similar to Affective Computing, it is not quite the same.

What is an Emotional Neural Network?

Emotional Neural Networks are a form of Neural Network that emulates certain aspects of human emotion while learning. There are other models and definitions for such networks, but we'll focus on this aspect first.

Picture a scenario where you're learning something completely new. For instance, learning an instrument. How do you feel at the beginning? 

Sure, you enjoy it. But it's a completely new task and you don't know anything about it! You're anxious; maybe not dramatically anxious as shown in movies, but there is doubt, a certain uncertainty about the future. Will you be able to learn this? Will you be able to play as you had imagined?

As time progresses, your skill grows and, with it, your confidence. The anxiety fades. You're happy with the small achievements and look forward to accomplishing more. You may still be anxious if you're ambitious about your goals, but the newly acquired confidence helps you practice better, and enjoy it.

This principle has been employed in a new form of Neural Network. The algorithm is called the Emotional Backpropagation (EmBP) Algorithm. Why?

You see, the Neural Network is given two additional parameters to base its decisions on: anxiety and confidence. As the network iterates over the training data and computes the gradient, the weight change for every neuron is influenced by the anxiety and confidence coefficients.

To put it simply, your neural network is not just punishing itself in proportion to how wrong its answers are; it is also keeping track of how wrong it has been so far and letting that history influence how it learns.
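To make that idea concrete, here is a toy sketch of how the two coefficients could evolve with the training error. The actual update rules in the EmBP paper differ in detail; the decay scheme and all the names below are illustrative assumptions, not the paper's equations.

```python
def update_emotional_coefficients(anxiety_init, avg_error):
    """Hypothetical schedule for the two emotional coefficients.

    anxiety_init : the anxiety level at the start of training (highest uncertainty)
    avg_error    : the network's current average output error, assumed in [0, 1]

    Anxiety shrinks as the error shrinks; confidence is the anxiety that
    has been 'shed' so far. This mirrors the intuition above, not the
    paper's exact equations.
    """
    anxiety = anxiety_init * avg_error   # many wrong answers -> anxiety stays high
    confidence = anxiety_init - anxiety  # confidence grows as anxiety decays
    return anxiety, confidence
```

Early in training, when the error is large, the network stays 'anxious'; as the error falls, it increasingly leans on its accumulated 'confidence'.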

In the architecture of such a network, YI1, YI2, ..., YIi are simply the input neurons. Consider them as input features only.

So each neuron in the second layer receives the sum of all input features multiplied by their respective weights, plus the output of the Emotional Neuron multiplied by its own weight. The latter part is what is new here (see the sketch just below).
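As a minimal sketch of that forward pass: assuming, as in Khashman's formulation, that the emotional neuron outputs the average value of the current input pattern, one hidden layer could be computed as below. All variable and function names here are illustrative.

```python
import numpy as np

def hidden_layer_forward(x, W, w_emo, b):
    """Forward pass for a hidden layer with an added emotional neuron.

    x     : input features YI1, YI2, ..., YIi as a vector
    W     : conventional weight matrix, shape (n_hidden, n_inputs)
    w_emo : weight from the emotional neuron to each hidden unit, shape (n_hidden,)
    b     : bias vector, shape (n_hidden,)
    """
    y_avg = x.mean()                 # emotional neuron: average of the input pattern
    z = W @ x + w_emo * y_avg + b    # weighted inputs plus the new emotional term
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid activation, as in classic backprop nets
```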

It is the backpropagation algorithm that makes the difference, and it is where the anxiety and confidence parameters we discussed earlier come into play. Normally, the gradient matrix depends only on the delta terms (the difference between the target output and the actual output) for that particular neuron, which are in turn influenced by the delta terms of the neurons after it. In the EmBP algorithm, however, the weight change is computed from three pieces: the product of the confidence coefficient and the previous weight change, the product of the anxiety coefficient and the delta terms, and the conventional gradient term scaled by the usual learning parameters.

In effect, the amount of change is modulated by the anxiety and confidence coefficients.
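Putting those three pieces together, a single EmBP-style weight change might look like the sketch below. The arrangement follows the description above, but the function name, the variable names, and the exact scheduling of the coefficients are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def embp_weight_change(delta, y, dW_prev, eta, anxiety, confidence):
    """One EmBP-style weight change for a layer, as a sketch.

    delta      : backpropagated error terms for this layer's neurons
    y          : activations feeding into these weights
    dW_prev    : the previous weight change (for the confidence term)
    eta        : conventional learning rate
    anxiety    : anxiety coefficient (scales the delta-based term)
    confidence : confidence coefficient (scales the previous change)
    """
    grad = np.outer(delta, y)      # conventional gradient: delta terms x activations
    return (eta * grad             # the normal backprop step
            + anxiety * grad       # anxiety coefficient times the delta terms
            + confidence * dW_prev)  # confidence coefficient times the previous change
```

Note how the confidence term plays the role that momentum plays in conventional backpropagation, while the anxiety term amplifies the error-driven correction while the network is still 'unsure'.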

That was the Emotional Backpropagation Algorithm, a neural network model that has been tested and found to outperform conventional neural networks in many cases, such as Face Recognition, Credit Risk Evaluation, and Rainfall-Runoff Modelling. If you want to read more about this algorithm and the research behind it, please read Dr. Adnan Khashman's journal paper on EmBP.

But this is not all there is to Emotional Neural Networks.
Since the original question was, "Can we ever emulate human emotions?", there have been many approaches to this possibility. Some models, such as the algorithm above, simply emulate the 'behaviour' of emotions in one aspect of human life. In other cases, people have dived deeper and tried to implement the very structure of the brain that is responsible for producing and sensing emotions.
If we can achieve that, we might just be able to produce any or all kinds of human emotions, right? How, and what would that look like?
The answers to these questions are not so simple, and I will present them in the next part of this series.

Stay safe and stay tuned.
