
Safety Driving and Emotions Detection

Abstract:


This paper presents a method for collecting and analyzing the facial points of different persons in order to detect their emotions. The focus is on persons who are under stress or in a bad mood, since many accidents have been caused by such states. To determine a person's emotions, facial points are extracted by our method and then compared with action units that have fixed values for different facial features. With the help of the proposed method for finding the facial points of the human face, we can determine which expression a person is showing, and if the person is in a state that could be dangerous for him/her, we can alert them using different techniques so that they and other people can be saved from tragic incidents. Our basic focus is therefore to find human facial points. For this we use videos of different persons lasting a few seconds, extract frames from each video, and locate the facial points in those frames using Borman. This method helps in the detection of facial expressions and is useful for helping people wherever emotions have great importance.

Introduction:

According to the National Highway and Motorway Police, more than 5,500 road casualties occurred in 2007. This has since risen to 12,000 deaths and more than 30,000 injuries per year according to the WHO [1]. In 2013, 38% of fatal accidents happened due to distraction, e.g., the use of electronic devices, as examined by the NHTSA in the US [2]. Driving behavior depends on age and gender: men violate traffic rules more often and even get angry at the police, while women report anger at traffic obstructions [3]. Emotion can be described as a combination of facial expressions common to all cultures. People who often get angry are found more frequently in near misses [4]. The use of mobile phones and electronic devices increases the death and accident rates in the US [5]. Car crashes on minor roads have increased compared to major roads because of over-speeding, and cars at high speed are at greater risk of collision [6]. A questionnaire given to English drivers showed that drivers with more aggressive behavior violate traffic rules, and those who feel anger are more often involved in traffic accidents [7].

It has been found that music has an impact on driver emotions: when fast music is played, it leads to a happy facial expression and to more aggressive driving, while sad music leads to a sad or neutral face, under which the driver was judged to keep the car in his lane. Listening to music is a normal activity and affects the driver's performance both positively and negatively; drivers reported that by listening to music they pay more attention to the road than during conversation with other passengers in the vehicle [9]. This shows that music helps, but it can also distract the driver from the road by affecting him emotionally, which is harmful for driving. Human behavior depends on the emotions a person has, and North American roadways have emotional billboards that distract drivers; distraction may also come from using a cell phone, passenger conversation, and texting. Driving needs full focus on the road for controlling the steering, staying in lane, obeying traffic lights, etc. Emotion may also affect the driver's decision-making power [10]. Multiple emotions that appear together on a human face are called compound emotions. These emotions are composed of two, or possibly more than two, basic emotions, for example happily surprised or angrily surprised [11].

Literature Review:

Facial expression analysis is an emerging field in human-computer interaction design and computer science. One method recognizes four facial expressions from 2D appearance; the Radial Symmetry Transform and edge projection analysis are the two algorithms used for feature extraction, and 81% accuracy was achieved on grayscale images [12]. Facial expressions can also be determined through facial points taken as input from a video and followed by temporal rules. The algorithm evaluates the facial expressions from a video and, with the help of action units, recognizes their temporal segments. Recognition accuracy of 87% was reported; this is helpful for human interfaces and leads to further studies on nonverbal communication and human emotion detection [13]. Facial structure and the contraction of muscles are important for recognizing facial expressions. By analyzing 11 facial feature points and the symmetry of the human face, an average system accuracy of 91.3% was obtained. A 3D facial model is used, with 6 distance characteristics taken from the 11 facial feature points as input. The system generally operates in three steps: face acquisition, facial feature extraction and representation, and facial expression recognition [14]. Facial expressions are not as easy to detect in practice as they are to define theoretically; emotions that we see in daily life, such as anger, sadness, and neutrality, are a combination of face and context [15]. To decrease the response time of an emotion detection system, it is important to build smart system processing that shortens decision making, for example to obtain feedback in a call-center application that measures client frustration [16]. In 2012, Aleix Martinez et al. first described two models that classify human facial expressions of emotion. The continuous model is a method for seeing expressions of emotion at different intensities. The categorical model, by contrast, treats a happy and a shocked face as either happy or shocked, with nothing lying between these emotions. Both models have problems recognizing combinations of emotion categories. As a solution, they worked on a revised model consisting of C distinct continuous spaces, which accounts for previously reported results in cognitive science. By linearly combining these C face spaces, compound emotion categories can be recognized. Their model can be successfully deployed and can achieve good results in computer vision and HCI applications because it is consistent with recent understanding of human perception. They mostly worked on the recognition of emotions and believe that their study can be applied to other recognition-related tasks [17]. In 2014, Shichuan Du et al. described new categories of expressions called compound emotions. They noted that previous research had studied six basic emotions, and argued that other types of emotions, compound emotions, also exist and are used by humans. Compound emotions are formed by combining basic emotions. They worked on 21 different emotion categories, which are differentiated by analysis of the Facial Action Coding System. Their research opens a new direction in face recognition using compound emotions [18]. Earlier research was mostly based on the six simple emotions and how they are produced; action units are identified for all the basic emotions on the basis of their facial muscles. The work was then extended to compound emotions, and two compound emotions (happily fearful and happily sad) that had not been properly studied before are also discussed. Facial expressions help people understand the behavior of others. Compound emotions appear on people's faces, for example, when one person surprises another, the second person becomes not only happy but also surprised, so the emotion is called happily surprised and is composed of two emotions. Another example, discussed in the discussion section, is riding a rollercoaster, during which a person feels both happy and fearful [11].
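The idea that a compound emotion combines the action units (AUs) of its constituent basic emotions can be sketched in code. Note that the AU sets below are simplified illustrations, not the exact FACS prototypes reported by Du et al., and the union rule is only an approximation of how compound categories are formed.

```python
# Illustrative sketch: a compound emotion modeled as the combination of
# the action units (AUs) of its basic components. AU sets are simplified
# examples, not the exact FACS prototypes from the literature.
BASIC_AUS = {
    "happy":     {6, 12},          # cheek raiser, lip corner puller
    "surprised": {1, 2, 25, 26},   # brow raisers, lips part, jaw drop
    "fearful":   {1, 2, 4, 5, 20, 26},
}

def compound_aus(*emotions):
    """AUs of a compound emotion, taken here as the union of its parts."""
    aus = set()
    for e in emotions:
        aus |= BASIC_AUS[e]
    return aus

happily_surprised = compound_aus("happy", "surprised")
```

Under this toy rule, "happily surprised" keeps the smile AUs of happiness together with the raised brows and dropped jaw of surprise.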

Methodology:       

 

We recorded videos of different persons; each video was split into frames with the help of Matlab code, and we then extracted facial points from each frame using Borman. Action units help us to identify the facial expression by comparison with our derived points from Borman. Another piece of code is executed to compare the action units with the points evaluated by Borman and judge the expression. We took four frames of every emotion as input and executed the same code 15 times on every frame to measure the accuracy of the facial points.
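The comparison step described above can be sketched as a nearest-template match: each emotion has reference feature values derived from its action units, and a frame's extracted facial-point features are assigned to the closest reference. This is a hypothetical Python sketch (the paper's Matlab/Borman code is not shown); the feature names and numbers are invented for illustration.

```python
import math

# Hypothetical reference values per emotion, e.g. mouth width and
# mouth-to-eye distance in pixels. Numbers are illustrative only.
REFERENCE = {
    "neutral": [40.0, 30.0],
    "smile":   [55.0, 28.0],
    "angry":   [38.0, 26.0],
}

def classify(features):
    """Return the emotion whose reference vector is nearest (Euclidean)."""
    return min(REFERENCE, key=lambda e: math.dist(features, REFERENCE[e]))

print(classify([53.0, 29.0]))  # closest to the "smile" template
```

Running the same classification repeatedly over several frames of one emotion, as the paper does, would then give a simple accuracy estimate for the extracted points.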

Initially, neutral, smiling, and angry expressions were taken as input, as shown in Fig 1.

Fig 1

The distance between points is calculated to detect changes in facial expression. For example, in a sad face our lips stay the same, but when we feel happiness our lips expand, so we use a distance method to compute the difference and recognize the real expression. Action units were defined by Ekman and Friesen, so they are universally accepted, which helps avoid errors.
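The distance method for the lips can be sketched as follows. The landmark coordinates and the 15% widening threshold are illustrative assumptions, not values from the paper.

```python
import math

# Sketch of the distance method: the gap between the two lip-corner
# landmarks widens when a person smiles. Coordinates and threshold
# below are made-up illustrative values.
def lip_width(left_corner, right_corner):
    return math.dist(left_corner, right_corner)

def is_smile(left_corner, right_corner, neutral_width, threshold=1.15):
    """Call it a smile if the lips are >15% wider than in the neutral frame."""
    return lip_width(left_corner, right_corner) > threshold * neutral_width

neutral = lip_width((100, 200), (140, 200))      # 40 pixels apart
print(is_smile((95, 198), (145, 198), neutral))  # 50 > 46 -> True
```

The same pattern extends to other feature pairs (e.g., brow-to-eye distance for surprise), with one threshold per action unit.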

 

A few emotions with their action units [19].

 

Issues:

It is quite difficult to understand the nature of people, so one problem is how to warn the driver about the danger. If we give a beep, it can be noisy or irritating and can itself create a distraction.

Conclusion:

Many researchers have already worked on this topic, so we continue this study with the basic objective of generating better results for the detection of emotions. We used videos of different people for this process and successfully achieved results in detecting emotions. A new area of research is in progress: signaling the driver in case of an emergency without causing any distraction. A future focus will be to determine the psychology of the driver and respond appropriately.

 

References:

1.      National Road Safety Journal, vol. 1, 2014.

2.      Traffic Safety Facts Research Note: Distracted Driving 2013.

3.      Transportation Research Part F: Traffic Psychology and Behaviour, vol. 15, issue 4, July 2012, pp. 404-412.

4.      SWOV Fact Sheet: Anger, aggression in traffic, and risky driving behaviour.

5.      Trends in Fatalities From Distracted Driving in the United States, 1999 to 2008.

6.      Aarts, L.T. & van Schagen, I. (2006). Driving speed and the risk of road crashes: A review. Accident Analysis and Prevention, vol. 38, no. 2, pp. 215-224.

7.      Driving violations, aggression and perceived consensus. School of Psychological Sciences, University of Manchester, Manchester M13 9PL, UK. Received 29 August 2004; accepted 20 May 2006.

8.      Emotions drive attention: Effects on driver's behaviour. Safety Science, November 2009.

9.      An exploratory survey of in-vehicle music listening. Psychology of Music, 35(4):571, October 2007.

10.  Chan, M. & Singhal, A. The emotional side of cognitive distraction: Implications for road safety.

11.  Du, S. & Martinez, A.M. Compound facial expressions of emotion: from basic research to clinical applications.

12.  Sarode, N. & Bhatia, S. Facial Expression Recognition. Computer Engineering Dept., Thadomal Shahani Engineering College, Bandra (West), Mumbai-400050.

13.  Pantic, M. & Patras, I. (2006). Dynamics of Facial Expression: Recognition of Facial Actions and Their Temporal Segments From Face Profile Image Sequences. IEEE.

14.  Soyel, H. & Demirel, H. Facial Expression Recognition Using 3D Facial Feature Distances. Eastern Mediterranean University, Northern Cyprus.

15.  Hassin, R.R., Aviezer, H. & Bentin, S. Inherently Ambiguous: Facial Expressions of Emotions, in Context. Hebrew University, Israel; Princeton University, USA.

16.  Cambria, E., Livingstone, A. & Hussain, A. The Hourglass of Emotions.

17.  Martinez, A. & Du, S. A Model of the Perception of Facial Expressions of Emotion by Humans: Research Overview and Perspectives. Department of Electrical and Computer Engineering, The Ohio State University, 2015 Neil Avenue, Columbus, OH 43210, USA.

18.  Du, S., Tao, Y. & Martinez, A.M. Compound facial expressions of emotion. Department of Electrical and Computer Engineering, and Center for Cognitive and Brain Sciences, The Ohio State University, Columbus, OH 43210.

19.  https://www.pinterest.com/pin/6614730674767017/