We believe that the best way to evaluate the accuracy of our SDKs is to try our demo and test the performance on your own.
However, some of us prefer to look at numbers. Numbers alone can be deceiving, as they say nothing about the scenario in which they were calculated.
Therefore, we list below the accuracy levels for each feature, together with the dataset on which each number was obtained.
All results are cross-validated.
- Head Pose: Boston University dataset (http://csr.bu.edu/headtracking/uniform-light/)
- Face Recognition: Labeled Faces in the Wild (LFW) dataset (http://vis-www.cs.umass.edu/lfw/)
- Age: FERET dataset (http://www.itl.nist.gov/iad/humanid/feret/feret_master.html)
- Facial Expressions (InSight): Cohn-Kanade+ (CK+) dataset (http://www.pitt.edu/~emotion/ck-spread.htm)
- Facial Expressions (InSight): Bosphorus dataset (http://bosphorus.ee.boun.edu.tr/default.aspx)
- Gender: Labeled Faces in the Wild (LFW) dataset (http://vis-www.cs.umass.edu/lfw/)
- Mood: Privately collected dataset
- Eye Gaze: Privately collected dataset
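As a rough illustration of how cross-validated figures like the ones below are typically produced, here is a minimal k-fold cross-validation sketch. The toy data, the fixed-threshold "classifier", and the function names are hypothetical placeholders, not the evaluation code actually used for these SDKs:

```python
# Minimal k-fold cross-validation sketch (toy example).
# The classifier and data are placeholders, not the SDK's internals.

def k_fold_indices(n, k):
    """Split indices 0..n-1 into k contiguous folds."""
    fold_size = n // k
    folds = []
    for i in range(k):
        start = i * fold_size
        end = start + fold_size if i < k - 1 else n
        folds.append(list(range(start, end)))
    return folds

def cross_validated_accuracy(samples, labels, train_fn, predict_fn, k=5):
    """Average classification accuracy over k train/test splits."""
    folds = k_fold_indices(len(samples), k)
    accuracies = []
    for test_idx in folds:
        test_set = set(test_idx)
        train_idx = [i for i in range(len(samples)) if i not in test_set]
        model = train_fn([samples[i] for i in train_idx],
                         [labels[i] for i in train_idx])
        correct = sum(
            predict_fn(model, samples[i]) == labels[i] for i in test_idx
        )
        accuracies.append(correct / len(test_idx))
    return sum(accuracies) / len(accuracies)

# Toy usage: a "model" that thresholds a single scalar feature.
samples = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9, 0.15, 0.85]
labels = [0, 0, 0, 0, 1, 1, 1, 1, 0, 1]
train = lambda xs, ys: 0.5            # fixed threshold "model"
predict = lambda thr, x: int(x > thr)
print(cross_validated_accuracy(samples, labels, train, predict, k=5))  # → 1.0
```

The point of the k-fold split is that every sample is tested exactly once, by a model that never saw it during training, so the averaged accuracy is not inflated by memorization.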
|Feature|Accuracy|
|---|---|
|Head Pose|Yaw 6.1° (+/- 4.4°), Pitch 6.5° (+/- 3.9°)|
|Face Recognition|91% (20% false positives)|
InSight SDK Accuracy
|Facial Expressions|Accuracy|
|---|---|
|Average|93.2%|
|Neutral|88.2%|
|Happy|95.2%|
|Surprised|100.0%|
|Puzzled|98.3%|
|Disgusted|81.1%|
|Afraid|94.2%|
|Sad|95.6%|

|Head Pose|Error|
|---|---|
|Pitch|5.2° (+/- 4.6°)|
|Yaw|6.1° (+/- 5.79°)|
|Roll|3.00° (+/- 2.82°)|
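The head pose figures are mean absolute angular errors with a standard deviation in parentheses. A minimal sketch of how such a pair of numbers can be computed from per-frame errors (the error values here are hypothetical, not the original evaluation data):

```python
import math

def mean_and_std(errors):
    """Mean absolute error and population standard deviation, in degrees."""
    abs_errors = [abs(e) for e in errors]
    mean = sum(abs_errors) / len(abs_errors)
    var = sum((e - mean) ** 2 for e in abs_errors) / len(abs_errors)
    return mean, math.sqrt(var)

# Hypothetical per-frame yaw errors (degrees), ground truth minus estimate.
yaw_errors = [2.0, -4.0, 6.0, -8.0, 5.0]
mean, std = mean_and_std(yaw_errors)
print(f"yaw {mean:.1f} deg (+/- {std:.1f} deg)")  # → yaw 5.0 deg (+/- 2.0 deg)
```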