So I decided to compare results: SongKong 4.4 (Pro) vs. beaTunes 4.6.10.
I picked 10 tracks from 10 different genres.
The results are interesting.
See picture:
![](http://i66.tinypic.com/68909k.png)
Why is there so much difference between the moods when comparing the outcome of both programs on the same song?
I would even say there are contradictions.
For example, track 5: ‘Extremely Calm’ vs. ‘Very Excited’;
and track 10: ‘Very glad’ vs. ‘Very aroused, angry’.
Even the values of Happy/Happiness and Arousal are different.
I thought SongKong was just reading the data from the AcousticBrainz database (and beaTunes as well, via its plugin)?
Does SongKong take all of the values into account — Mood Happy, Arousal, Valence, Aggressive, Dance, Party, Relaxed, Sad — and combine them to create a distinct textual tag like ‘Very glad’?
Or is there simply a tag already written in the AcousticBrainz db, called for example ‘Very glad’?
How does it work?
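For what it's worth, here is a guess at how such a mapping *could* work: a small sketch that turns two AcousticBrainz-style values (valence and arousal, each 0..1) into a single textual mood tag. The thresholds, function name, and label wording are entirely made up for illustration — this is not SongKong's or beaTunes' actual logic.

```python
# Hypothetical sketch: derive one textual mood tag from two high-level
# values. Thresholds and labels are invented, not SongKong's real rules.

def mood_label(valence: float, arousal: float) -> str:
    """Map valence (sad..happy) and arousal (calm..excited),
    each in the range 0..1, onto a coarse textual tag."""
    if valence >= 0.5:
        base = "glad" if arousal < 0.5 else "excited"
    else:
        base = "sad" if arousal < 0.5 else "angry"
    # Add an intensifier when either value is far from neutral (0.5)
    strength = "very " if max(abs(valence - 0.5), abs(arousal - 0.5)) > 0.3 else ""
    return (strength + base).capitalize()

print(mood_label(0.9, 0.2))  # -> "Very glad"
print(mood_label(0.1, 0.9))  # -> "Very angry"
```

If the programs each run a rule set like this but with different thresholds or different source values, that alone could explain the contradictory tags.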
N.B.: I noticed the picture looks a little vague due to tinypic’s compression, so I also uploaded it in higher quality; you can download it here: http://www68.zippyshare.com/v/aOY5uef5/file.html