New possibilities to enhance tagging of songs with artificial intelligence?

By chance, I came across the Berlin-based startup Cyanite while looking for ways to further optimize my music search. And because I love automation, I love Cyanite.

With its music analysis tool CYANITE, the company sits at the interface between the music industry, data science, and software engineering. Its mission is to relieve and support decision-makers in the creative industries through emotional and situational music analyses, establishing a universal language for individual music perception in order to make the best possible use of the effect of music.

The music analysis tool CYANITE

Two analysis methods are currently available:

The Track Mood Comparison analyzes a selection of up to 10 tracks simultaneously. An interactive network diagram visualizes and compares their emotional characteristics, and lets you show and hide individual songs for a custom comparison. The file source for the Track Mood Comparison is the Spotify account that the user connects with the tool when signing up.

The Dynamic Emotion Analysis visualizes how the moods develop and are expressed over the entire length of a single song and highlights its most characteristic parts. In addition, a genre classifier categorizes and quantifies the most fitting genre(s). For the Dynamic Emotion Analysis, mp3 files and YouTube links also work as sources.

All analyses within CYANITE are based on the same logically defined framework of eight moods: Happy, Excited, Calm, Relaxed, Sad, Tense, Angry, and Melancholic. With both methods, users gain objective insights into the emotional structure of music, which are valuable for the contextual re-use of music pieces.
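To make the eight-mood framework concrete, here is a minimal sketch of how such analysis results might be represented and compared. This is purely illustrative: the class `TrackMood`, the function `mood_distance`, and the score values are hypothetical and do not reflect Cyanite's actual API or data format.

```python
from dataclasses import dataclass

# The eight moods of CYANITE's framework
MOODS = ("happy", "excited", "calm", "relaxed",
         "sad", "tense", "angry", "melancholic")

@dataclass
class TrackMood:
    """Hypothetical container for one track's mood scores (0.0 to 1.0 each)."""
    title: str
    scores: dict  # mood name -> score

    def dominant_mood(self) -> str:
        # The mood with the highest score characterizes the track
        return max(self.scores, key=self.scores.get)

def mood_distance(a: TrackMood, b: TrackMood) -> float:
    """Euclidean distance across the eight mood axes:
    small values mean emotionally similar tracks."""
    return sum((a.scores[m] - b.scores[m]) ** 2 for m in MOODS) ** 0.5
```

A comparison of up to 10 tracks, as in the Track Mood Comparison, would then boil down to computing pairwise distances between their mood vectors and visualizing them, for example in a network diagram.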