Alibek Jakupov

Machine Learning and Music: Grace and Beauty (Part I)

Updated: Nov 19, 2021



When music sounds, gone is the earth I know,
And all her lovely things even lovelier grow;
Her flowers in vision flame, her forest trees
Lift burdened branches, stilled with ecstasies.

When music sounds, out of the water rise
Naiads whose beauty dims my waking eyes,
Rapt in strange dreams burns each enchanted face,
With solemn echoing stirs their dwelling-place.

When music sounds, all that I was I am
Ere to this haunt of brooding dust I came;
And from Time's woods break into distant song
The swift-winged hours, as I hasten along.

Walter de la Mare, ‘Music’


It all started with...


It all started with statistical methods, first discovered and then refined. The world then saw the pioneering machine learning research of the 1950s, conducted with simple algorithms. Later, Bayesian methods were introduced for probabilistic inference, followed by the so-called 'AI Winter' of the 1970s, when confidence in the effectiveness of AI waned. But as the proverb says, where there is a will there is a way, and the rediscovery of backpropagation sparked a new wave of AI research. In the 1990s came a dramatic change that shifted the whole field from a knowledge-driven to a data-driven approach. Scientists began creating software to analyze large amounts of data, the main goal being to rediscover the natural laws underlying the observations, in other words, to learn from the initial data by drawing logical conclusions. In this period of machine learning's history, algorithms such as support vector machines (SVMs) and recurrent neural networks (RNNs) became commonly used, and the fields of computational complexity via neural networks and super-Turing computation took shape. The early 2000s saw the rise of support vector clustering and other kernel methods, as well as unsupervised algorithms. From the 2010s onward, deep learning became practical, giving rise to a wide range of applications based on machine learning algorithms.


And what about other spheres of life, like, say, music? How did music change over time as AI steadily integrated into our everyday lives? Here we are going to trace the evolution of machine learning through the lens of the most popular music of each period. Up we go.



1763: The Mozart family grand tour, William Boyce's "At length, th’imperious Lord of War" and Bayes' theorem



On January 1, 1763, the world saw the very first performance of William Boyce's "At length, th’imperious Lord of War". Shortly afterwards, the family of Wolfgang Amadeus Mozart set off on a grand European tour, reaching Paris at the end of that same year.


That year, Thomas Bayes's work "An Essay towards solving a Problem in the Doctrine of Chances" was published, two years after his death. Bayes's friend Richard Price edited and amended the work before publication. This outstanding essay underpinned the famous Bayes' theorem that we still apply in machine learning tasks today.
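In modern notation (which, it should be said, is not how Bayes himself wrote it), the theorem reads:

\[
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
\]

In words: the probability of a hypothesis A given the evidence B equals the likelihood of the evidence under that hypothesis, weighted by the prior probability of the hypothesis and normalized by the overall probability of the evidence. This simple identity is the backbone of, among other things, the naive Bayes classifiers still widely used for text classification.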



1800s: Beethoven's Symphony No. 3, Niccolò Paganini's European tour and Adrien-Marie Legendre's "méthode des moindres carrés"


On the 7th of April, 1805, Beethoven publicly presented his Symphony No. 3, the Eroica, at the Theater an der Wien in Vienna, an event that marked the beginning of his middle period. Around the same time, Niccolò Paganini started touring Europe.


In 1805 Adrien-Marie Legendre described the "méthode des moindres carrés" (the method of least squares), which is now commonly applied in data fitting.
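To see how directly Legendre's idea survives in today's tooling, here is a minimal, self-contained Python sketch (synthetic data and numpy only, purely for illustration) that fits a straight line by ordinary least squares:

```python
import numpy as np

# Synthetic observations: y = 2x + 1 plus Gaussian noise
rng = np.random.default_rng(42)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# Design matrix [x, 1] for the model y = a*x + b
A = np.column_stack([x, np.ones_like(x)])

# Legendre's "moindres carrés": choose (a, b) minimizing ||A @ [a, b] - y||^2
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"fitted slope a = {a:.3f}, intercept b = {b:.3f}")  # close to 2 and 1
```

Two centuries on, the computation Legendre described for determining comet orbits is a one-liner in any numerical library.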



1812: Franz Schubert, Antonio Salieri and Pierre-Simon Laplace



On the 26th of July, the fifteen-year-old Franz Schubert made his last appearance as a chorister at the Imperial Chapel in Vienna. The same year, Antonio Salieri presented the world his Gesù al limbo für Soli.


That same year, Pierre-Simon Laplace published his outstanding "Théorie Analytique des Probabilités", which stated what is now known as Bayes' theorem. In this work he expanded upon the essay of Bayes that his friend Richard Price had published in 1763.


Here we covered the late eighteenth and early nineteenth centuries, a fascinating period of classical music masterpieces and eminent mathematical breakthroughs. In the next article we will move on towards the twentieth century. Hope you enjoyed it.


To be continued.


References

  1. Solomonoff, Ray J. "A formal theory of inductive inference. Part II." Information and control 7.2 (1964): 224–254.

  2. Marr, Bernard. "A Short History of Machine Learning – Every Manager Should Read". Forbes. Retrieved 28 Sep 2016.

  3. Siegelmann, Hava; Sontag, Eduardo (1995). "Computational Power of Neural Networks". Journal of Computer and System Sciences. 50 (1): 132–150.

  4. Siegelmann, Hava (1995). "Computation Beyond the Turing Limit". Science. 268 (5210): 545–548.

  5. Ben-Hur, Asa; Horn, David; Siegelmann, Hava; Vapnik, Vladimir (2001). "Support vector clustering". Journal of Machine Learning Research. 2: 51–86.

  6. Hofmann, Thomas; Schölkopf, Bernhard; Smola, Alexander J. (2008). "Kernel methods in machine learning". The Annals of Statistics. 36 (3): 1171–1220. JSTOR 25464664.

  7. Bennett, James; Lanning, Stan (2007). "The netflix prize" (PDF). Proceedings of KDD Cup and Workshop 2007.

  8. Bayes, Thomas (1 January 1763). "An Essay towards solving a Problem in the Doctrine of Chances" (PDF). Philosophical Transactions. 53: 370–418. doi:10.1098/rstl.1763.0053. JSTOR 105741. Retrieved 15 June 2016.

  9. Legendre, Adrien-Marie (1805). Nouvelles méthodes pour la détermination des orbites des comètes (in French). Paris: Firmin Didot. p. viii. Retrieved 13 June 2016.

  10. O'Connor, J J; Robertson, E F. "Pierre-Simon Laplace". School of Mathematics and Statistics, University of St Andrews, Scotland. Retrieved 15 June 2016.

