The Musical Time Machine (formerly Musical Vintage) is a framework for machine-learning classification of a track's era of production, based on static analysis of audio features. For the full paper, see the PDF labelled MusicalVintage.
This project was completed by Shlomi Helfgot, Ami Listokin, and Zev Pinker (names listed alphabetically; the work was shared equally) in the Spring of 2024 at Yale.
The data is stored in the fma_metadata directory. The data used to train our models is in fma_metadata/features.csv and fma_metadata/tracks.csv.
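The FMA metadata CSVs use multi-row headers (features.csv has three header rows naming the feature, statistic, and coefficient number, with track_id as the index; tracks.csv uses two header rows). A minimal sketch of loading them with pandas, demonstrated on a tiny inline CSV that mimics the features.csv layout since the full file is several hundred megabytes:

```python
import io
import pandas as pd

# Tiny stand-in for fma_metadata/features.csv: three header rows
# (feature / statistics / number) followed by a track_id index-name row,
# then one row of numbers per track. Values here are made up.
csv_text = (
    "feature,mfcc,mfcc\n"
    "statistics,mean,std\n"
    "number,01,01\n"
    "track_id,,\n"
    "2,-115.2,90.1\n"
    "5,-101.7,85.4\n"
)

# header=[0, 1, 2] rebuilds the MultiIndex columns; index_col=0 makes
# track_id the row index. For the real files, pass the file path instead
# of the StringIO buffer (and use header=[0, 1] for tracks.csv).
features = pd.read_csv(io.StringIO(csv_text), header=[0, 1, 2], index_col=0)

print(features.shape)  # one row per track, one column per (feature, stat, number)
```

Replace the `io.StringIO` buffer with `"fma_metadata/features.csv"` to load the real file.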
The requirements are listed at the top of each code file.
The code files that contain our models, and that graders should run, are the following:
KRR_clean.ipynb
Neural_Net.ipynb
Perceptron.ipynb
SVMRBF.ipynb
This code was developed to run on Google Colab. As such, filesystem paths must be adjusted to match your local system.
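One common way to handle the Colab/local split is to detect Colab at runtime and choose the data path accordingly. A minimal sketch, assuming the data sits on Google Drive under MyDrive/fma_metadata when on Colab (that Drive path is an assumption; adjust it to wherever you uploaded the data):

```python
import os

try:
    # Only importable inside Colab; mounts your Drive at /content/drive.
    from google.colab import drive
    drive.mount("/content/drive")
    # Hypothetical Drive location -- change to match your upload.
    DATA_DIR = "/content/drive/MyDrive/fma_metadata"
except ImportError:
    # Running locally: assume fma_metadata sits next to the notebooks.
    DATA_DIR = os.path.join(os.getcwd(), "fma_metadata")

FEATURES_CSV = os.path.join(DATA_DIR, "features.csv")
TRACKS_CSV = os.path.join(DATA_DIR, "tracks.csv")
```

With this in the first cell, the rest of a notebook can refer to FEATURES_CSV and TRACKS_CSV without caring which environment it runs in.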