Autoresearch Is Just Hyperparameter Optimization With Extra Steps
Over the last few weeks, the autoresearch repository by Andrej Karpathy has made some waves. Everyone seemed hyped about LLMs doing deep learning research, while I had a look at the README and thought: “Well, that sounds like hyperparameter optimization with extra steps.” Below you can see the progress plot Karpathy published as part of the repo. The LLM runs 83 experiments over eight hours and successfully reduces the validation metric by around 0.0282 bits per byte. ...
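To make the comparison concrete, here is a minimal sketch of what plain hyperparameter optimization looks like without any LLM in the loop: sample a configuration, evaluate it, keep the best. The objective below is synthetic (a stand-in for an actual training run) and the names `validation_bpb` and `random_search` are my own, not anything from the autoresearch repo.

```python
import math
import random


def validation_bpb(lr: float) -> float:
    # Stand-in for a real training run: a synthetic objective with its
    # minimum near lr = 1e-3. In practice this would train a model and
    # return validation bits per byte.
    return 1.0 + 0.05 * (math.log10(lr) + 3.0) ** 2


def random_search(n_trials: int, seed: int = 0):
    # Plain random-search hyperparameter optimization: sample a config,
    # evaluate it, keep the best one seen so far.
    rng = random.Random(seed)
    best_lr, best_score = None, float("inf")
    for _ in range(n_trials):
        lr = 10 ** rng.uniform(-5, -1)  # log-uniform over [1e-5, 1e-1]
        score = validation_bpb(lr)
        if score < best_score:
            best_lr, best_score = lr, score
    return best_lr, best_score


best_lr, best_score = random_search(n_trials=83)
```

With 83 trials (the same budget as the run in the plot), even this trivial loop steadily drives the metric down; the question is how much of the LLM's improvement is explained by this kind of search alone.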