
Benchmarking Sustainability Qualities for Hyperparameter Optimization in Deep Learning

Climate change and the associated obligation to save resources are among the most pressing issues of our time, and this obligation must also apply to how we develop and deploy software, including deep learning models. Hyperparameter optimization (HPO) is a necessary but highly energy- and resource-intensive part of creating these deep learning models, as it involves searching a configuration space and repeatedly training deep learning models. Recently, frameworks have emerged that enable data scientists to hand over HPO tasks to cloud resources.
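To illustrate why HPO is so resource-intensive, here is a minimal, framework-agnostic sketch of a grid search (this is not basht's API; the function, grid values, and stand-in objective are hypothetical): every candidate configuration triggers a complete training run, so the energy cost of a single training is multiplied by the size of the search space.

```python
# Minimal grid-search sketch: each grid point requires a full training run.
from itertools import product

def train_and_evaluate(learning_rate: float, batch_size: int) -> float:
    """Placeholder for a full training run followed by validation.
    In practice, this is the expensive, energy-intensive step."""
    # Stand-in objective so the sketch runs without a deep learning framework.
    return -((learning_rate - 0.01) ** 2) - ((batch_size - 64) ** 2) * 1e-6

grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [32, 64, 128],
}

best_score, best_config = float("-inf"), None
for lr, bs in product(grid["learning_rate"], grid["batch_size"]):
    score = train_and_evaluate(lr, bs)  # one complete training run per configuration
    if score > best_score:
        best_score, best_config = score, (lr, bs)

print(f"Best configuration: lr={best_config[0]}, batch_size={best_config[1]}")
```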

However, the plethora of interdependent configuration options in these frameworks creates uncertainty when selecting reasonably energy-efficient, sustainable options. We therefore designed basht, a benchmarking tool for exploring the cause-effect relations in HPO frameworks and uncovering the most energy- and resource-efficient configurations of cloud-based hyperparameter optimization systems.

We are part of the Information Systems Engineering (ISE) chair at TU Berlin, Germany. Feel free to explore basht and the continuously updated overview of all benchmark results.