In recent years, machine learning techniques utilizing large-scale datasets
have achieved remarkable performance. Differential privacy provides strong
privacy guarantees for such learning algorithms by adding calibrated noise, but
this often comes at the cost of reduced model accuracy and slower convergence.
This paper investigates the impact of differential privacy on learning
algorithms in terms of the carbon footprint incurred through longer run-times
or failed experiments. Through extensive experiments, guidance is provided on
choosing noise levels that strike a balance between the desired privacy level
and reduced carbon emissions.
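As context for the noise levels discussed above, the sketch below illustrates a DP-SGD-style Gaussian mechanism: per-example gradients are clipped and Gaussian noise scaled by a noise multiplier is added before averaging. The function name, parameters, and values are illustrative assumptions, not the paper's experimental setup; the noise multiplier is the knob whose choice trades privacy against accuracy, convergence speed, and therefore carbon footprint.

```python
import numpy as np

def privatize_gradients(per_example_grads, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Illustrative DP-SGD-style step: clip per-example gradients, sum, add Gaussian noise.

    `clip_norm` and `noise_multiplier` are hypothetical knobs for illustration;
    the paper's actual noise calibration and privacy accounting are not reproduced here.
    """
    rng = np.random.default_rng() if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale each gradient so its L2 norm is at most clip_norm (bounds sensitivity).
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    summed = np.sum(clipped, axis=0)
    # Noise standard deviation scales with the clipping bound and the noise
    # multiplier: a larger multiplier gives stronger privacy but typically
    # lower accuracy and slower convergence.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=summed.shape)
    return (summed + noise) / len(per_example_grads)

# Example usage: 32 per-example gradients of a 10-parameter model.
grads = [np.random.randn(10) for _ in range(32)]
noisy_mean_grad = privatize_gradients(grads, clip_norm=1.0, noise_multiplier=1.1)
```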
