For context, training OpenAI’s GPT-3, a model with 175 billion parameters, consumed as much electricity as 120 American households use in a year, according to The Association of Data Scientists.
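The arithmetic behind that comparison can be sanity-checked with commonly cited outside estimates (these figures are assumptions, not from the source above): Patterson et al. (2021) put GPT-3’s training run at roughly 1,287 MWh, and the U.S. EIA reports average annual household electricity use of about 10,715 kWh.

```python
# Rough sanity check of the "120 households" comparison.
# Both constants are external estimates, not figures from this article:
GPT3_TRAINING_KWH = 1_287_000    # ~1,287 MWh, Patterson et al. (2021) estimate
HOUSEHOLD_KWH_PER_YEAR = 10_715  # ~U.S. average annual use, EIA estimate

households = GPT3_TRAINING_KWH / HOUSEHOLD_KWH_PER_YEAR
print(round(households))  # ~120 households, matching the cited comparison
```

Under those assumptions the ratio lands right around 120, so the cited figure is at least internally consistent.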