Up to 30% of the power used to train AI is wasted: A software tool could help fix that

A less wasteful way to train large language models, such as the GPT series, finishes in the same amount of time while using up to 30% less energy, according to a new study from the University of Michigan.