Language agents help large language models ‘think’ better and cheaper

The large language models that have increasingly taken over the tech world are not “cheap” in many ways. The most prominent LLMs, such as GPT-4, reportedly cost some $100 million to build, counting the legal costs of accessing training data, the computational power required for models with billions or even trillions of parameters, the energy and water needed to fuel that computation, and the many coders who develop the training algorithms that must run cycle after cycle so the machine will “learn.”