Executive Warns Company That Large Language Models Are Energy Inefficient
Source: Mastodon
Developer calls out wasteful engineering practices surrounding the company's use of LLMs.
A recent criticism has surfaced regarding the energy inefficiency of Large Language Models (LLMs). The statement "What I need this company to understand is that LLMs waste a lot of energy" highlights the issue, citing examples of broader compute waste such as wrapping a 500 KB executable in a 1 GB Docker image and running the full repository CI suite on every change in a dedicated off-site cloud farm. The criticism matters because LLMs, like those powering ChatGPT, are increasingly prevalent across industries, including pharma and life sciences, where they are promoted as a way to democratize AI.
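The Docker example above is a fixable kind of waste. A minimal sketch of the usual remedy, a multi-stage build, is shown below; it assumes a small static Go binary (the service name `app` and the Go base image are illustrative, not from the original post), and ships only the compiled binary rather than the full build toolchain:

```dockerfile
# Build stage: pulls the ~1 GB toolchain image, discarded after the build
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
# CGO_ENABLED=0 produces a static binary that can run on a minimal base
RUN CGO_ENABLED=0 go build -o /app .

# Final stage: starts from an empty image and copies in only the binary
FROM scratch
COPY --from=build /app /app
ENTRYPOINT ["/app"]
```

The resulting image is roughly the size of the binary itself, so registries, CI runners, and deployment targets move megabytes instead of a gigabyte on every change.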
As we previously reported, LLMs have been reported to corrupt documents when delegated editing tasks, and usage-based billing models, such as GitHub Copilot's, are being rolled out. The energy inefficiency of LLMs is a significant concern, especially given their dependence on large training runs and their lack of optimization under resource constraints. Researchers at companies such as Meta are exploring ways to make LLMs more efficient, including learning reasoning shortcuts. What to watch next: how companies address the energy-waste issue, whether by optimizing their LLMs or by adopting more efficient AI techniques.