Decentralized LLM Inference over Edge Networks with Energy Harvesting
As large language models continue to grow, running them efficiently on small edge devices remains one of AI’s biggest challenges. This paper explores how LLM inference can be decentralized across a network of energy-harvesting devices, each powered by an intermittent renewable energy source.
