Alexander Müller. “Optimizing Large Language Model Inference: Strategies for Latency Reduction, Energy Efficiency, and Cybersecurity Applications”. Academic Research Library for International Journal of Computer Science & Information System 10, no. 11 (November 30, 2025): 93–97. Accessed January 17, 2026. https://colomboscipub.com/index.php/arlijcsis/article/view/58.