* [Inference Gateway v0.23.4](https://github.com/inference-gateway/inference-gateway) – Cloud-native, high-performance proxy that unifies multiple LLM providers, with support for streaming, multimodal inputs, the Model Context Protocol (MCP), and function calling.