Tales from the Red Hat Summit 2025
This week, Kangaroot is having its yearly appointment with all-things-Red-Hat at the Red Hat Summit in Boston. Summarizing 4 days of back-to-back tech sessions in a single post is impossible, so we'll focus on the two main topics.
Red Hat Enterprise Linux 10 is here!
First up, Red Hat Enterprise Linux gets a new major release: RHEL 10. The main features we will be looking into include Red Hat Enterprise Linux Lightspeed: a built-in assistant you can ask natural-language questions about commands, knowledge base content and more.
Image mode Linux allows you to build, manage and deploy your Linux OS using container technologies, resulting in less drift and making it easy to include the OS in your CI/CD builds.
The new release also brings Insights for Image Building, first steps towards post-quantum cryptography to secure RHEL and, of course, updates to the container and AI components.
The power of AI is open
Last year, AI was already center stage at Red Hat Summit. However, the open source proposition, and Red Hat's in particular, was still limited and perhaps unclear.
One year later, the market has evolved from a focus on building and training models to businesses looking at actual use cases for AI, and Red Hat has put together a coherent strategy where all the pieces come together. It acquired Neural Magic to become a leading contributor to vLLM and, as a result, we can now clearly see how open source and open standards like the Model Context Protocol (MCP) are taking the lead again in this rapidly evolving AI world.
Red Hat announced Red Hat AI Inference Server, which can run on RHEL AI or on OpenShift AI, allowing customers to run different models within their existing environments. The Inference Server also ties into Red Hat's model repository on Hugging Face, where customers can find validated models that have been compressed and optimized to deliver better performance on the same underlying hardware.
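Because the Inference Server builds on vLLM, it exposes an OpenAI-compatible API, so existing client code can point at it with little more than a URL change. The sketch below is purely illustrative: the endpoint URL, API key and model name are placeholders, not values from Red Hat documentation, so substitute the details of your own deployment.

```python
# Minimal sketch: querying an OpenAI-compatible endpoint such as the one
# vLLM-based inference servers expose. URL, key and model name are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://inference.example.internal:8000/v1",  # hypothetical endpoint
    api_key="dummy-key",  # many local deployments ignore the key, but the client requires one
)

response = client.chat.completions.create(
    model="my-validated-model",  # placeholder: use a model actually served by your endpoint
    messages=[
        {"role": "user", "content": "Summarize what image mode for RHEL does in one sentence."}
    ],
)

print(response.choices[0].message.content)
```

Because this is the same API shape most AI tooling already targets, swapping a hosted model for one running on your own hardware stays a configuration change rather than a rewrite.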
To round things off, Red Hat launched the llm-d community and project, which is building the technology for distributed inference: the next step in scaling this AI infrastructure.
All this puts a new spin on Red Hat's hybrid cloud slogan, "Any application, Any platform, Any cloud", and turns it into:
Any model, Any accelerator, Any cloud
Great to see the AI plans and proposition come together into something we can now work with for our customers.
Are you thinking about AI use cases? Want to build a POC based on these technologies? Kangaroot and the other Tech Tribes combine their knowledge to get you started. Just get in touch.