The world tried to kill Andy off, but he had to stay alive to talk about what happened with databases in 2025.
Google Cloud’s lead engineer for databases discusses the challenges of integrating databases and LLMs, the tools needed to ...
“By running Gemini natively in Snowflake, customers can use Gemini models across all supported clouds via cross-region ...
This is a toolkit for working with local LLMs. It also served as an exercise in agentic coding. Only OpenAI-compatible endpoints are supported. IIRC Ollama added support for that not long ago, ...
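For context, a minimal sketch of what "OpenAI-compatible" means in practice: pointing the standard OpenAI Python client at a local server such as Ollama's /v1 endpoint. The model name and prompt below are placeholders, not part of the toolkit itself.

```python
# Minimal sketch: talking to a local OpenAI-compatible endpoint
# (e.g. Ollama's /v1 API on its default port 11434).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local OpenAI-compatible server
    api_key="ollama",                      # required by the client, ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3",  # placeholder: any model already pulled locally
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```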
[08/05] Running a High-Performance GPT-OSS-120B Inference Server with TensorRT LLM – link
[08/01] Scaling Expert Parallelism in TensorRT LLM (Part 2: Performance Status and Optimization) – link
[07/26 ...