Displaying 3 episodes of Coder Radio with the tag "local llms".
-
645: Warp's Holmes & Lloyd
April 8th, 2026 | 27 mins 6 secs
ai, determinism, enterprise security, kubernetes, linux, local llms, podman, predictive ai, rag, rama-llama, rhel, vllm
A rare trio episode with two of the luminaries behind Warp and good old gungan Mike, where they discuss all things agentic, including Warp's very own Oz.
-
636: Red Hat's James Huang
December 19th, 2025 | 20 mins 53 secs
ai, determinism, enterprise security, kubernetes, linux, local llms, podman, predictive ai, rag, rama-llama, rhel, vllm
In this episode of Coder Radio (Coder.show), Michael Dominick sits down with James Huang, Senior Product Manager of AI and High Performance Computing at Red Hat, to discuss the intersection of enterprise-grade Linux and the rapidly evolving world of artificial intelligence.
-
563: Mike’s No Good Very Bad Rails Update
March 27th, 2024 | 57 mins 16 secs
3 body problem, ai, apple, apple developer, apple vision pro, asset pipeline, barbara fried, c++ interop, c++ successor, coder radio, developers, development podcast, esm, expensive elephant, feedback, github, google carbon, helix, huggingface, importmap-rails, iphone, lm studio, local llms, netflix, nostr, nvidia, rails, rcs support, sbf, sentencing submission, text editor, vanilla js, wwdc 2024, xr, youtube
Mike makes the case for just going vanilla, we take a look at Google Carbon, and then we address the expensive elephant in the room.