Some books I read in 2024


Books I read in 2024 that were actually published in 2024:

  • On the Edge, Nate Silver. The elections had me checking FiveThirtyEight more than I’d care to admit, so I was curious to hear the personality behind it. As a bonus, Nate Silver narrates his own audiobook, laying on some delightful accents when quoting his interviewees.
  • The Demon of Unrest: A Saga of Hubris, Heartbreak, and Heroism at the Dawn of the Civil War, Erik Larson. The subtitle says it all. Larson contextualizes the advent of the American Civil War, which he ties back to the January 6th Capitol riot.
  • Fire Weather: On the Front Lines of a Burning World, John Vaillant (more late ’23). A retelling of the 2016 fire in Fort McMurray, Alberta, Canada. Its thesis is that global warming, driven by the Petrocene age, is fueling a new, more severe generation of fires.

Most of the books I read weren’t published in 2024:

Read more ⟶

OpenAI's premium $200/mo ChatGPT Pro


The first of the “twelve days of OpenAI” is headlined by the ChatGPT Pro announcement, a new premium consumer AI tier. ChatGPT Pro, a tier above Plus, is a:

$200 monthly plan that enables scaled access to the best of OpenAI’s models and tools. This plan includes unlimited access to our smartest model, OpenAI o1, as well as to o1-mini, GPT-4o, and Advanced Voice. It also includes o1 pro mode, a version of o1 that uses more compute to think harder and provide even better answers to the hardest problems. In the future, we expect to add more powerful, compute-intensive productivity features to this plan.

Read more ⟶

Project Idea: Emacs as a Model Context Protocol Client


On Nov 25th Anthropic, of Claude fame, published v1 of their Model Context Protocol (MCP) spec. I’ll let Anthropic describe it in their own words:

The Model Context Protocol is an open standard that enables developers to build secure, two-way connections between their data sources and AI-powered tools. The architecture is straightforward: developers can either expose their data through MCP servers or build AI applications (MCP clients) that connect to these servers.
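Under the hood, MCP messages ride on JSON-RPC 2.0, so the opening moves of a client (say, one embedded in Emacs) can be sketched as plain message construction. A minimal sketch in Python, assuming the `initialize` and `tools/list` method names from the spec; the protocol version string and capability fields are simplified placeholders, so check the spec before relying on them:

```python
import json

def make_request(request_id, method, params=None):
    """Build a JSON-RPC 2.0 request dict, the wire format MCP messages use."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# An MCP client first negotiates capabilities with the server...
init = make_request(1, "initialize", {
    "protocolVersion": "2024-11-05",  # assumed version string; see the spec
    "clientInfo": {"name": "emacs-mcp", "version": "0.1"},  # illustrative
    "capabilities": {},
})

# ...then asks what tools the server exposes.
list_tools = make_request(2, "tools/list")

print(json.dumps(init))
print(json.dumps(list_tools))
```

The transport (stdio or HTTP) would carry these serialized messages; the sketch stops at constructing them.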

Read more ⟶

Yet Another Python Pkg Manager: uv


I am frustrated and delighted to have started using yet another Python package manager: uv. While I could tout the usability or speed of uv, what I find most remarkable is that it’s new to this world.

Python has been around since Guido started working on it in 1991, but its definition of “batteries included” hasn’t included a package manager. Instead, Python has developed official tooling (pip, virtualenv) that could be kludged together, and generations of third-party package managers have evolved (pipenv, poetry, hatch, rye, pdm, etc.).

Read more ⟶

Project Idea: Using Ollama for Emacs Completions


I’d like to use a local LLM for GitHub Copilot-esque completions. I like Copilot; it’s downright uncanny when it serves that perfect completion. However, it comes with a few drawbacks:

  • Cost: $10/mo adds up, especially when I already pay for general-purpose LLMs.
  • Privacy: even if GitHub sticks to its privacy policy and doesn’t keep queries for future training, sending code to third parties introduces a risk surface many organizations find unacceptable.

A local LLM sidesteps these issues.
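For reference, Ollama exposes a local HTTP API that a completion backend could call. A minimal sketch in Python, assuming an Ollama server on its default port (11434); the model name is illustrative, and the request is built but deliberately left unsent since it needs a running server:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_completion_request(prefix, model="codellama"):
    """Build (but don't send) a completion request for a code prefix."""
    payload = {
        "model": model,    # illustrative; any locally pulled model works
        "prompt": prefix,
        "stream": False,   # one JSON response instead of streamed tokens
    }
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )

req = build_completion_request("def fibonacci(n):")
# urllib.request.urlopen(req) would return JSON whose "response" field
# holds the completion text -- the piece Emacs would splice into the buffer.
```

The editor integration would then debounce keystrokes, send the buffer prefix, and overlay the returned text as a ghost suggestion.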

Read more ⟶