Buckaroo Banzai


As planned, skipping leg lifting this week to focus on cycling. Got in a 1h20min ride for 12.5mi at about 150W. For now, I’m going to give myself a 1.5x multiplier on mileage with the fan belted on.

Also, watched Buckaroo Banzai. I don’t think I fully get it. Yet.

Got in an F1 race with EBJ, falling into second behind him around Barcelona. Super fun track, but man, I was almost a second a lap off his pace. We have Monaco coming up next – I put in a quick practice session and crushed three front wings. More practice necessary.


LLMs get Reflexive


Self-Reflecting LLMs

My dad shared some fascinating news on GPT-4 (via this video): it performs better with Reflexion, i.e. if you ask it why it gave an incorrect answer, it can sometimes catch the mistake and self-correct into an improved answer. Here is a Substack post from the paper’s authors, Noah Shinn and Ashwin Gopinath – one of the provocative points is that GPT-4 crossed a threshold of complexity that lets it improve via Reflexion, unlike the earlier GPT models.
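The bare idea is easy to sketch with a couple of chat-completion calls. The paper’s full Reflexion loop adds an evaluator and a memory of past reflections, so treat this as a minimal, hypothetical version – the prompts and function names here are mine:

```python
# Minimal sketch of the ask-then-reflect idea, not the paper's full Reflexion
# agent. Assumes the 2023-era openai package (openai.ChatCompletion) with
# OPENAI_API_KEY set in the environment; prompts and names are illustrative.
import openai


def ask(messages):
    """One chat-completion call against GPT-4."""
    response = openai.ChatCompletion.create(model="gpt-4", messages=messages)
    return response.choices[0].message.content


def answer_with_reflection(question):
    history = [{"role": "user", "content": question}]
    first_answer = ask(history)

    # Feed the answer back and ask the model to critique and correct itself.
    history += [
        {"role": "assistant", "content": first_answer},
        {"role": "user", "content": (
            "Reflect on your answer above. If anything is wrong, explain "
            "the mistake and give a corrected answer.")},
    ]
    reflection = ask(history)
    return first_answer, reflection
```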


Driving Home and gptel


Happy April 2nd! Drove back home, with CEO, so everything is right with the world. I actually watched the Aus F1 race today. Some numbers from the race that stuck out to me: three red flags leading to three standing starts, eight DNFs, zero points for Ferrari, and Aston Martin placing third and fourth. The four DRS zones kept throwing cars into overtakes, making it one wild ride.

I configured gptel using my OpenAI Plus account – Emacs is where work happens, so let’s introduce them. I wonder how GPT-4 does at writing elisp… of course, the answer is easy to find on the web. @daviwil gives a good run-through of its Emacs-related capabilities on a System Crafters stream.


Australia '23 Race Day


Happy April 2nd!

I’m busy this evening, heading north for home tomorrow, but at this moment early in the day it’s my own time. So! Using that time to get a massive sleep-in and catch up on the hours I missed over the week. True rest day.

One note on Guix: a full garbage collection (guix gc) cleared out nearly 200GB; however, many of those packages were build inputs. So, updating my Guix Home profile after the GC resulted in an expensive round of re-resolving and rebuilding dependencies.


Continuing down the LangChain


LangChain provides a Pythonic framework for writing LLM-powered applications, e.g. tools for asking questions of knowledge bases, personal assistants, chatbots, etc. Given it provides a toolkit for implementing “smart” apps, I wanted to go through the motions of implementing something simple and useful.

I started wiring LangChain to my org files and emails, but I wasn’t able to fully put it through its paces. While most of my organized thoughts are in the org files, there doesn’t seem to be an unstructured parser for org-mode, which means all of the semantic structure (headings, tags, links, etc.) is lost in raw text parsing and splitting. Solution-wise, it should be easy enough to convert the org files to HTML, which unstructured will happily devour. Emails have their own problem, namely persisting the Chroma vectorstore index, because it takes a while to index my tens of thousands of emails. That should also be easy enough with LangChain, but I’m still learning the API calls to pull it off.
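For reference, here is a rough sketch of both fixes against the early-2023 LangChain API. The pandoc step for the org-to-HTML conversion, the directory paths, and the chunk sizes are all placeholders, not a finished pipeline:

```python
# Sketch of both fixes, using the early-2023 LangChain API. The pandoc
# conversion step, directory paths, and chunk sizes are assumptions.
import subprocess
from pathlib import Path

from langchain.document_loaders import UnstructuredHTMLLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import Chroma


def org_to_documents(org_dir):
    """Convert org files to HTML with pandoc, then load them via unstructured."""
    docs = []
    for org_file in Path(org_dir).expanduser().glob("*.org"):
        html_file = org_file.with_suffix(".html")
        subprocess.run(["pandoc", str(org_file), "-o", str(html_file)], check=True)
        docs += UnstructuredHTMLLoader(str(html_file)).load()
    return docs


docs = org_to_documents("~/org")  # placeholder path
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100).split_documents(docs)

# Build the index once, then persist it so later sessions skip re-embedding.
db = Chroma.from_documents(chunks, OpenAIEmbeddings(),
                           persist_directory="./chroma-index")
db.persist()

# A later run reopens the saved index instead of re-indexing everything.
db = Chroma(persist_directory="./chroma-index",
            embedding_function=OpenAIEmbeddings())
```

Reopening the persisted directory on later runs should skip re-embedding, which is the whole point for the email corpus.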
