Org blog interactive graphing options


I’m searching for a JavaScript graphing library to spruce up this org / Hugo / Emacs blog. In addition to drawing great graphs, the library should produce visualizations that are:

  • Interactive
  • Resizable
  • Relatively lightweight

tl;dr: Chart.js integrates easily with org and checks off all my requirements.
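
For a taste of that integration, here is a minimal sketch (the data and the demo-chart element id are made up for illustration): an org export block can hand a canvas straight to Chart.js.

    #+begin_export html
    <canvas id="demo-chart"></canvas>
    <script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
    <script>
      // Hypothetical data: Chart.js takes a chart type, labels, and datasets.
      new Chart(document.getElementById('demo-chart'), {
        type: 'line',
        data: {
          labels: ['2021', '2022', '2023'],
          datasets: [{label: 'posts', data: [4, 9, 12]}],
        },
      });
    </script>
    #+end_export

Because the chart is drawn client-side onto a canvas, it resizes with the page and responds to hover out of the box.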

What I was using before: Python seaborn

In prior posts I’ve used Python’s seaborn to generate graphs as images which I embed in the site. Seaborn is excellent for static graphs, but it doesn’t take advantage of the web platform – you can’t resize or interact with the images.

Read more ⟶

"Statistics on the Table"


George Stigler pulled a fantastic quote from Pearson for the title of his book Statistics on the Table (1999):

I am too familiar with the manner in which actual data are met with the suggestion that other data, if they were collected, might show something else to believe it to have any value as an argument. “Statistics on the table, please,” can be my sole reply.

(Pearson, 1911)

My read: fight statistics with statistics, whether via method or data. Avoid the chaff of ungrounded theory.

Read more ⟶

Micromégas / Voltaire / Locke on Deism


Voltaire’s deism has withstood the test of time well – I could imagine an empirical Christian nodding along to its humility, dualism, and acknowledgment of a creator:

A tiny follower of Locke was standing nearby, and when it was finally his turn to speak, he said:

“I do not know how I think, but I do know that I have never thought except with the aid of my senses. That there are immaterial and intelligent substances is something I do not doubt, but that it is impossible for God to endow matter with the power of thought is something I do strongly doubt. I revere the eternal power and it is not for me to set limits on it. I affirm nothing, and I am content to believe that more things are possible than people think.”

Read more ⟶

Why Machines Learn by Anil Ananthaswamy


I highly recommend Anil Ananthaswamy’s Why Machines Learn as an introduction to the foundational concepts of deep learning. It touches on the math, but keeps it accessible for a high school grad, and grounds it with practitioner interviews (LeCun, Sutskever, Hinton, Hopfield, etc.). Ananthaswamy starts with perceptrons and ends with deep networks, stopping just short of generative models.

Here are some of the ideas that were new to me:

  • With enough neurons, a neural network with even just a single hidden layer can approximate any function (Cybenko 1989; see the statement after this list)
  • Convolutional neural networks, like the early LeNet and AlexNet, learn the convolutional filters during training (LeNet 1989, AlexNet 2012)
  • Kernel methods map input data to a higher-dimensional space where there might be a separating hyperplane (Guyon 1991)
  • Overparameterized deep learning models don’t overfit, despite shattering the training data (see implicit and explicit regularization, Occam’s Razor, and the Lottery Ticket Hypothesis)
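
To make that first bullet concrete, here is my paraphrase of Cybenko’s result (notation mine, not the book’s): finite sums of sigmoids are dense in the continuous functions on the unit cube.

    \[
      F(x) = \sum_{i=1}^{N} \alpha_i \, \sigma\!\left(w_i^{\top} x + b_i\right),
      \qquad
      \sup_{x \in [0,1]^n} \left| F(x) - f(x) \right| < \varepsilon
    \]

That is, for any continuous f on the unit cube and any ε > 0, some finite N and choice of parameters achieves the bound, provided σ is a continuous sigmoidal function.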

“I definitely remember being perplexed by how simple the whole thing is… How can it be? You look at your undergrad classes in math or physics, and they’re so complicated. And then this stuff is so simple. You just read two papers and you understand such powerful concepts. How can it be that it’s so simple?”

Read more ⟶

Using proportional fonts in Emacs org mode


Most of my writing, prose or code, happens in Emacs, where a monospaced font is the default. While I’d been hanging on to that anachronistic monospaced aesthetic, I’m coming around to the readability of a well-kerned proportional font.

To this end, I switched this blog from monospaced to proportional fonts a few months back, and I wanted to do the same in Emacs org-mode. The built-in variable-pitch-mode renders all buffer text in a proportional font, but we need a more granular solution, since org-mode buffers can contain embedded source code blocks and header bullets.
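
A minimal sketch of that granularity (the exact face list here is my assumption, not the post’s final config): turn on variable-pitch-mode in org buffers, then remap the code-facing org faces back to fixed-pitch.

    ;; Sketch: proportional prose, monospaced code, assuming stock org faces.
    (add-hook 'org-mode-hook #'variable-pitch-mode)

    (with-eval-after-load 'org
      ;; Keep code-like elements in a monospaced (fixed-pitch) face.
      (dolist (face '(org-block org-code org-verbatim org-table org-checkbox))
        (set-face-attribute face nil :inherit 'fixed-pitch)))

Tables and checkboxes stay fixed-pitch here as well, since their alignment depends on every character having the same width.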

Read more ⟶