June Twenty-Fifth

Posted on Jun 25, 2023

Emacs + LLMs

I’ve been continuing to integrate chatgpt-shell with Emacs and my workflows, specifically org-mode. The config below 1) changes the results formatting to be org + markdown friendly, 2) drops the temperature to zero to make it “deterministic”, and 3) takes advantage of the built-in programming prompt.

```emacs-lisp
(require 'a)  ; a.el, the alist helper library
(setq org-babel-default-header-args:chatgpt-shell
      `((:results . "code")
        (:version . "gpt-4-0613")
        (:system . ,(a-get chatgpt-shell-system-prompts "Programming"))
        (:temperature . 0)
        (:wrap . "EXPORT markdown")))
```
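With those defaults in place, a chatgpt-shell source block in an org buffer needs no header arguments of its own. A hypothetical sketch of what a block and its wrapped result look like (the prompt and output here are illustrative, not real results):

```org
#+begin_src chatgpt-shell
Write a one-liner to reverse a list in emacs-lisp
#+end_src

#+RESULTS:
#+begin_export markdown
...
#+end_export
```

The `:wrap` argument is what produces the export block around the reply.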

The prompt and exporting config target markdown. Let’s try reconfiguring it to use org, because when in Rome. First, let’s change the system prompt to ask the LLM for org formatting. Here’s the existing prompt:

```emacs-lisp
(a-get chatgpt-shell-system-prompts "Programming")
```

```
The user is a programmer with very limited time.
You treat their time as precious. You do not repeat obvious things, including their query.
You are as concise as possible in responses.
You never apologize for confusions because it would waste their time.
You use markdown liberally to structure responses.
Always show code snippets in markdown blocks with language labels.
Don't explain code snippets.
Whenever you output updated code for the user, only show diffs, instead of entire snippets.
# System info

## OS details
ProductName:		macOS
ProductVersion:		13.4.1
ProductVersionExtra:	(c)
BuildVersion:		22F770820d
## Editor
GNU Emacs 30.0.50 (build 1, aarch64-apple-darwin22.4.0, NS appkit-2299.50 Version 13.3.1 (Build 22E261))
 of 2023-04-25
```

This prompt does a great job of producing just the output you want and nothing else – I don’t want to fork it, just modify it. Let’s go with:

```emacs-lisp
(add-to-list 'chatgpt-shell-system-prompts
             `("Prorg" . ,(string-replace
                           "markdown" "org-mode markup"
                           (a-get chatgpt-shell-system-prompts "Programming"))))
```
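string-replace (built into Emacs 28+) substitutes every occurrence, so both the “use markdown liberally” line and the code-fence instruction in the prompt now ask for org-mode markup. A quick sanity check on one line of the prompt:

```emacs-lisp
;; string-replace substitutes every occurrence of the target string:
(string-replace "markdown" "org-mode markup"
                "You use markdown liberally to structure responses.")
;; ⇒ "You use org-mode markup liberally to structure responses."
```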

```emacs-lisp
(setq org-babel-default-header-args:chatgpt-shell
      `((:results . "code")
        (:version . "gpt-4-0613")
        (:system . ,(a-get chatgpt-shell-system-prompts "Prorg"))
        (:temperature . 0)
        (:wrap . "EXPORT org")))
```
Write fibonacci(n) in emacs-lisp

Here’s the function:

```emacs-lisp
(defun fibonacci (n)
  (if (or (= n 0) (= n 1))
      n
    (+ (fibonacci (- n 1)) (fibonacci (- n 2)))))
```

Awesome. However, I’d like to remove the org export wrapping entirely because we’re in Rome (i.e. org-mode) and don’t need no wrapping. I’m not sure which src block arguments would do the trick, so let’s ask the oracle:

In emacs org mode what source code block arguments do I use to include the
result verbatim / unwrapped?

Use the `:results` header argument with the value `verbatim` in your source code block.

Example:

```org
#+begin_src python :results verbatim
print("Hello, World!")
#+end_src
```

This will include the result as a verbatim text block.

Sounds good, GPT – let’s drop the :wrap argument and change :results to raw, which skips the wrapping entirely:

```emacs-lisp
(setq org-babel-default-header-args:chatgpt-shell
      `((:results . "raw")
        (:version . "gpt-4-0613")
        (:system . ,(a-get chatgpt-shell-system-prompts "Prorg"))
        (:temperature . 0)))
```
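Why raw rather than the verbatim the oracle suggested? My understanding of org-babel’s behavior: verbatim still wraps a multi-line result in an example block, while raw inserts the reply into the buffer untouched, so the model’s org markup renders natively. Roughly:

```org
# :results verbatim – multi-line output still lands in an example block:
#+RESULTS:
#+begin_example
...
#+end_example

# :results raw – output is inserted as-is, so any headings,
# src blocks, etc. in the reply become live org markup
```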

And now for the test, Python this time ‘round:

Write fibonacci(n) in python

Sure.

```python
def fibonacci(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

Voila! You, dear reader, might not be able to tell the difference, but from my perspective I just got an unwrapped org-mode python src block in my buffer. Now I feel like I do need to label the output as machine-produced, but I’m delighted to have that seamless integration.

Six Hours of the Glen

Six hours of the Glen this weekend – I’m happily falling for IMSA’s multi-class racing. It’s like having stock cars share the track with bespoke F1 cars (well, swap F1 with the LMPs), and I love it. Watkins Glen itself is in upstate New York. I was idly thinking that at 3.5 hours away it’s not close to TEO / JB / SJO. Maybe some day. In the meantime, GT7 has a great Watkins Glen implementation and it’s popular in the lobbies.

IMSA also races at Laguna Seca – between Indycar, GT Challenge, and IMSA, my cup runneth over with local series. It’s crazy to see Cadillac competing with a prototype, and exciting that Corvette C8s are competing in the GT class. US sports cars seem to be picking up.