Continuing on Multi-ChatGPT-Shell Context
Hiking in the Whites
I’ve been hiking in the Whites out of N. Conway with CEO and Fam for the last week:
- Mount Jackson (bagged a presi, 2000ft over 2.7mi and Curie came too)
- Georgianna Falls (long scramble along a waterfall, Curie still came)
- Diana’s Baths (quick ’n easy – we were stuck in the rain)
Opened PR with the chatgpt-shell multi src blocks
I opened a rough draft of the PR to support multiple chatgpt-shell org-babel contexts (#121). It’ll be a great upgrade to the library, my chatgpt workflows, and, it turns out, my elisp fundamentals.
…The In-Laws and NH and chatgpt-shell context
Headed to the White Mountains with CEO to hang with the family for a few days.
org-babel chatgpt-shell multiple src block contexts!
I’m on the plane, my laptop is shaking with turbulence, and I’m going to try and put together multiple org-babel chatgpt-shell contexts. Okie dokie.
Notes
The chatgpt-shell context is produced by the --context function. Currently this function takes no arguments, but if it were modified to 1) accept the name of the context it can a
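To make the idea concrete, here’s a rough sketch of a name-aware context lookup. All the names here are my own invention for illustration, not chatgpt-shell’s actual internals:

```elisp
;; Hypothetical sketch -- not chatgpt-shell's real implementation.
;; Assume each named context holds its own list of (prompt . response) pairs.
(defvar my/chatgpt-contexts (make-hash-table :test #'equal)
  "Map of context name -> list of (prompt . response) conses.")

(defun my/chatgpt-context (&optional name)
  "Return the conversation context for NAME, defaulting to \"default\"."
  (gethash (or name "default") my/chatgpt-contexts '()))

(defun my/chatgpt-context-add (name prompt response)
  "Append a PROMPT/RESPONSE exchange to the context called NAME."
  (puthash name
           (append (my/chatgpt-context name)
                   (list (cons prompt response)))
           my/chatgpt-contexts))
```

A `:context` header argument on each src block could then select which of these conversations the block belongs to.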
A short todo
todo
- pack up for NH with Curie
- checkout latest chatgpt-shell
- 1h bike ride at 18mph
- GT7 Laguna Seca Gr.4?
First look at chatgpt-shell history
I need some sort of switchable “memory” using chatgpt-shell, for both the shell and org-mode. Memory, here, refers to a conversation like the ones we have in the ChatGPT web interface. The Emacs chatgpt shell and org-mode both have the ability to hold a single conversation of chat messages (with the current caveat that the org-babel integration has a bug and the shell can’t be restored).
The Shell
chatgpt-shell does indeed support saving and loading session transcripts. Let’s try it out… alright, looks like this works well enough as-is. The key functions are:
…June Twenty-Fifth
Emacs + LLMs
I’ve been continuing to integrate chatgpt-shell with Emacs and my workflows, specifically org-mode. The config below 1) changes the results formatting to be org- and markdown-friendly, 2) drops the temperature to zero to make it “deterministic”, and 3) takes advantage of the built-in programming prompt.
(require 'a)

(setq org-babel-default-header-args:chatgpt-shell
      `((:results . "code")
        (:version . "gpt-4-0613")
        (:system . ,(a-get chatgpt-shell-system-prompts "Programming"))
        (:temperature . 0)
        (:wrap . "EXPORT markdown")))
The prompt and exporting config target markdown. Let’s try reconfiguring it to use org, because when in Rome. First, let’s reconfigure the system prompt to ask the LLM to use org formatting. Here’s the existing prompt:
…
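A sketch of what the org-flavored reconfiguration might look like. The “Org Programming” entry and its prompt string are my own, not something shipped with chatgpt-shell; the only assumption about the library is that `chatgpt-shell-system-prompts` is an alist of (name . prompt) pairs, as used above:

```elisp
;; Hypothetical: add an org-flavored variant of the programming prompt.
(require 'a)

(push '("Org Programming"
        . "Always show code snippets in org babel source blocks, e.g. #+begin_src emacs-lisp ... #+end_src. Prefer org markup over markdown in prose.")
      chatgpt-shell-system-prompts)

;; Same header args as before, but pointing at the new prompt and
;; wrapping results for org export instead of markdown.
(setq org-babel-default-header-args:chatgpt-shell
      `((:results . "code")
        (:version . "gpt-4-0613")
        (:system . ,(a-get chatgpt-shell-system-prompts "Org Programming"))
        (:temperature . 0)
        (:wrap . "EXPORT org")))
```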