Prompts are the weirdest part of LLMs. We usually instruct computers in unambiguous programming languages, but prompting injects irrationality into that process. We end up saying things like, “Take a deep breath and work on this problem step-by-step.”