Saturday, December 31, 2022
Show HN: lambdaprompt – build, compose and call templated LLM prompts https://ift.tt/Y6vDPHJ
For the past few months I've been building a lot of things with LLMs (GPT-3, Codex, etc.), trying to push them to their limits, especially in applying them to the tabular data domain. Along the way I've found some common patterns for solving problems: templating, chaining, functional-programming-style operations, and so on.

As I've iterated, I've come to believe that a functional-style interface is likely to power a new wave of systems I'm calling "prompt-machines": systems where the core new unit of work is a "named" LLM prompt, extending the "function" concept. Additionally, to make the code capable of meta-prompting (where the LLM can write its own templated prompts), I aimed to keep the interface and library as simple and lightweight as possible. I think I've achieved that goal, so I'm releasing the library to share with others!

I use lambdaprompt in two main ways: (1) to quickly "map" an LLM prompt over multiple inputs and see how it behaves on a fixed set of examples, and (2) to quickly iterate on a prompt-chain (taking the output of one prompt and passing it to other prompts) to create complex behavior. (Minimal sketches of both patterns follow at the end of this post.)

An example of (2) that worked quite well is a Text-2-SQL prototype. It generates multiple SQL options, executes each against the database (if a query errors, it asks `Codex-EDIT` to fix the errors and retries), then takes the most "consistent" answer as the valid answer. Simply by adding this prompt-chain on top of Codex, we saw an improvement from ~75% to ~85% on a Spider Text-2-SQL benchmark (on just a small sample of N=200).

To increase usability, it also ships (as an extra) with a FastAPI app that registers any defined prompts as endpoints and hosts them so they are directly callable via HTTP GET requests. This makes it easy to build client applications on top of these prompts, while allowing the prompt itself to be arbitrarily complex (a composition of prompts).

I hope you enjoy using it! I'm also very curious to hear whether anyone else has been thinking about LLMs in similar ways (composing them, building interfaces to them, etc.) and what your learnings have been, even if not through this library.

https://ift.tt/jfCOE39

December 31, 2022 at 11:13AM
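
A minimal sketch of the "named templated prompt as a function" pattern described above, including the "map over inputs" and "chain prompts" uses. This is not lambdaprompt's actual API; `complete()` is a stand-in for any LLM completion call, and all names here are hypothetical.

```python
from string import Template
from typing import Callable


def complete(text: str) -> str:
    """Placeholder for a real LLM completion call (e.g. an OpenAI API request)."""
    raise NotImplementedError("wire this to your LLM backend")


def prompt(name: str, template: str) -> Callable[..., str]:
    """Turn a named prompt template into an ordinary Python function."""
    tmpl = Template(template)

    def call(**kwargs: str) -> str:
        return complete(tmpl.substitute(**kwargs))

    call.__name__ = name
    return call


# (1) "map" a prompt over many inputs to see how it behaves
summarize = prompt("summarize", "Summarize in one sentence: $text")
# results = [summarize(text=t) for t in documents]

# (2) chain prompts: feed one prompt's output into the next
extract_topic = prompt("extract_topic", "What is the main topic of: $text")
write_title = prompt("write_title", "Write a title about: $topic")
# title = write_title(topic=extract_topic(text=some_document))
```

The point of the pattern is that a named prompt behaves like any other function, so it can be mapped, composed, or even generated by another prompt.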
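
A hedged sketch of the Text-2-SQL prompt-chain described in the post: generate several SQL candidates, run each, send failures back to an edit model for a fix, and keep the answer the most candidates agree on. The function names (`generate_sql`, `fix_sql`, `run_query`) are placeholders, not part of lambdaprompt.

```python
from collections import Counter


def generate_sql(question: str, n: int = 5) -> list[str]:
    """Placeholder: sample n candidate SQL queries from a code model (e.g. Codex)."""
    raise NotImplementedError


def fix_sql(query: str, error: str) -> str:
    """Placeholder: ask an edit model to repair a failing query."""
    raise NotImplementedError


def run_query(query: str):
    """Placeholder: execute SQL against the target database, raising on error."""
    raise NotImplementedError


def text_to_sql(question: str, max_retries: int = 2):
    results = []
    for query in generate_sql(question):
        for _ in range(max_retries + 1):
            try:
                # rows are assumed hashable so they can be vote-counted below
                results.append(tuple(run_query(query)))
                break
            except Exception as err:
                # query failed: ask the edit model to fix it, then retry
                query = fix_sql(query, str(err))
    if not results:
        return None
    # "consistency" vote: the answer produced by the most candidates wins
    return Counter(results).most_common(1)[0][0]
```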
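
And a rough sketch of exposing a prompt-function over HTTP GET with FastAPI, in the spirit of the extras mentioned above. The route and parameter names are invented for illustration; check lambdaprompt's own documentation for its real integration.

```python
from fastapi import FastAPI

app = FastAPI()


def summarize(text: str) -> str:
    """Placeholder for a prompt-function like the one in the first sketch."""
    raise NotImplementedError


@app.get("/prompt/summarize")
def summarize_endpoint(text: str) -> dict:
    # Any composition of prompts could sit behind this endpoint just as easily;
    # the client only sees a simple GET with a query parameter.
    return {"result": summarize(text)}
```

Run with something like `uvicorn module:app`, then call `GET /prompt/summarize?text=...` from any client.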