Showing 2 changed files with 4 additions and 4 deletions.
@@ -1 +1 @@
{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"Overview \u00b6 Metaprompt is a template language for LLM prompt automation, reuse and structuring, with support for writing prompts with prompts. It adds a number of syntactic constructs to plaintext prompts: variables conditionals function calls meta-prompting operator etc. These constructs get expanded at run time, producing textual output. Project status \u00b6 !!! This is an early work-in-progress !!! Not all of the described features have been implemented. The repository README will give you more details. Use cases \u00b6 Templating \u00b6 MetaPrompt's basic use case is substituting parameter values instead of variable names embedded in a prompt: Write me a poem about [:subject] in the style of [:style] Meta-prompting \u00b6 Meta-prompting is a technique of asking an LLM to create/modify/expand an LLM prompt. Dynamically crafting task-specific prompts based on a set of high level principles Modifying prompts to increase accuracy Securing inputs from prompt injection attacks Selecting the most suitable model based on prompt contents Quick example: [$ You are an LLM prompt engineer. Improve this prompt by adding specific instructions: [:prompt] ] Prompt structuring \u00b6 A module system and a package system enable parameterized prompt reuse and publishing. hello.metaprompt : Hello, [:what]! main.metaprompt : [:use ./hello :what=world] main.metaprompt will evaluate to Hello, world!","title":"Home"},{"location":"#overview","text":"Metaprompt is a template language for LLM prompt automation, reuse and structuring, with support for writing prompts with prompts. It adds a number of syntactic constructs to plaintext prompts: variables conditionals function calls meta-prompting operator etc. These constructs get expanded at run time, producing textual output.","title":"Overview"},{"location":"#project-status","text":"!!! 
This is an early work-in-progress !!! Not all of the described features have been implemented. The repository README will give you more details.","title":"Project status"},{"location":"#use-cases","text":"","title":"Use cases"},{"location":"#templating","text":"MetaPrompt's basic use case is substituting parameter values instead of variable names embedded in a prompt: Write me a poem about [:subject] in the style of [:style]","title":"Templating"},{"location":"#meta-prompting","text":"Meta-prompting is a technique of asking an LLM to create/modify/expand an LLM prompt. Dynamically crafting task-specific prompts based on a set of high level principles Modifying prompts to increase accuracy Securing inputs from prompt injection attacks Selecting the most suitable model based on prompt contents Quick example: [$ You are an LLM prompt engineer. Improve this prompt by adding specific instructions: [:prompt] ]","title":"Meta-prompting"},{"location":"#prompt-structuring","text":"A module system and a package system enable parameterized prompt reuse and publishing. hello.metaprompt : Hello, [:what]! main.metaprompt : [:use ./hello :what=world] main.metaprompt will evaluate to Hello, world!","title":"Prompt structuring"},{"location":"modules/","text":"MetaPrompt module system is centered around files.","title":"Modules"},{"location":"syntax/","text":"Text \u00b6 A textual prompt is usually a valid metaprompt: Hi, LLM! How are you feeling today? will be expanded to the same string, because it does not contain any MetaPrompt constructs. Variables \u00b6 Here's a variable: [:variable_name]. If a variable is used before first assignment, it is treated as a required prompt parameter automatically. [:variable_name=it can be reassigned to any value, however] [:variable_name=Including a value containing its old value: [:variable_name]] Comments \u00b6 [# Text for the human reader can be written like this. 
Comments must be well-formed metaprompt expressions too - in the future comment parse trees will be used to convey additional semantic info (e.g. documentation). Comments are ignored during evaluation. ] Conditionals \u00b6 [:if the sky is sometimes blue :then this... :else that... ] :if expressions will be expanded at runtime. First, the following text will be fed to an LLM: Please determine if the following statement is true. Do not write any other output, answer just \"true\" or \"false\". The statement: the sky is sometimes blue The answer will determine the execution branch. If the answer is not literally \"true\" or \"false\", an exception will be thrown after a few retries Meta-prompting \u00b6 LLM says: [$ Hi, LLM! How are you today?] The [$ prompt will be executed and its output will be inserted at its position during expansion. This enables powerful techniques of prompt rewriting: [$ [$ Improve this LLM prompt: [:prompt]]] Notice the double nesting of [$ - the improved prompt will be fed back into an LLM. Modules \u00b6 Every .metaprompt file is a function. Unbound variables used in a file are its parameters, that must be provided. Hello, [:what]! [# `what` is a parameter ] [:who=you] [# `who` is NOT a parameter, because it is assigned before first use] How are [:who] feeling today? File imports \u00b6 The following expression will include ./relative-import.metaprompt file (relative to the directory of the file, NOT to the current working dir): [:use ./relative-import] Package imports \u00b6 NOT IMPLEMENTED Passing parameters \u00b6 [:use ./relative-import :someParameter= arbitrary value, potentially using any other MetaPrompt constructs :otherParameter= another value ] Special variables \u00b6 MODEL - used to determine active LLM id.","title":"Syntax"},{"location":"syntax/#text","text":"A textual prompt is usually a valid metaprompt: Hi, LLM! How are you feeling today? 
will be expanded to the same string, because it does not contain any MetaPrompt constructs.","title":"Text"},{"location":"syntax/#variables","text":"Here's a variable: [:variable_name]. If a variable is used before first assignment, it is treated as a required prompt parameter automatically. [:variable_name=it can be reassigned to any value, however] [:variable_name=Including a value containing its old value: [:variable_name]]","title":"Variables"},{"location":"syntax/#comments","text":"[# Text for the human reader can be written like this. Comments must be well-formed metaprompt expressions too - in the future comment parse trees will be used to convey additional semantic info (e.g. documentation). Comments are ignored during evaluation. ]","title":"Comments"},{"location":"syntax/#conditionals","text":"[:if the sky is sometimes blue :then this... :else that... ] :if expressions will be expanded at runtime. First, the following text will be fed to an LLM: Please determine if the following statement is true. Do not write any other output, answer just \"true\" or \"false\". The statement: the sky is sometimes blue The answer will determine the execution branch. If the answer is not literally \"true\" or \"false\", an exception will be thrown after a few retries","title":"Conditionals"},{"location":"syntax/#meta-prompting","text":"LLM says: [$ Hi, LLM! How are you today?] The [$ prompt will be executed and its output will be inserted at its position during expansion. This enables powerful techniques of prompt rewriting: [$ [$ Improve this LLM prompt: [:prompt]]] Notice the double nesting of [$ - the improved prompt will be fed back into an LLM.","title":"Meta-prompting"},{"location":"syntax/#modules","text":"Every .metaprompt file is a function. Unbound variables used in a file are its parameters, that must be provided. Hello, [:what]! 
[# `what` is a parameter ] [:who=you] [# `who` is NOT a parameter, because it is assigned before first use] How are [:who] feeling today?","title":"Modules"},{"location":"syntax/#file-imports","text":"The following expression will include ./relative-import.metaprompt file (relative to the directory of the file, NOT to the current working dir): [:use ./relative-import]","title":"File imports"},{"location":"syntax/#package-imports","text":"NOT IMPLEMENTED","title":"Package imports"},{"location":"syntax/#passing-parameters","text":"[:use ./relative-import :someParameter= arbitrary value, potentially using any other MetaPrompt constructs :otherParameter= another value ]","title":"Passing parameters"},{"location":"syntax/#special-variables","text":"MODEL - used to determine active LLM id.","title":"Special variables"},{"location":"tutorial/","text":"","title":"Tutorial"}]}
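The templating use case indexed above (substituting parameter values for `[:subject]`-style variables, as in "Write me a poem about [:subject] in the style of [:style]") can be sketched in Python. This is a minimal illustration of the substitution idea only, not MetaPrompt's actual implementation; the `expand` helper and its error behavior are assumptions.

```python
import re

def expand(template: str, params: dict) -> str:
    """Substitute [:name] placeholders with parameter values.

    Hypothetical sketch of MetaPrompt-style templating; the real language
    also supports assignments, conditionals, imports, and the [$ ...] operator.
    """
    def sub(match):
        name = match.group(1)
        if name not in params:
            # Unbound variables are required parameters in MetaPrompt,
            # so a missing one is an error here.
            raise KeyError(f"missing required parameter: {name}")
        return params[name]
    return re.sub(r"\[:(\w+)\]", sub, template)

# Mirrors the hello.metaprompt example from the docs:
expand("Hello, [:what]!", {"what": "world"})  # → "Hello, world!"
```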
{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"Overview \u00b6 Metaprompt is a template language for LLM prompt automation, reuse and structuring, with support for writing prompts with prompts. It adds a number of syntactic constructs to plaintext prompts: variables conditionals function calls meta-prompting operator etc. These constructs get expanded at run time, producing textual output. Project status \u00b6 !!! This is an early work-in-progress !!! Not all of the described features have been implemented. The repository README will give you more details. Use cases \u00b6 Templating \u00b6 MetaPrompt's basic use case is substituting parameter values instead of variable names embedded in a prompt: Write me a poem about [:subject] in the style of [:style] Prompt rewriting \u00b6 Prompt rewriting is a technique of asking an LLM to create/modify/expand an LLM prompt. Dynamically crafting task-specific prompts based on a set of high level principles Modifying prompts to increase accuracy Securing inputs from prompt injection attacks Selecting the most suitable model based on prompt contents Quick example: [$ You are an LLM prompt engineer. Improve this prompt by adding specific instructions: [:prompt] ] Prompt structuring \u00b6 A module system and a package system enable parameterized prompt reuse and publishing. hello.metaprompt : Hello, [:what]! main.metaprompt : [:use ./hello :what=world] main.metaprompt will evaluate to Hello, world!","title":"Home"},{"location":"#overview","text":"Metaprompt is a template language for LLM prompt automation, reuse and structuring, with support for writing prompts with prompts. It adds a number of syntactic constructs to plaintext prompts: variables conditionals function calls meta-prompting operator etc. These constructs get expanded at run time, producing textual output.","title":"Overview"},{"location":"#project-status","text":"!!! 
This is an early work-in-progress !!! Not all of the described features have been implemented. The repository README will give you more details.","title":"Project status"},{"location":"#use-cases","text":"","title":"Use cases"},{"location":"#templating","text":"MetaPrompt's basic use case is substituting parameter values instead of variable names embedded in a prompt: Write me a poem about [:subject] in the style of [:style]","title":"Templating"},{"location":"#prompt-rewriting","text":"Prompt rewriting is a technique of asking an LLM to create/modify/expand an LLM prompt. Dynamically crafting task-specific prompts based on a set of high level principles Modifying prompts to increase accuracy Securing inputs from prompt injection attacks Selecting the most suitable model based on prompt contents Quick example: [$ You are an LLM prompt engineer. Improve this prompt by adding specific instructions: [:prompt] ]","title":"Prompt rewriting"},{"location":"#prompt-structuring","text":"A module system and a package system enable parameterized prompt reuse and publishing. hello.metaprompt : Hello, [:what]! main.metaprompt : [:use ./hello :what=world] main.metaprompt will evaluate to Hello, world!","title":"Prompt structuring"},{"location":"modules/","text":"MetaPrompt module system is centered around files.","title":"Modules"},{"location":"syntax/","text":"Text \u00b6 A textual prompt is usually a valid metaprompt: Hi, LLM! How are you feeling today? will be expanded to the same string, because it does not contain any MetaPrompt constructs. Variables \u00b6 Here's a variable: [:variable_name]. If a variable is used before first assignment, it is treated as a required prompt parameter automatically. [:variable_name=it can be reassigned to any value, however] [:variable_name=Including a value containing its old value: [:variable_name]] Comments \u00b6 [# Text for the human reader can be written like this. 
Comments must be well-formed metaprompt expressions too - in the future comment parse trees will be used to convey additional semantic info (e.g. documentation). Comments are ignored during evaluation. ] Conditionals \u00b6 [:if the sky is sometimes blue :then this... :else that... ] :if expressions will be expanded at runtime. First, the following text will be fed to an LLM: Please determine if the following statement is true. Do not write any other output, answer just \"true\" or \"false\". The statement: the sky is sometimes blue The answer will determine the execution branch. If the answer is not literally \"true\" or \"false\", an exception will be thrown after a few retries Meta-prompting \u00b6 LLM says: [$ Hi, LLM! How are you today?] The [$ prompt will be executed and its output will be inserted at its position during expansion. This enables powerful techniques of prompt rewriting: [$ [$ Improve this LLM prompt: [:prompt]]] Notice the double nesting of [$ - the improved prompt will be fed back into an LLM. Modules \u00b6 Every .metaprompt file is a function. Unbound variables used in a file are its parameters, that must be provided. Hello, [:what]! [# `what` is a parameter ] [:who=you] [# `who` is NOT a parameter, because it is assigned before first use] How are [:who] feeling today? File imports \u00b6 The following expression will include ./relative-import.metaprompt file (relative to the directory of the file, NOT to the current working dir): [:use ./relative-import] Package imports \u00b6 NOT IMPLEMENTED Passing parameters \u00b6 [:use ./relative-import :someParameter= arbitrary value, potentially using any other MetaPrompt constructs :otherParameter= another value ] Special variables \u00b6 MODEL - used to determine active LLM id.","title":"Syntax"},{"location":"syntax/#text","text":"A textual prompt is usually a valid metaprompt: Hi, LLM! How are you feeling today? 
will be expanded to the same string, because it does not contain any MetaPrompt constructs.","title":"Text"},{"location":"syntax/#variables","text":"Here's a variable: [:variable_name]. If a variable is used before first assignment, it is treated as a required prompt parameter automatically. [:variable_name=it can be reassigned to any value, however] [:variable_name=Including a value containing its old value: [:variable_name]]","title":"Variables"},{"location":"syntax/#comments","text":"[# Text for the human reader can be written like this. Comments must be well-formed metaprompt expressions too - in the future comment parse trees will be used to convey additional semantic info (e.g. documentation). Comments are ignored during evaluation. ]","title":"Comments"},{"location":"syntax/#conditionals","text":"[:if the sky is sometimes blue :then this... :else that... ] :if expressions will be expanded at runtime. First, the following text will be fed to an LLM: Please determine if the following statement is true. Do not write any other output, answer just \"true\" or \"false\". The statement: the sky is sometimes blue The answer will determine the execution branch. If the answer is not literally \"true\" or \"false\", an exception will be thrown after a few retries","title":"Conditionals"},{"location":"syntax/#meta-prompting","text":"LLM says: [$ Hi, LLM! How are you today?] The [$ prompt will be executed and its output will be inserted at its position during expansion. This enables powerful techniques of prompt rewriting: [$ [$ Improve this LLM prompt: [:prompt]]] Notice the double nesting of [$ - the improved prompt will be fed back into an LLM.","title":"Meta-prompting"},{"location":"syntax/#modules","text":"Every .metaprompt file is a function. Unbound variables used in a file are its parameters, that must be provided. Hello, [:what]! 
[# `what` is a parameter ] [:who=you] [# `who` is NOT a parameter, because it is assigned before first use] How are [:who] feeling today?","title":"Modules"},{"location":"syntax/#file-imports","text":"The following expression will include ./relative-import.metaprompt file (relative to the directory of the file, NOT to the current working dir): [:use ./relative-import]","title":"File imports"},{"location":"syntax/#package-imports","text":"NOT IMPLEMENTED","title":"Package imports"},{"location":"syntax/#passing-parameters","text":"[:use ./relative-import :someParameter= arbitrary value, potentially using any other MetaPrompt constructs :otherParameter= another value ]","title":"Passing parameters"},{"location":"syntax/#special-variables","text":"MODEL - used to determine active LLM id.","title":"Special variables"},{"location":"tutorial/","text":"","title":"Tutorial"}]}
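The `[:if ...]` conditional protocol described in the indexed syntax docs (wrap the statement in a fixed true/false prompt, feed it to an LLM, retry on malformed answers, then fail) can be sketched as follows. The `ask_llm` callable is a hypothetical stand-in for the model backend, not part of MetaPrompt; the retry count and exception type are assumptions.

```python
def eval_condition(statement: str, ask_llm, retries: int = 3) -> bool:
    """Evaluate a MetaPrompt [:if ...] condition as described in the docs.

    `ask_llm` is any callable taking a prompt string and returning the
    model's text reply (a stub here; real code would call an LLM API).
    """
    # Exact wrapper prompt quoted in the syntax documentation:
    prompt = (
        "Please determine if the following statement is true. "
        'Do not write any other output, answer just "true" or "false". '
        f"The statement: {statement}"
    )
    for _ in range(retries):
        answer = ask_llm(prompt).strip().lower()
        if answer in ("true", "false"):
            return answer == "true"
    # Per the docs: if the answer is never literally "true"/"false",
    # an exception is thrown after a few retries.
    raise RuntimeError('LLM did not answer "true" or "false"')

# Usage with a stubbed model:
eval_condition("the sky is sometimes blue", lambda p: "true")  # → True
```

The strict two-token answer format is what makes branching deterministic: anything else is treated as a transient failure rather than silently coerced into a boolean.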