
Commit 1a5f191

updated tutorial (#627)
Signed-off-by: Mandana Vaziri <[email protected]>
1 parent 3465894 commit 1a5f191

26 files changed: +267 -104 lines changed

README.md

Lines changed: 13 additions & 5 deletions
@@ -65,6 +65,12 @@ other platforms, downloads are available
 may also kick the tires with a web version of the GUI
 [here](https://pdl.s3-web.us-east.cloud-object-storage.appdomain.cloud/).
 
+To generate a trace for use in the GUI:
+```bash
+pdl --trace <file.json> <my-example.pdl>
+```
+
+
 <img src="docs/assets/ui.gif" alt="PDL GUI"/>
 
 ## Key Features
@@ -198,13 +204,15 @@ text:
 ```
 
 
-## Trace Generation and Live Document Visualization
+## Trace Telemetry
 
-```bash
-pdl --trace <file.json> <my-example.pdl>
-```
+PDL includes experimental support for gathering trace telemetry. This can
+be used for debugging or performance analysis, and to see the shape of prompts sent by LiteLLM to models.
+
+For more information see [here](https://github.com/IBM/prompt-declaration-language/blob/main/docs/telemetry.md).
+
+<img src="https://ibm.github.io/prompt-declaration-language/assets/telemetry.png" alt="Trace Telemetry"/>
 
-Then, you can either download the GUI, or upload trace files to the [Live Document Viewer](https://pdl.s3-web.us-east.cloud-object-storage.appdomain.cloud/) for visual debugging, trace exploration, and live programming.
 
 
 ## Contributing

docs/README.md

Lines changed: 1 addition & 2 deletions
@@ -11,7 +11,7 @@ PDL is based on the premise that interactions with an LLM are mainly for the pur
 
 PDL provides the following features:
 
-- Ability to use any LLM locally or remotely via [LiteLLM](https://www.litellm.ai/), including [IBM's watsonx](https://www.ibm.com/watsonx)
+- Ability to use any LLM locally or remotely via [LiteLLM](https://www.litellm.ai/), including [IBM's watsonx](https://www.ibm.com/watsonx), as well as the [Granite IO Processor](https://github.com/ibm-granite/granite-io) framework
 - Ability to templatize not only prompts for one LLM call, but also composition of LLMs with tools (code and APIs). Templates can encompass tasks of larger granularity than a single LLM call
 - Control structures: variable definitions and use, conditionals, loops, functions
 - Ability to read from files and stdin, including JSON data
@@ -32,7 +32,6 @@ See below for a quick reference, followed by [installation notes](#interpreter_i
 
 (See also [PDF version](https://github.com/IBM/prompt-declaration-language/blob/main/docs/assets/pdl_quick_reference.pdf).)
 
-<b>Pro Tip</b>: When writing loops and conditionals with `repeat`, `for`, and `if-then-else`, start the body of the loop or conditional (`then`/`else`) with `text` in order to see the results of every block in the body. See for example this [file](https://github.com/IBM/prompt-declaration-language/blob/main/examples/tutorial/conditionals_loops.pdl).
 
 ## Interpreter Installation
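As context for the feature list above (not part of this commit's diff), here is a minimal PDL sketch combining several of those features: a variable definition, a read from stdin, and a model call routed through LiteLLM. The model id and prompt strings are placeholders borrowed from the tutorial examples touched by this commit.

```yaml
# Sketch only, not in the diff: combines a variable definition, a stdin read,
# and a LiteLLM model call, using the model id from the updated tutorial examples.
description: Feature sampler
defs:
  NAME:
    read:
    message: "What is your name?\n"
text:
- "Hello ${ NAME }!\n"
- model: ollama/granite3.2:2b
  input: "Say hello to ${ NAME } in French."
  parameters:
    stop: ["\n"]
```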

docs/assets/telemetry.png

11.6 KB

docs/tutorial.md

Lines changed: 119 additions & 52 deletions
Large diffs are not rendered by default.

examples/hello/hello-while.pdl

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+defs:
+  i: 0
+while: ${ i < 3 }
+repeat:
+  defs:
+    i: ${i + 1}
+  text: ${i}

examples/intrinsics/demo-hallucination.pdl

Lines changed: 10 additions & 12 deletions
@@ -78,15 +78,13 @@ text:
 
 
 The answer is: ${ out[0].sentence }
-
-- if: ${ out[0].meta.hallucination_level == "low" }
-  then: |
-    I am not hallucinating, promise!
-  # (The level may be "high" or "unanswerable")
-  else: |
-    Totally hallucinating, sorry!
-- if: ${ out[0].meta.citation }
-  then: |
-    The citation is: ${ out[0].meta.citation.snippet }
-  else: |
-    No citation matched the user query.
+- match: ${out[0].meta.hallucination_level}
+  with:
+  - case: "high"
+    then: Totally hallucinating, sorry!
+  - case: "low"
+    if: ${ out[0].meta.citation }
+    then: |
+      I am not hallucinating, promise!
+      The citation is: ${ out[0].meta.citation.snippet }
+  - then: Not sure if I am hallucinating...
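The `match` block above replaces the chained `if-then-else` form. As a stand-alone illustration (not part of the commit; the variable and messages are hypothetical), here is a minimal sketch assuming only the semantics visible in the diff: cases are tried in order, a case may carry an extra `if` guard, and a trailing `then` without a `case` acts as the default.

```yaml
# Hypothetical sketch of the match form, not in this commit.
defs:
  level: "low"
text:
- match: ${ level }
  with:
  - case: "high"
    then: "High hallucination risk\n"
  - case: "low"
    then: "Low hallucination risk\n"
  - then: "Level is neither high nor low\n"
```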

examples/rag/tfidf_rag.pdl

Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
+description: Retrieval-augmented generation for NL-to-Code generation task.
+text:
+- lang: python
+  code: | # initialize PDL_SESSION.vec_db and PDL_SESSION.embed() function
+    import datasets, sklearn.feature_extraction.text
+    train_in = datasets.load_dataset("mbpp", "sanitized", split="train")
+    corpus = [row["prompt"] for row in train_in]
+    tfidf = sklearn.feature_extraction.text.TfidfVectorizer().fit(corpus)
+    def embed(text):
+        singleton_batch = [text]
+        sparse_result = tfidf.transform(raw_documents=singleton_batch)
+        return sparse_result.toarray().flatten()
+    train_em = train_in.map(lambda row: {"embeddings": embed(row["prompt"])})
+    PDL_SESSION.vec_db = train_em.add_faiss_index("embeddings")
+    PDL_SESSION.embed = embed
+    result = ""
+- def: TEST_PROMPT
+  text: >-
+    Write a python function to remove first and last occurrence of a
+    given character from the string.
+  contribute: []
+- def: RETRIEVED
+  lang: python
+  spec: {prompt: [str], code: [str]}
+  code: |
+    key = PDL_SESSION.embed("${ TEST_PROMPT }")
+    nearest = PDL_SESSION.vec_db.get_nearest_examples("embeddings", key, 5)
+    result = {col: nearest.examples[col] for col in ["prompt", "code"]}
+  contribute: []
+- |
+  Given the text after "Q:", generate a Python function after "A:".
+
+  Here are some examples, complete the last one:
+- for:
+    prompt: ${ RETRIEVED.prompt }
+    code: ${ RETRIEVED.code }
+  repeat: |
+
+    Q: ${ prompt }
+    A: ```${ code }```
+- |-
+
+  Q: ${ TEST_PROMPT }
+  A:
+- model: replicate/ibm-granite/granite-3.0-8b-instruct

examples/tutorial/calling_apis.pdl

Lines changed: 2 additions & 2 deletions
@@ -2,7 +2,7 @@ description: Using a weather API and LLM to make a small weather app
 text:
 - def: QUERY
   text: "What is the weather in Madrid?\n"
-- model: replicate/ibm-granite/granite-3.1-8b-instruct
+- model: ollama/granite3.2:2b
   input: |
     Extract the location from the question.
     Question: What is the weather in London?
@@ -25,7 +25,7 @@ text:
   def: WEATHER
   parser: json
   contribute: []
-- model: replicate/ibm-granite/granite-3.1-8b-instruct
+- model: ollama/granite3.2:2b
   input: |
     Explain the weather from the following JSON:
     ${ WEATHER }

examples/tutorial/calling_llm.pdl

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 description: Hello world calling a model
 text:
 - "Hello\n"
-- model: ollama/granite-code:8b
+- model: ollama/granite3.2:2b
   parameters:
     stop: ['!']
Lines changed: 1 addition & 1 deletion
Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 description: Hello world calling a model
 text:
 - "Hello\n"
-- model: ollama/granite-code:8b
+- model: ollama/granite3.2:2b
   input:
     Translate the word 'Hello' to French

examples/tutorial/calling_llm_with_input_messages.pdl

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 description: Hello world calling a model
 text:
 - "Hello\n"
-- model: ollama/granite-code:8b
+- model: ollama/granite3.2:2b
   input:
     array:
     - role: system

examples/tutorial/conditionals_loops.pdl

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ text:
   contribute: [context]
 - repeat:
     text:
-    - model: ollama/granite-code:8b
+    - model: ollama/granite3.2:2b
     - read:
       def: eval
      message: "\nIs this a good answer[yes/no]?\n"

examples/tutorial/data_block.pdl

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ defs:
   TRUTH:
     read: ./ground_truth.txt
 lastOf:
-- model: ollama/granite-code:8b
+- model: ollama/granite3.2:2b
   def: EXPLANATION
   input:
     |

examples/tutorial/function_definition.pdl

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ text:
   return:
     lastOf:
     - "\nTranslate the sentence '${ sentence }' to ${ language }.\n"
-    - model: ollama/granite-code:8b
+    - model: ollama/granite3.2:2b
       parameters:
         stop: ["\n"]
         temperature: 0

examples/tutorial/grouping_definitions.pdl

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ defs:
   return:
     lastOf:
     - "\nTranslate the sentence '${ sentence }' to ${ language }.\n"
-    - model: ollama/granite-code:8b
+    - model: ollama/granite3.2:2b
       parameters:
         stop: ["\n"]
 text:

examples/tutorial/import.pdl

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+defs:
+  lib:
+    import: import_lib
+text:
+- call: ${ lib.a }
+  args:
+    arg: Bye!

examples/tutorial/import_lib.pdl

Lines changed: 16 additions & 0 deletions
@@ -0,0 +1,16 @@
+
+defs:
+  b:
+    function:
+      arg: str
+    return:
+      ${ arg }
+
+  a:
+    function:
+      arg: str
+    return:
+      call: ${ b }
+      args:
+        pdl_context: []
+        arg: ${ arg }

examples/tutorial/include.pdl

Lines changed: 0 additions & 20 deletions
This file was deleted.

examples/tutorial/model_chaining.pdl

Lines changed: 2 additions & 2 deletions
@@ -1,10 +1,10 @@
 description: Model chaining
 text:
 - "Hello\n"
-- model: ollama/granite-code:8b
+- model: ollama/granite3.2:2b
   parameters:
     stop: ["!"]
 - "\nDid you just say Hello?\n"
-- model: ollama/granite-code:8b
+- model: ollama/granite3.2:2b
   parameters:
     stop: ["!"]

examples/tutorial/muting_block_output.pdl

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ defs:
   text:
   - text: "\nTranslate the sentence '${ sentence }' to ${ language }.\n"
     contribute: [context]
-  - model: ollama/granite-code:8b
+  - model: ollama/granite3.2:2b
     parameters:
       stop: ["\n"]
 text:

examples/tutorial/parser_findall.pdl

Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
+text: "1 -- 2 -- 3 -- 4"
+parser:
+  regex: '[0-9]+'
+  mode: findall

examples/tutorial/parser_regex.pdl

Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+description: Hello world with parser using regex
+text:
+- model: ollama/granite-code:8b
+  input: "Hello,"
+  parameters:
+    # Tell the LLM to stop after generating an exclamation point.
+    stop: ['!']
+  spec: {"name": str}
+  parser:
+    spec:
+      name: str
+    regex: '\s*(?P<name>.*)\s*'

examples/tutorial/repeat.pdl

Lines changed: 13 additions & 0 deletions
@@ -0,0 +1,13 @@
+description: repeat loop with multiple conditions
+defs:
+  numbers:
+    data: [1, 2, 3, 4]
+  names:
+    data: ["Bob", "Carol", "David", "Ernest"]
+for:
+  number: ${ numbers }
+  name: ${ names }
+repeat:
+  "${ name }'s number is ${ number }\n"
+until: ${ name == "Carol"}
+max_iterations: 1

examples/tutorial/variable_def_use.pdl

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 description: Hello world with variable def and use
 text:
 - "Hello\n"
-- model: ollama/granite-code:8b
+- model: ollama/granite3.2:2b
   def: GEN
   parameters:
     stop: ['!']

examples/tutorial/while.pdl

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+defs:
+  i: 0
+while: ${ i < 3 }
+repeat:
+  defs:
+    i: ${i + 1}
+  text: ${i}
