update nav

This commit is contained in:
zachary62
2025-04-04 14:03:22 -04:00
parent 2fa60fe7d5
commit 0426110e66
24 changed files with 261 additions and 32 deletions

@@ -1,3 +1,10 @@
---
layout: default
title: "Module & Program"
parent: "DSPy"
nav_order: 1
---
# Chapter 1: Modules and Programs - Building Blocks of DSPy
Welcome to the first chapter of our journey into DSPy! We're excited to have you here.

@@ -1,3 +1,10 @@
---
layout: default
title: "Signature"
parent: "DSPy"
nav_order: 2
---
# Chapter 2: Signatures - Defining the Task
In [Chapter 1: Modules and Programs](01_module___program.md), we learned that `Module`s are like Lego bricks that perform specific tasks, often using Language Models ([LM](05_lm__language_model_client_.md)). We saw how `Program`s combine these modules.
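As a quick refresher, the "Lego brick" idea can be sketched in plain Python. This is a conceptual sketch only, not DSPy's actual `Module`/`Program` classes: each module is a small callable step, and a program wires modules together by feeding one module's output into the next.

```python
# Conceptual sketch of modules and programs (not DSPy's real classes).
# A "module" is a reusable step; a "program" composes modules in sequence.

def uppercase_module(text: str) -> str:
    """A tiny module: transforms its input."""
    return text.upper()

def exclaim_module(text: str) -> str:
    """Another module: adds emphasis to its input."""
    return text + "!"

def program(text: str) -> str:
    """A program chains modules, passing outputs along as inputs."""
    return exclaim_module(uppercase_module(text))

result = program("hello dspy")  # the two bricks run in order
```

In real DSPy, the modules would typically call a Language Model inside, but the composition pattern is the same.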

@@ -1,3 +1,10 @@
---
layout: default
title: "Example"
parent: "DSPy"
nav_order: 3
---
# Chapter 3: Example - Your Data Points
In [Chapter 2: Signature](02_signature.md), we learned how to define the *task* for a DSPy module using a `Signature`, which specifies the inputs, outputs, and instructions. It's like writing a recipe card.
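To recap the recipe-card idea, here is a plain-Python sketch of what a signature captures. The class and field names below are illustrative only, not DSPy's real `Signature` API:

```python
from dataclasses import dataclass, field

# Conceptual sketch: a "signature" is a recipe card naming the task's
# instructions, input fields, and output fields (illustrative names,
# not DSPy's actual Signature API).

@dataclass
class SignatureSketch:
    instructions: str
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)

translate = SignatureSketch(
    instructions="Translate the English sentence into French.",
    inputs=["english_sentence"],
    outputs=["french_sentence"],
)
```

The signature says nothing about *how* the task gets done; it only names what goes in, what comes out, and what the model is asked to do.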

@@ -1,3 +1,10 @@
---
layout: default
title: "Predict"
parent: "DSPy"
nav_order: 4
---
# Chapter 4: Predict - The Basic LM Caller
In [Chapter 3: Example](03_example.md), we learned how to create `dspy.Example` objects to represent our data points, like flashcards holding an input and its corresponding desired output. We also saw in [Chapter 2: Signature](02_signature.md) how to define the *task* itself using `dspy.Signature`.
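The flashcard idea boils down to this: a bundle of fields plus a marker for which fields are inputs (the front of the card) versus labels (the back). Here is a toy plain-dict sketch of that idea, not the real `dspy.Example` API:

```python
# Conceptual sketch of an example as a flashcard: named fields plus a
# record of which fields count as inputs (toy version, not dspy.Example).

def make_example(input_keys, **fields):
    return {"fields": fields, "input_keys": set(input_keys)}

card = make_example(
    ["question"],
    question="What is the capital of France?",
    answer="Paris",
)

# Split the card into its "front" (inputs) and "back" (expected outputs).
inputs = {k: v for k, v in card["fields"].items() if k in card["input_keys"]}
labels = {k: v for k, v in card["fields"].items() if k not in card["input_keys"]}
```

Marking input fields matters because, at prediction time, the program only gets to see the front of the card; the back is reserved for checking its answer.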

@@ -1,3 +1,10 @@
---
layout: default
title: "LM (Language Model Client)"
parent: "DSPy"
nav_order: 5
---
# Chapter 5: LM (Language Model Client) - The Engine Room
In [Chapter 4: Predict](04_predict.md), we saw how `dspy.Predict` takes a [Signature](02_signature.md) and input data to magically generate an output. We used our `translator` example:
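The "magic" of a `Predict`-style call is less mysterious than it looks. Conceptually, it renders a prompt from the task instructions and the input fields, then hands that prompt to an LM. Here is a toy sketch of that flow; `fake_lm` is a stand-in, not a real model client:

```python
# Conceptual sketch of what a Predict-style call does under the hood:
# build a prompt from instructions plus inputs, then call an LM with it.

def fake_lm(prompt: str) -> str:
    # Stand-in for a real model: always returns a canned translation.
    return "Bonjour le monde"

def predict(instructions: str, lm, **inputs) -> str:
    lines = [instructions]
    for name, value in inputs.items():
        lines.append(f"{name}: {value}")
    prompt = "\n".join(lines)  # the rendered prompt the LM actually sees
    return lm(prompt)

translation = predict(
    "Translate the English sentence into French.",
    fake_lm,
    english_sentence="Hello world",
)
```

Swap `fake_lm` for a real model client and you have the essence of the translator example.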

@@ -1,3 +1,10 @@
---
layout: default
title: "RM (Retrieval Model Client)"
parent: "DSPy"
nav_order: 6
---
# Chapter 6: RM (Retrieval Model Client) - Your Program's Librarian
In [Chapter 5: LM (Language Model Client)](05_lm__language_model_client_.md), we learned how to connect our DSPy programs to the powerful "brain" of a Language Model (LM) using the LM Client. The LM is great at generating creative text, answering questions based on its vast training data, and reasoning.
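As a reminder of what the LM Client buys us, here is a minimal plain-Python sketch of the pattern (illustrative only, not DSPy's real client classes): one uniform way to call any backend, plus a history of calls for inspection and debugging.

```python
# Conceptual sketch of an LM client: a uniform call interface around any
# model backend, with a call history (not DSPy's actual implementation).

class LMClientSketch:
    def __init__(self, backend):
        self.backend = backend   # any callable: prompt -> completion
        self.history = []        # (prompt, completion) pairs, for inspection

    def __call__(self, prompt: str) -> str:
        completion = self.backend(prompt)
        self.history.append((prompt, completion))
        return completion

# A toy backend that just echoes; a real backend would call an API.
client = LMClientSketch(lambda p: f"echo: {p}")
reply = client("What is DSPy?")
```

Because every module talks to the model through one client object, swapping model providers means changing one line, not rewriting your program.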

@@ -1,3 +1,10 @@
---
layout: default
title: "Evaluate"
parent: "DSPy"
nav_order: 7
---
# Chapter 7: Evaluate - Grading Your Program
In the previous chapter, [Chapter 6: RM (Retrieval Model Client)](06_rm__retrieval_model_client_.md), we learned how to connect our DSPy program to external knowledge sources using Retrieval Models (RMs). We saw how combining RMs with Language Models (LMs) allows us to build sophisticated programs like Retrieval-Augmented Generation (RAG) systems.
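The RAG pattern from that chapter can be summarized in a few lines: retrieve relevant passages first, then let the LM answer with those passages in its prompt. Below is a toy sketch with a keyword-overlap "retriever" standing in for a real RM client:

```python
# Conceptual sketch of retrieval-augmented generation: fetch relevant
# passages, then prompt the LM with them. The keyword-overlap scorer is
# a toy stand-in for a real retrieval model.

DOCS = [
    "Paris is the capital of France.",
    "The Eiffel Tower is in Paris.",
    "Berlin is the capital of Germany.",
]

def retrieve(query: str, k: int = 2):
    # Score each document by how many query words it contains.
    words = query.lower().split()
    scored = [(sum(w in d.lower() for w in words), d) for d in DOCS]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [d for score, d in scored[:k] if score > 0]

def rag_answer(query: str, lm) -> str:
    context = "\n".join(retrieve(query))
    return lm(f"Context:\n{context}\n\nQuestion: {query}")

passages = retrieve("capital of France")
```

Evaluating a pipeline like this, rather than eyeballing its answers, is exactly what this chapter is about.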

@@ -1,3 +1,10 @@
---
layout: default
title: "Teleprompter & Optimizer"
parent: "DSPy"
nav_order: 8
---
# Chapter 8: Teleprompter / Optimizer - Your Program's Coach
Welcome to Chapter 8! In [Chapter 7: Evaluate](07_evaluate.md), we learned how to grade our DSPy programs using metrics and datasets to see how well they perform. That's great for knowing our score, but what if the score isn't high enough?
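Before meeting the coach, recall what the grading from Chapter 7 looks like in miniature. This toy sketch (an exact-match metric averaged over a dev set, not `dspy.Evaluate` itself) shows the score an optimizer will try to push up:

```python
# Conceptual sketch of evaluation: run the program on every dev example
# and average a metric (toy exact-match version, not dspy.Evaluate).

def exact_match(prediction: str, gold: str) -> float:
    return 1.0 if prediction.strip().lower() == gold.strip().lower() else 0.0

def evaluate(program, devset, metric) -> float:
    scores = [metric(program(ex["question"]), ex["answer"]) for ex in devset]
    return sum(scores) / len(scores)

devset = [
    {"question": "2 + 2?", "answer": "4"},
    {"question": "Capital of France?", "answer": "Paris"},
]

# A deliberately imperfect "program" that always answers "4",
# so the score lands between 0 and 1.
score = evaluate(lambda q: "4", devset, exact_match)
```

A score of 0.5 here is the "not high enough" situation an optimizer exists to fix.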

@@ -1,3 +1,10 @@
---
layout: default
title: "Adapter"
parent: "DSPy"
nav_order: 9
---
# Chapter 9: Adapter - The Universal Translator
Welcome to Chapter 9! In [Chapter 8: Teleprompter / Optimizer](08_teleprompter___optimizer.md), we saw how DSPy can automatically optimize our programs by finding better prompts or few-shot examples. We ended up with a `compiled_program` that should perform better.
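At its simplest, the optimization from Chapter 8 is a search: propose candidate instructions, score each on training data, keep the winner. Real DSPy teleprompters are far more sophisticated, but this toy sketch shows the core loop:

```python
# Conceptual sketch of prompt optimization: score candidate instructions
# on a tiny task and keep the best one (a toy search, not a real
# DSPy teleprompter).

def toy_lm(prompt: str) -> str:
    # Stand-in model: answers correctly only when the prompt says "math".
    return "4" if "math" in prompt else "I don't know"

def score(instruction: str) -> float:
    answer = toy_lm(f"{instruction}\nQuestion: 2 + 2?")
    return 1.0 if answer == "4" else 0.0

candidates = [
    "Answer the question.",
    "You are a math tutor. Answer the question.",
]
best_instruction = max(candidates, key=score)
```

The compiled program is just the original program carrying whichever instructions (and few-shot examples) won this search.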

@@ -1,3 +1,10 @@
---
layout: default
title: "Settings"
parent: "DSPy"
nav_order: 10
---
# Chapter 10: Settings - Your Program's Control Panel
Welcome to the final chapter of our introductory DSPy tutorial! In [Chapter 9: Adapter](09_adapter.md), we saw how Adapters act as translators, allowing our DSPy programs to communicate seamlessly with different types of Language Models (LMs).
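The translator role of an Adapter is easy to picture with a toy sketch: the same task request gets rendered differently depending on what the model expects, such as a list of chat messages versus one flat completion prompt. The functions below are illustrative only, not DSPy's real `Adapter` API:

```python
# Conceptual sketch of adapters: render one request in the shape each
# kind of model expects (illustrative, not DSPy's Adapter classes).

def chat_adapter(instructions: str, question: str):
    # Chat models expect a list of role-tagged messages.
    return [
        {"role": "system", "content": instructions},
        {"role": "user", "content": question},
    ]

def completion_adapter(instructions: str, question: str) -> str:
    # Completion models expect one flat text prompt.
    return f"{instructions}\n\nQuestion: {question}\nAnswer:"

messages = chat_adapter("Answer concisely.", "What is DSPy?")
prompt = completion_adapter("Answer concisely.", "What is DSPy?")
```

Because the adapter owns this translation step, the rest of the program never needs to know which kind of model it is talking to, which is exactly the sort of choice a central Settings control panel lets you configure in one place.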