Initial commit

Zach committed 2025-04-02 16:56:35 -04:00 (committed by GitHub)
commit 41ecb66b4a
13 changed files with 5254 additions and 0 deletions

1664
.clinerules Normal file

File diff suppressed because it is too large.

1664
.cursorrules Normal file

File diff suppressed because it is too large.

92
.gitignore vendored Normal file

@@ -0,0 +1,92 @@
# Dependencies
node_modules/
vendor/
.pnp/
.pnp.js

# Build outputs
dist/
build/
out/
*.pyc
__pycache__/

# Environment files
.env
.env.local
.env.*.local
.env.development
.env.test
.env.production

# IDE - VSCode
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json

# IDE - JetBrains
.idea/
*.iml
*.iws
*.ipr

# IDE - Eclipse
.project
.classpath
.settings/

# Logs
logs/
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Operating System
.DS_Store
Thumbs.db
*.swp
*.swo

# Testing
coverage/
.nyc_output/

# Temporary files
*.tmp
*.temp
.cache/

# Compiled files
*.com
*.class
*.dll
*.exe
*.o
*.so

# Package files
*.7z
*.dmg
*.gz
*.iso
*.jar
*.rar
*.tar
*.zip

# Database
*.sqlite
*.sqlite3
*.db

# Optional npm cache directory
.npm

# Optional eslint cache
.eslintcache

# Optional REPL history
.node_repl_history

1664
.windsurfrules Normal file

File diff suppressed because it is too large.

20
README.md Normal file

@@ -0,0 +1,20 @@
<h1 align="center">Agentic Coding - Project Template</h1>
<p align="center">
<a href="https://github.com/The-Pocket/PocketFlow" target="_blank">
<img
src="./assets/banner.png" width="600"
/>
</a>
</p>
This is a project template for Agentic Coding with [Pocket Flow](https://github.com/The-Pocket/PocketFlow), a 100-line LLM framework, and Cursor.
- We have included the [.cursorrules](.cursorrules) file to let Cursor AI help you build LLM projects.
- Want to learn how to build LLM projects with Agentic Coding?
  - Check out the [Agentic Coding Guidance](https://the-pocket.github.io/PocketFlow/guide.html)
  - Check out the [YouTube Tutorial](https://www.youtube.com/@ZacharyLLM?sub_confirmation=1)

BIN
assets/banner.png Normal file

Binary file not shown (size: 3.9 MiB).

79
docs/design.md Normal file

@@ -0,0 +1,79 @@
# Design Doc: Your Project Name
> Please DON'T remove notes for AI
## Requirements
> Notes for AI: Keep it simple and clear.
> If the requirements are abstract, write concrete user stories.
## Flow Design
> Notes for AI:
> 1. Consider the design patterns of Agent, Map-Reduce, RAG, and Workflow. Apply them if they fit.
> 2. Present a concise, high-level description of the workflow.
### Applicable Design Patterns:
1. **Map-Reduce**: map the file into chunks, summarize each chunk, then reduce the chunk summaries into a final summary.
2. **Agent**: an agentic file finder
    - *Context*: the entire summary of the file
    - *Action*: find the file
### Flow High-Level Design:
1. **First Node**: This node is for ...
2. **Second Node**: This node is for ...
3. **Third Node**: This node is for ...
```mermaid
flowchart TD
    firstNode[First Node] --> secondNode[Second Node]
    secondNode --> thirdNode[Third Node]
```
## Utility Functions
> Notes for AI:
> 1. Understand the utility function definition thoroughly by reviewing the doc.
> 2. Include only the necessary utility functions, based on nodes in the flow.
1. **Call LLM** (`utils/call_llm.py`)
    - *Input*: prompt (str)
    - *Output*: response (str)
    - Generally used by most nodes for LLM tasks
2. **Embedding** (`utils/get_embedding.py`)
    - *Input*: str
    - *Output*: a vector of 3072 floats
    - Used by the second node to embed text (see the sketch below)
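
As a concrete illustration of the second utility, here is a minimal sketch of `utils/get_embedding.py`. It assumes the OpenAI embeddings API and the `text-embedding-3-large` model (which returns 3072-dimensional vectors, matching the output described above); the model choice and function name are illustrative, not prescribed by the template.

```python
# utils/get_embedding.py -- a minimal sketch, assuming the OpenAI embeddings API.
# The model choice (text-embedding-3-large, 3072 dimensions) is an assumption.
from openai import OpenAI

def get_embedding(text: str) -> list[float]:
    client = OpenAI()  # reads OPENAI_API_KEY from the environment by default
    resp = client.embeddings.create(
        model="text-embedding-3-large",
        input=text
    )
    return resp.data[0].embedding  # a list of 3072 floats for this model
```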
## Node Design
### Shared Memory
> Notes for AI: Try to minimize data redundancy
The shared memory structure is organized as follows:
```python
shared = {
    "key": "value"
}
```
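
For reference, the QA example later in this commit uses exactly two keys; a store like the following keeps each value in one place, with no redundancy:

```python
# Illustrative shared store for the QA flow in this template;
# each value lives under exactly one key.
shared = {
    "question": None,  # filled by GetQuestionNode
    "answer": None,    # filled by AnswerNode
}
```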
### Node Steps
> Notes for AI: Carefully decide whether to use Batch/Async Node/Flow.
1. First Node
    - *Purpose*: Provide a short explanation of the node's function
    - *Type*: Decide between Regular, Batch, or Async (a Batch sketch follows this list)
    - *Steps*:
        - *prep*: Read "key" from the shared store
        - *exec*: Call the utility function
        - *post*: Write "key" to the shared store
2. Second Node
...
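
To make the Regular-vs-Batch decision concrete, below is a minimal sketch of a Batch-style node for the map-reduce pattern above. It assumes pocketflow exposes a `BatchNode` whose `exec` runs once per item returned by `prep`; the class name and chunk size are illustrative assumptions.

```python
# A minimal sketch, assuming pocketflow provides BatchNode, where exec()
# runs once per item of the iterable returned by prep(). Names are illustrative.
from pocketflow import BatchNode
from utils.call_llm import call_llm

class SummarizeChunksNode(BatchNode):
    def prep(self, shared):
        # Split the document into fixed-size chunks; exec() runs on each one
        text = shared["document"]
        return [text[i:i + 2000] for i in range(0, len(text), 2000)]

    def exec(self, chunk):
        return call_llm(f"Summarize this text:\n{chunk}")

    def post(self, shared, prep_res, exec_res_list):
        # exec_res_list holds one summary per chunk -- the input to the reduce step
        shared["chunk_summaries"] = exec_res_list
        return "default"
```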

14
flow.py Normal file

@@ -0,0 +1,14 @@
from pocketflow import Flow
from nodes import GetQuestionNode, AnswerNode

def create_qa_flow():
    """Create and return a question-answering flow."""
    # Create nodes
    get_question_node = GetQuestionNode()
    answer_node = AnswerNode()

    # Connect nodes in sequence
    get_question_node >> answer_node

    # Create flow starting with input node
    return Flow(start=get_question_node)
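
The `>>` operator above chains nodes on the `"default"` action returned from `post`. When a flow needs to branch, PocketFlow routes on named actions with the `node - "action" >> next_node` syntax; a hedged sketch with hypothetical nodes:

```python
# A hedged sketch of action-based branching, assuming PocketFlow's
# `node - "action" >> next_node` syntax. All node classes here are hypothetical.
from pocketflow import Node, Flow

class ReviewNode(Node):
    def post(self, shared, prep_res, exec_res):
        # The returned action string selects the outgoing edge
        return "approved" if shared.get("ok") else "rejected"

class PublishNode(Node):
    def exec(self, _):
        print("published")

class ReviseNode(Node):
    def exec(self, _):
        print("revising")

review = ReviewNode()
review - "approved" >> PublishNode()
review - "rejected" >> ReviseNode()

Flow(start=review).run({"ok": True})  # prints "published"
```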

16
main.py Normal file

@@ -0,0 +1,16 @@
from flow import create_qa_flow

# Example main function
# Please replace this with your own main function
def main():
    shared = {
        "question": "In one sentence, what's the end of the universe?",
        "answer": None
    }

    qa_flow = create_qa_flow()
    qa_flow.run(shared)

    print("Question:", shared["question"])
    print("Answer:", shared["answer"])

if __name__ == "__main__":
    main()

26
nodes.py Normal file

@@ -0,0 +1,26 @@
from pocketflow import Node
from utils.call_llm import call_llm

class GetQuestionNode(Node):
    def exec(self, _):
        # Get question directly from user input
        user_question = input("Enter your question: ")
        return user_question

    def post(self, shared, prep_res, exec_res):
        # Store the user's question
        shared["question"] = exec_res
        return "default"  # Go to the next node

class AnswerNode(Node):
    def prep(self, shared):
        # Read question from shared
        return shared["question"]

    def exec(self, question):
        # Call LLM to get the answer
        return call_llm(question)

    def post(self, shared, prep_res, exec_res):
        # Store the answer in shared
        shared["answer"] = exec_res

1
requirements.txt Normal file

@@ -0,0 +1 @@
pocketflow>=0.0.1

0
utils/__init__.py Normal file

14
utils/call_llm.py Normal file

@@ -0,0 +1,14 @@
from openai import OpenAI

# Learn more about calling the LLM: https://the-pocket.github.io/PocketFlow/utility_function/llm.html
def call_llm(prompt):
    client = OpenAI(api_key="YOUR_API_KEY_HERE")
    r = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}]
    )
    return r.choices[0].message.content

if __name__ == "__main__":
    prompt = "What is the meaning of life?"
    print(call_llm(prompt))
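
Hard-coding the key works for a quick start, but a safer variant reads it from the environment; a minimal sketch, assuming the conventional `OPENAI_API_KEY` variable:

```python
import os
from openai import OpenAI

def call_llm(prompt):
    # Avoid hard-coding secrets: read the API key from the environment instead
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    r = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}]
    )
    return r.choices[0].message.content
```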