feat: new structure

Florian Hönicke
2023-04-14 16:07:05 +02:00
parent e5a9336619
commit a65b5ff9df
22 changed files with 436 additions and 247 deletions

README.md

@@ -1,8 +1,7 @@
 <h1 align="center">
-GPT Deploy: One line to create them all 🧙🚀
+GPT Deploy: One line to generate them all 🧙🚀
 </h1>
 <p align="center">
 <img src="res/gpt-deploy-logo.png" alt="Jina NOW logo" width="150px">
 </p>
@@ -53,21 +52,35 @@ If you set the environment variable `OPENAI_API_KEY`, the configuration step can
 Your api key must have access to gpt-4 to use this tool.
 We are working on a way to use gpt-3.5-turbo as well.
-### Create Microservice
+### Generate Microservice
 ```bash
-gptdeploy create --description "Given a PDF, return its text" --test "https://www.africau.edu/images/default/sample.pdf"
+gptdeploy generate --description "<description of the microservice>" --test "<specification of a test scenario>" --path .
 ```
-To create your personal microservice two things are required:
+To generate your personal microservice three things are required:
 - A `description` of the task you want to accomplish.
 - A `test` scenario that ensures the microservice works as expected.
+- A `path` on the local drive where the microservice will be generated.
 The creation process should take between 5 and 15 minutes.
 During this time, GPT iteratively builds your microservice until it finds a strategy that makes your test scenario pass.
-Once the microservice is created and deployed, you can test it using the generated Streamlit playground.
-The deployment is made on the Jina`s infrastructure.
-When creating a Jina account, you get some free credits, which you can use to deploy your microservice ($0.025/hour).
-Be aware that the costs you have to pay for openai vary between $0.50 and $3.00 per microservice.
-If you run out of credits, you can purchase more.
+### Run Microservice
+```bash
+gptdeploy run --path <path to microservice>
+```
+### Deploy Microservice
+It is required to have a [Jina account](https://cloud.jina.ai/) to deploy your microservice.
+```bash
+gptdeploy deploy --path <path to microservice>
+```
+[//]: # (Once the microservice is generated and deployed, you can test it using the generated Streamlit playground.)
+[//]: # (The deployment is made on Jina's infrastructure.)
+[//]: # (When creating a Jina account, you get some free credits, which you can use to deploy your microservice &#40;$0.025/hour&#41;.)
+[//]: # (Be aware that the costs you have to pay for OpenAI vary between $0.50 and $3.00 per microservice.)
+[//]: # (If you run out of credits, you can purchase more.)
 ### Delete Microservice
 To save credits you can delete your microservice via the following commands:
@@ -84,51 +97,51 @@ jc delete <microservice id>
 ### Animal Detector
 ```bash
-gptdeploy create --description "Given an image, return the image with bounding boxes of all animals (https://pjreddie.com/media/files/yolov3.weights, https://raw.githubusercontent.com/pjreddie/darknet/master/cfg/yolov3.cfg)" --test "https://images.unsplash.com/photo-1444212477490-ca407925329e contains animals"
+gptdeploy generate --description "Given an image, return the image with bounding boxes of all animals (https://pjreddie.com/media/files/yolov3.weights, https://raw.githubusercontent.com/pjreddie/darknet/master/cfg/yolov3.cfg)" --test "https://images.unsplash.com/photo-1444212477490-ca407925329e contains animals"
 ```
 <img src="res/animal_detector_example.png" alt="Animal Detector" width="600" />
 ### Meme Generator
 ```bash
-gptdeploy create --description "Generate a meme from an image and a caption" --test "Surprised Pikachu: https://media.wired.com/photos/5f87340d114b38fa1f8339f9/master/w_1600%2Cc_limit/Ideas_Surprised_Pikachu_HD.jpg, TOP:When you discovered GPTDeploy"
+gptdeploy generate --description "Generate a meme from an image and a caption" --test "Surprised Pikachu: https://media.wired.com/photos/5f87340d114b38fa1f8339f9/master/w_1600%2Cc_limit/Ideas_Surprised_Pikachu_HD.jpg, TOP:When you discovered GPTDeploy"
 ```
 <img src="res/meme_example.png" alt="Meme Generator" width="600" />
 ### Rhyme Generator
 ```bash
-gptdeploy create --description "Given a word, return a list of rhyming words using the datamuse api" --test "hello"
+gptdeploy generate --description "Given a word, return a list of rhyming words using the datamuse api" --test "hello"
 ```
 <img src="res/rhyme_generator_example.png" alt="Rhyme Generator" width="600" />
 ### Word Cloud Generator
 ```bash
-gptdeploy create --description "Generate a word cloud from a given text" --test "Lorem ipsum dolor sit amet, consectetur adipiscing elit."
+gptdeploy generate --description "Generate a word cloud from a given text" --test "Lorem ipsum dolor sit amet, consectetur adipiscing elit."
 ```
 <img src="res/word_cloud_example.png" alt="Word Cloud Generator" width="600" />
 ### 3D Model Info
 ```bash
-gptdeploy create --description "Given a 3d object, return vertex count and face count" --test "https://raw.githubusercontent.com/polygonjs/polygonjs-assets/master/models/wolf.obj"
+gptdeploy generate --description "Given a 3d object, return vertex count and face count" --test "https://raw.githubusercontent.com/polygonjs/polygonjs-assets/master/models/wolf.obj"
 ```
 <img src="res/obj_info_example.png" alt="3D Model Info" width="600" />
 ### Table Extraction
 ```bash
-gptdeploy create --description "Given a URL, extract all tables as csv" --test "http://www.ins.tn/statistiques/90"
+gptdeploy generate --description "Given a URL, extract all tables as csv" --test "http://www.ins.tn/statistiques/90"
 ```
 <img src="res/table_extraction_example.png" alt="Table Extraction" width="600" />
 ### Audio to Mel Spectrogram
 ```bash
-gptdeploy create --description "Create mel spectrograms from audio file" --test "https://cdn.pixabay.com/download/audio/2023/02/28/audio_550d815fa5.mp3"
+gptdeploy generate --description "Create mel spectrograms from audio file" --test "https://cdn.pixabay.com/download/audio/2023/02/28/audio_550d815fa5.mp3"
 ```
 <img src="res/audio_to_mel_example.png" alt="Audio to Mel Spectrogram" width="600" />
 ### Text to Speech
 ```bash
-gptdeploy create --description "Convert text to speech" --test "Hello, welcome to GPT Deploy!"
+gptdeploy generate --description "Convert text to speech" --test "Hello, welcome to GPT Deploy!"
 ```
 <a href=res/text_to_speech_example.wav><img src="res/text_to_speech_example.png" alt="Text to Speech" width="600" /></a>
@@ -139,20 +152,20 @@ gptdeploy create --description "Convert text to speech" --test "Hello, welcome t
 ### Heatmap Generator
 ```bash
-gptdeploy create --description "Create a heatmap from an image and a list of relative coordinates" --test "https://images.unsplash.com/photo-1574786198875-49f5d09fe2d2, [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6], [0.2, 0.1], [0.7, 0.2], [0.4, 0.2]]"
+gptdeploy generate --description "Create a heatmap from an image and a list of relative coordinates" --test "https://images.unsplash.com/photo-1574786198875-49f5d09fe2d2, [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6], [0.2, 0.1], [0.7, 0.2], [0.4, 0.2]]"
 ```
 <img src="res/heatmap_example.png" alt="Heatmap Generator" width="600" />
 ### QR Code Generator
 ```bash
-gptdeploy create --description "Generate QR code from URL" --test "https://www.example.com"
+gptdeploy generate --description "Generate QR code from URL" --test "https://www.example.com"
 ```
 <img src="res/qr_example.png" alt="QR Code Generator" width="600" />
 ### Mandelbrot Set Visualizer
 ```bash
-gptdeploy create --description "Visualize the Mandelbrot set with custom parameters" --test "center=-0+1i, zoom=1.0, size=800x800, iterations=1000"
+gptdeploy generate --description "Visualize the Mandelbrot set with custom parameters" --test "center=-0+1i, zoom=1.0, size=800x800, iterations=1000"
 ```
 <img src="res/mandelbrot_example.png" alt="Mandelbrot Set Visualizer" width="600" />
@@ -164,7 +177,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Given a password, return a score from 1 to 10 indicating the strength of the password" --test "Pa$$w0rd => 1/5, !Akfdh%.ytRadf => 5/5")
+[//]: # (gptdeploy generate --description "Given a password, return a score from 1 to 10 indicating the strength of the password" --test "Pa$$w0rd => 1/5, !Akfdh%.ytRadf => 5/5")
 [//]: # (```)
@@ -172,7 +185,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Convert text to morse code" --test "Hello, welcome to GPT Deploy!")
+[//]: # (gptdeploy generate --description "Convert text to morse code" --test "Hello, welcome to GPT Deploy!")
 [//]: # (```)
@@ -180,7 +193,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Given an IP address, return the geolocation information" --test "142.251.46.174")
+[//]: # (gptdeploy generate --description "Given an IP address, return the geolocation information" --test "142.251.46.174")
 [//]: # (```)
@@ -188,7 +201,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Converts any currency into any other" --test "1 usd to eur")
+[//]: # (gptdeploy generate --description "Converts any currency into any other" --test "1 usd to eur")
 [//]: # (```)
@@ -196,7 +209,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Given an image, resize it to a specified width and height" --test "https://images.unsplash.com/photo-1602738328654-51ab2ae6c4ff")
+[//]: # (gptdeploy generate --description "Given an image, resize it to a specified width and height" --test "https://images.unsplash.com/photo-1602738328654-51ab2ae6c4ff")
 [//]: # (```)
@@ -204,7 +217,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Given a city, return the current weather" --test "Berlin")
+[//]: # (gptdeploy generate --description "Given a city, return the current weather" --test "Berlin")
 [//]: # (```)
@@ -213,7 +226,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Given a sudoku puzzle, return the solution" --test "[[2, 5, 0, 0, 3, 0, 9, 0, 1], [0, 1, 0, 0, 0, 4, 0, 0, 0], [4, 0, 7, 0, 0, 0, 2, 0, 8], [0, 0, 5, 2, 0, 0, 0, 0, 0], [0, 0, 0, 0, 9, 8, 1, 0, 0], [0, 4, 0, 0, 0, 3, 0, 0, 0], [0, 0, 0, 3, 6, 0, 0, 7, 2], [0, 7, 0, 0, 0, 0, 0, 0, 3], [9, 0, 3, 0, 0, 0, 6, 0, 4]]")
+[//]: # (gptdeploy generate --description "Given a sudoku puzzle, return the solution" --test "[[2, 5, 0, 0, 3, 0, 9, 0, 1], [0, 1, 0, 0, 0, 4, 0, 0, 0], [4, 0, 7, 0, 0, 0, 2, 0, 8], [0, 0, 5, 2, 0, 0, 0, 0, 0], [0, 0, 0, 0, 9, 8, 1, 0, 0], [0, 4, 0, 0, 0, 3, 0, 0, 0], [0, 0, 0, 3, 6, 0, 0, 7, 2], [0, 7, 0, 0, 0, 0, 0, 0, 3], [9, 0, 3, 0, 0, 0, 6, 0, 4]]")
 [//]: # (```)
@@ -222,7 +235,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Estimate a company's carbon footprint based on factors like transportation, electricity usage, waste production etc..." --test "Jina AI")
+[//]: # (gptdeploy generate --description "Estimate a company's carbon footprint based on factors like transportation, electricity usage, waste production etc..." --test "Jina AI")
 [//]: # (```)
@@ -231,7 +244,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Create a microservice that estimates the value of a property based on factors like location, property type, age, and square footage." --test "Berlin Friedrichshain, Flat, 100m², 10 years old")
+[//]: # (gptdeploy generate --description "Create a microservice that estimates the value of a property based on factors like location, property type, age, and square footage." --test "Berlin Friedrichshain, Flat, 100m², 10 years old")
 [//]: # (```)
@@ -240,7 +253,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Align two DNA or RNA sequences using the Needleman-Wunsch algorithm" --test "AGTC, GTCA")
+[//]: # (gptdeploy generate --description "Align two DNA or RNA sequences using the Needleman-Wunsch algorithm" --test "AGTC, GTCA")
 [//]: # (```)
@@ -249,7 +262,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Convert markdown to HTML" --test "# Hello, welcome to GPT Deploy!")
+[//]: # (gptdeploy generate --description "Convert markdown to HTML" --test "# Hello, welcome to GPT Deploy!")
 [//]: # (```)
@@ -258,7 +271,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Generate a barcode from a string" --test "Hello, welcome to GPT Deploy!")
+[//]: # (gptdeploy generate --description "Generate a barcode from a string" --test "Hello, welcome to GPT Deploy!")
 [//]: # (```)
@@ -267,7 +280,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Compress a file using the gzip algorithm" --test "content of the file: Hello, welcome to GPT Deploy!")
+[//]: # (gptdeploy generate --description "Compress a file using the gzip algorithm" --test "content of the file: Hello, welcome to GPT Deploy!")
 [//]: # (```)
@@ -276,7 +289,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Add a watermark &#40;GPT Deploy&#41; to an image" --test "https://images.unsplash.com/photo-1602738328654-51ab2ae6c4ff")
+[//]: # (gptdeploy generate --description "Add a watermark &#40;GPT Deploy&#41; to an image" --test "https://images.unsplash.com/photo-1602738328654-51ab2ae6c4ff")
 [//]: # (```)
@@ -285,7 +298,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Extract metadata from a file" --test "https://images.unsplash.com/photo-1602738328654-51ab2ae6c4ff")
+[//]: # (gptdeploy generate --description "Extract metadata from a file" --test "https://images.unsplash.com/photo-1602738328654-51ab2ae6c4ff")
 [//]: # (```)
@@ -294,7 +307,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Extract a thumbnail from a video" --test "http://techslides.com/demos/sample-videos/small.mp4")
+[//]: # (gptdeploy generate --description "Extract a thumbnail from a video" --test "http://techslides.com/demos/sample-videos/small.mp4")
 [//]: # (```)
@@ -303,7 +316,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Create a gif from a list of images" --test "https://images.unsplash.com/photo-1564725075388-cc8338732289, https://images.unsplash.com/photo-1584555684040-bad07f46a21f, https://images.unsplash.com/photo-1584555613497-9ecf9dd06f68")
+[//]: # (gptdeploy generate --description "Create a gif from a list of images" --test "https://images.unsplash.com/photo-1564725075388-cc8338732289, https://images.unsplash.com/photo-1584555684040-bad07f46a21f, https://images.unsplash.com/photo-1584555613497-9ecf9dd06f68")
 [//]: # (```)
@@ -316,7 +329,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # ()
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Visualize a sound file and output the visualization as video combined with the sound" --test "some/mp3/file.mp3")
+[//]: # (gptdeploy generate --description "Visualize a sound file and output the visualization as video combined with the sound" --test "some/mp3/file.mp3")
 [//]: # (```)
@@ -326,7 +339,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Convert a chemical formula into a 2D chemical structure diagram" --test "C6H6")
+[//]: # (gptdeploy generate --description "Convert a chemical formula into a 2D chemical structure diagram" --test "C6H6")
 [//]: # (```)
@@ -334,7 +347,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "creates aesthetically pleasing color palettes based on a seed color, using color theory principles like complementary or analogous colors" --test "red")
+[//]: # (gptdeploy generate --description "creates aesthetically pleasing color palettes based on a seed color, using color theory principles like complementary or analogous colors" --test "red")
 [//]: # (```)
@@ -343,7 +356,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Generate a depth map from a 3d Object" --test "https://raw.githubusercontent.com/polygonjs/polygonjs-assets/master/models/wolf.obj")
+[//]: # (gptdeploy generate --description "Generate a depth map from a 3d Object" --test "https://raw.githubusercontent.com/polygonjs/polygonjs-assets/master/models/wolf.obj")
 [//]: # (```)
@@ -355,7 +368,7 @@ gptdeploy create --description "Visualize the Mandelbrot set with custom paramet
 [//]: # (```bash)
-[//]: # (gptdeploy create --description "Convert image to ASCII art" --test "https://images.unsplash.com/photo-1602738328654-51ab2ae6c4ff")
+[//]: # (gptdeploy generate --description "Convert image to ASCII art" --test "https://images.unsplash.com/photo-1602738328654-51ab2ae6c4ff")
 [//]: # (```)
@@ -390,7 +403,7 @@ graph TB
 registry --> deploy[deploy microservice]
-deploy --> streamlit[create streamlit playground]
+deploy --> streamlit[generate streamlit playground]
 streamlit --> user_run[user tests microservice]
@@ -398,21 +411,21 @@ graph TB
 1. GPT Deploy identifies several strategies to implement your task.
 2. It tests each strategy until it finds one that works.
-3. For each strategy, it creates the following files:
-- executor.py: This is the main implementation of the microservice.
-- test_executor.py: These are test cases to ensure the microservice works as expected.
+3. For each strategy, it generates the following files:
+- microservice.py: This is the main implementation of the microservice.
+- test_microservice.py: These are test cases to ensure the microservice works as expected.
 - requirements.txt: This file lists the packages needed by the microservice and its tests.
 - Dockerfile: This file is used to run the microservice in a container and also runs the tests when building the image.
 4. GPT Deploy attempts to build the image. If the build fails, it uses the error message to apply a fix and tries again to build the image.
 5. Once it finds a successful strategy, it:
 - Pushes the Docker image to the registry.
 - Deploys the microservice.
-- Creates a Streamlit playground where you can test the microservice.
+- Generates a Streamlit playground where you can test the microservice.
 6. If it fails 10 times in a row, it moves on to the next approach.
 ## 🔮 Vision
-Use natural language interface to create, deploy and update your microservice infrastructure.
+Use a natural language interface to generate, deploy and update your microservice infrastructure.
 ## ✨ Contributors
 If you want to contribute to this project, feel free to open a PR or an issue.

setup.py

@@ -8,7 +8,7 @@ def read_requirements():
 setup(
     name='gptdeploy',
     version='0.18.15',
-    description='Use natural language interface to create, deploy and update your microservice infrastructure.',
+    description='Use natural language interface to generate, deploy and update your microservice infrastructure.',
     long_description=open('README.md', 'r', encoding='utf-8').read(),
     long_description_content_type='text/markdown',
     author='Florian Hönicke',

src/apis/__init__.py (new, empty file)


@@ -7,13 +7,13 @@ from openai.error import RateLimitError, Timeout
 from src.constants import PRICING_GPT4_PROMPT, PRICING_GPT4_GENERATION, PRICING_GPT3_5_TURBO_PROMPT, \
     PRICING_GPT3_5_TURBO_GENERATION
-from src.prompt_system import system_base_definition
+from src.options.generate.prompt_system import system_base_definition
 from src.utils.io import timeout_generator_wrapper, GenerationTimeoutError
 from src.utils.string_tools import print_colored
 class GPTSession:
     def __init__(self):
-        self.get_openai_api_key()
+        self.configure_openai_api_key()
         if self.is_gpt4_available():
             self.supported_model = 'gpt-4'
             self.pricing_prompt = PRICING_GPT4_PROMPT
@@ -26,7 +26,7 @@ class GPTSession:
         self.chars_prompt_so_far = 0
         self.chars_generation_so_far = 0
-    def get_openai_api_key(self):
+    def configure_openai_api_key(self):
         if 'OPENAI_API_KEY' not in os.environ:
             raise Exception('''
 You need to set OPENAI_API_KEY in your environment.

src/apis/jina_cloud.py

@@ -3,17 +3,41 @@ import json
 import os
 import re
 import subprocess
+import threading
 import webbrowser
 from pathlib import Path
+import click
 import hubble
 from hubble.executor.helper import upload_file, archive_package, get_request_header
 from jcloud.flow import CloudFlow
+from jina import Flow
-from src.utils.io import suppress_stdout
+from src.utils.io import suppress_stdout, is_docker_running
 from src.utils.string_tools import print_colored
+import requests
+import time
+def wait_until_app_is_ready(url):
+    is_app_ready = False
+    while not is_app_ready:
+        try:
+            response = requests.get(url)
+            print('waiting for app to be ready...')
+            if response.status_code == 200:
+                is_app_ready = True
+        except requests.exceptions.RequestException:
+            pass
+        time.sleep(0.5)
+def open_streamlit_app():
+    url = "http://localhost:8081/playground"
+    wait_until_app_is_ready(url)
+    webbrowser.open(url, new=2)
 def redirect_callback(href):
     print(
         f'You need login to Jina first to use GPTDeploy\n'
@@ -85,29 +109,89 @@ def get_user_name():
     return response['data']['name']
-def deploy_on_jcloud(flow_yaml):
+def _deploy_on_jcloud(flow_yaml):
     cloud_flow = CloudFlow(path=flow_yaml)
     return cloud_flow.__enter__().endpoints['gateway']
-def deploy_flow(executor_name, dest_folder):
+def deploy_on_jcloud(executor_name, microservice_path):
     print('Deploy a jina flow')
+    full_flow_path = create_flow_yaml(microservice_path, executor_name, use_docker=True)
+    for i in range(3):
+        try:
+            host = _deploy_on_jcloud(flow_yaml=full_flow_path)
+            break
+        except Exception as e:
+            print(f'Could not deploy on Jina Cloud. Trying again in 5 seconds. Error: {e}')
+            time.sleep(5)
+            if i == 2:
+                raise Exception('''
+Could not deploy on Jina Cloud.
+This can happen when the microservice is buggy, if it requires too much memory or if the Jina Cloud is overloaded.
+Please try again later.
+''')
+    print(f'''
+Your Microservice is deployed.
+Run the following command to start the playground:
+streamlit run {os.path.join(microservice_path, "app.py")} --server.port 8081 --server.address 0.0.0.0 -- --host http://{host}
+''')
+    return host
+def run_streamlit_app(app_path):
+    subprocess.run(['streamlit', 'run', app_path, '--server.address', '0.0.0.0', '--server.port', '8081', '--', '--host', 'grpc://localhost:8080'])
+def run_locally(executor_name, microservice_version_path):
+    if is_docker_running():
+        use_docker = True
+    else:
+        click.echo('Docker daemon doesn\'t seem to be running. Trying to start it without docker')
+        use_docker = False
+    print('Run a jina flow locally')
+    full_flow_path = create_flow_yaml(microservice_version_path, executor_name, use_docker)
+    flow = Flow.load_config(full_flow_path)
+    with flow:
+        print('''
+Your microservice started locally.
+We now start the playground for you.
+''')
+        app_path = os.path.join(microservice_version_path, "app.py")
+        # Run the Streamlit app in a separate thread
+        streamlit_thread = threading.Thread(target=run_streamlit_app, args=(app_path,))
+        streamlit_thread.start()
+        # Open the Streamlit app in the user's default web browser
+        open_streamlit_app()
+        flow.block()
+def create_flow_yaml(dest_folder, executor_name, use_docker):
+    if use_docker:
+        prefix = 'jinaai+docker'
+    else:
+        prefix = 'jinaai'
     flow = f'''
 jtype: Flow
 with:
   name: nowapi
-  env:
-    JINA_LOG_LEVEL: DEBUG
+  port: 8080
 jcloud:
   version: 3.14.2.dev18
   labels:
     creator: microchain
   name: gptdeploy
 executors:
   - name: {executor_name.lower()}
-    uses: jinaai+docker://{get_user_name()}/{executor_name}:latest
-    env:
-      JINA_LOG_LEVEL: DEBUG
+    uses: {prefix}://{get_user_name()}/{executor_name}:latest
+    {"" if use_docker else "install-requirements: True"}
     jcloud:
       resources:
         instance: C2
@@ -117,16 +201,7 @@ executors:
         'flow.yml')
     with open(full_flow_path, 'w') as f:
         f.write(flow)
-    for i in range(3):
-        try:
-            host = deploy_on_jcloud(flow_yaml=full_flow_path)
-            break
-        except Exception as e:
-            raise e
-    print(f'Flow is deployed create the playground for {host}')
-    return host
+    return full_flow_path
 def replace_client_line(file_content: str, replacement: str) -> str:


@@ -1,30 +1,80 @@
import functools
import os
import click import click
from src.executor_factory import ExecutorFactory from src.apis.jina_cloud import jina_auth_login
from src.jina_cloud import jina_auth_login from src.options.configure.key_handling import set_api_key
from src.key_handling import set_api_key
def exception_interceptor(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
try:
return func(*args, **kwargs)
except Exception as e:
raise type(e)(f'''
{str(e)}
😱😱😱 Sorry for this experience. Could you please report an issue about this on our github repo? We'll try to fix it asap.
''') from e
return wrapper
def path_param(func):
@click.option('--path', required=True, help='Path to the generated microservice.')
@functools.wraps(func)
def wrapper(*args, **kwargs):
path = os.path.expanduser(kwargs['path'])
path = os.path.abspath(path)
kwargs['path'] = path
return func(*args, **kwargs)
return wrapper
@click.group(invoke_without_command=True) @click.group(invoke_without_command=True)
def main(): @click.pass_context
@exception_interceptor
def main(ctx):
if ctx.invoked_subcommand is None:
click.echo(ctx.get_help())
jina_auth_login() jina_auth_login()
@main.command()
@click.option('--description', required=True, help='Description of the microservice.')
@click.option('--test', required=True, help='Test scenario for the microservice.')
@path_param
def generate(
description,
test,
path,
):
from src.options.generate.generator import Generator
path = os.path.expanduser(path)
path = os.path.abspath(path)
if os.path.exists(path):
if os.listdir(path):
click.echo(f"Error: The path {path} you provided via --path is not empty. Please choose a directory that does not exist or is empty.")
return
generator = Generator()
generator.generate(description, test, path)
@main.command()
@path_param
def run(path):
from src.options.run import Runner
path = os.path.expanduser(path)
path = os.path.abspath(path)
Runner().run(path)
@main.command()
@path_param
def deploy(path):
from src.options.deploy.deployer import Deployer
path = os.path.expanduser(path)
path = os.path.abspath(path)
Deployer().deploy(path)
@main.command()
@click.option('--key', required=True, help='Your OpenAI API key.')


@@ -1,5 +1,5 @@
EXECUTOR_FILE_NAME = 'microservice.py'
TEST_EXECUTOR_FILE_NAME = 'test_microservice.py'
REQUIREMENTS_FILE_NAME = 'requirements.txt'
DOCKER_FILE_NAME = 'Dockerfile'
CLIENT_FILE_NAME = 'client.py'
@@ -21,12 +21,12 @@ FILE_AND_TAG_PAIRS = [
(STREAMLIT_FILE_NAME, STREAMLIT_FILE_TAG)
]
FLOW_URL_PLACEHOLDER = 'jcloud.jina.ai'
PRICING_GPT4_PROMPT = 0.03
PRICING_GPT4_GENERATION = 0.06
PRICING_GPT3_5_TURBO_PROMPT = 0.002
PRICING_GPT3_5_TURBO_GENERATION = 0.002
NUM_IMPLEMENTATION_STRATEGIES = 3
MAX_DEBUGGING_ITERATIONS = 10

src/options/__init__.py Normal file

@@ -0,0 +1,37 @@
import os
def get_latest_folder(path):
return max([os.path.join(path, f) for f in os.listdir(path) if os.path.isdir(os.path.join(path, f))])
def get_latest_version_path(microservice_path):
executor_name_path = get_latest_folder(microservice_path)
latest_approach_path = get_latest_folder(executor_name_path)
latest_version_path = get_latest_folder(latest_approach_path)
return latest_version_path
def get_executor_name(microservice_path):
return get_latest_folder(microservice_path).split('/')[-1]
def validate_folder_is_correct(microservice_path):
if not os.path.exists(microservice_path):
raise ValueError(f'Path {microservice_path} does not exist')
if not os.path.isdir(microservice_path):
raise ValueError(f'Path {microservice_path} is not a directory')
if len(os.listdir(microservice_path)) == 0:
raise ValueError(f'Path {microservice_path} is empty. Please generate a microservice first. Type `gptdeploy generate` for further instructions.')
if len(os.listdir(microservice_path)) > 1:
raise ValueError(f'Path {microservice_path} needs to contain only one folder. Please make sure that you only have one microservice in this folder.')
latest_version_path = get_latest_version_path(microservice_path)
required_files = [
'app.py',
'requirements.txt',
'Dockerfile',
'config.yml',
'microservice.py',
'test_microservice.py',
]
for file_name in required_files:
if not os.path.exists(os.path.join(latest_version_path, file_name)):
raise ValueError(f'Path {latest_version_path} needs to contain a file named {file_name}')
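Since each generated microservice nests folders as `<name>/<approach>/<version>`, the `get_latest_folder` chain above resolves the newest version by picking the lexicographically largest sub-folder at each level. A minimal sketch with hypothetical folder names:

```python
import os
import tempfile

def get_latest_folder(path):
    # same helper as above: the lexicographically largest sub-folder wins,
    # so 'v3' beats 'v1', but note 'v10' would sort before 'v2'
    return max(os.path.join(path, f) for f in os.listdir(path)
               if os.path.isdir(os.path.join(path, f)))

# hypothetical layout: <microservice name>/<approach>/<version>
root = tempfile.mkdtemp()
for version in ('v1', 'v2', 'v3'):
    os.makedirs(os.path.join(root, 'PdfParser', '0_pypdf', version))

latest = get_latest_folder(get_latest_folder(get_latest_folder(root)))
print(latest.endswith('v3'))  # True
```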


@@ -0,0 +1,10 @@
from src.apis.jina_cloud import deploy_on_jcloud
from src.options import validate_folder_is_correct, get_executor_name, get_latest_version_path
class Deployer:
def deploy(self, microservice_path):
validate_folder_is_correct(microservice_path)
executor_name = get_executor_name(microservice_path)
latest_version_path = get_latest_version_path(microservice_path)
deploy_on_jcloud(executor_name, latest_version_path)


@@ -2,16 +2,16 @@ import os
import random
import re
from src.apis import gpt
from src.constants import FILE_AND_TAG_PAIRS, NUM_IMPLEMENTATION_STRATEGIES, MAX_DEBUGGING_ITERATIONS
from src.apis.jina_cloud import process_error_message, push_executor
from src.options.generate.prompt_tasks import general_guidelines, chain_of_thought_creation, executor_file_task, \
not_allowed, chain_of_thought_optimization, test_executor_file_task, requirements_file_task, docker_file_task
from src.utils.io import persist_file, get_all_microservice_files_with_content, get_microservice_path
from src.utils.string_tools import print_colored
class Generator:
def __init__(self):
self.gpt_session = gpt.GPTSession()
@@ -23,93 +23,81 @@ class ExecutorFactory:
else:
return ''
def write_config_yml(self, microservice_name, dest_folder):
config_content = f'''
jtype: {microservice_name}
py_modules:
- microservice.py
metas:
name: {microservice_name}
'''
with open(os.path.join(dest_folder, 'config.yml'), 'w') as f:
f.write(config_content)
def files_to_string(self, file_name_to_content):
all_microservice_files_string = ''
for file_name, tag in FILE_AND_TAG_PAIRS:
if file_name in file_name_to_content:
all_microservice_files_string += f'**{file_name}**\n'
all_microservice_files_string += f'```{tag}\n'
all_microservice_files_string += file_name_to_content[file_name]
all_microservice_files_string += '\n```\n\n'
return all_microservice_files_string
def wrap_content_in_code_block(self, microservice_content, file_name, tag):
return f'**{file_name}**\n```{tag}\n{microservice_content}\n```\n\n'
def generate_microservice(
self,
description,
test,
path,
microservice_name,
package,
num_approach,
is_chain_of_thought=False,
):
MICROSERVICE_FOLDER_v1 = get_microservice_path(path, microservice_name, package, num_approach, 1)
os.makedirs(MICROSERVICE_FOLDER_v1)
print_colored('', '############# Microservice #############', 'red')
user_query = (
general_guidelines()
+ executor_file_task(microservice_name, description, test, package)
+ chain_of_thought_creation()
)
conversation = self.gpt_session.get_conversation()
microservice_content_raw = conversation.query(user_query)
if is_chain_of_thought:
microservice_content_raw = conversation.query(
f"General rules: " + not_allowed() + chain_of_thought_optimization('python', 'microservice.py'))
microservice_content = self.extract_content_from_result(microservice_content_raw, 'microservice.py')
persist_file(microservice_content, os.path.join(MICROSERVICE_FOLDER_v1, 'microservice.py'))
print_colored('', '############# Test Microservice #############', 'red')
user_query = (
general_guidelines()
+ self.wrap_content_in_code_block(microservice_content, 'microservice.py', 'python')
+ test_executor_file_task(microservice_name, test)
)
conversation = self.gpt_session.get_conversation()
test_microservice_content_raw = conversation.query(user_query)
if is_chain_of_thought:
test_microservice_content_raw = conversation.query(
f"General rules: " + not_allowed() +
chain_of_thought_optimization('python', 'test_microservice.py')
+ "Don't add any additional tests. "
)
test_microservice_content = self.extract_content_from_result(test_microservice_content_raw, 'test_microservice.py')
persist_file(test_microservice_content, os.path.join(MICROSERVICE_FOLDER_v1, 'test_microservice.py'))
print_colored('', '############# Requirements #############', 'red')
requirements_path = os.path.join(MICROSERVICE_FOLDER_v1, 'requirements.txt')
user_query = (
general_guidelines()
+ self.wrap_content_in_code_block(microservice_content, 'microservice.py', 'python')
+ self.wrap_content_in_code_block(test_microservice_content, 'test_microservice.py', 'python')
+ requirements_file_task()
)
conversation = self.gpt_session.get_conversation()
@@ -124,8 +112,8 @@ class ExecutorFactory:
print_colored('', '############# Dockerfile #############', 'red')
user_query = (
general_guidelines()
+ self.wrap_content_in_code_block(microservice_content, 'microservice.py', 'python')
+ self.wrap_content_in_code_block(test_microservice_content, 'test_microservice.py', 'python')
+ self.wrap_content_in_code_block(requirements_content, 'requirements.txt', '')
+ docker_file_task()
)
@@ -135,28 +123,29 @@ class ExecutorFactory:
dockerfile_content_raw = conversation.query(
f"General rules: " + not_allowed() + chain_of_thought_optimization('dockerfile', 'Dockerfile'))
dockerfile_content = self.extract_content_from_result(dockerfile_content_raw, 'Dockerfile')
persist_file(dockerfile_content, os.path.join(MICROSERVICE_FOLDER_v1, 'Dockerfile'))
self.write_config_yml(microservice_name, MICROSERVICE_FOLDER_v1)
print('First version of the microservice generated. Start iterating on it to make the tests pass...')
def generate_playground(self, microservice_name, microservice_path):
print_colored('', '############# Playground #############', 'red')
file_name_to_content = get_all_microservice_files_with_content(microservice_path)
user_query = (
general_guidelines()
+ self.wrap_content_in_code_block(file_name_to_content['microservice.py'], 'microservice.py', 'python')
+ self.wrap_content_in_code_block(file_name_to_content['test_microservice.py'], 'test_microservice.py', 'python')
+ f'''
Create a playground for the executor {microservice_name} using streamlit.
The playground must look like it was made by a professional designer.
All the ui elements are well thought out to make them visually appealing and easy to use.
The playground must be started with a custom host: streamlit run app.py -- --host grpc://...
The playground must not let the user configure the --host grpc://... on the ui.
This is an example how you can connect to the executor assuming the document (d) is already defined:
from jina import Client, Document, DocumentArray
client = Client(host=host)
response = client.post('/', inputs=DocumentArray([d])) # always use '/'
print(response[0].text) # can also be blob in case of image/audio..., this should be visualized in the streamlit app
'''
@@ -166,25 +155,21 @@ print(response[0].text) # can also be blob in case of image/audio..., this shoul
playground_content_raw = conversation.query(
f"General rules: " + not_allowed() + chain_of_thought_optimization('python', 'app.py'))
playground_content = self.extract_content_from_result(playground_content_raw, 'app.py')
persist_file(playground_content, os.path.join(microservice_path, 'app.py'))
def debug_microservice(self, path, microservice_name, num_approach, packages, description, test):
error_before = ''
for i in range(1, MAX_DEBUGGING_ITERATIONS):
print('Debugging iteration', i)
print('Trying to build the microservice. Might take a while...')
previous_microservice_path = get_microservice_path(path, microservice_name, packages, num_approach, i)
next_microservice_path = get_microservice_path(path, microservice_name, packages, num_approach, i + 1)
log_hubble = push_executor(previous_microservice_path)
error = process_error_message(log_hubble)
if error:
os.makedirs(next_microservice_path)
file_name_to_content = get_all_microservice_files_with_content(previous_microservice_path)
all_files_string = self.files_to_string(file_name_to_content)
user_query = (
f"General rules: " + not_allowed()
@@ -217,19 +202,19 @@ print(response[0].text) # can also be blob in case of image/audio..., this shoul
file_name_to_content[file_name] = updated_file
for file_name, content in file_name_to_content.items():
persist_file(content, os.path.join(next_microservice_path, file_name))
error_before = error
else:
break
if i == MAX_DEBUGGING_ITERATIONS - 1:
raise self.MaxDebugTimeReachedException('Could not debug the microservice.')
return get_microservice_path(path, microservice_name, packages, num_approach, i)
class MaxDebugTimeReachedException(BaseException):
pass
def generate_microservice_name(self, description):
conversation = self.gpt_session.get_conversation()
user_query = f'''
Generate a name for the executor matching the description:
@@ -250,7 +235,7 @@ PDFParserExecutor
name = self.extract_content_from_result(name_raw, 'name.txt')
return name
def get_possible_packages(self, description):
print_colored('', '############# What package to use? #############', 'red')
user_query = f'''
Here is the task description of the problem you need to solve:
@@ -279,29 +264,26 @@ package2,package3,...
packages_raw = conversation.query(user_query)
packages_csv_string = self.extract_content_from_result(packages_raw, 'packages.csv')
packages = [package.split(',') for package in packages_csv_string.split('\n')]
packages = packages[:NUM_IMPLEMENTATION_STRATEGIES]
return packages
def generate(self, description, test, microservice_path):
generated_name = self.generate_microservice_name(description)
microservice_name = f'{generated_name}{random.randint(0, 10_000_000)}'
packages_list = self.get_possible_packages(description)
for num_approach, packages in enumerate(packages_list):
try:
self.generate_microservice(description, test, microservice_path, microservice_name, packages, num_approach)
final_version_path = self.debug_microservice(microservice_path, microservice_name, num_approach, packages, description, test)
self.generate_playground(microservice_name, final_version_path)
except self.MaxDebugTimeReachedException:
print('Could not debug the Microservice.')
continue
print(f'''
You can now run or deploy your microservice:
gptdeploy run --path {microservice_path}
gptdeploy deploy --path {microservice_path}
'''
)
break


@@ -4,9 +4,8 @@ executor_example = '''
Using the Jina framework, users can define executors.
Here is an example of how an executor can be defined. It always starts with a comment:
**microservice.py**
```python
from jina import Executor, requests, DocumentArray, Document
import json
class MyInfoExecutor(Executor):


@@ -50,7 +50,7 @@ def test_executor_file_task(executor_name, test_scenario):
if test_scenario else ""
)
+ "Use the following import to import the executor: "
f"from microservice import {executor_name} "
+ not_allowed()
+ "The test must not open local files. "
+ "The test must not mock a function of the executor. "


@@ -0,0 +1 @@
from src.options.run.runner import Runner

src/options/run/runner.py Normal file

@@ -0,0 +1,11 @@
from src.apis.jina_cloud import run_locally
from src.options import validate_folder_is_correct, get_executor_name, get_latest_version_path
class Runner():
def run(self, microservice_path):
validate_folder_is_correct(microservice_path)
executor_name = get_executor_name(microservice_path)
latest_version_path = get_latest_version_path(microservice_path)
run_locally(executor_name, latest_version_path)


@@ -1,53 +0,0 @@
# from fastapi import FastAPI
# from fastapi.exceptions import RequestValidationError
# from pydantic import BaseModel
# from typing import Optional, Dict
#
# from starlette.middleware.cors import CORSMiddleware
# from starlette.requests import Request
# from starlette.responses import JSONResponse
# from main import main
#
# app = FastAPI()
#
# # Define the request model
# class CreateRequest(BaseModel):
# test_scenario: str
# executor_description: str
#
# # Define the response model
# class CreateResponse(BaseModel):
# result: Dict[str, str]
# success: bool
# message: Optional[str]
#
# @app.post("/create", response_model=CreateResponse)
# def create_endpoint(request: CreateRequest):
#
# result = main(
# executor_description=request.executor_description,
# test_scenario=request.test_scenario,
# )
# return CreateResponse(result=result, success=True, message=None)
#
#
# app.add_middleware(
# CORSMiddleware,
# allow_origins=["*"],
# allow_credentials=True,
# allow_methods=["*"],
# allow_headers=["*"],
# )
#
# # Add a custom exception handler for RequestValidationError
# @app.exception_handler(RequestValidationError)
# def validation_exception_handler(request: Request, exc: RequestValidationError):
# return JSONResponse(
# status_code=422,
# content={"detail": exc.errors()},
# )
#
#
# if __name__ == "__main__":
# import uvicorn
# uvicorn.run("server:app", host="0.0.0.0", port=8000, log_level="info")


@@ -1,17 +1,37 @@
import os
import shutil
import concurrent.futures
from typing import Generator
import sys
from contextlib import contextmanager
import docker
from docker import APIClient
def get_microservice_path(path, microservice_name, package, num_approach, version):
package_path = '_'.join(package)
return os.path.join(path, microservice_name, f'{num_approach}_{package_path}', f'v{version}')
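For illustration, the helper above maps a (name, packages, approach, version) tuple onto the nested folder layout; the microservice and package names below are hypothetical:

```python
import os

def get_microservice_path(path, microservice_name, package, num_approach, version):
    # one folder per microservice, per approach (joined package names), per version
    package_path = '_'.join(package)
    return os.path.join(path, microservice_name, f'{num_approach}_{package_path}', f'v{version}')

# hypothetical example: approach 0 tried the packages ['pypdf', 'requests']
p = get_microservice_path('.', 'PdfParser', ['pypdf', 'requests'], 0, 1)
print(p)  # e.g. ./PdfParser/0_pypdf_requests/v1 on POSIX
```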
def persist_file(file_content, file_path):
with open(file_path, 'w') as f:
f.write(file_content)
def get_all_microservice_files_with_content(folder_path):
file_name_to_content = {}
for filename in os.listdir(folder_path):
file_path = os.path.join(folder_path, filename)
if os.path.isfile(file_path):
with open(file_path, 'r', encoding='utf-8') as file:
content = file.read()
file_name_to_content[filename] = content
return file_name_to_content
class GenerationTimeoutError(Exception):
pass
@@ -43,3 +63,18 @@ def suppress_stdout():
finally:
sys.stdout.close()
sys.stdout = original_stdout
def is_docker_running():
try:
from hubble import __windows__
_client = docker.from_env()
# low-level client
_raw_client = APIClient(
base_url=docker.constants.DEFAULT_NPIPE
if __windows__
else docker.constants.DEFAULT_UNIX_SOCKET
)
except Exception:
return False
return True
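As a hedged alternative to the SDK-based check above, the same "is the Docker daemon reachable?" probe can be done by shelling out to the docker CLI; the function name `is_docker_running_cli` is ours, not part of this codebase:

```python
import subprocess

def is_docker_running_cli():
    # ask the docker CLI directly; any failure (missing binary, daemon down,
    # permission error) is treated as "not running"
    try:
        subprocess.run(['docker', 'info'], check=True,
                       stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    except (OSError, subprocess.CalledProcessError):
        return False
    return True

print(isinstance(is_docker_running_cli(), bool))  # True
```

This avoids the docker-py dependency at the cost of requiring the CLI on `PATH`.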

test/__init__.py Normal file
test/test_generator.py Normal file

@@ -0,0 +1,29 @@
import unittest.mock as mock
from src.options.generate.generator import Generator
from src.apis.gpt import GPTSession
def test_generator(tmpdir):
# Define a mock response
mock_response = {
"choices": [
{
"delta": {
"content": "This is a mock response."
}
}
]
}
# Define a function to replace openai.ChatCompletion.create
def mock_create(*args, **kwargs):
return [mock_response] * kwargs.get("stream", 1)
# Define a function to replace get_openai_api_key
def mock_get_openai_api_key(*args, **kwargs):
pass
# Use mock.patch as a context manager to replace the original methods with the mocks
with mock.patch("openai.ChatCompletion.create", side_effect=mock_create), \
mock.patch.object(GPTSession, "configure_openai_api_key", side_effect=mock_get_openai_api_key):
generator = Generator()
generator.generate("my description", "my test", str(tmpdir))