Mirror of https://github.com/aljazceru/dev-gpt.git (synced 2025-12-18 14:14:21 +01:00)
fix: multiple things and 3d
README.md
@@ -103,51 +103,57 @@ jc delete <microservice id>

### Animal Detector
```bash
gptdeploy generate --description "Given an image, return the image with bounding boxes of all animals (https://pjreddie.com/media/files/yolov3.weights, https://raw.githubusercontent.com/pjreddie/darknet/master/cfg/yolov3.cfg)" --test "https://images.unsplash.com/photo-1444212477490-ca407925329e contains animals" --model gpt-4
gptdeploy generate --description "Given an image, return the image with bounding boxes of all animals (https://pjreddie.com/media/files/yolov3.weights, https://raw.githubusercontent.com/pjreddie/darknet/master/cfg/yolov3.cfg)" --test "https://images.unsplash.com/photo-1444212477490-ca407925329e contains animals" --model gpt-4 --path microservice
```
<img src="res/animal_detector_example.png" alt="Animal Detector" width="600" />

### Meme Generator
```bash
gptdeploy generate --description "Generate a meme from an image and a caption" --test "Surprised Pikachu: https://media.wired.com/photos/5f87340d114b38fa1f8339f9/master/w_1600%2Cc_limit/Ideas_Surprised_Pikachu_HD.jpg, TOP:When you discovered GPTDeploy" --model gpt-4
gptdeploy generate --description "Generate a meme from an image and a caption" --test "Surprised Pikachu: https://media.wired.com/photos/5f87340d114b38fa1f8339f9/master/w_1600%2Cc_limit/Ideas_Surprised_Pikachu_HD.jpg, TOP:When you discovered GPTDeploy" --model gpt-4 --path microservice
```
<img src="res/meme_example.png" alt="Meme Generator" width="600" />

### Rhyme Generator
```bash
gptdeploy generate --description "Given a word, return a list of rhyming words using the datamuse api" --test "hello" --model gpt-4
gptdeploy generate --description "Given a word, return a list of rhyming words using the datamuse api" --test "hello" --model gpt-4 --path microservice
```
<img src="res/rhyme_generator_example.png" alt="Rhyme Generator" width="600" />

### Word Cloud Generator
```bash
gptdeploy generate --description "Generate a word cloud from a given text" --test "Lorem ipsum dolor sit amet, consectetur adipiscing elit." --model gpt-4
gptdeploy generate --description "Generate a word cloud from a given text" --test "Lorem ipsum dolor sit amet, consectetur adipiscing elit." --model gpt-4 --path microservice
```
<img src="res/word_cloud_example.png" alt="Word Cloud Generator" width="600" />

### 3d model info
```bash
gptdeploy generate --description "Given a 3d object, return vertex count and face count" --test "https://raw.githubusercontent.com/polygonjs/polygonjs-assets/master/models/wolf.obj" --model gpt-4
gptdeploy generate --description "Given a 3d object, return vertex count and face count" --test "https://raw.githubusercontent.com/polygonjs/polygonjs-assets/master/models/wolf.obj" --model gpt-4 --path microservice
```
<img src="res/obj_info_example.png" alt="3D Model Info" width="600" />

### 2d rendering of 3d model
```bash
gptdeploy generate --description "create a 2d rendering of a whole 3d object and x,y,z object rotation using trimesh and pyrender.OffscreenRenderer with os.environ['PYOPENGL_PLATFORM'] = 'egl' and freeglut3-dev library" --test "input: https://graphics.stanford.edu/courses/cs148-10-summer/as3/code/as3/teapot.obj output: assert the image is not completely white or black" --model gpt-4 --path microservice
```
<img src="res/obj_render_example.gif" alt="2D Rendering of 3D Model" width="600" />

### Table extraction
```bash
gptdeploy generate --description "Given a URL, extract all tables as csv" --test "http://www.ins.tn/statistiques/90" --model gpt-4
gptdeploy generate --description "Given a URL, extract all tables as csv" --test "http://www.ins.tn/statistiques/90" --model gpt-4 --path microservice
```
<img src="res/table_extraction_example.png" alt="Table Extraction" width="600" />

### Audio to mel spectrogram
```bash
gptdeploy generate --description "Create mel spectrograms from audio file" --test "https://cdn.pixabay.com/download/audio/2023/02/28/audio_550d815fa5.mp3" --model gpt-4
gptdeploy generate --description "Create mel spectrograms from audio file" --test "https://cdn.pixabay.com/download/audio/2023/02/28/audio_550d815fa5.mp3" --model gpt-4 --path microservice
```
<img src="res/audio_to_mel_example.png" alt="Audio to Mel Spectrogram" width="600" />

### Text to speech
```bash
gptdeploy generate --description "Convert text to speech" --test "Hello, welcome to GPT Deploy!" --model gpt-4
gptdeploy generate --description "Convert text to speech" --test "Hello, welcome to GPT Deploy!" --model gpt-4 --path microservice
```
<a href=res/text_to_speech_example.wav><img src="res/text_to_speech_example.png" alt="Text to Speech" width="600" /></a>
@@ -158,20 +164,20 @@ gptdeploy generate --description "Convert text to speech" --test "Hello, welcome

### Heatmap Generator
```bash
gptdeploy generate --description "Create a heatmap from an image and a list of relative coordinates" --test "https://images.unsplash.com/photo-1574786198875-49f5d09fe2d2, [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6], [0.2, 0.1], [0.7, 0.2], [0.4, 0.2]]" --model gpt-4
gptdeploy generate --description "Create a heatmap from an image and a list of relative coordinates" --test "https://images.unsplash.com/photo-1574786198875-49f5d09fe2d2, [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6], [0.2, 0.1], [0.7, 0.2], [0.4, 0.2]]" --model gpt-4 --path microservice
```
<img src="res/heatmap_example.png" alt="Heatmap Generator" width="600" />

### QR Code Generator
```bash
gptdeploy generate --description "Generate QR code from URL" --test "https://www.example.com" --model gpt-4
gptdeploy generate --description "Generate QR code from URL" --test "https://www.example.com" --model gpt-4 --path microservice
```
<img src="res/qr_example.png" alt="QR Code Generator" width="600" />

### Mandelbrot Set Visualizer
```bash
gptdeploy generate --description "Visualize the Mandelbrot set with custom parameters" --test "center=-0+1i, zoom=1.0, size=800x800, iterations=1000" --model gpt-4
gptdeploy generate --description "Visualize the Mandelbrot set with custom parameters" --test "center=-0+1i, zoom=1.0, size=800x800, iterations=1000" --model gpt-4 --path microservice
```
<img src="res/mandelbrot_example.png" alt="Mandelbrot Set Visualizer" width="600" />
res/obj_render_example.gif (new binary file, 898 KiB, not shown)
@@ -14,7 +14,9 @@ from src.utils.string_tools import print_colored

class GPTSession:
def __init__(self, model: str = 'gpt-4'):
def __init__(self, task_description, test_description, model: str = 'gpt-4', ):
self.task_description = task_description
self.test_description = test_description
self.configure_openai_api_key()
if model == 'gpt-4' and self.is_gpt4_available():
self.supported_model = 'gpt-4'
@@ -68,18 +70,18 @@ If you have updated it already, please restart your terminal.
print('\n')

def get_conversation(self, system_definition_examples: List[str] = ['executor', 'docarray', 'client']):
return _GPTConversation(self.supported_model, self.cost_callback, system_definition_examples)
return _GPTConversation(self.supported_model, self.cost_callback, self.task_description, self.test_description, system_definition_examples)
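For orientation, a minimal sketch of the call pattern after this change (hypothetical task and test strings): the session is created once with the task and test description, and every conversation it hands out is seeded with them.

```python
# hedged sketch, assuming the GPTSession and _GPTConversation defined in this diff
session = GPTSession('Given a word, return rhyming words', "the word 'hello' returns rhymes", model='gpt-4')
conversation = session.get_conversation()
reply = conversation.query('Write microservice.py for the task above.')
```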

def calculate_money_spent(self, num_chars, price):
return round(num_chars / CHARS_PER_TOKEN * price / 1000, 3)
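A quick worked example of the cost estimate; the CHARS_PER_TOKEN and price values below are assumptions for illustration, not the repository's actual constants.

```python
CHARS_PER_TOKEN = 4          # assumed approximation, the real value comes from src.constants
price_per_1k_tokens = 0.03   # hypothetical price per 1,000 tokens

def calculate_money_spent(num_chars, price):
    # same formula as above: characters -> tokens -> cost per 1,000 tokens
    return round(num_chars / CHARS_PER_TOKEN * price / 1000, 3)

# 20,000 characters ~= 5,000 tokens -> 5 * 0.03 = 0.15
print(calculate_money_spent(20_000, price_per_1k_tokens))  # 0.15
```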

class _GPTConversation:
def __init__(self, model: str, cost_callback, system_definition_examples: List[str] = ['executor', 'docarray', 'client']):
def __init__(self, model: str, cost_callback, task_description, test_description, system_definition_examples: List[str] = ['executor', 'docarray', 'client']):
self.model = model
self.cost_callback = cost_callback
self.prompt_list: List[Optional[Tuple]] = [None]
self.set_system_definition(system_definition_examples)
self.set_system_definition(task_description, test_description, system_definition_examples)
if os.environ['VERBOSE'].lower() == 'true':
print_colored('system', self.prompt_list[0][1], 'magenta')
@@ -91,8 +93,8 @@ class _GPTConversation:
self.prompt_list.append(('assistant', response))
return response

def set_system_definition(self, system_definition_examples: List[str] = []):
system_message = system_base_definition
def set_system_definition(self, task_description, test_description, system_definition_examples: List[str] = []):
system_message = system_base_definition(task_description, test_description)
if 'executor' in system_definition_examples:
system_message += f'\n{executor_example}'
if 'docarray' in system_definition_examples:
@@ -248,10 +248,11 @@ def update_client_line_in_file(file_path, host):
file.write(replaced_content)

def remove_after_stderr(relevant_lines):
def shorten_logs(relevant_lines):
for index, line in enumerate(relevant_lines):
if '--- Captured stderr call ----' in line:
return relevant_lines[:index]
relevant_lines = relevant_lines[:index]
relevant_lines = [line for line in relevant_lines if ' Requirement already satisfied: ' not in line]
return relevant_lines
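To illustrate what the renamed helper does, a small sketch with made-up log lines: everything after the captured-stderr marker is cut off and pip's "Requirement already satisfied" noise is dropped.

```python
def shorten_logs(relevant_lines):
    # same logic as above: truncate at the stderr marker, then filter pip noise
    for index, line in enumerate(relevant_lines):
        if '--- Captured stderr call ----' in line:
            relevant_lines = relevant_lines[:index]
    return [line for line in relevant_lines if ' Requirement already satisfied: ' not in line]

log_lines = [
    'Step 5/7 : RUN pip install -r requirements.txt',
    ' Requirement already satisfied: numpy in /usr/local/lib/python3.9/site-packages',
    'ERROR: No matching distribution found for some-package==99.0',
    '----------------------------- Captured stderr call -----------------------------',
    'Traceback (most recent call last): ...',
]
print(shorten_logs(log_lines))
# ['Step 5/7 : RUN pip install -r requirements.txt',
#  'ERROR: No matching distribution found for some-package==99.0']
```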

@@ -270,9 +271,9 @@ def process_error_message(error_message):
if last_matching_line_index is not None:
relevant_lines = lines[last_matching_line_index:]

relevant_lines = remove_after_stderr(relevant_lines)
relevant_lines = shorten_logs(relevant_lines)

response = '\n'.join(relevant_lines[-25:]).strip()
response = '\n'.join(relevant_lines[-100:]).strip()

# the following code tests the case that the docker file is corrupted and can not be parsed
# the method above will not return a relevant error message in this case
@@ -282,21 +283,3 @@ def process_error_message(error_message):
if not response and last_line.startswith('error: '):
return last_line
return response

def build_docker(path):
# The command to build the Docker image
cmd = f"docker build -t micromagic {path}"

# Run the command and capture the output
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
stdout, stderr = process.communicate()

# Check if there was an error
if process.returncode != 0:
error_message = stderr.decode("utf-8")
relevant_error_message = process_error_message(error_message)
return relevant_error_message
else:
print("Docker build completed successfully.")
return ''
@@ -63,8 +63,8 @@ def generate(
return

from src.options.generate.generator import Generator
generator = Generator(model=model)
generator.generate(description, test, path)
generator = Generator(description, test, model=model)
generator.generate(path)

@main.command()
@path_param

@@ -36,5 +36,6 @@ MAX_DEBUGGING_ITERATIONS = 10
DEMO_TOKEN = '45372338e04f5a41af949024db929d46'

PROBLEMATIC_PACKAGES = [
'Pyrender', 'Trimesh', 'ModernGL', 'PyOpenGL', 'Pyglet', 'pythreejs', 'panda3d' # because they need a screen
# 'Pyrender', 'Trimesh',
'ModernGL', 'PyOpenGL', 'Pyglet', 'pythreejs', 'panda3d' # because they need a screen
]

@@ -1,13 +1,19 @@
import os

def get_latest_folder(path):
return max([os.path.join(path, f) for f in os.listdir(path) if os.path.isdir(os.path.join(path, f))])
def get_latest_folder(path, max_fn=max):
return max_fn([os.path.join(path, f) for f in os.listdir(path) if os.path.isdir(os.path.join(path, f))])

def version_max_fn(path_list):
version_list = [int(path.split('/')[-1].replace('v', '')) for path in path_list]
max_version = max(version_list)
max_index = version_list.index(max_version)
return path_list[max_index]

def get_latest_version_path(microservice_path):
executor_name_path = get_latest_folder(microservice_path)
latest_approach_path = get_latest_folder(executor_name_path)
latest_version_path = get_latest_folder(latest_approach_path)
latest_version_path = get_latest_folder(latest_approach_path, max_fn=version_max_fn)
return latest_version_path
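The reason for the new max_fn hook: plain max compares folder names as strings, so 'v9' would beat 'v10'. A minimal sketch of the difference, with hypothetical paths:

```python
def version_max_fn(path_list):
    # numeric comparison of the trailing v<N> folder name, as defined above
    version_list = [int(path.split('/')[-1].replace('v', '')) for path in path_list]
    return path_list[version_list.index(max(version_list))]

paths = ['approach/v1', 'approach/v2', 'approach/v9', 'approach/v10']
print(max(paths))             # 'approach/v9'  (lexicographic, the wrong folder)
print(version_max_fn(paths))  # 'approach/v10' (numerically latest version)
```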

def get_executor_name(microservice_path):

@@ -14,22 +14,23 @@ from src.utils.string_tools import print_colored

class Generator:
def __init__(self, model='gpt-4'):
self.gpt_session = gpt.GPTSession(model=model)
def __init__(self, task_description, test_description, model='gpt-4'):
self.gpt_session = gpt.GPTSession(task_description, test_description, model=model)
self.task_description = task_description
self.test_description = test_description

def extract_content_from_result(self, plain_text, file_name, match_single_block=False):
pattern = fr"^\*\*{file_name}\*\*\n```(?:\w+\n)?([\s\S]*?)```"
match = re.search(pattern, plain_text, re.MULTILINE)
if match:
return match.group(1).strip()
else:
elif match_single_block:
# Check for a single code block
single_code_block_pattern = r"^```(?:\w+\n)?([\s\S]*?)```"
single_code_block_match = re.findall(single_code_block_pattern, plain_text, re.MULTILINE)
if match_single_block and len(single_code_block_match) == 1:
if len(single_code_block_match) == 1:
return single_code_block_match[0].strip()
else:
return ''
return ''
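For reference, the shape of assistant output this extraction expects: a **<file name>** header followed by a fenced code block. A hedged sketch with a hypothetical reply:

````python
import re

def extract(plain_text, file_name):
    # same header-plus-fence pattern as extract_content_from_result above
    pattern = fr"^\*\*{file_name}\*\*\n```(?:\w+\n)?([\s\S]*?)```"
    match = re.search(pattern, plain_text, re.MULTILINE)
    return match.group(1).strip() if match else ''

reply = "**microservice.py**\n```python\nprint('hello world')\n```"
print(extract(reply, 'microservice.py'))  # print('hello world')
````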

def write_config_yml(self, microservice_name, dest_folder):
config_content = f'''
@@ -61,24 +62,19 @@ class Generator:
test,
path,
microservice_name,
package,
packages,
num_approach,
is_chain_of_thought=False,
):
MICROSERVICE_FOLDER_v1 = get_microservice_path(path, microservice_name, package, num_approach, 1)
MICROSERVICE_FOLDER_v1 = get_microservice_path(path, microservice_name, packages, num_approach, 1)
os.makedirs(MICROSERVICE_FOLDER_v1)

print_colored('', '############# Microservice #############', 'blue')
user_query = (
general_guidelines()
+ executor_file_task(microservice_name, description, test, package)
+ executor_file_task(microservice_name, description, test, packages)
)
conversation = self.gpt_session.get_conversation()
microservice_content_raw = conversation.query(user_query)
if is_chain_of_thought:
microservice_content_raw = conversation.query(
f"General rules: " + not_allowed_executor() + chain_of_thought_optimization('python',
'microservice.py'))
microservice_content = self.extract_content_from_result(microservice_content_raw, 'microservice.py',
match_single_block=True)
if microservice_content == '':
@@ -96,12 +92,6 @@ class Generator:
)
conversation = self.gpt_session.get_conversation()
test_microservice_content_raw = conversation.query(user_query)
if is_chain_of_thought:
test_microservice_content_raw = conversation.query(
f"General rules: " + not_allowed_executor() +
chain_of_thought_optimization('python', 'test_microservice.py')
+ "Don't add any additional tests. "
)
test_microservice_content = self.extract_content_from_result(
test_microservice_content_raw, 'microservice.py', match_single_block=True
)
@@ -117,9 +107,6 @@ class Generator:
)
conversation = self.gpt_session.get_conversation()
requirements_content_raw = conversation.query(user_query)
if is_chain_of_thought:
requirements_content_raw = conversation.query(
chain_of_thought_optimization('', requirements_path) + "Keep the same version of jina ")

requirements_content = self.extract_content_from_result(requirements_content_raw, 'requirements.txt',
match_single_block=True)
@@ -135,9 +122,6 @@ class Generator:
)
conversation = self.gpt_session.get_conversation()
dockerfile_content_raw = conversation.query(user_query)
if is_chain_of_thought:
dockerfile_content_raw = conversation.query(
f"General rules: " + not_allowed_executor() + chain_of_thought_optimization('dockerfile', 'Dockerfile'))
dockerfile_content = self.extract_content_from_result(dockerfile_content_raw, 'Dockerfile',
match_single_block=True)
persist_file(dockerfile_content, os.path.join(MICROSERVICE_FOLDER_v1, 'Dockerfile'))
@@ -167,7 +151,7 @@ print(response[0].text) # can also be blob in case of image/audio..., this shoul
```
Note that the response will always be in response[0].text
You must provide the complete file with the exact same syntax to wrap the code.
The playground (app.py) must read the host from sys.argv because it will be started with a custom host: streamlit run app.py -- --host grpc://...
The playground (app.py) must read the host from sys.argv[-1] because it will be started with a custom host: streamlit run app.py -- --host grpc://...
The playground (app.py) must not let the user configure the host on the ui.
'''
)
@@ -178,7 +162,6 @@ The playground (app.py) must not let the user configure the host on the ui.
persist_file(playground_content, os.path.join(microservice_path, 'app.py'))

def debug_microservice(self, path, microservice_name, num_approach, packages, description, test):
error_before = ''
for i in range(1, MAX_DEBUGGING_ITERATIONS):
print('Debugging iteration', i)
print('Trying to build the microservice. Might take a while...')
@@ -188,29 +171,31 @@ The playground (app.py) must not let the user configure the host on the ui.
error = process_error_message(log_hubble)
if error:
print('An error occurred during the build process. Feeding the error back to the assistent...')
self.do_debug_iteration(description, error, error_before, next_microservice_path,
self.do_debug_iteration(description, error, next_microservice_path,
previous_microservice_path, test)
error_before = error
if i == MAX_DEBUGGING_ITERATIONS - 1:
raise self.MaxDebugTimeReachedException('Could not debug the microservice.')
else:
print('Successfully build microservice.')
break
if i == MAX_DEBUGGING_ITERATIONS - 1:
raise self.MaxDebugTimeReachedException('Could not debug the microservice.')

return get_microservice_path(path, microservice_name, packages, num_approach, i)

def do_debug_iteration(self, description, error, error_before, next_microservice_path, previous_microservice_path,
def do_debug_iteration(self, description, error, next_microservice_path, previous_microservice_path,
test):
os.makedirs(next_microservice_path)
file_name_to_content = get_all_microservice_files_with_content(previous_microservice_path)

summarized_error = self.summarize_error(error)
is_dependency_issue = self.is_dependency_issue(error, file_name_to_content['Dockerfile'])
if is_dependency_issue:
all_files_string = self.files_to_string({
key: val for key, val in file_name_to_content.items() if
key in ['requirements.txt', 'Dockerfile']
})
user_query = self.get_user_query_dependency_issue(all_files_string, error)
user_query = self.get_user_query_dependency_issue(all_files_string, summarized_error)
else:
user_query = self.get_user_query_code_issue(description, error, file_name_to_content,
user_query = self.get_user_query_code_issue(description, summarized_error, file_name_to_content,
test)
conversation = self.gpt_session.get_conversation()
returned_files_raw = conversation.query(user_query)
@@ -218,33 +203,50 @@ The playground (app.py) must not let the user configure the host on the ui.
updated_file = self.extract_content_from_result(returned_files_raw, file_name)
if updated_file and (not is_dependency_issue or file_name in ['requirements.txt', 'Dockerfile']):
file_name_to_content[file_name] = updated_file
print(f'Updated {file_name}')
for file_name, content in file_name_to_content.items():
persist_file(content, os.path.join(next_microservice_path, file_name))

def get_user_query_dependency_issue(self, all_files_string, error):
def get_user_query_dependency_issue(self, all_files_string, summarized_error):
user_query = (
f'''
Your task is to provide guidance on how to solve an error that occurred during the Docker build process.
The error message is:
**microservice.log**
```
{error}
```
Here is the summary of the error that occurred:
{summarized_error}

To solve this error, you should:
1. Identify the type of error by examining the stack trace.
2. Suggest how to solve it.
3. Your suggestion must include the files that need to be changed, but not files that don't need to be changed.
1. Suggest 3 to 5 possible solutions on how to solve it. You have no access to the documentation of the package.
2. Decide for the best solution and explain it in detail.
3. Write down the files that need to be changed, but not files that don't need to be changed.
For files that need to be changed, you must provide the complete file with the exact same syntax to wrap the code.
Obey the following rules: {not_allowed_docker()}

You are given the following files:

{all_files_string}"

Output all the files that need change.
Don't output files that don't need change. If you output a file, then write the
complete file. Use the exact following syntax to wrap the code:

**...**
```
...code...
```

Example:

**requirements.txt**
```
jina==2.0.0
```

'''
)
return user_query
def get_user_query_code_issue(self, description, error, file_name_to_content, test):
def get_user_query_code_issue(self, description, summarized_error, file_name_to_content, test):
all_files_string = self.files_to_string(file_name_to_content)
return f'''
General rules: {not_allowed_executor()}
@@ -256,18 +258,34 @@ Here are all the files I use:
{all_files_string}

This is the error I encounter currently during the docker build process:
{error}
Here is the summary of the error that occurred:
{summarized_error}

Look at the stack trace of the current error. First, think about what kind of error is this?
Then think about possible reasons which might have caused it. Then suggest how to
solve it. Output all the files that need change.
To solve this error, you should:
1. Suggest 3 to 5 possible solutions on how to solve it. You have no access to the documentation of the package.
2. Decide for the best solution and explain it in detail.
3. Write down the files that need to be changed, but not files that don't need to be changed.
Obey the following rules:
{not_allowed_executor()}
{not_allowed_docker()}

Output all the files that need change.
Don't output files that don't need change. If you output a file, then write the
complete file. Use the exact same syntax to wrap the code:
complete file. Use the exact following syntax to wrap the code:

**...**
```...
...code...
```

Example:

**microservice.py**
```python
print('hello world')
```

'''
class MaxDebugTimeReachedException(BaseException):
@@ -332,32 +350,32 @@ If the package is mentioned in the description, then it is automatically the bes
The output must be a list of lists wrapped into ``` and starting with **packages.csv** like this:
**packages.csv**
```
package1
package2
package3
package4
package5
package1a, package1b ...
package2a, package2b, package2c
package3a ...
package4a ...
package5a ...
...
```
'''
conversation = self.gpt_session.get_conversation()
packages_raw = conversation.query(user_query)
packages_csv_string = self.extract_content_from_result(packages_raw, 'packages.csv')
packages = [package.split(',') for package in packages_csv_string.split('\n')]
packages = packages[:NUM_IMPLEMENTATION_STRATEGIES]
return packages
packages_list = [[pkg.strip() for pkg in packages_string.split(',')] for packages_string in packages_csv_string.split('\n')]
packages_list = packages_list[:NUM_IMPLEMENTATION_STRATEGIES]
return packages_list
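A short illustration of the new parsing: each line of packages.csv is one implementation strategy, split on commas and stripped into a clean list of package names (the NUM_IMPLEMENTATION_STRATEGIES value below is assumed).

```python
NUM_IMPLEMENTATION_STRATEGIES = 5  # assumed value, the real one lives in src.constants

packages_csv_string = 'pillow, requests\nopencv-python, numpy\nwordcloud'
packages_list = [[pkg.strip() for pkg in packages_string.split(',')]
                 for packages_string in packages_csv_string.split('\n')]
packages_list = packages_list[:NUM_IMPLEMENTATION_STRATEGIES]
print(packages_list)  # [['pillow', 'requests'], ['opencv-python', 'numpy'], ['wordcloud']]
```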

def generate(self, description, test, microservice_path):
generated_name = self.generate_microservice_name(description)
def generate(self, microservice_path):
generated_name = self.generate_microservice_name(self.task_description)
microservice_name = f'{generated_name}{random.randint(0, 10_000_000)}'
packages_list = self.get_possible_packages(description)
packages_list = self.get_possible_packages(self.task_description)
packages_list = [packages for packages in packages_list if len(set(packages).intersection(set(PROBLEMATIC_PACKAGES))) == 0]
for num_approach, packages in enumerate(packages_list):
try:
self.generate_microservice(description, test, microservice_path, microservice_name, packages,
self.generate_microservice(self.task_description, self.test_description, microservice_path, microservice_name, packages,
num_approach)
final_version_path = self.debug_microservice(microservice_path, microservice_name, num_approach,
packages, description, test)
packages, self.task_description, self.test_description)
self.generate_playground(microservice_name, final_version_path)
except self.MaxDebugTimeReachedException:
print('Could not debug the Microservice with the approach:', packages)
@@ -373,3 +391,14 @@ gptdeploy deploy --path {microservice_path}
'''
)
break

def summarize_error(self, error):
conversation = self.gpt_session.get_conversation([])
user_query = f'''
Here is an error message I encountered during the docker build process:
"{error}"
Your task is to summarize the error message as compact and informative as possible while maintaining all information necessary to debug the core issue.
Warnings are not worth mentioning.
'''
error_summary = conversation.query(user_query)
return error_summary
@@ -1,4 +1,5 @@
from src.constants import FLOW_URL_PLACEHOLDER
from src.options.generate.prompt_tasks import not_allowed_executor, not_allowed_docker

executor_example = '''Using the Jina framework, users can define executors.
Here is an example of how an executor can be defined. It always starts with a comment:
@@ -16,7 +17,7 @@ class MyInfoExecutor(Executor):
for d in docs:
content = json.loads(d.text)
...
d.text = json.dumps(modified_content)
d.text = json.dumps(modified_content) # serialized json
return docs
```

@@ -28,7 +29,7 @@ A Document is a python class that represents a single document.
Here is the protobuf definition of a Document:
```
message DocumentProto {{
// used to store json data the executor gets and returns
// used to store serialized json data the executor gets and returns
string text = 1;
}}
```
@@ -71,8 +72,22 @@ print(response[0].text)
```'''

system_base_definition = f'''
def system_base_definition(task_description, test_description):
return f'''
It is the year 2021.
You are a principal engineer working at Jina - an open source company.
You are a principal engineer working at Jina - an open source company.
You accurately satisfy all of the user's requirements.
To be more specific, you help the user to build a microservice with the following requirements:
```
{task_description}
```
and the following test scenario:
```
{test_description}
```

You must obey the following rules:
{not_allowed_executor()}
{not_allowed_docker()}
'''
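Since the system definition is now built per request instead of being a module-level constant, it is called with the concrete task and test, for example (hypothetical arguments):

```python
# hedged sketch: the system prompt is assembled per microservice request
system_message = system_base_definition(
    'Given a word, return a list of rhyming words',
    "the word 'hello' returns a list of rhymes",
)
print(system_message)
```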

@@ -4,17 +4,22 @@ from src.constants import EXECUTOR_FILE_NAME, REQUIREMENTS_FILE_NAME, TEST_EXECU

def general_guidelines():
return (
"The code you write is production ready. "
"Every file starts with comments describing what the code is doing before the first import. "
"Comments can only be written within code blocks. "
"Then all imports are listed. "
"It is important to import all modules that could be needed in the Executor code. "
"Always import: "
"from jina import Executor, DocumentArray, Document, requests "
"Start from top-level and then fully implement all methods. "
"\n"
)
return '''
The code you write is production ready.
Every file starts with comments describing what the code is doing before the first import.
Comments can only be written within code blocks.
Then all imports are listed.
It is important to import all modules that could be needed in the Executor code.
Always import:
from jina import Executor, DocumentArray, Document, requests
import json
from io import BytesIO
import requests as req

Start from top-level and then fully implement all methods.

'''

def _task(task, tag_name, file_name, purpose=None):
@@ -44,7 +49,7 @@ Have in mind that d.uri is never a path to a local file. It is always a url.
{not_allowed_executor()}
Your approach:
1. Identify the core challenge when implementing the executor.
2. Think about solutions for these challenges.
2. Think about possible solutions for these challenges.
3. Decide for one of the solutions.
4. Write the code.
''', EXECUTOR_FILE_TAG, EXECUTOR_FILE_NAME)

@@ -9,8 +9,8 @@ import docker
from docker import APIClient

def get_microservice_path(path, microservice_name, package, num_approach, version):
package_path = '_'.join(package)
def get_microservice_path(path, microservice_name, packages, num_approach, version):
package_path = '_'.join(packages)
return os.path.join(path, microservice_name, f'{num_approach}_{package_path}', f'v{version}')
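For orientation, the on-disk layout this helper produces (hypothetical arguments); it is the same tree that get_latest_version_path later walks with version_max_fn.

```python
import os

def get_microservice_path(path, microservice_name, packages, num_approach, version):
    # identical to the helper above, repeated here so the snippet runs standalone
    package_path = '_'.join(packages)
    return os.path.join(path, microservice_name, f'{num_approach}_{package_path}', f'v{version}')

print(get_microservice_path('microservice', 'RhymeGenerator42', ['requests'], 0, 1))
# microservice/RhymeGenerator42/0_requests/v1
```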

def persist_file(file_content, file_path):

@@ -25,5 +25,5 @@ def test_generator(tmpdir):
# Use mock.patch as a context manager to replace the original methods with the mocks
with mock.patch("openai.ChatCompletion.create", side_effect=mock_create), \
mock.patch.object(GPTSession, "configure_openai_api_key", side_effect=mock_get_openai_api_key):
generator = Generator()
generator.generate("my description", "my test", str(tmpdir))
generator = Generator("my description", "my test")
generator.generate(str(tmpdir))