Commit Graph

513 Commits

Author | SHA1 | Message | Date
Abi Raja
79459e6ebf ignore aider files 2023-11-14 15:10:18 -05:00
Abi Raja
649d04dfac Update README 2023-11-14 15:10:08 -05:00
Abi Raja
82fc80b3fc aider: Changed file writing operation to write into a new file in the run_logs directory for each run using the current timestamp as the filename.
# Aider chat conversation:

ASSISTANT: To achieve this, we need to modify the `backend/main.py` file. We will change the file writing operation to write into a new file in the `run_logs` directory for each run. We will use the current timestamp to create a unique filename for each run.

Here is the step-by-step plan:

1. Import the `datetime` and `os` modules.
2. Create a `run_logs` directory if it doesn't exist.
3. Generate a unique filename using the current timestamp.
4. Change the file writing operation to write into the new file in the `run_logs` directory.

Here is the *edit block* for these changes:

```python
backend/main.py
<<<<<<< HEAD
# Load environment variables first
from dotenv import load_dotenv

from prompts import assemble_prompt

load_dotenv()

from fastapi import FastAPI, WebSocket
from llm import stream_openai_response

app = FastAPI()

@app.websocket("/generate-code")
async def stream_code_test(websocket: WebSocket):
    await websocket.accept()

    result = await websocket.receive_json()

    async def process_chunk(content):
        await websocket.send_json({"type": "chunk", "value": content})

    messages = assemble_prompt("")
    print(messages)

    # Write the messages dict into a file for debugging
    with open("messages.json", "w") as f:
        f.write(str(messages))

    await stream_openai_response(
        messages,
        lambda x: process_chunk(x),
    )
=======
# Load environment variables first
from dotenv import load_dotenv
import os
from datetime import datetime

from prompts import assemble_prompt

load_dotenv()

from fastapi import FastAPI, WebSocket
from llm import stream_openai_response

app = FastAPI()

@app.websocket("/generate-code")
async def stream_code_test(websocket: WebSocket):
    await websocket.accept()

    result = await websocket.receive_json()

    async def process_chunk(content):
        await websocket.send_json({"type": "chunk", "value": content})

    messages = assemble_prompt("")
    print(messages)

    # Create run_logs directory if it doesn't exist
    if not os.path.exists('run_logs'):
        os.makedirs('run_logs')

    # Generate a unique filename using the current timestamp
    filename = datetime.now().strftime('run_logs/messages_%Y%m%d_%H%M%S.json')

    # Write the messages dict into a new file for each run
    with open(filename, "w") as f:
        f.write(str(messages))

    await stream_openai_response(
        messages,
        lambda x: process_chunk(x),
    )
>>>>>>> updated
```
2023-11-14 15:00:23 -05:00
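The change in the edit block above can be sketched as a self-contained helper. This is a hedged sketch, not the repository's actual code: the `log_messages` name is hypothetical, `os.makedirs(..., exist_ok=True)` replaces the explicit `os.path.exists()` check, and `json.dump` is used instead of `f.write(str(messages))`, since `str()` on a Python dict does not produce valid JSON despite the `.json` extension.

```python
import json
import os
from datetime import datetime


def log_messages(messages: list) -> str:
    """Write `messages` to a timestamped file under run_logs/ and return its path."""
    # exist_ok=True makes the directory creation idempotent, replacing the
    # explicit os.path.exists() check from the commit.
    os.makedirs("run_logs", exist_ok=True)

    # Unique filename per run; strftime passes non-format characters
    # (the path and extension) through unchanged.
    path = datetime.now().strftime("run_logs/messages_%Y%m%d_%H%M%S.json")

    # json.dump produces valid, re-loadable JSON, unlike str(messages).
    with open(path, "w") as f:
        json.dump(messages, f, indent=2)
    return path
```

Called as `log_messages([{"role": "system", "content": "..."}])`, it creates a file such as `run_logs/messages_20231114_150023.json`. Note that runs starting within the same second would collide on the same filename; a higher-resolution timestamp or a counter would avoid that.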
Abi Raja
4bb012922b Write messages dict into a file for debugging. 2023-11-14 15:00:22 -05:00
Abi Raja
54ddde2820 add a prompt library 2023-11-14 14:56:46 -05:00
Abi Raja
444c16abe6 show generated code in preview 2023-11-14 14:49:05 -05:00
Abi Raja
209929d6b3 minor refactor and typescript checker 2023-11-14 14:47:03 -05:00
Abi Raja
ce763782c4 add backend with openai calling 2023-11-14 14:40:17 -05:00
Abi Raja
53c6b13887 support to the backend via websockets and display progress to user 2023-11-14 13:50:42 -05:00
Abi Raja
169dd0a337 add a desktop preview 2023-11-14 13:36:34 -05:00
Abi Raja
5a8d7d9071 add a CodePreview element 2023-11-14 13:33:06 -05:00
Abi Raja
2bf4b62082 add a basic UI 2023-11-14 13:10:41 -05:00
Abi Raja
e8908ac782 Initial commit 2023-11-14 12:53:23 -05:00