Avoid structural dependency on LLM APIs #54

@0xba1a

Description

The current implementation of src/microbots/llm/openai_api.py depends heavily on the structure of the LLM's response: it expects every response to follow llm_output_format. This adds overhead for custom bots, which must extend the current system prompt to include the specific llm_output_format.

We should converge on the OpenAI API's original structure. Function calls should arrive in the API's dedicated function-call key instead of forcing the LLM to wrap its natural-language output in JSON every time.
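A minimal sketch of the proposed direction, assuming the OpenAI Chat Completions response schema (`tool_calls` for function invocations, `content` for plain text). The helper `handle_message` is hypothetical, not part of the current codebase; it shows how dispatching on the dedicated field removes the need for a custom JSON envelope:

```python
import json

def handle_message(message: dict) -> dict:
    """Dispatch an OpenAI-style chat message without a custom JSON envelope.

    Tool invocations arrive in the dedicated `tool_calls` field, so a plain
    natural-language reply in `content` needs no parsing at all.
    (Sketch only; field names follow the OpenAI Chat Completions schema.)
    """
    if message.get("tool_calls"):
        call = message["tool_calls"][0]["function"]
        return {
            "kind": "function_call",
            "name": call["name"],
            # Per the schema, `arguments` is a JSON-encoded string.
            "arguments": json.loads(call["arguments"]),
        }
    return {"kind": "text", "text": message.get("content", "")}

# A tool-call message and a plain text message take separate paths:
tool_msg = {"tool_calls": [{"function": {"name": "run_cmd",
                                         "arguments": '{"cmd": "ls"}'}}]}
text_msg = {"content": "Done."}
print(handle_message(tool_msg))
print(handle_message(text_msg))
```

With this shape, custom bots would only describe their tools in the request; they would not need to extend the system prompt with llm_output_format at all.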
