[Issue]: Implementing Conversational Capabilities in UserProxyAgent #3092

Open
wbw625 opened this issue Jul 8, 2024 · 0 comments
Comments

wbw625 commented Jul 8, 2024

Describe the issue

I am currently working with the ConversableAgent and UserProxyAgent classes, using the local model Mistral-7B-OpenOrca for my implementation. I noticed that the ConversableAgent can perform Q&A, while the UserProxyAgent primarily executes code.

I want to extend the functionality of UserProxyAgent so that it can also perform Q&A similar to ConversableAgent. Here is the current setup I am using:

import autogen
from autogen import ConversableAgent, UserProxyAgent

# Load the model configuration from the OAI_CONFIG_LIST file (shown below)
config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

assistant = ConversableAgent("agent", llm_config={"config_list": config_list})

user_proxy = UserProxyAgent(
    "user_proxy",
    code_execution_config={"work_dir": "coding", "use_docker": False},
    # llm_config={"config_list": config_list}
    llm_config=False,
)

# CustomModelClient is my custom model client class for Mistral-7B-OpenOrca
assistant.register_model_client(model_client_cls=CustomModelClient)
# user_proxy.register_model_client(model_client_cls=CustomModelClient)
user_proxy.initiate_chat(assistant, message="Plot a chart of NVDA and TESLA stock price change YTD.")

My question is: how can I make UserProxyAgent capable of Q&A as well, rather than only executing code?
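For reference, the direction I have been considering is the one hinted at by the commented-out lines in my snippet: give user_proxy its own llm_config and register the custom client on it too. This is only a sketch of what I mean, assuming UserProxyAgent accepts llm_config the same way ConversableAgent does and that CustomModelClient is my existing custom client class:

```python
# Sketch only: assumes UserProxyAgent takes llm_config like ConversableAgent,
# and CustomModelClient is my existing custom model client class.
user_proxy = UserProxyAgent(
    "user_proxy",
    code_execution_config={"work_dir": "coding", "use_docker": False},
    llm_config={"config_list": config_list},  # enable LLM replies instead of llm_config=False
    human_input_mode="NEVER",                 # reply automatically instead of prompting me
)
user_proxy.register_model_client(model_client_cls=CustomModelClient)
```

I am unsure whether this is the intended way to do it, or whether code execution and LLM replies can coexist cleanly in one agent.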

Thank you.

Steps to reproduce

No response

Screenshots and logs

No response

Additional Information

The configuration file OAI_CONFIG_LIST is as follows:

[
    {
        "model": "Open-Orca/Mistral-7B-OpenOrca",
        "model_client_cls": "CustomModelClient",
        "device": "cuda:3",
        "n": 1,
        "params": {
            "max_length": 1000
        }
    }
]