
TypeError: Missing required arguments; Expected either ('messages' and 'model') or ('messages', 'model' and 'stream') arguments to be given #4987

Open
mwarqee opened this issue Jan 10, 2025 · 1 comment

mwarqee commented Jan 10, 2025

I am currently testing the Society of Mind notebook and encountering the issue mentioned in the title.
One important difference with respect to the original notebook: instead of loading the credentials from a JSON file, this is done via a .env file.

```python
config_file_or_env = ".env"
llm_config = {"temperature": 0, "timeout": 600, "cache_seed": 44}
config_list = autogen.config_list_from_dotenv(
    config_file_or_env,
    model_api_key_map={"gpt-4o-mini": "OPENAI_API_KEY"},
    filter_dict={"model": ["gpt-4o-mini"]},
)
```

.env file content:

```
OPENAI_API_KEY="ewohizhf98328uiefujkc3"
endpoint="https://loremipsum.openai.azure.com"
model="gpt-4o-mini"
api_type="azure"
api_version="2024-08-01-preview"
```
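For what it's worth: as far as I can tell, `config_list_from_dotenv` only uses the .env file to resolve API keys (via `model_api_key_map`), so the Azure-specific fields (`endpoint`, `api_type`, `api_version`) are never picked up and the resulting entries target the OpenAI-hosted endpoint. A minimal sketch of building the Azure config entry by hand instead (the .env values here are the placeholders from this report; in recent autogen versions the endpoint field is named `base_url`, while older releases used `api_base`):

```python
import os

# Placeholder values, assigned here so the sketch is self-contained;
# in practice python-dotenv's load_dotenv() would populate os.environ
# from the .env file shown above.
os.environ["OPENAI_API_KEY"] = "ewohizhf98328uiefujkc3"
os.environ["endpoint"] = "https://loremipsum.openai.azure.com"
os.environ["model"] = "gpt-4o-mini"
os.environ["api_type"] = "azure"
os.environ["api_version"] = "2024-08-01-preview"

# Build the Azure entry explicitly: the Azure-specific fields have to be
# supplied per config entry, not just the API key.
config_list = [
    {
        "model": os.environ["model"],
        "api_key": os.environ["OPENAI_API_KEY"],
        "base_url": os.environ["endpoint"],
        "api_type": os.environ["api_type"],
        "api_version": os.environ["api_version"],
    }
]

llm_config = {
    "temperature": 0,
    "timeout": 600,
    "cache_seed": 44,
    "config_list": config_list,
}
```

This is a sketch under the assumption that the agent is then constructed with `llm_config=llm_config`, so every completion call carries a `model`.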

I get the following error:

```
[autogen.oai.client: 01-10 09:22:29] {250} WARNING - The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.
WARNING:autogen.oai.client:The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.
user_proxy (to society_of_mind):

On which days in 2024 was Microsoft Stock higher than $370?

--------------------------------------------------------------------------------

>>>>>>>> USING AUTO REPLY...
society_of_mind (to chat_manager):

On which days in 2024 was Microsoft Stock higher than $370?

--------------------------------------------------------------------------------

Next speaker: inner-assistant

Traceback (most recent call last):
  File "c:\Projects\AutoGenStudio\.vcagent\Lib\site-packages\autogen\agentchat\contrib\society_of_mind_agent.py", line 198, in generate_inner_monologue_reply
    self.initiate_chat(self.chat_manager, message=messages[-1], clear_history=False)
  File "c:\Projects\AutoGenStudio\.vcagent\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 1117, in initiate_chat
    self.send(msg2send, recipient, silent=silent)
  File "c:\Projects\AutoGenStudio\.vcagent\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 807, in send
    recipient.receive(message, self, request_reply, silent)
  File "c:\Projects\AutoGenStudio\.vcagent\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 917, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Projects\AutoGenStudio\.vcagent\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 2065, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Projects\AutoGenStudio\.vcagent\Lib\site-packages\autogen\agentchat\groupchat.py", line 1184, in run_chat
    reply = speaker.generate_reply(sender=self)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Projects\AutoGenStudio\.vcagent\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 2065, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Projects\AutoGenStudio\.vcagent\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 1436, in generate_oai_reply
    extracted_response = self._generate_oai_reply_from_client(
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Projects\AutoGenStudio\.vcagent\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 1455, in _generate_oai_reply_from_client
    response = llm_client.create(
               ^^^^^^^^^^^^^^^^^^
...
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Projects\AutoGenStudio\.vcagent\Lib\site-packages\openai\_utils\_utils.py", line 278, in wrapper
    raise TypeError(msg)
TypeError: Missing required arguments; Expected either ('messages' and 'model') or ('messages', 'model' and 'stream') arguments to be given
```

(Output truncated.)
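For context on the final TypeError: it is raised client-side by the openai SDK's required-argument validation in `openai/_utils/_utils.py`, before any request is sent, which typically means the completion call reached the SDK without a `model` keyword (for example, when the assembled config list is empty or mismatched). A toy illustration of that validation pattern (not the SDK's actual code; `create` here is a hypothetical stand-in):

```python
def create(**kwargs):
    """Toy stand-in for a required-keyword check like the one in
    openai._utils: 'messages' and 'model' must both be present."""
    missing = [name for name in ("messages", "model") if name not in kwargs]
    if missing:
        raise TypeError(
            "Missing required arguments; Expected either ('messages' and "
            "'model') or ('messages', 'model' and 'stream') arguments to be given"
        )
    return {"ok": True}


# Calling without 'model' reproduces the reported error message:
try:
    create(messages=[{"role": "user", "content": "hi"}])
except TypeError as exc:
    print(exc)
```

The point is that the error is about the arguments autogen passed, not about the Azure endpoint itself being unreachable.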


Is there a sample file showing how to properly load the endpoint, API version, API key, and model from a .env file for Azure?

jackgerrits (Member) commented:

What version of autogen are you using?
