Language
Python

Version
latest

Description

Using OpenAI API
I had the following error when running the python listBot-streaming sample, using default_model="gpt-4o-mini":
stream_id is not a known attribute of class <class 'botbuilder.schema._models_py3.Entity'> and will be ignored
stream_sequence is not a known attribute of class <class 'botbuilder.schema._models_py3.Entity'> and will be ignored
stream_type is not a known attribute of class <class 'botbuilder.schema._models_py3.Entity'> and will be ignored
TEST <msrest.universal_http.async_requests.AsyncRequestsClientResponse object at 0x7618746baf30>
Task exception was never retrieved
future: <Task finished name='Task-14' coro=<StreamingResponse.drain_queue.<locals>._drain_queue() done, defined at /home/duydl/Projects/teams/teams-ai/python/packages/ai/teams/streaming/streaming_response.py:251> exception=ErrorResponseException('(BadSyntax) Only start streaming and continue streaming types are allowed as a typing activity')>
Traceback (most recent call last):
File "/home/duydl/Projects/teams/teams-ai/python/packages/ai/teams/streaming/streaming_response.py", line 262, in _drain_queue
await self.send_activity(activity)
File "/home/duydl/Projects/teams/teams-ai/python/packages/ai/teams/streaming/streaming_response.py", line 306, in send_activity
response = await self._context.send_activity(activity)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/duydl/Projects/teams/botbuilder-python/libraries/botbuilder-core/botbuilder/core/turn_context.py", line 174, in send_activity
result = await self.send_activities([activity_or_text])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/duydl/Projects/teams/botbuilder-python/libraries/botbuilder-core/botbuilder/core/turn_context.py", line 226, in send_activities
return await self._emit(self._on_send_activities, output, logic())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/duydl/Projects/teams/botbuilder-python/libraries/botbuilder-core/botbuilder/core/turn_context.py", line 304, in _emit
return await logic
^^^^^^^^^^^
File "/home/duydl/Projects/teams/botbuilder-python/libraries/botbuilder-core/botbuilder/core/turn_context.py", line 221, in logic
responses = await self.adapter.send_activities(self, output)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/duydl/Projects/teams/botbuilder-python/libraries/botbuilder-core/botbuilder/core/cloud_adapter_base.py", line 93, in send_activities
response = await connector_client.conversations.reply_to_activity(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/duydl/Projects/teams/botbuilder-python/libraries/botframework-connector/botframework/connector/aio/operations_async/_conversations_operations_async.py", line 529, in reply_to_activity
raise models.ErrorResponseException(self._deserialize, response)
botbuilder.schema._models_py3.ErrorResponseException: (BadSyntax) Only start streaming and continue streaming types are allowed as a typing activity
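As far as I can tell, the warnings at the top are msrest silently dropping the stream metadata that the streaming code passes to Entity, which would explain why the typing activity then gets rejected with the BadSyntax error. A small illustration of the mechanism (my own snippet, not teams-ai code):

```python
from botbuilder.schema import Entity

# msrest models ignore constructor kwargs that are not in their attribute map,
# which is exactly what the "will be ignored" warnings above report.
entity = Entity(type="streaminfo", stream_id="1", stream_sequence=1, stream_type="streaming")
print(entity.serialize())  # -> {'type': 'streaminfo'}; the stream_* fields are gone
```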
Using Azure OpenAI API

When using the Azure OpenAI API, I also had this error:
Traceback (most recent call last):
File "/home/duydl/Projects/teams/teams-ai/python/samples/04.ai.d.chainedActions.listBot/.conda/lib/python3.11/site-packages/teams/app.py", line 758, in _on_turn
if not await self._run_ai_chain(context, state):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/duydl/Projects/teams/teams-ai/python/samples/04.ai.d.chainedActions.listBot/.conda/lib/python3.11/site-packages/teams/app.py", line 836, in _run_ai_chain
is_ok = await self._ai.run(context, state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/duydl/Projects/teams/teams-ai/python/samples/04.ai.d.chainedActions.listBot/.conda/lib/python3.11/site-packages/teams/ai/ai.py", line 146, in run
plan = await self.planner.begin_task(context, state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/duydl/Projects/teams/teams-ai/python/samples/04.ai.d.chainedActions.listBot/.conda/lib/python3.11/site-packages/teams/ai/planners/action_planner.py", line 100, in begin_task
return await self.continue_task(context, state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/duydl/Projects/teams/teams-ai/python/samples/04.ai.d.chainedActions.listBot/.conda/lib/python3.11/site-packages/teams/ai/planners/action_planner.py", line 111, in continue_task
raise ApplicationError(res.error or "[ActionPlanner]: failed task")
teams.app_error.ApplicationError: The chat completion API returned an error status of None: Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: parallel_tool_calls', 'type': 'invalid_request_error', 'param': None, 'code': None}}
I found that it is related to this issue: Azure/azure-rest-api-specs#29545, and some other samples, e.g. listBot, also raise it. I managed to bypass it by modifying the async def complete_prompt method of class OpenAIModel(PromptCompletionModel).
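Concretely, the bypass amounts to stripping that argument from the chat completions request before it is sent to Azure. A rough sketch of the idea (this is my own illustrative helper, not the actual teams-ai code; it assumes the openai Python SDK's async Azure client that the model wraps):

```python
from typing import Any

from openai import AsyncAzureOpenAI


async def create_chat_completion(client: AsyncAzureOpenAI, **kwargs: Any):
    # Some Azure OpenAI API versions reject this argument with the 400 above
    # (see Azure/azure-rest-api-specs#29545), so drop it before forwarding.
    kwargs.pop("parallel_tool_calls", None)
    return await client.chat.completions.create(**kwargs)
```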
Reproduction Steps
1. Follow the instructions in https://github.com/microsoft/teams-ai/tree/main/python/samples/04.ai.h.chainedActions.listBot-streaming (my model setup is sketched after these steps).
2. (Optional) Clone the repo and install packages/ai in editable mode to debug.
3. Press F5 and add the app to Teams.
4. Send a message. The error occurs.
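For reference, the only change I made on top of the sample's setup was the model name, roughly as below. The option and parameter names are from memory of the teams-ai Python package and the environment variable names are placeholders, so treat this as a sketch rather than the sample's exact code:

```python
import os

from teams.ai.models import AzureOpenAIModelOptions, OpenAIModel, OpenAIModelOptions

# Setup used when I hit the first error (OpenAI API).
openai_model = OpenAIModel(
    OpenAIModelOptions(api_key=os.environ["OPENAI_KEY"], default_model="gpt-4o-mini")
)

# Setup used when I hit the second error (Azure OpenAI API); the model name here
# is the deployment name on my Azure resource.
azure_model = OpenAIModel(
    AzureOpenAIModelOptions(
        api_key=os.environ["AZURE_OPENAI_KEY"],
        default_model="gpt-4o-mini",
        endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    )
)
```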