
.Net Bug: Function calling does not work properly with Llama series models when hosted on Azure. #10221

Open
bauann opened this issue Jan 17, 2025 · 1 comment
Assignees
Labels
bug Something isn't working .NET Issue or Pull requests regarding .NET code

Comments

@bauann
bauann commented Jan 17, 2025

Describe the bug
I deployed 3 different models in Azure AI (Machine Learning Studio) serverless endpoint.

  • Mistral-Nemo
  • Llama-3.3-70B-Instruct
  • Llama-3.2-90B-Vision-Instruct
    Using the SemanticKernel package 1.33.0 + Microsoft.SemanticKernel.Connectors.AzureAIInference 1.33.0-beta to test, the Mistral-Nemo model works perfectly, but the Llama-series models do not.

To Reproduce
Steps to reproduce the behavior:

  1. Deploy an open-source model (e.g. Llama-3.3-70B) to a serverless endpoint.
  2. Make a GetChatMessageContentsAsync call with tools.
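
For context, a minimal repro sketch of the two steps above (the endpoint URL, API key, and the `WeatherPlugin` tool are placeholders I introduced for illustration, not values from the issue):

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

var builder = Kernel.CreateBuilder();
builder.AddAzureAIInferenceChatCompletion(
    modelId: "Llama-3.3-70B-Instruct",
    apiKey: "<api-key>",                                              // placeholder
    endpoint: new Uri("https://<endpoint>.inference.ai.azure.com"));  // placeholder
builder.Plugins.AddFromType<WeatherPlugin>(); // hypothetical plugin exposing one tool
var kernel = builder.Build();

var settings = new PromptExecutionSettings
{
    // "auto" tool choice — this is what triggers the behavior described below.
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddUserMessage("What is the weather in Paris?");

// Works with Mistral-Nemo; with the Llama models the tool is never invoked,
// or the request fails with the 400 shown in "Additional context".
var result = await chat.GetChatMessageContentsAsync(history, settings, kernel);

public class WeatherPlugin
{
    [KernelFunction, Description("Gets the current weather for a city.")]
    public string GetWeather(string city) => $"Sunny in {city}";
}
```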

Expected behavior
Models like Llama 3.3 and Llama 3.2 should support tool calling and are expected to function as they do when hosted in Ollama.

Platform

  • OS: Mac
  • IDE: Rider
  • Language: C#
  • Source: SK 1.33.0, Microsoft.SemanticKernel.Connectors.AzureAIInference 1.33.0-beta

Additional context
With Llama-3.3-70B, a chat completion call with tools responds with an incorrect result and does not invoke any tools, regardless of whether GetChatMessageContentsAsync or GetStreamingChatMessageContentsAsync is used.

With Llama-3.2-90B, a chat completion call with tools throws an exception right away (error message below):

Azure.RequestFailedException: {"object":"error","message":"\"auto\" tool choice requires --enable-auto-tool-choice and --tool-call-parser to be set","type":"BadRequestError","param":null,"code":400}
Status: 400 (Bad Request)
ErrorCode: Bad Request

Content:
{"error":{"code":"Bad Request","message":"{\"object\":\"error\",\"message\":\"\\\"auto\\\" tool choice requires --enable-auto-tool-choice and --tool-call-parser to be set\",\"type\":\"BadRequestError\",\"param\":null,\"code\":400}","status":400}}

Headers:
x-ms-rai-invoked: REDACTED
x-envoy-upstream-service-time: REDACTED
X-Request-ID: REDACTED
ms-azureml-model-error-reason: REDACTED
ms-azureml-model-error-statuscode: REDACTED
ms-azureml-model-time: REDACTED
azureml-destination-model-group: REDACTED
azureml-destination-region: REDACTED
azureml-destination-deployment: REDACTED
azureml-destination-endpoint: REDACTED
x-ms-client-request-id: 42bc2161-5a3f-4e88-8c9e-577390db941e
Request-Context: REDACTED
azureml-model-session: REDACTED
azureml-model-group: REDACTED
Date: Wed, 15 Jan 2025 05:46:51 GMT
Content-Length: 246
Content-Type: application/json

   at Azure.Core.HttpPipelineExtensions.ProcessMessageAsync(HttpPipeline pipeline, HttpMessage message, RequestContext requestContext, CancellationToken cancellationToken)
   at Azure.AI.Inference.ChatCompletionsClient.CompleteAsync(RequestContent content, String extraParams, RequestContext context)
   at Azure.AI.Inference.ChatCompletionsClient.CompleteAsync(ChatCompletionsOptions chatCompletionsOptions, CancellationToken cancellationToken)
   at Microsoft.Extensions.AI.AzureAIInferenceChatClient.CompleteAsync(IList`1 chatMessages, ChatOptions options, CancellationToken cancellationToken)
   at Microsoft.Extensions.AI.FunctionInvokingChatClient.CompleteAsync(IList`1 chatMessages, ChatOptions options, CancellationToken cancellationToken)
   at Microsoft.SemanticKernel.ChatCompletion.ChatClientChatCompletionService.GetChatMessageContentsAsync(ChatHistory chatHistory, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)
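
The 400 above suggests the serverless backend rejects `"auto"` tool choice when its tool-call parser is not enabled. One thing worth probing (an untested assumption on my part, not a confirmed workaround) is forcing a named function instead of `auto`, which Semantic Kernel supports via FunctionChoiceBehavior.Required; the plugin/function names here are placeholders:

```csharp
var settings = new PromptExecutionSettings
{
    // Require a specific, named function rather than letting the model choose
    // ("auto"); this may sidestep the backend's missing --tool-call-parser,
    // though the Phi-4 report below saw failures with Required as well.
    FunctionChoiceBehavior = FunctionChoiceBehavior.Required(
        functions: [kernel.Plugins.GetFunction("WeatherPlugin", "GetWeather")])
};
```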
@bauann bauann added the bug Something isn't working label Jan 17, 2025
@markwallace-microsoft markwallace-microsoft added .NET Issue or Pull requests regarding .NET code triage labels Jan 17, 2025
@RogerBarreto RogerBarreto moved this from Bug to Sprint: In Progress in Semantic Kernel Jan 22, 2025
@2BitSalute
2BitSalute commented Jan 23, 2025

I see a similar (or the same) issue when calling with tools using the Phi-4 model (through Ollama or Azure).
I tried both the Required behavior (default, and with individual functions specified explicitly) and the Auto behavior.

{
    "type": "value_error",
    "loc": ("body",),
    "msg": "Value error, Currently only named tools are supported.",
    "input": {
        "messages": [
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": "<prompt>"
                    }
                ]
            }
        ],
        "model": "phi4",
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "<Class>-<Method>",
                    "description": "<description>",
                    "parameters": {...}
                }
            },
            ...
        ],
        "tool_choice": "required",
        "stream_options": None
    },
    "ctx": {
        "error": ValueError("Currently only named tools are supported.")
    }
}

Projects
Status: Sprint: In Progress