Unauthorized access. 'Send' claim(s) are required to perform this operation. Resource: 'sb://xxxx.servicebus.windows.net/yyyy'

Hi,

We are using NServiceBus.AzureFunctions.Worker.ServiceBus version 3.1.0 in combination with Azure.Messaging.ServiceBus version 7.11.1. The function runs on an Azure Functions Elastic Premium plan. While processing messages, we often see hundreds of occurrences of the same exception within a span of a few minutes. The messages eventually get processed.

We use the primary connection string (RootManageSharedAccessKey) of the Azure Service Bus namespace. It is stored in Azure Key Vault, and the application settings of the Azure Function contain a Key Vault reference to it.
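For reference, the application setting is a Key Vault reference along these lines (the vault and secret names below are placeholders, and we assume the default AzureWebJobsServiceBus connection setting name):

AzureWebJobsServiceBus = @Microsoft.KeyVault(SecretUri=https://my-vault.vault.azure.net/secrets/servicebus-connection/)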

The full details of the exception we receive:

ExceptionType: Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcException
Method: Azure.Messaging.ServiceBus.ServiceBusProcessor+<OnProcessMessageAsync>d__104.MoveNext
OuterMessage: Exception while executing function: Functions.NServiceBusFunctionEndpointTrigger-xxxx
OuterAssembly: Microsoft.Azure.WebJobs.Extensions.ServiceBus, Version=5.7.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8
OuterMethod: Microsoft.Azure.WebJobs.ServiceBus.MessageProcessor.CompleteProcessingMessageAsync
InnermostType: Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcException

InnerMostMessage:

Result: Failure
Exception: System.AggregateException: One or more errors occurred. (Unauthorized access. 'Send' claim(s) are required to perform this operation. Resource: 'sb://xxxx.servicebus.windows.net/yyyy'. Timestamp:2023-02-07T09:15:25
For troubleshooting information, see https://aka.ms/azsdk/net/servicebus/exceptions/troubleshoot.)
---> System.UnauthorizedAccessException: Unauthorized access. 'Send' claim(s) are required to perform this operation. Resource: 'sb://xxxx.servicebus.windows.net/yyyy'. Timestamp:2023-02-07T09:15:25
For troubleshooting information, see https://aka.ms/azsdk/net/servicebus/exceptions/troubleshoot.
at Azure.Messaging.ServiceBus.Amqp.AmqpSender.SendBatchInternalAsync(AmqpMessage batchMessage, TimeSpan timeout, CancellationToken cancellationToken)
at Azure.Messaging.ServiceBus.Amqp.AmqpSender.<>c.<<SendAsync>b__24_0>d.MoveNext()
--- End of stack trace from previous location ---
at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.<>c__22`1.<<RunOperation>b__22_0>d.MoveNext()
--- End of stack trace from previous location ---
at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1,TResult](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken, Boolean logRetriesAsVerbose)
at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1,TResult](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken, Boolean logRetriesAsVerbose)
at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken)
at Azure.Messaging.ServiceBus.Amqp.AmqpSender.SendAsync(IReadOnlyCollection`1 messages, CancellationToken cancellationToken)
at Azure.Messaging.ServiceBus.ServiceBusSender.SendMessagesAsync(IEnumerable`1 messages, CancellationToken cancellationToken)
at Azure.Messaging.ServiceBus.ServiceBusSender.SendMessageAsync(ServiceBusMessage message, CancellationToken cancellationToken)
at NServiceBus.TransportReceiveToPhysicalMessageConnector.Invoke(ITransportReceiveContext context, Func`2 next) in /_/src/NServiceBus.Core/Pipeline/Incoming/TransportReceiveToPhysicalMessageConnector.cs:line 61
at NServiceBus.RetryAcknowledgementBehavior.Invoke(ITransportReceiveContext context, Func`2 next) in /_/src/NServiceBus.Core/ServicePlatform/Retries/RetryAcknowledgementBehavior.cs:line 46
at NServiceBus.MainPipelineExecutor.Invoke(MessageContext messageContext) in /_/src/NServiceBus.Core/Pipeline/MainPipelineExecutor.cs:line 50
at NServiceBus.TransportReceiver.InvokePipeline(MessageContext c) in /_/src/NServiceBus.Core/Transports/TransportReceiver.cs:line 66
at NServiceBus.TransportReceiver.InvokePipeline(MessageContext c) in /_/src/NServiceBus.Core/Transports/TransportReceiver.cs:line 66
at NServiceBus.FunctionEndpoint.Process(Byte[] body, IDictionary`2 userProperties, String messageId, Int32 deliveryCount, String replyTo, String correlationId, ITransactionStrategy transactionStrategy, PipelineInvoker pipeline) in /_/src/NServiceBus.AzureFunctions.Worker.ServiceBus/FunctionEndpoint.cs:line 107
at NServiceBus.FunctionEndpoint.Process(Byte[] body, IDictionary`2 userProperties, String messageId, Int32 deliveryCount, String replyTo, String correlationId, ITransactionStrategy transactionStrategy, PipelineInvoker pipeline) in /_/src/NServiceBus.AzureFunctions.Worker.ServiceBus/FunctionEndpoint.cs:line 107
at NServiceBus.FunctionEndpoint.Process(Byte[] body, IDictionary`2 userProperties, String messageId, Int32 deliveryCount, String replyTo, String correlationId, FunctionContext functionContext) in /_/src/NServiceBus.AzureFunctions.Worker.ServiceBus/FunctionEndpoint.cs:line 42
at FunctionEndpointTrigger.Run(Byte[] messageBody, IDictionary`2 userProperties, String messageId, Int32 deliveryCount, String replyTo, String correlationId, FunctionContext context) in E:\Agent01\_work\16\s\src\backend\HR.XXXX.ServiceBus.FunctionHost\NServiceBus.AzureFunctions.Worker.SourceGenerator\NServiceBus.AzureFunctions.SourceGenerator.TriggerFunctionGenerator\NServiceBus__FunctionEndpointTrigger.cs:line 34
at Microsoft.Azure.Functions.Worker.Invocation.VoidTaskMethodInvoker`2.InvokeAsync(TReflected instance, Object[] arguments) in D:\a\1\s\src\DotNetWorker.Core\Invocation\VoidTaskMethodInvoker.cs:line 24
--- End of inner exception stack trace ---
at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
at System.Threading.Tasks.Task`1.GetResultCore(Boolean waitCompletionNotification)
at Microsoft.Azure.Functions.Worker.Invocation.DefaultFunctionInvoker`2.<>c.<InvokeAsync>b__6_0(Task`1 t) in D:\a\1\s\src\DotNetWorker.Core\Invocation\DefaultFunctionInvoker.cs:line 32
at System.Threading.Tasks.ContinuationResultTaskFromResultTask`2.InnerInvoke()
at System.Threading.Tasks.Task.<>c.<.cctor>b__272_0(Object obj)
at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location ---
at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot, Thread threadPoolThread)
--- End of stack trace from previous location ---
at Microsoft.Azure.Functions.Worker.Invocation.DefaultFunctionExecutor.ExecuteAsync(FunctionContext context) in D:\a\1\s\src\DotNetWorker.Core\Invocation\DefaultFunctionExecutor.cs:line 45
at Microsoft.Azure.Functions.Worker.OutputBindings.OutputBindingsMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\1\s\src\DotNetWorker.Core\OutputBindings\OutputBindingsMiddleware.cs:line 16
at Microsoft.Azure.Functions.Worker.GrpcWorker.InvocationRequestHandlerAsync(InvocationRequest request, IFunctionsApplication application, IInvocationFeaturesFactory invocationFeaturesFactory, ObjectSerializer serializer, IOutputBindingsInfoProvider outputBindingsInfoProvider) in D:\a\1\s\src\DotNetWorker.Grpc\GrpcWorker.cs:line 166
Stack: at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
at System.Threading.Tasks.Task`1.GetResultCore(Boolean waitCompletionNotification)
at Microsoft.Azure.Functions.Worker.Invocation.DefaultFunctionInvoker`2.<>c.<InvokeAsync>b__6_0(Task`1 t) in D:\a\1\s\src\DotNetWorker.Core\Invocation\DefaultFunctionInvoker.cs:line 32
at System.Threading.Tasks.ContinuationResultTaskFromResultTask`2.InnerInvoke()
at System.Threading.Tasks.Task.<>c.<.cctor>b__272_0(Object obj)
at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location ---
at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot, Thread threadPoolThread)
--- End of stack trace from previous location ---
at Microsoft.Azure.Functions.Worker.Invocation.DefaultFunctionExecutor.ExecuteAsync(FunctionContext context) in D:\a\1\s\src\DotNetWorker.Core\Invocation\DefaultFunctionExecutor.cs:line 45
at Microsoft.Azure.Functions.Worker.OutputBindings.OutputBindingsMiddleware.Invoke(FunctionContext context, FunctionExecutionDelegate next) in D:\a\1\s\src\DotNetWorker.Core\OutputBindings\OutputBindingsMiddleware.cs:line 16
at Microsoft.Azure.Functions.Worker.GrpcWorker.InvocationRequestHandlerAsync(InvocationRequest request, IFunctionsApplication application, IInvocationFeaturesFactory invocationFeaturesFactory, ObjectSerializer serializer, IOutputBindingsInfoProvider outputBindingsInfoProvider) in D:\a\1\s\src\DotNetWorker.Grpc\GrpcWorker.cs:line 166

Have you seen this error before? Or can you provide some guidance on where to look in order to resolve this issue?

Thanks in advance

Jeroen Janssens

This sounds like an Azure Support question rather than an NServiceBus one. I suspect connectivity issues between the Function App and Key Vault are causing this.

OK, thanks for the input. I’ll look into that.

Jeroen

Hi Sean,

We investigated whether the exception is caused by connectivity issues between the Function App and Key Vault, but that is not the case: even without Key Vault we receive this exception. We also recently migrated our Windows services, which were using the SQL Server transport, to the Azure Service Bus transport, and we receive the same exceptions there as well.

The exceptions suddenly appear and continue for 5 to 10 minutes, after which the problem automagically resolves itself. The messages are retried and eventually get processed.

We did notice that we receive the “send claims” exceptions for a different queue than the one we want to send to. Here is a screenshot from Application Insights to clarify what I mean:

I’m not sure whether this problem is caused by our code, the NServiceBus.Transport.AzureServiceBus package, the Azure.Messaging.ServiceBus package, or Azure itself. I have created a ticket with Azure support as well.

Thanks in advance

Jeroen

Hi @jeroenj,

We are using NServiceBus 7.7.3 and NServiceBus.Transport.AzureServiceBus 2.0.5 in a .NET 6 app that we deploy to Azure Container Instances (Linux, Alpine), and we are getting exactly this error intermittently. It doesn’t really make much sense.

We’re wondering whether you got any feedback from Microsoft before we raise a ticket with them ourselves.

We did not encounter this problem when we deployed in Docker containers on a Windows VM (although that was an older build of our own software).

Regards,
Rob.

Hi @Rawk,

My apologies for the late response. Microsoft couldn’t pinpoint the issue either. So we are still in the dark. Did you manage to resolve it?

Regards,
Jeroen

We recently diagnosed an instance of the same error, albeit in the context of virtual machines rather than Functions, and the root cause of the problem was clock drift between one of the machines and Azure Service Bus. With a shared access key, the client generates time-limited SAS tokens, so a skewed clock can produce tokens that Service Bus rejects as unauthorized.
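For anyone who wants to rule clock drift in or out quickly, here is a rough diagnostic sketch (not part of NServiceBus; the namespace URL is a placeholder). It compares the machine's UTC clock with the Date header returned by the Service Bus namespace; the header only has one-second resolution, so it can only reveal drift of a few seconds or more:

// Rough clock-drift check: compare the local UTC clock with the HTTP Date
// header returned by the Service Bus namespace endpoint (placeholder URL).
using System;
using System.Net.Http;
using System.Threading.Tasks;

class ClockDriftCheck
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // Any HTTPS request to the namespace will do; we only care about the Date header.
        var response = await client.GetAsync("https://xxxx.servicebus.windows.net/");
        var serverTime = response.Headers.Date;
        var localTime = DateTimeOffset.UtcNow;

        if (serverTime.HasValue)
        {
            var drift = localTime - serverTime.Value;
            Console.WriteLine($"Approximate clock drift: {drift.TotalSeconds:F1} seconds");
        }
        else
        {
            Console.WriteLine("No Date header returned; cannot estimate drift.");
        }
    }
}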

We’re still experiencing it occasionally; it seems a bit random. I checked the clocks last time, but they all seemed spot on to me.

Interesting. We too have seen this random error with a WebJob/console app in an Azure Web App. If this is clock drift, any thoughts on how to prevent or mitigate it? Is this something the NServiceBus framework could address, or do you think it’s really up to Azure to tighten up their clocks?

As far as I can tell, no, NServiceBus currently cannot detect or address that. We tried every way we could think of to reproduce the problem, without luck.

In the instance I was talking about, Microsoft support reported that there was a clock drift issue with two of the instances.

One option to mitigate this would be to retry the send operation (IIRC, the SDK already does some of that itself), for example by leveraging something similar to GitHub - mauroservienti/NServiceBus.Extensions.DispatchRetries, either by using the package or by copying what it does and adapting it to your scenario.
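For anyone who goes the copy-and-adapt route, here is a minimal sketch of the idea, assuming the standard NServiceBus pipeline behavior API; the exception filter, retry count, and delays are illustrative and not what the package actually ships:

// Illustrative sketch of retrying dispatch from a pipeline behavior.
// Not the actual implementation of NServiceBus.Extensions.DispatchRetries.
using System;
using System.Threading.Tasks;
using NServiceBus.Pipeline;

class RetryDispatchBehavior : Behavior<IDispatchContext>
{
    public override async Task Invoke(IDispatchContext context, Func<Task> next)
    {
        const int maxAttempts = 3;

        for (var attempt = 1; ; attempt++)
        {
            try
            {
                await next();
                return;
            }
            catch (UnauthorizedAccessException) when (attempt < maxAttempts)
            {
                // Transient authorization failures (e.g. caused by clock drift)
                // often succeed shortly afterwards; back off briefly and retry.
                await Task.Delay(TimeSpan.FromSeconds(attempt));
            }
        }
    }
}

// Registration on the endpoint configuration:
// endpointConfiguration.Pipeline.Register(
//     new RetryDispatchBehavior(),
//     "Retries dispatch operations that fail with an authorization error");

Whether retrying at this stage is safe depends on how your receive/dispatch transaction is configured, so treat it as a starting point rather than a drop-in fix.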