
Exception messages are not logged when the message exceeds max length #2284

Closed
ravibha opened this issue May 26, 2021 · 5 comments


ravibha commented May 26, 2021

Hi all,
We are observing an issue with logging of exceptions in Application Insights when the message length exceeds the 32K limit.
My expectation is that messages longer than the limit would be truncated by the SDK and the exception still logged to Application Insights. However, that is not happening: no exception is logged in Application Insights at all, although a trace is logged.

Is this a known issue?

For now, the workaround I am using is to create an ExceptionTelemetry object with a truncated message and call TrackException(ExceptionTelemetry exceptionTelemetry).

I did notice there is a Sanitize method on ExceptionTelemetry; I am not sure whether it is actually called.
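
For reference, that workaround looks roughly like this (a sketch only; the 32,768-character cap and the TruncatedExceptionLogger/OriginalExceptionType names are mine for illustration, not anything from the SDK):

using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;

public static class TruncatedExceptionLogger
{
    // Illustrative cap; the exact limit enforced by the SDK/ingestion is an assumption here.
    private const int MaxMessageLength = 32768;

    public static void TrackTruncated(TelemetryClient client, Exception ex)
    {
        // Re-create the exception with a shortened message so the telemetry item goes through,
        // and keep the original type name as a custom property for troubleshooting.
        var message = ex.Message.Length > MaxMessageLength
            ? ex.Message.Substring(0, MaxMessageLength)
            : ex.Message;

        var exceptionTelemetry = new ExceptionTelemetry(new Exception(message));
        exceptionTelemetry.Properties["OriginalExceptionType"] = ex.GetType().FullName;
        client.TrackException(exceptionTelemetry);
    }
}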


ravibha commented Jun 23, 2021

I also hit a similar issue: when an AggregateException has long inner exception messages or a large number of inner exceptions, the exception is not logged in Application Insights. It looks like both problems might have the same cause.

Package versions:

<PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.17.0" />
<PackageReference Include="Microsoft.NET.Sdk.Functions" Version="3.0.11" />

Below is a simple repro with an Azure Function. In the catch block the trace containing the AggregateException is logged, but no exception telemetry shows up. It would be great to have a fix or a workaround for this.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class Function1
{
    private static TelemetryClient telemetryClient;
    private static string ExceptionText = "<Some long text>";

    [FunctionName("Function1")]
    public static void Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
    {
        log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
        var telemetryProperties = new Dictionary<string, string>();
        telemetryProperties.Add("KeyResultId", "2323");
        telemetryClient = new TelemetryClient(new TelemetryConfiguration { InstrumentationKey = "2c43572f-e11d-4d0b-a053-761a3387a225" });
        telemetryClient.TrackTrace("Trace from azure function");
        Action<int> job = i => throw new TimeoutException(ExceptionText);
        // we want many tasks to run in parallel
        var tasks = new Task[100];
        for (var i = 0; i < 100; i++)
        {
            int j = i;
            Task task = Task.Run(() => job(j));
            tasks[i] = task;
        }

        try
        {
            // wait for all the tasks to finish in a blocking manner
            Task.WaitAll(tasks);
        }
        catch (Exception ex)
        {
            telemetryClient.TrackTrace($"Exception: {ex}");
            telemetryClient.TrackException(ex, telemetryProperties);
        }
    }
}
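
For now I am working around the AggregateException case along these lines (again only a sketch; the flattening approach, the 32,768-character cap and the helper names are my own assumptions, not SDK behaviour):

using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;

public static class AggregateExceptionWorkaround
{
    // Illustrative cap; the exact limit is an assumption here.
    private const int MaxMessageLength = 32768;

    public static void TrackFlattened(TelemetryClient client, AggregateException aggEx)
    {
        // Flatten nested AggregateExceptions and track each inner exception separately,
        // re-created with a truncated message so each telemetry item stays small.
        foreach (var inner in aggEx.Flatten().InnerExceptions)
        {
            var message = inner.Message.Length > MaxMessageLength
                ? inner.Message.Substring(0, MaxMessageLength)
                : inner.Message;

            var telemetry = new ExceptionTelemetry(new Exception(message));
            telemetry.Properties["OriginalExceptionType"] = inner.GetType().FullName;
            client.TrackException(telemetry);
        }
    }
}

In the catch block above, calling AggregateExceptionWorkaround.TrackFlattened(telemetryClient, (AggregateException)ex) would replace the TrackException(ex, telemetryProperties) call (Task.WaitAll throws an AggregateException there).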

Thanks!

@andystumpp

Any update on this? Can we prioritize it? This seems critical for troubleshooting and should be handled in the SDK without manual workaround code.

@cijothomas

related: #2482

@Daniel-Guenter

This is an issue for us as well. The only way we can work around it is with reflection.
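
Roughly along these lines (a sketch only; the SDK keeps its length limits in internal members whose type and field names vary by version, so the names passed in below are placeholders, not real SDK identifiers):

using System;
using System.Reflection;

static class SdkLimitPatcher
{
    // Attempt to overwrite a non-public static field on an internal SDK type.
    // Both typeName and fieldName are placeholders; check the SDK source for your
    // version to find the actual members, and note that const fields cannot be
    // changed this way.
    public static bool TryRaiseLimit(Assembly sdkAssembly, string typeName, string fieldName, object newValue)
    {
        var type = sdkAssembly.GetType(typeName, throwOnError: false);
        var field = type?.GetField(fieldName, BindingFlags.Static | BindingFlags.Public | BindingFlags.NonPublic);
        if (field == null || field.IsLiteral)
        {
            return false;
        }

        field.SetValue(null, newValue);
        return true;
    }
}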

@github-actions

This issue is stale because it has been open 300 days with no activity. Remove stale label or this will be closed in 7 days. Commenting will instruct the bot to automatically remove the label.

@github-actions github-actions bot added the stale label Sep 30, 2022
@github-actions github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) Oct 8, 2022