
Memory dump detection: dump file dropped because it's too big #4007

Open
bruno-garcia opened this issue Feb 27, 2025 · 6 comments
bruno-garcia commented Feb 27, 2025

The memory dump detection feature created an event, but no dump file was attached.

I was only able to find out what happened because the SDK logs are sent to Google Cloud Logging, where I found this:

```
jsonPayload: {
  0: "HttpTransport"
  1: "20250227_072623_1.gcdump"
  2: 161153506
  @l: "Warning"
  @mt: "{0}: Attachment '{1}' dropped because it's too large ({2} bytes)."
  @t: "2025-02-27T07:26:46.1279762Z"
  MachineName: "symbol-collector-6b45d484f7-7ks87"
  SourceContext: "Sentry.ISentryClient"
}
```

https://console.cloud.google.com/kubernetes/deployment/us-central1-b/zdpwkxst/default/symbol-collector/logs?invt=AbqtWg&project=internal-sentry&rapt=AEjHL4Pycm2CIGp9OPDNBq9dWM8XDAhfti7gFFXRZH1YB5BmzYneOn020lZbGvX-j61FjNflNHB1FM9rLFjC7pn0w6Y_z_hVX1BS-GN9h1MManAXE6IU3-8&inv=1

We could show something in the UI saying the dump file was too large.
Or we could avoid sending the event altogether, but then the server could crash with an OOM and we'd have no event in Sentry at all, which is probably even worse.

@jamescrosswell
Collaborator

Hm, tricky. We can't get around the file size constraint.

About all I can think is that we could also provide an "OnLargeDumpCaptured" callback or something that users could implement to send/store the dump somewhere else (e.g. they could upload it to some cloud storage). It starts to make things more complex and less convenient, but I can't see any other way around it.
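A rough sketch of what such an option could look like, purely hypothetical: neither `OnOversizedDump` nor `HeapDumpOptions` exists in the Sentry SDK today, and the signature is just one plausible shape.

```csharp
// Hypothetical sketch: an opt-in callback invoked instead of attaching
// the dump when it would exceed the envelope attachment size limit.
public class HeapDumpOptions
{
    // Receives the path to the on-disk .gcdump file.
    public Action<string>? OnOversizedDump { get; set; }
}

// Usage sketch: copy the dump to storage the user controls, then clean up.
options.OnOversizedDump = dumpFilePath =>
{
    File.Copy(dumpFilePath, Path.Combine("/mnt/dumps", Path.GetFileName(dumpFilePath)));
    File.Delete(dumpFilePath);
};
```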

Thoughts?

@bruno-garcia
Member Author

There's no way to make the dump file smaller? Configure how many references to keep around, etc.?

Not sure it's worth adding the callback. We do need to decide whether we want that event to be sent at all when the attachment is too large, though. Do we know where the dump file is, or is it streamed to disk or something?

@jamescrosswell
Collaborator

dotnet-gcdump writes the file to disk, then we capture a Sentry event and add the dump as an attachment:

```csharp
var command = $"dotnet-gcdump collect -v -p {Environment.ProcessId} -o '{dumpFile}'";
dumpProcessRunner.Invoke(command);
if (!_options.FileSystem.FileExists(dumpFile))
{
    // if this happens, hopefully there would be more information in the standard output from the process above
    _options.LogError("Unexpected error creating memory dump. Use debug-level to see output of dotnet-gcdump.");
    return;
}
_onDumpCollected(dumpFile);
```

So yeah, we know where the dump file is and could just let people know the location where it's been stored, if it's too big to add as an attachment. I imagine a lot of folks wouldn't have access to it in a production environment though.

@bruno-garcia
Member Author

> So yeah, we know where the dump file is and could just let people know the location where it's been stored, if it's too big to add as an attachment. I imagine a lot of folks wouldn't have access to it in a production environment though.

But if we're doing this now, we need to keep track of old files and delete them, etc., to avoid the disk filling up.
If we're not sending to Sentry, I would delete the dump file and log a diagnostic-logger error about it.

If Sentry increases the size limit, one would need to bump the SDK (and we'd need to amend the SDK limits too).
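A minimal sketch of the "delete and log" path, assuming a size check before attaching; `maxAttachmentSize` and the log call's placement are assumptions, not the SDK's actual implementation.

```csharp
// Sketch: check the dump size against the attachment limit before sending.
var info = new FileInfo(dumpFile);
if (info.Length > maxAttachmentSize)
{
    // Log through the diagnostic logger so the drop is visible in SDK logs.
    _options.LogError(
        "Memory dump '{0}' is {1} bytes, exceeding the {2}-byte attachment limit; deleting it.",
        dumpFile, info.Length, maxAttachmentSize);

    // Delete immediately to avoid orphaned dumps filling up the disk.
    File.Delete(dumpFile);
    return;
}
```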

@bruno-garcia bruno-garcia changed the title Memory dump too big Memory dump detection: dump file dropped because it's too big Feb 27, 2025
@jamescrosswell
Collaborator

> But if we're doing this now we need to keep track and delete old files etc to avoid a disk filling up.

I just added a new issue to cover that off.

In terms of how to capture large heap dumps, Sentry's docs on the subject suggest: "To add larger or more files, consider secondary storage options." If we did have an OnOversizedDump callback, is there any secondary storage we could use to dogfood this in the symbol collector? I think it would be fairly easy to add that callback.
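Since the symbol collector already runs on Google Cloud, one way to dogfood this could be uploading oversized dumps to Cloud Storage. A sketch, assuming the hypothetical `OnOversizedDump` callback existed (the bucket name is made up; the `Google.Cloud.Storage.V1` client calls are real):

```csharp
using Google.Cloud.Storage.V1;

// Sketch: route oversized dumps to a GCS bucket instead of dropping them.
options.OnOversizedDump = dumpFilePath =>
{
    var storage = StorageClient.Create();
    using var stream = File.OpenRead(dumpFilePath);
    storage.UploadObject(
        "symbol-collector-dumps",            // hypothetical bucket name
        Path.GetFileName(dumpFilePath),
        "application/octet-stream",
        stream);

    // Clean up the local file so dumps don't accumulate on disk.
    File.Delete(dumpFilePath);
};
```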

@bruno-garcia
Member Author
