Reduce IEnumerator allocation via interface #5667
Comments
How do you even know that the enumerator is used in a
If I understand correctly, this involves adding a new field to collection classes that use this technique. This penalizes those who use collection classes directly and not through interfaces, because it increases the size of the collection object itself.
@mikedn I dropped the extra field in PR dotnet/coreclr#4468
I see, that's better. In theory this looks like a good idea, but it will probably be difficult to evaluate its true impact on performance. Keep in mind that you'll get different pools for different … Anyway, it's a good subject for discussion.
@benaadams can you update the status of this? |
Probably better as a single ThreadStatic cache per collection... |
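The per-thread cache suggested here could be sketched roughly as follows. `EnumeratorCache`, `TryRent`, and `Return` are hypothetical names for illustration only, not anything in coreclr; a generic static class gives one slot per closed collection type per thread.

```csharp
using System;

// Sketch of the [ThreadStatic] variant: one cached enumerator per thread
// (and per closed generic type) instead of a field on every collection
// instance. All names here are illustrative, not real coreclr API.
public static class EnumeratorCache<TEnumerator> where TEnumerator : class
{
    [ThreadStatic]
    private static TEnumerator t_cached;

    public static TEnumerator TryRent()
    {
        TEnumerator e = t_cached;
        t_cached = null; // take ownership so two live enumerations never share it
        return e;        // null on first use; caller allocates in that case
    }

    public static void Return(TEnumerator enumerator)
    {
        t_cached = enumerator; // Dispose hands the instance back for reuse
    }
}
```

This avoids growing every collection object by a field, at the cost of one thread-static slot per enumerator type.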
Going to close this out |
A small step |
Revisiting #4505 /cc @stephentoub taking another shot at it...

If, in addition to the struct enumerators, another nested class was added to the generic collections that used the struct enumerator, e.g. for `List<T>` a new field added:

`private CachedIEnumerator _cachedIEnumerator;`

and the `IEnumerable<T>` implementations changed to return the cached instance. It would still allocate on first use; however, `foreach` calls `Dispose` when it is finished (https://github.com/dotnet/coreclr/issues/1505), so this would return the enumerator object to the list to be used again, and on a second enumeration of the same list it wouldn't allocate.

May be an issue if people hold on to the `IEnumerable` post-`foreach`? However, `Reset` is allowed to throw `NotSupportedException` and is an explicitly implemented method, so the purpose of holding on to the reference would be limited.
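The pattern described in this issue could look roughly like the sketch below. `MyList<T>` and `CachedEnumerator` are hypothetical names standing in for `List<T>` and its cached nested enumerator class; this is an illustration of the technique under discussion, not the actual coreclr change.

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

// Hypothetical collection illustrating the cached-enumerator idea:
// interface callers reuse a single class-based enumerator instance,
// so only the first enumeration allocates.
public class MyList<T> : IEnumerable<T>
{
    private readonly List<T> _items = new List<T>();
    private CachedEnumerator _cachedEnumerator; // the extra field discussed above

    public void Add(T item) => _items.Add(item);

    // The interface implementation returns the cached instance when available.
    IEnumerator<T> IEnumerable<T>.GetEnumerator()
    {
        CachedEnumerator cached = _cachedEnumerator;
        if (cached != null)
        {
            _cachedEnumerator = null; // take ownership; no concurrent sharing
            cached.Restart();
            return cached;
        }
        return new CachedEnumerator(this); // still allocates on first use
    }

    IEnumerator IEnumerable.GetEnumerator() => ((IEnumerable<T>)this).GetEnumerator();

    private sealed class CachedEnumerator : IEnumerator<T>
    {
        private readonly MyList<T> _owner;
        private int _index;

        public CachedEnumerator(MyList<T> owner) { _owner = owner; }

        internal void Restart() => _index = 0;

        public T Current { get; private set; }
        object IEnumerator.Current => Current;

        public bool MoveNext()
        {
            if (_index < _owner._items.Count)
            {
                Current = _owner._items[_index++];
                return true;
            }
            return false;
        }

        // foreach calls Dispose when finished, which hands the instance
        // back to the list for the next enumeration.
        public void Dispose()
        {
            Current = default(T);
            if (_owner._cachedEnumerator == null)
                _owner._cachedEnumerator = this;
        }

        // Reset is allowed to throw and is explicitly implemented, which
        // limits the value of holding on to the reference post-foreach.
        void IEnumerator.Reset() => throw new NotSupportedException();
    }
}
```

A second `foreach` over the same list through the interface then reuses the instance returned by `Dispose` instead of allocating a new enumerator.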