Optimize Microsoft.Extensions.Caching.Memory #111050
base: main
Conversation
Tagging subscribers to this area: @dotnet/area-extensions-caching
src/libraries/Microsoft.Extensions.Caching.Memory/src/CacheEntry.cs
The main win here is not eagerly allocating the options lists, is that right? Plus using the list-specific iterator instead of `IEnumerable<T>`/`IEnumerator<T>`. Definitely worthwhile; however, I'm not a huge fan of exposing the naked field, even as internal.
Yes, the allocation savings come from allocating the options lists on demand and from avoiding an enumerator allocation each time the options are applied. The change also skips the options allocation entirely in some code paths.
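As a rough illustration of both points, here is a minimal sketch; the type and member names are hypothetical, not the actual `CacheEntry` members:

```csharp
// Illustrative sketch only; names and types are hypothetical, not the
// actual Microsoft.Extensions.Caching.Memory code.
using System;
using System.Collections.Generic;

internal sealed class EntrySketch
{
    // Stays null until a caller actually registers a callback, so entries
    // that never use callbacks pay no list allocation at all.
    private List<Action>? _callbacks;

    public IList<Action> Callbacks => _callbacks ??= new List<Action>();

    public void InvokeCallbacks()
    {
        List<Action>? callbacks = _callbacks;
        if (callbacks is null)
        {
            return; // nothing was ever registered
        }

        // foreach over the concrete List<T> binds to its struct enumerator,
        // so no IEnumerator<T> is allocated and MoveNext()/Current are not
        // interface calls.
        foreach (Action callback in callbacks)
        {
            callback();
        }
    }
}
```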
Rather than exposing the field itself, something like this would work:

```csharp
private List<Whatever>? _foos;

public IList<Whatever> Foos => _foos ??= []; // lazy alloc
internal List<Whatever>? FoosDirect => _foos; // note: never allocs
```

This adds a line of code, but is IMO significantly preferable to exposing the naked field.
src/libraries/Microsoft.Extensions.Caching.Abstractions/src/DistributedCacheExtensions.cs
src/libraries/Microsoft.Extensions.Caching.Abstractions/src/DistributedCacheEntryOptions.cs
This should be ready for merging if there are no other concerns.
Remove some allocations and interface calls.
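For the "interface calls" part, a minimal sketch under assumed, hypothetical helpers (not code from this PR): holding the concrete `List<T>` lets `Count`, the indexer, and `foreach` bind directly, while going through `IList<T>` forces interface dispatch.

```csharp
using System.Collections.Generic;

internal static class Sketch
{
    // Every member access here is an interface call, and a foreach over
    // this parameter would allocate a boxed enumerator.
    internal static int SumThroughInterface(IList<int> values)
    {
        int sum = 0;
        for (int i = 0; i < values.Count; i++)
        {
            sum += values[i];
        }
        return sum;
    }

    // Same logic against the concrete type: Count and the indexer are
    // direct (inlinable) calls, and a foreach would use the struct enumerator.
    internal static int SumDirect(List<int> values)
    {
        int sum = 0;
        for (int i = 0; i < values.Count; i++)
        {
            sum += values[i];
        }
        return sum;
    }
}
```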