Optimize `World::get_entity_mut` for large entity slices #23740
CrazyRoka wants to merge 3 commits into bevyengine:main
Conversation
Replace nested loops with a HashSet for O(N) duplicate entity detection. This improves performance significantly for larger entity lists.
Good observation! I'll also note that the vast, vast majority of arrays are small, so the … As an isolated change this makes sense (and we can merge it in the meantime)! I am curious: in the use case you've seen, is the entire result always used, or only partially iterated/consumed? If only partial use is needed, then an iteration-based duplication check would be more performant. Additionally, if the source slices/arrays are not mutated between each …
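The reviewer's "iteration-based duplication check" could look something like the sketch below (a hypothetical illustration, not code from the PR: `u64` stands in for Bevy's `Entity`, and `fetch_checked` is an invented name). The idea is to validate duplicates lazily as the caller consumes results, so a partially iterated result only pays for the prefix actually read:

```rust
use std::collections::HashSet;

/// Hypothetical lazy variant of the reviewer's suggestion: the duplicate
/// check runs per element during iteration, so consumers that stop early
/// never pay for the rest of the slice. `u64` stands in for `Entity`.
fn fetch_checked(entities: &[u64]) -> impl Iterator<Item = Result<u64, u64>> + '_ {
    let mut seen = HashSet::new();
    entities.iter().map(move |&e| {
        // `insert` returns false when the entity was already seen.
        if seen.insert(e) { Ok(e) } else { Err(e) }
    })
}

fn main() {
    let mut it = fetch_checked(&[10, 20, 10, 30]);
    assert_eq!(it.next(), Some(Ok(10)));
    assert_eq!(it.next(), Some(Ok(20)));
    // The duplicate only surfaces when the iteration reaches it.
    assert_eq!(it.next(), Some(Err(10)));
}
```

The trade-off versus the eager up-front check is error semantics: here earlier items are already yielded before the duplicate is discovered, which is exactly why the PR's eager check preserves the existing behaviour instead.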
Objective

Optimize `World::get_entity_mut` (and the underlying `WorldEntityFetch` trait) when a slice of entities is passed in. The previous duplicate-entity check used nested loops (O(N²)) and showed up as a CPU hotspot in real workloads with thousands of entities. This PR makes the check O(N) while preserving exact behaviour and error semantics.

Solution
- Replace the `for i in 0..len { for j in 0..i }` duplicate check with a single-pass `EntityHashSet` in both the `&[Entity]` and `&[Entity; N]` implementations of `WorldEntityFetch::fetch_mut`.
- Add a benchmark (`get_entity_mut_slice`) that exercises the hot path with slices of up to 2000 entities.
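A minimal sketch of the change, using `std::collections::HashSet` in place of Bevy's `EntityHashSet` and `u64` in place of `Entity` so it runs standalone (illustrative only, not the actual PR code):

```rust
use std::collections::HashSet;

/// The O(N²) check being replaced: compare every pair of entities.
fn has_duplicates_nested(entities: &[u64]) -> bool {
    for i in 0..entities.len() {
        for j in 0..i {
            if entities[i] == entities[j] {
                return true;
            }
        }
    }
    false
}

/// The O(N) single-pass replacement: `HashSet::insert` returns
/// `false` the moment an entity repeats.
fn has_duplicates_set(entities: &[u64]) -> bool {
    let mut seen = HashSet::with_capacity(entities.len());
    entities.iter().any(|&e| !seen.insert(e))
}

fn main() {
    let unique: Vec<u64> = (0..2000).collect();
    let mut with_dup = unique.clone();
    with_dup.push(42);
    // Both checks agree on both inputs; only the complexity differs.
    assert_eq!(has_duplicates_nested(&unique), has_duplicates_set(&unique));
    assert_eq!(has_duplicates_nested(&with_dup), has_duplicates_set(&with_dup));
}
```

Since both versions report a duplicate iff one exists, error semantics are unchanged; only the asymptotic cost of the check drops from quadratic to linear.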
Ran `cargo bench -p benches --bench ecs -- get_entity_mut_slice`.
Benchmark results before:
Benchmark results after:
Criterion table summary: