Commit
Fix #23908: Significantly improve the performance of copy/paste when dealing with large amounts of data

From #23908, at least one valid workflow involves copy/pasting large amounts of data (specifically, updating boundaries). The relation for `Terwolde` in the sample data had 2206 objects; copying it between layers took more than 5 minutes. With this change, it takes less than 1 second.

This is a performance regression from r19176. The primary culprit introduced in r19176 is the dataset size check: we test whether the number of primitives being changed is greater than or equal to the number of non-deleted complete primitives in the dataset. Each time we fetch those primitives we get a new filtered collection, so its size is not cached, and computing the size of that filtered collection accounts for almost all of the expense.

We fix this by wrapping the work from AddPrimitivesCommand in `DataSet#update`, so that a single large update is issued at the end of the copy operation. This ensures that we do not fire many spurious events while a mass operation is in progress.

git-svn-id: https://josm.openstreetmap.de/svn/trunk@19214 0c6e7542-c601-0410-84e7-c038aed88b3b
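The batching pattern the fix relies on can be sketched independently of JOSM's classes. The sketch below is illustrative only: the class, field, and method names are hypothetical and do not mirror JOSM's actual `DataSet`/`AddPrimitivesCommand` API. It shows how wrapping a bulk add in a single update call coalesces what would otherwise be thousands of per-primitive change notifications (each of which, in the regressed code path, recomputed the size of a freshly filtered, uncached collection) into one event at the end.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

/**
 * Minimal sketch of batching dataset changes: instead of notifying listeners
 * for every primitive added, the whole bulk operation is wrapped in
 * update(...) so listeners see a single consolidated event at the end.
 * Hypothetical names throughout; this is not JOSM's implementation.
 */
final class SketchDataSet {
    private final List<String> primitives = new ArrayList<>();
    private final List<Consumer<Integer>> listeners = new ArrayList<>();
    private int updateDepth = 0;
    private int pendingAdds = 0;

    void addListener(Consumer<Integer> listener) {
        listeners.add(listener);
    }

    void addPrimitive(String primitive) {
        primitives.add(primitive);
        pendingAdds++;
        if (updateDepth == 0) {
            fireChanged(); // outside a batch: one event per primitive added
        }
    }

    /** Runs a bulk operation, firing a single consolidated event at the end. */
    void update(Runnable bulkOperation) {
        updateDepth++;
        try {
            bulkOperation.run();
        } finally {
            updateDepth--;
            if (updateDepth == 0 && pendingAdds > 0) {
                fireChanged(); // one event for the whole batch
            }
        }
    }

    private void fireChanged() {
        int added = pendingAdds;
        pendingAdds = 0;
        listeners.forEach(l -> l.accept(added));
    }

    public static void main(String[] args) {
        SketchDataSet dataSet = new SketchDataSet();
        dataSet.addListener(added ->
                System.out.println("changed: " + added + " primitive(s) added"));

        // Paste 2206 primitives as one batch: listeners run once, not 2206 times.
        dataSet.update(() -> {
            for (int i = 0; i < 2206; i++) {
                dataSet.addPrimitive("primitive-" + i);
            }
        });
    }
}
```

Under this pattern, any per-event work that scales with the dataset size (such as recomputing the size of a filtered view) is paid once per paste instead of once per pasted primitive.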