High memory usage in queryPlanCache when entity batching is enabled
Description
Attachments: 1
Activity
Sanne Grinovero December 13, 2024 at 10:09 AM
I’m sorry that this has apparently been ignored, but since version 6 these aspects are entirely different, so I’ll consider this out of date.
Scenario: Entity batching is enabled by setting @BatchSize(size = 100000) on every entity in the data model. On a growing database, where most queries do not reach 100000 entries, the queryPlanCache gets constantly polluted with entries like the one attached. Such entries were observed to be as large as 1-2 MB, which heavily increases the memory pressure on the JVM.
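A plain-Java sketch of the pollution mechanism described above (this is not Hibernate's actual implementation; the entity name and SQL shape are illustrative): each distinct number of ids produces a distinct SQL string, and each distinct string becomes a new cache entry, so the cache grows with every new id count seen.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

// Sketch: a cache keyed by the generated SQL text, as a stand-in for the
// query plan cache. Each distinct id count yields a distinct key.
public class PlanCacheSketch {
    static final Map<String, String> planCache = new HashMap<>();

    static String selectInClause(int idCount) {
        String placeholders = IntStream.range(0, idCount)
                .mapToObj(i -> "?")
                .collect(Collectors.joining(", "));
        String sql = "select * from entity where id in (" + placeholders + ")";
        // computeIfAbsent mimics the cache lookup: a miss stores a new plan.
        return planCache.computeIfAbsent(sql, s -> s);
    }

    public static void main(String[] args) {
        // Batch fetches of 1..500 entities create 500 distinct cache entries.
        for (int n = 1; n <= 500; n++) {
            selectInClause(n);
        }
        System.out.println(planCache.size()); // prints 500
    }
}
```

With a batch size of 100000, almost every fetch below the limit can hit a previously unseen id count, so in the worst case the cache accumulates one large entry per distinct count.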
Since any variation in the number of ids may lead to a new entry, a better approach would be to rearchitect the whole concept and take as parameter a list that can be templated. The current approach is not only suboptimal but can be quite unforgiving for batch-processing jobs that load entities constantly (memory pressure can grow to such an extent that it cripples the performance of the JVM).
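One way the "templated list" idea can be sketched is parameter padding: round the id list up to the next power of two (repeating the last id, which is harmless in an IN clause), so that many different id counts collapse onto a handful of SQL shapes. This mirrors the spirit of the suggestion, not Hibernate's actual internals; class and method names are hypothetical.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch of parameter padding: a bounded number of SQL shapes regardless
// of how many distinct id counts the application produces.
public class PaddedInList {
    // Smallest power of two >= n; this is the padded parameter count.
    static int paddedSize(int n) {
        int p = 1;
        while (p < n) p <<= 1;
        return p;
    }

    // Pad the id list by repeating the last id up to the padded size.
    static List<Long> padIds(List<Long> ids) {
        List<Long> padded = new ArrayList<>(ids);
        Long last = ids.get(ids.size() - 1);
        while (padded.size() < paddedSize(ids.size())) {
            padded.add(last);
        }
        return padded;
    }

    public static void main(String[] args) {
        Set<Integer> shapes = new HashSet<>();
        for (int n = 1; n <= 500; n++) {
            shapes.add(paddedSize(n));
        }
        // 500 distinct id counts collapse to 10 shapes (1, 2, 4, ..., 512).
        System.out.println(shapes.size()); // prints 10
    }
}
```

With padding, the cache holds at most log2(batchSize) entries per query instead of one entry per distinct id count.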