Scenario: Entity batching is enabled by setting @BatchSize(size = 100000) on every entity in the data model. On a growing database, where most queries do not reach 100000 entries, the query plan cache (QueryPlanCache) is constantly polluted with entries like the one attached. Such entries were observed to be as large as 1-2 MB each, heavily increasing the memory pressure on the JVM.
Since any variation in the number of ids may lead to a new cache entry, a better approach would be to rearchitect the whole concept so that the id list is passed as a single templated parameter. The current approach is not only suboptimal but can be quite unforgiving for batch-processing jobs that load entities constantly: memory pressure can grow to such an extent that it cripples the performance of the JVM.
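The pollution mechanism can be illustrated with a minimal, self-contained sketch (this is an assumption-laden simplification, not Hibernate's actual QueryPlanCache implementation): when the cache is keyed by the generated SQL text, every distinct id-list length in the `IN (?, ?, ...)` clause produces a brand-new entry, even though all of them are logically the same query.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

// Hypothetical model of the problem (not Hibernate code): a plan cache
// keyed by SQL text, where the SQL varies with the number of bound ids.
public class PlanCachePollution {
    static final Map<String, String> queryPlanCache = new HashMap<>();

    // Build the IN-clause SQL for a batch of the given size and cache
    // the "plan" (here represented by the SQL string itself).
    static String loadBatch(int idCount) {
        String placeholders = IntStream.range(0, idCount)
                .mapToObj(i -> "?")
                .collect(Collectors.joining(", "));
        String sql = "select e from Entity e where e.id in (" + placeholders + ")";
        // One new cache entry per distinct id-list length.
        return queryPlanCache.computeIfAbsent(sql, s -> s);
    }

    public static void main(String[] args) {
        // Four batches, three distinct sizes -> three cache entries.
        for (int size : List.of(5, 17, 42, 17)) {
            loadBatch(size);
        }
        System.out.println(queryPlanCache.size()); // prints 3
    }
}
```

A parameterized/templated list, as suggested above, would collapse all of these variants into a single cache key.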
Ubuntu 18.04 LTS, MySQL 8.0.16