Cache key is huge since migration to 6

Description

Since migrating to v6, the memory used by the L2 cache has exploded. Investigation reveals that cache keys/fields in Redis (via Redisson) have ballooned considerably. A brief chat on Zulip led to the recommendation to file this issue, as it appears to be a side effect of the caching changes that came with 6.0.

We’re seeing keys with lengths of 2k-3k characters. We’re currently using SnappyCodecV2 but have also verified the behavior with MarshallingCodec, SerializationCodec, and JsonJacksonCodec.
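
For context, this is roughly how the Redisson client is wired up on our side (a simplified sketch; the address and class layout are illustrative, not our actual config):

    import org.redisson.Redisson;
    import org.redisson.api.RedissonClient;
    import org.redisson.codec.SnappyCodecV2;
    import org.redisson.config.Config;

    public class RedissonSetup {
        public static RedissonClient createClient() {
            // Swapping the codec here (SnappyCodecV2, MarshallingCodec,
            // SerializationCodec, JsonJacksonCodec) made no meaningful
            // difference to the observed key/field sizes.
            Config config = new Config();
            config.setCodec(new SnappyCodecV2());
            config.useSingleServer().setAddress("redis://127.0.0.1:6379"); // illustrative
            return Redisson.create(config);
        }
    }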

An example of a key:

Activity


Dave Myron October 24, 2022 at 3:39 PM

Yeah, it looks like that’s what we’re getting bitten by. Our classes that use EmbeddedId are the ones having this problem. It leads to different cache keys for each instance regardless of actual equality (which leads to the ballooning of cache storage).
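
For reference, the affected classes have roughly this shape (simplified, with hypothetical names, not our actual model):

    import java.io.Serializable;
    import java.util.Objects;
    import jakarta.persistence.Embeddable;
    import jakarta.persistence.EmbeddedId;
    import jakarta.persistence.Entity;

    // Hypothetical composite id; equals/hashCode are value-based, so two
    // instances with the same field values should be interchangeable as keys.
    @Embeddable
    class OrderLineId implements Serializable {
        Long orderId;
        Long lineNumber;

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof OrderLineId)) return false;
            OrderLineId other = (OrderLineId) o;
            return Objects.equals(orderId, other.orderId)
                && Objects.equals(lineNumber, other.lineNumber);
        }

        @Override
        public int hashCode() {
            return Objects.hash(orderId, lineNumber);
        }
    }

    @Entity
    class OrderLine {
        @EmbeddedId
        OrderLineId id;
    }

Despite the value-based equals/hashCode on the id, each loaded instance still ends up under its own cache key.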

Christian Beikov October 19, 2022 at 4:17 PM

I think that the fix for will resolve this issue.

Sanne Grinovero October 19, 2022 at 2:57 PM

This is a Pandora’s box: the cache keys are not meant to be serialized in this way. The 2LC implementation should extract an appropriate representation, and deciding what that representation should be is not a general Hibernate issue.
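
To illustrate what an appropriate representation could look like (a hypothetical sketch, not actual Hibernate or Redisson code): the provider could derive a compact, bounded field from the key’s stable parts instead of serializing the whole key object graph:

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;
    import java.util.Base64;

    // Hypothetical helper: derives a fixed-length Redis field from the
    // entity name and a value-based rendering of the id, so the stored
    // field size no longer depends on the serialized key object graph.
    final class CompactCacheField {
        static String of(String entityName, Object id) {
            String raw = entityName + '#' + id; // assumes a stable, value-based toString()
            try {
                byte[] digest = MessageDigest.getInstance("SHA-256")
                        .digest(raw.getBytes(StandardCharsets.UTF_8));
                return Base64.getEncoder().encodeToString(digest); // always 44 chars
            } catch (NoSuchAlgorithmException e) {
                throw new IllegalStateException(e);
            }
        }
    }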

Christian Beikov October 18, 2022 at 9:17 AM

Please share the entity class and composite id type you are using. For composite ids there may be further optimization potential, but there are limits to what we can do. I think it would be best if the cache simply ignored the CacheKeyValueDescriptor cacheKeyValueDescriptor field when serializing CacheKeyImplementation, since it won’t be used by the remote cache anyway.
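
A sketch of that idea under plain Java serialization (illustrative only, not the real CacheKeyImplementation source):

    import java.io.Serializable;

    // Placeholder for the real descriptor type referenced above.
    interface CacheKeyValueDescriptor extends Serializable {}

    final class CacheKeyImplementation implements Serializable {
        private final Serializable id;
        private final String entityOrRoleName;
        private final int hashCode;

        // 'transient' keeps the descriptor out of the serialized bytes; a
        // remote cache only compares the value fields above, so nothing
        // useful is lost on the wire.
        private transient CacheKeyValueDescriptor cacheKeyValueDescriptor;

        CacheKeyImplementation(Serializable id, String entityOrRoleName,
                CacheKeyValueDescriptor descriptor, int hashCode) {
            this.id = id;
            this.entityOrRoleName = entityOrRoleName;
            this.cacheKeyValueDescriptor = descriptor;
            this.hashCode = hashCode;
        }

        @Override
        public int hashCode() {
            return hashCode;
        }
    }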

Dave Myron October 18, 2022 at 9:03 AM

We’re still getting very large keys with 6.1.4. It also seems to be generating a unique key for every instance of the same entity (very confusing). Here’s an example of a key with 6.1.4.

That’s 3,860 characters.

Fixed

Details

Created September 12, 2022 at 2:49 PM
Updated October 24, 2022 at 3:39 PM
Resolved September 21, 2022 at 4:35 PM