Details
- Type: Bug
- Status: Closed
- Priority: Critical
- Resolution: Fixed
- Affects Version/s: 1.3.1
- Fix Version/s: 1.4
- Component/s: Converters
- Labels: None
Description
This has been discussed in XSTR-395; however, it hasn't been dealt with satisfactorily, IMO.
Due to the default StringConverter's use of a cache (even if it is a WeakHashMap), if a single instance of the XStream class is used over a long period to serialise many thousands of objects, then eventually an OutOfMemoryError will occur. This is because a new WeakHashMap#Entry is added to the cache for every new string inside each of the serialised objects.
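To make the failure mode concrete, here is a minimal sketch of the usage pattern that triggers it (the class name LeakDemo and the payload strings are mine, purely for illustration):

    import com.thoughtworks.xstream.XStream;

    public class LeakDemo {
        public static void main(String[] args) {
            // one long-lived XStream instance, as in a typical server process
            XStream xstream = new XStream();
            for (long i = 0; ; i++) {
                // every iteration serialises a string never seen before, so the
                // default StringConverter adds a fresh entry to its WeakHashMap
                // cache; under sustained load the cache grows faster than the
                // weak entries are reclaimed, and the heap fills up
                xstream.toXML("record-" + i);
            }
        }
    }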
IMO this is very dangerous and should NOT be the default behaviour. I understand that many people use XStream in many different ways, but surely this pitfall should be avoided in the default configuration. If you need to cache strings for performance, then you should install a CachingStringConverter, rather than the other way around.
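As a stop-gap on 1.3.1, a non-caching converter can be registered over the default. A minimal sketch, assuming the standard AbstractSingleValueConverter base class (NonCachingStringConverter is a name I am inventing here, not an XStream class):

    import com.thoughtworks.xstream.XStream;
    import com.thoughtworks.xstream.converters.basic.AbstractSingleValueConverter;

    // Returns the parsed string as-is: no WeakHashMap lookup, no interning,
    // so nothing accumulates in a cache no matter how many strings pass through.
    public class NonCachingStringConverter extends AbstractSingleValueConverter {
        public boolean canConvert(Class type) {
            return type == String.class;
        }
        public Object fromString(String str) {
            return str;
        }
    }

Registering it at normal priority should shadow the built-in StringConverter:

    XStream xstream = new XStream();
    xstream.registerConverter(new NonCachingStringConverter(), XStream.PRIORITY_NORMAL);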
It took me longer than it should have to track down this OOM error, as I was (perhaps foolishly?) assuming that XStream release version 1.3.1 would be rock solid.
Issue Links
- supersedes XSTR-410: Fix for issue 395 also causes OutOfMemoryError
Oops - let me just correct two mistakes that I made above.
1) It was discussed in XSTR-410 rather than the original XSTR-395.
2) An OOME might not actually happen... but the JVM will grind to a halt and spend most of its time doing FULL GCs.