This has been discussed in XSTR-395; however, IMO it hasn't been dealt with satisfactorily.
Because the default StringConverter uses a cache (even though it is a WeakHashMap), if a single XStream instance is used over a long period to serialise many thousands of objects, an OutOfMemoryError will eventually occur. A new WeakHashMap#Entry is added to the cache for every new string inside each of the serialised objects, and since each string is stored as both key and value, the entry's strong reference to its value keeps the key reachable, so the weak-key semantics never actually let the entry be collected.
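To make the failure mode concrete, here is a minimal pure-JDK sketch (not XStream code, names are mine) of why a WeakHashMap used as a string cache leaks when each string is mapped to itself: a WeakHashMap only drops an entry once its key is weakly reachable, but the entry holds its value by a strong reference, so a value that *is* the key pins the key forever.

```java
import java.util.Map;
import java.util.WeakHashMap;

public class WeakCacheLeakDemo {

    // Mimics the cache pattern: each distinct string is put into the map
    // as both key and value, then we request a GC and report how many
    // entries survive. Because every value strongly references its own
    // key, no entry is ever eligible for collection.
    static int fillCacheAndCount(int n) {
        Map<String, String> cache = new WeakHashMap<>();
        for (int i = 0; i < n; i++) {
            String s = new String("string-" + i); // distinct, non-interned instance
            cache.put(s, s);                      // value == key: entry can never be cleared
        }
        System.gc(); // only a hint, but irrelevant here: the values pin the keys
        return cache.size();
    }

    public static void main(String[] args) {
        System.out.println("entries still cached: " + fillCacheAndCount(10_000));
        // Prints 10000 -- none of the "weak" entries were cleared.
    }
}
```

Had the map stored the strings under keys that nothing else references (or not stored the key as its own value), the entries would become collectable; it is precisely the key-equals-value pattern that defeats the WeakHashMap.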
IMO this is very dangerous and should NOT be the default behaviour. I understand that people use XStream in many different ways, but surely this pitfall should be avoided in the default configuration. If you need to cache strings for performance, you should install a CachingStringConverter yourself, rather than the other way around.
It took me longer than it should have to track down this OutOfMemoryError, because I was (perhaps foolishly?) assuming that XStream release 1.3.1 would be rock solid.