Affects Version/s: None
Fix Version/s: 1.3
When loading huge XStream data sets from disk that consist mainly of Strings, you eventually get a "PermGen space" OutOfMemoryError.
Increasing the PermSpace of the JVM with -XX:MaxPermSize=256M or higher only postpones the problem.
First I assumed that the sheer number of classes created caused the VM to choke. Additionally, I found a VM defect that describes a "PermHeap bloat in and only in server VM": http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4957990
So I stopped reusing my XStream instance and always constructed a new one. This improved the situation further, but in the end there was always a "PermGen space" exception. A few days ago I found more information about the contents of the permanent generation, including the fact that interned strings are kept there. So I searched through the XStream code and found StringConverter.java.
StringConverter interns strings to unify them and reduce the memory footprint. This is basically not a bad idea, but I have the feeling that String.intern() still behaves strangely in Java 5. It does not seem to be truly weak storage, and additionally String.intern() is a native call.
Luckily, all converters are pluggable, so I wrote a new StringConverter and plugged it in. It works perfectly! Basically I use a WeakHashMap or an LRU map to unify the string instances; my PermGen stays low and my program runs through (of course some additional heap may be required, but that is already set high in my case).
Find the new StringConverter attached, along with a usage example. The cache is defined globally in the program so it can be reused across several XStream instances; this has the best effect and saves quite some memory. If you are going to use the cache in a multi-threaded scenario, do not forget to wrap it with Collections.synchronizedMap(...).
// Cache for unifying strings, shared across all XStream instances.
private static final Map<String, String> stringCache = new WeakHashMap<String, String>();

XStream xstream = new XStream();
// Register the attached converter (constructor assumed to take the shared cache).
xstream.registerConverter(new StringConverter(stringCache));
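The core of the caching idea can be sketched as follows (the class and method names here are illustrative, not the attached code). One caveat the sketch accounts for: a WeakHashMap whose value is the key string itself would keep the weak key strongly reachable through the value, so entries would never be collected; storing a WeakReference as the value avoids that. In the real converter, fromString() would route each decoded string through unify().

```java
import java.lang.ref.WeakReference;
import java.util.Collections;
import java.util.Map;
import java.util.WeakHashMap;

// Illustrative sketch of the string-unification cache described above.
public class StringCache {
    // Keys are held weakly; values are WeakReferences so the value does not
    // pin the key (a plain Map<String, String> with value == key would).
    // Wrapped with synchronizedMap for multi-threaded use, as recommended above.
    private static final Map<String, WeakReference<String>> CACHE =
            Collections.synchronizedMap(new WeakHashMap<String, WeakReference<String>>());

    // Returns one canonical instance per equal string, without String.intern().
    public static String unify(String s) {
        WeakReference<String> ref = CACHE.get(s);
        String cached = (ref == null) ? null : ref.get();
        if (cached == null) {
            CACHE.put(s, new WeakReference<String>(s));
            cached = s;
        }
        return cached;
    }
}
```

Because the cache lives on the ordinary heap and its entries are weakly reachable, duplicates are unified while idle entries remain eligible for garbage collection, so neither the PermGen nor the cache itself grows without bound.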