a big memory drainer is String objects. with the object header and the pointer to the backing char array, even an empty string occupies a minimum of roughly 20 bytes (the exact figure varies by Java version). this can become an especially big problem when a large volume of messages, like millions of records, must be parsed on a single JVM.
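to make the "~20 bytes even when empty" claim concrete, here is a rough back-of-the-envelope breakdown. the numbers below are layout assumptions for a 64-bit JVM with compressed oops and the Java 8 char[]-backed String, not measured values:

```java
public class StringFootprint {
    public static void main(String[] args) {
        // Assumed field layout of java.lang.String on a 64-bit JVM
        // with compressed oops (Java 8, char[]-backed). These are
        // typical sizes, not guarantees; tools like JOL measure the real layout.
        int header = 12;   // object header: mark word + compressed class pointer
        int valueRef = 4;  // compressed reference to the backing char[]
        int hash = 4;      // cached hash code field
        int stringObj = header + valueRef + hash;

        // The backing array costs extra, even when empty:
        int emptyArray = 12 + 4; // array header + length field

        System.out.println(stringObj);              // 20
        System.out.println(stringObj + emptyArray); // 36 (~40 after 8-byte alignment)
    }
}
```

so an empty string costs on the order of 40 bytes once the empty char array and alignment padding are counted, which is why millions of duplicate strings add up quickly.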
from Java 8, there are two ways to handle this. both are especially useful where a large amount of data shares, for example, the same headers, like "portfolio", "name", "currency" — both the keys/attributes/properties and the values tend to have a limited, near-constant number of distinct variants.
1. string intern. this is the approach that predates Java 8.
one caveat though: the JDK's default, native implementation of String.intern can be slow, since the interned-string table lives in native memory (its bucket count is tunable via -XX:StringTableSize).
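as a quick illustration of what the native pool does, a minimal sketch:

```java
public class InternDemo {
    public static void main(String[] args) {
        // new String(...) allocates a fresh heap object, distinct from
        // the compile-time constant already sitting in the string pool
        String a = new String("portfolio");
        String b = "portfolio";

        System.out.println(a == b);          // false: two different objects
        System.out.println(a.intern() == b); // true: intern returns the pooled instance
    }
}
```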
an alternative to the native implementation is a map, which serves the same purpose and is faster. like
import java.util.concurrent.ConcurrentHashMap;

public final class StringRepo extends ConcurrentHashMap<String, String> {

    public static final StringRepo repo = new StringRepo();

    public String intern(String s) {
        // Note: ConcurrentHashMap rejects null keys, so guard against NPE here.
        // Map the key to itself; delegating to String::intern would just
        // funnel every miss back into the slow native pool.
        return s == null ? null : computeIfAbsent(s, k -> k);
    }
}
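a quick usage sketch of the map-backed repo, restated here (with the key mapped to itself) so it compiles standalone; the parsed header values are made-up examples:

```java
import java.util.concurrent.ConcurrentHashMap;

public class StringRepoDemo {
    // Restated from the post so this sketch is self-contained
    static final class StringRepo extends ConcurrentHashMap<String, String> {
        static final StringRepo repo = new StringRepo();
        String intern(String s) {
            return s == null ? null : computeIfAbsent(s, k -> k);
        }
    }

    public static void main(String[] args) {
        // two equal but distinct String instances, e.g. the same header
        // field parsed out of two different messages
        String h1 = new String("currency");
        String h2 = new String("currency");

        System.out.println(h1 == h2); // false: duplicates, double the memory
        System.out.println(StringRepo.repo.intern(h1)
                == StringRepo.repo.intern(h2)); // true: one canonical instance kept
    }
}
```

after interning, the duplicate instances become garbage and only the canonical copy survives, which is where the memory saving comes from.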
2. from Java 8 (update 20), string deduplication can enlist the GC's help in reducing the string memory footprint: the G1 collector finds String objects whose backing char arrays hold the same characters and makes them share a single array.
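deduplication is off by default and requires G1. a typical Java 8 invocation looks like the following (the jar name is a placeholder):

```shell
# Java 8u20+: string deduplication is a G1-only feature
java -XX:+UseG1GC \
     -XX:+UseStringDeduplication \
     -XX:+PrintStringDeduplicationStatistics \
     -jar parser.jar   # placeholder for your application
```

note that deduplication only collapses the backing char arrays; the String objects themselves (headers, fields) remain distinct, so interning can still save more when the same identity is reused everywhere.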