Java heap space memory issue
11 views (last 30 days)
I have an XML file of approximately 1 GB in size.
I have used the xmlread command in my script, which always leads to the following error:
Java exception occurred:
java.lang.OutOfMemoryError: Java heap space
I maximized the Java Heap Memory under Preferences to 4016 MB, but I am still getting the same issue.
Any workaround?
0 comments
Answers (2)
Kevin Gurney
2023-6-20
Edited: Kevin Gurney
2023-6-20
You could try using the newer MATLAB APIs for XML Processing (MAXP) or readstruct rather than xmlread. Neither of these interfaces uses Java under the hood.
However, there is still a possibility of out of memory issues if your XML file is too large relative to the amount of available memory on your machine.
MAXP Example:
import matlab.io.xml.dom.*
filename = "data.xml";
dom = parseFile(Parser, filename);
readstruct Example:
filename = "data.xml";
s = readstruct(filename);
Ultimately, if your XML file is too large to load into memory all at once, you may need a streaming ("chunk"-based) parsing approach. For example, you could use a SAX-based Java parser, or another third-party SAX parser, to read only parts of the file at a time and decrease peak memory usage.
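To illustrate the SAX approach, here is a minimal sketch in Java using the standard javax.xml.parsers API. The element name "record" and the class name are hypothetical placeholders; the point is that the handler sees one event at a time, so memory usage stays flat regardless of file size:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class StreamingXmlCount {
    // Event-based handler: no DOM tree is ever built in memory.
    static class RecordCounter extends DefaultHandler {
        int count = 0;

        @Override
        public void startElement(String uri, String localName,
                                 String qName, Attributes attrs) {
            // "record" is a placeholder element name for this sketch.
            if ("record".equals(qName)) {
                count++;
            }
        }
    }

    public static int countRecords(InputStream in) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        RecordCounter handler = new RecordCounter();
        parser.parse(in, handler);  // streams events to the handler
        return handler.count;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<data><record/><record/><record/></data>";
        InputStream in =
            new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8));
        System.out.println(countRecords(in)); // prints 3
    }
}
```

In a real script you would pass a FileInputStream over the 1 GB file instead of the in-memory string, and accumulate only the data you actually need inside the handler callbacks.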
0 comments
sreepathy
about 1 hour ago
The java.lang.OutOfMemoryError: Java heap space error occurs because xmlread constructs a complete Document Object Model (DOM) tree in memory, so the entire 1 GB file, plus the DOM's per-node overhead, must fit in the Java heap at once. Even an increased heap may not be sufficient, since the in-memory DOM representation can be several times larger than the file itself. For very large XML files, this approach hits OutOfMemoryError long before a sequential approach would.
A common workaround is to avoid DOM-based parsers like xmlread for such large files and instead switch to a streaming parser such as SAX or StAX, which processes the file sequentially without loading everything into memory. Alternatively, if your use case allows, you can split the large XML file into smaller chunks and process them separately. Either way, memory consumption is reduced and you can handle the data without hitting heap space limits. For further details, you can also refer to the blog post "How to Solve OutOfMemoryError: Java heap space".
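The StAX alternative mentioned above can be sketched as follows. Unlike SAX's push-style callbacks, StAX lets the caller pull events one at a time; the element name "row" and the class name here are hypothetical placeholders:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StaxRecordScan {
    // Pull-based parsing: only the current event is held in memory,
    // so heap usage does not grow with file size.
    public static int countElements(InputStream in, String name) throws Exception {
        XMLStreamReader reader =
            XMLInputFactory.newInstance().createXMLStreamReader(in);
        int count = 0;
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && name.equals(reader.getLocalName())) {
                count++;
            }
        }
        reader.close();
        return count;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<data><row/><row/></data>";
        InputStream in =
            new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8));
        System.out.println(countElements(in, "row")); // prints 2
    }
}
```

The pull model makes it easy to skip uninteresting sections or stop early, which is useful when only part of a huge file is needed.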
0 comments