
I am repeatedly experiencing memory issues, even though I have (as suggested in other threads in this forum) increased the memory allocated to openLCA to fit the need.

My current situation:

My deterministic (normal) model takes up just shy of 31 GB of RAM while running, takes about 23 minutes to run, and usually only works once or twice before I have to fully restart my PC.

I have allocated 50 GB of RAM to the programme (with administrator rights), and I have 64 GB of RAM in my system.
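For reference, since openLCA is an Eclipse-based Java application, the allocation described above is typically set as a JVM heap limit in the openLCA.ini file next to the executable; the exact value below is only an illustration of the 50 GB setting, not a recommendation:

```ini
; openLCA.ini (sketch) - JVM arguments follow the -vmargs line
-vmargs
-Xmx50G
```

The same setting can usually also be changed from within openLCA under the memory/configuration preferences.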

Other programs usually use 7 to 10 GB of RAM in total.

Hence, I rarely exceed 70% of my available RAM while running the model, and sometimes it still aborts and throws the OutOfMemory Java heap error.

I have repeatedly ensured a proper shutdown and confirmed it in my Task Manager every time I restart.

Can I suspect a memory leak in the software, or what could the issue be?

Is there an SQL command I could add to make the memory garbage collection more efficient?

in openLCA by (790 points)

1 Answer

by (125k points)

Hi Simon,

Hm, first question: how large is your model, in terms of processes and connections? What you report seems really huge. Is it maybe based on exiobase plus other databases? And then, which version of openLCA are you using?

Thank you,

Andreas

Edit: thank you for your additional explanation in the comments.

The memory clean-up (garbage collection) only runs when nothing else is to be done or when insufficient memory is available, so an accumulation such as you observe is normal.
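As a hedged illustration of that behaviour (plain Java, nothing openLCA-specific): used heap grows while objects stay reachable and only shrinks once the references are dropped and a collection actually runs; `System.gc()` is merely a hint, the JVM decides when to collect.

```java
import java.util.ArrayList;
import java.util.List;

public class HeapWatch {

    // Results kept in fields so they can be inspected after a run.
    static long beforeMiB, duringMiB, afterMiB;

    // Currently used heap = memory reserved by the JVM minus the free part.
    static long usedMiB() {
        Runtime rt = Runtime.getRuntime();
        return (rt.totalMemory() - rt.freeMemory()) >> 20;
    }

    public static void main(String[] args) {
        List<byte[]> blocks = new ArrayList<>();
        beforeMiB = usedMiB();
        for (int i = 0; i < 100; i++) {
            blocks.add(new byte[1 << 20]); // keep ~100 MiB reachable
        }
        duringMiB = usedMiB();
        blocks.clear(); // drop the references ...
        System.gc();    // ... and hint a collection; the JVM decides when it runs
        afterMiB = usedMiB();
        System.out.printf("before=%d MiB, during=%d MiB, after=%d MiB%n",
                beforeMiB, duringMiB, afterMiB);
    }
}
```

Running this on a default JVM typically shows used memory climbing by roughly 100 MiB while the list holds the arrays and dropping again after the collection, which matches the "accumulation until GC" pattern described above.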

"The PS is built as System processes (not unit!)"

I do not understand that part. And you calculate 249 product systems at the same time? This is of course quite memory-intense.

by (790 points)
Hi lauradmeloc3,

I did not solve the issue at the time, using version 2.0; however, I read that a memory leak related to Monte Carlo simulations was fixed with the release of 2.0.2. Which version are you running?
...