+1 vote
358 views

I am repeatedly experiencing memory issues, even though I have (as suggested in other threads in this forum) increased the memory allocated to openLCA to fit the need.

My current situation:

My deterministic (normal) model takes up just shy of 31 GB of RAM while running, takes about 23 minutes to run, and usually only works once or twice before I have to fully restart my PC.

I have allocated 50 GB of RAM to the program (with administrator rights), and I have 64 GB of RAM in my system.
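For reference, the allocation is set in the openLCA.ini file next to the openLCA executable (the same value that the in-app memory preference writes, which is why administrator rights are needed). The -Xmx line sets the maximum Java heap; mine looks roughly like this, though the surrounding launcher arguments may differ between installations:

    -vmargs
    -Xmx50G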

Other programs usually use 7-10 GB of RAM in total.

Hence, I rarely exceed 70% of my available RAM while running the model. And sometimes it still aborts and throws the OutOfMemoryError: Java heap space error.

I have repeatedly ensured a proper shutdown and confirmed it in my Task Manager every time I restart.

Should I suspect a memory leak in the software, or what else could the issue be?

Is there an SQL command I could add to make the memory garbage collection more efficient?

in openLCA by (790 points)

1 Answer

0 votes
by (114k points)

Hi Simon,

Hm, first question: how large is your model, in terms of processes and connections? What you report seems really huge. Is it maybe based on exiobase plus other databases? And then, which version of openLCA are you using?

Thank you,

Andreas

edit: thank you for your additional explanation in the comments.

The memory clean-up (garbage collection) only runs when there is nothing else to do or when insufficient memory is available, so an accumulation such as the one you observe is normal.
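To make this concrete, here is a minimal, generic Java sketch (not openLCA code): objects that are no longer reachable still count as used heap until the JVM actually runs a collection, and System.gc() is only a hint to do so:

    // Generic Java sketch: unreachable objects still occupy heap
    // until a garbage collection actually runs.
    public class GcDemo {
        static long usedMb() {
            Runtime rt = Runtime.getRuntime();
            return (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        }
        public static void main(String[] args) {
            // roughly 800 MB of doubles; reduce if your default heap is small
            double[][] big = new double[1000][100_000];
            System.out.println("used after alloc: " + usedMb() + " MB");
            big = null; // the arrays are now unreachable...
            System.out.println("used after drop:  " + usedMb() + " MB");
            System.gc(); // ...but only a collection reclaims them (gc() is a hint)
            System.out.println("used after gc:    " + usedMb() + " MB");
        }
    }

So a high "used memory" reading in the Task Manager directly after a calculation does not by itself indicate a leak.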

You write: "The PS is built as System processes (not unit!)" - I do not understand this part.

And you calculate 249 product systems at the same time? That is of course quite memory-intensive.

by (790 points)
Hi Andreas,

I run a project with 249 copies of the same PS. The statistics of the PS are:
No. of processes: 14376
No. of process links: 163607
Connected graph/can calculate?: yes
...
Links that are linked with default providers: 163607
Links with exactly one possible provider: 25

Other information:
The model is built on ecoinvent 3.7 cutoff_unit_regionalized processes
The PS is built with system processes (not unit processes!)
I'm running openLCA 2.0 (not the beta)

Do you need any other information?

Thank you so much.
by (790 points)
As an additional note.

The Monte Carlo simulation for the PS accumulates RAM between runs.

Example:

I run 250 simulations of my PS with "Parameter set 1", then export to Excel and close the results tab.
Now openLCA keeps 15 GB in memory.
Then I repeat this for "Parameter set 2".
... Now openLCA keeps about 25 GB in memory.

This means I have to restart manually (a normal shutdown of the application) in order to have enough memory for the next runs, even though I don't run all of them simultaneously.

This is slightly worrying.

Even someone who runs much smaller models on a daily basis would eventually have to restart the application at seemingly random points to keep using it.

... Naturally, a restart is sometimes the solution, but it seems there is something unsustainable in the way memory is handled here.

I'm happy to discuss.
by (410 points)
Hey, have you solved this issue? I'm experiencing something similar. Thanks
by (790 points)
Hi lauradmeloc3,

I did not solve the issue at the time using version 2.0; however, I read that a memory leak related to Monte Carlo simulations was fixed with the release of 2.0.2. Which version are you running?
...