
Hi,

I am trying to run an analysis calculation on a relatively simple system, but I frequently get an out-of-memory error. I have upgraded the fast calculation libraries as described here: https://ask.openlca.org/4118/out-of-memory-error-message-problem. The maximum memory usage in the configuration has been set to 16,000 MB (16 GB of RAM is available). I am running openLCA version 1.10.3.

My system uses the ecoinvent 3.8 cut-off, non-regionalized database as the background and has about twenty foreground processes. Are there any other troubleshooting steps I can try?


1 Answer

You need to leave some memory for other applications on the computer. A product system built on the ecoinvent 3.8 cut-off database contains about 15,000 processes, regardless of how large your own product system is (unless, atypically, you add thousands of new processes in your foreground system), and the calculation needs about 6-8 GB of RAM. In your case, allocating 12 GB, for example, would still leave room to run some calculations in parallel.

You can check in the Task Manager (on Windows, for example) whether openLCA is actually using the allocated RAM. If openLCA was installed by an administrator and you do not have write access to the program folder, the allocated RAM setting cannot be changed; in that case, install the zip archive version instead, which you only need to extract into your user directory.
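As a side note on how the setting is stored: openLCA is an Eclipse-based application, so the memory limit from the preferences dialog is written as a JVM `-Xmx` argument into the openLCA.ini file next to the openLCA executable. If the setting does not seem to take effect, you can open that file and check the line directly. A sketch of what the relevant lines look like with 12 GB allocated (surrounding entries vary by version and platform):

```ini
-vmargs
-Xmx12000M
```

If editing or saving this file fails with a permissions error, that confirms the write-access problem described above, and the extracted zip version in your user directory is the way around it.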

Best wishes,

Andreas