Optimization of EPANET model on GPU

Hi, I have to calibrate a network model and I need to speed up the simulations run during optimization with a genetic algorithm. Is it possible to run epanet.dll on a GPU? I'm working in Matlab.

thanks

There has been quite a bit of research on this topic - I will link a couple of papers below. The basic takeaway is that unless the network is very large, the overhead of shipping data to and from the GPU outweighs the speed benefits. However, task-level parallelization could be useful here: EPANET 2.2 is thread-safe and re-entrant, so you can run several simulations concurrently on different execution threads with the same library (see the sketch after the links below).

https://www.researchgate.net/profile/Fernando-Martinez-Alzamora/publication/315728277_Efficient_simulation_of_Water_Distribution_Systems_using_OpenMP/links/58df782f4585153bfe948b15/Efficient-simulation-of-Water-Distribution-Systems-using-OpenMP.pdf

https://iwaponline.com/jh/article/14/3/603/3168/The-potential-of-graphical-processing-units-to
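To make the task-level parallelization idea concrete, here is a minimal C/OpenMP sketch (not code from either paper): each thread creates its own EN_Project handle, applies one candidate parameter set, and runs a full hydraulic solve. The function names follow the EPANET 2.2 project-based API in epanet2_2.h, but the input file name, the link and node indices, and the roughness values are placeholders for your own network and decision variables, so check the exact signatures against the header shipped with your toolkit version.

```c
#include <stdio.h>
#include <omp.h>
#include "epanet2_2.h"

#define N_CANDIDATES 8   /* e.g. one hydraulic run per GA individual */

int main(void)
{
    double fitness[N_CANDIDATES];

    #pragma omp parallel for
    for (int i = 0; i < N_CANDIDATES; i++) {
        EN_Project ph = NULL;
        double pressure = 0.0;

        /* Each thread owns a separate project, so no state is shared. */
        EN_createproject(&ph);
        EN_open(ph, "calibration.inp", "", "");  /* placeholder .inp file */

        /* Apply this candidate's decision variable, e.g. roughness on
           link 1 (placeholder index and value). */
        EN_setlinkvalue(ph, 1, EN_ROUGHNESS, 100.0 + i);

        /* Full hydraulic simulation. */
        EN_solveH(ph);

        /* Read a result used by the objective function, e.g. pressure at
           node 1 (placeholder). */
        EN_getnodevalue(ph, 1, EN_PRESSURE, &pressure);
        fitness[i] = pressure;

        EN_close(ph);
        /* Frees the project; the exact signature has varied across
           toolkit versions, so check your header. */
        EN_deleteproject(ph);
    }

    for (int i = 0; i < N_CANDIDATES; i++)
        printf("candidate %d -> pressure %.2f\n", i, fitness[i]);
    return 0;
}
```

From Matlab, the closest equivalent is to evaluate the GA population inside a parfor loop: each parallel-pool worker is a separate process that loads its own copy of epanet.dll, so the runs stay independent even without sharing one library instance.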