Optimization of Epanet model on GPU

Hi, I have to calibrate a network model and I need to speed up the simulations run during optimization with a genetic algorithm. Is it possible to run epanet.dll on a GPU? I'm working in MATLAB.


There has been quite a bit of research on this topic - I will link a couple of papers below. The basic takeaway is that unless the network is very large, the overhead of shipping data to/from the GPU outweighs the speed benefits. However, task-level parallelization could be useful here: EPANET 2.2 is thread safe and re-entrant, so you can run several simulations concurrently on different execution threads using the same library, each with its own project handle.
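As a rough illustration of the task-level approach (in Python rather than MATLAB, and with `run_epanet_simulation` as a hypothetical placeholder for your own toolkit wrapper), each GA generation's fitness evaluations can be farmed out to a thread pool:

```python
# Sketch of task-level parallel fitness evaluation for a GA.
# run_epanet_simulation() is a hypothetical placeholder: in practice it
# would open the .inp file in its own EPANET project, apply the candidate
# parameters (e.g. pipe roughnesses), run the hydraulic solver, and
# return an error metric against observed data. Because EPANET 2.2 is
# thread safe, each thread can safely hold its own project.
from concurrent.futures import ThreadPoolExecutor

def run_epanet_simulation(params):
    # Dummy fitness standing in for a real EPANET run.
    return sum(p * p for p in params)

def evaluate_population(population, workers=4):
    # Evaluate all candidate solutions concurrently, preserving order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_epanet_simulation, population))

if __name__ == "__main__":
    pop = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
    print(evaluate_population(pop))
```

Note that threads only pay off here because the real work happens inside the native library (ctypes calls release Python's GIL); with a pure-Python fitness function you would use processes instead. In MATLAB, the rough equivalent would be a `parfor` loop over the population.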