Apologies for necromancing this post, but I wanted to let you know that we are starting to look into building support for this across a variety of modeling platforms (including NetLogo) in the coming year (probably mid-to-late 2020). The idea is to first containerize your models, then provide supporting scripts / libraries to handle the parameterization + model execution + joining the results back into a coherent data pile on which to perform further output analysis.
This would in theory work on cloud compute resources, on high-throughput computing via the Open Science Grid, or on your local HPC cluster via Singularity.
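To make the parameterization + execution idea concrete, here is a minimal sketch of what such a supporting script might look like: it expands a parameter grid into one containerized run per combination. The image name, the `netlogo-headless` invocation, the flag names, and the output-file layout are all assumptions for illustration, not a real API.

```python
import itertools

def sweep_commands(image, params):
    """Build one hypothetical `docker run` command per point in the
    parameter grid, each writing its results to a separate CSV."""
    keys = sorted(params)
    for i, combo in enumerate(itertools.product(*(params[k] for k in keys))):
        settings = dict(zip(keys, combo))
        # Flag names like --density / --seed are placeholders; a real
        # wrapper would map these onto the model's actual parameters.
        args = " ".join(f"--{k} {v}" for k, v in settings.items())
        yield (f"docker run --rm {image} "
               f"netlogo-headless {args} --table run{i}.csv")

cmds = list(sweep_commands("netlogo-model:latest",
                           {"density": [40, 60], "seed": [1, 2]}))
```

A scheduler-facing layer (HTCondor on the Open Science Grid, or Slurm with Singularity locally) would then submit each command as an independent job, and a final step would concatenate the per-run CSVs back into one dataset for analysis.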
We’d love to hear about your experiences with this and how you got it working!