At Spectrum Labs, our deep learning models are trained and validated against large data sets that can only be processed in a distributed manner. Data scientists need to invoke batch jobs as part of the model development lifecycle, and those jobs must run in a scalable, fault-tolerant manner. We chose Argo Workflows as our data pipeline framework because it provides a container-native workflow engine that runs natively on Kubernetes.
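To make this concrete, here is a minimal sketch of what a single containerized batch step looks like as an Argo Workflow manifest. The workflow name, image, and command are illustrative placeholders, not taken from Spectrum Labs' actual pipelines:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  # generateName lets Argo create a unique name per submission
  generateName: batch-validation-
spec:
  entrypoint: validate
  templates:
    - name: validate
      # Each step runs as its own container, so Kubernetes handles
      # scheduling, retries, and resource isolation for the batch job.
      container:
        image: python:3.11          # hypothetical image for the batch step
        command: [python, -c]
        args: ["print('running batch validation step')"]
```

Submitting this spec (for example with `argo submit`) schedules the step as a pod on the cluster, which is what makes the same pattern scale out to large, fault-tolerant data processing jobs.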