Self-Serve Elastic Compute
One-click access to scalable compute
Say goodbye to DevOps learning curves and wait times, and stop guessing compute needs in advance. With Domino, you can self-serve dynamically scaling, Kubernetes-based compute clusters with just a few clicks. You can easily access distributed frameworks such as Spark, Ray, and Dask, as well as NVIDIA GPUs for model training and inference, to power the most computationally demanding algorithms.
IT immediately benefits from centralized infrastructure management that optimizes resource use and facilitates chargeback to business units based on usage.
Frequently Asked Questions
Can Domino support both on-prem and cloud infrastructure?
Yes. Because Domino is fully Kubernetes-native, it supports both on-prem and cloud infrastructure. As a result, Domino aligns with your current and future IT strategy and infrastructure vision, and can be a key enabler as you move toward a full-cloud or hybrid on-prem-and-cloud deployment.
Does Domino support GPUs?
Yes. With Domino, you can centrally provision GPU resources for data scientists to use in their projects. These resources are shared across all users, maximizing utilization and return on investment.
Is Spark supported in Domino?
Yes, Domino supports Spark, as well as other distributed computing frameworks like Ray and Dask.