Error when deploying new models that need *multiple* GPUs
Incident Report for Baseten
This incident has been resolved.
Posted May 30, 2023 - 15:29 PDT
Deploying new models that require multiple GPUs (e.g., 2xA100, 2xA10G) is currently broken. We expect a fix within a few minutes.
Posted May 30, 2023 - 13:28 PDT
This incident affected: API.