We are building an ML tracking service using MLflow as a backend. One issue we've run into is that in order to log models via MLflow's Python API, the user needs to have AWS credentials configured on their machine. Since our service is outward-facing, we can't let users have the access key for our S3 bucket.

Is there a mechanism for authenticating the boto3 client that MLflow uses via some temporary AWS credentials? We can generate a signed URL with write permissions to the bucket, but it's unclear how we would then pass it on to the boto3 client / MLflow Python API. Or can we do something with environment variables?

In any case, if someone knows of a good way to do this, I'd greatly appreciate the help. Best, SP
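For context, the kind of flow we're imagining is roughly sketched below (the role ARN, tracking URI, and file names are placeholders, and we're not sure this is the right approach): the server would mint short-lived credentials via STS, hand them to the user, and the user would expose them through the environment variables that boto3 reads before logging with MLflow.

```python
import os
import boto3
import mlflow

# --- Server side (we hold the long-lived credentials) ----------------------
# Hypothetical IAM role scoped to write access on our artifact bucket only.
sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/mlflow-artifact-writer",  # placeholder
    RoleSessionName="mlflow-user-session",
    DurationSeconds=3600,
)["Credentials"]

# --- Client side (the outward-facing user) ----------------------------------
# boto3 (and therefore MLflow's S3 artifact store) picks these up from the
# environment, so the user never holds a permanent access key.
os.environ["AWS_ACCESS_KEY_ID"] = creds["AccessKeyId"]
os.environ["AWS_SECRET_ACCESS_KEY"] = creds["SecretAccessKey"]
os.environ["AWS_SESSION_TOKEN"] = creds["SessionToken"]

mlflow.set_tracking_uri("https://our-tracking-server.example.com")  # placeholder
with mlflow.start_run():
    mlflow.log_artifact("model.pkl")  # placeholder artifact
```

Is something along these lines workable, or is there a more idiomatic way to do it with MLflow?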