Is there a way to get the full S3 URL of a Metaflow artifact that was stored in a step?
I looked at Metaflow's DataArtifact class but didn't see an obvious S3 path property.
I am using Metaflow on AWS in batch mode. I deleted the conda folder from S3. Now when I try to run a batch task, it fails at the "bootstrapping environment" step.
Apparently metaflow.plugins.conda.batch_bootstrap tries to download conda packages…
I'm currently experimenting with Metaflow. I followed the documentation and was able to deploy an AWS setup with the given CloudFormation template.
My question is: why am I always getting a
message: "Missing Authentication Token"
when I access…
I am using Metaflow to define my state machine and push it to AWS Step Functions. When I run this state machine, my step is stuck in the RUNNABLE state. I used the CloudFormation template to set up AWS. For testing, I have only these steps in my flow…
I am trying to use fasttext in Metaflow, but I cannot install it successfully. I use pip_install_module("fasttext", "0.9.1") to install it. The error message in Metaflow is:
Usually, I install fasttext in AWS SageMaker as shown below:
It works this…
This is an issue involving Metaflow, Zarr, and Python.
I am creating a LinearFlow using Metaflow and Zarr.
All is going well except for one key Zarr function: when I try to consolidate all my metadata into a metadata store inside the flow, I get no error message…
Only flow1 runs. I'm trying to find a way to run flow1 and, when it finishes, run flow2. flow2 should run on a scheduler. Any ideas? Thanks a lot!
'''
import metaflow
from metaflow import FlowSpec, Parameter, step
class Flow1(FlowSpec):
@step
…
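One low-tech way to chain the two flows, as a sketch under assumptions: the flows live in files named flow1.py and flow2.py (names are illustrative, not from the question), and a plain subprocess chain is acceptable as a stopgap before moving to a real scheduler:

```python
import subprocess

def run_flow(cmd):
    """Run one flow command; raise CalledProcessError on failure,
    so the next flow only starts after the previous one succeeds."""
    subprocess.run(cmd, check=True)

# Chain the flows sequentially (file names are assumptions):
#   run_flow(["python", "flow1.py", "run"])
#   run_flow(["python", "flow2.py", "run"])
```

For the scheduler part, recent Metaflow versions offer event-based triggering (e.g. a trigger-on-finish decorator) when flows are deployed to an orchestrator; whether that is available depends on your Metaflow version and deployment, so the subprocess chain above is only a local fallback.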
I need to use Databricks notebooks to write a script that combines Metaflow and MLflow.
This is the script:
import mlflow
from metaflow import FlowSpec, step, Parameter
import pandas as pd
import numpy as np
from sklearn.linear_model import…
I run Metaflow jobs on AWS Batch, and even when I specify swap requirements for my Batch job, the EC2 instances on which the job runs have no swap, so all memory stays in RAM.
I am using m5-family Spot Instances with the AWS Linux AMI, without…
I have a function sayHello defined in one Python file, utils.py. I also have a Python file, flow.py, that runs a simple Metaflow flow. I want to import sayHello from utils.py and use it as a step function in flow.py. Is it possible to do that? This…
By default, Metaflow retries failed steps multiple times before the pipeline errors out. However, this is undesired when I am CI-testing my flows with pytest: I just want the flows to fail fast. How do I temporarily disable retries (without…
I have a Metaflow flow for which I want to create a step function on AWS,
but I want to create two step functions for the same flow, one as a staging environment and another as production (MY_FLOW_STG and MY_FLOW_PRD).
The command we have to…
I have a directory containing mostly text and JSON files, and one binary file (the output of MXNet.Block.save_parameters for a neural network).
I wanted to zip this folder and then pickle it. Say I have a zip file object:
from zipfile import ZipFile
import…
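Since the excerpt is cut off here, a hedged sketch of the general pattern may help: a ZipFile object holds an open file handle and is not picklable, but the raw zip bytes are, so one approach is to build the archive in memory and pickle the bytes. The helper name and file names below are illustrative, not from the question.

```python
import io
import pickle
import zipfile

def zip_to_bytes(files):
    """Build a zip archive in memory from {archive_name: bytes}
    and return the raw bytes, which pickle handles directly."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        for name, data in files.items():
            zf.writestr(name, data)
    return buf.getvalue()

# Round-trip: pickle the raw bytes, then reopen the archive from them.
blob = zip_to_bytes({"notes.txt": b"hello", "model.params": b"\x00\x01"})
archive = zipfile.ZipFile(io.BytesIO(pickle.loads(pickle.dumps(blob))))
```

The design choice here is to pickle plain bytes rather than the ZipFile wrapper itself, which sidesteps the open-file-handle problem entirely.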
Objective
Understand how GPUs are utilized in Metaflow.
Background
As in "Documentation / Explanation on how to use GPU" (#250), there are several discussions on how to use GPUs.
It looks like @resources(GPU=2) handles the GPU allocation, but there are…