
Code cell:

    from ultralytics import YOLO
    model = YOLO("yolov8n.pt")
    results = model.train(data="/workspace/awadh/nvidia/apis_mellifera/v8/datasets/Apis_mellifera_IIT_front_view-1/data.yaml", epochs=200, batch=16, imgsz=640)

Error displayed:

/usr/local/lib/python3.8/dist-packages/ultralytics/yolo/data/utils.py in check_dataset_yaml(dataset, autodownload)
    232                 LOGGER.warning(msg)
    233             else:
--> 234                 raise FileNotFoundError(msg)
    235             t = time.time()
    236             if s.startswith('http') and s.endswith('.zip'):  # URL

FileNotFoundError: 
Dataset '/workspace/awadh/nvidia/apis_mellifera/v8/datasets/Apis_mellifera_IIT_front_view-1/data.yaml' not found ⚠️, missing paths ['/workspace/awadh/nvidia/Species_detection/Apis_Mellifera/y8/datasets/Apis_mellifera_IIT_front_view-1/valid/images']

The same lines of code work in Google Colab, but when I run them in Jupyter they do not. The dataset is present at that file location, and yet this error still appears.

2 Answers


I discovered that you need to place your dataset at the root of the 'datasets' directory.

Here's the folder structure you should follow in the 'datasets' directory:

data.yaml
train
  - images
  - labels
test
  - images
  - labels
valid
  - images
  - labels

For training, check that your data.yaml file is located at 'datasets/data.yaml'.

Then you can use this command to train on your dataset:

    yolo task=detect mode=train model=yolov8s.pt data=datasets/data.yaml epochs=100 imgsz=640
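
If you prefer the Python API used in the question, the equivalent call would look roughly like this (a minimal sketch, assuming the same 'datasets/data.yaml' layout and the yolov8s.pt checkpoint from the CLI command above):

    from ultralytics import YOLO

    # Load the same YOLOv8s checkpoint as the CLI command above
    model = YOLO("yolov8s.pt")

    # Train against the data.yaml placed at the root of the 'datasets' directory
    results = model.train(data="datasets/data.yaml", epochs=100, imgsz=640)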


Arthur Lacoste

This happened to me on a cloud GPU instance running the pytorch/pytorch Docker image with ultralytics 8.0.22. Downgrading to ultralytics 8.0.20 solved the issue for me.

    pip uninstall ultralytics
    pip install ultralytics==8.0.20
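
To confirm the downgrade took effect (a quick sanity check, not part of the original workaround), you can print the installed version:

    import ultralytics

    # Should print 8.0.20 after the downgrade
    print(ultralytics.__version__)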

However, this issue still persists on my local machine running macOS Ventura (Apple Silicon).