
https://huggingface.co/docs/diffusers/training/lora

I am trying to use this script for LoRA training. My simplified command looks like this:

    accelerate launch --mixed_precision="fp16" examples/text_to_image/train_text_to_image_lora.py
      --pretrained_model_name_or_path=D:\PyCharm2022.1.3\PycharmProject\diffusers_txt_images\modle_clone\Realistic_Vision_V1.4
      --dataset_name="lambdalabs/pokemon-blip-captions" --dataloader_num_workers=8
      --resolution=512 --center_crop --random_flip
      --train_batch_size=1 --gradient_accumulation_steps=4 --max_train_steps=15000
      --learning_rate=1e-04 --max_grad_norm=1 --lr_scheduler="cosine" --lr_warmup_steps=0
      --output_dir="ddpm-ema-pokemon-64" --checkpointing_steps=2000
      --validation_prompt="Dialogue." --seed=1337

However, it fails with this error:

    dataloader_iter = super().__iter__()
  File "D:\PyCharm2022.1.3\PycharmProject\mydiffusers\venv\lib\site-packages\torch\utils\data\dataloader.py", line 441, in __iter__
    return self._get_iterator()
  File "D:\PyCharm2022.1.3\PycharmProject\mydiffusers\venv\lib\site-packages\torch\utils\data\dataloader.py", line 388, in _get_iterator
    return _MultiProcessingDataLoaderIter(self)
  File "D:\PyCharm2022.1.3\PycharmProject\mydiffusers\venv\lib\site-packages\torch\utils\data\dataloader.py", line 1042, in __init__
    w.start()
  File "D:\python\Python38\lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
  File "D:\python\Python38\lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "D:\python\Python38\lib\multiprocessing\context.py", line 327, in _Popen
    return Popen(process_obj)
  File "D:\python\Python38\lib\multiprocessing\popen_spawn_win32.py", line 93, in __init__
    reduction.dump(process_obj, to_child)
  File "D:\python\Python38\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
AttributeError: Can't pickle local object 'main.<locals>.preprocess_train'
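For reference, here is a minimal sketch (independent of diffusers) that seems to reproduce the same underlying failure. My understanding is that on Windows, DataLoader worker processes are started with the `spawn` method, which has to pickle the dataset, including its transform, and a function defined inside `main` cannot be pickled by name. The names `main` and `preprocess_train` below just mirror the training script; the rest is my own illustration.

```python
import pickle

def main():
    # Mirrors the training script: preprocess_train is defined *inside* main,
    # so it is a local object that the pickle module cannot look up by
    # qualified name.
    def preprocess_train(example):
        return example

    try:
        # With the Windows 'spawn' start method, each DataLoader worker must
        # receive a pickled copy of the dataset (including this transform).
        # Pickling the local function directly shows the same failure.
        pickle.dumps(preprocess_train)
    except (AttributeError, pickle.PicklingError) as e:
        return str(e)
    return None

error_message = main()
print(error_message)  # mentions "local object" and "preprocess_train"
```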

I appreciate any solution or insight to this problem. Thanks!
