I am trying to split large podcast MP3 files into smaller 5-minute chunks using Python and the pydub library. This is my code:
folder = r"C:\temp"
filename = r"p967.mp3"
from pydub import AudioSegment
sound = AudioSegment.from_mp3(folder + "\\" + filename)
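For completeness, this is roughly what I planned to do once the file is loaded: slice the AudioSegment in millisecond steps and export each piece (the chunk length and the output file names here are just placeholders):

five_minutes = 5 * 60 * 1000  # pydub slices by milliseconds
for i, start in enumerate(range(0, len(sound), five_minutes)):
    chunk = sound[start:start + five_minutes]
    chunk.export(folder + "\\" + "p967_part%d.mp3" % i, format="mp3")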
This works fine for small files, but for the large podcasts I am interested in (100 MB+) it returns the following error:
Traceback (most recent call last):
File "C:\temp\mp3split.py", line 6, in <module>
sound = AudioSegment.from_mp3(folder + "\\" + filename)
File "C:\Python27\lib\site-packages\pydub\audio_segment.py", line 522, in from_mp3
return cls.from_file(file, 'mp3', parameters)
File "C:\Python27\lib\site-packages\pydub\audio_segment.py", line 511, in from_file
obj = cls._from_safe_wav(output)
File "C:\Python27\lib\site-packages\pydub\audio_segment.py", line 544, in _from_safe_wav
return cls(data=file)
File "C:\Python27\lib\site-packages\pydub\audio_segment.py", line 146, in __init__
data = data if isinstance(data, (basestring, bytes)) else data.read()
MemoryError
Is this a limitation of the library? Should I be using an alternative approach to achieve this?
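One alternative I have been considering, in case the in-memory decode is the problem, is calling ffmpeg directly and letting its segment muxer do the splitting so nothing is decoded inside Python at all. An untested sketch, assuming ffmpeg is on the PATH and the output name pattern is arbitrary:

import subprocess
subprocess.check_call([
    "ffmpeg", "-i", folder + "\\" + filename,
    "-f", "segment", "-segment_time", "300",   # 300 seconds = 5 minutes
    "-c", "copy",                              # copy the stream, no re-encode
    folder + "\\" + "p967_part%03d.mp3",
])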
To check the memory status at the point of running, I added the following code:
import psutil
print psutil.virtual_memory()
This prints:
svmem(total=8476975104L, available=5342715904L, percent=37.0, used=3134259200L, free=5342715904L)
This suggests to me that there is plenty of memory at the start of the operation, though I am happy to be proven wrong.
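For reference, this is how I checked whether my interpreter is a 32-bit or 64-bit build, in case the per-process address space rather than total RAM is the real limit (just a diagnostic, not part of the splitting code):

import platform
import sys
print platform.architecture()   # e.g. ('32bit', 'WindowsPE')
print sys.maxsize > 2 ** 32     # True only on a 64-bit build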