Hey, I'm pretty new to a lot of the erasure coding concepts. I've mostly only read about Reed-Solomon, but it doesn't seem to fit what I need.
I need to find a technique that can create parity shards for large data WITHOUT heavy system resource usage.
For example:
I want to store a 32 GB video cut up into eight 4 GB shards, with 3 parity shards created for it. I can't use more than a few hundred MB of memory, and I want the parity shards created incrementally so that I can write them to another file system without ever holding an entire shard in memory or on local disk.
Is there an erasure coding technique that would let me (rough sketch of what I'm picturing below the list):
- Create parity shards for large files without using significant amounts of memory
- Incrementally create and distribute the parity shards to another system by sending the bytes as they are created
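
For reference, here's roughly the stripe-by-stripe loop I'm picturing, whatever code ends up doing the actual math. It's just a toy Python sketch: the shard paths and chunk size are made up, the GF(256)/Cauchy matrix is the textbook Reed-Solomon-style construction used as a placeholder, and a real library with SIMD lookup tables would be far faster.

```python
# Toy sketch of stripe-wise parity creation: read a small chunk from each data
# shard, compute the matching chunk of every parity shard, stream those bytes
# out, repeat. Memory stays around (k + m) * CHUNK no matter how big the shards are.

CHUNK = 1 << 20  # 1 MiB per shard per pass -> ~11 MiB of buffers for 8 data + 3 parity

# GF(256) log/antilog tables (polynomial 0x11d, the usual Reed-Solomon field)
GF_EXP = [0] * 512
GF_LOG = [0] * 256
_x = 1
for _i in range(255):
    GF_EXP[_i] = _x
    GF_LOG[_x] = _i
    _x <<= 1
    if _x & 0x100:
        _x ^= 0x11D
for _i in range(255, 512):
    GF_EXP[_i] = GF_EXP[_i - 255]

def gf_mul(a: int, b: int) -> int:
    if a == 0 or b == 0:
        return 0
    return GF_EXP[GF_LOG[a] + GF_LOG[b]]

def gf_inv(a: int) -> int:
    return GF_EXP[255 - GF_LOG[a]]

def cauchy_rows(k: int, m: int):
    # Cauchy matrix: every square submatrix is invertible, so any k of the
    # k+m shards can reconstruct the original data (Reed-Solomon-style guarantee).
    return [[gf_inv(i ^ (m + j)) for j in range(k)] for i in range(m)]

def encode_streaming(data_paths, parity_paths, chunk=CHUNK):
    k, m = len(data_paths), len(parity_paths)
    coeffs = cauchy_rows(k, m)
    data_files = [open(p, "rb") for p in data_paths]
    parity_files = [open(p, "wb") for p in parity_paths]  # could be sockets instead
    try:
        while True:
            blocks = [f.read(chunk) for f in data_files]
            if not any(blocks):
                break
            n = max(len(b) for b in blocks)
            blocks = [b.ljust(n, b"\x00") for b in blocks]  # zero-pad ragged tails
            for i in range(m):
                acc = bytearray(n)
                for j, block in enumerate(blocks):
                    c = coeffs[i][j]
                    for p in range(n):
                        acc[p] ^= gf_mul(c, block[p])
                parity_files[i].write(acc)  # parity bytes leave memory immediately
    finally:
        for f in data_files + parity_files:
            f.close()

if __name__ == "__main__":
    # Hypothetical shard names, just to show how I'd call it.
    encode_streaming([f"video.shard{j}" for j in range(8)],
                     [f"video.parity{i}" for i in range(3)])
```

The key part for me is the loop shape: only one small chunk per shard plus one parity buffer is ever in memory, and parity bytes can be pushed to another system as soon as each stripe is finished. Is there an established technique/library that works this way, or a better-suited code than Reed-Solomon for it?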