UPDATE
I split the file into multiple files of roughly 1.5 million lines each, and those load with no issues.
I am attempting to pipe roughly 15 million lines of SADD and HSET commands, formatted per the Redis Mass Insertion protocol, into Redis 6.0.6, but it fails with the following message:
ERR Protocol error: too big mbulk count string
I use the following command:
echo -e "$(cat load.txt)" | redis-cli --pipe
Running the DBSIZE command in redis-cli shows no increase during the entire run.
I can also use the formatting app I wrote (a C++ app using the redis-plus-plus client library), which correctly formats the lines and writes them to std::cout, with the following command:
./app | redis-cli --pipe
but it exits right away and only sometimes produces the error message.
If I take roughly 400,000 lines from load.txt, put them in a smaller file, and use the same echo -e command, they load fine. The problem seems to be the large number of lines.
Any suggestions? As far as I can tell it's not a formatting issue. I could code my app to write all the commands to Redis directly through the client library, but mass insertion should be faster and I'd prefer that route.