
Unable to download a file to HDFS when the URL contains spaces and the command is executed from Jython/Python. For example, the URL contains spaces in both the file name and the directory path:

> http://www.example.com/a bc/def/c h.csv

I tried the command with the spaces escaped as %20 and as a + symbol; neither worked.

I am executing the command below by opening a shell from Python (see the sketch after the command). If the URL doesn't contain spaces, it works correctly.

curl http://www.example.com/a bc/def/c h.csv | hadoop fs -put -f - /xyz/c h.csv
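
A minimal sketch of how such a call might look from Python (the question does not show the exact invocation, so the paths and the use of subprocess here are assumptions). Because the URL is pasted into the command unquoted, the shell splits it at the spaces and curl receives several separate arguments:

import subprocess

# Hypothetical reconstruction of the failing call: the unquoted URL is
# split by the shell at every space, so curl never sees the full URI.
url = "http://www.example.com/a bc/def/c h.csv"
cmd = "curl " + url + " | hadoop fs -put -f - /xyz/c h.csv"
subprocess.call(cmd, shell=True)  # fails: spaces break the arguments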
user1485267
  • I don't see anything Python-related here. URLs must not contain a space, so the "+" or "%20" encoding is correct. This seems to be a server issue; hard to tell without a valid example URI. – Frieder Jul 29 '19 at 12:46

1 Answer


What about escaping the spaces with backslashes, or surrounding the URL with quotation marks? Try:

http://www.example.com/a\ bc/def/c\ h.csv or "http://www.example.com/a bc/def/c h.csv"
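
Since the command is assembled in Python, a minimal sketch of applying this idea there: percent-encode the URL path so curl receives a valid URI, and shell-quote each argument so the embedded spaces survive the shell (the destination path and curl options are illustrative, not taken from the question):

import subprocess
try:
    from urllib.parse import quote          # Python 3
    from shlex import quote as shquote
except ImportError:
    from urllib import quote                # Python 2 / Jython
    from pipes import quote as shquote

raw_url = "http://www.example.com/a bc/def/c h.csv"
url = quote(raw_url, safe=":/")             # -> .../a%20bc/def/c%20h.csv
dest = "/xyz/c h.csv"                        # spaces are fine once quoted

# Quote both arguments so the shell passes them through intact.
cmd = "curl {} | hadoop fs -put -f - {}".format(shquote(url), shquote(dest))
subprocess.check_call(cmd, shell=True)

Either form should work: the percent-encoding fixes what curl sends to the server, while the shell quoting fixes how the local command line is parsed; the original failure mixes the two problems.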