I am trying to use wget to download a PDF from my university website. The URL looks something like this: https://online.myuni.ac.uk/webapps/blackboard/execute/content/file?cmd=view&content_id=_xxxxxxx_1&course_id=_xxxxx_1

I have tried using wget both with cookies and with --user=xxx --password=xxx. However, what it downloads is an HTML page showing me a log-in screen saying I have insufficient permission. I cannot get this to work, and I am not sure how to proceed. I am very inexperienced with Linux and programming in general, so any help is appreciated.

Amir
    See http://stackoverflow.com/questions/1324421/how-to-get-past-the-login-page-with-wget – Pekka Jul 19 '15 at 09:23
  • I have looked there, I did the same method of exporting cookies from a browser, and I have tried the log in options too. I want to use wget in this particular instance and I am not interested in using curl. – Amir Jul 19 '15 at 09:27
  • You'll need to adjust the login field names to those in your login form. The way explained there really is *the* way to do it using wget. – Pekka Jul 19 '15 at 09:33
  • try to use url in double quotes wget "https://online.myuni.ac.uk/webapps/blackboard/execute/content/file?cmd=view&content_id=_xxxxxxx_1&course_id=_xxxxx_1" – user3355434 Jul 19 '15 at 11:22

1 Answer

I finally got this working by exporting cookies from a browser; the key was to include the --keep-session-cookies parameter.
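For reference, a sketch of the general approach with wget (the login URL and the form-field names below are placeholders, not the real Blackboard ones — inspect the actual login form to find them):

```shell
# Step 1 (only if logging in with wget itself rather than exporting
# browser cookies): submit the login form and save the cookies.
# --keep-session-cookies is needed because the session cookie has no
# expiry time and would otherwise not be written to cookies.txt.
wget --save-cookies cookies.txt \
     --keep-session-cookies \
     --post-data 'user_id=xxx&password=xxx' \
     --delete-after \
     "https://online.myuni.ac.uk/webapps/login/"

# Step 2: download the file using the saved (or browser-exported)
# cookies. Quote the URL so the shell does not treat the & characters
# as background-job operators.
wget --load-cookies cookies.txt \
     "https://online.myuni.ac.uk/webapps/blackboard/execute/content/file?cmd=view&content_id=_xxxxxxx_1&course_id=_xxxxx_1"
```

If you export cookies from a browser instead, skip step 1 and point --load-cookies at the exported cookies.txt file (it must be in Netscape cookie-file format).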

Amir