
I understand that this might be too specific, but I am still quite new and having some difficulty figuring this out, so I am hoping someone can offer a clear command that achieves my goal.

My goal is to download a page from Moodle (Moodle 3.9, though I don't think the version matters). The page is a wiki that is part of Moodle, so there are child pages, links, images, etc. Each student gets their own wiki, which is available by URL; however, one must be logged in to access the URL, or else a login page shows up instead.
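To give an idea of the kind of command I think I need, here is a rough sketch (the domain, page id, cookie value, and recursion options are only my guesses at what might be required, not something I know to be right):

# Sketch only: fetch one wiki page plus the pages/images it links to,
# sending the logged-in session cookie with every request.
# The domain, id value, and cookie value are placeholders.
wget --recursive --level=2 --page-requisites --convert-links --no-parent \
     --no-cookies --header "Cookie: MoodleSession=PASTE_VALUE_FROM_FIREFOX" \
     "https://myfancydomainname.edu/mod/ouwiki/view.php?id=7&user=160456"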

I did review this question, which seemed like it would help. It even has an answer here which seems specific to Moodle.

By investigating each of the answers, I learned how to get the cookies from Firefox. So, I tried logging into Moodle in my browser, then getting the cookie for the session and using that in wget, according to the Moodle answer cited above.
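As I understood those answers, the cookie can either be sent as a header (as in my attempt further down) or exported from Firefox into a Netscape-format cookie file that wget loads. The sketch below shows the latter; the domain, expiry, and cookie value are all made up:

# Sketch of the cookie-file variant, as I understood it; every value is a placeholder.
# Netscape cookie format: domain, include-subdomains, path, secure, expiry, name, value.
printf '# Netscape HTTP Cookie File\n' > cookies.txt
printf 'myfancydomainname.edu\tFALSE\t/\tTRUE\t2147483647\tMoodleSession\t3q00000000009nr8i79uehukm\n' >> cookies.txt
wget --load-cookies cookies.txt \
     "https://myfancydomainname.edu/mod/ouwiki/view.php?id=7&user=160456"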

Everything I try results in a response in Terminal like:

Redirecting output to 'wget-log.4'

I then found out where the wget-log files are stored and checked the contents, which turned out to be the Moodle login page. While I am quite sure I got the cookie right (copy and paste), I am not sure I got everything else right.

Try as I might (for a couple of hours), I could not get any of the answers to work.

My best guess for the command is as follows (URL quoted so the shell does not treat the & as a control operator):

wget --no-cookies --header "Cookie: MoodleSession=3q00000000009nr8i79uehukm" "https://myfancydomainname.edu/mod/ouwiki/view.php?id=7&user=160456"

It just redirects the output to a wget-log file (which contains the Moodle login page).

I should not need any POST data, since once I am logged in through my browser I can simply paste this URL and it takes me straight to the page in question.
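A sanity check along these lines might make the problem clearer (whether the login page actually contains the word "login" is just my assumption, and the domain, id, and cookie value are placeholders):

# Print the page to stdout; if the output still mentions a login form,
# the session cookie is not being accepted. All values are placeholders.
wget -qO- --no-cookies \
     --header "Cookie: MoodleSession=3q00000000009nr8i79uehukm" \
     "https://myfancydomainname.edu/mod/ouwiki/view.php?id=7&user=160456" \
  | grep -ic "login"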

Any ideas what the command might be? If not, any idea how I could get some clear, step-by-step instructions on how to figure it out?

John

0 Answers