
I want to write some code I can run in bash that takes a list of URLs and checks whether each one returns a 404. If a site does not return a 404, its URL should be written to the output list.

So in the end I should have a list of working sites. I do not know how to write the code. This looks like something that could work, right?: How to check if a URL exists or returns 404 with Java?

  • Something like: if curl --output /dev/null --silent --head --fail "$url"; then write the URL to example.txt – xxad Aug 06 '16 at 16:13
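The comment's `curl --fail` idea can be sketched as a small script. This is only a sketch: the input file `urls.txt`, the output file `example.txt`, and the function name `collect_working_urls` are illustrative names, not part of the question.

```shell
#!/bin/bash
# Collect URLs that do not return an HTTP error into an output file.
# Assumes one URL per line in the input file; both filenames are examples.
collect_working_urls() {
    local infile=$1 outfile=$2
    while IFS= read -r url; do
        # --fail makes curl exit non-zero on HTTP errors (404, 500, ...);
        # --head avoids downloading the response body.
        if curl --output /dev/null --silent --head --fail "$url"; then
            echo "$url" >> "$outfile"
        fi
    done < "$infile"
}
```

Usage: `collect_working_urls urls.txt example.txt` leaves only the reachable URLs in example.txt.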

2 Answers


You can use this code and build on it as necessary:

#!/bin/bash

array=( "http://www.stackoverflow.com" "http://www.google.com" )

for url in "${array[@]}"
do
    # --head sends a HEAD request; matching only the status line keeps the
    # check working for HTTP/2, where the reason phrase "Not Found" is omitted.
    if ! curl -s --head "${url}" | head -n 1 | grep -q " 404"
    then
       echo "URL not returning 404: ${url}"
    fi
done
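A more direct variant (a sketch building on this answer, not part of it) asks curl for the numeric status code via `-w '%{http_code}'` and compares it, which avoids parsing header text at all. The helper name `not_404` is illustrative.

```shell
#!/bin/bash
# Succeed unless the URL answers with HTTP 404.
# -o /dev/null discards the body; -w '%{http_code}' prints only the status code.
not_404() {
    local code
    code=$(curl -s -o /dev/null -w '%{http_code}' "$1")
    [ "$code" != "404" ]
}
```

Usage: `not_404 "http://www.google.com" && echo "http://www.google.com" >> example.txt`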
tale852150

Thanks for your help. I found a package for Linux called linkchecker. It does exactly what I want.

xxad