I am trying to run PageSpeed Insights on a website by first downloading it locally and then serving it from a local server for testing. But I am getting an error like this:

Performance Score = {'error': {'code': 500, 'message': 'Lighthouse returned error: FAILED_DOCUMENT_REQUEST. Lighthouse was unable to reliably load the page you requested. Make sure you are testing the correct URL and that the server is properly responding to all requests. (Details: net::ERR_CONNECTION_FAILED)', 'errors': [{'message': 'Lighthouse returned error: FAILED_DOCUMENT_REQUEST. Lighthouse was unable to reliably load the page you requested. Make sure you are testing the correct URL and that the server is properly responding to all requests. (Details: net::ERR_CONNECTION_FAILED)', 'domain': 'lighthouse', 'reason': 'lighthouseError'}]}} 4505.094128370285

My code looks like this:
import os
import subprocess
import time

import requests

# Download the website files (wget mirrors them into a directory named after the host)
subprocess.run(["wget", "-r", "-p", "-np", url], check=True)
# Start the local server from the mirrored directory
host_dir = url.split("//")[1].rstrip("/")
server_process = subprocess.Popen(["python", "-m", "http.server"], cwd=os.path.join(os.getcwd(), host_dir))
# Give the server a moment to start before hitting it
time.sleep(1)
requests.get("http://localhost:8000/")
# Run the PageSpeed Insights analysis
pagespeed_url = f"https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=http://localhost:8000/&key={API_KEY}"
response = requests.get(pagespeed_url)
data = response.json()
# Pull the performance category score out of the Lighthouse result (None if the run failed)
performance_score = data.get("lighthouseResult", {}).get("categories", {}).get("performance", {}).get("score")
print(f"{url}: Performance Score = {performance_score}")
# Stop the local server
server_process.terminate()
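A single request fired right after `Popen` can race the server startup. A more robust approach is to poll the local URL until it answers or a timeout expires. This is a minimal sketch using only the standard library; `wait_for_server` is a hypothetical helper name, not part of the code above:

```python
import time
import urllib.error
import urllib.request


def wait_for_server(server_url, timeout=10.0, interval=0.25):
    """Poll server_url until it responds, or return False after timeout seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(server_url, timeout=interval) as resp:
                if resp.status < 500:  # any non-server-error response means it's up
                    return True
        except (urllib.error.URLError, OSError):
            time.sleep(interval)  # not accepting connections yet; retry
    return False
```

You would call it between starting the server and running the analysis, e.g. `wait_for_server("http://localhost:8000/")`, and abort if it returns False.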
I want to get PageSpeed Insights results for the copy of the site served from localhost, so that external factors like the number of concurrent users and network congestion don't affect the result.
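Since the API can return either an error payload (as in my output above) or a Lighthouse result, one way to make the failure visible instead of printing the raw dict is a small helper like this sketch (`extract_performance_score` is a hypothetical name, and the response shape assumed is the PageSpeed Insights v5 JSON):

```python
def extract_performance_score(data):
    """Return the Lighthouse performance score (0-1) from a PageSpeed Insights
    v5 response dict, or the API's error message if the run failed."""
    if "error" in data:
        # The API reports failures as {'error': {'code': ..., 'message': ...}}
        return data["error"].get("message")
    return data["lighthouseResult"]["categories"]["performance"]["score"]
```

With this, the script would print a readable message like "Lighthouse returned error: FAILED_DOCUMENT_REQUEST..." rather than the whole response dict.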