
The website I'm trying to call uses .ascx and .aspx pages to render its items, and in order to navigate you have to use the click event instead of navigating directly to a URL with GoToUrl. Currently I'm using Thread.Sleep(3000), but that's clunky and I would think it's not the best method.

What can I add to make sure the page has finished loading after the click event?

IWebElement myLink;

myLink = driver.FindElement(By.Id("ctl00_ctl00_cphContentArea_cphContentArea_ucwaag_lnkbtnDate" + i.ToString()));
myLink.Click();
Thread.Sleep(5000); //yeah need to fix this
Christos

3 Answers


I am not sure if this is THE approach, but I use a polling mechanism that waits for a JavaScript variable to be set; the variable is set manually when the page is fully loaded (since there is no reliable way to tell when it has actually finished).

C#

public void WaitUntilLoaded()
{
    // Poll until the page-side flag SeleniumHelper.isPageLoaded turns true.
    Func<IWebDriver, bool> test = drv =>
    {
        var js = (IJavaScriptExecutor)drv;
        return (bool?)js.ExecuteScript("return SeleniumHelper.isPageLoaded;") == true;
    };

    TestHelper.WaitUntil(this.Driver, d => { }, test, Constants.Pause.Medium, "load", logWaitTime: true);
}

internal static class TestHelper
{
    // Throws<T> and WritePageSource are additional helpers from my codebase, not shown here.
    private static readonly string StartDateString = DateTime.Now.ToString("yyyy-MM-ddTHH-mm-ss");
    private static readonly Random Rnd = new Random((int)DateTime.Now.Ticks);

    // Runs action, then polls predicate until it returns true or timeOut elapses.
    // If the page shows "Error Loading Page", the action is retried once.
    public static void WaitUntil(IWebDriver driver, Action<IWebDriver> action, Func<IWebDriver, bool> predicate, TimeSpan timeOut, string waitingForDescription, bool logWaitTime = false)
    {
        action(driver);

        int retryCount = 0;
        var now = DateTime.Now;
        var wait = new WebDriverWait(driver, timeOut);
        Func<IWebDriver, bool> fullPredicate = drv =>
        {
            var res = predicate(drv);
            if (!res)
            {
                bool hasErr = IsShowingErrorLoadingPage(drv);
                if (hasErr)
                {
                    Console.WriteLine(string.Format("Page is marked with [{0}]", "Error Loading Page"));
                    if (retryCount++ == 0)
                    {
                        Console.WriteLine("Retrying...");
                        action(driver);
                        res = false;
                    }
                }
            }

            return res;
        };

        try
        {
            wait.Until(fullPredicate);
        }
        catch (TimeoutException)
        {
            string waitMsg = timeOut.TotalSeconds >= 1 ? (timeOut.TotalSeconds + "s") : (timeOut.TotalMilliseconds + "ms");
            throw new TimeoutException(string.Format("Waited for {0} to {1} without success [{2}].", waitMsg, waitingForDescription, WritePageSource(driver)));
        }

        if (logWaitTime)
        {
            Console.WriteLine("WaitUntil: Requested wait: " + Math.Round(timeOut.TotalSeconds, 2).ToString("0.00").PadLeft(5, ' ') + "s; actual wait: " + Math.Round(DateTime.Now.Subtract(now).TotalSeconds, 3).ToString("0.000").PadLeft(6, ' ') + "s");
        }
    }

    public static bool IsStale(IWebElement element)
    {
        return Throws<StaleElementReferenceException>(() => element.GetAttribute("x"));
    }

    public static bool IsShowingErrorLoadingPage(IWebDriver driver)
    {
        IWebElement failedToLoad;
        try
        {
            failedToLoad = driver.FindElements(By.TagName("h1")).Where(el => el.Text == "Error Loading Page").FirstOrDefault();
        }
        catch (StaleElementReferenceException)
        {
            failedToLoad = null;
        }

        return failedToLoad != null;
    }
}

JavaScript:

// The flag lives on SeleniumHelper itself so that the C# call
// "return SeleniumHelper.isPageLoaded;" above can read it.
function SeleniumHelper() {
    SeleniumHelper.isPageLoaded = false;   // reset whenever the helper is constructed
    this.markPageAsLoaded = function () {
        logger.log('page marked as loaded');   // logger: logging helper not shown here
        SeleniumHelper.isPageLoaded = true;
    };
}
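
For completeness, a rough usage sketch from the test side; `page` here is a hypothetical page object exposing the WaitUntilLoaded method above, and it assumes the application constructs a SeleniumHelper and calls markPageAsLoaded() once its own load logic has finished:

// Click the ASP.NET postback link, then poll until the page flips the
// SeleniumHelper.isPageLoaded flag; no Thread.Sleep needed.
IWebElement myLink = driver.FindElement(
    By.Id("ctl00_ctl00_cphContentArea_cphContentArea_ucwaag_lnkbtnDate" + i.ToString()));
myLink.Click();
page.WaitUntilLoaded();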
Matas Vaitkevicius

You are best served waiting for a known element on the target page to appear. Introducing any sleep statement will make your test very brittle, so you should try to avoid that.

So something like this:

wait.Until(d => d.FindElement(By.Id(someId)));
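
Putting that together, a minimal sketch (the 10-second timeout and the linkId/someId values are placeholders; someId is the id of any element you expect on the target page):

using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Support.UI;

// Click the postback link, then block until a known element on the target
// page can be found. WebDriverWait ignores NotFoundException by default,
// so FindElement is simply retried until it succeeds or the wait times out.
IWebElement myLink = driver.FindElement(By.Id(linkId));
myLink.Click();

var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
wait.Until(d => d.FindElement(By.Id(someId)));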

In this context you might find this Stack Overflow question useful.

BrokenGlass
  • The only thing I don't like about this is you have to know each ID on the next page –  Jul 28 '14 at 15:00
  • you just have to know the id or the xpath of *one* element on the page - since you are writing a test with expectations for that page this should be a given – BrokenGlass Jul 28 '14 at 15:02
  • Broken - I know, but what if there are 1000 pages to crawl? Then I need 1000 ids –  Jul 28 '14 at 15:03
  • are you writing a test or are you using Selenium for something entirely different? You would think that you can find at least one element that is common to all these pages, e.g. a container `div` element – BrokenGlass Jul 28 '14 at 15:04
  • I'm trying to use Selenium to scrape aspx pages where javascript is used to navigate –  Jul 28 '14 at 15:06

This is the exact same issue that we faced when trying to use Selenium.

We addressed the problem by using a wait condition.

wait.Until(ExpectedConditions.ElementExists(locator));
wait.Until(ExpectedConditions.ElementIsVisible(locator));

We had to use both ElementExists and ElementIsVisible to handle controls that were already on the page but not yet visible.
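
To make that concrete, `locator` is just a By instance describing an element you expect on the target page; a rough sketch (the div id and the 10-second timeout are placeholders):

using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Support.UI;

// "locator" points at an element you already know will exist on the target
// page, e.g. the div whose id you know.
By locator = By.Id("knownDivId");   // placeholder id

var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
wait.Until(ExpectedConditions.ElementExists(locator));     // present in the DOM
wait.Until(ExpectedConditions.ElementIsVisible(locator));  // and actually displayed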

Mike Norgate
  • Mike - what if there isn't an id you know on each page? Meaning, what if there are 1000 pages, I don't want to have to find 1000 ids –  Jul 28 '14 at 15:02
  • @Mike this might not work in that situation. When we did it there was always one known element on the page – Mike Norgate Jul 28 '14 at 15:07
  • Mike - your sample is easy enough, I'm just not sure how to actually use it. What is locator? I know the ID of the div I'm looking for –  Jul 28 '14 at 15:17