
I'm looking for the recommended/nicest way to make Selenium tests execute in several browsers one after another. The website I'm testing isn't big, so I don't need a parallel solution yet.

I have the usual test set-up methods with [SetUp], [TearDown], and [Test]. The SetUp one, of course, instantiates a new ISelenium object with whatever browser I want to test with.

So what I want to do is programmatically say: this test will be run on Chrome, IE, and Firefox in sequence. How do I do that?

EDIT:

This might help a bit. We're using CruiseControl.NET to start the NUnit tests after a successful build. Is there any way to pass a parameter to the NUnit executable, and then use that parameter in the test? This way we could have NUnit run several times with different browser parameters.

Edgar

8 Answers


NUnit 2.5+ now supports Generic Test Fixtures which make testing in multiple browsers very straightforward. http://www.nunit.org/index.php?p=testFixture&r=2.5

Running the following example will execute the GoogleTest twice, once in Firefox and once in IE.

using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Firefox;
using OpenQA.Selenium.IE;
using System.Threading;

namespace SeleniumTests 
{
    // Each [TestFixture(typeof(...))] attribute runs the whole fixture once with that driver type
    [TestFixture(typeof(FirefoxDriver))]
    [TestFixture(typeof(InternetExplorerDriver))]
    public class TestWithMultipleBrowsers<TWebDriver> where TWebDriver : IWebDriver, new()
    {
        private IWebDriver driver;

        [SetUp]
        public void CreateDriver () {
            this.driver = new TWebDriver();
        }

        [Test]
        public void GoogleTest() {
            driver.Navigate().GoToUrl("http://www.google.com/");
            IWebElement query = driver.FindElement(By.Name("q"));
            query.SendKeys("Bread" + Keys.Enter);

            Thread.Sleep(2000); // crude wait for the results page; an explicit wait would be more robust

            Assert.AreEqual("bread - Google Search", driver.Title);
            driver.Quit();
        }
    }
}
alanning
  • @alanning I cannot get my system to recognize where the driver for IE is located, where do I need to place it in order for it to run? – DEnumber50 Apr 21 '15 at 19:08
    @DEnumber50 been a while since I've worked with this but https://code.google.com/p/selenium/wiki/InternetExplorerDriver says the executable must be in your path. Also note the required config for IE11 if you are using that. – alanning Apr 22 '15 at 03:41
  • Also see @arve-systad's [answer](http://stackoverflow.com/a/19139816/219238) below for an example of how to specify the driver location. – alanning Mar 22 '16 at 20:41
  • Is there a way to inherit this class so that redundant code in each test class is avoided? – bit Aug 20 '16 at 03:31

This is a recurring question and is solved in a couple of ways:

  1. A factory method produces your ISelenium object - you have a helper class with a static getSelenium method. That method reads in some external config, which has a property that defines the browser you want as a string. In getSelenium you then configure the browser accordingly (see the sketch after this list). Here's a handy post on using config files with NUnit: http://blog.coryfoy.com/2005/08/nunit-app-config-files-its-all-about-the-nunit-file/

  2. Others have had success injecting the browser via an IoC container. I really like this because TestNG works really well with Guice in Java land, but I'm not sure how easy it is to mix NUnit with Ninject, MEF, etc.
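A minimal sketch of the factory approach from option 1, assuming the browser string lives in an appSettings key and you are on the Selenium RC client (the key name, host, port, and start URL below are illustrative):

using System.Configuration;
using Selenium; // Selenium RC .NET client (ISelenium / DefaultSelenium)

public static class SeleniumFactory
{
    // Reads the browser string (e.g. "*firefox", "*iexplore", "*googlechrome")
    // from the test project's config file and returns a started ISelenium.
    public static ISelenium GetSelenium()
    {
        string browser = ConfigurationManager.AppSettings["browser"] ?? "*firefox";
        ISelenium selenium = new DefaultSelenium("localhost", 4444, browser, "http://localhost/");
        selenium.Start();
        return selenium;
    }
}

Each test's [SetUp] can then call SeleniumFactory.GetSelenium(), so switching browsers only means changing the config value between runs.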

pnewhook
  • I have the helper stuff and configurable browser string, but the problem is that I have to run the same test multiple times with different browsers. – Edgar Feb 21 '11 at 08:22
  • Then you should be swapping the config files for each run. The whole config file would be the same, but you'd have the browser name as a string for each browser you'd want to test. – pnewhook Feb 22 '11 at 15:08

This is basically just an expansion of alanning's answer (Oct 21 '11 at 20:20). My case was similar, except that I did not want to use the parameterless constructor (and thus the default path to the driver executables). I had a separate folder containing the drivers I wanted to test against, and this seems to work out nicely:

using System;
using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;
using OpenQA.Selenium.IE;

[TestFixture(typeof(ChromeDriver))]
[TestFixture(typeof(InternetExplorerDriver))]
public class BrowserTests<TWebDriver> where TWebDriver : IWebDriver, new()
{
    private IWebDriver _webDriver;

    [SetUp]
    public void SetUp()
    {
        // Drivers live in a WebDrivers folder in the solution rather than on the PATH
        string driversPath = Environment.CurrentDirectory + @"\..\..\..\WebDrivers\";

        // Call the driver's (string driverPath) constructor instead of the parameterless one
        _webDriver = Activator.CreateInstance(typeof(TWebDriver), new object[] { driversPath }) as IWebDriver;
    }

    [TearDown]
    public void TearDown()
    {
        _webDriver.Dispose(); // Actively dispose it, doesn't seem to do so itself
    }

    [Test]
    public void Tests()
    {
        //TestCode
    }
}

Arve Systad
  • +1 Brilliant - this is much neater and you only need to decorate your class with the different drivers. – Deano Nov 04 '13 at 11:41
  • Is there a way to inherit this class so that redundant code in each test class is avoided? – bit Aug 20 '16 at 03:30
  • I don't know how the TestFixture attribute works with inheritance, but everything inside SetUp and TearDown should be possible to move into some other class, so your tests just call "GetWebDriver" or something. I guess you could try to inherit from this class and see what happens? – Arve Systad Aug 21 '16 at 09:05
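A minimal sketch of what that last comment describes: keep the SetUp/TearDown plumbing in a shared base class and let each concrete fixture carry only the driver attributes and the tests. Class and test names here are illustrative.

using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Firefox;
using OpenQA.Selenium.IE;

public abstract class WebDriverFixture<TWebDriver> where TWebDriver : IWebDriver, new()
{
    protected IWebDriver Driver { get; private set; }

    [SetUp]
    public void CreateDriver()
    {
        Driver = new TWebDriver();
    }

    [TearDown]
    public void DisposeDriver()
    {
        Driver.Dispose();
    }
}

[TestFixture(typeof(FirefoxDriver))]
[TestFixture(typeof(InternetExplorerDriver))]
public class SearchTests<TWebDriver> : WebDriverFixture<TWebDriver>
    where TWebDriver : IWebDriver, new()
{
    [Test]
    public void HomePageLoads()
    {
        Driver.Navigate().GoToUrl("http://www.google.com/");
        Assert.IsNotNull(Driver.Title);
    }
}

NUnit picks up [SetUp] and [TearDown] methods defined in base classes, so only the attribute decorations need repeating on each fixture.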

I use a list of IWebDriver instances to perform each test step against all browsers, line by line:

// Drivers and shared state are fields on the test class (declarations shown here for context);
// 'path' is the folder containing chromedriver.exe and IEDriverServer.exe.
private static List<IWebDriver> drivers;
private static IWebDriver firefoxDriver, chromeDriver, ieDriver;
private static string path;
private static string baseURL;

[ClassInitialize]
public static void ClassInitialize(TestContext context) {
    drivers = new List<IWebDriver>();
    firefoxDriver = new FirefoxDriver();
    chromeDriver = new ChromeDriver(path);
    ieDriver = new InternetExplorerDriver(path);
    drivers.Add(firefoxDriver);
    drivers.Add(chromeDriver);
    drivers.Add(ieDriver);
    baseURL = "http://localhost:4444/";
}

[ClassCleanup]
public static void ClassCleanup() {
    drivers.ForEach(x => x.Quit());
}

...and then I am able to write tests like this:

[TestMethod]
public void LinkClick() {
    WaitForElementByLinkText("Link");
    drivers.ForEach(x => x.FindElement(By.LinkText("Link")).Click());
    AssertIsAllTrue(x => x.PageSource.Contains("test link"));
}

...where I write my own methods, WaitForElementByLinkText and AssertIsAllTrue, to perform the operation for each driver and, if anything fails, output a message that helps me identify which browser(s) failed:

public void WaitForElementByLinkText(string linkText) {
    List<string> failedBrowsers = new List<string>();
    foreach (IWebDriver driver in drivers) {
        try {
            // 'clock' is an IClock field (e.g. new SystemClock()) required by this WebDriverWait overload
            WebDriverWait wait = new WebDriverWait(clock, driver, TimeSpan.FromSeconds(5), TimeSpan.FromMilliseconds(250));
            wait.Until((d) => { return d.FindElement(By.LinkText(linkText)).Displayed; });
        } catch (TimeoutException) {
            failedBrowsers.Add(driver.GetType().Name + " Link text: " + linkText);
        }
    }
    Assert.IsTrue(failedBrowsers.Count == 0, "Failed browsers: " + string.Join(", ", failedBrowsers));
}

The IE driver is painfully slow, but this gets three of the main browsers running tests 'side by side'.

DevDave

I was wondering about the same issue and finally found a solution.

Once you install the plugin, you can control which browser(s) each scenario should be tested in.

Feature example:

@Browser:IE
@Browser:Chrome
Scenario Outline: Add Two Numbers
    Given I navigated to /
    And I have entered <SummandOne> into summandOne calculator
    And I have entered <SummandTwo> into summandTwo calculator
    When I press add
    Then the result should be <Result> on the screen
    Scenarios:
        | SummandOne | SummandTwo | Result |
        | 50 | 70 | 120 |
        | 1 | 10 | 11 |

Implementation

[Given(@"I have entered '(.*)' into the commentbox")]
public void GivenIHaveEnteredIntoTheCommentbox(string p0)
{
    Browser.Current.FindElement(By.Id("comments")).SendKeys(p0);
}

More info

Vova Bilyachat

OK, one solution is to have wrapper tests that set up the ISelenium object with different browsers. They then pass that object to all the other tests, which use it instead of setting up a new one themselves as they did previously.

The disadvantage is that I end up with one big test per browser. Not the best solution either. Still looking...

EDIT:

Spent some more time on this. The solution I came up with is to have a text file in the solution that specifies the browser to use for testing. NUnit picks up the setting when instantiating a Selenium object.
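A minimal sketch of that set-up, assuming the text file simply holds the Selenium RC browser launcher string; the file path, host, port, and start URL below are illustrative:

using System.IO;
using NUnit.Framework;
using Selenium;

public class CrossBrowserTestBase
{
    protected ISelenium selenium;

    [SetUp]
    public void SetUp()
    {
        // selenium_browser.txt is rewritten by the build server before each NUnit run,
        // e.g. "firefox C:\Program Files (x86)\Mozilla Firefox\firefox.exe"
        string browser = File.ReadAllText(@"F:\...\selenium_browser.txt").Trim();

        // Prepend "*" to form the RC launcher string, e.g. "*firefox C:\...\firefox.exe"
        selenium = new DefaultSelenium("localhost", 4444, "*" + browser, "http://localhost/");
        selenium.Start();
    }

    [TearDown]
    public void TearDown()
    {
        selenium.Stop();
    }
}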

I'm using CruiseControl.NET to run automatic builds and tests. Instead of running the tests just once, I configured it to run them twice, and before each run a command-line step changes the browser in the configuration text file.

<exec>
    <executable>cmd</executable>
    <buildArgs>/C echo firefox C:\\Program Files (x86)\\Mozilla Firefox\\firefox.exe > F:\...\selenium_browser.txt</buildArgs>
</exec>
<exec>
    <executable>F:\...\NUnit 2.5.7\bin\net-2.0\nunit-console.exe</executable>
    <baseDirectory>F:\...\bin\Debug</baseDirectory>
    <buildArgs>F:\...\...nunit /xml:"F:\CCXmlLog\Project\nunit-results.xml" /noshadow</buildArgs>
    <successExitCodes>0</successExitCodes>
    <buildTimeoutSeconds>1200</buildTimeoutSeconds>
</exec>

<exec>
    <executable>cmd</executable>
    <buildArgs>/C echo googlechrome C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe > F:\...\selenium_browser.txt</buildArgs>
</exec>
<exec>
    <executable>F:\...\NUnit 2.5.7\bin\net-2.0\nunit-console.exe</executable>
    <baseDirectory>F:\...\bin\Debug</baseDirectory>
    <buildArgs>F:\...\...nunit /xml:"F:\CCXmlLog\Project\nunit-results.xml" /noshadow</buildArgs>
    <successExitCodes>0</successExitCodes>
    <buildTimeoutSeconds>1200</buildTimeoutSeconds>
</exec>
Edgar
  • It turns out you can also specify environment variables with the `` element, so that would be a way to avoid the text files. – Edgar Jun 02 '11 at 16:11

This helped me to solve a similar problem: How do I run a set of nUnit tests with two different setups?

Just set up different browsers in the setup method :] (a parameterized-fixture sketch along those lines is below).
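A minimal sketch of that idea, using NUnit's parameterized TestFixture attribute to pass a browser string into the set-up; the host, port, start URL, and test body are illustrative:

using NUnit.Framework;
using Selenium;

[TestFixture("*firefox")]
[TestFixture("*googlechrome")]
[TestFixture("*iexplore")]
public class CrossBrowserTests
{
    private readonly string browser;
    private ISelenium selenium;

    // NUnit constructs the fixture once per [TestFixture(...)] attribute value
    public CrossBrowserTests(string browser)
    {
        this.browser = browser;
    }

    [SetUp]
    public void SetUp()
    {
        selenium = new DefaultSelenium("localhost", 4444, browser, "http://localhost/");
        selenium.Start();
    }

    [TearDown]
    public void TearDown()
    {
        selenium.Stop();
    }

    [Test]
    public void HomePageHasTitle()
    {
        selenium.Open("/");
        Assert.IsNotEmpty(selenium.GetTitle());
    }
}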

slawek

There must be a better way, but you could use T4 templates to generate duplicate test classes for each browser - essentially automating a copy-and-paste of the tests for each browser.
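A rough sketch of what such a T4 template could look like, assuming the real test body is shared elsewhere; the browser list and class names are illustrative:

<#@ template language="C#" #>
<#@ output extension=".cs" #>
<#
    // Browsers to generate a fixture for (illustrative list)
    var browsers = new[] { "Firefox", "Chrome", "InternetExplorer" };
#>
using NUnit.Framework;

namespace SeleniumTests.Generated
{
<# foreach (var browser in browsers) { #>
    [TestFixture]
    public class GoogleTest_<#= browser #>
    {
        [Test]
        public void Search()
        {
            // ...same test body in each generated class, constructing the <#= browser #> driver...
        }
    }
<# } #>
}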

Massif