
I have the following scenario:

  • I am developing an SPA using Polymer for the client and a bespoke Node.js server. It talks to a SQL Server database on another computer entirely.
  • The production version will run on a Raspberry Pi inside a Docker container.
  • I am trying to set up a CI system using Jenkins, with a desktop computer as the master and the Raspberry Pi as a slave node. This system will operate in pairs: one pair at home (where the desktop runs Linux) and one pair in an office (where the desktop runs Windows). A global git repository will let me transfer work between the two, and will let the Jenkins machine and its slave be driven from the same repository (exposed to the Raspberry Pi via ssh).
  • I think it unlikely that I will be able to install Docker on the office desktop.
  • I would like to find a way to test the client side of this SPA against the contents of the production Docker image. I can build a test image using the production image as a base, adding test tools such as polymer-cli (which in turn includes web-component-tester and Selenium). But that image would have to run on the Raspberry Pi, where there isn't much choice of browser.
  • I currently run tests on my Linux desktop from Node.js with a call like `const child = spawn('xvfb-run', ['-a', 'wct', '--color'], {cwd: path.resolve(__dirname, 'client')});`

What I can't quite get my head around is that some portion of web testing is driven by Node.js talking to Selenium, which then fires up a browser, but where? Node.js then serves the content to that browser, where the tests actually run (and I am using xvfb-run to capture the output from the browser). Is it possible to have the browser run on another machine, either the Windows desktop or the machine where the global git repository is located?
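For what it's worth, the shape I imagine this taking is a Selenium standalone server running on the other machine, with the test driver connecting to it over the network. A minimal sketch with the `selenium-webdriver` npm package follows; the hostnames `office-desktop` and `raspberry-pi.local` and the ports are placeholders, not anything from my actual setup:

```javascript
// Pure helper: build the remote WebDriver hub URL from host and port.
function hubUrl(host, port) {
  return `http://${host}:${port}/wd/hub`;
}

// Requires `npm install selenium-webdriver`; the require is kept inside
// the function so hubUrl() stays usable without the package installed.
async function openRemoteBrowser() {
  const { Builder } = require('selenium-webdriver');
  const driver = await new Builder()
    .usingServer(hubUrl('office-desktop', 4444)) // placeholder hostname
    .forBrowser('chrome')
    .build();
  // Point the remote browser back at the machine serving the SPA:
  await driver.get('http://raspberry-pi.local:8080/'); // placeholder URL
  return driver;
}
```

The crucial requirement in this sketch is network visibility in both directions: the test driver must reach the Selenium server's port, and the remote browser must be able to reach the URL the Node.js process serves the SPA from.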

akc42
