
I am working with the Spark web framework and creating a RESTful API. (http://sparkjava.com since there are multiple things out there named "Spark")

My employer's standards mandate that we write a series of unit tests that will be automatically run once a day to confirm that applications are still up.

Spark is easy to test myself using a tool like Postman but I have not found any good examples of JUnit tests being written with Spark or even with HTTP Requests being made programmatically with it.

Has anyone done this before? Is it possible?

Jonathan
Mr Robot Arms

4 Answers


We have developed a small library that facilitates unit testing of Spark controllers/endpoints.

GitHub

Also, version 1.1.3 is published in the Maven Central Repository:

<dependency>
    <groupId>com.despegar</groupId>
    <artifactId>spark-test</artifactId>
    <version>1.1.3</version>
    <scope>test</scope>
</dependency>
  • Latest published version is now 1.1.1 – Maks Nov 24 '16 at 01:08
  • This is great! I can now test my routes without starting a server every time. – Bentaye Feb 14 '17 at 16:04
  • That's a good framework! It works well, but I'm having some trouble with the requests; maybe you can help me. I execute a POST and save an attribute on the request, then a GET returns null when asking for that attribute in the same @Test. Should that work, or do I have to do something differently? – Motomine Apr 19 '17 at 14:45
  • @Motomine what do you mean by attribute? How do you save it? Where? – Fernando Wasylyszyn Apr 20 '17 at 16:45
  • I've just posted a question related to this [here](http://stackoverflow.com/questions/43520844/how-to-test-spark-java-requests), please if you don't understand something tell me – Motomine Apr 20 '17 at 16:55
  • @Motomine I think that I understand your problem. When you perform a post in order to login a user, if the application stores the user in the session, it is very probable that a cookie with the session id is returned in the post response. You should take that cookie from the post response header and put it in the "isLogged" get request header. – Fernando Wasylyszyn Apr 21 '17 at 17:09
  • @ferwasy I've tried that and I get a headers map where one key is "Set-Cookie" and its value is "JSESSIONID=......;Path=/", where in "......" is a random string that changes in each execution. I've already tried adding a header with key "Set-Cookie" and that complete value, but it doesn't work. Am I doing something wrong? I have also tried with "JSESSIONID" as key and the random String as value, and it didn't work neither – Motomine Apr 21 '17 at 20:27
  • @Motomine the header in the request that checks the login must be "Cookie", not "Set-Cookie". The latter is used by the server to instruct the client that the cookie must be stored – Fernando Wasylyszyn Apr 23 '17 at 01:05
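The cookie hand-off described in this comment thread can be sketched in plain Java. The helper below is hypothetical (it is not part of Spark or spark-test); it only shows how to turn the Set-Cookie value from the login response into the Cookie header value for the follow-up request:

```java
// Hypothetical helper for the comment thread above: convert the value of a
// "Set-Cookie" response header (e.g. "JSESSIONID=node0abc123;Path=/") into
// the value to send back in a "Cookie" request header ("JSESSIONID=node0abc123").
public class SessionCookieHelper {

    public static String toCookieHeader(String setCookieValue) {
        // The name=value pair is everything before the first ';';
        // attributes such as Path or HttpOnly must not be echoed back.
        int semicolon = setCookieValue.indexOf(';');
        return semicolon >= 0 ? setCookieValue.substring(0, semicolon) : setCookieValue;
    }

    public static void main(String[] args) {
        System.out.println(toCookieHeader("JSESSIONID=node0abc123;Path=/"));
        // prints: JSESSIONID=node0abc123
    }
}
```

On the follow-up GET you would then set the request header "Cookie" to the returned value, as Fernando describes.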

I had the same requirement as you, and I found a way to make it work. I searched the Spark source code and found two useful classes:

  • SparkTestUtil: this class wraps Apache HttpClient and exposes methods to make different HTTP requests against a local web server (running on localhost) with a customizable port (in the constructor) and relative path (in the request methods)
  • ServletTest: it starts a Jetty instance on a local port with an application context and a relative directory path where a WEB-INF/web.xml file descriptor can be found. This web.xml is used to simulate a web application. It then uses SparkTestUtil to make HTTP requests against this simulated application and assert the results.

This is what I did: I created a JUnit test class that implements the SparkApplication interface. In that interface's init() method I create and initialize the "controller" (a class of my application) in charge of answering HTTP requests. In a method annotated with @BeforeClass I initialize the Jetty instance, using a web.xml that refers to the JUnit test class as the SparkApplication, together with a SparkTestUtil.

JUnit test class

package com.test;

import org.eclipse.jetty.server.Connector;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;
import org.eclipse.jetty.webapp.WebAppContext;
import org.junit.AfterClass;
import org.junit.BeforeClass;

import spark.servlet.SparkApplication;
import spark.util.SparkTestUtil; // helper from Spark's own test sources

public class ControllerTest implements SparkApplication {

    private static final int PORT = 9090; // any free local port

    private static SparkTestUtil sparkTestUtil;

    private static Server webServer;

    @Override
    public void init() {
         new Controller(...);
    }

    @BeforeClass
    public static void beforeClass() throws Exception {
       sparkTestUtil = new SparkTestUtil(PORT);
       webServer = new Server();
       ServerConnector connector = new ServerConnector(webServer);
       connector.setPort(PORT);
       webServer.setConnectors(new Connector[] {connector});
       WebAppContext bb = new WebAppContext();
       bb.setServer(webServer);
       bb.setContextPath("/");
       bb.setWar("src/test/webapp/");
       webServer.setHandler(bb);
       webServer.start();
       (...)
    }

    @AfterClass
    public static void afterClass() throws Exception {
       webServer.stop();
       (...)
    }    

}

src/test/webapp/WEB-INF/web.xml file

<!DOCTYPE web-app PUBLIC
 "-//Sun Microsystems, Inc.//DTD Web Application 2.3//EN"
 "http://java.sun.com/dtd/web-app_2_3.dtd" >

<web-app>
    <display-name>Archetype Created Web Application</display-name>
    <filter>
        <filter-name>SparkFilter</filter-name>
        <filter-class>spark.servlet.SparkFilter</filter-class>
        <init-param>
            <param-name>applicationClass</param-name>
            <param-value>com.test.ControllerTest</param-value>
        </init-param>
    </filter>

    <filter-mapping>
        <filter-name>SparkFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
</web-app>

This can be improved, but I think it is a good starting point. Maybe some "spark-test" component could be created?

Hope this is useful for you!


Here is my solution. You just need to add the Apache HttpClient and JUnit dependencies:

<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.2</version>
</dependency>

import spark.Spark;

public class SparkServer {
    public static void main(String[] args) {
        Spark.port(8888);
        Spark.threadPool(1000, 1000, 60000);
        Spark.get("/ping", (req, res) -> "pong");
        Spark.awaitInitialization(); // block until the embedded Jetty is ready
    }
}

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import spark.Spark;

import static org.junit.Assert.assertEquals;

public class SparkTest {
    @Before
    public void setup() {
        SparkServer.main(null);
    }

    @After
    public void tearDown() throws Exception {
        Spark.stop();
        Thread.sleep(1000); // give the embedded server time to shut down
    }

    @Test
    public void test() throws IOException {
        CloseableHttpClient httpClient = HttpClients.custom().build();

        HttpGet httpGet = new HttpGet("http://localhost:8888/ping");
        CloseableHttpResponse response = httpClient.execute(httpGet);

        int statusCode = response.getStatusLine().getStatusCode();
        BufferedReader rd = new BufferedReader(
                new InputStreamReader(response.getEntity().getContent()));

        StringBuilder result = new StringBuilder();
        String line;
        while ((line = rd.readLine()) != null) {
            result.append(line);
        }

        assertEquals(200, statusCode);
        assertEquals("pong", result.toString());
    }
}
freeslaver

Another approach is to create a class implementing Route for each path or route. For example, if you have a route like this:

get("maintenance/task", (req, response) -> {....}); 

Then replace the (req, response) -> {....} lambda with a class implementing Route.

For example:

import spark.Request;
import spark.Response;
import spark.Route;

public class YourRoute implements Route {
    @Override
    public Object handle(Request request, Response response) throws Exception {
      ....
    }
}

Would be:

get("maintenance/task", new YourRoute()); 

Then you can unit test the YourRoute class using JUnit.
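Once the handler is a named class, you can go one step further and keep the actual logic in a plain class with no Spark types at all, so the unit test needs neither a running server nor mocked Request/Response objects. A minimal sketch (the MaintenanceTaskService class and its method are made up for illustration):

```java
// Hypothetical: business logic extracted into a plain class with no Spark
// types, so it is trivially unit-testable in isolation.
public class MaintenanceTaskService {

    public String runTask(String taskName) {
        if (taskName == null || taskName.isEmpty()) {
            return "error:no-task";
        }
        return "completed:" + taskName;
    }
}

// The Route then becomes a thin adapter around the service:
//
// public class YourRoute implements Route {
//     private final MaintenanceTaskService service = new MaintenanceTaskService();
//
//     public Object handle(Request request, Response response) {
//         return service.runTask(request.queryParams("task"));
//     }
// }
```

A JUnit test then simply calls new MaintenanceTaskService().runTask("cleanup") and asserts on the return value, with no HTTP involved.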


Pau