Bio and Publications

Thursday, September 13, 2012

RESTful Web Services Testing, Q&A

Some background:
I was vaguely pointed at one call in an API, via a 2-page "tutorial" that uses CURL examples. Told "Test this some more." by the guy who'd been doing some amount (none?) of hand "success path" testing via CURL. This has since morphed into "regression testing things, all 12 calls", "we have a build API as well", and "there's this hot new feature for a vendor conference in a couple weeks ..."
There was more, but you get the idea.  There were some more specific "requirements" for the RESTful unit testing environment.
1) Get "smoke test" coverage vs. all the calls 
A sequence of CURL requests to exercise a server can be viewed as "testing".  It's piss-poor at best.  Indeed, it's often misleading because of the complexity of the technology stack.

In addition to the app, you're also testing Apache (or whatever server they're using) plus the framework, plus the firewall, plus caching and any other components of the server's technology stack.

However, it does get you started ASAP.
2) expand / parameterize that 
CURL isn't the best choice.  You wind up writing shell scripts.  It gets ugly before long.

Python is better for this; see the sketch below.

Selenium may also work.  Oh wait.  Selenium has Python bindings, so you're back to Python anyway.
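For instance, here's a minimal sketch of parameterized requests using Python's http.client; the host, port, and /api/items/{id} resource path are hypothetical stand-ins for your actual service.

import http.client

# Hypothetical service location and resource path; substitute your own.
HOST, PORT = "localhost", 8000

def get_item( item_id ):
    """Issue a GET for one resource; return the status and body."""
    conn = http.client.HTTPConnection( HOST, PORT )
    conn.request( "GET", "/api/items/{0}".format(item_id) )
    response = conn.getresponse()
    body = response.read()
    conn.close()
    return response.status, body

# The whole "shell script full of CURL" collapses to a loop.
for item_id in (1, 2, 42):
    status, body = get_item( item_id )
    print( item_id, status )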
3) build out to response correctness & error codes 
Proper design for testability makes this easy.

However.  When you've been tossed a "finished" RESTful web service that you're supposed to be testing, you have to struggle with expected vs. actual.

It's not trivial, because the responses may have legitimate variances: date-time stamps, changing security tokens or nonces, and sequence numbers.

Essentially, you can't just use the OS DIFF program to compare actual CURL responses with expected CURL responses.

You're going to have to parse the response, pick out the appropriate fields to check, and write proper unittest assertions around those fields, along the lines of the sketch below.
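A minimal sketch of that idea, assuming a hypothetical JSON response in which id and name are stable while timestamp and token legitimately vary:

import json
import unittest

class TestItemResponse( unittest.TestCase ):
    def test_should_have_stable_fields( self ):
        # In a real test this body comes from the service; here it's
        # a canned example of the response layout.
        body = '''{"id": 42, "name": "widget",
                   "timestamp": "2012-09-13T10:11:12Z",
                   "token": "a1b2c3d4"}'''
        response = json.loads( body )
        # Equality assertions only on the stable, predictable fields.
        self.assertEqual( 42, response["id"] )
        self.assertEqual( "widget", response["name"] )
        # Volatile fields get shape checks, not equality checks.
        self.assertRegex( response["token"], r"^[0-9a-f]+$" )
        self.assertIn( "timestamp", response )

if __name__ == "__main__":
    unittest.main()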
4) layer in at least that much testing for the new, new feature breathlessly happening RIGHT NOW.
Without a proper design for testability, this can be painful.

If you're using a good unit test framework, it shouldn't be impossible.  Your framework must be able to start the target RESTful web service for a TestCase, exercise the TestCase, and then shut the target RESTful web service down when the test has completed.  A sketch follows.
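In Python's unittest, that lifecycle might look like the following sketch; start_service.py is a hypothetical stand-in for whatever script launches your service.

import subprocess
import time
import unittest

service = None

def setUpModule():
    # Launch the target RESTful web service; Popen does not block.
    # "start_service.py" is a placeholder for your startup script.
    global service
    service = subprocess.Popen( ["python", "start_service.py"] )
    time.sleep( 1 )  # crude wait for the server to come up

def tearDownModule():
    # Shut the service down after all tests in this module complete.
    service.terminate()
    service.wait()

class TestNewFeature( unittest.TestCase ):
    def test_should_respond( self ):
        ...  # exercise the service via http.client; assert on fields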

Now, you're just writing unittest TestCase instances for the new feature breathlessly happening RIGHT NOW.  That should be manageable.
...tool things I've found so far... [list elided]
All crap, more or less.  They're REST client libraries, not testing tools.

You need a proper unit testing framework with TestCase and TestSuite classes and a TestRunner.  The tools you identified aren't testing frameworks; they're lower-level REST clients and client libraries.  CURL, by itself, isn't very good for robust testing unless you embed it in some test framework.
For defining interfaces (2), I have found these... [list elided]
APIs in a typical RESTful environment have little or no formal definition, other than Engrish.  WSDL is for Java/XML/SOAP.  It's not used much for (simpler) REST.  For the most part, REST API definitions (e.g., via JSON or whatever) are experimental.  Not standardized.  Not to be trusted.

The issue is one of parallel maintenance.  The idea is that a REST framework can operate without too much additional JSON or XML folderol; just the code should be sufficient.

If there's no WSDL (because it's just REST) then there's no formal definition of anything.  Sorry.
I (perhaps foolishly) figured there'd be some standard way to consume the standard format definition of an API, to generate test cases and stubbing at least. Maybe even a standard set of verifications, like error codes. So I went a-googling for 1) a standard / conventional way to spec these APIs, 2) a standard / conventional tool (or maybe tools, one per stack), and 3) a standard / conventional way to generate tests or test scaffolding in these tools, consuming the standard / conventional API spec. So far, not so much.
"So far, not so much" is the state of the art.  You have correctly understood what's available.

REST -- in spite of its trivial simplicity and strict adherence to HTTP -- is a rather open world.  Fancy tools don't help much.

Why not?

Because decent programming languages already do REST; tools don't add significant value.  In the case of Python, there are relatively few tools (Selenium is the big deal, and it's for browser testing) because there's no real marketplace niche for them.  In general, simple Python using httplib (or Python 3 http.client) can test the living shit out of a RESTful API better than CURL/DIFF, with no ugly shell-script coding.  Only polite, civilized Python coding.

Tuesday, September 11, 2012

RESTful Web Service Testing

Unit testing RESTful web services is rather complex.  Ideally, the underlying code is tested in isolation before being packaged as a service.  However, sometimes people will want to test the "finished" or "integrated" web services technology stack because (I suppose) they don't trust their lower-level unit tests.

Or they don't have effective lower-level unit tests.

Before we look at testing a complete RESTful web service, we need to expose some underlying principles.

Principle #1.  Unit does not mean "class".  Unit means unit: a discrete unit of code.  Class, package, module, framework, application.  All are legitimate meanings of unit.  We want to use stable, easy-to-live-with unit testing tools.  We don't want to invent something based on shell scripts running CURL and DIFF.

Principle #2.  The code under test cannot have any changes made to it for testing.  It has to be the real, unmodified production code.  This seems self-evident.  But.  It gets violated by folks who have badly-designed RESTful services.

This principle means that all the settings required for testability must be part of an external configuration.  No exceptions.  It also means that your service may need to be refactored so that the guts can be run from the command line outside Apache.
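As a hedged illustration, the service's guts might read everything environment-specific from a settings file; the settings.json name and its keys here are hypothetical:

import json

# Hypothetical external configuration; nothing test-specific in the code.
with open( "settings.json" ) as settings_file:
    settings = json.load( settings_file )

# The same unmodified application runs under Apache or from the command
# line; only the configuration file changes between environments.
host = settings["host"]          # e.g., "localhost" for a test run
port = settings["port"]          # e.g., 8000
database = settings["database"]  # e.g., ":memory:" for SQLite tests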

When your RESTful Web Service depends on third-party web service(s), there is an additional principle.

Principle #3.  You must have formal proxy classes for all RESTful services your app consumes.  These proxy classes are going to be really simple, since they must trivially map resource requests to proper HTTP processing.  In Python, it is delightfully simple to create a class where each method simply uses httplib (or http.client in Python 3.2) to make a GET, POST, PUT or DELETE request.  In Java you can do this also; it's just not delightfully simple.
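A minimal sketch of such a proxy, assuming a hypothetical third-party service with a /customers/{id} resource:

import http.client
import json

class CustomerServiceProxy:
    """Trivially maps resource requests onto HTTP processing."""
    def __init__( self, host, port ):
        self.host = host
        self.port = port

    def get_customer( self, customer_id ):
        conn = http.client.HTTPConnection( self.host, self.port )
        conn.request( "GET", "/customers/{0}".format(customer_id) )
        response = conn.getresponse()
        document = json.loads( response.read().decode("utf-8") )
        conn.close()
        return document

    def put_customer( self, customer_id, document ):
        conn = http.client.HTTPConnection( self.host, self.port )
        body = json.dumps( document )
        conn.request( "PUT", "/customers/{0}".format(customer_id), body,
                      {"Content-Type": "application/json"} )
        response = conn.getresponse()
        conn.close()
        return response.status

In a test, this proxy is the thing you replace with a mock; in production, it's the only place that knows the other service's URL layout.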

TestCase Overview

Testing a RESTful web service is a matter of starting an instance of the service, running a standard unit testing TestCase, and then shutting that instance down.  Generally this will involve setUpModule and tearDownModule (in Python parlance) or @BeforeClass and @AfterClass methods (in Java parlance).

The class-level (or module-level) setup must start the application server being tested.  The server will start in some known initial condition.  This may involve building and populating a known database, too.  This can be fairly complex.

When working with SQL, in-memory databases are essential for this.  SQLite (Python) or http://hsqldb.org (Java) can be life-savers because they're fast and flexible.
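For example, the setup might build a known database in memory; the one-table schema here is a hypothetical example:

import sqlite3

def build_test_database():
    """Create and populate a known database for one test run."""
    connection = sqlite3.connect( ":memory:" )
    connection.execute(
        "CREATE TABLE item ( id INTEGER PRIMARY KEY, name TEXT )" )
    connection.executemany(
        "INSERT INTO item ( id, name ) VALUES ( ?, ? )",
        [ (1, "first"), (2, "second"), (42, "widget") ] )
    connection.commit()
    return connection

Note that a :memory: database lives inside a single process, so this works best when the test setup and the application under test share that process, as in the mock server below.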

What's important is that the client access to the RESTful web service is entirely under control of a unit testing framework.

Mocking The Server

A small, special-purpose server must be built that mocks the full application server without the endless overheads of a full web server.

It can be simpler to mock a server rather than to try to reset the state of a running Apache server.  TestCases often execute a sequence of stateful requests assuming a known starting state.   Starting a fresh mock server is sometimes an easy way to set this known starting state.

Here's a Python script that will start a server.   It writes the PID to a file for the shutdown script.

import http.server
import os
from the_application import some_application_feature

class AppWrapper( http.server.BaseHTTPRequestHandler ):
    def do_GET( self ):
        # Parse the URL; the last element of the path is the resource id.
        id = self.path.split("/")[-1]
        # Invoke the real application's method for GET on this URL.
        body = some_application_feature( id )
        # Respond appropriately.
        self.send_response( 200 )
        self.send_header( "Content-Type", "application/json" )
        self.end_headers()
        self.wfile.write( body.encode("utf-8") )
    # ... do_POST, do_PUT, do_DELETE follow the same pattern ...

# Database setup before starting the service.
# Filesystem setup before starting the service.
# Other web service proxy processes must be started, too.
with open("someservice.pid","w") as pid_file:
    print( os.getpid(), file=pid_file )
httpd = http.server.HTTPServer( ("localhost", 8000), AppWrapper )
try:
    httpd.serve_forever()
finally:
    httpd.server_close()
    # Cleanup other web services.

Here's a shutdown script.

import os, signal
with open("someservice.pid") as pid_file:
    pid = int( pid_file.read() )
# SIGINT interrupts serve_forever() so the finally: cleanup runs.
# On Windows, use signal.CTRL_C_EVENT instead.
os.kill( pid, signal.SIGINT )

These two scripts will start and stop a mock server that wraps the underlying application.

When you're working in Java, it isn't as delightfully simple as in Python.  But it should be respectably simple.  And you have Jython's Java integration, so this Python code can invoke a Java application without too much pain.

Plus, you can always fall further back to a CGI-like unit testing capability where "body = some_application_feature( id )" becomes a subprocess.call(). Yes, it's inefficient.  We're just testing.

This CGI-like access only works if the application is very well-behaved and can be configured to process one request at a time from a local file or from the command line.  This, in turn, may require building a test harness that uses the core application logic in a CGI-like context where STDIN is read and STDOUT is written.
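A hedged sketch of that kind of harness, assuming the application exposes a process_request() function (a hypothetical name) that handles one request at a time:

import sys
import json
from the_application import process_request  # hypothetical entry point

def main():
    # CGI-like contract: one JSON request per line on STDIN,
    # one JSON response per line on STDOUT.
    for line in sys.stdin:
        request = json.loads( line )
        response = process_request( request )
        print( json.dumps(response) )
        sys.stdout.flush()

if __name__ == "__main__":
    main()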