Saturday, February 16, 2013

Continuous Integration with Dropwizard, ClassRule, ClassPathSuite, Maven and REST-assured

I recently had a chance to do several projects using Dropwizard, and I found it to be a great tool. Basically, it's a REST container built on embedded Jetty, with Jersey, Guava, and Logback bundled in. Details can be seen here. The tool was developed and open sourced by Yammer.

Dropwizard is started as an executable jar file with a java -jar ... command. By default, Dropwizard's main thread blocks after starting the embedded instance of Jetty. This causes some issues with continuous integration, since during the build the server has to be started and stopped so tests can be executed against the container. Out of the box this could be achieved by executing the jar from inside test code (using Runtime.exec()), but that is a really undesirable solution.

Fortunately, this has been addressed by several people using a ServerCommand (available since release 0.6.1, I believe).
See links below:
forum
EmbeddableServerCommand
TestableServerCommand

I've created a project that wraps all of this together in a bundle which makes it easier to get started. The project is posted on github in the dropwizard-ci repository.

Key source classes
Here are key classes in the project

EmbeddableServerCommand - pretty much a copy/paste from the gist listed above.
EmbeddableService - wraps the command, ensuring that EmbeddableServerCommand has been added to the bootstrap.
 Services extending it just have to make sure to call super.initialize(..).
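The wrapping idea can be sketched without any Dropwizard dependency. All the types below are simplified stand-ins, not Dropwizard's real API or the project's exact code: the base class registers the command in initialize(..), so a subclass only has to remember to call super.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for Dropwizard's Bootstrap: collects registered command names.
class Bootstrap {
    final List<String> commands = new ArrayList<>();
    void addCommand(String name) { commands.add(name); }
}

// Stand-in for EmbeddableService: guarantees the server command is registered.
abstract class EmbeddableServiceSketch {
    void initialize(Bootstrap bootstrap) {
        bootstrap.addCommand("embeddable-server");
    }
}

// A concrete service only needs to delegate up before doing its own setup.
class HelloService extends EmbeddableServiceSketch {
    @Override
    void initialize(Bootstrap bootstrap) {
        super.initialize(bootstrap); // without this, tests can't start the server
        // service-specific initialization would go here
    }
}
```

The point of the pattern is that forgetting super.initialize(..) is the only way to break it, which is easy to spot in review.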

Key test classes
NOTE: Most of the credit for the test classes listed below belongs to David Drake (whose blog can be found here), as he was leading the development effort on these classes.

TestContext - starts and stops the server when needed.
Notice that TestContext is instantiated with a false flag passed into the constructor in each test, except in the test suite (DWCITestSuite) where it's created with true. See this piece of code in the TestContext class:
    @Override
    protected void starting(Description description) {
        if (forSuite == IN_SUITE) {
            startDropWizardServer();
        }
    }
This avoids starting and stopping the container for each test when running within the test suite, but makes it happen when running individual tests.
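The lifecycle split can be sketched without JUnit. This is a simplified stand-in, not the project's exact code: the flag handling and names are illustrative, but the idea is the same - only the context that "owns" the server (the suite-level one, or a test run standalone) actually starts and stops it.

```java
// Sketch of the TestContext idea: per-test contexts skip the (simulated)
// server start when a suite-level context already owns the lifecycle.
class SuiteAwareContext {
    private final boolean ownsServer;          // suite-level or standalone run
    private static boolean serverRunning = false;

    SuiteAwareContext(boolean ownsServer) {
        this.ownsServer = ownsServer;
    }

    // Corresponds to TestWatcher.starting(..) in the real project.
    void starting() {
        if (ownsServer && !serverRunning) {
            serverRunning = true;              // stands in for startDropWizardServer()
        }
    }

    void finished() {
        if (ownsServer) {
            serverRunning = false;             // stands in for stopping the server
        }
    }

    static boolean isServerRunning() {
        return serverRunning;
    }
}
```

Run inside the suite, only the suite's context flips the flag; per-test contexts see the server already running and leave it alone.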

Testing
See HelloResourceIT. Running it as a test from an IDE starts the server (via the overridden starting method of the TestWatcher (TestContext) instantiated in the class) and verifies that the endpoint returns the expected response using the REST-assured client.

Maven builds
The maven-surefire-plugin is configured to execute only DWCITestSuite, which runs all tests.

 <plugin>
   <groupId>org.apache.maven.plugins</groupId>
   <artifactId>maven-surefire-plugin</artifactId>
   <configuration>
     <includes>
       <include>**/DWCITestSuite.class</include>
     </includes>
   </configuration>
 </plugin>
To test the setup, pull the project down from github at https://github.com/pwyrwa/dropwizard-ci.git and execute the following maven command:
mvn clean test
Check the standard output from the maven build to verify that both integration tests have been executed, but the server has been started only once.

The setup shown above blends unit and integration tests (as they're all executed in the test phase). I'll fix that in my next post and update the project on github.

Sunday, July 29, 2012

Unit/Integration testing with Maven and Embedded Cassandra

I've been using Cassandra on my current project. I was able to utilize the embedded Cassandra feature of cassandra-unit to create a testable application without the need to install/configure Cassandra on developer boxes.

I've created a sample application showing the setup/code skeleton I used. The project is available on github here. It requires java 5 or higher (tested with 7) and maven 3.x installed.

Required dependencies and configuration files
1. Cassandra-unit (see pom snippets below)


        
        <dependency>
            <groupId>com.netflix.astyanax</groupId>
            <artifactId>astyanax</artifactId>
            <version>${astyanax.version}</version>
            <exclusions>
             .....
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.cassandraunit</groupId>
            <artifactId>cassandra-unit</artifactId>
            <version>${cassandra.unit.version}</version>
            <scope>test</scope>
            <exclusions>
             ....
            </exclusions>
        </dependency>
Note: Pull the project here for the full dependency list.

3. cassandra.yaml - Configuration used to start embedded version of Cassandra. Check out file in the project for configuration details. Note that all the directories required by Cassandra configured in this file are children of target directory.
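An illustrative fragment showing what "children of the target directory" means in practice (the paths are examples, not necessarily the project's exact values):

```yaml
# illustrative cassandra.yaml fragment - everything lives under target/
# so 'mvn clean' wipes all embedded-Cassandra state
data_file_directories:
    - target/embeddedCassandra/data
commitlog_directory: target/embeddedCassandra/commitlog
saved_caches_directory: target/embeddedCassandra/saved_caches
```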

4. dataset.json - Keyspace and column family definition
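A minimal dataset.json, sketched from cassandra-unit's documented format (the keyspace and column family names here are illustrative, not the project's exact contents):

```json
{
    "name": "test_keyspace",
    "replicationFactor": 1,
    "strategy": "org.apache.cassandra.locator.SimpleStrategy",
    "columnFamilies": [
        {
            "name": "users",
            "keyType": "UTF8Type"
        }
    ]
}
```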

Test case setup and layout
Although I'm not a big fan of using inheritance, I found that defining a base class with common functionality and having tests extend it was the easiest, cleanest approach in this case.

BaseCassandraTest.java (fragments)
   .....
    @BeforeClass
    public static void startCassandra()
            throws IOException, TTransportException,
            ConfigurationException, InterruptedException
    {
        EmbeddedCassandraServerHelper
        .startEmbeddedCassandra("cassandra.yaml");
    }

    @Before
    public void setUp() throws IOException, 
     TTransportException, ConfigurationException, InterruptedException
    {
        DataLoader dataLoader = new DataLoader("TestCluster", 
         "localhost:9272");
        dataLoader.load(new ClassPathJsonDataSet("dataset.json"));

        AstyanaxContext<Keyspace> context = 
         new AstyanaxContext.Builder().forCluster("TestCluster")
                .forKeyspace("test_keyspace")
                .withAstyanaxConfiguration(
                 new AstyanaxConfigurationImpl()
                 .setDiscoveryType(NodeDiscoveryType.NONE))
                .withConnectionPoolConfiguration(
                        new ConnectionPoolConfigurationImpl(
                        "testConnectionPool").setPort(9272)
                                .setMaxConnsPerHost(1)
                                .setSeeds("localhost:9272"))
                .withConnectionPoolMonitor(
                 new CountingConnectionPoolMonitor())
                .buildKeyspace(ThriftFamilyFactory.getInstance());

        context.start();
        keyspace = context.getEntity();
        cassandraAccessor = 
         new CassandraAccessor(new ObjectMapper(), keyspace);
    }

    @After
    public void clearCassandra()
    {
        EmbeddedCassandraServerHelper.cleanEmbeddedCassandra();
    }

    @AfterClass
    public static void stopCassandra()
    {
        EmbeddedCassandraServerHelper.stopEmbeddedCassandra();
    }
   ...
The class listed above defines the lifecycle for each Cassandra test. Embedded Cassandra is started/stopped before/after the class, and the keyspace is torn down, cleaned, and recreated for each test.

This setup has worked pretty well. An alternate approach would be to have a test cluster set up and just configure the client to connect to that cluster during tests.

Saturday, May 5, 2012

Deploy Spring MVC application to cloud foundry

In my previous post I described how to deploy an existing spring mvc application to the heroku cloud. Since then, I've attended the cloud foundry open tour in Portland. Here I'll write up what it takes to deploy a very simple spring-mvc web application to the cloud foundry platform.

Cloud foundry has several interesting features which distinguish it from other cloud offerings, such as micro cloud, which allows one to run a cloud instance on a local box. Additionally, the cloud foundry platform is not limited to vmware's offerings - it can be installed and run in internally hosted data centers. The best part, for me at least, is that it currently provides a very generous free offering for devs, which I'm about to tap to deploy my dummy app to the vmware-hosted implementation. Time permitting, I may build some features into the application in the future to explore some of the freely available services such as mongo, solr, etc.

Getting started with cloud foundry
Getting started is fairly simple. Personally, the single wrinkle in the process was that I had to wait about 24 hours after registering to be approved and granted access to the platform. Here are the getting-started steps:
1. Register for a free account here
2. Cloud foundry comes with excellent command line tool support (vmc). The installation procedure varies between OS's. Follow the OS-specific instructions here to install and configure vmc and log into the cloud foundry environment.
Once the tool is installed and configured it can be used to manage applications on the cloud foundry platform. To see the full list of supported commands, type vmc -help at a terminal prompt after installation.

Installing application in the cloud foundry
I was deploying a very simple spring-mvc web application built with maven. The application is available for download on github. I listed detailed steps on how to run the application locally in one of my previous blogs. Here are the steps to deploy the application to the cloud foundry platform.

1. Verify the application with the maven command
       mvn clean verify
2. cd into the application's target directory. NOTE: Personally I found it a bit non-standard to have to cd into the target directory - maybe because I'm used to working with maven, where all commands are executed at the app root level.

3. Install application with vmc push command
       vmc push
     
4. Respond to several questions at prompts:
      Would you like to deploy from the current directory? [Yn]: Y
    
Define the application name (I made it match the url for simplicity)
      Application Name: pio-spring-inter-sample
     
The next prompt is the application url. Make sure that there are no periods (.) in the url, as that didn't work for me - this might be fixed by now though.
Application Deployed URL: 'pio-spring-inter-sample.cloudfoundry.com'? 
Detected a Java SpringSource Spring Application, is this correct? [Yn]: Y
     
Next, define the memory reservation for the application. I think the free offering caps at 2GB. This is a very small app, thus 256M is more than enough (I'll save the rest of the capacity for cooler stuff that will come later).
Memory Reservation [Default:512M] (64M, 128M, 256M, 512M, 1G or 2G) 256M
Creating Application: OK
Enter N at the services prompt, as the application doesn't require services (at least for now).
Would you like to bind any services to 'pio-spring-inter-sample'? [yN]: N
Once the last prompt has been answered, vmc lists all the steps it performs to deploy and start the application
Uploading Application:
  Checking for available resources: OK
  Processing resources: OK
  Packing application: OK
  Uploading (5K): OK   
Push Status: OK
Staging Application: OK                                                         
Starting Application: OK
    
Once this process completes, the application is accessible at the url provided in one of the earlier steps. In my case it is http://pio-spring-inter-sample.cloudfoundry.com/home.

Maven plugin support

There is a maven plugin available which allows deploying the application as part of the maven build. I will not go into details describing its usage, as it's been described in this blog post. With the help of the maven plugin, applying changes can be part of the maven build cycle, which updates and restarts the application as part of a successful build.
   mvn clean verify cf:update cf:restart
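As a rough sketch, a plugin declaration along these lines goes into the pom. The coordinates and configuration element names below are from memory and may differ between plugin versions - check the referenced blog post for the exact values:

```xml
<plugin>
    <groupId>org.cloudfoundry</groupId>
    <artifactId>cf-maven-plugin</artifactId>
    <configuration>
        <appname>pio-spring-inter-sample</appname>
        <memory>256</memory>
    </configuration>
</plugin>
```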

Sunday, February 26, 2012

Deploy existing Spring MVC application to heroku

I have written a small spring-mvc application to demonstrate handling internationalization for my last blog. Since then I attended a presentation about Heroku by James Ward. Heroku is a cloud application platform hosted on EC2. I wanted to see what it takes to deploy a (very) basic, existing web app on heroku. Heroku provides a free (single dyno) service, which I'll use to host the application. Heroku supports java and maven out of the box, and since the application I wanted to deploy is a spring mvc application built with maven, I was in luck. I describe the steps I took to deploy the application below.

Getting acquainted with heroku and preparing the application
1. I followed the getting started steps to sign up, install the heroku tools locally and log in.
2. I followed the instructions to add jetty-runner to the pom file.
3. Added a Procfile to declare the process runner instructions executed on heroku, described here.
4. Tested the application locally.
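For reference, a Procfile for a jetty-runner based deployment typically looks something like the line below. The jar path assumes the pom copies jetty-runner into target/dependency, as Heroku's java guide of the time suggested:

```
web: java $JAVA_OPTS -jar target/dependency/jetty-runner.jar --port $PORT target/*.war
```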

Once I verified that the application runs fine locally, I was ready to push it up to the cloud.
Heroku uses git to push apps to the cloud, and since the app is stored on github I was ready to deploy the application.

Deploying application
I created the app on heroku by executing the following command:
heroku create --stack cedar
The output I got was something like this:
Creating severe-dusk-1991... done, stack is cedar
http://severe-dusk-1991.herokuapp.com/ | git@heroku.com:severe-dusk-1991.git
Git remote heroku added
Note that the url listed in the second line of the output is the url for accessing the app on the web.
After creation I was ready to actually deploy the app to heroku by executing the following command:
 git push heroku master
I got the following output:
....
-----> Heroku receiving push
-----> Java app detected
-----> Installing Maven 3.0.3..... done
-----> Installing settings.xml..... done
-----> executing /app/tmp/repo.git/.cache/.maven/bin/mvn -B 
		-Duser.home=/tmp/build_3pcwfaauez5e 
		-Dmaven.repo.local=/app/tmp/repo.git/.cache/.m2/repository 
		-s /app/tmp/repo.git/.cache/.m2/settings.xml -DskipTests=true clean install
       [INFO] Scanning for projects
       ....
-----> Discovering process types
       Procfile declares types -> web
-----> Compiled slug size is 14.0MB
-----> Launching... done, v5
       http://severe-dusk-1991.herokuapp.com deployed to Heroku
...
Note that the maven build is executed out on the heroku instance and the output is then deployed. Heroku also supports gradle as a build tool.
After this step the application is readily available for public viewing at http://severe-dusk-1991.herokuapp.com/home. Note: this is a free account, so I'm not sure how long this app is actually going to be available.

Logging and viewing logging output
Heroku provides access to application log output only when it's printed to standard output. The application I just deployed used the logback library and wrote its logging output to files. In order to take advantage of the heroku logging service, I had to redirect output to System.out by changing the logging configuration and redeploying the application.
Steps
1. Changed the logback configuration to use a console appender instead of file appenders (the details are beyond the scope of this article).
2. Re-deployed the application to the heroku service by re-executing the command:
	git push heroku master

3. Now I was able to monitor application logs by executing the following command:
 heroku logs --tail
Other possible application changes
Granted, the application I deployed is very, very simple. Hosting more complex applications might require significant configuration (and maybe even code) modifications. For one, heroku doesn't support session affinity. There are several approaches for dealing with that (i.e. keeping session state in something like memcached, a database, a nosql store, or such), which are beyond the scope of this article. There is also the question of deploying and configuring external services such as databases, etc., which will contribute to the complexity of cloud deployment and configuration. Heroku does provide a significant number of add-on services for deployed applications which might make things easier.

Saturday, February 11, 2012

Spring mvc internationalization support with cookies

Spring mvc has great support for adding internationalization to web applications. It allows displaying multi-language messages with just several configuration settings. I have created a sample application to demonstrate one way to accomplish that. The application is available on github at spring-internationalization. The app uses maven to resolve all dependencies and the cargo plugin to download and run a local tomcat instance. To try it out, make sure that maven 3 is installed and configured properly, pull the app down from the public github repository at spring-internationalization, open a terminal window and execute the following command in the app root directory:
 
     mvn clean package cargo:run

Running this command will download and execute a local copy of tomcat and deploy the application. Once tomcat has started, the application can be accessed in a web browser at the following url:

http://localhost:5050/spring-internationalization/home

To check out the language features, select a language from the "Pick language" select box on the home page. Changing the select box value changes the language used in the entire application (2 humble jsp pages in this particular case) and re-loads the home page with messages in the newly selected locale.

Setup

This is a typical spring mvc web application, so I'm going to skip describing the components in detail. Below are the configuration pieces needed to support internationalization:
1. Message source bean - tells spring where to find resource bundles and their file name pattern. In this application the file name pattern is messages_xx_XX.properties (i.e. messages_en_US.properties for English/US properties files); see the files under src/main/resources/bundles in the project for details.

    <bean id="messageSource" 
         class="org.springframework.context.support.ReloadableResourceBundleMessageSource"
          p:basenames="classpath:bundles/messages"
          p:defaultEncoding="utf-8"
          p:fallbackToSystemLocale="true"/>


2. Mvc interceptor - a component intercepting http requests and checking whether the locale needs to be changed.
This interceptor is configured to capture requests with a parameter named siteLanguage and attempts to change the locale based on the parameter value.
For example, a request to the home page with an appended parameter value such as
    http://localhost:5050/spring-internationalization/home?siteLanguage=fr_FR
will attempt to change the application language to French.

   <mvc:interceptors>
        <bean class="org.springframework.web.servlet.i18n.LocaleChangeInterceptor" >
            <property name="paramName" value="siteLanguage"/>
        </bean>
    </mvc:interceptors>


3. CookieLocaleResolver bean is the third piece of the puzzle. This bean stores the locale selected by the user in a browser cookie.

    <bean id="localeResolver"
        class="org.springframework.web.servlet.i18n.CookieLocaleResolver"/>


4. Finally, jsp pages use the spring:message tag to access bundle messages, as listed below.

    <spring:message code="..."/>

This is it in a nutshell. Note that this is just one of the possible strategies; Spring provides other tools for handling internationalization. One common strategy is to fall back on the language settings in the http request header to determine the desired locale when a cookie value is not available (or a CookieLocaleResolver is not configured).
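For example, the header-based strategy can be used on its own by configuring Spring's AcceptHeaderLocaleResolver in place of the cookie resolver (a sketch - this replaces, rather than complements, the cookie-based setup above):

```xml
<bean id="localeResolver"
      class="org.springframework.web.servlet.i18n.AcceptHeaderLocaleResolver"/>
```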

Friday, January 6, 2012

Using Spring 3.1 bean definition profiles with unified property management

Spring 3.1 introduced the concepts of bean definition profiles (see spring-framework-3-1-m1-released) and unified property management. I was able to use these features to allow switching between active profiles by providing a JVM system property (supplied via -Dspring.profiles.active, maven properties, cargo.args, etc...) when starting the web container, while falling back on a default profile when no JVM system property was supplied.

Specifically, I wanted to use an embedded database to allow fast, environment-independent development, unit and integration testing, versus using an external database in the production, qa and staging environments, where access is defined through a jndi parameter.

Context file:
  
  <context:property-placeholder 
            location="classpath*:profile.properties"/>

    <!-- Local profile using embedded database-->
  <beans profile="local">     
    <jdbc:embedded-database id="dataSource">     
    <jdbc:script location="classpath:scripts/hsql_tables.sql"/>
    </jdbc:embedded-database>
  </beans>


  <!-- Remote profile using external data source -->
  <beans profile="remote"> 
   <jee:jndi-lookup id="dataSource"
     jndi-name="java:comp/env/jdbc/dataSource"/>
  </beans>

Property file ( profile.properties ):
  spring.profiles.active=remote
By default, JVM system properties take precedence, thus a spring.profiles.active value submitted as a JVM system property will be used instead of the value in the profile.properties file.

Alternative solution:
Another possible way of accomplishing the same goal (without using spring bean profiles) would be to import a different context file for the database in each environment, but bean profiles seemed like a more appropriate solution in this case.
<import resource="${database.type}-context.xml"/>