Peter Keating

Developer from the New Forest in the South of England.

Walking Skeleton for Web Applications

When beginning a new project at Moov2 we always kick off with a meeting to discuss the tasks to be done for the first phase. The first task added to our task management process is always "Create a Walking Skeleton". I came across the term "Walking Skeleton" while reading a book I highly recommend, Growing Object-Oriented Software, Guided by Tests. Creating a walking skeleton involves getting the architecture in place to build, deploy and test a product so that focus can shift quickly to implementing features. The majority of the time spent creating a walking skeleton goes into setting up an automated build process. The automated build process carries out tasks, orchestrated by a build script, that configure and optimise source files to output a product ready to be deployed into a production environment. Because a walking skeleton doesn't contain features it can be used as a starting point for any web application project. With this being the case I have open sourced the walking skeleton on Github for others to use and contribute to when starting web applications.

What is a web application?

I consider a web application to be a user experience delivered in the browser that has the responsive nature of a native application instead of the request-response model of traditional websites. Web applications used to be delivered through plugins like Flash; nowadays they rely heavily on CSS3 & JavaScript to create a responsive user experience that is available across all devices. The trend towards web applications is down to many contributing factors, but the primary one for me is developers pushing the boundaries of the browser via HTML5. Pushing these boundaries forces the question of whether to develop a native application or a web application, a question that can only be answered by the context and requirements of the product. I have been working on a project recently that uses HTML5 features provided by modern browsers, which prompted the creation of the skeleton. The single page application (SPA) has the feel of a native application, providing a responsive user interface that responds to user touch.

The Walking Skeleton

As I said earlier in this blog post, all the code for the walking skeleton is available on Github. Even though it doesn't contain anything that contributes to the functionality of the end product, it is packed full of features that aid the development and deployment process.

Build Process using Ant

Ant is a Java command line tool used to automate the building of software solutions. Ant is instructed by a build file written in the familiar language of XML. Familiarity was the primary reason for me choosing Ant; not just familiarity with Ant itself, but also with integrating Ant into a continuous integration process. The continuous integration process isn't in the scope of this article, but I recommend reading Martin Fowler's excellent article on continuous integration. With Ant being an established tool (first released in 2000) it should easily integrate with most continuous integration environments. With Ant being an older tool it does rely on willing developers to create and release libraries to handle modern tasks; an example is Rhino, a Mozilla tool used to run JavaScript in a Java environment for things like testing JavaScript. It is this reliance on a committed minority that raises the question of why a more modern build tool like Grunt was not chosen. But I have decided to go with what is familiar and what I am comfortable working with, and at the moment that is Ant.

The big picture of the build process is that source files are copied to an intermediate directory, where they're configured and optimised, and then the files that contribute to the end product are copied to a publish directory. The detail is in what is done during the configuration and optimisation stages of the build process.
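At a high level the build file boils down to something like the following sketch. The target and directory names here are illustrative, not the skeleton's actual script:

```xml
<!-- Minimal sketch of the copy -> configure/optimise -> publish flow. -->
<project name="walking-skeleton" default="build">
  <target name="build">
    <!-- 1. Copy source files to an intermediate directory. -->
    <copy todir="intermediate">
      <fileset dir="src" />
    </copy>
    <!-- 2. Configuration and optimisation targets (minify, concatenate,
         rename) would run against the intermediate directory here. -->
    <!-- 3. Copy only the files that make up the end product. -->
    <copy todir="publish">
      <fileset dir="intermediate" excludes="**/*.scss" />
    </copy>
  </target>
</project>
```

Keeping the intermediate directory separate means the build can mangle files freely without ever touching the originals in source control.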

Optimizing CSS & JavaScript

In modern web design there are more considerations to be made due to the different devices that users may use to visit a web page: mobiles, tablets, laptops, desktops & TVs all need to be considered. It isn't just presenting for varying screen sizes that needs thought; delivering a lightweight page that loads fast on varying connection speeds does too. Files loaded on page load are most likely to be made up of CSS, images and JavaScript. In order to make the CSS & JavaScript load as quickly as possible, these files are loaded with the minimal number of HTTP requests at the minimum file size possible. The Ant build process in the skeleton ensures that the outputted distributable loads a single CSS file & a single JavaScript file, both compressed to the minimum possible file size.

CSS Reset & Sass

The skeleton comes with a CSS reset courtesy of normalize.css by Nicolas Gallagher, and that's all, seeing as there is nothing to style. Even though there isn't any CSS, the skeleton is geared up to use the Sass CSS preprocessor. Choosing Sass makes it easy to have only a single CSS file for the web application and to have Sass produce a compressed output, which reduces the file size of the CSS file.
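The single-file approach looks something like the sketch below; the partial names are illustrative, not files that ship with the skeleton:

```scss
// Illustrative Sass entry file. Each @import pulls in a partial, so the
// whole application compiles down to one CSS file.
@import "normalize";  // the normalize.css reset as a partial
@import "variables";
@import "layout";
```

Compiling this entry file with Sass's compressed output style (`--style compressed`) produces a single minified stylesheet, so no extra concatenation or minification step is needed for the CSS.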

AMD via RequireJS

The skeleton comes with a handful of JavaScript files to get the developer in a position where focus can be on the features. James Burke's brilliant RequireJS library is used to improve the speed and quality of the JavaScript code. The JavaScript code for the application is split up into modules that are loaded by RequireJS when required. This helps during development as there is no dependency on ensuring that <script> tags are ordered correctly, because RequireJS handles all that for you. You may now be thinking that loading files (containing modules) when they're needed will increase the number of requests; well, you're right, but that is acceptable in a development environment as it makes things easier to debug. When it comes to the production environment the JavaScript will have been compressed and concatenated into a single file requiring a single HTTP request. This is handled in the Ant build script using the RequireJS optimizer, which concatenates all the files containing modules used by a main entry module and outputs a minified (with UglifyJS) JavaScript file. The build script also handles changing the HTML to load the optimised JavaScript file for the production environment, instead of loading the RequireJS library with a data-main attribute as used in the development environment (as shown in the code sample below).

<!-- development version -->  
<script data-main="app/js/config" src="vendor/require.js"></script>

<!-- production version -->
<script src="app/js/fcb2dc81.js"></script>

I will discuss the odd file name in the production version shortly.
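To make the module pattern concrete, here is a toy sketch of AMD-style modules. The tiny define/load functions and the module names are illustrative only; this is not RequireJS itself, which exposes `require` and also resolves module names to files and loads them asynchronously:

```javascript
// Registry of defined modules, keyed by module name.
var registry = {};

// Register a named module, resolving its dependencies from the registry.
function define(name, deps, factory) {
  registry[name] = factory.apply(null, deps.map(function (dep) {
    return registry[dep];
  }));
}

// Run a callback once its dependencies are resolved (RequireJS calls
// this `require` and fetches missing files over HTTP).
function load(deps, callback) {
  callback.apply(null, deps.map(function (dep) {
    return registry[dep];
  }));
}

// app/js/greeter depends on app/js/format; in this toy version the
// dependency simply has to be defined first.
define('app/js/format', [], function () {
  return function (name) { return 'Hello, ' + name + '!'; };
});

define('app/js/greeter', ['app/js/format'], function (format) {
  return { greet: format };
});

load(['app/js/greeter'], function (greeter) {
  console.log(greeter.greet('Moov2')); // logs "Hello, Moov2!"
});
```

Because every module declares its dependencies by name, the optimizer can walk the dependency graph from the main entry module and concatenate exactly the files that are actually used.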

Modernizr

Every web project I am involved in nowadays includes Modernizr, a JavaScript library that detects the HTML5 & CSS3 features available in the visiting browser. To learn more about Modernizr I suggest taking a look at the official site, which will give you all the information you need. Alternatively, if you're confident you don't need Modernizr, it can easily be removed.

Cache Busting

It is important that returning visitors always receive the latest presentation and functionality whilst still getting the benefit of quicker page loads through caching. To ensure that a user doesn't get served out-of-date files, the URL to the file can be changed, either by renaming the file or by adding a query string variable that changes. The method used in the skeleton is one I came across in the HTML5 Boilerplate Ant build script project, which creates a random file name for the optimised CSS & JavaScript files. The Ant build script in the skeleton creates random file names, renames the optimised CSS & JavaScript files to them, and then modifies the HTML to load the renamed files instead of the originals. To see how this is done in the build script see the -css.cache & -js.cache targets.
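The renaming step is simple enough to sketch. The snippet below is only an illustration of the idea, assuming example file names and markup rather than the skeleton's actual output:

```javascript
// Generate a random lowercase hex name of the given length.
function randomName(length) {
  var hex = '0123456789abcdef';
  var name = '';
  for (var i = 0; i < length; i++) {
    name += hex.charAt(Math.floor(Math.random() * hex.length));
  }
  return name;
}

// Swap the base file name (e.g. "main") for a random one, keeping the
// path and extension, and rewrite the HTML to reference it. The real
// build also renames the optimised file on disk to match.
function bustCache(html, original) {
  var renamed = original.replace(/[^\/]+(?=\.js$)/, randomName(8));
  return { html: html.split(original).join(renamed), file: renamed };
}

var page = '<script src="app/js/main.js"></script>';
var result = bustCache(page, 'app/js/main.js');
console.log(result.file); // e.g. app/js/fcb2dc81.js
```

Because the name changes on every build, returning browsers treat the file as brand new and fetch it, while the unchanged name can be cached as aggressively as you like between deployments.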

Image Optimisation

One of the biggest things that affects page load time is images. Images take up the majority of HTTP requests and also weigh the most in kilobytes. In order to trim a bit off the load time, the build script in the skeleton has a target dedicated to compressing .png images (with plans to include .jpg in the future). The compression is done using OptiPNG. To kick off the optimisation process, open up a command prompt, change the current directory to the build folder and run the command shown below.

ant images  

The time it takes for the command to complete depends on how many images are in the directory. The command will run each image in the src/app/images directory through OptiPNG, placing the optimised image in an optimised-images directory in the root. This directory is ignored by Git to ensure the optimised images aren't included in the source code repository.

Instead of including the image optimisation in the default build target it has a target of its own, where the responsibility for running it lies with the developer. This was done because once an image has been optimised, running it through again won't change the output. Also, if there is a considerable number of images, it can really impact the speed of the build process. So once the command shown above has completed you should copy the optimised images into the images directory, overwriting the originals.


Comments

When developing anything nowadays I put in extra effort to make sure the code I'm writing is easily understood by others, and by myself when I return to it at a later date. An effective way to do this is to use comments to show my intentions with a piece of code. This is a principle that I have applied throughout the entire codebase in the skeleton. Not only have I included explanations but also links to useful resources that offer a deeper understanding of what is happening in the code.

Still to Come

Unfortunately I haven't quite had time to get everything I want into the skeleton, but there is enough there to give a good base for a web application project. One of the main things missing from the skeleton is a testing setup that is automated through the build script and promotes test-driven development when building features for the web application. Alongside tests it could also be useful to have a code coverage tool to give a good indication of how well the source code is covered by tests. Another aid to JavaScript development that is popular nowadays is the use of JSLint or JSHint to promote code quality and to detect potential errors in JavaScript code. Finally, as mentioned above, image optimisation only handles .png images, so it would be good to at least handle .jpg images as well.

The last thing that will hopefully come in the future is contributions from others. There are already many boilerplate-type repositories available on the web and this skeleton is not trying to replace them. It is primarily an aid for myself when starting a web application project, but if it is useful to others then that's a bonus. So please feel free to contribute, point out problems or offer suggestions on things that could be added or done better. Once again the project can be found on Github, or alternatively you can contact me via email or Twitter.

Back to Posts