TeamCity has proved to be one of the most useful tools I have used in recent times. It has helped me with everything from continuous deployment of sites to staging and production environments, to security penetration testing, to running automated acceptance tests with SpecFlow, and much more. I really hoped there would be a way to test the SEO capabilities of a newly deployed site as well. Usually when I deploy a site I open the IIS console and run a Site Analysis against the URL using the SEO Toolkit, but I thought there must be a way to run this from TeamCity.
Over the past week or so I have been writing some prototype code that uses the SEO Toolkit DLL to create a site analysis. This worked out well: since I was able to run the analysis from code, I was able to run the application from TeamCity.
The code for this application is at a very early stage, as it was written with the help of some sample code on MSDN. It will evolve over time and I’ll make it available. The plan is for the application to support adding different parts of the analysis to the report via config parameters. As I am sure this code could be misused, I have added a small Thread.Sleep so that it can’t be used to DoS a site. I accept no responsibility for the code being misused.
Configuring TeamCity to Run the Application
Log into TeamCity as an administrator and go into the administration screen.
Click on ‘Create Project’ and enter some project details:
Next we want to create a build configuration. So click on ‘Create build configuration’:
Add some general details – name, description and build number format. The most important thing to note here is the artifacts text area, which has SEOReport.html added to it. This is the report that the crawler application creates. We add it to the artifacts so that we can surface the results in a report tab.
Next we will be asked to enter some VCS settings. I have had to hack things slightly here: technically I don’t need to check this application out of version control, so I enter the path to the application file in the checkout path as follows:
The build step is the next thing to add. On the build steps screen, click on ‘Add Build Step’ and choose ‘Command Line’ from the runner type drop-down menu. This is where we call the application:
As you can see, we leave the working directory empty, as we have already specified the directory in the VCS stage. We can then call the .exe from the command executable field. The parameters field is where it gets interesting: this is where we pass in the site to scan. If a site is not specified, an application exception will be thrown and the build will fail.
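As a rough sketch, the build step values end up looking something like the following. The executable name and staging URL here are placeholders for illustration, not the actual values from my setup:

```shell
# Hypothetical build step values -- adjust the path and URL to your own setup.
#
# Command executable (relative to the checkout directory from the VCS stage):
#   SeoCrawler\SeoCrawler.exe
# Command parameters (the site to scan; omit it and the build fails):
#   http://staging.example.com

# The equivalent call from a plain command prompt would be:
SeoCrawler\SeoCrawler.exe http://staging.example.com
```

Because it is just a command-line call, the same invocation can be tried locally before wiring it into TeamCity.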
When you have finished the setup and passed in a site, run the build. On a successful build, the build details page will show a newly configured ‘SEO Crawler Report’ tab:
If you don’t know how to add a custom report tab, I’ll cover this in a separate post. On a successful build, clicking that tab gives us a very simple report as follows:
This report only has a summary of URLs, links and violations, as well as site status codes and broken links. But as the application performs the same full scan that the SEO Toolkit does, we can make it output more detail. In coming versions of the code, I’ll make it configurable which sections are added to the report.
How Can I View The Full Report?
As I mentioned, the crawler application does a full site analysis – the same analysis that IIS does. In the application root folder there is an ‘IIS SEO Reports’ folder, which contains all the scan reports. So if you really want to, you can take a scan folder and drop it into a known IIS location (for me that is My Documents/IIS SEO Reports) and IIS will display the full report just as it would have done before.
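For example, on the build agent you could copy the latest scan folder into the location IIS reads from. The paths below are assumptions for illustration – substitute your own application root and scan folder name:

```shell
:: Hypothetical paths -- adjust to your application root and scan folder.
:: Copies a scan report into the folder IIS Site Analysis reads from,
:: so the full report can be opened in the IIS console as usual.
xcopy /E /I "C:\BuildAgent\work\SeoCrawler\IIS SEO Reports\staging.example.com" ^
  "%USERPROFILE%\Documents\IIS SEO Reports\staging.example.com"
```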
Another tip of the cap to TeamCity. I know that, as this is an executable, it could be called from any build script, but TeamCity’s built-in command line runner and parameter handling just made it very easy for me.