Sergio and the sigil

[ANN] Chicago ALT.NET shows Rake and Albacore

Posted by Sergio on 2010-03-09

I haven't mentioned our meetings here in a while, but our group has been going strong, with plenty of enthusiasm, all this time.

Tomorrow, March 10th, our topic will be build scripts for .NET projects using Rake and Albacore. I've been using Rake and a little bit of Albacore in my own projects, and I'm ready to say it would take a very serious event to make me go back to NAnt or MSBuild.

Introduction to Rake with Albacore.NET

6:00 pm
Pizza and networking time

6:30 pm

How would you like to write your build scripts in a scripting language instead of XML? In this month's meeting we will see how the ease of programming in Ruby can be used to create a much more pleasant and extensible build script.

Rake isn't just for Rubyists or alpha geeks anymore. Albacore helps bring the power and expressiveness of the Ruby language to the world of .NET build automation. With Rake it has never been easier to handle build automation, test execution, continuous integration, and just about any task you need to automate for your build.
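To give a flavor of what "build scripts in plain Ruby" means, here is a minimal Rake sketch. The task names and messages are purely illustrative, not from any real project; a real Albacore script would layer msbuild, nunit, and similar tasks on top of this same dependency model.

```ruby
require "rake"
include Rake::DSL # make the task DSL available outside a Rakefile

# Hypothetical build pipeline: each task declares its prerequisite,
# and Rake runs each task at most once, in dependency order.
task :clean do
  puts "cleaning build output"
end

task :compile => :clean do
  puts "compiling the solution"
end

task :test => :compile do
  puts "running the unit tests"
end

# Invoking :test pulls in :clean and :compile first.
Rake::Task[:test].invoke
```

Because tasks are just Ruby, you get loops, conditionals, and string handling for free, with no XML escaping in sight.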

Michael D. Hall has been developing software on the Microsoft platform for over a decade. He's been an Alt.NETter for years and is really enjoying the exposure to different ideas and concepts beyond the safe confines of the .NET world. Currently he's a consultant working with Obtiva and has started a Cloud Developer's Group that meets monthly in McHenry county.

Register for Introduction to Rake with Albacore.NET in Chicago, IL on Eventbrite

Code coverage reports with NCover and MSBuild

Posted by Sergio on 2010-02-09

I've been doing a lot of static analysis on our projects at work lately. As part of that task we added NCover to our automated build process. Our build runs on Team Build (TFS) and is specified in an MSBuild file.

We wanted to take code metrics very seriously and we purchased the complete version of the product to take full advantage of its capabilities.

Getting NCover to run in your build is very simple, and the online documentation is enough to figure it out. The problem comes when you start needing more and more variations of the reports. The online documentation is a little short on this aspect, especially on how to use the MSBuild or NAnt custom tasks. I hear they plan to update the site with better docs for the next version of the product.

NCover Complete comes with 23 different types of reports and a ton of parameters that can be configured to produce far more helpful reports than just sticking to the defaults.

For example, we are working on a new release of our product and we are pushing ourselves to produce more testable code and write more unit tests for all the new code. The problem is that the new code is just a tiny fraction of the existing code, and the metrics get averaged down by the older code.

The key is to separate the code coverage profiling (which NCover does while it runs all the unit tests with NUnit) from the rendering of the reports. That way we run the code coverage only once, and that step can take a good chunk of time to produce the coverage data. Rendering the reports is much quicker, since the NCover reporting engine can feed off the saved coverage data as many times as we need.

Once we have the coverage data we can choose which report types we want to create, the thresholds for sufficient coverage (or to fail the build), which assemblies/types/methods we want to include/exclude from each report and where to save each of them.

Example

To demonstrate what I just described in practice, I decided to take an existing open source project and add NCover reporting to it. The project I selected was AutoMapper mostly because it's not very big and has decent test coverage.

I downloaded the project's source code from the repository and added a file named AutoMapper.msbuild to its root directory. You can download this entire file but I'll go over it piece by piece.

We start by importing the MSBuild tasks that ship with NCover into our script and declaring a few targets, including one to collect coverage data and one to generate the reports. I added the NCover tasks DLL to the project under tools/NCoverComplete.

<Project DefaultTargets="RebuildReports" 
  xmlns="http://schemas.microsoft.com/developer/msbuild/2003" >
  <UsingTask  TaskName="NCover.MSBuildTasks.NCover" 
        AssemblyFile="$(ProjectDir)tools\NCoverComplete\NCover.MSBuildTasks.dll"/>
  <UsingTask  TaskName="NCover.MSBuildTasks.NCoverReporting" 
        AssemblyFile="$(ProjectDir)tools\NCoverComplete\NCover.MSBuildTasks.dll"/>

  <PropertyGroup>
    <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
    <BuildDir>$(MSBuildProjectDirectory)\build\$(Configuration)</BuildDir>
    <NUnitBinDirectoryPath>$(MSBuildProjectDirectory)\tools\NUnit</NUnitBinDirectoryPath>
  </PropertyGroup>

  <Target Name="RebuildReports" DependsOnTargets="RunCoverage;ExportReports" >
    <Message Text="We will rebuild the coverage data, then refresh the reports." 
          Importance="High" />
  </Target>

  <Target Name="RunCoverage" >
    <!-- snip -->
  </Target>

  <Target Name="ExportReports" >
    <!-- snip -->
  </Target>
</Project>

Now let's look closely at the target that gathers the coverage data. All it does is tell NCover (NCover console, really) to run NUnit over the AutoMapper.UnitTests.dll and save all the output to well-known locations.

<Target Name="RunCoverage" >
  <Message Text="Starting Code Coverage Analysis (NCover) ..." Importance="High" />
  <PropertyGroup>
    <!-- Scenario is referenced by the NCover task below; define it here
         so this target also works when run on its own -->
    <Scenario>AutoMapper-Full</Scenario>
    <NCoverOutDir>$(MSBuildProjectDirectory)\build\NCoverOut</NCoverOutDir>
    <NUnitResultsFile>build\NCoverOut\automapper-nunit-result.xml</NUnitResultsFile>
    <NUnitOutFile>build\NCoverOut\automapper-nunit-Out.txt</NUnitOutFile>
    <InputFile>$(BuildDir)\UnitTests\AutoMapper.UnitTests.dll</InputFile>
  </PropertyGroup>

  <NCover ToolPath="$(ProgramFiles)\NCover"
    ProjectName="$(Scenario)"
    WorkingDirectory="$(MSBuildProjectDirectory)"   
    TestRunnerExe="$(NUnitBinDirectoryPath)\nunit-console.exe"

    TestRunnerArgs="$(InputFile) /xml=$(NUnitResultsFile) /out=$(NUnitOutFile)"

    AppendTrendTo="$(NCoverOutDir)\automapper-coverage.trend"
    CoverageFile="$(NCoverOutDir)\automapper-coverage.xml"
    LogFile="$(NCoverOutDir)\automapper-coverage.log"
    IncludeTypes="AutoMapper\..*"
    ExcludeTypes="AutoMapper\.UnitTests\..*;AutoMapper\.Tests\..*"
    SymbolSearchLocations="Registry, SymbolServer, BuildPath, ExecutingDir"
  />
</Target>

Of special interest in the NCover task above are the output files automapper-coverage.xml and automapper-coverage.trend, which contain the precious coverage data and the historical trend data, respectively. In case you're curious, the trend file is actually a SQLite3 database file that you can report from directly or export to other database formats if you want.

Also note the IncludeTypes and ExcludeTypes parameters, which guarantee that we are not tracking coverage on code that we don't care about.

Now that we have our coverage and trend data collected and saved to known locations, we can run as many reports as we want without executing the whole test suite again. That's in the next target.

<Target Name="ExportReports" >
  <Message Text="Producing NCover Reports..." Importance="High" />
  <PropertyGroup>
    <Scenario>AutoMapper-Full</Scenario>
    <NCoverOutDir>$(MSBuildProjectDirectory)\build\NCoverOut</NCoverOutDir>
    <RptOutFolder>$(NCoverOutDir)\$(Scenario)Coverage</RptOutFolder>
    <Reports>
      <Report>
        <ReportType>FullCoverageReport</ReportType>
        <OutputPath>$(RptOutFolder)\Full\index.html</OutputPath>
        <Format>Html</Format>
      </Report>
      <Report>
        <ReportType>SymbolModuleNamespaceClass</ReportType>
        <OutputPath>$(RptOutFolder)\ClassCoverage\index.html</OutputPath>
        <Format>Html</Format>
      </Report>
      <Report>
        <ReportType>Trends</ReportType>
        <OutputPath>$(RptOutFolder)\Trends\index.html</OutputPath>
        <Format>Html</Format>
      </Report>
    </Reports>
    <SatisfactoryCoverage>
      <Threshold>
        <CoverageMetric>MethodCoverage</CoverageMetric>
        <Type>View</Type>
        <Value>80.0</Value>
      </Threshold>
      <Threshold>
        <CoverageMetric>SymbolCoverage</CoverageMetric>
        <Value>80.0</Value>
      </Threshold>
      <Threshold>
        <CoverageMetric>BranchCoverage</CoverageMetric>
        <Value>80.0</Value>
      </Threshold>
      <Threshold>
        <CoverageMetric>CyclomaticComplexity</CoverageMetric>
        <Value>8</Value>
      </Threshold>
    </SatisfactoryCoverage>

  </PropertyGroup>

  <NCoverReporting 
    ToolPath="$(ProgramFiles)\NCover"
    CoverageDataPaths="$(NCoverOutDir)\automapper-coverage.xml"
    LoadTrendPath="$(NCoverOutDir)\automapper-coverage.trend"
    ProjectName="$(Scenario) Code"
    OutputReport="$(Reports)"
    SatisfactoryCoverage="$(SatisfactoryCoverage)"
  />
</Target>

What you can see in this target is that we are creating three different reports, represented by the Report elements, and that we are lowering the satisfactory threshold to 80% code coverage (down from the default of 95%) and capping cyclomatic complexity at 8. These two blocks of configuration are passed to the NCoverReporting task via the OutputReport and SatisfactoryCoverage parameters, respectively.

The above reports are shown in the images below.


Focus on specific areas

Let's now say that, in addition to the reports for the entire source code, we also want to keep a closer eye on the classes under the AutoMapper.Mappers namespace. We can get that going with another reporting target, filtering the reported data down to just the code we are interested in:

<Target Name="ExportReportsMappers" >
  <Message Text="Reports just for the Mappers" Importance="High" />
  <PropertyGroup>
    <Scenario>AutoMapper-OnlyMappers</Scenario>
    <NCoverOutDir>$(MSBuildProjectDirectory)\build\NCoverOut</NCoverOutDir>
    <RptOutFolder>$(NCoverOutDir)\$(Scenario)Coverage</RptOutFolder>
    <Reports>
      <Report>
        <ReportType>SymbolModuleNamespaceClass</ReportType>
        <OutputPath>$(RptOutFolder)\ClassCoverage\index.html</OutputPath>
        <Format>Html</Format>
      </Report>
      <!-- add more Report elements as desired -->
    </Reports>
    <CoverageFilters>
      <Filter>
        <Pattern>AutoMapper\.Mappers\..*</Pattern>
        <Type>Class</Type>
        <IsRegex>True</IsRegex>
        <IsInclude>True</IsInclude>
      </Filter>
      <!-- include/exclude more classes, assemblies, namespaces, 
      methods, files as desired -->
    </CoverageFilters>

  </PropertyGroup>

  <NCoverReporting 
    ToolPath="$(ProgramFiles)\NCover"
    CoverageDataPaths="$(NCoverOutDir)\automapper-coverage.xml"
    ClearCoverageFilters="true"
    CoverageFilters="$(CoverageFilters)"
    LoadTrendPath="$(NCoverOutDir)\automapper-coverage.trend"
    ProjectName="$(Scenario) Code"
    OutputReport="$(Reports)"
  />
</Target>

Now that we have this basic template, our plan is to identify problem areas in the code and create reports aimed at them. The URLs of the reports will be included in the CI build reports and notification emails.

Adding more reports is so easy that some of them will live for just a single release cycle, or even less if we need it.

I hope this is helpful to other people, because it took a good amount of time to get it all sorted out. Even if you're using NAnt instead of MSBuild, the syntax is similar and I'm sure you can port the idea easily.

Mozilla Add-Ons in Chicago

Posted by Sergio on 2009-09-17

Later this month I'll be attending the Mozilla Add-Ons Meetup in Chicago.

I'm continually impressed with the extensibility of Mozilla applications and the amazing things people are doing with them. I'm interested both in the extensibility model and in writing a few custom extensions myself, even if it's just for my own use. Given that it's mostly XML and JavaScript, it should be right up my alley.

After seeing a presentation about building Firefox extensions earlier this year I decided I had to look into that more seriously.

So if you're in the area and want to see what this is all about, this meetup might be a good way to pick up enough info to get going.

Generated by a tool, not for human consumption

Posted by Sergio on 2009-01-30

A few years ago I had an interesting discussion with some of my then coworkers about the XML comments in our code. XML comments were useful in some cases because we were writing some libraries to be shared with many of our applications. We kept the DLLs, the XML and the CHM all in the build folder for any other developer that needed to use that library.

I know some of you have strong opinions against or in favor of XML comments. What I know is that they don't bother me but I'd trade them for a clear and self-explanatory API in a heartbeat.

But what really bothered me was when one of the guys came to me and showed me this amazing Visual Studio add-on that would automatically generate the XML comments for him. I won't name the tool here because it's not important. GhostDoc (ooopsy!!!) goes through all your code, finds where XML comments can be put, and tries to guess the text of the comment from the names of the members, parameters, etc. When it finishes, a method named GetOrders will have the description "Gets the orders", a property or parameter named UserName will become "Gets the name of the user", and so on. See the image below for an example.

Now, let's think about this for a second. Suppose you are trying to use a class that has a method called GetOrders; do you really need a stupid tooltip comment or a topic in a CHM file to tell you that this method "gets orders"? Maybe you thought it would list the "orders of the gets", right? Then you bring up the Intellisense suggestions and there's a property named UserName in your object; I'm sure you'd be totally puzzled wondering what it stands for, correct?

Hmmm, UserName, what could it possibly be. Ha! Thank you Mr. Tooltip you just saved my bacon. It's the name of the user. Phew, thank God I didn't need to use Reflector to figure this one out.

Sarcasm aside, what real benefit does such a tool give you? You're avoiding compiler warnings, at most. If you want really useful content in your XML comments, they need to be hand-written by someone who's thinking about the developer that will read them. A tool will not add examples, notes, tips, or suggest other references. Basically, if a freaking tool was able to guess what that member does, you should be able to guess it too before the tooltip comes up. The tool was not written to help the other developer. The tool was written to beat another tool (the compiler and its annoying warnings.) Use wisdom when choosing your tools. Not all of them are made equal.

What drove me over the edge to get this post out was seeing a tooltip like "Gets the name of the nick".

Cruise and Agile discussed - videos forthcoming

Posted by Sergio on 2008-08-15
Update: The videos of the presentation and discussion have been posted.

This month's Chicago ALT.NET meeting was pretty awesome, and it was all caught on video. As soon as I have some time to do some post-production on the raw material (read: just stitching pieces together) I'll make it available somehow.

As previously mentioned, we started off with a presentation of ThoughtWorks Cruise, in which Robert Norton explained the ideas of the CI server, Agents, and Pipelines and went through many of Cruise's features, system requirements, and future plans. He also clarified his company's position regarding CruiseControl.net, which will most likely not receive a lot of attention in terms of funding, being left for the community to keep going.

Cruise seemed promising to me, but it's clearly a typical version 1 product that needs some work to get an enthusiastic thumbs up from me. Hopefully they move quickly and release a few updates before the year is over to make the product top notch. I don't mean to say Cruise is unusable. It's definitely usable and does things in a very smart way. Given time, I'm sure they will take care of the rough edges and have a chance to answer customer feedback. My particular concerns are about its ability to integrate with other systems in the enterprise, like your bug/feature tracker.

After the presentation portion we all sat together for an open discussion. The fallback topic was CI practices, but what the group really wanted to talk about was Agile teams and their dynamics, so that's what the discussion became. As usual, that's my favorite part of the meeting, and it's a pity that only half of the attendees stuck around for it.

It's nice when you go to a meeting like this and can take home a lot of new knowledge.