Tuesday, December 9, 2008

Visual Studio Unit Testing – Reducing Redundant Tests

When I first started unit testing, I tested as many methods as I could, both public and private.  A bit of searching on the web about public versus private method testing will yield mixed results.  I personally test both since I aim for at least 70% code coverage.  Visual Studio generates accessor classes for private methods and properties for you, so testing private methods is easy.
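As a sketch of what that looks like (the class and member names here are hypothetical, not from a real project), a private method can be exercised through its generated accessor:

```vbnet
<TestMethod()> _
Public Sub ParseInput_PrivateMethod_Test()
    'Calculator_Accessor is the accessor class Visual Studio generates
    'for a hypothetical Calculator class with a private ParseInput method
    Dim accessor As New Calculator_Accessor()

    'The private method is exposed as a public member on the accessor
    Dim result As Integer = accessor.ParseInput("42")

    Assert.AreEqual(42, result)
End Sub
```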

I used to write at least one test per method (some require more than one to test conditional code paths), but this can get redundant.  The steps I take to reduce redundancy today are as follows.  First, I turn on code coverage.  This will give a visual indicator of what has been covered in my classes. 



When I run my tests, code that has been covered is in blue and the code that hasn’t been covered is in red.


I test my constructors first, then public methods, and finally private methods.  Doing it in this order gives me a better idea of what private methods need to be tested.  Most of the time public methods will call private methods so writing a single test will cover those methods as well.  I collapse whatever has been covered each time I create and run a test.  Doing my tests this way has greatly reduced the number of redundant tests written and has kept the same amount of code coverage.

Thursday, September 4, 2008

ADO.NET Asynchronous Transactions

Searching the web and the MSDN forums for asynchronous transactions didn’t give me what I was looking for.  I knew a bit about both topics, so I decided to try combining the two to get the result I wanted.  This is the general outline of what I came up with.

Public Sub DoLongSQLOperation()
    Dim ConnectionStringBuilder As New SqlClient.SqlConnectionStringBuilder

    ConnectionStringBuilder.IntegratedSecurity = True
    ConnectionStringBuilder.DataSource = "SQLSERVER"
    ConnectionStringBuilder.InitialCatalog = "DATABASE"
    ConnectionStringBuilder.AsynchronousProcessing = True

    Dim MySQLConnection As New SqlConnection(ConnectionStringBuilder.ToString)

    'The connection must be open before a transaction can be started
    MySQLConnection.Open()

    Dim LongSQLCommand As New SqlCommand("sp_LongOperation", MySQLConnection)

    Dim MySQLTrans As SqlTransaction = MySQLConnection.BeginTransaction

    LongSQLCommand.Transaction = MySQLTrans
    LongSQLCommand.CommandType = CommandType.StoredProcedure

    Dim Callback As New AsyncCallback(AddressOf CallbackMethod)

    Dim Result As IAsyncResult = LongSQLCommand.BeginExecuteNonQuery(Callback, LongSQLCommand)

    While Not Result.IsCompleted
      'do something if needed
    End While

  End Sub

  Private Sub CallbackMethod(ByVal result As IAsyncResult)
    Dim LongSQLCommand As SqlCommand

    LongSQLCommand = DirectCast(result.AsyncState, SqlCommand)

    Dim MyTransaction As SqlTransaction = LongSQLCommand.Transaction

    Try
      'Complete the asynchronous call and commit
      LongSQLCommand.EndExecuteNonQuery(result)
      MyTransaction.Commit()
    Catch ex As Exception
      'Try to rollback on a commit exception
      Try
        MyTransaction.Rollback()
      Catch exRollback As Exception
        'Rollback failed
      End Try
    End Try

    'Dispose of your objects

  End Sub

I used the asynchronous callback approach to do what I needed.  CallbackMethod gets called when the asynchronous operation completes.  My callback method is where I commit my transaction and dispose of any data resources.  I’ve kept the error handling to a minimum in my example for brevity, but you’ll definitely want to add it where it’s needed.

Thursday, July 24, 2008

Visual Studio and Data Driven Unit Tests

Unit testing can be tedious when you have a battery of data to test against.  If you’re only testing against a small number of values, then using Data Driven Unit Tests might be a bit overkill.  However, what Data Driven Unit Tests give you is a single location for data that can be used throughout your unit testing project.  Instead of modifying the data you typed into your unit test code, you can just modify it in your data file.  I used this MSDN entry as a starting point.

The method we’ll be testing is shown below.

Public Function SomeMethod(ByVal data As String) As Integer
   Return data.Length
End Function

You create the unit test for SomeMethod as usual.  Now what about our data file?  I’ll use an XML data file that just holds a few strings named test.xml.

<SampleData>
    <Samples>
        <Sample0>My Sample</Sample0>
        <Sample1>This is another sample</Sample1>
    </Samples>
</SampleData>

Now you have your simple data file.  To hook it into your unit test project, open up the Test List Editor in Visual Studio.  Find the test that you want to use test.xml with and click on that test.


In the Properties, find the Data Connection String entry and press the ellipses button.  This will bring up the New Test Data Source Wizard.



Select XML File and press Next to bring up the next screen.  Press the ellipses button and find test.xml.  This should fill in Table and Preview data for you.


Pressing Next will bring you to the final page of the wizard.  Highlight the table and press Finish.


You’ll see a dialog after the previous step.  Press Yes and Visual Studio will add test.xml to your unit test project.



If you look at your Data Connection String property, it should now point to test.xml within your unit test project.  The next step is using the data in your unit test.  Here’s a simple example of how to use your data.

<DataSource("Microsoft.VisualStudio.TestTools.DataSource.XML", "|DataDirectory|\test.xml", "Samples", DataAccessMethod.Sequential)> _
<DeploymentItem("TestProject1\test.xml")> _
<TestMethod()> _
Public Sub SomeMethodTest()
  Dim target As Form1 = New Form1
  Dim data As String = String.Empty

  Assert.IsTrue(target.SomeMethod(TestContext.DataRow("Sample0")) = TestContext.DataRow("Sample0").ToString.Length)
End Sub

The attributes for SomeMethodTest get automatically added when you add test.xml to your Data Connection String property.  The main point here is that TestContext.DataRow() is used to access your data.

One issue that I ran into was that I wanted to move my data into a sub folder called Data within my unit test project.  This will work, but the path to test.xml becomes a hard-coded path.  Why does this matter?  I’m part of a team, and having a hard-coded path breaks the unit test since not all of our workspace paths are the same.  If you look at the Data Connection String property, you’ll notice that |DataDirectory| is at the beginning, before the xml file location.  This transfers fine across different workspaces; however, you’ll have to leave the data file where it gets inserted into the project by default.

Wednesday, July 16, 2008

TeamBuild and WiX

Have you ever run into this error using TeamBuild to build a WiX setup project?

light.exe : error LGHT0217: An unexpected external UI message was received: The Windows Installer Service could not be accessed. This can occur if you are running Windows in safe mode, or if the Windows Installer is not correctly installed. Contact your support personnel for assistance.

Done building project "Setup.wixproj" -- FAILED.

It seems this problem is rampant on Vista build machines.  The suggested solutions usually had something to do with the VBScript engine, but I tried all of them and we were still failing WiX builds.  Besides, we’re using XP for our build machines.

I decided to login to our build machine using our build service account and manually build the WiX project through the IDE.  The project compiled fine so I kicked off another build.  That didn’t fix anything.  I then decided to compile the project using devenv.exe from the command line.  That compiled fine, but I noticed an ICE## warning.  At first I didn’t think anything of it because after this second step, our builds started to work.

Greg, being the manager that he is, told me to repeat the steps on our other build machine to see if that was indeed the fix to our problem.  I was hopeful, but it didn’t seem to work on the other machine.  Then he said to reboot the original build machine and repeat the steps.  Again, I was hopeful, but it looked like the steps I had taken before weren’t the solution.

Later on in the day I tried to login to our build machine again using our service account to repeat the steps from before.  Firing a build didn’t work again, but this time that ICE## warning really caught my eye.  I remembered that the WiX project property page had some settings for ICE validation:


Although the ICE## message returned by the build process was only a warning, it was causing our build to fail.  Checking Suppress ICE validation was the key, and we now have an automated build for our WiX setup project.
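If you’d rather set it outside the IDE, checking that box writes a property into the .wixproj file.  This is a sketch based on how WiX v3’s Visual Studio integration stores the setting; verify the property name against your WiX version:

```xml
<!-- In the .wixproj: suppress ICE validation during light.exe -->
<PropertyGroup>
    <SuppressValidation>true</SuppressValidation>
</PropertyGroup>
```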

Friday, July 11, 2008

My Thoughts On Scrum (Part 1)

If you’ve been following Greg's Blog, you’d know that our team (and our whole development group for that matter) is now using the Scrum methodology for software development.  I think it’s natural for anyone to get excited or somewhat nervous when a change in day to day operations gets implemented. 

Some people thrive on a dynamic lifestyle while others can’t handle change very well.  For a software developer, the only things that are certain are death, taxes, and change.  Being thrown into something new shouldn’t make your world flip upside-down.

Before Scrum, our team didn’t have a distinctly defined system.  It was definitely iterative and agile, but there were no hard release dates and no administrative tasks associated with our development.  Our team is very customer focused, so transitioning to Scrum didn’t really change our mentality on what we delivered, just how we delivered it.

The hardest part for me when we transitioned to Scrum was the administrative tasks.  Updating statuses, work hours, and general commenting on work items was new to me (and I’m still learning to get used to it).  However, I do understand the importance of these tasks since our sprint burndown chart gives us a great visual representation of our status.

I was assigned to my own project before Scrum and I felt solely responsible for it. Now that we’re Scrumming and four of us are working on the same project, I feel a deeper sense of camaraderie.  There’s less sense of I and more of us.

I think our first sprint is going very well and I’m very happy about what we’re going to give to our customers at the end of this sprint.  As we become more experienced with Scrum, I’ll share some more of my thoughts.

Thursday, June 19, 2008

TeamBuild ClickOnce – Auto Incrementing Your Version Information

Using AssemblyInfoTask within your TFSBuild.proj file, you can set your builds to automatically increment your AssemblyVersion and AssemblyFileVersion.  You can also keep your application’s Publish Version in sync, which is useful if you’re publishing your application via ClickOnce. 

The Publish Version is important because that’s what ClickOnce looks at to see if there’s a new version available when the user runs the application.  Updating the AssemblyVersion or AssemblyFileVersion doesn’t tell ClickOnce that there’s a new version of your application to download.

There are a ton of blog posts like this one that tell you how to check in/out files from your TFS so I won’t repeat those steps.  The actual version updates happen inside the <AssemblyInfo> task:

    <Import Project="$(MSBuildExtensionsPath)\Microsoft\AssemblyInfoTask\Microsoft.VersionNumber.Targets" />
    <Target Name="AfterGet">
        <Message Text="In After Get"/>

        <CreateItem Include="$(SolutionRoot)\My Project\AssemblyInfo.vb">
            <Output ItemName="AssemblyInfoFiles" TaskParameter="Include"/>
        </CreateItem>

        <Exec WorkingDirectory="$(SolutionRoot)\My Project\"
            Command="$(TF) checkout AssemblyInfo.vb" />

        <!-- Revision attribute names shown as in the AssemblyInfoTask docs;
             verify against your installed version -->
        <AssemblyInfo AssemblyInfoFiles="@(AssemblyInfoFiles)"
            AssemblyRevisionType="AutoIncrement"
            AssemblyFileRevisionType="AutoIncrement">
            <Output ItemName="MaxAssemblyVersion" TaskParameter="MaxAssemblyVersion" />
        </AssemblyInfo>
    </Target>

For our project, we only want to update a specific AssemblyInfo.vb as opposed to recursively checking out all AssemblyInfo files to modify.  We used the <CreateItem> task to store the AssemblyInfo files into an array.  In our case it’s just one file (That’s why the <Exec> task looks a bit different than most examples out there on the web). 

The <AssemblyInfo> task is pretty self-explanatory.  Setting the type and format of the Revision (you can do this for Major, Minor, and Build as well) to AutoIncrement is what does the trick.  You don’t have to have an <Output> element, but I wanted to keep our Publish Version and Assembly Version in sync.

We can use the variable @(MaxAssemblyVersion), which stems from the ItemName attribute, to hold our new Assembly Version.  I’ve previously posted about modifying the vbproj file to update the Publish Version so I won’t go into the details, but finding and replacing the needed element isn’t difficult.  Lastly you’ll want to check in your new AssemblyInfo file so that the next build increments your versions. 
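The find-and-replace step can be sketched with the SDC File.RegEx task; the attribute names and the vbproj path here are illustrative, so adjust them to match your project and your version of the SDC Tasks Library:

```xml
<!-- Hypothetical sketch: push the new Assembly Version held in
     @(MaxAssemblyVersion) into the vbproj's <ApplicationVersion> element -->
<File.RegEx Path="$(SolutionRoot)\project.vbproj"
    RegularExpression="&lt;ApplicationVersion&gt;.*&lt;/ApplicationVersion&gt;"
    NewValue="&lt;ApplicationVersion&gt;@(MaxAssemblyVersion)&lt;/ApplicationVersion&gt;" />
```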

Tuesday, June 3, 2008

TeamBuild ClickOnce – Publishing To Different Locations

Our team's goal with TeamBuild is to keep things as automated as possible.  We currently have two separate publishing locations for production and beta releases.  Wouldn't it be nice if we didn't have to enter the publishing location manually each time we put out a build?


What I found interesting inside my project's vbproj file (which is just XML) was that the information I needed was located within the <PublishUrl> element:

<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="3.5">
  <PropertyGroup>
    <!-- ... -->
    <PublishUrl>\\server\production\</PublishUrl>  <!-- example value -->
    <!-- ... -->
  </PropertyGroup>
</Project>

Great, we can now have two different build definitions for production and beta releases without having to manually enter the publishing location.  All we need to do is write the location in the vbproj file before we compile.  The problem is that TFS has the vbproj file (and I’m pretty sure all other files within your project) as read-only during a build.

My first thought was that I just needed to checkout the vbproj file, do my modification, then just check it in.  This worked fine, but was all that really necessary?  I thought of using attrib.exe to remove the read-only attribute from the file, but I didn’t know if it’d work because of the TFS source control files.  I got confirmation from a post that attrib.exe would do the job so that’s what I ended up doing in my TFSBuild.proj:

<Project ...>
    <Target Name="BeforeCompile">
        <Message Text="In Before Compile"/>

        <Message Text="Making vbproj file writable"/>
        <Exec Command="attrib -R &quot;$(SolutionRoot)\project.vbproj&quot;"/>
    </Target>
</Project>

Now we just have to modify the <PublishUrl> element with our publishing location.  If you’ve read my previous posts you know that we use the SDC Tasks Library for XML manipulation.  So the first stab I took at placing our publishing locations into the vbproj file was using the <XmlFile.SetValue> task.  The problem here is that the default namespace in the vbproj file isn’t prefixed so getting the XPath I needed didn’t work.  The solution was to use the <File.RegEx> task:

<Project ...>
    <Target Name="BeforeCompile">
       <Message Text="In Before Compile"/>

       <Message Text="Making vbproj file writable"/>
       <Exec Command="attrib -R &quot;$(SolutionRoot)\project.vbproj&quot;"/>

       <Message Text="Replacing PublishUrl"/>
       <!-- File.RegEx attribute names shown approximately; check the SDC Tasks
            docs.  The beta share below is an example publish location. -->
       <File.RegEx Path="$(SolutionRoot)\project.vbproj"
           RegularExpression="&lt;PublishUrl&gt;.*&lt;/PublishUrl&gt;"
           NewValue="&lt;PublishUrl&gt;\\server\beta\&lt;/PublishUrl&gt;" />
    </Target>
</Project>

When the project compiles, it’ll use the updated vbproj’s information so the manifests will have the correct publishing information when built.  Using this method, you can still manually publish your ClickOnce without messing up your build definitions and also have various build definitions to specify different publishing locations. 

One thing to note is that the Publish Version in the IDE is the <ApplicationVersion> element in your vbproj file.  If you modify this in your IDE then this will also affect your builds.  I found that keeping your Assembly Version and Publish Version in sync is the best way to do things.  This way nobody gets confused as to which version you might be talking about.  Also, ClickOnce checks your Publish Version when doing automatic updates, so keep that in mind.

Monday, May 19, 2008

TeamBuild ClickOnce – Versioning

It seems many people have had issues getting the version number correctly within their tfsbuild.proj files to use in their ClickOnce html files.  The solutions I've seen were very complicated, but this may be because the examples I've seen were in TFS 2005 or they were using different tasks.

My coworker added the SDC tasks to our build server because there is functionality for XML and file manipulation.  From his starting point I went on to look further into two specific tasks: Xml.GetValue and File.RegEx.

My boss directed me to where I can find the version number that I needed, which resides in the application manifest file.  The trickiest part of the Xml.GetValue task to me was understanding XPath.  The biggest point that I missed was that I needed to define the namespace within the application manifest to get to the element. 

If you look inside the application manifest of your program (not the program.exe.manifest, but program.application as I'm dealing with publishing), you'll find something similar to this:

<?xml version="1.0" encoding="utf-8"?>
<asmv1:assembly xsi:schemaLocation="urn:schemas-microsoft-com:asm.v1 assembly.adaptive.xsd" 
        manifestVersion="1.0" xmlns:asmv3="urn:schemas-microsoft-com:asm.v3" 
        xmlns="urn:schemas-microsoft-com:asm.v2" xmlns:asmv1="urn:schemas-microsoft-com:asm.v1" 
        xmlns:asmv2="urn:schemas-microsoft-com:asm.v2" xmlns:xrml="urn:mpeg:mpeg21:2003:01-REL-R-NS" 
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <assemblyIdentity name="program.application" version="1.0" 
        publicKeyToken="abcdedf892929" language="neutral" processorArchitecture="x86" 
        xmlns="urn:schemas-microsoft-com:asm.v1" />
    <!-- ... -->
</asmv1:assembly>

I thought I could just use an XPath like /asmv1:assembly/assemblyIdentity/@version to get the value that I wanted.  Nope, my build threw errors saying that the namespace asmv1 was not defined.  Then I defined the asmv1 namespace, but I wasn't getting a value returned.  You'll notice in the snippet above that assemblyIdentity declares its own namespace, but it's the same as asmv1 defined above.  In the end it was because I hadn't prefixed assemblyIdentity with the asmv1 namespace as well: /asmv1:assembly/asmv1:assemblyIdentity/@version.

I manually published my project to get the publish.htm file.  I then made it a part of my project so that I could copy it to my published application's directory and modify the version number.  I edited the html file and put a #VERSION# tag so that the File.RegEx task could find it and replace it with the $(ResultsItem) variable.  We get $(ResultsItem) from the ItemName attribute of the Output element within the Xml.GetValue task element.  The following is the snippet of my tfsbuild.proj file that does the versioning of the publish.htm file:

    <Target Name="AfterCompile">
        <Copy SourceFiles="$(SolutionRoot)\SupportFiles\publish.htm" DestinationFolder="$(OutDir)" />
    </Target>

    <Target Name="AfterDropBuild">
        <Message Text="Extracting version from Application manifest..."/>

        <!-- Xml.GetValue attribute names shown approximately; check the SDC
             Tasks docs.  The drop path below is an example. -->
        <Xml.GetValue Path="$(DropLocation)\$(BuildNumber)\Release\program.application"
            XPath="/asmv1:assembly/asmv1:assemblyIdentity/@version"
            Namespaces="&lt;Namespace Prefix='asmv1' Uri='urn:schemas-microsoft-com:asm.v1'/&gt;">
            <Output TaskParameter="Results" ItemName="ResultsItem"/>
        </Xml.GetValue>

        <Message Text="Updating publish.htm to version @(ResultsItem)"/>
        <File.RegEx Path="$(DropLocation)\$(BuildNumber)\Release\publish.htm"
            RegularExpression="#VERSION#"
            NewValue="@(ResultsItem)" />
    </Target>

Friday, May 16, 2008

TeamBuild ClickOnce – Automation

I've had a ton of issues trying to get ClickOnce to work with MS Team Build starting from TFS 2005; we're now using TFS 2008.  Getting a bootstrap package to work with TeamBuild took some work, but I was able to do it with success early on. 

In our 2005 environment, I had to place the signing certificate that I was using onto our build server and place it in the certificate store in order for any of my builds to work.  Without the stored certificate my builds would not complete.  Fine, my bootstrap packages worked, but my ClickOnce builds didn't.  I didn't feel too bad because a ton of people out there have had issues with ClickOnce and Team Build.

Fast forward to our 2008 environment.  I had some minor issues with my bootstrap package as I posted about when I started my blog, but nothing compared to the ClickOnce issues I've had.  I decided to create a new build for my project in VS 2008 and start fresh.  I still got the same result: my build would hang and return no warnings. 

My coworker got ClickOnce automation working on his project, and I was happy someone finally got it to work.  I really couldn't find any difference in our processes as far as the build automation went.  My project has a lot more dependencies, but that didn't seem to be the cause of my problems.

Every time I published my application manually, the certificate asked for a password.  This was the same password that I used to store the certificate on our build server.  My coworker's publishing didn't require him to enter one.  So then I thought to myself, maybe a password dialog is just sitting there waiting for input. 

The reason I came to this conclusion was that I did some testing and looked at the output directory of where my files were being published.  Everything was being created and copied except my ".application" file.  That triggered the thought that my certificate wasn't signing the application manifest because of a certificate password window just sitting there waiting for input.  The build process can't handle window pop ups.

My solution was to just create a new signing certificate with no password and this time I didn't place it in our build server's certificate store.  After adding the certificate to my project I ran my ClickOnce build and it worked...FINALLY! 

Thursday, May 15, 2008

MS Team Build Bug

Today while scouring the MSDN forums I found something interesting.  If you choose "Any CPU" as your platform in your "TFSBuild.proj" file, solution files will build correctly:


    <SolutionToBuild Include="$(BuildProjectFolderPath)/../../project.sln" />

    <ConfigurationToBuild Include="Release|Any CPU">
        <FlavorToBuild>Release</FlavorToBuild>
        <PlatformToBuild>Any CPU</PlatformToBuild>
    </ConfigurationToBuild>


The problem is that I tried to publish a solution and for some reason it didn't work.  Actually, the reason was that I had references in my solution to non-publishable projects.  So it made sense then to just publish the project within the solution.  When I replaced the solution file with my VB Project file in the "SolutionToBuild" element, I got errors saying that the location of my executable couldn't be found.  I followed the suggestions on MSDN and changed my platform to x86. After making the change I was able to build both solution and project files:


    <SolutionToBuild Include="$(BuildProjectFolderPath)/../../project.vbproj" />

    <ConfigurationToBuild Include="Release|x86">
        <FlavorToBuild>Release</FlavorToBuild>
        <PlatformToBuild>x86</PlatformToBuild>
    </ConfigurationToBuild>


Tuesday, May 13, 2008

Visual Studio Bootstrapping (Part 3)

My project has various prerequisites for it to run and the install location is set to download from the vendor's web site.  Unfortunately, the option to specify the location of the prerequisites affects all of them and you can't choose per prerequisite:


In BMG you can set the download URL of your prerequisite file by setting the "HomeSite URL" property:


I intentionally left mine blank as I wasn't sure if the download location would change frequently or not.  When I built my project, I got a build warning:


Putting a "HomeSite URL" will exclude your prerequisite file from your setup package.  Instead your bootstrapper will depend on the URL to download the prerequisite.  If you put an invalid location, you'll get an error:


As I mentioned before, I wasn't sure how often the links get updated and this was why I chose to include my prerequisite with my setup package. 

If you're using a build server like our team is, make sure to copy your prerequisite package to your build server.  The default location for Visual Studio 2008's packages is "C:\Program Files\Microsoft SDKs\Windows\v6.0A\Bootstrapper\Packages".  Be sure to update your "tfsbuild.proj" file to take into account any copying of your prerequisite as well.
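That copy step can be sketched as a target in tfsbuild.proj; the "MyPrereq" package name and source folder here are placeholders for whatever your project uses:

```xml
<!-- Hypothetical sketch: stage a custom prerequisite package into the build
     machine's Bootstrapper\Packages folder before compiling -->
<Target Name="BeforeCompile">
    <Exec Command="xcopy &quot;$(SolutionRoot)\Prerequisites\MyPrereq&quot; &quot;C:\Program Files\Microsoft SDKs\Windows\v6.0A\Bootstrapper\Packages\MyPrereq&quot; /E /I /Y" />
</Target>
```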

Overall, BMG is a great tool with some quirks and bugs.  Even with the issues I ran into, I'll still be using this utility to create any prerequisite packages that I might need.

Visual Studio Bootstrapping (Part 2)

Setting up system checks for prerequisites in BMG is pretty straightforward.  The only option I chose was the MSI Product Check:


You can either get your product code from the msi file that you're including or from your computer's installed applications.  They should return the same product code either way.  The "Property for Result" is just a variable name that you'll use to check against on the "Install Conditions" tab.

After looking at the help file (which was another painful and buggy experience), these were the return codes I was supposed to check against:


However, even with my prerequisite installed, my "setup.exe" kept wanting to install my prerequisite.  I had to intentionally make my setup package fail so that I could snoop through the installation log.  What I found was that the return code I needed wasn't in the help file; the value I was supposed to check against was "5":


After adding the above condition, my prerequisite checks worked correctly, but my installation failed due to an exit code issue.  On the "Exit Codes" tab, I had to insert a success for exit code "0" in order for my setup package to determine that my prerequisite installed correctly:


Visual Studio Bootstrapping (Part 1)

We added a new feature to my current project that required a third party application as a prerequisite. Hooking this into Visual Studio seemed pretty straightforward: just use a bootstrapping program to create the necessary bootstrap package for Visual Studio. You can also create your own manually.

I used Bootstrapper Manifest Generator to generate my bootstrap package.  Although this application was quite buggy for me, in the end it got the job done.  However, it wasn't as easy as it should have been to get the whole process going.

The current version of BMG for Visual Studio 2008 is still in beta, so that may account for the numerous bugs I encountered when running this application.  I also had some issues figuring out how to get things to work the way I wanted.  For instance, if you want a progress bar for your prerequisite's installation, you'll need to put in an installation time:


If you don't put a time (which is just a guess anyway), you'll end up with a screen like this:


The prerequisite will install, but you'll have no progress bar.  Not a show stopper, but I found this to be unappealing.  Also, if you overshoot your installation time, the progress bar will never reach the end before exiting this screen.  If you undershoot it, the progress bar will go to the end and start over.  I chose the latter.

Wednesday, April 23, 2008

Iterating Through "My.Resources"

The "My" namespace in VB.NET is indispensable. Settings and resources among other things can be easily accessed by using it. You can place items such as files or images in "My.Resources" and access them in your code in a very simple manner: 

Dim MyObject As Object 

MyObject = My.Resources.ObjectName

However, iterating through My.Resources is another issue. You might think you could do it this way:

For Each Object In My.Resources 
   'more code here 
Next
You can't. "My.Resources" is a namespace, not a collection that implements IEnumerable. So what do you do? You have to enumerate the objects in "My.Resources" yourself if you want to iterate through them. I started out on the MSDN forums looking for a starting point and came up with this implementation:

Public Function GetMyResourcesDictionary() As Dictionary(Of String, Object) 
  Dim ItemDictionary As New Dictionary(Of String, Object) 
  Dim ItemEnumerator As System.Collections.IDictionaryEnumerator 
  Dim ItemResourceSet As Resources.ResourceSet 
  Dim ResourceNameList As New List(Of String) 

  ItemResourceSet = My.Resources.ResourceManager.GetResourceSet(New System.Globalization.CultureInfo("en"), True, True) 

  'Get the enumerator for My.Resources 
  ItemEnumerator = ItemResourceSet.GetEnumerator 

  'Collect each resource's name 
  Do While ItemEnumerator.MoveNext 
    ResourceNameList.Add(ItemEnumerator.Key.ToString) 
  Loop 

  'Pair each name with its resource object 
  For Each resourceName As String In ResourceNameList 
    ItemDictionary.Add(resourceName, GetItem(resourceName)) 
  Next 

  ResourceNameList = Nothing 

  Return ItemDictionary 
End Function 

Public Function GetItem(ByVal resourceName As String) As Object 
  Return My.Resources.ResourceManager.GetObject(resourceName) 
End Function

The reason I did it this way was so that I could call GetMyResourcesDictionary once and I'd have a Dictionary object that I could iterate through.
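Once the dictionary is built, iterating becomes trivial:

```vbnet
For Each ResourceItem As KeyValuePair(Of String, Object) In GetMyResourcesDictionary()
    'ResourceItem.Key is the resource name, ResourceItem.Value is the object
    Debug.WriteLine(ResourceItem.Key)
Next
```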

Tuesday, April 15, 2008

Unit Testing (Part 2)

Our team aims for at least 70% code coverage with our unit tests.  I've read that quite a few people only test public methods as the private methods will eventually be reached this way. From my experience this isn't always the case.

We use Visual Studio 2008's built in unit testing framework.  The nice thing about this framework is that it's integrated into the IDE and simply right-clicking on a method or class will bring up options to create unit tests.  Along with this, accessor classes are created so that you have access to private fields and methods.

I test both public and private methods just to be exhaustive. Sometimes I run into conditional statements that I can't test fully without the class accessors.  I've hit 80%+ code coverage on my current project, but I will be looking into White (the UI automation framework) to test the UI in my project as well. This will certainly boost code coverage.

Order matters in unit tests.  For example, I have classes that depend on other classes to work.  The unit test stubs that get inserted by the IDE do not account for this; they merely give you a very trivial skeleton.  It is up to you to make sure that you have all of the dependent classes instantiated before testing the class that you actually want to test.
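A sketch of what I mean, using hypothetical Customer and Invoice classes (the generated stub would only new up Invoice and leave the rest to you):

```vbnet
<TestMethod()> _
Public Sub Invoice_Total_Test()
    'Invoice can't be exercised without a Customer existing first,
    'so the dependency is instantiated before the class under test
    Dim customer As New Customer("ACME")
    Dim target As New Invoice(customer)

    target.AddLineItem("Widget", 2, 9.99D)

    Assert.AreEqual(19.98D, target.Total)
End Sub
```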

Monday, April 14, 2008

Unit Testing (Part 1)

I'd heard about unit testing before I started here at KPMG, but had zero experience with it.  My software developer friends have experience with unit testing because most of them are part of huge development teams and deal with mission critical software.  I understood the concept, but really didn't understand its value... until I added unit testing to my current project.

I'll be honest, I don't find writing unit tests fun.  I'd much rather be developing new stuff or even fixing bugs, but I found numerous bugs that I wouldn't have found without unit tests. Users would most likely have found these bugs while using the application which would require tracking the issue, fixing the problem, testing the solution, and finally releasing a new version.  This is a lot of time (and money) wasted.

Thursday, April 10, 2008

Generic Lists and LINQ

When LINQ was introduced I was pretty uninterested. I thought to myself, "what's the point?" Well, after using it I think it's an invaluable part of the .NET Framework.

Ever since our team was given the go-ahead to update our environment to VS 2008 and .NET 3.5, I've been including LINQ in my current project. Granted I'm only using LINQ to Objects, but I see myself using LINQ to XML when I come to that point.

I use Generic Lists all over the place in my project. Today I ran across something I found interesting while I was doing something that required copying of a List. Copying List contents to another List without LINQ is pretty simple.

Dim CopyList As New List(Of Object)

CopyList.AddRange(OriginalList)


Now with LINQ, you can use the ToList extension method and you don't have to create a New CopyList up front.

Dim CopyList As List(Of Object)

CopyList = OriginalList.ToList

Also, from my testing (admittedly light and crude) the ToList extension method is faster at completing the copy.
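For the curious, my "light and crude" comparison looked roughly like this, using System.Diagnostics.Stopwatch (OriginalList as above):

```vbnet
Dim Timer As New Stopwatch()

Timer.Start()
Dim CopyList1 As New List(Of Object)
CopyList1.AddRange(OriginalList)
Timer.Stop()
Debug.WriteLine("AddRange: " & Timer.ElapsedTicks & " ticks")

Timer.Reset()
Timer.Start()
Dim CopyList2 As List(Of Object) = OriginalList.ToList
Timer.Stop()
Debug.WriteLine("ToList:   " & Timer.ElapsedTicks & " ticks")
```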

Tuesday, April 8, 2008

MS Team Foundation Server / Team Build (Part 2)

Creating a stand-alone setup.exe/msi wasn't as easy for me as it should have been. Following Microsoft's suggestion of adding the .vdproj file to my "AfterCompile" target in my TFSBuild.proj file did not work; I got a compile error.

<Target Name="AfterCompile"> 
    <Exec Command="&quot;C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv&quot; &quot;$(SolutionRoot)\HelloWorldTest\HelloWorldTestInstaller\HelloWorldTestInstaller.vdproj&quot; /Build &quot;Debug|Any CPU&quot;"/> 
    <Copy SourceFiles="$(SolutionRoot)\HelloWorldTest\HelloWorldTestInstaller\Debug\HelloWorldTestInstaller.msi" DestinationFolder="$(OutDir)" /> 
    <Copy SourceFiles="$(SolutionRoot)\HelloWorldTest\HelloWorldTestInstaller\Debug\setup.exe" DestinationFolder="$(OutDir)" /> 
</Target>
What I ended up doing was replacing the .vdproj file with the entire .sln file. Then I copied the setup.exe and .msi files out to the specified drop directory.

<Target Name="AfterCompile">
    <Exec Command="del &quot;$(OutDir)*&quot; /q" />
    <Exec Command="&quot;C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv&quot; &quot;$(SolutionRoot)\Project.sln&quot; /Build &quot;Debug|Any CPU&quot;" />
    <Copy SourceFiles="$(SolutionRoot)\ProjectSetup.msi" DestinationFolder="$(OutDir)" />
    <Copy SourceFiles="$(SolutionRoot)\setup.exe" DestinationFolder="$(OutDir)" />
</Target>

MS Team Foundation Server / Team Build (Part 1)

I've had to setup my current project at work so that it uses TFS's build automation. "F5 is not a compile solution" as I've been told by my boss. I completely agree with this because I've run into the whole "well it works on my machine" syndrome many times.

Having a dedicated server to compile my solution gives me a better grasp of how things would work on a machine that didn't have all of my development tools on it (aside from what's required by Team Build).

I also had the pleasure (more like pain) of upgrading my TFS 2005 build script to 2008 when we upgraded. This in itself is really simple to do: you just move it on over and TFS 2008 runs the 2005 version in compatibility mode.

However, I didn't want to just run an older version so I just copied out the part that I wanted from the 2005 version, re-ran the wizard to build a 2008 version, and pasted the chunk of text. Much to my disappointment, the build script didn't work anymore.

I looked at my file over and over again. I and others on the MSDN forums found nothing wrong with it. Finally, it was suggested I just delete the TFSBuild.proj file on the build server so that TFS copies it to the build server again... it worked.

Lesson learned: be sure to check out the TFSBuild.proj file when making edits and don't modify in memory if that message box comes up. I spent a good part of a whole week trying to figure out why it wasn't working.

Monday, April 7, 2008

My first post...

My boss suggested I share my software development experiences (aka trials and tribulations) from work. This blog will be my medium to do that. I will share things that I've learned and gone through to possibly ease someone else's pain.