Something I have noticed at a number of companies is a real fear that working with branches is hard. Sure, most of us have been bitten by working on a branch that is out of sync with its parent and has not been reverse integrated on a regular basis; when it finally comes time to merge the changes back to your main line, the experience can be awful. But is that experience any different from that of the developer who refuses to update their workspace on a regular basis? If you apply the same discipline of regular updates to a branch that you apply in your own workspace, things can go very smoothly. Combine these good work habits with the branching and merging visualizations Microsoft has added in TFS 2010, and life becomes a lot easier.
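The habit described above can be sketched with plain tf.exe commands. The server paths here are hypothetical, and the exact switch syntax may vary between TFS client versions; this is an illustration of the rhythm, not a recipe:

```
rem Hypothetical branch paths - regularly merge the parent into the child
rem branch so the eventual merge back to Main stays small and predictable.
tf get $/MyProject/Main /recursive
tf merge $/MyProject/Main $/MyProject/Dev/FeatureX /recursive
tf checkin /comment:"Merge from Main into FeatureX"
```

Running this daily (or at least per check-in to the parent) keeps conflicts small enough to resolve while the context is still fresh.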
In a later post I will talk about a consistent way to structure your source that keeps the setup time of any branch, and its associated automated build infrastructure, to a minimum. In the meantime I have posted a link to an excellent article on branching that is well worth a read: InfoQ's Version Control for Multiple Agile Teams.
Monday, September 27, 2010
Friday, September 17, 2010
Automated testing of PowerShell
I have been expanding the library of PowerShell build scripts that I use for my work, and I want to ensure that they keep working as I expect them to. The obvious answer is automated tests, but the question was how. I spent some time trying out PSUnit but, to be honest, I found it a bit hard going, possibly just because I was unfamiliar with the framework. After a bit of digging, here is the solution that I now use; to make it harder, I was using Visual Studio 2010 C# Express.
Tools
- Visual Studio 2010 C# Express
- NUnit 2.5
- PowerShell 2
Setup
I have a solution that contains two projects: one with my PowerShell scripts and the other with my tests, just to keep things clean.
Scripts project
My PowerShell scripts are all part of a module, so for my integration tests I install that module into the user's module directory. I am just using a build event to do this:

rmdir %USERPROFILE%\Documents\WindowsPowerShell\Modules\DevPipeline\ /s /q
mkdir %USERPROFILE%\Documents\WindowsPowerShell\Modules\DevPipeline\
xcopy $(ProjectDir)\PowerShellModule %USERPROFILE%\Documents\WindowsPowerShell\Modules\DevPipeline
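For context, here is a minimal sketch of what such a module might contain. The DevPipeline name comes from the build event above, and the Set-SolutionPath and Test-Solution names come from the tests further down, but the function bodies here are hypothetical illustrations, not the actual module:

```powershell
# DevPipeline.psm1 - hypothetical skeleton of the module under test.

function Set-SolutionPath {
    param(
        [string]$name,
        [string]$path
    )
    # Record the selected solution and report it back to the caller,
    # then emit the path so it can be piped into Test-Solution.
    $solution = Join-Path $path $name
    Write-Output "Current solution is now set as [$solution]"
    Write-Output $solution
}

function Test-Solution {
    param(
        [Parameter(ValueFromPipeline = $true)]
        [string]$solutionPath
    )
    # Build the solution and run its unit tests, emitting $true on success.
    # (Implementation omitted - this is just an illustrative stub.)
    Write-Output $false
}

Export-ModuleMember -Function Set-SolutionPath, Test-Solution
```

The only contract the tests below rely on is the output shape: a status message from Set-SolutionPath and a boolean from Test-Solution.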
Tests project
To run tests in Visual Studio Express I use the following workaround, after setting the project's output type to Console Application rather than Class Library.
using System;
using System.Reflection;

namespace Tests.ALM.Build {
    class Program {
        static void Main(string[] args) {
            // Launch the NUnit console runner against this test assembly so
            // the tests can be run with F5 from Visual Studio Express.
            AppDomain.CurrentDomain.ExecuteAssembly(
                @"C:\Program Files\NUnit 2.5.5\bin\net-2.0\NUnit-console.exe",
                new string[] { Assembly.GetExecutingAssembly().Location });
        }
    }
}
And then to run the commands for my tests I use the following helper class.
using System.Collections.ObjectModel;
using System.Management.Automation;

namespace Tests.ALM.Build {
    /// <summary>
    /// Runs PowerShell scripts for test purposes.
    /// </summary>
    internal static class PowerShellCommandRunner {
        /// <summary>
        /// Runs the given command from the imported module.
        /// </summary>
        /// <param name="moduleName">Name of the PowerShell module to import</param>
        /// <param name="command">The command that will be run</param>
        /// <returns>The PowerShell pipeline output</returns>
        internal static Collection<PSObject> ExecuteModuleCommand(string moduleName, string command) {
            Collection<PSObject> output;
            using (PowerShell ps = PowerShell.Create()) {
                // Import the module under test, then run the command in the
                // same session so the module's functions are available.
                ps.AddCommand("Import-Module").AddParameter("Name", moduleName);
                ps.Invoke();
                ps.Commands.Clear();
                ps.AddScript(command);
                output = ps.Invoke();
            }
            return output;
        }
    }
}
Integration Tests
As I mentioned above, I am treating these tests as integration tests, and I import the module as part of the setup.
An example of these integration tests is below.
using System;
using System.Collections.ObjectModel;
using System.IO;
using System.Management.Automation;
using NUnit.Framework;

namespace Tests.ALM.Build {
    [TestFixture]
    public class ExampleTests {
        private string TestingSolutionsDirectory = Path.Combine(Directory.GetCurrentDirectory(), "Tests", "Assets", "Solutions");
        private const string LibrariesFailingUnitTest = "LibrariesFailingUnitTest";
        private const string DevPipelineModule = "DevPipeline";
        private const string ApplicationsOne = "Applications.One";

        [Test]
        [Category("Integration")]
        public void TestSolution_WillReturnFalseWhenUnitTestsFail() {
            string runTestsForFailingUnitTestScript = string.Format("Set-SolutionPath -name {0} -path {1} | Test-Solution", LibrariesFailingUnitTest, TestingSolutionsDirectory);

            Collection<PSObject> resultObject = PowerShellCommandRunner.ExecuteModuleCommand(DevPipelineModule, runTestsForFailingUnitTestScript);

            bool passed = (bool)resultObject[1].ImmediateBaseObject;
            Assert.False(passed, "Test-Solution should report failure when the solution's unit tests fail");
        }

        [Test]
        [Category("Integration")]
        public void SetSolution_ReturnsMessageThatCurrentSolutionIsNowSetAsTestPath() {
            string setSolutionScript = string.Format("Set-SolutionPath -name {0} -path {1}", ApplicationsOne, TestingSolutionsDirectory);

            Collection<PSObject> resultObject = PowerShellCommandRunner.ExecuteModuleCommand(DevPipelineModule, setSolutionScript);

            String actualMessage = (String)resultObject[0].ImmediateBaseObject;
            String expectedMessage = string.Format("Current solution is now set as [{0}]", Path.Combine(TestingSolutionsDirectory, ApplicationsOne));
            Assert.AreEqual(expectedMessage, actualMessage);
        }
    }
}
Thursday, September 16, 2010
Building VS2010 projects from TFS 2005/2008
An issue that I have now come across a couple of times is needing to build .NET 4 projects and solutions with an older version of TFS (2005 or 2008).
The solution that I have used is to override the standard build target and call the correct version of MSBuild.exe using the Exec task.
<?xml version="1.0" encoding="utf-8"?>
<Project
  xmlns="http://schemas.microsoft.com/developer/msbuild/2003"
  ToolsVersion="2.0">

  <!--
    We are not using the standard compilation; instead we hook into the
    AfterCompile target and use the specified list of targets to clean,
    build and then copy the build output.
  -->
  <Target Name="AfterCompile">
    <CallTarget Targets="MyBuild" />
  </Target>

  <Target Name="MyBuild">
    <CallTarget Targets="CleanBin" />
    <CallTarget Targets="Build" />
    <CallTarget Targets="CopyToManagedBinaries" />
  </Target>

  <!--
    Clean
  -->
  <ItemGroup>
    <SolutionBinFiles Include="$(SolutionRoot)\Bin\**\*.*" />
  </ItemGroup>
  <Target Name="CleanBin">
    <Delete Files="@(SolutionBinFiles)" />
  </Target>

  <!--
    Compile - we call out to the .NET 4 version of MSBuild.exe from the
    earlier instance of Team Foundation Server.
  -->
  <PropertyGroup>
    <SolutionFileName>$(SolutionRoot)\YourSolution.sln</SolutionFileName>
    <MSBuild4Path>C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe</MSBuild4Path>
    <BuildTarget>/t:Rebuild</BuildTarget>
  </PropertyGroup>
  <Target Name="Build">
    <Exec ContinueOnError="false"
          IgnoreExitCode="false"
          Command="$(MSBuild4Path) $(SolutionFileName) $(BuildTarget)" />
  </Target>

  <!--
    Copy build output to managed binaries
  -->
  <Target Name="CopyToManagedBinaries">
    <Exec ContinueOnError="false"
          IgnoreExitCode="false"
          Command='"Powershell" "$(MyScriptsDirectory)CopyToManagedBinaries.ps1" -managedBinariesPath $(DropLocation)\$(SolutionName)\$(AssemblyVersion)' />
  </Target>
</Project>