Life is a Non-Deterministic Finite State Automata
Automation ? (*pGeekiness)++ : Code Generation;

June 15, 2012

Automating the creation of standard environments using the VS2012 RC API (Update)

The full source code for this article can be downloaded here (V1.1). It is built against the 2012 RC version.

UPDATE 28-June-2012: The TF259637 error has been confirmed and sounds like it will be fixed with better UI guidance in the next release. The sample code is still at V1.1 and I will update my code at RTM.

A very welcome feature in the new edition of VS2012 Lab Center is the concept of a ‘Standard Environment’. Within Lab Center, you can add any running machine to an environment and automatically push out the Test Agent, ready for distributing tests to it:

This means that VmWare, Virtual Box, Virtual PC and physical machines can easily be incorporated into your Lab Center environments. This post will show how to automate that feature using the (now public) API. No need to jump through hoops anymore like in VS2010!

However, before going on, you must at least be able to push these agents out manually using the Lab Center user interface so you know your infrastructure is set up for this: ensure that your target machine has file sharing set up; that you have applied this rather obscure registry fix if necessary; and that IPSec and the Firewall aren’t getting in the way. And lots more.

We will now write some code to create a new Standard Environment and push the Agent out ready to run tests:

It looks like all of the API calls to the Lab Service and supporting infrastructure are now public in MSDN. There is some very kool stuff in there!

Getting going
The first thing is to add a few references to your project: Microsoft.TeamFoundation.DLL, Microsoft.TeamFoundation.Client.DLL and Microsoft.TeamFoundation.Lab.Client.DLL (search your machine for these):

Then it’s a simple case of connecting to a Team Project…

TeamProjectPicker picker = new TeamProjectPicker(TeamProjectPickerMode.SingleProject, false);

DialogResult result = picker.ShowDialog();
if (result != System.Windows.Forms.DialogResult.OK) return;
if (picker.SelectedProjects.Length == 0) return;
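The snippets that follow assume a `service` variable for the Lab service and a `ProjectName` string. Roughly, this is how I set them up in the sample (the member names here are what I use; check the download if anything doesn’t compile):

```csharp
// Grab the collection the user picked and ask it for the Lab service
// (LabService lives in Microsoft.TeamFoundation.Lab.Client)
TfsTeamProjectCollection collection = picker.SelectedTeamProjectCollection;
LabService service = collection.GetService<LabService>();

// The project name is used later when creating and querying environments
string ProjectName = picker.SelectedProjects[0].Name;
```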

… and making the following calls to create a new environment and register it with a test controller:

LabSystemDefinition single = new LabSystemDefinition("TheMachineNameYouWantToPushTheAgentsOutTo", "TheMachineNameYouWantToPushTheAgentsOutTo", "YourMachineRole");

LabEnvironmentDefinition definition = new LabEnvironmentDefinition("The Environment Name", "The Environment Description", new List<LabSystemDefinition>() { single });
definition.TestControllerName = "TheTestController:6901";

LabEnvironment newEnvironment = service.CreateLabEnvironment(ProjectName, definition, null, null);

There is then a nicely exposed ‘InstallTestAgent’ method that does exactly what it says on the tin:

// Download the source code to see how the credentials are set up for this call (process == null if you want to run the Test Agent as a service)
themachine.InstallTestAgent(admin, process);


The name ‘InstallTestAgent’ is a little misleading – it installs the Agent if it does not already exist and then reconfigures it.

Configuring the Agent to run tests interactively is similar: all we need to do is provide another set of credentials to run the Test Agent as on each end-point, and tell the Lab Environment which machine roles require an interactive agent so the deployed agents can be configured correctly. We do this prior to creating the environment; otherwise we would have to call LabService.UpdateLabEnvironment afterwards:

definition.CodedUIRole = p.MachineRoles;
definition.CodedUIUserName =  String.Format("{0}\\{1}", p.InteractiveCredentials.Domain, p.InteractiveCredentials.UserName);

I had issues getting my Lab Center to push out a test agent across *Workgroups* to run interactively, even when I drove the operation manually from the Lab Center UI (not the API): this happened from a completely fresh install or otherwise. After pushing out the Agent, rebooting and automatically logging in, the Lab Center UI would keep hitting me with error “TF259637: The test machine configuration does not match the configuration on the environment. The account…”.

The benefit of a public API is that it lets us investigate! The error appears to be down to the way Lab Center stores and/or validates its LabEnvironment.CodedUIUserName and LabSystem.Configuration.ConfiguredUserName parameters when distributing a Test Agent across Workgroups. The LabEnvironment.CodedUIUserName was set to ‘W732AGENTS\Graham’ (the value I entered in the Lab Center UI, because that is what I want to run the Agent as on the end-point) whereas the LabSystem.Configuration.ConfiguredUserName property was set to .\Graham. Clearly a mismatch. To fix it, it seems all we need to do is sync the two.

I need to be clear [especially given how the above code snippet obviously creates this problem!]: the issue occurs when driving the Lab Center UI manually, so it is not specific to the API or this sample. For the sample, I have chosen to mimic (what I think is) the behaviour of the Lab Center UI.
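To see why those two values look different, here is a tiny illustrative helper (entirely hypothetical — not part of the Lab API) that treats `.\user` as shorthand for `MACHINE\user` and compares the two forms case-insensitively. Under that assumption, ‘W732AGENTS\Graham’ and ‘.\Graham’ only match when the machine name is W732AGENTS:

```csharp
using System;

static class AccountNames
{
    // Hypothetical normalizer: expand the ".\user" shorthand to "MACHINE\user"
    // so both of the values Lab Center stores can be compared like-for-like.
    public static string Normalize(string account, string machineName)
    {
        return account.StartsWith(@".\")
            ? machineName + account.Substring(1)
            : account;
    }

    public static bool SameAccount(string a, string b, string machineName)
    {
        return String.Equals(
            Normalize(a, machineName),
            Normalize(b, machineName),
            StringComparison.OrdinalIgnoreCase);
    }
}

// AccountNames.SameAccount(@"W732AGENTS\Graham", @".\Graham", "W732AGENTS") -> true
// AccountNames.SameAccount(@"W732AGENTS\Graham", @".\Graham", "SOMEOTHERPC") -> false
```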

I have posted an issue on Connect with more information to seek clarification – please see ID: 749436.

If deploying Agents to Workgroups, you might get the TF259637 error. I have left my source code passing parameters to the API the same way that the Lab Center UI RC appears to (ie: without validation or guidance), so I attempt to detect the error and automatically fix it post-deployment. It actually makes things more robust anyhow. I will update my code to reflect Lab Center UI changes post-RC:

var theEnvironment = service.QueryLabEnvironments(new LabEnvironmentQuerySpec() { Project = ProjectName }).First(f => f.Name == p.EnvironmentName);
var theMachine = theEnvironment.LabSystems.First(f => f.Name == p.MachineName);

string testAgentRunningAs = theMachine.Configuration.ConfiguredUserName;
string environmentThinksTestAgentRunningAs = theEnvironment.CodedUIUserName;

if (String.Compare(testAgentRunningAs, environmentThinksTestAgentRunningAs, true) != 0)
{
    // Synchronize the user names...
    service.UpdateLabEnvironment(theEnvironment.Uri, new LabEnvironmentUpdatePack() { CodedUIUserName = testAgentRunningAs });
}

You can also use that snippet to fix a manually-deployed environment that is broken with error TF259637.

Putting it all together, you can download this sample here:

It looks like there is a slightly more flexible way of doing the installation of the test agents using a combination of the TestAgentDeploy class and the AMLCommandBase-derived classes. But perhaps more on that some other time!


May 6, 2012

Pronto v1.5 Released (Productivity Tool for Automating MTM Test Cases)

Filed under: ALM, Programming, Testing — admin @ 6:12 am

Pronto lets you create test stubs (including data binding and documentation) for your manual MTM Test Cases in C# or VB.Net by dragging those test cases onto your source file. You can then use Pronto’s Bulk Associated Automation Assistant to associate many of your test methods with your MTM test cases in one go. Uses might include: automating acceptance tests or creating Keyword/Action Word Frameworks. The application can be downloaded directly from Visual Studio Gallery here.

Changes to v1.5:

• All fragment generators are now freely editable T4 text files
• Create new generators and customize the fragments easily (docs and samples included)
• Fixed a few bugs

After downloading, unblocking the file (right click -> Properties), installing the VSIX and restarting Visual Studio, ensure that the Pronto window is visible:

Assuming you already have a WorkItem query in Team Explorer that returns Test Cases, just Drag and Drop that query onto the Pronto window to get a list of Test Cases:

To create your method stubs and help with documentation, either Drag ‘n’ Drop or Right Click/Copy the Test Cases and Shared Steps and paste directly into your Unit Test:

Notice how the method stub, data binding parameters, test steps and title are all generated for you automatically (this is customizable).

After building your solution manually, open up the Bulk Associated Automation Assistant (the “Sheep” icon). It will discover your tests and correlate the MTM Test Case ID with the WorkItem Id on your test method:

Now you can optionally associate them all in one go without leaving VS2010.

This application has been tested on Visual Studio 2010 Professional (first release) and Visual Studio 2010 Ultimate (SP1, FP2, Rollups). Provided your process template integrates with MTM from a Test Automation perspective – ie: supports ‘Associated Automation’, has an ‘Automated’ automation status and contains ‘Shared Steps’ and ‘Test Case’ template types – then, depending on the alignment of the stars, the phase of the moon and the direction of the wind, Pronto should just work.


Grey Ham

March 25, 2011

Automating the Integration of VmWare with Microsoft Test Manager and Lab Center: Part 6 – Changes for Visual Studio 2010 Service Pack 1

PLEASE NOTE: This is for Visual Studio 2010. For the VS2012 version, please click here.

Mid-way through the series, Visual Studio Service Pack 1 was released. How amusing! So this is an update to incorporate the Service Pack 1 changes.

See Part 5 for the Source Code and scripts.


Do not use this code under any circumstances (should just about cover the possibilities!).

I am using an undocumented API in order to construct the Physical Environment in Lab Center and set up the Test Controller Topology. I have tested the registered environments using MTM, use them often and have come to no grief. My Lab Center and TFS system appear to be stable. But you use this at your own risk! At the very least, it would be sensible to do a full backup of your TFS installation and ideally test this prior to production deployment. Use at your own risk :)

Parts 1, 2, 4, 5 are the same: nothing changes. The only changes you will need to make are down to the installation automation in Part 3.

I will not be providing an updated script to do this but if you have been following the series and want to stick to the same structure, you need to make Service Pack 1 available under the VisualStudioGumpf directory by unpacking your ISO there:

You will probably also need to create a new BAT file to launch “setup.exe /passive” from the Service Pack 1 location. Drop this into your Golden VM at the usual place:

And then write a new function to launch that from PowerShell.

InstallServicePack1 $VmWareConnectionParameters $VmWareClonedImageLocation "$DomainName\$DomainUsername" $DomainPassword $VisualStudioGumpfUnc;

If you get problems – try it manually first! The only part where I do anything undocumented is to create the Physical Environment. If you happen to get a situation where you can do this registration process manually, but not automatically, please let me know so that I can fix it :-)

Source Code Changes
I have no idea how many lines of white powder I had up my nose when I wrote this comment:

// I am not going to check this here but only one machine in an environment can be of a given role. 
Dictionary agentsToAdd = new Dictionary();
Dictionary machineRoleInfo = new Dictionary();

But it is clearly wrong!

Apart from having to install Service Pack 1, I haven’t had to make any changes: the environment still gets created and all appears normal. You should be able to target your created environment from MTM:

And run your Unit Tests, Integration Tests and CodedUI Tests on it:


March 17, 2011

Automating the Integration of VmWare with Microsoft Test Manager and Lab Center: Part 5 – Registering a new Physical Environment with TFS / Lab Center

PLEASE NOTE: This is for Visual Studio 2010. For the VS2012 version, please click here.

This series of posts will show you how to provision VmWare machines, install Test Controllers and Test Agents, register those with TFS and construct a Physical Environment within Lab Center so that it can be targeted by Microsoft Test Manager. All automatically using PowerShell (and VmRun.EXE):

  • Part 1 – Introduction.
  • Part 2 – Provisioning a new VmWare machine using Vix and joining a domain.
  • Part 3 – Automating the installation of the Test Controller and Test Agent.
  • Part 4 – Automating the configuration and registration of the Test Controller and Test Agent with TFS.
  • Part 5 – Automating the creation of a Physical Environment in Lab Center. (Source Code) [Updated: Removed comment]
  • Part 6 – Service Pack 1 Notes
  • Build-Deploy-Test with VmWare is a superset of this so I will tackle that after this series.

    See Part 2 for Requirements (PowerShell, Vix etc.).

    Do not use this code under any circumstances (should just about cover the possibilities!).

    I am using an undocumented API in order to construct the Physical Environment in Lab Center and set up the Test Controller Topology. I have tested the registered environments using MTM, use them often and have come to no grief. My Lab Center and TFS system appear to be stable. But you use this at your own risk! At the very least, it would be sensible to do a full backup of your TFS installation and ideally test this prior to production deployment. In particular: this had not been tested with Visual Studio 2010 Service Pack 1 when first written (it has now, and it appears to work as well as pre-SP1) but use at your own risk :-)

    In future, if and when Microsoft provides a way to create Physical environments and construct Test Controller / Test Agent hierarchies programmatically, this is the only stage you will have to change in this series. Everything else you have done as part of this series still applies.

    That aside: this is the REALLY AWESOME PART!! Before going on, consider provisioning a brand new virtual machine using the scripts at the end of Part 4.

    There are no officially supported APIs to manage the Test Controllers and Test Agents in TFS. This is unfortunate because it stops you creating Physical Environments and constructing your Test Controller / Test Agent hierarchies on the fly. You proceed at your own risk!

    The Test Controllers and Test Agents are all written in .Net and Microsoft has a tendency to mark its classes ‘internal’ so you cannot reuse them willy-nilly. I like this approach: rather than obfuscating their code, they are polite enough to let us wander around their work if we need to troubleshoot or find out WTF is going on. It’s a pragmatic compromise between opening things up to all and sundry and having to support their firstborn for life.

    The code we need to invoke to construct our Test Controller and Test Agent hierarchies is stored in ‘internal’ classes so we need to find a way of instantiating those classes and invoking their methods.

    A major help is that IL has no concept of ‘internal’: it is the compiler that enforces that contract (well… it’s not THAT simple!), opening up all kinds of creative possibilities! But that aside, you can create internal classes from other libraries just by using reflection like so:

    // Load the assembly containing the internal type and look the type up by its fully qualified name
    System.Reflection.Assembly controllerAssembly = System.Reflection.Assembly.LoadFile(theAssemblyPath); 
    Type controllerTestEnvironmentType = controllerAssembly.GetType(theFullyQualifiedTypeName);
    // With BindingFlags.CreateInstance the member name is ignored, hence the empty string
    object testEnvironment = controllerTestEnvironmentType.InvokeMember("", System.Reflection.BindingFlags.CreateInstance, null, controllerTestEnvironmentType, null);

    …and you can get hold of properties, invoke private and protected methods and everything else you want using nothing more than reflection. But in .Net 4.0 there is an easier way! The ‘dynamic’ keyword will defer property and method resolution until runtime. Think of it as IDispatchEx from the COM world or expandos in JavaScript. It is very cool because by wrapping an internal type you can treat it like any other object. I use the code from that link verbatim, but there’s a lot more you can do with it if you put in the time.
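    As a toy illustration of the underlying mechanics (using a made-up class, not a real TFS type), plain reflection is enough to read a private field and invoke a private method – AccessPrivateWrapper simply hides this plumbing behind the ‘dynamic’ keyword:

```csharp
using System;
using System.Reflection;

// Hypothetical class standing in for one of the 'internal' TFS types
class SecretiveThing
{
    private string _hidden = "internal state";
    private string Reveal() { return _hidden.ToUpper(); }
}

class Program
{
    static void Main()
    {
        object thing = new SecretiveThing();
        Type t = thing.GetType();

        // Read a private field...
        FieldInfo field = t.GetField("_hidden", BindingFlags.Instance | BindingFlags.NonPublic);
        Console.WriteLine(field.GetValue(thing));      // prints: internal state

        // ...and invoke a private method, exactly as you would on the Test Controller classes
        MethodInfo method = t.GetMethod("Reveal", BindingFlags.Instance | BindingFlags.NonPublic);
        Console.WriteLine(method.Invoke(thing, null)); // prints: INTERNAL STATE
    }
}
```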

    The reason I am mentioning it? Because you will see it in my code when I invoke the internal or private Test Controller classes:

                dynamic controllerConnectionManagerWrapper = new AccessPrivateWrapper(controllerConnectionManager);

    Manually creating a Physical Environment using a UI
    The source code this time includes all of the scripts and some supporting Visual Studio 2010 Solutions that you will have to build yourself:

  • The AgentHelpers library creates physical environments in Lab Center and sets up the Test Controller / Test Agents ready for use in that environment.
  • A crude UI Application (PhysicalEnvironments) that uses the library to create physical environments in Lab Center
  • A command line application called AgentInteraction.EXE is used to set up the environments as part of the PowerShell script
    The first thing we need to do is to make sure you can manually construct Physical Environments using my User Interface, because it drives the underlying TFS libraries. If not: not much point going on!

    Assuming you have heeded the disclaimer, you will need to add references to several standard TFS API’s including:

    The interesting three are:

  • Microsoft.TeamFoundation.Lab.Activities – in C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\TeamLab\Microsoft.TeamFoundation.Lab.Activities.dll
  • Microsoft.VisualStudio.QualityTools.ControllerObject – in C:\Windows\assembly\GAC_MSIL\Microsoft.VisualStudio.QualityTools.ControllerObject\\Microsoft.VisualStudio.QualityTools.ControllerObject.dll
  • Microsoft.VisualStudio.QualityTools.ExecutionCommon – in C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ReferenceAssemblies\v2.0\Microsoft.VisualStudio.QualityTools.ExecutionCommon.dll
    It is important that you set *ALL* of those DLLs to ‘Copy local: True’ for the Reference. You will need to add them to AgentInteraction and PhysicalEnvironments as well.

    Now run the PhysicalEnvironments application and you will see a glorious ‘Graham UI’. I loathe arty-farty work and I loathe writing User Interfaces so much, and have consistently done such a bad job of it, that I never get asked to write them any more (mission accomplished!). Select the TFS Team Project Collection / Team Project and observe what happens:

    This UI is listing all of the Test Controllers on the left: that is the same list you see when you create a new environment in Lab Center and are asked to select the Test Controller you want to create the environment on:

    When you select one of the Test Controllers (such as Moo1250) you will see the list of Agents available. This list is identical to what you can see in Lab Center as well:

    So we have all of the information we need to start constructing Physical Environments. Enter a new name for the Physical Environment you want to create – such as “WooHOO” – tick the single Agent, enter a sensible Role name, and select “Create New Environment”:

    Your UI will now gloriously lock up for about a minute (I provide no UI Cues) whilst Lab Center does whatever it needs to do to create your environment. Assuming all went well (you’ll know when focus returns to your window) you should be able to see your environment in Lab Center:

    AWESOME! So we have all the bits and pieces.

    Before going on, delete the environment in Lab Center and then continue.

    Automating the creation of a Physical Environment using PowerShell
    The PowerShell script obviously doesn’t use the User Interface but I have created a dirty, hackey-wackey Console application called ‘AgentInteraction.EXE’ that is only moderately unforgiving of command line mistakes but can be used to drive the registration process.

    To set this up to work, create a new subdirectory under VisualStudioGumpf called AgentInteraction and drop these DLL’s (AgentHelpers/bin/Debug), AgentInteraction.EXE and anything pulled in as part of the build into it:

    Assuming you have removed the Test Controller and everything from Lab Center, you should be able to execute code like this:

    # Obtain a list of all Agents known to a Test Controller
    \\\VisualStudioGumpf\AgentInteraction\AgentInteraction.EXE Command=ListAgents TestControllerUri="MOO1251.WOOHOO.COM:6901"
    # To create an environment on the Test Controller MOO1251 with a single Agent called MOO1251 with a Role of 'The Web Server' you would execute:
    \\\VisualStudioGumpf\AgentInteraction\AgentInteraction.EXE command=RegisterEnvironment testcontrolleruri="MOO1251.WOOHOO.COM:6901" TeamProjectName="TestSample" EnvironmentName="My New Environment" Agents="MOO1251,The Web Server"

    … and see your environment created in Lab Center from PowerShell:


    Automation Summary
    Assuming everything is in place, you should now be able to dynamically provision new virtual machines in VmWare, install the Test Controllers and Test Agents, register them with TFS and create a new Physical Environment on a whim. The key differences for this post are:

    # ... variables from previous stages
    # The Team Project Name we want to register the environment with and the environment name
    $TeamProjectName = "TestSample";
    $EnvironmentName = "My Brand New Environment";
    $MachineRole = "Database Server";
    # ...
    # STAGE 5: Create the Environment
    # It can take a while for the VM to spin up and the Test Controller + Test Agent to come online... so use the RETRYCOUNT here.
    Invoke-Expression "$VisualStudioGumpfUnc\AgentInteraction\AgentInteraction.EXE command=RegisterEnvironment testcontrolleruri=$($MachineName).$($DomainName):6901 TeamProjectName=`"$TeamProjectName`" EnvironmentName=`"$EnvironmentName`" Agents=`"$MachineName,$MachineRole`" RetryCount=10";

    As a final test, set it as a targetable environment in MTM and run your Tests:

    Mission Accomplished! However… there is an anomaly! When I went into the Test Controllers view and clicked a Test Controller, I noticed that duplicate Test Controller names appeared for those that were put on Virtual Machines that I had provisioned:

    At first I thought it was some SID issue with the cloned virtual machine but it appears to be down to a legacy DNS Issue:

    I was able to repeat this 100%. If you remove the legacy DNS Record (ie: for Moo1400 which is a virtual machine I purged) and restart MTM, the problem goes away (I wasn’t able to repeat it after this).

    So I guess you need to purge the DNS Record as well as the Computer from your Active Directory when you throw away an environment.


