Tuesday, June 28, 2016

Building a developer test lab with Test Controllers and Agents in VSTS

Microsoft has recently released their DevTest Labs resources to production in Azure. This great functionality lets you quickly stand up a lab where you can test your products, then tear it down when you're done.

Steps:

  1. Create an Azure Resource Group project in Visual Studio for managing your DTL.
  2. Add a Virtual Network resource to the template
  3. Add a DevTest Lab resource to the template
  4. In the portal, bind the virtual network to the DTL in the Settings tab of the DTL
  5. Create a Virtual Machine to host your software. Use a Formula to install any prerequisites that you want on the machine: e.g. Chrome, Notepad++, etc.
    1. As a prerequisite for connecting to a vNext workflow, you **MUST** have the Azure PowerShell cmdlets installed on the machine as part of the Formula for the VM
  6. On the virtual machine, open a PowerShell console and enable remoting along with adding a corresponding firewall rule to remove the local-subnet-access-only restriction that's set by default:
    1. PS> Enable-PSRemoting
    2. PS> Set-NetFirewallRule -Name "WINRM-HTTP-In-TCP-PUBLIC" -RemoteAddress Any -LocalPort 5986
    3. PS> Set-ExecutionPolicy RemoteSigned
    4. PS> dir WSMan:\localhost\listener\*\Port # show the port on which WSMan is currently listening
    5. PS> winrm set winrm/config/Listener?Address=*+Transport=HTTP '@{Port="5986"}' # Change the port on which WSMan runs, option 1
    6. PS> Set-Item WSMan:\localhost\listener\*\Port 5986 # Change the port on which WSMan runs, option 2
    7. Configure the machine as per the tools here, which cover essentially the same steps as above, plus a few extras: https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-winrm-windows
  7. From another machine, run the following to ensure that your machine is connectable:
    1. Test-WSMan -ComputerName mymachinedns.westus.cloudapp.azure.com -Port 5986
    2. $sessionOpt = New-PSSessionOption -SkipCACheck
    3. $session = New-PSSession -ComputerName [myvmname].westus.cloudapp.azure.com -Credential (Get-Credential) -Port 5986 -UseSSL -SessionOption $sessionOpt
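Once the session from step 7 is open, you can sanity-check it by running a command through it and then disposing of it. A minimal sketch, assuming the $session variable from above is in scope:

```powershell
# Run a command on the remote VM through the established session,
# then clean up the session.
Invoke-Command -Session $session -ScriptBlock { Get-Service WinRM }
Remove-PSSession $session
```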

Saturday, June 25, 2016

Forcing addition of the "this" keyword with ReSharper

I hate using underscores for representing members of a type. It's idiotic. If you need a prefix to distinguish your local variables from your members with the same name, there's a built-in way of doing it: the 'this' keyword.

"But ... that's more typing ... it takes longer" = > NO! IT FUCKING DOESN'T

1) Between IntelliSense in your tools and automatic code formatters, it can actually take less effort when you apply the formatter regularly and share the settings with your team.

2) The underscore is harder to type, especially on a QWERTY keyboard (I use Dvorak)

If you use ReSharper, the Code Cleanup feature can fix this up for you in short order. As part of your cleanup profile, ensure that you select "Arrange qualifiers" in the settings.

Tuesday, June 21, 2016

PowerShell script to execute an Entity Framework Migration

I got sick and tired of coming up with hacks for including Entity Framework migrations in my deployment processes, so I came up with a much cleaner hack: a PowerShell script that executes a database migration given an assembly and a connection string. One caveat: if the script is run more than once in a given PowerShell session, it causes a stack overflow exception, and I haven't been able to figure out why. I'll have to leave that as an exercise for you. Have at it:

Param(
    [ValidateNotNullOrEmpty()][string]$assemblyFile,
    [ValidateNotNullOrEmpty()][string]$connectionString
#    [ValidateNotNullOrEmpty()][string]$server,
#    [ValidateNotNullOrEmpty()][string]$databaseName,
#    [ValidateNotNullOrEmpty()][string]$userName,
#    [ValidateNotNullOrEmpty()][string]$password
)

$assemblyFolder = [System.IO.Path]::GetDirectoryName($assemblyFile)

$OnAssemblyResolve = [System.ResolveEventHandler] {
  param($sender, $e)

  # First load the assemblies already in our app domain
  foreach($a in [System.AppDomain]::CurrentDomain.GetAssemblies())
  {
    if ($a.FullName -eq $e.Name)
    {
      return $a
    }
  }

  $fn = Join-Path $assemblyFolder "$($e.Name.Split(',')[0]).dll"

  if (Test-Path $fn)
  {
      $ass = [Reflection.Assembly]::LoadFile($fn)
      return $ass
  }

  return $null
}

[System.AppDomain]::CurrentDomain.add_AssemblyResolve($OnAssemblyResolve)

Set-Location ($assemblyFolder)

[System.Reflection.Assembly] $assembly = [System.Reflection.Assembly]::LoadFile($assemblyFile)

$types = $null

try
{
    $types = $assembly.GetTypes()
}
catch [Exception]
{
    $ErrorMessage = $_.Exception.Message

    throw "Failed to load types from assembly: $ErrorMessage"
}

$configurationType = [System.Type]::GetType("System.Data.Entity.Migrations.DbMigrationsConfiguration, EntityFramework")
$migratorType = [System.Type]::GetType("System.Data.Entity.Migrations.DbMigrator, EntityFramework")
$connectionInfoType = [System.Type]::GetType("System.Data.Entity.Infrastructure.DbConnectionInfo, EntityFramework")

if ($configurationType -eq $null -or $migratorType -eq $null -or $connectionInfoType -eq $null)
{
    throw "Failed to load entity framework"
}

$migrationConfigurationTypes = @($types | ? { $configurationType.IsAssignableFrom($_) })

if ($migrationConfigurationTypes.Length -ne 1)
{
    throw "Failed to find single migration type for Entity framework in the assembly to migrate, found $($migrationConfigurationTypes.Length) migration configuration types"
}

Write-Warning "If you see an error similar to 'Could not find a connection string named [connection string name] in the application config', try changing your DbContext descendent to use the DbContext('connection string name') constructor syntax instead of DbContext('name=connection string name'). This is a known bug in EntityFramework."

#$builder = New-Object System.Data.SqlClient.SqlConnectionStringBuilder
# Need to use psbase, otherwise these properties throw an exception
#$builder.psbase.DataSource = $server
#$builder.psbase.InitialCatalog = $databaseName
#$builder.psbase.UserID = $userName
#$builder.psbase.Password = $password

#$connectionString = $builder.ToString()

$configuration = [System.Activator]::CreateInstance($migrationConfigurationTypes[0])
$connectionInfo = New-Object $connectionInfoType ($connectionString, "System.Data.SqlClient")
$configuration.TargetDatabase = $connectionInfo
$migrator = [System.Activator]::CreateInstance($migratorType, @($configuration))
$migrator.Update()

Write-Host 'Successfully migrated database'
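Assuming the script is saved as Migrate-Database.ps1 (the file name, assembly path, and connection string below are all placeholders), an invocation looks like:

```powershell
.\Migrate-Database.ps1 `
    -assemblyFile "C:\deploy\MyApp.Data.dll" `
    -connectionString "Server=.\SQLEXPRESS;Database=MyAppDb;Integrated Security=True"
```

Launching each run in a fresh process (e.g. powershell.exe -NoProfile -File .\Migrate-Database.ps1 ...) also sidesteps the stack overflow caveat above, since no two runs ever share a session.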


Wednesday, June 15, 2016

Using the OAuth 2.0 configuration of HTTP Client Connectors with Dell Boomi (useful for Azure PaaS Web Applications that use Azure AD)

My company recently started using Dell's Boomi platform to connect to some of our PaaS applications running in Azure that use Azure Active Directory for authentication. We had previously tried to get the OAuth 2.0 security settings on the HTTP Client Connector working, to no avail. Because some of our upcoming work would really benefit from OAuth 2.0-configured connectors, I decided to try again, and got it working. Here's what I had to do in order to use an HTTP Client Connector with Azure AD:


  • On the "Settings" tab of your HTTP Client, do the following:
    • "URL" => The URL of the API to which you want to connect.
    • "Authentication Type" => OAuth 2.0
    • "Client ID" => The Client ID of an Azure AD **Web Application** registered in Azure AD, that has **ALREADY BEEN PRECONFIGURED FOR ACCESS TO YOUR SERVICE**. This can be copied and pasted from the Azure web portal AD application page for your application.
    • "Client Secret" => The App Key (in Azure terminology) of the Azure AD **Web Application** registered in Azure AD to be used as a client application for your service.
    • "Authorization Token URL" => The "OAUTH 2.0 AUTHORIZATION ENDPOINT" copied out of the Azure AD "Applications" tab in the Azure Management Portal.
    • "Access Token URL" => The "OAUTH 2.0 TOKEN ENDPOINT" copied out of the Azure AD "Applications" tab in the Azure Management Portal.
  • Under the "Add Authorization Parameter" link, you'll need to add 2 parameters. Click the "Add Authorization Parameter" link twice to add them:
    • "grant_type" => "client_credentials"
    • "resource" => The App ID URI of the target Web Application registered in Azure AD acting as the service to which your client Web Application is connecting.
  • In the Azure AD portal for your Client Web Application, you'll need to add the OAuth callback URL for your Boomi account to the "Reply URLs" list. e.g. https://platform.boomi.com/account/[companyaccountname-11X11X]/oauth2/callback
  • At the bottom of the page, click on the "Generate" button next to the "Access Token" label. Boomi will now attempt to connect to Azure AD. To do this, it will open up a new web page and attempt to authorize, so ensure that you have any pop-up blockers either turned off or configured to allow platform.boomi.com to open pop-ups.
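If the "Generate" step fails, it can help to verify the Azure AD application credentials outside of Boomi by requesting a token directly with the client_credentials grant. A sketch in PowerShell, where the tenant, client ID, key, and resource URI are all placeholders for your own values:

```powershell
$body = @{
    grant_type    = "client_credentials"
    client_id     = "11111111-2222-3333-4444-555555555555"              # your client application's Client ID
    client_secret = "your-app-key"                                      # the App Key generated in the portal
    resource      = "https://yourtenant.onmicrosoft.com/YourServiceApp" # App ID URI of the target application
}
$response = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/yourtenant.onmicrosoft.com/oauth2/token" `
    -Body $body
$response.access_token
```

If this returns a token, the IDs, key, and resource URI are correct, and any remaining problem is on the Boomi side (e.g. the Reply URL or a pop-up blocker).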

Friday, June 03, 2016

Implementing Application Warmup with IIS 7.5 on Windows Server 2008 R2

In order to support certain features of some of our applications, we've found the need to enable application warmup. However, this isn't built in to IIS 7.5; it's available as the Application Initialization module provided by Microsoft. To get this working, you'll need to do the following:


  1. Install the Application Initialization 1.0 module available from Microsoft.
  2. You'll need to edit the applicationHost.config file on your server to enable Application Initialization. To do this, you'll need to open the file at %WINDIR%\System32\inetsrv\config\applicationHost.config in your text editor of choice and make the following changes:
    • Find the 'application' element that you wish to enable for Application Initialization, which you can do with an XPath similar to "/configuration/system.applicationHost/sites/site/application" and once found, set the preloadEnabled="true" attribute on the XML element. Add the element if it's not already there.
    • While on the 'application' element, take note of the 'applicationPool' value, because that's the Application Pool in which the application runs. Find the configuration node for that application pool (XPath: /configuration/system.applicationHost/applicationPools) and then set the following attributes on the <add> element for the application pool and its child 'processModel' element:
      • <add .... startMode="AlwaysRunning">
      • <processModel ... idleTimeout="0">
  3. In your application's web.config file, you'll need to add the following elements:
<system.webServer>
    <applicationInitialization remapManagedRequestsTo="/InitializationProxyPage.txt" skipManagedModules="true" doAppInitAfterRestart="true">
        <add initializationPage="/Services/MyCustomService.svc" hostName="localhost" />
    </applicationInitialization>

    <directoryBrowse enabled="true" />
</system.webServer>

You'll need to have the "InitializationProxyPage.txt" file in the root of your web application.

4. Enable "Anonymous" authentication in the Authentication panel of your application in IIS. This is necessary so that the warmup page can successfully execute a GET request for your initialization page to kick off the warmup.

5. If you use the system.webServer/security/authorization or system.webServer/security/authentication sections in your web.config to control access to the application, you'll also need to add the following under the /configuration element as a sibling to the system.webServer element:

<configuration>
    <location path="InitializationProxyPage.txt">
        <system.webServer>
            <security>
                <authorization>
                    <add accessType="Allow" users="*"/>
                </authorization>
            </security>
        </system.webServer>
    </location>
</configuration>
This will grant explicit access to the InitializationProxyPage.txt file to all users (including the App Pool Identity user) so that application warmup is guaranteed to have access to that file and can bootstrap your services.
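Incidentally, the applicationHost.config edits from step 2 can also be scripted with appcmd instead of edited by hand. A sketch, assuming an application at "Default Web Site/" running in an application pool named "MyAppPool" (both names are placeholders for your own):

```powershell
$appcmd = "$env:windir\System32\inetsrv\appcmd.exe"
# Enable preload on the application, then set the pool to always run
# with no idle timeout.
& $appcmd set app "Default Web Site/" /preloadEnabled:true
& $appcmd set apppool "MyAppPool" /startMode:AlwaysRunning /processModel.idleTimeout:00:00:00
```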

Thursday, May 26, 2016

Troubleshooting the dreaded "The build directory of the test run either does not exist or access permission is required." error message in Microsoft Test Manager

Our company is starting to use Microsoft Test Manager a lot more to manage our testing, both manual and automated. Additionally, we've recently started creating some new Test Controllers and Test Agents with an exotic network configuration. On one of these new Test Controllers, I started creating Lab test runs and got the error message "The build directory of the test run either does not exist or access permission is required." When you Google this error message, you get what's described in this post on MSDN. In my case, it was the former problem: the account under which the Test Controller was running couldn't see the build folder.

After reading that last statement, you might say "well, why didn't you check to make sure that the drop folder was in the place it should have been?" And I did. Sort of. Due to our network configuration and aliases, my user could see it in the expected place at the alias in my portion of the network. But the user under which the Test Controller was running (not the user configured for the tests themselves in Microsoft Test Manager; the Test Controller service runs as a different user) couldn't, because we have some synchronization going on. Once I logged on to the machine running the Test Controller service **AS THE USER UNDER WHICH THE TEST CONTROLLER WAS RUNNING** and browsed to the network alias myself, I could finally see that the synchronization between the identically-aliased locations on the network wasn't running, and the build that I expected to be there was in fact missing.

Problem solved.
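To reproduce that check without a full interactive logon, you can probe the drop location as the service account directly. A sketch, where the account name and UNC path are placeholders:

```powershell
# Prompts for the service account's password, then probes the drop share
# under that identity; -NoExit keeps the spawned window open so you can
# read the result.
runas /user:DOMAIN\TestControllerSvc `
    "powershell.exe -NoProfile -NoExit -Command Test-Path '\\buildserver\drops\MyApp\MyBuild'"
```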

Monday, May 02, 2016

Migrating code in TFS from one Team Project to another Team Project within the same Team Project Collection

My company has frequently had the issue of moving code around within the same Team Project Collection as shifting desires and motivations for the organization of our code have swayed our company. Previously, we've used tools such as the TFS Integration Platform available on CodePlex. That project has its share of issues, namely:
a) It was half-baked to begin with
b) It's no longer maintained and doesn't work for versions later than TFS 2012.

Fortunately for me, thanks to a fortuitous Stack Overflow post, I've found the following script to recursively move the content of one folder to another folder, and it even works across Team Projects within the same collection (but not across Team Project Collections):

Param(
    [Parameter(Mandatory = $true)][ValidateNotNullOrEmpty()][string] $SourceFolder,
    [Parameter(Mandatory = $true)][ValidateNotNullOrEmpty()][string] $TargetFolder
)

Get-TfsChildItem $SourceFolder |
    select -Skip 1 |  # skip the root dir
    foreach {
        & "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\TF.exe" rename $_.serveritem $_.serveritem.replace($SourceFolder, $TargetFolder)
    }

In order to run the script above, you'll need both Visual Studio and the TFS Power Tools installed. When you install the Power Tools, ensure that the PowerShell module is selected in the installation options. Also, because the path to the Power Tools PowerShell module isn't in PSModulePath by default, you'll have to manually import the module into your PowerShell session like so (before you run the script):

Import-Module "C:\Program Files (x86)\Microsoft Team Foundation Server 2013 Power Tools\Microsoft.TeamFoundation.PowerTools.PowerShell.dll"