Azure Remote PowerShell NuGet Package

This package can be installed in your Azure cloud service by running Install-Package AzureRemotePerfView on the web or worker role project you want it to run on.
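From the Package Manager Console in Visual Studio, with the role project selected as the default project, that looks like:

```
PM> Install-Package AzureRemotePerfView
```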

An example

To go through a quick tutorial, let's create a new cloud project. In Visual Studio, click New Project, select Cloud from the tree and pick Windows Azure Cloud Service from the list. For the purposes of this demo, let's add an ASP.NET Web Role. You can set it up however you want. I'll use the MVC template.

On the new web role project, right click and choose Manage NuGet Packages. Search for Azure PerfView Tool. Since I created this NuGet package recently, I'm using fairly recent Azure SDK dependencies so you may notice other packages being updated.

What gets added

In your web or worker role project, you'll see that a few new files have been added.

The cloud project will also have been adjusted; you can see the changes in the csdef file:

<ServiceDefinition name="AzureCloudService1" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition" schemaVersion="2014-06.2.4">
  <WebRole name="WebRole1" vmsize="Small">
    <ConfigurationSettings>
      <Setting name="AzurePerfTools.PowerShellWindowsService.ConnectionString" />
    </ConfigurationSettings>
    <Startup>
      <Task commandLine="InstallRemotePS.bat" executionContext="elevated" taskType="simple" />
    </Startup>
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
    <Imports>
      <Import moduleName="Diagnostics" />
    </Imports>
    <LocalResources>
      <LocalStorage name="DiagnosticStore" sizeInMB="4096" cleanOnRoleRecycle="false" />
      <LocalStorage name="ProfileStorage" sizeInMB="1000" cleanOnRoleRecycle="true" />
    </LocalResources>
  </WebRole>
</ServiceDefinition>

A new configuration setting has been added for the Azure storage connection string that the remote PowerShell service will use. The new setting requires a value to be added to the cscfg file(s).
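For example, the corresponding cscfg entry might look like this (the account name and key are placeholders for your own storage credentials):

```xml
<Role name="WebRole1">
  <ConfigurationSettings>
    <Setting name="AzurePerfTools.PowerShellWindowsService.ConnectionString"
             value="DefaultEndpointsProtocol=https;AccountName=yourstorageaccount;AccountKey=yourkey" />
  </ConfigurationSettings>
</Role>
```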

A startup task runs InstallRemotePS.bat to set up the Windows service. Note that if there is a problem with the referenced assemblies from the Azure SDK, InstallRemotePS.bat will fail. For testing purposes, it may be best to comment out this startup task and deploy without it; then use Remote Desktop to connect to a VM instance after deploying and run the batch file manually to verify that it works.

A local storage resource for the profiles is also added. If a DiagnosticStore resource has not already been declared, the package adds one to make it explicit.

For this demo, everything should just work as long as you set the connection string correctly. You can publish straight from Visual Studio.

Extra steps

Make sure you have an Azure storage account. Add a blob storage container called "profiles". This is where the profiles will be uploaded.
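If you would rather script this step than click through the portal, a minimal sketch using the Azure Storage client library (the Microsoft.WindowsAzure.Storage NuGet package; the connection string below is a placeholder for your own) looks like this:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class CreateProfilesContainer
{
    static void Main()
    {
        // Placeholder: use the same storage connection string you configured for the role.
        CloudStorageAccount account = CloudStorageAccount.Parse("[your storage connection string]");
        CloudBlobClient blobClient = account.CreateCloudBlobClient();

        // The remote PowerShell service uploads profiles to this container.
        CloudBlobContainer container = blobClient.GetContainerReference("profiles");
        container.CreateIfNotExists();
    }
}
```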

Creating a client

Once the service is deployed, how do you contact the service and get it to do things? For that, you will need to create an app to run from your client machine. The code for the Azure Remote PS on GitHub includes a client console app that you can use as a starting point. Get the code by running:

git clone 

To continue with the demo, let's create our own console application. After creating a new console application project, go to Manage NuGet Packages for Solution. In the installed packages you should see WCF Azure Table Transport. Add that to the new console application. Also, add a reference to System.ServiceModel.

Next, we'll define the WCF service contract:

[ServiceContract]
public interface IRemotePowerShellCommands
{
    [OperationContract]
    string StartPowerShell();

    [OperationContract]
    string SendCommand(string commandText);
}

Next, add a client implementation of that contract:

using System.ServiceModel;
using System.ServiceModel.Channels;

class RemotePowerShellClient : ClientBase<IRemotePowerShellCommands>, IRemotePowerShellCommands
{
    public RemotePowerShellClient(Binding binding, EndpointAddress remoteAddress)
        : base(binding, remoteAddress)
    {
    }

    public RemotePowerShellClient(string endpointConfigurationName)
        : base(endpointConfigurationName)
    {
    }

    public string StartPowerShell()
    {
        return base.Channel.StartPowerShell();
    }

    public string SendCommand(string commandText)
    {
        return base.Channel.SendCommand(commandText);
    }
}

Next add the WCF configuration to your app.config:

<configuration>
  <system.serviceModel>
    <extensions>
      <bindingExtensions>
        <add name="azureTableTransport"
             type="AzurePerfTools.TableTransportChannel.AzureTableTransportBindingCollectionElement, AzurePerfTools.TableTransportChannel" />
      </bindingExtensions>
    </extensions>
    <bindings>
      <azureTableTransport>
        <binding name="tableTransportBinding" deploymentId="deploymentId" connStr="connStr"
                 sendTimeout="00:10:00" />
      </azureTableTransport>
    </bindings>
    <client>
      <endpoint address="" binding="azureTableTransport" bindingConfiguration="tableTransportBinding"
                contract="IRemotePowerShellCommands" name="PowerShellClient" />
    </client>
  </system.serviceModel>
</configuration>

Be sure to replace deploymentId with the actual deployment ID (not the deployment name) of your cloud service, and connStr with the Azure storage connection string that you configured for the cloud role.

Let's turn the main program into a simple terminal:

class Program
{
    static void Main(string[] args)
    {
        try
        {
            RemotePowerShellClient client = new RemotePowerShellClient("PowerShellClient");
            string startResponse = client.StartPowerShell();
            Console.Write(startResponse);

            string command;
            do
            {
                command = Console.ReadLine();
                ExecuteCommand(client, command);
            } while (!string.Equals("exit", command, StringComparison.InvariantCultureIgnoreCase));
        }
        catch (Exception exc)
        {
            Console.WriteLine(exc);
        }

        Console.WriteLine("\n\nPress any key to exit");
        Console.ReadKey();
    }

    private static void ExecuteCommand(RemotePowerShellClient client, string cmd)
    {
        string commandResponse = client.SendCommand(cmd);
        Console.Write(commandResponse);
    }
}

If everything is configured correctly, you should see a "PSConsoleSample:" prompt come up when you run the console application. The first thing you should NOT do is run "ls" or "dir". Instead, first run this:

PSConsoleSample: (Get-Item -Path ".\" -Verbose).FullName

The maximum message size allowed by the transport is not large enough to fit a full directory listing of system32.
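Until you have a feel for that limit, keep responses small. Select-Object can truncate the output of any command (the path here is just an illustration):

```
PSConsoleSample: Get-ChildItem C:\Windows\System32 | Select-Object -First 10
```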

How to run PerfView

Change to the profile storage directory, so the trace files land there, and start PerfView:

PSConsoleSample: pushd $profileStorage
PSConsoleSample: perfview start -LogFile:perfview.log -ThreadTime -AcceptEULA

Run some tests against your website and then stop the profile:

PSConsoleSample: perfview stop -LogFile:perfview.log
PSConsoleSample: Get-Process perfview | Foreach-Object { $_.WaitForExit() }
PSConsoleSample: dir *
    Directory: C:\Resources\Directory\00000000000000000000000000000000.WebRole1.ProfileStorage
Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---         10/7/2014   6:25 PM   29411269

In the above, I use the Get-Process command to wait for PerfView to finish merging and packaging the profile. This is especially important if you're looking to automate collection. As you can see, PerfView created a merged profile in the ProfileStorage directory.

Where to go from here

This allows you to run commands against an Azure cloud service role instance, which opens up the possibility of collecting all kinds of logs, profiles, dumps, etc. without having to manually go onto the machine. The local storage resource allows the files created on the VM to be uploaded to blob storage automatically.
