Monday, November 25, 2013

Page file manipulation using PowerShell

Handy little script for changing the location of the system page file using PowerShell.






$computer = Get-WmiObject Win32_ComputerSystem -EnableAllPrivileges

# Disable automatic page file management so the settings below take effect.
$computer.AutomaticManagedPagefile = $false
$computer.Put()

# Remove the existing page file on C:.
$CurrentPageFile = Get-WmiObject -Query "select * from Win32_PageFileSetting where name='c:\\pagefile.sys'"
$CurrentPageFile.Delete()

# Create a new page file on D: (InitialSize and MaximumSize of 0 = system managed).
Set-WMIInstance -Class Win32_PageFileSetting -Arguments @{name="d:\pagefile.sys"; InitialSize = 0; MaximumSize = 0}
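A quick way to sanity-check the change (the new page file is only created after a reboot) is to query the page file settings again:

```powershell
# List the configured page files; after the script above has run (and the
# server has rebooted) this should show d:\pagefile.sys rather than c:\pagefile.sys.
Get-WmiObject -Class Win32_PageFileSetting | Select-Object Name, InitialSize, MaximumSize
```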






I can’t take any credit for this script; I’m posting it here so that I can easily find it again in the future.

Friday, November 08, 2013

Problems with FSRM (Quick Fix)

I recently installed FSRM on to a Windows 2012 server.

I started seeing a few strange issues when trying to use the FSRM PowerShell cmdlets.

For example, when I ran Get-FsrmQuota, a list of all currently applied quotas should have been returned.
Instead, I got no output at all from the command. Other PowerShell commands threw strange errors.

Whenever I tried to view the FSRM options screen, the MMC would crash with an error.

After a bit of digging, it turns out that this is a bug in the installation of FSRM which only occurs following the first reboot of a server hosting the FSRM role.

After you perform a second reboot, or run:

net stop srmsvc
net start srmsvc

the problem goes away.
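For reference, the same restart can be done in a single step from PowerShell:

```powershell
# Restart the File Server Resource Manager service (srmsvc).
Restart-Service -Name srmsvc
```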

I'm not sure what causes the issue, but I thought it would be useful to document this in case anyone else runs into the same problem.

AWS CLI writing into Dynamo DB

Just a quick snippet of AWS CLI for putting items into a DynamoDB table.


aws dynamodb put-item --table-name users --item '{
    "userid": {"S": "1234356"},
    "first": {"S": "Mitchy"},
    "lastname": {"S": "Blog"},
    "age": {"N": "999"}
}'


N = Number value

S = String Value

Use conditional Puts to update or replace existing items.

The new line layout above is just to make it easier to read.
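As a sketch of that conditional form (using the same example table and attributes as above), a put that refuses to overwrite an existing item looks like this:

```shell
# Only write the item if no item with this userid already exists; otherwise
# the call fails with a ConditionalCheckFailedException instead of silently
# replacing the existing item.
aws dynamodb put-item --table-name users \
    --item '{"userid": {"S": "1234356"}, "age": {"N": "999"}}' \
    --condition-expression "attribute_not_exists(userid)"
```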

Tuesday, November 05, 2013

Troubleshooting Sysprep Domain Join Issues

I’ve spent the morning trying to figure out why a SYSPREP’d instance was not joining the domain.


Turns out that in the folder C:\Windows\Panther\UnattendGC there is an extremely useful log file called setupact.log.


This log provides a verbose breakdown of what happens during the sysprep process for the machine in question.



To access the log you will need to launch Notepad in an elevated (Administrator) context.


Once you open the log, search for the DJOIN string. From there you’ll see all sorts of juicy details about the domain joining sequence.
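If you'd rather not scroll through the log in Notepad, the same search can be done from an elevated PowerShell prompt:

```powershell
# Pull every domain-join (DJOIN) line out of the sysprep log.
Select-String -Path 'C:\Windows\Panther\UnattendGC\setupact.log' -Pattern 'DJOIN'
```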


Once you find the error codes, Google them and you’ll be up and running in no time.


A couple of example codes and their translations…


Error 1326: Logon failure: unknown user name or bad password.

Typically this error signifies a bad username or password.


Error 1355: The specified domain either does not exist or could not be contacted.

Typically this error signifies network, DNS, or WINS issues.


Error 2202: You specified an invalid user name or group name.

Typically this error signifies a username in an invalid (incorrect) format.

This can happen if you use a third-party utility that creates the unattend.xml file used by sysprep.


Friday, October 25, 2013

Sharing an AMI between 2 AWS accounts

Useful tip for sharing an AMI between two accounts.

1. Capture your AMI in the normal way.

2. Once the AMI has been captured, click on the "Permissions" tab.

3. From there, under "AWS Account Number" click on edit and enter the AWS account number of the account with which you would like to share the AMI.

4. Once you've shared the AMI, open up the other account.

5. Begin the process for launching a new AMI. From the left-hand menu, choose My AMIs and then make sure, under Ownership, you choose "Shared with me".

6. Search for the shared AMI by name or ID and hey presto, the newly shared AMI can now be used within your other account.
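For reference, the same share can also be scripted with the AWS CLI (the AMI ID and account number below are placeholders):

```shell
# Grant launch permission on the AMI to the second AWS account.
aws ec2 modify-image-attribute --image-id ami-12345678 \
    --launch-permission "Add=[{UserId=123456789012}]"
```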

Wednesday, October 23, 2013

Sysprep Time Zone

When configuring sysprep within an AWS AMI (or any template for that matter), don’t forget to set the time zone as needed (it defaults to UTC).


In my case I wanted Australian Eastern Time.


<TimeZone>AUS Eastern Standard Time</TimeZone>
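If you're unsure of the exact name string to use, the built-in tzutil command will list all of the valid time zone IDs, and can also set the zone directly:

```shell
tzutil /l                                 # list all available time zone IDs
tzutil /s "AUS Eastern Standard Time"     # set the time zone directly
```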


A full list of time zones and their corresponding names can be found here:



Thursday, September 26, 2013

Citrix Storefront and HTML 5 Receiver using ALTADDR (not secure!)

This article is just for reference and is not a recommended solution!


I needed to setup a quick and dirty solution to get some colleagues access to an application published using Citrix XenApp for the purpose of demonstrating performance.


The three key components were the XenApp server, the StoreFront server and the HTML 5 Receiver.


I unfortunately did not have the time or resources to set up and configure a NetScaler or Access Gateway (hence the quick and dirty approach).


I’ll stick a disclaimer at the top: “This is in no way secure and I always recommend using a combination of NetScalers/Access Gateways and SSL certificates for any kind of public-facing XenApp solution!”

For the purpose of my “quick and dirty solution”, I’ve restricted access to a specific set of source IP addresses.


This also assumes you have two public IP addresses assigned: one for the StoreFront server and one for the XenApp server.

You’ll also need to configure firewall rules to allow inbound connections to the XenApp NAT using the WebSockets port configured within the XenApp policies (the default is 8008).



So, armed with NAT and a few configuration tweaks, I was able to publish my Citrix Storefront and allow my colleagues to access the published application using the Citrix HTML 5 receiver, here is how.



1.       Enable Alternate Addressing on the StoreFront server…

a.       Browse to the C:\InetPub\wwwroot\citrix\<storefront> folder.

b.      Open the web.config file.

c.       Find the alternateAddress="off" section and change it to alternateAddress="on".

d.      Save and close the file.

e.      From the command prompt, run IISRESET.



2.       On the XenApp server, open up a command prompt:

a.       Run ALTADDR /set nnn.nnn.nnn.nnn (where nnn.nnn.nnn.nnn is the public IP address that NATs to the private IP of the XenApp server).

b.      Reboot the server.

c.       Run ALTADDR /v to confirm the Alternate Address has taken.


3.       Log in to your StoreFront service, click on the application icon and hey presto, the application launches.

Tuesday, September 10, 2013

Preparing XenApp Servers for imaging (AMIs, Provisioning Services etc.)

Open up the XenApp Role Manager from the Start menu.

Choose Edit Configuration and then select the option "Prepare this server for imaging..."

You can remove the current server from the farm by checking the box, or clear the check box to leave the existing server in the farm.

Thursday, September 05, 2013

Enable Access Based Enumeration Server 2012

I have a Windows Server 2012 file server hosting home directories for a large number of users.

Even though I have NTFS access lists preventing users from accessing other users' folders, I don't really want users to even be able to see each other's top-level folders.

Enter Access Based Enumeration. Easy to enable and gives everyone that nice warm fuzzy feeling that everything is even more secure than it was before.

1. Open Server Manager.


2. Click on File and Storage Services.

3. Choose Shares from the left hand menu.

4. Pick a share, right click and choose properties from the context menu.

5. Check the box for "Enable access-based enumeration".

Hey presto, job done.
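The same setting can also be flipped from PowerShell using the SmbShare module (the share name below is just an example):

```powershell
# Enable access-based enumeration on a share, then confirm the setting.
Set-SmbShare -Name 'Home$' -FolderEnumerationMode AccessBased -Force
Get-SmbShare -Name 'Home$' | Select-Object Name, FolderEnumerationMode
```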

Onwards and upwards.....

Tuesday, September 03, 2013

Using AWS Micro Instances to test my RDS Application: Part 1

·         Launch a t1.micro instance using an Ubuntu AMI


·         Connect to the instance via SSH using the associated private key (remembering to prepend “ubuntu@” to the start of the instance IP if you’re using PuTTY or some other terminal tool).


·         Run sudo apt-get update


·         Run sudo apt-get install vnc4server


·         Run vncserver :0


·         Set a password for your VNC connection


·         Edit the xstartup file within the ~/.vnc/ folder; remove the last couple of lines and add in a line for gnome-session &


·         Should look a bit like this…




#!/bin/sh

# Uncomment the following two lines for normal desktop:
# unset SESSION_MANAGER
# exec /etc/X11/xinit/xinitrc

[ -x /etc/vnc/xstartup ] && exec /etc/vnc/xstartup
[ -r $HOME/.Xresources ] && xrdb $HOME/.Xresources
xsetroot -solid grey
vncconfig -iconic &
gnome-session &



·         Next run sudo nano /etc/init.d/vncserver


·         Within nano, paste the following (remember to modify the USER variable to be the name of your current user; note the DISPLAY variable, this is the port number / session ID):


#!/bin/sh -e
### BEGIN INIT INFO
# Provides:          vncserver
# Required-Start:    networking
# Default-Start:     3 4 5
# Default-Stop:      0 6
### END INIT INFO

# The Username:Group that will run VNC
export USER="ubuntu"

# The display that VNC will use (doubles as the port number / session ID)
DISPLAY="1"

# Color depth (between 8 and 32)
DEPTH="16"

# The Desktop geometry to use.
GEOMETRY="1024x768"

# The name that the VNC Desktop will have.
NAME="my-vnc-server"

OPTIONS="-name ${NAME} -depth ${DEPTH} -geometry ${GEOMETRY} :${DISPLAY}"

. /lib/lsb/init-functions

case "$1" in
start)
    log_action_begin_msg "Starting vncserver for user '${USER}' on localhost:${DISPLAY}"
    su ${USER} -c "/usr/bin/vncserver ${OPTIONS}"
    ;;

stop)
    log_action_begin_msg "Stopping vncserver for user '${USER}' on localhost:${DISPLAY}"
    su ${USER} -c "/usr/bin/vncserver -kill :${DISPLAY}"
    ;;

restart)
    $0 stop
    $0 start
    ;;
esac

exit 0


·         Press CTRL + X and save the file


·         Run “sudo update-rc.d vncserver defaults” (note that update-rc.d takes the script name, not the full path)


·         Reboot by running “sudo reboot”


·         Launch VNC Viewer and enter the IP address followed by “:1” or whatever display number you chose.


·         Hit Connect and you should be logged in.


Thursday, August 29, 2013

Pulling Custom Metrics into CloudWatch

Needed to pull some custom metrics from my Windows instances running within AWS and pop them into CloudWatch.


Specifically I needed to get the number of “Active Sessions” from the Terminal Services counter.


To start with, AWS provides some nice little PowerShell scripts that take custom metrics and pop them into CloudWatch for you.


Once the scripts are downloaded, you need to create an IAM user with enough rights for CloudWatch operations (I started with CloudWatch full access, but you could probably be a little more granular).


Once that was done, I extracted the contents of the download into c:\_Scripts\Cloudwatch.


Next step was to add the credentials into the awscreds.conf file.



Now, there are a few scripts available; each one assists in grabbing specific metric sets from the Windows instances.


My specific requirement was to grab the number of active terminal server connections, so I opted to customize the mon-put-metrics-perfmon.ps1 script and include the specific metrics I was after.


Open up the PS1 script within Notepad and jump down to the section which starts “#### Add More Counters here.”
Below this section you can add a list of the counters you wish to pull from perfmon and send up to CloudWatch.


First step was to determine the correct syntax for the metric I was after.


Running the command (Get-Counter -ListSet 'Terminal Services').Paths returns the available metrics and the paths for those metrics.



Now it was time to mod the PowerShell script, so I followed the format of the existing counters and added the following:


$Counters.Add('\\localhost\Terminal Services\Active Sessions','Count')


Which resulted in the file looking a bit like this..



Save the file now. Notice the second parameter, “Count”; this is the unit of data the counter provides. In the case of Active Sessions, it’s simply a count: 1..2..3….10, you get the idea.


All that’s left to do now is schedule the script to run at whatever interval meets your requirements.


In my case I opted for every 5 minutes.


I configured the task to run as SYSTEM and the action looks a bit like this:



The argument is: -command "C:\_Scripts\Cloudwatch\mon-put-metrics-perfmon.ps1 -aws_credential_file C:\_Scripts\Cloudwatch\awscreds.conf"


As you can see, I’ve simply specified PowerShell as the program to run, with the mon-put-metrics-perfmon.ps1 script as the argument (I also specified the credentials file).


Now time to click save and let it run for a few minutes / hours / days…


The result is a nice new metric in CloudWatch – now time to have fun with Auto Scaling.




Configuring a custom Sysprep file for AWS Windows Instances

I needed to convert a Windows Server 2012 instance into an AMI so that it could be deployed as part of an auto-scaling configuration.


My domain already existed and so I simply needed to grab the existing Sysprep2008.xml file from


C:\Program Files\Amazon\Ec2ConfigService\sysprep2008.xml


And add in the following components under the respective sections.


Under the “Generalize” section.


    <component name="Microsoft-Windows-Security-SPP" processorArchitecture="wow64" publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS" xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
        <SkipRearm>1</SkipRearm>
    </component>




Under the “Specialize” section.


    <component name="Microsoft-Windows-UnattendedJoin" processorArchitecture="amd64" publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS" xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
        <Identification>
            <!-- Domain name and join credentials omitted here -->
            <MachineObjectOU>OU=Session Hosts,OU=Servers,DC=MyDomain,DC=local</MachineObjectOU>
        </Identification>
    </component>





I then copied the modified sysprep file back into C:\Program Files\Amazon\Ec2ConfigService\sysprep2008.xml overwriting the original template.


After that, I ran EC2Config as follows:


“Set Computer Name” – I may look into including my own custom naming convention at a later stage, but for the purposes of testing, this ensures my servers all have unique names.



“Shutdown with Sysprep” – Pretty obvious what this does.



Once Sysprep has run and the instance has shut down, you can then create an AMI from the instance and start having fun with auto-scaling.


A couple of gotchas – if you have a password with any kind of special characters within the sysprep file, EC2Config will crap out and report an error parsing EntityName.

This needs a little more investigation, but I changed the password to something a little more straightforward and it worked no problem.


I’ll add a future post about my exploits with auto-scaling.






Tuesday, August 27, 2013

Reckon Accounts Crashing on Server 2012

Came across this one today when launching Reckon Accounts on a Server 2012 VM.

More investigation needed, but to fix in the short term:

·         Disable IE ESC (Internet Explorer Enhanced Security Configuration).

·         Configure Medium security for the Internet zone within Internet Explorer.

·         Close down IE and re-launch Reckon Accounts.


Monday, August 26, 2013

Useful PowerShell Script to Bulk Create Users

Just used this script to generate 4,000 user accounts within a development Active Directory.


The CSV has to be semicolon-delimited, with a header row matching the fields used in the script, for example:

##### CSV FILE ########

firstname;lastname;password
John;Smith;Password123
Jane;Jones;Password456


############### START SCRIPT ################


Import-Module ActiveDirectory

$Users = Import-Csv -Delimiter ";" -Path ".\users.csv"

foreach ($User in $Users)
{
    $OU = "OU=Employees,DC=lab-os,DC=com"
    $Password = $User.password
    $Detailedname = $User.firstname + " " + $User.lastname
    $UserFirstname = $User.firstname
    $FirstLetterFirstname = $UserFirstname.substring(0,1)
    $SAM = $FirstLetterFirstname + $User.lastname
    New-ADUser -Name $Detailedname -SamAccountName $SAM -UserPrincipalName $SAM -DisplayName $Detailedname -GivenName $User.firstname -Surname $User.lastname -AccountPassword (ConvertTo-SecureString $Password -AsPlainText -Force) -Enabled $true -Path $OU
}



############### END SCRIPT ################

Script credit goes to



A little about Me

My name is Mitch Beaumont and I've been a technology professional since 1999. I began my career working as a desk-side support engineer for a medical devices company in a small town in the middle of England (Ashby-de-la-Zouch). I then joined IBM Global Services, where I began specialising in customer projects based on and around Citrix technologies. Following a couple of very enjoyable years with IBM, I relocated to London to work as a system operations engineer for a large law firm, where I was responsible for the day-to-day operations and development of the firm's global Citrix infrastructure. In 2006 I was offered a position in Sydney, Australia. Since then I've had the privilege of working for and with a number of companies in various technology roles, including as a Solutions Architect and technical team leader.