XmlWriter not releasing file with Powershell Script 07 January 2015 at 13:23

I encountered an issue today where a developer had created a Powershell script to write XML from a stored procedure to a file, and the file would not be released until we closed the PowerGUI Script Editor (which is what we used to run the script). Even from the console it would take 10 seconds after the script completed before the file lock was released.

The solution was fairly simple: closing the XmlWriter doesn't automatically flush and release the underlying stream. You have to enable this explicitly on the writer using the CloseOutput property of an XmlWriterSettings object. The full solution is below.

#Setup connection string and sql to run
$server = "magma"
$db = "my_database"
$connectionString = "Data Source=$server; Integrated Security=True; Initial Catalog=$db"

$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$sql = "EXECUTE spTest"

$dateTimeStamp = [System.DateTime]::Now.ToString("yyyyMMddHHmm")
$filePath = "\\mypath\Extract_$dateTimeStamp.xml"

try {
    $command = New-Object System.Data.SqlClient.SqlCommand($sql, $connection)
    $command.CommandTimeout = 0
    $connection.Open()
    $xmlReader = $command.ExecuteXmlReader()

    # CloseOutput tells the writer to close the underlying file stream when the writer is closed
    $xmlWriterSettings = New-Object System.Xml.XmlWriterSettings
    $xmlWriterSettings.CloseOutput = $true
    $xmlWriter = [System.Xml.XmlWriter]::Create($filePath, $xmlWriterSettings)

    # Stream the XML from the reader straight into the file
    $xmlWriter.WriteNode($xmlReader, $true)
}
catch {
    $errorMessage = $_.Exception.Message
    $body = "Extract failed to run for the extract: $errorMessage"
    Send-MailMessage -To "me@test.co.za" -From "do-not-reply@test.co.za" -Subject "Oops" -Body $body -SmtpServer "MYEXCHSERVER"
}
finally {
    # Closing the writer flushes it and releases the file lock
    if ($xmlWriter) { $xmlWriter.Close() }
    if ($xmlReader) { $xmlReader.Close() }
    if ($connection) { $connection.Close() }
}

Write-Output "DONE!"

SQL Server Database Stuck in Restoring 100% State 15 September 2014 at 16:37

I had an issue today where an empty database backup could not be restored for love nor money. The backup was 4MB, but when I attempted to restore it, it would create a 76GB log file on the server and then stay perpetually in a 100% "Restoring" state. This happens because one of the virtual logs in the log file is still marked as in use, and the database is stuck waiting for the log restore to complete.

You have two options here. The first is to keep backing up and shrinking the log until the virtual logs are cleared:

BACKUP LOG [database] TO DISK = 'C:\Temp2.bak'
DBCC SHRINKFILE (N'logfilename' , 10)
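To tell whether the virtual logs have actually cleared, you can inspect them between backup/shrink cycles. A sketch - note that DBCC LOGINFO is an undocumented but widely used command:

```sql
-- Each row in the output is one virtual log file (VLF) in the transaction log.
-- Status = 2 means the VLF is still in use and cannot be shrunk away;
-- Status = 0 means it is free. Repeat the backup/shrink cycle until the
-- in-use VLFs are gone from the end of the file.
USE [database]
DBCC LOGINFO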

Your second option is to switch the recovery model to simple, shrink the file, and then set the recovery model back to full:

USE [database]
ALTER DATABASE [database] SET RECOVERY SIMPLE
DBCC SHRINKFILE([logfilename], 1)
ALTER DATABASE [database] SET RECOVERY FULL

PhotoTool 2.1.0 23 June 2014 at 21:48

One of the pieces of software I wrote (10 years ago now!) that I actually use is PhotoTool. It isn't really anything amazing; it just works for me when I want to upload photos. I updated it last week to use the ImageProcessor library, and put the horrible source code onto GitHub. The code is pretty bad, but hey, it works for me.

Download the latest version from here.

Scheduled photos with a web cam 18 April 2014 at 12:00

I have a spare web cam lying around, and figured it would be a great addition to set it up so it can take regular photos of our lounge when we're away on holiday. I did a quick Google, and managed to get a rudimentary solution running in less than half an hour using CommandCam and Powershell.

The solution was pretty basic: I downloaded the exe and then created a Powershell script that captures an image every 5 minutes. I added an extra command to the script which sends me an email every now and then to remind me that the capture is still running. Basic, but it works! Note that this just runs forever; I'm happy with it working that way and I stop it when I want to, although it could be altered to run as a scheduled task or a Windows service.

cd E:\Cam
$photo_count = 0

while ($true) {
    .\CommandCam.exe /quiet

    $dt = [System.DateTime]::Now.ToString("yyyyMMdd_HHmmss")
    Rename-Item image.bmp "image_$dt.bmp"
    Write-Output "image_$dt.bmp"

    $photo_count = $photo_count + 1
    if ($photo_count -gt 72) {
        $smtp = New-Object Net.Mail.SmtpClient("mysmtp.com", 101)
        $smtp.EnableSsl = $true
        $smtp.Credentials = New-Object System.Net.NetworkCredential("hello@nowhere.com", "password")
        $smtp.Send("fromaddress@somewhere.com", "toaddress@nowhere.com", "Web capture running", "Reminder: web capture is still running on SERVERX")
        Write-Output "Reminder sent"
        $photo_count = 0
    }

    # capture every 5 minutes
    Start-Sleep -s 300
}

With my web cam the resulting images are pretty small too - around 250-300KB each - so I went one step further and changed my target folder to a Dropbox folder. This means my images are automatically uploaded to the web too, so I can admire my empty lounge from my phone in a matter of seconds.

Deleting stubborn windows folders 17 April 2014 at 15:13

Every now and then, even as an administrator on a machine, you hit a folder that you just cannot delete. The usual route is to take ownership of the folder and its child files and folders, but once in a blue moon this doesn't work and you're stuck with a folder you can't delete. Assuming there are no locks on the files, I found today that issuing the following commands from an elevated command prompt worked where the GUI equivalents didn't:

takeown /F C:\Temp /R
cacls C:\Temp /T /e /g Administrators:f
rmdir /S C:\Temp

Git Cheat Sheet 24 February 2014 at 15:58

We're switching to git (on github.com) at the moment, and I really wanted to use it properly instead of just relying on the GUI clients. I decided to learn it using the command-line, and I found this excellent tutorial online which really helped me understand some of the fundamentals: http://www.sbf5.com/~cduan/technical/git/.

This article contains my basic cheat sheet for Git, which more or less follows the general workflow when using a git repo.

git init
    Initialises a new git repository in the current folder.
git clone https://myrepo.com
    Clones and initialises a remote git repository locally in the current folder - adds a remote repository reference named "origin".
git log
    View log changes.
git add .
    Recursively adds all changes to the repository.
git commit --dry-run
    See what changes will be committed before actually running git commit.
git commit -m "My message"
    Commits changes to the repository.
git branch
    Get a list of local branches, with a star next to the current head.
git branch branch-name base
    Creates a new branch based on an existing branch, e.g. git branch test-branch HEAD.
git checkout branch-name
    Switches to a new branch and updates the local folder with the files from that branch.
git fetch origin
    Retrieves remote changes and updates remote heads.
git pull origin
    Pulls all remote changes (origin can be replaced with a URL, for example).
git pull --rebase origin
    Pulls all remote changes but baselines them BEFORE your local changes, so your changes move on top of what everybody else has already contributed.
git push origin HEAD
    Pushes all changes back to the repository origin.
git clone --branch xyz https://github.com/MyOrg/MyRepo.git
    Clones a specific tag from a remote repo.
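The commands above can be strung together into a quick end-to-end practice run using a throwaway local repository. A sketch - the folder name, file name, and branch name are all made up, and the user.name/user.email config lines are only there so the commit works on a fresh machine:

```shell
# create a throwaway repository to experiment in
mkdir -p /tmp/git-demo
cd /tmp/git-demo
git init
git config user.name "Test User"
git config user.email "test@example.com"

# make a change, preview it, then commit it
echo "hello" > readme.txt
git add .
git commit --dry-run
git commit -m "My message"

# branch from the current head and switch to it
git branch test-branch HEAD
git checkout test-branch
git branch
```

The final git branch should list both branches, with a star next to test-branch as the current head.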

Creating tickets in Trac with XmlRpc and C# .NET 19 December 2013 at 15:00

Programmatically creating tickets in a Trac system using the recommended XmlRpc approach is actually fairly straightforward. The documentation and various sources were confusing to me at first, so I thought I would document the steps I went through to set it up. In my case, I had a basic Trac instance running on Windows, with the AccountManagerPlugin enabled.

  1. Go here: http://trac-hacks.org/wiki/XmlRpcPlugin.
  2. From the command-line, run the easy_install option that suits you; I ran: easy_install -Z -U http://trac-hacks.org/svn/xmlrpcplugin/trunk
  3. As per the link above, enable the plugin in trac.ini, and if you're using the AccountManagerPlugin, make sure you follow the instructions.
  4. Fire up Trac - you should now have the XmlRpcPlugin running in Trac.
  5. Go to the Admin section in Trac, and then to Permissions. You will need to add the XML_RPC permission to the user account that requires it - in my case I added it to my own login for testing purposes.
  6. If you browse to your Trac instance /rpc (e.g. http://localhost:8000/rpc) you will see the full XmlRpc API, with all possible methods to implement.
  7. Create a new Visual Studio project, and implement the methods you want. I just implemented two methods, CreateTicket and AddAttachment, as follows (based on the example shown here):
    using System;
    using System.IO;
    using CookComputing.XmlRpc;

    public struct TicketAttributes
    {
        public string comment;
    }

    [XmlRpcUrl("http://localhost:8000/login/rpc")]
    public interface TracXmlRpcProxy : IXmlRpcProxy
    {
        [XmlRpcMethod("ticket.create")]
        int CreateTicket(string summary, string description, TicketAttributes attributes, bool notify, DateTime when);

        [XmlRpcMethod("ticket.putAttachment")]
        string AddAttachment(int ticketId, string filename, string description, byte[] data, bool replace = true);
    }

    class Program
    {
        static void Main(string[] args)
        {
            // Fill these in appropriately
            string user = "matt";
            string password = "password!";

            // Create an instance of the Trac interface
            TracXmlRpcProxy proxy = XmlRpcProxyGen.Create<TracXmlRpcProxy>();

            // If desired, point this to your URL. If you do not do this,
            // it will use the one specified in the service declaration.
            // proxy.Url = "https://trac-rules.org/xmlrpc";

            // Attach your credentials
            proxy.Credentials = new System.Net.NetworkCredential(user, password);

            TicketAttributes attr = new TicketAttributes();
            attr.comment = "This is the comment that goes with the new ticket";
            int ticketId = proxy.CreateTicket("Xml Rpc Test", "This is an XML RPC Test", attr, false, DateTime.Now);

            byte[] fileData = File.ReadAllBytes(@"C:\Temp\test.jpg");
            proxy.KeepAlive = false;
            proxy.AddAttachment(ticketId, "test.jpg", "Test description", fileData, true);

            Console.WriteLine("Ticket ID: " + ticketId);
        }
    }

    One thing to note with the above is proxy.KeepAlive = false: this HAS to be set before sending the attachment, otherwise you get the following error: "The server committed a protocol violation. Section=ResponseStatusLine".

Jenkins Active Directory Authentication with CCTray 01 November 2013 at 15:29

Our Jenkins build server is only accessed internally, so there was never any real reason to put authentication on it. However, I recently added QA Deployments to the build server and we'd like to implement production releases too, so I figured now was a good time to add Active Directory authentication.

Adding authentication was easy: add the Active Directory plugin, activate it under Global Security, and add the users who you want to be able to log in. That took about 2 minutes. The problem for me was that we use CCTray for local build monitoring, and CCTray was no longer working.

I found a solution fairly quickly, but its implementation wasn't completely straightforward, so I thought I would document it here. This assumes that you have CCTray 1.8.0 or above installed.

  1. Visit https://github.com/csnate/cctray-jenkins-transport and download the zip file. Build the project (I built in Release mode), and then copy all the compiled files from the bin folder into an "extensions" folder.
  2. Close down CCTray, and copy the "extensions" folder into your local CCTray folder (usually located at C:\Program Files (x86)\CCTray)
  3. Run CCTray
  4. Open the Settings... view of CCTray, and add a new server as follows:
    • Click "Using a transport extension"
    • Select "JenkinsTransport" from the drop-down box
    • Click Configure Extension
    • Server: Your server - e.g. http://myserver:8080
    • Username: Your current network user name
    • Password: Your current network password

Powershell : Recursively deleting folders 25 October 2013 at 14:55

I haven't posted for ages as I am not doing any coding at the moment, but this little snippet that I used to know by heart completely eluded me today. I thought I'd post it here for when that happens again: the following will delete ALL .svn folders in C:\Temp, recursively.
gci C:\Temp -include .svn -Recurse -Force | Remove-Item -Recurse -Force

Cryptext version 1.0.0 06 August 2013 at 19:54

I finally got around to adding Cryptext to my site. This is a simple, single-file utility that I use to encrypt my personal data. The software uses a fairly strong encryption algorithm, with a key of your choosing per file. I'm no security expert, but I think its implementation is pretty secure. Please feel free to use and abuse.