Using Powershell with REST 11 October 2016 at 06:55

I was doing some Web API testing for a client today, with a fairly complex scenario that required multiple permutations of data. I had requested each possible combination, which was provided to me as a pile of JSON files compliant with a REST service on our side. The problem now, though, was how to automate sending all of these JSON files to the service.

As usual, I turned to PowerShell, and was pleasantly surprised to find that there is an existing "Invoke-RestMethod" cmdlet, making this a piece of cake! The only minor complications were that our service used a combination of IP address validation and Basic Auth, and that we forced SSL usage - but these turned out to be no problem from a code perspective.

cls
$dir = "C:\temp\20161010\MyJsonFiles"
$files = [System.IO.Directory]::EnumerateFiles($dir, "*.txt")
$url = "https://localhost:44300/api/client/endpoint"

# ignore certificates when debugging on localhost only - don't use this on live!!!
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

# set up user for basic auth
$user = 'test'
$pass = 'password'
$pair = "$($user):$($pass)"
$encodedCreds = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($pair))
$basicAuthValue = "Basic $encodedCreds"
$headers = @{
    Authorization = $basicAuthValue
}

# post the contents of each file to the endpoint as JSON
foreach ($f in $files)
{
    $body = [System.IO.File]::ReadAllText($f)
    Invoke-RestMethod -Uri $url -Body $body -Method Post -Headers $headers -ContentType "application/json"
}

Export MongoDb query to CSV 16 August 2016 at 17:16

Flattening out MongoDb documents can be a bit of a pain. I use RoboMongo to inspect database documents, and I may be stupid, but although I can query individual fields, the GUI still presents them in tree format, so I don't get a nice "spreadsheet" view that I can export and send to users when they request data.

However, MongoDb comes with an export utility, mongoexport, that can be used to easily extract documents (at a field level) to a CSV file. In my example below, we have a complex Workflow document where the required fields are many nodes down the tree. These are easily extracted as follows:

mongoexport --db Workflows --collection Workflows --csv -f "Workflow.CreatedOn,Workflow.OrderInfo.Amount,Workflow.OrderInfo.ReferenceNumber" -o out.csv

This will extract the three fields into three columns, with headings. You can even filter the results by passing a query with the "-q" parameter.
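
For example, to only export workflows above a certain amount, you could pass something along these lines (the query document here is purely illustrative - adjust the field and value to your own schema, and tweak the quoting for your shell and mongoexport version):

mongoexport --db Workflows --collection Workflows --csv -f "Workflow.CreatedOn,Workflow.OrderInfo.Amount,Workflow.OrderInfo.ReferenceNumber" -q "{ 'Workflow.OrderInfo.Amount': { '$gt': 1000 } }" -o filtered.csv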

Openfiles utility 27 June 2016 at 11:43

I have no idea how I didn't know this existed - it's been around for ages. Openfiles.exe has existed on Windows machines since Windows XP, and finding out who has files open on a file server is something I've struggled with for years!

Openfiles.exe /query /s file-server > openfiles.txt

This will export a list of all open files, by user, on that particular server, as per below:
ID       Accessed By          Type       Open File (Path\executable)
======== ==================== ========== =====================================
31541186 eric                 Windows    F:\..\Example\June
42278615 tom                  Windows    E:\..\Example2\doc
46977638 john                 Windows    E:\..\Hello
30870272 jane                 Windows    E:\..\Place\file.docx

SQL Server : Scalar Function Affecting Performance 22 January 2016 at 12:20

I was looking into a poor performance issue on a data warehouse report today, and noticed this line as one of the columns being returned in the SELECT statement:

schema.MyFunction(table.DateColumn) 

The query as a whole has around 7 joins and returns 5,500 records. This is a fairly small data set, but without this line the query was running in under 2 seconds - with the scalar function being called, it was taking 14 - 20 seconds. However, when I looked into the function, it really wasn't doing very much - it wasn't very well written, but it didn't have any hidden SELECT statements or anything that would cause an obvious performance issue. All it was doing was formatting the date being passed to it.

The function itself wasn't the problem - it was the repeated, row-by-row invocation of the scalar function that was the issue. Basically, this speaks to the way SQL Server optimises queries: it simply cannot optimise a scalar function as part of its query plan, so it executes it 5,500 times - once for every row.
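
For context, the original scalar function looked something like this (a hypothetical reconstruction - all the real one did was format the date passed to it):

CREATE FUNCTION [cadi].[MyFunction] (@Date as smalldatetime)
RETURNS varchar(8)
AS
BEGIN
       -- format the date as "mon yyyy", e.g. "Jan 2016"
       RETURN right(convert(varchar, @Date, 106), 8)
END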

So what's the solution? If the function encapsulates logic that you don't want spread across reports, you can't remove the function. Your best option is to use a TABLE valued function instead of a scalar. Firstly, you need to rewrite your function as follows:

CREATE FUNCTION [cadi].[MyFunction] (@Date as smalldatetime)
RETURNS TABLE WITH SCHEMABINDING
AS
RETURN (
       select right(convert(varchar, @Date, 106), 8) AS MyDate
)

Then, you need to remove the function from the SELECT portion of your query and CROSS APPLY it instead - as follows:

SELECT MyTableResult.MyDate,
t.Detail
FROM dbo.MainTable t
INNER JOIN dbo.OtherTable o ON t.Id = o.ForeignKeyId
CROSS APPLY cadi.MyFunction(t.MyDateColumn) AS MyTableResult
WHERE t.Status = 1

In my particular scenario, applying the TABLE function as opposed to the SCALAR one brought the query back down to 2 seconds, i.e. no appreciable performance hit at all.

MediaWiki on IIS working with no styles 04 January 2016 at 19:58

I decided to install MediaWiki today on one of our web servers, and set it up within IIS. However, although the site immediately served pages, all of them were displaying without any styles. The stylesheet request showed a 200 response in Chrome, so I didn't think there was anything wrong there and resorted to Google. However, I couldn't find anything on Google with an answer, so after an hour of trial and error I eventually ended up back in Chrome looking at the request.

It turns out the 200 response was misleading - when I looked at the contents of the response itself, there was actually an error: C:\Windows\TEMP not writable. The LESS engine being used was trying to dynamically compile the stylesheet, and was failing because the IIS user under which the site was running didn't have write access to that folder. I added write permissions for that user, and boom, it started working.
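
If you hit the same issue, granting the permission from an elevated command prompt looks something like this (the app pool name "MediaWiki" is just an assumption - substitute whatever identity your site actually runs under):

REM grant the site's app pool identity modify rights on the temp folder (hypothetical pool name)
icacls C:\Windows\TEMP /grant "IIS AppPool\MediaWiki:(OI)(CI)M"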

IIS7 : Add folders for specific users with Basic Authentication 05 May 2015 at 17:02

I needed to set up a web site that has a basic page for browsing as the root, but within that site there should also be download folders secured for different users. The folders don't require a web page, just browsable content. The issue I had was that there is a lot of misleading (and difficult to understand) documentation online, and I kept getting it "working" only to realise that ALL users had access to ALL folders, or none at all. After some playing around, I found that for a basic site it's actually fairly easy to configure, so I thought I'd document it here, as it's useful as a basic file sharing mechanism. Note that you will need to ensure your site uses HTTPS rather than HTTP, otherwise the password is sent in plain text across the network/internet. Basic Auth isn't the most secure mechanism, but it's better than nothing at all.

Steps for configuration in IIS7:

  1. Create your web site in IIS with a basic index.htm file as per any other site. Make sure Anonymous Authentication is Enabled at this level.
  2. User permissions are controlled using standard user accounts. Go to manage your computer and add a new local user (see the example commands after this list).
  3. In Windows, create the folder in your web site directory.
  4. In IIS Manager, click on the folder, go to Authentication and make sure Anonymous Authentication is Disabled, and Basic Authentication is Enabled.
  5. On the folder, go to Directory Browsing and make sure it is Enabled.
  6. In Windows Explorer, go to the folder, and update the permissions so that the user you created has Read access to the folder. Make sure you remove any other accounts that are not relevant - for example the Users group on the local machine should NOT have access to the folder. Finally, you also need to ensure that the user of the Application Pool under which the web site runs has access to the folder. For example, if your App Pool runs under the Network Service account, this account must have access to the folder in question.
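
As a rough example, creating the user and granting it read access to the folder can be done from an elevated command prompt - the user name, password and folder path below are purely illustrative:

REM create the local account that will be allowed to download from the folder (hypothetical name and password)
net user downloaduser SomeStr0ngPassw0rd! /add

REM grant that user (and only that user) read access to the secured folder (hypothetical path)
icacls "C:\inetpub\mysite\secure-downloads" /grant "downloaduser:(OI)(CI)R"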

And there you go, that's it - you should be able to access your new folder and get a challenge response that will only work for your new user!

Upgrading SSL configuration on your web server 05 May 2015 at 11:33

I ran SSL tests on one of our web servers recently and found it was horribly out of date, receiving an "F" from the Qualys SSL Labs SSL Server Test. Boooo. The problem was, in the past this was a pain in the butt, as you had to pile into the registry to try and disable the old SSL protocol support, along with various other manual steps.

Thanks to the awesomeness of Google, however, I found this little gem: https://www.hass.de/content/setup-your-iis-ssl-perfect-forward-secrecy-and-tls-12. Run the PowerShell script, test, cheer. A great big thanks to the author for that script - it worked perfectly, and our web site now gets a nice green A rating.

XmlWriter not releasing file with Powershell Script 07 January 2015 at 13:23

I encountered an issue today where a developer had created a PowerShell script to write XML from a stored procedure to a file, and the file would not be released until we closed the PowerGUI Script Editor (which is what we used to run the script). Even from the console, it would take 10 seconds after the script completed before the file lock was released.

The solution was fairly simple - closing the XmlWriter doesn't automatically flush and release the underlying stream; you have to explicitly enable this on the writer by setting CloseOutput on an XmlWriterSettings object. Full solution below.

#Setup connection string and sql to run
$server = "magma"
$db = "my_database"
$connectionString = "Data Source=$server; Integrated Security=True; Initial Catalog=$db"

$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$command = "EXECUTE spTest";

$dateTimeStamp = [System.DateTime]::Now.ToString("yyyyMMddHHmm")
$filePath = "\\mypath\Extract_$dateTimeStamp.xml"       

try
{
    $connection.Open()
    $command = New-Object System.Data.SqlClient.SqlCommand($command, $connection)
    $command.CommandTimeout = 0
	
    $xmlReader = $command.ExecuteXmlReader()

    $xmlWriterSettings = New-Object System.Xml.XmlWriterSettings
    $xmlWriterSettings.CloseOutput = $true
    $xmlWriter =  ([System.Xml.XmlWriter]::Create($filePath, $xmlWriterSettings))
    
    $xmlWriter.WriteNode($xmlReader,$true)    
    $xmlWriter.Flush()
}
catch {
    $errorMessage = $_.Exception.Message
    $body = "Extract failed to run for the extract: $errorMessage"
    Send-MailMessage -To "me@test.co.za" -From "do-not-reply@test.co.za" -Subject "Oops" -Body $body -SmtpServer "MYEXCHSERVER"
}
finally
{
    $xmlWriter.Close()
    $xmlReader.Close()
    $connection.Close()
    
    $xmlWriter.Dispose()
    $xmlReader.Dispose()
    $connection.Dispose()

    $xmlWriter = $null
    $xmlReader = $null
    $connection = $null
}

Write-Output "DONE!"

SQL Server Database Stuck in Restoring 100% State 15 September 2014 at 16:37

I had an issue today where an empty database backup could not be restored for love nor money. The backup was 4MB, but when I attempted to restore it, it would create a 76GB log file on the server and then stay perpetually in a 100% "Restoring" state. This happens because one of the virtual logs in the log file is still marked as used, and the database is stuck waiting for the log restore to complete.
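
One way to confirm this is to look at the virtual log files on the source database before taking the backup - any row returned with a Status of 2 is still in use:

-- each row is a virtual log file; Status = 2 means it is still active
DBCC LOGINFO('database')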

You have two options here. Firstly, keep backing up the log and shrinking the log until the virtual logs are cleared up:

BACKUP LOG [database] TO DISK = 'C:\Temp2.bak'
GO
DBCC SHRINKFILE (N'logfilename' , 10)
GO

Your second option is to set the recovery mode to simple, shrink the file, and then set the recovery mode back to full:

use [database]
ALTER DATABASE [database] SET RECOVERY SIMPLE WITH NO_WAIT
DBCC SHRINKFILE([logfilename], 1)
ALTER DATABASE [database] SET RECOVERY FULL WITH NO_WAIT

PhotoTool 2.1.0 23 June 2014 at 21:48

One of the pieces of software I wrote (10 years ago now!) that I actually use is PhotoTool. This isn't really anything amazing, it just works for me when I want to upload photos. I updated it last week to use the ImageProcessor library, and put the horrible source code onto GitHub. The code is pretty bad, but hey, it works for me.

Download the latest version from here.