Load test with Fiddler
Created 20 September 2017 16:25

A client reported a rather depressing problem with one of our API endpoints today - it worked fine with single posts, but multiple concurrent posts resulted in random errors.

I had the means to test the API with Fiddler, but that didn't help me much as it only sends one request at a time, so I couldn't replicate the issue. I have used West Wind Web Surge in the past for personal work, but it requires a license for commercial use and I couldn't get it to play nicely with a self-signed test certificate, so I started Googling for an alternative.

It turns out you can do it easily with Fiddler. Execute the request that you want to stress test once, and then select it in the session list once the response is received. Hit Shift-R, and a pop-up will appear asking how many simultaneous requests you want to send, defaulting to 5. Click OK, and Fiddler will fire off that many concurrent requests!

Web Site and Page Essentials
Created 24 August 2017 20:03, updated 25 August 2017 16:05

Creating a web site is simple, but there are a lot of basic items that are easy to forget and will trip you up over the lifetime of the site. These lists should be revisited regularly.

For your site as a whole

  1. Are JavaScript and CSS resources bundled and minified?
  2. Work through Troy Hunt's "Hack Yourself First" guide: http://www.troyhunt.com/2013/05/hack-yourself-first-how-to-go-on.html
  3. While sitemaps are not as important as they used to be, if you have a complex site hierarchy or some pages are hidden behind posted forms, it may help your SEO to include a sitemap (sitemap.xml)
  4. Set the X-Frame-Options header (DENY or SAMEORIGIN) to protect against clickjacking - a quick PowerShell check is sketched after this list
  5. If possible, serve everything over HTTPS
  6. Ensure all passwords are hashed, and allow users to reset passwords by sending them an email
  7. Errors should be caught at a global level - don't reveal sensitive information when an error occurs
  8. Create an appropriate robots.txt
  9. Scan your site using http://www.webpagetest.org
  10. Run your site through https://asafaweb.com (particularly if your site is ASP.NET)
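
For item 4, a quick way to check what your site actually returns is to inspect the response headers from PowerShell (the URL below is a placeholder for your own site):

# fetch the home page and check that the X-Frame-Options header is present
$response = Invoke-WebRequest -Uri 'https://www.example.com' -UseBasicParsing
$response.Headers['X-Frame-Options']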

For every page

  1. Version numbers on CSS and JS include files - this will prevent new deploys from going wrong because users have cached copies of old files on their machines e.g. Myfile.js?version=1.0.4
  2. SQL Injection attacks - have you parameterised all your queries? Dapper, maybe?
  3. CSRF/XSRF Vulnerabilities - every page on your site (apart from the login screen) should compare the value of a form field AND a cookie to ensure they contain the same cryptographically secure random value, and reject the request otherwise. For ASP.NET users, use the anti-forgery token.
  4. Don't send any sensitive information over email e.g. passwords
  5. Can the parameters be tampered with to reveal data the user should not be able to get to? (a quick way to test this is sketched after this list)
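
For item 5, a crude but useful test is to replay a request as a low-privileged user with an identifier that belongs to someone else, and confirm the site refuses it. A minimal PowerShell sketch (the URL, id and credentials are placeholders):

# authenticate as a low-privileged test user, then request a resource belonging
# to another user - a 401/403 response is the outcome you want
$headers = @{ Authorization = 'Basic dGVzdDpwYXNzd29yZA==' }   # "test:password", placeholder credentials
try {
    Invoke-WebRequest -Uri 'https://www.example.com/api/orders/12345' -Headers $headers -UseBasicParsing
    Write-Warning 'Request succeeded - check whether this user should be able to see this data!'
}
catch {
    # a 401 or 403 here is the behaviour you want
    $_.Exception.Response.StatusCode
}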

Useful links

  • https://www.troyhunt.com/hack-yourself-first-how-to-go-on/
Extract SSH Key from Filezilla Client
Created 10 May 2017 15:52

I always forget this, so I'm documenting it here for myself. If you make SFTP connections from automated scripts or within a program, you really want to be supplying the expected SSH host key so you can verify the server you are connecting to.

On Windows, this can be extracted by doing the following:

  1. Open registry editor (regedit.exe)
  2. Go to HKEY_CURRENT_USER\Software\SimonTatham\PuTTY\SshHostKeys
  3. All the keys are stored, with their types, on the right-hand side under the SshHostKeys key. The name of each entry contains the host (these entries can also be listed from PowerShell - see the sketch after these steps).
  4. Delete the entry for the host you're looking for, and reconnect using Filezilla. It will prompt you with an unknown host key challenge showing the key and value for future use.
  5. You may want to confirm this key with the owner of the source system!
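
A minimal PowerShell sketch of steps 2 to 4 (the host name in the entry below is a placeholder - copy the real entry name from the list):

$path = 'HKCU:\Software\SimonTatham\PuTTY\SshHostKeys'

# list the names of all cached host key entries (format is keytype@port:host)
(Get-Item $path).Property

# show the stored value for one entry
Get-ItemProperty $path -Name 'rsa2@22:myserver.example.com'

# delete it so the client prompts for the key again on the next connection
Remove-ItemProperty $path -Name 'rsa2@22:myserver.example.com'
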
Using Powershell with REST
Created 11 October 2016 06:55

I was doing some Web API testing for a client today, with a fairly complex scenario that required multiple permutations of data. I had requested each possible combination, which was provided to me as a pile of JSON files that were compliant with a REST service on our side. The problem now, though, was how do I automate sending these JSON files?

As usual, I turned to Powershell, and was pleasantly surprised to see that there is an existing "Invoke-RestMethod" cmdlet already, making this a piece of cake! The only minor complications were that our service used a combination of IP address validation and Basic Auth, and that we forced SSL usage - but these turned out to be no problem from a code perspective.

cls
$dir = "C:\temp\20161010\MyJsonFiles"
$files = [System.IO.Directory]::EnumerateFiles($dir, "*.txt")
$url = "https://localhost:44300/api/client/endpoint"

# ignore certificates when debugging on localhost only - don't use this on live!!!
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

# set up user for basic auth
$user = 'test'
$pass = 'password'
$pair = "$($user):$($pass)"
$encodedCreds = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($pair))
$basicAuthValue = "Basic $encodedCreds"
$headers = @{
    Authorization = $basicAuthValue
}

foreach ($f in $files)
{
	$body = [System.IO.File]::ReadAllText($f)
	Invoke-RestMethod -Uri $url -Body $body -Method Post -Headers $headers -ContentType "application/json"
}
Export MongoDb query to CSV
Created 16 August 2016 17:16

Flattening out MongoDb documents can be a bit of a pain. I use RoboMongo to inspect database documents, and I may be stupid, but although I can query individual fields, the GUI will still present them in tree format, so I don't get a nice "spreadsheet" view that I can export and send to users when they request data.

However, MongoDb comes with an export utility, mongoexport, that can be used to easily extract documents (at a field level) to a CSV file. In my example below, we have a complex Workflow document where the required fields are many nodes down the tree. This is easily extracted as follows:

mongoexport --db Workflows --collection Workflows --csv -f "Workflow.CreatedOn,Workflow.OrderInfo.Amount,Workflow.OrderInfo.ReferenceNumber" -o out.csv

This will extract three fields into three columns, with headings. You can even filter the results by passing a query with the "-q" parameter.
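
For example, something along these lines (the reference number is a placeholder, and you may need to adjust the quoting for your shell):

mongoexport --db Workflows --collection Workflows --csv -f "Workflow.CreatedOn,Workflow.OrderInfo.Amount,Workflow.OrderInfo.ReferenceNumber" -q "{ \"Workflow.OrderInfo.ReferenceNumber\": \"ABC123\" }" -o out.csv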

Openfiles utility
Created 27 June 2016 11:43, updated 18 June 2017 10:02

I have no idea how I didn't know this existed - it's been around for ages. Openfiles.exe has shipped with Windows since Windows XP, and finding out who has files open on a file server is something I've struggled with for years!

Openfiles.exe /query /s file-server > openfiles.txt

This will export a list of all open files, and who has them open, on that particular server, as per below:
ID       Accessed By          Type       Open File (Path\executable)
======== ==================== ========== =====================================
31541186 eric                 Windows    F:\..\Example\June
42278615 tom                  Windows    E:\..\Example2\doc
46977638 john                 Windows    E:\..\Hello
30870272 jane                 Windows    E:\..\Place\file.docx

If you would like to "kill" one of these locks, for example the lock Tom has above, you can just disconnect it using the same utility, supplying the ID as follows:

openfiles.exe /disconnect /id 42278615 

SQL Server : Scalar Function Affecting Performance
Created 22 January 2016 12:20, updated 18 June 2017 13:49

I was looking into a poor performance issue on a data warehouse report today, and noticed this line as one of the columns being returned in the SELECT statement:

schema.MyFunction(table.DateColumn) 

The query as a whole has around 7 joins and returns 5500 records. This is a fairly small data set, but without this line the query was running in under 2 seconds - with the scalar function being called it was taking 14-20 seconds. However, when I looked into the function it really wasn't doing very much - it wasn't very well written, but it didn't have any hidden SELECT statements or anything that would cause an obvious performance issue. All it was doing was formatting the date being passed to it.
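
For illustration, the scalar version looked roughly like this (a sketch rather than the client's exact code):

CREATE FUNCTION [dbo].[MyFunction] (@Date as smalldatetime)
RETURNS varchar(8)
AS
BEGIN
       -- format the date, e.g. 'Jan 2016'
       RETURN right(convert(varchar, @Date, 106), 8)
END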

The function itself wasn't the problem - it was the repeated, row-by-row execution of the scalar function that was the issue. Basically, this speaks to the way SQL Server optimises - it cannot inline the scalar function into its query plan, and so executes it 5500 times - once for every row.

So what's the solution? If the function encapsulates logic that you don't want spread across reports, you can't remove the function. Your best option is to use an inline TABLE valued function instead of a scalar one. Firstly, you need to rewrite your function as follows:

CREATE FUNCTION [dbo].[MyFunction] (@Date as smalldatetime)
RETURNS TABLE WITH SCHEMABINDING
AS
RETURN (
       select right(convert(varchar, @Date, 106), 8) AS MyDate
)

Then, you need to remove the function from the SELECT portion of your query and CROSS APPLY it instead - as follows:

SELECT MyTableResult.MyDate,
t.Detail
FROM dbo.MainTable t
INNER JOIN dbo.OtherTable o ON t.Id = o.ForeignKeyId
CROSS APPLY dbo.MyFunction(t.MyDateColumn) AS MyTableResult
WHERE t.Status = 1

In my particular scenario, applying the TABLE valued function as opposed to the SCALAR one brought the query back down to around 2 seconds, i.e. no appreciable hit on performance at all.

MediaWiki on IIS working with no styles
Created 04 January 2016 19:58

I decided to install MediaWiki today on one of our web servers, and set it up within IIS. However, although the site immediately served pages, all of them were displaying without any styles. The stylesheet request showed a 200 response in Chrome, so I didn't think there was anything wrong there and resorted to Google. I couldn't find an answer there either, so after an hour of trial and error I eventually ended up back in Chrome looking at the request.

It turns out the 200 response was misleading - when I looked at the contents of the response itself, there was actually an error in it: C:\Windows\TEMP not writable. The LESS engine being used was trying to dynamically compile the stylesheet and failing because the IIS user under which the site was running didn't have write access to that folder. I granted the user write permissions, and boom, it started working.
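
If you need to do the same, granting the permission from an elevated command prompt looks something like this (the app pool name is a placeholder for whatever identity your site actually runs under):

icacls C:\Windows\TEMP /grant "IIS AppPool\MediaWiki:(OI)(CI)M"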

IIS7 : Add folders for specific users with Basic Authentication
Created 05 May 2015 17:02

I needed to set up a web site that has a basic page for browsing as the root, but within that site there should also be download folders secured for different users. The folders don't require a web page, just browsable content. The issue I had was that there is a lot of misleading (and difficult to understand) documentation online, and I kept getting it to work and realising ALL users had access to ALL folders, or none. After some playing around, I found that for a basic site it's actually fairly easy to configure but thought I'd document this as it's useful as a basic file sharing mechanism. Note that you will need to ensure your site uses HTTPS rather than HTTP otherwise the password is sent in plain text across the network/internet/etc. Basic Auth isn't the most secure mechanism, but it's better than nothing at all.

Steps for configuration in IIS7:

  1. Create your web site in IIS with a basic index.htm file as per any other site. Make sure Anonymous Authentication is Enabled at this level.
  2. User permissions are controlled using standard user accounts. Go to Computer Management and add a new local user.
  3. In Windows, create the folder in your web site directory.
  4. In IIS Manager, click on the folder, go to Authentication and make sure Anonymous Authentication is Disabled, and Basic Authentication is Enabled (steps 4 to 6 can also be scripted - see the sketch after this list).
  5. On the folder, go to Directory Browsing and make sure it is Enabled.
  6. In Windows Explorer, go to the folder, and update the permissions so that the user you created has Read access to the folder. Make sure you remove any other accounts that are not relevant - for example the Users group on the local machine should NOT have access to the folder. Finally, you also need to ensure that the user of the Application Pool under which the web site runs has access to the folder. For example, if your App Pool runs under the Network Service account, this account must have access to the folder in question.
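
If you prefer to script steps 4 to 6, something along these lines should do it ("Default Web Site/Downloads", the folder path and the user name are placeholders for your own site, folder and account):

%windir%\system32\inetsrv\appcmd set config "Default Web Site/Downloads" -section:system.webServer/security/authentication/anonymousAuthentication /enabled:"False" /commit:apphost
%windir%\system32\inetsrv\appcmd set config "Default Web Site/Downloads" -section:system.webServer/security/authentication/basicAuthentication /enabled:"True" /commit:apphost
%windir%\system32\inetsrv\appcmd set config "Default Web Site/Downloads" -section:system.webServer/directoryBrowse /enabled:"True" /commit:apphost
icacls "C:\inetpub\wwwroot\MySite\Downloads" /grant "downloaduser:(OI)(CI)R"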

And there you go, that's it - you should be able to access your new folder and get an authentication challenge that will only work for your new user!

Upgrading SSL configuration on your web server
Created 05 May 2015 11:33

I ran SSL tests on one of our web servers recently and found it was horribly out of date, receiving an "F" from the Qualys SSL Labs SSL Server Test. Boooo. The problem was, in the past this was a pain in the butt, as you had to pile into the registry to disable the old SSL protocols, among various other manual steps.

Thanks to the awesomeness of Google, however, I found this little gem: https://www.hass.de/content/setup-your-iis-ssl-perfect-forward-secrecy-and-tls-12. Run the Powershell script, test, cheer. A great big thanks to the author for that script - it worked perfectly, and our web site now gets a nice green A rating.
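
For the curious, the script essentially automates SCHANNEL registry changes along these lines (a minimal Powershell sketch, not a replacement for the full script, and a reboot is needed before the changes take effect):

$base = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols'

# disable SSL 3.0 on the server side
New-Item "$base\SSL 3.0\Server" -Force | Out-Null
New-ItemProperty "$base\SSL 3.0\Server" -Name 'Enabled' -Value 0 -PropertyType DWord -Force | Out-Null

# explicitly enable TLS 1.2 on the server side
New-Item "$base\TLS 1.2\Server" -Force | Out-Null
New-ItemProperty "$base\TLS 1.2\Server" -Name 'Enabled' -Value 1 -PropertyType DWord -Force | Out-Null
New-ItemProperty "$base\TLS 1.2\Server" -Name 'DisabledByDefault' -Value 0 -PropertyType DWord -Force | Out-Null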