Resolve Mixed Content Warnings with WordPress and the Enfold Theme

Created 03 May 2018 11:20

I needed to move a WordPress site to HTTPS. The certificate was installed, and the site was being correctly served over HTTPS, but I was getting mixed content warnings, as many of the images were still being loaded over HTTP. There are many sites that give the full rundown, but none of them worked for me with the Enfold theme.

This site: - can be used to set up HTTPS from start to finish

This site: - led me to the solution

Once the site was set up for HTTPS, I had to run the following four queries against the WordPress database:

UPDATE wp_posts 
SET    post_content = ( Replace (post_content, 'src="http://', 'src="//') )
WHERE  Instr(post_content, 'jpeg') > 0 
    OR Instr(post_content, 'jpg') > 0 
    OR Instr(post_content, 'gif') > 0 
    OR Instr(post_content, 'png') > 0;

UPDATE wp_posts
SET   post_content = ( Replace (post_content, "src='http://", "src='//") )
WHERE  Instr(post_content, 'jpeg') > 0 
    OR Instr(post_content, 'jpg') > 0 
    OR Instr(post_content, 'gif') > 0
    OR Instr(post_content, 'png') > 0;

UPDATE wp_postmeta
SET    meta_value = ( Replace (meta_value, 'src="http://', 'src="//') )
WHERE  Instr(meta_value, 'jpeg') > 0 
    OR Instr(meta_value, 'jpg') > 0 
    OR Instr(meta_value, 'gif') > 0 
    OR Instr(meta_value, 'png') > 0;

UPDATE wp_postmeta
SET    meta_value = ( Replace (meta_value, "src='http://", "src='//") )
WHERE  Instr(meta_value, 'jpeg') > 0 
    OR Instr(meta_value, 'jpg') > 0 
    OR Instr(meta_value, 'gif') > 0 
    OR Instr(meta_value, 'png') > 0;
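The four statements above all perform the same substitution - rewriting absolute http:// image sources as protocol-relative URLs, so the browser loads them over whatever scheme the page itself was served over. A quick Python sketch of the transformation (the sample markup is made up):

```python
def make_protocol_relative(html: str) -> str:
    """Apply the same replacement as the wp_posts/wp_postmeta queries:
    both double- and single-quoted src attributes are handled."""
    return (html
            .replace('src="http://', 'src="//')
            .replace("src='http://", "src='//"))

before = '<img src="http://example.com/images/logo.png" alt="logo">'
print(make_protocol_relative(before))
# -> <img src="//example.com/images/logo.png" alt="logo">
```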

Installing Windows 10 Pro over Home Edition

Created 26 February 2018 08:15, updated 26 February 2018 08:18

I purchased a new laptop through work today, which came with a pre-installed version of Windows Home Edition. I need Professional Edition for work, and so bought a copy of Windows Pro at the same time. Because I'm iffy about setup, I elected to install the new OS myself...what a mistake. No matter what I did, the Windows installer kept pulling the key from the BIOS, and so kept activating as Windows Home.

Things I tried that failed:

  1. Installing over Windows Home
  2. Deleting all partitions, re-partitioning, formatting all drives, and installing from scratch
  3. Swearing loudly, and reinstalling

Things I tried that worked:

  1. Download the Windows 10 ISO from Microsoft - click "Download tool now" and create USB installation media (alternatively, you can download the ISO and copy it to the drive)
  2. I'm not 100% sure this step is necessary, but I did it anyway - create a file called "PID.txt" inside the "sources" folder on your USB stick, with the following contents (where the XXXXX string is your valid Windows key)
  3. In the sources folder again, create a text file called ei.cfg and set the contents as follows:
    In the articles below, it was mentioned that this file should be placed in the x64\sources and x86\sources directories - for me it was just the "sources" folder
  4. Install Windows - I did this by just running the setup.exe file
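
For reference, the standard formats of these two files are below. The XXXXX string is a placeholder for your product key, and the EditionID/Channel values are assumptions for a retail Professional key - adjust them to suit your license:

```ini
; PID.txt
[PID]
Value=XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
```

```ini
; ei.cfg
[EditionID]
Professional

[Channel]
Retail

[VL]
0
```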

I am documenting this here for my personal benefit, but the help was found here:

Importing files into SQL Server using OPENROWSET

Created 09 November 2017 12:17, updated 09 November 2017 12:45

I've used OPENROWSET to import Excel documents for years, but I was playing around today with CSV and Pipe-delimited files, and there are some tricks to these that I thought I would document for future reference.

All the examples below assume you have a folder F:\MyData on your server. I emphasise this, because if you run the below scripts using SSMS on your machine, that is not the same F: drive - it is the F: drive the server sees.


Excel documents go in really easily. You just need the name of the Sheet you want to import - in my example below I am importing all data from sheet "Sheet1" into table "MyExcelData".

SELECT * INTO MyExcelData FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Excel 12.0;Database=F:\MyData\MyFile.xlsx', [Sheet1$]) 


Second easiest is a CSV document. Be warned: you will need to make sure that it is an ANSI-compliant file (or that there are no commas in the data), otherwise you will end up with columns named "F1", "F2", "F3" in your generated table.

SELECT * INTO MyCsvData
FROM OPENROWSET('MSDASQL'
    ,'Driver={Microsoft Access Text Driver (*.txt, *.csv)}'
    ,'SELECT * FROM F:\MyData\MyFile.csv')

Delimited text files (e.g. pipe-delimited)

Last, but not least, is a delimited document. In this case, it works similarly to CSV, but if you try to import it as per the above statement, you will end up with all the data in a single column. As such, you will need a Schema.ini file, which contains data about your file and must be accessible to your server. In my example, I declare the DefaultDir as the place where the Schema.ini file resides.

Firstly, the Schema.ini. This file must contain details for each file that is being imported, where the heading of each section is the name of the file. So, if I am importing a pipe-delimited file called MyDelimitedFile.txt, that has a header row, my Schema.ini will have the following contents:
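
Based on the standard Microsoft Text Driver Schema.ini format, and assuming the file is pipe-delimited with a header row, that would be:

```ini
[MyDelimitedFile.txt]
Format=Delimited(|)
ColNameHeader=True
```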


The SELECT statement must then reference the DefaultDir, and the delimiters will be used to determine the columns.

SELECT * INTO MyDelimitedData
FROM OPENROWSET('MSDASQL'
    ,'Driver={Microsoft Access Text Driver (*.txt, *.csv)};DefaultDir=F:\MyData'
    ,'SELECT * FROM F:\MyData\MyDelimitedFile.txt')

Load test with Fiddler

Created 20 September 2017 16:25

A client reported a rather depressing problem with one of our API endpoints today - it worked fine with single posts, but multiple concurrent posts resulted in random errors.

I had the means to test the API with Fiddler, but that didn't help me much, as it only sends one request at a time and I couldn't replicate the issue. I have used West Wind WebSurge in the past for personal work, but it requires a license for commercial use and I couldn't get it to play nicely with a self-signed test certificate, so I started Googling for an alternative.

It turns out you can do it easily with Fiddler. Execute the request that you want to stress test once, and then select the request once the response is received. Hit Shift-R, and a pop-up will appear asking how many simultaneous requests you want to make, defaulting to 5. Click OK on that, and Fiddler will send 5 concurrent requests!
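
Shift-R is essentially firing N copies of the selected request concurrently. If you want to script the same idea, here is a minimal Python sketch - the local test server is a made-up stand-in for the real endpoint:

```python
import http.server
import threading
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

# Throwaway local endpoint standing in for the API under test.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # silence per-request console logging

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]

# Send 5 requests at the same time, like Fiddler's Shift-R replay.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(lambda _: urlopen(url).read(), range(5)))

server.shutdown()
print(results)
```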

Web Site and Page Essentials

Created 24 August 2017 20:03, updated 03 February 2018 17:18

Creating a web site is simple, but there are a lot of basic items to consider that are easy to forget but will trip you up over the lifetime of the web site. These lists should be revisited regularly.

For your site as a whole

  1. Are JavaScript and CSS resources bundled and minified?
  2. While sitemaps are not as important as they used to be, if you have a complex site hierarchy or some pages are hidden behind posted forms, it may help your SEO to include a sitemap (sitemap.xml)
  3. Set the X-Frame-Options header, so your site cannot be framed by other sites (clickjacking)
  4. If possible, serve everything over HTTPS
  5. Implement HSTS (HTTP Strict Transport Security)
  6. Ensure all passwords are hashed, and allow users to reset passwords by sending them an email
  7. Error messages should be caught at a global level - don't reveal sensitive information when an error occurs
  8. Create an appropriate robots.txt
  9. Scan your site using
  10. Scan your site using
  11. Run your site through (particularly if your site is ASP.NET)
  12. Run your site through
  13. Optimize your images
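
On item 6, "hashed" should mean a salted, slow hash, not a bare fast digest. A minimal sketch using Python's standard library - the function names and the iteration count are illustrative, not prescriptive:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None):
    """Store the salt and the PBKDF2 digest - never the password itself."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, expected)  # constant-time compare

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```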

For every page

  1. Version numbers on CSS and JS include files - this will prevent new deploys from going wrong because users have cached copies of old files on their machines e.g. Myfile.js?version=1.0.4
  2. SQL Injection attacks - have you parameterised all your queries? Dapper, maybe?
  3. CSRF/XSRF Vulnerabilities - every page on your site (apart from the login screen) should compare the value of a form item AND a cookie to ensure they are the same cryptographically secure number, otherwise authentication fails. For ASP.NET users, use the anti-forgery token.
  4. Don't send any sensitive information over email e.g. passwords
  5. Can the parameters be tampered with to reveal data the user should not be able to get to?
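
On item 2, parameterisation means the user's input is bound as a value, never spliced into the SQL text. A small illustration using Python's built-in sqlite3 (the table and data are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

hostile_input = "1 OR 1=1"  # classic injection attempt

# BAD (don't do this): concatenating the input into the SQL string would
# turn the WHERE clause into "WHERE id = 1 OR 1=1" and return every row.

# GOOD: with a bound parameter, the driver treats the input purely as a
# value, so the hostile string simply matches nothing.
rows = conn.execute("SELECT * FROM users WHERE id = ?", (hostile_input,)).fetchall()
print(rows)  # []
```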


Extract SSH Key from Filezilla Client

Created 10 May 2017 15:52

I always forget this, so I'm documenting it here for myself. If you make SFTP connections from automated scripts or within a program, you really want to be supplying the SSH key.

On Windows, this can be extracted by doing the following:

  1. Open registry editor (regedit.exe)
  2. Go to HKEY_CURRENT_USER\Software\SimonTatham\PuTTY\SshHostKeys
  3. All the keys are stored, with types, on the right hand side under the SshHostKeys folder. The Name of each entry contains the host.
  4. Delete the key for the host you're looking for, and reconnect using Filezilla. It should then prompt you with an unknown-host-key challenge, giving you the key type and value for future use.
  5. You may want to confirm this key with the owner of the source system!

Using Powershell with REST

Created 11 October 2016 06:55

I was doing some Web API testing for a client today, with a fairly complex scenario that required multiple permutations of data. I had requested each possible combination, which was provided to me as a pile of JSON files that were compliant with a REST service on our side. The problem now, though, was how do I automate sending these JSON files?

As usual, I turned to Powershell, and was pleasantly surprised to see that there is an existing "Invoke-RestMethod" cmdlet, making this a piece of cake! The only minor complications were that our service used a combination of IP address validation and Basic Auth, and that we forced SSL usage - but these turned out to be no problem from a code perspective.

$dir = "C:\temp\20161010\MyJsonFiles"
$files = [System.IO.Directory]::EnumerateFiles($dir, "*.txt")
$url = "https://localhost:44300/api/client/endpoint"

# ignore certificates when debugging on localhost only - don't use this on live!!!
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

# set up user for basic auth
$user = 'test'
$pass = 'password'
$pair = "$($user):$($pass)"
$encodedCreds = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($pair))
$basicAuthValue = "Basic $encodedCreds"
$headers = @{
    Authorization = $basicAuthValue
}

foreach ($f in $files)
{
    $body = [System.IO.File]::ReadAllText($f)
    Invoke-RestMethod -Uri $url -Body $body -Method Post -Headers $headers -ContentType "application/json"
}

Export MongoDb query to CSV

Created 16 August 2016 17:16

Flattening out MongoDb documents can be a bit of a pain. I use RoboMongo to inspect database documents, and I may be stupid, but although I can query individual fields, the GUI still presents them in tree format, so I don't get a nice "spreadsheet" view that I can export and send to users when they request data.

However, MongoDb ships with an export utility, mongoexport, that can be used to easily extract documents (at a field level) to a CSV file. In my example below, we have a complex Workflow document where the required fields are many nodes down the tree. This is easily extracted as follows:

mongoexport --db Workflows --collection Workflows --csv -f "Workflow.CreatedOn,Workflow.OrderInfo.Amount,Workflow.OrderInfo.ReferenceNumber" -o out.csv

This will extract the three fields into three columns, with headings. You can even filter the results by passing a query via the "-q" parameter.

Openfiles utility

Created 27 June 2016 11:43, updated 18 June 2017 10:02

I have no idea how I didn't know this existed - it's been around for ages. Openfiles.exe has existed on Windows machines since Windows XP, and this is something that I've struggled with for years!

Openfiles.exe /query /s file-server > openfiles.txt

This will export a list of all open files, by user, on that particular server, as per the below:
ID       Accessed By  Type     Open File (Path\executable)
======== ============ ======== ===========================
31541186 eric         Windows  F:\..\Example\June
42278615 tom          Windows  E:\..\Example2\doc
46977638 john         Windows  E:\..\Hello
30870272 jane         Windows  E:\..\Place\file.docx

If you would like to "kill" one of these locks - for example the lock Tom has (ID 42278615) - you can just disconnect it using the same utility, supplying the ID as follows:

openfiles.exe /disconnect /id 42278615 

SQL Server: Scalar Function Affecting Performance

Created 22 January 2016 12:20, updated 18 June 2017 13:49

I was looking into poor performance issue on a data warehouse report today, and noticed this line as one of the columns being returned in the SELECT statement:


The query as a whole has around 7 joins and returns 5500 records. This is a fairly small data set, but without this line the query was running in under 2 seconds - with the scalar function being called it was taking 14 - 20 seconds. However, when I looked into the function it really wasn't doing very much - it wasn't very well written but it didn't have any hidden SELECT statements or anything that would cause an obvious performance issue. All it was doing, was formatting the date being passed to it.

The function itself wasn't the problem - it was the row-by-row invocation of the scalar function that was the issue. Basically, this speaks to the way SQL Server optimises - it cannot fold the scalar function into its query plan, so it executes it 5500 times - once for every row.

So what's the solution? If the function encapsulates logic that you don't want spread across reports, you can't remove the function. Your best option is to use a TABLE valued function instead of a scalar. Firstly, you need to rewrite your function as follows:

CREATE FUNCTION [cadi].[MyFunction] (@Date AS smalldatetime)
RETURNS TABLE
AS
RETURN
(
       select right(convert(varchar, @Date, 106), 8) AS MyDate
)

Then, you need to remove the function from the SELECT portion of your query and CROSS APPLY it - as follows:

SELECT MyTableResult.MyDate
FROM dbo.MainTable t
INNER JOIN dbo.OtherTable o ON t.Id = o.ForeignKeyId
CROSS APPLY cadi.MyFunction(t.MyDateColumn) AS MyTableResult
WHERE t.Status = 1

In my particular scenario, applying the TABLE valued function as opposed to the SCALAR one brought the query back down to 2 seconds, i.e. no appreciable performance hit at all.