SQL Server : Scalar Function Affecting Performance 22 January 2016 at 12:20

I was looking into a poor performance issue on a data warehouse report today, and noticed this line as one of the columns being returned in the SELECT statement:

schema.MyFunction(table.DateColumn) 

The query as a whole has around 7 joins and returns 5500 records. This is a fairly small data set, but without this line the query was running in under 2 seconds - with the scalar function being called it was taking 14 - 20 seconds. However, when I looked into the function it really wasn't doing very much - it wasn't particularly well written, but it didn't have any hidden SELECT statements or anything else that would cause an obvious performance issue. All it was doing was formatting the date being passed to it.
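
For illustration, the scalar version boiled down to something like this - a rough sketch, as the exact formatting expression isn't the important part:

CREATE FUNCTION [dbo].[MyFunction] (@Date AS smalldatetime)
RETURNS varchar(8)
AS
BEGIN
    RETURN RIGHT(CONVERT(varchar, @Date, 106), 8)
END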

The function itself wasn't the problem - it was the repeated, row-by-row invocation of the scalar function that was the issue. Basically, this comes down to the way SQL Server optimises queries: it cannot inline a scalar function into the query plan, so it executes it 5500 times - once for every row.

So what's the solution? If the function encapsulates logic that you don't want duplicated across reports, you can't simply remove it. Your best option is to use an inline TABLE valued function instead of a scalar one. Firstly, you need to rewrite your function as follows:

CREATE FUNCTION [dbo].[MyFunction] (@Date AS smalldatetime)
RETURNS TABLE WITH SCHEMABINDING
AS
RETURN (
       SELECT RIGHT(CONVERT(varchar, @Date, 106), 8) AS MyDate
)

Then, you need to remove the function from the SELECT portion of your query and CROSS APPLY it instead - as follows:

SELECT MyTableResult.MyDate, 
t.Detail
FROM dbo.MainTable t
INNER JOIN dbo.OtherTable o ON t.Id = o.ForeignKeyId
CROSS APPLY dbo.MyFunction(t.MyDateColumn) AS MyTableResult
WHERE t.Status = 1

In my particular scenario, applying the TABLE valued function as opposed to the SCALAR one brought the query back down to around 2 seconds, i.e. no appreciable performance hit at all.

MediaWiki on IIS working with no styles 04 January 2016 at 19:58

I decided to install MediaWiki today on one of our web servers, and set it up within IIS. Although the site immediately served pages, all of them were displaying without any styles. The stylesheet request showed a 200 response in Chrome, so I didn't think there was anything wrong there and resorted to Google. I couldn't find anything on Google with an answer though, so after an hour of trial and error I eventually ended up back in Chrome looking at the request.

It turns out the 200 response was misleading - when I looked at the contents of the response itself, there was actually an error in it: C:\Windows\TEMP not writable. The LESS engine being used was trying to dynamically compile the stylesheet and was failing because the IIS user under which the site was running didn't have write access to the folder. I added write permissions for that user, and boom, it started working.
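
For reference, granting write access from an elevated command prompt looks something like the line below - this assumes the site's application pool identity is covered by the built-in IIS_IUSRS group, so substitute the actual account your site runs under if that's not the case:

icacls C:\Windows\TEMP /grant "IIS_IUSRS:(OI)(CI)M"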

IIS7 : Add folders for specific users with Basic Authentication 05 May 2015 at 17:02

I needed to set up a web site that has a basic page for browsing as the root, but within that site there should also be download folders secured for different users. The folders don't require a web page, just browsable content. The issue I had was that there is a lot of misleading (and difficult to understand) documentation online, and I kept getting it working only to realise ALL users had access to ALL folders, or none. After some playing around, I found that for a basic site it's actually fairly easy to configure, but thought I'd document it here as it's useful as a basic file sharing mechanism. Note that you will need to ensure your site uses HTTPS rather than HTTP, otherwise the password is sent in plain text across the network/internet/etc. Basic Auth isn't the most secure mechanism, but it's better than nothing at all.

Steps for configuration in IIS7:

  1. Create your web site in IIS with a basic index.htm file as per any other site. Make sure Anonymous Authentication is Enabled at this level.
  2. User permissions are controlled using standard user accounts. Go to manage your computer and add a new local user.
  3. In Windows, create the folder in your web site directory.
  4. In IIS Manager, click on the folder, go to Authentication and make sure Anonymous Authentication is Disabled, and Basic Authentication is Enabled (these settings can also be scripted - see the sketch after these steps).
  5. On the folder, go to Directory Browsing and make sure it is Enabled.
  6. In Windows Explorer, go to the folder, and update the permissions so that the user you created has Read access to the folder. Make sure you remove any other accounts that are not relevant - for example the Users group on the local machine should NOT have access to the folder. Finally, you also need to ensure that the user of the Application Pool under which the web site runs has access to the folder. For example, if your App Pool runs under the Network Service account, this account must have access to the folder in question.

And there you go, that's it - you should be able to access your new folder and get a challenge response that will only work for your new user!
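
As an aside, the authentication and directory browsing settings in steps 4 and 5 can also be scripted with appcmd rather than clicked through in IIS Manager. A rough sketch, assuming a site called "Default Web Site" and a folder called "Downloads":

%windir%\system32\inetsrv\appcmd set config "Default Web Site/Downloads" -section:system.webServer/security/authentication/anonymousAuthentication /enabled:"False" /commit:apphost
%windir%\system32\inetsrv\appcmd set config "Default Web Site/Downloads" -section:system.webServer/security/authentication/basicAuthentication /enabled:"True" /commit:apphost
%windir%\system32\inetsrv\appcmd set config "Default Web Site/Downloads" -section:system.webServer/directoryBrowse /enabled:"True"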

Upgrading SSL configuration on your web server 05 May 2015 at 11:33

I ran SSL tests on one of our web servers recently and found it was horribly out of date, receiving an "F" from the Qualys SSL Labs SSL Server Test. Boooo. The problem was, in the past this was a pain in the butt as you had to pile into the registry and try to disable the old SSL support, along with various other manual steps.
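
For reference, the sort of registry change the manual route involves looks something like this in PowerShell - a sketch showing just one of the many tweaks (disabling SSL 3.0 on the server side), which is exactly why a script is the better option:

$key = "HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server"
New-Item -Path $key -Force | Out-Null
New-ItemProperty -Path $key -Name "Enabled" -Value 0 -PropertyType DWord -Force | Out-Null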

Thanks to the awesomeness of Google, however, I found this little gem: https://www.hass.de/content/setup-your-iis-ssl-perfect-forward-secrecy-and-tls-12. Run Powershell script, test, cheer. A great big thanks to the author for that script - it worked perfectly, and our web site now gets a nice green A rating.

XmlWriter not releasing file with Powershell Script 07 January 2015 at 13:23

I encountered an issue today where a developer had created a Powershell script to write XML from a stored procedure to a file, and the file would not be released until we closed the PowerGUI Script Editor (which is what we used to run the script). Even from the console it would take 10 seconds after the script completed before the file lock was released.

The solution was fairly simple - closing the XmlWriter doesn't automatically flush and release the underlying stream; you have to explicitly tell it to do so by setting the CloseOutput property on an XmlWriterSettings object when creating the writer. Full solution below.

#Setup connection string and sql to run
$server = "magma"
$db = "my_database"
$connectionString = "Data Source=$server; Integrated Security=True; Initial Catalog=$db"

$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$command = "EXECUTE spTest";

$dateTimeStamp = [System.DateTime]::Now.ToString("yyyyMMddHHmm")
$filePath = "\\mypath\Extract_$dateTimeStamp.xml"       

try
{
    $connection.Open()
    $command = New-Object System.Data.SqlClient.SqlCommand($command, $connection)
    $command.CommandTimeout = 0
	
    $xmlReader = $command.ExecuteXmlReader()

    $xmlWriterSettings = New-Object System.Xml.XmlWriterSettings
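    # CloseOutput ensures the underlying file stream is closed (and the file released) when the writer is closed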
    $xmlWriterSettings.CloseOutput = $true
    $xmlWriter =  ([System.Xml.XmlWriter]::Create($filePath, $xmlWriterSettings))
    
    $xmlWriter.WriteNode($xmlReader,$true)    
    $xmlWriter.Flush()
}
catch {
    $errorMessage = $_.Exception.Message
    $body = "Extract failed to run for the extract: $errorMessage"
    Send-MailMessage -To "me@test.co.za" -From "do-not-reply@test.co.za" -Subject "Oops" -Body $body -SmtpServer "MYEXCHSERVER"
}
finally
{
    # Guard against nulls in case the script failed before these objects were created
    if ($xmlWriter) { $xmlWriter.Close(); $xmlWriter.Dispose() }
    if ($xmlReader) { $xmlReader.Close(); $xmlReader.Dispose() }
    if ($connection) { $connection.Close(); $connection.Dispose() }

    $xmlWriter = $null
    $xmlReader = $null
    $connection = $null
}

Write-Output "DONE!"

SQL Server Database Stuck in Restoring 100% State 15 September 2014 at 16:37

I had an issue today where an empty database backup could not be restored for love nor money. The backup was 4MB, but when I attempted to restore it, it would create a 76GB log file on the server and then stay perpetually in a 100% "Restoring" state. This is because one of the virtual logs in the log file is still marked as used, and the database is stuck waiting for the log restore to complete.
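
To see how many virtual logs are still marked as in use in the database you're backing up, the (undocumented but widely used) DBCC LOGINFO command lists them - rows with a Status of 2 are still active:

DBCC LOGINFO ('database')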

You have two options here. Firstly, keep backing up the log and shrinking it until the virtual logs are cleared up:

BACKUP LOG [database] TO DISK = 'C:\Temp2.bak'
GO
DBCC SHRINKFILE (N'logfilename' , 10)
GO

Your second option is to switch the recovery model to simple, shrink the file and then set the recovery model back to full:

USE [database]
ALTER DATABASE [database] SET RECOVERY SIMPLE WITH NO_WAIT
DBCC SHRINKFILE([logfilename], 1)
ALTER DATABASE [database] SET RECOVERY FULL WITH NO_WAIT

PhotoTool 2.1.0 23 June 2014 at 21:48

One of the pieces of software I wrote (10 years ago now!) that I actually use is PhotoTool. This isn't really anything amazing, it just works for me when I want to upload photos. I updated it last week to use the ImageProcessor library, and put the horrible source code onto github. The code is pretty bad, but hey, it works for me.

Download the latest version from here.

Scheduled photos with a web cam 18 April 2014 at 12:00

I have a spare web cam lying around, and figured it would be a great addition to set it up so it can take regular photos of our lounge when we're away on holiday. I did a quick Google, and managed to get a rudimentary solution running in less than half an hour using CommandCam and Powershell.

The solution was pretty basic: I downloaded the exe and then created a PowerShell script that captures an image every 5 minutes. I added an extra command to my script which sends me an email every now and then to remind me that the capture is still running. Basic, but it works! Note that this just runs forever - I'm happy with it working that way and I just stop it when I want to, although it could be altered to run as a scheduled task or a Windows service.

cd E:\Cam
$photo_count = 0

while ($true) {
    .\CommandCam.exe /quiet

    $dt = [System.DateTime]::Now.ToString("yyyyMMdd_HHmmss")
    Rename-Item image.bmp "image_$dt.bmp"
    Write-Output "image_$dt.bmp"

    $photo_count = $photo_count + 1
    if ($photo_count -gt 72) {
        $smtp = New-Object Net.Mail.SmtpClient("mysmtp.com", 101) 
        $smtp.EnableSsl = $true 
        $smtp.Credentials = New-Object System.Net.NetworkCredential("hello@nowhere.com", "password"); 
        $smtp.Send("fromaddress@somewhere.com", "toaddress@nowhere.com", "Web capture running", "Reminder: web capture is still running on SERVERX")    
        Write-Output "Reminder sent"
        $photo_count = 0
    }    
    # capture every 5 minutes
    Start-Sleep -s 300 
}

With my web cam the resulting images are pretty small too - around 250-300KB each - so I went one step further and changed my target folder to a Dropbox folder. This means my images are automatically uploaded to the web too, so I can admire my empty lounge from my phone in a matter of seconds...

Deleting stubborn windows folders 17 April 2014 at 15:13

Every now and then, even as an administrator on a machine, you hit a folder that you just cannot delete. The usual route is to take ownership of the folder and its child files and folders, but once in a blue moon this doesn't work and you're stuck with a folder you can't delete. Assuming no locks on the files, I found today that issuing the following commands from an elevated command prompt worked where the GUI equivalents didn't:

takeown /F C:\Temp /R
cacls C:\Temp /T /e /g Administrators:f
rmdir /S C:\Temp

Git Cheat Sheet 24 February 2014 at 15:58

We're switching to git (on github.com) at the moment, and I really wanted to use it properly instead of just relying on the GUI clients. I decided to learn it using the command-line, and I found this excellent tutorial online which really helped me understand some of the fundamentals: http://www.sbf5.com/~cduan/technical/git/.

This article contains my basic cheat sheet for Git, which more or less follows the general workflow when using a git repo.

git init - Initialises a new git repository in the current folder.
git clone https://myrepo.com - Clones and initialises a remote git repository locally in the current folder - adds a remote repository reference named "origin".
git log - View log changes.
git add . - Recursively adds all changes to the repository.
git commit --dry-run - See what changes will be committed before actually running git commit.
git commit -m "My message" - Commits changes to the repository.
git branch - Get a list of local branches, with a star next to the current head.
git branch branch-name base - Creates a new branch based on an existing branch e.g. git branch test-branch HEAD.
git checkout branch-name - Switches to a new branch and updates the local folder with the files from that branch.
git fetch origin - Retrieves remote changes and updates remote heads.
git pull origin - Pulls all remote changes (origin can be replaced with a URL, for example).
git pull --rebase origin - Pulls all remote changes but baselines them BEFORE your local changes, so your changes move on top of what everybody else has already contributed.
git push origin HEAD - Pushes all changes back to the repository origin.
git clone --branch xyz https://github.com/MyOrg/MyRepo.git - Clone a specific tag from a remote repo.

Edit (30 June 2015): To prevent yourself from having to enter credentials with each command-line entry, you can configure it to use a local store with the following command: git config credential.helper store - this will result in one request for user credentials and then no more.