You always want to log errors on your live systems, but sometimes errors occur that you aren't interested in and don't want to code around - for example, a client disconnecting in the middle of a web request. NLog allows you to add filters at the logger level to ignore these errors. So you could log all errors to a database AND to email, but specific errors would not get emailed and fill up your mailbox!
<rules>
  <logger name="*" minlevel="Error" writeTo="Database" />
  <logger name="*" minlevel="Error" writeTo="email">
    <filters>
      <when condition="contains('${aspnet-request:serverVariable=HTTP_URL}','Token') and contains('${exception:format=Message}','task was canceled')" action="Ignore" />
    </filters>
  </logger>
</rules>
At IUA we use Jenkins for CI, and we use PowerShell to automate our builds and deployments. We've always invoked PowerShell using batch commands, but I recently implemented the PowerShell plugin, which makes life a little easier. Some tips on implementation:
Include a file
Prefix the file name with a dot and a space - this "dot-sources" the script, running it in the current scope:
. C:\Temp\test.ps1
Use environment variables
When running in the context of Jenkins, you can get and set environment variables. This can be useful when wanting to access a common function used across multiple scripts. For example, if you have many files wanting to use a version number, but the generation of that version number is complex enough to put it in a function, you could move that function to a common file and then store the generated value in an environment variable:
. C:\Temp\version.ps1
$env:version = GetVersion
Write-Host $env:version
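For completeness, here is a sketch of what version.ps1 might contain - the GetVersion implementation is hypothetical, using the BUILD_NUMBER variable that Jenkins sets for each build:
# C:\Temp\version.ps1 - hypothetical example; real version logic will be more involved
function GetVersion {
    # Jenkins exposes the current build number as an environment variable
    return "1.0.$($env:BUILD_NUMBER)"
}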
Run jobs via the Jenkins API
I got this code from Stack Overflow (https://stackoverflow.com/questions/46131903/why-is-powershell-not-able-to-send-proper-crumb), but adapted it to include build parameters:
function RunJenkinsJob {
    param(
        [string]$userName,
        [string]$password,
        [string]$apiUrl,
        [string]$apiPort,
        [string]$jobName,
        [string]$jobParams # url-encoded string of parameters e.g. myparam1=val1&myparam2=anotherval
    )
    # Build the Basic authentication header
    $h = @{}
    $h.Add('Authorization', 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes("$(${userName}):$(${password})")))
    # Request a CSRF crumb from Jenkins
    $params = @{ uri = "${apiUrl}:${apiPort}/crumbIssuer/api/json"; Method = 'Get'; Headers = $h }
    $apiCrumb = Invoke-RestMethod @params
    # Choose the build endpoint depending on whether parameters were supplied
    $uri = "${apiUrl}:${apiPort}/job/${jobName}"
    if ([String]::IsNullOrWhiteSpace($jobParams)) {
        $uri = "$uri/build"
    }
    else {
        $uri = "$uri/buildWithParameters?$jobParams"
    }
    # Post the build request with the crumb attached
    $h.Add('Jenkins-Crumb', $apiCrumb.crumb)
    $params['uri'] = $uri
    $params['Method'] = 'Post'
    $params['Headers'] = $h
    Invoke-RestMethod @params
}

RunJenkinsJob -userName "jenkins" -password "mypassword" -apiUrl "http://buildserver" -apiPort "8080" -jobName "JobWithParams" -jobParams "PARAM1=value1&PARAM2=value2"
RunJenkinsJob -userName "jenkins" -password "mypassword" -apiUrl "http://buildserver" -apiPort "8080" -jobName "JobWithoutParams"
For every web site created in IIS, a new W3SVCX folder is created in the logging folder configured for IIS. Which folder applies to which site is not obvious, particularly when you have multiple sites serving the same content, and it's not easy to work it out from inspecting the log files themselves.
The configuration can be worked out by opening the applicationHost.config file in the %WinDir%\System32\Inetsrv\Config folder. Under the sites element, each site is listed with a unique id, for example:
<site name="mysite.mydomain.co.za" id="6" serverAutoStart="true">
In this case, the log files for this site will be stored in the W3SVC6 folder.
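If you don't feel like digging through the XML, the same mapping can be pulled out with the WebAdministration module, run on the server itself. A minimal sketch (note that the configured log directory may contain environment variables such as %SystemDrive%):
# List every IIS site with its id and the W3SVC log folder it writes to
Import-Module WebAdministration
Get-Website | Select-Object Name, Id,
    @{ Name = 'LogFolder'; Expression = { "$($_.logFile.directory)\W3SVC$($_.Id)" } }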
I've been meaning to move to CloudFlare for a while, and thanks to this amazingly helpful page created by Troy Hunt, I've taken the leap.
I've been battling to find the time, and that page took all the research requirements right out of it - I'm up and running with SSL even though my host doesn't support it on the cheap hosting package I'm using.
I've been using MediaWiki for ages for technical documentation, and I always forget how to implement custom CSS in both screen and print, so I'm adding this as a permanent reminder for myself.
To edit the CSS for the site (screen version), navigate to "http://YOURMEDIAWIKISITE/index.php?title=MediaWiki:Common.css" - and edit the CSS declared on that page
To edit the CSS for printing, navigate to "http://YOURMEDIAWIKISITE/index.php?title=MediaWiki:Print.css"
Easy!
I needed to move a WordPress site to HTTPS. The certificate was installed, and the site was being correctly served over HTTPS, but I was getting mixed content warnings, as many of the images were being loaded over HTTP. There are many sites that give the full rundown, but none of their fixes worked for me with the Enfold theme.
This site: https://managewp.com/blog/wordpress-ssl-settings-and-how-to-resolve-mixed-content-warnings - can be used to set up HTTPS from start to finish
This site: https://isabelcastillo.com/mysql-wordpress-http-to-https - led me to the solution
Once the site was set up for HTTPS, I had to run the following four queries against the WordPress database to make the image URLs protocol-relative:
UPDATE wp_posts
SET post_content = ( Replace (post_content, 'src="http://', 'src="//') )
WHERE Instr(post_content, 'jpeg') > 0
OR Instr(post_content, 'jpg') > 0
OR Instr(post_content, 'gif') > 0
OR Instr(post_content, 'png') > 0;
UPDATE wp_posts
SET post_content = ( Replace (post_content, "src='http://", "src='//") )
WHERE Instr(post_content, 'jpeg') > 0
OR Instr(post_content, 'jpg') > 0
OR Instr(post_content, 'gif') > 0
OR Instr(post_content, 'png') > 0;
UPDATE wp_postmeta
SET meta_value = ( Replace (meta_value, 'src="http://', 'src="//') )
WHERE Instr(meta_value, 'jpeg') > 0
OR Instr(meta_value, 'jpg') > 0
OR Instr(meta_value, 'gif') > 0
OR Instr(meta_value, 'png') > 0;
UPDATE wp_postmeta
SET meta_value = ( Replace (meta_value, "src='http://", "src='//") )
WHERE Instr(meta_value, 'jpeg') > 0
OR Instr(meta_value, 'jpg') > 0
OR Instr(meta_value, 'gif') > 0
OR Instr(meta_value, 'png') > 0;
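Once the queries have run, it's worth sanity-checking a few pages for any stragglers. A rough PowerShell sketch (the URL is a placeholder for your own site):
# Fetch a page and list any src="http://..." references that survived the update
$html = (Invoke-WebRequest -Uri 'https://www.example.com').Content
[regex]::Matches($html, 'src=["'']http://[^"'']+') | ForEach-Object { $_.Value }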
I purchased a new laptop through work today, which came with a pre-installed version of Windows Home Edition. I need Professional Edition for work, and so bought a copy of Windows Pro at the same time. Because I'm iffy about setup, I elected to install the new OS myself...what a mistake. No matter what I did, the Windows installer kept pulling the key from the BIOS, and so kept activating as Windows Home.
Things I tried that failed:
Things I tried that worked:
Creating a PID.txt file in the installation media's sources folder, containing the product key:

[PID]
Value=XXXXX-XXXXX-XXXXX-XXXXX-XXXXX

And an EI.cfg file in the same folder, forcing the edition:

[EditionID]
Professional
[Channel]
OEM
[VL]
0

In the below articles, it was mentioned that these files should be placed in the x64\sources and x86\sources directories - for me it was just the "sources" folder.
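If you're rebuilding the media, both files can be dropped in with a few lines of PowerShell. A quick sketch, assuming the installation USB is mounted as E: (the drive letter and key are placeholders):
# Hypothetical drive letter for the installation media - adjust as needed
$sources = 'E:\sources'

# PID.txt pre-seeds the product key so setup doesn't pull the Home key from the BIOS
@"
[PID]
Value=XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
"@ | Set-Content "$sources\PID.txt"

# EI.cfg forces the edition and channel so setup installs Professional
@"
[EditionID]
Professional
[Channel]
OEM
[VL]
0
"@ | Set-Content "$sources\EI.cfg"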
I am documenting this here for my personal benefit, but this help was found here:
I've used OPENROWSET to import Excel documents for years, but I was playing around today with CSV and Pipe-delimited files, and there are some tricks to these that I thought I would document for future reference.
All the examples below assume you have a folder F:\MyData on your server. I emphasise "on your server", because if you run the scripts below using SSMS on your machine, F: is not your local drive - it is the F: drive as the server sees it.
Excel documents go in really easily. You just need the name of the Sheet you want to import - in my example below I am importing all data from sheet "Sheet1" into table "MyExcelData".
SELECT *
INTO MyExcelData
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=F:\MyData\MyFile.xlsx',
    [Sheet1$])
Second easiest is a CSV document. Be warned: you will need to make sure that it is an ANSI-encoded file (or that there are no commas in the data), otherwise you will end up with columns named "F1", "F2", "F3" in your generated table.
SELECT *
INTO MyCsvData
FROM OPENROWSET('MSDASQL',
    'Driver={Microsoft Access Text Driver (*.txt, *.csv)};',
    'SELECT * FROM F:\MyData\MyFile.csv')
Last, but not least, is a delimited document. This works similarly to CSV, but if you try to import it using the statement above, you will end up with all the data in a single column. You will need a Schema.ini file that contains metadata about your file and is accessible to your server. In my example, I declare the DefaultDir as the place where the Schema.ini file resides.
Firstly, the Schema.ini. This file must contain details for each file being imported, where the heading of each section is the name of the file. So, if I am importing a pipe-delimited file called MyDelimitedFile.txt that has column headers, my Schema.ini will have the following contents:
[MyDelimitedFile.txt]
ColNameHeader=True
Format=Delimited(|)
The SELECT statement must then reference the DefaultDir, and the delimiters will be used to determine the columns.
SELECT *
INTO MyDelimitedData
FROM OPENROWSET('MSDASQL',
    'Driver={Microsoft Access Text Driver (*.txt, *.csv)};DefaultDir=F:\MyData;',
    'SELECT * FROM F:\MyData\MyDelimitedFile.txt')
A client reported a rather depressing issue with one of our API endpoints today - it worked fine with single posts, but multiple concurrent posts resulted in random errors.
I had the means to test the API with Fiddler, but that didn't help much at first, as it only sends one request at a time and I couldn't replicate the issue. I have used West Wind WebSurge in the past for personal work, but it requires a license for commercial use and I couldn't get it to play nicely with a self-signed test certificate, so I started Googling for an alternative.
It turns out you can do it easily with Fiddler after all. Execute the request you want to stress test once, and select it in the session list once the response is received. Hit Shift-R, and a pop-up will appear asking how many simultaneous requests you want to make, defaulting to 5. Click OK, and Fiddler will send 5 concurrent requests!
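If you'd rather script the same check, here's a rough PowerShell sketch along the same lines - the endpoint and payload are placeholders, not the client's actual API:
# Hypothetical endpoint and payload - substitute your own
$uri  = 'https://testserver/api/endpoint'
$body = '{"example": true}'

# Fire 5 POSTs concurrently as background jobs, then collect the responses
$jobs = 1..5 | ForEach-Object {
    Start-Job -ScriptBlock {
        param($u, $b)
        Invoke-RestMethod -Uri $u -Method Post -Body $b -ContentType 'application/json'
    } -ArgumentList $uri, $body
}
$jobs | Wait-Job | Receive-Job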