Create Windows scheduled task (for the NppUpdater script)

The NppUpdater script from my previous post doesn’t do anything by itself; it needs to be scheduled in the Task Scheduler.

So I thought I’d also release a script to you which does just that. If you add the code in this script to the NppUpdater script, it’ll also create the scheduled task if it doesn’t exist. Of course, the script can also be used for scheduling other things.

The task is created with the following parameters:

  • The task is created in a subfolder called “SysAdmins”.
  • The task’s name is “NppUpdater”.
  • The task will start the NppUpdater.cmd file.
  • The task runs daily at 6 AM, with a random start delay of 15 minutes.
  • The task’s start-in path is set to the path the script was started from when it ran.
  • The task will run with highest privileges as the account that ran it (which needs to be a member of the administrators group).
  • The task requires a logged-on user (if you don’t prefer this, just change the script).
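
For reference, the parameters above could be expressed with the ScheduledTasks module (Windows 8 / Server 2012 and later) roughly like this. This is a sketch under those assumptions, not the actual script:

```powershell
# Sketch: create the NppUpdater task as described above.
# Assumes the ScheduledTasks module is available; the paths and the
# interactive logon type are illustrative.
$action    = New-ScheduledTaskAction -Execute 'NppUpdater.cmd' -WorkingDirectory $PSScriptRoot
$trigger   = New-ScheduledTaskTrigger -Daily -At 6am -RandomDelay (New-TimeSpan -Minutes 15)
$principal = New-ScheduledTaskPrincipal -UserId "$env:USERDOMAIN\$env:USERNAME" `
    -RunLevel Highest -LogonType Interactive

# Only create the task if it doesn't exist yet
if (-not (Get-ScheduledTask -TaskPath '\SysAdmins\' -TaskName 'NppUpdater' -ErrorAction SilentlyContinue)) {
    Register-ScheduledTask -TaskPath '\SysAdmins\' -TaskName 'NppUpdater' `
        -Action $action -Trigger $trigger -Principal $principal
}
```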

If you don’t want the text in your log file about the task already existing, change the line at the bottom from




The script can be found here: Create-ScheduledTask.ps1

Notepad++ downloader and updater

I love to use Notepad++ (npp), but I also have npp installed on servers that users can log on to without administrative privileges. They also love to use npp, but with the updater enabled they get popup messages about updates that they cannot install. I don’t want to have to check every server that I manage to see if it needs a new version of npp, but if there’s a new version I do want that one rolled out to all servers.

There are many ways to do this job, but apart from WSUS we don’t use other tooling, so I decided to create two little tools.

  1. NppDownloader
  2. NppUpdater

The first tool will check the npp website to see if there’s a newer version available than the one that’s already on disk in a certain folder or on a certain share (this one I schedule on my file server). The second tool is one that I schedule on each server with npp; it’ll check a certain folder or share for npp installers. The latest one found will be installed on the system, if it’s newer than the currently installed version (I assume the default installation location is used).

Both scripts create a log file which is overwritten each time the script is run. These scripts were made before I did my loop and file write speed tests (as you can read in my previous blog post), so they still contain the [io.file]::WriteAllLines command instead of the [io.file]::WriteLine command. They only write a couple of lines to the file, so it should be about as quick, I guess.

I’ve zipped them both and added cmd files with some options, so the scripts can be run as administrator while starting in the correct path (normally a cmd that’s run as administrator will start in C:\windows\system32). There is an option to enable the local extensions and change to the path the cmd file was started from. This is achieved by these lines in the cmd file:

setlocal enableextensions
cd /d "%~dp0"

The zipped powershell scripts and cmd files can be found here:


Edit: If you also want to automatically schedule the NppDownloader file, you might be interested in my follow-up post: Create windows scheduled task for the nppupdater script

Speed of loops and different ways of writing to files – Which is the quickest?


A while ago, I stumbled upon two blogs about speed and PowerShell. First Guillaume Bordier’s blog about PowerShell and writing files. And second there was IT Idea’s blog on PowerShell Foreach vs ForEach-Object. While researching the matter, I also came across an article about speeding up loops (which wasn’t of use in my tests, but still interesting and something to think about when creating a for loop).

Regarding the first blog: I saw a flaw in the Export-Csv test he used, as the output in the file was not the plain string “some text”; and I myself use the .NET File class WriteAllLines a lot, which was not in his list.

Thus I decided to create my own test and try out the information found in both of the above blogs. Things got out of hand a little bit, which made me end up with the tool I have today and which I share with you. This way you can test them yourself or use my script as something to learn from. By creating this script I myself learnt a lot of things as well and used techniques which I hadn’t used before. Let me take you on the journey (if you’re not interested in the story and results, scroll down to the end of the post for the script):

Test Goals:

  1. I want the output to be the same and least interrupted by having the need of using variables next to writing and looping; but if they are needed, their time will be measured as well. (thus: file contents needs to be the same, for as far possible)
  2. I want to test the speed of each file write type
  3. I want to test the speed of each loop
  4. In case of export-csv, I want the output to be the same text as in each other file, but it’s allowed to have quotes around it, because that’s part of the csv format.
  5. I will use the default setting for each way to write to a file, I will just deliver a string/array (depending on the way to write) and a file name and let the command do its work.

The different ways to write to a file, which are used in the tests (minor number in the test files):

  1. export-csv -append
  2. >> output
  3. out-file
  4. out-file -append
  5. io file WriteAllLines (.net file class)
  6. .net streamwriter WriteLine (.net streamwriter class)
  7. io file WriteLine (.net file class)
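
For clarity, the three .NET-based writers (numbers 5 to 7) might look roughly like this. This is a sketch, assuming [IO.File]::CreateText is what’s behind the “io file WriteLine” variant:

```powershell
$path  = Join-Path $env:TEMP 'write-test.txt'
$lines = foreach ($i in 1..5000) { 'some text' }

# 5. .NET File class, all lines at once
[System.IO.File]::WriteAllLines($path, $lines)

# 6. .NET StreamWriter, line by line
$sw = New-Object System.IO.StreamWriter($path)
foreach ($line in $lines) { $sw.WriteLine($line) }
$sw.Close()

# 7. .NET File class handing back a writer (CreateText returns a StreamWriter)
$sw = [System.IO.File]::CreateText($path)
foreach ($line in $lines) { $sw.WriteLine($line) }
$sw.Close()
```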

The different loops, which are used in the tests (major number in the test files):

  1. 1..5000 | ForEach-Object
  2. Foreach 1..5000
  3. For x = 1 – 5000
  4. While x -ne 5000
  5. Do .. While x -ne 5000
  6. ForEach(x in [arr5000])

(I know there’s also a Do Until, but I figured there’s already a While and a Do.. While in here, so it probably wouldn’t matter that much; maybe I will add it later. If I missed some more, please tell me in the comments and I will try to add it.)
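
Each of the loop types above can be timed with Measure-Command; a minimal sketch of how the comparison works:

```powershell
# Time the slowest loop type against (what turned out to be) the quickest
$t1 = Measure-Command { 1..5000 | ForEach-Object { $null = $_ } }
$t2 = Measure-Command { foreach ($x in 1..5000) { $null = $x } }

'ForEach-Object : {0:N2} ms' -f $t1.TotalMilliseconds
'foreach        : {0:N2} ms' -f $t2.TotalMilliseconds
```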

Like I said before, once I started testing things, it got a little bit out of hand and I added more and more, also because the results were pretty interesting, but primarily because I enjoyed testing and learning about all the differences and new ways of doing things in my code.

The test results

The first thing to notice when the script completes is that the different write types give different file sizes. The difference with the csv export is easily explained by the quotes around each text line (64kb instead of 54kb).


When you open them with my favorite text editor (Notepad++), it’ll tell you the different types of encoding that was used.

UTF-8 is used by Export-CSV, the .net file WriteAllLines & WriteLine and .net streamwriter WriteLine

The others use UCS-2 LE BOM; I did not research the difference in depth, but UCS-2 uses two bytes per character, which explains the double file size.

(* Note – in most or all cases the file format can be specified, but I was interested in the default setting, as this would be the one used the most when I’ll use these commands)

After I saw these differences, I also noticed the time differences, which I color coded so it’s easier to find the best and worst ones. At the end, the tool shows two views on the results: one based on the type of loop and one based on the type of writing to the files.

In the end I limited the script to create test files with a maximum of 50 million lines (which is a 524MB UTF-8 file), but you can easily override that by commenting it out.

The slow writing types and loops will in most cases be exponentially slower with larger files, but testing with a minimum of 5000 lines will give a decent result. It is possible to go as low as 50 lines, but the quick writing types will then be done within 0 milliseconds. Even with only 50 lines to write, some of the methods take nearly 1 second, thus already slowing down your script if you use them.

All test results show that ForEach-Object is one of the slowest loops. The quickest seems to be the foreach loop, with Do .. While, While and For following with just several milliseconds difference. It would be quicker to create a list of the objects you’d otherwise run through ForEach-Object and process that list with a foreach loop; this would probably be about 30 seconds quicker for a loop that runs 5000 times. One interesting thing is that the Do .. While loop is a little bit quicker than the While loop (maybe because the Do already initiates the start of the loop, without evaluating the condition first?). That got me wondering about the Do .. Until loop, which I will probably add to the types of loop to test. But first I wanted to share my findings and script with you.

On the writing part there are a lot of differences to see.

The .NET File class WriteAllLines (which I thought was using a streamwriter, but it uses a filestream) performs well with small files, but once they get bigger, it is easily outperformed by the .NET File class WriteLine and the .NET StreamWriter. Those two are the quickest, with the io file WriteLine leading in all tests and loops.

The test results in screenshots:

(Screenshots: TestResults5000; Test-LoopsAndWriteFile PS4; Test-LoopsAndWriteFile PS4, 524MB file, WriteStartTest 6)

(The first one is a screenshot of test results as put into an Excel sheet and comparing 3 test runs; the second one is a screenshot of the results of the script with its default parameters, thus all tests and 5000 lines per file; the third one is a screenshot of the results of the script with 50 million lines (524MB UTF-8 file) written and only the quickest 2 writing methods. The red line (which is marked worst) is actually very good, but since it’s the slowest in the list, it gets this marking)

I’m pretty impressed by the results as well; writing 524MB, line by line in 10.4 seconds.

My conclusion:

Only use ForEach-Object if you really really really (yes, I just repeated the word really 3 times) need to, because it’s very bad for the performance of your script. If at all possible, just create a foreach loop and you will spend a lot less time waiting on your scripts to complete (which makes you, and others around you that use your scripts, happy). There are some valid reasons to use it, as described on the IT Idea blog (mentioned at the top of the article), to which Jeffery Hicks also replied with a valid reason to use foreach, though his example may be lacking a bit.

If you were considering using export-csv, >>, out-file -append or out-file: they all take 1 second up to 40 seconds for a 5000-line file (54kb in UTF-8), so I would recommend not to use those. If you need data in a CSV file, you can always use ConvertTo-Csv and then use any of the other writing methods to save your data. Or, if you’re into quick and dirty solutions, just put quotes around the strings you save and then use any of the quick writing methods to save them to a file.

If you were considering and/or already using the .NET File class WriteAllLines (like I am/was): you’re on the right track, and it is a quick way to save your data, but move on to one of the two quickest ones on the block:

  1. .net File WriteLine
  2. .net StreamWriter WriteLine

Which one of those to use? I’d say get my test script and test it yourself. On my desktop and laptop computer I got different time results, but the winner (by a hair) was the .NET File class with WriteLine.

So I know that from now on I’ll be using that one a lot more, in combination with a foreach or for loop, which both perform the best.

What I learned by creating this

As for the learning part (while creating this script): I learned about using dynamic script block names, placing the console cursor on a certain position (which doesn’t work on PowerShell 2.0, by the way), formatting strings, dynamic variables, how to properly write my text with export-csv (which I’ll never use again) and probably more.

I was confused about the .NET File class: since it uses a filestream, I had assumed it was the same type of stream as the streamwriter; but the test results showed me something different in timing, so I also learned that there’s a difference between filestream and streamwriter.

Nice piece of code

I believe this to be a very nice line in the script (as it calls all the different tests and loops) :

Invoke-Command -ScriptBlock (Get-Variable -Name "$TestName" -ValueOnly) -ArgumentList $s,$TestName,$TestNumber,$TotalTestText
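
The pattern is that each test lives in a variable whose name is built at runtime, so this one line can dispatch to any test. A minimal illustration (with a made-up test variable):

```powershell
# A test stored in a variable; the real script holds one of these per test
$WriteTest1 = { param($s, $name) "running $name against $s" }

# Build the name dynamically and look the scriptblock up by name
$TestName = 'WriteTest1'
Invoke-Command -ScriptBlock (Get-Variable -Name $TestName -ValueOnly) -ArgumentList 'out.txt', $TestName
```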

If you believe there’s anything missing in my tests, or things can be done on a better and/or different way, let me know below in the comments.

Optional parameters:

  • LoopStartTest (default value = 1; run all tests). If you want only the quick loops, start from 2
  • TestLines (default value = 5000; max = 50000000; 50 million). The amount of lines to write to each test file. Going below 1000 will result in writing methods completing within 0ms
  • WriteStartTest (default value = 1; run all tests). If you want only the quickest 2 writers, set the value to 6, if you want to include WriteAllLines set it to 5 (which can also be pretty quick, depending on amount of lines and file size)

Once you set the TestLines value above 1 million, it is advised to only start from writing method 5 and loop test 2. Though I am interested in your results: do they compare to mine? If you have the time and patience to wait for the slow ones on files above 100k lines, please let me know the results below in the comments. I don’t think the color coding will still be any good, though I tied a formula to it to try to keep it within acceptable ranges. The same goes if you’ve got the patience to run this script to create files larger than 50 million lines (524MB in UTF-8 encoding): please share your results.

The test script can be found here: Test-LoopsAndWriteFile.ps1

Note for PowerShell 2.0 users: The Append parameter is not implemented for Export-Csv, thus this test is skipped if PowerShell 2.0 is detected.

Download SolarEdge solar production data and save to csv

I’ve got a nice solar panel setup on my roof, which uploads its data to the SolarEdge monitoring portal (SolarEdge is the brand of inverter that I got). It appears that this monitoring portal also has an API to automate retrieving the energy data. Since I work at a company that does energy savings, monitoring, consultancy etc., it’s a logical step for me to automate downloading the production of my panels, so my colleagues at work can import this data into our energy monitoring portal and do their magic calculations on it; which results in me getting nice graphs of my solar energy production.

The purpose of the tool:

I wrote this tool to be somewhat smart in what to download and what not. I only want to download the information if it’s up-to-date. Next to that, I’d want daily values, but also values with a 15 minute interval. In the end I’d want all the info to be exported to CSV files containing 1 month of data and/or 1 year of data. All files that have already been generated don’t need their data to be downloaded again and overwritten, thus it skips this data once it’s downloaded. In the end I can either mail these files or put them on a file share, so they can be imported by our energy monitoring system. This last step I’ve removed from the script that I share with you.

Several nice ‘techniques’ used in this script to get to the goal:

Since I’m talking with an API, the most important command is Invoke-WebRequest with a pipe to ConvertFrom-Json.

To get the last day of the month, the following line is used: $LastDayOfTheMonth = ((Get-Date -Date “01-$Month-$Year” -Hour 0 -Minute 0 -Second 0).AddMonths(1).AddSeconds(-1)).Day

(which will add a month to the date/time 01-<month>-<year> : 0:00 and then remove 1 second to get to the last second of the previous month, thus resulting in returning the last day of that month (<last day>-<month>-<year> 23:59:59))

In the end, the downloaded data is converted to CSV with this command: | ConvertTo-Csv -Delimiter “;” -NoTypeInformation (in Excel with Dutch localization, the semicolon is a better separator than a comma, since Excel expects a semicolon; this saves me time on opening the csv files and converting the information to an Excel-readable file. Depending on your localization settings, you may want to change this accordingly.) I also add the switch -NoTypeInformation, since I’m not interested in information about the type of variable PowerShell used; I only need the data.
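
Put together, the export step might look like this; a sketch with made-up sample data:

```powershell
# Sketch: convert objects to semicolon-delimited CSV and save it with one of
# the quick writers from my speed tests (the sample data is made up)
$data = [pscustomobject]@{ Date = '2016-01-01 12:00'; Wh = 123 },
        [pscustomobject]@{ Date = '2016-01-01 12:15'; Wh = 130 }

$csv = $data | ConvertTo-Csv -Delimiter ';' -NoTypeInformation
[System.IO.File]::WriteAllLines((Join-Path $env:TEMP 'energy.csv'), $csv)
```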

Script parameters and bounds:

The script will check the SolarEdge portal for the date/time on which the installation received its latest data. If this date is not the same as today, something might be wrong; it’ll warn you about it, but will still download. If it’s out of date by more than 1 day, it’ll give an error about this, but the script will still continue.

When no data has been downloaded yet, the tool will download all data (starting at the installation date of the inverter) per month and save it to csv files. It will also generate files with yearly data. Otherwise, the data that’s downloaded is from the previous month (unless this is later than the date the last data was received on the SolarEdge monitoring portal; in that case the last data date is used).

In the case of the daily values (monthly and yearly csv’s), the tool will only store the values if they’re not $null (which means: no data received on the portal, or the solar panels with their optimizers aren’t yet initialized by the inverter as a whole). In the case of the 15-minute interval values, all $null values will be replaced by 0,0 (in that case the inverter is turned off because of its night mode); thus giving a nice list of 96 readings each day, and resulting in about 2688 – 2976 values in each file, depending on the number of days in the month (which can be checked for if needed; in my case I don’t, since my work already has many comprehensive tools to check for gaps or strange behaviour in data).

The script can be found and downloaded here: DownloadEnergyData.ps1


Edit: I got an undocumented API call parameter from a SolarEdge developer, which gives me 15-minute values in Wh 🙂 The script has been updated. (&TimeUnit=QUARTER_OF_AN_HOUR)

E-mail address checker

It has been a while since I posted an update. It’s been pretty busy at work (mostly PowerShell related, thus a lot of fun) and I have been busy working on several scripts to automate my work. Some of them I will also share here, like this nice little script to validate an e-mail address.

I needed a script to validate an e-mail address, but since the TLDs can be pretty ‘random’ nowadays, I didn’t want to just implement the RFC 822 regex and hope for the best; I wanted to validate the TLD as well. And since I’m already validating the TLD, I think I can do with a less comprehensive regex to check the rest of the e-mail address against the standards.

Thus I came up with the following script. First I download the list of current TLDs, and I update this file every month. Then I check whether the TLD(s) of the given e-mail address(es) are in the list of TLDs. Once this is done, a simple e-mail address regex will validate the rest of the address.
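
The core of that approach might look like this; a sketch, where the IANA TLD list URL is my assumption of a suitable source:

```powershell
# Download the TLD list (IANA publishes one; this URL is an assumption)
$TldUrl = 'https://data.iana.org/TLD/tlds-alpha-by-domain.txt'
$Tlds   = (Invoke-WebRequest $TldUrl).Content -split "`r?`n" |
          Where-Object { $_ -and $_ -notmatch '^#' }

$MailAddress = 'someone@example.com'
$Regex       = '^[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,63}$'

# Check the TLD against the list, then the whole address against the regex
# (-match and -contains are case-insensitive by default)
$Tld = ($MailAddress -split '\.')[-1].ToUpper()
if (($Tlds -contains $Tld) -and ($MailAddress -match $Regex)) {
    Write-Output "TLD exists for $MailAddress"
}
```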

You can find the script here: EmailAddressChecker.ps1

So the most important lines in the script are:

$TldUrl = (link to the list of current TLDs)

$Regex = “^[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,63}$” (e-mail address regex; it checks for allowed characters before and after the @. After the last dot it can only contain letters, with a minimum length of 2 and a maximum length of 63, according to the RFC standard.)

At this moment the script only outputs that the TLD exists. You can replace Write-Output “TLD exists for $MailAddress” with anything you’d like it to do after the TLD validation has been done.

Move-Item and files with [ and ] in their filename

I was researching a bug in a script of mine, where most files would be moved, except some. The behaviour seemed irregular, until I noticed that the files not being moved had one thing in common: they all had brackets [ ] in their file name. It appears that if you use Move-Item -Path <filename> -Destination <destination path>, it won’t work on files with those brackets in their name. To make Move-Item move these files too, use the LiteralPath switch instead of Path:

Move-Item -LiteralPath <filename> -Destination <destination path>
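
For example (the file name is made up):

```powershell
# -Path treats [1] as a wildcard character set, so the file isn't matched:
Move-Item -Path 'C:\temp\report[1].txt' -Destination 'D:\archive'

# -LiteralPath takes the name literally, so the file is moved:
Move-Item -LiteralPath 'C:\temp\report[1].txt' -Destination 'D:\archive'
```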

Net stop/start W3SVC and Stop/Start-Service W3SVC vs IISRESET STOP/START

I encountered a problem where I wanted to stop and start IIS by using either the command line or powershell versions of stopping and starting IIS.

Stop-Service W3SVC
Start-Service W3SVC

Both of those stopped and started the World Wide Web Publishing Service correctly, but neither appeared to stop/recycle the worker processes. Net start and stop gave an error message when IIS was already stopped and/or started before running those commands, resulting in a scheduled task error code.

I found that the IISRESET command does close those worker processes, and when IIS was already stopped and/or started it didn’t return an error, thus resulting in the scheduled task returning a nice 0x0 (success).

Thus: use the following commands (also from within PowerShell) if you want to stop/start IIS. These will always work as intended and will not return an error code (in some cases where the others would):

iisreset /stop
iisreset /start

Powershell application – Get information from a webpage, parse it and show the results

Today I created a nice little ‘application’ in Powershell.

In The Netherlands we have a site called Marktplaats, which is some sort of combination between eBay and craigslist. This is one of the biggest sites in The Netherlands, but I was missing some search options on it, and I thought they had started to show too many ads. Thus I decided to try something in PowerShell to make my life somewhat easier and, while I’m at it, also implement the things that I think the site is missing.

One of the most important things I was missing was a wide search pattern which I could fine-grain down to the things I’d like it to return (i.e. searching for all free items, but excluding sand, cats, yellow stones etc.).

The second thing I’m missing is an alert function. I’d like to receive an alert if something I search for gets placed on the site (and if it also complies with the from and to price I’m searching the item for).

Third thing, as mentioned before: they started loading the site with ads. At the moment it’s about 1 ad for each 2 items on the site. This is way too much and decreases the fun I have browsing through secondhand items.

Fourth thing: it appears that the site has restrictions on the distance radius I can set. Apparently their search function itself isn’t limited by those distance radiuses, so I can make an app which can search within any distance.


As you can see in the screenshot, there are no more ads 🙂

I still need to program the 2nd feature I want (it’s harder than I thought it would be), but I did already style the app in the same template style as the site and its logo. It nicely filters out all ads; it also filters out the items I don’t want in my search results.


Things I stumbled upon:

When resizing the form, the objects on the form didn’t resize accordingly and stayed on the same place.

This could be solved by using a splitcontainer on my form (something that splits the form in 2 parts); the splitcontainer already contains features for resizing. By setting its anchor, I was able to do the same with all the objects on the form and have them resize when I resize my form! 🙂

$SplitContainer1.Anchor = ([System.Windows.Forms.AnchorStyles]([System.Windows.Forms.AnchorStyles]::Top -bor[System.Windows.Forms.AnchorStyles]::Bottom -bor[System.Windows.Forms.AnchorStyles]::Left -bor[System.Windows.Forms.AnchorStyles]::Right))

Getting the search result was pretty easy; parsing the info was a little bit harder.

$WebUrl = "
$page = Invoke-WebRequest $WebUrl
$page.ParsedHtml.getElementsByTagName('tr') | ? { (($_.className -eq 'search-result defaultSnippet group-0') -or ($_.className -eq 'search-result defaultSnippet group-1')) } | % {

After that I could do the ‘magic’ with everything it returned. Thanks to their programmers being a little bit lazy, parsing the text was made easier, and I could even change it so I would have to write less code. One of the things I added is an ‘open in new window’ for each hyperlink, so the default browser on the system opens when I want to view the product itself; that way I’m not stuck with the browser object in my tool (which I now don’t need to extend to support more features and/or show buttons etc.).

In the beginning the script didn’t perform that well. Once I changed the replacement texts into a regex replacement, this sped up the process somewhat; but after I changed the function that changes the returned text (and checks it for certain values) to use Select-String, this sped up the entire process a lot. Right now it takes about 9–11 seconds (depending on the type of system and hardware) to get a 3.5MB result from the website, parse it, and show the parsed results (which are saved into a temp file of about 100KB). All in all I’m pretty happy with how it turned out and its overall look and feel; the same goes for the speed and ease; thus another great app/script to add to my collection 🙂

Next feature that’ll be added will be to check for new items and alert me when one is added inside my search criteria.

How simple some things can be – Powershell indent

In the past few years, I’ve been using tools to indent my code… Today I found out by accident that in PowerShell ISE, if you select multiple lines and press Tab, it’ll indent all selected lines… and of course Shift+Tab will do the reverse.

Amazing how simple some things can be without any tool… and how I’d never before thought about trying this, instead automatically going off to create my own solution or find an external one once I couldn’t find it in any menu.

Nice piece of automation

Today I stumbled upon a very nice script by David O’Brien, which documents your entire System Center Configuration Manager environment.

It generates a very nice report, all created automatically. A very nice piece of automation.