
Open Excel file and run macro

I needed a way to open an Excel file, run one of the macros inside it to do some calculations and import some data, and then save the file under a new name, with the previous month's name and year appended to differentiate it from the original file.

Make sure macros are enabled within Excel; otherwise they'll be disabled when the file is opened, and the script won't work.
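If you'd rather not depend on the Excel UI setting, the COM object itself exposes an AutomationSecurity property that controls whether macros run in workbooks opened through automation. A minimal sketch (the numeric values come from the Office MsoAutomationSecurity enumeration; this is an alternative approach, not what my script does):

```powershell
$app = New-Object -ComObject Excel.Application

# MsoAutomationSecurity values: 1 = msoAutomationSecurityLow (enable all macros),
# 2 = msoAutomationSecurityByUI (use the UI setting), 3 = msoAutomationSecurityForceDisable
$previousSecurity = $app.AutomationSecurity
$app.AutomationSecurity = 1   # allow macros in workbooks opened by this instance

# ... open the workbook and run the macro here ...

$app.AutomationSecurity = $previousSecurity   # restore the original setting
$app.Quit()
```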

When working with Excel from PowerShell outside of the US, make sure you set the CurrentCulture to en-US; otherwise you'll run into several Excel errors.
I first save the current culture setting in a variable, so I can change it back after the script finishes. So, at the beginning of the script I have these lines:

$CurrentCulture = [System.Threading.Thread]::CurrentThread.CurrentCulture.Name

[System.Threading.Thread]::CurrentThread.CurrentCulture = "en-US"

And after the script finishes, I’ll set it back like this:

[System.Threading.Thread]::CurrentThread.CurrentCulture = $CurrentCulture

To get the name of the month and the year, use the following two lines. I like the month name to start with a capital letter, so I add a third line of code:

$dateTextMonth = Get-Date -Date (Get-Date).AddMonths(-1) -format MMMM
$dateTextYear = Get-Date -Date (Get-Date).AddMonths(-1) -format yyyy
$dateTextMonthString = $dateTextMonth.Substring(0,1).toUpper() + $dateTextMonth.Substring(1).ToLower()

Now we can open Excel, open the file and run the macro. I’ll set the Excel save format to an Excel file with macros enabled.

Add-Type -AssemblyName Microsoft.Office.Interop.Excel
$xlFixedFormat = [Microsoft.Office.Interop.Excel.XlFileFormat]::xlOpenXMLWorkbookMacroEnabled

$app = New-Object -comobject Excel.Application
$app.Visible = $True
$app.DisplayAlerts = $False
$wb = $app.Workbooks.Open("$folderName\$fileName.xlsm")

# Give Excel a moment to finish opening the workbook before running the macro
Start-Sleep -Seconds 1
$app.Run("MacroName")   # replace MacroName with the name of the macro inside the workbook

$app.ActiveWorkbook.SaveAs("$folderName\$fileName $dateTextMonthString $dateTextYear.xlsm", $xlFixedFormat)
$wb.Close()
$app.Quit()

And now it'll open Excel, open the macro-enabled workbook, run the macro, save the workbook under a new name and close Excel.

Sometimes the script would be too quick, so I needed to add a Start-Sleep of 1 second to make sure the Excel file is fully opened in Excel before I run the macro.

The complete script can be downloaded here.

Download SolarEdge solar production data and save to csv

I've got a nice solar panel setup on my roof, which uploads its data to the SolarEdge monitoring portal (this is the brand of inverter that I got). It appeared that this monitoring portal also has an API to automate getting the energy data. Since I work at a company that does energy savings, monitoring and consultancy, it was a logical step to automate downloading the production of my panels, so my colleagues at work can import this data into our energy monitoring portal and do their magic calculations on it; which results in me getting nice graphs of my solar energy production.

The purpose of the tool:

I wrote this tool to be somewhat smart about what to download and what not. I only want to download the information if it's up-to-date. Next to that, I want daily values, but also values at a 15-minute interval. In the end I want all the info exported to CSV files containing 1 month of data and/or 1 year of data. Files that have already been generated don't need their data downloaded again and overwritten, so the tool skips data once it's downloaded. Finally, I can either mail these files or put them on a file share, so they can be imported by our energy monitoring system. This last step I've removed from the script that I share with you.

Several nice ‘techniques’ used in this script to get to the goal:

Since I'm talking to an API, the most important command is Invoke-WebRequest with a pipe to ConvertFrom-Json.
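As a sketch of that pattern (the endpoint and parameters follow the SolarEdge monitoring API; the site id and API key below are placeholders, and the property names are assumptions about the JSON shape):

```powershell
# Placeholders - fill in your own installation's values
$SiteId = "123456"
$ApiKey = "YOUR_API_KEY"

# Daily energy values for one month from the SolarEdge monitoring API
$Url = "https://monitoringapi.solaredge.com/site/$SiteId/energy" +
       "?timeUnit=DAY&startDate=2015-01-01&endDate=2015-01-31&api_key=$ApiKey"

# Convert the JSON response body into PowerShell objects
$Response = (Invoke-WebRequest -Uri $Url).Content | ConvertFrom-Json

# Each entry has a date and a value; $null values mean no data for that day
$Response.energy.values | ForEach-Object { "$($_.date): $($_.value)" }
```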

To get the last day of the month, the following line is used: $LastDayOfTheMonth = ((Get-Date -Date "01-$Month-$Year" -Hour 0 -Minute 0 -Second 0).AddMonths(1).AddSeconds(-1)).Day

(which will add a month to the date/time 01-<month>-<year> : 0:00 and then remove 1 second to get to the last second of the previous month, thus resulting in returning the last day of that month (<last day>-<month>-<year> 23:59:59))

In the end, the downloaded data is converted to CSV with this command: | ConvertTo-Csv -Delimiter ";" -NoTypeInformation. (In Excel with Dutch localization, a semicolon is a better separator than a comma, since Excel expects a semicolon; this saves me time opening the CSV files and converting the information to an Excel-readable format. Depending on your localization settings, you may want to change this.) I also add the -NoTypeInformation switch, since I'm not interested in the type information PowerShell adds; I only need the data.
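A small illustration with made-up data (the object properties here are just examples):

```powershell
# Two sample readings; the decimal comma matches the Dutch locale
$data = @(
    [PSCustomObject]@{ Date = '2015-01-01'; Production = '1,25' }
    [PSCustomObject]@{ Date = '2015-01-02'; Production = '0,80' }
)

# Semicolon delimiter so Dutch-locale Excel opens the file straight into columns;
# -NoTypeInformation drops the "#TYPE ..." header line from the output
$data | ConvertTo-Csv -Delimiter ";" -NoTypeInformation | Set-Content "Production.csv"
```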

Script parameters and bounds:

The script checks the SolarEdge portal for the date/time on which the installation received its latest data. If this date is not the same as today, something might be wrong; it'll warn you about it, but will still download. If it's out of date by more than 1 day, it'll report an error, but the script will still continue.

When no data has been downloaded yet, the tool downloads all data (starting at the installation date of the inverter) per month and saves it to CSV files. It also generates files with yearly data. The data that's downloaded runs up to the previous month (unless that is later than the date the last data was received on the SolarEdge monitoring portal; in that case the last data date is used).

For the daily values (monthly and yearly CSVs), the tool only stores values that aren't $null ($null meaning: no data received on the portal, or the solar panels with their optimizers aren't yet initialized by the inverter as a whole). For the 15-minute interval values, all $null values are replaced by 0,0 (in that case the inverter is turned off because of its night mode), giving a nice list of 96 readings each day and resulting in about 2688 – 2976 values in each file, depending on the number of days in the month. (This can be checked for if needed; in my case I don't, since my work already has many comprehensive tools to check for gaps or strange behaviour in data.)
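The two $null rules described above could be sketched like this (the variable and property names are assumptions for illustration, not the script's actual code):

```powershell
# Daily values: keep only readings that actually contain data
$dailyValues = $response.energy.values | Where-Object { $null -ne $_.value }

# 15-minute values: replace $null with "0,0" (inverter asleep in night mode),
# so every day still yields a full list of 96 readings
$quarterValues = $response.energy.values | ForEach-Object {
    if ($null -eq $_.value) { $_.value = "0,0" }
    $_
}
```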

The script can be found and downloaded here: DownloadEnergyData.ps1


Edit: I got myself an undocumented API call parameter from a SolarEdge developer, which gives me 15 minute values in WH 🙂 The script has been updated. (&TimeUnit=QUARTER_OF_AN_HOUR)

Powershell application – Get information from a webpage, parse it and show the results

Today I created a nice little ‘application’ in Powershell.

In The Netherlands we have a site called Marktplaats, which is some sort of combination between eBay and Craigslist. It's one of the biggest sites in The Netherlands, but I was missing some search options and thought they had started to show too many ads. Thus I decided to try something in PowerShell to make my life somewhat easier and, while I'm at it, also implement the things I think the site is missing.

One of the main things I was missing was a wide search pattern that I could fine-grain down to the things I'd like it to return (i.e. searching for all free items, but excluding sand, cats, yellow stones, etc.).

The second thing I'm missing is an alert function. I'd like to receive an alert if something I search for gets placed on the site (and if it also falls within the price range I'm searching for).

Third thing, as mentioned before: they started loading the site with ads. At the moment it's about 1 ad for every 2 items on the site. This is way too much and decreases the fun of browsing through secondhand items.

Fourth thing: it appears that the site has restrictions on the distance radius I can set. Apparently their search function itself isn't limited by those distance radiuses, so I can offer an app that can search within any distance.


As you can see in the screenshot, there are no more ads 🙂

I still need to program the 2nd feature I want (it's harder than I thought it would be), but I did already style the app in the same template style as the site and its logo. It nicely filters out all ads; it also filters out the items I don't want in my search results.


Things I stumbled upon:

When resizing the form, the objects on the form didn't resize accordingly and stayed in the same place.

This could be solved by using a SplitContainer on my form (something that splits the form into 2 parts); the SplitContainer already has features for resizing. By setting its anchor, I was able to do the same with all the objects on the form and have them resize when I resize my form! 🙂

$SplitContainer1.Anchor = ([System.Windows.Forms.AnchorStyles]([System.Windows.Forms.AnchorStyles]::Top -bor [System.Windows.Forms.AnchorStyles]::Bottom -bor [System.Windows.Forms.AnchorStyles]::Left -bor [System.Windows.Forms.AnchorStyles]::Right))

Getting the search result was pretty easy; parsing the info was a little bit harder.

$WebUrl = "
$page = Invoke-WebRequest $WebUrl
$html = $page.ParsedHtml   # work on the parsed HTML DOM of the response
$html.getElementsByTagName('tr') | ? { (($_.className -eq 'search-result defaultSnippet group-0') -or ($_.className -eq 'search-result defaultSnippet group-1')) } | % {

After which I could do the 'magic' with everything it returned. Thanks to their programmers being a little bit lazy, parsing the text was easier than expected, and I could even change it so I'd have to write less code. One of the things I added is an 'open in new window' action for each hyperlink, so the default browser on the system opens when I want to view a product itself, and I'm not stuck with the browser object in my tool (which I now don't need to extend to support more features, show buttons, etc.).

In the beginning the script didn't perform that well. Changing the replacement texts into a regex replacement sped up the process somewhat, but changing the function that transforms the returned text (and checks it for certain values) to a Select-String sped up the entire process a lot. Right now it takes about 9-11 seconds (depending on the type of system and hardware) to get a 3,5MB result from the website, parse it, and show the parsed results (which are saved into a temp file of about 100KB). All in all I'm pretty happy with how it turned out: its overall look and feel, and the same goes for the speed and ease. Another great app/script to add to my collection 🙂
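As an illustration of that change (a simplified, hypothetical comparison, not the tool's actual code):

```powershell
# Hypothetical input file standing in for the downloaded search result
$lines = Get-Content "searchresult.html"

# Per-line checks in a PowerShell pipeline - works, but each line is
# tested individually in script code
$slowMatches = $lines | Where-Object { $_ -match 'search-result' }

# A single Select-String pass over the same input - the matching happens
# in one optimized scan, which is often noticeably faster on large inputs
$fastMatches = ($lines | Select-String -Pattern 'search-result').Line
```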

Next feature that’ll be added will be to check for new items and alert me when one is added inside my search criteria.

Analyze Robocopy log files and mail the result

A long, long time ago (back in 2008) I created a VBS script which would analyze a folder filled with Robocopy logs and once done analyzing and creating a summary, it would mail the result to me.

A couple of days ago I decided it was time to port this script to PowerShell. Once I was finished, I still wasn't satisfied with its speed (neither was I with the VBS version, but since it ran nightly I hadn't bothered changing it). Since I was working on it anyway, I thought I'd completely rewrite it: increase its speed and make it more versatile, so I can use it in more situations with the notification settings each situation needs. And since Microsoft now ships Robocopy with the OS by default, I'll be using it a lot more myself. I also checked whether others had created something similar; I found some, but none matched what I had just created or had the versatility I needed, so I thought I'd share it.

I tested this script on PowerShell 2.0 and 4.0, so it should also work with 3.0. If not, please contact me and I can update/change it (please include the error).

Ok, so what does this script do? It reads the contents of all Robocopy log files in the specified path. If you add -recurse, it'll also include subfolders. Depending on your settings, it'll create a summary file covering either all found log files, only the ones with errors and warnings, or only the ones with errors. Depending on whether you want warnings and/or errors, it'll add detailed information about the errors it found in the log files; the detailed error information is placed at the top of the summary file. The output file is placed in the directory the script is started from. Depending on your settings it'll send an e-mail report with a summary of the analyzed logs, with the summary file (with detailed error and/or warning info) attached.
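To give an idea of the kind of parsing involved, here's a minimal sketch (not the actual script) that pulls the Copied and FAILED counts from the standard English Robocopy summary table at the end of each log; the log folder path is a placeholder:

```powershell
# The Robocopy summary table columns are: Total, Copied, Skipped, Mismatch, FAILED, Extras
Get-ChildItem -Path "C:\RobocopyLogs" -Filter *.log | ForEach-Object {
    # Grab the last "Files :" summary line in the log
    $summary = Select-String -Path $_.FullName -Pattern '^\s*Files\s*:' |
               Select-Object -Last 1
    if ($summary) {
        # Split the numbers after the colon on whitespace
        $columns = ($summary.Line -split ':')[1].Trim() -split '\s+'
        [PSCustomObject]@{
            LogFile = $_.Name
            Copied  = $columns[1]   # second column
            Failed  = $columns[4]   # fifth column (FAILED)
        }
    }
}
```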

After I finished creating the script, I also added detailed help information and tested its speed.

Reading 110 files (95,8MB), containing 7 errors in 4 files and 402924 warnings in 4 files;
Running time: 2m8s
Tested with option SummaryAll (maximum output) and a MaxLinesPerFile of 500 (thus 2007 lines (500 per warnings file) to write)

Same conditions as above, but tested with SummaryAllWithoutWarnings (thus only 7 lines to write)
Running time: 20s

The Robocopy settings I use (I also use this to make copies of files that may be open by users in their desktop sessions; that's why I exclude the ~$ temporary Excel files in this script, otherwise you'd get false-positive errors):

robocopy "<source>" "<target>" /XD "<folder>" /XF "~$*.xls" "~$*.xlsx" "~$*.xlsm" /E /V /NP /LOG:"<Path\Logfile>" /R:2 /W:2

Getting eager to try it out yourself? The complete powershell script can be found here: AnalyzeRobocopyLogs.ps1

Use Get-Help .\AnalyzeRobocopylogs.ps1 to get default help information about the script, and Get-Help .\AnalyzeRobocopylogs.ps1 -full to get detailed information about the script.

If you use this script, change it, or share it with others, please leave me a comment and/or link to my blog.

The E-mail message will look something like this:

AnalyzeRobocopyLogs MailMessage


Edit: Updated the script. It appeared that Powershell v2 had an issue with the summary text. This has been updated in the AnalyzeRobocopyLogs v2.1 script.