
Writing to Excel

Background

About a year ago I started writing things to Excel with PowerShell, based on the information I found on an MSDN blog – Create Excel file with PowerShell

I wasn't quite satisfied, though. I wanted some more formatting (specifically: changing the top-row behavior), and it was somewhat frustrating not to be able to get everything the way I wanted, so I decided to find out how to do it myself. I started early in the evening, and late that night (or early the next morning) I was finally satisfied with all the things I could do with Excel. I decided to share the script and the struggles with you. I couldn't find one place that had all these customizations together with creating the Excel file, so this might save you a lot of time when trying to reach the same goal.

Goal

My goal is to freeze the top row, make its text bold at the very least (depending on how hard setting the font properties turns out to be), and set an autofilter on the top row. And of course to fill the Excel workbook with data. I also want to autofit each column's width to its text.

In real life I use it to document things (who likes documenting, and why document things yourself if you can have the computer do it for you?). In this example I'll use it to write the output of the Get-Process command: a list of processes, process IDs, user names and sessions (the CPU object is also documented in the example).
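As a minimal sketch (only the data-gathering part; these are standard Get-Process output properties):

# Requires an elevated session because of -IncludeUserName (see below)
Get-Process -IncludeUserName | Select-Object Name, Id, UserName, SessionId, CPU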

Writing the script

With the Get-Process command including the IncludeUserName parameter in my script, I need to run it as administrator. There are several ways to achieve this; I found these two options on the net: Start script as administrator & Check if user is administrator (nearly the last comment, from MOW).

Combining those two gets me to these lines of code (check whether the user started the script as administrator & check whether the user can be an administrator at all. If the script wasn't started as admin and the security settings allow it, it'll restart the script as admin; otherwise it'll give an error that you need to be an admin to be able to run this script):

# Is the current process already running elevated?
$IsAdmin = ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator")
# Is the current user a member of the local Administrators group at all?
$CanBeAdmin = ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]"$env:USERNAME").IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator")

If(!($IsAdmin))
{
   If($CanBeAdmin)
   {
       $Arguments = "& '" + $MyInvocation.MyCommand.Definition + "'"
       Start-Process powershell -Verb runAs -ArgumentList $Arguments
   }
   Else
   {
       Write-Error "You have to be an administrator to run this script" -ErrorAction Stop
   }
   Break
}

These lines go at the top of the script. The script breaks off if the user isn't an admin.

If you live in a non-US country (which is true for more people than actually live in the US, but OK), your CurrentCulture settings will probably get you into trouble when using the Excel COM object. The following error will probably appear on many of your Excel COM object calls:

Exception calling "<name>" with "<int>" argument(s): "Old format or invalid type library. (Exception from HRESULT: 0x80028018 (TYPE_E_INVDATAREAD))"
At line:<int> char:<int>
+ <PSObject>.<call> <<<< ()
   + CategoryInfo         : NotSpecified: (:) [], MethodInvocationException
   + FullyQualifiedErrorId : ComMethodTargetInvocation

Thus we'll have to add this line of code to set the CurrentCulture to en-US:

[System.Threading.Thread]::CurrentThread.CurrentCulture = "en-US"

Now we can start with the Excel code. In my example I delete the Excel file if it already exists before creating a new one. Because of that, the Open & Save variant of the code is obsolete (I Create & SaveAs instead), but I left it in there as a reference. Then came the struggle of setting the top-row defaults.
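For context, a minimal sketch of that setup (the path and variable names here are examples, not necessarily those from my script):

# Example output path; removing an earlier version lets us Create & SaveAs
$Path = "$env:USERPROFILE\Documents\Processes.xlsx"
If (Test-Path $Path) { Remove-Item $Path -Force }

# Start Excel through COM with a fresh workbook and grab the first sheet
$Excel = New-Object -ComObject Excel.Application
$Excel.Visible = $False
$Workbook = $Excel.Workbooks.Add()
$sheet = $Workbook.Worksheets.Item(1)

# ... fill the sheet with data, apply the formatting below, then:
$Workbook.SaveAs($Path)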

The bold part was pretty easy and straightforward:

$Range = $sheet.UsedRange              # every cell that contains data
$Range.Font.ColorIndex = 11            # dark blue in the default palette
$Range.Font.Bold = $True

Locking the top row was somewhat harder, but I stumbled upon the solution while searching for how to set the autofilter. Thanks, rwskas on Experts-Exchange.

# Split the window below row 1 (and after no columns), then freeze the panes
$sheet.Application.ActiveWindow.SplitColumn = 0
$sheet.Application.ActiveWindow.SplitRow = 1
$sheet.Application.ActiveWindow.FreezePanes = $true

With the top row locked, all that's left is setting the autofilter, which proved to be pretty hard information to find. Microsoft documents the Range.AutoFilter method, but that still left me with questions on how to call it from PowerShell. I had a C# example from Stack Overflow, and then found a nearly correct way to use it on powershell.org (thanks, Tim Pringle). I still needed to know how to get the 'missing type', and what Excel.XlAutoFilterOperator.xlAnd from the C# example mapped to (probably the 1 in the PowerShell example). While searching for the XlAutoFilterOperators I stumbled upon a TechNet post with an answer by Boe Prox (thank you!) that supplied the missing type, so I could try it. And ... it works! The 1 is indeed xlAnd (the default value), as we thought.
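For reference, the missing type from Boe Prox's answer is the COM placeholder for optional arguments you want to leave out:

# Placeholder value for optional COM arguments we don't want to supply
$MissingType = [System.Type]::Missing

With that defined, this gives us the following code (field 1, operator 1 = xlAnd, everything else left at 'missing'):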

$Range.AutoFilter(1, $MissingType, 1, $MissingType, $MissingType) | Out-Null

All that's left now is refitting the column widths based on their text (pretty straightforward):

$Range.EntireColumn.AutoFit() | Out-Null

Then I added an extra line of code so my example file won't contain my user name.

If($Process.UserName -eq "$env:USERDOMAIN\$env:USERNAME") { $Process.UserName = "PowershellAdministrator" }

In the end, I quit Excel, release its COM object and remove the Excel variable used in the script.

$Excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel) | Out-Null
Remove-Variable -Name Excel -Scope Global -EA 0
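One hedged addition that is not part of the original script: in my experience Excel.exe can keep running even after ReleaseComObject, and a garbage-collection pass usually clears the last lingering references:

# Optional, common workaround to let the Excel.exe process exit
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()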

And that's it. Now you're automatically writing to Excel. In the end the code isn't that hard to understand, but getting there can be a pain sometimes. So I hope this'll help someone trying to reach the same goal(s) as I did.

The script and Excel example file

The Excel example file can be found here, the PowerShell script here.

Happy scripting.

Download SolarEdge solar production data and save to csv

I've got a nice solar panel setup on my roof, which uploads its data to the SolarEdge monitoring portal (SolarEdge being the brand of inverter that I got). It turns out this monitoring portal also has an API to automate retrieving the energy data. Since I work at a company that does energy savings, monitoring, consultancy and so on, it was a logical step for me to automate downloading the production of my panels, so my colleagues at work can import this data into our energy monitoring portal and do their magic calculations on it, which results in me getting nice graphs of my solar energy production.

The purpose of the tool:

I wrote this tool to be somewhat smart about what to download and what not. I only want to download the information if it's up to date. Next to that, I want daily values, but also values at a 15-minute interval. In the end, all the info is exported to CSV files containing one month and/or one year of data. Files that have already been generated don't need their data downloaded and overwritten again, so the script skips that data once it's been downloaded. Finally, I can either mail these files or put them on a file share, so they can be imported by our energy monitoring system; that last step I've removed from the script that I share with you.

Several nice ‘techniques’ used in this script to get to the goal:

Since I'm talking to an API, the most important command is Invoke-WebRequest with a pipe to ConvertFrom-Json.
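A minimal sketch of such a call (endpoint and parameters as in the public SolarEdge monitoring API; $SiteId and $ApiKey are placeholders you'd fill in):

# Request daily production values for one month and convert the JSON reply
$Url = "https://monitoringapi.solaredge.com/site/$SiteId/energy" +
       "?timeUnit=DAY&startDate=2015-01-01&endDate=2015-01-31&api_key=$ApiKey"
$Data = (Invoke-WebRequest -Uri $Url).Content | ConvertFrom-Json
$Data.energy.values   # one date/value pair per day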

To get the last day of the month, the following line is used: $LastDayOfTheMonth = ((Get-Date -Date "01-$Month-$Year" -Hour 0 -Minute 0 -Second 0).AddMonths(1).AddSeconds(-1)).Day

(this adds one month to the date/time 01-<month>-<year> 0:00 and then subtracts one second to land on the last second of the previous month, thus returning the last day of that month (<last day>-<month>-<year> 23:59:59))

In the end, the downloaded data is converted to CSV with this command: | ConvertTo-Csv -Delimiter ";" -NoTypeInformation (with the Netherlands localization, the semicolon is a better separator than a comma, since Excel there expects a semicolon; this saves me time opening the CSV files and converting the information to an Excel-readable format. Depending on your localization settings, you'd want to change this accordingly.) I also add the -NoTypeInformation switch, since I'm not interested in information about the variable type PowerShell used; I only need the data.

Script parameters and bounds:

The script checks the SolarEdge portal for the date/time at which the installation received its latest data. If that date is not today, something might be wrong; the script warns you about it, but still downloads. If the data is more than one day out of date, it raises an error about this, but still continues. When no data has been downloaded yet, the tool downloads all data (starting at the installation date of the inverter) per month and saves it to CSV files. It also generates files with yearly data. Data is downloaded up to and including the previous month (unless that is later than the date the last data was received on the SolarEdge monitoring portal; in that case the last-data date is used).

For the daily values (the monthly and yearly CSVs), the tool only stores values that are not $null ($null meaning: no data received on the portal, or the solar panels with their optimizers not yet initialized by the inverter as a whole). For the 15-minute interval values, all $null values are replaced by 0,0 (in that case the inverter is turned off in its night mode); this gives a nice list of 96 readings each day, resulting in about 2688-2976 values per file, depending on the number of days in the month. (This could be checked for if needed; in my case I don't, since work already has comprehensive tools to check for gaps or strange behaviour in data.)
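As an illustration of that null handling (a sketch reusing $Data from the API example above, not the exact code from the script):

# Replace $null readings (inverter in night mode) with a zero value,
# written as 0,0 to match the Dutch, semicolon-delimited CSV output
$Readings = $Data.energy.values | ForEach-Object {
    If ($null -eq $_.value) { $_.value = '0,0' }
    $_
}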

The script can be found and downloaded here: DownloadEnergyData.ps1


Edit: I got an undocumented API call parameter from a SolarEdge developer, which gives me 15-minute values in Wh 🙂 The script has been updated. (&TimeUnit=QUARTER_OF_AN_HOUR)

Powershell application – Get information from a webpage, parse it and show the results

Today I created a nice little ‘application’ in PowerShell.

In The Netherlands we have a site called Marktplaats (www.marktplaats.nl), which is some sort of combination between eBay and Craigslist. It is one of the biggest sites in The Netherlands, but I was missing some search options on it, and I thought they'd started showing too many ads. Thus I decided to try something in PowerShell to make my life somewhat easier and, while I'm at it, also implement the things that I think the site is missing.

One of the main things I was missing was a wide search pattern that I could fine-grain down to the things I'd like it to return (i.e. searching for all free items, but excluding sand, cats, yellow stones, etc.).

The second thing I'm missing is an alert function. I'd like to receive an alert when something I'm searching for gets placed on the site (and when it also matches the from and to price I'm willing to pay for the item).

The third thing, as mentioned before: they started loading the site with ads. At the moment it's about one ad for every two items on the site. This is way too much and decreases the fun I have browsing through secondhand items.

The fourth thing: it appears that the site restricts the distance radius I can set. Apparently their search function itself isn't limited by those radiuses, so I can offer an app which can search within any distance.


As you can see in the screenshot, there are no more ads 🙂

I still need to program the second feature I want (it's harder than I thought it would be), but I did already style the app in the same template style as the site and its logo. It nicely filters out all ads, and it also filters out the items I don't want in my search results.


Things I stumbled upon:

When resizing the form, the objects on the form didn't resize accordingly and stayed in the same place.

This could be solved by using a SplitContainer on my form (something that splits the form into two parts); the SplitContainer already has built-in resizing behavior. By setting its anchors, I was able to do the same for all the objects on the form and have them resize when I resize my form! 🙂

$SplitContainer1 = New-Object System.Windows.Forms.SplitContainer
$SplitContainer1.Anchor = ([System.Windows.Forms.AnchorStyles]([System.Windows.Forms.AnchorStyles]::Top -bor [System.Windows.Forms.AnchorStyles]::Bottom -bor [System.Windows.Forms.AnchorStyles]::Left -bor [System.Windows.Forms.AnchorStyles]::Right))

Getting the search result was pretty easy; parsing the info was a little bit harder.

$WebUrl = "http://www.marktplaats.nl/z.html?..."   # search query string shortened here
$page = Invoke-WebRequest $WebUrl
$html = $page.ParsedHtml    # assumption: the original assigns the parsed DOM to $html
$html.getElementsByTagName('tr') | ? { (($_.className -eq 'search-result defaultSnippet group-0') -or ($_.className -eq 'search-result defaultSnippet group-1')) } | % {
    # ... parse each result row here
}

After which I could do the 'magic' with everything it returned. Thanks to their programmers being a little bit lazy, parsing the text was easier than expected, and I could even change it so I'd have to write less code. One of the things I added is an 'open in new window' action for each hyperlink, so the system's default browser opens when I want to view the product itself, and I'm not stuck with the browser object in my tool (which I now don't need to extend to support more features and/or show buttons, etc.).
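To give an idea of that hyperlink trick (a sketch; the site's real markup and the tool's exact replacement are more involved, and $RowHtml is a hypothetical variable):

# Hypothetical simplification: make each result link open outside the embedded browser
$RowHtml = $RowHtml -replace '<a href="', '<a target="_blank" href="'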

In the beginning the script didn't perform that well. Changing the replacement texts into a regex replacement sped up the process somewhat, but changing the function that modifies the returned text (and checks it for certain values) to use Select-String sped up the entire process a lot. Right now it takes about 9-11 seconds (depending on the type of system and hardware) to get a 3.5 MB result from the website, parse it, and show the parsed results (which are saved into a temp file of about 100 KB). All in all I'm pretty happy with how it turned out and its overall look and feel; the same goes for the speed and ease of use. Another great app/script to add to my collection 🙂

The next feature to be added will be checking for new items and alerting me when one is added that matches my search criteria.

Analyze Robocopy log files and mail the result

A long, long time ago (back in 2008) I created a VBS script that would analyze a folder filled with Robocopy logs and, once done analyzing and creating a summary, mail the result to me.

A couple of days ago I decided it was time to port this script to PowerShell. Once I was finished, I still wasn't satisfied with its speed (neither was I with the VBS version of the script, but since it ran nightly I didn't bother changing it). While I was working on it anyway, I thought I'd completely rewrite it, increase its speed some, and make it more versatile, so I can use it in more situations with the notification settings each specific situation needs. Since Microsoft now ships Robocopy with the OS by default, I'll be using it a lot more myself. I also checked whether others had created something similar; I found some, but none could compete with what I had just created and/or had the versatility I needed, so I thought I'd share it.

I tested this script on PowerShell 2.0 and 4.0, so it should also work with 3.0. If not, please contact me and I can update/change it (please also include the error).

OK, so what does this script do? It reads the contents of all Robocopy log files in the specified path; if you add -Recurse, it also includes subfolders. Depending on your settings, it creates a summary file covering either all found log files, only the ones with errors and warnings, or only the ones with errors. Depending on whether you want warnings and/or errors, it adds detailed information about the errors it found in the log files; the detailed error information is placed at the top of the summary file. The output file is placed in the directory the script is started from. Depending on your settings it sends an e-mail report with a summary of the analyzed logs, with the summary file (with detailed error and/or warning info) attached.
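A usage sketch (the -Path, -Recurse, -Summary and -MaxLinesPerFile parameter names are inferred from this post; see the Get-Help lines below for the authoritative syntax):

# Full summary of all logs under D:\RobocopyLogs, including subfolders,
# with at most 500 detail lines per log file
.\AnalyzeRobocopyLogs.ps1 -Path 'D:\RobocopyLogs' -Recurse -Summary SummaryAll -MaxLinesPerFile 500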

After I finished creating the script, I also added detailed help information and tested its speed.

Reading 110 files (95.8 MB), containing 7 errors in 4 files and 402924 warnings in 4 files;
Running time: 2m8s
Tested with the option SummaryAll (maximum output) and a MaxLinesPerFile of 500 (thus 2007 lines to write: 500 per warnings file)

Same conditions as above, but tested with SummaryAllWithoutWarnings (thus only 7 lines to write)
Running time: 20s

The Robocopy settings I use (I also use this to make copies of files that may be open by users in their desktop sessions; that's why I exclude the ~$ temporary Excel files in this script, as you'd otherwise get false-positive errors):

robocopy "<source>" "<target>" /XD "<folder>" /XF "~$*.xls" "~$*.xlsx" "~$*.xlsm" /E /V /NP /LOG:"<Path\Logfile>" /R:2 /W:2

Eager to try it out yourself? The complete PowerShell script can be found here: AnalyzeRobocopyLogs.ps1

Use Get-Help .\AnalyzeRobocopylogs.ps1 to get the default help information about the script, and Get-Help .\AnalyzeRobocopylogs.ps1 -Full to get detailed information about the script.

If you use this script, change it, or share it with others, please leave me a comment and/or link to my blog.

The E-mail message will look something like this:

AnalyzeRobocopyLogs MailMessage


Edit: Updated the script. It appeared that PowerShell v2 had an issue with the summary text. This has been fixed in the AnalyzeRobocopyLogs v2.1 script.