Archive | Powershell scripts

Writing to a SQL database

Last week I needed to store information in a SQL database, so I could share certain user account information with a select group of users in the helpdesk department who shouldn’t have access to the AD itself. A SQL database seemed the way to go: I can build a GridView in PowerShell for them to retrieve this information for now, while our programmers change their helpdesk tool to include it there.

I decided to update this information every 15 minutes and to clear the table every time before filling it with my data (so I don’t need to bother with deleting old users or updating changed ones). I can use Integrated Security, because the AD account that runs the script has the correct access rights to the SQL database and my SQL server is set to Windows Authentication mode.

$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Data Source=SQLSERVER\INSTANCE;Initial Catalog=DatabaseName;Integrated Security=SSPI;"
$cmd = New-Object System.Data.SqlClient.SqlCommand
$cmd.Connection = $conn
$cmd.CommandText = "TRUNCATE TABLE TableName"
#The connection must be opened before a command can be executed
$conn.Open()
If($cmd.ExecuteNonQuery() -eq -1) { Write-Host "Successfully cleared table" }
$conn.Close()

If you were to use the above code, change the SQLSERVER text to the name of your SQL server (if needed, append a backslash and the instance name, e.g. SQLSERVER\INSTANCE), change DatabaseName to your database name and change TableName to your table name.

Once the table is cleared, I get all my desired information from my AD servers and put it in the SQL database. Remember that when adding data to a SQL table, you need to specify each column name you want to fill with your data (usually all column names of the table, but columns which allow nulls can be left out if you don’t need to fill them). In my case all columns need to be filled (they are exactly the data I need to share from my AD with our helpdesk people), so I entered all my column names.

$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Data Source=SQLSERVER\INSTANCE;Initial Catalog=DatabaseName;Integrated Security=SSPI;"
$cmd = New-Object System.Data.SqlClient.SqlCommand
$cmd.Connection = $conn
$cmd.CommandText = "INSERT INTO TableName (OuName,UserName,AccountEnabled,EmailAddress,LastLogonTime,LogonCount) VALUES('{0}','{1}','{2}','{3}','{4}','{5}')" -f $ouName,$userName,$accountEnabled,$emailAddress,$lastLogonDateTime,$logonCount
$conn.Open()
If($cmd.ExecuteNonQuery() -eq 1) { Write-Host "Successfully added $ouName, $userName, $accountEnabled, $emailAddress, $lastLogonDateTime, $logonCount to the SQL database" }
$conn.Close()
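Building the INSERT with string formatting works, but a value containing a quote character will break the statement. A safer sketch, using SqlParameter objects and the same hypothetical server, table and variable names as above (shortened to two columns for brevity):

```powershell
# Parameterized insert: values travel separately from the SQL text,
# so quotes in e.g. a user name cannot break or alter the statement.
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Data Source=SQLSERVER\INSTANCE;Initial Catalog=DatabaseName;Integrated Security=SSPI;"
$cmd = New-Object System.Data.SqlClient.SqlCommand
$cmd.Connection = $conn
$cmd.CommandText = "INSERT INTO TableName (OuName,UserName) VALUES (@OuName,@UserName)"
[void]$cmd.Parameters.AddWithValue("@OuName", $ouName)
[void]$cmd.Parameters.AddWithValue("@UserName", $userName)
$conn.Open()
[void]$cmd.ExecuteNonQuery()
$conn.Close()
```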

In case you wonder how to get the logonCount and emailAddress from an AD user: those are extra properties which can be requested with the Get-ADUser command in PowerShell, like this:

Get-ADUser $userName -Properties logonCount,emailAddress

The lastLogon information isn’t returned by the Get-ADUser command, but can be retrieved through the Get-ADObject command, like this:

Get-ADUser $userName | Get-ADObject -Properties lastLogon
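Note that lastLogon comes back as a 64-bit FILETIME integer, not a readable date. A small sketch of the conversion (the variable names are mine):

```powershell
# lastLogon is stored as a Windows FILETIME (100-ns intervals since 1601)
$adObject = Get-ADUser $userName | Get-ADObject -Properties lastLogon
If ($adObject.lastLogon -gt 0) {
    $lastLogonDateTime = [DateTime]::FromFileTime($adObject.lastLogon)
} Else {
    $lastLogonDateTime = $null  # no logon recorded on this domain controller
}
```

Also keep in mind that lastLogon is not replicated between domain controllers, so for the true latest logon you’d have to query each DC (or use the replicated, but less precise, lastLogonTimestamp attribute).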

And that’s all there is to it: clearing the table and writing to SQL.

Stop/Restart/Pause OneDrive for business

I had an issue with OneDrive for business, which I need to use to be able to sync a SharePoint folder to my local machine.

On this machine I like to do some automatic cleanup with my MoveOldFiles script (a script that moves all files in certain folders to sub folders in the format yyyy\MM; after doing so it uses 7za.exe to zip the files, and once everything is zipped successfully the source files are deleted). But while it ran, OneDrive kept syncing the changes, and during the zipping and deleting sync errors could occur. I wanted this to be more robust and stable, without OneDrive crashing or starting to consume a lot of memory.

While trying to find out how to pause OneDrive through a script, I couldn’t find any ready-made solutions, but some posts pointed to a SkyDrive /shutdown option. OneDrive for business nowadays uses Groove.exe instead of SkyDrive.exe, but this argument switch was kept alive.

So, all I had to do was add two lines of code to my script.
One line to shut down the OneDrive process (the installation can usually be found here: C:\Program Files\Microsoft Office 15\root\office15):

Groove.exe /shutdown

And one to restart it afterwards, which is simply starting the executable again without any arguments:

Groove.exe

And that’s all there is to it: this way OneDrive is closed during my automatic cleanup sessions and restarted after everything is done, at which point it starts syncing all changes.
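Put together, the relevant part of the cleanup script looks something like this (the install path is the usual one mentioned above; the sleep duration and the cleanup placeholder are my own assumptions):

```powershell
# Stop OneDrive for business before touching the synced folders
$groovePath = "C:\Program Files\Microsoft Office 15\root\office15\Groove.exe"
& $groovePath /shutdown
Start-Sleep -Seconds 10     # give the process time to exit cleanly

# ... run the MoveOldFiles cleanup / zipping here ...

# Restart OneDrive so it picks up and syncs the changes
& $groovePath
```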


Open Excel file and run macro

I needed a way to open an Excel file, run one of the macros inside the file to do some calculations and import some data, and then save the file as a new file with the name of the previous month and its year added to it, to differentiate it from the original file.

Make sure macros are enabled within Excel, otherwise they’ll be disabled when opening the Excel file which will result in the script not working.

When working with Excel and PowerShell outside of the US, make sure you set the CurrentCulture to en-US, otherwise you’ll get several Excel errors.
I’ll first save the CurrentCulture setting in a variable, so I can change it back after the script finishes. So, at the beginning of the script I’ll have these lines:

$CurrentCulture = [System.Threading.Thread]::CurrentThread.CurrentCulture.Name

[System.Threading.Thread]::CurrentThread.CurrentCulture = "en-US"

And after the script finishes, I’ll set it back like this:

[System.Threading.Thread]::CurrentThread.CurrentCulture = $CurrentCulture
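To make sure the culture is restored even when the script errors out halfway, the save-and-restore can be wrapped in try/finally; a minimal sketch:

```powershell
$CurrentCulture = [System.Threading.Thread]::CurrentThread.CurrentCulture.Name
try {
    [System.Threading.Thread]::CurrentThread.CurrentCulture = "en-US"
    # ... Excel automation goes here ...
}
finally {
    # Runs even on errors, so the thread culture never stays switched
    [System.Threading.Thread]::CurrentThread.CurrentCulture = $CurrentCulture
}
```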

To get the name of the month and the year, use the first two lines below. I like the month name to start with a capital letter, hence the third line of code:

$dateTextMonth = Get-Date -Date (Get-Date).AddMonths(-1) -format MMMM
$dateTextYear = Get-Date -Date (Get-Date).AddMonths(-1) -format yyyy
$dateTextMonthString = $dateTextMonth.Substring(0,1).toUpper() + $dateTextMonth.Substring(1).ToLower()
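As an aside, the same capitalization can be done with the culture’s TextInfo, which some may find more readable; a sketch:

```powershell
$dateTextMonth = Get-Date -Date (Get-Date).AddMonths(-1) -Format MMMM
# ToTitleCase capitalizes the first letter of each word in the string
$dateTextMonthString = (Get-Culture).TextInfo.ToTitleCase($dateTextMonth)
```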

Now we can open Excel, open the file and run the macro. I’ll set the Excel save format to an Excel file with macros enabled.

Add-Type -AssemblyName Microsoft.Office.Interop.Excel
$xlFixedFormat = [Microsoft.Office.Interop.Excel.XlFileFormat]::xlOpenXMLWorkbookMacroEnabled

$app = New-Object -ComObject Excel.Application
$app.Visible = $True
$app.DisplayAlerts = $False
$wb = $app.Workbooks.Open("$folderName\$fileName.xlsm")
Start-Sleep -Seconds 1
#Run the macro (replace MacroName with the name of the macro in your workbook)
$app.Run("MacroName")
$app.ActiveWorkbook.SaveAs("$folderName\$fileName $dateTextMonthString $dateTextYear.xlsm", $xlFixedFormat)
$wb.Close()
$app.Quit()

And now it’ll open Excel, open the macro enabled workbook, run the macro, save the workbook with a new name and close Excel.

Sometimes the script would be too quick, so I needed to add a Start-Sleep of 1 second to make sure the Excel file is fully opened in Excel before I run the macro.

The complete script can be downloaded here.

Create a sharepoint (document) library

Creating a SharePoint library with PowerShell (works in both SharePoint Foundation and SharePoint Server) is a nice way to ensure all libraries within SharePoint are configured the same way.

What we’ll be doing here is creating a document library, disabling anonymous access on it and only allowing certain user groups access rights to it: two user groups with contribute rights and one group with full control. We will enable the quick launch link for this library and enable versioning (52 versions).

First we’ll add the SharePoint PowerShell snap-in:

Add-PsSnapin Microsoft.SharePoint.PowerShell

After this we’ll define our defaults: a help desk user group (contribute), an administrators group (full control) and a users group specifically created for access rights on this library (contribute). You’d need to edit these for your environment.

$WebUrl = ""
$HelpDeskUsers = "DOMAIN\HelpDeskUsersGroup"
$AdminsGroup = "DOMAIN\AdminsUsersGroup"
$LibraryUsers = "DOMAIN\LibraryUsersGroup"

After setting this, we’ll get the SharePoint web application:

$SpAssignment = Start-SPAssignment
$SpWeb = Get-SPWeb -Identity $WebUrl -AssignmentCollection $spAssignment

Normally I have the $LibraryUsers group named after the library, so in my full script there is a Foreach loop in case I’d like to create multiple document libraries.

In the script below I’ll create the SharePoint users group, set the library type to ‘document library’, remove the inheritance for the library, enable quick launch, enable versioning and remove anonymous access (I have a publicly available SharePoint, but I don’t want its document libraries to be available anonymously). Then I’ll add the user groups to their specific role definition (full control / contribute) and set the description for the library. After that I’ll close the SharePoint connection and all is done. Here is the rest of the code:

#Name of the library to create (example name; in my full script this comes from the loop)
$Library = "LibraryName"
#Create a list of users groups (needed for access rights to the library)
$AllUsers = $LibraryUsers,$HelpDeskUsers,$AdminsGroup # Contribute, Contribute, Full Control
#Create the sharepoint login for the library specific users group
New-SPUser -UserAlias $LibraryUsers -Web $WebUrl -DisplayName $LibraryUsers
#Set the library type to Document Library
$ListTemplate = [Microsoft.SharePoint.SPListTemplateType]::DocumentLibrary
#Add the document library to the web application
$SpWeb.Lists.Add("$Library","$Library Shared Documents",$ListTemplate)
#Get the created document library to set specific settings
$spDocumentLibrary = $spWeb.Lists["$Library"]
#Remove inheritance and remove all access entries (bool copyRoleAssignments, bool clearSubscopes)
$spDocumentLibrary.BreakRoleInheritance($false,$true)
#Add the library to the quick launch (visible to people with access rights to the library)
$spDocumentLibrary.OnQuickLaunch = $true
#Enable versioning on the library (default setting: only keep major versions)
$spDocumentLibrary.EnableVersioning = $true
#Set a maximum of 52 versions (1 version for each weekly documentation, thus keeping 1 year of documentation and changes)
$spDocumentLibrary.MajorVersionLimit = 52
#Disable anonymous access to this library. (Only one EmptyMask needs to be set, the other will automatically be set)
$spDocumentLibrary.AnonymousPermMask = [Microsoft.Sharepoint.SPRights]::EmptyMask
#$spDocumentLibrary.AnonymousPermMask64 = [Microsoft.Sharepoint.SPBasePermissions]::EmptyMask
#Add user (group) permissions to the library
Foreach($Users in $AllUsers)
{
    $User = Get-SPUser -Web $WebUrl | Where DisplayName -eq $Users
    If($Users -eq $AdminsGroup) { $Role = $SpWeb.RoleDefinitions["Full Control"] }
    Else { $Role = $SpWeb.RoleDefinitions["Contribute"] }
    $RoleAssignment = New-Object Microsoft.SharePoint.SPRoleAssignment($User)
    $RoleAssignment.RoleDefinitionBindings.Add($Role)
    $spDocumentLibrary.RoleAssignments.Add($RoleAssignment)
}
#Create a default folder in the library
$spFolder = $spDocumentLibrary.AddItem("",[Microsoft.SharePoint.SPFileSystemObjectType]::Folder,"Documentation")
$spFolder.Update()
#Update the library and dispose the web application object
$spDocumentLibrary.Update()
Stop-SPAssignment $SpAssignment


The complete script can be downloaded here.

In the script, there are several items which need to be changed so the script works in your environment. First you’d need to enter the location (URL) of your SharePoint environment.

$WebUrl = ""

After that, the three user groups need to be changed to fit your needs. The variables in which they are stored in the script are: $HelpDeskUsers, $AdminsGroup and $LibraryUsers.

Backup to zip

I wanted a tool that creates a backup to a zip file, but I didn’t want the zip file to be recreated every time (as that can take nearly a day for some of the backups I wanted to make). So I needed something that synchronizes to the zip and removes items from the zip archive when they are removed from disk. And if I restore a backup, so the folder I want to back up contains an older file than the zip archive does, the folder with the older file should be leading and the file in the zip archive needs to be overwritten.

I decided the best way to go would be the command-line version of 7-Zip (7za.exe), which you can download from its SourceForge page. Be warned that splitting into volumes doesn’t work in this case, because the update-to-zip function won’t work on multi-volume archives.

I found a description of how to use this tool with the synchronize options, but as that treats newer files as leading, I needed to tweak it a little so the folder on disk would be the leading factor.

I also needed to exclude the folder called “Backups”, and since I don’t want the other zip files I created backed up into this zip file, I need to exclude all files whose name starts with “Backup” as well. As I like to create a log file, I also write the date to it. (I could have used Start-Transcript, which I usually use and believe is a better way of creating log files; I don’t know why I didn’t use it this time.)

Get-Date | Out-File D:\Data\Backup.log
#The archive path (D:\Data\Backups\Backup.zip) is an example; note that it matches the exclusions below
& "$PSScriptRoot\7za.exe" u -tzip "D:\Data\Backups\Backup.zip" -up1q0r2x2y2z1w2 D:\Data\* -mx7 '-xr!Backups' '-xr!Backup*.*' >> D:\Data\Backup.log
Get-Date | Out-File D:\Data\Backup.log -Append
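7za.exe also sets an exit code (0 = no error, 1 = warning such as locked files, 2 = fatal error), so the script can check $LASTEXITCODE and log a result; a sketch, with the same example paths as above:

```powershell
& "$PSScriptRoot\7za.exe" u -tzip "D:\Data\Backups\Backup.zip" D:\Data\* >> D:\Data\Backup.log
# 7-Zip exit codes: 0 = no error, 1 = warning, 2 = fatal error
If ($LASTEXITCODE -le 1) {
    "Backup finished with exit code $LASTEXITCODE" | Out-File D:\Data\Backup.log -Append
} Else {
    "Backup FAILED with exit code $LASTEXITCODE" | Out-File D:\Data\Backup.log -Append
}
```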

Exporting all scheduled tasks

I wanted to have an export of all scheduled tasks ready for all servers that I manage, so I created a little PowerShell tool that does exactly that.

It appeared to be pretty simple to reach this goal, as there is an Export-ScheduledTask cmdlet in PowerShell.

This is what I used, but I wanted an export of all my scheduled tasks without the default ones from Microsoft, so I needed to filter a little.
There are two lines of code that filter out the unwanted tasks. First I remove the ones in the sub folders Microsoft and OfficeSoftwareProtection. Then, while running through the remaining tasks to save each export (XML), I skip any task whose name contains the text “User_Feed_Synchronization” or “Optimize Start Menu Cache Files”; all others are exported.

(Get-ScheduledTask).TaskPath | Where { ($_ -notmatch "Microsoft") -and ($_ -notmatch "OfficeSoftware") } | Select -Unique
If(($TaskName -match "User_Feed_Synchronization") -or ($TaskName -match "Optimize Start Menu Cache Files")) { Continue }

Throughout its run, the tool outputs information to its log file. The files are saved in subdirectory names which correspond to the scheduled task folder they are found in.

This is the complete code for the tool:
$LogFile = "D:\Data\Logging\ExportScheduledTasks.log"
$BackupPath = "D:\Data\Tasks"
$TaskFolders = (Get-ScheduledTask).TaskPath | Where { ($_ -notmatch "Microsoft") -and ($_ -notmatch "OfficeSoftware") } | Select -Unique
Start-Transcript -Path $LogFile
Write-Output "Start exporting of scheduled tasks."

#Remove a previous export and recreate the backup folder
If(Test-Path -Path $BackupPath)
{
    Remove-Item -Path $BackupPath -Recurse -Force
}
md $BackupPath | Out-Null

Foreach ($TaskFolder in $TaskFolders)
{
    Write-Output "Task folder: $TaskFolder"
    If($TaskFolder -ne "\") { md $BackupPath$TaskFolder | Out-Null }
    $Tasks = Get-ScheduledTask -TaskPath $TaskFolder -ErrorAction SilentlyContinue
    Foreach ($Task in $Tasks)
    {
        $TaskName = $Task.TaskName
        #Skip the remaining default Microsoft tasks
        If(($TaskName -match "User_Feed_Synchronization") -or ($TaskName -match "Optimize Start Menu Cache Files")) { Continue }
        $TaskInfo = Export-ScheduledTask -TaskName $TaskName -TaskPath $TaskFolder
        $TaskInfo | Out-File "$BackupPath$TaskFolder$TaskName.xml"
        Write-Output "Saved file $BackupPath$TaskFolder$TaskName.xml"
    }
}

Write-Output "Exporting of scheduled tasks finished."
Stop-Transcript
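Should a server ever need to be rebuilt, the exported XML files can be re-imported with Register-ScheduledTask; a minimal sketch (the file and task name are examples):

```powershell
# Re-create a task from one of the exported XML files
$xml = Get-Content "D:\Data\Tasks\MyTask.xml" -Raw
Register-ScheduledTask -TaskName "MyTask" -Xml $xml
```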

It has been a while…

It’s been a while since I posted anything on this site. At work I haven’t been able to do much with PowerShell, as other activities needed my time. On top of that I’ve gotten a lower back hernia, which was (and is) very painful and made it impossible for me to sit and use my computer.

The few things I did do in Powershell were too specific for my work to post about. I hope I’ll be able to post some more projects at the end of the year, I’ve still got many I’d like to do. I see traffic to my blog is still increasing, I’m happy for that. Thank you all for visiting and reading my posts. I hope I can help you with your projects or help you learn how to use Powershell.

One of the things I’m planning is: rewriting my AnalyzeRobocopyLogs script, to remove some bugs and use a lot less lines of code. Stay tuned for that.

Use text to speech to read the hardest poem in the English language

Some years ago I came across one of the hardest poems in the English language, and since I’ve been playing with the text-to-speech engine I decided to let it deal with the poem. It seemed to do very well, and it taught me how to pronounce some words I had never heard of.

The script runs best on Windows 8.1 and higher (PowerShell 4), but works from PowerShell 3.0 and up (as Invoke-WebRequest was added in that version of PowerShell).

It’s just a simple script that gets the installed voices, from which you can select one. It downloads and parses the poem from its web page. Then it loops through it line by line, writing each line to the screen and having text-to-speech say it. I excluded the first and last line of the parsed text from being spoken, as I only wanted the poem itself to be read.

You can find the script here: ReadEnglishPoem.ps1


Text to speech alarm clock with powershell

Thanks to a blog post by Jeffery Hicks about an alarm clock in PowerShell using the PC speaker, I thought it would be nice to tweak that script a little and use Microsoft’s text-to-speech engine, so the alarm can wake you with a voice through your headphones or speakers.

To enable the engine, I believe you need .NET 4 on your system and PowerShell version 2 or higher. But if you use Windows 8.1 (or higher; thus PowerShell 4 or higher) the engine is a lot better and will talk with a reasonably natural voice. All you need are two lines of code to reference the text-to-speech engine and create an object.

Add-Type -AssemblyName System.speech
$Speak = New-Object System.Speech.Synthesis.SpeechSynthesizer

Now you can use that object to say something with $Speak.Speak("Something"), to which you can pass either a string or an array.

All the credit for this script goes to Jeffery Hicks (as mentioned above); I only replaced the PC speaker part of his script with the code for the text-to-speech engine. Maybe some of you will recognize the reference in the first part of what’s being said, and I guess a lot of you will know the last line.

You can find my version of the script here: Alarm.ps1

Just run it with the -NapTime parameter and the amount of minutes to count down from.

Get current cpu usage for a process

I wanted to get the current CPU usage for certain processes, which appeared to be harder than I thought.

The Get-Process command will give you the CPU time and the program’s start time, but will not tell you when that CPU time was used. So if you use that information to calculate CPU usage, you’ll get the average CPU usage of the process since it started. (If it started at 100% CPU usage and has been busy ever since, it’ll show as 100%; but software that was started days ago and has only been using 100% for the past 10 minutes, without doing much before that, will show as using something like 0.05%, since that is its average.) That’s not the information I want, so I started searching for another way.

WMI has many possibilities, but most of my scripts that lag do so because of WMI usage, so I try to use it only when I can’t find another way.

I finally got to the Get-Counter command, which reads Performance Counters. I’d like to get the “% Processor Time” information, so I check this with the following command (add the * to get all instances of the process):

Get-Counter "\Process(Chrome*)\% Processor Time"

This gives me a list of all process instances and their current CPU usage.

If you want to take multiple samples, this can also be specified (just like the sample interval):

Get-Counter "\Process(Chrome*)\% Processor Time" -SampleInterval 1 -MaxSamples 5

At the moment I’m only interested in one quick sample (this script will be integrated into a monitoring tool), so I’ll skip these switches. But to explain them: the SampleInterval switch sets the interval between two measuring samples (default: 1 second), and the MaxSamples switch gives you x samples (default: 1 sample).

The output is not quite what I want; I only need the CounterSamples. This is achieved with this command:

(Get-Counter "\Process(Chrome*)\% Processor Time").CounterSamples

This’ll give me this output (with a lot of open tabs in chrome):


But now there’s the problem that the name used in the Performance Counters isn’t the same as in the process list, nor do I know a process ID. Let’s solve this with the Get-Counter command as well. This command will show me the process IDs:

(Get-Counter "\Process(Chrome*)\ID Process").CounterSamples

And this will give me the list of IDs:


Now it’s time to start working with the data. I’ll start by putting the data in variables, then check whether the lists are the same size. Once that’s done, I’ll store the data I want and return the list sorted with the highest CPU usage on top:

$ProcessId = (Get-Counter "\Process($Process*)\ID Process").CounterSamples | Sort-Object Path
$ProcessCPU = (Get-Counter "\Process($Process*)\% Processor Time").CounterSamples | Sort-Object Path
$TotalInfo = @()
If($ProcessId.Count -eq $ProcessCPU.Count)
{
    For($i = 0; $i -lt $ProcessId.Count; $i++)
    {
        #Extract the instance name from between the parentheses in the counter path
        $TempInfo = $ProcessId[$i].Path.Split("\")[3].Substring($ProcessId[$i].Path.Split("\")[3].IndexOf("(") + 1)
        $TempInfo = $TempInfo.Substring(0,$TempInfo.IndexOf(")"))
        $ProcessInfo = New-Object psobject
        Add-Member -InputObject $ProcessInfo -MemberType NoteProperty -Name "CounterName" -Value $TempInfo -Force
        Add-Member -InputObject $ProcessInfo -MemberType NoteProperty -Name "Id" -Value $ProcessId[$i].CookedValue -Force
        Add-Member -InputObject $ProcessInfo -MemberType NoteProperty -Name "Cpu" -Value $ProcessCPU[$i].CookedValue -Force
        $TotalInfo += $ProcessInfo
    }
}
Return $TotalInfo | Sort-Object Cpu -Descending

And this will give me a nice list like this:


Oh, and if you’re interested in getting the total CPU load of a system, that is also pretty easy with the Get-Counter command:

(Get-Counter '\processor(_total)\% processor time').CounterSamples
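One caveat worth knowing: the per-process “% Processor Time” counter sums over all cores, so a busy multi-threaded process can report more than 100%, while the _total processor counter is already on a 0–100 scale. A sketch of normalizing the per-process values (the rounding to two decimals is my own choice):

```powershell
# Divide by the logical core count to get a 0-100 scale comparable to Task Manager
$cores = [Environment]::ProcessorCount
(Get-Counter "\Process(Chrome*)\% Processor Time").CounterSamples |
    Select-Object InstanceName, @{Name="CpuPercent";Expression={[Math]::Round($_.CookedValue / $cores, 2)}}
```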

The script can be downloaded here.