Archive | Powershell scripts

Stop/Restart/Pause OneDrive for business

I had an issue with OneDrive for Business, which I use to sync a SharePoint folder to my local machine.

On this machine I like to do some automatic cleanup with my MoveOldFiles script (a script that moves all files in certain folders into subfolders in the yyyy\MM format, zips them with 7za.exe and, once they are zipped successfully, deletes the source files). While doing so, OneDrive kept syncing the changes, and the zipping and deleting would sometimes cause sync errors. I wanted this to be more robust and stable, without OneDrive crashing or starting to consume a lot of memory.

While trying to find out how to pause OneDrive through a script, I couldn't find any solutions, but some pointed to a SkyDrive /shutdown option. OneDrive for Business uses Groove nowadays instead of SkyDrive, but this argument switch was kept alive.

So all I had to do was add two lines of code to my script.
One line to shut down OneDrive (the installation can usually be found in C:\Program Files\Microsoft Office 15\root\office15):

Groove.exe /shutdown

And one to restart it:

Groove.exe

And that's all there is to it: this way OneDrive is closed during my automatic cleanup sessions and is restarted after everything is done, after which it starts syncing all the changes.
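Pulled together, the relevant part of my cleanup script looks roughly like this (the sleep and the exact path are assumptions on my part; adjust them to your installation):

$GroovePath = "C:\Program Files\Microsoft Office 15\root\office15\Groove.exe"

& $GroovePath /shutdown    # stop OneDrive for Business before the cleanup starts
Start-Sleep -Seconds 10    # give Groove.exe a moment to exit (my own addition)

# ... the MoveOldFiles cleanup and zipping happens here ...

& $GroovePath              # start OneDrive for Business again; it will pick up and sync the changes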

 

Open Excel file and run macro

I needed a way to open an Excel file, run one of the macros inside the file to do some calculations and import some data, and then save the file as a new file with the name of the previous month and its year added to it, to differentiate it from the original file.

Make sure macros are enabled within Excel; otherwise they'll be disabled when the Excel file is opened, which will result in the script not working.

When working with Excel and PowerShell while living outside of the US, make sure you set the CurrentCulture to en-US; otherwise you'll get several Excel errors.
I first save the CurrentCulture setting in a variable, so I can change it back after the script finishes. So at the beginning of the script I have these lines:

$CurrentCulture = [System.Threading.Thread]::CurrentThread.CurrentCulture.Name

[System.Threading.Thread]::CurrentThread.CurrentCulture = "en-US"

And after the script finishes, I’ll set it back like this:

[System.Threading.Thread]::CurrentThread.CurrentCulture = $CurrentCulture
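A variation I sometimes use (not how the downloadable script does it, just a sketch): wrapping the Excel work in try/finally, so the culture is restored even when the script throws an error halfway through.

$CurrentCulture = [System.Threading.Thread]::CurrentThread.CurrentCulture.Name
[System.Threading.Thread]::CurrentThread.CurrentCulture = "en-US"
try
{
    # ... the Excel work from below goes here ...
}
finally
{
    # always runs, even after an error
    [System.Threading.Thread]::CurrentThread.CurrentCulture = $CurrentCulture
}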

To get the name of the month and the year, use the following two lines. I like to have the month name start with a capital letter, so I add a third line of code:

$dateTextMonth = Get-Date -Date (Get-Date).AddMonths(-1) -format MMMM
$dateTextYear = Get-Date -Date (Get-Date).AddMonths(-1) -format yyyy
$dateTextMonthString = $dateTextMonth.Substring(0,1).toUpper() + $dateTextMonth.Substring(1).ToLower()
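If you prefer, the same capitalization can also be done with the culture's TextInfo (an alternative, not what the downloadable script uses):

$dateTextMonthString = (Get-Culture).TextInfo.ToTitleCase($dateTextMonth.ToLower())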

Now we can open Excel, open the file and run the macro. I’ll set the Excel save format to an Excel file with macros enabled.

Add-Type -AssemblyName Microsoft.Office.Interop.Excel
$xlFixedFormat = [Microsoft.Office.Interop.Excel.XlFileFormat]::xlOpenXMLWorkbookMacroEnabled

# $folderName and $fileName are defined earlier in the full script
$app = New-Object -ComObject Excel.Application
$app.Visible = $True
$app.DisplayAlerts = $False
$wb = $app.Workbooks.Open("$folderName\$fileName.xlsm")

$app.Run("MacroNameToRun")

$app.ActiveWorkbook.SaveAs("$folderName\$fileName $dateTextMonthString $dateTextYear.xlsm", $xlFixedFormat)
$app.Quit()

And now it’ll open Excel, open the macro enabled workbook, run the macro, save the workbook with a new name and close Excel.

Sometimes the script would be too quick, so I needed to add a Start-Sleep of 1 second to make sure the Excel file is opened in Excel before I run the macro.
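In other words, the pause sits between opening the workbook and running the macro. A sketch of that spot, with an optional COM cleanup added at the end (the cleanup is my own addition and not part of the downloadable script):

$wb = $app.Workbooks.Open("$folderName\$fileName.xlsm")
Start-Sleep -Seconds 1    # give Excel a second to finish opening the workbook
$app.Run("MacroNameToRun")

# optional: release the COM object so no Excel process keeps lingering in memory
$app.Quit()
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($app) | Out-Null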

The complete script can be downloaded here.

Create a SharePoint (document) library

Creating a SharePoint library with PowerShell (this works in both SharePoint Foundation and SharePoint Server) is a nice way to ensure all libraries within SharePoint are configured the same way.

What we'll be doing here is creating a library, disabling anonymous access on it and only allowing certain user groups access rights to it. We will create a document library with two user groups that have Contribute access rights and one group that has Full Control access rights. We will enable the quick launch link for this library and enable versioning (52 versions).

First we'll add the SharePoint PowerShell snap-in:

Add-PsSnapin Microsoft.SharePoint.PowerShell

After this we'll define our defaults. We'll be adding a help desk user group (Contribute), an administrators group (Full Control) and a users group created specifically for access rights on this library (Contribute). You'll need to edit this to match your environment.

$WebUrl = "https://url_to_sharepoint.com"
$HelpDeskUsers = "DOMAIN\HelpDeskUsersGroup"
$AdminsGroup = "DOMAIN\AdminsUsersGroup"
$LibraryUsers = "DOMAIN\LibraryUsersGroup"

After setting this, we'll get the SharePoint web application:

$SpAssignment = Start-SPAssignment
$SpWeb = Get-SPWeb -Identity $WebUrl -AssignmentCollection $spAssignment

Normally I have the LibraryUsers group named after the library, so in my code there is a Foreach loop, in case I'd like to create multiple document libraries (a sketch of that construction follows at the end of this section).

In the script below I'll create the SharePoint users group, set the library type to 'document library', remove the inheritance for the library, enable quick launch, enable versioning and remove anonymous access (I have a publicly available SharePoint site, but I don't want its document libraries to be available anonymously). Then I'll add the user groups to their specific role definition (Full Control / Contribute) and set the description for the library. After that I'll close the SharePoint connection and all is done. Here is the rest of the code:

#Create a list of user groups (needed for access rights to the library)
$AllUsers = $LibraryUsers,$HelpDeskUsers,$AdminsGroup # Contribute, Contribute, Full Control
#Create the SharePoint login for the library-specific users group
New-SPUser -UserAlias $LibraryUsers -Web $WebUrl -DisplayName $LibraryUsers
#Set the library type to Document Library
$ListTemplate = [Microsoft.SharePoint.SPListTemplateType]::DocumentLibrary
#Add the document library to the web application ($Library holds the library name and is set in the Foreach loop of the full script)
$SpWeb.Lists.Add("$Library","$Library Shared Documents",$ListTemplate)
#Get the created document library to set specific settings
$spDocumentLibrary = $spWeb.GetList("$Library")
#Remove inheritance and remove all access entries (bool copyRoleAssignments, bool clearSubscopes)
$spDocumentLibrary.BreakRoleInheritance($false,$true)
#Add the library to the quick launch (visible to people with access rights to the library)
$spDocumentLibrary.OnQuickLaunch = $true
#Enable versioning on the library (default setting: only keep major versions)
$spDocumentLibrary.EnableVersioning = $true
#Set a maximum of 52 versions (1 version for each weekly documentation, thus keeping 1 year of documentation and changes)
$spDocumentLibrary.MajorVersionLimit = 52
#Disable anonymous access to this library (you only need to set one EmptyMask, the other will automatically be set)
$spDocumentLibrary.AnonymousPermMask = [Microsoft.SharePoint.SPRights]::EmptyMask
#$spDocumentLibrary.AnonymousPermMask64 = [Microsoft.SharePoint.SPBasePermissions]::EmptyMask
#Add user (group) permissions to the library
Foreach($Users in $AllUsers)
{
    $User = Get-SPUser -Web $WebUrl | Where DisplayName -eq $Users
    If($Users -eq $AdminsGroup) { $Role = $SpWeb.RoleDefinitions["Full Control"] }
    Else { $Role = $SpWeb.RoleDefinitions["Contribute"] }
    $Roleassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($User)
    $Roleassignment.RoleDefinitionBindings.Add($Role)
    $spDocumentLibrary.RoleAssignments.Add($Roleassignment)
}
#Update the library and add a default "Documentation" folder to it
$spDocumentLibrary.Update()
$spFolder = $spDocumentLibrary.AddItem("",[Microsoft.SharePoint.SPFileSystemObjectType]::Folder,"Documentation")
$spFolder.Update()
#(In the full script, the Foreach loop over multiple libraries closes here)

$SpWeb.Dispose()

The complete script can be downloaded here.

In the script, there are several items which need to be changed so it works in your environment. First, you need to enter the location (URL) of your SharePoint environment.

$WebUrl = "https://url_to_sharepoint.com"

After that, the three user groups need to be changed to fit your needs. The variables in which they are stored in the script are $HelpDeskUsers, $AdminsGroup and $LibraryUsers.
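To give an idea of the Foreach construction mentioned earlier, this is roughly how the creation code can be wrapped to handle multiple libraries (the library names and the group naming convention here are just examples, the full script may differ):

$Libraries = "Projects","Contracts"              # example library names
Foreach($Library in $Libraries)
{
    $LibraryUsers = "DOMAIN\${Library}Users"     # users group named after the library
    # ... the library creation code from above goes here ...
}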

Backup to zip

I wanted to have a tool that creates a backup to a zip file, but I didn't want the zip file to be recreated every time (as it can take nearly a day for some backups I wanted to make), so I needed something that synchronizes to the zip file and removes items from the zip archive if they are removed from disk. If I restore a backup and the folder I want to back up contains an older file than the zip archive does, the folder with the older file should be leading and the file in the zip archive needs to be overwritten.

I decided the best way to go would be the command-line version of 7-Zip (7za.exe), which you can download from their SourceForge page. Be warned that splitting into volumes doesn't work in this case, because the update-to-zip function won't work then.

How to use this tool with the synchronize options, I found here. But as that treats newer files as leading, I needed to tweak it a little bit, so the folder would be the leading factor.

I also needed to exclude the folder called "backups", and I don't want the other zip files I created backed up into this zip file, so I need to exclude all files whose names start with "backup" as well. As I like to create a log file too, I also include the date. (I could have used Start-Transcript as well, which I usually use and believe is a better way to create log files; I don't know why I didn't use it this time.)

Get-Date | Out-File D:\Data\Backup.log
& "$PSScriptRoot\7za.exe" u -tzip "D:\Data\Backup.zip" -up1q0r2x2y2z1w2 D:\Data\* -mx7 '-xr!Backups' '-xr!Backup*.*'  >> D:\Data\Backup.log
Get-Date | Out-File D:\Data\Backup.log -Append
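For reference, this is how I read the switches, going by the 7-Zip documentation for -u (double-check this against the documentation of your own 7-Zip version):

# -tzip            : update a zip-format archive
# -up1q0r2x2y2z1w2 : update rules; each letter is a file state, each digit the action to take
#    p1 : in archive, not matched by the wildcard      -> keep the archived copy
#    q0 : in archive, no longer on disk                -> drop it from the archive
#    r2 : on disk, not yet in archive                  -> compress it into the archive
#    x2 : archived copy newer than the file on disk    -> still take the file from disk (disk is leading)
#    y2 : archived copy older than the file on disk    -> take the file from disk
#    z1 : archived copy identical to the file on disk  -> keep the archived copy
#    w2 : can't tell which one is newer                -> take the file from disk
# -mx7             : compression level 7
# -xr!Backups, -xr!Backup*.* : recursively exclude the Backups folder and files starting with "Backup"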

Exporting all scheduled tasks

I wanted to have an export of all scheduled tasks ready for all servers that I manage, so I created a little PowerShell tool that does exactly that.

It appeared to be pretty simple to reach this goal, as there is an Export-ScheduledTask cmdlet in PowerShell.

This is what I used, but I wanted an export of all my own scheduled tasks, without the default ones from Microsoft, so I needed to filter a little bit.
There are two lines of code to filter out the unwanted tasks. First I remove the task folders for Microsoft and OfficeSoftwareProtection. Then, while running through the remaining tasks to save the export (XML) of each task, I check whether the task name contains the text "User_Feed_Synchronization" or "Optimize Start Menu Cache Files"; if it doesn't, the task is exported.

(Get-ScheduledTask).TaskPath | Where { ($_ -notmatch "Microsoft") -and ($_ -notmatch "OfficeSoftware") } | Select -Unique
If(($TaskName -match "User_Feed_Synchronization") -or ($TaskName -match "Optimize Start Menu Cache Files"))

Throughout the run, the tool outputs information to its log file. The files are saved in subdirectory names which correspond to the scheduled task folder they are found in.

This is the complete code for the tool:
$LogFile = "D:\Data\Logging\ExportScheduledTasks.log"
$BackupPath = "D:\Data\Tasks"
#Get the task folders, excluding the default Microsoft and OfficeSoftwareProtection folders
$TaskFolders = (Get-ScheduledTask).TaskPath | Where { ($_ -notmatch "Microsoft") -and ($_ -notmatch "OfficeSoftware") } | Select -Unique
Start-Transcript -Path $LogFile
Write-Output "Start exporting of scheduled tasks."

#Start with an empty backup folder
If(Test-Path -Path $BackupPath)
{
    Remove-Item -Path $BackupPath -Recurse -Force
}
md $BackupPath | Out-Null

Foreach ($TaskFolder in $TaskFolders)
{
    Write-Output "Task folder: $TaskFolder"
    If($TaskFolder -ne "\") { md $BackupPath$TaskFolder | Out-Null }
    $Tasks = Get-ScheduledTask -TaskPath $TaskFolder -ErrorAction SilentlyContinue
    Foreach ($Task in $Tasks)
    {
        $TaskName = $Task.TaskName
        If(($TaskName -match "User_Feed_Synchronization") -or ($TaskName -match "Optimize Start Menu Cache Files"))
        {
            #Skip these default maintenance tasks
        }
        Else
        {
            $TaskInfo = Export-ScheduledTask -TaskName $TaskName -TaskPath $TaskFolder
            $TaskInfo | Out-File "$BackupPath$TaskFolder$TaskName.xml"
            Write-Output "Saved file $BackupPath$TaskFolder$TaskName.xml"
        }
    }
}

Write-Output "Exporting of scheduled tasks finished."
Stop-Transcript
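Should you ever need to restore one of the exported tasks, the XML can be imported again with Register-ScheduledTask. A minimal sketch (the task and folder names here are just examples; some tasks may also need a user name and password):

$Xml = Get-Content "D:\Data\Tasks\MyTaskFolder\MyTask.xml" | Out-String
Register-ScheduledTask -Xml $Xml -TaskName "MyTask" -TaskPath "\MyTaskFolder\"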

It has been a while…

It's been a while since I posted anything on this site. At work I haven't been able to work much with PowerShell, as other activities needed my time. On top of that I've got a herniated disc in my lower back, which was (and is) very painful and made me unable to sit and use my computer.

The few things I did do in PowerShell were too specific to my work to post about. I hope I'll be able to post some more projects at the end of the year; I've still got many I'd like to do. I see traffic to my blog is still increasing, and I'm happy about that. Thank you all for visiting and reading my posts. I hope I can help you with your projects or help you learn how to use PowerShell.

One of the things I'm planning is rewriting my AnalyzeRobocopyLogs script, to remove some bugs and use a lot fewer lines of code. Stay tuned for that.

Use text to speech to read the hardest poem in the English language

Some years ago I came across one of the hardest poems in the English language, and since I've been playing with the text-to-speech engine I decided to let text-to-speech deal with the poem. It seemed to do very well and taught me how to pronounce some words I had never heard of.

The script runs best on Windows 8.1 and higher (PowerShell 4), but runs on PowerShell 3.0 and up (as Invoke-WebRequest was added in that version of PowerShell).

It's just a simple script that gets the installed voices, from which you can select one. It downloads and parses the poem from the page mentioned above, then loops through it line by line, writing each line to the screen and having text-to-speech say it. I skip speaking the first and last line of text, as I only want the poem itself to be read.
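The voice handling and the reading loop boil down to something like this (a simplified sketch, not the full script; $PoemLines stands in for the parsed poem and the voice name is just an example):

Add-Type -AssemblyName System.Speech
$Speak = New-Object System.Speech.Synthesis.SpeechSynthesizer

# show the installed voices and select one of them
$Speak.GetInstalledVoices() | ForEach-Object { $_.VoiceInfo.Name }
$Speak.SelectVoice("Microsoft Zira Desktop")

# read the poem line by line ($PoemLines holds the parsed lines of the poem)
Foreach($Line in $PoemLines)
{
    Write-Output $Line
    $Speak.Speak($Line)
}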

You can find the script here: ReadEnglishPoem.ps1

 

Text to speech alarm clock with powershell

Thanks to a blog by Jefferey Hicks about an alarm clock in PowerShell using the PC speaker, I thought it would be nice to tweak that script a little and use Microsoft's text-to-speech engine, so you can let a voice wake you through your headphones or speakers.

To enable the engine, I believe you need .NET 4 on your system and PowerShell version 2 or higher. But if you use Windows 8.1 or higher (thus PowerShell 4 or higher), the engine is a lot better and will talk with a reasonably natural voice. All you need is two lines of code to reference the text-to-speech engine and create an object.

Add-Type -AssemblyName System.speech
$Speak = New-Object System.Speech.Synthesis.SpeechSynthesizer

Now you can use that object to say something with $Speak.Speak("Something"), to which you can pass either a string or an array.

All the credit for this script goes to Jefferey Hicks (as mentioned above); I only replaced the PC speaker part of his script with the code for the text-to-speech engine. Maybe some of you will recognize the reference in the first part of what's being said… and I guess that a lot of you will know the last line.

You can find my version of the script here: Alarm.ps1

Just run it with the -NapTime parameter and the number of minutes to count down from.
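To give an idea of what that looks like, here's a rough sketch of such a countdown (not the original script, just the general idea):

Param([int]$NapTime = 10)

Add-Type -AssemblyName System.Speech
$Speak = New-Object System.Speech.Synthesis.SpeechSynthesizer

For($i = $NapTime; $i -gt 0; $i--)
{
    Write-Output "$i minute(s) to go"
    Start-Sleep -Seconds 60
}
$Speak.Speak("Time to wake up!")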

Get current cpu usage for a process

I wanted to get the current CPU usage for certain processes, which appeared to be harder than I thought.

The Get-Process command will give you the CPU time and the program's start time, but it will not tell you when the CPU was actually being used by the process. So if you use that information to calculate the CPU usage, you get the average CPU usage of the process since it started. (If it started at 100% CPU usage and has been busy ever since, it will show as 100%; but software that was started days ago and has only been using 100% for the past 10 minutes, without doing much before that, will show up as using 0.05% or so, since that is its average usage.) If you're interested in that information, Powershell.com explains how to calculate it. It's not what I want to know, though, so I started searching for another way.

WMI has many possibilities, but most of my scripts that lag, lag because of WMI usage, so I try to use it only if I can't find another way.

I finally got to the Get-Counter command, which reads Performance Counters. I'd like to get the '% Processor Time' information, so I check this with the following command (add the * to get all instances of the process):

Get-Counter "\Process(Chrome*)\% Processor Time"

This gives me a list of all process instances and their current CPU usage.

If you want to take multiple samples, this can also be specified (just like the sample interval):

Get-Counter "\Process(Chrome*)\% Processor Time" -SampleInterval 1 -MaxSamples 5

At the moment I'm only interested in getting one quick sample (this script will be integrated into a monitoring tool), so I'll skip these switches. But to explain them: the SampleInterval switch sets the interval between two measuring samples (the default is 1 second), and the MaxSamples switch gives you a set number of samples (the default is 1 sample).

The output is not quite what I want; I only want the CounterSamples. This is achieved with this command:

(Get-Counter "\Process(Chrome*)\% Processor Time").CounterSamples

This gives me the following output (with a lot of open tabs in Chrome):

[Screenshot: CounterSamples output showing the current CPU usage per Chrome instance]

But now there's the problem that the name used in the Performance Counters isn't the same as in the process list, nor do I know the process ID. Let's solve this with the Get-Counter command as well. This command shows me the process IDs:

(Get-Counter "\Process(Chrome*)\ID Process").CounterSamples

And it gives me this list of IDs:

[Screenshot: CounterSamples output showing the process ID per Chrome instance]

Now it's time to start working with the data. I start by putting the data into variables, then I check whether the two lists are the same size. Once that's done, I store the data I want and return the list, sorted with the process with the highest CPU usage on top:

Param([parameter(Mandatory=$True)][string]$Process)
#Get the process IDs and CPU counters, sorted on Path so both lists line up
$ProcessId = (Get-Counter "\Process($Process*)\ID Process").CounterSamples | Sort-Object Path
$ProcessCPU = (Get-Counter "\Process($Process*)\% Processor Time").CounterSamples | Sort-Object Path,CPU
$TotalInfo = @()
If($ProcessId.Count -eq $ProcessCPU.Count)
{
    For($i = 0; $i -lt $ProcessId.Count; $i++)
    {
        #Extract the instance name (the part between parentheses) from the counter path
        $TempInfo = $ProcessId[$i].Path.Split("\")[3].Substring($ProcessId[$i].Path.Split("\")[3].IndexOf("(") + 1)
        $TempInfo = $TempInfo.Substring(0,$TempInfo.IndexOf(")"))
        #Build an object with the counter name, process ID and current CPU usage
        $ProcessInfo = New-Object psobject
        Add-Member -InputObject $ProcessInfo -MemberType noteproperty -Name "CounterName" -Value $TempInfo -Force
        Add-Member -InputObject $ProcessInfo -MemberType noteproperty -Name "Id" -Value $ProcessId[$i].CookedValue -Force
        Add-Member -InputObject $ProcessInfo -MemberType noteproperty -Name "Cpu" -Value $ProcessCPU[$i].CookedValue -Force
        $TotalInfo += $ProcessInfo
    }
}
Return $TotalInfo | Sort-Object Cpu -Descending
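Saved as a script (the file name here is just an example), it can be called like this:

.\Get-ProcessCpuUsage.ps1 -Process Chrome

Note that the '% Processor Time' counter is measured per core, so a busy multi-threaded process can report values above 100 on a multi-core machine.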

And this will give me a nice list like this:

[Screenshot: the resulting list with CounterName, Id and Cpu, sorted by CPU usage]

Oh, and if you're interested in getting the total CPU load of a system, that is also pretty easy with the Get-Counter command. Powershell.com describes this as well, but it can be done with this command:

(Get-Counter '\processor(_total)\% processor time').CounterSamples

The script can be downloaded here.

Get logged on users and sessions

There are several ways to get a list of currently logged on users on a system, but only a few return the things I'd like to know. In the case of my servers, I'd like to know which users are connected to which session.

One way to do so is by:

Import-Module RemoteDesktop
Get-RDUserSession

but this doesn't give a consistent result (on some of my 2012 R2 servers it doesn't return anything, even though users are logged on to the system and RDS is set up) and it doesn't include the session 0 user(s):

[Screenshot: Get-RDUserSession output]

WMI can also do the job, but by default it will give a lot of unnecessary information:

Get-WmiObject Win32_LoggedOnUser | Select Antecedent -Unique | %{"{0}\{1}" -f $_.Antecedent.ToString().Split('"')[1],$_.Antecedent.ToString().Split('"')[3]}

(It'll return DWM-1, DWM-2, DWM-3, IUSR, SYSTEM and more things I don't need to know or work with.) To filter out most unwanted items, you end up with a big command like:

Get-WmiObject Win32_LoggedOnUser | Select Antecedent -Unique | Where-Object { $_.Antecedent.ToString().Split('"')[1] -ne $env:COMPUTERNAME -and $_.Antecedent.ToString().Split('"')[1] -ne "Window Manager" -and $_.Antecedent.ToString().Split('"')[3] -notmatch $env:COMPUTERNAME } | %{"{0}\{1}" -f $_.Antecedent.ToString().Split('"')[1],$_.Antecedent.ToString().Split('"')[3]}

This’ll return a list of all users (including administrator), but no sessions:

[Screenshot: the filtered WMI user list, without session information]

So far the first command would be the best, but since it's pretty inconsistent whether it actually returns a list of users, I thought of 'misusing' another command for this purpose: the Get-Process command. Since every logged in user has several processes running in their name, this might be the best approach:

Get-Process -IncludeUserName | Select-Object UserName,SessionId | Where-Object { $_.UserName -ne $null } | Sort-Object UserName -Unique

This will return a list of all users, but users that are connected to multiple sessions will only show up once this way. To get the list of users based on the session, and to filter out accounts like DWM-1 to DWM-3, use the following command. The Where-Object { $_.UserName -ne $null } part makes sure the UserName field is filled when walking through the results; otherwise you'd get lots of errors about not being able to call a method on a null-valued expression.

Get-Process -IncludeUserName | Select-Object UserName,SessionId | Where-Object { $_.UserName -ne $null -and $_.UserName.StartsWith("DOMAIN OR COMPUTERNAME") } | Sort-Object SessionId -Unique

This will return an object with the UserName and SessionId for each user. Once multiple users are connected to the same session (like session 0), the filter on the domain name or computer name is needed to get the list of names you're interested in. Here is a screenshot of the result of the last command:

[Screenshot: UserName and SessionId per logged on user]

The -IncludeUserName switch was added in PowerShell 4.0, so older versions will not be able to use it. The switch also requires the command to be run elevated; otherwise you'll get an error.

In the end I will definitely use the Get-Process command to get a consistent list of logged on users on my servers.

Edit: By the way, if you're about to run this on a remote computer, wrap an Invoke-Command around it, because the -IncludeUserName switch can't be combined with the -ComputerName switch. Next to that, Get-Process with -ComputerName will report all processes as running on session 0.

So if you want to use it on a remote computer, use a command like:

Invoke-Command -ComputerName "COMPUTERNAME" -ScriptBlock { Get-Process -IncludeUserName | Select-Object UserName,SessionId | Where-Object { $_.UserName -ne $null -and $_.UserName.StartsWith("DOMAIN OR COMPUTERNAME") } | Sort-Object SessionId -Unique } | Select-Object UserName,SessionId

Note that there's an extra Select-Object at the end, because Invoke-Command adds two extra properties to the result: PSComputerName and RunspaceId. You can also add credentials and other parameters to this command, depending on your needs.
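If you need this more often, it can be wrapped in a small helper function, something like this (a sketch; the function name is mine and the name filter is something you'd adjust to your environment):

Function Get-LoggedOnUser
{
    Param(
        [string]$ComputerName = $env:COMPUTERNAME,
        [string]$NameFilter = "DOMAIN"
    )
    Invoke-Command -ComputerName $ComputerName -ScriptBlock {
        Param($Filter)
        Get-Process -IncludeUserName |
            Select-Object UserName,SessionId |
            Where-Object { $_.UserName -ne $null -and $_.UserName.StartsWith($Filter) } |
            Sort-Object SessionId -Unique
    } -ArgumentList $NameFilter | Select-Object UserName,SessionId
}

Get-LoggedOnUser -ComputerName "SERVER01" -NameFilter "DOMAIN"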