PowerShell script to get or delete terms in MMS

Managing the Managed Metadata Service (MMS) is much easier via PowerShell. Today I had to delete hundreds of terms from an MMS term store. First, to check whether the terms really exist, I created a short script to list them.

function Get-SPManagedMetadataTerm{
[cmdletbinding(SupportsShouldProcess=$true,ConfirmImpact='High')]
param(
    [parameter(Mandatory=$true,Position=0)]$SiteUrl,
    [parameter(Mandatory=$true,Position=1)]$GroupName,
    [parameter(Mandatory=$true,Position=2)]$TermSetName,
    [parameter(Position=3,ValueFromPipeline=$true)]$TermName
    )

Begin{
    # Load the SharePoint assembly and open a taxonomy session against the site
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null
    $site = New-Object Microsoft.SharePoint.SPSite($SiteUrl)
    $taxonomySession = New-Object Microsoft.SharePoint.Taxonomy.TaxonomySession($site)
    $Group = $taxonomySession.TermStores.Groups | ?{$_.Name -like $GroupName}
    $TermSet = $Group.TermSets | ?{$_.Name -like $TermSetName}
}

Process{
    # List the term if it exists in the term set
    $Term = $TermSet.Terms | ?{$_.Name -like $TermName}
    if($Term)
    {
        $Term.Name
    }
    else
    {
        "($TermName) could not be found!"
    }
}

}

With a small modification, the script can also delete the terms it finds.

function Delete-SPManagedMetadataTerm{
[cmdletbinding(SupportsShouldProcess=$true,ConfirmImpact='High')]
param(
    [parameter(Mandatory=$true,Position=0)]$SiteUrl,
    [parameter(Mandatory=$true,Position=1)]$GroupName,
    [parameter(Mandatory=$true,Position=2)]$TermSetName,
    [parameter(Position=3,ValueFromPipeline=$true)]$TermName
    )

Begin{
    # Load the SharePoint assembly and open a taxonomy session against the site
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null
    $site = New-Object Microsoft.SharePoint.SPSite($SiteUrl)
    $taxonomySession = New-Object Microsoft.SharePoint.Taxonomy.TaxonomySession($site)
    $Group = $taxonomySession.TermStores.Groups | ?{$_.Name -like $GroupName}
    $TermSet = $Group.TermSets | ?{$_.Name -like $TermSetName}
}

Process{
    $Term = $TermSet.Terms | ?{$_.Name -like $TermName}
    if($Term)
    {
        # Delete the term and commit the change to the term store
        if($PSCmdlet.ShouldProcess("Term (" + $($Term.Name) + ") will be deleted from ($($Term.TermStore.Name + "/" + $Group.Name + "/" + $TermSet.Name))"))
        {
            $Term.Delete()
            $Term.TermStore.CommitAll()
            "Term (" + $($TermName) + ") has been deleted from ($($Term.TermStore.Name + "/" + $Group.Name + "/" + $TermSet.Name))"
        }
    }
    else
    {
        "($TermName) could not be found!"
    }
}

}

In the end, you can easily check and delete a bunch of terms like this:

$terms = Get-Content terms.txt

$terms| Get-SPManagedMetadataTerm -SiteUrl https://<SiteUrl> -GroupName <GroupName> -TermSetName <TermSetName>

$terms| Delete-SPManagedMetadataTerm -SiteUrl https://<SiteUrl> -GroupName <GroupName> -TermSetName <TermSetName> -Confirm:$false

Why was “Sign in as Different User” removed in SharePoint?

Please be careful when using the “Sign in as Different User” feature in SharePoint. The feature was removed starting with SharePoint 2013, because it allowed users who have more than one AD account to log in with a different account.

You will get an error message in the browser. In the ULS log you will find the following exception:

System.ArgumentException: Exception of type ‘System.ArgumentException’ was thrown. Parameter name: encodedValue at Microsoft.SharePoint.Administration.Claims.SPClaimEncodingManager.DecodeClaimFromFormsSuffix(String encodedValue) at Microsoft.SharePoint.Administration.Claims.SPClaimProviderManager.GetProviderUserKey(IClaimsIdentity claimsIdentity, String encodedIdentityClaimSuffix) at Microsoft.SharePoint.Administration.Claims.SPClaimProviderManager.GetProviderUserKey(String encodedIdentityClaimSuffix) at Microsoft.SharePoint.ApplicationRuntime.SPHeaderManager.AddIsapiHeaders(HttpContext context, String encodedUrl, NameValueCollection headers) at Microsoft.SharePoint.ApplicationRuntime.SPRequestModule.PreRequestExecuteAppHandler(Object oSender, EventArgs ea) at System.Web.HttpApplication.SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

This is a security issue, because the page content shows cached information from the previous user.

To avoid the issue, you have to log off and log back on in Windows, or delete the cookies and website data in your browser and re-open it, to sign in as a different user.

https://blogs.technet.microsoft.com/stefan_gossner/2012/01/19/be-careful-when-using-the-sign-in-as-different-user-feature-in-sharepoint/

https://support.microsoft.com/en-nz/help/2988406/system-argumentexception-exception-when-you-sign-out-of-a-sharepoint-2

How to connect to SPOService from behind a company proxy

Function Connect-SPOServiceBehindProxy{
[cmdletbinding()]
param(
    [parameter(Mandatory=$true,Position=0,ValueFromPipeline=$true)][string]$AdminUPN,
    [parameter(Mandatory=$true,Position=1,ValueFromPipeline=$true)][string[]]$Organization
    )
Begin{
# Pass the current Windows credentials to the default web proxy
$cred = [System.Net.CredentialCache]::DefaultCredentials
[System.Net.WebRequest]::DefaultWebProxy.Credentials = $cred
}

Process{
# Prompt for the SharePoint Online admin password and connect to the tenant admin site
$Credential = Get-Credential -UserName $AdminUPN -Message "Type the password for ($AdminUPN)."
Connect-SPOService -Url "https://$Organization-admin.sharepoint.com" -Credential $Credential
}
}
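A quick usage example; the tenant name (contoso) and the admin UPN below are placeholders for your own values:

# Hypothetical tenant and account, replace with your own
Connect-SPOServiceBehindProxy -AdminUPN admin@contoso.onmicrosoft.com -Organization contoso
Get-SPOSite -Limit All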

https://docs.microsoft.com/en-us/powershell/sharepoint/sharepoint-online/connect-sharepoint-online?view=sharepoint-ps

Updated version of ConvertTo-HtmlTable function is available

Some modifications to my ConvertTo-HtmlTable function are available here: Generate colored HTML-Table from an input PSObject

Version 2.0 – 04/05/2017

  • Simpler parameter list.
  • PropertyCheck parameter with a Pattern to ignore the threshold settings on a property and color those rows based on the CellbackgroundColor parameter, which is red by default (a rough example call is sketched after this list).
  • NotLike switch parameter for PropertyCheck.
  • Reverse switch parameter to reverse the filter comparison; the default is (-gt).
  • Improved help.
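As a rough illustration only, a call using the new parameters could look like the line below; the parameter names are taken from the list above and the values are made up, so check the published script for the authoritative syntax.

# Hypothetical call - parameter names from the changelog above, values invented
Get-Process | Select Name, Id, WS | ConvertTo-HtmlTable -Property WS -Warning 8712192 -Critical 12984320 -PropertyCheck Name -Pattern "notepad" -CellbackgroundColor red | Out-File test.html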

Move-SPSite: already exists in database

Today I wanted to move a site collection into another content database, because the customer requested raising the site collection quota to 200GB.
As you know, the suggested maximum content database size is 200GB, so we decided to allocate a new content database for it.
The Move-SPSite cmdlet timed out and the site remained in the original content database. The second attempt failed because the site collection already existed in the new database, although the site itself stayed in place.

Checking the content database showed that the site count had not incremented, but on the SQL server the content database's AllSites table contained the site collection ID.
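
To compare the two views quickly, something like the following works from the object model side (the database name is a placeholder); the AllSites table itself has to be checked on the SQL server:

# Site count the object model reports for the content database
Get-SPContentDatabase <databasename> | Select Name, CurrentSiteCount
# Site collections the object model still knows about
Get-SPContentDatabase <databasename> | Select -Expand Sites | Select Url, Id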



The easiest way to remove this site collection ID from the database is to run the following one-liner:

Get-SPContentDatabase <databasename> | %{$_.ForceDeleteSite("<siteid>",$false,$false)}

SPContentDatabase.ForceDeleteSite method

Formatted and colored HTML-table based on a filtered property

I've just published a script to generate a fully colored and formatted HTML-table output from an input PSObject.
Sometimes you need a colored report, e.g. for disk space, CPU or memory usage, to highlight warning and/or critical issues.
Here are some examples:

Get-Process | ConvertTo-HtmlTable -Property WS -Warning 1724416 -Critical 2007040 | Out-File test.html
$PSObject = Get-Process
ConvertTo-HtmlTable -PSObject $PSObject -Property WS -Warning 1724416 -Critical 2007040 | Out-File test.html


Get-Process | Sort-Object -Property WS -Descending | ConvertTo-HtmlTable -Property WS -Warning 1724416 -Critical 2007040 | Out-File test.html

A more formatted HTML table:

Get-Process | Select Name, Id, PM, VM, WS | ConvertTo-HtmlTable -Property WS -Warning 8712192  -Critical 12984320 -Title "WS Report" -BorderStyle double -HeaderBackgroundColor grey -BackgroundColor lightgrey -Padding 10px | Out-File test.html


Testing site collection availability per content database randomly

I got a task to write a script that tests and measures the request time of one site collection per content database. Although the Invoke-WebRequest cmdlet can do this the simple way, I decided to create a Get-SPWebRequest function instead, because Invoke-WebRequest is not available in PowerShell v2.

Where we have PowerShell v3, we can run Invoke-WebRequest against a site URL and measure it with the Measure-Command cmdlet.

$result = Measure-Command {Invoke-WebRequest -Uri <SiteUrl> -UseDefaultCredentials}
"{0:N2}ms" -f [double]$result.TotalMilliseconds

Anyway, here is a function that performs the WebRequest:

# Web request with default credentials
Function Get-SPWebRequest ($Url)
    {
        Try
            { 
                $request = [System.Net.WebRequest]::Create($Url)
                $request.UseDefaultCredentials = $true
                $request.Method = "GET"
                # Force Windows authentication on sites that also use forms-based authentication
                $request.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED","f")
                $request.Timeout = 120000
                $response = $request.GetResponse()
                return $response
            }
        Catch 
            {
                # Extract the three-digit HTTP status code from the exception message
                $regex = [regex]("\d{3}")
                return New-Object PSCustomObject -property @{
                    "StatusDescription" = (($_.exception.innerexception.message -split ":")[1]).Trim()
                    "StatusCode" = [int]$regex.Match($_.exception.innerexception.message).value}
            }
     }

The WebRequest uses default credentials, because during the test we usually use a farm admin or admin account that has full control permission on all web applications in the SharePoint farm. The method is GET, since we are just testing and not modifying anything on the site.
During my test I ran into a problem where the request always returned error code 403 (Forbidden) on sites where we use forms-based authentication. That is the reason for adding the ("X-FORMS_BASED_AUTH_ACCEPTED","f") header: it lets you programmatically access a SharePoint site that uses multiple authentication providers by using Windows credentials. Here is the MS article that gives you the explanation: Retrieving Data from a Multiple-Authentication Site Using the Client Object Model and Web Services in SharePoint 2010.
A try/catch block is integrated to catch any other HTTP error code and put it into a custom object with StatusDescription and StatusCode properties, just like a successful request result. From the inner exception message I get the status code by regex, which is mainly useful for extracting output based on a pattern from a string. Regex is difficult, but there is plenty of help on the Internet to generate and test patterns for your needs.
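
For example, the "\d{3}" pattern simply picks the first three-digit number out of the exception text, which for a failed request is the HTTP status code. A quick test with a sample .NET exception message:

# Sample WebException message, just for demonstration
$message = "The remote server returned an error: (403) Forbidden."
$regex = [regex]("\d{3}")
[int]$regex.Match($message).Value   # returns 403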

After that, once we have the engine for the script, we only need to collect all content databases and choose one site collection from each at random to generate the output, like this:

Get-SPContentDatabase | %{Get-SPWebRequest -Url $($_.sites[$(Get-Random $_.sites.count)]).url}

And measure the command like this.

Get-SPContentDatabase | 
%{(Measure-Command {Get-SPWebRequest -Url $($_.sites[$(Get-Random $_.sites.count)]).url}).TotalMilliseconds}

Yes, but we would like to see the result nicely formatted and convertible into e.g. .csv or .html format. So, we have to create a PSCustomObject and put in all the properties we need.
Additionally, I do the request twice to warm up the site before the real measurement, and I created a DefaultDisplayPropertySet to show only the most important values. Any others can be filtered or displayed if needed.
Here is the full script, which can also be found in the MS TechNet Gallery here: Testing a random site collection WebRequest per content database.

# Get all content databases
Function Get-SPContentDatabase
    {
        return [Microsoft.SharePoint.Administration.SPFarm]::Local | 
            select -expand services | 
                ?{$_ -is [Microsoft.SharePoint.Administration.SPWebService]} |
                    select -expand webapplications | select -expand contentdatabases
    }

# Web request with default credentials
Function Get-SPWebRequest ($Url)
    {
        Try
            { 
                $request = [System.Net.WebRequest]::Create($Url)
                $request.UseDefaultCredentials = $true
                $request.Method = "GET"
                $request.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED","f")
                $request.Timeout = 120000
                $response = $request.GetResponse()
                return $response
            }
        Catch 
            {
                $regex = [regex]("\d{3}")
                return New-Object PSCustomObject -property @{
                    "StatusDescription" = (($_.exception.innerexception.message -split ":")[1]).Trim()
                    "StatusCode" = [int]$regex.Match($_.exception.innerexception.message).value}
            }
     }

# Load Microsoft.SharePoint assembly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null

#Get all content databases
$ContentDatabases = Get-SPContentDatabase

$i = 1

#Get database from all content databases
foreach($Database in $ContentDatabases)
    {
        #Progress bar
        Write-Progress -Activity "Gathering WebRequest status per content database" -Status "From $Database" -PercentComplete ($i/$ContentDatabases.count*100)

        #Get random site from content database
        $Site = $Database.Sites[$(Get-Random $Database.Sites.Count)]
        
        #WarmUp
        $WarmUP = Measure-Command {Get-SPWebRequest -Url $Site.Url}

        #Measure command
        $Result = Measure-Command {$Request = Get-SPWebRequest -Url $Site.Url}

        [string[]]$defaultProperties = 'DateTime','Url','RequestTime','StatusCode'
        $defaultDisplayPropertySet = New-Object System.Management.Automation.PSPropertySet('DefaultDisplayPropertySet',[string[]]$defaultProperties)
        $PSStandardMembers = [System.Management.Automation.PSMemberInfo[]]@($defaultDisplayPropertySet)

        #Create PSCustomObject
        New-Object PSCustomObject -Property @{
            "DateTime" = "{0:dd/MM/yyy HH:mm:ss}" -f [datetime]::Now
            "ContentDatabase" = $Site.ContentDatabase.Name
            "Url" = $Site.Url
            "WarmUpRequestTime" = "{0:N2}ms"-f [double]$WarmUp.TotalMilliseconds
            "RequestTime" = "{0:N2}ms"-f [double]$Result.TotalMilliseconds
            "StatusCode" = [int]$Request.Statuscode
            "StatusDescription" = $Request.StatusDescription
            "ContentLength" = "{0:N2} MB" -f ($Request.ContentLength/1Mb)
            "ResponseUri" = $Request.ResponseUri
            "IsFromCache" = $Request.IsFromCache} | 
                Add-Member -MemberType MemberSet -Name PSStandardMembers -Value $PSStandardMembers -PassThru
        $i++
    }

I get the content databases via Microsoft.SharePoint reflection, so the code is portable between different SharePoint versions.

MSSQL Locks Average Wait Time is too high

In the last few days SCOM reported “SQL DB Average Wait Time is too high” monitoring tickets, where the limit was set to 250ms.

When I checked the “SQL Server Locks” counters via PowerShell, I found an extremely high value in the “AverageWaitTimems” property.

How to find the “_PerfRawData_MSSQL_Locks” object:

Get-WmiObject -List | ?{$_.Name -like "*PerfRawData_MSSQL*Locks"} | Select Name

Get actual “AverageWaitTimems” property:

Get-WmiObject Win32_PerfRawData_MSSQLInstanceNameLocks | ?{$_.Name -like "_Total"} | Select Name, AverageWaitTimems | Format-Table -AutoSize


If you check the community additions at the bottom of the page, you'll see a post on how to calculate the real value from the raw data. Please check the “PERF_AVERAGE_BULK” section in “Interpreting the counter values from sys.dm_os_performance_counters” to see how it really works. You can also find others there, such as “Buffer cache hit ratio, Transactions, Total pages”.

Example to calculate the real value:

Get-WmiObject Win32_PerfRawData_MSSQLInstanceNameLocks | ?{$_.Name -like "_Total"} | Select Name, AverageWaitTimems, AverageWaitTimems_Base | Format-Table -AutoSize

Take two samples of AverageWaitTimems and AverageWaitTimems_Base a few moments apart (Sample 1 and Sample 2).

Calculated data:

(Sample2.AverageWaitTimems-Sample1.AverageWaitTimems)/(Sample2.AverageWaitTimems_Base-Sample1.AverageWaitTimems_Base)

(6452617-6452602)/(15901-15900)=15
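
If you prefer to let PowerShell do the math, a minimal sketch (using the same WMI class as above) is:

# Take two raw samples a few seconds apart and calculate the cooked value
$s1 = Get-WmiObject Win32_PerfRawData_MSSQLInstanceNameLocks | ?{$_.Name -like "_Total"}
Start-Sleep -Seconds 5
$s2 = Get-WmiObject Win32_PerfRawData_MSSQLInstanceNameLocks | ?{$_.Name -like "_Total"}
# Note: if no lock requests completed between the samples, the denominator is zero
($s2.AverageWaitTimems - $s1.AverageWaitTimems) / ($s2.AverageWaitTimems_Base - $s1.AverageWaitTimems_Base)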

You can verify it via the “CookedValue” of the MSSQL$INSTANCE:Locks(_Total)\Average Wait Time (ms) counter.

Get-Counter -Counter '\MSSQL$INSTANCE:Locks(_Total)\Average Wait Time (ms)'

How to find the counter for the “_PerfRawData_MSSQL_Locks” object:

Get-Counter -ListSet * | ?{$_.CountersetName -like "*MSSQL*Locks*"} | Select -expand Paths

PWA error: An entry with the same key already exists

The full story started with a different issue, see below.

We had an issue with “Create Project Web App Instance” where the display name of the first administrator account given showed up as <i:0#.w|firstname%2c lastname>.

Root cause analysis showed “Failure trying to Sync Web Application” events (event ID 5555) in the application log.

The Project farm contains only a proxy for the “User Profile Service Application”, which connects to a different SharePoint farm. I realized from the ULS log that this was a simple access denied issue: the SharePoint Project farm account was missing “Full Control” permission on the “User Profile Service Application” of the secondary SharePoint farm.

ULS log entry: Category: User Profiles, Message: Exception occured while connecting to WCF endpoint: System.ServiceModel.Security.SecurityAccessDeniedException: Access is denied.

The “PWA error” in the title started during my investigation of the previous problem, when I tried to create Project Web App instances to test the account display name issue. First I created a test project site called “Test-PWA”, and when I finished the test I deleted it. Later I wanted to test it again and created the same site as “Test-PWA”, but this time I got the error “Sorry, something went wrong. An entry with the same key already exists.” and a correlation ID instead of the list of projects on the “Manage Project Web Apps” page.

The ULS log also contained the error: Application error when access /_admin/pwa/managepwa.aspx, Error=An entry with the same key already exists.

The issue was answered here by Michael Wharton, MVP.

“This error is due to orphan PWA instances.”

The solution is to check for orphan entries in the service application and remove them with PowerShell.

# Find the Project Server service application
$PWA = Get-SPServiceApplication | ?{$_.DisplayName -like "Project Server Service Application"}
# List all PWA instances the service application knows about
$PWA.SiteCollection | Select Name, ID, @{N='DatabaseName';E={$_.ProjectServiceDatabase.Name}}
# Pick the orphan entry by its ID and delete it
$RemoveEntry = $PWA.SiteCollection | ?{$_.Id -eq "<SiteID>"}
$RemoveEntry.Delete()

Easy way to search in IIS logs

I'd like to show you a simple way to find related information in IIS logs with PowerShell. Yeah, I know there are tools out there for parsing IIS logs, but I like scripting and playing with PowerShell.

I wrote a small script which collects the IIS logs of a web server between a start and end time and merges them into a CSV file, which is useful for gathering information or creating a small report based on parameters.

How does the script work?

There are three parameters, in order: Name, StartTime and EndTime. Name represents the website name, which you can find by listing all websites on a web server with the Get-Website cmdlet. For that cmdlet you have to import the WebAdministration module. Keep in mind that since PowerShell 3.0, installed modules are automatically imported when you use any of their commands, as long as the $PSModuleAutoLoadingPreference preference variable is set to All (see About preference variables). A minimal sketch of the parameter block is shown below.
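
The parameter block itself is not shown in the snippets below; a rough sketch of what it could look like (your actual script may differ) is:

param(
    [parameter(Mandatory=$true,Position=0)][string]$Name,
    [parameter(Mandatory=$true,Position=1)][datetime]$StartTime,
    [parameter(Mandatory=$true,Position=2)][datetime]$EndTime
    )

# Get-Website lives in the WebAdministration module (auto-loaded on PowerShell 3.0+)
Import-Module WebAdministration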

Based on the website name, the script sets the $LogFolder variable to the IIS log folder that contains the related events.

$LogFolder = Get-Website | ?{$_.Name -eq $Name} | Select @{N="LogFile";E={$_.LogFile.Directory + "\W3SVC" + $_.Id}}

Then it collects the log files from the folder that match the Where-Object condition.

$LogFile = Get-ChildItem $LogFolder.LogFile | ?{$_.CreationTime -lt $EndTime -and $_.LastwriteTime -gt $StartTime}

Collect all rows from the first log file which contain the "#Fields: " text; we need headers for the .csv file.

$Heads = Get-Content $LogFile.FullName[0] | ?{$_ -match "#Fields: "}

Format the first line of $Heads and write it into a temporary file. I tried to avoid the temporary file, but I could not manage to import the CSV data directly from a variable.

$Heads[0] | %{$_ -replace "#Fields: ","" -replace "-","_" -replace "\(","" -replace "\)",""} | Out-File .\tempiislog.tmp

Append all events into the temporary file.

$LogFile | %{Get-Content $_.FullName} | ?{$_ -notlike "#*"} | Out-File .\tempiislog.tmp -Append

In the end, select only the rows from the temp file that exactly match the criteria (date and time) and export them into a .csv file.

Import-Csv -Delimiter " " -Path .\tempiislog.tmp |
?{[datetime]($_.Date +" "+$_.Time) -gt $StartTime -and [datetime]($_.Date +" "+$_.Time) -lt $EndTime} |
Export-Csv -Path ("MergedIISLogs_{0:MMddyyHHmm}-" -f $StartTime + "{0:MMddyyHHmm}.log" -f $EndTime) -Delimiter ";"

The merged IIS log file can be found next to your script as MergedIISLogs_MMddyyHHmm-MMddyyHHmm.log in CSV format.

How can you search in IIS logs?

For example, you can find all entries where the status code is "302", which is a redirection.

Import-Csv -Delimiter ";" -Path .\MergedIISLogs_xxxxxxxxxx-xxxxxxxxxx.log | Where-Object {$_.sc_status -match "302"}

Or you can find the slowest URI.

$maximum = (Import-Csv -Delimiter ";" -Path .\MergedIISLogs_xxxxxxxxxx-xxxxxxxxxx.log |
Measure-Object -Property time_taken -Maximum).maximum
Import-Csv -Delimiter ";" -Path .\MergedIISLogs_xxxxxxxxxx-xxxxxxxxxx.log | ?{$_.time_taken -eq $maximum}