Using PowerShell to test for broken URL redirects

This PowerShell script iterates through a list of URLs from a CSV file looking for broken redirects. Any URL that redirects to the error page is written to a report file.
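
The script assumes the CSV has a column whose name matches $urlColumn ("url" by default). A minimal, purely illustrative source.csv might look like this:

url,notes
http://www.example.com/page-one,landing page
http://www.example.com/old-link,legacy link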

#------------------------------------------------------------
 cls
 #------------------------------------------------------------
 # BEGIN: Config
 #------------------------------------------------------------
 # Where are the files?
 $path = "C:\checkLinks"; # Do NOT end with \
 # Source
 $csv = import-csv -Path "$path\source.csv";
 # Report
 $targetFile = "report.csv";
 # Where does the site send you to if a link is broken?  Look at "OriginalString" in code as it's possibly not what you assume it is.
 $redirStr = "/error";
 # What is the name of the column in the CSV file that has the URLs?
 $urlColumn = "url";
 #------------------------------------------------------------
 # END: Config
 #------------------------------------------------------------

#------------------------------------------------------------
 # BEGIN: SUPPORTING FUNCTIONS
 #------------------------------------------------------------

#============================================================
 # BEGIN: Get-URL
 # DESC: Receives params, checks URL for redirect and if redirect
 # matches str writes to report file.
 #============================================================
 function Get-URL {
     param([string]$url)
     try
     {
         "Checking: $url"
         #========================================================
         $wr = [System.Net.HttpWebRequest]::Create($url);
         $wrRsp = $wr.GetResponse();
         $temp = $wrRsp.ResponseUri.AbsolutePath;
         $redirectURL = $wrRsp.ResponseUri.OriginalString;
         $redirectURL
         if ($temp -like $redirStr) {
             Add-Content "$path\$targetFile" "$url"
         };
         $wrRsp.Close();
         #========================================================
     } catch {
         $errorStatus = "Exception Message: " + $_.Exception.Message;
         Write-Host $errorStatus;
     }
 };
 #============================================================
 # END: Get-URL
 #============================================================

#------------------------------------------------------------
 # END: SUPPORTING FUNCTIONS
 #------------------------------------------------------------

#============================================================
 # START MAIN
 #============================================================

#============================================================
 # BEGIN: Step 1
 #============================================================
 # A: Go to folder
 cd $path
 # B: Does report file exist? If no, create it.
 if (!(Test-Path "$path\$targetFile")) {
 New-Item -path $path -name $targetFile -type "file"
 };
 #============================================================
 # END: Step 1
 #============================================================

#============================================================
 # BEGIN: Step 2
 #============================================================
 foreach ($line in $csv)
 {
     # Loop through each row and check if URL has value.
     $properties = $line | Get-Member -MemberType Properties
     for ($i = 0; $i -lt $properties.Count; $i++)
     {
         $column = $properties[$i]
         $columnvalue = $line | Select -ExpandProperty $column.Name

         #-----------------------------------------------------
         # BEGIN: Send to check URL
         #-----------------------------------------------------
         if ($column.Name -eq $urlColumn) {
             if ($columnvalue.Length -gt 4) {
                 Get-URL $columnvalue;
             };
         };
         #-----------------------------------------------------
         # END: Send to check URL
         #-----------------------------------------------------
     };
 };
 #============================================================
 # END: Step 2
 #============================================================

 

How to compare two Excel / CSV / XLSX files using PowerShell

The following PowerShell script compares all of the columns in two CSV files against each other.

cls

# Import the files
 $file1 = import-csv -Path "file1.csv";
 $file2 = import-csv -Path "file2.csv";

# Get the list of properties
 $props1 = $file1 | gm -MemberType NoteProperty | select -expand Name | sort | % {"$_"}
 $props2 = $file2 | gm -MemberType NoteProperty | select -expand Name | sort | % {"$_"}

if(Compare-Object $props1 $props2) {

    # Check that properties match

    throw "Properties are not the same! [$props1] [$props2]"

} else {

    # Pass properties list to Compare-Object

    "Checking $props1"

    Compare-Object $file1 $file2 -Property $props1

}
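
If you want to keep the result, the difference rows (Compare-Object adds a SideIndicator column showing which file each row came from) can be piped straight to a file. A minimal follow-up, assuming a report file name of differences.csv:

# Export the differences instead of printing them to the console.
Compare-Object $file1 $file2 -Property $props1 |
    Export-Csv -Path "differences.csv" -NoTypeInformation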

Using PowerShell to export SharePoint list items into CSV and then delete the same items from the list

The code below exports list items, if any are found, column by column into a CSV file. It then deletes the items it exported from the list.

if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

# BEGIN CONFIG

$fileNameStamp = Get-Date -UFormat "%Y_%m_%d_%H_%M"
$fileAndPath = "D:\DATA\myData-$fileNameStamp.csv"
$web = Get-SPWeb -identity "http://10.1.1.10/"
$list = $web.Lists["MyList"]

# END CONFIG
# BEGIN: STEP 1: Export current items in the list to CSV
$listitems = $list.Items.Count;
# Break out if the list has no content. Stops the creation of empty files.
if ($listitems -lt 1) {
    break;
} else {
    # Array to hold result - PSObjects
    $ListItemCollection = @()
    # Get all list items
    $list.Items | foreach {
       $ExportItem = New-Object PSObject
       $ExportItem | Add-Member -MemberType NoteProperty -name "ID" -value $_["ID"]
       $ExportItem | Add-Member -MemberType NoteProperty -name "Title" -value $_["Title"]
       $ExportItem | Add-Member -MemberType NoteProperty -name "Created" -value $_["Created"]
       $ExportItem | Add-Member -MemberType NoteProperty -name "CreatedBy" -value $_["Created By"]
       # Add the object with its properties to the array
       $ListItemCollection += $ExportItem
    }
    # Export the result array to a CSV file
    $ListItemCollection | Export-CSV $fileAndPath -NoTypeInformation
}
# END: STEP 1: Export current items in the list to CSV

# BEGIN: STEP 2: Delete the exported items from the list
# Note: the CSV already has headers from Export-CSV, so no -Header parameter is needed.
Import-CSV $fileAndPath | Foreach-Object {
    Write-Host " Looking for item #" $_.ID "...";

    # Capture the CSV row's ID first; inside Where-Object, $_ refers to the list item.
    $csvID = $_.ID
    $items = $list.Items | Where {$_["ID"] -eq $csvID}
    foreach ($item in $items)
    {
        $item.Delete();
        break;
    }
}
# END: STEP 2: Delete the exported items from the list

#BEGIN: CLEANUP
$web.Dispose();
#END: CLEANUP
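
As a sanity check before trusting the delete step, it can be worth confirming what the export file contains. A minimal sketch, assuming the variables from the script above are still in scope:

# Compare the number of rows exported with the number of items that were in the list.
$exportedCount = (Import-Csv $fileAndPath | Measure-Object).Count
Write-Host "Exported $exportedCount of $listitems items to $fileAndPath"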

PowerShell to shrink the SharePoint AuditData table

I recently had a content database whose size was out of control because of the AuditData table, and I needed it shrunk promptly. This piece of PowerShell did the job.

NOTE: This is a bit of a "when all you have is a hammer, everything looks like a nail" approach, but it may be right for your specific situation.

if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
   Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

foreach ($site in Get-SPSite -Limit ALL)
{
   Write-Host 'Deleting audit data for site: ' $site.URL
   $i = -350
   do {
      Write-Host $site.URL ' - Delete day ' $i ' : ' ([System.DateTime]::Now.ToLocalTime().AddDays($i))
      $site.Audit.DeleteEntries([System.DateTime]::Now.ToLocalTime().AddDays($i))
      $site.Audit.Update()
      $i++
   }
   while ($i -le 1)
   $site.Dispose()
}
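
To stop the table from growing straight back, it may also be worth switching on SharePoint's built-in audit log trimming. A hedged sketch, assuming SharePoint 2010 or later (where SPSite exposes TrimAuditLog and AuditLogTrimmingRetention) and a 90-day retention chosen purely as an example:

foreach ($site in Get-SPSite -Limit ALL)
{
   # Let the audit log trimming timer job delete entries older than 90 days.
   $site.TrimAuditLog = $true
   $site.AuditLogTrimmingRetention = 90
   $site.Dispose()
}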

PowerShell “fun” with text to speech

Writing code can be rather far from glamorous, if not downright boring at times. PowerShell has its appeal, but at the end of the day it is still an administrative framework for task and configuration automation. Even saying that can hurt… If you're looking to spice it up a bit, the following may help.

function prompt {
    $text = 'You are great!', 'Hero!', 'What a checker you are.', 'Champ, well done!', 'Man, you are good!', 'Guru stuff I would say.', 'You are magic!'
    'PS> '
    $host.UI.RawUI.WindowTitle = Get-Location
    # Out-Null keeps Speak's return value out of the prompt output.
    (New-Object -ComObject Sapi.SpVoice).Speak(($text | Get-Random)) | Out-Null
}

prompt
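
Redefining prompt means the compliment is spoken every time PowerShell redraws the prompt. The same COM object can also speak an arbitrary string on demand; for example (the text is just a placeholder):

(New-Object -ComObject Sapi.SpVoice).Speak('Deployment finished.') | Out-Null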

 

SharePoint PowerShell script to check if a site collection is alive and email if not.

This might help somebody. In brief, the script checks an array of URLs and, if it gets a non-200 (OK) response back, emails out an alert. It can also double as a keep-alive script. It is best run from a server that is not part of the farm.

$urls = @("http://webapp/sitecollection1/Default.aspx",
          "http://webapp/sitecollection2/default.aspx",
          "http://webapp/sitecollection3/default.aspx",
          "http://webapp/sitecollection4/default.aspx");

# Request all URLs in the array
foreach ($objItem in $urls) {
    Write-Host "Checking $objItem";
    $req = [system.Net.WebRequest]::Create($objItem)
    $req.UseDefaultCredentials = $true
    try {
        $res = $req.GetResponse()
    } catch [System.Net.WebException] {
        $res = $_.Exception.Response
    }
    $int = [int]$res.StatusCode
    $status = $res.StatusCode
    Write-Host "$int $status"

    if ($int -ne 200) {
        Write-Host "  Sending Email...";
        $enc = New-Object System.Text.utf8encoding;
        $smtp = "emailserver.domain.com";
        $to = "Recipient 1 <username@domain.com>";
        $toCC = "Recipient 2 <username@domain.com>";
        $from = "SharePoint farm <username@domain.com>";
        $ScriptName = $MyInvocation.MyCommand.Name;
        $scriptPath = Split-Path -Parent $MyInvocation.MyCommand.Definition;
        $body = "This was generated by the script $ScriptName in $scriptPath";
        $subject = "URL check failure on $objItem - '$int : $status'";
        Send-MailMessage -SmtpServer $smtp -To $to -Cc $toCC -From $from -Subject $subject -Body $body -BodyAsHtml -Encoding $enc;
        Write-Host "  Sent.";
    }
}
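
To turn this into a recurring check, one option is to register it with Task Scheduler. A sketch, assuming the script has been saved as C:\scripts\CheckUrls.ps1 (a hypothetical path) on the monitoring server:

schtasks /Create /TN "SharePoint URL check" /SC MINUTE /MO 15 `
    /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\scripts\CheckUrls.ps1"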

Searching the source of ALL pages in a SharePoint farm using PowerShell

Recently I had to find all instances of a specific jQuery reference in a small-to-mid-sized farm. I had not had to do this before, and after a few minutes I decided that PowerShell was the tool to use. Here's what I cobbled together. It's neither great nor finished, but it may be useful to somebody right now.

###########################################################################################################################
# BEGIN: Get all pages ASPX & HTM*
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

function TrawlInventory($Url) {
    $farm = [Microsoft.SharePoint.Administration.SPFarm]::Local;
    $site = New-Object Microsoft.SharePoint.SPSite $Url;

    foreach ($web in $site.AllWebs) {
        $webUrl = $web.Url;
        Write-Host "- Checking: $webUrl";

        foreach ($list in $web.Lists) {
            $listitemsCount = $list.Items.Count;
            Write-Host "- - Checking: $list which has $listitemsCount items";

            if ($listitemsCount -lt 1000) {
                foreach ($item in $list.Items) {
                    $itemName = $item.name.ToString().ToLower();
                    Write-Host "- - - - Checking file: $itemName";

                    if ($itemName -like '*aspx' -Or $itemName -like '*htm') {
                        $thisURL = $item.Url;
                        $thisWEB = $web.Url;
                        if ($thisURL) {
                            $thisItemsFullURL = "$thisWEB/$thisURL";
                            $itemsFullURL = $thisItemsFullURL.ToString();
                            $uri = "$itemsFullURL";
                            Write-Host "- - - - - Checking source for: $uri";
                            $wc = New-Object System.Net.WebClient;
                            $wc.UseDefaultCredentials = $true;
                            $json = $wc.DownloadString($uri);
                            $jsonLC = $json.ToLower();
                            if ($jsonLC.Contains("string1") -Or $jsonLC.Contains("string2")) {
                                Write-Host "Found reference: $itemsFullURL";
                                Add-Content -Path "curls.txt" -Value "$itemsFullURL";
                            }
                            $json = "";
                            $jsonLC = "";
                        }
                    }
                    $itemName = "";
                }
            } else {
                Write-Host "Too many items to check: $listitemsCount";
            }
        }
        $web.Dispose();
    }
}
###########################################################################################################################
$thisDomain = "http://WebAppName/site-collection/";
TrawlInventory($thisDomain);
# END: Get all pages ASPX & HTM*
###########################################################################################################################
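
The call above covers a single site collection. To genuinely trawl the whole farm, as the title suggests, one option is to feed every site collection URL into the same function; a sketch, assuming the SharePoint snap-in is loaded as above:

Get-SPSite -Limit All | ForEach-Object {
    TrawlInventory $_.Url
}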

Failed to cache field with id '{…}', overwrite=0

Recently I encountered a problem with a list in a migrated farm. A generic Tasks list would no longer allow item level access (View or Edit) but continued to serve list views and allowed for item creation. The list settings page was also available. Users who tried to access an item were greeted with a standard SharePoint “oopsie” page.

ULS logs produced the following error: "Failed to cache field Title with id '{…}', overwrite=0"

And that was it.

So I tried to clone the list for some more detailed testing. Using PowerShell I pretty quickly encountered a problem.

If you're interested, the PowerShell used for exporting/importing the list is below.

function Export-List([string]$ListURL)
{
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Deployment") > $null

    $versions = [Microsoft.SharePoint.Deployment.SPIncludeVersions]::All
    $exportObject = New-Object Microsoft.SharePoint.Deployment.SPExportObject
    $exportObject.Type = [Microsoft.SharePoint.Deployment.SPDeploymentObjectType]::List
    $exportObject.IncludeDescendants = [Microsoft.SharePoint.Deployment.SPIncludeDescendants]::All

    $settings = New-Object Microsoft.SharePoint.Deployment.SPExportSettings
    $settings.ExportMethod = [Microsoft.SharePoint.Deployment.SPExportMethodType]::ExportAll
    $settings.IncludeVersions = $versions
    $settings.IncludeSecurity = [Microsoft.SharePoint.Deployment.SPIncludeSecurity]::All
    $settings.OverwriteExistingDataFile = 1
    $settings.ExcludeDependencies = $true

    $site = New-Object Microsoft.SharePoint.SPSite($ListURL)
    Write-Host "ListURL", $ListURL
    $web = $site.OpenWeb()
    $list = $web.GetList($ListURL)

    $settings.SiteUrl = $web.Url
    $exportObject.Id = $list.ID
    $settings.FileLocation = "d:\temp"
    $settings.BaseFileName = "ExportList-" + $list.ID.ToString() + ".DAT"
    $settings.FileCompression = 1
    Write-Host "FileLocation", $settings.FileLocation

    $settings.ExportObjects.Add($exportObject)
    $export = New-Object Microsoft.SharePoint.Deployment.SPExport($settings)
    $export.Run()

    $web.Dispose()
    $site.Dispose()
}

function Import-List([string]$DestWebURL, [string]$FileName, [string]$LogFilePath)
{
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Deployment") > $null

    $settings = New-Object Microsoft.SharePoint.Deployment.SPImportSettings
    $settings.IncludeSecurity = [Microsoft.SharePoint.Deployment.SPIncludeSecurity]::All
    $settings.UpdateVersions = [Microsoft.SharePoint.Deployment.SPUpdateVersions]::Overwrite
    $settings.UserInfoDateTime = [Microsoft.SharePoint.Deployment.SPImportUserInfoDateTimeOption]::ImportAll

    $site = New-Object Microsoft.SharePoint.SPSite($DestWebURL)
    Write-Host "DestWebURL", $DestWebURL
    $web = $site.OpenWeb()
    Write-Host "SPWeb", $web.Url

    $settings.SiteUrl = $web.Url
    $settings.WebUrl = $web.Url
    $settings.CommandLineVerbose = $true
    $settings.FileLocation = "d:\temp"
    $settings.BaseFileName = $FileName
    $settings.LogFilePath = $LogFilePath
    $settings.FileCompression = 1
    Write-Host "FileLocation", $settings.FileLocation

    $import = New-Object Microsoft.SharePoint.Deployment.SPImport($settings)
    $import.Run()

    $web.Dispose()
    $site.Dispose()
}

# To export a specified SharePoint list:
#Export-List "http://server/sites/subsite/Lists/TasksTest/"

# To import the list exported by the previous command:
Import-List "http://server/sites/subsite/Lists/TasksTest/" "ExportList-GUID.DAT" "C:\temp\ImportLog.txt"

So my next step was to identify the duplicate field. Again, PowerShell and Google helped. The script below came from http://blog.sharepoint-voodoo.net/?p=142

# Presumably the issue you are having is that a content database won't upgrade due to duplicate field names
$inputDB = Read-Host "Enter the name of the Content Database to be scanned for duplicate list fields "
$sites = Get-SPSite -Limit All -ContentDatabase $inputDB

# Set up the logging files and current date
$date = Get-Date
$viewFieldText = "c:\TEMP\DuplicateViewFields.txt"
$listFieldIDText = "c:\TEMP\DuplicateListFieldIDs.txt"
$listFieldIntNameText = "c:\TEMP\DuplicateFieldIntNameIDs.txt"

# Create the initial files by writing the date to them
$date | out-file "$viewFieldText"
$date | out-file "$listFieldIDText"
$date | out-file "$listFieldIntNameText"

# Start looping through each site collection in the DB
foreach ($site in $sites)
{
    # Loop through each sub-web in the current site collection
    foreach ($web in $site.allwebs)
    {
        # Loop through each list in the current sub-web
        foreach ($list in $web.lists)
        {
            $siteName = $site.Title
            Write-Host "Checking $siteName/$web/$list List for duplicate Field Names or IDs..."

            # Create the array that will hold data about the list fields being scanned.
            # An array of objects; each object is made up of the Field ID, Field Title, and Field Internal Name.
            $objFieldArray = @()

            # Loop through each field in the list
            foreach ($listField in $list.Fields)
            {
                # Does the current list field ID or InternalName already exist in the array of objects?
                $objFieldRow = $objFieldArray | ?{$_.FieldID -eq $listField.ID}
                $objFieldIntName = $objFieldArray | ?{$_.InternalName -eq $listField.InternalName}

                # If the current list field ID or InternalName was matched in the array of objects,
                # log info about the current list field and the matching entry from the array
                if ($objFieldRow.FieldID -eq $listField.ID -or $objFieldIntName.InternalName -eq $listField.InternalName)
                {
                    # Generate the variables to be logged to the text file
                    $webUrl = $web.Url
                    $listTitle = $list.Title
                    $listFieldID = $listField.ID
                    $listFieldTitle = $listField.Title
                    $listFieldInternalName = $listField.InternalName
                    $existingID = $objFieldRow.FieldID
                    $existingName = $objFieldRow.FieldName
                    $existingInternal = $objFieldRow.InternalName

                    # Start logging
                    Write-Host "Duplicate item detected"
                    "------------Duplicate item detected------------------" | out-file "$listFieldIDText" -append
                    "Web URL: $webUrl" | out-file "$listFieldIDText" -append
                    "List: $listTitle" | out-file "$listFieldIDText" -append
                    "Field #1 ID: $listFieldID" | out-file "$listFieldIDText" -append
                    "Field #1 Title: $listFieldTitle" | out-file "$listFieldIDText" -append
                    "Field #1 Internal Name: $listFieldInternalName" | out-file "$listFieldIDText" -append
                    "" | out-file "$listFieldIDText" -append
                    "Field #2 ID: $existingID" | out-file "$listFieldIDText" -append
                    "Field #2 Name: $existingName" | out-file "$listFieldIDText" -append
                    "Field #2 InternalName: $existingInternal" | out-file "$listFieldIDText" -append
                    "----------------------------------------------------" | out-file "$listFieldIDText" -append
                    "" | out-file "$listFieldIDText" -append
                }
                else # If the current list field ID or InternalName is not found in the array of objects, insert it now
                {
                    # Create the blank object
                    $objFieldData = "" | select FieldID,FieldName,InternalName

                    # Insert data into the object
                    $objFieldData.FieldID = $listField.ID
                    $objFieldData.FieldName = $listField.Title
                    $objFieldData.InternalName = $listField.InternalName

                    # Insert the new object into the array
                    $objFieldArray += $objFieldData
                }
            }

            Write-Host "Checking List Views for duplicate fields..."

            # Now that all of the list fields have been checked, check for duplicate field names in each of the list views
            foreach ($ListView in $list.Views)
            {
                # Create an array to hold the internal names of the view fields
                $viewFieldArray = @()

                # Loop through each field in the view
                foreach ($ViewField in $ListView.ViewFields)
                {
                    # Check if the current view field internal name exists in the array
                    if ($viewFieldArray -contains $ViewField)
                    {
                        # Log info about the duplicate view field
                        $webUrl = $web.Url
                        $listTitle = $list.Title
                        $listViewTitle = $ListView.Title

                        Write-Host "Duplicate item detected"
                        "------------Duplicate item detected------------------" | out-file "$viewFieldText" -append
                        "Web URL: $webUrl" | out-file "$viewFieldText" -append
                        "List: $listTitle" | out-file "$viewFieldText" -append
                        "View Name: $listViewTitle" | out-file "$viewFieldText" -append
                        "Duplicate Field: $ViewField" | out-file "$viewFieldText" -append
                        "----------------------------------------------------" | out-file "$viewFieldText" -append
                        "" | out-file "$viewFieldText" -append
                    }
                    else
                    {
                        # If the view field internal name was not found in the array, add it now
                        $viewFieldArray += $ViewField
                    }
                }
            }
        }
    }
}

The text file output of the above process gave a hit for the problem list. Specifically, it showed:

------------Duplicate item detected------------------
Web URL: http://server/site/subsite
List: Tasks
Field #1 ID: c15b34c3-ce7d-490a-b133------------------
Field #1 Title: Task Status
Field #1 Internal Name: TaskStatus
Field #2 ID: c15b34c3-ce7d-490a-b133-------------------
Field #2 Name: Status
Field #2 InternalName: Status
----------------------------------------------------

So in my DEV environment I tested out what would happen if the column "Status" was removed. Everything worked again. I removed the column from production and the problem was solved.
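
For reference, the same removal can be scripted rather than done through the list settings page. A hedged sketch using the web, list, and internal field name from the log excerpt above (test in DEV first):

$web = Get-SPWeb "http://server/site/subsite"
$list = $web.Lists["Tasks"]
# Remove the duplicate "Status" field that clashed with Task Status.
$field = $list.Fields.GetFieldByInternalName("Status")
$field.Delete()
$list.Update()
$web.Dispose()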

PowerShell BCS list call error: an error occurred while enumerating through a collection: the shim execution failed unexpectedly - proxy creation failed. Default context not found

If you get this error whilst making a call to an external list, make sure that you instantiate an SPServiceContext first. Just add the lines below above your Get-SPWeb call.

# BEGIN: Get the SPServiceContext as a BCS call is being made
$context = Get-SPServiceContext "http://sharepoint/";
$scope = New-Object Microsoft.SharePoint.SPServiceContextScope $context;
# END: Get the SPServiceContext as a BCS call is being made
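
Put together, a call against an external list might then look something like this (the site URL and list name are placeholders):

$context = Get-SPServiceContext "http://sharepoint/";
$scope = New-Object Microsoft.SharePoint.SPServiceContextScope $context;
$web = Get-SPWeb "http://sharepoint/sites/somesite";
$externalList = $web.Lists["MyExternalList"];   # external (BCS-backed) list
$externalList.Items | Select-Object Title;
$web.Dispose();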

How to fix the PowerGUI error "Index was out of range. Must be non-negative…"

I was getting this error whenever I tried to open a PowerShell script by double-clicking it. PowerGUI is my default PowerShell editor.

The workaround is to empty out the PowerGUI configuration settings and data for the user account under which it fails to start. Making a backup is probably wise.

Navigate to each path below and delete the PowerGUI folder (or script it, as sketched below).

  • %USERPROFILE%\AppData\Roaming\Quest Software
  • %USERPROFILE%\AppData\Local\Quest Software
You may have to empty your recycle bin as well.
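
If you prefer to script the cleanup, a minimal sketch that backs the folders up and then removes them (the PowerGUI folder name under Quest Software is an assumption; adjust it to whatever you find on disk):

$folders = "$env:APPDATA\Quest Software\PowerGUI",
           "$env:LOCALAPPDATA\Quest Software\PowerGUI"
foreach ($folder in $folders) {
    if (Test-Path $folder) {
        # Keep a copy before deleting, as suggested above.
        Copy-Item $folder "$folder.bak" -Recurse
        Remove-Item $folder -Recurse -Force
    }
}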