Failed to cache field with id '{…}', overwrite=0

Recently I encountered a problem with a list in a migrated farm. A generic Tasks list would no longer allow item-level access (View or Edit) but continued to serve list views and allowed item creation. The list settings page was also available. Users who tried to open an item were greeted with a standard SharePoint "oopsie" page.

ULS logs produced the following error: "Failed to cache field Title with id '{…}', overwrite=0"

And that was it.
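For anyone chasing the same message, a quick way to pull the relevant ULS entries is something along these lines (a rough sketch; narrow the time window to when the error was reproduced):

# Sketch: pull recent ULS entries mentioning the failed field cache (run in the SharePoint Management Shell)
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
Get-SPLogEvent -StartTime (Get-Date).AddMinutes(-15) |
    Where-Object { $_.Message -like "*Failed to cache field*" } |
    Select-Object Timestamp, Category, Message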

So I tried to clone the list for some more detailed testing. Using PowerShell to export and re-import it, I quickly ran into a problem there as well.

If you're interested, the PowerShell used for exporting/importing the list was as below.

function Export-List([string]$ListURL)
{
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Deployment") > $null

    $versions = [Microsoft.SharePoint.Deployment.SPIncludeVersions]::All

    $exportObject = New-Object Microsoft.SharePoint.Deployment.SPExportObject
    $exportObject.Type = [Microsoft.SharePoint.Deployment.SPDeploymentObjectType]::List
    $exportObject.IncludeDescendants = [Microsoft.SharePoint.Deployment.SPIncludeDescendants]::All

    $settings = New-Object Microsoft.SharePoint.Deployment.SPExportSettings
    $settings.ExportMethod = [Microsoft.SharePoint.Deployment.SPExportMethodType]::ExportAll
    $settings.IncludeVersions = $versions
    $settings.IncludeSecurity = [Microsoft.SharePoint.Deployment.SPIncludeSecurity]::All
    $settings.OverwriteExistingDataFile = 1
    $settings.ExcludeDependencies = $true

    $site = New-Object Microsoft.SharePoint.SPSite($ListURL)
    Write-Host "ListURL", $ListURL
    $web = $site.OpenWeb()
    $list = $web.GetList($ListURL)

    $settings.SiteUrl = $web.Url
    $exportObject.Id = $list.ID
    $settings.FileLocation = "d:\temp"
    $settings.BaseFileName = "ExportList-" + $list.ID.ToString() + ".DAT"
    $settings.FileCompression = 1
    Write-Host "FileLocation", $settings.FileLocation

    $settings.ExportObjects.Add($exportObject)
    $export = New-Object Microsoft.SharePoint.Deployment.SPExport($settings)
    $export.Run()

    $web.Dispose()
    $site.Dispose()
}

function Import-List([string]$DestWebURL, [string]$FileName, [string]$LogFilePath)
{
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Deployment") > $null

    $settings = New-Object Microsoft.SharePoint.Deployment.SPImportSettings
    $settings.IncludeSecurity = [Microsoft.SharePoint.Deployment.SPIncludeSecurity]::All
    $settings.UpdateVersions = [Microsoft.SharePoint.Deployment.SPUpdateVersions]::Overwrite
    $settings.UserInfoDateTime = [Microsoft.SharePoint.Deployment.SPImportUserInfoDateTimeOption]::ImportAll

    $site = New-Object Microsoft.SharePoint.SPSite($DestWebURL)
    Write-Host "DestWebURL", $DestWebURL
    $web = $site.OpenWeb()
    Write-Host "SPWeb", $web.Url

    $settings.SiteUrl = $web.Url
    $settings.WebUrl = $web.Url
    $settings.CommandLineVerbose = $true
    $settings.FileLocation = "d:\temp"
    $settings.BaseFileName = $FileName
    $settings.LogFilePath = $LogFilePath
    $settings.FileCompression = 1
    Write-Host "FileLocation", $settings.FileLocation

    $import = New-Object Microsoft.SharePoint.Deployment.SPImport($settings)
    $import.Run()

    $web.Dispose()
    $site.Dispose()
}

# Export a specified SharePoint list
#Export-List "http://server/sites/subsite/Lists/TasksTest/"

# Import the list exported by the previous command
Import-List "http://server/sites/subsite/Lists/TasksTest/" "ExportList-GUID.DAT" "C:\temp\ImportLog.txt"
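When the import fails part-way, the log file passed to Import-List is the quickest place to see why; for example (using the log path from the call above):

# Scan the import log for anything that looks like a failure
Get-Content "C:\temp\ImportLog.txt" | Select-String -Pattern "Error|FatalError|Warning"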

The error (and the refusal to overwrite an already-cached field implied by overwrite=0) suggested the list contained a duplicate field, so my next step was to identify it. Again PowerShell and Google helped. The script below came from http://blog.sharepoint-voodoo.net/?p=142

# Presumably the issue you are having is that a Content Database won't upgrade due to duplicate list fields
$inputDB = Read-Host "Enter the name of the Content Database to be scanned for duplicate list fields "
$sites = Get-SPSite -Limit All -ContentDatabase $inputDB

# Set up the logging files and current date
$date = Get-Date
$viewFieldText = "c:\TEMP\DuplicateViewFields.txt"
$listFieldIDText = "c:\TEMP\DuplicateListFieldIDs.txt"
$listFieldIntNameText = "c:\TEMP\DuplicateFieldIntNameIDs.txt"

# Create the initial files by writing the date to them
$date | Out-File "$viewFieldText"
$date | Out-File "$listFieldIDText"
$date | Out-File "$listFieldIntNameText"

# Start looping through each Site Collection in the DB
foreach ($site in $sites)
{
    # Loop through each sub-web in the current site collection
    foreach ($web in $site.AllWebs)
    {
        # Loop through each list in the current sub-web
        foreach ($list in $web.Lists)
        {
            $siteName = $site.Title
            Write-Host "Checking $siteName/$web/$list List for duplicate Field Names or IDs..."

            # An array of objects; each object holds the Field ID, Field Title and Field Internal Name
            $objFieldArray = @()

            # Loop through each Field in the list
            foreach ($listField in $list.Fields)
            {
                # Does the current list field ID or InternalName already exist in the array of objects?
                $objFieldRow = $objFieldArray | ?{$_.FieldID -eq $listField.ID}
                $objFieldIntName = $objFieldArray | ?{$_.InternalName -eq $listField.InternalName}

                # If the current list field ID or InternalName was matched, log info about the current
                # list field and the matching entry from the array of objects
                if ($objFieldRow -or $objFieldIntName)
                {
                    # Use whichever entry matched (by ID, otherwise by InternalName) for the "Field #2" details
                    if ($objFieldRow) { $existing = $objFieldRow } else { $existing = $objFieldIntName }

                    # Generate the variables to be logged to the text file
                    $webUrl = $web.Url
                    $listTitle = $list.Title
                    $listFieldID = $listField.ID
                    $listFieldTitle = $listField.Title
                    $listFieldInternalName = $listField.InternalName
                    $existingID = $existing.FieldID
                    $existingName = $existing.FieldName
                    $existingInternal = $existing.InternalName

                    # Start logging
                    Write-Host "Duplicate item detected"
                    "------------Duplicate item detected-----------------" | Out-File "$listFieldIDText" -Append
                    "Web URL: $webUrl" | Out-File "$listFieldIDText" -Append
                    "List: $listTitle" | Out-File "$listFieldIDText" -Append
                    "Field #1 ID: $listFieldID" | Out-File "$listFieldIDText" -Append
                    "Field #1 Title: $listFieldTitle" | Out-File "$listFieldIDText" -Append
                    "Field #1 Internal Name: $listFieldInternalName" | Out-File "$listFieldIDText" -Append
                    "" | Out-File "$listFieldIDText" -Append
                    "Field #2 ID: $existingID" | Out-File "$listFieldIDText" -Append
                    "Field #2 Name: $existingName" | Out-File "$listFieldIDText" -Append
                    "Field #2 InternalName: $existingInternal" | Out-File "$listFieldIDText" -Append
                    "----------------------------------------------------" | Out-File "$listFieldIDText" -Append
                    "" | Out-File "$listFieldIDText" -Append
                }
                else # If the current list field ID or InternalName is not found in the array of objects, insert it now
                {
                    # Create the blank object and insert data into it
                    $objFieldData = "" | select FieldID,FieldName,InternalName
                    $objFieldData.FieldID = $listField.ID
                    $objFieldData.FieldName = $listField.Title
                    $objFieldData.InternalName = $listField.InternalName

                    # Insert the new object into the array
                    $objFieldArray += $objFieldData
                }
            }

            Write-Host "Checking List Views for duplicate fields..."

            # Now that all of the list fields have been checked, check for duplicate field names in each of the list views
            foreach ($ListView in $list.Views)
            {
                # Create an array to hold the Internal Names of the View Fields
                $viewFieldArray = @()

                # Loop through each field in the view
                foreach ($ViewField in $ListView.ViewFields)
                {
                    # Check if the current View Field Internal Name exists in the array
                    if ($viewFieldArray -contains $ViewField)
                    {
                        # Log info about the duplicate view field
                        $webUrl = $web.Url
                        $listTitle = $list.Title
                        $listViewTitle = $ListView.Title

                        Write-Host "Duplicate item detected"
                        "------------Duplicate item detected-----------------" | Out-File "$viewFieldText" -Append
                        "Web URL: $webUrl" | Out-File "$viewFieldText" -Append
                        "List: $listTitle" | Out-File "$viewFieldText" -Append
                        "View Name: $listViewTitle" | Out-File "$viewFieldText" -Append
                        "Duplicate Field: $ViewField" | Out-File "$viewFieldText" -Append
                        "----------------------------------------------------" | Out-File "$viewFieldText" -Append
                        "" | Out-File "$viewFieldText" -Append
                    }
                    else
                    {
                        # If the view field internal name was not found in the array, add it now
                        $viewFieldArray += $ViewField
                    }
                }
            }
        }
        $web.Dispose()
    }
    $site.Dispose()
}

The text file output of the above process flagged the problem list. Specifically, it showed:

------------Duplicate item detected-----------------
Web URL: http://server/site/subsite
List: Tasks
Field #1 ID: c15b34c3-ce7d-490a-b133——————
Field #1 Title: Task Status
Field #1 Internal Name: TaskStatus

Field #2 ID: c15b34c3-ce7d-490a-b133——————-
Field #2 Name: Status
Field #2 InternalName: Status
----------------------------------------------------

So in my DEV environment I tested what would happen if the "Status" column was removed. Everything worked again. I then removed the column from production and the problem was solved.
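For reference, a minimal sketch of how the duplicate column could be removed through the object model (hypothetical URLs; assumes the field is not sealed, and clears AllowDeletion first since some fields refuse deletion otherwise):

# Minimal sketch: remove the duplicate "Status" field from the Tasks list (hypothetical URLs)
$web  = Get-SPWeb "http://server/site/subsite"
$list = $web.Lists["Tasks"]
$field = $list.Fields.GetFieldByInternalName("Status")

# Some fields refuse deletion until AllowDeletion is cleared
if (-not $field.AllowDeletion)
{
    $field.AllowDeletion = $true
    $field.Update()
}

$field.Delete()
$list.Update()
$web.Dispose()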