SP2010 seeing more than “An unexpected error has occurred”

Anybody who works with SharePoint has seen the following “Awesome” message…

An unexpected error has occurred

There’s more detail to be had with a simple tweak of the web.config, in both MOSS 2007 and SP2010.

Navigate to your virtual directory, typically here: C:\inetpub\wwwroot\wss\VirtualDirectories, and then to the folder for your web application.

Open the web.config in Notepad++ or any other editor and search for the following tags (there is only one of each):

  • CustomErrors
  • CallStack

Note: change them to the below (customErrors takes a mode attribute; CallStack is an attribute on the SafeMode tag):

  • customErrors mode="Off"
  • CallStack="true"
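
For reference, here is roughly where the two settings live once changed. This is a trimmed sketch, not a complete web.config; the surrounding attributes vary by version and build:

<configuration>
  <SharePoint>
    <!-- CallStack sits on the SafeMode element -->
    <SafeMode MaxControls="200" CallStack="true"
              DirectFileDependencies="10" TotalFileDependencies="50"
              AllowPageLevelTrace="false" />
  </SharePoint>
  <system.web>
    <!-- customErrors controls the friendly error page -->
    <customErrors mode="Off" />
  </system.web>
</configuration>

Remember to revert both settings once you are done debugging.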

SharePoint 2007: How to show and link to an attachment in a list

Get to the field in the view and put this in. Should work without a hitch…

<xsl:element name="SharePoint:AttachmentsField">
  <xsl:attribute name="runat">server</xsl:attribute>
  <xsl:attribute name="ListId">{LIST GUID HERE}</xsl:attribute>
  <xsl:attribute name="FieldName">Attachments</xsl:attribute>
  <xsl:attribute name="ControlMode">Display</xsl:attribute>
  <xsl:attribute name="Visible">true</xsl:attribute>
  <xsl:attribute name="ItemId">
    <xsl:value-of select="@ID"/>
  </xsl:attribute>
</xsl:element>

Display RSS with JavaScript in SharePoint

Displaying RSS feeds is a nice thing to be able to do. One way to do this is with JavaScript in a Content Editor Web Part. Code example below…

<div id="feed-control">
  <span style="color:#676767;font-size:11px;margin:10px;padding:4px;">Loading…</span>
</div>

<!-- Google Ajax API -->
<script src="http://www.google.com/jsapi?key=notsupplied-wizard" type="text/javascript"></script>

<!-- Dynamic Feed Control and Stylesheet -->
<script src="http://www.google.com/uds/solutions/dynamicfeed/gfdynamicfeedcontrol.js" type="text/javascript"></script>
<style type="text/css">
  @import url("http://www.google.com/uds/solutions/dynamicfeed/gfdynamicfeedcontrol.css");
</style>

<script type="text/javascript">
  function LoadDynamicFeedControl() {
    var feeds = [
      {
        title: 'Feed Items',
        url: 'http://coreboarder.blogspot.com/feeds/posts/default?alt=rss'
      }
    ];
    var options = {
      stacked: true,
      horizontal: false,
      title: "Preview"
    };

    new GFdynamicFeedControl(feeds, 'feed-control', options);
  }
  // Load the feeds API and set the onload callback.
  google.load('feeds', '1');
  google.setOnLoadCallback(LoadDynamicFeedControl);
</script>

The value of distributed computing: The return of Markov chain Monte Carlo methods

A while back I wrote something on doing Monte Carlo simulations with web services and SharePoint. Halfway through I mentioned that Google PageRank is defined by a Markov chain, which in turn is an output of a process called Markov chain Monte Carlo methods. Not that it concerned me, but only one person mentioned this, and even that was a vague mention. Huh…

This actually is a big deal. In fact, a very big deal. A multi-billion-dollar deal, as in the case of Google PageRank. Distributed computing has the power to help us solve many things if applied correctly. The “cloud” does not. (A topic for later.) Probably the greatest hurdle in getting people back on track is that this technology has uses beyond the scope of most people’s daily lives. For example…

A paper was published in PLoS last week, September 4th 2009, called “Can an Eigenvector Measure Species’ Importance for Coextinctions?” In it the authors state that “PageRank” can be applied to the study of food webs. Food webs are the complex networks of who eats whom in an ecosystem. (Typically we’re at the top, unless Hollywood or very bad planning is involved.) Essentially, the scientists are saying that their particular version of PageRank could be a simple way of working out which extinctions would lead to ecosystem collapse. A relatively handy thing to have these days…

As every species is embedded in a complex network of relationships with others, even a single extinction can rapidly cascade into the loss of seemingly unrelated species. Investigating when this might happen using more conventional methods is complicated, as even in simple ecosystems the number of combinations exceeds the number of atoms in the universe. For scale: a lottery with 8 numbers that can each range between 1 and 50 already has 50^8 = 39,062,500,000,000 different combinations…

The researchers had to tweak PageRank to adapt it for their ecology-focused purposes.

“First of all we had to reverse the definition of the algorithm,” the researchers explain. “In PageRank, a web page is important if important pages point to it. In our approach a species is important if it points to important species.”

They also tested it against algorithms already in use in computational biology to solve the same problem. PageRank, in its adjusted form, gave them exactly the same solution as these much more complicated algorithms.
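
To make the mechanics concrete, here is a toy power-iteration sketch of PageRank in C#. This is my illustration, not the paper’s code; the 4-node link graph, damping factor, and iteration count are all made up. For the food-web variant described above you would, per the authors, reverse the direction of the links.

using System;

class PageRankSketch
{
    static void Main()
    {
        // Made-up 4-node graph: links[i, j] == true means node i links to node j.
        bool[,] links = {
            { false, true,  true,  false },
            { false, false, true,  false },
            { true,  false, false, true  },
            { false, false, true,  false }
        };
        int n = 4;
        double d = 0.85;                                 // standard damping factor
        var rank = new double[n];
        for (int i = 0; i < n; i++) rank[i] = 1.0 / n;   // start uniform

        // Power iteration: repeatedly push rank along the links until it settles.
        for (int iter = 0; iter < 50; iter++)
        {
            var next = new double[n];
            for (int j = 0; j < n; j++) next[j] = (1 - d) / n;
            for (int i = 0; i < n; i++)
            {
                int outDeg = 0;
                for (int j = 0; j < n; j++) if (links[i, j]) outDeg++;
                for (int j = 0; j < n; j++)
                    if (links[i, j]) next[j] += d * rank[i] / outDeg;
            }
            rank = next;
        }

        // The result is the stationary distribution of the underlying Markov chain.
        Console.WriteLine(string.Join(", ", rank));
    }
}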

With the right design SharePoint can be an extremely useful, and totally appropriate, interface for accessing and disseminating the inputs and outputs of such an effort. It can store and present this data with all of the requisite benefits one would expect from a collaborative platform. Certainly there’s a world of work involved in doing something like this, but the key point is that the “right tool for the right job” mantra works here. “All” you need is:

  • IIS
  • .NET
  • SharePoint
  • PowerShell
  • Visual Studio
  • SQL
  • Skill

SiteMap in a custom SharePoint master page

Only a start, but if you are having problems this may help…

<asp:SiteMapPath SiteMapProvider="SPContentMapProvider" ID="ContentMap" SkipLinkText="" CssClass="ms-sitemapdirectional" runat="server"/>
<asp:ContentPlaceHolder ID="PlaceHolderTitleBreadcrumb" runat="server" Visible="False">
  <div class="breadcrumb">
    <asp:SiteMapPath ID="siteMapPath" runat="server"
        SiteMapProvider="SPContentMapProvider"
        RenderCurrentNodeAsLink="true"
        CurrentNodeStyle-CssClass="breadcrumbCurrent"
        NodeStyle-CssClass="ms-sitemapdirectional"
        PathDirection="RootToCurrent">
      <CurrentNodeStyle CssClass="breadcrumbCurrent"/>
      <NodeStyle CssClass="ms-sitemapdirectional"/>
    </asp:SiteMapPath>
  </div>
</asp:ContentPlaceHolder>

How to: turn the left navigation in SharePoint into an accordion style one

Add this code to your master page. Point the Google jsapi script at a local copy if you don’t want to keep making remote calls.

<script type="text/javascript" src="http://www.google.com/jsapi"></script>
<script type="text/javascript">
// Load jQuery
google.load("jquery", "1.2.6");
</script>
<script type="text/javascript">
$(function(){
  //initialize menus
  var menuRows = $("[id$='QuickLaunchMenu'] > tbody > tr");
  var menuHd = menuRows.filter("[id!='']:has(+tr[id=''])");
  //set img path for when submenu is hidden
  var closedImg = "/_layouts/images/Menu1.gif";
  //set img path for when submenu is visible
  var openedImg = "/_layouts/images/ptclose.gif";
  var cssInit = {
    "background-image": "url('" + closedImg + "')",
    "background-repeat": "no-repeat",
    "background-position": "100% 50%"
  };
  var cssClosed = {"background-image": "url('" + closedImg + "')"};
  var cssOpen = {"background-image": "url('" + openedImg + "')"};
  //hide submenus
  menuRows.filter("[id='']").hide();
  //apply initial inline style to menu headers
  menuHd.find("td:last").css(cssInit);
  menuHd.click(function () {
    var styleElm = $(this).find("td:last");
    var nextTR = $(this).next("tr[id='']");
    if (nextTR.is(':visible')) {
      nextTR.hide();
      styleElm.css(cssClosed);
    } else {
      nextTR.show();
      styleElm.css(cssOpen);
    }
  });
});
</script>


Making links in SharePoint list URL fields open in a new window

Open the schema file for the links list feature:

C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\TEMPLATE\FEATURES\LinksList\Links\schema.xml

There should be two instances of the following:

<Column Name="URL" HTMLEncode="TRUE" /><HTML><![CDATA[">]]></HTML>

Replace them both with this:

<Column Name="URL" HTMLEncode="TRUE" /><HTML><![CDATA[" target="_blank">]]></HTML>

A Thought on Monte Carlo Simulation Using Parallel Asynchronous Web Services with .NET and SharePoint

Monte Carlo simulation is a technique used to estimate the likely range of outcomes of a complex process by simulating the actual process, a large number of times, with randomly selected inputs that are true to the process model. (In fact, the more you do it the better your data.) The Monte Carlo method is best applied whenever a deterministic solution would either be too computationally intensive or simply does not exist.

Monte Carlo Simulation is used in/with

  • Physical sciences
  • Design and visuals
  • Finance and business
  • Telecommunications
  • Games

Monte Carlo simulation is not a “what if” process. What-ifs require single-point estimates and use deterministic modeling: basically you are using best case, worst case, and so on, with equal weight for each scenario. With Monte Carlo you instead consume large random samplings, sourced from probability distribution functions, to produce a full distribution of outputs, which in turn lets you state a narrower range of likely outcomes with greater confidence. The sketch below shows the difference.
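
A minimal C# illustration with made-up numbers: a two-task project estimated once with single-point “most likely” values, and once by sampling triangular distributions for each task. The Triangular helper and the task parameters are invented for the example.

using System;
using System.Linq;

class McVsWhatIf
{
    // Inverse-CDF sampling from a triangular(min, mode, max) distribution.
    static double Triangular(Random r, double min, double mode, double max)
    {
        double u = r.NextDouble();
        double f = (mode - min) / (max - min);
        return u < f
            ? min + Math.Sqrt(u * (max - min) * (mode - min))
            : max - Math.Sqrt((1 - u) * (max - min) * (max - mode));
    }

    static void Main()
    {
        // Deterministic "what if": just add the most likely durations.
        Console.WriteLine("Single-point estimate: " + (5.0 + 8.0) + " days");

        // Monte Carlo: sample both tasks many times, then read off percentiles.
        var r = new Random(1);
        var totals = Enumerable.Range(0, 100000)
            .Select(_ => Triangular(r, 3, 5, 12) + Triangular(r, 6, 8, 20))
            .OrderBy(t => t)
            .ToArray();

        Console.WriteLine($"P10: {totals[10000]:F1}  P50: {totals[50000]:F1}  P90: {totals[90000]:F1}");
    }
}

The single-point answer says 13 days flat; the sampled version tells you how likely 13 days actually is.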

Why is this pertinent? Well, stay with me on this one: Markov chain methods are extremely useful for generating sequences of random numbers that accurately reflect rather complicated desired probability distributions, via a process called Markov chain Monte Carlo methods. In other words, a tool used to generate simulations from a probability distribution… (a bare-bones sampler follows below).
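
For the curious, here is about the smallest possible MCMC sampler: a random-walk Metropolis chain in C#. The target density F and the step size are arbitrary choices for illustration.

using System;
using System.Linq;

class MetropolisSketch
{
    // Unnormalized target density; any nonnegative function will do.
    // Here: a standard normal bell curve.
    static double F(double x) => Math.Exp(-x * x / 2);

    static void Main()
    {
        var rng = new Random(42);
        double x = 0.0;
        var samples = new double[100000];

        for (int i = 0; i < samples.Length; i++)
        {
            // Propose a small random step from the current state.
            double candidate = x + (rng.NextDouble() - 0.5);

            // Accept with probability min(1, F(candidate) / F(x));
            // otherwise stay put. This is the whole Metropolis rule.
            if (rng.NextDouble() < F(candidate) / F(x))
                x = candidate;

            samples[i] = x;
        }

        // The chain's history is (approximately) a sample from the target.
        Console.WriteLine($"sample mean ~ {samples.Average():F3}");
    }
}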

The Google PageRank of a webpage is defined by a Markov chain.

And the penny drops…

Now, back to the point.

Depending on the degree of accuracy ultimately required, millions or billions of points may need to be tried. Distributing billions of point calculations across multiple servers running Monte Carlo simulations via web services would parallelize the process and generate results VERY quickly. Good in concept, but how to do it?

As defined by the W3C, a web service is “a software system designed to support interoperable machine-to-machine interaction over a network.” Running web services on IIS has advantages not limited to the following (a sketch of the pattern comes after the list):

  • You can grow your “cluster” by just deploying the web service to new nodes.
  • Each web service call with IIS is a thread which should have obvious and positive performance implications.
  • Web services provide a relatively simple and straightforward method of distributing parallel problems across multiple compute platforms.
  • Because web services are written like traditional functions, they are easily parallelized without hand-coding a multi-threaded application, custom-writing a message passing interface, or using other high performance computing management software.
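
Here is a hedged sketch of what one node could look like as a classic ASMX web service in C#. The class name, namespace, and the pi-by-darts kernel are all placeholders for your real model:

using System;
using System.Web.Services;

[WebService(Namespace = "http://example.com/montecarlo")]   // hypothetical namespace
public class MonteCarloService : WebService
{
    // Runs one independent batch of trials and returns this node's estimate.
    // Each node should be given a different seed so the streams don't overlap.
    [WebMethod]
    public double RunBatch(int trials, int seed)
    {
        var rng = new Random(seed);
        int hits = 0;
        for (int i = 0; i < trials; i++)
        {
            double x = rng.NextDouble(), y = rng.NextDouble();
            if (x * x + y * y <= 1.0) hits++;   // stand-in kernel: estimating pi
        }
        return 4.0 * hits / trials;
    }
}

Deploy that to every node, then have the coordinator invoke the generated proxy asynchronously against each one and average the returns. Because the batches share no state, the fan-out is embarrassingly parallel.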

Needless to say, unless your requirements can be served by parallel computations that have no dependency on others in the pipe, this is going to become very difficult, or rather “challenging” 🙂, very, very quickly.

So how could SharePoint fit in? SharePoint is perfect for acting as a landing point for your data: in and out. Companies benefit by building intelligence into their document libraries and lists with workflows. With workflow, SharePoint can act as a central hub for the data, sending it out to a queue which distributes it to nodes on the network. Upon return, the data could be used to populate lists and document libraries, notify people/groups, and more. Search, the BDC, security, and all the other features in SharePoint make this concept a compelling one.

A random tidbit on non random data

I was recently talking with somebody who felt that TrueCrypt hidden volumes were the bee’s knees. The scenario they used, and which I myself have read ‘musings’ about, involved a laptop carrying sensitive corporate data being seized by customs. The laptop drive gets “reviewed”, the secret container is not seen, and the laptop passes as normal and uninteresting. Big deal. The bigger deal is if you have 007-style data and that guy in the uniform is pretty certain you have it as well. My colleague’s version of the story ends with an almost Hollywood-style exhalation of breath and a cinematic zoom out to the hero walking out the door. That’s probably not how it would pan out…

TrueCrypt volumes, which are essentially files, have certain characteristics that allow programs such as TCHunt to detect them with a high *probability*. The most significant, in mathematical terms, is that their file size modulo 512 is 0. Now, it is certainly true that TrueCrypt volumes do not contain known file headers and that their content is indistinguishable from random data, so it is difficult to definitively prove that certain files are TrueCrypt volumes. However, their very presence can provide reasonable suspicion that they contain encrypted data.

The actual math behind this is interesting. TrueCrypt volume files have file sizes that are evenly divisible by 512, and their content passes chi-square randomness tests. A chi-square test is any statistical hypothesis test in which the sampling distribution of the test statistic is a chi-square distribution* when the null hypothesis is true, or any in which this is asymptotically true, meaning that the sampling distribution (if the null hypothesis is true) can be made to approximate a chi-square distribution as closely as desired by making the sample size large enough. A rough sketch of both checks is below.
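
Here is a simplified C# approximation of the two checks: the 512-byte alignment and a chi-square statistic over byte frequencies. It is illustrative only and is not TCHunt’s actual code; the critical values quoted in the comment are rough.

using System;
using System.IO;

class VolumeHeuristic
{
    static void Main(string[] args)
    {
        byte[] data = File.ReadAllBytes(args[0]);

        // Check 1: TrueCrypt volume sizes are evenly divisible by 512.
        bool sizeAligned = data.Length % 512 == 0;

        // Check 2: chi-square statistic of byte counts against a uniform
        // expectation. Encrypted content should look uniformly random.
        var counts = new long[256];
        foreach (byte b in data) counts[b]++;
        double expected = data.Length / 256.0;
        double chi2 = 0;
        for (int i = 0; i < 256; i++)
        {
            double diff = counts[i] - expected;
            chi2 += diff * diff / expected;
        }

        // With 255 degrees of freedom, uniformly random bytes give a statistic
        // near 255; values far outside roughly 200-310 suggest the content is
        // NOT random-looking and so is unlikely to be an encrypted volume.
        Console.WriteLine($"size % 512 == 0: {sizeAligned}, chi-square: {chi2:F1}");
    }
}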

So what does this all mean? Really nothing for us normal people. For those for whom I have built custom STSADM containers for securing your backups and exports, your data is still secure and will stay that way indefinitely. For those running across the border: a forensic analysis will reveal the presence of encrypted data, TrueCrypt volumes or otherwise, but not much more. Sometimes that’s enough to start asking questions or poking further. With the forensic tools, not the dentistry kit.

* A skewed distribution whose shape depends on the number of degrees of freedom. As the number of degrees of freedom increases, the distribution becomes more symmetrical.

http://www.truecrypt.org/
http://16systems.com/TCHunt/