PowerShell “fun” with text to speech

Writing code can be rather far from glamorous, if not downright boring, sometimes. PowerShell has its appeal, but at the end of the day it is still an administrative framework for task and configuration automation. Even saying that can hurt… If you’re looking to spice it up a bit, the following may help.

function prompt {
    $text = 'You are great!', 'Hero!', 'What a checker you are.', 'Champ, well done!',
            'Man, you are good!', 'Guru stuff I would say.', 'You are magic!'
    'PS> '
    $host.UI.RawUI.WindowTitle = Get-Location
    # Speak a random compliment; [void] stops the return value from leaking into the prompt text.
    [void](New-Object -ComObject Sapi.SpVoice).Speak(($text | Get-Random))
}

prompt
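If the default voice grates, the same COM object also exposes Rate (-10 to 10) and Volume (0 to 100) properties, so you can tune it before wiring it into the prompt. A quick standalone test looks like this:

    # Slow the voice down a touch and lower the volume, then say something.
    $voice = New-Object -ComObject Sapi.SpVoice
    $voice.Rate = -1
    $voice.Volume = 80
    [void]$voice.Speak('Testing, one two three.')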

 

SharePoint PowerShell script to check if a site collection is alive and email if not.

This might help somebody. In brief, the script checks an array of URLs and, if it gets back anything other than a 200 (OK) response, emails out an alert. It can also be used as a keep-alive script (a scheduling example follows the script). It is best run on a server that is not connected to the farm.

$urls= @("http://webapp/sitecollection1/Default.aspx",

"http://webapp/sitecollection2/default.aspx",

"http://webapp/sitecollection3/default.aspx",

"http://webapp/sitecollection4/default.aspx");

# Request all URLs in the array

foreach ($objItemin$urls) {

Write-Host“Checking $objItem”;

$req= [system.Net.WebRequest]::Create($objItem)

$req.UseDefaultCredentials=$true

try {

$res=$req.GetResponse()

} catch [System.Net.WebException] {

$res=$_.Exception.Response

}

$int= [int]$res.StatusCode

$status=$res.StatusCode

write-host“$int $status”

if ($int-ne 200) {

Write-Host”  Sending Email…”;

$enc  =New-ObjectSystem.Text.utf8encoding;

$smtp=”emailserver.domain.com”;

$to=”Recipient 1<username@domain.com>”;

$toCC=Recipient 2 <username@domain.com>”;

$from=”SharePoint farm <username@domain.com>”;

$ScriptName=$MyInvocation.MyCommand.Name;

$scriptPath=split-path-parent$MyInvocation.MyCommand.Definition;

$body=”This was generated by the script $ScriptName in $scriptPath”;

$subject=”URL check failure on $objItem – ‘$int : $status'”;

send-MailMessage-SmtpServer$smtp-To$to-Cc$toCC-From$from-Subject$subject-Body$body-BodyAsHtml-Encoding$enc;

Write-Host”  Sent.”;

}

}
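To use this as the keep-alive mentioned above, it just needs to run on a schedule. A minimal sketch using the Task Scheduler command line (the script path and task name below are placeholders, not part of the original script):

    schtasks /create /tn "SharePointUrlCheck" /sc hourly /tr "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Check-Urls.ps1"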

Searching the source of ALL pages in a SharePoint farm using PowerShell

Recently I had to find all instances of a specific jQuery reference in a small-to-mid-sized farm. I had not had to do this before, and after a few minutes I decided that PowerShell was the tool to use. Here’s what I cobbled together. It’s neither great nor finished, but it is probably useful for somebody right now.

###########################################################################################################################
# BEGIN: Get all pages ASPX & HTM*
function TrawlInventory($Url) {
    $farm = [Microsoft.SharePoint.Administration.SPFarm]::Local;
    $site = New-Object Microsoft.SharePoint.SPSite $Url;

    foreach ($web in $site.AllWebs) {
        $webUrl = $web.Url;
        Write-Host "- Checking: $webUrl";

        foreach ($list in $web.Lists) {
            $listitemsCount = $list.Items.Count;
            Write-Host "- - Checking: $list which has $listitemsCount items";

            if ($listitemsCount -lt 1000) {
                foreach ($item in $list.Items) {
                    $itemName = $item.Name.ToString().ToLower();
                    Write-Host "- - - - Checking file: $itemName";

                    if ($itemName -like '*aspx' -Or $itemName -like '*htm') {
                        $thisURL = $item.Url;
                        $thisWEB = $web.Url;
                        if ($thisURL) {
                            $thisItemsFullURL = "$thisWEB/$thisURL";
                            $itemsFullURL = $thisItemsFullURL.ToString();
                            $uri = "$itemsFullURL";
                            Write-Host "- - - - - Checking source for: $uri";

                            # Download the page source and search it for the target strings
                            $wc = New-Object System.Net.WebClient;
                            $wc.UseDefaultCredentials = $true;
                            $source = $wc.DownloadString($uri);
                            $sourceLC = $source.ToLower();
                            if ($sourceLC.Contains("string1") -Or $sourceLC.Contains("string2")) {
                                Write-Host "Found reference: $itemsFullURL";
                                Add-Content -Path "curls.txt" -Value "$itemsFullURL";
                            }
                            $source = "";
                            $sourceLC = "";
                        }
                    }
                    $itemName = "";
                }
            } else {
                Write-Host "Too many items to check: $listitemsCount";
            }
        }
        $web.Dispose();
    }
}
###########################################################################################################################
$thisDomain = "http://WebAppName/site-collection/";
TrawlInventory $thisDomain;
# END: Get all pages ASPX & HTM*
###########################################################################################################################
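One assumption the script makes: the SharePoint object model has to be available in the session before TrawlInventory is called, so it is best run on a farm server. If the Microsoft.SharePoint assembly is not already loaded, something like this at the top of the script should do it:

    # Load the SharePoint object model if the session does not already have it.
    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")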

Dynamics CRM 2011 got to cloud customers first…

Users of the cloud-hosted version of customer relationship management software from Microsoft, aka Dynamics CRM, will be first in line to get the latest version. Microsoft Dynamics CRM 2011 will be available for on-premises deployment from February 28, but cloud customers can start using it today in 40 countries and 41 languages…

Clearly gunning for Salesforce.com and Oracle CRM customers, Microsoft is trying to undercut its competitors with a promotional price of $34 per user per month. As an added incentive it is offering $200 per user transferred from a competing product to Dynamics CRM for a limited period, to help cover any migration costs, which, speaking from an implementer’s perspective, are not really that bad anyway.

Resources
https://offers.crmchoice.com/Cloud-CRM-For-Less/?locale=en-us
http://crm.dynamics.com/en-us/

Open source projects for PST file access

This goes back a while and I just plain forgot to post about it.

Back in February of 2010, Microsoft released a public specification for PST files, the database format used by Outlook for storing and archiving e-mail. To that specification, Microsoft also added a pair of developer-oriented open source projects: the PST Data Structure View Tool and the PST File Format SDK.

Though the core Office document formats are now XML-based open standards, alleviating such lock-in issues, Outlook has continued to use a complex database format for storing mail, making interoperability difficult. With the documentation and these software projects, the days of being locked into Outlook could be coming to an end.

The SDK project is still not finished; seemingly it only provides read-only access to PST data, though write support is purportedly planned. Both tools are released under the Apache License 2.0, which means they can be incorporated into proprietary, closed-source projects as well as other open source projects.

Overall the information available at the Microsoft Interoperability site is very, very useful and I would strongly recommend a visit.

Resources
http://blogs.msdn.com/b/interoperability/

SharePoint: Multiple databases for 1 Web Application

This process assumes that the SharePoint environment is in a very specific state. Specifically:
• All databases other than the one currently being used for new My Sites are in an offline state, and there is only ever one Web Application available.
• There is a defined business need for a series of smaller databases rather than one large and potentially unwieldy one.

When a SharePoint database is set to offline it does not become inaccessible; it simply stops being considered a valid target by the process that creates new site collections. Instead, SharePoint looks at the difference between the current number of sites and the configured maximum in every other online content database and selects the one with the greatest headroom.

To set a content database status to offline:
1. Go to “Central Administration > Application Management > Content Databases”
2. Then click on the database of your choice and set it to “Offline”

To create a new database for every new site collection, the following steps need to be followed (a command-line sketch of step 1 follows the list).
1. Create a new content database.
a. Start the SharePoint Central Administration Web site.
b. On the Application Management page, in the SharePoint Web Application Management section, click Content databases.
c. On the Manage Content Databases page, click Add a content database.
d. On the Add Content Database page:
e. Select a Web application for the new database.
f. Select a database server to host the new database.
g. Specify the authentication method the new database will use and supply an account name and password if necessary.
h. Specify both the total number of top-level sites that can be created in the database and the number at which a warning will be issued.
i. Click OK.
2. Create the new Site Collection.
3. Set the newly created content database status to Offline.
4. View the contents of the new content database to confirm that it only contains what it should. (In theory a new My Site could be in there.)
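If you prefer the command line over Central Administration, step 1 can also be done with stsadm. A rough sketch from memory (the web application URL, database name, and site limits are placeholders; check the addcontentdb operation’s parameters on your build before relying on it):

    stsadm -o addcontentdb -url http://webapp -databasename WSS_Content_MySites_02 -sitemax 1 -sitewarning 0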

It is important to confirm that this new database is appropriately accounted for by backup and DBA maintenance routines.

If a site collection was errantly created during the period that the new content database was online, it can be moved as follows:

1. Run the following stsadm command to get a list of all the sites in the web application:

Stsadm -o enumsites -url <URL> > <path/file name>.xml

Where:
• <URL> is the address of the Web application that contains the site collection that you want to move, and
• <path/file name> is the name of the XML file that you want to create with the site collection data.

(Example: stsadm -o enumsites -url http://localhost > c:\Sites.xml)

Open the XML file that you created in a text editing application. If there are any URLs for site collections that you do not want to move, be sure to delete them from the file. The only URLs that should remain in the XML file should be for the site collections that you want to move.

Note: There is no need to change the site count or any of the other site collection information in the file. Only the URLs are relevant to this procedure.
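If hand-editing the XML feels error prone, PowerShell can do the pruning. A hedged sketch (the file paths, the URL to keep, and the Site/Url element names are assumptions based on typical enumsites output):

    # Keep only the site collection to be moved; drop every other <Site> node from the enumsites output.
    [xml]$sites = Get-Content "c:\Sites.xml"
    $keepUrl = "http://localhost/sites/errant-site"
    $sites.Sites.Site |
        Where-Object { $_.Url -ne $keepUrl } |
        ForEach-Object { [void]$_.ParentNode.RemoveChild($_) }
    $sites.Save("c:\SitesToMove.xml")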

2. Run the following stsadm command:
Stsadm -o mergecontentdbs -url <URL> -sourcedatabasename <database name> -destinationdatabasename <destination database name> -operation 3 -filename <file name>

Where:
• <URL> is the address of the Web application that contains the site collection that you want to move;
• <database name> is the name of the database that you want to move the site collection from;
• <destination database name> is the name of the database that you want to move the site collection to;
• operation 3 is the “Read from file” operation; and
• <file name> is the name of the XML file that you created in step 1.

(Example: stsadm -o mergecontentdbs -url http://localhost -sourcedatabasename WSS_Content -destinationdatabasename WSS_Content2 -operation 3 -filename c:\Sites.xml)

Note: This step assumes that all of the sites in the Sites.xml file were in the source database. If the sites are located in multiple databases you may need to repeat these steps for each source content database.

3. Restart IIS by typing the following command, and then pressing ENTER: iisreset /noforce.

Windows MultiPoint Server 2010

Built on top of Windows Server 2008 R2, Windows MultiPoint Server 2010 seems to bring back the days of the mainframe.

The general goal of MultiPoint Server is to reduce costs for certain types of businesses and schools. The solution works by having one server feed multiple mice, keyboards, speakers and monitors. Each session provides a user with a unique Remote Desktop to the MultiPoint Server. In other words, the dumb terminal makes a return once again. (Needless to say, MultiPoint faces stiff competition from the likes of Userful Corporation, the world leader in multiseat Linux desktop virtualization.)

When I looked at it I immediately thought about how this would work at home and at the office. Wireless keyboards and mice, complemented by wireless monitor adapters (that can support audio!) connected to your television? That would make my life very easy, and very much cheaper, at home.

 It also could be very useful for development shops with small budgets.

http://www.microsoft.com/windows/multipoint/default.aspx

http://www2.userful.com/

SharePoint 2010 Optimization: Start with SQL

If you need more than 4GB of storage for your SharePoint 2010 environment and blob linking is not for you, working with SQL Server is on your plate. SharePoint 2010 has a lot of databases, each and every one of which has a role and a performance profile. Knowing them can help you squeeze that extra bit out of your environment.

A typical SharePoint 2010 environment (or maybe not so typical, depending on your individual needs, budget, etc.) has the potential to look like this. If yours does not, and it almost certainly does not 🙂, I’m sure you can still see similarities.

Viewed in a compartmentalized fashion, skewed towards where you can gain real and perceived performance improvements, there are roughly three major sections:

A: The database
B: The WFEs
C: The UI / General payload

I’m going to focus on the databases, what they are, and what they do. With knowledge comes power as they say.

So starting with the basics…


System
A normal installation of SQL drops a couple of databases on the disk.
Expect to see:

  1. MASTER: Records all system level information for a SQL Server instance. E.g. logins, configuration, and other databases.
  2. MSDB: Records operators and is used by SQL Server Agent to schedule jobs and alerts.
  3. MODEL: Is used as the template for all the databases created in the instance.
  4. TEMPDB: All temporary tables, temporary stored procedures, and anything else that is ‘temporary’ is stored here. Like a cache this is recreated every time the SQL Server instance is started. 

Notes:
  • Of these four, only the last one, TEMPDB, really should be a candidate for its own spindle (a quick way to check its current layout follows these notes).
  • Size wise, TEMPDB is the only one that you should expect any type of growth from.
  • All four scale vertically, as in you can only allow them to grow in size rather than add extra databases.
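Before committing TEMPDB to its own spindle it is worth a quick look at where its files currently live and how big they are. A small sketch, assuming the SQL Server PowerShell tools (which provide Invoke-Sqlcmd) are installed and you are pointing it at the local default instance:

    # List tempdb's files, their locations, and their sizes in MB (size is stored in 8KB pages).
    Invoke-Sqlcmd -ServerInstance "." -Query "SELECT name, physical_name, size * 8 / 1024 AS SizeMB FROM tempdb.sys.database_files;"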
Reporting Services
Reporting Services are often used with SharePoint 2010’s Excel, Visio, and PerformancePoint Services but are not required. Access Services does require them, specifically the SQL Server 2008 R2 Reporting Services Add-in.
Expect to see:
  1. ReportServer: This stores all the metadata for reports such as definitions, history, scheduling, and snapshots. When ReportServer is actually in use the report documents are stored in SharePoint Content databases.
  2. ReportServerTempDB: When reports are running temporary snapshots are found here.

Notes:
  • Both of these databases must coexist on the same database server. 
  • They have a heavy read load.
  • The ReportServerTempDB can grow if there is heavy use of cached snapshots. Just like the System databases, both scale vertically, as in you can only allow them to grow in size rather than add extra databases.
Overview 1
Okay, so now we have a bunch of databases sitting on our server, making it look a tad like this:
Now let’s start to add in SharePoint 2010…
Microsoft SharePoint Foundation 2010
Microsoft SharePoint Foundation 2010 drops a number of databases onto your disks. (If you install Search Server 2010 Express you will see some of these databases; I have put a * next to the names that are in play for that scenario.)
Expect to see:

  1. Configuration*: As implied by its name, this is where information about the environment is stored. Databases, IIS sites, Web Applications, Trusted Solutions, Web Part Packages, site templates, blocked file types, quotas – they’re all in here. There can only be one configuration database per farm and you can expect it to remain relatively small.
  2. Central Administration Content*: Just like the configuration database there can only be one, it doesn’t really grow much, and as its name implies it stores the Central Administration site’s content.
  3. Content*: All site content is stored in these guys. Site pages, documents, lists, web part properties, audit logs, user names and rights, they’re all in here. Office Web Application data is also stored in here.
  4. Usage*: The Usage and Health Data Collection Service Application uses this database. This write heavy database is unique in that it is the only database that can be (and should be) queried directly by external applications. There can only be one per farm. Health monitoring and usage data used for reporting and diagnostics is in here.
  5. Business Data Connectivity*: External Content Types and related objects are in here. It is typically a read heavy yet small sized database. 
  6. Application Registry*: You really only see this if backward compatibility with the BDC API is needed. Typically this can be deleted AFTER upgrading is complete.
  7. Subscription Settings: Features and settings for hosted customers gets put in here. It is not something you normally see as it has to be manually created using Powershell or SQL Server. If it is present you can expect it to remain small and busy only from reads.
Note:

  • Content databases can hold multiple site collections and their content. 
  • They can get BIG. Their size is dependent on size and count of both content and users.
  • 200GB should be your ceiling; odds are you will see a performance hit as you pass that number. 1TB or more is an option, but only for Records Centers and the like, i.e. non-collaborative, read-only usage and data retrieval. Add more content databases as you approach that 200GB level (a PowerShell sketch of this follows these notes).
  • Due to its nature it is wise to move the Usage database to a separate spindle.
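As referenced in the content database note above, here is a rough PowerShell sketch of adding another content database and capping its site count once you approach that ceiling. The database name, web application URL, and site counts are placeholders:

    # Add a second content database to the web application and cap how many site collections it will accept.
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    New-SPContentDatabase -Name "WSS_Content_02" -WebApplication "http://webapp" -MaxSiteCount 5000 -WarningSiteCount 4500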
Overview 2
Okay, so now we have Foundation running with a bunch more databases sitting on our server making it look a tad like this:


Right, so now let’s move on to the Standard edition.

Microsoft SharePoint Server 2010 Standard
Compared directly to Foundation, Microsoft SharePoint Server 2010 Standard drops a number of extra databases onto your disks. (If you install Search Server 2010 Express you will see some of these databases; I have put a * next to the names that are in play for that scenario.)
Expect to see:
  1. Search Administration*: Used by the Search service application; the ACL and configuration information is in here. Growth can be vertical or horizontal – if you create new instances of the service application.
  2. Crawl*: Again used by the Search application, the state of the crawled content/data and the historical crawl records are in here. This database can get a bit big and if you are crawling LARGE amounts of data it is a good idea to not have it on the same server as the next database, Property. It is also a good idea to be using the Enterprise version of SQL so that you can leverage the data compression functionality. This is a read heavy database.
  3. Property*: Again used by the Search application, data associated with the crawls is stored in here, such as history, crawl queues, and general properties. If you have large volumes of data, move this to its own server and you will see performance improvements with regard to search query result returns. This is a write heavy database.
  4. Web Analytics Reporting*: Used by the Web Analytics application, aggregated report tables, fact data grouped by site – date – asset, diagnostic information can be found in here. Depending on your policies this database can get massive. As it grows scale out by adding more databases. 
  5. Web Analytics Staging*: Temporary data (unaggregated) is in here. 
  6. State*: Used by the State Service application, InfoPath Forms Services, and Visio Services. Temporary state information for Exchange is also in here. More State databases can be added via PowerShell (see the sketch after these notes).
  7. Profile: Used by the User Profile service application, users and their information is found and managed in here. Additional databases can be created if you create additional service instances. Expect lots of reads and medium growth.
  8. Synchronization: Again, used by the User Profile service application. It contains configuration and staging data for use when a directory service such as AD is being synched with. Scaling up or out is the same as with Profile. Its size can be influenced by not just the number of users and groups but their ratios as well. Expect fairly balanced read vs write activity.
  9. Social Tagging: Again, used by the User Profile service application. As users create social tags and notes they are stored, along with their URLs in here. Expect growth to be representative of how much your community embraces the functionality of social tagging. Scaling up or out is the same as with Profile. This is definitely more read than write heavy. 
  10. Managed Metadata Service: Syndicated content types and managed metadata is stored in here. Growth can be vertical or horizontal – if you create new instances of the Managed Metadata service application. 
  11. Secure Store*: Account names, passwords, and their mappings are stored in here. This is where you may have to collaborate with other internal groups, as it is suggested that this database be hosted on a separate database instance with limited access. Growth can be vertical or horizontal – if you create new instances of the service application.

Note:
  • Watch the Web Analytics Reporting database. It can get very big.
  • Use compression for Crawl if you have the Enterprise version of SQL.
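And as referenced in the State entry above, the sketch for adding an extra State Service database with PowerShell looks roughly like this (the database name is a placeholder):

    # Add another database to the existing State Service application.
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    $stateApp = Get-SPStateServiceApplication
    New-SPStateServiceDatabase -Name "StateService_DB2" -ServiceApplication $stateApp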
Overview 3
Okay, so now we have SharePoint Server 2010 Standard edition running with a bunch more databases sitting on our server making it look a tad like this:

Right, so now let’s move on to the Enterprise edition…

Microsoft SharePoint Server 2010 Enterprise
Microsoft SharePoint Server 2010 Enterprise drops extra databases on top of the Standard list to handle Word and PerformancePoint.
Expect to see:
  1. Word Automation Services: Used by the Word Automation Services service application; information about pending and completed document conversions is found in here. This database can be expected to remain relatively small.
  2. PerformancePoint: Used by the PerformancePoint service application; settings, persisted user comments, and temporary objects are stored in here. It can be expected to be read heavy and can be scaled up or out.
Overview 4
Okay, so now we have SharePoint Server 2010 Enterprise edition running on our server making it look a tad like this:

Where else can we go from here? Well, there’s always Project Server, PowerPivot, and FAST… All are complicated products that require SharePoint Server 2010 Enterprise.


Microsoft Project Server 2010


Expect to see:

  1. Draft: Not accessible to end users, this is used to store data used by the project queue. There can only be one as far as I know.
  2. Published: When a project is published this is where it lives. Timesheets, resources, custom fields, and all the other metadata are stored in here. This database can get large.
  3. Archive: Backup data for projects, resources, calendars, etc. is in here. Growth can be limited; if it is not limited and the archive is used, it will get large.
  4. Reporting: This is the repository of the entire project portfolio. When you publish a project plan, a copy will be put in here. Read heavy and large…


Microsoft FAST Search Server 2010 for SharePoint


Expect to see:

  1. Search Administration: Stores and manages search setting groups, keywords, synonyms, promotions/demotions, best bets, and search schema metadata. It should stay on the small side and remain read heavy.


Microsoft SQL Server PowerPivot for SharePoint


Expect to see:

  1. PowerPivot Service Application: Cached and loaded PowerPivot files are in here along with usage data and schedules. You can expect this database to remain small.  
Note:
  • PowerPivot files can be rather big and PowerPivot stores data in content databases as well as the central administration database. 
So where does that leave us? If you have everything above installed you should expect to see something like this…
Looking at these databases and what could / should be candidates for having their own platters, you should start to see something like this:


Note that the Secure Store should be moved primarily for security purposes. 


It has to be mentioned as well that each and every environment can prove to be unique so you may not see value in moving anything other than your content databases to another platter.

So what does this look like in the wild? Viewing a screenshot of an admittedly (and deliberately) slightly duplicated environment, you can see these databases as they would appear in real life…

Their file system looks like this:

Clearly there is job security for DBAs going forward :). There is also a level of complexity that merits caution and planning when considering SharePoint 2010. The phrase “a stitch in time saves nine” continues to hold value…

Moving large numbers of records with SqlBulkCopy

Sometimes you have to move large amounts of data in, out, or around SQL. There are plenty of ways to do that, but few are as fast as using SqlBulkCopy. It is not merely fast; it is blazingly fast. As with anything else it has its place and limitations, but it is well worth understanding at the least.
It is pretty easy to work with, as the example below shows…
static void CopyData(DataTable sourceTable, SqlConnection destConnection)
{
    // Bulk copy the rows of sourceTable into the SalesCopy table on the destination connection.
    using (SqlBulkCopy s = new SqlBulkCopy(destConnection))
    {
        s.DestinationTableName = "SalesCopy";
        s.NotifyAfter = 10000;
        s.SqlRowsCopied += new SqlRowsCopiedEventHandler(s_SqlRowsCopied);
        s.WriteToServer(sourceTable);
    }
}

// Progress callback, fired every NotifyAfter (10,000) rows.
static void s_SqlRowsCopied(object sender, SqlRowsCopiedEventArgs e)
{
    Console.WriteLine("Copied {0} rows so far.", e.RowsCopied);
}

When used properly it can make seemingly large data volume transactions trivial.
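And because most of this blog lives in PowerShell, the same class is just as usable from there. A hedged sketch (the server, database, and table names are placeholders) that builds a small DataTable and bulk-copies it into dbo.SalesCopy:

    # Build a throwaway DataTable and push it into dbo.SalesCopy with SqlBulkCopy.
    $sourceTable = New-Object System.Data.DataTable
    [void]$sourceTable.Columns.Add("Id", [int])
    [void]$sourceTable.Columns.Add("Amount", [decimal])
    1..5 | ForEach-Object { [void]$sourceTable.Rows.Add($_, $_ * 10.5) }

    $destConnection = New-Object System.Data.SqlClient.SqlConnection "Server=.;Database=SalesDb;Integrated Security=True"
    $destConnection.Open()
    try {
        $bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy $destConnection
        $bulkCopy.DestinationTableName = "dbo.SalesCopy"
        $bulkCopy.WriteToServer($sourceTable)
    } finally {
        $destConnection.Close()
    }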

Visual Studio 2010 problem with Vault

Working with Visual Studio 2010 and SQL 2008 I got this message out of the blue:
Unable to cast COM object of type ‘System.__ComObject’ to interface type ‘Microsoft.VisualStudio.OLE.Interop.IServiceProvider’. This operation failed because the QueryInterface call on the COM component for the interface with IID ‘{6D5140C1-7436-11CE-8034-00AA006009FA}’ failed due to the following error: No such interface supported (Exception from HRESULT: 0x80004002 (E_NOINTERFACE)).

Whoah… After a reboot, some thought, a few more attempts to build, and a bit of luck, I deduced with reasonable certainty that Vault, which I use for version control, was the culprit.
So I closed Studio 2010 and removed Vault. Then I opened the same solution and did a build with no problems. Next step is to figure out what to do about getting Vault back into play. Or not.