Using stacktrace.js to log AngularJS errors to a SharePoint list

stacktrace.js is a framework-agnostic micro-library for getting stack traces in all web browsers. It lets you debug your JavaScript by giving you a nicely detailed stack trace of the function calls leading up to an error (or any condition you specify).

http://www.stacktracejs.com/
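For context, here is a minimal standalone sketch of the library's v0.x API (the version assumed throughout this post), which exposes a global printStackTrace function:

// A minimal sketch, assuming the stacktrace.js v0.x global API.
try {
    undefinedFunction(); // deliberately trigger a ReferenceError
} catch (e) {
    // printStackTrace returns an array of strings, one per stack frame.
    var trace = printStackTrace({ e: e });
    console.log(trace.join("\n"));
}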


1: Create a new list; in this example we will call it "apperrors" and it will live at the root of the site. Create the following columns:

  • errorMessage as a "multiple lines of text" column, plain text (no formatting).
  • stackTrace as a "multiple lines of text" column, plain text (no formatting).

2: Include stacktrace.js after your AngularJS script.

<script type="text/javascript" src="angular-1.3.15/angular.min.js"></script>
<script type="text/javascript" src="stacktrace.min.js"></script>

3: By default AngularJS catches errors and logs them verbosely to the console. That behavior is highly desirable, so we will keep it and simply add an interception point so that we can use stacktrace.js to centrally log any errors.

app.provider("$exceptionHandler", {
    $get: function( errorLogService ) {
        return( errorLogService );
    }
});

4: Even though stacktrace.js has been in the global scope since step 2, it is not 'correct' to reference a global object inside an AngularJS component. The stacktrace feature therefore needs to be wrapped in an AngularJS service that exposes the print method.

app.factory("stacktraceService", function() {
    return({
        print: printStackTrace
    });
});

5: The error logging service is a wrapper around the core error handling of AngularJS.

app.factory("errorLogService", function( $log, $window, stacktraceService ) {
    function log( exception, cause ) {
        // Keep the default behavior: log verbosely to the console.
        $log.error.apply( $log, arguments );
        try {
            var URL = $window.location.href;
            var errorMessage = JSON.stringify(exception.toString());
            var stackTrace = JSON.stringify(stacktraceService.print({ e: exception }));
            var item = {
                "__metadata": { "type": "SP.Data.apperrorsListItem" },
                "Title": URL,
                "errorMessage": errorMessage,
                "stackTrace": stackTrace
            };
            $.ajax({
                url: _spPageContextInfo.webAbsoluteUrl + "/_api/web/lists/getbytitle('apperrors')/items",
                type: "POST",
                contentType: "application/json;odata=verbose",
                data: JSON.stringify(item),
                headers: {
                    "Accept": "application/json;odata=verbose",
                    "X-RequestDigest": $("#__REQUESTDIGEST").val()
                },
                success: function (data) {
                    console.log(JSON.stringify(data, null, 4));
                },
                error: function (data) {
                    console.log(JSON.stringify(data, null, 4));
                }
            });
        } catch ( loggingError ) {
            $log.warn( "Error logging failed" );
            $log.log( loggingError );
        }
    }
    // Return the logging function.
    return( log );
});

6: That's it. If all has gone correctly, any AngularJS errors will now be written to the SharePoint list.
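To verify the wiring, you can deliberately throw from a throwaway controller and check that a new item appears in the list. The controller name below is just an example:

// Hypothetical test controller: any error thrown here should be routed
// through $exceptionHandler to errorLogService and into the "apperrors" list.
app.controller("ErrorTestController", function() {
    throw new Error("Test error: checking the logging pipeline");
});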

http://www.bennadel.com/blog/2542-logging-client-side-errors-with-angularjs-and-stacktrace-js.htm

https://angularjs.org/

http://www.stacktracejs.com/

Using PowerShell to export SharePoint list items into CSV and then delete the same items from the list

The code below exports list items, if any are found, column by column into a CSV file. It then deletes the exported items from the list.

if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

# BEGIN CONFIG

$fileNameStamp = Get-Date -UFormat "%Y_%m_%d_%H_%M"
$fileAndPath = "D:\DATA\myData-$fileNameStamp.csv"
$web = Get-SPWeb -Identity "http://10.1.1.10/"
$list = $web.Lists["MyList"]

# END CONFIG
# BEGIN: STEP 1: Export current items in the list to CSV
$listitems = $list.Items.Count;
# Break out if the list has no content. Stops the creation of empty files.
if ($listitems -lt 1) {
    break;
} else {
    # Array to hold the result PSObjects
    $ListItemCollection = @()
    # Get all list items
    $list.Items | foreach {
        $ExportItem = New-Object PSObject
        $ExportItem | Add-Member -MemberType NoteProperty -Name "ID" -Value $_["ID"]
        $ExportItem | Add-Member -MemberType NoteProperty -Name "Title" -Value $_["Title"]
        $ExportItem | Add-Member -MemberType NoteProperty -Name "Created" -Value $_["Created"]
        $ExportItem | Add-Member -MemberType NoteProperty -Name "CreatedBy" -Value $_["Created By"]
        # Add the object with its properties to the array
        $ListItemCollection += $ExportItem
    }
    # Export the result array to the CSV file
    $ListItemCollection | Export-CSV $fileAndPath -NoTypeInformation
}
# END: STEP 1: Export current items in the list to CSV

# BEGIN: STEP 2: Delete the exported items from the list
# Note: the CSV already has a header row from Export-CSV, so do not pass -Header
# here (doing so would read the header row back as a data item).
Import-CSV $fileAndPath | Foreach-Object {
    Write-Host " Looking for item #" $_.ID "...";
    # Capture the CSV row's ID first: inside the Where scriptblock, $_ refers
    # to the list item being tested, not the CSV row.
    $csvID = $_.ID
    $items = $list.Items | Where { $_["ID"] -eq $csvID }
    foreach ($item in $items)
    {
        $item.Delete();
        break;
    }
}
# END: STEP 2: Delete the exported items from the list

#BEGIN: CLEANUP
$web.Dispose();
#END: CLEANUP
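If you want this export-and-trim job to run unattended, one option is a scheduled task. A sketch, where the task name and script path are placeholders:

# Sketch: register a daily 2 AM scheduled task for the script above.
# "ExportAndTrimList" and the script path are placeholders.
schtasks /create /tn "ExportAndTrimList" /sc daily /st 02:00 /tr "powershell.exe -ExecutionPolicy Bypass -File D:\Scripts\Export-ListItems.ps1"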

Storing SharePoint list data locally in IndexedDB with PouchDB and AngularJS

PouchDB is an open-source JavaScript database inspired by Apache CouchDB that is designed to run well within the browser.

PouchDB was created to help web developers build applications that work as well offline as they do online. It enables applications to store data locally while offline, then synchronize it with CouchDB and compatible servers when the application is back online, keeping the user's data in sync no matter where they next log in.
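That synchronization is close to a one-liner in PouchDB. A sketch, assuming a reachable CouchDB endpoint (the URL below is a placeholder):

// Sketch: continuous two-way replication with a (placeholder) CouchDB server.
var db = new PouchDB('demodata');
db.sync('http://localhost:5984/demodata', {
    live: true,   // keep syncing as changes happen
    retry: true   // reconnect automatically after network failures
}).on('error', function (err) {
    console.log('Sync error', err);
});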

The below code should give you a full life cycle of:

  1. Delete a DB if it exists
  2. Create a new DB
  3. Get SharePoint list data
  4. Add returned data to the IndexedDB
  5. Log errors to the console if issues are encountered.

 

var req = indexedDB.deleteDatabase("demodata");
req.onsuccess = function() {
    console.log("Deleted database successfully");
    // Create the client side DB
    var db = new PouchDB('demodata');
    console.log("DB created");

    // Get info
    db.info().then(function (info) {
        console.log(info);
    });

    // BEGIN: Get list data
    $http({
        method: 'GET',
        url: _spPageContextInfo.webAbsoluteUrl + "/_api/web/lists/getByTitle('demolist')/items",
        cache: true,
        headers: {
            "Accept": "application/json;odata=verbose"
        }
    }).success(function(data, status, headers, config) {
        var demodataValues = data.d.results;
        var demodata = {
            "_id": "demo",
            "demodataValues": demodataValues
        };
        db.put(demodata);
        console.log("Demo injected.");
    }).error(function(data, status, headers, config) {
        console.log("SP error");
    });
    // END: Get list data
};
req.onerror = function() {
    console.log("Couldn't delete database");
};
req.onblocked = function() {
    console.log("Couldn't delete database due to the operation being blocked");
};
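To confirm the data actually landed in IndexedDB, you can read the document back by its _id. A minimal sketch (assuming the same db handle as above):

// Sketch: read the cached document back out of PouchDB.
db.get('demo').then(function (doc) {
    console.log('Cached items: ' + doc.demodataValues.length);
}).catch(function (err) {
    console.log('Read-back failed', err);
});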

VirtualBox 5 – it remains a useful tool rather than a replacement…

VirtualBox 5 is out of a somewhat short beta and its essence has not really changed. It is still a great tool, but not a Hyper-V / Boot Camp / Parallels replacement…

In brief, if you have used it before you will notice five new features, all of them good.

  1. Paravirtualization Support for Windows and Linux Guests: Significantly improves guest OS performance by leveraging built-in virtualization support on operating systems such as Oracle Linux 7 and Microsoft Windows 7 and newer.
  2. Improved CPU Utilization: Exposes a broader set of CPU instructions to the guest OS, enabling applications to make use of the latest hardware instruction sets for maximum performance.
  3. Support of USB 3.0 Devices: Guest operating systems can directly recognize USB 3.0 devices and operate at full 3.0 speeds. The guest OS can be configured to support USB 1.1, 2.0, and 3.0.
  4. Bi-Directional Drag and Drop Support for Windows: On all host platforms, Windows, Linux and Oracle Solaris guests now support “drag and drop” of content between the host and the guest. The drag and drop feature transparently allows copying or opening of files, directories, and more.
  5. Disk Image Encryption: Data can be encrypted on virtual hard disk images transparently during runtime, using the industry standard AES algorithm with up to 256 bit data encryption keys (DEK). This helps ensure data is secure and encrypted at all times, whether the VM is sitting unused on a developer’s machine or server, or actively in use.


How to MANUALLY install Redmine 3.x on Windows Server 2008 R2

Redmine is a free and open source, web-based project management and issue tracking tool. It allows users to manage multiple projects and associated subprojects. It features per project wikis and forums, time tracking, and flexible role based access control. It includes a calendar and Gantt charts to aid visual representation of projects and their deadlines. Redmine integrates with various version control systems and includes a repository browser and diff viewer. The design of Redmine is significantly influenced by Trac, a software package with some similar features. Redmine is written using the Ruby on Rails framework. It is cross-platform and cross-database and supports at least 34 languages.

There are issues with installing Redmine on Windows due to some gems. The key to a successful outcome is to match the versions. Obviously this situation will change with time…

This ‘should’ take < 30 minutes.

You can always use the Web Platform Installer process. Google/Bing/etc. will help you find it.

You WILL experience a JSON gem error around step 4.3 – just keep following the directions.

IGNORE “DL is deprecated, please use Fiddle”

Assumptions

  • You have IIS up and running
  • You already have a database instance in place (SQL Server, MariaDB, PostgreSQL, MySQL, etc.)
  • RMagick is NOT being installed (we skip it below).

Download the following:

  • Ruby 2.0.0 x32 – http://rubyinstaller.org/downloads/
  • Ruby Dev Kit for use with 2.0 x32 – http://rubyinstaller.org/downloads/
  • Redmine 3.0.4 – http://www.redmine.org/projects/redmine/wiki/Download

‘THE’ Process

  1. Install Ruby. Just double click, agree, and accept the defaults.
  2. Create the IIS site that will host Redmine.
  3. Extract the Redmine files into the IIS site folder created above. We will assume this is C:\inetpub\wwwroot\redmine\
  4. Open a Ruby command prompt as Administrator.
    1. CD into the folder C:\inetpub\wwwroot\redmine\ and run:
    2. gem install bundler
    3. bundle install --without development test rmagick
  5. Install the Dev Kit
    1. Double click and extract to a folder that makes sense. E.g. C:\RubyDevKit
    2. CD into that folder and run:
      • ruby dk.rb init
      • ruby dk.rb install
      • gem install json -v '1.8.3'
  6. Go back to C:\inetpub\wwwroot\redmine\ and run
    • bundle install --without development test rmagick
    • bundle exec rake generate_secret_token
    • rake db:migrate – IF you get a SQL error here you probably have the wrong password in your C:\inetpub\wwwroot\redmine\config\database.yml file OR your user is different (a sample database.yml follows this list).
    • set RAILS_ENV=production
    • set REDMINE_LANG=en
    • bundle exec rake redmine:load_default_data
  7. In theory you should now have a functional instance of Redmine, BUT you are not done yet. Run the following to test:
    • bundle exec rails server webrick -e production
    • OPEN a browser on the server and check http://localhost:3000
    • If you see what looks like Redmine, press CTRL+C and continue.
    • NOTE: You CAN run Redmine like this but it is not really stable enough for ‘real’ production use.
  8. IIS needs to be set up to run Ruby & Redmine. This detail will be added at a later date when I have time.
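For reference, step 6 assumes C:\inetpub\wwwroot\redmine\config\database.yml already points at your database. A minimal sketch for the production environment follows; every value below is a placeholder, and the adapter name depends on which database and gems you chose:

# Sketch of config/database.yml (all values are placeholders).
production:
  adapter: sqlserver      # or mysql2 / postgresql, matching your bundled gems
  host: localhost
  database: redmine
  username: redmine_user
  password: "CHANGE_ME"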

 

This seems useful / complementary: http://jonathanmatthey.com/project/opensource/2012/11/01/redmine-sprint-reports.html

PowerShell to shrink SharePoint auditdata table

I recently had a content database whose size was out of control because of the AuditData table, and I needed it shrunk promptly. This piece of PowerShell did the job.

NOTE: This is a bit of a "hammer, so everything looks like a nail" approach, but it may be good for your specific situation.

if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

foreach ($site in Get-SPSite -Limit ALL)
{
    Write-Host 'Deleting audit data for site: ' $site.URL
    $i = -350
    do {
        Write-Host $site.URL ' - Delete day ' $i ' : ' ([System.DateTime]::Now.ToLocalTime().AddDays($i))
        $site.Audit.DeleteEntries([System.DateTime]::Now.ToLocalTime().AddDays($i))
        $site.Audit.Update()
        $i++
    }
    while ($i -le 1)
    $site.Dispose()
}

PowerShell “fun” with text to speech

Writing code can be far from glamorous, if not downright boring, sometimes. PowerShell has its appeal, but it is, at the end of the day, still an administrative framework for task and configuration automation. Even saying that can hurt… If you are looking to spice it up a bit, the following may help.

function prompt {
    $text = 'You are great!', 'Hero!', 'What a checker you are.', 'Champ, well done!',
            'Man, you are good!', 'Guru stuff I would say.', 'You are magic!'
    $host.UI.RawUI.WindowTitle = Get-Location
    # Speak a random compliment; Out-Null keeps Speak's return value out of the prompt.
    (New-Object -ComObject Sapi.SpVoice).Speak(($text | Get-Random)) | Out-Null
    'PS> '
}

prompt

 

SharePoint PowerShell script to check if a site collection is alive and email if not.

This might help somebody. In brief, the script checks an array of URLs and, if it gets a non-200 (OK) response back, emails out an alert. It can also be used as a keep-alive script. It is best run on a server that is not connected to the farm.

$urls = @("http://webapp/sitecollection1/Default.aspx",
          "http://webapp/sitecollection2/default.aspx",
          "http://webapp/sitecollection3/default.aspx",
          "http://webapp/sitecollection4/default.aspx");

# Request all URLs in the array
foreach ($objItem in $urls) {
    Write-Host "Checking $objItem";
    $req = [System.Net.WebRequest]::Create($objItem)
    $req.UseDefaultCredentials = $true
    try {
        $res = $req.GetResponse()
    } catch [System.Net.WebException] {
        $res = $_.Exception.Response
    }
    $int = [int]$res.StatusCode
    $status = $res.StatusCode
    Write-Host "$int $status"
    if ($int -ne 200) {
        Write-Host "  Sending Email...";
        $enc = New-Object System.Text.UTF8Encoding;
        $smtp = "emailserver.domain.com";
        $to = "Recipient 1 <username@domain.com>";
        $toCC = "Recipient 2 <username@domain.com>";
        $from = "SharePoint farm <username@domain.com>";
        $ScriptName = $MyInvocation.MyCommand.Name;
        $scriptPath = Split-Path -Parent $MyInvocation.MyCommand.Definition;
        $body = "This was generated by the script $ScriptName in $scriptPath";
        $subject = "URL check failure on $objItem - '$int : $status'";
        Send-MailMessage -SmtpServer $smtp -To $to -Cc $toCC -From $from -Subject $subject -Body $body -BodyAsHtml -Encoding $enc;
        Write-Host "  Sent.";
    }
}

Using AppFabric Distributed Cache on SharePoint 2013

After encountering some issues, I thought I would post a few bullets that may help someone. They mostly cover memory and things not to do.