Storing SharePoint list data locally in IndexedDB with PouchDB and AngularJS

PouchDB is an open-source JavaScript database inspired by Apache CouchDB that is designed to run well within the browser.

PouchDB was created to help web developers build applications that work as well offline as they do online. It enables applications to store data locally while offline, then synchronize it with CouchDB and compatible servers when the application is back online, keeping the user’s data in sync no matter where they next log in.

The code below walks through the full life cycle:

  1. Delete a DB if it exists
  2. Create a new DB
  3. Get SharePoint list data
  4. Add returned data to the IndexedDB
  5. Log any errors to the console.


var req = indexedDB.deleteDatabase("demodata");
req.onsuccess = function() {
    console.log("Deleted database successfully");
    // Create the client-side DB
    var db = new PouchDB('demodata');
    console.log("DB created");

    // Log basic info about the new database
    db.info().then(function (info) {
        console.log(info);
    });

    // BEGIN: Get list data
    $http({
        method: 'GET',
        url: _spPageContextInfo.webAbsoluteUrl + "/_api/web/lists/getByTitle('demolist')/items",
        cache: true,
        headers: {
            "Accept": "application/json;odata=verbose"
        }
    }).success(function(data, status, headers, config) {
        var demodataValues = data.d.results;
        var demodata = {
            "_id": "demo",
            "demodataValues": demodataValues
        };
        db.put(demodata);
        console.log("Demo injected.");
    }).error(function(data, status, headers, config) {
        console.log("SP error");
    });
    // END: Get list data
};
req.onerror = function() {
    console.log("Couldn't delete database");
};
req.onblocked = function() {
    console.log("Couldn't delete database due to the operation being blocked");
};

How to import your IIS logs into Piwik

Piwik is pretty awesome if you’re in the market for the functionality it provides. Unfortunately, many people who work with IIS, SharePoint, and .NET are not familiar with Python and may struggle to get their IIS logs into Piwik. This should help if you are in that situation.

  1. Move your logs to a central location. A PowerShell script run as a scheduled job is perfect for this. In this example, let us assume the location is D:\LOGS.
  2. Let us also assume that Piwik is running on the local machine (localhost) on port 1000, and that the ID of the site in Piwik is 1. These values are used in the code below.
  3. Create a script called importFiles.py and paste the code below into it. The indentation is significant in Python.

import os, fnmatch

def find_files(directory, pattern):
    for root, dirs, files in os.walk(directory):
        for basename in files:
            if fnmatch.fnmatch(basename, pattern):
                filename = os.path.join(root, basename)
                yield filename

for filename in find_files(r'D:\LOGS', '*.log'):
    # print(filename)
    os.system('C:/inetpub/wwwroot/Piwik/misc/log-analytics/import_logs.py --url=http://localhost:1000/ "' + str(filename) + '" --idsite=1')
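Because the filename is interpolated into a shell command string, a path containing spaces can break the call. A slightly safer variant of the same loop (using the same assumed paths, URL, and site ID as above) builds the argument list explicitly and hands it to subprocess, which avoids shell quoting altogether:

```python
import os
import fnmatch
import subprocess

IMPORT_SCRIPT = 'C:/inetpub/wwwroot/Piwik/misc/log-analytics/import_logs.py'

def find_files(directory, pattern):
    # Recursively yield paths under directory matching the glob pattern.
    for root, dirs, files in os.walk(directory):
        for basename in files:
            if fnmatch.fnmatch(basename, pattern):
                yield os.path.join(root, basename)

def build_command(filename, url='http://localhost:1000/', idsite=1):
    # Argument-list form: no shell is involved, so spaces in paths are safe.
    return ['python', IMPORT_SCRIPT,
            '--url=' + url, filename, '--idsite=' + str(idsite)]

for filename in find_files(r'D:\LOGS', '*.log'):
    subprocess.call(build_command(filename))
```

This is functionally the same as the os.system version; it just sidesteps quoting issues.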

Notes on the above:

  • You must NOT install Python 3.x. Install 2.x. This is a compatibility issue with Piwik.
  • I use MariaDB. If you are not familiar with it, think of it simply as a fully compatible fork of MySQL.
  • You can definitely optimize this script further by reading the Piwik docs. This is about as basic as it can be and still get the job done.
  • The search is recursive, so if you move the logs into server-named subfolders (a good idea if you have a farm), it will work just fine.
  • I would strongly recommend moving the files to a "processed" folder outside of the path above once they have been imported.
  • Make sure to set index.php as your default document in the IIS site you are running Piwik from.
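The "processed" folder suggestion from the notes can be scripted the same way. Here is a minimal sketch (the D:\PROCESSED path is my own assumption, not from the setup above) of a helper you could call after each import:

```python
import os
import shutil

def archive_log(filename, processed_dir=r'D:\PROCESSED'):
    # Move an imported log out of the watched folder so it is not re-imported.
    if not os.path.isdir(processed_dir):
        os.makedirs(processed_dir)
    destination = os.path.join(processed_dir, os.path.basename(filename))
    shutil.move(filename, destination)
    return destination
```

In the import loop, call archive_log(filename) right after the import command for that file succeeds.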
If you need to start over, you first need to get the previously imported log data out of the database. To do so, drop the archive tables and delete the relevant rows, similar to the statements below. More details are in the Piwik FAQ.
DROP TABLE piwik_archive_numeric_2011_01, piwik_archive_numeric_XX;
DELETE FROM piwik_log_visit WHERE idsite = X;
DELETE FROM piwik_log_link_visit_action WHERE idsite = X;
DELETE FROM piwik_log_conversion WHERE idsite = X;
DELETE FROM piwik_log_conversion_item WHERE idsite = X;
More information and the relevant downloads are available here: 

Another easy way to perform instant and/or automated backups of MySQL

phpMyAdmin is great at what it does, but it does not excel everywhere. If you are looking to create instant or automated backups of MySQL databases, consider Sypex Dumper 2.x.

Built with PHP, it has an Ajax interface and can run database restores as well. Handily, it can avoid PHP script timeouts by pausing jobs, and being licensed under the BSD license, it is absolutely free. Supposedly it can also be integrated into third-party products, though that is something I have not tried. What impressed me most was how easy it makes working with databases from the browser.

http://sypex.net/en/
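If all you need is a scheduled dump with no UI at all, plain mysqldump wrapped in a small script also does the job. A minimal sketch in the same vein as the Piwik import script above (the host, user, and database names are placeholders, and the password is expected to come from the MySQL option file rather than the command line):

```python
import subprocess
import time

def backup_command(database, user='backup', host='localhost', out_dir='.'):
    # Build a mysqldump command that writes a timestamped .sql file.
    stamp = time.strftime('%Y%m%d-%H%M%S')
    outfile = '%s/%s-%s.sql' % (out_dir, database, stamp)
    # --result-file avoids shell redirection; the password should live in
    # the MySQL option file (~/.my.cnf or my.ini), not on the command line.
    return ['mysqldump', '--host=' + host, '--user=' + user,
            '--result-file=' + outfile, database]

# A cron job or Windows scheduled task would run this script, e.g.:
# subprocess.call(backup_command('demodb'))
```

This is not a replacement for Sypex Dumper's interface, just the bare-bones automated route.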