Maemo: An Overview

We’re under a deluge of iPhone OS and Android updates these days, but there are plenty of other interesting, if not equal, options out there. Maemo from Nokia may be one, especially given what David Rivas, Nokia’s vice president for devices R&D, said when asked about operator customization: “Very clearly Apple, Android are a whole lot less about providing customization to the operators and a whole lot more about providing a really cool, compelling value proposition to the end-consumer. We have an opportunity that we are going to take advantage of, with Maemo platform to play the game a little bit more along those lines than with Symbian lines.”

If you’re not familiar with Maemo, a good place to start is here: http://maemo.org/intro/

Maemo is an operating system for the Internet Tablet line of handheld computers. It was originally named “Internet Tablet OS”.

It is similar to many handheld operating systems, and features a “Home” screen—the central point from which all applications and settings are accessed. The Home Screen is divided into areas for launching applications, a menu bar, and a large customisable area that can display information such as an RSS reader, Internet radio player, and Google search box. Based on Debian GNU/Linux, it draws much of its GUI, frameworks, and libraries from the GNOME project. It also uses the Matchbox window manager, and the GTK-based Hildon as its GUI and application framework. All pretty sweet stuff if you’re into it…

Release history

| Version     | Codename  | Build identifier | Release date   | Notes                                    |
|-------------|-----------|------------------|----------------|------------------------------------------|
| OS2005 1.1  |           | 2.2005.45-1      | November 2005  |                                          |
|             |           | 3.2005.51-13     | December 2005  |                                          |
|             |           | 5.2006.13-7      | April 2006     |                                          |
| OS2006 2.0  | Mistral   | 0.2006.22-21     | May 2006       | Beta release                             |
|             |           | 1.2006.26-8      | May 2006       |                                          |
| OS2006 2.1  | Scirocco  | 2.2006.39-14     | November 2006  |                                          |
| OS2006 2.2  | Gregale   | 3.2006.49-2      | January 2007   | Final Nokia-supported OS for 770         |
| OS2007 3.0  | Bora      | 2.2006.51-6      | January 2007   |                                          |
| OS2007 3.1  |           | 3.2007.10-7      | March 2007     |                                          |
| OS2007 3.2  |           | 4.2007.26-8      | July 2007      |                                          |
|             |           | 4.2007.38-2      | October 2007   | SDHC corruption fix                      |
| OS2008 4.0  | Chinook   | 1.2007.42-18     | November 2007  | (N810 only)                              |
|             |           | 1.2007.42-19     | November 2007  | Kernel upgrade only (N810 only)          |
|             |           | 1.2007.44-4      | November 2007  | Beta release (N800 only)                 |
|             |           | 2.2007.50-2      | November 2007  |                                          |
|             |           | 2.2007.51-3      | January 2008   | NOLO upgrade only                        |
| OS2008 4.1  | Diablo    | 4.2008.23-14     | June 2008      | Adds SSU support                         |
|             |           | 4.2008.30-2      | August 2008    | First SSU update                         |
|             |           | 4.2008.36-5      | September 2008 |                                          |
|             |           | 5.2008.43-7     | December 2008  |                                          |
| Maemo 5 5.0 | Fremantle |                  |                | Bundled community-supported Qt libraries |
| ?           | Harmattan |                  |                | Bundled officially supported Qt libraries |

Making links in SharePoint list URL fields open in a new window

Open the schema file for the links list feature:

C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\TEMPLATE\FEATURES\LinksList\Links\schema.xml
There should be two instances of the following:

<Column Name="URL" HTMLEncode="TRUE" /><HTML><![CDATA[">]]></HTML>

Replace them both with this:

<Column Name="URL" HTMLEncode="TRUE" /><HTML><![CDATA[" target="_blank">]]></HTML>

A Thought on Monte Carlo Simulation Using Parallel Asynchronous Web Services with .NET and SharePoint

Monte Carlo Simulation is a technique for estimating the likely range of outcomes of a complex process: you simulate the actual process a large number of times, each run with randomly selected inputs drawn from distributions that are true to the process model. (In fact, the more runs you do, the better your data.) The Monte Carlo method is best applied when a deterministic solution would be too computationally intensive, or when no such solution exists at all.
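
To make the idea concrete, here is a minimal sketch in Python (a classic textbook example, not anything from a production simulation): estimating π by throwing random points at the unit square.

```python
import random

def estimate_pi(n_samples, seed=0):
    """Estimate pi by sampling random points in the unit square and
    counting how many land inside the quarter circle of radius 1."""
    rng = random.Random(seed)  # fixed seed so runs are repeatable
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The quarter circle has area pi/4 and the square has area 1,
    # so pi is roughly 4 times the fraction of points that land inside.
    return 4.0 * inside / n_samples

print(estimate_pi(1_000))      # rough
print(estimate_pi(1_000_000))  # much closer to 3.14159
```

More samples narrow the estimate: the error shrinks roughly with the square root of the sample count, which is exactly why you end up wanting millions or billions of points.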

Monte Carlo Simulation is used in/with

  • Physical sciences
  • Design and visuals
  • Finance and business
  • Telecommunications
  • Games

Monte Carlo Simulation is not a “what if” process. What-ifs require single-point estimates and use deterministic modeling: basically, you are using best case, worst case, and so on. With Monte Carlo you consume large random samplings, sourced from probability distribution functions, to produce a large set of outputs, which in turn lets you state a narrower range of likely outcomes with greater confidence. In other words, you are not weighting each scenario equally.
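
As a sketch of the difference (the task names and numbers here are made up purely for illustration), compare a single-point what-if with sampling each task’s duration from a triangular distribution:

```python
import random

# Hypothetical project with three tasks, each estimated as
# (optimistic, most likely, pessimistic) days. A "what if" would pick
# one column; Monte Carlo samples the whole distribution.
tasks = {
    "design": (3, 5, 10),
    "build": (10, 15, 30),
    "test": (4, 6, 12),
}

def simulate_total(rng):
    # random.triangular(low, high, mode) draws one duration per task
    return sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks.values())

rng = random.Random(42)
samples = sorted(simulate_total(rng) for _ in range(10_000))
p10, p50, p90 = (samples[int(len(samples) * p)] for p in (0.10, 0.50, 0.90))
print(f"10th/50th/90th percentile totals: {p10:.1f} / {p50:.1f} / {p90:.1f} days")
```

Instead of three single numbers (best, likely, worst) you get a whole distribution, and can make statements like “we finish within N days 90% of the time.”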

Why is this pertinent? Well, stay with me on this one: Markov chains are extremely useful for generating sequences of random numbers that accurately reflect rather complicated desired probability distributions, via what are called Markov chain Monte Carlo (MCMC) methods, tools for generating simulations from a probability distribution…

The Google PageRank of a webpage is defined by a Markov chain.
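
As an illustration (a toy three-page web, not Google’s actual implementation), PageRank can be computed by iterating the Markov chain’s transition step until the rank vector settles:

```python
def pagerank(links, damping=0.85, iterations=100):
    """Power-iterate the PageRank Markov chain: with probability
    `damping` follow a random outgoing link, otherwise jump to a
    random page. Assumes every page has at least one outgoing link."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outgoing in links.items():
            share = damping * rank[p] / len(outgoing)
            for q in outgoing:
                new[q] += share
        rank = new
    return rank

# Toy web: A links to B and C, B links to C, C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
print(ranks)
```

The stationary distribution of that chain is the PageRank vector; here C ends up ranked highest because both A and B point at it.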

And the penny drops…

Now, back to the point.

Depending on the degree of accuracy ultimately required, millions or billions of points may need to be tried. Distributing billions of point calculations across multiple servers running Monte Carlo Simulations via web services would parallelize the process and generate results VERY quickly. Good in concept, but how to do it?

As defined by the W3C a web service is “a software system designed to support interoperable machine-to-machine interaction over a network.” Running web services on IIS has advantages not limited to:

  • You can grow your “cluster” by just deploying the web service to new nodes.
  • Each web service call is handled on its own thread by IIS, which has obvious and positive performance implications.
  • Web services provide a relatively simple and straightforward method of distributing parallel problems across multiple compute platforms.
  • Because web services are written like traditional functions, they can be parallelized without hand-coding a multi-threaded application, custom-writing a message passing interface, or using other high-performance computing management software.
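
Here is a rough sketch of the fan-out/fan-in pattern, with a local thread pool standing in for the asynchronous web service calls (in a real deployment each batch would be a service call to a different IIS node; all names here are made up):

```python
from concurrent.futures import ThreadPoolExecutor
import random

def monte_carlo_batch(args):
    """Stand-in for one web service call: count how many of `n` random
    points land inside the quarter circle, seeded per 'node' so each
    worker gets a different random stream."""
    seed, n = args
    rng = random.Random(seed)
    return sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

# Fan out 8 independent batches, then reduce the partial results.
batches = [(seed, 100_000) for seed in range(8)]
with ThreadPoolExecutor(max_workers=8) as pool:
    total_hits = sum(pool.map(monte_carlo_batch, batches))
pi_estimate = 4.0 * total_hits / (8 * 100_000)
print(pi_estimate)
```

The key property is the one noted above: the batches have no dependency on each other, so adding nodes scales the work almost linearly.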

Needless to say, unless your problem decomposes into parallel computations with no dependencies between them, this is going to become very difficult, or rather “challenging” 🙂, very, very quickly.

So how could SharePoint fit in? SharePoint is perfect for acting as a landing point for your data. In and out. Companies benefit by building intelligence into their document libraries and lists with workflows. With workflow, SharePoint can act as a central hub for the data, sending it out to a queue which distributes to nodes on the network. Upon return, the data could be used to populate lists, document libraries, notify people/groups, and more. Search, BDC, Security, and all the other features in SharePoint make this concept a compelling one.

A random tidbit on non-random data

I recently was talking with somebody who felt that TrueCrypt hidden volumes were the bee’s knees. The scenario they used, and which I myself have read ‘musings’ about, involved a laptop carrying sensitive corporate data being seized by customs. The laptop drive gets “reviewed”, the secret container is not seen, and the laptop passes as normal and uninteresting. Big deal. The bigger deal is if you have 007-style data and that guy in the uniform is pretty certain you have it as well. My colleague’s version of the story ends with an almost Hollywood-style exhalation of breath and a cinematic zoom out to the hero walking out the door. That’s probably not how it would pan out…

TrueCrypt volumes, which are essentially files, have certain characteristics that allow programs such as TCHunt to detect them with a high *probability*. The most significant, in mathematical terms, is that their file size modulo 512 is 0. Now, it is certainly true that TrueCrypt volumes do not contain known file headers and that their content is indistinguishable from random, so it is difficult to definitively prove that certain files are TrueCrypt volumes. However, their very presence can provide reasonable suspicion that they contain encrypted data.

The actual math behind this is interesting. TrueCrypt volume files have file sizes that are evenly divisible by 512, and their content passes chi-square randomness tests. A chi-square test is any statistical hypothesis test in which the sampling distribution of the test statistic is a chi-square distribution* when the null hypothesis is true, or any in which this is asymptotically true, meaning that the sampling distribution (if the null hypothesis is true) can be made to approximate a chi-square distribution as closely as desired by making the sample size large enough.
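
Here is a hedged sketch of that test in Python (my own reconstruction of the heuristic, not TCHunt’s actual code; the file names and the chi-square threshold are illustrative):

```python
import os
from collections import Counter

def looks_encrypted(path, threshold=400.0):
    """Heuristic: size evenly divisible by 512, and byte frequencies
    consistent with uniform randomness under a chi-square test."""
    size = os.path.getsize(path)
    if size == 0 or size % 512 != 0:
        return False
    with open(path, "rb") as f:
        data = f.read(1024 * 1024)  # sample up to the first 1 MB
    expected = len(data) / 256.0
    counts = Counter(data)
    # Chi-square statistic over the 256 possible byte values; with 255
    # degrees of freedom, truly random bytes score around 255.
    chi2 = sum((counts.get(b, 0) - expected) ** 2 / expected
               for b in range(256))
    return chi2 < threshold

# Demo with two synthetic files sized to a multiple of 512.
with open("random.bin", "wb") as f:
    f.write(os.urandom(512 * 1024))
with open("text.bin", "wb") as f:
    f.write(b"A" * (512 * 1024))
print(looks_encrypted("random.bin"))  # almost certainly True
print(looks_encrypted("text.bin"))    # False: wildly non-uniform bytes
```

This is also why the result is only a *probability*: plenty of innocent files, such as headerless compressed data or other encrypted containers, would trip the same heuristic.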

So what does this all mean? Really, nothing for us normal people. For those for whom I have built custom STSADM containers for securing backups and exports, your data is still secure and will stay that way indefinitely. For those running across the border: a forensic analysis will reveal the presence of encrypted data, TrueCrypt volumes or otherwise, but not much more. Sometimes that’s enough to start asking questions or poking further. With the forensic tools, not the dentistry kit.

* A skewed distribution whose shape depends on the number of degrees of freedom. As the number of degrees of freedom increases, the distribution becomes more symmetrical.

http://www.truecrypt.org/
http://16systems.com/TCHunt/

Microsoft Windows Azure July 2009 CTP new features

This download extends Visual Studio to enable the creation, building, debugging, running and packaging of scalable Web applications and services on Windows Azure. A hot topic to say the least.

In case you are not sure what it is: “Windows Azure is the cloud operating system that serves as the development, run-time, and control environment for the Azure Services Platform.”

And you can get the whole marketing blurb here: http://www.microsoft.com/azure/default.mspx

If you are just getting started with the CTP go here Working with Multiple Web and Worker Roles and here Associating an ASP.NET Web Application (including MVC) as a Web Role.

Download it from here: Windows Azure Tools for Microsoft Visual Studio July 2009 CTP

Copy of text from the link page below
=====================================================

Overview


Windows Azure Tools for Microsoft Visual Studio extend Visual Studio 2008 and Visual Studio 2010 Beta 1 to enable the creation, building, debugging, running and packaging of scalable web applications and services on Windows Azure.

Please note that this is a CTP release and should not be used on production systems. Please see the EULA for more details.

New for the July 2009 CTP:

  • Support for developing and deploying services containing multiple web and worker roles. A service may contain zero or more web roles and zero or more worker roles with a minimum of one role of either type.
  • New project creation dialog that supports creating Cloud Services with multiple web and worker roles.
  • Ability to associate any ASP.NET Web Application project in a Cloud Service solution as a Web Role
  • Support for building Cloud Services from TFS Build
  • Enhanced robustness and stability

Windows Azure Tools for Microsoft Visual Studio includes:

  • C# and VB Project Templates for creating a Cloud Service solution
  • Tools to change the Service Role configuration
  • Integrated local development via the Development Fabric and Development Storage services
  • Debugging Cloud Service Roles running in the Development Fabric
  • Building and packaging of Cloud Service Packages
  • Browsing to the Azure Services Developer Portal
  • SSL Certificate selection

System Requirements

  • Supported Operating Systems: Windows 7; Windows Server 2008; Windows Vista

Getting ready for SharePoint Server 2010?

The requirements are known, so you have an easy starting point: ensure your hardware is 64-bit, as:

  1. SharePoint Server 2010 will be 64-bit only.
  2. SharePoint Server 2010 will require 64-bit Windows Server 2008 or 64-bit Windows Server 2008 R2.
  3. SharePoint Server 2010 will require 64-bit SQL Server 2008 or 64-bit SQL Server 2005.

If it is not, you will at some point need to move it. A great place to start the process of learning is on Technet, Migrate an existing server farm to a 64-bit environment (Office SharePoint Server 2007)

Things can get complicated after that point. Do you have an existing environment?

If yes

  • Deploy Service Pack 2 and take a good look at the SharePoint 2010 Upgrade Checker that is shipped as part of the update. The Upgrade Checker will scan your SharePoint Server 2007 deployment for many issues that could affect a future upgrade to SharePoint 2010.

If no

  • Get to know Windows Server 2008 with SharePoint 2007; this post is a great starting point.

Nmap 5 released

Network security starts with scanning: you need to know what you have so that you can identify your vulnerable points and manage the associated risk. Nmap excels at helping you enumerate your network and identify what is running. Nmap is also a key tool in the fight against Conficker and its ilk, and can be used to detect infected nodes on a network.

With the release of Nmap 5, billed as the most important release since 1997, there is a noticeable speed advantage with faster scans. Aside from the speed improvements, there are new tools such as Ncat and the Nmap Scripting Engine (NSE) that make Nmap 5 a must-have.

  • “The new Ncat tool aims to be your Swiss Army Knife for data transfer, redirection, and debugging,” the Nmap 5.0 release announcement states.
  • NSE is all about automating network scanning tasks with scripts. “Those scripts are then executed in parallel with the speed and efficiency you expect from Nmap. All existing scripts have been improved, and 32 new ones added. New scripts include a whole bunch of MSRPC/NetBIOS attacks, queries, and vulnerability probes; open proxy detection; whois and AS number lookup queries; brute force attack scripts against the SNMP and POP3 protocols; and many more.”

Other “stuff” in this version…

  • Ncat (allows data transfer, redirection, and debugging) – remember Hobbit’s nc?
  • Ndiff scan comparison
  • Better performance
  • Improved Zenmap GUI (including a really neat feature to visually map the network you have scanned)
  • Improved Nmap Scripting Engine (NSE): existing scripts reviewed and 32 new scripts added

A useful, if not must-have, tool. It applies not only to security but also to simple things such as trying to find that pesky administrative interface to a WSS or MOSS environment when you cannot get access to the desktop… The more you have and know, the better your options, as they say.

http://nmap.org/5/
http://nmap.org/5/#5changes

PHP and IIS7: Easier than ever

The FASTEST and EASIEST way to install PHP on IIS is with Microsoft’s Web Platform Installer (Web PI). It completely automates setting up IIS, FastCGI, and the latest version of PHP from the php.net site. If you don’t have Web PI v2 installed, you will be prompted to install it. Once the tool launches, navigate to the “Web Platform” tab and select “PHP” under the “Frameworks and Runtimes” customize link.

Don’t forget to test it out with a simple phpinfo() page.

Update on ntfs-3g

Back in December I posted about setting up direct read and write access to an NTFS drive from 10.5. It all seemed to be working okay until last week, when I had to move a couple of VHD files, close to 500 GB, from a Mac running 10.5 across the wire to a Windows-based NAS. Good grief is all I can say about the sustained performance. It took days to complete; more like a week, to be honest… Why, I do not yet know, but there is definitely something “up” with either the Mac or the driver, as the NAS is fine.

ZFS project

I recently began a storage project at home. Basically I intend to build a central NAS based on FreeNAS formatted with ZFS.

If you’re not aware of it, and shame on you if you indeed are not, Sun released the Zettabyte File System under the Common Development and Distribution License (CDDL) in 2004 as a means to bring advanced features like filesystem/volume management integration, snapshots, and automated repair to its storage platforms. Since then, it has been fully integrated into OpenSolaris and Solaris 10, FreeBSD 7, and others. (Though I would steer clear of anything FUSE related for now…)

The challenge that I have been facing is how to get performance levels that are supposedly possible out of it. I have done the math on my hardware and know my goals. Getting them should be “fun.”

So far these links have been helpful.

http://wiki.freebsd.org/ZFS
http://www.freebsdnews.net/2009/01/21/setting-freebsd-zfs-only-system/
http://blogs.freebsdish.org/lulf/2008/12/16/setting-up-a-zfs-only-system/