Category Archives: Applications

The apps we use day in and day out.

The context has expired and can no longer be used

I routinely see this error when working with SharePoint 2013 in my development environment. This problem is more frequent when I restore earlier snapshots of my SP2013 server.

SharePoint spits out this error when the local server time is out of sync. To remedy this issue, try one of the following:

  1. Update the date and time on the SharePoint server
  2. Disable the security context check for the web application, as follows:
     1. Go to Central Administration
     2. Go to the "Application Management" section
     3. Go to "Web Application General Settings"
     4. Under "Web Page Security Validation", disable the option
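If snapshot restores leave the server clock skewed, resyncing against the domain time source usually clears the error. A sketch from an elevated command prompt (assuming a domain-joined server; `dc01` is a placeholder for your domain controller):

```
REM Check the current offset against the time source, then force a resync
w32tm /stripchart /computer:dc01 /samples:3
w32tm /resync /rediscover
```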

SharePoint 2013 Managed Navigation

After much anticipation, SharePoint 2013 now offers custom navigation of sites via the Managed Metadata Term Store. SharePoint 2010 introduced managed metadata, with hierarchical terms, for tagging purposes. This same hierarchical infrastructure bodes well for site navigation, which is also hierarchical. I hear the word “taxonomy” a lot, pertaining to both tagging taxonomy and site structure, which speaks to the fact that the Managed Metadata Term Store is great for managing custom navigation.

Prior to SharePoint 2013, custom navigation typically involved some custom component to read the navigation structure from a list, an XML file, or some other hierarchical node store. The out-of-the-box offering provided very little in the way of custom navigation – just the ability to include headers and links at each site level. Its main limitation was the number of nested navigation nodes, and it adhered to the actual structure of sites and sub-sites in the collection. Although typical site navigation follows site structure, content owners should have the ability to store their content (sites and pages) in any structure while the navigation looks completely different. Content storage and structure suit how content owners maintain content; navigation is about how end users access content, and the two may look very different. Managed navigation finally allows content owners to create a navigation structure independent of their content model.

To demonstrate Managed Navigation, I shall first create a hierarchy in the default term store, for our application:

  1. Open Central Administration
  2. Click the link for managed service applications
  3. Scroll down the list and click the Managed Metadata Service
  4. Click the Manage icon in the ribbon to open the Term Store editor
  5. Ensure you have permissions to edit the term store – add your username to the term store administrators field
  6. Managed navigation binds to term sets, so I created a new group for navigation and then a term set for site navigation

SharePoint creates a default term set in the Managed Metadata Term Store for your site collection; I created my own for demonstration purposes.

  1. Create a term set structure
  2. Click the Site Navigation term set
  3. In the right panel, click the tab for Intended Use
  4. Check the checkbox to enable the term set for navigation – you can also use the term set for tagging if you wish by toggling the other checkbox option
  5. Click the save button to save the changes
  6. Click the tab for term driven pages – this page shows the settings for friendly URLs for the term set (more on friendly URLs shortly)
  7. Now we are ready to configure our publishing site to use the managed navigation
  8. Open your publishing site (assuming the hosting web application uses the managed metadata service you just configured)
  9. Click the gear icon, then select the menu item for Site Settings
  10. Click the link for Navigation, under the Look and Feel header
  11. SharePoint displays the navigation settings page
  12. Choose the radio button option for Managed Navigation for either or both the left and global (top) navigation
  13. Scroll to the bottom of the page to the managed navigation term set section
  14. Select the term set to use for managed navigation
  15. The checkboxes below the term set browser tell SharePoint whether to populate your term set with nodes when you create new pages in the site, and whether to generate friendly URLs for new pages
  16. Click the OK button at the bottom of the page, to save your changes
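The steps above can also be applied in code. A minimal sketch using the publishing object model (this assumes a reference to Microsoft.SharePoint.Publishing, and that `termStoreId` and `termSetId` hold the GUIDs of your term store and Site Navigation term set – both are placeholders here):

```csharp
using (SPSite site = new SPSite("http://sp2013"))
using (SPWeb web = site.OpenWeb())
{
    // Bind both the global (top) and current (left) navigation to the term set.
    WebNavigationSettings settings = new WebNavigationSettings(web);
    settings.GlobalNavigation.Source = StandardNavigationSource.TaxonomyProvider;
    settings.GlobalNavigation.TermStoreId = termStoreId;
    settings.GlobalNavigation.TermSetId = termSetId;
    settings.CurrentNavigation.Source = StandardNavigationSource.TaxonomyProvider;
    settings.CurrentNavigation.TermStoreId = termStoreId;
    settings.CurrentNavigation.TermSetId = termSetId;
    settings.Update();
}
```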


Configuring RBS for SP2010

Following on from my previous post about list scaling and performance, this post details the configuration of Remote Blob Storage (RBS) for SharePoint 2010 and SQL Server 2008 R2.

First, download the RBS provider for SQL Server 2008 R2 (don’t install it yet).

Configure FILESTREAM for the SQL Server service using SQL Server Configuration Manager.


Execute the following SQL to enable FILESTREAM access:

EXEC sp_configure filestream_access_level, 2
RECONFIGURE


Execute the following SQL to set up a master encryption key and blob store file group:

use [WSS_Content]

if not exists (select * from sys.symmetric_keys where name = N'##MS_DatabaseMasterKey##')
    create master key encryption by password = N'Admin Key Password !2#4'

if not exists (select groupname from sysfilegroups where groupname = N'RBSFilestreamProvider')
    alter database [WSS_Content] add filegroup RBSFilestreamProvider contains filestream

alter database [WSS_Content] add file (name = RBSFilestreamFile, filename = 'c:\Blobstore')
    to filegroup RBSFilestreamProvider

Install the RBS provider with the following command (change DBINSTANCE to your SQL server instance):

msiexec /qn /lvx* rbs_install_log.txt /i RBS_X64.msi TRUSTSERVERCERTIFICATE=true FILEGROUP=PRIMARY DBNAME="WSS_Content" DBINSTANCE="SP2010" FILESTREAMFILEGROUP=RBSFilestreamProvider FILESTREAMSTORENAME=FilestreamProvider_1

If installing RBS on production servers, be sure to run the following command on all WFEs (again, change the DBINSTANCE):

msiexec /qn /lvx* rbs_install_log.txt /i RBS_X64.msi DBNAME="WSS_Content" DBINSTANCE="SP2010" ADDLOCAL="Client,Docs,Maintainer,ServerScript,FilestreamClient,FilestreamServer"

Run the following PowerShell script from the SharePoint 2010 Management Shell to install and enable RBS on the content database:

$cdb = Get-SPContentDatabase -WebApplication http://sp2010

$rbss = $cdb.RemoteBlobStorageSettings

$rbss.Installed()
$rbss.Enable()
$rbss.SetActiveProviderName($rbss.GetProviderNames()[0])
$rbss.ActiveProviderName
Now create a document library in SharePoint and upload an image to it.  Next, visit the c:\Blobstore directory and look for the GUID subfolder with a recent date.  Keep drilling down until you find a file with a GUID name.  Drop this into IE and you should see that it is the same file you uploaded to your document library.


From the SharePoint 2010 book I’m reviewing

List Scaling and Performance in SP2010

It is a well-known fact that MOSS 2007 raised some strong opinions on the subject of list scalability and performance.  Many developers operated under the misconception that SharePoint lists only allowed 2,000 list items before croaking with bad performance.  Nothing could be further from the truth.

The following article talks about this issue of large lists in great depth and highlights the point that SharePoint can actually handle many more than 2,000 items in any one list.  However, querying this data is affected by the item count, and SharePoint architects should design their data access and presentation of data accordingly.

Microsoft has added a number of new enhancements to lists in SharePoint 2010 to handle larger capacity and querying of this data; the following is a short summary of the enhancements:

List Column Indexing

SP2010 now allows list designers to create up to 20 indices (some spanning multiple columns) on any one list.  These indices allow for faster querying of data when the list size exceeds typical limits.


Under List Settings, in the Columns section, users now see a link for Indexed Columns.

The following is a list of column types usable as indexed columns:

  • Single line of text
  • Choice field, but not multi-choice
  • Number
  • Currency
  • Date/Time
  • Lookup, but not a multi-value lookup
  • Person or group, but not multi-value
  • Title, except in a document library


List Throttling

SharePoint administrators now have the capability to better control list queries, so that developers (or general users) cannot issue queries against large lists that might bring down the server.  Specifically:

Administrators may define some limits at the web application level:

– Configure the number of items fetched for queries
– Receive warnings when thresholds are exceeded
– Configure time windows in which expensive queries may operate
– Limit the size of list items (defaults to 8 KB)
– Limit the number of columns in a join (defaults to 6)

The following code displays the list throttling limits for a web application:

using (SPSite site = new SPSite(siteUrl))
{
    SPWebApplication webApp = site.WebApplication;
    Console.WriteLine(webApp.MaxItemsPerThrottledOperation);
    Console.WriteLine(webApp.MaxItemsPerThrottledOperationOverride);
    Console.WriteLine(webApp.MaxItemsPerThrottledOperationWarningLevel);
}

To toggle throttling on an individual list, set the EnableThrottling property on the SPList instance:

list.EnableThrottling = true;

MaxItemsPerThrottledOperationWarningLevel – if a list exceeds the number of items specified in this threshold, a warning is displayed on the list settings page.

MaxItemsPerThrottledOperation – this indicates the number of list items returned to non-administrators.  Administrators can query up to the threshold in MaxItemsPerThrottledOperationOverride but will receive a warning on the list settings page.

If administrators wish for users to execute expensive operations in a specific window of time, they can do so by using the SetDailyUnthrottledPrivilegedOperationWindow method on the WebApplication object.


RBS Storage (Remote Blob Storage)

In some cases the use of document libraries to store large files no longer scales and causes content databases to become unmanageable.  A public web site, hosted in SharePoint, that provides users with rich media content – web files and large images – is one such example of the large blob storage issue.

In MOSS, hosting content in the database provided certain benefits, such as a single storage location, versioning, and access to files via the object model, whereas file-based storage provided better scalability at the cost of content orphaned from the object model.  SP2010 solves this issue with RBS.  Site architects can now store large files (blobs) in locations other than the SharePoint content database without relinquishing access via the OM.  From a developer standpoint, the data is accessed as if it were in the content database, but the content is actually in a remote location.

To enable RBS, your farm will need to use at least SQL Server 2008 R2.

Marking blobs as external at the content database level enables SharePoint to store the meta-data associated with blobs in the database while storing the actual blob content outside the content database.  Because RBS is handled at the database level, SharePoint is unaware that data is not stored in the content database but in another location.

In time, vendors will bring RBS providers for SP2010 to the table, but in the meantime Microsoft has provided RBS for SQL Server as an extra download.

See my next blog post on configuring RBS.


From the SharePoint 2010 book I’m reviewing

SP2010 Developer Dashboard

The SP2010 Developer Dashboard allows developers to review object model calls, database queries, web part events – and the timings for these various happenings.

The following code enables the dashboard:

SPPerformanceMonitor SPPerfMon;
SPPerfMon = SPFarm.Local.PerformanceMonitor;
SPPerfMon.DeveloperDashboardLevel = SPPerformanceMonitoringLevel.On;

The following code turns it off again:

SPPerformanceMonitor SPPerfMon;
SPPerfMon = SPFarm.Local.PerformanceMonitor;
SPPerfMon.DeveloperDashboardLevel = SPPerformanceMonitoringLevel.Off;
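Note that in later SP2010 builds the dashboard is controlled through SPDeveloperDashboardSettings rather than the performance monitor. If the code above does not compile against your build, a sketch of the equivalent (from the Microsoft.SharePoint.Administration namespace):

```csharp
SPDeveloperDashboardSettings settings =
    SPWebService.ContentService.DeveloperDashboardSettings;
settings.DisplayLevel = SPDeveloperDashboardLevel.OnDemand; // On, Off, or OnDemand
settings.Update();
```

OnDemand adds an icon to each page that lets you toggle the dashboard per request.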

Development Setup for SP2010

Some important points to remember when developing against SP2010:

  • Make sure your Visual Studio project is set up for .NET 3.5, not .NET 4.0
  • Run Visual Studio as an Administrator to load debugging symbols
  • Make sure your project is set to compile for Any CPU or x64 (not the default x86), otherwise your code will throw a FileNotFoundException

From the SharePoint 2010 book I’m reviewing

Provision 2010 Farm without the Mess


Pre-Search Facets in MOSS 2007

Search facets offer a powerful entry point into data exploration, especially in cases where data is categorized or tagged effectively.  With modern search and content retrieval mechanisms, such as the search engine in MOSS (and new FAST search engine for SharePoint) the traditional method of browsing for content in SharePoint using static navigation hierarchies can now take on a whole new approach.

The new thinking behind content storage is to put all artifacts in one large bucket, tag the artifacts, and then leverage search and facets to surface relevant content.  Think about how Google revolutionized mail by discarding the folder structure approach in favor of a search and label paradigm.

Anyone who has played with faceted searching in SharePoint probably knows that, aside from commercial tools like BA Insight, the only real free option is the Codeplex Faceted Search additions.


The Codeplex offering assumes “post-search” faceting, in that the web parts determine the relevant facet headings and counts based on the currently executed result set.  This approach works well for filtering search results and allowing users to drill down with restricted queries, but what about pre-facet browsing, similar to the functionality on sites like Best Buy?

Here is the problem – to provide the user with a dynamic tree view of hierarchical data based on facet categorization, the hierarchy generation method needs to know about all potential facet values ahead of time.  Take the following example:

An organization tags all their documents with a document type and department.  Let’s assume we wanted to provide a dynamic list of departments, which the user could choose, and then a list of document types available for the selected department.  After selecting the document type we’d like the user to see all documents of the selected type that sourced from the selected department.

Aside from issuing a general search and then filtering the result set by department and document type, the Codeplex faceted search web parts do not appear to offer a mechanism to provide dynamic table-of-contents-like behavior.

So I got to thinking – search facets in MOSS are no more than managed properties that exist in the search index.  Surely the object model must offer a way to query distinct values of a given managed property?  It turns out that you can query the search API for this information, and with a little code magic you can obtain the desired results:

using System;
using System.Collections.Generic;
using System.Data;
using Microsoft.Office.Server.Search.Query;
using Microsoft.SharePoint;

namespace SearchFacets
{
    /// <summary>
    /// Faceted search querying.
    /// </summary>
    class Program
    {
        static readonly string SRCURL = "http://server/";

        /// <summary>
        /// Entry point.
        /// </summary>
        /// <param name="args"></param>
        static void Main(string[] args)
        {
            using (SPSite srcSite = new SPSite(SRCURL))
            {
                string query = "SELECT DocumentType FROM Scope() WHERE \"SCOPE\" = 'My Scope Documents'";
                string[] values = GetDistinctSearchResults(srcSite, query, 1000);
                foreach (string s in values)
                    Console.WriteLine(s);
            }
        }

        /// <summary>
        /// Get some search results using full text.
        /// </summary>
        /// <param name="context">Site.</param>
        /// <param name="searchQuery">Query.</param>
        /// <param name="searchLimit">Limit results.</param>
        /// <returns>Results.</returns>
        static string[] GetDistinctSearchResults(SPSite context, string searchQuery, int searchLimit)
        {
            using (FullTextSqlQuery fullTextQuery = new FullTextSqlQuery(context))
            {
                fullTextQuery.ResultTypes = ResultType.RelevantResults;
                fullTextQuery.QueryText = searchQuery;
                fullTextQuery.KeywordInclusion = KeywordInclusion.AnyKeyword;
                fullTextQuery.EnableStemming = false;
                fullTextQuery.TrimDuplicates = false;
                fullTextQuery.RowLimit = searchLimit;

                ResultTableCollection resultsCollection = fullTextQuery.Execute();
                ResultTable resultsTable = resultsCollection[ResultType.RelevantResults];
                return ReturnDistinct(resultsTable);
            }
        }

        /// <summary>
        /// Return a distinct list of values with occurrence counts.
        /// </summary>
        /// <param name="rtWins">Result set.</param>
        /// <returns>Distinct values.</returns>
        static string[] ReturnDistinct(ResultTable rtWins)
        {
            // Load the search results into a DataTable so we can iterate the rows.
            DataTable dtWins = new DataTable("dtWINS");
            dtWins.Load(rtWins, LoadOption.OverwriteChanges);

            // Count occurrences of each distinct value.
            Dictionary<string, int> pairs = new Dictionary<string, int>();
            foreach (DataRow drWin in dtWins.Rows)
            {
                string fieldName = drWin[0].ToString();
                if (pairs.ContainsKey(fieldName))
                    pairs[fieldName]++;
                else
                    pairs.Add(fieldName, 1);
            }

            // Format as "value (count)".
            List<string> lstWins = new List<string>();
            foreach (KeyValuePair<string, int> pair in pairs)
                lstWins.Add(String.Format("{0} ({1})", pair.Key, pair.Value));
            return lstWins.ToArray();
        }
    }
}

You might be thinking “Hey, you’re just executing a search”, and you’d be right.  Since the facet values (managed property values map to crawled properties) live in the search indexes we have no choice but to perform a search to get at these values.

The key in the above code is to limit the search results returned (1,000 in the above case) and take advantage of relevancy.  In all likelihood, search results beyond the first 1,000 hits will not produce facet values that map to many results of value to the end user.

Clearly, the above code is just a starting point with potential for many improvements, such as caching and making use of parent–child relationships, but you get the idea…

Site Collection Restore Error

If you ever run into the following error when performing a site collection restore, via STSADM, and you know space is not an issue, try the steps below before diving deep into troubleshooting mode:

The site collection could not be restored. If this problem persists, please make sure the content databases are available and have sufficient free space.

  • Stop and start the SharePoint Timer Service
  • Restore to a new content database
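For reference, a typical STSADM restore looks like the following (the URL and backup path are illustrative placeholders):

```
stsadm -o restore -url http://sp2010/sites/teamsite -filename c:\backups\teamsite.bak -overwrite
```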