Category Archives: Applications

The apps we use day in and day out.

Yammer Integrated with Office 365

Yammer has become a popular social network for the workplace. It provides a discrete, private network for organizations that want social networking without giving employees free rein to network with individuals outside the organization, as they would on Twitter or Facebook.
Many organizations have moved their SharePoint farms to Office 365 – SharePoint Online. The cloud provides an attractive alternative to self-hosting expensive SharePoint infrastructure on premises. The latest wave of SharePoint Online – Wave 15 – includes the Newsfeed and social networking capabilities, consistent with on-premises SharePoint 2013.
The baked-in social capabilities of SharePoint 2013/Wave 15 are pretty awesome, and with the proliferation of the SharePoint Newsfeed app for Windows Phone, Android, and iOS, SharePoint social networking is becoming as ubiquitous as Facebook and Twitter in the mobile-sphere. However, Microsoft has not ignored those organizations that went the Yammer route and use SharePoint Online, as Yammer now integrates with SharePoint Online.
If you log into your SharePoint Online administration portal within your Office 365 tenant and click the settings link, you should see the Yammer integration option at the top of the page. Toggling the Enterprise Social Collaboration setting from Newsfeed (the default) to Yammer takes about 30 minutes to take effect, after which SharePoint Online users see the Newsfeed link replaced with a link to Yammer in the top navigation.

Presently, the integration with Yammer is very loose. The Yammer link in the top navigation redirects users to the http://www.yammer.com home page, where users can sign in. Your organization’s Yammer feed is not yet integrated into your SharePoint Online My Site, and the default Newsfeed remains in place. However, this is just the first phase of roll-out, and Microsoft promises single sign-on and Yammer feeds integrated into the SharePoint Online user interface in the coming months.

For those who cannot wait, there is a free app that renders Yammer feeds within the SharePoint Online UI, which administrators can download and install from the SharePoint App Store.

My organization – Planet Technologies – uses Yammer (we’re a social bunch), so I am quite excited for the next phase of Yammer integration, which will bring Yammer and SharePoint Online together seamlessly.

SharePoint Authentication and Session Management

What is authentication?

1. A security measure designed to protect a communications system against acceptance of a fraudulent transmission or simulation by establishing the validity of a transmission, message, or originator.
2. A means of identifying individuals and verifying their eligibility to receive specific categories of information.

Authentication is essentially the process of validating that a user is who they say they are, so that they can gain access to a system – in this context, the system is SharePoint. Authentication is not authorization, which is the process of determining whether an already-authenticated user is permitted access to certain data in the system.

SharePoint, much like any content management system, relies on user authentication to grant users access to secured content. Before SharePoint 2010, SharePoint relied on NTLM, Kerberos, or basic (forms-based) authentication protocols (a discussion of these is beyond the scope of this text). SharePoint 2010 introduced Claims-Based Authentication (CBA), also present in SharePoint 2013. CBA abstracts authentication behind a Security Token Service (STS) and identifies users by multiple attributes – claims – rather than just the traditional username and password pair.
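To make the notion of claims concrete, the following minimal sketch enumerates the claims attached to the current user. It assumes code running inside a SharePoint 2010 claims-mode web application and uses the WIF (Microsoft.IdentityModel) types that SharePoint builds on; the class name is illustrative, and SharePoint 2013 on .NET 4.5 would use the System.Security.Claims types instead.

using System;
using System.Threading;
using Microsoft.IdentityModel.Claims;

public static class ClaimsDump
{
    // Call from code running in a claims-mode SharePoint web application,
    // for example a web part or application page.
    public static void WriteCurrentUserClaims()
    {
        IClaimsIdentity identity = Thread.CurrentPrincipal.Identity as IClaimsIdentity;
        if (identity == null)
        {
            Console.WriteLine("The current identity is not a claims identity.");
            return;
        }

        // Each claim is a typed attribute about the user: name, UPN, role membership, and so on.
        foreach (Claim claim in identity.Claims)
        {
            Console.WriteLine("{0} = {1} (issuer: {2})", claim.ClaimType, claim.Value, claim.Issuer);
        }
    }
}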

A Security Token Service implements open standards. A typical STS implementation communicates over HTTPS and packages user identity information (claim data) as signed and encrypted XML – Security Assertion Markup Language (SAML). Examples of STS implementations include the STS engine in SharePoint 2010/2013, ADFS, and third-party applications built using Windows Identity Foundation (WIF).

SharePoint Session Management

A user session in SharePoint 2010/2013 is the time in which a user is logged into SharePoint without needing to re-authenticate. SharePoint, like most secure systems, implements limited-lifespan sessions – i.e. users may authenticate with a SharePoint system, but they are not authenticated indefinitely. The length of user sessions falls under the control of session management, configured for each SharePoint web application.

SharePoint handles session management differently, depending on the authentication method in play (Kerberos, NTLM, CBA, forms, etc.). This article discusses how SharePoint works with Active Directory Federation Services (ADFS) – an STS – to maintain abstracted user authentication and manage user session lifetime.

The following is a summary of the default authentication and session-creation process in SharePoint 2010/2013 when using CBA with ADFS.

  1. A user requests a page in SharePoint from their browser – this might be the home page of the site.
  2. SharePoint captures the request and determines that no valid session exists, by the absence of the FEDAUTH cookie.
  3. SharePoint redirects the user to the internal STS – this is important because the internal STS handles all authentication requests for SharePoint and is the core of the CBA implementation in SharePoint 2010/2013.
  4. Since we have configured SharePoint to use ADFS as a trusted login provider, the internal STS redirects the user to the ADFS login page.
  5. ADFS acquires credentials and authenticates the user.
  6. ADFS creates a SAML token containing the user’s claims, encrypted and signed.
  7. ADFS posts the SAML token to the internal SharePoint STS.
  8. The Internal STS saves the SAML token in the SAML Token Cache.
  9. SharePoint creates the FEDAUTH cookie, which contains a reference to the SAML token in the cache.
  10. The internal STS redirects the user back to SharePoint, and then on to the originally requested page.

Session Lifetime

The lifetime of a SharePoint session, when using ADFS, is the topic of much confusion. Ultimately, SharePoint determines whether a user has a current session by the presence of the FEDAUTH cookie. The default behavior of SharePoint is to store this persistent cookie on the user’s disk, with a fixed expiration date. Before sending a new FEDAUTH cookie back to the user’s browser, SharePoint calculates the expiration of the cookie with the following formula:

FEDAUTH cookie lifetime = SAML Token Lifetime – Logon Token Cache Expiration Window

The above values are important since they govern the overall lifetime of the FEDAUTH cookie, and hence the session lifetime. The following describes each value and its source:

SAML Token Lifetime – This value, in minutes, is provided by the token issuer – ADFS. Each Relying Party configuration (one for each SharePoint farm) includes this value, and by default SharePoint sets the session lifetime to the same as the SAML token lifetime.

You can change this value using PowerShell and the ADFS cmdlet Set-AdfsRelyingPartyTrust. E.g.

Add-PSSnapin Microsoft.ADFS.PowerShell
Set-AdfsRelyingPartyTrust -TargetName "Relying Party Name" -TokenLifetime 10

Logon Token Cache Expiration Window – This value, in minutes, is configured on the SharePoint STS and governs how long the SAML token remains active in the cache, and therefore how long the associated user session remains alive. For example, if ADFS sets the SAML token lifetime to 10 minutes and this value is set in the STS to 2 minutes, then the overall SharePoint session lifespan is 8 minutes. E.g.

$ap = Get-SPSecurityTokenServiceConfig
$ap.LogonTokenCacheExpirationWindow = (New-TimeSpan -Minutes 2)
$ap.Update()
iisreset

Sliding Session

A sliding session is one where the session expiration time extends as a user interacts with the system. By default, SharePoint 2010/2013 does not offer sliding sessions; each new session expires at a fixed time, based on the formula described earlier.

Use of a sliding session does not mean that we must compromise security. Should a user become inactive, a sliding session times out just like a fixed session; the main difference is that a user extends a sliding session with continued use of the SharePoint system.

An important point here is that we should only extend sliding sessions on explicit user interaction – polling and other passive processes should not extend sessions indefinitely.

Creating a sliding session requires configuration of the Relying Party in ADFS and the SharePoint Logon Token Cache Expiration Window. The following PowerShell sets the Relying Party token lifetime to 60 minutes, which is the absolute maximum time that a session remains active should the user become inactive:

Add-PSSnapin Microsoft.ADFS.PowerShell
Set-AdfsRelyingPartyTrust -TargetName "Relying Party Name" -TokenLifetime 60

The following PowerShell sets the Logon Token Cache Expiration Window in the SharePoint STS to 40 minutes, which, combined with the 60-minute token lifetime above, gives a sliding session lifetime of 20 minutes:

$ap = Get-SPSecurityTokenServiceConfig
$ap.LogonTokenCacheExpirationWindow = (New-TimeSpan -Minutes 40)
$ap.Update()
iisreset

The above settings are only part of the solution. On their own they give a fixed session duration of 20 minutes, determined by the formula mentioned earlier – the logon token cache expiration window subtracted from the Relying Party token lifetime. To make the session renew with continued activity, we must refresh the session (and the FEDAUTH cookie) on each HTTP request, which we can achieve with an HTTP module.
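A minimal sketch of such a module follows. It assumes the WIF (Microsoft.IdentityModel) API that SharePoint 2010 claims authentication builds on – the event wiring and token constructor shown here come from the general WIF sliding-session pattern rather than anything SharePoint-specific, so treat it as a starting point to adapt and test in your own farm.

// Sliding-session HTTP module sketch. Assumes the WIF 1.0 (Microsoft.IdentityModel)
// assembly used by SharePoint 2010 claims authentication.
using System;
using System.Web;
using Microsoft.IdentityModel.Tokens;
using Microsoft.IdentityModel.Web;

public class SlidingSessionModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        // The session authentication module owns the FEDAUTH session cookie.
        SessionAuthenticationModule sam = FederatedAuthentication.SessionAuthenticationModule;
        if (sam != null)
        {
            sam.SessionSecurityTokenReceived += OnSessionSecurityTokenReceived;
        }
    }

    private static void OnSessionSecurityTokenReceived(object sender, SessionSecurityTokenReceivedEventArgs e)
    {
        DateTime now = DateTime.UtcNow;
        DateTime validFrom = e.SessionToken.ValidFrom;
        DateTime validTo = e.SessionToken.ValidTo;

        // Once more than half the session lifetime has elapsed, reissue the token
        // (and therefore the FEDAUTH cookie) with a fresh window of the same length.
        if (now > validFrom + TimeSpan.FromTicks((validTo - validFrom).Ticks / 2))
        {
            e.SessionToken = new SessionSecurityToken(
                e.SessionToken.ClaimsPrincipal,
                e.SessionToken.Context,
                now,
                now + (validTo - validFrom));
            e.ReissueCookie = true;
        }
    }

    public void Dispose() { }
}

Register the module in the claims web application’s web.config (and deploy the assembly to the GAC) so that it runs for every request.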

Persistent versus Session Cookies

By default, SharePoint stores the authentication/session (FEDAUTH) cookie as a persistent cookie on disk. This allows the user to close and reopen their browser and access SharePoint without having to re-authenticate. This behavior is not always desirable.

Fortunately, we can ask SharePoint to use in-memory cookies (session cookies) for the authentication (FEDAUTH) cookie, as follows:

$sts = Get-SPSecurityTokenServiceConfig
$sts.UseSessionCookies = $true
$sts.Update()
iisreset

Configuring SharePoint 2013 for Windows Azure Workflow

SharePoint 2013 now abstracts workflow processing out of the SharePoint farm, using Windows Azure Workflow (WAW). SharePoint still maintains the legacy workflow engine, as part of the .NET Framework 3.5.1, to enable execution of SharePoint 2010 workflows. However, SharePoint 2013 does not install WAW by default; the following steps detail the additional configuration required.

1. Ensure you are not installing on a domain controller – WAW integration does not work with SharePoint 2013 running on a single server domain controller

2. Create an account in your domain for WAW

3. Add this account to the local administrators group on the SharePoint server and grant log on locally permissions

4. Ensure the SQL server accepts connections via TCP/IP – use the SQL Server Configuration Manager tool

5. Provide the WAW account access to SQL Server, including create database permissions (or you could grant administrative permissions if you are brave)

6. Log onto the SharePoint server as that account

7. Install Workflow Beta 1.0 (http://technet.microsoft.com/en-us/library/jj193478), using the Web Platform Installer

8. After installation, you should see the WAW Configuration Wizard

9. Click to create a new farm, using custom settings

10. Configure databases and click the Test Connection button for each

11. Make sure the WAW service account is correct – use the fully qualified domain name (FQDN); by default, the wizard prepopulates the textbox with a non-FQDN value

12. Provide certificate generation keys

13. Leave the ports as default

14. Check the checkbox to allow management over HTTP (if you choose to use HTTPS you will need to establish trust between SharePoint and WAW using a trusted certificate)

15. Click the next button to move onto configuring the service bus

16. Complete similar steps for database, service account, and certificates settings as you did above

17. Again, leave the ports as default

18. Review the summary page, then click the tick button to complete the configuration

19. Wait for the configuration to complete – this might take a little time

20. After WAW configuration completes, run the following PowerShell command:

Register-SPWorkflowService -SPSite "http://{sitecollectionurl}" -WorkflowHostUri "http://{workflowserve}:12291" -AllowOAuthHttp

21. Assuming no errors, you have now configured WAW in SharePoint 2013 for your site collection

More information on installing and configuring WAW is available at the following URL: http://technet.microsoft.com/en-us/library/jj658588%28v=office.15%29

The context has expired and can no longer be used

I routinely see this error when working with SharePoint 2013 in my development environment. This problem is more frequent when I restore earlier snapshots of my SP2013 server.

SharePoint spits out this error when the local server time is out of sync. To remedy this issue, try one of the following:

  1. Update the date and time on the SharePoint server
  2. Disable the web page security validation for the web application, as follows:
     1. Go to Central Administration
     2. Go to the "Application Management" section
     3. Go to "Web Application General Settings"
     4. Under "Web Page Security Validation", disable the option

SharePoint 2013 Managed Navigation

After much anticipation, SharePoint 2013 now offers custom navigation of sites via the Managed Metadata Term Store. SharePoint 2010 introduced managed metadata for tagging purposes, with hierarchical terms. This same hierarchical infrastructure bodes well for site navigation, which is also hierarchical. I hear the word “taxonomy” a lot, pertaining to both tagging taxonomy and site structure, which speaks to the fact that the Managed Metadata Term Store is a great fit for managing custom navigation.

Prior to SharePoint 2013, custom navigation typically involved some custom component to read the navigation structure from a list, an XML file, or some other hierarchical node store. The out-of-the-box offering provided very little in the way of custom navigation – just the ability to include headings and links at each site level – and the number of nested navigation nodes was limited unless the navigation adhered to the actual structure of sites and sub-sites in the collection. Although site navigation typically follows site structure, content owners should be able to store their content (sites and pages) in whatever structure suits them and still have the navigation look completely different: content storage and structure suit how content owners maintain content, navigation is about how end users access content, and the two may look very different. Managed navigation finally allows content owners to create a navigation structure independent of their content model.

To demonstrate Managed Navigation, I shall first create a hierarchy in the default term store, for our application:

  1. Open Central Administration
  2. Click the link for Manage service applications
  3. Scroll down the list and click the Managed Metadata Service
  4. Click the Manage icon in the ribbon to open the Term Store editor
  5. Ensure you have permissions to edit the term store – add your username to the term store administrators field
  6. Managed navigation binds to term sets, so I created a new group for navigation and then a term set for site navigation

SharePoint creates a default term set in the Managed Metadata Term Store for your site collection; I created my own for demonstration purposes.

  1. Create a term set structure
  2. Click the Site Navigation term set
  3. In the right panel, click the tab for Intended Use
  4. Check the checkbox to enable the term set for navigation – you can also use the term set for tagging if you wish by toggling the other checkbox option
  5. Click the save button to save the changes
  6. Click the tab for term driven pages – this page shows the settings for friendly URLs for the term set (more on friendly URLs shortly)
  7. Now we are ready to configure our publishing site to use the managed navigation
  8. Open your publishing site (assuming the hosting web application uses the managed metadata service you just configured)
  9. Click the gear icon, then select the menu item for Site Settings
  10. Click the link for Navigation, under the Look and Feel header
  11. SharePoint displays the navigation settings page
  12. Choose the radio button option for Managed Navigation for either or both the left and global (top) navigation
  13. Scroll to the bottom of the page to the managed navigation term set section
  14. Select the term set to use for managed navigation
  15. The checkboxes below the term set browser tell SharePoint whether to populate your term set with nodes when you create new pages in the site, and whether to generate friendly URLs for new pages
  16. Click the OK button at the bottom of the page to save your changes – the same settings can also be applied in code, as in the sketch below
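For scripted or repeatable deployments, the same navigation settings can be applied with the publishing navigation API. The following is a minimal sketch using the server object model; the site URL and the term group/term set names correspond to the ones created above but are otherwise assumptions.

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Publishing.Navigation;
using Microsoft.SharePoint.Taxonomy;

class ConfigureManagedNavigation
{
    static void Main()
    {
        // Site collection URL is an assumption for this sketch.
        using (SPSite site = new SPSite("http://intranet.contoso.com"))
        using (SPWeb web = site.OpenWeb())
        {
            // Assumes a default term store is configured for the site collection.
            TaxonomySession session = new TaxonomySession(site);
            TermStore termStore = session.DefaultSiteCollectionTermStore;

            // The "Navigation" group and "Site Navigation" term set were created in the term store editor above.
            TermSet termSet = termStore.Groups["Navigation"].TermSets["Site Navigation"];

            // Point the site's global (top) navigation at the term set.
            WebNavigationSettings settings = new WebNavigationSettings(web);
            settings.GlobalNavigation.Source = StandardNavigationSource.TaxonomyProvider;
            settings.GlobalNavigation.TermStoreId = termStore.Id;
            settings.GlobalNavigation.TermSetId = termSet.Id;
            settings.Update(session);
        }
    }
}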


Configuring RBS for SP2010

Following on from my previous post about list scaling and performance, this post details the configuration of Remote Blob Storage (RBS) for SharePoint 2010 and SQL Server 2008 R2.

First download the RBS provider for SQL Server 2008 (don’t install it yet):

http://go.microsoft.com/fwlink/?LinkId=177388

Configure FILESTREAM for the SQL Server service using SQL Server Configuration Manager:


Execute the following SQL queries:

EXEC sp_configure filestream_access_level, 2

RECONFIGURE

Execute the following SQL to set up a master encryption key and blob store file group:

use WSS_Content

if not exists (select * from sys.symmetric_keys where name = N'##MS_DatabaseMasterKey##')
    create master key encryption by password = N'Admin Key Password !2#4'

if not exists (select groupname from sysfilegroups where groupname = N'RBSFilestreamProvider')
    alter database WSS_Content add filegroup RBSFilestreamProvider contains filestream

alter database WSS_Content
    add file (name = RBSFilestreamFile, filename = 'c:\Blobstore')
    to filegroup RBSFilestreamProvider

Install the RBS provider with the following command (change DBINSTANCE to your SQL server instance):

msiexec /qn /lvx* rbs_install_log.txt /i RBS_X64.msi TRUSTSERVERCERTIFICATE=true FILEGROUP=PRIMARY DBNAME="WSS_Content" DBINSTANCE="SP2010" FILESTREAMFILEGROUP=RBSFilestreamProvider FILESTREAMSTORENAME=FilestreamProvider_1

If installing RBS on production servers, be sure to run the following command on all WFEs (again, change the DBINSTANCE):

msiexec /qn /lvx* rbs_install_log.txt /i RBS_X64.msi DBNAME="WSS_Content" DBINSTANCE="SP2010" ADDLOCAL="Client,Docs,Maintainer,ServerScript,FilestreamClient,FilestreamServer"

Run the following PowerShell script from the SharePoint 2010 Management Shell:

$cdb = Get-SPContentDatabase -WebApplication http://sp2010

$rbss = $cdb.RemoteBlobStorageSettings

$rbss.Installed()

$rbss.Enable()

$rbss.SetActiveProviderName($rbss.GetProviderNames()[0])

$rbss

Now create a document library in SharePoint and upload an image to it. Next, visit the c:\Blobstore directory and look for the GUID subfolder with a recent date. Keep drilling down until you find a file with a GUID name. Open it in Internet Explorer and you should see that it is the same file you uploaded to your document library.

 

From the SharePoint 2010 book I’m reviewing

List Scaling and Performance in SP2010

It is a well-known fact that MOSS 2007 stirred up strong opinions on the subject of list scalability and performance. Many developers operated under the misconception that SharePoint lists only allowed 2,000 list items before croaking with bad performance. Nothing could be further from the truth.

The following article discusses the issue of large lists in great depth and highlights the point that SharePoint can actually handle many more than 2,000 items in any one list. However, querying this data is affected by the item count, and SharePoint architects should design their data access and presentation of data accordingly.

http://technet.microsoft.com/en-us/library/cc262813.aspx

Microsoft has added a number of enhancements to lists in SharePoint 2010 to handle larger capacity and the querying of this data; the following is a short summary of these enhancements:

List Column Indexing

SP2010 now allows list designers to create up to 20 indices (some spanning multiple columns) on any one list. These indices allow for faster querying of data when the list size exceeds what is typical.


Under list settings, in the columns section, users now see a link to Indexed Columns.

The following is a list of column types usable as indexed columns:

· Single line of text
· Choice field, but not multi-choice
· Number
· Currency
· Date/Time
· Lookup, but not a multi-value lookup
· Person or group, but not multi-value
· Title, except in a document library
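
Indexed columns can also be created programmatically. The following is a minimal sketch using the server object model; the site URL, list name, and column name are hypothetical.

using System;
using Microsoft.SharePoint;

class AddColumnIndex
{
    static void Main()
    {
        // Site URL, list, and field names are assumptions for this sketch.
        using (SPSite site = new SPSite("http://sp2010"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList list = web.Lists["Projects"];
            SPField field = list.Fields["Project Code"];

            // Marking the field as indexed creates a single-column index on the list.
            field.Indexed = true;
            field.Update();

            Console.WriteLine("Indexed fields on the list: {0}", list.FieldIndexes.Count);
        }
    }
}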

 

List Throttling

SharePoint administrators now have the capability to better control list queries, so that developers (or general users) cannot issue queries against large lists that might potentially bring down the server. Specifically:

Administrators may define a number of limits at the web application level (a sketch for setting these in code follows the list):

- Configure the number of items fetched for queries
- Receive warnings when thresholds are exceeded
- Configure time periods in which expensive queries are allowed to operate
- Limit the size of list items (default 8 KB)
- Limit the number of columns in a join (default 6)
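
The following is a minimal sketch, using the server object model, of how an administrator might adjust these limits in code. The web application URL and the limit values are illustrative only; changes apply to every list in the web application.

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

class AdjustThrottlingLimits
{
    static void Main()
    {
        // Web application URL and limit values are assumptions for this sketch.
        using (SPSite site = new SPSite("http://sp2010"))
        {
            SPWebApplication webApp = site.WebApplication;

            // Maximum items a non-administrative query may touch before being throttled.
            webApp.MaxItemsPerThrottledOperation = 5000;

            // Higher ceiling available to administrators and object-model overrides.
            webApp.MaxItemsPerThrottledOperationOverride = 20000;

            // Item count at which the list settings page starts showing a warning.
            webApp.MaxItemsPerThrottledOperationWarningLevel = 3000;

            webApp.Update();
        }
    }
}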

The following code will display the list throttling limits for the site collection:

using (SPSite site = new SPSite(siteUrl))
{
    Console.WriteLine("MaxItemsPerThrottledOperation: {0}",
        site.WebApplication.MaxItemsPerThrottledOperation);
    Console.WriteLine("MaxItemsPerThrottledOperationOverride: {0}",
        site.WebApplication.MaxItemsPerThrottledOperationOverride);
    Console.WriteLine("MaxItemsPerThrottledOperationWarningLevel: {0}",
        site.WebApplication.MaxItemsPerThrottledOperationWarningLevel);
}

To enable list throttling on any given list, toggle the setting on the SPList instance with the following:

list.EnableThrottling = true;  // 'list' is an SPList instance

MaxItemsPerThrottledOperationWarningLevel – If a list exceeds the number of items specified in this threshold then a warning is displayed on the list settings page.

MaxItemsPerThrottledOperation – This indicates the maximum number of list items returned in a query for non-administrators. Administrators can query up to the threshold in MaxItemsPerThrottledOperationOverride but will receive a warning on the list settings page.

If administrators wish to allow users to execute expensive operations within a specific window of time, they can do so by using the following method on the SPWebApplication object: SetDailyUnthrottledPrivilegedOperationWindow.
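A minimal sketch of calling that method follows. The web application URL, start hour, and duration are assumptions, and the units of the second argument should be verified against the SDK documentation.

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

class SetUnthrottledWindow
{
    static void Main()
    {
        // Site URL and window values are assumptions for this sketch.
        using (SPSite site = new SPSite("http://sp2010"))
        {
            SPWebApplication webApp = site.WebApplication;

            // Open a daily window starting at 22:00 server time during which large-list
            // operations are not throttled; the second argument is the window duration
            // (assumed here to be minutes - check the SDK before relying on this).
            webApp.SetDailyUnthrottledPrivilegedOperationWindow(22, 120);
            webApp.Update();
        }
    }
}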

 

RBS Storage (Remote Blob Storage)

In some cases the use of document libraries to store large files no longer scales and causes content databases to become unmanageable. A public web site, hosted in SharePoint, that provides users with rich media content – web files and large images – is one such example of the large blob storage issue.

In MOSS, hosting content in the database provided certain benefits, such as a single storage location, versioning, and access to files via the object model, whereas file-based storage provided better scalability at the cost of content orphaned from the object model. SP2010 solves this issue with RBS. Site architects can now store large files (blobs) in locations other than the SharePoint content database without relinquishing access via the OM. From a developer standpoint, the data is accessed as if it were in the content database, but the content actually lives in a remote location.

To enable RBS, your farm will need to use at least SQL Server 2008 R2.

Marking blobs as external at the content database level enables SharePoint to store the meta-data associated with blobs in the database while storing the actual blob content outside the content database.  Because RBS is handled at the database level, SharePoint is unaware that data is not stored in the content database but in another location.

In time, vendors will bring RBS providers for SP2010 to the table, but in the meantime Microsoft has provided RBS for SQL Server as an extra download:

http://go.microsoft.com/fwlink/?LinkId=177388

See my next blog post on configuring RBS.

 

From the SharePoint 2010 book I’m reviewing

SP2010 Blogging Open

Just in case you missed it, public blogging of SharePoint 2010 is now permitted – let the flood gates open ;)

SP2010 Developer Dashboard

The SP2010 Developer Dashboard allows developers to review object model calls, database queries, web part events – and the timings for these various happenings.

The following code enables the dashboard:

SPPerformanceMonitor SPPerfMon;
SPPerfMon = SPFarm.Local.PerformanceMonitor;
SPPerfMon.DeveloperDashboardLevel = SPPerformanceMonitoringLevel.On;
SPPerfMon.Update();

The following code turns it off again:

SPPerformanceMonitor SPPerfMon;
SPPerfMon = SPFarm.Local.PerformanceMonitor;
SPPerfMon.DeveloperDashboardLevel = SPPerformanceMonitoringLevel.Off;
SPPerfMon.Update();

Development Setup for SP2010

Some important points to remember when developing against SP2010:

  • Make sure your Visual Studio project is set up for .NET 3.5, not .NET 4.0
  • Run Visual Studio as an Administrator to load debugging symbols
  • Make sure your project is set to compile for Any CPU or x64 (not the default x86); otherwise your code will throw a FileNotFoundException

From the SharePoint 2010 book I’m reviewing
