Tag Archives: SharePoint 2010

SharePoint 2010 Calendar Item Error – Item Does Not exist. It may have been deleted by another user

Today, I encountered an interesting issue with SharePoint calendar list items. My customer had created a recurring calendar entry using Outlook and then subsequently deleted the series in Outlook. Somewhere along the timeline they may have updated specific instances in the series. After deleting the series, my customer noticed that future events in the recurrence series remained in the SharePoint Calendar list. Any attempt to delete these list items via the UI or PowerShell resulted in the error “Item Does Not exist. It may have been deleted by another user”.

SharePoint handles recurring events by creating child list items for every event in the series and tying them back to one master list item via the MasterSeriesItemID field. The master list item is not shown in the UI; users only see the individual events in the series, especially when the series has exceptions.

Deleting the master list item with the ID specified in MasterSeriesItemID deleted the entire set of recurring events in the series, which was the desired outcome. I’m guessing Outlook should have deleted the series master item in SharePoint when the user deleted the series in Outlook, but that clearly didn’t happen.
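
For reference, here is a minimal sketch of how one might remove the orphaned master item through the server object model; the site URL, list name, and item ID are placeholders, and the MasterSeriesItemID value comes from inspecting one of the stuck child items:

using System;
using Microsoft.SharePoint;

class RecurrenceCleanup
{
    static void Main()
    {
        // Placeholder site URL and list name - substitute your own.
        using (SPSite site = new SPSite("http://sp2010/sites/team"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList calendar = web.Lists["Calendar"];

            // Read MasterSeriesItemID from one of the stuck child items
            // (the item ID 42 is a placeholder - use one of the orphaned events).
            SPListItem orphan = calendar.GetItemById(42);

            // MasterSeriesItemID may be stored as a simple ID or a lookup-style
            // value; adjust the parsing if your data looks different.
            int masterId = Convert.ToInt32(orphan["MasterSeriesItemID"]);

            // Deleting the hidden master item removes the entire recurrence series.
            calendar.GetItemById(masterId).Delete();
        }
    }
}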

SharePoint Site Pages, What Are They?

SharePoint Foundation introduced Site Pages. Site Pages are pages created, edited, and customized by end users. Site Pages differ from Application Pages, which have been around since WSS 3.0, live in the SharePoint file system (the hive), and are responsible for back-end functionality (such as site settings).

Site Pages are either un-customized (ghosted) or customized (un-ghosted). The state of a Site Page determines where the page content resides – on the file system, in the content database, or both – and this is sometimes a source of confusion.

Un-customized Site Pages

An un-customized (or ghosted) Site Page is one that resides on the file system. Typically, these files live in the TEMPLATE\SiteTemplates folder, or some other location within the TEMPLATE folder, in the SharePoint file system. An un-customized page is sometimes referred to as a Page Template.

An un-customized page also maintains a reference in the site collection content database. This reference points to the location of the page in the file system.

An un-customized Site Page may contain inline code because SharePoint assumes a developer, with console access to the SharePoint server, has vetted any inline code or script.

Customized Site Pages

A customized (un-ghosted) Site Page is one that contains edits made by end users or designers using SharePoint Designer, the SharePoint API, or the SharePoint UI. The edits reside in the content database for the SharePoint site collection.

Whereas an un-customized page stores only a reference to its template on the file system, a customized page stores both the customized page content and the reference to the original template in the content database.

Customized Site Pages may NOT include inline code because edits are not controlled by administrators with access to the server console. SharePoint controls this behavior by running all customized page content through a Page Parser, which strips out any inline code.

Sandbox Solution Site Pages

Sandbox solutions do not allow deployment of files to the SharePoint file system; therefore, any Site Page deployed as a module in a sandboxed solution deploys ONLY to the site collection content database. Users may customize these pages too, but there is no reference to a file system location in the content database.

Page Parsing

SharePoint parses ASPX content (both Application and Site Pages) in one of two modes, depending on the page: direct or safe-mode. The first time a user requests an Application Page or an un-customized Site Page, SharePoint parses the page in direct mode: the page content is parsed, compiled, and placed in the in-memory cache to speed up subsequent requests for the same page.

Customized Site Pages reside in the content database and undergo a stricter parsing method, called safe-mode parsing. In safe-mode, the page content may not contain any inline server code, user and server controls must be registered as safe in the application web.config, and the page is not compiled. Safe-mode pages do not live in memory cache, so their use is a performance consideration.

Note: It is possible to override the behavior of the safe-mode parser by adding <PageParserPath> entries (inside a <PageParserPaths> element) to the <SafeMode> element in web.config, which enables you to allow certain Site Pages to contain inline server code. However, this is not recommended, because it compromises the security of your site collection by allowing end users to include potentially dangerous code in page content.

Synchronous Web Events

Triggering custom behavior after sub-site (web) creation in SharePoint 2007 involved stapling a custom site feature to the site definition. SharePoint 2010 provides additional “web” events, to which developers may bind custom event receivers and execute custom code.

Creating a new event receiver and binding to web events is a simple exercise in Visual Studio 2010 (Add a new event receiver item to a SharePoint project and specify the event).

I recently wrote a custom event receiver to provision content types in the Pages library of a publishing site at site creation – straightforward enough. Everything worked great inside the debugger, but after deployment, SharePoint would kick back a synchronization error indicating that a previous update had been made during the provisioning process.

After some head scratching, I suspected that even though SharePoint fired the WebProvisioned event, the site provisioning process was continuing in the background. A breakpoint set in my catch block tipped me off, because the Pages library did not yet exist.

I consulted with a colleague, who made me aware of the following XML node in the event receiver Elements.xml file:

<Synchronization>Synchronous</Synchronization>

Without the above node in the Elements.xml file, SharePoint calls the WebProvisioned event asynchronously by default.

After a quick recompile and redeployment (with the above node added), I tested sub-site provisioning and saw my event receiver run each time without failure.
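
For illustration, here is a stripped-down sketch of such a receiver; the content type name is a placeholder and this is not my exact project code:

using Microsoft.SharePoint;

// Bound via Elements.xml to the WebProvisioned event with
// <Synchronization>Synchronous</Synchronization> so the Pages library exists when it runs.
public class PagesLibraryReceiver : SPWebEventReceiver
{
    public override void WebProvisioned(SPWebEventProperties properties)
    {
        SPWeb web = properties.Web;

        SPList pages = web.Lists.TryGetList("Pages");
        if (pages == null) return; // not a publishing web

        // Placeholder content type name - defined at the site collection root.
        SPContentType ct = web.Site.RootWeb.ContentTypes["My Article Page"];
        if (ct != null && pages.ContentTypes[ct.Name] == null)
        {
            pages.ContentTypes.Add(ct);
            pages.Update();
        }
    }
}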

Assigning a Unique Master Page to a Page Layout in SharePoint 2010

In the old days of SharePoint 2007, the master page reference in a publishing page layout lived in the MasterPageFile attribute of the @Page directive at the top of the layout file.

This came in handy when you needed to create a page layout that stood out from the common branding of the site – such as a page layout with no chrome for popups – and this was exactly what I wanted to accomplish today in SharePoint 2010.

Unfortunately, Microsoft changed the way Page Layouts associate with their master page. Open any of the out-of-the-box page layouts in SP2010 and you should notice that there are no references to master page files anywhere. This is because the master page association is handled by the containing site’s settings.

So how does one go about creating a “special” page layout that does not follow the same branding as the rest of the site? One option is to isolate such pages in sub-sites, which is frankly crappy. Fortunately, the alternative is much better – the general consensus is that the solution consists of creating a subclass of PublishingLayoutPage that sets the master page URL explicitly during page initialization.
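
Here is a minimal sketch of that approach. The master page URL is a placeholder, and because ASP.NET requires the master page to be assigned no later than PreInit, the sketch overrides OnPreInit and sets MasterPageFile there:

using System;
using Microsoft.SharePoint.Publishing;

namespace MyCompany.Branding
{
    // Point the page layout's Inherits attribute at this class instead of
    // Microsoft.SharePoint.Publishing.PublishingLayoutPage.
    public class ChromelessLayoutPage : PublishingLayoutPage
    {
        protected override void OnPreInit(EventArgs e)
        {
            base.OnPreInit(e);

            // Placeholder master page URL - adjust to your master page gallery path.
            this.MasterPageFile = "/_catalogs/masterpage/Chromeless.master";
        }
    }
}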

Thanks to Eric for his post, which I referenced to solve this issue.

TaxonomyClientService.AddTerms Wrong Documentation

I’ve been working lately on a project that requires access to the Managed Metadata Service in SP2010.  I got to a point where I needed to add a term to the default term store under a term set.

I have some code in my project that takes in the following parameters and creates a term in the term store:

– TaxonomyClientService proxy instance

– Term name

– Term store ID

– Term Set ID

I needed to use the AddTerms method of the proxy to create a new term, and spent most of my afternoon wrestling with the format of the newTerms parameter of the method.

The following MSDN documentation is wrong! (here) – or at least not informative.

The MSDN documentation says to use NewTerm nodes to wrap new terms in the XML passed to the service. What the documentation did not tell me was:

1. The term set must be open, otherwise the method returns an empty string.

2. The method needs the exact syntax for the XML to work – looking on the web, I found no real answer to this problem, and ended up reflecting the web service code to get my answer. Below is a sample piece of XML (a sketch of the full call follows the notes below).

<newTerms><newTerm label="MyTerm" clientId="1" parentTermId="GUID of parent or empty GUID if none"></newTerm></newTerms>

Worth noting with the above XML…

1. Notice the lowercase use of newTerms and newTerm (not uppercase N as in the MSDN documentation)

2. clientId does very little and so you can pass the value 1

3. The parentTermId must be a real GUID, or Guid.Empty if there is no parent

4. New terms wrap in the newTerms node, which MSDN failed to mention.
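
To put the XML in context, here is a rough sketch of the call; the proxy type name is hypothetical (it depends on the web reference you generate against /_vti_bin/TaxonomyClientService.asmx), and the AddTerms parameter order is from memory, so verify both against your generated proxy:

using System;
using System.Net;

class AddTermsExample
{
    static void Main()
    {
        // Hypothetical proxy type from a web reference named TaxonomyProxy.
        var proxy = new TaxonomyProxy.TaxonomyClientService
        {
            Url = "http://sp2010/_vti_bin/TaxonomyClientService.asmx", // placeholder URL
            Credentials = CredentialCache.DefaultCredentials
        };

        Guid termStoreId = Guid.Empty; // TODO: your term store (shared service) ID
        Guid termSetId = Guid.Empty;   // TODO: your term set ID - the set must be open

        // Lowercase element names, clientId of 1, and a real (or empty) parent GUID.
        string newTerms = String.Format(
            "<newTerms><newTerm label=\"MyTerm\" clientId=\"1\" parentTermId=\"{0}\"></newTerm></newTerms>",
            Guid.Empty);

        // LCID 1033 = English; the call returns XML describing the new term,
        // or an empty string if the term set is not open.
        string result = proxy.AddTerms(termStoreId, termSetId, 1033, newTerms);
        Console.WriteLine(result);
    }
}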

I hope this post saves others the afternoon’s worth of work that it cost me.

Programmatically Provision Term Store

I recently had to write a feature to provision the SharePoint 2010 Term Store. Numerous blog posts exist on how to populate the term store using PowerShell or how to write XML to add terms to the store, but what I wanted to do was a little different. The requirements for my feature were as follows:

1. Works in SharePoint 2010 (duh)

2. Create a new Term Store Group in default Term Store

3. Create a series of Term Sets

4. Create a series of Terms in the Term Sets

5. Deploy the feature at the Farm scope.

One of the issues with setting up a Term Store in the SharePoint 2010 Managed Metadata Service Application is that the service can be temperamental if the proxy for the service does not have the default term store storage location for site collections setting checked. I wrote a previous blog post on this issue here.

To save you some reading: the issue above is with obtaining the default term store instance for a given site collection when using the TaxonomySession object in the Microsoft.SharePoint.Taxonomy API. As my previous post mentioned, the way to resolve it is to check the option under Properties for the proxy and to ensure that the current user has full control as an Administrator of the Managed Metadata Service Application.

Nice one, Rob! But what if you want to avoid the manual step and configure these operations in code? It is not as hard as it may sound – check out the code below…

private void ProvisionMetadataService()
{
    // We don't have the metadata service configured, so let's do that.
    var proxy = SPFarm.Local.ServiceProxies.Where(
        s => s.GetType().Name.Equals("MetadataWebServiceProxy")).FirstOrDefault();
    if (null == proxy)
        throw new SPException("Failed to get instance of metadata web service proxy, is it installed?");
    foreach (var proxyApp in proxy.ApplicationProxies.Where(
        proxyApp => proxyApp.Properties.ContainsKey("IsDefaultSiteCollectionTaxonomy")))
    {
        proxyApp.Properties["IsDefaultSiteCollectionTaxonomy"] = true;
        proxyApp.Update(true);
    }
    // Give the current user access rights to the metadata service.
    var service = SPFarm.Local.Services.Where(
        s => s.GetType().Name.Equals("MetadataWebService")).FirstOrDefault();
    if (null == service)
        throw new SPException("Failed to get instance of metadata web service, is it installed?");
    var serviceApp = service.Applications.OfType<SPIisWebServiceApplication>().FirstOrDefault();
    if (null == serviceApp)
        throw new SPException("Failed to get instance of metadata web service app, is it installed?");
    var security = serviceApp.GetAdministrationAccessControl();
    var cba = SPClaimProviderManager.Local;
    var claim = cba.ConvertIdentifierToClaim(@"DOMAIN\user", SPIdentifierTypes.WindowsSamAccountName);
    security.AddAccessRule(
        new SPAclAccessRule<SPCentralAdministrationRights>(claim, SPCentralAdministrationRights.FullControl));
    serviceApp.SetAdministrationAccessControl(security);
    serviceApp.Uncache();
    service.Uncache();
}


With the above code executed, you can then open an instance of the Central Administration site collection and request the default site collection term store with the following code:

using (var site = new SPSite(_farmUrl))
{
    // Do we have explicit credentials?
    if (!String.IsNullOrEmpty(_username) && !String.IsNullOrEmpty(_password))
    {
        var user = site.RootWeb.AllUsers[_username];
        if (null == user)
            throw new SPException(String.Format("no user in site collection {0}", _username));
        using (var secureSite = new SPSite(site.ID, user.UserToken))
        {
            // Get the term store.
            var session = new TaxonomySession(secureSite);
            var termStore = session.DefaultSiteCollectionTermStore;
            if (null == termStore)
                throw new SPException("Failed to get the default term store instance");
            del(termStore);
        }
    }
    else
    {
        // Get the term store.
        var session = new TaxonomySession(site);
        var termStore = session.DefaultSiteCollectionTermStore;
        if (null == termStore)
            throw new SPException("Failed to get the default term store instance");
        del(termStore);
    }
}

It’s not obvious from the code above, but “del” is a delegate that I pass as a parameter to the method wrapping the above code.
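
For what it’s worth, the delegate in my case did something along the following lines; the group, term set, and term names are placeholders, and TermStore.CommitAll is the important part:

// Requires: using Microsoft.SharePoint.Taxonomy;
private void ProvisionTerms(TermStore termStore)
{
    // Placeholder group, term set, and term names.
    Group group = termStore.CreateGroup("My Group");
    TermSet colors = group.CreateTermSet("Colors");
    colors.CreateTerm("Red", 1033);
    colors.CreateTerm("Blue", 1033);

    // Nothing is persisted until CommitAll is called.
    termStore.CommitAll();
}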

So, lastly, how did I get the Central Admin URL in the feature receiver?  See below:

var app = SPAdministrationWebApplication.GetInstanceLocalToFarm(SPFarm.Local);
var url = app.Sites[0].Url;

Managed Metadata Service: DefaultSiteCollectionTermStore == null

I happened to configure my SP2010 farm using automated PowerShell scripts, and as a result my default metadata term store proxy was not set as the default for any new or existing site collections. This issue manifested itself when I was trying to access the default site collection term store via the SharePoint API as a property of the TaxonomySession class.

I came across the following blog post, which got me as far as establishing the metadata service proxy as the default for the site collection. To access the DefaultSiteCollectionTermStore property, I had to configure additional permissions.

By default, my Managed Metadata Service Application permitted access only to the farm and application pool accounts, but my custom code was running in the context of the logged-in user. To rectify this, I could either elevate permissions to run as the app pool user, or give the logged-in user explicit permissions by clicking through Central Administration as follows (a quick verification snippet follows the steps):

1. Central Administration

2. Application Management

3. Manage Service Applications

4. Metadata Service Application (not the proxy)

5. Permissions (Ribbon)

6. Add the user via the dialog and give them permissions to access the application.
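
Once the proxy is marked as default and the permissions are in place, a quick check like the following should return a non-null store; the site collection URL is a placeholder, and the snippet assumes references to Microsoft.SharePoint and Microsoft.SharePoint.Taxonomy:

using (var site = new SPSite("http://sp2010")) // placeholder site collection URL
{
    var session = new TaxonomySession(site);
    var termStore = session.DefaultSiteCollectionTermStore;
    Console.WriteLine(termStore == null
        ? "Still null - check the proxy default setting and permissions"
        : "Default term store: " + termStore.Name);
}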

Configuring RBS for SP2010

Following on from my previous post about list scaling and performance, this post details the configuration of Remote Blob Storage (RBS) for SharePoint 2010 and SQL Server 2008 R2.

First download the RBS provider for SQL Server 2008 (don’t install it yet):

http://go.microsoft.com/fwlink/?LinkId=177388

Configure FILESTREAM for the SQL Server service using SQL Server Configuration Manager:

[Screenshot: FILESTREAM enabled for the SQL Server service in SQL Server Configuration Manager]

Execute the following SQL queries:

EXEC sp_configure filestream_access_level, 2

RECONFIGURE

Execute the following SQL to set up a master encryption key and blob store file group:

use WSS_Content

if not exists (select * from sys.symmetric_keys where name = N'##MS_DatabaseMasterKey##')
    create master key encryption by password = N'Admin Key Password !2#4'

if not exists (select groupname from sysfilegroups where groupname = N'RBSFilestreamProvider')
    alter database WSS_Content add filegroup RBSFilestreamProvider contains filestream

alter database [WSS_Content] add file (name = RBSFilestreamFile, filename = 'c:\BlobStore')
    to filegroup RBSFilestreamProvider

Install the RBS provider with the following command (change DBINSTANCE to your SQL server instance):

msiexec /qn /lvx* rbs_install_log.txt /i RBS_X64.msi TRUSTSERVERCERTIFICATE=true FILEGROUP=PRIMARY DBNAME="WSS_Content" DBINSTANCE="SP2010" FILESTREAMFILEGROUP=RBSFilestreamProvider FILESTREAMSTORENAME=FilestreamProvider_1

If installing RBS on production servers, be sure to run the following command on all WFEs (again, change the DBINSTANCE):

msiexec /qn /lvx* rbs_install_log.txt /i RBS_X64.msi DBNAME="WSS_Content" DBINSTANCE="SP2010" ADDLOCAL="Client,Docs,Maintainer,ServerScript,FilestreamClient,FilestreamServer"

Run the following PowerShell script from the SharePoint 2010 Management Shell:

$cdb = Get-SPContentDatabase -WebApplication http://sp2010

$rbss = $cdb.RemoteBlobStorageSettings

$rbss.Installed()

$rbss.Enable()

$rbss.SetActiveProviderName($rbss.GetProviderNames()[0])

$rbss

Now create a document library in SharePoint and upload an image to it. Next, visit the c:\BlobStore directory and look for the GUID sub-folder with a recent date. Keep drilling down until you find a file with a GUID name. Drop this file into IE and you should see that it is the same file you uploaded to your document library.


From the SharePoint 2010 book I’m reviewing

List Scaling and Performance in SP2010

It is a well-known fact that MOSS 2007 raised some strong opinions on the subject of list scalability and performance. Many developers operated under the misconception that SharePoint lists only allowed 2,000 list items before croaking out with bad performance. Nothing could be further from the truth.

The following article discusses the issue of large lists in great depth and highlights the point that SharePoint can actually handle many more than 2,000 list items in any one list. However, querying this data is affected by the item count, and SharePoint architects should design their data access and presentation of data accordingly.

http://technet.microsoft.com/en-us/library/cc262813.aspx

Microsoft has added a number of enhancements to lists in SharePoint 2010 to handle larger capacity and the querying of this data. The following is a short summary of the enhancements:

List Column Indexing

SP2010 now allows list designers to create up to 20 indices (some spanning multiple columns) on any one list. These indices allow for faster querying of data when the list size grows beyond what is typical.


Under list settings, in the Columns section, users now see a link to Indexed Columns.

The following is a list of column types usable as indexed columns:

· Single line of text

· Choice field, but not multi-choice

· Number

· Currency

· Date/Time

· Lookup, but not a multi-value lookup

· Person or group, but not multi-value

· Title, except in a document library
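
Indexes can also be added through the object model; here is a small sketch, where the site URL, list name, and column name are placeholders:

using (SPSite site = new SPSite("http://sp2010")) // placeholder URL
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["Large List"];   // placeholder list name
    SPField field = list.Fields["Category"]; // placeholder column name

    // Marking the field as indexed creates the column index for this list.
    field.Indexed = true;
    field.Update();
}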


List Throttling

SharePoint administrators now have better control over list queries, so that queries issued by developers (or general users) against large lists cannot bring down the server. Specifically:

Administrators may define some limits at the web application level:

– Configure the number of items fetched by queries

– Receive warnings when thresholds are exceeded

– Configure time windows in which expensive queries may run

– Limit the size of list items (defaults to 8 KB)

– Limit the number of columns in a join (defaults to 6)

The following code will display the list throttling limits for the site collection:

using (SPSite site = new SPSite(siteUrl))
{
    Console.WriteLine("MaxItemsPerThrottledOperation:{0}",
        site.WebApplication.MaxItemsPerThrottledOperation);
    Console.WriteLine("MaxItemsPerThrottledOperationOverride:{0}",
        site.WebApplication.MaxItemsPerThrottledOperationOverride);
    Console.WriteLine("MaxItemsPerThrottledOperationWarningLevel:{0}",
        site.WebApplication.MaxItemsPerThrottledOperationWarningLevel);
}

To enable list throttling on any list be sure to toggle the setting with the following:

SPList.EnableThrottling = true
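
In context, the property is set on a list instance; here is a minimal sketch, where the site URL and list name are placeholders and changing the value requires sufficient privileges:

using (SPSite site = new SPSite("http://sp2010")) // placeholder URL
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["Large List"]; // placeholder list name
    list.EnableThrottling = true;          // false would exempt this list from throttling
    list.Update();
}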

MaxItemsPerThrottledOperationWarningLevel – If a list exceeds the number of items specified by this threshold, a warning is displayed on the list settings page.

MaxItemsPerThrottledOperation – This indicates the maximum number of list items returned to non-administrators. Administrators can query up to the threshold in MaxItemsPerThrottledOperationOverride, but will receive a warning on the list settings page.

If administrators wish to allow users to execute expensive operations within a specific window of time, they can do so using the following method on the WebApplication object: SetDailyUnthrottledPrivilegedOperationWindow


RBS Storage (Remote Blob Storage)

In some cases, the use of document libraries to store large files is no longer scalable and causes content databases to become unmanageable. A public web site hosted in SharePoint that provides users with rich media content – web files and large images – is one such example of the large blob storage issue.

In MOSS, hosting content in the database provided certain benefits, such as a single storage location, versioning, and access to files via the object model, whereas file-based storage provided better scalability at the cost of orphaning content from the object model. SP2010 solves this issue with RBS. Site architects can now store large files (blobs) in locations other than the SharePoint content database without relinquishing access via the OM. From a developer standpoint, the data is accessed as if it were in the content database, even though the content actually lives in a remote location.

To enable RBS, your farm will need to use at least SQL Server 2008 R2.

Marking blobs as external at the content database level enables SharePoint to store the meta-data associated with blobs in the database while storing the actual blob content outside the content database.  Because RBS is handled at the database level, SharePoint is unaware that data is not stored in the content database but in another location.

In time, vendors will bring RBS providers for SP2010 to the table, but in the meantime Microsoft has provided RBS for SQL Server as a separate download:

http://go.microsoft.com/fwlink/?LinkId=177388

See my next blog post on configuring RBS.


From the SharePoint 2010 book I’m reviewing