PowerShell Register Provider-Hosted Add-in/App

My current client uses provider-hosted add-ins with SharePoint 2013 on-premises. We have a centralized server infrastructure – the provider host – for the add-ins, where we deploy the add-in/app logic, and we then deploy the APP files to different SharePoint 2013 environments.

Why? The add-ins we’ve developed use CSOM to effect changes in the SharePoint environment to which they’re deployed. We have one team developing the provider-hosted add-ins and another team testing the add-ins within their development environments. This post is not about lifecycle deployment of SharePoint provider-hosted add-ins – we have integration, staging, test, and production hosts for that purpose – but about a nifty PowerShell script to reuse APP files across environments.

So, the scenario goes like this…

We have an integration farm with the provider-hosted add-ins deployed (and working). Developers download these add-ins from the integration SharePoint farm’s app catalog and save the APP files locally. They then upload these APP files into the app catalog of their local development SharePoint farm. Each development farm has a registered security token issuer that uses the same issuer ID as the integration farm. The development farms also use the same trusted root certificate as integration for the high-trust relationship between the provider host and SharePoint. The remaining step is to ensure that each add-in deployed to a development farm has the same client/app ID as the one registered in the integration farm.

The typical process to register a shared add-in would be to crack open the APP file (just a ZIP file), pull the client ID from the AppManifest.xml inside, and then browse to https://site/_layouts/15/appregnew.aspx. However, I wanted a script that avoided all that nonsense, and here it is:

[CmdletBinding()]Param(
    [Parameter(Mandatory=$true)][string]$appPath,
    [Parameter(Mandatory=$true)][string]$webUrl
);

if ((Get-PSSnapin -Name "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell";
}

$zipStream = $null;
$streamReader = $null;
try {
    Write-Verbose "Looking for AppManifest in APP Zip";
    [System.Reflection.Assembly]::LoadWithPartialName('System.IO.Compression') | Out-Null;
    if (![System.IO.File]::Exists($appPath)) { throw "$appPath does not exist"; }
    $zipBytes = [System.IO.File]::ReadAllBytes($appPath);
    $zipStream = New-Object System.IO.MemoryStream;
    $zipStream.Write($zipBytes, 0, $zipBytes.Length);
    $zipArchive = New-Object System.IO.Compression.ZipArchive($zipStream);
    $zipEntry = $zipArchive.GetEntry("AppManifest.xml");
    $streamReader = New-Object System.IO.StreamReader($zipEntry.Open());
    $manifest = New-Object System.Xml.XmlDocument;
    $manifest.LoadXml($streamReader.ReadToEnd());
    
    Write-Verbose "Looking for ClientID";
    $ns = New-Object System.Xml.XmlNamespaceManager($manifest.NameTable);
    $ns.AddNamespace("x", "http://schemas.microsoft.com/sharepoint/2012/app/manifest");
    $node = $manifest.SelectSingleNode("/x:App/x:AppPrincipal/x:RemoteWebApplication", $ns);
    $clientId = $node.Attributes["ClientId"].Value;
    $node = $manifest.SelectSingleNode("/x:App/x:Properties/x:Title", $ns);
    $appTitle = $node.InnerText;
    Write-Verbose "Found app with title $appTitle and clientID $clientId";
    
    Write-Verbose "Registering App ClientId with SharePoint";
    $web = Get-SPWeb $webUrl;
    $realm = Get-SPAuthenticationRealm -ServiceContext $web.Site;
    $fullAppId = $clientId + '@' + $realm;
    Register-SPAppPrincipal -DisplayName $appTitle -NameIdentifier $fullAppId -Site $web;

} catch {
    Write-Host -ForegroundColor Red $_.Exception;
} finally {
    if ($streamReader -ne $null) { $streamReader.Close(); }
    if ($zipStream -ne $null) { $zipStream.Close(); }
}

The script takes a full path to the APP file and a web URL at which to register the add-in. As you can see from the code, the script replicates the manual steps: it unzips the APP (in memory), pulls out the client ID, and calls Register-SPAppPrincipal to register the add-in.
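For example, assuming the script above is saved as Register-AppPrincipal.ps1 (a name I’ve made up), the call looks something like the following; the path and URL are placeholders for your own environment:

# Hypothetical example: register the add-in contained in HelloWorld.app against a dev site collection.
.\Register-AppPrincipal.ps1 -appPath "C:\Downloads\HelloWorld.app" -webUrl "http://devsp/sites/dev" -Verbose;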

SharePoint Add-In Governance

I’ve been working with a client that recently asked me about Governance of SharePoint Provider-Hosted Add-ins in their on-premises SharePoint 2013. Essentially, they wanted to take control over how site owners installed add-ins from the corporate SharePoint App Catalog. The App Catalog allows administrators to toggle whether site owners can install add-ins or request permission to install them, but my client was looking for a more granular solution on a site-by-site basis.

My thought process went in the direction of detecting the installation of an add-in and finding a way to intercept the installation process. This led me to the App Installed event that SharePoint supports for provider-hosted add-ins. Imagine a scenario where every add-in in the catalog fired an App Installed event that checked the add-in against a central database/list to determine whether the site owner could complete the installation. Seems simple enough. However, what if administrators could upload provider-hosted add-ins to the catalog that have no App Installed event code present? Perhaps a third-party add-in, or an add-in developed by someone outside the governance circle.

I wondered if it’d be possible to inject logic into existing add-ins after they’d been deployed to the app catalog. It turns out this isn’t as hard as it sounds. For those of you thinking I’m about to suggest a code hack that could compromise compiled code, hang on a moment longer.

Provider-hosted add-ins consist of an APP file in the app catalog, which redirects execution to an endpoint on another server. The APP file is really just a glorified ZIP file containing AppManifest.xml and other supporting files. Provider-hosted add-ins that support the App Installed event (and the Upgrading and Uninstalling events) include an endpoint reference to a remote event receiver web service in AppManifest.xml. My theory was that I could download the APP file, unzip it, make the change to the AppManifest.xml file, zip it again, and upload it back to the catalog. The modified APP file would then contain a remote event receiver location of my choosing. So, I started work…

First to illustrate what I mean, here’s a snapshot of a test Provider-Hosted Add-in:

<?xml version="1.0" encoding="utf-8" ?>
<!--Created:cb85b80c-f585-40ff-8bfc-12ff4d0e34a9-->
<App xmlns="http://schemas.microsoft.com/sharepoint/2012/app/manifest"
     Name="HelloWorld"
     ProductID="{316a9436-6358-4626-a007-c31fafe306a2}"
     Version="1.0.0.0"
     SharePointMinVersion="15.0.0.0" >
  <Properties>
    <Title>Hello World</Title>
    <StartPage>~remoteAppUrl/Pages/Default.aspx?{StandardTokens}</StartPage>
    <InstalledEventEndpoint>~remoteAppUrl/Services/AppEventReceiver.svc</InstalledEventEndpoint>
    <UninstallingEventEndpoint>~remoteAppUrl/Services/AppEventReceiver.svc</UninstallingEventEndpoint>
    <UpgradedEventEndpoint>~remoteAppUrl/Services/AppEventReceiver.svc</UpgradedEventEndpoint>
  </Properties>

  <AppPrincipal>
    <RemoteWebApplication ClientId="*" />
  </AppPrincipal>
  <AppPermissionRequests AllowAppOnlyPolicy="false">
  </AppPermissionRequests>
</App>
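Before diving into the automated solution, note that the manual version of this trick is only a handful of lines of PowerShell. The following is a rough sketch that assumes the APP file has already been downloaded locally and that the manifest already contains an InstalledEventEndpoint element; the file path and endpoint URL are placeholders:

# Sketch only: edit AppManifest.xml inside a locally downloaded APP file.
Add-Type -AssemblyName System.IO.Compression.FileSystem;
$zip = [System.IO.Compression.ZipFile]::Open("C:\Temp\HelloWorld.app", [System.IO.Compression.ZipArchiveMode]::Update);
try {
    $entry = $zip.GetEntry("AppManifest.xml");
    $reader = New-Object System.IO.StreamReader($entry.Open());
    [xml]$manifest = $reader.ReadToEnd();
    $reader.Close();
    # Point the installed event at an endpoint of my choosing (placeholder URL).
    $manifest.App.Properties.InstalledEventEndpoint = "https://acu.contoso.local/Services/AppMgmtReceiver.svc";
    # Rewrite the manifest entry in place.
    $writer = New-Object System.IO.StreamWriter($entry.Open());
    $writer.BaseStream.SetLength(0);
    $manifest.Save($writer);
    $writer.Close();
} finally {
    $zip.Dispose();
}

Re-uploading the modified file to the catalog, and handling add-ins whose manifests lack the endpoint elements entirely, is what the rest of this post automates.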

Next, I needed some plumbing that would register event receivers on the App Catalog to manipulate add-ins uploaded to the catalog. In keeping with good SharePoint 2013 development, I created a Provider-Hosted Add-in for this purpose. I called this PH add-in the App Catalog Updater, ACU for short.

Because the ACU needs to make changes to the App Catalog, this add-in requests Full Control at the tenant scope and the ability to run under the app-only policy (as opposed to app plus user credentials). This prevents the add-in from ever going into the Marketplace, but for my purpose this didn’t matter.

The following is the AppManifest.xml for the ACU:

<?xml version="1.0" encoding="utf-8" ?>
<!--Created:cb85b80c-f585-40ff-8bfc-12ff4d0e34a9-->
<App xmlns="http://schemas.microsoft.com/sharepoint/2012/app/manifest"
     Name="SPAppsUpdateAppCat"
     ProductID="{316a9436-6358-4626-a007-c31fafe306a2}"
     Version="1.0.0.0"
     SharePointMinVersion="15.0.0.0" >
  <Properties>
    <Title>SPApps.UpdateAppCat</Title>
    <StartPage>~remoteAppUrl/Pages/Default.aspx?{StandardTokens}</StartPage>
    <InstalledEventEndpoint>~remoteAppUrl/Services/AppEventReceiver.svc</InstalledEventEndpoint>
    <UninstallingEventEndpoint>~remoteAppUrl/Services/AppEventReceiver.svc</UninstallingEventEndpoint>
    <UpgradedEventEndpoint>~remoteAppUrl/Services/AppEventReceiver.svc</UpgradedEventEndpoint>
  </Properties>

  <AppPrincipal>
    <RemoteWebApplication ClientId="*" />
  </AppPrincipal>
  <AppPermissionRequests AllowAppOnlyPolicy="true">
    <AppPermissionRequest Scope="http://sharepoint/content/tenant" Right="FullControl" />
  </AppPermissionRequests>
</App>

As mentioned, the ACU registers remote events on the App Catalog, which I achieve using the following code:

using System;
using Microsoft.SharePoint.Client.EventReceivers;

namespace SPApps.UpdateAppCatWeb.Services
{
    public class AppEventReceiver : IRemoteEventService
    {
        /// <summary>
        /// Handles app events that occur after the app is installed or upgraded, or when the app is being uninstalled.
        /// </summary>
        /// <param name="properties">Holds information about the app event.</param>
        /// <returns>Holds information returned from the app event.</returns>
        public SPRemoteEventResult ProcessEvent(SPRemoteEventProperties properties)
        {
            var result = new SPRemoteEventResult();
            Logger.Logger.LogInfo("ProcessEvent called for AppEventReceiver", () =>
            {
                using (var clientContext = TokenHelper.CreateAppEventClientContext(properties, false))
                {
                    switch (properties.EventType)
                    {
                        case SPRemoteEventType.AppInstalled:
                            // Remove any old RER first.
                            AppHelper.UnregisterRemoteEvents(clientContext);
                            // Install a RER for the App Catalog.
                            AppHelper.RegisterRemoteEvents(clientContext);
                            // Iterate existing apps and process them.
                            AppHelper.ProcessAppList(clientContext);
                            break;
                        case SPRemoteEventType.AppUninstalling:
                            // Remove RER from the App Catalog.
                            AppHelper.UnregisterRemoteEvents(clientContext);
                            break;
                    }
                }
            });
            return result;
        }

        /// <summary>
        /// This method is a required placeholder, but is not used by app events.
        /// </summary>
        /// <param name="properties">Unused.</param>
        public void ProcessOneWayEvent(SPRemoteEventProperties properties)
        {
            throw new NotImplementedException();
        }

    }
}

The above code is the App Installed event receiver for the ACU. It executes when the ACU is installed for the first time and calls a handy AppHelper class to register the event receivers on the App Catalog. In addition, the code iterates over existing add-ins in the catalog and processes them, again using the AppHelper.

Let’s take a look at the App Helper class:

using Microsoft.SharePoint.Client;
using System;
using System.Xml;
using System.Collections.Generic;
using System.Diagnostics;
using System.Globalization;
using System.IO.Compression;
using System.ServiceModel;

namespace SPApps.UpdateAppCatWeb
{
    static class AppHelper
    {
        private const string LISTNAME = "Apps for SharePoint";
        private const string RERNAME = "Apps_Remote_Event_Receiver";

        public static void RegisterRemoteEvents(ClientContext clientContext)
        {
            if (null == clientContext) throw new ArgumentNullException("clientContext");
            try
            {
                // Get the Apps List.
                Logger.Logger.LogInfo("Registering remote events", () =>
                {
                    var appCat = clientContext.Web.Lists.GetByTitle(LISTNAME);
                    clientContext.Load(clientContext.Web);
                    clientContext.ExecuteQuery();
                    // Get the operation context and remote event service URL.
                    var remoteUrl = GetServiceUrl("ListEventReceiver.svc");
                    // Add RER for Item Added.
                    if (!IsRemoteEventRegistered(clientContext, EventReceiverType.ItemAdded))
                    {
                        appCat.EventReceivers.Add(new EventReceiverDefinitionCreationInformation
                        {
                            EventType = EventReceiverType.ItemAdded,
                            ReceiverName = RERNAME,
                            ReceiverUrl = remoteUrl,
                            SequenceNumber = 10000,
                            Synchronization = EventReceiverSynchronization.Synchronous
                        });
                        clientContext.ExecuteQuery();
                    }
                    // Add RER for Item Updated
                    if (IsRemoteEventRegistered(clientContext, EventReceiverType.ItemUpdated)) return;
                    appCat.EventReceivers.Add(new EventReceiverDefinitionCreationInformation
                    {
                        EventType = EventReceiverType.ItemUpdated,
                        ReceiverName = RERNAME,
                        ReceiverUrl = remoteUrl,
                        SequenceNumber = 10001
                    });
                    clientContext.ExecuteQuery();
                });
            }
            catch (Exception ex)
            {
                Debug.WriteLine(ex.ToString());
                Logger.Logger.LogError(ex.ToString());
            }
        }

        public static bool IsRemoteEventRegistered(ClientContext clientContext, EventReceiverType type)
        {
            var result = false;
            if (null == clientContext) throw new ArgumentNullException("clientContext");
            try
            {
                // Get the list
                Logger.Logger.LogInfo("Checking if remote events registered", () =>
                {
                    var srcList = clientContext.Web.Lists.GetByTitle(LISTNAME);
                    clientContext.Load(clientContext.Web);
                    clientContext.ExecuteQuery();
                    // Iterate all event receivers.
                    clientContext.Load(srcList.EventReceivers);
                    clientContext.ExecuteQuery();
                    foreach (var er in srcList.EventReceivers)
                        if (0 == string.Compare(er.ReceiverName, RERNAME, true, CultureInfo.CurrentCulture) && er.EventType == type)
                        {
                            result = true;
                            break;
                        }
                });
                return result;
            }
            catch (Exception ex)
            {
                Debug.WriteLine(ex.ToString());
                Logger.Logger.LogError(ex.ToString());
            }
            return false;
        }

        public static void UnregisterRemoteEvents(ClientContext clientContext)
        {
            if (null == clientContext) throw new ArgumentNullException("clientContext");
            try
            {
                Logger.Logger.LogInfo("Unregistering remote events", () =>
                {
                    // Get the App Catalog.
                    var appCat = clientContext.Web.Lists.GetByTitle(LISTNAME);
                    clientContext.Load(clientContext.Web);
                    clientContext.ExecuteQuery();
                    // Remove all event receivers.
                    clientContext.Load(appCat.EventReceivers);
                    clientContext.ExecuteQuery();
                    var toDelete = new List<EventReceiverDefinition>();
                    // ReSharper disable once LoopCanBeConvertedToQuery
                    foreach (var er in appCat.EventReceivers)
                    {
                        if (er.ReceiverName == RERNAME) toDelete.Add(er);
                    }
                    foreach (var er in toDelete)
                    {
                        er.DeleteObject();
                        clientContext.ExecuteQuery();
                    }
                });
            }
            catch (Exception ex)
            {
                Debug.WriteLine(ex.ToString());
                Logger.Logger.LogError(ex.ToString());
            }
        }

        internal static void ProcessAppList(ClientContext clientContext)
        {
            if (null == clientContext) throw new ArgumentNullException("clientContext");
            try
            {
                Logger.Logger.LogInfo("Processing app catalog", () =>
                {
                    // Get the App Catalog and App List Item.
                    var appCat = clientContext.Web.Lists.GetByTitle(LISTNAME);
                    clientContext.Load(clientContext.Web);
                    clientContext.Load(appCat);
                    var query = CamlQuery.CreateAllItemsQuery();
                    var items = appCat.GetItems(query);
                    clientContext.Load(items);
                    clientContext.ExecuteQuery();
                    foreach (var item in items)
                        ProcessAppListItem(clientContext, item);
                });
            }
            catch (Exception ex)
            {
                Debug.WriteLine(ex.ToString());
                Logger.Logger.LogError(ex.ToString());
            }
        }

        internal static void ProcessAppListItem(ClientContext clientContext, int itemID)
        {
            if (null == clientContext) throw new ArgumentNullException("clientContext");
            if (itemID <= 0) throw new ArgumentOutOfRangeException("itemID");
            try
            {
                // Get the App Catalog and App List Item.
                var appCat = clientContext.Web.Lists.GetByTitle(LISTNAME);
                clientContext.Load(clientContext.Web);
                clientContext.Load(appCat);
                var item = appCat.GetItemById(itemID);
                clientContext.Load(item);
                clientContext.ExecuteQuery();
                ProcessAppListItem(clientContext, item);
            }
            catch (Exception ex)
            {
                Debug.WriteLine(ex.ToString());
                Logger.Logger.LogError(ex.ToString());
            }
        }

        internal static void ProcessAppListItem(ClientContext clientContext, ListItem item)
        {
            if (null == clientContext) throw new ArgumentNullException("clientContext");
            if (null == item) throw new ArgumentNullException("item");
            try
            {
                Logger.Logger.LogInfo("Processing list item with ID {0}", () => {
                    clientContext.Load(item.File);
                    var stream = item.File.OpenBinaryStream();
                    clientContext.ExecuteQuery();
                    var fileInfo = new FileSaveBinaryInformation();
                    fileInfo.ContentStream = new System.IO.MemoryStream();
                    // Load the app manifest file.
                    ProcessManifest(stream.Value, fileInfo.ContentStream, (manifest, ns) => {
                        // Load the properties.
                        var propNode = manifest.SelectSingleNode("x:App/x:Properties", ns);
                        // Look for the endpoints.
                        var installedNode = propNode.SelectSingleNode("x:InstalledEventEndpoint", ns);
                        var upgradedNode = propNode.SelectSingleNode("x:UpgradedEventEndpoint", ns);
                        var uninstalledNode = propNode.SelectSingleNode("x:UninstallingEventEndpoint", ns);
                        if (null == installedNode)
                        {
                            installedNode = manifest.CreateElement("InstalledEventEndpoint", manifest.DocumentElement.NamespaceURI);
                            propNode.AppendChild(installedNode);
                        }
                        if (null == upgradedNode)
                        {
                            upgradedNode = manifest.CreateElement("UpgradedEventEndpoint", manifest.DocumentElement.NamespaceURI);
                            propNode.AppendChild(upgradedNode);
                        }
                        if (null == uninstalledNode)
                        {
                            uninstalledNode = manifest.CreateElement("UninstallingEventEndpoint", manifest.DocumentElement.NamespaceURI);
                            propNode.AppendChild(uninstalledNode);
                        }
                        // NOTE: We're replacing the app installing and upgrading events so we can manage app lifecycle.
                        // If the deployed add-in originally used these events, we've overridden them.
                        installedNode.InnerText = GetServiceUrl("AppMgmtReceiver.svc");
                        upgradedNode.InnerText = GetServiceUrl("AppMgmtReceiver.svc");
                        uninstalledNode.InnerText = GetServiceUrl("AppMgmtReceiver.svc");
                    });
                    // Save the manifest back to SharePoint.
                    fileInfo.ContentStream.Seek(0, System.IO.SeekOrigin.Begin);
                    item.File.SaveBinary(fileInfo);
                    clientContext.Load(item.File);
                    clientContext.ExecuteQuery();
                }, item.Id);
            }
            catch (Exception ex)
            {
                Debug.WriteLine(ex.ToString());
                Logger.Logger.LogError(ex.ToString());
            }
        }

        private static void ProcessManifest(System.IO.Stream inStream, System.IO.Stream outStream, Action<XmlDocument, XmlNamespaceManager> manifestDel)
        {
            if (null == inStream) throw new ArgumentNullException("inStream");
            if (null == outStream) throw new ArgumentNullException("outStream");
            if (null == manifestDel) throw new ArgumentNullException("manifestDel");
            using (var memory = new System.IO.MemoryStream())
            {
                var buffer = new byte[1024 * 64];
                int nread = 0, total = 0;
                while ((nread = inStream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    memory.Write(buffer, 0, nread);
                    total += nread;
                }
                memory.Seek(0, System.IO.SeekOrigin.Begin);
                // Open the app manifest.
                using (var zipArchive = new ZipArchive(memory, ZipArchiveMode.Update, true))
                {
                    var entry = zipArchive.GetEntry("AppManifest.xml");
                    if (null == entry) throw new Exception("Could not find AppManifest.xml in the app archive");
                    var manifest = new XmlDocument();
                    using (var sr = new System.IO.StreamReader(entry.Open()))
                    {
                        manifest.LoadXml(sr.ReadToEnd());
                        sr.Close();
                    }
                    var ns = new XmlNamespaceManager(manifest.NameTable);
                    ns.AddNamespace("x", "http://schemas.microsoft.com/sharepoint/2012/app/manifest");
                    // Call the delegate.
                    manifestDel(manifest, ns);
                    // Write back to the archive.
                    using (var sw = new System.IO.StreamWriter(entry.Open()))
                    {
                        // Truncate any existing content so a shorter manifest doesn't leave stale bytes behind.
                        sw.BaseStream.SetLength(0);
                        sw.Write(manifest.OuterXml);
                        sw.Close();
                    }
                }
                // Memory stream now contains the updated archive
                memory.Seek(0, System.IO.SeekOrigin.Begin);
                // Write result to output stream.
                buffer = new byte[1024 * 64];
                nread = 0; total = 0;
                while ((nread = memory.Read(buffer, 0, buffer.Length)) > 0)
                {
                    outStream.Write(buffer, 0, nread);
                    total += nread;
                }
            }
        }

        private static string GetServiceUrl(string serviceEndpoint)
        {
            if (string.IsNullOrEmpty(serviceEndpoint)) throw new ArgumentNullException("serviceEndpoint");
            if (null == OperationContext.Current) throw new Exception("Could not get service URL from the operational context.");
            var url = OperationContext.Current.Channel.LocalAddress.Uri.AbsoluteUri;
            var opContext = url.Substring(0, url.LastIndexOf("/", StringComparison.Ordinal));
            return string.Format("{0}/{1}", opContext, serviceEndpoint);
        }
    }
}

This class both registers event receivers on the App Catalog and provides the logic that injects App Installed events into new and existing add-ins. The ProcessAppListItem method is the most interesting (in my opinion). This method downloads the APP file from the catalog into a memory stream, unzips the contents with the .NET compression API, makes the changes to the AppManifest.xml file, and then zips the file and re-uploads the APP to the catalog.

The injected endpoint points at another remote event receiver hosted in the ACU. The governance code should reside somewhere central, and hosting it in the ACU seemed as good a place as any.

In the above code, we can trace the call from the ACU’s ProcessEvent method through to the code that updates the existing add-ins in the catalog. For new add-ins added to the catalog later, we also need a list remote event receiver, which the ACU registers against an endpoint named ListEventReceiver.svc. The following is the code-behind for this service (again hosted in the ACU):

using System;
using System.Diagnostics;
using Microsoft.SharePoint.Client.EventReceivers;
using SPApps.UpdateAppCatWeb;

namespace SPApps.SubSiteCreateWeb.Services
{
    public class ListEventReceiver : IRemoteEventService
    {
        public void ProcessOneWayEvent(SPRemoteEventProperties properties)
        {
            throw new NotImplementedException();
        }

        public SPRemoteEventResult ProcessEvent(SPRemoteEventProperties properties)
        {
            var result = new SPRemoteEventResult();
            Logger.Logger.LogInfo("ProcessEvent called on ListEventReceiver", () =>
            {
                if (null == properties) throw new ArgumentNullException("properties");
                try
                {
                    switch (properties.EventType)
                    {
                        case SPRemoteEventType.ItemAdded:
                            using (var clientContext = TokenHelper.CreateRemoteEventReceiverClientContext(properties))
                                AppHelper.ProcessAppListItem(clientContext, properties.ItemEventProperties.ListItemId);
                            break;
                    }

                }
                catch (Exception ex)
                {
                    Logger.Logger.LogError(ex.ToString());
                    Debug.WriteLine(ex.ToString());
                }
            });
            return result;
        }
    }
}

Going back to the AppHelper, take note of the code that registers the list remote event receiver: I register the Item Added event as synchronous. This is important, because this event fires when uploading an APP to the catalog, and SharePoint provides a dialog box to enter metadata about the add-in as part of the upload process. Leaving the event as asynchronous (the default) causes a conflict error when clicking the save button on that dialog.
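If you want to confirm what actually ended up registered on the catalog, a quick bit of server-side PowerShell does the job. This is only a sketch; substitute your own App Catalog URL:

# Sketch: list the remote event receivers registered on the App Catalog (URL is a placeholder).
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue;
$web = Get-SPWeb http://portal/sites/appcatalog;
$list = $web.Lists["Apps for SharePoint"];
$list.EventReceivers | Select-Object Name, Type, Synchronization, Url | Format-Table -AutoSize;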

Lastly, let’s look at the code that is called from the injected RER endpoint – the code that allows or denies installation of the add-in:

using System;
using Microsoft.SharePoint.Client.EventReceivers;

namespace SPApps.UpdateAppCatWeb.Services
{
    public class AppMgmtReceiver : IRemoteEventService
    {
        /// <summary>
        /// Handles app events that occur after the app is installed or upgraded, or when app is being uninstalled.
        /// </summary>
        /// <param name="properties">Holds information about the app event.</param>
        /// <returns>Holds information returned from the app event.</returns>
        public SPRemoteEventResult ProcessEvent(SPRemoteEventProperties properties)
        {
            var result = new SPRemoteEventResult();
            Logger.Logger.LogInfo("ProcessEvent called for AppMgmtReceiver", () =>
            {
                using (var clientContext = TokenHelper.CreateAppEventClientContext(properties, false))
                {
                    switch (properties.EventType)
                    {
                        case SPRemoteEventType.AppInstalled:
                            result.Status = SPRemoteEventServiceStatus.CancelWithError;
                            result.ErrorMessage = "You are not allowed to install this app!";
                            break;
                        case SPRemoteEventType.AppUpgraded:
                            break;
                    }
                }
            });
            return result;
        }

        /// <summary>
        /// This method is a required placeholder, but is not used by app events.
        /// </summary>
        /// <param name="properties">Unused.</param>
        public void ProcessOneWayEvent(SPRemoteEventProperties properties)
        {
            Logger.Logger.LogInfo("ProcessOneWayEvent called for AppMgmtReceiver", () =>
            {
                using (var clientContext = TokenHelper.CreateAppEventClientContext(properties, false))
                {
                    switch (properties.EventType)
                    {
                        case SPRemoteEventType.AppInstalled:
                            break;
                        case SPRemoteEventType.AppUpgraded:
                            break;
                    }
                }
            });
        }

    }
}

The above code is very simplistic and just denies installation of any add-in. However, it could easily be adapted (and will be, for my customer) to check a central repository before determining whether an add-in may be installed.
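For what it’s worth, that central repository could be as simple as a SharePoint list of approved product IDs on a governance site, which the AppMgmtReceiver could then query via CSOM before allowing the install. The following server-side PowerShell is only a sketch to create and seed such a list; the site URL, list name, and field name are all my own invention (the product ID shown is the HelloWorld add-in from earlier):

# Hypothetical allow-list backing the AppMgmtReceiver check (names and URL are made up).
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue;
$web = Get-SPWeb http://portal/sites/governance;
$listId = $web.Lists.Add("Approved Add-ins", "Add-ins permitted to install", [Microsoft.SharePoint.SPListTemplateType]::GenericList);
$list = $web.Lists[$listId];
$list.Fields.Add("ProductID", [Microsoft.SharePoint.SPFieldType]::Text, $true) | Out-Null;
$item = $list.AddItem();
$item["Title"] = "Hello World";
$item["ProductID"] = "{316a9436-6358-4626-a007-c31fafe306a2}";
$item.Update();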

So, there we have it. With a little bit of trickery in the App Catalog it’s possible to implement rudimentary add-in governance. The complete project is available on GitHub: https://github.com/robgarrett/SharePoint-Apps.

ALL CODE PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND. I MAKE NO WARRANTIES, EXPRESS OR IMPLIED, THAT THEY ARE FREE OF ERROR, OR ARE CONSISTENT WITH ANY PARTICULAR STANDARD OF MERCHANTABILITY, OR THAT IT WILL MEET YOUR REQUIREMENTS FOR ANY PARTICULAR APPLICATION. I DISCLAIM ALL LIABILITY FOR DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES RESULTING FROM YOUR USE OF THE INCLUDED CODE. ALL CODE PROVIDED OR LINKED IS FREE FOR DISTRIBUTION AND IS NOT CONSIDERED PROTECTED INTELLECTUAL PROPERTY OF MINE NOR MICROSOFT CORPORATION.

SharePoint 2016 MinRole Services List

Hopefully, by now we should all know about MinRole functionality in SharePoint 2016. If not, check out Bill Baer’s article here. In setting up my new farm, I was curious as to which services should live on which server to remain in compliance. Of course, the point of MinRole is to save farm architects from worrying too much about this, but I was curious.

The list at the bottom of this post shows all SP2016 services and their associated MinRoles. I cannot take credit for the list; I got help from a post located here, and converted the C# code to PowerShell:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue;

# Build a hashtable of MinRole name -> services that provision in that role.
$farm = Get-SPFarm;
$servicesInRole = @{};
$minRoleValues = [System.Enum]::GetNames([Microsoft.SharePoint.Administration.SPServerRole]);
$minRoleValues | % { $servicesInRole.Add($_, (New-Object System.Collections.ArrayList)); }

$farm.Services | % {
    $service = $_;
    $service.Instances | % {
        $serviceInstance = $_;
        # Check in which MinRoles the service instance can reside.
        $minRoleValues | % {
            if ($serviceInstance.ShouldProvision($_)) {
                [System.Collections.ArrayList]$item = $servicesInRole.Get_Item($_);
                if (!$item.Contains($service.TypeName)) {
                    $item.Add($service.TypeName) | Out-Null;
                }
            }
        }
    }
}
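The script above only populates the $servicesInRole hashtable; to dump it in the MinRole/Service format listed below, something along these lines does the trick (a quick sketch that writes straight to the console):

# Sketch: print each MinRole followed by the services it provisions.
$servicesInRole.Keys | % {
    Write-Host "MinRole: $_";
    Write-Host "";
    $servicesInRole.Get_Item($_) | % { Write-Host "Service: $_"; }
    Write-Host "";
}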

Interestingly, the list includes a role called “SingleServer”, which is not the same as “SingleServerFarm”. I’ve not yet tried adding a new server to my farm with this “SingleServer” role. I imagine it exists for a specific purpose, since it includes the insights service but not much else.

I’ve been reading some comments about MinRole and there appears to be some confusion surrounding its purpose. Essentially, a MinRole is a default configuration for a SharePoint server in the farm, based on the role the server will play in real life. For example, the “WebFrontEnd” role consists of services optimized for front-end content delivery, since users typically hit these servers directly (via a load balancer). The “Application” role consists of services optimized for back-end processing.

Notice that some services exist in multiple MinRole configurations. For example, the “WebFrontEnd” and “Application” roles both contain Business Connectivity Services. It’s feasible that both end users and back-end processes require access to BCS, so the BCS service lives in both role configurations; likewise the Secure Store Service, etc. This might upset some minimalist architects who like to deploy all services to one (or many) application servers and just web application services to WFE servers, but if you think about it, it’s probably better to deploy some services to WFE servers when those services deliver content to end users. If you’re looking for fine-grained control over the deployment location of SharePoint services, use the “Custom” role.

Something I found out after adding a new WFE server to my farm (which already had services configured on my App server): SharePoint automatically started the services included in the “WebFrontEnd” MinRole on the new WFE server. As I should have expected.
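As an aside, if you want to check which MinRole a server holds and which service instances actually came online on it, a couple of farm cmdlets suffice. A quick sketch, run from any farm server:

# Sketch: show each server's MinRole, then the service instances online on this server.
Get-SPServer | Select-Object Address, Role;
Get-SPServiceInstance -Server $env:COMPUTERNAME |
    Where-Object { $_.Status -eq "Online" } |
    Select-Object TypeName, Status;

The full MinRole-to-service mapping produced by the earlier script follows: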

MinRole: WebFrontEnd

Service: Access Services 2010
Service: Microsoft Project Server Events Service
Service: Secure Store Service
Service: Microsoft SharePoint Foundation Web Application
Service: Request Management
Service: SSP Job Control Service
Service: Project Server Application Service
Service: PerformancePoint Service
Service: Visio Graphics Service
Service: Managed Metadata Web Service
Service: Microsoft SharePoint Foundation Administration
Service: Microsoft SharePoint Foundation Database
Service: Portal Service
Service: Microsoft SharePoint Foundation Sandboxed Code Service
Service: Microsoft Project Server Calculation Service
Service: Microsoft SharePoint Foundation Tracing
Service: SharePoint Server Search
Service: Microsoft SharePoint Foundation Timer
Service: App Management Service
Service: Security Token Service
Service: Machine Translation Service
Service: Microsoft Project Server Queuing Service
Service: Microsoft SharePoint Foundation Usage
Service: Microsoft SharePoint Foundation Subscription Settings Service
Service: Claims to Windows Token Service
Service: User Profile Service
Service: Business Data Connectivity Service
Service: Access Services
Service: Microsoft SharePoint Insights
Service: Information Management Policy Configuration Service

MinRole: SingleServerFarm

Service: Access Services 2010
Service: Microsoft Project Server Events Service
Service: Secure Store Service
Service: PowerPoint Conversion Service
Service: Microsoft SharePoint Foundation Web Application
Service: Request Management
Service: SSP Job Control Service
Service: Project Server Application Service
Service: PerformancePoint Service
Service: Visio Graphics Service
Service: Managed Metadata Web Service
Service: Microsoft SharePoint Foundation Administration
Service: Microsoft SharePoint Foundation Database
Service: Portal Service
Service: Microsoft SharePoint Foundation Sandboxed Code Service
Service: Microsoft Project Server Calculation Service
Service: Microsoft SharePoint Foundation Tracing
Service: SharePoint Server Search
Service: Microsoft SharePoint Foundation Timer
Service: App Management Service
Service: Security Token Service
Service: Machine Translation Service
Service: Microsoft Project Server Queuing Service
Service: Application Discovery and Load Balancer Service
Service: Microsoft SharePoint Foundation Usage
Service: Microsoft SharePoint Foundation Subscription Settings Service
Service: Search Administration Web Service
Service: Word Automation Services
Service: Claims to Windows Token Service
Service: User Profile Service
Service: Business Data Connectivity Service
Service: Lotus Notes Connector
Service: Microsoft SharePoint Foundation Workflow Timer Service
Service: Access Services
Service: Microsoft SharePoint Insights
Service: Search Host Controller Service
Service: Information Management Policy Configuration Service
Service: Microsoft SharePoint Foundation Incoming E-Mail
Service: Search Query and Site Settings Service

MinRole: SingleServer

Service: Microsoft SharePoint Foundation Database
Service: Security Token Service
Service: Microsoft SharePoint Insights

MinRole: Invalid

Service: Microsoft SharePoint Foundation Database

MinRole: Search

Service: SSP Job Control Service
Service: Microsoft SharePoint Foundation Administration
Service: Microsoft SharePoint Foundation Database
Service: Portal Service
Service: Microsoft SharePoint Foundation Tracing
Service: SharePoint Server Search
Service: Microsoft SharePoint Foundation Timer
Service: Security Token Service
Service: Application Discovery and Load Balancer Service
Service: Microsoft SharePoint Foundation Usage
Service: Search Administration Web Service
Service: Claims to Windows Token Service
Service: Microsoft SharePoint Insights
Service: Search Host Controller Service
Service: Search Query and Site Settings Service

MinRole: Application

Service: Microsoft Project Server Events Service
Service: Secure Store Service
Service: PowerPoint Conversion Service
Service: Microsoft SharePoint Foundation Web Application
Service: Request Management
Service: SSP Job Control Service
Service: Project Server Application Service
Service: Managed Metadata Web Service
Service: Microsoft SharePoint Foundation Administration
Service: Microsoft SharePoint Foundation Database
Service: Portal Service
Service: Microsoft Project Server Calculation Service
Service: Microsoft SharePoint Foundation Tracing
Service: Microsoft SharePoint Foundation Timer
Service: App Management Service
Service: Security Token Service
Service: Machine Translation Service
Service: Microsoft Project Server Queuing Service
Service: Application Discovery and Load Balancer Service
Service: Microsoft SharePoint Foundation Usage
Service: Microsoft SharePoint Foundation Subscription Settings Service
Service: Word Automation Services
Service: Claims to Windows Token Service
Service: User Profile Service
Service: Business Data Connectivity Service
Service: Microsoft SharePoint Foundation Workflow Timer Service
Service: Microsoft SharePoint Insights
Service: Information Management Policy Configuration Service
Service: Microsoft SharePoint Foundation Incoming E-Mail

MinRole: DistributedCache

Service: Microsoft SharePoint Foundation Web Application
Service: Request Management
Service: SSP Job Control Service
Service: Microsoft SharePoint Foundation Administration
Service: Microsoft SharePoint Foundation Database
Service: Portal Service
Service: Microsoft SharePoint Foundation Tracing
Service: Microsoft SharePoint Foundation Timer
Service: Security Token Service
Service: Microsoft SharePoint Foundation Usage
Service: Claims to Windows Token Service
Service: Microsoft SharePoint Insights

MinRole: Custom

Service: Microsoft SharePoint Foundation Web Application
Service: SSP Job Control Service
Service: Microsoft SharePoint Foundation Administration
Service: Microsoft SharePoint Foundation Database
Service: Portal Service
Service: Microsoft SharePoint Foundation Tracing
Service: Microsoft SharePoint Foundation Timer
Service: Security Token Service
Service: Microsoft SharePoint Foundation Usage
Service: Claims to Windows Token Service
Service: Microsoft SharePoint Insights


Apple iPhone Upgrade Plan

For those of you who watched the Apple “Hey Siri” event keynote, or have been following along on the Internet, today Apple announced its iPhone Upgrade Plan.

Apple’s new upgrade plan is similar to what some of the wireless network providers already offer, and aims to let consumers take advantage of a new iPhone model every year without having to wait for contract dates to expire. Just like the network provider offerings, it seems great on the surface: you pay a small amount each month and get the latest technology in return. When the next-gen iPhone comes out, you hand in your current model and get a new one. This is very much like leasing a car; you pay for what you use. But how does the total cost over 12 months stack up against just buying the phone outright?

Below is a table I put together in Excel that includes pricing for the iPhone 5s, 6, 6+, 6s, and 6s+, per the numbers published by Apple today and from AT&T. I didn’t include pricing from Verizon, Sprint, and T-Mobile because their plans are typically comparable (sort of). Besides, the point of this blog post is to compare the cost of buying a new iPhone outright every 12 months versus buying into a leasing plan from Apple.


[Image: Excel pricing comparison table for the iPhone 5s, 6, 6+, 6s, and 6s+ – buy outright vs. Apple Upgrade Plan vs. AT&T Next]

Looking at the table, you can see that I priced out the various iPhone 64GB models, with the exception of the 5s, which only comes with a maximum of 32GB. The calculations in this table hinge on the typical resale value of used iPhone devices. I based my resale numbers on the average price that an iPhone 5s, 6, and 6+ sell for today, and then projected the price 12 months from now. So, the resale of an iPhone 6s Plus in 12 months should be about the same as a used iPhone 6 Plus today.

Let’s assume you decide to buy a new iPhone 6s when it comes out next week. If you buy the phone outright, you can expect to part with $749 + $99. I’ve included Apple Care because the new Apple Upgrade Plan includes it. After 12 months, when we assume Apple will have the next model version, you’ll still own your 6s outright and can probably sell it for $500. So, you’ll be in the hole by $348, which is about right for a year of wear and tear on a smartphone. Compare this with the Apple Upgrade Plan, which costs $32.45 per month (numbers from the Apple web site).

When Apple introduced the plan in early September, the first thought was that this was a lease plan, one where you hand back the device after you’re done using it. However, the plan appears to be an interest-free loan (according to @maccast). The monthly payment equates to the cost of the phone, plus Apple Care, spread over 24 months of payments. This being the case, the numbers in the above table are subjective. If you plan to trade your iPhone for the next model, Apple will give you a trade-in value based on wear and tear, as well as the value from the Apple recycle trade-in program. This assumes your phone is free from damage (cracks), powers on, and retains a charge. Anything less than good condition and you’ll likely get less for trade-in. Should you decide to keep the phone, Apple will continue to charge you the monthly fee until the balance is paid off, or you decide to pay the remaining balance in full (or so I think). What I’m unclear on is whether you can hand back your iPhone after 12 months, having made 12 monthly payments, and then walk away free and clear: no more payments, no more device. If so, this is a leased phone in the same sense as a leased vehicle: hand it back and walk away, pay off the remaining balance, or trade it toward the next model.

Next, let’s look at the 6s Plus. What’s interesting here is that the loss for buying outright is the same as for the 6s. Although the 6s Plus sells for $100 more than the 6s, you can expect to make an extra $100 come time to sell it. However, look at the amount you’ll pay on the upgrade plan – $449.40 – and the difference is now closer to $100.

If we’re talking about upgrade plans, we shouldn’t ignore AT&T’s Next plan. Again, I’m pretty sure Sprint and Verizon offer something similar, but for comparison’s sake… With AT&T’s Next program you end up buying the phone outright, with the payments spread over a 24- or 30-month term, depending on the plan. Assuming the same resale value as before, you’re at the same loss, only you kept most of the money in your bank for 12 months, earning some, if small, amount of interest.

Finally, there’s the 2-year contract pricing. It appears that the network providers are moving away from these plans; besides, they’re too restrictive, not allowing a change to another provider mid-contract, and phone upgrades are limited, so you’d be insane to sign up for one of these plans for another 2 years. The costs aren’t great either. It used to be the case that the network providers subsidized the cost of the phone if you agreed to a fixed 2-year contract, but as phones got more expensive (we’re not talking a $99 flip phone anymore) providers realized they were losing money, especially since plans have become more affordable. I found out the hard way when I purchased my iPhone 6 last year for $299 that AT&T increased my monthly plan charge by $25. Over the life of the plan, that equates to $600 on top of the $299 I’d already paid. That’s more than the cost of the phone at retail.

To conclude: the Apple Upgrade Plan isn’t too bad on cost. If you like to get the latest iPhone each year and want the peace of mind of Apple Care, this could work for you. However, if you look after your phone, prefer to take your chances without Apple Care, and plan on selling your phone in 12 months, you could save yourself some $$$. On the other hand, if you’re fine keeping an iPhone one or two generations behind the current model, it makes a whole lot more sense to buy outright at the start or amortize the cost over 24 months.

Apple Music, iTunes Match, iCloud Music, yada, yada, yada

Apple’s recent roll out of Apple Music has generated a lot of confusion among consumers. I’ve lost count of the number of blog posts I’ve read that attempt to explain the nuances between Apple Music, iTunes Match, and iCloud Music Library. The following article is a good read…

http://www.imore.com/itunes-match

So, why am I adding to the list of blog posts on this subject? More for my own sanity, but also to provide my own perception of these Apple services and what they mean to consumers.

Apple Music

If you’ve used Spotify, Beats Music, Xbox Music, or any one of the myriad streaming music services, Apple Music should come as no surprise. AM is a streaming music service that allows consumers to stream any music available in Apple’s iTunes music store to Apple devices. Apple will soon offer the service to Android consumers.

Similar to its competitors, AM is available to consumers for a monthly fee: $9.99 (in the US; other countries have different prices) for an individual account and $14.99 for a family plan.

The idea of AM is that you can listen to music anywhere you have Internet access, or download music for offline listening, and create playlists in the iOS Music app and within iTunes on the Mac. Siri understands requests to play a particular genre, artist, track, album, or year of music, which my children love in the car. What makes AM appealing (to me at least) is the ability to listen to AM songs alongside my purchased songs in the same playlist on all my Apple devices. AM on its own makes perfect sense, but it’s the existence of other Apple music services that’s causing some confusion. Read on…

iTunes Match

iTunes Match was revolutionary when Apple first introduced it. Previously, service providers like Google had the ability to upload your music to cloud servers to allow streaming on the go. The majority of us settled for carrying around iPods with large storage, or a subset of our music library on whatever storage we had available on a portable device. I remember the painful experience of keeping copies of my music library on multiple Apple devices and Windows PCs so I could listen to the same music in the office, at home, and on the bus. Google Music required that I use their HTML5 player, and I didn’t like that. iTunes Match changed everything for me.

iTunes Match is a service costing $25 per year that allows iTunes to scan your media library (on a Mac or PC) and match songs found in the iTunes Music Store. Matched songs are then available to play on any iOS device and within iTunes on the Mac and PC, as long as you have an Internet connection. Even though an original song exists only on your Mac at home, you can play the same song on your office Windows PC (using iTunes) or on your iPhone via the Music app. What about those eclectic songs you own that do not reside in the iTunes Music Store? Simple: iTunes uploads them to a private space in Apple’s cloud so you can download and play them on other devices.

iTunes Match differs from Apple Music in many ways, but predominantly:

  • iTunes Match only matches music you already own, whereas AM gives you access to all music in the iTunes Music Store.
  • iTunes and the iOS Music app download matched songs in full before allowing you to play them (at least, that was the way it was before iTunes 12.2 and iOS 8.4).
  • Apple Music streams songs in the same way that Pandora and Spotify do.
  • Apple Music tracks are DRM-encoded; iTunes-matched songs are not.

Now that Apple Music is here, do I need iTunes Match? This is a question asked by many, and the answer isn’t simply yes or no. It really depends on whether you intend to own your own music. If you’re paying for AM each month and do not plan on cancelling the service any time soon, there is no good reason to pay the yearly iTunes Match fee in addition. As long as you keep up with your AM subscription, all songs in your music library will remain, as long as they’re available in the iTunes Music Store. I cannot say for certain, but I have to believe that when my iTunes Match subscription ends, all those previously “matched” songs will either remain as such or convert to “Apple Music” songs. We’ll find out soon, as AM subscriptions gain longevity and iTunes Match subscriptions lapse.

BTW, it’s worth mentioning that signing up for Apple Music will not cancel your iTunes Match subscription. I had to cancel mine manually via my account page – see instructions here.

If you’re an iTunes Match subscriber, have decided to take advantage of Apple Music’s 3-month free trial, and are not sure you plan on subscribing to AM full time, I recommend you do not let your iTunes Match subscription lapse. Assuming you have your original non-DRM files downloaded somewhere in iTunes, you can always go back to the yearly $25 model and continue to match the songs you own. The AM songs you don’t own will stop working because of Apple Music DRM. As long as your Match subscription is active, you should be able to continue listening to the music you do own on all Apple devices. On the other hand, cancelling both AM and iTunes Match means you’ll lose all cloud music access and can only play songs you have stored locally and DRM-free.

Now, if you’re diligent (read: anal), like me, you’ll most likely have a tidy backup of all your original ripped music (from CDs you own, right?). In the event that both your Apple Music and iTunes Match subscriptions lapse, you should be able to go back to the originals.

Some took the brave step of deleting their originals after signing up with iTunes Match. Others haven’t paid much attention, and their library consists of both locally downloaded songs and cloud-only matched songs (especially if they haven’t played them recently). Those with multiple devices may have local music on one device and not another – it’s hard to tell. My recommendation is to back up any and all locally downloaded songs, via iTunes, while you’re still subscribed to iTunes Match. This way you’ll at least have the music you own DRM-free somewhere. The AM tracks you never purchased will disappear (and any local DRM copies of them won’t play).

iCloud Music Library

I left the best to last… if you’ve signed up to use Apple Music, iTunes probably gave you (or should have given you) the option to switch over to iCloud Music Library. Here is another helpful link. What’s this, a third service from Apple for music? Sort of…

The best way to get this mishmash of Apple music services straight in your head is to consider Apple Music and iTunes Match as “services” and iCloud Music Library as a freebie add-on for AM subscribers. After all, AM and iTunes Match are paid subscription services in their own right, which you can opt in to or out of. iCloud Music Library is an extra feature available to those signed up with Apple Music.

iCloud ML is exactly what the name says it is: your iTunes music library stored in the cloud. Long-time users of iTunes Match are probably screaming at this blog post, saying that’s what they’ve been using all along, and they’re partially right. iCloud ML aims to replicate your iTunes music library across all Apple devices and include matched, non-matched, and Apple Music songs in all playlists. iTunes Match would not sync playlists that contained non-matched and non-uploaded songs. Personally, I think Apple took this feature from iTunes Match and made it available to AM subscribers so they could cancel iTunes Match without losing non-matched local music in the cloud.

Unfortunately, iCloud ML has gotten bad press since the roll out of Apple Music. If you look at the slew of complaints since the roll out of iOS 8.4 and iTunes 12.2, most are about iCloud ML and not the actual AM service. From what I can tell, people migrating from other music streaming services to AM continued their lives without issue (except for recreating their favorite playlists in the new AM service). However, those who manage their own matched music libraries in iTunes were very upset when iCloud ML started monkeying with their music libraries. There were lots of complaints of missing songs, missing artwork, incorrect song metadata, changes to playlists not replicating to all devices… the list goes on. Apple recently pushed an update to iTunes – 12.2.1 – to address an issue where version 12.2 classified matched songs as DRM Apple Music songs.

To clarify: you do not need to switch over to iCloud Music Library if you’re an Apple Music subscriber. In fact, if you’re untrusting of Apple’s recent roll out, then I’d recommend not opting into iCloud ML. In this case, you’ll be able to listen to AM songs on all your devices and see AM playlists, but your local music will remain local. I still have time left on my iTunes Match subscription, so I cannot determine whether opting out of iCloud ML will eradicate my “matched” tracks if I have AM turned on without an active iTunes Match subscription.

I took the plunge with iCloud ML and made sure I had a backup of my original MP3 and AAC files. I came from an iTunes Match subscription, whose automated billing I cancelled shortly after taking the plunge with AM. I’m curious to see what will happen to my “matched” songs once my iTunes Match subscription lapses – hopefully they’ll stay DRM-free, but I’m not too bothered, knowing I have my originals and plan on staying with AM for the immediate future.

Something I found out of late, and I’m not sure if Apple is addressing it, is that iCloud ML and Apple Music appear to impose request throttling. In non-techie terms: iTunes and iOS can make only a finite number of calls to the AM and iCloud ML servers within a period of time (I’m not sure how many requests, or the window of time). This, like most web services, prevents denial-of-service attacks by malicious applications flooding a service with too many requests. The upshot is that I hit the throttle limit easily when making mass changes to my iTunes music library with iCloud ML enabled. I spent an hour “loving” tracks in my library so that AM would produce better curated playlists and recommendations, after which I’d lose connectivity to AM and iCloud ML. It was quite frustrating.

Summary

To summarize… Apple Music and iTunes Match are two different Apple cloud services. You do not necessarily need iTunes Match if you’re an AM subscriber, but you might want to go back to Match if you cancel your AM subscription – in which case, keep backups of your original and matched DRM-free downloads.

iCloud Music Library is a bit of a cluster-**** at the moment, and it appears that Apple is making strides to fix it. iCloud ML works for me (after hours of tinkering), but if you’re proud of the many hours invested in your iTunes Music Library, you may not want to let iCloud ML run rampant over it just yet.

SharePoint PowerShell Scripts

It’s time to give my blog a fresh injection of content….

I’ve spent the last several months doing a lot of PowerShell scripting for SharePoint (2010, 2013, and SharePoint Online). I figured it was about time that I put sanitized copies of my scripts up on my blog site for all to read.

I’ve added a new section on my blog, aptly named “Scripts”, which you can access via the top level navigation of this site. From there, I present an ongoing set of links to each script I develop and publish. At this time, the following is a list of the client-side scripts I’ve uploaded…

To access my SharePoint 2013 farm provisioning scripts, see here on GitHub.

Bulk Check In for SP2010 Files with No Version Info

SharePoint best practice is to disable “require check out” on document libraries before doing a large bulk import of documents. I received an email last week from a customer who had not followed this best practice and had over 35,000 documents checked out, which no one but him could see.

Unlike checked-out documents with previous checked-in version(s), a newly uploaded document is not visible to anyone but the person who uploaded the file, not even administrators. Fortunately, SharePoint provides a way for an admin to take ownership of these documents via “Manage checked out documents” in the library settings. However, when dealing with a document count that exceeds the default threshold of 10,000, SharePoint returns an error. Temporarily increasing the threshold gets around the error, but then the user interface becomes intolerably slow.
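
If you do need to bump the threshold temporarily, the list view threshold is exposed on the web application object in PowerShell – a quick sketch, using a hypothetical web application URL and an illustrative limit:

$webApp = Get-SPWebApplication 'http://mysitecollection';
$originalThreshold = $webApp.MaxItemsPerThrottledOperation;    # remember the current threshold
$webApp.MaxItemsPerThrottledOperation = 50000;                 # raise it temporarily (value is illustrative)
$webApp.Update();

# ...take ownership of and check in the documents (see the script below)...

$webApp.MaxItemsPerThrottledOperation = $originalThreshold;    # restore the original threshold
$webApp.Update();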

Even after taking ownership, there’s still the task of bulk check-in, which is again a slow process via the UI for document libraries with large item counts. What I wanted was a PowerShell script to both take ownership of the documents and then check them in. Below is the server-side script I created….

Note: I had to use server-side rather than client-side PowerShell because CSOM does not expose the checked-out files collection. The script was tested on SharePoint 2010.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue;

function BulkCheckIn {
    param([Microsoft.SharePoint.SPFolder]$folder);
    # Check in every file in this folder that is still checked out
    $folder.Files | ? { $_.CheckOutStatus -ine "None" } | % {
        Write-Host $_.ServerRelativeUrl;
        $_.CheckIn("Initial check in", [Microsoft.SharePoint.SPCheckinType]::MajorCheckIn);
    }
    # Recurse into subfolders
    $folder.SubFolders | % { BulkCheckIn -folder $_; }
}

$site = Get-SPSite http://mysitecollection/;
$web = $site.OpenWeb('subsite/subsite');
$lib = $web.Lists['Documents'];
# Take ownership of files checked out with no checked-in version
$lib.CheckedOutFiles | % {
    Write-Host "Processing $($_.Url)";
    $_.TakeOverCheckOut();
}
# Check in everything, starting from the library root folder
BulkCheckIn -folder $lib.RootFolder;

SharePoint 2013 List Item Save Timeout

I received a report from a colleague today that he was getting timeout errors after clicking the save button on a list item edit form. Initial testing of web front-end performance showed no issues, and the ULS logs only reported a timeout during the save.

Taking a deeper look, we established that the list in question had a custom 2013 workflow attached – aha! At the same time, OOTB publishing workflows were taking longer than usual to complete.

Next, we checked the Workflow Manager log in the Event Viewer on each of the machines in our Workflow Manager farm. Lo and behold, we found a critical error connecting to the Service Bus on one of the servers. It turned out that all three Service Bus services were stopped. Checking the other two servers in the Workflow Manager quorum, they too showed stopped Service Bus services.

My guess is that IT had rolled out a patch for Service Bus and not checked that the services restarted on each affected server. Microsoft recently released a patch for Service Bus that may or may not require a server reboot, which could account for the services having stopped and never restarted (they should come back automatically after a reboot).
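
If you hit the same symptoms, the Service Bus PowerShell module offers a quicker health check than trawling the Event Viewer – a short sketch, run from one of the Workflow Manager/Service Bus hosts (assumes the Service Bus cmdlets are installed there):

# Show the status of every Service Bus host and service in the farm
Get-SBFarmStatus;

# If the services on this host show as stopped, start them
Start-SBHost;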

So, there you have it – if you come across these symptoms, check your Workflow Manager and Service Bus services.

Install Workflow Manager with PowerShell

By default, a SharePoint 2013 on-premises installation includes only the legacy SharePoint 2010 workflow engine. To take advantage of Workflow 2013 you must install Workflow Manager. You can achieve this by installing Workflow Manager with the Web Platform Installer and then configuring it with the Workflow Manager Configuration Wizard. I recently cobbled together my own SharePoint 2013 configuration scripts to set up SharePoint 2013 on-prem – soup to nuts – with PowerShell. In keeping with the same theme, this blog post details the script required to install and configure Workflow Manager.

First things first, Workflow Manager is a separate installation from SharePoint, available on the web. I used the latest version of the Web Platform Installer to pull down the files. I had to install WPI first.

With WPI installed, open an elevated console and run the following command:

webpicmd /offline /Products:WorkflowManagerRefresh /Path:c:\WorkflowManagerFiles

We now have the Workflow Manager installation files in c:\WorkflowManagerFiles. Installing WFM is a simple case of running the following command:

WebpiCmd.exe /Install /Products:WorkflowManagerRefresh /XML:c:/WorkflowManagerFiles/feeds/latest/webproductlist.xml

Note: Make sure you install the ‘Refresh’ of WFM if installing on Windows Server 2012 R2. Installing the original 1.0 version causes issues when registering WFM with SharePoint 2013. I strongly recommend patching SharePoint 2013 to SP1.

After completing the installation, Workflow Manager launches the configuration wizard. You can use the wizard to configure WFM if you like and need do no more, but if you want to see the juicy script that the wizard generates, then read on.

The following is my complete script to configure a WFM farm – it’s a simple configuration. For those interested, Spencer Harbar has a series of good posts on configuring load balanced WFM: http://www.harbar.net/articles/wfm2.aspx.

function WFM-Configure {
    # Create new SB Farm
    $SBCertificateAutoGenerationKey = ConvertTo-SecureString -AsPlainText  -Force  -String $passphrase;
    $WFCertAutoGenerationKey = ConvertTo-SecureString -AsPlainText  -Force  -String $passphrase;
    $managementCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFMSB_Management;Integrated Security=True;Encrypt=False';
    $gatewayCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFMSB_Gateway;Integrated Security=True;Encrypt=False';
    $messageContCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFMSB_MessageContainer;Integrated Security=True;Encrypt=False';
    Write-Host -ForegroundColor White ' - Creating new Service Bus farm...' -NoNewline;
    try {
        $sbFarm = Get-SBFarm -SBFarmDBConnectionString $managementCS;
        Write-Host -ForegroundColor White 'Already Exists';
    }
    catch {
        New-SBFarm -SBFarmDBConnectionString $managementCS -InternalPortRangeStart 9000 -TcpPort 9354 -MessageBrokerPort 9356 -RunAsAccount $spServiceAcctName `
            -AdminGroup 'BUILTIN\Administrators' -GatewayDBConnectionString $gatewayCS -CertificateAutoGenerationKey $SBCertificateAutoGenerationKey `
            -MessageContainerDBConnectionString $messageContCS;
        Write-Host -ForegroundColor White 'Done';
    }
    # Create new WF Farm
    Write-Host -ForegroundColor white ' - Creating new Workflow Farm...' -NoNewline;
    $wfManagementCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFM_Management;Integrated Security=True;Encrypt=False';
    $wfInstanceCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFM_InstanceManagement;Integrated Security=True;Encrypt=False';
    $wfResourceCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFM_ResourceManagement;Integrated Security=True;Encrypt=False';
    try {
        $wfFarm = Get-WFFarm -WFFarmDBConnectionString $wfManagementCS;
        Write-Host -ForegroundColor White 'Already Exists';
    }
    catch {
        New-WFFarm -WFFarmDBConnectionString $wfManagementCS -RunAsAccount $spServiceAcctName -AdminGroup 'BUILTIN\Administrators' -HttpsPort 12290 -HttpPort 12291 `
            -InstanceDBConnectionString $wfInstanceCS -ResourceDBConnectionString $wfResourceCS -CertificateAutoGenerationKey $WFCertAutoGenerationKey;
        Write-Host -ForegroundColor white 'Done';
    }
    # Add SB Host
    Write-Host -ForegroundColor white ' - Adding Service Bus host...' -NoNewline;
    try {
        $SBRunAsPassword = ConvertTo-SecureString -AsPlainText  -Force  -String $spServiceAcctPwd;
        Add-SBHost -SBFarmDBConnectionString $managementCS -RunAsPassword $SBRunAsPassword `
            -EnableFirewallRules $true -CertificateAutoGenerationKey $SBCertificateAutoGenerationKey;
        Write-Host -ForegroundColor white 'Done';
    } 
    catch {
        Write-Host -ForegroundColor white 'Already Exists';
    }
    Write-Host -ForegroundColor white ' - Creating Workflow Default Namespace...' -NoNewline;
    $sbNamespace = $dbPrefix + '-WorkflowNamespace';
    try {
        $defaultNS = Get-SBNamespace -Name $sbNamespace -ErrorAction SilentlyContinue;
        Write-Host -ForegroundColor white 'Already Exists';
    }
    catch {
        try {
            # Create new SB Namespace
            $currentUser = $env:userdomain + '\' + $env:username;
            New-SBNamespace -Name $sbNamespace -AddressingScheme 'Path' -ManageUsers $spServiceAcctName,$spAdminAcctName,$currentUser;
            Start-Sleep -s 90
            Write-Host -ForegroundColor white 'Done';
        }
        catch [system.InvalidOperationException] {
            throw;
        }
    }
    # Get SB Client Configuration
    $SBClientConfiguration = Get-SBClientConfiguration -Namespaces $sbNamespace;
    # Add WF Host
    try {
        $WFRunAsPassword = ConvertTo-SecureString -AsPlainText  -Force  -String $spServiceAcctPwd;
        Write-Host -ForegroundColor White ' - Adding Workflow Host...' -NoNewline;
        Add-WFHost -WFFarmDBConnectionString $wfManagementCS `
        -RunAsPassword $WFRunAsPassword -EnableFirewallRules $true -EnableHttpPort `
        -SBClientConfiguration $SBClientConfiguration -CertificateAutoGenerationKey $WFCertAutoGenerationKey;
        Write-Host -ForegroundColor White 'Done';
    }
    catch {
        Write-Host -ForegroundColor white 'Already Exists';
    }
}

Let’s break the script down a little. I should mention that cutting and pasting my script into your PowerShell window won’t work as-is, because the script assumes the existence of a few predefined variables, as follows (a sketch with example values follows the list):

  • $dbServer – SQL Alias for my SQL server (best to use an alias not the server name)
  • $dbPrefix – Prefix for all my database names
  • $spServiceAcctName – Name of my service account (used for workflow manager)
  • $spServiceAcctPwd – Password for my service account
  • $spAdminAcctName – Admin account for SharePoint
  • $passphrase – Passphrase used by WFM when joining new hosts to the farm
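
To make this concrete, here’s a sketch of what those variable definitions might look like – the values are purely illustrative (hypothetical SQL alias, database prefix, and account names), so substitute your own:

$dbServer          = 'SPSQL';                  # SQL alias for the SQL server
$dbPrefix          = 'CONTOSO';                # Prefix for the WFM and Service Bus database names
$spServiceAcctName = 'CONTOSO\svc-workflow';   # Run-as account for Workflow Manager and Service Bus
$spServiceAcctPwd  = 'P@ssw0rd!';              # Password for the run-as account
$spAdminAcctName   = 'CONTOSO\spadmin';        # SharePoint admin account
$passphrase        = 'MyWfmFarmPassphrase';    # Passphrase used when joining new hosts to the farm

WFM-Configure;                                 # ...then call the function above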

WFM consists of two parts – the Service Bus and the Workflow Engine. To start with, my script creates a new Service Bus Farm…

    $SBCertificateAutoGenerationKey = ConvertTo-SecureString -AsPlainText  -Force  -String $passphrase;
    $WFCertAutoGenerationKey = ConvertTo-SecureString -AsPlainText  -Force  -String $passphrase;
    $managementCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFMSB_Management;Integrated Security=True;Encrypt=False';
    $gatewayCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFMSB_Gateway;Integrated Security=True;Encrypt=False';
    $messageContCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFMSB_MessageContainer;Integrated Security=True;Encrypt=False';
    Write-Host -ForegroundColor White ' - Creating new Service Bus farm...' -NoNewline;
    try {
        $sbFarm = Get-SBFarm -SBFarmDBConnectionString $managementCS;
        Write-Host -ForegroundColor White 'Already Exists';
    }
    catch {
        New-SBFarm -SBFarmDBConnectionString $managementCS -InternalPortRangeStart 9000 -TcpPort 9354 -MessageBrokerPort 9356 -RunAsAccount $spServiceAcctName `
            -AdminGroup 'BUILTIN\Administrators' -GatewayDBConnectionString $gatewayCS -CertificateAutoGenerationKey $SBCertificateAutoGenerationKey `
            -MessageContainerDBConnectionString $messageContCS;
        Write-Host -ForegroundColor White 'Done';
    }

Notice that my script does some checking and error handling so that I can run it multiple times without errors caused by existing configuration. OK, now to the Workflow Farm…

    Write-Host -ForegroundColor white ' - Creating new Workflow Farm...' -NoNewline;
    $wfManagementCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFM_Management;Integrated Security=True;Encrypt=False';
    $wfInstanceCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFM_InstanceManagement;Integrated Security=True;Encrypt=False';
    $wfResourceCS = 'Data Source=' + $dbServer + ';Initial Catalog=' + $dbPrefix + '_WFM_ResourceManagement;Integrated Security=True;Encrypt=False';
    try {
        $wfFarm = Get-WFFarm -WFFarmDBConnectionString $wfManagementCS;
        Write-Host -ForegroundColor White 'Already Exists';
    }
    catch {
        New-WFFarm -WFFarmDBConnectionString $wfManagementCS -RunAsAccount $spServiceAcctName -AdminGroup 'BUILTIN\Administrators' -HttpsPort 12290 -HttpPort 12291 `
            -InstanceDBConnectionString $wfInstanceCS -ResourceDBConnectionString $wfResourceCS -CertificateAutoGenerationKey $WFCertAutoGenerationKey;
        Write-Host -ForegroundColor white 'Done';
    }

With both the Service Bus and Workflow farms created, it’s time to create a Service Bus host and a Workflow Manager host. WFM adopts a similar architecture to ADFS and SharePoint in that a farm consists of one or many hosts. The more hosts, the better the availability of the service behind network load balancing.

Here’s the part of the script that creates a new Service Bus host…

    Write-Host -ForegroundColor white ' - Adding Service Bus host...' -NoNewline;
    try {
        $SBRunAsPassword = ConvertTo-SecureString -AsPlainText  -Force  -String $spServiceAcctPwd;
        Add-SBHost -SBFarmDBConnectionString $managementCS -RunAsPassword $SBRunAsPassword `
            -EnableFirewallRules $true -CertificateAutoGenerationKey $SBCertificateAutoGenerationKey;
        Write-Host -ForegroundColor white 'Done';
    } 
    catch {
        Write-Host -ForegroundColor white 'Already Exists';
    }
    Write-Host -ForegroundColor white ' - Creating Workflow Default Namespace...' -NoNewline;
    $sbNamespace = $dbPrefix + '-WorkflowNamespace';
    try {
        $defaultNS = Get-SBNamespace -Name $sbNamespace -ErrorAction SilentlyContinue;
        Write-Host -ForegroundColor white 'Already Exists';
    }
    catch {
        try {
            # Create new SB Namespace
            $currentUser = $env:userdomain + '\' + $env:username;
            New-SBNamespace -Name $sbNamespace -AddressingScheme 'Path' -ManageUsers $spServiceAcctName,$spAdminAcctName,$currentUser;
            Start-Sleep -s 90
            Write-Host -ForegroundColor white 'Done';
        }
        catch [system.InvalidOperationException] {
            throw;
        }
    }

…and finally, we create the Workflow Manager host…

    $SBClientConfiguration = Get-SBClientConfiguration -Namespaces $sbNamespace;
    # Add WF Host
    try {
        $WFRunAsPassword = ConvertTo-SecureString -AsPlainText  -Force  -String $spServiceAcctPwd;
        Write-Host -ForegroundColor White ' - Adding Workflow Host...' -NoNewline;
        Add-WFHost -WFFarmDBConnectionString $wfManagementCS `
        -RunAsPassword $WFRunAsPassword -EnableFirewallRules $true -EnableHttpPort `
        -SBClientConfiguration $SBClientConfiguration -CertificateAutoGenerationKey $WFCertAutoGenerationKey;
        Write-Host -ForegroundColor White 'Done';
    }
    catch {
        Write-Host -ForegroundColor white 'Already Exists';
    }

At this point, we’ve completed configuration of Workflow Manager, but we’re not quite finished yet. We need to tell SharePoint about the existence of the WFM farm, which involves installing the WFM client on each SharePoint server (you don’t need to install the client on a server where you installed WFM itself). Ideally, your WFM farm is independent of SharePoint (servers and databases), but for test purposes you can install the whole lot on one server.
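
If you want to script the client installation as well, the Web Platform Installer can pull it down the same way as the main product. A sketch follows – the first command lists the available products so you can confirm the client's product ID, because the ID I use below (WorkflowClient) is my assumption:

WebpiCmd.exe /List /ListOption:Available
WebpiCmd.exe /Offline /Products:WorkflowClient /Path:c:\WorkflowClientFiles
WebpiCmd.exe /Install /Products:WorkflowClient /XML:c:\WorkflowClientFiles\feeds\latest\webproductlist.xml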

Let’s make sure WFM is working before we configure SharePoint. Open a browser to the following location…

http://WFM-server-name:12291

You should see some XML returned from the service (you may need to run IE as admin for this to work).

<ScopeInfo xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/workflow/2012/xaml/activities">
  <DefaultWorkflowConfiguration />
  <Description>Root Scope</Description>
  <LastModified>2014-05-12T23:17:49.47</LastModified>
  <LastRevised>2014-05-12T23:17:49.47</LastRevised>
  <Path>/</Path>
  <SecurityConfigurations>
    <ScopedSecurityConfiguration i:type="WindowsSecurityConfiguration">
      <Name>Microsoft.Workflow.Management.Security.WindowsSecurityConfiguration</Name>
      <WorkflowAdminGroupName>BUILTIN\Administrators</WorkflowAdminGroupName>
    </ScopedSecurityConfiguration>
  </SecurityConfigurations>
  <Status>Active</Status>
</ScopeInfo>
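
If you’d rather verify from PowerShell than a browser, something along these lines should return the same XML – a sketch using the current Windows credentials (Invoke-WebRequest needs PowerShell 3.0 or later):

# Query the Workflow Manager HTTP management endpoint
$response = Invoke-WebRequest -Uri 'http://WFM-server-name:12291' -UseDefaultCredentials;
[xml]$scopeInfo = $response.Content;
$scopeInfo.ScopeInfo.Status;    # should output 'Active'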

Our last task is to tell SharePoint 2013 about our new Workflow farm, which we accomplish with the following cmdlet (you only need to run this once, from any SharePoint server in the farm):

Register-SPWorkflowService -SPSite 'http://web-application-url' -WorkflowHostUri 'http://WFM-server-FQDN:12291' -AllowOAuthHttp;

All being well, you should have a Workflow Service Application Proxy listed under Manage Service Applications in Central Administration. Note: this proxy gets created whether the Register-SPWorkflowService cmdlet succeeds or fails, so remember to remove the proxy after an error. Selecting the proxy and clicking the Manage icon in the ribbon should give you a page indicating that SharePoint is connected to WFM. You’re good to go and create 2013 workflows.
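
You can also confirm the registration from PowerShell rather than Central Administration – a sketch; the GetWorkflowServiceAddress call is from memory, so treat it as an assumption:

# List the workflow service application proxy created by the registration
Get-SPWorkflowServiceApplicationProxy | Select-Object DisplayName, Status;

# Ask the proxy which Workflow Manager endpoint it is bound to (method name is an assumption)
$proxy = Get-SPWorkflowServiceApplicationProxy;
$proxy.GetWorkflowServiceAddress((Get-SPSite 'http://web-application-url'));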

Some things to note…

My configuration uses HTTP between SharePoint and WFM. In a production scenario, I recommend using HTTPS, in which case you need to export the auto-generated certificate (from https://WFM-server-FQDN:12290) and import it into SharePoint Central Admin under Security -> Manage Trust. When using SSL, run the same Register-SPWorkflowService cmdlet, but change the WFM location to https://WFM-server-FQDN:12290 and drop the -AllowOAuthHttp switch.
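
The certificate import can be scripted too, rather than clicking through Manage Trust – a minimal sketch, assuming you’ve exported the WFM auto-generated certificate to a local .cer file (the path and trust name are illustrative):

# Load the exported Workflow Manager certificate
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2('C:\Certs\WFMRoot.cer');

# Add it to SharePoint's trusted root authorities (equivalent to Security -> Manage Trust)
New-SPTrustedRootAuthority -Name 'Workflow Manager Root' -Certificate $cert;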

Some organizations rely heavily on workflow. As with SharePoint, ADFS, and other farm-based services, it’s a good idea to configure WFM with multiple hosts when high availability is important.

Filter Document Lib to Last Published Version

My customer brought up an interesting requirement to filter their document library to show just the last approved versions when content approval and major/minor versioning are enabled.

Any unpublished document – that is, a document whose major version is 0 – will not show up in the filter. Any document with a major version greater than 0 that is in draft or pending status – e.g., a document at version 1.1 – will only show a link to its last published version.

Turns out the solution was quite easy and involved just adding some query string parameters…

?IncludeVersions=TRUE&FilterField1=_ModerationStatus&FilterValue1=0&FilterField2=_IsCurrentVersion&FilterValue2=1

The IncludeVersions parameter instructs the list view to show all versions. Then it’s a simple case of filtering on _ModerationStatus equal to 0 (approved) and _IsCurrentVersion equal to 1, which leaves just the most recent approved version of each document.
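
For example, against a hypothetical library view URL (site and library names are illustrative), the full filtered link looks like this:

http://intranet/sites/policies/Shared%20Documents/Forms/AllItems.aspx?IncludeVersions=TRUE&FilterField1=_ModerationStatus&FilterValue1=0&FilterField2=_IsCurrentVersion&FilterValue2=1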