
SharePoint Provider-Hosted Add-In and Web Api

This topic has been discussed several times – how do you go about using Web API from within a single-page SharePoint Provider-Hosted add-in? Bas Lijten had a great approach, which I emulated with some additions.

Provider-Hosted add-ins rely on OAuth under the hood, and the technicalities are normally abstracted away by the SharePointContext and TokenHelper classes. In short, the issue with these classes is that they assume access to both HttpContext.Current and session state, neither of which Web API guarantees.

Bas wrote a custom set of classes that perform the same job as the stock classes, but for Web API. I went one step further and provided a set of classes that work for both MVC page rendering and Web API. My custom provider avoids the use of session state and instead uses cookies to pass values between standard MVC SharePoint add-in calls and Web API calls (AJAX).

You can see my code and read a more technical description over at GitHub.

SharePoint 2010 Calendar Item Error – Item Does Not exist. It may have been deleted by another user

Today, I encountered an interesting issue with SharePoint calendar list items. My customer had created a recurring calendar entry using Outlook and then subsequently deleted the series in Outlook. Somewhere along the timeline they may have updated specific instances in the series. After deleting the series, my customer noticed future events in the recurrence series remaining in the SharePoint Calendar list. Any attempt to delete these list items via the UI or PowerShell resulted in the error “Item does not exist. It may have been deleted by another user”.

SharePoint handles recurring events by creating child list items for each event in the series and tying them to one master list item via the MasterSeriesItemID field. The master list item is not shown in the UI; users only see the individual events in the series, especially when the series has exceptions.

Deleting the master list item with the ID specified in MasterSeriesItemID deleted the entire set of recurring events in the series, which was the desired outcome. I’m guessing Outlook should have deleted the series master item in SharePoint when the user deleted the series in Outlook, but that clearly didn’t happen.
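
For reference, here's a minimal PowerShell sketch of the fix – the web URL, list name, and item ID are placeholders for your own environment:

# Load the calendar list and one of the orphaned recurrence items.
$web = Get-SPWeb "http://sharepoint/sites/teamsite"
$list = $web.Lists["Calendar"]
$orphan = $list.GetItemById(123) # ID of one of the stubborn child events

# Each child event points at the hidden series master.
$masterId = $orphan["MasterSeriesItemID"]

# Deleting the master removes the whole recurrence series, including the orphans.
$list.GetItemById($masterId).Delete()
$web.Dispose()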

SharePoint High-Trust App Only Context from PowerShell

Wictor Wilén wrote a great post on getting an “app-only” client context for SharePoint add-ins using ACS – low-trust add-ins. I wanted to do something similar with high-trust (server-to-server) provider-hosted add-ins. I needed the app-only context so that I could use the Tenant API to create on-premises site collections from PowerShell. With the help of Fiddler and some mad PowerShell skills, I came up with the following function.

Function GetAppOnlyContext {
    # Note: $tenantUrl should include a trailing slash, e.g. https://sharepoint.contoso.com/
    Param([string]$tenantUrl, [string]$clientId, [string]$issuerId, [string]$certPath, [string]$certPwd, [scriptblock]$GetAppOnlyContextCB);
    [System.Reflection.Assembly]::LoadWithPartialName("System") | Out-Null;
    [System.Reflection.Assembly]::LoadWithPartialName("System.Collections") | Out-Null;
    [System.Reflection.Assembly]::Load("System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089") | Out-Null;
    [System.Reflection.Assembly]::Load("Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35") | Out-Null;
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.IdentityModel.Extensions") | Out-Null;
    # Get the realm.
    $realm = "";
    $headers = @{Authorization = "Bearer "};
    Try {
        $x = Invoke-WebRequest -Uri "$($tenantUrl)_vti_bin/client.svc" -Headers $headers -Method POST -UseBasicParsing;
    } Catch {
        # We expect a 401 here.
        $realm = $_.Exception.Response.Headers["WWW-Authenticate"].Substring(7).Split(",")[0].Split("=")[1].Trim("`"");
    }
    $issuer = "$($issuerId)@$($realm)";
    $nameId = "$($clientId)@$($realm)";
    $uri = New-Object System.Uri($tenantUrl);
    $audience = "00000003-0000-0ff1-ce00-000000000000/$($uri.Authority)@$($realm)";
    # Get the claims
    $actorClaims = New-Object System.Collections.Generic.List[Microsoft.IdentityModel.S2S.Tokens.JsonWebTokenClaim];
    $claim = New-Object Microsoft.IdentityModel.S2S.Tokens.JsonWebTokenClaim("nameid", $nameId);
    $actorClaims.Add($claim);
    # Get the signing credentials
    $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($certPath, $certPwd);
    $signingCreds = New-Object Microsoft.IdentityModel.SecurityTokenService.X509SigningCredentials(`
        $cert, [System.IdentityModel.Tokens.SecurityAlgorithms]::RsaSha256Signature, `
        [System.IdentityModel.Tokens.SecurityAlgorithms]::Sha256Digest);
    # Create the token.    
    $token = New-Object Microsoft.IdentityModel.S2S.Tokens.JsonWebSecurityToken(`
        $issuer, $audience, [System.DateTime]::UtcNow, [System.DateTime]::UtcNow.AddHours(12), $actorClaims, $signingCreds);
    $tokenString = (New-Object Microsoft.IdentityModel.S2S.Tokens.JsonWebSecurityTokenHandler).WriteTokenAsString($token);
    [Microsoft.SharePoint.Client.ClientContext]$clientContext = $null;
    Try {
        # Create the client context.
        $clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($tenantUrl);
        $clientContext.AuthenticationMode = [Microsoft.SharePoint.Client.ClientAuthenticationMode]::Anonymous;
        $clientContext.FormDigestHandlingEnabled = $false;
        AddRequestHandler -clientContext $clientContext -token $tokenString;
        if ($GetAppOnlyContextCB -ne $null) { &$GetAppOnlyContextCB -clientContext $clientContext; }
    } Finally {
        if ($clientContext -ne $null) { $clientContext.Dispose(); }
    }
}

Function AddRequestHandler {
    Param([Microsoft.SharePoint.Client.ClientContext]$clientContext, [string]$token);
    Add-Type -TypeDefinition @"
using System;
using Microsoft.SharePoint.Client;
namespace SPHelper {
    public static class ClientContextHelper {
        private static string _token = "";
        public static void AddRequestHandler(ClientContext ctx, string token) {
            _token = token;
            ctx.ExecutingWebRequest += new EventHandler<WebRequestEventArgs>(RequestHandler);
        }
        private static void RequestHandler(object sender, WebRequestEventArgs e) {
            e.WebRequestExecutor.RequestHeaders["Authorization"] = "Bearer " + _token;
        }
    }
}
"@ -ReferencedAssemblies "$env:dp0\DLLs\Microsoft.SharePoint.Client.dll", "$env:dp0\DLLs\Microsoft.SharePoint.Client.Runtime.dll";
    [SPHelper.ClientContextHelper]::AddRequestHandler($clientContext, $token);
}
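
Here's a hypothetical invocation for illustration – the URL, GUIDs, and certificate details are placeholders, and the CSOM assemblies are assumed to be loaded (as AddRequestHandler expects). The callback receives the app-only client context:

GetAppOnlyContext -tenantUrl "https://sharepoint.contoso.com/" `
    -clientId "11111111-1111-1111-1111-111111111111" `
    -issuerId "22222222-2222-2222-2222-222222222222" `
    -certPath "C:\Certs\HighTrustCert.pfx" -certPwd "pass@word1" `
    -GetAppOnlyContextCB {
        Param($clientContext);
        # Do app-only work here, e.g. load the root web.
        $web = $clientContext.Web;
        $clientContext.Load($web);
        $clientContext.ExecuteQuery();
        Write-Host "App-only connection to $($web.Title)";
    };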

SharePoint Online AppRegNew via PowerShell

# Requires the MSOnline PowerShell module; connect with Connect-MsolService before running.
$clientId = "clientId GUID"
$appDomain = "App Domain"
$appName = "Friendly Name"
$appUrl = "App Url"
$newClientSecret = "Client Secret"

$servicePrincipalName = @("$clientID/$appDomain")
New-MsolServicePrincipal -ServicePrincipalNames $servicePrincipalName -AppPrincipalId $clientID -DisplayName $appName `
  -Type Symmetric -Usage Verify -StartDate "12/01/2016" -EndDate "12/01/2017" -Addresses (New-MsolServicePrincipalAddresses -Address $appUrl) 
New-MsolServicePrincipalCredential -AppPrincipalId $clientId -Type Symmetric -Usage Sign -Value $newClientSecret
New-MsolServicePrincipalCredential -AppPrincipalId $clientId -Type Symmetric -Usage Verify -Value $newClientSecret
New-MsolServicePrincipalCredential -AppPrincipalId $clientId -Type Password -Usage Verify -Value $newClientSecret
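
As a quick sanity check (not part of the original AppRegNew steps), you can confirm the principal and its credentials exist – this assumes you've already connected with Connect-MsolService:

# List the service principal and its credential keys to confirm registration.
Get-MsolServicePrincipal -AppPrincipalId $clientId
Get-MsolServicePrincipalCredential -AppPrincipalId $clientId -ReturnKeyValues $false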

NodeJS Development Environment w/SharePoint Framework Support

Unless you’ve been hiding under a rock recently (or just aren’t interested in SharePoint development), you’ve probably heard or read about the recent announcement of the preview release of the SharePoint Framework. You can read the announcement here:

http://dev.office.com/blogs/sharepoint-framework-developer-preview-release

The new SPFx adopts client-side development using TypeScript (a superset of JavaScript) and tools born from NodeJS development, such as Gulp and Yeoman. The aim of this post is not to go into the specifics of these tools – besides, there’s lots of information on the Internet.

If you’ve made the leap into client-side development (for SharePoint or otherwise) – congratulations and welcome to the new era of software development. Those of you embarking on the learning curve will soon learn that client-side development (and by extension SPFx development) requires installing various tools for your development arsenal. The days of just installing a single IDE are fading away. At this point I shall mention that die-hard Visual Studio folks can develop NodeJS and SPFx projects with their IDE. However, you still need NodeJS and dependent modules installed to develop for SPFx. The following article details the steps:

https://github.com/SharePoint/sp-dev-docs/wiki

Like many JavaScript and TypeScript developers before me, I have opted for the platform-independent tools, using Visual Studio Code. VSCode is a lightweight code editor that embraces client-side and NodeJS development and runs on Windows, OSX, and Linux (you can even run it on a Raspberry Pi). Just as with its big brother, Visual Studio Code works with additional software to constitute a true development environment – alone, it’s really just a JavaScript/TypeScript editor. I’ll refer you to the previously mentioned article that speaks to installing all the necessary components for SPFx development.

By now, you’re probably thinking “I have to install Visual Studio Code or Visual Studio 2015, NodeJS, Yeoman, Gulp, Windows Build Tools, yada, yada, yada, just to get a development environment up and running?”. The short answer is “yes”. Luckily for you (those of you on Windows at least), I have created a PowerShell script that downloads all the tools and dependencies for you, available at the following location:

https://github.com/robgarrett/Study/blob/master/Install-JSDev.ps1

My script downloads the following binaries and installs them:

  • Visual Studio Code
  • NodeJS LTS
  • Windows GIT

After installation of the binaries, the script uses the Node Package Manager (NPM) to install:

  • Windows-Build-Tools (includes an installation of Python)
  • Yeoman
  • Gulp
  • The SPFx Yeoman Generator (this creates SPFx scaffolding)
  • Typescript 2.0

TypeScript 2.0 isn’t strictly required for SPFx development (I believe TS v1.x is installed as a dependency of one of the other packages), but TypeScript is here to stay, so we might as well get used to the next version.
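
For reference, the NPM portion of the script amounts to roughly the following global installs – exact package versions may differ from what the script pins:

# Run from an elevated prompt; windows-build-tools pulls in Python and the C++ build tools.
npm install -g windows-build-tools
npm install -g yo gulp
npm install -g @microsoft/generator-sharepoint
npm install -g typescript@2.0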

Depending on the performance of your development machine and your Internet connection, the script can take some time to install all the necessary packages. So grab a coffee and let it do its stuff.

Finally, you’re ready to start developing. If you’re ready to dive into SPFx Web Part development you can create your first web-part using the instructions at the following location:

https://github.com/SharePoint/sp-dev-docs/wiki
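
In short, scaffolding and running your first web part looks something like this – a sketch based on the preview documentation, with an arbitrary project name:

# Create a project folder, scaffold a web part, and launch the local workbench.
mkdir helloworld-webpart
cd helloworld-webpart
yo @microsoft/sharepoint
gulp serve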

FYI – the @Microsoft/Generator-SharePoint package downloads a ton of modules, and it can take an absolute age. It might seem like a lot of waiting around to develop your first SPFx web part, but SPFx and the workbench rely on lots of modules. For subsequent projects you can always make a copy of the default web-part scaffolding (directory structure and files) to save generating from scratch. At the very least, the node_modules folder is worth keeping because it contains all the dependent NodeJS projects and libraries.

So, that’s it. Sorry if you’re on OSX or Linux – you’ll have to download the installs per the article instructions until I or someone else creates a bash script to do the same as my PowerShell script (note: PowerShell now runs on Linux, but my script is Windows-specific). Hey, at least you seldom have to set up your development environment from scratch.

SharePoint 2013 Not Crawling ColdFusion (CFM) Pages

It’s not every day that you find a needle in a haystack, but when you do, it’s worth blogging for posterity…

My customer has set up SharePoint 2013 as the central search authority for their organization and uses it to crawl non-SharePoint sites as well as SharePoint. We noticed that SharePoint was not crawling links in ColdFusion pages (.cfm). Instead, the crawler was treating CFM pages as text and stopping at the top-most levels without indexing lower-level pages.

We noticed in testing that renaming the CFM extension to HTML fixed the issue, but that wasn’t sustainable for the vast number of ColdFusion sites in the organization.

For the longest time I was messing around with the New-SPEnterpriseSearchFileFormat cmdlet, urging SharePoint to treat CFM pages the same as HTML. What I determined is that this cmdlet is good for mapping custom extensions to Windows platform IFilters. What I wanted was to mimic how the crawler indexes HTML pages, which does not use Windows IFilters. After much perseverance, I found the following information (included below, in case the link goes away):

# To check the current settings for filtering extensions, run the following command lines:
$ssa = Get-SPEnterpriseSearchServiceApplication "Search Service Application"
$ssa.GetProperty("ExtensionsToFilter")
# Here's the default output that you'll receive:
#;ascx;asp;aspx;htm;html;jhtml;jsp;mht;php;

#To add the .cfm extension to the property, run the following commands:
$ssa.SetProperty("ExtensionsToFilter", ";ascx;asp;aspx;htm;html;jhtml;jsp;mht;php;cfm;")
$ssa.Update()
# To restart the search functionality on a crawler when no crawling is occurring, run the following commands:
net stop osearch15
net start osearch15

https://support.microsoft.com/en-us/kb/2953907

SharePoint Crawling User Profiles (SPS3://) – Access Denied w/o HTTP

I stumbled across an interesting issue with People Search in SharePoint 2016. I was attempting to crawl the user profile store with the URL sps3://server-name and getting Access Denied in the crawl log. I checked the Administrators for the User Profile Service in Manage Service Applications and confirmed my default content access account (crawl account) had the Retrieve People Data for Search Crawlers permission.

Looking at the ULS logs, I noticed errors about missing Alternate Access Mappings for an HTTP address before seeing the Access Denied error. This caught my eye because I’ve configured my collaboration web application and my-site host as HTTPS.

For kicks, I added an IIS binding for HTTP://SERVER-NAME and added an AAM for the server name on HTTP, alongside my HTTPS FQDN. Lo and behold, after starting a full crawl, the log reported successes for people data.

So, it appears that SharePoint takes the URL sps3://server-name and converts it to http://server-name to make some determination of access to the User Profile store. I’m not sure why this is the case (not yet anyway).

Lesson learned (for now): make sure SharePoint’s default content access account can access the same domain URL on HTTP as that of the SPS3 protocol. As mentioned at the top of this post, I found this out on SharePoint 2016, and I need to test to see if the results are the same on SharePoint 2013.

[Update 5/13/2016]: Turns out I should read the TechNet articles carefully. The following article indicates using sps3s://mysite-url, which then works correctly.

https://technet.microsoft.com/en-us/library/hh582311.aspx?f=255&MSPPError=-2147217396
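
For reference, here's a rough sketch of fixing the start address from PowerShell – the content source name and URLs are placeholders for your own farm:

# Replace the sps3:// start address with sps3s:// for an HTTPS my-site host.
$ssa = Get-SPEnterpriseSearchServiceApplication
Set-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites" `
    -StartAddresses "https://sharepoint.contoso.com,sps3s://mysite.contoso.com"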

Deploying a SharePoint Add-In from the Catalog via PowerShell

I recently came across a situation where I was asked if I could deploy a SharePoint App/Add-in from the corporate catalog to a sub-site via PowerShell. Unsurprisingly, there’s no single PowerShell cmdlet that performs this task. So, I took it upon myself to reverse engineer the SharePoint storefront and see how Microsoft does it within the platform.

The following code relies on .NET Reflection to invoke private and internal methods in the SharePoint server-side APIs. For this reason, I recommend taking caution in using this code, because we’re calling methods that Microsoft never intended developers to access. I highly recommend keeping this code away from production.

The code assumes the presence of custom add-ins/apps in the catalog and iterates through them. For each add-in/app, you have the option to install it to the root web of the given site collection. You could easily change this code to suit your purpose. Note: if an app is already installed for a given web, it won’t show up in the iteration.

[CmdletBinding()]param();

if ((Get-PSSnapin -Name "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell";
}

$yes = New-Object System.Management.Automation.Host.ChoiceDescription "&Yes","Description."
$no = New-Object System.Management.Automation.Host.ChoiceDescription "&No","Description."
$cancel = New-Object System.Management.Automation.Host.ChoiceDescription "&Cancel","Description."
$options = [System.Management.Automation.Host.ChoiceDescription[]]($yes, $no, $cancel)

$url = "site collection URL here";
$site = Get-SPSite $url;
$web = $site.RootWeb;

Write-Verbose "Getting apps from the catalog";
$json = Invoke-RestMethod -UseDefaultCredentials -Method Get -Uri "$url/_layouts/15/addanapp.aspx?task=GetMyApps&sort=1&query=&myappscatalog=0&ci=1&vd=1";
$json | ? { $_.Catalog -eq 1 } | % {
    $appId = $_.ID;

    Write-Host -foreground Yellow "Title: $($_.Title)";
    Write-Host -foreground Yellow "AppID: $appId";

    $result = $host.ui.PromptForChoice("App Install", "Install App $($_.Title)", $options, 1)
    if ($result -eq 2) { break; }
    if ($result -eq 0) {

        Write-Verbose "Get the Corporate Catalog Accessor instance";
        $flags = [System.Reflection.BindingFlags]::NonPublic -bor [System.Reflection.BindingFlags]::Instance;
        $asm = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint");
        $ccaType = $asm.GetType("Microsoft.SharePoint.Marketplace.CorporateCuratedGallery.SPCorporateCatalogAccessor");
        $ccaCtor = $ccaType.GetConstructors($flags) | ? { $_.GetParameters().Count -eq 1; }
        $cca = $ccaCtor.Invoke(@($web));

        Write-Verbose "Getting App Package from the Catalog";
        $method = $ccaType.GetMethods($flags) | ? { $_.Name -ilike "GetAppPackage" -and ($_.GetParameters())[0].ParameterType.Name -eq "String" }
        $stream = $method.Invoke($cca, @($appId));

        Write-Verbose "Installing App from Catalog";
        $spAppType = $asm.GetType("Microsoft.SharePoint.Administration.SPApp");
        $method = $spAppType.GetMethod("CreateAppUsingPackageMetadata", [System.Reflection.BindingFlags]::NonPublic -bor [System.Reflection.BindingFlags]::Static);
        [Microsoft.SharePoint.Administration.SPApp]$spApp = $method.Invoke($null, @($stream, $web, 2, $false, $null, $null));
        $appInstanceId = $spApp.CreateAppInstance($web);
        Write-Host -ForegroundColor Yellow "AppInstanceID: $appInstanceId";
        $appInstance = [Microsoft.SharePoint.Administration.SPAppCatalog]::GetAppInstance($web, $appInstanceId);
        $appInstance.Install();
    }
}

SharePoint 2013 Build Numbers and PowerShell

If you’re in the business of maintaining SharePoint 2013 on-premises, you’ve undoubtedly come across Todd Klindt’s blog post with ongoing table of build numbers and corresponding CU names: http://www.toddklindt.com/sp2013builds.

Knowing the current patch version of your farm is pretty straightforward. You can look up the farm build number in Central Administration -> Manage Servers in Farm and grab the number at the top of the page. Cross-reference this number with the table in Todd’s blog post and you have the CU version installed in your farm.
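
From PowerShell, grabbing that number is a one-liner (assuming the SharePoint snap-in is loaded):

# Returns the farm build number as a string, e.g. 15.0.xxxx.xxxx
(Get-SPFarm).BuildVersion.ToString()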

I wanted to go a step further and write a PowerShell script that pulls the build number and looks up the details from Todd’s blog post automagically. Here it is:

[CmdletBinding()]Param();

$global:srcWebPage = "http://www.toddklindt.com/sp2013builds"; # Thanks Todd.

if ((Get-PSSnapin -Name "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell";
}

try {
    $farm = Get-SPFarm;
    $buildVersion = $farm.BuildVersion;
    $buildVersionString = $buildVersion.ToString();
    $site = Invoke-WebRequest -UseBasicParsing -Uri $global:srcWebPage;
    $pattern = "\<td.*\>.+(" + $buildVersionString.Replace(".", "\.") + ").*\</td\>\s*\<td.*\>(.+)\</td\>\s*\<td.*\>(.+)\</td\>";
    $pattern += '\s*\<td.*\>.*\<a.+href="(.+)".*\>(.+)\</a\>\</td\>';
    $pattern += '\s*\<td.*\>.*\<a.+href="(.+)".*\>(.+)\</a\>\</td\>';
    Write-Verbose $pattern;
    $m = [Regex]::Match($site.RawContent, $pattern, [System.Text.RegularExpressions.RegexOptions]::Multiline);
    if (!$m.Success) { throw "Could not find build number $buildVersionString in $global:srcWebPage"; }
    Write-Host -ForegroundColor white -NoNewline "Current Build Number: ";
    Write-Host -ForegroundColor yellow $buildVersionString;
    Write-Host -ForegroundColor white -NoNewline "Current Patch/CU: ";
    Write-Host -ForegroundColor yellow $m.Groups[2].Value;
    Write-Host -ForegroundColor white -NoNewline "KB of Current Patch/CU: ";
    Write-Host -ForegroundColor yellow $m.Groups[5].Value;
    Write-Host -ForegroundColor white -NoNewline "Download of Current Patch/CU: ";
    Write-Host -ForegroundColor yellow $m.Groups[6].Value;
    Write-Host
    $index = $m.Index + $m.Length;
    $pattern = "\<td.*\>.+([0-9]+\.[0-9]+\.[0-9]+\.[0-9]+).*\</td\>\s*\<td.*\>(.+)\</td\>\s*\<td.*\>(.+)\</td\>";
    $pattern += '\s*\<td.*\>.*\<a.+href="(.+)".*\>(.+)\</a\>\</td\>';
    $pattern += '\s*\<td.*\>.*\<a.+href="(.+)".*\>(.+)\</a\>\</td\>';
    $m = [Regex]::Match($site.RawContent.Substring($index), $pattern, [System.Text.RegularExpressions.RegexOptions]::Multiline);
    if ($m.Success) {
        Write-Host -ForegroundColor white -NoNewline "Next Build Number: ";
        Write-Host -ForegroundColor green $m.Groups[1].Value;
        Write-Host -ForegroundColor white -NoNewline "Next Patch/CU: ";
        Write-Host -ForegroundColor green $m.Groups[2].Value;
        Write-Host -ForegroundColor white -NoNewline "KB of Next Patch/CU: ";
        Write-Host -ForegroundColor green $m.Groups[5].Value;
        Write-Host -ForegroundColor white -NoNewline "Download of Next Patch/CU: ";
        Write-Host -ForegroundColor green $m.Groups[6].Value;
    }

} catch {
    Write-Host -ForegroundColor Red $_.Exception;
}

PowerShell Register Provider-Hosted Add-in/App

My current client uses provider-hosted add-ins with SharePoint 2013 on-premises. We have a centralized server infrastructure – the provider host – for the add-ins, where we deploy the add-in/app logic and then deploy the APP files to different SharePoint 2013 environments.

Why? The add-ins we’ve developed use CSOM to effect changes in the environment they’re deployed to (SharePoint). We have one team developing the provider-hosted add-ins, and another team testing the add-ins within their development environments. This post is not about lifecycle deployment of SharePoint provider-hosted add-ins – besides, we have integration, staging, test, and production hosts for that purpose – but about a nifty PowerShell script to reuse APP files across environments.

So, the scenario goes like this…

We have an integration farm with the provider-hosted add-ins deployed (and working). Developers download these add-ins from the integration SharePoint farm app catalog and save the APP files locally. They then upload these APP files into the app catalog of their local development SharePoint farm. Each development farm has a registered Security Token Issuer, using the same issuer ID as the integration farm. The development farms also have a trusted root certificate for the High-Trust between the provider-host and SharePoint, also the same as integration. The remaining step is to ensure that each add-in deployed to the development farm has the same client/app ID as that registered in the integration farm.

The typical process to register a shared add-in would be to crack open the APP file (just a zip file), look in the AppManifest.xml file and pull the client ID, and then call https://site/_layouts/15/appregnew.aspx. However, I wanted a script that avoided all that nonsense, and here it is:

[CmdletBinding()]Param(
    [Parameter(Mandatory=$true)][string]$appPath,
    [Parameter(Mandatory=$true)][string]$webUrl
);

if ((Get-PSSnapin -Name "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell";
}

$zipStream = $null;
$streamReader = $null;
try {
    Write-Verbose "Looking for AppManifest in APP Zip";
    [System.Reflection.Assembly]::LoadWithPartialName('System.IO.Compression') | Out-Null;
    if (![System.IO.File]::Exists($appPath)) { throw "$appPath does not exist"; }
    $zipBytes = [System.IO.File]::ReadAllBytes($appPath);
    $zipStream = New-Object System.IO.Memorystream;
    $zipStream.Write($zipBytes, 0, $zipBytes.Length);
    $zipArchive = New-Object System.IO.Compression.ZipArchive($zipStream);
    $zipEntry = $zipArchive.GetEntry("AppManifest.xml");
    $streamReader = New-Object System.IO.StreamReader($zipEntry.Open());
    $manifest = New-Object System.Xml.XmlDocument;
    $manifest.LoadXml($streamReader.ReadToEnd());
    
    Write-Verbose "Looking for ClientID";
    $ns = New-Object System.Xml.XmlNamespaceManager($manifest.NameTable);
    $ns.AddNamespace("x", "http://schemas.microsoft.com/sharepoint/2012/app/manifest");
    $node = $manifest.SelectSingleNode("/x:App/x:AppPrincipal/x:RemoteWebApplication", $ns);
    $clientId = $node.Attributes["ClientId"].Value;
    $node = $manifest.SelectSingleNode("/x:App/x:Properties/x:Title", $ns);
    $appTitle = $node.InnerText;
    Write-Verbose "Found app with title $appTitle and clientID $clientId";
    
    Write-Verbose "Registering App ClientId with SharePoint";
    $web = Get-SPWeb $webUrl;
    $realm = Get-SPAuthenticationRealm -ServiceContext $web.Site;
    $fullAppId = $clientId + '@' + $realm;
    Register-SPAppPrincipal -DisplayName $appTitle -NameIdentifier $fullAppId -Site $web;

} catch {
    Write-Host -ForegroundColor Red $_.Exception;
} finally {
    if ($streamReader -ne $null) { $streamReader.Close(); }
    if ($zipStream -ne $null) { $zipStream.Close(); }
}

The script takes a full path to the APP file and a web URL to register the add-in against. As you can see from the code, the script replicates the manual steps: it unzips the APP (in memory), pulls out the client ID, and calls Register-SPAppPrincipal to register the add-in.
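
A hypothetical invocation – the script name and paths below are placeholders:

.\Register-SPAddIn.ps1 -appPath "C:\Apps\MyProviderHostedAddIn.app" -webUrl "http://sharepoint/sites/dev" -Verbose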