Category Archives: Everything

“Every” post goes into this category

Creating an SSL Certificate for SharePoint (Development)

I have recently been researching Active Directory Federation Services (ADFS) for my upcoming book on SharePoint 2013. One of the requirements for ADFS is to communicate with relying parties, such as SharePoint, over SSL. Setting up SSL for a SharePoint web application is a trivial process, but nonetheless one I thought I’d blog about.

Note: The following steps create a self-signed certificate for development purposes; never use such a certificate in production.

1. Open Internet Information Services (IIS) Manager 7

2. Click on the server name in the left navigation tree, and then double-click the Server Certificates icon on the right, under the IIS section.

3. Click the link to create a self-signed certificate

4. Give the certificate a friendly name, and then click the OK button

5. Double-click the self-signed certificate to see the details

6. Click the details tab and then click the button to copy the certificate to a file

7. Click the next button

8. Select the option to NOT export the private key, then click the next button

9. Choose the export format (I chose the default DER format) and then click the next button

10. Give the certificate a filename and browse to a location on disk

11. Click the next button, then finish button to export the certificate to the file

You have now created a new self-signed certificate and exported the public key to a file on disk. The steps that follow demonstrate adding the public key to the trusted root authorities certificate store, so the certificate is trusted on the local machine – this avoids annoying messages in IE about untrusted certificates.

12. Open the Microsoft Management Console (MMC.exe)

13. Add the Certificates snap-in for the computer account and local machine

14. Import the certificate into the Trusted Root Certificate Authorities node

15. Import the certificate into the SharePoint node
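
If you prefer to script steps 12 to 14 rather than click through the MMC, a minimal C# sketch along these lines should work (the certificate path is an example – point it at wherever you exported the .cer file):

using System.Security.Cryptography.X509Certificates;

// Import the exported public key into the local machine's Trusted Root store.
X509Certificate2 cert = new X509Certificate2(@"C:\MYCert.cer");   // example path
X509Store store = new X509Store(StoreName.Root, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadWrite);
store.Add(cert);
store.Close();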

Now that we have a trusted certificate, the next step is to add it to the trusted store in SharePoint, using the following PowerShell script:

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\MYCert.cer")
New-SPTrustedRootAuthority -Name "SharePoint Certificate" -Certificate $cert

Note: you must provide the full path to the CER file in the above script.

Let’s go ahead and bind the certificate to an application (web site) in IIS:

16. Return to IIS Management

17. Click the SharePoint application in the left navigation, under sites

18. Click the Bindings link (on the far right)

19. Click the add button

20. Choose HTTPS, and select the certificate to use

Finally, we must let SharePoint know that we can receive requests on the SSL address, by creating an Alternate Access Mapping entry, as follows:

21. Open Central Administration

22. Click the Application Management heading

23. Click the link to configure alternate access mappings

24. Click the button to Edit Public URLs

25. Change the Alternate Access Mapping Collection for the correct web application

26. Choose an empty zone and add the HTTPS URL (this should use the full domain name that is listed for the self-signed certificate in IIS)
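
Steps 21 to 26 can also be scripted against the SharePoint object model. The following is only a hedged sketch – the URLs are placeholders, and the Internet zone is just an example; pick the zone that matches the empty zone you chose above:

// Assumes the web application answers on http://server and the certificate's host name is "server".
SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://server"));
webApp.AlternateUrls.Add(new SPAlternateUrl(new Uri("https://server"), SPUrlZone.Internet));
webApp.AlternateUrls.Update();   // persist the new mapping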

That’s all there is to it.

Images working again – Medical benefits

Risks and Benefits of the New Medical Imaging Enterprise

“Dr. Peterson, it’s been a while.” Lifting his head up from a plate of stuffed mushrooms, Dr. Peterson eyed a familiar face—Dr. Kelly. Although the two had been friends since the very first day of medical school, the residency match had assigned them to lives on opposite coasts, Dr. Peterson in internal medicine in New York and Dr. Kelly in radiology in Los Angeles, and they had lost contact. Indeed, one motivation to attend this medical school reunion was to finally catch up with each other.

“It has been a long time, hasn’t it? I really should have gotten here earlier, but our practice just opened a new clinic and I had to stay late, what with everyone cashing in on that coupon we placed in the paper. Here’s our new business card.”

Eyeing the card, Dr. Peterson noticed the bold letters: “Kelly Health-E-Scan: Full Body Imaging.”

“Our radiology group is creating a straight-to-the-public, full-body scan clinic. We have CT and MRI facilities in the office. We can see potential tumors and calcifications before they become symptomatic. Surveying the community around our practice, we found a sizable interest in such a service, and voila!”

It appeared that Dr. Kelly’s famous showmanship had not waned. But Peterson, ever the debater, retorted immediately.

“It does look snazzy, but don’t you think that it’s rather costly to do scans just on a whim? What about all those false positives? You could put patients through so much unnecessary grief.”

“We don’t just scan any guy off the street,” Kelly replied. “We have it all thought out; we make sure that the patient is at risk before we scan them. Our radiologists go over the results thoroughly with them and send a report to their primary care physicians. There will always be benign findings in every setup; I think it’s better to do these scans early and potentially save people from the blocked artery or a brain tumor. If the technology is there and a market for it exists from the patients, I don’t see why we shouldn’t allow patients to take their health care into their own hands. It doesn’t do them any harm. Come on, it’s the ultimate public health initiative; giving the patient the latest and greatest to stop them taking up a hospital bed 10 years down the line!”

Commentary

While the American health care system leads the world in many aspects of medical innovation and advanced medical technologies, it also suffers from serious problems that form the context for any discussion of this scenario. The problems include, most notably, high costs that are still increasing and a growing number of people who are uninsured. More than 45 million Americans do not have health insurance, due, at least in part, to a lack of affordability. For those who do have coverage, indecipherable layers of complexity and restrictions prevail over personal choice.

On the one hand, drawbacks of the current U.S. health insurance system can be traced historically to the interposition of employers between individuals and access to insurance, as well as to complex government mandates and other multifactorial influences outside the patient-physician construct. On the other hand, virtually every country, regardless of the fundamental structure of its health care system and the degree of government regulation, is struggling to control costs while grappling with limitations in access to modern medical advances. Needless to say, there is no simple solution.

The U.S. health system is often held up as an example of the failure of “private” medicine, yet this characterization is misleading. Indeed, the vast majority of payments to physicians or hospitals are directly or indirectly set by government and not by market forces. Moreover, the U.S. has one of the most government-regulated health systems in the world—and at a huge cost. Beyond payment, the close linkage between employment and health insurance just mentioned has severely limited choice and autonomy for the individual patient. For these reasons, many policy makers and consumers welcome movement away from governmental dictates toward individual consumer empowerment with information and control of the health care dollar.

Unfortunately, imaging-based screening centers, as an example of consumer-directed care, have so far fallen short of their laudable goals. One serious limitation is that they require out-of-pocket payment because the vast majority of health care insurance policies do not cover such screening. This type of service, then, may be accessible only to the socioeconomic group that has the means to pay out-of-pocket or to consumers who carry newer high-deductible insurance with health savings accounts. Access and a means to pay, though, are only parts of the problem.

Notwithstanding the obvious irregular quality and other controversies about implementation [1], the basic idea of screening for disease at imaging centers should not be immediately discarded. These centers may potentially benefit consumers of health care a great deal. It is widely acknowledged that providing medical care only for those who are already sick is neither efficient nor optimal from a public health perspective. Thus, screening and preventive care with pre-morbid detection of disease is extremely significant if implemented correctly. Estimates are that even a mere 1-percent permanent reduction in cancer death rates would save $500 billion [2].

The Case of Dr. Kelly

The case described by the dialogue between Drs. Peterson and Kelly is hardly fiction. More than 108 imaging centers offering heart, lung, brain and other scans exist in the U.S. today. In 2001, 88 centers were operational, distributed across the country and highly concentrated in coastal regions such as California, Florida and New York [1]. The distribution has changed over the past five years, but only slightly. Peaks in areas of concentration are less sharp than before and centers are now distributed across 31 states. In Canada and Europe, availability is also increasing steadily [3].

Benefits. The potential benefits of consumer-directed, self-referred imaging are significant. At the top of the list is the possibility of a life-saving finding or early intervention by virtue of detecting preclinical disease. While a life-saving discovery may be rare, and empirically established true positive rates are not as well-documented as widely cited anecdotal testimony of good outcomes, the early detection of subclinical disease has undeniable value. Second on the list of benefits is patient empowerment. For individuals to take control of their own health care is a good thing—for them and for society—assuming that appropriate access to information, full disclosure about risks and assistance for follow-up by physicians is available. Third is autonomy and privacy. In this electronic age when personal privacy may be all but an illusion, the opportunity to seek a medical answer to a nagging private question outside the traditional health care system is also desirable. This is true whether a consumer-patient is entirely asymptomatic and seeks reassurance of fine health or is one who worries in the wake of a medical scare.

Risks. A list of risks arises from indiscriminate use of imaging marketed to consumers without physicians in the loop. Our own work has shown that, given the current culture, design and framework for screening imaging, risks outweigh benefits in number and quality. The risks include:

  • Psychological, health and financial costs of false positive findings and the potential for unnecessary, invasive follow-up tests.
  • Risks incurred when an anonymous diagnostician relays highly significant information to a patient with whom he or she has no relationship or rapport.
  • Diagnosis with no available therapy.
  • Lack of standards for disclosure of benefits and risks.
  • Caregiver conflict of interest.
  • Unregulated quality control of radiologist and scanning methods and equipment.
  • Risks of radiation from repeat CT scans; patients may visit many centers, and record-keeping across centers is not required.
  • Inadvertent changes to patient lifestyle due to over-confidence in clearance from disease by screening imaging technology.

The impact of misleading marketing and advertising must also be taken seriously [4]. There is no question that competition among health care professionals is beneficial to all, but in medicine, where the asymmetry of information between clinicians and patient is high, competitive marketing can lead to problems. Most worrisome are aggressive advertising campaigns aimed at vulnerable prospective consumers: the patient who suffers from mental illness or the patient who is desperately seeking relief from untreatable disease or incompletely explained symptoms. The free availability of a wide range of information on the Internet is extremely positive, but the very nature of the Internet also allows medical information to be of variable quality, completeness and reliability, which exacerbates these risks [5].

Ctrl-F Crash in Visual Studio

I have ReSharper 4 installed in Visual Studio 2008.  On 64-bit, the Ctrl-F functionality crashes the application, which has been driving me nuts.  My colleague Anand posted a solution to our company Intranet, so I stole his post for my blog for future reference.

Thanks Anand 😉

“Visual Studio might crash when using the Find feature on a 64-bit system. This MSDN article, KB947841, explains the issue. I uncovered this problem after installing ReSharper on a 64-bit system; however, it is not related to ReSharper. Installing this add-in simply uncovers this bug in Visual Studio. You need to request this hotfix.”

Blog moved to WordPress.com

On a whim in the middle of yesterday evening, I decided to move my blog from Community Server 2.1 to WordPress.com.

Please update your syndication URL to http://blog.robgarrett.com/feed/


Community Server and the team at Data Research Group, where my blog was hosted, have been great, and I thank them both: DRG for their support and free hosting, and the Community Server guys for the wonderful platform I’ve been using for the past 3-4 years.

My decision to move last night wasn’t an agonizing one (hence “whim”) and had nothing to do with the CS platform or hosting; rather, I am moving my life in the direction of “less maintenance for Rob.”

I chose to move RobGarrett.com to WordPress.com because WP offers a clean, slick, easy-to-use interface – and the best part, I don’t have to maintain it.  It’s taken me a while to comprehend that the more services one is responsible for, the more headaches one has to deal with (not that my blog was ever a huge burden).  WP affords me the ability to concentrate on blog posting, and never do I have to worry about backing up data, checking the site for errors, or making changes in line with infrastructure changes at my hosting org.

I did consider several other blog engines, especially SharePoint, since this is the focus of my career, but settled on WP because it was free, they offer 3GB of space, and configuration is simple.

The following is a list of pros and cons I have evaluated in the 24 hours since I moved to WP:

Pros

-  Easy to use administration interface
-  Stock templates – if I get bored with the look and feel, I can just change it
-  iPhone application available
-  Never going away (hopefully), infrastructure maintained by WP team
-  Never have to worry about backups again
-  3GB of storage space (can pay for additional space)
-  Stable platform, should never error out

Cons

-  Limited customization ability
-  No Google ads
-  No Google analytics
-  Have to pay extra for custom CSS

Moving my blog posts from CS 2.1 wasn’t as painful as I thought it would be.  I followed a great post from Rob Walling, which led me to use the CS BLOGML export tool from CodePlex, to export all my posts to BLOGML.  Once I exported my content, I was then able to massage the content, convert to WordPress.com WXR format using Damien G’s XSLT (and Visual Studio 2008), and then import the content directly into WP – presto, posts and comments.

The above process did require some hand-holding.  Trawling the web, I found claims of tools that would do the complete migration in one step, but never found a so-called solution that worked.  With some knowledge of ASP.NET (debugging the CS export tool) and XSLT (for the WXR conversion), I was able to weed out posts causing difficulty in the conversion process and pull over a clean set.

I’m not sure if WP has fixed the importer recently, but I read many exasperating complaints about the WXR importer timing out.  I was able to import 300+ posts (about a 2MB file) with no issue.

So… enjoy the new location, and send me feedback about anything you see broken, something you don’t like, or praise for the move 😉

Efficient way to add a new item to a SharePoint list

Never use SPList.Items.Add because this approach gets all items in the list before adding a new SPListItem.  Use the following method instead, which does not preload the list items:

/// <summary>
/// More efficient way of adding an item to a list.
/// </summary>
/// <remarks>
/// GetItems with a query is faster than calling the OM to get all items.
/// This is because the SPListItemCollection is created without loading all the
/// data until the first query request.  Whereas SPList.Items loads all the data
/// at construction.
/// </remarks>
/// <param name="list">List.</param>
/// <returns>List Item Added.</returns>
public static SPListItem OptimizedAddItem(SPList list)
{
    const string EmptyQuery = "0";
    SPQuery q = new SPQuery {Query = EmptyQuery};
    return list.GetItems(q).Add();
}
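
For completeness, here is a hedged usage sketch of the helper above; the site URL and list name are examples only:

using (SPSite site = new SPSite("http://server"))          // example URL
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["Announcements"];              // example list name
    SPListItem item = OptimizedAddItem(list);
    item["Title"] = "Added without loading the whole list";
    item.Update();
}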

SharePoint Development Best Practices (Summary)

I’ve read several blog posts of late regarding best practices for developing SharePoint API-based components.  Some developers are aware of issues surrounding disposal of SPSite and SPWeb objects and the use of SPListItem collections, and some are not.  The simple fact is that the SharePoint API is not intuitive when it comes to usage of memory-hungry object instances, and it is all too easy to leave too many of these objects in memory for the garbage collector to deal with – causing large memory spikes in the site application pool under high traffic.  Furthermore, what seems like innocent, well-structured code can perform badly because of the underlying calls against the SharePoint content databases.  This blog post serves as a reference point and as a quick summary of some of the best practices.

Best Practices: Using Disposable Windows SharePoint Services Objects

Reference

Enable the following registry setting to populate the ULS logs with allocation identifiers to isolate non-disposed SPSite and SPWeb objects:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\HeapSettings
SPRequestStackTrace = 1

Wrap all calls that create a new SPSite or SPWeb object (except those obtained from the SPContext.Current singleton) in try/catch/finally blocks or using statements.

SPContext objects are managed by the SharePoint framework and should not be explicitly disposed in your code. This is true also for the SPSite and SPWeb objects returned by SPContext.Site, SPContext.Current.Site, SPContext.Web, and SPContext.Current.Web.
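
A minimal sketch of both rules, assuming a placeholder site URL:

// Objects you create yourself: wrap them in using statements so they are always disposed.
using (SPSite site = new SPSite("http://server"))
using (SPWeb web = site.OpenWeb())
{
    Console.WriteLine(web.Title);
}

// Objects handed to you by SPContext are owned by SharePoint: use them, never dispose them.
SPWeb contextWeb = SPContext.Current.Web;   // no Dispose() call here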

You must be cautious and aware of what the runtime is doing whenever you combine SharePoint object model calls on the same line. Leaks arising from this scenario are among the hardest to find.
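
The classic example of this kind of leak is sketched below: the SPSite created inline on the first line can never be disposed because no reference to it is kept (the URL is a placeholder).

// Leaky: the intermediate SPSite is unreachable and is never disposed.
SPWeb leakyWeb = new SPSite("http://server").OpenWeb();

// Safe: hold a reference to each object so both can be disposed.
using (SPSite site = new SPSite("http://server"))
using (SPWeb web = site.OpenWeb())
{
    // work with web here
}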

The finally block executes after calls to Response.Redirect in the try block. Response.Redirect ultimately generates a ThreadAbortException. When this exception is raised, the runtime executes all finally blocks before ending the thread. However, because the finally block can do an unbounded computation or cancel the ThreadAbortException, there is no guarantee that the thread will end. Therefore, before any redirection or transfer of processing can occur, you must dispose of the objects. If your code must redirect, implement it in a way similar to the following code example.

String str;
SPSite oSPSite = null;
SPWeb oSPWeb = null;

try
{
    oSPSite = new SPSite("http://server");
    oSPWeb = oSPSite.OpenWeb();

    str = oSPWeb.Title;
    if (bDoRedirection)
    {
        if (oSPWeb != null)
            oSPWeb.Dispose();

        if (oSPSite != null)
            oSPSite.Dispose();

        Response.Redirect("newpage.aspx");
    }
}
catch (Exception e)
{
}
finally
{
    if (oSPWeb != null)
        oSPWeb.Dispose();

    if (oSPSite != null)
        oSPSite.Dispose();
}

 

Good practices to reduce object long-term retention:

  • If you create the object with a new operator, ensure that the creating application disposes of it.
  • Dispose of items created by SharePoint methods that return other SPWeb objects (such as OpenWeb).
  • Do not share any SPRequest object (and by extension any object that contains a reference to an SPRequest object) across threads.

SPSite Object:

  • The SPSiteCollection.Add method creates and returns a new SPSite object. You should dispose of any SPSite object returned from the SPSiteCollection.Add method.
  • The SPSiteCollection [] index operator returns a new SPSite object for each access. An SPSite instance is created even if that object was already accessed.
  • The SPSite.AllWebs.Add method creates and returns an SPWeb object. You should dispose of any SPWeb object returned from SPSite.AllWebs.Add.
  • The SPWebCollection.Add method creates and returns an SPWeb object that needs to be disposed.
  • The SPSite.AllWebs [] index operator returns a new SPWeb instance each time it is accessed.
  • The OpenWeb method and SelfServiceCreateSite method (all signatures) create an SPWeb object and return it to the caller.
  • The Microsoft.Office.Server.UserProfiles.PersonalSite property returns an SPSite object that must be disposed.
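
As a sketch of the AllWebs guidance, each SPWeb handed back by the collection should be disposed by the caller (the site URL is an example):

using (SPSite site = new SPSite("http://server"))
{
    foreach (SPWeb web in site.AllWebs)
    {
        try
        {
            Console.WriteLine(web.Title);
        }
        finally
        {
            web.Dispose();   // every SPWeb returned by AllWebs must be disposed
        }
    }
}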
SPSite.RootWeb Property

An earlier version of this article indicated that the calling application should dispose of the SPSite.RootWeb property just before disposing of the SPSite object that is using it. This is no longer the official guidance. The dispose cleanup is handled automatically by the SharePoint framework. Additionally, SPSite properties LockIssue, Owner, and SecondaryContact used the RootWeb property internally. Given the updated guidance for RootWeb, it is no longer advisable to call the Dispose method on the SPSite.RootWeb property whenever any of these properties are used.

SPWeb Object:

  • The SPWeb.Webs property returns an SPWebCollection object. The SPWeb objects in this collection must be disposed.
  • The SPWeb.Webs.Add method (or Add) creates and returns a new SPWeb object. You should dispose of any SPWeb object returned from this method call.
  • The SPWeb.Webs[] index operator returns a new SPWeb object for each access.
SPWeb.ParentWeb Property

Updated Guidance

An earlier version of this article recommended that the calling application should dispose of the SPWeb.ParentWeb. This is no longer the official guidance. The dispose cleanup is handled automatically by the SharePoint framework.

Other Objects:

  • Microsoft.SharePoint.Portal.SiteData.Area.Web Property.  The Web property returns a new SPWeb object each time it is accessed.
  • If the object is obtained from the SharePoint context objects (GetContextSite method and GetContextWeb method), the calling application should not call the Dispose method on the object.
  • The SPLimitedWebPartManager class contains a reference to an internal SPWeb object that must be disposed.
  • The GetPublishingWebs method of the PublishingWeb class returns a PublishingWebCollection object. You must call the Close method on each enumerated innerPubWeb object. When you are calling only the GetPublishingWeb method, you are not required to call Close.
  • The Microsoft.SharePoint.Publishing.PublishingWeb.GetVariation method returns a PublishingWeb object that must be disposed.

Best Practices: Common Coding Issues When Using the SharePoint Object Model

Reference

Caching Data and Objects

Many developers use the Microsoft .NET Framework caching objects (for example, System.Web.Caching.Cache) to help take better advantage of memory and increase overall system performance. But many objects are not "thread safe," and caching them can cause applications to fail and lead to unexpected or unrelated user errors.

Caching SharePoint Objects That Are Not Thread Safe

You might try to increase performance and memory usage by caching SPListItemCollection objects that are returned from queries. In general, this is a good practice; however, the SPListItemCollection object contains an embedded SPWeb object that is not thread safe and should not be cached.

You can cache a DataTable object that is created from the SPListItemCollection object.
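
A hedged sketch of that approach, with placeholder list and cache key names (production code should also synchronize access to the cache, as the referenced article notes):

// Cache the detached DataTable rather than the SPListItemCollection itself.
DataTable cached = HttpRuntime.Cache["AnnouncementsTable"] as DataTable;     // example key
if (cached == null)
{
    SPList list = SPContext.Current.Web.Lists["Announcements"];              // example list
    SPQuery query = new SPQuery { RowLimit = 100 };
    cached = list.GetItems(query).GetDataTable();
    HttpRuntime.Cache.Insert("AnnouncementsTable", cached, null,
        DateTime.Now.AddMinutes(5), System.Web.Caching.Cache.NoSlidingExpiration);
}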

Using Objects in Event Receivers

Do not instantiate SPWeb, SPSite, SPList, or SPListItem objects within an event receiver. Event receivers that instantiate SPSite, SPWeb, SPList, or SPListItem objects instead of using the instances passed via the event properties can cause the following problems:

  • They incur significant additional roundtrips to the database. (One write operation can result in up to five additional roundtrips in each event receiver.)
  • Calling the Update method on these instances can cause subsequent Update calls in other registered event receivers to fail.
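
A minimal sketch of an item event receiver that sticks to the supplied objects (the class name and field read are examples only):

public class ExampleItemReceiver : SPItemEventReceiver   // hypothetical receiver class
{
    public override void ItemUpdated(SPItemEventProperties properties)
    {
        // Use the objects the event infrastructure already provides;
        // do not new up SPSite/SPWeb/SPList/SPListItem here.
        SPListItem item = properties.ListItem;
        string title = item.Title;
    }
}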

Working with Folders and Lists

Do not use SPList.Items.  SPList.Items selects all items from all subfolders, including all fields in the list.

  • Instead of calling SPList.Items.Add, simply use SPList.AddItem.
  • Retrieve list items using SPList.GetItems(SPQuery query)
  • Instead of using SPList.Items.GetItemById, use SPList.GetItemById(int id, string field1, params string[] fields)

Do not enumerate entire SPList.Items collections or SPFolder.Files collections.

Poor performing methods and properties, and their better performing alternatives:

  • SPList.Items.Count → SPList.ItemCount
  • SPList.Items.XmlDataSchema → Create an SPQuery object to retrieve only the items you want
  • SPList.Items.NumberOfFields → Create an SPQuery object (specifying the ViewFields)
  • SPList.Items[System.Guid] → SPList.GetItemByUniqueId(System.Guid)
  • SPList.Items[System.Int32] → SPList.GetItemById(System.Int32)
  • SPList.Items.ReorderItems → Perform a paged query using SPQuery and reorder items within each page
  • SPList.Items.GetItemById(System.Int32) → SPList.GetItemById(System.Int32)
  • SPFolder.Files.Count → SPFolder.ItemCount
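
A quick sketch of the better-performing calls from that list, assuming list is an SPList reference you already hold (the item ID is an example):

int itemCount = list.ItemCount;                               // instead of list.Items.Count
SPListItem byId = list.GetItemById(42);                       // instead of list.Items.GetItemById(42)
SPListItem byGuid = list.GetItemByUniqueId(byId.UniqueId);    // instead of list.Items[someGuid]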

Use PortalSiteMapProvider (Microsoft Office SharePoint Server 2007 only).

Steve Peschka's white paper Working with Large Lists in Office SharePoint Server 2007 describes an efficient approach to retrieving list data in Office SharePoint Server 2007 by using the PortalSiteMapProvider class.

(Very important as this works around the 2000 item limit)
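
A hedged sketch of the PortalSiteMapProvider approach from that white paper, with placeholder list name and CAML (publishing sites on MOSS 2007 only):

SPWeb curWeb = SPControl.GetContextWeb(HttpContext.Current);

SPQuery query = new SPQuery();
query.Query = "<Where><Eq><FieldRef Name='Title'/><Value Type='Text'>Example</Value></Eq></Where>";

// PortalSiteMapProvider serves the results from its cache where possible.
PortalSiteMapProvider provider = PortalSiteMapProvider.CurrentNavSiteMapProviderNoEncode;
PortalWebSiteMapNode webNode = provider.FindSiteMapNode(curWeb.ServerRelativeUrl) as PortalWebSiteMapNode;
SiteMapNodeCollection results = provider.GetCachedListItemsByQuery(webNode, "ExampleList", query, curWeb);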

Scaling Code

How to make this code more scalable or fine-tune it for a multiple user environment can be a hard question to answer. It depends on what the application is designed to do.

You should take the following into consideration when asking how to make code more scalable:

  • Is the data static (seldom changes), somewhat static (changes occasionally), or dynamic (changes constantly)?

  • Is the data the same for all users, or does it change? For example, does it change depending on the user who is logged on, the part of the site being accessed, or the time of year (seasonal information)?

  • Is the data easily accessible or does it require a long time to return the data? For example, is it returning from a long-running database query or from remote databases that can have some network latency in the data transfers?

  • Is the data public or does it require a higher level of security?

  • What is the size of the data?

  • Is the SharePoint site on a single server or on a server farm?

Using SPQuery Objects

SPQuery objects can cause performance problems whenever they return large result sets. The following suggestions will help you optimize your code so that performance will not suffer greatly whenever your searches return large numbers of items.

  • Do not use an unbounded SPQuery object.
  • An SPQuery object without a value for RowLimit will perform poorly and fail on large lists. Specify a RowLimit between 1 and 2000 and, if necessary, page through the list.
  • Use indexed fields.
  • If you query on a field that is not indexed, the query will be blocked whenever it would result in a scan of more items than the query threshold (as soon as there are more items in the list than are specified in the query threshold). Set SPQuery.RowLimit to a value that is less than the query threshold.
  • If you know the URL of your list item and want to query by FileRef, use SPWeb.GetListItem(string strUrl, string field1, params string[] fields) instead.
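
Putting those suggestions together, a minimal paging sketch, assuming list is an SPList reference you already hold (the CAML and row limit are examples):

SPQuery query = new SPQuery();
query.Query = "<OrderBy><FieldRef Name='ID'/></OrderBy>";
query.RowLimit = 2000;   // always bound the query

do
{
    SPListItemCollection items = list.GetItems(query);
    foreach (SPListItem item in items)
    {
        // process item
    }
    // Advance to the next page; null means there are no more pages.
    query.ListItemCollectionPosition = items.ListItemCollectionPosition;
}
while (query.ListItemCollectionPosition != null);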

Authentication failed because the remote party has closed the transport stream

If you receive the error "Authentication failed because the remote party has closed the transport stream" when accessing "Search Settings" in the SSP, the following steps will resolve the issue.  The issue is the result of an incorrect self-signed certificate.

1. Install the IIS 6.0 Resource Kit on the index server (http://www.microsoft.com/downloads/details.aspx?FamilyID=56fc92ee-a71a-4c73-b628-ade629c89499&DisplayLang=en)
2. Assign a new SSL certificate  to the Office SharePoint Server Web Services site on the index server using the SELFSSL tool from the resource kit.

SELFSSL.EXE /N:CN=<name of index server> /K:1024 /V:<number of days cert should be valid> /S:951338967  /P:56738
The /S and /P parameters specify the web site ID and port number, respectively, of the Office SharePoint Server Web Services site in IIS. They should be set to the appropriate values for your environment.