
SharePoint Gone Wild: When Governance Lacks Discoverability | Part 3

Part three of the SharePoint Gone Wild series

If you missed any previous part of this series, you can read them here.

In the series so far, we've talked about how accountability, quality, appropriateness, and restrictions have all been key drivers for why we need to focus on governance for Microsoft SharePoint within our organizations.

To continue this series, I want to focus on another key area called Discoverability. What does this term mean? Simply put, it is how easily users can find the content they need within SharePoint. Unfortunately, this also happens to be one of the biggest pain points within organizations today.

Now, you would think with an enterprise search engine, managed metadata service application, content types with site columns, and versioning that this wouldn't be so difficult. In reality, however, these features can make finding content even harder than searching a file system, because the search crawlers do such a good job of surfacing everything!

The biggest problem with search surfacing everything is that old data comes up in the search and, as discussed in my post on restrictions, unsecured data also bubbles up to the surface and creates noise for users who shouldn't see it.

Unfortunately, although managed metadata is great when applied to list items, it skews results when some list items carry the metadata and others do not: untagged items simply fail to appear when search filters are applied. Unless you enforce managed metadata as a required field on list items, this imbalance will always occur. Don't think that your job stops at putting a red star next to the managed metadata field, either: even if you enforce required field entries, one user may add a single term while another adds ten, preserving the same kind of imbalance.
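To see the imbalance concretely, a small script can audit how consistently a tagging field is populated across a library. This is just a sketch: the field name "Tags" and the sample items are hypothetical, and in a real farm the items would be fetched from SharePoint's REST API (e.g., /_api/web/lists) rather than hard-coded.

```python
# Audit how evenly a managed-metadata-style tag field is populated.
# The "Tags" field name and item data below are hypothetical; in practice
# the items would come from SharePoint's REST list-items endpoint.

def audit_tag_coverage(items, field="Tags"):
    """Return (untagged_count, min_tags, max_tags) across tagged items."""
    counts = [len(item.get(field) or []) for item in items]
    untagged = sum(1 for c in counts if c == 0)
    tagged = [c for c in counts if c > 0]
    return untagged, (min(tagged) if tagged else 0), (max(tagged) if tagged else 0)

# One untagged item, one lightly tagged, one heavily tagged -- exactly
# the kind of spread that makes filtered search results misleading.
items = [
    {"Title": "Budget.xlsx", "Tags": []},
    {"Title": "Plan.docx",   "Tags": ["finance"]},
    {"Title": "Spec.docx",   "Tags": ["finance", "2013", "draft", "spec"]},
]
print(audit_tag_coverage(items))  # (1, 1, 4)
```

A report like this is a cheap way to show content owners where required-field enforcement or tagging education is needed before relying on refiners.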

Another common issue comes into play when the enterprise search within SharePoint is enabled and crawls all content without any parameters. It is essential that you "tune" the configuration with the use of scopes, keywords, and exclusions in order to make results more relevant.

As with any document repository, there are often duplicate documents scattered across locations where people accountable for content have placed it in the wrong spot in the information architecture. Although the search results page is smart enough to remove EXACT duplicates, if there are a few slightly different versions littered around the farm, these will still produce duplicate results.
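The duplicate trimming mentioned above is exposed as the trimduplicates parameter on the SharePoint 2013 Search REST API, and results can also be fenced to a location with the KQL path: property. The sketch below only builds the query URL (the site address is a placeholder), so it can be read without a live farm:

```python
from urllib.parse import urlencode

# Build a SharePoint Search REST query URL that trims duplicates and
# restricts results to one path. The site URL below is a placeholder;
# querytext, trimduplicates, and the KQL "path:" property are part of
# the SharePoint 2013 Search REST API / keyword query language.

def build_search_url(site, text, path=None, trim_duplicates=True):
    query = f"{text} path:{path}" if path else text
    params = {
        "querytext": f"'{query}'",
        "trimduplicates": "true" if trim_duplicates else "false",
    }
    return f"{site}/_api/search/query?" + urlencode(params)

url = build_search_url(
    "https://intranet.example.com",  # hypothetical site URL
    "governance plan",
    path="https://intranet.example.com/sites/policies",
)
print(url)
```

Note that trimming only catches exact duplicates; near-identical copies in different libraries still need to be cleaned up at the source.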

When I go out and work with customers on this issue, they are always after the silver bullet. Unfortunately, this just isn't an easy problem to solve. Here are some of the things that I've seen customers doing using native SharePoint functionality in order to improve discoverability:

  • Information architecture planning - As I've already stressed in this post - as well as others - planning the appropriate structure for site collections, sub sites, lists, and libraries is very important.
  • Managed metadata taxonomies - The best way to get up and running with managed metadata is to provide one taxonomy for users, and then educate them on its power for discoverability. The easiest way to demonstrate this is with the search refiners on the results page, which allow you to narrow your search by term.
  • Tuning search - As discussed, tuning the search can involve various configurations. The best approach to this is identifying key search use cases and customizing results pages specifically for them with requisite scopes and targeted advanced search fields. People search is a good out-of-the-box example of this.
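The refiner experience described in the list above can also be driven directly through the Search REST API: the refiners parameter asks search to compute refiner buckets, and refinementfilters narrows results to one bucket. This is a sketch only; the site URL is a placeholder, while the parameter names and the fileExtension refiner syntax come from the SharePoint 2013 Search REST API.

```python
from urllib.parse import urlencode

# Sketch of narrowing a search the way the refinement panel does.
# "refiners" asks search to compute buckets (here, by FileType);
# "refinementfilters" then narrows results to one bucket (.docx files).
# The site URL is a placeholder for illustration.

site = "https://intranet.example.com"  # hypothetical
params = {
    "querytext": "'governance'",
    "refiners": "'FileType'",
    "refinementfilters": "'fileExtension:equals(\"docx\")'",
}
url = f"{site}/_api/search/query?" + urlencode(params)
print(url)
```

Wiring a targeted results page to a query like this, with its own scope and refiners, is the "tuning" step in practice: each key search use case gets its own pre-filtered entry point.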

Michael Pisarek also discusses some great points on search governance on his blog, and I recommend checking out his post. If you are not already subscribed to the SharePointPodShow.com I'd highly recommend doing so, as they did a great podcast on search with Josh Noble that is well worth a listen.

More Stories By Jeremy Thake

Jeremy Thake is AvePoint's Chief Architect. Jeremy’s 10-plus years of experience in the software development industry, along with his expertise in Microsoft technologies, earned him the label of “expert” in the global SharePoint community. He was named a Microsoft SharePoint MVP in 2009, and continues to work directly with enterprise customers and AvePoint’s research & development team to develop solutions that will set the standard for the next generation of collaboration platforms, including Microsoft SharePoint 2013.

Jeremy was one of only eight Microsoft MVPs from Australia, where he lived for seven years, who was recognized by the SharePoint Product Team in 2010 for his extensive contributions to the global SharePoint community. He also played an instrumental role in organizing the Perth SharePoint User Group during his time living there.