
16/08/2013

Basics of SharePoint Search

In this post we discuss the basics of SharePoint search.
Below are the concepts we cover in this post.
  • Components of Search
  • Enabling / Understanding Search Service Application
  • SharePoint Content Source
  • Crawl Rules
  • Crawl Logs
  • Setting-up Enterprise Search
  • Crawl Properties / Search Schema
  • Search Managed Properties / Search Schema





You will find this kind of information on any number of sites and forums, but most of it doesn't help you kick-start an implementation. So I thought about how a newbie can understand SharePoint search, and came up with the approach below.

Before jumping into the decision to use search, let's understand why we need to take all the pain of setting up the infrastructure. We already have SPQuery and SPSiteDataQuery. Then why Search?

SPQuery - queries a single list.
SPSiteDataQuery - queries multiple lists across a site collection.
Search - queries across the entire farm.

Now you know when to choose Search.
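To make the comparison concrete, here is a minimal PowerShell sketch of the three options, run in the SharePoint Management Shell on a farm server. The URL, list name, and field name are the examples from this post, not fixed values:

```powershell
# Placeholders: http://mywebapp, the "Applications" library, and its Applicant field
$web = Get-SPWeb "http://mywebapp"

# SPQuery: one list
$q = New-Object Microsoft.SharePoint.SPQuery
$q.Query = "<Where><Eq><FieldRef Name='Applicant'/><Value Type='Text'>Applicant1</Value></Eq></Where>"
$items = $web.Lists["Applications"].GetItems($q)

# SPSiteDataQuery: many lists across one site collection
$sdq = New-Object Microsoft.SharePoint.SPSiteDataQuery
$sdq.Lists      = "<Lists ServerTemplate='101'/>"   # all document libraries
$sdq.Webs       = "<Webs Scope='SiteCollection'/>"
$sdq.Query      = $q.Query
$sdq.ViewFields = "<FieldRef Name='Title'/>"
$table = $web.GetSiteData($sdq)

# Search: the entire farm, through the Search API
$kq = New-Object Microsoft.Office.Server.Search.Query.KeywordQuery($web.Site)
$kq.QueryText = "Applicant1"
$results = $kq.Execute()   # SP2010 style; SP2013 favors SearchExecutor
```

Notice how the scope widens at each step while the query itself stays roughly the same shape.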

Let's begin with a regular site with a documents library. I have a web application named "MyWebApp" with a site collection holding a document library "Applications", shown below.



The first step is to enable the Search Service Application. For that, go to Central Admin => Manage Service Applications => you can see whether a Search Service Application already exists.


If one doesn't exist, create it using the New button in the top-left corner. Once the application is created and started, click on it and you will see a page like the one below, from which you can control the complete search configuration.
Take a look at all the links on this page.


Content Source
As the name implies, this is the core place where you control the visibility of data in search.
By default there will be at least one content source, "Local SharePoint Sites", which holds all the site collections under a single source.
I don't want such a generic source, so I deleted "MyWebApp" from that content source.
Then I created a new content source exclusively for my site collection.
Have a look at the "New Content Source" screen.
From this screen you can see that you control both the content visibility and the crawl frequency. This is the reason I separated my web application: to give it a higher incremental-crawl frequency for better search results.
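The same setup can be scripted. This is a minimal PowerShell sketch, assuming the service application is named "Search Service Application" and "MyWebApp Source" is a name we choose:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication "Search Service Application"

# A dedicated content source for just this web application
New-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa `
    -Name "MyWebApp Source" -Type SharePoint -StartAddresses "http://mywebapp"

# Give it a frequent incremental schedule (every 30 minutes, all day)
Set-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa `
    -Identity "MyWebApp Source" -ScheduleType Incremental `
    -DailyCrawlSchedule -CrawlScheduleRepeatInterval 30 -CrawlScheduleRepeatDuration 1440
```

Remember to also remove http://mywebapp from the start addresses of "Local SharePoint Sites", otherwise the content gets crawled twice.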


Crawl Rules
As the name signifies, this is where we control how the crawling is done.
As you can see, I have only one rule, which crawls all sites over the http protocol using my search account. Sometimes you will be in a situation where you need to crawl an https site, for which you may need a client certificate instead of a Windows account.

Below is how you can control the crawl on a secure site.
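For reference, the same rules can be created from PowerShell. A hedged sketch, where the https URL and certificate name are placeholders:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication

# Inclusion rule: crawl every http site with the default content access account
New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa -Path "http://*" -Type InclusionRule

# Inclusion rule for a secure site, using a client certificate instead of a Windows account
New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa -Path "https://securesite/*" `
    -Type InclusionRule -AuthenticationType CertificateRuleAccess -CertificateName "MyCrawlCert"
```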

Once the crawl rules are set up, we just need to run a crawl so that the content becomes searchable.
For the first time, go to the content source and trigger a full crawl.

The status will then change from Idle => Starting => Crawling Full => Completing => Idle.
This means crawling is done. Before jumping in to check the search results, we should verify how the crawl went.
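Triggering and watching the full crawl can also be scripted. A sketch, assuming the dedicated content source created earlier:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication
$cs  = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "MyWebApp Source"
$cs.StartFullCrawl()

# Poll until the source returns to Idle
do {
    Start-Sleep -Seconds 10
    $cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "MyWebApp Source"
} while ($cs.CrawlState -ne "Idle")
```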


Crawl Logs

The image speaks for itself. All we need to ensure is that there are no errors for the specific content source. If there are any, click on the number below Errors and you will be taken to the list of errors that occurred while crawling.

The most common mistake is an incorrect password for the search account on the Search Administration screen. To correct it, click on the account name and set the password.
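The crawl log can also be read from code, via the CrawlLog administration class. The argument values below (row limit, error level, the -1 "all" sentinels) are assumptions on my part, so check the GetCrawledUrls documentation before relying on them:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication
$log = New-Object Microsoft.Office.Server.Search.Administration.CrawlLog($ssa)

# (getCountOnly, maxRows, urlQueryString, isLike, contentSourceID, errorLevel, errorID, start, end)
$errors = $log.GetCrawledUrls($false, 100, "http://mywebapp", $true, -1, 2, -1,
                              [DateTime]::MinValue, [DateTime]::MaxValue)
```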

Now our content is available in search. How do we search it? Either using the Search API with custom code, or a generic search application. I'll choose option 2 for now.


Setting up an Enterprise Search application
This is the easiest task of all. Create a new web application and a site collection under it, choosing the "Enterprise Search" template while creating it.
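From PowerShell this is a single cmdlet. The URL and owner alias are placeholders; SRCHCEN#0 is the Enterprise Search Center template code:

```powershell
New-SPSite -Url "http://mysearchapp" -OwnerAlias "DOMAIN\spadmin" `
    -Template "SRCHCEN#0" -Name "Enterprise Search Center"
```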
When I search for "Applicant1", below are the results.


You can see that it fetches all the records: the ones whose file name is Applicant1, and the others whose Applicant field is "Applicant1".
What if I want only the records whose Applicant field is "Applicant1"? This is where we need the concepts of managed properties and crawl properties for a refined search.


Managed Properties & Crawl Properties or Search Schema (SP13)
Crawl Properties: This is the data captured while crawling, e.g., the Applicant field of our Applications document library.
In SP2010:
Go to Search Administration Screen => Managed Properties => Crawl Properties

In SP2013:
Go to Search Administration Screen => Search Schema => Crawl Properties

Managed Properties: This is the property on which a user can do a custom search; we need to map it to a crawl property.
In our case, we will create a new managed property "APPLICANT" which maps to the Applicant field of our document library.
You can see that the crawl property "ows_Applicant" is mapped to "APPLICANT". Now the search changes as below.
You can see that the results are fetched based on the value of the Applicant field, and the other records with the file name "Applicant1" are excluded.
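The property creation and mapping can be scripted as well. A sketch, where Type 1 is the Text managed-property type and ows_Applicant is the crawled property SharePoint generates for the Applicant list field:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication

# New Text managed property named APPLICANT
$mp = New-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Name "APPLICANT" -Type 1

# Map the crawled property generated for the Applicant list field onto it
$cp = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa -Name "ows_Applicant"
New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa -ManagedProperty $mp -CrawledProperty $cp
```

After the next full crawl, a property-restricted query such as APPLICANT:"Applicant1" returns only the matching records.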

This is just the tip of the iceberg that is "SharePoint Search". In the next post we will see how to implement custom search using the SP Search API, to get complete control over the result set and its display.

For now, I hope this gives an initial kick start on SP Search.
