Importing Best Bets as Query Rule in SharePoint 2013 using PowerShell

Use the below format for the CSV file input:

BestBet,UserContext,Keyword,Description,Url,StartDate,EndDate,Position
"IRS","","Income Tax;Tax;","Internal Revenue Service","http://irs.gov/","","",""


Sample PowerShell Script

Add-PSSnapin AdminSnapIn -erroraction SilentlyContinue
Add-PsSnapin Microsoft.SharePoint.PowerShell -erroraction SilentlyContinue

$sa = "Search Service App Name"
$ssa = Get-SPEnterpriseSearchServiceApplication $sa
$siteurl = "http://siteurl/"
$ruleManager = New-Object Microsoft.Office.Server.Search.Query.Rules.QueryRuleManager($ssa)
$isVisualBB = $false   #use a boolean for the visual best bet (promoted result) flag
$bestBetCSVFile = import-csv "E:\Path\BestBet.csv" -erroraction SilentlyContinue

Write-Host "File Imported"

foreach($row in $bestBetCSVFile)
{
    $bestBetName = $row.BestBet
    $rulename = $bestBetName
    $contextName = $row.UserContext
    $keywordName = $row.Keyword
    $description = $row.Description
    $url = $row.Url
    $startDate = $row.StartDate
    $endDate = $row.EndDate
    $position = $row.Position

    Write-Host $bestBetName $rulename $contextName $keywordName $description $url $startDate $endDate $position

    if($keywordName)
    {
        createBestBetQueryRule -rulename $rulename -keyword $keywordName -ruleManager $ruleManager -bestBetName $bestBetName -contextName $contextName -description $description -url $url -startDate $startDate -endDate $endDate -position $position -isVisualBB $isVisualBB

        Write-Host "Added BestBet '$bestBetName' to Keyword '$keywordName'"
    }
}
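
The script calls a helper function named createBestBetQueryRule that is not included in the post. Below is a rough, hedged sketch of what such a helper could look like, based on the SharePoint 2013 query rule object model (SearchObjectOwner/SearchObjectFilter, QueryRuleCollection.CreateQueryRule, BestBetCollection.CreateBestBet). It ignores the contextName, startDate, endDate and position parameters for brevity; define it before the foreach loop and verify the calls against your environment before use.

#Hypothetical sketch of the createBestBetQueryRule helper used above - verify before use
function createBestBetQueryRule
{
    param($rulename, $keyword, $ruleManager, $bestBetName, $contextName, $description, $url, $startDate, $endDate, $position, $isVisualBB)

    #Scope the query rule and best bet to the site collection
    $site = Get-SPSite $siteurl
    $level = [Microsoft.Office.Server.Search.Administration.SearchObjectLevel]::SPSite
    $owner = New-Object Microsoft.Office.Server.Search.Administration.SearchObjectOwner($level, $site.RootWeb)
    $filter = New-Object Microsoft.Office.Server.Search.Administration.SearchObjectFilter($owner)

    #Create a manual query rule and add one keyword condition per semicolon-separated term
    $rules = $ruleManager.GetQueryRules($filter)
    $rule = $rules.CreateQueryRule([Microsoft.Office.Server.Search.Query.Rules.QueryRuleType]::Manual, $null, $null, $true)
    $rule.DisplayName = $rulename
    foreach($term in ($keyword -split ";" | Where-Object { $_ }))
    {
        [void]$rule.QueryConditions.CreateKeywordCondition($term.Trim(), $true)
    }

    #Create the best bet (promoted result) and attach it to the rule
    $bestBets = $ruleManager.GetBestBets($filter)
    $bestBet = $bestBets.CreateBestBet($bestBetName, (New-Object System.Uri($url)), $description, $isVisualBB)
    $action = $rule.CreateQueryAction([Microsoft.Office.Server.Search.Query.Rules.QueryActionType]::AssignBestBet)
    [void]$action.BestBetIds.Add($bestBet.Id)
    $rule.Update()
}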


Set Permission to List Items using PowerShell


Add-PSSnapin Microsoft.SharePoint.Powershell

$url = "http://weburl"
$listName = "List Name";
$permissionLevel = "Permission Level Name";
$groupName = "Enter Group Name";

#Grant permission on all uniquely secured list items to the specified group

$web = Get-SPWeb $url;
$list = $web.Lists[$listName];
$permission = $web.RoleDefinitions[$permissionLevel];
$principal = $web.SiteGroups[$groupName];

#Process each list item

foreach ($item in $list.Items) {
    Write-Output ("Item #" + $item.ID.ToString());

    #Check to see whether the item is uniquely secured
    if ($item.HasUniqueRoleAssignments -eq $false) {
        Write-Output "  No change, permissions are inherited.";
    }
    else {
        #Find an existing role assignment for this principal
        $assignments = $item.RoleAssignments;
        $assignment = $assignments | where {$_.Member.Name -eq $principal.Name};

        if ($assignment -eq $null) {
            #Add a new role assignment for the principal
            $assignment = new-object Microsoft.SharePoint.SPRoleAssignment($principal);
            $assignment.RoleDefinitionBindings.Add($permission);
            $assignments.Add($assignment);
            Write-Output ("  Granted " + $permissionLevel + " to " + $groupName);
        }
        elseif ($assignment.RoleDefinitionBindings.Contains($permission) -ne $true) {
            #Update the principal's role assignment to add the desired permission level
            $assignment.RoleDefinitionBindings.Add($permission);
            $assignment.Update();
            Write-Output ("  Updated " + $groupName + " permissions to " + $permissionLevel);
        }
        else {
            Write-Output "  No change.";
        }
    }
}
$web.Dispose();

Migrate data from Source to Destination Library without changing the audit trail column values – PowerShell

Add-PSSnapin "Microsoft.SharePoint.Powershell" -ErrorAction SilentlyContinue

$sourceWebURL = "http://srcurl"
$sourceListName = "Source"

$destinationWebURL = "http://desurl"
$destinationListName = "Destination"

$spSourceWeb = Get-SPWeb $sourceWebURL
$spDestinationWeb = Get-SPWeb $destinationWebURL

$spSourceList = $spSourceWeb.Lists[$sourceListName]
$spDestinationList = $spDestinationWeb.Lists[$destinationListName]

$RootFolder = $spDestinationList.RootFolder

$spSourceItems = $spSourceList.Items

ForEach ($item in $spSourceItems)
{
    Try
    {
        $sBytes = $item.File.OpenBinary()

        if ($sBytes -ne $null)
        {
            [Microsoft.SharePoint.SPFile]$spFile = $RootFolder.Files.Add($item.Name, $sBytes, $true)
            $theItem = $spFile.Item

            #Person fields are stored as "ID;#account"; take the part after "#" and prefix the domain (here "amat")
            $pos = $item["Author"].IndexOf("#")
            $userAuthorLogin = "amat\"+$item["Author"].Substring($pos+1)

            $pos1 = $item["Editor"].IndexOf("#")
            $userEditorLogin = "amat\"+$item["Editor"].Substring($pos1+1)

            $dateCreatedToStore = Get-Date $item["Created"]
            $dateModifiedToStore = Get-Date $item["Modified"]

            $userAuthor = Get-SPUser -Web $spDestinationWeb | ? {$_.UserLogin -eq $userAuthorLogin}
            $userAuthorString = "{0};#{1}" -f $userAuthor.ID, $userAuthor.UserLogin.ToString()

            $userEditor = Get-SPUser -Web $spDestinationWeb | ? {$_.UserLogin -eq $userEditorLogin}
            $userEditorString = "{0};#{1}" -f $userEditor.ID, $userEditor.UserLogin.ToString()

            #Set the Created By field and created date
            $theItem["Author"] = $userAuthorString
            $theItem["Created"] = $dateCreatedToStore

            #Set the Modified By field and modified date
            $theItem["Editor"] = $userEditorString
            $theItem["Modified"] = $dateModifiedToStore

            #Store changes without overwriting the Created/Modified details set above
            $theItem.UpdateOverwriteVersion()

            write-host -f Green "...Success!"
        }
    }
    Catch [system.exception]
    {
        write-host "Caught a system exception for " $item.ID $item.Title
    }
}

#Dispose the webs only after all items have been processed
$spSourceWeb.Dispose();
$spDestinationWeb.Dispose();
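
One assumption in the script above is that the author and editor accounts already exist in the destination web, so the Get-SPUser filter finds them. If that is not guaranteed, a small variation using SPWeb.EnsureUser can be used instead (a sketch; $userAuthorLogin and $spDestinationWeb as in the script above):

#Ensure the account exists in the destination web, then build the "ID;#login" string
$userAuthor = $spDestinationWeb.EnsureUser($userAuthorLogin)
$userAuthorString = "{0};#{1}" -f $userAuthor.ID, $userAuthor.UserLogin.ToString()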

Migrate data from CSV to SharePoint List – PowerShell
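
The script below expects a CSV file with Name and Age columns (the values here are placeholders), for example:

Name,Age
John,30
Jane,25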


Add-PSSnapin Microsoft.SharePoint.PowerShell -EA SilentlyContinue

$FilePathCSV = import-csv D:\Book1.csv

$webURL = "http://url"
$listName = "CSVInput"
$web = Get-SPWeb $webURL
$list = $web.Lists[$listName]

foreach($CurrentRow in $FilePathCSV)
{
    $field1 = $CurrentRow.Name
    $field2 = $CurrentRow.Age

    $newItem = $list.Items.Add()
    $newItem["Title"] = $field1
    $newItem["Age"] = $field2

    #Attach the source CSV file to the new item as a sample attachment
    $filebytes = [System.IO.File]::ReadAllBytes("D:\Book1.csv")
    $newItem.Attachments.Add("DummyFile Title",$filebytes)
    $newItem.Update()
}

 

Adding SharePoint Site as a Search Provider in Internet Explorer

AddSearchProvider was introduced in Windows Internet Explorer 7.

This method opens a dialog box that enables the user to add the provider to their registry, and optionally set it as the default search provider. The maximum number of search providers that can be installed is 200.

The sUrl parameter allows http:, https:, or ftp: protocol schemes only. Additionally, the URL must be in a security zone that permits downloading.

Search providers must use the HTTP GET request method; the POST request method is not supported.

The "Provider.xml" file must be valid XML. Be sure to encode all reserved characters in the query string, in particular escaping "&" as "&amp;".

The query string must contain "q={searchTerms}". When Internet Explorer 8 navigates to the provider to get search results, "{searchTerms}" will be replaced by the query that the user typed into the Instant Search box.

To add a SharePoint search site as a search provider in Internet Explorer (just like Bing or Google), we have to use the below syntax:

window.external.AddSearchProvider(sUrl)

Here sUrl is a string that specifies an absolute or relative URL to the OpenSearch Description file for the search provider.

The following code adds the search provider when the button is clicked:

<INPUT TYPE="button" VALUE="SPRIDER Search Provider" onClick='window.external.AddSearchProvider("/_layouts/15/SPRIDER/XML/SPRIDER_IE_Provider.xml");'>

This can also be advertised automatically by embedding the following link tag in the master page or in a web part zone (for example with a Content Editor Web Part), as shown below:

<link title="My Provider Name" rel="search" type="application/opensearchdescription+xml" href="/_layouts/15/SPRIDER/XML/SPRIDER_IE_Provider.xml"/>

Sample OpenSearch Description file content

<?xml version="1.0" encoding="UTF-8"?>
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>SPRIDER Search</ShortName>
  <Description>SPRIDER Search provider for Internet Explorer</Description>
  <InputEncoding>UTF-8</InputEncoding>
  <Url type="text/html" template="http://sprider/Pages/results.aspx?k={searchTerms}"/>
  <Url type="application/x-suggestions+xml" template="http://sprider/_layouts/15/amat/SuggestionProvider.aspx?k={searchTerms}"/>
  <Image height="16" width="16" type="image/x-icon">http://sprider/_layouts/15/SPRIDER/Images/spriderfavicon.ico</Image>
</OpenSearchDescription>

Here I have used a customized suggestion response page (SuggestionProvider.aspx) to send an empty response message when no suggestions are found.

SharePoint 2010 not showing results – large files

Problem Statement:

Our current search does not seem to include older Office versions (e.g. ppt) in SharePoint 2010 Search results.  Why did we limit this?

Issue:

When I received the above-mentioned problem statement, I started exploring the following:

  • Is there a problem in the crawl, or are any restrictions configured for specific file types? – Test passed
  • Is the Browser Locale setting in the Search Core Results web part trimming results for a specific location? – Test passed
  • Is the Remove Duplicate Results setting in the Search Core Results web part causing any issue? – Test passed
  • Compared the sizes of the files that appear in the search results with the problem files – Test partially failed; the missing files are always bigger than the files that do appear in the search results.

I stopped here and started exploring whether there is a limitation in the crawler settings for large files, and found that by default SharePoint only crawls file contents smaller than 16 MB.

So when SharePoint crawls files from a document library or list, if a file exceeds this 16 MB limit it only crawls the basic metadata associated with that list/library, such as Title, Created By and Modified By. The contents inside the file are not crawled.

To increase this limit (the value is specified in MB) we can run the following PowerShell script. Keep in mind that a larger limit increases crawl time, so analyze the environment before making this change.

$ssa = Get-SPEnterpriseSearchServiceApplication
$ssa.SetProperty("MaxDownloadSize", 25)
$ssa.Update()

We can also set this limit for a specific file type, as shown below for Excel files:

$ssa.SetProperty("MaxDownloadSizeExcel", 25)
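
To check the current limits before changing them, the service application also exposes a GetProperty method (a quick sketch, assuming the same property names used above):

#Check the current crawler download limits (values are in MB)
$ssa = Get-SPEnterpriseSearchServiceApplication
$ssa.GetProperty("MaxDownloadSize")
$ssa.GetProperty("MaxDownloadSizeExcel")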

 

Manage SharePoint incremental crawl index deletion

Problem

In SharePoint 2013, we have a content source that connects via BCS. The external table holds around 7.7 million records. A full crawl is scheduled to crawl the BCS items every month, and an incremental crawl runs every 4 hours.

In our environment the full crawl takes 18 hours to crawl the 7.7 million records, and the incremental crawl takes 2-3 hours depending on the number of items modified in the external table.

Things were moving fine until the incremental crawl deleted all the indexed items for the content source after it failed (due to a permission issue) to crawl the BCS items. When we checked the crawl log we found that the incremental crawl had only triggered delete operations in the SharePoint Search service.

Finally we came to know that Microsoft enforces policies in the SharePoint Search service that delete items from the index under certain conditions. I am not sure whether this behavior can be disabled, but we can increase the thresholds through PowerShell (see the sketch after the table below).

If somebody knows how to disable this setting / policy please let me know….

Solution / Workaround

The article below explains this behavior and the related properties:

http://technet.microsoft.com/en-us/library/hh127009.aspx

However, a few of the default values changed in the 2013 version of SharePoint.

Comparison of Default Values between 2010 and 2013:

Thanks to Simon's blog for sharing this information.

Property                     SP 2010      SP 2013
ErrorDeleteCountAllowed      30 times     10 times
ErrorDeleteIntervalAllowed   720 hours    240 hours
ErrorCountAllowed            100 times    15 times
ErrorIntervalAllowed         1440 hours   360 hours
RecrawlErrorCount            10 times     5 times
RecrawlErrorInterval         360 hours    120 hours
DeleteUnvisitedMethod        1            1
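
As a rough sketch of how these thresholds can be raised (the property names are those documented in the TechNet article above; the example values simply restore the SharePoint 2010 defaults from the table, so pick numbers that suit your environment):

#Raise the crawl error/delete thresholds on the Search service application
#(example values only - these restore the SharePoint 2010 defaults)
$ssa = Get-SPEnterpriseSearchServiceApplication
$ssa.SetProperty("ErrorDeleteCountAllowed", 30)
$ssa.SetProperty("ErrorDeleteIntervalAllowed", 720)
$ssa.SetProperty("ErrorCountAllowed", 100)
$ssa.SetProperty("ErrorIntervalAllowed", 1440)
$ssa.Update()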