Importing Best Bets as Query Rules in SharePoint 2013 using PowerShell

Use the following format for the CSV file input. Because the script reads columns by name, the file must include a header row whose column names match those referenced in the script (BestBet, UserContext, Keyword, Description, Url, StartDate, EndDate, Position):

"IRS","","Income Tax;Tax;","Internal Revenue Service","","","",""
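If the CSV is saved without a header row (as in the sample line above), Import-Csv can supply the column names through its -Header parameter. The column order below is an assumption based on the sample row; verify it against your actual file layout:

```powershell
# Assumed column order -- adjust to match your CSV
$columns = "BestBet", "UserContext", "Keyword", "Description", "Url", "StartDate", "EndDate", "Position"
$bestBetCSVFile = Import-Csv "E:\Path\BestBet.csv" -Header $columns

# Each row is then addressable by column name, for example:
$bestBetCSVFile | ForEach-Object { $_.BestBet }
```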

Sample PowerShell Script

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$sa = "Search Service App Name"
$ssa = Get-SPEnterpriseSearchServiceApplication $sa
$siteurl = "http://siteurl/"
$ruleManager = New-Object Microsoft.Office.Server.Search.Query.Rules.QueryRuleManager($ssa)
$isVisualBB = $false
$bestBetCSVFile = Import-Csv "E:\Path\BestBet.csv" -ErrorAction SilentlyContinue

Write-Host "File Imported"

foreach ($row in $bestBetCSVFile)
{
    $bestBetName = $row.BestBet
    $rulename = $bestBetName
    $contextName = $row.UserContext
    $keywordName = $row.Keyword
    $description = $row.Description
    $url = $row.Url
    $startDate = $row.StartDate
    $endDate = $row.EndDate
    $position = $row.Position

    Write-Host $bestBetName $rulename $contextName $keywordName $description $url $startDate $endDate $position

    # createBestBetQueryRule is a custom helper function assumed to be defined
    # earlier in the script; its body is not shown here.
    createBestBetQueryRule -rulename $rulename -keyword $keywordName -ruleManager $ruleManager -bestBetName $bestBetName -contextName $contextName -description $description -url $url -startDate $startDate -endDate $endDate -position $position -isVisualBB $isVisualBB

    Write-Host "Added BestBet '$bestBetName' to Keyword '$keywordName'"
}


Set Permissions on List Items using PowerShell

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$url = "http://weburl"
$listName = "List Name"
$permissionLevel = "Permission Level Name"
$groupName = "Enter Group Name"

#Grant permission on all uniquely secured list items to the specified group
$web = Get-SPWeb $url
$list = $web.Lists[$listName]
$permission = $web.RoleDefinitions[$permissionLevel]
$principal = $web.SiteGroups[$groupName]

#Process each list item
foreach ($item in $list.Items) {
    Write-Output ("Item #" + $item.ID.ToString())

    #Check to see whether the item is uniquely secured
    if ($item.HasUniqueRoleAssignments -eq $false) {
        Write-Output " No change, permissions are inherited."
    }
    else {
        #Find an existing role assignment for this principal
        $assignments = $item.RoleAssignments
        $assignment = $assignments | Where-Object { $_.Member.Name -eq $principal.Name }

        if ($assignment -eq $null) {
            #Add a new role assignment for the principal
            $assignment = New-Object Microsoft.SharePoint.SPRoleAssignment($principal)
            $assignment.RoleDefinitionBindings.Add($permission)
            $assignments.Add($assignment)
            Write-Output (" Granted " + $permissionLevel + " to " + $groupName)
        }
        elseif ($assignment.RoleDefinitionBindings.Contains($permission) -ne $true) {
            #Update the principal's role assignment to add the desired permission level
            $assignment.RoleDefinitionBindings.Add($permission)
            $assignment.Update()
            Write-Output (" Updated " + $groupName + " permissions to " + $permissionLevel)
        }
        else {
            Write-Output " No change."
        }
    }
}

$web.Dispose()

Migrate data from Source to Destination Library without changing the audit trail column values – PowerShell

Add-PSSnapin "Microsoft.SharePoint.Powershell" -ErrorAction SilentlyContinue

$sourceWebURL = "http://srcurl"
$sourceListName = "Source"

$destinationWebURL = "http://desurl"
$destinationListName = "Destination"

$spSourceWeb = Get-SPWeb $sourceWebURL
$spDestinationWeb = Get-SPWeb $destinationWebURL

$spSourceList = $spSourceWeb.Lists[$sourceListName]
$spDestinationList = $spDestinationWeb.Lists[$destinationListName]

$RootFolder = $spDestinationList.RootFolder

$spSourceItems = $spSourceList.Items

ForEach ($item in $spSourceItems)
{
    Try
    {
        $binary = $item.File.OpenBinary()

        if ($binary -ne $null)
        {
            #Copy the file to the destination library
            [Microsoft.SharePoint.SPFile]$spFile = $RootFolder.Files.Add($item.Name, $binary, $true)
            $theItem = $spFile.Item

            #Extract the login names from the claims-encoded Author/Editor values
            $pos = $item["Author"].IndexOf("#")
            $userAuthorLogin = "amat\" + $item["Author"].Substring($pos + 1)

            $pos1 = $item["Editor"].IndexOf("#")
            $userEditorLogin = "amat\" + $item["Editor"].Substring($pos1 + 1)

            $dateCreatedToStore = Get-Date $item["Created"]
            $dateModifiedToStore = Get-Date $item["Modified"]

            $userAuthor = Get-SPUser -Web $spDestinationWeb | ? { $_.UserLogin -eq $userAuthorLogin }
            $userAuthorString = "{0};#{1}" -f $userAuthor.ID, $userAuthor.UserLogin.ToString()

            $userEditor = Get-SPUser -Web $spDestinationWeb | ? { $_.UserLogin -eq $userEditorLogin }
            $userEditorString = "{0};#{1}" -f $userEditor.ID, $userEditor.UserLogin.ToString()

            #Set the Created By values
            $theItem["Author"] = $userAuthorString
            $theItem["Created"] = $dateCreatedToStore

            #Set the Modified By values
            $theItem["Editor"] = $userEditorString
            $theItem["Modified"] = $dateModifiedToStore

            #Store the changes without overwriting the audit trail values
            $theItem.UpdateOverwriteVersion()

            Write-Host -f Green "...Success!"
        }
    }
    Catch [System.Exception]
    {
        Write-Host "Caught a system exception for" $item.ID $item.Title
    }
}

Migrate data from CSV to SharePoint List – PowerShell

Add-PSSnapin Microsoft.SharePoint.PowerShell -EA SilentlyContinue

$FilePathCSV = Import-Csv "D:\Book1.csv"

$webURL = "http://url"
$listName = "CSVInput"
$web = Get-SPWeb $webURL
$list = $web.Lists[$listName]

foreach ($currentRow in $FilePathCSV)
{
    $field1 = $currentRow.Name
    $field2 = $currentRow.Age

    $newItem = $list.Items.Add()
    $newItem["Title"] = $field1
    $newItem["Age"] = $field2
    $newItem.Update()

    #Attach the source CSV file to the new item
    $filebytes = [System.IO.File]::ReadAllBytes("D:\Book1.csv")
    $newItem.Attachments.Add("DummyFile Title", $filebytes)
    $newItem.Update()
}
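The loop above reads the Name and Age columns by name, so D:\Book1.csv is expected to start with a matching header row. The data rows below are placeholder values for illustration:

```csv
Name,Age
Sample User,30
Another User,42
```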

SharePoint 2010 not showing search results – large files

Problem Statement:

Our current search does not seem to include older Office versions (e.g. .ppt) in SharePoint 2010 search results. Why did we limit this?


When I received the above problem statement, I started exploring the following:

  • Is there a problem in the crawl, or are restrictions configured for specific file types? – Test passed
  • Is the Browser Locale setting in the Search Core Results web part trimming results for some locations? – Test passed
  • Is the Remove Duplicate Results setting in the Search Core Results web part causing the issue? – Test passed
  • Compared the size of the files appearing in the search results with the problem files – Test partially failed: the missing files are always bigger than the files that do appear in the search results.

I stopped here and started exploring whether there is a limitation in the crawler settings for large files, and found that by default SharePoint only crawls file contents smaller than 16 MB.

So when SharePoint crawls files from a document library or list, if a file exceeds this 16 MB limit it crawls only the basic metadata associated with the list or library, such as Title, Created By, and Modified By; the contents inside the file are not crawled.

To increase this limit we can run the following PowerShell script. The trade-off is longer crawl times, so analyze the environment before performing this action.

$ssa = Get-SPEnterpriseSearchServiceApplication

$ssa.SetProperty("MaxDownloadSize", 25)  # value in MB


We can also set this limit for a specific file type, as shown below for Excel:

$ssa.SetProperty("MaxDownloadSizeExcel", 25)
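To verify the change, the value can be read back with the matching GetProperty call. The new limit only applies to content crawled after the change, and restarting the search service followed by a full crawl is the usual way to pick it up (the service names below assume default SP2010/SP2013 installations):

```powershell
# Read the current limit back (value in MB)
$ssa = Get-SPEnterpriseSearchServiceApplication
$ssa.GetProperty("MaxDownloadSize")

# Restart the search service so the new limit takes effect
# (OSearch14 on SharePoint 2010, OSearch15 on SharePoint 2013)
Restart-Service OSearch14
```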


Which is best Workflows OR Event Receivers?

Deciding Factors (and which option each favors):

  • Do you have Wait for an Item Update / Wait Until requirements (basically time-related)? – Workflow
  • Do you need interaction with users, such as Assign Task, Delegation, or Request Data? – Workflow
  • Do you have a high volume of transactions? – Event Receiver
  • Do you have browser / client specific change requirements (using SharePoint Designer / Visio / Nintex)? – Workflow
  • Do you need a manual trigger? – Workflow
  • Do you wish to execute before or after the events (synchronous "-ing" vs asynchronous "-ed" events)? – Event Receiver
  • Do you wish to trigger with a Service Account token on item created / modified? – Event Receiver
  • Priority in executing some action (Workflow will run first) – Workflow
  • Do you wish to migrate the running instances from one environment to another easily :)? – Event Receiver

Configure item level security in SharePoint BCS

In SharePoint 2007 we could only apply query-time security trimming on BDC entities, whereas in SharePoint 2010 / 2013 security trimming can also be implemented at crawl time.

If you wish to trim your SharePoint search results for an external system surfaced through BCS based on your own logic, there are two approaches:

  • Crawl-time security trimming
  • Query-time security trimming

Query-time security trimming logic must be checked, for the current user, against every item returned by the search query, which makes it a very heavy process. When NTLM users are used to grant permissions on individual items in the external system, it is better to go with crawl-time security trimming: you just need an additional column in the external system table to store the permission details for each item. This works well if you are building a new BCS model from scratch, but it is very difficult to retrofit crawl-time security trimming into an ongoing, complex BCS application; in that case the query-time approach is the better choice.
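For the query-time approach, the custom trimmer is a .NET class (implementing ISecurityTrimmerPost in SharePoint 2013, or ISecurityTrimmer2 in 2010) deployed to the GAC and registered against the Search Service Application. A minimal registration sketch, where the type name, public key token, and crawl-rule path are placeholders:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication

# TypeName, PublicKeyToken, and RulePath below are illustrative placeholders
New-SPEnterpriseSearchSecurityTrimmer -SearchApplication $ssa `
    -TypeName "Contoso.Search.CustomSecurityTrimmer, Contoso.Search, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0123456789abcdef" `
    -Id 1 `
    -RulePath "bdc3://*"
```

A full crawl is required after registering the trimmer before it applies to search results.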

Please refer to this article –