Tuesday, November 29, 2011

A specified logon session does not exist. It may already have been terminated.

Issue:
On a Windows 2008 R2 or Windows 7 machine, you open Task Scheduler and create a new task with the following settings:
  • The account under which the task will run is different from the account of the author (the user who is creating and saving the task), for example a service account.
  • The option of “Run whether user is logged on or not” is selected.
  • The option of “Do not store password. The task will only have access to local computer resources.” is not selected.
When you try to save the task with the above general settings, you receive an error.

Error Message:
An error has occurred for task <TaskName>. Error message: The following error was reported: A specified logon session does not exist. It may already have been terminated.

Cause:
This occurs because the local security policy has the following setting:
Network access: Do not allow storage of passwords and credentials for network authentication - Enabled.

Resolution:
To verify whether the security policy is causing the issue, log on to the machine where you are facing the problem and check the following:
  • Start -> Administrative Tools -> Local Security Policy
  • Security Settings -> Local Policies -> Security Options
  • Check whether the “Network access: Do not allow storage of passwords and credentials for network authentication” setting is Enabled.
Just FYI, the corresponding registry key for this setting can be found here:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa
Value name: disabledomaincreds
Current value: 1
A value of “1” means that the policy is enabled; it must be “0” for the task’s password to be stored.

NOTE: If you change the registry value to “0”, you should be able to save the task. However, do not make registry changes unless you are absolutely sure; use this only on a test machine to check whether it resolves the issue. The actual resolution is to disable the security policy mentioned above via group policy, which will update this registry value for you. Disable the policy on the domain controller and run a group policy update.
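As a quick way to inspect and (on a test machine only) toggle this value, the registry setting can be read and written from an elevated PowerShell prompt. This is a sketch of the check described above; gpupdate then refreshes group policy after the proper policy change:

```powershell
# Read the current value (1 = policy enabled, 0 = disabled)
Get-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa" -Name disabledomaincreds

# On a TEST machine only: set the value to 0 to confirm it is the cause
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa" -Name disabledomaincreds -Value 0

# The supported fix is disabling the policy in group policy, then refreshing:
gpupdate /force
```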

Monday, November 7, 2011

SharePoint 2010 - Crawling stuck, crawl component in "Recovering" state


A few days ago, we started seeing this issue out of nowhere: the crawler was stuck. When we browsed to the Search Administration page, we saw that the crawl component was stuck in the "Recovering" state, and because of that the query component reported a status of "Not Responding". The propagation status was shown as "Query server not responding".

We checked the SharePoint logs and found the following errors:

file copy failed (error 0x80070005: Access is denied.   0x80070005, source c:\program files\microsoft office servers\14.0\data\office server\applications\GUID-crawl-0\projects\portal_content\indexer\cifiles\0001001A.ci, destination \\machinename\GUID-query-0\GUID-query-0\Projects\Portal_Content\Indexer\CiFiles\0000.0001001A.ci.cp)  [indexpropagator.cxx:403]  d:\office\source\search\native\ytrip\tripoli\propagation\indexpropagator.cxx

Exception thrown (0x80070005: [needs message] info 0)                           [indexpropagator.cxx:409]  d:\office\source\search\native\ytrip\tripoli\propagation\indexpropagator.cxx

In Event Viewer, we found Event ID 2587 with the description: "The following conditions are currently affecting index propagation to this server for search service application "Service Application Name": 1. Query 0 has been disabled so that crawls can continue. It may be recovered via the Restart-SPEnterpriseSearchQueryComponent command in PowerShell."

The code 0x80070005 means access was denied: the account accessing the folder did not have the required permissions.

I granted permissions for the service and crawler accounts on c:\program files\microsoft office servers\14.0\data\office server\applications\
But this did not resolve the issue, which meant that apart from the location mentioned in the logs, there was a permission problem somewhere else too. In such cases, Process Monitor does help. I captured a Process Monitor trace and found permission errors on C:\ProgramData and a few of its SharePoint subfolders. After granting the service and crawler accounts Full Control over the C:\ProgramData folder, the status of the components changed back to "Online".

In case you face such errors, try granting permissions on the C:\ProgramData folder to the search service and crawler accounts (in case they are different) and check again. If that does not help, use Process Monitor to find out where the permission issue lies.

Do not remove, add or restart any component unless you are absolutely sure that permissions (or a similar cause) are not the issue.

Update: We recently noticed that restarting the server running the component would bring the component back to the "Online" state. Before restarting any services or running PowerShell cmdlets, try rebooting the server.
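If rebooting is not an option, the event text points at the Restart-SPEnterpriseSearchQueryComponent cmdlet. The following is a hedged sketch of how that might be invoked from the SharePoint 2010 Management Shell; "Search Service Application" is a placeholder for your service application name:

```powershell
# Find the active query topology of the (placeholder-named) search service application
$ssa = Get-SPEnterpriseSearchServiceApplication "Search Service Application"
$topology = Get-SPEnterpriseSearchQueryTopology -SearchApplication $ssa -Active

# Restart the disabled query component(s) in that topology
Get-SPEnterpriseSearchQueryComponent -QueryTopology $topology |
    Restart-SPEnterpriseSearchQueryComponent
```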

Wednesday, October 12, 2011

SharePoint 2010: User Profile Service not starting, remains in Stopped state

Issue: Unable to start the User Profile Synchronization Service from the “Services on Server” page in Central Administration.

Symptoms: You have created a User Profile Service Application and now want the User Profile Synchronization Service to run only on specific servers. So, you go to Central Administration –> Application Management –> Services on Server.
On this page, you try to start the “User Profile Synchronization Service”. It stalls in the “Starting” state for a minute or two and then reverts to the “Stopped” state.

Error Message(s): When the status of the service returns to the “Stopped” state, the following errors are logged in Event Viewer.

Log Name:      Application
Source:        Microsoft Resource Management Service
Date:          xx/xx/xxxx x:xx:xx xx
Event ID:      0
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer: xxxxxxxx
Description:
Service cannot be started. System.Data.SqlClient.SqlException: Could not find stored procedure 'RegisterService'.
   at Microsoft.ResourceManagement.WindowsHostService.OnStart(String[] args)
   at System.ServiceProcess.ServiceBase.ServiceQueuedMainCallback(Object state)


Log Name:      Application
Source:        Forefront Identity Manager
Date:          xx/xx/xxxx x:xx:xx xx
Event ID:      3
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      xxxxxxxx
Description:
.Net SqlClient Data Provider: System.Data.SqlClient.SqlException: Could not find stored procedure 'RegisterService'.
   at Microsoft.ResourceManagement.Utilities.ExceptionManager.ThrowException(Exception exception)
   at Microsoft.ResourceManagement.Data.Exception.DataAccessExceptionManager.ThrowException(SqlException innerException)
   at Microsoft.ResourceManagement.Data.DataAccess.RegisterService(String hostName)
   at Microsoft.ResourceManagement.Workflow.Hosting.HostActivator.RegisterService(String hostName)
   at Microsoft.ResourceManagement.Workflow.Hosting.HostActivator.Initialize()
   at Microsoft.ResourceManagement.WebServices.ResourceManagementServiceHostFactory.CreateServiceHost(String constructorString, Uri[] baseAddresses)
   at Microsoft.ResourceManagement.WindowsHostService.OnStart(String[] args)


Log Name:      Application
Source:        Microsoft.ResourceManagement.ServiceHealthSource
Date:          xx/xx/xxxx x:xx:xx xx
Event ID:      2
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      xxxxxxxx
Description:
The Forefront Identity Manager Service could not bind to its endpoints.  This failure prevents clients from communicating with the Web services.

A most likely cause for the failure is another service, possibly another instance of Forefront Identity Manager Service, has already bound to the endpoint.  Another, less likely cause, is that the account under which the service runs does not have permission to bind to endpoints.

Ensure that no other processes have bound to that endpoint and that the service account has permission to bind endpoints.  Further, check the application configuration file to ensure the Forefront Identity Manager Service is binding to the correct endpoints.


Log Name:      Application
Source:        Microsoft.ResourceManagement.ServiceHealthSource
Date:          xx/xx/xxxx x:xx:xx xx
Event ID:      26
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      xxxxxxxx
Description:
The Forefront Identity Manager Service was not able to initialize a timer necessary for supporting the execution of workflows.

Upon startup, the Forefront Identity Manager Service must initialize and set a timer to support workflow execution.  If this timer fails to get created, workflows will not run successfully and there is no recovery other than to stop and start the Forefront Identity Manager Service. Restart the Forefront Identity Manager Service.


Cause: Incorrect permissions and configuration of the account used to run the Profile Synchronization Service.

Resolution:
  1. Add the account to the local Administrators group on the server where you are trying to start the service.
  2. Restart the “SharePoint 2010 Timer” service on the server.
  3. Ensure that you do not start the “Forefront Identity Manager” services manually. If you want, you can change their startup type to “Automatic (Delayed Start)”.
  4. Now try to start the service from Central Administration –> Application Management –> Services on Server.
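Steps 1 and 2 can also be performed from an elevated PowerShell prompt. A minimal sketch, where DOMAIN\spUpsAccount is a placeholder for your profile synchronization service account:

```powershell
# Add the sync account to the local Administrators group (placeholder account name)
net localgroup Administrators DOMAIN\spUpsAccount /add

# Restart the SharePoint 2010 Timer service (Windows service name: SPTimerV4)
Restart-Service -Name SPTimerV4
```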

Monday, October 10, 2011

SharePoint Health Analyzer - Common Issues

In many SharePoint farm installations, I have seen these messages show up on the Health Analyzer page. Following are some basic steps that can be taken to get rid of the error/warning messages.

1- The server farm account should not be used for other services.
In the error description, you will see the names of the web applications or Windows services that are using the farm account. Go to Central Administration -> Security -> Configure Service Accounts, select the appropriate web application or Windows service, and change the service account to an account other than the farm (system) account. If there is no domain account available other than the system account, use the Configure Managed Accounts page to register a new managed account.

2- Web.config files are not identical on all machines in the farm
Use a file-comparison utility like ExamDiff to check for differences in the web.config files. If the web.config files do not match, align them across all the servers in the farm. Ensure you take a backup of the web.config files before replacing any of them.
Then, turn “off” automatic repair for the “Web.config files are not identical on all machines in the farm” rule. This rule can be found in Central Administration -> Monitoring -> Health Analyzer Rule Definitions, under the Configuration category.
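PowerShell can also spot the differences between two servers. A hedged sketch, with hypothetical server names WFE1/WFE2 and the default virtual directory path for a web application on port 80:

```powershell
# Load the web.config from two front-end servers (paths are illustrative)
$a = Get-Content "\\WFE1\c$\inetpub\wwwroot\wss\VirtualDirectories\80\web.config"
$b = Get-Content "\\WFE2\c$\inetpub\wwwroot\wss\VirtualDirectories\80\web.config"

# Show only the lines that differ between the two files
Compare-Object -ReferenceObject $a -DifferenceObject $b
```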




3- InfoPath Forms Services forms cannot be filled out in a Web browser because no State Service connection is configured
You will need to configure the State Service by running the configuration wizard. Detailed information can be found here - http://technet.microsoft.com/en-us/library/ff805084.aspx


4- Drives are running out of free space. 
Hard disk space is low on the SharePoint servers mentioned in the warning message. The available disk space should be at least twice the amount of RAM installed on the machine.

Solution 1:  Free disk space on the server computer
1.     Verify that the user account that is performing this procedure is a member of the Administrators group on the local computer.
2.     Run the Disk Cleanup tool to free disk space on the server computer.
Solution 2:   Decrease the number of days to store log files
1.     Verify that the user account that is performing this procedure is a member of the Farm Administrators group.
2.     On the Central Administration Home page, click Monitoring.
3.     On the Monitoring page, in the Reporting section, click Configure diagnostic logging.
4.     On the Diagnostic Logging page, in the Trace Log section, in the Number of days to store log files box, type a smaller number.
5.     Click OK.
Solution 3: Add more disk space on the server

5- Database has large amounts of unused space.
Database files are fragmented or have more storage allocated on disk than actually required.
Follow http://support.microsoft.com/kb/307487/en-us to defragment the databases.

Thursday, September 29, 2011

PowerShell script to update Access Request Email Address for multiple sites and webs

Yesterday I worked with one of my colleagues on a PowerShell script to update/change the access request email on all the sites/webs inside a web application. The requirement was to check whether access requests are enabled on a site and, if yes, update the access request email address to a specific address. The script checks all the webs and sites inside the specified web application and then updates the access request email. My colleague has also blogged about this and the post can be found here.

What is an access request email?
Suppose a user tries to access a SharePoint site and gets an "Access Denied" message. If access requests are enabled on the site collection or web, they will see a screen similar to the one below.


If the user clicks the "Request access" link, an email is sent to the person specified in the access request settings for the site. This person can then add the user and grant appropriate permissions. This is useful if the number of users is not known when setting up the SharePoint site.

Access requests can be enabled on a site (or on a web, if the web has unique permissions) by navigating to Site Settings -> Site Permissions. In the ribbon, an option to manage access requests can be seen.


In the next screen, you will see options to enable access requests and specify the email address.


Now, consider that there are hundreds or thousands of sites and webs (with unique permissions) and you need to update the email address for access requests. Doing this manually would be very tedious and time consuming. Below is a script which updates the access request email address for all the sites and webs in a single web application at once. The script iterates through all the sites (and the webs inside each site) and checks whether they have unique permissions. If not, the web is skipped (because the access request settings are being inherited from the parent site). If yes, the email address is updated.

Replace "URL of Web Application" with your web application URL, e.g. http://sharepointserver, and replace "Specify the access request email here" with the email address of the person who should manage the access requests.

Add-PSSnapin Microsoft.SharePoint.PowerShell
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

$webapp = Get-SPWebApplication "URL of Web Application"
foreach ($site in $webapp.Sites)
{
    foreach ($web in $site.AllWebs)
    {
        if (!$web.HasUniquePerm)
        {
            Write-Host "Access request settings are inherited from the parent."
        }
        elseif ($web.RequestAccessEnabled)
        {
            $web.RequestAccessEmail = "Specify the access request email here"
            $web.Update()
        }
        else
        {
            Write-Host "Access requests are not enabled."
        }
        $web.Dispose()   # dispose each SPWeb to avoid excessive memory use
    }
    $site.Dispose()      # dispose each SPSite as well
}

Let me know if this helps or if there are any queries.

Tuesday, September 27, 2011

SharePoint BLOB Cache (Disk Based Caching) fundamentals

What is blob cache?
BLOB cache is a disk-based caching mechanism for Binary Large Objects (BLOBs) which can be used with SharePoint sites to decrease response times for pages within a SharePoint site. It also helps reduce database load on the SQL servers and network traffic between the SharePoint and SQL servers. Binary Large Objects, such as images, videos, audio files and large code files, are normally retrieved from the database, but they can instead be cached in a directory on the web front end server(s). Thus, when a user requests a page/file, the SharePoint server can respond faster by returning the file from the server cache rather than making a round trip to retrieve the data from the SQL server every time. BLOB caching is enabled on the Web Front End (WFE) servers.

SharePoint 2010 introduces the new concept of byte-range requests, which allows a user to select a later point in a video and begin playback from that point. SharePoint also provides progressive caching, which starts streaming the beginning of a large video file while the rest of the file is cached. Video files are also divided and retrieved in smaller sections to reduce load between the WFE and SQL servers. Administrators have the option to configure the size of these file sections.

BLOB caching is mostly used with Internet-facing (anonymous access) sites or with sites which have read-only documents (not modified on a regular basis) and static content. We can specify which files (e.g. documents, images, etc.) are cached on the server so that response times decrease.

The BLOB cache is usually stored in a directory on the web front end servers on the drive where SharePoint is installed. We need to ensure that the drive where the files will be stored has sufficient space; the location of the BLOB cache can be set as per our requirements.
To enable the BLOB cache on the WFE servers, we need to edit the web.config file of the web application. By default, disk-based BLOB caching is turned off and must be enabled on the front end web servers. In the web.config file of the web application, we will see a line such as:

<BlobCache location="" path="\.(gif|jpg|jpeg|jpe|jfif|bmp|dib|tif|tiff|ico|png|wdp|hdp|css|js|asf|avi|flv|m4v|mov|mp3|mp4|mpeg|mpg|rm|rmvb|wma|wmv)$" maxSize="10" enabled="false" />


In this line, change the enabled attribute from “false” to “true”. If needed, specify the location where the cached files are stored and the types of files to cache. To add or remove file types from the list of cached file types, modify the regular expression in the path attribute to include or exclude the appropriate file extensions. The size of the cache can also be specified; the default is 10 GB, set via the “maxSize” attribute.
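The path attribute is an ordinary regular expression matched against the requested file name, so it is easy to check which requests the cache would serve. A small illustrative sketch using PowerShell’s -match operator:

```powershell
# The default BlobCache extension pattern from web.config
$pattern = '\.(gif|jpg|jpeg|jpe|jfif|bmp|dib|tif|tiff|ico|png|wdp|hdp|css|js|asf|avi|flv|m4v|mov|mp3|mp4|mpeg|mpg|rm|rmvb|wma|wmv)$'

'logo.png'    -match $pattern   # True  - this file type would be cached
'report.docx' -match $pattern   # False - not cached unless docx is added to the pattern
```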


Additionally, if the product is SharePoint Server 2010, we can enable “SharePoint Server Publishing Infrastructure” at the site collection level where we need to use Object caching.  Once the Publishing feature is enabled on the site, we can then enable Object caching by navigating to site settings -> Site Collection Object Cache.

Advantages of BLOB Cache
  • Disk based caching is extremely fast. Decrease in response time.
  • Eliminates the need for round trips between database and SharePoint servers for every request.
  • Security trimmed caching.
  • Can specify which files to be cached and can also restrict the types of files which need not be cached.
  • Setting BLOB cache for one web application does not affect performance of other web applications which do not have BLOB cache configured.
  • A file is cached on a server only when it is first requested on that WFE. Thus, only the required files are cached, and each WFE maintains its own local copy of the cache.
  • Each server uses less CPU time and energy to serve the same page after the initial rendering.
Disadvantage of BLOB Cache
  • The site’s performance is temporarily affected while the files to be cached are first written on the disk.
  • BLOB caching can only be configured at the web application level; there is one BLOB cache per web application. Hence, if there are two site collections, one containing static content (which needs BLOB caching) and one containing dynamic content (which does not), then we need to separate the two site collections into two different web applications.
  • In scenarios where the files are large in size and number, BLOB caching may require a lot of hard disk space and hence may need additional hardware to accommodate the files. We need to use a hard drive which has sufficient space and which is used by as few other processes as possible, to avoid conflicts during cache retrieval.
  • As each WFE maintains its own copy of the cache, the information about which files are cached is not synchronized. Thus, if WFE1 has “file1.jpg” cached but WFE2 gets a request for that same file, WFE2 will make a round trip to SQL to fetch the file the first time; it is not aware that the file is cached on WFE1.
  • Also, as the data is not synchronized, each server can end up with a different copy of a file, and on request end users might see two different files.
  • If some of the cached files become corrupted, we may need to reset the entire index file on the WFE server and rebuild the cache.
  • BLOB caching works only for items stored in document libraries on SharePoint site.
  • BLOB caching cannot be used with Web Applications that use Web Gardening.
Performance:
Let us test the performance with and without the BLOB cache. The performance of a page depends on the amount of media and content (cached on the server) present on the page. If more content is cached, the performance improvement is easy to see; with fewer image or media files, the improvement might go unnoticed.

e.g. Following are the page load times for a test page on a SharePoint site. The page contains an image file – “SharePointDeveloperToolbar.jpg”. The following data shows the time taken for the file to load before and after enabling the BLOB cache.

Page load time with blob cache
Approximately 0.015 seconds for that single image file.

Page load time without blob cache
Approximately 0.031 seconds for the same image file.

The above is for a single image file on the page. If we consider all the media elements on the page, the difference adds up and can be seen easily.

Friday, September 23, 2011

SharePoint People Picker and Active Directory

Introduction:
SharePoint People Picker enables end users to enter a username (or part of a username/display name) and have the input resolved against a source holding user information (mostly Active Directory, but other directory sources can also be used).
Following is the screenshot of how a people picker page looks in SharePoint 2010.


Note: The following part of the article assumes that SharePoint People Picker is using classic (Windows Integrated) authentication. When People Picker uses classic authentication, the stsadm setproperty commands can be used. If the web application is using claims-based authentication, the stsadm setproperty commands will have no effect on the configuration.

How people picker works:
Following is the sequence of events that occurs when you search for a user in People Picker:
  • The user submits a query in People Picker.
  • The query is sent to one of the Web Front End (WFE) servers. The server handling the request performs a DNS lookup for the domain controller holding the Global Catalog (GC) service.
  • DNS returns the IP address of the domain controller holding the GC. SharePoint connects from a random local port to the domain controller on port 3268; port 3268 is the LDAP-over-TCP port used by the Global Catalog service.
  • SharePoint initially asks the DC for information that tells it which authentication mechanisms are supported, the LDAP capabilities, endpoints, etc.
  • If Active Directory requests authentication, SharePoint uses the web application’s application pool account to authenticate the LDAP query.
  • SharePoint then sends an LDAP query (for the username the user requested) to the DC. The query is created using the “System.DirectoryServices” namespace. If the server is a standalone installation and the application pool is running as Local System or Network Service, then DOMAIN\MACHINE_NAME is used for authentication. Once the username is found in the directory, Windows APIs return the SID of the user; using the SID, further information about the user is retrieved from Active Directory.
  • The DC sends the response back to the SharePoint server. System.DirectoryServices uses a “DirectorySearcher” object to perform the search; the “FindAll” method returns a “SearchResultCollection” containing the matching “SearchResult” entries.
    Refer: http://msdn.microsoft.com/en-us/library/ms180881(VS.90).aspx
  • The information is parsed and displayed in People Picker.
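For troubleshooting, the same kind of lookup can be reproduced outside SharePoint with a few lines of PowerShell against System.DirectoryServices. This is a sketch only; “GC://DC=contoso,DC=com” and “jdoe” are placeholders for your forest root and a sample account, and it must be run on a domain-joined machine:

```powershell
# Bind to the Global Catalog of the (placeholder) contoso.com forest
$root = New-Object System.DirectoryServices.DirectoryEntry("GC://DC=contoso,DC=com")
$searcher = New-Object System.DirectoryServices.DirectorySearcher($root)

# LDAP filter for a user account; jdoe is a placeholder login name
$searcher.Filter = "(&(objectCategory=person)(objectClass=user)(sAMAccountName=jdoe))"

$result = $searcher.FindOne()
if ($result -ne $null) {
    # Display name and SID come back as result properties
    $result.Properties["displayname"]
    $result.Properties["objectsid"]
}
```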

We now know that SharePoint first queries DNS and then communicates with the domain controller holding the GC to get the details of the user. For this communication to happen successfully, we need to ensure that the following ports are open between the SharePoint server and the domain controller (holding the GC).

On SharePoint Server:
  • TCP/UDP 135, 137, 138, 139 (RPC)
  • TCP/UDP 389 (LDAP default port)
  • TCP 636 (LDAP SSL default port)
  • TCP/UDP 53 (DNS) 
  • TCP/UDP 88 (Kerberos)
  • UDP 464 (Kerberos Change Password)

On Domain Controller:
  • TCP/UDP 135 (RPC)
  • TCP/UDP 389  (LDAP default port)
  • TCP 636 (LDAP SSL default port)
  • TCP 3268 (LDAP Global Catalog)
  • TCP 3269 (LDAP SSL Global Catalog)
  • TCP/UDP 88 (Kerberos)
  • TCP/UDP 53 (DNS)
  • TCP/UDP 445 (Directory Services)

Following ports are optional
  • TCP/UDP 749 (Kerberos-Adm)
  • TCP port 750 (Kerberos-IV)

If People Picker is pulling up users from another domain, then supporting the communication between the domains inside the corporate network requires at least a one-way trust relationship in which the perimeter network trusts the corporate network.
On the other domain controller (or, if the communication occurs through ISA, on the ISA server), the following ports are required to be open for inbound connections.
  • TCP/UDP 135 (RPC)
  • TCP/UDP 389 (LDAP default port)
  • TCP 636 (LDAP SSL default port)
  • TCP 3268 (LDAP Global Catalog)
  • TCP 3269 (LDAP SSL Global Catalog)
  • TCP/UDP 53 (DNS)
  • TCP/UDP 88 (Kerberos)
  • TCP/UDP 445 (Directory Services)

    Optional Ports:
  • TCP/UDP 749 (Kerberos-Adm)
  • TCP port 750 (Kerberos-IV)
In case of any queries, please feel free to drop a comment and I will respond back.


Wednesday, September 21, 2011

When was the last you felt like an expert on SharePoint?

Microsoft is presenting the ESP Live Webcast Series, conducted by Subject Matter Experts from Microsoft, to educate you about the advantages that you might be missing out on.

1. SharePoint Architecture - Services Stack
The first webcast, on 19th September, will take you through SharePoint Architecture - Services Stack, introducing you to the various services SharePoint Server 2010 offers. Some of the key topics that we will talk about include:
  • Access Services
  • Dashboards
  • Enterprise Content Management
  • Enterprise Search
  • Excel Service

2. SharePoint Architecture - Farm Deployment
The second webcast, on 21st September, will be on SharePoint Architecture - Farm Deployment, introducing you to the overall elements that comprise a SharePoint farm and how administrators can leverage interfaces to manage their farm. Topics we will cover in the webcast are:
  • Farm Topology
  • Load Balancing Servers
  • Planning Availability
  • Disaster Recovery
  • Service Applications

Designing and Creating Workflows with Visio 2010

One of the biggest challenges in implementing a successful workflow is not coding/creating a custom workflow but designing it as per the user’s or business’s requirements. Typically, a consultant talks to the business users, understands the business process and then designs the workflow. This design is then handed over to SharePoint professionals to implement in SharePoint.

Visio makes the task of analyzing the requirements and actually putting them down on paper much easier. We can draw diagrams and create flowcharts for workflows, and the best part is that we can now export the workflow in .vwi (Visio Workflow Interchange) format and then import it into SharePoint Designer 2010. Thus, a consultant or a business power user can create Visio diagrams for the workflow, and SharePoint professionals can easily import them into SharePoint Designer 2010 and implement them on lists/sites.

Creating a workflow using Visio 2010:
Visio 2010 provides us with specific workflow templates that can be directly imported in SharePoint Designer 2010.



When we select “SharePoint Workflow” template in Visio, we can see that there is a set of shapes available. These shapes relate to the actions and conditions in SharePoint Designer 2010.


Use this template to create a workflow diagram as per your business requirements. Once the diagram is done, click “Export the diagram” on the “Process” tab in Visio. This will check the file for any errors, export it, and save it as a .vwi file.


Importing the workflow diagram in SharePoint Designer 2010:
Open SharePoint Designer 2010 and open the site/list where you would like to associate the workflow. On the workflow tab, you can see the button for importing a workflow from a Visio document.


Once the workflow is imported, it can be further customized or details can be added. You can then publish the workflow on the required list/site.

Tuesday, September 20, 2011

How to carry out effective and efficient SharePoint searches


I have to admit: when I think about search, the first thing that comes to my mind is “Google” (some people may think of Bing or other search engines). Google has made searching very easy and very “fast”, and it has also made other search sites and tools look equally dumb and useless. So, an end user will open Google in one browser tab and the SharePoint search page in another, type a query in both and compare the results. I am sure the person will be amazed by Google’s response time. But that does not mean that SharePoint search is weak and does not perform.

Users have to understand that there is a vast difference between Google’s infrastructure and the infrastructure of the organization they work in. Their own organization is hosting SharePoint for them (or sometimes it may be hosted by another organization, but the point is that the infrastructure is still far smaller than Google’s). So, server capacity is limited, bandwidth is limited and storage is limited. Still, SharePoint manages to return results taking just a few seconds longer than Google, which I think is commendable on SharePoint’s part.

For all those who still do not like to search on a SharePoint site, here are a few tips that can help you get the results you are looking for on the first try. Use any of the tips below, or combine them, to narrow down the result set.

Using Keywords:
Keywords are the basis of any search. Suppose you want to search for an H.R. document titled “Company HR Policies”. You could search for Policy, but searching with a single word may not give you the result. You can use a phrase – Company HR Policy – and SharePoint will then search for Company or HR or Policy and return documents and items that contain any of these words. Exact phrases can also be used, like “Company HR Policy” (enclosed in quotation marks). This searches for exactly the phrase you typed, and in this case the first result might well be the document you were looking for. So, if you know the filename and want to search for it, use the exact full phrase rather than loosely typed pieces of words.

But what if you do not know the filename, but do know who wrote the document or what type of file it is? In such cases, property filters can be used.

Property Filters:
When property filters are used while searching for content, SharePoint will limit the result set based on the matches between the data provided by you and the metadata properties of the file/item indexed by SharePoint.

e.g. if you know that the document was written by a person named “Amol Ghanwat”, the following are a few ways you can search for it:

author:Amol
author:”Amol Ghanwat”
author:”domain\username”

Note the use of quotation marks when multiple words are used.

If you know the file type, you can limit the result set to those file type(s). e.g. if you are searching for Excel 2007/2010 files, the following query will return only Excel files:

filetype:xlsx
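Conceptually, a property filter narrows the result set by matching the value you supply against the metadata SharePoint has indexed for each item. As a toy illustration of that idea (hypothetical data only — this is not the SharePoint API):

```python
# Toy model of property filtering: each document carries metadata,
# and a filter keeps only documents matching every requested property.
docs = [
    {"title": "Company HR Policy", "author": "Amol Ghanwat", "filetype": "docx"},
    {"title": "Travel Catalog",    "author": "Jane Doe",     "filetype": "xlsx"},
]

def filter_by(items, **props):
    """Return items whose metadata matches all of the given properties."""
    return [d for d in items if all(d.get(k) == v for k, v in props.items())]

hr_docs = filter_by(docs, author="Amol Ghanwat")   # like author:"Amol Ghanwat"
excel_docs = filter_by(docs, filetype="xlsx")      # like filetype:xlsx
```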

I mentioned earlier that you may not know the name of the document. But what if you know only part of the filename or some other attribute? Prefix matching can be used to search for content for which you have incomplete information.

Prefix Matching:
You know that the file name starts with something like “Share…” but do not know the rest of the name. You can simply search for “Share*” (using the asterisk “*” symbol) and the search will return documents/items starting with “Share”, e.g. SharePoint, ShareBook, etc.

Prefix matching can also be used with property filters. e.g. if you know that the content was written by a person whose name starts with “Am” but do not know the rest, simply search for author:Am* and it will return documents whose author’s name starts with “Am”.

Note: As the name implies, the “*” symbol can be used only as shown above. If you try to search for something like “*Point”, it will not return “SharePoint” in the results, because SharePoint search allows the wildcard character only at the end of a term.
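The trailing wildcard is effectively a startswith (prefix) filter over the indexed terms, which is why a leading wildcard such as “*Point” cannot match. A small Python sketch of the idea (illustrative only):

```python
# Prefix matching: "Share*" keeps terms that begin with "Share".
terms = ["SharePoint", "ShareBook", "PowerPoint", "Amol", "Amit"]

def prefix_match(terms, prefix):
    """Return terms starting with the given prefix (case-insensitive)."""
    return [t for t in terms if t.lower().startswith(prefix.lower())]
```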

Inclusions and Exclusions:
Suppose you are searching for a travel catalog and would like to find information about “Paris”. You can use inclusions: Catalog + Paris or ”Travel Catalog” + Paris

This ensures that only documents containing “Catalog” (or the phrase “Travel Catalog”) together with “Paris” are returned.

Similarly, if documents with “XYZPlace” in them should not be returned, you can query: Catalog -XYZPlace
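Inclusions and exclusions can be pictured as a filter that requires every “+” term and rejects every “-” term. A toy sketch in Python (hypothetical titles, not the real query engine):

```python
# Keep titles that contain every included word and none of the excluded words.
titles = [
    "Travel Catalog Paris",
    "Travel Catalog XYZPlace",
    "Paris Photo Album",
]

def search(titles, include, exclude=()):
    """Filter titles by required (include) and forbidden (exclude) words."""
    return [t for t in titles
            if all(w.lower() in t.lower() for w in include)
            and not any(w.lower() in t.lower() for w in exclude)]
```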

Boolean Expressions:
If there are multiple words or phrases in your query, you can combine them with the AND and OR operators. e.g. if you want to search for a restaurant named ABC, you can search for restaurant AND ABC

Numeric Expressions:
Let’s say you are an account executive and would like to find accounts whose balance is less than 1000. In such cases, you can use numeric operators to compare values from SharePoint list/library/site columns, e.g. Accounts < 1000

Note that if Accounts is a custom column, your SharePoint site owner or administrator might have to map this column to a managed property and have it crawled. If the metadata mapping is not done, numeric expressions will not yield correct results. Metadata mapping affects property filters in the same way.

Search Alerts
If you search for something frequently and your main purpose is to track which documents are added, removed, updated, etc., you can use Search Alerts. When you run a search, the result page contains Alert Me and RSS links. Use either link to keep yourself updated. e.g. if “ABC” used to be the first search result and after a few days the relevance of “XYZ” increases, “XYZ” will show up first. Or a document named “PQR” is added and starts appearing in the search results. All these changes are summarized and sent to you as an RSS feed or as an email.

Monday, September 19, 2011

The sdk service is either not running or not yet initialized

I would like to bring out some points that can be used to troubleshoot issues related to the System Center Operations Manager 2007 R2 SDK service. You may get this error while opening the SCOM console, or in other services or applications (like SharePoint) that consume SCOM data. The problem lies with the SCOM server's SDK (System Center Data Access) service.

To analyze the issue further, reproduce it and check the “Operations Manager” log in Event Viewer on the SCOM server.

Following are a few possible causes of the issue -
  1. One of the following services is stopped:
    • System Center Management (Health)
    • System Center Data Access (SDK)
    • System Center Configuration (Config service)
  2. If any of the above services is stopped, start it. 
  3. Re-enter the password for the identity of the above services and restart them. 
  4. If you get an error while starting the service, the cause might be one of the following:
  • An incorrect SPN is set for the account that runs the System Center Data Access (OpsMgr SDK) service. Find out which SPNs are set for the account and rule out any possibility of incorrect SPNs.
  • If SCOM is running on Windows 2003 SP1 and uses SQL 2005, also check the “SynAttackProtect” TCP registry setting as described in this blog article: http://blogs.msdn.com/b/sql_protocols/archive/2006/04/12/574608.aspx
  • Before making the above registry change, ensure that you are seeing System.Data.SqlClient.SqlException in the Operations Manager event log (Event ID: 26380).
  • If the System Center Operations Manager 2007 R2 Authoring console is installed on a System Center Operations Manager 2007 SP1 RMS, or if there are assembly-related errors in the event log, check this article: http://support.microsoft.com/kb/2526827
  • Verify that the port number is the same in the following registry key and in the SQL Server Configuration Manager settings for the SQL instance.
  • Registry key location: HKLM\Software\Microsoft\Microsoft Operations Manager\3.0\Setup\
    Check the “DatabaseServerName” value and match it with the port number specified in SQL Server Configuration Manager for the SQL instance. Both should be the same.
    Related: http://support.microsoft.com/kb/2002620
  • Ensure that you are able to open the SCOM console on the SCOM server itself.
  • Check for any handle or access-denied errors in Event Viewer. If present, the SDK account might not have permissions on the services configuration or the SCOM installation directory.
  • If starting the SDK service gives a time-out error, consider increasing the timeout for the service – http://support.microsoft.com/kb/922918

Windows XP lifecycle to end in April 2014

Windows XP, the most successful Microsoft OS, is scheduled to reach end of life in April 2014, according to a statement from Kevin Turner.

"We are end-of-lifing XP and Office 2003 and everything prior, in April 2014," said Kevin Turner, Microsoft's chief operating officer, during a meeting with financial analysts Wednesday. "So for all those companies that have the old products that haven't quite started the refresh, guess what? This has been a great product, XP has been a wonderful product; great TCO has been given. It's now time for it to go."

Read more - http://www.informationweek.com/news/windows/microsoft_news/231601604

Friday, September 16, 2011

Batch file to delete files older than 2 minutes

Specify the directory where the files are located, save the file as a .bat (batch) file and run it. It will delete all files in that directory that are older than 2 minutes.

@echo off
cd "Directory where files need to be deleted"
setlocal
call :DateToMinutes %date:~-4% %date:~-10,2% %date:~-7,2% %time:~0,2% %time:~3,2% NowMins
for /f "delims=" %%a in ('dir * /a-d /b') do call :CheckMins "%%a" "%%~ta"
goto :EOF
:CheckMins
set File=%1
set TimeStamp=%2
call :DateToMinutes %timestamp:~7,4% %timestamp:~1,2% %timestamp:~4,2% %timestamp:~12,2% %timestamp:~15,2%%timestamp:~18,1% FileMins
set /a MinsOld=%NowMins%-%FileMins%
if %MinsOld% gtr 2 del %file%
goto :EOF
:DateToMinutes
setlocal
set yy=%1&set mm=%2&set dd=%3&set hh=%4&set nn=%5
if 1%yy% LSS 200 if 1%yy% LSS 170 (set yy=20%yy%) else (set yy=19%yy%)
set /a dd=100%dd%%%100,mm=100%mm%%%100
set /a z=14-mm,z/=12,y=yy+4800-z,m=mm+12*z-3,j=153*m+2
set /a j=j/5+dd+y*365+y/4-y/100+y/400-2472633
if 1%hh% LSS 20 set hh=0%hh%
if /i {%nn:~2,1%} EQU {p} if "%hh%" NEQ "12" set hh=1%hh%&set/a hh-=88
if /i {%nn:~2,1%} EQU {a} if "%hh%" EQU "12" set hh=00
if /i {%nn:~2,1%} GEQ {a} set nn=%nn:~0,2%
set /a hh=100%hh%%%100,nn=100%nn%%%100,j=j*1440+hh*60+nn
endlocal&set %6=%j%&goto :EOF

At the start of the file, specify the directory where the files are located. If you need to change the number of minutes, locate the line "if %MinsOld% gtr 2 del %file%" and change the number. You can schedule this batch file as a job to delete log files older than a specified number of minutes.

Visio Graphics Service Protocol

The Visio Graphics Service Protocol is used to retrieve information about a Web Drawing (usually stored in the .vdw format) in a SharePoint document library. SharePoint uses this protocol to retrieve the data and display the graphic on the site. When we open a Visio diagram on a SharePoint site, the Visio protocol client calls the server for information, which is returned in either raster or vector format. Raster data is returned in .png (Portable Network Graphics) format, while vector data is returned as XAML.

The Visio Graphics Service uses SOAP (Simple Object Access Protocol) to format the requests and responses exchanged with the server. The information is transmitted over TCP/IP (either HTTP or HTTPS, depending on how you have configured the Visio Service Application). The following diagram shows the messaging and transport stack used by the service.



Following are the prerequisites for the service protocol to work properly:
  • A service endpoint. 
  • Appropriate permissions to call methods on VisioGraphicService.svc 
  • A token-based security mechanism
SharePoint acts as the client requesting the information. The Visio service is hosted in a website known as “SharePoint Web Services”. The subdirectory name is usually a GUID. Search through the contents of the site and you will find VisioGraphicService.svc hosted in one of the directories of the web application.


Open IIS Manager and check the Content view of the “SharePoint Web Services” web application. You should see something similar to the picture above; the GUID might differ though.

VisioGraphicService.svc acts as a service endpoint and is needed to communicate and transfer the data. The SharePoint Web Services web application uses TCP port 32843 (HTTP) and TCP port 32844 (HTTPS) for communication. A default SharePoint configuration uses the HTTP port unless you specify a different setting in the Central Administration site.

Ensure these ports are open. If a firewall is blocking them, or you are facing any network issue, Visio diagrams may fail to refresh or may not render at all. Just in case, I am documenting some sample error messages I received because TCP port 32843 was blocked.
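A quick way to verify that the service port is reachable from the web front end is a plain TCP connection test. A minimal Python sketch (the server name below is a placeholder):

```python
import socket

def port_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical server name):
# port_open("servername", 32843)
```

A False result here matches the “TCP error code 10061: connection actively refused” symptom shown in the logs below.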

w3wp.exe Visio Graphics Service Web Access 8046 Critical Failed to get Vector Diagram for visio file (null) page (null) Exception : Could not connect to http://servername:32843/virtualDirectoryID/VisioGraphicsService.svc. TCP error code 10061: No connection could be made because the target machine actively refused it :32843.
w3wp.exe Visio Graphics Service Web Access High BeginGetVectorDiagram failed: System.OperationCanceledException: The server failed to process the request. ---> System.ServiceModel.EndpointNotFoundException: Could not connect to http://servername:32843/virtualDirectoryID/VisioGraphicsService.svc. TCP error code 10061: No connection could be made because the target machine actively refused it :32843. ---> System.Net.WebException: Unable to connect to the remote server ---> System.Net.Sockets.SocketException: No connection could be made because the target machine actively refused it :32843


See Microsoft Technet Article for details about ports that need to be open for communication: http://technet.microsoft.com/en-us/library/cc262849.aspx#ServiceApp

Friday, August 5, 2011

"An unexpected error has occurred" on all the SharePoint Server 2010 sites.

Issue:
All SharePoint sites throw the following error:
"An unexpected error has occurred."

Errors:
In the ULS logs, entries similar to the following can be found:

An exception occurred when trying to issue security token: The server was unable to process the request due to an internal error. For more information about the error, either turn on IncludeExceptionDetailInFaults (either from ServiceBehaviorAttribute or from the configuration behavior) on the server in order to send the exception information back to the client, or turn on tracing as per the Microsoft .NET Framework 3.0 SDK documentation and inspect the server trace logs.. 

Exception occured while connecting to WCF endpoint: System.ServiceModel.FaultException: The server was unable to process the request due to an internal error. For more information about the error, either turn on IncludeExceptionDetailInFaults (either from ServiceBehaviorAttribute or from the configuration behavior) on the server in order to send the exception information back to the client, or turn on tracing as per the Microsoft .NET Framework 3.0 SDK documentation and inspect the server trace logs. 
at Microsoft.IdentityModel.Protocols.WSTrust.WSTrustChannel.ReadResponse(Message response) 
at Microsoft.IdentityModel.Protocols.WSTrust.WSTrustChannel.Issue(RequestSecurityToken rst, RequestSecurityTokenResponse& rstr) 
at Microsoft.IdentityModel.Protocols.WSTrust.WSTrustChannel.Issue(RequestSecurityToken rst) 
at Microsoft.SharePoint.SPSecurityContext.SecurityTokenForContext(Uri context, Boolean bearerToken, SecurityToken onBehalfOf, SecurityToken actAs, SecurityToken delegateTo) 
at Microsoft.SharePoint.SPSecurityContext.<>c__DisplayClass7.b__6() 
at Microsoft.SharePoint.Utilities.SecurityContext.RunAsProcess(CodeToRunElevated secureCode) 
at Microsoft.SharePoint.SPSecurityContext.GetProcessSecurityTokenForServiceContext() 
at Microsoft.SharePoint.SPChannelFactoryOperations.CreateChannelAsProcess[TChannel](ChannelFactory`1 factory, EndpointAddress address, Uri via) 
at Microsoft.SharePoint.SPChannelFactoryOperations.CreateChannelAsProcess[TChannel](ChannelFactory`1 factory, EndpointAddress address) 
at Microsoft.Office.Server.UserProfiles.MossClientBase`1.get_Channel() 
at Microsoft.Office.Server.UserProfiles.MossClientBase`1.ExecuteOnChannel(String operationName, CodeBlock codeBlock) 
at Microsoft.Office.Server.UserProfiles.ProfilePropertyServiceClient.ExecuteOnChannel(String operationName, CodeBlock codeBlock)

UserProfileApplicationProxy.InitializePropertyCache: Microsoft.Office.Server.UserProfiles.UserProfileException: The server was unable to process the request due to an internal error. For more information about the error, either turn on IncludeExceptionDetailInFaults (either from ServiceBehaviorAttribute or from the configuration behavior) on the server in order to send the exception information back to the client, or turn on tracing as per the Microsoft .NET Framework 3.0 SDK documentation and inspect the server trace logs. ---> System.ServiceModel.FaultException: The server was unable to process the request due to an internal error. For more information about the error, either turn on IncludeExceptionDetailInFaults (either from ServiceBehaviorAttribute or from the configuration behavior) on the server in order to send the exception information back to the client, or turn on tracing as per the Microsoft .NET Framework 3.0 SDK documentation and inspect the server trace logs. 
at Microsoft.IdentityModel.Protocols.WSTrust.WSTrustChannel.ReadResponse(Message response) 
at Microsoft.IdentityModel.Protocols.WSTrust.WSTrustChannel.Issue(RequestSecurityToken rst, RequestSecurityTokenResponse& rstr) 
at Microsoft.IdentityModel.Protocols.WSTrust.WSTrustChannel.Issue(RequestSecurityToken rst) 
at Microsoft.SharePoint.SPSecurityContext.SecurityTokenForContext(Uri context, Boolean bearerToken, SecurityToken onBehalfOf, SecurityToken actAs, SecurityToken delegateTo) 
at Microsoft.SharePoint.SPSecurityContext.<>c__DisplayClass7.b__6() 
at Microsoft.SharePoint.Utilities.SecurityContext.RunAsProcess(CodeToRunElevated secureCode) 
at Microsoft.SharePoint.SPSecurityContext.GetProcessSecurityTokenForServiceContext() 
at Microsoft.SharePoint.SPChannelFactoryOperations.CreateChannelAsProcess[TChannel](ChannelFactory`1 factory, EndpointAddress address, Uri via) 
at Microsoft.SharePoint.SPChannelFactoryOperations.CreateChannelAsProcess[TChannel](ChannelFactory`1 factory, EndpointAddress address) 
at Microsoft.Office.Server.UserProfiles.MossClientBase`1.get_Channel() 
at Microsoft.Office.Server.UserProfiles.MossClientBase`1.ExecuteOnChannel(String operationName, CodeBlock codeBlock) 
at Microsoft.Office.Server.UserProfiles.ProfilePropertyServiceClient.ExecuteOnChannel(String operationName, CodeBlock codeBlock) -
-- End of inner exception stack trace --- 
at Microsoft.Office.Server.UserProfiles.ProfilePropertyServiceClient.ExecuteOnChannel(String operationName, CodeBlock codeBlock) 
at Microsoft.Office.Server.UserProfiles.ProfilePropertyServiceClient.GetProfileProperties() 
at Microsoft.Office.Server.Administration.UserProfileApplicationProxy.RefreshProperties(Guid applicationID) 
at Microsoft.Office.Server.Utilities.SPAsyncCache`2.GetValueNow(K key) 
at Microsoft.Office.Server.Utilities.SPAsyncCache`2.GetValue(K key, Boolean asynchronous) 
at Microsoft.Office.Server.Administration.UserProfileApplicationProxy.InitializePropertyCache()

System.NullReferenceException: Object reference not set to an instance of an object. 
at Microsoft.Office.Server.Administration.UserProfileApplicationProxy.get_ApplicationProperties() 
at Microsoft.Office.Server.Administration.UserProfileApplicationProxy.get_PartitionIDs() 
at Microsoft.Office.Server.Administration.UserProfileApplicationProxy.IsAvailable(SPServiceContext serviceContext) 
at Microsoft.Office.Server.WebControls.MyLinksRibbon.get_PortalAvailable() 
at Microsoft.Office.Server.WebControls.MyLinksRibbon.EnsureMySiteUrls() 
at Microsoft.Office.Server.WebControls.MyLinksRibbon.get_PortalMySiteUrlAvailable() 
at Microsoft.Office.Server.WebControls.MyLinksRibbon.OnLoad(EventArgs e) 
at System.Web.UI.Control.LoadRecursive() 
at System.Web.UI.Control.LoadRecursive() 
at System.Web.UI.Control.LoadRecursive() 
at System.Web.UI.Control.LoadRecursive() 
at System.Web.UI.Control.LoadRecursive() 
at System.Web.UI.Control.LoadRecursive() 
at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
Cause:
This occurs on single-server deployments that use the built-in SQL database. If you have done a farm installation, this cause/resolution may not apply.

The Forefront Identity Manager Synchronization Service is corrupt and not working. This in turn affects the User Profile Synchronization Service and the Security Token Service application.

If your web applications are associated with the User Profile Service, you will get this error.

Resolution:
As per the TechNet article below, the User Profile Synchronization Service does not work on a single-server deployment with built-in databases.
Check the article for further details:
http://technet.microsoft.com/en-us/library/cc263202.aspx

To resolve the issue, delete the User Profile Synchronization Service instance from Central Administration.

Tuesday, August 2, 2011

Exporting and Importing Term Set to .csv file using PowerShell

SharePoint 2010 now provides the ability to attach metadata to almost all the items that exist in SharePoint. We can now manage which terms are created and how we associate them. This new feature is achieved using the "Managed Metadata Service".


Usually organizations create hierarchies and set up a term set so that users can associate the terms (just like categories or tags) with an item. This can later be used in search results to refine search items. SharePoint does not provide the ability to automatically synchronize the metadata across different farms. If we need to copy a term set, we need to export it (the export script below can be used) and then import it using the Term Store Management tool. In case you need to know how to import the term set using the Term Store Management tool, please check out these articles -


http://office.microsoft.com/en-us/sharepoint-server-help/import-a-term-set-HA101818255.aspx http://www.wictorwilen.se/Post/Create-SharePoint-2010-Managed-Metadata-with-Excel-2010.aspx


This is a manual process, and I have seen situations where automation is required. e.g. if the term set is open for users to add terms (the submission policy is set to Open), the terms are updated frequently. If you need the same term set data across multiple farms, you have to repeat the export and import process manually every time. Before I proceed, I have to let you know that exporting and importing the term sets will break the association with the existing term set. This is because when we import the term set into a new farm, the GUIDs are not imported. If we have a list/site column (in the source site) that is associated with a specific term, the association is made using the GUID of that term, not its name. So, if you have migrated the content and are now trying to export and import the metadata, the metadata items will be orphaned (but will still work, i.e. the functionality to add and remove metadata will work with a small setting change). In case you need to maintain the association of the IDs with the data in SharePoint sites across farms, I suggest you go with this method -


http://www.andrewconnell.com/blog/archive/2011/06/15/sharepoint-2010-managed-metadata-movingcopying-mms-instances-term.aspx


Note: In the above method, you are actually moving the entire term store and not individual term sets from groups.


With that said, let's move to a situation where term sets are updated weekly/daily and need to be synchronized across different farms. Doing this manually every week/day is redundant and time-consuming. We can do the export and import (keeping in mind that the association with the IDs will be lost), but again that is manual. The next approach is to use code or PowerShell scripts to export and import the term sets. Then, using batch files or Team Foundation Server, we can automate the process. Below you will find the PowerShell scripts which can export a term set (up to 7 levels deep) and then import it into the desired farm. At the start of each script some variables are set, e.g. term set name, Managed Metadata Service instance name, owner name, file name, etc. Change them according to your farm settings and details, and then run the script.


****** Export Script ******
Add-PSSnapin Microsoft.SharePoint.PowerShell
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Taxonomy")

#This is the directory where you need to export the term set as a .csv file.
#Please ensure that this folder exists before trying to export the term set.
$outPutDir = "Directory where the csv file should be saved"

#This should be the url of the Central Administration Site of the farm from where we would like to export the term set.
$siteURL = "Central Admin Site URL"

#This is the name of the Metadata Service Instance. e.g. "Managed Metadata Service"
$mmsServiceName = "Managed Metadata Service Instance Name"

#This is the group name where the term set exists.
#In case the group does not exists, the script will create a new group with this name under the term store.
$grpName = "Term Store Group Name"

#This is the term set name which we are going to export.
$termSetName = "Term Set Name"

#This is the name of the csv file which will contain the exported term set data.
$fileName = "FileName.csv"

try
        {
            $taxonomySite = Get-SPSite $siteURL
            $taxSession = Get-SPTaxonomySession -site $taxonomySite
            
            try
            {
                $termStore = $taxSession.TermStores[$mmsServiceName];
 
                if ($termStore -ne $null)
                {
                    try
                    {
                        $termGroup = $termStore.Groups[$grpName];
 
                        if ($termGroup -ne $null)
                        {
                            try
                            {
                                $termSet = $termGroup.TermSets[$termSetName];
 
                                if ($termSet -ne $null)
                                {
                                    [string]$csvDir = ""; 
                                    $csvDir = $outPutDir;
                                    $outPutFile = $csvDir + "\" + "$fileName";
 
                                    $sw = New-Object System.IO.StreamWriter($outPutFile);
 
                                    $sw.writeline('"Term Set Name","Term Set Description","LCID","Available","Term Description","Level 1 Term","Level 2 Term","Level 3 Term","Level 4 Term","Level 5 Term","Level 6 Term","Level 7 Term"');
                                    [Byte[]] $ampersand = 0xEF,0xBC,0x86;
                                    #$loop = $termGroup.TermSets.Count;
                                    #$count = 1;
 
                                            if($TermSet.TermsCount -ne 0)
                                            {
                                                $topTermOutput = '"' + $TermSet.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' +  $TermSet.Description + '","' + $TermSet.Languages + '","' +  $TermSet.IsAvailableForTagging + '","'; 
                                                
                                            foreach($childTerm in $TermSet.Terms)
                                            {
                                                if($childTerm.TermsCount -ne 0)
                                                {
                                                    #$topTermOutput = '"' + '","' +  $TermSet.Description + '","' + $TermSet.Languages + '","' +  $TermSet.IsAvailableForTagging + '","'; 
                                                    $heritage = $topTermOutput + '","' + $childTerm.Description + '","' + $childTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '"';
                                                    foreach($secondLevelTerm in $childTerm.Terms)
                                                    {
                                                        if($secondLevelTerm.TermsCount -ne 0)
                                                        {
                                                            foreach($thirdLevelTerm in $secondLevelTerm.Terms)
                                                            {
                                                                if($thirdLevelTerm.TermsCount -ne 0)
                                                                {
                                                                    foreach($fourthLevelTerm in $thirdLevelTerm.Terms)
                                                                    {
                                                                        if($fourthLevelTerm.TermsCount -ne 0)
                                                                        {
                                                                            foreach($fifthLevelTerm in $fourthLevelTerm.Terms)
                                                                            {
                                                                                if($fifthLevelTerm.TermsCount -ne 0)
                                                                                {
                                                                                    foreach($sixthLevelTerm in $fifthLevelTerm.Terms)
                                                                                    {
                                                                                        if($sixthLevelTerm.TermsCount -ne 0)
                                                                                        {
                                                                                            foreach($seventhLevelTerm in $sixthLevelTerm.Terms)
                                                                                            {
                                                                                                $heritage = $topTermOutput + '","' + $childTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $secondLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $thirdLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $fourthLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $fifthLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $sixthLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $seventhLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '"';
                                                                                                $sw.writeline($heritage);                                                                                               
                                                                                            }
                                                                                        }
                                                                                        else
                                                                                        {
                                                                                            $heritage = $topTermOutput + '","' + $childTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $secondLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $thirdLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $fourthLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $fifthLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $sixthLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '"';
                                                                                            $sw.writeline($heritage);
                                                                                        }
                                                                                    }
                                                                                }
                                                                                else
                                                                                {
                                                                                    $heritage = $topTermOutput + '","' + $childTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $secondLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $thirdLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $fourthLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $fifthLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '"';
                                                                                    $sw.writeline($heritage);
                                                                                }
                                                                            }
                                                                        }
                                                                        else
                                                                        {
                                                                            $heritage = $topTermOutput + '","' + $childTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $secondLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $thirdLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $fourthLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '"';
                                                                            $sw.writeline($heritage);
                                                                        }
                                                                    }
                                                                }
                                                                else
                                                                {
                                                                    $heritage = $topTermOutput + '","' + $childTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $secondLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $thirdLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '"';
                                                                    $sw.writeline($heritage);
                                                                }
                                                            }
                                                        }
                                                        else
                                                        {
                                                            $heritage = $topTermOutput + '","' + $childTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '","' + $secondLevelTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '"';
                                                            $sw.writeline($heritage);
                                                        }
                                                    }
                                                }
                                                else
                                                {
                                                     $heritage = $topTermOutput + '","' + $childTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '"';
                                                     $sw.writeline($heritage);
                                                }
                                            }
                                        }
                                        else
                                        {
                                             $heritage = $topTermOutput + '","' + $childTerm.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&") + '",,,,,,,' ;                                                
                                             $sw.writeline($heritage);
                                        }
                                    }
                                }
                                catch
                                {
                                    $sw.close();
                                    return "Problem occurred while creating the export file."
                                }
                                $sw.close();
                                write-host "Your CSV has been created at $outPutFile";
                                }
                            else
                            {
                                return "Termset $termSetName does not exist in the term store group $grpName";
                            }
                        }
                        catch [System.Exception]
                        {
                            write-host($_.Exception)
                            "Unable to acquire the termset $termSetName from the term group $grpName"
                        }
                    }
                    else
                    {
                        return "Term store group $grpName does not exist in the term store $mmsServiceName";
                    }
                }
                catch
                {
                    "Unable to acquire term store group $grpName from $mmsServiceName"
                }
            }
            catch
            {
                "Unable to acquire session for the site $siteURL";
            }
        finally
        {
            $ErrorActionPreference = "Continue";
        }
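A note on the repeated Replace(...) calls in the export script above: the term store does not allow a literal ampersand in term names, and SharePoint normalizes "&" to the full-width ampersand character (U+FF06) when a term is created. The $ampersand variable (defined earlier in the full script, not shown in this excerpt) is assumed to hold the UTF-8 bytes of that character, so the export converts term names back to a normal "&" before writing them to the CSV. A minimal sketch of that assumption:

```powershell
# Assumption: $ampersand holds the UTF-8 bytes of the full-width ampersand
# (U+FF06) that the term store substitutes for "&" in term names.
$ampersand = [System.Text.Encoding]::UTF8.GetBytes("$([char]0xFF06)")

# Converting a term name back for export ($term is any TaxonomyItem):
$cleanName = $term.Name.Replace([System.Text.Encoding]::UTF8.GetString($ampersand), "&")
```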

****** Import Script ******
Add-PSSnapin Microsoft.SharePoint.Powershell
$inputFile = "Complete path of the .csv file"
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Taxonomy")

#Change the below values as per your farm and term store settings.

#This should be the url of the Central Administration Site of the farm where we would like to import the term set
$siteURL = "Central Admin site URL"

#This is the name of the Metadata Service Instance. e.g. "Managed Metadata Service"
$mmsServiceName = "Metadata Service Instance"

#This is the group name where the term set exists. If the group does not exist, the script will create a new group with this name under the term store.
$grpName = "Group Name"

#This is the login ID of the user who should be set as the owner of the Group
$grpManager = "domain\username"

#This is the term set name which we are going to import.
$termSetName = "Term Set Name"

#This is the login ID of the user who should be set as the owner of the term set.
$termSetOwner = "domain\username"

#This is the login ID of the user who should be set as one of the stakeholders for the term set.
$tsStakeHolder = "domain\username"
#If you need to add more stakeholders, create new variables and add them before the CommitAll() statement (before setting the owner) in the later part of the code.

$taxonomySite = Get-SPSite $siteURL
$taxSession = Get-SPTaxonomySession -Site $taxonomySite
$termStore = $taxSession.TermStores[$mmsServiceName];
if ($termStore -ne $null)
{
    $termGroups = $termStore.Groups;
    
    #This checks if a group with the given name exists or not. If it does not exist, a new group with the given name is created below.
    $termGroup = $null;
    foreach($group in $termGroups)
    {
        if($group.Name -eq $grpName)
        {
            $termGroup = $group;
            break;
        }
    }
    if($termGroup -eq $null)
    {
        try
        {
            $termGroup = $termStore.CreateGroup($grpName);
            $termGroup.AddGroupManager($grpManager);
            $termStore.CommitAll();
            write-host "Term group $grpName created successfully";
        }
        catch
        {
            write-host "Creation of Term Group failed.";
        }
    }
    
    #If the group exists, delete all the term sets under the given group so the import starts clean.
    if ($termGroup -ne $null)
    {
        $termGroup.TermSets | foreach {
            $_.Delete();
            $termStore.CommitAll();
        }
        write-host "All term sets under the group $grpName deleted successfully";
    }
    
    #Importing the term set from the CSV file.
    try
    {
        $fileReader = [System.IO.File]::OpenText($inputFile);
        $importMan = $termStore.GetImportManager();
        $varImported = "";
        $varMessage = "";
        $importMan.ImportTermSet($termGroup, $fileReader,([REF]$varImported),([REF]$varMessage));
        $termStore.CommitAll();
        $termSet = $termGroup.TermSets[$termSetName];
        $termSet.AddStakeHolder($tsStakeHolder);
        $termSet.IsOpenForTermCreation = $true;
        $termSet.Owner = $termSetOwner;
        $termStore.CommitAll();
        write-host "Term Set re-imported successfully.";
    }
    catch
    {
        write-host "There was a problem while creating or importing the term set";
    }
}
else
{
    return "Term store $mmsServiceName does not exist";
}


You can copy the above code, save it as a .ps1 file, and run it against the farm(s) where you would like to import/export the term set.
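For reference, the ImportTermSet call expects the standard SharePoint 2010 managed-metadata import file format (the same layout the export script writes). A minimal sample with hypothetical term names:

```
"Term Set Name","Term Set Description","LCID","Available for Tagging","Term Description","Level 1 Term","Level 2 Term","Level 3 Term","Level 4 Term","Level 5 Term","Level 6 Term","Level 7 Term"
"Departments","A sample term set",,TRUE,,,,,,,,
,,,TRUE,,"Finance",,,,,,
,,,TRUE,,"Finance","Payroll",,,,,
```

The first data row describes the term set itself; each subsequent row repeats the parent terms in the level columns and adds one new term.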

Sunday, July 10, 2011

Metadata column not visible for users other than site collection administrators

Recently I came across an issue where data from a column of type "Managed Metadata" did not show up for users. Only site collection administrators were able to see the content.

Cause:
Users did not have permissions on a hidden list. The list is called the Hidden Taxonomy List and its URL looks like this: http://...SITEURL.../lists/taxonomyhiddenlist

Granting users permissions on this list makes the managed metadata values visible to them. This information comes from the managed metadata store (term store).

Resolution
Browse to the hidden taxonomy list.
Check the permissions on this list and verify that users have at least Read permissions. Usually, it is preferred to grant "NT AUTHORITY\AUTHENTICATED USERS" Read permissions on this list.
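If you prefer to script the permission grant, the steps above could be sketched as follows. This is a minimal sketch, assuming SharePoint 2010 on-premises, run from the SharePoint Management Shell; "http://SITEURL" is a placeholder for your site collection URL.

```powershell
# Grant Read on the hidden taxonomy list to all authenticated users.
# "http://SITEURL" is a placeholder - substitute your site collection URL.
$web = Get-SPWeb "http://SITEURL"
$list = $web.Lists["TaxonomyHiddenList"]

# Break inheritance but keep the existing role assignments, then add ours.
$list.BreakRoleInheritance($true)
$user = $web.EnsureUser("NT AUTHORITY\Authenticated Users")
$assignment = New-Object Microsoft.SharePoint.SPRoleAssignment($user)
$assignment.RoleDefinitionBindings.Add($web.RoleDefinitions["Read"])
$list.RoleAssignments.Add($assignment)
$list.Update()
$web.Dispose()
```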

Update: About a year later, Microsoft has now published a KB article for this issue
Managed Metadata column on a SharePoint 2010 list does not display values added from term store
