Tuesday, June 28, 2011

Vertical Bar Graph with XSLT in Data View Web Part (DVWP)

As part of a recent project, I needed to create a dashboard page of requests for executive management. However, since the data was in a list and not a database, this was easier said than done. I started off by researching my options and found this post on MSDN, which explains very well how to modify the XSL of a DVWP to create a bar graph. I used that post and the code below to do just that. You will notice that I overrode the .ms-selected CSS class so that I could control the image as well as the colors of the chart.

Example Bar Graphs:


My second issue was that I wanted a vertical bar graph too. I was unable to find an example anywhere on the web, so I sat down and started playing with the code.

I used the code from the MSDN article, but overwrote the CSS class in order to control the color of the chart:

.ms-selected {
background-image:url("/administration/BTG/images/chartrow.gif");
background-color: #6a9a21;
border-bottom: 1px solid #6a9a21;
border-top: 1px solid #FFFFFF;
}

Now, to change the above chart into a vertical bar chart, we have to replace three sections of code.

The first is the CSS:
I created a new class called .ms-selectedc and the c stands for column.

.ms-selectedc {
background-image:url("/administration/BTG/images/chartcolumn.gif");
background-color: #6a9a21;
border-left: 0px solid #6a9a21;
border-right: 0px solid #FFFFFF;
}

We have to replace the main table with the following code:

Replace -

With -

Finally we have to change the XSL that creates each bar.

Replace -

With -

Depending on the length of the column titles, you may have to play with the height and width of the table where the title and percentage are displayed.
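Since the original code screenshots may not survive here, this is a rough sketch of the vertical-bar idea only (the field name @Count, the $Total variable, and the 100px scale are illustrative, not the exact code from this project): each row becomes a bottom-aligned cell, and the bar's height is the row's percentage of the total.

```xml
<!-- Illustrative sketch, not the original code. Assumes a $Total variable -->
<!-- and an @Count field; adjust to your own DVWP row structure.           -->
<table cellpadding="0" cellspacing="0" style="height:100px;">
  <tr>
    <xsl:for-each select="/dsQueryResponse/Rows/Row">
      <!-- bottom-aligned cell so bars grow upward -->
      <td valign="bottom" align="center">
        <table cellpadding="0" cellspacing="0" width="20">
          <tr>
            <!-- the column version of the CSS class from above -->
            <td class="ms-selectedc">
              <xsl:attribute name="height">
                <xsl:value-of select="100 * @Count div $Total" />
              </xsl:attribute>
            </td>
          </tr>
        </table>
      </td>
    </xsl:for-each>
  </tr>
</table>
```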

If anyone has any suggestions for more efficient code, please let me know. I also suggest filtering out unneeded data in the CAML so as to reduce the load on the client.

Vertical Graph:

And my dashboard:

Friday, June 24, 2011

Powershell Delete Files After X Days and Keep Log

I have been managing a project to develop a simple app to upload files to a web server. With the way we were managing state, we had to temporarily store each file on the server until submit. This posed a problem with files that were put in temp but then the browser was closed: since closing the browser didn't clean up the file, I had to write a script to do it. I thought I'd share it.

This script has the -whatif flag in it. That means you can run it and it will simulate what it would do and tell you what would have happened. When you have it set for your environment, just take -whatif out, but be careful in doing so.

One nice feature of this script is that it will self clean the log files based on the number of days you tell it to delete. In my case I clean it out every day, so I only keep one log file.

#####################################################
#Checks to see if $files contains information, if it does not, then the script is finished
#$files stores all items in the directory and subdirectories where the last modified date is less than x days and is not a #folder
#if $files contains info then the script starts a log and removes all files that are stored in $files
#the logs are self cleaning because they are stored in the directory being checked and will be deleted as any other file
#remove -whatif from the foreach statement when you are ready for the script to actually delete files
#Andrew Alaniz - 6/8/2011
#####################################################

if($files = get-childitem D:\Location\ -recurse | where {$_.lastwritetime -lt (get-date).adddays(-1) -and -not $_.psiscontainer}){
    start-transcript D:\Location\$(get-date -format MMddyyHH).txt
    foreach ($file in $files){remove-item $file.fullname -force -verbose -whatif}
    stop-transcript
}

To execute this script, just set up a scheduled task that runs the following batch file; the batch file executes the PowerShell script:

powershell -command "& 'D:\Scripts\DeleteOldFiles.ps1' "

Example CAML queries for SharePoint Designer

I have recently been working on a project that deals heavily with XSLT and DVWPs. One thing to keep in mind is that data manipulation done in XSLT happens client side and thus slows load times. This is especially the case if you run just the equivalent of a SELECT * in your CAML query, as the entire dataset is returned and it is then up to the client to filter, group, order, and display the data.

Spare your audience some load time and filter as much as you can from the dataset via your CAML query. There are a number of useful sites out there that explain CAML in detail, so I will give a quick and dirty rundown and then provide the queries. Reference

A basic SELECT * CAML query looks like this: <View></View> and for those of you who had to learn LISP in school, these queries will bring back those fond memories. While they are not exactly the same, the recursive nature is apparent.

If you want to filter out what is returned you have to add, similar to T-SQL, a WHERE clause. This looks something like this:

<View>
     <Query>
          <Where>
               Your clause
          </Where>
     </Query>
</View>

And when saying WHERE something AND something else, the AND goes inside the WHERE clause:
<View>
     <Query>
          <Where>
               <And>
                    Your clause
                    Your clause
               </And>
          </Where>
     </Query>
</View>


A few key elements to note are:
<Eq></Eq> - Equal To
<Neq></Neq> - Not Equal To
<Gt></Gt> - Greater Than
<Geq></Geq> - Greater Than Or Equal To
<Lt></Lt> - Less Than
<Leq></Leq> - Less Than Or Equal To
<IsNull></IsNull> - Is Null
<BeginsWith></BeginsWith> - Begins With
<Contains></Contains> - Contains

Each of these conditionals follows this format:
<Eq>
     <FieldRef Name="Closed"/>
     <Value Type="Integer">
          1
     </Value>
</Eq>

Finally some useful CAML Queries for SharePoint:

Select all records created by the current user (change Eq to Neq to select all records not created by the current user):
<View>
     <Query>
          <Where>
               <Eq>
                    <FieldRef Name="Author"/>
                    <Value Type="Integer">
                              <UserID/>
                    </Value>
               </Eq>
          </Where>
     </Query>
</View>


All records created by current user and matching some criteria:

<View>
     <Query>
          <Where>
               <And>
                    <Eq>
                         <FieldRef Name="Author"/>
                         <Value Type="Integer">
                                   <UserID/>
                         </Value>
                    </Eq>
                    <Eq>
                         <FieldRef Name="Some Column"/>
                         <Value Type="Text">
                                   Matching Text Here
                         </Value>
                    </Eq>
               </And>
          </Where>
    </Query>
</View>
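One more pattern I find handy (the field and the seven-day offset are just an illustration, and some CAML references use Offset rather than OffsetDays): return only items modified in the last week, which trims the dataset before it ever reaches the client.

```xml
<View>
     <Query>
          <Where>
               <Geq>
                    <FieldRef Name="Modified"/>
                    <Value Type="DateTime">
                         <Today OffsetDays="-7"/>
                    </Value>
               </Geq>
          </Where>
     </Query>
</View>
```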

Finally, be sure to enter the queries into Designer using the HTML-encoded form of the >, <, and " characters. I recommend this site: just copy your CAML query into it and it will transform it: http://www.devtrends.com/custom/camlencode/
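As a quick sketch of what the encoding does (the structure is unchanged; only the angle brackets and quotes are escaped):

```xml
<!-- Raw CAML -->
<Where><Eq><FieldRef Name="Closed"/><Value Type="Integer">1</Value></Eq></Where>

<!-- HTML-encoded for Designer -->
&lt;Where&gt;&lt;Eq&gt;&lt;FieldRef Name=&quot;Closed&quot;/&gt;&lt;Value Type=&quot;Integer&quot;&gt;1&lt;/Value&gt;&lt;/Eq&gt;&lt;/Where&gt;
```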

Also very handy: there is an XML Tools plugin for Notepad++ that will do the conversions for you.

Enjoy.

Friday, April 1, 2011

SharePoint 2010 Licensing Quick Reference

--Updated 2/2/2011 Added a few additional reference links at the end.
--Updated 1/18/2011 Added Windows Server 2008 External Connector to licensing lists for public facing sites.

I have tried to compile a list of sources to help simplify (or at least consolidate) information on SharePoint 2010 licensing.

This is a good reference for what is/is not included in the different versions of SharePoint: http://goo.gl/HWasj

Visual breakdown of the basic requirements of a SharePoint 2010 Medium farm by @SharePointLola http://bit.ly/hEsml4
Another good blog on SharePoint 2010 Topology.

The following scenarios do not cover whether or not to use Windows Server Standard or Enterprise or SQL Server Standard or Enterprise.  However, for most scenarios, Windows Server Std will suffice for the web front ends and the application servers.  Windows Server Enterprise is required on the backend if you need a failover cluster.  Standard only supports NLB clustering and Enterprise is required to use failover clustering.  As for the DB servers, this really depends on how much you will need to scale moving forward.  This article is a good resource for comparing SQL Server versions: http://goo.gl/5Awry

This whitepaper is a good resource for why SQL Server 2008 R2 Enterprise is better with SharePoint 2010: http://goo.gl/1uD9Z
Good post on SQL Server High Availability.

Hardware and Software specs for SharePoint Server 2010 and SharePoint Foundation 2010.

SharePoint 2010 Standard Internal only (no publicly available content) (not including FAST):
• Web Front-End Servers: Windows Server 2008 R2 + SharePoint Server 2010
• Application Servers: Windows Server 2008 R2 + SharePoint Server 2010
• Database Servers: Windows Server 2008 R2 + (SQL Server 2008 R2 Proc License x Number of Processors)*
• Users: SharePoint Server 2010 Standard User CALs x number of users accessing internal content

SharePoint 2010 Enterprise Internal only (no publicly available content) (not including FAST):
• Web Front-End Servers: Windows Server 2008 R2 + SharePoint Server 2010
• Application Servers: Windows Server 2008 R2 + SharePoint Server 2010
• Database Servers: Windows Server 2008 R2 + (SQL Server 2008 R2 Proc License x Number of Processors)*
• Users: (SharePoint Server 2010 Standard User CALs + SharePoint Server 2010 Enterprise User CALs) x number of users accessing internal content (Enterprise CALs are cumulative: if you need Enterprise you must buy both Standard and Enterprise CALs)

SharePoint 2010 Standard Internet Facing only (all content is publicly available) (not including FAST):
• Web Front-End Servers: Windows Server 2008 R2 + SharePoint Server 2010 for Internet Sites, Standard + Windows Server 2008 External Connector
• Application Servers: Windows Server 2008 R2 + SharePoint Server 2010 for Internet Sites, Standard
• Database Servers: Windows Server 2008 R2 + (SQL Server 2008 R2 Proc license x Number of Processors)*
• Users: Content administrators for publicly available content are covered under the Internet Sites license

SharePoint 2010 Enterprise Internet Facing only (all content is publicly available) (not including FAST):
• Web Front-End Servers: Windows Server 2008 R2 + SharePoint Server 2010 for Internet Sites, Enterprise + Windows Server 2008 External Connector
• Application Servers: Windows Server 2008 R2 + SharePoint Server 2010 for Internet Sites, Enterprise
• Database Servers: Windows Server 2008 R2 + (SQL Server 2008 R2 Proc license x Number of Processors)*
• Users: Content administrators for publicly available content are covered under the Internet Sites license

SharePoint 2010 Standard for both internal and public facing sites on the same hardware (not including FAST):
• Web Front-End Servers: Windows Server 2008 R2 + SharePoint Server 2010 + SharePoint Server 2010 for Internet Sites, Standard + Windows Server 2008 External Connector
• Application Servers: Windows Server 2008 R2 + SharePoint Server 2010 + SharePoint Server 2010 for Internet Sites, Standard
• Database Servers: Windows Server 2008 R2 + (SQL Server 2008 R2 Proc license x Number of Processors)*
• Users: SharePoint Server 2010 Standard User CALs x number of users accessing internal content

SharePoint 2010 Enterprise for both internal and public facing sites on the same hardware (not including FAST):
• Web Front-End Servers: Windows Server 2008 R2 + SharePoint Server 2010 + SharePoint Server 2010 for Internet Sites, Enterprise + Windows Server 2008 External Connector
• Application Servers: Windows Server 2008 R2 + SharePoint Server 2010 + SharePoint Server 2010 for Internet Sites, Enterprise
• Database Servers: Windows Server 2008 R2 + (SQL Server 2008 R2 Proc license x Number of Processors)*
• Users: (SharePoint Server 2010 Standard User CALs + SharePoint Server 2010 Enterprise User CALs) x number of users accessing internal content


FAST requires additional licenses for the FAST Search Server 2010 for SharePoint.
*For virtualization of SQL Server: only active nodes in a cluster require a SQL Server license.  For Standard, Workgroup, and Enterprise, if you decide to license on a per-processor basis, you must buy a SQL Server license for each virtual processor. For Enterprise Edition, you can also choose to license all physical processors in a box. This gives you rights to run SQL Server on any number of virtual processors running on the same physical server. If you use Server/CAL-based licensing, for Standard and Workgroup editions, you must obtain SQL Server licenses for each Virtual Operating System Environment on which you run instances of SQL Server. However, for the Enterprise edition, if you have a Server license for the physical server, you may run any number of SQL Server instances in any Virtual Operating System Environment on that same physical server.  If you are using hardware partitioning on a multi-processor server, you can run any number of virtualized instances of SQL Server Enterprise Edition as long as all processors in that hardware partition are licensed. For example, if you have a partition of 10 physical processors on a 32-processor server, purchasing 10 processor licenses of SQL Server 2008 gives you the rights to run any number of SQL Server instances on physical or virtual environments on that partition.


Other references:
How To Buy SharePoint 2010
SharePoint 2010 Hosting
Ari Baker Blog
Bamboo Solutions Price Calculator for SP2010
Dan Holme - SP2010 Licensing
Microsoft SharePoint Licensing Page
Windows Server 2008 External Connector

Wednesday, March 2, 2011

You are probably NOT a SharePoint Configuration ‘Expert’ if…

After reading Mark Rackley’s recent post, I found that while it was amusing, it was also extremely helpful for me, as a consultant on the infrastructure side of SharePoint, both as a reference when talking to developers as well as great interview material.  I struggled with the title to use for this post, but I went with Configuration Expert because titles such as Architect or Administrator can be somewhat ambiguous and broadly defined.  I will be the second, after Mark, to say that I am not a SharePoint expert, as I find that I learn new things specifically about SharePoint every day.

What I am saying: if you call yourself an expert, the following should apply.
What I am not saying: if any of the following do not apply, then you are not an expert.

Now, let’s begin:

You are not a SharePoint Configuration Expert if you still refer to SharePoint 2010 as MOSS.
Microsoft Office SharePoint Server 2007 was commonly referred to as MOSS.  Valid options for referring to Microsoft SharePoint Server 2010 are SharePoint, SharePoint 2010, or SP2010 and maybe a few others, but not MOSS.

You are not a SharePoint Configuration Expert if you install SharePoint without a plan…
There are so many different configurations in which SharePoint can be installed that it is crucial to gather requirements prior to beginning an installation.  Governance, information architecture, performance, growth, service applications, and the list goes on: all of these play a vital role in a successful SharePoint deployment, and any SharePoint Configuration Expert should be able to speak to each of them and why they are important to a successful implementation.

You are not a SharePoint Configuration Expert if you ever recommend using the Basic Install...
These two articles, here and here, go into a number of reasons why no one should use the Basic Install.  The bottom line is if you look at the limitations and the long term consequences of running the basic install, only the Advanced installation option should ever be run.

You are not a SharePoint Configuration Expert if you use the Product Configuration Wizard to install all of your service apps into production...
I run the Product Configuration Wizard for installing my dev environments.  It’s easy and it’s simple. Best of all, I don’t have to worry about using PowerShell! (*sarcasm*)

This is a debatable topic, so I don’t think I can come out and say that you shouldn’t run the Product Configuration Wizard for any of the service applications ever.  There are a few that you should never run it for (example User Profile Service), but there are some, that if you can accept ugly database GUIDs in the database names, it could be argued that there is no technical reason to avoid it.  That being said, I will always recommend manually creating each service application as needed because the wizard does take some shortcuts, and I am rather particular about nomenclature in my installations.

You are not a SharePoint Configuration Expert if you use a single service account in your farm...
This particular topic is a perfect example of an ‘it depends’ topic.  The number of service accounts required for SharePoint may depend on the particular business requirements for a specific installation.  While there are some extremes on both sides, my point in this one is I believe best practice is to use more than one account.  There is just too much security risk associated with using a single service account as your Farm Account, service application accounts and especially user profile service account.  Todd Klindt has a good list of suggested service accounts.

You are not a SharePoint Configuration expert if you can't name the default authentication zones and know how many zones you can have...
This includes knowing how to extend a web application in order to implement the required zones.  This TechNet article speaks to zones in SharePoint 2010.

You are not a SharePoint Configuration expert if you can't configure FBA…
First of all, FBA is not a directory, it is an authentication type.  You can use FBA to authenticate against a customized claims provider, but you can also use it to authenticate against Active Directory.  FBA is often a valid business requirement for remote or extranet access.

You are not a SharePoint Configuration expert if you always recommend the same farm configuration for every installation…
Wait, you mean I can’t just offer a cookie cutter SharePoint implementation?  Businesses all have unique needs and unique processes.  SharePoint is no different.  Business requirements should dictate SharePoint deployments and not the other way around.

I’ll take a couple of points now from Mark’s blog:

You are not a SharePoint expert if you do not capitalize the P in SharePoint...

You are not a SharePoint Configuration Expert if you allow SharePoint Development Experts to manually copy DLLs...

I will also reiterate that you are not a SharePoint Configuration Expert if you are not involved in the community...
I will clarify my point in that it’s not about if you can drop the names of SharePoint ‘Experts’, but that you are sharing your experiences and offering your help to the community.  Basically I would hope to be able to find some reference to you somewhere in the SharePoint community if you are touting yourself as a SharePoint expert.

Finally, I agree totally with Mark in that SharePoint consultants and experts who don’t know everything are far less dangerous than those who think they know it all and are unwilling to listen to reason, the community, or their peers.  It takes more than just the ability to click next a few times to understand the SharePoint infrastructure. 


Thanks to @SharePointLola and @mrackley for the assist on this post!

Friday, February 25, 2011

Backups Equal Disaster Recovery - Not So Fast

I am beginning a series of blog posts on a topic that many people have in the back of their mind but most are afraid to confront: Disaster Recovery (DR) and Business Continuity (BC).  I get asked about these a lot, and specifically how they pertain to SharePoint.  The problem is that there are many more levels to data protection than just DR and BC, and they are not unique to SharePoint.  I am going to take a stab at going through the different levels that are encompassed within DR and BC from a data protection perspective and speak to a number of details in each that are particular to SharePoint.  In this first post I will just identify the different levels of protection (or lack of protection) and summarize each in my own words.  In each subsequent post I will take a deep dive into one item, speak to my experiences, and show how the perspective on each changes with the audience: a CIO sees each level differently than the IT manager, who sees them differently than the System Administrator, and so on.

Protecting business data is much different from ensuring that a business is protected from the loss of its data.  Put that way, the former is much easier than the latter.  In the real world, protecting a business' data is easy, but worthless on its own.  What good is data tucked away on a tape offsite if it takes a day to get it back onsite and no one has a tape drive, or a server with a SCSI card, to read it?  Even if all of the hardware is in place to read the data, what good is it if business resources can't get quick access to it?  By the time the data is somewhat accessible, it's been two days.  And since the business never took the time to figure out that downtime actually costs them $10,000 per business hour, but only spent a couple thousand dollars on a new tape drive, it's going to take months to recover from a 10-second brownout in the building.

This scenario is neither unlikely nor uncommon. Per an Iron Mountain white paper, “The Business Case For Disaster Recovery Planning: Calculating The Cost Of Downtime”: "A frequently cited study in Contingency Planning and Management magazine found that 40 percent of companies that shut down for three days failed within 36 months."

While those numbers can move up or down based on a number of factors, they should cause fear for any executive (especially in a small business) who has not considered how useless merely having backups of critical data is to his or her business.  Consider a small business whose primary income source is billable consulting hours; immediate access to archived data is not critical.  Now consider a stock trading company: it could lose thousands or millions of dollars in seconds without access to its data.  Business owners must NOT merely consider the cost of disaster recovery, but weigh the cost of downtime and disaster prevention against the cost of disaster recovery.

I see basically three facets of protecting a business from data loss: Backups, Disaster Recovery, and Business Continuity.  The cost is inversely proportional to the downtime tolerance level, also known as the Recovery Time Objective (RTO).  In other words, the less downtime a business can withstand, the more it's going to cost to mitigate the risk of downtime.  Here is a simple graph to show this relationship:

Keep in mind that just having the data available is one thing; a business' revenue-generating resources (could be employees, could be customers) must also be able to access the data.  I will factor this into some of my discussions, but for summary purposes I will keep this somewhat high-level.

Backups

Backups ensure that data critical to a business' revenue generation or penalty prevention (think SOX, GLBA, HIPAA) is in a state that can be recovered.  In other words, that data is on tape, disk, other media or in the cloud somewhere that can be accessed and restored.  Cost factors for backups are related to the speed of backups, the speed of restoration, the length of retention, the size of the data set, and the security of the data.

Disaster Recovery

Disaster Recovery is the ability to recover access to data (not just recover data) from a specific point in time (Recovery Point Objective or RPO) within a certain amount of time (Recovery Time Objective or RTO).  I do not include a Force Majeure disaster within the scope of DR and I will explain this later.

Within DR there are also three scenarios to consider:
 
Real-time recovery - This is the ability to recover data at the item level.  Think individual emails, documents within a SharePoint library, tables within a database

Data Corruption Recovery - The systems are still up and running, but a database has corrupted, a software update has hosed an OS, someone deleted a system file.

Hardware Failure - The database server dies, and not only do the databases have to be restored, but that third-party application has to be redeployed after SQL Server has been successfully reinstalled. (Doh, I knew I should have created an Active/Passive cluster.)

Cost factors for DR include application complexity, resource availability, investment in Backup strategies, investment in high availability and/or disaster prevention.

Business Continuity 

Business continuity is the ability to continue generating revenue even when the "normal" means of day to day operations are unavailable.  Think hurricane Katrina or a severe ice storm knocks out power for a week.

Costs associated with BC include investments in DR, strategic relationships with vendors, and resource availability (both revenue-generating and non-revenue-generating).

The bottom line is there are many considerations to take into account when calculating the costs and ROI associated with protecting a business' data.  I intend to provide a little insight here into some of those considerations, and provide some specific examples and suggestions that are directly related to SharePoint.

Monday, February 7, 2011

Catch 22 for Approval Workflow in SharePoint Designer

--I began blogging on this, then found someone else's blog post from Dec 2010 on this exact topic.  Though I never saw his post prior to writing this, credit to Lars Nielson.

I ran into an interesting dilemma while working on my most recent project.  The project involved a number of document libraries that required content approval as well as versioning.

The scope of work greatly limited the amount of customizations that could be made to SharePoint (i.e. building workflows in Visual Studio).

This introduced an interesting problem based on our initial game plan.  The workflow is basically a document goes through a change process and committee approvals before it is published.  End Users cannot see changes until an approved document is published.

The change process was the complicated piece to this because the organization had a number of variables within the process, so the only solution was to require each step in the workflow to be manually kicked off.

When I made it to the approval stage, we initially had content approval turned on in the libraries.  And here is where the catch-22 comes in: trying to set the content approval status to Approved in a document library that requires documents to be checked out and has content approval turned on.

To set the content approval status from Designer: OK, easy enough, there is an action for that... uh oh, the document must first be checked out.  OK, check out the document, set the approval status... uh oh, I get an error saying the content approval status cannot be set while a document is checked out.

Catch-22.  To set the content approval status in Designer, the document must be checked out, but the content approval status cannot be changed while a document is checked out.

Possible work-around - We did not go the route described below.  Instead, we turned off content approval and used a workflow to notify approvers; once approval was granted, a user would publish a major version of the document.  This was acceptable for this engagement.

I was going to explain the workaround here, but I ran across a blog post while getting my info together of someone who already documented it.  So in the spirit of giving credit where credit is due, Lars gets credit for this solution.  Lars Nielson blogged on this exact topic in late 2010.    http://discoverlars.wordpress.com/2010/12/28/update-the-approval-status-in-a-sharepoint-designer-workflow/

The bottom line in this is that for highly complex workflows, you really need Visual Studio.

Friday, February 4, 2011

Modify the Search Refinement Panel in SharePoint Without Code

The goal of this exercise is to add/modify the refinement panel web part on a search results page in SharePoint 2010.  No code required.  Keep in mind that customization of the OSSSearchResults.aspx page is unsupported, so in my example I created a Search Center site and pointed my contextual search to the Search Center.  This way I can customize the search results page.

Step 1 - Create Site Column
Create a site column (or columns) in the site where the items that will use the property are stored.  In my case I created a lookup column called Entity Lookup, pointing to a list and returning the Title of the list item.

Step 2 - Add Site Column and Map to Content
Next we need to create a document library and Add an Existing Site Column.  It is not required to add the column to content types, but doing so provides the ability to edit the column in the Edit Properties modal dialog window.  Be sure that you have a number of items that use this property and have a value set, or the refinement will not show up with the search results.

Step 3 - Perform a Full Crawl
I will explain why we do this in the next step.  You may be able to do an incremental crawl, but the first time I tried it, it did not work; after that, I just did a full crawl for good measure.  Obviously, in production you may have to wait until the next crawl to proceed.  To do this, go to Central Admin->Manage Service Applications->Search Service Application->Content Sources->drop down the content source and choose Start Full Crawl.




Step 4 - Verify Crawled Property and Add Managed Property
First verify that, after the crawl, the column(s) that were added are listed as a crawled property.  To do this go to Central Admin->Manage Service Applications->Search Service Application->Metadata Properties->Crawled Properties.  In the search box type "ows_[column name]" and click Find.  The result will depend on what type of column was created and how long the name is; if it is a multi-word column name, just use the first word in the search.  In my case the crawled property is ows_Entity_x0020_Lookup.




Initially, there will be no value in the Mapped To column.  Once you verify that the crawled property exists, select the Managed Properties link at the top of the page.

Choose New Managed Property.  Give it a name and a type; in my case I call it Entity and give it type Text.  Next choose Add Mapping:

Search for the crawled property you just verified and click OK.  I set the following options on the managed property and then clicked OK to create it.



Step 5 - Perform another Full Crawl

See Step 3.

Step 6 - Modify Refinement Panel
In my case I created a Search Site in the same site collection as the documents, but as long as the search page has access to the scopes where this content lives, you are OK.  The other advantage to a Search Site is that the search page is already built and linked up.  On the search page, click Edit Page, then edit the Search Refinement web part.



In the web part properties, choose Refinement.  Click the ellipsis on Filter Category Definition.


This will open a less than optimal view of the XML that makes up the definition.  The element for a new refinement looks like this:


My code element looks like this:

Take your code for each element and add it to the XML after the FilterCategories tag.  My XML file looks like this:


Finally, make sure to uncheck the Use Default Configuration check box.  This took me a minute to figure out, so save yourself the trouble and uncheck it.
Click OK to close the Web Part Properties screen.
Then refresh the page.  Now depending on where you put the code within the web part, you should see your new refinement options.

I hope to play with customizing the search page some more.  I will definitely add more posts like this as I do.

Monday, January 24, 2011

SharePoint 2010 Document Center: Send Reminder Prior to Date on Documents without Code

--Updated 1/26/2011: Improved workflow steps to prevent events from firing multiple times.  See Step 3.

Let's talk about the problem first.  I have a policies and procedures site, and policies need to be reviewed based on a date specified in each policy; certain groups have to be notified 90 days prior to that date.  The review date for these policies is not fixed, as circumstances may require a review to be early, late, or skipped.  The Target Review Date is also unknown, as there is no policy defining a single target review date; it is unique in calculation and value for each document.  There is also no defined approval process, as there are many processes and many exceptions to those processes.  Basically, I had to build this document library to support very manual steps in order to account for the various scenarios.  If these values or processes are well defined in your situation, then this post is overkill; I recommend taking a look at http://goo.gl/Ab4kH by Michal Pisarek.  More than likely Michal's post is on the extreme of well-defined and mine is on the extreme of loosely defined, so often the right solution will be somewhere in the middle.

This seems like a simple enough issue, right?  Until I realized that I can't do this as an 'on modify' workflow to send the email, because I don't want to send spam every time a minor revision is made; I can't set an Information Management Policy, because the only out-of-the-box operation is the addition of days/months/years; and I can't use a 'pause until date' workflow, because various scenarios would require the workflow to stop and restart, or the date might change after a modification.  I also want to do this without breaking open Visual Studio.

If anyone sees any flaws in my logic above, please let me know so that I can learn from what I missed.  The solution I propose to this problem is the best that I saw at the time, and it works with negligible impact to performance or complexity of the site.
 
Step 1 - Column Creation

Create three columns:
  • Target Review Date - Date/Time Type Column
  • Target Review Calc - Calculated Column
  • Target Review Notify - Date/Time Type Column 

I created the Target Review Date column as a site column so that it is available to all of my libraries.  I also have it specified as a content type (CT) column in each of my libraries (I have three CTs per library).  This column is visible so that my users can modify its value within the custom Document Information Panel in Microsoft Word.

For Target Review Calc, I also created it as a site column of type Calculated.  The formula I used is: =IF([Target Review Date]=0,"Some Value",[Target Review Date]-90). See this site for formula help.  Replace "Some Value" with your default value for this field.  This column will be used to set the value of the Target Review Notify column.
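For example, with the calculated column's output type set to Date and Time and a concrete default date in place of "Some Value" (the far-future date here is purely illustrative), the formula might read:

```
=IF([Target Review Date]=0,DATE(2099,12,31),[Target Review Date]-90)
```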

NOTE: In order for an Information Management Policy stage to recognize a column, it must be of Date/Time type and not just contain a Date/Time value, and it must be listed in the CT columns.

The third column is a site column of type Date/Time and I called it Target Review Notify.  The reason this column is required is because of my note above.  The Information Management Policy will not see the column if it is not of Date/Time type.  Since the Target Review Calc column is of type Calculated we cannot use it directly.


Now that we have our columns available to us, we need to make sure the libraries are setup correctly.

Step 2 - Add Columns to Libraries

Each library that will use this solution must have these three columns added.  I suggest adding them in Library settings, but they can also be added to the content types themselves.

Go to Document Library Settings-> Add from existing site columns and add the three columns. 
For Target Review Date, select Add to content type; I chose to also add it to the default view.  The Target Review Calc column does not need to be added to the content type or the view, and Target Review Notify needs to be added to the content type only.



Now navigate to the content types and (unless you have requirements otherwise) hide Target Review Notify.  There is no need for it to be available in the DIP.

Step 3 - Create Workflow to set Value

Now crack open SharePoint Designer to create the workflow to set the value.  I created a List workflow in case I needed to change functionality on a specific list in the future.  I call the workflow Target Review Value.  I am assuming that check out is required for modification.  If it is not, then the check-in/check-out steps are not required in the workflow.

The steps look like this:

I have this set to start on Create and on Modified.
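As a sketch (the SharePoint Designer action names are approximate, and the guard condition is my assumption for avoiding re-fires when the workflow's own check-in triggers the 'on modified' event), the steps are along these lines:

```
If Current Item:Target Review Notify not equals Current Item:Target Review Calc
    Check out item in Current Item
    Set Target Review Notify to Current Item:Target Review Calc
    Check in item in Current Item with comment: Set notify date
Else
    Stop the workflow and log: Notify date already current
```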



Step 4 - Create Workflow to perform task

Now create a similar workflow that does a task you need performed at a certain date prior to the Target Review Date.  In my case it was a simple Send Email, then stop workflow.

Do not set this workflow to start automatically; the Information Management Policy will start it.

Step 5 - Configure Information Management Policy

Go to Document Library Settings and choose Information Management Policy (IMP).  Now the next steps can also be accomplished by creating a custom timer job to monitor the documents.  For me, this was part of information management, so it made sense to put it here.

Open the IMP for each CT that will be using this policy.  Check the box for enable retention and choose Add stage.  Configure the stage as follows:


Select OK and you are good to go.

  • Review Notification - the column created and associated with the CT that uses this policy.  This tells the stage when to fire.
  • Action - Start a workflow; in our case, a workflow that sends an email.
  • Recurrence - tells the stage whether it needs to recur.  In my case I need it to recur until the document is archived: the email will fire every 30 days (or as often as needed) until the document is reviewed, the Target Review Date is changed, or the document is archived.

Step 6 - Modify Timer Job if needed
The timer job that checks the Target Review Date is scheduled to fire weekly.  Modifying this will impact the entire web application, so do not do so lightly; if there are numerous IMPs, changing it could cause resource issues.  In my case that will not happen, so I considered changing it to fire each day.

In order to do this go to Central Admin->Monitoring->Review Job Definitions->Choose Expiration Policy.  Modify the schedule if needed.
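The schedule can also be changed from PowerShell. A sketch: the web application URL is a placeholder, and "ExpirationProcessing" is, to my knowledge, the internal name of the Expiration Policy job in SharePoint 2010; verify the name on your farm before relying on it.

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# The expiration policy timer job is defined per web application
$job = Get-SPTimerJob -WebApplication "http://yourwebapp" |
       Where-Object { $_.Name -eq "ExpirationProcessing" }

# Change the schedule from weekly to daily, running in the given window
Set-SPTimerJob -Identity $job -Schedule "daily between 01:00:00 and 03:00:00"
```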

NOTE: You will notice that there is a timer job called Information Management Policy.  This does not control the frequency at which the stage conditions are checked.
Summary

This solution allows you to check a date value on a content type and trigger an event prior to that date without requiring Visual Studio.  There are some performance and/or management implications to this solution, but it works.

Friday, January 7, 2011

Deploy Office Web Apps to multi-server SharePoint Farm

I recently installed Office Web Apps on a multi-server farm, and I had trouble finding definitive steps for doing this.  Basically, perform the steps outlined here, with the exceptions noted below: http://www.mukalian.com/blog/post/2010/12/11/Installing-Office-Web-Apps-Existing-SharePoint-2010-Server-Farm.aspx

Stop at step 5 and perform steps 1-5 on each web front end and application server before continuing on to step 6.  I performed steps 6 and beyond on the server that is running Central Administration.

In Step 14, the services are only available on the server running Central Admin (in my case the Application server).

In Steps 17, 18, and 19, choose an existing app pool and use SharePoint Web Services Default (referenced in http://technet.microsoft.com/en-us/library/ff431687.aspx).

Other than the minor changes noted above, deploying Office Web Apps to multiple servers is straightforward.