Harnessing SignalR in SharePoint

Note: For those interested in the implementation of SignalR in SharePoint 2013, view the latest post Harnessing SignalR in SharePoint 2013 (Office 365)

On the 1st of April Bil Simser wrote an article Introducing SharePointR. I happened to read it on the 3rd, and as such it wasn't until I got to the 'quote' by David Fowler that I twigged to what was going on. It was a clever April Fools' post and likely got a lot of people excited. For me, it piqued my interest. I'd heard of SignalR in passing but had yet to delve into it. Once I had, I have to say I got pretty excited myself. It's super cool. I, along with one of my extremely talented colleagues, Elliot Wood, went about getting SignalR up and running in a SharePoint environment.

SignalR is relatively new and as such the information available isn’t extensive, but what is out there is pretty good. There are a few examples you should check out right off the bat: Scott Hanselman’s Asynchronous scalable web applications with real-time persistent long-running connections with SignalR and Justin Schwartzenberger’s Learn how to use SignalR and Knockout in an ASP.NET MVC 3 web application to handle real-time UX updates were the two I found to start with. Both link off to a number of other valuable examples.

So what is SignalR? Essentially, it's real-time client-server communication on the web. Without having to constantly poll or perform any refreshes, data on the page will 'magically' update in front of your eyes. The library is maintained on GitHub, which gives you access to the latest code, issues and documentation at a central source. I'm not going to go too far into what it is, because with the excitement that's being generated around this, plenty of other people are doing a far better job of it than I could. I'll skip straight to the fun stuff.

So if the examples are already out there, why can’t we just plug this straight into SharePoint?

The majority of examples which currently exist tend to host the hub and the client in the same project, and hence they're hosted on the same domain. This works great: you can spin up your .NET 4.0 web application and everything will work smoothly. The only problem is that SharePoint runs on the .NET 2.0 framework, so you won't be able to add the SignalR DLLs to your SharePoint project.

This however is not a deal-breaker. As long as your hub is hosted on a .NET 4.0 web application you can leverage SignalR in a simple HTML file with JavaScript, so surely that would be easy enough to plug into a SharePoint web part?

Firstly, it's not exactly simple. It requires something called cross-domain calling, which I've since found out is a bit of a pain to get working across different browsers. This is where the information fell down a little, or at least was scattered around. I've read more StackExchange and GitHub issues than I care to dig back up and link to for this article, so excuse my lack of referencing here. One page I did come across which got me most of the way there was Thomas Krause's Making Cross-Domain Calls in SignalR, which summed it up pretty nicely but still didn't work in all browsers for me. But more on this later.

Secondly, even with all those issues resolved we still want a way to trigger SignalR to broadcast an update from SharePoint, and unless you're doing that purely with client interaction in the UI, chances are that's going to mean handling create, update and delete events via event receivers. That brings us back to our first issue – these will be on the .NET 2.0 framework and won't be able to reference the SignalR DLLs. So how do we get around this?

Essentially what is required is a bridge between SharePoint's .NET 2.0 environment and a .NET 4.0 one. I liked the way Elliot termed this better: breaking the .NET barrier. My initial thought was hosting a WCF service and passing the information from the event receiver to that service to be broadcast by SignalR, and I still think that would be the ideal solution. Elliot, however, beat me to the implementation by making the leap via HTTP, posting to a handler and sending the information via query string, and for the purposes of the proof of concept (minimal data being transferred) this did the trick nicely.

With all the pieces of the puzzle in place, it’s time to implement our SignalR in SharePoint proof of concept. The idea is pretty simple – consider it a mini task-tracking system. Tasks will get added to the pile and others will be completed. The service quality manager will have a dashboard on their monitor open all day receiving live information on whether performance targets are being hit on a day-to-day basis. Let this be a glimpse into the power of what SignalR can achieve and let your imagination run wild on possible real-world implementations.

Step 1: Create the Hub

The Hub needs to be a .NET 4.0 web application. The majority of examples on the net state that the first step of implementing any SignalR application is to use NuGet to retrieve the SignalR DLLs and scripts. This is fine in theory, but while we were investigating our cross-browser issues (non-performance in Firefox and Chrome) Elliot noticed that the NuGet version of SignalR is not the latest, so downloading the latest ZIP and re-referencing the DLLs is the way to go. In fact the 'latest' version from GitHub at the time of writing included another required DLL that the NuGet version didn't bring across – SignalR.Hosting.Common.dll. Others that you'll want the updated versions of (or at least the ones we used) include SignalR.dll, SignalR.Hosting.AspNet.dll and Newtonsoft.Json.dll.

The next step is to add an entry to the web.config file to allow the cross-domain calls. This requires adding the following snippet to the system.webServer node:

<httpProtocol>
  <customHeaders>
    <add name="Access-Control-Allow-Origin" value="*" />
  </customHeaders>
</httpProtocol>

The final step is to add a class to your project to house the Hub:

using SignalR.Hubs;

namespace SignalRHub
{
  public class SharePointHub : Hub
  {
    public void Send(string message)
    {
      // Call the addMessage method on all clients
      Clients.addMessage(message);
    }
  }
}

At this point all the extraneous components of the project can be removed so you’re left with the class, the web.config and packages.config. Just a note – the Send implementation is somewhat redundant as we’ll be calling the client script from the handler below rather than the hub itself – but it’s useful for testing.

Step 2: Create the HTTP Handler

The handler can exist in the same project created above and will essentially be the tool to receive the Query String data from SharePoint and broadcast it via SignalR to the clients. All credit here goes to Elliot for the concept, I’ve adapted it to match my SignalR implementation (I used a Hub, he used a Persistent Connection).

One major thing to point out is that the documentation currently offers this as the method of broadcasting over a Hub from outside of a Hub:

using SignalR.Infrastructure;

IConnectionManager connectionManager = AspNetHost.DependencyResolver.Resolve<IConnectionManager>();
dynamic clients = connectionManager.GetClients<MyHub>();

This documentation however is only valid for the 0.4.0 build of SignalR you would get from NuGet – the latest version obtains the IConnectionManager by other means:

IConnectionManager connectionManager = Global.Connections;

The resulting code for ProcessRequest is as follows:

public void ProcessRequest(HttpContext context)
{
    context.Response.ContentType = "text/plain";

    IConnectionManager connectionManager = Global.Connections;
    dynamic clients = connectionManager.GetClients<SharePointHub>();

    var payload = new
    {
        TasksOpenedToday = context.Request.Params["TasksOpenedToday"],
        TasksCompletedToday = context.Request.Params["TasksCompletedToday"],
        LightStatus = context.Request.Params["LightStatus"]
    };

    JavaScriptSerializer jss = new JavaScriptSerializer();
    var payloadJSON = jss.Serialize(payload);
    clients.addMessage(payloadJSON);
}
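
For completeness, here's a rough sketch of the class that ProcessRequest sits in – it's just a standard generic handler. The SharePointRProxyHandler name simply matches the .ashx file the event receiver below posts to, and the namespace is illustrative:

using System.Web;
using System.Web.Script.Serialization; // used by the ProcessRequest body above
using SignalR.Infrastructure;          // used by the ProcessRequest body above

namespace SignalRHub
{
  public class SharePointRProxyHandler : IHttpHandler
  {
    public void ProcessRequest(HttpContext context)
    {
      // body as shown above
    }

    // The handler holds no per-request state, so it's safe to reuse
    public bool IsReusable
    {
      get { return true; }
    }
  }
}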

Step 3: Create the Event Receiver

There's nothing particularly fancy going on here aside from posting to the HTTP handler we created in Step 2. The rest of the code is just a simple event receiver bound to a task list. If you need help at this step then take a look at Walkthrough: Deploying a Project Task List Definition. We'll need to override the ItemAdded, ItemUpdated and ItemDeleted events and add the following code (a sketch of the full receiver class follows at the end of this step):

Broadcast(getPayload(properties));

private string getPayload(SPItemEventProperties properties)
{
    SPList list = properties.List;
    SPQuery query = new SPQuery();
    query.Query = "<Where><Eq><FieldRef Name='Created' /><Value Type='DateTime'><Today /></Value></Eq></Where>";
    SPListItemCollection items = list.GetItems(query);
    double tasksOpenedToday = items.Count;

    double tasksCompletedToday = 0;
    foreach (SPListItem item in items)
    {
        if (item["Status"].ToString() == "Completed") tasksCompletedToday++;
    }

    string colour = "RED";
    // Guard against division by zero when no tasks have been created today
    int percentage = tasksOpenedToday > 0
        ? (int)Math.Floor(tasksCompletedToday / tasksOpenedToday * 100)
        : 0;
    if (percentage >= 70) colour = "GREEN";
    else if (percentage >= 50) colour = "ORANGE";

    return string.Format("?TasksOpenedToday={0}&TasksCompletedToday={1}&LightStatus={2}",
        tasksOpenedToday.ToString(),
        tasksCompletedToday.ToString(),
        colour);
}

private void Broadcast(string Payload)
{
    WebRequest request = WebRequest.Create(string.Concat("http://server:port/SharePointRProxyHandler.ashx", Payload));

    // Dispose the response so the connection is released straight away
    using (WebResponse response = request.GetResponse()) { }
}
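
For reference, the receiver overrides themselves then become one-liners. Here's a minimal sketch of the full class – the class name is illustrative, and the binding to your task list is done in Elements.xml as per the walkthrough linked above:

using Microsoft.SharePoint;

public class TaskDashboardReceiver : SPItemEventReceiver
{
    public override void ItemAdded(SPItemEventProperties properties)
    {
        base.ItemAdded(properties);
        Broadcast(getPayload(properties));
    }

    public override void ItemUpdated(SPItemEventProperties properties)
    {
        base.ItemUpdated(properties);
        Broadcast(getPayload(properties));
    }

    public override void ItemDeleted(SPItemEventProperties properties)
    {
        base.ItemDeleted(properties);
        Broadcast(getPayload(properties));
    }

    // getPayload and Broadcast as defined above
}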

Step 4: Create the Web Part

We’re on the home stretch. We’ve got an event receiver posting data to an HTTP handler, which is in turn broadcasting that via SignalR. The only thing left to do is create the client to listen out for that broadcast. This can essentially be HTML and JavaScript and as such doesn’t really need to be a web part at all, but I’ll be creating a visual one in the interests of effective deployment.

It's here that we need to think back to when we retrieved the latest files from GitHub for SignalR. You'll now need to grab the latest JavaScript files from that package and store them in SharePoint so we can reference them in our web part. You'll also need to grab and store Knockout.

There are a few differences you'll need to consider compared to the generic example provided for implementing the SignalR client for hub communication. Firstly, instead of referencing /signalr/hubs in your script src, you'll need to reference it at the location where it actually exists:

<script src="http://server:port/signalr/hubs" type="text/javascript"></script>

Secondly, you'll need to force jQuery to support cross-site scripting:

jQuery.support.cors = true; //force cross-site scripting

Thirdly, your hub URL will also need to point to the relevant location:

$.connection.hub.url = 'http://server:port/signalr';

Your hub name will need to be referenced in camel case (which isn't obvious from the example, since the hub name there is all lower case):

var sharePointHub = $.connection.sharePointHub;

Your client handler will need to parse the JSON being sent, since it's not a simple string:

sharePointHub.addMessage = function(json) {
    var data = JSON.parse(json);
};

And finally, you'll need to pass a couple of options into the call that starts the connection to your Hub:

$.connection.hub.start({ transport: 'longPolling', xdomain: true });

That pretty much covers it. Add in some initial server-side population of the viewModel and the necessary Knockout bindings, and the web part is ready to be deployed. You can see the final web part (minus the basic server-side population of variables) below:

<style type="text/css">
    .dashboard-item { margin-bottom: 10px; }
    .dashboard-label { font-weight: bold; }
    #LightStatus { width: 50px; height: 50px; }
    .RED { background-color: Red; }
    .ORANGE { background-color: Orange; }
    .GREEN { background-color: Green; }
</style>

<script src="/SiteAssets/jquery-1.7.min.js" type="text/javascript"></script>
<script src="/SiteAssets/json2.min.js" type="text/javascript"></script>
<script src="/SiteAssets/jquery.signalR.min.js" type="text/javascript"></script>
<script src="/SiteAssets/knockout-2.0.0.js" type="text/javascript"></script>
<script src="http://server:port/signalr/hubs" type="text/javascript"></script>

<div class="dashboard-item"><span class="dashboard-label">Tasks Opened Today: </span><span data-bind="text:tasksOpenedToday"></span></div>
<div class="dashboard-item"><span class="dashboard-label">Tasks Completed Today: </span><span data-bind="text: tasksCompletedToday"></span></div>
<div id="LightStatus" data-bind="attr: { class: lightStatus }"></div>

<script type="text/javascript">
    var viewModel = {
        tasksOpenedToday: ko.observable(),
        tasksCompletedToday: ko.observable(),
        lightStatus: ko.observable()
    };

    $(document).ready(function () {
        jQuery.support.cors = true; //force cross-site scripting

        $.connection.hub.url = 'http://server:port/signalr';

        // Proxy created on the fly
        var sharePointHub = $.connection.sharePointHub;

        // Declare a function on the chat hub so the server can invoke it
        sharePointHub.addMessage = function (json) {
            // Update viewModel values here
            var data = JSON.parse(json);
            viewModel.tasksOpenedToday(data.TasksOpenedToday);
            viewModel.tasksCompletedToday(data.TasksCompletedToday);
            viewModel.lightStatus(data.LightStatus);
        };

        //Populate viewModel via ASP.NET and bind to Knockout
        viewModel.tasksOpenedToday("<%= InitialTasksOpenedToday %>");
        viewModel.tasksCompletedToday("<%= InitialTasksCompletedToday %>");
        viewModel.lightStatus("<%= InitialColour %>");
        ko.applyBindings(viewModel);

        // Start the connection
        $.connection.hub.start({ transport: 'longPolling', xdomain: true });
    });
</script>

So how does it look in the end? You be the judge.

So there you have it. I'd be surprised if there's anyone out there who didn't think this was pretty awesome. With any luck this will inspire you to go forth and make your own SharePoint implementations with SignalR – I'm looking forward to seeing what will be achieved with this excellent technology. Already Elliot has taken up the challenge to bring Bil's 'April Fools' concept to life – it's the kind of inspiring functionality that makes you want to go out and experiment. Just one last note: for anyone fearful of how this somewhat new technology would fare in an enterprise environment, take a look at the video C#5, ASP.NET MVC 4, and asynchronous Web applications.

Performance Optimising SharePoint Sites – Part 3

In Performance Optimising SharePoint Sites – Part 1 of this series I focussed on some of the first steps you should undertake or consider when embarking on performance optimising your SharePoint site. Performance Optimising SharePoint Sites – Part 2 explored some of the platform-independent techniques at your disposal, while this part of the series will identify some of the SharePoint-specific techniques that can be leveraged.

This is another topic where the information available is quite extensive. In a lot of cases however the information is either spread around or quite targeted, so I think there is some value in consolidating a lot of these ideas as part of this series.

One aspect of performance optimisation for SharePoint sites I won't delve heavily into is the infrastructure and administration side. This isn't particularly my area of expertise, so instead I'll invite you to read some articles from authors better placed to cover it, including Arpan Shah's Things to Consider for SharePoint Performance, David Lozzi's Improving SharePoint's Performance and Eric Shupps' 10 steps to optimize SharePoint performance.

Make sure your Objects are Disposed Correctly

You could argue this has been done to death and that surely no one out there still falls into the trap of incorrectly disposing, or failing to dispose, their SharePoint objects – but no performance optimisation article would be complete without at least mentioning it. It's one of the most serious pitfalls a developer can fall into when developing a SharePoint site. Two main resources exist to steer you clear of this evil: the MSDN articles Disposing Objects for SharePoint 2010 and Best Practices: Using Disposable Windows SharePoint Services Objects for SharePoint 2007, and the SPDisposeCheck tool which can be run over your code to check for potential errors. There really is no excuse for letting these issues slip through the cracks, and checking for them should definitely form part of your optimisation process.
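
As a quick refresher, the basic pattern is to wrap any SPSite or SPWeb you create yourself in a using block, and to leave objects SharePoint hands to you alone – the URL and list name below are purely illustrative:

// Dispose objects you create yourself
using (SPSite site = new SPSite("http://server/sites/tasks"))
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["Tasks"];
    // work with the list here
}

// ...but never dispose objects owned by SharePoint, such as SPContext.Current.Web
SPWeb contextWeb = SPContext.Current.Web;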

Perform a Code Review and Ensure Your SharePoint Code is Optimal

Aside from the object disposal mentioned above, a number of other code factors exist which can leave your pages rendering in a less than optimal time. A number of blog posts run through a bunch of these, including Andreas Grabner's How to avoid the Top 5 SharePoint Performance Mistakes, Eric Shupps' SharePoint.Performance: Optimizing Web Parts and Waldek Mastykarz's Performance of various methods to retrieve one list item. However the best resource I've found in regard to writing optimal code is the MSDN Developer Best Practices Resource Center – definitely worth your time parsing through all the information on that site to ensure your code is as optimal as it can be.
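
To give one concrete flavour of the kind of issue those articles cover: retrieving a single item by enumerating SPList.Items pulls every item (and every field) back from the content database, whereas a scoped SPQuery only retrieves what you actually need. A quick sketch, assuming an SPList instance named list and illustrative field values:

// Slow: walks the entire list looking for one item
// foreach (SPListItem item in list.Items) { if (item.Title == "My Task") { ... } }

// Better: let the content database do the filtering and limit the result set
SPQuery query = new SPQuery();
query.Query = "<Where><Eq><FieldRef Name='Title' /><Value Type='Text'>My Task</Value></Eq></Where>";
query.ViewFields = "<FieldRef Name='Title' /><FieldRef Name='Status' />";
query.RowLimit = 1;
SPListItemCollection items = list.GetItems(query);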

Use Caching within your Code

No matter how optimal you eventually make your code, it will still be slower than not having to query the information in the first place. Caching is a great mechanism that allows you to run the query once and then retrieve the data from memory on subsequent loads. There's always a trade-off between caching the data for performance and the currency of that data – the trick is to find the balance between improving the performance of your site and ensuring the data being displayed is relatively up to date, and this could differ on a case-by-case basis. You can either harness the built-in SharePoint objects which handle the caching for you or cache the objects yourself. Claudio Brotto has a good article on all things caching at SharePoint Internet Sites – Performance Optimization for Data Access which is worth a read.
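
For the latter approach, the ASP.NET cache (HttpRuntime.Cache, from System.Web and System.Web.Caching) is usually all you need. A minimal sketch, reusing the list and query from the sketch above, with an arbitrary cache key and a five-minute window – note that caching SPListItemCollection objects directly isn't recommended, so project the results into something cache-safe such as a DataTable first:

// Check the cache first; fall back to the expensive query and cache the result briefly
DataTable tasks = HttpRuntime.Cache["TodaysTasks"] as DataTable;
if (tasks == null)
{
    tasks = list.GetItems(query).GetDataTable();
    HttpRuntime.Cache.Insert("TodaysTasks", tasks, null,
        DateTime.UtcNow.AddMinutes(5), Cache.NoSlidingExpiration);
}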

Make Use of SharePoint’s Built-In Caching Functionality

SharePoint also has other built-in forms of caching that can be harnessed for your site, both identified in Claudio's article above: the Output Cache and the BLOB Cache. There are pitfalls associated with caching and these need to be considered, however caching is one of the most powerful methods of improving the load time of your SharePoint site. For more information on BLOB caching I'd recommend reading Sean McDonough's Do You Know What's Going to Happen When You Enable the SharePoint BLOB Cache? – it's a long but quality post. For Output Caching (and caching in general) have a read of Tobias Zimmergren's Caching in SharePoint 2010. As a side note, particularly for SharePoint 2007, you may need to consider the issues Chris O'Brien identifies in his article Optimization, BLOB caching and HTTP 304s.

Get Rid of those Unnecessary SharePoint JavaScript Files

If your site has been extended to be solely an anonymous internet site then chances are a bunch of JavaScript is being downloaded to the page that will never be used by the end user, and you should strongly consider changing that. Microsoft covered this for SharePoint 2007 in a section (within a larger article) titled Managing Page Payload (Small Is Good). Chris O'Brien revisited it for SharePoint 2010 in his post Eliminating large JS files to optimize SharePoint 2010 internet sites, and Mahmood Hamed extended on that in his article Eliminate large JS files to optimize SharePoint 2010 internet sites: the complete reference.

Consider Warming up your SharePoint Site

Warming up the pages across your SharePoint site is a good idea for two reasons. Firstly, ASP.NET's just-in-time compilation will often cause the first access of a site to be far slower than subsequent requests. Secondly, the caching techniques discussed previously only kick in once a page has been visited for the first time. By running a script or job to warm up the site you can avoid the 'first hit' performance lag associated with these issues. There are a number of sources of information regarding this topic, but one article that seems to collate a lot of them is Wahid Saleemi's Roundup: SharePoint Warm-Up Scripts.
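
The scripts in that roundup vary, but the heart of any of them is simply requesting each page ahead of time so that compilation has occurred and caches are primed before the first real visitor arrives. A bare-bones console-app sketch – the URLs are illustrative, and you'd schedule something like this to run after each application pool recycle:

using System;
using System.Net;

class WarmUp
{
    static void Main()
    {
        string[] urls =
        {
            "http://server/Pages/Default.aspx",
            "http://server/Pages/Tasks.aspx"
        };

        foreach (string url in urls)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            request.UseDefaultCredentials = true; // authenticate as the account running the job

            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine("Warmed {0} ({1})", url, response.StatusCode);
            }
        }
    }
}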

So that about sums it up – my take on a start-to-finish process for performance optimising a SharePoint site. When approaching this task with limited time and budget it's important to target the factors you believe will offer the greatest gain for the time required to implement them. With any luck the tasks completed will be identifiably successful enough that future projects will be undertaken with performance in mind right from the start.

Performance Optimising SharePoint Sites – Part 2

In Performance Optimising SharePoint Sites – Part 1 of this series I focussed on some of the first steps you should undertake or consider when embarking on performance optimising your SharePoint site. This part of the series will explore some of the platform-independent techniques at your disposal, while Performance Optimising SharePoint Sites – Part 3 will identify some of the SharePoint-specific techniques that can be leveraged.

Before I delve into some of the techniques available I should point out that the information on this subject is extensive. It may seem a bit redundant even including this section in my series given the wealth of information available elsewhere; however, I've done so for completeness. Both Google and Yahoo offer terrific sources of information on this subject matter if you'd prefer to go directly to the source – I'll try to differentiate my contribution by linking to alternate tools and resources.

Minimise HTTP Requests

Reducing the number of HTTP requests on your page is one of the best ways to reduce load time, particularly for first-time visitors. As pointed out on Yahoo's performance rules page (via Tenni Theurer's Browser Cache Usage – Exposed!), 40-60% of daily visitors to your site arrive with an empty cache. Every request requires a round trip to the server, so it's easy to see how this component of page optimisation could provide a significant improvement to the overall load time of the page. The majority of techniques that can be used to achieve this are identified in the following three sections.

Optimise your JavaScript

There are a number of ways in which the JavaScript on your site can be performance optimised. Firstly, you should Move Scripts to the Bottom as Steve Souders explains. This enables progressive rendering and helps to achieve parallel downloads, and should only be done where it doesn't impact the visual rendering of the page. You should ensure that JavaScript isn't duplicated within your site – either across various libraries or, particularly, by referencing the same file multiple times (which can happen if you've included the reference in multiple web parts or controls). Your JavaScript should exist in external files rather than inline so that it can be cached and hence not add to the size of the HTML document. You should combine your JavaScript into as few external files as possible to minimise the number of HTTP requests, and finally you should minify those external JavaScript files. There are a number of resources available to assist you in automating the minification process, whether for Visual Studio as described in Dave Ward's post Automatically minify and combine JavaScript in Visual Studio, or if you prefer it with a SharePoint flavour, via Waldek Mastykarz's Minifying JavaScript and CSS files made easy with Mavention SharePoint Assets Minifier.

Optimise your CSS

As with optimising JavaScript, you can perform a number of optimisations on your CSS. Whereas you should load your JavaScript at the bottom of the page to enable progressive rendering, with CSS you should Put Stylesheets at the Top. You should always refactor your CSS, much as you would your code, to ensure it is optimal and doesn't repeat definitions. Take advantage of the fact you can assign multiple classes to your elements – for instance, where appropriate, include a base CSS class which holds the shared definitions and another which holds the unique ones. You should externalise your CSS for the same reasons as your JavaScript, and combine it into as few external files as appropriate (finding the balance between reducing HTTP requests and combining a large amount of CSS which is only used on one or two pages). Finally, you can also minify your CSS in much the same fashion as your JavaScript.

Optimise your Images

Along with HTTP requests, the size of images is one of the biggest factors in determining page load times. There are three major areas of focus when it comes to optimising your images. Firstly, it is important that the images are as optimised as possible to begin with. This means achieving smaller file sizes without reducing the quality of the image itself. A number of tools exist to assist with this, highlighted in Jacob Gube's post 8 Excellent Tools for Optimizing Your Images – the one I'm most familiar with is Yahoo's Smush.it. Secondly, it's important to ensure that you're not relying on the browser to resize your images. It is common to find large images shrunk only by the dimensions of the containing element, which is definitely not optimal. Finally, CSS sprites can be used to reduce the number of HTTP requests required to retrieve the images, as explained in Chris Coyier's CSS Sprites: What They Are, Why They're Cool, and How To Use Them. One thing you'll want to consider is that you may be able to optimise the images for the site initially, but if content is being added by end users and they upload and reference unoptimised, HTML-resized images on a page, you're at risk of losing the optimised nature of that page. One final thing to note is that if you're displaying a page with a known transition path, you can pre-load the images that will appear on the next screen/page to further optimise that page's load time, using a method such as CSS Ninja's Even better image preloading with CSS2.

Utilise GZip Compression

GZip compression is another way in which you can reduce the size of the payload coming back from the server to the user accessing your site. The majority of browsers support GZip compression and hence this should not be ignored. For a better explanation of why it is beneficial and what it actually does have a read of Kalid Azad’s How To Optimize Your Site With GZIP Compression. For implementation details with IIS6 see Bill Baer’s HTTP Compression, Internet Information Services 6.0, and SharePoint Products and Technologies or Todd Sharp’s Enabling GZip Encoding On IIS7.

In Performance Optimising SharePoint Sites – Part 3 of this series I'll identify some of the SharePoint-specific techniques that can be leveraged.

Performance Optimising SharePoint Sites – Part 1

For anyone involved in delivering public facing websites, particularly for an international audience, minimising page load times would have to be something high on the agenda. While this subject is not limited to sites hosted on SharePoint, it is an essential topic for consideration for any SharePoint project. There is somewhat of a misperception that SharePoint is inherently slow and often becomes the primary target of blame when trying to work out why page load times aren’t to an acceptable standard. The simple fact is there are a number of optimisation techniques available to be leveraged to minimise page load times.

These topics are not only relevant to public facing websites – a number of them are applicable to intranets and extranets experiencing performance issues. My personal experience in this field stems from delivering a number of public facing websites to specific performance targets. My initial exposure to optimising SharePoint sites came from working with the team at Tourism WA on the famous SharePoint site westernaustralia.com, and most recently in the early stages of a site optimisation phase for the Career Centre website.

Ideally, site optimisation would be a major consideration at the beginning of any project and planned for accordingly. Realistically, due to tight deadlines and more functional concerns, it's often a task carried out in retrospect. Either way, performance optimisation for your SharePoint site should be considered a crucial task and one that is always undertaken – this article will approach the topic from a retrospective viewpoint.

Part 1 of this series will focus on some of the first steps you should undertake or consider when embarking on performance optimising your SharePoint site. Performance Optimising SharePoint Sites – Part 2 will explore some of the platform-independent techniques at your disposal, while Performance Optimising SharePoint Sites – Part 3 will identify some of the SharePoint-specific techniques that can be leveraged.

Understand the importance of Performance Optimisation

There is a strong correlation between page load times and the success of a website, and there is plenty of anecdotal and statistically-backed evidence littered throughout the net to support this. For a few examples take a look at How Loading Time Affects Your Bottom Line, how Amazon increased revenue for every 100ms of improvement and how ShopZilla increased revenue and page views by reducing load time. Page load times have also been known to influence bounce rates and improve traffic, as testified by Google's Marissa Mayer in Speed Wins. It may seem obvious, but with evidence to back it up, the likelihood of being granted the time and money to perform optimisation tasks should improve.

Establish Benchmarks

In my opinion an important step to undertake before beginning work on optimising a SharePoint website is to establish performance benchmarks for the site, and ideally to have a performance target in mind. This works on two levels. Firstly, having an understanding of the current traffic levels of the site is important. The ultimate goal of performance optimisation is to increase traffic, improve the amount of time spent on the site and, if applicable, increase conversions. Using a free tool such as Google Analytics may give you the information you need, or you can explore the raft of paid options that exist. Secondly, you need to capture the current performance of the website. There are a couple of free tools available to do this, including Web Page Test and Page Speed Online. With these pieces of information in hand you will be able to accurately determine whether the work carried out has had a meaningful influence on both the performance and effectiveness of your site, and you will increase the likelihood of being able to carry out these tasks in the future.

Know the Tools at your Disposal

There are a number of tools out there that either help identify the areas in which performance optimisation can be implemented, help perform the optimisation tasks themselves, or offer something of a shortcut to dedicating hours of time to performance optimisation at all. The performance benchmarking tools mentioned above not only rate and measure performance but also offer advice on how your pages can be improved. Tools such as YSlow and Fiddler give you a more granular view of what's going on as your page loads and, with a bit of knowledge, let you target areas for improvement. If you're hosting your site on SharePoint 2010 then you can make use of the Developer Dashboard to analyse the performance of your pages. Products such as Aptimize also exist – it was famously used on Microsoft's SharePoint.com to improve performance. Finally, Content Delivery Networks can be leveraged to greatly improve international load times for your site (in my personal experience it was Akamai's CDN that proved the silver bullet for meeting performance targets on westernaustralia.com).

In Performance Optimising SharePoint Sites – Part 2 of this series I'll explore some of the platform-independent techniques at your disposal.
