How I passed 70-668

This post sees the end of my series on working through the SharePoint 2010 certification exams. It may be a little delayed, but it hasn’t been too long since I gained the MCITP: SharePoint Administrator 2010 certification by passing 70-668. It’s a great feeling to have all four certifications under my belt, and it opens the door to a range of new challenges to work towards – none more so than exploring everything SharePoint 2013 has to offer (including its certification paths in the future, I’m sure!).

But first things first – how I passed 70-668. The title of this post may even be a little misleading, because I’m going to discuss a number of avenues I could have taken to pass the exam. In reality I felt I had a pretty good understanding of the theory behind a lot of the skills measured, and ended up relying on the reading I had already done for 70-667, along with a few study videos, to get through. It was interesting to compare the different paths I took for each exam, both in terms of the knowledge gained and the exam results which followed.

So firstly the path I did take – there are a few study videos available to learn SharePoint 2010 administration and I was keen to try them out, not having done so in the past (not counting the Ignite series). A few examples include LearnSmart’s SharePoint 2010 Administration Video Training, CBT Nuggets’ Microsoft SharePoint 2010 Admin 70-668 and PluralSight’s SharePoint 2010 Administrator Ramp-Up.

Overall I’m undecided on the video route. In terms of targeted knowledge, I believe it is the best way to go, and in terms of having the information sink in and motivating myself to study, I’d rate the videos fairly highly. In terms of the breadth of coverage you get from trawling through copious amounts of TechNet and MSDN study material, though, they aren’t quite up there (not to mention that sometimes the answers to exam questions come directly from that content!). The time you dedicate also depends on how fast you can read – the videos take a set amount of time, whereas reading articles or a book allows you to skim through sections without fear of missing something important. I’d be prepared to go down this route again, both to specifically target an exam and to get a decent amount of knowledge up front, however I’d prefer not to do it at the expense of reading a book or Microsoft’s documentation in the long run.

So what other options were available? Surprisingly, a great deal. In fact I seemed to find more quality resources for 70-668 than I did for 70-667. First up, Microsoft can always be relied on to provide some decent ‘amalgamation’ pages, especially for the newer versions of the platform. Their SharePoint Server 2010 for IT pros page is a good place to start your deep dive into the content. There are also the videos and virtual environments they provide, which I previously listed in How I passed 70-667. This was the first time, however, that I’d stumbled across genuinely usable learning plans – my memory of these was always a link to a course or paid online training, but it seems they’ve got some decent content attached these days – something I definitely want to check out in the future.

On top of what Microsoft provides, there were also a number of quality blog posts, all in different styles, which would certainly help in preparing for the exam. There is Joel Jeffery’s generic SharePoint Exam Tips and Barry’s SharePoint 2010 70-668 Study Guide links. Scott Jamison wrote a great piece on Preparing for 70-668: PRO: Microsoft SharePoint 2010, Administrator and Alex Dean wrote some handy tips on How to Pass 70-668.

Finally, much like for 70-667, Accelerated Ideas has provided some practice exam questions which will help you get a feel for the type of information you’ll need to know to pass the exam.

Overall my 70-668 experience was very good – somewhat surprisingly, I actually scored my highest mark of the lot in this one. It may have had something to do with the fact that it was the last one I took, and that a lot of the information you study is transferable between exams. The one thing to look out for with this exam is the different style of questions you’ll be faced with – it’s no longer simply a matter of taking a 25% stab in the dark on a lot of questions. There are case studies, ‘mega’ multi-choice questions with 15+ options and, perhaps the most difficult style, ordering various steps to achieve a particular task (while ignoring the steps that shouldn’t be there!). If you’ve read up on the links presented on this site, however, or perhaps watched some of the videos available, you should be ready for success. Good luck!

Integrating Facebook into SharePoint

As mentioned in my post Integrating Twitter into SharePoint, I was recently faced with creating a stream from both Twitter and Facebook to display on a public-facing SharePoint website. While the Twitter experience was quite simple and enjoyable, the Facebook side of the fence proved far less so. In the end it actually wasn’t too difficult to implement, however the road there was full of potholes and speed humps.

My first stop was attempting to find an existing implementation, and the options were surprisingly limited. The closest I came was 3PillarLabs’ Facebook Webpart for SharePoint – Part 1, which looked like it would at least have given me the code samples I needed to get my own implementation working – had I been approaching it around the same time frame. The problem was that I continually (usually on the 2nd load) received an error: The remote server returned an error: (400) Bad Request. A number of recent comments on their code page indicated this wasn’t isolated to me.

The main issue was that in late December Facebook changed how their authentication tokens worked. You can read more about this both on Facebook’s own Removal of offline_access permission page and Randy Hoyt’s Changes to Facebook API: Access Tokens and offline_access. Long story short, two things now seemed to be the case – you couldn’t simply store a Facebook application code as per the example highlighted above, because it seemed to last all of two minutes, and you could no longer get a user token with offline access that would never expire.

I was left with two options – I either needed to work with a page access token instead, or refresh the user access token (which, once extended, would last up to 60 days). My original intent was to pull back Facebook statuses, and unfortunately the page access token simply wouldn’t work for this purpose – I continually received the error (OAuthException – #102) A user access token is required to request this resource. This threw me off track a fair bit – as it turns out, you can actually use the page access token to retrieve a number of other things that suit a similar purpose, namely feeds and posts. If you are happy with this approach you can skip the next paragraph and flick ahead, as this was how I finally implemented the solution.

It’s possible however that you may be faced with a situation where you need access to something that requires a user access token and therefore the brainstorming I did to solve this issue may still be of use. Essentially what I planned on doing was having a CustomAction in the SiteActions group which would make 3 calls – the first to access the application code via the URL

https://graph.facebook.com/oauth/authorize?client_id=CLIENTID&redirect_uri=CustomActionURL&scope=read_stream

the second to access a short-term user access token via the URL

https://graph.facebook.com/oauth/access_token?client_id=CLIENTID&redirect_uri=CustomActionURL&client_secret=CLIENTSECRET&code=CODE&scope=read_stream

and the third to extend the short-term user access token into a long (60-day) one via

https://graph.facebook.com/oauth/access_token?client_id=CLIENTID&client_secret=CLIENTSECRET&grant_type=fb_exchange_token&fb_exchange_token=USERTOKEN

This long-lived token would then be stored in a property bag and accessed by the web part. The process would, however, need to be re-run at least once every 60 days – it’s possible you could instead use a timer job to automate it.
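
As a rough sketch of what that third call and the property bag storage might look like (assuming the System.Net, System.IO, System.Web and Microsoft.SharePoint namespaces – the helper name and property bag key are examples of mine, not code from the final solution):

// Exchanges a short-lived user access token for a long-lived (~60 day) one and
// stores it in the web's property bag for the web part to read later.
private void StoreLongLivedToken(SPWeb web, string clientId, string clientSecret, string shortLivedToken)
{
    string url = string.Format(
        "https://graph.facebook.com/oauth/access_token?client_id={0}&client_secret={1}&grant_type=fb_exchange_token&fb_exchange_token={2}",
        clientId, clientSecret, shortLivedToken);

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        // The response comes back in query string form: access_token=...&expires=...
        string body = reader.ReadToEnd();
        string longLivedToken = HttpUtility.ParseQueryString(body)["access_token"];

        web.Properties["FacebookUserAccessToken"] = longLivedToken;   // example key name
        web.Properties.Update();
    }
}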

Before continuing on, it’s important to point out that the CLIENTID and CLIENTSECRET values are easily retrieved, similarly to how Twitter’s were – all you need to do is sign in to the Facebook app development page and create a new app.

The final piece of the puzzle was to find an SDK to make life a little easier when programming against the Facebook API. Unfortunately, while SpringSource have a Facebook SDK for their Java framework, the Spring.NET port (which I used for Twitter) did not have an equivalent. I ended up using the Facebook C# SDK for ASP.NET. However, when using their code samples I ended up with the error shown below: Dynamic operations can only be performed in homogenous AppDomain.

[Screenshot: dynamic-operations error]

A little searching turned up that this occurs when trying to use dynamic properties with a web.config entry of <trust level="Full" legacyCasModel="true" /> – interestingly enough, something Corey Roth pointed out was changed in SharePoint 2013 in his article New level of trust in SharePoint 2013 Preview. Rather than adjust that value, which seemed fraught with danger (and was recommended against by one Microsoft employee, Humberto Lezama, in an MSDN forum post Getting SharePoint 2013 and MVC 4 to co-exist), I simply adjusted the code to avoid dynamic properties, using explicit IDictionary<string, object> casts as shown further below.
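
For context, the SDK’s samples lean on dynamic along the lines of the fragment below (illustrative only, not my final code) – and it’s that use of dynamic which triggers the error under legacyCasModel:

// Illustrative only: the dynamic-based pattern from the SDK samples, which fails with
// "Dynamic operations can only be performed in homogenous AppDomain" when
// legacyCasModel="true" is present in web.config.
var client = new FacebookClient(oAuthAppToken);
dynamic result = client.Get(FacebookId, new { fields = "posts.limit(10).fields(message,updated_time)" });
foreach (dynamic post in result.posts.data)
{
    string message = post.message;
}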

The end result – finally – was a success.

[Screenshot: facebook-feed web part output]

// Retrieve an access token, then ask the Graph API for the page's 10 most recent posts
var oAuthAppToken = GetOAuthToken();
var client = new FacebookClient(oAuthAppToken);
var me = client.Get(FacebookId, new { fields = "posts.limit(10).fields(message,updated_time)" }) as IDictionary<string, object>;
var posts = (IDictionary<string, object>)me["posts"];
var data = (JsonArray)posts["data"];

List<FacebookStatusDisplay> statusDisplays = new List<FacebookStatusDisplay>();
foreach (var theStatus in data)
{
    var status = (IDictionary<string, object>)theStatus;

    if (status.ContainsKey("message"))
    {
        string message = (string)status["message"];
        string updatedTime = (string)status["updated_time"];
        statusDisplays.Add(new FacebookStatusDisplay(message, updatedTime));
    }

    if (statusDisplays.Count == 2) break; // only the two most recent posts with a message are displayed
}

if (statusDisplays.Count > 0)
{
    StatusRepeater.DataSource = statusDisplays;
    StatusRepeater.DataBind();
}
else
{
    FacebookFeed.Visible = false;
}

Note above that it is important to check for the message before adding the post – because we’re not dealing with statuses directly, it’s possible that a message won’t exist within the post.
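
For reference, the GetOAuthToken call used above boils down to requesting an access token from the Graph API via the client_credentials grant: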

// Request an access token from the Graph API using the client_credentials grant
string url = string.Format("https://graph.facebook.com/oauth/access_token?client_id={0}&client_secret={1}&grant_type=client_credentials", OAuthClientID, OAuthClientSecret);
HttpWebRequest request = WebRequest.Create(url) as HttpWebRequest;

using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    // The response is of the form access_token=TOKEN, so take everything after the '='
    string retVal = reader.ReadToEnd();
    oAuthToken = retVal.Substring(retVal.IndexOf("=") + 1);
}

It’s also worth noting that, as with the Twitter integration, I used a SharePoint list for configuration, had the necessary error handling in place and used the PrettyDate function (covered in the Twitter post) to display the correct time-lapsed date.

So overall, with this knowledge at hand, it would be relatively simple to knock out another web part which reads posts from – importantly – a publicly accessible Facebook page. There were just too many little issues to overcome for it to be an enjoyable experience though, chief among them the changes Facebook have made to their authentication. I feel for Facebook application developers, and am glad this should be the extent to which I need to interact with Facebook within SharePoint in the near future!

Integrating Twitter into SharePoint

It’s been over a year since I wrote about Integrating WordPress into SharePoint in one of my first (and most visited) posts on this blog, so it seemed only fitting that early in the new year I found myself faced with the task of integrating both Twitter and Facebook into SharePoint 2013. As it was with WordPress, the integration to be discussed here is light – simply displaying a feed of the latest tweets on the site – but the hurdles encountered (and their solutions) are likely to be relevant to whatever level of integration you’d want to achieve with a bit of study of the Twitter API and third-party SDKs.

Thankfully there was a lot of information available on the net regarding programming against Twitter, the first source being the Twitter API itself. It wasn’t long before I learnt that there were mature and decent SDKs available to make programming against Twitter even easier – one of the .NET ones recommended by Twitter themselves was Spring.NET Social for Twitter, which proved easy to learn, integrate into the solution and use.

The first thing to know about creating an application for Twitter is that you must register it. Creating a new application will give you the Consumer key and Consumer secret you’ll need to identify your app, and creating an access token will give you the Access token and Access token secret to authorise your application to interact with your feed. I chose to store these values as configuration items using the list approach identified in my post Application Settings in SharePoint.
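
For reference, reading those values back out of a key/value settings list boils down to something like the sketch below – the list and column names here are examples rather than the exact ones from that post:

// Reads a named setting from a key/value configuration list on the current web.
// The list name ("Application Settings") and the "Title"/"Value" columns are example names.
private string GetSetting(string key)
{
    SPList settingsList = SPContext.Current.Web.Lists.TryGetList("Application Settings");
    if (settingsList == null) return null;

    SPQuery query = new SPQuery
    {
        Query = string.Format("<Where><Eq><FieldRef Name='Title'/><Value Type='Text'>{0}</Value></Eq></Where>", key),
        RowLimit = 1
    };

    SPListItemCollection items = settingsList.GetItems(query);
    return items.Count > 0 ? Convert.ToString(items[0]["Value"]) : null;
}

The web part can then populate ConsumerKey, ConsumerSecret, AccessToken and AccessTokenSecret from that helper when it loads.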

The second thing to know is that the Twitter API is rate limited (180 requests per 15 minutes for user timelines), and this needs to be taken into account. It is easily countered by caching the results which are returned – obviously not ideal, given the results won’t be up-to-the-minute, but necessary if you believe the site will be accessed that many times within a 15-minute period.

So the next step was to code up a proof of concept, but as we all know, approaching any task within the confines of SharePoint tends to lead to unexpected hurdles that need to be overcome. The first issue I encountered was ‘No connection could be made because the target machine actively refused it’. This turned out to be a bit of a red herring – the target machine had nothing to do with the problem; it was in fact a replica of the proxy issue I came across when integrating WordPress, and was resolved in the same way. The error encountered and the web.config entry to resolve it can be seen below.

[Screenshot: proxy connection error]

<system.net>
  <defaultProxy>
    <proxy proxyaddress="http://your-proxy-address:port" bypassonlocal="true" />
  </defaultProxy>
</system.net>

Thinking that was all a bit too easy, and looking forward to seeing the end result, I was immediately confronted with another issue: ‘The remote certificate is invalid according to the validation procedure’, which could be further narrowed down to ‘The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel’. This one took a little longer to get over, as there were a lot of posts on the net proposing ineffective or unattractive resolutions to the problem. In the end, however, there were a few ways this one could be resolved.

[Screenshot: certificate validation error]

Firstly, it was possible to set a web.config entry to get SharePoint to ignore the problem altogether – however this did not seem like the most security conscious way to approach it.

<system.net>
  <settings>
    <servicePointManager
      checkCertificateName="false"
      checkCertificateRevocationList="false" />
  </settings>
</system.net>

Secondly, it was possible to get SharePoint to ignore the problem via code – basically the same approach as above and similarly unattractive.

ServicePointManager.ServerCertificateValidationCallback = new System.Net.Security.RemoteCertificateValidationCallback(delegate { return true; });

Finally, and probably the most appropriate and security-conscious approach to take, was to register the SSL certificate as a trusted certificate in SharePoint. This is where it took a bit of trial and error to find the right one – most guides suggested that trusting the right root certificate from VeriSign would do the trick, however it wasn’t until I exported the appropriate certificate from https://api.twitter.com that I got it working. It’s important to note I needed to export the VeriSign Class 3 Secure Server CA – G2 certificate, not the root. You can read up on how to export the certificate in Sean Wallbridge’s post SharePoint 2010 and Cert Trust – Could not establish trust relationship for the SSL/TLS secure channel and either use the Central Administration instructions he lists or the following PowerShell commands:

$root = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("(location)\VeriSign Class 3 Secure Server CA - G2.cer")
New-SPTrustedRootAuthority -Name "VeriSign Class 3 Secure Server CA - G2.cer" -Certificate $root

So after all that I had a working Twitter integration – however the feed only returned the CreatedAt date, not the relative time elapsed since the tweet was made. I used a couple of posts to implement that feature, namely Sam Allen’s C# Pretty Date and a Stack Overflow thread, Calculating relative time.
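
The gist of that helper is straightforward – a rough sketch (the thresholds and wording below are illustrative, not Sam Allen’s exact code):

// Converts an absolute timestamp into a relative "time lapsed" string for display.
private static string PrettyDate(DateTime createdAt)
{
    TimeSpan elapsed = DateTime.Now - createdAt;

    if (elapsed.TotalMinutes < 1) return "just now";
    if (elapsed.TotalHours < 1) return string.Format("{0} minutes ago", (int)elapsed.TotalMinutes);
    if (elapsed.TotalDays < 1) return string.Format("{0} hours ago", (int)elapsed.TotalHours);
    if (elapsed.TotalDays < 7) return string.Format("{0} days ago", (int)elapsed.TotalDays);

    return createdAt.ToString("d MMM yyyy");
}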

The end result and some code snippets to achieve it can be seen below – overall, the experience of creating an integration application with Twitter was relatively pain-free and easy to accomplish, which is more than I can say for my experience with integrating Facebook!

[Screenshot: Twitter feed web part output]

// Connect to Twitter using the keys and tokens stored in the configuration list
ITwitter twitter = new TwitterTemplate(ConsumerKey, ConsumerSecret, AccessToken, AccessTokenSecret);

// Retrieve the two most recent tweets from the user's timeline
List<TweetDisplay> tweetDisplays = new List<TweetDisplay>();
IList<Tweet> tweets = twitter.TimelineOperations.GetUserTimeline(2);
foreach (Tweet tweet in tweets) tweetDisplays.Add(new TweetDisplay(tweet));

// Cache the result for 20 minutes (Twitter limits access to a certain number of calls per 15 minutes)
CachingHelper.AddToCache(cacheKey, tweetDisplays, new TimeSpan(0, 20, 0));

TweetsRepeater.DataSource = tweetDisplays;
TweetsRepeater.DataBind();
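
The CachingHelper referenced above is just a thin wrapper over ASP.NET’s cache – something along these lines (the class and method names are my own, not from a library):

// Minimal cache wrapper around HttpRuntime.Cache, used to stay under Twitter's rate limits.
public static class CachingHelper
{
    public static void AddToCache(string key, object value, TimeSpan duration)
    {
        HttpRuntime.Cache.Insert(key, value, null,
            DateTime.UtcNow.Add(duration), System.Web.Caching.Cache.NoSlidingExpiration);
    }

    public static T GetFromCache<T>(string key) where T : class
    {
        return HttpRuntime.Cache[key] as T;
    }
}

The web part checks GetFromCache<List<TweetDisplay>>(cacheKey) first and only calls the Twitter API on a cache miss.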