SharePoint optimized – part 1, CSOM calls

An intranet home page should contain all the information people need on a daily basis. In fact, many companies use the home page as a traffic node where everybody comes just to find a navigation link pointing to another part of the intranet. In my current company, Findwise, we do that too. However, one of our components, which lets us quickly navigate through intranet sites, has been getting slower year by year. Currently its loading time is almost 10 seconds! I decided to fix it, or even rebuild it if needed. Especially since a few weeks ago, at the ShareCon 365 conference, I talked about SharePoint Framework in Search Driven Architecture and described a customer case, PGNIG Termika, who saved about 600k PLN (~$165,000) per year thanks to their information accessibility improvements (information access time dropped from 5–10 minutes to 1–2 seconds).

In this post I want to show you what the problem was, how I fixed it, and how the fix cut the component's loading time six-fold!

Navigate through customers and projects

In Findwise we use SharePoint in a common pattern for keeping customer/project-related data in one well-organized place: for every customer we create a subsite, and for every project of that customer we create a subsite under the customer's site. To be able to quickly navigate through customers and their projects, a few years ago we built a component and put it on our home page. It looks like this:

It’s a tree view with a list of customers. When a user clicks a customer name, they are redirected to that customer’s site. When a user clicks the plus button, the customer’s project sites are shown, and by clicking a project name the user can navigate to its site. Like below:

Moreover, it has a simple search box that allows searching through customers (but not projects; I’ll explain that later). We didn’t want to spend too many hours on a simple tool, so we basically just used plain HTML, JS and CSS loaded by a Script Editor Web Part. For fetching site information (name, URL) we used JSOM (CSOM in JS).

Wrong usage of CSOM for aggregation may cause performance issues

At the time the component was developed, it was good enough. However, as customers and projects grew in number, it started to slow down. In order to fix it I needed to know what the problem was. I opened the browser dev tools > Network tab and saw this:

The red line on the timeline indicates when the JSOM query begins. OK, I somehow expected it would be related to the query, but I needed to be 100% sure before digging into the code. It also explains why the component does not preload projects for customers: if loading around 500 customers took ~10 seconds (averaged over 10 tests), then making another JSOM call per customer for its projects (an extra ~500 requests) would take ages.

I checked the original code and couldn’t find anything that caught my attention:

function sharePointReady() {
    clientContext = SP.ClientContext.get_current();
    website = clientContext.get_web();
    subsites = website.getSubwebsForCurrentUser(null);
    clientContext.load(subsites); // loads every default property of every subsite
    clientContext.executeQueryAsync(onRequestSucceeded, onRequestFailed);
}

function onRequestSucceeded() {
    var listEnumerator = subsites.getEnumerator();
    var subsitesArray = [];
    while (listEnumerator.moveNext()) {
        var subsite = listEnumerator.get_current();
        subsitesArray.push(subsite);
    }
    GenerateList('Customers - Projects', subsitesArray);
}

function onRequestFailed(sender, args) {
    console.error('Request failed: ' + args.get_message());
}
If you thought “Hey, wait a minute, that clientContext.load(subsites) call pulls down a bunch of unused properties. You should restrict what gets loaded” – you’re right!

Let’s fix it quickly by swapping that load call with

clientContext.load(subsites, "Include(Title,Url)");


Nice! Case closed.

Oh come on! We can’t let the problem go that easily! Let’s add something more, i.e. the option to search through project subsites too (which was originally skipped to cut loading time).

Use CSOM to get all subsites with properties to retrieve

Ok, this is a bit tricky. Why? Because:

  1. There is no synchronous executeQuery function in JSOM
  2. We cannot simply add another block like the one below inside the while loop in onRequestSucceeded
projectSubsites = subsite.get_webs();
clientContext.load(projectSubsites, 'Include(Title, Url)');

The reason is the unpredictable order in which async calls complete. In the code above we would load different subsites (e.g. for the CustomerA site and the CustomerB site) into the same variable (projectSubsites) and then read it in the success callback – but since we cannot predict when a particular callback will return, there may be (and will be) a situation where:

  1. the code initializes loading of project sites for CustomerA into the projectSubsites variable
  2. immediately after (that’s how async works, right?) loading of project sites is initialized for CustomerB into the same variable
  3. at some point a similar onRequestSucceeded fires for CustomerA and tries to read projectSubsites… which has been overwritten in step 2 and is not yet initialized -> “The collection has not been initialized” error.
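The race can be reproduced outside SharePoint. Below is a minimal plain-JS sketch – the pendingCallbacks queue and the loadProjectsFor helper are hypothetical stand-ins for executeQueryAsync and the JSOM load, not real API – showing how sharing one projectSubsites variable makes CustomerA’s callback read CustomerB’s data:

```javascript
// Plain-JS sketch: a fake callback queue stands in for executeQueryAsync,
// and both "queries" share a single projectSubsites variable -- the bug.
const pendingCallbacks = [];
let projectSubsites; // shared across calls

function loadProjectsFor(customer, onSucceeded) {
  projectSubsites = { owner: customer }; // steps 1-2: each call overwrites it
  pendingCallbacks.push(function () { onSucceeded(customer); });
}

const seen = [];
loadProjectsFor('CustomerA', function (customer) {
  seen.push(customer + ' callback reads data for ' + projectSubsites.owner);
});
loadProjectsFor('CustomerB', function (customer) {
  seen.push(customer + ' callback reads data for ' + projectSubsites.owner);
});

// Step 3: by the time any callback runs, both read the *last* value written
pendingCallbacks.forEach(function (cb) { cb(); });
// seen[0] -> 'CustomerA callback reads data for CustomerB' (wrong data!)
```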

So we have 2 options:

  1. Create a new variable for every subsite crawl
  2. Create anonymous success function dynamically
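Option 2 works because a dynamically created anonymous function closes over a fresh variable on every call. Continuing the plain-JS sketch above (again, loadProjectsFor and the callback queue are hypothetical stand-ins, not JSOM API), each callback now reads its own data:

```javascript
// Plain-JS sketch of option 2: each call creates its own local result object,
// and a dynamically created anonymous success function closes over it.
const pendingCallbacks = [];

function loadProjectsFor(customer, onSucceeded) {
  const projectSubsites = { owner: customer }; // fresh variable per call
  pendingCallbacks.push(function () { onSucceeded(projectSubsites); });
}

const seen = [];
loadProjectsFor('CustomerA', function (data) {
  seen.push('CustomerA callback reads data for ' + data.owner);
});
loadProjectsFor('CustomerB', function (data) {
  seen.push('CustomerB callback reads data for ' + data.owner);
});

pendingCallbacks.forEach(function (cb) { cb(); });
// seen[0] -> 'CustomerA callback reads data for CustomerA' (correct data)
```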

Since I don’t like reinventing the wheel, I went for option 2 – there is a cool gist, enumWebs.js, that does exactly what we need: it recursively crawls all subsites of a site. I modified it a little in order to accept any site as the root, and it looks like below:

function enumWebs(ctx, root, propertiesToRetrieve, success, error) {
    var rootWeb = root;
    var result = [];
    var level = 0; // counts queries still in flight
    var colPropertiesToRetrieve = String.format('Include({0})', propertiesToRetrieve.join(','));
    var enumWebsInner = function (web, result, success, error) {
        level++;
        var ctx = web.get_context();
        var webs = web.get_webs();
        ctx.load(webs, colPropertiesToRetrieve);
        ctx.executeQueryAsync(function () {
            for (var i = 0; i < webs.get_count(); i++) {
                var subWeb = webs.getItemAtIndex(i);
                result.push(subWeb);
                enumWebsInner(subWeb, result, success, error); // recurse into each subsite
            }
            level--;
            if (level == 0 && success) success(result); // all branches finished
        }, error);
    };
    enumWebsInner(rootWeb, result, success, error);
}

I used it in the following way:

console.log(performance.now() + "ms"); // performance check right before the crawl
var counter = 0;
enumWebs(clientContext, customersSub, ['Title', 'Url'],
    function (webs) {
        for (var i = 0; i < webs.length; i++) {
            counter++;
            console.log(counter + " || " + webs[i].get_title() + " || " + performance.now() + "ms"); // performance check
        }
    },
    function (sender, args) { console.error(args.get_message()); });

As you can see, I log a performance check next to every returned subsite. I put a similar check right before calling enumWebs in order to see how long crawling all the subsites takes.

973 query calls fired almost at the same time, ~1.17s each – praise the async 🙂

Example results:

  • First performance check: 4390.175ms
  • Last (#973) performance check: 10931.175ms

Results (averaged): 973 subsites returned in 6492.59ms

It’s still better than at the beginning, when loading around 500 customers alone took ~10 seconds. But it’s way longer than fetching just the customer sites (500 items) in one query call, which took 0.8s.

Hm… which to pick? Performance vs usability? Fast loading vs the ability to search through customers AND projects?

I’ll show that to you in my next post!
