How safe is storing data in the Cloud?

by Vishan Sondhi, SharePoint Consultant @ ClearPeople

Is it safe to store data in the cloud? I want to discuss some advantages of using Office 365 but, more importantly, outline the measures Microsoft takes to ensure that ‘our’ data is safe in the cloud.

Firstly, it’s important to understand that Microsoft offers various levels of security to its users within Office 365.

No one has standing access to your data – no third party, literally no one.

Each data centre (where data is stored) has strict controls over who can actually enter and what they can do; this is governed by what Microsoft calls ‘Lockbox’. Essentially, if an engineer does have to go into the data centre, strict access controls govern what that engineer can do – at no point will the engineer have access to the data; access is granted mainly for troubleshooting tasks.

Below is a brief overview of the security offered at the different layers:

Network Layer – First, all data in Office 365 is encrypted in transit using TLS/SSL; this keeps the data confidential – if an attacker did ‘intercept’ the communication, the result would be scrambled and of no use to them.

Physical Layer – On-premises, IT staff know exactly where the disk that contains the data is, exactly which computer it sits in and exactly how to get to it; this is generally how an on-premises environment is run: someone always knows exactly where the data lives. A malicious user, once on the server, has the ability to do whatever they want, i.e. run code, delete data, copy data, remove the drive, etc.

In the cloud world, the only people who can get into the data centre are engineers, for example during maintenance, and because strict access controls are in place, visits are mainly limited to troubleshooting tasks.

Working out whose server is whose, which partition data lives on, or where it sits within the data centre is like finding a needle in a haystack. The size of the data centre and the sheer number of servers mean a malicious person would never know which disk drive belongs to a particular customer.

But in a worst-case scenario, supposing an engineer pulls a drive out, Microsoft has invested in BitLocker drive encryption, which means the data on the removed drive is encrypted and unreadable without the keys.
Furthermore, Microsoft runs a Blue team and a Red team to stay up to speed on data security – the Red team is constantly trying to ‘hack’ into Microsoft data centres while the Blue team is constantly working to detect and prevent those attacks.

Key things to note on Microsoft cloud security:

Logical Layer – No code that is not known to Microsoft is allowed to execute on any of the servers; i.e. the platform can’t pull arbitrary code from outside the environment and run it. Only known, whitelisted processes may run on the servers. This makes it virtually impossible for a malicious user to run malicious code on a server (if they ever got onto one).

User Layer – The Office 365 admin portal offers much more in terms of security; some listed below:

  • Multi-Factor Authentication – This is a two-step sign-in process that makes it harder for a malicious user to get into your account. When a user signs in to their Office 365 account with their username and password, an additional layer of security must be acknowledged via a phone call or text message before the sign-in completes. This feature is also available on most Hotmail / Outlook accounts.
  • Data Loss Prevention – DLP essentially scans emails for sensitive information, such as credit card numbers. Warnings can be shown to the sender, alerting them and giving them control over whether they still want to send the email. If the sender chooses to send it, the message can be encrypted using TLS, or rights management can be applied.
  • Rights Management – This is a list / library setting (within SharePoint) that allows site owners to protect attachments stored against list items and / or supported file types. For example, when a document is downloaded, the file is encrypted so only authorised people can view it. Furthermore, restrictions can be imposed on the file, making it impossible for users to print it, copy from it, save a local copy, etc.
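
To picture the kind of scanning DLP does, here is a tiny sketch in JavaScript. It is purely illustrative – a Luhn checksum over card-number-like digit runs, not Office 365’s actual detection rules – and the function names are my own:

```javascript
// Luhn checksum: the standard validity test for card-number-like digit strings.
function luhnValid(digits) {
  const s = digits.replace(/[\s-]/g, '');
  let sum = 0;
  for (let i = 0; i < s.length; i++) {
    // Walk from the rightmost digit, doubling every second one.
    let d = Number(s[s.length - 1 - i]);
    if (i % 2 === 1) { d *= 2; if (d > 9) d -= 9; }
    sum += d;
  }
  return s.length >= 13 && sum % 10 === 0;
}

// Flag message bodies that appear to contain a card number.
function containsCardNumber(text) {
  const candidates = text.match(/\b(?:\d[ -]?){13,19}\b/g) || [];
  return candidates.some(luhnValid);
}

console.log(containsCardNumber('Card: 4111 1111 1111 1111')); // true
console.log(containsCardNumber('Invoice #12345678'));         // false
```

A real DLP engine layers context (keywords like “card”, proximity, confidence levels) on top of checks like this, but the principle – pattern match, then validate – is the same.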

There is a fantastic white paper available to download which details the above, along with additional security measures Microsoft has taken to ensure data is safe in the cloud.



Target Audiences Column appearing as GUID

by Vishan Sondhi, SharePoint Consultant @ ClearPeople

So, I have been working on a project for a customer who wanted to audience target individual list items in SharePoint… Simple task, right?

I allowed the management of content types on my list and added the ‘Target Audience’ site column which is available out of the box. This bit all works fine.

The actual problem arises when you start to audience target the list items.

When you audience target list items, the audience appears as a GUID in the list view (as below)


But if you remove the audience from the first list item, the audience does not appear as a GUID (as below)


So basically, if the first item in the list has an audience, then ALL the items appear as GUIDs.

As soon as you remove the audience from the first list item, the audience name loads correctly i.e. doesn’t display the GUID.

I did a bit of further reading on this and a lot of people have said this behaviour is by design, but surely it can’t be? No one wants to see a GUID in a view.

I found this issue is evident on all on-premises versions of SharePoint (2007, 2010 and 2013) and on SharePoint Online.

Having contacted Microsoft about the issue, they confirmed that it is a bug in SharePoint and that it has been there since 2007.

Microsoft reasoning

“This bug will not be fixed. SharePoint uses XSLT stylesheets to render list views. A list is rendered as an HTML table and the value of a field is rendered into the appropriate cell of the table by a simple XSLT template from the fldtypes.xsl file located in %ProgramFiles%\Common Files\Microsoft Shared\web server extensions\14\TEMPLATE\LAYOUTS\XSL. And this issue relates to the .xsl file. ”

So there you have it folks: an unfixable bug in SharePoint. The suggested workaround is to remove the audience from the first item. I didn’t find this very appealing – nor much of a solution – so I had a little play around and found a temporary fix which works for me.

Temporary Fix to Audiences appearing as GUID

As the GUID only appears when the first item in the list is audience targeted, we need to find a way of hiding the first item. Filtering the view to hide the first ID doesn’t work – the problem remains, because if you hide item 1, item 2 simply becomes the first item in the view.

A solution that has worked for me is adding the snippet web part to the actual list page and then adding a bit of code to it. See below:

1. Create a bogus item as the first item in the list. Make sure that this item has an ID of 1 (so it has to be the very first item created in the list).


2. Edit the page by selecting the cog in the corner and choosing Edit Page.


3. Insert the snippet web part above the list and add the following CSS:

.ms-listviewtable > tbody > tr:first-child {
    display: none;
}

4. Save the page.

5. Voila!


The snippet simply hides the first row of the list view.

*I strongly recommend you test this thoroughly before you insert any code into a production environment.


Portal Engine or ASPX Template: Choosing the right kind of development models in Kentico

by Mercedes Amoros, Web Developer @ ClearPeople

This blog post aims to describe the advantages and disadvantages of the development models available for building your Kentico websites.

The Portal Engine Model

With the Portal model your templates are built within the web interface, and you can define your templates and document types easily.

Advantages:
  • Visual representation of your zones (placeholders) and webparts (controls)
  • You can create a placeholder easily without writing any code
  • To remove/create a webpart you only need to go to the design tab
  • You don’t need to deploy if you want to change a webpart or placeholder

Disadvantages:
  • It is not possible to manipulate the file directly, and it can be difficult to develop complex webparts; in such cases I’d advise using a combination of Portal and ASPX.

See below an example of the Portal Engine Model:

Portal Engine

ASPX Template Model

The ASPX model in Kentico allows you to implement all your templates from scratch in physical files. You define your zones (placeholders) and webparts (controls) programmatically in your ASPX file.

To use the ASPX model you need to create a new template and specify which physical file (.aspx) the template maps to, like a standard ASPX page. To do this you create the ASPX files in Visual Studio and inherit from Kentico classes such as TemplateMasterPage or TemplatePage.

Advantages:
  • For some webparts which require complex functionality it is easier for the developer to create the code on an ASPX file. Because it is file-based you can use source control.

Disadvantages:
  • You will need to write all the code to create templates (which can be done more quickly in Portal)
  • You need to deploy every time you need to change a template
  • It is not possible to change your template from the Kentico panel

See below an example of the ASPX Template model:

ASPX model
Portal Engine and ASPX Template Model

It is possible to combine Portal and ASPX models across your site. You can create some templates on Portal and some on ASPX depending on the complexity.

You can also combine the two on the same template: the template can be created using the Portal model while your webparts use the ASPX approach, via ASCX files.

To create webparts in Kentico with an ASCX file, you create a user control in Visual Studio and inherit from CMSAbstractWebPart.

Portal Engine is recommended by Kentico

Kentico recommends you use this model for development because you can define your pages easily with a powerful visual UI. It provides an intuitive way to create zones (placeholders) and add webparts (controls).

Personal opinion

Choosing your development model will depend on the complexity of your project. Personally, I think the best option is a combination of both. Portal can give you a quick way to create your templates and document types, and give you a vision of what your pages will look like. Using the ASPX model makes it possible to create more complex functionality for your site.

I find a good combination is to use the Portal Model for your templates and layouts (the visual part of your project) and the ASPX Model for some complex webparts which will give you the flexibility to create your own code.

For this combo of Portal and ASPX, you will certainly need programming knowledge to develop your Kentico site.


Better User Experience with Auto Scrolling

by Ian Jackson, UX Developer @ ClearPeople

Many websites adopt an auto scroll feature of some sort. This is generally a button or link which allows the user to jump to a different part of the document without having to manually scroll.

A ‘back to top’ feature is generally a button or link which is either placed at the bottom of a page or fixed on the page as the user scrolls beyond a certain point. It is a feature which allows the user to return to the top of the document when clicked. This is especially useful on pages containing a large amount of content.

back to top
Anchor links are clickable buttons/links which can take the user to another section of the page. This is usually seen in a large article page with a list of contents links at the top which refer to each section.

Both of these are examples of a scenario where the position of the scrollbar needs to be moved automatically.

Scrollbar fundamentals

The scroll bar has been an integral feature of not just browsers but all computer interfaces since the creation of the first graphical operating systems. Scroll bars reassure the user that there is more content below the fold. They indicate the length of the content and which section of it the user is viewing. They also assure the user that they have complete control over the scroll position and are free to move up, down, left or right.

I believe it is important to retain these core values when we tamper with the default behaviour of the scroll bar with JavaScript. This means that care should be taken when developing an automatic scroll feature.

Development Method

There are essentially two options a UX developer can take when building this control.

  1. Create a button which, when clicked, will ‘jump’ to the specified location on the document.
  2. Create a button which, when clicked, will transition smoothly to the specified location on the document.

In this article I am going to focus on the latter option.

Automatically scrolling to a position on the document can pertain to any feature of a site which causes the scroll bar to be controlled by JavaScript.


A transition effect is beneficial in this scenario as it allows the user to perceive the position of the page as it moves and gives a sense of bearing and direction. If the page were to jump without a transition, the user would be confused – have we moved up, or down? Are we viewing a different page? What just happened?

Transition Duration

This is the time it takes to scroll to the specific point on the page. It cannot be too fast, or it will disorient the user. It cannot be too slow, or it will irritate the user.

Transition Easing

The transition easing is the property which describes how the animation will proceed over time. Does it proceed from point A to point B at a fixed speed or does it slow down as it reaches point B? In my opinion the transition easing function, much like the branding, should be consistent throughout the site and should match the character of the brand, i.e. you wouldn’t really use a ‘bouncing’ or ‘elastic’ effect on a corporate legal website.

Escape Clause

The user should be able to resume control of the scrollbar if they so choose, and therefore cancel the auto-scroll if necessary.

Take a look at this example of a ‘back to top’ button. When we click the button, we briefly lose control of the scrollbar until the page is back at the top.

When we deny the user the ability to control the page freely, however briefly, we degrade the user experience and devalue the appeal of the site.
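
The three considerations – duration, easing and an escape clause – can be sketched in plain JavaScript. This is a minimal illustration only: the function names, the quadratic easing curve and the 400 ms figure are my own choices, not a standard.

```javascript
// Quadratic ease-in-out: start slow, speed up, then slow down (t runs 0 to 1).
function easeInOutQuad(t) {
  return t < 0.5 ? 2 * t * t : 1 - Math.pow(-2 * t + 2, 2) / 2;
}

// Scroll offset at `elapsed` ms into an auto-scroll from `from` to `to`
// over `duration` ms. In a browser you would call this once per
// requestAnimationFrame tick and pass the result to window.scrollTo.
function scrollPosition(from, to, elapsed, duration) {
  var progress = Math.min(elapsed / duration, 1);
  return from + (to - from) * easeInOutQuad(progress);
}

// The escape clause: cancel the animation the moment the user intervenes, e.g.
//   window.addEventListener('wheel', cancelAutoScroll);
//   window.addEventListener('touchmove', cancelAutoScroll);

console.log(scrollPosition(1200, 0, 0, 400));   // 1200 - animation start
console.log(scrollPosition(1200, 0, 200, 400)); // 600 - halfway through
console.log(scrollPosition(1200, 0, 400, 400)); // 0 - back at the top
```

A fixed-speed scroll is simply this with the easing replaced by the identity function; the easing curve is what gives the movement its character.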

To sum up

When developing an auto scroll feature, such as a ‘back to top’ button or a set of anchor links, a developer must consider using a transitional effect to allow a gradual scroll as opposed to a sudden ‘jump’. The developer must also consider the duration of the scroll transition, the easing of the transition, and must ensure the user has complete control at all times by allowing the scroll transition to be interrupted if necessary.


Removing obsolete items in Sitecore Package Designer

by Alan Yip, Senior Sitecore Consultant @ ClearPeople

Sitecore’s Package Designer is a very useful tool for packaging Sitecore items and transferring these between different environments.

There is a very useful feature where you can store the structure of these items in a definition file as XML. This can then be stored somewhere and later run against the latest content tree (or wherever you decide to generate the content from), and a new package is then created.

In one of my latest projects, I had to migrate content from one environment to another. This was not a simple, straightforward migration, but that is a story for another time.

I was in continuous communication with the client’s developer, who was responsible for creating these packages. As most projects go, there will always be obstacles in the way, and this project was no different.

The developer had generated the structure of the content into a definition file which was going to be used for the real content packaging, but the actual packaging of the content didn’t happen until a few months later – and as we all know, content comes and goes.

Yes you’ve guessed it, when I tried to generate the content, I got the nice “Package generation failed…” error!

package failed
But luckily, I found a nice little gem that was to become my lucky charm: the “Remove obsolete” button!

remove obsolete
This button removes from the definition any items that don’t exist in your current Sitecore tree, thus making the creation of your package possible!

NOTE: Pressing this button doesn’t save a copy of the updated XML definition, so it’s entirely up to you how you want to manage this.

Happy packaging!


Gleaning insight from your Intranet – A review of SharePoint Analytics tools

by Benjamin Moles, Senior .Net Developer @ ClearPeople

As SharePoint specialists, ClearPeople helps many clients implement intranets and extranets based on this platform.

With the advent of SharePoint Online, we are often asked what analytical tools can be used to improve the current out-of-the-box analytical functionality. This prompted us to conduct independent research on analytics tools available for SharePoint and Office 365.

There are two main reasons why companies are interested in analytics tools:

  1. To gather usage information that allows them to identify what elements within the intranet are the most successful
  2. To measure the overall success of intranet projects – knowing what works best within the intranet can help business leaders make better decisions

When reviewing available analytics solutions on the market we identified key criteria that would help a business select the right solution based on their specific needs:

SharePoint integration

Most analytics tools on the market, including the most popular one, Google Analytics, do not integrate well with SharePoint. The degree of integration with SharePoint is important because it determines what data can be collected and how that information is gathered for further analysis. Low levels of integration make it difficult (if not impossible) to collect data such as users’ identities, which makes a big difference when segmenting usage data by team, department and individual user. This criterion alone can be used to rule out many of the tools available on the market for intranet analytics.

Cost
There are big differences in cost from one solution to another. For a company with a limited budget this will obviously be a major factor in their choice of analytics tool.

Pricing Policy Complexity

Some pricing policies are complex and based on usage quota and features which makes it difficult to evaluate upfront the final cost to be paid for the solution. This factor is especially important if the budget is small.

Analytics expertise
Analytics is a complex subject. Analytics tools can provide useful information, but businesses need to know how to use that information to glean relevant insight from it. For companies without extensive experience in using analytics tools and with a limited budget, it may be more useful to invest in training rather than in an expensive and soon-to-be-redundant tool.

How can ClearPeople help?

With these criteria in mind, the options narrow down considerably, as there are few tools that match all of them. At ClearPeople we are happy to help our clients select the right analytics tool for their business needs.

If you are interested in migrating all or part of your intranet to Office 365 and you want to quantify and maximise your ROI, please get in touch with one of our consultants who will be happy to help.


A review of our past year: Building a team to last

by the Senior Leadership Team @ ClearPeople

As the new financial year approaches, it is time to review the changes we’ve seen over the past 12 months. It’s fair to say that ClearPeople has undergone significant change. Admittedly it has been a bit of a challenge, but the change has been necessary – companies that fail to adapt as the industry evolves fail themselves. To quote Harold Wilson: “He who rejects change is the architect of decay.”

But ClearPeople continues to go from strength to strength. We’ve almost doubled our turnover in two years, which has led to the growth of our Alicante team by 78% and our London team by 22%.

Such an achievement requires an evolution of business processes to sustain it. And we are really proud that with the appointment of a new Senior Leadership Team, this change has been managed effectively to the benefit of our business and our employees.


Through small incremental changes over the past 12 months, ClearPeople has come to be in the leading position it is today. Our client work continues to be award-winning, with a recent project named the Best IT Project at the BCS & Computing UK IT Industry Awards, and we were also recognised as the first Microsoft UK partner to be awarded Cloud Platform Gold Competency for Microsoft Azure. This demonstrates our best-in-class capabilities in delivering Cloud platform based solutions to UK Enterprises.

Of course, with any form of change comes an element of uncertainty, and we had to accept that change does not suit everybody within an organisation. We have had to adapt our processes and internal structure to meet the demands of our clients, and that has seen some of our employees leave for pastures new. We wish them well in their endeavours and thank them for all their hard work and contribution. But it has also meant that many new, talented experts have joined us, and many of our staff (some of whom have been with us for more than 10 years) have embraced the change. The Senior Leadership Team has sought to provide stability through this period, and as we approach the new financial year we are pleased to be in a position where these changes are starting to pay off.

Katya Linossi, Co-founder and Managing Director of ClearPeople says: “It’s an incredibly exciting time for ClearPeople as we continue to grow and add to the services we’re able to offer to our clients. We’re proud to have built a really strong team, rather than a company of individuals, and I am confident that our team’s expertise will help us to maintain our position as one of London’s leading digital and technology agencies.”

Our ‘consultagency’ proposition, fusing the skills of technical consulting with creative digital marketing agency services, has been well received by our customers, giving them real and clear business value. Everything ClearPeople does is focused on meeting our customers’ business objectives and providing value back to those that put their trust in us. This is evident in our 85% repeat business and referral rate.

If you have any questions, please don’t hesitate to get in touch with any of our Senior Leadership Team:

Katya Linossi, Managing Director
Gabriel Karawani, Director
Wesley Hogg, Digital Strategy Services Director
Eneko Vallecillo, Technical Consulting Services Director
Barry McKaine, Head of Project Management


Designing the perfect Document Management System

by Matthew Quenby, New Business Manager @ ClearPeople

In my opinion, there are only a few things that separate SharePoint from being either one of the world’s worst or best platforms for Document Management.

  • The first concerns the choices you make when designing the user interface and experience – taking into account (or not) the user’s day-to-day experience (over 80% of a user’s time, even today in 2015, is still spent in MS Word or Outlook).
  • The second is how you design a sensible information architecture (IA) that will withstand the pressures of today and the changes over time.
  • The third is how you manage, maintain and control the IA day to day, avoiding it getting out of hand, while not constraining the business from using the portal for what it was intended (collaborative and flexible document management).

Generally speaking, our team spends a lot of time talking about and advising on how to avoid the most common mistakes and how to take best advantage of SharePoint for what it is. I won’t cover these more general points here, but you can read more about them in the Guidelines for SharePoint 2013.

From a user’s perspective, one of the greatest shortcomings is often the first point mentioned earlier: the interaction between the Microsoft Office applications (Outlook, Word, etc.) and SharePoint itself.

In this series of blog posts my colleagues and I will cover this point in detail but we will also discuss how SharePoint can deliver high intensity/volume document management to a level that will not only match but be more versatile and applicable than traditional DM systems.

As a taster of what is possible from a user’s perspective, the screen shots below show a SharePoint document set folder being surfaced in Outlook, including a site/library/folder structure in the right-hand panel. This example is delivered by combining a SharePoint DM portal with a third-party product, DMF, to provide a great user experience. More about this in future posts.


And – of course – I will also cover how social and communication features (mainly Yammer and Lync) layered into the Microsoft product suite are reducing the need for email communications and content sharing, but I want to balance this with facts. Because the fact is that the majority of information workers still rely heavily on email and indeed spend most of their working day using Outlook. Irrespective of that, social features do have an impact on all three key points above, so they cannot be ignored by anyone designing DM systems today.

For those of you that deal with information management, the key challenge is this: how do we maintain, manage and secure all the information sent and received through different channels when our users are given so much freedom and power with the technology tools we hand out? A tough question – one which I look forward to discussing further soon.


Enjoying the personal touch

by Kellie White, Head of UX @ ClearPeople


Name: Kellie White
Gender: Female
Age: 35 – 45
Marital status: Single
Geographic location: London Borough of Barnet
Taste: Interesting, on-trend, glamorous
Needs: Inspiration, validation, admiration
Willing to pay: £100 – £150

The Story

Today is the day! I have saved for this day, fantasised about this day and looked forward to it for months. Oh yes, today is the day that I find those drop dead January bargains. The kind that leave me reeling with delight, wondering why on earth I ever buy anything full-price? Ladies and gentlemen, today I am hitting the high street in a big way, actually more like the Westfield… I have an A-lister event coming up and I need to look the part.


Firstly, I need a coffee. I head over to my local caffeine pusher. They know me by my painful, unmoving regularity of choice. Coffee is one of those things that people like a certain way and no other, isn’t it? Personally I like it flat white and skinny, not too big, not too small – just right. I don’t want a caramel-vanilla shot, do I? Vile and sickly-sweet. Neither do I want to be offered a cup of tea; I’d recoil in horror, never to return. All I am here for is a few pleasantries, to get my hit, pay and leave, with all my expectations met in that first sip. It’s a relationship that’s established, trusted and loyal, and I love it!

I’m discovering though, as I heave through the writhing mass of Saturday sale shoppers that clothes shopping is not quite the same! No one knows me well enough to fulfil my desires as soon as I walk in the door. In fact as I contemplate, no one offers me that kind of service apart from my friend Amanda who has introduced me to many of my best buys. “Jacket meet Kellie, Kellie meet jacket” she sings, as she holds it up to me with her knowing smile. I gaze at the creation, eyes glittering as much as the lapel. She’s so unbelievably good at finding those hidden gems, the things that I’d walk straight past.

Amanda is a fashion buyer, she understands it and knows it inside out. All the body shapes, the lumps and bumps and the lines and cuts to make them work. Most importantly she knows me, my personality and my style. She knows I love a jumpsuit, a bit of glam-rock and that I fantasise about wearing the softest pimped-up leopard print coat. Where is she on a day like today though, when I need her solid, good judgement as I stumble clumsily through piles of clothes and multiple fluorescent changing rooms, swimming in angora, chintz and faux-fur?


And then, just as I am about to lose my cool and post a number of selfies on Facebook for advice, he appears at my side, cool, elegant and in complete control. I hold fire on the posts, as he lifts up a panné velvet number in black with a slight shimmer of snakeskin and I instantly fall… Not for him, no way! A man with that much inside knowledge is not to be trusted, but I definitely have my dress and as you’ve guessed, it fits like a dream at half the price, yes please!

Next he’s gliding me along effortlessly matching earrings, heels and a clutch. This guy’s unbelievable, a perfect match for Amanda and as I am pondering ways of chance encounters, he leads me swiftly to the till, flashes me a devilish smile, slips me a card and exits. All the card says is ‘Thanks for shopping with us today. Why not try our online stylist next time? We’re sure you’ll be just as pleased.’

What, seriously, will it ever be that good and easy? I’m not sure but if they hired him they must know what they’re doing right? And I can’t deal a minute longer with these crowds. What a breeze, I walk out the door a happy lady and have all I need to make that impression!

The facts

Getting the things we want and need easily is great, isn’t it? Just like my regular morning coffee and finding that fabulous (virtually free) frock. I am super pleased when I’m presented with the things that I want without much effort, because there is so much these days – an overabundance of choice, far too much information – and I’m very thin on time to digest it all.

That’s where personalisation can help and has been helping for quite some time in our online experiences. It’s all about getting the things that we need without looking too hard, with the aim of saving time and effort – it’s that simple.

Now imagine an online experience that you’re creating in which you’d like to apply the same principles. How would you know what the different types of people are who visit your site? My bet is you’d start by meeting and talking to them and asking lots of questions. In this way you’d get a good understanding of their likes and dislikes, how they behave and interact, and what they’re trying to achieve. You’d start to see similarities and differences, and you’d start to organise them into groups based on those similarities. Each one of these groups becomes what is called a persona – an archetypal user of the site. There are usually around 5 to 6 personas per site, and they represent real users because you’ve done your research.

After this groundwork is done you’ll be in a much stronger position to work with your users in more realistic goal-focused ways, to create optimum user journeys based on their needs. With these ingredients and the right people to bring their stories and your story to life, you’re in exactly the right place to increase audience engagement by grabbing their attention with focused messages that are quick to inform, and relevant.

Working with a provider who understands segmentation empowers marketing teams to analyse and respond to their users’ demands in new and innovative ways. The analysis doesn’t stop there though, it’s important to continuously test new content and your personas so that engagement strategies can be fine-tuned, to give your users a better experience every time they come to your site.


Getting to grips with a Sitecore 7.2 jQuery Modal Dialog Box Issue

by Alan Yip, Senior Sitecore Consultant @ ClearPeople

So you’ve done a vanilla install of Sitecore 7.2 and you are about to give it a test run to make sure that basic features like content creation and publishing work. But you stumble across a weird issue when you try to publish an item: no popup is displayed! In fact, if you try any function where the new jQuery modal dialog popup is used, you will notice that none of them work…

The Issue

In Sitecore 7.2, XAML is still widely used throughout the admin pages, and in some cases the new jQuery modal dialog is used instead, for instance for the publish item popup. An example of this popup is shown below:

sitecore popup

You will notice that the XAML windows are not affected by this issue; only the new jQuery ones are. This is not easily identifiable at first, but I will explain how to find out what causes it.

The Cause

If you are in Chrome (and no, this is not related to the Chrome modal dialog issue), press F12 to bring up the developer tools. You will notice the following error messages:

error message
You will notice some JavaScript errors being fired when you refresh the admin page.

Highlighted is the term “rejected-by-urlscan”. This is the cause, and if you are getting this error then you are in luck, because there is a solution.

The solution

In IIS, under ISAPI Filters for your website, you will probably have URLScan 3.1 installed on your server as part of penetration test fixes. This is a module you install in IIS to lock down certain HTTP requests.

You can remove it from your list of ISAPI Filters and your popups will function again with no problem – this is the solution you’ll find from some Googling.

But you may not be in the position to do this because it may fail your penetration tests. So what can we do instead?

The problem is that the default installation has the following configuration set to false:


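For reference, the setting in question lives in UrlScan.ini. This is a sketch assuming a default URLScan 3.1 install path – check where the file sits on your own server:

```ini
; %windir%\System32\inetsrv\urlscan\UrlScan.ini
[Options]
; 0 = reject any request whose URL contains more than one dot,
;     which 404s files such as jquery-1.8.3.min.js (illustrative filename).
; 1 = allow dots in the path.
AllowDotInPath=0
```
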
This basically says that URL paths cannot contain more than one dot, and if they do, the server returns a 404 for that file. The problem with websites nowadays (including Sitecore) is that jQuery files, for instance, have dots everywhere (well, not quite everywhere), so those requests always return a 404, stopping most of your jQuery features from functioning.

So, to get around having to remove URLScan from your website, you can set this property to 1, reset IIS, and you’re good to go!

Please note that you will need to work with your penetration testing company to make sure that by changing this setting you mitigate any other areas of risk.
