
Migrating SharePoint to Azure

Overcoming the challenges of hosting SharePoint 2007 and 2013 within the same infrastructure

Andrew Walman
Brian Jones


Fuse has recently completed the migration of a customer's 2007 SharePoint farm from a "traditional" hosting environment to the Azure IaaS cloud. The farm runs four major websites, including the main website of our local county council (Northamptonshire), so it was critical that downtime was kept to a minimum. At the same time as migrating the farm, we were also introducing a new SharePoint 2013 farm to support a newly developed search-driven site, and ultimately to upgrade and host the existing 2007 sites. In this post, I'll explain some of the challenges and successes we experienced.


Northamptonshire County Council is a customer we've worked with for many years, particularly around SharePoint. The 2007 farm hosting was originally built by Fuse as a combination of physical and virtual (VMware) servers in a three-tier farm, with a separate UAT environment. Active Directory services were provided by local AD servers replicating over a VPN link back to the customer's site. Shared load balancers and firewalls were managed by the datacentre providers, as part of a managed service that included OS monitoring and patching. Over three years, the infrastructure has worked pretty much faultlessly, with the few incidents of unexpected downtime all caused by factors outside of anyone's control.

The sites themselves are developed by a combination of Fuse and NCC, with all support provided by Fuse. Over the years, the SharePoint sites have grown to include many custom solutions, forms based authentication, custom data-driven applications, and data pulled from web services hosted in Northampton and elsewhere. A custom mobile solution and multiple SSL sites add to the overall complexity of the hosting environment. Even though we carry out every deployment and have overall responsibility for support, the prospect of moving the farm to a new infrastructure was a daunting one.

So Why Move?

Three main factors drove the move to a new infrastructure:

  1. SharePoint 2007 is an aging platform, and any new sites would need to be developed on 2013, requiring a new farm while the existing one is maintained as the sites are upgraded. The cost of hosting both farms simultaneously with the current provider, who requires an annual commitment, was prohibitive.
  2. NCC is now part of LGSS (another site hosted by Fuse on Azure in this farm!), who will be building their own hosting environments in the medium to long term. Using Azure, we could build an environment that could easily be moved back into an on-premises environment at any time, or simply taken over by LGSS.
  3. We had worked with other customers to build large-scale SharePoint farms on Azure, and found the reliability, features and flexibility to surpass that of any hosting environment we'd ever come across. Combined with the seemingly ever decreasing costs and increasing features, Azure ticks many boxes.

Having convinced the customer that Azure was the right way to go, we engaged with Microsoft's UK Azure team to set up the subscription as part of NCC's EA agreement. This gave them improved pricing and access to the Azure super-portal that allows management of multiple subscriptions – a useful feature that enabled NCC to delegate access to an entire subscription for us (something we and Microsoft guided NCC through setting up) and will allow further subscriptions to be used for other projects with separate budgets. Azure's pay-as-you-go model doesn't seem at first glance to fit with local government procurement, but done the right way, it's very simple to set up, and is just an extension of an existing agreement. This also covers off any licensing issues, which can be complex with a mixed on-premises/cloud infrastructure.

The Challenges

We've built several SharePoint 2013 farms in Azure, and they've all been relatively straightforward. As SharePoint 2013 and SQL 2012 are built for Windows 2012, which sits very comfortably on Azure, there are no real issues to overcome - essentially you build the farm as you would on-premises, with certain tweaks for optimising performance on Azure - particularly with the SQL servers. A SharePoint 2007 farm is a different matter:

  • Only supports Windows 2008 R2 – which is still OK on Azure
  • No support for SQL 2012 or Availability groups
  • Requires sticky-session support in a load-balanced environment
  • No SNI (Server Name Indication) support in IIS 7

We also had our own challenges to overcome with this particular environment:

  • A VPN was needed to replicate Active Directory, so that content editors from NCC could authenticate using their corporate credentials
  • Transferring the existing content and application data
  • Ensuring all the various features of each site were operational prior to go live
  • Co-ordinating all this at the same time as launching a brand-new site

Here's some detail on how we overcame each challenge to build a successful solution.

Two Farms in One

In order to minimise the costs of hosting two farms, we knew we needed to share as much as possible between the farms, while still ensuring high availability. The sites in the farm regularly exceed 15,000 visitors per day, so the infrastructure must be able to comfortably handle those traffic levels – and during certain periods (bad weather, for example) traffic can quadruple. So we designed the following architecture for production:

  • Shared Infrastructure
    • Two Windows 2012 Active Directory Servers, running DNS and replicating with the AD servers at NCC
    • Two Windows 2012 servers, in a Windows cluster, running two instances of SQL 2012 – an Availability group for SharePoint 2013, and a mirror/witness set for 2007
    • Azure configured to run a front end service for 2007, and a separate service for 2013 - effectively two distinct load balancers
    • All servers connected to the same virtual network, divided into subnets, and connected to NCC's infrastructure through a VPN tunnel
  • Two SharePoint 2007 front end servers running on Windows 2008 R2
  • One SharePoint 2007 server running on Windows 2008 R2 as a search crawler/indexer etc.
  • Four SharePoint 2013 servers running on Windows 2012, with two acting as the application/batch processing tier
  • A UAT environment consisting of two SharePoint 2013 servers and one SharePoint 2007 server, connected to a single SQL server running two instances of SQL. The servers are joined to the same AD as production.

All the SharePoint 2007 servers have more memory, disk space and processing power than the existing farm, but are still cheaper to host on Azure.

SQL Issues with 2007

Although SharePoint 2007 doesn't officially support SQL 2012, adding another pair of large SQL servers when we already had a decent set seemed like something worth avoiding, particularly as the intention was to wind down 2007 over the life of the solution. Before we put the solution together we looked into what it would take to make SharePoint 2007 work on SQL 2012, and it turns out, not a lot – there are some good blog posts out there explaining the necessary steps, and we've found no issues so far. We have kept SharePoint 2007 on its own instance, and limited memory accordingly for each instance, so that none of the SQL changes could affect the supportability of our 2013 farm.
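As a rough sketch of the per-instance memory cap described above (the figure is an illustrative placeholder, not our production value), limiting an instance's memory is a one-off configuration change run against each instance:

```sql
-- Illustrative sketch only: cap memory for the SharePoint 2007 instance
-- so it can't starve the 2013 availability-group instance sharing the
-- same servers. 8 GB is a placeholder, not the production figure.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;
```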

This has also meant we can take different high availability approaches for each farm. 2013 natively supports all the SQL 2012 HA features, so we've gone for the one that best suits Azure, availability groups on a Windows cluster. 2007 doesn't (though we did try hard to convince it!) so instead we've opted for a witness/mirror configuration.

Load Balancing

In a previous blog post I've mentioned how good Azure can be for load balancing SharePoint – but this was for SharePoint 2013, which handles sessions across the farm much better than 2007. Although anonymous users are fine without sticky sessions, there are a number of features within the sites that require a sticky session, which Azure load balancing doesn't provide. The first issue this caused was with site editing for Windows-authenticated users, which on post-back would generate an invalid view state if the request went to the other server – this is easily fixed by having the same machine key in the application's web.config, which fixes a similar issue for forms-based authentication too.
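The fix works because both front ends must sign and validate view state with the same key. As an illustration of the principle only (a minimal Python sketch of HMAC signing, not the actual ASP.NET view-state implementation):

```python
import hmac
import hashlib

def sign(payload: bytes, key: bytes) -> bytes:
    """Sign a payload, as a web server signs view state before sending it."""
    return hmac.new(key, payload, hashlib.sha256).digest()

def validate(payload: bytes, signature: bytes, key: bytes) -> bool:
    """Validate a signed payload on whichever server receives the post-back."""
    return hmac.compare_digest(sign(payload, key), signature)

payload = b"__VIEWSTATE contents"
key_a = b"machine-key-on-server-a"
key_b = b"machine-key-on-server-b"

# Server A signs the page; the post-back lands on server B.
sig = sign(payload, key_a)
print(validate(payload, sig, key_b))  # False: "invalid view state" error
print(validate(payload, sig, key_a))  # True: a shared machine key fixes it
```

In ASP.NET this corresponds to setting an identical `<machineKey>` element in each front end's web.config, so a page rendered by one server validates on the other.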

All the other transaction-based custom applications within the farm seem to handle non-sticky sessions well, but this could just be luck – so we looked at some different solutions in case our luck doesn't hold out. The first was a virtual load-balancing device from Kemp, which is available from the Azure store. This is essentially a full-featured load balancer that runs behind the Azure load balancer and in front of the SharePoint servers, and can handle sticky sessions. Secondly we looked at the Application Request Routing (ARR) module on IIS 8, which turns any Windows server into a full-featured load balancer/cache server. As we also needed an SSL solution, this is what we went with, though currently it's only handling our SSL issue.


Because our 2007 farm hosts multiple SharePoint applications, and some of those use SSL, we needed a way of having multiple SSL certificates on the same port. In our old hosting environment, our hosting providers gave us another IP address and we simply forwarded that on a different internal port to SharePoint. With Azure, things aren't quite so simple – another IP address requires another "service", and virtual machines can only exist in one service at a time. SNI is an option, but only for the 2013 farm, as it's a feature of IIS 8. For our 2007 farm we needed a solution similar to our existing hosting, and came up with using ARR to forward the requests from a new Azure service to our SharePoint 2007 front ends, again on an internal port. As the ARR servers are also Windows 2012, we can use SNI on these to identify the requested site and forward it correctly to the SharePoint 2007 servers, enabling us to host multiple SSL sites without requiring any more servers or services.
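As a sketch of the ARR approach (the host name and internal port below are hypothetical placeholders, not the production values), a URL Rewrite rule on the Windows 2012 ARR servers might look like:

```xml
<!-- Hypothetical sketch: ARR/URL Rewrite forwarding an SNI-identified
     SSL site to the SharePoint 2007 front ends on an internal port.
     Host name and port are illustrative, not the production values. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Forward secure site to SP2007" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^secure\.example\.gov\.uk$" />
        </conditions>
        <action type="Rewrite" url="http://sp2007-frontend:8081/{R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

SSL terminates at the ARR server (where SNI picks the right certificate), and the request travels on to SharePoint over the internal port.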

VPN and Active Directory Replication

Our existing hosting environment already used a VPN to enable AD traffic between sites. From this and other Azure implementations, we knew that the Azure site-to-site VPN and virtual networks would provide a workable solution, but we also knew that any VPN into a large/complex environment is never straightforward. We also had to ensure the replication topology could handle the two branch sites. This took a lot of work with the customer's networking and AD teams, and had we not had previous experience with Azure VPNs, we might not have persisted. I'm glad to say it does all work, and enables other solutions, such as on-premises backup, monitoring etc., to be used seamlessly with Azure – effectively the Azure servers in Dublin are now as much a part of the NCC infrastructure as those in Northampton.

Data Transfer

In order to perform the migration with minimal downtime, we had set the farms up and configured them with all the required settings and solutions over a number of weeks, testing against a copy of the content and application databases we had copied across as SQL backup files. The final step to making them live was to freeze content and migrate the data one more time, before switching DNS entries to make the new farm available to the public. However, we'd found that the initial data copy, where we'd used the two VPN tunnels to copy the live SQL data up to NCC and then back out to Azure, had taken much too long.

We found a much better way was to use the Azure/SQL backup tool, which can automatically transfer any SQL backup, as it's created, to Azure storage – by connecting this to storage available to our new farm servers, the backups were ready in minutes instead of hours and meant we could commence the content restore that night, shortening the entire migration process by around a day. We simply scheduled the SQL backup to occur after the content freeze and waited for it to arrive at Azure.
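We won't reproduce the tool's configuration here, but as a hedged illustration of the same idea using native T-SQL (backup to URL is available from SQL Server 2012 SP1 CU2 onwards; the credential, storage account and database names below are placeholders):

```sql
-- Illustrative sketch: write a backup straight to Azure blob storage.
-- Storage account name and access key are placeholders.
CREATE CREDENTIAL AzureBackupCred
    WITH IDENTITY = 'mystorageaccount',   -- storage account name
    SECRET = '<storage-access-key>';      -- storage account access key

BACKUP DATABASE [WSS_Content]
    TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/WSS_Content.bak'
    WITH CREDENTIAL = 'AzureBackupCred', COMPRESSION, STATS = 10;
```

The backup lands in blob storage in the same region as the farm, so the restore can start as soon as the backup completes.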

Planning, Testing and Project Management

Of course we can't pretend the whole process went completely smoothly, and there were still some minor issues once the sites went live. However, these were quickly identified and fixed. This was due to the way the project was managed between NCC and Fuse, with frequent update meetings, shared project resources and effective communication. A well-developed test plan ensured that once the migration was completed and the new farm made live, all the features could be tested and the integration endpoints updated – this would not have been possible without both teams working together.

Successful Delivery

This was a major set of changes for NCC, with significant risks and a large number of people involved on both sides. Fuse overcame a number of technical challenges to deliver a solid platform that has saved a significant amount of public money and enabled NCC's digital team to continue delivering award-winning services to the public.

 About us

Fuse Collaboration Services is a Cloud Solution Provider and Microsoft Gold Partner based in Northampton, UK, specialising in delivering SharePoint, Skype for Business, and Azure cloud-based solutions.
