Every business today is looking for ways to unlock the value of its data, connect disparate systems, increase efficiency and optimise its existing IT, all while keeping the organisation secure. Why? Because competitive advantage, a positive customer experience and a bulging bottom line all depend on it.
Technology is becoming increasingly flexible, mobile and connected: the evidence is probably either in your pocket or in your hand.
Smart devices are now built around a huge number of modern applications, many of which are tied directly into business functions; in essence, the line between consumer and business technology has blurred.
Before we start comparing, it is worth highlighting that this is a bit of an "oranges and apples" comparison. Technically speaking, the UK Data Protection Act (DPA) 1998 was enacted to bring British law into line with the 1995 EU Data Protection Directive (DPD, 95/46/EC), and it is that Directive which is now being repealed and superseded by the General Data Protection Regulation (GDPR, 2016/679), adopted in 2016.
When you create a virtual machine in Azure, Microsoft provides a set of images from which your virtual machines can be built. These range from plain operating system images, such as Windows Server 2012 R2 with nothing installed, through other operating systems such as Ubuntu and CentOS, to images with various pieces of software and third-party applications pre-installed. There are, however, reasons why you may want to use a custom image instead.
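By way of illustration, here is a minimal sketch of building a VM from one of those stock marketplace images using the Azure SDK for Python; the subscription ID, resource group, VM name, password and NIC ID are placeholders, and the resource group and network interface are assumed to exist already:

```python
# A minimal sketch: build a VM from a Microsoft stock image via the
# Azure SDK for Python. All names and IDs below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<subscription-id>"
nic_id = ("/subscriptions/<subscription-id>/resourceGroups/my-rg"
          "/providers/Microsoft.Network/networkInterfaces/my-nic")

client = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

poller = client.virtual_machines.begin_create_or_update(
    "my-rg",
    "my-vm",
    {
        "location": "westeurope",
        "hardware_profile": {"vm_size": "Standard_D2s_v3"},
        "storage_profile": {
            # A stock image: Windows Server 2012 R2 with nothing installed.
            "image_reference": {
                "publisher": "MicrosoftWindowsServer",
                "offer": "WindowsServer",
                "sku": "2012-R2-Datacenter",
                "version": "latest",
            }
        },
        "os_profile": {
            "computer_name": "my-vm",
            "admin_username": "azureuser",
            "admin_password": "<a-strong-password>",
        },
        "network_profile": {"network_interfaces": [{"id": nic_id}]},
    },
)
vm = poller.result()  # block until provisioning completes
```

Swapping the image_reference is all it takes to build from a different stock image; a custom image is referenced by its resource ID instead.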
The public preview of Microsoft Flow was announced at the end of April and was showcased in a session at the recent Integrate 2016 summit in London. In this blog, Ed Loveridge demonstrates how to set up a simple Flow to send all tweets with a specific hashtag to a channel in a Slack team.
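Flow itself is configured in the browser rather than in code, but as a rough sketch of the equivalent logic, the Python below polls for new tweets carrying a hashtag and forwards each one to a Slack incoming webhook; the webhook URL and the get_new_tweets() helper are illustrative placeholders:

```python
# Rough sketch of the Flow's logic in Python. In practice get_new_tweets()
# would call the Twitter search API; here it is a stub.
import time
import requests

SLACK_WEBHOOK = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder
HASHTAG = "#Integrate2016"

def get_new_tweets(hashtag):
    """Placeholder: return the text of tweets mentioning `hashtag` since the last call."""
    return []

while True:
    for tweet_text in get_new_tweets(HASHTAG):
        # Slack incoming webhooks accept a simple JSON payload.
        requests.post(SLACK_WEBHOOK, json={"text": tweet_text})
    time.sleep(60)  # poll on a schedule, much as Flow's trigger does
```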
During a recent BizTalk project, one of the customer requirements was that system tests be written using SpecFlow. SpecFlow is an open-source framework that allows automated tests to be defined as Behaviour Driven Development (BDD) style specifications. In this blog I will demonstrate how to install SpecFlow and outline the steps we used throughout the project to create the tests.
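SpecFlow specifications are written in Gherkin and bound to step definitions in C#; as a language-neutral sketch of the same pattern, here is what a scenario and its bindings might look like using Python's behave framework, where the scenario wording and the stubbed harness call are all illustrative:

```python
# steps/order_steps.py -- behave bindings for a Gherkin scenario such as:
#
#   Scenario: Order message is routed to the warehouse
#     Given an order message for customer "C042"
#     When the message is submitted to the receive location
#     Then a warehouse dispatch message is produced
#
from behave import given, when, then

def submit_message(message):
    """Stub standing in for the harness that drives the system under test."""
    return {"type": "dispatch", "customer": message["customer"]}

@given('an order message for customer "{customer_id}"')
def step_given_order(context, customer_id):
    context.message = {"type": "order", "customer": customer_id}

@when("the message is submitted to the receive location")
def step_when_submit(context):
    context.result = submit_message(context.message)

@then("a warehouse dispatch message is produced")
def step_then_dispatch(context):
    assert context.result["type"] == "dispatch"
```

The appeal of the BDD style is that the Gherkin text doubles as a specification the customer can read and sign off.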
"The best way to avoid failure is to fail constantly". Over the last few years, Netflix has been improving their resiliency with the help of the ‘Chaos Monkey’ and an ever expanding ‘Simian Army’. Nathan Cooper looks at the lessons that can be learned from their rather unorthodox automated testing approach.
Nino Crudele will be producing a series of articles about the latest and greatest in Microsoft technologies, some aimed at sales and management, others more technically focussed towards the developer community.
OWIN is the new open standard for building and hosting .NET web applications, bringing huge improvements to web development in terms of both flexibility and developer productivity. This article solves a particular problem that may be encountered when moving to OWIN in applications that use Microsoft's Unity dependency injection container.
In this series of blog posts, Nicholas Revell will discuss how Power BI and related Microsoft technologies can help you to implement a successful self-service business intelligence (BI) strategy.
Power BI is uniquely placed to help your organisation realise the benefits of self-service business intelligence (BI), whether you are starting from scratch or building on top of a traditional BI platform.
I had planned to review Row-Level Security and Transparent Data Encryption (TDE), but support for TDE has not been announced yet (even though I can see the relevant system tables in V12 databases). So in this part I will just be giving you my assessment of Row-Level Security.
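For context, here is what Row-Level Security looks like in the generally available T-SQL syntax (the preview syntax under review here may differ): a predicate function plus a security policy, executed below via pyodbc against an illustrative dbo.Orders table with a TenantId column:

```python
import pyodbc

# Predicate function: a row is visible only when its TenantId matches the
# value the application stored in SESSION_CONTEXT for this connection.
predicate = """
CREATE FUNCTION dbo.fn_TenantPredicate(@TenantId int)
RETURNS TABLE WITH SCHEMABINDING
AS RETURN
    SELECT 1 AS allowed
    WHERE @TenantId = CAST(SESSION_CONTEXT(N'TenantId') AS int);
"""

# Security policy: attach the predicate as a filter on dbo.Orders.
policy = """
CREATE SECURITY POLICY dbo.TenantFilter
ADD FILTER PREDICATE dbo.fn_TenantPredicate(TenantId) ON dbo.Orders
WITH (STATE = ON);
"""

conn = pyodbc.connect("DSN=AzureSqlDb")  # placeholder connection string
cur = conn.cursor()
cur.execute(predicate)
cur.execute(policy)
conn.commit()
conn.close()
```

The application would then call sp_set_session_context to set TenantId on each connection before querying.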
There have been a number of new security enhancements recently to control and audit access to Azure SQL databases. These include Auditing, Dynamic Data Masking, Row-Level Security, and Transparent Data Encryption. In this first part I give my assessment of Auditing and Dynamic Data Masking, and how likely I am to be using them.
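As a taste of the masking feature, the generally available T-SQL applies a mask definition to a column so that non-privileged users see obfuscated values; a minimal sketch via pyodbc, with an illustrative dbo.Customers table (the portal-based preview configuration may have differed):

```python
import pyodbc

# Apply a built-in email mask to a column; non-privileged users will then
# see values such as aXXX@XXXX.com when they query it.
conn = pyodbc.connect("DSN=AzureSqlDb")  # placeholder connection string
cur = conn.cursor()
cur.execute(
    "ALTER TABLE dbo.Customers "
    "ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');"
)
conn.commit()
conn.close()
```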
The data mining tools in SSAS (multidimensional mode) have been available since SQL Server 2000, and the range of bundled data mining algorithms is generally considered sufficient for most requirements.
What I fundamentally need to achieve for HADR (high availability and disaster recovery) is a database system that is, as much as possible, always available ('online and accessible'), complete ('no data loss'), and accurate ('no data corruption'). For the purpose of this discussion, I am ignoring backup strategies on the assumption that they exist primarily to roll databases back to a previous known state.
Database management tools, such as SQL Server Management Studio (SSMS) are mature products in daily use by database professionals around the world. This makes me wonder why Microsoft bothered producing their Silverlight-based Azure SQL Database Management Portal.
Scaling out (or sharding) by adding more databases usually requires careful planning and provisioning to ensure even distribution of data. It also adds more administrative overhead, and increases the number of points of failure. In this respect, Azure SQL databases are the perfect candidates for sharding because they can be created or deleted on demand, provide near-zero administration, and have built-in fault tolerance.
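To make the idea concrete, here is a minimal sketch of hash-based shard routing across a fixed set of databases; the connection strings are placeholders, and Microsoft's Elastic Database tools offer a production-grade shard map where this just illustrates the principle:

```python
# Minimal sketch: hash the sharding key to pick a database, so data
# spreads evenly without per-customer placement decisions.
import hashlib

SHARDS = [
    "Server=shard0.database.windows.net;Database=app0;...",  # placeholder
    "Server=shard1.database.windows.net;Database=app1;...",  # placeholder
    "Server=shard2.database.windows.net;Database=app2;...",  # placeholder
]

def shard_for(key: str) -> str:
    """Map a sharding key to one of the shard connection strings."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    index = int.from_bytes(digest[:4], "big") % len(SHARDS)
    return SHARDS[index]

print(shard_for("customer-42"))
```

Note that a plain modulo scheme reshuffles keys when the shard count changes; consistent hashing or a lookup-based shard map avoids that.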
During initial discussions on structuring a new TFS installation, one question arises regularly and, to me, surprisingly: why should a dedicated machine be configured to host builds? I shall explain why the build function should be separated from TFS.
Building on a well-documented walkthrough here, I thought I'd share some of the errors I came across while integrating BizTalk 2013 with SharePoint Online 2013.