PASS Summit Day 2: The Aftermath
Reposted from Chris Webb's blog with the author's permission.
Well, that last blog post sparked a bit of a discussion, didn’t it? Indeed, I’ve spent the last few days doing a lot of talking to various different groups of people – PASS attendees, fellow MVPs, Microsoft – about what was or wasn’t said in the various announcements made at PASS, what I did or didn’t mean, and how people are interpreting or misinterpreting the news. And now it’s time to follow up with another blog post to explain myself better and say what’s happened since Thursday; you may also want to read the official statement about the roadmap from TK Anand here before carrying on reading this post.
First of all, let me start by making it clear that Analysis Services overall is alive and well, and in fact has a greatly increased role to play in the BI stack in Denali. My original post pretty much said as much. Some of the confusion, though, stems from the fact that ‘Analysis Services’ in Denali will have two distinct parts:
1) The UDM, or Analysis Services cubes, which is what we have today. Some people refer to it as MOLAP SSAS but I don’t like this description: it highlights the storage mode when in fact I consider its distinguishing feature to be its multidimensional view of the world. Personally I couldn’t care less about storage modes and can’t wait to see Vertipaq replace MOLAP, but I do care about multidimensionality and its advantages when it comes to BI – some BI applications, typically ones which need complex calculations, can only be built using a true multidimensional OLAP database. I’d say anyone who thinks that the point of using the UDM is that MOLAP is (or has been) faster than relational database engines has completely missed the point. However, multidimensionality is complex and somewhat inflexible, and that’s what puts lots of people off.
2) The new BI Semantic Model, BISM. This is what’s new in Denali, and features a more relational, tabular way of modelling data as well as the new Vertipaq storage engine. BISM is a little bit multidimensional (it is after all still SSAS under the covers) but not much: that’s exactly why it’s easier to use, more flexible and appropriate for a wider range of BI applications. It will be a massive asset to the MS BI stack and make building many types of BI applications quicker and easier. It will probably not do everything that the UDM does, though, precisely because it is not as multidimensional.
The point I was trying to make in my original post was that the announcements made at PASS, as I and everyone I spoke to there interpreted them, made me very concerned (to say the least) for the future of the UDM and the multidimensional model. First of all there was the news that Microsoft was putting all of its development efforts into Vertipaq and BISM, while the UDM was (for yet another release) getting very few obvious improvements. Then there was the news that Project Crescent was only going to support BISM as a data source and not the UDM, which made it seem like the UDM was a second class citizen in this regard. And finally there was a lack of clarity in the roadmap which meant I wasn’t sure whether BISM was meant to replace the UDM or not, or whether BISM would ever be able to do the same things that the UDM can do today.
This is what caused all the commotion, and I’m pleased to say that after a lot of what’s generally referred to as ‘free and frank discussion’ behind the scenes the guys at Microsoft understand what happened. In part there was a failure of communication: I don’t think the Analysis Services team ever meant to send out a negative message about the UDM, and they were a bit surprised at my reaction. TK’s recent post that I link to above is a very clear and positive statement about the future of the UDM. But words need to be backed up by actions, and Microsoft know there need to be some changes to the Denali roadmap so that customers receive the right signals. As a result I hope to see a little more love shown to the UDM in Denali, to prove to all of us who have invested in it that Microsoft still cares; I also know that Microsoft are looking again at ways that Crescent can work with existing UDM applications; and I hope to see a clearer long-term vision showing how anyone investing in the UDM today will have the option, if they want, to move smoothly over to BISM when they feel they are ready. An argument about semantics is in no-one’s interests (I couldn’t help thinking of this); what I care about is that I’ll have all the cool new stuff that BISM will give me and I’ll still be able to do everything I can do today in the UDM, and that we’ll have all the power of relational and multidimensional modelling when we’re building our BI solutions.
So let’s be positive. There was a bit of a bust-up, but we’re all friends again now and I think the SSAS community is better off for having had this all come out now rather than later – and the fact that we can even have this type of discussion shows the strength and vibrancy of the community. I’m not afraid of change and I know it has to happen; I’m confident that the changes we see coming in Denali will be for the better. However, I’m also a lot happier now that existing Microsoft BI customers have had this reassurance that they won’t be left stranded by these changes.
Chris has been working with Microsoft BI tools since he started using beta 3 of OLAP Services back in the late 90s. Since then he has worked with Analysis Services in a number of roles (including three years spent with Microsoft Consulting Services) and he is now an independent consultant specialising in complex MDX, Analysis Services cube design and Analysis Services query performance problems. His company website can be found at http://www.crossjoin.co.uk and his blog can be found at http://cwebbbi.wordpress.com/.