There has been a lot of buzz from SQLPASS and in the SQL Server community about the Business Intelligence Semantic Model (BISM), which will be used to ‘power’ access to the data for the Microsoft Business Intelligence applications such as Excel, Reporting Services (SSRS) and SharePoint. It is also intended that Project Crescent, the new self-service ad-hoc reporting tool arriving in SQL Server Denali, will be powered by the BISM.
Following on from some recent blogs, I was pleased to receive some direct questions from some business-oriented people, who wanted to know more about the ‘how’ of the BISM. It’s clear to me that business users are interested in how it will impact them. The focus of this blog is to take the information from people like Chris Webb, Teo Lachev, Marco Russo and TK Anand, who have written clear and trusted accounts of the SQL Server Denali information thus far, and use it as a foundation to answer the questions from business-oriented users that I’ve received so far. So, to business!
How can I access the BISM?
The BISM has a three-layered structure. The Data Model Layer – this is what users connect to. It is underpinned by two further layers:
The Business Logic Layer – encapsulates the business rules, which is supported by:
The Data Access Layer – the point at which the data is integrated from various sources.
TK Anand has produced a nice diagram of the inter-relationships, and you can head over to his site to have a look.
How do I create a Business Intelligence Semantic Model?
This is done via a new development environment, Project Juneau, which is essentially the next version of the Business Intelligence Development Studio (BIDS).
It is also possible to produce a BISM model using Excel PowerPivot, which helps you to construct the relationships and elements contained in the model, such as the business calculations. These calculations are written using Data Analysis Expressions (DAX), which takes you from simple calculations through to more complex business logic such as Pareto computations, ranking, and time intelligence calculations. If you would like to know DAX in depth, then I suggest you have a look at the book Microsoft PowerPivot for Excel 2010 by Marco Russo and Alberto Ferrari, whose explanations of the DAX constructs are very accessible. Thank you to Sean Boon for his commentary on the involvement of PowerPivot in creating BISM models.
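To give a flavour of what DAX looks like, here is a small sketch of the kind of measures you might define in PowerPivot. The table and column names (Sales, Dates, Products and so on) are purely illustrative, not taken from any particular model:

```dax
-- A hypothetical time-intelligence measure: year-to-date sales,
-- assuming a Sales table with an Amount column and a related
-- Dates table marked as the date table.
Sales YTD := TOTALYTD ( SUM ( Sales[Amount] ), Dates[Date] )

-- A hypothetical ranking measure: ranks products by an existing
-- [Total Sales] measure across all product names.
Product Rank := RANKX ( ALL ( Products[ProductName] ), [Total Sales] )
```

The appeal for business users is that these read more like Excel formulas than like MDX, while still covering the ranking and time-intelligence scenarios mentioned above.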
How up-to-date is the data? Is it cached or accessed in real-time?
The Data Model Layer, accessed as the fundamental part of the BISM, can be cached or accessed in real time. The main takeaway points are as follows:
Cached method: access to the cached data is very, very fast. At the SQLPASS event, the demo showed instant querying of a 2-billion-row fact table on a reasonable server. The speed comes from the VertiPaq store, which holds the data ‘in memory’.
Real-time method: queries go straight through the Business Logic Layer to fetch the data directly from the underlying sources for the person navigating the data.
A potential downside of the cached method is that the data needs to be loaded into the VertiPaq ‘in memory’ store before it can be accessed. It’s not clear how long this will take, so it is sounding like a ‘how long is a piece of string?’ question; in other words, it depends on your data, I suppose. Other technologies, such as Tableau, also use in-memory data stores and data extracts. For example, Tableau offers you more calculations, such as CountD, if you use its data extracts instead of touching the source systems, thereby encouraging you to use its own data stores. In Denali, I will be interested to see whether the calculations offered differ between the cached and real-time methods.
To summarise, a careful analysis of your requirements will help to determine the method that your business needs. In case you want more technical detail: this BISM in-memory mode is a version of SQL Server Analysis Services, and I would head over to Chris Webb’s site for the specifics.
How can I access the BISM without SharePoint?
In SQL Server Denali, it will be possible to install a standalone instance of the in-memory BISM mode. Essentially, this is a version of Analysis Services that does not need SharePoint. Until more details are clarified, it isn’t possible to say for certain how this version differs from the SharePoint-specific version; no doubt that will become clearer.
As an aside, I personally love SharePoint and I think that users can get a great deal from it generally, not just in the Business Intelligence sphere. I would want to include SharePoint implementations as far as possible in any case.
What will BISM give me?
Project Crescent: The big plus is Project Crescent, the new ad-hoc data visualisation tool, which is planned to visualise data only via the BISM. Although you don’t need SharePoint to have a BISM, you do need it if you want to use Project Crescent.
Hitting the low and high notes: If you’ve ever had to produce very detailed, granular reports from a cube, then you will know that this can take time to render. The BISM will be able to serve up the detailed level data as well as the aggregated data, thereby hitting both notes nicely!
Role-based security: this will be available, making it possible to secure tables, rows or columns. As an aside, it will be important to plan out the roles and security so that they map to the business requirements around who can see the data.
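To illustrate what row-level security might look like in this model: in the tabular world, a role’s row filter is simply a DAX expression that evaluates to true or false for each row of a table. The role name, table and column below are hypothetical, just to show the shape of such a filter:

```dax
-- Hypothetical row filter attached to a 'EuropeanSales' role,
-- restricting the Sales table to European rows only
-- (table and column names are illustrative):
=Sales[Region] = "Europe"
```

Members assigned to that role would then see only the rows for which the expression returns true, which is why mapping roles to the business’s data-visibility requirements up front matters so much.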
What will BISM not give me?
As I understand it, the BISM will not support very advanced multidimensional calculations in Denali, since it is not as multidimensional as its more mature Analysis Services sibling, the Unified Dimensional Model (UDM). As with most things, what is simpler to use won’t be as advanced as its more complex counterparts. This can also be an advantage, since the BISM will be easier for many relationally-oriented people to understand and access, especially for straightforward quick reports.
I hope that helps to answer the various questions I have received; if not, please don’t hesitate to get in touch again!