
Tuesday, July 28, 2009

A Little Something About PPS Filters (part 3)

It’s been a while since I started on this topic so hopefully I’ll remember what I was attempting to say.

For a particular client we have had issues with a multi-level dimension hierarchy that contains several thousand members. As I’ve said before, and as numerous people have experienced, PPS filter performance is not great (word on the grapevine, though, is that this is no longer an issue in v2). For various reasons, creating multiple dashboards to satisfy different users and restricting the users using OLAP security weren’t options, so we came up with a “third way”.

The third way was to provide the users with the ability to change their preferences and alter what they could see in the report hierarchy. First we needed to provide the users with a screen where they could make their selections. This was as simple as providing a PPS dashboard page that contained a filter and a report that used the filter.

The filter is based off of tabular data, as explained in the last post. The data itself is a parent-child data set generated from the dimension that I want to restrict, turning my source table for the dimension from something that looks like this;

[image: the source dimension table, with a column for each level]

To something that looks like this;

[image: the parent-child version of the table, with letter-prefixed IDs]

There are reasons I’ve done things like place a letter at the beginning of each of the IDs. First, it ensures that every member has a different ID and, secondly, it identifies which level of the original hierarchy the member comes from without having to write queries to work this out.

This has been placed into a view, which is just a series of select statements from each level of the hierarchy with unions between them. With that source view a tabular filter can be built and put into a dashboard page. At the top of the view, though, I’ve put a statement in for the “All” member. Remember, this isn’t Analysis Services, so there’s no dimension and no built-in “All” member.

image
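For what it’s worth, here is a rough sketch of the shape of that view. The dbo.DimTown table, its column names and the view name are all just illustrative; the letter prefixes and the hard-coded ‘ALL’ row are the bits that matter, as the SQL later in this post relies on them.

CREATE VIEW dbo.vwDimTownTabular
AS
-- Hand-rolled "All" member; there is no Analysis Services "All" level here
SELECT 'ALL' AS LocationID
      ,NULL  AS ParentLocationID
      ,'All' AS Location
UNION
SELECT DISTINCT
       'B' + CAST(CountryID AS varchar(5))
      ,'ALL'
      ,Country
FROM dbo.DimTown
UNION
SELECT DISTINCT
       'C' + CAST(RegionID AS varchar(5))
      ,'B' + CAST(CountryID AS varchar(5))
      ,Region
FROM dbo.DimTown
UNION
SELECT DISTINCT
       'D' + CAST(CountyID AS varchar(5))
      ,'C' + CAST(RegionID AS varchar(5))
      ,County
FROM dbo.DimTown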

Again, referencing back to the previous post on this, for the filter value to be retained in the ParameterValues table you will need to associate it with a report. A simple Reporting Services report will do, and my preference is to provide the user with some feedback on the selections they’ve made: if they have selected the “All” member then they get a list of all of the locations, otherwise only the members that have been selected.

The Reporting Services input parameter is fed by the filter and passed into some SQL that in this case looks like this;

IF EXISTS (
    SELECT 1
    FROM vwDimTown
    WHERE 'ALL' IN (@Location)) 
BEGIN
   SELECT DISTINCT CountryDesc, RegionDesc
   FROM vwDimTown
END
ELSE
BEGIN
   SELECT DISTINCT CountryDesc, RegionDesc
   FROM vwDimTown
   WHERE 'D' + CAST(CountyID AS varchar(5))
      IN (@Location)
   OR 'C' + CAST(RegionID AS varchar(5))
      IN (@Location)
   OR 'B' + CAST(CountryID AS varchar(5))
      IN (@Location)
END

With the Reporting Services report on the page we now have a mechanism for capturing a user’s preferences, like so;


[image: the preferences dashboard page, with the filter and the feedback report]


Retrieving the preferences is the next part. For that we build on the SQL from the previous post that shows how to extract rows from the ParameterValues table in a meaningful way, and this time we will put the logic into a stored procedure.


The procedure will perform a number of tasks in addition to pulling the selected members out of the PPSMonitoring database. We also need the final output to be formatted as an MDX set in the format {member 1, member 2, member 3, member n}. So let’s break the procedure down into its constituent parts. First, the declarations.



CREATE PROCEDURE uspGetFilterPreference 
     @Login varchar(255)
    ,@FilterID varchar(50) 
    ,@StrToSet varchar(max) OUTPUT 
AS

The first of the parameters, @Login, is the logon of the person whose preferences we are trying to find. @FilterID is the GUID identifying the filter we are interested in. Finally, @StrToSet is the output parameter that will contain the final MDX set string.


DECLARE @Keys table( 
MemberKey varchar(255)) 
DECLARE @OutputString varchar(max) 
INSERT @Keys(MemberKey) 
SELECT 
    REPLACE( 
        REPLACE( 
            CAST(d.Members.query('LocationID') 
                AS VARCHAR (255) 
                ), '<LocationID>', '' 
            ), '</LocationID>', '') 
    AS MemberKey 
FROM PPSMonitoring..ParameterValues 
CROSS APPLY SerializedXml.nodes
    ('/NewDataSet/MyDataTable') 
    AS d(Members) 
WHERE [Login] = @Login 
AND ParameterUniqueName = @FilterID

This section gets the values out of the ParameterValues table and loads them into a table variable called @Keys. Along the way we strip out any XML syntax (element names, tags, etc.), leaving a table variable containing the key values from the tabular filter.



IF EXISTS (
    SELECT 1 
    FROM @Keys  
    WHERE MemberKey = 'ALL')
BEGIN 
    SELECT @OutputString = 
       '[Town].[County].Members' 
END

Here we check to see if the “All” member has been selected. If it has then we set the OutputString variable to be all members from the County attribute hierarchy of my Town dimension and NOT the user hierarchy. This is really important for later on.


If the “All” member has not been selected then we work out which members have been selected and, using a set of union statements, get a list of true dimension keys.



ELSE 
BEGIN 
  -- STUFF strips the leading ', ' without capping the length of the set string
  SELECT @OutputString = 
    STUFF((
      SELECT ', ' 
            + '[Town].[County].&[' 
            + CAST(ID AS varchar(10)) 
            + ']' 
      FROM (
       SELECT ID 
       FROM dbo.DimTownTabular 
       WHERE ParentLocationID IN 
         (SELECT MemberKey FROM @Keys 
          WHERE LEFT(MemberKey, 1) = 'B') 
       UNION 
       SELECT ID FROM dbo.DimTownTabular 
       WHERE ParentLocationID IN 
         (SELECT MemberKey FROM @Keys 
          WHERE LEFT(MemberKey, 1) = 'C') 
       UNION 
       SELECT ID FROM dbo.DimTownTabular 
       WHERE LocationID IN 
         (SELECT MemberKey FROM @Keys 
          WHERE LEFT(MemberKey, 1) = 'D')
            ) a 
  FOR XML PATH('')), 1, 2, '') 
END 

This is why we have the ID column and the letter prefix; it makes this bit a lot easier to manage. The output of this, the value that gets dropped into the @OutputString variable, is almost exactly what we want. The only issue is that using the FOR XML PATH statement to produce a comma-delimited string of the members replaces “&” symbols with “&amp;”. A quick …



SELECT @StrToSet = '{' 
                 + REPLACE(@OutputString,'.&amp;', '.&') 
                 + '}'

… fixes that and puts the curly brackets around to finish off our set syntax and the procedure as a whole.
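Calling the procedure from Management Studio is then just a case of passing in a login and a filter GUID; the values below are placeholders only.

DECLARE @Set varchar(max)

EXEC uspGetFilterPreference
     @Login    = 'DOMAIN\someuser'                       -- placeholder login
    ,@FilterID = '00000000-0000-0000-0000-000000000000'  -- placeholder filter GUID
    ,@StrToSet = @Set OUTPUT

SELECT @Set AS StrToSet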


Run the procedure against the username and filter you’re querying and, depending on how many members you selected in the tabular filter, you should get something like this;



{[Town].[County].&[26], [Town].[County].&[63]}

Now I would recommend that everyone goes out and gets this book






MDX Solutions (Amazon link)

And go to chapter 10, where there is a section called “Using Stored Procedures for Dynamic Security”. This explains how to use a simple Analysis Services stored procedure (a .NET DLL) to affect which members in a dimension users are able to see. All the code is doing is retrieving a number of rows from a SQL query and passing them into Analysis Services security as an MDX set object. In my stored procedure I made it even simpler and just executed a SQL stored procedure and output a string.


Unfortunately I haven’t got this all packaged up as a project, but my recommendation would be to create a C# project, build the DLL, and also build a command line application that references the DLL just to test everything is working OK.


image


Also, things like error handling and the connection string configuration pointing at the PPSMonitoring database should all be handled in this code. My function is called “GetPreferredCounties” and has two input parameters: the username and the filter ID, both as strings.


After registering the assembly with the Analysis Server we can now set up the final piece. In one of the roles in your cube (you have created one, right?), within the dimension data security for the County attribute of the Town dimension, I can now put in the syntax to make the stored procedure call and retrieve the string.


image


In my example it looks like this.



STRTOSET(PPSFilterASStoredProcs.GetPreferredCounties(
    UserName()
   ,"822e399a-af44-4cf3-be19-f1602ca14afb"))

The USERNAME function will pass the current user’s logon credentials through to the stored procedure and the STRTOSET function will take the output string from the function and convert it into something Analysis Services will understand as a set.


If all has gone well Analysis Services is now using the output from the stored procedure to provide a restricted list of dimension members to a user. A list that the user themselves can control without any other system being put in the way or elevated privileges being provided.


Build another page on your dashboard and this time create and add a report that just displays some info about your filtered dimension. Attach this to a filter based on the Analysis Services dimension (I tend to use named sets) and now you will only see the members that match your preference selections.


Now to tidy up that hierarchy a bit; all those parent members are just getting in the way and making the navigation more complex than it needs to be. Pay a visit to Boyan Penev’s blog and take a look at his post on Filtering Unneeded Dimension Members in PerformancePoint Filters. By basing the source query or named set for your filter on the logic in that post you get just the top-most level necessary, which makes the filter a hell of a lot neater, with the added bonus of the top-level member becoming the default when you first access it.


image


There’s a lot going on in this post and unfortunately I haven’t had the chance to package it up into something I can redistribute easily, but if there are any questions then ask away and I’ll help if I can.

Wednesday, June 03, 2009

A Little Something About PPS Filters (part 1)

If you have a lot of members, say in the thousands, that you want to display in a PPS filter then you’re going to have problems. The current version of PPS doesn’t handle them too well and I surmise that it’s one of the reasons there’s a configurable setting in the web.config allowing you to set the maximum number of members returned in a filter.

Things are compounded when you have hierarchical filters, as the number of members is increased by parent members. The obvious way to get around this, off the top of your head, would be cascading filters, but this has its own issues; one in particular is that I want the user to be able to use all of the hierarchy in the reports, from the All level down to the leaf-level members.

Hopefully the architecture of PPS v2 will have corrected this little foible, but in the meantime we have to work around it. I’m going to talk in the next few posts about the filters and demonstrate a potential workaround, but I wanted to start off with a bit of an overview of the two types of filters available in PPS: OLAP based and tabular based.

The majority of people will have put together an OLAP data based filter. It’s very simple: you can choose to select the members directly from the dimension, use a named set, or use an MDX query. I would hazard a guess that most developers will use either named sets or MDX queries.

I’ll use a simple MDX query to generate a set from a dimension I’ve got called “Towns”. This dimension contains a single user hierarchy that looks like this;

  • Country
    • Region
      • County
        • Town Prefix
          • Town

As I only want to display data to the county level in my filter I can set this up with this MDX;

DESCENDANTS(
   [Town].[Town].[All]     // the top level member
  ,[Town].[Town].[County]  // down to the County level
  ,SELF_AND_BEFORE)        // including all members

When the filter is built and deployed you’ll get something that looks like this;

[image: the deployed filter]

To achieve the same with a tabular filter you need to put together a parent child structured data set. As my source table has individual fields for each of the hierarchy levels this will need to be created manually using a series of union statements. For example;

SELECT 'All' AS Location
      ,'All' AS ParentLocation
UNION ALL
SELECT DISTINCT
       Country AS Location
      ,'All' AS ParentLocation
FROM dbo.DimTown
UNION ALL
SELECT DISTINCT
       Region AS Location
      ,Country AS ParentLocation
FROM dbo.DimTown

Now, none of what has been written above is new, earth-shattering, or of any particular benefit to anyone apart from those who have never touched PPS before, but I wanted to mention the distinction between OLAP and tabular sourced filters as it will become quite important in the next post, which covers how PPS remembers what your last selection was.

Wednesday, March 04, 2009

Crash, Bang, WOLAP..

I’ve been working on a SQL Server 2008 project that uses proactive caching on some partitions to speed up the availability of the data to the end user.

For all the things that have been said and written about ROLAP in the past, apart from some issues around traces I’ve found it to pretty much behave itself. Well, that is, until now.

Since moving into a test environment we’ve been getting a lot of these errors;

The description for Event ID 22 from source MSSQLServerOLAPService cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.

If the event originated on another computer, the display information had to be saved with the event.

The following information was included with the event:

Internal error: An unexpected exception occurred.

the message resource is present but the message is not found in the string/message table

Actually these errors would come thick and fast, every couple of seconds or so, until the Analysis Services service decided to stop. Profiling the Analysis Server showed the problem from the OLAP server’s point of view, where it seemed to happen when the aggregations were being merged;

Internal error: An unexpected exception occurred. Errors in the OLAP storage engine: An error occurred while processing the 'Current Day' partition of the 'Fact Table' measure group for the 'AS2008 Bug Test' cube from the AS2008 Bug Test database. Server: The operation has been cancelled.

I took the OLAP database that I was using and broke it down until I had the minimum amount of code and objects left in there. This meant stripping out the calculations, all of the measure groups, apart from 1, and most of the dimensions. Finally I was left with a cube that had 2 dimensions (one of which was a date dimension) and a single measure.

Within the measure group were two partitions: one was standard MOLAP, the other a real-time HOLAP partition looking at the same fact table but a different date range (current day versus everything else). Now here’s the thing: while there is data in the HOLAP partition there are no errors and no crashing. With the HOLAP partition not processed, again no errors, but as soon as the HOLAP partition was processed and had no data to retrieve there would be errors galore, resulting in the inevitable crash.

A further step back showed that this would all behave as long as no aggregations were applied to the HOLAP partition. As soon as a single aggregation was put on the partition and the partition processed, the errors would appear until either data was inserted into the source table, to subsequently be picked up in the HOLAP partition, or Analysis Services crashed.

Curiously, converting the partition to real-time ROLAP didn’t deliver any errors with or without aggregations, and the same is true for any of the MOLAP types of storage, but as soon as it moved back to HOLAP the errors returned.

Some extra detail. I’ve observed this on both 32bit and 64bit systems. The 64bit server this was first spotted on has the cumulative update package 3 for SQL Server 2008 installed and the 32bit is RTM so it looks like the issue is throughout the codebase. We haven’t gone as far as to install the SP1 CTP due to environment constraints. This looks like it might be a bit of a bug so I’m in the process now of packaging up something to send to Microsoft.

Wednesday, February 25, 2009

Wanted: Data Dude-esque Features for the Rest of the SQL Server Stack

I’m a big fan of Visual Studio Team Edition for Database Professionals (mouthful..!). But what has practically always taken the sheen off of my BI deployments is the clunky way in which the other components get deployed. I like the idea of being able to perform a one-click deployment, and so do many of the production administrators I deal with.

DB Pro really provides complete control over all aspects of database development and deployment, so why can’t I have the same thing for my Analysis Services deployments? Between the deployment features of Visual Studio and the Analysis Services Deployment Wizard everything appears to be there to do exactly what I would like, so all I’m missing is that close-knit development environment integration.

Reporting Services suffices to a certain extent, apart from the ability to control the connection strings in data sources depending on the project / solution configuration. Yes, I know that you can re-deploy the data sources or use the scripting capabilities, but that’s yet another external mechanism when Data Dude (now in its second iteration) shows exactly how it should be done.

SSIS deployment is something I’ve never been particularly fond of but you can build the files into the folder of your choice so I’ll swallow that…

Maybe there’s something coming in SQL Server 2010 / Visual Studio 2010. One can always hope…

Tuesday, February 17, 2009

The Mystery of the Disappearing Packet

No it’s not Halloween come early but a story straight from “Tales of the Unexpected”.

So we have a scaled out PerformancePoint architecture with Windows SharePoint Services on one box, the PPS Monitoring services on another and Analysis Services 2008 and SQL Server 2008 on separate machines. In this environment the dashboards deployed to the SharePoint server talk directly to the Analysis Server but in this case they didn’t.

Because there is a firewall between the SharePoint server and the other servers in this environment, my immediate thought was the firewall, but after a) checking the firewall rules were in place, b) checking Analysis Services was listening on the default port, and c) checking that we could telnet on port 2383 from the SharePoint box to the Analysis Services machine, there was still no indication as to the issue.

Plugging in Microsoft’s Network Monitor (great tool; I thoroughly recommend getting it and using it), I could see the request coming in from the client but nothing going out at all on port 2383. I knew there needed to be something going out on that port, as I had checked all of this with the same tool on a similarly configured environment.

Next I moved on to the OLEDB providers themselves. As this is a purely SQL Server 2008 environment it appeared all of the correct drivers had been installed, but as you notice when configuring PPS Monitoring, the installation does like to have the ADOMD and SQL client providers for AS 2005 and SQL 2005. After consuming some information on the interweb about SQL Server 2008 / PPS SP2 installations (here and here) I thought I would give the latest 2005 components a try. To my amazement, without even a reboot, packets started to trickle out to my Analysis Server on port 2383.

Now I will leave you with a blast from the past. Just one thing to note for anyone outside of the UK, this is actually how everyone dances over here. On Saturday nights the Ritzys and Hippodromes up and down the country are filled with this kind of thing.

Enjoy..!

Client Tool Curiosities

Had a bit of a weird one the last couple of days where a cube I had built was demonstrating different behaviour in a variety of tools. Debugging the cube script in Visual Studio provided the values that I expected from the script, but when I then viewed the same data in ProClarity I saw very different behaviour.

I then tried the same thing in Management Studio and got a different set of numbers, and then finally Excel 2007 provided me with slightly different behaviour again. So how did all of this come about? Well, firstly there was an error in my calc: I was adding an integer value to a column where nulls occurred, but this wasn’t producing an error in the Visual Studio debugger. In fact, even though the debugger numbers appeared correct, it wasn’t evaluating my script in the way I expected.

ProClarity wouldn’t show the error, which was fine, but when I put the same MDX that ProClarity was generating into Management Studio the correct numbers appeared. Browsing the cube in Management Studio produced a completely different set of values, but that was down to me setting filters in the browser rather than navigating to the values I wanted.

Finally I gave Excel 2007 a go and actually saw the error “#VALUE” which kind of gave away what the problem was and where it was.

So there’s a moral to this story in there somewhere which I’ll leave to yourselves. But I’ve taken away from this the saying that if the number looks right and the number smells right then it’s probably right, maybe, in the right type of tool….. perhaps!

Thursday, February 05, 2009

PerformancePoint Dashboard Performance

I’m currently working on a project that is mixing PPS, Reporting Services 2008, ProClarity Analytics Server and Analysis Services 2008 with a lot of proactive caching….. Nice…!

The development environment we’re using does the job but I had been experiencing some really wild performance issues. One dashboard deployment would be fine and then the next it would run like a dog. This then drove me down a path of tuning every aspect of the solution so I thought I would share a few of my findings up here for posterity.

The cube has a number of measure groups that in turn have a number of partitions. We’re partitioning by month due to the data volumes, with a current-day partition that is HOLAP on each of the measure groups. I’m using SQL Server notifications (AS sets a trace against the target table) so everything is pretty much automatic.
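As a rough illustration of that partitioning scheme (the fact table and column names below are made up), the partition source queries are split along these lines:

-- One MOLAP partition per month, e.g. January 2009
SELECT *
FROM dbo.FactSales
WHERE DateKey >= 20090101
  AND DateKey <  20090201

-- The real-time HOLAP partition covering just the current day,
-- kept up to date by the proactive caching notifications
SELECT *
FROM dbo.FactSales
WHERE DateKey = CAST(CONVERT(char(8), GETDATE(), 112) AS int)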

Dashboards in the solution consist of a mix of Reporting Services, PAS and PPS native reports, all joined together using PPS filters. First off, the filter queries themselves: they weren’t dynamic, so we went down the route of putting some server-side named sets in and just calling them from Dashboard Designer. Nice and simple so far, but this is where the performance problems started to manifest.

With a simple dashboard up and a small number of members in the filter named set, the dashboard was taking far too long to return. There were four filters on this particular page. Looking at Profiler I was finding that a refresh of the page after a database update (remember, proactive caching is involved here) was causing a query to be fired against Analysis Services, and every partition in a single measure group would be read, four times.

The measure group in question happens to be the measure group that contains the cube’s default measure. This I considered to be a bad thing, as the production environment would have a lot more data and a lot more partitions to be read, four times, whenever a dashboard page was opened with filters.

To confirm the behaviour I created a dummy partition with a dummy measure in it and made that the cube’s default measure. Running the same test showed four queries being fired, but this time the partition being read was tiny and the filter response in the dashboard was immediate. Un-processing the dummy partition, though, still left everything behaving correctly, obviously without the dummy measure being read.

So point 1 is: be careful where you put your default measure. Next up are the Reporting Services reports. Creating parameters will create hidden data sets in the report that are there to support the available values lists in the report parameters. Whenever the report is executed it’s going to want to get the data for those data sets to populate the lists and multi-selects. If, as is the case on this project, the reports are designed to be accessed purely through dashboards using PPS filters, there is little point in having these data sets, so remove them.

First, for each parameter set the available values to None, then remove the related data sets (you will most likely have to show the hidden data sets) and any redundant data sources. Although these data sets are small, the Reporting Services reports on the dashboard were observably more responsive.

The final overall improvement is to reduce the number of members in the filters and make sure that you have selected the right type of filter for the job. In the case of this project we’ve avoided multi-selects and opted for single select lists where possible as they seem to take less time to render on the client. Improving the performance of the filters is ultimately going to be based on the number of members you can select from. With SP1 of PPS you can now (I think) have cascading filters but in an OLAP environment where you want to select a member at any level within a hierarchy that’s not useful.

Try using clever MDX or security to reduce the member count, it really makes a lot of difference.

Monday, November 17, 2008

Attribute Processing Errors

Just something to remind myself: when processing a dimension attribute, make sure that there are no NULLs in addition to empty strings. One or the other is OK, but when you start having both in an attribute key you will get processing errors, as Analysis Services will do whatever it does and effectively make them the same value, so you will get duplicate key problems.
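A minimal sketch of the kind of clean-up that avoids it, assuming the attribute is fed from a view over a hypothetical dbo.DimCustomer table:

-- Collapse empty strings and NULLs into one consistent value so the
-- attribute key can never contain both
SELECT
    CustomerKey
   ,ISNULL(NULLIF(LTRIM(RTRIM(MiddleName)), ''), 'Unknown') AS MiddleName
FROM dbo.DimCustomer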

Thursday, October 11, 2007

UPDATE to Multiple Pick List Filters in PPS 2007? Get RTM, Quickly..!


Yesterday I was over the moon with joy after finding that the RTM version of PPS solved my little issue. How wrong I was as performing the upgrade on another server presented me with exactly the same problem.

After some mad panicking and searching for a solution we discovered that the version of PAS on the PPS virtual PC was different to the version on our server. Another little hunt around presented us with this....

Microsoft ProClarity Analytics Server 6.3.2209 Hotfix

...which subsequently lead us to...

A hotfix rollup package is available for ProClarity Analytics Server 6.3

...and tucked away on there was this little gem...

Issues that this hotfix rollup package fixes

This hotfix rollup package fixes the following issues:

blah

blah

blah

When you run a query in Microsoft SQL Server Analysis Services, the query takes a long time if multiple parameters are specified. Additionally, the query may time out in Internet Information Services (IIS).

Hey presto it all behaves itself now.

To summarise, you don’t have to install the RTM PerformancePoint Server to get rid of the poorly performing MDX, just patch PAS, but I would still recommend getting the RTM anyway for things like better performance in the UI and the fact that it now supports integration with Reporting Services on SharePoint.


Wednesday, October 10, 2007

Multiple Pick List Filters in PPS 2007? Get RTM, Quickly..!

PerformancePoint Server 2007 RTM has been up on MSDN subscriber downloads for a while now, well several days anyway, so I would imagine there aren't too many people who haven't got hold of it who are doing some serious mucking around in PPS.

If you're one of those people who are on the CTP but don't have access to the RTM then this may be of some interest.

As some of you may be aware, one of the report types you can embed into a PerformancePoint dashboard is a ProClarity Analytics Server page. This effectively provides the user with a ProClarity briefing book allowing some interactivity and, as with almost all report types within PPS, filters can be applied externally, allowing, say, a PPS analytic grid, a PAS page and a Reporting Services report to all be displayed with the same parameters / dimension members being passed through.

The particular filter type that this post is concerned with is the multi-select. This is a typical dimension selection technique you’ve seen in plenty of tools before, so I won’t teach you to suck eggs on that one. Anyway, to cut a long story short, after the dashboard has been published it’s possible to preview it with your embedded PAS based report and filters.

Select a single member from your filter, apply it, and the report returns pretty quickly; not lightning fast, but remember this is a CTP and debugging is switched on etc., so that has to be taken into account. Select two members from the filter and, when the report is, say, performing a NON EMPTY over a dimension with members numbering in the thousands, it takes a little longer than it should; in fact, uncomfortably longer.

Profile the poorly performing MDX and here’s what you get....

WITH
  SET [<##<SET!smchugh!{E3EF8F62-CD2B-422D-881F-ABAFB4AA3E16}>##>]
    AS '{[Product].[Product Categories].[Subcategory].&[31]}'
  SET [<##<SET!smchugh!{844FB37C-1DA6-4536-B7A4-297650572576}>##>]
    AS '{[Product].[Product Categories].[Subcategory].&[1]}'
  MEMBER [Product].[Product Categories].[All Products].[ Aggregation]
    AS ' AGGREGATE( EXISTING { [<##<SET!smchugh!{E3EF8F62-CD2B-422D-881F-ABAFB4AA3E16}>##>],
                               [<##<SET!smchugh!{844FB37C-1DA6-4536-B7A4-297650572576}>##>] }) ',
    SOLVE_ORDER = 0
SELECT
  NON EMPTY { [Measures].[Internet Tax Amount], [Measures].[Internet Order Quantity] }
  PROPERTIES PARENT_UNIQUE_NAME, MEMBER_TYPE ON COLUMNS ,
  NON EMPTY { [Customer].[Customer].[All Customers].CHILDREN}
  PROPERTIES PARENT_UNIQUE_NAME, MEMBER_TYPE ON ROWS
FROM
  [Adventure Works]
WHERE
  ( [Product].[Product Categories].[All Products].[ Aggregation] )
CELL PROPERTIES VALUE, FORMATTED_VALUE, CELL_ORDINAL

Take particular notice of the new member called [ Aggregation] that uses the two sets. If they are stripped out and the MDX is written in a much more sensible manner, such as ....

WITH
  MEMBER [Product].[Product Categories].[All Products].[ Aggregation]
    AS ' AGGREGATE( EXISTING {
    [Product].[Product Categories].[Subcategory].&[31],
    [Product].[Product Categories].[Subcategory].&[1]}) ',
    SOLVE_ORDER = 0
SELECT
  NON EMPTY { [Measures].[Internet Tax Amount], [Measures].[Internet Order Quantity] }
  PROPERTIES PARENT_UNIQUE_NAME, MEMBER_TYPE ON COLUMNS ,
  NON EMPTY { [Customer].[Customer].[All Customers].CHILDREN}
  PROPERTIES PARENT_UNIQUE_NAME, MEMBER_TYPE ON ROWS
FROM
  [Adventure Works]
WHERE
  ( [Product].[Product Categories].[All Products].[ Aggregation] )
CELL PROPERTIES VALUE, FORMATTED_VALUE, CELL_ORDINAL

...same data returned but with a massive time difference. It's worth actually running both these queries on [Adventure Works] as it's a good indication as to what a difference little changes in your MDX can make.

Suffice it to say, the second query is how it’s structured in PerformancePoint Server 2007 RTM, so it’s well worth getting the RTM installed as soon as possible, as not only will queries return faster but some nasty timeouts on your dashboard will be avoided.
