Wednesday, October 24, 2007

This Week I Have Been Mostly Listening To...


Overpowered - Roisin Murphy


Organising SSIS

I think this has probably been blogged about and documented elsewhere to death but it's well worth posting something about (even as a reminder to myself).

SSIS package configuration is great, but as always, the more heavily something is used, the more sophisticated both its usage and its problems become. One thing I like is being able to log centrally and also fetch configuration values centrally, but to do this I need to support multiple environments (dev, test, etc.) and multiple applications (the same variable used in different packages). This is where I've found the built-in configuration to be a little limited, so in true developer fashion we came up with an alternative configuration mechanism. I won't go into the details, but effectively we get all our configuration values from a call to a database.
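To give a feel for the shape of it, here's a rough sketch; the table and procedure names are hypothetical rather than our actual implementation:

-- Hypothetical central configuration table: one row per
-- environment / application / setting combination.
CREATE TABLE dbo.PackageConfiguration
(
    Environment  NVARCHAR(50)  NOT NULL,  -- e.g. DEV, TEST, PROD
    Application  NVARCHAR(100) NOT NULL,  -- which SSIS system the value belongs to
    SettingName  NVARCHAR(100) NOT NULL,  -- e.g. SourceFileFolder
    SettingValue NVARCHAR(500) NOT NULL,
    CONSTRAINT PK_PackageConfiguration
        PRIMARY KEY (Environment, Application, SettingName)
);
GO

-- Each package calls something like this from an Execute SQL task at
-- start-up and pushes the results into package variables.
CREATE PROCEDURE dbo.GetPackageConfiguration
    @Environment NVARCHAR(50),
    @Application NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;

    SELECT SettingName, SettingValue
    FROM dbo.PackageConfiguration
    WHERE Environment = @Environment
      AND Application = @Application;
END;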

The only problem with this is that I want to be able to tell the package where the configuration database is, what environment I'm executing in and what application name I'm running under, but I can't do that without hard-coding a config file into the package. Of course there are alternatives, such as a number of environment variables or registry entries, but really I just want to do this the once and have control over everything else.

In my perfect world I would structure my data in something similar to this:

[Image: a folder structure rooted at SSISRoot, with subfolders for Config, source files, etc.]

This contains pretty much everything I need, all in a single location. I can have multiple configuration files for different SSIS-based systems and a place for source files, and if I wanted to drop in XSD files I could create a new folder or just stick them in a suitable existing location. But this is where those little issues creep in.

How do I tell my package where the SSISRoot folder is? I don't want to use configuration files, as that defeats the object of having a single location where I keep my configuration files; it's all very chicken and egg. So what alternatives are there? An environment variable can be used to point to the config file location, but then I'd need an environment variable for every config file I have, once again defeating the object. What about creating a single environment variable that points to SSISRoot? That would deliver exactly what I need, so I tried it. I created the variable and tested it from Start > Run with %SSISRoot%; it worked and gave me the folder location I wanted. The next thing to do was to put "%SSISRoot%/Config/SSISConfigFile.dtconfig" into the path for the config file. However, that had little success, and when I say little I mean none whatsoever; the package seems to take the path literally rather than expanding the environment variable within it.

So I'm back to square one: hard-coding the package config file into the packages. This is one of those issues where I would be grateful to hear about any clever workarounds, so please let me know, as it's getting increasingly frustrating...!
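For completeness, the nearest thing to a workaround we've considered (a sketch only, and I'm not claiming it's the answer) is the indirect SQL Server configuration pattern: a single environment variable holds nothing but the connection string to the configuration database, an environment variable configuration applies that to the config connection manager, and every other setting lives in the standard table SSIS generates for SQL Server configurations:

-- The table SSIS itself creates for SQL Server package configurations.
CREATE TABLE [dbo].[SSIS Configurations]
(
    ConfigurationFilter NVARCHAR(255) NOT NULL,  -- groups settings, e.g. per application
    ConfiguredValue     NVARCHAR(255) NULL,
    PackagePath         NVARCHAR(255) NOT NULL,  -- package property the value is applied to
    ConfiguredValueType NVARCHAR(20)  NOT NULL
);
GO

-- Illustrative row only: point a package's source-file connection at a
-- folder under SSISRoot. The folder and connection names are made up.
INSERT INTO [dbo].[SSIS Configurations]
    (ConfigurationFilter, ConfiguredValue, PackagePath, ConfiguredValueType)
VALUES
    (N'SourceFiles',
     N'E:\SSISRoot\SourceFiles\',
     N'\Package.Connections[SourceFiles].Properties[ConnectionString]',
     N'String');

It still leaves one hard-coded item per server (the environment variable), but at least it's only the one.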


Thursday, October 11, 2007

UPDATE to Multiple Pick List Filters in PPS 2007? Get RTM, Quickly..!


Yesterday I was over the moon with joy after finding that the RTM version of PPS solved my little issue. How wrong I was: performing the upgrade on another server presented me with exactly the same problem.

After some mad panicking and searching for a solution we discovered that the version of PAS on the PPS virtual PC was different to the version on our server. Another little hunt around presented us with this....

Microsoft ProClarity Analytics Server 6.3.2209 Hotfix

...which subsequently led us to...

A hotfix rollup package is available for ProClarity Analytics Server 6.3

...and tucked away on there was this little gem...

Issues that this hotfix rollup package fixes

This hotfix rollup package fixes the following issues:

blah

blah

blah

When you run a query in Microsoft SQL Server Analysis Services, the query takes a long time if multiple parameters are specified. Additionally, the query may time out in Internet Information Services (IIS).

Hey presto it all behaves itself now.

To summarise: you don't have to install RTM PerformancePoint Server to get rid of the poorly performing MDX, just patch PAS. I would still recommend getting the RTM though, for things like better performance in the UI and the fact that it now supports integration with Reporting Services on SharePoint.


Wednesday, October 10, 2007

Multiple Pick List Filters in PPS 2007? Get RTM, Quickly..!

PerformancePoint Server 2007 RTM has been up on MSDN subscriber downloads for a while now (well, several days anyway), so I would imagine there aren't many people doing serious mucking around in PPS who haven't already got hold of it.

If you're one of those people who are on the CTP but don't have access to the RTM then this may be of some interest.

As some of you may be aware, one of the report types you can embed into a PerformancePoint dashboard is a ProClarity Analytics Server page. This effectively provides the user with a ProClarity briefing book, allowing some interactivity, and as with almost all report types within PPS, filters can be applied externally, allowing, say, a PPS analytic grid, a PAS page and a Reporting Services report all to be displayed with the same parameters / dimension members passed through.

The particular filter type this post is concerned with is the multi-select. It's a typical dimension selection technique you've seen in plenty of tools before, so I won't teach you to suck eggs on that one. Anyway, to cut a long story short, once the dashboard has been published it's possible to preview it with your embedded PAS-based report and filters.

Select a single member from your filter, apply it, and the report returns pretty quickly; not lightning fast, but remember this is a CTP with debugging switched on etc., so that has to be taken into account. Select two members from the filter, though, and when the report is, say, performing a NON EMPTY over a dimension with members numbering in the thousands, it takes a little longer than it should; in fact, uncomfortably longer.

Profile the poorly performing MDX and here's what you get...

WITH
  SET [<##<SET!smchugh!{E3EF8F62-CD2B-422D-881F-ABAFB4AA3E16}>##>]
    AS '{[Product].[Product Categories].[Subcategory].&[31]}'
  SET [<##<SET!smchugh!{844FB37C-1DA6-4536-B7A4-297650572576}>##>]
    AS '{[Product].[Product Categories].[Subcategory].&[1]}'
  MEMBER [Product].[Product Categories].[All Products].[ Aggregation]
    AS ' AGGREGATE( EXISTING { [<##<SET!smchugh!{E3EF8F62-CD2B-422D-881F-ABAFB4AA3E16}>##>],
                               [<##<SET!smchugh!{844FB37C-1DA6-4536-B7A4-297650572576}>##>] }) ',
    SOLVE_ORDER = 0
SELECT
  NON EMPTY { [Measures].[Internet Tax Amount], [Measures].[Internet Order Quantity] }
  PROPERTIES PARENT_UNIQUE_NAME, MEMBER_TYPE ON COLUMNS,
  NON EMPTY { [Customer].[Customer].[All Customers].CHILDREN }
  PROPERTIES PARENT_UNIQUE_NAME, MEMBER_TYPE ON ROWS
FROM
  [Adventure Works]
WHERE
  ( [Product].[Product Categories].[All Products].[ Aggregation] )
CELL PROPERTIES VALUE, FORMATTED_VALUE, CELL_ORDINAL

Take particular notice of the calculated member called [ Aggregation] that uses the two named sets. If the sets are stripped out and their members referenced directly, the MDX runs in a much more sensible manner, such as...

WITH
  MEMBER [Product].[Product Categories].[All Products].[ Aggregation]
    AS ' AGGREGATE( EXISTING {
      [Product].[Product Categories].[Subcategory].&[31],
      [Product].[Product Categories].[Subcategory].&[1] }) ',
    SOLVE_ORDER = 0
SELECT
  NON EMPTY { [Measures].[Internet Tax Amount], [Measures].[Internet Order Quantity] }
  PROPERTIES PARENT_UNIQUE_NAME, MEMBER_TYPE ON COLUMNS,
  NON EMPTY { [Customer].[Customer].[All Customers].CHILDREN }
  PROPERTIES PARENT_UNIQUE_NAME, MEMBER_TYPE ON ROWS
FROM
  [Adventure Works]
WHERE
  ( [Product].[Product Categories].[All Products].[ Aggregation] )
CELL PROPERTIES VALUE, FORMATTED_VALUE, CELL_ORDINAL

...the same data is returned, but with a massive time difference. It's worth actually running both of these queries against [Adventure Works], as it's a good indication of what a difference little changes in your MDX can make.

Suffice to say, the second query is how the MDX is structured in PerformancePoint Server 2007 RTM, so it's well worth getting it installed as soon as possible: not only will queries return faster, but some nasty timeouts on your dashboard will be avoided.


PPS CTP4 VPC DOA

If there weren't enough acronyms in the world already....

A little tip from a colleague who has been working closely with PerformancePoint Server over the last couple of months and had the misfortune to use the Virtual PC from Microsoft to demo the toolset.

It performs like a dog. I can confirm this as I'm watching it "not respond" for no reason after an "indeterminable" amount of time. After somebody complained about this they were told that you have to defrag the hard drive within the virtual PC 10 times. I'm currently on defrag number 5 of 10.....

8 out of 10.....

10 out of 10.....

......

......

......

Well, after a little play around I have to admit to some non-scientifically measured performance benefits. Oh, until it crashed on me, with the Dashboard Designer deciding that the task of saving needed 100% CPU and denying any interaction whatsoever.

Unfortunately, what I cannot answer with any clarity is whether...

  • 9 clicks will make it perform worse;
  • 11 clicks will make it perform better, or worse;
  • the Dashboard Designer needs a quad-core Xeon to save anything......

Tuesday, October 09, 2007

Reporting Services and Virtual Directories, a Warning

Just a very quick post to make people aware of a little thing that happens with Reporting Services when you install it on a server under the same virtual directory structure as another application (e.g. you install it on a SharePoint server and don't allow SharePoint to control the report server, such as when you have PerformancePoint Server 2007 CTP4 and are using it with Reporting Services; mail me if it's confusing).

We found that Report Manager (http://servername:1234/reports) worked correctly but any access via the web service (http://servername:1234/reportserver) would return an error along the lines of...

Microsoft.ReportingServices.UI.WebControlConnection,
ReportingServicesWebUserInterface, Version=9.0.242.0, Culture=neutral,
PublicKeyToken=89845dcd8080cc91 does not implement
IReportServerConnection or could not be found

Spurious to say the least. The solution for us was to remove a couple of lines from within the <appSettings> tag of the web.config file in the Reporting Services virtual directory...

<add key="ReportViewerServerConnection"
     value="Microsoft.ReportingServices.UI.WebControlConnection,
            ReportingServicesWebUserInterface, Version=9.0.242.0, Culture=neutral,
            PublicKeyToken=89845dcd8080cc91" />
<add key="ReportViewerTemporaryStorage"
     value="Microsoft.ReportingServices.UI.ReportViewerTemporaryStorage,
            ReportingServicesWebUserInterface, Version=9.0.242.0, Culture=neutral,
            PublicKeyToken=89845dcd8080cc91" />

This gets the web service running, but guess what: reports via Report Manager won't work any more.

The problem, I am reliably informed by 'tinternet, is that web.configs are inherited in IIS. The Reporting Services web service therefore inherits its web.config settings from the Report Manager web.config, which means that the setting you removed is needed by Report Manager but breaks the web service.

The solution is to add the values back into the Report Manager web.config, but make sure that the web.config for the web service virtual directory contains...

<appSettings>
    <clear/>
</appSettings>

Hey presto, both the report manager and the web service now function correctly.

Apologies if my descriptions of web things are a little on the vague side but I'm reading it as it happens.

This Week I Have Been Mostly Listening To...


Radio One Established 1967 - Various Artists

(and not just for the Girls Aloud cover of Teenage Dirtbag...)


SQLBits

Waking up early on a Saturday morning is not one of my greatest traits but I managed to overcome that weakness to attend SQLBits at the Microsoft campus in Reading at the weekend.

There was a track on BI as well as one on SQL Server 2008, so plenty for everyone to be interested in. The best moments for me were seeing my friend and colleague Sutha present on managing early-arriving facts, and the new spatial data types in SQL Server 2008.

I need to go through the details of Keith Burns' presentation again before I can post about it with any real conviction, but the general consensus within the room was "right here, right now I don't know what I'm going to use this stuff for, but I'm going to find a way to implement it". In other words, how we developer types define "cool".

BIDS and TFS

I'm a big fan of Team Foundation Server and how it can be used to better organise analysis, development and delivery. I've been especially impressed with a number of the features of Data Dude (or Visual Studio 2005 Team Edition for Database Professionals, for the full mouthful), certainly since SR1, and how, with some work, deployments can be made a lot more efficient.

Let me give you an example. I have multiple environments: dev, UAT and production. All of them have the same code base but different configuration settings; for example, we will have a table containing configuration settings or source file locations, and these are all likely to change depending on what we are testing or deploying. Now, Data Dude allows you to embed variables in your pre- and post-deployment scripts, meaning that I don't have to change anything by hand when it comes to things like granting an account read access to data on dev that it should not have on prod.
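As a rough sketch of what I mean (SQLCMD syntax; the variable and account names here are invented for illustration), a post-deployment script might look like this, with the variables given different values per build configuration:

-- Post-deployment script sketch. $(EnvironmentName) and
-- $(ReportReaderAccount) are hypothetical SQLCMD variables supplied
-- per build configuration (dev / UAT / prod).
INSERT INTO dbo.ConfigurationSetting (SettingName, SettingValue)
VALUES (N'Environment', N'$(EnvironmentName)');

-- Grant read access only where the environment should allow it,
-- e.g. the account gets db_datareader on dev but not on prod.
IF N'$(EnvironmentName)' = N'DEV'
BEGIN
    EXEC sp_addrolemember N'db_datareader', N'$(ReportReaderAccount)';
END;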

All pretty simple really and well worth installing for these kind of features (notice how I gloss over the cost....!)

Now, to the point: "Visual Studio 2005 Team Edition for Business Intelligence Developers" has a nice ring to it, but stick it into Google and, at the time of writing, nothing comes back. I can think of a number of scenarios where non-intrusive deployment would be useful without having to rely on post-deployment XMLA, scripts using rs.exe, or even the wonderful package deployment wizard.

Some of the things I've named may not seem entirely reasonable, but should you really have to go to a separate utility every time you want to release code with settings that vary by server, environment, or whether or not it's a 64-bit deployment? Analysis Services seems a prime candidate for this kind of behaviour, as user access is defined in roles, but I don't want to have to define accounts within role membership and then remove them, or go to the Analysis Services Deployment Wizard and define different configuration files, when I've invested in a nice shiny Team Foundation Server and trained everyone up on its use.

Anyway, I would be most interested in hearing any thoughts on the above, as this kind of thing is taking up a lot of time and more and more clients are jumping onto the TFS / agile bandwagon, one even demanding to know why this can be done with the databases and not the cubes. You see the point..!


SAP have BO....!

Oooooh, the smell..!

I couldn't resist the almost Register-like title for the post...

It appears that, after many, many months of hearsay and conjecture, someone is finally going to stump up the cash and buy Business Objects. There's been a lot of talk in this area, especially with Hyperion being absorbed by Oracle and further rumours about the future of Cognos.

Microsoft seem content to buy the smaller players and partner with similar smaller vendors to complement their own technologies, and now that they've gone their own route for budgeting and planning, it seems very unlikely they will do anything but carry on that trend.

It will be interesting to see what effect this has, if any, on the close ties that SAP and Microsoft have had recently. We shall see.