Monday, November 02, 2009

Azure – It’s Not Big and it’s Not Clever

I’ve played with Azure and, yes, it does what it’s supposed to, but coming from a business intelligence background I’m struggling to see what kind of advantage my area of focus will gain. I would love to see huge databases accessible in the cloud, attached to huge cubes that use huge ROLAP partitions to run huge queries, but until that time comes along I’m going to sit on a huge fence, watch and wait.

One project I’m working on has dimensions with over 50 million members in them. Due to the business and the way it works, a Process Update has to be performed every night. Lots of attribute hierarchies and lots of members means lots of time taken to complete. Throwing more hardware at it makes this faster, but what happens when the cost of the infrastructure exceeds the value the business receives from its analysis? This client will look at how they can reduce costs elsewhere, whether it be licensing (open source perhaps) or the platform (Amazon EC2 perhaps), neither of which is going to help Microsoft.

When I first saw that the Azure platform was changing from its early form into what I considered a natural progression of the SQL Server platform, from Express to Enterprise to Azure, I was pretty positive. This coupled with Analysis Services could mean smaller companies that produce a lot of data could get top class analytics without having to break the bank with massive hardware expense. But I haven’t seen anything that makes me feel confident that this will happen. Perhaps the Azure OLAP platform will be a huge 128-bit version of Excel with PowerPivot (don’t get me started). Who knows? Someone should, at least.

Azure, PowerPivot, Reporting Services, PerformancePoint, ProClarity, MDM. In my opinion all of these things should be part of a single cohesive strategy that allows users to access data in a simple way whilst ensuring organisations can maintain a level of data governance, and I’m just not seeing it. Why do I still have to tell a client that they need to use this tool because it’s the one you can change the line colours in, but they’ll need the other tool for pie charts, and oh, you want it integrated into your portal? Then you’ll need this one instead. It’s scattergun, not particularly professional and very worrying.

I’m seeing this from a very focused view, but in my experience a real-world one, and I just can’t see Azure fitting into it at the moment. I would love to have my mind changed, so please do.


Tuesday, October 09, 2007

SQLBits

Waking up early on a Saturday morning is not one of my greatest traits but I managed to overcome that weakness to attend SQLBits at the Microsoft campus in Reading at the weekend.

There was a track on BI as well as one on SQL Server 2008, so plenty for everyone to be interested in. The best moments for me were seeing my friend and colleague Sutha present on managing early arriving facts, and the new spatial data types in SQL Server 2008.

I need to go through the details of Keith Burns' presentation again before I can post about it with any real conviction, but the general consensus within the room was that "right here right now I don't know what I'm going to use this stuff for but I'm going to find a way to implement it". In other words, how we developer types define "cool".

SAP have BO....!

Oooooh, the smell..!

I couldn't resist the almost Register-like title for the post....

It appears that, after many, many months of hearsay and conjecture, someone is finally going to stump up the cash and buy Business Objects. There's been a lot of talk around this area, especially with Hyperion being absorbed by Oracle and further rumours about the future of Cognos.

Microsoft seem to be content with buying the smaller players and partnering with similar smaller vendors to complement their own technologies, and now they've gone their own route for budgeting and planning it seems very unlikely they will do anything but carry on that trend.

It will be interesting to see what effect this has, if any, on the close ties that SAP and Microsoft have had recently. We shall see.

Wednesday, September 26, 2007

Developing with MDM


There have been some recent announcements on Microsoft's acquisition of Stratature that have begun to solidify their approach and roadmap for Master Data Management. The MDM tools, as most people had previously guessed, will be very much aligned with the Office toolset and made part of SharePoint. Personally, I think the key thing about this is how this data will be accessed in my day-to-day job, pulling data from here and putting it there for user consumption.

Jamie Thomson has a video on YouTube (embedded below) detailing how to consume web services in SSIS 2008. Couple this with Bulldog's (Microsoft's codename for the absorbed Stratature EDM+ product) ability to expose the master data and meta data as web services, and all kinds of pre-built SOA components become possible.

Although possible with the existing set of tools in SSIS, I would be surprised if a data source component didn't appear at some point for SSIS 2008 specifically designed to retrieve master data from Bulldog. After the constant battles I've had getting master data out of organisations, due to the cost of managing and maintaining it, having a tool that looks after the workflow and provides simple generic interfaces to the data will be very beneficial indeed.

Bring on the CTP next year....

Friday, June 29, 2007

Sorry it's been a bit quiet...

The lack of posts recently is down to my laptop being in the repair shop.....

No fault of mine, just general wear and tear. As soon as I get it back I'll have some stuff on SQL Server 2008, more MDM bits and some interesting things I've been up to with SharePoint 2007, not forgetting the things that I've been listening to. There's quite a lot of those as we enter festival season and yes, I will be making the traditional trip to the V Festival in Chelmsford, if anything just to see McFly...! (If you know me you know the truth).

Laters

Steve

Friday, June 08, 2007

Just when I was about to carry on talking about MDM...

...Microsoft go and announce this.

http://blogs.msdn.com/knight_reign/archive/2007/06/08/microsoft-completes-stratature-acquisition.aspx

Looks good for an end to end Microsoft architecture but it will be interesting to see what happens when it's working outside of its comfort zone.....


Wednesday, May 23, 2007

The Evil That is Master / Meta Data - Part 2 (or the one where Steve talks about Ireland..!)

In the last post on this subject I talked about some of the key attributes of master data and meta data management and how it is intrinsically linked with what I call the information or data lifecycle. Now I would like to elaborate on this and identify some of the advantages that investing in this kind of strategy can provide.

Embarking on a corporate-wide gathering of all things data requires investment at all levels. Time, effort, money and, most importantly, commitment are essential. But ensuring a business receives any kind of ROI (return on investment, which in the early part of my career I thought meant Republic of Ireland, hence the title) before that kind of commitment takes place can prove daunting and more than a little difficult. Let's forget the cons for a moment and look at some examples of the key advantages a data and information strategy can provide.

  • Everyone on the same page.
  • Information means the same thing throughout the business.
  • Reduced cost of new reporting and / or analytical requirements.

This doesn't look like a very extensive list and, to be honest, if someone presented me with a pitch like this I would be showing them the exit pretty rapidly. But when you examine the nature of each of these bullets further, you see that they are rooted incredibly deeply in practically all business processes and systems in place within an organisation. Key to the whole concept is that meta data and master data are not only for use within reporting systems; it's just that the projects that tend to drive this kind of requirement are usually also implementing some kind of reporting mechanism.

Let's take a step back and look at a simplified implementation of a number of reports. First we gather the requirements for the reports, which would be based upon an existing set of data: possibly sheets of paper, possibly a database storing transactions. Then comes the nasty business of performing analysis on said data, conforming it into your existing dimensional structure or creating new dimensions from scratch. Once you've defined the model on which you will base your reporting, you can finally start building reports.

So how could we improve this process and reduce the time taken to turn around a reporting requirement? Having some degree of knowledge of the system before a reporting requirement comes along would be advantageous, but that's not the way the world works. Looking at a single report as a deliverable, we would need to understand where the constituent data is sourced from. The report, for example, has customers, geographical breakdown, product type, number of orders and order value. Very simple, but already pulling data from, perhaps, CRM, product catalogue and ordering systems.
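
Once all of that is conformed into a star schema, the report itself boils down to something like this; a sketch only, with every table and column name invented for illustration:

    -- All table and column names invented for illustration.
    SELECT
        c.CustomerName,
        g.Region,
        p.ProductType,
        COUNT(f.OrderID)  AS NumberOfOrders,
        SUM(f.OrderValue) AS TotalOrderValue
    FROM dbo.FactOrders f
    INNER JOIN dbo.DimCustomer  c ON c.CustomerKey  = f.CustomerKey
    INNER JOIN dbo.DimGeography g ON g.GeographyKey = f.GeographyKey
    INNER JOIN dbo.DimProduct   p ON p.ProductKey   = f.ProductKey
    GROUP BY c.CustomerName, g.Region, p.ProductType;

The hard part isn't the query; it's getting CRM, the product catalogue and the ordering system to agree on what a customer and a product actually are.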

When building a picture of the data held within the company it is very important that ownership is established. Who owns the customer data? Who is responsible for maintaining the product catalogue? These are the people who own these data elements within the organisation and are therefore responsible for ensuring the quality of not only the data in their own systems but also the reporting that is based on it.

The point of all this is that data quality needs to come from the top down. BI projects are generally just the catalyst, but they should also be used as a means of improvement in the source systems. Too often has data cleansing been hooked on to the back of a BI project, weighing it down with responsibility that should lie elsewhere.

OK, enough of this business-type talk of responsibility and stuff. Next time I'm going to go into what master data and meta data are actually made of.

Tuesday, April 03, 2007

The Evil That is Master / Meta Data - Part 1 (or the one where Steve talks about socks..!)

Master data and metadata. A subject close to my heart due to its significant importance in what I call the data lifecycle. Data? Lifecycle? What on earth is he talking about now? I just wanted to get Oracle talking to SSIS! Well, let's go a little off subject here and use a bit of an analogy.

Take something simple that I think we all learnt at school: the water cycle. This is the continuous movement of water as it shifts location and state from ocean to atmosphere to groundwater. I liken this to the way data moves through an entity, whether it be an organisation or a group of systems.

A good example of this is a common scenario in financial reporting. An accountant (that's the cloud up there) will read their profit and loss report for a particular department and use it to calculate the following year's budget or forecast. These estimates will then be entered into the budgeting and planning system (that would be the mountains; more likely, though, it's Excel :). The budget and forecast are imported into the data warehouse, where the profit and loss report (the ocean perhaps?) is generated, which is read by another accountant looking at the company's performance, who........ ad infinitum.

A very typical example, but it demonstrates that the behaviour of data within an organisation is very organic and in a constant state of flux. Just because the original piece of data is sitting in a table somewhere doesn't mean it hasn't evolved into a different beast elsewhere, with different properties and meanings. Simple as it sounds, this makes life a little complicated when you add influencing factors such as SOX (Sarbanes-Oxley) compliance, which requires the demonstrability of internal controls. In BI speak this could be someone changing an attribute of a dimension member and proving who did it, when and why. One tiny change which to a developer may be minor but to a CEO moves them from the red to the black; exactly the kind of thing SOX tries to stop.
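
As a minimal sketch of the "who did it, and when" part (the "why" needs a managed change process, not T-SQL), an audit trigger on the dimension table could look something like this; all object and column names are invented:

    -- All object and column names here are invented for illustration.
    CREATE TABLE dbo.DimCustomerAudit
    (
        AuditID         INT IDENTITY(1,1) PRIMARY KEY,
        CustomerKey     INT          NOT NULL,
        OldCustomerType NVARCHAR(50) NULL,
        NewCustomerType NVARCHAR(50) NULL,
        ChangedBy       SYSNAME      NOT NULL DEFAULT SUSER_SNAME(),
        ChangedAt       DATETIME     NOT NULL DEFAULT GETDATE()
    );
    GO

    CREATE TRIGGER trgDimCustomerAudit ON dbo.DimCustomer
    AFTER UPDATE
    AS
        -- Record who changed a member's CustomerType attribute, and when.
        INSERT dbo.DimCustomerAudit (CustomerKey, OldCustomerType, NewCustomerType)
        SELECT d.CustomerKey, d.CustomerType, i.CustomerType
        FROM deleted d
        INNER JOIN inserted i ON i.CustomerKey = d.CustomerKey
        WHERE ISNULL(d.CustomerType, '') <> ISNULL(i.CustomerType, '');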

Now, all this talk of oceans, cycles and socks is all very good but doesn't bring us any closer to knowing what the hell to do about managing master and meta data. OK, let's break down some of the things I've mentioned into some key bullets.

  • Dimension Management
  • Compliance
  • Process Flow

This list identifies some of the major reasons for having a meta data and master data management mechanism, and some of the requirements of one.

In the next part I'll cover these elements in more detail and how they can contribute to a more streamlined data strategy.

Tuesday, February 06, 2007

Other Bloggers

A colleague of mine has started blogging recently, mostly in and around the SSIS area, and I can categorically say he knows what he's talking about.

Check it out here....... Colin's blog

Monday, December 11, 2006

Making The Client Happy - The Real Reason For Yesterday's Post That Rapidly Turned Into Some Kind Of Rant...!

If there's one thing that users hate it's being kept in the dark. So communication is essential and, to be honest, BI is all about communication. Turning a series of 0s and 1s, via OLAP or reporting, into something tangible that an end user can get real benefit from is what it's all about.

So we have a process that may rely on another process outside the control of the BI infrastructure. This is a typical scenario and, due to the 24-hour worldwide nature of a lot of the BI implementations I've worked with, that process may be smack bang in the middle of someone's working day.

Scenario established, what are the problems we have here? Well, first there's the problem of processing whilst people are working, but with the performance tuning techniques available to us in the Microsoft BI space that shouldn't have too much of an impact on your user base. Really you just want to deliver new data to the user as quickly as possible.

The real problem here is a user not knowing when their new data is available. Does that balance include yesterday's transactions yet or not? A typical question that may get asked in finance departments at month end, and something that would be asked a lot more regularly than you would like.

So communication is a problem, or the lack of it anyway (see the way I did that? Started off talking about it and brought it round again to the subject. You're impressed, I can tell..!).

What can be done about it? There are a number of clever little things we should be able to do as part of our BI profession: use our skill set and let the consumers of our data know the data is there to be consumed. First method: deploy a report to the user at the end of processing, or send an informational e-mail. Database Mail in SQL Server 2005 makes this a pretty simple task these days. This could be a standard mail telling the user base to go to a Reporting Services link and open a report telling them what has happened in the data processing. The problem with this, though, is that it relies on a patient user base who will happily stare at the mail box until that little shining light of hope gets sent from "SQLSVR978BSMNT" (I do miss the days when there were fewer servers and they had names based on characters from The Simpsons or something like that).
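
For example, something along these lines at the end of the load; the mail profile, recipients and report URL are obviously placeholders for your own:

    -- Placeholders throughout: the profile, recipients and URL would be your own.
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'BIProcessing',
        @recipients   = 'finance-users@example.com',
        @subject      = 'Nightly data load complete',
        @body         = 'New data is available. Processing summary: http://reports/ProcessingSummary';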

OK, so your client / user isn't patient, or it's 4:30 on the Friday before Christmas (topical, huh..!), either works, and you don't want to have to answer 20 calls from finance departments around the world demanding to know where the data is. Let's look at what they need to know. Take a typical end to end process; it may go something like this:

  • Extract data from one or more source systems.
  • Process said new data into the warehouse / mart / etc.
  • Process data into cube.
  • Distribute reports (or warm the cache; it's useful to have this catch-all at the end)

Now this is a high-level generalisation of what may be going on, but being able to let the users know even this much gives them a massive increase in their perception of the system. Even just knowing that the source system data has not been extracted empowers the user, if they are aware of where the data is coming from (and why shouldn't they be?), to get on the phone and ask New York why their figures haven't been entered yet.

I've already talked about making this happen by sending reports out but, let's face it, this isn't the ideal way of doing it when there are numerous users and numerous steps within the process to communicate. So we make the user pro-active: go to the portal or website and find out what the hold-up is or where the problem lies. All we're doing is taking something that has been happening in customer service departments for years and applying it to the BI systems that we put in place and, to be honest, with our experience we really should know how to provide someone with beneficial information in a number of formats.
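
As a concrete sketch (all names invented), the portal could read from a little status table that each stage of the overnight process updates as it goes:

    -- A sketch (all names invented) of a status table the portal could read.
    CREATE TABLE dbo.ProcessStatus
    (
        StepName    VARCHAR(100) NOT NULL PRIMARY KEY,
        Status      VARCHAR(20)  NOT NULL,  -- Waiting / Running / Succeeded / Failed
        LastUpdated DATETIME     NOT NULL DEFAULT GETDATE()
    );

    -- Each SSIS package flips its own row as it starts and finishes,
    -- e.g. from an Execute SQL Task:
    UPDATE dbo.ProcessStatus
    SET Status = 'Succeeded', LastUpdated = GETDATE()
    WHERE StepName = 'Extract source systems';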

So what is my point in all this? There has been an idea bubbling around my head for a few months now around this subject: visualising end to end processes for clients, mostly based on SSIS and the simple logging model I've posted about previously. The idea is to use Visio to expose the data in a diagrammatic format to the users, showing them something similar to the SSIS control flow; you know, greens, reds, yellows. This became all the more interesting after the recent UK BI user group at Reading, where David Parker was presenting on visualising data in Visio, essentially the same topic.

There, that's the idea; now I'm going to put it into practice. Stay tuned, and if anyone else has had any experience in doing what I've described then I would be really interested in hearing about it.

Thursday, November 30, 2006

Evening After The Night Before....

Last night I had the pleasure of attending the 2nd UK BI User Group and happened to ask Chris Webb when the next one might be; he answered probably about six months. Now go onto one of the IT job boards and type in BI, SSIS, OLAP or any of the many acronyms that relate to this kind of thing and you'll be overloaded with agencies and other organisations offering large sums of money and incentives to come and join them.

What's my point?

Well, with that kind of level of interest in BI over the last 18 months, and that interest not looking like it will wane in the foreseeable future, I think there are a couple of things that should happen....

Firstly, more people should attend. Now, that may be down to location, as TVP (Microsoft UK, Reading) isn't the easiest of places to get to, but MS kindly offer their place, so who's going to refuse it?

Secondly, they should be more regular. Things are moving pretty fast; people are coming out with new tips, tricks and ways of adapting and implementing the MS BI platform, so surely there are plenty more things that can be discussed. One look at a PASS event shows there are a hell of a lot of ideas out there.....

I should do my bit as well, I suppose. I could make some turkey sandwiches to take; there might be some leftovers in the next month or so..!

Wednesday, October 04, 2006

Simple SSIS Logging - Part 3, Bad, Bad Rows..!

It’s all well and good knowing that there has been a failure in your package which has led to a bleary-eyed support technician getting phoned at 3 in the morning, but sometimes it’s not really necessary to get Barry (we’ll call him Barry for the purposes of this post) out of bed. Your package may be bullet-proof and work perfectly, but the nature of data is that it will change and can quite often come back and bite you on the backside.

Take, for example, early arriving facts. When you start integrating a number of systems it is possible for things to start happening that don’t conform to the world according to the system owner. Take the example of an in-house order system operating completely separately from the company’s CRM system. During the analysis phase of the data warehousing project it was determined that CRM was the source of Customer ID, therefore that’s where customers get created and that’s where customers get maintained. But a new customer phones up and makes an order before the customer’s details are completed in the CRM system, due to one thing or another. Your data warehouse is quite likely to be collecting that fact row before the relevant CRM data has been correctly populated and passed on to the customer dimension. So what happens in your SSIS package when looking up the Customer key for that dimension? Well, not a hell of a lot.

To be fair, there are numerous techniques for dealing with early arriving facts. Some involve the population of the Customer ID and retrieval of the key, so that when the correct dimension member arrives in the dimension table its attributes are populated automatically. The problem is that things aren’t always as clear cut as that. You may have managed to capture the early arriving fact data and correctly allocate it to members in the Customer dimension, but when board reports run that specify a particular customer type, the data is incorrect, the numbers don’t roll up, and for 24 hours you could be running around like a headless chicken trying to figure out where the missing data has gone. That’s until the next day, when the customer dimension member is correctly updated and things mysteriously fix themselves.
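
As a rough sketch of that inferred-member technique, with all table and column names invented, the error path of the lookup could do something like this before handing back a surrogate key:

    -- Invented names; the real thing would live in the lookup's error path.
    DECLARE @CustomerID INT, @CustomerKey INT;
    SET @CustomerID = 12345;  -- the ID arriving on the early fact row

    IF NOT EXISTS (SELECT 1 FROM dbo.DimCustomer WHERE CustomerID = @CustomerID)
        INSERT dbo.DimCustomer (CustomerID, CustomerName, IsInferred)
        VALUES (@CustomerID, 'Unknown', 1);  -- attributes filled in when CRM catches up

    SELECT @CustomerKey = CustomerKey
    FROM dbo.DimCustomer
    WHERE CustomerID = @CustomerID;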

So where does logging fit into all of this? Well, knowing that we have an early arriving fact would be a good idea. So we have to log the early arriving fact and then let the data pass through into the fact table.

This looks quite simple, but when you’re trying to handle a number of dimensions, any of which could have early arriving facts, you can end up with a lot of data destinations. In addition there is the fact that each package could have a different error table at each dimension lookup, making logging, which should really be centralised and in a defined structure, a little reactive and difficult to deal with. Dealing with this may not be as difficult as it seems though, and for the first time I’m going to mention a custom SSIS component that has been written with this in mind. A colleague of mine has written a component for just this type of situation. Whilst I would have liked to have completed all 3 parts of this Simple SSIS Logging piece without referring to any external code, this really does make life a lot easier. The full source code is available on the site and I recommend downloading it.

The Xmlify component (to give it its officially christened name) will take a series of columns and convert them into a single XML column, which has some obvious benefits. In the above example the data is then written to a single table in my SSIS logging database with a data type of XML, allowing the data to be viewed in a number of ways. Extra information can be added into the data flow specifying where the error occurred, the number of rows etc, again enabling detailed and useful reporting. Want to know the number of early arriving facts for a particular package, or even a particular dimension? Easily done using the data in the table. Finally, a warning can be raised using a script task to inform Barry (remember him?) that some data problems occurred; but now it’s a warning, and can be handed over to the department in charge of wielding the large stick that is used to stop sales people logging sales before the customer is correctly created. Furthermore, the system can be automated to alert said department, taking Barry completely out of the loop. But wait a second, Barry is now worried about losing his job. Luckily he’s been learning SSIS in his spare time.

It’s OK people, Barry is fine…!
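
For reference, here’s an illustrative sketch of that central table and the kind of query it enables; this is my guess at a structure, not the component’s actual schema:

    -- My guess at a structure, not the component's actual schema.
    CREATE TABLE dbo.SsisRowLog
    (
        LogID         INT IDENTITY(1,1) PRIMARY KEY,
        PackageName   VARCHAR(100) NOT NULL,
        DimensionName VARCHAR(100) NOT NULL,
        RowData       XML          NOT NULL,  -- the Xmlify-style single column
        LoggedAt      DATETIME     NOT NULL DEFAULT GETDATE()
    );

    -- How many early arriving facts per package and dimension?
    SELECT PackageName, DimensionName, COUNT(*) AS EarlyArrivingRows
    FROM dbo.SsisRowLog
    GROUP BY PackageName, DimensionName;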

Friday, September 29, 2006

SQL Server 2005 Metadata

Hidden in the depths of Microsoft’s useful download division in Redmond is a handy pair of utilities, which you may already have discovered, that analyse dependencies within and across SSIS and Analysis Services. The first tool, the Dependency Analyzer, examines the meta data contained within SSIS packages and Analysis Services objects, collating their inter-dependencies and writing the results to a pre-defined database structure.

The second tool, the Dependency Viewer, provides a graphical interface displaying the relationships between the analysed objects.


The amount of data is considerable and opens up some quite powerful possibilities. There has been a distinct gap in the MS BI product set; dependency and data analysis are two key areas that their competitors eulogise about at length. This appears to be the beginning of some gap plugging before the release of Katmai (the next release of SQL Server). In the meantime all the source code, examples and even some reporting interfaces are provided in the download.

I’m quite looking forward to plugging this tool into some of my end to end 2005 projects and looking at the results. Once I've got some good examples of how everything integrates together I'll post some of the results.

Wednesday, August 02, 2006

Other Places to go...

In my first proper contribution to this year's blogging (yeah, yeah, I know) I'm going to give a quick overview of some of the blogs I regularly read and why.

First up is Mark Hill (http://markhill.org/blog). I've known Mark for a number of years and worked with him for almost all of that time. His blog is, in his own words, "a ramble through Microsoft BI technologies". Give the guy a little credit here; it's not a ramble but quite an insightful commentary on Microsoft’s BI platform, and essential viewing if you're looking for information on the problems you'll face implementing large scale SSIS and Analysis Services systems, especially if you're looking to put 64-bit servers within your architecture.

Next is Thiru Sutha's SQL Server 2005 BI blog (http://tsutha.blogspot.com/). Again someone I've known and worked with for a number of years, he has been working on the same projects as Mark Hill and significantly contributed to stabilising 64-bit SSIS. Definitely worth a viewing, and it contains some good example SSIS packages for your pleasure in addition to postings on known SSIS bugs.

Chris Webb (http://cwebbbi.spaces.live.com/) is a pretty well known individual in the world of Microsoft business intelligence and your first port of call if you need to know anything about using MDX in the field. His collaboration on the latest version of MDX Solutions (along with George Spofford amongst others) gives the book a lot more weight and makes it an excellent guide for anyone looking to enhance their MDX knowledge.

Off the BI bandwagon slightly here, The Joy of Code (http://www.thejoyofcode.com/) is a collaborative blog that includes Josh Twist, a colleague I’ve worked with recently on a SQL Server 2005 BI architecture implementation. Josh is a .Net developer by trade but was a huge benefit to the project’s custom SSIS component development. A number of the SSIS components he worked on are available there with example code for download, some of which I’ll discuss in further detail later along with some recommended applications.

Finally, another SSIS-orientated blog, from Jamie Thomson (http://blogs.conchango.com/jamiethomson). I have, let's say, borrowed plenty of SSIS code from this site recently, and his postings on things like what you could do in DTS but aren’t as obvious in SSIS are very useful for anyone making that transition.

Well, that’s it for now; hopefully you’ll find these sites as useful as I have.

Steve

Tuesday, June 21, 2005

Project '4' REAL

Take a look at this site for Project REAL. There were a number of sessions on it at PASS Europe that were really interesting. It's essentially Microsoft's attempt to build a Business Intelligence system using SQL Server 2005 and real data for a well known on-line retailer in the U.S. This will hopefully be incredibly useful not only for developers observing the kinds of issues that they will come up against during their own implementations but also for managers needing to plan SQL Server 2005 based BI projects.

There's a couple of sites: one is the original technical overview here by Len Wyatt, and now there's the Project REAL website, which is going to be an ever expanding information resource on the progress of the project. There is already a link to a piece on SSIS detailing lessons learnt during the ETL design process, which is probably essential viewing for anyone with an interest in the ETL space.

I'm led to believe there will be a number of webcasts as well as the content available on the site, which will ramp up more and more the closer it gets to the 2005 launch date (November currently, and looking certain), so I'm sure there will be something for everyone to be interested in. But if you're looking for Reporting Services info, expect it towards the end of the project :)

Thursday, June 16, 2005

Diet Projects

I’m currently trying to put together a full end to end demo of SQL Server 2005 based on an existing implementation I’m working on (you could call it my own little Project REAL Lite). This is an Oracle data warehouse with Microsoft presentation tools and, as I really need to demonstrate all aspects of SQL Server 2005, I need to pull the data out of Oracle and into the database engine, so first up is SSIS.

Good tool this, very good, and with the help of an expert I know (who I’m trying to convince to start blogging) it’s going quite well. But, and there’s always a “but”, pulling Oracle data out with SSIS has proved trickier than it should have been. Yes, I know I’ve cut a few corners to speed the process up, but numerical data appears to be coming out as numeric(38,4) and trying to do any data conversion to other numerical types using the derived column component is causing it to fail consistently. I’ve managed to get it working, but I have to do something pretty evil to convert the data: first to a string in the select statement, then back into a decimal in the derived column component. The overhead is obvious, but I should only have to do this once, and when I’m all done I think I’ll reinvestigate all the little irritations I’m finding.
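
To make the workaround concrete, here’s roughly what the two steps look like; the table and column names are invented, and your precision and scale may differ:

    -- On the Oracle side, force the value out as a string (names invented):
    SELECT TO_CHAR(order_value) AS order_value_str
    FROM orders;

    -- Then in the SSIS derived column component, cast it back, e.g.:
    --   (DT_NUMERIC, 18, 4)order_value_str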

I'm sure I'm just missing something that a little preparation would solve, but it's still somewhat odd.

Wednesday, May 04, 2005

Hmmmm. Saturated Fats..!

Sustaining oneself on fast food is really not the best of ideas when you're trying to master complex feats of development, but then again Starbucks wasn't open so I was kind of restricted to the vast selection of neon-lit establishments around Liverpool Street.

Anyway, the cube is built and the first of several reports is complete, all in one neatly packaged solution. Processing speed is good and I've managed to get myself over a million rows of decent demo data thanks to a great little tool called the Advanced Data Generator. I've used it in the past and it has always delivered what I needed, so take note if you're stuck for several million rows of data.

Tuesday, May 03, 2005

Art of the Referentially Intact

Clicking a wizard should be the easiest thing in the world. One push, out comes your desired result. Not tonight. I'm having great difficulty building a demo SQL Server 2005 Analysis Services cube because the data I've been given has very little integrity. So what does this mean for me? It means I'm missing the Champions League semi-final between Liverpool and Chelsea. It means I'm getting hungrier by the second and will probably have to consume a large amount of processed food on the way home, and it also means I'm losing the feeling in my fingers due to the overactive air conditioning in my office.

Hopefully I have now cleansed the data to a point at which it can't fail, but it severely reduces what I can demonstrate. Let's see what spanners await me when I try to build some Reporting Services reports off of it.....!
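
For the record, a quick query along these lines (one per dimension, with all names invented) is how I've been hunting down the offending rows before the wizard finds them for me:

    -- Fact rows whose keys have no matching dimension member (names invented).
    SELECT f.*
    FROM dbo.FactSales f
    LEFT JOIN dbo.DimProduct p ON p.ProductKey = f.ProductKey
    WHERE p.ProductKey IS NULL;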

Thursday, April 28, 2005

The Munich Peer Festival

I have just managed to get myself fixed up with flights, hotels and registration for this year's SQL PASS Europe conference in Munich in May.

Very kindly, my employers (Edenbrook) have offered me the opportunity to go. This year's event will have a certain degree of spice as it will be the final one before the release of SQL Server 2005.

So if anyone reading this wants to buy me a beer out there, I'll be happy to oblige :)

Constructive Success

Rather than the full-on introduction to myself, I've decided to let information about myself slip out in smaller chunks. This in turn presents me with my first opportunity to tell you what I'm doing with myself at the moment.

I am currently in the process of evaluating the production capability of SQL Server 2005 for a client. Whilst I've been using the tools in various capacities for well over a year, I haven't had the opportunity to play much with the Report Builder functionality in Reporting Services.

I like Reporting Services and have done since I first played around with the beta some years back. It is the very essence of a Microsoft product: simple to pick up but very powerful under the hood.

In the next version you're given the ability to build your own reports and deploy them to the central server. This provides some great possibilities and moves the platform up a gear. I saw an early demo of the ActiveViews technology in Redmond shortly after Microsoft acquired it last year and it looked good then, but after some hands-on research I really think a lot more people will take a look at the Business Intelligence offerings from Microsoft.