ODTUG 2006 – First Report from Washington


This year’s ODTUG 2006 Kaleidoscope conference got underway today in Washington DC. A familiar crowd has gathered in a very pleasant venue where around 1.30 PM we kicked off with the first set of sessions. I did a three-hour session on JavaServer Faces and ADF Faces in a room that was of course much too cold – you Americans and your AC addiction – though pleasantly well attended with some 50 people. The talk went okay, except for the final demonstration of the JHeadstart 10.1.3 beta. For more substantial stuff, people could visit Toon Koppelaars’ session next door on Applied Mathematics for Database Professionals.

Afterwards, there was a general session with the promising title of "Business Intelligence Summit" by Paul Rodwick (Vice President of BI Marketing) and George Lumpkin (Senior Director of Data Warehousing), both from Oracle Corporation. They were to present Oracle’s BI strategy, and that is what they did, in a session that was at times very hard to follow. What I basically understood, later confirmed by more knowledgeable people, was....

Oracle presents a BI Server product – priced at around $225k per CPU – that is mainly the successor to the award-winning Siebel Analytics product. BI Server comes with pre-built Analytical Applications, pre-built BI Cubes and ETL processes, allowing large enterprises to quickly set up enterprise data warehouses and run reports against them. The theme of the presentation was also that Oracle would take an integrated approach, offering the entire suite of BI technologies that enterprises need while recognizing the reality of existing BI solutions. Integration is to be pervasive across tools, database and applications. I am not clear as to how Oracle BI Server (pka Siebel Analytics) integrates with, for example, the Oracle Data Mining and OLAP engines. As far as I understand it, Siebel Analytics is a C++ application that can run against, for example, a DB2, SQL Server or Oracle database. The repository for Siebel Analytics is file based. Whether Discoverer, Discoverer OLAP and BI Beans play any role in the new BI landscape is very unclear.

I found most of the presentation somewhat unclear. High-level information, slides with great pictures of highly integrated solutions and extremely happy customers, but no actual software demonstrations and no clear explanation as to how the new Oracle acquisitions from Siebel actually fit in with the established BI technology from Oracle. The notion of hot-pluggability did nothing to help clear up the picture: it suggested that Oracle BI Server can work with hot-pluggable ‘components’, meaning that you pick your own database to work with, your own authorization/authentication solution, your own ETL solution, your own BI front end (e.g. Business Objects or Cognos), etc. However, if you can pick your own database, where does that leave the fully integrated OLAP and Data Mining engines? Fully integrated and hot-pluggable do not seem to go together entirely. It was confusing, to say the least.

Positive notes – though I cannot really gauge them given that the presentation as a whole remained very abstract – were a disconnected mode for doing analytics (so without direct database access) and support for Microsoft tools, i.e. Excel.

Paul discussed a number of customer cases about customers already using Oracle BI Server EE, including Cisco Systems, IBM, Microsoft and Yahoo. Surely this must refer to Siebel Analytics and not to a new product? The customer cases were quite impressive by the way, with large numbers of users and, more importantly, very clear business benefits.

The importance of BI was underlined using an article from Merrill Lynch, claiming that 41% of managers indicated that BI was their top IT priority (security trailed BI at 37%). Oracle sees a lot of potential triggers for exploding data warehouses, such as RFID, click-stream analysis, BAM, and packet-movement-oriented analysis at telcos – instead of a call record per call. Oracle has identified five areas where BI in general is seeing a change, and of course it will address those five areas:

  • From being an analyst’s thing, BI will become much more pervasive across the enterprise with far more users (not a few dozen, but hundreds or even thousands)
  • From looking mainly at history, BI will move towards real-time and even predictive analysis
  • From being fragmented across data marts and departments, BI will move towards a much more unified, enterprise-level approach, all against a common enterprise model
  • From just reporting results, BI will move towards ‘insight driven business process optimization’
  • From specialized analytic tools, BI will move towards a unified BI infrastructure with pre-built analytic solutions (best industry practices), e.g. every telco in the world can make use of a number of generic analytic solutions, possibly refined and tuned for only a very small percentage

And implementing BI solutions should therefore become much more like implementing standard applications instead of custom developing systems from scratch.

It sounds reasonable enough, especially for the Fortune 500 – the larger companies who use various ERP and CRM solutions that directly tie in with the Siebel Analytics offering. I am not yet convinced with regard to mid-sized companies (at $225k per CPU). And I would like to see the thing in action…
 

And now… ACTION: Oracle Warehouse Builder 10gR2

After Paul and George had done their presentations, they invited Mark Rittman on stage. Mark is one of the world’s leading experts in the established Oracle BI technology – and he has already started to get into the new Oracle BI Server stuff. His presentation, however, was on Oracle Warehouse Builder 10gR2 – freshly released last month after a long beta period in which Mark worked with the product at various customers. He explained a little about the most important new features and put forward two customer cases where these new features helped save the day.

In one instance, using the Data Quality features in OWB 10gR2, it was possible to profile, analyze and even cleanse the data from three different systems that had to be merged. The Data Quality features allow us to analyze the structure of the data in a set of source tables. He did a brief demonstration that went very well – illustrating the results from the data profiler. He then went on to talk about a university’s student and course tracking system that needed to support Slowly Changing Dimensions – type 2 – where the entire history of the dimension needs to be recorded. It can be done in OWB 10gR1 but is quite complex. In OWB 10gR2, it has become much easier to set up such dimensions: a wizard simply generates the mappings and supporting database infrastructure.
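For readers unfamiliar with the concept: the essence of a type 2 Slowly Changing Dimension is that a changed attribute is never overwritten; instead the current row is end-dated and a fresh row is inserted. Here is a minimal sketch of that logic in Python – the table and column names (a student dimension tracking a major) are purely illustrative and not taken from Mark’s actual customer case:

```python
from dataclasses import dataclass
from typing import Optional

# Type 2 Slowly Changing Dimension in miniature: instead of updating
# a changed attribute in place, close out the current row and add a
# new one, so the full history of the dimension is preserved.

@dataclass
class DimRow:
    student_id: int                 # natural (business) key
    major: str                      # tracked attribute
    valid_from: str                 # effective date (ISO string for brevity)
    valid_to: Optional[str] = None  # None marks the current row

def apply_scd2_change(rows, student_id, new_major, change_date):
    """End-date the current row for this key and insert a new version."""
    for row in rows:
        if row.student_id == student_id and row.valid_to is None:
            if row.major == new_major:
                return rows              # no actual change, record nothing
            row.valid_to = change_date   # close out the old version
    rows.append(DimRow(student_id, new_major, change_date))
    return rows

dim = [DimRow(1001, "Physics", "2004-09-01")]
apply_scd2_change(dim, 1001, "Mathematics", "2006-02-01")
# dim now holds two rows: the historical Physics row, closed on
# 2006-02-01, and the current Mathematics row.
```

The OWB wizard generates the equivalent of this close-and-insert pattern as PL/SQL mappings plus the supporting effective-date columns, so the developer no longer has to hand-build it.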

Mark’s intermezzo was clear, straightforward and refreshingly tangible after the abstract stories told before him. I have no clear understanding, though, of how his talk on OWB 10gR2 tied in with the bits on Oracle BI Server.

I cannot help but feel that Oracle has a luxury problem: a fine set of BI Technology it developed over the past few years with Oracle OLAP and DataMining in the database, Oracle Discoverer, BI Beans and Discoverer OLAP, the OLAP spreadsheet plugin, Analytical Workspace Manager and Oracle Warehouse Builder. And now all of a sudden they have Siebel Analytics – that does not use any of the aforementioned tools and technologies. And now they build a story about these two branches and use integration across the board as the main theme. It seems more like stating the ambition than describing the actual situation.

Note that there is a lot of debate going on about the licensing for Oracle Warehouse Builder 10gR2: any organisation with a license for the Oracle 10g RDBMS is automatically entitled to use the core functionality of Oracle Warehouse Builder 10gR2 – which is roughly the same functionality as the 10gR1 release had. Note that OWB is no longer part of the Oracle Developer Suite; having a license for 10gDS no longer gets you the OWB license.

If you want to use the Data Quality functionality that Mark demonstrated, you need an additional license for the Data Quality option, at roughly $25k per CPU. Similar options exist for Enterprise ETL and ERP Connectivity. I hear quite a few angry voices about these prices – which may, for example, hit customers that were involved in the OWB beta program, testing and relying on features they now can no longer afford…

Welcome Reception

After this fairly long get together, we joined the Welcome Reception for food, drinks and company. It was good to see so many familiar faces again – Duncan who I spoke to only three days ago at the NL-JUG conference, Sue who I only seem to meet at ODTUG, Mark and Dan, Maggie, Kent, Paul and Peter, Leslier, Jeff, Peter, Dimitri, Clemens, Regis, Toon, Kathleen, Donna, Ann, Keith, Jean Pierre, Paul, a nice glass of red wine, Frank, and many more. It is nice to be back.


About Author

Lucas Jellema, active in IT (and with Oracle) since 1994. Oracle ACE Director for Fusion Middleware. Consultant, trainer and instructor on diverse areas including Oracle Database (SQL & PLSQL), Service Oriented Architecture, BPM, ADF, Java in various shapes and forms and many other things. Author of the Oracle Press book: Oracle SOA Suite 11g Handbook. Frequent presenter on conferences such as JavaOne, Oracle OpenWorld, ODTUG Kaleidoscope, Devoxx and OBUG. Presenter for Oracle University Celebrity specials.

2 Comments

  1. As we all discussed at the conference, it is a shame that all the whiz-bang new features of OWB are now “options” to the core product. My team (and many others, I suspect) were really looking forward to using the data profiling and data quality options but now find we cannot afford to purchase them. Sadly my budget year starts July 1 and it is already set for the next 12 months with nothing factored in for this additional cost. We had unfortunately assumed that the new features came with the upgrade and would be covered by our existing Oracle support agreement. A little warning sooner on the pricing change would have been nice.

    I wonder how many other organizations now find themselves in this situation. I bet some of the beta customers (who tested the product for 18 months) have deployed some of these new options to production already. Hope they at least had some warning or got a good deal on the license. It would be a shame if after all the testing they could not actually use the software!

  2. Mark Rittman

    Hi Lucas

    Good to meet up with you yesterday (and hopefully this evening). Just one point to clarify – the Enterprise ETL option is $10k per CPU, the Data Quality option is $15k, so it’s $25k for *both* options not just the DQ one as your posting suggests.

    Still too expensive though!

    regards

    Mark