No matter how complex the enterprise and application architectures become, no matter the number of tiers, services, devices and user interfaces – at the heart of most enterprises will be a relational database.
And no matter how hard we try to implement a fully service-based architecture or a multi-purpose business tier (for example using EJBs), we will have multiple routes to the database and the data in it. Data will be manipulated through web applications, web services, client/server applications, batch database jobs, and application managers working directly against the database from the command line or through TOAD-like tools. If for no other reason, that by itself is an overriding motivation for enforcing every data constraint at the lowest possible level: the one level that none of these channels can avoid, the database itself. In addition to the fact that only enforcement inside the database can provide real integrity (and what the heck would semi-real integrity be?), for most data rules (especially complex ones) an implementation inside the database is also the most performant, scalable, maintainable, manageable, agile, elegant and easy to implement.
Note: I am not advocating that data rules should not be implemented in other tiers as well. For the sake of user friendliness through quick feedback, and to prevent incorrect data from ever getting near the database, it is of course perfectly fine to specify required fields, fields with a certain length or a domain of allowable values in user interfaces, and/or to describe and enforce data constraints in the POJO domain model and the XSDs for the canonical data model of the ESB. However: all of these are only supplemental. They do not provide robustness.
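As a minimal illustration of enforcement at the one unavoidable level: a declarative constraint on the table is checked for every channel that touches the data, whether the statement comes from a web application, a web service, a batch job or the command line. The table and column names below are invented for the example.

```sql
-- Illustrative schema: every channel that inserts or updates this table
-- is subject to the same constraints, with no way around them.
create table orders
( id           number        constraint pk_orders primary key
, customer_id  number        not null
, status       varchar2(10)  not null
    constraint ck_orders_status
    check (status in ('NEW', 'SHIPPED', 'CANCELLED'))
, order_date   date          default sysdate not null
);

-- No matter which application issues this statement, it fails:
insert into orders (id, customer_id, status)
values (1, 42, 'BOGUS');
-- ORA-02290: check constraint (CK_ORDERS_STATUS) violated
```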
Implementing data rules inside the database is not trivial, despite all the clear advantages outlined above.
First of all, different types of rules can and need to be implemented through different mechanisms. Furthermore, it is hard to implement the constraints in a truly robust manner. Table triggers are the foundation of the implementation of any non-declarative type of constraint. But without explicit attention to multi-session and locking issues, such an implementation will typically not be robust (also see: https://technology.amis.nl/blog/?p=1375). Another aspect of rule enforcement usually poses an insurmountable challenge: transaction-level enforcement. An application error should be raised only if, at the end of the transaction, after perhaps many DML statements, the data is not in a valid state. Temporary violation of rules during the transaction is logically perfectly acceptable; at commit time, everything should be straightened out and the rules have to be enforced. However, the Oracle Database does not offer a pre-commit trigger, only row-level and statement-level triggers. Many home-grown solutions for data constraint validation and enforcement are typically not capable of handling transaction-level enforcement. As a result, complex rules are either enforced too rigorously – forcing users, for example, to delete and recreate data rather than update several records – or not enforced in the database but only (or not even) in the application, which is not robust.
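For declarative constraints, Oracle does offer transaction-level checking out of the box: a constraint declared deferrable and initially deferred is validated at commit time, so temporary violations during the transaction are allowed. A sketch, with invented table names:

```sql
-- A deferred foreign key is only checked at commit time.
alter table order_lines
  add constraint fk_lines_orders
  foreign key (order_id) references orders (id)
  deferrable initially deferred;

-- Within one transaction the rule may be violated temporarily:
insert into order_lines (id, order_id, product) 
values (1, 999, 'Widget');                 -- parent order 999 does not exist yet
insert into orders (id, customer_id, status)
values (999, 42, 'NEW');                   -- now it does
commit;                                    -- validation happens here and succeeds
```

The challenge described above is that no comparable commit-time hook exists for complex, non-declarative rules implemented in triggers and PL/SQL.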
The interest in data integrity and in robustly implementing data rules grew very rapidly in the late 1990s. Various frameworks came into existence that helped developers take a logical and structured approach to analyzing, designing and implementing rules. Examples in the Oracle domain are BRIM and CDM RuleFrame. I played an early role in the creation of CDM RuleFrame, in the summer of 1999. CDM RuleFrame is a Rule 1.0 attempt at rationalizing and formalizing the implementation of data-oriented business rules. It depends on Oracle Designer (very popular and fairly omnipresent at the time) and was almost entirely Oracle7 based (with one exception: the deferred constraint acting as the lock on the door, an Oracle 8.0 feature). CDM RuleFrame underwent a minor evolution at the beginning of the century, which resolved some of the early shortcomings. However, its PL/SQL nature, row-by-row focus, poor (batch) performance and Designer dependency remained, and it makes rather limited use of more recent database features. In addition to this: CDM RuleFrame has a hole. Not a huge one, granted, but it cannot claim true robustness.
In 2005, the first Rule 2.0 framework was born: Toon Koppelaars launched release 1 of his RuleGen product. From the beginning, RuleGen was far more SQL-oriented than its predecessors. As a result, its performance is typically better or much better. In general, RuleGen follows the options offered by standard Oracle Database constraints – including the distinction between immediate and deferred checking and the option to enable and disable rules.
RuleGen is not tied to Oracle Designer – or to other products, for that matter. RuleGen is essentially a server-side framework with a design-time and run-time repository – tables and views – and PL/SQL APIs on top of those. A design-time UI is available in the form of an APEX application; other user interfaces can easily be developed.
RuleGen has a smart solution for plugging the hole that many other data/business rule implementations suffer from: the lack of locking and the resulting integrity leak. Through fine-grained, user-defined (well, RuleGen-defined) locks, the concurrency challenge is dealt with.
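To give an idea of what fine-grained, user-defined locking can look like in an Oracle database, here is a minimal sketch using the supplied DBMS_LOCK package. It serializes validation of one rule for one scope (say, one department) across sessions, so two concurrent transactions cannot both pass a check that only holds for each of them in isolation. All names are illustrative; this is the general technique, not RuleGen's actual code.

```sql
create or replace procedure serialize_rule_check
( p_rule_name in varchar2
, p_scope_key in varchar2   -- e.g. the department number the rule applies to
) is
  l_lock_handle varchar2(128);
  l_result      integer;
begin
  -- Derive a unique lock handle for this rule/scope combination
  dbms_lock.allocate_unique
  ( lockname   => 'RULE:' || p_rule_name || ':' || p_scope_key
  , lockhandle => l_lock_handle );

  -- Block until the exclusive lock is ours; release it automatically
  -- at commit, so the serialization lasts exactly one transaction
  l_result := dbms_lock.request
              ( lockhandle        => l_lock_handle
              , lockmode          => dbms_lock.x_mode
              , release_on_commit => true );

  if l_result not in (0, 4) then  -- 0 = success, 4 = we already hold it
    raise_application_error(-20001, 'Could not serialize rule check');
  end if;
end;
/
```

A trigger-based rule validation would call this procedure before querying the data it needs to validate, ensuring that no other session can slip a conflicting change past the check between query and commit.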
RuleGen comes with a number of useful run-time settings that allow rules to be enabled, disabled or set to log-only mode – for example to analyze the impact of new rules on existing applications. These settings are available per session and across the database instance. New or modified rules can be validated against existing data when first deployed. A rules log monitor provides insight into the frequency and conditions (which user, which application) of rule violations.
RuleGen supports both immediate (statement-level) and deferred (transaction-level) validation and enforcement. These settings can be made per session and database-wide. Transaction-level validation is safeguarded through a smart materialized view construct.
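The materialized view construct referred to here is, in general terms, a well-known Oracle technique: a fast-refresh-on-commit materialized view tracks the state relevant to a rule, and a check constraint on the materialized view turns any violation into an error at commit time. The sketch below illustrates the principle for a rule "a department has at most one manager"; it is not RuleGen's generated code, and the exact fast-refresh restrictions depend on the Oracle version.

```sql
-- Materialized view log so the MV can be fast refreshed on commit
create materialized view log on emp
  with sequence, rowid (deptno, job) including new values;

-- Per department, count the managers (count(*) is required for fast refresh)
create materialized view mv_mgr_count
  refresh fast on commit
as
select deptno
,      count(case when job = 'MANAGER' then 1 end) as mgr_count
,      count(*)                                    as row_count
from   emp
group  by deptno;

-- The rule itself: at most one manager per department.
-- A violating transaction fails at commit time, when the MV is refreshed.
alter table mv_mgr_count
  add constraint ck_at_most_one_mgr check (mgr_count <= 1);
```

Because the refresh happens as part of the commit, temporary violations during the transaction are allowed, yet no transaction can ever commit data that breaks the rule.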
Applications – Java, Forms, .NET, APEX, ADF, PHP, … – running against a database are not necessarily aware of RuleGen. Once RuleGen has been enabled, data manipulation happens as before. Inside the database, triggers fire and RuleGen comes into action. Most transactions will succeed; those that fail because of rule violations will fail with an application error – something applications are used to or can easily deal with.
Only those applications that want to leverage the transaction-level enforcement of constraints – and specifically the fact that the messages for all rule violations can be retrieved, instead of a single generic message informing the user that 'something has gone wrong and the transaction has failed' – have to add a single stored procedure call, made just prior to the actual commit of the transaction.
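In outline, such a pre-commit call could look as follows. The package, procedure and collection type are hypothetical stand-ins, not RuleGen's actual API; the point is that one call, just before the commit, yields every violation message for the whole transaction.

```sql
declare
  l_messages sys.odcivarchar2list;  -- illustrative collection of violation messages
begin
  -- the transaction's DML, possibly many statements
  update emp set job = 'MANAGER' where empno = 7876;

  -- Hypothetical API: validate all deferred rules for this transaction
  -- and collect every violation, instead of failing on the first one
  rule_api.check_transaction(p_messages => l_messages);

  if l_messages.count > 0 then
    for i in 1 .. l_messages.count loop
      dbms_output.put_line(l_messages(i));  -- report each violation to the user
    end loop;
    rollback;
  else
    commit;
  end if;
end;
/
```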
Since its inception in 2005, RuleGen has evolved over the years. No major changes have been introduced – at least not to the core of the framework. However, years of practice in various implementations around the Netherlands have led to useful improvements and the odd bug fix. Last year – 2010 – saw the roll-out of release 3.0, with some new features and one important step forward: the code generated by RuleGen used to be wrapped PL/SQL code. Starting with RuleGen 3.0, this is no longer the case. Even though the RuleGen generator itself is still protected, all generated code and the entire run-time framework are open. This has removed a concern several interested enterprises had voiced.
Smaller yet valuable improvements include the option to deploy rules directly from the UI to the target run-time environment, support for rule scoping – whereby (typically for the purpose of a unit test) all rules except a specifically selected rule are disabled in a session – and validation of a rule against existing data (rather than only for new transactions), next to some usability and performance enhancements.
After the rule frenzy of the late 1990s, the focus on business rules seemed to relax over the years. Enterprise integration and Web 2.0, Business Process Management and Service Orientation, (real-time) Business Intelligence and mobile applications consumed IT departments. The enterprise database was less in vogue, and with it we saw a dwindling interest in business rule enforcement.
It seems that recently, things have taken a turn. Integrating systems (A2A and B2B) in a Service Oriented Architecture through an enterprise service bus, as well as the roll-out of multiple applications – Web 2.0, mobile, … – against the same underlying database, has made it abundantly clear that robust enforcement of data constraints is a top priority. Exposing data to external consumers via services does not work well with data of questionable quality. Receiving data and data manipulations from multiple applications (front ends) and services that may not have the same regard for the data constraints as the original application quickly results in flawed data.
Many organizations discover that their enterprise data may not have the quality required to expose that data through web applications, mobile apps and web services to external consumers. They also find out that despite all the archi-ware and slide-works, the enterprise service bus is not the place to enforce low-level, fine-grained data constraints. They conclude that rigorous enforcement of data constraints and data-oriented business rules is required, and that it should be implemented as close to the tables as possible – for integrity and robustness, for scalability and performance, for reuse and agility, for ease of implementation and future-proofing.
RuleGen 3.0 is a pretty good fit for the requirement that follows from these conclusions. The fact that Oracle database developers can use RuleGen with their current skill set and can be up and running in a few hours' time, and that RuleGen can be deployed in existing databases, under current applications, with no impact other than the desired increase in robustness, makes it quite attractive.
In a sequel to this article, I will demonstrate the design and implementation of a fairly complex business rule using RuleGen – something that took me less than 20 minutes to accomplish. This will illustrate the architecture of RuleGen and the way of developing with it.
The RuleGen website: http://www.rulegen.com/.
My review of RuleGen back in 2008: https://technology.amis.nl/blog/2981/rulegen-the-next-generation-framework-for-data-logic-aka-data-oriented-business-rules.