Posts tagged data integrity
The accuracy, internal quality, and reliability of data are frequently referred to by the term ‘data integrity’. Without it, data is less valuable or even useless. This session takes a close look at what data integrity entails and how it can be enforced in multi-tier application architectures that use distributed data sources and global transactions. The discussion will make clear which elements are required of any robust implementation of data-oriented business rules, a.k.a. data constraints, and it will explain why most existing solutions are not as watertight as is frequently assumed. Steps for achieving reliable constraint enforcement are demonstrated.
The presentation I gave last week at the JFall 2013 conference can be viewed on SlideShare:
To be useful, data held and used in information systems has to live up to a number of expectations. The data should be an accurate representation of its source. It should be reliable. It should have internal consistency. And it should adhere to rules based on the logic of the real world. This accuracy, internal quality, and reliability of data is frequently referred to as data integrity.
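As a minimal illustration of a rule "based on the logic of the real world", consider a data constraint on an order record. The example below is a hypothetical sketch (the rule, field names, and function are mine, not from the article): a delivery date may not precede the order date, and an amount may not be negative.

```python
from datetime import date

def check_order_integrity(order):
    """Validate hypothetical real-world rules for an order record.

    Returns a list of violated constraints; an empty list means the
    record satisfies all checked rules.
    """
    violations = []
    if order["amount"] < 0:
        violations.append("amount must be non-negative")
    if order["delivery_date"] < order["order_date"]:
        violations.append("delivery date may not precede order date")
    return violations

# A record that breaks the date rule:
bad_order = {"amount": 100,
             "order_date": date(2013, 11, 6),
             "delivery_date": date(2013, 11, 1)}
print(check_order_integrity(bad_order))
# → ['delivery date may not precede order date']
```

In a single database such a rule would typically be declared as a check constraint or enforced in a trigger; the point of the article is what happens to such rules once the data is spread over multiple tiers and stores.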
Safeguarding the integrity of data is a challenge, one that increases in complexity when multiple users access and manipulate the data simultaneously, obviously a common situation. And that challenge reaches new heights when the data is managed in multiple independent data stores rather than a single database.
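One common safeguard against simultaneous manipulation is optimistic locking: each row carries a version number, and an update is only applied if the version is still the one the user originally read. The sketch below is a hypothetical in-memory illustration of that technique (the `Row` class, `update` function, and error type are my own names, not an API from the article):

```python
class StaleDataError(Exception):
    """Raised when a concurrent update has invalidated the row we read."""

class Row:
    def __init__(self, value):
        self.value = value
        self.version = 0  # incremented on every successful update

def update(row, expected_version, new_value):
    """Optimistic-lock update: apply the change only if nobody else
    has modified the row since we read it."""
    if row.version != expected_version:
        raise StaleDataError("row was changed by a concurrent transaction")
    row.value = new_value
    row.version += 1

row = Row("initial")
v = row.version            # user A reads the row (version 0)
update(row, 0, "by B")     # user B commits first; version becomes 1
try:
    update(row, v, "by A")  # user A's update is rejected as stale
except StaleDataError as e:
    print(e)
# → row was changed by a concurrent transaction
```

Within one database this is handled by the DBMS's locking and transaction isolation; across multiple independent data stores, no single lock manager sees all the data, which is exactly where the challenge "reaches new heights".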
Earlier this month, the Oracle Technology Network published an article I recently wrote on this subject: http://www.oracle.com/technetwork/articles/soa/jellema-data-integrity-1932181.html. I was prompted to write it by two recent experiences.
One was at a customer of mine where we are designing a service-oriented architecture based on a number of distinct and independent data domains. These domains are exposed through elementary (entity) services. A second tier of …