My impressions from the Oak table seminar with Tom Kyte.
I knew Tom Kyte only from the book ‘Effective Oracle by Design’ and was interested to hear him live. Much of the seminar’s content can also be found in that book, but I did hear some interesting new things.
What impressed me most was the way Tom was able to explain the most complex topics. A few examples and small test cases, using nothing but SQL*Plus, made things very clear.
Before the seminar I had never visited his site ‘Ask Tom’. Since the seminar I have visited it several times, and I must say the answers you get there are as crystal clear as can be.
Things I found the most interesting:
Background on read and write consistency in an Oracle database and the way it influences the firing of before and after row triggers. Never put code in those triggers that cannot be rolled back: Oracle may restart a statement to keep it consistent, firing the trigger more than once, and you may end up with unexpected behaviour!
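A small sketch of the problem (the `emp` table and trigger name are hypothetical, just to illustrate the point):

```sql
-- Hypothetical BEFORE ROW trigger. If Oracle silently restarts the
-- UPDATE for write consistency, this trigger can fire more than once.
CREATE OR REPLACE TRIGGER emp_before_upd
BEFORE UPDATE ON emp
FOR EACH ROW
BEGIN
  -- Safe: purely transactional work is rolled back with the restart.
  :new.last_changed := SYSDATE;
  -- Unsafe: a side effect such as sending mail or writing a log row
  -- in an autonomous transaction cannot be rolled back, so it may
  -- happen twice (or happen for an update that never commits).
  -- utl_mail.send(...);
END;
/
```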
All the overhead that Oracle keeps track of to keep your data consistent and recoverable. And it still performs amazingly fast.
Autotrace sometimes ‘lies’:
1. Turning autotrace on changes your session environment and may therefore change the explain plans you see when you execute a statement. (This was very nice to learn, since I have had headaches over seeing different explain plans with autotrace on and off in a production environment.)
2. Autotrace handles the conversion from number to varchar2 for you. A statement with a bind variable against an indexed varchar2 column, where the value you bind is a number, can give you the wrong idea: autotrace will show you an explain plan in which the index is used, but in real life the optimizer applies an implicit conversion to the column and will not use the index.
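A sketch of the second trap in SQL*Plus (table `t` and its indexed varchar2 column `code` are hypothetical):

```sql
-- t.code is an indexed VARCHAR2 column; we bind a NUMBER.
VARIABLE n NUMBER
EXEC :n := 42

SET AUTOTRACE TRACEONLY EXPLAIN
-- Autotrace may happily show an index range scan on CODE here.
-- At run time, however, Oracle rewrites the predicate as
--   TO_NUMBER(code) = :n
-- and the index on CODE cannot be used.
SELECT * FROM t WHERE code = :n;
```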
The existence of the pseudo column ORA_ROWSCN, which you can query to check whether a row has changed since you last read it.
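For example (table `t` is hypothetical; note that by default ORA_ROWSCN is tracked per block, so you need `CREATE TABLE ... ROWDEPENDENCIES` for true row-level tracking):

```sql
-- Remember the SCN when we first read the row:
SELECT ora_rowscn, t.* FROM t WHERE id = 1;

-- Later: if this returns 0 rows, somebody changed (and committed)
-- the row since we read it -- lost-update detection without a
-- hand-rolled version column.
SELECT * FROM t WHERE id = 1 AND ora_rowscn = :scn_we_read_before;
```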
With Oracle 10g only a few hints are still valid to use. Almost everything (even obviously bad PL/SQL) will be tuned by Oracle. A developer now only needs to make a good design and good algorithms; Oracle will do the rest. Of course you must make sure that the optimizer has all the statistics, but even that is solved in Oracle 10g: it will gather statistics as soon as it notices they are missing.
Valid hints you still should use:
ALL_ROWS or FIRST_ROWS(n) (I didn’t know about the option of specifying the number of rows you want to see fast).
(NO)REWRITE to switch query rewrite against materialised views on or off.
DRIVING_SITE to indicate which side of a database link should build the query plan. A query over a database link means that one of the two databases must produce the plan; it is best to let the database with the biggest / most complex tables make that decision.
(NO)APPEND on inserts to switch direct-path loading on or off, which minimises undo (and, combined with NOLOGGING, redo) generation.
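A few of these hints sketched in SQL (all table names, and the `remote_db` link, are hypothetical):

```sql
-- FIRST_ROWS(n): optimise for getting the first 10 rows back quickly.
SELECT /*+ FIRST_ROWS(10) */ *
FROM orders
ORDER BY order_date DESC;

-- DRIVING_SITE: let the remote database (the one holding the big
-- table over the link) build the plan.
SELECT /*+ DRIVING_SITE(r) */ l.id, r.big_col
FROM local_tab l, remote_tab@remote_db r
WHERE l.id = r.id;

-- APPEND: direct-path insert above the high-water mark.
INSERT /*+ APPEND */ INTO target_tab
SELECT * FROM source_tab;
```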
A lot of new flashback possibilities in 10g. I think these are especially useful for DBAs; I have never had the need to use them myself. As long as undo information is available you can get everything back.
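Two simple flashback examples, assuming sufficient undo retention (the `emp` table is hypothetical, and flashing back a table requires row movement to be enabled on it):

```sql
-- Flashback query: see a row as it looked an hour ago.
SELECT *
FROM emp AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '1' HOUR)
WHERE empno = 7369;

-- 10g can also flash back the whole table in place:
ALTER TABLE emp ENABLE ROW MOVEMENT;
FLASHBACK TABLE emp TO TIMESTAMP (SYSTIMESTAMP - INTERVAL '1' HOUR);
```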
Be inventive when finding a solution for a problem. Tom’s book Effective Oracle by Design and his website have very nice examples of problem solving, and analytic functions often play a big role.
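A classic example of the kind of analytic-function trick that turns up in those answers (using the familiar demo `emp` table): top earner per department without a self-join or correlated subquery.

```sql
-- RANK() over each department, ordered by salary descending,
-- then keep only the top-ranked row(s) per department.
SELECT deptno, ename, sal
FROM ( SELECT deptno, ename, sal,
              RANK() OVER (PARTITION BY deptno
                           ORDER BY sal DESC) AS rnk
       FROM emp )
WHERE rnk = 1;
```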
SQL Server still has problems with read and write consistency.
Oracle 10g is doing much better than Oracle 9i. If you are still on Oracle 8i, think seriously about moving to 10g instead of 9i.
Conclusions: check out the Ask Tom site and get the book ‘Effective Oracle by Design’.