Collecting Statistics Using Supplied Packages
If you have not tuned your queries specifically for RULE-based optimization, COST-based optimization will usually give you far better performance. The COST-based optimizer uses the statistics on tables and indexes collected by the ANALYZE command to find the best execution path. For COST-based optimization to perform properly, those statistics must be current. How do you keep the statistics current? Well, Oracle 7.3 ships with two useful supplied procedures, dbms_utility.analyze_schema and dbms_utility.analyze_database. Make sure your init.ora parameter OPTIMIZER_MODE is set to CHOOSE (preferable; uses cost-based optimization when statistics are available), FIRST_ROWS (best response time), or ALL_ROWS (best throughput).
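To try the supplied procedure interactively before scheduling it, you can run something like the following from SQL*Plus (the schema name SCOTT and the 20% sample are placeholders, not values from the original script):

```
REM Compute exact statistics for every table, cluster, and index in SCOTT
SQL> exec dbms_utility.analyze_schema('SCOTT', 'COMPUTE');

REM Or estimate from a sample when the schema is large
SQL> exec dbms_utility.analyze_schema('SCOTT', 'ESTIMATE', estimate_percent => 20);
```

dbms_utility.analyze_database takes the same method arguments and analyzes every schema in the database.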
Let's see how we can set up a procedure to run every morning at 6 AM (after the nightly batch jobs that load data) to collect statistics. Here we are going to use the job scheduling feature of the Oracle RDBMS.
Log in to SQL*Plus as a user who has the ANALYZE ANY system privilege. The following lines will set up a job using the Oracle supplied packages.
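The original script is not reproduced here; a sketch of what the job submission might look like is below (the schema name SCOTT and the 50% sample size are illustrative assumptions):

```
SQL> variable jobno number;
SQL> begin
       dbms_job.submit(
         job       => :jobno,
         what      => 'dbms_utility.analyze_schema(''SCOTT'', ''ESTIMATE'', estimate_percent => 50);',
         next_date => trunc(sysdate) + 15/12,
         interval  => 'trunc(sysdate) + 15/12');
       commit;
     end;
     /
SQL> print jobno
```

Note the jobno value printed at the end; you will need it to remove the job later.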
OK, here's how it works... If 50% of the total number of rows in a table comes to more than 50,000 rows, Oracle estimates the statistics for that table from a sample; otherwise the statistics are computed exactly. This way we do not waste resources computing statistics for large tables. trunc(sysdate)+15/12 adds 1.25 days (1 day and 6 hours) to midnight of the current date, so the first execution of this procedure will be at 6 AM tomorrow, and after each successful run the next date and time is arrived at by the same method.
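The estimate-versus-compute rule described above can be sketched as a small wrapper procedure; this is an illustrative assumption about how such logic might be written (the procedure name and the use of dbms_ddl.analyze_object to analyze one table at a time are mine, not from the original script):

```
REM Hypothetical wrapper: estimate with a 50% sample when half the
REM rows exceed 50,000; compute exact statistics otherwise.
create or replace procedure analyze_my_tables as
begin
  for t in (select table_name, nvl(num_rows, 0) num_rows
            from user_tables) loop
    if t.num_rows * 0.5 > 50000 then
      dbms_ddl.analyze_object('TABLE', user, t.table_name,
                              'ESTIMATE', estimate_percent => 50);
    else
      dbms_ddl.analyze_object('TABLE', user, t.table_name, 'COMPUTE');
    end if;
  end loop;
end;
/
```

The num_rows column in user_tables is itself populated by a previous ANALYZE, so the very first run of such a procedure would compute statistics for every table.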
To list each job's status and when it is next scheduled to run, please refer to the script jobinfo.sql elsewhere in my script collection.
To remove this job from Oracle RDBMS, do
SQL> exec dbms_job.remove (xx);
where xx is the job number you received from print jobno above.
The init.ora parameters JOB_QUEUE_PROCESSES and JOB_QUEUE_INTERVAL must be set for the jobs to run.
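For example, the relevant init.ora entries might look like this (the values shown are illustrative, not requirements):

```
# Number of SNP background processes; 0 disables the job queue entirely
JOB_QUEUE_PROCESSES = 2

# Seconds between job-queue wake-ups to check for due jobs
JOB_QUEUE_INTERVAL = 60
```

After changing these parameters, the instance must be restarted for them to take effect.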