<div id="_htmlarea_default_style_" style="font:10pt arial,helvetica,sans-serif">Another thing to look at for bulk data loads is commit frequency.
Assuming your data is "clean", commits on bulk loads should not happen after each and every record. Every 1K, 10K, 100K records yes. There
is a sweet spot, and you will find a dramatic decrease in load time by properly adjusting.<br><br><br>On Thu, 1 Mar 2012 08:42:16
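
As a rough illustration, here is a minimal sketch of a batched load using Python's DB-API. The driver (psycopg2), the connection string, and the staging_table name and columns are just placeholders for this example; the batch size is something you would have to measure for your own system.

    import psycopg2

    BATCH_SIZE = 10_000  # the "sweet spot" varies; try 1K, 10K, 100K and measure

    def bulk_load(rows, dsn="dbname=mydb"):
        conn = psycopg2.connect(dsn)
        cur = conn.cursor()
        try:
            for i, row in enumerate(rows, start=1):
                cur.execute(
                    "INSERT INTO staging_table (id, payload) VALUES (%s, %s)",
                    row,
                )
                # Commit once per batch instead of once per record.
                if i % BATCH_SIZE == 0:
                    conn.commit()
            conn.commit()  # flush the final partial batch
        finally:
            cur.close()
            conn.close()
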
On Thu, 1 Mar 2012 08:42:16 -0600, <murraymckee@wellsfargo.com> wrote:
> If the analyze description shows something odd, like scanning a table versus using an
> index, you might need to update the statistics of the DB. And in some cases you might
> have to manually override the statistics that are automatically collected to
> 'encourage' the optimizer to use an index.
<snip>
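
To make that quoted suggestion concrete, here is a minimal sketch, assuming a PostgreSQL-style system where EXPLAIN shows the chosen plan and ANALYZE refreshes the optimizer statistics; other databases use different commands (UPDATE STATISTICS, optimizedb, etc.), and the table and query below are hypothetical.

    import psycopg2

    def check_plan(dsn="dbname=mydb"):
        conn = psycopg2.connect(dsn)
        cur = conn.cursor()
        try:
            # If the plan shows a full table scan where an index was expected,
            # the statistics may be stale.
            cur.execute("EXPLAIN SELECT * FROM staging_table WHERE id = 42")
            for (line,) in cur.fetchall():
                print(line)

            # Refresh the statistics so the optimizer can reconsider the index.
            cur.execute("ANALYZE staging_table")
            conn.commit()
        finally:
            cur.close()
            conn.close()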