Monday, November 3, 2008

Primary key / unique constraints enforced using a non-unique index - 11.1.0.6 and 11.1.0.7

When a primary key or unique constraint is enforced using a non-unique index, 11.1.0.6 allows direct-path inserts (append mode), in contrast to 10.2.0.4, which silently falls back to conventional inserts in this particular case. For a demonstration, see here.
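
A quick way to tell whether the append hint was actually honoured is to query the table within the same transaction: after a true direct-path insert this raises ORA-12838, whereas after a silent fallback to a conventional insert the query simply succeeds. A minimal sketch (the table, index and constraint names are made up):

create table t (id number);
create index t_idx on t (id);
alter table t add constraint t_pk primary key (id) using index t_idx;

insert /*+ append */ into t
select rownum from dual connect by level <= 100;

-- raises ORA-12838 if the insert really went direct-path (11.1.0.6),
-- returns the count if it silently fell back to a conventional insert (10.2.0.4)
select count(*) from t;

rollback;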

But this enhancement in 11.1.0.6 seems to allow a situation where an enabled and validated constraint can be violated without any error message, which looks like a bug. If you simply set the non-unique index to "unusable" and then attempt to insert duplicates in direct-path insert mode, the insert succeeds and you end up with duplicate data in the table although the primary key/unique constraint is enabled and validated.

A small script demonstrates the issue:


SQL>
SQL> -- blank sample table
SQL> create table append_test as
2 select
3 trunc(sqrt(rownum-1)) as skewed_data,
4 rownum-1 as id,
5 lpad(rownum-1,10) id_char,
6 rpad('x',50, 'x') as filler
7 from
8 all_objects
9 where 1 = 2;

Table created.

SQL>
SQL> -- create non-unique index
SQL> create index append_test_pk on append_test(id);

Index created.

SQL>
SQL> -- add primary key constraint
SQL> alter table append_test add constraint pk_append_test primary key (id);

Table altered.

SQL>
SQL> -- same applies to unique constraint
SQL> -- alter table append_test add constraint uq_append_test unique (id);
SQL>
SQL> -- make the index unusable
SQL> alter index append_test_pk unusable;

Index altered.

SQL>
SQL> -- now perform a direct-path insert
SQL> insert /*+ append */ into append_test
2 select
3 trunc(sqrt(rownum-1)) as skewed_data,
4 1 as id,
5 --rownum-1 as id,
6 lpad(rownum-1,10) id_char,
7 rpad('x',50, 'x') as filler
8 from
9 all_objects
10 where rownum <= 100;

100 rows created.

SQL>
SQL> -- this generates an error ORA-12838
SQL> -- and therefore shows that this
SQL> -- was a direct-path insert
SQL> select * from append_test;
select * from append_test
*
ERROR at line 1:
ORA-12838: cannot read/modify an object after modifying it in parallel


SQL>
SQL> -- now we have non-unique data
SQL> -- in the table
SQL> -- although the primary key
SQL> -- constraint is enabled and
SQL> -- validated
SQL> commit;

Commit complete.

SQL>
SQL> -- try the same using conventional insert
SQL> insert /*+ noappend */ into append_test
2 select
3 trunc(sqrt(rownum-1)) as skewed_data,
4 1 as id,
5 --rownum-1 as id,
6 lpad(rownum-1,10) id_char,
7 rpad('x',50, 'x') as filler
8 from
9 all_objects
10 where rownum <= 100;
insert /*+ noappend */ into append_test
*
ERROR at line 1:
ORA-01502: index 'CBO_TEST.APPEND_TEST_PK' or partition of such index is in unusable state


SQL>
SQL> -- rebuild the index
SQL> alter index append_test_pk rebuild;

Index altered.

SQL>
SQL> -- attempt to re-validate the constraint
SQL> alter table append_test modify constraint pk_append_test novalidate;

Table altered.

SQL>
SQL> -- fails due to duplicates
SQL> alter table append_test modify constraint pk_append_test validate;
alter table append_test modify constraint pk_append_test validate
*
ERROR at line 1:
ORA-02437: cannot validate (CBO_TEST.PK_APPEND_TEST) - primary key violated


SQL> spool off
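
Note that in the run above the constraint cannot be re-validated until the duplicates are removed. A minimal cleanup sketch (keeping an arbitrary row per key value):

-- remove all but one row per key value
delete from append_test a
 where a.rowid > (select min(b.rowid)
                    from append_test b
                   where b.id = a.id);

commit;

-- the validation should now succeed
alter table append_test modify constraint pk_append_test validate;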


The issue can still be reproduced in the 11.1.0.7 patch set and therefore seems to be a bug that has not yet been fixed.
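
Since only unique/primary key constraints backed by a non-unique index are exposed to this behaviour, the following sketch against the standard dictionary views lists the potentially affected constraints of the current schema together with the status of the supporting index:

select c.table_name,
       c.constraint_name,
       c.index_name,
       i.uniqueness,
       i.status
  from user_constraints c
  join user_indexes i
    on i.index_name = c.index_name
 where c.constraint_type in ('P', 'U')
   and i.uniqueness = 'NONUNIQUE';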

3 comments:

  1. Hi Randolf,

    nice blog ... although I have one beginner's question: why would someone enforce a primary key or unique constraint using a non-unique index?
    No example comes to my mind where this could be applied.

    thanks,
    goran

  2. Hi Goran,

    Deferred constraints, and constraints that are enabled but not validated, are such examples:

    1. Oracle supports deferrable constraints that may be violated during a transaction but must be satisfied when the transaction is committed. For a unique/primary key constraint this is only possible with a non-unique index, because a unique index would not allow the temporary duplicates.

    2. It is probably not a common scenario, but you can enable a unique/primary key constraint and leave it in NOVALIDATE state. If there are duplicates in the existing data, this is only possible with a non-unique index (a sketch of both cases follows below).
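
    A minimal sketch of both cases (table, index and constraint names are made up):

    create table t (id number);
    create index t_idx on t (id);

    -- case 1: deferrable primary key - requires a non-unique index,
    -- duplicates are tolerated until the transaction commits
    alter table t add constraint t_pk primary key (id)
      using index t_idx
      deferrable initially deferred;

    insert into t values (1);
    insert into t values (1);  -- temporary duplicate, no error yet
    commit;                    -- fails here (ORA-02091/ORA-00001) as long as the duplicate remains

    -- case 2: enable the constraint without validating the existing data -
    -- with duplicates already in the table this only works with a non-unique index
    alter table t drop constraint t_pk;
    insert into t values (2);
    insert into t values (2);
    commit;
    alter table t add constraint t_pk primary key (id)
      using index t_idx
      enable novalidate;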

    Randolf

  3. It is a common and good practice to use a non-unique index to support a unique constraint. In this way the constraint can be dropped and re-created (quite an easy process) without the need to rebuild the index (quite a long-winded process). - Elwyn
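
    For illustration (table, constraint and index names are made up), that approach could look like this:

    -- drop the constraint but keep the supporting non-unique index
    alter table orders drop constraint orders_uq keep index;

    -- ... e.g. perform a large data load ...

    -- re-create the constraint on top of the existing index -
    -- nothing has to be rebuilt at this point
    alter table orders add constraint orders_uq unique (order_no)
      using index orders_uq_idx;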
