Comments on Oracle related stuff: Concurrent Index Creation

Anonymous (2017-12-01):
I must thank you for the effort you've put into penning this site. I am hoping to see the same high-grade blog posts from you in the future as well. In truth, your creative writing abilities have inspired me to get my own site now ;)

Randolf (2017-02-28):
The files are still available, but you are probably trying to download from behind a firewall that prevents downloads from non-default ports - the site hosting the files uses non-default port 7676 (see previous comments).

Randolf

Anonymous (2017-02-27):
The files aren't available - could you please re-post them?

Randolf (2012-09-01):
Hi Ram,

thanks for your comment. Apologies for the delayed response, but I was on vacation for the last couple of days. I haven't looked into the details of Data Pump and constraint enabling resp. validation. At present I assume I won't spend time on that topic in the near future, so don't expect anything in that regard.

Randolf

Ram (2012-08-23):
Hi Randolf,

Thanks for sharing your efforts. This really helped me save some serious time. Data Pump does not use parallelism for enabling constraints (primary keys as well as referential constraints) and spends a huge amount of time on them. I think we can save a lot of time by running the constraints in parallel mode using 'alter session enable parallel ddl'. I am not good at sed/awk - do you have any plans to extend your script to constraints too? It would be a great help if you could. Thanks again for helping out!
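The parallel constraint validation Ram refers to could be sketched roughly as follows - a minimal sketch, not part of the script from the post, with placeholder table and constraint names (ORDERS / ORDERS_FK are assumptions):

```sql
-- Allow DDL in this session to run in parallel
ALTER SESSION ENABLE PARALLEL DDL;

-- Enable the constraint without checking existing rows (fast dictionary-only change)
ALTER TABLE orders MODIFY CONSTRAINT orders_fk ENABLE NOVALIDATE;

-- Give the table a parallel degree so the validation scan can use parallel execution
ALTER TABLE orders PARALLEL 8;

-- Validate the existing rows - the full-table check can now run in parallel
ALTER TABLE orders MODIFY CONSTRAINT orders_fk VALIDATE;

-- Reset the table's degree afterwards
ALTER TABLE orders NOPARALLEL;
```

The ENABLE NOVALIDATE / VALIDATE split also has the side benefit that the validation step does not block concurrent DML on the table.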
Noel (2012-03-24):
Hello Randolf,

An important clarification: the concurrent process does not work in RESTRICTED mode - the instance must be fully open.

Thank you,
Noel

Randolf (2012-03-12):
Hi Noel,

thanks for the update. It's nice to hear that the information provided here helps you!

Thanks,
Randolf

Noel (2012-03-12):
Hello Randolf,

just to keep you informed: your process works fine in Standard Edition. I rebuilt the 28000 indexes of my JDE database in just 1h40, including unique indexes, using 40 processes on a P740 with 4 cores and 4 threads per core. On the new production server we will have 12 cores, so I hope to run the full import in less than 6h, compared to 20h with the non-optimized process. So thank you a lot, Randolf - you saved my migration process!

Best regards,
Noel
Randolf (2012-03-07):
Noel,

that sounds like a reasonable approach. If you can include the creation of the indexes supporting the constraints in the scripts used to populate the queues for the worker threads, then you should be able to create the corresponding constraints afterwards on top of the already existing indexes, without the need to rebuild or recreate any of them.

Randolf

Noel (2012-03-07):
Hello Randolf,

It seems that Advanced Queuing and the Scheduler are available in SE. My first import test with the EXCLUDE=INDEX option shows that unique indexes are still created, because I don't exclude the primary key constraints. So the question is: must we include the unique indexes in your process as well, with EXCLUDE=CONSTRAINT? In that case we must test whether we can create the constraints by script without rebuilding the unique indexes - my import of one 600 GB schema took 7h00, but the constraints took 5h30.

Thank you
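Creating a constraint on top of an already existing index - the approach Randolf describes above - can be sketched like this (EMP / EMP_PK_IDX are placeholder names, not objects from the post):

```sql
-- Create the supporting unique index first - this is the expensive step,
-- and the one worth pushing through the concurrent worker threads
CREATE UNIQUE INDEX emp_pk_idx ON emp (empno);

-- Adding the constraint afterwards reuses the existing index via USING INDEX:
-- a quick dictionary operation, no index rebuild
ALTER TABLE emp ADD CONSTRAINT emp_pk PRIMARY KEY (empno)
  USING INDEX emp_pk_idx;
```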
Randolf (2012-03-04):
Noel,

honestly, I haven't tried the procedure on a Standard Edition yet. But since this is simply a specific implementation of a process spawning multiple threads to perform a task concurrently, the main building blocks are:

- Advanced Queueing
- Scheduler

Hence, if these components are supported by Standard Edition, this should work. Each thread will then only be able to create an index serially. I don't know what happens to indexes that have been defined as parallel in your Enterprise Edition source database - whether Standard Edition simply ignores the PARALLEL attribute or throws an error - but you can influence the script that is being processed and remove the PARALLEL clause if necessary.

Randolf

Noel (2012-03-03):
Hello Randolf,

Can your concurrent index creation process be used on a 10gR2 Standard Edition? I have to import a 1 TB 10.2.0.4 EE database into a 10.2.0.4 SE one, where the parallel option is not allowed.

Thank you,
Noel

Randolf (2011-09-21):
> I am running the data pump with 8 worker threads, wouldn't each of the worker threads work on one index, ie: 8 indexes are being created in parallel, instead of one index being created by 8 threads in parallel

Have you tried it? No matter how many worker threads you define, when it comes to index creation only a single worker thread will be active. On top of that, there is the problem in some versions that PARALLEL indexes don't get rebuilt in parallel.

So these are the reasons for this post.

Randolf

Anonymous (2011-09-21):
I don't quite understand the problem: if there are 100 indexes to create and I am running Data Pump with 8 worker threads, wouldn't each of the worker threads work on one index, i.e. 8 indexes are being created in parallel, instead of one index being created by 8 threads in parallel? So in effect, at a given time, the same amount of work is getting done. Did I miss something?

Mark Teehan (2011-09-15):
You are right - it works fine from home. I am merging your code with some additional procedures invoking dbms_datapump so that it can be invoked without database server access - for example, for use by developers. Also, have you considered using dbms_metadata.get_ddl to extract the "create index" commands? The benefit would be keeping all processing inside PL/SQL. Time permitting, I will probably try it out and let you know how it goes.
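Extracting the CREATE INDEX statements with DBMS_METADATA, as Mark suggests, could look roughly like this (the SCOTT schema filter is a placeholder):

```sql
-- Make the generated DDL end with a SQL terminator so it can be replayed as-is
BEGIN
  DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,
                                    'SQLTERMINATOR', TRUE);
END;
/

-- Pull the CREATE INDEX statement for every index of a given schema
SELECT DBMS_METADATA.GET_DDL('INDEX', index_name, owner)
FROM   dba_indexes
WHERE  owner = 'SCOTT';
```

The resulting CLOBs could then be queued for the worker threads directly, keeping all processing inside PL/SQL instead of post-processing a dump file with sed/awk.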
Randolf (2011-09-14):
Hi Mark,

the link works for me - note however that my site uses the non-default port 7676 rather than 80, so you might have trouble behind some corporate firewalls.

Randolf

Mark Teehan (2011-09-14):
Hi - this is an excellent blog post and I am looking forward to trying it out. However, the zip file is unavailable - is your site down?

Thanks!
Mark

Damir Vadas (2011-09-05):
@Nagendra,

maybe a way out in your case is to exclude the CTX schema and recreate it later, e.g.:
http://www.databasedesign-resource.com/oracle-indexes.html
Hope this helps,
Damir
Randolf (2011-06-18):
Hi Nagendra,

thanks for reminding me of Domain Indexes. These are indeed not covered by the current version of the script.

Regarding your specific problem creating the Domain Index: unfortunately I don't know more about this particular error. Probably a case you have to sort out with Oracle Support, I guess.

Randolf

Nagendra (2011-06-14):
Hi Randolf,

I have used this method and was able to import the indexes. But your transform_all_sql.sh does not take care of Domain Indexes, so I tried creating the domain indexes manually. I get the error below:

SQL> create index AR.hz_class_code_denorm_t1_1 on AR.HZ_CLASS_CODE_DENORM(CLASS_CODE_MEANING) indextype is ctxsys.context;
create index AR.hz_class_code_denorm_t1_1
*
ERROR at line 1:
ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
ORA-20000: Oracle Text error:
DRG-00100: internal error, arguments : [51021],[drwaf.c],[1607],[],[]
ORA-06512: at "CTXSYS.DRUE", line 160
ORA-06512: at "CTXSYS.TEXTINDEXMETHODS", line 366

I have tried workarounds from several Metalink notes, but I still have the problem. Is this something you already have a workaround for? If so, could you tell me the workaround, please?

Regards,
Nagendra

Kumar Madduri (2011-03-24):
Thank you Randolf. I will try again and see. I will keep you posted.

Kumar
Randolf (2011-03-23):
Hi Kumar,

I think I get your problem now. You are not referring to an error in the log - you get an error from the "coordinator" session where you attempt to start the jobs.

The error is caused by the DBMS_SCHEDULER.CREATE_JOB call - a scheduler job is actually a database object and therefore needs to have a unique name. The error tells you that there is already a job with that name.

For your second attempt the error can be explained: as I've outlined in the text, if you use the same name for both worker threads, then you need to specify "0" (not "NULL" - note that "NULL" and "0" have different meanings in general, and in particular here) for one of the worker thread degrees; otherwise you'll end up with exactly that error, because the script will attempt to create two sets of jobs with overlapping names.

However, that doesn't explain why you got a similar error on the initial invocation - I can't tell from the information provided why that error occurred when running the script for the first time. In principle it means that such a job already exists, yet when running for the very first time there should be no such job. If you had used the same name for the two worker threads in your initial attempt as well, without setting one of the degrees to 0, this would explain the problem as outlined in the blog post.

Randolf
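The ORA-27477 behaviour Randolf describes - scheduler jobs being schema objects with unique names - can be reproduced with a minimal sketch (INDEX_WORKER_1 is a placeholder name, not the naming scheme of the actual script):

```sql
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'INDEX_WORKER_1',
    job_type   => 'PLSQL_BLOCK',
    job_action => 'BEGIN NULL; END;',
    enabled    => TRUE);

  -- A second CREATE_JOB with the same job_name fails, because job names
  -- are database object names within the schema:
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'INDEX_WORKER_1',
    job_type   => 'PLSQL_BLOCK',
    job_action => 'BEGIN NULL; END;',
    enabled    => TRUE);
  -- raises ORA-27477: "<schema>.INDEX_WORKER_1" already exists
END;
/
```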
Kumar Madduri (2011-03-23):
Hi Randolf,

I verified after I posted the comment, and the indexes are created as expected. I also read the README, in which you mention the logic whereby, if an index already exists or there is an error, the script continues. But I had to run the SQL script a couple of times. The first time it failed with

ORA-27477: "KMADDURI.SERIAL_INDEX_20" already exists

and the table and directory were not dropped. I ran it a second time, and this time it created the additional indexes that were missed because of the failed prior run, but exited again, though with the error ORA-27477: "KMADDURI.KUMAR_1" already exists.

So I ran the SQL script again, and this time there were no indexes left to create, because the first two runs had already created them; this time the program exited cleanly and dropped the directory at the end. I was trying to understand why I got the ORA-27477 error (I chose the default values for all the prompts, except the name of the job in the second run, where I put KUMAR).

Thank you for your time.

Kumar

Randolf (2011-03-23):
Kumar,

I think I'm having trouble understanding what exactly your problem is. You seem to be trying to create some missing indexes, and the script that you supply also includes indexes that already exist. Therefore I would expect such errors to be shown in the log, since the indexes that already exist will raise this error during processing; but in the provided version of the logic, a worker thread that encounters such an error should continue, since it catches the error and attempts to go ahead with the next entry from the queue.

So what exactly is your point? Are you trying to say that some of the missing indexes have not been created, or that you get the "already exists" error for indexes that you think are missing?

Randolf

Kumar Madduri (2011-03-23):
Hi Randolf,

I am trying to create some missing indexes using this approach, but the SQL file contains all the 'create index' statements. So when the job executes there are errors ("name already used by existing object"), which is OK. Some indexes get created, but I also get an error like this:

ORA-27477: "KMADDURI.KUMAR_1" already exists

These are the parameters I pass for the SQL file:

OS Path             : /linux_migration/OF2INT
Filename            : xxdl_indexes2.sql
Serial Degree       : NULL
Parallel Degree     : 1
Worker Thread 1 Name: KUMAR
Worker Thread 2 Name: KUMAR

Kumar