Issue with index import

Hello All,
I have installed a new database and am importing a Data Pump dump into a newly created schema.
The data is around 150 GB. When I invoked the import, the data was pumped in just two hours, whereas the index creation part is taking far longer than usual (a day and a half). The same behaviour was seen last time too, in this same database.
What could be the reason for this? It recurs only in this DB, for the same dump.
Could it be some init.ora setting I have missed?
Our Oracle version is 10.2.0.4 running on Solaris 10. The disks are RAID only.
Thanks
Vijay G

Please post the impdp command used, along with the last 50 lines of the impdp log.
HTH
Srini
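
If the log confirms that the index-building phase is where the time goes, one common workaround (a sketch only, not from this thread; the schema, directory and dump file names are placeholders) is to exclude indexes from the import, extract their DDL into a script, and run that script separately so it can be tuned:

impdp system/<pwd> SCHEMAS=myschema DIRECTORY=dp_dir DUMPFILE=schema.dmp EXCLUDE=INDEX
impdp system/<pwd> SCHEMAS=myschema DIRECTORY=dp_dir DUMPFILE=schema.dmp INCLUDE=INDEX SQLFILE=create_indexes.sql

The generated create_indexes.sql can then be edited (for example to add PARALLEL clauses) and run in SQL*Plus. Since index builds sort in PGA memory, comparing pga_aggregate_target between this database and one where the same dump imports quickly would also be a reasonable first check.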

Similar Messages

  • HT1844 Library Issue:  I just imported 2 CDs into iTunes.  They show up in "Recently Added" and on my other devices via iTunes Match, but NOT in the iTunes software on the computer where I added them.

    Library Issue:  I just imported 2 CDs into iTunes.  They show up in "Recently Added" and on my other devices via iTunes Match, but NOT in the iTunes software on the computer where I added them.  They show up in the search box, but when I click on the results it just kicks me to ALL my music.  I have imported thousands of songs/CDs before and this is a new one.

    Something may have gone wrong with the index of the Music playlist. Making another change to the library such as downloading the current iTunes Free Single of the Week, or deleting one track from your library and reimporting it, should fix the problem.
    If that doesn't work, close iTunes and delete the hidden file 'sentinel' from inside the main iTunes folder, then start iTunes again. It should run a consistency check when it starts up, which ought to correct things.
    tt2
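
    For reference, the sentinel deletion can be done from Terminal (a sketch, assuming the default library location; the file sits in the main iTunes folder and is normally invisible in the Finder):

    rm ~/Music/iTunes/sentinel

    On the next launch iTunes should run the consistency check described above and recreate the file.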

  • Iteration Speed issue when Indexing a 3D array wired to a while-loop

    Brief Description of my Program:
    I am working on real-time signal generation using LabVIEW and DAQmx (PCI-6110 card). My VI reads a big data file (typically an 8 MB txt file containing about 400,000 complex double-precision samples). The signal is then pre-processed, and I end up with a huge 3D array feeding a while-loop (typically the 3D array dimension is N x 7 x M where N & M >> 7). Inside the loop, that 3D array is indexed and processed before the results are written to the DAQmx outputs. I have a speed issue when indexing the large 3D array (i.e., a 3D array having a large sub-array size). My while-loop could not run fast enough to top up the DAQmx AO buffer (at the output rate of 96 kHz). It could run faster only if I use a smaller 3D array (i.e., smaller-sized sub-arrays). I do not quite understand why the size of the 3D sub-array affects the rate of looping, although I am indexing the same sub-array size at each iteration. I really appreciate your comments, advice and help.
    I include my 3D array format as a picture below.
    Question on LabView:
    How does indexing a 3D array which is wired to the while-loop affect the speed of the loop iteration? I found that a large sub-array dimension in the 3D array slows down the iteration speed, compared to indexing the same size of sub-array from a 3D array with smaller sub-arrays to perform the signal processing inside the while-loop. Why? Is there any other way of designing the LabVIEW program to improve the speed of iteration?
    attachment: 3Darray.jpg

    Thank you all for your prompt replies and your interests. I am sorry about my attachment. But, I have now attached a jpg format image file as you suggested.
    I had read the few papers on large data handling such as "LabVIEW Performance and Memory Management". Thus, I had already tried to avoid making unnecessary copies of data and growing arrays in my while-loop. I am not an expert on LabView, so I am not sure if the issues I have are just LabView fundamental limitations or there are any other ways to improve the iteration speed without reducing the input file size and DAQ output rate.
    As you request, I also attach my top-level VI showing essential sections such as the while-loop and its indexing. The attached file is an image in jpg format because the actual VI, including sub-VIs, is as big as 3 MB in total. I hope my attachment is useful for anyone who would like to reply to my question. If anyone would like to see my whole VI & llb files, I would be happy to send them by e-mail privately, so please provide your e-mail address.
    The dimension of my 3D array is N x 7 x M (Page x Row x Column), where N represents the number of pages in the 3D array, and M represents the size of the 1D array.  The file I am currently using forms a 3D array of N = 28 & M = 10,731.  Referring to the top-level VI picture I attached, my while-loop indexes each page per iteration and wraps around.  The sub-VI called "channel" inside the while-loop further indexes its input (a 2D array) into seven 1D arrays for other signal processing.  The output from that "channel" sub-VI is the superposition of those seven arrays.  I hope my explanation is clear. 
    Attachments: 3Darray.jpg and MyVI.jpg
    Kind Regards,
    Shein
    Attachments:
    3Darray.jpg (30 KB)
    MyVI.jpg (87 KB)

  • Issue with Contract Import

    Hello Folks,
    I am facing an issue with Contract Import. The contract import file has been stuck in the Contract Document importer, and for the past 4 days the status has shown as Running. Even the trace file just shows "Mon Mar 16 18:05:51 CET 2015          Error during action"
    Is there any way where the import can be stopped or deleted?
    Any help is highly appreciated!!!
    Thanks in Advance!
    Regards,
    Vignesh

    Thanks John and Peter for your support. Unfortunately I cannot provide a screenshot of the issue because of the confidential material, and I have had to re-do the work. I have re-done all the Excel corrections and re-created the InDesign merge document on a PC laptop. So far, no problems. I have a strong suspicion that the cause was a macro used in Excel to convert text, or another one that I used to add values to the column that InDesign had an issue with. InDesign should definitely have reported a problem before merging the data. Yes, I am using CS5.5 with the latest update and Excel 2010 on a Mac. Still, I would like to find out what the letter 'A' in the Data Merge panel means, since this may have directed me to find why the multiple 'AAA...' is shown in the first line in the merge panel. Again, thanks John and Peter: much appreciated.

  • Index import is not working

    I have TCS II, and I have a RoboHelp project linked to a FrameMaker book. The import goes OK for the most part, but I'm having issues with the index (and the TOC, but I created that manually in RH).
    FM generated my index without issues, but when I try to import that index into RH, all I get is an HTML file with the index and page numbers. The index file in the Index pod has no keywords. I apologize if I'm missing something obvious. Any suggestions?

    Maybe this will help you.
    import javafx.stage.*;
    import javafx.scene.*;
    import javafx.scene.layout.*;
    import javafx.scene.control.*;

    def toggleGroup = ToggleGroup {};
    var message: String[] = ["Display a slide show of car images.",
                "Display the Car Race Simulation",
                "Exit"];
    // build the three radio buttons in one sequence comprehension
    var radioButton: RadioButton[] = for (index in [0..2]) {
        RadioButton {
            toggleGroup: toggleGroup
            text: "{index + 1} {message[index]}"
        }
    };
    // bound sequence tracking each button's selected state
    var radioButtonSelectedOrNot: Boolean[] = bind for (index in [0..2]) { radioButton[index].selected };
    def stage1 = Stage {
        scene: Scene {
            width: 600
            height: 400
            content: [
                VBox {
                    spacing: 5
                    content: [ radioButton ]
                }
            ]
        }
    };

  • OBIEE 11g SSL Configuration Issue : Unable to import the Server certs

    Hello All,
    We are trying to configure OBIEE 11.1.1.6.0 with SSL using Windows server 2003 (IIS) and facing some issues with that.
    Followed the document : OBIEE11g SSL Setup and Configuration [1326781.1]
    http://obieedue.blogspot.sg/2012/08/obiee11g-ssl-setup-and-configuration.html
    and also completed generating the required certificate signing request and keystores for SSL communication, and sent the request to the CA (IT Admin team) to have the certificate signed. The issue comes when I am trying to import the CA certificate (root certificate) and the server certificate into the Java keystore.
    I am importing the Root CA Certificate first which is successfully added to the keystore.
    keytool -import -trustcacerts -alias mycacert -file cacert.pem -keystore mykeystore.jks -storepass Welcome1
    Trust this certificate? [no]: yes
    Certificate was added to keystore.
    But when trying to add the Server Certificate to the keystore using the command below :
    keytool -import -v -alias testserver -file server.cer -keystore mykeystore.jks -keypass Welcome1 -storepass Welcome1
    Certificate reply was installed in keystore
    I get the following error:
    keytool error: java.lang.Exception: Failed to establish chain from reply
    java.lang.Exception: Failed to establish chain from reply
    at sun.security.tools.KeyTool.establishCertChain(KeyTool.java:2662)
    at sun.security.tools.KeyTool.installReply(KeyTool.java:1870)
    at sun.security.tools.KeyTool.doCommands(KeyTool.java:807)
    at sun.security.tools.KeyTool.run(KeyTool.java:172)
    at sun.security.tools.KeyTool.main(KeyTool.java:166)
    I read many forums and tried to convert it to the PKCS#7 format and import the cert into the identity keystore, but was not successful in that either. I have also checked with the IT Admin team and found there is only one root CA and no intermediate CAs.
    Please advise if anyone has seen similar issues or has suggestions.
    Thanks in advance,
    SVS

    Hi,
    One obvious reason would be that you did not specify -trustcacerts, and the root CA is not included in the present server keystore. In that case, using the -trustcacerts option would solve the problem, if the root CA is indeed in the JDK cacerts.
    To print out the certificates present in the JDK cacerts, use the following command:
    keytool -list -keystore <JAVA_HOME>/jre/lib/security/cacerts -storepass changeit -v
    Then check if the root CA that signed your server certificate is present, and has not expired (in which case, you would need to re-import a newer one into cacerts).
    Another common reason for that error message is when you have used a proprietary CA to sign your server certificate. Then it would obviously not be in the JDK cacerts. The solution in that case is to import your proprietary root CA into the JDK cacerts, using the following command:
    keytool -import -keystore <JAVA_HOME>/jre/lib/security/cacerts -file yourRootCA.pem -storepass changeit -alias youralias
    A third reason for that error message is when your server was signed by an intermediate certificate. In that case, you would have received from your CA a chain of certificates. One way to solve this (not the only one, but this one works well): Prepend your intermediate CA file to your server cert file, and import the obtained concatenated file into the server keystore. Be careful, the intermediate CA must be BEFORE the server cert. Example:
    copy rootca.cer certchain.p7b
    type server.cer >> certchain.p7b
    The file certchain.p7b will be the concatenation of the intermediate CA and the signed server cert. Then import the newly created file under the key alias as follows:
    keytool -import -keystore serverks.jks -file certchain.p7b -alias yourkey -trustcacerts
    If you only prepend the intermediate CA, you must make sure that the final root CA is in cacerts. But you can also prepend your whole chain of trust inside the server keystore.
    Regards,
    Kal
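
    Before retrying the failed import, it can also help to verify what is already in the keystore (a quick check, reusing the keystore name and password from the commands above):

    keytool -list -v -keystore mykeystore.jks -storepass Welcome1

    The verbose output shows each entry's certificate chain, so you can confirm that the root CA entry is present and see whether the 'testserver' reply established a complete chain.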

  • Performance Issue on Traditional Import and Export

    Scenario
    =====
    Oracle 9i Enterprise Edition (9.2.0.8)
    Windows Server 2003 (32 bit)
    --- to ---
    Oracle 11g Enterprise Edition (11.2.0.3)
    Windows Server 2008 Standard R2 (64 bit)
    Hi to all
    I'm doing an upgrade from 9i to 11g and I am using native imp/exp to migrate the data. For my 1st round of testing, I have done the following:
    1) Full DB export from 9i: exp system/<pwd>@db FULL=Y FILE=export.dmp log=export.log
    Encountered the warning "EXP-00091: Exporting questionable statistics." (In hindsight, I know that I need to set the character set as per my server before exporting.) Nevertheless, I proceeded with this 8.4 GB dmp file. The character set in 9i is US7ASCII. My export took 1 hour; my 9i database is actually only a small 26 GB.
    2) Full import to 11g. My 11g is a newly created DB with character set WE8MSWIN1252. I know that schemas and objects will be automatically created in 11g.
    3) However, the problem I face is that this import of data has been running for the past 4 hours and counting, and it is still not done.
    My question is:
    Is it because of the difference in character set between the dmp file and 11g that is causing this import to be slow? Could it be that character set conversion is causing high overhead?
    OR
    Is it because I exported all the indexes from 9i and now, during import, it is taking a long time to create those indexes in 11g? Should I have done a full export but set INDEXES=N, created an index creation script from 9i, and run it in 11g, so as to save time?
    OR
    Are both of the above causing the import to keep on running? Or ??
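    One way to test the index theory from question 2 above (a sketch only; the connection string and file name follow the example in the original post) is to import the data without indexes, and have imp write the index DDL to a script instead:

    imp system/<pwd>@dbnew FULL=Y FILE=export.dmp INDEXES=N LOG=import_noindex.log
    imp system/<pwd>@dbnew FULL=Y FILE=export.dmp INDEXFILE=create_indexes.sql

    With INDEXFILE, imp only writes the DDL to create_indexes.sql; that script can then be edited (for example to add PARALLEL) and run in SQL*Plus after the data import finishes.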

    Hi to all
    All my tablespaces in the Oracle 9i database are on a 4096 block size, but I pre-created those same tablespaces in 11g on an 8192 block size.
    Am I right to say that it is the difference in block size that is slowing down this import?
    If so, how can I solve this problem? If I use "IGNORE=Y" in my import statement, will it help? Thanks.
    I think I will follow my 9i DB and create the tablespaces in 11g with a 4096 block size.
    Here's the new server (11g) specs (i know this is not officially supported by oracle yet):
    Win Server 2012
    HP Proliant DL360p
    Intel Xeon CPU @ 2GHz
    8GB Ram
    ======
    Logfile
    ======
    C:\>imp system/<pwd>@dbnew full=y file=export.dmp
    Import: Release 9.2.0.1.0 - Production on Thu Nov 22 11:29:08 2012
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit
    Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Export file created by EXPORT:V09.02.00 via conventional path
    Warning: the objects were exported by OEM, not by you
    import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    IMP-00017: following statement failed with ORACLE error 29339:
    "CREATE UNDO TABLESPACE "UNDOTBS" BLOCKSIZE 4096 DATAFILE 'F:\ORADATA\UNDO\"
    "UNDOTBS.DBF' SIZE 5120 , 'F:\ORADATA\UNDO\UNDOTBS1.DBF' SIZE 5120 "
    " EXTENT MANAGEMENT LOCAL "
    IMP-00003: ORACLE error 29339 encountered
    ORA-29339: tablespace block size 4096 does not match configured block sizes
    IMP-00017: following statement failed with ORACLE error 29339:
    "CREATE TEMPORARY TABLESPACE "TEMP" BLOCKSIZE 4096 TEMPFILE 'D:\ORADATA\TEM"
    "P\TEMP.DBF' SIZE 7524 AUTOEXTEND ON NEXT 20971520 MAXSIZE 8192M EXTE"
    "NT MANAGEMENT LOCAL UNIFORM SIZE 10485760"
    IMP-00003: ORACLE error 29339 encountered
    ORA-29339: tablespace block size 4096 does not match configured block sizes
    IMP-00015: following statement failed because the object already exists:
    "REVOKE "OEM_MONITOR" FROM SYSTEM"
    IMP-00015: following statement failed because the object already exists:
    "CREATE ROLE "HS_ADMIN_ROLE""
    IMP-00015: following statement failed because the object already exists:
    "REVOKE "HS_ADMIN_ROLE" FROM SYSTEM"
    . importing O3's objects into O3
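    The ORA-29339 errors in the log above occur because the 11g instance has no buffer cache configured for 4K blocks, so the CREATE TABLESPACE ... BLOCKSIZE 4096 statements from the dump fail. If you do want 4096-byte tablespaces in an 8K database, a cache for that block size has to be configured first (a sketch; the 64M size is an arbitrary assumption, and SCOPE=BOTH requires an spfile):

    ALTER SYSTEM SET db_4k_cache_size = 64M SCOPE=BOTH;

    Alternatively, those two failures are harmless if UNDO and TEMP were already pre-created with the default 8K block size: the block size mismatch makes tablespace creation fail, but it is not what makes the import slow.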

  • Issue with index creation of an infocube.

    Hi,
    I have an issue with creating indexes for an InfoCube in SAP BI. When I create indexes using
    a process chain for the cube, the step fails in the process chain. When I try to check the indexes for the
    cube manually, the following message is shown:
    *The recalculation can take some time (depending on data volume)*
    *The traffic light status is not up to date during this time*
    I even tried to repair the indexes using the standard program "SAP_INFOCUBE_INDEXES_REPAIR" in SE38,
    but that leads to a dump.
    Dear experts, please suggest a way forward on the above issue. 
    Regards,
    Prasad.

    Hi,
    Please check the Performance tab in the cube's Manage screen and try doing a repair index from there.
    This generates a job, so check the job in SM37 and see if it finishes. If it fails, check the job log, which will give you the exact error.
    These indices are F fact table indices, so if nothing works, try activating the cube with the program 'RSDG_CUBE_ACTIVATE' and see if that resolves the issue.
    Let us know the results.
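
    If the underlying database is Oracle, you can also check the state of the F fact table indexes directly (a sketch; '/BIC/FMYCUBE' follows the usual naming convention for a custom cube's F table and must be adjusted to your InfoCube):

    SELECT index_name, status FROM dba_indexes WHERE table_name = '/BIC/FMYCUBE';

    Any UNUSABLE indexes in that output would explain the process chain failure and point to what the repair job needs to fix.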

  • Issue with the import of hyperlinks from FM to Robohelp

    Hi,
    I am trying to produce online help (WebHelp) using RoboHelp 9 based on FrameMaker files (FM8). Previously I was working with WebWorks ePublisher (which was way easier to use since there wasn't much to customize...). I'm linking (not importing) the book to the RH project.
    Now I'm having trouble with the following points:
    1. Table of contents: the TOC imported from FM to RH does not work properly; some sections should be converted into a book but are ignored by RH, so I had to create a TOC manually. If anyone knows where this could come from, please help me. Of course, I checked the styles, and the TOC in FM perfectly reflects the title levels...
    2. Hypertext links: this is the biggest issue. I have around 3000 pages, so I can't recreate each hyperlink manually. In FM, I use the "Hypertext" marker and then I put "openlink myfile.fm:firstpage", which works perfectly fine in the PDF output. However, in RH, the marker is totally ignored, and there is no hyperlink in the output file. I checked the source code of the htm file, but there is absolutely nothing that suggests a link (no <a href="...">).
    I tried to update the project, and tried other markers like URL myfile.fm, but it didn't change anything...
    Also, I tried generating the online help directly using FM10 (trial version of the TCS, until I can confirm that it is worth it), but the TOC in the online help didn't correspond to the one in FM, so I gave up and used RH to create the TOC manually...
    Thanks in advance for your help.
    PS.: I'm not a native English speaker, so sorry if the language is a little messy...

    Hi,
    Thanks for your answers.
    I tried generating the help using FM10 then File > Publish, but the TOC and the hyperlinks aren't working any better. I also tried converting the book into FM10 format, then linking it to a RH9 project, but the results were the same.
    The titles which are ignored in RH have the same styles as the titles which are not ignored; only their names are different, and the ignored titles are numbered.
    Also, these titles correspond to a file in FM (which contains a small chapter TOC), and that file is totally ignored as well.
    Does anyone create links between files of the same book? I also tried using FM10 then RH9, but the results are the same as well.
    Thanks in advance.

  • Issue with Export / Import Configuration Data in AC 5.3 migration to AC 10

    Hi,
    We are migrating from AC 5.3 to AC 10. As per the Migration Guide I have exported the data from AC 5.3.
    The export created around 57 files on the destination with .dat extension.
    Moved these files to target server Import location.
    Then I used the utility (GRAC_DATA_MIGRATION) in AC 10, selected the import location, and clicked Get Files.
    It uploaded some of the files (21 out of 57).
    When I analyzed the uploaded files, I found it had uploaded only "Common Configuration" data. For the objects below, though the export utility created files, the import utility did not upload them.
    1. Business Unit
    2. Critical Roles and Profiles
    3. CUP Role Repository
    4. Mitigation Controls
    5. SOD Rules
    6. Workflow
    Does anyone have any idea about this issue?
    Regards,
    Deepak

    Deepak,
    Can you share how you are importing the config data, and what path you are using? Please also share how your systems are distributed.
    Regards,
    Faisal

  • Adobe Bridge issue with index.html files

    Hi, I have a perplexing problem... Three weeks ago, I created a web photo gallery in Bridge. I transferred it to my website via FTP and it worked like a charm. Three days later I created another web gallery, transferred it to my website using my FTP client, and the address of what I uploaded takes me to a blank page. I contacted my web hosting support and was told it looks like an issue with my index.html file. Here is a link to the gallery that is working: www.janieblanchard.com/galleries/prettylights/index.html
    Here is the link to the site that is not working:
    www.janieblanchard.com/galleries/macrogallery/index.html
    Any advice would be so helpful; I've spent too many hours trying different galleries and uploading multiple times.
    Thanks!

    What exact camera make and model?
    What specific, exact version of Adobe Camera Raw (ACR) plug-in?
    What specific, exact versions of Bridge and of Yosemite?
    BOILERPLATE TEXT:
    Note that this is boilerplate text.
    If you give complete and detailed information about your setup and the issue at hand,
    such as your platform (Mac or Win),
    exact versions of your OS, of Photoshop (not just "CS6", but something like CS6v.13.0.6) and of Bridge,
    your settings in Photoshop > Preference > Performance
    the type of file you were working on,
    machine specs, such as total installed RAM, scratch file HDs, total available HD space, video card specs, including total VRAM installed,
    what troubleshooting steps you have taken so far,
    what error message(s) you receive,
    if having issues opening raw files also the exact camera make and model that generated them,
    if you're having printing issues, indicate the exact make and model of your printer, paper size, image dimensions in pixels (so many pixels wide by so many pixels high). if going through a RIP, specify that too.
    etc.,
    someone may be able to help you (not necessarily this poster).
    a screen shot of your settings or of the image could be very helpful too.
    Please read this FAQ for advice on how to ask your questions correctly for quicker and better answers:
    http://forums.adobe.com/thread/419981?tstart=0
    Thanks!

  • Issues while Auto Import

    Hi Gurus,
    I am facing an issue while trying to import articles into MDM automatically. The XML just fails and goes from the Ready folder to the Exception folder.
    I don't know why this happens, because when I open the same XML in Import Manager (with the same map) and click on the Import tab, it doesn't give any message like, for example, "this segment is to be mapped".
    It just says Ready for Import. I don't know why it's failing when importing automatically.
    The import log says:
    "Some portions of this import map are out of date.<LF/> Solution: Please launch the Import Manager GUI, preferably using the original source file that the map was generated for, otherwise the same source file, and Save Update [File->Save Update] the map"
    Has anyone come across such an issue? I won't be able to open each XML manually and then import it.
    Is there any setting I need to make to avoid this?
    Regards
    Vikrant M Kelkar

    Hi Vikrant,
    I have faced this problem before while working on a project, and I was able to resolve it.
    This problem comes when the map that you saved in the Import Manager and used in the Console port settings becomes outdated.
    The error comes because you have some field in the current source file that you are trying to import which was not in the file you used to save the map. Please refer to the log; it will be given there as "Undefined element".
    While working with Import Server, you will have to open the source file with the map that you saved. And, as Michael also said, go to File -> Save Update.
    If it still doesn't work, then you will have to manually add the undefined field in the source file that you used to save the map, and then save the map again. Your saved map should have all the fields which are there in the current file.
    Hopefully it will work.
    Thanks and Regards
    Nitin Jain

  • File permission issue when indexing from JSP

    HI
    I have a class that I intend to use both from a batch file and from a JSP, which builds a search index (Lucene) somewhere in the file system outside the web root. The class works great when called from the batch script, but when I try to use it in the JSP page I get this error:
    [#|2004-04-23T09:56:41.155-0400|INFO|sun-appserver-pe8.0|javax.enterprise.system.stream.out|_ThreadID=13;|
    caught a class java.security.AccessControlException
    with message: access denied (java.io.FilePermission C:\AMIR\nemours_internet\common\search_index\html\segments delete)|#]
    Can someone tell me if this is a setting in PE8 server or do I have to do something within my class code?
    Thanks
    Following is my class (BuildPageIndex.java) and the snippet of the JSP file where I call it...
    <<<< package and imports omitted >>>>
    public class BuildPageIndex {
         private StringBuffer sbOut;

         public BuildPageIndex() {
         }

         public BuildPageIndex(String pathToCMTXML) {
              CMTController.loadCMTXML(pathToCMTXML);
         }

         public void buildIndex() {
              try {
                   sbOut = new StringBuffer();
                   // get all the page objects from the database
                   ArrayList pageList = CMTPersister.getInstance().getAllPages();
                   if (pageList.size() == 0) {
                        sbOut.append("<Lucene> No HTML pages in the database...Exiting...\n");
                        return;
                   }
                   // obtain index location from XML config file
                   HashMap searchMap = CMTController.getInstance().getSearchConfig();
                   String strIndexLoc = (String) searchMap.get("LOCATION");
                   sbOut.append("\n\n\n");
                   // if backup management fails, abort index creation
                   if (!(manageBackup(strIndexLoc)))
                        return;
                   // create index and start timing
                   sbOut.append("<Lucene> Creating index for HTML pages in: " + strIndexLoc + "\n");
                   Date start = new Date();
                   IndexWriter ixdBuilder = new IndexWriter(strIndexLoc, new StandardAnalyzer(), true);
                   // for all pages ...
                   for (int i = 0; i < pageList.size(); i++) {
                        // get a page
                        Page thisPage = (Page) pageList.get(i);
                        // create a lucene document from the page
                        sbOut.append("<Lucene> Page [" + thisPage.getURL().trim() + "] added \n");
                        Document pg_doc = PageDocument.Document(thisPage);
                        // add the page document to the index
                        ixdBuilder.addDocument(pg_doc);
                   }
                   // optimize and close index
                   ixdBuilder.optimize();
                   ixdBuilder.close();
                   // calculate the time it took to build the index
                   Date end = new Date();
                   long msecs = end.getTime() - start.getTime();
                   sbOut.append("<Lucene> HTML Index created in: " + msecs + " milliseconds ("
                             + (float) msecs / 1000.00 + " seconds) on " + pageList.size() + " pages\n\n");
              } catch (Exception e) {
                   System.out.println(" caught a " + e.getClass() + "\n with message: " + e.getMessage());
              }
         }

         public String getOutput() {
              return this.sbOut.toString();
         }

         private boolean manageBackup(String loc) {
              boolean retVal = true;
              // rename the existing index location to _BACKUP so we keep it
              File currIdx = new File(loc);
              File backIdx = new File(loc + "_BACKUP");
              if (currIdx.exists()) {
                   sbOut.append("<Lucene> Index directory already exists!\n<Lucene> Backing it up to [" + backIdx.getPath() + "]...\n");
                   if (backIdx.exists()) {
                        sbOut.append("<Lucene> Backup directory also exists! Will delete it first...\n");
                        if (!(Utils.deleteDirRecursive(backIdx))) {
                             sbOut.append("<Lucene> Failed deleting backup directory! Aborting...\n");
                             retVal = false;
                        }
                   }
                   if (!(currIdx.renameTo(backIdx))) {
                        sbOut.append("<Lucene> Problem backing up existing index! Aborting...\n");
                        retVal = false;
                   }
              }
              return retVal;
         }

         /**
          * Class has a main so it can be run from a batch file
          * or the command line
          * @param args
          */
         public static void main(String[] args) {
              if (args.length == 0) {
                   System.err.println("Usage: BuildPageIndex <path_to_cmt.xml_file>");
                   System.exit(0);
              }
              BuildPageIndex bpi = new BuildPageIndex(args[0]);
              bpi.buildIndex();
              System.out.println(bpi.getOutput());
         }
    }
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    JSP snippet
    +++++++++++++++++++++++++++++++++++++++++++++
    <%
         BuildPageIndex bpi = new BuildPageIndex();
         bpi.buildIndex();
         String results = bpi.getOutput();
    %>
         <%=results%>

    Hi Zambak,
    It looks like you might need to give your JSP permission to delete the file that it is complaining about. One way to do this would be to add a FilePermission to the server's security policy file (located at <PE install dir>/domains/domain1/config/server.policy). Adding the following would grant file delete permissions to all files for all codebases...
    grant {
         permission java.io.FilePermission "<<ALL FILES>>", "delete";
    };
    After modifying the policy file, you will need to restart the domain by executing the following...
    <PE install dir>/bin/asadmin stop-domain
    <PE install dir>/bin/asadmin start-domain
    It is also possible to further restrict permissions to certain codebases, or to further open the set of file permissions (adding write, read, delete for example). For more information on how to do this, please refer to the J2SE security documentation located at http://java.sun.com/j2se/1.4.2/docs/guide/security/permissions.html .
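
    As a narrower alternative to granting delete on all files (a sketch only; the codeBase URL and the index path are assumptions based on the error message above, and would need adjusting to the actual deployment), the grant can be scoped to deployed applications and to the index directory tree:

    grant codeBase "file:${com.sun.aas.instanceRoot}/applications/-" {
         permission java.io.FilePermission "C:${/}AMIR${/}nemours_internet${/}common${/}search_index${/}-", "read,write,delete";
    };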

  • Having issue trying to import large (23min) clip into iMovie 08 and FCEx 4.0.1

    On my iMac (2.0 GHz Intel Core Duo, 2 GB RAM) running OS X 10.6.8 (Snow Leopard), I have tried with no success to import a 23 min AVCHD clip into either
    iMovie 08 or FCE 4.0.1.  I know my Mac, OS and software are antiquated by today's standards, but that said, I have never had this issue before.
    Footage was shot with a Sony NX5U camera using SDHC (class 10) media.  The event I shot has 54 video clips on the card.  All but this large clip
    will import, so I don't think it's some kind of file structure issue.  Otherwise I think none of the clips would have imported into either editing program.
    Each (iMovie 08 and FCE) will begin the import process for this large clip. iMovie begins to import, but in the end only 11:28 of the full 23 min gets imported.
    FCE, after beginning to import, eventually aborts the import.  I know in the past I've been able to import long clips (15-20 min) with no issues.  Maybe
    someone here has a solution.  Dare I say I need to bump my RAM up to 6 GB, upgrade to Yosemite, or upgrade to the latest iMovie or FCPX?  I have considered trying
    Adobe Premiere.  Right now I'm wide open for any suggestions on how to tackle this problem.

    Hi M
    First thing that comes to mind: have you just installed Final Cut on
    your Mac? There is a known problem with this, where you have to
    go to the Library folder and move out the FCE or Pro files to make "share back
    to camera" work in iMovie.
    my notes:
    This can happen with iMovie 3 or iMovie 4. This does not occur with iMovie HD 5.0.
    The solution is to temporarily move the following files from /Library/QuickTime/ to the Desktop:
    Final Cut Pro HD
    • DesktopVideoOut.component
    • DVCPROHDVideoOutput.component
    Final Cut Express HD
    • DesktopVideoOut.component
    Restart the computer and try exporting from iMovie to your camera.
    When you need to use Final Cut Pro HD or Final Cut Express HD again, drag these files back to /Library/QuickTime / and restart the computer.
    Yours Bengt W
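
    For reference, the file moves Bengt describes can be done from Terminal (a sketch; only move the components that are actually present on your system, and move them back afterwards as he notes):

    mkdir -p ~/Desktop/QT-components
    sudo mv /Library/QuickTime/DesktopVideoOut.component /Library/QuickTime/DVCPROHDVideoOutput.component ~/Desktop/QT-components/

    Then restart the computer and retry the export/import.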

  • Issues with MDL imports in OWB

    I am trying to import MDL files. Some MDLs imported successfully, but when I started importing mapping MDLs I got the error below. I am completely new to this OWB world; can someone help me with this issue?
    * Oracle Warehouse Builder Import Log File
    * Created: Dec 17, 2011 7:47:05 PM
    * OWB Release: 11.1.0.7.0 OWB Repository Version: 11.1.0.7.0 MDL Release: 10.2
    * User: null Connect String: null
    * Data File: /u01/oracle/MOC/db/tech_st/11.1.0/owb/rtp/sql/mdl/r12rfrmp.mdl
    * Log File: /u01/oracle/MOC/db/tech_st/11.1.0/owb/rtp/sql/mdl/r12rfrmp_imp.log Log Message Level: ALL
    * Character Set: UTF8
    * Mode: UPDATE Ignore Universal Identifier: Y
    Import started at Dec 17, 2011 7:47:05 PM
    Importing into workspace OWBADMIN
    Informational: MDL1328: PROJECT MTH not imported since the object in MDL file is the same as the object in the workspace.
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR> not found for OPERATOR <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.MTH_ENTITY_PLANNED_USAGE_ERR>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR> not found for GROUP <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.INOUTGRP1>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ENTITY_FK> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ENTITY_FK>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ENTITY_TYPE> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ENTITY_TYPE>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_AUDIT_DETAIL_ID> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_AUDIT_DETAIL_ID>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_AUDIT_RUN_ID> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_AUDIT_RUN_ID>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_ERROR_ID> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_ERROR_ID>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_ERROR_OBJECT_NAME> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_ERROR_OBJECT_NAME>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_ERROR_REASON> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_ERROR_REASON>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_OPERATOR_NAME> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_OPERATOR_NAME>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_SEVERITY> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_SEVERITY>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR_CODE> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR_CODE>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.PERIOD_OF_USAGE> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.PERIOD_OF_USAGE>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.PLANNED_USAGE> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.PLANNED_USAGE>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.REPROCESS_READY_YN> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.REPROCESS_READY_YN>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.SUSTAIN_ASPECT_FK> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.SUSTAIN_ASPECT_FK>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR> not found for OPERATOR <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.MTH_ENTITY_PLANNED_USAGE_ERR_1>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR> not found for GROUP <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.INOUTGRP1>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ENTITY_FK> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ENTITY_FK>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ENTITY_TYPE> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ENTITY_TYPE>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_AUDIT_DETAIL_ID> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_AUDIT_DETAIL_ID>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_AUDIT_RUN_ID> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_AUDIT_RUN_ID>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_ERROR_ID> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_ERROR_ID>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_ERROR_OBJECT_NAME> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_ERROR_OBJECT_NAME>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_ERROR_REASON> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_ERROR_REASON>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_OPERATOR_NAME> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_OPERATOR_NAME>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR$$$_SEVERITY> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR$$$_SEVERITY>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.ERR_CODE> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.ERR_CODE>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.PERIOD_OF_USAGE> not found for ATTRIBUTE <MTH.MTH_TARGET.MTH_ENT_PLANNED_USG_ES_ALL_MAP.PERIOD_OF_USAGE>
    Warning: MDL3002: ReferencingObject <MTH.MTH_TARGET.MTH_ENTITY_PLANNED_USAGE_ERR.PLANNED_USAGE> not found for ATTRIBUTE

    It sounds like the MDL has mappings, and the dependent tables referenced by the mappings are not in that MDL... they may be in another file, which you should import first. If not, then export them from a repository which has the tables.
    Cheers
    David
