Export data file directory

What's the default directory when exporting data from Application Manager? Why can't I find the file I just exported, or choose the directory, when I try to reload the data?

The default directory is $ARBORPATH/app. But you can export to the database directory by specifying $ARBORPATH/app/appname/dbname/filename.txt, so that you can load the file directly from App Manager under server files.
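For example, with a hypothetical application named Sample and database named Basic, exporting to $ARBORPATH/app/Sample/Basic/expdata.txt makes the file show up as a server file for that database, so it can be selected directly when loading.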

Similar Messages

  • Export data File Browse dialog box does not appear

    I am trying to export data to CSV, and when I click on Browse to locate the proper directory, nothing happens (the dialog doesn't show up).
    I normally right-click on any cell in the "Result" tab and choose Export->CSV.
    On the same note, CSV is a workaround to get to Excel. Is there an extension or option to save directly in Excel format? This would avoid having the client open the file in Excel and then do a Save As to the proper XLS format instead of CSV.
    Using a Win XP client connecting to a 10.2 database, with the latest SQL Developer version (downloaded on Friday, May 5th, version 1.0.0.15.27).
    Thanks
    Robert

    I haven't seen the browse button problem that you describe. Is it still occurring? On all exports? There have been issues with export re-running the query a few times, which makes the application appear to lock up. Could that be the issue?
    There is an Excel add-in. I assume it isn't included due to licensing issues, but it comes from one of the core developers of this app:
    esdev.sf.net.
    Regards,
    Eric

  • Need to rebuild Outlook 2011 for mac identity for exporting data file.

    I was exporting an Outlook for Mac data file (to import it into Apple Mail) when the Outlook email client suddenly crashed, and when I tried again it said that I need to rebuild the Outlook for Mac database using the rebuild utility. I took help from some professional technicians but it was a dead end; they said that my Outlook for Mac database was corrupt and they could not help me any further. I have years of emails in Outlook for Mac that I need to move to Apple Mail, as I am fed up with these frequently occurring Outlook 2011 errors. If anyone could help me in this case it would be very helpful.
    Thanks

    Outlook for Mac archives its data in .olm format and originally stores its mail in .OLK format. If everything has failed, then you can seek the help of a tool that can recover these .olk14 messages. One tool I know that is capable of doing so is http://www.outlookmacdatabaserecovery.com/.
    Look for the trial version first and see if it works and recovers your emails.

  • Exported data files change clarification

    We are migrating a client from Oracle CRM On Demand to another CRM product. A couple of months ago we performed a full export and mapped all the needed fields from the current Oracle system to the new CRM system. This morning I went in and performed another full export to start the actual migration based on all the mappings we had done previously. This time there were quite a few new data files in the export, and I believe this may be due to "Release 21", which was just implemented. It appears initially that these new files are duplicate records associated with the main record; that is, there is now an "Account Opportunity" data file where before it didn't exist. I looked online but cannot find any specifics on this. Can someone please let me know if this is the case, or whether there is a document online somewhere that has this information? Thanks in advance for the help.

    Hi Dave,
    I've got this working on the copy of the rulebase I had; it's because your entity/relationship public names don't match the .xds file, possibly as a result of the upgrade. I've sent you an email about this.

  • Version 2.1.0.63 - export data file is locked

    I followed the tutorial for Getting Started with Oracle SQL Developer - Adding Data to a Table
    After adding a total of 5 rows to the newly created table DEPENDENTS, I followed the instructions to export to a .csv by right-clicking on a cell and exporting to a file, e.g. C:\DEPENDENTS.csv.
    Although I can find the file, I can't open it with Excel 2003. Even after disconnecting from the user HR, SQL Developer appears to hold a lock on the file.
    If I quit SQL Developer I can open the file in Excel; otherwise I get a message that C:\DEPENDENTS.csv is locked for editing by 'another user', i.e. SQL Developer.
    I would expect the file to be created and be available for immediate use.
    Kind regards
    Simon

    This has been reported several times and will hopefully be fixed in the upcoming patch.
    Regards,
    K.

  • How to create the Export Data and Import Data using flat file interface

    Hi,
    Please let me know how to export and import data using a flat file interface, based on the requirement below, and the steps involved.
    BW/BI - Recovery Process for SNP data. 
    For each SNP InfoProvider,
    create:
    1) Export Data:
    1.a)  Create an export data source, InfoPackage, comm structure, etc. necessary to create an ASCII fixed length flat file on the XI ctnhsappdata\iface\SCPI063\Out folder for each SNP InfoProvider.
    1.b)  All fields in each InfoProvider should be exported and included in the flat file. 
    1.c)  A process chain should be created for each InfoProvider with a start event. 
    1.d)  If the file exists on the target drive it should be overwritten. 
    1.e)  The exported data file name should include the InfoProvider technical name.
    1.f)  Include APO Planning Version, Date of Planning Run, APO Location, Calendar Year/Month, Material and BW Plant as selection criteria.
    2) Import Data:
    2.a)  Create a flat file source system InfoPackage, comm structure, etc. necessary to import ASCII fixed length flat files from the XI ctnhsappdata\iface\SCPI063\Out folder for each SNP InfoProvider.
    2.b)  All fields for each InfoProvider should be mapped and imported from the flat file.
    2.c)  A process chain should be created for each InfoProvider with a start event. 
    2.d)  The file should be archived in the ctnhsappdata\iface\SCPI063\Archive directory. Each file name should have the date appended in YYYYMMDD format. Each file should be deleted from the \Out directory after it is archived (see the sketch after this list).
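    As a minimal illustration of the archive step in 2.d, here is a Python sketch (the helper name is hypothetical, the share paths are just the ones quoted above, and the real implementation would of course live in the XI/BW file handling rather than a standalone script):
    import shutil
    from datetime import date
    from pathlib import Path

    def archive_export(out_dir, archive_dir, filename):
        # Copy the exported file to the Archive directory with the date
        # appended in YYYYMMDD format, then delete it from the \Out directory.
        src = Path(out_dir) / filename
        stamped = "{}_{}{}".format(src.stem, date.today().strftime("%Y%m%d"), src.suffix)
        shutil.copy2(src, Path(archive_dir) / stamped)
        src.unlink()

    # Hypothetical usage:
    # archive_export(r"\\ctnhsappdata\iface\SCPI063\Out",
    #                r"\\ctnhsappdata\iface\SCPI063\Archive",
    #                "SNP_INFOPROVIDER.txt")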
    Thanks in advance.
    Tyson

    Here's some info on working with plists:
    http://developer.apple.com/documentation/Cocoa/Conceptual/PropertyLists/Introduction/chapter1_section1.html
    They can be edited with any text editor. Xcode provides a graphical editor for them - make sure to use the .plist extension so Xcode will recognize it.

  • Removing unwanted control characters in exported text files

    I am currently evaluating Crystal Reports 2008 to determine applicability to our requirements. I need to export data files to continuous text to be read by other application software. I have successfully created the files but have what I believe to be page feed or end-of-page control characters (small rectangles) in the output. Can someone enlighten me as to how I can suppress or remove these control characters?

    In the export to text options enter 0 for the number of lines per page. This will produce an unpaginated text document without the page control markers.

  • Help to export data into a file

    Hi,
      I have a table with more than 200 columns. The client wants the data exported to a file with all 215 columns. When I spool 'select * from tablename' to a file, each record is split across multiple lines, but the client wants each record on a single line. I tried to_nclob(columnname1) ... to_nclob(column215); that gives a single line per record, but it takes far too long (the table has 10M records and the explain plan estimates 3 days to export the file). Could you please help me with how to achieve this?
    Thanks in advance

    Hardly surprising it's taking ages to export 10 million rows, if you're doing it through SQL*Plus, as that involves transporting all the data over the network to the client, which is then rendering all that data in the display (which can be slow to scroll, depending on the size of the window) etc.
    Better to get the database to access the file system directly, using UTL_FILE (though this does mean that the file is produced on the server) and then transfer the resultant file to where it needs to go.
    Example of a generic template to use as a starting point...
    As sys user:
    CREATE OR REPLACE DIRECTORY TEST_DIR AS '\tmp\myfiles';
    GRANT READ, WRITE ON DIRECTORY TEST_DIR TO myuser;
    As myuser:
    CREATE OR REPLACE PROCEDURE run_query(p_sql IN VARCHAR2
                                         ,p_dir IN VARCHAR2
                                         ,p_header_file IN VARCHAR2
                                         ,p_data_file IN VARCHAR2 := NULL) IS
      v_finaltxt  VARCHAR2(4000);
      v_v_val     VARCHAR2(4000);
      v_n_val     NUMBER;
      v_d_val     DATE;
      v_ret       NUMBER;
      c           NUMBER;
      d           NUMBER;
      col_cnt     INTEGER;
      f           BOOLEAN;
      rec_tab     DBMS_SQL.DESC_TAB;
      col_num     NUMBER;
      v_fh        UTL_FILE.FILE_TYPE;
      v_samefile  BOOLEAN := (NVL(p_data_file,p_header_file) = p_header_file);
    BEGIN
      c := DBMS_SQL.OPEN_CURSOR;
      DBMS_SQL.PARSE(c, p_sql, DBMS_SQL.NATIVE);
      d := DBMS_SQL.EXECUTE(c);
      DBMS_SQL.DESCRIBE_COLUMNS(c, col_cnt, rec_tab);
      FOR j in 1..col_cnt
      LOOP
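        -- DBMS_SQL column type codes: 1 = VARCHAR2, 2 = NUMBER, 12 = DATE;
        -- any other type is defined (and fetched) as a 2000-byte string.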
        CASE rec_tab(j).col_type
          WHEN 1 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_v_val,2000);
          WHEN 2 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_n_val);
          WHEN 12 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_d_val);
        ELSE
          DBMS_SQL.DEFINE_COLUMN(c,j,v_v_val,2000);
        END CASE;
      END LOOP;
      -- This part outputs the HEADER
      v_fh := UTL_FILE.FOPEN(upper(p_dir),p_header_file,'w',32767);
      FOR j in 1..col_cnt
      LOOP
        v_finaltxt := ltrim(v_finaltxt||','||lower(rec_tab(j).col_name),',');
      END LOOP;
      --  DBMS_OUTPUT.PUT_LINE(v_finaltxt);
      UTL_FILE.PUT_LINE(v_fh, v_finaltxt);
      IF NOT v_samefile THEN
        UTL_FILE.FCLOSE(v_fh);
      END IF;
      -- This part outputs the DATA
      IF NOT v_samefile THEN
        v_fh := UTL_FILE.FOPEN(upper(p_dir),p_data_file,'w',32767);
      END IF;
      LOOP
        v_ret := DBMS_SQL.FETCH_ROWS(c);
        EXIT WHEN v_ret = 0;
        v_finaltxt := NULL;
        FOR j in 1..col_cnt
        LOOP
          CASE rec_tab(j).col_type
            WHEN 1 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_v_val);
                        v_finaltxt := ltrim(v_finaltxt||',"'||v_v_val||'"',',');
            WHEN 2 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_n_val);
                        v_finaltxt := ltrim(v_finaltxt||','||v_n_val,',');
            WHEN 12 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_d_val);
                        v_finaltxt := ltrim(v_finaltxt||','||to_char(v_d_val,'DD/MM/YYYY HH24:MI:SS'),',');
          ELSE
            DBMS_SQL.COLUMN_VALUE(c,j,v_v_val);
            v_finaltxt := ltrim(v_finaltxt||',"'||v_v_val||'"',',');
          END CASE;
        END LOOP;
      --  DBMS_OUTPUT.PUT_LINE(v_finaltxt);
        UTL_FILE.PUT_LINE(v_fh, v_finaltxt);
      END LOOP;
      UTL_FILE.FCLOSE(v_fh);
      DBMS_SQL.CLOSE_CURSOR(c);
    END;
    /
    This allows for the header row and the data to be written to separate files if required.
    e.g.
    SQL> exec run_query('select * from emp','TEST_DIR','output.csv');
    PL/SQL procedure successfully completed.
    Output.csv file contains:
    empno,ename,job,mgr,hiredate,sal,comm,deptno
    7369,"SMITH","CLERK",7902,17/12/1980 00:00:00,800,,20
    7499,"ALLEN","SALESMAN",7698,20/02/1981 00:00:00,1600,300,30
    7521,"WARD","SALESMAN",7698,22/02/1981 00:00:00,1250,500,30
    7566,"JONES","MANAGER",7839,02/04/1981 00:00:00,2975,,20
    7654,"MARTIN","SALESMAN",7698,28/09/1981 00:00:00,1250,1400,30
    7698,"BLAKE","MANAGER",7839,01/05/1981 00:00:00,2850,,30
    7782,"CLARK","MANAGER",7839,09/06/1981 00:00:00,2450,,10
    7788,"SCOTT","ANALYST",7566,19/04/1987 00:00:00,3000,,20
    7839,"KING","PRESIDENT",,17/11/1981 00:00:00,5000,,10
    7844,"TURNER","SALESMAN",7698,08/09/1981 00:00:00,1500,0,30
    7876,"ADAMS","CLERK",7788,23/05/1987 00:00:00,1100,,20
    7900,"JAMES","CLERK",7698,03/12/1981 00:00:00,950,,30
    7902,"FORD","ANALYST",7566,03/12/1981 00:00:00,3000,,20
    7934,"MILLER","CLERK",7782,23/01/1982 00:00:00,1300,,10
    The procedure allows for the header and data to go to separate files if required. Specifying just the "header" filename will put the header and data in the one file.
    Adapt it to output different datatypes and styles as required.

  • Unable to export data from Web Access Data Sheet in Sharepoint to local excel or access file

    Greetings and good morning.
    I'm going to start off in broad terms with this question because I'm not 100 percent sure what information to provide.
    Long story short, we've got a Web Access Data Sheet list hosted in a Sharepoint 2010 environment. It is accessed and used by multiple people throughout the day. It contains several thousand line item entries. I'd call it a large data sheet.
    I think the size of the data sheet is causing some instability in the list. I'd like to be able to export a defined range of data from the list into a local Excel or Access file. After that, I'd delete that data from the list to improve performance.
    But when I attempt to use the SharePoint action bar to export, Excel locks up/crashes. If I try to export to Access, I get a similar issue.
    Any ideas? Could anyone begin by telling me what other information is required?

    Hi,
    If you would like to export data from an Access Web Database in SharePoint 2010, you could go to the Design With Access page in Settings. The URL in my environment is http://sp/tt/_layouts/accsrv/ModifyApplication.aspx. Then choose the table and export it to Excel, or modify it in Access.
    Regards,
    Rebecca Tu
    TechNet Community Support

  • Essbase Data Export not Overwriting existing data file

    We have an ODI interface in our environment which is used to export data from Essbase apps to text files using data export calc scripts; we then load those text files into a relational database. Lately we have been seeing an issue where the data export calc script is not overwriting the file and is just appending the new data to the existing file.
    The OverWriteFile option is set to ON.
    SET DATAEXPORTOPTIONS {
      DataExportLevel "Level0";
      DataExportOverWriteFile ON;
      DataExportDimHeader ON;
      DataExportColHeader "Period";
      DataExportDynamicCalc ON;
    };
    The "Scenario" variable is a substitution variable which is set during the runtime. We are trying to extract "Budget" but the calc script is not clearing the "Actual" scenario from the text file which was the scenario that was extracted earlier. Its like after the execution of the calc script, the file contains both "Actual" and "Budget" data. We are not able to find the root cause as in why this might be happening and why OVERWRITEFILE command is not being taken into account by the data export calc script.
    We have also deleted the text data file to make sure there are no temporary files on the server or anything. But when we ran the data export directly from Essbase again, then again the file contained both "Actual" as well as "Budget" data which really strange. We have never encountered an issue like this before.
    Any suggestions regarding this issue?

    Did some more testing and pretty much zeroed in on the issue. Our Scenario members are actually named something like "Q1FCST-Budget", "Q2FCST-Budget", etc.
    This is the reason we need to use a member function: the calc script reads "&ODI_SCENARIO" (which is set to Q2FCST-Budget) as a number and gives an error. To convert this value to a string we are using the @member function, and this seems to be the root cause of the issue. The ODI_SCENARIO variable is set to "Q2FCST-Budget", but when we run the script with @member("&ODI_SCENARIO"), the data file brings back the values for "Q1FCST-Budget" out of nowhere, in addition to the "Q2FCST-Budget" data we are trying to extract.
    Successful Test Case 1:
    1) Put Scenario "Q2FCST-Budget" in hard coded letters in Script and ran the script
    e.g "Q2FCST-Phased"
    2) Ran the Script
    3) Result OK. The script overwrote the file with Q2FCST-Budget data.
    Successful Case 2:
    1) Put scenario in @member function
    e.g. @member("Q2FCST-Budget")
    2) Results again ok
    Failed Case:
    1) Deleted the file
    2) Put the scenario in a substitution variable and used the member function @member("&ODI_SCENARIO"), and ran the script. ODI_SCENARIO is set to Q2FCST-Budget in Essbase variables.
    e.g. @member("&ODI_SCENARIO")
    3) Result: the text file contained both "Q1FCST-Budget" and "Q2FCST-Budget" data values.
    We are still not close to the root cause of why this issue is happening. Putting the sub var in the member function changes the complete picture and gives us inaccurate results.
    Any clues anyone?

  • Is it possible to export the "Mediathek" in a data file (excel or word) ?

    Is it possible to export the "Mediathek" in a data file (excel or word) ?

    see my old post:
    https://discussions.apple.com/message/6272619?messageID=6272619#6272619?messageID=6272619

  • Export data from a table to text file using srcipt task

    Hi
    I am new to SSIS.
    I have to export data from a table and append it to an existing file through an SSIS Script Task.
    Please help.
    Thanks
    Umesh

    Hi Umesh,
    The data structure of the source table and the structure of the destination file are the same, right? Is the destination file a flat file? Do you have to do it through a Script Task? If the destination file is a flat file, this can be done easily using the stock tasks/components rather than .NET code. In the Data Flow Task, choose the appropriate source adapter (such as OLE DB Source or ADO.NET Source) to extract data from the source table, perform transformations if necessary, and then load it to the destination file via a Flat File Destination. When setting up the Flat File Destination, uncheck the “Overwrite data in the file” option so that the extracted data will be appended to the existing file.
    If you need to implement it through Script Task/Component indeed, you may benefit from the following code examples:
    http://stackoverflow.com/questions/8070163/how-to-add-custom-footer-to-an-ssis-flat-file-seperate-component-or-script-tas 
    http://stackoverflow.com/questions/8467326/add-header-and-footer-row-flat-file-ssis 
    If you need further help about the script, I suggest that you ask a new question in .NET forums where you can get more dedicated support:
    http://social.msdn.microsoft.com/Forums/vstudio/en-US/home?category=netdevelopment 
    Regards,
    Mike Yin
    TechNet Community Support

  • How to export data from a Oracle table to a delimited file?

    I know how to load a delimited file into a table, but how do I export
    data from an Oracle table to a delimited file?
    Thanks in advance.

    Try looking at this link; it's long, but there are three different solutions discussed in it. If you look at Barbara Boehmer's
    solution in her posts in the link below, you'll see that she's addressed your concerns with spool files.
    Re: utl_smtp and triggers
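    If a small client-side script is also an option, here is a minimal sketch using Python's csv module with the python-oracledb driver; the connection details, table name and output path are placeholders, not anything from the thread above:
    import csv
    import oracledb  # pip install oracledb

    # Placeholder connection details
    conn = oracledb.connect(user="scott", password="tiger", dsn="localhost/orclpdb1")
    with conn.cursor() as cur, open("emp.csv", "w", newline="") as f:
        cur.execute("select * from emp")
        writer = csv.writer(f)                                # comma-delimited output
        writer.writerow([col[0] for col in cur.description])  # header row of column names
        writer.writerows(cur)                                 # one delimited line per row
    conn.close()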

  • How to export data from a Dynpro table to Excel file?

    Hi
    Here I go again. I read the post "Looking for example to export data from a DynPro table to Excel file" and put the code lines into a Web Dynpro project where we need to export a Dynpro table to an Excel file, but exactly at line 23 it doesn't recognize workBook = new HSSFWorkbook();
    1     //Declare this in the end between the Begin others block.
    2     
    3     private FileOutputStream out = null;
    4     private HSSFWorkbook workBook = null;
    5     private HSSFSheet hsSheet = null;
    6     private HSSFRow row = null;
    7     private HSSFCell cell = null;
    8     private HSSFCellStyle cs = null;
    9     private HSSFCellStyle cs1 = null;
    10     private HSSFCellStyle cs2 = null;
    11     private HSSFDataFormat dataFormat = null;
    12     private HSSFFont f = null;
    13     private HSSFFont f1 = null;
    14     
    15     //Code to create the Excel.
    16     
    17     public void onActionExportToExcel(com.sap.tc.webdynpro.progmodel.api.IWDCustomEvent wdEvent )
    18     {
    19     //@@begin onActionExportToExcel(ServerEvent)
    20     try
    21     {
    22     out = new FileOutputStream("C:/mydirectory/myfiles/testexcel.xls");
    23     workBook = new HSSFWorkbook();
    24     hsSheet = workBook.createSheet("My Sheet");
    25     cs = workBook.createCellStyle();
    26     cs1 = workBook.createCellStyle();
    27     cs2 = workBook.createCellStyle();
    28     dataFormat = workBook.createDataFormat();
    29     f = workBook.createFont();
    30     f1 = workBook.createFont();
    31     f.setFontHeightInPoints((short) 12);
    32     // make it blue
    33     f.setColor( (short)HSSFFont.COLOR_NORMAL );
    34     // make it bold
    35     // arial is the default font
    36     f.setBoldweight(HSSFFont.BOLDWEIGHT_BOLD);
    37     
    38     // set font 2 to 10 point type
    39     f1.setFontHeightInPoints((short) 10);
    40     // make it red
    41     f1.setColor( (short)HSSFFont.COLOR_RED );
    42     // make it bold
    43     f1.setBoldweight(HSSFFont.BOLDWEIGHT_BOLD);
    44     f1.setStrikeout(true);
    45     cs.setFont(f);
    46     cs.setDataFormat(dataFormat.getFormat("#,##0.0"));
    47     
    48     // set a thick border
    49     cs2.setBorderBottom(cs2.BORDER_THICK);
    50     
    51     // fill w fg fill color
    52     cs2.setFillPattern((short) HSSFCellStyle.SOLID_FOREGROUND);
    53     cs2.setFillBackgroundColor((short)HSSFCellStyle.SOLID_FOREGROUND);
    54     // set the cell format to text see HSSFDataFormat for a full list
    55     cs2.setDataFormat(HSSFDataFormat.getBuiltinFormat("text"));
    56     cs2.setFont(f1);
    57     cs2.setLocked(true);
    58     cs2.setWrapText(true);
    59     row = hsSheet.createRow(0);
    60     hsSheet.createFreezePane(0,1,1,1);
    61     for(int i=1; i<10;i++)
    62     {
    63     cell = row.createCell((short)i);
    64     cell.setCellValue("Excel Column "+i);
    65     cell.setCellStyle(cs2);
    66     }
    67     workBook.write(out);
    68     out.close();
    69     
    70     //Read the file that was created.
    71     
    72     FileInputStream fin = new FileInputStream("C:/mydirectory/myfiles/testexcel.xls");
    73     byte b[] = new byte[fin.available()];
    74     fin.read(b,0,b.length);
    75     fin.close();
    76     
    77     wdContext.currentContextElement().setDataContent(b);
    78     }
    79     catch(Exception e)
    80     {
    81     wdComponentAPI.getComponent().getMessageManager().reportException("Exception while reading file "+e,true);
    82     }
    83     //@@end
    84     }
    I don't know why this happens. I would appreciate any information.
    Thanks in advance!!!
    Tokio Franco Chang

    After testing the code lines, this error stack trace appears:
    java.lang.NoClassDefFoundError: org/apache/poi/hssf/usermodel/HSSFWorkbook
         at com.sap.tc.webdynpro.progmodel.api.iwdcustomevent.ExportToExcel.onActionAct1(ExportToExcel.java:232)
         at com.sap.tc.webdynpro.progmodel.api.iwdcustomevent.wdp.InternalExportToExcel.wdInvokeEventHandler(InternalExportToExcel.java:147)
         at com.sap.tc.webdynpro.progmodel.generation.DelegatingView.invokeEventHandler(DelegatingView.java:87)
         at com.sap.tc.webdynpro.progmodel.controller.Action.fire(Action.java:67)
         at com.sap.tc.webdynpro.clientserver.task.WebDynproMainTask.handleAction(WebDynproMainTask.java:101)
         at com.sap.tc.webdynpro.clientserver.task.WebDynproMainTask.handleActionEvent(WebDynproMainTask.java:304)
         at com.sap.tc.webdynpro.clientserver.task.WebDynproMainTask.execute(WebDynproMainTask.java:649)
         at com.sap.tc.webdynpro.clientserver.cal.AbstractClient.executeTasks(AbstractClient.java:59)
         at com.sap.tc.webdynpro.clientserver.cal.ClientManager.doProcessing(ClientManager.java:252)
         at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doWebDynproProcessing(DispatcherServlet.java:154)
         at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doContent(DispatcherServlet.java:116)
         at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doPost(DispatcherServlet.java:55)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:392)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:266)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:345)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:323)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:865)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:240)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:92)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:148)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:37)
         at com.sap.engine.core.cluster.impl6.session.UnorderedChannel$MessageRunner.run(UnorderedChannel.java:71)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:95)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:159)
    Thanks in advance!!!
    Tokio Franco Chang

  • HOW TO IMPORT DATA MORE THAN ONCE FROM THE SAME EXPORT DUMP FILE?

    Before asking my question I'd like to mention that I'm a French speaker,
    so my English is a little bit bad. Sorry for that.
    My problem is: IMPORT.
    How do I import data a SECOND TIME from an export dump file within Oracle?
    My export dump file was made successfully (Full Export) and then I
    tried to import the data for the first time.
    I got the following message in my log file (I ADDED SOME COMMENTS):
    Warning: the objects were exported by L1, not by you
    . importing SYSTEM's objects into SYSTEM
    REM ************** CREATING TABLESPACES *****
    REM *********************************************
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLESPACE "USER_DATA" DATAFILE 'E:\ORANT\DATABASE\USR1ORCL.ORA' SI"
    "ZE 3145728 DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS 1 MAX"
    "EXTENTS 121 PCTINCREASE 50) ONLINE PERMANENT"
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLESPACE "ROLLBACK_DATA" DATAFILE 'E:\ORANT\DATABASE\RBS1ORCL.ORA"
    "' SIZE 10485760 DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS "
    "1 MAXEXTENTS 121 PCTINCREASE 50) ONLINE PERMANENT"
    etc........
    IMP-00017: following statement failed with ORACLE error 1119:
    "CREATE TABLESPACE "L1" DATAFILE 'E:\ORADATA\L1.DBF' SIZE 1048576000 "
    "DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS 1 MAXEXTENTS 121 PCTIN"
    "CREASE 50) ONLINE PERMANENT"
    IMP-00003: ORACLE error 1119 encountered
    ORA-01119: error in creating database file 'E:\ORADATA\L1.DBF'
    ORA-09200: sfccf: error creating file
    OSD-04002: unable to open file
    O/S-Error: (OS 3) The system cannot find the path specified
    --->etc..........
    The drive E: with the folder E:\ORADATA didn't exist, but after
    all that I created it.
    See below, before my IMPORT statement:
    REM ********************* CREATING USER *********
    REM ********************************************
    IMP-00017: following statement failed with ORACLE error 959:
    "CREATE USER "L1" IDENTIFIED BY VALUES 'A6E0DAA6865E7627' DEFAULT TABLESPACE"
    " "L1" TEMPORARY TABLESPACE "TEMPORARY_DATA""
    IMP-00003: ORACLE error 959 encountered
    ORA-00959: tablespace 'L1' does not exist
    IMP-00017: following statement failed with ORACLE error 959:
    "CREATE USER "MLCO" IDENTIFIED BY VALUES '56AC6447B7D50467' DEFAULT TABLESPA"
    "CE "MLCO" TEMPORARY TABLESPACE "TEMPORARY_DATA""
    IMP-00003: ORACLE error 959 encountered
    ORA-00959: tablespace 'MLCO' does not exist
    ETC.......
    REM ********************* GRANTING ROLES ***********
    REM ************************************************
    IMP-00017: following statement failed with ORACLE error 1917:
    "GRANT ALTER ANY TABLE to "L1" "
    IMP-00003: ORACLE error 1917 encountered
    ORA-01917: user or role 'L1' does not exist
    ETC.........
    IMP-00017: following statement failed with ORACLE error 1918:
    "ALTER USER "L1" DEFAULT ROLE ALL"
    IMP-00003: ORACLE error 1918 encountered
    ORA-01918: user 'L1' does not exist
    -- that is normal, since the creation of the
    tablespace failed !!
    REM******************************
    IMP-00015: following statement failed because the object already exists:
    "CREATE ROLLBACK SEGMENT RB_TEMP STORAGE (INITIAL 10240 NEXT 10240 MINEXTENT"
    "S 2 MAXEXTENTS 121) TABLESPACE "SYSTEM""
    IMP-00015: following statement failed because
    . importing SCOTT's objects into SCOTT
    IMP-00015: following statement failed because the object already exists:
    "CREATE SEQUENCE "EVT_PROFILE_SEQ" MINVALUE 1 MAXVALUE 999999999999999999999"
    "999999 INCREMENT BY 1 START WITH 21 CACHE 20 NOORDER NOCYCLE"
    ETC............
    importing L1's objects into L1
    IMP-00017: following statement failed with ORACLE error 1435:
    "ALTER SCHEMA = "L1""
    IMP-00003: ORACLE error 1435 encountered
    ORA-01435: user does not exist
    REM *************** IMPORTING TABLES *******************
    REM ****************************************************
    . importing SYSTEM's objects into SYSTEM
    . . importing table "AN1999_BDAT" 243 rows imported
    . . importing table "BOPD" 112 rows imported
    . . importing table "BOINFO_AP" 49
    ETC................
    . . importing table "BO_WHF" 2 rows imported
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLE "DEF$_CALL" ("DEFERRED_TRAN_DB" VARCHAR2(128),
    IMP-00015: following statement failed because the object already exists:
    "CREATE SYNONYM "DBA_ROLES" FOR "SYS"."DBA_ROLES""
    IMP-00015: following statement failed because the object already exists:
    "CREATE SYNONYM "DBA_ERRORS" FOR "SYS"."DBA_ERRORS""
    IMP-00008: unrecognized statement in the export file:
    . importing L1's objects into L1
    IMP-00017: following statement failed with ORACLE error 1435:
    "ALTER SCHEMA = "L1""
    IMP-00008: unrecognized statement in the export file:
    J
    Import terminated successfully with warnings.
    -------------------------------------
    So after analysing this log file, I created
    the appropriate drives and folders, as the
    import statement doesn't see them:
    E:\ORADATA, G:\ORDATA, etc.
    And I started to IMPORT ONE MORE TIME, with:
    $ IMP73 sys/pssw Full=Y FILE=c:\temp\FOLD_1\data_1.dmp BUFFER=64000
    COMMIT=Y INDEXFILE=c:\temp\FOLD_1\BOO_idx.sql
    LOG=c:\temp\FOLD_1\BOO_log.LOG DESTROY=Y IGNORE=Y;
    After that I could not see the users nor the
    tables created,
    and the following message appeared in the log file:
    Warning: the objects were exported by L1, not by you
    . . skipping table "AN1999_BDAT"
    . . skipping table "ANPK"
    . . skipping table "BOAP"
    . . skipping table "BOO_D"
    ETC.....skipping all the tables
    . . skipping table "THIN_PER0"
    . . skipping table "UPDATE_TEMP"
    Import terminated successfully without warnings.
    and only 2 new tablespaces (originally 3) were
    created, without any data in them (I checked that in
    the Oracle Storage Manager: the tablespaces exist
    with 0.002 used space; originally 60 MB each!)
    So,
    how do I import data (with the full import option) successfully
    MORE THAN ONE TIME from an exported dump file,
    even if we have to overwrite tablespaces, tables and users?
    Thank you very much.

    The Member Feedback forum is for suggestions and feedback for OTN Developer Services. This forum is not monitored by Oracle support or product teams and so Oracle product and technology related questions will not be answered. We recommend that you post this thread to the appropriate Database forum.
    The main URL is:
    http://forums.oracle.com/forums/index.jsp?cat=18
