Uploading & Processing of CSV file fails in clustered env.

          We have a csv file which is uploaded to the weblogic application server, written
          to a temporary directory and then manipulated before being written to the database.
          This process works correctly in a single server environment but fails in a clustered
          environment.
          The file gets uploaded to the server into the temporary directory without problem.
          The processing starts. When running in a cluster the csv file is replicated
          to a temporary directory on the secondary server as well as the primary.
          The manipulation process is running but never finishes and the browser times out
          with the following message:
          Message from the NSAPI plugin:
          No backend server available for connection: timed out after 30 seconds.
          Build date/time: Jun 3 2002 12:27:28
          The server which is loading the file and processing it writes this to the log:
          <17-Jul-03 15:13:12 BST> <Warning> <HTTP> <WebAppServletContext(1883426,fulfilment,/fulfilment) One of the getParameter family of methods called after reading from the ServletInputStream, not merging post parameters>
          

Anna Bancroft wrote:
          > We have a csv file which is uploaded to the weblogic application server, written
          > to a temporary directory and then manipulated before being written to the database.
          > This process works correctly in a single server environment but fails in a clustered
          > environment.
          >
          > The file gets uploaded to the server into the temporary directory without problem.
          > The processing starts. When running in a cluster the csv file is replicated
          > to a temporary directory on the secondary server as well as the primary.
          >
          > The manipulation process is running but never finishes and the browser times out
          > with the following message:
          > Message from the NSAPI plugin:
          > No backend server available for connection: timed out after 30 seconds.
          >
          > Build date/time: Jun 3 2002 12:27:28
          >
          > The server which is loading the file and processing it writes this to the log:
          > <17-Jul-03 15:13:12 BST> <Warning> <HTTP> <WebAppServletContext(1883426,fulfilment,/fulfilment) One of the getParameter family of methods called after reading from the ServletInputStream, not merging post parameters>
          >
          >
          That doesn't make sense. Who is replicating the file? How long does it
          take to process the file? Which plugin are you using as a proxy to the
          cluster?
          -- Prasad
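          The "getParameter family of methods called after reading from the ServletInputStream"
          warning usually means the upload code reads the raw request body itself (to write the
          CSV to the temporary directory) and then calls request.getParameter(), which WebLogic
          can no longer satisfy because the POST body has already been consumed. A minimal sketch
          of the usual pattern, assuming Apache Commons FileUpload (the library actually used by
          this application is not known), where form fields are taken from the parsed multipart
          items instead of getParameter():

          import java.util.List;
          import javax.servlet.ServletException;
          import javax.servlet.http.HttpServlet;
          import javax.servlet.http.HttpServletRequest;
          import javax.servlet.http.HttpServletResponse;
          import org.apache.commons.fileupload.FileItem;
          import org.apache.commons.fileupload.disk.DiskFileItemFactory;
          import org.apache.commons.fileupload.servlet.ServletFileUpload;

          public class CsvUploadServlet extends HttpServlet {
              @Override
              protected void doPost(HttpServletRequest request, HttpServletResponse response)
                      throws ServletException, java.io.IOException {
                  try {
                      // Parse the multipart body once; after this, read POST fields from the
                      // parsed items rather than calling request.getParameter().
                      ServletFileUpload upload = new ServletFileUpload(new DiskFileItemFactory());
                      List<FileItem> items = upload.parseRequest(request);
                      for (FileItem item : items) {
                          if (item.isFormField()) {
                              String name = item.getFieldName();   // ordinary form field
                              String value = item.getString();
                          } else {
                              // the uploaded CSV: write it to a temp file and process it
                              java.io.File tmp = java.io.File.createTempFile("upload", ".csv");
                              item.write(tmp);
                          }
                      }
                  } catch (Exception e) {
                      throw new ServletException(e);
                  }
              }
          }

          Note also that the NSAPI message only reflects the plugin's 30-second timeout; if the
          CSV processing legitimately takes longer than that, the proxy timeout has to be raised
          as well.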
          

Similar Messages

  • How to upload CSV files with more than 20,000 records into an Oracle table (Oracle APEX 3.2)

    Can anyone help me upload a CSV with more than 20,000 records using the APEX 3.2 upload process? I am using the regular upload process with a BLOB file:
    SELECT blob_content,id,filename into v_blob_data,v_file_id,v_file_name
    FROM apex_application_files
    WHERE last_updated = (select max(last_updated)
    from apex_application_files WHERE UPDATED_BY = :APP_USER)
    AND id = (select max(id) from apex_application_files where updated_by = :APP_USER);
    I tried to upload, but my page is timing out. My application works fine up to 1,000 records; after that it times out. Each record takes 2 seconds to store in the Oracle table, so 1,000 records take 7 minutes, after which the APEX upload web page times out.
    Please help me with source code showing how to speed up the CSV upload process, or suggest a better approach with a source example.
    Thanks,
    Sant.
    Edited by: 994152 on Mar 15, 2013 5:38 AM

    See this posting:
    Internet Explorer Cannot Display
    There, I provided a couple of links on this particular issue. You need to change the timeout on your application server.
    Denes Kubicek
    http://deneskubicek.blogspot.com/
    http://www.apress.com/9781430235125
    http://apex.oracle.com/pls/apex/f?p=31517:1
    http://www.amazon.de/Oracle-APEX-XE-Praxis/dp/3826655494
    -------------------------------------------------------------------
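    Independent of the server timeout, the per-row cost (2 seconds per record in the post above) is what usually has to come down. As a generic illustration of batching inserts instead of inserting one row at a time - plain JDBC here, not APEX, with a hypothetical table, columns and connection details - a hedged sketch:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class CsvBatchInsert {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details and table; requires the Oracle JDBC driver.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");
                 PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO upload_rows (col1, col2, col3) VALUES (?, ?, ?)");
                 BufferedReader in = new BufferedReader(new FileReader("upload.csv"))) {
                con.setAutoCommit(false);
                String line;
                int count = 0;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split(",", -1);
                    ps.setString(1, f[0]);
                    ps.setString(2, f[1]);
                    ps.setString(3, f[2]);
                    ps.addBatch();
                    // One round trip (and one commit) per 500 rows instead of per row.
                    if (++count % 500 == 0) {
                        ps.executeBatch();
                        con.commit();
                    }
                }
                ps.executeBatch();
                con.commit();
            }
        }
    }

    Inside APEX itself the analogous move is typically to parse the BLOB into a collection and insert with bulk binds rather than issuing one INSERT per row.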

  • How to create a PowerShell script to upload all termset CSV files into the SharePoint 2013 local taxonomy?

    Hi Everyone,
    I want to create a PowerShell script that will:
    1) Check a directory and upload all termset CSV files into the SharePoint local taxonomy.
    2) Take input parameters - the directory that contains the termset CSV files and the local term store to import to.
    3) Prior to updating, take a backup of the existing term store (for rollback/recovery purposes).
    4) Accept its parameters via an XML file.
    Please let me know how to do it.
    Regards,
    Srinivas

    Hi,
    Please check this link
    http://termsetimporter.codeplex.com/
    Please remember to click 'Mark as Answer' on the answer if it helps you

  • Processing a .CSV file to Table

    HI Friends,
    Please let me know how to load a .CSV file into a table using the SQL*Loader concept.
    No other technique required! I know it can be done through various other processes, but I need it done in the above way.
    Thanks in advance!
    tempdbs
    www.dwforum.net

    here:
    http://forums.oracle.com/forums/search.jspa?threadID=&q=loading+csv+sql+loader&objID=c84&dateRange=lastyear&userID=&numResults=15
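    For reference, a minimal SQL*Loader setup looks roughly like the sketch below. The table name, column list, file names and the scott/tiger@orcl connect string are placeholders, not anything from this thread.

    -- load_csv.ctl (control file; adjust table, columns and file path)
    LOAD DATA
    INFILE 'input.csv'
    APPEND
    INTO TABLE my_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (col1, col2, col3)

    Invoked from the command line with something like:
    sqlldr userid=scott/tiger@orcl control=load_csv.ctl log=load_csv.log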

  • Processing a CSV file in batching by FTP Adapter gives translation error

    Hi All,
    I have a CSV with 2000 records and want to process it in batches of 500.
    When I don't use batching in the FTP adapter, everything goes fine,
    but when I include batching in the adapter and try to process the same file, it gives:
    <2009-09-09 12:09:15,997> <ERROR> <VDSTServices.collaxa.cube.translation> <NXSDTranslatorImpl::logError> translateFromNative Failed with exception = 50
    <2009-09-09 12:09:15,997> <INFO> <VDSTServices.collaxa.cube.activation> <FTP Adapter::Inbound> Error while translating inbound file : VDST_CNP_EMR5110026_20090904_000000.csv
    <2009-09-09 12:09:15,997> <INFO> <VDSTServices.collaxa.cube.activation> <FTP Adapter::Inbound>
    ORABPEL-11100
    Translation Failure.
    [Line=27, Col=1] Translation from native failed. 50.
    Check the error stack and fix the cause of the error. Contact oracle support if error is not fixable.
         at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.doTranslateFromNative(NXSDTranslatorImpl.java:754)
         at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.translateFromNative(NXSDTranslatorImpl.java:489)
         at oracle.tip.adapter.file.inbound.ProcessWork.doTranslation(ProcessWork.java:748)
         at oracle.tip.adapter.file.inbound.ProcessWork.processMessages(ProcessWork.java:336)
         at oracle.tip.adapter.file.inbound.ProcessWork.run(ProcessWork.java:218)
         at oracle.tip.adapter.fw.jca.work.WorkerJob.go(WorkerJob.java:51)
         at oracle.tip.adapter.fw.common.ThreadPool.run(ThreadPool.java:280)
         at java.lang.Thread.run(Thread.java:595)
    Caused by: java.lang.ArrayIndexOutOfBoundsException: 50
         at oracle.tip.pc.services.translation.xlators.nxsd.ErrorList.addError(ErrorList.java:108)
         at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.terminateLoop(NXSDTranslatorImpl.java:1688)
         at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.parseNXSD(NXSDTranslatorImpl.java:1307)
         at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.parseNXSD(NXSDTranslatorImpl.java:1216)
         at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.parseNXSD(NXSDTranslatorImpl.java:1084)
         at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.doTranslateFromNative(NXSDTranslatorImpl.java:706)
         ... 7 more
    I tried deleting rows 24-28 from the CSV and processing it again, and got the same error [Line=27, Col=1], so it is not an issue with the CSV itself.
    Please suggest.
    Thanks

    Can you post your CSV and the native schema you are using? Please also mention your SOA version. Thanks.

  • Error while uploading data through CSV File

    Dear All,
    While performing the following steps I have encountered an error in BW 3.5.
    Step 1. Right-click Source System – demo: flat file, and then select Create InfoPackage…
    Step 2. Select the DataSource Material number (Master data), enter a description for the InfoPackage, and
    Step 3. Click the External data tab. Select the options as shown on the screen. Enter a file name with a path (CSV file).
    When checking the preview I am able to see the values inside the CSV file,
    but when starting the scheduler to upload this data it displays the following error:
    Syntax error in template RSTMPLIR, row 0 (-> Long text)
    Checking the performance assistant, the following detail is displayed:
    Diagnosis
    Field "C_R_D" is unknown. It is neither in one of the specified tables nor defined by a "DATA"   ...
    System response
    The program generation was terminated.
    Procedure
    Correct the template
    Can you guide me on how to eliminate this error?
    Points will be rewarded for your contribution
    Regards,
    Purav

    Flat File was in error.

  • SQlldr Error while uploading "excel" or "csv" file.

    Hello to the community,
    We are using Oracle Application Server (AS) 10g as the web server and the database server.
    The database server has an NFS partition which is mounted onto the web server.
    So, if any client uploads an Excel or CSV file, the web server redirects that file data onto the database server through the NFS partition.
    When clicking the "upload" button from the client end, they get a strange error. Checking the "ias_console" log file, I found the latest log entries below, which belong to the error. Kindly let me know where the problem is coming from.
    The shared NFS partition on the database server is named "web_upload", and we have the same "web_upload" mount point on the web server. The command for mounting the NFS partition is given below.
    On the AIX web server: mount <ip address>:/file_data/web_upload/ /web_upload/
    Syntax: mount <ip address of db server>:<mount point> <web server mount point>
    The error is given below.
    09/06/09 15:27:47 NumIdle: 2
    09 Jun 2009 15:27:47,764 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ private Connection getDBConnection() ] Exited
    09 Jun 2009 15:27:47,764 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ private Connection getDBConnection() ] Exited
    09 Jun 2009 15:27:47,766 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ private void initialize() ] After getDBConnection called,
      connection object is: org.apache.commons.dbcp.PoolableConnection@1048a893
    09 Jun 2009 15:27:47,767 [DEBUG] - [ com.vat.website.service.ExcelUploadService ] [ public boolean insertFileDetails(ExcelDataBean databean,String
      strFname) ] strUniqueKey:select  web_uploaded_file_history_seq.nextval from dual
    09 Jun 2009 15:27:47,767 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ public ResultSet executeQuery(String strQuery) ] Entered
    09 Jun 2009 15:27:47,773 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ public ResultSet executeQuery(String strQuery) ] Exited
    09 Jun 2009 15:27:47,774 [DEBUG] - [ com.vat.website.service.ExcelUploadService ] [ public boolean insertFileDetails(ExcelDataBean databean,String
      strFname) ] insertQuery:INSERT INTO WEB_UPLOADED_FILE_HISTORY(WUF_SERIAL_NUMBER,WUF_DEALER_ID,WUF_PERIOD_FROM,WUF_PERIOD_TO,WUF_FILE_PATH,WUF_FORM_NO,WUF_SERVER_IP,WUF_UPLOAD_YN,WUF_CREATED_DATE,WUF_CREATED_BY,WUF_ORIGINAL_REVISED,WUF_SHEETS_NUMBER,wuf_reco
      d_key)VALUES('2658271','T00100001000306',to_date('01/01/2009','dd/mm/yyyy'),to_date('31/01/2009','dd/mm/yyyy'),'/web_upload/090609/VAT
      Returns/000000/0000000000/Form201/0000000000_0T201BO0109_009062009152747.csv',replace(decode('T201B','T201M', 'T201','T201B'),'T','VAT-Form'),(select
      RPAD(sys_context('USERENV','IP_ADDRESS'),15,' ') AS client_ipaddress from dual),'Y',SYSDATE,'WEB','O','0','3317063')
    09 Jun 2009 15:27:47,791 [DEBUG] - [ com.vat.website.service.DatabaseService ] [ public void closeDBConnection() ] Entered
    09 Jun 2009 15:27:47,791 [DEBUG] - finalDate:01-JAN-2009
    09 Jun 2009 15:27:47,792 [DEBUG] - finalDate:31-JAN-2009
    09 Jun 2009 15:27:47,806 [DEBUG] - [ com.vat.website.utils.PropertyCache ] [ static Object getValue(String propertyName, String propertyFileName)
      ] Entered
    09 Jun 2009 15:27:47,806 [DEBUG] - [ com.vat.website.utils.PropertyCache ] [ static Object getValue(String propertyName, String propertyFileName)
      ] Entered
    09 Jun 2009 15:27:47,806 [DEBUG] - [ com.vat.website.utils.PropertyCache ] [ static Object getValue(String propertyName, String propertyFileName)
      ] Entered
    09 Jun 2009 15:27:47,806 [DEBUG] - [ com.vat.website.action.UploadAction ] [ public ActionForward submit(ActionMapping mapping, ActionForm
      form,HttpServletRequest request, HttpServletResponse response) ] Executing: sqlldr parfile=parafile.par silent=feedback direct=Y at location:
      /web_upload/090609/VAT Returns/000000/0000000000/Form201/SQLLdr
    It seems that this problem is due to sqlldr; how can I troubleshoot it?
    Waiting for your favorable response.
    Thanks in advance,
    Nishith Vyas.

    Flat File was in error.

  • User upload from a csv file via users_gen

    Hello SRM Guru,
    I am uploading users via transaction users_gen from a CSV file. All the users are uploaded correctly, but I see that some special characters (which we have in the German language) are not copied correctly; for example, the last name is copied as
    B#hm instead of Böhm, although my logon language was German.
    Any help is appreciated!!!
    Thanks,
    Jack

    Hello,
    Right-click on your file, choose "Open with", and choose Notepad. Then in Notepad, "Save As" and under encoding choose Unicode.
    But that is just a quick trick to convert your file into a Unicode one. There should be an option in the editor you use to save the CSV directly in Unicode encoding.
    P.
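    As a generic illustration of the same idea - re-encoding the file programmatically rather than via Notepad - a small Java sketch; the windows-1252 source encoding and the file names are assumptions, and the target encoding should be whatever the import actually expects:

    import java.nio.charset.Charset;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class ReencodeCsv {
        public static void main(String[] args) throws Exception {
            // Assumed source encoding; adjust to whatever the export really uses.
            Charset source = Charset.forName("windows-1252");
            Path in = Paths.get("users.csv");
            Path out = Paths.get("users_utf8.csv");
            // Decode with the original encoding, then write the target encoding so
            // characters like 'ö' survive instead of turning into '#'.
            String content = new String(Files.readAllBytes(in), source);
            Files.write(out, content.getBytes(StandardCharsets.UTF_8));
        }
    }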

  • Upload or download csv files

    Hi folks,
    How can we upload or download CSV (comma separated value) files in Web Dynpro ABAP? In some forums I have seen how to upload or download files in general, but I need upload/download specifically for CSV files.

    Hi,
    Create an attribute DATASOURCE of type XSTRING and use the following code:
    TYPES :
           BEGIN OF STR_ITAB,
           AP_NUMBER(10) TYPE C,
           SPAN(10) TYPE C,
           TOWER_TYPE(10) TYPE C,
           END OF STR_ITAB.
           DATA : T_TABLE1 TYPE STANDARD TABLE OF STR_ITAB,
             I_DATA TYPE STANDARD TABLE OF STRING,
    *         LO_ND_SFLIGHT TYPE REF TO IF_WD_CONTEXT_NODE,
    *         LO_EL_SFLIGHT TYPE REF TO IF_WD_CONTEXT_ELEMENT,
             L_STRING TYPE STRING,
             FS_TABLE TYPE STR_ITAB,
             L_XSTRING TYPE XSTRING,
             FIELDS TYPE STRING_TABLE,
             LV_FIELD TYPE STRING.
              DATA : T_TABLE TYPE IF_ZWD_KT_TOWERSCHDUL_V=>ELEMENTS_CTX_VN_DATA_TAB,
              CTX_VN_DATA_TABLE TYPE IF_ZWD_KT_TOWERSCHDUL_V=>ELEMENTS_CTX_VN_DATA_TAB.
    * get single attribute
    WD_CONTEXT->GET_ATTRIBUTE( EXPORTING   NAME =  `DATASOURCE`    IMPORTING   VALUE = L_XSTRING ).
      CALL FUNCTION 'HR_KR_XSTRING_TO_STRING'
        EXPORTING
          IN_XSTRING = L_XSTRING
        IMPORTING
          OUT_STRING = L_STRING.
      SPLIT L_STRING  AT CL_ABAP_CHAR_UTILITIES=>CR_LF INTO TABLE I_DATA.
    *SPLIT S_CONT AT CL_ABAP_CHAR_UTILITIES=> INTO TABLE S_TABLE.
    FIELD-SYMBOLS: <WA_TABLE> LIKE LINE OF I_DATA.
    REFRESH T_TABLE1.
    CLEAR FS_TABLE.
    DELETE I_DATA INDEX 1.
    LOOP AT I_DATA ASSIGNING <WA_TABLE>.
    *  split each line at the comma separator
      SPLIT <WA_TABLE> AT ',' INTO
                      FS_TABLE-TOWER_TYPE
                      FS_TABLE-AP_NUMBER
                      FS_TABLE-SPAN.
      APPEND FS_TABLE TO T_TABLE1.
    *                  STR_ITAB-NUMBER_DIGITS.
    *  APPEND STR_ITAB TO ITAB.
    ENDLOOP.
    *       Bind With table Element.
    *      LOOP AT i_data INTO l_string.
    *    SPLIT l_string AT cl_abap_char_utilities=>horizontal_tab INTO TABLE fields.
    *     READ TABLE fields INTO lv_field INDEX 1.
    *    fs_table-AP_NUMBER = lv_field.
    *     READ TABLE fields INTO lv_field INDEX 2.
    *    fs_table-SPAN = lv_field.
    *     APPEND fs_table TO t_table1.
    *  ENDLOOP.
    *  lo_nd_sflight = wd_context->get_child_node( 'CTX_VN_CTX_VN_CTX_VN_CTX_VN_DATA_TAB' ).
    *  lo_nd_sflight->bind_table( t_table1 ).
      DATA LO_ND_CTX_VN_DATA_TAB TYPE REF TO IF_WD_CONTEXT_NODE.
      DATA LO_EL_CTX_VN_DATA_TAB TYPE REF TO IF_WD_CONTEXT_ELEMENT.
      DATA LS_CTX_VN_DATA_TAB TYPE WD_THIS->ELEMENT_CTX_VN_DATA_TAB.
    * navigate from <CONTEXT> to <CTX_VN_CTX_VN_CTX_VN_CTX_VN_DATA_TAB> via lead selection
      LO_ND_CTX_VN_DATA_TAB = WD_CONTEXT->GET_CHILD_NODE( NAME = WD_THIS->WDCTX_CTX_VN_DATA_TAB ).
      LO_ND_CTX_VN_DATA_TAB->BIND_TABLE( T_TABLE1 ).
    thanks and regards,
    sahai.s

  • How to upload multiple .CSV files in the same time.

    Hi Legends,
    Can anyone please help me to resolve my issue?
    This is very urgent and critical.
    Description:
    We have two users: 1) edw_user_dump and 2) prd_udm.
    We need to upload the xx.csv file at the same time for these two users in Oracle Forms.
    We have differentiated the .csv file name based on the user names.
    The main problem is that in the SQL*Loader command the xx.csv file name is created, but the data is not captured on the server (kentucky).
    Below is my code for the upload.
    -- To delete the Part_Mast.log file from the client
    DECLARE
    pid WEBUTIL_HOST.PROCESS_ID;
    v_result PLS_INTEGER;
    v_username varchar2(30) := GET_APPLICATION_PROPERTY(USERNAME);
    BEGIN
    v_result := WEBUTIL_HOST.Get_return_Code(pid ) ;
    host('cat /dev/null > /tmp/'||v_username||'_'||'EDW_CF_IO_UPLOAD.log');
    host('cat /dev/null > /tmp/'||v_username||'_'||'"CF615 IO Upload.csv"');
    END;
    DECLARE
    l_success boolean:=FALSE;
    l_bare_filename varchar2(100):=NULL;
    v_username varchar2(30) := GET_APPLICATION_PROPERTY(USERNAME);
    BEGIN
    -- Delete the content of the log and bad file
    host('cat /dev/null > /tmp/'||v_username||'_'||'EDW_CF_IO_UPLOAD.log');
    --host('cat /dev/null > /tmp/Part_Mast.bad');
    -- Upload the data file to Application Server
    l_bare_filename := v_username||'_'||substr(:FIC_SOURCE,instr(:FIC_SOURCE,'\',-1)+1);
    l_success := webutil_file_transfer.Client_To_AS_with_progress
    (clientFile => :FIC_SOURCE
    ,serverFile => '/tmp/'||l_bare_filename
    ,progressTitle => 'Upload to Application Server in progress'
    ,progressSubTitle => 'Please wait'
    ,asynchronous => false
    ,callbackTrigger => null);
    IF l_success THEN
    NULL;
    ELSE
    null;
    END IF;
    EXCEPTION
    WHEN OTHERS THEN
    RAISE Form_Trigger_Failure;
    END;
    DECLARE
    v_username varchar2(30) := GET_APPLICATION_PROPERTY(USERNAME);
    begin
    host('cat /dev/null > /tmp/'||v_username||'_'||'EDW_CF_IO_UPLOAD.log');
    host('cat /dev/null > /tmp/EDW_CF_IO_UPLOAD.bad');
    end;
    BEGIN
    DECLARE
    v_username varchar2(30) := GET_APPLICATION_PROPERTY(USERNAME);
    v_password varchar2(30) := GET_APPLICATION_PROPERTY(PASSWORD);
    v_connect_string varchar2(30) := GET_APPLICATION_PROPERTY(CONNECT_STRING);
    a_host varchar2(500);
    BEGIN
    a_host :='/tmp/'||v_username||'_'||'"CF615 IO Upload.csv"';
    host('sqlldr '||v_username||'/'||v_password||'@'||v_connect_string||' '|| 'control=/home/edw_bis/ctl/GLB_CF_IO_UPLOAD.CTL'||' '|| 'DATA=a_host'||' '|| 'LOG=/tmp/'||v_username||'_'||'EDW_CF_IO_UPLOAD.log SKIP=1 errors=200000 DIRECT=FALSE');
    dbms_output.put_line(a_host);
    END;
    DECLARE
    v_username varchar2(30) := GET_APPLICATION_PROPERTY(USERNAME);
    begin
    host('cat /dev/null > /tmp/'||v_username||'_'||'"CF615 IO Upload.csv"');
    EXCEPTION
    WHEN OTHERS THEN
    RAISE Form_Trigger_Failure;
    END;
    end;
    DECLARE
    al_id3 ALERT;
    al_button Number;
    BEGIN
    edw_user_dump.SANM_PRC_MERGE_CF_IO_UPLOAD(:global.v_plsql_res,:global.v_ins_rec,:global.v_upd_rec);
    IF NVL(:global.v_plsql_res,0) = 0 and (:global.v_ins_rec !=0 OR :global.v_upd_rec != 0 ) then
    al_id3 :=FIND_ALERT('ROWINS');
    SET_ALERT_PROPERTY(al_id3,alert_message_text,' Process Completed Successfully!'||CHR(10)||' Rows Inserted : '||:global.v_ins_rec ||CHR(10)||' Rows Updated : '||:global.v_upd_rec);
    al_button := SHOW_ALERT( al_id3 );
    ELSIF (:global.v_plsql_res IN(-1,0) or :global.v_plsql_res > 0) and (:global.v_ins_rec =0 and :global.v_upd_rec = 0 and :global.v_del_rec =0 ) then
    al_id3 :=FIND_ALERT('ROWINS');
    SET_ALERT_PROPERTY(al_id3,alert_message_text,' Process Failed. Please Download the Log File '||CHR(10)||' Rows Failed : '||:global.v_plsql_res||CHR(10)||' Rows Inserted : '||:global.v_ins_rec ||CHR(10)||' Rows Updated : '||:global.v_upd_rec);
    al_button := SHOW_ALERT( al_id3 );
    ELSE
    al_id3 :=FIND_ALERT('ROWINS');
    SET_ALERT_PROPERTY(al_id3,alert_message_text,'Please Download the Log File '||CHR(10)||' Rows Failed : '||:global.v_plsql_res||CHR(10)||' Rows Inserted : '||:global.v_ins_rec ||CHR(10)||' Rows Updated : '||:global.v_upd_rec );
    al_button := SHOW_ALERT( al_id3 );
    END IF;
    EXCEPTION WHEN OTHERS THEN
    RAISE Form_Trigger_Failure;
    END;
    DECLARE
    v_username varchar2(30) := GET_APPLICATION_PROPERTY(USERNAME);
    begin
    host('cat /dev/null > /tmp/'||v_username||'_'||'"CF615 IO Upload.csv"');
    host('rm -rf /tmp/'||v_username||'_'||'"CF615 IO Upload.csv"');
    end;
    Thanks in advance!
    Thanks,
    Madhusudhanan

    Madhusudhanan,
    A couple of observations. First, always list your exact Forms version (e.g., 10.1.2.0.2, not 10g R2); in most cases the solution is different depending on the Forms version. Second, why must you use Forms to kick off a SQL*Loader process? This is a server-side process and should be initiated by a server-side process. If you absolutely must use Forms to kick off the process, again we need your Forms version in order to offer any solutions. Based on your code sample, I can assume you are at least using Forms 9i because you are using WebUtil.
    Are your database and application server the same physical computer? If they are not, this would explain why your HOST command isn't working, because HOST runs against the application server, not the database server.
    Third, have you considered using an External Table (if your RDBMS version supports them) for each of the files you are attempting to upload? In this instance it would be helpful to know your RDBMS version as well. External Tables can be a little frustrating to set up the first time, but as with any new construct you use - it gets easier the more you use it.
    Fourth, are you getting any errors in your log file(s)? If so, what are the errors? Please list the full error message if you have one.
    Finally, with respect to your statement:
    > Posted: Mar 18, 2011 2:30 PM - Madhu: This is very urgent and critical.
    You have to understand that forum contributors are all volunteers - this is not our full-time job. If your issue is truly urgent I suggest you open a Service Request (SR) with Oracle Support! ;-)
    Craig B-)
    If someone's response is helpful or correct, please mark it accordingly.

  • Settings required to process an incoming email with a CSV file attached

    Dear all,
    We have a scenario in which the supplier receives a PO confirmation CSV file from the download center as an email attachment. The supplier updates the file by confirming POs and replies back. The CSV file needs to update the confirmed POs in SNC.
    The SO50 settings are in place, but we are still not able to receive the email back in SNC. Any suggestions on what settings we need in SNC to receive and process the CSV file?
    thanks,
    mahehs

    Hi,
    You can use the upload center to process the changed data. Isn't that helpful?
    Best Regards,
    Harsha Gatt

  • Error while processing csv file

    Hi,
    I get these messages while processing a CSV file to load its content into the database:
    [SSIS.Pipeline] Warning: Warning: Could not open global shared memory to communicate with performance DLL;
     data flow performance counters are not available.  To resolve, run this package as an administrator, or on the system's console.
    [SSIS.Pipeline] Warning: The output column "Copy of Column 13" (842) on output "Data Conversion Output" (798) and component
     "Data Conversion" (796) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.
    Can someone please help me with this?
    With Regards
    litu Here

    Hello Litu,
    Are you using an OLE DB Source and Destination?
    Can you change the source and destination provider to ADO.NET and check whether the issue still persists?
    Mark this post as "Answered" if this addresses your question. 
    Regards,
    Don Rohan [MSFT]

  • Getting Issue while uploading CSV file into internal table

    Hi,
    The CSV file data format is as below:
         a             b               c              d           e               f
    2.01E14     29-Sep-08     13:44:19     2.01E14     SELL     T+1
    The actual value of column A is 201000000000000,
    and of column D is 201000000035690.
    I am uploading the above CSV file into an internal table using the code below:
    TYPES: BEGIN OF TY_INTERN.
            INCLUDE STRUCTURE  KCDE_CELLS.
    TYPES: END OF TY_INTERN.
    CALL FUNCTION 'KCD_CSV_FILE_TO_INTERN_CONVERT'
        EXPORTING
          I_FILENAME      = P_FILE
          I_SEPARATOR     = ','
        TABLES
          E_INTERN        = T_INTERN
        EXCEPTIONS
          UPLOAD_CSV      = 1
          UPLOAD_FILETYPE = 2
          OTHERS          = 3.
      IF SY-SUBRC <> 0.
        MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
                WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
    I am getting all the columns' data into the internal table; the problem is with columns A and D. For both of them the internal table contains 2.01E+14. How can I get the actual values without modifying the CSV file format?
    waiting for your reply...
    thanks & regards,
    abhi

    Hi Saurabh,
    Thanks for your reply.
    I can't even double-click on those columns,
    because the program needs to be executed in the background and there can be a lot of CSV files in one folder, with no manual interaction on those CSV files.
    regards,
    abhi
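    For what it's worth, if the file itself already contains 2.01E14 (as in the sample shown), the missing digits cannot be recovered by any read routine, because the scientific-notation value only carries three significant digits. A quick Java illustration (not an ABAP fix) of what information is actually left in the file:

    import java.math.BigDecimal;

    public class SciNotation {
        public static void main(String[] args) {
            // The CSV literally contains "2.01E14"; this is all the information left.
            BigDecimal a = new BigDecimal("2.01E14");
            System.out.println(a.toPlainString());   // prints 201000000000000
            // The original 201000000035690 of column D is simply not in the file any more,
            // so the usual remedy is to export the column as text (no Excel rounding).
        }
    }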

  • While uploading a CSV file i am getting error

    Hi All,
    I am trying to upload a huge CSV file and it now gives the error below:
    ORA-01653: unable to extend table ADT.SCG_RECIEVABLES2 by 128 in tablespace APEX_1711125608495793205.
    Can any expert please explain what type of error this is and what I need to do to solve it?
    workspace name: ADT
    target table is SCG_RECIEVABLES2.
    Shyam

    Hi,
    The tablespace is full. Contact your DBA or extend the tablespace.
    First hit from googling that ORA error:
    http://www.dbmotive.com/oracle_error_codes.php?errcode=01653
    BR,
    Jari

  • How to align the CSV file on upload?

    Hi All,
    I have to send a CSV file as an attachment in a mail. The data in the internal table that has to go into the CSV file is separated using commas, but on upload everything appears in the CSV file in the same column. I need the data separated into different columns and different lines.
    Please help, it is very urgent.
    Thanks in Advance...

    Hi
    From what I understand, you are talking about download.
    For that you have to concatenate the fields of the final internal table (gt_itab, for example) using a comma, as below.
    TYPES: BEGIN OF ty_lines,
             line(1023) TYPE c,
           END OF ty_lines.
    DATA: l_filename TYPE string VALUE 'C:\temp\abcd.csv',
          gt_lines  TYPE TABLE OF ty_lines,
          gw_lines  TYPE ty_lines.
      LOOP AT gt_itab INTO gw_itab.
        CONCATENATE gw_itab-f1
                    gw_itab-f2 .....
               INTO gw_lines-line
               SEPARATED BY ','.
         APPEND gw_lines TO gt_lines.
      ENDLOOP.
      CALL FUNCTION 'GUI_DOWNLOAD'
        EXPORTING
          filename                = l_filename
          filetype                = 'ASC'
          confirm_overwrite       = 'X'
          no_auth_check           = 'X'
        TABLES
          data_tab                = gt_lines.
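    One caveat with plain comma concatenation (in any language) is fields that themselves contain commas or quotes; the usual CSV convention is to quote such fields and double any embedded quotes. A generic sketch of that rule, shown here in Java rather than ABAP:

    import java.util.Arrays;
    import java.util.List;

    public class CsvLine {
        // Quote a field if it contains a comma, a quote, or a line break,
        // doubling any embedded quotes (the common RFC 4180 convention).
        static String escape(String field) {
            if (field.contains(",") || field.contains("\"") || field.contains("\n")) {
                return "\"" + field.replace("\"", "\"\"") + "\"";
            }
            return field;
        }

        static String toCsvLine(List<String> fields) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < fields.size(); i++) {
                if (i > 0) sb.append(',');
                sb.append(escape(fields.get(i)));
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            System.out.println(toCsvLine(Arrays.asList("Böhm, GmbH", "say \"hi\"", "42")));
            // prints: "Böhm, GmbH","say ""hi""",42
        }
    }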
