CSV upload limit

Hi all,
Can anyone please tell me how to upload a CSV file to a table containing 90 columns in APEX?
Thanks and regards,
Shoaib

I think you will have two options:
1. Create an application using the 'From Spreadsheet' option, and then, if you want to use that data in an already existing table, process it with a procedure or something similar.
Old, but it will give you some pointers: http://avdeo.com/2008/05/21/uploading-excel-sheet-using-oracle-application-express-apex/
2. Upload the CSV using UTL_FILE and a directory object. You can also use external tables: http://docs.oracle.com/cd/B19306_01/server.102/b14215/et_concepts.htm#i1009391
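As a rough illustration of option 2, here is a minimal external-table sketch. The directory object name (DATA_DIR), the file name (apps.csv) and the three sample columns are assumptions, not from the original post; a real definition would list all 90 columns of the target table.

-- Assumes a directory object DATA_DIR pointing at the folder that holds apps.csv.
-- Only three placeholder columns are shown; the real table would declare all 90.
CREATE TABLE apps_ext (
  hostname   VARCHAR2(20),
  ipaddress  VARCHAR2(16),
  env_type   VARCHAR2(20)
  -- ... remaining columns ...
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1                        -- skip the CSV header row
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('apps.csv')
)
REJECT LIMIT UNLIMITED;

Once the external table exists, loading the real table is a plain INSERT ... SELECT over it, which copes with 90 columns just as easily as with three.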

Similar Messages

  • How to have concurrent CSV upload and manual edit in APEX on the same table

    Hi there,
    my users want CSV upload and manual edit on APEX pages working together...
    that means they want to insert data by CSV upload and also have an interactive report and form on the same table...
    so if user A changes something in the CSV file, then user B updates or deletes another record from the APEX front end (manually in the APEX form), and then user A uploads the CSV file again, the record changed by user B would be overwritten...
    So how can I prevent that?

    Hi Huzaifa,
    Thanks for the reply...
    I'm going to use a File Browse item so that end users can upload the files themselves...
    after the users have edited the data, a manager is going to review it and, in case of approval, I need to insert the data into one final table...
    so it takes a lot of effort to check the two source tables, and in case of a difference between the CSV table and the other one... what to do?
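    One way to avoid silently overwriting user B's manual edits is an optimistic-lock style guard: keep a LAST_UPDATE_DATE column on the final table and let the CSV re-upload touch only rows that have not been changed since the previous upload. A minimal sketch, in which the table names (STAGE_DATA, FINAL_DATA) and the :P1_LAST_CSV_UPLOAD item are assumptions for illustration:

    -- Rows manually edited after the last CSV upload keep their manual values;
    -- everything else is refreshed from the staging table loaded from the CSV.
    MERGE INTO final_data f
    USING stage_data s
      ON (f.id = s.id)
    WHEN MATCHED THEN
      UPDATE SET f.col1 = s.col1,
                 f.col2 = s.col2,
                 f.last_update_date = SYSDATE
      WHERE f.last_update_date <= :P1_LAST_CSV_UPLOAD   -- skip rows edited since the last upload
    WHEN NOT MATCHED THEN
      INSERT (id, col1, col2, last_update_date)
      VALUES (s.id, s.col1, s.col2, SYSDATE);

    The skipped rows can then be reported back to the manager for review instead of being overwritten.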

  • CSV upload and update table fields

    Hi,
    Is there any way to update fields of the underlying table from HTML DB while importing the CSV file? For example, suppose table XYZ has 200 employees, where EMPNO is the primary key.
    Once we upload the data from the CSV it is fine. Now, after two weeks, I need to update SAL and ENAME for only 100 of those previously loaded employees. When I try to upload the updated data, a unique constraint violation occurs because those EMPNO values are already present. So my concern is: how do I get out of this problem, especially when a huge number of records need updating while some records already present need no update at all? Would a CSV update be helpful in those circumstances?
    One thing we could do is track down the list of 100 employees whose information is going to be updated, put that information in Excel, add the remaining 100 employees to that spreadsheet, and do the CSV upload again.
    This is a huge issue: not only is it tedious, it is also error-prone, and the effort will grow as the amount of data in the table increases.
    Can anyone please suggest something?
    Regards,
    ROSY.

    The best way to solve your problem is the following.
    1º) Save the file with the data to import as a BLOB item.
    OK, that can be done: the CSV is placed in a BLOB column.
    2º) Make a procedure to manage this BLOB item that can be fired by the user pushing a button (you can do that easily from HTML DB).
    How do I manage it? What needs to fire to push the CSV into the target table with the actual field population?
    3º) In this procedure, insert the records freely and don't worry about anything more (if you are worried about duplicate records, just add some PL/SQL lines in the same procedure to make sure a record is not a duplicate before inserting it).
    ??? If the data already exists, it would need to be truncated before the insert; that is not my requirement. Rather, the existing data should be updated based on the PK whenever the CSV contains changes for that data.
    4º) After you are sure the insert works properly, and if you need any other treatment of the records, apply the treatments you need with a trigger on the table or by calling another procedure from that trigger.
    That can be done easily with a before-insert trigger.
    Remember: don't put in any commit.
    Sure.
    5º) Get down off the horse. Don't take coffee or tea in excess. And remember we are all the same: people who just want to go and eat with their children at the weekend.
    I'm on land, no way to get down... I don't drink anything, I just eat and eat... but this is not the right place for comments about the weekend...
    Regards and good luck.
    Anyway, the embargo starts... enjoy. Cheers... Wish you all a happy Xmas...
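    For the insert-versus-update concern above, a single MERGE statement is the usual way around the unique constraint violation: rows whose EMPNO already exists are updated, the rest are inserted. A minimal sketch, assuming the parsed CSV rows have first been placed in a staging table (EMP_STAGE is a hypothetical name):

    -- EMP_STAGE holds the rows parsed from the CSV.
    -- Existing employees are updated, new ones inserted; no duplicate-key error is raised.
    MERGE INTO xyz t
    USING emp_stage s
      ON (t.empno = s.empno)
    WHEN MATCHED THEN
      UPDATE SET t.ename = s.ename,
                 t.sal   = s.sal
    WHEN NOT MATCHED THEN
      INSERT (empno, ename, sal)
      VALUES (s.empno, s.ename, s.sal);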

  • Upload Limit in WSE 2012 R2

    Hi everyone.
    In WSE 2012 R2 it is possible to upload up to 2 GB at once. Is it possible to change this value? Maybe somewhere in the registry?
    Thanks for help.

    Hi,
    Based on your description, please refer to the following similar question and check whether it's helpful.
    File size upload limit
    http://social.technet.microsoft.com/Forums/windowsserver/en-US/0f961d6b-6074-460d-b3db-e71fbc4689ff/file-size-upload-limit?forum=winserveressentials
    Hope this helps.
    Best regards,
    Justin Gu

  • Download/Upload Limit with Shaping

    I have a network with 8-10 PC's both wireless and wired.
    I am looking for an ADSL 2+ Modem with Router that will -
    - Designate IP addresses based on MAC addresses.
    - Allow me to set a download/upload limit based on amount not speed (ie give everyone 20GB download/upload each month) for each IP Address.
    - Allow me to shape the speed if they go over their 20GB limit, so they are not cut off from the internet, but will get a slower speed.
    - Has a phone port so an Analog phone can be connected for VOIP, and SIP settings for voip connection. (Not a big requirement)
    Does Cisco have any product (even if it is only a router) that fully or partially does the above? It must be able to allocate blocks/amounts of traffic to each IP address on the network, so download/upload limits can be shared equally. (Even if a more expensive router is available/designed for a larger operation, I would be willing to look at it.)
    Note I would prefer something where the "firmware" has the necessary options to set the above.
    Thank you
    John

    my signal indicator on the menu bar shows all bars.
    As you have probably guessed by now, the "bars" are there pretty much for show, as they tell you very little about the actual strength of your signal, and nothing about the noise.
    My Mac is on the floor just above the AP and a total of about 30 feet away.
    A typical wall constructed of sheet rock and 2 x 4s in a home will on average, absorb 15-25% of the signal. A ceiling is much thicker, and typically will absorb 45-60% of the signal.
    What about different channels, taking it off "automatic" and choosing a particular channel?
    The "Automatic" setting will scan all frequencies and select the "best" channel that is available at the time. You can set the channels manually, but which channel to choose? 
    You just have to experiment as no particular channel is really "better" than another. It all depends on what channels are in use at any particular time, and that is hard to know, even with good scanning equipment.
    A lot of networks are "invisible". You cannot "see" them, so there are more around you than you might think.
    Is ridding the house of cordless phones the only answer?
    The cordless phones may or may not be causing problems for the wireless network. The only way that you will know is to pick a time when you can turn them and their base stations off for a few hours and then restart your entire network.  Keep an eye on things to see if you notice any difference in performance.
    Moving the AP location is not possible, as the cable modem can't be moved.
    Perhaps the modem cannot be moved, but you can move the AirPort Extreme anywhere that an Ethernet cable will reach from the modem. You can use virtually any length of cable you want in a home.

  • CSV upload for end user

    I've searched all over the web and this forum and haven't found exactly what I'm looking for. I see this referenced a lot (http://avdeo.com/2008/05/21/uploading-excel-sheet-using-oracle-application-express-apex/), but it's still a little too technical for me to figure out.
    Is there a packaged app or a dumbed down "how to" for creating a csv upload? I want the end user to be able to upload a csv file (same columns/headings, but the number of rows will vary) and display the table they just uploaded in a report.
    Thank you!!

    Hi Bob,
    Where exactly are you stuck? In the example that Tony linked, all you have to do is modify ~line 53 to the number of columns that will go into your table (and the name of the file browse item).
    i.e.
    APEX_COLLECTION.ADD_MEMBER(
        p_collection_name => c_collection_name,
        p_c001 => v_curr_row(1),
        p_c002 => v_curr_row(2)   -- add more of these for the number of columns in your csv file
    );
    Or does the example not meet your requirements? If I interpret it correctly, you want the user to be able to upload a file where they have specified the column names at the top of the CSV file? I have created an example using this technique, although I am not entirely happy with it yet (because it does not confirm the data that is to be inserted).
    Ta,
    Trent
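    To show what was just uploaded in a report, the report region can select straight from the APEX_COLLECTIONS view. A short sketch; the collection name 'CSV_UPLOAD' is a placeholder for whatever c_collection_name is set to in the upload process:

    SELECT c001 AS col1,
           c002 AS col2        -- add one cNNN alias per CSV column
      FROM apex_collections
     WHERE collection_name = 'CSV_UPLOAD'
     ORDER BY seq_id;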

  • Character Set Settings Affecting CSV Upload

    Hi,
    We have a production issue that I need a little help with. Using one of the examples listed here I have created a CSV upload; on our dev box it works fine, however we have a problem on production.
    The database is set to the WE8ISO8859P1 character set, but for some reason on production the web browser keeps changing it to UTF8, so some of the characters it uploads are incorrect, e.g. £ becomes Â£
    I'm guessing this may be a setup issue, as this does not happen on dev, but I've run out of things to check.
    Automatic CSV Encoding is set to Yes on both environments.
    The following select returns WE8ISO8859P1 on both environments
    select value$ charset, 'sys.props$' source from sys.props$ where name='NLS_CHARACTERSET' ;
    select property_value, 'database_properties' from database_properties where property_name='NLS_CHARACTERSET' ;
    select value, 'nls_database_parameters' from nls_database_parameters where parameter='NLS_CHARACTERSET';
    I have limited access to the system, but what else can I check?
    Thanks in advance

    Hi Andy,
    1) You say that "you have created a csv upload". What does this mean?
    2) How are you loading data? Is it through Application Express -> Utilities -> Data Load? Or is this something you've built into your own application?
    3) What is the encoding of the CSV file you're using to upload data?
    Automatic CSV Encoding only impacts CSV download from reports. It has nothing to do with data loading.
    Joel
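    If the root cause turns out to be the encoding of the uploaded file rather than the database character set, one option worth trying is to convert the uploaded BLOB to a CLOB with an explicit source character set instead of relying on a default. A rough sketch, assuming the file really is WE8ISO8859P1 and that :P1_FILE is a hypothetical File Browse item name:

    DECLARE
      v_blob      BLOB;
      v_clob      CLOB;
      v_dest_off  INTEGER := 1;
      v_src_off   INTEGER := 1;
      v_lang_ctx  INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
      v_warning   INTEGER;
    BEGIN
      SELECT blob_content INTO v_blob
        FROM wwv_flow_files
       WHERE name = :P1_FILE;                    -- hypothetical File Browse item
      DBMS_LOB.CREATETEMPORARY(v_clob, TRUE);
      -- State which character set the file bytes are in instead of letting it be guessed.
      DBMS_LOB.CONVERTTOCLOB(
        dest_lob     => v_clob,
        src_blob     => v_blob,
        amount       => DBMS_LOB.LOBMAXSIZE,
        dest_offset  => v_dest_off,
        src_offset   => v_src_off,
        blob_csid    => NLS_CHARSET_ID('WE8ISO8859P1'),
        lang_context => v_lang_ctx,
        warning      => v_warning);
      -- v_clob can now be parsed line by line exactly like the other examples in this thread.
    END;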

  • CSV Upload Procedure

    I am using Apex 3.2
    I have built a csv upload procedure with help from this thread
    {message:id=10502964}
    All seems to be working fine, but I need a validation on the number of columns before I insert into my table.
    How can I count the columns in an APEX collection?
    Cheers
    Gus
    Edited by: Gus C on Aug 31, 2012 1:26 AM
    Edited by: Gus C on Aug 31, 2012 1:27 AM

    Gus C wrote:
    I am using Apex 3.2
    I have built a csv upload procedure with help from this thread
    {message:id=10502964}
    All seems to be working fine, but I need a validation on the number of columns before I insert into my table.
    How can I count the columns in an APEX collection?
    My thinking is that the number of "columns in an APEX collection" is fixed and depends on the APEX version and your definition of "column". It could be 50 if you are only considering the VARCHAR2 columns (as is likely for the purposes of CSV processing), or 51 in APEX 3.2, or 63 in APEX 4.1...
    I'd say you need to determine the number of columns from the CSV file (by counting the [maximum number of] delimiters) rather than from the contents of the collection. Once the data is in the collection, how can you tell whether empty columns are beyond the end of the row or are trailing null values?
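    A sketch of that suggestion: count the separators in the header line before loading the collection and compare against the expected column count. The sample header and the expected count of 3 are placeholders; in the real process the line would come from the parsed file:

    DECLARE
      v_line          VARCHAR2(32767) := 'EMPNO,ENAME,SAL';   -- placeholder header line
      v_expected_cols CONSTANT PLS_INTEGER := 3;
      v_actual_cols   PLS_INTEGER;
    BEGIN
      -- naive count: separators + 1 (does not allow for commas inside quoted values)
      v_actual_cols := LENGTH(v_line) - NVL(LENGTH(REPLACE(v_line, ',')), 0) + 1;
      IF v_actual_cols != v_expected_cols THEN
        raise_application_error(-20001,
          'Expected ' || v_expected_cols || ' columns but found ' || v_actual_cols);
      END IF;
    END;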

  • CSV upload fails when the header contains a colon (:)

    Hi,
    There seems to be a problem with the CSV upload to database tables. If a column header of the CSV file contains a colon, the process fails.
    Please help!!
    Mustak

    Hi Andy,
    Thanks! The APEX version is 2.0. I have developed an application where I upload models and their features from a CSV file. The CSV file headers are model names, feature names etc., and the rows are the models and their corresponding features. Some of the features (used as headers) contain colons, such as "3:2-pulldown compensation", a feature of a TV. The upload process reads the models and features and loads the feature values into the appropriate table. So you can imagine the column headers are important for the process. I wonder if there is any workaround.
    Mustak
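    The colon is a separator character in several APEX utilities (wwv_flow_utilities.string_to_table, for instance, splits on ':' by default), which may be why those headers break the process. One hedged workaround is to neutralise colons in the header line before it is parsed and map them back when the feature names are stored. A rough sketch:

    DECLARE
      v_header      VARCHAR2(4000) := 'Model,3:2-pulldown compensation,Screen size';
      v_headers_tab wwv_flow_global.vc_arr2;
    BEGIN
      -- swap colons for a placeholder so they cannot be taken for a separator
      v_header := REPLACE(v_header, ':', '~');
      v_headers_tab := wwv_flow_utilities.string_to_table(v_header, ',');
      -- restore the colons when the feature names are stored
      FOR i IN 1 .. v_headers_tab.COUNT LOOP
        v_headers_tab(i) := REPLACE(v_headers_tab(i), '~', ':');
      END LOOP;
    END;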

  • Hitting the 20Mb upload limit

    Our respondents are hitting the 20MB upload limit: is there any way of increasing the restriction? We are limiting each to one 20MB item per 'question' but we still get problems with uploaded scans being refused. (Ironically, they're Adobe PDFs that are causing the problem)

    Hi YTKO,
    FormsCentral only supports file sizes up to 20 MB, and there isn't any way to increase that.
    Thanks,
    Vikrantt Singh

  • Remote Web Workspace Upload Limit.

    Hi - Just noticed the upload limit for RWW on SBS 2011 is 2GB.
    I would like to make this unlimited, or at least increase the limit to 20 GB. This increase is to enable the site to handle huge amounts of application log files for diagnostics... there can be thousands of individual files at 1-2 MB, or one file at 10 GB+.
    Thanks,
    John.

    Hi,
    As far as I know, when we upload files via RWW we see the following note: "The upload limit per file is 2GB." This is by design, so it seems there is no method to increase it. Sorry about that.
    Hope this helps.
    Best regards,
    Justin Gu

  • Upload limit exceeded

    I am trying to complete a document on EchoSign and cannot attach new files; the error message "upload limit exceeded" keeps coming up.

    Hello Okoyeiyke,
    If the document you are uploading exceeds the size or page limits, you will get this error. Here is the link for reference:
    http://helpx.adobe.com/echosign/kb/transaction-limits.html
    -Rijul

  • Data in CSV uploads successfully, but it is not visible after upload.

    Hi,
    I am using Apex 3.2 on Oracle 11g.
    This is an imported application for which I am making changes as per my requirements. As I am new to Apex and even SQL, I request forum members to help me with this.
    Please find below the old code for uploading data from CSV. It displays only 6 columns - Database Name, Server Name, Application Name, Application Provider, Critical, Remarks. This was successfully uploading all the data from CSV and that data was visible after upload.
    OLD CODE:
    --PLSQL code for uploading application details
    DECLARE
    v_blob_data      BLOB;
    v_blob_len      NUMBER;
    v_position      NUMBER;
    v_raw_chunk      RAW(10000);
    v_char           CHAR(1);
    c_chunk_len           NUMBER:= 1;
    v_line           VARCHAR2 (32767):= NULL;
    v_data_array      wwv_flow_global.vc_arr2;
    v_rows           NUMBER;
    v_count           NUMBER;
    v_dbid           NUMBER;
    v_serverid           NUMBER;
    v_sr_no          NUMBER:=1;
    v_last_char          varchar2(2);
    BEGIN
    -- Read data from wwv_flow_files
    SELECT blob_content INTO v_blob_data FROM wwv_flow_files
    WHERE last_updated = (SELECT MAX(last_updated) FROM wwv_flow_files WHERE UPDATED_BY = :APP_USER)
    AND id = (SELECT MAX(id) FROM wwv_flow_files WHERE updated_by = :APP_USER);
    v_blob_len := dbms_lob.getlength(v_blob_data);
    v_position := 1;
    -- For removing the first line
    WHILE ( v_position <= v_blob_len )
    LOOP
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
    v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
    v_position := v_position + c_chunk_len;
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    EXIT;
    END IF;
    END LOOP;
    -- Read and convert binary to char
    WHILE ( v_position <= v_blob_len )
    LOOP
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
    v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
    v_line := v_line || v_char;
    v_position := v_position + c_chunk_len;
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    --removing the new line character added in the end
    v_line := substr(v_line, 1, length(v_line)-2);
    --removing the double quotes
    v_line := REPLACE (v_line, '"', '');
    --checking the absence of data at the end
    v_last_char:= substr(v_line,length(v_line),1);
    IF v_last_char = CHR(44) THEN
         v_line :=v_line||'-';
    END IF;
    -- Convert each column separated by , into array of data
    v_data_array := wwv_flow_utilities.string_to_table (v_line, ',');
    -- Insert data into target tables
    SELECT SERVERID into v_serverid FROM REPOS_SERVERS WHERE SERVERNAME=v_data_array(2);
    SELECT DBID into v_dbid FROM REPOS_DATABASES WHERE DBNAME=v_data_array(1) AND SERVERID=v_serverid;
    --Checking whether the data already exist
    SELECT COUNT(APPID) INTO v_count FROM REPOS_APPLICATIONS WHERE DBID=v_dbid AND APPNAME=v_data_array(1);
    IF v_count = 0 THEN
    EXECUTE IMMEDIATE 'INSERT INTO
    REPOS_APPLICATIONS (APPID,APPNAME,APP_PROVIDER,DBID,SERVERID,CRITICAL,LAST_UPDATE_BY,LAST_UPDATE_DATE,REMARKS) VALUES(:1,:2,:3,:4,:5,:6,:7,:8,:9)'
    USING
    APP_ID_SEQ.NEXTVAL,
    v_data_array(3),
    v_data_array(4),
    v_dbid,
    v_serverid,
    v_data_array(5),
    v_data_array(6),
    v_data_array(7),
    v_data_array(8);
    END IF;
    -- Clearing out the previous line
    v_line := NULL;
    END IF;
    END LOOP;
    END;
    ==============================================================================================================================
    Please find below the new code (which I modified as per my requirements) for uploading data from CSV. It displays 17 columns - Hostname, IP Address, Env Type, Env Num, Env Name, Application, Application Component, Notes, Cluster , Load Balanced, Business User Access Mechanism for Application, Env Owner, Controlled Environment, SSO Enabled, ADSI / LDAP / External Directory Authentication, Disaster Recovery Solution in Place, Interfaces with other application.
    This is successfully uploading all the data from the CSV, but the uploaded data is not visible in its respective tab.
    NEW CODE:
    --PLSQL code for uploading application details
    DECLARE
    v_blob_data      BLOB;
    v_blob_len      NUMBER;
    v_position      NUMBER;
    v_raw_chunk      RAW(10000);
    v_char           CHAR(1);
    c_chunk_len           NUMBER:= 1;
    v_line           VARCHAR2 (32767):= NULL;
    v_data_array      wwv_flow_global.vc_arr2;
    v_rows           NUMBER;
    v_count           NUMBER;
    v_dbid           NUMBER;
    v_serverid           NUMBER;
    v_sr_no          NUMBER:=1;
    v_last_char          varchar2(2);
    BEGIN
    -- Read data from wwv_flow_files
    SELECT blob_content INTO v_blob_data FROM wwv_flow_files
    WHERE last_updated = (SELECT MAX(last_updated) FROM wwv_flow_files WHERE UPDATED_BY = :APP_USER)
    AND id = (SELECT MAX(id) FROM wwv_flow_files WHERE updated_by = :APP_USER);
    v_blob_len := dbms_lob.getlength(v_blob_data);
    v_position := 1;
    -- For removing the first line
    WHILE ( v_position <= v_blob_len )
    LOOP
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
    v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
    v_position := v_position + c_chunk_len;
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    EXIT;
    END IF;
    END LOOP;
    -- Read and convert binary to char
    WHILE ( v_position <= v_blob_len )
    LOOP
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
    v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
    v_line := v_line || v_char;
    v_position := v_position + c_chunk_len;
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    --removing the new line character added in the end
    v_line := substr(v_line, 1, length(v_line)-2);
    --removing the double quotes
    v_line := REPLACE (v_line, '"', '');
    --checking the absence of data at the end
    v_last_char:= substr(v_line,length(v_line),1);
    IF v_last_char = CHR(44) THEN
         v_line :=v_line||'-';
    END IF;
    -- Convert each column separated by , into array of data
    v_data_array := wwv_flow_utilities.string_to_table (v_line, ',');
    -- Insert data into target tables
    --SELECT SERVERID into v_serverid FROM REPOS_SERVERS WHERE SERVERNAME=v_data_array(2);
    --SELECT DBID into v_dbid FROM REPOS_DATABASES WHERE DBNAME=v_data_array(1) AND SERVERID=v_serverid;
    --Checking whether the data already exist
    --SELECT COUNT(APPID) INTO v_count FROM REPOS_APPLICATIONS WHERE DBID=v_dbid AND APPNAME=v_data_array(1);
    IF v_count = 0 THEN
    EXECUTE IMMEDIATE 'INSERT INTO
    REPOS_APPLICATIONS (APPID,HOSTNAME,IPADDRESS,ENV_TYPE,ENV_NUM,ENV_NAME,APPLICATION,APPLICATION_COMPONENT,NOTES,CLSTR,LOAD_BALANCED,BUSINESS,ENV_OWNER,CONTROLLED,SSO_ENABLED,ADSI,DISASTER,INTERFACES) VALUES(:1,:2,:3,:4,:5,:6,:7,:8,:9,:10,:11,:12,:13,:14,:15,:16,:17,:18)'
    USING
    APP_ID_SEQ.NEXTVAL,
    v_data_array(1),
    v_data_array(2),
    v_data_array(3),
    v_data_array(4),
    v_data_array(5),
    v_data_array(6),
    v_data_array(7),
    v_data_array(8),
    v_data_array(9),
    v_data_array(10),
    v_data_array(11),
    v_data_array(12),
    v_data_array(13),
    v_data_array(14),
    v_data_array(15),
    v_data_array(16),
    v_data_array(17);
    END IF;
    -- Clearing out the previous line
    v_line := NULL;
    END IF;
    END LOOP;
    END;
    ============================================================================================================================
    FYI, the CREATE TABLE is as below:
    CREATE TABLE "REPOS_APPLICATIONS"
    (     "APPID" NUMBER,
         "APPNAME" VARCHAR2(50),
         "APP_PROVIDER" VARCHAR2(50),
         "DBID" NUMBER,
         "CRITICAL" VARCHAR2(3),
         "REMARKS" VARCHAR2(255),
         "LAST_UPDATE_DATE" TIMESTAMP (6) DEFAULT SYSDATE NOT NULL ENABLE,
         "LAST_UPDATE_BY" VARCHAR2(10),
         "SERVERID" NUMBER,
         "HOSTNAME" VARCHAR2(20),
         "IPADDRESS" VARCHAR2(16),
         "ENV_TYPE" VARCHAR2(20),
         "ENV_NUM" VARCHAR2(20),
         "ENV_NAME" VARCHAR2(50),
         "APPLICATION" VARCHAR2(50),
         "APPLICATION_COMPONENT" VARCHAR2(50),
         "NOTES" VARCHAR2(255),
         "CLSTR" VARCHAR2(20),
         "LOAD_BALANCED" VARCHAR2(20),
         "BUSINESS" VARCHAR2(255),
         "ENV_OWNER" VARCHAR2(20),
         "CONTROLLED" VARCHAR2(20),
         "SSO_ENABLED" VARCHAR2(20),
         "ADSI" VARCHAR2(20),
         "DISASTER" VARCHAR2(50),
         "INTERFACES" VARCHAR2(50),
         CONSTRAINT "REPOS_APPLICATIONS_PK" PRIMARY KEY ("APPID") ENABLE
    ALTER TABLE "REPOS_APPLICATIONS" ADD CONSTRAINT "REPOS_APPLICATIONS_R01" FOREIGN KEY ("DBID")
         REFERENCES "REPOS_DATABASES" ("DBID") ENABLE
    ALTER TABLE "REPOS_APPLICATIONS" ADD CONSTRAINT "REPOS_APPLICATIONS_R02" FOREIGN KEY ("SERVERID")
         REFERENCES "REPOS_SERVERS" ("SERVERID") ENABLE
    ==============================================================================================================================
    It would be of great help if someone could help me resolve this issue with uploading data from CSV.
    Thanks & Regards
    Sharath

    Hi,
    You can see the installed dictionaries and change between them by right-clicking inside a live text box (e.g. the box you are in when replying) and choosing "Languages", or by right-clicking the "Search" box in the top right corner of this page and choosing "Check Spelling".
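    A note on the likely cause, grounded in the new code in the question above: the three SELECT ... INTO statements that populated v_serverid, v_dbid and v_count are commented out, so v_count is never assigned. It stays NULL, the condition IF v_count = 0 is therefore never true, and the EXECUTE IMMEDIATE insert never runs, which would explain why no data appears. A minimal fix is to initialise the counter, or to re-enable a duplicate check that matches the new column layout, for example:

    -- In the DECLARE section, give the counter a starting value:
    v_count NUMBER := 0;

    -- Or, inside the loop, re-enable a duplicate check against columns that exist in the new layout
    -- (HOSTNAME is v_data_array(1) and APPLICATION is v_data_array(6) in the INSERT above):
    SELECT COUNT(APPID) INTO v_count
      FROM REPOS_APPLICATIONS
     WHERE HOSTNAME = v_data_array(1)
       AND APPLICATION = v_data_array(6);

    Note also that the new INSERT no longer populates DBID or SERVERID, so any report or tab that joins REPOS_APPLICATIONS to REPOS_DATABASES or REPOS_SERVERS would not show the new rows even after they are inserted.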

  • CSV upload -- suggestion needed for non-English characters in the CSV file

    Hi All,
    I have a process which uploads a CSV file into a table. It works with normal English characters, but when the CSV file contains non-English characters it doesn't populate the correct columns.
    My CSV file content is:
    First Name | Middle Name | Last Name
    José | # | Reema
    Sam | # | Peter
    The output is coming out like this (the last name comes up blank):
    First Name | Middle Name | Last Name
    Jos鬣 | Reema | (blank)
    Sam | # | Peter
    http://apex.oracle.com/pls/otn/f?p=53121:1
    workspace- gil_dev
    user- apex
    password- apex12
    Thanks for your help.
    Manish

    Manish,
    PROCEDURE csv_to_array (
          -- Utility to take a CSV string, parse it into a PL/SQL table
          -- Note that it takes care of some elements optionally enclosed
          -- by double-quotes.
          p_csv_string   IN       VARCHAR2,
          p_array        OUT      wwv_flow_global.vc_arr2,
          p_separator    IN       VARCHAR2 := ';')
       IS
          l_start_separator   PLS_INTEGER    := 0;
          l_stop_separator    PLS_INTEGER    := 0;
          l_length            PLS_INTEGER    := 0;
          l_idx               BINARY_INTEGER := 0;
          l_quote_enclosed    BOOLEAN        := FALSE;
          l_offset            PLS_INTEGER    := 1;
       BEGIN
          l_length := NVL (LENGTH (p_csv_string), 0);
          IF (l_length <= 0)
          THEN
             RETURN;
          END IF;
          LOOP
             l_idx := l_idx + 1;
             l_quote_enclosed := FALSE;
             IF SUBSTR (p_csv_string, l_start_separator + 1, 1) = '"'
             THEN
                l_quote_enclosed := TRUE;
                l_offset := 2;
                l_stop_separator :=
                       INSTR (p_csv_string, '"', l_start_separator + l_offset, 1);
             ELSE
                l_offset := 1;
                l_stop_separator :=
                   INSTR (p_csv_string,
                          p_separator,
                          l_start_separator + l_offset,
                           1);
             END IF;
             IF l_stop_separator = 0
             THEN
                l_stop_separator := l_length + 1;
             END IF;
             p_array (l_idx) :=
                (SUBSTR (p_csv_string,
                         l_start_separator + l_offset,
                      (l_stop_separator - l_start_separator - l_offset)));
          EXIT WHEN l_stop_separator >= l_length;
             IF l_quote_enclosed
             THEN
                l_stop_separator := l_stop_separator + 1;
             END IF;
             l_start_separator := l_stop_separator;
          END LOOP;
       END csv_to_array;
    and
    PROCEDURE get_records (p_clob IN CLOB, p_records OUT varchar2_t)
       IS
          l_record_separator   VARCHAR2 (2) := CHR (13) || CHR (10);
          l_last               INTEGER;
          l_current            INTEGER;
       BEGIN
          -- If HTMLDB has generated the file,
          -- it will be a Unix text file. If user has manually created the file, it
          -- will have DOS newlines.
          -- If the file has a DOS newline (cr+lf), use that
          -- If the file does not have a DOS newline, use a Unix newline (lf)
          IF (NVL (DBMS_LOB.INSTR (p_clob, l_record_separator, 1, 1), 0) = 0)
          THEN
             l_record_separator := CHR (10);
          END IF;
          l_last := 1;
          LOOP
             l_current := DBMS_LOB.INSTR (p_clob, l_record_separator, l_last, 1);
             EXIT WHEN (NVL (l_current, 0) = 0);
          p_records (p_records.COUNT + 1) :=
             REPLACE (DBMS_LOB.SUBSTR (p_clob, l_current - l_last, l_last),
                      CHR (13), NULL);   -- second REPLACE argument reconstructed here: strip any stray CR
          l_last := l_current + LENGTH (l_record_separator);
          END LOOP;
       END get_records;
    Denes Kubicek
    http://deneskubicek.blogspot.com/
    http://www.opal-consulting.de/training
    http://htmldb.oracle.com/pls/otn/f?p=31517:1
    -------------------------------------------------------------------
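    For context, a rough sketch of how these two procedures are typically chained together; the varchar2_t type and the step that turns the uploaded BLOB into a CLOB are assumed to exist in the surrounding package, as in the rest of this thread:

    DECLARE
       l_clob    CLOB;                      -- the uploaded CSV, already converted from BLOB to CLOB
       l_records varchar2_t;                -- collection type used by get_records above
       l_fields  wwv_flow_global.vc_arr2;
    BEGIN
       -- populate l_clob from the uploaded file (e.g. wwv_flow_files.blob_content converted to CLOB)
       get_records (l_clob, l_records);
       FOR i IN 1 .. l_records.COUNT LOOP
          csv_to_array (l_records (i), l_fields, ',');   -- pass ',' explicitly; the default separator is ';'
          -- l_fields (1), l_fields (2), ... now hold the columns of record i
       END LOOP;
    END;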

  • Error in CSV upload

    Hi Gurus,
    while uploading data from a CSV file I am getting an error at one field. The field is of type BETRG (CURR) and the value in the file is 500, but when I checked in debug, a '#' had been added to the 500, i.e. 500#. The '#' is added automatically... why is it picking up the '#'? Any help please...
    Thanks and regards,
    venky.

    @venkatt... this might help you
      DATA LO_EL_CONTEXT TYPE REF TO IF_WD_CONTEXT_ELEMENT.
      DATA LS_CONTEXT TYPE WD_THIS->ELEMENT_CONTEXT.
      DATA ITEM_FILE TYPE WD_THIS->ELEMENT_CONTEXT-EXCEL_UPLOAD.
    * get element via lead selection
      LO_EL_CONTEXT = WD_CONTEXT->GET_ELEMENT( ).
    * @TODO handle not set lead selection
      IF LO_EL_CONTEXT IS INITIAL.
      ENDIF.
    * get single attribute
      LO_EL_CONTEXT->GET_ATTRIBUTE(
        EXPORTING
          NAME =  `EXCEL_UPLOAD`
        IMPORTING
          VALUE = ITEM_FILE ).
    DATA S_CONT TYPE STRING.
    DATA CONVT TYPE REF TO CL_ABAP_CONV_IN_CE.
    CONSTANTS: c_encoding TYPE abap_encoding VALUE '8404'. "optional
    CALL METHOD cl_abap_conv_in_ce=>create
    EXPORTING
    encoding = c_encoding "optional
    input = ITEM_FILE
    RECEIVING
    conv = CONVT.
    CALL METHOD CONVT->read
    IMPORTING
    data = S_CONT.
    CONVT = CL_ABAP_CONV_IN_CE=>CREATE( INPUT = ITEM_FILE ).
    CONVT->READ( IMPORTING DATA = S_CONT ).
    TYPES: BEGIN OF TY_TAB,
           NAME_CHAR TYPE  STRING,
           DESCR_CHAR TYPE  STRING,
           NUMBER_DIGITS TYPE  STRING,
           END OF TY_TAB.
    DATA: FIELDS TYPE STRING_TABLE.
    DATA: LV_FIELD TYPE STRING.
    DATA: S_TABLE TYPE STRING_TABLE.
    DATA: ITAB TYPE TABLE OF TY_TAB.
    DATA: STR_ITAB TYPE TY_TAB.
    *splits string based on new line
    SPLIT S_CONT AT CL_ABAP_CHAR_UTILITIES=>CR_LF INTO TABLE S_TABLE.
    FIELD-SYMBOLS: <WA_TABLE> LIKE LINE OF S_TABLE.
    LOOP AT S_TABLE ASSIGNING <WA_TABLE>.
    * splits string at commas
      SPLIT <WA_TABLE> AT ',' INTO
                      STR_ITAB-NAME_CHAR
                      STR_ITAB-DESCR_CHAR
                      STR_ITAB-NUMBER_DIGITS.
      APPEND STR_ITAB TO ITAB.
    ENDLOOP.
      DATA: IT_ALV_STYLE TYPE STANDARD TABLE OF ZMTCD_ALV_CHARACT_STYLE,
            WA_ALV_STYLE LIKE LINE OF IT_ALV_STYLE.
      DATA : IT_CHARACTERISTIC TYPE ZMTCD_CHAR_T ,
             IT_TEMP_CHARACTERISTIC TYPE ZMTCD_CHAR_T,
             WA_CHAR  LIKE LINE OF IT_CHARACTERISTIC,
             WA_TEMP_CHARACTERISTIC LIKE LINE OF IT_TEMP_CHARACTERISTIC.
        REFRESH IT_ALV_STYLE.
        LOOP AT ITAB INTO STR_ITAB.
        MOVE-CORRESPONDING STR_ITAB TO WA_ALV_STYLE.
        APPEND WA_ALV_STYLE TO IT_ALV_STYLE.
        ENDLOOP.
      DATA CONTEXT_NODE TYPE REF TO IF_WD_CONTEXT_NODE.
      DATA LO_INTERFACECONTROLLER TYPE REF TO IWCI_SALV_WD_TABLE .
        LO_INTERFACECONTROLLER =   WD_THIS->WD_CPIFC_ALV1( ).
        CONTEXT_NODE = WD_CONTEXT->GET_CHILD_NODE( NAME = 'CTX_VN_ALV1' ).
        LO_INTERFACECONTROLLER->SET_DATA(
    *     only_if_new_descr =                           " wdy_boolean
          R_NODE_DATA = CONTEXT_NODE ).                 " ref to if_wd_context_node
        CONTEXT_NODE->BIND_TABLE( IT_ALV_STYLE ).
    In the above program the data is first read into an xstring variable, then passed to classes/methods to decode it, then split properly into rows and columns, and finally bound to an internal table and displayed in the output.
