CSV files question

Hi all,
I use my iPad for business and I buy motorcycle tests from the DVSA. Once bought, I can download a CSV file to upload to my online diary. If I use my PC, I can download the file and then upload it to my diary with no problem, but if I use my iPad, I can download the file and the diary then says it can't use the file as there are too many lines.
How do I get the iPad to download the file without changing it?
Any help would be great.
Brian

There is a forum here on OTN for XE.
You have to register to see it.
There is a link to the registration page on your XE Database Homepage.
It leads here: http://www.oracle.com/technology/xe/registration/index.html
Once you get to that forum, search for "timeout".

Similar Messages

  • Question about reading csv file into internal table

    Someone in this forum (thanks, those nice guys!) suggested I use FM KCD_CSV_FILE_TO_INTERN_CONVERT to read a CSV file into an internal table. However, it can only read a local file.
    I would like to ask how I can read a CSV file from the application server into an internal table.
    I can't simply use SPLIT, as there may be commas inside the field content, e.g.
    "abc","aaa,ab",10,"bbc"
    My expected output:
    abc
    aaa,ab
    10
    bbc
    Thanks again for your help.

    Hi Gundam,
    Try this code. I have made a custom parser to read the details in the record and split them accordingly. I have also tested it with your provided test cases and it works fine.
    OPEN DATASET dsn FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET dsn INTO record.
      IF sy-subrc NE 0. "no more lines: leave the loop instead of looping forever
        EXIT.
      ENDIF.
      PERFORM parser USING record.
    ENDDO.
    CLOSE DATASET dsn.
    *DATA str(32) VALUE '"abc",10,"aaa,ab","bbc"'.
    *DATA str(32) VALUE '"abc","aaa,ab",10,"bbc"'.
    *DATA str(32) VALUE '"a,bc","aaaab",10,"bbc"'.
    *DATA str(32) VALUE '"abc","aaa,ab",10,"b,bc"'.
    *DATA str(32) VALUE '"abc","aaaab",10,"bbc"'.
    FORM parser USING str.
    DATA field(12).
    DATA field1(12).
    DATA field2(12).
    DATA field3(12).
    DATA field4(12).
    DATA cnt TYPE i.
    DATA len TYPE i.
    DATA temp TYPE i.
    DATA start TYPE i.
    DATA quote TYPE i.
    DATA rec_cnt TYPE i.
    len = strlen( str ).
    cnt = 0.
    temp = 0.
    rec_cnt = 0.
    DO.
    *  Start at the beginning
      IF start EQ 0.
        "string just ENDED start new one.
        start = 1.
        quote = 0.
        CLEAR field.
      ENDIF.
      IF str+cnt(1) EQ '"'.  "Check for quotes
        "Check if the quote flag is already set
        IF quote = 1.
          "Already quotes set
          "Start new field
          start = 0.
          quote = 0.
          CONCATENATE field '"' INTO field.
          IF field IS NOT INITIAL.
            rec_cnt = rec_cnt + 1.
            CONDENSE field.
            IF rec_cnt EQ 1.
              field1 = field.
            ELSEIF rec_cnt EQ 2.
              field2 = field.
            ELSEIF rec_cnt EQ 3.
              field3 = field.
            ELSEIF rec_cnt EQ 4.
              field4 = field.
            ENDIF.
          ENDIF.
    *      WRITE field.
        ELSE.
          "This is the start of quotes
          quote = 1.
        ENDIF.
      ENDIF.
      IF str+cnt(1) EQ ','. "Check end of field
        IF quote EQ 0. "This is not inside quote end of field
          start = 0.
          quote = 0.
          CONDENSE field.
    *      WRITE field.
          IF field IS NOT INITIAL.
            rec_cnt = rec_cnt + 1.
            IF rec_cnt EQ 1.
              field1 = field.
            ELSEIF rec_cnt EQ 2.
              field2 = field.
            ELSEIF rec_cnt EQ 3.
              field3 = field.
            ELSEIF rec_cnt EQ 4.
              field4 = field.
            ENDIF.
          ENDIF.
        ENDIF.
      ENDIF.
      CONCATENATE field str+cnt(1) INTO field.
      cnt = cnt + 1.
      IF cnt GE len.
        EXIT.
      ENDIF.
    ENDDO.
    WRITE: field1, field2, field3, field4.
    ENDFORM.
    Regards,
    Wenceslaus.

  • Creation of CSV file on client machine with data from forms

    Hi,
    My requirement is to generate a CSV file (or .XLS) on the client machine, i.e. the local drive, with the details shown in a form.
    Oracle version -
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    PL/SQL Release 11.1.0.7.0 - Production
    CORE 11.1.0.7.0 Production
    TNS for Solaris: Version 11.1.0.7.0 - Production
    NLSRTL Version 11.1.0.7.0 - Production.
    I have searched the web for the last couple of days and learned that TEXT_IO is to be used to generate files on the client machine. However, when I tried TEXT_IO, it did not generate the file on the client; rather, it generated it on the database server. After further browsing, there was a link which said that we need to use CLIENT_TEXT_IO to generate a file on the client side. For this, I was required to subclass webutil.pll, which I did, and corrected the code to use CLIENT_TEXT_IO. The form was unable to compile and was not able to find the "webutil_core" package.
    I am very confused by the disparity in the information available on the web as to what to use to generate a file on the client side. If anyone has used it in the past, can he/she please detail what to use to get things sorted.
    Thanks,
    R

    > Oracle version - Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    > PL/SQL Release 11.1.0.7.0 - Production
    > CORE 11.1.0.7.0 Production
    > TNS for Solaris: Version 11.1.0.7.0 - Production
    > NLSRTL Version 11.1.0.7.0 - Production.
    So, what is your FORMS version? This is more important than your database version.
    > My requirement is to generate a CSV file (or .XLS) on the client machine ie local drive with the details shown in a form.
    Depending on your Forms version, you would use TEXT_IO (if Forms 6i running in Client/Server mode) or WebUtil (if Forms 9i or higher).
    I'm going to guess that you are at least using Oracle Forms 9i since you stated that your attempt at using TEXT_IO produced a file on the DB server.
    There is more to using WebUtil than just attaching the WEBUTIL.PLL. If you had performed a simple search of the Forms Help System, you would have found numerous WebUtil topics, including: Introduction to WebUtil, Configuring WebUtil, Using WebUtil in Your Applications and the WebUtil User's Guide. If you have Oracle Forms release 10g or higher, WebUtil is included when you install Forms; however, you do need to configure your installation to use WebUtil and you must download the Java COM Bridge (jacob.jar) from SourceForge. Take a look at the Configuring WebUtil Forms Help topic to find out which version of the Java COM Bridge you will need to download.
    After you have successfully configured WebUtil, take a look at the Using WebUtil in Your Applications topic to find out how to implement WebUtil in a form.
    Searching the Internet for answers is great, but don't forget to look at the Forms Help System, because the majority of your questions can be answered there. :)
    Lastly, configuration of WebUtil is primarily done on your Application Server (AS). However, if you plan to perform preliminary runtime testing by running your form from the Forms Builder, then you will configure your local runtime to support WebUtil as well as configuring your AS. The steps are exactly the same. A common mis-step is to skip a step during the configuration because you don't think it applies. Take a look at the Forms Help Runtime Setup Checklist topic for a list of steps you need to complete in order to enable WebUtil.
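    To make the end state concrete, here is a minimal sketch of writing a CSV to the client with WebUtil's CLIENT_TEXT_IO once it is configured as described above (the path and the data are examples only, not from the original post):
    DECLARE
      out_file CLIENT_TEXT_IO.FILE_TYPE;
    BEGIN
      -- opens a file on the *client* machine, not the application server
      out_file := CLIENT_TEXT_IO.FOPEN('C:\temp\export.csv', 'w');
      CLIENT_TEXT_IO.PUT_LINE(out_file, 'EMPNO,ENAME,SAL');
      CLIENT_TEXT_IO.PUT_LINE(out_file, '7839,KING,5000');
      CLIENT_TEXT_IO.FCLOSE(out_file);
    END;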
    Hope this helps,
    Craig B-)
    If someone's response is helpful or correct, please mark it accordingly.

  • How export to csv work in safari browser? In my application export to csv open like a raw data in new tab. But other browsers working great!. Need to open in a csv file or save it as a csv file.

    How does export to CSV work in the Safari browser?
    In my application, export to CSV opens the raw data in a new tab.
    But other browsers work great!
    It needs to open in a CSV file or save it as a CSV file.
    Please advise. Thank you in advance!

    Hi Adrian,
    Why don't you try software other than Notepad for opening CSV files? In my experience, you can use any of these to open a CSV file:
    Microsoft Excel
    Open Office Calc
    Google Docs
    There is also an additional tool available known as CSV Viewer. You may try this; download it from here: http://www.csvviewer.com/
    I've never used Notepad for opening CSV files, because sometimes they contain symbols which are not at all compatible with Notepad.
    Please remember to click "Mark as Answer" on the post that helps you, and to click "Unmark as Answer" if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.

  • Can't paste copied entries from online CSV file into Excel spreadsheet (chatted with zzxc on 5/3/10 at around 2.30 CST but got disconnected. did not get an answer ...)

    Here is the transcript of a chat with Firefox community member zzxc on May 3/10.
    You are now chatting with Firefox community member zzxc
    zzxc: Hello
    zzxc: What happens when you attempt to download a .csv file?
    seegal: hello
    seegal: it doesn't copy
    zzxc: how are you trying to copy?
    seegal: pls bear with me I'm a slow typist. Just copy selected text
    Biolizard has joined the conversation.
    zzxc: ok - which text are you selecting?
    seegal: I reconcile my checkbook (spreadsheet this way). I copy the items in my online bank acc and paste it to the spreadsheet
    seegal: I'm using Firefox /2.0.0.19. Have no problem to do this.
    zzxc: Which version of OS X?
    seegal: In all newer version nothing happens when trying to paste- just doesn't paste
    zzxc: Firefox 2.0.0.x is no longer supported, and hasn't been supported in over a year
    zzxc: Paste into Excel, from Firefox?
    seegal: Sorry, I'm ahead... /2.0.0.19
    seegal: Yes. I open my bank acc in Firefox
    zzxc: Which version of Excel?
    zzxc: It would really help if you could tell me step by step what you're doing.
    seegal: First re: your previous question: it's OS 10.4.11
    seegal: About Excel: it's the 2004 version - the last one produced for Macs. The specific version is 11.3.7
    seegal: So I open my bank acc online in Firefox (my primary browser). I copy the latest entry in the account and paste it into my Excel spreadsheet.
    zzxc: so, you copy direct from the web page without downloading a CSV file?
    seegal: What do you mean by downloading to CSV file? I could export from Firefox to the CSV file, but the other way around?
    zzxc: Are you copying your bank statement directly from the web site to Excel using the clipboard?
    seegal: I don't use the clipboard. This is a Mac. There is no need to do that. On a PC it would be yes.
    zzxc: I need to know the exact steps you're taking to get them into excel
    zzxc: And I need to know what exactly goes wrong in the latest version of Firefox.
    seegal: Do you have a Mac there with Firefox and Excel? It would be very easy to reproduce. Imagine you open an online bank acc, select some entries, click "copy", then proceed to your already open Excel spreadsheet and click "paste". That's it!
    zzxc: When this happens, do you get cryptic code pasted into Excel?
    seegal: As I said before: in all newer versions starting with 3.0 when I go to Excel to "paste" from my bank acc nothing happens. It does not paste. No, I don't get a cryptic code pasted, just NOTHING.
    zzxc: what if you paste into MS Word instead?
    seegal: haven't tried that, the formatting most likely would be lost. Tried that with another Excel spreadsheet - it lost all the formatting and pasted as continuous text.
    == This happened ==
    Every time Firefox opened
    Please see the copy of the chat above. THIS IS MAC OS X. In older versions, prior to 3.0, I could copy from the CSV file on the website (bank acc) and paste directly into my Excel spreadsheet to reconcile my account.

    See also:
    Table2Clipboard: https://addons.mozilla.org/firefox/addon/1852
    TableTools: https://addons.mozilla.org/firefox/addon/2637

  • Memory problem with loading a csv file and displaying 2 xy graphs

    Hi there, i'm having some memory issues with this little program.
    What I'm trying to do is read a .csv file of 215 MB (6 million lines, more or less), extract the x-y values as 1D arrays and display them in 2 XY graphs (VI attached).
    I've noticed that this process eats from 1.6 to 2 GB of RAM, and the 2 XY graphs, as soon as they are loaded (2 minutes, more or less), are really, really slow to move with the scrollbar.
    My question is: is there a way to use fewer memory resources and make the graphs scroll more smoothly?
    Thanks in advance,
    Ierman Gert
    Attachments:
    read from file test.vi ‏106 KB

    Hi Ierman,
    how many datapoints do you need to handle? How many do you display on the graphs?
    Some notes:
    - Each graph has its own data buffer, so all data wired to a graph will be buffered again in memory. When wiring a (big) 1D array to a graph, a copy will be made in memory. And you mentioned 2 graphs...
    - Load the array in parts: read a number of lines, parse them to arrays as before (maybe using "Spreadsheet String to Array"?), and finally append the parts to build the big array (this may lead to memory problems too).
    - Avoid data copies when handling big arrays. You can show buffer creation using menu -> Tools -> Advanced -> Show Buffer Allocation.
    - Use SGL instead of DBL when possible...
    Message Edited by GerdW on 05-12-2009 10:02 PM
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome

  • Use PL/SQL procedure to guard against malformed CSV files before upload

    Hi all,
    In my CSV upload utility, I've implemented error-checking procedures, but they are invoked only AFTER the data has been uploaded to a temp table. This doesn't catch the following sample scenarios:
    1. The CSV is malformed, with some rows shifted due to fields and separators missing here and there.
    2. The user has chosen the wrong CSV to upload (most likely a field-count mismatch).
    I'm wondering if it is a good idea to have a procedure read in the CSV and scan each line, counting the number of fields (null fields whose delimiters show the field exists are OK) for each record. If every single record matches the required number of fields for that table, then the CSV is OK and the insert from the external table to the temp table is allowed. This would ensure at least that the CSV file has a valid matrix size for the target table, with the rest of the error checking left until after the temp table is populated.
    One of my concerns is that I specify "missing field values are null" in the external table parameters, which is necessary since not all fields are required. Does this specification cause a row with missing trailing separators to still be considered valid? If so, the stored procedure must be programmed to allow for such a case.
    What do you think? Many thanks.
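    For what it's worth, a minimal sketch of that pre-scan idea, assuming an Oracle directory object CSV_DIR and a plain comma delimiter with no commas embedded inside quoted fields (REGEXP_COUNT needs 11g; on 10g, LENGTH/REPLACE arithmetic does the same job):
    create or replace function csv_has_valid_shape (
        p_filename       in varchar2,
        p_fields_per_row in pls_integer
    ) return boolean
    as
        l_file utl_file.file_type;
        l_line varchar2(4000);
    begin
        l_file := utl_file.fopen('CSV_DIR', p_filename, 'r', 4000);
        loop
            begin
                utl_file.get_line(l_file, l_line);
            exception
                when no_data_found then exit;
            end;
            -- a row with N fields carries N-1 separators; trailing separators
            -- left in place for null fields still count, as required above
            if regexp_count(l_line, ',') != p_fields_per_row - 1 then
                utl_file.fclose(l_file);
                return false;
            end if;
        end loop;
        utl_file.fclose(l_file);
        return true;
    end;
    /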

    Hi, Cuauhtemoc Amox
    Thank you for your advice. I have set up Web ADI using the PL/SQL interface.
    I have decided to go further.
    I have a procedure like this:
    procedure delete_old_data (p_id in number, p_description in varchar2)
    as
    begin
      if g_flag = 'N' then
        delete from xx_test;
        g_flag := 'Y';
      end if;
      insert into xx_test values (p_id, p_description);
    end;
    G_FLAG is a global variable with a default value of 'N' in my package. When Web ADI uploads the
    first row from the Excel sheet, the procedure deletes all data from the table and changes g_flag to 'Y'.
    All other rows upload successfully; it works fine.
    But when the user changes data in the Excel sheet and tries to upload a second time, the DELETE_OLD_DATA procedure doesn't work properly.
    I have found what the problem is: when the user uses the same template and tries to upload data several times, there is the same SESSION_ID in the database,
    i.e. g_flag = 'Y' when the user tries to upload the second time. But when the user logs off and creates the template again, it works properly.
    My question is: how can I differentiate upload attempts? Maybe there is some ID for each upload process.
    If I can use this ID in my procedure, users will have the opportunity to reuse the same template.
    Thank you

  • Inserting an image from a CSV file

    I am trying to create a document that is a directory of members of a community.  The directory is in a CSV file.  One of the fields is a @photos field that contains the path to the location of the images on my local computer.  When I try to preview with the Data Merge panel I get the attached error message.  The path reads:
    /Users/jhaynes/Documents/Falcons_Landing/images
    All the other fields preview correctly.

    @Jim – the right syntax for your Mac OSX would be:
    Macintosh_HD:Users:jhaynes:Documents:Falcons_Landing:images:jhaynes.jpg
    : instead of /
    Note: it's all case sensitive.
    In my personal case the line above would not work, because in my system it's "Macintosh HD" (a blank between Macintosh and HD) instead of "Macintosh_HD" with an underscore. But this may vary…
    It's amazing that InDesign's Data Merge functionality on Mac OSX relies on that old syntax, whereas you could go to your images folder in the Finder by typing "Users/jhaynes/Documents/Falcons_Landing/images" in the Go To Folder dialog. Using the : syntax there would not work.
    That leaves one question:
    What to do with special characters in file names like umlauts äÄöÖüÜ, accented characters like áàÁÀ, etc.?
    Will they break that workflow on Mac OSX with HFS+ formatted volumes?
    Very likely…
    And that's because on Mac OSX umlauts and accented characters in file names are composite characters (made out of two glyphs) instead of single characters. But not only because of that…
    Here an example:
    You found out that the easiest way to get a file path is to drag a file to a pure text file opened in TextEdit App.
    Fine. You now know that all "/" signs must be changed to ":". No problem with a quick find/change action.
    Then you saved the file as pure text file with a txt suffix.
    All good and well?
    Perhaps. Perhaps not.
    A simple "ö" in a file name could ruin your workflow!
    Datamerge in InDesign would accept the text file as source, but would have problems to place the file with the umlaut. Precisely it will see the "ö" not as "ö" but as: "Äà"!
    Ah! You think, the composite characters problem kicks in. And right you are.
    No problem then: we could replace "ö" with our own typed "ö" in TextEdit.
    But to our surprise Data Merge gets the following instead of an ö: "√∂". Another composite…
    If you open the text file in TextEdit you cannot spot the difference.
    You can only see the representation of the character "ö".
    Here the solution to this puzzle:
    Choose "Unicode (UTF-16)" instead of "Unicode (UTF-8)" when saving your text file to pure text in TextEdit.
    Uwe

  • Is there a way to name the .indd files created by a data merge with values in the CSV file

    I have a data merge document that is set to Single Record per Document page and 1 page per document.  In my CSV file I have a field for Part Number.  If I run a data merge with 10 records I end up with 10 .indd files.  I need a way for the resulting .indd files to be named with the value in the Part Number field.

    Loic has provided a link for my original piece, and I've written up some follow-up pieces for indesignsecrets.com:
    http://indesignsecrets.com/data-merging-individual-records-separate-pdfs.php
    http://indesignsecrets.com/data-merging-individual-records-separate-pdfs-part-2-scripting.php
    So there are several ways to get uniquely named PDFs from an InDesign Data Merge. However, none of these 3 articles truly answers your question of how to get unique InDesign filenames using the database. I can see a practical purpose for this: merging business cards directly to PDFs is great, but I can guarantee that while on proof there will be alterations to the business cards that need to be done outside the merge, meaning specific records need to be exported to InDesign files for further manipulation.

  • How can I export multiple selections in a list box into a .csv file?

    Hi all, I've created a form in Acrobat X with a list box enabled for multiple selections. When I try to export the filled out form into a .csv file, only the first selection shows up. Can anyone help me figure out how to get all selections to export? Thanks!

    Thank you for your quick response!
    Once a recipient fills out the form (which has two questions with list boxes and multiple selection enabled) they send the completed form back to me. When I open the completed form, I am given the option to add the completed form to a response file which was set up when I distributed the form. When I open the response file, it lists all of the responses in rows, and the values that were chosen in the form are arranged in columns. In this file, the list box columns have multiple values, separated by commas (which is what I'm looking for). At this point there is an option to export into a .csv file or an .xml file. If I choose the .csv file and open it in Excel, only the first selection shows in the list box column rather than all selections that were initially made by the responder.

  • How to read/write .CSV file into CLOB column in a table of Oracle 10g

    I have a requirement which is nothing but a table with two columns:
    create table emp_data (empid number, report clob)
    Here the REPORT column is of CLOB data type, which is used to hold the data loaded from the .csv file.
    The requirement here is:
    1) How to load data from a .CSV file into the CLOB column along with the empid, using the DBMS_LOB utility.
    2) How to read the report column so that it returns all the columns present in the .CSV file (dynamically, because every csv file may have a different number of columns), along with the primary key empid.
    eg: empid report_field1 report_field2
    1 x y
    Any help would be appreciated.

    If I understand you right, you want each row in your table to contain an emp_id and the complete text of a multi-record .csv file.
    It's not clear how you relate emp_id to the appropriate file to be read. Is the emp_id stored in the csv file?
    To read the file, you can use functions from [UTL_FILE|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#BABGGEDF] (as long as the file is in a directory accessible to the Oracle server):
    declare
        lt_report_clob CLOB;
        l_max_line_length integer := 1024;   -- set as high as the longest line in your file
        l_infile UTL_FILE.file_type;
        l_buffer varchar2(1024);
        l_emp_id report_table.emp_id%type := 123; -- not clear where emp_id comes from
        l_filename varchar2(200) := 'my_file_name.csv';   -- get this from somewhere
    begin
       -- open the file; we assume an Oracle directory has already been created
        l_infile := utl_file.fopen('CSV_DIRECTORY', l_filename, 'r', l_max_line_length);
        -- initialise the empty clob
        dbms_lob.createtemporary(lt_report_clob, TRUE, DBMS_LOB.session);
        loop
          begin
             utl_file.get_line(l_infile, l_buffer);
             dbms_lob.append(lt_report_clob, l_buffer);
          exception
             when no_data_found then
                 exit;
          end;
        end loop;
        insert into report_table (emp_id, report)
        values (l_emp_id, lt_report_clob);
        -- free the temporary lob
        dbms_lob.freetemporary(lt_report_clob);
       -- close the file
       UTL_FILE.fclose(l_infile);
    end;
    This simple line-by-line approach is easy to understand, and gives you an opportunity (if you want) to take each line in the file and transform it (for example, you could transform it into a nested table, or into XML). However it can be rather slow if there are many records in the csv file - the lob_append operation is not particularly efficient. I was able to improve the efficiency by caching the lines in a VARCHAR2 up to a maximum cache size, and only then appending to the LOB - see [three posts on my blog|http://preferisco.blogspot.com/search/label/lob].
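    As a hedged illustration of that caching idea (reusing the names from the block above; DBMS_LOB.WRITEAPPEND is used because the cache is a VARCHAR2, and the 32000 threshold is an assumption, not a tuned value):
    declare
        lt_report_clob CLOB;
        l_infile       UTL_FILE.file_type;
        l_buffer       varchar2(1024);
        l_cache        varchar2(32767);
    begin
        l_infile := utl_file.fopen('CSV_DIRECTORY', 'my_file_name.csv', 'r', 1024);
        dbms_lob.createtemporary(lt_report_clob, TRUE, DBMS_LOB.session);
        loop
          begin
             utl_file.get_line(l_infile, l_buffer);
          exception
             when no_data_found then
                 exit;
          end;
          -- flush the cache to the LOB only when the next line would overflow it
          if nvl(lengthb(l_cache), 0) + lengthb(l_buffer) > 32000 then
              dbms_lob.writeappend(lt_report_clob, length(l_cache), l_cache);
              l_cache := null;
          end if;
          l_cache := l_cache || l_buffer;
        end loop;
        if l_cache is not null then
            dbms_lob.writeappend(lt_report_clob, length(l_cache), l_cache);  -- final flush
        end if;
        utl_file.fclose(l_infile);
        -- insert into report_table and free the temporary lob as in the original
        dbms_lob.freetemporary(lt_report_clob);
    end;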
    There is at least one other possibility:
    - you could use [DBMS_LOB.loadclobfromfile|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_lob.htm#i998978]. I've not tried this before myself, but I think the procedure is described [here in the 9i docs|http://download.oracle.com/docs/cd/B10501_01/appdev.920/a96591/adl12bfl.htm#879711]. This is likely to be faster than UTL_FILE (because it is all happening in the underlying DBMS_LOB package, possibly in a native way).
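    An untested sketch of that second approach (directory and file names reused from the block above; the parameter names follow the DBMS_LOB documentation):
    declare
        l_clob        clob;
        l_bfile       bfile := bfilename('CSV_DIRECTORY', 'my_file_name.csv');
        l_dest_offset integer := 1;
        l_src_offset  integer := 1;
        l_lang_ctx    integer := dbms_lob.default_lang_ctx;
        l_warning     integer;
    begin
        dbms_lob.createtemporary(l_clob, true, dbms_lob.session);
        dbms_lob.fileopen(l_bfile, dbms_lob.file_readonly);
        -- load the whole BFILE into the CLOB in one call
        dbms_lob.loadclobfromfile(
            dest_lob     => l_clob,
            src_bfile    => l_bfile,
            amount       => dbms_lob.lobmaxsize,
            dest_offset  => l_dest_offset,
            src_offset   => l_src_offset,
            bfile_csid   => dbms_lob.default_csid,
            lang_context => l_lang_ctx,
            warning      => l_warning);
        dbms_lob.fileclose(l_bfile);
        insert into report_table (emp_id, report) values (123, l_clob);  -- 123 as in the original example
        dbms_lob.freetemporary(l_clob);
    end;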
    That's all for now. I haven't yet answered your question on how to report data back out of the CLOB. I would like to know how you associate employees with files; what happens if there is > 1 file per employee, etc.
    HTH
    Regards Nigel
    Edited by: nthomas on Mar 2, 2009 11:22 AM - don't forget to fclose the file...

  • Error while processing csv file

    Hi,
    I get these messages while processing a csv file to load its content into the database:
    [SSIS.Pipeline] Warning: Warning: Could not open global shared memory to communicate with performance DLL;
     data flow performance counters are not available.  To resolve, run this package as an administrator, or on the system's console.
    [SSIS.Pipeline] Warning: The output column "Copy of Column 13" (842) on output "Data Conversion Output" (798) and component
     "Data Conversion" (796) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.
    Can someone please help me with this?
    With Regards,
    Litu

    Hello Litu,
    Are you using: OLEDB Source and Destination ?
    Can you change the source and destination provider to ADO.NET and check if the issue still persists?
    Mark this post as "Answered" if this addresses your question. 
    Regards,
    Don Rohan [MSFT]

  • 10g - disconnected analytics, encrytion on csv file in client machine

    Hi, experts,
    I found that the Disconnected Analytics client first downloads the data from the server and SAVES IT AS CSV FILES,
    but everyone can view the content of the CSV files.
    IT IS UNSAFE.
    Can any encryption be done for this issue?
    Thank you very much!

    user11861756 wrote:
    I am trying to get a csv file on the client machine. My database server is configured in Oracle 10g within a Unix
    box. Though I am able to get a csv file on my server by using PL/SQL code and can have it on my local machine by
    FTP, the problem is we don't have any permission to create any file in our production environment.
    Is there any process or utility with which I can build on my previous approach to get my csv file on my client
    machine directly?
    The answer to this question is the question: What is client-server?
    PL/SQL, SQL, Oracle, database ... all these equal server.
    SQL*Plus, TOAD, web browsers... all these equal client.
    Who handles the presentation of data on the client? Who has access to the client platform's keyboard, mouse, screen, printer, disk drives, etc.? The client.
    So how can a server process or component then hack across the client-server architecture and into a client component and write data (e.g. CSV files) there?
    This is a fundamental concept that needs to be clearly understood when dealing with all aspects of client-server.
    What also needs to be understood that client-server is a software architecture. This means that client-server can run on a single machine. Client can run on a PC. Server can run on a big Unix machine. Or server can run on the PC and the client can run on the big Unix machine.
    Which means that you can have PL/SQL code (on the server platform) acting as a client.. and use server software on your client platform to service that PL/SQL session.
    For example: PL/SQL creates a CSV "file" as a CLOB. PL/SQL uses an FTP package and opens a client FTP session to an FTP server on your client PC. Now your PC is the FTP server and PL/SQL the FTP client. The client then proceeds to FTP the CLOB as a CSV file.

  • Data formatting and reading a CSV file without using Sqlloader

    I am reading a csv file into an Oracle table called sps_dataload. The table is structured based on the record type of the data at the beginning of
    each record in the csv file. But the first two lines of the file are not going to be loaded to the table due to the format.
    Question # 1:
    How can I skip reading the first two lines from my csv file?
    Question # 2:
    There are more fields in the csv file than there are columns in my table. I know I can add FILLER as an option, but then there are
    about 150-odd comma-separated fields in the file and my table has 8 columns to load from the file. So, do I really have to use FILLER
    140-odd times in my script, or is there a better way to do this?
    Question # 3:
    This is more of an extension of my question above. The csv file has fields with block quotes - I know this could be handled in SQL*Loader with OPTIONALLY ENCLOSED BY '"'.
    But can this be done in the insert as created in the code below?
    I am trying to find the "wrap code" button in my post, but do not see it.
    Heres my file layout -
    PROSPACE SCHEMATIC FILE
    ; Version 2007.7.1
    Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
    Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
    Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
    Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
    Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
    Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
    Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
    This is my table structure -
    desc sps_dataload;
    File_Name    Varchar2(50) Not Null,
    Record_Layer Varchar2(20) Not Null,
    Level_Id     Varchar2(20),
    Desc1        Varchar2(50),
    Desc2        Varchar2(50),
    Desc3        Varchar2(50),
    Desc4        Varchar2(50)
    Heres my code to do this -
    create or replace procedure insert_spsdataloader(p_filepath IN varchar2,
    p_filename IN varchar2,
    p_Totalinserted IN OUT number) as
    v_filename varchar2(30) := p_filename;
    v_filehandle UTL_FILE.FILE_TYPE;
    v_startPos number; --starting position of a field
    v_Pos number; --position of string
    v_lenstring number; --length of string
    v_record_layer varchar2(20);
    v_level_id varchar2(20) := 0;
    v_desc1 varchar2(50);
    v_desc2 varchar2(50);
    v_desc3 varchar2(50);
    v_desc4 varchar2(50);
    v_input_buffer varchar2(1200);
    v_delChar varchar2(1) := ',';
    v_str varchar2(255);
    BEGIN
    v_Filehandle :=utl_file.fopen(p_filepath, p_filename, 'r');
    p_Totalinserted := 0;
    LOOP
    BEGIN
    UTL_FILE.GET_LINE(v_filehandle,v_input_buffer);
    EXCEPTION
    WHEN NO_DATA_FOUND THEN
    EXIT;
    END;
    -- this will read the 1st field from the file --
    v_Pos := instr(v_input_buffer,v_delChar,1,1);
    v_lenString := v_Pos - 1;
    v_record_layer := substr(v_input_buffer,1,v_lenString);
    v_startPos := v_Pos + 1;
    -- this will read the 2nd field from the file --
    v_Pos := instr(v_input_buffer,v_delChar,1,2);
    v_lenString := v_Pos - v_startPos;
    v_desc1 := substr(v_input_buffer,v_startPos,v_lenString);
    v_startPos := v_Pos + 1;
    -- this will read the 3rd field from the file --
    v_Pos := instr(v_input_buffer,v_delChar,1,3);
    v_lenString := v_Pos - v_startPos;
    v_desc2 := substr(v_input_buffer,v_startPos,v_lenString);
    v_startPos := v_Pos + 1;
    -- this will read the 4th field from the file --
    v_Pos := instr(v_input_buffer,v_delChar,1,4);
    v_lenString := v_Pos - v_startPos;
    v_desc3 := substr(v_input_buffer,v_startPos,v_lenString);
    v_startPos := v_Pos + 1;
    -- this will read the 5th field from the file --
    v_Pos := instr(v_input_buffer,v_delChar,1,5);
    v_lenString := v_Pos - v_startPos;
    v_desc4 := substr(v_input_buffer,v_startPos,v_lenString);
    v_startPos := v_Pos + 1;
    v_str := 'insert into sps_dataload values ('''||v_filename||''','''||v_record_layer||''','''||v_level_id||''','''||v_desc1||''','''||v_desc2||''','''||v_desc3||''','''||v_desc4||''')';
    Execute immediate v_str;
    p_Totalinserted := p_Totalinserted + 1;
    commit;
    END LOOP;
    UTL_FILE.FCLOSE(v_filehandle);
    EXCEPTION
    WHEN UTL_FILE.INVALID_OPERATION THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20051, 'sps_dataload: Invalid Operation');
    WHEN UTL_FILE.INVALID_FILEHANDLE THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20052, 'sps_dataload: Invalid File Handle');
    WHEN UTL_FILE.READ_ERROR THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20053, 'sps_dataload: Read Error');
    WHEN UTL_FILE.INVALID_PATH THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20054, 'sps_dataload: Invalid Path');
    WHEN UTL_FILE.INVALID_MODE THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20055, 'sps_dataload: Invalid Mode');
    WHEN UTL_FILE.INTERNAL_ERROR THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20056, 'sps_dataload: Internal Error');
    WHEN VALUE_ERROR THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20057, 'sps_dataload: Value Error');
    WHEN OTHERS THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE;
    END insert_spsdataloader;
    /
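    For reference, questions 1 and 2 map directly onto SQL*Loader options; here is a hedged control-file fragment (the field names are illustrative, not from the actual layout):
    OPTIONS (SKIP=2)                     -- question 1: skip the first two lines
    LOAD DATA
    INFILE 'sps_dataload.csv'
    APPEND INTO TABLE sps_dataload
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'   -- question 3: quoted fields
    TRAILING NULLCOLS
    ( record_layer  CHAR,
      skip1         FILLER,             -- question 2: FILLER skips a field, but
      skip2         FILLER,             -- it does need one entry per skipped field
      desc1         CHAR,
      desc2         CHAR
    )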

    Justin, thanks. I did happen to change my PL/SQL procedure using utl_file.get_line and modifying the instr function based on the position of ',' in the file, but my procedure is getting really big and too complex to debug. So I got motivated to use external tables or SQL*Loader as plan B.
    As I was reading more about creating an external table as an efficient way to do this, I came to believe I can perhaps build an external table with my varying selection from the file. But I am still unclear whether I can construct my external table by choosing different fields in a record based on a record identifier string value (which is the first field of any record). I guess I can, but I am looking for the construct: how am I going to use the instr function for selecting the field from the file while creating the table?
    PROSPACE SCHEMATIC FILE
    ; Version 2007.7.1
    Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
    Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
    Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
    Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
    Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
    Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
    Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
    For example, if I want to create an external table like this -
    CREATE TABLE extern_sps_dataload
    ( record_layer            VARCHAR2(20),
      attr1                   VARCHAR2(20),
      attr2                   VARCHAR2(20),
      attr3                   VARCHAR2(20),
      attr4                   VARCHAR2(20)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY dataload
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        BADFILE     dataload:'sps_dataload.bad'
        LOGFILE     dataload:'sps_dataload.log'
        DISCARDFILE dataload:'sps_dataload.dis'
        SKIP 2
        VARIABLE 2 FIELDS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"' LRTRIM
        MISSING FIELD VALUES ARE NULL
        +LOAD WHEN RECORD_LAYER = 'PROJECT' (FIELD2, FIELD3, FIELD7, FIELD9)+
        +LOAD WHEN RECORD_LAYER = 'PRODUCT' (FIELD3, FIELD4, FIELD8, FIELD9)+
        +LOAD WHEN RECORD_LAYER = 'SEGMENT' (FIELD1, FIELD2, FIELD4, FIELD5)+
      )
      LOCATION ('sps_dataload.csv')
    )
    REJECT LIMIT UNLIMITED;
    While I was reading the external table documentation, I thought I could achieve something similar by using the position_spec option, but I am not getting behind its parameters. I have highlighted in italics in the code above (from LOAD WHEN ... FIELDS ...) the part I think I am going to use, but I am not sure of its construct.
    Thank you for your help!! Appreciate your thoughts on this.
    Sanders.
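    One untested direction, offered only as a sketch: in ORACLE_LOADER, LOAD WHEN filters whole records rather than picking different fields per record, so one external table per record type (names assumed from the sample file above) may be closer to the documented construct, with a UNION ALL view over the tables if a single source is needed:
    CREATE TABLE extern_sps_products
    ( record_layer VARCHAR2(20),
      attr1        VARCHAR2(40),
      attr2        VARCHAR2(40),
      attr3        VARCHAR2(40),
      attr4        VARCHAR2(40)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY dataload
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        SKIP 2
        LOAD WHEN (record_layer = 'Product')   -- keep Product records only
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LRTRIM
        MISSING FIELD VALUES ARE NULL
        ( record_layer CHAR(20),
          attr1        CHAR(40),
          attr2        CHAR(40),
          attr3        CHAR(40),
          attr4        CHAR(40)
        )
      )
      LOCATION ('sps_dataload.csv')
    )
    REJECT LIMIT UNLIMITED;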

  • How to Compare 2 CSV file and store the result to 3rd csv file using PowerShell script?

    I want to do the below task using powershell script only.
    I have 2 csv files and I want to compare those two files and store the comparison result in a 3rd csv file. Please look at the following snap:
    (The image shows a csv file.)
    Could anyone please help me with this?
    Thanks in advance.
    By
    A Path finder 
    JoSwa
    If a post answers your question, please click "Mark As Answer" on that post and "Mark as Helpful"
    Best Online Journal

    Not certain this is what you're after, but this:
    #import the contents of both csv files
    $dbexcel=import-csv c:\dbexcel.csv
    $liveexcel=import-csv C:\liveexcel.csv
    #prepare the output csv and create the headers
    $outputexcel="c:\outputexcel.csv"
    $outputline="Name,Connection Status,Version,DbExcel,LiveExcel"
    $outputline | out-file $outputexcel
    #Loop through each record based on the number of records (assuming equal number in both files)
    for ($i=0; $i -le $dbexcel.Length-1; $i++)
    {
        # Assign the yes / null values to equal the word equivalent
        if ($dbexcel.isavail[$i] -eq "yes") {$dbavail="Available"} else {$dbavail="Unavailable"}
        if ($liveexcel.isavail[$i] -eq "yes") {$liveavail="Available"} else {$liveavail="Unavailable"}
        # create the line of csv content from the two input csv files
        $outputline=$dbexcel.name[$i] + "," + $liveexcel.'connection status'[$i] + "," + $dbexcel.version[$i] + "," + $dbavail + "," + $liveavail
        # output that line to the csv file
        $outputline | out-file $outputexcel -Append
    }
    should do what you're looking for, or give you enough to edit it to your exact need.
    I've assumed that the dbexcel.csv and liveexcel.csv files live in the root of c:\ for this, that they include the header information, and that the outputexcel.csv file will be saved to the same place (including headers).
