Problems with sqlldr, creating .ctl file

Hi, I am trying to load data from a .csv file into an Oracle table. I have been successful loading some .csv files, but at this point I am stuck on one .csv file that produces incorrect rows.
Each row of the .csv file starts on a new line, with columns separated by the ',' delimiter. Unfortunately, some values in the .csv file contain newline characters, so a row gets broken across lines and continues on the next line, e.g.
"foo", "foofoo", "foofoo", "lalalala
lallalallala", "foo", "foo"
Does anyone know what is missing from my .ctl file below so that it reads the correct values? Also, the ',' delimiter appears inside some values, e.g. "foo, foo". Does the OPTIONALLY ENCLOSED BY clause below ignore those embedded commas, as I want it to?
OPTIONS (SKIP=1)
LOAD DATA
BADFILE 'import.bad'
APPEND
INTO TABLE tablename
FIELDS TERMINATED BY "," optionally enclosed by '"'
--columns
Thanks

Can you get the data supplied to you without the newline characters in the strings? It would make it a lot easier.

Unfortunately not, as this is an assignment. I also tried converting to a tab-delimited file and adjusting the .ctl file accordingly, with the same result. Some records load perfectly but others do not, and I cannot for the life of me pinpoint why. I am new to this, and I can tell, as you have assured me, that newlines contained in values are ugly. Regardless, I am open to suggestions if anyone has any.
Thank you greatly for your help too; it is always appreciated.
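For reference, here is a minimal control-file sketch of the workaround usually suggested for embedded newlines: tell SQL*Loader that a logical record ends only at a closing double quote followed by a linefeed (the STR clause on INFILE), instead of at every newline. The OPTIONALLY ENCLOSED BY '"' clause you already have does take care of commas inside quoted values, so that part is fine. Everything below is an untested sketch: the file name 'import.csv', the six column names and the CHAR lengths are placeholders, the header line is assumed to end in a quoted value too (otherwise remove SKIP=1 and strip the header beforehand), and for a file created on Windows the terminator would be X'220D0A' (quote + CR + LF) instead of X'220A'.

OPTIONS (SKIP=1)
LOAD DATA
INFILE 'import.csv' "str X'220A'"  -- logical record ends at " + LF, not at every LF
BADFILE 'import.bad'
APPEND
INTO TABLE tablename
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  col1 CHAR(4000),
  col2 CHAR(4000),
  col3 CHAR(4000),
  col4 CHAR(4000),
  col5 CHAR(4000),
  -- The record terminator swallows the closing quote of the last field,
  -- so read it unenclosed and strip the leftover quote and leading blank.
  col6 CHAR(4000) TERMINATED BY ',' "LTRIM(REPLACE(:col6, CHR(34)))"
)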

Similar Messages

  • Problem with sqlldr and commit

    Hi,
    I have a problem with sqlldr and commit.
    I have a simple table with one column [ col_id number(6) not null ]. The column "col_id" is the primary key of the table. I have one file with 100,000 records (the numbers from 0 to 99,999).
    I want to load the file into the table with sqlldr (SQL*Loader), but I want to commit only if all records are loaded. If even one record is rejected, I want the whole file to be discarded.
    The problem is that in conventional path the commit happens every 64 rows by default, so I cannot load the whole file as a single transaction, and in direct path sqlldr disables the primary key :(
    Is there a solution?
    Thanks
    Sorry for my bad English.

    This is my table:
    DROP TABLE TEST_SQLLOADER;
    CREATE TABLE TEST_SQLLOADER
    (     COL_ID NUMBER NOT NULL,
          CONSTRAINT TEST_SQLLOADER_PK PRIMARY KEY (COL_ID)
    );
    This is my control file (test_sql_loader.ctl):
    OPTIONS
    ( DIRECT=false
    , DISCARDMAX=1
    , ERRORS=0
    , ROWS=100000
    )
    load data
    infile './test_sql_loader.csv'
    append
    into table TEST_SQLLOADER
    fields terminated by "," optionally enclosed by '"'
    ( col_id )
    test_sql_loader.csv contains:
    0
    1
    2
    3
    99999
    I run SQL*Loader:
    sqlldr xxx/yyy@orcl control=test_sql_loader.ctl log=test_sql_loader.log
    Output on the screen:
    Commit point reached - logical record count 92256
    Commit point reached - logical record count 93248
    Commit point reached - logical record count 94240
    Commit point reached - logical record count 95232
    Commit point reached - logical record count 96224
    Commit point reached - logical record count 97216
    Commit point reached - logical record count 98208
    Commit point reached - logical record count 99200
    Commit point reached - logical record count 100000
    Logfile
    SQL*Loader: Release 11.2.0.1.0 - Production on Sat Oct 3 14:50:17 2009
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Control File: test_sql_loader.ctl
    Data File: ./test_sql_loader.csv
    Bad File: test_sql_loader.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 0
    Bind array: 100000 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    Table TEST_SQLLOADER, loaded from every logical record.
    Insert option in effect for this table: APPEND
    Column Name      Position   Len  Term  Encl  Datatype
    COL_ID           FIRST      *    ,     O(")  CHARACTER
    value used for ROWS parameter changed from 100000 to 992
    Table TEST_SQLLOADER:
    100000 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 255936 bytes(992 rows)
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 100000
    Total logical records rejected: 0
    Total logical records discarded: 0
    Run began on Sat Oct 03 14:50:17 2009
    Run ended on Sat Oct 03 14:50:18 2009
    Elapsed time was: 00:00:01.09
    CPU time was: 00:00:00.06
    The commit happens every 992 rows:
    if I get an error on record 993, the first 992 rows have already been committed :(
    Edited by: inter1908 on 3-Oct-2009 15.00
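    For an all-or-nothing load, one alternative is to skip SQL*Loader's bind-array commits entirely and read the file through an external table, then load it with a single INSERT ... SELECT: one SQL statement either inserts all 100,000 rows or, if any record is bad (REJECT LIMIT 0) or violates the primary key, inserts none of them. This is only a sketch of that idea, not from the thread: the DATA_DIR name and the '/path/to/data' path are assumptions, and creating the directory object needs the appropriate privilege (or a DBA to create it for you).

    CREATE DIRECTORY data_dir AS '/path/to/data';  -- assumed directory containing test_sql_loader.csv

    CREATE TABLE test_sqlloader_ext (
      col_id NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
      )
      LOCATION ('test_sql_loader.csv')
    )
    REJECT LIMIT 0;  -- any bad record makes the query fail instead of being silently discarded

    -- One statement, one transaction: if it fails for any reason, nothing is committed.
    INSERT INTO test_sqlloader (col_id)
    SELECT col_id
    FROM   test_sqlloader_ext;
    COMMIT;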

  • A problem with importing layered PSD file into Flash

    Hi.
    There's a problem with importing layered PSD file into Flash.
    If I import a layered PSD file, in some places the color of the lower layer shows through at the edges of objects or shadows. However, if I crop each layer first and import the layers individually, there is no problem.
    If the upper layer has brush or transparency effects, it becomes worse.
    Any help with this problem?
    Thanks.

    How was the original art created? Was the original RGB or CMYK? What is the resolution of the Photoshop file? Flash only works well with RGB at 72 pixels per inch. If your original art is not set up this way, Flash will attempt to convert it as it imports it. Flash uses the sRGB color space. You'll get the best color translation if your Photoshop file uses this color preference.

  • Problem with Preview and PSD files - random gray square

    Hi guys, hope you can help...
    I've got a problem with Preview and PSD files.
    If I open in Preview both an original JPG straight from my camera and the Photoshop version of the same picture, the PSD file shows a gray square (of what seems to be an unrendered area) in a random part of the photo (sometimes in the center). The square is quite big...
    If I zoom in or out it disappears... if I scroll to another photo and then back to the PSD, the square is there again... sometimes in a different position.
    I've tried the same PSD on my older iMac running Leopard... and had no problem at all.
    I suspect it has something to do with my ATI card...
    (this is the second iMac 27...the first went back for gray banding on the lcd screen and flickering and yellow tinge........)
    Thanks for your help.
    DAve.

    maybe I'm onto something...
    I've just found out that opening Preview in 32-bit mode (instead of the default 64-bit) works flawlessly with any PSD file. If I switch back to 64-bit mode, Preview is much faster but the gray square comes back...
    It seems like the i7 is much faster than the ATI...
    Any more realistic ideas?

  • I have problems with seeing my bookmarks, file, view, edit...buttons. I tried other shortcuts. I noticed that all of my bookmarks are located in the Internet Explorer browsers, how can I restore setting back to Mozilla Firefox?

    I have problems with seeing my bookmarks and the File, View, Edit... buttons. I tried other shortcuts. I noticed that all of my bookmarks are located in the Internet Explorer browser; how can I restore my settings back to Mozilla Firefox?

    Is the menu bar missing (the one containing File, View, Edit etc)? If it is, the following link shows how to restore it - https://support.mozilla.com/kb/Menu+bar+is+missing

  • I have a problem with running an EXE file on win2000, the Lab View is 5.1 and I do not know if it is 16 bit...

    I have a problem with running an EXE file on Windows 2000; the LabVIEW version is 5.1 and I do not know if it is 16-bit... What should I do?

    Hi Arika,
    The drivers that you need to install to make your executable work depend on what your executable is doing. To get started, you need to have the LabVIEW Run-Time Engine installed on your target machine (the Win2000 machine you are planning to use) in order to run your executable. Next, you need to determine what drivers your executable uses, if any. For example, if you are using GPIB instrument control, you will need to install the NI-488 drivers on your target machine. If you are performing data acquisition, you will need to install NI-DAQ drivers. If you are doing image acquisition, you will need to install NI-IMAQ drivers.
    All these drivers are available for downloading on ni.com. To get the drivers, go to http://www.ni.com/support , click on the link that takes you to Drivers and Updates (under Option 3), and click on the links to get to the driver(s) you need. For example, if you need the LabVIEW 5.1.1 Run-Time Engine, click on the All Drivers and Updates by Application link on the main page (http://www.ni.com/softlib.nsf/). Then click on the LabVIEW link, Windows 2000, Run Time Engine, and then you will see the link to get to the page to download the LabVIEW 5.1.1 Run-Time Engine.
    I hope this information helps.
    Best Regards,
    Wilbur Shen
    National Instruments

  • Problem with saving a pdf file to computer. Continually get an error message " This document could not be saved. There was a problem reading this document (21).

    Need advice on a file-saving issue. I'm having a problem saving a .pdf file to my computer. I continually get the error message "This document could not be saved. There was a problem reading this document (21)." This is new, as this error message only recently started to pop up.

    More information about this issue can be found here:
    https://forums.adobe.com/thread/1672655
    A "quick" fix that worked for me was to uninstall Adobe Reader... then download the base install for Adobe Reader 11.0.
    Then download each of the individual updates and run them sequentially.
    I've installed updates back up to the latest security update, which is version .08, and have been able to do normal Save As operations.
    You will have to disable automatic updates in order to stay at version .08 until Adobe resolves this issue in a later release.
    http://www.adobe.com/support/downloads/product.jsp?product=10&platform=Windows
    Adobe Reader 11.0 - Multilingual (MUI) installer    AdbeRdr11000_mui_Std
    Adobe Reader 11.0.01 update - Multilingual (MUI) installer    AdbeRdrUpd11001_MUI.msp
    Adobe Reader 11.0.02 update - All languages    AdbeRdrSecUpd11002.msp
    Adobe Reader 11.0.03 update - Multilingual (MUI) installer    AdbeRdrUpd11003_MUI.msp
    Adobe Reader 11.0.04 update - Multilingual (MUI) installer    AdbeRdrUpd11004_MUI.msp
    Adobe Reader 11.0.05 security update - All languages    AdbeRdrSecUpd11005.msp
    Adobe Reader 11.0.06 update - Multilingual (MUI) installer    AdbeRdrUpd11006_MUI.msp
    Adobe Reader 11.0.07 update - Multilingual (MUI) installer    AdbeRdrUpd11007_MUI.msp
    Adobe Reader 11.0.08 security update - All languages    AdbeRdrSecUpd11008.msp

  • Problem with Append mode in File Receiver

    Hello,
    I am facing a problem with Append mode in the File Receiver.
    In the channel configuration, I have set:
    Construction Mode : Append
    File Type : Text
    Message Protocol : File Content Conversion
    The size of the file which I am trying to send is about 9.5 MB.
    I got this error,
    "Recovering from loss of connection to database; message
    loaded into queue by recover job: System Job (Failover Recovery)".
    So, it would seem that there was a loss of connection to the database
    while the file was being written.
    Note -  XI successfully recovered from the connection loss and   
    successfully wrote the file, however since the communication channel  
    was set to append, it appended to the partial file that was written   
    before the database connection loss. This is not correct. The file    
    should have been overwritten after the recovery even though the communication
    channel was configured to append.                                     
    Can anyone help me in this regard?
    Thanks,
    Soorya.

    Hi Venkat,
    I would suggest you split the file into chunks if you face any problem processing it in one go in append mode. Also, note the memory requirements, which are a must for processing huge files:
    Q: Which memory requirements does the File Adapter have? Is there a restriction on the maximum file size it can process?
    A: The maximum file size that can be processed by the File Adapter depends on a number of factors:
    o The most important one is the size of the Java heap, which is shared among all messages processed at a certain point in time. In order to be able to process larger messages without an out of memory error (OOM), it is recommended to increase the size of the available Java heap and/or to reduce the concurrency in the system so that fewer messages are processed in parallel.
    o Another factor negatively influencing the maximum message size in releases up to and including XI 3.0 SP 13 is an enabled character set (encoding) conversion if the message type is set to "Text".
    o Using the transport protocol "File Transfer Protocol (FTP)" also uses more memory for processing than the transport protocol "File System (NFS)" (up to and including XI 3.0 SP 13).
    o If the Message Protocol "File Content Conversion" is used in a File Sender channel, consider that not only the size of the input file affects the File Adapter's memory usage, but even more the size of the XML resulting from the conversion, which is usually a few factors larger than the original plain text file.
    To reduce the memory consumption in this scenario, consider configuring the setting "Maximum Recordsets per Message" for the sender channel. This will cause the input file to be split into multiple smaller messages.
    Please refer to the following links:
    You may plan the availability of your communication channel using the "Planning Availability Times" feature:
    http://help.sap.com/saphelp_nw04/helpdata/en/45/06bd029da31122e10000000a11466f/frameset.htm
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    Also check the links below for reference:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/10748ef7-b2f0-2910-7cb8-c81e7f284af5
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/7086f109-aaa7-2a10-0cb5-f69bd2affd2b
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/2498bf90-0201-0010-4884-83568752a857
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/cc1ec146-0a01-0010-90a9-b1df1d2f346f
    Regards,
    Vinod.

  • Problem with RTPExport output video files

    Hi, I have a problem with RTPExport output video files. One side streams H.263/RTP (AVTransmit2.java) and the other side writes this stream to a file with RTPExport.java. When network conditions are ideal, the output video file has the same fps and the same number of frames as the original file. The problem occurs when there is packet loss in the network: then the output file has a different fps and also fewer frames than the original video (because the missing frames are not written to the file, which is why it gets shorter). How can I achieve an output file that has the same fps as the original one? How can I write to file an identical copy of what I see while receiving the video with AVReceive2.java? Is there a way to modify RTPExport or AVReceive2 to do this? Thanks a lot!

    Trubka wrote:
    When network conditions are ideal, the output video file has the same fps and the same number of frames as the original file. The problem occurs when there is packet loss in the network: then the output file has a different fps and fewer frames than the original video. Okay, first off, the second file is smaller on purpose. RTP intentionally drops packets that are old/out of order in order to make sure real-time video stays as close to real time as it can. This is by design, so there's really nothing that can be done about it.
    How can I write to file an identical copy of what I see while receiving the video with AVReceive2.java? Technically speaking, what you're getting from RTPExport is exactly what you got on the receiving end. Any frames that are dropped during transmission will not be seen by the receiver, nor saved by the receiver.
    How can I achieve an output file that has the same fps as the original one? I'm not 100% sure that you can, but you can give the following idea a try. I make no guarantees that it'll work, but it should work for you...
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/RTPConnector.html]
    That example is an example of a "custom transport layer" for RTP connections. Essentially, it's some code that's handed the RTP packets on the transmission end, and it's expected to deliver those RTP packets on the other end. And it doesn't care about how they get from A to B, only that they do.
    If you were to replace the UDP socket in that example with a TCP socket, you would be guaranteed not to drop packets due to network reasons. Every RTP packet you were handed by the transmitter, you would then hand to the receiver. There is no guarantee that none of the packets would be cast away as "old" by the RTP protocol itself, but there's also no guarantee any of them would be. It's a crap-shoot at best, but it's certainly worth a try.

  • Authorization problem with VF01 "Create Billing Documents"

    Dear All,
    We face the following problem with VF01 "Create Billing Documents".
    Transaction: VF01
    User: Joe
    Authorization of Joe:
         VKORG Sales Organization: A, B (authorization object V_VBRK_VKO)
         FKART Billing Type: 1, 2 (authorization object V_VBRK_FKA)
    The objective for Joe:
       Joe is qualified to create bills in
          sales org A only for billing type 1 and in
          sales org B only for billing type 2.
       Joe should not be able to create bills for
          sales org A with billing type 2 and in
          sales org B with billing type 1.
    How can we solve this problem?
    We already searched for userexits without any success.
    Any help or ideas are very appreciated.
    Regards
    Markus Wilhelm
    Project Manager ERP

    Dear,
    There are standard authority-checks based on Sales organization (authorisation object V_VBRK_VKO) and/or Billing type (V_VBRK_FKA).
    An option would be to create a new billing type, define specific authorisations and use the new billing type for these documents.
    The bad thing is that you would also need a special (new) sales order type, because the billing document type is unique per sales order.
    So maybe it is easier to have a different sales organization.
    Another option would be to create a new authorization object and check it in the billing documents.
    If you wish, you could modify program LV60A005 & LV60A006.
    Then you might check various user/customer exits.
    Some functions to check: EXIT_SAPLV60A_001/002,
    EXIT_SAPLV60B_001 - 008, EXIT_SAPLV60B_010 - 011.
    Regards,
    R.Brahmankar

  • Problem with implicitly created Oracle pipes

    Hi, I am having a problem with implicitly created Oracle pipes. I am not sure if this is the correct forum for this topic, but I could not see one that suited it better.
    I am using Oracle pipes as a communication mechanism between processes (some are written in PL/SQL and others in Pro*C).
    The general problem I have is that if a timeout occurs during the communication process, I end up with 1 or perhaps 2 implicitly created pipes.
    The biggest problem for me is that I am unable to determine whether a pipe was implicitly created (via dbms_pipe.send_message or receive_message). This causes problems, since these implicitly created pipes are left behind and not deleted. I'll show you the basic flow of the processes and you will see my problem.
    server: create request pipe "req_pipe"
    server: listen for request on requests pipe.
    client: create two pipes for comms with server.
    client: send request and the names of the newly created pipes to the server.
    server: read request and pipename from request pipe (from this point all comms between the client and server are now done over the 2 pipes the client created).
    server: send ack message to client to ensure they are still there (since requests can be queued in the request pipe and clients could have timed out before the request is received)
    client: send "i'm still here" ack back to the server.
    server: process request and send result back to client.
    client: send ack back to server to let server know we have received the response.
    server: send ack back to client to show that work is now committed.
    OK thats the general event flow. I use the rule, that pipes created by a process are removed by a process. So the client always removes the pipes it created in all situations. But since this can happen at any point we can get in the situation:
    client: timeout occurs, delete pipes.
    server: send message to client (creates an implicit pipe) and therefore works (no errors raised)
    server: do a read for the response to the previous message (implicitly creates a pipe) (will fail).
    Now we have two implicitly created pipes! These pipes will exist until the database instance is shut down, and in a poorly performing environment, we could get thousands of these pipes lying around... not a good situation.
    How can I either stop pipes being implicitly created, or how do I detect whether a pipe was implicitly created?
    I have tried a couple of things, like using v$db_pipes to check if a pipe exists before doing send/receive calls, but it is FAR too slow to do a select on that view.
    We have also looked at keeping a record of pipes created by the client and then having some process (perhaps the server) clean up these pipes after some time frame. This is a workable solution but is not favourable, since it adds extra overhead having to check these pipes often, and creates an extra level of complexity which is not required.
    Any suggestions will be greatly appreciated.
    Feel free to email me with any suggestions at my email address provided below, or just post a reply to this forum.
    Many thanks,
    Karl Bridger
    [email protected]

    I solved the problem by changing the SOAP message format from Document/Wrapped to Document/Literal.
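    Along the lines of the cleanup approach mentioned in the question, here is a small PL/SQL sketch, not the poster's actual code: rather than trying to detect whether SEND_MESSAGE/RECEIVE_MESSAGE created a pipe implicitly, the side that sees the timeout simply removes the private pipes it was using by name. DBMS_PIPE.REMOVE_PIPE returns 0 whether or not the pipe still exists, so it is safe to call unconditionally. The pipe names, timeouts and message content below are hypothetical placeholders.

    DECLARE
      -- Hypothetical names of the two pipes the client handed to the server.
      l_reply_pipe VARCHAR2(128) := 'client_reply_pipe';
      l_ack_pipe   VARCHAR2(128) := 'client_ack_pipe';
      l_status     INTEGER;
    BEGIN
      -- Send the result back to the client (may implicitly create the pipe).
      DBMS_PIPE.PACK_MESSAGE('RESULT');
      l_status := DBMS_PIPE.SEND_MESSAGE(l_reply_pipe, timeout => 10);

      IF l_status = 0 THEN
        -- Wait for the client's final ack (may also implicitly create a pipe).
        l_status := DBMS_PIPE.RECEIVE_MESSAGE(l_ack_pipe, timeout => 10);
      END IF;

      IF l_status <> 0 THEN
        -- Timeout or error: the client is gone, so drop both pipes explicitly,
        -- including anything the calls above created implicitly.
        l_status := DBMS_PIPE.REMOVE_PIPE(l_reply_pipe);
        l_status := DBMS_PIPE.REMOVE_PIPE(l_ack_pipe);
      END IF;
    END;
    /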

  • Problem with Snow Leopard and file sharing

    I mostly work at home, where I have an 18-month-old Intel iMac 24'' running Leopard and an older iMac G5 20'' (PowerPC) running Tiger. Besides these, I have a three-month-old MacBook Pro 17" on which I recently upgraded the system to Snow Leopard.
    Before this upgrade I never had a problem with sharing files between the three computers (I have them connected via Ethernet and a 5-port D-Link router which provides flawless internet connection to the three computers). However, after the upgrade it has become impossible to connect the MacBook to the iMac G5 or vice versa via AFP. Besides, getting the MacBook to connect to the (Leopard) iMac 24'' is frustratingly slow in either direction.
    A related problem is that after the upgrade the portable is no longer capable of directly seeing my Ethernet-connected laser printer (plugged into one of the ports of the D-Link router), although... surprise! it does see it as a shared printer on the iMac G5, and that way it prints all right.
    Any ideas or suggestions to explore the cause of or solve this problem are warmly welcome.
    Emilio Faro

    Same problem here, did you find a solution?

  • Problems with mp4 (h.264) files, colour desaturated.

    Hey guys. I've been having problems with MP4 H.264 files (not QuickTime H.264). When played back in Windows Media Player and Media Player Classic they look terrible: the saturation is weird (especially on colours like yellow and red) and there seems to be a weird blur on the image. For a long time I thought there was a problem with the export from Premiere Pro; today, however, when played back in QuickTime Player or VLC it matches the original. I exported the file through Premiere Pro. I don't know what the problem is; maybe it's just the playback in Windows Media Player and Media Player Classic.
    Playback on quicktime
    Playback on media player
    Can someone tell me what's going on?

    When played back in Windows Media Player and Media Player Classic it looks terrible: the saturation is weird (especially on colours like yellow and red) and there seems to be a weird blur on the image... when played back in QuickTime Player or VLC it matches the original...
    On the contrary, it is very probable that playback in QuickTime shows you the wrong colours because of its crooked colour management.
    How do you judge colour matching between your source file and what you see in media players?
    What is your spec and GPU in particular? Do you use any sort of colour management in its settings?

  • HT204382 Hi, Ive just got myself a Macbook Pro, and Im having problems with sound on movie files. The format is wmv and they play OK on a windows 7 laptop, and Im trying to play them now on the MAC. The picture is OK , but no sound. I have Flip4Mac - no d

    Hi, I've just got myself a MacBook Pro, and I'm having problems with sound on movie files. The format is WMV and they play OK on a Windows 7 laptop; I'm trying to play them now on the Mac. The picture is OK, but there is no sound. I have Flip4Mac - no difference.

    Download and install the free VLC for Mac - that should work:
    http://www.videolan.org/vlc/download-macosx.html

  • I've switched from Google to Firefox because I was so tired of Google crashed all the time. Now I have the same problem with Firefox, Bank id, file down

    Hej!
    I've switched from Google to Firefox because I was so tired of Google crashing all the time. Now I have the same problem with Firefox: Bank ID, file downloading, accessing web pages, none of it works. What is the error?
    Ronnie.

    When Firefox crashes, it usually records information about what was happening at that moment and displays the Mozilla Crash Reporter. Are you seeing that, or is Firefox just freezing and you're killing it?
    Assuming you have crashes, you can submit that data to Mozilla and share it with forum volunteers to see whether it points to the solution. Please check the support article "[[Firefox Crashes]]" (especially the last section) for steps to get those crash report IDs, and then post some of the recent ones here.
    If that's not possible, have you tried Firefox's Safe Mode? That's a standard diagnostic tool to deactivate extensions and some advanced features of Firefox. More info: [[Troubleshoot Firefox issues using Safe Mode]].
    You can restart Firefox in Safe Mode using either:
    * "3-bar" menu button > "?" button > Restart with Add-ons Disabled
    * Help menu > Restart with Add-ons Disabled
    Not all add-ons are disabled: Flash and other plugins still run
    After Firefox shuts down, a small dialog should appear. Click "Start in Safe Mode" (''not'' Reset).
    Any difference?

Maybe you are looking for

  • Web Template - Report Error

    Hello all, how are you? We are getting this error while trying to select or modify the options in a report created using WAD. An exception with the type CX_SY_DYN_CALL_ILLEGAL_TYPE occurred, but was neither handled locally, nor declared in

  • How to create database from .sql file

    How do I create a database from a .sql file? I put the query syntax in a .sql file and I want to call it from Java code. How do I do that?

  • Sound drops in the middle of a project

    Hey everybody, I exported a Premiere Pro (CS4) project to Encore, and when I'm playing it in Encore there is a point where the sound stops for no apparent reason. Any idea why? Thanks, Noam.

  • Creative cloud error

    I can't download Creative Cloud. I keep getting an error that says CC failed to initialize. Does anyone know how to fix this?

  • Export/Import Users.

    My requirement is to create users with the same privileges in database B, referencing database A. So whatever users and respective privileges exist, I need to have them exactly in database B. What parameters should I specify in my exp/imp command just t