SQL Loader - Conditional Data Loading

Hello Friends,
I am trying to use SQL*Loader to load the following records into the Emp table (columns: empno, ename, job, mgr, hiredate, sal, comm, deptno).
7876,ADAMS,CLERK,7788,"23-MAY-87",1100,,20
7900,JAMES,CLERK,7698,"03-DEC-81",950,,30
7902,FORD,ANALYST,7566,"03-DEC-81",3000,,20
7934,MILLER,CLERK,7782,"23-JAN-82",1300,,10
I want to add a condition so that a record is loaded only when its 5th field (i.e., the hiredate field) is greater than "01-JAN-82". With this condition, the only records to be loaded are the 1st and the 4th; the other two records have hiredates earlier than "01-JAN-82".
load data
infile 'c:\mydata.csv'
into table emp1
when .....
fields terminated by "," optionally enclosed by '"'          
( empno, ename, job, mgr, hiredate, sal, comm, deptno )
I am unable to get the when condition right in the control file above and would be grateful if you could give me some pointers on how to achieve this.
Thanks and Regards.
Edited by: user645883 on Dec 29, 2008 12:45 AM
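For reference, a minimal sketch of where the WHEN clause sits in a control file like the one above, keeping the table emp1 and column list from the question. Note that SQL*Loader's WHEN clause supports only '=' and '<>'/'!=' comparisons, so the equality test on deptno below is there purely to show placement; a date-range test such as hiredate > '01-JAN-82' cannot be written directly in the WHEN clause. The DATE "DD-MON-RR" mask on hiredate is an assumption based on the sample rows.

load data
infile 'c:\mydata.csv'
into table emp1
when (deptno = '20')
fields terminated by "," optionally enclosed by '"'
( empno, ename, job, mgr, hiredate DATE "DD-MON-RR", sal, comm, deptno )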

Hello Friends,
Thanks for your replies.
The examples only show WHEN clauses that use "=" or "<>". I am trying to see whether ">=" or "<=" can be used in the WHEN clause to load data conditionally, and whether date fields can be compared in the WHEN clause. I have been unable to do so; it throws syntax errors every time.
I am able to test the WHEN clause with string literals and with "=" and "<>" conditions, though.
Please help.
Thanks & Regards.

Similar Messages

  • How to rectify the errors in master data loads & transactional data loads?

    Hi,
    Can anyone please tell me
    how to rectify the errors in master data loads & transactional data loads?
    Thank you,
    Ravi

    Hi,
    Please post specific questions in the forum.
    Please explain the error you are getting.
    -Vikram

  • Init Load, Historical Data Load & Delta Load with 'No Marker Update'

    What is the difference between compression of an Inventory cube during an init load, a historical data load, and a delta load with 'No Marker Update'?
    Please help with this query.
    Thanks
    Gaurav

    Hi Gaurav,
    I believe you will find the answers here...
    [http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/e0b8dfe6-fe1c-2a10-e8bd-c7acc921f366&overridelayout=true]
    regards
    Pavel

  • Error in SQL Statement (ODS data loading Error)

    Hello Gurus...
    I am trying to load data from an InfoSource to an ODS. For the key figure Z_IVQUA (Invoice Quantity) in my ODS,
    I am getting the error message "Data shows 0 from 0 records".
    I removed the key field from the ODS and then tried to load the data, but I am getting the same error:
    Error in SQL Statement: SAPSQL_INVALID_FIELDNAME F~/BIC/Z_INVQUA
    Errors in source system
    Please Suggest....
    Thanks
    Prakash

    Hello All...
    I would really appreciate it if you could give me some advice. Thanks.
    Prakash
    Edited by: Prakash M on Jan 14, 2009 2:14 PM

  • Problem while loading Master data load

    Hi,
    I am loading master data for 0PROJECT. For attributes and text we do have delta loads.
    For text, after the load the number of added records is 0, even though there are a few records and they show as transferred. I checked in RSA3 and it has records.
    Can anybody tell me the reason for this?
    Thanks,
    Vijaya

    For the full load, the number of added records is 0 while 26 are transferred.
    I checked the master data table and all these records are already present... maybe that is the reason why, whether I do a full load or a delta, added is always 0.
    Am I right about this? Please let me know.
    Thanks,
    Vijaya

  • Outline load utility Data load error in Planning

    Hi,
    We are loading data using the OutlineLoad utility in Hyperion Planning 11.1.2.1, but the data is not loading and the log file shows the error below.
    Unable to obtain dimension information and/or perform a data load: com.hyperion.planning.olap.NoAvailableOlapConnectionsException: Unable to obtain a connection to Hyperion Essbase. If this problem persists, please contact your administrator.
    Can you please tell us why we are getting this error?
    Thanks.

    Hi John,
    I tried refreshing the DB and running the utility again; it still shows the same error,
    and the log file shows the following.
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2560/Info(1051164)
    Received login request from [::ffff:IP]
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2560/Info(1051187)
    Logging in user [admin@Native Directory] from [::ffff:IP]
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2788/Info(1051001)
    Received client request: List Applications (from user [admin@Native Directory])
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///11872/Info(1051001)
    Received client request: List Databases (from user [admin@Native Directory])
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///11232/Info(1051164)
    Received login request from [::ffff:IP]
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///11232/Info(1051187)
    Logging in user [admin@Native Directory] from [::ffff:IP]
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2824/Info(1051001)
    Received client request: Get Application Info (from user [admin@Native Directory])
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2560/Info(1051001)
    Received client request: Get Application State (from user [admin@Native Directory])
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2788/Info(1051001)
    Received client request: Select Application/Database (from user [admin@Native Directory])
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2788/Info(1051009)
    Setting application BUD_APP active for user [admin@Native Directory]
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///11872/Info(1051001)
    Received client request: Logout (from user [admin@Native Directory])
    Thanks

  • UDA loading via Data load rule

    Hi all,
    I would like to load a file with two columns:
    - the 1st column contains current members of a dimension
    - the 2nd column contains the UDA I would like to associate with the member.
    Example:
    ENP8XX,CNY
    ENR8XX,CNY
    ENB3XX,THB
    ENM4XX,JPY
    I know that I should do it through the Data Prep Editor, but I can't get it to work. I open the data file, associate the 1st column with the dimension, and declare the 2nd column as UDA. When I try to validate the rule, I always get the same error message: "Field 2 -- This field is defined as a USERATTR. There is no associated member field preceding this field in the same dimension."
    Can anyone detail the steps to do what I want?
    Thanks
    JM

    Hi,
    I tried this:
    1. Opened Data Prep Editor
    2. File | Open SQL and Enter
    3. Enter SQL with 2 columns and OK/Retrieve button
    4. userid/password for RDBMS
    5. Click on first column and Field | Properties
    6. Dimension Building Properties tab
    7. Field Type: Generation, Dimension: <select dimension>, Number: 2
    8. Next button to move to second column
    9. Field Type: UDA, Dimension: <select dimension>, Number: 2
    10. View | Dimension build field
    11. Options | Validate
    12. OK message: "The rules file is correct for dimension building".
    Hope this helps,
    Grofaty
    My system: Windows XP SP1, Essbase 6.1.4

  • EIS data load slow

    Greetings all,
    Windows Server 2003 SP2. SQL Server 2000 SP3. Hyperion 9.3.1.
    I am running an EIS data load, and the load time is progressively longer depending on the number of records loaded.
    For example, when doing a data load of 1,000,000 records, it takes 2 minutes to load the first 500,000 records but an additional 10 minutes to load the next 500,000 records.
    Is there any reason for this, and can I do anything about it?
    Regards,
    Paul.

    Since non-streaming mode is the only mode EIS uses to load data, it's not surprising that data loads are slow.
    If you want to continue using EIS for data loads, I believe all you can do is refine/tune your user-defined SQL.
    You've mentioned just a few million records. However, in volume-intensive environments where records are on the order of hundreds or thousands of millions, most people would use EIS only for the metadata load and defer the data load to flat-file extracts, which is much faster. At the same time, you'd have to consider whether the extraction time required is so long that it defeats the purpose.
    - Natesh

  • Transactional data loads PIR, IM stock, Open PO's documentation

    I have to document the process of transactional data loads:
    purchase info records, IM stock, and open POs.
    The transactional data is live in both SAP and the legacy system, so the two systems have to stay in sync at all stages of this process.
    How do I maintain that is the question.
    Please send me any details regarding this.
    thank you
    sridhar

    Check these three things:
    /n/sapapo/CCR
    /n/sapapo/CQ
    Check what type of stock has active IM, and what type of stock went in after you created the GR.
    If you still have any problems, let us know.
    My

  • Oracle Data Loader

    Hi guys!
    I'm planning to import a file with about 400k records using Data Loader (insert function).
    I currently do this operation with web services and it took about 7 hours; with web services I import about 20k records at a time.
    Does anyone know whether the load time will improve if I use Data Loader?
    Another question: do you know how Data Loader imports a file (does it split the records, how many records at a time, parallel import, etc.)?
    Thanks in advance,
    Rafael Feldberg

    Rafael, I would recommend clicking on the Training and Support link in the upper right of your CRM On Demand application, then click on Browse Training, then click on Training Resources by Job Role, then click on Administrator and look for the following:
    Data Loader FAQ
    Data Loader Overview for R17
    Data Loader User Guide
    If you are successful using web services I would stick with that method.

  • Rate data loader

    Hey, Gurus!
    Does anyone know how many records Oracle Data Loader On Demand sends per package?
    I didn't find anything in the documentation (Data Loader FAQ, Data Loader Overview for R17, Data Loader User Guide).
    thanks in advance
    Rafael Feldberg

    Rafael, there is no upper limit for the number of records that the Data Loader can import. However, after doing a test import using the Import Wizard I would recommend keeping the number of records at a reasonable level.

  • Data load Error 1030010 - Invalid blank character in name

    <BR><BR>I try data load in Essbase Administration Services 7.1.3 and got the following error:<BR>1030010 - Invalid blank character in name.<BR><BR>I notice that this error occurred during open file before EAS even has a chance to read the first line as there was no error.txt generated.<BR><BR>The filename is ActSales1.txt<BR><BR>With the same file, we were able to load through data load in Essbase Application Manager....<BR><BR>Any idea why?

    I think you will find that the answer is a very simple one.
    There should be no spaces in your directory or file names.
    If you have a directory called "Essbase files", that alone is enough to generate the message; the same goes for the file name.
    I just substitute an underscore for the space so that it is a continuous name.
    This problem has been around for years.

  • Automate EIS dim builds and data loads

    I want to automate the dimension builds and data loads from my ETL tool (DTS). I have not been able to find anything about scripting EIS automation in the documentation. Is there any?

    What you can do is go into the EIS metadata outline and create a member load script and a data load script. Do this by selecting the Outline menu item, then selecting Member Load. Click Next; on this screen, select only "Save load script". Click the "Save scripts" button to give it a name, then click Finish. Repeat for the data load script. (If you are using ASO cubes, you must use separate scripts; you can't do both in one script.) Then create a batch file to run the member load and data loads. In DTS, use an Execute Process task to run the batch file.

  • Data load status stay in yellow

    Hi,
    My BW platform is BW 701 with SP 5. I am loading ECCS data hourly using the 3.5 method, which includes a transfer rule and an update rule. Most of the time the data loads complete successfully. The total is about 180,000 records, extracted in 4 packets. Once in a while, one of the data packets randomly stays yellow and cannot complete the load. But in the next hour's data load, the data loads successfully. We know it is not a data issue. Does anyone know why the data load is not consistent? Any suggestions are much appreciated.
    Thanks for your suggestions,
    Frank

    Hi Frank,
    This might be because some of the tRFCs or IDocs got hung.
    Check whether the source system job finished or not.
    If the source system job completed, check the tRFCs and IDocs to see if any are hung.
    In the monitor screen --> menu bar Environment --> select Job Overview --> Source System (enter ID and password) and check the job status.
    Once that is done, to check the tRFCs:
    InfoPackage --> Monitor --> Environment --> Transact. RFCs --> In Source System --> it will display all the tRFCs; check for your hung tRFCs and try to manually flush them (F6).
    If IDocs are hung, process the IDocs manually.
    Regards
    KP

  • LOAD CSV DATA TO NEW TABLE

    I've got a basic CSV file (12 columns, 247 rows) that I've been trying to import or paste into a new table, and with each try not all records get uploaded into the new table. Using the "Load Data" tool, I've tried Load Text Data and Load Spreadsheet Data, both by importing the .csv file and by copying/pasting. I've put the records in order of the table's PK ID, and I've added dummy entries to any fields that were null (i.e., I added the word 'None' to fields that were empty). But nothing is working. I get about half of the existing records.
    Why?

    You're probably right on with that one. I've got a 'Comment' field (VARCHAR(4000)) and a 'link' field (VARCHAR(255)) that holds the link to the doc files I use in an interactive report. If this is my problem, how do I incorporate this data? My users need to enter comments (maybe not 4000 characters, but I need to keep that as large as possible), and the link to the file itself is essential. Basically it's a simple interactive report listing all of our active documents, where the user needs to find and then open the document on our server.
