Loading data in the order in which it appears in the file

Hi All,
I am trying to load data from a flat file into a database table using SQL*Loader.
The loader loads all the data successfully, without any errors.
The problem is that the data is not inserted in the order in which it appears in the file, i.e.
line 1 of the datafile is inserted in row 10 of the table,
line 2 of the datafile is inserted in row 1 of the DB table.
I want line 1 of the datafile to be in row 1 of the DB table, line 2 in row 2, etc.
Does anybody have a solution?
Thanks in advance.
Harjit

Many thanks to everybody who has given their valuable time.
I have found the solution, and it lies in the keyword SEQUENCE(MAX,1).
Try doing this:
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE emp11
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
(empno, ename, job, mgr,
s_no SEQUENCE(MAX,1))
BEGINDATA
7782, "Clark", "Manager", 7839
7839, "King", "President"
7934, "Miller", "Clerk", 7782
7566, "Jones", "Manager", 7839
7499, "Allen", "Salesman", 7698
7654, "Martin", "Salesman", 7698
7658, "Chan", "Analyst", 7566
Thanks again

Similar Messages

  • Loading Date in Sales Order

    Hi
    Could anyone help and advise, in detail, on the "Loading Date" derivation in the sales order item detail (shipping view), including the customization settings?
    In addition:
    -- I have a part with sufficient stock.
    -- The loading date is, say, 11-02-2009, but delivery happens on 13-02-2009 (delayed).
    What could be the root causes?
    Please shed some light on this issue.
    Thanks so much for your support
    Regards
    RG

    Hi RG,
    The delivery date is the date on which the goods reach the customer.
    The loading date is the date by which picking/packing and transportation planning must be completed so that loading can start.
    On the goods issue date, loading is complete and the shipment leaves for the customer, reaching them on the delivery date.
    In your case the loading date is 11/02/2009 and the delivery date is 13/02/2009.
    That means loading time + transit time = 2 days, which is why the system gives a delivery date of 13/02/2009.

  • Mass change of LOADING DATE on Sales orders

    Hi
    Does anyone know how we can mass change the loading dates on sales orders with many line items, all with different delivery dates?
    So basically we want to override the loading dates that SAP suggests, which relate to the route.
    I am aware that you can use Edit > Fast change > Delivery date, but that would set all of the delivery dates to the same value.
    Your help would be much appreciated, as the operatives are currently doing this line by line, order by order.
    Many thanks
    Tony

    Dear Tony,
    Would you not be able to change these via LSMW (or through a BDC program)?
    Also check whether the transaction code MASS could be useful.
    Best Regards,
    Amit

  • Look up two ODS and load data to the cube

    Hi,
    I am trying to load data into the Billing Item cube. The cube contains some fields which are loaded from the Billing Item ODS (itself loaded directly from the 2LIS_13_VDITM datasource), and some fields which need to be looked up in the Service Order ODS and the Service Order Operations ODS. I have written a start routine in the cube's update rules: with a SELECT statement on each of the two ODS I fetch the required fields into internal tables, and in the update rules I fill the fields using READ statements.
    I am getting an error when the second SELECT statement (reading from the second ODS) is executed.
    The error message is
    You wanted to add an entry to table "\PROGRAM=GPAZ1GI2DIUZLBD1DKBSTKG94I3\DATA=V_ZCSOD0100[]", which you declared with a UNIQUE KEY. However, there was already an entry with the same key.
    In other words, the internal table V_ZCSOD0100 is declared with a UNIQUE KEY, and the second SELECT tries to insert an entry whose key already exists.
    Can anyone please help me with a solution for this requirement? I would appreciate it if anyone could send me the code they have written.
    Thanks in Advance.
    Bobby

    Hi,
    Can you post the SELECT statements that you have written in the start routine?
    regards,
    raju

  • Input ready query is not showing loaded data in the cube

    Dear Experts,
    With our input-ready query we have the problem that it does not show values that were not entered through that query itself. Is there any setting for the input-ready query that would make it display the data loaded into the cube as well as the data entered through the query itself?
    Thanks,
    Gopi R

    Hi,
    Input-ready queries should always display the most recent data (i.e. all green requests and the yellow request). So check the status of the requests in the real-time InfoCube: there should be only green requests and at most one yellow request.
    In addition you can try to delete the OLAP cache for the plan buffer query: Use RSRCACHE to do this. The technical names of the plan buffer query can be found as follows:
    1. InfoCube/!!1InfoCube, e.g. ZTSC0T003/!!1ZTSC0T003 if ZTSC0T003 is the technical name of the InfoCube
    2. MPRO/!!1MPRO, e.g. ZTSC0M002/!!1ZTSC0M002 if ZTSC0M002 is the technical name of the MultiProvider
    If the input-ready query is defined on an aggregation level using a real-time InfoCube, the first case is relevant; if the aggregation level is defined on a MultiProvider, the second case is relevant. If the input-ready query is defined on a MultiProvider containing aggregation levels, the first case is again relevant (find the real-time InfoCubes used in the aggregation levels).
    Regards,
    Gregor

  • Problem loading data from the PSA to the InfoCube

    Hello experts.
    I'm having a problem loading data from the PSA to the InfoCube.
    I'm using a DTP for this process, but the following error occurs:
    "Diagnosis
          An error occurred while executing the transformation rule:
          The exact error message is:
          Overflow converting from''
          The error was triggered at the point in the Following Program:
          GP4KMDU7EAUOSBIZVE233WNLPIG 718
      System Response
          Processing the record date has Been terminated.
    Procedure
          The Following is additional information included in the higher-level
         node of the monitor:
         Transformation ID
         Data record number of the source record
         Number and the name of the rule Which produced the error
    Procedure for System Administration
    I have already created new DTPs and deactivated and reactivated the InfoCube and the transformation, but nothing solves it.
    Does anyone have any idea what to do?
    Thank you.

    Hi,
    Is it a flat file load, or are you loading from another datasource?
    Try to execute the program GP4KMDU7EAUOSBIZVE233WNLPIG in SE38 (line 718 is where the error was triggered) and check that it is active and has no syntax errors.
    Check the mapping of the fields in the transformation: whether some data fields are mapped to a decimal field, a CHAR 32 field is mapped to a RAW 16 field, or CALWEEK/CALMONTH is mapped to CALDAY, etc.
    Check in ST22 whether there are any short dumps.
    Regards
    KP

  • Last data-load date in the WebI report

    Hey,
    I have made a WebI report on an InfoCube. Now the client is interested in seeing the date on which the data was last loaded into the InfoCube. Can someone help with this scenario?
    Thanks.

    Hi,
    Take the maximum of the record load date in the report and show it.
    Let me know if this did not work.
    Cheers,
    Ravichandra K

  • Steps for loading data into the infocube in BI7, with dso in between

    Dear All,
    I am loading data into an InfoCube in BI7, with a DSO in between. The data flow looks like this...
    Top to bottom:
    InfoCube (Customized)
    Transformation
    DSO (Customized)
    Transformation
    DataSource (Customized).
    The mapping and everything else looks fine, and data is also seen in the cube on the FULL load.
    But due to some minor error (I guess), I am unable to see the DELTA data in the DSO, although it is loaded into the DataSource through process chains.
    Kindly advise me where I went wrong.
    Or, step-by-step instructions for loading data into the InfoCube in BI7, with a DSO in between, would be really helpful.
    Regards,

    Hi,
    My first impulse would be to check whether the DSO is set to "direct update". In that case no delta is possible, because the change log is not maintained.
    My second thought would be to check the DTP moving the data between the DSO and the target cube. If it is set to full, you will not get a delta. Only one DTP can be created between them, so if you created it in FULL mode you cannot simply switch it to delta; delete it and create the DTP in delta mode instead.
    Hope this helps.
    Kind regards,
    Jürgen

  • Can we load and unload files at run time?

    Can we load and unload files at run time?
    For example, there are four files named "test1.h & test1.c" and another set "test2.h & test2.c" (attached to this post).
    test1.h contains the code:
    int variable; //variable declared as integer
    test1.c contains the code:
    variable = 1; //variable assigned a value
    test2.h contains the code:
    char *variable; //variable declared as string
    test2.c contains the code:
    variable = "EXAMPLE"; //variable assigned a string
    So, in this case, can I dynamically load/unload the first and second group of files so that the same variable name "variable" can be used both as an integer and as a string? And if yes, how is that to be done?
    Attachments:
    test.zip (1 KB)

    What do you mean by "dynamically"?
    If you want a variable that is either an int or a char within the same program run, I'm afraid your only option is to define it as a variant and, according to some condition, assign the proper data type to the variant from time to time. Then, every time you access the variable, you must first check which data type is stored in it, and access it in the proper way.
    If, on the other hand, your intention is to have one run in which the variable is an int, after which you stop the program, and a following run in which it is a char, you can achieve that with an appropriate preprocessor clause:
    #ifdef CHAR_TYPE
    #include "test2.h"        // variable declared as a char
    #else
    #include "test1.h"        // variable declared as an int
    #endif
    Then, every time you want to access the variable, you must proceed in the same way:
    #ifdef  CHAR_TYPE
      variable = "string";
    #else
      variable = 1;
    #endif
    Is it worth the effort?
    Additionally, keep in mind that this "dynamic" approach can work only in the IDE, where you can #define your CHAR_TYPE or not depending on your wishes: when you compile the program, it will have only one of the #includes, depending on the definition of the macro.

  • Aperture Video Import Problem - from Lumix GH4: imported clips have their dates changed to the import date

    Aperture video import problem, from a Lumix GH4: imported clips have their dates changed to the import date. The files show up on the hard drive with the import date rather than the created date, but many of these same files are not showing up in Aperture. Sometimes the clips actually show up with the current import but take on the video information from a previously imported file.

    It was suggested that I move this question to iPhoto or iMovie, which I did.
    Moving to a different discussion group did not provide an answer either. What I finally did was import one day's batch of photos and videos into iPhoto at a time. Working with these, I could change the dates and times to get them into the original sequence taken, and then create an album with that batch, all on the same day (iMovie was closed during this phase). Then I would open iMovie, generate the thumbnails for that album, and select the album I had created. This was necessary because the import process in iPhoto was using incorrect dates for my videos, so it was a real struggle finding them in iMovie until I developed this approach.
    I believe this whole process was so screwy because I was importing from an external hard drive, not a camera. I had these photos on a PC and did not have the original cameras to import from directly, which I am fairly sure would have made this easier!

  • Steady Stream of "Searching for movie data in the file..." Error Messages

    My iMovie has been crippled by error messages that pop up whenever I try to accomplish anything. I always see "Searching for movie data in the file 'healyintro.mov'" for a few minutes, then "The movie file 'healyintro.mov' cannot be found. Without this file, the movie cannot play properly." I cannot actually use iMovie because of these errors.
    I've tried everything from reinstalling iMovie, to removing application support files, to removing my iMovie Events and iMovie Projects folders, to creating a .mov file named healyintro.mov to see if that would shut iMovie up – nothing works.
    Once in a while, iMovie asks for a different movie file, with the same problem.
    Any ideas?

    I am having the same issue. I don't know what the previous poster means by allowing the system to continue, since I'm prompted with a "Cancel" / "Search" dialog after each missing clip. Slight digression: "Search" is not even the correct term here according to the UI guidelines ("Choose" or "Locate" might be better choices, given the file picker dialog that results).
    I'm actually using Aperture to relocate my video masters onto removable media, which is a very nice feature of Aperture but completely breaks iMovie unless the media is connected. Seems like a pretty major oversight... can we just have it fail more gracefully here and allow us to work with new material without getting hung up on missing movie clips from the past?

  • Cannot Interpret the data in the file

    Hi,
    I need to upload rate routings, and I have created a BDC program using a sample recording.
    While uploading the data from the flat file, I am getting the error message "Cannot interpret the data in the file".
    Please help me find where I might have gone wrong. I have checked the template in the flat file and it is correct.
    Please do the needful.
    Thanks,
    Ranjan R Jinka

    Hi Ranjan,
    Please check this:
    The "Cannot interpret the data in file" error occurs while uploading the data into the DB.
    - Check that the headings of the Excel columns match the fields used in the program.
    If possible, please paste the program so that viewers have a better idea and you will get the exact solution.
    Regards,
    Mani

  • Regarding reading the data from files without using streams

    Hi to all of you...
    I have a problem where I have to read data from files without using any streams.
    Please guide me on how to do this, if possible with an example.
    Thanks & Regards
    M.Ramakrishna

    Simply put, you can't.
    But why do you need to?

  • Error while writing the data into the file. Can you please help with this?

    I am getting the following error while writing the data into the file:
    <bindingFault xmlns="http://schemas.oracle.com/bpel/extension">
      <part name="code">
        <code>null</code>
      </part>
      <part name="summary">
        <summary>file:/C:/oracle/OraBPELPM_1/integration/orabpel/domains/default/tmp/.bpel_MainDispatchProcess_1.0.jar/IntermediateOutputFile.wsdl
        [ Write_ptt::Write(Root-Element) ] - WSIF JCA Execute of operation 'Write' failed due to: Error in opening file for writing.
        Cannot open file: C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\BPEL_Import_with_Dynamic_Transformation\WORKDIRS\SampleImportProcess1\input for writing. ;
        nested exception is: ORABPEL-11058 Error in opening file for writing.
        Cannot open file: C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\BPEL_Import_with_Dynamic_Transformation\WORKDIRS\SampleImportProcess1\input for writing.
        Please ensure 1. Specified output Dir has write permission 2. Output filename has not exceeded the max chararters allowed by the OS and 3. Local File System has enough space.</summary>
      </part>
      <part name="detail">
        <detail>null</detail>
      </part>
    </bindingFault>

    Hi there,
    Have you verified the suggestions in the error message?
    Cannot open file: C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\BPEL_Import_with_Dynamic_Transformation\WORKDIRS\SampleImportProcess1\input for writing.
    Please ensure:
    1. The specified output directory has write permission,
    2. The output filename has not exceeded the maximum number of characters allowed by the OS, and
    3. The local file system has enough space.
    I am also curious why you are writing to a directory with the name "..\SampleImportProcess1\input"?

  • Validating the file name against the data in the file using the sender file adapter

    Hi,
    Below is the scenario:
    1) Pick up files from an FTP server; the file name is dynamic. How do I put a dynamic name in the sender file adapter?
    2) Determine whether the user correctly named the file, based on the data in the file.
    a. The file naming structure we are concerned with is <company_code>_<accounting_time_period>.<extension>
    b. The company code and the time period in the file name have to match the data in the file.
    i. For example, if the file name is 1001_200712.csv but the data in the file is for company code 1005, time period 200712, the file is incorrectly named. Both values must be correct.
    How do we do this?

    Hi Sachin,
    As rightly said by Krishna, you cannot put a dynamic name in the sender file adapter. You have to provide a file name pattern like "*.txt" in the sender adapter, and at runtime you can access the actual file name using the following UDF:
    // Read the dynamic configuration of the message currently being processed
    DynamicConfiguration conf = (DynamicConfiguration) container
      .getTransformationParameters()
      .get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
    // The FileName attribute of the File adapter holds the name of the picked-up file
    DynamicConfigurationKey key = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File","FileName");
    String valueOld = conf.get(key);
    return (valueOld);
    Now you have the file name at runtime.
    Next, concatenate the source file fields Company_code and Accounting_timeperiod using "_" as the delimiter, and concatenate the extension as well; that gives you the expected file name.
    Then, using the EQUALS standard function, compare it with the file name fetched at runtime by the UDF above, and pass the result on as you need: to continue processing or not, or to raise an alert to resend the file.
    Thanks & Regards,
    Anurag Garg
    You can validate this file name in Mapping itself.

  • Save the data in the file as JSON data

    Hello,
    I need to read data from a list / picture gallery / documents library and save the returned data in a file as JSON.
    How can I implement that?
    ASk

    Try the article below:
    http://msdn.microsoft.com/en-us/library/office/jj164022%28v=office.15%29.aspx
    The code in the following example shows how to request a JSON representation of all of the lists in a site by using C#. It assumes that you have an OAuth access token stored in the accessToken variable.
    C#
    HttpWebRequest endpointRequest = (HttpWebRequest)HttpWebRequest.Create(sharepointUrl.ToString() + "/_api/web/lists");
    endpointRequest.Method = "GET";
    endpointRequest.Accept = "application/json;odata=verbose";
    endpointRequest.Headers.Add("Authorization", "Bearer " + accessToken);
    HttpWebResponse endpointResponse = (HttpWebResponse)endpointRequest.GetResponse();
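    The snippet above only issues the request; to actually save the returned JSON to a file, as the question asks, you can read the response body and write it out. A minimal continuation of the same example, with a placeholder output path:
    // Read the JSON response body and write it to a local file (the path is just an example)
    using (var reader = new System.IO.StreamReader(endpointResponse.GetResponseStream()))
    {
        System.IO.File.WriteAllText(@"C:\temp\lists.json", reader.ReadToEnd());
    }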
    Getting properties that aren't returned with the resource
    Many property values are returned when you retrieve a resource, but for some properties you have to send a GET request directly to the property endpoint. This is typical of properties that represent SharePoint entities.
    The following example shows how to get a property by appending the property name to the resource endpoint. The example gets the value of the Author property from a File resource.
      http://<site url>/_api/web/getfilebyserverrelativeurl('/<folder name>/<file name>')/author
    To get the results in JSON format, include an Accept header set to "application/json;odata=verbose".
