Data Insertion Logic

Suppose you have a database table with 20 columns and you want to insert data into it. Your recordset file has only 15 columns. The first row of the file contains the field names,
which tell you which 15 fields (out of the 20) this file supplies.
If the table had only the 15 fields that match the file, then the readLine, parse-tokens, insert sequence would be straightforward.
1. Please list the programming logic.
2. What kind of "insert into table values (...........)" statement would be appropriate? I can't use "update table set column1 = ....." since I have not inserted the row yet.

You can't do "insert into table values (...........)";
you need to give the names of the columns,
i.e. "INSERT INTO table (col1, col2, ...) VALUES (val1, val2, ...)".
Does (col1, col2, ...) need to be single quoted?
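As for the programming logic, here is a minimal sketch in Java (the question mentions readLine and parsing tokens, so JDBC with a PreparedStatement seems a natural fit). It assumes a comma-separated file whose first line is the header, and it treats every value as a string; the class and method names are purely illustrative:

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.Collections;

public class FileLoader {

    // Reads a delimited file whose first line lists the column names it supplies,
    // then inserts every following line into the given table.
    public static void loadFile(Connection con, String tableName, String fileName) throws Exception {
        try (BufferedReader in = new BufferedReader(new FileReader(fileName))) {
            // 1. Read the header line and split it into the column names present in the file.
            String[] columns = in.readLine().split(",");

            // 2. Build "INSERT INTO table (col1, col2, ...) VALUES (?, ?, ...)".
            //    Column names are identifiers, not string literals, so they are NOT single-quoted.
            String sql = "INSERT INTO " + tableName
                    + " (" + String.join(", ", columns) + ") VALUES ("
                    + String.join(", ", Collections.nCopies(columns.length, "?")) + ")";

            try (PreparedStatement ps = con.prepareStatement(sql)) {
                // 3. For each data line: parse the tokens and bind them to the placeholders.
                String line;
                while ((line = in.readLine()) != null) {
                    String[] tokens = line.split(",", -1);  // keep empty trailing fields
                    for (int i = 0; i < columns.length; i++) {
                        ps.setString(i + 1, tokens[i]);
                    }
                    // 4. Insert the row.
                    ps.executeUpdate();
                }
            }
        }
    }
}

The five table columns that the file does not name are simply left out of the INSERT, so they fall back to their defaults (or NULL), provided they are nullable or have defaults.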

Similar Messages

  • Insert data to Logical Database

    Hi all,
    Can I insert data through a logical database? If so, how?
    regards,
    Luke

    Hi,
    Check the link below:
    Sample program for fetching data using LDB
    http://help.sap.com/saphelp_nw70/helpdata/en/9f/db9be035c111d1829f0000e829fbfe/content.htm
    Regards,
    Maha

  • Help required for an insert logic

    Hi
    I need to create insert logic in the following way.
    Suppose I have 20000 rows to insert into table tab1, but due to a very small undo tablespace I need to insert a small number of rows between two commits,
    say 20000/10 = 2000 rows per commit.
    So:
    for i in 1 .. 10 loop
      insert into tab1 values act_data(i); -- here each insert should load 2000 rows
      commit;
    end loop;
    act_data is the collection containing the whole data set.
    I hope I have expressed my problem clearly.
    Thanks in advance.

    You seem to already have a working answer to your question, but your sample code contains one of the worst coding errors you can make:
    Never commit inside a loop.
    Committing every 2000 rows does far more harm than just making things slower ... read Tom Kyte's many comments about why this should not be done.
    An insert from an array such as you seem to have should be done using FORALL.
    http://www.morganslibrary.org/reference/array_processing.html#apfa
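    For comparison, the same "send the rows in bulk, commit once at the end" idea sketched in JDBC rather than PL/SQL (this is only an analogue of the FORALL approach the link describes; the table, column, and collection names are illustrative):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.util.List;

    public class BulkInsert {

        // Loads all rows in one transaction, flushing them to the database in
        // batches of 2000 instead of committing inside the loop.
        public static void insertAll(Connection con, List<String> actData) throws Exception {
            con.setAutoCommit(false);  // one transaction for the whole load
            try (PreparedStatement ps =
                     con.prepareStatement("INSERT INTO tab1 (col1) VALUES (?)")) {
                int pending = 0;
                for (String value : actData) {
                    ps.setString(1, value);
                    ps.addBatch();
                    if (++pending == 2000) {
                        ps.executeBatch();  // flush a batch ... but do NOT commit here
                        pending = 0;
                    }
                }
                ps.executeBatch();  // flush the remainder
                con.commit();       // commit once, at the end
            } catch (Exception e) {
                con.rollback();
                throw e;
            }
        }
    }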

  • Strange data inserted into table via table trigger

    Hi ,
    There is a strange phenomenon that happens occasionally: updates on some tables fire a TRIGGER that inserts records into another table, and once in a while the inserted record contains strange data that looks like memory corruption. The database is running 10.2.0.3.
    Does anyone ever encounter this before?
    Strange result:
    PRIM_KEY
    -3.614364951000000000000000000000000E-47
    -3.614364951000000000000000000000000E-47
    Normal result:
    PRIM_KEY
    1137KT
    1137KT
    ana

    Hi,
    What is strange about this? It's not memory corruption; it's just one of the numeric forms of representing a number:
    -3.614364951000000000000000000000000E-47 means -3.614364951 * 10 to the power of -47.
    Whatever value ends up in the table depends on the business logic you coded and on user input.
    Regards
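    For illustration, a tiny Java snippet that prints the same value in both forms; it only shows that E-notation is a display format for the number, not corrupted data:

    import java.math.BigDecimal;

    public class SciNotationDemo {
        public static void main(String[] args) {
            BigDecimal v = new BigDecimal("-3.614364951E-47");
            System.out.println(v);                  // -3.614364951E-47 (scientific notation)
            System.out.println(v.toPlainString());  // the same value written out as a plain decimal
        }
    }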

  • Deleting the data from logical file/unix file

    Hi all.
    I need to delete all the data from a logical file (application server file / Unix file), but I don't want to delete the logical file itself (only the data in the logical file should be deleted, i.e. the file should be made empty).
    Thanks in advance.
    Cheers.
    sami

    Hi Sami,
    Refer to this document: https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/4d7aeb7d-0c01-0010-fa8a-a4a8e8968a93
    Regards,
    Flavya

  • ABAP OO data fetch logic

    Hi Experts,
    I have a normal report with data fetch logic. I need to convert my normal code to OO ABAP logic using classes & methods.
    Exp:
          select knumh kotabnr vakey datab datbi knuma_bo
                 from konh
                 into corresponding fields of table i_konh
                 for all entries in i_kona
                 where knuma_bo eq i_kona-knuma.
    Is there an easy way to write this code in OO using classes & methods?
    Thanks.

    Hi Khan,
    Here is the code for your query using OO ABAP, i.e. a class and method.
    Code:
    TABLES: konh.
    DATA: i_konh TYPE TABLE OF konh,
          i_kona TYPE TABLE OF kona.
    CLASS class_name DEFINITION.
      PUBLIC SECTION.
        CLASS-METHODS get_details.
    ENDCLASS.
    CLASS class_name IMPLEMENTATION.
      METHOD get_details.
        " FOR ALL ENTRIES on an empty table would select every row, so guard it.
        IF i_kona IS NOT INITIAL.
          SELECT knumh kotabnr vakey datab datbi knuma_bo
                 FROM konh
                 INTO CORRESPONDING FIELDS OF TABLE i_konh
                 FOR ALL ENTRIES IN i_kona
                 WHERE knuma_bo EQ i_kona-knuma.
        ENDIF.
      ENDMETHOD.
    ENDCLASS.
    * Global data
    DATA: obj TYPE REF TO class_name.
    * Event block
    START-OF-SELECTION.
      CREATE OBJECT obj.
      CALL METHOD obj->get_details.
    Regards,
    Soundarya.

  • Data Conversion Logic

    I am trying to implement data conversion logic (for example for a UOM field). I want to load all the data conversion rules from a file into a cache and keep them there, so that when a mapping starts it just picks the rules up from the cache.
    Using GlobalContainer this works for one mapping, even with several UOM fields (the file is loaded only once). But the file is loaded again the next time the mapping is triggered.
    Is there any way to implement the data conversion so that the file is loaded from disk only once?
    Regards,
    Nick
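    One way to get the "load from the file only once" behaviour is sketched below, assuming the conversion runs in a Java mapping. The class, method, and file names are illustrative (this is not the GlobalContainer API): the parsed rules are kept in a lazily initialized static field, so the file is read at most once per JVM/class loader rather than once per mapping call.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public final class UomRuleCache {

        // Parsed conversion rules, shared by all mapping executions in this class loader.
        private static volatile Map<String, String> rules;

        private UomRuleCache() { }

        public static String lookup(String sourceUom) throws Exception {
            Map<String, String> local = rules;
            if (local == null) {
                synchronized (UomRuleCache.class) {
                    if (rules == null) {
                        rules = loadRulesFromFile("/tmp/uom_rules.csv");  // hypothetical path
                    }
                    local = rules;
                }
            }
            return local.get(sourceUom);
        }

        // Reads "sourceUom,targetRule" lines into a map; called only on the first lookup.
        private static Map<String, String> loadRulesFromFile(String path) throws Exception {
            Map<String, String> map = new ConcurrentHashMap<>();
            try (BufferedReader in = new BufferedReader(new FileReader(path))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] parts = line.split(",", 2);
                    if (parts.length == 2) {
                        map.put(parts[0].trim(), parts[1].trim());
                    }
                }
            }
            return map;
        }
    }

    Whether a static field actually survives between mapping runs depends on how the runtime manages class loaders, so treat this as a starting point rather than a guaranteed answer.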

    Hi Arnold,
    normally every input should be made in LC, so why are you entering the data in group currency?
    However, if you have to start from group currency, the only way to avoid this error is to add the ON_ERROR_CONTINUE clause at the end of SPRUNCONVERSION.
    But remember that it will then stop notifying you of every type of error, so you have to be very careful and keep this clearly in mind when an issue occurs with the currency conversion.
    Regards
         Roberto

  • Error while fetching data from Logical data base

    Hi All,
    I need to fetch data from the logical database DDF. First, when I try the demo code
    REPORT demo_program_read_tables_2.
    NODES: spfli, sflight.
    GET spfli FIELDS carrid connid cityfrom cityto.
      WRITE: / spfli-carrid,
               spfli-connid,
               spfli-cityfrom,
               spfli-cityto.
    GET sflight FIELDS fldate.
      WRITE: / sflight-fldate.
    I am getting an error like "spfli is not a node of the logical database __S".
    Should I declare anything else?
    Madhu

    REPORT demo_logical_database.
    DATA wa_spfli TYPE spfli.
    SELECT-OPTIONS s_carr FOR wa_spfli-carrid.
    DATA: callback TYPE TABLE OF ldbcb,
          callback_wa LIKE LINE OF callback.
    DATA: seltab TYPE TABLE OF rsparams,
          seltab_wa LIKE LINE OF seltab.
    callback_wa-ldbnode     = 'SPFLI'.
    callback_wa-get         = 'X'.
    callback_wa-get_late    = 'X'.
    callback_wa-cb_prog     = sy-repid.
    callback_wa-cb_form     = 'CALLBACK_SPFLI'.
    APPEND callback_wa TO callback.
    CLEAR callback_wa.
    callback_wa-ldbnode     = 'SFLIGHT'.
    callback_wa-get         = 'X'.
    callback_wa-cb_prog     = sy-repid.
    callback_wa-cb_form     = 'CALLBACK_SFLIGHT'.
    APPEND callback_wa TO callback.
    seltab_wa-kind = 'S'.
    seltab_wa-selname = 'CARRID'.
    LOOP AT s_carr.
      MOVE-CORRESPONDING s_carr TO seltab_wa.
      APPEND seltab_wa TO seltab.
    ENDLOOP.
    CALL FUNCTION 'LDB_PROCESS'
         EXPORTING
              ldbname                     = 'F1S'
              variant                     = ' '
         TABLES
              callback                    = callback
              selections                  = seltab
         EXCEPTIONS
              ldb_not_reentrant           = 1
              ldb_incorrect               = 2
              ldb_already_running         = 3
              ldb_error                   = 4
              ldb_selections_error        = 5
              ldb_selections_not_accepted = 6
              variant_not_existent        = 7
              variant_obsolete            = 8
              variant_error               = 9
              free_selections_error       = 10
              callback_no_event           = 11
              callback_node_duplicate     = 12
              OTHERS                      = 13.
    IF sy-subrc <> 0.
      WRITE: 'Exception with SY-SUBRC', sy-subrc.
    ENDIF.
    FORM callback_spfli USING name  TYPE ldbn-ldbnode
                              wa    TYPE spfli
                              evt   TYPE c
                              check TYPE c.
      CASE evt.
        WHEN 'G'.
          WRITE: / wa-carrid, wa-connid, wa-cityfrom, wa-cityto.
          ULINE.
        WHEN 'L'.
          ULINE.
      ENDCASE.
    ENDFORM.
    FORM callback_sflight USING name  TYPE ldbn-ldbnode
                                wa    TYPE sflight
                                evt   TYPE c
                                check TYPE c.
      WRITE: / wa-fldate, wa-seatsocc, wa-seatsmax.
    ENDFORM.

  • How can I get my Oxygen 8 to transfer data into Logic?

    I use an Oxygen 8 MIDI keyboard and the PreSonus FireBox interface with my Logic Pro 7.
    How can I get my Oxygen 8 to transfer data into Logic?
    It's as if there is no connection between the two,
    but it's not the cables, since everything was running fine before.
    Can anybody help me?

    Go into the Audio MIDI Setup utility and check whether you see it there. Considering that the Oxygen 8 is class compliant, you should just be able to plug it into the Mac. Perhaps all it needs is to be power-cycled (turned off and on again).
    jord

  • Customizing data insert form

    I have a data insert form that is used solely to insert data into a database. I am trying to figure out a way to customize it so that a few fields get auto-filled. I use the form to insert data into a table called maintable. 8 of the 10 fields of this maintable are filled from different vendors' tables (currently I enter them manually). What I would like to do is integrate the vendor's table in the page and write code to auto-fill the form that puts the data into the maintable. I know that maybe a while statement will do it. So if in the form I input the vendor's name and a product id, it should get the data from the vendor's table and the relevant product to fill the rest of the fields. Any guidance will be greatly appreciated, as I am very new to web development and don't have much experience with PHP code.
    Thanks

    Maybe you can try it like this. Assuming you have two inputs in the form, which I have assigned as below:
    $product_id  = $_POST['pid'];
    $vendor_name = $_POST['vendor_name'];
    Then check the vendor name input:
    if (isset($_POST['vendor_name']) && !empty($_POST['vendor_name'])) {
        $query  = "SELECT * FROM vendor WHERE vendor_name = '$vendor_name'";
        $result = mysql_query($query, $databaseName) or die(mysql_error());
        $vendor = mysql_fetch_assoc($result);
    } else {
        echo "Vendor name doesn't exist";
    }
    $var = $vendor['var'];
    // assign the rest of the values you want to store in maintable
    // then do the insert
    $insert = "INSERT INTO maintable (....) VALUES ('$vendor_name', '$var', .....)";
    mysql_query($insert, $databaseName) or die(mysql_error());

  • Combo Box doesn't persist data insertion settings

    Hi,
    I'm simply trying to set up a combo box that places the selected label into a destination table cell. I've tried about every combination possible to simply add a data insertion series. Please note, I AM clicking the plus button to add it to the list shown to the left of the settings. Unfortunately, every time I add the series and then click away from the selected combo box, it loses the settings I just provided.
    Has anybody else had these frustrating issues with the combo box component?
    Thank you.

    Greg,
    The combo box component can be a bit fiddly at times. It would help if you let me know which version of Xcelsius you are using.
    In any case, if you are simply trying to insert the label into a destination you don't need to bother with the series at all. Try it in this order:
    1) Drag in a combo box component
    2) In the combo box properties pane, click the range select button beside Labels
    3) Select the range in your Excel sheet that has the labels and click OK
    4) Go back to the properties pane and choose Insertion Type: Label
    5) Click the range select button beside Destination and select a cell
    6) To test it, drag in a Text -> Label component and point it at the destination cell you selected in step 5
    7) Preview to see if all is in order
    Let me know how you get on.
    Cheers,
    Ryan

  • How to control the amount of sustain pedal data that Logic Pro 9 records

    Any ideas on how to control the amount of sustain pedal data that Logic Pro 9 records? Every time I press the sustain pedal down, it records 4 sustains. Same thing when I release the pedal. Thanks!

    This gets confusing: you have two Macs with different issues and two pedals with different issues? First let's sort out what's what. Do the issues come from the pedals or from the Mac software? To find out, simply swap the pedals and note their behaviour. No change: it's the software; change: it's the pedals (imho the most likely scenario).
    Second, what kind of pedals are they? Can you name the exact type of pedal too? The Yamaha what and the Mgear which?
    Third, with the 4-fold data problem, are the duplicated events simultaneous, a few ticks apart, or more?
    Also, you have no MIDI keyboard? Are the pedals both "standalone"?
    I think you have to sort this problem out at its source, which is the pedals. You could, if push really comes to shove, perhaps create some sort of ingenious/elaborate transformer setup in the Environment to filter the extra messages, although with the 4/4 sustain events that may not even be possible. With the "stepped" events you could set up a filter to simply block all events with values of 20-110 or so.
    But it would be much better to configure the pedal(s) correctly.

  • SQL server Timeout issue in data insertion

    We are loading data from a DB2 database into SQL Server using a Data Flow task. During the data insertion into SQL Server we receive the error below.
    "An exception has occurred during data insertion, the message returned from the provider is: Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding."
    Kindly help to resolve this issue.
    Thanks!!!

    Check the connection timeout property of the source connection.
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/a5571966-b14e-45c6-9ce8-4f5651d3ee00/ado-net-destination-timeout-issue?forum=sqlintegrationservices
    Regards, RSingh

  • Data Guard logical standby Versus Streams

    I'm referring to both Oracle 10g/9i
    If a Data Guard logical standby database uses technology similar to Streams (LogMiner and SQL Apply), why can't you stand up a standby database on a different platform? At least I have found nothing on the subject.
    Yet in 11g, Oracle Data Guard (physical standby database) is offered as a solution for same-endianness platform migration.
    I will appreciate any insight on the subject.

    Yes, that's true, both use the same technology:
    SQL --> redo (block-level changes) --> LogMiner --> SQL apply
    But there are serious implementation differences:
    1) Oracle Data Guard is designed to protect against data failures and disasters.
    Streams is designed for information sharing and distribution, but it can also provide a very efficient high-availability solution.
    2) Streams is configured from the bottom up: individual tables, schemas, capture processes, apply processes, queues.
    Logical standby is configured from the top down: you start with the entire database and then specify only what you don't want.
    Because logical standby works top down and changes are captured at the remote location (the logical standby database), the archive logs have to be shipped to the remote site via FAL client/server, and to ship archive logs in a Data Guard configuration all members must run on the same platform.
    As said before, Streams is configured from the bottom up: it starts with tables --> schemas --> database, and changes can be captured at the local or the remote location. If changes are captured locally, the target Streams database can be on a different platform. Downstream capture, however, does require the same platform, just like a logical standby database, so that archive logs can be transported from the source to the downstream database.

  • Exception has occurred during data insertion ... Wrong number of parameters

    I am using an ADO dataflow source and an ADO dataflow destination to move data from a table on an IBM iSeries DB2 database to another IBM iSeries DB2 database on another machine. The connection managers for source and destination both use the .Net Providers\Odbc
    Data Provider, with the Data Source Specification being ODBC connections defined on the Windows Server 2008 R2 machine hosting BIDS. All the test connections work... so far so good.
    The issue is that when I execute the control flow task, the rows get picked up from the source DB2 database without a problem, but it fails on the insert into the destination DB2 table. The progress tab shows the exceptions:
    [ADO NET Destination [601]] Error: An exception has occurred during data insertion, the message returned from the provider is: ERROR [07002] [IBM][System i Access ODBC Driver]Wrong number of parameters.
    [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "ADO NET Destination" (601) failed with error code 0xC020844B while processing input "ADO NET Destination Input" (604). The identified
    component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the
    failure.
    I put a data viewer on the data flow connection and the records I expect are getting picked up and look OK. I checked to make sure the source and destination tables have the same number of columns in their record layouts and identical data
    types, and they look fine.
    Could someone kindly tell me what's going wrong? Does the error message indicate that some of the columns are getting dropped from the dataset before the insert?
    Thanks much in advance for any help, Roscoe

    Hi Roscoe,
    Glad to hear that you have found the root cause. The issue occurs because numeric values with no decimal part cannot be inserted into the Decimal column in the DB2 table; in SSIS, the DB2 Decimal data type is mapped to DT_NUMERIC.
    To address the issue, you can add a Data Conversion Transformation before the ADO NET Destination to convert the column in question and define its precision and scale, for example DT_NUMERIC(9,2).
    Regards,
    Mike Yin
    TechNet Community Support
