In BDC, while transferring data from itab to DB we get an error. How do I handle it?

Hi guys,
1. While uploading data from a flat file to an internal table, we get:
    Lifnr   name1   ort01
    123 #    Eswar#    in ##
But while transferring the data from itab to DB we get an error. How do I handle that?
Please could you give me the logic? I'm waiting for your reply.
Ishwar.

Well, without telling us WHAT error message you get, no one will be able to help you.
Please help us to help you. Thanks.
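
In the meantime, here is a minimal sketch of how errors are usually caught when the itab data is posted via BDC, assuming CALL TRANSACTION is used; the transaction XK01 and all table and variable names are only placeholders:

DATA: gt_bdcdata  TYPE TABLE OF bdcdata,     " screen/field instructions built from the itab record
      gt_messages TYPE TABLE OF bdcmsgcoll,  " messages returned by the transaction
      gs_message  TYPE bdcmsgcoll,
      gv_text     TYPE string.

" ... fill gt_bdcdata for one record of the internal table ...

" Post the data and collect all messages instead of stopping at the first error
CALL TRANSACTION 'XK01' USING gt_bdcdata
                        MODE   'N'
                        UPDATE 'S'
                        MESSAGES INTO gt_messages.

" Turn error and abort messages into readable text and log them
LOOP AT gt_messages INTO gs_message WHERE msgtyp = 'E' OR msgtyp = 'A'.
  MESSAGE ID gs_message-msgid TYPE 'S' NUMBER gs_message-msgnr
          WITH gs_message-msgv1 gs_message-msgv2
               gs_message-msgv3 gs_message-msgv4
          INTO gv_text.
  WRITE: / gs_message-msgtyp, gv_text.
ENDLOOP.

With MODE 'N' no screens pop up; the collected messages tell you which records were rejected and why, so they can be corrected and reprocessed.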

Similar Messages

  • Error while transferring data from SNP

    Dears,
    I am facing the below error while trying to transfer data from DP (/SAPAPO/MC90).
    Thanks for the help

    Hi Vijesh
    You may start looking into the issue by checking whether all Product/Location combinations mentioned in the selection exist as CVCs in the MPOS. This should be followed by running a consistency check through /SAPAPO/TS_LCM_CONS_CHECK. This previous post may be helpful: https://scn.sap.com/thread/654202.
    Thanks,
    Prasun

  • When attempting to transfer ebooks from library to Nook I get error message "Failed to write destination file."

    Any help here? The ebooks work fine on the computer, so it's not the download itself that's the problem.

    I did that prior to coming here. I've done everything that is mentioned there with no change in the situation.

  • OBJECTS_NOT_FLATLIKE error while transferring data to a file

    Hi experts,
    I am getting a dump OBJECTS_NOT_FLATLIKE while transferring data from an internal table to a file. The file basically contains bank transmission data in SAP.
    Appreciate your advice and help.
    Thanks

    Hi,
    1. If your files are on a remote File System, you will need to copy one of your files to the machine ODI Designer is running on to allow ODI to retrieve the metadata information of the file.
    2. In Topology, create a Physical Schema; the directories you enter for the Data and Work Schema should point at this local file.
    3. Then define the File Datastore in ODI Designer. Enter a name, browse and select the file, and fill in each field of the Files tab.
    * If it is a Fixed file, click on the grid icon on the Columns tab to enter the columns, and keep Automatic Adjustment checked.
    * If it is a Delimited file, use the Reverse button on the Columns tab to reverse-engineer the columns.
    * Right-click on the File Datastore and select View Data; if you can view data, the File Datastore has been defined correctly.
    * If not, check each tab of the File Datastore to make sure everything is defined correctly and retry.
    4. Once View Data is successful, change the directories (Data and Work Schema in Topology) to point at the remote File System. These directories must be accessible to the ODI Agent that will be used to run the transformations. The directory can be an absolute path (m:/public/data/files) or relative to the ODI Agent startup directory (../demo/files). It is strongly advised to use a UNC path (independent of the execution location). When running the transformations with "no agent", the directory is relative to the directory where Oracle Data Integrator has been installed.
    You need to have an agent process running on the system where your source file resides.
    Then, while running the ODI interface, choose that agent.
    Thanks,
    Sutirtha
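
    Coming back to the original ABAP dump: OBJECTS_NOT_FLATLIKE is typically raised when a statement such as TRANSFER in text mode gets an operand that is not flat and character-like, for example a structure containing a string or an internal table. A minimal sketch of the usual fix (all field names and the path are purely illustrative) is to build one flat, character-like line per record and transfer that instead of the structure itself:

    DATA: BEGIN OF gs_bank,
            bankl TYPE c LENGTH 15,
            bankn TYPE c LENGTH 18,
            note  TYPE string,                   " deep component: the structure is not flat
          END OF gs_bank,
          gt_bank LIKE TABLE OF gs_bank,
          gv_line TYPE string,
          gv_file TYPE string VALUE '/tmp/bank.txt'.

    OPEN DATASET gv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    LOOP AT gt_bank INTO gs_bank.
      " Transferring gs_bank directly would dump; build a flat character line instead
      CONCATENATE gs_bank-bankl gs_bank-bankn gs_bank-note
             INTO gv_line SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
      TRANSFER gv_line TO gv_file.
    ENDLOOP.
    CLOSE DATASET gv_file.

    If the structure is flat but contains packed or other numeric fields, converting them with WRITE ... TO into character fields before the CONCATENATE achieves the same thing.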

  • Error while loading data from FDM (EPM 11.1.1.3)

    Hi,
    We are loading data into HFM from FDM. While loading data from FDM, it throws the error below.
    "Load data started: 5/20/2012 9:30:45 PM
    Line 27, Error: Cell for Period Mar is not an input cell.
    Actual,2011,Mar,YTD,ENT_JJ,JPY,A23232323,ENT_J76;[None];[None];[None];ICP_Load;3463437"
    There were multiple lines throwing this error.
    After doing some checking we found that when we unpost the ICP data in HFM, we are able to load data from FDM. We have not locked the ICP entities because that would restrict data loading. But we do not understand why simply unposting the ICP values allows us to load the data from FDM. Also, we have not closed the period in HFM.
    Kindly suggest, what could be the possible reason.
    Thanks
    Edited by: user12121146 on May 27, 2012 5:29 AM

    One way to figure this out would be to open a grid at the same POV.
    It will rule out a couple things.
    If it is locked, you'll visually see that it is locked. (will be grey I believe)
    If it is an invalid intersection, you'll see that as well. (will be red)
    If you do not have access to the cell, you will see that if you look at cell status, it will say NO ACCESS.
    That should help you figure out what is happening, the quick and dirty way.

  • Deadlock issue while sending data from PI to JDBC !!!

    Hi All,
    We are getting below error while sending data from PI to JDBC (Database).
    Error - JDBC message processing failed; reason Error processing request in sax parser: Error when executing statement for table/stored proc. 'store_order_details' (structure 'Order_details_Statement'): com.microsoft.sqlserver.jdbc.SQLServerException: Transaction (Process ID 61) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
    This is happening when more than one PI message is trying to deliver data at the same time to database.
    Can anyone let us know the root cause of this issue and how we can rectify it?
    Is there some setting we can use for parallel processing, as delivering one message at a time to the database is causing performance issues and taking a lot of time?
    Thanks
    Neha Verma

    Hello Neha,
    Strange, but can you please get the information below?
    Please check with the DB admin whether the user is getting locked or whether there are any hanging threads related to the user.
    Also confirm with the DB admin whether the exclusive lock is on the table or on the row when you try inserting or updating information.
    You can share the user from the receiver channel.
    Regards,
    Hiren A.

  • Error 65575: overrun error while reading data from NI 9871

    Hello everyone,
    I am new here. I configured an NI 9871 in RS-485 mode to acquire data from a sensor at a data rate of 8 samples per second. While reading data from the sensor, I sometimes get error 65575: "An overrun error has occurred in the data being read. Ensure that the module Flow Control is configured correctly." I don't understand the error: how can I fix it, and what does flow control mean?
    I will be very thankful if anybody helps me fix this error.
    Thanks
    Wasif

    Hello Wasif,
    The error you are getting presents itself when the hardware buffer is full and new data is coming in. Data buffer is 64 bytes. One thing to try is to clear the RX FIFO after every read. Please refer to the following link that talks about the method.
    http://zone.ni.com/reference/en-XX/help/370984R-01/lvaddon11/crio-9871/
    Hope this helps you solve the issue.
    Regards
    Luis S
    Application Engineer
    National Instruments

  • Transfer data from Palm LifeDrive to Palm Treo Pro (Unlocked)

    Hi, as it says in the title, I want to transfer data from a Palm LifeDrive to a Palm Treo Pro. How can I do that? Do I need special software?
    Thank you in advance.
    Ivan Rodriguez.
    Post relates to: Treo Pro T850U (Unlocked)

    You can do this two different ways. You can set your Palm Desktop software to sync to Outlook, and then sync your LifeDrive to Outlook. You would then be able to just sync the Outlook data to your Treo Pro. The other option would be to purchase the Chapura Pocket Copy software, which will copy the data in the Palm Desktop application to Outlook. Once again you would just sync your Treo Pro to Outlook. You can get the Pocket Copy application from Chapura.com.

  • "Transfer structure  does not exist" while deleting data from a DSO

    Hello experts,
    while deleting data from a DSO I get the following messages:
    Information:     Database table /BIC/AAKIVO1400 was deleted and recreated
    Error:     Transfer structure does not exist
    When I try it a second time, the process works without any problems. I then get the following messages:
    Information:     InfoCube / DataStore object "AKIVO14" contains no data (yet)
    Information:     Database table /BIC/AAKIVO1400 was deleted and recreated
    So I think the first deletion worked but ended with errors. What could be the reason?

    Please check whether the data from the new and changelog tables has also been deleted.
    For the error message: do you have a 3.5 flow? Check whether your transfer structure is active, and then try again to right-click and delete the data from the DSO.
    - Monika

  • How do I transfer data from one internal table to another

    Hi All,
         How do I transfer data from one internal table to another?
         Can I do it even if the tables have different structures?
    Please advise.
    Thanks in advance.

    Hi Saket Tiwari,
    I hope the earlier post by Kashyap is a good enough answer. Anyway, in addition to it, let me give a detailed
    explanation of how you can populate an internal table.
    1) Append data line by line.
         Syntax: APPEND [<wa> TO / INITIAL LINE TO] <itab>.
    This appends a new line to the internal table <itab>.
    2) Using the COLLECT statement.
         COLLECT is another statement used for populating internal tables. Generally COLLECT is used when inserting lines into an internal table with a unique standard key: if a line with the same key already exists, the numeric fields of the new line are added to it instead of a duplicate being appended. The syntax for the COLLECT statement is as shown:
         Syntax : COLLECT [<wa> INTO] <itab>.
    3) Using INSERT statement
         Syntax  INSERT [<wa> INTO / INITIAL LINE INTO] <itab> [index <idx>].
    INSERT statement adds a line/work area to the internal table. You can specify the position at which the new line is to be added by using the INDEX clause with the INSERT statement.
    Now coming to your request:
    a) To append part or all of an internal table to another
         Syntax
                  APPEND LINES OF <itab1> [FROM <n1>] [TO <n2>] TO <itab2>.
    Note: without the FROM and TO options, this statement appends the entire table <itab1> to <itab2>.
    b) To insert part or all of an internal table into another internal table
         Syntax
              INSERT LINES OF <itab1> [FROM <n1>] [TO <n2>]
              INTO <itab2> [INDEX <idx>].
    c) Using the MOVE statement.
    To copy the entire contents of one table into another in one statement
         Syntax MOVE <itab1> TO <itab2>.
                   OR
              <itab2> = <itab1>.
    But you have to be careful, because the previous contents of itab2 are overwritten by this statement.
    These copy the contents of ITAB1 to ITAB2. In the case of internal tables with a header line we have to use [] in order to distinguish the table body from the work area. So, to copy the contents of internal tables with header lines, the syntax becomes
    ITAB2[] = ITAB1[].
    Coming to the latter part of your question: yes, we can copy values between tables having different structures.
    For this we use
                                MOVE-CORRESPONDING <itab1> TO <itab2>.
    This executes the statement on their header lines: it searches for the components that occur in both itab1 and itab2 and, for each matching pair of fields ni, generates a statement of the form MOVE itab1-ni TO itab2-ni. The other fields remain unchanged. A short example follows after this reply.
    I hope the information provided has been of your help.
    Reward if useful.
    Regards,
    Jose
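
    A compact sketch pulling the statements above together (the line types ty_src/ty_dst and the field names are purely illustrative):

    TYPES: BEGIN OF ty_src,
             matnr TYPE c LENGTH 18,
             maktx TYPE c LENGTH 40,
             menge TYPE i,
           END OF ty_src,
           BEGIN OF ty_dst,
             matnr TYPE c LENGTH 18,
             menge TYPE i,
           END OF ty_dst.

    DATA: lt_src  TYPE STANDARD TABLE OF ty_src,
          lt_copy TYPE STANDARD TABLE OF ty_src,
          lt_dst  TYPE STANDARD TABLE OF ty_dst,
          ls_src  TYPE ty_src,
          ls_dst  TYPE ty_dst.

    " Same structure: copy the whole table body in one statement
    lt_copy = lt_src.
    " ... or append only part of it
    APPEND LINES OF lt_src FROM 1 TO 10 TO lt_copy.

    " Different structure: copy the identically named fields row by row
    LOOP AT lt_src INTO ls_src.
      MOVE-CORRESPONDING ls_src TO ls_dst.   " matnr and menge are copied, maktx is dropped
      APPEND ls_dst TO lt_dst.
    ENDLOOP.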

  • How to download data from itab to an Excel sheet

    Hi sap Experts,
    I developed a report to download data from an itab to an Excel sheet, and that part works fine.
    My doubt is: in the Excel sheet I need to put headings, and at the end I need to display the total number of records.
    Please help me how to do this.
         Thanks in Advance....
    Thanks and Regards
    Siri.......................

    You can either use GUI_DOWNLOAD or the method below (note that OPEN DATASET writes the file on the application server, not on your PC); a GUI_DOWNLOAD sketch also follows after this code. Just give it a try:
    DATA: wk_string    TYPE string.
    DATA: wk_ext(4)    TYPE c VALUE '.xls'.
    DATA: wk_path      TYPE string.
    DATA: wk_file_name TYPE string.
    DATA: wk_records   TYPE sy-tfill.
    DATA: wk_count(10) TYPE c.
    wk_file_name = 'Downloaded'.
    DESCRIBE TABLE itab LINES wk_records.
    LOOP AT itab.
      AT FIRST.
        " Build the path and open the file
        CONCATENATE 'C:\' wk_file_name wk_ext INTO wk_path.
        CONDENSE wk_path.
        OPEN DATASET wk_path FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
        IF sy-subrc <> 0.
          WRITE: / 'Error while opening dataset'.
          EXIT.
        ENDIF.
        " Heading part
        CONCATENATE 'Part No' 'Description'
          INTO wk_string SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
        TRANSFER wk_string TO wk_path.
      ENDAT.
      " Data lines from itab
      CONCATENATE itab-matnr itab-maktx INTO wk_string
        SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
      TRANSFER wk_string TO wk_path.
      AT LAST.
        " Total number of records at the end of the file
        WRITE wk_records TO wk_count LEFT-JUSTIFIED.
        CONCATENATE 'Records Passed:' wk_count INTO wk_string SEPARATED BY space.
        TRANSFER wk_string TO wk_path.
        CLOSE DATASET wk_path.
      ENDAT.
    ENDLOOP.
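
    And here is a minimal sketch of the GUI_DOWNLOAD alternative mentioned above, assuming the same header-line table itab with fields matnr and maktx (the file name is only a placeholder):

    DATA: lt_lines     TYPE TABLE OF string,
          lv_line      TYPE string,
          lv_filename  TYPE string VALUE 'C:\Downloaded.xls',
          lv_records   TYPE sy-tfill,
          lv_count(10) TYPE c.

    " Heading row
    CONCATENATE 'Part No' 'Description' INTO lv_line
      SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
    APPEND lv_line TO lt_lines.

    " Data rows
    LOOP AT itab.
      CONCATENATE itab-matnr itab-maktx INTO lv_line
        SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
      APPEND lv_line TO lt_lines.
    ENDLOOP.

    " Trailer with the total number of records
    DESCRIBE TABLE itab LINES lv_records.
    WRITE lv_records TO lv_count LEFT-JUSTIFIED.
    CONCATENATE 'Records Passed:' lv_count INTO lv_line SEPARATED BY space.
    APPEND lv_line TO lt_lines.

    " Download the tab-separated lines to the presentation server
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename = lv_filename
        filetype = 'ASC'
      TABLES
        data_tab = lt_lines
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc <> 0.
      WRITE: / 'Error while downloading the file'.
    ENDIF.

    Excel opens the resulting tab-separated file, showing the heading row first and the record count as the last line.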

  • How to transfer data from one internal table to another

    For a particular object ID, one internal table JTAB contains one record (row) and another internal table KTAB contains 3 records (rows). How should I transfer data from KTAB to JTAB? What are the various ways to do that? Please help me out. Thanks in advance.

    Try something like this.
    If you want one itab record per ktab record:
    LOOP AT jtab.
      MOVE-CORRESPONDING jtab TO itab.
      LOOP AT ktab WHERE id = jtab-id.
        MOVE-CORRESPONDING ktab TO itab.
        APPEND itab.
      ENDLOOP.
    ENDLOOP.
    or
    LOOP AT ktab.
      READ TABLE jtab WITH KEY id = ktab-id. " binary implicit if sorted type
      MOVE-CORRESPONDING jtab TO itab.
      MOVE-CORRESPONDING ktab TO itab.
      APPEND itab.
    ENDLOOP. 
    If you want to sum the ktab values into itab for each jtab record:
    LOOP AT jtab.
      MOVE-CORRESPONDING jtab TO itab.
      LOOP AT ktab WHERE id = jtab-id.
        ADD-CORRESPONDING ktab TO itab.
        APPEND itab.
      ENDLOOP.
    ENDLOOP.
    Use a SORTED table when you LOOP AT ... WHERE; otherwise a SORT of the standard table is enough.
    Regards

  • How to transfer data from a dynamic internal table

    Hi All
    I want to transfer data from a dynamic internal table <dyn_table>
    to a non-dynamic internal table itab, which should have the same structure as <dyn_table>.
    How can this be done?
    Regards,
    Harshit Rungta

    As stated earlier, this can be done only through field symbols.
    You cannot declare a non-dynamic internal table with an arbitrary structure using the DATA statement.
    If the structure is known at compile time, well and good: you can declare a non-dynamic internal table.
    If you do not know the structure, then the second table also has to be created dynamically, using a data reference and a field symbol:
    DATA: lv_ref TYPE REF TO data.
    FIELD-SYMBOLS: <fs_dyn_table> TYPE STANDARD TABLE.
    " Create a second table with the same type as your dynamic table <dyn_table>
    CREATE DATA lv_ref LIKE <dyn_table>.
    ASSIGN lv_ref->* TO <fs_dyn_table>.
    " Now do the transfer
    <fs_dyn_table> = <dyn_table>.
    A self-contained sketch follows after this reply.
    Hope it helps!
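
    A self-contained sketch of the whole flow, assuming for illustration that the dynamic table is built at runtime from a DDIC structure name (SFLIGHT here is only an example):

    DATA: lv_struct TYPE string VALUE 'SFLIGHT',
          lo_struct TYPE REF TO cl_abap_structdescr,
          lo_table  TYPE REF TO cl_abap_tabledescr,
          lr_data   TYPE REF TO data,
          lr_copy   TYPE REF TO data.
    FIELD-SYMBOLS: <lt_dyn>  TYPE STANDARD TABLE,
                   <lt_copy> TYPE STANDARD TABLE.

    " Build the table type at runtime from the structure name
    lo_struct ?= cl_abap_structdescr=>describe_by_name( lv_struct ).
    lo_table   = cl_abap_tabledescr=>create( lo_struct ).
    CREATE DATA lr_data TYPE HANDLE lo_table.
    ASSIGN lr_data->* TO <lt_dyn>.

    " Fill the dynamic table (just a sample selection here)
    SELECT * FROM sflight INTO TABLE <lt_dyn> UP TO 10 ROWS.

    " Create a second table with exactly the same structure and copy the data
    CREATE DATA lr_copy LIKE <lt_dyn>.
    ASSIGN lr_copy->* TO <lt_copy>.
    <lt_copy> = <lt_dyn>.

    If the target really has to be a statically declared table, its line type must be known at compile time; in that case declare it with that line type and copy the data with a plain assignment.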

  • Fatal error while transferring data from one bb to another

    I was trying to transfer data from one BB to a different BB using the BB Desktop Manager tool. It collected all the data from the first BB fine, and then while transferring data to the new one it said a fatal error occurred and to try again. That BB will not even turn on now. No white screen or anything. It's like there is no operating system. I don't know what to do. I tried to connect it and just update the software, but that isn't working either. It says it is not responding during initialization! I think I killed it... any ideas what to do?

    You can try to reload the OS in one of the following ways
    http://www.blackberry.com/btsc/KB03485
    http://blackberryfaq.net/index.php/How_do_I_wipe_the_BlackBerry_using_Jl_Cmder%3F
    If someone has been helpful please consider giving them kudos by clicking the star to the left of their post.
    Remember to resolve your thread by clicking Accepted Solution.

  • Error while extracting data from Generic datasource

    Hello Gurus,
    I have encountered an error "Errors in source system" with red status while extracting data from a generic DataSource into a DSO.
    I have done the below things:
    1. Checked the jobs in the source system; the job completed successfully.
    2. The generic DataSource is active and supports delta based on calendar day in the source system.
    3. No tRFC errors and no stuck IDocs.
    4. No short dumps in the source system.
    5. The delta queue (RSA7) is showing 0 records with a green status.
    6. Replicated the DataSource on the BW side and also activated the transfer rules with RS_TRANSU_ACTIVATE_ALL.
    The problem is still not solved. Could you please suggest what might be the reason for this error?
    Thanks,
    Sonu

    Hello,
    I have used generic delta based on calendar day.
    Safety upper limit: 1
    Safety lower limit: 0
    I have made entries in transaction VA01, but the delta queue is not getting updated; it should show 1.
    Does generic extraction require any job to transfer data from the database table to the delta queue?
    We are not able to get the delta records on the BW side. Please suggest.
    Thanks,
    Sonu
