Custom delta extractor: All data deleted in source table in R/3

Hi everyone,
I have built a custom delta extractor from R/3 to a BW system. The setup is as follows:
The source table in R/3 holds a timestamp, which is used for the delta. The data is then loaded into a DSO in the BW system. The extractor works as expected with delta capability. Furthermore, if I delete a single record in the source table, the deletion is not transmitted to the DSO, which is also as expected.
The issue is this, however: if we delete all data in the source table, the next load shows a request with 1 record transferred to the DSO. This request does not show up in the PSA, however, and afterwards all data fields in the DSO are set to initial.
Does anyone know why this happens?
Thank you in advance.
Philip R. Jarnhus

Hi Philip,
Since you have used a generic extractor I am not sure how ROCANCEL will behave, but you can check the documentation on 0RECORDMODE for more information.
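For intuition, deletions only reach a DSO when the extractor sends a deletion image (ROCANCEL mapped to 0RECORDMODE). Below is a minimal, hypothetical simulation of how such images are applied to an active table; the semantics are simplified and the data invented, so treat it as a sketch, not SAP's actual activation logic:

```python
# Hypothetical sketch of applying 0RECORDMODE images to a DSO active table:
# '' = after image (overwrite), 'D' = deletion image (remove the record).
# Other modes (e.g. 'X' before image) are ignored here for simplicity.

def apply_delta(active, delta):
    """Apply a list of (key, recordmode, value) delta records in order."""
    for key, mode, value in delta:
        if mode == "D":      # deletion image removes the record
            active.pop(key, None)
        elif mode == "":     # after image overwrites the record
            active[key] = value
    return active

active = {"1000": 42, "1001": 7}
delta = [("1000", "", 55), ("1001", "D", None)]
print(apply_delta(active, delta))  # {'1000': 55}
```

If your extractor does not send such images on deletion, the DSO has no way to know the source rows are gone, which is consistent with the behaviour you describe.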
Regards,
Durgesh.

Similar Messages

• Analyse big data in Excel? Why doesn't the dynamic (pivot) table take all the data from the source table?

    Hi,
I'm doing an internship on a production line.
My job is to recover production data (input data) and test data (output data) using various types of software (Excel, SAP BusinessObjects, etc.).
So far I have recovered hundreds of production data points and organized them in Excel, but I need to analyze and plot them.
I would like some ideas on how I could plot and analyze that much data.
Right now I am trying to use dynamic (pivot) charts to plot some data, but I have not obtained acceptable results.
How could I compare, analyze and graph, for example, five columns of production (input) data against five columns of tested (output) data, and then graph the result?
Can someone suggest a technique to analyze the data? Should I compare column by column, or is there some other technique, such as analyzing the data as an aggregate?
To give you an idea of the context: I am doing my internship at a turbine manufacturer. My job is to analyze the input (production) data and estimate the likely behavior of the turbines in the tests.
As I said, I use dynamic (pivot) tables in Excel, but I have no idea why they don't take all the data from the source table.
    I appreciate your advice
    Thanks

You can declare the PivotTable source as whole columns [$A:$E], without row numbers.
Then you'll always have all the current data.
    Oskar Shon, Office System MVP - www.VBATools.pl

• How to delete the source table rows once they are loaded into the destination table in SSIS?

Database: kssdata
Table: Userdetails, with 1000 rows
Using SSIS, I have:
OLE DB Source -----------------> OLE DB Destination
I am loading 200 rows from the source table into the destination table at a time.
Constraint: once 200 rows have been exported to the destination table, those 200 rows must be deleted from the source table.
The task then repeats, taking the next 200 rows, until all records from the source table have been loaded into the destination table.

Provided you have a sequential primary key or an audit timestamp (datetime/date) column in the table, you can take an approach like this:
1. Add an Execute SQL Task connecting to the source DB with the statement below, and store the result in a variable:
SELECT COUNT(*) FROM table
2. Have another variable (say BatchCount) and set it to the expression below, with EvaluateAsExpression set to true. Here CountVariable is the variable created in the previous step:
(@[User::CountVariable] / 200) + (@[User::CountVariable] % 200 > 0 ? 1 : 0)
3. Have a For Loop Container with these settings:
InitExpression
@NewVariable = @BatchCount
EvalExpression
@NewVariable > 0
AssignExpression
@NewVariable = @NewVariable - 1
4. Inside the loop, add a Data Flow Task with an OLE DB source and an OLE DB destination.
5. Use a source query like the one below. Use the PK or the audit column, whichever is sequential:
SELECT TOP 200 columns...
FROM Table
ORDER BY [PK | AuditColumn]
6. After the data flow task, have an Execute SQL Task with this statement:
DELETE t
FROM (SELECT ROW_NUMBER() OVER (ORDER BY PK) AS Rn
      FROM Table) t
WHERE Rn <= 200
This makes sure the same 200 records that were just copied get deleted each time.
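Outside the SSIS designer, the same copy-then-delete batching can be sketched in plain code. The snippet below is a rough illustration using Python and SQLite, with made-up table and column names rather than the poster's actual schema; deleting by the captured key values (instead of re-ranking with ROW_NUMBER) guarantees the deleted rows are exactly the rows that were copied:

```python
import sqlite3

BATCH = 200  # rows moved per iteration, as in the SSIS loop above

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source (id INTEGER PRIMARY KEY, payload TEXT);
    CREATE TABLE destination (id INTEGER PRIMARY KEY, payload TEXT);
""")
conn.executemany("INSERT INTO source VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 501)])
conn.commit()

# Repeat: copy the oldest BATCH rows, then delete exactly those rows,
# until the source table is empty.
while True:
    rows = conn.execute(
        "SELECT id, payload FROM source ORDER BY id LIMIT ?", (BATCH,)
    ).fetchall()
    if not rows:
        break
    conn.executemany("INSERT INTO destination VALUES (?, ?)", rows)
    conn.executemany("DELETE FROM source WHERE id = ?",
                     [(r[0],) for r in rows])
    conn.commit()  # each batch is its own transaction

print(conn.execute("SELECT COUNT(*) FROM destination").fetchone()[0])  # 500
```

Committing per batch keeps transactions small, which is the main reason to batch at all.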
    Visakh

• SSIS 2012 intermittently fails with an "Invalid date format" error while importing data from a source table into a destination table with the exact same schema

We migrated packages from SSIS 2008 to 2012. The package works fine in all environments except one.
SSIS 2012 intermittently fails with the error below while importing data from a source table into a destination table with the exact same schema.
    Error: 2014-01-28 15:52:05.19
       Code: 0x80004005
       Source: xxxxxxxx SSIS.Pipeline
       Description: Unspecified error
    End Error
    Error: 2014-01-28 15:52:05.19
       Code: 0xC0202009
       Source: Process xxxxxx Load TableName [48]
       Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description: "Invalid date format".
    End Error
    Error: 2014-01-28 15:52:05.19
       Code: 0xC020901C
       Source: Process xxxxxxxx Load TableName [48]
       Description: There was an error with Load TableName.Inputs[OLE DB Destination Input].Columns[Updated] on Load TableName.Inputs[OLE DB Destination Input]. The column status returned was: "Conversion failed because the data value overflowed
    the specified type.".
    End Error
But when we reorder the "Updated" column in the destination table, the package imports the data successfully.
This looks like a bug to me. Any suggestions?

    Hi Mohideen,
Based on my research, the issue might be related to one of the following factors:
Memory pressure. Check whether there is memory pressure when the issue occurs. In addition, if the package runs in the 32-bit runtime on that specific server, use the 64-bit runtime instead.
A known issue with SQL Server Native Client. As a workaround, use the .NET data provider instead of SNAC.
    Hope this helps.
    Regards,
    Mike Yin
TechNet Community Support

  • Delta enabled master data - Deletion of data sets

    Hello,
Is it possible to automatically delete a certain data set of a delta-enabled, master data bearing InfoObject? Does a delta load for master data InfoObjects allow a certain data set to be fully deleted?
    Example:
    Delta Master InfoObject: Z_Test
    Attributes: Z_Name, Z_Region, ....
    Data sets:
    (1) 12345, Name1, Region1, ....
    (2) 23456, Name2, Region2, ....   (this data set should be completely deleted)
    (3) 34567, Name3, Region3, ...
    Expected result:
    (1) 12345
    (2) 34567
    I just want to know whether the deletion can be performed through a delta data load or not?
    Thanks in Advance.
    Marco

    Hi Marco,
    Welcome to SDN!!
You can go to RSA1 -> InfoObject -> Maintain master data, delete the unwanted records, and load the ones you need. If the number of records is large, you may need to write an ABAP program. But before any deletion, make sure the master data is not used in any other data target.
    Bye
    Dinesh

• Insert data from a source table into a destination table depending on a criterion; once inserted, delete from the source table

    HI,
I have a source table with millions of records. I need to insert some of the data (depending on a condition) into a repository table.
Once the rows are inserted, they can be deleted from the source table.
The deletion is taking a lot of time.
I need to reduce the time taken to delete the records.
For example: 1 million records in 8 seconds.
I have already tried bulk collect and cursors but could not succeed.
Please suggest how to improve the performance.
    Thanks & Regards

APPROACH 1:-
CREATE OR REPLACE PROCEDURE SP_BC
AS
    DETAILS_REC SOURCETBL%ROWTYPE;
    COUNTER NUMBER := 1;
    START_TIME PLS_INTEGER;
    CURSOR C1 IS
        SELECT * FROM SOURCETBL WHERE DOJ < SYSDATE;  -- was SYDATE, a typo
BEGIN
    START_TIME := DBMS_UTILITY.GET_TIME;
    DBMS_OUTPUT.PUT_LINE(START_TIME / 100);
    OPEN C1;
    LOOP
        FETCH C1 INTO DETAILS_REC;  -- was DETAILS_ROW, an undeclared name
        EXIT WHEN C1%NOTFOUND;
        EXIT WHEN COUNTER > 10000;  -- the COUNTER reset inside the loop defeated this limit
        INSERT INTO DESTINATIONTBL VALUES DETAILS_REC;
        IF SQL%FOUND THEN
            DELETE FROM SOURCETBL WHERE ID = DETAILS_REC.ID;  -- was SOURCETABLE, a typo
            COUNTER := COUNTER + 1;
        END IF;
        COMMIT;
    END LOOP;
    CLOSE C1;
    COMMIT;
END;
APPROACH 2:-
CREATE OR REPLACE PROCEDURE SP_BC1 (P_NAME IN SOURCETBL.NAME%TYPE)  -- @NAME was SQL Server syntax
IS
    TYPE T_DET IS TABLE OF SOURCETBL%ROWTYPE;
    T_REC T_DET;
BEGIN
    SELECT * BULK COLLECT INTO T_REC
      FROM SOURCETBL
     WHERE NAME = P_NAME;
    FOR I IN 1 .. T_REC.COUNT  -- safe for an empty collection, unlike FIRST..LAST
    LOOP
        INSERT INTO DESTINATIONTBL VALUES T_REC(I);
        IF SQL%FOUND THEN
            DELETE FROM SOURCETBL WHERE ID = T_REC(I).ID;  -- duplicated WHERE clause removed
        END IF;
    END LOOP;
    COMMIT;
END;
APPROACH 3:-
CREATE OR REPLACE PROCEDURE SP_BC2
AS
    TYPE REC_TYPE IS TABLE OF SOURCETBL%ROWTYPE;
    TYPE ID_TYPE IS TABLE OF SOURCETBL.ID%TYPE INDEX BY PLS_INTEGER;
    DETAILS_ROW REC_TYPE;
    V_IDS ID_TYPE;
    CURSOR C1 IS
        SELECT * FROM SOURCETBL WHERE END_DATE < SYSDATE;  -- END is a reserved word; real column name assumed
BEGIN
    OPEN C1;
    LOOP
        FETCH C1 BULK COLLECT INTO DETAILS_ROW LIMIT 999;
        EXIT WHEN DETAILS_ROW.COUNT = 0;  -- with LIMIT, test the collection, not C1%NOTFOUND
        -- Copy the keys into a scalar collection: FORALL cannot reference
        -- fields of a record collection (PLS-00436), and SQL%FOUND cannot
        -- be tested per row inside FORALL.
        FOR I IN 1 .. DETAILS_ROW.COUNT LOOP
            V_IDS(I) := DETAILS_ROW(I).ID;
        END LOOP;
        /* A batch of 999 records is moved per iteration */
        FORALL I IN 1 .. DETAILS_ROW.COUNT
            INSERT INTO DESTINATIONTBL VALUES DETAILS_ROW(I);
        FORALL I IN 1 .. DETAILS_ROW.COUNT
            DELETE FROM SOURCETBL WHERE ID = V_IDS(I);
        COMMIT;
    END LOOP;
    CLOSE C1;
END;
The 3rd approach seems better, but I had an issue with referring to the fields of a record type inside FORALL.
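If the per-row loops above remain slow, a set-based alternative (one INSERT ... SELECT plus one DELETE with the same predicate, inside a single transaction) usually beats any cursor loop. A rough illustration using Python and SQLite with invented table names; against Oracle, the two statements would simply be plain SQL in one transaction:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sourcetbl (id INTEGER PRIMARY KEY, doj TEXT);
    CREATE TABLE destinationtbl (id INTEGER PRIMARY KEY, doj TEXT);
    INSERT INTO sourcetbl VALUES (1, '2010-01-01'), (2, '2030-01-01');
""")

# Move all qualifying rows with two set-based statements instead of a
# row-by-row loop: one INSERT ... SELECT, then one DELETE with the
# same predicate, committed together.
with conn:
    conn.execute("INSERT INTO destinationtbl "
                 "SELECT id, doj FROM sourcetbl WHERE doj < date('now')")
    conn.execute("DELETE FROM sourcetbl WHERE doj < date('now')")

print(conn.execute("SELECT COUNT(*) FROM destinationtbl").fetchone()[0])  # 1
```

Note the predicate must be stable between the two statements (here it is, since `date('now')` does not move within the transaction for this data); in Oracle, capturing the cutoff in a variable first makes that explicit.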

  • All data deleted after upgrade to iOS8

I am deeply sad. All my data was deleted after upgrading to iOS 8.
What I did:
1. Connected the iPad to my PC to transfer recordings from the voice recorder.
2. iTunes offered to upgrade the system.
3. I confirmed.
4. Some unknown error occurred at the end of the upgrade process.
5. When the iPad started, iOS 8 was there. All data had disappeared.
6. I am sad. All the valued data I need is lost (no backup had been made before the upgrade).
Please advise whether it is possible to get the data back.

Yes, I did a backup.
But I don't know how to get to it.
Actually, all I need is 2 recordings from the Recorder program. Before the upgrade I downloaded the recordings from iTunes; screen → http://joxi.ru/caQdVIwyTJDKAjTtsuk
Now it is empty, no recordings.
I see that a backup of this program's data is available in storage, but I don't know how to get it from there; screen → http://joxi.ru/taUdVP3JTJAaYtIkBaI
By the way, iTunes tells me that backup copies are available for today only → http://joxi.ru/7KUdVP3JTJBAXfQZjgI

• How to restore all data deleted from an iPhone?

I did a backup on iTunes, but it didn't back anything up; all my data has been deleted and I had to set up iCloud again. Does anyone know how I could recover my contacts and my photos? Anyone in Brisbane? Thanks

Have you done a backup on your computer? If so, your contacts and photos will be in the last saved backup. Have you checked there? If you haven't backed it up on your computer, then you are out of luck. Good luck.

• iCal - all data deleted

When I started the computer up from being asleep, the screen was a jumble of colors with no discernible data. I rebooted and logged in, and the Dock had reset itself to the default Dock.
I opened a number of programs and discovered that iCal was missing all of my data. I researched and found that I could restore my calendar, but when I tried importing one from earlier today I got a message that the data was unreadable. I also tried dragging it into iCal and got the same error message.
I have been having small problems with iCal for a number of months: events were getting duplicated - when I would move an event to another date there seemed to be a record at the old date as well, which was creating problems when I synced to my iPhone.
Questions:
1) Why did the Dock reset itself to the default?
2) Why was all the iCal data deleted?
3) Is there a way to copy my iCal data from the iPhone to the calendar, or use the iPhone backup to restore the calendar? I tried syncing iCal from the phone to the computer, but no events were copied.
    Thanks!

    Problem solved.
It seems that the initial syncing DID save a backup file and stored it in the ~/Library/Calendars/.caldav/calendar folder. It ALSO took all the "deleted" files and put them in a folder in the trash. (Similar to when you have a system crash and upon reboot discover "saved files" in the trash.)
I was able to restore the old files successfully. Now I'm still trying to restore them in the web-accessible section of iCloud, but that's another episode.

  • Data deletion in PSA tables

    Hello Folks,
I am trying to delete data older than 7 days in my PSA tables. With process chains I am able to delete the request IDs, but the actual data is not getting dropped. I could write an ABAP program, but that would delete the entire content of the PSA tables. I went through a lot of SDN messages on this but did not find a solution. Could you please offer some suggestions?
    Kris

    Hi,
There is a process type called "Deleting requests from PSA", where you can set the frequency, i.e. how much old data you want to retain.
As far as I have seen, once the request is deleted, all the data belonging to that request is also deleted; otherwise there would be no point in just deleting request IDs.
I guess there must be some issue with your process chain. Try manually deleting the PSA requests if they are still available in the PSA table.
    Regards,
    Durgesh.

• How to copy all the data to another new table

    Hi,
1) How can I copy the data from one Data Dictionary table to another new table in the Dictionary?
For example: in the Dictionary I have one table named 'sflight', and now I want to copy all the data from 'sflight' to another new table named 'zabc'.
How can I do this?
2) What is a Logical Unit of Work (LUW)?
    Thanks

Hi,
Go to SE11 and enter sflight (your standard table name) in the table name field.
Then go to Dictionary Objects --> Copy --> enter the new Z-table name in the "To table" field.
This way you can copy the standard SAP table to a new Z-table.
    A Logical Unit of Work (LUW or database transaction) is an inseparable sequence of database operations which must be executed either in its entirety or not at all. For the database system, it thus constitutes a unit.
    LUWs help to guarantee database integrity. When an LUW has been successfully concluded, the database is once again in a correct state. If, however, an error occurs within an LUW, all database changes made since the beginning of the LUW are canceled and the database is then in the same state as before the LUW started.
    An LUW begins
    o each time you start a transaction
    o when the database changes of the previous LUW have been confirmed (database commit) or
    o when the database changes of the previous LUW have been cancelled (database rollback)
    An LUW ends
    o when the database changes have been confirmed (database commit) or
    o when the database changes have been canceled (database rollback)
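The all-or-nothing behaviour described above can be demonstrated with any transactional database. A small sketch using Python and SQLite, with invented table names, where a failure in the middle of the unit of work rolls back every change made since it began:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('A', 100), ('B', 0)")
conn.commit()

# One logical unit of work: both steps succeed together or neither does.
try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 50 WHERE name = 'A'")
        raise RuntimeError("simulated failure mid-LUW")
except RuntimeError:
    pass

# The rollback restored the state from before the LUW started.
print(conn.execute("SELECT balance FROM accounts WHERE name = 'A'").fetchone()[0])  # 100
```

The same principle underlies the SAP LUW: until the database commit, all changes remain provisional and a rollback returns the database to its last consistent state.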
SAP memory and ABAP memory
o SAP memory: user-specific memory retained for the duration of the session. Can only be used for simple field values.
o ABAP memory: can be used to transfer data between internal sessions; can transfer any ABAP/4 variables (fields, strings, internal tables, complex objects).
With SAP memory you use the SPA/GPA parameters to pass data. These parameters are saved globally in memory and are identified by a three-character code.
In dialog programs you can SET and GET these parameters in the attribute window of the screen field, by marking the SET and GET fields and putting the name of the parameter in the ParameterId field.
In a program (dialog or report) you can use the GET PARAMETER and SET PARAMETER statements:
    set parameter id 'RV1' field <fieldname>
    get parameter id 'RV1' field <fieldname>
    rewards if useful,
    regards,
    nazeer

• Row should get added in the target as soon as the data in the source table changes

    I have done the following:
    * The source table is part of the CDC process.
    * I have started the journal on the source table.
Whenever I change the data in the source, I expect a new row to be added to the target with a new sequence number as the surrogate key. I find that even though the source data changes, the new row does not get added.
Could someone point out why the new row is not being added?

    Step 1 - Sequence Number
Create a sequence in your RDBMS, for example:
    CREATE SEQUENCE SEQUENCE_NAME
    MINVALUE 1
    MAXVALUE 99999
    START WITH 1
    INCREMENT BY 1
You can use the above sequence in your mapping as
schema_name.sequence_name.nextval, executed on the target option.
Next, select only the Insert option for the sequence column.
Click on the source datastore, and in the Properties panel you will find an option called "Journalized Data Only". Now whenever this interface runs, only the journalized data gets transferred.
The other way to see the journalized data on the source side is to right-click the source datastore under the journalized model, then go to "Changed Data Capture" and then "Journal Data...".
Now you can see only the journalized data.
As CDC creates a trigger at the source, whenever there is a change in the source it gets captured at the target when you run the above interface with the "Journalized Data Only" option.
I hope I am clear and elaborate now.
    Thanks
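The trigger-plus-journal mechanism that CDC relies on can be mimicked in a few lines to see why only journalized changes reach the target. The sketch below uses Python with SQLite triggers and invented table names purely for illustration; ODI generates its own journal tables and triggers:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER PRIMARY KEY, val TEXT);
    -- journal table records every change to src, with an auto-increment
    -- key that grows like the sequence described above
    CREATE TABLE journal (jrn_id INTEGER PRIMARY KEY AUTOINCREMENT,
                          src_id INTEGER, op TEXT);
    CREATE TRIGGER trg_src_ins AFTER INSERT ON src
    BEGIN
        INSERT INTO journal (src_id, op) VALUES (NEW.id, 'I');
    END;
    CREATE TRIGGER trg_src_upd AFTER UPDATE ON src
    BEGIN
        INSERT INTO journal (src_id, op) VALUES (NEW.id, 'U');
    END;
""")

conn.execute("INSERT INTO src VALUES (1, 'a')")
conn.execute("UPDATE src SET val = 'b' WHERE id = 1")

# A "Journalized Data Only" load would read just these captured changes.
print(conn.execute("SELECT src_id, op FROM journal ORDER BY jrn_id").fetchall())
# [(1, 'I'), (1, 'U')]
```

If the journal stays empty when the source changes, the triggers (or the journal start) are the first thing to check, which matches the advice above.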

  • CO delta extractors: How to delete single init uploads?

    Hi Gurus,
I'm uploading COPA data with parallel initializations, using different selection criteria, for example:
init for COMPANY X - FISCAL YEAR 2008
init for COMPANY Y - FISCAL YEAR 2008
init for COMPANY Z - FISCAL YEAR 2008
init for COMPANY X - FISCAL YEAR 2007
This works, but I'm not able to delete a single initialization: when I try to delete one in the InfoPackage ("Scheduler -> Initialization for this source"), the system deletes all initializations.
Is someone able to help me?
    Thanks
    Piero

Thanks for the answer,
but then I don't understand the utility of parallel initializations.
How can I reinitialize only a selected period, instead of all COPA data (millions of records to initialize), when transaction KEND runs in COPA?
R/3 is currently version 4.6C, but we are migrating to ECC 6.0.
    Thanks
    Piero

• Delta loading master data: errors in source system, message RSM340, in quality system

    Hi,
I'm trying to load master data with delta for standard DataSources.
In dev, everything seems fine.
I've transported the same elements to QA. I've done the init, and it is OK. When I run the delta, I get a red status: "Errors in source system", message no. RSM340. I've tried setting my delta status to green, redone my init, and reloaded with delta, and I get the same problem.
I have this problem for all master data loads.
Should I install some InfoObjects from BI Content?
    Thanks a lot !

    Hi,
Check the following threads:
    [3003 + R3299 + RSM340] error messages loading an ODS
    Extraction failed with error Message no. RSM340
    BW Data Loading - Delta Processes
    Thanks and regards
    Kiran

• All data deleted off phone after incorrect password attempts

Someone please help. I have managed to delete all the data off my phone after trying to connect the phone to my PC to retrieve a photo of my daughter. Can anyone please help me restore this data, as I have loads of phone numbers I need to get back? Please, please help.

    Hi susansmith
    Welcome to BlackBerry Support Forums
If you have a previous backup of your device, then please check this KB article and restore your contacts.
    KB10339 : How to use BlackBerry Desktop Software to restore data to a BlackBerry smartphone from a backup file
    Please try it and let us know.
