Selective Deletion and Data Load from Setup Tables for LIS Extractor

Hi All,
We came across a situation where one delta request in the PSA was missed and never loaded into the DSO. This DSO updates another cube in the flow. Since the miss came to our attention only after many days, we need to selectively delete the data for those documents from the DSO and the cube, fill the setup table for those documents, and then take a full load.
Now, what is the right approach to load this data from setup table > DSO > cube? Change log entries still exist for those documents, and a few KPIs in the DSO are updated in summation mode.
Regards
Jitendra

Thanks Ajeet!
This is the sales order extractor. The data got loaded into the ODS just fine, but the ODS is fed by a different extractor, so everything is fine in the ODS but not in the cube. Would a full repair request versus a plain full load make any difference when the data is going to the cube? I thought it mattered only when loading into an ODS.
Also, what do you mean by "Even if you do a full load without any selections you should do a full repair"?
Thanks.
W

Similar Messages

  • Is it possible to automate the load of setup tables for LIS datasources?

    Hello,
    For LIS datasources, e.g. application 04 (shop floor control), once the production confirmations are done we have to manually run transaction OLI4BW to fill the setup tables before extracting the data to BW. Is there a way to automate the filling of the setup tables?
    Regards,
    Suraj

    Hi,
    Yes, we can.
    Create variants of the setup run with small document ranges and schedule them all at the same time as background jobs, so the filling completes sooner.
    Thanks,
    Saru

  • Load from Setup tables breaking for high volume

    Hello Friends,
    There are around 50,000 records in the setup table for the application 03 (inventory) area. The load into BW keeps breaking at those 50,000 records, but it works if I restrict the selection in the BW scheduler to a single day's data.
    Why is this?
    Immediate help is appreciated.
    Regards,
    Simmi

    Hello gurus,
    So I tried reducing the packet size to 20,000 and the number of IDocs to 10 per InfoPackage. But this works only for about 40K records, which is just one week's data for the 2LIS_03_BF extractor.
    If I run 30 days' worth of load from the setup tables for the same extractor, the load breaks.
    The same happens for a custom-built Z extractor; for that one I even tried 10,000 records and 5 IDocs per package.
    Does this mean for sure that there is a memory issue?
    I also see that all the background work processes are occupied by other jobs. Is that an issue?
    We do have deltas running every night from R/3, but those are smaller volumes (<10,000 records) and we have not seen this issue with them.
    Within BW, however, there are heavy loads between data targets and they don't break.
    Is this an issue only for full loads from R/3?
    Appreciate your immediate response.
    Thanks
    Simmi
    Edited by: simmi on Jun 20, 2008 1:39 AM
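    For reference, the packet size and IDoc defaults being tuned here come from the data-transfer control parameters of the source system (SBIW -> General Settings -> Maintain Control Parameters for Data Transfer), which are stored in table ROIDOCPRMS. A minimal sketch for inspecting the current values, assuming direct SQL access to that table (normally you would simply browse it with SE16):

        -- Sketch only: MAXSIZE is the target packet size in KB, MAXLINES the
        -- maximum number of rows per packet, MAXPROCS the number of parallel
        -- processes, STATFRQU the statistics frequency.
        SELECT MAXSIZE, MAXLINES, MAXPROCS, STATFRQU
        FROM   ROIDOCPRMS;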

  • Selective Deletion Before Data Load

    Hi Experts - I need to load data into an Oracle data warehouse. Before loading the data I need to do a selective deletion from the target table.
    The source dataset has a date column with a minimum and a maximum date, and I need to delete the target rows lying between this min and max date.
    Any idea how to do this selective deletion?
    Thanks
    R

    Create a workflow and declare two local variables, $DateMin and $DateMax, of either date or datetime datatype, as appropriate. Create a script:
    $DateMin = sql('DS','select min([datetime field]) from [incoming table]');
    $DateMax = sql('DS','select max([datetime field]) from [incoming table]');
    Add a dataflow to your workflow and connect it downstream of the script. Add two parameters to the dataflow -- let's say you call them $P_DateMin and $P_DateMax. Back in your workflow, in the "Calls" tab of the Variables & Parameters window, map the two dataflow input parameters to your two local workflow variables.
    In your dataflow: perform a selection of the primary key (the column or columns which constitute the PK) of your target table, filtering on your two input parameter values ($P_DateMin and $P_DateMax). If you want to be on the safe side in terms of preventing blocking issues, send these records into a Data Transfer transform (file preferred, but up to you). Then, downstream of the Data Transfer transform, send the records into a Map Operation transform, mapping 'Normal' to 'Delete'. Finally, send them into your target table.
    You could, of course, just write a SQL script to delete the records, but SQL scripts are to be avoided because they break lineage & impact analysis.
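    For completeness, that discouraged direct-SQL alternative would look something like this sketch (table and column names are placeholders, not from the post):

        -- Deletes the target rows between the incoming min and max dates;
        -- bypasses Data Services lineage, hence the transform-based approach above.
        DELETE FROM target_table
        WHERE  date_col BETWEEN (SELECT MIN(datetime_field) FROM incoming_table)
                            AND (SELECT MAX(datetime_field) FROM incoming_table);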
    If all the date or datetime stamp fields on your target table are "whole" dates with no time portion, you have a smallish number of distinct dates between your min and max dates, a large number of records to delete between those dates, and an index on the date stamp column of your target table, then another approach is to generate one record per day with a Date Generation transform, still making use of your two dataflow parameters. Declare the generated date field to be the (false) primary key, map the records to deletes with the Map Operation transform, and then send them into your target with the "Use input keys" option selected.

  • Data loading from one table to another

    Hi,
    I want to load some data from a temp table into a master table. The master table has 40 million records and the temp table 23 million. The master table has around 50 columns, to which we are adding 4 new columns; the temp table has 5 columns. The data for the 4 new columns is available in the temp table, and the employee column is common to the two tables.
    I used a stored procedure with a cursor to load the data, but it is taking more than 6 hours.
    Can any one suggest me a good technique to load data faster?
    Thanks,
    Santhosh.

    Hi, consider this scenario, which matches yours.
    First of all, you have to update, not insert into, the master table.
    Master table = emp with columns (emp_id, emp_name, emp_designation).
    To this original master table you added two more columns, emp_salary and emp_department,
    so now your master table looks like (emp_id, emp_name, emp_designation, emp_salary, emp_department),
    but when you do select * from the master table, the last two columns, salary & department, are blank.
    Now you have another temp table with the following columns (emp_id, emp_salary, emp_department).
    emp_id is common to the master & temp tables, and you want to put the values from the temp table into the master table. I think this is what you are trying to do.
    So for the above case, the query I would write is
    update master_table m
       set (emp_salary, emp_department) =
           (select t.emp_salary, t.emp_department from temp_table t where t.emp_id = m.emp_id)
     where exists (select 1 from temp_table t where t.emp_id = m.emp_id); -- keeps unmatched rows from being set to NULL
    commit;
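    If a single correlated UPDATE is still too slow at these volumes (40 million master rows), a set-based MERGE is usually faster, because Oracle can hash-join the two tables instead of probing the temp table once per master row. A sketch, using the same assumed table and column names as above:

        -- Update-only MERGE: touches only master rows that have a match in temp_table.
        MERGE INTO master_table m
        USING temp_table t
        ON (m.emp_id = t.emp_id)
        WHEN MATCHED THEN UPDATE
             SET m.emp_salary     = t.emp_salary,
                 m.emp_department = t.emp_department;
        commit;

    If your edition and downtime window allow it, enabling parallel DML (ALTER SESSION ENABLE PARALLEL DML) before the MERGE can cut the runtime further.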
    Regds.

  • Data loading from one table to another in the same database based on conditions

    Hi All,
    I have 2 tables, Products and Product_info.
    In the Product_info table, Product_ID is the primary key but not an identity column, so the auto-increment of the key has to be done from the package.
    The requirement is:
    If Product_ID = 20 and the Date lies in the previous month (not the current month) in the Products table, then insert into the Product_info table according to the rules below.
    1. If the Name contains "tap", ignore the row completely; don't perform any insert for it.
    2. If the Name contains "Zork", perform 2 inserts into the Product_info table, with Product_info_IDs 1 and 2.
    3. If the Name contains neither "Zork" nor "tap", insert it into the Product_info table with Product_info_ID 4.
    I am very new to SSIS package development, so detailed information would be very helpful.
    Source table (Products):
    ID  NAME               Product_ID  Date        Area_ID
    1   P_tap_rus          20          13-01-2014  3
    2   Enc_sap_top        10          15-01-2014  4
    3   Yorl               20          05-02-2014  5
    4   zork               20          20-01-2014  6
    5   fadbt              10          22-01-2014  6
    6   xyzzz_oprt         20          28-01-2014  5
    7   def_type_ru        20          06-02-2014  2
    8   acd_inc_tup        10          07-02-2014  3
    9   bnf_dlk_fbg        20          03-02-2014  4
    10  rtyui_vnmghj_sfdl  10          12-01-2014  5
    11  wlwf_10103_123     10          04-02-2014  9
    Destination table (Product_info):
    Product_ID  ID  Area_ID  Product_info_ID  Column1
    1           3   5        4                (Name doesn't contain Zork or tap)
    2           4   3        1                (Name contains Zork, so two inserts: Product_info_ID 1 and 2)
    3           4   3        2
    4           6   5        4
    5           10  5        4
    6           11  9        4
    Please let me know if any other information is required .
    Thanks
    Priya

    Hi Priya,
    You mentioned this was coming from two tables, right? I would try to perform the transformations with T-SQL in the source, if that is a possibility for you. Below is an example of something you could do.
    WITH CTE AS
    (
        SELECT ID, Product_ID, [Date], Area_ID,
               CASE WHEN Name LIKE '%Zork%' THEN 1 ELSE 4 END AS Product_Info_ID
        FROM [YourTable]
        WHERE Product_ID = 20
          AND Name NOT LIKE '%tap%'
          -- compare year and month so January 2013 doesn't match January 2014
          AND YEAR([Date])  = YEAR(DATEADD(MM, -1, GETDATE()))
          AND MONTH([Date]) = MONTH(DATEADD(MM, -1, GETDATE()))
    )
    SELECT * FROM CTE
    UNION
    SELECT ID, Product_ID, [Date], Area_ID, 2 AS Product_Info_ID
    FROM CTE
    WHERE Product_Info_ID = 1
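    Since Product_ID in Product_info is the primary key but not an identity column, one way to generate it inside the same source query is a ROW_NUMBER() offset from the current maximum key. A hedged sketch building on the CTE above (table and column names are from the post, but the key-generation approach is an assumption and is only safe if nothing else inserts into Product_info concurrently):

        WITH CTE AS
        (
            SELECT ID, Area_ID,
                   CASE WHEN Name LIKE '%Zork%' THEN 1 ELSE 4 END AS Product_Info_ID
            FROM [YourTable]
            WHERE Product_ID = 20
              AND Name NOT LIKE '%tap%'
              AND YEAR([Date])  = YEAR(DATEADD(MM, -1, GETDATE()))
              AND MONTH([Date]) = MONTH(DATEADD(MM, -1, GETDATE()))
        )
        INSERT INTO Product_info (Product_ID, ID, Area_ID, Product_info_ID)
        SELECT (SELECT COALESCE(MAX(Product_ID), 0) FROM Product_info)
               + ROW_NUMBER() OVER (ORDER BY s.ID, s.Product_Info_ID),
               s.ID, s.Area_ID, s.Product_Info_ID
        FROM (
            SELECT ID, Area_ID, Product_Info_ID FROM CTE
            UNION ALL
            SELECT ID, Area_ID, 2 FROM CTE WHERE Product_Info_ID = 1
        ) s;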
    I hope this helps, Regards.

  • SAP GRC 10.1 AMF No data selected when adding duplicate fields from separate tables for configurable data sources

    Hi There,
    In SAP GRC 10.0, our team had an issue where we could not add duplicate fields from separate tables (see ERROR: Select Currency/UoM field for the selected analyzed fields). This was resolved by SAP Note 1904313/1904314 (http://service.sap.com/sap/support/notes/1904313).
    We upgraded our system to SAP GRC 10.1 SP05 and can now add duplicate fields from separate tables. SAP Note 1904313/1904314 was part of SAP GRC 10.1 SP03, so it makes sense that we, on a higher version (SP05), are able to do this.
    The issue now is that when we add the duplicate fields from different tables and run the Ad-hoc Query to test whether the data source works correctly, a No Data Selected warning persists. This means the data source provides no data for analysis, which is required to write our business rules.
    Below is an example:
    Basic data source with just one currency reference field EBAN-WAERS.
    When you run the Ad-Hoc Query you receive data.
    Basic data source with second currency reference field EKKO-WAERS.
    When you run the Ad-Hoc Query no data is found.
    Please also make reference to the following thread logged by my colleague (ERROR: Select Currency/UoM field for the selected analyzed fields)
    Any assistance to receive data with duplicate fields from separate tables will be highly appreciated.
    Thanking you in advance.
    Regards
    Gary Khan

    Hi,
    following are the error messages from the dump:
    ShrtText
       There is already a line with the same key.
    What happened?
       Error in ABAP application program.
       The current ABAP program "SAPLCKMS" had to be terminated because one of the
       statements could not be executed.
       This is probably due to an error in the ABAP program.
    Error analysis
       You wanted to add an entry to table "\FUNCTION-POOL=CKMS\DATA=T_DYN_CKMLCR",
       which you declared with a UNIQUE KEY. However, there was already an entry
       with the same key.
       This may have been in an INSERT or MOVE statement, or within a
       SELECT ... INTO statement.
       In particular, you cannot insert more than one initial line into a
       table with a unique key using the INSERT INITIAL LINE... statement.
    Trigger Location of Runtime Error
       Program                                 SAPLCKMS
       Include                                 LCKMSF01
       Row                                     226
       Module type                             (FORM)
       Module Name                             DYNAMIC_PERIOD_CLOSING
    Source code where the dump occurred:
    222
    223           APPEND ht_ckmlpp TO t_add_ckmlpp.
    224           APPEND LINES OF ht_ckmlcr TO t_add_ckmlcr.
    225           INSERT ht_ckmlpp INTO TABLE t_dyn_ckmlpp.
    >>>>           INSERT LINES OF ht_ckmlcr INTO TABLE t_dyn_ckmlcr.
    227         ENDWHILE.
    Also, I guess there is a problem with the Material Ledger on the R/3 side.
    I have never worked with the Material Ledger before, so I don't know the transaction codes and tables for it in SAP R/3.
    Thanks
    Navneet

  • Master data Load from Oracle table

    Hi,
    My master data comes from an Oracle table. The table has both attributes and texts combined. How do I load the master data into the text and attribute InfoObjects through a direct upload of master data?
    Or is it necessary to go for a flexible upload in this case?
    Regards

    Hi,
    you can create two views on the table: one for the texts and the other for the attributes. Create your DataSources on these views and assign them to your InfoObject.
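    A minimal sketch of the two views (the source table and column names here are hypothetical, just to illustrate the split):

        -- One view per DataSource: attributes vs. language-dependent texts.
        CREATE VIEW v_material_attr AS
            SELECT matnr, matl_type, matl_group
            FROM   md_material;

        CREATE VIEW v_material_text AS
            SELECT matnr, langu, txtmd
            FROM   md_material;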
    regards
    Siggi

  • Master Data deletion and data load

    Hi,
    In a process chain, the master data of my InfoObject is deleted and then reloaded.
    My InfoObject is used in cubes and ODS objects that are loaded later in full or delta mode.
    My question is: what happens if those cubes and ODS objects are loaded while the master data of my InfoObject is blank (i.e. the process chain fails after the master data of my InfoObject is deleted but before it is reloaded)?
    Thanks for your answers.

    Hi Eric,
    You can't delete master data as long as it exists in any of the data targets (cube or ODS).
    Now let's say a new master data record comes in, for example a new material being introduced.
    The cube or ODS will take that new material from the transaction data, and once the corresponding master data is loaded successfully it will be reflected in your reports, provided you have the attribute/hierarchy change run scheduled.
    Thanks & Regards,
    Monika

  • How to fill setup table for the Extractor: 2LIS_02_CGR

    Hi,
    There is no data for the 2LIS_02_CGR extractor in transaction RSA3. I would like to load the data, and as per my understanding I need to fill the setup table.
    Could anyone provide me the step-by-step procedure to fill the setup table?
    Regards,
    Prem
    Edited by: pannalde on Dec 2, 2010 6:03 AM

    Hi Prem:
    For more details please refer to the blog by Paulo Junior, "The SET UP!":
    /people/paulo.junior/blog/2010/11/25/the-set-up
    (In short: for application 02 - Purchasing, you delete the setup tables with transaction LBWG and refill them with OLI3BW.)
    Regards,
    Francisco Milán.

  • Problem selecting uniqueness and last ddl from two tables

    I need to select uniqueness from user_indexes and last_ddl_time from user_objects to determine the last DDL on all indexes within a schema.
    I have this code:
    select a.object_name, a.object_type,
           to_char(last_ddl_time, 'DD-MON-YYYY HH24:MI:SS') Last_Used,
           b.uniqueness
    from user_objects a, user_indexes b
    where object_type = 'INDEX'
    and uniqueness = 'NONUNIQUE';
    The schema name is not included because I have it set up in my Unix script.
    Is this correct?

    SELECT a.object_name,
           a.object_type,
           to_char(a.last_ddl_time, 'DD-MON-YYYY HH24:MI:SS') last_used,
           b.uniqueness
    FROM   user_objects a,
           user_indexes b
    WHERE  a.object_type = 'INDEX'
    AND    b.uniqueness  = 'NONUNIQUE'
    AND    a.object_name = b.index_name;   -- this join condition was missing

  • WEBI and LOV shown after selective deletion of data

    Hi, I have the following problem.
    I know that the values for an LOV are taken from the InfoProvider. After making a selective deletion by a characteristic in the InfoCube which holds the loading date (a custom characteristic like 0CALDAY), the LOV of the universe still says that I can select this date, when in fact all the entries for it have been deleted from the cube.
    Let's say at first we have 01.01.2012, 05.01.2012, 10.01.2012 and 14.01.2012. After selective deletion of the data loaded on 10.01.2012, I want the LOV in BO to offer 01.01.2012, 05.01.2012 and 14.01.2012, but what I found is that the entry 10.01.2012 still exists although there is no data in the InfoProvider for this date. Any idea how to "delete" this proposal from the LOV?
    Is it a default value coming from a BEx query variable (set in the BEx query filters)? NO
    Is it a predefined filter in the MDX universe, sending a default value? NO
    Is the default value coming from a WebI query filter prompt (set in the query panel)? YES
    There is a characteristic, ZDATEXT (Date of Extraction), with no master data, created as a reference characteristic of 0DATE.
    It is used in the universe as a dimension and then used in a WebI filter (the option "maintain last values selected" unchecked).
    Why does the prompt still propose this date if there is no data related to this date?
    Thanks in advance,
    Joaquin Casas Baca-Castex

    Hi,
    My understanding is that the BAPI call MDPROVIDER_GET_MEMBERS (as tested in SE37), which is responsible for retrieving the list of values in BW, goes against the master data tables rather than the posted transaction values rolled up into the cube.
    I believe the InfoObject property 'Query Execution Filter Value Selection' or the BEx characteristic setting "only return posted values" has some effect on this, although I can't remember whether these are consumed in XI 3.1.
    I know they work in BI 4.0, however, when using BICS.
    Regards,
    H

  • Automate EIS dim builds and data loads

    I want to automate the dimension builds and data loads from my ETL tool (DTS). I have not been able to find anything about scripting EIS automation in the documentation. Is there any?

    What you can do is go into the EIS metadata outline and create a member load script and a data load script. Do this by selecting the Outline menu item, then select Member Load. Click Next; on the next screen, select only "Save load script". Click the "Save scripts" button to give it a name, then click Finish. Repeat for the data load script. (If you are using ASO cubes, you must use separate scripts; you can't do both in one script.) Then create a batch file to run the member load and data load scripts. In DTS, use an Execute Process task to run the batch file.

  • Steps for setup tables for 2LIS_08TRFKZ? Clarification

    Hi guru's
    we want to enhance the extractor 2LIS_08TRFKZ.
    I just want to know which transaction we use to fill the setup table for this extractor.
    Do we use OLI8BW or something else?
    A quick response will be highly appreciated.
    Regards,
    Amandeep

    Hi Srikanth,
    Thank you for the quick response.
    Yes, I deleted the setup tables before filling them.
    The job status says it is finished.
    The only bit I am worried about is the log, which says the setup ran for application 12, although we are running it for 2LIS_08TRFKZ, which is under application 08 (shipment).
    Please clarify whether this is how it is supposed to work.
    Once again thank you very much for your help and time.
    Regards,
    Amandeep.
    Start date           14.10.2006    Name of run      TRFKZ
    Start time           19:49:49      User name        SINGAM
    Date of termination  00.00.0000    Application      12
    Time of termination  00:00:00      Source table     LIKP
    Finish date          15.10.2006    New run
    Finish time          03:53:37      Document from    0000000000
    Duration (min)       483.8         Document to      ZZZZZZZZZZ
    No. of docs          1,707,237     Version          &(
                                       Variant

  • How to delete the data loaded into MySQL target table using Scripts

    Hi Experts
    I created a job with a Validation transform. Rows that pass the validation are loaded into a Pass table and rows that fail are loaded into a Failed table.
    My requirement is: if any data was loaded into the Failed table, then I have to delete the data already loaded into the Pass table, using a script.
    In the script I have written the code as
    sql('database','delete from <tablename>');
    but the SQL query execution raises an exception.
    How can I delete the data loaded into the MySQL target table using scripts?
    Please guide me on this error.
    Thanks in Advance
    PrasannaKumar

    Hi Dirk Venken
    I got the solution. The mistake I made was that my query was not valid SQL for MySQL. The failing query was
    sql('MySQL', 'delete table world.customer_salesfact_details');
    ("delete table" is not a valid statement; it would have to be "delete from"). The working query is
    sql('MySQL', 'truncate world.customer_salesfact_details');
    Thanks for your concern
    PrasannaKumar
