Skip only Archiving data from ECC to HANA through SLT

Hello,
SLT is configured between ECC6 as source and HANA as target, and real-time replication is enabled. I have a situation where I have to skip replication only for data that is deleted from the database after being archived; normal deletions should still replicate to HANA.
First we archive some data in ECC6, and after archiving we delete that data or move it into the content server. During this process the data is also deleted from HANA. How can we restore the deleted archived data in HANA?
We cannot stop the server during archiving.
Any help on this would be greatly appreciated.
Thanks,
Shubhrajit C.

Hi Vikram,
are you using a "Side-Car" approach in your scenario?
If yes, I think the best solution is to use SLT for a 1:1 trigger-based replication.
BODS is used more for scenarios where you have to prepare your data (e.g. data cleansing).
Regards,
Florian

Similar Messages

  • How to retrieve archived data from an ECC table

    Hi,
    I need to extract data from archived ECC tables to BW.
    How can I find out in which tables the archived data is stored in ECC?
    Thanks for your help in advance.
    Raj.

    Hi Sreenivas,
    I know we can pull the data using t-code SARA. Please let me know the steps to pull the archived data into the data source.
    Do I need to create a generic data source, or can we pull the data into an existing data source?
    Please let me know the steps, or share a document link if you have one.
    Thanks,
    Raj.

  • How Transformations or Routines will work for the NLS Archived data in BW on HANA?

    Hi,
    I have archived data from a BW on HANA system into NLS IQ.
    Now I'm creating a transformation or routines; in this case, how will the archived data be read without writing any ABAP code?
    Thanks & Regards,
    Ramana SBLSV.

    Hi Ramana,
    Maybe I can explain with two cases; hopefully one of them is yours:
    DSO1 -> the DSO you have archived.
    Case 1:
    You want to load from DSO1 -> DSO2 (direct transformation).
    The question here is: while loading from DSO1 to DSO2, will the archived data also be loaded into DSO2?
    If so, there is an InfoProvider property you need to change for this:
      In Extras -> InfoProvider Properties -> Change -> switch on Near-Line Usage (it is off by default).
    The archived data will then be used in this case as well.
    Case 2:
    You are loading from DSO3 to DSO2 with a lookup on DSO1, and the lookup on DSO1 needs to read archived data as well?
    In this case, you have to use the Read from DSO rule type; it reads from both the active table and the NLS data.
    Let me know if neither case applies to you.
    Regards,
    Sakthi

  • BPC - Consolidation - Data Loading - Will BS/PL accounts data ONLY be loaded from ECC?

    Dear All,
    In BPC, when we load data from ECC for consolidation, my understanding is that we load only the BS and PL accounts' data for/by the entity.
    Apart from BS and PL data, is there any other data that has to be loaded into BPC for consolidation?
    The following three financial statements -
    -Statement of Cash Flow
    -Statement of Changes in Equity
    -Statement of Comprehensive Income
    are actually derived/calculated from the loaded BS and PL data. This is my understanding; please correct me if I am wrong.
    Thank you!
    Regards,
    Peri

    Hi Peri,
    Balance sheet, PL and those three financial statements are derived from BS/PL accounts; however, there should also be "flow" information, otherwise you won't end up with a correct consolidated cash flow or equity movement (or you can choose to enter the flow detail manually).
    Second, while getting the BS & PL accounts you will also need trading partner detail, otherwise you won't be able to do the eliminations (or you can choose to enter the trading partner detail manually for intercompany accounts).
    Third, you should also consider other disclosures (depending on which standard you are implementing: IFRS, US GAAP, local GAAP, whatever...).
    Hope this gives an idea.
    Mehmet.

  • How to archive data from COPA tables in ECC 6.0

    HI Experts,
    We are using ECC 6.0 and our client wants to archive data from the COPA tables CE1XXXX, CE3XXXX and CE4XXXX. We are not using COPA planning.
    We need your suggestions and a step-by-step process for archiving the data from the above-mentioned COPA tables. Please also advise whether any SAP Note needs to be applied for this activity.
    Regards,
    Chandra

    Hi Vijay Chandra,
    please check the links below; they give a detailed explanation.
    http://help.sap.com/saphelp_afs64/helpdata/en/8d/3e58e2462a11d189000000e8323d3a/content.htm
    How to Use SARA Tcode
    I would also suggest archiving through a batch job if the archiving is to be done on a regular basis (say monthly or quarterly).
    Regards
    Minesh
    Edited by: mineshshah on Nov 4, 2011 9:02 AM

  • Issue in transfer of data from ECC to APO

    Hi All,
    I have a requirement to transfer data from ECC to APO. I am using EXIT_SAPLCMAT_001 for this purpose. The problem is that I need to transfer the data of a field that is not present in CIF_MATLOC but is present in /SAPAPO/MATLOC.
    How should I proceed? Please help; this is an urgent requirement.
    Thanks & Regards,
    SriLalitha

    Hi,
    you may want to go to transaction /SAPAPO/SNP_SFT_PROF.
    Determine Forecast of Replenishment Lead Time
    Use: In this field, you specify how extended safety stock planning determines the forecast of the replenishment lead time (RLT). The following values are available:
    Supply Chain: The system determines the RLT forecast using the supply chain structure by adding the corresponding production, transportation, goods receipt, and goods issue times. If there are alternative procurement options, the system always takes the longest option into account.
    Master Data: The system determines the RLT forecast from the location product master data.
    Master Data/Supply Chain: First, the system determines the RLT forecast from the location product master data. If no RLT forecast can be determined, the system determines the forecast using the supply chain structure (as described under Supply Chain).
    Dependencies: You can retrieve the replenishment lead time forecast yourself by using the GET_LEADTIME method of the Business Add-In (BAdI) /SAPAPO/SNP_ADV_SFT.
    Replenishment Lead Time in Calendar Days: the number of calendar days needed to obtain the product, including its components, through in-house production or external procurement.
    Use: The replenishment lead time (RLT) is used in the enhanced methods of safety stock planning in Supply Network Planning (SNP). The goal of safety stock planning is to comply with the specified service level, in order to be prepared for unforeseen demand that may arise during the replenishment lead time. The longer the RLT, the higher the planned safety stock level.
    Dependencies: The field is taken into account by the system only if you have specified Master Data or Master Data/Supply Chain in the RLT: Determine Forecast field of the safety stock planning profile used.
    Hope this helps.
    The RLT from ECC is in MARC-WZEIT, which is transferred to APO in structure /SAPAPO/MATIO, field CHKHOR.
    If you maintain the setting in the profile, you may get the value in RELDT.
    Thanks.

  • Extracting data from ECC tables in BODS

    Hello all,
    I'm trying to extract data from ECC tables. I have created a datastore and imported the necessary table (FAGLFLEXT) that I will be using. I used the table as a source in a job and tried to execute the job, but I was only able to extract 28 records from the extractor, although many more records are present in the table; when I view the data in BODS I can also see only 28 records.
    Please help me resolve this. How do I extract all of the data?
    Thanks in advance.

    The table layouts can be found in the [documentation for EPMA|http://www.oracle.com/technetwork/middleware/bi-foundation/epm-data-models-11121-354684.zip] . If this doesn't work, there are other options to export hierarchies to text files. You can use life cycle management or the [EPMA File Generator|http://docs.oracle.com/cd/E17236_01/epm.1112/epma_file_gen_user/launch.html].
    Kyle Goodfriend
    http://www.in2hyperion.com
    Please make sure you assign your post as answered when an appropriate answer is provided (or helpful when applicable) so others benefit.

  • Problem loading material master (IS Mill) data from ECC to BI

    Hi Gurus,
    We have a problem loading Material master data from ECC to BI 7.0 SP 18.
    The scenario is :
    The ECC system is with IS Mill, due to which the material field MATNR has a length of 40 instead of the standard 18 characters.
    That is, data element MATNR has a length of 18 characters and an output length of 40 characters.
    When table MARA is browsed using SE16, materials with more than 18 characters show only the first 18 characters, ended with '!'.
    The OMSL setting shows the length as 40.
    When the extractor checker runs 0MATERIAL_TEXT or 0MATERIAL_ATTR, it gives correct output, i.e. more than 18 characters and not ended with '!'.
    Up to this point there is no problem.
    On the BI side, after replication of the datasource, I checked data element MATNR, but it has a length of 18 characters and an output length of 18 characters.
    The OMSL setting cannot be set to more than 18.
    The InfoPackage has pulled data into the PSA successfully. I checked the PSA data; here too, materials with more than 18 characters end with '!'.
    When the data is pushed further to the 0MATERIAL InfoObject, it throws the following error for all materials irrespective of their length (examples below):
    0MATERIAL : Data record 768 ('SIT_PL_B01L_10_01!E '): Version 'SIT_PL_B01L_10_01! ' is not valid
    0MATERIAL : Data record 165 ('RLIRS52 E '): Version 'RLIRS52 ' is not valid
    Diagnosis
         Data record 768 & with the key 'SIT_PL_B01L_10_01!E &' is invalid in value 'SIT_PL_B01L_10_01! &' of the attribute/characteristic 0MATERIAL.
    System Response
         The system has recognized that the value mentioned above is invalid, and has processed this general error message. A subsequent message may give  you more information on the error. This message refers to the same value, even though it does not state this explicitly.
    I searched for an SAP Note related to this but could not find any.
    There is one SAP Note (960868) which mentions this, but the correction was shipped with BI SP 9, and we are running on SP 18.
    Requesting you all experts for help.
    Best Regards,
    Deepak

    Hi,
    follow the steps below:
    1. Activate the DataSource on the BI side:
    go to RSA1 > DataSource > select the DataSource > double-click > check the fields and activate.
    2. Replicate the DataSource to the BI side.
    3. Check the RFC connections using SM59.
    Regards.

  • Download ztable data from ECC to CRM 7.0

    Hi,
    I have created a Z-table in ECC and in CRM 7.0 (the tables are identical). Now I want to download data from ECC to CRM.
    The table should only be updatable from ECC; no maintenance is required in CRM. How can I achieve this?
    Please suggest me.
    Thanks,
    Brahmaji

    Create a customizing adapter object in R3AC3.
    Copy the standard FM 'CRM_BUPA_MAP_ADREREG_CI' to a custom FM and write the source code.
    Load the object in R3AS.
    Create a variant and schedule it based on the requirement.

  • I am extracting data from ECC to BW, but the data load is taking a long time

    Hi All,
    I am extracting data from ECC to the BI system, but the data load is taking a long time; the InfoPackage has been running for the last 6 hours and is still showing yellow. I manually set it to red, deleted the request, and applied a repeat of the last delta, but the same problem occurs. The status shows that the background job has not finished in the source system. We asked Basis, and they killed that job; when we scheduled the chain again, the same problem came back. How can I solve this issue?
    Thanks ,
    chandu

    Hi,
    There are different places to track your job. Once your job is triggered in BW, you can track where exactly the load is taking more time and why. Follow the steps below:
    1) After the InfoPackage is triggered, take the request number and go to the source system to check your extraction job status.
    You can get the job status by taking the request number from BW and going to transaction SM37 in ECC. Give the request number with '*' at the beginning and end, and give '*' as the user name.
    Job name: *REQ_XXXXXX*
    User Name: *
    Check whether the job completed, was cancelled, or short-dumped. If the job is still running, check in SM66 whether you can see any process; if not, check accordingly in ST22 or SM21 in ECC. If the job is complete, do the same check on the BW side.
    2) Check whether the data arrived in the PSA; if not, check whether the transfer routines or start routines contain bad SQL or code. Do the same for the update rules.
    3) Once the load is through in the source system (ECC), the transfer rules, and the update rules, the next task, updating the data, might sometimes take more time depending on certain parameters (e.g. the number of parallel processes updating the database). Check whether updating the database is taking more time; you may also need to check with the DBA.
    At all times you should see at least one process running in SM66 until your job completes; if not, you will see a log in ST22.
    Let me know if you still have questions.
    Assigning points is the only way of saying thanks in SDN.
    Thanks,
    Kumar.

  • Goods Receipts data from ECC 6.0 to APO system

    Hello Team
    Good day to you. Now i have the below requirement.
    1. From the legacy system I am getting Goods Receipt data, and I want to create GRs in the ECC system. For this I found a relevant BAPI (BAPI_GOODSMVT_CREATE); please correct me if I am wrong.
    2. As soon as the GRs are created in the ECC system, I want to push this data to the APO system for updates. For this requirement I want to know whether there is any user exit/BAdI in which I can write code to trigger an RFC function module that takes data from the ECC system and updates the APO system, or whether there is any other way to perform this task. Please advise.
    My interface landscape is as below:
    Legacy system -> ECC system -> APO system
    I want to push GR data from the legacy system to the APO system as mentioned in the above two steps, so please suggest a solution accordingly.
    Thanks in advance.
    REgards
    RAj

    Hi Mario
    Thank you very much for your detailed explanation. But I think I put the wrong question to you; I am sorry for that. I am very new to this process, so I am posting my requirement and the suggested options below. Please check them and advise accordingly. For Option 3, what BAdI do I need to use, and what process should I follow to complete it? Your suggestions will help me a lot.
    Thanks in advance very much.
    My requirement is below with the options suggested are given below.
    Design Decision and Options
    Description
    Describe the problem which requires a decision to be made.
    In the current design for Phase 1 go live, goods receipts from production will be generated through interfaces I608 (when production is receipted in PRISM) and I622 (when production is receipted via OPUS). This will update the stock levels in the ECC system. As the Core Interface (CIF) between ECC and APO will be active, the stocks will also be updated in APO.
    In APO, the production orders will not be reduced by the confirmed amounts until the PRISM to APO interface (I642) runs overnight. Until the production orders are reduced there will be double counting in APO leading to incorrect planning and GATP results.
    Options
    List the options that have been considered as potential solutions
    Option 1: Increase the frequency of the current PRISM to APO interface.
    Option 2: Build additional Interface between PRISM & APO for production confirmations.
    Option 3: Build enhancement to ECC GR to update APO production orders.
    Option 1: Increase the frequency of the current PRISM to APO interface
    Describe the pros and cons of option 1
    Pros
    No change to system design
    Cons
    Significant potential for issues with synchronous timing between Goods Receipts in the ECC system and Production Orders from PRISM to APO
    Interface method requires deletion/creation of production orders in APO which will cause issues for APO users
    Option 2: Build additional interface between PRISM and APO for production confirmations
    Describe the pros and cons of option 2
    Pros
    Interface method of updating production orders will not cause issues for APO users.
    Cons
    New interface
    Development work required in PRISM to feed new interface
    Significant potential for issues with synchronous timing between Goods Receipts in the ECC system and Production Orders from PRISM to APO
    Delay between GR from OPUS and production confirmation from PRISM to APO
    Option 3: Build enhancement to ECC GR to update APO production orders.
    Describe the pros and cons of option 3.  If only 2 options, then delete this section.  If more than 3 options, then add further sections.
    Pros
    Ties the APO production confirmation to the ECC goods receipt, mimicking phase 2 scenario
    Will not cause issues for APO users
    Removes timing issue between goods receipt and production order confirmation
    Cons
    New development.
    Assumptions for Proposed Development:
    Utilise available BADi upon posting of goods receipt
    Will generate qRFC that will run Production Confirmation BAPI in APO
    BADi executes in update task of goods receipt
    Interface for goods receipts will contain PRISM production order number
    Recommended Decision
    State which option is recommended as the decision, and the reason why this option has been recommended.
    Option 3:  Build enhancement to ECC GR to update production orders in APO.
    Regards
    Raj

  • Unable to extract the data from ECC 6.0 to PSA

    Hello,
    I'm trying to extract data from ECC 6.0, datasource 2LIS_11_VAHDR, into BI 7.0.
    When I try to run a full load into the PSA, I get the following error message:
    Error Message: "DataSource 2LIS_11_VAHDR must be activated"
    Actually the datasource is already active; I looked at it using t-code LBWE and it is active.
    In BI, when I right-click the datasource (2LIS_11_VAHDR) and select "Manage", the system throws the error message below:
    "Invalid DataStore object name /BIC/B0000043: Reason: No valid entry in table RSTS"
    If anybody has faced this error message, please advise what I am doing wrong.
    Thanks in advance

    On the ECC 6.0 side:
    Deleted the setup tables.
    Filled the data into the setup tables.
    Scheduled the job.
    I can see the data using RSA3 (2LIS_11_VAHDR): 1000 records.
    On BI 7.0 (Service Pack 15):
    Replicated the datasource into production in the background.
    Migrated the datasource from 3.5 to 7.0 in development.
    I didn't migrate from 3.5 to 7.0 in production; the system does not allow it.
    When I try to schedule the InfoPackage, it gives the error message "Data Source is not active".
    I'm sure this problem is related to the datasource 3.5-to-7.0 conversion in production. In development there is no problem because I converted the datasource from 3.5 to 7.0 manually.
    Thanks

  • How to only migrate data from SQL Server 2008 to Oracle 11?

    According to our requirement, we need to migrate only the data from a SQL Server database to an existing Oracle database user.
    1) I tried to do it with the SQL Developer 3.0.04 Migration Wizard, but found an issue.
    My SQL Server database name is SCDS41P2, and my Oracle database user name is CDS41P2.
    When I used SQL Developer to do an offline data move via the Migration Wizard, I found that the Oracle user name in the movedata files generated by the wizard is dbo_SCDS41P2. Because this is not the same as my existing Oracle user name, the data can't be moved to my existing Oracle user when I run oracle_ctl.bat in a command-line window. So I had to modify the Oracle user name in all the movedata files, which is difficult because there are many tables in the databases. Could you please tell me how to generate movedata files in which the Oracle user name is the one I expect?
    2) I also tried to use the 'Copy to Oracle' function to copy the SQL Server tables' data to the existing Oracle database user. When I clicked 'Copy to Oracle', I selected the 'Include Data' and 'Replace' options, but some tables could not be copied; the error info is as below:
    Table SPSSCMOR_CONTROLTABLE Failed. Message: ORA-00955: name is already used by an existing object
    Could you please tell me how to deal with this kind of error?
    Thanks!
    Edited by: 870587 on Jul 6, 2011 2:57 AM

    Hi,
    Thanks for your reply, but the 'Copy to Oracle' function still does not work. I will give some info about the table below. I also searched for 'SPSSCMOR_CONTROLTABLE' in the target schema and found only one object, so why does it say 'name is already used by an existing object'? Could you please give me some advice? Thanks!
    What is the 'Build' version of your SQL*Developer ?
    [Answer]:
    3.0.04
    - what does describe show for the SPSSCMOR_CONTROLTABLE in SQL*Server ?
    [Answer]:
    USE [SCDS41P2]
    GO
    /****** Object: Table [dbo].[SPSSCMOR_CONTROLTABLE] Script Date: 07/18/2011 01:25:05 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    CREATE TABLE [dbo].[SPSSCMOR_CONTROLTABLE](
         [tablename] [nvarchar](128) NOT NULL,
    PRIMARY KEY CLUSTERED
    (
         [tablename] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
    - what does describe show for the SPSSCMOR_CONTROLTABLE in Oracle ?
    [Answer]:
    -- File created - Monday-July-18-2011
    -- DDL for Table SPSSCMOR_CONTROLTABLE
    CREATE TABLE "CDS41P2"."SPSSCMOR_CONTROLTABLE"
    (     "TABLENAME" NVARCHAR2(128)
    ) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
    TABLESPACE "USERS" ;
    -- DDL for Index SYS_C009547
    CREATE UNIQUE INDEX "CDS41P2"."SYS_C009547" ON "CDS41P2"."SPSSCMOR_CONTROLTABLE" ("TABLENAME")
    PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
    TABLESPACE "USERS" ;
    -- Constraints for Table SPSSCMOR_CONTROLTABLE
    ALTER TABLE "CDS41P2"."SPSSCMOR_CONTROLTABLE" MODIFY ("TABLENAME" NOT NULL ENABLE);
    ALTER TABLE "CDS41P2"."SPSSCMOR_CONTROLTABLE" ADD PRIMARY KEY ("TABLENAME")
    USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
    TABLESPACE "USERS" ENABLE;
    Edited by: 870587 on Jul 18, 2011 1:42 AM
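    For what it's worth, ORA-00955 only means that some object in the target schema already occupies the name; it need not be a table. A hedged way to investigate is to query the data dictionary, and if a leftover copy from an earlier run turns up, drop it before re-running 'Copy to Oracle'. This is a sketch only; the schema and table names are taken from the post above, so adjust as needed:

    ```sql
    -- List every visible object named SPSSCMOR_CONTROLTABLE.
    -- A synonym, view, or index with the same name would also trigger ORA-00955.
    SELECT owner, object_name, object_type, status
      FROM all_objects
     WHERE object_name = 'SPSSCMOR_CONTROLTABLE';

    -- If a stale copy of the table exists, remove it and retry the copy.
    -- CASCADE CONSTRAINTS also drops the primary-key constraint and its index.
    DROP TABLE "CDS41P2"."SPSSCMOR_CONTROLTABLE" CASCADE CONSTRAINTS;
    ```

    If the dictionary really shows only the one table, the conflict may instead be with an object name the copy itself tries to create (for example the primary-key index), which the error message reports under the table's name.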

  • In PL-SQL archive data from a table to a file

    I am currently developing a vb app where I need to archive data from a table to a file. I was hoping to do this with a stored procedure. I will also need to be able to retrieve the data from the file for future use if necessary. What file types are available? Thanks in advance for any suggestions.

    What about exporting in an Oracle binary format? The export files cannot be modified. Is there a way to use the export and import utilities from within PL/SQL?
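    Since Oracle 10g, the export/import utilities are exposed to PL/SQL through the DBMS_DATAPUMP package, so a stored procedure can write a table's data to a binary dump file and re-import it later. A minimal sketch, assuming a directory object named ARCHIVE_DIR already exists and the caller has READ/WRITE on it (ARCHIVE_DIR and MY_TABLE are placeholders, not names from the original post):

    ```sql
    DECLARE
      h NUMBER;
    BEGIN
      -- Open a table-mode export job.
      h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'TABLE');

      -- The dump file is written through the ARCHIVE_DIR directory object.
      DBMS_DATAPUMP.ADD_FILE(handle    => h,
                             filename  => 'my_table_archive.dmp',
                             directory => 'ARCHIVE_DIR');

      -- Restrict the job to the one table being archived.
      DBMS_DATAPUMP.METADATA_FILTER(handle => h,
                                    name   => 'NAME_EXPR',
                                    value  => 'IN (''MY_TABLE'')');

      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.DETACH(h);
    END;
    /
    ```

    A corresponding job opened with operation => 'IMPORT' reads the same dump file back in, which covers the requirement to retrieve the archived data later. On releases before 10g, the comparable route is the external exp/imp executables, which cannot be called from pure PL/SQL.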

  • Error while loading data from ECC to BI

    Hi BW Experts,
    While loading data from ECC 6.0 to BI, I got an error in the Details tab of the InfoPackage: data package 1 arrived in BW, but processing is not yet finished.
    Could anyone help me out?
    Thanks

    Amar,
       please check the source system job status. If it completed successfully, then we can expect the data load to succeed on the BW side.
    To check the source system status: RSMO > select the related load > menu Environment -> Job Overview in Source System.
    Once this is done, check whether tRFCs or IDocs are stuck.
    For tRFCs, check in t-code SM58, or from Environment select tRFC from the source/data warehouse.
    If you find tRFCs stuck, just select and run them manually to complete them.
    M
