Data archiving and data cleansing

Hi experts,
Can anyone give me a step-by-step guide for data archiving and data cleansing of SAP IS-U objects?
What is the difference between data archiving and data cleansing?
Thanks & regards

Data Archiving: There are many archiving objects; you can look at some of them below.
ISU_BBP IS-U Archiving: Budget Billing Plan
ISU_BCONT Business Partner Contacts (Contract A/R + A/P)
ISU_BILL IS-U Archiving: Billing Document Header
ISU_BILLZ IS-U Archiving: Billing Line Item
ISU_EABL IS-U Archiving: Meter Reading Results
ISU_EORDER IS-U Archiving: Waste Disposal Order
ISU_EUFASS Archiving of Usage Factors
ISU_FACTS Installation Facts
ISU_INSPEC IS-U Archiving: Campaigns for Inspection List
ISU_PPM Prepayment Documents
ISU_PRDOCH IS-U Archiving: Print Document Header
ISU_PRDOCL IS-U Archiving: Print Document Line Item
ISU_PROFV IS-U Archiving: EDM Profile Values
ISU_ROUTE IS-U Archiving: Waste Disposal Route
ISU_SETTLB Settlement Document
ISU_SWTDOC Archive Switch Document
Go to transaction SARA, enter the object CA_BUPA for the business partner, and press F6; you will get step-by-step documentation. Please follow the same procedure for all the objects.
Regarding your other question: data archiving moves closed transactional data out of the database into archive files (using the Archive Development Kit) to reduce database size while keeping the data readable, whereas data cleansing identifies duplicate master data records (for example business partners) and merges them into one consolidated record.
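For orientation, here is a minimal sketch of what an ADK (Archive Development Kit) write program, the kind of program SARA runs in the Write step, does behind the scenes. This is illustrative only: the archiving object name Z_EABL_DEMO is a made-up placeholder, the selection is a demo, and a real object would also need customizing in transaction AOBJ plus a matching delete program.

REPORT z_adk_write_sketch.
* Sketch of an ADK write program for a hypothetical custom archiving
* object Z_EABL_DEMO. A productive program would select data according
* to residence times and would be followed by the object's delete program.
DATA: lv_handle TYPE sy-tabix,                 " archive handle returned by the ADK
      lt_eabl   TYPE STANDARD TABLE OF eabl,   " meter reading results (demo data)
      ls_eabl   TYPE eabl.

* Open a new archive file for the archiving object
CALL FUNCTION 'ARCHIVE_OPEN_FOR_WRITE'
  EXPORTING
    object         = 'Z_EABL_DEMO'
  IMPORTING
    archive_handle = lv_handle.

* Demo selection only; real write programs apply residence-time checks
SELECT * FROM eabl INTO TABLE lt_eabl UP TO 10 ROWS.

LOOP AT lt_eabl INTO ls_eabl.
* Each business object in the archive file starts a new data object
  CALL FUNCTION 'ARCHIVE_NEW_OBJECT'
    EXPORTING
      archive_handle = lv_handle.

* Add the table records that belong to this data object
  CALL FUNCTION 'ARCHIVE_PUT_RECORD'
    EXPORTING
      archive_handle   = lv_handle
      record_structure = 'EABL'
      record           = ls_eabl.

* Close the data object
  CALL FUNCTION 'ARCHIVE_SAVE_OBJECT'
    EXPORTING
      archive_handle = lv_handle.
ENDLOOP.

* Close the archive file; deletion is handled by the object's delete program
CALL FUNCTION 'ARCHIVE_CLOSE_FILE'
  EXPORTING
    archive_handle = lv_handle.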
Regards,
Siva

Similar Messages

  • What is data archiving and DMS (Data Management System) in SAP

    What is data archiving and DMS (Data Management System) in SAP?

    Hi,
    There are three ways to reduce the data that is distributed:
    Filtering at the IDoc level: identify the filter object (BD59) and modify the distribution model accordingly (BD64).
    Segment filtering: specify the segments to be filtered (BD56).
    The reduced IDoc type: analyze the data and reduce the IDoc type (BD53).
    Thanks and regards.

  • How to Archive and Purge SLA data?

    We are looking for a solution to archive and purge SLA tables. I have opened an SR with Oracle, and they suggested working with a third party that specializes in archiving and purging data. I would like to know whether any clients have developed a custom solution for this and, if so, what criteria were used.
    Thanks in advance,
    Prathibha

    Hi,
    Thanks for the information.
    But this traditional way of archiving is for maintenance and backup operations.
    We want to have this process online, without taking the DB offline. Will this approach work in that case?
    In our case, the rules could be, for example:
    1. For table 'A', if rows exceed 10 million, then start archiving the data for that table.
    2. For table 'B', if data is older than 6 months, start archiving the data for that table.
    3. Archiving should run for 15 minutes only, then pause, and resume whenever the user wants to resume it.
    4. Archiving should start on specified days only, etc.

  • Data archival and purging for OLTP database

    Hi All,
    Need your suggestions regarding a data archival and purging solution for an OLTP database.
    Currently, we are planning to generate flat files from the tables before purging the inactive data, move them to tape/disk for archiving, and then purge the data from the system. We have many retention requirements and conditions before archiving the data, so partitioning alone is not sufficient.
    Is there a better approach for archival and purging than this flat-file approach?
    thank you.
    regards,
    vara

    user11261773 wrote: Is there any better approach for archival and purging other than this flat file approach?
    FBDA (Flashback Data Archive) is a better option. Check the link below:
    http://www.oracle.com/pls/db111/search?remark=quick_search&word=flashback+data+archive
    Good luck
    --neeraj

  • Archiving and deleting of equipment master data in plant maintenance module

    Hi,
    Can anyone explain to me the process to archive and delete equipment master data in the PM module using the archiving tools?
    I tried doing it through transaction SARA; I would be grateful if anyone can help me with the steps to follow to archive the equipment master data.
    Thanks in advance,

    Many thanks for your reply,
    I tried doing the same in IDES, but unfortunately the archive file is not getting generated when I run the write step. I have clearly specified in the variant to create an archive file. Can you please explain how the system generates an archive file? Also, in the archiving-object-specific customizing (technical settings), I have maintained the production variant and marked "Start Automatically".
    Please advise,
    thanks again,

  • What should I do to implement Data Archiving and Data Reporting

    We want to implement data archiving and data reporting in our product. Can someone tell me what techniques or approaches people generally take to implement data archiving and data reporting?
    I am currently looking into data warehousing. Is this the right approach? I have no idea where I should start on this. Can someone give me a good direction as a starting point?
    thank you,
    Puja


  • Differences between archiving and inactivating a qualitative lookup within the Data Admin toolkit.

    Hi,
    Can you please let me know the difference between archiving and inactivating a qualitative lookup in the Data Admin Toolkit?
    Thanks,
    Rohini M

    When you inactivate or archive something, it is no longer available for selection. The difference between inactive and archive is that inactive items will still appear as available for searching purposes, while archived items will not.
    So let's say you have the following:
    List A
       - Item 1
       - Item 2
    List Items
    If you were to inactivate Item 1, end users would no longer see it available for selection when using the qualitative extended attribute. However, when they search for specs based on the extended attribute, they would still be able to select Item 1, so they could find objects that used that value. If you were to archive Item 1, end users would no longer see it available for selection anywhere, including searching.
    Lists
    If you were to inactivate or archive the entire List A, you would no longer see it available for selection when setting up qualitative lookup extended attributes. I don't think there is currently anywhere out of the box where you can search for extended attributes by lookup list, so these would act similarly. If there were a place to search for extended attributes by lookup list, then it would follow the same rules as above.

  • Data Archiving and ABAP

    Hi All,
    I am an ABAPer and recently came across the process of data archiving.
    Can anyone tell me what significance data archiving has with respect to ABAP? Are they interrelated at any point?
    Waiting for Reply..
    Shilpa

    Hi Shilpa,
    As a developer you will be aware that whatever applications SAP provides, they all use code written in ABAP in the background.
    Similarly, SAP provides a tool called the Archive Development Kit (ADK), which uses various ABAP programs, grouped in the form of archiving objects, to perform archiving successfully in any system.
    Yes, archiving and ABAP are interrelated, and an ABAPer can very well understand how these programs actually function at runtime while we archive data from the database.
    Another important point: apart from the standard archiving objects provided by SAP, requirements sometimes come up for custom objects to be created, which requires good in-depth knowledge of ABAP.
    I hope my answer helps you understand the association between ABAP and data archiving.
    For more, you can go through the link below:
    http://help.sap.com/saphelp_47x200/helpdata/en/8d/3e4d22462a11d189000000e8323d3a/frameset.htm
    -Supriya
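    To make the ADK/ABAP connection above concrete, here is a minimal, illustrative read-program sketch. It only loops over the archived objects and counts EABL records; the object name ISU_EABL comes from the list at the top of this thread, while the buffer length and the lack of error and file handling are simplifications of this sketch.

    REPORT z_adk_read_sketch.
    " Sketch of an ADK read program: open the archive files of an archiving
    " object, then loop over the archived data objects and their records.
    DATA: lv_handle TYPE sy-tabix,          " archive handle
          lv_record TYPE c LENGTH 8192,     " raw record buffer (length assumed)
          lv_struct TYPE tabname,           " structure name of the current record
          lv_count  TYPE i.

    " Open accessible archive files of the archiving object for reading
    CALL FUNCTION 'ARCHIVE_OPEN_FOR_READ'
      EXPORTING
        object         = 'ISU_EABL'
      IMPORTING
        archive_handle = lv_handle.

    DO.
      " Position on the next archived business object
      CALL FUNCTION 'ARCHIVE_GET_NEXT_OBJECT'
        EXPORTING
          archive_handle = lv_handle
        EXCEPTIONS
          end_of_file    = 1
          OTHERS         = 2.
      IF sy-subrc <> 0.
        EXIT.                               " no more objects in the archive
      ENDIF.
      DO.
        " Read the records of the current object one by one
        CALL FUNCTION 'ARCHIVE_GET_NEXT_RECORD'
          EXPORTING
            archive_handle   = lv_handle
          IMPORTING
            record           = lv_record
            record_structure = lv_struct
          EXCEPTIONS
            end_of_object    = 1
            OTHERS           = 2.
        IF sy-subrc <> 0.
          EXIT.                             " all records of this object read
        ENDIF.
        IF lv_struct = 'EABL'.
          lv_count = lv_count + 1.
        ENDIF.
      ENDDO.
    ENDDO.

    CALL FUNCTION 'ARCHIVE_CLOSE_FILE'
      EXPORTING
        archive_handle = lv_handle.

    WRITE: / 'EABL records read from the archive:', lv_count.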

  • Can I Archive and Install From Leopard Up-To-Date Disc?

    I'm a Mac newbie, so I apologize if this is an obvious question.
    I purchased my iMac this month, so I received a Leopard Up-To-Date disc for $10. The CD says Upgrade on it, and I can't find any options for what type of install I want to run. I've seen things like Erase And Install, Archive and Install, and Upgrade posted here on the forums, but the only options I have are whether or not to install things like Core Services, X11, etc. Can I not perform an Archive and Install with this disc, and if I can't, is there any chance I could request a full CD from Apple?

    You should be able to -- when you get to the screen that asks you to choose your installation destination, highlight the drive, and you will see an Options button in the lower left. Click it and you will find three choices: Upgrade, Archive and Install, and Erase and Install. Choose the one you want and enjoy!
    Keep in mind that when you put the disc in, the system will verify that the DVD is OK -- it takes a few minutes, but it is worth knowing that the disc is not corrupt before you start. Also, repairing permissions seems to take a while -- just let it happen.

  • GoldenGate Extracts read slowly during Table Data Archiving and Index Rebuilding Operations

    We have configured OGG on a near-DR server. The extracts are configured to work in ALO (archived log only) mode.
    During the day, the extracts work as expected and are in sync. But during any daily maintenance task, the extracts start lagging and read the same archives very slowly.
    This usually happens during table data archiving (DELETE from prod tables, INSERT into history tables) and during index rebuilding on those tables.
    Points to be noted:
    1) The tables on which archiving is done and whose indexes are rebuilt are not captured by the GoldenGate extract.
    2) The extracts are configured to capture DML operations. Only INSERT and UPDATE operations are captured; DELETEs are ignored by the extracts. DDL extraction is not configured either.
    3) There is no connection to the PROD or DR database.
    4) The system functions normally all the time, but it starts lagging just during table data archiving and index rebuilds.
    Q 1. As mentioned above, even though the tables are not part of the capture, the extract lags. What are the possible reasons for the lag?
    Q 2. I understand that an index rebuild is a DDL operation, yet it still induces a lag into the system. How?
    Q 3. We have been trying to find a way to overcome the lag, which ideally shouldn't have arisen. Is there any extract parameter or workaround for this situation?

    Hi Nick.W,
    The amount of redo logs generated is huge. Approximately 200-250 GB in 45-60 minutes.
    I agree that the extract has to parse the extra object IDs. During the day, there is a redo switch every 2-3 minutes. The source is a 3-node RAC, so approximately 80-90 archives are generated in an hour.
    The reason to mention this was that while reading these archives too, the extract would be parsing extra object IDs, as we are capturing data for only 3 tables. The effect of parsing extra object IDs should therefore have been seen during the day as well: the archive size is the same, the amount of data is the same, and the number of records to be scanned is the same.
    The extract slows down and reads at half the speed. If it would normally take 45-50 seconds to read an archive log from normal day-to-day functioning, it takes approximately 90-100 seconds to read the archives from the activities mentioned.
    Regarding the 3rd point,
    a. The extract is a classic extract; the archived logs are on a local file system. No ASM, no SAN/NAS.
    b. We have added the "TRANLOGOPTIONS BUFSIZE" parameter to our extract. We'll update as soon as we see any improvement.

  • Azure capabilities in Data Cleansing, profiling and Metadata lineage

    Can someone explain Azure's built-in capabilities in data cleansing, profiling and metadata lineage, if it has any?
    Regards

    Azure Data Factory makes it easy to automate Hadoop or custom code, so any library, method, etc. that works on Hadoop can be used within a Data Factory pipeline.
    Once an Azure Data Factory is designed, in the Azure portal (https://portal.azure.com) you can see a diagram view of the data factory, including the various tables (files, SQL tables, blobs, etc.) and the activities (copies, Hive/Pig scripts) which connect them. When you click on a particular item, there is a visual lineage view showing which ones feed in and out of the selected item.
    Our initial lineage focus has been "operational lineage", including things like health status, health history, run logs, etc.
    (Example screenshot omitted: the items are all shown green at the moment, but failures and the logs affiliated with them would also be shown.)
    Data profiling and metadata lineage are asks we have heard before.
    What kind of needs does your business have?
    Also, please visit our feedback site and vote up items you wish for, and mention new ideas as well.
    http://feedback.azure.com/forums/270578-azure-data-factory
    Best Regards, Jason

  • .:: Archive and Install saves my Data? ::.

    Hello everyone,
    During the installation of the Mac OS 10.9 update (mine was at 10.3) there was a power cut in my area, and when I tried to start up the Mac it wouldn't boot. It was stuck at the grey screen with the Apple logo and the spinning gear. After booting into many modes, I found out that my system files were corrupt. I searched for my original installation discs and found my Mac OS 10.1 discs that came with my Mac. When I tried to install it, however, it said that there was a newer version on my hard drive and that it could not downgrade! I proceeded to buy the 4-CD edition of Tiger (arriving soon), and...
    THE QUESTION:
    I was wondering: if I install Tiger, will my user data and personal files be deleted? Could I use this "Archive and Install" feature to keep all of my files?
    Thank you so much for any help given,
    Dexter

    Hi Dexter!
    Before you do anything else, I suggest that you attempt to back up the existing system.
    If you still cannot startup correctly, and you have access to another Firewire enabled Mac, you could try using Firewire Target Disk Mode to retrieve your data.
    "During the installation of the Mac OS 10.9..."
    Do you mean the 10.3.9 Combo Update?
    "...found my Mac OS 10.1 discs that came with my mac."
    Why are you not using the Full Retail Version of the Panther Install CDs that you would have had to use to install Panther 10.3.x?
    "...4 CD edition of Tiger..."
    Are you sure that these discs are a Full Retail Version of the Tiger 10.4.x install discs?
    May I inquire where you purchased them?
    "Could I use this "Archive and Install" Feature to keep all of my files?"
    If there is no functioning, uncorrupted system presently installed, then this may not work exactly as expected, and you may corrupt the system further.
    At this point, there is no guarantee that your data is not also corrupt.
    I really think you should first try to establish that your data is recoverable.
    ali b

  • Is there any product for Archival and purge of my data?

    1. Should have automatic schedules that run to purge/archive
    2. Keep a log of data that is archived/purged

    1. Should have automatic schedules that run to purge/archive
    Please see this link -- https://forums.oracle.com/forums/search.jspa?threadID=&q=data+AND+archival+AND+tool&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    2. Keep a log of data that is archived/purged
    Please see this link -- https://forums.oracle.com/forums/search.jspa?threadID=&q=data+AND+archival&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    Thanks,
    Hussein

  • Data Cleansing in CRM Web UI

    Dear Experts,
    I have activated Data Cleansing in CRM 7.0 and it is working.
    Can anyone advise on the setting or configuration required to get the CRM Data cleansing functionality to copy communication data stored at the relationship from the Source record to the Master record.
    Example:
    Company A has a relationship with Contact Person A; this relationship has a mobile number and email address maintained.
    When merging the account of Company A (source) with a duplicate company A1 (master), we need to copy both the relationship between Contact Person A and Company A (this already occurs) and the relationship-level mobile number and email address to the master account, which will be company A1 (this does not occur).
    Currently, the relationship-level communication data is not copied.
    Please provide step-by-step configuration settings to enable this. I already have the SAP Notes on data cleansing and have covered all the wikis on the web. What I need are specific steps to overcome this issue.
    Thanks!!!

    In order to transfer communication data maintained at the contact person relationship level, you have to maintain the required nodes in transaction BUSWU02: BUP115 (Contact's Work Address) and BUP120 (Contact's Communication Data).
    If data is still not transferred, SAP has released Notes 1243559, 1491950, and 1493240 to resolve this. Check their relevance before implementing.
    FK

  • BP_TASK Data Cleansing

    Happy New Year Experts,
    I have a question or three for you on Data Cleansing in the Web IC.  I will explain what I have done and what I need answers to.
    The setup of the Data Cleansing Cases and Account search is fine.  We can search for Business Partners and Merge Now or Merge Later and then search for cases if we chose to Merge Later.
    When we go into the case to process it, most of the functionality is fine. What isn't OK is the BP_TASK configuration.
    I understand that in order to execute the 'START' button in the Case Processing screens you need to have the task configuration set up. I have done this to an extent, as described below:
    1) Setup Number Ranges - IMG->CRM->Master Data->BP->DQA-> Maintain Number Ranges [Create line 01-0000000001-9999999999]
    2) In Task (1003) add the action profile BP_TASK - IMG->CRM->Transactions->Basic Settings->Define transaction types
    3) Create job CRM_BUPA_REALIGNMENT (periodic job 5-10 min)
    Point 3 is what I am coming unstuck on. I cannot create a periodic job without assigning an ABAP program to run, etc. There is nothing anywhere that says 'use this program or method' when creating the job step.
    Secondly, when I select the 'START' button in the Case Processing screen - after confirming the changes, the case errors with a vague message saying the case cannot be saved.  However, if I actually hit the 'SAVE' button the case saves and the changes I confirmed are processed between the various accounts.  So the whole process is about 95% great and 5% annoying.
    Before the questions, all authorisation settings are good as well.
    The questions are then:
    1)  What parameters are required above what I described for the periodic job?
    2)  Does the 'Task' transaction type need to be in the Business Transaction profile for the specific business role the Data Cleansing functionality is assigned within?
    3)  Once a task is created, I guess that the job will process these and that a user does not necessarily need to manually process these tasks?
    4)  Should I change the Action Definition and Condition at all over and above the standard setup that it is currently in?
    Any help and guidance would be great - I'm afraid I have 'Wood for Trees' syndrome now
    Many Thanks for any assistance.
    Regards,
    Mat.

    Hi Gregor,
    I have read your interesting blog. We have a similar kind of data cleansing activity, but when I tried to implement the same, we get a message that the data cleansing option is not available for the insurance industry-specific settings. Can you please help us here? When I analysed it, the function module 'FKK_CLEANSING_ALLOWED' has a hard-coded block as follows:
    * 4.71: Event 7500 does not exist yet
      IF 1 = 2.
        REFRESH: t_fbstab[].
        CALL FUNCTION 'FKK_FUNC_MODULE_DETERMINE'
          EXPORTING
            i_fbeve            = gc_event_7500
            i_only_application = gc_x
            i_applk            = applk
          TABLES
            t_fbstab           = t_fbstab[].
        LOOP AT t_fbstab INTO fbstab.
          CALL FUNCTION fbstab-funcc
            CHANGING
              c_cleansing_allowed = cleansing_allowed.
        ENDLOOP.
        IF 1 = 2.
    *     for the where-used list
          CALL FUNCTION 'FKK_SAMPLE_7500'
            CHANGING
              c_cleansing_allowed = cleansing_allowed.
        ENDIF.
      ENDIF.
    Our SAP version is as follows:
    SAP_BASIS      620         0056 SAPKB62056             SAP Basis Component            
    INSURANCE     472           0010     SAPKIPIN10     INSURANCE 472 : Add-On Installation
    Any help is much appreciated.
    Thanks.
    I Peter
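    For readers on a release where event 7500 does exist, the call pattern quoted above (FKK_FUNC_MODULE_DETERMINE for the event, then each registered module called with CHANGING c_cleansing_allowed) suggests that an installation-specific module for that event would look roughly like the sketch below. Note that this cannot help on the 4.71 code line shown above, where the whole block sits inside IF 1 = 2 and never executes; the module name and the flag typing are assumptions.

    FUNCTION z_cleansing_allowed_7500.
    " Installation-specific module for event 7500 (sketch). The signature is
    " assumed from the call pattern of FKK_SAMPLE_7500 quoted above:
    "   CHANGING c_cleansing_allowed TYPE xfeld
      c_cleansing_allowed = 'X'.   " allow data cleansing for this application
    ENDFUNCTION.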
