Best Practice for Data Archiving

Hi All,
I have a query about when a business should do data archiving in their R/3 systems.
On what basis do they do data archiving?
Is there a minimum period that data needs to stay in the system before it can be archived?
What do they do before starting data archiving?
Regards
Srini

Hi Srini,
1. SAP suggests implementing a data archiving strategy as early as possible to manage database growth.
However, people tend to think of archiving only when they run into problems like large data volumes, slow system response times, and other performance issues.
2. There is a proper way to implement data archiving: the database has to be analyzed first to identify the top DB tables and the corresponding archiving objects (see the sketch after this list).
3. Based on the DB analysis, the data archiving plan has to be implemented according to the Data Management Guide.
4. There is a minimum period, known as the residence time, that has to be completed before any data can be archived. Once a document is business-complete and has served its minimum required period in the database, it can be archived.
5. Before starting data archiving there are many steps to be followed, like analysis and configuration, which you can see in detail at the link below:
http://help.sap.com/saphelp_47x200/helpdata/en/2e/9396345788c131e10000009b38f83b/frameset.htm
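As a rough illustration of the database analysis in step 2, here is a minimal sketch that lists the largest tables, assuming an Oracle-based system and a user that may read DBA_SEGMENTS (host, SID and credentials are hypothetical; within SAP itself you would normally use transactions such as DB02 or TAANA instead):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class TopTables {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; adjust host, SID and credentials.
        String url = "jdbc:oracle:thin:@//dbhost:1521/SID";
        try (Connection con = DriverManager.getConnection(url, "monitor", "secret");
             Statement st = con.createStatement();
             // Top 20 tables by allocated space - the usual archiving candidates.
             ResultSet rs = st.executeQuery(
                 "SELECT * FROM (" +
                 " SELECT segment_name, ROUND(bytes / 1024 / 1024) AS mb" +
                 " FROM dba_segments WHERE segment_type = 'TABLE'" +
                 " ORDER BY bytes DESC)" +
                 " WHERE ROWNUM <= 20")) {
            while (rs.next()) {
                System.out.printf("%-30s %10d MB%n",
                        rs.getString("segment_name"), rs.getLong("mb"));
            }
        }
    }
}

The tables at the top of this list are the ones whose archiving objects will pay off first.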
Let me know if this helps you.
-Supriya
Edited by: Supriya  Shrivastava on May 4, 2009 10:38 AM

Similar Messages

  • Best Practice for Data Refresh & Hierarchy

    Hi,
    During a recent discussion with one of our BI user groups, questions were raised about the best practices for handling the following two issues.
    Issue 1:
    If entries are posted to prior periods in SAP R/3 (outside of the daily auto-refresh range), the current process is that the user group asks us to run a manual refresh in BI for the prior periods that are affected.
    Question: Is it possible to set up a trigger in the system, so that BI knows which periods have changed and automatically refreshes data for those periods?
    Issue 2:
    If a hierarchy used in the reports is modified, there might be an adverse impact on the financial data the user group reports on. The current process we have in place is to run a group of BI reports for both the current year and the prior year to make sure nothing is impacted, but there is a limitation to this process: what if there is no impact on the current or prior year, but on the years before that?
    Question: What do other global companies do to minimize such reporting impact, especially when they have hundreds of complex reports?
    If someone has any info on this, please share it.
    Thanks all for your support.
    Regards,
    Murali


  • The best practice for data archiving

    Hi
    My client has been using OnDemand for almost 2 years. There are around 2M records in the system (Activities), so I just want to know the best practice for data archiving; we don't care much about data older than 6 months.

    Hi Erik,
    Archiving here is essentially deletion.
    Create a backup cube in BW. Copy the data from your planning cube to the backup cube, and then delete that data region from your planning cube.
    Archiving will definitely improve the performance of your templates, scripts, etc., since the system will now search a smaller dataset.
    Hope this helps.

  • SDO_PC, multiple SRIDs - best practice for data model?

    Hi,
    I'm using UTM and I am getting data covering two zones.
    All my existing data is from zone A.
    Tables:
    pointcloud
    pointcloud_blk
    Now I'm getting data with very few points from zone A and most points from zone B. It was agreed that the data delivery will be in the SRID for zone B.
    So I tested whether this would work. I had two point clouds, one with SRID A, another with SRID B. As soon as I loaded the SRID B point cloud, I could NO LONGER QUERY the point cloud with SRID A.
    So it seems to be necessary to use at least another pointcloud_blk, e.g. pointcloud_blk_[srid].
    Question: does another pointcloud_blk for each SRID suffice, or do I also need a pointcloud table per SRID? The pointcloud table seems interesting only because of its EXTENT column. But on the other hand this could be queried by function, since there are only 10 or so records (point clouds) inside.
    Please share your best practices: what works and what doesn't.

    It is necessary to have one pointcloud_blk table for each SRID, since there is a spatial index on that table.
    As for the pointcloud table itself, it is up to you. You can have point clouds with different SRIDs in that table.
    But if you want to create a spatial index on it, you have to use a function-based index so that the index sees one SRID for the table (a sketch follows below).
    Since this table usually does not have many rows, it should work fine with one table for different SRIDs.
    siva
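    To make the function-based index concrete, here is a minimal sketch, assuming Oracle Spatial, a common target SRID of 32632 (a sample UTM zone), and hypothetical object names; a matching USER_SDO_GEOM_METADATA entry for the function-based column is also required:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class MixedSridIndex {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details; adapt to your environment.
            String url = "jdbc:oracle:thin:@//dbhost:1521/SID";
            try (Connection con = DriverManager.getConnection(url, "scott", "tiger");
                 Statement st = con.createStatement()) {
                // Deterministic wrapper so the spatial index sees a single SRID.
                st.execute(
                    "CREATE OR REPLACE FUNCTION to_common_srid(g MDSYS.SDO_GEOMETRY) " +
                    "RETURN MDSYS.SDO_GEOMETRY DETERMINISTIC IS " +
                    "BEGIN RETURN SDO_CS.TRANSFORM(g, 32632); END;");
                // Function-based spatial index on the mixed-SRID EXTENT column.
                st.execute(
                    "CREATE INDEX pc_extent_idx ON pointcloud (to_common_srid(extent)) " +
                    "INDEXTYPE IS MDSYS.SPATIAL_INDEX");
            }
        }
    }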

  • Best practice for data exchange with Flash Lite 2.0?

    Hi all,
    At present I am using LoadVars to communicate between Flash Lite and my server; is there a better way? I am building an application that needs constant information and updates, such as being able to send messages to different users. At the moment I am checking all the necessary database info using LoadVars, which is called every 1-5 secs by the different functions.
    This, however, is far from elegant, and the data charges can add up if the app is running for a long time.
    Can I use a socket-driven solution? Is this possible with Flash Lite 2.0?
    I am using PHP to retrieve info from a MySQL database.
    Thanks.
    Dev

    Ciao
    It is not up to Adobe to rebuild the Flash Lite player. Adobe licenses the player to mobile phone manufacturers, and they decide when/how to implement the FL player on their phones.
    If you want to know the roadmap, there are some hints here:
    http://www.biskero.org/?p=581
    Alessandro

  • A question on different options for data archival and compression

    Hi Experts,
    I have a production database of about 5 terabytes and about 50 GB in development/QA. I am on Oracle 11.2.0.3 on Linux. We have RMAN backups configured. I have a question about data archival strategy. To keep the OLTP database size optimal, what options can be suggested for data archival?
    1) Should each table have an archival strategy?
    2) What is the best way to archive data - should it be sent to a separate archival database?
    In our environment, an archival strategy is defined for only about 15 tables. For these tables, we copy their data every night to a separate schema meant to store this archived data and then transfer it eventually to a different archival database. For all other tables, there is no data archival strategy.
    What are the different options and best practices that can be reviewed to put a practical data archival strategy in place? I will be most thankful for your inputs.
    Also, what are the different data compression strategies? For example, we have about 25 tables that are read-only. Should they be compressed using the default Oracle 9i basic compression (ALTER TABLE ... COMPRESS)?
    Thanks,
    OrauserN

    You are using 11g, and in 11g you can compress read-only as well as read-write tables; both are always candidates for compression. This will save space and could increase performance, but always test it first (a sketch follows below the links). Compressing read-write tables was not an option in 10g. Read the following docs.
    http://www.oracle.com/technetwork/database/storage/advanced-compression-whitepaper-130502.pdf
    http://www.oracle.com/technetwork/database/options/compression/faq-092157.html
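    As a minimal sketch of the basic compression mentioned above, assuming Oracle 11g and hypothetical table/index names: MOVE rewrites the existing rows into compressed blocks, and since MOVE marks the table's indexes unusable, they must be rebuilt afterwards.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CompressReadOnly {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details and table names; adapt to your system.
            String url = "jdbc:oracle:thin:@//dbhost:1521/PROD";
            try (Connection con = DriverManager.getConnection(url, "app_owner", "secret");
                 Statement st = con.createStatement()) {
                for (String t : new String[] {"SALES_HIST", "ORDERS_HIST"}) {
                    st.execute("ALTER TABLE " + t + " MOVE COMPRESS"); // compress existing rows
                    st.execute("ALTER INDEX " + t + "_PK REBUILD");    // MOVE leaves indexes unusable
                }
            }
        }
    }

    Since these tables are read-only, a one-time MOVE COMPRESS is usually enough; future direct-path loads would also be compressed.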

  • What is the best practice for setting the dirty flag of a page/view?

    For a page/view, normally there are two things to do for dirty data:
    1. When it's clean, the Save button is disabled; when it's dirty, the Save button is enabled.
    2. When it's dirty and the window is closed, a popup says "you have unsaved data, close will lose the data".
    My thought is: it must be handled on the client side, because not every value change is auto-submitted. E.g., when you type the first letter of a string in an input box, the server side does not know it, but the Save button should be enabled immediately.
    Is it possible to capture all valueChange events in a page or a view on the client side?
    I'm not sure what the best practice is for setting the dirty flag. Is there a better solution? Does ADF provide a facility for this?

    import javax.faces.event.ActionEvent;

    public void save(ActionEvent event) {
        if (isFormValid()) {
            // form is valid: perform the save; the Save button stays enabled
        }
    }

    private boolean isFormValid() {
        boolean valid = true;
        if (/* check condition 1 */ false) {
            valid = false;
            showErrorMessage1();
        }
        if (/* check condition 2 */ false) {
            valid = false;
            showErrorMessage2();
        }
        return valid;
    }

    private void showErrorMessage1() {
        // popup: "you have unsaved data, close will lose the data"
    }

    private void showErrorMessage2() {
        // message for the second validation condition
    }
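    As a complementary sketch for the dirty-flag part of the question: a managed bean could track the flag on the server side, assuming the inputs use autoSubmit so the valueChangeListener actually fires (for keystrokes that are never submitted, a client-side listener would still be needed, as the original poster notes). All names here are hypothetical:

    import javax.faces.event.ActionEvent;
    import javax.faces.event.ValueChangeEvent;

    public class FormBean {
        private boolean dirty = false;

        // Wire as valueChangeListener="#{formBean.markDirty}" on each input.
        public void markDirty(ValueChangeEvent event) {
            dirty = true;
        }

        // Bind the Save button: disabled="#{!formBean.dirty}".
        public boolean isDirty() {
            return dirty;
        }

        public void save(ActionEvent event) {
            // ...persist the changes here...
            dirty = false; // clean again, so the Save button disables itself
        }
    }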

  • Best practices for replication

    Hi,
    I want to know the best practice for the duration of replication of the database between two Cisco ACS servers.
    Regards,
    Atif.

    Hi Atif,
    The replication time interval should be kept fairly long.
    Reason: every time you replicate the data, the ACS services have to restart, so doing this frequently may affect your production environment.
    However, if you want to replicate internal users' passwords, there is an option to replicate password changes right away without a full replication. You can enable this option under System Configuration -> Local Password Management. With this enabled you could potentially set the replications to a larger interval.
    It also depends on how often you make changes in your ACS. If the rate is normal, I would say set it to every Sunday 12:00 PM.
    This is how replication happens:
    The primary ACS stops its authentication and creates a copy of the ACS internal database components that it is configured to replicate. During this step, if AAA clients are configured properly, those that usually use the primary ACS fail over to another ACS. The primary ACS then resumes its authentication service.
    After the preceding events on the primary ACS, the database replication process continues on the secondary ACS. The secondary ACS stops its authentication service and replaces its database components with the database components that it received from the primary ACS. During this step, if AAA clients are configured properly, those that usually use the secondary ACS fail over to another ACS. The secondary ACS resumes its authentication service.
    HTH
    Regards,
    JK
    Please rate helpful posts.

  • Best practice for data migration install v1.40 - Error 2732 Directory Manager

    Hi
    I'm attempting to install SAP Best Practices for Data Migration 1.40 on Windows Server 2008 R2 (64-bit).
    Prerequisite error
    Installation program stops with missing file error
    The following file was not found
    ... \migration\InstallationWizard\BusinessObjects Data Services\setup.exe
    The file is necessary for successful installation. Please connect to internet or refer to Quick Guide (available on SAP note 1527151) for information regarding the above file.
    Windows installer log displays
    Error 2732 Directory Manager not initialized
    SAP note 1527151 does not exist or is internal.
    Any help appreciated on what the root cause of the error is, as the file does not exist in that folder in the installation zip file.
    The other prerequisite of .NET 3.5.1 is already met.
    The patch has been available since 20.11.2011, so I presume it is a good installation set.
    Thanks,
    Alan

    Hi Alan,
    There are details on Data Migration v1.4 installations on the SAP website and marketplace. The link below should guide you to the right place. It has a PowerPoint presentation and other useful links as well.
    http://help.sap.com/saap/sap_bp/DMS_V140/DMS_US/html/index.htm
    Arun

  • Best Course for Data Warehousing

    Hi,
    I am planning to join a data warehousing course. I heard there are lots of courses in data warehousing:
    Data warehousing with ETL tools or
    Data warehousing with Crystal Reports or
    Data warehousing with Business object or
    Data warehousing with Informatica or
    Data warehousing with Bo-Webel or
    Data warehousing with Cognos or
    Data warehousing with Data Stage or
    Data warehousing with MSTR or
    Data warehousing with Erwin or
    Data warehousing with oracle.
    Please suggest which is the best to choose and which has more scope, because I don't know the ABC of data warehousing, but I have some experience in Oracle.
    Is it a must to have work experience in data warehousing before I can get a job? Please also tell me the best book for data warehousing, one that starts from scratch. Please give your suggestions on my queries.
    Thanks & Regards,
    Raji

    Hi,
    Basically, a data warehouse is a concept. To develop a DW, we mainly need two tools: an ETL tool and a reporting tool.
    The few famous ETL tools are
    Informatica
    Data Stage
    Few famous Reporting tools are
    Crystal Reports
    Cognos
    Business object
    As a DW developer you should know at least one ETL tool and at least one reporting tool. The combination is your choice; it is better to find out the best combination in terms of the job market, and then learn those.
    Erwin is a data modeling tool. It can also be used in DW implementation. You already have experience with Oracle, so my advice is to go for data warehousing with Oracle or data warehousing with Informatica, and learn one reporting tool. I do not know whether there is any reporting tool available from Oracle.
    My suggestion on books.
    Data Warehousing Fundamentals by Paulraj Ponniah and
    The Data Warehouse Toolkit by Ralph Kimball.
    http://www.inmoncif.com/about.html is one of the best sites for data warehousing.
    With rgds,
    Anil Kumar Sharma .P
    Assigning points is the way to say thanks on the SDN site.

  • SAP Best Practices for Data Migration: repositories only on MS SQL Server?

    Hi,
    I'm implementing the "SAP Best Practices for Data Migration" (see https://websmp109.sap-ag.de/bp-datamigration).
    As part of the installation you have to install MS SQL Server Express Edition. The installation guide contains detailed steps to do this. All repositories for Data Services should be running on SQL Server, according to the installation guide.
    The customer I'm working for now does not want to use SQL Server, but DB2, as company standard.
    So I use DB2 for the local and profiler repositories.
    I notice, however, that the web application http://localhost:8080/MigrationServices does not support DB2. The only database type you can select in the configuration area is MS SQL Server.
    Is this a limitation, or is it by design?

    Hans,
    The current release of SAP Best Practices for Data Migration, v1.32, supports only MS SQL Server.  The intent when developing the DM content was to quickly set up a temporary, standardized data migration environment, using tools that are available to everyone.  SQL Server Express was chosen to host the repositories because it is easy to set up and can be downloaded for free.  Some users have successfully deployed the content on Oracle XE, but as you have found, the MigrationServices web application works only with SQL Server.
    The next release, including the web app, will support SQL Server and Oracle, but not DB2.
    Paul

  • Best Practice for connecting to an Ethernet-based device

    Hi,
    I have inherited a system where we have a cDAQ-9181 controlling a vehicle access barrier, with a LabVIEW application on a PC talking to it via Ethernet.
    (The application is very simple - press a button > send a value to the 9181 unit > opens the barrier )
    All works fine most of the time.
    ( We occasionally get network related errors. The LabView application sometimes thinks another PC has reserved the unit, or gives “error 89130 - device not available for routing” )
    The users would now like to be able to easily run the application from a second PC (not at the same time), but this seems to be a problem. If I exit the application on PC “A” and run it on PC “B”, it struggles to reserve the chassis and throws the “89130” error, and I have to restart the unit via MAX.
    While I’m a “veteran” control programmer, I’m new to LabVIEW, and would be very grateful for any pointers on “best practice” for talking to devices via Ethernet, or any specific suggestions for handling multiple PCs talking to a single device.
    Thank You.
    Tim.

    Hi Tim,
    Thank you for your post and welcome to the NI forums.
    There are lots of knowledgebase articles on our website and you should be able to find documentation for most of our hardware.
    There is a good troubleshooting guide for cDAQ Ethernet here (http://ae.natinst.com/public.nsf/web/searchinternal/e67b4e4749f378ff862577270059bd4b?OpenDocument) - it outlines the steps to take to ensure you have as stable a connection as possible. You may have already seen it, but the quick-start guide for your specific device may also be worth consulting for best practices. Are these helpful?
    As for using more than one PC - this shouldn't be too much of an issue. I would expect that the resource isn't being closed correctly - when you exit the App on PC 'A', how are you closing off the resource?
    Best regards,
    Eden S
    Applications Engineer
    National Instruments UK & Ireland

  • Best Practice for rebooting ISE Nodes?

    Hello Community,
    I administer an ISE installation with two nodes. (I am not an ISE specialist; my job is just to manage the users/MAC addresses.) But now I have to move my ISE nodes from one VMware cluster to another VMware cluster.
    (Both VMware environments are connected to our enterprise network, but are different environments. vMotion is not possible.)
    I would shut down ISE02, move it to our new VMware environment and start it again.
    Then I would do the same with our ISE01 node...
    Are there any best practices for doing this (shut down the application first, stop replication, etc.)?
    Can I really simply reboot an ISE node, or do I have to consider something before doing this? Or after doing this?
    Any tasks after reboot?
    Thank you for any answer!
    ISE01    
    Administration, Monitoring, Policy Service    
    PRI(A), SEC(M)
    ISE02    
    Administration, Monitoring, Policy Service    
    SEC(A), PRI(M)

    There is a lot to consider here.  If changing environments means changing IP Address and IP Scopes, then your policies, profiles, and dACLs would also have to change among other things.  If this is the case, create a new ISE VM in the new environment using the built in evaluation license and recreate the deployment from the old environment using the addressing scheme of the new environment.  Then spin-up a new Secondary node and register it on the Primary.  Once this is done, you can re-host the license from your old environment onto your new environment.  You can use this tool to re-host:
    https://tools.cisco.com/SWIFT/LicensingUI/loadDemoLicensee?FormId=3999
    If IP Addressing is to remain the same, it gets simpler. 
    First, and always, perform a configuration and operational backup.
    If downtime is not an issue, or if you have a maintenance window of an hour or so: Simply shut down both nodes.  Transfer them to the New Environment and turn them on, Primary Node first, of course.
    If downtime is an issue, shut down the Secondary Node and transfer it to the New Environment.  Start the Secondary Node and when it is up, shut down the Primary Node.  Once services on the primary node have stopped, promote the Secondary Node to Primary Node.
    Transfer the OLD Primary Node to the New Environment and turn it on.  It should assume the role of Secondary Node.  If it does not, assign that role through the GUI.
    Remember, the correct way to shut down an ISE node is:
    application stop ise
    halt
    By using these commands, the risk of database corruption decreases by about 90% (remember to always back up).
    Please Rate Helpful posts and mark this question as answered if, in fact, this does answer your question.  Otherwise, feel free to post follow-up questions.
    Charles Moreton

  • Best practices for data migration

    Hi All,
    This thread is for sharing ideas and knowledge on making the best use of data migration with BusinessObjects Data Services.


  • SAP Best Practices for Logistics Modules

    Dear Reader,
    Does anyone know how to find SAP Best Practices for NetWeaver 2004s logistics scenarios (SD, MM)?
    The old NetWeaver 2004 scenarios are less helpful, especially in inventory management.
    Thanks

    Check this link; this is the latest, I guess.
    [SAP Best Practices|http://help.sap.com/bp_bw370/html/index.htm]
    Cheers,
    Neel.
