Difference between audit data packages

Hi,
  What's the difference between AuditPurge.DTS and AuditClear.DTS?
  And is there any sort of purge or backup functionality for Activity Audit, since the few audit-related data packages are all for Data Audit? Or do we have to build a custom SSIS package of our own?
Cheers,
Lip Chean

Hi Halomoan,
  After some testing, I think AuditPurge is very similar to AuditClear. The only difference is that AuditPurge works together with the purge frequency that you set in BPC Web under "Manage Data Audit": if you set "Set Purge Frequency, Keep Audit Records for x days" and then run AuditPurge, everything outside of the last x days of audit data is cleared.
  Whereas AuditClear literally clears everything, without referring to the purge frequency.
  However, I am still seeking confirmation on whether there is similar functionality for Activity Audit.
Cheers,
Lip Chean

Similar Messages

  • BPC SP3 - Missing Example Data Package

    I am looking for the purge audit data package. The help file lists it as an example package, but I cannot seem to find it anywhere. Is there a place where I can download it?
    Thanks,
    Nick

    Hi Nick,
    to purge the audit data you don't need a package, because there is already a function for this in the Admin module.
    Under the Set Log Limits section, do one of the following:
    If you want to keep all audit data for the selected category in the database, leave the default, Never purge audit data, selected.
    If you want to purge audit records for the selected category after a specified number of days, select Set purge frequency, and enter the number of days for which to keep audit records in the database.

  • Purpose of Update/Overwrite Hierarchy in Import master data package

    Hi All,
    Please let me know the difference between the Update and Overwrite options in the Import Master Data package. When I tried running the package, both options altered the hierarchy of existing members.
    My requirement is just to add new member IDs: if a member already exists it should not be affected, and only the new master data records should be updated.
    Thanks & Regards,
    Ramanathan

    Hi Ramanathan,
    You can run the Import Master Data DM package in BPC to perform a delta master data load. When you run the DM package you get the option to load master data from BW/BI, where you can select:
    1. MERGE: this keeps the old data intact and also loads the new master data, and
    2. COPY AND REPLACE: this copies the new data and deletes the old master data.
    As per your requirement, you need to select the first option. If you select the Update option, it will not delete the old master data; instead it updates the members and hierarchy according to the parent-child design you have given in the dimension member sheet.
    Hope this corresponds to your requirement.
    Rgds,
    Poonam

  • Every 3rd data package taking long time for execution

    Hi Everyone
    We are facing a strange situation. Our scenario involves doing a full load from a DSO to a cube.
    The start routines are not very database intensive and care has been taken to write them in an optimized way.
    But strangely, every 3rd data package takes exceptionally longer than the other data packages.
    a) The DTP has 3 parallel processes.
    b) The time spent in extraction, rules and update is constant for every data package.
    c) The start routine time is larger for every 3rd data package and keeps on increasing, e.g. 5 mins, 10 mins, 24 mins, 33 mins; it increases with each 3rd package.
    I tried to analyze the data that was taking so much time but found no difference between the data in the normal and the long-running packages (i.e. there was no logical difference in the data for the start routine to behave like this).
    I was wondering what the possible reasons for this could be; perhaps some external system factors are responsible. If someone can help in this regard, that would be highly appreciated.

    Hi Hemanth,
    In your start routine, are you by any chance adding to or multiplying the number of records in the source_package? Something like: copy the source package into an internal table, add records to the internal table, and then copy it back to the source package? If logic of this sort is in your start routine, you need to refresh your internal table. Otherwise, the internal table's records go on increasing with every data package, so the processing time can increase as the load progresses. This is one common mistake I have seen. Please check your code for something like that and refresh the internal tables. See if this makes any difference.
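    For illustration, a minimal sketch of the fix; lt_buffer and the row type _ty_s_SC_1 are placeholders for whatever your routine actually uses:

    " Assumed: lt_buffer is declared in the routine's global part, which is
    " why it survives from one data package to the next.
    DATA lt_buffer TYPE STANDARD TABLE OF _ty_s_SC_1.

    " Inside start_routine:
    REFRESH lt_buffer.                 " clear leftovers from the previous package
    lt_buffer[] = SOURCE_PACKAGE[].    " work only on the current package

    " ... enrich / append records in lt_buffer ...

    SOURCE_PACKAGE[] = lt_buffer[].    " write the current package back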
    Thanks and Regards
    Subray Hegde

  • Data package and data packet

    Hi,
    I want to know the difference between a data package and a data packet, and where each comes into play in SAP BW.
    with regards
    tushar

    Hello,
    The term data package relates to DTPs, which are used to load data from the PSA to further data targets.
    Start and end routines work at the package level, so the routine runs for each package one by one. By default a package contains data sorted on keys (the non-unique keys (characteristics) of the source or target), and by setting semantic keys you can change this grouping. A package with more data will take more time to process than a package with less data.
    The term data packet relates to InfoPackages, which are used to load data from a source system into BI (the PSA).
    As per the SAP standard, we prefer to have 50,000 records per data packet.
    For every data packet, a commit and save is performed, so fewer data packets are preferable.
    If you have 100,000 records in one data packet and there is an error in the last record, the entire packet fails.
    Hope it helps!

  • Data package size

    What is the basic difference between RSCUSTV6 and SBIW -> General Settings -> Maintain Control Parameters with regard to modifying the data package size?

    Hi,
    Just see the help on
    Maintain Control Parameters for Data Transfer:
    1. Source System
    Enter the logical system of your source client and assign the control parameters you selected to it.
    You can find further information on the source client in the source system by choosing the path
    Tools -> Administration -> Management -> Client Maintenance.
    2. Maximum Size of the Data Package
    When you transfer data into BW, the individual data records are sent in packages of variable size. You can use these parameters to control how large such a data packet typically is.
    If no entry is maintained, the data is transferred with a default setting of 10,000 kBytes per data packet. The memory requirement depends not only on the settings of the data package, but also on the size of the transfer structure and the memory requirement of the relevant extractor.
    3. Maximum Number of Rows in a Data Package
    With large data packages, the memory requirement mainly depends on the number of data records that are transferred with the package. Using this parameter you control the maximum number of data records that the data package should contain.
    By default a maximum of 100,000 records are transferred per data package.
    The maximum main memory requirement per data package is approximately 2 × max. rows × 1000 bytes.
    4. Frequency
    The specified frequency determines the number of data IDocs after which an Info IDoc is sent, i.e. how many data IDocs an Info IDoc describes.
    Frequency 1 is set by default. This means that an Info IDoc follows every data IDoc. In general, you should select a frequency between 5 and 10, but no higher than 20.
    The bigger the data IDoc packets, the lower the frequency setting should be. In this way, during an upload you obtain information on the progress of the data load at relatively short intervals.
    With the help of every Info IDoc, you can check the BW monitor to see if there are any errors in the loading process. If there are none, then the traffic light in the monitor will be green. The Info IDocs contain information such as whether the respective data IDocs were uploaded correctly.
    5. Maximum number of parallel processes for the data transfer
    An entry in this field is only relevant from release 3.1I onwards.
    Enter a number larger than 0. The maximum number of parallel processes is set by default at 2. The ideal parameter selection depends on the configuration of the application server, which you use for transferring data.
    6. Background job target system
    Enter the name of the application server on which the extraction job is to be processed.
    To determine the name of the application server, choose
    Tools -> Administration -> Monitor -> System monitoring -> Server. The name of the application server is displayed in the column Computer.
    7. Maximum Number of Data Packages in a Delta Request
    With this parameter, you can restrict the number of data packages in a delta request or in the repetition of a delta request.
    Only use this restriction when you expect delta requests with a very high data volume, so that, despite sufficiently large data package sizes, more than 1000 data packages can result in a request.
    With an initial value or when the value is 0, there is no restriction. Only a value larger than 0 leads to a restriction in the number of data packages. For reasons of consistency, this number is not generally exactly adhered to. The actual restriction can, depending on how much the data is compressed in the qRFC queue, deviate from the given limit by up to 100.
    RSA6:
    Used to change the data packet size.
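    If you just want to check what is currently maintained, here is a small hedged sketch; the logical system name 'LOGSYS_XYZ' is a placeholder, and the parameters SBIW maintains are stored in table ROIDOCPRMS:

    " Read the transfer control parameters for one source system.
    DATA ls_prms TYPE roidocprms.

    SELECT SINGLE * FROM roidocprms INTO ls_prms
      WHERE slogsys = 'LOGSYS_XYZ'.
    IF sy-subrc = 0.
      WRITE: / 'Max package size (kB):', ls_prms-maxsize,
             / 'Max lines per package:', ls_prms-maxlines,
             / 'Info IDoc frequency:  ', ls_prms-statfrqu,
             / 'Parallel processes:   ', ls_prms-maxprocs.
    ENDIF.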
    Thanks
    Reddy

  • What data package is needed for enterprise connectivity?

    My wife and I both have iPhones. I have the original model and she has a 3G. We both currently have the $30 data package from AT&T. I get my MS Exchange email pushed to my phone from my company. They have set it up with remote wipe and all the other stuff. My wife is now trying to get her company to set up her iPhone to do MS Exchange stuff. They are telling her that the $30 data package is not good enough and that she now needs the $45 "enterprise data package" that AT&T offers. Is this true? What is the difference between the $30 personal data package and the $45 enterprise data package?

    The "difference" is who pays. If it's a corporate account, paid directly by the company to AT&T, it's the $45/month 'Enterprise' plan. If it's a personal account, even if used with an Exchange account on the company's server, it's the $30 'personal' plan.
    So, you ask, "What's the added 'feature' of the Enterprise data plan?" Simple - profit for AT&T. That's it.

  • Switching data packages

    Can my husband and I switch data plans? Currently, my husband has unlimited web and email and I have the 2 GB data package. We are on a family plan. By the way, is there any difference between unlimited web and email and the 2/5/10 GB data packages?

    You will only be able to switch data plans if you also switch numbers.

  • Creation of data packages due to a large number of datasets leads to problems

    Hi Experts,
    We have built our own generic extractor.
    When data packages are created (due to the large number of datasets), different problems occur.
    For example:
    Datasets are doubled and appear twice, once in package one and a second time in package two. Since those datasets are not identical, information is lost while uploading them to an ODS or cube.
    What can I do? SAP will not help, because it is a generic DataSource.
    Any suggestion?
    BR,
    Thorsten

    Hi All,
    Thanks a million for your help.
    My conclusion from your answers is the following.
    a) Since the ODS is Standard, no datasets are deleted within the transformation; they are aggregated.
    b) Uploading a huge number of datasets is possible in two ways:
       b1) with selection criteria in the InfoPackage and several uploads
       b2) without selection criteria in the InfoPackage and therefore an automatic split of the datasets into data packages
    c) both ways should have the same result within the ODS
    Ok. Thanks for that.
    So far I have only checked the data within the PSA. In the PSA the number of datasets is not equal for variants b1 and b2.
    I guess this is normal technical behaviour of BI.
    I am fine as long as the results in the ODS are the same for b1 and b2.
    Have a nice day.
    BR,
    Thorsten

  • Identify the last data package in start routine

    Hi Everyone
    We have a start routine in a transformation. We need to do some special processing in the start routine only when the last data package is executing. How can we determine in the start routine whether the current package is the last one or not? Any pointers in this direction are appreciated.

    Hi,
    You can get the packet id from datapackid in the start routine and the end routine. But I'm not so sure how to identify the last packet id; alternatively, you can store this packet id somewhere else and read the same value back in the end routine, if your logic permits doing the processing in the end routine instead of the start routine.
    METHODS
          start_routine
            IMPORTING
              request                  type rsrequest
              datapackid               type rsdatapid
            EXPORTING
              monitor                  type rstr_ty_t_monitors
            CHANGING
              SOURCE_PACKAGE              type tyt_SC_1
            RAISING
              cx_rsrout_abort.
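    As a hedged sketch of that "store it somewhere" idea (the memory id 'ZLAST_PACKID' is a placeholder, and ABAP memory is per work process, so this only helps when the packages run serially in one process):

    " In start_routine: remember the id of the package being processed.
    EXPORT datapackid = datapackid TO MEMORY ID 'ZLAST_PACKID'.

    " In end_routine: read the stored id back.
    DATA lv_last_packid TYPE rsdatapid.
    IMPORT datapackid = lv_last_packid FROM MEMORY ID 'ZLAST_PACKID'.
    IF sy-subrc = 0.
      " lv_last_packid holds the id of the most recently started package;
      " it is not a guaranteed 'last package' flag.
    ENDIF.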
    hope it helps...
    regards.
    Raju

  • Problems with the O2 BlackBerry data package on my Curve 3G.

    I have already informed O2 about this, but they claim that I should be using the BlackBerry support services, and nothing there helps me!
    I got my BlackBerry Curve 3G on September 9th this year and added the BlackBerry Data Package bolt-on to my phone on September 16th. I then received a text to say they had taken £5 from my credit and it would be up and running within the next 24 hours. It is now September 19th, my BBM is not working at all, and I am extremely upset with the service and behaviour I have received from both O2 and BlackBerry.
    Is there any way you can help? If this fails, I shall be forced to go back to the shop where I got my BlackBerry and ask for their help.
    Many thanks, Jade.


  • Data package is missing in the return structure

    Hi BW Folks,
    I have an issue with ODS activation. While activating the data in the ODS object I am getting the following error message:
    Activation of data records from ODS object XXXX terminated.
    Data package XXXXX contains errors with status 9 in table 'XX', but this data package is missing in the return structure.
    In detail: the data package is entered in the return structure as incorrect.
    Can anyone provide me the solution. Thanks in advance. Have a nice time!
    Regards,
    Nani.

    HI
    Check these links
    Re: Status 9 error when activating an ODS in a Process Chain
    ODS activation error - status 9
    Error while data loading-terminated with Status 9
    hope it helps
    regards
    CK
    Assign points if useful

  • DTP does not fetch all records from Source, fetches only records in First Data Package.

    Fellas,
    I have a scenario in my BW system where I pull data from a source using a Direct Access DTP (it does not extract from the PSA; it extracts directly from the source).
    The source is a table in the Oracle DB, and using a DataSource and a Direct Access DTP, I pull data from this table into my BW InfoCube.
    The DTP's package size has been set to 100,000, and whenever this load is triggered, the data records from the source table are fetched in multiple data packages. This has been working fine and works fine now as well.
    But, very rarely, the DTP fetches 100,000 records in the first data package and fails to pull the remaining data records from the source.
    It ends with the message "No more data records found", even though there are records waiting to be pulled. This DTP step in the process chain does not even fail; it continues to the next step with a "Green" status.
    Have you faced a similar situation in any of your systems? What is the cause? How can this be fixed?
    Thanks in advance for your help.
    Cheers
    Shiva

    Hello Raman & KV,
    Thanks for your Suggestions.
    Unfortunately, I will not be able to implement any of your suggestions because I am not allowed to change the DTP settings.
    So I am working on finding the root cause of this issue and came across SAP Note 1506944 - Only one package is always extracted during direct access, which says this is a program error.
    Hence, I am checking further with SAP on this and will share their insights once I hear back from them.
    Cheers
    Shiva

  • Same set of Records not in the same Data package of the extractor

    Hi All,
    I have got one scenario. While extracting records from ECC, based on some condition, I want to add some more records to the extraction. To be more clear: based on some condition I want to add additional lines of data with APPEND C_T_DATA.
    For example:
    I have a set of records with the same company code, the same contract, the same delivery leg and different pricing legs.
    If the delivery leg and the pricing leg are both 1, then I want to add one line of record.
    There will be several records with the same company code, contract, delivery leg and pricing leg. In the extraction logic I will extract with the command i_t_data[] = c_t_data[], then sort by company code, contract, delivery leg and pricing leg, then DELETE ADJACENT DUPLICATES to get one record; based on this record, with some condition, I will populate the new line of record that my business needs.
    My concern is:
    if the same set of records overshoots the data package size, how do I handle this? Is there any option?
    My data package size is 50,000. Suppose I get a set of records with the same company code, contract, delivery leg and pricing leg starting at the 49,999th record. If there are 10 records with the same characteristics, the extraction will happen across 2 data packages, and the delete-duplicates step and the above logic will go wrong. How can I handle this scenario? Would a delta-enabled function module help me tackle this? I want to do it only in extraction, as a DataSource enhancement.
    Anil.

    Hi,
    You will have to do an enhancement of the DataSource.
    Please follow the link below.
    You can write your logic to add the additional records in the CASE statement for your DataSource.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035c402-3d1a-2d10-4380-af8f26b5026f?quicklink=index&overridelayout=true
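    As a rough sketch of that CASE-statement approach (the DataSource name, the extract structure zoxid0001 and its field names are all placeholders), in the user exit include ZXRSAU01 of EXIT_SAPLRSAP_001 the logic could look like this:

    CASE i_datasource.
      WHEN 'ZMY_DATASOURCE'.         " placeholder DataSource name
        DATA: lt_data TYPE STANDARD TABLE OF zoxid0001,
              ls_new  TYPE zoxid0001.

        lt_data[] = c_t_data[].
        SORT lt_data BY bukrs contract del_leg price_leg.
        DELETE ADJACENT DUPLICATES FROM lt_data
               COMPARING bukrs contract del_leg price_leg.

        LOOP AT lt_data INTO ls_new WHERE del_leg = 1 AND price_leg = 1.
          " derive the additional business line from the representative record
          APPEND ls_new TO c_t_data.
        ENDLOOP.
    ENDCASE.

    Note that the exit is called once per data package, so a key group split across two packages (the 49,999th-record case above) still needs extra handling, for example buffering the tail of each package across calls.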
    Hope this will solve your issue.

  • Can we have a Family Data Package, Please?

    I have 4 people on our wireless service with Verizon.  Until recently, only I had a data plan.  Then I got my kids new phones and they got data plans.  Now my husband wants a data plan when he is eligible for a new phone in January.  All these plans are getting REALLY expensive.  How about coming out with a family data share plan, similar to the shared minutes plan?
    Thank you!

    TheGreatOne wrote:
    spottedcatfish wrote:
    They didn't feel like coercing people to add texting packages was an appropriate move, so instead they adjusted the price point. We should all feel deeply blessed that instead of requiring a texting package as well as a data package on most high-quality phones, they just made it so affordable that I'm sure most people grab it by default. Fortunately for Verizon, the profit margins on texting are even higher than on data, because even if you use it a ton, it still costs them next to nothing to provide to you.
    Even if Verizon had required messaging packages instead of data packages, you would still most likely get people complaining. Probably people then saying something like "oh I don't use messaging on my phone ever. I don't need it."
    And rightly so. Nobody should be forced to buy anything they don't need or use. Is that not common sense?
