Why should we Purge Audit data?

Hi
I have read several threads about purging audit details like the one below
Re: OWB purge audit
I want to know what the benefits of purging the audit details are. Please tell me why audit details are important and what role they play in the performance of OWB.
Regards
Vibhuti

Hi,
not exactly - you have the same problems in deployment (the Control Center must also select the right records in growing tables to check whether the deployment was OK) and execution (the package must get the right status and bad-record entries).
But the main problem I have, in my experience, is with the performance of the Audit Browser.
Regards,
Detlef

Similar Messages

  • Why should we load header data first and then we load item level data?

    Hi BW gurus,
    I have a small confusion about data loading.
    Why should we load header data first and then we load item level data?
    Is there any particular reason?
    Scenario: first I uploaded 2LIS_11_VAHDR sales document header data from R/3 to BW using LO Cockpit extraction. Then I loaded 2LIS_11_VAITM. This is the normal procedure which we follow.
    I have a question: if I load the 2LIS_11_VAITM data first from R/3 to BW and then load 2LIS_11_VAHDR using LO Cockpit extraction, what will happen?
    Regards,
    Venkat
    Edited by: VENKAT BOORUGADDA on Aug 12, 2008 11:51 AM

    There is no difference in doing it the other way.
    The load sequence will come into play only during activation: if you map the same fields from the two DataSources, you might want the previous value to be overwritten by data from the next DataSource.
    That is when you should care about loading one DataSource before the other.
    To answer your question: it is not a rule that header data should come first.

  • XI R2 Auditing data retention period

    Hi,
    We are using XI R2 SP2 FP5 with a SQL Server database and want to limit the amount of audit data held, ideally by date, i.e. only 6 months of data.
    I would expect a setting somewhere to say keep x days of data but can't find one and can't find any reference to it in documentation or on these forums.
    Any help much appreciated.
    John

    Hello,
    There is no way to restrict/purge audit data out of the box. You could, however, purge the data in your database as described in SAP Note 1198638 - How to purge the BO AUDIT tables leaving only the last 6 months of data. I.e.:
    To purge the BO AUDIT tables, leaving only the last 6 months of data, delete from AUDIT_DETAIL with a join to AUDIT_EVENT, selecting the rows whose date is older than 6 months. Then delete the same period from AUDIT_EVENT.
    Note that this is apparently not supported; see SAP Note 1406372 - Is it possible to purge Audit database entries?
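    A rough sketch of the kind of delete the note describes, assuming a SQL Server audit database (Event_ID and Start_Timestamp are assumed column names - verify them against your AUDIT_EVENT and AUDIT_DETAIL schema, stop auditing and take a backup before running anything like this):
    -- Assumed columns: AUDIT_EVENT(Event_ID, Start_Timestamp), AUDIT_DETAIL(Event_ID)
    -- Delete the detail rows for events older than 6 months
    DELETE d
      FROM AUDIT_DETAIL d
      JOIN AUDIT_EVENT  e ON e.Event_ID = d.Event_ID
     WHERE e.Start_Timestamp < DATEADD(month, -6, GETDATE());
    -- Then delete the events themselves for the same period
    DELETE FROM AUDIT_EVENT
     WHERE Start_Timestamp < DATEADD(month, -6, GETDATE());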
    Best,
    Srinivas

  • Why should we create an index on the table after inserting data?

    Please tell me the reason why we should create an index on a table after inserting the data,
    when we could also create the index on the table before inserting the data.

    The choice depends on a number of factors, the main one being how many rows are going to be inserted into the table as a percentage of the existing rows, or the percentage growth.
    Creating the index after the table has been populated works better when the tables are large or the inserts are large, for the following reasons:
    1. The sort and creation of the index is more efficient when done in batch and written in bulk, so it works faster.
    2. As the index is written, blocks get acquired as more data gets written. So, when a large number of rows get inserted into a table that already has an index, the index data blocks start splitting/chaining. This increases the "depth" of the B-tree and makes the index less efficient on I/O. Creating the index after the data has been inserted allows Oracle to create an optimal block distribution and reduce splitting/chaining.
    3. If an index exists then it too is routed through the undo/redo processing. That's an overhead which is avoided when you create the index after populating the table.
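    A minimal illustration of the load-then-index pattern (the table, column and index names here are made up, and NOLOGGING/PARALLEL are optional choices that depend on your environment):
    -- Hypothetical example: bulk-load the data first...
    INSERT /*+ APPEND */ INTO sales_hist
      SELECT * FROM sales_stage;
    COMMIT;
    -- ...then build the index in a single pass over the populated table
    CREATE INDEX sales_hist_dt_idx ON sales_hist (sale_date)
      NOLOGGING PARALLEL 4;
    -- Optionally reset the index attributes afterwards
    ALTER INDEX sales_hist_dt_idx LOGGING NOPARALLEL;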
    Regards

  • Scheduling purge signon audit data

    Hi all,
    I have R12.1.3. I want to schedule 'Purge Signon Audit Data' to run once every month. With 'Purge Signon Audit Data' the default date that shows up is the current date, but I want to replace it with the current date minus 2 months. It is easy to do that when I run it manually, but when scheduling it, the only option I can think of is replacing the value set FND_STANDARD_DATE with a custom value set under the Audit date parameter. Is there a better/easier way to change the default date to the current date minus 2 months?
    Thanks,

    DBA_EBiz_EBS wrote:
    Is there a better/easier way to change the default date to the current date minus 2 months?
    Schedule the request to run once every month (Re-run Every Month) from two months ago and enable "Increment date parameters each run".
    If your request contains date parameters, you can choose "Increment date parameters each run" to have the value for that parameter be adjusted to match the resubmission interval. For example, if the value for the parameter is 25-JUL-1997 07:00:00 and your interval is monthly, the parameter is adjusted to 25-AUG-1997 07:00:00 for the next submission.
    http://docs.oracle.com/cd/A60725_05/html/comnls/us/fnd/10gch606.htm
    Thanks,
    Hussein

  • Purging Old Data

    Please help me in solving an issue related to purging data.
    There are some 40 tables in the database I am working in, and I need to check these tables for data that is more than 5 years old and delete it, but these tables are interlinked with other tables (parent-child relations). How should I proceed with this?
    I thought of the "with cascade" option, but what if a related record in another table is less than 5 years old (data within 5 years should not be deleted)?

    Aparna16 wrote:
    Please help me in solving an issue related to purging data.
    Interesting problem... and one that I, wearing my DBA hat, will throw back at the developers.
    They know the data model. This request for purging old data is very likely to occur again in a year's time. What then? Go through a painful exercise again (this time taking data model changes since the last time into consideration)?
    I do not see the logic in that. So instead I will throw this back at the developers and tell them that a PL/SQL package needs to be designed and written to purge old data (a sketch of one possible shape for this follows at the end of this reply). The DBA input into this will be in terms of design. Can a purge be done as a single massive delete transaction? Or does it make more sense to design the purge for a specific date range, or for a specific product/invoice/customer/whatever business entity that can be aged, and then run multiple such purges in parallel?
    And why the purge? Is it to free up space? That may not be the case, depending on the pctfree/pctused settings on a data block. So do some tables perhaps need to be rebuilt after the purge in order to reorganise the table and free space?
    That's the DBA side of the problem. Figuring out what data can be deleted from which tables and writing code to do that - that's a developer problem.
    So whichever side you are on, you need to make sure you use the other side to assist you in this task.
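    As a rough illustration of what such a purge package could run (orders/order_lines are hypothetical tables, and the 5-year cut-off and child-first ordering are just one possible design):
    -- Hypothetical parent-child pair: order_lines references orders
    -- Delete the children first so the foreign keys are not violated
    DELETE FROM order_lines
     WHERE order_id IN (SELECT order_id
                          FROM orders
                         WHERE order_date < ADD_MONTHS(SYSDATE, -60));
    -- Then delete the aged parent rows
    DELETE FROM orders
     WHERE order_date < ADD_MONTHS(SYSDATE, -60);
    COMMIT;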

  • Audit database: configure auditing data source (DB2)

    Hello expert,
    I want to enable the audit database for BusinessObjects and I have followed the admin guide, but it does not work.
    I installed BusinessObjects Enterprise XI 3.1 on AIX 5.3.
    When I installed BO, I chose 'Use an existing database' and chose DB2 (the same database as the SAP BW database).
    And when the installation required the information for the CMS and Audit database, I filled in the same DB alias as the SAP BW database.
    So now I have the SAP BW, CMS and Audit data in the same database.
    After installation I saw the CMS tables in the "DB2<aliasName>" schema.
    But I cannot find the audit tables.
    Will the audit tables be created after installation?
    Then I tried to enable the audit database using cmsdbsetup.sh: I chose 'selectaudit' and filled in the information that it requires.
    It finished with no error:
    "Select auditing data source was successful. Auditing Data Source setup finished."
    But I still cannot find any audit tables in the database.
    I ran serverconfig.sh and I can't see the Enable Audit option when I choose 'modify a server'.
    Any idea?
    Thanks in advance.
    Chai

    Hello,
    Thanks for your reply.
    It is not a BO cluster.
    The log detail from when I selected the audit data source is shown below:
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) /bodev/bobje/setup/jscripts//ccmunix.js started at Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST)
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) About to process commandline parameters...
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Finished processing commandline parameters.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No password supplied, setting password to default.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No username supplied, setting username and password to default.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No authentication type supplied, setting authentication type to default.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No CMS name supplied, setting CMS name to machine name.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Select auditing data source was successful.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) /bodev/bobje/setup/jscripts//ccmunix.js started at Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST)
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) About to process commandline parameters...
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Finished processing commandline parameters.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No password supplied, setting password to default.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No username supplied, setting username and password to default.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No authentication type supplied, setting authentication type to default.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No CMS name supplied, setting CMS name to machine name.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Select auditing data source was successful.
    And the CMS log file did not show any error.
    Additional detail:
    - My BW and BO are in the same server.
    - I have already granted all the rights to the user related to the audit database.
    - My BW and BO are in the same database.
    - No audit tables appear in the database.
    - No Fix Pack is installed.
    I wonder why the BO audit connection does not see my database.
    (In the case of DB2, I think the DB2 alias name is by default the same as the database name.
    So if my database name is BWD, then the database alias name should be BWD, am I right?)
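    For reference, I can at least check which tables exist in the schema with a catalog query like the one below (the schema name 'BOAUDIT' is just an example - it should be whichever schema the auditing data source user writes to):
    -- List the tables in the schema the auditing data source should be writing to
    SELECT TABSCHEMA, TABNAME
      FROM SYSCAT.TABLES
     WHERE TABSCHEMA = 'BOAUDIT'
     ORDER BY TABNAME;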
    Any idea?
    Thanks in advance.
    Chai

  • Analysing Task Audit, Data Audit and Process Flow History

    Hi,
    Internal Audit dept has requested a bunch of information that we need to compile from the Task Audit, Data Audit and Process Flow History logs. We do have all the info available, however not in a format that allows proper "reporting" of the log information. What is the best way to handle HFM logs so that we can quickly filter and export the required audit information?
    We do have housekeeping in place, so the logs are partly "live" db tables and partly purged tables that were exported to Excel to archive the historical log info.
    Many Thanks.

    I thought I posted this Friday, but I just noticed I never hit the 'Post Message Button', ha ha.
    The info below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly, or move them to another appropriate data table for analysis later. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
    I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information, as it is not 'human readable' in the database.
    For instance, if you wanted to pull Metadata Load, Rules Load and Member List Load activity, you could run a query like the one below. (NOTE: @strAppName should be equal to the name of your application ....)
    The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to which user-friendly name.
    -- Declare working variables --
    declare @dtStartDate as nvarchar(20)
    declare @dtEndDate as nvarchar(20)
    declare @strAppName as nvarchar(20)
    declare @strSQL as nvarchar(4000)
    -- Initialize working variables --
    set @dtStartDate = '1/1/2012'
    set @dtEndDate = '8/31/2012'
    set @strAppName = 'YourAppNameHere'
    --Get Rules Load, Metadata, Member List
    set @strSQL = '
    select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (1)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (21)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (23)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
    exec sp_executesql @strSQL
    In regards to activity codes, here's a quick breakdown on those ....
    ActivityID     ActivityName
    0     Idle
    1     Rules Load
    2     Rules Scan
    3     Rules Extract
    4     Consolidation
    5     Chart Logic
    6     Translation
    7     Custom Logic
    8     Allocate
    9     Data Load
    10     Data Extract
    11     Data Extract via HAL
    12     Data Entry
    13     Data Retrieval
    14     Data Clear
    15     Data Copy
    16     Journal Entry
    17     Journal Retrieval
    18     Journal Posting
    19     Journal Unposting
    20     Journal Template Entry
    21     Metadata Load
    22     Metadata Extract
    23     Member List Load
    24     Member List Scan
    25     Member List Extract
    26     Security Load
    27     Security Scan
    28     Security Extract
    29     Logon
    30     Logon Failure
    31     Logoff
    32     External
    33     Metadata Scan
    34     Data Scan
    35     Extended Analytics Export
    36     Extended Analytics Schema Delete
    37     Transactions Load
    38     Transactions Extract
    39     Document Attachments
    40     Document Detachments
    41     Create Transactions
    42     Edit Transactions
    43     Delete Transactions
    44     Post Transactions
    45     Unpost Transactions
    46     Delete Invalid Records
    47     Data Audit Purged
    48     Task Audit Purged
    49     Post All Transactions
    50     Unpost All Transactions
    51     Delete All Transactions
    52     Unmatch All Transactions
    53     Auto Match by ID
    54     Auto Match by Account
    55     Intercompany Matching Report by ID
    56     Intercompany Matching Report by Acct
    57     Intercompany Transaction Report
    58     Manual Match
    59     Unmatch Selected
    60     Manage IC Periods
    61     Lock/Unlock IC Entities
    62     Manage IC Reason Codes
    63     Null

  • How to purge audit tables in 11.2.01

    Hi there,
    Navigation within OWB - in the Design Center, the Control Center and the Repository Browser - is far slower than in other development tools on my PC.
    Is there a way to purge the OWB audit data so that at least deploying mappings and checking the Repository Browser may be quicker?
    Many Thanks

    Here are some options that you can try
    +Increase the Java heap memory
    1. Edit the owb.conf file located in the ORACLE_HOME/owb/bin directory.
    2. Modify the value in the line "AddVMOption -Xmx768M" to increase the Heap Memory for Java, based on your physical memory.
    3. Restart the Design Center and re-try the MDL import.
    You can set a higher value such as 1024M; the maximum value you can set depends on your RAM and should be less than your physical memory.
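    As a sketch only (the right number depends on how much RAM your machine has), the relevant line in owb.conf would change from:
    AddVMOption -Xmx768M
    to something like:
    AddVMOption -Xmx1024M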
    +Optimize Repository
    In the Design Center, go to the 'Tools' menu and select 'Optimize Repository'.
    +If the issue is with the underlying network or with the database, you may need to check with the respective admins.
    +For the Control Center performance issue you can consider applying the cumulative patch 10270220, but I suggest checking with Oracle Support before applying the patch.
    Regards,
    Pnreddy

  • Should I use a data retrieval service or software to recover data ?

    Please pardon the length of this post.
    Here's the setup for our office:
    Computer 1:
    10.4.8 OS X
    1 GHZ PowerPC G4: silver grey tower, grey apple
    1MB L3 Cache
    256 MB DDR SDRAM
    Computer 2:
    10.4.8 OS X
    Dual 450 MHZ PowerPC G4: blue grey tower, blue apple
    256 MB SDRAM
    Computer 3:
    10.4.8 OS X
    1 GHZ PowerPC G4 IMac Flat Screen:
    256 MB DDR SDRAM
    I have 2 LaCie Big Disk d2 Extremes daisy chained and connected to the IMac. We use the first to store all of our data to keep our local disks free. The second d2 is the backup to the first. The other 2 computers connect to the d2's via an ethernet hub. The d2's are each partitioned into 4 compartments.
    A couple of days ago I started the system up when I got in in the morning, and the main d2 would not open. I ran Disk Utility, but it said that the drive was damaged beyond its ability to repair. I ran DiskWarrior, and it gave me this message:
    "The directory of disk 'G4' cannot be rebuilt. The disk was not modified. The original directory is too severely damaged. It appears another disk utility has erased critical directory information. (3004, 2176)."
    I contacted DiskWarrior tech support and, after a series of exchanges that had me send him data extracted from Terminal, he said this:
    "It appears that the concatenated RAID inside your LaCie
    drive has failed (your 500GB drive is actually 2 250GB
    hard drives). That is why we only saw 2 partitions
    on "disk3". A possible cause could be a failed bridge
    in the case.
    You may be looking at sending this drive to a data recovery service.
    However, it is possible that we may be able to recover data
    from the partitions that we CAN see. What we would be doing would cause no damage to your data
    unless the hard drives were having a mechanical failure (ie, the
    head crashed and was contacting the platters, similar to scratching
    a record). But from what I've seen, I don't feel that is the case.
    I believe the piece of hardware that 'bridges' the two drives to
    make them act as one has failed. that's why we can only see data
    about 1 of the 2 drives in the case.
    We would only be attempting to gather data off the drive. Since
    data recovery services sometimes charge for amount of data retrieved,
    it's up to you how you want to proceed."
    Most of our business data from the past 5 years stands to be lost. Only some of it had been properly backed up on the second drive, due to some backup software issues. I want to do whatever I can to retrieve all, or at least some, of the data. From what the Alsoft technician said, do you think that consumer data recovery software is going to be robust enough to retrieve at least the data from the one disk in the drive that is recognizable (there are two 250 GB disks in the d2X; only one is responding at all)? If so, does this software further damage the disks? Or should I just send the drive to a data recovery service?
    I'd like to try to extract some of it myself with over-the-counter retrieval software, but I don't know whether to trust these programs.
    Any advice would be greatly appreciated.
    Thanks in advance.
    Peter McConnell

    Peter
    My 2 cents:
    I have used FileSalvage
    http://www.subrosasoft.com/OSXSoftware/index.php?mainpage=product_info&productsid=1
    to recover files from damaged disks. It works as advertised, within limits. Some files may be too damaged to recover. More importantly, you get to scan the disk before actually recovering, and it will give you a list of what it thinks it can recover.
    My experience was that it recovered approx 85% of the data.
    YMMV but they do have a trial.
    Regards
    TD

  • I have Creative Cloud installed and Photoshop installed, but I can not open the program itself. No idea why, everything is up to date and i was using it just fine yesterday.

    I have Creative Cloud installed and Photoshop installed, but I can not open the program itself. No idea why, everything is up to date and I was using it just fine yesterday.

    Try resetting the Photoshop preferences. Use your Photoshop start icon to launch Photoshop and immediately press and hold Shift+Ctrl/Cmd+Alt/Option. Photoshop should prompt you with "Do you want your preferences deleted?" Reply yes. Photoshop should then delete your user's preferences, create a default set of preferences for your ID, and start successfully.

  • The uploaded file CPSIDRPT.xml is invalid. The file should be in XML-DATA-TEMPLATE format

    Can someone explain to me why I am getting the following error message when I'm trying to upload a data definition file?
    The uploaded file CPSIDRPT.xml is invalid. The file should be in XML-DATA-TEMPLATE format.
    thanks........
    ketki----

    Based on the detail (or lack thereof) that you have provided, your file is not in the right format.
    Can you open the file in an internet browser? Is it well-formed XML?
    Edited by: SIyer on Dec 1, 2009 4:33 PM

  • Why should we go for ODI?

    Hi,
    I know Informatica 9.1.0. Now I am learning ODI, so I have some questions.
    I am working with Hyperion, and ODI is used with Hyperion to fetch data from any source system.
    I have a few questions in my mind related to ODI.
    Why should I go for ODI? Or when should I use ODI?
    What benefits does ODI offer that are not available in other tools?
    Thanks

    It might be worth starting to read through the features of ODI and the related documentation to understand its strengths: http://www.oracle.com/technetwork/middleware/data-integrator/overview/index.html
    It is Oracle's strategic integration product, so if you are working with EPM products then you will find more features than with Informatica.
    I will let someone else provide information on when to use it, because I have been here before on many occasions; it all depends on what products you currently have, what your source/target systems are and what your objectives are, as to whether it is for you.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Why should we queue?

    Hello Architects and Software enginners,
    I think things that may be needed in one scenario may not be crucial in another; in other words, I don't believe a common solution exists for all situations.
    So here is my scenario, and my question is: do I need to use the queuing mechanism in this case? We have an Oracle EBS 11.5.10 system with DB 10.2.0.1.
    We have to exchange cXML documents with a vendor. For outbound, we need to transform table data into a cXML document and send it to the vendor via https. For inbound, we need to get cXML docs from the vendor via https and create a flat file in our system to load into our EBS interface tables. All of this will be done in batch/asynchronous mode.
    For outbound, the data is already in the EBS tables. Why can't I loop through to get the records, create the cXML documents (say using XML DB) and use utl_http to post them to the vendor on the fly, on a schedule of say 10 times a day? Another option would be to create the cXML when the records are created, store it in a temporary table as a CLOB, and then post it to the vendor per the schedule. Why do I need to use queuing, when queuing seems to do the same thing - store in a queue table, dequeue it from there and process it?
    Similarly, for inbound, why can't I get the cXML from the vendor, store it in a temp table as a CLOB, extract the text and create my file? Why should I use queueing? It just seems to be additional processing.
    As you can see it is a very basic question so let the opinions, ideas, critiques, disagreements flow :-)
    I highly appreciate your input as it will help determine how we architect our solution.
    Thanks
    Sandeep
    Edited by: user11992646 on Feb 20, 2010 1:00 PM

    Hi Damorgan,
    I appreciate very much you taking the time to reply... on a weekend! As a technologist I understand your point. But as business customers we have to look out for our best interests, so based on business realities I will disagree with some of your statements. I don't want to deviate from learning from you about the need for AQ on this project. I wish we had met at OpenWorld so that we could have exchanged rants face to face; your EBS colleagues were targets of mine :-)
    Now to your question. AQ is for real-time not batches. ...If you are going to batch it ... AQ is more technology than you need.
    How is AQ real time? I am not trying to argue here, just trying to understand, so please bear with me and correct me if I am wrong. If you are queuing something, it means that you want to do something later, however many milliseconds afterwards that may be. I thought that was the whole point of queuing: users can continue doing what they are doing and leave the intensive processing to be done later without waiting online. Isn't this simply a batch that runs more often? A listener may be listening, wake up every once in a while and process the messages in batch - again, I don't know how this works, but I anticipate something like this happens, so please correct me. If there are disparate messages coming from many systems, queuing makes sense in an EAI architecture. But in a point-to-point setup, as in this project, what is the point? :-)
    Also, can I queue up cXML-type messages in CLOBs? I read somewhere that only native Oracle XML can be queued up in AQ. An example of cXML is below, and a rough sketch of what I have in mind follows it.
    I would recommend that if your organization wishes to be competitive. And if you wish to have skills that are competitive in the marketplace you (your organization) start taking this technology seriously. Patch to 10.2.0.4 immediately and upgrade to 11.1.0.7 by August. Then throw away the antiquated batch mode model, use AQ, and communicate with your vendors in a more timely and efficient manner.
    I would agree with doing away with the batch if I understood AQ better; from what I have read, I have explained my understanding of AQ above. We have had regular issues with the EBS Workflow queueing mechanisms for e-mail notifications, where things get piled up for no apparent reason, and when we open an SR the inevitable result is "rebuild the queue, bounce Apache, bounce the concurrent managers, etc..." with no real resolution, which makes me hesitant to use AQ for financial documents.
    Also, we will be upgrading to 11 soon, but not soon enough for this project.
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE cXML SYSTEM "http://xml.cXML.org/schemas/cXML/1.2.021/Fulfill.dtd">
    <cXML payloadID="[email protected]"
    xml:lang="en-CA" timestamp="2000-10-14T08:39:29-08:00">
    <Header>
    <From>
    <Credential domain="DUNS">
    <Identity>942888711</Identity>
    </Credential>
    </From>
    <To>
    <!-- The buying marketplace and member organization. -->
    <Credential domain="Networkuserid" type="marketplace">
    <Identity>[email protected]</Identity>
    </Credential>
    <Credential domain="NetworkUserId">
    <Identity>[email protected]</Identity>
    </Credential>
    </To>
    <Sender>
    <!-- The supplier -->
    <Credential domain="DUNS">
    <Identity>942888711</Identity>
    <SharedSecret>coyote</SharedSecret>
    </Credential>
    <UserAgent>Workchairs Order Entry</UserAgent>
    </Sender>
    </Header>
    <Request deploymentMode="test">
    <ShipNoticeRequest>
    <ShipNoticeHeader shipmentID="S89823-123" noticeDate="2000-10-14T23:59:20-08:00"
    shipmentDate="2000-10-14T08:30:19-08:00"
    deliveryDate="2000-10-18T09:00:00-08:00">
    <Contact role="shipFrom">
    <Name xml:lang="en-CA">Workchairs, Vancouver</Name>
    <PostalAddress>
    <Street>432 Lake Drive</Street>
    <City>Vancouver</City>
    <State>BC</State>
    <PostalCode>B3C 2G4</PostalCode>
    <Country isoCountryCode="CA">Canada</Country>
    </PostalAddress>
    </Contact>
    <Comments xml:lang="en-CA">Got it all into one shipment.
    </Comments>
    </ShipNoticeHeader>
    <ShipControl>
    <CarrierIdentifier domain="SCAC">UPS</CarrierIdentifier>
    <CarrierIdentifier domain="companyName">United Parcel Service
    </CarrierIdentifier>
    <ShipmentIdentifier>8202 8261 1194</ShipmentIdentifier>
    </ShipControl>
    </ShipNoticeRequest>
    </Request>
    </cXML>
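    For reference, what I have in mind (if AQ can carry a CLOB payload at all) is roughly the following - the type, queue and payload names are all made up for illustration:
    -- Hypothetical object type carrying the cXML document as a CLOB
    CREATE OR REPLACE TYPE cxml_msg_t AS OBJECT (doc CLOB);
    /
    -- One-time setup of a queue table and queue for outbound cXML
    BEGIN
      DBMS_AQADM.CREATE_QUEUE_TABLE(
        queue_table        => 'cxml_out_qt',
        queue_payload_type => 'cxml_msg_t');
      DBMS_AQADM.CREATE_QUEUE(
        queue_name  => 'cxml_out_q',
        queue_table => 'cxml_out_qt');
      DBMS_AQADM.START_QUEUE(queue_name => 'cxml_out_q');
    END;
    /
    -- Enqueue one outbound cXML document
    DECLARE
      l_enq_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
      l_msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
      l_msg_id    RAW(16);
    BEGIN
      DBMS_AQ.ENQUEUE(
        queue_name         => 'cxml_out_q',
        enqueue_options    => l_enq_opts,
        message_properties => l_msg_props,
        payload            => cxml_msg_t(doc => '<cXML>...</cXML>'),
        msgid              => l_msg_id);
      COMMIT;
    END;
    /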

  • Why should we use a DSO instead of an InfoCube in BW 7.4 - SAP BW 7.4

    Dear All,
    Now I'm working with HCM Payroll (BW reports) standard reports. The data flow is DataSource ---> InfoCube ---> Reports, but my architect has given the data flow DataSource ---> DSO ---> MultiProvider ---> Reports. This is in SAP BW 7.4. Could anyone explain why we should use a DSO instead of an InfoCube?
    Thanks for your help...
    Regards,
    Narasimha

    Hi
    For loading purposes:
    It is better to use a DSO, because it involves only 3 tables, i.e. the new data table, the active data table and the change log table.
    In the case of an InfoCube, more tables are involved: by default SAP creates 3 dimension tables, and we will create at least 2 more, i.e. 1 dimension table and 1 fact table, so it involves 5 tables (3 default + 1 dimension + 1 fact table).
    Conclusion: loading into an InfoCube takes more time than loading into a DSO because it involves more tables.
    Conclusion: best practice is to load data into a DSO target.
    For reporting:
    A cube gives summarised data in reporting and is multidimensional, i.e. we can analyse the data in multiple ways.
    A DSO gives detailed data in reporting and is two-dimensional, i.e. we can analyse the data in a two-dimensional way only.
    Conclusion: best practice for reporting is to use an InfoCube.
    Regards
    Raj
