Number of data records with Data Marts

Hi,
I'm transporting a cube with a fact table of 14,000 records into another BW system.
When checking the load, it only processes 12,000 records.
I took a look at the IDoc in the source system: also 12,000 records.
How is this possible? There is only one load (meaning only one DimID for the package). All records should be unique in the F table, so I expect 14,000 records (there are also no routines in between).
Best regards,
Brian

Hi,
   Check at InfoPackage level only. Compare the data in both BW systems and take a look at which data is missing. Before loading, load master data first. Take the case that order number 100 is missing: go to the sending BW system and use the extractor checker there. If you can find the data, it should come through to the other BW system. Analyse the settings carefully and you will find the solution. In the worst case (if the data is urgent), go with a pseudo delta.
all the best.
nagesh.

Similar Messages

  • How can I calculate the number of members with data for a given Dimension

    Hi. I want to be able to calculate the exact size of the data given the number of blocks in existence. I am trying to work out a way of estimating the effect of changing dimensions from sparse to dense and to see which dimensions would be worth it. Currently I have Cost Centre and Reference Code dimensions. I know that roughly 90% of the Cost Centres are used against just one member of Reference Codes, but the other 10% are against multiple Reference Codes. I also have a number of other dimensions, and the problem I come up against is that when I change a dimension from sparse, I really don't know how many blocks will be in existence at the end of it, so I can't multiply this against the block size to predict the total data size. Any thoughts? (The artistic "suck it and see" solution is obvious, but I'd like to cut down on experiments; they take too long.) Thanks

    You can do it manually if you like with code in the Error Console.
    Copy and paste the code in the Code field in the Error Console and click the Evaluate button.
    *Firefox/Tools > Web Development > Error Console (Shift+Ctrl+J)
    See http://kb.mozillazine.org/User:Dickvl/JavaScript_Error_Console#Expire_History_By_Days

  • Under Data Marts folder, what is the datasource starting with 8?

    hello BW gurus,
    Run rsa1 -> Source Systems, then pick the BW system itself as the client,  under folders BW DataSources ->  Business Information Warehouse -> Data Marts, we can see some datasources or infosources starting with 8, how these datasouces get created and what are the purpose of them? 
    We know that the datasource starting with 7 means the export datasource generated from PSA, but have no idea what is for datasource starting with 8.   Anyone's explanation is greatly appreciated!

    Hi Kevin,
    The DataSources starting with 8 are the data mart DataSources, or export DataSources. This means that they supply data from the BW system to a BW system (including the same one).
    You will notice that the name is 8<data target>, where the data target could be an ODS object or an InfoCube. This means that data from this ODS or InfoCube can be updated to other data targets in the BW system using update rules.
    You can generate the Export datasource from the context menu of a data target.
    Hope this helps...

  • TO_CHAR with dates to get week number issue

    I am trying to build a report showing weekly sales data and running into an issue with Dates and TO_CHAR..
    Using the date 9/29/2007, I expect the return from the following to be 4:
    SELECT TO_CHAR(TO_DATE('09/29/2007', 'MM/DD/YYYY'), 'W') FROM DUAL
    However, it returns 5. Testing further, I tried the next day; since it is a Sunday, it should be the start of the next week, 5:
    SELECT TO_CHAR(TO_DATE('09/30/2007', 'MM/DD/YYYY'), 'W') FROM DUAL
    I do get a 5. I understand that the implementation of TO_CHAR with the 'W' parameter starts the week on whatever the first day of the month is, but I need a method that is more calendar-like (Sunday is the start of a week, Saturday the end of the week).
    Anyone have a nifty solution?
    Thank you,
    Tony Miller
    UTMB/EHN

    This works pretty well. Please note that it is zero based, so add 1 if you want it to start with the first week as 1.
    WITH TAB AS
      (SELECT TO_DATE('12/01/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/02/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/03/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/04/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/05/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/06/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/07/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/08/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/09/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/29/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/30/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL UNION ALL
       SELECT TO_DATE('12/31/2007', 'mm/dd/yyyy') DATE_COL FROM DUAL)
    SELECT DATE_COL,
      TO_CHAR(DATE_COL, 'ww') W_IN_YEAR,
      TO_CHAR(TRUNC(DATE_COL, 'month'), 'ww') W1_OF_MONTH,
      TO_NUMBER(TO_CHAR(DATE_COL, 'ww')) -
      TO_NUMBER(TO_CHAR(TRUNC(DATE_COL, 'month'), 'ww')) MY_WEEK,
      TO_CHAR(DATE_COL, 'W') ORACLE_WEEK
    FROM TAB
    DATE_COL     W_IN_YEAR W1_OF_MONTH MY_WEEK  ORACLE_WEEK
    01-DEC-2007  48        48          0        1          
    02-DEC-2007  48        48          0        1          
    03-DEC-2007  49        48          1        1          
    04-DEC-2007  49        48          1        1          
    05-DEC-2007  49        48          1        1          
    06-DEC-2007  49        48          1        1          
    07-DEC-2007  49        48          1        1          
    08-DEC-2007  49        48          1        2          
    09-DEC-2007  49        48          1        2          
    29-DEC-2007  52        48          4        5          
    30-DEC-2007  52        48          4        5          
    31-DEC-2007  53        48          5        5          
    12 rows selected
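    For the Sunday-to-Saturday numbering the original poster asked for, a variation along the following lines may help. This is only a sketch: week 1 is taken to start on the first Sunday of the month (dates before that first Sunday also land in week 1 with this formula), and NEXT_DAY(..., 'SUNDAY') assumes an English NLS date language.
    -- Sketch: week of month with weeks running Sunday to Saturday,
    -- counted from the first Sunday of the month.
    SELECT d,
           TRUNC((d - NEXT_DAY(TRUNC(d, 'MM') - 1, 'SUNDAY')) / 7) + 1 AS sunday_week
    FROM  (SELECT TO_DATE('09/29/2007', 'MM/DD/YYYY') AS d FROM DUAL UNION ALL
           SELECT TO_DATE('09/30/2007', 'MM/DD/YYYY') FROM DUAL)
    -- 09/29/2007 (a Saturday) returns 4, 09/30/2007 (a Sunday) returns 5.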

  • Check a number range with date range.

    i have table test which is having column
    Start Range
    End Range
    Start Date
    End date
    How can I check that the start range and end range do not overlap an existing record, also taking the date range of that record into account?
    The end date can also be blank.

    990690 wrote:
    i have table test which is having column
    Start Range
    End Range
    Start Date
    End date
    how can i check start range and end range does not exists b/w existing record also with date range of this record.
    end date can be blank also.
    Your question makes absolutely no sense whatsoever. We could make guesses, but then we may be wrong in our guesses.
    Please provide sufficient details for people to be able to help you. Read the FAQ: {message:id=9360002}
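    For reference, a typical range-overlap check looks something like this sketch. The table and column names (TEST with START_RANGE, END_RANGE, START_DATE, END_DATE) are taken from the question; the bind variables for the new record are hypothetical, and a blank END_DATE is treated as open-ended.
    -- Sketch: count existing rows whose number range AND date range both
    -- overlap the proposed new values; NULL END_DATE means open-ended.
    SELECT COUNT(*) AS overlapping_rows
    FROM   test t
    WHERE  t.start_range <= :new_end_range
    AND    t.end_range   >= :new_start_range
    AND    t.start_date  <= NVL(:new_end_date, t.start_date)
    AND    NVL(t.end_date, DATE '9999-12-31') >= :new_start_date
    -- A count greater than zero means the new record overlaps an existing one.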

  • Data mart status has been reset by deleting the in-between request in the source ODS

    Hi SDN,
    We have a situation in which a daily delta load goes from an ODS to a cube, and we accidentally reset the data mart status by deleting a request that sits in between several other requests in the source ODS. When we deleted the request in the ODS, we got a pop-up asking 'Do you want to delete the data in the cube?' and we confirmed it. After that, in the manage view of the source ODS, the data mart status is no longer shown for any of the requests that had been loaded to the target cube. We have since reconstructed the data for the deleted request from the PSA.
    When the next day's load comes into the ODS, will the ODS send the correct delta update to the target cube?
    If the correct delta is not updated to the cube, is there a method we can follow to maintain data consistency without deleting the data in the target cube?
    Thank you,
    Prasaad

    Hi,
    You deleted data in the cube and reloaded, but the data mart status is not appearing in the ODS. If you have all the data in the ODS: delete the data in the cube, delete the data mart status in the ODS, then right-click the ODS and choose 'Update data in data target'. A fresh request will then update the cube. This is an init load only; deltas will go from the next day onwards.
    Thanks
    Reddy

  • Error Caller 09 contains error message - Data Marts loading(cube to ODS)

    Dear all,
              Please help me with this problem; it is very urgent.
              I have one process chain that loads data from BW to BW only, through data marts. In that process chain, one process loads data from one cube (created by us) to one ODS (also created by us). Data is loaded through a full update for the period specified in the 'Calendar Day' field in the data selection.
             Previously I was able to load data for 2 months, but a few days ago the extraction process suddenly got stuck in the background for a long time and showed the following error:
              Error message from the source system
              Diagnosis
             An error occurred in the source system.
              System Response
             Caller 09 contains an error message.
             Further analysis:
             The error occurred in Extractor . 
             Refer to the error message.
             Procedure
             How you remove the error depends on the error message.
             Note
             If the source system is a client workstation, it is possible that the file you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory and is not being processed at the moment, then restart the request.
                  Then we killed that process on the server and, after another attempt, it showed a calmonth/timestamp error. After reducing the data selection period, the load finished successfully, and after that I was able to load data for 20 days. A few days later the process got stuck again; I followed the same procedure, reduced the period to 15 days, and continued. Now I cannot even load data for 5 days successfully in one attempt; I have to kill the process in the background and repeat it, and then sometimes it gets loaded.
             Please suggest some solutions as soon as possible. I am waiting for your reply. Points will be assigned.
             Thanks,
              Pankaj N. Kude
    Edited by: Pankaj Kude on Jul 23, 2008 8:33 AM

    Hi Friends !
    I didn't find any short dump for that in ST22.
    What actually happens is that the request keeps running in the background indefinitely. At that time the Status tab in the process monitor shows these messages:
        Request still running
        Diagnosis
        No errors found. The current process has probably not finished yet.
        System Response
        The ALE inbox of BI is identical to the ALE outbox of the source system
        or
        the maximum wait time for this request has not yet been exceeded
        or
        the background job has not yet finished in the source system.
        Current status: in the source system
    The Details tab shows the following messages:
        Overall status: Missing messages or warnings
        Requests (messages): Everything OK
            Data request arranged
            Confirmed with: OK
        Extraction (messages): Missing messages
            Data request received
            Data selection scheduled
            Missing message: Number of sent records
            Missing message: Selection completed
        Transfer (IDocs and TRFC): Everything OK
            Info IDoc 1: Application document posted
            Info IDoc 2: Application document posted
        Processing (data packet): No data
    This process runs indefinitely; then I have to kill it on the server, after which the Status tab shows the Caller 09 error.
    Should I change the value of that memory parameter on the server or not? We are planning to try it today. Does it really relate to this problem, and will it be helpful? What are the risks?
    Please give your suggestions as early as possible; I am waiting for your reply.
      Thanks,
    Pankaj N. Kude

  • Problem in Data Mart

    I tried to transfer data from one InfoCube(ZE1) to another InfoCube(ZE2).      
    Please find below the procedure I exercised:
    INFOCUBE ZE1 SUCCESSFULLY POPULATED DATA FROM FLATFILE WITH 20 RECORDS.
    INFOCUBE ZE2 CREATED BY COPYING FROM ZE1.
    GENERATED EXPORT DATASOURCE IN ZE1 AND REPLICATED THIS NEW DATASOURCE 8ZE1 IN THE B3TCLNT800 - BW CLIENT 800 SOURCE SYSTEM. THEN ACTIVATED THE TRANSFER RULES. THE INFOSOURCE IS IN THE DATA MART FOLDER.
    THE UPDATE RULES FOR ZE2 WAS CREATED USING INFOCUBE ZE1 (data mart concept) UNDER DATASOURCE TAB.
    SCHEDULED A LOAD PACKAGE ON THE INFOSOURCE  8ZE1 TO LOAD TO THE NEW INFOCUBE ZE2.
    Now while running the scheduling process, a short dump is created:
    No request Idoc generated in BW.
    Exception condition “INHERITED_ERROR” RAISED.
    Could you tell me what could be the reason behind it?
    Thanks in advance.

    Hi Venkat,
    Posting the OSS Note:568768 for your ref:
    <u><b>Symptom</b></u>
    A short dump with an SQL error occurs, or a message indicates that an SQL error occurred. You need to find out more information about the failing statement and the reasons for the failure.
    A short dump with exception condition "INHERITED_ERROR" occurred: a RAISE statement in the program "SAPLRSDRC" raised the exception condition "INHERITED_ERROR".
    <u><b>Other terms</b></u>
    Shortdump SQL Error Oracle DB6 DB2 MSSQL SAPDB Informix Database dev_w0 developer trace syslog UNCAUGHT_EXCEPTION CX_SY_SQL_ERROR DBIF_REPO_SQL_ERROR INHERITED_ERROR SAPLRSDRC RSDRC RSDRS SAPLRSDRS DBIF_RSQL_SQL_ERROR
    <u><b>Reason and Prerequisites</b></u>
    A short dump with an SQL error occurs, e.g. during BW aggregate build, compression, BW queries, or data mart extraction, or an SQL statement fails without a short dump.
    The actions mentioned will already indicate the cause of the error. In particular, the database administrator should frequently be able to immediately recognize solutions. Nevertheless, you should create an OSS problem message and attach the SQL error message, including additional information (such as the SQL statement that occurred or the error message text), to the problem message.
    This note combines the two OSS notes 495256 and 568768.
    <u><b>Solution</b></u>
    1. If a short dump occurred, get the work process and application server where the dump occurred; otherwise continue with the next step.
               Inside the short dump, scroll to "System environment". Either you find the work process number here, or continue with the next step.
    2. Get the work process number from the syslog.
                Go into the system log (transaction SM21) and search for the short dump entry with the same timestamp in the syslog of the application server where the short dump occurred.
               The column "Nr" contains the work process number. There may already be syslog entries before this entry containing more information about the error.
               Example: The shortdump contains
    UNCAUGHT_EXCEPTION
    CX_SY_SQL_ERROR
    04.11.2002 at 14:47:32
               The syslog contains:
    Time     Ty. Nr Cl. User    Tcod MNo Text
    14:47:32 BTC 14 000 NAGELK  BY2 Database error -289 at EXE
    14:47:32 BTC 14 000 NAGELK  BY0 > dsql_db6_exec_immediate( SQL
    14:47:32 BTC 14 000 NAGELK  BY0 > Driver][DB2/LINUX] SQL0289N
    14:47:32 BTC 14 000 NAGELK  BY0 > table space "PSAPTEMP". SQLS
    14:47:32 BTC 14 000 NAGELK  BY2 Database error -289 at EXE
    14:47:32 BTC 14 000 NAGELK  R68 Perform rollback
    14:47:32 BTC 14 000 NAGELK  AB0 Run-time error "UNCAUGHT_EXCEP
    14:47:32 BTC 14 000 NAGELK  AB2 > Include RSDRS_DB6_ROUTINES l
    14:47:33 BTC 14 000 NAGELK  AB1 > Short dump "021104 144732
                So from the syslog we can see that the reason for the error is
                Database error -289 at EXE dsql_db6_exec_immediate( SQLExecDirect ): [IBM][CLI Driver][DB2/LINUX] SQL0289N Unable to allocate new pages in table space "PSAPTEMP". SQLSTATE=57011
                The database error code is -289, the error text is "Unable to allocate new pages in tablespace "PSAPTEMP" ", and the work process number is "14".
    3. Display the developer trace.
                Go to transaction SM51 and select the correct application server (where the short dump occurred); this takes you to transaction SM50 for that application server. Select the work process with the number you got from the short dump or syslog (in our example, number 14), and in the menu bar choose Process - Trace - Display File.
               In the developer trace, search for the timestamp. In our example we find the entry:
    C Mon Nov  4 14:47:32 2002
    C  *** ERROR in ExecuteDirect[dbdb6.c, 5617]
    C  &+  dsql_db6_exec_immediate( SQLExecDirect ): [IBM][CLI Driver][DB2/LINUX] SQL0289N
    C  &+  Unable to allocate new pages in table space "PSAPTEMP". SQLSTATE=57011
    C  &+  INSERT INTO "/BIC/E100015" ...
    4. For most of the important SQL statements generated by BW, when an error occurs, the SQL statement is saved as a text file called, for example, SQL00000959.sql (SQL<error code>.sql). If the statement runs in a dialog process, the file is in the current directory of your SAP GUI on the front-end PC (for example, C:\Documents and Settings\schmitt\SAPworkdir); with batch jobs, it is stored in the DIR_TEMP directory (see transaction AL11) on the application server.
    5. For SAP internal support:
                Using the sql error codes determined under (1) as well as the relevant database platform, you can obtain a more detailed description of the error and possible error causes at http://dbi.wdf.sap.corp:1080/DBI/cgi/dberr.html.
    Hope this helps.
    Bye
    Dinesh

  • Oracle BPM Process Data mart

    I am required to create audit reports on BPM workflows.
    I am new to this and need some guidance on configuring the BPM Process Data Mart. What are the prerequisites for configuring it, and what are the steps to do it?
    Also, I need some inputs on the BAM database. What is the frequency of the data upload? Is data updated or inserted in BAM?

    Hi,
    You might want to check out the Administration and Configuration Guides on http://download.oracle.com/docs/cd/E13154_01/bpm/docs65/index.html.
    I suspect you might find the BAM and Data Mart portions of this documentation a bit terse, so I've added the steps below that provide more detail. I wrote this for ALBPM 6.0, but believe it will still work for Oracle BPM 10g. It was created from an earlier ALBPM 5.7 document Support wrote called "ALBPM 5_7 Configuring and Troubleshooting the BAM and DataMart Updater.pdf".
    You can define how often you want the contents in both databases updated (actually inserted) and how long you want to persist the contents of the BAM database during the configuration.
    Here's the contents of the document:
    1. Introduction
    The use of BAM (Business Activity Monitoring) and Data Mart (or warehouse) information is becoming more and more widespread in today’s BPM project implementations for the obvious benefits they bring to the management and tuning of processes.
    BAM is basically composed of a collection of measurements of current process load and execution times. This gives us an idea of how the business is doing at this moment (in a pseudo real-time fashion).
    Data Mart, on the other hand, is a historical view of the process load and execution times. This gives us an idea of how the business has developed since the projects were put in place.
    In this document we are not going to describe exhaustively all configuration aspects of the BAM and Data Mart Updater, but rather we will quickly move from one configuration step to another paying more attention to subjects that have presented some difficulties in real-life projects.
    2. Creating the Service Endpoints
    The databases for BAM and for Data Mart first have to be defined in the External Resources section of the BPM Process Administrator.
    In the following example the service endpoint ‘BAMJ2EEWL’ is defined. This definition is going to be used later as the BAM storage. At this point nothing is created.
    Add an External Resource with the name ‘BAMJ2EEWL’ and, as we use Oracle, select the Oracle driver, then click <Next>:
    On the following screen, specify:
    ·     the hostname – here I have used ‘localhost’ as I am just setting this up to work on my laptop
    ·     the port for the Oracle service
    ·     the SID – here I have used Oracle Express so the SID is ‘XE’
    ·     the new user to create / use in Oracle for this database – here I have specified ‘BPMBAM’. This user, and its database, will be created later
    ·     the password for the user
    Scroll down to the bottom of the page and click <Save>.
    In addition to the standard JDBC connection that is going to be used by the Updater Service, a remote JDBC configuration needs to be added because the Engine runs in a WebLogic J2EE container. This data source is needed to grant the Engine access to the BAM tables through the J2EE connection pool instead of through a dedicated JDBC connection. The following is an example of how to set this up.
    Add an External Resource with the name ‘BAMremote’ and select the Oracle driver, then click <Next>
    On the following screen, specify:
    ·     the Lookup Name that will be used subsequently in WebLogic - here I have given it the name ‘XAbamDS’
    Then click <Save>.
    In the next example the definition ‘DWHJ2EEWL’ is created to be used later as Data Mart storage. If you are not going to use a Data Mart storage you can skip this step.
    Add an External Resource with the name ‘DWHJ2EEWL’ and select the Oracle driver, then click <Next>:
    On the following screen, specify:
    ·     the hostname – here I have used ‘localhost’ as I am just setting this up to work on my laptop
    ·     the port for the Oracle service
    ·     the SID – here I have used Oracle Express so the SID is ‘XE’
    ·     the new user to create / use in Oracle for this database – here I have specified ‘BPMDWH’. This user, and its database, will be created later
    ·     the password for the user
    3. Configuring BAM Updater Service
    Once the service endpoint has been created the next step is to enable the BAM update, select the service endpoint to be used as BAM storage and configure update frequency and others. Here the “Updater Database Configuration” is the standard JDBC we configured earlier and the “Runtime Database Configuration” is the Remote JDBC as we are using the J2EE Engine.
    So, here’s the example of how to set up the BAM Updater service….
    Go into ‘Process Monitoring’ and select the ‘BAM’ tab and enter the relevant information (using the names created earlier – use the drop down list to select):
    Note that here, to allow me to quickly test BAM reporting, I have set the update frequency to 1 minute. This would not be the production setting.
    Once the data is input, click <Save>.
    We now have to create the schema and related tables. For this we will open the “Manage Database” page that has appeared at the bottom of the BAM screen (you may have to re-select that Tab) and select to create the database and the data structure. The user required to perform this operation is the DB system administrator:
    Text showing the successful creation of the database and data structures should appear.
    Once we are done with the schema creation, we can move to the Process Data Mart configuration screen to set up the Common Updater Service parameters. Notice that the service has not been started yet… We will get to that point later.
    4. Configuring Process Data Mart Updater Service
    In the case that Data Mart information is not going to be used, the “Enable Automatic Update” checkbox must be left off and the “Runtime Database Configuration” empty for this service. Additionally, the rest of this section can be skipped.
    In the case it is going to be used, the detail level, snapshot time and the time of update should be configured; in addition to enabling the updater and choosing the storage configuration. An example is shown below:
    Still in ‘Process Monitoring’, select the ‘Process Data Mart’ tab and enter the name created earlier (use the drop down list to select).
    Also, un-tick the Generate O3 Cubes (see later notes):
    Then click <Save>.
    Once those properties have been configured, the database and the data structure have to be created. This is done from the “Manage Database” page, for which a link appears at the bottom of the page (as with BAM). Even though this page is identical to the one shown above (for the BAM configuration), it has been opened from the link in the “Process Data Mart” page, and this makes it different.
    Text showing the successful creation of the database and data structures should appear.
    5. Configuring Common Updater Service Parameters
    In the “Process Data Mart” tab of the Process Monitoring section – along with the parameters that are specific to the Data Mart – we will find some parameters that are common to all services. These parameters are:
    • Log directory: location of the log file
    • Messages logged from Data Store Updater: severity level of the Updater logs
    • Language
    • Generate Performance Metrics: enables performance metrics generation
    • Generate Workload Metrics: enables workload metrics generation
    • Generate O3 Cubes: enables O3 Cubes generation
    In this document we are not going to describe in detail each parameter. But we will mention a few caveats:
    a. the Log directory must be specified in order for the logs to be generated
    b. the Messages logged from Data Store Updater, which indicates the level
    of the logs, should be DEBUG for troubleshooting and WARNING otherwise
    c. Performance and Workload Metrics need to be on for the typical BAM usage and, even when either metric might not be used on the initial project releases, it is recommended to leave them on in case they turn out to be useful in the future
    d. the Generation of O3 Cubes must be off if this service is not used; otherwise the Data Mart Updater service might not work properly.
    The only change required on this screen was to de-select ‘Generate O3 Cubes’, as shown in the last section.
    6. Set up the WebLogic configuration
    We need to set up the JDBC data source specified above, so go to Services / JDBC / Data Sources.
    Click on <Lock and Edit> and then <New> to add a New data source.
    Specify:
    ·     the Name – use the name you set up in the Process Administrator
    ·     the JNDI Name – again use the name you set up in the Process Administrator
    ·     the Database Type – Oracle
    ·     use the default Oracle Database Driver
    Then click <Next>
    On the next screen, click <Next>
    On the next screen specify:
    ·     the Database Name – this is the SID – for me that is XE
    ·     the Host Name – as I am running on my laptop, I’ve just specified ‘localhost’
    ·     the Database User Name and Password – this is the BAM database user specified in the Process Administrator
    Then click <Next>
    On the next screen, you can test the configuration to make sure you have got it right, then click <Next>
    On the next screen, select your server as the target server and click <Finish>:
    Finally, click <Activate Changes>.
    7. The Last Step: Starting Up and Shutting Down the Updater Service
    ALBPM distribution is different depending on the Operating System. In the case of the Updater Service:
    -     For Unix-like operating systems the service is started or stopped with the albpmwarehouse.sh shell script. The command in this case is going to look like this:
    $ALBPM_HOME/bin$ ./albpmwarehouse.sh start
    -     For Windows Operating Systems the service is installed or uninstalled as a Windows Service with the albpmwarehouse.bat batch file. The command will look like:
    %ALBPM_HOME%\bin> albpmwarehouse.bat install
    After installing the service, it has to be started and stopped from the Microsoft Management Console. Note also that Windows will automatically start the installed service when the computer starts. In either case the location of the script is ALBPM_HOME/bin, where ALBPM_HOME is the ALBPM installation directory. An example would be:
    C:\bea\albpm6.0\j2eewl\bin\albpmwarehouse.bat
    8. Finally: Running BAM dashboards to show it is Working
    Now we have finally got the BAM service running, we can run dashboards from within Workspace and see the results:
    9. General BAM and Data Mart Caveats
    a. The basic difference between these two collections of measurements is that BAM keeps track of current process load and execution times while Data Mart contains a historical view of those same measurements. This is why BAM information is collected frequently (every minute) and cleared out every several hours (or every day), and why Data Mart is updated infrequently (once a day) and grows indefinitely. Moreover, BAM measurements can be thought of as a minute-by-minute sequence of Engine Events snapshots, while Data Mart measurements are a daily sequence of Engine Events snapshots.
    b. BAM and Data Mart table schemas are very similar but they are not the same. Thus, it is important not to use a schema created with the Manage Database for BAM as Data Mart storage or vice-versa. If these schemas are exchanged by mistake, the updater service will run anyway but no data will be added to the tables and there will be errors in the log indicating that the schema is incorrect or that some tables could not be found.
    c. BAM and Data Mart Information and Services are independent from one another. Any of them can be configured and running without the other one. The information is extracted directly from the Engine Database (PPROCINSTEVENT table is the main source of info) for both of them.
    d. So far there has not been a mention of engines, projects or processes in any of the BAM or Data Mart configurations. This is because the metrics of all projects published under the current Process Administrator (or, more precisely, FDI Directory) are going to be collected.
    e. It is also important to note that only activities for which events are generated are going to be measured (and therefore, shown in the metrics). The project default is to generate events only for Interactive activities. This can be changed for any particular activity and for the whole process (where the activity setting, when specified, overrides the process setting). Unfortunately, there is no project setting for events generation so far; thus, remember to edit the level of event generation for every new process that is added to the project.
    f. BAM and Data Mart metrics are usually enriched with Business Variables. These variables are a special type of External Variables. An External Variable is a process variable with the scope of an Instance and whose value is stored on a separate column in the Engine Instances table. This allows the creation of views and filters based on this variable. A Business Variable, then, shares all the properties of an External Variable plus the fact that its value is collected in all BAM and Data Mart measurements (in some cases the value is shown as it is for a particular instance and in others the value is aggregated).
    The caveat here is that there is a maximum number of 256 Business Variables per FDI. Therefore, when publishing several projects into a single FDI directory it is recommendable to reuse business variables. This is achieved by mapping similar Business Variables of different projects with a unique real Variable (on the variable mapping performed at publish time).
    g. Configuring the Updater Service Log
    In section 5. Configuring Common Updater Service Parameters we have seen that there are two common Updater properties related to logging. These properties are “Log directory” and “Messages logged from Data Store Updater”, and they specify the location and level of these two files:
    - dwupdater.log: which is the log for the Data Mart updater service
    - bam-dwupdater.log: which is the log for the BAM updater service
    In addition to these two properties, there is a configuration file called ‘WarehouseService.conf’ that allows us to modify these other properties:
    - wrapper.console.loglevel: level for the updater service log
    - wrapper.logfile.loglevel: level for the updater service log
    - wrapper.java.additional.n: additional argument to the service JVM
    - wrapper.logfile.maxsize: maximum size of the updater service log files
    - wrapper.logfile.maxfiles: maximum number of updater service log files
    - wrapper.logfile: updater service log file name (the default value is dwupdater-service.log)
    9.1. Updater Service Log Configuration Caveats
    a. The first three parameters listed above have to be modified when increasing the log level to DEBUG (since the default is WARNING). The loglevel parameters have to be set to DEBUG and a wrapper.java.additional.n entry (where n is the next integer after the ones already used) has to be set to -ea to enable assertions, since without this option no DEBUG messages are generated; see the sketch after this list.
    b. Of the other arguments, maxfiles might need to be increased to hold a few more days of data when the log level is set to DEBUG (with the default value up to two days are stored).
    c. The updater service has to be stopped, uninstalled, installed and then started for any of these changes to take effect.
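    As a rough illustration of caveats (a) and (b), the relevant lines in WarehouseService.conf might look like the sketch below when raising the level to DEBUG. The property names are the ones listed above; the index used for wrapper.java.additional and the maxfiles value are only examples and should be checked against your own installation.
    # Sketch only: raise the updater service logging to DEBUG
    wrapper.console.loglevel=DEBUG
    wrapper.logfile.loglevel=DEBUG
    # use the next free index for the additional JVM argument; "3" is just an example
    wrapper.java.additional.3=-ea
    # optionally keep more log files while DEBUG is active (default keeps about two days)
    wrapper.logfile.maxfiles=10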
    Hope this helps,
    Dan

  • How to load the existing Data Mart request into the cube

    Hi Friends,
    I have a scenario: I have a DSO and one cube, and I am loading the data from the DSO to the cube.
    The number of records in the DSO is large, and due to some calculation in the routine I need to load the data year by year (I have data for 2007 and 2008). I have loaded the 2007 data to the InfoCube, and in the DSO I can see the data mart symbol against the request ID once the load finished successfully.
    When I try to load the 2008 data from the DSO to the cube I get 0 records. I realised that the data mart symbol already exists in the DSO.
    How can I load the 2008 data in this scenario? If I delete the data mart symbol, I am deleting the cube request.
    Does anyone have an idea on this?
    Thanks in advance.

    HI,
    Things are not clear.
    How is the loading happening: is it a delta or a full load, through a DTP, or are you using the 3.5 flow?
    In any case, if you do a full load or full repair based on the year selection, it should pick the records from the source; the data mart status has nothing to do with full loads.
    The data mart status only comes into the picture when you schedule delta loads.
    Do a full load from the DSO to the cube with a selection on the year; there is no need to delete the data mart status, or the next delta will bring that request again. Does that request in the DSO contain data for 2008 only? If yes, you can just delete the data mart status for it and do a delta; if not, do full loads as described.
    Thanks
    Ajeet

  • How to count number of Characteristics' with the same value?

    Hello, Everybody,
    In InfoCube I have data:
    person ID (characteristic), points (key figure), gender (characteristic), org. unit (characteristic)
    1313; 10; F;5001
    1313; 10; M;5001
    1313; 12; F;5001
    1313; 6; M;5001
    1515;20;F;5001
    Report,  with data from this InfoCube, should look like this:
    org.unit/number of employees, 6 points, 10 points, 12 points, 20 points
    5001, 1, 2, 1, 1
    Could you give me a suggestion how I can count the number of employees?
    Thanks in advance!
    Best Regards,
    Arunas Stonys

    Arunas,
    You can create 4 CKFs, one for each points value, and in each CKF use a formula that returns 1 if your condition is satisfied. This way the CKF will have a value of 1 if the points value is 6 for the first CKF, and similarly for the rest. Then use exception aggregation based on person ID to count the number of employees with that particular points value in the org unit. Use org unit in the rows and these 4 CKFs in the columns, and you should be able to get the report.
    hope this helps.
    Regards,
    Aashish
    Edited by: Aashish Kalra on Jan 6, 2009 12:31 AM
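    Purely to illustrate the counting logic behind the approach above (one column per points value), the same result could be sketched in SQL; the table and column names (CUBE_DATA with PERSON_ID, POINTS, ORG_UNIT) are hypothetical stand-ins for the InfoCube data.
    -- Sketch: count records per points value and org unit.
    -- Use COUNT(DISTINCT CASE WHEN points = 6 THEN person_id END) instead
    -- to count unique employees rather than records.
    SELECT org_unit,
           COUNT(CASE WHEN points =  6 THEN 1 END) AS points_6,
           COUNT(CASE WHEN points = 10 THEN 1 END) AS points_10,
           COUNT(CASE WHEN points = 12 THEN 1 END) AS points_12,
           COUNT(CASE WHEN points = 20 THEN 1 END) AS points_20
    FROM   cube_data
    GROUP BY org_unit
    -- For the sample data above this returns: 5001, 1, 2, 1, 1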

  • Data Mart Status of the request is not ticked

    Hello Everybody,
    This is my first time dealing with BW.
    After I loaded the data to the ODS, the request in the manage view was not ticked like the others, although the job completed successfully.
    Can anyone help?
    Many Thanks
    F-B-I

    Hi,
    Have you deleted the data mart status in the ODS?
    You need to follow these steps:
    1. If you changed the status to red, there is no need to delete the data mart status in the ODS; you can load the data from the ODS to the InfoCube.
    2. If the status is green, you need to delete the data mart symbol in the ODS.
    regards
    sivaraju

  • Compare data in R/3 with data in a BW Cube after the daily delta loads

    Hi Friends,
    How can I compare data in R/3 with data in a BW Cube after the daily delta loads? Are there any standard procedures for checking them or matching the number of records?

    Hi Sunil,
    If you want to check the records daily instead of checking the data in R/3 manually ......
    You can try this...
    If you have a staging DSO (level 1), load whatever data is in the source system to the staging DSO without any routines or modifications.
    Now load this DSO data to the cube or DSO (level 2) as per your requirement, with routines etc.
    Now Staging DSO contains Source system data.
    Now the level 2 Cube or DSO contains BW data with some modifications.
    Now create a Multiprovider based on level 1 and level 2 data targets.
    Now create a report on the key figures for which you want to test the data.
    In the MultiProvider there is a field called 0infoprovider in the data package dimension.
    You can drag this InfoProvider to the columns and restrict your key figures to the level 1 and level 2 data targets.
    In the first column you can see the level 1 DSO data (source system data); in the second column you can see the BW data.
    Now create a formula which gives the difference between level 1 and level 2,
    that is, R/3 data - BW data.
    If the difference is zero, the R/3 and BW data are the same.
    If the difference is not equal to zero, check whether any routine is involved.

  • Unable to update the Data from Cube to Data Mart

    Hi,
    I have a problem with the data loading to a cube (data mart) in BW. When I check it in RSA3 it shows 0 records. The data flow is as follows: R/3 -> ODS (BW) -> cube (BW) -> cube (BW data mart for the APO cube) -> APO system. In BW, the final data target is always loaded by a full load after deleting the previous request. Records are available in this final cube (data mart to APO), but when checking this final data target (data mart to APO) in RSA3 it shows 0 records.
    Why? Please help me.
    Regards,
    krishna

    Hi,
    I checked the data mart in RSA3. It is not a matter of full upload or delta upload.
    thanks
    Krishna

  • Data mart cube to cube copy records are not matching in target cube

    Hi experts,
    I need help on the questions below for a data mart cube-to-cube copy (8M*).
    Its BW 3.5 system
    We have two financial cube.
    Cube A1 is sourced from the R/3 system (delta update) and cube B1 is sourced from cube A1 (full update). These two cubes are connected through update rules with one-to-one mapping and without any routines. Basis did a copy of the back-end R/3 system from the production to the quality server; this happened approximately 2 months back.
    Cube A1, which extracts the delta load from R/3, is loading fine. But for the second cube (extraction from the previous cube A1) I am not getting the full volume of data; instead I am getting a meagre amount, even though the load shows a successful status in the monitor.
    We tried giving conditions in the InfoPackage (as was done in the previous year's loading), but it still fetches the same meagre volume of data.
    To check whether this happens only for this particular cube, we tried another cube that is sourced through the Myself system, and that also gets meagre data rather than the full data.
    For example: for an employee with 1,000 available records, the system extracts randomly some 200 records.
    Any quick reply will be more helpful. Thanks

    Hi Venkat,
    Did you do any selective deletions in cube A1?
    First reconcile the data between cube 1 and cube 2.
    Match the totals of cube 1 with cube 2.
    Thanks,
    Vijay.
