BW Data Mart - How to

Hi All,
I have worked in BI 7 but not in BW 3.x.
I want to selectively do a full load of data from one DSO (which already contains data) to another DSO, without extracting data from the source system.
I know this can be achieved in BI 7 through a DTP.
Please help me with the steps to do the same in a BW 3.x dataflow.
Many thanks
BWNewbie.

Hi,
As I understand your question, you already have an existing BW 3.x dataflow, and you want to do a selective full load without corrupting the delta between the DSOs.
Assuming this is the case, please perform the steps below:
1 --> Create an InfoPackage for Full Repair.
2 --> In the selection, specify the criteria for which you want to load the data.
3 --> Save and execute this InfoPackage.
How to create a Full Repair InfoPackage: in the InfoPackage, go to Scheduler -> Repair Full Request, check the 'Repair Full Request' flag, and save the InfoPackage.
This will solve your issue.
Thanks,
Mayuri

Similar Messages

  • SAP GRC AC 5.3 SP09 Data Mart: How to set up this new feature

    Hi there,
    does anyone already have experience setting up the new data mart functionality in AC 5.3 SP09?
    I have read through the data mart config section of the AC 5.3 config guide (pages 55-56 and 321-322) but don't understand how this process should work in the simplest way to get simple flat files out of AC 5.3.
    We do not use Crystal Reports; we just want flat file data extracts out of AC 5.3 so we can load them into MS Access.
    1. Regarding the creation of a new data source in the Visual Admin: is this required when I just want to get flat files out of AC 5.3 using the data mart jobs?
    We use an Oracle DB in a dual stack install for AC 5.3. Do I need to install a JDBC Oracle driver to set up the data source?
    2. Where am I supposed to find the data mart flat files that result from the data mart jobs when I do not have a connection set up to an analysis tool DB? Do I need to define a file location in the data mart jobs in RAR?
    Thanks for any help and regards,
    Markus

    Dear all,
    I got a reply from SAP on a message regarding this issue, stating that the connection process outlined in the document
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/10e245d3-09a5-2c10-3296-f3afe44497e2&overridelayout=true
    can be applied to an Oracle DB as well, if your GRC AC 5.3 data is stored there. The prerequisite is to add the additional data source in your Visual Admin.
    You then fill the data mart via the data mart jobs, and via ODBC for Oracle you should be able to access this data using Crystal Reports or MS Access.
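    As a rough illustration of that last step, here is a minimal Python sketch that pulls one data mart table into a flat CSV file over ODBC using pyodbc. It assumes an 'ODBC for Oracle' DSN has already been configured for the data mart schema; the DSN name, the credentials and the table name DM_ACTION_USAGE are hypothetical placeholders, not actual AC 5.3 object names.
    import csv
    import pyodbc
    # Connect via a pre-configured 'ODBC for Oracle' DSN pointing at the
    # schema the data mart jobs fill (DSN, user and password are placeholders).
    conn = pyodbc.connect("DSN=GRC_DATAMART;UID=grc_dm;PWD=secret")
    cursor = conn.cursor()
    # DM_ACTION_USAGE is a hypothetical table name, not a documented AC 5.3 object.
    cursor.execute("SELECT * FROM DM_ACTION_USAGE")
    # Dump the result set to a flat CSV file that MS Access can import.
    with open("dm_action_usage.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cursor.description])  # header row
        writer.writerows(cursor.fetchall())
    conn.close()
    The resulting CSV can then be imported into MS Access; Access and Crystal Reports can of course also read the same tables directly through the ODBC DSN.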
    Regards,
    Markus

  • How to load an existing Data Mart request into the cube

    Hi Friends,
    I have a scenario: I have a DSO and one cube, and I am loading data from the DSO to the cube.
    The DSO contains a large number of records, and due to some calculations in the routine I need to load the data year by year (I have data for 2007 and 2008). I have loaded the 2007 data into the InfoCube, and in the DSO I can see the data mart symbol against the request ID once the load finished successfully.
    When I try to load the 2008 data from the DSO to the cube, I get 0 records. I realised that the data mart symbol already exists in the DSO.
    How can I load the 2008 data in this scenario? If I delete the data mart symbol, I would be deleting the cube request.
    Does anyone have an idea about this?
    Thanks in advance.

    Hi,
    A few things are not clear.
    How is the loading happening: is it a delta or a full load, through a DTP, or are you using the 3.5 flow?
    In any case, if you do a full load or full repair based on the year selection, it should pick up the records from the source; the data mart status plays no role in full loads.
    The data mart status only comes into the picture when you schedule delta loads.
    Do a full load from the DSO to the cube with a selection on year; there is no need to delete the data mart status (deleting it would make a scheduled delta bring that request again). Does that request in the DSO contain data for 2008 only? If yes, you can just delete the data mart status for it and do a delta; if not, do full loads as described.
    Thanks
    Ajeet

  • How to find the InfoSource for an export DataSource in the Data Marts node

    Hi
    I need to load data from an ODS to an InfoCube. I created the export DataSource for the ODS. I can see the export DataSource, but in the Data Marts node of the InfoSource tree I cannot find the InfoSource for the export DataSource I created. I replicated the DataSources in the BW source system. I also tried to use 'Insert Lost Nodes' from the context menu of the InfoSource node, but nothing worked. Please let me know what I need to do to see the InfoSource under the Data Marts node.
    Thanks
    Padma

    In the InfoSource tab in RSA1, use Settings --> Display Generated Objects.
    You will then be able to see the data mart InfoSources.

  • Get back the Data mart status in ODS and activate the delta update.

    I ran into a problem when deleting requests in an ODS.
    There is a cube (1st level) that gets loaded from an ODS (2nd level), which in turn gets loaded from 3 ODSs (3rd level). We wanted to delete recent requests from all the data targets and reload from the PSA. But while deleting the request in the ODS (2nd level), a window was displayed showing the following:
    - The request 132185 was already retrieved by the data target BP4CLT612.
    - Delta update in BP4CLT612 must be deactivated before deleting the request.
    - Do you want to deactivate the delta update in data target BP4CLT612?
    I clicked on 'Execute changes' in the window, and it removed the data mart status for all the requests, including the ones I had not deleted.
    The same thing happened in the 3 ODSs (3rd level).
    It is clear to me that if we now load further data from the source system, it will load all the records from the beginning.
    To avoid this, can anybody help me with how to reset the data mart status and reactivate the delta update?

    Hi Satish,
    You have to set the requests to red in the cube and back them out of the cube before you can delete requests from the base targets (from which the cube gets its data).
    Then you have to reset the data mart status for the requests in your L2 ODS before you can delete requests from that ODS.
    I think you tried to delete without resetting the data mart status, which has upset the delta sequence.
    To correct this:
    For the L2 ODS, do an init without data transfer from the 3 ODSs below it, after removing the init request from the Scheduler menu in the init InfoPackage.
    Do the same from the L2 ODS to the cube.
    Then reconstruct the deleted request in the ODS; it will not show the tick mark in the ODS. Finally, do a delta load from the ODS to the cube.
    See this thread:
    Urgentt !!! Help on reloading the data from the ODS to the Cube.
    cheers,
    Vishvesh

  • Error Caller 09 contains error message - Data Marts loading (cube to ODS)

    Dear all,
    Please help me with this problem; it is very urgent.
    I have one process chain that loads data from BW to BW only, through data marts. In that process chain, one process loads data from one cube (created by us) to one ODS (also created by us). Data is loaded through a full update, for the period specified in the 'Calendar Day' field in the data selection.
    Previously I was able to load data for 2 months, but some days ago the extraction process suddenly got stuck in the background for a long time and showed the following error:
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    Then we killed that process on the server, and after another attempt it showed some calmonth...timestamp error. After reducing the data selection period, the load completed successfully, and after that I was able to load data for 20 days. After some days the process got stuck again; I followed the same procedure, reduced the period to 15 days and continued. Now I can't even load data for 5 days successfully in one attempt: I have to kill the process in the background and repeat it, and then sometimes it gets loaded.
    Please suggest some solutions as soon as possible. I am waiting for your reply. Points will be assigned.
             Thanks,
              Pankaj N. Kude

    Hi Friends!
    I didn't find any short dump for this in ST22.
    What actually happens is that the request continues to run in the background indefinitely. At that time, the Status tab in the process monitor shows these messages:
    Request still running
    Diagnosis
    No errors found. The current process has probably not finished yet.
    System Response
    The ALE inbox of BI is identical to the ALE outbox of the source system
    or
    the maximum wait time for this request has not yet been exceeded
    or
    the background job has not yet finished in the source system.
    Current status: in the source system
    The Details tab shows the following messages:
    Overall status: Missing messages or warnings
    Requests (messages): Everything OK
    - Data request arranged
    - Confirmed with: OK
    Extraction (messages): Missing messages
    - Data request received
    - Data selection scheduled
    - Missing message: Number of sent records
    - Missing message: Selection completed
    Transfer (IDocs and TRFC): Everything OK
    - Info IDoc 1: Application document posted
    - Info IDoc 2: Application document posted
    Processing (data packet): No data
    This process runs indefinitely; then I have to kill it from the server, and then it shows the Caller 09 error in the Status tab.
    Should I change the value of that memory parameter on the server or not? We are planning to try it today. Does it really belong to this problem? Will it be helpful? What are the risks?
    Please give your suggestions as early as possible; I am awaiting your reply.
      Thanks,
    Pankaj N. Kude

  • Deletion of Data Mart Request taking more time

    Hi all,
    One of the data mart processes (ODS to InfoCube) has failed.
    I am not able to delete the request in the InfoCube.
    When the delete option is executed it takes a long time; I am monitoring the job in SM37 and it is not getting completed.
    The details seen in the Job Log is as shown below,
    Job started                                                                  
    Step 001 started (program RSDELPART1, variant &0000000006553, user ID SSRIN90)
    Delete is running: Data target BUS_CONT, from 376,447 to 376,447             
    Please let me know your suggestions.
    Thanks,
    Sowrabh

    Hi,
    How many records are there in that request? Deleting a request usually takes a long time, and the deletion time varies with the data volume in the request.
    Give it some more time and see if it finishes. To check whether the job is actually doing anything, go to SM50/SM51 and look at what is happening.
    Cheers,
    Kedar

  • GRC AC DATA MART CONFIGURATION

    Hi All,
    We are on GRC AC 5.3 SP 11. The customer wants to use the data mart functionality with Crystal Reports 2008 for custom reporting purposes. To my knowledge this functionality has been available in AC since SP 9, but I don't know the exact procedure to configure it. Though I have gone through a couple of documents available on SDN on this, none of them is step by step. Can anyone please suggest a detailed configuration guide available in SMP/SDN or on the web?
    Thanks in Advance.
    Best Regards,
    Guru

    Hi Gangadhar,
    I have already gone through the AC config guide and the notes you mentioned, and some other available docs as well. But nowhere is it detailed and step by step. Is there any detailed, step-by-step document on this? The config guide only gives the steps that need to be performed on the GRC AC side; there are no details on what configuration needs to be done to view the custom reports in Crystal Reports 2008. I did find another document on SDN, 'SAP BUSINESSOBJECTS ACCESS CONTROL 5.3 SP09 DATA MART - SAMPLE REPORTS', which covers the frontend configuration. But I am confused about where to start and how to get the custom reports after doing the necessary configuration. Any ideas on this?
    Thanks,
    Guru

  • Transport Error on Data Mart Application Component

    Hi All,
    I have to transport the export DataSources for my DSOs. I used the transport collector to make sure all the relevant objects were collected. When I transport the export DataSources everything goes fine, but the DM application component (for the export DataSources) fails, and the object is put in repair status.
    The 8* DataSources are active and available in the myself source system, and the update rules from the DSO to the cube are also active, but the InfoSource is not there because the Data Mart application component failed. The transport log says the following:
    'Object R3TRAPCODM is in repair status. Therefore, it cannot be imported'.
    If the object is in repair status, can I ask the Basis guys to re-import it with the overwrite option? Will that fix the issue? I could not see the repair status on the object, as it is an SAP component.
    Or
    can anyone tell me how to transport export DataSources? I am on BI 7.0 but using the RSA1OLD setup.
    Any clues?
    Thanks,
    Alex(Arthur Samson)

    I am closing this question; I identified the cause of the error.

  • Oracle BPM Process Data mart

    I am required to create audit reports on BPM workflows.
    I am new to this and need some guidance on configuring the BPM Process Data Mart. What are the prerequisites for configuring it, and what are the steps?
    Also, I need some input on the BAM database. What is the frequency of data upload? Is it a data update or an insert in BAM?

    Hi,
    You might want to check out the Administration and Configuration Guides on http://download.oracle.com/docs/cd/E13154_01/bpm/docs65/index.html.
    I suspect you might find the BAM and Data Mart portions of this documentation a bit terse, so I've added the steps below to provide more detail. I wrote this for ALBPM 6.0, but believe it will still work for Oracle BPM 10g. It was created from an earlier ALBPM 5.7 document Support wrote, called "ALBPM 5_7 Configuring and Troubleshooting the BAM and DataMart Updater.pdf".
    During the configuration you can define how often you want the contents of both databases updated (actually inserted) and how long you want to persist the contents of the BAM database.
    Here's the contents of the document:
    1. Introduction
    The use of BAM (Business Activity Monitoring) and Data Mart (or Warehouse) information is becoming more and more widespread in today’s BPM project implementations for the obvious benefits they bring to the management and tuning of processes.
    BAM is basically composed of a collection of measurements of the current process load and execution times. This gives us an idea of how the business is doing at this moment (in a pseudo real-time fashion).
    Data Mart, on the other hand, is a historical view of the same process load and execution times, which gives us an idea of how the business has developed since the projects were put in place.
    In this document we are not going to describe exhaustively all configuration aspects of the BAM and Data Mart Updater, but rather we will quickly move from one configuration step to another paying more attention to subjects that have presented some difficulties in real-life projects.
    2. Creating the Service Endpoints
    The databases for BAM and for Data Mart first have to be defined in the External Resources section of the BPM Process Administrator.
    In the following example the service endpoint ‘BAMJ2EEWL’ is being defined. This definition is going to be used later as BAM storage. At this point nothing is created.
    Add an External Resource with the name ‘BAMJ2EEWL’ and, as we use Oracle, select the Oracle driver, then click <Next>:
    On the following screen, specify:
    ·     the hostname – here I have used ‘localhost’ as I am just setting this up to work on my laptop
    ·     the port for the Oracle service
    ·     the SID – here I have used Oracle Express so the SID is ‘XE’
    ·     the new user to create / use in Oracle for this database – here I have specified ‘BPMBAM’. This user, and its database, will be created later
    ·     the password for the user
    Scroll down to the bottom of the page and click <Save>.
    In addition to the standard JDBC connection that is going to be used by the Updater Service, a remote JDBC configuration needs to be added, as the Engine runs in a WebLogic J2EE container. This Data Source is needed to grant the Engine access to the BAM tables through the J2EE connection pool instead of through a dedicated JDBC connection. The following is an example of how to set this up.
    Add an External Resource with the name ‘BAMremote’ and select the Oracle driver, then click <Next>
    On the following screen, specify:
    ·     the Lookup Name that will be used subsequently in WebLogic - here I have given it the name ‘XAbamDS’
    Then click <Save>.
    In the next example the definition ‘DWHJ2EEWL’ is created, to be used later as Data Mart storage. If you are not going to use Data Mart storage you can skip this step.
    Add an External Resource with the name ‘DWHJ2EEWL’ and select the Oracle driver, then click <Next>:
    On the following screen, specify:
    ·     the hostname – here I have used ‘localhost’ as I am just setting this up to work on my laptop
    ·     the port for the Oracle service
    ·     the SID – here I have used Oracle Express so the SID is ‘XE’
    ·     the new user to create / use in Oracle for this database – here I have specified ‘BPMDWH’. This user, and its database, will be created later
    ·     the password for the user
    3. Configuring BAM Updater Service
    Once the service endpoint has been created, the next step is to enable the BAM update, select the service endpoint to be used as BAM storage, and configure the update frequency and other settings. Here the “Updater Database Configuration” is the standard JDBC connection we configured earlier, and the “Runtime Database Configuration” is the remote JDBC configuration, as we are using the J2EE Engine.
    So, here’s an example of how to set up the BAM Updater service…
    Go into ‘Process Monitoring’ and select the ‘BAM’ tab and enter the relevant information (using the names created earlier – use the drop down list to select):
    Note that here, to allow me to quickly test BAM reporting, I have set the update frequency to 1 minute. This would not be the production setting.
    Once the data is input, click <Save>.
    We now have to create the schema and related tables. For this we will open the “Manage Database” page that has appeared at the bottom of the BAM screen (you may have to re-select that Tab) and select to create the database and the data structure. The user required to perform this operation is the DB system administrator:
    Text showing the successful creation of the database and data structures should appear.
    Once we are done with the schema creation, we can move to the Process Data Mart configuration screen to set up the Common Updater Service parameters. Notice that the service has not been started yet… We will get to that point later.
    4. Configuring Process Data Mart Updater Service
    If the Data Mart information is not going to be used, the “Enable Automatic Update” checkbox must be left off and the “Runtime Database Configuration” left empty for this service; the rest of this section can then be skipped.
    If it is going to be used, the detail level, snapshot time and time of update should be configured, in addition to enabling the updater and choosing the storage configuration. An example is shown below:
    Still in ‘Process Monitoring’, select the ‘Process Data Mart’ tab and enter the name created earlier (use the drop down list to select).
    Also, un-tick the Generate O3 Cubes (see later notes):
    Then click <Save>.
    Once those properties have been configured, the database and the data structure have to be created. This is performed on the “Manage Database” page, for which a link has appeared at the bottom of the page (as with BAM). Even though this page is identical to the one shown above (for the BAM configuration), it has been opened from the link on the “Process Data Mart” page, and this makes it different.
    Text showing the successful creation of the database and data structures should appear.
    5. Configuring Common Updater Service Parameters
    In the “Process Data Mart” tab of the Process Monitoring section, along with the parameters that are specific to the Data Mart, we will find some parameters that are common to all services. These parameters are:
    • Log directory: location of the log file
    • Messages logged from Data Store Updater: severity level of the Updater logs
    • Language
    • Generate Performance Metrics: enables performance metrics generation
    • Generate Workload Metrics: enables workload metrics generation
    • Generate O3 Cubes: enables O3 Cubes generation
    In this document we are not going to describe in detail each parameter. But we will mention a few caveats:
    a. the Log directory must be specified in order for the logs to be generated
    b. the Messages logged from Data Store Updater setting, which indicates the level of the logs, should be DEBUG for troubleshooting and WARNING otherwise
    c. Performance and Workload Metrics need to be on for typical BAM usage; even when either metric might not be used in the initial project releases, it is recommended to leave them on in case they turn out to be useful in the future
    d. the generation of O3 Cubes must be off if this service is not used, otherwise the Data Mart Updater service might not work properly.
    The only change required on this screen was to de-select ‘Generate O3 Cubes’, as shown in the last section.
    6. Set up the WebLogic configuration
    We need to set up the JDBC data source specified above, so go to Services / JDBC / Data Sources.
    Click on <Lock and Edit> and then <New> to add a New data source.
    Specify:
    ·     the Name – use the name you set up in the Process Administrator
    ·     the JNDI Name – again use the name you set up in the Process Administrator
    ·     the Database Type – Oracle
    ·     use the default Oracle Database Driver
    Then click <Next>
    On the next screen, click <Next>
    On the next screen specify:
    ·     the Database Name – this is the SID – for me that is XE
    ·     the Host Name – as I am running on my laptop, I’ve just specified ‘localhost’
    ·     the Database User Name and Password – this is the BAM database user specified in the Process Administrator
    Then click <Next>
    On the next screen, you can test the configuration to make sure you have got it right, then click <Next>
    On the next screen, select your server as the target server and click <Finish>:
    Finally, click <Activate Changes>.
    7. The Last Step: Starting Up and Shutting Down the Updater Service
    The ALBPM distribution differs depending on the operating system. In the case of the Updater Service:
    -     For Unix-like operating systems, the service is started or stopped with the albpmwarehouse.sh shell script. The command in this case is going to look like this:
    $ALBPM_HOME/bin$ ./albpmwarehouse.sh start
    -     For Windows operating systems, the service is installed or uninstalled as a Windows service with the albpmwarehouse.bat batch file. The command will look like:
    %ALBPM_HOME%\bin> albpmwarehouse.bat install
    After installing the service, it has to be started or stopped from the Microsoft Management Console. Note also that Windows will automatically start the installed service when the computer starts. In either case the location of the script is ALBPM_HOME/bin, where ALBPM_HOME is the ALBPM installation directory. An example would be:
    C:\bea\albpm6.0\j2eewl\bin\albpmwarehouse.bat
    8. Finally: Running BAM dashboards to show it is Working
    Now we have finally got the BAM service running, we can run dashboards from within Workspace and see the results:
    9. General BAM and Data Mart Caveats
    a. The basic difference between these two collections of measurements is that BAM keeps track of the current process load and execution times, while Data Mart contains a historical view of those same measurements. This is why BAM information is collected frequently (every minute) and cleared out every several hours (or every day), and why Data Mart is updated infrequently (once a day) and grows indefinitely. Moreover, BAM measurements can be thought of as a minute-by-minute sequence of Engine Events snapshots, while Data Mart measurements are a daily sequence of Engine Events snapshots.
    b. BAM and Data Mart table schemas are very similar but they are not the same. Thus, it is important not to use a schema created with the Manage Database for BAM as Data Mart storage or vice-versa. If these schemas are exchanged by mistake, the updater service will run anyway but no data will be added to the tables and there will be errors in the log indicating that the schema is incorrect or that some tables could not be found.
    c. BAM and Data Mart Information and Services are independent from one another. Any of them can be configured and running without the other one. The information is extracted directly from the Engine Database (PPROCINSTEVENT table is the main source of info) for both of them.
    d. So far there has not been a mention of engines, projects or processes in any of the BAM or Data Mart configurations. This is because the metrics of all projects published under the current Process Administrator (or, more precisely, FDI Directory) are going to be collected.
    e. It is also important to note that only activities for which events are generated are going to be measured (and therefore, shown in the metrics). The project default is to generate events only for Interactive activities. This can be changed for any particular activity and for the whole process (where the activity setting, when specified, overrides the process setting). Unfortunately, there is no project setting for events generation so far; thus, remember to edit the level of event generation for every new process that is added to the project.
    f. BAM and Data Mart metrics are usually enriched with Business Variables. These variables are a special type of External Variables. An External Variable is a process variable with the scope of an Instance and whose value is stored on a separate column in the Engine Instances table. This allows the creation of views and filters based on this variable. A Business Variable, then, shares all the properties of an External Variable plus the fact that its value is collected in all BAM and Data Mart measurements (in some cases the value is shown as it is for a particular instance and in others the value is aggregated).
    The caveat here is that there is a maximum of 256 Business Variables per FDI. Therefore, when publishing several projects into a single FDI directory it is recommended to reuse business variables. This is achieved by mapping similar Business Variables of different projects to a single real variable (in the variable mapping performed at publish time).
    g. Configuring the Updater Service Log
    In section 5. Configuring Common Updater Service Parameters we have seen that there are two common Updater properties related to logging. These properties are “Log directory” and “Messages logged from Data Store Updater”, and they specify the location and level of these two files:
    - dwupdater.log: which is the log for the Data Mart updater service
    - bam-dwupdater.log: which is the log for the BAM updater service
    In addition to these two properties, there is a configuration file called ‘WarehouseService.conf’ that allows us to modify these other properties:
    - wrapper.console.loglevel: level for the updater service log
    - wrapper.logfile.loglevel: level for the updater service log
    - wrapper.java.additional.n: additional argument to the service JVM
    - wrapper.logfile.maxsize: maximum size of the updater service log files
    - wrapper.logfile.maxfiles: maximum number of updater service log files
    - wrapper.logfile: updater service log file name (the default value is dwupdater-service.log)
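    As an illustration, an excerpt of WarehouseService.conf using these properties might look like the following. The values are examples only, and the index 3 in wrapper.java.additional.3 is just a placeholder for the next free index (see the caveats in 9.1 below):
    # Updater service log file settings (example values only)
    wrapper.logfile=dwupdater-service.log
    wrapper.logfile.maxsize=10m
    wrapper.logfile.maxfiles=5
    # For troubleshooting, raise both log levels to DEBUG and enable asserts;
    # without -ea no DEBUG messages are generated
    wrapper.console.loglevel=DEBUG
    wrapper.logfile.loglevel=DEBUG
    wrapper.java.additional.3=-ea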
    9.1. Updater Service Log Configuration Caveats
    a. The first three parameters listed above have to be modified when increasing the log level to DEBUG (since the default is WARNING). The loglevel parameters have to be set to DEBUG, and a wrapper.java.additional.n (where n is the next consecutive integer after the ones already used) has to be set to -ea to enable asserts, since without this option no DEBUG message is going to be generated.
    b. Of the other arguments, maxfiles might need to be increased to hold a few more days of data when the log level is set to DEBUG (with the default value, up to two days are stored).
    c. The updater service has to be stopped, uninstalled, installed and then started for any of these changes to take effect.
    Hope this helps,
    Dan

  • Combining Apps tables & custom data mart tables in the same EUL?

    Has anyone tried to do this? We have a situation where the data mart tables are specifically for sales reporting, but we need to combine this with Apps data that is not being captured in the sales data mart tables. The data mart tables are Oracle tables, but not Apps tables.
    How would this work? Would you have to use an Apps-mode EUL? I'm not sure how the security would work: would the responsibilities have any impact on the data in the data mart tables?
    Any thoughts would be greatly appreciated. Thanks.

    Hi user11256032, I just stumbled upon this by accident. The reason no one has answered yet is that it is in the wrong forum. (I can understand why you thought it belonged here.) Please post the question to the Oracle Utilities forum, which is here: Utilities. If that link doesn't work, go to Forum Home, then choose Industries, then Utilities. You may have to select "More ..." under Industries.
    Actually, I suspect there was an SR created for these, so your question may have been answered already.
    If you don't mind me asking, which customer is this for?
    Jeremy

  • Can we transport Data mart data sources?

    Hello Experts,
    I have a question on Data Marts.
    Can we transport data mart DataSources? If yes, how is it different from the normal BW transport process?
    Thanks a lot
    Padma

    Hi,
    You can't transport data mart datasources. You have to regenerate them in every system.
    Cheers,
    Kedar

  • Help in data marting

    Hi gurus,
    Can you please explain the procedure for data marting and how useful it is?
    Thanks

    Hi,
    Check these links:
    Data Marts
    data mart and open hub service
    Re: Uses of Data Mart
    Few questions
    Search the forums for a few more links.
    With rgds,
    Anil Kumar Sharma .P

  • Data Mart

    Hi experts,
    I am using a data mart from an ODS to a cube.
    My issue is that loading data from the ODS to the cube takes a long time. I want to reduce the load time; can anybody give me a good suggestion?
    Regards,
    Siva

    Hi,
    One way is to delete the indexes, as mentioned earlier.
    This is the most common and effective method.
    If you want, I can explain the step-by-step procedure for deleting the indexes; do let me know.
    Another thing to check is that no other load is taking place from the ODS to the cube at the same time, as that can also reduce performance.
    Hope it helps.

  • Deleted a bad delta and the former good deltas from data mart to targets (5)?

    Hi experts,
    My process chain failed continuously for 5 days due to a DSO activation issue. On further analysis I found a bad request with status red from July 3rd, and there is no PSA to fix it from.
    I deleted the bad request that was loaded on July 3rd with status red, and all the good requests with status green from July 4th, 5th, 6th, 7th and 8th.
    The data load goes from the data mart to 5 different targets (3 cubes and 2 DSOs), and I deleted the requests listed above from all 5 targets.
    Now when I try to run a new delta, it says that a new delta is not possible and asks for the previous deltas.
    What should I do now? Should I delete the init, re-init without data transfer, and load my new delta? I am afraid that I may lose data if I do this.
    Thanks in advance.
    Sharat.

    Murali,
    I had the failed delta in the targets (3 cubes and 2 ODSs). I deleted the bad delta and the former good ones after July 3rd from all five targets, verifying the technical names of the requests in the targets.
    How should I proceed now?
    Can I do a full repair request and a new delta? Or should I delete the init, re-init without data transfer, and then do a full repair and a new delta?
    Please let me know. Thank you for your help.
    Sharat.
