Data mart - Urgent

Hi Friends
I have ODS1 and ODS2 loading from R/3. ODS1 is a daily load and ODS2 is a monthly load. On ODS1 I have the fields Property Number and InspDate, and on ODS2 I have Property Number and 0FISCPER.
Now I need to update ODS2 from ODS1. How do I map the fields in the update rule? Must I map Property Number as well, or only the two fields which I want to update?
Thanks.
Regards,
Chama.

Hi
The property numbers are not the same in both ODS objects. Some property numbers exist in ODS1 but not in ODS2.
In terms of dates, I load ODS2 on the 2nd of every month and ODS1 daily, so new changes keep arriving in ODS1.
Say an entry changed on the 1st of the month and I load ODS2 on the 2nd; then I don't want to pick up the changes made after month end (the 30th or 31st).
Please advise me.
It's very urgent.
Thanks.
regards,
Chama.
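The month-end cutoff described above would normally be coded as an ABAP start routine or a routine in the update rule; the following Python sketch only illustrates the filtering logic, with invented field and record names, and is not code from this thread.

```python
from datetime import date, timedelta

def previous_month_end(load_date: date) -> date:
    """Last day of the month before the load date (e.g. load on the 2nd -> the 30th or 31st)."""
    return load_date.replace(day=1) - timedelta(days=1)

def filter_for_ods2(records: list[dict], load_date: date) -> list[dict]:
    """Keep only ODS1 records whose InspDate falls on or before the previous month end."""
    cutoff = previous_month_end(load_date)
    return [r for r in records if r["INSPDATE"] <= cutoff]

# Example: loading ODS2 on 2008-07-02 keeps only inspections up to 2008-06-30.
sample = [
    {"PROPERTY": "P001", "INSPDATE": date(2008, 6, 28)},   # kept
    {"PROPERTY": "P002", "INSPDATE": date(2008, 7, 1)},    # changed after month end -> dropped
]
print(filter_for_ods2(sample, date(2008, 7, 2)))
```

In BW the key fields of the target ODS generally need to be filled in the update rule, so Property Number would still be mapped; a routine like the one sketched above only decides which records get through.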

Similar Messages

  • Error Caller 09 contains error message - Data Marts loading(cube to ODS)

    Dear all,
    Please help me with this problem, it is very urgent.
    I have one process chain that loads data from BW to BW only through data marts. In that process chain, one process loads data from one cube (created by us) into one ODS (also created by us). Data is loaded through full update for the selected period specified in the 'Calendar Day' field of the data selection.
    Previously I was able to load data for 2 months, but some days ago the extraction process suddenly got stuck in the background for a long time and showed the following error:
              Error message from the source system
              Diagnosis
             An error occurred in the source system.
              System Response
             Caller 09 contains an error message.
             Further analysis:
             The error occurred in Extractor . 
             Refer to the error message.
             Procedure
             How you remove the error depends on the error message.
             Note
             If the source system is a client workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    Then we killed that process on the server and, after another attempt, it showed some CALMONTH / timestamp error. After reducing the data selection period the load completed successfully, and I was able to load data for 20 days. After some days the process got stuck again; I followed the same procedure, reduced the period to 15 days and continued. Now I can't even load data for 5 days successfully in one attempt; I have to kill the process in the background and repeat it, and then sometimes it loads.
    Please suggest some solutions as soon as possible. I am waiting for your reply. Points will be assigned.
             Thanks,
              Pankaj N. Kude
    Edited by: Pankaj Kude on Jul 23, 2008 8:33 AM

    Hi Friends !
                      I didn't find any short dump for that in ST22.
                      Actually, what happens is that the request continues to run in the background indefinitely. At that time
    the Status tab in the process monitor shows these messages:
                        Request still running
                        Diagnosis
                        No errors found. The current process has probably not finished yet.
                         System Response
                         The ALE inbox of BI is identical to the ALE outbox of the source system
                           or
                         the maximum wait time for this request has not yet been exceeded
                           or
                        the background job has not yet finished in the source system.
                       Current status
                       in the source system
                        And Details Tab shows following Messages :
                        Overall Status : Missing Messages or warnings
                        Requests (Messages) : Everything OK
                                Data Request arranged
                                Confirmed with : OK
                         Extraction (Messages): Missing messages
                                Data request received
                                Data selection scheduled
                                Missing message : Number of Sent Records
                                Missing message : selection completed
                        Transfer (IDOCS and TRFC) : Everything OK
                                Info Idoc1 : Application Document Posted
                                Info Idoc2 : Application Document Posted
                         Processing (data packet) : No data
                        This process runs indefinitely, then I have to kill it from the server, and then it shows the Caller 09 error in the Status tab.
                        Should I change the value of that memory parameter on the server or not? We are planning to try it today. Does it really belong to this problem? Will it be helpful? What are the risks?
                        Please give your suggestions as early as possible, I am waiting for your reply.
      Thanks,
    Pankaj N. Kude

  • Value gets doubled in the cube when doing a data mart

    Hi,
    I am doing a data mart load from one cube to another cube for different consolidation units, and the value gets doubled for all the consolidation units except one. There are no duplicate results in the cube.
    Can anyone tell me how I can debug this issue or what the reasons could be? This is very urgent.
    Thanks and Regards,
    Subha

    In the cube that is being loaded, try displaying its content. While viewing the content, restrict on the Request ID of the request that loaded the cube.
    If you find the consolidation unit values doubled there, check your update rule and transfer rule to see whether the value is being changed anywhere.
    Then check the PSA.
    Hope that helps.
    Regards.

  • Data Mart Status of the request is not ticked

    Hello Everybody,
    This is my first time dealing with BW.
    After I loaded the data to the ODS, the request in the manage data view was not ticked like the others, although the job completed successfully.
    Can anyone help?
    Many Thanks
    F-B-I

    Hi,
    Have you deleted the data mart status in the ODS?
    You need to follow these steps:
    1. If you changed the status to red, there is no need to delete the data mart status in the ODS; you can load the data from the ODS to the InfoCube.
    2. If the status is green, you need to delete the data mart symbol in the ODS.
    regards
    sivaraju

  • Unable to update the Data from Cube to Data Mart

    Hi,
    I have a problem with the data load to a cube (data mart) in BW. When I check it in RSA3 it shows 0 records. The data flow is: R/3 -> ODS (BW) -> Cube (BW) -> Cube (BW, data mart for the APO cube) -> APO system. In BW, the final data target is loaded via full load, with the previous request deleted each time. When I check the content of this final cube (the data mart to APO), records are available, yet when I check the same data target in RSA3 it shows 0 records.
    Why? Please help me.
    Regards,
    krishna

    Hi,
    I checked the data mart in RSA3. It is not a matter of full upload or delta upload.
    thanks
    Krishna

  • Data mart cube to cube copy records are not matching in target cube

    Hi Experts,
    I need help with the questions below on a data mart cube-to-cube copy (8M*).
    It is a BW 3.5 system.
    We have two financial cubes.
    Cube A1 is sourced from the R/3 system (delta update) and Cube B1 is sourced from Cube A1 (full update). The two cubes are connected through update rules with one-to-one mapping and no routines. Basis did a copy of the back-end R/3 system from the production to the quality server; this happened approximately 2 months ago.
    Cube A1, which extracts the delta load from R/3, is loading fine. But for the second cube (extraction from Cube A1) I am not getting the full volume of data; I am getting only a meagre amount, although the load shows successful status in the monitor.
    We tried giving conditions in the InfoPackage (as was done in the previous year's loads), but it still fetches the same meagre volume of data.
    To check whether this happens only for that particular cube, we tried other cubes sourced from the Myself system, and they also get meagre data rather than the full volume.
    For example, if 1,000 records are available for an employee, the system randomly extracts only about 200 records.
    Any quick reply would be very helpful. Thanks.

    Hi Venkat,
                  Did you do any selective deletions in Cube A1?
    First reconcile the data of Cube A1 and Cube B1;
    match the totals of Cube A1 with the totals of Cube B1.
    Thanks,
    Vijay.
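    A rough way to do that reconciliation outside the GUI, purely as an illustration: export the relevant key figure from both cubes (for example via LISTCUBE downloads) and compare the totals per characteristic value. The file and column names below are invented for the example, not anything from this thread.

```python
# Hypothetical reconciliation of two cube extracts (e.g. LISTCUBE downloads to CSV).
import pandas as pd

cube_a1 = pd.read_csv("cube_a1_extract.csv")   # assumed columns: EMPLOYEE, AMOUNT
cube_b1 = pd.read_csv("cube_b1_extract.csv")

totals_a1 = cube_a1.groupby("EMPLOYEE")["AMOUNT"].sum()
totals_b1 = cube_b1.groupby("EMPLOYEE")["AMOUNT"].sum()

# Align both total series and list every employee whose totals differ;
# those are the keys where records were lost on the A1 -> B1 data mart load.
diff = pd.concat([totals_a1, totals_b1], axis=1, keys=["A1", "B1"]).fillna(0)
print(diff[diff["A1"] != diff["B1"]])
```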

  • Get back the Data mart status in ODS and activate the delta update.

    I got a problem when deleting the requests in ODS.
    Actually there is a cube (1st level) that gets loaded from an ODS (2nd level), which in turn gets loaded from 3 ODS objects (3rd level). We wanted to delete recent requests from all the data targets and reload from PSA. But while deleting the request in the ODS (2nd level), a window was displayed showing the following:
    - The request 132185 has already been retrieved by the data target BP4CLT612.
    - Delta update in BP4CLT612 must be deactivated before deleting the request.
    - Do you want to deactivate the delta update in data target BP4CLT612?
    I clicked on execute changes in the window. It removed the data mart status for all the requests, including the ones I had not deleted.
    The same thing happened in the 3 ODS objects (3rd level).
    I understand that if we now load further data from the source system, it will load all the records from the beginning.
    To avoid this, can anybody help me reset the data mart status and activate the delta update?

    Hi Satish,
    You have to set the requests to red in the cube and back them out of the cube before you can delete requests from the base targets (from which the cube gets its data).
    Then you have to reset the data mart status for the requests in your L2 ODS before you can delete requests from that ODS.
    I think you tried to delete without resetting the data mart status, which has upset the delta sequence.
    To correct this:
    For the L2 ODS, do an init without data transfer from the 3 lower ODS objects, after removing the init request from the scheduler menu in the init InfoPackage.
    Do the same from the L2 ODS to the cube.
    Then reconstruct the deleted request in the ODS. It will not show the tick mark in the ODS. Do a delta load from the ODS to the cube.
    See the thread below:
    Urgentt !!! Help on reloading the data from the ODS to the Cube.
    cheers,
    Vishvesh

  • Data mart status has beed reset by del the in between req in source ODS

    Hi SDN,
    We have a situation in which a daily delta load runs from an ODS to a cube, and we accidentally reset the data mart status by deleting a request that sat in between multiple requests in the source ODS. When we deleted the request in the ODS we got a pop-up asking 'do you want to delete the data in the cube', and we confirmed it. After that, in the manage view of the source ODS, the data mart status is no longer shown for any of the requests that have been loaded to the target cube. We then reconstructed the data for the deleted request from the PSA.
    When a new load comes into the ODS the next day, will the ODS send the correct delta update to the target cube?
    If the correct delta is not updated to the cube, is there any method we can follow to maintain data consistency without deleting the data in the target cube?
    Thank you,
    Prasaad

    Hi,
    You deleted data in the cube and reloaded, but the data mart status is not appearing in the ODS. If you have all the data in the ODS, delete the data in the cube, then delete the data mart status in the ODS, right-click the ODS and choose Update Data in Data Target. A fresh request will then update the cube; this acts as the init, and deltas will flow from the next day onwards.
    Thanks
    Reddy

  • Giving Error while generating the Data mart to Infocube.

    Hi Gurus,
    I need to extract APO InfoCube data into a BW InfoCube. For that I am trying to generate the data mart for the APO InfoCube, so that I can use it as a DataSource to extract the APO InfoCube data into the BW InfoCube.
    While generating the data mart it gives errors like the ones below:
    Creation of InfoSource 8ZEXTFCST for target system BW 1.2 failed
    The InfoCube cannot be used as a data mart for a BW 1.2 target system.
    Failed to create InfoSource &v1& for target system BW 1.2.
    Please suggest what I should do about this error.
    Thanks a lot in advance.

    Hi,
    Point 1: What is a Planning Area:
    http://help.sap.com/saphelp_scm41/helpdata/en/70/1b7539d6d1c93be10000000a114084/content.htm
    Point 2: Creation steps for a Planning Area:
    http://www.sap-img.com/apo/creation-of-planning-area-in-apo.htm
    Note: We will not create the Planning Area; this will be done by the APO team.
    Point 3: After opening transaction /n/SAPAPO/MSDP_ADMIN in APO you will be able to see all the planning areas.
    Point 4: Select your planning area, go to the Extras menu and click on Generate DS.
    Point 5: The system automatically generates the DataSource in APO (the naming convention starts with 9). Replicate the DataSource in BI, map it to your cube and load the data.
    Regards
    Ram.

  • Table Onwers and Users Best Practice for Data Marts

    2 Questions:
    (1) We are developing multiple data marts that share the same instance. We want to deny access to the users while tables are being updated. We have one generic user (BI_USER) with read access through one of the popular BI tools. For the current (first) data mart we denied access by revoking the privilege from BI_USER; however, going forward, the tables in other data marts will be updated on different schedules and we do not want to deny access to all the data marts at once. What is the best approach?
    (2) What is the best Methodology for table ownership of tables in different data marts that share tables across marts? Can we create one generic ETL_USER to update tables with different owners?
    Thanx,
    Jim Masterson

    If you have to go with generic logins, I would at least have separate generic logins for each data mart.
    Ideally, data loads should be transactional (or nearly transactional), so you don't have to revoke access ever. One of the easier tricks to accomplish this is to load data into a shadow table and then rename the existing table and the shadow table. If you can move the data from the shadow table to the real table in a single transaction, though, that's even better from an availability standpoint.
    If you do have to revoke table access, you would generally want to revoke SELECT access to the particular object from a role while the object is being modified. If this role is then assigned to all the Oracle user accounts, everyone will be prevented from viewing the table. Of course, in this scenario, you would have to teach your users that "table not found" means that the table is being refreshed, which is why the zero downtime approach makes sense.
    You can have generic users that have UPDATE access on a large variety of tables. I would suggest, though, that you have individual user logins to the database and use roles to grant whatever ad-hoc privileges users need. I would then create one account per data mart (with perhaps one additional account for the truly generic tables) to own each data mart's objects. Those owner accounts would then grant different database privileges to different roles, and you would then grant those roles to different users. That way, Sue in accounting can have SELECT access to portions of one data mart and UPDATE access to another data mart without granting her every privilege under the sun. My hunch is that most users should not be logging in to, let alone modifying, all the data marts, so their privileges should reflect that.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC
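    The shadow-table swap described above can be sketched roughly as follows, using the python-oracledb driver. All object names (sales_fact, sales_fact_staging, the connection parameters) are invented for the illustration, and in practice you would also have to re-create indexes, constraints and grants on the promoted table.

```python
# Rough sketch of a shadow-table refresh in Oracle; names are placeholders.
import oracledb

def refresh_with_shadow_table(user: str, password: str, dsn: str) -> None:
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        cur = conn.cursor()
        # 1. Create an empty shadow copy of the live table.
        cur.execute("CREATE TABLE sales_fact_shadow AS SELECT * FROM sales_fact WHERE 1 = 0")
        # 2. Load the fresh data into the shadow table (stand-in for the real ETL step).
        cur.execute("INSERT INTO sales_fact_shadow SELECT * FROM sales_fact_staging")
        conn.commit()
        # 3. Swap with two quick renames; readers only lose the table for the
        #    instant between the two DDL statements, so no long revoke is needed.
        cur.execute("ALTER TABLE sales_fact RENAME TO sales_fact_old")
        cur.execute("ALTER TABLE sales_fact_shadow RENAME TO sales_fact")
        cur.execute("DROP TABLE sales_fact_old PURGE")
```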

  • Automatic loading from data marts

    Hi All,
    I have a cube which has a data flow wherein I have an ODS1 which gives data to ODS2 and then finally to the cube.
    I have put this entire process in a process chain.
    But many times, due to an erroneous record, the ODS1 load fails and then I have to correct the PSA manually and do a manual activation in ODS1 as well.
    I now want to know whether the system will pick up the processes from the process chain again and continue the loading automatically, or whether automated loading settings can be defined in the InfoPackage so that the subsequent data targets are loaded once I have loaded ODS1 successfully.
    Also, if I make such a setting, will the job run under my username or under BWALEREMOTE?
    Thanks,
    Sharmishtha Biswas

    Thanks for your reply.
    But is it possible that the data mart does it automatically?
    I have observed that some of the subsequent InfoProviders get loaded automatically after I manually load and activate one data mart.
    I wanted to find an explanation for this.
    Thanks,
    sharmishtha

  • How to find Info Source for Export Data source in Data Marts node

    Hi
                  I need to load data from an ODS to an InfoCube. I created the export DataSource for the ODS. I can see the export DataSource, but in the Data Marts node of the InfoSource tree I cannot find the InfoSource for the export DataSource I created. I replicated the DataSources in the BW source system. I also tried to use Insert Lost Nodes from the context menu of the InfoSource node, but nothing worked. Please let me know what I need to do to see the InfoSource under the Data Marts node.
    Thanks
    Padma

    In the InfoSource tab in RSA1, use Settings --> Display Generated Objects.
    You will then be able to see the data mart InfoSources.

  • Deletion of Data Mart Request taking more time

    Hi all,
    One of the data mart processes (ODS to InfoCube) has failed.
    I am not able to delete the request in the InfoCube.
    When the delete option is executed it takes a very long time; I am monitoring the job in SM37 and it is not completing.
    The details seen in the Job Log is as shown below,
    Job started                                                                  
    Step 001 started (program RSDELPART1, variant &0000000006553, user ID SSRIN90)
    Delete is running: Data target BUS_CONT, from 376,447 to 376,447             
    Please let me know your suggestions.
    Thanks,
    Sowrabh

    Hi,
    How many records are there in that request? Deleting a request usually takes a while, and the deletion time varies with the data volume in the request.
    Give it some more time and see if it finishes. To find out whether the job is actually doing anything, go to SM50/SM51 and look at what is happening.
    Cheers,
    Kedar

  • Number of datarecords with Data Marts

    Hi,
    I'm transporting a cube with a fact table of 14,000 records into another BW system.
    When checking the load, it only processes 12,000 records.
    I took a look at the IDoc in the source system: also 12,000 records.
    How is this possible? There is only one load (meaning only one DIM ID for the package). All the records should be unique in the F table, so I expect 14,000 records (there are also no routines in between).
    Best regards,
    Brian

    Hi,
       Check at the InfoPackage level only. Compare the data in both BW systems and see which data is missing. Before loading, load the master data first. Take the case where order number 100 is missing: go to the sending BW system and run the extractor checker there. If you can find the data there, it should come through to the other BW system. Analyse the settings carefully and you will find the solution. In the worst case (if the data is urgent), go with a pseudo delta.
    all the best.
    nagesh.

  • Data Mart and Data Extraction from an Infocube

    Can a data mart which is built on one InfoCube in BW support delta extraction? We have two separate BW systems and are trying to extract data from an InfoCube in one BW (source) and load it into an InfoCube in another BW (target).
    We have built a data mart on the source BW InfoCube and have successfully extracted and loaded the initial data into the target BW InfoCube. I noticed that the field 0RECORDMODE was not on the DataSource that was created to support this data mart, so my gut feeling is that we will not be able to do delta extractions from this data mart. Any feedback or confirmation of this?

    Hi,
    The InfoCube that is used as an export DataSource is first initialized, meaning that the current status is transferred into the target BW. When the next upload comes around, only those requests that have come in since the time of initialization are transferred. Different target systems can also be supplied in this way.
    Since the delta is request based, a data mart on an InfoCube does support delta.
    Hope it's helpful,
    Anu.
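    A toy sketch of that request-based delta idea, purely illustrative (this is not the actual BW mechanism and the request numbers are invented): the source keeps track of which request IDs each target has already received and, on the next upload, hands over only the newer ones.

```python
# Toy model of request-based delta: only requests not yet sent to the target
# are transferred on the next upload. Request IDs here are made up.

def delta_requests(all_requests: list[int], already_sent: set[int]) -> list[int]:
    """Return the requests that still have to go to the target system."""
    return sorted(r for r in all_requests if r not in already_sent)

# Initialization: everything currently in the cube is transferred once.
cube_requests = [101, 102, 103]
sent_to_target: set[int] = set()
print(delta_requests(cube_requests, sent_to_target))   # [101, 102, 103]
sent_to_target.update(cube_requests)

# Next upload: only the requests loaded since the init are picked up.
cube_requests += [104, 105]
print(delta_requests(cube_requests, sent_to_target))   # [104, 105]
```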
