Problem in Datamart.

Hi Experts,
I have a data mart from 0SD_C03 to IC_MIS. I added an attribute to 0CUSTOMER and made it navigational, and I also made it navigational in 0SD_C03. After that, I loaded data from 0SD_C03 to IC_MIS; the load still shows as running, and when I check the Manage tab the records are transferred but not added.
In SM37 the job has also finished.
I didn't add anything to the second cube.
What could the problem be?
Please help me with this.
Thanks in advance,
Venkat

Hello,
Try activating the cubes 0SD_C03 and IC_MIS.
If you need the navigational attribute in IC_MIS, make that setting there as well.
Set the request to red, delete the loaded request (including from the PSA), and schedule the load again.
Regards,
Dhanya

Similar Messages

  • Datamart Problems

    Hi All,
            I am new to BI and have some doubts:
           1. If I am loading 5 cubes from an ODS, and the delta loaded correctly into 4 cubes but failed for one, what should I do to correct the delta and reload only that cube? Will the successful requests in the other cubes be affected by this failed delta?
           2. My ODS data is corrupted: the present delta is fine, but the past 2 deltas were loaded incorrectly. How can I repair and reload those past 2 deltas correctly?
           3. In another scenario, some key figures in an ODS are overwritten in the update rules, and the requirement now is to use the update type 'addition'. If I change my update rules, will I face any consequences? And can we maintain both update types for key figures if necessary?
    Regards,
    Aditya.

    Hi Aditya,
    Answer for 1: reset the data mart status and reload it.
    Answer for 2: a repair full request is the best option when you come across this type of problem.
    Answer for 3:
    The update mode for a key figure can be maintained as either overwrite or addition.
    If the DSO is used only for staging, use overwrite, so that unique records are updated to the higher-level target; if it is used for your reporting requirements, set the update mode to addition.
    Once the update mode is set and the DSO has been loaded, it is not advisable to change it afterwards, or you will run into reconciliation issues. It depends purely on your business requirements.
    Thanks,
    Sathish.
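
The overwrite vs. addition behaviour Sathish describes can be sketched in a few lines of Python (a toy illustration of the two DSO key-figure update modes, not SAP code; the record layout is made up):

```python
# Toy illustration of the two update modes discussed above:
# "overwrite" keeps the last value per key, "addition" sums the values.

def load_records(dso, records, mode):
    """Apply incoming (key, value) records to a dict acting as the
    active table, using the given key-figure update mode."""
    for key, value in records:
        if mode == "overwrite" or key not in dso:
            dso[key] = value          # last record per key wins
        elif mode == "addition":
            dso[key] += value         # key figures are accumulated
    return dso

delta = [("DOC1", 10), ("DOC1", 5), ("DOC2", 7)]

print(load_records({}, delta, "overwrite"))  # {'DOC1': 5, 'DOC2': 7}
print(load_records({}, delta, "addition"))   # {'DOC1': 15, 'DOC2': 7}
```

This is also why changing the mode after data has been loaded causes reconciliation issues: the same delta history produces different totals under the two modes.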

  • Problem with 0RECORDMODE in datamart from ODS to Cube

    Hi all,
    I need to build a data mart from a cube to an ODS. When I create the update rules (in 3.5), I get the message "the InfoObject 0RECORDMODE does not exist in the InfoSource". That seems fine, because the cube does not need 0RECORDMODE. The strange thing is that everything I need to build already exists and works, and I don't know how that consultant managed it. How can I do this? Help, guys.

    Hi,
    You just create the export DataSource for this InfoCube, then create the flow from the cube to the ODS. From cube to ODS, the delta works on a request basis, so the ODS will be updated according to each new request loaded into the InfoCube.
    Thanks,
    Kamal
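
The request-based delta Kamal mentions can be sketched roughly like this (a toy Python illustration of the idea, not SAP code; the request structure is an assumption):

```python
# Toy sketch of request-based delta: the data mart interface remembers
# the last request it transferred, and each delta load picks up only
# the requests added to the source cube after that point.

def delta_requests(cube_requests, last_sent_id):
    """Return the requests not yet transferred, in load order."""
    return [r for r in cube_requests if r["id"] > last_sent_id]

cube = [{"id": 1, "rows": 100}, {"id": 2, "rows": 40}, {"id": 3, "rows": 25}]

# First delta after request 1 was transferred: picks up requests 2 and 3.
print(delta_requests(cube, last_sent_id=1))
```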

  • ODS & Datamart Problem (A.H.P/Roberto/Bhanu Gupta/Anyone)

    Hi Experts,
    We have an ODS for which we have data in the active data table but are not able to see it in the change log. When we click on "Generate Export DataSource", we get a short dump.
    What might the problem be? Kindly suggest.
    We would appreciate an immediate response.
    Thanks & Regards
    Sajeed

    Hi Raj,
    Thanks for the early response.
    We have activated it; otherwise, how would the data be available in the active data table?
    Any other thoughts?
    Regards
    Sajeed

  • DataMart problem

    Hi Gurus,
    I clicked "Generate Export DataSource", but I don't see it in the data mart.
    Please advise what the reason could be.
    Thanks
    Liza

    Under InfoSources, refresh and look in the Data Marts application component for the export DataSource (8*). Also check that 'Show generated ODS objects' is selected under Settings --> Display generated objects.

  • ODS Activation Problem - Error getting SID

    Hi All,
    I am facing a problem while activating a request:
    "Error getting SID for ODS object xxxx". I know there are a lot of threads in the forum about this, but they all talk about loading data into the ODS. Here I loaded the data into my first ODS (that works fine) and then transferred the data to a second ODS; I get the problem while activating the data in the second ODS. The second ODS is flagged for BEx reporting.
    If master data did not exist, it would also be a problem in my first ODS, but I didn't face any problem there. The first ODS loads data from R/3, and the second ODS loads via the data mart interface.
    Any ideas how to solve this problem.
    Thanks,

    Use transaction RSRV: Tests in Transaction RSRV -> All Elementary Tests -> ODS Objects -> Foreign Key Relationship of Reporting-Relevant ODS Object and SID Table Characteristics (double-click).
    In the right-hand window, expand "Foreign Key Relationship of Reporting-Relevant ODS Object and SID Table Characteristics", enter the second ODS name, and click "Transfer".
    Now click "Execute" (toolbar) and check whether the results are displayed with a green icon.
    Otherwise, go back, click "Correct Error" (toolbar), and then try to activate the second ODS again.

  • Is Business Objects a ROLAP tool? Does performance decrease if we access the datamart?

    Hi Bo gurus,
    I am getting good responses from you guys and keep improving myself through the SDN community. Now I have a different question, as follows.
    We have created a data mart from the database; it has no physical OLAP existence. We created the target tables using Data Integrator.
    We now use BO XI R3.1, created the universe according to the business requirements, and generated the reports and dashboards from the data mart. The problem we are facing is that for generating reports in Voyager we fall short of an OLAP cube, since Voyager can only access an OLAP cube, not the universe directly the way the reports do.
    The client has asked why we did not build the data mart as OLAP cubes. I replied that for creating a universe there is no need for OLAP; we can create the universe and generate reports from the data mart target tables themselves.
    But the client keeps asking why we did not build it as an OLAP cube. If I do build one, which is the better method to implement, MOLAP or ROLAP?
    I am trying to convince them that ROLAP is best, as it can handle large databases, but the client says performance will be faster if we implement MOLAP.
    My question: will creating the universe on top of OLAP be faster and make a difference, or will building it directly from the target tables (data mart facts and dimensions) work fine?
    I have gone through some sites saying that BO is a ROLAP tool and that Cognos is a ROLAP tool; is that true?
    Please give me some suggestions on this.
    Thanks in advance, gurus,
    santose

    Hi,
    I would recommend posting this in the "Enterprise Information Management" forum. Data Services belongs to the EIM product line of SAP BusinessObjects, and that is where the experts for the tool are.
    Regards
    -Seb.

  • Missing Datamarts in the Infosource Screen

    Hi,
    I needed to experiment with loading from one InfoCube to another InfoCube for some re-modeling to be done at the client's side. The problem I am facing is that none of the data mart DataSources show up on the InfoSources tab. They appear under the BW system on the Source Systems tab, but I can't see them under the InfoSources tab.
    I have tried all the solutions I am aware of, such as Settings --> Display Generated Objects / Select Display Options, etc. I also find entries in the ROOSOURCE table on the BW side, but can't see them on screen.
    Has anybody faced a similar problem? I don't know if it has anything to do with transferring the application component hierarchy from R/3. That was not done in the initial implementation, so I have now done it and replicated on the BW side. I am still missing the whole Data Marts application component on the InfoSource tab.
    Any ideas, anybody? Am I missing something major here? Thanks in advance.

    hi,
    in the 'old days' we solved this problem with OSS note 142947 - Data Marts application component missing:
    Symptom
    After the metadata upload, export InfoSources are not displayed in the InfoSource tree.
    Additional key words
    Application component
    Cause and prerequisites
    The entry for the component 'Data Marts' is missing in the BW administration of the InfoSource hierarchy.
    Solution
    Maintain the following table entries in table RSAPPL of the target BW using transaction SE16:
    1. APPLNM = DM, APPLNM_P = (empty), APPLNM_C = DM-IO, APPLNM_N = NODESNOTCONNECTED, RELEASE = 40B, TXTSH = Data Marts, TXTMD = Data Marts, TXTLG = Data Marts
    2. APPLNM = DM-IO, APPLNM_P = DM, APPLNM_C = (empty), APPLNM_N = (empty), RELEASE = 40B, TXTSH = Master data DM, TXTMD = Master data Data Marts, TXTLG = Master data Data Marts
    In addition, change the entry APPLNM = SAP R/3 as follows:
    APPLNM = SAP-R/3, APPLNM_P = (empty), APPLNM_C = SAP-R/3-IO, APPLNM_N = DM, RELEASE = 30D, TXTSH = application components, TXTMD = application components, TXTLG = application components

  • BW - APO Datamart datasource migration issue

    Hi,
    Question: can we migrate DataSources exported as data marts from an APO-DP system into BW from 3.x to 7.0, or does it have to remain a 3.x DataSource with transfer rules, transfer structure, update rules, etc.?
    1) I know we cannot migrate data marts beginning with 8* or //8, but this data mart is replicated from our APO-DP system and begins with 9*.
    Thanks .
    HS

    Hi Geetanjali,
    Thanks for the reply.
    My problem is that I am getting the error message "No transfer structure available for infosource 9* in source system" when I run the InfoPackage to load into the PSA (which is the default for 7.0 DataSources).
    This is the data flow I want to have, which replaces the old model, as shown:
    NEW 7.0 data flow: DataSource --> InfoPackage --> PSA --> DTP --> transformation --> cubes (the same source feeds 3 cubes).
    OLD 3.x data flow: DataSource --> InfoSource --> transfer rules --> InfoPackage --> update rules --> cubes.
    Thanks .
    HS

  • Problem relating multiple fact tables

    I have 2 fact tables in my cube (Encounter and Visit). Every visit has at least 1 encounter, but not every encounter has a visit. I also have a department dimension which is related to both facts, and an appointment status dimension which is related only to the Visit fact.
    The users want to look at the count of encounters by department and appointment status. At first I was having an issue with this, because when using the encounter count I would end up with the same value in each row. So I was able to create a dimension from my Visit fact table and then have the Encounter fact reference the appointment status dimension through the Visit fact dimension. The problem I am having now is that since not every encounter has a visit, my count of encounters only includes the encounters that have visits. What I currently have is "joining out" the encounters that don't have a visit. How do I work around this so that all of the records from the Encounter fact are in the cube regardless of whether or not they have a visit?
    Do I need to implement unknown members in this case?  I'm not sure where to go.
    Thanks for the help,
    Andy

    So yes, if you are going to have some nulls, you want to implement unknown members, either:
    1. "manually" in the ETL, by making yourself a custom-created unknown record in each dimension with the values your users want to see in each column (e.g. the unknown record might have product_category = 'Misc'), or
    2. using the built-in Analysis Services functionality to convert null to unknown.
    Typically in my cubes, facts are related only to dimensions, not directly to each other. So I think I am not getting a full description of the relationships in your cube, and I can't comment on your overall setup. I would say that if you can determine the appointment status of an encounter based on the visit (or null/unknown if the visit is null/unknown), then you should go ahead and use that information to directly relate encounter and appointment status. I would tend to do it in the ETL; you can always use a view layer over the datamart dimension tables to do this easily without snowflaking your actual cube with reference dimensions.
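
Option 1 above (a hand-made unknown member in the ETL) can be sketched roughly like this (a toy Python example; the column names visit_id/appt_status and the key -1 are illustrative assumptions, not taken from the thread):

```python
# During ETL, fact rows whose foreign key has no match in the dimension
# are pointed at a custom "unknown" record instead of being dropped
# ("joined out") by the join.

UNKNOWN_KEY = -1

def build_dimension(rows):
    """Dimension lookup plus a hand-made unknown member."""
    dim = {r["visit_id"]: r for r in rows}
    dim[UNKNOWN_KEY] = {"visit_id": UNKNOWN_KEY, "appt_status": "Unknown"}
    return dim

def load_facts(facts, dim):
    """Replace missing/unmatched visit keys with the unknown member."""
    out = []
    for f in facts:
        key = f.get("visit_id")
        if key not in dim:
            key = UNKNOWN_KEY        # keep the encounter, don't "join out"
        out.append({**f, "visit_id": key})
    return out

dim = build_dimension([{"visit_id": 10, "appt_status": "Completed"}])
facts = [{"encounter": "E1", "visit_id": 10},
         {"encounter": "E2", "visit_id": None}]   # encounter without a visit
print(load_facts(facts, dim))
# E1 keeps visit 10; E2 is assigned the unknown member, so all
# encounters survive the join and the counts stay complete.
```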

  • Datawarehouse with dependent datamarts

    Hello Everyone,
    Could you please guide me on the following:
    1) How do I create a data warehouse with a focus on building dependent data marts on top of it?
    2) What is the simplest procedure to load these data marts from the data warehouse?
    3) Can I have multiple fact tables in the data warehouse, with each fact table and its set of dimensions catering to one data mart?
    4) Additionally, can we load data directly into these data marts without loading it into the data warehouse first?
    Any reference material dealing with this scenario would be highly appreciated.
    Thanks Much,
    Sri

    Hi,
    not using a central data warehouse causes several BIG problems. You will build cubes for a specific department and cannot integrate them, including their ETL processes, for other departments. Larissa Moss calls these 'swim lanes'! I would never do this; it is a low level of maturity for a DWH, and it is how things were done some years ago. Golden rule: think big, start small. You can build cubes without building the complete enterprise model for your whole company. Just model the parts you need for your project, but keep in mind that they must fit other departments too.
    And try to put all department-specific logic in the load processes from the DWH to the data marts. Then you can reuse all the data and load processes of your DWH to build data marts for other departments. If you load the data marts straight out of the source, the data marts for other departments (which may have only small differences from the existing ones) must be built from scratch. You will have a lot of redundant code, a really bad architecture, too many process flows, ...
    Regards,
    Detlef

  • Problem in Data Mart

    I tried to transfer data from one InfoCube (ZE1) to another InfoCube (ZE2).
    Please find below the procedure I followed:
    1. InfoCube ZE1 was successfully populated with 20 records from a flat file.
    2. InfoCube ZE2 was created as a copy of ZE1.
    3. I generated the export DataSource in ZE1 and replicated this new DataSource, 8ZE1, in the B3TCLNT800 (BW client 800) source system, then activated the transfer rules. The InfoSource is in the Data Marts folder.
    4. The update rules for ZE2 were created using InfoCube ZE1 (data mart concept) under the DataSource tab.
    5. I scheduled a load package on the InfoSource 8ZE1 to load the new InfoCube ZE2.
    Now, while the scheduled load runs, a short dump is created:
    No request IDoc generated in BW.
    Exception condition "INHERITED_ERROR" raised.
    Could you tell me what the reason behind it could be?
    Thanks in advance.

    Hi Venkat,
    Posting OSS note 568768 for your reference:
    <u><b>Symptom</b></u>
    A short dump with an SQL error occurs, or a message indicates that an SQL error occurred. You need to find out more about the failing statement and the reasons for the failure.
    A short dump with exception condition "INHERITED_ERROR" occurred: a RAISE statement in the program "SAPLRSDRC" raised the exception condition "INHERITED_ERROR".
    <u><b>Other terms</b></u>
    Shortdump SQL Error Oracle DB6 DB2 MSSQL SAPDB Informix Database dev_w0 developer trace syslog UNCAUGHT_EXCEPTION CX_SY_SQL_ERROR DBIF_REPO_SQL_ERROR INHERITED_ERROR SAPLRSDRC RSDRC RSDRS SAPLRSDRS DBIF_RSQL_SQL_ERROR
    <u><b>Reason and Prerequisites</b></u>
    A short dump with an SQL error occurs, e.g. during BW aggregate build, compression, BW queries, or data mart extraction, or an SQL statement fails without a short dump.
    The actions mentioned will already indicate the cause of the error. In particular, the database administrator will frequently be able to recognize the solution immediately. Nevertheless, you should create an OSS message and attach the SQL error message, including additional information (such as the SQL statement that occurred or the error message text), to the problem message.
    This note combines the two OSS notes 495256 and 568768.
    <u><b>Solution</b></u>
    1. If a short dump occurred, get the work process and application server where the dump occurred; otherwise continue with the next step.
               Inside the short dump, scroll to "System environment". Either you find the work process number here, or you continue with the next step.
    2. Get the work process number from the syslog.
                Go into the system log (transaction SM21) and search for the short dump entry with the same timestamp in the syslog of the application server where the short dump occurred.
               The column "Nr" contains the work process number. There may already be syslog entries before this one containing more information about the error.
               Example: the short dump contains
    UNCAUGHT_EXCEPTION
    CX_SY_SQL_ERROR
    04.11.2002 at 14:47:32
               The syslog contains:
    Time     Ty. Nr Cl. User    Tcod MNo Text
    14:47:32 BTC 14 000 NAGELK  BY2 Database error -289 at EXE
    14:47:32 BTC 14 000 NAGELK  BY0 > dsql_db6_exec_immediate( SQL
    14:47:32 BTC 14 000 NAGELK  BY0 > Driver][DB2/LINUX] SQL0289N
    14:47:32 BTC 14 000 NAGELK  BY0 > table space "PSAPTEMP". SQLS
    14:47:32 BTC 14 000 NAGELK  BY2 Database error -289 at EXE
    14:47:32 BTC 14 000 NAGELK  R68 Perform rollback
    14:47:32 BTC 14 000 NAGELK  AB0 Run-time error "UNCAUGHT_EXCEP
    14:47:32 BTC 14 000 NAGELK  AB2 > Include RSDRS_DB6_ROUTINES l
    14:47:33 BTC 14 000 NAGELK  AB1 > Short dump "021104 144732
                So from the syslog we get the information that the reason for the error is:
                Database error -289 at EXE dsql_db6_exec_immediate( SQLExecDirect ): [IBM][CLI Driver][DB2/LINUX] SQL0289N Unable to allocate new pages in table space "PSAPTEMP". SQLSTATE=57011
                The database error code is -289, the error text is "Unable to allocate new pages in table space "PSAPTEMP"", and the work process number is "14".
    3. Display the developer trace.
                Go to transaction SM51 and select the correct application server (where the short dump occurred); you are now in transaction SM50 for this application server. Find the work process with the number you got from the short dump or syslog (in our example, number 14), and in the menu bar select Process -> Trace -> Display File.
               In the developer trace, search for the timestamp. In our example we get the entry:
    C Mon Nov  4 14:47:32 2002
    C  *** ERROR in ExecuteDirect[dbdb6.c, 5617]
    C  &+  0|     
    dsql_db6_exec_immediate( SQLExec...
    C  &+  0
    able space "PSAPTEMP".  SQLSTATE=57011
    C  &+  0
    C  &+  0|     
    INSERT INTO "/BIC/E100015" ...
    C  &+  0
    |    |     ...
    4. For most of the important SQL statements generated by BW, when an error occurs, the SQL statement is saved as a text file called, for example, SQL00000959.sql (SQL<error code>.sql). If the statement is run in a dialog process, the file is in the current directory of your SAP GUI on the front-end PC (for example, C:\Documents and Settings\schmitt\SAPworkdir); with batch jobs, it is stored in the DIR_TEMP directory (see transaction AL11) on the application server.
    5. For SAP internal support:
                Using the SQL error codes determined under (1) as well as the relevant database platform, you can obtain a more detailed description of the error and possible error causes at http://dbi.wdf.sap.corp:1080/DBI/cgi/dberr.html.
    Hope this helps.
    Bye
    Dinesh

  • Can anyone clear up a few things about data marts?

    Hi All,
    I am trying to load data from an ODS to a master data object.
    The master data object has already been assigned to the PC files source system, so I am trying to assign another source system, which will be the Myself system. When I give the name of the source system (as Myself), a popup screen appears asking me to assign the DataSource. Here comes the problem for me: I need to update the master data with the contents of the ODS, so my DataSource will be the 8(ODS) one, right? But I am not finding that DataSource name in the list for my ODS, although I am able to see the DataSource under the Myself source system in the Data Marts section.
    And one more question: if you go to the InfoSource screen of any ODS (I mean the 8ODS InfoSource), the menu bar has drop-down menus like 1. InfoSource, 2. Edit, 3. Goto, 4. ???, 5. Environment.
    What does menu 4 mean here, and why is its name not displayed? If we click on it, we get options like 1. InfoSource status, 2. ??
    What is option 2 here?
    If we click on the question mark, we get two radio buttons:
    1. transfer program 8ztods_vc
    2. PSA access program.
    Can anyone tell me what will happen if we choose one of them, and where we can see the results if we choose one and execute it?
    Thanks in Advance ,

    Hi,
    Here you tried to connect an InfoSource for direct update to a transactional data DataSource (8...). That is why you are not able to see the DataSource 8...
    So create an InfoSource for flexible update for your master data characteristic. After activating the communication structure, when you try to assign the DataSource you will see the DataSource 8...
    With regards,
    Anil Kumar Sharma .P
    Please do not forget to assign points if this helps you.

  • Problems in calculating queries

    I have a query whose data is in an InfoCube, and the situation is the following.
    I have records of this type:
    City | Stock | Price
    New York | 2 | 100
    New York | 1 | 50
    I want to calculate the value of the stock, but when the query aggregates by New York it returns 450: it first adds up the stock and the price and then multiplies them (3 * 150 = 450). What I want is for the operation to be performed per record and the results then added, i.e. 2 * 100 = 200 and 1 * 50 = 50, total = 250.
    Please tell me how I can achieve this.
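
The two aggregation orders can be sketched in plain Python; in BEx this per-record behaviour is usually achieved by marking the formula for exception aggregation / 'Calculate Before Aggregation', but the snippet below only illustrates the arithmetic:

```python
# The two aggregation orders from the question. Aggregating first and
# then multiplying gives the wrong 450; multiplying per record and then
# summing gives the desired 250.

rows = [("New York", 2, 100),   # (city, stock, price)
        ("New York", 1, 50)]

# Wrong: sum each key figure over the city, then multiply.
total_stock = sum(stock for _, stock, _ in rows)   # 3
total_price = sum(price for _, _, price in rows)   # 150
print(total_stock * total_price)                   # 450

# Right: multiply per record, then sum the results.
print(sum(stock * price for _, stock, price in rows))  # 250
```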

    Hi Agustin,
    Calculations of this kind are always inconsistent at the report level.
    Can you please tell me whether you will keep loading data into your ODS? If not, there is another solution: you can create a data mart of your ODS, assign it to itself, and reload the data after adding a new key figure.
    Here you will again have a problem: since it is an ODS, if your key fields are the same, it might add up or replace the existing data.
    If your ODS is used only for historical data, you can do one thing: keep a flag InfoObject as one of the keys and reload the data again.
    Regards,
    Pramod

  • Request not yet updated with datamart

    Hey guys,
    I am having trouble with an upload.
    Some days ago I uploaded an Excel file into an ODS without any problems. Everything turned out green:
    Load: green
    Activation: green
    Yet the data is not in my InfoCube, because when I run a query the data is simply not there.
    Now when I click 'Manage' on the ODS, I see a message that says 'Request not yet updated with data mart'.
    What can I do about this or what might be the problem?
    Thank you very much,
    Filip

    Hey Jürgen,
    I tried the "Update 3.x" option, but it wasn't working.
    Next I tried to generate a DataSource and to "update 3.x data". This 'worked' in the sense that it didn't give me an error, but my request is still "not yet updated with the data mart".
    Any ideas?
    Filip
