Data marting problem

Hi there,
How can I delete a DataSource used for data marting?
Can anyone give me some thoughts?
Regards,
Chandu.

Thanks, Kedar.
I have deactivated the transfer rules and deleted the assignment.
Now I want to delete the DataSource, but I can't find a delete option in the context menu (right-click on the DataSource).
Regards,
Chandu.

Similar Messages

  • Problem in Data Mart

    I tried to transfer data from one InfoCube (ZE1) to another InfoCube (ZE2).
    Please find below the procedure I followed:
    InfoCube ZE1 was successfully populated with 20 records from a flat file.
    InfoCube ZE2 was created as a copy of ZE1.
    I generated the export DataSource on ZE1 and replicated this new DataSource 8ZE1 in the B3TCLNT800 (BW client 800) source system, then activated the transfer rules. The InfoSource is in the Data Marts folder.
    The update rules for ZE2 were created using InfoCube ZE1 (data mart concept) under the DataSource tab.
    I scheduled an InfoPackage on the InfoSource 8ZE1 to load the new InfoCube ZE2.
    Now, while the load is running, a short dump is created:
    No request IDoc generated in BW.
    Exception condition "INHERITED_ERROR" raised.
    Could you tell me what the reason behind this could be?
    Thanks in advance.

    Hi Venkat,
    Posting OSS Note 568768 for your reference:
    Symptom
    A short dump with an SQL error occurs, or a message indicates that an SQL error occurred. You need to find out more about the failing statement and the reasons for the failure.
    A short dump with exception condition "INHERITED_ERROR" occurred: a RAISE statement in the program "SAPLRSDRC" raised the exception condition "INHERITED_ERROR".
    Other terms
    Shortdump SQL Error Oracle DB6 DB2 MSSQL SAPDB Informix Database dev_w0 developer trace syslog UNCAUGHT_EXCEPTION CX_SY_SQL_ERROR DBIF_REPO_SQL_ERROR INHERITED_ERROR SAPLRSDRC RSDRC RSDRS SAPLRSDRS DBIF_RSQL_SQL_ERROR
    Reason and Prerequisites
    A short dump with an SQL error occurs, e.g. during BW aggregate build, compression, BW queries, or data mart extraction, or an SQL statement fails without a short dump.
    The actions mentioned below will often already indicate the cause of the error. In particular, the database administrator should frequently be able to recognize a solution immediately. Nevertheless, you should create an OSS problem message and attach the SQL error message, including additional information (such as the SQL statement that failed or the error message text), to it.
    This note combines the two OSS notes 495256 and 568768.
    Solution
    1. If a short dump occurred, find the work process and application server where the dump occurred; otherwise continue with the next step.
               Inside the short dump, scroll to "System environment". Either you find the work process number there, or continue with the next step.
    2. Get the work process number from the syslog.
                Go to the system log (transaction SM21) and search for the short dump entry with the same timestamp in the syslog of the application server where the short dump occurred.
               The column "Nr" contains the work process number. There may already be syslog entries before this one containing more information about the error.
               Example: the short dump contains
    UNCAUGHT_EXCEPTION
    CX_SY_SQL_ERROR
    04.11.2002 at 14:47:32
               The syslog contains:
    Time     Ty. Nr Cl. User    Tcod MNo Text
    14:47:32 BTC 14 000 NAGELK  BY2 Database error -289 at EXE
    14:47:32 BTC 14 000 NAGELK  BY0 > dsql_db6_exec_immediate( SQL
    14:47:32 BTC 14 000 NAGELK  BY0 > Driver][DB2/LINUX] SQL0289N
    14:47:32 BTC 14 000 NAGELK  BY0 > table space "PSAPTEMP". SQLS
    14:47:32 BTC 14 000 NAGELK  BY2 Database error -289 at EXE
    14:47:32 BTC 14 000 NAGELK  R68 Perform rollback
    14:47:32 BTC 14 000 NAGELK  AB0 Run-time error "UNCAUGHT_EXCEP
    14:47:32 BTC 14 000 NAGELK  AB2 > Include RSDRS_DB6_ROUTINES l
    14:47:33 BTC 14 000 NAGELK  AB1 > Short dump "021104 144732
                So from the syslog we can determine that the cause of the error is
                Database error -289 at EXE dsql_db6_exec_immediate( SQLExecDirect ): [IBM][CLI Driver][DB2/LINUX] SQL0289N Unable to allocate new pages in table space "PSAPTEMP". SQLSTATE=57011
                The database error code is -289, the error text is "Unable to allocate new pages in tablespace "PSAPTEMP" ", and the work process number is "14".
    3. Display the developer trace.
                Go to transaction SM51 and select the correct application server (where the short dump occurred); you are now in transaction SM50 for that server. Select the work process with the number you got from the short dump or the syslog (in our example, number 14), then in the menu bar choose Process - Trace - Display File.
               In the developer trace, search for the timestamp. In our example we find this entry:
    C Mon Nov  4 14:47:32 2002
    C  *** ERROR in ExecuteDirect[dbdb6.c, 5617]
    C  &+  0|  dsql_db6_exec_immediate( SQLExec...
    C  &+  0|  ...able space "PSAPTEMP".  SQLSTATE=57011
    C  &+  0|  INSERT INTO "/BIC/E100015" ...
    C  &+  ...
    4. For most of the important SQL statements generated by BW, when an error occurs the SQL statement is saved as a text file called, for example, SQL00000959.sql (SQL<error code>.sql). If the statement is run in a dialog process, the file is in the current directory of your SAP GUI on the front-end PC (for example, C:\Documents and Settings\schmitt\SAPworkdir); with batch jobs, it is stored in the DIR_TEMP directory (see transaction AL11) on the application server.
    5. For SAP internal support:
                Using the SQL error codes determined in the previous steps, together with the relevant database platform, you can obtain a more detailed description of the error and possible causes at http://dbi.wdf.sap.corp:1080/DBI/cgi/dberr.html.
    Hope this helps.
    Bye
    Dinesh

  • Problem loading from DATA MART to ODS, SERVICE-API

    Hi gurus,
    I have a problem loading data from a data mart to an ODS (full load).
    However, if I test the extractor itself (in RSA3) it works fine.
    I have already replicated and generated the DataSource and checked the transfer rules for the data mart, but when I try to load data I get these two messages:
    Message no. R3005
    "The InfoSource 8TEST_MM specified in the data request, is not defined in the
    source system."
    Message no. RSM340
    Errors in source system.
    By the way, this test system was copied from the production system. So far I have had no problems with it, but I had never tried loading from data marts before.
    Any ideas?
    Regards, Uros

    Thanks for your answer.
    I already did that and everything is fine: I can see the InfoSource, and if I test the extractor it works fine, but the InfoPackage gives me the errors mentioned above.
    I have already looked through the notes and couldn't find anything useful.
    I forgot to mention that I generated the export DataSource from a transactional ODS.
    Regards, Uros

  • Unable to update the Data from Cube to Data Mart

    Hi,
    I have a problem with data loading to a cube (data mart) in BW. When I check it in RSA3 it shows 0 records. The data flow is as follows: R/3 -> ODS (BW) -> Cube (BW) -> Cube (BW, data mart for the APO cube) -> APO system. In BW, the final data target is always loaded by full update after deleting the previous request. Records are available in this final cube (the data mart to APO), yet when I check this final data target in RSA3 it shows 0 records.
    Why? Please help me.
    Regards,
    krishna

    Hi,
    I checked the data mart in RSA3. It is not a matter of full upload or delta upload.
    thanks
    Krishna

  • Get back the Data mart status in ODS and activate the delta update.

    I ran into a problem when deleting requests in an ODS.
    There is a cube (1st level) that gets loaded from an ODS (2nd level), which in turn gets loaded from 3 ODSs (3rd level). We wanted to delete recent requests from all the data targets and reload from the PSA. While deleting the request in the ODS (2nd level), a window was displayed showing the following:
    - The request 132185 has already been retrieved by the data target BP4CLT612.
    - The delta update in BP4CLT612 must be deactivated before deleting the request.
    - Do you want to deactivate the delta update in data target BP4CLT612?
    I clicked on "Execute changes" in that window, and it removed the data mart status for all the requests, including the ones I had not deleted.
    The same thing happened in the 3 ODSs (3rd level).
    It is clear that if we now load further data from the source system, it will load all the records from the beginning.
    To avoid this, can anybody help me get the data mart status back and reactivate the delta update?

    Hi Satish,
    You have to set the requests to red in the cube and back them out of the cube before you can delete requests from the base targets (from which the cube gets its data).
    Then you have to reset the data mart status for the requests in your 2nd-level ODS before you can delete requests from that ODS.
    I think you tried to delete without resetting the data mart status, which has upset the delta sequence.
    To correct this:
    For the 2nd-level ODS, do an init without data transfer from the three underlying ODSs, after removing the init request via the scheduler menu in the init InfoPackage.
    Do the same from the 2nd-level ODS to the cube.
    Then reconstruct the deleted request in the ODS. It will not show the tick mark in the ODS. Do a delta load from the ODS to the cube.
    See the thread below:
    Urgentt !!! Help on reloading the data from the ODS to the Cube.
    cheers,
    Vishvesh

  • Error while generating the data mart for an InfoCube.

    Hi Gurus,
    I need to extract APO InfoCube data into a BW InfoCube. For that I am trying to generate the data mart for the APO InfoCube,
    so that I can use that data mart as a DataSource to extract the APO InfoCube data into the BW InfoCube.
    While generating the data mart for the APO InfoCube, I get errors like the following:
    Creation of InfoSource 8ZEXTFCST for target system BW 1.2 failed
    The InfoCube cannot be used as a data mart for a BW 1.2 target system.
    Failed to create InfoSource &v1& for target system BW 1.2.
    Please suggest what I should do about this error.
    Thanks a lot in advance.

    Hi,
    Point 1: What is a planning area?
    http://help.sap.com/saphelp_scm41/helpdata/en/70/1b7539d6d1c93be10000000a114084/content.htm
    Point 2: Creation steps for a planning area:
    http://www.sap-img.com/apo/creation-of-planning-area-in-apo.htm
    Note: We do not create the planning area; this is done by the APO team.
    Point 3: After opening transaction /n/SAPAPO/MSDP_ADMIN in APO, you will be able to see all the planning areas.
    Point 4: Select your planning area, go to the Extras menu and click on Generate DS.
    Point 5: The system automatically generates the DataSource in APO (the naming convention starts with 9). Replicate the DataSource in BI, map it to your cube and load the data.
    Regards
    Ram.

  • Error Caller 09 contains error message - Data Marts loading(cube to ODS)

    Dear all,
    Please help me with this problem; it is very urgent.
    I have a process chain that loads data within BW only, through data marts. In that chain, one process loads data from a cube (created by us) into an ODS (also created by us). Data is loaded through a full update for the period specified in the 'Calendar Day' field of the data selection.
    Previously I was able to load data for 2 months, but a few days ago the extraction process suddenly got stuck in the background for a long time and showed the following error:
              Error message from the source system
              Diagnosis
             An error occurred in the source system.
              System Response
             Caller 09 contains an error message.
             Further analysis:
             The error occurred in Extractor . 
             Refer to the error message.
             Procedure
             How you remove the error depends on the error message.
             Note
             If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
         Then we killed that process on the server, and after another attempt it showed some calmonth...timestamp error. After reducing the data selection period, it loaded successfully, and after that I was able to load data for 20 days. After some days the process got stuck again; I followed the same procedure, reduced the period to 15 days and continued. Now I cannot even load data for 5 days successfully in one attempt; I have to kill the process in the background and repeat it, and then sometimes it loads.
         Please suggest some solutions as soon as possible. I am waiting for your reply. Points will be assigned.
             Thanks,
              Pankaj N. Kude
    Edited by: Pankaj Kude on Jul 23, 2008 8:33 AM

    Hi Friends !
                      I didn't find any short dump for that in ST22.
                  What actually happens is that the request continues to run in the background indefinitely. At that time the Status tab in the process monitor shows these messages:
                        Request still running
                        Diagnosis
                        No errors found. The current process has probably not finished yet.
                         System Response
                         The ALE inbox of BI is identical to the ALE outbox of the source system
                           or
                         the maximum wait time for this request has not yet been exceeded
                           or
                        the background job has not yet finished in the source system.
                       Current status
                       in the source system
                    The Details tab shows the following messages:
                        Overall Status : Missing Messages or warnings
                        Requests (Messages) : Everything OK
                                Data Request arranged
                                Confirmed with : OK
                         Extraction (Messages) : missing messages
                                Data request received
                                Data selection scheduled
                                Missing message : Number of Sent Records
                                Missing message : selection completed
                        Transfer (IDOCS and TRFC) : Everything OK
                                Info Idoc1 : Application Document Posted
                                Info Idoc2 : Application Document Posted
                         Processing (data packet) : No data
                    This process runs indefinitely; I then have to kill it on the server, and afterwards it shows the Caller 09 error in the Status tab.
                    Should I change the value of that memory parameter on the server or not? We are planning to try it today. Is it really related to this problem? Will it be helpful? What are the risks?
                    Please give your suggestions as early as possible; I am waiting for your reply.
      Thanks,
    Pankaj N. Kude

  • Data mart data load InfoPackage gets short dumps

    This is related to the solution Vijay provided in the thread: What is the functionality of 0FYTLFP [Cumulated to Last Fiscal Year/Period
    I have encountered the problem again with a data mart load: I created different initial-load InfoPackages with different data selections and ran them separately, so the initializations are now messed up, and whenever I try to create a new InfoPackage I always get short dumps. RSA7 on the BW system itself does not show a faulty entry.
    I tried to use the program RSSM_OLTP_INIT_DELTA_UPDATE that you provided; it has three parameters:
    LOGSYS (required)
    DATASOUR (required)
    ALWAYS (not required)
    I filled LOGSYS with the source system name of our BW system (the one in the InfoPackage) and DATASOUR with the DataSource name 80PUR_C01, but nothing happens when I click the execute button!
    Then I tried another option you suggested by checking the entries in the following three tables:
    ROOSPRMS Control Parameters Per DataSource
    ROOSPRMSC Control Parameter Per DataSource Channel
    ROOSPRMSF Control Parameters Per DataSource
    I find no entry in the first table for DataSource 80PUR_C01, but two entries in each of the second and third tables. Should I go ahead and delete those two entries from these two tables?
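    Before deleting anything, it may help to look at exactly what those entries contain. Below is a minimal sketch of such a check, assuming the DataSource key column in these tables is OLTPSOURCE (the actual field names should be verified in SE11/SE16 first):
    -- Illustrative only: inspect the delta-init bookkeeping entries for the
    -- DataSource before deciding whether to delete them.
    -- Assumption: the DataSource key column is named OLTPSOURCE.
    SELECT * FROM ROOSPRMSC WHERE OLTPSOURCE = '80PUR_C01';
    SELECT * FROM ROOSPRMSF WHERE OLTPSOURCE = '80PUR_C01';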
    Thanks

    Kevin,
    sorry, I didn't quite follow your problem/question, but be careful when you want to modify the content of these tables!
    There is a high risk of inconsistencies... (why don't you ask for SAP support via OSS for this situation?)
    Hope it helps!
    Bye,
    Roberto

  • Data Mart load does not support Delta load?

    For a data mart load, we create an InfoPackage for delta load. Under the Data Selection tab, for the InfoObject 0FISCPER, we pick the OLAP variable 0FPER [Current Fiscal Year/Period (SAP Exit)] and save the InfoPackage. But when we open it again, we find that on the 0FISCPER row of the Data Selection tab the From Value column has become 001/2005 and the To Value column 011/2005, and the whole Data Selection tab has become non-editable!
    The correct behaviour would be that after saving the InfoPackage, the From Value/To Value columns on the 0FISCPER row stay blank and only the Type column shows the value 7, and that after saving the InfoPackage and opening it again, the Data Selection tab remains editable.
    If we select Full update mode and then switch back to the Data Selection tab, the behaviour is correct.
    We wonder: if SAP doesn't support delta load for data mart loads, why does the Delta load radio button show up under the Update tab after we have run an initial load? It doesn't make any sense!

    Dear,
    as already suggested, you cannot use different selection criteria for your delta, since the delta automatically loads on the basis of the init selection... and, if you think about it, this is logical: you cannot initialize a flow for certain conditions and then decide to load different records in delta...
    Hope it is clearer now...
    Bye,
    Roberto
    Clearly, if you perform a full load (like a 'repair' full, from a functional point of view), this problem doesn't exist (data is taken from the active table and not from the change log!)

  • Data mart symbol not showing

    Hi all,
    I am facing a problem: when we load data from an ODS to a cube, the data loads successfully, but in the ODS manage screen the data mart symbol is not showing.
    Please help me.
    Thanks
    kamal

    Hi Kamal,
    Try running the delta update from the ODS to the cube manually: right-click the ODS -> Update data into data target -> Delta update.
    Hopefully the data mart status will then get ticked.
    Also check the request status of the data mart load from the ODS to the cube; it may be in red/yellow status.
    Hope that helps.
    Regards
    Mr Kapadia

  • Data Mart Status of the request is not ticked

    Hello Everybody,
    I am dealing with BW for the first time.
    After I loaded data into the ODS, the request in the manage view was not ticked like the others, although the job completed successfully.
    Can anyone help?
    Many Thanks
    F-B-I

    Hi,
    Have you deleted the data mart status in the ODS?
    You need to follow these steps:
    1. If you changed the request status to red, there is no need to delete the data mart status in the ODS; you can load the data from the ODS to the InfoCube.
    2. If the status is green, you need to delete the data mart symbol in the ODS.
    regards
    sivaraju

  • Data mart cube to cube copy records are not matching in target cube

    Hi experts,
    I need help with the questions below on a data mart cube-to-cube copy (8M*).
    It is a BW 3.5 system.
    We have two financial cubes:
    Cube A1, sourced from the R/3 system (delta update), and cube B1, sourced from cube A1 (full update). The two cubes are connected through update rules with one-to-one mapping and no routines. Basis copied the back-end R/3 system from the production to the quality server approximately 2 months ago.
    Cube A1, which extracts the delta load from R/3, is loading fine. But for the second cube (extraction from cube A1) I am not getting the full volume of data; I am getting only a meagre amount, although the load shows a successful status in the monitor.
    We tried giving selection conditions in the InfoPackage (as was done in the previous year's loads), but it still fetches the same meagre volume of data.
    To check whether this happens only for this particular cube, we tried other cubes sourced from the Myself system, and they also get meagre data rather than the full data.
    For example: for an employee with 1,000 records available, the system extracts only around 200, apparently at random.
    Any quick reply would be very helpful. Thanks.

    Hi Venkat,
    Did you do any selective deletions in cube A1?
    First reconcile the data between cube A1 and cube B1:
    match the totals of cube A1 with those of cube B1.
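    For example, at database level the totals could be compared with something like the purely illustrative query below; the fact tables /BIC/FA1, /BIC/EA1 and the key figure column /BIC/ZAMOUNT are hypothetical names, so substitute your real cube and key figure (or simply use LISTCUBE on both cubes):
    -- Illustrative only: sum one key figure over cube A1's uncompressed (F) and
    -- compressed (E) fact tables, then run the same query against /BIC/FB1 and
    -- /BIC/EB1 and compare the two totals.
    SELECT SUM(t."/BIC/ZAMOUNT") AS total_a1
      FROM ( SELECT "/BIC/ZAMOUNT" FROM "/BIC/FA1"
             UNION ALL
             SELECT "/BIC/ZAMOUNT" FROM "/BIC/EA1" ) t;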
    Thanks,
    Vijay.

  • Data mart status has been reset by deleting an in-between request in the source ODS

    Hi SDN,
    We have a situation in which a daily delta load goes from an ODS to a cube, and we accidentally reset the data mart status by deleting a request that sits in between multiple requests in the source ODS. When we deleted the request in the ODS, we got a pop-up asking 'Do you want to delete the data in the cube?', and we confirmed it. After that, in the manage screen of the source ODS, the data mart status is no longer shown for any of the requests that had been loaded to the target cube. We then reconstructed the data for the deleted request from the PSA.
    When a new load comes into the ODS the next day, will the ODS send the correct delta update to the target cube?
    If the correct delta is not updated to the cube, is there any method we can follow to maintain data consistency without deleting the data in the target cube?
    Thank you,
    Prasaad

    Hi,
    You deleted data in the cube and reloaded, but the data mart status is not appearing in the ODS. If you have all the data in the ODS, delete the data in the cube, then delete the data mart status in the ODS, right-click the ODS and choose 'Update data into data target'. A fresh request will then be loaded into the cube; this acts as the init, and deltas will flow from the next day onwards.
    Thanks
    Reddy

  • Date popup problem in APEX 3.1

    Hi
    I have a date picker (DD-MM-YYYY HH24:MI) and after upgrading to APEX 3.1 from APEX 3.0.1 the popup window height is too small.
    The end user is having to resize the window to click on the OK button.
    Is there a file I can edit to increase the height? I couldn't find it in the templates/themes.
    Regards
    Adam

    Hi Adam,
    This is a bug in APEX 3.1. It was discussed here:
    Apex 3.1 Upgrade Issue - dba_lock and date picker display
    and here:
    Date Picker problem in Apex 3.1
    I'll let Carl investigate and provide an official response and recommendation, although I know where this problem is occurring.
    The size of the popup calendar window is hard-wired in the file apex/images/javascript/apex_3_1.js. In APEX 3.0, the size of the popup window was determined programmatically at runtime and depended on whether or not the date format included a time component.
    The uncompressed, readable version of this same file is in apex/images/javascript/uncompressed/apex_3_1.js. Look for p_DatePicker and you'll see what I'm talking about. You'll see the height is hard-wired to 210 and width to 258. In APEX 3.0, the height was set to 255 if the date format contained a time component.
    So my suggestion, until Carl provides an official response, is to look for '210' in apex/images/javascript/apex_3_1.js and change this to 255. Granted, all calendar popup windows will be this big, but it won't put as great a burden on the end-user.
    I hope this helps.
    Joel

  • ORA-01403: no data found Problem when using AUTOMATIC ROW FETCH to populate

    ORA-01403: no data found Problem when using AUTOMATIC ROW FETCH to populate a form.
    1) Created a FORM on EMP using the wizards. This creates an AUTOMATIC ROW FETCH
    TABLE NAME - EMP
    Item Containing PRIMARY KEY - P2099_EMPNO
    Primary key column - EMPNO
    By default the automatic fetch has a ‘Process Error Message’ of ‘Unable to fetch row.’
    2) Created a HTML region. Within this region add
    text item P2099_FIND_EMPNO
    Button GET_EMP to submit
    Branch: modified the conditional branch created during button creation to set P2099_EMPNO to &P2099_FIND_EMPNO.
    If I then run the page, enter an existing employee number into P2099_FIND_EMPNO and press the GET_EMP button, the form is populated correctly. But if I enter an employee that does not exist, I get the Oracle error ORA-01403: no data found, no form is displayed, and there is just a message at the top of the page, 'Action Processed'. I was expecting a blank form to be displayed with the message 'Unable to fetch row.'
    I can work around this by making the automatic fetch conditional so that it first checks that the row exists. Modify the 'Fetch Row from EMP' automatic fetch so that it is conditional:
    EXIST (SQL query returns at least one row)
    select 'x'
    from EMP
    where EMPNO = :P2099_EMPNO
    But this means that when the employee exists I must be fetching from the DB twice, once for the condition and then again for the actual row fetch.
    Rather than the above workaround, is there something I can change so I don't get the Oracle error? I'm now wondering if the automatic row fetch is only supposed to be used when linking a report to a form, and whether I should be writing the fetch process manually. The reason I haven't so far is that I'm trying to stick with the automatic wizard generation as much as I can.
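    If I did end up writing it manually, I imagine the page process would look roughly like the sketch below (illustrative only: P2099_ENAME, P2099_JOB and P2099_SAL are hypothetical items; only P2099_EMPNO exists on my page):
    -- Hypothetical hand-written "fetch row" page process (After Header),
    -- replacing the automatic row fetch.
    BEGIN
      SELECT ename, job, sal
        INTO :P2099_ENAME, :P2099_JOB, :P2099_SAL
        FROM emp
       WHERE empno = :P2099_EMPNO;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        -- leave the form blank instead of raising ORA-01403
        :P2099_ENAME := NULL;
        :P2099_JOB   := NULL;
        :P2099_SAL   := NULL;
    END;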
    Any ideas?
    Thanks Pete

    Hi Mike,
    I've tried doing that but it doesn't seem to make any difference. If I turn debug on, it shows the following:
    0.05: Computation point: AFTER_HEADER
    0.05: Processing point: AFTER_HEADER
    0.05: ...Process "Fetch Row from EMP": DML_FETCH_ROW (AFTER_HEADER) F|#OWNER#:EMP:P2099_EMPNO:EMPNO
    0.05: Show ERROR page...
    0.05: Performing rollback...
    0.05: Processing point: AFTER_ERROR_HEADER
    I don't really want the error page; I'd rather have either nothing (with the form not populated) or a message at the top of the page.
    Thanks Pete
