Release InfoCube to OLTP - PIR not created in ECC system

Hi,
I am releasing a key figure from an InfoCube to the OLTP system (APO to ECC 6.0) as PIR quantities. The program is executed by a process chain in transaction RSPC, and the process runs without any error messages. However, when I check the planned independent requirements in the ECC system, no PIR has been created.
I have checked the RFC connection between the ECC and APO systems and found that the connection is OK. How can I troubleshoot this issue further?
thanks and regards
Murugesan

The product masters were not maintained in the APO system. After maintaining the product masters, the PIRs were created in the ECC system. (A quick way to check for missing product masters is sketched below.)
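
Since the root cause turned out to be missing product masters in APO, a quick existence check on the APO side can help in similar cases. The following is only a minimal sketch, assuming the standard APO product header table /SAPAPO/MATKEY; the report name and the single-material parameter are placeholders, and in practice you would check every material selected for the release.

REPORT zcheck_apo_prodmaster.

" Placeholder: a single material number to check; in practice you would
" loop over all materials selected for the release.
PARAMETERS: p_matnr TYPE /sapapo/matkey-matnr.

DATA: lv_matid TYPE /sapapo/matkey-matid.

" /SAPAPO/MATKEY is the APO product header table. No entry here means the
" product master is missing, so the release cannot create a PIR in ECC.
SELECT SINGLE matid FROM /sapapo/matkey
  INTO lv_matid
  WHERE matnr = p_matnr.

IF sy-subrc = 0.
  WRITE: / p_matnr, ': product master exists in APO.'.
ELSE.
  WRITE: / p_matnr, ': no APO product master found - maintain it (e.g. transfer it via CIF) before releasing.'.
ENDIF.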

Similar Messages

  • ZSEGREPORT Not Exists in ECC System

    Hi Guys,
    I am trying to access SAP tables, clusters and function modules using Crystal Reports. I am getting the error message below:
    "you do not have the necessary rights to design reports against the SAP system"
    I checked all the related posts on SDN. We successfully imported the ABAP transport from the Integration Kit, but we could not find the authorization object ZSEGREPORT or the class ZSSI in the ECC system.
    Can anyone please help me in this issue?
    Thanks
    Vijay

    Hi,
    It looks to me like you are missing some of the transports. Also make sure you import them in the right order, which is explained in the transports chapter of the installation guide for the SAP Integration Kit.
    Ingo

  • CTM SNP:PL-ORD not ciffed to ECC system

    Hi all,
    We have implemented the MTS and MTO scenarios.
    The issue is: for MTO, the SNP PL-ORDs created by CTM SNP are not CIFfed to the ECC side. They are CIFfed only if I convert them from SNP PL-ORD to PP/DS PL-ORD.
    For MTS, I don't have any transfer problem for SNP PL-ORD or PP/DS PL-ORD.
    Is there a system constraint, or did I miss something at the configuration level?
    Thanks, Marius

    Hi Rajesh,
    Note 832393 - Release Restrictions for SCM 5.0:
    "Make-to-order process in CTM: since SCM 4.0, CTM also supports make to order. This is only possible when using CTM time-continuous planning (i.e. CTM PP/DS)."
    I really think that it is a system limit: CTM bucket-oriented planning (SNP) for MTO is not supported. However, at least the CTM heuristic creates SNP PL-ORDs, and after the conversion everything is OK.
    Your note 996299 is for CTM MTO PP/DS:
    "You are in Production Planning and Detailed Scheduling (PP/DS) mode and, in the CTM profile, you chose the variant for bucket-oriented planning in the field for the planning type..."
    Thanks, Marius

  • Unable to see the request in Reconstruction Tab of Info Cube

    Hi Experts,
    We are on SCM 5.1.
    I have created an extraction process to extract data from the Demand Planning area, and I have extracted data from the planning area. I can see the request in the PSA as well as in the InfoCube, but I am not able to see the request in the Reconstruction tab.
    Is there any setting needed to see the request in the Reconstruction tab of the InfoCube?
    Please advise..
    Regards
    Sujay

    Hi rathy,
    Is there any way to support the equivalent of reconstructions with the new BW 7.0 design?
    I will explain in detail.
    Reconstruction is critical functionality for our current DP solutions. It allows us to "reconstruct" old requests from the backup InfoCube (based on what has been kept in the PSA) and use this to restore data in the planning area to a day in the past. It is used whenever there is a technical or user error that needs to be undone. I believe there is a way to accomplish this with the BW 7.0 design.
    Please suggest...
    Thanks in advance..
    Regards
    Sujay

  • Init Data Upload from ODS to Info cube

    Hi,
    When I try to upload the data from the ODS to the InfoCube using the initial upload option in the InfoPackage, I am not able to see the selection option fields.
    Whereas with the same ODS and InfoCube, if I change the option to full upload in the InfoPackage, I get all the selection fields.
    What could be the reason for this?
    please help
    Rajiv

    Somnath,
    Let me explain the complete scenario:
    1) We have an existing ODS which populates an InfoCube on a daily basis (delta upload).
    2) We have created a new InfoCube which fetches data from the same ODS; that means two InfoCubes are populated from one ODS.
    3) When we tried to initialize the new InfoCube from the existing ODS, it asked us to delete the existing initialization options ("Scheduler - initialize option for the system"), and we did so. But by doing this, the old InfoCube's InfoPackage lost its delta upload option, and the initial upload for the new InfoCube is not displaying any of the selection fields.
    4) Now I have two questions:
    - How can I activate delta on the old InfoCube (without losing any data)?
    - How can I upload data into the new InfoCube with selection fields and then start the delta?
    Regards
    Rajiv

  • LDAP Userid not present in Backend System

    We are trying to implement a scenario where the users are present in LDAP but not in the backend system (ECC).
    Case scenario: a user logs into the portal using "testid" and wants to access a BSP view which pulls up information from the backend system. This "testid" is not present in the ECC system, so I am looking for a solution where "testid" can pull the information without being created in ECC.
    This is required for our external customers, whom we do not want to set up in the ECC system.
    Any suggestions? If someone has implemented this scenario and can share how they did it, it will be really appreciated.

    Hi,
    1.) Licensing. Notably, the first aspect that will come from SAP themselves is licensing. You still need to be transparent about licensing with SAP in your scenario.
    2.) Traceability. If any breaches are made or if any issues occur in your backend system, you will need to determine the exact time, and it may be that the issue was invoked by several users acting on the system at the same time.
    3.) Locking/synchronization and session terminations. If multiple users (using the same backend ID) perform update tasks on the database, you may find that they incur locking issues from trying to update the same record. Synchronous jobs may similarly run into issues when invoked by different users on the same ID. Session management on the backend systems (dependent on the application) may also result in situations where multiple users are limited to the maximum number of concurrent connections under their ID, or sessions are "locked" as a result of terminations on the client side, which would need to be cleaned up from time to time.
    While the above are some of the aspects that can be encountered, SAP generally handles these situations rather well, yet from a system management perspective you will encounter them from time to time.
    PS: It depends on how you map users - individual mapping can become cumbersome, particularly if mappings need to change. You can also map at group level, which makes your life easier, particularly for vendors who have several employees yet need to map to one ECC ID to transact: just create a group per vendor, assign the users to the group, and map the group to the ID.
    You can also use mass upload to map users efficiently - see:
    http://help.sap.com/saphelp_nw72/helpdata/en/48/a96f43db653206e10000000a42189c/frameset.htm
    https://cw.sdn.sap.com/cw/docs/DOC-107900
    example:
    [User]
    uid=user0002
    Last_Name=Johnson
    $usermapping$:BCE:user=ext_user0002
    $usermapping$:BCE:mappedpassword=initial1

  • Delta records are not loading from DSO to info cube

    My query is about delta loading from a DSO to an InfoCube (a filter is used in the selection).
    Delta records are not loading from the DSO to the InfoCube. I have tried all the options available in the DTP, but no luck:
    Selected "Change log" and "Get one request only" and ran the DTP, but 0 records were updated in the InfoCube.
    Selected "Change log" and "Get all new data request by request", but again 0 records were updated.
    Selected "Change log" and "Only get the delta once"; in that case all delta records were loaded to the InfoCube as they were in the DSO, and it gave the error message "Lock Table Overflow".
    When I run a full load using the same filter, data loads from the DSO to the InfoCube.
    Can anyone please help me get the delta records from the DSO to the InfoCube?
    Thanks,
    Shamma

    Data is loading in the case of a full load with the same filter, so I don't think the filter is the issue.
    When I follow the sequence below, I get the lock table overflow error:
    1. Full load from the active table, with or without archive.
    2. Then, with the same settings, if I run an init, the final status remains yellow, and when I change the status to green manually, it gives the lock table overflow error.
    When I change the settings of the DTP to an init run:
    1. Select change log and get only one request, and run the init; it completes successfully with green status.
    2. But when I run the same DTP for delta records, it does not load any data.
    Please help me resolve this issue.

  • Info Cube not getting full records

    Hi All,
    I am extracting data from a flat file (60k records, most fields of which have binary values). I am getting the values into the DataSource, but when loading from the DataSource to the InfoCube I am not getting the total number of records:
    the load is successful, and the Manage tab of the InfoCube shows 60k records transferred and 510 records added.
    What could be the problem? Any help is appreciated.

    Hi,
    Data with the same characteristic combination is aggregated; only a unique characteristic combination produces a new record.
    Material -- Customer -- Price
    10002 -- C1 -- 20
    10002 -- C1 -- 20
    10002 -- C1 -- 20
    10002 -- C2 -- 20
    So, as you can see, the flat file has 4 records, but in the InfoCube there will be only 2: one per characteristic combination, with the key figure added up.
    Hope this clears it up. In addition, please also check the data in the flat file. A small ABAP illustration of the aggregation follows below.
    Regards,
    AL
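
    To make the aggregation behaviour concrete, here is a small ABAP illustration (not BW code, just a sketch with made-up values): COLLECT adds up the numeric fields of lines whose character-type fields match, which is essentially what happens to the key figure when several records share the same characteristic combination in the cube.

    REPORT zdemo_cube_aggregation.

    TYPES: BEGIN OF ty_rec,
             material(10) TYPE c,            " characteristic
             customer(4)  TYPE c,            " characteristic
             price(8)     TYPE p DECIMALS 2, " key figure
           END OF ty_rec.

    DATA: ls_rec  TYPE ty_rec,
          lt_cube TYPE STANDARD TABLE OF ty_rec.

    " The four "flat file" records; COLLECT sums PRICE for lines whose
    " MATERIAL/CUSTOMER combination is already in the table.
    DEFINE add_rec.
      ls_rec-material = &1. ls_rec-customer = &2. ls_rec-price = &3.
      COLLECT ls_rec INTO lt_cube.
    END-OF-DEFINITION.

    add_rec '10002' 'C1' 20.
    add_rec '10002' 'C1' 20.
    add_rec '10002' 'C1' 20.
    add_rec '10002' 'C2' 20.

    " Result: only 2 lines - 10002/C1 with 60 and 10002/C2 with 20.
    LOOP AT lt_cube INTO ls_rec.
      WRITE: / ls_rec-material, ls_rec-customer, ls_rec-price.
    ENDLOOP.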

  • How to populate values in DP key figure which is not present in Info Cube

    Hi
    We have a special key figure in Demand Planning created for uploading forecasts coming directly from the customer.
    This key figure is not present in the InfoCube, and no data is populated into it through the InfoCube load.
    Hence the data in this key figure has to be loaded externally. The customer sends his forecasts through EDI messages, and now we want to update the forecast from the external customer into this special key figure.
    Please suggest some ways
    Thanks and regards,
    Nitin Lavhe

    Hi
    I think you can load this information into any other key figure that is available in the InfoCube. Then, while loading the data into the planning area using TSCUBE, you can define the key figure assignment.
    There you define which key figure of the InfoCube gets loaded into which key figure of the planning area, so you can map the respective key figures.
    Please let us know if this helps, or if you require any more information from our side.
    Thanks
    Amol

  • All info cubes are not listed in SAP_INFOCUBE_DESIGNS

    Hello experts,
    I want to list all the InfoCubes and their layout using report SAP_INFOCUBE_DESIGNS, but this report shows only some of the InfoProvider details, not all of them.
    How can I display the details of all InfoCubes? Is there an SAP Note, another report, or a transaction code to list them?
    I searched for this on SDN but did not find a solution.
    Please advise.
    thanks in advance
    neha

    Hi,
    Please follow the process below:
    RSRV -> All Elementary Tests -> Transaction Data -> Fact and Dimension Tables of an InfoCube
    Provide the InfoCube name and dimension name (can also be selected via F4 help), select the F/E table in the parameter tab, and transfer.
    Expand the "Database information about InfoProvider XXXX tables" tab; there you will be able to see the number of entries and the dimension table size compared with the fact table size.
    Hope this helps. A rough report that lists customer InfoCubes with their fact table row counts is also sketched below. Let us know if you require any further details.
    Best Regards,
    Maruthi
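
    As a rough complement to SAP_INFOCUBE_DESIGNS, the sketch below lists active InfoCubes from the directory table RSDCUBE and counts the rows of their F fact tables directly, so cubes that the standard report skips still show up. It is only a sketch under assumptions: it handles customer cubes only (fact table /BIC/F<cube>), the report name is a placeholder, and COUNT(*) on large fact tables can be expensive.

    REPORT zlist_cube_fact_rows.

    DATA: lt_cube TYPE STANDARD TABLE OF rsdcube-infocube,
          lv_cube TYPE rsdcube-infocube,
          lv_tab  TYPE tabname,
          lv_rows TYPE i.

    " Active InfoCubes from the InfoCube directory
    SELECT infocube FROM rsdcube INTO TABLE lt_cube
      WHERE objvers = 'A'.

    LOOP AT lt_cube INTO lv_cube.
      " Customer cubes only: their F fact table is /BIC/F<cube>
      CHECK lv_cube(1) <> '0'.
      CONCATENATE '/BIC/F' lv_cube INTO lv_tab.

      " Count the fact table rows; skip entries without their own
      " fact table (e.g. MultiProviders)
      TRY.
          SELECT COUNT(*) FROM (lv_tab) INTO lv_rows.
          WRITE: / lv_cube, lv_tab, lv_rows.
        CATCH cx_sy_dynamic_osql_semantics.
          WRITE: / lv_cube, lv_tab, 'no fact table found'.
      ENDTRY.
    ENDLOOP.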

  • Release of History from Info Cube to Planning Area

    Hi Experts,
    I am using program /SAPAPO/RTSINPUT_CUBE to load sales history data from an InfoCube to the planning area. Most of the data gets loaded, but for a few CVCs we get the following message:
    549 combinationens of InfoCube are not contained in the BasisPlobStru
    When checking these CVCs in transaction /SAPAPO/MC62 and in the planning book, I can see that the CVCs exist and the data is also visible.
    Could anyone please help me with the reason for this and what should be done to resolve it?
    Thanks in Advance.
    Thanks and Regards,
    Chandan

    Hi Chandan,
    First of all, sincere apologies for not having read the query completely.
    Regarding your query on "549 combinationens of InfoCube are not contained in the BasisPlobStru": you have stated that the CVCs exist in the MPOS.
    But can you please check whether the CVCs maintained in the MPOS are the exact CVCs mentioned in the message? For example, say the message is for location / product / sales org / country. Can you check whether a CVC exists for exactly the combination for which the forecast is being loaded from the source? It is possible that data is being loaded for Loc1 Prod1 SalesOrg1 Country1 while the CVC exists for Loc1 Prod1 SalesOrg2 Country1; in that case such a message appears.
    So can you check whether the exact CVCs indicated in the messages exist in the MPOS?
    Rgds, Sandeep
    Sorry again for misreading your query.

  • Data is not loaded into info cube

    Hi All,
    I have a custom-defined InfoCube with a full load running every month, and it hardly takes an hour to finish.
    But this month the data came into the PSA successfully; from there to the data target, the process stopped at the update rules, and none of the records were added to the InfoCube. We have not made any changes to the update rules either.
    Can anybody let me know what might be the reason behind this? Thanks.
    Regards,
    Ashok

    Hi Ashok,
    You can do the following:
    1. In the Monitor Status tab, turn the request Red.
    2. In the Details tab right click this data package and choose Manual Update.
    3. After the processing is done, the data package will be green, but the overall request will still be red.
    4. In the Monitor Status tab, turn the request back to its original status (this will make it green).
    Hope this helps...

  • Abap insert into info cube results in ORA 14400

    Hi friends,
    in principle, the situation is as follows:
    we have two cubes which are identical in design, in the number of InfoObjects, etc.
    Now we have the following code:
    * Then copy from the source tables
    * (facts last, because of the foreign-key relationship to the dimensions)
      CHECK P_KOP IS NOT INITIAL.
      COMMIT WORK.
      SELECT * FROM /BIC/DZHW_CUBE1P.
        CHECK /BIC/DZHW_CUBE1P-DIMID <> 0.
        MOVE-CORRESPONDING /BIC/DZHW_CUBE1P TO /BIC/DZHW_CUBE2P.
        INSERT /BIC/DZHW_CUBE2P.
      ENDSELECT.
      SELECT * FROM /BIC/DZHW_CUBE1T.
        CHECK /BIC/DZHW_CUBE1T-DIMID <> 0.
        MOVE-CORRESPONDING /BIC/DZHW_CUBE1T TO /BIC/DZHW_CUBE2T.
        INSERT /BIC/DZHW_CUBE2T.
      ENDSELECT.
      SELECT * FROM /BIC/DZHW_CUBE1U.
        CHECK /BIC/DZHW_CUBE1U-DIMID <> 0.
        MOVE-CORRESPONDING /BIC/DZHW_CUBE1U TO /BIC/DZHW_CUBE2U.
        INSERT /BIC/DZHW_CUBE2U.
      ENDSELECT.
      SELECT * FROM /BIC/DZHW_CUBE11.
        CHECK /BIC/DZHW_CUBE11-DIMID <> 0.
        MOVE-CORRESPONDING /BIC/DZHW_CUBE11 TO /BIC/DZHW_CUBE21.
        INSERT /BIC/DZHW_CUBE21.
      ENDSELECT.
      COMMIT WORK.
      SELECT * FROM /BIC/FZHW_CUBE1.
        /BIC/FZHW_CUBE2-KEY_ZHW_CUBE2P = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE1P.
        /BIC/FZHW_CUBE2-KEY_ZHW_CUBE2T = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE1T.
        /BIC/FZHW_CUBE2-KEY_ZHW_CUBE2U = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE1U.
        /BIC/FZHW_CUBE2-KEY_ZHW_CUBE21 = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE11.
        /BIC/FZHW_CUBE2-/BIC/ZGEHALT   = /BIC/FZHW_CUBE1-/BIC/ZGEHALT.
        INSERT /BIC/FZHW_CUBE2.
      ENDSELECT.
    The problem is the INSERT statement: if the interpreter reaches this step in the code, we get a dump with the following message:
    "Database error text........: "ORA-14400: inserted partition key does not map to
      any partition"
    Now I assume (since this cube can be loaded with data normally) that it is not possible to store data in an InfoCube directly using a simple INSERT statement. Are there any appropriate function modules to condense the data in a request and load it into an InfoCube?
    Kind regards,
    Gideon

    hi,
    we had a similar error and solved it with OSS note 339896.
    Although 2.0B is mentioned, it is applicable to our 3.0 as well. Please take a look, besides the note Vinod mentioned (note 509660 is also referenced there for the F fact table). The text of both notes follows, and a rough sketch of the PSA consistency check described in note 339896 is at the end of this reply.
    339896
    Symptom
    During the parallel upload into InfoCubes, ORACLE error ORA14400 might occur in BW 2.0B.
    BW 2.0B and BW 2.1C both originate from the same BW technology basis; thus 2.0B is a synonym for both releases.
    Other terms
    Partitioning, ODS, PSA, parallel loading, ORA14400
    Reason and Prerequisites
    The error can either occur during the insert into the "PSA table" ( /BIC/B00....) or during the insert into the "F-fact table" ( "/BI*/F<INFOCUBE>" ).
    If the error occurs when writing to the F-fact table, please refer to Note 509660.
    If the error occurs when writing to partitioned PSA tables, an inconsistency exists between administration table "RSTSODS" and the partitions in the database.
    Solution
    1.) As of patch 22, the check transaction RSRV allows you to check the consistency of PSA tables and repair them if required (if no name is specified, the check is carried out for all PSA tables).
    2.) Patch < 22: please check whether function module RSDDCVER_PSA_PARTITION exists in your system.
    - If yes: start it for the corresponding PSA table using i_repair = 'X'. The inconsistencies should then be eliminated.
    - If not: the inconsistencies have to be repaired manually.
    Among all partitions of the PSA table, determine the partition with the highest HIGH VALUE. Compare this partition to the entry in the PARTNO field of table RSTSODS (Transaction SE16 -> RSTSODS -> filter on ODSNAME_TECH with the PSA table name). Error ORA14400 occurs if the entry in the RSTSODS table is higher than the highest partition.
    To solve the problem, release table RSTSODS in the SAP DD so that you can change it via transaction SE16, then change the entry in the PARTNO field to the value of the highest HIGH VALUE. If the PSA table is empty, enter the value '2' in the PARTNO field of table RSTSODS.
    The inconsistency may also exist only in the SAP buffer: table RSTSODS is buffered completely, so before you change the table manually, submit the command "/$tab rstsods" in the OK-code field, which invalidates the table in the buffer. If the problem continues to exist, change the entry and invalidate the buffer again by submitting "/$tab rstsods".
    509660
    Symptom
    BW 2.0B and BW 2.1C are based on the same BW technology; 2.0B is therefore a synonym for both releases.
    The ORACLE error ORA14400 can occur in BW 2.0B/2.1C when writing to InfoCubes.
    Other terms
    Partitioning, parallel loading, F fact table, ORA00054, ORA14400, ORA14074, ORA02149
    Reason and Prerequisites
    The error can also occur when writing to PSA tables or when ODS objects are activated; refer to note 339896 in this case. As of 2.0B, the F fact table in a BW/ORACLE environment is range-partitioned according to the package dimension. A new partition is created when a request is written to the F fact table. If creation of the partition is not successful, ORACLE error 14400 ("inserted partition key is beyond highest legal partition key") is issued. There are several causes for this:
    1. During loading, there are jobs running (such as ANALYZE TABLE ... or CREATE INDEX ...) which lock the table in the ORACLE catalog and take a very long time, so creating the partition on the database takes a very long time; in this case, the ORA00054 error (resource busy) is issued.
    2. During parallel loading, an error occurs because an optimistic locking concept is implemented in the update.
    Adding a partition is linked to writing the package dimension entry. If the loading process terminates in such a way that the dimension entry was written but the partition was not created, the second loading process terminates with ORACLE error ORA14400.
    Solution
    As of patch BW 2.0B 15 / BW 2.1C 7, creation of a new partition is protected by an SAP lock. If a lock is already present, the program tries to get the lock 100 more times and then continues. In the case of parallel loading processes, this optimistic lock approach always resulted in the error. For this reason, as of BW 2.0B patch 21 / BW 2.1C 13, the parameter _wait is set to true in the RSTMPLWI template when the enqueue function is called, and the loop counter is set to 500, so that the loaders wait longer before continuing.
    First select the test "Unused entries in the dimensions of an InfoCube" for the corresponding cube via transaction RSRV ==> InfoCube Data and press "Eliminate error". The entry is then deleted in the dimension, and during the next loading process a new entry is generated and the partition is created.
    The correction instructions for BW 2.0B patches 15 - 20 / BW 2.1C patches 7 - 12 are attached. In this way, the programs from the corrected template are regenerated.
    If the error persists in spite of this change, check whether the F fact table is being analyzed or whether indexes are being generated on the F fact table at the same time as the load process.
    The program RSTMPLWI is an ABAP template from which the update programs are generated. ==> For this reason, automatic installation is not possible and you cannot run a syntax check!
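
    To make the manual check from note 339896 a bit more concrete, here is a rough read-only sketch, assuming an Oracle database and the RSTSODS fields (ODSNAME_TECH, PARTNO) named in the note. It compares the PARTNO entry for a PSA table with the highest partition position that actually exists on the database, read from the Oracle catalog view USER_TAB_PARTITIONS via native SQL. The report name and the default PSA table name are placeholders; this is not a repair tool.

    REPORT zcheck_psa_partitions.

    " Placeholder: technical name of the partitioned PSA table to check
    PARAMETERS: p_psa TYPE tabname DEFAULT '/BIC/B0000123000'.

    DATA: lv_partno TYPE i,
          lv_maxpos TYPE i.

    " Partition counter the loader relies on (see note 339896)
    SELECT SINGLE partno FROM rstsods INTO lv_partno
      WHERE odsname_tech = p_psa.
    IF sy-subrc <> 0.
      WRITE: / 'No RSTSODS entry found for', p_psa.
      RETURN.
    ENDIF.

    " Highest partition that really exists on the database
    " (RTRIM because the host variable is blank-padded)
    EXEC SQL.
      SELECT NVL( MAX( PARTITION_POSITION ), 0 ) INTO :lv_maxpos
        FROM USER_TAB_PARTITIONS
        WHERE TABLE_NAME = RTRIM(:p_psa)
    ENDEXEC.

    WRITE: / 'RSTSODS-PARTNO:             ', lv_partno,
           / 'Highest partition on the DB:', lv_maxpos.
    " ORA-14400 is to be expected when PARTNO points beyond the highest
    " existing partition; see the note for the supported repair options.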

  • Extractors for Info Cubes only

    Hi All
    Why are some extractors with delta capabilities loaded directly into InfoCubes only? Is there any valid reason?
    Vanaja

    Hi Vanaja:
    In some cases the first release of the extractors did not have ODS capability, and even though this was improved in later Business Content releases, the corresponding DSOs were not included as part of the data flow. As an example, please refer to SAP Note 440166 - "ODS capability for shipment and shipment costs DataSources".
    Regards,
    Francisco Milán.
