Release Info cube to OLTP
Hi,
I am releasing a key figure from an InfoCube to the OLTP system (APO to ECC 6.0) as PIR quantities. The program is executed by a process chain in transaction RSPC. The process is executed without any error messages, but when I check the planned independent requirements in the ECC system, no PIR has been created.
I have checked the RFC connection between the ECC and APO systems and found the connection is OK. How can I further troubleshoot this issue?
thanks and regards
Murugesan
The issue was resolved by maintaining the product masters for the products in APO.
Similar Messages
-
Release of History from Info Cube to Planning Area
Hi Experts,
I am using program /SAPAPO/RTSINPUT_CUBE to load sales history data from an InfoCube to a planning area. Most of the data gets loaded, but for a few CVCs we get the following message:
549 combinationens of InfoCube are not contained in the BasisPlobStru
While checking these CVCs in transaction /SAPAPO/MC62 and the planning book, I can see that these CVCs exist and the data is also visible.
Could anyone please help me out with the reason, and with what should be done to resolve this?
Thanks in Advance.
Thanks and Regards,
Chandan
Hi Chandan,
First of all, sincere apologies for not having read the query completely.
Regarding your query on "549 combinationens of InfoCube are not contained in the BasisPlobStru": you have stated that the CVCs exist in the MPOS.
But can you please check whether the CVCs maintained in the MPOS are exactly the CVCs mentioned in the message? For example, suppose the message is for the combination location / product / sales org / country. Can you check whether a CVC exists for the exact combination for which the forecast is being loaded from the source? It is possible that data is being loaded for Loc1 Prod1 SalesOrg1 Country1, while the CVC exists only for Loc1 Prod1 SalesOrg2 Country1; then such a message appears.
So can you check whether the exact CVCs indicated in the messages exist in the MPOS?
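The mismatch Sandeep describes can be illustrated with a small sketch (hypothetical data and names, not the actual APO tables): the load reports any characteristic combination in the source that is not an exact member of the maintained CVC set.

```python
# Hypothetical illustration: a CVC must match the loaded combination exactly.
cvcs = {
    ("Loc1", "Prod1", "SalesOrg2", "Country1"),  # CVC maintained in the MPOS
}
source_rows = [
    ("Loc1", "Prod1", "SalesOrg1", "Country1"),  # combination being loaded
]

# Any source combination missing from the CVC set triggers the message.
missing = [row for row in source_rows if row not in cvcs]
print(missing)  # differs only in sales org, so it is reported as missing
```

Note that the combination differs in a single characteristic (sales org), which is enough for it to be reported.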
Rgds, Sandeep
Sorry again for misreading your query. -
Abap insert into info cube results in ORA 14400
Hi friends,
In principle the situation is as follows:
we have 2 cubes which are identical in design, in the number of InfoObjects, etc.
now we have the following code here:
* Then copy from the source tables
* (facts last, because of foreign-key relationships to the dimensions)
CHECK P_KOP IS NOT INITIAL.
COMMIT WORK.
SELECT * FROM /BIC/DZHW_CUBE1P.
CHECK /BIC/DZHW_CUBE1P-DIMID <> 0.
MOVE-CORRESPONDING /BIC/DZHW_CUBE1P TO /BIC/DZHW_CUBE2P.
INSERT /BIC/DZHW_CUBE2P.
ENDSELECT.
SELECT * FROM /BIC/DZHW_CUBE1T.
CHECK /BIC/DZHW_CUBE1T-DIMID <> 0.
MOVE-CORRESPONDING /BIC/DZHW_CUBE1T TO /BIC/DZHW_CUBE2T.
INSERT /BIC/DZHW_CUBE2T.
ENDSELECT.
SELECT * FROM /BIC/DZHW_CUBE1U.
CHECK /BIC/DZHW_CUBE1U-DIMID <> 0.
MOVE-CORRESPONDING /BIC/DZHW_CUBE1U TO /BIC/DZHW_CUBE2U.
INSERT /BIC/DZHW_CUBE2U.
ENDSELECT.
SELECT * FROM /BIC/DZHW_CUBE11.
CHECK /BIC/DZHW_CUBE11-DIMID <> 0.
MOVE-CORRESPONDING /BIC/DZHW_CUBE11 TO /BIC/DZHW_CUBE21.
INSERT /BIC/DZHW_CUBE21.
ENDSELECT.
COMMIT WORK.
SELECT * FROM /BIC/FZHW_CUBE1.
/BIC/FZHW_CUBE2-KEY_ZHW_CUBE2P = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE1P.
/BIC/FZHW_CUBE2-KEY_ZHW_CUBE2T = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE1T.
/BIC/FZHW_CUBE2-KEY_ZHW_CUBE2U = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE1U.
/BIC/FZHW_CUBE2-KEY_ZHW_CUBE21 = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE11.
/BIC/FZHW_CUBE2-/BIC/ZGEHALT = /BIC/FZHW_CUBE1-/BIC/ZGEHALT.
INSERT /BIC/FZHW_CUBE2.
ENDSELECT.
The problem is the INSERT statement. When execution reaches this step in the code, we get a dump with the following message:
"Database error text........: "ORA-14400: inserted partition key does not map to
any partition"
Now I assume - since this cube can be loaded with data normally - that it is not possible to store data in an InfoCube directly using a simple INSERT statement. Are there any appropriate function modules to condense the data in a request and load it into an InfoCube?
Kind Regards.
Gideon.
Message was edited by: Gideon Lenz
Hi,
We had a similar error and solved it with OSS Note 339896. Although 2.0B is mentioned, it is applicable to our 3.0 as well. Please take a look, besides the note Vinod mentioned (Note 509660 is also referenced there for the F fact table).
339896
Symptom
During the parallel upload into InfoCubes, ORACLE error ORA14400 might occur in BW 2.0B.
BW2.0B and BW2.1C both originate from the same BW technology basis. Thus 2.0B is a synonym for both releases.
Other terms
Partitioning, ODS, PSA, parallel loading, ORA14400
Reason and Prerequisites
The error can either occur during the insert into the "PSA table" ( /BIC/B00....) or during the insert into the "F-fact table" ( "/BI*/F<INFOCUBE>" ).
If the error occurs when writing to the F-fact table, please refer to Note 509660.
If the error occurs when writing to partitioned PSA tables, an inconsistency exists between administration table "RSTSODS" and the partitions in the database.
Solution
1.) As of Patch 22, the CHECK in transaction "RSRV" allows you to check the consistency of PSA tables and repair them, if required. (If no name is specified, the CHECK is carried out for all PSA tables.)
2.) Patch < 22: Please check whether function module "RSDDCVER_PSA_PARTITION" exists in your system.
- If yes: Start it for the corresponding PSA table using i_repair = 'X'. Then, the inconsistencies should be eliminated.
- If not: The inconsistencies have to be repaired manually!
Among all partitions of the PSA table, determine the partition with the highest "HIGH VALUE". Compare this partition to the entry in the Partno field of the "RSTSODS" table.
==> Transaction SE16 ---> 'RSTSODS' ---> filter on ODSNAME_TECH with the PSA table name.
Error ORA14400 occurs if the entry in the RSTSODS table is higher than the highest partition. To solve the problem, release table RSTSODS in the SAP DD so that you can change it via transaction SE16, then change the entry in the PARTNO field to the value of the highest 'HIGH VALUE'. If the PSA table is empty, enter the value '2' in the PARTNO field of the "RSTSODS" table.
The inconsistency may also exist only in the SAP buffer (table RSTSODS is buffered completely). Before you change the table manually, submit the command "/$tab rstsods" in the OK code, which invalidates the table in the buffer. If the problem persists, change the entry and invalidate the buffer again by submitting "/$tab rstsods".
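Conceptually, the check this note describes boils down to comparing the highest existing partition with the counter stored in RSTSODS (an illustrative sketch with made-up numbers, not the real database catalog API):

```python
# Illustrative sketch of the consistency rule from Note 339896.
# partition_high_values: the HIGH VALUE of each existing partition (hypothetical data).
partition_high_values = [2, 3, 4]
rstsods_partno = 6  # counter stored in administration table RSTSODS

# ORA-14400 occurs when the counter points past the highest existing partition:
inconsistent = rstsods_partno > max(partition_high_values)
print(inconsistent)  # repair by setting PARTNO back to the highest HIGH VALUE
```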
509660
Symptom
BW 2.0B and BW 2.1C are based on the same BW technology; 2.0B is therefore a synonym for both releases.
The ORACLE error ORA14400 can occur in BW 2.0B/2.1C during writing to InfoCubes.
Other terms
Partitioning, parallel loading, F fact table, ORA00054, ORA14400, ORA14074, ORA02149
Reason and Prerequisites
The error can also occur during writing to PSA tables or when ODS objects are activated; refer to Note 339896 in this case. As of 2.0B, the F fact table in a BW/ORACLE environment is range-partitioned according to the package dimension. A new partition is created when a request is written to the F fact table. If creation of the partition is not successful, ORACLE error 14400 ("inserted partition key is beyond highest legally partition key") is issued. There are several causes for this:
1. During loading, there are jobs running (such as ANALYZE TABLE, CREATE INDEX, ...) which lock the table in the ORACLE catalog and take a very long time, so creating the partition on the database takes a very long time. In this case, the ORA-00054 error (resource busy) is issued.
2. During parallel loading, an error occurs because an optimistic locking concept is implemented in the update.
Adding a partition is linked to writing the package dimension entry. If the loading process terminates in such a way that the dimension entry was written, but the partition was not created, the second loading process terminates with ORACLE error ORA14400.
Solution
As of patch BW 2.0B 15 / BW 2.1C 7, creation of a new partition is protected by an SAP lock. If a lock is already present, the program tries to get the lock 100 more times and then continues. In the case of parallel loading processes, this optimistic lock approach still resulted in the error. For this reason, as of BW 2.0B patch 21 / BW 2.1C patch 13, the parameter _wait is set to True in the "RSTMPLWI" template when the enqueue function is called, and the loop counter is set to 500, so that the loaders wait longer before continuing.
First select the test "Unused entries in the dimensions of a InfoCube" for the corresponding CUBE via transaction "rsrv" ==> InfocubeData and press "Eliminate error". The entry is then deleted in the dimension and during the next loading process, a new entry is generated and the partition is created.
The correction instructions for BW 2.0B patch 15 - 20 / BW 2.1C patch 7 - 12 are attached. In this way, the programs are regenerated from the corrected template.
If the error persists in spite of this change, check whether the F fact table is analyzing or whether indexes are being generated on the F fact table at the same time as the load process.
The program RSTMPLWI is an ABAP template from which the update programs are generated. ==> For this reason, automatic installation is not possible and you cannot run a syntax check! -
Extractors for Info Cubes only
Hi All
Why are some extractors with delta capabilities loaded directly into InfoCubes only? Any valid reason?
Vanaja
Hi Vanaja:
In some cases the first release of the extractors didn't have ODS capability, and even though this feature was improved in later Business Content releases, the corresponding DSOs were not included as part of the data flow. As an example, please refer to SAP Note 440166 - "ODS capability for shipment and shipment costs DataSources".
Regards,
Francisco Milán.
Edited by: Francisco Milan on Jun 30, 2010 7:04 PM -
End routine to populate Info-cube.
Hi ,
Is it possible to load fields of an InfoCube using end routines in the following scenarios?
1. Loading fields of the InfoCube by referencing a master data table in the end routine.
2. Loading fields of the InfoCube by referencing fields of a DSO in the end routine.
3. Loading fields of the InfoCube by referencing fields of another InfoCube in the end routine.
Please advise.
Hi Stalin,
Before answering your question you need to understand something about "End routine" and "Expert routine".
End Routine:
- Result_fields and Result_package are available
- End routine contains only those fields available in Data target.
Start Routine:
- Source_fields and Source_package are available
- Start routine contains only those fields coming from source.
Expert Routine:
- Source_fields, Source_package, Result_fields and Result_package are available
So now, if you want to write code to look up another cube, and in the lookup you need to test a condition using source fields, then in that case the "Expert routine" is the only option.
For Ex
My data target contains x, y and z fields (they become result fields).
The source contains an a field (it becomes a source field).
Now, if I want to write lookup code like "select the x, y and z fields from the other cube where my a field value = the other cube's a field value", I am accessing both source fields and result fields, so the only option is the "EXPERT ROUTINE".
If you want to write code using only result fields, then an "End routine" is enough.
Thanks,
Gowd -
Data in ODS, Info cube and Multiprovider (List cube) are in Sync.
Hi,
My query is built on a MultiProvider. The data flow is DataSource - ODS, then ODS - InfoCube, and the MultiProvider contains the InfoCube only.
The data in the ODS, InfoCube and MultiProvider (list cube) are in sync.
However, the query results do not tie up with the ODS, InfoCube and MultiProvider (list cube).
Can anyone let me know why this is happening and how I can resolve it?
Regards,
Sharma.
Hi,
thanks for the help.
I resolved the issue on my own.
Regards,
Sharma. -
Info cube data doesn't match with R3
Hi,
We are using "0AFMM_C02" Info cube for Inventory data.
For some records it is not fetching data for the distribution channel.
I checked the update routine for Distribution Channel.
PROGRAM UPDATE_ROUTINE.
*$$ begin of global - insert your declaration only below this line -
TABLES: ...
DATA: BEGIN OF it_data OCCURS 0,
material LIKE /bic/cs2af_mm_inv_1-material,
plant LIKE /bic/cs2af_mm_inv_1-plant,
val_type LIKE /bic/cs2af_mm_inv_1-val_type,
END OF it_data.
DATA: lt_mrpods TYPE TABLE OF /bic/ascm_d0500.
INCLUDE rs_bct_retail_update_rules.
*$$ end of global - insert your declaration only before this line -
FORM compute_key_field
TABLES MONITOR STRUCTURE RSMONITOR "user defined monitoring
USING COMM_STRUCTURE LIKE /BIC/CS2AF_MM_INV_1
RECORD_NO LIKE SY-TABIX
RECORD_ALL LIKE SY-TABIX
SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
CHANGING RESULT LIKE /BI0/V0AFMM_C02T-DISTR_CHAN
RETURNCODE LIKE SY-SUBRC
ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
*$$ begin of routine - insert your code only below this line -
* fill the internal table "MONITOR", to make monitor entries
DATA: DISTR_CHAN LIKE /BI0/MCUST_SALES-DISTR_CHAN.
SELECT SINGLE DISTR_CHAN INTO DISTR_CHAN FROM /BI0/MCUST_SALES
WHERE CUST_SALES = COMM_STRUCTURE-CUST_SALES.
IF SY-SUBRC EQ 0.
RESULT = DISTR_CHAN.
ELSE.
RESULT = COMM_STRUCTURE-DISTR_CHAN.
ENDIF.
* result value of the routine
* RESULT = .
* if the returncode is not equal zero, the result will not be updated
RETURNCODE = 0.
* if abort is not equal zero, the update process will be canceled
ABORT = 0.
*$$ end of routine - insert your code only before this line -
ENDFORM.
Can anybody tell me for which distribution channel it will get data?
Please help.
Regards,
Viren.
Hi,
I have a better suggestion: try writing this code in the start routine of the update rules.
DATA: S_DATA TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
WITH HEADER LINE
WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
DATA: ITEM_TABLE TYPE STANDARD TABLE OF /BI0/MCUST_SALES
WITH HEADER LINE
WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
*start of modification
*Populating the item status data from the item status ODS table
SELECT * FROM /BI0/MCUST_SALES INTO TABLE ITEM_TABLE.
LOOP AT DATA_PACKAGE INTO S_DATA.
LOOP AT ITEM_TABLE WHERE CUST_SALES = S_DATA-CUST_SALES.
MOVE item_table-DISTR_CHAN to s_data-DISTR_CHAN.
APPEND S_DATA.
ENDLOOP.
ENDLOOP.
DATA_PACKAGE[] = S_DATA[].
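In effect, both routines do a keyed lookup to enrich each record with the distribution channel from the customer sales master. A conceptual Python sketch with made-up data (not the real SAP tables) of the lookup-with-fallback logic:

```python
# Conceptual sketch: enrich records with DISTR_CHAN from a master-data lookup.
cust_sales_master = {"C1": "10", "C2": "20"}  # hypothetical CUST_SALES -> DISTR_CHAN

def enrich(record):
    # Fall back to the value already on the record when no master entry exists,
    # mirroring the SELECT SINGLE ... IF SY-SUBRC branch in the update routine.
    looked_up = cust_sales_master.get(record["CUST_SALES"])
    if looked_up is not None:
        record["DISTR_CHAN"] = looked_up
    return record

print(enrich({"CUST_SALES": "C1", "DISTR_CHAN": ""}))    # master hit -> "10"
print(enrich({"CUST_SALES": "C9", "DISTR_CHAN": "05"}))  # no hit -> keeps "05"
```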
Hope this works...
Regards,
San! -
Hi gurus
I have done enhancement to populate a new field to the DataSource,
then how would i ad this object to the info cube,
and shall i delete the data which is already in the info cube.
thanx
vidhuHi,
If you have added a new key figure then
1) You need not delete the data; just add the key figure and run a delta. It will bring all the records into this new field and everything will work fine.
2) You can delete the data and run a full load till the cube.
If you have added a characteristic to the DataSource:
1) You will have to add this characteristic to the cube and assign it to a dimension; therefore you will have to delete the data from the cube, and once the update rule is done you will have to run the init data transfer again.
Thanks -
Delta records are not loading from DSO to info cube
My query is about delta loading from DSO to info cube. (Filter used in selection)
Delta records are not loading from DSO to Info cube. I have tried all options available in DTP but no luck.
Selected "Change log" and "Get one request only" and run the DTP, but 0 records got updated in info cube
Selected "Change log" and "Get all new data request by request", but again 0 records got updated
Selected "Change log" and "Only get the delta once", in that case all delta records loaded to info cube as it was in DSO and gave error message "Lock Table Overflow" .
When I run full load using same filter, data is loading from DSO to info cube.
Can anyone please help me on this to get delta records from DSO to info cube?
Thanks,
Shamma
Data is loading in the case of a full load with the same filter, so I don't think the filter is the issue.
When I follow the sequence below, I get a lock table overflow error:
1. Full load from the active table, with or without archive.
2. Then, with the same settings, if I run an init, the final status remains yellow, and when I change the status to green manually it gives a lock table overflow error.
When I change the settings of the DTP to an init run:
1. Selecting "change log" and "get only one request" and running the init completes successfully with green status.
2. But when I run the same DTP for delta records, it does not load any data.
Please help me to resolve this issue. -
Hi,
Generally, reporting is done on an InfoCube rather than a DSO.
Suppose we assign the same DataSource to an InfoCube and a DSO; then both contain the same data.
An InfoCube has additive and aggregated functionality, whereas a DSO has overwrite functionality.
Are we using the cube for this functionality only?
What about the dimensions in a cube - how do they differ from the data fields and key fields in a DSO when we develop the same BEx report on both?
Please advice me .
Thanks in advance.
Thanks & Regards,
Ramnaresh.p
It is hard to compare a cube and a DSO.
Both have their own usage.
1. An InfoCube is always additive, while a DSO supports overwrite functionality.
2. In an InfoCube, the combination of all characteristic values is the key of the fact table, while in an ODS you can specify your own key fields, based on which a unique record is generated in the DSO.
3. A DSO supports many delta modes (D, R, N, X, after image, before image), while a cube does not support all the modes. You cannot delete a record from a cube based on its key just by loading data, while a DSO automatically deletes the record from the active table and generates the reverse entry for the cube.
4. A DSO is a flat structure and is therefore used to store information at detail level, while a cube stores information at an aggregated level.
So the two structures are very different from each other. One can replace the other in some places, but both objects have their own functionality.
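The key behavioral difference in point 1 can be sketched like this (simplified Python, not SAP code): the same two loads accumulate in a cube but overwrite in a DSO.

```python
# Simplified sketch: the same two loads, under cube vs DSO semantics.
loads = [("MAT1", 10), ("MAT1", 25)]  # (key, quantity) arriving in two requests

cube = {}  # additive: values for the same key accumulate
dso = {}   # overwrite: the latest record for a key wins
for key, qty in loads:
    cube[key] = cube.get(key, 0) + qty
    dso[key] = qty

print(cube["MAT1"])  # additive result
print(dso["MAT1"])   # overwrite result
```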
- Danny -
Info Cube not getting full records
Hi All,
I am extracting data from a flat file (60k records, of which most fields have binary values). I am getting the values into the DataSource, but when loading from the DataSource to the InfoCube I am not getting the total number of records.
The load is successful, and the Manage tab of the InfoCube shows 60k records transferred and 510 records added.
What could the problem be? Any help is appreciated.
Hi,
Data with the same characteristic combination is added up, and only a unique characteristic combination gives a new record.
Material -- Customer -- Price
10002 -- C1 -- 20
10002 -- C1 -- 20
10002 -- C1 -- 20
10002 -- C2 -- 20
So, as you can see, there are 4 records in the flat file, but in the InfoCube there will be only 2.
Data with the same combination of characteristics is added up, and only unique combinations are displayed.
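The compression described here is a plain group-by sum over the characteristic columns (illustrative Python with the same numbers as the example):

```python
from collections import defaultdict

# Flat-file rows: (material, customer) are characteristics, price is the key figure.
rows = [
    ("10002", "C1", 20),
    ("10002", "C1", 20),
    ("10002", "C1", 20),
    ("10002", "C2", 20),
]

totals = defaultdict(int)
for material, customer, price in rows:
    totals[(material, customer)] += price  # same combination -> added together

print(len(totals))  # 2 distinct combinations survive out of 4 input rows
```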
Hope it clears you.
Also, please check the data in the flat file.
Regards,
AL -
I am planning to create custom defined DSO Object & Info cube
Hi ,
I am planning to create a custom-defined DSO object and InfoCube. On what basis do I decide which are the key fields and which are the data fields in the DSO?
2. How can I create suitable dimensions and choose suitable characteristics for the dimensions? On what basis do I decide?
Thanks,
chandu.
Hi Diego Garu,
Thanks for your fast response.
VBELN VBAP 2LIS_11_VAITM 0DOC_NUMBER
POSNR VBAP 2LIS_11_VAITM 0S_ORD_ITEM
KUNNR VBAK 2LIS_11_VAHDR 0SOLD_TO
VBELN VBRP 2LIS_13_VDITM 0BILL_NUM
FKDAT VBRK 2LIS_13_VDHDR 0BILL_DATE
INCO1 VBRK 2LIS_13_VDHDR(INCO1FieldNot Available in Data Source) 0INCOTERMS
ZTERM VBRK 2LIS_13_VDHDR(Payment terms field Not Available in Data Source) 0UCPYTERMS
NETWR VBRP 2LIS_13_VDITM 0NETVAL_INV.
Here the data is coming from multiple tables; that is why I am planning to create a custom-defined DataSource based on a view. How can I decide whether a DSO or a cube is suitable here?
Suppose a DSO is suitable: how can I decide which fields are data fields and which are key fields?
And how can I decide how many dimensions are needed, and which characteristics are suitable for those dimensions?
Thanks ,
chandu. -
Open Orders are negative in ODS and Info cube
Hi,
Our ODS is getting data from sales order item data, and from the ODS the data goes to the InfoCube.
The problem is that for a few sales orders, open orders are being deleted in R/3, but the same records reverse the old records; because of this, the sales order quantity in the ODS and InfoCube is negative although the orders themselves are not negative.
Could anyone let me know how we can resolve it?
Regards,
Sharma. IVN
Hi Sharma,
You should consider checking the attached links below:
ROCANCEL field in R3 extraction program can't catch the LOEKZ (deletion ind
http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_bct/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d383836373136%7d
This may help in this case.
Regards,
Pietro -
Adding a new key figure to Info Cube
Can I add a new key figure to an existing Info cube in which data is loaded?
Assume that In Info Cube we have the following.
Stu Id -- Characteristic
Maths , Physics , Chemistry -- Key figures
These are the info objects which are already available and transfer rules are already available for the same in the infosource.
Could anyone let me know how to add one more new key figure (total), which is the sum of the three marks, to the InfoCube, and how to populate the data into it?
I have tried the following steps , but could not get the solution.
1). Create a new key figure info object (total).
2). Add the same to the communication and pull the same into transfer rules of the existing info source and activate the same.
3). Add the new key figure to the info cube.
4). Open transfer rules for the info cube and change the mode from NO updation to Addition for the particular key figure (total).
When I perform the 4th step it is giving a red symbol beside the key figure in update rules and I'm not able to activate the same.
Any help on how to add key figures to update rules and transfer rules is highly appreciated and points would be assigned.
Thanks
1. Add the new key figure to the InfoCube.
2. Go to the update rules, go to the update type of that key figure, select "Formula" as the update method, and create a new formula in which you add up your three key figures.
Activate the update rules and the InfoProvider, and check that everything is active.
By using an export-generated DataSource you can populate the historical data into the new key figure as well; then you have to delete the historical data requests.
Or you can create the formula in the Query Designer, as Srinivas suggested; that would be the better option.
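Either way, the derived key figure is just the per-record sum of the three existing marks (a conceptual sketch, not SAP code):

```python
# Conceptual sketch: derive a 'total' key figure from three existing ones.
record = {"STU_ID": "S1", "MATHS": 80, "PHYSICS": 70, "CHEMISTRY": 90}

# The update-rule formula (or Query Designer formula) amounts to:
record["TOTAL"] = record["MATHS"] + record["PHYSICS"] + record["CHEMISTRY"]
print(record["TOTAL"])  # 240
```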