Table parameter does not contain values in RFC call
Hello Experts,
I created an RFC in R/3 with the table parameter I_CUSTOMER. This RFC is called by CRM to pass all customer data to R/3. The RFC was triggered by CRM passing 20 entries in I_CUSTOMER; however, when the debugger starts in R/3, the table I_CUSTOMER is empty. Can anyone help me resolve this issue? Any ideas? Thanks in advance!
Best Regards,
Alezandro
Hi Alezandro,
What is the type of the table parameter I_CUSTOMER you have declared?
If CRM is passing 20 values through I_CUSTOMER, you must use a
<b>TABLE TYPE</b>, which is to be created in the dictionary.
If that doesn't help you, can you be more specific?
Ram
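As a sketch of what Ram suggests (all names here are hypothetical, not from the thread): the table parameter of a remote-enabled function module should be typed against a dictionary definition that exists identically on both the CRM and R/3 sides; a mismatch between the two definitions is a commonly cited reason the table arrives empty.

```abap
FUNCTION z_pass_customers.
*"----------------------------------------------------------------------
*"*"Local Interface:
*"  TABLES
*"      I_CUSTOMER STRUCTURE ZCUSTOMER  " ZCUSTOMER: dictionary structure,
*"                                      " identical in CRM and R/3
*"----------------------------------------------------------------------
  " If the interface matches on both systems, the 20 entries passed by
  " CRM are visible here.
  LOOP AT i_customer.
    " ... process one customer record ...
  ENDLOOP.
ENDFUNCTION.
```

The function module must also have processing type "Remote-Enabled Module" in SE37, and it is worth comparing the parameter definition in both systems character by character.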
Similar Messages
-
Error msg, "Maximum selection does not contain value 9999999999.
Hi, I am getting this error while doing Reclassification of HFM Movement:
Error msg, "Maximum selection does not contain value 9999999999 of subassignment HFM CD2 & Asset Type"
in the Data Monitor (in Reclass Net Reserve), while running the BCF task for a company for period 16/2009.
Actually, we have converted the consolidation area from XX (2009) to YY (2010).
Before this task, the user performed the balance carryforward.
My Analysis is as follows:-
Method Used in this Reclassification
RECYX (Reclassify HFM Movement at YE lvl 10)
TRIGGER
Items : 100-299, 9900-9999
HFM Movement Type : O200 to O300
Posting Level : 10
TARGET
Items : 100-299, 9900-9999
Move : O100C
Posting Level : 10
HFM CD2 and Asset type : Ticked
After TEST run in UCMON, the error message comes :-
The "HFM CD2 & Asset Type value 9999999999" is not maintained in Breakdown Categories Y500, Y700 etc.
Breakdown categories in Workbench
Maintained as "OPTIONAL, Initialized value allowed".
Single Selection
1A
1B
AR
GA
Default Selection
99999999999
BCS CUBE
HFM CD2 & Asset Type - coming blank
LOTR
HFM CD2 & Asset Type - Coming blank
Please help to go forward.
In Anticipation,
Deepankar Jain

Thanks a lot, Dan, for the reply.
Yes, the single selection is 1A, 1B, etc. I agree that including the default value 9999999999 in the single selection will resolve this issue, but the BREAKDOWN TYPE is OPTIONAL; that means it will also accept a blank value, right?
Also, in the method defined for RECLASSIFICATION in the workbench, in the TARGET parameters, the "HFM CD2 and Asset Type" is blank but "DEFAULT is TICKED", which makes it a mandatory derivation rule.
Please correct me if I am wrong.
Your replies on it are highly appreciated.
Thanks and Regards,
Deepankar -
Maximum selection does not contain value 600 of subassignment Subitem-error
Hi All,
I'm getting the following error when I try to load the file through the data stream:
Maximum selection does not contain value 600 of subassignment Subitem
Message no. UCD1038
Diagnosis
The posting item 11100100 has breakdown category BR01 for which a maximum selection was maintained for subassignment Subitem. The current posting value, however, is not in this maximum selection.
System Response
The posting is not possible.
Procedure
Check the maximum selection and the posting.
I have already referred to the earlier threads, but they are of no use to me. I have checked my settings w.r.t. the FS item breakdown category, the max. selection in the breakdown category, and the subitems; all the settings are fine. I even applied OSS note 1074599, but I'm still facing the same issue.
Can any one help me with this regards, thanks in advance.
Regards,
Magesh

Hi Dan,
Yes, the breakdown category has been assigned a subitem category, which in turn is assigned to subitems, and this subitem is part of the subitem category that is assigned in the max. selection.
Let me know if you want any more information on the same.
Regards,
Magesh -
Repos - logical table source does not contain mapping
Hi,
I have a repository question.
I do not have a fact table in my physical layer. Do I have to have one? I thought the fact table got created in the BM.
I do have a date table in the physical layer, but it isn't joined to any other table.
I added the PK from the date_dimension to my logical fact table.
When I open the properties of my logical date table, it is mapped to the physical date table.
Now in answers, I can query the date_dimension without error.
But, when I query the fact table using the date key, I get this error.
Error Details
Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 15018] Incorrectly defined logical table source (for fact table CRTK_FACTS) does not contain mapping for [CRTK_FACTS.MASTER_AI_ID, CRTK_FACTS.INT_DOC_ID, CRTK_FACTS.DATE_KEY]. (HY000)
SQL Issued: SELECT CRTK_FACTS.MASTER_AI_ID saw_0, CRTK_FACTS.INT_DOC_ID saw_1, CRTK_FACTS.DATE_KEY saw_2 FROM CRTK ORDER BY saw_0, saw_1, saw_2
Anyone have any suggestions on what I am missing?
thanks,
Kathy

I don't have a fact table in the physical layer. Do you mean a join between the date table and the other dimension tables?
Or do I have to build the fact table in the physical layer?
I have not built any hierarchies yet, working on it. -
SAP:E:000:Table 'T100 ' does not contain an entry for ' 000'?
Gurus,
When a user updates the size grid in the material master, the third-party legacy system receives the below error:
SAP:E:000:Table 'T100 ' does not contain an entry for ' 000'
Could you all please advise why this error occurred?
Thanks in advance.

T100 is the table that holds all messages in SAP.
The key to access this table is the message class and the message number.
Reading the message that you provided, it looks like SAP is searching for a message number 000 without having a message class.
Do you use any exit with your own programming?
You probably need to debug the program to find the root cause -
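For reference, the lookup described in the reply above can be sketched directly against T100, which is keyed by language, message class (ARBGB) and message number (MSGNR); the message class 'ZMM' here is illustrative, not from the thread:

```abap
DATA lv_text TYPE t100-text.

" An empty message class - as in the reported error ' 000' - can never
" find an entry, which is exactly what the legacy system is seeing.
SELECT SINGLE text FROM t100 INTO lv_text
  WHERE sprsl = sy-langu
    AND arbgb = 'ZMM'      " message class (blank in the failing call)
    AND msgnr = '000'.
IF sy-subrc <> 0.
  " No T100 entry found: the situation behind the legacy-system error.
ENDIF.
```

Debugging the user exit and checking where a MESSAGE statement is issued without a message class is the usual next step.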
ORA-12014: table 'DBA' does not contain a primary key constraint
Hi,
When implementing basic replication I got the below error:
ORA-12014: table 'DBA' does not contain a primary key constraint
I was wondering whether the primary key is enabled on the remote table.
Any idea about this?
Regards
mohan
I am giving below example
AT master site
global_names=false in init.ora file
sql>create table dba(no number primary key);
table created
and create a snapshot log:
sql>create snapshot log on m1;
materialized view created
AT SNAPSHOT SITE
1. Create a service using a Net8 string name like n1
2. Create a database link:
sql>create public database link m3 connect to system identified by manager using 'n1';
Database link created.
3. When creating the snapshot site I got the below error:
SQL> create snapshot snap1 refresh fast start with sysdate next sysdate+1/(24*60
*60) as select * from dba@m3;
create snapshot snap1 refresh fast start with sysdate next sysdate+1/(24*60*60)
as select * from dba@m3
ERROR at line 1:
ORA-12014: table 'DBA' does not contain a primary key constraint

Hello,
Please repost this question in the appropriate section of the Discussion Forum.
This forum is for general suggestions and feedback about the OTN site.
You can also use our new offering called OTN Service Network:
For Oracle Advice/Minimal Support (fee based) on the Oracle Database Server, Oracle9i Application Server Containers for J2EE (OC4J), Oracle9i JDeveloper, Reports, Forms, SQL*Plus, and PL/SQL, please go to: http://www.oracle.com/go/?&Src=912386&Act=45
For customers with paid support (Metalink) please go to:
http://www.oracle.com/support/metalink
Regards,
OTN -
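For completeness: a fast-refreshable snapshot needs a primary key on the master table and a snapshot log on that same table. Note that in the steps quoted above the log was created on M1 while the snapshot selects from DBA. A minimal sketch, with illustrative object names:

```sql
-- At the master site: primary key plus a snapshot log on the SAME table
CREATE TABLE dba_demo (no NUMBER PRIMARY KEY);
CREATE SNAPSHOT LOG ON dba_demo;

-- At the snapshot site: fast refresh now has a log to read from
CREATE SNAPSHOT snap1
  REFRESH FAST
  START WITH SYSDATE NEXT SYSDATE + 1/(24*60*60)
  AS SELECT * FROM dba_demo@m3;
```

Also note that DBA is a reserved-looking name in Oracle; a less ambiguous table name avoids confusion with the DBA_* dictionary views.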
Public Page Parameter does not get value
I use the following API call to create a URL that will raise an event and send the “facilityId” parameter to the Public Page parameter of a destination page.
String URLString = EventUtils.constructEventLink(prr,"FacilityInfoPage", linkParams,true,true);
The link/URL generated by this code is:
http://ese-ny.its-ese.local/portal/event?_pageid=584,1533923&_dad=portal&_schema=PORTAL&_portlet_ref=584_1533941_584_1533923_1533923&_event_facilityId=36003001001&_eventName_FacilityInfoPage=
When I click on the link I am taken to the destination page and the following URL shows up in the address bar of the browser:
http://ese-ny.its-ese.local/portal/page?_pageid=584,1533732&_dad=portal&_schema=PORTAL
Issue 1: The Public Page parameter of the destination page does not receive the value of facilityId, Why?
Issue 2: Why is the URL in the address bar truncated after "PORTAL"?
Any ideas?

Issue 1: The Public Page parameter of the destination page does not receive the value of facilityId. Why?

The event link sends an event (FacilityInfoPage) to the page with the given parameters, if any. In the page's edit tab you can link the event to a page and link the event's parameters to page parameters. This does not happen automatically.
So probably you haven't linked the event parameters to page parameters.
Issue 2: Why is the URL in the address bar truncated after "PORTAL"?

After an event is triggered you are forwarded to the page specified for the event, with the parameters specified in the event. If none are specified, I guess you go to the original page with no parameters. The URL is then only the URL of the page. -
ORA-12014 table does not contain a primary key constraint
Hi
I have some existing Materialised Views I am trying to redeploy through OWB as its now our standard tool.
The existing code has:
CREATE MATERIALIZED VIEW .......
REFRESH ON DEMAND WITH ROWID AS
SELECT *
FROM apps.fafg_assets
When I create it in OWB you only enter the select statement; there is nowhere to put the 'with rowid' part, hence I get the following error on deployment:
ORA-12014: table 'FAFG_ASSETS' does not contain a primary key constraint
I cannot put a primary key on this table, though, so how do I get around this in OWB? As I say, writing the MV in PL/SQL with the 'with rowid' clause makes it work.
Thanks

Hi...
I believe you'll need a PK so Oracle will know how to update the MV. Is there any particular reason for not having a PK on the FAFG_ASSETS table? As an alternative, you may want to create a new column in this table and have a table trigger/sequence populate this column.
But it looks like you are using EBS, so I don't know if you can add new columns to its tables.
See if this thread can help you:
Re: ORA-12014: table 'XXX' does not contain a primary key constraint
Regards,
Marcos -
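Outside OWB, the ROWID-based approach the original poster describes does work without a primary key, provided the materialized view log is also ROWID-based. A sketch using the table name from the thread (refresh options illustrative):

```sql
-- A ROWID-based log lets the MV fast-refresh without a primary key
CREATE MATERIALIZED VIEW LOG ON apps.fafg_assets WITH ROWID;

CREATE MATERIALIZED VIEW fafg_assets_mv
  REFRESH ON DEMAND WITH ROWID
  AS SELECT * FROM apps.fafg_assets;
```

If OWB cannot emit the WITH ROWID clause, one workaround is to deploy the MV with hand-written DDL like this and register it in OWB afterwards.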
Materialized View - does not contain a primary key constraint
I am trying to create a materialized view. I went through the MV creation wizard, added two columns (foo_column, foo_pk) of the table, and have a simple select statement (Select foo_column from foo_dim). I also created a primary key constraint referencing the primary key (FOO_PK) of the dimension.
I am getting the following error:
ORA-12014: table 'FOO_DIM' does not contain a primary key constraint

It was solved. The table that I am querying has to have a primary key defined before creating a materialized view.
-
Incorrectly defined logical table source (for fact table Facts) does not
Hi,
I have two Dimensions A and B. A is joined to B by a foreign Key.
The report works if I pull B.Column1, A.Column2.
The report throws an error if I try to change the order of the columns to A.Column2, B.Column1.
error : Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
File: odbcstatementimpl.cpp, Line: 186
State: S1000. Code: 10058. [NQODBC] [SQL_STATE: S1000] [nQSError: 10058] A general error has occurred. [nQSError: 15018] Incorrectly defined logical table source (for fact table Facts) does not contain mapping for B.Column1
I am not sure where it is going wrong.
Thanks
jagadeesh
Edited by: Jagadeesh Kasu on Jun 16, 2009 4:22 PM

Did you make the joins in the LTS or on the physical table?
Try to make the joins in the LTS if they are not there.
Trying to access row values in a table which does not have any rows yet
try {
    MappedRecord importParams = recordFactory.createMappedRecord("CONTAINER_OF_IMPORT_PARAMS");
    IFunction function1 = client.getFunctionsMetaData().getFunction(funModGetDet);
    IStructureFactory strucFact = interaction.retrieveStructureFactory();
    response.write("try2 :" + pnumber);
    IRecord structure = (IRecord) strucFact.getStructure(function1.getParameter("PERNR_TAB").getStructure());
    response.write("try111 :" + pnumber);
    structure.setString("PERNR", pnumber);
I am getting the error "Trying to access row values in a table which does not have any rows yet", where PERNR_TAB is a table containing the field "PERNR".
Can anybody help me out?

Please re-post this question in the appropriate forum. It seems to have nothing to do with Web Dynpro.
-
Selected model does not contain any target value prior
Hi ODM experts,
I have tried to apply the SVM algorithm in order to find anomalous records. The source table has rows like this:
uniq_rec ID NAME A1 A2 A3 A4 A5 data
577 2052956018 NAMEHDRCP8 2.27 0.4 85.46 0.01 14.54 24-JAN-13
578 1250914484 NAMEDJDRVP3 11.45 1.24 56.24 0.01 43.77 24-JAN-13
579 1968689283 NAMEDKEND12 0.000011 6.78 0.000029 0.01 0.091 24-JAN-13
580 2063389130 NAMEDNMXG14 0.000011 0.65 36.65 0.02 0.091 24-JAN-13
unq_rec is the PK, id is the id for the generic name, A1..A5 are attributes, and data is when the collection occurred, etc.
I'm trying to execute the following code:
drop table ALG_SET;
exec dbms_data_mining.drop_model('SVMODEL');
create table ALG_SET (setting_name varchar2(30), setting_value varchar2(4000));
insert into ALG_SET values ('ALGO_NAME','ALGO_SUPPORT_VECTOR_MACHINES');
insert into ALG_SET values ('PREP_AUTO','ON');
commit;
Begin
dbms_data_mining.create_model('SVMODEL', 'CLASSIFICATION', 'ODM_PAR_FIN_HIST', 'UNQ_CRT', null, 'ALG_SET');
end;
The result is the following error: ORA-40104: invalid training data for model build (if I run the code). If I run it from the graphical interface, I obtain the error
"Selected model does not contain any target value prior" (using the similar model - SVM for anomaly detection - plus the same source table).
Please advise what is missing or wrong and, if possible, how to bypass this issue.
Thanks in advance for the support.
Best Regards,
Bogdan

Here is also a newer example of creating an SVM anomaly model from the ODM sample code (12.1 version, but this applies to 11.2):
Rem
Rem $Header: rdbms/demo/dmsvodem.sql /main/6 2012/04/15 16:31:56 xbarr Exp $
Rem
Rem dmsvodem.sql
Rem
Rem Copyright (c) 2004, 2012, Oracle and/or its affiliates.
Rem All rights reserved.
Rem
Rem NAME
Rem dmsvodem.sql - Sample program for the DBMS_DATA_MINING package.
Rem
Rem DESCRIPTION
Rem This script creates an anomaly detection model
Rem for data analysis and outlier identification using the
Rem one-class SVM algorithm
Rem and data in the SH (Sales History)schema in the RDBMS.
Rem
Rem NOTES
Rem
Rem
Rem MODIFIED (MM/DD/YY)
Rem amozes 01/23/12 - updates for 12c
Rem xbarr 01/10/12 - add prediction_details demo
Rem ramkrish 06/14/07 - remove commit after settings
Rem ramkrish 10/25/07 - replace deprecated get_model calls with catalog
Rem queries
Rem ktaylor 07/11/05 - minor edits to comments
Rem jcjeon 01/18/05 - add column format
Rem bmilenov 10/28/04 - bmilenov_oneclass_demo
Rem bmilenov 10/25/04 - Remove dbms_output statements
Rem bmilenov 10/22/04 - Comment revision
Rem bmilenov 10/20/04 - Created
Rem
SET serveroutput ON
SET trimspool ON
SET pages 10000
SET echo ON
-- SAMPLE PROBLEM
-- Given demographics about a set of customers that are known to have
-- an affinity card, 1) find the most atypical members of this group
-- (outlier identification), 2) discover the common demographic
-- characteristics of the most typical customers with affinity card,
-- and 3) compute how typical a given new/hypothetical customer is.
-- DATA
-- The data for this sample is composed from base tables in the SH schema
-- (See Sample Schema Documentation) and presented through a view:
-- mining_data_one_class_v
-- (See dmsh.sql for view definition).
-- BUILD THE MODEL
-- Cleanup old model with the same name (if any)
BEGIN DBMS_DATA_MINING.DROP_MODEL('SVMO_SH_Clas_sample');
EXCEPTION WHEN OTHERS THEN NULL; END;
-- PREPARE DATA
-- Automatic data preparation is used.
-- SPECIFY SETTINGS
-- Cleanup old settings table (if any)
BEGIN
EXECUTE IMMEDIATE 'DROP TABLE svmo_sh_sample_settings';
EXCEPTION WHEN OTHERS THEN
NULL;
END;
-- CREATE AND POPULATE A SETTINGS TABLE
set echo off
CREATE TABLE svmo_sh_sample_settings (
setting_name VARCHAR2(30),
setting_value VARCHAR2(4000));
set echo on
BEGIN
-- Populate settings table
-- SVM needs to be selected explicitly (default classifier: Naive Bayes)
-- Examples of other possible overrides are:
-- select a different rate of outliers in the data (default 0.1)
-- (dbms_data_mining.svms_outlier_rate, ,0.05);
-- select a kernel type (default kernel: selected by the algorithm)
-- (dbms_data_mining.svms_kernel_function, dbms_data_mining.svms_linear);
-- (dbms_data_mining.svms_kernel_function, dbms_data_mining.svms_gaussian);
-- turn off active learning (enabled by default)
-- (dbms_data_mining.svms_active_learning, dbms_data_mining.svms_al_disable);
INSERT INTO svmo_sh_sample_settings (setting_name, setting_value) VALUES
(dbms_data_mining.algo_name, dbms_data_mining.algo_support_vector_machines);
INSERT INTO svmo_sh_sample_settings (setting_name, setting_value) VALUES
(dbms_data_mining.prep_auto, dbms_data_mining.prep_auto_on);
END;
-- CREATE A MODEL
-- Build a new one-class SVM Model
-- Note the NULL specification for the target column name
BEGIN
DBMS_DATA_MINING.CREATE_MODEL(
model_name => 'SVMO_SH_Clas_sample',
mining_function => dbms_data_mining.classification,
data_table_name => 'mining_data_one_class_v',
case_id_column_name => 'cust_id',
target_column_name => NULL,
settings_table_name => 'svmo_sh_sample_settings');
END;
-- DISPLAY MODEL SETTINGS
column setting_name format a30
column setting_value format a30
SELECT setting_name, setting_value
FROM user_mining_model_settings
WHERE model_name = 'SVMO_SH_CLAS_SAMPLE'
ORDER BY setting_name; -
Load ODS - InfoObject: InfoObject does not contain alpha-conforming value
Hello everybody,
I get following error while uploading from ODS to InfoObject.
InfoObject /BIC/ZHOUSENUM does not contain alpha-conforming value 0000000000000000004.
The data flow is as follows: Transactional InfoSource -> ODS -> InfoObject.
In the Transfer Rules before ODS I have marked the Conversion check box. The data is populated into ODS without any problems.
I can even activate the ODS, which is reporting-enabled.
When I browse the ODS table with the option "check conversion exits" unmarked, I can see the value '0000000000000000004'.
But the upload into the InfoObject master data fails.
Any help appreciated.
TIA
pawel

sap_all onboard.
I assume this infosource is locked, due to the fact that it was system-generated.
On the other hand, there is SAP Note 559763, which says:
If an InfoObject is filled with ALPHA exits from an R/3 System, the BW assumes that the data is to arrive in the internal ALPHA format and therefore does not convert the data.
I know it is about R/3 as the source system, but I assume the same would hold for BW.
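For reference, the ALPHA conversion the note describes can be reproduced with the standard conversion-exit function modules; the 10-character field below is purely illustrative. A value of 19 digits such as 0000000000000000004 can only be ALPHA-conforming if ZHOUSENUM is defined with at least that length, which is worth checking first.

```abap
DATA: lv_ext TYPE c LENGTH 10 VALUE '4',
      lv_int TYPE c LENGTH 10.

" External -> internal: pad with leading zeros ('4' -> '0000000004')
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
  EXPORTING
    input  = lv_ext
  IMPORTING
    output = lv_int.

" Internal -> external: strip the zeros again ('0000000004' -> '4')
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
  EXPORTING
    input  = lv_int
  IMPORTING
    output = lv_ext.
```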
p. -
Dump showing that table does not contain any elements.
Hi All,
I have a table with lead select. Based on the row selected, the corresponding values of the row are filled into the input fields.
I filter the table so that only rows with status "waiting" are displayed. Now the problem is: when there is only one row with status "waiting" and I process it, its status changes to "done", as a result of which there is no row left in the table, and I get the following error:
Adapter error in INPUT_FIELD "IF_STATUS" of view "ZPWB.DISTRIBUTE_VIEW": Context binding for property VALUE cannot be resolved: The DISTRIBUTE_VIEW.1.PD_ITEMS_TAB node does not contain any elements
Can you please help out :)

I have now the same problem.
Has somebody found out what the solution is for this issue?
If I have an empty context, I get a dump due to the context binding. What is the workaround? -
Hi All,
I have a cube in which I'm using the TIME DIM that I created in the warehouse. But now I wanted a new measure in the cube, Average over time, and when I went to create the new measure I got a message that no time dimension was defined, so I created a new time dimension in SSAS using the wizard. But when I tried to process the new time dimension, I got the following error message:
"Errors in the high-level relational engine. The data source view does not contain a definition for "SSASTIMEDIM", the table or view. The Source property may not have been set."
Can anyone please tell me why I cannot create a new measure Average over time using my time dimension? Also, what am I doing wrong with SSASTIMEDIM that I'm getting this error?
Thanks

Hi PMunshi,
According to your description, you get the above error when processing the time dimension. Right?
In this scenario, since you have updated the DSV, there should be no problem with the table's existence. One possibility is that the table has been specified for tracking in the notifications for proactive caching but isn't available any more for some reason. Please change the Proactive Caching setting to "MOLAP".
Reference:
How To Implement Proactive Caching in SQL Server Analysis Services SSAS
If you have any question, please feel free to ask.
Best Regards,
Simon Hou
TechNet Community Support