Siebel Data Mapper issue
Hi everyone,
I have a requirement to map two external IOs using the declarative Data Map. In addition, the mapping for one integration component in the source IO has to be conditioned on whether it has children: if children are present, they get mapped; if not, the parent is mapped. How can I do this? It sounds simple, but I have tried a Search Spec and a Precondition at the component level and neither seems to work. Using XSLT would be a no-brainer, but the client insists on Data Maps.
Has anyone run into the same issue?
Thanks for your input,
Hi,
Try using the EAI Dispatch Service for your requirement. You can check the hierarchy of the message in different ways and invoke different maps based on your rules.
This service is documented in Bookshelf.
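The decision the dispatcher has to make is simple, and can be sketched in plain Python (hypothetical data structure, not Siebel syntax, and the map names are made up): route to a child-level map when the component has children, otherwise to a parent-level map.

```python
# Sketch of the dispatch decision the original poster needs (hypothetical,
# not Siebel syntax): invoke a child-level map when the component has
# children, otherwise a parent-level map.
def choose_data_map(component):
    """Return the name of the data map to invoke for this component."""
    if component.get("children"):        # non-empty list -> map the children
        return "Map_Children"
    return "Map_Parent"                  # no children -> map the parent

parent_only = {"name": "Account A", "children": []}
with_kids = {"name": "Account B", "children": [{"name": "Contact 1"}]}

print(choose_data_map(parent_only))  # Map_Parent
print(choose_data_map(with_kids))    # Map_Children
```

In Siebel itself, the same branch would be expressed as dispatch rules that test the message hierarchy and invoke one of two declarative maps.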
Regards.
Similar Messages
-
Hi,
What is the difference between the Siebel Data Mapper and XSLT? One difference I have identified is that an XSLT can convert one XML document to another without needing an IO defined, as the Data Mapper requires.
Apart from this I have failed to identify any major difference.
Please share any other differences you know of that can help decide when to go for XSLT.
Thanks :-)
Hello,
I found a workaround to the problem :-) you need to close JDeveloper and relaunch it. The linking of the two sides is then possible.
Jon -
Issue importing Siebel data model into OPM
I am using OPM 10.2 and having an issue importing the Siebel data model into OPM. I also have the 10.2 Siebel connector installed on the desktop.
In OPM I cannot see the Siebel --> Import Data Model option in the menu.
Is there any configuration change that needs to happen in OPM to do this?
Edited by: SP1D3R MAN on 17/11/2011 18:02
1. Make sure that you have Siebel Connector 10.2.1 from Oracle Software Delivery Cloud (formerly E-Delivery); there was a bug in the earlier 10.2 where the Policy Modeling plugin was not working.
2. Make sure that the files Oracle.Modeling.Siebel.Exchange.dll and Oracle.Modeling.Siebel.Plugin.dll exist in the Plugins sub-directory of Policy Modeling (default location C:\Program Files\Oracle\Policy Modeling\plugins).
3. Make sure you are using Policy Modeling 10.2.
If you need to manually copy the DLLs, they can be found in the Siebel connector download at program files\Oracle\Policy Modeling\plugins\ -
How to map between child objects in a Siebel Data Map
Hello. I need to map a field from one child object to another child object using a Siebel Data Map.
Specifically, I have an Integration Object set up for FINS Group Policy with 2 child components (Action and Proposal). I want to map the Proposal's name to the Activity's Description. Here's the structure:
Policy (io)
-Name
-Action (io)
--Description
-Proposal (io)
--PPSLFileName (this is the proposal's file name)
I can map from the parent (Policy) to the child (Action) as follows:
Integration Component Map: Action-to_Action
Source Expression: [FINS Group Policy.Name]
Target Field Name: Description
This will map the Policy's Name to the Activity's Description.
I tried these, but they don't work:
1)
Integration Component Map: Action-to_Action
Source Expression: [FINS Group Policy.Proposal.PPSLFileName]
Target Field Name: Description
2)
Integration Component Map: Action-to_Action
Source Expression: [Proposal.PPSLFileName]
Target Field Name: Description
Thanks for any help
Hi,
For your issue, you can use the second option you mentioned:
Integration Component Map: Action-to_Action
Source Expression: [Proposal.PPSLFileName]
Target Field Name: Description
But it has a limitation: the child's (Proposal) cardinality should be 1 or less, i.e. the number of instances of the Proposal IC must not exceed 1, or else the data map will error out at runtime.
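That cardinality constraint can be mirrored as a pre-check before the map runs; here is a small Python sketch (hypothetical dict structure standing in for the Policy instance, not Siebel code):

```python
# Hypothetical pre-check mirroring the data map's runtime limitation:
# the [Proposal.PPSLFileName] cross-child reference only works when at
# most one Proposal instance exists under the Policy.
def proposal_file_name(policy):
    proposals = policy.get("Proposal", [])
    if len(proposals) > 1:
        raise ValueError("Data map would error out: Proposal cardinality > 1")
    return proposals[0]["PPSLFileName"] if proposals else None

policy = {"Name": "P1", "Proposal": [{"PPSLFileName": "quote.pdf"}]}
print(proposal_file_name(policy))  # quote.pdf
```

If more than one Proposal can occur, the source message would need to be filtered down to a single instance before the map is invoked.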
Regards,
Sambit -
How to convert siebel date to following format?
Hi,
How to convert siebel date to following format?
01-01-2011 5:00 AM (dd-mm-yyyy hh:mm AM)
01-JAN-2011 5:00 AM (dd-MON-yyyy hh:mm AM)
01/01/2011 5:00 AM (dd/mm/yyyy hh:mm AM)
Please suggest functions for the conversion. I have tried a few but they didn't work.
Thanks.
Excerpts from
http://bipconsulting.blogspot.com/2010/01/my-date-doesnt-like-me-with-siebel.html
".. the date data generated by Siebel with the IO is in ‘MM/DD/YYYY HH:MI:SS’ format, so it is something like ‘10/09/2008 05:31:13’. And BI Publisher doesn’t recognize that this is a date data because it’s not presented in the ‘Canonical’ format!.."
"...There are two possible and reasonable workarounds to overcome this issue. The first one is to use one of the Siebel’s extended function ‘totext()’ to convert the Siebel date format to the ‘Canonical’ date format before the BI Publisher’s date related function comes in..."
Please read on the above link to find out how to actually resolve this issue.
Good Luck
(please if this provides a solution for you grant the points and close this thread. Thanks!)
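Outside of Siebel's totext(), the same conversions are easy to check with standard date formatting; here is a Python sketch using the 'MM/DD/YYYY HH:MI:SS' sample value quoted in the blog excerpt above:

```python
from datetime import datetime

# The excerpt says Siebel IO dates arrive as 'MM/DD/YYYY HH:MI:SS',
# e.g. '10/09/2008 05:31:13'.  Convert that string into the three
# formats the poster asked for.
siebel_value = "10/09/2008 05:31:13"
d = datetime.strptime(siebel_value, "%m/%d/%Y %H:%M:%S")

fmt1 = d.strftime("%d-%m-%Y %I:%M %p")           # dd-mm-yyyy hh:mm AM
fmt2 = d.strftime("%d-%b-%Y %I:%M %p").upper()   # dd-MON-yyyy hh:mm AM
fmt3 = d.strftime("%d/%m/%Y %I:%M %p")           # dd/mm/yyyy hh:mm AM

print(fmt1)  # 09-10-2008 05:31 AM
print(fmt2)  # 09-OCT-2008 05:31 AM
print(fmt3)  # 09/10/2008 05:31 AM
```

In a BI Publisher template the equivalent would be its date-format functions applied after the value is in canonical form, as the linked article describes.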
Jorge -
Windows 8.1 Data reordering issue with Intel Adaptors
According to Intel, there is a data reordering issue with their adapters, and probably with this dumb WiDi software. This is from the Intel site; they say some are fixed: "A future Windows 8 fix will address this issue for other Intel wireless adapters." I have one. Nope, still broken. I get drops all the time. Brand new Toshiba laptop, i7, 16 GB of RAM, an SSD, and a 2 GB video card. It would be nice to be able to play games, but I get dropped all the time. Now, would Microsoft quit hiding this and fix the darn thing. Also, I've been a system admin for 13 years. I have built over 1000 PCs and servers. I know bad software. Please fix this. PLEASE. It's not going to just go away, and it's not just Toshiba; I have seen other companies with the same problem. If there is a fix, PLEASE POST IT, or even a workaround. I have tried everything.
http://www.intel.com/support/wireless/wlan/sb/CS-034535.htm
Hi,
Have you first tried the software fix under this link for your network adapter?
http://www.intel.com/support/wireless/wtech/proset-ws/sb/CS-034041.htm
Please Note: The third-party product discussed here is manufactured by a company that is independent of Microsoft. We make no warranty, implied or otherwise, regarding this product's performance or reliability.
Also, you can check whether there is a driver update under Device Manager or on the manufacturer's website.
Kate Li
TechNet Community Support
Yep, it didn't work. I still get drops all the time, so I had to run a Cat 5e cable to my laptop from my modem; I have an Atheros Gigabit LAN adapter, and that works great. The wireless still drops all the time. Has Microsoft released the patch to fix this, or is it coming in the 8.1 update due in April? The funny thing is it's all for WiDi, and I don't even use WiDi; I got software from Samsung that does that and works better on my TV. Intel and Microsoft need to get this fixed, because they're driving off gamers, and those are the people who make sure they buy Microsoft so they can play games. With the wireless link dead, a great laptop is worthless; what's the point? I've been in IT for 13 years building PCs and servers, so I knew how to run a 60 ft Cat 5e line through a two-story house and terminate it. Most people don't. Fix the problem. -
Data Load Issue "Request is in obsolete version of DataSource"
Hello,
I am getting a very strange data load issue in production. I am able to load the data up to the PSA, but when I run the DTP to load the data into 0EMPLOYEE (a master data object), I get the message below:
Request REQU_1IGEUD6M8EZH8V65JTENZGQHD not extracted; request is in obsolete version of DataSource
The request REQU_1IGEUD6M8EZH8V65JTENZGQHD was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then and the request cannot be extracted with the DTP anymore.
I have taken the following actions:
1. Replicated the DataSource
2. Deleted all requests from the PSA
3. Activated the DataSource using RSDS_DATASOURCE_ACTIVATE_ALL
4. Re-transported the DataSource, transformation, and DTP
Still getting the same issue
If you have any ideas, please reply ASAP.
Samit
Hi
Generate your DataSource in R/3, then replicate and activate the transfer rules.
Regards,
Chandu. -
Hi,
I'm very new to Streams and have a doubt regarding an ORA-01403 issue happening during replication. I need your kind help in this regard. Thanks in advance.
Oracle version: 10.0.3.0
1. Suppose there are 10 LCRs in a transaction and one of the LCRs causes ORA-01403, so none of the LCRs gets executed.
We can read the data of this LCR and manually update the record in the destination database.
Even though this is done, when re-executing the transaction I get the same ORA-01403 on the same LCR.
What could be the possible reason?
Since this is a large-scale system with thousands of transactions, it is not possible to handle the no-data-found issues manually as they occur.
I have written a PL/SQL block which generates UPDATE statements from the old data available in the LCR, so that I can re-execute the transaction again.
The PL/SQL block is given below. Could you please check whether there are any issues in the way it generates the UPDATE statements? Thank you.
/* Formatted on 2008/10/23 14:46 (Formatter Plus v4.8.7) */
--Script for generating the Update scripts for the Message which caused the 'NO DATA FOUND' error.
DECLARE
   res          NUMBER;                -- number of errors to resolve
   ret          NUMBER;                -- return value from GetObject
   i            NUMBER;                -- loop index
   j            NUMBER;                -- loop index
   k            NUMBER;                -- loop index
   pk_count     NUMBER;                -- number of PK columns for a table
   lcr          ANYDATA;               -- the logical change record
   typ          VARCHAR2 (61);         -- type of a column
   rowlcr       SYS.LCR$_ROW_RECORD;   -- the LCR that caused the error in a txn
   oldlist      SYS.LCR$_ROW_LIST;     -- old data of the row that failed to update/delete
   newlist      SYS.LCR$_ROW_LIST;
   upd_qry      VARCHAR2 (5000);
   del_qry      VARCHAR2 (5000);
   ins_qry      VARCHAR2 (5000);
   equals       VARCHAR2 (5) := ' = ';
   data1        VARCHAR2 (2000);
   num1         NUMBER;
   date1        TIMESTAMP (0);
   timestamp1   TIMESTAMP (3);
   iscomma      BOOLEAN;

   TYPE tab_lcr IS TABLE OF ANYDATA INDEX BY BINARY_INTEGER;
   TYPE pk_cols IS TABLE OF VARCHAR2 (50) INDEX BY BINARY_INTEGER;

   lcr_table    tab_lcr;
   pk_table     pk_cols;
BEGIN
   i := 1;
   SELECT COUNT (1) INTO res FROM dba_apply_error;

   -- Collect the LCRs of the failed transaction
   FOR txn_id IN (SELECT message_number, local_transaction_id
                    FROM dba_apply_error
                   WHERE local_transaction_id = '2.85.42516'
                   ORDER BY error_creation_time)
   LOOP
      SELECT DBMS_APPLY_ADM.get_error_message
                (txn_id.message_number, txn_id.local_transaction_id)
        INTO lcr
        FROM DUAL;
      lcr_table (i) := lcr;
      i := i + 1;
   END LOOP;

   DBMS_OUTPUT.put_line ('size >' || lcr_table.COUNT);

   FOR k IN 1 .. lcr_table.COUNT   -- loop over the collected LCRs only
   LOOP
      rowlcr := NULL;
      ret := lcr_table (k).GetObject (rowlcr);

      -- Find the number of PK columns of the table
      SELECT COUNT (1)
        INTO pk_count
        FROM all_cons_columns col, all_constraints con
       WHERE col.table_name = con.table_name
         AND col.constraint_name = con.constraint_name
         AND con.constraint_type = 'P'
         AND con.table_name = rowlcr.GET_OBJECT_NAME;
      DBMS_OUTPUT.put_line ('Count of PK Columns >' || pk_count);

      del_qry := 'DELETE FROM ' || rowlcr.GET_OBJECT_NAME || ' WHERE ';
      ins_qry := 'INSERT INTO ' || rowlcr.GET_OBJECT_NAME || ' ( ';
      upd_qry := 'UPDATE ' || rowlcr.GET_OBJECT_NAME || ' SET ';

      oldlist := rowlcr.GET_VALUES ('old');
      -- Generate the UPDATE query from the "old" values in the LCR
      newlist := rowlcr.GET_VALUES ('old');
      iscomma := FALSE;

      FOR j IN 1 .. newlist.COUNT
      LOOP
         IF newlist (j) IS NOT NULL
         THEN
            IF j < newlist.COUNT AND iscomma
            THEN
               upd_qry := upd_qry || ',';
            END IF;
            iscomma := FALSE;
            typ := newlist (j).data.GetTypeName;

            IF typ = 'SYS.VARCHAR2'
            THEN
               ret := newlist (j).data.GetVarchar2 (data1);
               IF data1 IS NOT NULL
               THEN
                  upd_qry := upd_qry || newlist (j).column_name || equals
                             || ' ''' || SUBSTR (data1, 1, 253) || '''';
                  iscomma := TRUE;
               END IF;
            ELSIF typ = 'SYS.NUMBER'
            THEN
               ret := newlist (j).data.GetNumber (num1);
               IF num1 IS NOT NULL
               THEN
                  upd_qry := upd_qry || newlist (j).column_name || equals
                             || ' ' || num1;
                  iscomma := TRUE;
               END IF;
            ELSIF typ = 'SYS.DATE'
            THEN
               ret := newlist (j).data.GetDate (date1);
               IF date1 IS NOT NULL
               THEN
                  upd_qry := upd_qry || newlist (j).column_name || equals
                             || ' TO_DATE( ''' || date1
                             || ''', ''DD/MON/YYYY HH:MI:SS AM'')';
                  iscomma := TRUE;
               END IF;
            ELSIF typ = 'SYS.TIMESTAMP'
            THEN
               ret := newlist (j).data.GetTimestamp (timestamp1);
               IF timestamp1 IS NOT NULL
               THEN
                  upd_qry := upd_qry || ' ''' || timestamp1 || '''';
                  iscomma := TRUE;
               END IF;
            END IF;
         END IF;
      END LOOP;

      -- Build the WHERE clause from the PK columns
      upd_qry := upd_qry || ' WHERE ';

      FOR i IN 1 .. pk_count
      LOOP
         SELECT column_name
           INTO pk_table (i)
           FROM all_cons_columns col, all_constraints con
          WHERE col.table_name = con.table_name
            AND col.constraint_name = con.constraint_name
            AND con.constraint_type = 'P'
            AND position = i
            AND con.table_name = rowlcr.GET_OBJECT_NAME;

         FOR j IN 1 .. newlist.COUNT
         LOOP
            IF newlist (j) IS NOT NULL
               AND newlist (j).column_name = pk_table (i)
            THEN
               upd_qry := upd_qry || ' ' || newlist (j).column_name
                          || ' ' || equals;
               typ := newlist (j).data.GetTypeName;

               IF typ = 'SYS.VARCHAR2'
               THEN
                  ret := newlist (j).data.GetVarchar2 (data1);
                  upd_qry := upd_qry || ' ''' || SUBSTR (data1, 1, 253) || '''';
               ELSIF typ = 'SYS.NUMBER'
               THEN
                  ret := newlist (j).data.GetNumber (num1);
                  upd_qry := upd_qry || ' ' || num1;
               END IF;

               IF i < pk_count
               THEN
                  upd_qry := upd_qry || ' AND ';
               END IF;
            END IF;
         END IF;
         END LOOP;
      END LOOP;

      upd_qry := upd_qry || ';';
      DBMS_OUTPUT.put_line (upd_qry);
   END LOOP;
END;
Thanks for your replies, HTH and Dipali.
I would like to clarify some points from my side based on the issue I have raised.
1. The no-data-found error is happening on a table for which supplemental logging is enabled.
2. As per my understanding, the apply process compares the existing data in the destination database with the "old" data in the LCR.
Once there is a mismatch between the two, ORA-01403 is thrown. (Please tell me whether my understanding is correct.)
3. This mismatch can be on a date field, or even on the millisecond part of a timestamp.
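That understanding can be illustrated with a toy comparison in plain Python (hypothetical row dicts, not Streams internals): the apply process matches every "old" LCR value against the destination row, so even a sub-second timestamp difference makes the lookup fail.

```python
# Toy model of the apply-side check (an illustration, not Streams internals):
# the destination row must match every "old" value in the LCR, or the apply
# raises a no-data-found condition (ORA-01403).
def apply_update(dest_row, lcr_old):
    mismatched = {k for k, v in lcr_old.items() if dest_row.get(k) != v}
    if mismatched:
        raise LookupError("ORA-01403 style mismatch on: %s" % sorted(mismatched))
    return True

dest = {"ID": 7, "TS": "2008-10-23 14:46:02.000"}
lcr_old = {"ID": 7, "TS": "2008-10-23 14:46:02.123"}  # millisecond drift

try:
    apply_update(dest, lcr_old)
except LookupError as e:
    print(e)
```

This is why updating the destination row with the exact "old" values from the LCR, down to the fractional seconds, should let the transaction re-execute.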
Now, the point I am really wondering about:
Somehow a mismatch was introduced in the destination database (I am not sure of the reason) and ORA-01403 is thrown.
If we could update the destination database with the "old" data from the LCR, this mismatch should be resolved, shouldn't it?
Reply to you, Dipali:
If nothing else works out, I am planning to put a conflict handler on all tables with the "OVERWRITE" option, using the following script:
--Generate script for applying Conflict Handler for the Tables for which Supplymentary Logging is enabled
declare
count1 number;
query varchar2(500) := null;
begin
for tables in (
select table_name from user_tables where table_name in ("NAMES OF TABLES FOR WHICH SUPPLEMENTAL LOGGING IS ENABLED") )
loop
count1 := 0;
dbms_output.put_line('DECLARE');
dbms_output.put_line('cols DBMS_UTILITY.NAME_ARRAY;');
dbms_output.put_line('BEGIN');
select max(position) into count1
from all_cons_columns col, all_constraints con
where col.table_name = con.table_name
and col.constraint_name = con.constraint_name
and con.constraint_type = 'P'
and con.table_name = tables.table_name;
for i in 1..count1
loop
query := null;
select 'cols(' || position || ')' || ' := ' || '''' || column_name || ''';'
into query
from all_cons_columns col, all_constraints con
where col.table_name = con.table_name
and col.constraint_name = con.constraint_name
and con.constraint_type = 'P'
and con.table_name = tables.table_name
and position = i;
dbms_output.put_line(query);
end loop;
dbms_output.put_line('DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(');
dbms_output.put_line('object_name => ''ICOOWR.' || tables.table_name|| ''',');
dbms_output.put_line('method_name => ''OVERWRITE'',');
dbms_output.put_line('resolution_column => ''COLM_NAME'',');
dbms_output.put_line('column_list => cols);');
dbms_output.put_line('END;');
dbms_output.put_line('/');
dbms_output.put_line('');
end loop;
end;
Reply to you, HTH:
Our destination database is a replica of the source, and no triggers are running on any of these tables.
This is not the first time I'm facing this issue. Earlier, we had to take big outages to clear the replica database and apply a dump from the source...
Now I can't face that situation again. -
4G LTE data reception issue in area of work building
Hi, I'm having a data reception issue in a certain area at work. The signal indicator at the upper right of the homescreen shows "4GLTE" but this is clearly inaccurate since I am not able to navigate to websites or send/receive multimedia messages. If I move ~30 feet east in the building, the reception is restored. Two people with iPhone 5 devices have the same issue. However, the Verizon iPhone 5 allows you to turn off LTE. Once this was done and the signal fell back to 3G, reception was restored, albeit with slower speeds, but at least reception wasn't completely blocked. I understand 4G is not available in all areas, but in this case, the phone is not automatically switching to 3G and there is no workaround because there is no option to turn off LTE on the Z10. In the "Settings" -> "Network Connections" -> "Mobile Network" -> "Network Technology" dropdown, the only values are:
UMTS/GSM (when I switch to this, no networks are found)
Global (the current selection)
LTE/CDMA
This is a big problem for me because for 8+ hours a day I can't receive MMS messages or navigate to websites.
Hi, Nate650,
Sorry to hear about your problem with 4G. First, let me ask, have you updated your Z10 to the latest official software version? I had a similar problem with my Z10. After about an hour on the phone with CS, we figured out it was a problem with the tower near me. The problem was fixed by VZW and I have not had connection issues. You are right, though, about the Z10 falling back to 3G. Mine did before the update but not since.
Doc -
Logical Standby Data Consistency issues
Hi all,
We have been running a logical standby instance for about three weeks now. Both our primary and logical are 11g (11.1.0.7) databases running on Sun Solaris.
We have off-loaded our Discoverer reporting to the logical standby.
About three days ago, we started getting the following error message (initially for three tables, but since this morning for a whole lot more):
ORA-26787: The row with key (<column>) = (<value>) does not exist in table <schema>.<table>
This error implies that we have data consistency issues between our primary and logical standby databases, but we find that hard to believe,
because the "data guard" status is set to "standby", implying that schemas being replicated by Data Guard are not available for user modification.
any assistance in this regard would be greatly appreciated.
thanks
Mel
It is a bug: Bug 10302680. Apply the corresponding Patch 10302680 to your standby DB.
-
How to get material's last posting date of issuing to production order?
Hi,
In my scenario, I need to get a material's last posting date of issue to a production order (e.g. movement type 261).
I tried selecting the material documents whose movement type is 261, restricting the posting date month by month each time, until the first material document was selected.
But this method seems quite inefficient.
What kind of algorithm is more efficient for this?
Thanks
Wesley
Hi,
SELECT MAX( budat )
  FROM mkpf
  INTO gv_budat
  WHERE mblnr IN ( SELECT mblnr
                     FROM aufm
                     WHERE aufnr = gv_aufnr    "(Prod. Order)
                       AND matnr = gv_matnr    "(Issued Material)
                       AND bwart = '261' ).
Edited by: Azeem Ahmed Matte on Mar 12, 2010 12:33 PM -
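The intent of the query above can be sketched in Python over in-memory documents (hypothetical dicts standing in for joined AUFM/MKPF rows): take the maximum posting date among goods issues (movement type 261) for the given order and material.

```python
# Hypothetical stand-ins for AUFM (order item) rows joined with MKPF headers.
docs = [
    {"aufnr": "100001", "matnr": "MAT-1", "bwart": "261", "budat": "2010-01-05"},
    {"aufnr": "100001", "matnr": "MAT-1", "bwart": "261", "budat": "2010-03-12"},
    {"aufnr": "100001", "matnr": "MAT-1", "bwart": "101", "budat": "2010-04-01"},
]

def last_issue_date(docs, aufnr, matnr):
    """Latest posting date among 261 (goods issue) movements for the order/material."""
    dates = [d["budat"] for d in docs
             if d["aufnr"] == aufnr and d["matnr"] == matnr and d["bwart"] == "261"]
    return max(dates) if dates else None

print(last_issue_date(docs, "100001", "MAT-1"))  # 2010-03-12
```

A single MAX over the filtered set replaces the month-by-month probing the poster described, which is exactly what the ABAP SELECT MAX( budat ) does in one database round trip.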
Best practice on extending the SIEBEL data model
Can anyone point me to a reference document or provide, from experience, a simple best practice for extending the Siebel data model for business-unique data? Basically I am looking for some simple rules, based on either use-case characteristics (need to sort and filter by, need to update frequently, ...) or data characteristics (transient, changes frequently, ...), to tell me whether I should extend the base tables, leverage the 'X' tables, or do something else.
Preferably they would be prescriptive and tell me the limits of the different options from a usage perspective.
Thanks
Accepting the given that Siebel's vanilla data model will always work best, here are some things to keep in mind if you need to add something to meet a process that the business is unwilling to adapt:
1) Avoid re-using existing business component fields and table columns that you don't need for their original purpose. This is a dangerous practice that is likely to haunt you at upgrade time, or (worse yet) might be linked to some mysterious out-of-the-box automation that you don't know about because it is hidden in class-specific user properties.
2) Be aware that X tables add a join to your queries, so if you are mapping one business component field to ATTRIB_01 and adding it to your list applets, you are potentially putting an unnecessary load on your database. X tables are best used for fields that are going to be displayed in only one or two places, so the join would not normally be included in your queries.
3) Always use a prefix (usually X_ ) to denote extension columns when you do create them.
4) Don't forget to map EIM extensions to the extension columns you create. You do not want to have to go through a schema change and release cycle just because the business wants you to import some data to your extension column.
5) Consider whether you need a conversion to populate the new column in existing database records, especially if you are configuring a default value in your extension column.
6) During upgrades, take the time to re-evaluate your need for the extension column, taking into account the inevitable enhancements to the vanilla data model. For example, you may find, as we did, that the new version of the S_ADDR_ORG table had an ADDR_LINE_3 column, and our X_ADDR_ADDR3 column was no longer necessary. (Of course, re-configuring all your business components to use the new vanilla column can also be quite an ordeal.)
Good luck!
Jim -
How to get the previous state of my data after issuing the commit method
How to get the previous state of some data after issuing the commit method in an entity bean (it should not use any offline storage)?
>
Is there any way to get the state apart from using
offline storage?
As I said, the caller keeps a copy in memory.
Naturally if it is no longer in memory then that is a problem.
>
And also, what do you mean by audit log?
You keep track of every change to the database by keeping the old data. There are three ways:
1. Each table has a version number/delete flag for each record. A record is never updated nor deleted. Instead a new record is created with a new version number and with the new data.
2. Each table has a duplicate table which has all of the same columns. When the first table is modified the old data is moved to the duplicate table.
3. A single table is used which has columns for 'table', 'field', 'data' and 'activity' (update, delete). When a change is made in any table then this table is updated. This is generally of limited usability due to the difficulty of recovering the data.
All of the above can have a user id, timestamp, and/or additional information which is relevant to the data being changed.
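Approach 1 above can be sketched with sqlite3 (the table and column names are illustrative): every change inserts a new row with a higher version, and the current state is the highest non-deleted version.

```python
import sqlite3

# Sketch of audit approach 1: records are never updated or deleted in place;
# each change inserts a new version, so the full history is preserved.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE customer (
    id INTEGER, version INTEGER, name TEXT, deleted INTEGER DEFAULT 0,
    PRIMARY KEY (id, version))""")

con.execute("INSERT INTO customer (id, version, name) VALUES (1, 1, 'Acme')")
# "Update": insert version 2 instead of modifying version 1.
con.execute("INSERT INTO customer (id, version, name) VALUES (1, 2, 'Acme Ltd')")

# Current state = highest version that is not flagged deleted.
row = con.execute("""SELECT name FROM customer
                      WHERE id = 1 AND deleted = 0
                      ORDER BY version DESC LIMIT 1""").fetchone()
print(row[0])            # Acme Ltd
history = con.execute("SELECT COUNT(*) FROM customer WHERE id = 1").fetchone()
print(history[0])        # 2
```

A "previous state after commit" query is then just the same SELECT with a lower version number, which is the point being made to the original poster: the history lives in persisted storage, not in the bean.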
Note that ALL of this is persisted storage.
I am not sure what this really has to do with "offline storage" unless you are using that term to refer to backed up data which is not readily available. -
Help XSLT Data Mapper and Transformations
Hi guys,
I need help with Oracle ESB (XSLT Data Mapper and transformations). I need to use the XSLT Data Mapper to transform a response XML into a request XML.
Thanks
Vyas
The concept is the same as BPEL. Without going into too much detail, have a look at the following tutorial.
http://download.oracle.com/docs/cd/B31017_01/integrate.1013/b28212/buildendtoend.htm#BEICEFJD
Near the bottom they show how to do transformations.
cheers
James -
Query related to Siebel Data Bean - Business Object - Account - Update
Hello,
I am using the Siebel Data Bean to update fields of the business object 'Account'. I am using the field names displayed in the browser to update the Account properties.
But sometimes it fails with the error below:
For example, for the field 'Account Team':
Field 'AccountTeam' does not exist in definition for business component 'Account'.
Please ask your systems administrator to check your application configuration.(SBL-DAT-00398)
Where can I find the information related to the mapping of fields in the business object 'Account' that I can use in a Siebel Data Bean program?
Thanks in advance!!
Hi!
You need to use Siebel Tools to determine the business component field mappings for the applet controls you're looking at.
Regards,
Oli