Initial load of data to UCM for Customer Hub PIP
Hi,
What is the recommended approach to have XREF tables populated during an initial load to UCM (via EIM), when the Accounts already exist in Siebel CRM?
Our approach as of now, using EIM, is as follows:
1) Set required customer information in EIM_UCM_ORG
2) Look up the customer's existing Row_Id in Siebel, and populate EIM_UCM_ORG.EXT_SYSTEM_NUM and EIM_UCM_ORG.UCM_EXT_ID accordingly
3) Run the EIM job and UCM Batch Process to import the customer into UCM
The account then appears in UCM with the correct reference to the Siebel Row_Id under the "External Account IDs" tab. However, it also contains a reference to a newly created duplicate record for that account in Siebel. Looking at the XREF tables, there is no reference to the existing Siebel Row_Id specified in the EIM batch load, and our hypothesis is that this is why the account cannot be found (and a duplicate is created). What we want to achieve here is to tell UCM that the accounts we are inserting do in fact already exist in Siebel CRM, and can be identified by the Row_Id that we pass along.
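For illustration, the mapping in steps 1-3 can be sketched as plain data-preparation logic. This is a minimal sketch only: the lookup object stands in for the Siebel Row_Id query of step 2, and everything except the EIM_UCM_ORG column names EXT_SYSTEM_NUM and UCM_EXT_ID mentioned above (the account name, system number and Row_Id values) is hypothetical:

```javascript
// Sketch of the staging logic: each EIM_UCM_ORG row must carry the
// account's existing Siebel Row_Id so UCM can cross-reference it
// instead of creating a duplicate. The lookup table below stands in
// for the query against the Siebel base tables (step 2).

var siebelRowIds = { "ACME Corp": "1-2ABC3" };  // hypothetical lookup result

function buildEimRow(accountName, extSystemNum) {
  var rowId = siebelRowIds[accountName];
  if (!rowId) {
    throw new Error("no existing Siebel record for " + accountName);
  }
  return {
    NAME: accountName,
    EXT_SYSTEM_NUM: extSystemNum, // identifies Siebel as the source system
    UCM_EXT_ID: rowId             // the existing Siebel Row_Id (step 2)
  };
}

var row = buildEimRow("ACME Corp", "SEBL_01");
```

If the staged row reaches UCM without UCM_EXT_ID populated this way, the described duplicate behavior is exactly what one would expect.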
The relevant system versions are Customer Hub PIP 11g with AIA 3. Siebel CRM and Siebel UCM are on patch 8.1.1.7 (and pertinent ACRs have been incorporated in the two Siebel instances).
Any hints or suggestions on how to approach this would be appreciated.
-M
Edited by: 968713 on Nov 1, 2012 5:05 AM
Do you really need to populate the XREF table/transaction History table for initial load?
Similar Messages
-
Payload modification in case of ESB Resequencer in 10g for Customer Hub PIP
Hi all,
We are implementing customer hub PIP and in that PIP there is a particular ESB consumer (SyncContSiebelAggrEventConsumer) where the routing service AIASystem_Siebel_ABCS_SyncContSiebelAggrEventConsumer_RS.esbsvc is implementing FIFO ESB Resequence pattern.
Now we have enabled ESB resequencing by modifying the esb_config.ini file and are testing the scenario.
This is what we observe:
When we publish a particular contact belonging to a particular group, say 'XYZ', then for any kind of payload error in the contact information SyncContactSiebelReqABCSImpl errors out, which I can see from both the BPEL console and the ESB console.
As a result the group gets locked and subsequent messages belonging to the same group get queued up.
Now the execution rule between SyncContSiebelAggrEventConsumer and SyncContactSiebelReqABCSImpl is Asynchronous, so from the ESB console I get the option to modify the payload and re-submit it after unlocking the group using the resequencer_restart_processing_group.sql script.
However, I find that the payload I modified is not reflected: the ESB resequencer re-submits the same old payload, so the business error is regenerated and the group gets locked again.
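As a toy illustration of the group-lock behavior described above (plain JavaScript, purely conceptual, not the Oracle ESB implementation; all names are made up):

```javascript
// Toy model of a FIFO resequencer group: when one message in a group
// fails, the group locks and later messages for that group queue up.
function Resequencer(process) {
  this.process = process;          // message handler, may throw
  this.groups = {};                // group name -> { locked, queue }
}

Resequencer.prototype.submit = function (group, payload) {
  var g = this.groups[group] || (this.groups[group] = { locked: false, queue: [] });
  g.queue.push(payload);
  this.drain(group);
};

Resequencer.prototype.drain = function (group) {
  var g = this.groups[group];
  while (!g.locked && g.queue.length > 0) {
    var msg = g.queue[0];
    try {
      this.process(msg);
      g.queue.shift();             // only dequeue on success (FIFO)
    } catch (e) {
      g.locked = true;             // group locks; queue is preserved
    }
  }
};

// Unlocking alone is not enough if the head payload is still bad --
// the stale payload must actually be replaced on resubmit.
Resequencer.prototype.resubmit = function (group, fixedPayload) {
  var g = this.groups[group];
  if (fixedPayload !== undefined) g.queue[0] = fixedPayload;
  g.locked = false;
  this.drain(group);
};

var processed = [];
var r = new Resequencer(function (m) {
  if (m.bad) throw new Error("payload error");
  processed.push(m.id);
});

r.submit("XYZ", { id: 1, bad: true });   // fails, group XYZ locks
r.submit("XYZ", { id: 2 });              // queues behind the bad message
r.resubmit("XYZ", { id: 1 });            // fixed payload replaces the head
```

In this model, unlocking without replacing the head payload would simply re-trigger the same error, which matches the behavior observed above.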
How do I modify the payload and re-submit the message when I am using ESB re-sequencer?
Thanks
Mandrita. -
How to - Extract data from Cloud For Customer into SAP HANA system
Hello Community,
I have a requirement to extract the existing data from Cloud for Customer into a separate SAP HANA box.
Is it possible to achieve this? If yes, please guide me.
Awaiting quick response.
Regards
Kumar
Hi Kumar,
In addition to what Thierry mentioned, you could also use the C4C integration via the standard Operational Data Provisioning (ODP) interfaces. This integration was actually built for SAP BW and allows you to access any C4C data sources. From my perspective you can also build upon that for a native SAP HANA integration. Please also have a look at this guide: How To... load SAP Business Suite data into SAP... | SAP HANA.
Besides that question, let me also add the following: SAP Cloud for Customer has run on SAP HANA since Nov. 2013. You may also use the powerful built-in analytics within C4C for analyzing data and any of your reporting demands. If your report should consider external data as well, you can combine the existing C4C data source with an external, so-called Cloud Data Source. More information is published in the C4C Analytics Guide: http://help.sap.com/saphelp_sapcloudforcustomer/en/PDF/EN-3.pdf.
I hope this helps...
Best regards,
Sven -
Initial Load Extract (Date Format)
Hi ,
I'm doing an Initial Load Extract (File to Replicat) for a module of 30 GB, and I'm getting the following error on the extracted tables:
ERROR OGG-00665 OCI Error error executing fetch with error code 1801 (status = 1801-ORA-01801: date format is too long for internal buffer), SQL<SELECT ........
My concerns are :
1- How can I overcome this error without updating the source data?
2- How should I deal with it on the target side (once the data is replicated), so that it won't affect the business needs?
Thanks so much.
Thanks 960104 for your interest.
>> Source GG
11.1.1.1.2
DB is 10.2.0.4
++++++++++++++++++++++++++++++++++
Target GG
Version 12.1.2.1.0
DB is 12.1.0.2.0
The report file is only for the Extract, as I'm pushing the trails to the remote side only, without running the Replicat at the moment.
=================================================================
2015-01-25 10:23:53 INFO OGG-01026 Rolling over remote file /u04/GG_TRAILS/ff000001.
2015-01-25 10:23:56 INFO OGG-01026 Rolling over remote file /u04/GG_TRAILS/ff000002.
2015-01-25 10:24:00 INFO OGG-01026 Rolling over remote file /u04/GG_TRAILS/ff000003.
551606 records processed as of 2015-01-25 10:24:00 (rate 49204,delta 49204)
2015-01-25 10:24:05 INFO OGG-01026 Rolling over remote file /u04/GG_TRAILS/ff000004.
2015-01-25 10:24:08 INFO OGG-01026 Rolling over remote file /u04/GG_TRAILS/ff000005.
1062596 records processed as of 2015-01-25 10:24:10 (rate 50097,delta 51098)
2015-01-25 10:24:12 INFO OGG-01026 Rolling over remote file /u04/GG_TRAILS/ff000006.
2015-01-25 10:24:16 INFO OGG-01026 Rolling over remote file /u04/GG_TRAILS/ff000007.
2015-01-25 10:24:20 INFO OGG-01026 Rolling over remote file /u04/GG_TRAILS/ff000008.
1471164 records processed as of 2015-01-25 10:24:21 (rate 46541,delta 39289)
Source Context :
SourceModule    : [ggdb.ora.sess]
SourceID        : [/scratch/pradshar/view_storage/pradshar_bugdbrh40_12927937/oggcore/OpenSys/src/gglib/ggdbora/ocisess.c]
SourceFunction  : [OCISESS_try]
SourceLine      : [501]
ThreadBacktrace : [10] elements
: [/u01/GG/extract(CMessageContext::AddThreadContext()+0x26) [0x664446]]
: [/u01/GG/extract(CMessageFactory::CreateMessage(CSourceContext*, unsigned int, ...)+0x7b2) [0x65aee2]]
: [/u01/GG/extract(_MSG_ERR_ORACLE_OCI_ERROR_WITH_DESC_SQL(CSourceContext*, int, char const*, char const*, char const*, CMessageFactory::MessageDisposition)+0xb2) [0x613232]]
: [/u01/GG/extract(OCISESS_try(int, OCISESS_context_def*, char const*, ...)+0x48b) [0x5a3c2b]]
: [/u01/GG/extract(DBOCI_get_query_row(file_def*, int, int*)+0x95e) [0x923558]]
: [/u01/GG/extract(gl_get_query_row(file_def*)+0x10) [0x933e2c]]
: [/u01/GG/extract [0x87d18d]]
: [/u01/GG/extract(main+0x11e1) [0x527aa1]]
: [/lib64/libc.so.6(__libc_start_main+0xf4) [0x392f81d994]]
: [/u01/GG/extract(__gxx_personality_v0+0x1ea) [0x4f32ca]]
2015-01-25 10:24:21 ERROR OGG-00665 OCI Error error executing fetch with error code 1801 (status = 1801-ORA-01801: date format is too long for internal buffer), SQL<SELECT x."ID",x."STATUS",x."STATUS_DATE",x."UPDATE_DATE",x."CREATED_BY",x."CREATION_DATE",x."UPDATED_BY",x."DIR_ID",x."LOC_ID",x."TICKET_DATE",x."TICKET_TIME",x."ROAD_SPEED",x."VEHICLE_SPEED",x."RADAR>.
2015-01-25 10:24:21 ERROR OGG-01668 PROCESS ABENDING.
=================================================================
End of Report file
Thanks -
Need to load the data through excel for plan data through IP?
Hi all,
The user needs to load the data through Excel and click the save button so that the data is saved in the cube through Integrated Planning.
The objects used are
customer, material, plant, calendar month, sales qty, gross billing.
Can anyone please let me know how to do this?
Thanks
Hi,
You can use the following How-to guide to load data from a file into a transactional cube.
If using IP :
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/90cef248-29e0-2910-11bb-df8dece7e22c?quicklink=index&overridelayout=true
Can also Refer to Marc's Blog:
/people/marc.bernard/blog/2007/11/25/how-to-load-a-file-into-sap-netweaver-bi-integrated-planning-part-1
If you are using BPS, you can also find the details for this in the Wiki.
You can then create a hyperlink in Excel, which the user can use to upload data, which can then be saved to the cube using the save button. -
Data Changes / Transports for Customizing Requests
Hello,
I am working on building a report which will fetch transport-related data such as Release Date & Time and Import Date & Time, along with the objects in that particular transport.
It works fine for Workbench requests, because I browse the VRSD table to fetch the previous transports for the given object.
However, for Customizing requests, since we cannot find this information in the VRSD table, is there any place where I can look for data changes / previous data for the given table content / view content via any class or function module? Please let me know.
Regards,
Venkata Phani Prasad K
Hello Thomas,
Thanks for the inputs. Basically, here is what my requirement is.
1. Let us say we have a view T582L in which a record changed, and the change is on a (Customizing) transport.
2. Now, I would like to know whether there is any place where the previous (old) record for the same entry exists on a different transport.
3. So it would be a way to track the changes made to any of the T-tables.
That's what I wanted to know. Any further inputs would be of great help.
Regards,
Venkata Phani Prasad K -
Last Dunning Data not generated for customer account
Hi Gurus,
I have run dunning for a customer. Data was generated successfully to the customer account and I can see the dunning history for the 1st and 2nd runs, but on the 3rd and final run the items were dunned with no data generated for that particular customer.
This time I found that a few other line items had been added to the previous line items.
The 3rd dunning days had already exceeded the grace days, but no data was generated because the additional items were not yet eligible for dunning; they still had enough grace days.
This is why no data was generated for the respective line item, which should have received its 3rd dunning.
When I ran it for some future date, the 3rd dunning succeeded and the data was updated; the additional items also got their 1st reminder this time, because they were eligible by then.
In my dunning procedure I have selected 'always dun' for all intervals and 'print line items' for the last dunning.
Your inputs are highly appreciated.
Thanks in advance; points are available.
Thanks,
Vyas
Hi
Any updates on this?
Thanks in advance,
Vyas -
Can I modify 'Load Files into Stack.jsx' for custom white balance?
I've been wondering whether it's possible to change Adobe's 'Load Files into Stack' script so that it loads RAW files with the white balance adjustments I have made in Camera Raw ('ACR').
At present, the script loads the files as layers with the default ACR (or Lightroom) settings. This means that any white balance adjustments to the file in ACR are ignored and the RAW file is loaded with the WB 'As Shot'.
It IS possible to save new ACR default settings, but this doesn't really help because I don't want the WB to be a constant value. The changes I make in ACR are usually subtle and 'by eye'.
I would like the script to be mindful of these changes when opening the RAW file... does anyone have any suggestions?
The script is below.
// (c) Copyright 2006. Adobe Systems, Incorporated. All rights reserved.
// @@@BUILDINFO@@@ Load Files into Stack.jsx 1.0.0.1
// Load Files into Stack.jsx - does just that.
// BEGIN__HARVEST_EXCEPTION_ZSTRING
<javascriptresource>
<name>$$$/JavaScripts/LoadFilesintoStack/Menu=Load Files into Stack...</name>
</javascriptresource>
// END__HARVEST_EXCEPTION_ZSTRING
// debug level: 0-2 (0:disable, 1:break on error, 2:break at beginning)
//$.level = (Window.version.search("d") != -1) ? 1 : 0; // This chokes bridge
$.level = 0;
// debugger; // launch debugger on next line
// on localized builds we pull the $$$/Strings from a .dat file
$.localize = true;
// Put header files in a "Stack Scripts Only" folder. The "...Only" tells
// PS not to place it in the menu. For that reason, we do -not- localize that
// portion of the folder name.
var g_StackScriptFolderPath = app.path + "/"+ localize("$$$/ScriptingSupport/InstalledScripts=Presets/Scripts") + "/"
+ localize("$$$/private/LoadStack/StackScriptOnly=Stack Scripts Only/");
$.evalFile(g_StackScriptFolderPath + "LatteUI.jsx");
$.evalFile(g_StackScriptFolderPath + "StackSupport.jsx");
$.evalFile(g_StackScriptFolderPath + "CreateImageStack.jsx");
// loadLayers routines
loadLayers = new ImageStackCreator( localize("$$$/AdobePlugin/Shared/LoadStack/Process/Name=Load Layers"),
localize('$$$/AdobePlugin/Shared/LoadStack/Auto/untitled=Untitled' ) );
// LoadLayers is less restrictive than MergeToHDR
loadLayers.mustBeSameSize = false; // Images' height & width don't need to match
loadLayers.mustBeUnmodifiedRaw = false; // Exposure adjustements in Camera raw are allowed
loadLayers.mustNotBe32Bit = false; // 32 bit images
loadLayers.createSmartObject = false; // If true, option to create smart object is checked.
// Add hooks to read the value of the "Create Smart Object" checkbox
loadLayers.customDialogSetup = function( w )
{
	w.findControl('_createSO').value = loadLayers.createSmartObject;
	if (! app.featureEnabled( localize( "$$$/private/ExtendedImageStackCreation=ImageStack Creation" ) ))
		w.findControl('_createSO').hide();
}
loadLayers.customDialogFunction = function( w )
{
	loadLayers.createSmartObject = w.findControl('_createSO').value;
}
// Override the default to use "Auto" alignment.
loadLayers.alignStack = function( stackDoc )
{
	selectAllLayers(stackDoc, 2);
	alignLayersByContent( "Auto" );
}
loadLayers.stackLayers = function()
{
	var result, i, stackDoc = null;
	stackDoc = this.loadStackLayers();
	if (! stackDoc)
		return;
	// Nuke the "destination" layer that got created (M2HDR holdover)
	stackDoc.layers[this.pluginName].remove();
	// Stack 'em up.
	if (this.createSmartObject)
	{
		selectAllLayers( stackDoc );
		executeAction( knewPlacedLayerStr, new ActionDescriptor(), DialogModes.NO );
	}
}
// "Main" execution of Merge to HDR
loadLayers.doInteractiveLoad = function ()
{
	this.getFilesFromBridgeOrDialog( localize("$$$/private/LoadStack/LoadLayersexv=LoadLayers.exv") );
	if (this.stackElements)
		this.stackLayers();
}
loadLayers.intoStack = function(filelist, alignFlag)
{
	if (typeof(alignFlag) == 'boolean')
		loadLayers.useAlignment = alignFlag;
	if (filelist.length < 2)
	{
		alert(localize("$$$/AdobeScripts/Shared/LoadLayers/AtLeast2=At least two files must be selected to create a stack."), this.pluginName, true );
		return;
	}
	var j;
	this.stackElements = new Array();
	for (j in filelist)
	{
		var f = filelist[j];
		this.stackElements.push( new StackElement( (typeof(f) == 'string') ? File(f) : f ) );
	}
	if (this.stackElements.length > 1)
		this.stackLayers();
}
if (typeof(loadLayersFromScript) == 'undefined')
	loadLayers.doInteractiveLoad();
This part of the script looks interesting - is there a reference where I can find more hooks? Perhaps there is one that relates to WB?
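On the hooks question: in this script a "hook" is simply a function-valued property on the loadLayers object that the stack framework calls if it is present, so overriding one is plain reassignment before doInteractiveLoad() runs. A minimal stand-alone illustration of that pattern follows (the host object here is a made-up stand-in, not the real ImageStackCreator from StackSupport.jsx):

```javascript
// Stand-in host object mimicking how the stack framework exposes
// overridable hooks as function-valued properties. This is NOT the
// real ImageStackCreator; it only demonstrates the override pattern.
var loadLayers = {
  createSmartObject: false,
  customDialogSetup: function (w) {},        // default hook: no-op
  showDialog: function (w) {                 // framework calls the hook
    this.customDialogSetup(w);
    return w;
  }
};

// Overriding a hook is just reassignment before execution starts,
// exactly like the alignStack / stackLayers overrides in the script.
loadLayers.customDialogSetup = function (w) {
  w.checked = loadLayers.createSmartObject;  // e.g. sync a checkbox
};

var dialog = loadLayers.showDialog({ checked: true });
```

Whether a hook exists for white balance is a separate question; the pattern above only shows how the existing overrides (alignStack, stackLayers, customDialogSetup) are wired up.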
// LoadLayers is less restrictive than MergeToHDR
loadLayers.mustBeSameSize = false; // Images' height & width don't need to match
loadLayers.mustBeUnmodifiedRaw = false; // Exposure adjustements in Camera raw are allowed
loadLayers.mustNotBe32Bit = false; // 32 bit images
loadLayers.createSmartObject = false; // If true, option to create smart object is checked.
// Add hooks to read the value of the "Create Smart Object" checkbox
loadLayers.customDialogSetup = function( w ) -
How to get data from screen for custom field?
Hi,
I have added a custom field in the header for the MIGO_GI transaction using BADI MB_MIGO_BADI (method PBO_HEADER). How do I get the data from that field into method PAI_HEADER? If anybody knows, please let me know.
Thanks.
Hi,
You need to append your custom field to the GOHEAD structure; then you will automatically get the field value in the method IF_EX_MB_MIGO_BADI~PAI_HEADER via the internal table is_gohead.
Please try it once.
Tks .. venkat -
Data element creation for custom 'data element orgchart'
Hi all,
I'd like to create my own data element orgchart, using self-developed function modules to achieve more flexibility in outputting data. Are there any recommendations or examples for the in/out interface of such modules?
Regards,
Sergey Aksenov
Hi Sergey,
What are you trying to achieve here? It sounds like you want to create new structures or use FMs to replace the data retrieval for the existing structures. This cannot be done in OrgChart, at least not through the AdminConsole.
The structure is retrieved in one call, as is the data in the nodes. However, for each data source (e.g. NakisaRFC) used in a node there will be one call. For example, if you show position data from HRP1000 and employee data from PA0001 in a node, then there will be 2 calls (each of which retrieves the data for all nodes at once). You can use an FM to do one call, but this would be in the view and not in the structure. The details panel makes multiple calls to retrieve data, so you can write an FM to fetch all of its data in one call rather than many.
You should refer to page 58 of the OrgChart Admin Guide for more information on custom RFCs.
Best regards,
Luke -
I want to find out the opening balance as on date in FI (for customer)
hi
experts,
my requirement is: the user enters a date through the selection screen (e.g. 07.04.09),
and I want to find the opening balance as of that date.
thanks
Ajay.
Dear Ajay,
Try a BAPI for the opening balance on a particular date. This will definitely solve your problem;
I came across this problem long back.
Try this. I will give you the name of the BAPI shortly.
Regards,
Vijay -
Not able to load text master data for customer infoobject
Hi Guys,
I am not able to load master data texts to the customer infoobject.
I have an infoobject A which is loaded with 1000 records, out of which only 100 records have texts and the remaining are without texts.
I have run a repair full request (extraction from ECC) and
I can see the data up to the PSA (the 100 records that had texts are no longer available in ECC now). The remaining 900 records are available with texts up to the PSA, but are not being updated to the target infoobject.
I have a 3.x flow and I have checked the transfer rules and update rules and tried everything from my side, but no use: it is still not working.
Please post your suggestions.
Thanks in advance
Regards,
Siva
Hi Ramanjaneyulu,
Transfer rules screenshot:
update rules
Regards,
Siva -
Importing Data from an ABAP system - JOB Initial Load - IDM 8.0
Hello all,
I got the following errors during execution of the initial load job:
Value not legal for this attribute:Attribute: MX_USERTYPE" when storing attribute 'MX_USERTYPE=A'
Value not legal for this attribute:Attribute: MX_DATEFORMAT" when storing attribute 'MX_DATEFORMAT=1'
I executed the 'read value help content' job before starting the initial load job.
Could anyone explain whether this attribute should be created manually in the mxi_AttrValueHelp table before running the initial load job?
Thanks
Hello Rafael,
There is a possibility that you have encountered a problem that we had with the language translations for the attribute values.
I would like to ask you to check one file's content: could you try to open the language translations file? It should be located under ICCORE -> Database Schema -> SQL-Server -> 9-language-data.sql.
There is a chance that this file is "broken". If so, we have fixed this specific problem in Designtime Component patch 2 (patch 3 is now also available), so you would need to update to that one.
You could also take a look at the table for the attribute values help - via executing "select * from mxi_attrvaluehelp".
Kind Regards,
Rali
SAP Identity Management Development -
Reg lsmw for customer master data transfer
Hello All,
I want to know, while transferring data via LSMW for customer master data:
Scenario: there is more than one ship-to party and bill-to party, so how do I upload the data in this case?
Thanks,
Sunny
Hi,
just refer to the link below
http://www.sapmaterial.com/?gclid=CN322K28t4sCFQ-WbgodSGbK2g
<b>step by step procedure with screen shots</b>
regards
vijay
<b>Please don't forget to reward points if helpful</b> -
Middleware initial load error on funcloc segment type (ADR2) does not exist
Hello,
We are doing initial loads of data into a new CRM 7.0 system.
So far all objects have been loaded except functional locations (FUNCLOC).
During the initial load of the functional locations, the various BDocs all go into error state.
The error is the same everywhere: segment type ADR2 does not exist.
If we deactivate the ADR2 table in transaction R3AC1 and process again, we get the error message for the next table
(like: segment type IFLOTX does not exist), and so on.
Is there any setting we can use to manage these 'segments', or would there be another way to solve this?
Thanks.
Joram
Hello,
Take a look at note 1271465; it does not talk about the ADR2 segment, but about segment errors in SMW01 for the functional location download. It might be useful. Observe especially the constraint on the ERP version number; it might be that your ERP system is out of date/incompatible for this scenario.
Second thing to check: go to SPRO > CRM > Master Data > Installed Base > Replication of Technical Objects from ERP.
There you'll find a mandatory customizing activity for Functional location download.
And in the documentation of the activity "Replication of Functional Locations: Prerequisites and General Information" you'll find all the necessary actions to perform.
This helped me in setting up funcloc replication in CRM 7.0.
Hope this helps,
Kind regards,
Joost