Migration of mapping created in data synchronization
Hi,
I've created a mapping in the EPM Data Synchronization utility, and I need to migrate it from Dev to Production. Is there any way to migrate/export the data synchronization along with the mappings created in Dev, or do I have to recreate everything from scratch? It seems there is no way to export the mappings I created. I appreciate your help.
Thanks,
ADB
Hi Alexey,
Could you elaborate on the requirement? It is still not clear to me what you want to achieve.
What I do understand is that the users should be able to make adjustments to the mapping/lookup entries.
If that is the case, what exactly is going to be maintained in the 'additional table' and how are you suggesting end users are going to maintain this?
Ideally, your query transformation should not change when parameter values change, so you have to think about what logic you put where.
My suggestion would be to use a file or a table which can be maintained by users. In your query transformation you can then use the lookup or lookup_ext function.
Especially with lookup_ext you can make it as complicated as you want. The function is well documented, but if you need help, just reply and explain in a bit more detail what you're trying to do.
If you do think the 'hard-coded' option would suit you better, you can look into the 'decode' function. Again, it is well documented in the technical manual.
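For illustration only (this is not Data Services syntax), the lookup-with-default idea behind lookup_ext and decode can be sketched in Python, with the mapping loaded from a user-maintained CSV; the column names and sample values here are hypothetical:

```python
import csv
from io import StringIO

# Hypothetical content of a user-maintained mapping file; in practice this
# would be a CSV file or database table that end users can edit themselves.
MAPPING_CSV = """source_code,target_code
A01,1000
A02,2000
"""

def load_mapping(fh):
    """Build a lookup dict from a CSV with source/target columns."""
    return {row["source_code"]: row["target_code"] for row in csv.DictReader(fh)}

def lookup(mapping, key, default="UNMAPPED"):
    """Return the mapped value, or a default when no entry exists --
    the same behaviour lookup_ext gives you via a default expression."""
    return mapping.get(key, default)

mapping = load_mapping(StringIO(MAPPING_CSV))
print(lookup(mapping, "A01"))   # a mapped value
print(lookup(mapping, "ZZZ"))   # falls back to the default
```

The point is that the transformation logic stays fixed while users maintain only the mapping file, which is what the lookup/lookup_ext approach buys you.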
Jan.
Similar Messages
-
Data Synchronize from HFM FMDmeException Query Failed or invalid
I'm trying to create a Data Synchronization where HFM is the data source.
I keep getting the following dme exception and don't know how to further diagnose the issue.
4/19/12 1:29 PM : Error submitting request to source connector - exception: com.hyperion.datasync.fmdme.exception.FMDmeException: Acknowledgement failure - status: 1, reason: Query Failure: The query failed or is invalid! (0)
4/19/12 1:29 PM : Translation failed - Source query failed to open - exception:; nested exception is:
com.hyperion.datasync.fmdme.exception.FMDmeException: Acknowledgement failure - status: 1, reason: Query Failure: The query failed or is invalid! (0)
The DME Listener service is running.
I've tried the simplest data sync -- just one member in each dimension, copying to same application (in a different scenario).
I have data synchronizations that do work where the source is a Planning application.
I was able -- a month ago -- to create a sync where HFM is the source on this development application, so I don't know if something has changed in the application itself.
Any suggestions appreciated.
Thanks,
Barb
Edited by: bg on Apr 19, 2012 1:31 PM
You're right. I should have put it in Financial Consolidation. I don't see that I can post a question directly at the EPMA level.
-
Essbase Analytics Link cannot create data synchronization server database
When I try to create the data synchronization server database using Essbase Analytics Link, the error below occurs. Can anyone help? Thanks
dss.log:
[19 Oct 2011 17:28:55] [dbmgr] ERROR: last message repeated 2 more times
[19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\Comma.hdf"
[19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\PERIOD.hrd"
[19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\VIEW.hrd"
[19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\YEAR.hrd"
[19 Oct 2011 17:28:58] [dbmgr] Create metadata: "C:/oracle/product/EssbaseAnalyticsLink/oem/hfm/Comma/Default/Comma.hdf"
[19 Oct 2011 17:28:59] [dbmgr] WARN : HR#03826: Directory "C:\oracle\product\EssbaseAnalyticsLink/Work/XOD/backUp_2" not found. Trying to create
[19 Oct 2011 17:29:15] [dbmgr] ERROR: ODBC: HR#01465: error in calling SQLDriverConnect ([Microsoft][ODBC SQL Server Driver][Shared Memory]Invalid connection. [state=08001 code=14]).
[19 Oct 2011 17:29:15] [dbmgr] ERROR: HR#00364: Cannot open source reader for "ACCOUNT"
[19 Oct 2011 17:29:15] [dbmgr] ERROR: HR#00627: Cannot create dimension: "ACCOUNT".
[19 Oct 2011 17:29:16] [dbmgr] ERROR: HR#07722: Cube 'main_cube' of application 'Comma' is not registered.
eal.log:
[2011-Oct-19 17:28:56] http://localhost/livelink/Default.aspx?command=readYear&server=TestEss64&application=Comma&domain=
[2011-Oct-19 17:28:56] http://localhost/livelink/Default.aspx?command=readPeriod&server=TestEss64&application=Comma&domain=
[2011-Oct-19 17:28:57] http://localhost/livelink/Default.aspx?command=readView&server=TestEss64&application=Comma&domain=
[2011-Oct-19 17:28:57] http://localhost/livelink/Default.aspx?command=getVersion&server=TestEss64&application=Comma&domain=
[2011-Oct-19 17:28:58] DSS Application created
[2011-Oct-19 17:28:58] http://localhost/livelink/Default.aspx?command=getICPWeight&server=TestEss64&application=Comma&domain=
[2011-Oct-19 17:29:15] (-6981) HR#07772: cannot register HDF
[2011-Oct-19 17:29:15] com.hyperroll.jhrapi.JhrapiException: (-6981) HR#07772: cannot register HDF
[2011-Oct-19 17:29:15] at com.hyperroll.jhrapi.JhrapiImpl.updateMetadata(Native Method)
[2011-Oct-19 17:29:15] at com.hyperroll.jhrapi.Application.updateMetadata(Unknown Source)
[2011-Oct-19 17:29:15] at com.hyperroll.hfm2ess.bridge.HyperRollProcess.updateMetadata(Unknown Source)
[2011-Oct-19 17:29:15] at com.hyperroll.hfm2ess.bridge.ws.BridgeOperationManagerImpl.createAggServerApp(Unknown Source)
[2011-Oct-19 17:29:15] at com.hyperroll.hfm2ess.bridge.ws.BridgeOperationManager.createAggServerApp(Unknown Source)
[2011-Oct-19 17:29:15] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[2011-Oct-19 17:29:15] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[2011-Oct-19 17:29:15] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[2011-Oct-19 17:29:15] at java.lang.reflect.Method.invoke(Method.java:597)
[2011-Oct-19 17:29:15] at weblogic.wsee.jaxws.WLSInstanceResolver$WLSInvoker.invoke(WLSInstanceResolver.java:92)
[2011-Oct-19 17:29:15] at weblogic.wsee.jaxws.WLSInstanceResolver$WLSInvoker.invoke(WLSInstanceResolver.java:74)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.server.InvokerTube$2.invoke(InvokerTube.java:151)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.server.sei.EndpointMethodHandlerImpl.invoke(EndpointMethodHandlerImpl.java:268)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.server.sei.SEIInvokerTube.processRequest(SEIInvokerTube.java:100)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.api.pipe.Fiber.__doRun(Fiber.java:866)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.api.pipe.Fiber._doRun(Fiber.java:815)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.api.pipe.Fiber.doRun(Fiber.java:778)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.api.pipe.Fiber.runSync(Fiber.java:680)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.server.WSEndpointImpl$2.process(WSEndpointImpl.java:403)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.transport.http.HttpAdapter$HttpToolkit.handle(HttpAdapter.java:532)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.transport.http.HttpAdapter.handle(HttpAdapter.java:253)
[2011-Oct-19 17:29:15] at com.sun.xml.ws.transport.http.servlet.ServletAdapter.handle(ServletAdapter.java:140)
[2011-Oct-19 17:29:15] at weblogic.wsee.jaxws.WLSServletAdapter.handle(WLSServletAdapter.java:171)
[2011-Oct-19 17:29:15] at weblogic.wsee.jaxws.HttpServletAdapter$AuthorizedInvoke.run(HttpServletAdapter.java:708)
[2011-Oct-19 17:29:15] at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:363)
[2011-Oct-19 17:29:15] at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:146)
[2011-Oct-19 17:29:15] at weblogic.wsee.util.ServerSecurityHelper.authenticatedInvoke(ServerSecurityHelper.java:103)
[2011-Oct-19 17:29:15] at weblogic.wsee.jaxws.HttpServletAdapter$3.run(HttpServletAdapter.java:311)
[2011-Oct-19 17:29:15] at weblogic.wsee.jaxws.HttpServletAdapter.post(HttpServletAdapter.java:336)
[2011-Oct-19 17:29:15] at weblogic.wsee.jaxws.JAXWSServlet.doRequest(JAXWSServlet.java:98)
[2011-Oct-19 17:29:15] at weblogic.servlet.http.AbstractAsyncServlet.service(AbstractAsyncServlet.java:99)
[2011-Oct-19 17:29:15] at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
[2011-Oct-19 17:29:15] at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
[2011-Oct-19 17:29:15] at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
[2011-Oct-19 17:29:15] at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
[2011-Oct-19 17:29:15] at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:183)
[2011-Oct-19 17:29:15] at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3717)
[2011-Oct-19 17:29:15] at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
[2011-Oct-19 17:29:15] at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
[2011-Oct-19 17:29:15] at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
[2011-Oct-19 17:29:15] at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
[2011-Oct-19 17:29:15] at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
[2011-Oct-19 17:29:15] at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
[2011-Oct-19 17:29:15] at weblogic.work.ExecuteThread.execute(ExecuteThread.java:207)
[2011-Oct-19 17:29:15] at weblogic.work.ExecuteThread.run(ExecuteThread.java:176)
[2011-Oct-19 17:29:15] LiveLinkException [HR#09746]: Data Synchronization Server database cannot be created
What version of EAL have you installed, and what OS (32-bit/64-bit) are you installing it on?
What version of the OUI did you use?
Have you gone through all the configuration steps successfully?
Cheers
John
http://john-goodwin.blogspot.com/ -
Preventive Work Order Dates Synchronization with its automatic created notification
Dear all,
Preventive Work Order Dates Synchronization with its automatic created notification
My question originated in the PM module forum; please check the URL above if you have time.
My ultimate problem is that :
The "SMOD" enhancement "QQMA0018" needs to receive the originating work order data at the time IP10 or IP30 is executed,
so that I can decide/calculate the desired dates on the notification according to its originating work order dates.
Do I need to use "export/import memory" statements to communicate with the originating work order, if that is the right approach?
Thanks for your help in advance.
Dear Pete,
I just made test data in our DEV server.
Work orders have been created by "IP30", which was executed today with its maintenance plan (date format DD.MM.YYYY).
The work order below has been created, and its basic start date seems to be derived from the plan date.
below image is showing our 'SPRO' configuration for the PM priority.
"ZM03" order type is using "ZP" priority type.
(I don't know why this work order didn't take the values defined in 'Priorities for Each Priority Type' for calculating the "basic start" and "basic finish" dates; it seems to have just copied the plan date.)
You can see the notification which is created automatically by the work order below.
"M4" notification type is using "ZP" priority type as well.
It took the 'required start' date by adding 2 weeks to today (the creation date), as defined in 'Priorities for Each Priority Type', and the 'required end' date was also set to 6 months after the 'required start' date.
(Am I right?)
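If I understand the priority settings correctly, the date arithmetic works out like this (a Python sketch, not SAP logic; the creation date is an example, and the 2-week and 6-month offsets are taken from the priority configuration described above):

```python
import datetime

def add_months(d, months):
    """Add calendar months, clamping the day to the end of the month."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    days_in_month = [31, 29 if year % 4 == 0 and (year % 100 != 0 or year % 400 == 0) else 28,
                     31, 30, 31, 30, 31, 31, 30, 31, 30, 31][month - 1]
    return datetime.date(year, month, min(d.day, days_in_month))

creation = datetime.date(2014, 1, 15)                     # notification creation date (example)
required_start = creation + datetime.timedelta(weeks=2)   # +2 weeks from the priority setting
required_end = add_months(required_start, 6)              # +6 months from the priority setting
print(required_start, required_end)
```

So under those settings, a notification created on 15.01 would get a required start of 29.01 and a required end of 29.07, which matches the behaviour described.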
Please let me know, if you need further information.
Thank you for your help. -
Preserving old Created On dates during SharePOint Online migration
We are currently migrating files from our on-premises file share to SharePoint Online. We wish to carry as much file metadata from the file share into SharePoint Online as possible, but when we drag and drop a file from the file share to our Documents library in SharePoint, we lose the original Created On date stored on the file share; i.e., the SharePoint site column 'Created' is populated with the date the file was created on SharePoint and not the original date of the file.
Could anyone advise on how we can carry the Created On date from the file share into SharePoint?
I believe Office 365 supports only web templates, not site templates. There is a subtle difference between these two.
http://geekswithblogs.net/erwinvanhunen/archive/2012/02/09/creating-a-site-collection-from-your-own-webtemplate-on-office.aspx
You need to create a custom web template, import it as a .wsp, upload it to the Solutions gallery of the Office 365 site collection, and activate it. Have you done this?
I don't think you get direct access to log files for troubleshooting correlation ID issues. You may need to contact Microsoft support.
Please mark the replies as answers if they help or unmark if not. -
Try NineSYNC, Internet Data Synchronization & Mapping Web Service Software
Those who are learning, or pros in, web service technology: try the software named NineSYNC, an internet data synchronization and mapping web service, which is really running at http://www.ninestep.com/services/ninesync/NineSYNC. You can go to the Jence Incorporated (NineSTEP) web site http://www.jence.com or http://www.ninesync.com and load the NineSYNC client. You can also load the trial version of the web service software.
NineSYNC is a real web service, and it's big. And it does what it's supposed to do. I have tried synchronizing data from MySQL to SQL Server, and it was quick with NineSYNC. I also synchronize files with NineSYNC on a daily basis. All you have to do is set the software up and it will synchronize periodically.
A nice, cool way to do things, and the future of web services technology.
Burd.
Burd, it's a nice program. I haven't found anything like this before, and it works quite well. Thanks for sharing the information. I would suggest that anybody involved in data synchronization and mapping take a look at this software.
-
Urgent: SRM and BW user data synchronization problem
Dear Buddies:
I'm a BWer on an SRM project. These days I am facing a very strange problem in the user data synchronization configuration between the SUS and BW systems.
The symptom is:
I configured the user data synchronization parameters in the SUS system:
SAP Reference IMG → Supplier Relationship Management → Supplier Self-Services → Master Data → Maintain Systems for Synchronization of User Data
Here I maintained the BW logical system, filled the 'FM BPID' field with 'RS_BW_BCT_SRM_SUS_USER_BPID', and filled the 'Function Module for creating user' field with 'BAPI_USER_CREATE'.
The function of the config above is that:
When a new user is created in the SAP SUS system, it will automatically be created in SAP BW, too.
At the same time, an internal table (SRM_USER_SUPBPID) is filled automatically. The table contains the assignment between the automatically created SAP BW user and the corresponding Business Partner ID of the supplier company.
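The configured behaviour can be sketched as pseudologic (a hypothetical Python sketch, not ABAP; the function and table names mirror the configuration above, but the implementation is invented for illustration):

```python
# Sketch of the configured user-synchronization flow: on user creation in
# SUS, the user is replicated to BW and the user-to-BPID assignment is
# recorded in SRM_USER_SUPBPID.
srm_user_supbpid = {}   # stands in for table SRM_USER_SUPBPID
bw_users = set()        # stands in for the BW user store

def bapi_user_create(user):
    """Stands in for BAPI_USER_CREATE on the BW side."""
    bw_users.add(user)

def rs_bw_bct_srm_sus_user_bpid(user, bpid):
    """Stands in for RS_BW_BCT_SRM_SUS_USER_BPID: record user -> BPID."""
    srm_user_supbpid[user] = bpid

def on_sus_user_created(user, supplier_bpid):
    # Both function modules should fire on user creation; the symptom
    # below is that only the first call works while the table stays empty.
    bapi_user_create(user)
    rs_bw_bct_srm_sus_user_bpid(user, supplier_bpid)

on_sus_user_created("SUPPLIER_USER1", "BP0001")
```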
Then I tested user creation in SUS on the web. I found that when the SUS user is created, the same user is created automatically in the BW system. That means 'BAPI_USER_CREATE' works.
But the content of the user-BPID mapping table 'SRM_USER_SUPBPID' is still empty. That means the FM 'RS_BW_BCT_SRM_SUS_USER_BPID' is not working at all.
Has anybody met with a similar problem? If you have any suggestions, please kindly share your solutions. Thanks!!
No solutions? I need your support, my friends.
-
Hi @all,
I am working on a scenario to split one source message with several items into 0..n messages for the same receiver.
Therefore the occurrence of the target message (signature in MM) has been adjusted.
Not all items are forwarded to the receiver; an item should be forwarded only if its content meets certain rules.
It could be that no item of the source message meets these rules, and therefore no target message at all should be created.
During the test of this case I am facing the following error in RWB; the scenario is set up as an integrated configuration.
MappingException: Split mapping created no messages, cannot proceed. Review your mapping setup: splitting to 0 messages is not allowed.
As the occurrence of the target message is 0..n, 0 should also be valid and there should be no error in this case; processing should just stop, as when there is no receiver in the receiver determination step (mode IGNORE).
Otherwise this will end up in lots of error messages which have to be analyzed and cancelled manually, which is quite time-consuming.
Maybe someone faced the same issue and found a solution to get rid of this error message.
Best regards
Jochen
Hi Somil, hi Sriram,
thanks for your response.
In my scenario there are also multiple receiver applications receiving data from one single sender application. We are using PI 7.30 dual stack, and the major goal is to set up all scenarios as integrated configurations to have Java-only processing. One reason for this setup is that the main support users do not have extensive PI knowledge so far; we want to keep it as simple as possible and therefore do not want them to have to care about the ABAP stack any more than necessary.
Unfortunately, migration to Java-only is not possible yet.
In this release, extended receiver determination using an operation mapping is not supported in an integrated configuration.
There are quite complex rules to check whether an item should be forwarded or not, including lots of fields and also value mapping, so it would be quite difficult/impossible to write an XPath expression performing the same check during the receiver determination step.
One workaround could be creating a dummy message in case there is no item to be forwarded and routing this message to a different place (a local file which is always overwritten).
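That workaround can be sketched as follows (a Python sketch of the mapping logic, not PI code; the rule check and message shapes are placeholders):

```python
def split_items(items, rule):
    """Split source items into one target message per qualifying item;
    emit a single dummy message when nothing qualifies, so the multi-
    mapping never produces zero messages and never raises the split error."""
    messages = [{"payload": item} for item in items if rule(item)]
    if not messages:
        # Dummy message, to be routed to a throw-away receiver (e.g. a
        # local file that is always overwritten).
        messages = [{"payload": "DUMMY", "dummy": True}]
    return messages

# Example rule (placeholder): forward only items above a threshold.
result = split_items([1, 5, 2], lambda x: x > 10)
print(result)   # one dummy message, since no item qualifies
```

The downside, as noted, is that the dummy message still has to be routed and cleaned up somewhere.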
Are there any alternatives/workarounds?
Best regards
Jochen
Message was edited by: Jochen Gugel -
Data Filter on CSV file using Data Synchronization
Got an error when I used a data filter on a CSV file in a Data Synchronization task. Filter condition: BILLINGSTATE LIKE 'CA'
TE_7002 Transformation stopped due to a fatal error in the mapping. The expression [(BILLINGSTATE LIKE 'CA')] contains the following errors [<<PM Parse Error>> missing operator ... (BILLINGSTATE>>>> <<<<LIKE 'CA')].
Hi,
Yes,This can be done through BEx Broadcaster.
Please follow the steps below:
1. Open your query in BEx Analyzer.
2. Go to BEx Analysis Toolbar -> Tools -> BEx Broadcaster...
3. Click "Create New Settings", then select the "Distribution Type" as "Broadcast Email" and the "Output Format" as "CSV".
4. Enter the recipients' email addresses under the "Recipients" tab.
5. Enter the subject and body of the mail under the "Texts" tab.
6. Save the setting and execute it.
Now the query data will be attached as a CSV file and sent to the recipients through email.
Hope this helps you.
Rgds,
Murali -
Please help me solve this error: Split mapping created no messages
Hi Experts,
I am facing "Split mapping created no messages" when I run my scenario.
My scenario is: Proxy to File.
I am triggering the proxy, and based on one condition I am generating different flat files.
Now when I do this I am getting the error "Split mapping created no messages" in SXI_MONITOR.
When I check this in Message Mapping by taking the data from SXMB_MONI it works fine,
but when I trigger the data from the Runtime Workbench it throws the error.
When I remove the namespaces <ns0:Messages>
and <ns0:Message1> it works fine.
How can I solve this issue?
Even though I removed the namespace in the Message Type, I am still getting the error.
I changed the occurrence of the target message to 0..unbounded both in the Message Mapping and the Interface Mapping.
It works fine in Message Mapping; the problem occurs only when I run the end-to-end scenario.
So please help me solve this issue.
I think there might be a mismatch between your namespaces.
Maybe this link might help you:
Split mapping created no messages - Multi Mapping
Are you getting the same structure from your proxy as the structure you have in your XI (sender) structure? The namespaces should match.
But I don't have experience with proxies.
Sri -
Hi,
I am trying to use "IBIMonitoringAuthoring" in my local web site.
But I am getting an error like "Server was unable to process request. ---> You do not have permissions to create a data source in this document library. Additional details have been logged for your administrator."
My code is below,
string url = ServerName + webServiceUrl;
IBIMonitoringAuthoring biService = BIMonitoringAuthoringServiceProxy.CreateInstance(url);
//Create data source object
DataSource dataCube = new DataSource("AW_Data_Cube");
dataCube.Name.Text = "AW_Data_Cube";
dataCube.ServerName = "SQL2008dev";
dataCube.DatabaseName = "Analysis Services Project1";
dataCube.CubeName = "TestCube";
dataCube.ConnectionContext = ConnectionContext.ConnectAsSharedUser;
dataCube.FormattingDimensionName = "Measures";
dataCube.MinutesToCache = 10;
dataCube.CustomTimeIntelligenceSettings = "";
biService.CreateDataSource(connectionListUrl, dataCube);
How can I authenticate to the service? Is there any way to pass credentials for this method?
Thanks & Regards
Poomani Sankaran
I suffered a similar issue in InfoPath, and I finally solved it by changing the data connection URL; it should be the same as the InfoPath publish location.
For example: the SP server with IP 192.168.1.1 has two names; the hostname is mySP and the alternate access mapping name is companySP, and you can access the website by both
http://mySP and
http://companySP
Hope it can help someone. -
Problem in creating jdbc-data-view to Oracle 10g
I have installed DSEE 7.0 and set up the Directory Proxy Server successfully.
my oracle environment:
ip=202.205.16.93
sid=nicdbs
table: CAMPUS.CAMPUS_USER { USERID, NAME, TEL, EMAIL }
my steps (using the administration guide as a reference):
dpconf create-jdbc-data-source -b nicdbs -B jdbc:oracle:thin://202.205.16.93:1521: -J file:///opt/sun/dsee7/var/jdbc/ojdbc14.jar -S oracle.jdbc.driver.OracleDriver oracleds
dpconf set-jdbc-data-source-prop oracleds db-vendor:oracle
dpconf set-jdbc-data-source-prop oracleds db-pwd-file:/etc/oraclepass.txt
dpconf create-jdbc-data-source-pool oraclepool
dpconf attach-jdbc-data-source oraclepool oracleds
dpconf create-jdbc-data-view oracleview oraclepool o=oracle
dpconf create-jdbc-table user CAMPUS.CAMPUS_USER
dpconf add-jdbc-attr user uid USERID
dpconf add-jdbc-attr user cn NAME
dpconf add-jdbc-attr user telephoneNumber TEL
dpconf add-jdbc-attr user mail EMAIL
dpconf create-jdbc-object-class oracleview person user uid
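For what it's worth, the add-jdbc-attr mappings above mean the proxy should translate a simple LDAP search into SQL roughly like this (a Python sketch of the idea; the real DPS translation is internal to the server):

```python
# The LDAP-attribute-to-SQL-column map established by the add-jdbc-attr
# commands above, and the mapped table from create-jdbc-table.
ATTR_MAP = {"uid": "USERID", "cn": "NAME", "telephoneNumber": "TEL", "mail": "EMAIL"}
TABLE = "CAMPUS.CAMPUS_USER"

def ldap_filter_to_sql(attr, value):
    """Translate a simple (attr=value) LDAP filter into the SQL query the
    JDBC data view would need to run against the mapped table."""
    column = ATTR_MAP[attr]
    return f"SELECT {', '.join(ATTR_MAP.values())} FROM {TABLE} WHERE {column} = '{value}'"

print(ldap_filter_to_sql("uid", "1422"))
```

So a search like `uid=1422` should end up as a SELECT on CAMPUS.CAMPUS_USER filtered by USERID, provided the JDBC connection itself works (which is where the HR#01465/"No JDBC server available" errors below point).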
Everything returned successfully,
but when I test the view by
ldapsearch -b o=oralce -x uid=1422
it returns:
ldap_search: DSA is unavailable
ldap_search: additional info: Unable to process the search request. Reason: [Original error=52] No JDBC server available.
I can't get any help from this error message. I want some help; maybe some steps were left out.
Thanks in advance.
Thanks, but it didn't solve it:
ldapsearch -b o=oracle uid=1422
ldap_search: DSA is unavailable
ldap_search: additional info: Unable to process the search request. Reason: [Original error=52] No JDBC server available.
I also tried:
db-name: (CONNECT_DATA =(SERVICE_NAME = nicdbs))
db-url: jdbc:oracle:thin:@(DESCRIPTION =(ADDRESS = (PROTOCOL = TCP)(HOST = 202.205.16.93)(PORT = 1521)))
but it still didn't work.
#[root@dsee7 ~]# dpconf get-jdbc-data-source-prop oracleds
db-name : nicdbs
db-pwd : {3DES}+j6AoBZNIGx+7st/KDPgSmzVMlDcL6zs
db-url : jdbc:oracle:thin:@202.205.16.93:1521:
db-user : campus
db-vendor : oracle
description : -
driver-class : oracle.jdbc.driver.OracleDriver
driver-url : file:///opt/sun/dsee7/var/jdbc/ojdbc14.jar
is-enabled : true
is-read-only : false
monitoring-inactivity-timeout : 2m
monitoring-interval : 30s
monitoring-mode : reactive
num-connection-incr : 5
num-connection-init : 5
num-connection-limit : 30
#dpconf get-jdbc-data-view-prop oracleview
Enter "cn=Proxy Manager" password:
alternate-search-base-dn : ""
attr-name-mappings : none
base-dn : o=oracle
contains-shared-entries : false
custom-distribution-algorithm : none
description : -
distribution-algorithm : none
dn-join-rule : none
dn-mapping-attrs : none
dn-mapping-source-base-dn : none
excluded-subtrees : -
filter-join-rule : none
is-enabled : true
is-read-only : false
is-routable : true
jdbc-attr-date-format : yyyy-MM-dd
jdbc-attr-time-format : hh:mm:ss
jdbc-attr-timestamp-format : yyyy-MM-dd hh:mm:ss
jdbc-data-source-pool : oraclepool
lexicographic-attrs : all
lexicographic-lower-bound : none
lexicographic-upper-bound : none
non-viewable-attr : none
non-writable-attr : none
numeric-attrs : all
numeric-default-data-view : false
numeric-lower-bound : none
numeric-upper-bound : none
pattern-matching-base-dn-regular-expression : all
pattern-matching-base-object-search-filter : all
pattern-matching-dn-regular-expression : all
pattern-matching-one-level-search-filter : all
pattern-matching-subtree-search-filter : all
process-bind : -
replication-role : master
viewable-attr : all except non-viewable-attr
writable-attr : all except non-writable-attr
Edited by: mickypc1979 on Mar 1, 2010 4:31 PM -
Error - Split mapping created no messages
Dear Experts,
While executing an IDoc-to-File scenario in PI 7.0, it shows the error "Split mapping created no messages".
To resolve this error I removed the extra tags with the namespace "\SplitAndMerge" specified in the source XML message; the IDocs used in the interface maps are standard, and there are no issues with the mapping either.
Kindly suggest how to solve this issue.
Thanks and Regards,
Srinivas
Hi,
Now I have solved the error of empty file creation; in my target file I am able to populate the data. But now the issue is:
I have a source IDoc with 3 sales items, and I need to generate 3 records in the output file, but now I am getting only the first record.
So please let me know how to get the remaining 2 line items into my output file as well.
Source MT:
Messages 1..1
Messages1 1..1
ItemMain 1..1
item 0..unbounded
Messages2 1..1
Item2 0..1
Target MT
Messages 1..1
Messages1 1..1
TItem 0..1
item 0..unbounded
Messages2 1..1
TItem2 0..1
Best Regards,
Srinivas -
Split mapping created no messages - Multi Mapping
Hi ,
I am using multi-mapping without BPM in an XML-to-flat-file scenario (PI SP12), and I am getting the following error in SXMB_MONI:
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
- <!-- Request Message Mapping
-->
- <SAP:Error xmlns:SAP="http://sap.com/xi/XI/Message/30" xmlns:SOAP="http://schemas.xmlsoap.org/soap/envelope/" SOAP:mustUnderstand="">
<SAP:Category>XIServer</SAP:Category>
<SAP:Code area="MAPPING">GENERIC</SAP:Code>
<SAP:P1>Split mapping created no messages</SAP:P1>
<SAP:P2 />
<SAP:P3 />
<SAP:P4 />
<SAP:AdditionalText />
<SAP:ApplicationFaultMessage namespace="" />
<SAP:Stack>Split mapping created no messages</SAP:Stack>
<SAP:Retry>M</SAP:Retry>
</SAP:Error>
I know this is a very well-known error.
I referred to "Split mapping created no messages"
and the blog /people/jin.shin/blog/2006/02/07/multi-mapping-without-bpm--yes-it146s-possible
but they didn't help; I have just one target message structure, not two as in the blog.
I double-checked my source input XML file:
1. Mapped my header (0..unbounded) to the message type, because I need to get as many files as there are header occurrences in the source.
2. Checked that in the MM the target message is 0..unbounded.
3. Checked that in the IM the target message is 0..unbounded.
4. Used enhanced interface determination.
When I test with the same sample XML data in the MM test tool, I get:
<?xml version="1.0" encoding="UTF-8"?>
<ns0:Messages xmlns:ns0="http://sap.com/xi/XI/SplitAndMerge"><ns0:Message1></ns0:Message1></ns0:Messages>
seems like there is no data in it, but the system message is "Mapping executed successfully".
Does it give output like this when you use multi-mapping?
Basically it is not able to split the source message as required.
Any suggestions? I think I checked everything that needs to be right.
thank you,
Babu
Hi Agasturi,
thank you for your response.
>>If the Pay load is huge some time it will be empty message after pasting one time try to execute it again, Let us check the error message now.
I don't think my message is huge, just 60 MB, but the header repeats around 100 times, so it would generate 100 output files.
OK, to be safe, I took just one record from my payload, and also three records from my payload (SXMB_MONI); I get the same error in SXMB_MONI:
"Split mapping created no messages "
and in MM testing, similar to the previous one:
no output data, just empty tags as given in my very first message, and system status "Mapping executed successfully".
I have done similar scenarios using multi-mapping, and I did this the same way; there is no difference. I did exactly the same as in my earlier interface; the only difference in this interface is the occurrences of my source structure.
Earlier I had SOURCE header (1..unbounded), header fields 1..1, trailer (0..unbounded), trailer fields 1..1, so at that time I created my target structure with header, header fields 1..1 and trailer (0..unbounded), trailer fields 1..1.
Now I have
SOURCE header (0..unbounded), header fields 0..1, trailer (0..unbounded), trailer fields 0..1, so I created my target structure with header, header fields 0..1 and trailer (0..unbounded), trailer fields 0..1,
and mapped my header (0..unbounded) to my MT-XYZ (0..unbounded, after changing the occurrence to 0..unbounded in the Messages tab) to get one message per header occurrence.
Any suggestions?
thank you,
Babu. -
I have a scenario: ECC - PI - Message Broker. ECC sends an IDoc to PI; PI executes the mapping and sends the data to Message Broker (through a JMS channel), with an almost one-to-one mapping: IDOC(AAE)-PI-JMS. Now my requirement is: from PI, after the mapping, we need to create a file with the same data that is sent to Message Broker and put the file in a SAP folder, without touching the mapping. Is that possible? Please advise with the steps. We are using an ICO for this scenario. A quick response is appreciated.
Hi Pratik,
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/502991a2-45d9-2910-d99f-8aba5d79fb42?quicklink=index&overridelayout=true
This link might help.
regards
Anupam