Data records rejected while using Data Manager Packages
Hello all,
We have configured the dimensions and applications in SAP BPC 7.5 NW. However, while loading data into the application using a data manager package, all the records are rejected. The errors are:
Line 3 :Dimension: P_CC member: SALESTEAM1 is invalid or is a calculated member
Line 3 :Dimension: PRODUCT member: PRODUCTA is invalid or is a calculated member
However, these members are present in the dimensions and have been processed as well! Any help is appreciated.
Thanks
Vijay
The message seems to be saying you have a dimension formula defined for those two members. Please check your dimension member master data and make sure there is nothing defined in the property column called "Formula" for those two members. If something is populated, delete the cell contents, then save and process your dimension. Then try your data manager load again.
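As an aside, if the dimension member sheet is exported to CSV, a quick scan for members with a populated Formula property might look like this sketch (the column layout and member data below are illustrative assumptions, not the actual BPC export format):

```python
import csv
import io

# Hypothetical export of a dimension member sheet; the FORMULA column
# corresponds to the "Formula" property mentioned above.
member_csv = io.StringIO(
    "ID,EVDESCRIPTION,FORMULA\n"
    "SALESTEAM1,Sales Team 1,\n"
    "CC_CALC,Calculated CC,[SALESTEAM1]*2\n"
)

def members_with_formula(csv_file):
    """Return member IDs whose FORMULA property is non-empty."""
    reader = csv.DictReader(csv_file)
    return [row["ID"] for row in reader if row["FORMULA"].strip()]

print(members_with_formula(member_csv))  # only CC_CALC carries a formula
```

Any member this flags would be treated as a calculated member and rejected by the data manager load.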
Best regards,
[Jeffrey Holdeman|http://wiki.sdn.sap.com/wiki/display/profile/Jeffrey+Holdeman]
SAP Labs, LLC
BusinessObjects Division
Americas Applications Regional Implementation Group (RIG)
Similar Messages
-
UJK_EXECUTION_EXCEPTION:Data records rejected when writing to the cube
Hello,
I ran into an issue while testing a script using UJKT.
I can plan the same intersection/record from a BPC template, but it doesn't work from the script/UJKT.
I checked work statuses, scripts, and rules, and found nothing that locks the intersection.
Anything else I should check?
Is there a way to get a more detailed log to see why the data records are rejected?
I appreciate your help. Thanks.
P.S. I looked at many SCN posts on this error, but nothing really helped.
UJKT Log:
#dim_memberset=9
REC :%VALUE% * 110
CALCULATION BEGIN:
QUERY PROCESSING DATA
QUERY TIME : 571.13 ms. 1 RECORDS QUERIED OUT.
QUERY REFERENCE DATA
CALCULATION TIME IN TOTAL :197.47 ms.
1 RECORDS ARE GENERATED.
CALCULATION END.
ENDWHEN ACCUMULATION: 1 RECORDS ARE GENERATED.
UJK_EXECUTION_EXCEPTION:Data records rejected when writing to the cube
Thanks Vadim. I debugged UJKT but haven't found anything helpful yet.
Will keep digging.
Here is the script.
*XDIM_MEMBERSET CAP_VIEW = 27
*XDIM_MEMBERSET PPS_TIME = 2014.03
*XDIM_MEMBERSET PPS_ITEM = 52540046EAC51ED3B2E49C10DBDB1565
*XDIM_MEMBERSET CAP_GRP = ZHM_IT00050001
*XDIM_MEMBERSET PPS_ENTITY = US
*XDIM_MEMBERSET CAP_CATG = ZHM_IT0005
*XDIM_MEMBERSET PPS_ACCOUNT = COST
*XDIM_MEMBERSET RPTCURRENCY = USD
*DESTINATION_APP=PPS_PLANNING
*SKIP_DIM = CAP_CATG,CAP_GRP,CAP_VIEW
*ADD_DIM PPS_CATG = ZHM_IT0001
*ADD_DIM PPS_GROUP = ZHM_IT00010001
*ADD_DIM PPS_VIEW = ZHM_IT0001000127
*WHEN PPS_ACCOUNT
*IS COST
*REC(EXPRESSION=%VALUE%*10)
*ENDWHEN
*COMMIT
LGX:
LOG:
FILE:\ROOT\WEBFOLDERS\HM_PLAM \ADMINAPP\PPS_CAPPLAN\TEST.LGF
USER:SMULLAPUDI
APPSET:HM_PLAM
APPLICATION:PPS_CAPPLAN
[INFO] GET_DIM_LIST(): I_APPL_ID="PPS_CAPPLAN", #dimensions=9
CAP_CATG,CAP_GRP,CAP_VIEW,MEASURES,PPS_ACCOUNT,PPS_ENTITY,PPS_ITEM,PPS_TIME,RPTCURRENCY
#dim_memberset=8
CAP_VIEW:27,1 in total.
PPS_TIME:2014.03,1 in total.
PPS_ITEM:52540046EAC51ED3B2E49C10DBDB1565,1 in total.
CAP_GRP:ZHM_IT00050001,1 in total.
PPS_ENTITY:US,1 in total.
CAP_CATG:ZHM_IT0005,1 in total.
PPS_ACCOUNT:COST,1 in total.
RPTCURRENCY:USD,1 in total.
REC :%VALUE%*10
CALCULATION BEGIN:
QUERY PROCESSING DATA
QUERY TIME : 1038.58 ms. 1 RECORDS QUERIED OUT.
QUERY REFERENCE DATA
CALCULATION TIME IN TOTAL :173.04 ms.
1 RECORDS ARE GENERATED.
CALCULATION END.
ENDWHEN ACCUMULATION: 1 RECORDS ARE GENERATED.
UJK_EXECUTION_EXCEPTION:Data records rejected when writing to the cube
Thanks,
Vasu -
Getting Duplicate data Records error while loading the Master data.
Hi All,
We are getting a duplicate data records error while loading the profit centre master data. The master data contains time-dependent attributes.
The load is a direct update, so I set the request to red and tried to reload from the PSA, but it throws the same error.
I checked the PSA; the records with the same profit centre are shown in red.
Could anyone give us suggestions to resolve this issue, please?
Thanks & Regards,
Raju
Hi Raju,
I assume there are no routines written in the update rules and that you are loading the data directly from R/3 (not from an ODS). If that is the case, the data maintained in R/3 may have overlapping time intervals (since time dependency of attributes is involved). Check your PSA to see whether the same profit center has time intervals that overlap. In that case, you need to get this fixed in R/3. If there are no overlapping time intervals, you can simply increase the error tolerance limit in your InfoPackage and repeat the load.
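The overlap check described above can be sketched as follows; the profit-center keys and dates are made-up illustrations of time-dependent master data rows:

```python
from datetime import date

# Each row is (profit_center, valid_from, valid_to), mimicking
# time-dependent attribute records as they would appear in the PSA.
rows = [
    ("PC100", date(2010, 1, 1), date(2010, 6, 30)),
    ("PC100", date(2010, 6, 1), date(2010, 12, 31)),  # overlaps the first row
    ("PC200", date(2010, 1, 1), date(2010, 12, 31)),
]

def overlapping_keys(rows):
    """Return keys whose validity intervals overlap each other."""
    by_key = {}
    for key, start, end in rows:
        by_key.setdefault(key, []).append((start, end))
    bad = set()
    for key, intervals in by_key.items():
        intervals.sort()
        for (s1, e1), (s2, e2) in zip(intervals, intervals[1:]):
            if s2 <= e1:  # next interval starts before the previous one ends
                bad.add(key)
    return sorted(bad)

print(overlapping_keys(rows))  # ['PC100']
```

Any key this reports is a candidate for the duplicate-record error and would need its intervals corrected in R/3.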
Hope this helps you.
Thanks & Regards,
Nithin Reddy. -
OSB: Cannot acquire data source error while using JCA DBAdapter in OSB
Hi All,
I've run into a 'Cannot acquire data source' error while using the JCA DBAdapter in OSB.
The error information is as follows:
The invocation resulted in an error: Invoke JCA outbound service failed with application error, exception: com.bea.wli.sb.transports.jca.JCATransportException: oracle.tip.adapter.sa.api.JCABindingException: oracle.tip.adapter.sa.impl.fw.ext.org.collaxa.thirdparty.apache.wsif.WSIFException: servicebus:/WSDL/DBAdapter1/RetrievePersonService [ RetrievePersonService_ptt::RetrievePersonServiceSelect(RetrievePersonServiceSelect_inputParameters,PersonTCollection) ] - WSIF JCA Execute of operation 'RetrievePersonServiceSelect' failed due to: Could not create/access the TopLink Session.
This session is used to connect to the datastore.
Caused by Exception [EclipseLink-7060] (Eclipse Persistence Services - 2.0.2.v20100323-r6872): org.eclipse.persistence.exceptions.ValidationException
Exception Description: Cannot acquire data source [jdbc/soademoDatabase].
Internal Exception: javax.naming.NameNotFoundException: Unable to resolve 'jdbc.soademoDatabase'. Resolved 'jdbc'; remaining name 'soademoDatabase'.
; nested exception is:
BINDING.JCA-11622
Could not create/access the TopLink Session.
This session is used to connect to the datastore.
JNDI Name for the Database pool: eis/DB/soademoDatabase
JNDI Name for the Data source: jdbc/soademoDatabase
I created a basic DBAdapter in JDeveloper, got the xsd file, wsdl file, .jca file, and the TopLink mapping file, and imported them into the OSB project.
Then I used the .jca file to generate a business service, and tested, then the error occurs as described above.
Login info in RetrievePersonService-or-mappings.xml
<login xsi:type="database-login">
<platform-class>org.eclipse.persistence.platform.database.oracle.Oracle9Platform</platform-class>
<user-name></user-name>
<connection-url></connection-url>
</login>
jca file content are as follows:
<adapter-config name="RetrievePersonService" adapter="Database Adapter" wsdlLocation="RetrievePersonService.wsdl" xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
<connection-factory location="eis/DB/soademoDatabase" UIConnectionName="Connection1" adapterRef=""/>
<endpoint-interaction portType="RetrievePersonService_ptt" operation="RetrievePersonServiceSelect">
<interaction-spec className="oracle.tip.adapter.db.DBReadInteractionSpec">
<property name="DescriptorName" value="RetrievePersonService.PersonT"/>
<property name="QueryName" value="RetrievePersonServiceSelect"/>
<property name="MappingsMetaDataURL" value="RetrievePersonService-or-mappings.xml"/>
<property name="ReturnSingleResultSet" value="false"/>
<property name="GetActiveUnitOfWork" value="false"/>
</interaction-spec>
</endpoint-interaction>
</adapter-config>
RetrievePersonService_db.wsdl are as follows:
<?xml version="1.0" encoding="UTF-8"?>
<WL5G3N0:definitions name="RetrievePersonService-concrete" targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/db/KnowledeMgmtSOAApplication/AdapterJDevProject/RetrievePersonService" xmlns:WL5G3N0="http://schemas.xmlsoap.org/wsdl/" xmlns:WL5G3N1="http://xmlns.oracle.com/pcbpel/adapter/db/KnowledeMgmtSOAApplication/AdapterJDevProject/RetrievePersonService" xmlns:WL5G3N2="http://schemas.xmlsoap.org/wsdl/soap/">
<WL5G3N0:import location="RetrievePersonService.wsdl" namespace="http://xmlns.oracle.com/pcbpel/adapter/db/KnowledeMgmtSOAApplication/AdapterJDevProject/RetrievePersonService"/>
<WL5G3N0:binding name="RetrievePersonService_ptt-binding" type="WL5G3N1:RetrievePersonService_ptt">
<WL5G3N2:binding style="document" transport="http://www.bea.com/transport/2007/05/jca"/>
<WL5G3N0:operation name="RetrievePersonServiceSelect">
<WL5G3N2:operation soapAction="RetrievePersonServiceSelect"/>
<WL5G3N0:input>
<WL5G3N2:body use="literal"/>
</WL5G3N0:input>
<WL5G3N0:output>
<WL5G3N2:body use="literal"/>
</WL5G3N0:output>
</WL5G3N0:operation>
</WL5G3N0:binding>
<WL5G3N0:service name="RetrievePersonService_ptt-bindingQSService">
<WL5G3N0:port binding="WL5G3N1:RetrievePersonService_ptt-binding" name="RetrievePersonService_ptt-bindingQSPort">
<WL5G3N2:address location="jca://eis/DB/soademoDatabase"/>
</WL5G3N0:port>
</WL5G3N0:service>
</WL5G3N0:definitions>
Any suggestion is appreciated.
Thanks in advance!
Edited by: user11262117 on Jan 26, 2011 5:28 PM
Hi Anuj,
Thanks for your reply!
I found that the data source is registered on server soa_server1 as follows:
Binding Name: jdbc.soademoDatabase
Class: weblogic.jdbc.common.internal.RmiDataSource_1033_WLStub
Hash Code: 80328036
toString Results: ClusterableRemoteRef(8348400613458600489S:10.2.1.143:[8001,8001,-1,-1,-1,-1,-1]:base_domain:soa_server1 [8348400613458600489S:10.2.1.143:[8001,8001,-1,-1,-1,-1,-1]:base_domain:soa_server1/291])/291
Binding Name: jdbc.SOADataSource
Class: weblogic.jdbc.common.internal.RmiDataSource_1033_WLStub
Hash Code: 92966755
toString Results: ClusterableRemoteRef(8348400613458600489S:10.2.1.143:[8001,8001,-1,-1,-1,-1,-1]:base_domain:soa_server1 [8348400613458600489S:10.2.1.143:[8001,8001,-1,-1,-1,-1,-1]:base_domain:soa_server1/285])/285
I don't know how to determine which server the DBAdapter is targeted to.
But I found the following information:
Under Deployment -> DBAdapter -> Monitoring -> Outbound Connection Pools:
Outbound Connection Pool Server State Current Connections Created Connections
eis/DB/SOADemo AdminServer Running 1 1
eis/DB/SOADemo soa_server1 Running 1 1
eis/DB/soademoDatabase AdminServer Running 1 1
eis/DB/soademoDatabase soa_server1 Running 1 1
The DbAdapter is related to the following files:
C:\Oracle\Middleware\home_11gR1\Oracle_SOA1\soa\connectors\DbAdapter.rar
C:\Oracle\Middleware\home_11gR1\Oracle_SOA1\soa\DBPlan\Plan.xml
I unzipped DbAdapter.rar and opened weblogic-ra.xml, and found that only one data source is registered:
<?xml version="1.0"?>
<weblogic-connector xmlns="http://www.bea.com/ns/weblogic/90">
<enable-global-access-to-classes>true</enable-global-access-to-classes>
<outbound-resource-adapter>
<default-connection-properties>
<pool-params>
<initial-capacity>1</initial-capacity>
<max-capacity>1000</max-capacity>
</pool-params>
<properties>
<property>
<name>usesNativeSequencing</name>
<value>true</value>
</property>
<property>
<name>sequencePreallocationSize</name>
<value>50</value>
</property>
<property>
<name>defaultNChar</name>
<value>false</value>
</property>
<property>
<name>usesBatchWriting</name>
<value>true</value>
</property>
<property>
<name>usesSkipLocking</name>
<value>true</value>
</property>
</properties>
</default-connection-properties>
<connection-definition-group>
<connection-factory-interface>javax.resource.cci.ConnectionFactory</connection-factory-interface>
<connection-instance>
<jndi-name>eis/DB/SOADemo</jndi-name>
<connection-properties>
<properties>
<property>
<name>xADataSourceName</name>
<value>jdbc/SOADataSource</value>
</property>
<property>
<name>dataSourceName</name>
<value></value>
</property>
<property>
<name>platformClassName</name>
<value>org.eclipse.persistence.platform.database.Oracle10Platform</value>
</property>
</properties>
</connection-properties>
</connection-instance>
</connection-definition-group>
</outbound-resource-adapter>
</weblogic-connector>
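To double-check which connection pool maps to which data source, a small sketch like this could parse the weblogic-ra.xml shown above (only an excerpt of the file is reproduced here); if the .jca file references eis/DB/soademoDatabase but the deployed descriptor only defines eis/DB/SOADemo, the jdbc/soademoDatabase lookup would fail exactly as in the error:

```python
import xml.etree.ElementTree as ET

# Minimal excerpt of the weblogic-ra.xml quoted above, same namespace.
ra_xml = """<weblogic-connector xmlns="http://www.bea.com/ns/weblogic/90">
  <outbound-resource-adapter>
    <connection-definition-group>
      <connection-instance>
        <jndi-name>eis/DB/SOADemo</jndi-name>
        <connection-properties>
          <properties>
            <property><name>xADataSourceName</name><value>jdbc/SOADataSource</value></property>
          </properties>
        </connection-properties>
      </connection-instance>
    </connection-definition-group>
  </outbound-resource-adapter>
</weblogic-connector>"""

NS = {"ra": "http://www.bea.com/ns/weblogic/90"}

def connection_instances(xml_text):
    """Map each connection-instance JNDI name to its xADataSourceName."""
    root = ET.fromstring(xml_text)
    result = {}
    for inst in root.iter("{http://www.bea.com/ns/weblogic/90}connection-instance"):
        jndi = inst.find("ra:jndi-name", NS).text
        ds = None
        for prop in inst.iter("{http://www.bea.com/ns/weblogic/90}property"):
            if prop.find("ra:name", NS).text == "xADataSourceName":
                ds = prop.find("ra:value", NS).text
        result[jndi] = ds
    return result

print(connection_instances(ra_xml))
```

Running this against the full descriptor lists every pool and its backing data source, which makes the name mismatch easy to spot.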
Then I decided to use eis/DB/SOADemo for testing.
For JDeveloper project, after I deployed to weblogic server, it works fine.
But for OSB project referencing wsdl, jca and mapping file from JDeveloper project, still got the same error as follows:
BEA-380001: Invoke JCA outbound service failed with application error, exception:
com.bea.wli.sb.transports.jca.JCATransportException: oracle.tip.adapter.sa.api.JCABindingException: oracle.tip.adapter.sa.impl.fw.ext.org.collaxa.thirdparty.apache.wsif.WSIFException: servicebus:/WSDL/DBAdapterTest/DBReader [ DBReader_ptt::DBReaderSelect(DBReaderSelect_inputParameters,PersonTCollection) ] - WSIF JCA Execute of operation 'DBReaderSelect' failed due to: Could not create/access the TopLink Session.
This session is used to connect to the datastore.
Caused by Exception [EclipseLink-7060] (Eclipse Persistence Services - 2.0.2.v20100323-r6872): org.eclipse.persistence.exceptions.ValidationException
Exception Description: Cannot acquire data source [jdbc/SOADataSource].
Internal Exception: javax.naming.NameNotFoundException: Unable to resolve 'jdbc.SOADataSource'. Resolved 'jdbc'; remaining name 'SOADataSource'.
; nested exception is:
BINDING.JCA-11622
Could not create/access the TopLink Session.
This session is used to connect to the datastore.
Caused by Exception [EclipseLink-7060] (Eclipse Persistence Services - 2.0.2.v20100323-r6872): org.eclipse.persistence.exceptions.ValidationException
Exception Description: Cannot acquire data source [jdbc/SOADataSource].
Internal Exception: javax.naming.NameNotFoundException: Unable to resolve 'jdbc.SOADataSource'. Resolved 'jdbc'; remaining name 'SOADataSource'.
You may need to configure the connection settings in the deployment descriptor (i.e. DbAdapter.rar#META-INF/weblogic-ra.xml) and restart the server. This exception is considered not retriable, likely due to a modelling mistake.
It is almost driving me crazy! :-(
What's the purpose of 'weblogic-ra.xml' under the folder of 'C:\Oracle\Middleware\home_11gR1\Oracle_OSB1\lib\external\adapters\META-INF'?
Unable to support application while using data services
I have a Curve 9220, an all-time good product, but I am unable to use any application while using data services, though I can browse from the browser.
Yes, if you want to use them on the mobile network you must have a BB data plan.
Data load failed while loading data from one DSO to another DSO..
Hi,
The data load failed on SID generation while loading data from the source DSO to the target DSO.
The following errors occur:
Value "External Ref # 2421-0625511EXP " (HEX 450078007400650072006E0061006C0020005200650066
Error when assigning SID: Action VAL_SID_CONVERT, InfoObject 0BBP
So I don't understand why the load succeeded in one DSO (the source) but failed in the other (the target).
While analyzing, I found that "SIDs Generation upon Activation" is checked in the source DSO but not in the target DSO. Is that the reason it failed?
Please explain..
Thanks,
Sneha
Hi,
I hope your data flow has been designed so that the 1st DSO acts as a staging layer, all transformation rules and routines are maintained between the 1st and 2nd DSO, and "SID Generation upon Activation" is maintained only in the 2nd DSO. That way the data in the 1st DSO is the same as in your source system (since no transformation rules or routines are applied there), which helps avoid data load failures.
Please analyze the following:
Have you loaded master data before transaction data? If not, please do it first.
Go to the properties of the first DSO and check whether "SID Generation upon Activation" is maintained there (I guess it may not be).
Go to the properties of the 2nd DSO and check whether "SID Generation upon Activation" is maintained there (I expect it is).
This may be the reason.
Also check whether any special characters (even lowercase letters) are involved in your transaction data.
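That last check can be sketched as a quick scan over the characteristic values; the allowed character set below is an illustrative assumption, since the characters BW actually permits depend on your RSKC settings:

```python
import re

# Illustrative whitelist: uppercase letters, digits and a few punctuation
# characters. Real systems may permit more via transaction RSKC.
ALLOWED = re.compile(r'^[A-Z0-9 _\-./]*$')

def suspicious_values(values):
    """Return values containing lowercase or other characters BW may reject."""
    return [v for v in values if not ALLOWED.match(v)]

# "External Ref # 2421" resembles the value from the error message above:
# both the '#' and the lowercase letters would be flagged.
print(suspicious_values(["PC1000", "External Ref # 2421", "pc2000"]))
```

Values this flags are the ones worth cleaning (or permitting in RSKC) before reloading.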
Regards
BVR -
Creating data and .par files using SWIFT Integration Package
Hi,
I have a requirement to generate a data file and a .par file using the SAP PI Integration Package for SWIFT.
I am following the SAP Note: 1303428.
I have created 1 sender and 2 receiver communication channels (one for the payload and the other for the .par file), and used the operation mapping SWIFT_payload_parFile_split in the interface determination.
I am using the adapter module localejbs/swift/FileActConversionToSWIFTModule and setting the parameter DetachParameters to true. This adapter module is used in all three channels (1 sender and 2 receivers).
I have used ASMA and set the FileName checkbox.
Now, after placing the file in the input directory, a file with the same name gets created in the output directory, but it is exactly the same as the input, and no .par file is created. I have set Empty file handling to Ignore, so it appears there is no data to create a .par file; only the payload file is created, and it is identical to the input.
Also, if I use the adapter module localejbs/swift/FileActConversionToSWIFTModule in only the sender communication channel, a payload file gets created like the one below.
<?xml version="1.0" encoding="UTF-8"?>
-<ns1:SWIFT_payload xmlns:ns1="http://sap.com/xi/SWIFT"><payload>PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4KPEZpbGU+CiA8UGFyYW1ldGVycyB4bWxucz0idXJuOnN3aWZ0OnNhZzp4c2Q6ZnRhLnBhcmFtLjEuMCIgeG1sbnM6eHNpPSJodHRwOi8vd3d3LnczLm9yZy8yMDAxL1hNTFNjaGVtYS1pbnN0YW5jZSI+CiAgPE92ZXJyaWRlcz4KICAgPFJlc3BvbmRlcj5jbj1jZngsb3U9bmEsbz1jaXRpZ2IybCxvPXN3aWZ0PC9SZXNwb25kZXI+CiAgIDxTZXJ2aWNlPnN3aWZ0LmNvcnAuZmEhcDwvU2VydmljZT4KICAgPFJlcXVlc3RUeXBlPnBhaW4uMDAxLjAwMS4wMzwvUmVxdWVzdFR5cGU+CiAgIDxUcmFuc2ZlckRlc2NyaXB0aW9uPkIwNDExMC1CYXRjaDE3ODc8L1RyYW5zZmVyRGVzY3JpcHRpb24+CiAgIDxUcmFuc2ZlckluZm8+QjA0MTEwLUJhdGNoMTc4NzwvVHJhbnNmZXJJbmZvPgogICA8RmlsZURlc2NyaXB0aW9uPjIwNzY4PC9GaWxlRGVzY3JpcHRpb24+CiAgIDxGaWxlSW5mbz5Td0NvbXByZXNzaW9uPW5vbmU8L0ZpbGVJbmZvPgogICA8Tm9uUmVwdWRpYXRpb24+VFJVRTwvTm9uUmVwdWRpYXRpb24+CiAgIDxTaWduPlRSVUU8L1NpZ24+CiAgIDxQcmlvcml0eT5Ob3JtYWw8L1ByaW9yaXR5PgogIDwvT3ZlcnJpZGVzPgogPC9QYXJhbWV0ZXJzPgogPERvY3VtZW50IHhtbG5zPSJ1cm46aXNvOnN0ZDppc286MjAwMjI6dGVjaDp4c2Q6cGFpbi4wMDEuMDAxLjAzIiB4bWxuczp4c2k9Imh0dHA6Ly93d3cudzMub3JnLzIwMDEvWE1MU2NoZW1hLWluc3RhbmNlIj4KICA8Q3N0bXJDZHRUcmZJbml0bj4KICAgPEdycEhkcj4KICAgIDxNc2dJZD4xMDAwMzI4MTE1PC9Nc2dJZD4KICAgIDxDcmVEdFRtPjIwMTQtMDMtMjhUMTk6MjY6Mzc8L0NyZUR0VG0+CiAgICA8TmJPZlR4cz4xPC9OYk9mVHhzPgogICAgPEN0cmxTdW0+NTkwLjAwPC9DdHJsU3VtPgogICAgPEluaXRnUHR5PgogICAgIDxObT5BTEVYSU9OIElOVC4gU0FSTDwvTm0+CiAgICAgPElkPgogICAgICA8T3JnSWQ+CiAgICAgICA8QklDT3JCRUk+QUxYTlVTMjBYWFg8L0JJQ09yQkVJPgogICAgICA8L09yZ0lkPgogICAgIDwvSWQ+CiAgICA8L0luaXRnUHR5PgogICA8L0dycEhkcj4KICAgPFBtdEluZj4KICAgIDxQbXRJbmZJZD4xMDAwMzI4MTE1PC9QbXRJbmZJZD4KICAgIDxQbXRNdGQ+VFJGPC9QbXRNdGQ+CiAgICA8QnRjaEJvb2tnPmZhbHNlPC9CdGNoQm9va2c+CiAgICA8TmJPZlR4cz4xPC9OYk9mVHhzPgogICAgPEN0cmxTdW0+NTkwLjAwPC9DdHJsU3VtPgogICAgPFBtdFRwSW5mPgogICAgIDxJbnN0clBydHk+Tk9STTwvSW5zdHJQcnR5PgogICAgIDxTdmNMdmw+CiAgICAgIDxDZD5TRVBBPC9DZD4KICAgICA8L1N2Y0x2bD4KICAgIDwvUG10VHBJbmY+CiAgICA8UmVxZEV4Y3RuRHQ+MjAxNC0wMy0yOTwvUmVxZEV4Y3RuRHQ+CiAgICA8RGJ0cj4KICAgICA8Tm0+QUxYTiBCRU5FTFVYIEJWIE5MIEJSQU5DSDwvTm0+CiAgICA
gPFBzdGxBZHI+CiAgICAgIDxTdHJ0Tm0+U3RyYWF0PC9TdHJ0Tm0+CiAgICAgIDxUd25ObT5OZXRoZXJsYW5kczwvVHduTm0+CiAgICAgIDxDdHJ5Pk5MPC9DdHJ5PgogICAgIDwvUHN0bEFkcj4KICAgICA8Q3RyeU9mUmVzPk5MPC9DdHJ5T2ZSZXM+CiAgICA8L0RidHI+CiAgICA8RGJ0ckFjY3Q+CiAgICAgPElkPgogICAgICA8SUJBTj5OTFhYQU5CQTEyMzAwNDU2NzY3ODkwPC9JQkFOPgogICAgIDwvSWQ+CiAgICAgPENjeT5FVVI8L0NjeT4KICAgIDwvRGJ0ckFjY3Q+CiAgICA8RGJ0ckFndD4KICAgICA8RmluSW5zdG5JZD4KICAgICAgPEJJQz5BQk5BTkwyWFhYWDwvQklDPgogICAgICA8UHN0bEFkcj4KICAgICAgIDxDdHJ5Pk5MPC9DdHJ5PgogICAgICA8L1BzdGxBZHI+CiAgICAgPC9GaW5JbnN0bklkPgogICAgPC9EYnRyQWd0PgogICAgPENocmdCcj5TTEVWPC9DaHJnQnI+CiAgICA8Q2R0VHJmVHhJbmY+CiAgICAgPFBtdElkPgogICAgICA8RW5kVG9FbmRJZD5OTDEyMzAwNDAwMDAwMDwvRW5kVG9FbmRJZD4KICAgICA8L1BtdElkPgogICAgIDxBbXQ+CiAgICAgIDxJbnN0ZEFtdCBDY3k9IkVVUiI+NTkwLjAwPC9JbnN0ZEFtdD4KICAgICA8L0FtdD4KICAgICA8Q2R0ckFndD4KICAgICAgPEZpbkluc3RuSWQ+CiAgICAgICA8QklDPkFCTkFOTFhYWFhYPC9CSUM+CiAgICAgICA8Q2xyU3lzTW1iSWQ+CiAgICAgICAgPE1tYklkPjAwMzwvTW1iSWQ+CiAgICAgICA8L0NsclN5c01tYklkPgogICAgICAgPE5tPkFCTiBBbXJvPC9ObT4KICAgICAgIDxQc3RsQWRyPgogICAgICAgIDxDdHJ5Pk5MPC9DdHJ5PgogICAgICAgPC9Qc3RsQWRyPgogICAgICA8L0Zpbkluc3RuSWQ+CiAgICAgPC9DZHRyQWd0PgogICAgIDxDZHRyPgogICAgICA8Tm0+QUxYTiBOTCBEb21lc3RpYyBWZW5kb3I8L05tPgogICAgICA8UHN0bEFkcj4KICAgICAgIDxDdHJ5Pk5MPC9DdHJ5PgogICAgICA8L1BzdGxBZHI+CiAgICAgIDxJZD4KICAgICAgIDxPcmdJZD4KICAgICAgICA8T3Rocj4KICAgICAgICAgPElkPjAwMTUwMDAxOTc8L0lkPgogICAgICAgIDwvT3Rocj4KICAgICAgIDwvT3JnSWQ+CiAgICAgIDwvSWQ+CiAgICAgPC9DZHRyPgogICAgIDxDZHRyQWNjdD4KICAgICAgPElkPgogICAgICAgPElCQU4+TkwxMjAwMzA0NTY3ODkxMjAwMDA8L0lCQU4+CiAgICAgIDwvSWQ+CiAgICAgIDxDY3k+RVVSPC9DY3k+CiAgICAgIDxObT5BTFhOIE5MIERvbWVzdGljIFZlbmRvcjwvTm0+CiAgICAgPC9DZHRyQWNjdD4KICAgICA8Um10SW5mPgogICAgICA8VXN0cmQ+L1BNREQvVEVTVDY4MSw1OTAuMDAsRVVSLDIwMTQwMzI8L1VzdHJkPgogICAgIDwvUm10SW5mPgogICAgPC9DZHRUcmZUeEluZj4KICAgPC9QbXRJbmY+CiAgPC9Dc3RtckNkdFRyZkluaXRuPgogPC9Eb2N1bWVudD4KPC9GaWxlPgo=</payload></ns1:SWIFT_payload>
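The <payload> element is just the original File document encoded in base64, so it can be inspected offline with a sketch like this (a tiny stand-in payload is used here instead of the long one above):

```python
import base64

def decode_payload(payload_b64):
    """Decode the base64 <payload> element back to the original XML text."""
    return base64.b64decode(payload_b64).decode("utf-8")

# Tiny stand-in for the real payload, which is too long to repeat here.
sample = base64.b64encode(b'<?xml version="1.0"?><File/>').decode("ascii")
print(decode_payload(sample))  # the embedded SWIFT File document
```

Decoding the real payload shows the File element with the Parameters (overrides) and Document sections, which helps confirm what the adapter module actually produced.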
But while creating the .par file, it says the file could not be overwritten, so the .par file is not created.
I need to understand:
1) How do I configure my two receiver channels, i.e. what should differ between them? Currently I set the same output directory in both, with file name schema *, and use ASMA with the FileName parameter. So files with the same name are created and one of them is discarded. How do I create a .par file?
2) Is the file above the correct file required by SWIFT?
3) Also, when I check SXMB_MONI, I can see that after running the adapter module the same payload as above goes into both the payload and .par files; even if I use different names in the two receiver communication channels, two files are created, but both contain the same payload. So what exactly should be created?
Kindly guide me on this implementation.
Hi,
I am able to generate the .par file by setting localsecurity to true and the KeyId from the key manager.
Now two files are created: .xml and .par. The .xml file, which is the payload file, is identical to the input file and contains both the overrides and the data parameters. The .par file contains the algorithm and the value.
Is this correct?
Also, suppose the input file name is SEPA.xml; then the payload file is created with the name SEPA.xml and the par file with the name SEPA.xml.par. I need only SEPA.par. How can I achieve this? -
Duplicate Data records occured while loading
Hi Experts,
While loading, duplicate data records occurred. Earlier, when there were fewer records, I used to delete the duplicates in the background in the PSA. Now there are more records and I am unsure which to delete and which to keep. It is a flat-file load with delta update mode, and in the InfoPackage "Update subsequently in data target" is shown in a hidden position. I went through the process chain display variant, set the status to red, and selected "Update subsequently in data target" and "Ignore duplicate data records"; now I want to trigger the subsequent process. From the process monitor I can use RSPC_PROCESS_FINISH. If I go through the display variant, what is the process? Is there a function module?
Select the checkbox "Handle duplicate record keys" in the DTP, on the Update tab.
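What that checkbox effectively does can be sketched like this: among records sharing the same semantic key, only the last one in the data package is kept (the field names below are made up for illustration):

```python
def keep_last_per_key(records, key_field):
    """Keep only the last record for each key, preserving key order."""
    latest = {}
    for rec in records:
        latest[rec[key_field]] = rec  # later records overwrite earlier ones
    return list(latest.values())

records = [
    {"PROFIT_CTR": "PC1", "TEXT": "old text"},
    {"PROFIT_CTR": "PC1", "TEXT": "new text"},  # duplicate key, wins
    {"PROFIT_CTR": "PC2", "TEXT": "other"},
]
print(keep_last_per_key(records, "PROFIT_CTR"))
```

With the checkbox set, the DTP resolves duplicates this way instead of failing, which is usually acceptable for master data that is overwritten anyway.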
Thanks....
Shambhu -
Error while using date fields in Hibernate Criteria
I am trying a sample application using Hibernate with JPA. I am using annotations to map the database table to a Java class, and org.hibernate.Criteria to form the where clause. My code looks as follows:
EntityManagerFactory emf = Persistence.createEntityManagerFactory("PERSISTANCE_UNIT");
EntityManager em = emf.createEntityManager();
Session session = (Session) em.getDelegate();
Criteria criteria = session.createCriteria(MyDO.class);
criteria.add(Restrictions.eq("myDO.date", myDO.getDate()));
List list = criteria.list();
I am searching the database based on the date field (myDO is an instance of MyDO, which contains the mapping to the database table). When running the application for the first time after publishing to the server, I get the following error. On subsequent runs I get no error and the application runs with actual results.
JDBCException W org.hibernate.util.JDBCExceptionReporter logExceptions SQL Error: -181, SQLState: 22007
JDBCException E org.hibernate.util.JDBCExceptionReporter logExceptions THE STRING REPRESENTATION OF A DATETIME VALUE IS NOT A VALID DATETIME VALUE
org.hibernate.exception.DataException: could not execute query
at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:77)
at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:43)
at org.hibernate.loader.Loader.doList(Loader.java:2223)
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2104)
at org.hibernate.loader.Loader.list(Loader.java:2099)
at org.hibernate.loader.criteria.CriteriaLoader.list(CriteriaLoader.java:94)
at org.hibernate.impl.SessionImpl.list(SessionImpl.java:1569)
at org.hibernate.impl.CriteriaImpl.list(CriteriaImpl.java:283)
Can anybody help fix this problem?
Thanks in advance.
Resolved the issue by setting the system date where expected.
-
Error in loading data into essbase while using Rule file through ODI
Hi Experts,
Referring to my previous post, Error while using Rule file in loading data into Essbase through ODI:
I am facing a problem while loading data into Essbase. I can load data into Essbase successfully, but when I use a rule file to add values to existing values, I get an error.
test is my Rule file.
com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Cannot put olap file object. Essbase Error(1053025): Object [test] already exists and is not locked by user [admin@Native Directory]
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
at java.lang.Thread.run(Thread.java:662)
from com.hyperion.odi.common import ODIConstants
from com.hyperion.odi.connection import HypAppConnectionFactory
from java.lang import Class
from java.lang import Boolean
from java.sql import *
from java.util import HashMap
# Get the select statement on the staging area:
sql= """select C1_HSP_RATES "HSP_Rates",C2_ACCOUNT "Account",C3_PERIOD "Period",C4_YEAR "Year",C5_SCENARIO "Scenario",C6_VERSION "Version",C7_CURRENCY "Currency",C8_ENTITY "Entity",C9_VERTICAL "Vertical",C10_HORIZONTAL "Horizontal",C11_SALES_HIERARICHY "Sales Hierarchy",C12_DATA "Data" from PLANAPP."C$_0HexaApp_PLData" where (1=1) """
srcCx = odiRef.getJDBCConnection("SRC")
stmt = srcCx.createStatement()
srcFetchSize=30
#stmt.setFetchSize(srcFetchSize)
stmt.setFetchSize(1)
print "executing query"
rs = stmt.executeQuery(sql)
print "done executing query"
#load the data
print "loading data"
stats = pWriter.loadData(rs)
print "done loading data"
#close the database result set, connection
rs.close()
stmt.close()
Please help me on this...
Thanks & Regards,
Chinnu
Hi Priya,
Thanks for your reply. I already checked, and no locks are held on the rule file. I don't know what the problem is. It works fine without the rule file, but throws an error only when using it.
Please help on this.
Thanks,
Chinnu -
Need to specify LEFT OUTER JOIN while using data from logical database BRM?
I'm trying to extract data for external processing using SQVI. The fields required are in tables BKPF (Document Header) and BSEG (document detail) so I'm using logical database BRM. Note: required fields include the SPGR* (blocking reasons) which don't appear to be in BSIS/BSAS/BSID/BSAD/BSIK/BSAK, hence I can't just use a Table Join on any of these but have to use BSEG, hence BRM.
If the document type is an invoice, I also need to include the PO number from table EKKO (PO header), if present, so I'd like to add this to the list. However, if I do, it seems that some records are no longer displayed, e.g. AB documents.
The interesting thing is that not all records are suppressed, so it's not a simple case of the logical database using an effective INNER JOIN, but the effect is similar.
In any event, is there a way to specify that the link to table EKKO should be treated as an effective LEFT OUTER JOIN, i.e. records from BKPF/BSEG should be included irrespective of whether any records from EKKO/EKPO exist or not?
Alternatively, is there some other way to get the SPGR* fields (for example) from BSEG and still join BKPF? Of course, one solution is to use multiple queries, but I was hoping to avoid this.
Thanks for everyone's responses. I know how to work around the problem with SQL; I want to see whether there is a way to make the outer join's filter go in the join clause instead of the where clause with standard Crystal Reports functionality.
We have some Crystal Reports users who are not SQL users, i.e. benefit specialists, payroll specialists, and compensation analysts. I was hoping this functionality was available for them. I made my example a simple one, but reports often have multiple outer joins, with maybe two or three of them needing a filter that won't turn them into an inner join.
Such as
Select person information
outer join address record
outer join email record
outer join tax record (filter for active state record & filter for code = STATE )
outer join pay rates record
outer join phone#s (filter for home phone#)
I thought maybe the functionality is available and I just don't know how or where to use it. Maybe it is just not available.
If it is not available, I will probably need to setup some standard views for them to query, rather than expecting them to pull the tables together themselves. -
Data security concern while using JDBC
My Java application connects to a database to read patient information. Do I have to worry about encrypting the data? I am using the Oracle JDBC driver. Is there any chance anyone can read the data in transit?
In theory it is possible. In practice I don't know of any recorded instances outside the NSA. I've read that there are no known cases of credit card numbers being harvested from plaintext IP traffic.
Your question should really be directed to your employer or the customer.
If you're on an intranet I would forget about it; if you're using the Internet you may be required to use SSL. -
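The principle behind the advice above can be sketched in a few lines of Python (the original thread concerns Oracle JDBC, which has its own network encryption settings; this is only an illustration of the client-side posture, not JDBC code): a properly configured TLS context both encrypts traffic and verifies the server's identity, which is what protects data like patient records from being read in transit.

```python
import ssl

# A default context is the safe starting point: it requires a valid,
# trusted server certificate and checks that it matches the hostname.
ctx = ssl.create_default_context()

# These two settings are what defeat a passive sniffer and an
# impersonating server, respectively.
requires_cert = ctx.verify_mode == ssl.CERT_REQUIRED
checks_host = ctx.check_hostname
```

Both flags are true by default; disabling either one reintroduces exactly the risk the original poster is asking about.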
Duplicate data record error while reloading a DTP
I have a DTP created to load a text datasource, and it is included in the process chain. I have set "delta mode" so that it picks up additional records when they are available.
But I notice that when there are no additional records (e.g. the earlier load had 269 records and the current load has the same 269 records), I get the error "Duplicate data record" and the status is set to "Red". I do not want to see an error in this case, since no new records were found; I want the status to be green if there are no new records.
Could you please suggest which settings will do that?
Regards
Raj
A Delta DTP will fetch only unloaded requests (requests which do not exist in the target), not additional records.
Is the text datasource delta enabled ?
Do you have an InfoPackage of update type Delta set up?
Did you run a Full DTP before the Delta DTP?
Assuming a Full InfoPackage has loaded 269 records to the PSA, the same records will be loaded again (if there are no additional records in the source system).
Req 1 - 269 - Yesterday
Req 2 - 269 - Today
The Full DTP run yesterday will load 269 records to the target.
The Delta DTP run today will load Req 1 & Req 2 to the target (master data will be overwritten). Reason: the Delta DTP is acting like an Init with data transfer.
You can start off with a Delta DTP instead of a Full one; if a Full DTP is run before a Delta DTP, make sure you delete the requests loaded by the Full DTP.
This can be ignored as this is master data, which will be overwritten.
To get rid of the error, just check "Handle Duplicate Records" in the Update tab of the DTP. -
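The "master data will be overwritten" point above can be sketched in plain Python (this is not BW code, and the record structure is hypothetical): with duplicate handling enabled, a repeated load simply overwrites records key by key instead of raising an error, so reloading the same 269 records changes nothing.

```python
# Hypothetical master-data store keyed on the record's key field.
def load_master_data(target, request):
    for rec in request:
        target[rec["key"]] = rec["text"]  # overwrite instead of raising an error
    return target

target = {}
req1 = [{"key": "0001", "text": "Cost center A"},
        {"key": "0002", "text": "Cost center B"}]
req2 = list(req1)  # the repeated load: same records, no new ones

load_master_data(target, req1)
load_master_data(target, req2)
# target still holds exactly two records; the duplicate load is harmless
```

Without the overwrite behavior (the unchecked "Handle Duplicate Records" case), the second request's records would be treated as violations, which matches the red status the poster sees.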
iPhone 5 won't connect to the internet while using data or Wi-Fi after iOS 7.0.6 update
I just updated my iPhone 5 today to iOS 7.0.6. No matter what connection I use, data or Wi-Fi, I cannot connect to the internet or use any function involving the internet.
UPDATE - I got my devices working on my wifi. Here's what I did:
1. On your iOS device - go to "network settings" and select the associated network and select "forget this network".
2. Reboot your iOS device.
3. Change the wifi (WPA2) password on your wifi AP.
4. Re-join the wifi network on your iOS device and put in the new password.
--> If your system is like mine, the device will now connect to your wifi network. I suspect the SSL fix somehow munged the password encryption for the existing password, but changing it on both ends somehow fixes it.
good luck!!
Tom -
Loading data in multiple languages using Import Manager
Hello Experts,
I have a description field associated with the main table and need to load the data in multiple languages. How do I load the data in multiple languages using Import Manager?
Thanks in advance.
Hi Kiran,
Please do the changes as mentioned in this threads:
Re: Multi Language Question
Multilingual field
Regards,
---Satish