How to configure HR Data mapping to LDAP
Hi everyone,
I have configured the LDAP connection and it works, but I cannot map attributes to LDAP. Is there any document describing which fields to map and what they mean?
Thanks
Haldun
Dear Abhishek,
Before you proceed to create Jobs & Positions, first check whether Personnel Actions (PA40) is configured. After uploading the employee master data you can proceed to the other processes.
If PA40 is not configured, configure it in SPRO under PA/PM/Customizing Procedures/Infotype Menus etc. You can also maintain the user profile using transaction SU3.
All the best.
Rgds,
Vikrant
Similar Messages
-
Introduction
In SQL Server Reporting Services, we can define a mapping between the fields returned by the query and specific delivery options and report parameters in a data-driven subscription.
For a report with a parameter (such as YEAR) that allows multiple values, when creating a data-driven subscription, how can we pass a record like the one below so that the report shows the correct data (data for years 2012, 2013 and 2014)?
EmailAddress       Parameter         Comment
[email protected]    2012,2013,2014    NULL
In this article, I will demonstrate how to configure a data-driven subscription that gets multi-value parameters from one column of a database table.
Workaround
Generally, if we pass the “Parameter” column directly to the report in step 5 when creating the data-driven subscription, the value “2012,2013,2014” will be treated as a single value: Reporting Services will use “2012,2013,2014” to filter the data. However, no record has a YEAR field equal to “2012,2013,2014”, so when the subscription executes we get an error in the log (C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\LogFiles):
Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: Default value or value provided for the report parameter 'Name' is not a valid value.
This means there is no such value in the parameter's available-values list; it is an invalid parameter value. Suppose we change the parameter records as below:
EmailAddress       Parameter    Comment
[email protected]    2012         NULL
[email protected]    2013         NULL
[email protected]    2014         NULL
In this case, Reporting Services will generate three reports for the one data-driven subscription, each covering only one year, which obviously does not meet the requirement.
Currently there is no direct solution for this issue. The workaround is to create two reports: one used by end users to view the report, and another used to create the data-driven subscription.
On the report used to create the data-driven subscription, uncheck the “Allow multiple values” option for the parameter, and do not specify available values or default values for it. Then change the filter
From:
Expression: [ParameterName]
Operator: In
Value: [@ParameterName]
To:
Expression: [ParameterName]
Operator: In
Value: Split(Parameters!ParameterName.Value, ",")
In this way, we can pass a value like "2012,2013,2014" from the database to the data-driven subscription.
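As an illustration of why the single string fails but the split list succeeds, here is a small Python sketch of the two filter behaviors (the sample data is made up; SSRS evaluates its own expression language, so this is only an analogy):

```python
# Sample rows standing in for the report dataset: (YEAR, amount).
rows = [(2012, 100), (2013, 200), (2014, 300), (2015, 400)]

def filter_single(rows, param):
    # Before the workaround: the whole string "2012,2013,2014" is
    # compared against each YEAR value, so nothing ever matches.
    return [r for r in rows if str(r[0]) == param]

def filter_split(rows, param):
    # After the workaround: Split(..., ",") turns the string into a
    # list of values, and the In operator matches any of them.
    years = param.split(",")
    return [r for r in rows if str(r[0]) in years]

param = "2012,2013,2014"
print(filter_single(rows, param))  # [] - no YEAR equals the whole string
print(filter_split(rows, param))   # rows for 2012, 2013 and 2014
```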
Applies to
Microsoft SQL Server 2005
Microsoft SQL Server 2008
Microsoft SQL Server 2008 R2
Microsoft SQL Server 2012
Please click to vote if the post helps you. This can be beneficial to other community members reading the thread. -
For every Auftrag, there are multiple Position entries.
The rest of the blocks don't seem to have any relation.
You can check this code to see how the internal table lt_str is built: its first three fields hold the Auftrag data, and the next three fields hold the Position data. The structure is flat, assuming that every Position record is related to the preceding Auftrag.
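The flattening logic can be sketched in Python to make it concrete (illustrative only; the section names match the file format described above, and the sample data is made up):

```python
def flatten(lines):
    """Parse a sectioned file where [Auftrag] data lines are followed by
    [Position] data lines; emit one flat record per Position, carrying
    the fields of the most recently seen Auftrag."""
    headers = {"[Version]", "[StdSatz]", "[Arbeitstag]", "[Pecunia]",
               "[Mita]", "[Kunde]", "[Auftrag]", "[Position]"}
    records, block, auftrag = [], None, ("", "", "")
    for line in lines:
        if line in headers:
            block = line          # header lines carry no data
            continue
        fields = line.split(";")
        if block == "[Auftrag]":
            auftrag = tuple(fields[:3])
        elif block == "[Position]":
            records.append(auftrag + tuple(fields[:3]))
    return records

sample = ["[Auftrag]", "A1;A2;A3", "[Position]", "P1;P2;P3", "P4;P5;P6"]
print(flatten(sample))
# [('A1', 'A2', 'A3', 'P1', 'P2', 'P3'), ('A1', 'A2', 'A3', 'P4', 'P5', 'P6')]
```

Each Position row simply inherits whatever Auftrag fields were seen last, which is exactly what the ABAP snippet does with ls_str.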
Try out this snippet.
DATA lt_data TYPE TABLE OF string.
DATA lv_data TYPE string.

CALL METHOD cl_gui_frontend_services=>gui_upload
  EXPORTING
    filename = 'C:\temp\test.txt'
  CHANGING
    data_tab = lt_data
  EXCEPTIONS
    OTHERS   = 19.
CHECK sy-subrc EQ 0.

TYPES:
  BEGIN OF ty_str,
    a1 TYPE string,
    a2 TYPE string,
    a3 TYPE string,
    p1 TYPE string,
    p2 TYPE string,
    p3 TYPE string,
  END OF ty_str.

DATA: lt_str   TYPE TABLE OF ty_str,
      ls_str   TYPE ty_str,
      lv_block TYPE string,
      lv_flag  TYPE boolean.

LOOP AT lt_data INTO lv_data.
  " Remember which section header we are in; header lines carry no data.
  CASE lv_data.
    WHEN '[Version]' OR '[StdSatz]' OR '[Arbeitstag]' OR '[Pecunia]'
      OR '[Mita]' OR '[Kunde]' OR '[Auftrag]' OR '[Position]'.
      lv_block = lv_data.
      lv_flag = abap_false.
    WHEN OTHERS.
      lv_flag = abap_true.
  ENDCASE.
  CHECK lv_flag EQ abap_true.
  CASE lv_block.
    WHEN '[Auftrag]'.
      " Keep the latest Auftrag fields; appended with each Position line.
      SPLIT lv_data AT ';' INTO ls_str-a1 ls_str-a2 ls_str-a3.
    WHEN '[Position]'.
      SPLIT lv_data AT ';' INTO ls_str-p1 ls_str-p2 ls_str-p3.
      APPEND ls_str TO lt_str.
  ENDCASE.
ENDLOOP.
-
How to configure Essbase data source in OBIEE11g on Unix system?
Hi,
I am looking for documentation/links on how to configure an Essbase data source in OBIEE 11g on a UNIX system.
Thanks in advance
Hi Fayaz,
First, you need the "BI Administrator Client Tool" because you need to make changes within the RPD (repository), but the "BI Administrator Client Tool" is not available for Unix.
So you have to download the OBIEE 11.1.1.5 "BI Administrator Client Tool" and install it on a Windows platform.
Regarding your original issue, I am also looking for this information.
Regards
Sher
Edited by: Sher Ullah Baig on Apr 17, 2012 4:00 PM -
How to configure jazn-data.xml for two realms?
Hi,
in my test application a JSP page is secured with <auth-method>BASIC</auth-method>.
I use the jazn-data.xml configuration in the embedded OC4J in JDeveloper 10.1.2.
With the standard configuration of jazn-data.xml I can sign in with user admin/welcome without any problems.
In this case only the jazn.com realm exists.
If I configure a second realm via the JDeveloper menu "Tools -> Embedded OC4J Server Preferences ...", then I can't sign in any more.
The new realm domain.com has some users in the group myusers. In the orion-web.xml I added the new group to the role mapping.
What is the problem? I can't find information in the documentation.
Oracle® Application Server Containers for J2EE
Security Guide
10g Release 2 (10.1.2)
B14013-02
Thanks and best regards,
Tobias
orion-web.xml
<orion-web-app servlet-webdir="/servlets/">
<security-role-mapping impliesAll="false" name="user">
<group name="jazn.com/users"/>
<group name="domain.com/myusers"/>
</security-role-mapping>
</orion-web-app>
web.xml
<web-app>
<description>Empty web.xml file for Web Application</description>
<session-config>
<session-timeout>35</session-timeout>
</session-config>
<mime-mapping>
<extension>html</extension>
<mime-type>text/html</mime-type>
</mime-mapping>
<mime-mapping>
<extension>txt</extension>
<mime-type>text/plain</mime-type>
</mime-mapping>
<login-config>
<auth-method>BASIC</auth-method>
</login-config>
<security-constraint>
<web-resource-collection>
<web-resource-name>TestApp</web-resource-name>
<url-pattern>*</url-pattern>
</web-resource-collection>
<auth-constraint>
<role-name>user</role-name>
</auth-constraint>
</security-constraint>
<security-role>
<role-name>user</role-name>
</security-role>
</web-app>
Hi,
I switched my above example from jazn-data.xml to LDAP and deployed this in iAS.
In the OID are two realms configured, realm_1 and realm_2 and some users for each.
But I can only access my JSP with users of realm_1 or with users of realm_2, not with users
of both realms simultaneously. That depends on the configured "Default realm" of the
JAZN LDAP User Manager in the EAR.
If I set "Default realm" = realm_1 in the enterprise manager of the iAS then
only members of realm_1 can access the JSP and vice versa for realm_2.
I packaged my above example as an EAR and configured the default generated
orion-application.xml in the Enterprise Manager of the iAS.
orion-application.xml
<?xml version = '1.0'?>
<!DOCTYPE orion-application PUBLIC "-//ORACLE//DTD OC4J Application runtime
9.04//EN" "http://xmlns.oracle.com/ias/dtds/orion-application-9_04.dtd">
<orion-application deployment-version="10.1.2.0.2"
default-data-source="jdbc/OracleDS" treat-zero-as-null="true">
<web-module id="webapp" path="webapp.war"/>
<persistence path="persistence"/>
<principals path="principals.xml"/>
<jazn provider="LDAP">
<property name="ldap.user" value="cn=orcladmin"/>
<property name="ldap.password" value="!ias_admin10g"/>
</jazn>
<log>
<file path="application.log"/>
</log>
<namespace-access>
<read-access>
<namespace-resource root="">
<security-role-mapping>
<group name="jazn.com/administrators"/>
</security-role-mapping>
</namespace-resource>
</read-access>
<write-access>
<namespace-resource root="">
<security-role-mapping>
<group name="jazn.com/administrators"/>
</security-role-mapping>
</namespace-resource>
</write-access>
</namespace-access>
</orion-application>
orion-web.xml
<orion-web-app servlet-webdir="/servlets/">
<security-role-mapping impliesAll="false" name="user">
<group name="realm_1/users"/>
<group name="realm_2/myusers"/>
</security-role-mapping>
</orion-web-app>
What do I have to configure to get access to the JSP with both realms?
Best regards,
Tobias -
How to configure sendmail to use multiple LDAP servers ?
Hi everybody!
I have sendmail running on Solaris 10 and an LDAP server (192.168.1.9), also running Solaris 10. I have configured sendmail the following way:
bash-3.00# ldapclient list
NS_LDAP_FILE_VERSION= 2.0
NS_LDAP_BINDDN= cn=proxyagent,ou=profile,dc=email,dc=reso,dc=ru
NS_LDAP_BINDPASSWD= {NS1}*********************
NS_LDAP_SERVERS= 192.168.1.9
NS_LDAP_SEARCH_BASEDN= dc=email,dc=domain,dc=ru
NS_LDAP_AUTH= simple
NS_LDAP_SEARCH_REF= FALSE
NS_LDAP_SEARCH_SCOPE= sub
NS_LDAP_SEARCH_TIME= 30
NS_LDAP_CACHETTL= 43200
NS_LDAP_PROFILE= default
NS_LDAP_CREDENTIAL_LEVEL= proxy
NS_LDAP_BIND_TIME= 10
I also have another LDAP server (IP 192.168.1.10). It is configured as a replica of the 192.168.1.9 LDAP server.
The question is: how can I configure sendmail to use both LDAP servers?
The man pages explain how to configure ldapclient to use ONE server; what if I want to use two or more? All the settings and profiles are the same.
Thanks in advance =))
Hi!
To add LDAP servers to the Solaris ldapclient, you might use the ldapclient command:
ldapclient manual -v -a defaultServerList="servera.yourdomain.com serverb.yourdomain.com"
But this is only failover; AFAIK the Solaris ldapclient does not perform load balancing by itself.
I am not sure about your sendmail program, though. Normally sendmail has its own configuration
and can be configured to use LDAP, e.g. for aliases.
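To illustrate the distinction (failover always picks the first live server in list order, while load balancing would spread connections across both), here is a generic Python sketch; the server names and the reachability probe are made up, this is not Solaris code:

```python
def first_reachable(servers, probe):
    # Failover semantics: try servers in the listed order and use the
    # first one the probe accepts. While the primary is up, it always
    # wins; the second server is used only when the first is down.
    for server in servers:
        if probe(server):
            return server
    return None

# Pretend servera is down and serverb is up.
up = {"servera.yourdomain.com": False, "serverb.yourdomain.com": True}
chosen = first_reachable(["servera.yourdomain.com", "serverb.yourdomain.com"],
                         lambda s: up[s])
print(chosen)  # serverb.yourdomain.com
```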
Regards!
Rainer -
How to configure Dynamic Data Connection for Business View
Hi,
How can we configure a Dynamic Data connection so that we can save the connection profile somewhere and do not need to enter it every time we refresh the report?
thanks and regards
nora
Hi James,
Thanks for the reply. It's solved now. For anyone interested, you can set the dynamic email address either i) by having it as part of the payload (in this case, use XPath to query the payload variable), or ii) by using the identity service, following these steps:
1. Create a user in the application's Enterprise Manager, or use an existing account; if you are creating a new one, assign the correct role.
2. In either case, edit the user-properties.xml file (bpel/system/services/config) for the corresponding user and add an attribute called email.
3. Bounce the server for the changes to take effect.
4. In the notification properties, in the To address, use ids:getUserProperty and pass the attribute name. -
How to configure qpopper to authenticate against LDAP server
Hi,
This is a re-post of my question:
I have Directory Server 6.0 set up on a Solaris 9 system. I have also set up the Solaris 9 system's native LDAP client. The qpopper daemon is running on that client. I recompiled qpopper to use PAM authentication, then sent 'kill -HUP' to inetd. But when I try to connect to qpopper with PAM authentication, I get this error:
-ERR [AUTH] PAM authentication failed for user "nsr": No account present for user (13)
I do have the user's account, and I am able to retrieve the user's account information with 'ldaplist -l passwd nsr'. I guess it is related to a PAM configuration problem, but I don't know how to configure PAM for qpopper. The information provided by the qpopper manual is listed below:
#%PAM-1.0
auth required /lib/security/pam_pwdb.so shadow
account required /lib/security/pam_pwdb.so
password required /lib/security/pam_cracklib.so
password required /lib/security/pam_pwdb.so nullok use_authtok md5 shadow
session required /lib/security/pam_pwdb.so
Obviously, the example configuration is for Linux. So how can I configure the Solaris pam.conf to have qpopper authenticate through PAM?
My current pam.conf is listed here also:
login auth requisite pam_authtok_get.so.1
login auth required pam_dhkeys.so.1
login auth required pam_dial_auth.so.1
login auth binding pam_unix_auth.so.1 server_policy
login auth required pam_ldap.so.1
rlogin auth sufficient pam_rhosts_auth.so.1
rlogin auth requisite pam_authtok_get.so.1
rlogin auth required pam_dhkeys.so.1
rlogin auth binding pam_unix_auth.so.1 server_policy
rlogin auth required pam_ldap.so.1
rsh auth sufficient pam_rhosts_auth.so.1
rsh auth binding pam_unix_auth.so.1 server_policy
rsh auth required pam_ldap.so.1
ppp auth requisite pam_authtok_get.so.1
ppp auth required pam_dhkeys.so.1
ppp auth required pam_dial_auth.so.1
ppp auth binding pam_unix_auth.so.1 server_policy
ppp auth required pam_ldap.so.1
other auth requisite pam_authtok_get.so.1
other auth required pam_dhkeys.so.1
other auth binding pam_unix_auth.so.1 server_policy
other auth required pam_ldap.so.1
passwd auth binding pam_passwd_auth.so.1 server_policy
passwd auth required pam_ldap.so.1
cron account required pam_unix_account.so.1
other account requisite pam_roles.so.1
other account binding pam_unix_account.so.1 server_policy
other account required pam_ldap.so.1
other session required pam_unix_session.so.1
other password required pam_dhkeys.so.1
other password requisite pam_authtok_get.so.1
other password requisite pam_authtok_check.so.1
other password required pam_authtok_store.so.1 server_policy
Thanks,
--xinhuan
iAS 6.0 sp4 officially supports only iPlanet Directory Server 5.0 sp1 and 4.13.
For more details visit: http://docs.iplanet.com/docs/manuals/ias/60/sp4/ig/prep.htm#42084
I guess, you can specify the directory server during the time of installation.
Thanks,
Rakesh. -
How to configure an iBot to trigger or run once data load completes?
Hi Experts,
I have a requirement:
1) Every day, run one workflow (i.e. load data into the data warehouse).
2) Afterwards, an iBot should run and deliver to users.
3) We have scheduled the workflows in DAC for every morning.
Requirement:
Once the data is loaded, the iBot should run and send to users dynamically (without scheduling).
If the workflow fails, the iBot should not be delivered.
How can I find out or configure the iBot to trigger or run once the data load completes?
I am using OBIEE 10g, Informatica 8, and Windows XP.
Thanks in advance,
Raja
Hi,
Below are the details for automating the OBIEE Scheduler.
Create a batch file or shell script with the following command:
D:\OracleBI\server\Bin\saschinvoke -u Administrator/udrbiee007 -j 8
-u is the username/password for the Scheduler (the username/password you gave during configuration).
-j is a job ID; when you create an iBot it is assigned a new job number, which can be identified from the Job Manager.
Refer to the thread below for more information:
iBot scheduling after ETL load
Or,
the above will also work, but the problem is that we need to specify a time, like every day at 6:30 AM.
Note: if the condition report is true, the report will be delivered at 6:30 PM only; if the condition is false, the report will not be triggered.
I also implemented this, but it is a little bit different.
Hope this helps.
Thanks
Satya
Edited by: Satya Ranki Reddy on Jul 13, 2012 12:05 PM -
How to configure the date format in toolbar of ADF Calendar
Hi,
In the weekly view of the ADF Calendar, the date in the toolbar of the calendar component looks like "May 1,2011 - May 7,2011".
Is there a way to configure this? What changes need to be made to show it as "1 May 2011 - 7 May 2011"?
Thanks,
Deepak
See Bug 13825615 - backport to 11.1.1.7.0 bug 13045234 - date format not supported in toolbar by ca
-
Hi all,
I would like to have a data connection file used by Excel to extract data from a SQL Server DB (so that a change in the DB's location requires changing only the single data connection file). The Excel file will retrieve data from only a single database,
but there are multiple queries (30+) stored on SQL Server that are used: each SQL Server view's data is returned to a pivot table in an Excel worksheet, which has an associated chart with various slicers on the main dashboard worksheet.
Do I need a separate data connection file for each SQL Server query being retrieved, or can a single connection file be created, since all data comes from a single database?
(All my attempts and research on the net have led me to believe that a separate file is needed for each view, but this seems unnecessary.)
Hi, I'm learning my way with this, so apologies if I am providing too much or too little info.
There are 8 source files which are very loosely related, in that they capture information regarding what has happened within a metro railway over a day. However, there are only a few relationships between the contents of these files.
These tables are imported into SQL Server using SSIS, where I have developed a number of views that query these 8 source tables to generate metrics that provide insight into the service provided to the passengers. Some example metrics are: number
of passengers in transit at any given time, time taken to travel between adjacent stations, how much power was used during the day, what distance the trains travelled during the day, etc. Some views provide only a handful of rows, some provide 1M plus. There
are now approximately 40 separate views.
I have then used a spreadsheet with a worksheet associated with each SQL Server view. Each worksheet is set up as a pivot table, which allows the related chart on the main dashboard worksheet to use standard Excel capability to slice and present the data
in different ways.
At the moment, if the server on which the database is stored moves, I have had to recreate the spreadsheet from scratch, as I don't know how to change the connection information in any other way. Ideally I would like to have the connection info in a single place
to reduce ongoing maintenance, particularly as I would like to place the spreadsheet on a SharePoint server to distribute it to other users.
Thanks -
How to configure Spring Data JPA to handle sdo_geometry and Timestamp(0)?
Hi all,
I'm new to this community, not sure if I'm posting in the right place.
I'm an Oracle old-hand, but new to Spring Data, having done a bit on postGIS, but now trying to apply it to Oracle.
My problem: I have a new maven project, with Entity and Repository classes, and a JUnit test program that uses DbUnit to preload test data.
The current config uses a beans.xml file to set up:
<bean id="entityManagerFactory"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">...
<property name="jpaProperties">....
<bean id="vendorAdapter"
class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="databasePlatform" value="org.hibernate.dialect.Oracle10gDialect" />
<property name="generateDdl" value="false" />
</bean>
<bean id="dataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource"
destroy-method="close">
<property name="driverClass" value="oracle.jdbc.OracleDriver" />
<property name="jdbcUrl"
value="jdbc:oracle:thin:@localhost:2632:BLADE" />...
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory" />
</bean>
A test of repos.count() works OK.
A test calling repos.findByName(String) fails (stack below).
The current setup doesn't handle SDO_GEOMETRY or TIMESTAMP(0), but does manage TIMESTAMP(6).
The warnings below show that I'm using "DefaultDataTypeFactory" and should probably be using OracleDataTypeFactory.
I've also found the OracleDataSource class and tried that, but I still get the same data type issues.
Ok, the big questions:
1. Can anyone point me at an example to configure the dataSource, or vendorAdapter, to use the OracleDataTypeFactory, and retrieve these data types?
2. What data type should I be using in Java for the geometry column?
Thanks,
Phil
Test output warnings:
16:24:05.846 [main] INFO org.springframework.test.context.transaction.TransactionalTestExecutionListener - Began transaction (2) for test context [DefaultTestContext@990034 testClass = NotamRepositoryTest, testInstance = com.thalesgroup.uk.airscape.common.data.domain.dao.notam.NotamRepositoryTest@3ff1d6, testMethod = testNotamFetchByDatasetName@NotamRepositoryTest, testException = [null], mergedContextConfiguration = [MergedContextConfiguration@1346193 testClass = NotamRepositoryTest, locations = '{classpath:testBeans.xml}', classes = '{}', contextInitializerClasses = '[]', activeProfiles = '{}', contextLoader = 'org.springframework.test.context.support.DelegatingSmartContextLoader', parent = [null]]]; transaction manager [org.springframework.orm.jpa.JpaTransactionManager@1f1fe9d]; rollback [true]
16:24:08.674 [main] WARN org.dbunit.dataset.AbstractTableMetaData - Potential problem found: The configured data type factory 'class org.dbunit.dataset.datatype.DefaultDataTypeFactory' might cause problems with the current database 'Oracle' (e.g. some datatypes may not be supported properly). In rare cases you might see this message because the list of supported database products is incomplete (list=[derby]). If so please request a java-class update via the forums.If you are using your own IDataTypeFactory extending DefaultDataTypeFactory, ensure that you override getValidDbProducts() to specify the supported database products.
16:24:08.674 [main] WARN org.dbunit.util.SQLHelper - DATA_SET.SEC_TAG_MODIFIED data type (1111, 'TIMESTAMP(0)') not recognized and will be ignored. See FAQ for more information.
16:24:08.674 [main] WARN org.dbunit.util.SQLHelper - DATA_SET.CREATED data type (1111, 'TIMESTAMP(0)') not recognized and will be ignored. See FAQ for more information.
16:24:08.674 [main] WARN org.dbunit.util.SQLHelper - DATA_SET.MODIFIED data type (1111, 'TIMESTAMP(0)') not recognized and will be ignored. See FAQ for more information.
16:24:08.690 [main] WARN org.dbunit.util.SQLHelper - DATA_SET.VALID_FROM data type (1111, 'TIMESTAMP(0)') not recognized and will be ignored. See FAQ for more information.
16:24:08.690 [main] WARN org.dbunit.util.SQLHelper - DATA_SET.VALID_TO data type (1111, 'TIMESTAMP(0)') not recognized and will be ignored. See FAQ for more information.
16:24:08.690 [main] WARN org.dbunit.dataset.AbstractTableMetaData - Potential problem found: The configured data type factory 'class org.dbunit.dataset.datatype.DefaultDataTypeFactory' might cause problems with the current database 'Oracle' (e.g. some datatypes may not be supported properly). In rare cases you might see this message because the list of supported database products is incomplete (list=[derby]). If so please request a java-class update via the forums.If you are using your own IDataTypeFactory extending DefaultDataTypeFactory, ensure that you override getValidDbProducts() to specify the supported database products.
16:24:08.690 [main] WARN org.dbunit.util.SQLHelper - NOTAM.Q_POINT data type (1111, 'SDO_GEOMETRY') not recognized and will be ignored. See FAQ for more information.
16:24:08.690 [main] WARN org.dbunit.util.SQLHelper - NOTAM.Q_RANGE_RING data type (1111, 'SDO_GEOMETRY') not recognized and will be ignored. See FAQ for more information.
java.lang.UnsupportedOperationException
at org.hibernate.spatial.GeometrySqlTypeDescriptor.getExtractor(GeometrySqlTypeDescriptor.java:57)
at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:267)
at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:263)
at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:253)
at org.hibernate.type.AbstractStandardBasicType.hydrate(AbstractStandardBasicType.java:338)
at org.hibernate.persister.entity.AbstractEntityPersister.hydrate(AbstractEntityPersister.java:2969)
at org.hibernate.loader.Loader.loadFromResultSet(Loader.java:1695)
at org.hibernate.loader.Loader.instanceNotYetLoaded(Loader.java:1627)
at org.hibernate.loader.Loader.getRow(Loader.java:1514)
at org.hibernate.loader.Loader.getRowFromResultSet(Loader.java:725)
at org.hibernate.loader.Loader.processResultSet(Loader.java:952)
at org.hibernate.loader.Loader.doQuery(Loader.java:920)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:354)
at org.hibernate.loader.Loader.doList(Loader.java:2553)
at org.hibernate.loader.Loader.doList(Loader.java:2539)
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2369)
at org.hibernate.loader.Loader.list(Loader.java:2364)
at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:496)
at org.hibernate.hql.internal.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:387)
at org.hibernate.engine.query.spi.HQLQueryPlan.performList(HQLQueryPlan.java:231)
at org.hibernate.internal.SessionImpl.list(SessionImpl.java:1264)
at org.hibernate.internal.QueryImpl.list(QueryImpl.java:103)
at org.hibernate.jpa.internal.QueryImpl.list(QueryImpl.java:573)
at org.hibernate.jpa.internal.QueryImpl.getResultList(QueryImpl.java:449)
at org.hibernate.jpa.criteria.compile.CriteriaQueryTypeQueryAdapter.getResultList(CriteriaQueryTypeQueryAdapter.java:67)
at org.springframework.data.jpa.repository.query.JpaQueryExecution$CollectionExecution.doExecute(JpaQueryExecution.java:81)
at org.springframework.data.jpa.repository.query.JpaQueryExecution.execute(JpaQueryExecution.java:59)
at org.springframework.data.jpa.repository.query.AbstractJpaQuery.doExecute(AbstractJpaQuery.java:97)
at org.springframework.data.jpa.repository.query.AbstractJpaQuery.execute(AbstractJpaQuery.java:88)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:384)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:344)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:98)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:262)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:95)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:136)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.jpa.repository.support.CrudMethodMetadataPostProcessor$CrudMethodMetadataPopulatingMethodIntercceptor.invoke(CrudMethodMetadataPostProcessor.java:111)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy67.findByDataSetNameAndDeletedIsNull(Unknown Source)
at com.thalesgroup.uk.airscape.common.data.domain.dao.notam.NotamRepositoryTest.testNotamFetchByDatasetName(NotamRepositoryTest.java:103)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:233)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:87)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:176)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
Hi
You can use T-code NACE; select your application, and based on your output type you can assign your print program and form.
Reward points, if it is helpful.
Regards
Raja. -
How to configure value date (VALUT) filling in FF67?
Hi,
Does anyone know if there is any configuration related to the automatic filling of the value date (BSEG-VALUT) in transaction FF67? When clearing a document this date is not filled in automatically, and I want to change this.
Thanks,
Nuno
Substitution can be used for this.
Check the condition with the T-code and document type, and fill the value date with the current date.
T-code for substitution: OBBH -
How to configure human workflow using embedded ldap in standalone weblogic
I am trying to use the embedded LDAP to select users for a human workflow. I have created an application server instance using the SOA server details, but the realm field in the human workflow remains empty.
Please let me know what the right steps would be.
Can you provide more details about the context in which this happens? Are you selecting users in the Organization editor in BPM Studio? Is this on 11.1.1.3 or 11.1.1.4?
-
Active Data Guard - How to configure the auxiliary database's listener
HI,
I am stuck creating the listener/tnsnames entries for the auxiliary database. The problem happens because the auxiliary instance has just started: it is now in NOMOUNT stage. Instance registration with the listener is performed by the PMON process, and to start the PMON process the database needs to be in MOUNT stage. So, before the instance is registered by PMON with the listener, there is actually nothing to register, and hence the instance is BLOCKED.
http://www.oracle.com/technology/deploy/availability/pdf/oracle-openworld-2009/adg_hol_2009.pdf
I am following the above document.
LSNRCTL> status
Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=IPC)(KEY=LISTENER)))
STATUS of the LISTENER
Alias LISTENER
Version TNSLSNR for Linux: Version 11.2.0.1.0 - Production
Start Date 05-JUL-2010 11:06:23
Uptime 0 days 0 hr. 11 min. 44 sec
Trace Level off
Security ON: Local OS Authentication
SNMP OFF
Listener Parameter File /u01/app/11.2.0/grid/network/admin/listener.ora
Listener Log File /u02/app/oracle/diag/tnslsnr/oracle/listener/alert/log.xml
Listening Endpoints Summary...
(DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=LISTENER)))
(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=192.168.127.133)(PORT=1521)))
(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=192.168.127.140)(PORT=1521)))
Services Summary...
Service "+ASM" has 1 instance(s).
Instance "+ASM1", status READY, has 1 handler(s) for this service...
Service "eagledb" has 1 instance(s).
Instance "eagledb1", status READY, has 1 handler(s) for this service...
Service "eagledbXDB" has 1 instance(s).
Instance "eagledb1", status READY, has 1 handler(s) for this service...
Service "eagledb_DGB" has 1 instance(s).
Instance "eagledb1", status READY, has 1 handler(s) for this service...
Service "nyc" has 1 instance(s).
Instance "nyc", status BLOCKED, has 1 handler(s) for this service...
The command completed successfully
I found a similar solution at the link below, but it is applicable to 10g and earlier.
http://arjudba.blogspot.com/2008/05/connection-to-auxilary-instance-failed.html
I am using Enterprise Linux with Clusterware 11g Release 2 and Database 11g Release 2 on a single machine.
Thanks
Anis
99826412
Muscat
Edited by: user12979506 on Jul 5, 2010 10:13 PM

The "auxiliary" is, I presume, actually a physical standby.
If so why is it in NOMOUNT? Why don't you start it?
alter database recover managed standby database cancel;
alter database open read only; -
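For the NOMOUNT auxiliary in the Data Guard thread above, a common workaround for the BLOCKED status is a static registration in the listener.ora, which lets connections reach the instance even before it is mounted. This is a sketch only — the ORACLE_HOME path and SID are hypothetical and must match your environment (reload the listener afterwards with `lsnrctl reload`):

```text
SID_LIST_LISTENER =
  (SID_LIST =
    (SID_DESC =
      (GLOBAL_DBNAME = nyc)
      (ORACLE_HOME = /u01/app/oracle/product/11.2.0/dbhome_1)
      (SID_NAME = nyc)
    )
  )
```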
How to configure ABAP mappings in interface mapping
Hi,
I am using PI 7.0. In interface mapping, ABAP Mapping does not appear in the dropdown list. Can anyone help me with how to configure ABAP mapping in the interface mapping screen?
Thanks and regards,
suresh

Hi
Try this
How to use ABAP Mapping
https://wwwn.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/5c46ab90-0201-0010-42bd-9d0302591383
Some Blogs
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e3ead790-0201-0010-64bb-9e4d67a466b4
/people/sameer.shadab/blog/2005/09/29/testing-abap-mapping
Some more on ABAP Mapping
https://websmp101.sap-ag.de/~sapdownload/011000358700003082332004E/HowToABAPMapping.pdf
/people/ravikumar.allampallam/blog/2005/02/10/different-types-of-mapping-in-xi
/people/r.eijpe/blog
ABAP Mapping Vs Java Mapping.
Re: Message Mapping of type ABAP Class not being shown
Thanks
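One point worth adding: ABAP mapping typically does not show up in the Interface Mapping dropdown until it is enabled in the Exchange Profile. A sketch of the commonly documented setting (verify the exact value against the How-To ABAP Mapping guide linked above):

```text
Exchange Profile -> IntegrationBuilder -> IntegrationBuilder.Repository
  com.sap.aii.repository.mapping.additionaltypes =
    R3_ABAP|Abap-class;R3_XSLT|XSL (ABAP Engine)
```

After saving the parameter, restart the Integration Builder (or refresh its cache) so the additional mapping types appear.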