How to configure value date (VALUT) filling in FF67?

Hi,
Does anyone know whether there is any configuration related to the automatic filling of the value date (BSEG-VALUT) in transaction FF67? During clearing of a document this date is not filled automatically, and I would like to change this.
Thanks,
Nuno

A substitution can be used for this.
Check a condition on transaction code and document type, and fill the value date with the current date.
The transaction code for maintaining FI substitutions is OBBH.
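
If a fixed rule is not flexible enough, the OBBH substitution can also call a user exit implemented in a copy of the standard exit pool RGGBS000 (usually named ZGGBS000). The sketch below is only an illustration, not a delivered SAP solution: the form name U100_SET_VALUE_DATE and its registration as a field exit for BSEG-VALUT (following the pattern of the sample exits in RGGBS000) are assumptions.

  FORM u100_set_value_date USING p_valut TYPE bseg-valut.
    " Substitution exit for BSEG-VALUT: default the value date to
    " today's date when it was left empty during entry or clearing.
    IF p_valut IS INITIAL.
      p_valut = sy-datum.
    ENDIF.
  ENDFORM.

In OBBH the substitution step would then reference this exit instead of a constant value, while the prerequisite still checks the transaction code and document type.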

Similar Messages

  • [Forum FAQ] How to configure a Data Driven Subscription which gets multi-value parameters from one column of a database table?

    Introduction
    In SQL Server Reporting Services, we can define a mapping between the fields returned by the query and specific delivery options and report parameters in a data-driven subscription.
    For a report with a parameter (such as YEAR) that allows multiple values, when creating a data-driven subscription, how can we pass a record like the one below so that the report shows the correct data (data for the years 2012, 2013 and 2014)?
    EmailAddress                  Parameter          Comment
    [email protected]           2012,2013,2014     NULL
    In this article, I will demonstrate how to configure a Data Driven Subscription which gets multi-value parameters from one column of a database table.
    Workaround
    Generally, if we pass the “Parameter” column directly to the report in step 5 when creating the data-driven subscription, the value “2012,2013,2014” will be treated as a single value, and Reporting Services will use “2012,2013,2014” to filter the data. However, there are no records whose YEAR field equals “2012,2013,2014”, so when the subscription executes we get an error
    in the log (C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\LogFiles):
    Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: Default value or value provided for the report parameter 'Name' is not a valid value.
    This means the value is not in the parameter's list of available values, i.e. it is an invalid parameter value. If we change the parameter records as below:
    EmailAddress                  Parameter    Comment
    [email protected]           2012         NULL
    [email protected]           2013         NULL
    [email protected]           2014         NULL
    In this case, Reporting Services will generate 3 reports for one data-driven subscription, each covering only one year, which obviously does not meet the requirement.
    Currently there is no built-in solution for this issue. The workaround is to create two reports: one used by end users to view the report, and another used to create the data-driven subscription.
    On the report used for the data-driven subscription, uncheck the “Allow multiple values” option for the parameter and do not specify available values or default values for it. Then change the filter
    From
    Expression: [ParameterName]
    Operator:   In
    Value:      [@ParameterName]
    To
    Expression: [ParameterName]
    Operator:   In
    Value:      Split(Parameters!ParameterName.Value, ",")
    In this case, we can pass a value like "2012,2013,2014" from the database to the data-driven subscription.
    Applies to
    Microsoft SQL Server 2005
    Microsoft SQL Server 2008
    Microsoft SQL Server 2008 R2
    Microsoft SQL Server 2012

    For every Auftrag there are multiple Position entries.
    The rest of the blocks do not seem to have any relation to them.
    So you can check this code to see how the internal table lt_str is built: its first 3 fields hold the Auftrag data and the next 3 fields hold the Position data. The structure is flat, assuming that every Position record belongs to the preceding Auftrag.
    Try out this snippet.
    " Upload the raw text file into an internal table of lines.
    DATA lt_data TYPE TABLE OF string.
    DATA lv_data TYPE string.
    CALL METHOD cl_gui_frontend_services=>gui_upload
      EXPORTING
        filename = 'C:\temp\test.txt'
      CHANGING
        data_tab = lt_data
      EXCEPTIONS
        OTHERS   = 19.
    CHECK sy-subrc EQ 0.
    " Flat target structure: a1-a3 hold the [Auftrag] fields,
    " p1-p3 hold the [Position] fields of the related record.
    TYPES:
      BEGIN OF ty_str,
        a1 TYPE string,
        a2 TYPE string,
        a3 TYPE string,
        p1 TYPE string,
        p2 TYPE string,
        p3 TYPE string,
      END OF ty_str.
    DATA: lt_str   TYPE TABLE OF ty_str,
          ls_str   TYPE ty_str,
          lv_block TYPE string,
          lv_flag  TYPE abap_bool.
    LOOP AT lt_data INTO lv_data.
      " A bracketed line starts a new block; every other line is data.
      CASE lv_data.
        WHEN '[Version]' OR '[StdSatz]' OR '[Arbeitstag]' OR '[Pecunia]'
             OR '[Mita]' OR '[Kunde]' OR '[Auftrag]' OR '[Position]'.
          lv_block = lv_data.
          lv_flag = abap_false.
        WHEN OTHERS.
          lv_flag = abap_true.
      ENDCASE.
      CHECK lv_flag EQ abap_true.
      " Keep the last [Auftrag] fields in the work area and append
      " one row to lt_str for every [Position] line that follows.
      CASE lv_block.
        WHEN '[Auftrag]'.
          SPLIT lv_data AT ';' INTO ls_str-a1 ls_str-a2 ls_str-a3.
        WHEN '[Position]'.
          SPLIT lv_data AT ';' INTO ls_str-p1 ls_str-p2 ls_str-p3.
          APPEND ls_str TO lt_str.
      ENDCASE.
    ENDLOOP.

  • How to get values/data stored in the database into a list-item.

    How do I get values/data stored in the database into a list item?
    I tried to make a list item without any values assigned to it, but I got the error below:
    FRM-30191: No list items defined for required poplist.
    or
    FRM-32082: Invalid value for given item type.
    List EMPNO
    Item: EMPNO
    Block: EMP
    Form: MODULE5
    FRM-30085: Unable to adjust form for output.
    Then, according to some docs, I tried the following for the WHEN-NEW-FORM-INSTANCE trigger:
    declare
         rg_name varchar2(40) := 'emp_rec';
         status number;
         groupid recordgroup;
         it_id item;
    begin
         it_id := Find_Item('empno');
         groupid := create_group_from_query(rg_name, 'select empno from emp');
         status := populate_group(groupid);
         populate_list(it_id, groupid);
    end;
    but it still didn't work... :(
    So how do I get values fetched from the database table into the list item?

    For list items you need two values in the record group: one is the displayed value and one is the returned value.
    Check out the online help for the POPULATE_LIST built-in.
    You'll need something like select ename, ename from emp as the record group query.

  • How to configure Essbase data source in OBIEE11g on Unix system?

    Hi,
    I am looking for documentation or a link on how to configure an Essbase data source in OBIEE 11g on a UNIX system.
    Thanks in advance

    Hi Fayaz,
    First you need the "BI Administrator Client Tool" because you need to make changes in the RPD (repository), but the "BI Administrator Client Tool" is not available for Unix.
    So you have to download the OBIEE 11.1.1.5 "BI Administrator Client Tool" and install it on a Windows platform.
    Regarding your original issue, I am also looking for this information.
    Regards
    Sher

  • How to change value date from posting date to net due date

    Hi Gurus,
    My client wants to change the value date from the posting date to the net due date. Currently the posting date is used as the value date, but in future the client wants the value date to be the net due date of the documents.
    1. What configuration needs to be maintained to change the value date from the posting date to the net due date?
    2. Will these changes affect the automatic payment run?
    Thanks and Regards,
    Suresh

    Hi Suresh,
    In future:
    While posting documents you can enter the value date as the due date of the document.
    For already posted documents:
    You can change the value date to the net due date.
    If the value date is in display mode, i.e. you are not able to change it in FB02 (Document Change mode), use transaction OB32 and make the value date field (BSEG-VALUT) editable.
    In OB32, enter all the required fields such as account type, transaction type and company code, and finally activate the "Field can be changed" checkbox.
    I am not sure, though, whether the value date can be changed to an earlier date.
    Hope this helps.
    Regards,
    Praisty
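
    For new postings, filling the value date automatically with the net due date could again be done with an OBBH substitution exit. The sketch below is only an illustration under stated assumptions: the form name is hypothetical, registration follows the pattern of the sample exits in a copy of RGGBS000 (with the document line available in the global BSEG work area), and the net due date is approximated as the baseline date BSEG-ZFBDT plus the longest terms-of-payment days (ZBD3T/ZBD2T/ZBD1T), ignoring special cases such as credit memos.

      FORM u101_valut_from_net_due_date USING p_valut TYPE bseg-valut.
        " Approximate net due date = baseline date + longest payment-term days.
        DATA lv_days TYPE bseg-zbd1t.
        lv_days = bseg-zbd3t.                " net payment period
        IF lv_days IS INITIAL.
          lv_days = bseg-zbd2t.              " second cash discount period
        ENDIF.
        IF lv_days IS INITIAL.
          lv_days = bseg-zbd1t.              " first cash discount period
        ENDIF.
        p_valut = bseg-zfbdt + lv_days.
      ENDFORM.

    Point 2 of the question (the effect on the automatic payment run) would still need to be verified in a test system.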

  • How to configure Dynamic Data Connection for Business View

    Hi,
    How can we configure a dynamic data connection so that we can save the connection profile somewhere and do not need to enter it every time we refresh the report?
    thanks and regards
    nora

    Hi James,
    Thanks for the reply. It's solved now. For anybody interested: you can set the dynamic email address either i) by having it as part of the payload, in which case use XPath to query the payload variable, or ii) by using the identity service, following these steps:
    1. Create a user in Application Enterprise Manager, or use an existing account; if you create a new one, assign the correct role.
    2. In either case, edit the user-properties.xml file (bpel/system/services/config) for the corresponding user and add an attribute called email.
    3. Bounce the server for the changes to take effect.
    4. In the notification properties, in the To address, use ids:getUserProperty and pass the attribute name.

  • How to configure an iBot to be triggered or run once the data load completes?

    Hi Experts,
    I have one requirement:
    1) Every day, run one workflow (i.e. load data into the data warehouse).
    2) Afterwards, the iBot should run and deliver to users.
    3) We scheduled the workflows in DAC for every morning.
    Requirement:
    Once the data is loaded, the iBot should run and be sent to users dynamically (without scheduling).
    If the workflow fails, the iBot should not be delivered.
    How do I configure the iBot to be triggered or run once the data load completes?
    I am using OBIEE 10g, Informatica 8 and Windows XP.
    Thanks,
    Raja

    Hi,
    Below are the details for automating the OBIEE Scheduler.
    Create a batch file or .sh file with the following command:
    D:\OracleBI\server\Bin\saschinvoke -u Administrator/udrbiee007 -j 8
    -u is the username/password for the Scheduler (the username/password you entered during configuration).
    -j is the job ID; when you create an iBot it is assigned a new job number, which you can identify from the Job Manager.
    Refer to the thread below for more information:
    iBot scheduling after ETL load
    Alternatively, the approach above will also work, but the problem is that we need to specify a time, e.g. every day at 6:30 AM.
    Note: if the report condition is true, the report will be delivered at the scheduled time only; if the condition is false, the report will not be triggered.
    I also implemented this, but it is slightly different.
    Hope this helps,
    Thanks
    Satya

  • How to change value date field

    Hi,
    After posting the transaction data into the document, is it possible to change the value date? If it is possible, please give me guidelines. As far as I know, it is possible to change the Assignment and Text fields.
    This is my user requirement: to change the Value Date field.
    Regards,
    Prabhakar

    Hi,
    Check in document change (FB02).
    It may allow you to change the value date. If it does not allow you to change the field, then add the value date field in
    Document Change Rules, Document Header.
    Regards,
    Sankar

  • How to configure Spring Data JPA to handle sdo_geometry and Timestamp(0)?

    Hi all,
    I'm new to this community, not sure if I'm posting in the right place.
    I'm an Oracle old-hand, but new to Spring Data, having done a bit on postGIS, but now trying to apply it to Oracle.
    My problem:  I have a new maven project, with Entity and Repository classes, and a JUnit test program that uses DbUnit to preload test data.
    The current config uses a beans.xml file to set up:
      <bean id="entityManagerFactory"
        class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">...
        <property name="jpaProperties">....
      <bean id="vendorAdapter"
        class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
        <property name="databasePlatform" value="org.hibernate.dialect.Oracle10gDialect" />
        <property name="generateDdl" value="false" />
      </bean>
      <bean id="dataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource"
        destroy-method="close">
        <property name="driverClass" value="oracle.jdbc.OracleDriver" />
        <property name="jdbcUrl"
         value="jdbc:oracle:thin:@localhost:2632:BLADE" />...
    <bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
      <property name="entityManagerFactory" ref="entityManagerFactory" />
    </bean>
    Test of repos.count() works ok.
    Test to call repos.findByName(String) fails, (stack below).
    Now, the current setup doesn't handle SDO_GEOMETRY or TIMESTAMP(0), but it does manage TIMESTAMP(6).
    The warnings below show that I'm using "DefaultDataTypeFactory" and should probably be using OracleDataTypeFactory.
    I've also found the OracleDataSource class and tried that, but I still get the same data type issues.
    Ok, the big questions:
    1. Can anyone point me at an example to configure the dataSource, or vendorAdapter, to use the OracleDataTypeFactory, and retrieve these data types?
    2. What data type should I be using in Java for the geometry column?
    Thanks,
    Phil
    Test output warnings:
    16:24:05.846 [main] INFO  org.springframework.test.context.transaction.TransactionalTestExecutionListener - Began transaction (2) for test context [DefaultTestContext@990034 testClass = NotamRepositoryTest, testInstance = com.thalesgroup.uk.airscape.common.data.domain.dao.notam.NotamRepositoryTest@3ff1d6, testMethod = testNotamFetchByDatasetName@NotamRepositoryTest, testException = [null], mergedContextConfiguration = [MergedContextConfiguration@1346193 testClass = NotamRepositoryTest, locations = '{classpath:testBeans.xml}', classes = '{}', contextInitializerClasses = '[]', activeProfiles = '{}', contextLoader = 'org.springframework.test.context.support.DelegatingSmartContextLoader', parent = [null]]]; transaction manager [org.springframework.orm.jpa.JpaTransactionManager@1f1fe9d]; rollback [true]
    16:24:08.674 [main] WARN  org.dbunit.dataset.AbstractTableMetaData - Potential problem found: The configured data type factory 'class org.dbunit.dataset.datatype.DefaultDataTypeFactory' might cause problems with the current database 'Oracle' (e.g. some datatypes may not be supported properly). In rare cases you might see this message because the list of supported database products is incomplete (list=[derby]). If so please request a java-class update via the forums.If you are using your own IDataTypeFactory extending DefaultDataTypeFactory, ensure that you override getValidDbProducts() to specify the supported database products.
    16:24:08.674 [main] WARN  org.dbunit.util.SQLHelper - DATA_SET.SEC_TAG_MODIFIED data type (1111, 'TIMESTAMP(0)') not recognized and will be ignored. See FAQ for more information.
    16:24:08.674 [main] WARN  org.dbunit.util.SQLHelper - DATA_SET.CREATED data type (1111, 'TIMESTAMP(0)') not recognized and will be ignored. See FAQ for more information.
    16:24:08.674 [main] WARN  org.dbunit.util.SQLHelper - DATA_SET.MODIFIED data type (1111, 'TIMESTAMP(0)') not recognized and will be ignored. See FAQ for more information.
    16:24:08.690 [main] WARN  org.dbunit.util.SQLHelper - DATA_SET.VALID_FROM data type (1111, 'TIMESTAMP(0)') not recognized and will be ignored. See FAQ for more information.
    16:24:08.690 [main] WARN  org.dbunit.util.SQLHelper - DATA_SET.VALID_TO data type (1111, 'TIMESTAMP(0)') not recognized and will be ignored. See FAQ for more information.
    16:24:08.690 [main] WARN  org.dbunit.dataset.AbstractTableMetaData - Potential problem found: The configured data type factory 'class org.dbunit.dataset.datatype.DefaultDataTypeFactory' might cause problems with the current database 'Oracle' (e.g. some datatypes may not be supported properly). In rare cases you might see this message because the list of supported database products is incomplete (list=[derby]). If so please request a java-class update via the forums.If you are using your own IDataTypeFactory extending DefaultDataTypeFactory, ensure that you override getValidDbProducts() to specify the supported database products.
    16:24:08.690 [main] WARN  org.dbunit.util.SQLHelper - NOTAM.Q_POINT data type (1111, 'SDO_GEOMETRY') not recognized and will be ignored. See FAQ for more information.
    16:24:08.690 [main] WARN  org.dbunit.util.SQLHelper - NOTAM.Q_RANGE_RING data type (1111, 'SDO_GEOMETRY') not recognized and will be ignored. See FAQ for more information.
    java.lang.UnsupportedOperationException
        at org.hibernate.spatial.GeometrySqlTypeDescriptor.getExtractor(GeometrySqlTypeDescriptor.java:57)
        at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:267)
        at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:263)
        at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:253)
        at org.hibernate.type.AbstractStandardBasicType.hydrate(AbstractStandardBasicType.java:338)
        at org.hibernate.persister.entity.AbstractEntityPersister.hydrate(AbstractEntityPersister.java:2969)
        at org.hibernate.loader.Loader.loadFromResultSet(Loader.java:1695)
        at org.hibernate.loader.Loader.instanceNotYetLoaded(Loader.java:1627)
        at org.hibernate.loader.Loader.getRow(Loader.java:1514)
        at org.hibernate.loader.Loader.getRowFromResultSet(Loader.java:725)
        at org.hibernate.loader.Loader.processResultSet(Loader.java:952)
        at org.hibernate.loader.Loader.doQuery(Loader.java:920)
        at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:354)
        at org.hibernate.loader.Loader.doList(Loader.java:2553)
        at org.hibernate.loader.Loader.doList(Loader.java:2539)
        at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2369)
        at org.hibernate.loader.Loader.list(Loader.java:2364)
        at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:496)
        at org.hibernate.hql.internal.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:387)
        at org.hibernate.engine.query.spi.HQLQueryPlan.performList(HQLQueryPlan.java:231)
        at org.hibernate.internal.SessionImpl.list(SessionImpl.java:1264)
        at org.hibernate.internal.QueryImpl.list(QueryImpl.java:103)
        at org.hibernate.jpa.internal.QueryImpl.list(QueryImpl.java:573)
        at org.hibernate.jpa.internal.QueryImpl.getResultList(QueryImpl.java:449)
        at org.hibernate.jpa.criteria.compile.CriteriaQueryTypeQueryAdapter.getResultList(CriteriaQueryTypeQueryAdapter.java:67)
        at org.springframework.data.jpa.repository.query.JpaQueryExecution$CollectionExecution.doExecute(JpaQueryExecution.java:81)
        at org.springframework.data.jpa.repository.query.JpaQueryExecution.execute(JpaQueryExecution.java:59)
        at org.springframework.data.jpa.repository.query.AbstractJpaQuery.doExecute(AbstractJpaQuery.java:97)
        at org.springframework.data.jpa.repository.query.AbstractJpaQuery.execute(AbstractJpaQuery.java:88)
        at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:384)
        at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:344)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
        at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:98)
        at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:262)
        at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:95)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
        at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:136)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
        at org.springframework.data.jpa.repository.support.CrudMethodMetadataPostProcessor$CrudMethodMetadataPopulatingMethodIntercceptor.invoke(CrudMethodMetadataPostProcessor.java:111)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
        at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
        at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
        at com.sun.proxy.$Proxy67.findByDataSetNameAndDeletedIsNull(Unknown Source)
        at com.thalesgroup.uk.airscape.common.data.domain.dao.notam.NotamRepositoryTest.testNotamFetchByDatasetName(NotamRepositoryTest.java:103)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
        at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
        at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
        at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
        at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
        at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
        at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:233)
        at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:87)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
        at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
        at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
        at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:176)
        at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
        at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
        at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
        at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
        at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
        at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)

    Hi
    You can use T-code NACE; select your application, and based on your output type you can assign your print program and form.
    Regards
    Raja.

  • How to configure jazn-data.xml for two realms?

    Hi,
    In my test application a JSP page is secured with <auth-method>BASIC</auth-method>.
    I use the jazn-data.xml configuration in the embedded OC4J in JDeveloper 10.1.2.
    With the standard configuration of jazn-data.xml I can sign in with user admin/welcome without any problems.
    In this case only the jazn.com realm exists.
    If I configure a second realm via the JDeveloper menu "Tools -> Embedded OC4J Server Preferences ..." then I can't sign in any more.
    The new realm domain.com has some users in the group myusers. In orion-web.xml I added the new group to the role mapping.
    What is the problem? I can't find information on this in the documentation.
    Oracle® Application Server Containers for J2EE
    Security Guide
    10g Release 2 (10.1.2)
    B14013-02
    Thanks and best regards,
    Tobias
    orion-web.xml
    <orion-web-app servlet-webdir="/servlets/">
    <security-role-mapping impliesAll="false" name="user">
    <group name="jazn.com/users"/>
    <group name="domain.com/myusers"/>
    </security-role-mapping>
    </orion-web-app>
    web.xml
    <web-app>
    <description>Empty web.xml file for Web Application</description>
    <session-config>
    <session-timeout>35</session-timeout>
    </session-config>
    <mime-mapping>
    <extension>html</extension>
    <mime-type>text/html</mime-type>
    </mime-mapping>
    <mime-mapping>
    <extension>txt</extension>
    <mime-type>text/plain</mime-type>
    </mime-mapping>
    <login-config>
    <auth-method>BASIC</auth-method>
    </login-config>
    <security-constraint>
    <web-resource-collection>
    <web-resource-name>TestApp</web-resource-name>
    <url-pattern>*</url-pattern>
    </web-resource-collection>
    <auth-constraint>
    <role-name>user</role-name>
    </auth-constraint>
    </security-constraint>
    <security-role>
    <role-name>user</role-name>
    </security-role>
    </web-app>

    Hi,
    I switched my above example from jazn-data.xml to LDAP and deployed it in iAS.
    In the OID two realms are configured, realm_1 and realm_2, with some users in each.
    But I can only access my JSP with users of realm_1 or realm_2, not with users of both realms simultaneously. That depends on the configured "Default realm" of the JAZN LDAP User Manager in the EAR.
    If I set "Default realm" = realm_1 in the Enterprise Manager of the iAS, then only members of realm_1 can access the JSP, and vice versa for realm_2.
    I packaged my above example as an EAR and configured the default generated orion-application.xml in the Enterprise Manager of the iAS.
    orion-application.xml
    <?xml version = '1.0'?>
    <!DOCTYPE orion-application PUBLIC "-//ORACLE//DTD OC4J Application runtime
    9.04//EN" "http://xmlns.oracle.com/ias/dtds/orion-application-9_04.dtd">
    <orion-application deployment-version="10.1.2.0.2"
    default-data-source="jdbc/OracleDS" treat-zero-as-null="true">
    <web-module id="webapp" path="webapp.war"/>
    <persistence path="persistence"/>
    <principals path="principals.xml"/>
    <jazn provider="LDAP">
    <property name="ldap.user" value="cn=orcladmin"/>
    <property name="ldap.password" value="!ias_admin10g"/>
    </jazn>
    <log>
    <file path="application.log"/>
    </log>
    <namespace-access>
    <read-access>
    <namespace-resource root="">
    <security-role-mapping>
    <group name="jazn.com/administrators"/>
    </security-role-mapping>
    </namespace-resource>
    </read-access>
    <write-access>
    <namespace-resource root="">
    <security-role-mapping>
    <group name="jazn.com/administrators"/>
    </security-role-mapping>
    </namespace-resource>
    </write-access>
    </namespace-access>
    </orion-application>
    orion-web.xml
    <orion-web-app servlet-webdir="/servlets/">
    <security-role-mapping impliesAll="false" name="user">
    <group name="realm_1/users"/>
    <group name="realm_2/myusers"/>
    </security-role-mapping>
    </orion-web-app>
    What do I have to configure to get access to the JSP with both realms?
    Best regards,
    Tobias

  • How to find which process chain fills a data target

    Hi all,
    I have a list of cubes and I want to know which cube is filled by which process chain.
    Can anyone suggest an appropriate solution?
    Simply put: during reconciliation I want to know which process chain affects which BEx report, so that if one process chain fails I will not generate that specific report for that day.
    I have a list of MultiProviders on which we are generating reports, and I also have the technical names of the reports.

    Ganesh ji,
    Choose the InfoPackage which loads the data targets; in the InfoPackage screen go to the Scheduler tab, where you will find the where-used list. This will tell you which process chain is responsible.
    thanks
    Prabhakaran

  • How to configure the date format in toolbar of ADF Calendar

    Hi,
    In the weekly view of the ADF Calendar, the date in the toolbar of the calendar component is shown as something like "May 1,2011 - May 7,2011".
    Is there a way to configure this? What changes need to be done to show it as "1 May 2011 - 7 May 2011"?
    Thanks,
    Deepak

    See Bug 13825615 - backport to 11.1.1.7.0 bug 13045234 - date format not supported in toolbar by ca

  • How to configure optimal data connections to allow Excel to retrieve multiple data sets from a single data source

    Hi all,
    I would like to have a data connection file used by Excel to extract data from a SQL Server DB (so that a change in the location of the DB requires a change only to the single data connection file). The Excel file will retrieve data from only a single database,
    but there are multiple queries (30+) stored on SQL Server that are used: each SQL Server view's data is returned to a pivot table in an Excel worksheet, which has an associated chart with various slicers on the main dashboard worksheet.
    Do I need a separate data connection file for each SQL Server query being retrieved, or can a single connection file be created, given that all data comes from a single database?
    (All my attempts and research on the net have led me to believe that a separate file is needed for each view, but this seems unnecessary.)

    Hi, I'm learning my way with this, so apologies if I am providing too much or too little info.
    There are 8 source files which are very loosely related in that they capture information about what has happened within a metro railway over a day. However, there are only a few relationships between the contents of these files.
    These tables are imported into SQL Server using SSIS, where I have developed a number of views that query these 8 source tables to generate a number of metrics providing insight into the service offered to the passengers. Some examples of metrics are: number
    of passengers in transit at any given time, time taken to travel between adjacent stations, how much power was used during the day, what distance the trains travelled during the day, etc. Some views provide only a handful of rows, some provide 1M plus. There
    are now approx. 40 separate views.
    I have then used a spreadsheet with a worksheet associated with each SQL Server view. Each worksheet is set up as a pivot table, which allows the related chart on the main dashboard worksheet to use standard Excel capability to slice and present the data
    in different ways.
    At the moment, if the server on which the database is stored moves, I have had to recreate the spreadsheet from scratch, as I don't know how to change the connection information in any other way. Ideally I would like to have the connection info in a single place
    to reduce ongoing maintenance, particularly as I would like to place the spreadsheet on a SharePoint server to distribute it to other users.
    Thanks

  • How to remove value data points from an APEX 3 Flash Chart

    Hi
    I am using APEX 3.2 and I have to put Flash charts in my application. But since the charts are smaller in size, I do not want the data point values to appear on the chart. I plan to show them using hints. But unfortunately, I am not able to hide the data points. Can you please help? Thanks, Sachin. Here is my custom XML.
    <?xml version="1.0" encoding="utf-8" standalone="yes"?>
    <root>
         <type>
              <chart type="Stacked Horizontal 2DColumn">
                   <animation enabled="no"/>
                   <hints enabled='yes' auto_size='no' width='200' height='20' horizontal_position='left' vertical_position='top'>
                        <text><![CDATA[{NAME}, {VALUE}]]></text>
                        <font type="Verdana" size="10" color="black"/>
                        <background color="0xF4F4F4"/>
                   </hints>
                   <arguments show="no"/>
                   <names show='yes' position='top'/>
                   <values show="no" prefix="$" postfix="" decimal_separator="." thousand_separator=',' decimal_places="2"/>
                   <column_chart column_space='20' block_space='20' left_space='0' right_space='0' up_space='0' down_space='0' round_radius='0'>
                        <border enabled="no"/>
                        <block_names enabled="yes" placement="chart" rotation="45" x_offset="0" position="left">
                             <font type="verdana_embed_tf" size="8" color="0x000000"/>
                        </block_names>
                        <background type="gradient" gradient_type="linear">
                             <alphas>
                                  <alpha>100</alpha>
                                  <alpha>100</alpha>
                                  <alpha>100</alpha>
                             </alphas>
                             <ratios>
                                  <ratio>0</ratio>
                                  <ratio>120</ratio>
                                  <ratio>0xFF</ratio>
                             </ratios>
                        </background>
                   </column_chart>
              </chart>
              <workspace>
                   <background enabled="yes" type="solid" color="0xF4F4F4" alpha="0"/>
                   <base_area enabled="no"/>
                   <chart_area enabled="yes" x="100" y="50" width="600" height="300" deep="0">
                        <background enabled="yes" type="solid" color="0xCCFF99"/>
                        <border enabled="yes" size="1"/>
                   </chart_area>
                   <x_axis name="Region" smart="yes" position="left_center">
                        <font type="Verdana" size="10" color="0x000000" bold="no" align="center"/>
                   </x_axis>
                   <y_axis name="Booking Amount (kUSD)" smart="yes" position="center_bottom">
                        <font type="Verdana" size="10" color="0x000000" bold="no" align="center"/>
                   </y_axis>
                   <grid>
                        <values>
                             <lines size='1' color='0x15771A' alpha='100'/>
                        </values>
                   </grid>
              </workspace>
              <legend enabled="yes" x="0" y="0" rows="2" rows_auto_count="no">
                   <names width='80' enabled='yes'/>
              </legend>
         </type>
         #DATA#
     </root>

    Well, going in point by point and deleting is certainly one way of going about it. Of course, that would defeat the entire purpose of having this great programming environment that is capable of doing that kind of thing for us.
    Is your text file small enough to upload? Obviously you know the criteria for judging whether or not a data point is acceptable, so it should be simple enough to program. Give us a bit more information and I can assure you there will be a race between LabVIEW experts to see who can post the cleanest, simplest and fastest piece of code that will do what you need. My wager is on altenbach, but there are a few others here that are almost as impressive.

  • How to configure HR Data mapping to LDAP

    Hi everyone,
    I configured the LDAP connection and it works. But I cannot do the mapping of attributes to LDAP. Is there any document describing which fields need to be mapped and what they mean?
    Thanks
    Haldun

    Dear Abhishek,
    Before you proceed to create jobs and positions, first check whether the personnel actions (PA40) are configured or not. Then, after uploading the employee master data, you can proceed to the other processes.
    If PA40 is not configured, configure it in SPRO using PA/PM/Customizing Procedures/Infotype Menus etc. You can also maintain the user profile using transaction SU3.
    All the best.
    Rgds,
    Vikrant
