Sessions.xml to point to a data-source

Hi,
I'm trying to have the sessions.xml file point to and use a data source that is defined in the data-sources.xml file.
I have checked that the data-source entries are fine (I've used the test-connection feature in AS to check this).
Below is the content of my sessions.xml file; can you please let me know if I am missing some entries in this file? I'm also sending my data-sources.xml file in case you want to have a look at it.
Sessions.xml
<?xml version="1.0" encoding="UTF-8"?>
<toplink-sessions version="10g release 2 (10.1.3.0.0DP4)" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<session xsi:type="database-session">
<name>oracle.epcis.common.model</name>
<event-listener-classes/>
<primary-project xsi:type="xml">EPCIS_OXM.xml</primary-project>
<login xsi:type="xml-login">
<password>5C4C6924B1B14E16</password>
</login>
</session>
<session xsi:type="server-session">
<name>oracle.epcis.common.model.orm</name>
<event-listener-classes/>
<logging xsi:type="toplink-log"/>
<primary-project xsi:type="xml">EPCIS_ORM.xml</primary-project>
<login>
<datasource>jdbc/epcisDS</datasource>
<uses-external-connection-pool>true</uses-external-connection-pool>
<uses-external-transaction-controller>true</uses-external-transaction-controller>
<user-name/>
</login>
<connection-pools>
<read-connection-pool>
<name>EPCIS Connection Pool</name>
</read-connection-pool>
<write-connection-pool>
<name>EPCIS Connection Pool</name>
</write-connection-pool>
</connection-pools>
<connection-policy/>
</session>
</toplink-sessions>
Data-sources.xml
<?xml version = '1.0' encoding = 'UTF-8'?>
<data-sources xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://xmlns.oracle.com/oracleas/schema/data-sources-10_1.xsd" schema-major-version="10" schema-minor-version="1">
<managed-data-source connection-pool-name="EPCIS Connection Pool" jndi-name="jdbc/epcisDS" name="epcisDS"/>
<connection-pool name="EPCIS Connection Pool">
<connection-factory factory-class="oracle.jdbc.pool.OracleDataSource" user="scott" password="tiger" url="jdbc:oracle:thin:@//localhost:1521/ORCL"/>
</connection-pool>
</data-sources>
Thanks in Advance,
Mahima

Thanks! That helped.
I have a project-orm.xml file which also has the database username, password, and URL. How can I get this to point to the same data source too?
The project-orm.xml file is below: how can I modify it to point to the datasource jdbc/epcisDS?
project-orm.xml
<?xml version="1.0" encoding="UTF-8"?>
<toplink:object-persistence version="Oracle TopLink - 10g Release 3 (10.1.3.0.0) (Build 060118)" xmlns:opm="http://xmlns.oracle.com/ias/xsds/opm" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:toplink="http://xmlns.oracle.com/ias/xsds/toplink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<opm:name>EPCIS_ORM</opm:name>
</opm:class-mapping-descriptor>
<opm:class-mapping-descriptor xsi:type="toplink:relational-class-mapping-descriptor">
<opm:class>oracle.epcis.common.model.TransactionEvent</opm:class>
<opm:alias>SdmTransactionEvent</opm:alias>
<opm:primary-key>
<opm:field table="SDM_TRANSACTION_EVENT" name="TRANSACTION_EVENT_ID" xsi:type="opm:column"/>
</opm:primary-key>
<opm:events xsi:type="toplink:event-policy"/>
<opm:querying xsi:type="toplink:query-policy"/>
<opm:attribute-mappings>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>action</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT" name="ACTION" xsi:type="opm:column"/>
</opm:attribute-mapping>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>bizLocation</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT" name="BIZ_LOCATION" xsi:type="opm:column"/>
</opm:attribute-mapping>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>bizStep</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT" name="BIZ_STEP" xsi:type="opm:column"/>
</opm:attribute-mapping>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>bizTransactionListId</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT" name="BIZ_TRANSACTION_LIST_ID" xsi:type="opm:column"/>
</opm:attribute-mapping>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>disposition</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT" name="DISPOSITION" xsi:type="opm:column"/>
</opm:attribute-mapping>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>eventTime</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT" name="EVENT_TIME" xsi:type="opm:column"/>
</opm:attribute-mapping>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>rawXmlEventId</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT" name="RAW_XML_EVENT_ID" xsi:type="opm:column"/>
</opm:attribute-mapping>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>readpoint</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT" name="READPOINT" xsi:type="opm:column"/>
</opm:attribute-mapping>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>transactionEventId</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT" name="TRANSACTION_EVENT_ID" xsi:type="opm:column"/>
</opm:attribute-mapping>
</opm:attribute-mappings>
<toplink:descriptor-type>independent</toplink:descriptor-type>
<toplink:instantiation/>
<toplink:copying xsi:type="toplink:instantiation-copy-policy"/>
<toplink:change-policy xsi:type="toplink:deferred-detection-change-policy"/>
<toplink:tables>
<toplink:table name="SDM_TRANSACTION_EVENT"/>
</toplink:tables>
</opm:class-mapping-descriptor>
<opm:class-mapping-descriptor xsi:type="toplink:relational-class-mapping-descriptor">
<opm:class>oracle.epcis.common.model.TransactionEventMap</opm:class>
<opm:alias>SdmTransactionEventMap</opm:alias>
<opm:primary-key>
<opm:field table="SDM_TRANSACTION_EVENT_MAP" name="CHILD_EPC_ID" xsi:type="opm:column"/>
<opm:field table="SDM_TRANSACTION_EVENT_MAP" name="TRANSACTION_EVENT_ID" xsi:type="opm:column"/>
</opm:primary-key>
<opm:events xsi:type="toplink:event-policy"/>
<opm:querying xsi:type="toplink:query-policy"/>
<opm:attribute-mappings>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>childEpcId</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT_MAP" name="CHILD_EPC_ID" xsi:type="opm:column"/>
</opm:attribute-mapping>
<opm:attribute-mapping xsi:type="toplink:direct-mapping">
<opm:attribute-name>transactionEventId</opm:attribute-name>
<opm:field table="SDM_TRANSACTION_EVENT_MAP" name="TRANSACTION_EVENT_ID" xsi:type="opm:column"/>
</opm:attribute-mapping>
</opm:attribute-mappings>
<toplink:descriptor-type>independent</toplink:descriptor-type>
<toplink:instantiation/>
<toplink:copying xsi:type="toplink:instantiation-copy-policy"/>
<toplink:change-policy xsi:type="toplink:deferred-detection-change-policy"/>
<toplink:tables>
<toplink:table name="SDM_TRANSACTION_EVENT_MAP"/>
</toplink:tables>
</opm:class-mapping-descriptor>
</opm:class-mapping-descriptors>
<toplink:login xsi:type="toplink:database-login">
<toplink:platform-class>oracle.toplink.platform.database.oracle.Oracle10Platform</toplink:platform-class>
<toplink:user-name>admin</toplink:user-name>
<toplink:password>77F9BCD65FABB4501B550455987A268F</toplink:password>
<toplink:sequencing>
<toplink:default-sequence xsi:type="toplink:native-sequence"/>
</toplink:sequencing>
<toplink:driver-class>oracle.jdbc.driver.OracleDriver</toplink:driver-class>
<toplink:connection-url>jdbc:oracle:thin:@localhost:1521:EPCIS</toplink:connection-url>
</toplink:login>
</toplink:object-persistence>
Thanks,
Mahima
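
One quick way to double-check from code that jdbc/epcisDS is really bound in OC4J, independently of TopLink, is a plain JNDI lookup from something deployed to the same container. A minimal sketch (the helper class is made up; only the JNDI name comes from the files above):
import java.sql.Connection;
import javax.naming.InitialContext;
import javax.sql.DataSource;
public class EpcisDataSourceCheck {
    // Call this from a servlet or EJB deployed to the same OC4J instance;
    // the managed data source from data-sources.xml is only visible in the
    // container's JNDI tree, not from a standalone main().
    public static String check() {
        try {
            DataSource ds = (DataSource) new InitialContext().lookup("jdbc/epcisDS");
            Connection conn = ds.getConnection();
            try {
                return "OK, connected as " + conn.getMetaData().getUserName();
            } finally {
                conn.close();
            }
        } catch (Exception e) {
            return "Lookup/connect failed: " + e;
        }
    }
}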

Similar Messages

  • 11i EBS XML Publisher Report with Multiple Data Source

    I need to create an XML Publisher report in 11i EBS pulling data from another 10.7 EBS instance as well as 11i EBS in a single report.
    I am not allowed to create an extract or use db links.
    My problem is how to create a Data Source Connection using a Java Concurrent Program.
    The approach I am trying is:
    1. Create a Java concurrent program to establish a connection to the 10.7 instance.
    2. Write the SQL queries in a Data Template with two data sources: one for 11i EBS and one for 10.7 EBS.
    3. The template will show the data from both queries in one report.
    Is there any other way to proceed using the datasource API...
    thanks

    Option 1:
    The query should be the same at the detail level; only the template has to be different for summary and details.
    At runtime, the user can choose to see the detail or the summary.
    Disadvantage: if the data is huge, it can be slow. Advantage: only one report.
    Option 2:
    Create two separate reports, summary and details, with different data definitions and different layouts, kept as different reports.
    Advantage: the query is tailored to the report the user runs (summary or detail), so you can write an efficient query. Disadvantage: two reports (query/template) to be maintained.

  • SCCM 2012 cannot access distribution point as a data source for OS image

    I am trying to point a data source to one of our file servers and get the following error:
    I am guessing this is a permissions issue. The path is correct. This is also a distribution point and I have no trouble distributing content to it. The SCCM server is set to have local admin and network share rights to this file server. Any ideas?

    Of course - that would be too easy... had to ask though :)
    Hmm, sounds odd.  I'm guessing that since you're using it as a DP you're on Windows Server... You might try browsing there in your runline but instead of doing
    \\server\share\OS.WIM try doing
    \\Server.FQDN\Share\OS.WIM
    I've seen some odd issues where I need to specify FQDN but it's pretty rare.  I do remember an issue where my own account would not go through even though it had rights and I could navigate.  I used a different account and I think after a while
    it cleared up.  My guess is something got rebooted and it "righted the ship", so to speak.  I'm sorry I don't remember more; that particular client had multiple accounts, so I just switched to a different one, and by the time I had to do the same function
    again the issue had resolved itself, so I just moved on.

  • Using XML Publisher with plsql package data source?

    Hi,
    I have an HTML gantt chart which I create using a PL/SQL package and the htp.p procedure for output to a web page.
    I want to be able to print this to PDF and was hoping that XML Publisher may be an option for doing this. The data source for this, however, seems to be a SQL query or an XML feed.
    Can anyone provide any suggestions on this?

    With a Pipelined Table Function you can use a function in your FROM clause (so you'll have a normal SELECT, but in the background the data comes from a PL/SQL function):
    SELECT * FROM TABLE( <function_name> )
    Look here:
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:30404463030437
    http://download-uk.oracle.com/docs/cd/B19306_01/appdev.102/b14261/tuning.htm#sthref2335
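    To see what that looks like from a client, here is a minimal JDBC sketch; the function gantt_rows, its columns, and the connection details are all hypothetical. The point is just that the pipelined function sits in the FROM clause like a table, so anything that accepts a SQL query (XML Publisher included) can consume its output:
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    public class PipelinedFunctionDemo {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.driver.OracleDriver");   // pre-JDBC4 drivers need explicit loading
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//localhost:1521/ORCL", "scott", "tiger");
            try {
                Statement stmt = conn.createStatement();
                // The pipelined function is queried exactly like a table.
                ResultSet rs = stmt.executeQuery(
                        "SELECT task_name, start_date FROM TABLE(gantt_rows())");
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "  " + rs.getDate(2));
                }
            } finally {
                conn.close();
            }
        }
    }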

  • Using XML as a Data Source

    In http://download.oracle.com/docs/cd/E14571_01/bi.1111/e10540/datasource.htm#CHDBCAIJ it states you can use XML as your data source and mentions...
    URLs can include repository or session variables, providing support for HTTP data sources that accept user IDs and passwords embedded in the URL. For example:
    http://somewebserver/cgi.pl?userid=valueof(session_variable1)&password=valueof(session_variable2)
    I'm trying to pass this:
    http://somewebserver/cgi.pl?userid=valueof(NQ_SESSION.USER)
    but when I try that I can see in my web server logs that the content of the userid variable is 'valueof(NQ_SESSION.USER)' so it's not passing the results to the web server, just the text. Any ideas what's going on? Is my syntax correct? Thanks!

    Hey,
    There are a few things to look at when using session and repository variables within the RPD.
    One of the first documents you need to reference is the "What is the Syntax for Referencing Variable?" section within the OBI documentation. In addition, look at this section on how to use variables in the RPD: http://download.oracle.com/docs/cd/E14571_01/bi.1111/e10540/variables.htm#BIEMG3104
    Basically, you are trying to use NQ_SESSION.USER, which is the syntax for showing the user name in Presentation Services. Of course you can see this in your web server logs, because that is the correct syntax for the Presentation Services web tier, but not for the RPD where you define your XML source.
    You need to either try VALUEOF("USER"), or you are simply forgetting to surround the syntax with single quotes, like http://somewebserver/cgi.pl?userid='valueof(NQ_SESSION.USER)', to pull the system session variable into the XML URL as needed.
    Try those options and let me know if that works.
    Please mark this as the correct answer or helpful.
    Cheers,
    Christian
    http://www.artofbi.com
    Edited by: Christian Screen on May 8, 2011 10:32 AM

  • Performance Point Services data source pointing towards wrong datasource after production instance copied in development

    Hi
    We have copied our Project Server 2010 production databases to the development server. We have a few reports created using PerformancePoint Services, but when we try to open the reports, the data source is pointing towards the production data source instead of dev.
    We are getting some issues while executing the reports.
    Could you let me know the exact cause and the steps to rectify the issue?
    Thanks 
    Geeth If you feel that the answer which i gave you is Helpful please select it as Answer/helpful.

    As you have copied data from Prod to Dev, the reports are pointing to the production data sources and are unable to display data. Whenever we copy data from one environment to another, this kind of issue occurs.
    We need to modify the data source of all the reports manually so that the reports point to the dev data source.
    You have to open the data source for your reports and change it from prod to dev; then the reports will display the correct data.
    http://blogs.msdn.com/b/performancepoint/archive/2011/09/19/performancepoint-data-connection-libraries-and-content-lists-in-sharepoint.aspx
    http://www.networkworld.com/community/node/57687
    kirtesh

  • OLAP cubes from heterogeneous data sources using XML DB

    Hi,
    Q1: How can we create an OLAP cube (XML cube) from XML data sources using XML DB?
    Q2: How can we create OLAP cubes from warehouses and flat files, and convert these cubes into XML cubes?
    Q3: Is there any other tool (except Analytic Workspace Manager, AWM) which supports the construction of OLAP cubes in XML format from heterogeneous data sources?
    Edited by: user11236392 on Aug 21, 2009 3:50 AM

    Hi Stuart!
    Your understanding is partially correct. I am working on providing an architecture for XOLAP, and XML is one of my data sources.
    The idea is to generate uniform cubes from heterogeneous sources that can be integrated into a global cube. Instead of building Oracle OLAP cubes from all the sources (this work is already done), I want to generate an XCube from XML data sources using XQuery.
    On the other hand, if we have generated the Oracle OLAP cube from other sources like warehouses or flat files, I have to convert these Oracle OLAP cubes into XML cubes for uniformity. In a research paper I found that there is an operator, XCube, embedded in XQuery which converts the multidimensional data (cube) into an XML cube. I'm looking for the implementation of this operator in XQuery and how it works.
    Hope you understand my architecture, but if you still have some confusion, kindly give me your mail id and I will mail you the diagram of my architecture.
    thanks.
    saqib

  • Testing data sources in oc4j-ra.xml

    Is there a way to test the data sources in oc4j-ra.xml from the server? I have a server sitting inside a DMZ that has access back to internal databases. The DMZ has its own set of IP addresses and it has specific openings to the internal resources it needs. That is why I would like to test the connections from the server.

    Hi,
    You're right. I've downloaded JDev 10.1.2.
    The working version:
      <data-source name="jdev-connection-myconn"
                   class="com.evermind.sql.DriverManagerDataSource"
                   location="jdbc/myconnCoreDS" xa-location="jdbc/xa/myconnXADS"
                   ejb-location="jdbc/myconnDS"
                   pooled-location="jdbc/myconnPooledDS"
                   connection-driver="oracle.jdbc.driver.OracleDriver"
                   username="xxx" password="yyy"
                   url="jdbc:oracle:thin:@localhost:1521:mysid"
                   inactivity-timeout="30"/>
    and:
    DataSource ds = (DataSource) getInitialContext().lookup("jdbc/myconnDS");
    JDev generates the correct data-sources.xml if you create the data-sources.xml with its wizard (New Gallery/General/Deployment Descriptors/data-sources.xml). Then you can see in the properties dialog that the default setting is: Auto update data-sources.xml when running or deploying to OC4J.
    So when you first run the embedded OC4J, your data-sources.xml file will be updated with your connections.
    I read these useful things in Olaf Heimburger's blog :) (OC4J: Configuring DataSources, 2007-05-03)
    Regards,
    Kati

  • CAF and custom data source in CE 7.2?

    Hello,
    Is it possible in CE 7.20 to use the CAF framework with a custom JDBC data source instead of the system database?
    Best Regards
    Sebastian
    Edited by: Sebastian Lenk on Mar 2, 2011 5:25 PM

    Hi,
    In theory this should work, because even SAP and CAF do not break the way Java EE handles database persistence; in practice it is very difficult, probably impossible.
    CAF generates something like a Java EE 5 application with JPA 1.0 persistence, so it is in principle unaware of a system datasource. If you look at src/META-INF/persistence.xml, you find a jta-data-source called SAP/CAF_RT, which is an alias for the system datasource. This persistence.xml is regenerated every time you generate your CAF app, so your changes are lost.
    In NWA, you cannot change the alias SAP/CAF_RT to point to another JDBC datasource (at least I did not manage it - I got an error message).
    And there are about 25 CAF* tables in the schema of the system datasource which CAF will probably need and may expect in the same datasource as your app tables.
    In principle, you could try to create these CAF* tables in another database schema, and maybe it is somehow possible to let SAP/CAF_RT point to another data source by editing some of SAP's CAF EARs, but this sounds experimental.
    So I am quite sure CAF only supports the system datasource. CAF was not invented to support agility and degrees of freedom for the developer.
    Cheers
    --Rolf
    P.S. IMHO, with EJB 3.x and JPA it is not a good idea to use CAF for persistence at all. With EJB 3.x there is no longer a good reason to use it for local apps at all. CAF was useful in the EJB 2.x days, when development of persistent classes took a lot of boilerplate code. SAP updated the framework towards EJB 3 but kept the inflexible structure, whereas with JPA 1.0 it is easy to learn and write @Entity, @Column, EntityManager.find or EntityManager.persist. For example, the CAF way of handling relations between tables is a pain, and the data-access programming model is outdated. And the CAF way of model-driven development is too inflexible to support an agile development process in 2011. So use CAF only for small projects that will not grow.
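    To illustrate that last point, a small sketch of what plain JPA needs (the entity and its columns are made up): with an EntityManager, em.find(Customer.class, id) and em.persist(customer) replace the generated persistence layer.
    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.Id;
    // A made-up entity: this class plus an EntityManager is the whole
    // persistence layer - no generated CAF artifacts required.
    @Entity
    public class Customer {
        @Id
        private Long id;
        @Column(name = "NAME")
        private String name;
        public Long getId() { return id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }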

  • Register with ESB, data sources

    Hi, I have an ESB project to deploy on an integration server. It makes use of a database adapter. When I created that service, I connected to the database, and through the wizard it worked great. So far so good. From the wsdl file:
    Your runtime connection is declared in J2EE_HOME/application-deployments/default/DbAdapter/oc4j-ra.xml. These mcf properties here are from your design-time connection and save you from having to edit that file and restart the application server if eis/DB/myDBconn is missing. These mcf properties are safe to remove.
    The connection details then follow. Now, I would rather not use the generated connection profile in the wsdl, but rather point to a data source, like the text above explains. But how do I delete it, and how do I make the adapter use the integration server connection, here named "eis/DB/myDBconn"?

    From the Adapter configuration Wizard in Step 3, you choose the correct JNDI Name (like eis/DB/myDBConn).
    You just need to create a connection factory and a data source in the Application Server Console for your OC4J instance:
    - a connection factory named eis/DB/myDBConn (location) with xADataSourceName = jdbc/myDBConnDS
    - a data source named myDBConnDS with JNDI name jdbc/myDBConnDS
    Regards
    Stéphane

  • Configuring data-source

    Hi
    I'm trying to configure a datasource in JDev using this guide (part 4):
    http://doc.uni-klu.ac.at/doc/oracle/10g/as/b10464_03/web.904/b10322.pdf
    Anyway, the application consists of two projects. I've added an application.xml file with a <data-sources> element that points to my data-sources.xml. I simply edited the data source that was included in that guide to point to my database (connect string and credentials).
    However, when I run this java code,
    try {
        ic = new InitialContext();
        DataSource ds = (DataSource) ic.lookup("jdbc/OracleDS");
        Connection conn = ds.getConnection();
        this.dbConnection_ = conn;
    } catch (NamingException ne) {
        System.out.println(ne.getMessage());
    }
    it returns "jdbc/OracleDS not found". I've also noticed some other config files in my application folder that i'm not sure about(app_name-data-sources.xml etc.).
    Any help?

    Thanks Deepak. I tried both, neither one worked.
    The first time seemed close, this was the error: java:comp/env/jdbc/OracleDS not found in "Project Name"
    It picked up my project name at least. If I have a view-controller and model both, which project should the application.xml and data-sources.xml be in?
    When I tried this suggestion:
    Hashtable ht = new Hashtable();
    ht.put(Context.INITIAL_CONTEXT_FACTORY,"com.evermind.server.rmi.RMIInitialContextFactory");
    ht.put(Context.PROVIDER_URL,"ormi://localhost");
    InitialContext ic=new InitialContext(ht);
    I got a series of RMI errors. Connection refused.
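    For reference, that remote-context suggestion written out as a compilable sketch; the factory class is the one quoted above, while host, port, application name, and credentials are placeholders. Remote ORMI lookups normally need admin credentials, and "Connection refused" usually just means nothing is listening at the ormi:// host/port you gave:
    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;
    public class RemoteDsLookup {
        public static void main(String[] args) throws Exception {
            Hashtable env = new Hashtable();
            env.put(Context.INITIAL_CONTEXT_FACTORY,
                    "com.evermind.server.rmi.RMIInitialContextFactory");
            // Host/port must match the OC4J RMI listener (placeholder values).
            env.put(Context.PROVIDER_URL, "ormi://localhost:23791/myapp");
            env.put(Context.SECURITY_PRINCIPAL, "oc4jadmin");      // placeholder credentials
            env.put(Context.SECURITY_CREDENTIALS, "welcome1");
            Context ctx = new InitialContext(env);
            DataSource ds = (DataSource) ctx.lookup("jdbc/OracleDS");
            System.out.println("Lookup succeeded: " + ds);
        }
    }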

  • Using DAO using a JDBC data source with struts

    Hello,
    I have created a number of Data Access Objects and Transfer Objects to use in a non-EJB, struts web application I am developing. I had tested these with a kind of Service Locator custom class to provide access to a JDBC connection. My custom class is a bit clunky and not very configurable. I would like to use a data source defined in the struts config XML file, e.g.
        <data-sources>
            <!-- configuration for commons BasicDataSource -->
            <data-source type="org.apache.commons.dbcp.BasicDataSource">
                <set-property
                  property="description"
                  value="My MySQL Database Connection" />
                <set-property
                  property="driverClassName"
                  value="com.mysql.jdbc.Driver" />
                <set-property
                  property="url"
                  value="jdbc:mysql://localhost/databaseName" />
                <set-property
                  property="username"
                  value="myUsername" />
                <set-property
                  property="password"
                  value="xxxxxxxxx" />
                <set-property
                  property="maxActive"
                  value="10" />
                <set-property
                  property="maxWait"
                  value="5000" />
                <set-property
                  property="defaultAutoCommit"
                  value="false" />
                <set-property
                  property="defaultReadOnly"
                  value="false" />
             </data-source>
        </data-sources>
    This is great, and precisely the kind of thing I would like to use. However, this datasource is only available AFAIK through a HttpServletRequest instance like in the example I found below...
    public ActionForward execute(ActionMapping mapping, ActionForm form, HttpServletRequest request,
                                 HttpServletResponse response) throws Exception {
        javax.sql.DataSource dataSource = null;
        java.sql.Connection myConnection = null;
        try {
            dataSource = getDataSource(request);
            myConnection = dataSource.getConnection();
            // do what you wish with myConnection
        } catch (SQLException sqle) {
            getServlet().log("Connection.process", sqle);
        } finally {
            // enclose this in a finally block to make
            // sure the connection is closed
            try {
                if (myConnection != null)
                    myConnection.close();
            } catch (SQLException e) {
                getServlet().log("Connection.close", e);
            }
        }
        return (mapping.findForward("success"));
    }
    That would be great if I wanted to use the database connection anywhere near a struts Action (wrong tier!). I want access like that to a data source in my DAOs. Is it possible for me to use the data-sources approach to access the DB from my DAOs, or will I need to use something like JNDI to do this in a similar way but separate from struts? If so, I have a big gap in my knowledge as far as JNDI goes which I need to fill, and that will be my next question.
    I'm relatively new to using patterns in Java and any help or pointers would be great.
    Thanks :)

    Create a JAAS Authentication Entry in the Server configuration.
    This should then appear in the drop-down when specifying your DataSource.
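    For the JNDI route mentioned at the end of the question, a minimal sketch of a DAO that obtains its DataSource from the container rather than from the struts request; the JNDI name, table, and DAO class are made up, and the data source itself would be defined in the servlet container or app server rather than in struts-config.xml:
    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import javax.naming.InitialContext;
    import javax.naming.NamingException;
    import javax.sql.DataSource;
    public class CustomerDao {
        private final DataSource dataSource;
        public CustomerDao() throws NamingException {
            // "java:comp/env/jdbc/MyDS" is a placeholder; it must match a
            // <resource-ref> in web.xml or a container-level data source,
            // not the struts-config.xml <data-sources> element.
            InitialContext ctx = new InitialContext();
            this.dataSource = (DataSource) ctx.lookup("java:comp/env/jdbc/MyDS");
        }
        public int countCustomers() throws SQLException {
            Connection conn = dataSource.getConnection();
            try {
                Statement stmt = conn.createStatement();
                ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM customers");
                rs.next();
                return rs.getInt(1);
            } finally {
                conn.close();   // returns the connection to the pool
            }
        }
    }
    This keeps the JDBC plumbing in the persistence tier, so the Actions only ever call DAO methods.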

  • How do I modify invoice request xml file by adding posting date?

    Hi,
    We import customer invoice requests via xml files from an external data source. Currently the standard SAP xml file does not include the posting date, and invoices enter SAP with a blank posting date. When the invoice is released, the posting date is taken from the invoice date.
    Due to our month end processes, we have hundreds of invoice requests every month where we do not want posting date to equal the invoice date, and for each of those invoices the posting date is manually entered one invoice at a time during the release process. This is very time consuming.
    We would like to build functionality in our external system to create the posting date at the xml file generation stage. Could anyone let us know the following:
    - what is the name of the posting date field on invoice requests (invoice documents)?
    - where would we place the additional script in the xml file?
    I'm attaching one of our current xml files (which already contains one section that has been customized)
    Your suggestions would be appreciated.
    Thanks,
    Kerstin

    Kerstin,
    The closest I could find in the WSDL of the Manage Invoice Request Web Service that handles this integration is "<ProposedInvoiceDate>" or possibly even "<ProposedDeviatingPostingdate>", both of which sit directly under the <CustomerInvoiceRequest> element.
    For more information, go to the Service Explorer, find the ManageInvoiceRequestsIn Web Service, download the WSDL, and open it in SOAP-UI or something similar. This way you can see all the fields that you can write to, which is where I found these two elements.

  • Access enforcer and User Data Source for HR

    We are on Access Enforcer 5.2 - service pack 2:
    My problem is that when creating a new request in AE, I am able to get a list of all users when I point my User Data Source to either SAP or UME. However, when I attempt to create a request whilst pointing the User Data Source at the SAPHR system, I do not get any users back (and we have users set up in the SAP HR system).
    I’ve changed the connector to ‘YES’ under the HR System box, I’ve changed the Data Source Type and Details Source Type to point at the SAPHR and still it fails to fetch any users.
    I've tried looking at the log, but can't get much out of it.
    I would appreciate it, if anyone could provide any assistance.
    Thank you in advance.
    Amarjit
    Message was edited by:
            amarjit singh

    Hi Micheal,
    Thanks for your reply.
    I'm pointing both Data Source Type and Details Source Type to the same system SAPHR and to the same system name (which is our dev system)
    Regards,
    Amarjit

  • Data source was activated and replicated but not showing up in RSA7.

    Hello,
    Data source was activated and replicated but not showing up in RSA7.  At what point does the data source appear in the Delta Queue?
    Thanks

    Hi,
    For LO, LIS, generic, and FI data sources, delta records come from the delta queue.
    If you run the INIT in BW, whether it succeeds or not, the delta queue will be maintained in RSA7, and you can check the records in RSA7 or SMQ2.
    When the init request goes to R/3, it maintains the delta queue in RSA7.
    Assign points if it helps,
    thanks,
    pavan.
