Logical Data Service

Hi all,
I'm trying to create a logical data service that uses two physical data services (one stored procedure and one table).
I want to create a join between my stored procedure and my table, but I get this error message:
"Cannot create relationship from/to any data service other than entity data service. One or both of the data services chosen are not an entity data service."
What I need to do is:
I have values A and B as input to my logical data service.
I need to use A and B to get another value, C, from my physical data service (type: relational table).
C will then be the input to my other physical data service (type: stored procedure). The return of this procedure will be mapped to the XSD file that represents my entity.
Can anybody help me?

If I understand correctly, you need to satisfy conditions on parameters A and B against your table, so you can extract C as a result from the table and use it as a parameter to your stored procedure, returning the stored procedure's result.
In XQuery this kind of join looks loosely like this:
for $t in ns1:TABLE()
where $t/column1 eq $A
  and $t/column2 eq $B
return
    for $s in ns2:STOREDPROC($t/C)
    return
        <myresulttype>
            <field1>{ fn:data($s/column1) }</field1>
        </myresulttype>
The error message you have indicates that you are trying to create a relationship function. While it is possible to express joins through relationships, you can also do joins using the Query Map editor.
The chapter "[Create Your First Data Services|http://download.oracle.com/docs/cd/E13162_01/odsi/docs10gr3/datasrvc/Create%20Your%20First%20Data%20Services.html#CreateYourFirstDataServices-CreatingaLogicalDataService]" describes how to create a join graphically using the Query Map. The only difference in your case is that your join line (explained in "Setting up a Join Condition" in the docs) would be mapped to the parameter of the stored procedure.
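For illustration, the read function in your logical data service could wrap that join and take A and B as parameters. This is only a sketch: tns:getMyEntity, tns:MyEntity, the xs:string parameter types, and the physical function names are placeholders for your actual schema elements and physical data service functions.
declare function tns:getMyEntity($A as xs:string, $B as xs:string) as element(tns:MyEntity)* {
    for $t in ns1:TABLE()
    where $t/column1 eq $A
      and $t/column2 eq $B
    return
        for $s in ns2:STOREDPROC($t/C)
        return
            <tns:MyEntity>
                <field1>{ fn:data($s/column1) }</field1>
            </tns:MyEntity>
};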
Edited by: alexkotopoulis on Jan 16, 2009 1:56 PM
Edited by: alexkotopoulis on Jan 16, 2009 1:57 PM

Similar Messages

  • User Defined Transaction in Logical Data Service

    Hi
    I have a few physical data services created from database stored procedures, and corresponding logical services for each of the physical data services.
    Now I've created a logical data service that calls 2 stored procedures, and I'm trying to update these 2 stored procs in a single transaction.
    Also, I don't have a read operation in my logical data services. In this case both stored procs should take part in a single transaction and should commit or roll back together on success/failure.
    I tried testing this scenario and it fails. ALDSP was able to update the first stored proc, but there was an error in the second stored proc, so it should have rolled back the entire transaction.
    Is there any way that we can have a user-defined transaction in a logical data service?
    My Scenario:
    InsertData(RequestA, RequestB) as response
    res1 := call First Stored Proc (RequestA);
    res2 := call Second Stored Proc (RequestB);
    response := res1 + res2
    So my question is: is it possible to define a user transaction before making the calls and have commit/rollback defined around it?
    Appreciate your help!
    Thanks
    Pash

    I have assumed that you realize you need XA drivers for this. What is the exception you see? Something about "mixed outcome"?
    "But in the case of a stored procedure, we can't drag & drop the physical ds into the logical ds. Why?"
    Since your stored procedure updates, it should be defined in the physical ds as a "procedure", not a "function" - it sounds like it is. ("Procedure" means it will update, "function" means it will not - not that it is a "stored procedure".) Because it updates, it cannot be called by an XQuery function, because XQuery is by definition read-only. It can only be called directly from the client, or from another procedure (which is XQSE, not XQuery). If you created a logical ds with a "procedure", you could drop your procedure into that.
    "I created a new library procedure in which I am calling 2 physical ds, assuming it will be executed in a single transaction. But it does not. Please verify."
    I won't verify it, because it is not correct. It should be treated as though it is one transaction, and barring the bugs with JNDI/data source names and multi data sources, I do not know how it would not be. So either there is a misunderstanding or a new bug.
    Either way, you need to open a case with customer support to resolve this. When you open your case, the more information you provide (i.e. the smallest possible dataspace that reproduces the problem, along with the DDL to create the stored procedures and the tables that they use, plus any/all exceptions/stack traces and server logs), the faster it can be resolved.
    - Mike
    ps. there are no means to explicitly create a transaction in ODSI.

  • Using a schema from a different Logical Data Service

    Is it possible to use a schema from a different Logical Data Service in a new Logical Data Service? I want to maintain just one version of the schema.

    "When I try to associate a schema type, the 'Select Schema Type' dialog appears. There the scope is set to Project. Next to it is 'All projects'; is there a way I can get that un-greyed?"
    I don't think so. I assume that is a remnant from ALDSP 2.x, when we had the notion of an ALDSP application that contained multiple ALDSP projects and there was visibility across projects (actually just folders) in the same application.
    "Can I use a schema associated with another dataspace project?"
    No. The contents of a dataspace are not accessible from outside the dataspace other than what is provided by the APIs. And although the schemas could be accessed via catalogservices, this isn't a standard protocol (like http://), so schema processors won't be able to use them.
    You could host your schemas in a web application and expose them via http: create a Dynamic Web Project and put your schemas where the content gets served from (i.e. alongside index.html), and they would then be accessible via http://<yourserver>:<port>/yourWebApp/yourSchema.xsd. All the schemaLocations in the imports would need to be relative. It would probably be easier to create your schema-hosting application first, then build your data services.
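    For illustration only, a data service prolog could then reference the hosted schema with an ordinary XQuery schema import; the namespace URI, server, port, and file name below are placeholders, not real values:
    import schema namespace cust = "http://example.com/customer"
        at "http://yourserver:7001/yourWebApp/yourSchema.xsd";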

  • Error while executing logical data service:XP0006

    Hi,
    I have one physical data service, caps.ds, based on an Oracle database. I created a method in caps.ds named getCap(String param); basically this method fetches data from the database based on the input parameter. This method works fine in the physical data service. Now I have created a logical data service named Test.ds from this physical data service, and added a method getCaps(String param) to Test.ds. If I try to test this method in Workshop I get the following error:
    System error
    weblogic.xml.query.exceptions.XQueryTypeException: ld:NewDSProj/Test.ds, line 26, column 5: {err}XP0006: "element {ld:NewDSProj/Caps}Caps { {http://www.w3.org/2001/XMLSchema}anyType }": bad value for type element {ld:NewDSProj/Test}Test { {http://www.w3.org/2001/XMLSchema}anyType }
    Even in the Query Plan it shows a TypeError in red.
    I am attaching caps.ds, Test.ds, the required .xsd file, and the server log file for your reference.
    Any suggestions please...
    Regards,
    Suresh Varma.

    Hi,
    I changed the return type in Test.ds to caps.xsd and it is working fine now. Thank you.
    Now I have the following data services:
    1. MRM_RESPONSE.ds - created from a Java method. This Java method reads the data from XML.
    2. Caps.ds - created for the Oracle database.
    Now I have created a logical data service named Test.ds with the function newFunction(String param). My intention is to get consolidated output from both physical data services based on a join condition. I have the state field in both physical data services, so the output of newFunction() in Test.ds should be a combination of Caps and MRM_RESPONSE. In the Query Plan for newFunction() I do not get any error, but if I try to test it I get the following error:
    System error
    weblogic.xml.query.exceptions.XQueryTypeException: {bea-err}TYPE003 [{bea-err}TYPE003a]: Runtime Type Mismatch: got an xdt:untypedAtomic value
    I am attaching the latest code and the server log file for your reference. In the attached zip file, NewSchemas is the schema in the Schemas project; JavaClient contains the Java method.
    Thanks & Regards,
    Suresh Varma.

  • Problems with logical data service

    This is related to thread 837141 (Problems creating physical data service to relational table)
    I have been able to create the logical data service that uses the physical service (mentioned above).
    However when I try to create a library function to update (shown below)
    declare function lper:update($new as element(per:Persistance2)) as xs:boolean {
        let $changed :=
            for $n in $new
            let $old := lper:read($new/Project, $ew/Property)
            return
                fn-bea:replace-value(fn-bea:changed-element($old), "Value", $n/Value)
        let $return := pper:update($changed)
        return fn:true()
    }
    Workspace Studio gives the following error for the "let $return := pper:update($changed)" line above:
    {bea-err}FUNC002: Illegal use of side-effect procedure {ld:Physical/Persistance2}update with arity 1. Side-effect procedures cannot be invoked from a side-effect free context
    Any idea why?

    Functions, by definition, do not have side effects, so calling something that has a side effect (a procedure) is not allowed.
    You need to create a library procedure.
    Your
    declare function lper:update($new as element(per:Persistance2)) as xs:boolean {
    needs to be
    declare procedure lper:update($new as element(per:Persistance2)) as xs:boolean {
    - mike
    Edited by: mikereiche on Dec 12, 2008 5:10 PM
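    Putting Mike's answer together with the original function, the corrected library procedure would look roughly like this. It is a sketch only: it assumes $ew in the original was a typo for $new, and that pper:update is the physical data service's update procedure.
    declare procedure lper:update($new as element(per:Persistance2)) as xs:boolean {
        let $changed :=
            for $n in $new
            let $old := lper:read($new/Project, $new/Property)
            return
                fn-bea:replace-value(fn-bea:changed-element($old), "Value", $n/Value)
        let $return := pper:update($changed)
        return fn:true()
    };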

  • Newbie question about Logical Data services

    Let's say we want to create a logical data service that retrieves data from several web services and aggregates the results. When a client requests data from the logical data service, a number of web services get invoked. The response times of these web services can differ, and we do not want to keep the client app waiting for long. So, is it possible to define a cut-off time in the data service, after which the client gets the results obtained up to that point? Alternatively, does ALDSP provide support for clients that poll a data service to check for results of a previously submitted query?
    Will appreciate your help in understanding capabilities of ALDSP and approach taken by you in similar situations.

    "...is it possible to define a cut-off time in the data service, after which the client gets the results obtained up to that point?"
    See the fn-bea:timeout() function in the ALDSP documentation. It will do exactly what you want for web services. Note that adding the function to a query will of course alter the query plan - which will have little or no effect for web services, but for database access it may prevent some optimizations.
    In DSP 3.0, you will be able to use the hasNext()/next() form of the mediator API, and the client will only block while the first and subsequent items are being retrieved.
    If you want to call DSP asynchronously, I believe ALSB allows you to create an asynchronous proxy service (JMS, for instance).
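    For illustration only, the timeout could be applied around the web service call roughly as below. The (expression, milliseconds) argument order and the ws1:getQuotes function shown here are assumptions, so check the fn-bea:timeout() entry in the ALDSP documentation for the actual signature:
    declare function tns:getQuotesWithCutoff($symbol as xs:string) as element(tns:Quote)* {
        (: assumed form: the wrapped expression plus a timeout in milliseconds - verify against the docs :)
        fn-bea:timeout(ws1:getQuotes($symbol), 5000)
    };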

  • ALDSP , Logical Data service to PhysicalData service mapping

    Hi ,
    I need help finding a way to map a logical data service to a physical data service using a config file.
    My requirement is that, depending on the Region (a parameter), I need to connect to a different database. How do we achieve this using an XML file (i.e. dynamically, without a code change)?
    For example:
    Say we have 3 region names (DAO, APJ, CCC). I have created 3 physical data services, one for each region (each pointing to a different database).
    Then in the logical service I have logic like this:
    if (region == 'DAO')
        call the DAO physical service
    if (region == 'APJ')
        call the APJ physical service
    if (region == 'CCC')
        call the CCC physical service
    So if I have to point region 'DAO' to the APJ physical service, it requires a code change. How can we avoid this hard-coded mapping? Is there any way to define the region-to-physical-service mapping in an XML file, so that any change is just a change to the XML file, without a code change?
    Please help me; this is a little bit urgent.
    Thanks in advance.
    Thanks & Regards
    Sarath

    It seems that if you are adding more regions and physical data services you will need to modify your dataspace anyway to add the new physical data services.
    Please post your ODSI/ALDSP questions on the ODSI forum - Data Service Integrator
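    That said, if the goal is only to re-point an existing region at a different existing physical service, the region-to-target mapping can be read from an XML document at query time. This is a sketch only, assuming a mapping file reachable by fn:doc() (for example one hosted over http) and hypothetical physical functions dao:getData(), apj:getData() and ccc:getData():
    declare function tns:getDataForRegion($region as xs:string) as element(tns:Data)* {
        (: mapping file shaped like <regions><region name="DAO" target="APJ"/>...</regions> :)
        let $entry := fn:doc("http://yourserver:7001/config/regionMapping.xml")/regions/region[@name eq $region]
        let $target := if (fn:exists($entry)) then xs:string($entry/@target) else $region
        return
            if ($target eq "DAO") then dao:getData()
            else if ($target eq "APJ") then apj:getData()
            else ccc:getData()
    };
    The if/else over the target name is still needed, because XQuery 1.0 cannot invoke a function chosen from a runtime string; only the region-to-target mapping moves out of the code.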

  • Error when calling create and update functions on logical data service

    Hi There,
    I receive the following error when trying to call the createCustomerDetailResponse method from our dataservice. Note that the operation referenced in the error is createCustomerDetail rather than the actual method name 'createCustomerDetailResponse'. I've attached a stack trace as well as a copy of our .ds & code. Thanks, any help would be greatly appreciated!
    mlns:bea_fault="http://www.bea.com/servers/wls70/webservice/fault/1.0.0">javax.xml.rpc.JAXRPCException: Can not
    locate the operation: {ld:CustomerMaster/Logical/CustomerDetail_ws}createCustomerDetail
         at com.bea.dsp.ws.WssInboundHandler.operationLookup(WssInboundHandler.java:252)

    I think there is a definitely a problem with my ALDSP webservice. I deleted and regenerated my .ws file, and when I test the createCustomerDetailResponse webservice operation directly through the ALDSP IDE I get the same error.
    createCustomerDetailResponse Request Summary
    Arguments: [complex type]
    Fault: Failed to process inbound requestFailed to retrieve operation from SOAP bodyCan not locate the operation: {ld:CustomerMaster/Logical/CustomerDetail_ws}createCustomerDetail
    Submitted: Wed Mar 05 16:37:53 EST 2008
    Duration: 9 ms

  • Schema Mapping in Data Service Integrator

    Hi,
    I'm currently evaluating some schema mapping programs such as BizTalk, Altova and IBM Rational Data Architect, and have now found Oracle Data Service Integrator, which might be a similar tool. Actually, I found out about Data Service Integrator via BEA AquaLogic, which was presumably a schema mapping tool before Oracle acquired it and now offers it as Data Service Integrator.
    My question is whether Oracle Data Service Integrator is really suitable for schema mapping/matching, such as creating mappings between XML, CSV/flat files or database schemas. I have already downloaded it and tried it out, but had trouble creating a map. According to a tutorial you create a physical data service if you want to do something like mapping, but after I did this there was only a "map" with a source schema. There was no way to add a target schema and map it to the source schema.
    So can I create mappings in Data Service Integrator, or are there other products that would be more convenient (Oracle Warehouse Builder, for instance)? If so, does anyone know of a good tutorial on how to map simple schemas, such as XML files, in Oracle Data Service Integrator?
    Thank you in advance.

    After you create your physical data services, create a logical data service using your target schema as the 'return type'. Then add functions and use the XQuery mapper to map your physical data services (CSV, database, XML, web service, etc.) to your target schema. You can also use logical data services as the input to another logical data service.
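    For illustration, the mapping function built in the XQuery mapper would look roughly like the sketch below; tgt:Customer, the two physical source functions, and the field names are placeholders rather than actual generated code:
    declare function tns:getCustomer() as element(tgt:Customer)* {
        for $db in ns1:CUSTOMER()
        for $file in ns2:CUSTOMER_FILE()
        where $db/CUST_ID eq $file/CUST_ID
        return
            <tgt:Customer>
                <Name>{ fn:data($db/NAME) }</Name>
                <Segment>{ fn:data($file/SEGMENT) }</Segment>
            </tgt:Customer>
    };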

  • Result set not visible in Test View for Physical data service

    I have configured an MS SQL Server stored procedure as a data service. It returns a result set. I am calling this physical data service from a logical data service. I am able to see the returned result set in the Test view (in Workshop) of the logical data service, but not in the Test view of the physical data service. Is this a problem within Workshop?
    Any help is appreciated.

    Never heard of this before. Can you turn on Auditing of XQuery parameters and results and then show me the audit information so I can see you are calling both the logical and the physical data service with the same arguments?
    It would also help to see both the logical and physical ds and their schemas. Just zip and post the whole dsp project if possible.

  • Data Services communication with SAP - physical/logical server name

    I'm having trouble connecting Data Services to SAP.
    Environment details:
    Data Services version 12.2.1.2 on Windows 2008 SP2
    ERP and SRM on Windows 2008 (database SQL Server 2008 SP1)
    Repository on SQL Server 2008 SP1
    All servers (SAP and Data Services) have had logical system names defined over the physical system names, and they use these logical names to communicate with each other. The Data Services server therefore has two names (logical and physical) and two corresponding IP addresses.
    The problem we're having is that when the SAP server communicates back to the Job Server, it needs to use the logical server name and corresponding IP address. Currently it is using the physical server name and IP address, causing communication failures.
    When I run the host_name() function within Data Services it returns the physical host name.
    Is there a way to determine how Data Services identifies itself and whether this can be manipulated?  Is it possible to configure Data Services to run in a logical environment?
    Any thoughts or comments would be appreciated. Thanks.

    We looked at this from another angle. We were using the Direct Download transport method, which processes in the foreground and uses the SAPGUI for communication. It seems it was the way SAPGUI communicates with the SAP server on Data Services' behalf that was causing the problem.
    We managed to motivate for a share to be set up between the SAP server and the Data Services box, which allows us to use the Shared Directory transport method. This also allows for background processing, eliminating the part of the SAPGUI communication that was causing the problem.

  • Data Services job rolling back Inserts but not Deletes or Updates

    I have a fairly simple CDC job that I'm trying to put together. My source table has a record type code of "I" for Inserts, "D" for deletes, "UB" for Update Before and "UP" for Update After. I use a Map_CDC_Operation transform to update the destination table based on those codes.
    I am not using the Transaction Control feature (because it just throws an error when I use it)
    My issue is as follows.
    Let's say I have a set of 10,000 insert records in my source table, and record number 4,000 happens to be a duplicate of record number 1. The job processes the records in order starting with record 1 and begins happily inserting records into the destination table. Once it gets to record 4,000, however, it runs into a duplicate key issue, my try/catch block catches the error, and the dataflow exits. All records that were inserted prior to the error are rolled back in the destination.
    But the same is not true for updates or deletes. If I have 10,000 deletes and 1 insert in the middle that happens to be an insert of a duplicate key, any deletes processed before the insert are not rolled back. This is also the case for updates.
    And again, I am not using Transaction Control, so I'm not sure why the inserts are being rolled back but, more curiously, the updates and deletes are not. I'm not sure why there isn't a consistent result regardless of the type of operation. Does anyone know what's going on here, or what I'm doing wrong / what my misconception may be?
    Environment information: both source and destination are SQL Server 2008 databases and the Data Services version we use is 14.1.1.460.
    If you require more information, please let me know.

    Hi Michael,
    Thanks for your reply. Here are all the options on my source table:
    My Rows per commit on the table is 10,000.
    Delete data table before loading is not checked.
    Column comparison - Compare by name
    Number of loaders - 1
    Use overflow file - No
    Use input keys - Yes
    Update key columns - No
    Auto correct load - No
    Include in transaction - No
    The rest were set to Not Applicable.
    How can I see the size of the commits for each opcode? If they are in fact different from my Rows per commit (10,000) that may solve my issue.
    I'm new to Data Services so I'm not sure how I would implement my own transaction control logic using a control column and script. Is there a guide somewhere I can follow?
    I can also try using the Auto correct load feature.  I'm guessing "upsert" was a typo for insert? Where is that option?
    Thank you very much!
    Riley

  • Table based on one data service but filtered by a second data service

    Hi,
    I'm fairly new to VC and have gotten comfortable with a few simple scenarios. However, I'm now trying to apply some fundamental logic that must be possible and, I'm guessing, straightforward.
    I have a data service that contains document type master data coming from system A. I have a second data service that supplies a list of the document types a particular user is able to view; this comes from system B. I want to display the document type master data in a table, but only for the document types the user is allowed to see. Both data services contain a document type code.
    If anyone can point me in the right direction I would really appreciate it. We want to use VC on my current project and this is the last piece of functionality I'm trying to prototype.
    Many Thanks
    Gary

    Hi,
    I am not sure whether you have tried to achieve your requirement using the intersect operator or not. I haven't tried it before, but you can give it a shot.
    Regards,
    Murtuza

  • How to call stored procedure in data services?

    Hi, all,
    I'm a newbie and am trying to evaluate Flex for our development environment. We use a lot of Oracle stored packages in the database to process the business logic. However, I can't seem to find a good example of calling a stored procedure in Flex Data Services.
    Here is a part of my data-management-config.xml:
    <destination id="oracle2">
        <adapter ref="java-dao" />
        <properties>
            <use-transactions>true</use-transactions>
            <source>flex.data.assemblers.SQLAssembler</source>
            <scope>application</scope>
            <metadata>
                <identity property="GKEY"/>
            </metadata>
            <network>
                <session-timeout>20</session-timeout>
                <paging enabled="false" pageSize="10" />
                <throttle-inbound policy="ERROR" max-frequency="500"/>
                <throttle-outbound policy="REPLACE" max-frequency="500"/>
            </network>
            <server>
                <database>
                    <driver-class>oracle.jdbc.driver.OracleDriver</driver-class>
                    <url>jdbc:oracle:thin:xxxxxxxxxxx:1521/racdev</url>
                    <login-timeout>15</login-timeout>
                </database>
                <actionscript-class>History</actionscript-class>
                <fill>
                    <name>by_nbr</name>
                    <sql>select to_char(POSTED,'dd-Mon-yyyy hh:mi:ssAM') POSTED, GKEY, EQUSE_GKEY, WTASK_ID, VSL_ID, VOY_NBR, STATUS from equipment_history where eq_nbr = #sNbr# order by posted desc</sql>
                </fill>
            </server>
        </properties>
    </destination>
    How do I call a stored procedure named pk_equipment_history.get(eq_nbr varchar2, o_resultset out sys_refcursor) in place of the SQL select in the above file? The o_resultset output parameter will return exactly the same result as the SQL. How do I bind the output parameter?

    Hi,
    Your question isn't related to Java Programming and should be asked in a [Hibernate forum|http://forum.hibernate.org/]
    Kaj

  • Question - new to BO Data Services Designer - how to automatically transform varchar column data to upper(varchar) across all columns in a table?

    Hello -
    New user to BO Data Services Designer. Company is using Data Services Version 12.2.
    I have many tables that all need the same transformation: converting varchar fields to upper(varchar) fields.
    Example:
    I have a table called Items. It has 40 columns, 25 of which are of type varchar.
    What I have been doing is:
    I make the Items table the source table, then create a Query transform that is attached to a new target table called ITEMS_Production.
    I can manually drag and drop columns from the source table to the query, and then in the mapping area I can manually type in upper(table_name.column_name), or select the upper function and then pick the table.column name from the function drop-down list.
    Obviously, I want to do this more quickly, as I have to do it for lots and lots of tables.
    How can I set up Data Services so that I can drag and drop an upper transform quickly and easily, or automate this process?
    I also know Python-Java so am happy to script something if need be.
    The logic would be something like:
    if there is a column of type varchar, add the upper() transformation to the column, then go to the next column.
    Excited to be using this new tool-
    Thanks in advance.
    Ray

    Use the DS Workbench.
