Query on Backend Plug-In

Hi All
It is suggested in the MAM configuration guide that the backend should have "Plug-In 2004.1 SP01". If there is a newer version of the Plug-In that fits here, please suggest it.
Thanks
Gopi

Dear
I am not sure whether you can change the batch selection logic for concurrent production orders. As far as I know, batch selection is done via the batch search strategy / sort rule.
Explore SAP Help: http://help.sap.com/printdocu/corePrint46c/en/data/pdf/LOBM/LOBM.pdf
Refer to this useful thread from an expert: Userexit in Batch determination and external number ranges for batches
Regards
JH

Similar Messages

  • FAST Search - Unable to Complete Query on Backend resource limit temporarily exceeded code:1013

    Hi,
    We are using FAST Search in our SharePoint 2010 environment. I am encountering an issue when navigating to the 21st page, i.e. when the query start value is 200 or above (URL -> /Pages/results.aspx?k=1&s=All&start1=201), in the search core results web part. The error message in the UI is "The search request was unable to execute on FAST Search Server".
    I have found the below log in the ULS.
    SearchServiceApplication::Execute--Exception: Microsoft.Office.Server.Search.Query.FASTSearchQueryException: Unable to complete query on backend Resource limit temporarily exceeded Code: 1013
    at Microsoft.Office.Server.Search.Query.Gateway.FastSearchGateway.GetQueryResult(IQuery query)
    at Microsoft.Office.Server.Search.Query.Gateway.FastSearchGateway.ExecuteSearch(SearchRequest request, String queryString)
    at Microsoft.Office.Server.Search.Query.Gateway.AbstractSearchGateway.Search(SearchRequest request)
    at Microsoft.Office.Server.Search.Query.FASTQueryInternal.ExecuteSearch(SearchRequest request, ResultTableCollection rtc)
    at Microsoft.Office.Server.Search.Query.FASTQueryInternal.Execute(QueryProperties properties)
    at Microsoft.Office.Server.Search.Administration.SearchServiceApplication.Execute(QueryProperties properties)
    Please share your thoughts or advise if someone has come across this issue. Thanks!

  • CS5 - Flex UI plug-in using a C++ model backend plug-in

    Hi,
    I'm new to InDesign CS4/CS5 plug-in development and have been given the task of coming up with a plug-in equivalent to one done for QuarkXPress 8.
    In the CS4 SDK there were two samples for the FlexUIStroke plug-in. One was a pure Flex/scripting sample and the other was a combination of Flex and C++. The latter was useful for showing how C++ source code could interface with Flex. In the CS5 SDK (build 335) the C++ sample has been removed.
    The Flex/scripting sample has been rewritten for use with CS Extension Builder under Flash Builder 4. I've quickly tested this using CSEB and it is a really handy tool for creating, installing and debugging InDesign extensions using Flex panels.
    In CS5, Adobe state the importance of model/UI separation as the former permits multiple threads. By this I assume that a separate thread in the model plug-in can manipulate a document. I like the idea purely as a means of developing code where the UI is separate from a backend which loads, saves and manipulates documents. The UI can be reinstalled without changing or reinstalling the backend.
    My project requires:
    A panel-based user interface (using Flex).
    Sockets code to listen for messages from a browser to load and save documents etc - therefore a multi-thread solution.
    Code to add text and images to the document via the plug-in user interface and report back any problems to the user interface - two-way communications.
    It sounds straightforward - as it was in QXP. But here's the crunch: is it possible to have a UI plug-in developed using CSEB and a model-compliant (not using WidgetBin.lib) backend developed using C++? At the moment I'm thinking I'm stuck with a "CS4" approach (a single plug-in without CSEB) for the project.
     

    This morning I added some code to a C++ plug-in to enumerate the plug-ins installed under InDesign. The aim was to see if LoadMovieFromResource() could work. I noticed that the CSEB plug-in was not listed. Then I realized that CSEB doesn't generate plug-ins but extensions. Therefore the last few days have been a waste.
    The CS SDK states that:
    It is possible to combine a Creative Suite extension with a C++ plug-in to create a hybrid extension; the tech note provides more information about how to do this.
    I haven't found the tech note!
    I've lost too much time on this, so my project will revert to two C++ plug-ins (one UI and one model) where the UI plug-in loads the SWF.
    Shame, I feel a real opportunity has been missed here to make plug-in development using Flex user interfaces really easy (especially debugging). Oh well, maybe a few SDK builds down the road, C++ plug-ins using Flex user interfaces will be documented better.

  • How can I customize the dam search query in backend?

    I am trying to customize the DAM search query in the CQ backend, but I cannot figure out where to do it.
    It seems that the search is performed by dam.assetsearch.json, but where can I edit it?
    Thanks.

    The following is the POST message sent to querybuilder.json when we trigger a search:
    p.hitwriter=full
    p.nodedepth=4
    mainasset=true
    type=dam:Asset
    fulltext=1111
    path=/content/dam
    path.flat@Delete=true
    orderby=path
    0_daterange.property=jcr:content/jcr:lastModified
    0_daterange.lowerBound=
    0_daterange.upperBound=
    1_group.property=jcr:content/metadata/dc:format
    1_group.p.or=true
    2_group.property=jcr:content/metadata/cq:tags
    2_group.p.or=true
    3_group.property=jcr:content/metadata/dam:scene7FileStatus
    3_group.p.or=true
    p.limit=15
    p.offset=0
    Actually, how are 0_daterange, 1_group, 2_group, etc. generated?
    If I just add a predicate, it is not converted into an x_group form like this, and hence the search fails.
    Any ideas on this?
    Thanks.
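    In case it helps, below is a minimal server-side sketch of building an equivalent query with the Java QueryBuilder API instead of editing dam.assetsearch.json. The numbered prefixes (1_group, 2_group, ...) are simply how QueryBuilder keeps several predicates of the same type apart, so a custom predicate needs its own number rather than being converted automatically. The dc:format value and the way the ResourceResolver is obtained here are assumptions for illustration only.
    import java.util.HashMap;
    import java.util.Map;
    import javax.jcr.Session;
    import org.apache.sling.api.resource.ResourceResolver;
    import com.day.cq.search.PredicateGroup;
    import com.day.cq.search.Query;
    import com.day.cq.search.QueryBuilder;
    import com.day.cq.search.result.Hit;

    public class DamSearchSketch {
        public static void search(ResourceResolver resolver, String fulltext) throws Exception {
            QueryBuilder builder = resolver.adaptTo(QueryBuilder.class);
            Session session = resolver.adaptTo(Session.class);

            Map<String, String> predicates = new HashMap<String, String>();
            predicates.put("type", "dam:Asset");
            predicates.put("path", "/content/dam");
            predicates.put("fulltext", fulltext);
            // Numbered group: the "1_" prefix keeps this group distinct from any other group predicate.
            predicates.put("1_group.p.or", "true");
            predicates.put("1_group.property", "jcr:content/metadata/dc:format");
            predicates.put("1_group.property.value", "image/jpeg"); // placeholder value
            predicates.put("p.limit", "15");
            predicates.put("p.offset", "0");

            Query query = builder.createQuery(PredicateGroup.create(predicates), session);
            for (Hit hit : query.getResult().getHits()) {
                System.out.println(hit.getPath()); // path of each matching asset
            }
        }
    }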

  • Selected Query Parameter backend error

    Hi, experts,
    I am getting a backend error whenever I add a parameter to the Selected Query Parameters of the ValueHelpQuery of an OVS.
         Can anybody please tell me why it is happening?
         Any help is needed and highly appreciated.
    Regards,
    May T.

    Hi, Horst,
    Thanks, I understand it.   
    I did not bind that N_AccountUUID to anything else. Do I have to?
    I always add a new field to the SearchParameters and then I bind it in the Selected Query Parameters.
    I don't bind it to anything else.
    Is that a problem?
    I just bound it with my BO data field and it shows "The data field is bound to query parameter".
    Regards,
    May T.

  • Process of Querying the Backend similar to Toad

    Hi
    I would like to know the process of querying the Oracle database, similar to Toad, in a Fusion environment.
    Could you please share some pointers?
    Thanks
    Santosh 

    Hi.
    Assuming this is part of your access (i.e. not SaaS), I'd recommend SQL Developer.
    Read more here
    Kind regards
    Richard
    FA Dev Relations

  • Backend query to find metrics with thresholds

    Hi,
    Is there a way to query the backend tables in the repository database to find the metrics with thresholds on a database? I want to find out the missing warning/critical thresholds for all the targets. Any input on this would be helpful.
    Thanks.

    You might want to create a User Defined Metric that runs a query against the SYSMAN schema to find out.
    Check out the Oracle® Enterprise Manager Extensibility Guide, 10g Release 5 (10.2.0.5):
    http://download.oracle.com/docs/cd/B16240_01/doc/em.102/b40007/toc.htm
    for a description of all the SYSMAN views from which to select your data.
    Regards
    Rob
    http://oemgc.wordpress.com
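    If you want to run the check directly against the repository rather than as a User Defined Metric, a rough JDBC sketch is below. It assumes the MGMT$METRIC_COLLECTIONS view and its WARNING_THRESHOLD/CRITICAL_THRESHOLD columns as described in the Extensibility Guide, plus placeholder connection details; verify the exact view and column names in the guide before relying on it.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class MissingThresholds {
        public static void main(String[] args) throws Exception {
            // Placeholder connect string and credentials for the OMS repository database.
            String url = "jdbc:oracle:thin:@//repohost:1521/EMREP";
            // Assumed repository view/columns -- confirm against the Extensibility Guide.
            String sql = "SELECT target_name, metric_name, metric_column "
                       + "FROM mgmt$metric_collections "
                       + "WHERE warning_threshold IS NULL AND critical_threshold IS NULL";
            try (Connection conn = DriverManager.getConnection(url, "sysman", "password");
                 PreparedStatement ps = conn.prepareStatement(sql);
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " / " + rs.getString(2) + " / " + rs.getString(3));
                }
            }
        }
    }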

  • Query cost is changing when used in a VO

    Hi All
    I have created the view with the below query
    SELECT component_item_name segment1,component_item_id inventory_item_id,organization_id,assembly_item_id
    FROM BOMFG_BOM_COMPONENTS WHERE organization_id = :1
    AND NVL(END_EFFECTIVE_DATE,SYSDATE ) >= SYSDATE
    AND EXISTS
    ( SELECT 1
    FROM apps.fnd_common_lookups fcl, mtl_system_items_b msi
    WHERE fcl.lookup_type = 'ITEM_TYPE' AND msi.item_type = fcl.lookup_code
    AND msi.inventory_item_id = component_item_id AND msi.organization_id = :2
    AND msi.item_type IN ('ATO','OPTIONAL FEAT')
    ) START WITH assembly_item_id = :3 CONNECT BY PRIOR component_item_id =assembly_item_id GROUP BY component_item_name,component_item_id,organization_id,assembly_item_id
    The cost of the above query is just 13.
    I attached this VO to one LOV on my page.
    When the query gets executed while opening the page, the OA Framework takes the above query as:
    select * from (SELECT component_item_name segment1,component_item_id inventory_item_id,organization_id,assembly_item_id
    FROM BOMFG_BOM_COMPONENTS WHERE organization_id = :1
    AND NVL(END_EFFECTIVE_DATE,SYSDATE ) >= SYSDATE
    AND EXISTS
    ( SELECT 1
    FROM apps.fnd_common_lookups fcl, mtl_system_items_b msi
    WHERE fcl.lookup_type = 'ITEM_TYPE' AND msi.item_type = fcl.lookup_code
    AND msi.inventory_item_id = component_item_id AND msi.organization_id = :2
    AND msi.item_type IN ('ATO','OPTIONAL FEAT')
    ) START WITH assembly_item_id = :3 CONNECT BY PRIOR component_item_id =assembly_item_id GROUP BY component_item_name,component_item_id,organization_id,assembly_item_id)
    and the query cost becomes 21000.
    Because of this, the VO takes longer than the expected time.
    Please suggest how I can handle this.
    Thanks
    Naveen
    Edited by: user640347 on Mar 10, 2010 12:27 AM

    Hi,
    Just tell me one thing: execute the query at the backend after providing the inputs.
    Now, after the page has loaded, execute the LOV query at the backend again with the same input variables.
    If the query is the same, how can the cost differ?
    Verify it.
    Thanks,
    Gaurav
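    To do that comparison at the backend, one option is to explain both the plain VO query and the wrapped form the framework generates, then diff the plans. The sketch below is only illustrative, with a stand-in query and placeholder connection details; it relies on the standard EXPLAIN PLAN / DBMS_XPLAN.DISPLAY pair.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ComparePlans {
        public static void main(String[] args) throws Exception {
            // Stand-in: paste the full VO query text (with its binds) here.
            String voQuery = "SELECT 1 FROM dual WHERE 1 = :1";
            // The form the OA Framework actually executes.
            String wrapped = "SELECT * FROM (" + voQuery + ")";
            String url = "jdbc:oracle:thin:@//dbhost:1521/APPSDB"; // placeholder
            try (Connection conn = DriverManager.getConnection(url, "apps", "password");
                 Statement stmt = conn.createStatement()) {
                for (String sql : new String[] { voQuery, wrapped }) {
                    stmt.execute("EXPLAIN PLAN FOR " + sql);
                    try (ResultSet rs = stmt.executeQuery("SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY)")) {
                        while (rs.next()) {
                            System.out.println(rs.getString(1)); // plan lines, including the cost column
                        }
                    }
                    System.out.println("--------");
                }
            }
        }
    }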

  • SAP Query in MSS - No Data

    Hi Sap Experts,
    We added a custom query as described step by step in the MSS Business Package (HCM Reports):
    http://help.sap.com/saphelp_erp2005/helpdata/en/3a/3198408d953154e10000000a1550b0/frameset.htm
    After these steps the query appears in the report list and you can run it without any error message. But the result list does not contain any data.
    When we test the query in the backend, there is data in the result list.
    The test user has SAP_ALL.
    So I assume we have to do something else in addition to the steps described in the MSS Business Package.
    What is the reason?
    Best Regards,
    Nadin

    Hi SAP Experts,
    The reason was the following:
    We were using a query based on the logical database PCH, so the reporting iView tries to fill the objects the user has selected in the iView into the fields plan version, object type and object ID.
    We had to use a different query to get the expected result: instead of using a PCH query and accessing Personnel Administration data (infotype 0001, for example) from that PCH query, we now use a query based on PNP/PNPCE.
    Kind Regards,
    Nadin

  • Java WS for query results

    All, I'm new to Java (coming from .NET) and building a web service for a 3-tier application. The client dynamically builds a query string and sends it to the WS, which is supposed to query the backend Oracle 10g database and send the result set back to the client. Okay, I can handle simple strings just fine, but when I build the WS with a complex return type (ResultSet, Vector, String[][]) I cannot get either the WS to compile or the proxy creation in JDeveloper to work. Any thoughts on how to solve this are appreciated.

    Thank you for your response. How would you suggest I send the results of a SQL Query back to the client from a web service that uses JDBC to call the DB for a resultset? XML seems like a good choice, but when it gets to the client from the WS, how would I load the XML string (of the resultset) into a JTable?
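    One common pattern, sketched below under the assumption of a plain JAX-WS service with JDBC access: a ResultSet is not serializable, so copy the rows into a simple String[][] (column labels in row 0, data rows after that) and return that. The connection details and service name are placeholders.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;
    import javax.jws.WebMethod;
    import javax.jws.WebService;

    @WebService
    public class QueryService {
        @WebMethod
        public String[][] runQuery(String sql) throws Exception {
            String url = "jdbc:oracle:thin:@//dbhost:1521/ORCL"; // placeholder connect details
            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(sql)) {
                ResultSetMetaData md = rs.getMetaData();
                int cols = md.getColumnCount();
                List<String[]> rows = new ArrayList<String[]>();
                String[] header = new String[cols];
                for (int i = 0; i < cols; i++) {
                    header[i] = md.getColumnLabel(i + 1); // column names become the first row
                }
                rows.add(header);
                while (rs.next()) {
                    String[] row = new String[cols];
                    for (int i = 0; i < cols; i++) {
                        row[i] = rs.getString(i + 1); // everything as strings for easy serialization
                    }
                    rows.add(row);
                }
                return rows.toArray(new String[rows.size()][]);
            }
        }
    }
    On the client side, the same array drops straight into a table: strip off row 0 as the column names and pass the remaining rows to new javax.swing.table.DefaultTableModel(data, columnNames), then hand that model to the JTable, which avoids parsing XML by hand.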

  • How to find Pstn Meeting ID from backend Database

    The customer has deployed Lync Server 2010 Enterprise Edition. One of the management requirements is for the administrator to find the specific meeting ID for specific users and let new users know it, so that the meeting invitation does not need to be forwarded by the organizer or other invitees in the meeting.
    The customer is mainly interested in finding the PSTN Meeting ID for specific conference settings.
    Two tools in the ResKit allow finding the details the customer is looking for: WebConfDataTool.exe and dbanalyze.exe. WebConfDataTool.exe returns the external conference ID, and using that ID, dbanalyze.exe can return the PSTN Meeting ID. However, both of these tools perform a remote SQL query against the backend database. The customer's backend SQL Server is set up such that remote SQL queries are not permitted, so these tools fail to get the details.
    Therefore, here are the questions:
    1) In which database and in what table is PSTNMeetingID stored? I see a PSTNLocalId, but it is not the same.
    2) I have looked at the table dbo.PstnMeetingID and the stored procedure dbo.ConfpLookupConferenceByExternalConfId. Is there any existing stored procedure? If the answer is "Yes", can you please let me know the name? I think a stored procedure which converts PstnLocalId to PSTNMeetingID would work too.
    All responses are greatly appreciated.
    Thanks,
    Ketan Shah
    [email protected]
    Ketan Shah Sr. Lync Support Engineer Lync Managed Services Catapult Systems

    Hi,
    In which database did you look for the Meeting ID?
    By default, the scheduled meeting information is stored in the RTC database on the Back End. Please check it.
    Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.
    Sean Xiao
    TechNet Community Support

  • 500 Internal Server error for a single report in a dashboard

    Hi All,
    I have an issue with a report in a dashboard. The report is built using an opaque view. When I ran the report in the dashboard, it ran for about 15 to 20 minutes and then returned the data.
    I then tried to download the data for that particular report into an Excel sheet using the Download to Excel option. At this stage, after 5 minutes, I get the internal server error. The strange behaviour is that it downloads sometimes and sometimes it doesn't.
    One more important thing: I tested the query in the backend and it took around 6 to 7 minutes to fetch the data, and the total number of rows for the query is only 166. So can anyone please give me some suggestions, or we can have a discussion, as this report is behaving strangely.
    One more thing I forgot to mention: session variables are used as parameters in the query for two fields used in the opaque view to build that report.
    Thanks,
    Sandy

    Hi Arun,
    where can I check that, and whether in the DEV or PROD system...?
    I don't have access to the PROD system, nor to SU01.
    Can you please guide me through the steps to get it done?
    Please check the below log, which is displayed for the user when he logs in.
    Error summary
    While processing the current request, an exception occurred which could not be handled by the application or the framework.
    If the information contained on this page doesn't help you to find and correct the cause of the problem, please contact your system administrator. To facilitate analysis of the problem, keep a copy of this
    Root Cause
    The initial exception that caused the request to fail, was:
    com.sap.tc.webdynpro.services.exceptions.WDRuntimeException: ComponentUsage(FPMConfigurationUsage): Active component must exist when getting  interface controller
    at com.sap.tc.webdynpro.progmodel.components.ComponentUsage.ensureActiveComponent(ComponentUsage.java:773)
    at com.sap.tc.webdynpro.progmodel.components.ComponentUsage.getInterfaceControllerInternal(ComponentUSage.java:348)
    at com.sap.tc.webdynpro.progmodel.components.ComponentUsage.getInterfaceController(ComponentUsage.java:335)
    at com.sap.pcuigp.xssfpm.wd.wdp.InternalFPMComponent.wdGetFPMConfigurationUsageInterface(InternalFPMComponent.java:245)
    at com.sap.pcuigp.xssfpm.wd.FPMComponent$FPM.changeToExceptionPerspective(FPMComponent.java:862)
    … 59 more
    Please provide me any help.
    Thanks in advance.
    Regards,
    Ponneswari A.

  • Business Folder Upgrade from 11i to R12

    Hi,
    We are upgrading from 11.5.10.2 to R12.
    In the process, we have to upgrade the Discoverer folders as well.
    To validate the change, we are testing one report from 11i in R12.
    The report uses the business folder po_vendors in 11i, which takes data from the PO_VENDORS table in the PO schema.
    As the PO_VENDORS table is obsolete in R12, we are changing the PO_VENDORS table to the AP_SUPPLIERS table and changing the schema from PO to AP.
    We were able to do that in Disco Admin, but when I run the report from Desktop, we do not get any data.
    In the SQL Inspector we can see that the query has changed; one more interesting thing is that when we run the query from the backend with the exact same set of parameters, it returns data.
    Is there anything we need to take care of? Are we doing anything wrong?
    Please help.
    Thanks,
    S

    Hi,
    We are using 3 views and a materialized view.
    Object names are
    1) AP_INVOICES
    2) AP_INVOICES_DISTRIBUTIONS
    3) AP_SUPPLIERS
    Now when we change the views to base tables, e.g. AP_INVOICES to AP_INVOICES_ALL and AP_INVOICES_DISTRIBUTIONS to AP_INVOICES_DISTRIBUTIONS_ALL, our report returns data.
    PFB the query.
    SELECT DISTINCT o215482.SEGMENT2 as E215623,o215482.SEGMENT2_DESCRIPTION as E215625,o215476.ATTRIBUTE5 as E216133,o215482.SEGMENT4 as E216142,o215482.SEGMENT1 as E216190,o215482.SEGMENT3 as E216208,o215482.SEGMENT3_DESCRIPTION as E216210,o215476.INVOICE_DATE as E216817,o215476.PAYMENT_STATUS_FLAG as E217125,o215473.PERIOD_NAME as E217133,o215487.VENDOR_NAME as E217379,o215487.SEGMENT1 as E217385,(SUBSTR(o215476.INVOICE_NUM,1,12)) as E217632,as215592_217133_OLD as as215592_217133_OLD,o215473.AMOUNT as E215705,NVL(o215476.INVOICE_AMOUNT,0) as E216800
    FROM APPS.AP_INVOICE_DISTRIBUTIONS o215473,
    APPS.AP_INVOICES o215476,
    APPS.JS_GL_CODE_COMBINATIONS_MV o215482,
    AP.AP_SUPPLIERS o215487,
    ( SELECT o215509.PERIOD_NAME AS as215592_217133_OLD_2, MAX(o215509.START_DATE) AS as215592_217133_OLD FROM ( SELECT gl.period_name ,gl.start_date ,gl.end_date ,gl.period_set_name FROM gl.gl_periods gl ) o215509 WHERE (o215509.PERIOD_SET_NAME = 'JS_GL_Calendar') GROUP BY o215509.PERIOD_NAME)
    WHERE ( (o215476.INVOICE_ID = o215473.INVOICE_ID)
    and (o215482.CODE_COMBINATION_ID = o215473.DIST_CODE_COMBINATION_ID)
    and (o215487.VENDOR_ID = o215476.VENDOR_ID)
    and (o215473.PERIOD_NAME = as215592_217133_OLD_2(+)))
    AND (o215482.SEGMENT6 LIKE :"Project Parameter 1")
    AND (o215482.SEGMENT5 LIKE :"Category Code Parameter 1")
    AND (o215482.SEGMENT4 LIKE :"Business Unit Parameter 1")
    AND (o215482.SEGMENT3 LIKE :"Cost Centre Parameter 1")
    AND (o215482.SEGMENT2 LIKE :"Account Parameter 1")
    AND (o215482.SEGMENT1 LIKE :"Company Parameter 1")
    AND (o215473.PERIOD_NAME LIKE :"Period Name Parameter 1")
    AND (o215487.SEGMENT1 LIKE :"Supplier Number Parameter 1")
    AND (o215476.CANCELLED_DATE IS NULL )
    AND ( ( UPPER(o215473.LINE_TYPE_LOOKUP_CODE) = UPPER('ITEM') OR UPPER(o215473.LINE_TYPE_LOOKUP_CODE) = UPPER('FREIGHT') ) )
    ORDER BY o215482.SEGMENT1 ASC , o215482.SEGMENT3 ASC , o215482.SEGMENT2 ASC , o215482.SEGMENT4 ASC , o215487.SEGMENT1 ASC , o215487.VENDOR_NAME ASC , (SUBSTR(o215476.INVOICE_NUM,1,12)) ASC ;
    Kindly help.
    Thanks,
    S
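    If you would rather keep the secured APPS views than switch to the _ALL tables, one possible explanation for the empty result is that no multi-org (MOAC) context is set in the session that runs the report SQL. The sketch below is only a hedged illustration of setting that context over JDBC before running the query; MO_GLOBAL.SET_POLICY_CONTEXT, the 'S' access mode and the org_id used here are assumptions to verify against your R12 instance.
    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class SetOrgContext {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:oracle:thin:@//dbhost:1521/R12DB"; // placeholder connect details
            try (Connection conn = DriverManager.getConnection(url, "apps", "password")) {
                // Assumed R12 call: 'S' = single operating unit mode for the given org_id.
                try (CallableStatement cs = conn.prepareCall("BEGIN mo_global.set_policy_context('S', ?); END;")) {
                    cs.setInt(1, 204); // hypothetical org_id; use the report's operating unit
                    cs.execute();
                }
                // Run the Discoverer SQL against the secured views (AP_INVOICES, ...) on this same session.
            }
        }
    }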

  • Reverse Mapping Tutorial - Finder.java queries the wrong table?!

    I have been almost successful in running the Reverse Mapping Tutorial, by
    creating Java Classes from the hsqldb sample database, and running the JDO
    Enhancer on them.
    However, I cannot get the Finder.java to work. It seems to look in the wrong table: MAGAZINEX instead of MAGAZINE?
    Did anyone have trouble with this step, or run it successfully?
    Liviu
    PS: here is the trace:
    0 [main] INFO kodo.Runtime - Starting Kodo JDO version 2.4.2 (kodojdo-2.4.2-20030326-1841) with capabilities: [Enterprise Edition Features, Standard Edition Features, Lite Edition Features, Evaluation License, Query Extensions, Datacache Plug-in, Statement Batching, Global Transactions, Developer Tools, Custom Database Dictionaries, Enterprise Databases]
    70 [main] WARN kodo.Runtime - WARNING: Kodo JDO Evaluation expires in 25 days. Please contact [email protected] for information on extending your evaluation period or purchasing a license.
    68398 [main] INFO kodo.MetaData - com.solarmetric.kodo.meta.JDOMetaDataParser@19eda2c: parsing source: file:/C:/Documents%20and%20Settings/default/jbproject/JDO/classes/reversetutorial.jdo
    74577 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] open: jdbc:hsqldb:hsql_sample_database (sa)
    75689 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] close: com.solarmetric.datasource.PoolConnection@17918f0[[requests=0;size=0;max=70;hits=0;created=0;redundant=0;overflow=0;new=0;leaked=0;unavailable=0]]
    75699 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] close connection
    77331 [main] INFO jdbc.JDBC - Using dictionary class "com.solarmetric.kodo.impl.jdbc.schema.dict.HSQLDictionary" to connect to "HSQL Database Engine" (version "1.7.0") with JDBC driver "HSQL Database Engine Driver" (version "1.7.0")
    1163173 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] open: jdbc:hsqldb:hsql_sample_database (sa)
    1163293 [main] INFO jdbc.SQL - [ C:3093871; T:31737213; D:22310332 ] preparing statement <17940412>: SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX
    1163313 [main] INFO jdbc.SQL - [ C:3093871; T:31737213; D:22310332 ] executing statement <17940412>: [reused=1;params={}]
    1163443 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] close: com.solarmetric.datasource.PoolConnection@2f356f[[requests=1;size=0;max=70;hits=0;created=1;redundant=0;overflow=0;new=1;leaked=0;unavailable=0]]
    1163443 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] close connection
    Hit uncaught exception javax.jdo.JDOFatalDataStoreException
    javax.jdo.JDOFatalDataStoreException: com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
    [SQL=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
    [PRE=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
    Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX] [code=-22;state=S0002]
    NestedThrowables:
    com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
    [SQL=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
    [PRE=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
    Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
    at com.solarmetric.kodo.impl.jdbc.runtime.SQLExceptions.throwFatal(SQLExceptions.java:17)
    at com.solarmetric.kodo.impl.jdbc.ormapping.SubclassProviderImpl.getSubclasses(SubclassProviderImpl.java:283)
    at com.solarmetric.kodo.impl.jdbc.ormapping.ClassMapping.getPrimaryMappingFields(ClassMapping.java:1093)
    at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.executeQuery(JDBCStoreManager.java:704)
    at com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.executeQuery(JDBCQuery.java:93)
    at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:792)
    at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:595)
    at reversetutorial.Finder.main(Finder.java:32)
    NestedThrowablesStackTrace:
    java.sql.SQLException: Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
    at org.hsqldb.Trace.getError(Trace.java:226)
    at org.hsqldb.jdbcResultSet.<init>(jdbcResultSet.java:6595)
    at org.hsqldb.jdbcConnection.executeStandalone(jdbcConnection.java:2951)
    at org.hsqldb.jdbcConnection.execute(jdbcConnection.java:2540)
    at org.hsqldb.jdbcStatement.fetchResult(jdbcStatement.java:1804)
    at org.hsqldb.jdbcStatement.executeQuery(jdbcStatement.java:199)
    at org.hsqldb.jdbcPreparedStatement.executeQuery(jdbcPreparedStatement.java:391)
    at com.solarmetric.datasource.PreparedStatementWrapper.executeQuery(PreparedStatementWrapper.java:93)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedQueryInternal(SQLExecutionManagerImpl.java:771)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQueryInternal(SQLExecutionManagerImpl.java:691)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQuery(SQLExecutionManagerImpl.java:372)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQuery(SQLExecutionManagerImpl.java:356)
    at com.solarmetric.kodo.impl.jdbc.ormapping.SubclassProviderImpl.getSubclasses(SubclassProviderImpl.java:246)
    at com.solarmetric.kodo.impl.jdbc.ormapping.ClassMapping.getPrimaryMappingFields(ClassMapping.java:1093)
    at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.executeQuery(JDBCStoreManager.java:704)
    at com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.executeQuery(JDBCQuery.java:93)
    at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:792)
    at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:595)
    at reversetutorial.Finder.main(Finder.java:32)

    The reason I did not run importtool is because ... I actually ran it, but it was not successful. **!
    I now tried the solutions directory, from the kodo distribution, and that
    failed as well. Here is what I did:
    - I went to reversetutorial/solutions, and compiled all the classes, and
    then placed them into a reversetutorial folder (to match the package)
    - ran "rd-importtool reversetutorial.mapping" (the mapping file from the
    solutions directory), which failed as below:
    0 [main] INFO kodo.MetaData - Parsing metadata resource "file:/C:/kodo/reversetutorial/solutions/reversetutorial.mapping".
    Exception in thread "main" com.solarmetric.rd.kodo.meta.JDOMetaDataNotFoundException: No JDO metadata was found for type "class reversetutorial.Article".
    FailedObject:class reversetutorial.Article
    at com.solarmetric.rd.kodo.meta.JDOMetaDataRepositoryImpl.getMetaData(JDOMetaDataRepositoryImpl.java:148)
    at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMetaData(MappingRepository.java:147)
    at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:158)
    at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.getMapping(ImportTool.java:126)
    at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.importMappings(ImportTool.java:57)
    at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.run(ImportTool.java:408)
    at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.main(ImportTool.java:385)
    Any idea why? The solutions directory should work, right? I even tried
    specifying a kodo.properties file, but it did not seem to help.
    Liviu
    "Abe White" <[email protected]> wrote in message
    news:[email protected]...
    Running the reversemappingtool creates classes, metadata files, and a
    .mapping file. That .mapping file contains all the O/R mapping
    information for how the generated classes map to your existing database
    tables. What the importtool does is just transfer that mapping
    information to the metadata files, in the form of <extension> elements.
    The reason this is a separate step will be clear once Kodo 3.0 comes out.
    So in sum, the importtool does not affect the database in any way. It
    just moves information from one format (.mapping file) to another
    (<extension> elements in the .jdo file).

  • How to insert a new record to table with foreign key

    I have 3 tables like this :
    CREATE TABLE PERSON (
    PK INTEGER NOT NULL,
    NAME VARCHAR(10),
    SSNUM INTEGER,
    MGR INTEGER);
    ALTER TABLE PERSON ADD CONSTRAINT PK_PERSON PRIMARY KEY (PK);
    ALTER TABLE PERSON ADD CONSTRAINT FK_PERSON FOREIGN KEY (MGR) REFERENCES
    PERSON (PK);
    /* Tables */
    CREATE TABLE PROJECT (
    PK INTEGER NOT NULL,
    CODE_NAME INTEGER);
    ALTER TABLE PROJECT ADD CONSTRAINT PK_PROJECT PRIMARY KEY (PK);
    /* Tables */
    CREATE TABLE XREF (
    PERSON INTEGER NOT NULL,
    PROJECT INTEGER NOT NULL);
    ALTER TABLE XREF ADD CONSTRAINT PK_XREF PRIMARY KEY (PERSON, PROJECT);
    ALTER TABLE XREF ADD CONSTRAINT FK_XREF1 FOREIGN KEY (PERSON) REFERENCES
    PERSON (PK);
    ALTER TABLE XREF ADD CONSTRAINT FK_XREF2 FOREIGN KEY (PROJECT) REFERENCES
    PROJECT (PK);
    I followed the approach of the "ReverseTutorial", and the .jdo file is here:
    <?xml version="1.0" encoding="UTF-8"?>
    <jdo>
    <package name="reversetutorial">
    <class name="Person" objectid-class="PersonId">
    <extension vendor-name="kodo" key="class-column" value="none"/>
    <extension vendor-name="kodo" key="lock-column" value="none"/>
    <extension vendor-name="kodo" key="table" value="PERSON"/>
    <field name="name">
    <extension vendor-name="kodo" key="data-column"
    value="NAME"/>
    </field>
    <field name="person">
    <extension vendor-name="kodo" key="pk-data-column"
    value="MGR"/>
    </field>
    <field name="persons">
    <collection element-type="Person"/>
    <extension vendor-name="kodo" key="inverse"
    value="person"/>
    <extension vendor-name="kodo" key="inverse-owner"
    value="person"/>
    </field>
    <field name="pk" primary-key="true">
    <extension vendor-name="kodo" key="data-column"
    value="PK"/>
    </field>
    <field name="ssnum">
    <extension vendor-name="kodo" key="data-column"
    value="SSNUM"/>
    </field>
    <field name="xrefs">
    <collection element-type="Xref"/>
    <extension vendor-name="kodo" key="inverse"
    value="person"/>
    <extension vendor-name="kodo" key="inverse-owner"
    value="person"/>
    </field>
    </class>
    <class name="Project" objectid-class="ProjectId">
    <extension vendor-name="kodo" key="class-column" value="none"/>
    <extension vendor-name="kodo" key="lock-column" value="none"/>
    <extension vendor-name="kodo" key="table" value="PROJECT"/>
    <field name="codeName">
    <extension vendor-name="kodo" key="data-column"
    value="CODE_NAME"/>
    </field>
    <field name="pk" primary-key="true">
    <extension vendor-name="kodo" key="data-column"
    value="PK"/>
    </field>
    <field name="xrefs">
    <collection element-type="Xref"/>
    <extension vendor-name="kodo" key="inverse"
    value="project"/>
    <extension vendor-name="kodo" key="inverse-owner"
    value="project"/>
    </field>
    </class>
    <class name="Xref" objectid-class="XrefId">
    <extension vendor-name="kodo" key="class-column" value="none"/>
    <extension vendor-name="kodo" key="lock-column" value="none"/>
    <extension vendor-name="kodo" key="table" value="XREF"/>
    <field name="person">
    <extension vendor-name="kodo" key="pk-data-column"
    value="PERSON"/>
    </field>
    <field name="person2" primary-key="true">
    <extension vendor-name="kodo" key="data-column"
    value="PERSON"/>
    </field>
    <field name="project">
    <extension vendor-name="kodo" key="pk-data-column"
    value="PROJECT"/>
    </field>
    <field name="project2" primary-key="true">
    <extension vendor-name="kodo" key="data-column"
    value="PROJECT"/>
    </field>
    </class>
    </package>
    </jdo>
    The data in those tables is:
    PERSON :
    | PK | NAME | SSNUM | MGR |
    | 1 | ABC | 1 | 1 |
    | 2 | DEF | 5 | 1 |
    PROJECT
    | PK | CODE_NAME |
    | 1 | 12 |
    | 2 | 13 |
    And now I want to add a new record into table XREF: insert into XREF values (1,1);
    public void createData() {
        Xref xref = new Xref();
        Person person = new Person(1);
        Project project = new Project(1);
        xref.setPerson(person);
        xref.setProject(project);
        person.getXrefs().add(xref);
        project.getXrefs().add(xref);
        pm.currentTransaction().begin();
        pm.makePersistent(xref);
        pm.currentTransaction().commit();
    }
    I don't know why Kodo automatically inserts a new record into table PERSON -> primary key conflict. The errors are:
    0 [main] INFO kodo.Runtime - Starting Kodo JDO version 2.4.1 (kodojdo-2.4.1-20030126-1556) with capabilities: [Enterprise Edition Features, Standard Edition Features, Lite Edition Features, Evaluation License, Query Extensions, Datacache Plug-in, Statement Batching, Global Transactions, Developer Tools, Custom Database Dictionaries, Enterprise Databases, Custom ClassMappings, Custom ResultObjectProviders]
    41 [main] WARN kodo.Runtime - WARNING: Kodo JDO Evaluation expires in 29 days. Please contact [email protected] for information on extending your evaluation period or purchasing a license.
    1627 [main] INFO kodo.MetaData - com.solarmetric.kodo.meta.JDOMetaDataParser@e28b9: parsing source: file:/D:/AN/Test/classes/reversetutorial/reversetutorial.jdo
    3092 [main] INFO jdbc.JDBC - [ C:23387093; T:19356985; D:10268916 ] open: jdbc:firebirdsql:localhost/3050:D:/An/test/temp.gdb (sysdba)
    3325 [main] INFO jdbc.JDBC - [ C:23387093; T:19356985; D:10268916 ] close: com.solarmetric.datasource.PoolConnection@164dbd5[[requests=0;size=0;max=70;hits=0;created=0;redundant=0;overflow=0;new=0;leaked=0;unavailable=0]]
    3335 [main] INFO jdbc.JDBC - [ C:23387093; T:19356985; D:10268916 ] close connection
    3648 [main] INFO jdbc.JDBC - Using dictionary class "com.solarmetric.kodo.impl.jdbc.schema.dict.InterbaseDictionary" to connect to "Firebird" (version "__WI-V6.2.972 Firebird 1.0.3)WI-V6.2.972 Firebird 1.0.3/tcp (annm)/P10") with JDBC driver "firebirdsql jca/jdbc resource adapter" (version "0.1")
    4032 [main] INFO jdbc.JDBC - [ C:25657668; T:19356985; D:10268916 ] open: jdbc:firebirdsql:localhost/3050:D:/An/test/temp.gdb (sysdba)
    4143 [main] INFO jdbc.SQL - [ C:25657668; T:19356985; D:10268916 ] preparing statement <3098834>: INSERT INTO XREF(PERSON, PROJECT) VALUES
    4224 [main] INFO jdbc.SQL - [ C:25657668; T:19356985; D:10268916 ] executing statement <3098834>: [reused=1;params={(int)1,(int)1}]
    4244 [main] INFO jdbc.SQL - [ C:25657668; T:19356985; D:10268916 ] preparing statement <9090824>: INSERT INTO PERSON(MGR, NAME, PK, SSNUM) VALUES (?, ?, ?, ?)
    4315 [main] INFO jdbc.SQL - [ C:25657668; T:19356985; D:10268916 ] executing statement <9090824>: [reused=1;params={null,null,(int)1,(int)0}]
    4598 [main] WARN jdbc.JDBC - java.sql.SQLWarning: java.sql.SQLWarning: resultSetType or resultSetConcurrency changed
    4598 [main] WARN jdbc.JDBC - java.sql.SQLWarning: java.sql.SQLWarning: resultSetType or resultSetConcurrency changed
    4598 [main] INFO jdbc.JDBC - [ C:25657668; T:19356985; D:10268916 ] begin rollback
    4608 [main] INFO jdbc.JDBC - [ C:25657668; T:19356985; D:10268916 ] end rollback 10ms
    4628 [main] INFO jdbc.JDBC - [ C:25657668; T:19356985; D:10268916 ] close: com.solarmetric.datasource.PoolConnection@1878144[[requests=2;size=2;max=70;hits=0;created=2;redundant=0;overflow=0;new=2;leaked=0;unavailable=0]]
    4628 [main] INFO jdbc.JDBC - [ C:25657668; T:19356985; D:10268916 ] close connection
    javax.jdo.JDOFatalDataStoreException: com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
    [SQL=INSERT INTO PERSON(MGR, NAME, PK, SSNUM) VALUES (null, null, 1, 0)]
    [PRE=INSERT INTO PERSON(MGR, NAME, PK, SSNUM) VALUES (?, ?, ?, ?)]
    GDS Exception. violation of PRIMARY or UNIQUE KEY constraint "PK_PERSON" on table "PERSON" [code=335544665;state=null]
    NestedThrowables:
    com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
    [SQL=INSERT INTO PERSON(MGR, NAME, PK, SSNUM) VALUES (null, null, 1, 0)]
    [PRE=INSERT INTO PERSON(MGR, NAME, PK, SSNUM) VALUES (?, ?, ?, ?)]
    GDS Exception. violation of PRIMARY or UNIQUE KEY constraint "PK_PERSON" on table "PERSON"
    at com.solarmetric.kodo.impl.jdbc.runtime.SQLExceptions.throwFatal(SQLExceptions.java:17)
    at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:416)
    at com.solarmetric.kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:575)
    at com.solarmetric.kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:438)
    at reversetutorial.Finder.createData(Finder.java:74)
    at reversetutorial.Finder.main(Finder.java:141)
    NestedThrowablesStackTrace:
    org.firebirdsql.jdbc.FBSQLException: GDS Exception. violation of PRIMARY or UNIQUE KEY constraint "PK_PERSON" on table "PERSON"
    at org.firebirdsql.jdbc.FBPreparedStatement.internalExecute(FBPreparedStatement.java:425)
    at org.firebirdsql.jdbc.FBPreparedStatement.executeUpdate(FBPreparedStatement.java:136)
    at com.solarmetric.datasource.PreparedStatementWrapper.executeUpdate(PreparedStatementWrapper.java:111)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedStatementNonBatch(SQLExecutionManagerImpl.java:542)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedStatement(SQLExecutionManagerImpl.java:511)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeInternal(SQLExecutionManagerImpl.java:405)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.flush(SQLExecutionManagerImpl.java:272)
    at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:411)
    at com.solarmetric.kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:575)
    at com.solarmetric.kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:438)
    at reversetutorial.Finder.createData(Finder.java:74)
    at reversetutorial.Finder.main(Finder.java:141)
    at org.firebirdsql.gds.GDSException: violation of PRIMARY or UNIQUE KEY constraint "PK_PERSON" on table "PERSON"
    at org.firebirdsql.jgds.GDS_Impl.readStatusVector(GDS_Impl.java:1683)
    at org.firebirdsql.jgds.GDS_Impl.receiveResponse(GDS_Impl.java:1636)
    at org.firebirdsql.jgds.GDS_Impl.isc_dsql_execute2(GDS_Impl.java:865)
    at org.firebirdsql.jca.FBManagedConnection.executeStatement(FBManagedConnection.java:782)
    at org.firebirdsql.jdbc.FBConnection.executeStatement(FBConnection.java:1072)
    at org.firebirdsql.jdbc.FBPreparedStatement.internalExecute(FBPreparedStatement.java:420)
    at org.firebirdsql.jdbc.FBPreparedStatement.executeUpdate(FBPreparedStatement.java:136)
    at com.solarmetric.datasource.PreparedStatementWrapper.executeUpdate(PreparedStatementWrapper.java:111)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedStatementNonBatch(SQLExecutionManagerImpl.java:542)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedStatement(SQLExecutionManagerImpl.java:511)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeInternal(SQLExecutionManagerImpl.java:405)
    at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.flush(SQLExecutionManagerImpl.java:272)
    at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:411)
    at com.solarmetric.kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:575)
    at com.solarmetric.kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:438)
    at reversetutorial.Finder.createData(Finder.java:74)
    at reversetutorial.Finder.main(Finder.java:141)
    Exception in thread "main"

    First off, use the '-primaryKeyOnJoin true' flag when running the reverse
    mapping tool so that you can get rid of that useless Xref class and have
    a direct relation between Person and Project. See the documentation on
    reverse mapping tool options here:
    http://www.solarmetric.com/Software/Documentation/latest/docs/ref_guide_pc_reverse.html
    But your real problem is that you are creating new objects, assigning
    primary key values, and expecting them to represent existing objects.
    That's not the way JDO works. If you want to set relations to existing
    objects in JDO, you use the PM to look up those objects. If you try to
    create new objects, JDO will assume you want to insert new records into
    the DB, and you'll get PK conflicts like you see here.
    There are several good books out on JDO; if you're just starting out with
    it, they might save you a lot of time and help you master JDO quickly.
