Values of Non-Qualifiers (lookups) not auto-converting.

We've been trying to use Import Server to automatically import records (XML) that contain data for qualified lookup tables. Apparently there is an issue with namespaces in both the schema that we mapped to and the source field.
After removing the namespaces, we now have an issue with the non-qualifier that uses a lookup table: its values do not automap, which in turn prevents the compound field at the Customer main level from being retained.
First, the Partner Function is mapped to the Partner Function flat lookup table. At this level, the value auto-converts and the green dot is visible to the left.
Second, at the Customer Partner qualified lookup, the Partner Function is again mapped to the Partner Function lookup. This time, however, the values do not automatically convert and the "AUTOMAP" button must be pushed manually for them to map. The other non-qualifiers (CustomerID, PartnerNum, PartnerSAP_ID) are also mapped and their values auto-convert (green dot).
Finally, at the Customer main level, all of the non-qualified fields from the Customer Partner table are mapped and all auto-convert except for the Partner Function field, which again requires us to push "Automap" manually. Then, and only then, can we right-click on one of the fields and create the compound field to map to the Customer Partner field. Again, the value mapping is not automatic but must be done manually.
This map is then saved and Import Manager closed. When Import Manager is reopened with the same map, the compound field is not recreated because the Partner Function values did not convert. Once these are redone manually, then and only then can the compound field be recreated and value mapped.
When this same map is used in the Import Server, the error is that the qualified fields cannot be mapped. This brings us back to the lookup fields not value mapping. Likewise, if the same file is opened in Import Manager manually and the map applied, it STILL does not convert the values and the compound field for the qualified lookup table is dropped.
We are using MDM 5.5 SP4 Patch 3.

Hello Christiane,
Yes, this is a very important point and I should have noted it in my original post. Yes, my Default Batch Handling settings are:
  Yes - Automap unmapped values
  Add - Unmapped Value Handling
Plus:
  Replace - Default multi-valued update
  Append - Default qualified update
  None - Default matching qualifiers
Also, in my mdis.ini file, my settings are:
  Automap Unmapped Value = True
  Unmapped Value Handling = Add
  Always Use Unmapped Value Handling = True
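For reference, here is how those three entries might look inside mdis.ini itself (a minimal sketch only; the section name is an assumption and varies by installation, so verify against your own file):
    [MDM Import Server]    <-- section name assumed; check your own mdis.ini
    Automap Unmapped Value = True
    Unmapped Value Handling = Add
    Always Use Unmapped Value Handling = True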
It used to work when we were on SP4, Patch 2, but we have since upgraded to Patch 3, Hotfix 2(?) and can't get it to work.
Thanks for helping!
Don

Similar Messages

  • Is there a way to not auto-convert documents to FrameMaker 9 when opening lower version files?

    Hi, All!
    Is there a way to prevent FrameMaker 9 from automatically converting lower version files to FrameMaker 9 when opening them?
    It becomes a problem when I just want to view the files: I have to save each chapter in the book back to the lower version after viewing so my teammates can use them.
    Thanks.

    *** Haven't checked, because I don't have any pre-9.0 files easily available *** ... but from what I remember, if you're really just viewing the files in 9.0 you shouldn't need to save them back to an earlier version at all. FM may mark the files as "changed" when it opens them, but that's because it checks things like datestamps and cross-references. If you do not, yourself, explicitly change any content in the file you have opened, you don't need to save the changes; just close the file unsaved. Check my supposition with a couple of files before relying on it! Other people will be able to give you a more detailed explanation of why files you haven't touched still show up as "changed".

  • Sorting the non-qualifier values... is it possible?

    Hi Experts,
    I have a qualified table.
    Week Day -> Non-qualifier
    Time In  -> Qualifier
    Time Out -> Qualifier
    The Week Day field is filled with all 7 days of the week:
    Monday, Tuesday, Wednesday, ..., Sunday.
    We have 2 issues here:
    1) In Data Manager, in the qualified lookup selection for a record, I can see it is automatically sorted in alphabetical order,
    that is Friday, Monday, Saturday, Sunday, Thursday, Tuesday, Wednesday.
    We require this to be in Monday-to-Sunday order.
    2) Also, I selected Monday first and gave Time In / Time Out values, and for the same record I then selected Friday and provided Time In and Time Out values. Once I hit save, Friday is shown first and Monday next to it.
    Which means it is also saving the entries of the qualified table in the non-qualifier's alphabetical sort order.
    Can we get rid of this?
    The user should see the qualified lookup selection box in Monday-to-Sunday order,
    and once the user selects any of the values in the non-qualifier, they should be shown in Monday-to-Sunday order after saving.
    Note: the Sort Type option at the non-qualifier field level did not help.
    Did anyone face a similar issue?
    KR
    John

    Hi,
    I used an Integer field called Code,
    which holds the numbers 1-7.
    Name          Code
    Monday        1
    Tuesday       2
    Wednesday     3
    Thursday      4
    Friday        5
    Saturday      6
    Sunday        7
    These 2 fields are both non-qualifiers, but in the Portal we have hidden the Code field.
    This way the sorting worked.
    Closing thread.
    KR
    John

  • Character value '1000' of characteristic 0COMPANY not ALPHA converted

    Hi ,
    We are doing upgrade test.
    When I load data, it shows the error "Character value '1000' of characteristic 0COMPANY not ALPHA converted".
    I have checked: with the ALPHA check mark set in the transfer rules, the error is corrected.
    When I check production, there is no check mark there, and it loads data successfully.
    Could anyone give an idea why this is happening?
    Thanks
    R

    Hi Swapna ,
    Thanks.
    I checked the PSA: the 0PCOMPANY column has data such as "1000" and "2000" for 220 records out of 2,189 records.
    But 0PCOMPANY is a reference field; no master data is maintained for it.
    When I cross-check the ODS in production, no data is found for this object, and the daily data loads (Prod) for 0PCOMPANY also carry 0 values.
    I think the problem is in the test data in ECC 6.0.
    Could anyone please suggest?
    Thanks
    Ram
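
    For context, a minimal sketch of what the ALPHA conversion routine effectively does to a purely numeric external value such as "1000" (illustration only, not SAP's actual code; the length 6 used for 0COMPANY is an assumption - check the InfoObject definition):

    // Illustration of ALPHA conversion: purely numeric values are padded with
    // leading zeros to the internal field length; other values are left as-is.
    // A value stored unpadded ("1000" instead of "001000") is what triggers the
    // "not ALPHA converted" error during the load.
    public class AlphaConversionSketch {
        static String alphaInput(String external, int internalLength) {
            String trimmed = external.trim();
            if (!trimmed.matches("\\d+")) {
                return trimmed; // non-numeric values are not zero-padded
            }
            StringBuilder padded = new StringBuilder();
            for (int i = trimmed.length(); i < internalLength; i++) {
                padded.append('0');
            }
            return padded.append(trimmed).toString();
        }

        public static void main(String[] args) {
            // assuming 0COMPANY has an internal length of 6 characters
            System.out.println(alphaInput("1000", 6)); // prints 001000
        }
    }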

  • Db_lost_write_protect value is none

    Hi,
    I am learning 11g for the OCP certification.
    When I set the db_ultra_safe parameter to 'DATA_AND_INDEX', the db_lost_write_protect parameter value is NONE, which should not be the case.
    It should be TYPICAL instead of NONE, as given in the book. What did I miss? Why is db_lost_write_protect NONE? Can someone please throw light on this?
    --------------- Please refer to the output below ---------------
    SQL> alter system set db_ultra_safe='DATA_AND_INDEX' scope=spfile;
    System altered.
    SQL> shutdown immediate;
    Database closed.
    Database dismounted.
    ORACLE instance shut down.
    SQL> startup
    ORACLE instance started.
    Total System Global Area 610992128 bytes
    Fixed Size 1376296 bytes
    Variable Size 230690776 bytes
    Database Buffers 373293056 bytes
    Redo Buffers 5632000 bytes
    Database mounted.
    Database opened.
    SQL> sho parameter db_ultra_safe

    NAME                                 TYPE        VALUE
    ------------------------------------ ----------- ------------------------------
    db_ultra_safe                        string      DATA_AND_INDEX

    SQL> sho parameter db_block

    NAME                                 TYPE        VALUE
    ------------------------------------ ----------- ------------------------------
    db_block_buffers                     integer     0
    db_block_checking                    string      FULL
    db_block_checksum                    string      FULL
    db_block_size                        integer     8192

    SQL> sho parameter db_lost

    NAME                                 TYPE        VALUE
    ------------------------------------ ----------- ------------------------------
    db_lost_write_protect                string      NONE
    SQL> select * from v$instance;
    INSTANCE_NUMBER INSTANCE_NAME
    HOST_NAME
    VERSION STARTUP_T STATUS PAR THREAD# ARCHIVE LOG_SWITCH_WAIT
    LOGINS SHU DATABASE_STATUS INSTANCE_ROLE ACTIVE_ST BLO
    1 tk
    11.2.0.1.0 23-DEC-10 OPEN NO 1 STARTED
    ALLOWED NO ACTIVE PRIMARY_INSTANCE NORMAL NO
    SQL>

    It's because Data Guard allows for different modes such as Maximum Performance or Maximum Protection.
    It's up to you to decide what you want. This is part of optimal corruption protection and includes the parameters DB_BLOCK_CHECKING and DB_BLOCK_CHECKSUM.
    Your data isn't unprotected, but if you want Maximum Protection you have to change a few things.
    By the way there is a separate Data Guard section in this forum. ( Data Guard )
    When the parameter is set to FULL on the primary database, the instance logs reads for read-only tablespaces as well as read-write tablespaces.
    When the parameter is set to TYPICAL or FULL on the standby database or on the primary database during media recovery, the instance performs lost write detection.
    Regards
    MS

  • Qualifier values are not part of a qualified lookup record

    Hi
         I am getting the exception "Qualifier values are not part of a qualified lookup record" when I search for a record in the qualified table. The table has both qualifier and non-qualifier fields, but all of them are display fields.
    My question: if a field is a qualifier, is it mandatory that it be a non-display field?
    If not, can anyone suggest how to get the record?
    If we make the qualifier fields non-display fields in the repository, then it works fine.
    Here with I am providing the code
                   Search search = new Search(repoSchemaCmd.getRepositorySchema().getTable("QT_NOTES").getId());
                   search.addSearchItem(new FieldSearchDimension(repoSchemaCmd.getRepositorySchema().getFieldId("QT_NOTES", "USER_PMF_ID")),
                             new TextSearchConstraint("test", TextSearchConstraint.EQUALS));
                   ResultDefinition rd = new ResultDefinition(repoSchemaCmd.getRepositorySchema().getTable("QT_NOTES").getId());
                   // Returns the array of IDs for all display fields associated with the table
                   rd.setSelectFields(repoSchemaCmd.getRepositorySchema().getTableSchema("QT_NOTES").getDisplayFieldIds());
                   RetrieveLimitedRecordsCommand retrievedRecords = new RetrieveLimitedRecordsCommand(conn);
                   retrievedRecords.setSession(userSessionID);
                   retrievedRecords.setSearch(search);
                   //retrievedRecords.setIds(RID);
                   retrievedRecords.setResultDefinition(rd);
                   try {
                        retrievedRecords.execute();
                   } catch (CommandException e) {
                        throw new CommandException(e);
                   }
    Regards,
    Sandeep.

    Hi Sandeep
    I have a similar problem and would like to know the resolution to this issue.
    Regards
    Dilip

  • Non-qualifier Auto Id

    Hello
    We have a qualified lookup table with all fields set as qualifiers except one, which is an AutoID field.
    How do we handle the mapping of the qualified lookup field in the main table for the AutoID, seeing as we will not have a source field for it?
    Do we need to map all the qualified fields, or just the qualified fields that are display fields?
    We keep getting the following error on import:
    Import failed. Cannot import qualifiers because the qualified lookup field is not mapped.
    Import action: Create
    Source record no: 1
    Kind Regards
    C

    Hello
    The scenario is as follows:
    The main table record Customer has a field of type qualified lookup called Search Key Id. The qualified table is used to maintain a list of search keys per customer.
    The lookup table is as follows:
    Id        AutoID    Non-Qualifier
    Key       Text      Qualifier
    Status    Text      Qualifier
    We receive on import:
    Customer Number, and map it to Customer Number (main table)
    Key Value, and map it to Key (qualified lookup table)
    Status Value, and map it to Status (qualified lookup table)
    We add a field "Search Key" on import and map it to Search Key (main table).
    The question is how do we handle the AutoID field? E.g. if we import 3 search keys
    for a customer, the AutoID runs from 1 to 3 in the qualified lookup table.
    We keep getting an error when we try this (see the original message).
    Kind Regards
    Con

  • Updating a qualifier value of a qualified lookup table failed

    Hello,
    I am trying to update a qualifier value of a qualified lookup table.
    I have written the method attached below.
    I am not able to modify a qualifier field, the "Installation instructions DAR Update" field of the "Vendor Details" table, with this method call. Is there anything I am missing or not doing correctly?
    Also, I noticed this line in the SP6 API documentation, in the section on ModifyRecordsCommand:
    Note: Qualifier values is currently not supported by this command. This command simply ignore qualifier values passed in.
    At first my method simply failed. After I saw this note, I tried to use ModifyRecordCommand instead, but still with no success.
    I am copying my method content here. qlv1 and qlv2 do get the new value after setQualifierFieldValue.
    It will be highly appreciated if you can help me.
              boolean isUpdated = false;
              final MdmValue mval = record.getFieldValue(qualFieldID);
              qlv = (QualifiedLookupValue) mval;
              QualifiedLinkValue[] qlinkvals = qlv.getQualifiedLinks();
              for (int j = 0; j < qlinkvals.length; j++) {
                   qlinkval = qlinkvals[j];
                   qrec = qlinkval.getQualifiedLookupRecord();
                   tabId = qrec.getTable();
                   qfields = qlinkval.getQualifierFields();
                   vdrec = getTSVal(tsVals, qrec);
                   for (int k = 0; k < qfields.length; k++) {
                        fieldId = qfields[k];
                        fieldValue = qlinkval.getQualifierValue(fieldId).toString();
                        mstr = mdmHandle.getSchema().getField(tabId, fieldId).getName();
                        fieldName = mstr.get(MDMBase.LANG);
                        if (!(vdrec.containsKey(fieldName)))
                             continue;
                        String curTSVal = fieldValue;
                        String darTSval = vdrec.get(fieldName);
                        // If the DAR-updated value has a different timestamp, MDM has
                        // changed the value after we sent the last export to DAR.
                        // In that case we won't change the status flag.
                        if (!(darTSval.equalsIgnoreCase(curTSVal)))
                             continue;
                        FieldId updateFieldID = tsUpdateFields.get(fieldName);
                        // test
                        FieldId iid = tsUpdateFields.get("Parts Breakdown Timestamp");
                        // test
                        try {
                             MdmValue mvalTrue = MdmValueFactory.createBooleanValue(true);
                             qlv.setQualifierFieldValue(j, updateFieldID, mvalTrue);
                             // test
                             qlv.setQualifierFieldValue(j, iid, mvalTrue);
                             String qlv1 = qlv.getQualifierFieldValue(j, updateFieldID).toString();
                             String qlv2 = qlv.getQualifierFieldValue(j, iid).toString();
                             QualifiedLinkValue qlv11 = qlv.getQualifiedLinks()[1];
                             logger.info(qlv1 + ":" + qlv2);
                             // test
                        } catch (IllegalArgumentException e) {
                             // TODO Auto-generated catch block
                             e.printStackTrace();
                        }
                        isUpdated = true;
                   }
                   if (isUpdated) {
                        qlv.setCompleteSet(true);
                        mdmHandle.modifyRecord(record, true);
                        mdmHandle.modifyRecord(qrec, true);
                   }
              }
    Regards

    Apparently you can configure the fields that are shown in the pop-up window, but it also limits the fields that are displayed on the iView of the main table.

  • Error while trying to modify a Qualified Lookup Value.

    Hello,
    I have the following structure:
    Vendor  <-- Main Table
        vendorName
        vendorType
        BankDetails  <-- field of type qualified lookup
    BankDetails  <-- Qualified Table
        Country      (non-qualifier value)
        bankKeyCode  (qualifier value)
    Country  <-- Lookup Table
        code
        name
    I am trying to modify the BankDetails for a specific Vendor. I get a Record from the Country table and then create a link for the QualifiedLookupValue:
    qualifiedLookupValue = (QualifiedLookupValue) mainRecord.getFieldValue(bankDetailFieldId);
    int index = qualifiedLookupValue.createQualifiedLink(countryRecord.getId());
    qualifiedLookupValue.setQualifierFieldValue(index, bankKeyFieldId, new LookupValue(bankKeyRecord.getId()));
    mainRecord.setFieldValue(bankDetailFieldId, qualifiedLookupValue);
    And then I modify the main Record (Vendor).
    I am getting the error: com.sap.mdm.commands.CommandException: com.sap.mdm.internal.protocol.manual.ServerException: Server error (0xffaa6000)
    What I have found in the forum is that this error is related to "Record Not Found". However, I am actually getting a correct Country record from the Country table.
    Any idea?
    Many thanks in advance.
    Best Regards,
    Baez

    Hi Baez,
    You have to create a record with the Country record id in the BankDetails table, or get the record id if a record for that Country already exists in the BankDetails table, and use that record id to create the qualified link (a sketch follows below):
    1. Get the Country record id.
    2. Search the BankDetails table with the Country record id.
    3. If it exists in the BankDetails table, get its record id in the BankDetails table and pass it to the createQualifiedLink() method,
                    or
    3. if a Country record does not exist in the BankDetails table, create a new record in the BankDetails table with the Country record id, and pass the record id of the new record to the createQualifiedLink() method.
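
    A minimal sketch of those steps, reusing the field names from the original post. findOrCreateBankDetailsRecord() is a hypothetical helper standing in for steps 2 and 3 (it is not part of the MDM API); the key change is that the link is created against the BankDetails record id, not the Country record id:

    // Step 1: the Country record we looked up.
    RecordId countryId = countryRecord.getId();
    // Steps 2/3: hypothetical helper that searches BankDetails for this Country
    // and creates a new BankDetails record if none exists yet.
    RecordId bankDetailsId = findOrCreateBankDetailsRecord(countryId);

    QualifiedLookupValue qualifiedLookupValue =
            (QualifiedLookupValue) mainRecord.getFieldValue(bankDetailFieldId);
    // Link to the BankDetails record, not directly to the Country record.
    int index = qualifiedLookupValue.createQualifiedLink(bankDetailsId);
    qualifiedLookupValue.setQualifierFieldValue(index, bankKeyFieldId, new LookupValue(bankKeyRecord.getId()));
    mainRecord.setFieldValue(bankDetailFieldId, qualifiedLookupValue);
    // ...then modify the main Record (Vendor) as before.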
    I hope this helps you......
    Cheers,
    Veeru

  • Fetching Non-Qualifier values from the main table

    Hi All,
           I have a requirement where I need to get 3 non-qualifier values via the main table.
           These 3 non-qualifier values are present in 1 qualified table.
           Kindly send me some sample code for this scenario.
    Awaiting your reply.
    Regards

    Hi Sudheendra,
                  Get the record result set by searching one particular column (other than the field of type qualified lookup) of the main table,
    then follow these steps:
    // Get the qualified lookup value from the first record of the result set.
    QualifiedLookupValue LVAL = (QualifiedLookupValue) temprs.getRecords()[0].getFieldValue(
              repoSchemacmd.getRepositorySchema().getTableSchema("mainTableName").getField("MaintableFieldName").getId());
    QualifiedLinkValue[] qlinks = LVAL.getQualifiedLinks();
    for (int k = 0; k < qlinks.length; k++) {
         // Non-qualifier values live on the qualified-table record itself,
         // so retrieve that record by its lookup id, selecting the display fields.
         RecordId[] RID = new RecordId[1];
         RID[0] = qlinks[k].getLookupId();
         ResultDefinition rd = new ResultDefinition(repoSchema.getTable("qualifierTableName").getId());
         rd.setSelectFields(repoSchemacmd.getRepositorySchema().getTableSchema("qualifierTableName").getDisplayFieldIds());
         RetrieveRecordsByIdCommand retrievedRecords = new RetrieveRecordsByIdCommand(conAccessor);
         retrievedRecords.setSession(userSessionID);
         retrievedRecords.setIds(RID);
         retrievedRecords.setResultDefinition(rd);
         try {
              retrievedRecords.execute();
         } catch (CommandException e) {
              e.printStackTrace();
         }
         Record[] rec = retrievedRecords.getRecords().getRecords();
         for (int m = 0; m < rec.length; m++) {
              // If the field value is a String:
              String value = rec[m].getFieldValue(
                        repoSchemacmd.getRepositorySchema().getFieldId("qualifierTableName", "column")).toString();
         }
         // Qualifier values, by contrast, are read from the link itself:
         String value1 = qlinks[k].getQualifierValue(
                   repoSchemacmd.getRepositorySchema().getFieldId("qualifierTableName", "qualifiercolumn")).toString();
    }
    Regards,
    Sandeep.

  • Error "Value '' could not be converted."

    WPF - MVVM arch.  .Net ver 4.0, VS2010.
    I am trying to clear the selection in a ComboBox and get the error "Value '' could not be converted."
    The ComboBox's ItemsSource is bound to a key-value pair list. The SelectedValue is the Key and the DisplayMemberPath is bound to the Value.
    If the ItemsSource is bound to an ordinary data type, such as a string, and I clear the selected value in the ComboBox, this error doesn't occur. But I need a key-value pair since it's a lookup.
    I suspect the error occurs because the key-value pair list has no corresponding null entry, or the key cannot take a null value. Could this be a bug in the framework? How do I resolve this? I have seen blogs that say to use a nullable
    value and do the conversion, but that doesn't appear to be a good way to resolve this since an explicit conversion adapter has to be written. Is there a better way?

    I would have thought
    Bind SelectedValue to a nullable property.
    That's the key it's going to end up with.
    Your key might have to be a nullable type as well.
    Set that property to null.
    You might also have to do the synchronise thing
    IsSynchronizedWithCurrentItem="True"
    On the combo.
    Hope that helps.
    Technet articles: Uneventful MVVM;
    All my Technet Articles

  • New@SSIS - Excel Src: The value could not be converted because of a potential loss of data

    Good day,
    I am currently working on an SSIS package that had been built by a previous user. That package allows me to import data from an Excel source to a SQL table. The Excel sheet holds multiple columns; however, I am having difficulty with one
    column when running the package.
    The column rows hold a character and an integer (eg: N2). The data types are as follows:
    SQL – varchar(3),NULL
    SSIS - double-precision float [DT_R8]
    I then inserted a data conversion task to change the data type. The task holds the following properties:
    Input column – F6
    Output Alias – Copy of F6
    Data Type – string[DT_STR]
    Length – 50
    When I execute the package I receive the following error:
    [Excel Source Data [76]] Error: There was an error with output column "F6" (91) on output "Excel Source Output" (85). The column status returned was: "The value could not be converted because
    of a potential loss of data.".
    I do know that usually the message "The value could not be converted because of a potential loss of data." refers to a data type or length problem.
    Further insight to this problem will be much appreciated.

    Hi Zahid01,
    From the error message you posted, we can infer that the issue is caused by the input column "F6" failing to convert to the output column in the Excel Source.
    Based on my test, I can reproduce the issue in my environment. The exact cause is that the data type of input column "F6" cannot be converted to the data type of the corresponding output column. For example, if the data type of input column "F6" is [DT_STR], the data type
    of the corresponding output column is [DT_I4], and there are some non-numeric values in the input column "F6", the error occurs when we execute the task. We can verify the issue in the Advanced Editor dialog box of the Excel Source.
    To avoid this issue, please make sure the input column can be converted to the output column in the Excel Source, taking into account all the values in the input column.
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • CF9 Problems with Select * query, resulting in Value can not be converted to requested type.

    So, I work on a legacy CF web site and there are numerous SELECT * FROM USERS_TABLE queries all over the site.
    Well, we changed the structure of said table in the database on our Testing and Staging sites, with no issues.
    When we pushed up to our production environment and changed the structure of the production DB table, the server kept kicking back "Value can not be converted to requested type."
    After doing some searching out there, it looks like CF caches the structure of the table, and you either have to restart CF to clear it, or rename and then name-back the DSN to fix the issue.
    http://www.bennadel.com/blog/194-ColdFusion-Query-Error-Value-Can-Not-Be-Converted-To-Requested-Type.htm
    That said, this doesn't happen in our testing and staging environments - so what would be the difference?
    Is there some setting I need to change in the CF Admin to keep this from happening again?

    Also, if you can use a Stored Procedure to retrieve the data, do so.  Standard queries get all the information anyway, choke bandwidth passing it all to the CF server, and force the CF server to filter, sort, and format the data.  SPs tell the db server to get ONLY the data requested, and force the db server to filter and sort the data, leaving only formatting to the CF server.
    That's not true. The only time CF messes with data returned from the DB is if there's a maxrows attribute and the record set returned from the DB has more than that number of records... which causes CF to truncate the recordset to the correct size before returning it. The DB might or might not stop sending rows down to CF after CF says "yeah, I've got all I want now".
    Other than that, for all intents and purposes all CF does with the SQL is pass it to the DB and wait for an answer. The only thing it does to the returned data is to create a CF record set ("query") with it... this does not involve any filtering and sorting.
    Adam

  • Exception --- Value can not be converted to requested type.Error Code: 0

    Inside my query, I try to use a function, like
    Expression formattedRegDateCol = regDateCol.getFunction("to_char", "YYYY/MM/DD");
    Sometimes the query works fine and sometimes I get the following exception. Can anyone here help me out?
    Internal Exception: java.sql.SQLException: [BEA][Oracle JDBC Driver]Value can not be converted to requested type.Error Code: 0
    Call:SELECT COUNT(*), to_char(REG_DATE, 'YYYY/MM/DD') FROM SYSTEM_DETAIL WHERE ((REG_DATE BETWEEN {ts '2008-08-10 00:00:00.0'} AND {ts '2008-09-30 23:59:59.0'}) AND (GEOGRAPHIC_ID = 104)) GROUP BY to_char(REG_DATE, 'YYYY/MM/DD') ORDER BY to_char(REG_DATE, 'YYYY/MM/DD') ASC
    Query:ReportQuery(com.cad.registration.RegistrationDetailImpl)
         at com.cad.report.data.DocumentReturnedDataGenImpl.loadData(DocumentReturnedDataGenImpl.java:130)
         at com.cad.report.data.AbstractDataGen.getReport(AbstractDataGen.java:96)
         at com.cad.report.command.ProcessReportRequestOnMessage.run(ProcessReportRequestOnMessage.java:74)
         at com.cad.registration.command.AbstractCommand.execute(AbstractCommand.java:106)
         at com.cad.flow.core.Run.run(Unknown Source)
         at com.cad.flow.core.WorkflowComponent.run(Unknown Source)
         at com.cad.flow.ejbs.WorkflowManagerBean.execute(WorkflowManagerBean.java:238)
         at com.cad.flow.ejbs.WorkflowManagerBean.execute(WorkflowManagerBean.java:135)
         at com.cad.flow.ejbs.WorkflowManager_uh667k_EOImpl.execute(WorkflowManager_uh667k_EOImpl.java:132)
         at com.cad.flow.ejbs.WorkflowManager_uh667k_EOImpl_WLSkel.invoke(Unknown Source)
         at weblogic.rmi.internal.ServerRequest.sendReceive(ServerRequest.java:174)
         at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:335)
         at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:252)
         at com.cad.flow.ejbs.WorkflowManager_uh667k_EOImpl_1001_WLStub.execute(Unknown Source)
         at com.cad.util.WorkflowProxy.runWorkflow(WorkflowProxy.java:128)
         at com.cad.kbd.messenger.listener.mdb.ReportRequestMDB.onMessage(ReportRequestMDB.java:84)
         at weblogic.ejb.container.internal.MDListener.execute(MDListener.java:466)
         at weblogic.ejb.container.internal.MDListener.transactionalOnMessage(MDListener.java:371)
         at weblogic.ejb.container.internal.MDListener.onMessage(MDListener.java:327)
         at weblogic.jms.client.JMSSession.onMessage(JMSSession.java:4072)
         at weblogic.jms.client.JMSSession.execute(JMSSession.java:3964)
         at weblogic.jms.client.JMSSession$UseForRunnable.run(JMSSession.java:4490)
         at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:464)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:200)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:172)
    Caused by: Exception [TOPLINK-4002] (Oracle TopLink - 10g Release 3 (10.1.3.3.0) (Build 070620)): oracle.toplink.exceptions.DatabaseException
    Internal Exception: java.sql.SQLException: [BEA][Oracle JDBC Driver]Value can not be converted to requested type.Error Code: 0
    Call:SELECT COUNT(*), to_char(REG_DATE, 'YYYY/MM/DD') FROM SYSTEM_DETAIL WHERE ((REG_DATE BETWEEN {ts '2008-08-10 00:00:00.0'} AND {ts '2008-09-30 23:59:59.0'}) AND (GEOGRAPHIC_ID = 104)) GROUP BY to_char(REG_DATE, 'YYYY/MM/DD') ORDER BY to_char(REG_DATE, 'YYYY/MM/DD') ASC
    Query:ReportQuery(com.cad.registration.RegistrationDetailImpl)
         at oracle.toplink.exceptions.DatabaseException.sqlException(DatabaseException.java:282)
         at oracle.toplink.internal.databaseaccess.DatabaseAccessor.getObject(DatabaseAccessor.java:988)
         at oracle.toplink.internal.databaseaccess.DatabaseAccessor.fetchRow(DatabaseAccessor.java:780)
         at oracle.toplink.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:562)
         at oracle.toplink.internal.databaseaccess.DatabaseAccessor.executeCall(DatabaseAccessor.java:441)
         at oracle.toplink.threetier.ServerSession.executeCall(ServerSession.java:457)
         at oracle.toplink.internal.queryframework.DatasourceCallQueryMechanism.executeCall(DatasourceCallQueryMechanism.java:117)
         at oracle.toplink.internal.queryframework.DatasourceCallQueryMechanism.executeCall(DatasourceCallQueryMechanism.java:103)
         at oracle.toplink.internal.queryframework.DatasourceCallQueryMechanism.executeSelectCall(DatasourceCallQueryMechanism.java:174)
         at oracle.toplink.internal.queryframework.DatasourceCallQueryMechanism.selectAllRows(DatasourceCallQueryMechanism.java:481)
         at oracle.toplink.internal.queryframework.ExpressionQueryMechanism.selectAllRowsFromTable(ExpressionQueryMechanism.java:825)
         at oracle.toplink.internal.queryframework.ExpressionQueryMechanism.selectAllReportQueryRows(ExpressionQueryMechanism.java:791)
         at oracle.toplink.queryframework.ReportQuery.executeDatabaseQuery(ReportQuery.java:518)
         at oracle.toplink.queryframework.DatabaseQuery.execute(DatabaseQuery.java:620)
         at oracle.toplink.queryframework.ObjectLevelReadQuery.execute(ObjectLevelReadQuery.java:779)
         at oracle.toplink.queryframework.ReadAllQuery.execute(ReadAllQuery.java:451)
         at oracle.toplink.publicinterface.Session.internalExecuteQuery(Session.java:2089)
         at oracle.toplink.publicinterface.Session.executeQuery(Session.java:993)
         at oracle.toplink.publicinterface.Session.executeQuery(Session.java:965)
         at oracle.toplink.publicinterface.Session.executeQuery(Session.java:878)
         at com.cad.persistence.toplink.TopLinkPersistenceBrokerImpl.executeQuery(TopLinkPersistenceBrokerImpl.java:420)
         at com.cad.report.data.DocumentSys.getSummitedDocumentNumber(DocumentSys.java:332)
         at com.cad.report.data.DocumentSys.loadData(DocumentSys.java:110)
         ... 24 more
    Caused by: java.sql.SQLException: [BEA][Oracle JDBC Driver]Value can not be converted to requested type.
         at weblogic.jdbc.base.BaseExceptions.createException(Unknown Source)
         at weblogic.jdbc.base.BaseExceptions.getException(Unknown Source)
         at weblogic.jdbc.base.BaseData.getTimestamp(Unknown Source)
         at weblogic.jdbc.base.BaseResultSet.getTimestamp(Unknown Source)
         at weblogic.jdbc.wrapper.ResultSet_weblogic_jdbc_base_BaseResultSet.getTimestamp(Unknown Source)
         at oracle.toplink.internal.databaseaccess.DatabaseAccessor.getObjectThroughOptimizedDataConversion(DatabaseAccessor.java:1038)
         at oracle.toplink.internal.databaseaccess.DatabaseAccessor.getObject(DatabaseAccessor.java:942)
         ... 45 more

    Here is my code
    ExpressionBuilder builder = new ExpressionBuilder();
    ReportQuery query = new ReportQuery(RegistrationDetailImpl.class, builder);
    query.addArgument("startTime");
    query.addArgument("endTime");
    query.addArgument("lro");
    Expression lroCol = builder.get("lro");
    Expression regDateCol = builder
              .get(QueryConstants.RegistrationDetailQueryConstants.REGISTRATION_NUMBER_COL)
              .get(QueryConstants.RegistrationDetailQueryConstants.REGISTRATION_DATE_COL);
    Expression formattedRegDateCol = regDateCol.getFunction("to_char", "YYYY/MM/DD");
    Expression timeExp = regDateCol.between(builder.getParameter("startTime"),
              builder.getParameter("endTime"));
    Expression lroExp = lroCol.equal(builder.getParameter("lro"));
    query.setSelectionCriteria(timeExp.and(lroExp));
    query.addCount();
    query.addAttribute("regDate", formattedRegDateCol);
    query.addGrouping(formattedRegDateCol);
    query.addOrdering(formattedRegDateCol.ascending());
    descriptor.getDescriptorQueryManager().addQuery("testquery", query);

  • Lookup across two objects: list values in one object that are not in the other object

    Hi,
    I have two sources of reference numbers that tie our whole system together. Let's call them RefCodeSystemA and RefCodeSystemB. I would like to build a Webi report that shows the values in RefCodeSystemA that are NOT in RefCodeSystemB. I can easily find the matching ones, but not the NON-matching ones.
    Any suggestions on a simple query to do this?

    Hi Galvin,
                    Hope you can use Minus in the Query panel.
    We have Union, Intersection, and Minus, so you can get it from that.
    Please let me know if you need details for this.
    Regards
    Prashant
