Query Manager Tables

I see that the OQCN table holds the Query Manager categories. Does anyone know which DB table holds the actual query name and where, if anywhere, the SQL itself is stored?
Thanks

The OUQR table: QName holds the query name and QString holds the SQL itself.
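For reference, a minimal sketch of pulling the saved queries directly in SQL Server Management Studio, assuming OUQR.QCategory links to OQCN.CategoryId (worth verifying the column names on your version):

SELECT C.CatName AS Category,
       Q.QName   AS QueryName,
       Q.QString AS QuerySQL
FROM OUQR Q
LEFT JOIN OQCN C ON C.CategoryId = Q.QCategory
ORDER BY C.CatName, Q.QName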

Similar Messages

  • Which Table Store the Query in the Query Manager?

    Hello, I am looking for the table which stores all the queries inside the Query Manager.
    From this I can retrieve the queries without opening the program.
    Please help. Thanks.

    Hi William,
    Try these queries in MSSQL Server Management Studio (don't use the Query Generator):
    1. Run this query for the Query Manager category details:
    SELECT * FROM OQCN
    2. Run this query for the query details in the Query Manager:
    SELECT * FROM OUQR
    Regards,
    Madhan.

  • Tables related to the Query Manager

    I have been looking for the table that holds the data on preformatted queries.
    I found the OUQR table, which shows the basics of every query in the Query Manager.
    What other table(s) hold data related to the queries in the Query Manager?

    Great, are there any other tables?
    When I run a query it states I have 255 characters but shows no table info.
    Other than OUQR and OQCN, are there any other table(s) that contain more data on the queries?
    Thanks

  • Results table difference when running query from Alert compared to Query Manager

    Hello,
    I have the following query - the aim is to create an alert telling an employee which customers to visit in the next 4 weeks:
    SELECT DISTINCT T1.CardName, T1.U_VisitDue, T1.U_VisitReason, T1.U_Priority, T1.U_Region as 'Area', T1.U_VNotes
    FROM dbo.OCRD T1
    WHERE DateDiff(d,T1.U_VisitDue, GETDATE()) <29
    Group BY T1.CardName, T1.U_VNotes, T1.U_VisitDue, T1.U_VisitReason, T1.U_Priority, T1.U_Region
    FOR BROWSE
    My problem is that when I run the query through the alert, the field U_VNotes is displayed differently.
    Running the query from the Query Manager, the contents of this field appear in one field of the results table.
    Running the query via the alert, the content of this field is split into several fields in the results table, one per 'new line' in the field entry.
    This makes the results table less 'user-friendly': if you want to sort the table by the 'Date' column, for example, the extra rows make a mess of it.
    I want the results table to look like the top version in the picture below ... but I want to run it from the alert.
    Is this possible?
    How can I achieve it?
    Thanks for any assistance
    Regards, Karen

    Hi
    check this support note: 1774628 - The SQL SELECT DISTINCT Statement does not work in ALERTS
    Kind regards
    Agustín Marcos Cividanes
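    If the note does not fully resolve it, one workaround sometimes used is to flatten the line breaks in U_VNotes before the alert sees them, so there is nothing for the alert to split on; a rough sketch of the idea, assuming the breaks are CR/LF characters (untested against your data):

    SELECT DISTINCT T1.CardName, T1.U_VisitDue, T1.U_VisitReason, T1.U_Priority,
           T1.U_Region AS 'Area',
           REPLACE(REPLACE(T1.U_VNotes, CHAR(13), ' '), CHAR(10), ' ') AS 'U_VNotes'
    FROM dbo.OCRD T1
    WHERE DATEDIFF(d, T1.U_VisitDue, GETDATE()) < 29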

  • How to add drop down list for query manager report in sap business one

    Hi Every one,
    I need a drop-down list for parameter selection in the SAP Business One Query Manager. I have two parameters: 'Sales Order' and 'Invoice'.
    Please suggest.
    Thanks and Regards
    DEV

    Hi,
    you need to use this :
    /*select from [dbo].[OINV] T2*/
    DECLARE @Invoice varchar(100)
    /*where*/
    set @Invoice =/* T2.DocNum  */N'[%2]'
    You can change the tables and the parameter number, but you have to write it exactly that way.
    When you run the query within SBO you will get a list of objects (in this case, a list of invoices).
    Hope it was helpful.
    Shachar
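    Building on Shachar's pattern, a sketch for the two parameters in the question, assuming ORDR.DocNum for the sales order and OINV.DocNum for the invoice (the commented lines must be kept exactly as written):

    /*SELECT FROM [dbo].[ORDR] T0*/
    DECLARE @SalesOrder varchar(100)
    /*WHERE*/
    SET @SalesOrder = /* T0.DocNum */ N'[%0]'

    /*SELECT FROM [dbo].[OINV] T1*/
    DECLARE @Invoice varchar(100)
    /*WHERE*/
    SET @Invoice = /* T1.DocNum */ N'[%1]'

    SELECT @SalesOrder AS 'Selected Sales Order', @Invoice AS 'Selected Invoice'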

  • Pass parameter to sql statement in query manager

    Hi all,
    I want to pass a percentage as a parameter into the SQL statement and execute it in the Query Manager.
    If I execute the statement, a "cannot find the table name" error comes up.
    I want to pass general data (not data from a table) to the parameter in the SQL at runtime.
    For example:
    select [%0] * 100
    How do I pass 10 to that SQL statement?
    Please help me...
    Regards,
    Raji.

    Hi Ramya,
    You can create a stored procedure that accepts parameters and then execute it from the SAP Business One Query Manager, passing the parameter (in your case 10). The result will be as desired.
    Ex:
    Create this Procedure in SQL Management Studio
    create proc Test (@a as int)
    as
    begin
        select (@a * 100)
    end
    To execute it, use this query and pass the desired value as the parameter:
    execute Test 10
    Regards,
    Reno
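    If a stored procedure is not wanted, the same parameter can also be bound directly in the Query Manager using the comment scaffold shown elsewhere in this thread; a rough sketch, where OINV.DocTotal is only borrowed to give the prompt a numeric context (any numeric field would do):

    DECLARE @pct numeric(19, 6)
    /*SELECT FROM [dbo].[OINV] T0*/
    /* WHERE */
    SET @pct = /* T0.DocTotal */ [%0]
    SELECT @pct * 100 AS 'Result'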

  • Query Manager Problem

    Here's an interesting problem a client is having. I have a simple query for them that retrieves some information from the primary B1 database, along with a UDT. The query works just fine inside SQL Server Management Studio. When we implemented it in the Query Manager, before saving it, it worked just fine. We then saved it and tried running it again: one of the columns is not visible. When we click the edit button and try running it, it still doesn't work. But when we add or delete whitespace while in edit mode, the query works just fine. I haven't been able to find anything in the notes yet; the customer is on 2007A PL 10 but they are planning on upgrading to 8.8+.
    Any ideas on why the Query Manager is behaving this way? Is it a bug that's been fixed in a later patch for 2007A, or is it something else entirely?

    You may add an index to the U_slsm column.
    Update your query as:
    Select T0.U_slsm as 'Slsm',
    LEFT(T0.Code,9) as 'PC No',
    T0.U_date,
    T3.CardCode,
    T1.DocNum as 'Shop',
    T2.DocNum as 'Quote',
    T0.U_amount,
    T0.U_dm,
    T0.U_comment
    From [dbo].[@ORDERLOG] T0
    Left Join dbo.ORDR T1 on T0.U_DocNum = T1.DocNum
    Left Join dbo.OQUT T2 on T0.U_quote = T2.DocNum
    Left Join dbo.OCRD T3 on T0.U_CardCode = T3.CardCode
    Where T0.U_slsm = '[%0]'
    Order By T0.U_date Desc, T0.U_amount
    I have a doubt regarding your links to those 3 system tables. Do those documents have internal links?

  • Select Query LQUA Table

    Hi Experts
    Can anyone tell me if there is anything wrong with this SELECT query? As far as I know everything is fine, but I am still getting SY-SUBRC = 4 for it.
    It uses the Warehouse Management table LQUA.
    TABLES:mara,lqua.
    *Internal Table and Work Area Declaration
    DATA: it_lqua1 TYPE TABLE OF lqua WITH HEADER LINE.
    DATA: wa_tmp_lqua TYPE lqua.
    CONSTANTS: c_x TYPE char1 VALUE 'X',
               c_t TYPE char1 VALUE 'Q'.
    *Selection Screen
    SELECTION-SCREEN BEGIN OF BLOCK blk1 WITH FRAME TITLE text-001.
    PARAMETERS:P_HLDR TYPE lqua-Z_HLDREF1.
    SELECT-OPTIONS:s_matnr FOR mara-matnr,
                   s_date  FOR sy-datum DEFAULT sy-datum,
                   s_mfgt  FOR sy-uzeit DEFAULT sy-uzeit,
                   s_CHARG FOR lqua-CHARG.
    SELECTION-SCREEN END OF BLOCK blk1.
    START-OF-SELECTION.
    Select * from LQUA
                into table IT_LQUA1
                where MATNR in S_MATNR
                and BDATU in S_DATE
                and Z_MFGTIM in S_MFGT
                   AND CHARG IN S_CHARG.
    After debugging the above query I am getting SY-SUBRC = 4.
    Data matching the selection criteria is available in the LQUA table.
    Please help if anyone has worked with the WM LQUA table.

    Hi
    As per your code, S_DATE and S_MFGT default to the current system date and system time.
    I believe you do not have data for this date and time in your table LQUA; that is why you are getting sy-subrc = 4,
    because all of these are in the WHERE condition with the AND operator, so every condition must match or you will not find any data.
    So check your table once again for the given values, i.e. your system date and current system time.
    Thanks
    Lalit Gupta

  • While Executing the Sp in Query manager Getting Error.

    Hi,
    I need some information.
    I created a small table in SQL:
    Column   Data Type   Allow Nulls   Length
    ccode    varchar     no            250
    cname    varchar     no            250
    ctype    varchar     no            250
    My stored procedure in SQL:
    I am inserting values into the temporary table using the procedure below, after passing the parameters.
    I execute it with:
    EXEC uspGetAddress1 'cardcode', 'cardname', 'c'
    If I execute the stored procedure in SQL, the values are inserted. I want to do the same thing in SAP B1 from the Query Manager.
    USE [WCTBRPLDB21-05-2014]
    GO
    /****** Object:  StoredProcedure [dbo].[uspGetAddress1]    Script Date: 8/1/2014 4:07:20 PM ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    ALTER Procedure [dbo].[uspGetAddress1]
    @Cuscode varchar(250),
    @customertName VARCHAR(250) ,
    @custype varchar(250)
    As
    Begin
    DECLARE @Cur_Product CURSOR
    set @Cur_Product= cursor for select CardCode,CardName,CardType  from ocrd  where  CardCode =@Cuscode and   CardName= @customertName  and  cardtype =@custype
    open @Cur_Product
           fetch next
           from @Cur_Product into @Cuscode,@customertName,@custype
           while @@FETCH_STATUS = 0
           begin
           insert into custupdate (ccode,cname,ctype) values (@Cuscode,@customertName,@custype)
           fetch next
           from @Cur_Product into @Cuscode,@customertName,@custype
           end
           close @Cur_Product
           deallocate @Cur_Product
        select * from custupdate
    End
    @Cuscode varchar(250),
    @customertName VARCHAR(250) ,
    @custype varchar(250),
    select @Cuscode = T0.CARDCODE from OCRD T0 where T0.CARDCODE = '[%01]' AND
    select @CARDNAME = T0.CARDNAME from OCRD T0 where T0.CARDNAME = '[%02]' AND
    select @CARDTYPE = T0.CARDTYPE from OCRD T0 where T0.CARDTYPE = '[%03]'
    uspGetAddress1 @Cuscode ,@CARDNAME ,@CARDTYPE
    The above is what I pasted into the Query Manager.
    If I run it, it asks for the parameters,
    but after executing it
    I get an error saying "must declare the scalar variable".
    I am already declaring the scalar variables, but it is not running.
    Please share any information:
    is there a declaration problem in the Query Manager?

    Try below in query generator:
    DECLARE @Cuscode varchar(250)
    DECLARE @customertName varchar(250)
    DECLARE @custype varchar(250)
    /*SELECT FROM [dbo].[OCRD] T0*/
    /* WHERE */
    SET @Cuscode = /* T0.CardCode */ '[%0]'
    /*SELECT FROM [dbo].[OCRD] T0*/
    /* WHERE */
    SET @customertName = /* T0.CardName */ '[%1]'
    /*SELECT FROM [dbo].[OCRD] T0*/
    /* WHERE */
    SET @custype = /* T0.CardType */ '[%2]'
    EXEC uspGetAddress1 @Cuscode,@customertName,@custype
    **Don't remove comments
    Thanks
    Navneet

  • Query Manager- Filter List Missing Values in the Filter List.

    Hello experts,
    We would like your help in the following issue.
    We are using SAP B1 9.0 (PL10).
    When we run a query report from the Query Manager in SAP B1, the displayed report results contain an extended list of names, from A to Z. When we use the filter button in this report, the filter field for the BP names lists only the names from A to M, and not the rest.
    Please advise us, if anyone has faced the same issue.
    Thank you in advance,
    Vassilios Korolis.

    Hi,
    Please check SAP note:
    1012827 - Filter table function displays only up to 499 values
    Thanks & Regards,
    Nagarajan

  • Print Preview for Query Manager

    Hi all
    When I open the Query Manager and run a report query, it shows the correct information; however, when I click on print preview it brings up a Print Preferences window with a choice of window, table, or selection area. I want the selection to be table permanently.
    What do I need to do to skip past that window and go directly to the print preview in table format?

    There is no option available to skip the Print Preferences window or to make the selection permanent.
    As an alternative, export the result to Excel and print from there, or
    create a Query Print Layout for the query by clicking the 'Create Report' button in the Query Manager window, and print the Query Print Layout via Tools -> Queries -> Query Print Layout.

  • The Connection String for the Query Log table is automatically encrypted.

    When I try to use Usage Based Optimization to apply an aggregation design to my measure group, it shows me the following error message:
    The connection string cannot be found. Open Microsoft SQL Server Management Studio and, in the Analysis Server Properties page, check the value of the Log\QueryLog\QueryLogConnectionString property.
    I encountered this error about two weeks ago. At that time I just reset the connection string and everything seemed to be fine. A week ago, I successfully applied Usage Based Optimization for one of my cubes. However, when I tried to apply UBO for my other cubes today, I encountered the same issue again! I believe no one has changed the connection string property.
    Also, if I query the query log table, I can see the latest queries made by the users, so I'm sure queries are still being logged into this table.
    This is really strange. Has anyone else encountered the same issue? Thanks.

    Hello Thomas,
    I encountered this issue and am struggling to solve it. If you have resolved it, and I guess you must have since this post is two years old, could you kindly post how you resolved it?
    Thanks in advance
    Best Regards,
    Neeraja
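    For anyone checking whether logging is still active before re-running UBO, the query log can be inspected directly in SQL Server; a minimal sketch, assuming the default dbo.OlapQueryLog table that Analysis Services creates for usage-based optimization:

    SELECT TOP 20 MSOLAP_Database, MSOLAP_ObjectPath, MSOLAP_User, StartTime, Duration
    FROM dbo.OlapQueryLog
    ORDER BY StartTime DESC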

  • Query Dictionary tables and V$ views in EM

    Hello;
    I want to query dba_ tables and v$ views from Enterprise Manager (10g).
    I have tried under Tables -> SYS schema, but these tables are not shown there, and I am unable to query them from EM.
    I am able to query them from SQL*Plus, where I have logged in as SYS:
    select status from v$instance;
    From where can I access these tables in EM?
    v$instance
    v$database
    dba_tablespaces
    Edited by: Zerandib on Dec 8, 2009 8:39 AM

    Zerandib wrote:
    From where can i access these tables; (in EM)
    v$instance
    v$database
    dba_tablespaces
    Those are not tables, they are views. Try looking under Administration|Schema|Views with the schema set to SYS. Keep in mind that many of the V$ names, such as V$INSTANCE, are public synonyms for V_$ views. For example, V$INSTANCE is a public synonym for V_$INSTANCE.
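    To confirm what a V$ name resolves to before looking for it in EM, a quick check from a privileged SQL*Plus session; a sketch using the standard dictionary view DBA_SYNONYMS:

    -- Which object does the public synonym point at?
    SELECT owner, table_owner, table_name
    FROM dba_synonyms
    WHERE synonym_name = 'V$INSTANCE';

    -- The view itself still works from any suitably privileged session
    SELECT status FROM v$instance;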

  • Unable to export Hive managed table using Oraloader

    Hi,
    I am using MySQL as the Hive metastore and am trying to export a Hive managed table using OraLoader.
    I get the following exception in the JobTracker:
    2012-09-12 12:23:56,337 INFO org.apache.hadoop.mapred.JobTracker: Job job_201209121205_0007 added successfully for user 'oracle' to queue 'default'
    2012-09-12 12:23:56,338 INFO org.apache.hadoop.mapred.AuditLogger: USER=oracle IP=192.168.1.5 OPERATION=SUBMIT_JOB TARGET=job_201209121205_0007 RESULT=SUCCESS
    2012-09-12 12:23:56,353 INFO org.apache.hadoop.mapred.JobTracker: Initializing job_201209121205_0007
    2012-09-12 12:23:56,353 INFO org.apache.hadoop.mapred.JobInProgress: Initializing job_201209121205_0007
    2012-09-12 12:23:56,594 INFO org.apache.hadoop.mapred.JobInProgress: jobToken generated and stored with users keys in /opt/ladap/common/hadoop-0.20.2-cdh3u1/hadoop-datastore/mapred/system/job_201209121205_0007/jobToken
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: Input size for job job_201209121205_0007 = 5812. Number of splits = 2
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: tip:task_201209121205_0007_m_000000 has split on node:/default-rack/hadoop-namenode
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: tip:task_201209121205_0007_m_000001 has split on node:/default-rack/hadoop-namenode
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobInProgress: job_201209121205_0007 LOCALITY_WAIT_FACTOR=1.0
    2012-09-12 12:23:56,607 ERROR org.apache.hadoop.mapred.JobTracker: Job initialization failed:
    java.lang.NegativeArraySizeException
    at org.apache.hadoop.mapred.JobInProgress.initTasks(JobInProgress.java:748)
    at org.apache.hadoop.mapred.JobTracker.initJob(JobTracker.java:4016)
    at org.apache.hadoop.mapred.EagerTaskInitializationListener$InitJob.run(EagerTaskInitializationListener.java:79)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:679)
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobTracker: Failing job job_201209121205_0007
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobInProgress$JobSummary: jobId=job_201209121205_0007,submitTime=1347467036196,launchTime=1347467036607,,finishTime=1347467036607,numMaps=2,numSlotsPerMap=1,numReduces=0,numSlotsPerReduce=1,user=oracle,queue=default,status=FAILED,mapSlotSeconds=0,reduceSlotsSeconds=0,clusterMapCapacity=10,clusterReduceCapacity=2
    2012-09-12 12:23:56,639 INFO org.apache.hadoop.mapred.JobHistory: Moving file:/opt/ladap/common/hadoop/logs/history/hadoop-namenode_1347465941865_job_201209121205_0007_oracle_OraLoader to file:/opt/ladap/common/hadoop/logs/history/done
    2012-09-12 12:23:56,648 INFO org.apache.hadoop.mapred.JobHistory: Moving file:/opt/ladap/common/hadoop/logs/history/hadoop-namenode_1347465941865_job_201209121205_0007_conf.xml to file:/opt/ladap/common/hadoop/logs/history/done
    My oraloader console log is below:
    [oracle@rakesh hadoop]$ bin/hadoop jar oraloader.jar oracle.hadoop.loader.OraLoader -conf olh-conf/TestAS/scott/testmanagedtable/conf.xml -fs hdfs://hadoop-namenode:9000/ -jt hadoop-namenode:9001
    Oracle Loader for Hadoop Release 1.1.0.0.1 - Production
    Copyright (c) 2011, Oracle and/or its affiliates. All rights reserved.
    12/09/12 12:23:42 INFO loader.OraLoader: Oracle Loader for Hadoop Release 1.1.0.0.1 - Production
    Copyright (c) 2011, Oracle and/or its affiliates. All rights reserved.
    12/09/12 12:23:47 INFO loader.OraLoader: Sampling disabled, table: LDP_TESTMANAGEDTABLE is not partitioned
    12/09/12 12:23:47 INFO loader.OraLoader: oracle.hadoop.loader.loadByPartition is disabled, LDP_TESTMANAGEDTABLE is not partitioned
    12/09/12 12:23:47 INFO output.DBOutputFormat: Setting reduce tasks speculative execution to false for : oracle.hadoop.loader.lib.output.JDBCOutputFormat
    12/09/12 12:23:47 INFO loader.OraLoader: Submitting OraLoader job OraLoader
    12/09/12 12:23:50 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
    12/09/12 12:23:50 INFO metastore.ObjectStore: ObjectStore, initialize called
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
    12/09/12 12:23:51 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
    12/09/12 12:23:51 INFO DataNucleus.Persistence: Property javax.jdo.option.NonTransactionalRead unknown - will be ignored
    12/09/12 12:23:51 INFO DataNucleus.Persistence: ================= Persistence Configuration ===============
    12/09/12 12:23:51 INFO DataNucleus.Persistence: DataNucleus Persistence Factory - Vendor: "DataNucleus" Version: "2.0.3"
    12/09/12 12:23:51 INFO DataNucleus.Persistence: DataNucleus Persistence Factory initialised for datastore URL="jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true" driver="com.mysql.jdbc.Driver" userName="root"
    12/09/12 12:23:51 INFO DataNucleus.Persistence: ===========================================================
    12/09/12 12:23:52 INFO Datastore.Schema: Creating table `DELETEME1347467032448`
    12/09/12 12:23:52 INFO Datastore.Schema: Schema Name could not be determined for this datastore
    12/09/12 12:23:52 INFO Datastore.Schema: Dropping table `DELETEME1347467032448`
    12/09/12 12:23:52 INFO Datastore.Schema: Initialising Catalog "hive", Schema "" using "None" auto-start option
    12/09/12 12:23:52 INFO Datastore.Schema: Catalog "hive", Schema "" initialised - managing 0 classes
    12/09/12 12:23:52 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
    12/09/12 12:23:52 INFO DataNucleus.MetaData: Registering listener for metadata initialisation
    12/09/12 12:23:52 INFO metastore.ObjectStore: Initialized ObjectStore
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 11, column 6 : cvc-elt.1: Cannot find the declaration of element 'jdo'. - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 312, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 359, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 381, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 416, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 453, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 494, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 535, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 576, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 621, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 666, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : `DBS`, InheritanceStrategy : new-table]
    12/09/12 12:23:53 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table : `DATABASE_PARAMS`]
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 index(es) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 0 foreign key(s) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 unique key(s) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 index(es) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 1 foreign key(s) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 1 unique key(s) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MDatabase
    12/09/12 12:23:54 INFO metastore.HiveMetaStore: 0: get_table : db=db_1 tbl=testmanagedtable
    12/09/12 12:23:54 INFO HiveMetaStore.audit: ugi=oracle     ip=unknown-ip-addr     cmd=get_table : db=db_1 tbl=testmanagedtable     
    12/09/12 12:23:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MSerDeInfo [Table : `SERDES`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MStorageDescriptor [Table : `SDS`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MTable [Table : `TBLS`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MSerDeInfo.parameters [Table : `SERDE_PARAMS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.parameters [Table : `TABLE_PARAMS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.partitionKeys [Table : `PARTITION_KEYS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.bucketCols [Table : `BUCKETING_COLS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.cols [Table : `COLUMNS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.parameters [Table : `SD_PARAMS`]
    12/09/12 12:23:55 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.sortCols [Table : `SORT_COLS`]
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 index(es) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 0 foreign key(s) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 4 index(es) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 foreign key(s) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 unique key(s) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SORT_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SORT_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SORT_COLS`
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MSerDeInfo
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MStorageDescriptor
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MTable
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MFieldSchema
    12/09/12 12:23:55 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    12/09/12 12:23:55 WARN snappy.LoadSnappy: Snappy native library not loaded
    12/09/12 12:23:55 INFO mapred.FileInputFormat: Total input paths to process : 1
    12/09/12 12:23:56 INFO mapred.JobClient: Running job: job_201209121205_0007
    12/09/12 12:23:57 INFO mapred.JobClient: map 0% reduce 0%
    12/09/12 12:23:57 INFO mapred.JobClient: Job complete: job_201209121205_0007
    12/09/12 12:23:57 INFO mapred.JobClient: Counters: 0
    [oracle@rakesh hadoop]$
    Please help. Thanks in advance.
    Regards,
    Rakesh Kumar Rakshit

    Hi Rakesh,
    Can you share the conf.xml and map.xml files you are using? I am trying to do the same (export from Hive to an Oracle DB) and I get the following exception: ClassNotFoundException: org.apache.hadoop.hive.metastore.TableType.
    Best regards,
    Bilal

  • How can I show ItemName in the Query Manager?

    If I do a query using items, when I choose the range of items it only shows ItemCode and the number of records. How can I also show ItemName?
    Thanks.

    Dear Mariano,
    You can get detailed information about all the fields of the OITM table
    by doing the following steps:
    1. Open the Query Manager in the SAP Business One application.
    2. Select the table OITM.
    You will then get detailed information about the fields available in the OITM table, with
    descriptions.
    After that you can select the fields of the OITM table as per your requirement.
    Regards,
    Rakesh Pati
    SAP Business One Forum Team
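    As a concrete example, a query returning both code and name over a range of items, assuming the standard OITM fields ItemCode and ItemName:

    SELECT T0.ItemCode, T0.ItemName
    FROM OITM T0
    WHERE T0.ItemCode >= '[%0]' AND T0.ItemCode <= '[%1]'
    ORDER BY T0.ItemCode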
