Regarding tables used in recipe management

Hi,
  I would like to know the tables and their relations for Recipe Management. If anybody knows, please share.
Regards,
Prasad

Dear Prasad,
please have a look at the report RMTABLEINFO. In this report you will find the tables listed under the data definition.
I hope it helps.
Best regards,
Roland Freudenberg
Edited by: Roland Freudenberg  on Nov 24, 2008 4:31 PM

Similar Messages

  • Tables used by Oracle Identity Manager 11g

    Good evening friends,
    I have a query about the tables used by Identity Manager 11g.
    Is there a document that tells me the tables used by Oracle Identity Manager 11g and how they are linked?
    I would be very grateful for your help.

    This does not exist for 11g. Only 9.x has a database diagram. Most of the tables are the same; it's just the request side that is new, due to the SOA integration.
    -Kevin

  • Tables used in Solution Manager

    Hi all,
    Please let me know the details of the tables used in SolMan.
    Thanks
    Anish

    Hi,
      I do not have that program in my system.
    Anyway, try an SQL trace:
    1. Activate the SQL trace in transaction ST05.
    2. Go to the program and execute it with some selection criteria.
    3. Deactivate the trace and display it.
    4. You then get all the tables and technical information for that program.
    Try this.
    Thanks,
    Naveen.I

  • Is it possible to add records to tables using Enterprise Manager?

    Can we add records to tables using the Oracle Enterprise Manager web interface instead of SQL*Plus commands? If not, is there an easy way to insert data into a table using a graphical interface?

    You may consider using iSQL*Plus.
    Check http://www.oracle.com/technology/obe/obe10gdb/install/isqlplus/isqlplus.htm
    Thanks,
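    For what it's worth, the statement you run in iSQL*Plus is still plain SQL; a minimal sketch (the table and values here are made up for illustration):
    -- Hypothetical table and values, for illustration only
    INSERT INTO dept (deptno, dname, loc)
    VALUES (50, 'RESEARCH', 'BOSTON');
    COMMIT;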

  • How to map a lookup main table field in another main table using MDM 7.1?

    We created a new SAP MDM 7.1 repository with multiple main tables.  The first main table, ProductMaster, contains product information.  ProductCode is the primary key and the only display field for the table during the data loading process. The second main table, ProductByRegion, has a main table lookup field ProductCode and a RegionId field.  These two fields (ProductCode and RegionId) combine as the PK for this main table.  Both main tables have key mapping enabled.
    I was able to load the ProductMaster table using Import Manager.  But I'm having trouble loading data into the ProductByRegion table using MDM Import Manager.  Although I have met all 5 requirements below (excerpted from the MDM Import Manager Reference Guide, p. 222), the ProductCode won't show up in the destination value pane.  If I map all ProductCode values to the NULL field, ProductCode won't load.  If I 'Add' all ProductCode values to the destination value pane, Import Manager adds duplicated rows to the ProductMaster table while loading only 1 record into the ProductByRegion table.  I can't get ProductCode to show up in the Matching Destination Field list.  When I checked the ProductMaster records in MDM Data Manager, right-clicked on one of the records, and chose Edit Key Mappings, it didn't show anything.  However, if I right-click on one of those duplicated rows, Edit Key Mappings shows the remote system and key correctly.
    What did I do wrong?  How can I fix the problem?
    Thank you for help in advance.
    From: SAP MDM Import Manager Reference Guide:
    Mapping to Main Table Lookup Destination Fields
    Import Manager handles main table lookup fields (Lookup [Main])
    differently than other MDM lookup fields. Specifically, Import Manager
    does not display the complete set of display field values of the records
    of the underlying lookup table. Instead, the values it displays for a main
    table lookup field are limited by both the key mappings for the lookup
    table and the values in the source file.
    Also, Import Manager does not automatically display the values of a
    Lookup [Main] destination field in the Destination Values grid when you
    select the field in the Destination Fields grid. Instead, for a main table
    lookup field value to appear in the Destination Values grid, all of the
    following conditions must be met:
    • The lookup table must have key mapping enabled
    • The lookup field must be mapped to a source field
    • The source field must contain key values for the lookup table
    • The destination value must have a key on the current remote system
    • The destination value's key must match a source field value
    NOTE ►► The current remote system is the remote system that was
    selected in Import Manager's Connect to Source dialog (see
    "Connecting to a Remote System" on page 416 for more information).
    Vicky

    Hi Michael,
    Thank you very much for your response.  I'm new to SAP MDM and need some clarification and help regarding your solution.
    I did use two maps to load ProductMaster and ProductByRegion separately.  Here were my steps:
    1. Create the main table ProductMaster with key mapping enabled at the table level, and set ProductCode as unique and writable once (the primary key).
    2. Create a map to load ProductMaster records from a staging table located in an Oracle database.  But key mapping didn't show anything when I looked at the records using Data Manager.
    3. Create the main table ProductByRegion with a lookup field pointing at the ProductMaster table.  This field and RegionId combine as a unique key for the ProductByRegion table.
    4. Create a map to load the ProductByRegion table.  But ProductCode records only show in the source pane, not the destination pane, and can't be mapped properly.
    My questions:
    1. How can I "Ensure that you add key mapping info for all ProductMaster records" besides enabling key mapping at the table level?
    2. How can I define a concatenation of ProductCode and RegionId as a REMOTE KEY?
    Thanks a lot for your help!
    Vicky

  • Creating a DWMQY DIMENSION using Analytic Workspace Manager

    Hi everyone,
    I need some help creating a "time aware" (DAY, WEEK, MONTH, QUARTER, and YEAR) dimension using Analytic Workspace Manager.
    Let me give you some background. I'm coming from a traditional "Oracle Express" OLAP background where all our data is stored in cubes, and these are defined, populated and operated on using OLAP DML; there is no SQL and there are no traditional relational tables involved.
    I now want to pull data from relational tables into some OLAP cubes and am using Analytic Workspace Manager to do this (maybe this is not the best way?)
    Let me explain what I'm trying to achieve. In OLAP worksheet I can type the following DML commands:
    DEFINE MY_DAY DIMENSION DAY
    MAINTAIN MY_DAY ADD TODAY '01JAN2011'
    What this will do is create a "day dimension" and populate it with values for each and every day between 1st Jan 2011 and today. It will be fully "time aware", so you can use date functions such as DAYOF to limit the MY_DAY dimension to all the Fridays, etc. Similarly, if I define a "month dimension" there will be an automatic implicit relationship between these two dimensions; this relationship and time-aware cleverness is built into Oracle.
    However, a dimension defined using DML commands (and indeed any object created using the DML language) is not visible in Analytic Workspace Manager (as there is no metadata for it?), and for the life of me I cannot work out how to create such a dimension using AWM. If I create a "Time Dimension" then, as far as I can tell, this is not a proper time dimension but merely a text dimension which, I presume, I have to teach time awareness.
    Can anyone help me? I have no issues creating and populating cubes from relational tables using Analytic Workspace Manager; the only issue I have is creating a "proper" time-aware dimension.
    Many thanks in anticipation.
    Ian.

    When a dimension is of type "TIME" in AWM, then for each member of that dimension, you need END_DATE and TIMESPAN attributes in addition to the key column and description column.
    So in your case, if there are 5 levels (DAY -> WEEK -> MONTH -> QTR -> YEAR),
    then you will need at least 15 columns in your source SQL table/view,
    or 20 columns if you have a separate column for each description.
    For example the columns in your source table/view could be:
    DAY_ID,
    DAY_DESC,
    DAY_END_DATE, (which will be that day's date)
    DAY_TIMESPAN, (which will be 1)
    WEEK_ID,
    WEEK_DESC,
    WEEK_END_DATE,
    WEEK_TIMESPAN,
    MONTH_ID,
    MONTH_DESC,
    MONTH_END_DATE,
    MONTH_TIMESPAN,
    QTR_ID,
    QTR_DESC,
    QTR_END_DATE,
    QTR_TIMESPAN,
    YEAR_ID,
    YEAR_DESC,
    YEAR_END_DATE,
    YEAR_TIMESPAN
    Just "map" this table/view to the 5-level time dimension in AWM.
    NOTE that behind the scenes a lot of useful structures are automatically defined to support time-series measures,
    and a lot of calculation templates are available as well.
    Since you come from an Express background, I have to say: try to use the new OLAP Expression Syntax when creating calculated measures, instead of OLAP DML.
    It is very rare these days that we need OLAP DML.
    Edited by: Nasar on Nov 22, 2012 12:11 PM

  • Unable to export Hive managed table using Oraloader

    Hi,
    I am using MySQL as the Hive metastore and am trying to export a Hive managed table using OraLoader.
    I get the following exception in the JobTracker log:
    2012-09-12 12:23:56,337 INFO org.apache.hadoop.mapred.JobTracker: Job job_201209121205_0007 added successfully for user 'oracle' to queue 'default'
    2012-09-12 12:23:56,338 INFO org.apache.hadoop.mapred.AuditLogger: USER=oracle IP=192.168.1.5 OPERATION=SUBMIT_JOB TARGET=job_201209121205_0007 RESULT=SUCCESS
    2012-09-12 12:23:56,353 INFO org.apache.hadoop.mapred.JobTracker: Initializing job_201209121205_0007
    2012-09-12 12:23:56,353 INFO org.apache.hadoop.mapred.JobInProgress: Initializing job_201209121205_0007
    2012-09-12 12:23:56,594 INFO org.apache.hadoop.mapred.JobInProgress: jobToken generated and stored with users keys in /opt/ladap/common/hadoop-0.20.2-cdh3u1/hadoop-datastore/mapred/system/job_201209121205_0007/jobToken
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: Input size for job job_201209121205_0007 = 5812. Number of splits = 2
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: tip:task_201209121205_0007_m_000000 has split on node:/default-rack/hadoop-namenode
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: tip:task_201209121205_0007_m_000001 has split on node:/default-rack/hadoop-namenode
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobInProgress: job_201209121205_0007 LOCALITY_WAIT_FACTOR=1.0
    2012-09-12 12:23:56,607 ERROR org.apache.hadoop.mapred.JobTracker: Job initialization failed:
    java.lang.NegativeArraySizeException
    at org.apache.hadoop.mapred.JobInProgress.initTasks(JobInProgress.java:748)
    at org.apache.hadoop.mapred.JobTracker.initJob(JobTracker.java:4016)
    at org.apache.hadoop.mapred.EagerTaskInitializationListener$InitJob.run(EagerTaskInitializationListener.java:79)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:679)
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobTracker: Failing job job_201209121205_0007
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobInProgress$JobSummary: jobId=job_201209121205_0007,submitTime=1347467036196,launchTime=1347467036607,,finishTime=1347467036607,numMaps=2,numSlotsPerMap=1,numReduces=0,numSlotsPerReduce=1,user=oracle,queue=default,status=FAILED,mapSlotSeconds=0,reduceSlotsSeconds=0,clusterMapCapacity=10,clusterReduceCapacity=2
    2012-09-12 12:23:56,639 INFO org.apache.hadoop.mapred.JobHistory: Moving file:/opt/ladap/common/hadoop/logs/history/hadoop-namenode_1347465941865_job_201209121205_0007_oracle_OraLoader to file:/opt/ladap/common/hadoop/logs/history/done
    2012-09-12 12:23:56,648 INFO org.apache.hadoop.mapred.JobHistory: Moving file:/opt/ladap/common/hadoop/logs/history/hadoop-namenode_1347465941865_job_201209121205_0007_conf.xml to file:/opt/ladap/common/hadoop/logs/history/done
    My oraloader console log is below:
    [oracle@rakesh hadoop]$ bin/hadoop jar oraloader.jar oracle.hadoop.loader.OraLoader -conf olh-conf/TestAS/scott/testmanagedtable/conf.xml -fs hdfs://hadoop-namenode:9000/ -jt hadoop-namenode:9001
    Oracle Loader for Hadoop Release 1.1.0.0.1 - Production
    Copyright (c) 2011, Oracle and/or its affiliates. All rights reserved.
    12/09/12 12:23:42 INFO loader.OraLoader: Oracle Loader for Hadoop Release 1.1.0.0.1 - Production
    Copyright (c) 2011, Oracle and/or its affiliates. All rights reserved.
    12/09/12 12:23:47 INFO loader.OraLoader: Sampling disabled, table: LDP_TESTMANAGEDTABLE is not partitioned
    12/09/12 12:23:47 INFO loader.OraLoader: oracle.hadoop.loader.loadByPartition is disabled, LDP_TESTMANAGEDTABLE is not partitioned
    12/09/12 12:23:47 INFO output.DBOutputFormat: Setting reduce tasks speculative execution to false for : oracle.hadoop.loader.lib.output.JDBCOutputFormat
    12/09/12 12:23:47 INFO loader.OraLoader: Submitting OraLoader job OraLoader
    12/09/12 12:23:50 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
    12/09/12 12:23:50 INFO metastore.ObjectStore: ObjectStore, initialize called
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
    12/09/12 12:23:51 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
    12/09/12 12:23:51 INFO DataNucleus.Persistence: Property javax.jdo.option.NonTransactionalRead unknown - will be ignored
    12/09/12 12:23:51 INFO DataNucleus.Persistence: ================= Persistence Configuration ===============
    12/09/12 12:23:51 INFO DataNucleus.Persistence: DataNucleus Persistence Factory - Vendor: "DataNucleus" Version: "2.0.3"
    12/09/12 12:23:51 INFO DataNucleus.Persistence: DataNucleus Persistence Factory initialised for datastore URL="jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true" driver="com.mysql.jdbc.Driver" userName="root"
    12/09/12 12:23:51 INFO DataNucleus.Persistence: ===========================================================
    12/09/12 12:23:52 INFO Datastore.Schema: Creating table `DELETEME1347467032448`
    12/09/12 12:23:52 INFO Datastore.Schema: Schema Name could not be determined for this datastore
    12/09/12 12:23:52 INFO Datastore.Schema: Dropping table `DELETEME1347467032448`
    12/09/12 12:23:52 INFO Datastore.Schema: Initialising Catalog "hive", Schema "" using "None" auto-start option
    12/09/12 12:23:52 INFO Datastore.Schema: Catalog "hive", Schema "" initialised - managing 0 classes
    12/09/12 12:23:52 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
    12/09/12 12:23:52 INFO DataNucleus.MetaData: Registering listener for metadata initialisation
    12/09/12 12:23:52 INFO metastore.ObjectStore: Initialized ObjectStore
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 11, column 6 : cvc-elt.1: Cannot find the declaration of element 'jdo'. - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 312, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 359, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 381, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 416, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 453, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 494, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 535, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 576, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 621, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 666, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : `DBS`, InheritanceStrategy : new-table]
    12/09/12 12:23:53 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table : `DATABASE_PARAMS`]
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 index(es) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 0 foreign key(s) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 unique key(s) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 index(es) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 1 foreign key(s) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 1 unique key(s) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MDatabase
    12/09/12 12:23:54 INFO metastore.HiveMetaStore: 0: get_table : db=db_1 tbl=testmanagedtable
    12/09/12 12:23:54 INFO HiveMetaStore.audit: ugi=oracle     ip=unknown-ip-addr     cmd=get_table : db=db_1 tbl=testmanagedtable     
    12/09/12 12:23:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MSerDeInfo [Table : `SERDES`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MStorageDescriptor [Table : `SDS`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MTable [Table : `TBLS`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MSerDeInfo.parameters [Table : `SERDE_PARAMS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.parameters [Table : `TABLE_PARAMS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.partitionKeys [Table : `PARTITION_KEYS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.bucketCols [Table : `BUCKETING_COLS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.cols [Table : `COLUMNS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.parameters [Table : `SD_PARAMS`]
    12/09/12 12:23:55 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.sortCols [Table : `SORT_COLS`]
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 index(es) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 0 foreign key(s) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 4 index(es) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 foreign key(s) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 unique key(s) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SORT_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SORT_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SORT_COLS`
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MSerDeInfo
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MStorageDescriptor
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MTable
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MFieldSchema
    12/09/12 12:23:55 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    12/09/12 12:23:55 WARN snappy.LoadSnappy: Snappy native library not loaded
    12/09/12 12:23:55 INFO mapred.FileInputFormat: Total input paths to process : 1
    12/09/12 12:23:56 INFO mapred.JobClient: Running job: job_201209121205_0007
    12/09/12 12:23:57 INFO mapred.JobClient: map 0% reduce 0%
    12/09/12 12:23:57 INFO mapred.JobClient: Job complete: job_201209121205_0007
    12/09/12 12:23:57 INFO mapred.JobClient: Counters: 0
    [oracle@rakesh hadoop]$
    Please help. Thanks in advance.
    Regards,
    Rakesh Kumar Rakshit

    Hi Rakesh,
    Can you share the conf.xml and map.xml files you are using? I am trying to do the same (export from Hive to an Oracle DB) and I get the following exception: ClassNotFoundException: org.apache.hadoop.hive.metastore.TableType.
    Best regards,
    Bilal

  • Problem regarding the creation of a table using CSS

    Hi,
    I have a problem regarding the creation of a table using CSS.
    In my application I have a table with multiple rows (rows are dynamically added to the table). First I set up the table with the following properties:
    width: 900px;
    height: auto;
    overflow: visible;
    max-height: 200px;
    If I use the above properties, I get a table with 5 or 6 rows (height up to 200px). After that I get the vertical scroll bar.
    The problem is that when a table has many columns, vertical and horizontal scroll bars appear as soon as the table is set up. The table height does not increase dynamically.
    How can I use the "height" property in CSS? (I want the table height to increase when there are more columns.)
    Thanks & Regards
    Madhavi

    Hey. I'm trying to understand what you're trying to do. You want to create a section of a region destructively from an existing region, right? If so, select the option Convert to New Region (Opt-Cmd-R, or by right-clicking). Check your audio bin to make sure. What's the "merge" function? Are you referring to the glue tool?

  • Tables created using Enterprise Console Manager do not show up in SQLPLUS

    I am struggling with the Enterprise Console Manager and command-line SQLPLUS.
    I have tried scott/tiger, system/manager, and the new user that I have created. But I am having the following problems:
    1. I cannot connect as NORMAL through the Enterprise Console Manager if I use scott/tiger or new/passwd. The only way I can connect as NORMAL is with system/manager.
    2. The tables that I create using the Enterprise Console Manager (while connected as system/manager in NORMAL mode) do not show up in command-line SQLPLUS. I check this by running the following command:
    SELECT OWNER, OBJECT_NAME FROM ALL_OBJECTS WHERE OBJECT_NAME LIKE '%TEST%';
    While running the above command I am connected to SQLPLUS using system/manager WITHOUT SYSDBA.
    3. Lastly, I have a Java class which establishes a connection to the database using system/manager. But I can access only the tables that I created using the Enterprise Console Manager. If I create a table using command-line SQLPLUS (while connected as system/manager), I cannot access it through the Java program.
    I have been struggling with this since yesterday morning and have now run out of ways to debug what is happening. Can anybody suggest any way to tackle this?
    Thanks in advance,
    Mahesh

    Prior to 11g, when you created a table, one extent was allocated automatically.
    This is no longer true; it now depends on an initialization parameter whose name I don't remember.
    DBA_SEGMENTS is a summary of DBA_EXTENTS.
    Obviously, if no extent is allocated, the table will not show up (the view is defined with an inner join).
    You could qualify this as a bug and submit an SR to Oracle. But then the performance impact may be huge.
    Sybrand Bakker
    Senior Oracle DBA
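    The parameter referred to above is presumably DEFERRED_SEGMENT_CREATION, which defaults to TRUE as of 11gR2. A minimal sketch of the effect, assuming an 11gR2 database:
    CREATE TABLE t_deferred (id NUMBER);      -- no segment allocated yet
    SELECT segment_name
      FROM user_segments
     WHERE segment_name = 'T_DEFERRED';       -- returns no rows
    CREATE TABLE t_immediate (id NUMBER)
      SEGMENT CREATION IMMEDIATE;             -- forces the initial extent
    SELECT segment_name
      FROM user_segments
     WHERE segment_name = 'T_IMMEDIATE';      -- returns one row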

  • Recipe Management error while creating Recipe

    Hi,
    While creating a recipe in Recipe Management (transaction RMWB), on the process tab, I am unable to create a STAGE number.
    When I try to create a stage (4-digit numeric), I get the error message "Change number 500000000000 does not exist".
    Long text of error message
    Diagnosis
    One of the following situations caused the error message:
    1. You want to edit a BOM or routing using change number 500000000000.
    2. You entered change number 500000000000 in order to display or change the change master.
    Change number 500000000000 which you entered does not exist in the system.
    Procedure
    Check your entry. Correct the change number if appropriate.
    Please help in solving the above issue.
    Rgd,
    Jag

    Hi Jag,
    Are there recipes in the system (after an upgrade)?
    When you start with Recipe Management, the system automatically creates a dummy change number (with the profile setting RCP01). The change number is stored in the table RCPC_AENNR; to read it, please use the function module RCP899_DUMMY_AENNR_READ.
    If change number 500000000000 is this dummy and the change number is missing, please correct the settings of the change management.
    Best regards,
    Roland Freudenberg

  • Adding Custom Fields to the Recipe Management Workbench

    Hi Experts
    I'm trying to add fields to the Quantities tab of the recipe-dependent and the stage-dependent formulas
    in RMWB (Recipe Management Workbench).
    I've added other fields to the Input and Output tab by appending the FRMLS_IOT_SCR structure and then doing the layout setup
    in SPRO, but where and how do I go about adding fields to the Quantities tab? And is there a BAdI that can then be used to fill
    those additional fields?
    Any help would be much appreciated.
    Regards
    Vic

    Hi Vic,
    There are no exits or BAdIs for extending the fields of the quantities view. New fields must be added to the structure FRMLS_TOTALS and entered in the customizing table FRMLC32 as a col_id. The processing of the fields then still has to be programmed; the function group is FRML600.
    You can also work with a copy of the function group.
    Best regards,
    Roland

  • Data Sources related to Recipe Management?

    Hi All,
    My requirement is to extract Recipe Management data from the ECC system into BW systems.
    (RM is a part of Production Planning for Process Industries. We get this from PLM RM 2.10 on SAP R/3 Enterprise PLM Extension (1.10))
    Please let me know the standard SAP datasource(s) for Recipe Management module of SAP.
    Regards,
    Ravikiran.

    Hi Gaurav,
    It seems there is no direct way. Can you tell us the BO version?
    If you know the universes in which that table has been used, then you can get the list of WebI reports using Query Builder.
    Regards,
    Yuvraj

  • Recipe Management - BAPI_BUS1077_CREATE - Restrictions do not appear

    Hi,
    I'm using BAPI_BUS1077_CREATE to create a specification in the SAP Recipe Management module, but the values for 'Rating' and 'ValArea' do not get populated in the 'Restrictions' tab of the specification header.
    Please advise.
    Thanks,
    Shalabh Jain

    Hi Shalabh Jain,
    Then I think you should check the content of the table sub_tab_wa-usage_tab in the BAdI (BAPI_BUS1077_CREATE).
    Please check the settings of the parameters during the building of the header usages. There are several different options.
    Best regards,
    Roland

  • Tables used in Purchasing

    Hi Experts,
    I am new to SAP MM. Can you please guide me on the various tables used in purchasing?
    I want this info for some of my SD reports.
    Thanks in advance.
    Regards
    Rohit

    Hi
    Here are some of them:
    EINA Purchasing Info Record- General Data
    EINE Purchasing Info Record- Purchasing Organization Data
    MAKT Material Descriptions
    MARA General Material Data
    MARC Plant Data for Material
    MARD Storage Location Data for Material
    MAST Material to BOM Link
    MBEW Material Valuation
    MKPF Header- Material Document
    MSEG Document Segment- Material
    MVER Material Consumption
    MVKE Sales Data for materials
    RKPF Document Header- Reservation
    T023 Mat. groups
    T024 Purchasing Groups
    T156 Movement Type
    T157H Help Texts for Movement Types
    MOFF Lists what views have not been created
    A501 Plant/Material
    EBAN Purchase Requisition
    EBKN Purchase Requisition Account Assignment
    EKAB Release Documentation
    EKBE History per Purchasing Document
    EKET Scheduling Agreement Schedule Lines
    EKKN Account Assignment in Purchasing Document
    EKKO Purchasing Document Header
    EKPO Purchasing Document Item
    IKPF Header- Physical Inventory Document
    ISEG Physical Inventory Document Items
    LFA1 Vendor Master (General section)
    LFB1 Vendor Master (Company Code)
    NRIV Number range intervals
    RESB Reservation/dependent requirements
    T161T Texts for Purchasing Document Types
    SAP MM Tips by : Bahadur
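    To see how these hang together in a report, the purchasing document header and item tables join on the document number; a minimal sketch (the company code value is hypothetical):
    SELECT h.ebeln,               -- purchasing document number
           h.lifnr,               -- vendor
           i.ebelp,               -- item number
           i.matnr,               -- material
           i.menge                -- order quantity
      FROM ekko h
      JOIN ekpo i ON i.ebeln = h.ebeln
     WHERE h.bukrs = '1000';      -- hypothetical company code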
    Following is a list of important MM tables. Please check if it is useful.
    Inventory Management:
    Material Document/Movements:
      MSEG - Material document / transaction details
      MKPF - Material document header information
    Material Stock Balances:
      MARD - Material stock
      MBEW - Material stock with valuation
    Sales Order Stock:
      MSKA - Stock balance with associated sales order data
    Stock Transport:
      EKUB - Index for stock transport orders for material
      MDUB - Reading view of stock transport orders for release orders
    Special Stocks:
      MKOL - Consignment, material provided to vendor, etc.
    Material Master Data:
      MARA - General data: material type, group, configurable & batch indicator
      MAKT - Short texts, descriptions
      MARM - Conversion factors
      MVKE - Sales org, distribution channel
      MLAN - Sales data, tax indicator, tax classification
      MARC - Plant planning data
      MBEW - Valuation data
      MLGN - Warehouse management inventory data
      MLGT - Warehouse management storage type data
      MVER - Consumption data
      MAPR - Pointer for forecast data
      MARD - Storage location data with stock balances
      MCHA - Batches
      MCHB - Batch stocks
    Reward if useful for you.
    AAK

  • ASCP Table name for Recipe 'Preference' field

    I have separate planning and production recipes for the same item. ASCP is required to plan using the planning recipes, and batches would be created manually from the ASCP output using the production recipes (there is a business reason for this).
    All planning recipes have validity rules with Preference = 1 and production recipes have Preference = 10. As per Metalink, ASCP will collect both recipes and try to plan using the planning recipe; but, if required, it may use the production recipe in a constrained plan. I do not want that to happen, ever. To ensure this I want to end-date or delete the production recipes on the planning server. But I cannot find any field corresponding to Preference in MSC_BOMS or MSC_BOM_COMPONENTS. What is the name of the table where this preference is saved? Also, what impact will this have on open OPM batches, which use the production recipe?

    Hi GeertN,
    Thanks once again for the prompt response.
    I had seen those two fields (EFFECTIVITY_DATE and DISABLE_DATE), but they do not contain the validity rule end date. For example, I have a recipe which is end-dated for 5 organizations in the source and valid for 1 org. In MSC_PROCESS_EFFECTIVITY there is only one row for that BILL_SEQUENCE_ID, and it corresponds to the org for which the recipe is valid. Does that mean the collection process does not collect recipes with end-dated validity rules?
    Also, I had tried "update MSC_PROCESS_EFFECTIVITY set disable_date = sysdate where... ", but with that MBP seems to have hung (running for 3 hours, when it normally finishes in 40 minutes).
    Regards.
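    Before updating anything, it may be safer to first inspect what the collector brought over, using only the columns discussed above; a sketch (the PLAN_ID = -1 filter for collected data is an assumption):
    SELECT bill_sequence_id,
           organization_id,
           effectivity_date,
           disable_date
      FROM msc_process_effectivity
     WHERE plan_id = -1;   -- -1 = collected data (assumption)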
