Unable to create a management pack key token to seal a management pack using MPSeal.exe

Hi All,
I am trying to seal a custom-made management pack. I followed Jonathan
Almquist's blog to do it, but I get an "Access is denied" error while creating a public key token.
Blog link: http://blogs.technet.com/b/jonathanalmquist/archive/2008/08/19/seal-a-management-pack.aspx
I get this error when I run the second of the three commands listed below.
Below are the directories I used:
C:\MP_Seal - holds sn.exe, MPSeal.exe and the unsealed management pack.
I created two folders named Output and PubKey in C:\MP_Seal.
I opened a command prompt with "Run as administrator".
I am a domain admin and have full access to the whole server.
I ran the commands below to generate a key.
Note: I have changed my key name in this post to Mykey.Snk for privacy reasons.
1.C:\MP_Seal\sn -k C:\MP_Seal\Mykey.Snk
2.C:\MP_Seal\sn -p C:\MP_Seal\Mykey.Snk C:\MP_Seal\PubKey
3.C:\MP_Seal\sn -tp C:\MP_Seal\Pubkey
The first command did generate an .snk file; the second, however, fails with "Access is denied".
I also created the same setup on the D:\ drive, but the same issue occurs.
Can anyone help, please?
Gautam.75801

Thank you for the reply, Jon.
As I don't already have a key, I am creating a new one. What I wanted to know is this: the first command creates a .snk file for me. Is that enough to seal a management pack? If yes, what are the second
and third commands below used for?
1.C:\MP_Seal\sn -k C:\MP_Seal\Mykey.Snk
2.C:\MP_Seal\sn -p C:\MP_Seal\Mykey.Snk C:\MP_Seal\PubKey
3.C:\MP_Seal\sn -tp C:\MP_Seal\Pubkey
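For reference, a rough sketch of what each step does and how the key is then consumed, assuming the standard sn.exe / MPSeal.exe workflow (the management pack file name and /Company value below are placeholders):
REM 1. Generate a new key pair (private + public key) into the .snk file
sn -k C:\MP_Seal\Mykey.Snk
REM 2. Extract just the public key from that pair into a separate file
sn -p C:\MP_Seal\Mykey.Snk C:\MP_Seal\PubKey
REM 3. Display the public key and its short public key token
sn -tp C:\MP_Seal\PubKey
REM Sealing itself needs only the full key pair (.snk); steps 2 and 3 simply
REM let you read the token that other packs use to reference the sealed pack.
MPSeal.exe C:\MP_Seal\MyMP.xml /I C:\MP_Seal /Keyfile C:\MP_Seal\Mykey.Snk /Company "MyCompany" /Outdir C:\MP_Seal\Output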
Gautam.75801

Similar Messages

  • Windows 2008 r2 Cluster not starting - "unable to create security manager worker queues"

    Hello, following a power outage, we got a serious cluster error preventing the cluster from starting.
    We are trying to interpret the only four lines that cluster.log generates:
    00000330.000016cc::2014/09/26-10:44:06.348 ERR   [WTQ] bogus file creation failed, 2
    00000330.000016cc::2014/09/26-10:44:06.348 ERR   [WTQ] bogus file creation failed, 2
    00000330.000016cc::2014/09/26-10:44:06.348 ERR   [CS] Unable to create SecurityManager worker queues, 2
    00000330.000016cc::2014/09/26-10:44:06.363 ERR   Error 6
    And if we start clussvc manually:
    Got 'ERROR_FILE_NOT_FOUND(2)' because of 'Error while creating the Security Manager's Thread Pool' in
        000007fe:fd69940d( ERROR_MOD_NOT_FOUND(126) )
        00000000:001ff190( ERROR_MOD_NOT_FOUND(126) )
    We suspect a DLL problem (because of the MOD_NOT_FOUND errors), but we are unable to find the DLLs involved, even with Process Monitor.
    The CLUSDB hive seems OK.
    The situation is serious. Can anybody help, please?

    Hi RodV,
    This error is usually caused by the cluster service failing to open a
    handle to the \NUL device; Device Manager will show the device instance in an error state.
    Please check whether the following registry value still exists; if not, back up your current registry and then add it:
    HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\Root\LEGACY_NULL\0000\Control
    ActiveService REG_SZ Null
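    If the value is missing, it can be added from an elevated command prompt. A minimal sketch using reg.exe (back up the key first):
    reg query "HKLM\SYSTEM\CurrentControlSet\Enum\Root\LEGACY_NULL\0000\Control" /v ActiveService
    reg add "HKLM\SYSTEM\CurrentControlSet\Enum\Root\LEGACY_NULL\0000\Control" /v ActiveService /t REG_SZ /d Null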
    I am glad to be of help to you!

  • Unable to export Hive managed table using Oraloader

    Hi,
    I am using MySQL as the Hive metastore and trying to export a Hive-managed table using Oraloader.
    I get the following exception in the JobTracker:
    2012-09-12 12:23:56,337 INFO org.apache.hadoop.mapred.JobTracker: Job job_201209121205_0007 added successfully for user 'oracle' to queue 'default'
    2012-09-12 12:23:56,338 INFO org.apache.hadoop.mapred.AuditLogger: USER=oracle IP=192.168.1.5 OPERATION=SUBMIT_JOB TARGET=job_201209121205_0007 RESULT=SUCCESS
    2012-09-12 12:23:56,353 INFO org.apache.hadoop.mapred.JobTracker: Initializing job_201209121205_0007
    2012-09-12 12:23:56,353 INFO org.apache.hadoop.mapred.JobInProgress: Initializing job_201209121205_0007
    2012-09-12 12:23:56,594 INFO org.apache.hadoop.mapred.JobInProgress: jobToken generated and stored with users keys in /opt/ladap/common/hadoop-0.20.2-cdh3u1/hadoop-datastore/mapred/system/job_201209121205_0007/jobToken
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: Input size for job job_201209121205_0007 = 5812. Number of splits = 2
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: tip:task_201209121205_0007_m_000000 has split on node:/default-rack/hadoop-namenode
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: tip:task_201209121205_0007_m_000001 has split on node:/default-rack/hadoop-namenode
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobInProgress: job_201209121205_0007 LOCALITY_WAIT_FACTOR=1.0
    2012-09-12 12:23:56,607 ERROR org.apache.hadoop.mapred.JobTracker: Job initialization failed:
    java.lang.NegativeArraySizeException
    at org.apache.hadoop.mapred.JobInProgress.initTasks(JobInProgress.java:748)
    at org.apache.hadoop.mapred.JobTracker.initJob(JobTracker.java:4016)
    at org.apache.hadoop.mapred.EagerTaskInitializationListener$InitJob.run(EagerTaskInitializationListener.java:79)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:679)
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobTracker: Failing job job_201209121205_0007
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobInProgress$JobSummary: jobId=job_201209121205_0007,submitTime=1347467036196,launchTime=1347467036607,,finishTime=1347467036607,numMaps=2,numSlotsPerMap=1,numReduces=0,numSlotsPerReduce=1,user=oracle,queue=default,status=FAILED,mapSlotSeconds=0,reduceSlotsSeconds=0,clusterMapCapacity=10,clusterReduceCapacity=2
    2012-09-12 12:23:56,639 INFO org.apache.hadoop.mapred.JobHistory: Moving file:/opt/ladap/common/hadoop/logs/history/hadoop-namenode_1347465941865_job_201209121205_0007_oracle_OraLoader to file:/opt/ladap/common/hadoop/logs/history/done
    2012-09-12 12:23:56,648 INFO org.apache.hadoop.mapred.JobHistory: Moving file:/opt/ladap/common/hadoop/logs/history/hadoop-namenode_1347465941865_job_201209121205_0007_conf.xml to file:/opt/ladap/common/hadoop/logs/history/done
    My oraloader console log is below:
    [oracle@rakesh hadoop]$ bin/hadoop jar oraloader.jar oracle.hadoop.loader.OraLoader -conf olh-conf/TestAS/scott/testmanagedtable/conf.xml -fs hdfs://hadoop-namenode:9000/ -jt hadoop-namenode:9001
    Oracle Loader for Hadoop Release 1.1.0.0.1 - Production
    Copyright (c) 2011, Oracle and/or its affiliates. All rights reserved.
    12/09/12 12:23:42 INFO loader.OraLoader: Oracle Loader for Hadoop Release 1.1.0.0.1 - Production
    Copyright (c) 2011, Oracle and/or its affiliates. All rights reserved.
    12/09/12 12:23:47 INFO loader.OraLoader: Sampling disabled, table: LDP_TESTMANAGEDTABLE is not partitioned
    12/09/12 12:23:47 INFO loader.OraLoader: oracle.hadoop.loader.loadByPartition is disabled, LDP_TESTMANAGEDTABLE is not partitioned
    12/09/12 12:23:47 INFO output.DBOutputFormat: Setting reduce tasks speculative execution to false for : oracle.hadoop.loader.lib.output.JDBCOutputFormat
    12/09/12 12:23:47 INFO loader.OraLoader: Submitting OraLoader job OraLoader
    12/09/12 12:23:50 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
    12/09/12 12:23:50 INFO metastore.ObjectStore: ObjectStore, initialize called
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
    12/09/12 12:23:51 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
    12/09/12 12:23:51 INFO DataNucleus.Persistence: Property javax.jdo.option.NonTransactionalRead unknown - will be ignored
    12/09/12 12:23:51 INFO DataNucleus.Persistence: ================= Persistence Configuration ===============
    12/09/12 12:23:51 INFO DataNucleus.Persistence: DataNucleus Persistence Factory - Vendor: "DataNucleus" Version: "2.0.3"
    12/09/12 12:23:51 INFO DataNucleus.Persistence: DataNucleus Persistence Factory initialised for datastore URL="jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true" driver="com.mysql.jdbc.Driver" userName="root"
    12/09/12 12:23:51 INFO DataNucleus.Persistence: ===========================================================
    12/09/12 12:23:52 INFO Datastore.Schema: Creating table `DELETEME1347467032448`
    12/09/12 12:23:52 INFO Datastore.Schema: Schema Name could not be determined for this datastore
    12/09/12 12:23:52 INFO Datastore.Schema: Dropping table `DELETEME1347467032448`
    12/09/12 12:23:52 INFO Datastore.Schema: Initialising Catalog "hive", Schema "" using "None" auto-start option
    12/09/12 12:23:52 INFO Datastore.Schema: Catalog "hive", Schema "" initialised - managing 0 classes
    12/09/12 12:23:52 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
    12/09/12 12:23:52 INFO DataNucleus.MetaData: Registering listener for metadata initialisation
    12/09/12 12:23:52 INFO metastore.ObjectStore: Initialized ObjectStore
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 11, column 6 : cvc-elt.1: Cannot find the declaration of element 'jdo'. - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 312, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 359, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 381, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 416, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 453, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 494, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 535, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 576, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 621, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 666, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : `DBS`, InheritanceStrategy : new-table]
    12/09/12 12:23:53 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table : `DATABASE_PARAMS`]
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 index(es) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 0 foreign key(s) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 unique key(s) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 index(es) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 1 foreign key(s) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 1 unique key(s) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MDatabase
    12/09/12 12:23:54 INFO metastore.HiveMetaStore: 0: get_table : db=db_1 tbl=testmanagedtable
    12/09/12 12:23:54 INFO HiveMetaStore.audit: ugi=oracle     ip=unknown-ip-addr     cmd=get_table : db=db_1 tbl=testmanagedtable     
    12/09/12 12:23:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MSerDeInfo [Table : `SERDES`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MStorageDescriptor [Table : `SDS`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MTable [Table : `TBLS`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MSerDeInfo.parameters [Table : `SERDE_PARAMS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.parameters [Table : `TABLE_PARAMS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.partitionKeys [Table : `PARTITION_KEYS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.bucketCols [Table : `BUCKETING_COLS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.cols [Table : `COLUMNS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.parameters [Table : `SD_PARAMS`]
    12/09/12 12:23:55 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.sortCols [Table : `SORT_COLS`]
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 index(es) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 0 foreign key(s) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 4 index(es) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 foreign key(s) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 unique key(s) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SORT_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SORT_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SORT_COLS`
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MSerDeInfo
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MStorageDescriptor
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MTable
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MFieldSchema
    12/09/12 12:23:55 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    12/09/12 12:23:55 WARN snappy.LoadSnappy: Snappy native library not loaded
    12/09/12 12:23:55 INFO mapred.FileInputFormat: Total input paths to process : 1
    12/09/12 12:23:56 INFO mapred.JobClient: Running job: job_201209121205_0007
    12/09/12 12:23:57 INFO mapred.JobClient: map 0% reduce 0%
    12/09/12 12:23:57 INFO mapred.JobClient: Job complete: job_201209121205_0007
    12/09/12 12:23:57 INFO mapred.JobClient: Counters: 0
    [oracle@rakesh hadoop]$
    Please help. Thanks in advance.
    Regards,
    Rakesh Kumar Rakshit

    Hi Rakesh,
    Can you share the conf.xml and map.xml files you are using? I am trying to do the same (export from Hive to an Oracle DB) and I get the following exception: ClassNotFoundException: org.apache.hadoop.hive.metastore.TableType.
    Best regards,
    Bilal

  • How to create demand management workflow using visual studio

    Hello all,
    I need to create a demand management workflow (Project Server 2013) using Visual Studio 2012, and I need to understand how to write code for it. Any references?

    Hi Shanila,
    Here are some articles you can use for designing workflows with Visual Studio.
    Getting started with Project Server 2013 workflows:
    http://msdn.microsoft.com/en-us/library/office/ee767694(v=office.15).aspx
    Using workflow for Demand Management:
    http://technet.microsoft.com/en-us/library/dn458861(v=office.15).aspx
    Creating Project Workflows using Visual studio 2012:
    http://blogs.msdn.com/b/project_programmability/archive/2012/11/07/creating-project-workflows-using-visual-studio-2012.aspx
    Thanks,
    Phani

  • Unable to create PR with Service Line & also no entry in ESLH table using

    Hi Experts,
    I am using the BAPI BAPI_REQUISITION_CREATE to create a PR, and the PR is created successfully, but when I try to create a PO it fails (probably because of a missing entry in the ESLH table).
    My requirement is to create a PR with a service line and an entry in the ESLH table (which should be filled by SAP automatically). I have been passing the service-related/account-assignment tables into the BAPI, but it is still not working.
    Can you please give me a solution to create a PR with a service line and an entry in the ESLH table (it is important for my requirement)?
    Also, please provide some input on service lines if possible (how to check the service line for a PR, account assignment for a service line, etc.).
    Note: if I create the PR manually, the ESLH entry is there and I can then create the PO.
    A quick solution would definitely help me a lot...
    Thanks
    AKG

  • Unable to create service user Installation

    Hi, I am unable to create the PIAPPLUSER service user. I am installing NW2004s on AIX.
    The installer uses BAPI_USER_CREATE1 to create the user; everything is successful, but at the end it gets the message "RFC connection closed". Is this something to do with the SLD? I created the user PIAPPLUSER manually, but it still gives "RFC connection closed"... Please help me... Answers will be greatly rewarded...
    INFO       2007-04-19 12:31:08 [iaxxrfcimp.cpp:478]
               CAbRfcImpl::checkSysInfoSAP
    Version 700  of remote SAP System QPI accepted.
    INFO       2007-04-19 12:31:08 [iaxxrfcimp.cpp:594]
               CAbRfcImpl::setFunction
    Setting new application function BAPI_USER_CREATE1.
    INFO       2007-04-19 12:31:08 [iaxxrfcimp.cpp:1017]
               CAbRfcImpl::callLibraryFunction
    Generating interface for remote function.
    INFO       2007-04-19 12:31:09 [iaxxrfcimp.cpp:1065]
               CAbRfcImpl::performFunctionCall
    Function call was successful.
    INFO       2007-04-19 12:31:09 [iaxxrfcimp.cpp:924]
               CAbRfcImpl::getRfcInterfaceSAP
    Function interface generated successfully.
    INFO       2007-04-19 12:31:10 [iaxxrfcimp.cpp:926]
               CAbRfcImpl::getRfcInterfaceSAP
    Technical properties of function set successfully.
    INFO       2007-04-19 12:31:10 [iaxxrfcfls.cpp:107]
               CRfcFuncRep::insFuncIf
    Information for application function BAPI_USER_CREATE1 copied to local Repository.
    INFO       2007-04-19 12:31:10 [iaxxrfcimp.cpp:622]
               CAbRfcImpl::setFunction
    Function module BAPI_USER_CREATE1 set successfully.
    INFO       2007-04-19 12:31:10 [iaxxrfcimp.cpp:1032]
               CAbRfcImpl::callFunction
    Executing function call BAPI_USER_CREATE1.
    INFO       2007-04-19 12:31:10 [iaxxrfcimp.cpp:1065]
               CAbRfcImpl::performFunctionCall
    Function call was successful.
    INFO       2007-04-19 12:31:10 [iaxxbjsco.cpp:561]
               CIaJSCo::disconnect_impl(001:DDIC:EN:tsqa1d03:40:::)
    RFC connection closed.
    ERROR      2007-04-19 12:31:11 [iaxxejsbas.cpp:178]
               EJS_ErrorReporter
    FJS-00003  TypeError: this.getSystemInfo() has no properties (in script NW_Onehost|ind|ind|ind|ind, line 12941: ???)
    ERROR      2007-04-19 12:31:11 [iaxxgenimp.cpp:736]
               showDialog()
    FCO-00011  The step CreateUser with step key |NW_Onehost|ind|ind|ind|ind|0|0|SAP_Software_Features_Configuration|ind|ind|ind|ind|5|0|NW_Usage_Types_Configuration_PI|ind|ind|ind|ind|1|0|GenericNewCreateAbapUser|ind|ind|ind|ind|1|3|CreateUser was executed with status ERROR .

    Hi,
    I did that... I created the user manually and assigned the role. Even then it does not get past the "Creating PIAPPLUSER" step. If it gets through this step, I am all set to go with my installation. Any help??

  • After upgrade to CF9, CFIMAGE "Unable to create temporary file" error

    We recently upgraded from CF8 to CF9 Enterprise. I'm getting an "Unable to create temporary file" error on
    my CFIMAGE resize calls. We use sandbox security. I assume I need to grant write access to whatever folder CF uses for temp files, but which folder is it? The same code (and sandbox settings) ran fine in CF8...
    Note: if I attempt to add C:\JRun4\servers\cfusion\SERVER-INF\temp to the sandbox for this particular app, CF crashes on all requests across all apps on the server with:
    Security: The requested template has been denied access to C:\JRun4\servers\cfusion\cfusion-ear\cfusion-war\WEB-INF\cfclasses\cfapp2ecfc1510154633.class.
    The following is the internal exception message: access denied (java.io.FilePermission C:\JRun4\servers\cfusion\cfusion-ear\cfusion-war\WEB-INF\cfclasses\cfapp2ecfc1510154633.class read)
    ColdFusion cannot determine the line of the template that caused this error. This is often caused by an error in the exception handling subsystem.
    I need to restart CF to get everything working again.

    Another update. I had a problem with a sandboxed CF9 site doing a simple CFIMAGE READ into a memory variable, and got an "Unable to create temporary file" error.
    I inserted the following code in the file upload page:
    <cfscript>
    writeoutput("Temp Dir : " & createobject("java","java.lang.System").getProperty("java.io.tmpdir") );
    </cfscript>
    ... and it reveals the temp directory as C:\WINDOWS\TEMP. I added that to the sandbox, and the CFIMAGE READ is working properly now.
    Note this seems inconsistent with the CFIMAGE RESIZE behavior, which appears to use the CF GetTempDirectory() value; in my case that is C:\JRun4\servers\cfusion\SERVER-INF\temp\cfusion-war-tmp\
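    To see both locations at once, a quick sketch along the same lines (getTempDirectory() is the standard CF function; the labels are arbitrary):
    <cfscript>
    // Where java.io.tmpdir points (what CFIMAGE READ appears to use)
    writeOutput("java.io.tmpdir: " & createObject("java","java.lang.System").getProperty("java.io.tmpdir"));
    // Where CF's own temp directory points (what CFIMAGE RESIZE appears to use)
    writeOutput("<br>GetTempDirectory(): " & getTempDirectory());
    </cfscript>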
    For reference, see the section "Sandbox Considerations" at this link:
    http://help.adobe.com/en_US/ColdFusion/9.0/Admin/WSc3ff6d0ea77859461172e0811cbf364104-7fc8.html#WSc3ff6d0ea77859461172e0811cbf364104-7fcc

  • Unable to create volume - error message when exporting

    Hello - relatively new Mac and iphoto user here.
    I'm trying to export photos from iPhoto to a 2 GB thumb drive. I keep getting the error message "unable to create volume" when I try to move anything but very small batches of photos over. What's going on? I tried to create a folder on my desktop for the photos, but then when I try to move them from there, I get an error message telling me the photo "is in use" and can't be moved to the thumb drive.
    I would move them to my external hard drive, but I'm having trouble with it too. It won't even show up, and when I run Disk Utility on it, it says it can't be verified. Sigh. So that's a whole other issue that will probably take something like DiskWarrior to clean up.
    In the meantime, I'm moving images to the thumb drive, then putting the thumb drive in my other computer and moving the images from the thumb drive to its external hard drive. A long process when you can only move about 20 images at a time from iPhoto.
    I'm just trying to move a good number of old photos out of iPhoto on the Mac because I'm scared that the program or even my entire computer is going to crash and I'm going to lose data.
    What can I do about the "unable to create volume" message? And what's with the "file in use" error message when I try to move the images off a file folder on the desktop?
    Is there a better way to move 1GB of photos all at once to the thumb drive?

    Welcome to the Apple Discussions.
    Seems to be two issues complicating things here:
    I'm trying to export photos from iphoto to a 2 GB thumb drive. I keep getting the error message "unable to create volume" when I try to move anything but very small batches of photos over.
    What format is that wee drive? Most are FAT16 or FAT32, and there is a limit to the number of objects they can have in the root of the drive. The workaround is to export to a folder on the desktop and copy that over.
    Which brings us to:
    I tried to create a folder on my desktop for the photos but then when I try to move from there I get an error message telling me the photo "is in use" and can't be moved to the thumb drive.
    You will sometimes get this message when you try to export two files +with the same name+ to a folder. As you know, your camera is quite capable of producing many files called something like DSC_1234.jpeg. In iPhoto these will be in separate folders, so there is no problem. However, when you export them to the folder on the desktop, if iPhoto continues the export, one will overwrite the other.
    Solution: give the pics a title (Photos -> Batch Change) and a sequential number. Then, when you export, the export dialogue gives you the option to use the title as the filename.
    Regards
    TD

  • Zen Xtra unable to create new or update playli

    I am unable to create new playlists on my Zen Xtra. When I use the Zen itself, it looks like it creates the playlist, but when I open it, the Zen draws a box on screen and prints "error" in the box. Using Nomad Explorer, it creates the list; I can add to it, delete from it, and it appears to work in every way, but when I try to open it on the Zen, I get the little box with the word "error" in it. I tried taking an existing playlist, removing all the titles, and adding titles back to the playlist, and I get the same error message. I have run the disk cleanup from the utilities menu on the Zen. I have also upgraded to the latest firmware version. Other than the error on the playlist, the unit works perfectly.
    Any suggestions?
    Thanks
    Randy

    Does this player have an upgraded hard disk?
    Either way, how many tracks are on the player?

  • Document management system using oracle text

    I plan to create a document management system using Oracle Text with the following features:
    1) document comparison
    2) document search
    and more...
    Can Oracle Text be used to display documents of various formats by converting them to HTML? And can search keywords be highlighted in the document?
    Please help!

    Have you ever considered doing this in Oracle Application Express (free on top of the Oracle database)? How about something like:
    http://download-west.oracle.com/docs/cd/B31036_01/doc/appdev.22/b28839/up_dn_files.htm
    Index the files using the CONTEXT index, and perhaps the docs' metadata alongside using the Oracle Text MULTI_COLUMN_DATASTORE; then, when you write your query for a report on the documents, include a search string.
    I've created a number of APEX-based document management systems and it is quite easy once you get the hang of using this environment. I suggest looking at some of the tutorials/how-to documents and you'll be on your way quickly.
    Start with the upload application. Once you can get your documents in, create a report that shows everything except the document. Verify all of this works correctly.
    Add some "items" to the page for the report, and include them as bind variables in the where clause.
    After that, add your Oracle Text index to the database, and toss a "text field" item onto the APEX page. Modify your report query, adding the CONTAINS clause, and use the newly created item as a bind variable. There's your keyword search.
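    As a rough sketch of that last step (the table, column, and page-item names here are hypothetical):
    -- Oracle Text index on the document column
    CREATE INDEX doc_ctx_idx ON documents (doc_content) INDEXTYPE IS CTXSYS.CONTEXT;
    -- APEX report query; :P1_SEARCH is the keyword-search item on the page
    SELECT id, filename, mime_type
      FROM documents
     WHERE CONTAINS(doc_content, :P1_SEARCH, 1) > 0;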
    Linking to Oracle Apps is done through APIs, possibly over database links.
    Hope it helps. Though this is not a step-by-step how-to document, it should point you in the right direction. Get familiar with APEX, as that covers most of what you described.
    -Ron

  • Can any one tell me how to pull out a sealed management pack key token

    Hi All,
    Can anyone tell me how to pull out a sealed management pack's key token? I want to add dependencies to a management pack pointing towards a sealed management pack, for which a "management pack key token" is required that I need to add in the XML file
    of the management pack. Can anyone please tell me how to fetch that information?

    Hi Gautam,
    If you need to reference one management pack from another management pack, you will need to know the referenced pack's public key token. When referencing a Microsoft management pack this is easy, as Microsoft always uses the same key, which is 31bf3856ad364e35. But
    if you need the public key token for a non-Microsoft management pack, you can simply run a SQL query against your OperationsManager DB:
    select * from dbo.ManagementPack where MPName = 'Type your ManagementPackID'
    The ManagementPackID is shown in the properties of the management pack.
    For this to work, the management pack must be imported in your SCOM environment.
    Note that you will get a public key token only for sealed management packs, not for unsealed ones.
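    For illustration, such a reference then goes into the management pack's manifest roughly like this (the ID and Version below are placeholders; the PublicKeyToken is the value you looked up):
    <Reference Alias="Windows">
      <ID>Microsoft.Windows.Library</ID>
      <Version>6.1.7221.0</Version>
      <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
    </Reference>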

  • Unable to create foreign key: InvalidArgument=Value of '0' is not valid for 'index'. Parameter name: index

    I am running a SQL (CE) script to create a DB. All script commands succeed, but the DB gets "broken" after creating the last constraint: after running the script, viewing the table properties of Table2 and clicking on "Manage Relations" gives the following error: Unable to create foreign key: InvalidArgument=Value of '0' is not valid for 'index'. Parameter name: index. Wondering what that refers to...
    Here is the script. Please note that no error is thrown by running the following queries (even from code passing the queries by hand, one by one, to SQL Server Management Studio).
    CREATE TABLE [table1] (
    [id_rubrica] numeric(18,0) NOT NULL
    , [id_campo] numeric(18,0) NOT NULL
    , [nome] nvarchar(100) NOT NULL
    );
    GO
    ALTER TABLE [table1] ADD PRIMARY KEY ([id_rubrica],[id_campo]);
    GO
    CREATE UNIQUE INDEX [UQ__m_campi] ON [table1] ([id_campo] Asc);
    GO
    CREATE TABLE [table2] (
    [id_campo] numeric(18,0) NOT NULL
    , [valore] nvarchar(4000) NOT NULL
    );
    GO
    ALTER TABLE [table2] ADD PRIMARY KEY ([id_campo],[valore]);
    GO
    ALTER TABLE [table2] ADD CONSTRAINT [campo_valoriFissi] FOREIGN KEY ([id_campo]) REFERENCES [table1]([id_campo]);
    GO
    Sid (MCP - http://www.sugata.eu)

    I know this is kind of an old post, but did this really solve your problem?
    I'm getting this same error message after adding an FK constraint via the UI in the VS2008 Server Explorer.
    I can add the constraint with no errors, but the constraint is not created by the DataSet wizard (strongly typed DataSets on Win CE 6), and when I click "Manage Relations" in the "Table Properties" this error pops up:
    "InvalidArgument=Value of '0' is not valid for 'index'.
    Parameter name: index"
    Even after creating my table with the relation in SQL, the same thing occurs:
    CREATE TABLE pedidosRastreios (
        idPedidoRastreio INT NOT NULL IDENTITY PRIMARY KEY,
        idPedido INT NOT NULL CONSTRAINT FK_pedidosRastreios_pedidos REFERENCES pedidos(idPedido) ON DELETE CASCADE,
        codigo NVARCHAR(20) NOT NULL
    );

  • I am unable to install Itunes on Windows 7 as I am getting the message "an error occured during the installation of assembly "Microsoft VC80.CRT Type ="win32" version="8.0.50727.6195",public key token 1fc8b 3b 9a 1e 18e 3b",processorArchitecture="x86".Ple

    I am unable to install iTunes on Windows 7 as I am getting the message "an error occurred during the installation of assembly "Microsoft VC80.CRT Type="win32" version="8.0.50727.6195", publicKeyToken="1fc8b3b9a1e18e3b", processorArchitecture="x86". Please refer to Help and Support for more information: HResult 0x800736FD"

    I found this other post: https://discussions.apple.com/thread/3401328?start=0&tstart=0
    I also found this in another post:
    These ones are typically caused by underlying problems on the PC that also often cause Windows Updates to fail to install. If we can fix the Windows Update trouble, we can usually get the iTunes trouble cleaned up en passant.
    Go into your Windows Update and try to check for new updates. If updates install, stock up on the ones you're behind on, restarting the PC if requested to do so. After the restart, try installing iTunes again. Does it go through this time for you?
    If Windows updates fail to install, go into your Update History and double-click the failures. What alphanumeric codes appear for you? (For example, 8007000B.)

  • I am unable to install Itunes on Windows 7 as I am getting the message "an error occured during the installation of assembly "Microsoft VC80.CRT Type ="win32" version="8.0.50727.4053,public key token....."HResult 0X8007054f

    I am unable to install iTunes on Windows 7 as I am getting the message "an error occurred during the installation of assembly "Microsoft VC80.CRT Type="win32" version="8.0.50727.4053, public key token....." HResult 0X8007054f

    I found this other post: https://discussions.apple.com/thread/3401328?start=0&tstart=0
    I also found this in another post:
    These ones are typically caused by underlying problems on the PC that also often cause Windows Updates to fail to install. If we can fix the Windows Update trouble, we can usually get the iTunes trouble cleaned up en passant.
    Go into your Windows Update and try to check for new updates. If updates install, stock up on the ones you're behind on, restarting the PC if requested to do so. After the restart, try installing iTunes again. Does it go through this time for you?
    If Windows updates fail to install, go into your Update History and double-click the failures. What alphanumeric codes appear for you? (For example, 8007000B.)

  • Hi Folks,   I have a problem with OBIEE EM, There in Enterprise Manager i am unable to navigate into the options like when i clik on any Tab it is not working and even  i am unable to create a report using analysis.

    Hi Folks,
    I have a problem with OBIEE EM: in Enterprise Manager I am unable to navigate through the options; when I click on any tab it does not work, and I am also unable to create a report using Analysis.
    Thanks in advance

    I have also tried that, but it didn't help. Can you give me any other way?
