"utils import config" command failed on CUCM 8.6.1
I've created a cucm861sub.flp file and uploaded it to the datastore on ESXi 4.1, then mapped it to the virtual floppy drive; i.e., I followed the procedure on the Cisco wiki to the letter. Still, after running the command I receive the following error: "Cannot locate configuration file".
I used WinImage to create a 1.44 MB floppy image that contains only one file, called platformConfig.xml (the file was downloaded directly from CCO).
Has anybody been able to apply the new identity feature successfully?
I opened a TAC case and found this is a bug: CSCtx55507, "utils import config is failing".
These are the CallManager versions that have the fix for this issue: 8.6(2.10000.30), 8.6(1.98000.87), 8.6(1.98000.39), or any later version.
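For what it's worth, the same image can also be built on Linux instead of with WinImage. A minimal sketch, assuming GNU coreutils; the FAT formatting and file-copy steps are left as comments because they additionally need dosfstools and mtools:

```shell
# Create a blank 1.44 MB floppy image (2880 sectors x 512 bytes)
dd if=/dev/zero of=cucm861sub.flp bs=512 count=2880 status=none

# Then format it FAT and copy the answer file in
# (requires dosfstools and mtools respectively):
#   mkfs.vfat cucm861sub.flp
#   mcopy -i cucm861sub.flp platformConfig.xml ::/
```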
Similar Messages
-
N5K - committing an imported config always fails
Hello all.
I have upgraded an N5K (5010) cluster to SW version 5.0(2)N1(1).
When I want to commit a configuration into a switch profile, I always receive the error "Failed: Verify Failed".
Even when I try to commit only a single rule, I still receive the same error.
Any idea what I'm doing wrong?
I followed the Release Notes for 5.0(2)N1(1):
http://www.cisco.com/en/US/docs/switches/datacenter/nexus5000/sw/release/notes/Rel_5_0_2_N1_1/Nexus5000_Release_Notes_5_0_2_N1_1.html
HI16955# configure sync
Enter configuration commands, one per line. End with CNTL/Z.
HI16955(config-sync)# switch-profile SYNC
Switch-Profile started, Profile ID is 1
HI16955(config-sync-sp)# import running-config
HI16955(config-sync-sp-import)# show switch-profile SYNC buffer
switch-profile : SYNC
Seq-no Command
4 vlan 3
4.1 name 3_vlan_3
5 vlan 4
5.1 name 4_mgmt_mon
6 vlan 5
6.1 name 5_tp_router
7 vlan 6
7.1 name 6_int_servers
8 vlan 8
8.1 name 8_vpn_lan
9 vlan 10
9.1 name 10_dmz1
HI16955(config-sync-sp-import)# commit
Failed: Verify Failed
HI16955(config-sync-sp-import)#
Regards
Tom Lauwereins -
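One way to get more detail on a "Failed: Verify Failed" commit is to run an explicit verify and then check the switch-profile status, which reports the last verification result per peer. A sketch using the profile name from the thread (exact output will vary):

```
HI16955(config-sync-sp)# verify
HI16955(config-sync-sp)# exit
HI16955(config-sync)# exit
HI16955# show switch-profile SYNC status
```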
Integrating OID with AD: Import Config/ Synchronization fails
I am trying to integrate OID with AD. The bootstrap is successful. I added a new entry to the AD after the bootstrap. The sync interval in OID is set to 10 seconds. The Synchronization Status in OID changes, but it is not successful.
The updated Synchronization status is "Agent Execution Successful, Mapping/IMPORT operation Failure".
Please help me with the following exception. Here is the text from the trace file:
Changetype is 5
Processing modifyRadd Operation ..
Entry Not Found. Converting to an ADD op..
Processing Insert Operation ..
Performing createEntry..
Exception creating Entry : javax.naming.NoPermissionException: [LDAP: error code 50 - Insufficient Access Rights]; remaining name 'cn=test,cn=users,dc=its-ese,dc=local'
[LDAP: error code 50 - Insufficient Access Rights]
javax.naming.NoPermissionException: [LDAP: error code 50 - Insufficient Access Rights]; remaining name 'cn=test,cn=users,dc=its-ese,dc=local'
at com.sun.jndi.ldap.LdapCtx.mapErrorCode(LdapCtx.java:2996)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(LdapCtx.java:2934)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(LdapCtx.java:2740)
at com.sun.jndi.ldap.LdapCtx.c_createSubcontext(LdapCtx.java:777)
at com.sun.jndi.toolkit.ctx.ComponentDirContext.p_createSubcontext(ComponentDirContext.java:319)
at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.createSubcontext(PartialCompositeDirContext.java:248)
at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.createSubcontext(PartialCompositeDirContext.java:236)
at javax.naming.directory.InitialDirContext.createSubcontext(InitialDirContext.java:176)
at oracle.ldap.odip.gsi.LDAPWriter.createEntry(LDAPWriter.java:1056)
at oracle.ldap.odip.gsi.LDAPWriter.insert(LDAPWriter.java:409)
at oracle.ldap.odip.gsi.LDAPWriter.modifyRadd(LDAPWriter.java:748)
at oracle.ldap.odip.gsi.LDAPWriter.writeChanges(LDAPWriter.java:335)
at oracle.ldap.odip.engine.AgentThread.mapExecute(AgentThread.java:581)
at oracle.ldap.odip.engine.AgentThread.execMapping(AgentThread.java:306)
at oracle.ldap.odip.engine.AgentThread.run(AgentThread.java:186)
DIP_LDAPWRITER_ERROR_CREATE
Error in executing mapping DIP_LDAPWRITER_ERROR_CREATE
DIP_LDAPWRITER_ERROR_CREATE
at oracle.ldap.odip.engine.AgentThread.mapExecute(AgentThread.java:722)
at oracle.ldap.odip.engine.AgentThread.execMapping(AgentThread.java:306)
at oracle.ldap.odip.engine.AgentThread.run(AgentThread.java:186)
DIP_LDAPWRITER_ERROR_CREATE
ActiveChgImp:Error in Mapping EngineDIP_LDAPWRITER_ERROR_CREATE
DIP_LDAPWRITER_ERROR_CREATE
at oracle.ldap.odip.engine.AgentThread.mapExecute(AgentThread.java:741)
at oracle.ldap.odip.engine.AgentThread.execMapping(AgentThread.java:306)
at oracle.ldap.odip.engine.AgentThread.run(AgentThread.java:186)
ActiveChgImp:about to Update exec status
Updated Attributes
orclodipLastExecutionTime: 20080915141948
Hi,
you will need to follow Metalink Note
276481.1 Troubleshooting OID DIP Synchronization Issues -> Problem 11
What you see is an ACI issue on OID server, caused by the fact that you are using a different container.
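As a rough illustration of the kind of ACI fix that note describes (this is a sketch, not a drop-in solution: the group DN below is a placeholder that must be replaced with the actual DIP profile/admin identity in your OID instance), an ldapmodify against the target container might look like:

```
dn: cn=users,dc=its-ese,dc=local
changetype: modify
add: orclaci
orclaci: access to entry by group="<DN of your DIP admin/profile group>"
 (browse,add,delete)
orclaci: access to attr=(*) by group="<DN of your DIP admin/profile group>"
 (read,search,write,compare)
```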
BR,
Octavian -
Ldif2db command fails to import data and deletes the root node
ldif2db command fails to import data into LDAP.
The command used was
ldif2db -n userRoot -s "dc=example,dc=com" -i test.ldif
The test.ldif contains the nodes under dc=example,dc=com,
e.g. ou=test,dc=example,dc=com
But on executing the command, the root node "dc=example,dc=com" itself got deleted. The console output was like "Skipping entry uid=test001,ou=test,dc=example,dc=com" for all the entries present in the LDIF.
What might be the reason for this? Any clues?
The reason why I am trying to use ldif2db is to preserve the createtimestamp and modifytimestamp while migrating data from one Directory Server to another. Are there any other ways of doing it?
ldif2db is the right command to migrate data and preserve attributes like createtimestamp and modifytimestamp.
However, when this command is used, it will first remove everything in the database before it loads whatever you import. So you need to be very careful. I ran into this terrible problem as well.
In my experience, if you use this command, don't use "-s". You can just use:
ldif2db -n suffixName -i test.ldif
If you only have one suffix (database), then you can use "-n userRoot".
Also, if you migrate your data from server A to server B, you'd better dump the data using db2ldif -n userRoot -a test.ldif from server A. Then load it into server B using ldif2db -n userRoot -i test.ldif. -
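Putting that advice together, the export/import sequence would look roughly like this; the suffix name and LDIF filename are the ones from the thread, and each command runs on its respective Directory Server host:

```
# On server A: export the suffix, preserving operational attributes
db2ldif -n userRoot -a /tmp/test.ldif

# On server B: import into the same suffix. Note: no -s flag, and
# remember that ldif2db replaces the database contents entirely.
ldif2db -n userRoot -i /tmp/test.ldif
```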
FDM data import from EBS via FDMEE failed after rolling back the 11.1.2.3.500 patch. Getting the error message below in the ERPI Adapter log.
*** clsGetFinData.fExecuteDataRule @ 2/18/2015 5:36:17 AM ***
PeriodKey = 5/31/2013 12:00:00 AM
PriorPeriodKey = 4/30/2013 12:00:00 AM
Rule Name = 6001
Execution Mode = FULLREFRESH
System.Runtime.InteropServices.COMException (0x80040209): Error connecting to AIF URL.
at Oracle.Erpi.ErpiFdmCommon.ExecuteRule(String userName, String ssoToken, String ruleName, String executionMode, String priorPeriodKey, String periodKey, String& loadId)
at fdmERPIfinE1.clsGetFinData.fExecuteDataRule(String strERPIUserID, String strDataRuleName, String strExecutionMode, String strPeriodKey, String strPriorPeriodKey)
Any help, please?
Thanks
Hi,
I am getting this error in ErpiIntergrator0.log. The ODI session IDs were not generated in ODI/FDMEE. If I import from FDMEE, it imports the data from EBS.
<[ServletContext@809342788[app:AIF module:aif path:/aif spec-version:2.5 version:11.1.2.0]] Servlet failed with Exception
java.lang.RuntimeException
at com.hyperion.aif.servlet.FDMRuleServlet.doPost(FDMRuleServlet.java:76)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:324)
at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:460)
at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:163)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3730)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3696)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2273)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1490)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:221) -
"error: command failed to execute correctly" on several packages
Last night, when I updated before shutting down, I got a few errors, as in the subject. As it was very late, I thought I'd pick it up today.
Unfortunately, the pacman log only lists one of the packages that failed: libgpg-error. The other one that I remember erroring was gawk. There were a few others, maybe four or five, but I couldn't reliably recall them all, so I won't guess.
Here's a new attempt to reinstall gawk with --debug. I did the same with libgpg-error and the error occurred at the same place, with very similar output, so I think the issue is the same for all the failures.
debug: pacman v4.2.1 - libalpm v9.0.1
debug: parseconfig: options pass
debug: config: attempting to read file /etc/pacman.conf
debug: config: finish section '(null)'
debug: config: new section 'options'
debug: config: HoldPkg: pacman
debug: config: HoldPkg: glibc
debug: config: usedelta (default 0.7)
debug: config: arch: x86_64
debug: config: verbosepkglists
debug: config: chomp
debug: config: SigLevel: Required
debug: config: SigLevel: DatabaseOptional
debug: config: SigLevel: TrustedOnly
debug: config: LocalFileSigLevel: Optional
debug: config: finish section 'options'
debug: config: new section 'core'
debug: config file /etc/pacman.conf, line 78: including /etc/pacman.d/mirrorlist
debug: config: attempting to read file /etc/pacman.d/mirrorlist
debug: config: finished parsing /etc/pacman.d/mirrorlist
debug: config: finish section 'core'
debug: config: new section 'extra'
debug: config file /etc/pacman.conf, line 81: including /etc/pacman.d/mirrorlist
debug: config: attempting to read file /etc/pacman.d/mirrorlist
debug: config: finished parsing /etc/pacman.d/mirrorlist
debug: config: finish section 'extra'
debug: config: new section 'xyne-x86_64'
debug: config: finish section 'xyne-x86_64'
debug: config: new section 'community'
debug: config file /etc/pacman.conf, line 91: including /etc/pacman.d/mirrorlist
debug: config: attempting to read file /etc/pacman.d/mirrorlist
debug: config: finished parsing /etc/pacman.d/mirrorlist
debug: config: finish section 'community'
debug: config: new section 'multilib'
debug: config file /etc/pacman.conf, line 100: including /etc/pacman.d/mirrorlist
debug: config: attempting to read file /etc/pacman.d/mirrorlist
debug: config: finished parsing /etc/pacman.d/mirrorlist
debug: config: finish section 'multilib'
debug: config: new section 'infinality-bundle'
debug: config: finish section 'infinality-bundle'
debug: config: new section 'infinality-bundle-multilib'
debug: config: finish section 'infinality-bundle-multilib'
debug: config: new section 'infinality-bundle-fonts'
debug: config: finish section 'infinality-bundle-fonts'
debug: config: new section '(null)'
debug: config: finished parsing /etc/pacman.conf
debug: setup_libalpm called
debug: option 'logfile' = /var/log/pacman.log
debug: option 'gpgdir' = /etc/pacman.d/gnupg/
debug: option 'cachedir' = /var/cache/pacman/pkg/
debug: parseconfig: repo pass
debug: config: attempting to read file /etc/pacman.conf
debug: config: finish section '(null)'
debug: config: new section 'options'
debug: config: finish section 'options'
debug: config: new section 'core'
debug: config file /etc/pacman.conf, line 78: including /etc/pacman.d/mirrorlist
debug: config: attempting to read file /etc/pacman.d/mirrorlist
debug: config: finished parsing /etc/pacman.d/mirrorlist
debug: config: finish section 'core'
debug: registering sync database 'core'
debug: database path for tree core set to /var/lib/pacman/sync/core.db
debug: "/var/lib/pacman/sync/core.db.sig" is not readable: No such file or directory
debug: sig path /var/lib/pacman/sync/core.db.sig could not be opened
debug: missing optional signature
debug: setting usage of 15 for core repoistory
debug: adding new server URL to database 'core': http://arch.tamcore.eu/core/os/x86_64
debug: adding new server URL to database 'core': http://mirror.one.com/archlinux/core/os/x86_64
debug: adding new server URL to database 'core': http://mirror.gnomus.de/core/os/x86_64
debug: adding new server URL to database 'core': http://mirror.js-webcoding.de/pub/archlinux/core/os/x86_64
debug: adding new server URL to database 'core': http://archlinux.polymorf.fr/core/os/x86_64
debug: config: new section 'extra'
debug: config file /etc/pacman.conf, line 81: including /etc/pacman.d/mirrorlist
debug: config: attempting to read file /etc/pacman.d/mirrorlist
debug: config: finished parsing /etc/pacman.d/mirrorlist
debug: config: finish section 'extra'
debug: registering sync database 'extra'
debug: database path for tree extra set to /var/lib/pacman/sync/extra.db
debug: "/var/lib/pacman/sync/extra.db.sig" is not readable: No such file or directory
debug: sig path /var/lib/pacman/sync/extra.db.sig could not be opened
debug: missing optional signature
debug: setting usage of 15 for extra repoistory
debug: adding new server URL to database 'extra': http://arch.tamcore.eu/extra/os/x86_64
debug: adding new server URL to database 'extra': http://mirror.one.com/archlinux/extra/os/x86_64
debug: adding new server URL to database 'extra': http://mirror.gnomus.de/extra/os/x86_64
debug: adding new server URL to database 'extra': http://mirror.js-webcoding.de/pub/archlinux/extra/os/x86_64
debug: adding new server URL to database 'extra': http://archlinux.polymorf.fr/extra/os/x86_64
debug: config: new section 'xyne-x86_64'
debug: config: SigLevel: Required
debug: config: finish section 'xyne-x86_64'
debug: registering sync database 'xyne-x86_64'
debug: database path for tree xyne-x86_64 set to /var/lib/pacman/sync/xyne-x86_64.db
debug: GPGME version: 1.5.4
debug: GPGME engine info: file=/usr/bin/gpg2, home=/etc/pacman.d/gnupg/
debug: checking signature for /var/lib/pacman/sync/xyne-x86_64.db
debug: 1 signatures returned
debug: fingerprint: EC3CBE7F607D11E663149E811D1F0DC78F173680
debug: summary: valid
debug: summary: green
debug: status: Success
debug: timestamp: 1430676813
debug: exp_timestamp: 0
debug: validity: full; reason: Success
debug: key: EC3CBE7F607D11E663149E811D1F0DC78F173680, Xyne. (key #3) <[email protected]>, owner_trust unknown, disabled 0
debug: signature is valid
debug: signature is fully trusted
debug: setting usage of 15 for xyne-x86_64 repoistory
debug: adding new server URL to database 'xyne-x86_64': http://xyne.archlinux.ca/repos/xyne
debug: config: new section 'community'
debug: config file /etc/pacman.conf, line 91: including /etc/pacman.d/mirrorlist
debug: config: attempting to read file /etc/pacman.d/mirrorlist
debug: config: finished parsing /etc/pacman.d/mirrorlist
debug: config: finish section 'community'
debug: registering sync database 'community'
debug: database path for tree community set to /var/lib/pacman/sync/community.db
debug: "/var/lib/pacman/sync/community.db.sig" is not readable: No such file or directory
debug: sig path /var/lib/pacman/sync/community.db.sig could not be opened
debug: missing optional signature
debug: setting usage of 15 for community repoistory
debug: adding new server URL to database 'community': http://arch.tamcore.eu/community/os/x86_64
debug: adding new server URL to database 'community': http://mirror.one.com/archlinux/community/os/x86_64
debug: adding new server URL to database 'community': http://mirror.gnomus.de/community/os/x86_64
debug: adding new server URL to database 'community': http://mirror.js-webcoding.de/pub/archlinux/community/os/x86_64
debug: adding new server URL to database 'community': http://archlinux.polymorf.fr/community/os/x86_64
debug: config: new section 'multilib'
debug: config file /etc/pacman.conf, line 100: including /etc/pacman.d/mirrorlist
debug: config: attempting to read file /etc/pacman.d/mirrorlist
debug: config: finished parsing /etc/pacman.d/mirrorlist
debug: config: finish section 'multilib'
debug: registering sync database 'multilib'
debug: database path for tree multilib set to /var/lib/pacman/sync/multilib.db
debug: "/var/lib/pacman/sync/multilib.db.sig" is not readable: No such file or directory
debug: sig path /var/lib/pacman/sync/multilib.db.sig could not be opened
debug: missing optional signature
debug: setting usage of 15 for multilib repoistory
debug: adding new server URL to database 'multilib': http://arch.tamcore.eu/multilib/os/x86_64
debug: adding new server URL to database 'multilib': http://mirror.one.com/archlinux/multilib/os/x86_64
debug: adding new server URL to database 'multilib': http://mirror.gnomus.de/multilib/os/x86_64
debug: adding new server URL to database 'multilib': http://mirror.js-webcoding.de/pub/archlinux/multilib/os/x86_64
debug: adding new server URL to database 'multilib': http://archlinux.polymorf.fr/multilib/os/x86_64
debug: config: new section 'infinality-bundle'
debug: config: finish section 'infinality-bundle'
debug: registering sync database 'infinality-bundle'
debug: database path for tree infinality-bundle set to /var/lib/pacman/sync/infinality-bundle.db
debug: checking signature for /var/lib/pacman/sync/infinality-bundle.db
debug: 1 signatures returned
debug: fingerprint: A9244FB5E93F11F0E975337FAE6866C7962DDE58
debug: summary: valid
debug: summary: green
debug: status: Success
debug: timestamp: 1430276639
debug: exp_timestamp: 0
debug: validity: full; reason: Success
debug: key: A9244FB5E93F11F0E975337FAE6866C7962DDE58, bohoomil (dev key) <[email protected]>, owner_trust unknown, disabled 0
debug: signature is valid
debug: signature is fully trusted
debug: setting usage of 15 for infinality-bundle repoistory
debug: adding new server URL to database 'infinality-bundle': http://bohoomil.com/repo/x86_64
debug: config: new section 'infinality-bundle-multilib'
debug: config: finish section 'infinality-bundle-multilib'
debug: registering sync database 'infinality-bundle-multilib'
debug: database path for tree infinality-bundle-multilib set to /var/lib/pacman/sync/infinality-bundle-multilib.db
debug: checking signature for /var/lib/pacman/sync/infinality-bundle-multilib.db
debug: 1 signatures returned
debug: fingerprint: A9244FB5E93F11F0E975337FAE6866C7962DDE58
debug: summary: valid
debug: summary: green
debug: status: Success
debug: timestamp: 1430087321
debug: exp_timestamp: 0
debug: validity: full; reason: Success
debug: key: A9244FB5E93F11F0E975337FAE6866C7962DDE58, bohoomil (dev key) <[email protected]>, owner_trust unknown, disabled 0
debug: signature is valid
debug: signature is fully trusted
debug: setting usage of 15 for infinality-bundle-multilib repoistory
debug: adding new server URL to database 'infinality-bundle-multilib': http://bohoomil.com/repo/multilib/x86_64
debug: config: new section 'infinality-bundle-fonts'
debug: config: finish section 'infinality-bundle-fonts'
debug: registering sync database 'infinality-bundle-fonts'
debug: database path for tree infinality-bundle-fonts set to /var/lib/pacman/sync/infinality-bundle-fonts.db
debug: checking signature for /var/lib/pacman/sync/infinality-bundle-fonts.db
debug: 1 signatures returned
debug: fingerprint: A9244FB5E93F11F0E975337FAE6866C7962DDE58
debug: summary: valid
debug: summary: green
debug: status: Success
debug: timestamp: 1430276566
debug: exp_timestamp: 0
debug: validity: full; reason: Success
debug: key: A9244FB5E93F11F0E975337FAE6866C7962DDE58, bohoomil (dev key) <[email protected]>, owner_trust unknown, disabled 0
debug: signature is valid
debug: signature is fully trusted
debug: setting usage of 15 for infinality-bundle-fonts repoistory
debug: adding new server URL to database 'infinality-bundle-fonts': http://bohoomil.com/repo/fonts
debug: config: new section '(null)'
debug: config: finished parsing /etc/pacman.conf
debug: loading package cache for repository 'core'
debug: opening archive /var/lib/pacman/sync/core.db
debug: added 208 packages to package cache for db 'core'
debug: adding package 'gawk'
debug: loading package cache for repository 'local'
debug: added 1122 packages to package cache for db 'local'
warning: gawk-4.1.2-1 is up to date -- reinstalling
debug: adding package gawk-4.1.2-1 to the transaction add list
resolving dependencies...
debug: resolving target's dependencies
debug: started resolving dependencies
debug: checkdeps: package gawk-4.1.2-1
debug: finished resolving dependencies
looking for conflicting packages...
debug: looking for conflicts
debug: check targets vs targets
debug: check targets vs targets
debug: check targets vs db and db vs targets
debug: check targets vs db
debug: check db vs targets
debug: checking dependencies
debug: checkdeps: package gawk-4.1.2-1
debug: found cached pkg: /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz
debug: setting download size 0 for pkg gawk
debug: sorting by dependencies
debug: started sorting dependencies
debug: sorting dependencies finished
Package (1) Old Version New Version Net Change
core/gawk 4.1.2-1 4.1.2-1 0.00 MiB
Total Installed Size: 2.19 MiB
Net Upgrade Size: 0.00 MiB
:: Proceed with installation? [Y/n] y
debug: using cachedir: /var/cache/pacman/pkg/
debug: using cachedir: /var/cache/pacman/pkg/
checking keyring...
debug: looking up key 771DF6627EDF681F locally
debug: key lookup success, key exists
checking package integrity...
debug: found cached pkg: /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz
debug: sig data: iQEcBAABCAAGBQJVQNc+AAoJEHcd9mJ+32gfQZgH/jkRiirmPTb4nE0xgcFGKc8wrxw3k9ooGyMFoeqAthTICB/5dBzNfEQ8b4X74gi8KiYQVYm4WE8kWIidUj5ekJhGwngO6Gk+lwyBq+Uh8rUHDJKw557fImM2bBah2lxNUxqZzxYTA1FByq2lptLB5EPJgAPemyUXACMXITDfqtWMpuHIEPLZi5WW9+cB0eMKz5IeEEfZi4lO2fyfRqxNkRDNSmC5NEDkfhm+XVXBEd4gugSOmYpKzlA67mjw2HP+oOyNheL8st4SjgFr/qVDdbfiBbaTTujC4mF1n73z5qp4K5/xgHqk42ftoo003XFQYVOAg3bDWMvUF5d63D4+HKg=
debug: checking signature for /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz
debug: 1 signatures returned
debug: fingerprint: 5B7E3FB71B7F10329A1C03AB771DF6627EDF681F
debug: summary: valid
debug: summary: green
debug: status: Success
debug: timestamp: 1430312766
debug: exp_timestamp: 0
debug: validity: full; reason: Success
debug: key: 5B7E3FB71B7F10329A1C03AB771DF6627EDF681F, Tobias Powalowski <[email protected]>, owner_trust unknown, disabled 0
debug: signature is valid
debug: signature is fully trusted
loading package files...
debug: found cached pkg: /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz
debug: replacing pkgcache entry with package file for target gawk
debug: opening archive /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz
debug: starting package load for /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz
debug: found mtree for package /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz, getting file list
debug: finished mtree reading for /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz
debug: sorting package filelist for /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz
checking for file conflicts...
debug: looking for file conflicts
debug: searching for file conflicts: gawk
debug: searching for filesystem conflicts: gawk
checking available disk space...
debug: checking available disk space
debug: discovered mountpoint: /tmp
debug: discovered mountpoint: /sys/kernel/security
debug: discovered mountpoint: /sys/kernel/debug
debug: discovered mountpoint: /sys/kernel/config
debug: discovered mountpoint: /sys/fs/pstore
debug: discovered mountpoint: /sys/fs/cgroup/systemd
debug: discovered mountpoint: /sys/fs/cgroup/net_cls
debug: discovered mountpoint: /sys/fs/cgroup/memory
debug: discovered mountpoint: /sys/fs/cgroup/freezer
debug: discovered mountpoint: /sys/fs/cgroup/devices
debug: discovered mountpoint: /sys/fs/cgroup/cpuset
debug: discovered mountpoint: /sys/fs/cgroup/cpu,cpuacct
debug: discovered mountpoint: /sys/fs/cgroup/blkio
debug: discovered mountpoint: /sys/fs/cgroup
debug: discovered mountpoint: /sys
debug: discovered mountpoint: /run/user/1000
debug: discovered mountpoint: /run
debug: discovered mountpoint: /proc/sys/fs/binfmt_misc
debug: discovered mountpoint: /proc
debug: discovered mountpoint: /home/skanky/personal
debug: discovered mountpoint: /home
debug: discovered mountpoint: /dev/shm
debug: discovered mountpoint: /dev/pts
debug: discovered mountpoint: /dev/mqueue
debug: discovered mountpoint: /dev/hugepages
debug: discovered mountpoint: /dev
debug: discovered mountpoint: /
debug: loading fsinfo for /
debug: partition /, needed 0, cushion 5121, free 1174711
debug: installing packages
reinstalling gawk...
debug: reinstalling package gawk-4.1.2-1
debug: opening archive /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz
debug: extracting: .INSTALL
debug: removing old package first (gawk-4.1.2-1)
debug: removing 110 files
debug: unlinking /usr/share/man/man3/time.3am.gz
debug: unlinking /usr/share/man/man3/rwarray.3am.gz
debug: unlinking /usr/share/man/man3/revtwoway.3am.gz
debug: unlinking /usr/share/man/man3/revoutput.3am.gz
debug: unlinking /usr/share/man/man3/readfile.3am.gz
debug: unlinking /usr/share/man/man3/readdir.3am.gz
debug: unlinking /usr/share/man/man3/ordchr.3am.gz
debug: unlinking /usr/share/man/man3/inplace.3am.gz
debug: unlinking /usr/share/man/man3/fork.3am.gz
debug: unlinking /usr/share/man/man3/fnmatch.3am.gz
debug: unlinking /usr/share/man/man3/filefuncs.3am.gz
debug: keeping directory /usr/share/man/man3/ (contains files)
debug: unlinking /usr/share/man/man1/igawk.1.gz
debug: unlinking /usr/share/man/man1/gawk.1.gz
debug: keeping directory /usr/share/man/man1/ (contains files)
debug: keeping directory /usr/share/man/ (contains files)
debug: unlinking /usr/share/locale/vi/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/vi/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/vi/ (contains files)
debug: unlinking /usr/share/locale/sv/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/sv/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/sv/ (contains files)
debug: unlinking /usr/share/locale/pl/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/pl/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/pl/ (contains files)
debug: unlinking /usr/share/locale/nl/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/nl/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/nl/ (contains files)
debug: unlinking /usr/share/locale/ms/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/ms/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/ms/ (contains files)
debug: unlinking /usr/share/locale/ja/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/ja/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/ja/ (contains files)
debug: unlinking /usr/share/locale/it/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/it/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/it/ (contains files)
debug: unlinking /usr/share/locale/fr/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/fr/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/fr/ (contains files)
debug: unlinking /usr/share/locale/fi/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/fi/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/fi/ (contains files)
debug: unlinking /usr/share/locale/es/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/es/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/es/ (contains files)
debug: unlinking /usr/share/locale/de/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/de/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/de/ (contains files)
debug: unlinking /usr/share/locale/da/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/da/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/da/ (contains files)
debug: unlinking /usr/share/locale/ca/LC_MESSAGES/gawk.mo
debug: keeping directory /usr/share/locale/ca/LC_MESSAGES/ (contains files)
debug: keeping directory /usr/share/locale/ca/ (contains files)
debug: keeping directory /usr/share/locale/ (contains files)
debug: unlinking /usr/share/info/gawkinet.info.gz
debug: unlinking /usr/share/info/gawk.info.gz
debug: keeping directory /usr/share/info/ (contains files)
debug: unlinking /usr/share/awk/zerofile.awk
debug: unlinking /usr/share/awk/walkarray.awk
debug: unlinking /usr/share/awk/strtonum.awk
debug: unlinking /usr/share/awk/shellquote.awk
debug: unlinking /usr/share/awk/round.awk
debug: unlinking /usr/share/awk/rewind.awk
debug: unlinking /usr/share/awk/readfile.awk
debug: unlinking /usr/share/awk/readable.awk
debug: unlinking /usr/share/awk/quicksort.awk
debug: unlinking /usr/share/awk/processarray.awk
debug: unlinking /usr/share/awk/passwd.awk
debug: unlinking /usr/share/awk/ord.awk
debug: unlinking /usr/share/awk/noassign.awk
debug: unlinking /usr/share/awk/libintl.awk
debug: unlinking /usr/share/awk/join.awk
debug: unlinking /usr/share/awk/inplace.awk
debug: unlinking /usr/share/awk/group.awk
debug: unlinking /usr/share/awk/gettime.awk
debug: unlinking /usr/share/awk/getopt.awk
debug: unlinking /usr/share/awk/ftrans.awk
debug: unlinking /usr/share/awk/ctime.awk
debug: unlinking /usr/share/awk/cliff_rand.awk
debug: unlinking /usr/share/awk/bits2str.awk
debug: unlinking /usr/share/awk/assert.awk
debug: keeping directory /usr/share/awk/ (in new package)
debug: keeping directory /usr/share/ (contains files)
debug: unlinking /usr/lib/gawk/time.so
debug: unlinking /usr/lib/gawk/testext.so
debug: unlinking /usr/lib/gawk/rwarray.so
debug: unlinking /usr/lib/gawk/revtwoway.so
debug: unlinking /usr/lib/gawk/revoutput.so
debug: unlinking /usr/lib/gawk/readfile.so
debug: unlinking /usr/lib/gawk/readdir.so
debug: unlinking /usr/lib/gawk/ordchr.so
debug: unlinking /usr/lib/gawk/inplace.so
debug: unlinking /usr/lib/gawk/fork.so
debug: unlinking /usr/lib/gawk/fnmatch.so
debug: unlinking /usr/lib/gawk/filefuncs.so
debug: keeping directory /usr/lib/gawk/ (in new package)
debug: unlinking /usr/lib/awk/pwcat
debug: unlinking /usr/lib/awk/grcat
debug: keeping directory /usr/lib/awk/ (in new package)
debug: keeping directory /usr/lib/ (contains files)
debug: unlinking /usr/include/gawkapi.h
debug: keeping directory /usr/include/ (contains files)
debug: unlinking /usr/bin/igawk
debug: unlinking /usr/bin/gawk-4.1.2
debug: unlinking /usr/bin/gawk
debug: unlinking /usr/bin/awk
debug: keeping directory /usr/bin/ (contains files)
debug: keeping directory /usr/ (contains files)
debug: removing database entry 'gawk'
debug: removing entry 'gawk' from 'local' cache
debug: extracting files
debug: opening archive /var/cache/pacman/pkg/gawk-4.1.2-1-x86_64.pkg.tar.xz
debug: skipping extraction of '.PKGINFO'
debug: extracting /var/lib/pacman/local/gawk-4.1.2-1/install
debug: extracting /var/lib/pacman/local/gawk-4.1.2-1/mtree
debug: extract: skipping dir extraction of /usr/
debug: extract: skipping dir extraction of /usr/lib/
debug: extract: skipping dir extraction of /usr/share/
debug: extract: skipping dir extraction of /usr/include/
debug: extract: skipping dir extraction of /usr/bin/
debug: extracting /usr/bin/igawk
debug: extracting /usr/bin/awk
debug: extracting /usr/bin/gawk-4.1.2
debug: extracting /usr/bin/gawk
debug: extracting /usr/include/gawkapi.h
debug: extract: skipping dir extraction of /usr/share/locale/
debug: extract: skipping dir extraction of /usr/share/awk/
debug: extract: skipping dir extraction of /usr/share/info/
debug: extract: skipping dir extraction of /usr/share/man/
debug: extract: skipping dir extraction of /usr/share/man/man3/
debug: extract: skipping dir extraction of /usr/share/man/man1/
debug: extracting /usr/share/man/man1/gawk.1.gz
debug: extracting /usr/share/man/man1/igawk.1.gz
debug: extracting /usr/share/man/man3/filefuncs.3am.gz
debug: extracting /usr/share/man/man3/fnmatch.3am.gz
debug: extracting /usr/share/man/man3/fork.3am.gz
debug: extracting /usr/share/man/man3/inplace.3am.gz
debug: extracting /usr/share/man/man3/ordchr.3am.gz
debug: extracting /usr/share/man/man3/readdir.3am.gz
debug: extracting /usr/share/man/man3/readfile.3am.gz
debug: extracting /usr/share/man/man3/revoutput.3am.gz
debug: extracting /usr/share/man/man3/revtwoway.3am.gz
debug: extracting /usr/share/man/man3/rwarray.3am.gz
debug: extracting /usr/share/man/man3/time.3am.gz
debug: extracting /usr/share/info/gawk.info.gz
debug: extracting /usr/share/info/gawkinet.info.gz
debug: extracting /usr/share/awk/zerofile.awk
debug: extracting /usr/share/awk/walkarray.awk
debug: extracting /usr/share/awk/strtonum.awk
debug: extracting /usr/share/awk/shellquote.awk
debug: extracting /usr/share/awk/round.awk
debug: extracting /usr/share/awk/rewind.awk
debug: extracting /usr/share/awk/readfile.awk
debug: extracting /usr/share/awk/readable.awk
debug: extracting /usr/share/awk/quicksort.awk
debug: extracting /usr/share/awk/processarray.awk
debug: extracting /usr/share/awk/ord.awk
debug: extracting /usr/share/awk/noassign.awk
debug: extracting /usr/share/awk/libintl.awk
debug: extracting /usr/share/awk/join.awk
debug: extracting /usr/share/awk/inplace.awk
debug: extracting /usr/share/awk/gettime.awk
debug: extracting /usr/share/awk/getopt.awk
debug: extracting /usr/share/awk/ftrans.awk
debug: extracting /usr/share/awk/ctime.awk
debug: extracting /usr/share/awk/cliff_rand.awk
debug: extracting /usr/share/awk/bits2str.awk
debug: extracting /usr/share/awk/assert.awk
debug: extracting /usr/share/awk/group.awk
debug: extracting /usr/share/awk/passwd.awk
debug: extract: skipping dir extraction of /usr/share/locale/vi/
debug: extract: skipping dir extraction of /usr/share/locale/sv/
debug: extract: skipping dir extraction of /usr/share/locale/pl/
debug: extract: skipping dir extraction of /usr/share/locale/nl/
debug: extract: skipping dir extraction of /usr/share/locale/ms/
debug: extract: skipping dir extraction of /usr/share/locale/ja/
debug: extract: skipping dir extraction of /usr/share/locale/it/
debug: extract: skipping dir extraction of /usr/share/locale/fr/
debug: extract: skipping dir extraction of /usr/share/locale/fi/
debug: extract: skipping dir extraction of /usr/share/locale/es/
debug: extract: skipping dir extraction of /usr/share/locale/de/
debug: extract: skipping dir extraction of /usr/share/locale/da/
debug: extract: skipping dir extraction of /usr/share/locale/ca/
debug: extract: skipping dir extraction of /usr/share/locale/ca/LC_MESSAGES/
debug: extracting /usr/share/locale/ca/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/da/LC_MESSAGES/
debug: extracting /usr/share/locale/da/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/de/LC_MESSAGES/
debug: extracting /usr/share/locale/de/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/es/LC_MESSAGES/
debug: extracting /usr/share/locale/es/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/fi/LC_MESSAGES/
debug: extracting /usr/share/locale/fi/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/fr/LC_MESSAGES/
debug: extracting /usr/share/locale/fr/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/it/LC_MESSAGES/
debug: extracting /usr/share/locale/it/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/ja/LC_MESSAGES/
debug: extracting /usr/share/locale/ja/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/ms/LC_MESSAGES/
debug: extracting /usr/share/locale/ms/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/nl/LC_MESSAGES/
debug: extracting /usr/share/locale/nl/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/pl/LC_MESSAGES/
debug: extracting /usr/share/locale/pl/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/sv/LC_MESSAGES/
debug: extracting /usr/share/locale/sv/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/share/locale/vi/LC_MESSAGES/
debug: extracting /usr/share/locale/vi/LC_MESSAGES/gawk.mo
debug: extract: skipping dir extraction of /usr/lib/gawk/
debug: extract: skipping dir extraction of /usr/lib/awk/
debug: extracting /usr/lib/awk/pwcat
debug: extracting /usr/lib/awk/grcat
debug: extracting /usr/lib/gawk/filefuncs.so
debug: extracting /usr/lib/gawk/fnmatch.so
debug: extracting /usr/lib/gawk/fork.so
debug: extracting /usr/lib/gawk/inplace.so
debug: extracting /usr/lib/gawk/ordchr.so
debug: extracting /usr/lib/gawk/readdir.so
debug: extracting /usr/lib/gawk/readfile.so
debug: extracting /usr/lib/gawk/revoutput.so
debug: extracting /usr/lib/gawk/revtwoway.so
debug: extracting /usr/lib/gawk/rwarray.so
debug: extracting /usr/lib/gawk/testext.so
debug: extracting /usr/lib/gawk/time.so
debug: updating database
debug: adding database entry 'gawk'
debug: writing gawk-4.1.2-1 DESC information back to db
debug: writing gawk-4.1.2-1 FILES information back to db
debug: adding entry 'gawk' in 'local' cache
debug: executing ". /tmp/alpm_r21DA5/.INSTALL; post_upgrade 4.1.2-1 4.1.2-1"
debug: executing "/usr/bin/bash" under chroot "/"
debug: call to waitpid succeeded
error: command failed to execute correctly
debug: running ldconfig
debug: executing "/usr/bin/ldconfig" under chroot "/"
debug: call to waitpid succeeded
debug: unregistering database 'local'
debug: freeing package cache for repository 'local'
debug: unregistering database 'core'
debug: freeing package cache for repository 'core'
debug: unregistering database 'extra'
debug: unregistering database 'xyne-x86_64'
debug: unregistering database 'community'
debug: unregistering database 'multilib'
debug: unregistering database 'infinality-bundle'
debug: unregistering database 'infinality-bundle-multilib'
debug: unregistering database 'infinality-bundle-fonts'
pacman thinks the upgrade/reinstall was successful in that the latest version is installed.
I searched the forums, and the only other issue that seemed related was out-of-date microcode, but I followed the update instructions some time back and as far as I can tell the microcode is current.
I have two main questions:
1) How do I work out what's causing the error, from the output above?
2) Is there a way to work out which packages gave the error, so I can make sure they're installed properly?
Thanks.
The following packages also had problems:
( 2/17) upgrading glibc
error: command failed to execute correctly
( 3/17) upgrading binutils
error: command failed to execute correctly
( 4/17) upgrading coreutils
error: command failed to execute correctly
( 8/17) upgrading gcc
error: command failed to execute correctly
( 9/17) upgrading gcc-fortran
error: command failed to execute correctly
(10/17) upgrading gcc-libs
error: command failed to execute correctly
Does anybody have a clue?
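The scriptlet error in the log above comes from pacman sourcing the package's .INSTALL file and calling post_upgrade in a chroot. One way to see what a scriptlet itself prints is to replay that call by hand. This sketch uses a mock package archive (all names here are hypothetical) rather than the real gawk package, but the mechanism - extract .INSTALL, source it, call post_upgrade old-ver new-ver - is the same one shown in the debug output:

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Build a mock .INSTALL containing the post_upgrade function pacman invokes
cat > .INSTALL <<'EOF'
post_upgrade() {
  echo "post_upgrade ran with args: $1 $2"
}
EOF
tar -czf mock-pkg.tar.gz .INSTALL
rm .INSTALL

# Replay pacman's call from the debug log:
#   ". /tmp/alpm_r21DA5/.INSTALL; post_upgrade 4.1.2-1 4.1.2-1"
tar -xzf mock-pkg.tar.gz .INSTALL
result=$(. ./.INSTALL; post_upgrade 4.1.2-1 4.1.2-1)
echo "$result"
```

Doing the same extract-and-source against the real archive under /var/cache/pacman/pkg/ may show more detail than the bare "command failed to execute correctly" line.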
Thanks, -
BPM process archiving "Processing of archiving write command failed"
Can someone help me with the following problem. After archiving a BPM proces, I get the following messages (summary):
ERROR Processing of archiving write command failed
ERROR Job "d5e2a9d9ea8111e081260000124596b3" could not be run as user"E61006".
LOG -> Processing of archiving write command failed
[EXCEPTION] com.sap.glx.arch.xml.XmlArchException: Cannot create archivable items from object
Caused by: java.lang.ClassCastException: ...
Configuration
I've completed the following steps based on a blog item.
1. created an archive user with the corresponding roles
2. updated the destination DASdefault with the created user -> destination ping = OK
3. created an archive store BPM_ARCH based on unix root folder
4. created home path synchronization with home path /&lt;sisid&gt;/bpm_proc/ and archive store BPM_ARCH
5. start process archiving from manage processes view.
Process Archiving
Manage Process -> Select a process from the table -> Archive button -> Start archiving by using the default settings.
Archiving Monitor
The following log is created which describe that the write command failed.
Write phase log:
[2011.09.29 12:00:18 CEST] INFO Job bpm_proc_write (ID: d5e2a9d9ea8111e081260000124596b3, JMS ID: ID:124596B30000009D-000000000C08) started on Thu, 29 Sep 2011 12:00:18:133 CEST by scheduler: 5e11a5e0df3111decc2d00237d240438
[2011.09.29 12:00:18 CEST] INFO Start execution of job named: bpm_proc_write
[2011.09.29 12:00:18 CEST] INFO Job status: RUNNING
[2011.09.29 12:00:18 CEST] ERROR Processing of archiving write command failed
[2011.09.29 12:00:18 CEST] INFO Start processing of archiving write command ...
Verify Indexes ...
Archive XML schema ...
Resident Policy for object selection is instanceIds = [9ca38cb2343511e0849600269e82721e] , timePeriod = 1317290418551 , inError = false ,
[2011.09.29 12:00:18 CEST] ERROR Job "d5e2a9d9ea8111e081260000124596b3" could not be run as user"E61006".
[2011.09.29 12:00:18 CEST] INFO Job bpm_proc_write (ID: d5e2a9d9ea8111e081260000124596b3, JMS ID: ID:124596B30000009D-000000000C08) ended on Thu, 29 Sep 2011 12:00:18:984 CEST
Log viewer
The following message is created in the log viewer.
Processing of archiving write command failed
[EXCEPTION]
com.sap.glx.arch.xml.XmlArchException: Cannot create archivable items from object
at com.sap.engine.core.thread.execution.CentralExecutor$SingleThread.run(CentralExecutor.java:328)
Caused by: java.lang.ClassCastException: class com.sap.glx.arch.Archivable:sap.com/tcbpemarchear @[email protected]2@alive incompatible with interface com.sap.glx.util.id.UID:library:tcbpembaselib @[email protected]f@alive
at com.sap.glx.arch.him.xml.JaxbTaskExtension.createJaxbObjects(JaxbTaskExtension.java:69)
at com.sap.glx.arch.xml.JaxbSession.fillFromExtensions(JaxbSession.java:73)
at com.sap.glx.arch.pm.xml.ArchProcessExtension.fillHimObjects(ArchProcessExtension.java:113)
at com.sap.glx.arch.pm.xml.ArchProcessExtension.createArchObjectItem(ArchProcessExtension.java:60)
at com.sap.glx.arch.xml.JaxbSession.createArchObjectItems(JaxbSession.java:39)
at com.sap.glx.arch.xml.Marshaller.createItems(Marshaller.java:29)
... 61 more
Hi Martin,
I don't have a specific answer, sorry; however, I do recall seeing a number of OSS notes around BPM archiving while searching for a different issue last year - have you checked there for anything relevant to your current version and SP level? There were quite a few notes, if memory serves me well!
Regards,
Gareth. -
Can't import config to anything beyond 6.01 - say it isn't so
Hi all
Just downloaded, burned, and delivered MARS 6 image to one of our sites last week. Target box reimaged from 4.3.6. Exported config from 4.3.6, etc. according to the Cisco documents.
After the re-image, on the config import, i hit this error
File from_04_3_60_to_06_0_60.sql missing from schema.
Configuration import failed with error code: 1
Configrestore failed!
Error: failed to import config data.
A bit of digging - turns out apparently I have to reimage specifically to v 6.01, import the original config to that version, then update the 6.01 image with each sequential 6.x point release individually.
Is this correct?
OK - you got me on a technicality - the upgrade docs do refer to 6.01, but silly me, I just assumed that subsequent point releases of MARS would maintain / preserve the same functionality so I used the latest available 6.06 image.
Is there a quick fix? otherwise this looks like a do-over -and if that works, then a lot of incremental updates.
<sigh>...
Yes, unfortunately migration from version 4.3.6 only works to version 6.0.1, and you would need to perform incremental upgrades from version 6.0.1 onwards. Sorry, no easy and quick way, unfortunately.
-
VLAN RUNNING Config fetch Failed
Hi;
I am running CiscoWorks LMS 3.2 with RME 4.3, and SNMPv3 on all devices. Out of 134 devices, 12-13 show "VLAN running config fetch failed". Please also find the dcmaservice.log in the attachment.
I have already verified the credentials a number of times. I can access the devices via SSH and Telnet; in CiscoWorks, RME shows the running and startup config fetches as successful while the VLAN running config fetch fails, and on some devices it shows the following:
Protocol and Platforms passed = TELNET , RMEIOS
trying for Telnet
Failed to get CmdSvc from DeviceContext....
Thanks,
Best regards;
Shoaib Ahmed
Hi Shoaib,
Try SSH/Telnet into the device manually, issue the command copy flash:vlan.dat tftp:, and enter the IP address of the LMS server. Does it work? If it works fine, try increasing the TFTP timeout.
This can be done under RME> Admin> System Preferences> RME Device Attributes.
If it fails with an error like:
%Error opening tftp: ............
then check whether there is a firewall between these devices and LMS blocking TFTP, as TFTP is the only protocol that fetches the vlan.dat from the device.
Thanks--
Afroj -
Can't run service perfigo config command on NAC
I have a new NAC Manager server for a fresh deployment. I logged in using root password with a serial connection to the server.
I can't seem to be able to run the "service perfigo config" command to perform the initial CAM configuration.
[root@nacmanager /]# service perfigo start
perfigo: unrecognized service
[root@nacmanager /]#
Any idea what might be the problem?
Thanks in advance.
What happens when you boot from a CD with the NAC ISO on it?
Usually, during boot, you should receive a installer welcome message:
Cisco Clean Access 4.8.2 Installer (C) 2011 Cisco Systems, Inc.
Welcome to the Cisco Clean Access Installer!
- To install a Cisco Clean Access device, press the <Enter> key.
- To install a Cisco Clean Access device over a serial console, enter serial at the boot prompt and press the <Enter> key.
boot: serial
You can type serial if you are connected through the console; after that it will check for existing installations and ask whether you want to install a NAC Manager or Server:
Please choose one of the following configurations:
1) CCA Manager.
2) CCA Server.
3) Exit.
You choose 1 or 2 depending on the server type. The software will install and the server will reboot.
After the reboot, you login as root and automatically the configuration utility will launch by itself and you will set the basic parameters:
CentOS release 5.3 (Final)
Kernel 2.6.18-128.1.10.el5PAE on an i686
nacmanager login: root
Welcome to the Cisco Clean Access Manager quick configuration utility.
Note that you need to be root to execute this utility.
The utility will now ask you a series of configuration questions.
Please answer them carefully.
Cisco Clean Access Manager, (C) 2011 Cisco Systems, Inc.
Configuring the network interface:
Please enter the IP address for the interface eth0 []: 172.30.1.1
You entered 172.30.1.1 Is this correct? (y/n)? [y]
etc......
Hope this helps.
Regards, -
Hi,
Can the 9.3.1 ImportExport utility import a file exported from 9.2.0.0?
When I ran 9.3.1’s ImportExport utility to import 9.2’s export.csv, it gave me the following error messages:
D:\Hyperion\common\utilities\CSSImportExportUtility\importexport>CSSImport.bat importexport2.properties
Trace...
2009-04-07 01:05:18,025 Attempting a import operation
2009-04-07 01:05:20,728 Import : Attempting to create user admin
2009-04-07 01:05:20,728 Import : Password not specified for user admin
2009-04-07 01:05:20,744 Import : Unable to create user admin
2009-04-07 01:05:20,744 Import : Create user admin failed. Attempting update.
2009-04-07 01:05:20,744 Import : Attempting to update user admin
2009-04-07 01:05:20,775 Import : User updated : admin
2009-04-07 01:05:20,790 Import : Attempting to create user essadmin
2009-04-07 01:05:20,790 Import : Password not specified for user essadmin
2009-04-07 01:05:20,790 Import : Unable to create user essadmin
2009-04-07 01:05:20,790 Import : Create user essadmin failed. Attempting update.
2009-04-07 01:05:20,790 Import : Attempting to update user essadmin
2009-04-07 01:05:20,806 Import : Unable to locate user essadmin
The number of errors detected during the import operation exceeded the number of allowable errors. Please correct the errors and then rerun the program or increase the value of the import.maxerrors property.
Aborting program...
====================
We have a user essadmin in 9.2.0.0 but not in 9.3.1.
Since we already have a user called admin in 9.3.1, I deleted user admin in the export.csv file and tried to import again.
The 9.3.1 ImportExport utility did not like the newly updated export.csv file.
Hello John,
I'm exporting from 9.2.0.0.
I'll try updating my 9.2's export file with password and resubmitting it.
In the meantime, I was told 9.3.1 ImportExport will run in 9.2.0.0. However, every time I tried that, it gave me an error.
When I run 9.3.1's cssexport.bat importexport.properties in 9.2.0's HSharedServices server, I get the error message:
The system cannot find the path specified.
I attach the importexport.properties file for your examination. All the paths are correct, such as:
C:/Hyperion/SharedServices/9.2/AppServer/InstalledApps/WebLogic/8.1/CSS.xml
C:/Hyperion/common/utilities/importexport931/trace.log
What am I doing wrong?
my css.xml file was found in /C:/Hyperion/SharedServices/9.2/AppServer/InstalledApps/WebLogic/8.1/CSS.xml.
We are using WebLogic. Is that the correct location?
I don't get the above messages when I run 9.3.1's cssexport.bat importexport.properties in 9.3.1
importexport.css=file:/c:/Hyperion/SharedServices/9.2/AppServer/InstalledApps/WebLogic/8.1/CSS.xml
importexport.cmshost=localhost
importexport.cmsport=58080
importexport.username=admin
importexport.password={CSS}MRcYv323uzxGr8rFdvQLcA==
importexport.enable.console.traces=true
importexport.trace.events.file=d:/Hyperion/common/utilities/CSSImportExportUtility/importexport/trace.log
importexport.errors.log.file=d:/Hyperion/common/utilities/CSSImportExportUtility/importexport/errors.log
importexport.locale=en
importexport.ssl_enabled=false
export.fileformat=csv
export.file=d:/Hyperion/common/utilities/CSSImportExportUtility/importexport/exportHSS.csv
export.internal.identities=true
export.native.user.passwords=true
export.provisioning.all=true
export.delegated.lists=false
export.user.filter=*
export.group.filter=*
export.role.filter=*
export.producttype=*
export.provisioning.apps=* -
Importing registered servers fails in SSMS 2012
I exported servers two days ago and did a clean install of Win7. Now, when I try to import that sqlservers.regsrvr file, I get this error:
===================================
Key not valid for use in specified state.
(System.Security)
Program Location:
at System.Security.Cryptography.ProtectedData.Unprotect(Byte[] encryptedData, Byte[] optionalEntropy, DataProtectionScope scope)
at Microsoft.SqlServer.Management.RegisteredServers.RegisteredServer.ProtectData(String input, Boolean encrypt)
at Microsoft.SqlServer.Management.RegisteredServers.RegisteredServer.get_SecureConnectionString()
at Microsoft.SqlServer.Management.RegisteredServers.RegisteredServer.get_ConnectionString()
at Microsoft.SqlServer.Management.RegisteredServers.RegisteredServer.get_ServerName()
at Microsoft.SqlServer.Management.RegisteredServers.RegisteredServerTree.AddRegisteredServerNode(RegisteredServer regSrv, TreeNodeCollection nodes)
at Microsoft.SqlServer.Management.RegisteredServers.RegisteredServerTree.DoOnRegisteredServerCreated(RegisteredServer server)
at Microsoft.SqlServer.Management.RegisteredServers.RegisteredServerTree.OnRegisteredServerCreated(RegisteredServer server)
at Microsoft.SqlServer.Management.RegisteredServers.RegisteredServerControl.Events_SfcObjectCreated(Object sender, SfcObjectCreatedEventArgs e)
at Microsoft.SqlServer.Management.Sdk.Sfc.SfcApplication.SfcObjectCreatedEventHandler.Invoke(Object sender, SfcObjectCreatedEventArgs e)
at Microsoft.SqlServer.Management.Sdk.Sfc.SfcApplicationEvents.OnObjectCreated(SfcInstance obj, SfcObjectCreatedEventArgs e)
at Microsoft.SqlServer.Management.RegisteredServers.ServerGroup.RaiseSfcAppObjectCreatedEvent(SfcInstance obj)
at Microsoft.SqlServer.Management.RegisteredServers.ServerGroup.Import(String file)
The operation 'Import' failed. (Microsoft.SqlServer.Management.RegisteredServers)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=11.0.2100.60+((SQL11_RTM).120210-1846+)&LinkId=20476
Program Location:
at Microsoft.SqlServer.Management.RegisteredServers.ServerGroup.Import(String file)
at Microsoft.SqlServer.Management.RegisteredServers.Utils.Import(ServerGroup parentGroup, String filename)
at Microsoft.SqlServer.Management.RegisteredServers.RegisteredServersContextMenuManager.ContextImport(Object sender, EventArgs e)
All exported servers were remote, and I checked "save username and passwords" in the export window.
Hi MaysamSh,
According to your description, you want to import and export registered servers between computers by using SSMS. The error may occur due to encryption in your existing registered server list; you can try deleting or renaming the files below, and then try importing the registered servers again.
C:\Users\[USERNAME]\AppData\Local\Microsoft\Microsoft SQL Server\100\Tools\Shell\RegSrvr.xml
OR
C:\Users\[USERNAME]\AppData\Roaming\Microsoft\Microsoft SQL Server\100\Tools\Shell\RegSrvr.xml
For more information, you can refer to the following article.
http://chocosmith.wordpress.com/2011/02/02/ssms-r2-key-not-valid-for-use-in-specified-state/
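The rename step suggested above can be scripted. This sketch runs against a mock profile directory (the real file lives under C:\Users\[USERNAME]\AppData\...\Shell\RegSrvr.xml as listed above); renaming rather than deleting keeps a fallback if the import still fails:

```shell
set -e
# Mock profile directory standing in for the real AppData path
profile=$(mktemp -d)
mkdir -p "$profile/Shell"
echo '<RegisteredServers/>' > "$profile/Shell/RegSrvr.xml"

# Rename instead of deleting, so the old registration list can be
# restored if the re-import does not help
if [ -f "$profile/Shell/RegSrvr.xml" ]; then
  mv "$profile/Shell/RegSrvr.xml" "$profile/Shell/RegSrvr.xml.bak"
fi
remaining=$(ls "$profile/Shell")
echo "$remaining"
```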
Regards,
Sofiya Li
TechNet Community Support -
Import seed data failed aiaconfig.sh
Hi All,
Please find the screen shot below. I am getting "import seed data failed" while running aiaconfig.sh. I have selected the Oracle Value Chain Planning Integration Base Pack.
Please find the screen shot and log file.
http://www.4shared.com/photo/ZTJH6PSX/4shared_AIA.html
Buildfile: AIAPLWImportData.xml
[echo] AIA HOME: /u01/oraaia/Oracle/Middleware/aia_home
[echo] AIA Instance: /u01/oraaia/Oracle/Middleware/aia_home/aia_instances/AIA_VCP
[echo] Importing /u01/oraaia/Oracle/Middleware/aia_home/data/VCPJDE/PLWSeedData/VCPJDESeed.xml
all:
[echo] Executing /u01/oraaia/Oracle/Middleware/aia_home/Infrastructure/LifeCycle/PLWImExport/PLWImport.sh -f /u01/oraaia/Oracle/Middleware/aia_home/data/VCPJDE/PLWSeedData/VCPJDESeed.xml
[echo] Shell: /bin/sh
BUILD FAILED
/u01/oraaia/Oracle/Middleware/aia_home/Infrastructure/Install/AID/AIAPLWImportData.xml:48: exec returned: 1
Total time: 2 seconds
Configuration Command Return Value: 1
Any advice on this would be appreciated.
Regards,
Nag. -
3640 - AAA/AUTHOR: config command authorization not enabled
Hello, I have a 3640 router with the c3640-ik9o3sw6-mz.122-8.T.bin image, but when I try to validate the username and password against a RADIUS server, the debug message is "AAA/AUTHOR: config command authorization not enabled", and I'm sure the RADIUS server validates the user and the packet arrives at the router.
I've tried to update the IOS with c3640-ik9o3s-mz.122-46a.bin and I can validate but I cannot use "crypto isakmp client configuration group mygroup" to configure Easy VPN server.
I attach you the files with config and logs.
Thank you in advance.
Yep! I'm really running 12.1!
I'm receiving the message once I include "aaa authorization exec default group radius local if-authenticated" in the config.
Login is successful; however, authorization does not allow me to go directly into enable mode. If I take the aaa authorization line out, I can log in to user mode and then use the enable password to move forward, but that is not what I wish to achieve.
sh run | i aaa
aaa new-model
aaa authentication attempts login 5
aaa authentication banner ^C
aaa authentication fail-message ^C
aaa authentication login My-RADIUS group radius local
aaa accounting exec My-RADIUS start-stop group radius
aaa session-id common
Is there somewhere specific I was supposed to enable AAA authorization? Because I'm not seeing it.
Let me know what other thoughts you may have.
Thanks
Nik -
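For the AAA thread above: the debug line "config command authorization not enabled" usually points at command authorization not being switched on at all, since it has its own IOS statement. A hedged sketch of the relevant lines - assumptions for illustration, not a verified fix for this particular router:

```
! Exec authorization (as already configured above), plus command
! authorization for configuration-mode commands:
aaa new-model
aaa authorization exec default group radius local if-authenticated
aaa authorization config-commands
! To land directly in privileged mode, the RADIUS server must also
! return a privilege level, commonly via the Cisco AV pair:
!   shell:priv-lvl=15
```

Whether this coexists with the "crypto isakmp client configuration group" issue on the newer IOS image would still need to be tested separately.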
Error while compiling BerkeleyDB: Command failed for target `util_sig.lo'
Hi,
While trying to compile BerkeleyDB on Solaris 10 x86 (is there an easier way to get BerkeleyDB running than compiling from source?), I was confronted with the error message appended.
I used Berkeley 4.4.20 and configured by executing
PATH=/usr/ccs/bin:${PATH} # for ar
cd build_unix
CC=gcc ../dist/configure
Here the error message:
UNIX /app/berkeleydb/db/4.4.20/build_unix
$ make
/usr/bin/sh ./libtool --mode=compile gcc -c -I. -I../dist/.. -D_REENTRANT -O2 ../common/util_sig.c
gcc -c -I. -I../dist/.. -D_REENTRANT -O2 ../common/util_sig.c -fPIC -DPIC -o .libs/util_sig.o
In file included from /usr/include/sys/signal.h:34,
from /usr/include/signal.h:26,
from ../common/util_sig.c:15:
/usr/include/sys/siginfo.h:259: error: parse error before "ctid_t"
/usr/include/sys/siginfo.h:292: error: parse error before '}' token
/usr/include/sys/siginfo.h:294: error: parse error before '}' token
/usr/include/sys/siginfo.h:390: error: parse error before "ctid_t"
/usr/include/sys/siginfo.h:392: error: conflicting types for `__proc'
/usr/include/sys/siginfo.h:261: error: previous declaration of `__proc'
/usr/include/sys/siginfo.h:398: error: conflicting types for `__fault'
/usr/include/sys/siginfo.h:267: error: previous declaration of `__fault'
/usr/include/sys/siginfo.h:404: error: conflicting types for `__file'
/usr/include/sys/siginfo.h:273: error: previous declaration of `__file'
/usr/include/sys/siginfo.h:420: error: conflicting types for `__prof'
/usr/include/sys/siginfo.h:287: error: previous declaration of `__prof'
/usr/include/sys/siginfo.h:424: error: conflicting types for `__rctl'
/usr/include/sys/siginfo.h:291: error: previous declaration of `__rctl'
/usr/include/sys/siginfo.h:426: error: parse error before '}' token
/usr/include/sys/siginfo.h:428: error: parse error before '}' token
/usr/include/sys/siginfo.h:432: error: parse error before "k_siginfo_t"
/usr/include/sys/siginfo.h:437: error: parse error before '}' token
In file included from /usr/include/signal.h:26,
from ../common/util_sig.c:15:
/usr/include/sys/signal.h:85: error: parse error before "siginfo_t"
In file included from ../common/util_sig.c:15:
/usr/include/signal.h:111: error: parse error before "siginfo_t"
/usr/include/signal.h:113: error: parse error before "siginfo_t"
*** Error code 1
make: Fatal error: Command failed for target `util_sig.lo'
Thanks, Dietrich
PS:
I need the LDAP server for playing with a JAAS LDAP login module, and something minimal would be enough. Is there something easier to install than OpenLDAP?
I fail to understand why you're taking all this trouble. Both Berkeley DB and OpenLDAP are shipped natively with Solaris 10/x86. Look into the so-called companion CD/DVD. That makes setting all of this up a piece of cake.
Another option could be looking into Blastwave or Sun Freeware.