How do I reconcile users from OIM to OID?
I have configured OIM with the connector for OID, but a user created in OIM is not stored in OID. How should I proceed?
You are trying to do provisioning with OID.
You have logged in as XELSYSADM.
You have searched for the user and gone to his Resource Profile.
You clicked Provision New Resource and selected OID.
The resource form is then populated; you filled in the information, clicked Continue, and submitted it.
Now go to the Process Form attached to it and check whether the value for OID Server has come through; check the other attributes as well.
Also uncheck the Auto Save button, start provisioning again for some user, and give proper values in all the fields of the Process as well as Object forms.
Similar Messages
-
Provisioning users from OIM to OID having an org other than Xellerate Users
Hi,
when I provision a user belonging to the default Xellerate Users organization in OIM to OID, it works.
What changes do I need to make if I want to provision a user in any other organization, say 'MyCompany', to OID?
(It gives a naming exception error when I try doing so.) Let me explain what I am trying to achieve.
I create a user using flat-file reconciliation such that the user is created in an organization, say 'XYZ'. I have also created a group, say XYZmember (membership rule: organization name = XYZ).
I created an access policy such that whenever a user who is a member of the XYZmember group (i.e. organization name is XYZ) is created in OIM, the user gets provisioned to OID and is assigned an OID role, say role1.
Now when I create a user with XYZ as the organization, he becomes a member of the XYZmember group; according to the access policy he should be provisioned to OID and assigned role1.
But it gives a naming exception error.
I want to know: if I create a user in some org other than Xellerate Users, will it get provisioned to OID? And how? -
Error While Provisioning User from OIM to OID
This is the Error I'm getting While Creating a user and provisioning.
DOBJ.THROWABLE_IN_SAVE
Unhandled throwable java.lang.NoClassDefFoundError in com.thortech.xl.dataobj.tcScheduleItem's save
This error happens when i try to provision the user with OID.
Regards,
sudhan
Could you please write down what you have given in the IT Resource?
Maybe you are giving some wrong value in the IT Resource.
Have you made changes to the OID Prov Lookup? If not, check this link:
Re: Problem with OID Connector
And give it a try! -
hi,
I am trying to figure out how to move existing users from OIM to OID in bulk.
Is there any way to move all the existing users in OIM simultaneously, rather than one by one through the resource profile by provisioning?
Regards
Pegasus
I don't know if I understood the question; ignore me if I'm wrong.
If you want to provision all your users in a Resource you can do the following:
1) Create an "Access Policy" through the Admin Console which provisions your OID Resource (ensure you check the "Retrofit Access Policy" checkbox!)
2) When creating the policy you'll be asked to select the user groups that will be affected by it. As all OIM users belong to the "ALL USERS" group, you can assign your Access Policy to this group. By the way, I would consider creating a new user group if there is any chance that you add a user to OIM who should not be provisioned in OID.
You can have a look at chapters 10 and 11 in the Admin Console documentation:
link
Shout at me if I misunderstood you ;)
Regards, -
User Provisioning not working from OIM to OID
Hi All,
I am trying to create a new user from OIM to OID and am getting the following error message on the console...
Response: INVALID_NAMING_ERROR
Response Description: Naming exception encountered
Notes:
In logs files while creation am getting following message....
INFO,09 Oct 2011 23:37:50,253,[XELLERATE.WEBAPP],retrieving object from cache key = xlCustomClienten_US
INFO,09 Oct 2011 23:37:50,253,[XELLERATE.WEBAPP],Key not found in Custom Resource Bundle: newKey = global.udf.USR_UDF_ALIAS
INFO,09 Oct 2011 23:37:50,253,[XELLERATE.WEBAPP],Writing Custom default resource bundle object to cache : Key = xlConnectorResourceBundleen_US
INFO,09 Oct 2011 23:37:50,254,[XELLERATE.WEBAPP],retrieving object from cache key = xlCustomClienten_US
INFO,09 Oct 2011 23:37:50,254,[XELLERATE.WEBAPP],Key not found in Custom Resource Bundle: newKey = global.udf.USR_UDF_CUSTID
INFO,09 Oct 2011 23:37:50,254,[XELLERATE.WEBAPP],Writing Custom default resource bundle object to cache : Key = xlConnectorResourceBundleen_US
INFO,09 Oct 2011 23:37:50,254,[XELLERATE.WEBAPP],retrieving object from cache key = xlCustomClienten_US
INFO,09 Oct 2011 23:37:50,254,[XELLERATE.WEBAPP],Key not found in Custom Resource Bundle: newKey = global.udf.USR_UDF_IVRPIN
INFO,09 Oct 2011 23:37:50,254,[XELLERATE.WEBAPP],Writing Custom default resource bundle object to cache : Key = xlConnectorResourceBundleen_US
INFO,09 Oct 2011 23:37:50,255,[XELLERATE.WEBAPP],retrieving object from cache key = xlCustomClienten_US
INFO,09 Oct 2011 23:37:50,255,[XELLERATE.WEBAPP],Key not found in Custom Resource Bundle: newKey = global.udf.USR_UDF_USERAPPSTATUS
INFO,09 Oct 2011 23:37:50,255,[XELLERATE.WEBAPP],Writing Custom default resource bundle object to cache : Key = xlConnectorResourceBundleen_US
INFO,09 Oct 2011 23:37:50,255,[XELLERATE.WEBAPP],retrieving object from cache key = xlCustomClienten_US
INFO,09 Oct 2011 23:37:50,255,[XELLERATE.WEBAPP],Key not found in Custom Resource Bundle: newKey = global.udf.USR_UDF_CREATEDDATE
INFO,09 Oct 2011 23:37:50,255,[XELLERATE.WEBAPP],Writing Custom default resource bundle object to cache : Key = xlConnectorResourceBundleen_US
INFO,09 Oct 2011 23:37:50,256,[XELLERATE.WEBAPP],retrieving object from cache key = xlCustomClienten_US
INFO,09 Oct 2011 23:37:50,256,[XELLERATE.WEBAPP],Key not found in Custom Resource Bundle: newKey = global.udf.USR_UDF_OAMLOCKTIME
INFO,09 Oct 2011 23:37:50,256,[XELLERATE.WEBAPP],Writing Custom default resource bundle object to cache : Key = xlConnectorResourceBundleen_US
INFO,09 Oct 2011 23:37:50,256,[XELLERATE.WEBAPP],retrieving object from cache key = xlCustomClienten_US
INFO,09 Oct 2011 23:37:50,256,[XELLERATE.WEBAPP],Key not found in Custom Resource Bundle: newKey = global.udf.USR_UDF_PASSWORD_EXPIRE
INFO,09 Oct 2011 23:37:50,257,[XELLERATE.WEBAPP],Writing Custom default resource bundle object to cache : Key = xlConnectorResourceBundleen_US
Please help me on this....
Thanks in Advance
YJR
That is not the log output of the OID connector. Check the connector docs and enable OID logging only. INVALID_NAMING_ERROR means something is wrong with the naming of your object. Most likely there is an LDAP error output somewhere, but all the output you provided is info level; there is nothing wrong with it.
-Kevin -
How to give design console access to the user from OIM GUI - OIM 11g R2
Hi,
Could you please let me know if there is any way to give Design Console access to a normal user in OIM 11g R2.
I tried granting the access from the backend using a DB command, and I was able to give the user Design Console access.
But I need to give Design Console access to the user from the OIM interface.
Please let me know how to achieve this functionality.
Thanks
I have already used this approach by directly modifying the user record in the DB.
I am looking for whether it is possible to grant Design Console access from the OIM GUI, the way we used to in OIM 11g R1. -
How to check SSO user from database?
Hi:
I've posted this topic in Forms forum:
How to check SSO user from database?
then as I've been told, it's better to post it here, so ...... here is the question:
I'm writing a "before delete" trigger to insert into a log table before a delete. Is there a way to know, from the database, the current SSO user when SSO users share one database user?
Just like in Oracle Application Express, where there is v('APP_USER') to get the current user.
Saad,
End users manipulate data through Oracle Forms (and SSO through Portal), and what I need is to trace the SSO username from the database without modifying the forms; purely from the database, taking into consideration that SSO users share one database user. Is it possible?
Saad, -
How To Block a User From Changing Total Field In AR Invoice
Hi all,
I would like to find out how to block a user from being able to change the total field at the bottom right-hand side of the AR Invoice. Currently, if a user creates an invoice and is still busy in that invoice, they can adjust the total field, which in turn will update the discount field as well.
Is this simply an authorization issue, or am I going to have to do it in the transaction notification?
We are using SAP Business One PL 30 currently.
Hi
Please review my note again - this is by system design.
I don't know why you are so worried about this: by setting up a maximum discount, if the user cannot post the document, it is not in the system, no matter how many times they change the fields.
But I think your scenario is different.
You are copying with a certain discount from Delivery to AR Invoice, and I think you don't want the user to change the discount field; you can solve your issue by using an approval procedure or sp_notification.
If you really want the field greyed out, I think you will have to go through the SDK or the Boyum add-on.
Thank you
Bishal -
How to get All Users from OID LDAP
Hi all,
I have Oracle Internet Directory(OID) and have created the users in it manually.
Now I want to extract all the users from OID. How can I get the users from OID?
Any response will be appreciated. If someone could show me demo code for that, I would be grateful.
Thanks and regards
Pravy
hi,
the notes from metalink:
regards
elvis
Doc ID: Note:276688.1
Subject: How to copy (export/import) the Portal database schemas of IAS 9.0.4 to another database
Type: BULLETIN
Status: PUBLISHED
Content Type: TEXT/X-HTML
Creation Date: 18-JUN-2004
Last Revision Date: 05-AUG-2005
How to copy (export/import) Portal database schemas of IAS 9.0.4 to another database
Note 276688.1
Download scripts Unix: Attachment 276688.1:1
Download Perl scripts (Unix/NT) :Attachment 276688.1:2
HISTORY
Version 1.0 : 24-JUN-2004: creation
Version 1.1 : 25-JUN-2004: added a link to download the scripts from Metalink
Version 1.2 : 29-JUN-2004: Import script: Intermedia indexes are recreated. Imported jobs are reassigned to Portal. ptlconfig replaces ptlasst.
Version 1.3 : 09-JUL-2004: Additional updates. Usage of iasconfig.xml. Need only 3 environment variables to import.
Version 1.4 : 18-AUG-2004: Remark about 9.2.0.5 and 10.1.0.2 database
Version 1.5 : 26-AUG-2004: Duplicate job id
Version 1.6 : 29-NOV-2004: Remark about WWC-44131 and WWSBR_DOC_CTX_54
Version 1.7 : 07-JAN-2005: Attached perl scripts (for NT/Unix) at the end of the note
Version 1.8 : 12-MAY-2005: added a work-around for the WWSTO_SESS_FK1 issue
Version 1.9 : 07-JUL-2005: logoff trigger and 9.0.1 database export, import in 10g database
Version 1.10: 05-AUG-2005: reference to the 10.1.2 note
PURPOSE
This document explains how to copy a Portal database schema from one database to another.
It allows restoring the Portal repository and the OID security associated with Portal.
It can be used to go into production by physically copying a database from a development portal to a production environment, avoiding the export/import utilities of Portal.
This note:
uses export/import at the database level
allows the export/import to be done between different platforms
The scripts are Unix based and for the BASH shell. They can be adapted for other platforms.
For those familiar with this technique in Portal 9.0.2, there is a list of the main differences from Portal 9.0.2 at the end of the note.
These scripts are based on the experience of many people with Portal 9.0.2.
The scripts are attached to the note. Download them here: Attachment 276688.1:1 : exp_schema_904.zip
A new version of the scripts was written in Perl. You can also download them here: Attachment 276688.1:2 : exp_schema_904_v2.zip. They do exactly the same as the bash ones, but have the advantage of working on all platforms.
SCOPE & APPLICATION
This document is intended for Portal administrators. To use this note, you need basic DBA skills.
This note is for Portal 9.0.4.x only. The notes for Portal 9.0.2 are:
Note 228516.1 : How to copy (export/import) Portal database schemas of IAS 9.0.2 to another database
Note 217187.1 : How to restore a cold backup of a Portal IAS 9.0.2 on another machine
The note for Portal 10.1.2 is:
Note 330391.1 : How to copy (export/import) Portal database schemas of IAS 10.1.2 to another database
Method
The method that we will follow in this document is the following:
Export:
- export the 4 portal schemas of a database (DEV / development)
- export the LDAP OID users and groups (optional)
Install a new machine with a fresh IAS installation (PROD / production)
Import:
- delete the new and empty portal schemas on PROD
- import the schemas into the production database in place of the deleted schemas
- import the LDAP OID users and groups (optional)
- modify the configuration so that the infrastructure uses the portal repository of the backup
- modify the configuration so that the portal repository uses the OID, Webcache and SSO of the new infrastructure
The export and the import are divided into several steps. All of these steps are included in 2 sample scripts:
export : exp_portal_schema.sh
import : imp_portal_schema.sh
In the 2 scripts, all the steps are run in one shot. It is just an example. Depending on the configuration and circumstances, the steps can also be run independently.
Convention
Development (DEV) is the name of the machine where the copied database resides
Production (PROD) is the name of the machine to which the database is copied
Prerequisite
Some prerequisites first.
A. Environment variables
To run the import/export, you will need 3 environment variables. In the given scripts, they are defined in 'portal_env.sh'
SYS_PASSWORD - the password of user sys in the Portal database
IAS_PASSWORD - the password of IAS
ORACLE_HOME - the ORACLE_HOME of the midtier
The rest of the settings are found automatically by reading the iasconfig.xml file and querying the OID. This is done in 'portal_automatic_env.sh'. I wish to write a note on iasconfig.xml and the way to transform it into useful environment variables, but it is not done yet. In the meantime, you can read the old 9.0.2 doc, which explains the meaning of most variables:
< Note 223438.1 : Shell script to find your portal passwords, settings and place them in environment variables on Unix >
B. Definition: Cutter database
A 'Cutter Database' is the term used to designate a database created by RepCA or OUI that contains all the schemas used by an IAS 9.0.4 infrastructure, even if in most cases several schemas are not used.
In Portal 9.0.4, the option to install only the portal repository in an empty database has been removed. It has been replaced by RepCA, a tool that creates an infrastructure database. Among all the infrastructure database schemas are the portal schemas.
This does not stop people from using 2 databases to run portal, one for OID and one for Portal. But in comparison with Portal 9.0.2, all schemas exist in both databases even if some are not used.
The main idea of the Cutter database is to have only 1 database type, and in the future to simplify the upgrades of customer installations.
For an installation where Portal and OID/SSO are in 2 separate databases, it looks like this:

Infrastructure database (INFRA_SID)
- Portal 9.0.2: contains OID (used), OEM (used), Single Sign-on / orasso (used), Portal (not used)
- Portal 9.0.4: contains OID (used), OEM (used), Single Sign-on / orasso (used), Portal (not used)

Portal database (PORTAL_SID)
- Portal 9.0.2: the custom Portal database contains Portal (used)
- Portal 9.0.4: the custom Portal database (which is also an infrastructure) contains OID (not used), OEM (not used), Single Sign-on / orasso (not used), Portal (used)
Regardless, this note will assume there is only one single database. But it also works for 2-database installations like the one explained above.
C. Directory structure.
The sample scripts given in this note will be explained in the next paragraphs. First, note that the scripts use a directory structure that helps to classify the files.
Here is a list of the important files used during the export/import process:

exp_portal_schema.sh - Sample script that exports all the data needed from a development machine
imp_portal_schema.sh - Sample script that imports all the data into a production machine
portal_env.sh - Script that defines the env variables specific to your system (to configure)
portal_automatic_env.sh - Helper script to get all the rest of the Portal settings automatically
xsl - Directory containing all the XSL files (helper scripts)
del_authpassword.xsl - Helper script to remove the authpassword tags in the DSML files
portal_env_unix.sql - Helper script to get Portal settings from the iasconfig.xml file
exp_data - Directory containing all the exported data
portal_exp.dmp - Database-level export of the portal, portal_app, ... database schemas
iasconfig.xml - Copy of the iasconfig.xml of the DEV midtier; used to get the hostname and port of Webcache
portal_users.xml - Export from LDAP of the OID users used by Portal (optional)
portal_groups.xml - Export from LDAP of the OID groups used by Portal (optional)
imp_log - Directory containing several spool and log files generated during the import
import.log - Log file generated when running the imp command
ptlconfig.log - Log generated by ptlconfig when rewiring portal to the infrastructure
Some other spool files.
D. Known limitations
The scripts given in this note have the following known limitations:
They do not copy the data stored in the SSO schema: external application definitions and the passwords stored for them.
See the post steps (SSO migration) to learn how to do this.
The ssomig command resides in the Infrastructure Oracle home, and all Portal commands in the Midtier home; in practice these 2 Oracle homes are usually not on the same machine. This is the reason.
The export of the users in OID exports from the default user location:
ldapsearch .... -b "cn=users,dc=domain,dc=com"
This is not 100% correct. The users are by default stored in something like "cn=users,dc=domain,dc=com". So, if the users are stored in the default location, it works. But if this location (the user install base) has been customized, it does not.
The reason is that such a setting means the LDAP is usually highly customized, and I prefer that the administrator copy the real LDAP himself. The right command will probably depend on the customer's case, so I preferred not to take the risk.
orclCommonNicknameAttribute must match in the Target and Source OID .
The orclCommonNicknameAttribute must match on both the source and target OID. By default this attribute is set to "uid", so if this has been changed, it must be changed in both systems.
Reference Note 282698.1
Migration of custom Java portlets.
The script migrates all the data of Portal stored in the database. If you have custom Java portlets deployed on your development machine, you will need to copy them to the production system.
Step 1 - Export in Development (DEV)
To export a full Portal installation to another machine, you need to follow 3 steps:
Export at the database level the portal schemas + related schemas
Get the midtier hostname and port of DEV
Export of the users and groups with LDAPSEARCH in 2 XML files
A script combining all the steps is available here.
A. Export the 4 portals schemas (DEV)
You need to export 3 types of database schemas:
The 4 portal schemas created by default by the portal installation :
portal,
portal_app,
portal_demo,
portal_public
The schemas where your custom database portlets / providers reside (if any)
- The custom schemas you have created for storing your portlet / provider code
The schemas where your custom tables reside (if any)
- Your custom schemas accessed by portal and containing only data (tables, views ...)
You can get an approximate list of the schemas (the default portal schemas (1) and database portlet schemas (2)) with this query:
SELECT USERNAME, DEFAULT_TABLESPACE, TEMPORARY_TABLESPACE
FROM DBA_USERS
WHERE USERNAME IN (user, user||'_PUBLIC', user||'_DEMO', user||'_APP')
OR USERNAME IN (SELECT DISTINCT OWNER FROM WWAPP_APPLICATION$ WHERE NAME != 'WWV_SYSTEM');
It still misses your custom schemas containing data only (3).
We will export the 4 schemas and your custom ones into an export file with the user sys.
Please use a command like this one:
exp userid="'sys/change_on_install@dev as sysdba'" file=portal_exp.dmp grants=y log=portal_exp.log owner=(portal,portal_app,portal_demo,portal_public)
The result is a dump file: 'portal_exp.dmp'. If you are using a 9.2.0.5 or 10.1.0.2 database, the format of the exp/imp dump file has changed. Please read this.
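After running exp, it is worth verifying the result before moving on. A minimal hedged sketch of such a check follows; the log content, the /tmp paths, and the assumption that exp errors are prefixed with "EXP-" are illustrative, not part of the note's scripts:

```shell
# Hypothetical post-export sanity check: simulate an exp log and dump file,
# then confirm the dump exists and the log contains no EXP- error lines.
cat > /tmp/portal_exp.log <<'EOF'
. . exporting table WWSEC_PERSON$ 1500 rows exported
Export terminated successfully without warnings.
EOF
touch /tmp/portal_exp.dmp

# -f: dump file exists; grep -q '^EXP-': no exp error messages in the log
if [ -f /tmp/portal_exp.dmp ] && ! grep -q '^EXP-' /tmp/portal_exp.log; then
  echo "export looks clean"
fi
```

In a real run you would point the check at the portal_exp.dmp and portal_exp.log produced by the exp command above.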
B. Hostname and port
For the URL to access the portal, you need the following 2 pieces of information to run the script 'imp_portal_schema.sh' below:
Webcache hostname
Webcache listen port
These values are contained in the iasconfig.xml file of the midtier.
iasconfig.xml
<IASConfig XSDVersion="1.0">
<IASInstance Name="ias904.dev.dev_domain.com" Host="dev.dev_domain.com" Version="9.0.4">
<OIDComponent AdminPassword="@BfgIaXrX1jYsifcgEhwxciglM+pXod0dNw==" AdminDN="cn=orcladmin" SSLEnabled="false" LDAPPort="3060"/>
<WebCacheComponent AdminPort="4037" ListenPort="7782" InvalidationPort="4038" InvalidationUsername="invalidator" InvalidationPassword="@BR9LXXoXbvW1iH/IEFb2rqBrxSu11LuSdg==" SSLEnabled="false"/>
<EMComponent ConsoleHTTPPort="1813" SSLEnabled="false"/>
</IASInstance>
<PortalInstance DADLocation="/pls/portal" SchemaUsername="portal" SchemaPassword="@BR9LXXoXbvW1c5ZkK8t3KJJivRb0Uus9og==" ConnectString="cn=asdb,cn=oraclecontext">
<WebCacheDependency ContainerType="IASInstance" Name="ias904.dev.dev_domain.com"/>
<OIDDependency ContainerType="IASInstance" Name="ias904.dev.dev_domain.com"/>
<EMDependency ContainerType="IASInstance" Name="ias904.dev.dev_domain.com"/>
</PortalInstance>
</IASConfig>
It corresponds to a portal URL like this:
http://dev.dev_domain.com:7782/pls/portal
The script exp_portal_schema.sh copies the iasconfig.xml file into the exp_data directory.
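The hostname and port can be pulled out of iasconfig.xml with plain sed. This is a hedged sketch, not part of the note's scripts: it assumes the attribute layout shown above, and the /tmp/iasconfig.xml path is illustrative.

```shell
# Hypothetical sketch: build the portal URL from iasconfig.xml.
# Write a trimmed copy of the file shown above for demonstration.
cat > /tmp/iasconfig.xml <<'EOF'
<IASInstance Name="ias904.dev.dev_domain.com" Host="dev.dev_domain.com" Version="9.0.4">
<WebCacheComponent AdminPort="4037" ListenPort="7782" InvalidationPort="4038" SSLEnabled="false"/>
EOF

# Extract the Host and ListenPort attribute values.
host=$(sed -n 's/.*Host="\([^"]*\)".*/\1/p' /tmp/iasconfig.xml)
port=$(sed -n 's/.*ListenPort="\([^"]*\)".*/\1/p' /tmp/iasconfig.xml)
echo "http://$host:$port/pls/portal"
```

With the sample file above this prints http://dev.dev_domain.com:7782/pls/portal, matching the URL given in the note.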
C. Export the security: users and groups (optional)
If you use Single Sign-On users other than the portal user, you probably need to restore the full security (the users and groups stored in OID) on the production machine. 5 steps need to be executed for this operation:
Export the OID entries with LDAPSEARCH
Before importing, change the domain in the generated files (optional)
Before importing, remove the 'authpassword' attributes from the generated files
Import them with LDAPADD
Update the GUID/DN of the groups in portal tables
Part 1 - LDAPSEARCH
The typical commands to do this operation look like this:
ldapsearch -h $OID_HOSTNAME -p $OID_PORT -X -b "cn=portal.040127.1384,cn=groups,dc=dev_domain,dc=com" -s sub "objectclass=*" > portal_group.xml
ldapsearch -h $OID_HOSTNAME -p $OID_PORT -X -D "cn=orcladmin" -w $IAS_PASSWORD -b "cn=users,dc=dev_domain,dc=com" -s sub "objectclass=inetorgperson" > portal_users.xml
Take care about the following points:
The groups are stored in an LDAP directory containing the date of installation
( in this example: portal.040127.1384,cn=groups,dc=dev_domain,dc=com )
If the domain of dev and prod is different, the exported files contain the name of the development domain in the form 'dc=dev_domain,dc=com' in a lot of places. The domain name needs to be replaced by the production domain name everywhere in the files.
Ldapsearch uses the option '-X'. It exports to DSML files (XML). This avoids a problem with common LDAP files (LDIF): LDIF files are wrapped at 78 characters, and that wrapping makes it difficult to change the domain name contained in them. XML files are not wrapped and do not have this problem.
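The wrapping problem can be illustrated with a small awk one-liner that re-joins LDIF continuation lines (which begin with a single space). This is a hedged sketch with a made-up /tmp/wrapped.ldif, not part of the note's scripts:

```shell
# Hypothetical illustration: a DN wrapped across two LDIF lines,
# where the continuation line starts with one space.
cat > /tmp/wrapped.ldif <<'EOF'
dn: cn=jdoe,cn=users,dc=dev_
 domain,dc=com
EOF

# Re-join continuation lines: strip the leading space and append to the
# previous line; print a newline only between logical records.
awk '/^ /{printf "%s", substr($0,2); next}{if(NR>1)print ""; printf "%s", $0}END{print ""}' /tmp/wrapped.ldif
```

Note how the wrapped form would defeat a naive `sed s/dc=dev_domain,dc=com/.../` because the string is split across lines; this is exactly why the note exports DSML instead.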
A sample script to export the 2 XML files is given here in : step 3 - export the users and groups (optional) of the export script.
Part 2 : change the domain in the DSML files
If the domain of dev and prod is different, the exported files contain the name of the development domain in the form 'dc=dev_domain,dc=com' in a lot of places. The domain name needs to be replaced by the production domain name everywhere in the files.
To do this, we can use these commands:
cat exp_data/portal_groups.xml | sed -e "s/$DEV_DN/$PROD_DN/" > imp_log/portal_groups.xml
cat exp_data/portal_users.xml | sed -e "s/$DEV_DN/$PROD_DN/" > imp_log/temp_users.xml
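As a quick sanity check of the sed rewrite, the substitution can be tried on a single made-up DSML line. This is a self-contained hedged sketch; the DN values and entry element are illustrative, not taken from a real export:

```shell
# Hypothetical example of the domain rewrite on one DSML line.
DEV_DN="dc=dev_domain,dc=com"
PROD_DN="dc=prod_domain,dc=com"
line='<dsml:entry dn="cn=jdoe,cn=users,dc=dev_domain,dc=com">'

# Same substitution as the commands above, applied to a single line.
echo "$line" | sed -e "s/$DEV_DN/$PROD_DN/"
```

The DN components contain no '/' characters, so they are safe to use directly as the sed pattern and replacement.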
Part 3 : Remove the authpassword attribute
The export of all attributes of all users has also exported an automatically generated OID attribute called 'authpassword'.
'authpassword' is a list of automatically generated passwords for several types of applications, and mostly it cannot be imported. Also, there is no option in ldapsearch (that I know of) that allows removing an attribute. Rather than giving the ldapsearch command the very long list of all the attributes except 'authpassword', we will remove the attribute after the export.
For that we will use the fact that the DSML files are XML files. There is an XSLT processor in Oracle IAS, in the executable '$ORACLE_HOME/bin/xml'. XSLT is a standard specification of the internet consortium W3C to transform an XML file with the help of an XSL file.
Here is the XSL file to remove the authpassword tag.
del_authpassword.xsl
<!--
File : del_authpassword.xsl
Version : 1.0
Author : mgueury
Description:
Remove the authpassword from the DSML files
-->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:output method="xml"/>
<xsl:template match="*|@*|node()">
<xsl:copy>
<xsl:apply-templates select="*|@*|node()"/>
</xsl:copy>
</xsl:template>
<xsl:template match="attr">
<xsl:choose>
<xsl:when test="@name='authpassword;oid'">
</xsl:when>
<xsl:when test="@name='authpassword;orclcommonpwd'">
</xsl:when>
<xsl:otherwise>
<xsl:copy>
<xsl:apply-templates select="*|@*|node()"/>
</xsl:copy>
</xsl:otherwise>
</xsl:choose>
</xsl:template>
</xsl:stylesheet>
And the command to make the transformation:
xml -f -s del_authpassword.xsl -o imp_log/portal_users.xml imp_log/temp_users.xml
Where:
imp_log/portal_users.xml is the final file without authpassword tags
imp_log/temp_users.xml is the input file with the authpassword tags that can not be imported.
Part 4 : LDAPADD
The typical commands to do this operation look like this:
ldapadd -h $OID_HOSTNAME -p $OID_PORT -D "cn=orcladmin" -w $IAS_PASSWORD -c -X portal_group.xml
ldapadd -h $OID_HOSTNAME -p $OID_PORT -D "cn=orcladmin" -w $IAS_PASSWORD -c -X portal_users.xml
Take care about the following points:
Ldapadd uses the option '-c'. Existing users/groups generate an error; the option -c allows continuing and ignoring these errors. Nevertheless, the errors should be checked to verify that they concern only existing entries.
A sample script to import the 2 XML files is given in step 5 - import the users and groups (optional) - of the import script.
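Since -c suppresses failures, the ldapadd output still has to be inspected afterwards. A minimal hedged sketch of such a check follows; the log file path and the exact error text ("Already exists") are assumptions for illustration, not guaranteed ldapadd output:

```shell
# Hypothetical check: filter an ldapadd log for errors other than the
# expected "already exists" ones. The log content below is simulated.
cat > /tmp/ldapadd.log <<'EOF'
adding new entry cn=jdoe,cn=users,dc=prod_domain,dc=com
ldap_add: Already exists
adding new entry cn=asmith,cn=users,dc=prod_domain,dc=com
EOF

# Keep only error lines, drop the benign duplicates; report if clean.
grep -i 'ldap_add:' /tmp/ldapadd.log | grep -vi 'Already exists' || echo "no unexpected errors"
```

Any line that survives the second grep is an error worth investigating before continuing the import.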
Part 5 : Update the GUID/DN
In Portal 9.0.4, the update of the GUIDs is taken care of by PTLCONFIG during the import (Import step 7).
D. Example script for export
Here is an example script that combines the 3 steps.
Depending on your needs, you will:
either execute all the steps
or just execute the 1st one (export of the database users); that is enough if you just want to log in with the portal user on the production instance.
If your portal repository resides in a 9.2.0.5 or 10.1.0.2 database, please read this.
You can download all the scripts here: Attachment 276688.1:1
Do not forget to modify the script to your needs and, most importantly, add the list of users as explained in point A above.
exp_portal_schema.sh
# BASH Script : exp_portal_schema.sh
# Version : 1.3
# Portal : 9.0.4.0
# History :
# mgueury - creation
# Description:
# This script exports a portal dump file from a dev instance
# -------------------------- Environment variables --------------------------
. portal_env.sh
# In case you do not use portal_env.sh you have to define all the variables
# For exporting the dump file only.
# export SYS_PASSWORD=change_on_install
# export PORTAL_TNS=asdb
# For the security (optional)
# export IAS_PASSWORD=welcome1
# export PORTAL_USER=portal
# export PORTAL_PASSWORD=A1b2c3de
# export OID_HOSTNAME=development.domain.com
# export OID_PORT=3060
# export OID_DOMAIN_DN=dc=`echo $OID_HOSTNAME | cut -d '.' -f2,3,4,5,6 --output-delimiter=',dc='`
# ------------------------------ Help function -----------------------------------
function press_any_key() {
if [ "$PRESS_ANY_KEY_AFTER_EACH_STEP" = "Y" ]; then
echo
echo Press enter to continue
read ANY_KEY
else
echo
fi
}
echo "------------------------------- Export ------------------------------------"
# create a directory for the export
mkdir exp_data
# copy the env variables in the log just in case
export > exp_data/exp_env_variable.txt
echo "--------------------- step 1 - export"
# export the portal users, but take care to add:
# - your users containing DB providers
# - your users containing data (tables)
exp userid="'sys/$SYS_PASSWORD@$PORTAL_TNS as sysdba'" file=exp_data/portal_exp.dmp grants=y log=exp_data/portal_exp.log owner=(portal,portal_app,portal_demo,portal_public)
press_any_key
echo "--------------------- step 2 - store iasconfig.xml file of the MIDTIER"
cp $MIDTIER_ORACLE_HOME/portal/conf/iasconfig.xml exp_data
press_any_key
echo "--------------------- step 3 - export the users and groups (optional)"
# Export the groups and users from OID in 2 XML files (not LDIF)
# The OID groups of portal are stored in GROUP_INSTALL_BASE that depends
# of the installation date.
# For the user, I use the default place. If it does not work,
# you can find the user place with:
# > exec dbms_output.put_line(wwsec_oid.get_user_search_base);
# Get the GROUP_INSTALL_BASE used in security export
sqlplus $PORTAL_USER/$PORTAL_PASSWORD@$PORTAL_TNS <<IASDB
set serveroutput on
spool exp_data/group_base.log
begin
dbms_output.put_line(wwsec_oid.get_group_install_base);
end;
IASDB
export GROUP_INSTALL_BASE=`grep cn= exp_data/group_base.log`
echo '--- Exporting Groups'
echo 'creating portal_groups.xml'
ldapsearch -h $OID_HOSTNAME -p $OID_PORT -X -b "$GROUP_INSTALL_BASE" -s sub "objectclass=*" > exp_data/portal_groups.xml
echo '--- Exporting Users'
echo 'creating portal_users.xml'
ldapsearch -h $OID_HOSTNAME -p $OID_PORT -D "cn=orcladmin" -w $IAS_PASSWORD -X -b "cn=users,$OID_DOMAIN_DN" -s sub "objectclass=inetorgperson" > exp_data/portal_users.xml
The script is designed to be run from the midtier.
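The OID_DOMAIN_DN derivation shown in the commented environment variables can be tried standalone. This is a hedged sketch using a sample hostname; it uses a sed-based rewrite rather than the GNU-specific --output-delimiter, and assumes the default convention that every DNS label after the host becomes a dc= component:

```shell
# Hypothetical example: derive the OID domain DN from a hostname,
# mirroring the OID_DOMAIN_DN line in portal_env.sh.
hostname="dev.dev_domain.com"

# Drop the host label, then turn each remaining dot into ",dc=".
domain_dn="dc=$(echo "$hostname" | cut -d '.' -f2- | sed 's/\./,dc=/g')"
echo "$domain_dn"
```

For dev.dev_domain.com this yields dc=dev_domain,dc=com, the DN used in the ldapsearch examples above.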
Step 2 - Install IAS in a new machine (PROD)
A. Installation
This note does not distinguish whether Portal shares the same database as Single Sign-On and OID. For simplicity, I will speak only about 1 database. But you could also create a second infrastructure database just for the portal repository. This is better for a production system, because the Portal repository is then the only product used in the 2nd database. Having 2 separate databases makes it easy to take backups of the portal repository.
On the production machine, you need to do a fresh install of IAS 9.0.4. Take care to use:
the same IAS patchset (9.0.4.1, 9.0.4.2, ...) on the middle-tier and infrastructure as in development
and the same character set as in development (or UTF8)
The result will be 2 ORACLE_HOMES and 1 infrastructure database:
the ORACLE_HOME of the infrastructure (SID:infra904)
the ORACLE_HOME of the midtier (SID:ias904)
an infrastructure database (SID:asdb)
The empty new Portal install should work fine before you go to the next step.
B. About tablespaces (optional)
The size of the production tablespaces should match those of the development machine. If not, the tablespaces will autoextend. That is not really a concern, but it is slow. You should resize the tablespaces so that prod has as much space as dev.
Also, it is safer to check that there is enough free space on the hard disk to import in the database.
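That free-space check can be scripted. A sketch (the function name and the megabyte granularity are our assumptions):

```shell
# Succeed only if the filesystem holding directory $1 has at least $2 MB free.
check_free_mb() {
  avail=$(df -Pm "$1" | awk 'NR==2 {print $4}')
  [ "$avail" -ge "$2" ]
}
```

For example, check_free_mb /u01/oradata 2048 || echo "not enough space for the import" (path and threshold are placeholders).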
To modify the tablespace sizes, you can use the Oracle Enterprise Manager console.
On Unix, . oraenv
infra904oemapp dbastudio
On NT Start/ Programs/ Oracle Application server - infra904 / Enterprise Manager Console
Launch standalone
Choose the portal database (typically asdb.domain.com)
Connect with a DBA user, sys or system
Click Storage/Tablespaces
Change the size of the PORTAL, PORTAL_DOC, PORTAL_LOGS, PORTAL_IDX tablespaces
C. Backup
It could be a good idea to take a backup of the MIDTIER and INFRASTRUCTURE Oracle Homes at that point to allow retesting the import process if it fails for any reason as much as you want without needing to reinstall everything.
Step 3 - Import in production (on PROD)
The following script is a sample Unix script that combines all the steps needed to import a portal repository on the production machine.
To import a portal repository and its users and groups in OID, you need to do 8 things:
Stop the midtier to avoid errors while dropping the portal schema
SQL*Plus with Portal
Drop the 4 default portal schemas
Create the portal users with the same passwords as the just-deleted users and give them grants (you need to create your own custom schemas too if you have some).
Import the dump file
Import the users and groups into OID (optional)
SQL*Plus with SYS : Post import changes
Recompile everything in the database
Reassign the imported jobs to portal
SQL*Plus with Portal : Post import changes
Recreate the Portal intermedia indexes
Correct an import error on wwsrc_preference$
Make additional post import changes, by updating some portal tables, and replacing the development hostname, port or domain by the production ones.
Rewire the portal repository with ptlconfig -dad portal
Restart the midtier
Here is a sample script to do this on Unix. You will need to adapt the script to your needs.
imp_portal_schema.sh
# BASH Script : imp_portal_schema.sh
# Version : 1.3
# Portal : 9.0.4.0
# History :
# mgueury - creation
# Description:
# This script imports a portal dump file and relinks it with an
# infrastructure.
# Script to be started from the MIDTIER
# -------------------------- Environment variables --------------------------
. portal_env.sh
# Development and Production machine hostname and port
# Example
# .._HOSTNAME machine.domain.com (name of the MIDTIER)
# .._PORT 7782 (http port of the MIDTIER)
# .._DN dc=domain,dc=com (domain name in a LDAP way)
# These values can be determined automatically from the iasconfig.xml files of dev
# and prod. But if you do not know or remember the dev hostname and port, this
# query should find them.
# > select name, http_url from wwpro_providers$ where http_url like 'http%'
# These variables are used in the
# > step 4 - security / import OID users and groups
# > step 6 - post import changes (PORTAL)
# Set the env variables of the DEV instance
rm /tmp/iasconfig_env.sh
xml -f -s xsl/portal_env_unix.xsl -o /tmp/iasconfig_env.sh exp_data/iasconfig.xml
. /tmp/iasconfig_env.sh
export DEV_HOSTNAME=$WEBCACHE_HOSTNAME
export DEV_PORT=$WEBCACHE_LISTEN_PORT
export DEV_DN=dc=`echo $OID_HOSTNAME | cut -d '.' -f2,3,4,5,6 --output-delimiter=',dc='`
# Set the env variables of the PROD instance
. portal_env.sh
export PROD_HOSTNAME=$WEBCACHE_HOSTNAME
export PROD_PORT=$WEBCACHE_LISTEN_PORT
export PROD_DN=dc=`echo $OID_HOSTNAME | cut -d '.' -f2,3,4,5,6 --output-delimiter=',dc='`
# ------------------------------ Help function -----------------------------------
function press_any_key() {
if [ "$PRESS_ANY_KEY_AFTER_EACH_STEP" = "Y" ]; then
echo
echo Press enter to continue
read ANY_KEY
else
echo
fi
}
echo "------------------------------- Import ------------------------------------"
# create a directory for the logs
mkdir imp_log
# copy the env variables in the log just in case
export > imp_log/imp_env_variable.txt
echo "--------------------- step 1 - stop the midtier"
# This step is needed to avoid most cases of ORA-01940: user connected
# when dropping the portal user
$MIDTIER_ORACLE_HOME/opmn/bin/opmnctl stopall
press_any_key
echo "--------------------- step 2 - drop and create empty users"
sqlplus "sys/$SYS_PASSWORD@$PORTAL_TNS as sysdba" <<IASDB
spool imp_log/drop_create_user.log
---- Drop users
-- Warning: You need to stop all SQL*Plus connection to the
-- portal schema before that else the drop will give an
-- ORA-01940: cannot drop a user that is currently connected
drop user portal_public cascade;
drop user portal_app cascade;
drop user portal_demo cascade;
drop user portal cascade;
---- Recreate the users and give them grants
-- The new users will have the same passwords as the users we just dropped
-- above. Do not forget to add your exported custom users
create user portal identified by $PORTAL_PASSWORD default tablespace portal;
grant connect,resource,dba to portal;
create user portal_app identified by $PORTAL_APP_PASSWORD default tablespace portal;
grant connect,resource to portal_app;
create user portal_demo identified by $PORTAL_DEMO_PASSWORD default tablespace portal;
grant connect,resource to portal_demo;
create user portal_public identified by $PORTAL_PUBLIC_PASSWORD default tablespace portal;
grant connect,resource to portal_public;
alter user portal_public grant connect through portal;
start $MIDTIER_ORACLE_HOME/portal/admin/plsql/wwv/wdbigra.sql portal
exit
IASDB
press_any_key
echo "--------------------- step 3 - import"
imp userid="'sys/$SYS_PASSWORD@$PORTAL_TNS as sysdba'" file=exp_data/portal_exp.dmp grants=y log=imp_log/import.log full=y
press_any_key
echo "--------------------- step 4 - import the OID users and groups (optional)"
# Some errors will be raised when running the ldapadd because at least the
# default entries cannot be inserted. Remove them from the
# ldif file if you want to avoid the errors. Thanks to the flag '-c', ldapadd
# continues past errors such as duplicate entries. Another more radical solution
# is to erase all the entries of the users and groups in OID before running the import.
# Replace the domain name in the XML files.
cat exp_data/portal_groups.xml | sed -e "s/$DEV_DN/$PROD_DN/" > imp_log/portal_groups.xml
cat exp_data/portal_users.xml | sed -e "s/$DEV_DN/$PROD_DN/" > imp_log/temp_users.xml
# Remove the authpassword attributes with a XSL stylesheet
xml -f -s xsl/del_authpassword.xsl -o imp_log/portal_users.xml imp_log/temp_users.xml
echo '--- Importing Groups'
ldapadd -h $OID_HOSTNAME -p $OID_PORT -D "cn=orcladmin" -w $IAS_PASSWORD -c -X imp_log/portal_groups.xml -v
echo '--- Importing Users'
ldapadd -h $OID_HOSTNAME -p $OID_PORT -D "cn=orcladmin" -w $IAS_PASSWORD -c -X imp_log/portal_users.xml -v
press_any_key
echo "--------------------- step 5 - post import changes (SYS)"
sqlplus "sys/$SYS_PASSWORD@$PORTAL_TNS as sysdba" <<IASDB
spool imp_log/sys_post_changes.log
---- Recompile the invalid packages
-- On the midtier, the script utlrp is not present. This step
-- uses a copy of it stored in patch/utlrp.sql
select count(*) INVALID_OBJECT_BEFORE from all_objects where status='INVALID';
start patch/utlrp.sql
set lines 999
select count(*) INVALID_OBJECT_AFTER from all_objects where status='INVALID';
---- Jobs
-- Reassign the JOBS imported to PORTAL. After the import, they belong
-- incorrectly to the user SYS.
update dba_jobs set LOG_USER='PORTAL', PRIV_USER='PORTAL' where schema_user='PORTAL';
commit;
exit
IASDB
press_any_key
echo "--------------------- step 6 - post import changes (PORTAL)"
sqlplus $PORTAL_USER/$PORTAL_PASSWORD@$PORTAL_TNS <<IASDB
set serveroutput on
spool imp_log/portal_post_changes.log
---- Intermedia
-- Recreate the portal indexes.
-- inctxgrn.sql is missing from the 9040 CD-ROMS. This is the bug 3536937.
-- Fixed in 9041. The missing script is contained in the downloadable zip file.
start patch/inctxgrn.sql
start $MIDTIER_ORACLE_HOME/portal/admin/plsql/wws/ctxcrind.sql
---- Import error
alter table "WWSRC_PREFERENCE$" add constraint wwsrc_preference_pk
primary key (subscriber_id, id)
using index wwsrc_preference_idx1;
begin
DBMS_RLS.ADD_POLICY ('', 'WWSRC_PREFERENCE$', 'WEBDB_VPD_POLICY',
'', 'webdb_vpd_sec', 'select, insert, update, delete', TRUE,
static_policy=>true);
end;
/
---- Modify tables with full URLs
-- If the domain names of prod and dev are different, this step is really important.
-- It modifies the portal tables that contain references to the hostname or port
-- of the development machine. (For more explanation: see Additional steps in the note)
-- groups (dn)
update wwsec_group$
set dn=replace( dn, '$DEV_DN', '$PROD_DN' );
update wwsec_group$
set dn_hash = wwsec_api_private.get_dn_hash( dn );
-- users (dn)
update wwsec_person$
set dn=replace( dn, '$DEV_DN', '$PROD_DN' );
update wwsec_person$
set dn_hash = wwsec_api_private.get_dn_hash( dn );
-- subscriber
update wwsub_model$
set dn=replace( dn, '$DEV_DN', '$PROD_DN' ), GUID=':1'
where dn like '%$DEV_DN%';
-- preferences
update wwpre_value$
set varchar2_value=replace( varchar2_value, '$DEV_DN', '$PROD_DN' )
where varchar2_value like '%$DEV_DN%';
update wwpre_value$
set varchar2_value=replace( varchar2_value, '$DEV_HOSTNAME:$DEV_PORT', '$PROD_HOSTNAME:$PROD_PORT' )
where varchar2_value like '%$DEV_HOSTNAME:$DEV_PORT%';
-- page url items
update wwv_things
set title_link=replace( title_link, '$DEV_HOSTNAME:$DEV_PORT', '$PROD_HOSTNAME:$PROD_PORT' )
where title_link like '%$DEV_HOSTNAME:$DEV_PORT%';
-- web providers
update wwpro_providers$
set http_url=replace( http_url, '$DEV_HOSTNAME:$DEV_PORT', '$PROD_HOSTNAME:$PROD_PORT' )
where http_url like '%$DEV_HOSTNAME:$DEV_PORT%';
-- html links created by the RTF editor inside text items
update wwv_text
set text=replace( text, '$DEV_HOSTNAME:$DEV_PORT', '$PROD_HOSTNAME:$PROD_PORT' )
where text like '%$DEV_HOSTNAME:$DEV_PORT%';
-- Portlet metadata nls: help URL
update wwpro_portlet_metadata_nls$
set help_url=replace( help_url, '$DEV_HOSTNAME:$DEV_PORT', '$PROD_HOSTNAME:$PROD_PORT' )
where help_url like '%$DEV_HOSTNAME:$DEV_PORT%';
-- URL items (There is a trigger on this table building absolute_url automatically)
update wwsbr_url$
set absolute_url=replace( absolute_url, '$DEV_HOSTNAME:$DEV_PORT', '$PROD_HOSTNAME:$PROD_PORT' )
where absolute_url like '%$DEV_HOSTNAME:$DEV_PORT%';
-- Things attributes
update wwv_thingattributes
set value=replace( value, '$DEV_HOSTNAME:$DEV_PORT', '$PROD_HOSTNAME:$PROD_PORT' )
where value like '%$DEV_HOSTNAME:$DEV_PORT%';
commit;
exit
IASDB
press_any_key
echo "--------------------- step 7 - ptlconfig"
# Configure portal such that portal uses the infrastructure database
cd $MIDTIER_ORACLE_HOME/portal/conf/
./ptlconfig -dad portal
cd -
mv $MIDTIER_ORACLE_HOME/portal/logs/ptlconfig.log imp_log
press_any_key
echo "--------------------- step 8 - restart the midtier"
$MIDTIER_ORACLE_HOME/opmn/bin/opmnctl startall
date
Each step can generate its own errors due to a lot of factors. It is better to run the import step by step the first time.
Do not forget to check the output of log files created during the various steps of the import:
imp_log/drop_create_user.log
Spool when dropping and recreating the portal users
imp_log/import.log
Import log file when importing the portal_exp.dmp file
imp_log/sys_post_changes.log
Spool when making post changes with SYS
imp_log/portal_post_changes.log
Spool when making post changes with PORTAL
imp_log/ptlconfig.log
Log file of ptlconfig when rewiring the midtier
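The spool files above can be scanned for error codes in one go. A sketch (the function is ours; the WWDAV$ASL filter skips the benign ROWID notice described in the PROBLEMS section):

```shell
# Print every ORA-/IMP-/EXP- line found in the spool files under $1,
# ignoring the expected WWDAV$ASL ROWID notice.
scan_import_logs() {
  grep -hE '(ORA|IMP|EXP)-[0-9]+' "$1"/*.log 2>/dev/null | grep -v 'WWDAV\$ASL'
}
```

For example, scan_import_logs imp_log prints only the lines that deserve a closer look.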
Step 4 - Test
A. Check the log files
B. Test the website and see if it works fine.
Step 5 - take a backup
Take a backup of all ORACLE_HOMEs and databases to guard against hardware problems. You need to copy:
All the files of the 2 ORACLE_HOME
And all the database files.
Step 6 - Additional steps
Here are some additional steps.
SSO external applications ( they are part of the orasso schema and are not imported yet )
Page URL items ( they seem to store the full URL ) - included in imp_portal_schema.sh
Web Providers ( the URL needs to be changed ) - included in imp_portal_schema.sh
Text items edited with the RTF editor in IE and containing links - included in imp_portal_schema.sh
Most of them are taken care of by the "post import changes" step, except the first one.
1. SSO import
This script imports only Portal and the users/groups of OID, not the list of external applications contained in the orasso user.
In Portal 9.0.4, there is a script called SSOMIG that resides in $INFRA_ORACLE_HOME/sso/bin and allows you to move:
Definitions and user data for external applications
Registration URLs and tokens for partner applications
Connection information used by OracleAS Discoverer to access various data sources
See:
Oracle® Application Server Single Sign-On Administrator's Guide 10g (9.0.4) Part Number B10851-01
14. Exporting and Importing Data
2. Page items: the page URL items store the full URL.
This is Bug 2661805 fixed in Portal 9.0.2.6.
The following work-around is implemented in the post import step of imp_portal_schema.sh:
-- page url items
update wwv_things
set title_link=replace( title_link, 'dev.dev_domain.com:7778', 'prod.prod_domain.com:7778' )
where title_link like '%$DEV_HOSTNAME:$DEV_PORT%'
3. Web Providers
The URLs of the Web providers also need to change. Like the page items, they contain the full path of the webserver.
Or you can get the list of the URLs to change with this query
select name, http_url from PORTAL.WWPRO_PROVIDERS$ where http_url like 'http%';
The following work-around is implemented in the post import step of imp_portal_schema.sh:
-- web providers
update wwpro_providers$
set http_url=replace( http_url, 'dev.dev_domain.com:7778', 'prod.prod_domain.com:7778' )
where http_url like '%$DEV_HOSTNAME:$DEV_PORT%'
4. The production and development machines do not share the same domain
If the domains of production and development are not the same, the DN (the name in LDAP) of all users needs to change.
Let's say from
dc=dev_domain,dc=com -> dc=prod_domain,dc=com
1. Before uploading the LDIF files: all the strings in the 2 LDIF files that contain 'dc=dev_domain,dc=com' have to be replaced by 'dc=prod_domain,dc=com'.
2. In the wwsec_group$ and wwsec_person$ tables in portal, the DNs need to change too.
The following work-around is implemented in the post import step of imp_portal_schema.sh:
-- groups (dn)
update wwsec_group$
set dn=replace( dn, 'dc=dev_domain,dc=com', 'dc=prod_domain,dc=com' )
update wwsec_group$
set dn_hash = wwsec_api_private.get_dn_hash( dn )
-- users (dn)
update wwsec_person$
set dn=replace( dn, 'dc=dev_domain,dc=com', 'dc=prod_domain,dc=com' )
update wwsec_person$
set dn_hash = wwsec_api_private.get_dn_hash( dn)
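The file-side replacement (item 1 above) is a plain string substitution. A sketch using sed (the function name is ours; it works on the DSML/XML export files the same way):

```shell
# Rewrite every occurrence of the dev DN suffix ($1) to the prod one ($2).
# Reads the exported data on stdin and writes the rewritten data to stdout.
rewrite_dn() {
  sed -e "s/$1/$2/g"
}
```

For example: rewrite_dn 'dc=dev_domain,dc=com' 'dc=prod_domain,dc=com' < exp_data/portal_users.xml > imp_log/portal_users.xml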
5. Text items with HTML links
Sometimes people store full URLs inside their text items; this happens mostly when they create links with the Rich Text Editor in IE.
The following work-around is implemented in the post import step of imp_portal_schema.sh:
-- html links created by the RTF editor inside text items
update wwv_text
set text=replace( text, 'dev.dev_domain.com:7778', 'prod.prod_domain.com:7778' )
where text like '%$DEV_HOSTNAME:$DEV_PORT%'
6. OID Custom password policy
People quite often change the password policy of the OID server, because with the default policy the password expires after 60 days. If so, do not forget to make the same changes in the new installation.
PROBLEMS
1. Import log has some errors
A. EXP-00091 -Exporting questionable statistics
You can ignore this error.
B. IMP-00017 - WWSRC_PREFERENCE$
When importing, there is one import error:
IMP-00017: following statement failed with ORACLE error 921:
"ALTER TABLE "WWSRC_PREFERENCE$" ADD "
IMP-00003: ORACLE error 921 encountered
ORA-00921: unexpected end of SQL command
The primary key is not created. You can create it with this command in SQL*Plus as the user portal, then re-add the missing VPD policy.
alter table "WWSRC_PREFERENCE$" add constraint wwsrc_preference_pk
primary key (subscriber_id, id)
using index wwsrc_preference_idx1;
begin
DBMS_RLS.ADD_POLICY ('', 'WWSRC_PREFERENCE$', 'WEBDB_VPD_POLICY',
'', 'webdb_vpd_sec', 'select, insert, update, delete', TRUE,
static_policy=>true);
end;
/
The "post import changes (PORTAL)" step in the script "imp_portal_schema.sh" takes care of this.
C. IMP-00017 - WWDAV$ASL
. importing table "WWDAV$ASL"
Note: table contains ROWID column, values may be obsolete
113 rows imported
This error is normal; the table really contains a ROWID column.
D. IMP-00041 - Warning: object created with compilation warnings
This error is normal too. The packages giving these error have
dependencies on package not yet imported. A recompilation is done
after the import.
E. ldapadd error 'cannot add entries containing authpasswords'
# ldap_add: DSA is unwilling to perform
# ldap_add: additional info: You cannot add entries containing authpasswords.
"authpasswords" are values automatically generated from the real password of the user, which is stored in userpassword. These values do not have to be exported from LDAP.
In the import script, I remove the extra tag with an XSL stylesheet, 'del_authpassword.xsl'. See above.
F. IMP-00017: WWSTO_SESSION$
IMP-00017: following statement failed with ORACLE error 2298:
"ALTER TABLE "WWSTO_SESSION$" ENABLE CONSTRAINT "WWSTO_SESS_FK1""
IMP-00003: ORACLE error 2298 encountered
ORA-02298: cannot validate (PORTAL.WWSTO_SESS_FK1) - parent keys not found
Here is a work-around for the problem. I will normally integrate it in a next version of the scripts.
SQL> delete from WWSTO_SESSION_DATA$;
7690 rows deleted.
SQL> delete from WWSTO_SESSION$;
1073 rows deleted.
SQL> commit;
Commit complete.
SQL> ALTER TABLE "WWSTO_SESSION$" ENABLE CONSTRAINT "WWSTO_SESS_FK1";
Table altered.
G. IMP-00017 - ORACLE error 1 - DBMS_JOB.ISUBMIT
This error can appear during the import when the target database is not empty and has already been customized for some reason. For example, you export from one infrastructure and import into a database with a lot of other programs that use jobs, and unhappily the same job ids.
Due to the way the export/import of jobs is done, the jobs keep their ids after the import, and they may conflict.
IMP-00017: following statement failed with ORACLE error 1: "BEGIN DBMS_JOB.ISUBMIT(JOB=>42,WHAT=>'begin execute immediate " "''begin wwutl_cache_sys.process_background_inval; end;'' ; exc" "eption when others then wwlog_api.log(p_domain=> ''utl'', " " p_subdomain=>''cache'', p_name=>''background'', " " p_action=>''process_background_inval'', p_information => ''E" "rror in process_background_inval ''|| sqlerrm);end;', NEXT_DATE=" ">TO_DATE('2004-08-19:17:32:16','YYYY-MM-DD:HH24:MI:SS'),INTERVAL=>'SYSDATE " "+ 60/(24*60)',NO_PARSE=>TRUE); END;"
IMP-00003: ORACLE error 1 encountered ORA-00001: unique constraint (SYS.I_JOB_JOB) violated
ORA-06512: at "SYS.DBMS_JOB", line 97 ORA-06512: at line 1
Solutions:
1. Use a freshly installed database.
2. Because the conflicting jobs differ from one custom installation to another, there is no general rule. But you can
recreate the jobs lost after the import with other ids
and/or change the job ids of the other program before importing. These types of commands can help you (run them as SYS):
select * from dba_jobs;
update dba_jobs set job=99 where job=52;
commit;
2. Import in a RAC environment
Be aware of the Bug 2479882 when the portal database is in a RAC database.
Bug 2479882 : NEEDED TO BOUNCE DB NODES AFTER INSTALLING PORTAL 9.0.2 IN RAC NODE
3. Intermedia
After importing an environment, the Intermedia indexes are invalid. To correct the error you need to run, in SQL*Plus as Portal:
start $MIDTIER_ORACLE_HOME/portal/admin/plsql/wws/inctxgrn.sql
start $MIDTIER_ORACLE_HOME/portal/admin/plsql/wws/ctxcrind.sql
But $MIDTIER_ORACLE_HOME/portal/admin/plsql/wws/inctxgrn.sql is missing in IAS 9.0.4.0. This is Bug 3536937, fixed in 9.0.4.1. The missing scripts are contained in the downloadable zip file (exp_schema904.zip : Attachment 276688.1:1), directory sql. This means that in practice, on 9.0.4.0, you have to run
start sql/inctxgrn.sql
start $MIDTIER_ORACLE_HOME/portal/admin/plsql/wws/ctxcrind.sql
In the import script, this is done in step 6, where the Portal Intermedia indexes are recreated.
You cannot work around the problem without the scripts. Running ctxcrind.sql alone does not work; you will get this error:
ORA-06510: PL/SQL: unhandled user-defined exception
ORA-06512: at "PORTAL.WWERR_API_EXCEPTION", line 164
ORA-06512: at "PORTAL.WWV_CONTEXT", line 1035
ORA-06510: PL/SQL: unhandled user-defined exception
ORA-06512: at "PORTAL.WWERR_API_EXCEPTION", line 164
ORA-06512: at "PORTAL.WWV_CONTEXT", line 476
ORA-06510: PL/SQL: unhandled user-defined exception
ORA-20000: Oracle Text error:
DRG-12603: CTXSYS does not own user datastore procedure: WWSBR_THING_CTX_69
ORA-06512: at line 13
4. ptlconfig
If you try to run ptlconfig right after an import, you will get an error:
Problem processing Portal instance: Configuring HTTP server settings : Installing cache data : SQL exception: ERROR: ORA-23421: job number 32 is not a job in the job queue
This is because the import, done by user SYS, has imported the PORTAL jobs into the SYS schema in place of portal. The solution is to run:
update dba_jobs set LOG_USER='PORTAL', PRIV_USER='PORTAL' where schema_user='PORTAL';
In the import script, this is done in the "post import changes (SYS)" step.
5. WWC-41417 - invalid credentials.
When you try to login you get:
Unexpected error encountered in wwsec_app_priv.process_signon (User-Defined Exception) (WWC-41417)
An exception was raised when accessing the Oracle Internet Directory: 49: Invalid credentials
Details
Error:Operation: dbms_ldap.simple_bind_s
OID host: machine.domain.com
OID port number: 4032
Entry DN: orclApplicationCommonName=PORTAL,cn=Portal,cn=Products,cn=OracleContext. (WWC-41743)
Solution:
- run secupoid.sql
- rerun ptlconfig
This problem has been seen after using ptlasst in place of ptlconfig.
6. EXP-00003 with a database 9.2.0.5 or 10.1.0.2
In fact, the dump format of imp/exp changed in 9.2.0.5 and 10.1.0.2. The EXP-00003 error only occurs when the export from the 9.2.0.5.0 or 10.1.0.2.0 database is done with a lower-release export utility, e.g. 9.2.0.4.0.
Due to the way this note is written, the imp/exp utility used is the one of the midtier (9014); if your portal resides in a 9.2.0.5 database, it will not work. To work around the problem, there are 2 solutions:
Change the script so that it uses the exp and imp commands of the database.
Make a change to the 9.2.0.5 or 10.1.0.2 database to make it compatible with previous versions. The change is to modify an internal database view before exporting/importing the data.
A work-around is given in Bug 3784697
1. Make a note of the export definition of exu9tne from
$OH/rdbms/admin/catexp.sql
2. Copy this to a new file and add "UNION ALL select * from sys.exu9tneb" to the end of the definition
3. Run this as sys against the DB to be exported.
4. Export as required
5. Put back the original definition of exu9tne
E.g. for 9.2.0.4 the workaround view would be:
CREATE OR REPLACE VIEW exu9tne (
tsno, fileno, blockno, length) AS
SELECT ts#, segfile#, segblock#, length
FROM sys.uet$
WHERE ext# = 1
UNION ALL
select * from sys.exu9tneb;
7. EXP-00006: INTERNAL INCONSISTENCY ERROR
This is Bug 2906613.
The work-around given in this bug is the following:
- create the following view, connected as sys, before running export:
CREATE OR REPLACE VIEW exu8con (
objid, owner, ownerid, tname, type, cname,
cno, condition, condlength, enabled, defer,
sqlver, iname) AS
SELECT o.obj#, u.name, c.owner#, o.name,
decode(cd.type#, 11, 7, cd.type#),
c.name, c.con#, cd.condition, cd.condlength,
NVL(cd.enabled, 0), NVL(cd.defer, 0),
sv.sql_version, NVL(oi.name, '')
FROM sys.obj$ o, sys.user$ u, sys.con$ c,
sys.cdef$ cd, sys.exu816sqv sv, sys.obj$ oi
WHERE u.user# = c.owner# AND
o.obj# = cd.obj# AND
cd.con# = c.con# AND
cd.spare1 = sv.version# (+) AND
cd.enabled = oi.obj# (+) AND
NOT EXISTS (
SELECT owner, name
FROM sys.noexp$ ne
WHERE ne.owner = u.name AND
ne.name = o.name AND
ne.obj_type = 2);
The modification of exu8con simply adds support for a constraint type that had not previously been supported by this view. There is no negative impact.
8. WWSBR_DOC_CTX_54 is invalid
After the recompilation of the package, one package remains invalid (in sys_post_changes.log):
INVALID_OBJECT_AFTER
1
select owner, object_name from all_objects where status='INVALID'
CTXSYS WWSBR_DOC_CTX_54
CREATE OR REPLACE procedure WWSBR_DOC_CTX_54
(rid in rowid, bilob in out NOCOPY blob)
is begin PORTAL.WWSBR_CTX_PROCS.DOC_CTX(rid,bilob);end;
This object is not used anymore by portal. The error can be ignored, and the procedure can be removed too. This is Bug 3559731.
9. You do not have permission to perform this operation. (WWC-44131)
It seems that there are problems if
- the groups on the production machine do not reside in the default place in OID,
- and the group creation base and group search base were changed.
After this, the cloning of the repository works without problems. But it seems that the command 'ptlconfig -dad portal' does not reset the GUID and DN of the groups correctly. I have not checked this yet.
The solution seems to be the script given in the 9.0.2 Note 228516.1: run group_sec.sql to reset all the DNs and GUIDs in the copied instance.
10. Invalid Java objects when exporting from a 9.x database and importing in a 10g database
If you export from a 9.x database and import in a 10g database, after running utlrp.sql, 18 Java objects will be invalid.
select object_name, object_type from user_objects where status='INVALID'
SQL> /
OBJECT_NAME OBJECT_TYPE
/556ab159_Handler JAVA CLASS
/41bf3951_HttpsURLConnection JAVA CLASS
/ce2fa28e_ProviderManagerClien JAVA CLASS
/c5b98d35_ServiceManagerClient JAVA CLASS
/d77cf2ab_SOAPServlet JAVA CLASS
/649bf254_JavaProvider JAVA CLASS
/a9164b8b_SpProvider JAVA CLASS
/2ee43ac9_StatefulEJBProvider JAVA CLASS
/ad45acec_StatelessEJBProvider JAVA CLASS
/da1c4a59_EntityEJBProvider JAVA CLASS
/66fdac3e_OracleSOAPHTTPConnec JAVA CLASS
/939c36f5_OracleSOAPHTTPConnec JAVA CLASS
org/apache/soap/rpc/Call JAVA CLASS
org/apache/soap/rpc/RPCMessage JAVA CLASS
org/apache/soap/rpc/Response JAVA CLASS
/198a7089_Message JAVA CLASS
/2cffd799_ProviderGroupUtils JAVA CLASS
/32ebb779_ProviderGroupMgrProx JAVA CLASS
18 rows selected.
This is a known issue. It can be solved by applying one of the following patches, depending on your IAS version.
Bug 3405173 - PORTAL 9.0.4.0.0 PATCH FOR 10G DB UPGRADE (FROM 9.0.X AND 9.2.X)
Bug 4100409 - PORTAL 9.0.4.1.0 PATCH FOR 10G DB UPGRADE (FROM 9.0.X AND 9.2.X)
Bug 4100417 - PORTAL 9.0.4.2.0 PATCH FOR 10G DB UPGRADE (FROM 9.0.X AND 9.2.X)
11. Import : IMP-00003: ORACLE error 30510 encountered
When importing Portal 9.0.4.x, the database-side import may produce an ORA-30510 error. The new perl script works around the issue in the portal_post_import.sql script, but the BASH scripts do not. If you use the BASH scripts, run this command manually in SQL*Plus logged in as portal after the import.
---- Import error 2 - ORA-30510 when importing
CREATE OR REPLACE TRIGGER logoff_trigger
before logoff on schema
begin
-- Call wwsec_oid.unbind to close open OID connections if any.
wwsec_oid.unbind;
exception
when others then
-- Ignore all the errors encountered while unbinding.
null;
end logoff_trigger;
/
This is logged as Bug 4458413.
12. Exporting from a 9.0.1 database and importing into a 9.2.0.5+ or 10g DB
When exporting from a 9.0.1 database to a 10g database, it can happen that the Java classes do not get compiled correctly. The following errors are seen:
ORA-29534: referenced object PORTAL.oracle/net/www/proto/https/HttpsURLConnection could not be resolved
errors:: class oracle/net/www/proto/https/HttpsURLConnection
ORA-29521: referenced name oracle/security/ssl/OracleSSLSocketFactoryImpl could not be found
ORA-29521: referenced name oracle/security/ssl/OracleSSLSocketFactory could not be found
In such a case, please apply the following patches after the import in the 10g database.
Bug 3405173 PORTAL REPOS DB UPGRADE TO 10G: for Portal 9.0.4.0
Bug 4100409 PORTAL REPOS DB UPGRADE TO 10G: for Portal 9.0.4.1
Main Differences with Portal 9.0.2
For people used to this technique in Portal 9.0.2, the main differences with the same note for Portal 9.0.2 may be of interest.
Portal 9.0.2
Portal 9.0.4
Cutter database
Portal 9.0.2 can be part of an infrastructure database or of a custom external database.
In Portal 9.0.2, the portal schema is imported into an empty database.
Portal 9.0.4 can only be installed in a 'Cutter database', a database created with RepCA or OUI, always containing OID, DCM and so on...
In Portal 9.0.4, the portal schema is imported into a 'Cutter database' (new)
group_sec.sql
group_sec.sql is used to correct the GUIDs of OID stored in Portal
ptlconfig -dad portal -oid is used to correct the GUIDs of OID stored in Portal (new)
1 script
The import / export is divided into several steps with several scripts
The import script is done in one step
Additional steps are included in the script
This requires knowing the hostname and port of the original development machine. (new)
Import
The steps are:
creation of an empty database
creation of the users with password=username
import
The steps are:
creation of an IAS 10g infrastructure DB (repca or OUI)
deletion of new portal schemas (new)
creation of the users with the same passwords as the schemas just dropped.
import
DAD
The DAD needed to be changed
The passwords are not changed, so the DAD does not need to be changed.
Bugs
In Portal 9.0.2, two bugs were worked around by change_host.sh
In Portal 9.0.4, some additional tables need to be updated manually before running ptlasst. This is Bug 3762961.
export of LDAP
The export is done in LDIF files. If prod and dev have different domains, it is quite difficult to change the domain name in these files due to the line wrapping at 78 characters.
The export is done in XML files, in the DSML format (new). It is a lot easier to change the XML files if the domain name differs between PROD and DEV.
Download
You have to cut and paste the scripts
The scripts are attached to the note. Just download them.
Rewiring
9.0.2 uses ptlasst.
ptlasst.csh -mode MIDTIER -i custom -s $PORTAL_USER -sp $PORTAL_PASSWORD -c $PORTAL_HOSTNAME:$PORTAL_DB_PORT:$PORTAL_SERVICE_NAME -sdad $PORTAL_DAD -o orasso -op $ORASSO_PASSWORD -odad orasso -host $MIDTIER_HOSTNAME -port $MIDTIER_HTTP_PORT -ldap_h $INFRA_HOSTNAME -ldap_p $OID_PORT -ldap_w $IAS_PASSWORD -pwd $IAS_PASSWORD -sso_c $INFRA_HOSTNAME:$INFRA_DB_PORT:$INFRA_SERVICE_NAME -sso_h $INFRA_HOSTNAME -sso_p $INFRA_HTTP_PORT -ultrasearch -oh $MIDTIER_ORACLE_HOME -mc false -mi true -chost $MIDTIER_HOSTNAME -cport_i $WEBCACHE_INV_PORT -cport_a $WEBCACHE_ADM_PORT -wc_i_pwd $IAS_PASSWORD -emhost $INFRA_HOSTNAME -emport $EM_PORT -pa orasso_pa -pap $ORASSO_PA_PASSWORD -ps orasso_ps -pp $ORASSO_PS_PASSWORD -iasname $IAS_NAME -verbose -portal_only
9.0.4 uses ptlconfig (new)
ptlconfig -dad portal
Environment variables
A lot of environment variables are needed
Just 3 environment variables are needed:
- password of SYS
- password of IAS,
- ORACLE_HOME of the Midtier
All the rest is found in iasconfig.xml and LDAP (new)
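Regarding the 78-character LDIF line wrapping mentioned above: LDIF continuation lines start with a single space, so they can be re-joined before doing any domain-name replacement. A sketch (the function is ours, not part of the note's scripts):

```shell
# Join LDIF continuation lines: a leading space marks a continuation of
# the previous line, so delete every newline that is followed by a space.
unwrap_ldif() {
  sed -e ':a' -e 'N' -e '$!ba' -e 's/\n //g'
}
```

Once unwrapped, a simple sed substitution of the DN suffix works on whole values again.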
TO DO
- Check if the orclcommonapplication name fits SID.hostname
- Check what gives the import of a portal30 upgraded schema inside a schema named portal
- Explain how to copy the portal*.dbf files in place of export/import and the limitation of tra -
How do I prevent users from being able to update Firmware
I have several users (14) with iPad 2 who rely on an in-house developed App. We have yet to test this App on iOS 5.1 and therefore want to avoid any of the users updating the iPads at all costs!
this question is in two parts:
How can I prevent users from upgrading firmware themselves short of just asking nicely?
How can I stop the iPad from automatically downloading the Upgrade when I deploy a Policy using the iPhone Configuration Utility?
Any advice would be great!
We've been looking at the AirWatch MDM and have been told it has this capability. Not sure if it would be justified from an economic standpoint for you, however.
-
Date parse error while importing users from OIM to OIA (SRM 5.0.3)
Hi All,
Env Details:
OIA (SRM 5.0.3), Weblogic and Oracle 10g DB
We have integrated OIM with OIA, mapping extended attributes by modifying the iam-context.xml file to load users. That part works successfully. But when we map a "Date"-related attribute, it gives a "Date Parsing error" and the users are not loaded.
We have also tried loading the users using the flat-file mechanism, with the same result.
Please advise. Thanks in advance!
Regards,
Ravi G.
Hi,
It's a problem with the OOB OIMIAMSolution.class file, which is called while importing users from OIM. It applies a DateParse() conversion method to every attribute whose OIA attribute name ends with "Date". The conversion is defined with the format (yyyy-MM-dd), so it expects the input value in that format; if it is not, it throws a parse error.
We found a workaround for this as follows:
We used another related OIA attribute whose name ends with something other than the string "Date".
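The failure mode described above can be reproduced with a plain SimpleDateFormat: only values in the yyyy-MM-dd pattern named above parse cleanly, and anything else throws a ParseException. A minimal sketch; the demo class is my own illustration, not the OOB converter itself:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

public class DateParseDemo {
    // Mirrors the described converter behavior: only yyyy-MM-dd values parse.
    static String tryParse(String value) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd");
        fmt.setLenient(false); // reject values that merely resemble the pattern
        try {
            fmt.parse(value);
            return value + " -> OK";
        } catch (ParseException e) {
            return value + " -> ParseException";
        }
    }

    public static void main(String[] args) {
        System.out.println(tryParse("2010-03-30"));  // matches the expected pattern
        System.out.println(tryParse("03/30/2010"));  // any other format fails to parse
    }
}
```

This is why mapping a date attribute whose source values use another format breaks the import, and why remapping to a non-"Date"-suffixed attribute sidesteps the conversion.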
Thanks,
Ravi G.
-
Provisioning of users from OIM to Exchange Server 2007
Hi,
I am trying to provision users from OIM 9.1.0.1 to Exchange Server 2007. For this I used the Exchange Server Connector 9.1.1.1.0. Using the AD_Base_connector 9.1.1.0.0 I can provision the user details. But while provisioning to Exchange Server 2007 I am getting the following error:
ERROR [XELLERATE.WEBAPP],Class/Method: tcLookupFieldAct ion/lookupByColumn encounter some problems: lookup Error in OIM
And I am unable to get the Lookup details for the MailBox in the Design Console as well as in the Scheduled Tasks in the OIM Admin Console.
Can anybody help me solve this issue?
Thanks & Regards
SRI
Hi Suren,
I am using the Remote Manager. I enabled logging in log.properties on both the OIM server and the Remote Manager.
I am observing the following message in the Remote Manager command prompt:
DEBUG,30 Mar 2010 01:12:44,437,[XELLERATE.REMOTEMANAGER],Class/Method: RMISSLCli
entSocketFactory/createSocket left.
and I am getting the following error in the WebLogic server command prompt:
Running ISADAM
Target Class = java.lang.String
Running GETATTRIBUTEHASH
Target Class = com.thortech.xl.util.adapters.tcUtilHashTableOperations
Running Set User Attributes
ERROR,30 Mar 2010 01:13:39,484,[XELLERATE.WEBAPP],Class/Method: tcLookupFieldAct
ion/lookupByColumn encounter some problems: lookup Error
Can you help me resolve this issue?
Thanks in advance,
SRI
-
OIA-OIM Integration, Error while trying to import users from OIM to OIA
Hi Experts,
I have used the deprecated method to integrate OIA 11gR1 with OIM 9.1.0.2 BP13.
When I try to import users from OIM to OIA, I get the following error: "failed reading the magic number mapping file"
Here are the logs. Can anyone tell me what this error is about?
12:28:33,000 DEBUG [QuartzJobListener] OIM1: job about to be executed
12:28:33,000 DEBUG [IAMJob] ******* executing job OIM1 *******
12:28:33,000 INFO [DefaultRemoter] Exec: dwrSchedulerService.getJobStatus()
12:28:33,000 DEBUG [DefaultRemoter] --Object created, not stored. id=0
12:28:33,000 DEBUG [DebuggingPrintWriter] out(46): throw 'allowScriptTagRemoting is false.';
12:28:33,000 DEBUG [DebuggingPrintWriter] out(46): //#DWR-INSERT
12:28:33,000 DEBUG [DebuggingPrintWriter] out(46): //#DWR-REPLY
12:28:33,000 DEBUG [DebuggingPrintWriter] out(46): var s0={};var s1={};s0.currentCount=0;s0.groupName="IAM";s0.job=null;s0.jobName="OIM1";s0.jobStatusId=null;s0.jobType="Import/Export Progress";s0.lastAccessedTime=0;s0.launcher=null;s0.monitorMap=s1;s0.status=1;s0.timeElapsed=0;s0.totalCount=0;
dwr.engine._remoteHandleCallback('20','0',[s0]);
12:28:33,015 DEBUG [IAMJob] ---> executing job 'OIM1' using IAMJobExecutor
12:28:33,015 DEBUG [IAMJobExecutor] found valid iam service
12:28:33,015 DEBUG [IAMJobExecutor] looking for iam server connection 'OIM1'
12:28:33,031 DEBUG [IAMJobExecutor] ----> adding connection defined in config files [dbIAMConnection, fileIAMConnection]
12:28:33,031 DEBUG [IAMJobExecutor] found 3 iam server connections
12:28:33,031 DEBUG [IAMJobExecutor] checking iam server connection 'OIM1'
12:28:33,031 DEBUG [IAMJobExecutor] found matching iam server connection 'OIM1'
12:28:33,031 DEBUG [IAMJobExecutor] found valid iam server OIM1
12:28:33,031 DEBUG [IAMJobExecutor] IAM action specified is ACTION_IMPORT_USERS[2]
12:28:33,031 DEBUG [OIMIAMSolution] In Read Users ...
12:28:33,031 DEBUG [OIMIAMSolution] publishing import starting event...
12:28:33,031 DEBUG [OIMIAMSolution] Starting import run id ---> null
12:28:33,031 DEBUG [OIMIAMSolution] Trying to establish a connection with OIM Server...
12:28:33,031 DEBUG [OIMIAMSolution] ************** OIM Connection Params *************
12:28:33,031 DEBUG [OIMIAMSolution] XL Home ---> E:\Middleware10G_Home\xellerate
12:28:33,031 DEBUG [OIMIAMSolution] login config ---> E:\Middleware10G_Home\xellerate\config\auth.config
12:28:33,031 DEBUG [OIMIAMSolution] Naming Factory Initial ---> : weblogic.jndi.WLInitialContextFactory
12:28:33,031 DEBUG [OIMIAMSolution] Provider URL --> t3://vkalyan-in:7001
12:28:33,031 DEBUG [OIMIAMSolution] ****************************************************
12:28:33,031 DEBUG [OIMIAMSolution] ********** Connecting to OIM Server **********
12:28:33,031 DEBUG [DefaultIAMListener] storing new ImportRun
12:28:33,109 DEBUG [SrmIndexDaemon] Checking Imports or Re-Indexing Activity...
12:28:33,109 INFO [SrmIndexDaemon] Imports or Re-Indexing are Running. Stopping online indexing
12:28:33,156 ERROR [JobRunShell] Job IAM.OIM1 threw an unhandled Exception:
java.lang.ExceptionInInitializerError
at org.jgroups.conf.ClassConfigurator.<clinit>(ClassConfigurator.java:46)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at java.lang.Class.newInstance0(Class.java:355)
at java.lang.Class.newInstance(Class.java:308)
at org.jgroups.stack.ProtocolStack.<init>(ProtocolStack.java:88)
at org.jgroups.JChannel.init(JChannel.java:1568)
at org.jgroups.JChannel.<init>(JChannel.java:257)
at org.jgroups.JChannel.<init>(JChannel.java:240)
at org.jgroups.blocks.NotificationBus.<init>(NotificationBus.java:69)
at com.opensymphony.oscache.plugins.clustersupport.JavaGroupsBroadcastingListener.initialize(JavaGroupsBroadcastingListener.java:113)
at com.opensymphony.oscache.base.AbstractCacheAdministrator.configureStandardListeners(AbstractCacheAdministrator.java:328)
at com.opensymphony.oscache.general.GeneralCacheAdministrator.createCache(GeneralCacheAdministrator.java:305)
at com.opensymphony.oscache.general.GeneralCacheAdministrator.<init>(GeneralCacheAdministrator.java:99)
at com.thortech.xl.cache.OSCacheProvider.initialize(Unknown Source)
at com.thortech.xl.cache.CacheFactory.getCacheProvider(Unknown Source)
at com.thortech.xl.cache.CacheUtil.<clinit>(Unknown Source)
at Thor.API.tcUtilityFactory.getPropertyValue(Unknown Source)
at Thor.API.tcUtilityFactory.<init>(Unknown Source)
at com.vaau.rbacx.iam.oracle.OIMIAMSolution.getUtilityFactory(OIMIAMSolution.java:2542)
at com.vaau.rbacx.iam.oracle.OIMIAMSolution.readUsers(OIMIAMSolution.java:754)
at com.vaau.rbacx.iam.service.impl.RbacxIAMServiceImpl.importUsers(RbacxIAMServiceImpl.java:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at $Proxy116.importUsers(Unknown Source)
at com.vaau.rbacx.scheduling.executor.iam.IAMJobExecutor.execute(IAMJobExecutor.java:121)
at com.vaau.rbacx.scheduling.manager.providers.quartz.jobs.AbstractJob.execute(AbstractJob.java:72)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:534)
Caused by: org.jgroups.ChannelException: failed reading the magic number mapping file
at org.jgroups.conf.ClassConfigurator.init(ClassConfigurator.java:101)
at org.jgroups.conf.ClassConfigurator.<clinit>(ClassConfigurator.java:43)
... 38 more
Caused by: java.io.IOException
at org.jgroups.conf.MagicNumberReader.parseClassData(MagicNumberReader.java:89)
at org.jgroups.conf.MagicNumberReader.parse(MagicNumberReader.java:69)
at org.jgroups.conf.MagicNumberReader.readMagicNumberMapping(MagicNumberReader.java:57)
at org.jgroups.conf.ClassConfigurator.init(ClassConfigurator.java:73)
... 39 more
Caused by: java.lang.NullPointerException
at org.jgroups.conf.MagicNumberReader.parseClassData(MagicNumberReader.java:84)
... 42 more
12:28:33,156 ERROR [ErrorLogger] Job (IAM.OIM1 threw an exception.
org.quartz.SchedulerException: Job threw an unhandled exception. [See nested exception: java.lang.ExceptionInInitializerError]
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:534)
Caused by: java.lang.ExceptionInInitializerError
at org.jgroups.conf.ClassConfigurator.<clinit>(ClassConfigurator.java:46)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at java.lang.Class.newInstance0(Class.java:355)
at java.lang.Class.newInstance(Class.java:308)
at org.jgroups.stack.ProtocolStack.<init>(ProtocolStack.java:88)
at org.jgroups.JChannel.init(JChannel.java:1568)
at org.jgroups.JChannel.<init>(JChannel.java:257)
at org.jgroups.JChannel.<init>(JChannel.java:240)
at org.jgroups.blocks.NotificationBus.<init>(NotificationBus.java:69)
at com.opensymphony.oscache.plugins.clustersupport.JavaGroupsBroadcastingListener.initialize(JavaGroupsBroadcastingListener.java:113)
at com.opensymphony.oscache.base.AbstractCacheAdministrator.configureStandardListeners(AbstractCacheAdministrator.java:328)
at com.opensymphony.oscache.general.GeneralCacheAdministrator.createCache(GeneralCacheAdministrator.java:305)
at com.opensymphony.oscache.general.GeneralCacheAdministrator.<init>(GeneralCacheAdministrator.java:99)
at com.thortech.xl.cache.OSCacheProvider.initialize(Unknown Source)
at com.thortech.xl.cache.CacheFactory.getCacheProvider(Unknown Source)
at com.thortech.xl.cache.CacheUtil.<clinit>(Unknown Source)
at Thor.API.tcUtilityFactory.getPropertyValue(Unknown Source)
at Thor.API.tcUtilityFactory.<init>(Unknown Source)
at com.vaau.rbacx.iam.oracle.OIMIAMSolution.getUtilityFactory(OIMIAMSolution.java:2542)
at com.vaau.rbacx.iam.oracle.OIMIAMSolution.readUsers(OIMIAMSolution.java:754)
at com.vaau.rbacx.iam.service.impl.RbacxIAMServiceImpl.importUsers(RbacxIAMServiceImpl.java:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at $Proxy116.importUsers(Unknown Source)
at com.vaau.rbacx.scheduling.executor.iam.IAMJobExecutor.execute(IAMJobExecutor.java:121)
at com.vaau.rbacx.scheduling.manager.providers.quartz.jobs.AbstractJob.execute(AbstractJob.java:72)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
... 1 more
Caused by: org.jgroups.ChannelException: failed reading the magic number mapping file
at org.jgroups.conf.ClassConfigurator.init(ClassConfigurator.java:101)
at org.jgroups.conf.ClassConfigurator.<clinit>(ClassConfigurator.java:43)
... 38 more
Caused by: java.io.IOException
at org.jgroups.conf.MagicNumberReader.parseClassData(MagicNumberReader.java:89)
at org.jgroups.conf.MagicNumberReader.parse(MagicNumberReader.java:69)
at org.jgroups.conf.MagicNumberReader.readMagicNumberMapping(MagicNumberReader.java:57)
at org.jgroups.conf.ClassConfigurator.init(ClassConfigurator.java:73)
... 39 more
Caused by: java.lang.NullPointerException
at org.jgroups.conf.MagicNumberReader.parseClassData(MagicNumberReader.java:84)
... 42 more
12:28:33,156 DEBUG [QuartzJobListener] OIM1: job was executed
12:28:33,156 DEBUG [VaauSchedulerEventListenerImpl] Processing VaauSchedulerEvent
12:28:33,156 INFO [VaauSchedulerEventListenerImpl] Job executed: OIM1, IAM
12:28:33,156 INFO [VaauSchedulerEventListenerImpl] Job run time: 0s
12:28:33,156 INFO [VaauSchedulerEventListenerImpl] Next Run: null
Regards
Kalyan
I solved the above error by removing the oscache***.jar file and keeping only oscache.jar in the lib directory.
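Duplicate jars like this can be spotted by asking the classloader for every copy of a given resource: more than one hit for the same file usually means conflicting jars on the classpath. A minimal sketch; the class name is mine, and jg-magic-map.xml is my assumption for the jgroups magic-number mapping file the error refers to:

```java
import java.io.IOException;
import java.net.URL;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class ClasspathDupFinder {
    /** Lists every classpath location that supplies the named resource;
     *  multiple hits usually indicate duplicate or conflicting jars. */
    public static List<URL> findCopies(String resource) throws IOException {
        List<URL> hits = new ArrayList<>(Collections.list(
            ClasspathDupFinder.class.getClassLoader().getResources(resource)));
        for (URL url : hits) {
            System.out.println(resource + " found in: " + url);
        }
        return hits;
    }

    public static void main(String[] args) throws IOException {
        // jgroups loads its magic-number mapping from the classpath; with two
        // oscache jars each bundling jgroups configuration, the read can fail
        // as in the trace above. (Resource name is an assumption.)
        findCopies("jg-magic-map.xml");
    }
}
```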
However, I am getting a different error this time. Let me know if you have any suggestions:
11:27:08,015 DEBUG [QuartzJobListener] job1: job about to be executed
11:27:08,015 DEBUG [IAMJob] ******* executing job job1 *******
11:27:08,046 DEBUG [IAMJob] ---> executing job 'job1' using IAMJobExecutor
11:27:08,046 DEBUG [IAMJobExecutor] found valid iam service
11:27:08,046 DEBUG [IAMJobExecutor] looking for iam server connection 'OIM1'
11:27:08,078 DEBUG [IAMJobExecutor] ----> adding connection defined in config files [dbIAMConnection, fileIAMConnection]
11:27:08,078 DEBUG [IAMJobExecutor] found 3 iam server connections
11:27:08,078 DEBUG [IAMJobExecutor] checking iam server connection 'OIM1'
11:27:08,078 DEBUG [IAMJobExecutor] found matching iam server connection 'OIM1'
11:27:08,078 DEBUG [IAMJobExecutor] found valid iam server OIM1
11:27:08,078 DEBUG [IAMJobExecutor] IAM action specified is ACTION_IMPORT_USERS[2]
11:27:08,078 DEBUG [OIMIAMSolution] In Read Users ...
11:27:08,078 DEBUG [OIMIAMSolution] publishing import starting event...
11:27:08,078 DEBUG [OIMIAMSolution] Starting import run id ---> null
11:27:08,078 DEBUG [OIMIAMSolution] Trying to establish a connection with OIM Server...
11:27:08,078 DEBUG [OIMIAMSolution] ************** OIM Connection Params *************
11:27:08,078 DEBUG [OIMIAMSolution] XL Home ---> E:\Middleware10G_Home\xellerate
11:27:08,078 DEBUG [OIMIAMSolution] login config ---> E:\Middleware10G_Home\xellerate\config\auth.config
11:27:08,078 DEBUG [OIMIAMSolution] Naming Factory Initial ---> : weblogic.jndi.WLInitialContextFactory
11:27:08,078 DEBUG [OIMIAMSolution] Provider URL --> t3://vkalyan-in:7001
11:27:08,078 DEBUG [OIMIAMSolution] ****************************************************
11:27:08,078 DEBUG [OIMIAMSolution] ********** Connecting to OIM Server **********
11:27:08,078 DEBUG [DefaultIAMListener] storing new ImportRun
11:27:08,156 INFO [DefaultRemoter] Exec: dwrSchedulerService.getJobStatus()
11:27:08,156 DEBUG [DefaultRemoter] --Object created, not stored. id=0
11:27:08,156 DEBUG [DebuggingPrintWriter] out(35): throw 'allowScriptTagRemoting is false.';
11:27:08,156 DEBUG [DebuggingPrintWriter] out(35): //#DWR-INSERT
11:27:08,156 DEBUG [DebuggingPrintWriter] out(35): //#DWR-REPLY
11:27:08,156 DEBUG [DebuggingPrintWriter] out(35): var s0={};var s1={};s0.currentCount=0;s0.groupName="IAM";s0.job=null;s0.jobName="job1";s0.jobStatusId=null;s0.jobType="Import/Export Progress";s0.lastAccessedTime=0;s0.launcher=null;s0.monitorMap=s1;s0.status=1;s0.timeElapsed=0;s0.totalCount=0;
dwr.engine._remoteHandleCallback('139','0',[s0]);
11:27:11,187 ERROR [JobRunShell] Job IAM.job1 threw an unhandled Exception:
java.lang.AssertionError: Failed to generate class for com.thortech.xl.ejb.beans.tcUnauthenticatedOperationsSession_j7uqe_EOImpl_1032_WLStub
at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:790)
at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:779)
at weblogic.rmi.extensions.StubFactory.getStub(StubFactory.java:74)
at weblogic.rmi.internal.StubInfo.resolveObject(StubInfo.java:226)
at weblogic.rmi.internal.StubInfo.readResolve(StubInfo.java:207)
at sun.reflect.GeneratedMethodAccessor81.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1061)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1762)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:351)
at weblogic.rmi.extensions.server.CBVInputStream.readObject(CBVInputStream.java:64)
at weblogic.rmi.internal.ServerRequest.unmarshalReturn(ServerRequest.java:100)
at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:348)
at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:259)
at com.thortech.xl.ejb.beans.tcUnauthenticatedOperationsSession_j7uqe_HomeImpl_1032_WLStub.create(Unknown Source)
at Thor.API.tcUtilityFactory.getUnauthenticatedOperations(Unknown Source)
at Thor.API.tcUtilityFactory.getPropertyValue(Unknown Source)
at Thor.API.tcUtilityFactory.<init>(Unknown Source)
at com.vaau.rbacx.iam.oracle.OIMIAMSolution.getUtilityFactory(OIMIAMSolution.java:2542)
at com.vaau.rbacx.iam.oracle.OIMIAMSolution.readUsers(OIMIAMSolution.java:754)
at com.vaau.rbacx.iam.service.impl.RbacxIAMServiceImpl.importUsers(RbacxIAMServiceImpl.java:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at $Proxy114.importUsers(Unknown Source)
at com.vaau.rbacx.scheduling.executor.iam.IAMJobExecutor.execute(IAMJobExecutor.java:121)
at com.vaau.rbacx.scheduling.manager.providers.quartz.jobs.AbstractJob.execute(AbstractJob.java:72)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:534)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:788)
... 37 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 20
at com.thortech.xl.ejb.beans.tcUnauthenticatedOperationsSession_j7uqe_EOImpl_1032_WLStub.ensureInitialized(Unknown Source)
at com.thortech.xl.ejb.beans.tcUnauthenticatedOperationsSession_j7uqe_EOImpl_1032_WLStub.<init>(Unknown Source)
... 42 more
11:27:11,203 ERROR [ErrorLogger] Job (IAM.job1 threw an exception.
org.quartz.SchedulerException: Job threw an unhandled exception. [See nested exception: java.lang.AssertionError: Failed to generate class for com.thortech.xl.ejb.beans.tcUnauthenticatedOperationsSession_j7uqe_EOImpl_1032_WLStub]
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:534)
Caused by: java.lang.AssertionError: Failed to generate class for com.thortech.xl.ejb.beans.tcUnauthenticatedOperationsSession_j7uqe_EOImpl_1032_WLStub
at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:790)
at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:779)
at weblogic.rmi.extensions.StubFactory.getStub(StubFactory.java:74)
at weblogic.rmi.internal.StubInfo.resolveObject(StubInfo.java:226)
at weblogic.rmi.internal.StubInfo.readResolve(StubInfo.java:207)
at sun.reflect.GeneratedMethodAccessor81.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1061)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1762)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:351)
at weblogic.rmi.extensions.server.CBVInputStream.readObject(CBVInputStream.java:64)
at weblogic.rmi.internal.ServerRequest.unmarshalReturn(ServerRequest.java:100)
at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:348)
at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:259)
at com.thortech.xl.ejb.beans.tcUnauthenticatedOperationsSession_j7uqe_HomeImpl_1032_WLStub.create(Unknown Source)
at Thor.API.tcUtilityFactory.getUnauthenticatedOperations(Unknown Source)
at Thor.API.tcUtilityFactory.getPropertyValue(Unknown Source)
at Thor.API.tcUtilityFactory.<init>(Unknown Source)
at com.vaau.rbacx.iam.oracle.OIMIAMSolution.getUtilityFactory(OIMIAMSolution.java:2542)
at com.vaau.rbacx.iam.oracle.OIMIAMSolution.readUsers(OIMIAMSolution.java:754)
at com.vaau.rbacx.iam.service.impl.RbacxIAMServiceImpl.importUsers(RbacxIAMServiceImpl.java:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at $Proxy114.importUsers(Unknown Source)
at com.vaau.rbacx.scheduling.executor.iam.IAMJobExecutor.execute(IAMJobExecutor.java:121)
at com.vaau.rbacx.scheduling.manager.providers.quartz.jobs.AbstractJob.execute(AbstractJob.java:72)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
... 1 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:788)
... 37 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 20
at com.thortech.xl.ejb.beans.tcUnauthenticatedOperationsSession_j7uqe_EOImpl_1032_WLStub.ensureInitialized(Unknown Source)
at com.thortech.xl.ejb.beans.tcUnauthenticatedOperationsSession_j7uqe_EOImpl_1032_WLStub.<init>(Unknown Source)
... 42 more
11:27:11,203 DEBUG [QuartzJobListener] job1: job was executed
11:27:11,203 DEBUG [VaauSchedulerEventListenerImpl] Processing VaauSchedulerEvent
11:27:11,203 INFO [VaauSchedulerEventListenerImpl] Job executed: job1, IAM
11:27:11,203 INFO [VaauSchedulerEventListenerImpl] Job run time: 3s
11:27:11,203 INFO [VaauSchedulerEventListenerImpl] Next Run: null
-
Provisioning of User from OIM 11g to GooggleApps
Hi all,
I am trying to provision a user from OIM 11g to Google Apps with the Google Apps 11.1.1.5 (ICF) connector.
But while provisioning the user I am getting an exception like:
javax.xml.parsers.FactoryConfigurationError: WebLogicSAXParser cannot be created.SAX feature 'http://xml.org/sax/features/external-general-entities' not supported.
at weblogic.xml.jaxp.RegistrySAXParser.<init>(RegistrySAXParser.java:73)
at weblogic.xml.jaxp.RegistrySAXParser.<init>(RegistrySAXParser.java:46)
at weblogic.xml.jaxp.RegistrySAXParserFactory.newSAXParser(RegistrySAXParserFactory.java:91)
at com.google.gdata.util.common.xml.parsing.SecureGenericXMLFactory$SecureSAXParserFactory.newSAXParser(SecureGenericXMLFactory.java:147)
at com.google.gdata.util.XmlParser.getSAXParserFactory(XmlParser.java:92)
at com.google.gdata.util.XmlParser.parse(XmlParser.java:679)
at com.google.gdata.util.XmlParser.parse(XmlParser.java:576)
at com.google.gdata.data.BaseEntry.parseAtom(BaseEntry.java:1015)
at com.google.gdata.wireformats.input.AtomDataParser.parse(AtomDataParser.java:59)
at com.google.gdata.wireformats.input.AtomDataParser.parse(AtomDataParser.java:39)
at com.google.gdata.wireformats.input.CharacterParser.parse(CharacterParser.java:100)
at com.google.gdata.wireformats.input.XmlInputParser.parse(XmlInputParser.java:52)
at com.google.gdata.wireformats.input.AtomDualParser.parse(AtomDualParser.java:66)
at com.google.gdata.wireformats.input.AtomDualParser.parse(AtomDualParser.java:34)
at com.google.gdata.client.Service.parseResponseData(Service.java:2165)
at com.google.gdata.client.Service.parseResponseData(Service.java:2098)
at com.google.gdata.client.Service.getEntry(Service.java:1353)
at com.google.gdata.client.GoogleService.getEntry(GoogleService.java:567)
at com.google.gdata.client.Service.getEntry(Service.java:1278)
at com.google.gdata.client.appsforyourdomain.AppsForYourDomainService.getEntry(AppsForYourDomainService.java:118)
at org.identityconnectors.googleapps.GoogleAppsClient.getUserEntry(GoogleAppsClient.java:148)
at org.identityconnectors.googleapps.GoogleAppsClient.testConnection(GoogleAppsClient.java:171)
at org.identityconnectors.googleapps.GoogleAppsConnector.test(GoogleAppsConnector.java:407)
at org.identityconnectors.googleapps.GoogleAppsConnector.checkAlive(GoogleAppsConnector.java:415)
at org.identityconnectors.framework.impl.api.local.ConnectorPoolManager$ConnectorPoolHandler.testObject(ConnectorPoolManager.java:105)
at org.identityconnectors.framework.impl.api.local.ConnectorPoolManager$ConnectorPoolHandler.testObject(ConnectorPoolManager.java:74)
at org.identityconnectors.framework.impl.api.local.ObjectPool.borrowObject(ObjectPool.java:229)
at org.identityconnectors.framework.impl.api.local.operations.ConnectorAPIOperationRunnerProxy.invoke(ConnectorAPIOperationRunnerProxy.java:83)
at $Proxy357.schema(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.identityconnectors.framework.impl.api.local.operations.ThreadClassLoaderManagerProxy.invoke(ThreadClassLoaderManagerProxy.java:107)
at $Proxy357.schema(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.identityconnectors.framework.impl.api.DelegatingTimeoutProxy.invoke(DelegatingTimeoutProxy.java:107)
at $Proxy357.schema(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.identityconnectors.framework.impl.api.LoggingProxy.invoke(LoggingProxy.java:76)
at $Proxy357.schema(Unknown Source)
at org.identityconnectors.framework.impl.api.AbstractConnectorFacade.schema(AbstractConnectorFacade.java:112)
at oracle.iam.connectors.icfcommon.prov.ICProvisioningManager.getConnectorSchema(ICProvisioningManager.java:337)
at oracle.iam.connectors.icfcommon.prov.ICProvisioningManager.createObject(ICProvisioningManager.java:116)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.thortech.xl.adapterGlue.ScheduleItemEvents.adpGOOGLEAPPSCREATEOBJECT.CREATEOBJECT(adpGOOGLEAPPSCREATEOBJECT.java:109)
at com.thortech.xl.adapterGlue.ScheduleItemEvents.adpGOOGLEAPPSCREATEOBJECT.implementation(adpGOOGLEAPPSCREATEOBJECT.java:54)
at com.thortech.xl.client.events.tcBaseEvent.run(tcBaseEvent.java:196)
at com.thortech.xl.dataobj.tcDataObj.runEvent(tcDataObj.java:2492)
at com.thortech.xl.dataobj.tcScheduleItem.runMilestoneEvent(tcScheduleItem.java:2917)
at com.thortech.xl.dataobj.tcScheduleItem.eventPostInsert(tcScheduleItem.java:547)
at com.thortech.xl.dataobj.tcDataObj.insert(tcDataObj.java:602)
at com.thortech.xl.dataobj.tcDataObj.save(tcDataObj.java:474)
at com.thortech.xl.ejb.beansimpl.tcProvisioningOperationsBean.retryTasks(tcProvisioningOperationsBean.java:4042)
at Thor.API.Operations.tcProvisioningOperationsIntfEJB.retryTasksx(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.bea.core.repackaged.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:310)
at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
at com.bea.core.repackaged.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:131)
at com.bea.core.repackaged.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:119)
at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at com.bea.core.repackaged.springframework.jee.spi.MethodInvocationVisitorImpl.visit(MethodInvocationVisitorImpl.java:37)
at weblogic.ejb.container.injection.EnvironmentInterceptorCallbackImpl.callback(EnvironmentInterceptorCallbackImpl.java:54)
at com.bea.core.repackaged.springframework.jee.spi.EnvironmentInterceptor.invoke(EnvironmentInterceptor.java:50)
at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at com.bea.core.repackaged.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:89)
at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at com.bea.core.repackaged.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:131)
at com.bea.core.repackaged.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:119)
at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at com.bea.core.repackaged.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at $Proxy359.retryTasksx(Unknown Source)
at Thor.API.Operations.tcProvisioningOperationsIntfEJB_4xftoh_tcProvisioningOperationsIntfRemoteImpl.__WL_invoke(Unknown Source)
at weblogic.ejb.container.internal.SessionRemoteMethodInvoker.invoke(SessionRemoteMethodInvoker.java:40)
at Thor.API.Operations.tcProvisioningOperationsIntfEJB_4xftoh_tcProvisioningOperationsIntfRemoteImpl.retryTasksx(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at weblogic.ejb.container.internal.RemoteBusinessIntfProxy.invoke(RemoteBusinessIntfProxy.java:85)
at $Proxy174.retryTasksx(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:198)
at $Proxy345.retryTasksx(Unknown Source)
at Thor.API.Operations.tcProvisioningOperationsIntfDelegate.retryTasks(Unknown Source)
at com.thortech.xl.webclient.actions.ResourceProfileProvisioningTasksAction.retryTasks(ResourceProfileProvisioningTasksAction.java:702)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.struts.actions.DispatchAction.dispatchMethod(DispatchAction.java:269)
at com.thortech.xl.webclient.actions.tcLookupDispatchAction.execute(tcLookupDispatchAction.java:133)
at com.thortech.xl.webclient.actions.tcActionBase.execute(tcActionBase.java:894)
at com.thortech.xl.webclient.actions.tcAction.execute(tcAction.java:213)
at org.apache.struts.chain.commands.servlet.ExecuteAction.execute(ExecuteAction.java:58)
at org.apache.struts.chain.commands.AbstractExecuteAction.execute(AbstractExecuteAction.java:67)
at org.apache.struts.chain.commands.ActionCommandBase.execute(ActionCommandBase.java:51)
at org.apache.commons.chain.impl.ChainBase.execute(ChainBase.java:191)
at org.apache.commons.chain.generic.LookupCommand.execute(LookupCommand.java:305)
at org.apache.commons.chain.impl.ChainBase.execute(ChainBase.java:191)
at org.apache.struts.chain.ComposableRequestProcessor.process(ComposableRequestProcessor.java:283)
at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1913)
at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:462)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStub
I am not sure why this exception is happening while provisioning the user.
I have already uploaded the four recommended JARs into the classpath.
Any help regarding this issue is highly appreciated.
this might help you:
Re: OIM and Google Apps
--nayan -
Can't Provision user from OIM to AD (manual provisioning)
I can't provision a user from OIM to AD (manual provisioning); it fails with the error below.
The following is the Connector Server log:
==========================================
DateTime=2012-07-18T08:39:32.8713100Z
ConnectorServer.exe Error: 0 : System.ArgumentNullException: Value cannot be null.
Parameter name: Parameter 'uid' must not be null.
at Org.IdentityConnectors.Common.Assertions.NullCheck(Object o, String param)
at Org.IdentityConnectors.Framework.Impl.Api.Local.Operations.UpdateImpl.ValidateInput(ObjectClass objclass, Uid uid, ICollection`1 attrs, Boolean isDelta) in c:\ADE\aime_icf\icf\framework\dotnet\FrameworkInternal\ApiLocalOperations.cs:line 1568
at Org.IdentityConnectors.Framework.Impl.Api.Local.Operations.UpdateImpl.Update(ObjectClass objclass, Uid uid, ICollection`1 replaceAttributes, OperationOptions options) in c:\ADE\aime_icf\icf\framework\dotnet\FrameworkInternal\ApiLocalOperations.cs:line 1365
at Org.IdentityConnectors.Framework.Impl.Api.Local.Operations.ConnectorAPIOperationRunnerProxy.Invoke(Object proxy, MethodInfo method, Object[] args) in c:\ADE\aime_icf\icf\framework\dotnet\FrameworkInternal\ApiLocalOperations.cs:line 244
at ___proxy1.Update(ObjectClass , Uid , ICollection`1 , OperationOptions )
at Org.IdentityConnectors.Framework.Impl.Server.ConnectionProcessor.ProcessOperationRequest(OperationRequest request) in c:\ADE\aime_icf\icf\framework\dotnet\FrameworkInternal\Server.cs:line 609
DateTime=2012-07-18T08:39:37.8558126Z
1. I am using OIM 11.1.1.5 and have applied patch p13704894_111150.
2. The target system is LDAP on Windows Server 2008 R2 Enterprise, version 6.1 (7601), Service Pack 1.
3. The connector and Connector Server versions are activedirectory-11.1.1.5.0 and Connector_Server_111150.
I noticed that for any user I create in OIM, objectGUID is 0; I can read groups and organizations from LDAP with no errors.
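One way to sanity-check the "objectGUID is 0" symptom is to decode the raw attribute bytes yourself: objectGUID is a 16-byte binary value, and AD stores its first three groups in little-endian order, so a naive decode can come out as garbage or zeros. A minimal Java sketch (illustrative only, not connector code), assuming you have already fetched the raw byte[] from LDAP:

```java
public class ObjectGuidDecoder {

    // Decode AD's 16-byte objectGUID into its canonical string form.
    // AD stores the first three groups little-endian (unlike RFC 4122 order).
    static String toGuidString(byte[] b) {
        if (b == null || b.length != 16) {
            throw new IllegalArgumentException("objectGUID must be 16 bytes");
        }
        return String.format(
            "%02x%02x%02x%02x-%02x%02x-%02x%02x-%02x%02x-%02x%02x%02x%02x%02x%02x",
            b[3] & 0xFF, b[2] & 0xFF, b[1] & 0xFF, b[0] & 0xFF, // group 1, little-endian
            b[5] & 0xFF, b[4] & 0xFF,                           // group 2, little-endian
            b[7] & 0xFF, b[6] & 0xFF,                           // group 3, little-endian
            b[8] & 0xFF, b[9] & 0xFF,                           // group 4, big-endian
            b[10] & 0xFF, b[11] & 0xFF, b[12] & 0xFF,
            b[13] & 0xFF, b[14] & 0xFF, b[15] & 0xFF);
    }

    public static void main(String[] args) {
        byte[] raw = {
            0x01, 0x23, 0x45, 0x67, (byte) 0x89, (byte) 0xab, (byte) 0xcd, (byte) 0xef,
            0x01, 0x23, 0x45, 0x67, (byte) 0x89, (byte) 0xab, (byte) 0xcd, (byte) 0xef
        };
        System.out.println(toGuidString(raw)); // 67452301-ab89-efcd-0123-456789abcdef
    }
}
```

Note that when fetching the attribute over JNDI you should request it as binary (via the `java.naming.ldap.attributes.binary` environment property); if it is read back as a Java String it can be corrupted, which is one common cause of a zeroed GUID.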
Please support.
This issue is occurring because your objectGUID is not getting synchronized properly. Log in to the Design Console and open the AD User form. Go to the Pre-Populate tab and open the pre-populate adapter for User Principal Name. By default the IT resource name passed is "Active Directory", whereas it should be your IT resource name, which by default is "AD Server". In the Map To section select Process Data; the Qualifier field will then show AD Server. Click the Save button and save your form.
Retry your test case now. This will resolve your problem.
regards,
GP
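The wiring GP describes matters because the pre-populate adapter builds the User Principal Name from data held on the IT resource: if the adapter is bound to an IT resource name that does not exist in your environment ("Active Directory" instead of "AD Server"), it resolves no domain and the field comes out empty. A minimal, hypothetical Java sketch of the logic such an adapter performs (the names are illustrative, not the connector's actual adapter code):

```java
public class UpnPrepopSketch {

    // Hypothetical stand-in for UPN pre-population: combine the user's
    // login with the domain configured on the AD IT resource.
    static String buildUpn(String userLogin, String upnDomain) {
        if (userLogin == null || userLogin.isEmpty()) {
            throw new IllegalArgumentException("user login is required");
        }
        if (upnDomain == null || upnDomain.isEmpty()) {
            // Mirrors the failure mode above: a wrong IT resource binding
            // means the adapter receives no domain at all.
            throw new IllegalStateException("no domain resolved from IT resource");
        }
        return userLogin + "@" + upnDomain;
    }

    public static void main(String[] args) {
        System.out.println(buildUpn("jdoe", "example.com")); // jdoe@example.com
    }
}
```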