Issue with Database table adapter
Hi,
I am working on OIM 9.1.0.1.
I log in as an end user and self-request the DB table resource which is configured.
The user can assign it, and xelsysadm approves.
The approval task shows as completed on the End User page, but in the Task Assignment History the task status is Pending.
Can anyone help me out on this issue?
The error seen on the console:
ERROR,19 Feb 2011 16:00:01,421,[XELLERATE.ACCOUNTMANAGEMENT],Class/Method: tcUserOperationsBean/updateUserData encounter some problems: maoErrors:An error occurred while retrieving process information null
ERROR,19 Feb 2011 16:00:01,437,[XELLERATE.SCHEDULER.TASK],Class/Method: tcTskUsrProvision/checkPolicyUpdateFlags encounter some problems: An error occurred while retrieving process information null
Thor.API.Exceptions.tcAPIException: An error occurred while retrieving process information null
        at com.thortech.xl.ejb.beansimpl.tcUserOperationsBean.updateUserData(Unknown Source)
        at com.thortech.xl.ejb.beansimpl.tcUserOperationsBean.updateUser(Unknown Source)
        at com.thortech.xl.ejb.beans.tcUserOperationsSession.updateUser(Unknown Source)
        at com.thortech.xl.ejb.beans.tcUserOperations_voj9p2_EOImpl.updateUser(tcUserOperations_voj9p2_EOImpl.java:4747)
        at Thor.API.Operations.tcUserOperationsClient.updateUser(Unknown Source)
        at sun.reflect.GeneratedMethodAccessor320.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at Thor.API.Base.SecurityInvocationHandler$1.run(Unknown Source)
        at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
        at weblogic.security.service.SecurityManager.runAs(Unknown Source)
        at weblogic.security.Security.runAs(Security.java:41)
        at Thor.API.Security.LoginHandler.weblogicLoginSession.runAs(Unknown Source)
        at Thor.API.Base.SecurityInvocationHandler.invoke(Unknown Source)
        at $Proxy51.updateUser(Unknown Source)
        at com.thortech.xl.schedule.tasks.tcTskUsrProvision.checkPolicyUpdateFlags(Unknown Source)
        at com.thortech.xl.schedule.tasks.tcTskUsrProvision.execute(Unknown Source)
        at com.thortech.xl.scheduler.tasks.SchedulerBaseTask.run(Unknown Source)
        at com.thortech.xl.scheduler.core.quartz.QuartzWrapper$TaskExecutionAction.run(Unknown Source)
        at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
        at weblogic.security.service.SecurityManager.runAs(Unknown Source)
        at weblogic.security.Security.runAs(Security.java:41)
        at Thor.API.Security.LoginHandler.weblogicLoginSession.runAs(Unknown Source)
        at com.thortech.xl.scheduler.core.quartz.QuartzWrapper.execute(Unknown Source)
        at org.quartz.core.JobRunShell.run(JobRunShell.java:178)
        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:477)
Running GENERICADAPTER
Target Class = com.thortech.xl.gc.runtime.GCAdapterLibrary
A new error appears now.
The Create User task is Rejected, with the following error on the console:
Running GENERICADAPTER
Target Class = com.thortech.xl.gc.runtime.GCAdapterLibrary
ERROR,21 Feb 2011 17:45:42,171,[XELLERATE.APIS],
java.lang.NullPointerException
        at com.thortech.xl.ejb.beansimpl.GCOperationsBean.getModelFromConnectorDefinition(Unknown Source)
        at com.thortech.xl.ejb.beansimpl.GCOperationsBean.lookup(Unknown Source)
        at com.thortech.xl.ejb.beans.GCOperationsSession.lookup(Unknown Source)
        at com.thortech.xl.ejb.beans.GCOperations_do1ndy_EOImpl.lookup(GCOperations_do1ndy_EOImpl.java:576)
        at Thor.API.Operations.GCOperationsClient.lookup(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at Thor.API.Base.SecurityInvocationHandler$1.run(Unknown Source)
        at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
        at weblogic.security.service.SecurityManager.runAs(Unknown Source)
        at weblogic.security.Security.runAs(Security.java:41)
        at Thor.API.Security.LoginHandler.weblogicLoginSession.runAs(Unknown Source)
        at Thor.API.Base.SecurityInvocationHandler.invoke(Unknown Source)
        at $Proxy79.lookup(Unknown Source)
        at com.thortech.xl.gc.runtime.GCAdapterLibrary.executeFunctionality(Unknown Source)
        at sun.reflect.GeneratedMethodAccessor373.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at com.thortech.xl.adapterGlue.ScheduleItemEvents.adpMY_NEWTESTUSERS_GTC.GENERICADAPTER(adpMY_NEWTESTUSERS_GTC.java:125)
        at com.thortech.xl.adapterGlue.ScheduleItemEvents.adpMY_NEWTESTUSERS_GTC.implementation(adpMY_NEWTESTUSERS_GTC.java:70)
        at com.thortech.xl.client.events.tcBaseEvent.run(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.runEvent(Unknown Source)
        at com.thortech.xl.dataobj.tcScheduleItem.runMilestoneEvent(Unknown Source)
        at com.thortech.xl.dataobj.tcScheduleItem.eventPostInsert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.insert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcORC.insertNonConditionalMilestones(Unknown Source)
        at com.thortech.xl.dataobj.tcORC.completeSystemValidationMilestone(Unknown Source)
        at com.thortech.xl.dataobj.tcOrderItemInfo.completeCarrierBaseMilestone(Unknown Source)
        at com.thortech.xl.dataobj.tcOrderItemInfo.eventPostInsert(Unknown Source)
        at com.thortech.xl.dataobj.tcUDProcess.eventPostInsert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.insert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcTableDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcORC.autoDOBSave(Unknown Source)
        at com.thortech.xl.dataobj.util.tcOrderPackages.createOrder(Unknown Source)
        at com.thortech.xl.dataobj.util.tcOrderPackages.orderPackageForUser(Unknown Source)
        at com.thortech.xl.dataobj.tcOIU.provision(Unknown Source)
        at com.thortech.xl.dataobj.tcOIU.eventPostInsert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.insert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcTableDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcOBI.checkApproved(Unknown Source)
        at com.thortech.xl.dataobj.tcOBI.eventPostUpdate(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.update(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcTableDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcOBI.approve(Unknown Source)
        at com.thortech.xl.dataobj.tcRequestObject.handleApprovalLaunch(Unknown Source)
        at com.thortech.xl.dataobj.tcREQ.launchObjectApprovals(Unknown Source)
        at com.thortech.xl.dataobj.tcREQ.eventPostUpdate(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.update(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcTableDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcREQ.checkRequestReceived(Unknown Source)
        at com.thortech.xl.dataobj.tcREQ.eventPostUpdate(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.update(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcTableDataObj.save(Unknown Source)
        at com.thortech.xl.schedule.jms.requestapproval.InitRequestApproval.execute(Unknown Source)
        at com.thortech.xl.schedule.jms.requestapproval.InitRequestApproval.execute(Unknown Source)
        at com.thortech.xl.schedule.jms.messagehandler.MessageProcessUtil.processMessage(Unknown Source)
        at com.thortech.xl.schedule.jms.messagehandler.MessageHandlerMDB.onMessage(Unknown Source)
        at weblogic.ejb.container.internal.MDListener.execute(MDListener.java:466)
        at weblogic.ejb.container.internal.MDListener.transactionalOnMessage(MDListener.java:371)
        at weblogic.ejb.container.internal.MDListener.onMessage(MDListener.java:327)
        at weblogic.jms.client.JMSSession.onMessage(JMSSession.java:4547)
        at weblogic.jms.client.JMSSession.execute(JMSSession.java:4233)
        at weblogic.jms.client.JMSSession.executeMessage(JMSSession.java:3709)
        at weblogic.jms.client.JMSSession.access$000(JMSSession.java:114)
        at weblogic.jms.client.JMSSession$UseForRunnable.run(JMSSession.java:5058)
        at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:516)
        at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
        at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
ERROR,21 Feb 2011 17:45:42,250,[XELLERATE.APIS],An exception occurred while creating the GenericConenctor Model from the Connector Definition file
ERROR,21 Feb 2011 17:45:42,250,[XELLERATE.APIS],An exception occurred while generating Generic Connector model from the connector definition xml of the connector 'MY_NEWTESTUSERS'
com.thortech.xl.gc.exception.ConnectorDefinitionOperationsException: An exception occurred while creating the GenericConenctor Model from the Connector Definition file
        at com.thortech.xl.ejb.beansimpl.GCOperationsBean.getModelFromConnectorDefinition(Unknown Source)
        at com.thortech.xl.ejb.beansimpl.GCOperationsBean.lookup(Unknown Source)
        at com.thortech.xl.ejb.beans.GCOperationsSession.lookup(Unknown Source)
        at com.thortech.xl.ejb.beans.GCOperations_do1ndy_EOImpl.lookup(GCOperations_do1ndy_EOImpl.java:576)
        at Thor.API.Operations.GCOperationsClient.lookup(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at Thor.API.Base.SecurityInvocationHandler$1.run(Unknown Source)
        at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
        at weblogic.security.service.SecurityManager.runAs(Unknown Source)
        at weblogic.security.Security.runAs(Security.java:41)
        at Thor.API.Security.LoginHandler.weblogicLoginSession.runAs(Unknown Source)
        at Thor.API.Base.SecurityInvocationHandler.invoke(Unknown Source)
        at $Proxy79.lookup(Unknown Source)
        at com.thortech.xl.gc.runtime.GCAdapterLibrary.executeFunctionality(Unknown Source)
        at sun.reflect.GeneratedMethodAccessor373.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at com.thortech.xl.adapterGlue.ScheduleItemEvents.adpMY_NEWTESTUSERS_GTC.GENERICADAPTER(adpMY_NEWTESTUSERS_GTC.java:125)
        at com.thortech.xl.adapterGlue.ScheduleItemEvents.adpMY_NEWTESTUSERS_GTC.implementation(adpMY_NEWTESTUSERS_GTC.java:70)
        at com.thortech.xl.client.events.tcBaseEvent.run(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.runEvent(Unknown Source)
        at com.thortech.xl.dataobj.tcScheduleItem.runMilestoneEvent(Unknown Source)
        at com.thortech.xl.dataobj.tcScheduleItem.eventPostInsert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.insert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcORC.insertNonConditionalMilestones(Unknown Source)
        at com.thortech.xl.dataobj.tcORC.completeSystemValidationMilestone(Unknown Source)
        at com.thortech.xl.dataobj.tcOrderItemInfo.completeCarrierBaseMilestone(Unknown Source)
        at com.thortech.xl.dataobj.tcOrderItemInfo.eventPostInsert(Unknown Source)
        at com.thortech.xl.dataobj.tcUDProcess.eventPostInsert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.insert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcTableDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcORC.autoDOBSave(Unknown Source)
        at com.thortech.xl.dataobj.util.tcOrderPackages.createOrder(Unknown Source)
        at com.thortech.xl.dataobj.util.tcOrderPackages.orderPackageForUser(Unknown Source)
        at com.thortech.xl.dataobj.tcOIU.provision(Unknown Source)
        at com.thortech.xl.dataobj.tcOIU.eventPostInsert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.insert(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcTableDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcOBI.checkApproved(Unknown Source)
        at com.thortech.xl.dataobj.tcOBI.eventPostUpdate(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.update(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcTableDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcOBI.approve(Unknown Source)
        at com.thortech.xl.dataobj.tcRequestObject.handleApprovalLaunch(Unknown Source)
        at com.thortech.xl.dataobj.tcREQ.launchObjectApprovals(Unknown Source)
        at com.thortech.xl.dataobj.tcREQ.eventPostUpdate(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.update(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcTableDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcREQ.checkRequestReceived(Unknown Source)
        at com.thortech.xl.dataobj.tcREQ.eventPostUpdate(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.update(Unknown Source)
        at com.thortech.xl.dataobj.tcDataObj.save(Unknown Source)
        at com.thortech.xl.dataobj.tcTableDataObj.save(Unknown Source)
        at com.thortech.xl.schedule.jms.requestapproval.InitRequestApproval.execute(Unknown Source)
        at com.thortech.xl.schedule.jms.requestapproval.InitRequestApproval.execute(Unknown Source)
        at com.thortech.xl.schedule.jms.messagehandler.MessageProcessUtil.processMessage(Unknown Source)
        at com.thortech.xl.schedule.jms.messagehandler.MessageHandlerMDB.onMessage(Unknown Source)
        at weblogic.ejb.container.internal.MDListener.execute(MDListener.java:466)
        at weblogic.ejb.container.internal.MDListener.transactionalOnMessage(MDListener.java:371)
        at weblogic.ejb.container.internal.MDListener.onMessage(MDListener.java:327)
        at weblogic.jms.client.JMSSession.onMessage(JMSSession.java:4547)
        at weblogic.jms.client.JMSSession.execute(JMSSession.java:4233)
        at weblogic.jms.client.JMSSession.executeMessage(JMSSession.java:3709)
        at weblogic.jms.client.JMSSession.access$000(JMSSession.java:114)
        at weblogic.jms.client.JMSSession$UseForRunnable.run(JMSSession.java:5058)
        at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:516)
        at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
        at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
Caused by: java.lang.NullPointerException
        ... 76 more
ERROR,21 Feb 2011 17:45:42,312,[XELLERATE.GC.FRAMEWORKPROVISIONING],An exception occurred while generating Generic Connector model from the connector definition xml of the connector 'MY_NEWTESTUSERS'
Similar Messages
-
Rebate related issue with database table VKDFS & VBAK
Hi everybody,
I am facing a problem with the tables VKDFS and VBAK.
In my program, the report has to display the details of the agreement numbers for the sales or billing documents; later on, it has to create a credit memo for that particular customer.
At the very beginning, the program fetches all sales documents from VKDFS according to the selections, like the following:
select * from vkdfs into table ivkdfs
where fktyp in r_fktyp
and vkorg in s_vkorg
and fkdat in s_fkdat
and kunnr in s_kunnr
and fkart in s_fkart
and vbeln in s_vbeln
and faksk in s_faksk
and vtweg in s_vtweg
and spart in s_spart
and netwr in s_netwr
and waerk in s_waerk.
After this, for all the sales orders fetched here, it fetches again from the VBAK table as follows:
SVBAK[] = IVKDFS[]
select * from vbak into table ivbak
for all entries in svbak
where vbeln = svbak-vbeln
and knuma in s_knuma
and auart in s_auart
and submi in s_submi
and (vbak_wtab).
So it filters from VBAK.
The exact issue is that there is one sales order which is available in VBAK but not available in the VKDFS table.
So my program fails to display the report for that agreement number.
As per my analysis, there are no entries in the VKDFS table for the sales orders in VBAK concerning those agreement numbers.
VKDFS is the SD index: billing initiator table.
I want to know how this VKDFS table is updated with respect to VBAK. If possible, how can I make this entry in that table for the values in VBAK, without affecting other tables?
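The first diagnostic step for the mismatch described above is finding exactly which sales orders exist in VBAK but not in VKDFS. Here is a minimal sketch of that anti-join using Python with sqlite3; the mini-schema and document numbers are invented for illustration and stand in for the real SAP tables:

```python
import sqlite3

# Illustrative mini-schema: vbak holds sales documents, vkdfs the billing
# index. Columns are simplified stand-ins, not the real SAP structures.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE vbak  (vbeln TEXT PRIMARY KEY);
    CREATE TABLE vkdfs (vbeln TEXT PRIMARY KEY);
    INSERT INTO vbak  VALUES ('0001'), ('0002'), ('0003');
    INSERT INTO vkdfs VALUES ('0001'), ('0003');  -- '0002' never made it
""")

# Anti-join: sales orders present in VBAK but missing from VKDFS.
missing = [r[0] for r in con.execute("""
    SELECT v.vbeln FROM vbak v
    WHERE NOT EXISTS (SELECT 1 FROM vkdfs k WHERE k.vbeln = v.vbeln)
""")]
print(missing)  # ['0002']
```

The same NOT EXISTS shape works as a plain SELECT in the database behind the report, which narrows the problem to why those specific documents were skipped by the billing index update.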
Please let me know the solution if you have any.
It is an urgent, severity-1 ticket.
Eagerly waiting for a solution or some information.
Thanks & Regards.
J. -
Hello Gurus..... ISSUE with child Table update
I have an issue with a child table update.
I have created a GTC with one parent table and two child tables. I am able to update the parent table and the values are found in the DB, but the issue is that the child table values are not updating the DB.
Please give me a solution.
Regards,
Srikanth

If you are keeping referential integrity in the database, not in the application, it is easy to find the child and parent tables. Here is a quick and dirty query. You can join this to dba_cons_columns to find out on which columns the referential constraints are defined. This lists all child-parent tables, including the SYS and SYSTEM users. You can run this for specific users, of course.
select cons1.owner child_owner,cons1.table_name child_table,
cons2.owner parent_owner,cons2.table_name parent_table
from dba_constraints cons1,dba_constraints cons2
where cons1.constraint_type='R'
and cons1.r_constraint_name=cons2.constraint_name; -
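The child/parent discovery idea in the reply above can be sketched outside Oracle as well; below is a minimal Python/sqlite3 analogue of the dba_constraints query, walking each table's foreign keys. The schema and names are invented for illustration, not taken from the post:

```python
import sqlite3

# Hypothetical in-memory schema standing in for the Oracle example;
# table and column names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE parent_tbl (id INTEGER PRIMARY KEY);
    CREATE TABLE child_tbl (
        id INTEGER PRIMARY KEY,
        parent_id INTEGER REFERENCES parent_tbl(id)
    );
""")

def child_parent_pairs(con):
    """Return (child_table, child_column, parent_table, parent_column) tuples."""
    pairs = []
    tables = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for t in tables:
        # Each row describes one column of one foreign key defined on table t.
        for fk in con.execute(f"PRAGMA foreign_key_list({t})"):
            # fk = (id, seq, parent_table, from_col, to_col, ...)
            pairs.append((t, fk[3], fk[2], fk[4]))
    return pairs

print(child_parent_pairs(con))  # [('child_tbl', 'parent_id', 'parent_tbl', 'id')]
```

In Oracle the same information comes from joining dba_constraints (constraint_type = 'R') to dba_cons_columns, as the reply says; the sketch just shows the shape of the output you are after.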
Issue with Receiver SOAP adapter for synchronous scenario
Hello All,
We are facing a strange issue with the SOAP adapter in the interface we have setup. This is the 1st time we are using SOAP adapter in our system (PI 7.11 SP7). We are making a synchronous HTTP call to the web service exposed by another system in our landscape. The payload is send with SOAP envelope and there are no credentials to be maintained in PI settings.
The issue is that we always get a timeout exception in the PI audit logs after sending the request (3 minutes, the standard timeout value; no additional configuration for this). But the target system has confirmed that they are sending the response back. We tested from our server at OS level and received the response back on the same screen (to verify there is no firewall/port issue between the systems). But when tried from RWB, it always gives the timeout exception and we are not able to see any other log.
We have tried checking in the NWA logs as well after increasing the logging level to ALL for com.sap.aii.adapter.soap. But surprisingly, we didn't get any logs at all for the outgoing SOAP call or incoming response and hence we are unable to trace the issue.
We have setup another synchronous inbound SOAP interface (PI exposing the webservice) and it is working fine. We are also able to trace the logs in both audit log and NWA logs.
Is there anywhere else we can check for logs? The audit log shows a timeout error and we are not able to see anything in the NWA logs.
Does the target system need to maintain PI credentials in the header when they send the synchronous response back?
Are there any specific settings which should be checked to enable the sync communication? (this should not be the case since the inbound interface is working fine)
Please help.
Thanks
Justin

Hi Amit,
Thanks for the reply.
Yes we had tested successfully via SOAP UI as well (forgot to mention that). We are getting back the expected response in SOAP UI without using any credentials. We got the same response when we tested it through OS commands from PI server.
The WS is hosted by the target system and they haven't maintained any credentials at their end. So when PI is trying to access, we don't need to provide any credentials. My question is, whether the target system should keep any credentials to send the synchronous response back to PI (java stack). We have tried that as well but since there aren't any logs, we are unable to verify whether the credentials are coming correctly.
The service interfaces are correct and PI configuration are OK. I will try the XPI inspector for logs as you have suggested.
Thanks
Justin -
Error binding table components with database tables
Hi, when I try to bind table components with database tables I receive this error:
java.lang.NullPointerException
at com.sun.sql.rowset.CachedRowSetXImpl.initMetaData(CachedRowSetXImpl.java:861)
at com.sun.sql.rowset.CachedRowSetXImpl.getMetaData(CachedRowSetXImpl.java:2336)
at com.sun.data.provider.impl.CachedRowSetDataProvider.getMetaData(CachedRowSetDataProvider.java:1317)
at com.sun.data.provider.impl.CachedRowSetDataProvider.getFieldKeys(CachedRowSetDataProvider.java:489)
at com.sun.rave.web.ui.component.table.TableRowGroupDesignState.resetTableColumns(TableRowGroupDesignState.java:261)
at com.sun.rave.web.ui.component.table.TableRowGroupDesignState.setDataProviderBean(TableRowGroupDesignState.java:163)
at com.sun.rave.web.ui.component.table.TableDesignState.setDataProviderBean(TableDesignState.java:250)
at com.sun.rave.web.ui.component.TableDesignInfo.linkBeans(TableDesignInfo.java:162)
at com.sun.rave.insync.models.FacesModel.linkBeans(FacesModel.java:1042)
at com.sun.rave.designer.DndHandler.processLinks(DndHandler.java:2126)
at com.sun.rave.designer.DndHandler.importBean(DndHandler.java:880)
at com.sun.rave.designer.DndHandler.importItem(DndHandler.java:702)
at com.sun.rave.designer.DndHandler.importDataDelayed(DndHandler.java:376)
at com.sun.rave.designer.DndHandler.access$000(DndHandler.java:114)
[catch] at com.sun.rave.designer.DndHandler$1.run(DndHandler.java:298)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:209)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:461)
at java.awt.EventDispatchThread.pumpOneEventForHierarchy(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:163)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:157)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:149)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:110)

There is one thing that I don't understand. When I bind a sample database table (Travel, for example) to table components, there are no errors. But when I try to bind the table component to my remote PostgreSQL database table, the table component disappears and only a "Table" string is displayed. Is this error because I use a remote database? Or because it's PostgreSQL?
Thanks. -
Issue with Advanced Table: Add Row / New Row does not work in some scenarios.
Hi,
Wondering if there is an issue with Advanced Tables where no rows get created. I don't know if anyone has tried this or not. I have one OA Page with an Advanced Table and a button that, when clicked, opens a new OA Page in a pop-up window. The pop-up page contains one textbox where you enter data, and this gets saved in one of the VO's transient attributes. Now, on the base page, if you don't click the button to open the pop-up page, you can add new rows in the Advanced Table by clicking the Add Row button. But as soon as you open a pop-up window and close it, the Add New Rows button no longer works and no new rows are created. Basically the page stops working. Both the pop-up and the base page share the same AM but have different controllers.
The pop-up page is a custom page that I open by giving the Destination URI value in the button item, with target frame _blank.
I have even tried creating rows programmatically for the Advanced Table, but this too doesn't work once you open a pop-up. Also, I have used pageContext.putTransactionValue in the pop-up page and am checking and removing it in the base page.
Any help is appreciated.
Thanks

anyone?
-
Issue with JMS inbound Adapter and Asynchronous BPEL
Hi Gurus,
I am facing the below issue with the JMS inbound adapter and an asynchronous BPEL process.
I have 2 JMS queues, one inbound and one outbound. Each composite has multiple BPEL components, around 4 on average, and I have 4 composites like that.
In total: 4 composites * 4 BPEL components = 16 services.
Proposed solution:
I have used a MessageSelector in the JMS adapter for selecting the incoming message. The BPEL process gets invoked if the message selector matches, processes the message, and writes the response to the other queue.
Initially I had no problems, but intermittently the BPEL processes get invoked yet do not process the data (the BPEL process is supposed to invoke an external service and get the response in a sync call), and each BPEL process instance stays in the running state forever. This remains even if I restart the servers.
The message gets read by the JMS adapter and the BPEL instance gets created, but it won't proceed further and remains in the running state forever.
If I redeploy the composite, then messages get processed, but the issue creeps up again (I tried to check the logs but have no clue about the cause).
Getting frustrated day by day; I tried bpel.config.transaction, increased the JMS adapter threads, increased the worker threads, but all in vain.
Please let me know if anyone has faced a similar issue.
Anticipating a quick response from the gurus.
Lakshmi.

We are also facing this issue in 11.1.1.5.
Briefly, the issue is: the BPEL process which polls an inbound JMS adapter (consume_message) either stays in the running state forever (whatever we do) or goes to the recovery queue. It doesn't recover even if I try to recover it. This happens intermittently. Redeploying the application or restarting the servers sometimes solves the issue, but as you know we can't do that on production systems.
Can someone look into this with priority and help us with a solution or workaround? -
Performance issues with pipelined table functions
I am testing pipelined table functions to be able to re-use the <font face="courier">base_query</font> function. Contrary to my understanding, the <font face="courier">with_pipeline</font> procedure runs 6 times slower than the legacy <font face="courier">no_pipeline</font> procedure. Am I missing something? The <font face="courier">processor</font> function is from [url http://www.oracle-developer.net/display.php?id=429]improving performance with pipelined table functions .
Edit: The underlying query returns 500,000 rows in about 3 minutes, so there are no performance issues with the query itself.
Many thanks in advance.
CREATE OR REPLACE PACKAGE pipeline_example
IS
TYPE resultset_typ IS REF CURSOR;
TYPE row_typ IS RECORD (colC VARCHAR2(200), colD VARCHAR2(200), colE VARCHAR2(200));
TYPE table_typ IS TABLE OF row_typ;
FUNCTION base_query (argA IN VARCHAR2, argB IN VARCHAR2)
RETURN resultset_typ;
c_default_limit CONSTANT PLS_INTEGER := 100;
FUNCTION processor (
p_source_data IN resultset_typ,
p_limit_size IN PLS_INTEGER DEFAULT c_default_limit)
RETURN table_typ
PIPELINED
PARALLEL_ENABLE(PARTITION p_source_data BY ANY);
PROCEDURE with_pipeline (argA IN VARCHAR2,
argB IN VARCHAR2,
o_resultset OUT resultset_typ);
PROCEDURE no_pipeline (argA IN VARCHAR2,
argB IN VARCHAR2,
o_resultset OUT resultset_typ);
END pipeline_example;
CREATE OR REPLACE PACKAGE BODY pipeline_example
IS
FUNCTION base_query (argA IN VARCHAR2, argB IN VARCHAR2)
RETURN resultset_typ
IS
o_resultset resultset_typ;
BEGIN
OPEN o_resultset FOR
SELECT colC, colD, colE
FROM some_table
WHERE colA = ArgA AND colB = argB;
RETURN o_resultset;
END base_query;
FUNCTION processor (
p_source_data IN resultset_typ,
p_limit_size IN PLS_INTEGER DEFAULT c_default_limit)
RETURN table_typ
PIPELINED
PARALLEL_ENABLE(PARTITION p_source_data BY ANY)
IS
aa_source_data table_typ;-- := table_typ ();
BEGIN
LOOP
FETCH p_source_data
BULK COLLECT INTO aa_source_data
LIMIT p_limit_size;
EXIT WHEN aa_source_data.COUNT = 0;
/* Process the batch of (p_limit_size) records... */
FOR i IN 1 .. aa_source_data.COUNT
LOOP
PIPE ROW (aa_source_data (i));
END LOOP;
END LOOP;
CLOSE p_source_data;
RETURN;
END processor;
PROCEDURE with_pipeline (argA IN VARCHAR2,
argB IN VARCHAR2,
o_resultset OUT resultset_typ)
IS
BEGIN
OPEN o_resultset FOR
SELECT /*+ PARALLEL(t, 5) */ colC,
SUM (CASE WHEN colD > colE AND colE != '0' THEN colD / ColE END)de,
SUM (CASE WHEN colE > colD AND colD != '0' THEN colE / ColD END)ed,
SUM (CASE WHEN colD = colE AND colD != '0' THEN '1' END) de_one,
SUM (CASE WHEN colD = '0' OR colE = '0' THEN '0' END) de_zero
FROM TABLE (processor (base_query (argA, argB),100)) t
GROUP BY colC
ORDER BY colC;
END with_pipeline;
PROCEDURE no_pipeline (argA IN VARCHAR2,
argB IN VARCHAR2,
o_resultset OUT resultset_typ)
IS
BEGIN
OPEN o_resultset FOR
SELECT colC,
SUM (CASE WHEN colD > colE AND colE != '0' THEN colD / ColE END)de,
SUM (CASE WHEN colE > colD AND colD != '0' THEN colE / ColD END)ed,
SUM (CASE WHEN colD = colE AND colD != '0' THEN 1 END) de_one,
SUM (CASE WHEN colD = '0' OR colE = '0' THEN '0' END) de_zero
FROM (SELECT colC, colD, colE
FROM some_table
WHERE colA = ArgA AND colB = argB)
GROUP BY colC
ORDER BY colC;
END no_pipeline;
END pipeline_example;
ALTER PACKAGE pipeline_example COMPILE;
Edited by: Earthlink on Nov 14, 2010 9:47 AM (and subsequently through Nov 20, 2010 12:54 PM)

Earthlink wrote:
Contrary to my understanding, the <font face="courier">with_pipeline</font> procedure runs 6 times slower than the legacy <font face="courier">no_pipeline</font> procedure. Am I missing something?
Well, we're missing a lot here.
Like:
- a database version
- how did you test
- what data do you have, how is it distributed, indexed
and so on.
If you want to find out what's going on then use a TRACE with wait events.
All necessary steps are explained in these threads:
HOW TO: Post a SQL statement tuning request - template posting
http://oracle-randolf.blogspot.com/2009/02/basic-sql-statement-performance.html
Another nice one is RUNSTATS:
http://asktom.oracle.com/pls/asktom/ASKTOM.download_file?p_file=6551378329289980701 -
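The pipelined-vs-materialized distinction the thread is debating can be sketched with Python generators. This illustrates only the control-flow difference (rows streamed to the consumer versus the whole result set built first), not Oracle's actual performance characteristics, which is exactly why the reply asks for traces:

```python
# Toy stand-ins for a row source and the two processing styles.
def source(n):
    return range(n)

def processor_pipelined(rows):
    for r in rows:          # rows flow through without being stored
        yield r * 2

def processor_materialized(rows):
    out = []
    for r in rows:
        out.append(r * 2)   # whole result set held in memory first
    return out

# Both styles produce identical results; only memory/latency behavior differs.
assert list(processor_pipelined(source(5))) == processor_materialized(source(5))
```

Whether pipelining is faster depends on data volume, fetch cost, and parallelism, so the TRACE/RUNSTATS advice above is the right next step rather than assuming either style wins.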
Hi All,
We have an issue with lowercase data in a database table, and we want to convert that lowercase data to upper case. The code for this operation is as follows:
SELECT * FROM zehs_volumes INTO TABLE i_volumes_temp.

LOOP AT i_volumes_temp.
  DELETE zehs_volumes FROM i_volumes_temp.
  IF sy-subrc IS INITIAL.
    COMMIT WORK.
  ELSE.
    ROLLBACK WORK.
  ENDIF.
  TRANSLATE i_volumes_temp-zehs_business TO UPPER CASE.
  MODIFY i_volumes_temp.
  CLEAR i_volumes_temp.
ENDLOOP.

SORT i_volumes_temp BY subid zehs_business zehs_site zehs_role
                       zst_date zed_date.
DELETE ADJACENT DUPLICATES FROM i_volumes_temp
  COMPARING subid zehs_business zehs_site zehs_role zst_date zed_date.

MODIFY zehs_volumes FROM TABLE i_volumes_temp.
IF sy-subrc IS INITIAL.
  COMMIT WORK.
ELSE.
  ROLLBACK WORK.
ENDIF.
The problem with the above code is that it first deletes rows from the database table and only afterwards re-inserts the modified data from the internal table. We do not want to take the risk of emptying the database table. Is there any way to perform this operation safely?
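One way to avoid that delete-then-reinsert window is a sketch along these lines (table and field names are taken from the post; the extra internal table i_volumes_del is an assumption for illustration): build the uppercase rows first, write them with MODIFY, and only then delete the old lowercase keys, all in one LUW, so the table is never empty in between.

```abap
DATA: i_volumes_old TYPE TABLE OF zehs_volumes WITH HEADER LINE,
      i_volumes_new TYPE TABLE OF zehs_volumes WITH HEADER LINE,
      i_volumes_del TYPE TABLE OF zehs_volumes WITH HEADER LINE.

SELECT * FROM zehs_volumes INTO TABLE i_volumes_old.

LOOP AT i_volumes_old.
  i_volumes_new = i_volumes_old.
  TRANSLATE i_volumes_new-zehs_business TO UPPER CASE.
* Only rows whose key actually changes need a new record and a delete
  IF i_volumes_new-zehs_business <> i_volumes_old-zehs_business.
    APPEND i_volumes_new.
    APPEND i_volumes_old TO i_volumes_del.
  ENDIF.
ENDLOOP.

* Insert/overwrite the uppercase rows first, then remove only the old
* lowercase keys; no point in the LUW where the data is gone entirely
MODIFY zehs_volumes FROM TABLE i_volumes_new.
DELETE zehs_volumes FROM TABLE i_volumes_del.
COMMIT WORK.
```

If an uppercase duplicate ("PIONEER") already exists for a lowercase row ("Pioneer"), the MODIFY simply overwrites it, which also takes care of the duplicate-record situation described below.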
Thanks in advance
Regards,
Yogesh Sharma
Hi Karthik,
Thanks for replying.
In our case the scenario is that we need to delete wrong entries in the database table based on the primary input fields. There is already a utility to do so, but some of the entries in the table are in lowercase.
For example, suppose we need to delete the entry with field zehs_business = "Pioneer". When we input this, SAP interprets it as "PIONEER" and we get the error message "No data exist".
So the code should be written so that every value in the field zehs_business is translated to uppercase; otherwise SAP will report that the data does not exist.
The code you gave adds one more entry to the database table, with zehs_business = "PIONEER". The table then has two records, one with "Pioneer" and one with "PIONEER". After the DELETE statement executes, only the record with zehs_business = "PIONEER" gets deleted, and the other record still exists.
Is there any way to avoid this situation? -
Challenging issue with JDBC sender adapter
Hello Guys
I have this requirement
From an Oracle database I have to read two tables, one for the header and one for the details, and map the result to one RFC.
I have only worked with one table at a time with the JDBC sender adapter, never with two.
My challenges are:
1 - How can I read the two tables in the same SELECT statement? I suppose I have to use a join on two identical fields, so far so good, but how do I handle multiple records?
Suppose the result is 10 different new records; each of the records has to be mapped to the RFC, so we will have 10 RFC calls. How can I do the mapping in this case?
2 - How can I update the two tables at the same time and flag them as processed? As far as I know we cannot use two UPDATE statements in the same JDBC sender.
Any help will do.
Thanks in advance.
Hi
Thanks for the replies.
The RFC is used to create an invoice IDoc. It has to be one record (header and items) to one RFC.
A JDBC-to-IDoc scenario could also be used, but we have to update another table in SAP, which is why we use the RFC.
My doubt is how the data type for the sender JDBC should look, since, as mentioned, I have two tables to fetch data from:
<Invoice>
<Header> 1 -- 1
<f1>
<f2>
<f3>
</Header>
<Item> 1--n
<f1>
<f2>
<f3>
</Item>
</Invoice>
The sender JDBC returns
<Invoice>
<row>
<f1>
<f2>
<f3>
</row>
</Invoice>
How can I adapt my data type to match the JDBC sender structure?
And regarding the mapping, do I have to change the occurrence to 0..unbounded?
Regarding the update, Ragu is right in his suggestion:
Table1 can be used as the primary table and Table2 as the secondary table; you will have a key field to link both tables.
So updating only the primary table (Table1) will help here.
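As a sketch (table and column names here are invented for illustration), the sender channel's SELECT could join both tables on the shared key, yielding one row per item with the header columns repeated, and the UPDATE could flag only the primary (header) table:

```sql
-- SELECT query of the JDBC sender channel: one row per item,
-- header columns repeated on every row
SELECT h.invoice_no, h.inv_date, h.customer,
       i.item_no, i.material, i.quantity
  FROM invoice_header h
  JOIN invoice_item   i ON i.invoice_no = h.invoice_no
 WHERE h.processed = 'N'

-- UPDATE query of the same channel: flag only the primary table
UPDATE invoice_header SET processed = 'Y' WHERE processed = 'N'
```

In the mapping, the repeated header fields of the flat row structure (0..unbounded) can then be collapsed back into one Header node per invoice, with the item fields going to the 1..n Item node.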
Thanks. -
Hi All,
I have defined an internal table 'with occurs 0'.
While debugging, I observed that only 50,000 lines are populated in that internal table.
How to increase the capacity of the internal table, so that it can hold more number of lines?
Regards
Pavan
I would like to explain this issue in more detail.
I have defined an internal table as follows...
data: IT_CATSDB TYPE BAPICATS2 OCCURS 0 WITH HEADER LINE.
I have used a BAPI called "BAPI_CATIMESHEETRECORD_GETLIST", which retrieves the timesheet details of all employees within a given date range.
Now, the number of entries that we got from this BAPI are stored in the internal table it_catsdb.
I have observed that this internal table is holding max of 50k lines.
If I check the database table (CATSDB) with the same conditions in SE11, I get around 1 lakh (100,000) lines.
Hence my report is showing incorrect output.
Kindly help me out with some logic so that this internal table can hold as much data as possible without any restriction.
Regards
Pavan -
SQL Query : Order By issue with HUGE Table
Hello friends,
I have been struggling with an ORDER BY issue and would appreciate your help. Here is my case:
=> If I run the SELECT query it returns results quickly, in milliseconds (SQL Developer fetches 50 rows at a time).
=> If I run the SELECT query with a WHERE condition, the column in the WHERE condition (say A) is indexed, and the ORDER BY column (say B) is also indexed.
Now, here is the issue:
1. If the WHERE condition filters down to a small result set, the ORDER BY works fine: 1-5 seconds, which is good.
2. *If the WHERE condition yields a large result set, say more than 50,000 rows, then with the ORDER BY the wait time grows dramatically; I have waited 10+ minutes to get the result back for 120,000 records.*
Does ORDER BY really take that long for 100K records? I think something else is wrong. Your pointers would be really helpful; I am very new to SQL and even newer to working with large tables.
I am using SQL Developer Version 2.1.1.64
and Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
Thank you so much.
Yes, you are correct, but my focus was on the ORDER BY: it forces a sort of the full result set, which is why I mentioned it. I was also wondering whether millions of records in a table should really be an issue.
Any way for the explain plan , when just a value in the where changes there is the huge difference i want to point out too as below:
SELECT *
  FROM ees_evt
 WHERE aplc_evt_cd = 'ABC'
 ORDER BY cre_dttm DESC
execution time : 0.047 sec
Plan hash value: 290548126
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 27 | 14688 | 25 (4)| 00:00:01 |
| 1 | SORT ORDER BY | | 27 | 14688 | 25 (4)| 00:00:01 |
| 2 | TABLE ACCESS BY INDEX ROWID| EES_EVT | 27 | 14688 | 24 (0)| 00:00:01 |
|* 3 | INDEX RANGE SCAN | XIE1EES_EVT | 27 | | 4 (0)| 00:00:01 |
Predicate Information (identified by operation id):
3 - access("APLC_EVT_CD"='ABC')
Note
- SQL plan baseline "SYS_SQL_PLAN_6d41e6b91925c463" used for this statement
=============================================================================================
SELECT *
  FROM ees_evt
 WHERE aplc_evt_cd = 'XYZ'
 ORDER BY cre_dttm DESC
execution time : 898.672 sec.
Plan hash value: 290548126
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 121K| 62M| | 102K (1)| 00:11:02 |
| 1 | SORT ORDER BY | | 121K| 62M| 72M| 102K (1)| 00:11:02 |
| 2 | TABLE ACCESS BY INDEX ROWID| EES_EVT | 121K| 62M| | 88028 (1)| 00:09:27 |
|* 3 | INDEX RANGE SCAN | XIE1EES_EVT | 121K| | | 689 (1)| 00:00:05 |
Predicate Information (identified by operation id):
3 - access("APLC_EVT_CD"='XYZ')
Note
- SQL plan baseline "SYS_SQL_PLAN_ef5709641925c463" used for this statement
Also note this table contains 74,328 MB of data.
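One thing worth testing (the index name below is made up, and whether it pays off depends on how skewed APLC_EVT_CD is) is a composite index that returns the rows already in the required order, so the 72 MB SORT ORDER BY step disappears from the plan:

```sql
-- rows for one APLC_EVT_CD come back from the index already sorted
-- by CRE_DTTM DESC, letting the optimizer skip the big sort step
CREATE INDEX ees_evt_cd_dttm_ix
    ON ees_evt (aplc_evt_cd, cre_dttm DESC);
```

Note that much of the 898 seconds is likely the TABLE ACCESS BY INDEX ROWID over 121K scattered rows plus shipping 62 MB to the client, so it is also worth checking how much of the elapsed time is fetch/network rather than the sort itself.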
Thanks -
Performance issue with MSEG table
Hi all,
I need to fetch materials (MATNR) based on the service order number (AUFNR) in the selection screen, but there is a performance issue with this. How can I overcome it?
Regards ,
Amit
Hi,
There could be various reasons for performance issue with MSEG.
1) Database statistics of tables and indexes are not up to date; because of this a wrong index is chosen during execution.
2) Improper indexes: there is no index covering the fields in the WHERE clause of the statement, so the CBO may have chosen a wrong index and done a range scan.
3) An optimizer bug in Oracle.
4) The table is very large; consider archiving.
Better to switch on an ST05 trace before you run these statements; it will show in detail where exactly the time is being spent during execution.
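As a minimal sketch (the select-option name s_aufnr is assumed here), selecting only the needed field with the order number in the WHERE clause keeps the database work on MSEG as small as possible:

```abap
TYPES: BEGIN OF ty_mat,
         matnr TYPE mseg-matnr,
       END OF ty_mat.
DATA lt_mat TYPE STANDARD TABLE OF ty_mat.

* Fetch only MATNR instead of SELECT *, restricted by the order number
SELECT matnr
  FROM mseg
  INTO TABLE lt_mat
  WHERE aufnr IN s_aufnr.

SORT lt_mat BY matnr.
DELETE ADJACENT DUPLICATES FROM lt_mat COMPARING matnr.
```

If the ST05 trace shows a full scan or a poor index on MSEG for this access path, that points back to reasons 1) and 2) above.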
Hope this helps
dileep -
Issue with Multiple Tables in Report
Post Author: dwessell
CA Forum: General
Hi,
I'm using Crystal Reports 2k8.
I'm doing a report with three tables, CQ_HEADER, SO_HEADER and SALESPERSON. Both the CQ_HEADER and the SO_HEADER tables link to the SALESPERSON table via a SPN_AUTO_KEY field.
However, I always receive duplicates in my result set because of the joins, and I don't receive rows that are valid in one table but empty in the other (so a CQ is only counted if there is an SO associated with it). Here's the query produced by CR:
SELECT "CQ_HEADER"."CQ_NUMBER", "CQ_HEADER"."ENTRY_DATE", "CQ_HEADER"."TOTAL_PRICE", "SALESPERSON"."SALESPERSON_NAME", "SO_HEADER"."ENTRY_DATE", "SO_HEADER"."TOTAL_PRICE"
FROM "CQ_HEADER" "CQ_HEADER" INNER JOIN ("SO_HEADER" "SO_HEADER" INNER JOIN "SALESPERSON" "SALESPERSON" ON "SO_HEADER"."SPN_AUTO_KEY"="SALESPERSON"."SPN_AUTO_KEY") ON "CQ_HEADER"."SPN_AUTO_KEY"="SALESPERSON"."SPN_AUTO_KEY"
WHERE ("CQ_HEADER"."ENTRY_DATE">={ts '2007-12-01 00:00:00'} AND "CQ_HEADER"."ENTRY_DATE"<{ts '2007-12-18 00:00:00'}) AND ("SO_HEADER"."ENTRY_DATE">={ts '2007-12-01 00:00:00'} AND "SO_HEADER"."ENTRY_DATE"<{ts '2007-12-18 00:00:00'})
ORDER BY "SALESPERSON"."SALESPERSON_NAME"
There is no link between SO_HEADER and CQ_HEADER. Can anyone suggest how I could structure this so that it doesn't return duplicate values?
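One way to sketch this (column list shortened; whether you feed it in as a Crystal SQL command depends on your setup) is to aggregate each header table per salesperson first and join the two aggregates to SALESPERSON, so neither table multiplies the other's rows and a salesperson with only CQs still appears:

```sql
-- aggregate each table per salesperson before joining, so CQ and SO
-- rows cannot cross-multiply and one-sided salespeople still show up
SELECT sp.salesperson_name, cq.cq_total, so.so_total
  FROM salesperson sp
  LEFT JOIN (SELECT spn_auto_key, SUM(total_price) AS cq_total
               FROM cq_header
              WHERE entry_date >= {ts '2007-12-01 00:00:00'}
                AND entry_date <  {ts '2007-12-18 00:00:00'}
              GROUP BY spn_auto_key) cq ON cq.spn_auto_key = sp.spn_auto_key
  LEFT JOIN (SELECT spn_auto_key, SUM(total_price) AS so_total
               FROM so_header
              WHERE entry_date >= {ts '2007-12-01 00:00:00'}
                AND entry_date <  {ts '2007-12-18 00:00:00'}
              GROUP BY spn_auto_key) so ON so.spn_auto_key = sp.spn_auto_key
 ORDER BY sp.salesperson_name
```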
Thanks
David
Hey,
I understand you set the RetainSameConnection property to true for all the OLE DB connections used in the package; if not, make sure it is set for all connections, including the file connection.
Additionally, you can try setting the DelayValidation property to true for all the data flows and control flows and then run the package against the 10 MB file.
I hope this fixes the intermittent failure you are facing with SSIS.
(Please mark as solved if I've answered your question, and vote for it as helpful to help other users find a solution quicker.)
Thanks,
Atul Gaikwad. -
Hello,
I have an issue with my adaptive noise cancellation program. Essentially I want to input a custom WAV file, add noise to it, and then filter the noise away in order to recover the original WAV.
While doing so I want to read the learning curve and the adaptive coefficients. Unfortunately I have an issue when it comes to filtering the custom WAV plus noise: it won't filter the signal at all.
It would be helpful if someone could have a look at it and possibly help me out.
Thanks!
Attachments:
Testing.vi 59 KB
Hey Jan,
Thanks for the reply. I am currently using the Adaptive Filter Toolkit in order to obtain those VIs. The VIs which are in use are the LMS Adaptive Filter ones.
I figured there might be an error with the input of the array: the VI requires a DBL array but it seems it cannot process it.
The "Get Waveform Components" VI works better now, but I still have a timing issue: I feed in a 9-second WAV file, yet it computes in a very short time and I cannot play the file while it is computing.
I added the modified program to the attachments.
This program ought to read a waveform file, add noise to it, and then use an adaptive filter to recover the original waveform, and if possible either store or play the final waveform.
Thanks for your help.
Attachments:
Testing.vi 62 KB