FM datasource
Hello Friends,
I am facing a critical error with a function module (FM) DataSource.
I have an SAP R/3 program that returns data in an internal table, which feeds the FM DataSource.
An import/export mechanism then fills the data into the DataSource.
Now, S_S_IF-MAXSIZE is a standard parameter that is created with the FM for the DataSource.
When I try to load data into BW, S_S_IF-MAXSIZE takes a number and the loop runs up to that number.
When the loop runs a second time, its value becomes 0, so the loop doesn't execute again and I am only able to load a small amount of data into BW.
So is there any way I can manage S_S_IF-MAXSIZE so that the loop runs every time, appends data to E_T_DATA, and the data arrives in BW in several packets?
Right now it arrives as a single package in BW.
Thanks,
John.
You should use the RSAX_BIW_GET_DATA_SIMPLE template as an example to create your extractor. This template has the basic code that loads the data in several packages without problems; here you only have to change the selection and the input variables.
Obviously, if you use a program to extract the data you have a problem, because you are not using a cursor (a cursor has the advantage that it is never deleted from the first to the last package execution). In this case you could modify the control program of the function module (the function group): variables defined there behave like a cursor and exist from the first to the last execution. You could create an internal table there, fill your data into it, and read it package by package.
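The package-by-package behaviour described above can be sketched outside ABAP. The following Java illustration (all names are mine, not SAP's) mirrors the idea behind RSAX_BIW_GET_DATA_SIMPLE: the cursor state survives between calls, and each call returns at most MAXSIZE rows, with an empty package playing the role of RAISE no_more_data.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Illustrative sketch only -- not SAP code. State held in this object
// plays the role of the function group's global data (or an open DB
// cursor), which survives from the first to the last extractor call.
public class PackageExtractor {
    private final Iterator<String> cursor; // stands in for OPEN CURSOR
    private final int maxSize;             // stands in for S_S_IF-MAXSIZE

    public PackageExtractor(List<String> sourceRows, int maxSize) {
        this.cursor = sourceRows.iterator();
        this.maxSize = maxSize;
    }

    // One call = one invocation of the extractor function module:
    // append up to maxSize rows to the output table (E_T_DATA).
    // An empty result stands in for RAISE no_more_data.
    public List<String> nextPackage() {
        List<String> eTData = new ArrayList<>();
        while (eTData.size() < maxSize && cursor.hasNext()) {
            eTData.add(cursor.next());
        }
        return eTData;
    }
}
```

Calling nextPackage() repeatedly yields the data in several packets instead of one large package, which is what BW expects from the extractor.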
I hope this helps you
Message was edited by: Oscar Díaz
Similar Messages
-
Database connection timeouts and datasource errors
Connections in the pool randomly die overnight. Stack traces show that, for some reason, the Evermind driver is being used even though the MySQL connection pool is specified.
Also, the Evermind connection pool says connections aren't being closed, yet the stack traces show they're being allocated by entity beans that are definitely not left hanging around.
Sometimes we get non-serializable errors when trying to retrieve the datasource (this is only after the other errors start). Some connections returned from the pool are still good, so the application limps along.
EJBs and DAOs both use jdbc/SQLServerDSCore.
Has anyone seen this problem?
<data-sources>
<data-source
class="com.mysql.jdbc.jdbc2.optional.MysqlConnectionPoolDataSource"
name="SQLServerDSCore"
location="jdbc/SQLServerDSCore"
xa-location="jdbc/xa/SQLServerXACore"
ejb-location="jdbc/SQLServerDSCore"
connection-driver="com.mysql.jdbc.Driver"
min-connections="5"
username="xxx"
password="xxx"
staleness-timeout="3600"
alive-poll-query="SELECT 1 FROM medispan"
url="jdbc:mysql://1.2.3.4:3306/dbo?autoReconnect=true&amp;autoReconnectForPools=true&amp;cachePrepStmts=true&amp;is-connection-validation-required=true"
inactivity-timeout="30"
>
<property name="autoReconnect" value="true"/>
<property name="autoReconnectForPools" value="true"/>
<property name="is-connection-validation-required" value="true"/>
<property name="cachePrepStmts" value="true"/>
</data-source>
</data-sources>
Rick,
OC4J 9.0.4.0.0 - BTW, do you know of any patches?
As far as I know, there are no patches for the 9.0.4 production version of OC4J stand-alone.
I'm using container managed persistence.
It was not clear to me, from your previous post, that you are using CMP entity beans.
I found staleness-timeout and alive-poll-query somewhere on a website when trying to track this down. Here are four sources:
Those sources refer to OrionServer -- and an older version, too, it seems.
Like all other Oracle products that start out as somebody
else's -- including, for example, JBuilder (that became "JDeveloper"), Apache Web Server (that became "Oracle HTTP Server") and TopLink -- their development paths diverge, until, eventually, there is absolutely no similarity between them at all. Hence, the latest versions of OC4J and "OrionServer" are so different, that you cannot be sure that something that works for "OrionServer" will work for OC4J.
I recall reading something, somewhere, sometime about configuring OC4J to use different databases (other than Oracle), but I really don't remember any details (since it was not relevant to me, because we only use Oracle database). In any case, it is possible to use a non-Oracle database with OC4J.
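On the other symptom in this thread -- the pool complaining that connections aren't being closed -- that is usually a close() missed on an exception path in application code rather than a container setting. A minimal sketch of the exception-safe pattern, using a toy AutoCloseable in place of a real java.sql.Connection so it stays self-contained:

```java
// Demonstrates that try-with-resources closes a resource even when the
// body throws -- the usual fix for "connections aren't being closed".
// FakeConnection is a stand-in for a pooled java.sql.Connection.
public class CloseGuarantee {
    public static class FakeConnection implements AutoCloseable {
        public boolean closed = false;
        @Override
        public void close() {
            closed = true;
        }
    }

    // Simulates a DAO method whose query fails mid-flight; the
    // connection is still returned to the pool (closed) afterwards.
    public static FakeConnection useAndFail() {
        FakeConnection conn = new FakeConnection();
        try (FakeConnection c = conn) {
            throw new RuntimeException("query failed");
        } catch (RuntimeException e) {
            // swallowed for the demo; c.close() has already run
        }
        return conn;
    }
}
```

With real JDBC, the same shape -- lookup, try-with-resources (or a finally block on older JVMs), close on every path -- keeps the pool honest regardless of which driver the container hands back.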
Good Luck,
Avi. -
Possible to change the datasource from a business-view to a Sql Command ?
Hello,
When a business view contains a lot of elements, it takes a while just to open the report.
We'd like to keep the BV as the dictionary, but,
once the report design is completed,
we'd like to disconnect the business view and replace it with the SQL command that can be seen in the menu option 'Show SQL Query'.
Is it possible via the RAS SDK?
Has anybody experienced this?
How should we proceed?
Thanks a lot
Alain
Hi Ted,
I'm thinking of opening a case for this problem of opening reports based on a big BV.
We can't really reduce the BV, since it is the dictionary and we need the whole thing...
I'm wondering why it is impossible to change the datasource if the tables and fields underneath are identical.
Is it impossible to change the fields' mapping?
The other solution, as you suggest, is to create a report from scratch, create a new datasource in the SQL command format, and rebuild the report... I agree it looks like a big job...
Do you know if it is possible to export the report to XML, for instance, change the XML, and then re-import it?
Thanks for your Help.
Alain -
Getting following error while creating a datasource connection with oracle database.
I have a 32-bit Oracle server installed on a remote server,
a 64-bit SQL Server 2008 R2 report server, and a 64-bit Oracle client installed on my report server. While creating a new datasource
in the report server I am getting this error:
Error
Attempt to load Oracle client libraries threw BadImageFormatException. This problem will occur when running in 64 bit mode with the 32 bit Oracle
client components installed
How can I fix this, and what is the reason?
This link will help you out.
http://social.msdn.microsoft.com/Forums/en-US/sqlreportingservices/thread/0a38fa00-31de-49de-b68f-4c5a4565e5b1?prof=required
Milan Das -
Error while creating datasource
I am getting this error while creating the datasource in the console (HTML page).
java.lang.NullPointerException
at weblogic.management.console.utils.MBeans.getMBeanClassNameFor(MBeans.java:1153)
at weblogic.management.console.actions.mbean.EditMBeanAction.getMBeanClass(EditMBeanAction.java:210)
at weblogic.management.console.actions.mbean.EditMBeanAction.getDialogTypeKey(EditMBeanAction.java:188)
at weblogic.management.console.actions.internal.InternalActionContext.setAction(InternalActionContext.java:158)
at weblogic.management.console.actions.internal.ActionServlet.doAction(ActionServlet.java:170)
at weblogic.management.console.actions.internal.ActionServlet.doPost(ActionServlet.java:85)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run(ServletStubImpl.java:945)
at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:332)
at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:242)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:5360)
at weblogic.security.service.SecurityServiceManager.runAs(SecurityServiceManager.java:721)
at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppServletContext.java:3043)
at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestImpl.java:2468)
at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:152)
at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:133)
Well, thanks for the inputs, but the screen doesn't move from that point. As we enter a datasource name, it gives this error, and we haven't logged in anywhere to give the access credentials.
I am wondering, then, what is the point in knowing the username and password if I choose Edit Datasource?
thankx -
Error while adding a datasource to an index
Hi,
I am creating an index with a file system repository. I have to create a taxonomy on that index to classify all the documents inside the repository. When I try to assign a datasource to the index, I get the following error:
"The repository needs the properties service to be attached to a classification index"
Can anybody help me?
Thanks in Advance
Prakash
Hi Prakash,
What you have to do is edit the file system repository you want to assign as the datasource for the index
(Content Management -> Repository Managers -> File System Repository -> mark "Your FS Repository" -> Edit).
Go through the list of Repository Services (click Next Page) and activate (mark) the "properties" Repository Service.
After saving and restarting the J2EE Engine (the restart is required!), you should be able to add the datasource to the classification index.
Hope this helps,
Robert -
Error while creating a datasource in planning 9.3.1 on oracle 11.2 database
I am unable to create a datasource in Planning 9.3.1 on an Oracle 11.2 database. I have configured Shared Services and registered Planning with Shared Services, but I cannot create the data source after application deployment and instance creation.
I am getting the following error,
Launching Hyperion Configuration Utility Program
HYPERION_HOME: C:\Hyperion
In HspDBPropertiesLocationPanel constructor
In HspDBPropertiesLocationPanel queryEnter
Resource Bundle is java.util.PropertyResourceBundle@322394
Product Name in file is PLANNING
Availability Date is 20051231
Creating rebind thread to RMI
Resource Bundle is java.util.PropertyResourceBundle@322394
Product Name in file is PLANNING
Availability Date is 20051231
$$$$$$$$$$$$$ dname is
Resource Bundle is java.util.PropertyResourceBundle@322394
Product Name in file is PLANNING
Availability Date is 20051231
Exception in thread "AWT-EventQueue-0" java.lang.UnsatisfiedLinkError: no HspEss
baseEnv in java.library.path
at java.lang.ClassLoader.loadLibrary(Unknown Source)
at java.lang.Runtime.loadLibrary0(Unknown Source)
at java.lang.System.loadLibrary(Unknown Source)
at com.hyperion.planning.olap.HspEssbaseEnv.<clinit>(Unknown Source)
at com.hyperion.planning.olap.HspEssbaseJniOlap.<clinit>(Unknown Source)
at com.hyperion.planning.HspJSHomeImpl.TestEssConnection(Unknown Source)
at com.hyperion.planning.HspDSEssbasePanelManager.TestEssConnection(HspD
SEssbasePanelManager.java:156)
at com.hyperion.planning.HspDSEssbasePanelManager.queryExit(HspDSEssbase
PanelManager.java:132)
at com.hyperion.cis.config.wizard.ProductCustomInputPanel.queryExit(Prod
uctCustomInputPanel.java:114)
at com.installshield.wizard.awt.AWTWizardUI.doNext(Unknown Source)
at com.installshield.wizard.awt.AWTWizardUI.actionPerformed(Unknown Sour
ce)
at com.installshield.wizard.swing.SwingWizardUI.actionPerformed(Unknown
Source)
at com.installshield.wizard.swing.SwingWizardUI$SwingNavigationControlle
r.notifyListeners(Unknown Source)
at com.installshield.wizard.swing.SwingWizardUI$SwingNavigationControlle
r.actionPerformed(Unknown Source)
at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
at javax.swing.AbstractButton$Handler.actionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Sour
ce)
at java.awt.Component.processMouseEvent(Unknown Source)
at javax.swing.JComponent.processMouseEvent(Unknown Source)
at java.awt.Component.processEvent(Unknown Source)
at java.awt.Container.processEvent(Unknown Source)
at java.awt.Component.dispatchEventImpl(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Window.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.EventQueue.dispatchEvent(Unknown Source)
at java.awt.EventDispatchThread.pumpOneEventForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.run(Unknown Source)
But my Essbase server is up and running; I am able to connect to it through EAS.
It looks like more of an issue with connecting to Essbase. Usually "java.lang.UnsatisfiedLinkError: no HspEssbaseEnv in java.library.path" means Planning has not been installed or deployed correctly. What OS is it running on?
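The diagnosis can be checked mechanically: the JVM resolves native libraries such as HspEssbaseEnv only through the directories listed in java.library.path. A small self-contained check (the library names here are examples):

```java
// If a native library is not in any directory on java.library.path,
// System.loadLibrary throws UnsatisfiedLinkError -- the exact error in
// the Planning configuration log above.
public class NativeLibCheck {
    public static boolean canLoad(String libName) {
        try {
            System.loadLibrary(libName); // resolves via java.library.path
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Print the search path so you can see where HspEssbaseEnv
        // (e.g. HspEssbaseEnv.dll on Windows) would have to live.
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        System.out.println("HspEssbaseEnv loadable: " + canLoad("HspEssbaseEnv"));
    }
}
```

If the library is missing from every directory printed, re-deploying Planning (or fixing the PATH/LD_LIBRARY_PATH the configuration utility inherits) is the usual remedy.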
Cheers
John
http://john-goodwin.blogspot.com/ -
Error while transporting webservice datasources
Hi all,
I have created web service datasources to fetch EP data into BW, and now I am transporting those datasources from DEV to QA. I have collected the datasources alone in one request. (I created the source system in QA manually and mapped it to DEV.)
But while transporting I am getting the following error:
3S3T Start of the after-import method RS_RSDS_AFTER_IMPORT for object type(s) RSDS (Activation Mode)
Error Screen output without connection to user .
Error when activating DataSource ZANS_MASTER BI7001
Please throw some light: are there any predefined settings I need to make in the QA system?
Thanks in advance,
V.Sen -
Error while transporting Datasources in Production
Dear All,
I am facing a problem while transporting objects pertaining to the BI Cockpit to Production.
Earlier I had transported all the objects (in more than one transport) to Quality (from Development). After testing the objects, I imported the same number of transports (in the same order) into Production, and now some of the requests have ended in errors like:
DataSource 0TCTWHMTFM_TEXT does not exist in source system PW1CLNT003 of version A
Such errors have occurred for most of the datasources.
Can anyone let me know:
1) Why have the transports failed while importing into Production when similar requests were transported successfully to Quality?
2) What is the possible solution to correct the error?
Request your kind help to overcome this problem.
Regards
Shalabh
Dear All,
Thanks for your kind reply.
There are no R/3 datasources involved. Our BI Cockpit landscape is based on a BI system only; no data transfer takes place between R/3 and BI.
Secondly, do you mean to say that I need to replicate my source system in Production and then proceed with transporting the requests from Dev to Prod?
But I never did that in Quality while transporting from Dev. Could it have been done in Quality beforehand by someone?
Also, when the datasources have not been transported to Production (resulting in the error), how can I replicate these datasources in Production?
Let me know if I have not understood correctly.
Many thanks again for your efforts.
/Shalabh -
Error while transporting custom datasource
Hi All,
Below is the scenario that was carried out in my project.
1. Created an OHD to load data from a cube to database table /BIC/ABC.
2. Created a custom datasource ZABC_TEST on the above table.
3. Activated the DataSource and loaded the data. (The data load is successful; it works fine.)
4. Tried transporting the above, but it throws an error while transporting the datasource:
Error -- InfoSource ZABC_TEST is not available in source system BITST.
Let me know what can be done to solve the above error. Please point out if I have missed something.
Thanks in advance.
Maddy
Hi,
The problem seems to be occurring because you are transporting both the OHD and the DS in the same transport request.
Transport the DS first in one request, then transport the OHD in another request.
This should resolve the issue.
hope this helps.
thanks,
rahul -
Error while transporting a WEBservice Datasource
Hi all,
I am trying to transport a web service datasource (PI data to BW) to the testing system, but while transporting I get the following error in the logs of my transport request:
Error generating the Web service /BIC/CQZPI_KPI00002000
This is not an authorization issue, as I have the required authorizations from the Basis team.
Can anyone please give me some solution to this error.
Many thanks in advance
Hi.
Did you check the following notes to solve this issue?
1177867 Error while deleting Web service in BI After Import
1177005 Error while activating/deleting Web service in BI
1170688 DataSource: Missing replication in transport
Please also check whether the Myself connection is correctly set in your system:
538052 Maintenance of Myself destination in BW
Please check whether enough authorizations were assigned to the user who is transporting this request.
913944 Error RSDS 301 when you activate a Web Service DataSource
Please also try to check the data format: check whether some fields are mentioned in the logs, and set them to EXTERNAL before the transport.
Please also try to activate the DataSource in the source system, then create a new transport request and transport it.
I hope I can be helpful.
Thanks,
Walter Oliveira. -
Error while transporting Generic Datasource in R/3
Hi All,
I have created a generic datasource (R/3) on a view. As it was in $TMP, I assigned the datasource ZABCD and the view ZVIEW to a development package and to a transport request. I released the transport request, and when I tried to import it into Q, it threw an error with the following messages:
Extract structure ZOX**** is not active
The extract structure ZOX**** of the DataSource ZABCD is invalid
Errors occurred during post-handling RSA2_DSOURCE_AFTER_IMPORT for OSOA L
The errors affect the following components:
BC-BW (SAP Business Information Warehouse Extractors)
The extract structure ZOX**** is the structure used by the datasource. I forgot to assign it to the earlier TR, and I think the error occurs because the structure is missing from the TR. Am I correct?
How should I deal with it now? Do I need to create a new TR with this structure alone, since the earlier TR is already released?
Thanks,
RPK.
As you comment, it is possible the error is that the structure was not included in the earlier TR.
You can create an empty TR in SE09 or SE10. In these transactions there is a button with a box drawn on it; set the focus on your new empty TR and push that button.
Then add the objects from the TR that was transported with the error, plus the missing structure. -
Error while configuring kodo to lookup a datasource
We had a working application using EEPersistenceManagerFactory.
I changed kodo.properties to look up a non-XA JDBC datasource.
After that the application works fine (it creates, updates, deletes, and finds records in the DB),
but SystemOut.log shows the following error for every operation.
We are using Kodo 2.5.0, WebSphere 5.0, and Oracle 8.
How can we avoid getting this error?
We tried to find a property on the WebSphere datasource that could be altered to avoid this error, but no luck.
Thanks
Paresh
[10/7/03 15:30:45:467 IST] 3d8b2d1a MCWrapper E J2CA0081E: Method
destroy failed while trying to execute method destroy on ManagedConnection
com.ibm.ws.rsadapter.spi.WSRdbManagedConnectionImpl@437f6d23 from resource
<null>. Caught exception: com.ibm.ws.exception.WsException: DSRA0080E: An
exception was received by the Data Store Adapter. See original exception
message: Cannot call 'cleanup' on a ManagedConnection while it is still in
a transaction..
at
com.ibm.ws.rsadapter.exceptions.DataStoreAdapterException.<init>(DataStoreAdapterException.java:222)
at
com.ibm.ws.rsadapter.exceptions.DataStoreAdapterException.<init>(DataStoreAdapterException.java:172)
at
com.ibm.ws.rsadapter.AdapterUtil.createDataStoreAdapterException(AdapterUtil.java:182)
at
com.ibm.ws.rsadapter.spi.WSRdbManagedConnectionImpl.cleanupTransactions(WSRdbManagedConnectionImpl.java:1826)
at
com.ibm.ws.rsadapter.spi.WSRdbManagedConnectionImpl.destroy(WSRdbManagedConnectionImpl.java:1389)
at com.ibm.ejs.j2c.MCWrapper.destroy(MCWrapper.java:1032)
at
com.ibm.ejs.j2c.poolmanager.FreePool.returnToFreePool(FreePool.java:259)
at com.ibm.ejs.j2c.poolmanager.PoolManager.release(PoolManager.java:777)
at com.ibm.ejs.j2c.MCWrapper.releaseToPoolManager(MCWrapper.java:1304)
at
com.ibm.ejs.j2c.ConnectionEventListener.connectionClosed(ConnectionEventListener.java:195)
at
com.ibm.ws.rsadapter.spi.WSRdbManagedConnectionImpl.processConnectionClosedEvent(WSRdbManagedConnectionImpl.java:843)
at
com.ibm.ws.rsadapter.jdbc.WSJdbcConnection.closeWrapper(WSJdbcConnection.java:569)
at com.ibm.ws.rsadapter.jdbc.WSJdbcObject.close(WSJdbcObject.java:132)
at
com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.close(SQLExecutionManagerImpl.java:814)
at
com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.release(JDBCStoreManager.java(Inlined
Compiled Code))
at
com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.load(JDBCStoreManager.java(Compiled
Code))
at
com.solarmetric.kodo.runtime.StateManagerImpl.loadFields(StateManagerImpl.java(Compiled
Code))
at
com.solarmetric.kodo.runtime.StateManagerImpl.preSerialize(StateManagerImpl.java:784)
at com.paresh.core.vo.Release.jdoPreSerialize(Release.java)
at com.paresh.core.vo.Release.writeObject(Release.java)
at java.lang.reflect.Method.invoke(Native Method)
at
com.ibm.rmi.io.IIOPOutputStream.invokeObjectWriter(IIOPOutputStream.java:703)
at com.ibm.rmi.io.IIOPOutputStream.outputObject(IIOPOutputStream.java:671)
at
com.ibm.rmi.io.IIOPOutputStream.simpleWriteObject(IIOPOutputStream.java:146)
at
com.ibm.rmi.io.ValueHandlerImpl.writeValueInternal(ValueHandlerImpl.java:217)
at com.ibm.rmi.io.ValueHandlerImpl.writeValue(ValueHandlerImpl.java:144)
at com.ibm.rmi.iiop.CDROutputStream.write_value(CDROutputStream.java:1590)
at com.ibm.rmi.iiop.CDROutputStream.write_value(CDROutputStream.java:1107)
at
com.paresh.core.interfaces._EJSRemoteStatelessValidation_da16513c_Tie.findCorrectionAction(_EJSRemoteStatelessValidation_da16513c_Tie.java:309)
at
com.paresh.core.interfaces._EJSRemoteStatelessValidation_da16513c_Tie._invoke(_EJSRemoteStatelessValidation_da16513c_Tie.java:104)
at
com.ibm.CORBA.iiop.ServerDelegate.dispatchInvokeHandler(ServerDelegate.java:582)
at com.ibm.CORBA.iiop.ServerDelegate.dispatch(ServerDelegate.java:437)
at com.ibm.rmi.iiop.ORB.process(ORB.java:320)
at com.ibm.CORBA.iiop.ORB.process(ORB.java:1544)
at com.ibm.rmi.iiop.Connection.doWork(Connection.java:2063)
at com.ibm.rmi.iiop.WorkUnitImpl.doWork(WorkUnitImpl.java:63)
at com.ibm.ejs.oa.pool.PooledThread.run(ThreadPool.java:95)
at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:592)
kodo.properties
com.solarmetric.kodo.LicenseKey=
#com.solarmetric.kodo.ee.ManagedRuntimeProperties=TransactionManagerName=java:/TransactionManager
com.solarmetric.kodo.ee.ManagedRuntimeProperties=TransactionManagerName=TransactionFactory
TransactionManagerMethod=com.ibm.ejs.jts.jta.TransactionManagerFactory.getTransactionManager
#com.solarmetric.kodo.ee.ManagedRuntimeClass=com.solarmetric.kodo.ee.InvocationManagedRuntime
com.solarmetric.kodo.ee.ManagedRuntimeClass=com.solarmetric.kodo.ee.AutomaticManagedRuntime
#javax.jdo.PersistenceManagerFactoryClass=com.solarmetric.kodo.impl.jdbc.JDBCPersistenceManagerFactory
javax.jdo.PersistenceManagerFactoryClass=com.solarmetric.kodo.impl.jdbc.ee.EEPersistenceManagerFactory
javax.jdo.option.ConnectionFactoryName=ds/kodo/DataSource1
javax.jdo.option.Optimistic=true
javax.jdo.option.RetainValues=true
javax.jdo.option.NontransactionalRead=true
#com.solarmetric.kodo.DataCacheClass=com.solarmetric.kodo.runtime.datacache.plugins.CacheImpl
# Changing these to a non-zero value will dramatically increase
# performance, but will cause in-memory databases such as Hypersonic
# SQL to never exit when your main() method exits, as the pooled
# connections in the in-memory database will cause a daemon thread to
# remain running.
javax.jdo.option.MinPool=5
javax.jdo.option.MaxPool=10
We do have a makeTransientAll() before the object returns from the Session Bean.
We also tried the JCA path
After installing the JCA RAR and doing a lookup for the
PersistenceManagerFactory, the same code does not throw any exception.
The exception is thrown only if the datasource is used.
Thanks
Paresh
Marc Prud'hommeaux wrote:
Paresh-
It looks like you are returning a collection of instances from an EJB,
which will cause them to be serialized. The serialization happens
outside the context of a transaction, and Kodo needs to obtain a
connection. WebSphere seems not to like that.
You have a few options:
1. Call makeTransientAll() on all the instances before you return them
from your bean methods
2. Manually instantiate all the fields yourself before sending them
back. You could use a bogus ObjectOutputStream to do this.
3. In 3.0, you can use the new detach() API to detach the instances
before sending them back to the client.
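Option 2 ("a bogus ObjectOutputStream") can be sketched as follows. This is my illustration of the idea, not Kodo API: serializing the object graph into a discard-only stream touches every serializable field (which makes Kodo load them) while the transaction is still open, without keeping any bytes around.

```java
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.OutputStream;
import java.io.Serializable;

// "Bogus" ObjectOutputStream: serializes the whole object graph into a
// sink that discards every byte, forcing all fields to be instantiated.
public class ForceInstantiate {
    // Returns the number of bytes that would have been written, just so
    // the traversal is observable.
    public static long touch(Serializable obj) {
        final long[] count = {0};
        OutputStream sink = new OutputStream() {
            @Override
            public void write(int b) {
                count[0]++; // discard the byte, count it
            }
        };
        try (ObjectOutputStream out = new ObjectOutputStream(sink)) {
            out.writeObject(obj); // walks (and loads) the object graph
        } catch (IOException e) {
            throw new RuntimeException(e); // cannot happen with a memory sink
        }
        return count[0];
    }
}
```

Calling touch(entity) inside the bean method, before returning, plays the role of option 2; options 1 and 3 remain the cleaner choices where available.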
Marc Prud'hommeaux [email protected]
SolarMetric Inc. http://www.solarmetric.com -
Error while creating a datasource for XMLfile in DataFederator
Hi,
I am trying to create a datasource in Data Federator for an XML file. When I click the Generate Elements and Attributes button,
Data Federator gives me the error "java.lang.reflect.InvocationTargetException".
Thanks in advance.
Yogesh
Edited by: mahayog74 on Dec 2, 2011 11:09 AM -
No key field found for creation of DataSource - Classification Datasource
Hello,
While trying to create a classification datasource based on 0PLANT_ATTR, when I assign a characteristic and push the DataSource button, I get the following message:
No key field found for creation of DataSource
Diagnosis
During generation of a classification or configuration DataSource, only those key fields of the object table (field "Obj.Tabelle") that are already present in the basis DataSource are transferred to the extract structure. This error occurs when none of the key fields of the object table were found in the structure of the basis DataSource.
System response
A DataSource cannot be created without key fields. The action was cancelled.
Procedure
Check whether you have selected the correct basis DataSource and object table. For more information, please see SAP Note 569849.
Do you know what can be the problem?
Thank you and regardsHi Alberto,
plants are a special case. The key which is used for the classification
of plants (object type BETR) is not the same as the key which is used in
datasource 0PLANT_ATTR.
BETR has key LOCNR (Site). That's a customer related to a plant. The
customer number will be extracted in field LOCNR.
0PLANT_ATTR extracts the plant in its key field WERKS.
Transaction CTBW and the generic extraction program for classifications
don't know the relationship between LOCNR and WERKS. So they cannot map
them.
I do recommend a solution which would add the mapping between
LOCNR and WERKS:
1. Please create the classification datasource as intended, but use
datasource 0RT_LOC_MGR_ATTR as the basis datasource. It is the only
content datasource where LOCNR is a key field, so 0RT_LOC_MGR_ATTR is
used as a dummy here to allow transaction CTBW to create the
classification datasource. It is not necessary to extract data with
datasource 0RT_LOC_MGR_ATTR.
2. Please extend the extract structure of the created classification
datasource: add field WERKS using component type WERKS_D and make
this field visible.
3. Fill field WERKS in the extractor user exit EXIT_SAPLRSAP_002. WERKS
can be read from table KNA1 by using the customer number extracted to
LOCNR to select on field KNA1-KUNNR.
4. Transaction CTBW_META on the BW system is not able to append the
characteristics from the classification datasource to infosource
0PLANT, because the keys are different. So please create a new
infosource with CTBW_META. This allows CTBW_META to create the info
objects for the characteristics used in the classification datasource.
5. Please add the characteristics used in the classification datasource
to infosource 0PLANT manually. You will find the info object names of
the characteristics by looking up the characteristic datasources which
are assigned to the classification datasource in transaction CTBW. From
these names you can derive the info object names:
1CL_A... -> C_A...
6. Disconnect the infosource which has been created with CTBW_META from
the classification datasource.
7. Please connect the classification datasource to infosource 0PLANT,
using the following info object mapping:
   info object   field
   0PLANT        WERKS
The info object names for the characteristics are explained in step 5.
8. Add an infopackage to infosource 0PLANT for the classification
datasource.
Now the extraction of classifications of sites should work.
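Step 3 above could be sketched roughly as follows. This is a hedged sketch, not Rolf's actual code: only KNA1, KUNNR, LOCNR, WERKS and EXIT_SAPLRSAP_002 come from the post; the extract structure name ZOXXXX0001 and the table parameter name C_T_DATA are placeholders that you would replace with the generated structure of your classification datasource and the actual changing parameter of the exit on your system.

```abap
* Hedged sketch of step 3 inside user exit EXIT_SAPLRSAP_002:
* derive the plant (WERKS) from the customer number that the
* classification extractor put into LOCNR, by reading KNA1.
* ZOXXXX0001 and C_T_DATA are placeholder names - adapt them.
DATA: l_s_data TYPE zoxxxx0001.

LOOP AT c_t_data INTO l_s_data.
  " look up the plant for the site customer number
  SELECT SINGLE werks
         FROM kna1
         INTO l_s_data-werks
         WHERE kunnr = l_s_data-locnr.
  IF sy-subrc = 0.
    MODIFY c_t_data FROM l_s_data.
  ENDIF.
ENDLOOP.
```

With this in place, each extracted classification record carries the WERKS key that infosource 0PLANT expects.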
Best regards,
Rolf
P.S. I see the system messed it up and doesn't display any new or empty lines. Sorry, I hope you can still read it.
Edited by: Rolf Doersam on Mar 26, 2010 6:56 PM -
Guruji's:
I am in a lab environment for BI7 on ECC 6, trying to follow the basic steps for the ETL exercise in the BW modeling document. I have no Basis support and there are no other users in this lab.
Please if you don't feel like reading the whole thing just read part 1 below, I am just being as detailed as possible.
1: If I go into transaction SBIW, I am able to create a generic datasource only if I select an existing SAP-based application component. However, it then does not appear under that application component when I browse or search for it after it has been saved.
2: If I try to create my own application component instead and create the datasource using it (while in SBIW), I get the message "there is no application component x" (x = the name of whatever application component I am using).
3: I tried a workaround: in the workbench, when I click the datasource button at the top and try to create a datasource, I get the message "creation of datasources for the SAP source system T90CLNT090 is not permitted".
I have tried creating application components under SAP-based application components as well as by themselves, and nothing seems to matter.
Note: This problem seems sporadic, as I was finally able to create a datasource, but after I replicated and activated it and went in to create a transformation, the datasource tables didn't show up at all on the left, only the table from the infoobject characteristic load.
I presently have no usable datasource and am starting from scratch.
Thanks a lot
I am editing this post because there have been changes:
1. Yes, there are already standard datasources.
2. No, I haven't done any initial settings... so I went to Transfer Business Content DataSources through SBIW but didn't see anything to select "execute" for... so I tried something: I went back to Transfer Business Content DataSources and chose "content delta". It said 'comparison bct <-> cust version', counted through several objects for a few minutes and finished with no warnings. Then I selected each datasource under CO-OM-CCA and activated those with no problems.
3. Went to RSA5 and again activated each datasource for CO-OM-CCA.
I am still unable to see one of the two datasources I have created in the workbench, but when I go to RSA6 both of my datasources are showing.
When I try to create my generic datasource under my own application component, it again does not recognize my application component, so same problem...
4. When I try to create the same generic datasource using the CO-OM-CCA application component, I get a message that it saved,
5. but again, when I look under the application component, it is not there.
6. When I try to replicate datasources for the T90CLNT090 source system (right-click on the source system name and select Replicate DataSources), it hangs for several minutes and eventually my session terminates. It has always been like that.
Edited by: natalie marma on Apr 6, 2009 10:49 PM