Slow for Physical Data Service

Hello, I am using AquaLogic Data Services Platform 3.0 and am having some trouble accessing schemas and the procedures within them: it takes a long time to return the contents of a schema when I am creating a physical data service.
Has anyone else run into this? (It takes more than 10 minutes.)
Regards,
Marcio Dantas

Open a case with customer support (I think your company already has a case open for this).
This issue was reported on one specific Oracle database server - the JDBC call DatabaseMetaData.getProcedures() was hanging - and the same behavior could be demonstrated outside of ALDSP, both with DbVisualizer and with a stand-alone Java/JDBC program that was provided to the customer (a sketch of that kind of test is below). The DBA needs to figure out what is going on with the database.
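A minimal sketch of that kind of stand-alone JDBC test (the driver class, URL, credentials and schema pattern here are placeholders, not values from the actual case); if this also hangs, the problem is below ALDSP, in the database or driver:

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class GetProceduresTest {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details - replace with the real host, SID and credentials
        Class.forName("oracle.jdbc.OracleDriver");
        String url = "jdbc:oracle:thin:@dbhost:1521:ORCL";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            DatabaseMetaData md = conn.getMetaData();
            long start = System.currentTimeMillis();
            // The same metadata call ALDSP issues while importing a physical data service
            try (ResultSet rs = md.getProcedures(null, "SCOTT", "%")) {
                int count = 0;
                while (rs.next()) {
                    count++;
                }
                System.out.println(count + " procedures listed in "
                        + (System.currentTimeMillis() - start) + " ms");
            }
        }
    }
}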

Similar Messages

  • Result set not visible in Test View for Physical data service

    I have configured a MS SQL Server stored procedure as a data service. It returns a result set. I am calling this (physical) data service from a logical data service. I can see the returned result set in the test view (Workshop) of the logical data service, but not in the test view of the physical data service. Is this a problem with Workshop?
    Any help is appreciated.

    Never heard of this before. Can you turn on auditing of XQuery parameters and results and then post the audit information, so I can see that you are calling both the logical and the physical data service with the same arguments?
    It would also help to see both the logical and physical data services and their schemas. Just zip and post the whole DSP project if possible.

  • Report download is slow for large data sets

    APEX 4.1
    I have a classic report which retrieves around one lakh (100,000+) rows.
    Downloading the report as Excel takes 5-10 minutes.
    Any solution for this?
    Can we download in sets, e.g. the first 1000 records, then the next 1000, and so on?

    What I understood is that we can export the CSV in the background using Kubicek's export_to_excel package.
    1. We can provide a button to execute the procedure. - jfosteroracle says the custom CSV download was slow
    2. Use a job to download the Excel file in the background. - need to check with the client whether they wish to go ahead with this.
    Correct. You need to use the custom package and a button on the page to submit the request for downloading the report in the back end.
    Is it possible to zip the file first and then download it?
    No. As far as I know it is not possible to zip a file and then download it.
    Thanks
    Lakshmi

  • IMPORT_COUNTREQUEST API for Physical Count form data load?

    Dear All,
    Please clarify whether I can use the IMPORT_COUNTREQUEST API for the Physical Count form data load,
    or which other API does the job, other than loading through Dataload.
    Please update.
    Many thanks in advance.

    Hi,
    We believe that instead of using the IMPORT_COUNTREQUEST API you can use a Dataload script to key the data into the Physical Inventory form.
    Basically you need to populate 4 fields in this form, i.e. Physical Inventory, Date, Description & Snapshot Complete (checkbox).
    Go ahead and write a Dataload script for the same.
    Regards,
    S.P DASH

  • Dynamic Creation of Physical Data Server / Agent cache Refresh

    Scenario:
    I have a requirement to load data from an XML source into an Oracle DB. The XML source will change at run time, but the XSD of the XML remains the same (so I don't have to change the logical data server, models, mappings, interfaces and scenarios - only the physical data server changes at runtime). I have created all the ODI artifacts using ODI Studio in my work repository, and I am using the ODI SDK to create the physical data server for the changed XML data source and then invoke the agent programmatically.
    Problem:
    The data is loaded from the XML source into the Oracle DB the first time, but it stops working from the second run onwards. If I restart the agent, it works again for one more run. My guess is that on the first run the agent builds some sort of cache of the physical data server details, so whenever I change the data server something goes wrong and leads to the exception below. So I want to know whether there is any mechanism for handling dynamic data servers, or any way of clearing the agent cache, if such a cache exists.
    Caused By: org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 41, in <module>
    AttributeError: 'NoneType' object has no attribute 'createStatement'
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1596)
         at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$2.doAction(StartScenRequestProcessor.java:582)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor.doProcessStartScenTask(StartScenRequestProcessor.java:513)
         at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$StartScenTask.doExecute(StartScenRequestProcessor.java:1070)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$1.run(DefaultAgentTaskExecutor.java:50)
         at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor.executeAgentTask(DefaultAgentTaskExecutor.java:41)
         at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.doExecuteAgentTask(TaskExecutorAgentRequestProcessor.java:93)
         at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.process(TaskExecutorAgentRequestProcessor.java:83)
         at oracle.odi.runtime.agent.support.DefaultRuntimeAgent.execute(DefaultRuntimeAgent.java:68)
         at oracle.odi.runtime.agent.servlet.AgentServlet.processRequest(AgentServlet.java:445)
         at oracle.odi.runtime.agent.servlet.AgentServlet.doPost(AgentServlet.java:394)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:821)
         at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:503)
         at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:389)
         at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
         at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
         at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:765)
         at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:417)
         at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
         at org.mortbay.jetty.Server.handle(Server.java:326)
         at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:534)
         at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:879)
         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:747)
         at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:218)
         at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
         at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)
         at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:520)

    Hi,
    If you want to load multiple files (with the same structure) through one connection, then in Topology create M.XSD for the M.XML file.
    Create three directories:
    RAW - holds each file under its original name
    PRO - processing area where files are moved one at a time and renamed to M.XML
    OUT - once a file's data has been loaded into the tables, move M.XML from PRO to OUT
    Go to odiexperts to see how to create the loop.
    Use OdiFileMove (to move and rename/mask) to move A.XML from RAW to PRO and rename it to M.XML.
    Then use OdiFileMove to move M.XML to the OUT folder and rename it back to A.XML.
    Use variables to store the file names and refresh them.
    'NoneType' object has no attribute 'createStatement': it seems the structure of your file is different and you are trying to load different files into the same schema. If the structure is the same, then run the procedure "SYNCHRONIZE ALL" after every load.
    Edited by: neeraj_singh on Feb 16, 2012 4:47 AM

  • I have an early 2011 MacBook Pro which has been running slow for a while. After looking at responses to similar problems I have downloaded and run EtreCheck and will post the output. Please can someone help me with what it all means. Thanks in advance

    I have an early 2011 MacBook Pro which has been running slow for a while. After looking at responses to similar problems I have downloaded and run EtreCheck. Please can someone help me with what it all means.
    Thanks in advance.
    EtreCheck version: 1.9.15 (52)
    Report generated 19 September 2014 08:07:14 GMT+8
    Hardware Information: ?
      MacBook Pro (13-inch, Early 2011) (Verified)
      MacBook Pro - model: MacBookPro8,1
      1 2.3 GHz Intel Core i5 CPU: 2 cores
      4 GB RAM
    Video Information: ?
      Intel HD Graphics 3000 - VRAM: 384 MB
      Color LCD 1280 x 800
    System Software: ?
      OS X 10.9.4 (13E28) - Uptime: 0 days 0:4:29
    Disk Information: ?
      Hitachi HTS545032B9A302 disk0 : (320.07 GB)
      S.M.A.R.T. Status: Verified
      EFI (disk0s1) <not mounted>: 209.7 MB
      Macintosh HD (disk0s2) / [Startup]: 319.21 GB (147 GB free)
      Recovery HD (disk0s3) <not mounted>: 650 MB
      MATSHITADVD-R   UJ-898 
    USB Information: ?
      Apple Inc. FaceTime HD Camera (Built-in)
      Apple Inc. BRCM2070 Hub
      Apple Inc. Bluetooth USB Host Controller
      Apple Inc. Apple Internal Keyboard / Trackpad
      Apple Computer, Inc. IR Receiver
    Thunderbolt Information: ?
      Apple Inc. thunderbolt_bus
    Gatekeeper: ?
      Mac App Store and identified developers
    Kernel Extensions: ?
      [not loaded] com.seagate.driver.PowSecDriverCore (5.2.4 - SDK 10.4) Support
      [not loaded] com.seagate.driver.PowSecLeafDriver_10_4 (5.2.4 - SDK 10.4) Support
      [not loaded] com.seagate.driver.PowSecLeafDriver_10_5 (5.2.4 - SDK 10.5) Support
      [not loaded] com.seagate.driver.SeagateDriveIcons (5.2.4 - SDK 10.4) Support
      [loaded] com.sophos.kext.sav (9.1.55 - SDK 10.7) Support
      [loaded] com.sophos.nke.swi (9.1.50 - SDK 10.8) Support
    Launch Daemons: ?
      [loaded] com.adobe.fpsaud.plist Support
      [loaded] com.microsoft.office.licensing.helper.plist Support
      [running] com.sophos.autoupdate.plist Support
      [running] com.sophos.configuration.plist Support
      [running] com.sophos.intercheck.plist Support
      [running] com.sophos.notification.plist Support
      [running] com.sophos.scan.plist Support
      [running] com.sophos.sxld.plist Support
      [running] com.sophos.webd.plist Support
      [running] com.trusteer.rooks.rooksd.plist Support
    Launch Agents: ?
      [loaded] com.divx.dms.agent.plist Support
      [loaded] com.divx.update.agent.plist Support
      [running] com.sophos.uiserver.plist Support
      [running] com.trusteer.rapport.rapportd.plist Support
    User Launch Agents: ?
      [loaded] com.adobe.ARM.[...].plist Support
      [running] com.amazon.music.plist Support
      [loaded] com.google.keystone.agent.plist Support
      [not loaded] jp.co.canon.Inkjet_Extended_Survey_Agent.plist Support
    User Login Items: ?
      iTunesHelper
      TomTomHOMERunner
      AdobeResourceSynchronizer
      Dropbox
    Internet Plug-ins: ?
      FlashPlayer-10.6: Version: 15.0.0.152 - SDK 10.6 Support
      DivX Web Player: Version: 3.2.1.977 - SDK 10.6 Support
      AdobePDFViewerNPAPI: Version: 11.0.09 - SDK 10.6 Support
      AdobePDFViewer: Version: 11.0.09 - SDK 10.6 Support
      Flash Player: Version: 15.0.0.152 - SDK 10.6 Support
      EPPEX Plugin: Version: 10.0 Support
      Default Browser: Version: 537 - SDK 10.9
      OVSHelper: Version: 1.1 Support
      QuickTime Plugin: Version: 7.7.3
      SharePointBrowserPlugin: Version: 14.4.4 - SDK 10.6 Support
      iPhotoPhotocast: Version: 7.0 - SDK 10.7
    Safari Extensions: ?
      Ultimate
    Audio Plug-ins: ?
      BluetoothAudioPlugIn: Version: 1.0 - SDK 10.9
      AirPlay: Version: 2.0 - SDK 10.9
      AppleAVBAudio: Version: 203.2 - SDK 10.9
      iSightAudio: Version: 7.7.3 - SDK 10.9
    iTunes Plug-ins: ?
      Quartz Composer Visualizer: Version: 1.4 - SDK 10.9
    3rd Party Preference Panes: ?
      Flash Player  Support
      Perian  Support
      Trusteer Endpoint Protection  Support
    Time Machine: ?
      Skip System Files: NO
      Auto backup: YES
      Volumes being backed up:
      Macintosh HD: Disk size: 297.29 GB Disk used: 160.38 GB
      Destinations:
      Data [Network] (Last used)
      Total size: 2 TB
      Total number of backups: 99
      Oldest backup: 2012-04-20 17:05:32 +0000
      Last backup: 2014-09-18 23:49:25 +0000
      Size of backup disk: Excellent
      Backup size 2 TB > (Disk size 297.29 GB X 3)
      Time Machine details may not be accurate.
      All volumes being backed up may not be listed.
    Top Processes by CPU: ?
          6% InterCheck
          5% iCalExternalSync
          3% WindowServer
          2% CalendarAgent
          2% SystemUIServer
    Top Processes by Memory: ?
      152 MB SophosScanD
      147 MB InterCheck
      106 MB SophosAntiVirus
      66 MB Dropbox
      57 MB com.apple.iTunesLibraryService
    Virtual Memory Information: ?
      161 MB Free RAM
      1.55 GB Active RAM
      1.41 GB Inactive RAM
      902 MB Wired RAM
      611 MB Page-ins
      0 B Page-outs

    Uninstall Trusteer software
    http://www.trusteer.com/support/uninstalling-rapport-mac-os-x
    Remove Sophos
    https://discussions.apple.com/message/21069437#21069437

  • Error when saving a default format for a date field in 11g

    Getting this error when attempting to save the default for a date field in 11g:
    The current xml is invalid with the following errors: Bad xml instance! <?xml version="1.0"?> <sawsavedformat:metadata xmlns:sawsavedformat="com.siebel.analytics.web/savedformat/v1.1"><sawsavedformat:columnSavedFormats><sawsavedformat:columnSavedFormat xmlns:saw="com.siebel.analytics.web/report/v1.1" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:type="sawsavedformat:regularColumnSavedFormat" columnName="&quot;Retrospectives&quot;.&quot;Time Dim&quot;.&quot;Two Months Ago&quot;"><saw:displayFormat><saw:formatSpec suppress="suppress" wrapText="true" visibility="visible" hAlign="left" vAlign="top"><saw:dataFormat xsi:type="saw:custom" customFormat="MMMM dd, yyyy" displayTimeZone=""/></saw:formatSpec></saw:displayFormat><saw:tableHeading><saw:displayFormat><saw:formatSpec/></saw:displayFormat></saw:tableHeading><saw:columnHeading><saw:displayFormat><saw:formatSpec/></saw:displayFormat></saw:columnHeading></sawsavedformat:columnSavedFormat></sawsavedformat:columnSavedFormats></sawsavedformat:metadata>
    Line:2, Col:608, Attribute 'displayTimeZone' is not declared for element 'dataFormat'
    Thoughts and suggestions?

    I have a Support Request open with Oracle right now (it's taking a few weeks to resolve) but here's the update:
    OBIEE 10g handles DATE types differently than 11g. Neither the support rep nor I can figure out exactly how, but they are different. If you bring the column in as a DATETIME in your Physical Layer you can't save the format system-wide, even if you do set a time zone for your account (My Account -> Preferences -> Time Zone). If you bring the column in as a DATE you can save it system-wide, but you can't display the hours/minutes/seconds, only the month/day/year. This has been tested on both a 9i and a 10g database and produced the same result on both. The problem never occurred with those same columns in OBIEE 10g.
    Oracle support has not fully admitted this is a bug, but the rep working on this SR said it will probably be fixed in the next release - 11.1.1.4, which is due out in late Jan/early Feb of this year. He is trying to reproduce it so he can log it as a bug (which we are all hoping for!) so Oracle can address it.
    For now, I've found that you are stuck with two options: 1) bring the column in as a DATE and save a system-wide default format, but you can't show the time; or 2) don't save a system-wide format for that date column and set the format manually on every report that needs it (don't forget to set your time zone).
    If I hear anything more back from Oracle support I'll update this thread.
    Sorry guys, looks like we're all stranded on this one...

  • Sudden slow down in data transfer

    I'm reading data from the hard disk on a target PXI system (8106 RT) and sending it via Ethernet to a host PC, which writes it to its hard disk. I read a chunk of data from the target and send it using the TCP Write function. The host uses the TCP Read function to get the data and writes it to its hard disk. Data is transferred over Gigabit Ethernet. A total of 9 files ranging in size from 700 MB to 2 GB are being transferred. The process moves along at an adequate speed until it suddenly slows down to slower than a snail's pace in the middle of a file.
    By using probes (see attachments) I've been able to determine that it takes approximately 17 seconds between when the data is read on the PXI and when it is written on the host. However, an iteration of the loop on the target has increased to over 3 minutes. CPU usage on the target and host is minimal, less than 1%. Memory usage on the host is also small. Any suggestions on how to diagnose what has caused this sudden slowdown?
    Attachments:
    Slide1.PNG ‏104 KB
    Slide2.PNG ‏124 KB
    Slide3.PNG ‏39 KB

    I would recommend using smaller chunk sizes for your data transfers - something in the 10K to 50K range. I have done quite a bit of profiling for sending large amounts of data, and I have found that it is better to send lots of smaller chunks rather than fewer large chunks. Also make sure that you are not building up large buffers in your application if you are using shift registers. If you are reading from a file, read it in chunks and send the chunks; don't read the whole file in one read. In one experiment, a single read of a large file (2 MB or so) took something like 10 minutes, while reading the whole thing in 10K chunks took only a couple of hundred milliseconds. The same applies to writing very large chunks with TCP Write: you can easily get a timeout if you write 1 MB as a single write, because the receiver has not consumed all of the data within the timeout period, whereas multiple small writes reset the timeout for each chunk written. A sketch of this pattern follows below.
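    Purely to illustrate the chunking pattern described above (the original application is LabVIEW; this sketch is Java, and the host name, port and 32 KB chunk size are arbitrary placeholders): read the file a chunk at a time and write each chunk to the socket, rather than buffering the whole file.

    import java.io.FileInputStream;
    import java.io.OutputStream;
    import java.net.Socket;

    public class ChunkedSender {
        public static void main(String[] args) throws Exception {
            byte[] chunk = new byte[32 * 1024];  // 32 KB, inside the 10K-50K range suggested above
            // "hostpc" and port 5000 are placeholders for the receiving host PC
            try (Socket socket = new Socket("hostpc", 5000);
                 OutputStream out = socket.getOutputStream();
                 FileInputStream in = new FileInputStream(args[0])) {
                int n;
                while ((n = in.read(chunk)) > 0) {
                    // Send each chunk as soon as it is read - the whole file is never held in memory
                    out.write(chunk, 0, n);
                }
                out.flush();
            }
        }
    }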
    Mark Yedinak
    "Does anyone know where the love of God goes when the waves turn the minutes to hours?"
    Wreck of the Edmund Fitzgerald - Gordon Lightfoot

  • Is there any Bapi for Physical Sample creation

    Hi Folks,
    Is there any BAPI for physical sample creation? My scenario is this: I have the data in a table (material, plant, batch, type, vendor, vendor batch), and for each material I have to create a physical sample.
    Any ideas or suggestions would be appreciated..
    thanks
    chaithanya.

    Hi Chaitanya,
    You can use BAPI_BATCH_CREATE.
    It has import parameters like Material, Plant, Batch and Batch Attributes (Vendor No, Vendor Batch, etc.)... I guess this is what you are looking for.
    Regards,
    -Venkat.

  • 2.2 is slower - for certain types of processing

    So far the only major complaint I have with 2.2 is that it's slower for certain things. Scrolling through pictures in the camera app is slower: with 2.1 the pictures would show up immediately even when scrolling through them very fast. The grid view also used to load much faster; now I see a blank gray picture for 3-5 seconds before the actual photo loads. Scrolling through my contact list and app list is slower and choppy, where before it was fast and smooth.
    Has anyone else noticed these types of issues? Could the number of processes running in the background cause the "jitter" during processor-intensive usage? With 2.1 I had the bare minimum running in the background - maybe 2-4 processes - and never had any issues. Now I've disabled ATK - which seemed to help battery life, given the wonderful programs that auto-restart themselves (such as Amazon MP3, Footprints, Skype, VZ Navigator, FM Radio, etc.) - and have 10-20 processes at most times.

    w40d1n1 wrote:
    I believe that the froyo sluggishness, of the browser (pgup/down while the browser is still loading images), image viewer, and contacts, is due to the new "keep everything running in memory, and we'll sort it all out (by consuming more cpu) when the resources get low and froyo has to start killing/swapping apps to make room for the current app you are using" theory.;-)
    Actually, it requires less CPU power to do it the Android way. It takes more work to force-close apps and then reopen what you want than to leave them cached. Once an application is in memory it consumes nothing unless you are using it; if it is using a process, then it is collecting data or doing something you want it to do.
    Every time someone says task killers are needed, they can't differentiate between a process and an application in Linux terms. I really do find it funny, though. You NEVER hear people say you need a task killer for OS X, Ubuntu, Fedora, or <insert Linux variant here>, but they always recommend one for Android even though it is a Linux variant itself.

  • Viewer slow while retrieving data while changing value from a page item

    Hi, I am using 9.0.2 to create reports. My end users run these reports in Viewer. Recently we have been experiencing performance problems with Viewer. We have reports with page items. Say a report takes 1 minute to run; after running the report, if a different value is selected for a page item, it takes another minute to get the new data, whereas Desktop takes no noticeable additional time for a different page-item value. This problem started recently. I thought it was a cache problem on the application server, so we restarted all the pieces of the application server, but the problem still exists. We are using a materialized view for this particular report.
    Any tips would be highly appreciated. Thanks.

    The performance difference can be attributed to the differences between the products. Desktop is a client-server product: a query executes and ALL the data is returned to the desktop, where you can manipulate it in any way without going back to the database. Web (Plus/Viewer) only returns the data needed to satisfy the query currently being viewed, so if you change a page item, a database call is made to retrieve additional or different data.
    In other words, this is just how things work. You can alter settings in your prefs.txt file to improve performance in Viewer/Plus: changing the number of rows returned, memory settings, row fetch limits and rows per fetch can all improve (or reduce) performance.
    As best I know there is no "magic" setting for these values; it depends on the server, the volume of data, and the workbook content and layout. Crosstab workbooks are much slower to return data than page-detail ones.

  • Physical Data Service to Oracle Stored Procedure Package

    I'm calling a package procedure with the following signature:
    PROCEDURE QUERY_ENTITY (
         P_TABLE_NAME   IN   VARCHAR2,
         P_ENTITY_ID    IN   NUMBER,
         P_USER         IN   VARCHAR2,
         P_RECORDSET    OUT  CNODB_ENTITY_PKG.C_ENTITY);
    When defining the data source, what do I type the OUT parameter as? My DBA tells me that it returns a reference cursor. I've tried quite a number of things, all of which result in the error "Relational wrapper exception. could not find the corresponding sql type for output parameter P_RECORDSET".
    Has anyone written a physical DS that connects to Oracle packages that return a reference cursor rather than a specific data type?
    Sorry, here's the full dump of the error:
    com.bea.dsp.das.exception.DASException: weblogic.xml.query.exceptions.XQuerySagaException: {bea-err}UPD003: Update failure: mixed outcome, update error dispatched (updateid=SAGA_64f908a41cb48aeb:51ff32b4:1233d8ca55c:-7fdb): com.bea.dsp.wrappers.rdb.exceptions.RDBWrapperException: {bea-err}RDBW0000: Relational wrapper exception. could not find the corresponding sql type for output parameter P_RECORDSET
         at com.bea.dsp.das.ejb.EJBClient.invokeOperation(EJBClient.java:160)
         at com.bea.dsp.das.DataAccessServiceImpl.invokeOperation(DataAccessServiceImpl.java:171)
         at com.bea.dsp.das.DataAccessServiceImpl.invoke(DataAccessServiceImpl.java:122)
         at com.bea.dsp.ide.xquery.views.test.QueryExecutor.invokeFunctionOrProcedure(QueryExecutor.java:113)
         at com.bea.dsp.ide.xquery.views.test.XQueryTestViewContent.getFunctionExecutionResult(XQueryTestView.java:1041)
         at com.bea.dsp.ide.xquery.views.test.XQueryTestViewContent.executeFunction(XQueryTestView.java:1176)
         at com.bea.dsp.ide.xquery.views.test.XQueryTestViewContent.widgetSelectedImpl(XQueryTestView.java:1866)
         at com.bea.dsp.ide.xquery.views.test.XQueryTestViewContent.access$300(XQueryTestView.java:174)
         at com.bea.dsp.ide.xquery.views.test.XQueryTestViewContent$3.run(XQueryTestView.java:1594)
         at org.eclipse.swt.custom.BusyIndicator.showWhile(BusyIndicator.java:67)
         at com.bea.dsp.ide.xquery.views.test.XQueryTestViewContent.widgetSelectedBusy(XQueryTestView.java:1597)
         at com.bea.dsp.ide.xquery.views.test.XQueryTestViewContent.widgetSelected(XQueryTestView.java:1560)
         at org.eclipse.swt.widgets.TypedListener.handleEvent(TypedListener.java:227)
         at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:66)
         at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:938)
         at org.eclipse.swt.widgets.Display.runDeferredEvents(Display.java:3687)
         at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3298)
         at org.eclipse.ui.internal.Workbench.runEventLoop(Workbench.java:2389)
         at org.eclipse.ui.internal.Workbench.runUI(Workbench.java:2353)
         at org.eclipse.ui.internal.Workbench.access$4(Workbench.java:2219)
         at org.eclipse.ui.internal.Workbench$4.run(Workbench.java:466)
         at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:289)
         at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:461)
         at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:149)
         at org.eclipse.ui.internal.ide.application.IDEApplication.start(IDEApplication.java:106)
         at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:169)
         at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:106)
         at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:76)
         at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:363)
         at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:176)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:508)
         at org.eclipse.equinox.launcher.Main.basicRun(Main.java:447)
         at org.eclipse.equinox.launcher.Main.run(Main.java:1173)
         at org.eclipse.equinox.launcher.Main.eclipse_main(Main.java:1148)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at com.m7.installer.util.NitroxMain$1.run(NitroxMain.java:33)
         at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:199)
         at java.awt.EventQueue.dispatchEvent(EventQueue.java:597)
         at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:273)
         at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:183)
         at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:173)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:168)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:160)
         at java.awt.EventDispatchThread.run(EventDispatchThread.java:121)
    Caused by: weblogic.xml.query.exceptions.XQuerySagaException: {bea-err}UPD003: Update failure: mixed outcome, update error dispatched (updateid=SAGA_64f908a41cb48aeb:51ff32b4:1233d8ca55c:-7fdb): com.bea.dsp.wrappers.rdb.exceptions.RDBWrapperException: {bea-err}RDBW0000: Relational wrapper exception. could not find the corresponding sql type for output parameter P_RECORDSET
         at com.bea.ld.server.update.recovery.SagaRecovery.process(SagaRecovery.java:104)
         at com.bea.ld.server.update.recovery.DSPSagaManager.processSagaFailure(DSPSagaManager.java:222)
         at weblogic.xml.query.update.recovery.SagaState$2.afterCompletion(SagaState.java:87)
         at weblogic.xml.query.transaction.TransactionManager.afterCompletion(TransactionManager.java:134)
         at weblogic.transaction.internal.ServerSCInfo.doAfterCompletion(ServerSCInfo.java:1032)
         at weblogic.transaction.internal.ServerSCInfo.callAfterCompletions(ServerSCInfo.java:1011)
         at weblogic.transaction.internal.ServerTransactionImpl.callAfterCompletions(ServerTransactionImpl.java:2990)
         at weblogic.transaction.internal.ServerTransactionImpl.afterRolledBackStateHousekeeping(ServerTransactionImpl.java:2871)
         at weblogic.transaction.internal.ServerTransactionImpl.setRolledBack(ServerTransactionImpl.java:2847)
         at weblogic.transaction.internal.ServerTransactionImpl.globalRetryRollback(ServerTransactionImpl.java:3087)
         at weblogic.transaction.internal.ServerTransactionImpl.globalRollback(ServerTransactionImpl.java:2837)
         at weblogic.transaction.internal.ServerTransactionImpl.internalRollback(ServerTransactionImpl.java:400)
         at weblogic.transaction.internal.ServerTransactionImpl.rollback(ServerTransactionImpl.java:378)
         at weblogic.xml.query.transaction.TransactionHelper.rollback(TransactionHelper.java:102)
         at weblogic.xml.query.transaction.TransactionManager.teardownOnFailure(TransactionManager.java:264)
         at com.bea.ld.EJBRequestHandler.handleThrowable(EJBRequestHandler.java:829)
         at com.bea.ld.EJBRequestHandler.invokeOperation(EJBRequestHandler.java:326)
         at com.bea.ld.ServerBean.executeOperationStreaming(ServerBean.java:84)
         at com.bea.ld.Server_ydm4ie_EOImpl.executeOperationStreaming(Server_ydm4ie_EOImpl.java:426)
         at com.bea.ld.Server_ydm4ie_EOImpl_WLSkel.invoke(Unknown Source)
         at weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:589)
         at weblogic.rmi.cluster.ClusterableServerRef.invoke(ClusterableServerRef.java:230)
         at weblogic.rmi.internal.BasicServerRef$1.run(BasicServerRef.java:477)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:363)
         at weblogic.security.service.SecurityManager.runAs(Unknown Source)
         at weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:473)
         at weblogic.rmi.internal.wls.WLSExecuteRequest.run(WLSExecuteRequest.java:118)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    Caused by: com.bea.dsp.wrappers.rdb.exceptions.RDBWrapperException: {bea-err}RDBW0000: Relational wrapper exception. could not find the corresponding sql type for output parameter P_RECORDSET
         at com.bea.dsp.wrappers.rdb.exceptions.RDBWrapperException.create(RDBWrapperException.java:89)
         at weblogic.xml.query.exceptions.XQueryException.create(XQueryException.java:127)
         at weblogic.xml.query.exceptions.XQueryException.create(XQueryException.java:87)
         at weblogic.xml.query.exceptions.XQueryException.create(XQueryException.java:151)
         at weblogic.xml.query.exceptions.XQueryException.create(XQueryException.java:139)
         at com.bea.dsp.wrappers.rdb.runtime.ProcedureCallableWrapper.prepareQuery(ProcedureCallableWrapper.java:272)
         at com.bea.dsp.wrappers.rdb.runtime.ProcedureIterator.getNextToken(ProcedureIterator.java:175)
         at com.bea.dsp.wrappers.rdb.runtime.ProcedureIterator.fetchNext(ProcedureIterator.java:125)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:104)
         at weblogic.xml.query.runtime.core.ExecutionWrapper.fetchNext(ExecutionWrapper.java:88)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:104)
         at weblogic.xml.query.runtime.util.TokenBuffer.addAll(TokenBuffer.java:368)
         at weblogic.xml.query.update.runtime.Interpreter$Frame.setValueUsingTemporary(Interpreter.java:429)
         at weblogic.xml.query.update.runtime.Interpreter.processMSet(Interpreter.java:271)
         at weblogic.xml.query.update.runtime.Interpreter.run(Interpreter.java:108)
         at weblogic.xml.query.update.rewriting.UVMUtils$3.run(UVMUtils.java:130)
         at weblogic.xml.query.update.recovery.SagaInterpreter.run(SagaInterpreter.java:41)
         at weblogic.xml.query.update.runtime.UVMRewritingEvaluatorImpl.run(UVMRewritingEvaluatorImpl.java:32)
         at weblogic.xml.query.xdbcimpl.XQSEStatementImpl.execute(XQSEStatementImpl.java:103)
         at com.bea.ld.server.XQueryInvocation.execute(XQueryInvocation.java:752)
         at com.bea.ld.EJBRequestHandler.invokeQueryInternal(EJBRequestHandler.java:624)
         at com.bea.ld.EJBRequestHandler.invokeOperationInternal(EJBRequestHandler.java:478)
         at com.bea.ld.EJBRequestHandler.invokeOperation(EJBRequestHandler.java:323)
         ... 12 more
    Edited by: user10592709 on Aug 21, 2009 11:41 AM

    First (and somewhat unrelated) - why the "update" errors? Aren't you just trying to read? You should probably be using "function" instead of "procedure" in your data service; you should only use a procedure (XQSE) if you are updating. Not that this is related to your problem - it's just something you should know.
    Now for the REF CURSOR. You need to figure out what columns are being returned, then define a schema to match, and use that schema as the definition. If you are lucky, the REF CURSOR will actually be a cursor from a select on a table. If that is the case, you can create a new physical data service based on that table, which results in a schema being created with the same name as the table. Then create a new physical data service based on the stored procedure; the type of the P_RECORDSET parameter should appear as "Unknown". Edit the parameter types and specify the type from the schema that was generated for the table. A quick stand-alone JDBC check, sketched below, will also show you exactly which columns the cursor returns.
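    Sketch only: the connection details and parameter values below are placeholders, and OracleTypes.CURSOR comes from the Oracle JDBC driver. It calls the procedure directly and prints the column names and types of the REF CURSOR, which is what the matching schema needs to describe.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;

    public class RefCursorColumns {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.OracleDriver");
            // Placeholder connection details - replace with your own
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password")) {
                // Qualify QUERY_ENTITY with its owning package name if needed
                CallableStatement cs = conn.prepareCall("{call QUERY_ENTITY(?, ?, ?, ?)}");
                cs.setString(1, "SOME_TABLE");   // P_TABLE_NAME - sample value
                cs.setLong(2, 1L);               // P_ENTITY_ID  - sample value
                cs.setString(3, "SOME_USER");    // P_USER       - sample value
                cs.registerOutParameter(4, oracle.jdbc.OracleTypes.CURSOR);  // P_RECORDSET
                cs.execute();
                try (ResultSet rs = (ResultSet) cs.getObject(4)) {
                    ResultSetMetaData md = rs.getMetaData();
                    // These are the columns the schema element for the data service must cover
                    for (int i = 1; i <= md.getColumnCount(); i++) {
                        System.out.println(md.getColumnName(i) + " : " + md.getColumnTypeName(i));
                    }
                }
            }
        }
    }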

  • Data warehouse Admin Console - MySQL Physical Data Source

    We have some tables on a MySQL database that we'd like to include in our ETL process. The data is to be loaded into our Oracle 11g R1 data warehouse after the transformations are completed.
    Everything works except adding a physical data source for the MySQL database in the DAC (Data Warehouse Administration Console): there is no connection type for this database, and I don't see how it is set up in the documentation.
    Can someone explain what needs to be done or direct me to the docs? Or, can this be done?
    Does it have to be done at all? The connection details are stored in the odbc.ini file on the server, so that is managed by Informatica. Will the workflow call from the DAC be sufficient?
    Thanks,
    LWatkins
    DAC 10.1.3.4.0.20080729.2025
    Informatica PowerCenter 8.1.1
    AIX 5.3/64Bit
    OBIA 7.9.5
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit
    MySQL 5.5
    Edited by: LWatkins on Mar 25, 2013 12:32 PM

    I would suggest this will only work if you are able to connect to MySQL through Informatica:
    1) Create the connection in Workflow Manager, use that same connection in the sessions, and avoid taking the connection parameters from DAC
    2) Create the same connection in DAC, pointing to Oracle, MSSQL or DB2
    3) Add the tasks in DAC, but do not add source and target tables
    Another workaround is to handle full loads using DAC system variables, or any other mechanism.
    If this helps, please mark it.
    Edited by: Srini VEERAVALLI on Mar 25, 2013 4:49 PM

  • Recommended throughput for Oracle data warehouse

    Hi, I know up front this is going to be a vague question...but I'm trying to determine approximate I/O bandwidth for a data mart server. Right now we're hosting 3 or 4 different marts on it, but that number is going to increase.
    Oracle's DW "2 day" class recommends starting either from the maximum throughput required by user queries, or from the batch windows. Right now the server is barely used for end-user queries, as we haven't yet implemented a BI tool to give users easy access (that's underway right now), so I find it hard to base anything on that. However, it's on the way, and I'm in charge of the BI tool (OBIEE). I'm having nightmares that we get OBIEE deployed and our queries end up taking 5 minutes each to answer. Right now, with the system basically to myself, a simple "select sum(amount) from fact_ledger", where fact_ledger is a 1 GB table (with 40 million rows), takes almost a full minute to run. It feels like I could add this up by hand and get an answer faster... and this certainly doesn't compare with other Oracle marts/DWs I've worked on in the past.
    From a batch window standpoint, all I can say is that it feels really, REALLY too slow to me. Right now, jobs that start with a 40 million row table, join it to 6 or 7 other small tables (looking up surrogate keys), and write to a non-logged, non-indexed output table take over 2.5 hours to complete. To me this should be a 15 minute job.
    We've asked IT to do a "root cause analysis" of why performance is so bad - but as part of that, the architecture group wants something more concrete than "it just feels way too slow". So does anyone have some general guidelines they can provide? I guess our detailed info would be:
    - three marts, each of which has a fact table around the 30 - 60 million row level
    - simple "join 30 million row staging table to look up surrogate keys" and writing results is taking 2.5+ hours
    - we expect at some point to have maybe 50-100 users running queries concurrently (spread across the marts)
    - users will be performing both canned and ad-hoc analysis against it... and they are high-level business users who aren't going to be happy waiting 2 minutes for a simple answer
    My start was to swag this as requiring 6 CPUs or so, which (according to Oracle's best practice docs) would indicate needing somewhere between 1.2 GB/s and 2.4 GB/s of throughput. I'm assuming that if it takes almost a full minute to read a 1 GB table, our I/O is currently 60 to 120 times too slow. Does that make sense?
    Thanks and sorry for the lack of details...we just don't know yet.
    Thx,
    Scott

    Why don't you start by taking an AWR report from those two hours so you can see what is the bottleneck for your system ?
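    As a rough sanity check of the arithmetic in the question (taking the 1.2-2.4 GB/s target derived there from Oracle's per-CPU guidance):

    $$\frac{1\ \text{GB}}{60\ \text{s}} \approx 17\ \text{MB/s}, \qquad \frac{1.2\ \text{GB/s}}{17\ \text{MB/s}} \approx 70, \qquad \frac{2.4\ \text{GB/s}}{17\ \text{MB/s}} \approx 140$$

    so the observed scan rate is roughly two orders of magnitude below the target, the same ballpark as the 60-120x estimate in the question.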

  • OLE for transfer data to excel

    Hi
    Can anyone send me the code to use OLE for sending data from a multi-record form to Excel?
    I have never used OLE.
    I am using the DDE package for this task, but it is very slow.
    regards,
    adeel

    Hi Adeel
    Here's some code to dump the contents of a block into Excel. You may want to uncomment the commented-out sections if you want to be able to save the workbook that gets created. (I copied it a while back from this forum, so I am not claiming any credit for it, or for its lack of readability!)
    Best regards
    Andrew
    PROCEDURE pr_Forms_to_Excel(p_block_name IN VARCHAR2 DEFAULT NAME_IN('system.current_block')) IS
      -- Declare the OLE objects
      application OLE2.OBJ_TYPE;
      workbooks   OLE2.OBJ_TYPE;
      workbook    OLE2.OBJ_TYPE;
      worksheets  OLE2.OBJ_TYPE;
      worksheet   OLE2.OBJ_TYPE;
      cell        OLE2.OBJ_TYPE;
      range       OLE2.OBJ_TYPE;
      range_col   OLE2.OBJ_TYPE;
      -- Declare handles to OLE argument lists
      args OLE2.LIST_TYPE;
      -- Declare form and block items
      form_name  VARCHAR2(100);
      f_block    VARCHAR2(100);
      l_block    VARCHAR2(100);
      f_item     VARCHAR2(100);
      l_item     VARCHAR2(100);
      cur_block  VARCHAR2(100) := NAME_IN('system.current_block');
      cur_item   VARCHAR2(100) := NAME_IN('system.current_item');
      cur_record VARCHAR2(100) := NAME_IN('system.cursor_record');
      item_name  VARCHAR2(100);
      baslik     VARCHAR2(100);
      row_n      NUMBER;
      col_n      NUMBER;
      filename   VARCHAR2(100);
    BEGIN
      -- Start Excel
      application := OLE2.CREATE_OBJ('Excel.Application');
      OLE2.SET_PROPERTY(application, 'Visible', 'True');
      -- Return object handle to the Workbooks collection
      workbooks := OLE2.GET_OBJ_PROPERTY(application, 'Workbooks');
      -- Add a new Workbook object to the Workbooks collection
      workbook := OLE2.GET_OBJ_PROPERTY(workbooks, 'Add');
      -- Return object handle to the Worksheets collection for the Workbook
      worksheets := OLE2.GET_OBJ_PROPERTY(workbook, 'Worksheets');
      -- Get the first Worksheet in the Worksheets collection
      -- worksheet := OLE2.GET_OBJ_PROPERTY(worksheets, 'Add');
      args := OLE2.CREATE_ARGLIST;
      OLE2.ADD_ARG(args, 1);
      worksheet := OLE2.GET_OBJ_PROPERTY(worksheets, 'Item', args);
      OLE2.DESTROY_ARGLIST(args);
      -- Navigate to the block to export and find its first and last items
      go_block(p_block_name);
      baslik := get_block_property(p_block_name, FIRST_ITEM);
      f_item := p_block_name || '.' || get_block_property(p_block_name, FIRST_ITEM);
      l_item := p_block_name || '.' || get_block_property(p_block_name, LAST_ITEM);
      first_record;
      LOOP
        item_name := f_item;
        row_n     := NAME_IN('SYSTEM.CURSOR_RECORD');
        col_n     := 1;
        LOOP
          IF get_item_property(item_name, ITEM_TYPE) <> 'BUTTON' AND
             get_item_property(item_name, VISIBLE) = 'TRUE'
          THEN
            -- Set the first row with the item names
            IF row_n = 1 THEN
              baslik := NVL(get_item_property(item_name, PROMPT_TEXT), baslik);
              args := OLE2.CREATE_ARGLIST;
              OLE2.ADD_ARG(args, row_n);
              OLE2.ADD_ARG(args, col_n);
              cell := OLE2.GET_OBJ_PROPERTY(worksheet, 'Cells', args);
              OLE2.DESTROY_ARGLIST(args);
              OLE2.SET_PROPERTY(cell, 'Value', baslik);
              OLE2.RELEASE_OBJ(cell);
            END IF;
            -- Set the other rows with the item values
            args := OLE2.CREATE_ARGLIST;
            OLE2.ADD_ARG(args, row_n + 1);
            OLE2.ADD_ARG(args, col_n);
            cell := OLE2.GET_OBJ_PROPERTY(worksheet, 'Cells', args);
            OLE2.DESTROY_ARGLIST(args);
            IF get_item_property(item_name, DATATYPE) <> 'NUMBER' THEN
              OLE2.SET_PROPERTY(cell, 'NumberFormat', '@');
            END IF;
            OLE2.SET_PROPERTY(cell, 'Value', name_in(item_name));
            OLE2.RELEASE_OBJ(cell);
          END IF;
          IF item_name = l_item THEN
            EXIT;
          END IF;
          baslik    := get_item_property(item_name, NEXTITEM);
          item_name := p_block_name || '.' || get_item_property(item_name, NEXTITEM);
          col_n     := col_n + 1;
        END LOOP;
        EXIT WHEN NAME_IN('system.last_record') = 'TRUE';
        NEXT_RECORD;
      END LOOP;
      -- Autofit columns
      range     := OLE2.GET_OBJ_PROPERTY(worksheet, 'UsedRange');
      range_col := OLE2.GET_OBJ_PROPERTY(range, 'Columns');
      OLE2.INVOKE(range_col, 'AutoFit');
      OLE2.RELEASE_OBJ(range);
      OLE2.RELEASE_OBJ(range_col);
      -- Get filename and path
      args := OLE2.CREATE_ARGLIST;
      OLE2.ADD_ARG(args, p_block_name);
      OLE2.ADD_ARG(args, 'Excel Workbooks (*.xls), *.xls');
      filename := OLE2.INVOKE_CHAR(application, 'GetSaveAsFilename', args);
      OLE2.DESTROY_ARGLIST(args);
      -- Save the worksheet if a filename was chosen
      IF NVL(filename, '0') <> '0' THEN
        args := OLE2.CREATE_ARGLIST;
        OLE2.ADD_ARG(args, filename);
        OLE2.INVOKE(worksheet, 'SaveAs', args);
        OLE2.DESTROY_ARGLIST(args);
      END IF;
      -- Close workbook
      -- OLE2.INVOKE(workbook, 'Close');
      -- Release the OLE objects
      OLE2.RELEASE_OBJ(worksheet);
      OLE2.RELEASE_OBJ(worksheets);
      OLE2.RELEASE_OBJ(workbook);
      OLE2.RELEASE_OBJ(workbooks);
      -- OLE2.INVOKE(application, 'Quit');
      OLE2.RELEASE_OBJ(application);
      -- Return focus to the original location
      go_block(cur_block);
      go_record(cur_record);
      go_item(cur_block || '.' || cur_item);
    END;
