Import/export related to performance issues

Hello,
my database has poor performance. If I export the schema with its data, recreate it, and then import the data, will I gain any benefit? At least regarding space usage, I hope so.
Any suggestions?
Regards,

will I have any benefit?
I don't think so.
You first have to find the root cause.
Perhaps a simple dbms_stats.xy call may help, or your instance is badly configured, or the SQL statements are poorly written, or ...
What does "poor performance" mean exactly?
Dim
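
For reference, a minimal sketch of the kind of dbms_stats call suggested above (the schema name and options are placeholders, not taken from the thread):

    -- Gather fresh optimizer statistics for one schema so the optimizer works
    -- with accurate row counts and column data; stale statistics are a common
    -- cause of sudden "poor performance".
    BEGIN
       DBMS_STATS.GATHER_SCHEMA_STATS(
          ownname          => 'MY_SCHEMA',                     -- placeholder schema name
          estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
          cascade          => TRUE);                           -- also refresh index statistics
    END;
    /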

Similar Messages

  • Perform Client import/export (SCC8) with different release components between servers

    Dear All
    Is it possible to perform a client import/export (SCC8) between servers with different release components?
    The current state of the two systems is as follows:
    Source Server     : SAP ECC6.0, Component SAP_APPL, release 602 level 16
    Target Server     : SAP ECC6.0, Component SAP_APPL, release 600 level 24.
    This problem happens because we were unable to downgrade the release and apply the latest support pack for SAP_APPL release 600.
    Thank You, your help is much appreciated.

    No..

  • Brand new with performance issues -- programs not responding, system hanging

    Just this morning I began migrating from a Dell Inspiron 1525 laptop that has been generally reliable for the better part of 3 years (monitor housing is literally falling apart but still works great... check E-Bay later for spare parts! lol)... migrating to a Toshiba Satellite C655D-S5136 laptop running Windows 7 64-bit.  It's literally right out of the box 5 hours ago and I've been reinstalling programs, transferring files, etc.
    The problem is that I'm already noticing a lot of performance issues.  The system (and/or programs) seem to get caught often... resulting in frequent "Program not responding" messages (some of which the system manages to bypass)... and a few of the "This program closed unexpectedly" or "This program stopping responding... do you want to check online for a solution or just close the program?" messages.
    This is happening on multiple programs... tax prep software TaxAct update dialog hangs after 256 bytes downloaded every time, AVG Anti-Virus was hanging although never gave me a not responding message, Internet Explorer has stopped responding a couple times, Windows Live Mail did often, especially when I was using IMAP (since changed to POP3 and problem seems to have resolved but still).  Even opening the start menu a few seconds after plugging in an external hard drive lagged the system like hell for a full two minutes until it caught up with itself.
    This is unacceptable in a brand new machine.  I was getting a few errors like this on my Dell every now and then but I chalked it up to an old machine that had never been wiped and had a failing hard disk.  I understand the computer moving slower, especially as I've been installing a lot, running antivirus, downloading all at once, but no way programs should simply stop responding this often.  I could do twice the activity on my Dell and have no issues.
    I've uninstalled several of the preloaded programs... the antivirus, quickbooks, etc.  I've also disabled and/or uninstalled several of the other programs like Toshiba Reeltime and Toshiba Zoom Manager or whatever that was.  Nothing has seemed to help.  And that's not surprising... the speed is great, it's that programs are getting caught so often, sometimes every time or every other time I open up a program.
    Is this a Windows 7 thing? (This is my first time using Windows 7, coming from Vista.)  Is it a Satellite thing?
    Anyone have any suggestions?  Is it possible there's just some component that's bad?  Processor, RAM, etc.?
    Thanks... very frustrated!

    I may have found a solution. I changed my startup programs and stopped some of the Toshiba software from running at startup. The ones I stopped are listed below. I have used the laptop heavily for two days now and have not had any problems. It actually runs very fast. I hope this will help.
    Toshiba Reel Time
    Toshiba Sleep
    Toshiba PC Health Monitor
    Toshiba WebCamera App.
    Toshiba Power Saver
    Toshiba Message Center
    Toshiba Service Station
    SmartFaceVWatcher
    Toshiba Online Backup
    Toshiba Zooming Utility
    Toshiba eco Utility
    Toshiba App Place 
    Message Center
    For people that don't know how to change these, here are the instructions.
    Click Start > Type msconfig in the search box > open msconfig > click on the startup tab > remove the checkmarks next to the above programs > restart computer.
    Doing this may not fix your problems, but it did fix mine.

  • Problem with imported .AI into Flash (performance issue/frame lag)

    Hi Flash people, I have a problem.
    Background:
    I am creating a Flash file (old-school frame animation with a little AS2) to be used on big screens (in a children's library).
    I am importing different files (some PNG, some SWF images and some .AI out of Illustrator). Here is the problem:
    The file is BIG, 1920x1200 px (non-scalable). All works fine until the movie hits the imported .AI file(s), which are
    browser views remade in full vector in Illustrator at 1200x800 px. I have a simple image transition from a PNG image
    to the browser view (this view is pretty similar to a Facebook front page in terms of content and text/image distribution).
    The transition is 20-25 frames (at 31 fps) and works fine with the PNG, but when the .AI frame is loaded the movie slows down
    by a factor of 10 and it takes several seconds (5-8 sec.) to play through the 1 sec transition.
    I have tried to export the same AI page as SWF out of Illustrator and the result is basically the same: extreme frame lag.
    How do I get around this?
    I know 1920x1200 px is HUGE for a fullscreen transition/fade, but then again... it should work, right?
    Any ideas? Does any other vector format do a better job? (And yes, it HAS to be vector, as I am also doing some looking-glass effects.)
    This is really frustrating and the first time since '96 that I have hit the wall with Flash :-/ I have tried everything I know; perhaps I am overlooking the obvious?
    best
    Mobius6 :-)

    Anyone?      :-)

  • Import/export schema with ODI

    Hi all,
    do you know if, with ODI 11g, I can import a full schema from one Oracle database into another Oracle database?
    Thanks
    Erika

    Why don't you use the exp/imp commands to export the schema from the source DB and import it into the target DB?
    1) Export the 11g ODI repository schemas to DMP files.
    Example: my 11g schemas are "ODI_MASTER" and "ODI_WORK".
    exp userid=odi_master/odi_master file=c:\odi_master.dmp
    exp userid=odi_work/odi_work file=c:\odi_work.dmp
    2) Import the dump files into new schemas in the new database (or is it the same database? In my case, it was a different one).
    imp userid=SYSTEM/password touser=odi_master fromuser=odi_master file=c:\odi_master.dmp
    imp userid=SYSTEM/password touser=odi_work fromuser=odi_work file=c:\odi_work.dmp
    --nayan
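    On 11g, Data Pump is the usual alternative to classic exp/imp; a rough sketch with the same schema names (the DATA_PUMP_DIR directory object and the SYSTEM password are assumptions, not from the thread):
    expdp system/password schemas=ODI_MASTER,ODI_WORK directory=DATA_PUMP_DIR dumpfile=odi_repos.dmp logfile=odi_exp.log
    impdp system/password schemas=ODI_MASTER,ODI_WORK directory=DATA_PUMP_DIR dumpfile=odi_repos.dmp logfile=odi_imp.log
    When importing under different schema names, impdp's remap_schema parameter (for example remap_schema=ODI_MASTER:ODI_MASTER_NEW) replaces the fromuser/touser pair used above.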

  • Objects or Methods? Which is best with regard to performance?

    Good morning,
    I woke up this morning with a question in my mind. What is the best thing: to have objects inside another object, or to have methods?
    Okay the question may sound confusing, so I will simplify it with an example. Imagine having an object of type Client. This object needs to load all the client contact details from a file (simple text file).
    Now, what would be the best solution: to have the following code:
    public class Client {
        private Details details;
        public Client() {
            this.details = this.loadDetails();
        }
        public Details loadDetails() {
            // ... load the details from the text file
        }
    }
    OR
    public class Client {
        private Details details;
        private DetailsLoader loader;
        public Client() {
            this.loader = new DetailsLoader();
            this.details = loader.loadDetails();
        }
    }
    It is very clear that in the first code segment I am using methods inside the same class to load the details of the client, while in the second code segment I am using another object to load these details.
    OOP-wise, the second approach is better. But what if I have thousands or millions of clients registered in the system? That would mean that, in memory, each one of these objects (1000, for example) would have a DetailsLoader in memory as well (thus becoming 2000 objects)! The more the numbers increase, the worse the system will perform.
    I am asking this question because I want to develop something that is still strictly OOP, but that is not so much of a heavy weight on the system.
    Any opinion on this is more than welcome :)
    Regards,
    Sim085
    Message was edited by:
    sim085 to correct code indentation.

    Thanks for the posts :)
    I am sorry my example was not clear enough. However, all I wanted was to show the effects of having an instance of an object inside another object. I think the best solution would be to create an interface of type Client and then, if the file needs to be loaded from an XML, I would create a class XMLClient that implements Client (please note I am keeping the stupid example, which most probably does not make sense).
    I did that example to show the difference that will happen in JVM memory depending on the implementation.
    Imagine having an object A that has a method load() to load the data from an XML file. Then in memory, when creating a new instance of type A, I will have the object A defined on the heap, and the pointer to A on the stack.
    If instead I have an object A that has an object B with the method load(), then if B is initialized from A, in memory I will have objects A and B on the heap, and pointers to both objects on the stack. If instead object B is initialized from somewhere else and passed to A as a parameter, then I can have many instances of A while having one instance of B.
    I do not know if anyone understood the above. But what I am trying to say is that the different implementations, although at the end of the day they will all lead to the same result, affect the JVM in different ways behind the scenes!
    I know that the best solution is to feed B to A, so that I have one big object in memory loaded only once :)
    Was it "where am I?" ;-) No, actually it was... "shit I'm still here :D" :)
    regards,
    sim085

  • Help with performance issues

    Hi peeps, looking for some pointers on performance after installing a fresh copy of Windows 7 Starter. I have upgraded the RAM to 2 GB, but booting still takes 15 minutes and running the machine is incredibly slow. I have an S10-3 model with an Intel Atom, which I don't expect to be a powerhouse, but surely it should be usable. Any help would be great, thanks.

    Hello
    I would suggest you start by looking at the startup programs and deselecting those you think you may not need at boot time.
    Cheers and regards,
    • » νιנαソѕαяα∂нι ѕαмανє∂αм ™ « •
    ●๋•کáŕádhí'ک díáŕý ツ
    I am a volunteer here. I don't work for Lenovo

  • Help with Performance issue with a view

    Hi
    We developed a custom view to get the data from the gl_je_lines table with the source as Payables. We are bringing in the data for last year and the current year to date, i.e., from 01-JAN-2012 to SYSDATE. This view is used in a package body, which is called from a concurrent program to write the data to an outbound file.
    The problem I am facing is that this view fetches around 7.2 million (72 lakh) records for the above date range, and the program runs for a long time and then completes abruptly without any result. Can anyone please let me know if there is an alternative to this? I checked the view query and there seems to be not much scope to improve its performance.
    Will inserting all this data into a global temporary table help? Please reply at the earliest, as this solution is very urgent for our clients.
    Message was edited by: 988490e8-2268-414d-b867-9d9a911c0053

    This is the view query:
    select GCC.SEGMENT1 "EMPRESA",
           GCC.SEGMENT2 "CCUSTO",
           GCC.SEGMENT3 "CONTA",
           GCC.SEGMENT4 "PRODUTO",
           GCC.SEGMENT5 "SERVICO",
           GCC.SEGMENT6 "CANAL",
           GCC.SEGMENT7 "PROJECT",
           GCC.SEGMENT8 "FORWARD1",
           GCC.SEGMENT9 "FORWARD2",
           FFVT.DESCRIPTION "CONTA_DESCR",
           ltrim(substr(XRX_CONSOLIDATION_MAPPING('XDMO_LOCAL_USGAAP_LEGAL_ENTITY',
                                                  'XDMO_REPORT_USGAAP_LEGAL_COMPANY',
                                                  GCC.SEGMENT1,
                                                  GCC.SEGMENT3),
                        1,
                        80)) "LEGAL_COMPANY",
           ltrim(substr(XRX_CONSOLIDATION_MAPPING('XDMO_LOCAL_USGAAP_ACCOUNT',
                                                  'XDMO_REPORT_USGAAP_FIN_ACCOUNT',
                                                  GCC.SEGMENT3,
                                                  GCC.SEGMENT3),
                        1,
                        80)) "GRA",
           ltrim(substr(XRX_CONSOLIDATION_MAPPING('XDMO_LOCAL_USGAAP_BUDGET_CENTER',
                                                  'XDMO_REPORT_USGAAP_RESPONSIBILITY',
                                                  GCC.SEGMENT2,
                                                  GCC.SEGMENT3),
                        1,
                        80)) "RESP",
           ltrim(substr(XRX_CONSOLIDATION_MAPPING('XDMO_LOCAL_USGAAP_PRODUCT',
                                                  'XDMO_REPORT_USGAAP_TEAM',
                                                  GCC.SEGMENT4,
                                                  GCC.SEGMENT3),
                        1,
                        80)) "TEAM",
           ltrim(substr(XRX_CONSOLIDATION_MAPPING('XDMO_LOCAL_USGAAP_ACCOUNT',
                                                  'XDMO_REPORT_USGAAP_FIN_ACCOUNT',
                                                  GCC.SEGMENT3,
                                                  GCC.SEGMENT3),
                        164,
                        80)) "GRA_DESCR",
           GJH.NAME "IDLANC",
           GJS.USER_JE_SOURCE_NAME "ORIGEM",
           GJC.USER_JE_CATEGORY_NAME "CATEGORIA",
           GJL.DESCRIPTION "DESCRICAO",
           decode(GJH.JE_SOURCE, 'Payables', GJL.REFERENCE_2, '') "INVOICE_ID",
           decode(GJH.JE_SOURCE, 'Payables', GJL.REFERENCE_5, '') "NOTA",
           decode(GJH.JE_SOURCE, 'Payables', GJL.REFERENCE_1, '') "FORNECEDOR",
           GJH.DEFAULT_EFFECTIVE_DATE "DTEFET",
           to_char(GJB.POSTED_DATE, 'DD-MON-YYYY HH24:MI:SS') "DTPOSTED",
           GJH.CURRENCY_CONVERSION_TYPE "TPTAX",
           substr(GCC.SEGMENT9, 8, 1) "TAXA",
           GJH.CURRENCY_CONVERSION_DATE "DTCONV",
           -- nvl(GJL.ACCOUNTED_DR,0)-nvl(GJL.ACCOUNTED_CR,0)       "VALOR",
           -- added as per ITT #517830
           nvl(GJL.ENTERED_DR, 0) - nvl(GJL.ENTERED_CR, 0) "VALOR",
           GJH.CURRENCY_CODE "MOEDA",
           --  decode(gcc.segment9, '00000000', 0, '00000001', nvl(GJL.ACCOUNTED_DR,0)-nvl(GJL.ACCOUNTED_CR,0)) "VALOR_FUNCIONAL",
           -- added as per ITT #517830
           (nvl(GJL.ACCOUNTED_DR, 0) - nvl(GJL.ACCOUNTED_CR, 0)) "VALOR_FUNCIONAL",
           GSOB.CURRENCY_CODE "FUNCIONAL",
           GJH.PERIOD_NAME "PERIODO",
           GJB.STATUS "STATUS",
           GSOB.SHORT_NAME "LIVRO",
           GJL.LAST_UPDATE_DATE "JL_LAST_UPDATE_DATE",
           GJH.LAST_UPDATE_DATE "JH_LAST_UPDATE_DATE",
           GJB.LAST_UPDATE_DATE "JB_LAST_UPDATE_DATE",
           GJL.JE_HEADER_ID "JE_HEADER_ID",
           GJL.JE_LINE_NUM "JE_LINE_NUM"
      from GL.GL_JE_LINES   GJL,
           GL.GL_JE_HEADERS GJH,
           GL.GL_JE_BATCHES GJB,
           --GL.GL_SETS_OF_BOOKS                    GSOB, ---As GL_SETS_OF_BOOKS table dropped in R12 so replaced with GL_LEDGERS table,Commented as part of DMO R12 Upgrade-RFC#411290.
           GL.GL_LEDGERS               GSOB, ---Added as part of DMO R12 Upgrade-RFC#411290.
           GL.GL_JE_SOURCES_TL         GJS,
           GL.GL_JE_CATEGORIES_TL      GJC,
           GL.GL_CODE_COMBINATIONS     GCC,
           APPLSYS.FND_FLEX_VALUES_TL  FFVT,
           APPLSYS.FND_FLEX_VALUES     FFV,
           APPLSYS.FND_FLEX_VALUE_SETS FFVS
    where GJL.CODE_COMBINATION_ID = GCC.CODE_COMBINATION_ID
       and GJL.JE_HEADER_ID = GJH.JE_HEADER_ID
       and GJH.JE_BATCH_ID = GJB.JE_BATCH_ID
          --and GJB.SET_OF_BOOKS_ID = GSOB.SET_OF_BOOKS_ID    ---Changing the mappings between the tables GL_JE_HEADERS and GL_JE_BATCHES As column SET_OF_BOOKS_ID of table GL_JE_BATCHES dropped in R12,Commented as part of DMO R12 Upgrade-RFC#411290.
       and GJH.LEDGER_ID = GSOB.LEDGER_ID ---Added as part of DMO R12 Upgrade-RFC#411290.
       and GJH.JE_SOURCE = GJS.JE_SOURCE_NAME
       and GJH.JE_CATEGORY = GJC.JE_CATEGORY_NAME
       and GCC.SEGMENT3 = FFV.FLEX_VALUE
       and FFV.FLEX_VALUE_ID = FFVT.FLEX_VALUE_ID
       and FFV.FLEX_VALUE_SET_ID = FFVS.FLEX_VALUE_SET_ID
       and FFVS.FLEX_VALUE_SET_NAME = 'XDMO_LOCAL_USGAAP_ACCOUNT'
       and GSOB.SHORT_NAME in ('XBRA BRL LOCAL GAAP', 'XBRA BRL USGAAP')
       and gcc.chart_of_accounts_id = gsob.chart_of_accounts_id
       and gjh.actual_flag = 'A'
    DB version: 11.2.0.3.0
    The problem I am facing is that the above query fetches a huge amount of data, and I want to know if there is any way to improve its performance. You are right that the view is stored in the DB; I am using this view query in a cursor to fetch the records.
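    To illustrate the global temporary table idea raised in the question, here is a minimal sketch (the staging table name and the reduced column list are placeholders; the real view selects many more columns):
    -- Rows in a GTT are private to the session and disappear when it ends,
    -- so the concurrent program can stage the raw journal lines once and then
    -- build the outbound file from the staged rows.
    CREATE GLOBAL TEMPORARY TABLE xx_gl_outbound_stg (
       je_header_id  NUMBER,
       je_line_num   NUMBER,
       valor         NUMBER,
       periodo       VARCHAR2(15)
    ) ON COMMIT PRESERVE ROWS;
    INSERT INTO xx_gl_outbound_stg (je_header_id, je_line_num, valor, periodo)
    SELECT gjl.je_header_id,
           gjl.je_line_num,
           NVL(gjl.entered_dr, 0) - NVL(gjl.entered_cr, 0),
           gjh.period_name
      FROM gl.gl_je_lines   gjl,
           gl.gl_je_headers gjh
     WHERE gjh.je_header_id = gjl.je_header_id
       AND gjh.actual_flag = 'A'
       AND gjh.default_effective_date >= DATE '2012-01-01';  -- apply the date filter during the load
    Staging alone will not make the per-row XRX_CONSOLIDATION_MAPPING calls any cheaper, so whether this helps depends on where the time is actually being spent.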

  • Macro to import/export data with MS Access

    The company I work for has a series of MS Word documents that use a macro button (a button in MS Word) to communicate with MS Access. The Word document receives a tracking number and gives MS Access two fields of information, such as name and date. The Word document then saves that tracking number, and Access saves the name and date information.
    Is such a macro possible with Adobe LiveCycle documents? I have replaced the Word documents with LiveCycle-designed PDFs. The PDFs are amazing, but the macro is the last piece to bridge the gap. Any comments or suggestions? From some light Adobe reading it appears possible. Thanks as always for your help.
    I have found the MS Word file macro. Is this convertible to JavaScript?

    take a glance at what these guys do: http://www.openbusinessengine.org/docs/guide.html

  • Performance issues with APEX reports in version 3.1

    Hello All,
    I am using APEX 3.1 on Oracle 10g.
    I am facing performance issues with APEX. I am generating interactive reports, and the number of records is huge, running to 30-40 thousand records, and the report takes almost 30 minutes.
    How can I improve the performance of this kind of report? I am using APEX collections.
    How does APEX work in terms of retrieving the records?
    Please let me know .
    Thanks/kumar
    Edited by: kumar73 on Jun 18, 2010 10:21 AM

    Hello Tony ,
    The following is the sequence of steps to run the test case.
    Note: all the schemas, tables and variables are populated from the database.
    From the Schema and Relations tab, choose the following:
    1)     Select P3I2008Q4 as schema.
    2)     Choose Relation as query path.
    3)     Select ECLA, ECLB, MTAB as relations.
    From Variables choose the following:
    4)     Choose the variables AGE_SEXA,CLODESCA,ALCNO from ECLA relation.
    5)     Choose the variables AGE_SEXB, ALCNO, CLODESCB from ECLB relation.
    6)     Choose the variables EXPNAME, ALCNO, COST_, COST from MTAB relation.
    From Conditions: click the Run Report button; this generated a standard report (total number of records in the report: 30150).
    Click on the Interactive Report button to generate an interactive report. (Error occurred.)
    We are using a return SQL statement to generate the standard report and collections for the interactive report.
    thanks/kumar
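    For reference, a rough sketch of the collection pattern mentioned in the question, so the interactive report reads pre-fetched rows instead of re-running the expensive query on every request (the collection name and the query are placeholders, not taken from the thread):
    -- Populate the collection once per session; the interactive report's source
    -- then selects from the APEX_COLLECTIONS view for this collection name.
    BEGIN
       IF NOT APEX_COLLECTION.COLLECTION_EXISTS(p_collection_name => 'REPORT_ROWS') THEN
          APEX_COLLECTION.CREATE_COLLECTION_FROM_QUERY(
             p_collection_name => 'REPORT_ROWS',
             p_query           => 'select ALCNO, AGE_SEXA, CLODESCA from ECLA');  -- placeholder query
       END IF;
    END;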

  • How to get around a performance issue when dealing with a lot of data

    Hello All,
    This is an academic question really, I'm not sure what I'm going to do with my issue, but I have some options.  I was wondering if anyone would like to throw in their two cents on what they would do.
    I have a report; the users want to see all agreements and all conditions related to the updating of rebates and the affected invoices. From a technical perspective: ENT6038-KONV-KONP-KONA-KNA1. These are the tables I have to hit. The problem is that when they retroactively update rebate conditions they can hit thousands of invoices, which blossoms out to thousands of conditions... you see the problem. I simply have too much data to grab; it times out.
    I've tried everything around the code.  If you have a better way to get price conditions and agreement numbers off of thousands of invoices, please let me know what that is.
    I have a couple of options.
    1) Use shared memory to preload the data for the report.  This would work, but I'm not going to know what data needs to be loaded until report run time. They put in a date; I simply can't preload everything. I don't like this option much.
    2) Write a function module to do this work. When the user clicks on the button to get this particular data, it will launch the FM in background and e-mail them the results. As you know, the background job won't time out. So far this is my favored option.
    Any other ideas?
    Oh... nope, BI is not an option, we don't have it. I know, I'm not happy about it. We do have a data warehouse, but the prospect of working with that group makes me wince.

    My two cents: firstly, I totally agree with Derick that it's probably a good idea to go back to the business and justify their requirement with regard to reporting and "whether any user can meaningfully process all those results in an aggregate". But having dealt with customers across industries over a long period of time, it would probably be a bit fanciful to expect them to change their requirements too much, as in my experience they neither understand (too much) technology nor want to hear about technical limitations of a system etc. They want what they want, if possible yesterday!
    So, about dealing with performance issues within ABAP: I'm sure you must already be using efficient programming techniques like hashed internal tables with unique keys, accessing rows of the table using field symbols and all that, but what I was going to suggest is to look at using [Extracts|http://help.sap.com/saphelp_nw04/helpdata/en/9f/db9ed135c111d1829f0000e829fbfe/content.htm]. I've had to deal with this a couple of times in the past when handling massive amounts of data, and I found it to be very efficient with regard to performance. A good point to remember when using Extracts is that, quoting SAP Help, "The size of an extract dataset is, in principle, unlimited. Extracts larger than 500KB are stored in operating system files. The practical size of an extract is up to 2GB, as long as there is enough space in the filesystem."
    Hope this helps,
    Cheers,
    Sougata.

  • Performance issue with iPhone 4 on iOS 7

    Hello,
    I have been using an iPhone 4 since October 2011. The phone performed exceptionally well when it was running iOS 6. However, since upgrading to iOS 7, I am having all the nonsense experiences of any other Android/Windows device:
    1. Phone hangs very frequently
    2. Response time is ridiculous (even though the phone performs better once I restart it, I cannot do that every now and then)
    3. Applications stop responding and the phone returns to the home screen
    Is this a common problem with all iPhone 4 users, or specifically in India?
    Also, I want to know if there is a possibility to go back to iOS 6 rather than running iOS 7 and suffering these performance issues.

    Prasadhegde wrote:
    is there a possibility to go back to iOS 6
    Downgrading is not Supported by Apple.
    For your Issues...
    Try This...
    Close All Open Apps... Sign Out of your Account... Perform a Reset... Try again...
    Reset  ( No Data will be Lost )
    Press and Hold the Sleep/Wake Button and the Home Button at the Same Time...
    Wait for the Apple logo to Appear...
    Usually takes about 15 - 20 Seconds... ( But can take Longer...)
    Release the Buttons...
    If no joy...
    Reset all settings
    Settings > General > Reset > Reset all Settings.
    This will return all iDevice settings to factory defaults... you will not lose any data.... But you will have to re-enter all of the device settings.
    If the issue persists...
    Connect to iTunes on the computer you usually Sync with and Restore
    http://support.apple.com/kb/HT1414
    Make sure you have the Latest Version of iTunes (v11) Installed on your computer
    iTunes free download from www.itunes.com/download
    More Tips here...
    http://osxdaily.com/2013/09/23/ios-7-slow-speed-it-up/
    http://osxdaily.com/2013/09/19/ios-7-battery-life-fix/
    Note:
    Also consider Deleting any Apps you have Purchased / Downloaded but you now never use.

  • Oracle SOA 11g Performance Issue

    Hi,
    We have set up Oracle SOA Suite in an AIX environment. The Java we are using is IBM JDK 1.6. Recently we have been hit with a performance issue: we frequently get out-of-memory exceptions and need to restart the server, and sometimes physically reboot the machine. Out of 16 GB of RAM we have given 4 GB as heap space to the Admin Server and 7 GB to the SOA Server, but it is taking more than 7 GB as heap space. On stopping or killing both services, the memory is not released.
    SOA Suite Version : 11.1.1.3
    Instance Node: Single Node
    I collected the logs and tried to analyze them in a thread dump analyzer, and I could see objects (Reserved) taking 100% of the CPU utilization.
    The following error is highlighted in the analyzer. About 200+ threads got stuck.
    "HTTPThreadGroup-42" prio=10 tid=0x6382ba28 nid=0x20bf4 waiting on condition [0x6904f000..0x6904fb94]
    at sun.misc.Unsafe.park(Native Method)
    at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:146)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireNanos(AbstractQueuedSynchronizer.java:772)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireNanos(AbstractQueuedSynchronizer.java:1087)
    at java.util.concurrent.SynchronousQueue$Node.waitForPut(SynchronousQueue.java:291)
    at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:443)
    at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:475)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:674)
    at java.lang.Thread.run(Thread.java:595)
    "HTTPThreadGroup-41" prio=10 tid=0x6ae3cce0 nid=0x20bf0 waiting on condition [0x68d8f000..0x68d8fc14]
    at sun.misc.Unsafe.park(Native Method)
    at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:146)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireNanos(AbstractQueuedSynchronizer.java:772)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireNanos(AbstractQueuedSynchronizer.java:1087)
    at java.util.concurrent.SynchronousQueue$Node.waitForPut(SynchronousQueue.java:291)
    at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:443)
    at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:475)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:674)
    at java.lang.Thread.run(Thread.java:595)
    Has anyone faced the same issue? We are badly hit by this performance issue in UAT.
    Please consider this high priority; someone please help us.
    Regards,
    Sundar

    Always raise a case with Oracle Support for such issues.
    Regards,
    Anuj

  • 10.2 Oracle sapdata layout - performance issues?

    Can someone provide the pros and cons of the /oracle layout for an ERP 6.0 production system?
    Is it better to build several additional sapdata directories?
    Initially SAP loads data into sapdata1 through sapdata4...
    Has anyone seen any performance issues using only sapdata1-4, especially with a 3 TB DB running 10.2.0.4?
    Any advice is appreciated...
    Thanks
    Mikie B

    Hello Mikie,
    > Initially SAP loads data into sapdata 1 through sapdata4....
    What do you mean with this?
    > Is it better to build several additional sapdatas.
    This depends on your storage. Normally, if you have high-end SAN storage, it shouldn't matter. I have seen special cases where some disk ranks were overloaded, but normally you should not have to think about that. If you don't have a SAN and your load needs to be spread, that is something different.
    > Has anyone seen any possible performance issues only using sapdata1-4 especially with a 3 TB DB running 10.2.04?
    Our main logistics system is around 3.6 TB and stored in eleven sapdata directories, but this partitioning has nothing to do with performance issues; the reason for it is a limitation at the OS level.
    Regards
    Stefan
    P.S.:
    > Can someone provide the pros cons of /oracle layout for ERP 6.0 production system?
    Every time I extend or create a new tablespace I wonder why SAP creates a subdirectory for every data file by default... absolutely freaky.

  • "argument number too large" in OEM import export app

    Hi everybody,
    I am a beginner with Oracle 9i and I am facing a strange error message when I try to import/export data from OEM connected to my Oracle Enterprise Manager server.
    The user logged in is the SYSMAN user.
    When I launch the Import or Export applet on the selected database, Oracle sends me the following message
    (sorry for my French-to-English translation):
    "The following error occurs when you attempt to connect to the database with the login parameters
    argument number too large at"
    My Oracle 9i is installed on a clean test platform: NT4 Server + SP6 French version, dual processor and 512 MB RAM.
    Can somebody help me? I am unable to import/export data with the GUI.
    Thank you.
    François.

    To view your folio on your iPad you have two choices:
    1 - If you built your folio offline on your local machine, choose the upload option from the Folio Builder fly out menu. All folios that were built online and any you uploaded are available for you to download when you log into ACV on your iPad.
    2 - This is a MAC only option. Plug your iPad into your computer via the USB cable. With the iPad turned on and ACV opened you will see a Preview on [iPad Name] from the preview menu in folio builder.
    I tested both of the supplied files after publishing them for the web from Edge and placing the html files in a web overlay. I previewed on both the desktop viewer and my iPad. Both worked as expected on the iPad. The Jams worked in a desktop preview but your logo just showed a red box. The white background is removed by selecting the transparent background option in the Overlay panel.
    As Bob said, the desktop previewer is not 100% reliable. It is used for quick, down-and-dirty checking. It's always best to preview on a device if something does not work as expected.
    If it helps, I will share my test folio with you. Message me your email address.
