Large amount of physical reads

Any query using CONTAINS against our main IMT index performs a huge number of physical reads. The cost according to the explain plan looks great, only 3, but each execution usually does over 2,000 physical reads. Even if you run the same query over and over, the count never decreases. What might I do about this?
Steve Karam
Lead DBA www.traderonline.com www.harmonhomes.com

Our application runs on ColdFusion and points to our database from 24 separate webservers, all armed with the Oracle Client. The 'databii' are both 8.1.6.3 on Sun SPARC Solaris 2.7. The index is created with 160M of memory, and our STORAGE preference puts the tables and index in a separate tablespace about 4 times their size. We use a USER-type datastore to convert the indexed field to an XML appearance, then use the section group to read through it XML-style. It indexes about 430,000 documents. We run an 'optimize full' rebuild every morning and keep fragmentation levels under 2%. Our queries use CONTAINS and make use of the SYN and PT functions to query against the thesaurus, and in a few spots we use the WITHIN clause for our 2 zones, MAKE_NAME and MODEL_NAME (with MAK and MOD as tags respectively).
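For reference, the query shape described above looks roughly like the following. This is only a hedged sketch: the table name, text column, and search terms are assumptions, not the actual application code.

-- Hedged sketch only: listings / indexed_text and the search terms are assumed names.
-- SYN() expands the term through the DEFAULT thesaurus; WITHIN restricts to a zone section.
SELECT listing_id
  FROM listings
 WHERE CONTAINS(indexed_text,
                '(SYN(ford)) WITHIN make_name AND (SYN(mustang)) WITHIN model_name', 1) > 0;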

Similar Messages

  • Query on data dictionary results in large number of physical reads

    I don't understand why I am getting 80,000 physical reads for this query. I am not looking for help rewriting it; I just don't understand why I would hit the disk at all.
    My understanding had been that v$ views were SQL structures that pointed to x$ tables, and that underneath, the x$ tables were linked lists stored in memory. This is why, when you bounce the database, all the data gets reset, since it is not saved to disk.
    I am doing a simple insert/select off of v$open_cursor that is resulting in 80,000+ physical reads. I am posting the tkprof; it is all from v$open_cursor.
    my_sid_table has 6 records and is 1 MB in size.
    If I index my_sid_table.sid, the query reduces to 20,000 physical reads (but all the physical reads are on v$session_event).
    The sequence number I am passing returns 2 SIDs.
    insert into my_save_table
      select *
        from v$session_event
       where sid in (select sid
                       from my_sid_table
                      where id = vseq);
    vrowcount := sql%rowcount;
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.01       0.01          0          0          0           0
    Execute      1     31.70      47.57      88570         22          0           1
    Fetch        0      0.00       0.00          0          0          0           0
    total        2     31.71      47.58      88570         22          0           1
    Misses in library cache during parse: 1
    Optimizer mode: CHOOSE
    Parsing user id: 22 
    Elapsed times include waiting on following events:
      Event waited on                             Times   Max. Wait  Total Waited
      ----------------------------------------   Waited  ----------  ------------
      latch: row cache objects                        1        0.00          0.00
      log file sync                                   1        0.00          0.00
      SQL*Net message to client                       1        0.00          0.00
      SQL*Net message from client                     1        0.00          0.00
    ********************************************************************************

    It seems like there is some missing information.
    You have a wait for a log file sync, but no commit.
    Your table my_sid_table is 1 MB for only 6 records?
    Does the target table you are inserting into (my_save_table) have indexes on it?
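    As a side note on the v$/x$ explanation above, the definition behind any v$ view can be confirmed directly from the data dictionary. A minimal sketch of that check, run as a suitably privileged user:
    -- V$FIXED_VIEW_DEFINITION shows the SQL text behind each fixed view;
    -- for session events it resolves to GV$SESSION_EVENT and, underneath, X$ fixed structures in memory.
    SELECT view_definition
      FROM v$fixed_view_definition
     WHERE view_name = 'GV$SESSION_EVENT';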

  • Query tuning-how to reduce physical reads-help me

    1* select * from masterbillingInvoiceview Where SiteIID =300964 and InvoiceID like '%' order by Invoice asc
    SQL> /
    33 rows selected.
    Elapsed: 00:00:00.34
    Execution Plan
    Plan hash value: 352896138
    | Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
    | 0 | SELECT STATEMENT | | 16 | 6304 | 341 (3)| 00:00:05 |
    | 1 | SORT ORDER BY | | 16 | 6304 | 341 (3)| 00:00:05 |
    | 2 | TABLE ACCESS BY INDEX ROWID | ADDRESS | 1 | 48 | 2 (0)| 00:00:01
    | 3 | NESTED LOOPS | | 16 | 6304 | 340 (3)| 00:00:05 |
    | 4 | NESTED LOOPS | | 16 | 5536 | 316 (3)| 00:00:04 |
    |* 5 | HASH JOIN OUTER | | 16 | 5168 | 308 (3)| 00:00:04 |
    |* 6 | HASH JOIN | | 16 | 4992 | 302 (3)| 00:00:04 |
    |* 7 | HASH JOIN | | 16 | 4752 | 298 (3)| 00:00:04 |
    | 8 | NESTED LOOPS | | 17 | 4828 | 44 (0)| 00:00:01 |
    | 9 | NESTED LOOPS | | 17 | 3451 | 35 (0)| 00:00:01 |
    | 10 | NESTED LOOPS OUTER | | 1 | 91 | 2 (0)| 00:00:01 |
    | 11 | TABLE ACCESS BY INDEX ROWID| SITEMASTER | 1 | 74 | 1 (0)| 00:00:01 |
    |* 12 | INDEX UNIQUE SCAN | PK_SITEMASTER | 1 | | 1 (0)| 00:00:01 |
    | 13 | TABLE ACCESS BY INDEX ROWID| INVGROUPS | 1735 | 29495 | 1 (0)| 00:00:01 |
    |* 14 | INDEX UNIQUE SCAN | PK_INVGROUPS | 1 | | 1 (0)| 00:00:01 |
    |* 15 | TABLE ACCESS BY INDEX ROWID | INVOICEHEAD | 17 | 1904 | 33 (0)| 00:00:01 |
    |* 16 | INDEX RANGE SCAN | IDX_INVOICEHEADSITEIID | 271 | | 1 (0)| 00:00:01 |
    | 17 | TABLE ACCESS BY INDEX ROWID | CUSTOMER | 1 | 81 | 1 (0)| 00:00:01 |
    |* 18 | INDEX UNIQUE SCAN | PK_CUSTOMER | 1 | | 1 (0)| 00:00:01 |
    | 19 | VIEW | | 35844 | 455K| 254 (3)| 00:00:04 |
    | 20 | HASH GROUP BY | | 35844 | 455K| 254 (3)| 00:00:04 |
    | 21 | TABLE ACCESS FULL | CONTACT | 56100 | 712K| 250 (1)| 00:00:03 |
    | 22 | VIEW | index$_join$_009 | 7 | 105 | 3 (0)| 00:00:01 |
    |* 23 | HASH JOIN | | | | | |
    | 24 | INDEX FAST FULL SCAN | IDX_PAYMENTTERMSID | 7 | 105 | 1 (0)| 00:00:01
    | 25 | INDEX FAST FULL SCAN | PK_PAYMENTTERMS | 7 | 105 | 1 (0)| 00:00:01
    | 26 | VIEW | index$_join$_011 | 1428 | 15708 | 6 (0)| 00:00:01 |
    |* 27 | HASH JOIN | | | | | |
    | 28 | INDEX FAST FULL SCAN | PK_EMPLOYEE | 1428 | 15708 | 3 (0)| 00:00:01 |
    | 29 | INDEX FAST FULL SCAN | IDX_EMPLOYEEEMPLOYEEID | 1428 | 15708 | 3 (0)| 00:00:01
    | 30 | TABLE ACCESS BY INDEX ROWID | CONTACT | 1 | 23 | 1 (0)| 00:00:
    |* 31 | INDEX UNIQUE SCAN | PK_CONTACT | 1 | | 1 (0)| 00:00:01 |
    |* 32 | INDEX RANGE SCAN | PK_ADDRESS | 1 | | 1 (0)| 00:00:01 |
    Predicate Information (identified by operation id):
    5 - access("A"."CREATEDBY"="J"."EMPIID"(+))
    6 - access("A"."TERMSIID"="H"."PAYMENTTERMSIID")
    7 - access("D"."CUSTOMERIID"="F"."CUSTOMERIID")
    12 - access("B"."SITEIID"=300964)
    14 - access("B"."PARENTIID"="I"."GROUPIID"(+))
    15 - filter("A"."TYPE"=4 AND "A"."INVOICEID" LIKE '%')
    16 - access("A"."SITEIID"=300964)
    18 - access("A"."BILLINGCUSTOMERIID"="D"."CUSTOMERIID")
    23 - access(ROWID=ROWID)
    27 - access(ROWID=ROWID)
    31 - access("F"."CONTACTIID"="G"."CONTACTIID")
    32 - access("E"."ADDRESSIID"=NVL("D"."BILLINGADDRESSIID","D"."ADDRESSIID"))
    Statistics
    107 recursive calls
    0 db block gets
    2819 consistent gets
    0 physical reads
    0 redo size
    6586 bytes sent via SQL*Net to client
    356 bytes received via SQL*Net from client
    4 SQL*Net roundtrips to/from client
    1 sorts (memory)
    0 sorts (disk)
    33 rows processed
    SQL> ed
    Wrote file afiedt.buf
    1* select * from masterbillingInvoiceview Where SiteIID =300964 and InvoiceID like '%%%' order by Invoice asc
    SQL> /
    33 rows selected.
    Elapsed: 00:06:15.23
    Execution Plan
    Plan hash value: 1828716447
    | Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
    | 0 | SELECT STATEMENT | | 2 | 964 | 295 (3)| 00:00:04 |
    |* 1 | FILTER | | | | | |
    | 2 | SORT GROUP BY | | 2 | 964 | 295 (3)| 00:00:04 |
    | 3 | MERGE JOIN CARTESIAN | | 72315 | 33M| 291 (1)| 00:00:04 |
    | 4 | TABLE ACCESS BY INDEX ROWID | ADDRESS | 1 | 52 | 2 (0)| 00:00:01 |
    | 5 | NESTED LOOPS | | 1 | 447 | 41 (0)| 00:00:01 |
    | 6 | NESTED LOOPS | | 1 | 395 | 40 (0)| 00:00:01 |
    | 7 | NESTED LOOPS | | 1 | 382 | 38 (0)| 00:00:01 |
    | 8 | NESTED LOOPS OUTER | | 1 | 289 | 37 (0)| 00:00:01 |
    | 9 | NESTED LOOPS | | 1 | 266 | 36 (0)| 00:00:01 |
    | 10 | NESTED LOOPS | | 1 | 239 | 35 (0)| 00:00:01 |
    | 11 | NESTED LOOPS OUTER | | 1 | 115 | 2 (0)| 00:00:01 |
    | 12 | TABLE ACCESS BY INDEX ROWID| SITEMASTER | 1 | 86 | 1 (0)| 00:00:01 |
    |* 13 | INDEX UNIQUE SCAN | PK_SITEMASTER | 1 | | 1 (0)| 00:00:01 |
    | 14 | TABLE ACCESS BY INDEX ROWID| INVGROUPS | 1735 | 50315 | 1 (0)| 00:00:01 |
    |* 15 | INDEX UNIQUE SCAN | PK_INVGROUPS | 1 | | 1 (0)| 00:00:01 |
    |* 16 | TABLE ACCESS BY INDEX ROWID | INVOICEHEAD | 1 | 124 | 33 (0)| 00:00:01 |
    |* 17 | INDEX RANGE SCAN | IDX_INVOICEHEADSITEIID | 271 | | 1 (0)| 00:00:01 |
    | 18 | TABLE ACCESS BY INDEX ROWID | PAYMENTTERMS | 1 | 27 | 1 (0)| 00:00:01 |
    |* 19 | INDEX UNIQUE SCAN | PK_PAYMENTTERMS | 1 | | 1 (0)| 00:00:01 |
    | 20 | TABLE ACCESS BY INDEX ROWID | EMPLOYEE | 1 | 23 | 1 (0)| 00:00:01 |
    |* 21 | INDEX UNIQUE SCAN | PK_EMPLOYEE | 1 | | 1 (0)| 00:00:01 |
    | 22 | TABLE ACCESS BY INDEX ROWID | CUSTOMER | 1 | 93 | 1 (0)| 00:00:01 |
    |* 23 | INDEX UNIQUE SCAN | PK_CUSTOMER | 1 | | 1 (0)| 00:00:01 |
    | 24 | TABLE ACCESS BY INDEX ROWID | CONTACT | 1 | 13 | 2 (0)| 00:00:01 |
    |* 25 | INDEX RANGE SCAN | IDX_CONTACTCUSTOMERIID | 2 | | 1 (0)| 00:00:01 |
    |* 26 | INDEX RANGE SCAN | PK_ADDRESS | 1 | | 1 (0)| 00:00:01 |
    | 27 | BUFFER SORT | | 56100 | 1917K| 294 (3)| 00:00:04 |
    | 28 | TABLE ACCESS FULL | CONTACT | 56100 | 1917K| 250 (1)| 00:00:03 |
    Predicate Information (identified by operation id):
    1 - filter("G"."CONTACTIID"=MAX("CONTACTIID"))
    13 - access("B"."SITEIID"=300964)
    15 - access("B"."PARENTIID"="I"."GROUPIID"(+))
    16 - filter("A"."TYPE"=4 AND "A"."INVOICEID" LIKE '%%%')
    17 - access("A"."SITEIID"=300964)
    19 - access("A"."TERMSIID"="H"."PAYMENTTERMSIID")
    21 - access("A"."CREATEDBY"="J"."EMPIID"(+))
    23 - access("A"."BILLINGCUSTOMERIID"="D"."CUSTOMERIID")
    25 - access("D"."CUSTOMERIID"="CUSTOMERIID")
    filter("CUSTOMERIID" IS NOT NULL)
    26 - access("E"."ADDRESSIID"=NVL("D"."BILLINGADDRESSIID","D"."ADDRESSIID"))
    Statistics
    1952 recursive calls
    78 db block gets
    4649 consistent gets
    236151 physical reads
    0 redo size
    6586 bytes sent via SQL*Net to client
    356 bytes received via SQL*Net from client
    4 SQL*Net roundtrips to/from client
    1 sorts (memory)
    1 sorts (disk)
    33 rows processed
    This is the execution plan of my query; with only a small difference in the query there is a large difference in physical reads.
    Can anyone help me out with this?
    Thanks,
    aju

    Hi,
    Can you please format your explain plan by wrapping it in
    {code}
    Explain plan
    {code}
    tags (typed without any space inside the braces)?
    What is your DB version?
    There are differences in the access and filter criteria...
    -- FIRST QUERY
    5 - access("A"."CREATEDBY"="J"."EMPIID"(+))
    6 - access("A"."TERMSIID"="H"."PAYMENTTERMSIID")
    7 - access("D"."CUSTOMERIID"="F"."CUSTOMERIID")
    12 - access("B"."SITEIID"=300964)
    14 - access("B"."PARENTIID"="I"."GROUPIID"(+))
    15 - filter("A"."TYPE"=4 AND "A"."INVOICEID" LIKE '%')
    16 - access("A"."SITEIID"=300964)
    18 - access("A"."BILLINGCUSTOMERIID"="D"."CUSTOMERIID")
    23 - access(ROWID=ROWID)
    27 - access(ROWID=ROWID)
    31 - access("F"."CONTACTIID"="G"."CONTACTIID")
    32 - access("E"."ADDRESSIID"=NVL("D"."BILLINGADDRESSIID","D"."ADDRESSIID"))
    ------SECOND QUERY
    1 - filter("G"."CONTACTIID"=MAX("CONTACTIID"))
    13 - access("B"."SITEIID"=300964)
    15 - access("B"."PARENTIID"="I"."GROUPIID"(+))
    16 - filter("A"."TYPE"=4 AND "A"."INVOICEID" LIKE '%%%')
    17 - access("A"."SITEIID"=300964)
    19 - access("A"."TERMSIID"="H"."PAYMENTTERMSIID")
    21 - access("A"."CREATEDBY"="J"."EMPIID"(+))
    23 - access("A"."BILLINGCUSTOMERIID"="D"."CUSTOMERIID")
    25 - access("D"."CUSTOMERIID"="CUSTOMERIID")
    filter("CUSTOMERIID" IS NOT NULL)
    26 - access("E"."ADDRESSIID"=NVL("D"."BILLINGADDRESSIID","D"."ADDRESSIID"))
    - Avinash
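    One way to see what the two predicates actually do at run time, rather than comparing estimated plans, is to capture rowsource statistics for each variant. A hedged sketch, assuming 10g or later since it uses DBMS_XPLAN.DISPLAY_CURSOR:
    SELECT /*+ gather_plan_statistics */ *
      FROM masterbillingInvoiceview
     WHERE SiteIID = 300964 AND InvoiceID LIKE '%'
     ORDER BY Invoice ASC;
    -- Then dump the actual plan with buffer gets and physical reads per step:
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY_CURSOR(NULL, NULL, 'ALLSTATS LAST'));
    Repeating the same two statements with LIKE '%%%' shows which plan step is responsible for the 236,151 physical reads.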

  • Java NIO - reading large amount of data

    Hi,
    I have difficulties reading a large amount of data with SocketChannel (using a directly allocated buffer and an allocated one). Files greater than 300KB are cut even though I tried writing the data into a FileChannel.
    My Code:
    ByteBuffer directBlockBuffer = ByteBuffer.allocateDirect(150000);
    ByteBuffer buffer = ByteBuffer.allocate(6000000);
    out = new FOS("d:\\msgData.tmp");      // FOS is my wrapper; getFOS() returns the underlying FileOutputStream
    fc = out.getFOS().getChannel();        // FileChannel
    int fileLength = (int) fc.size();
    while (clientChannel.read(directBlockBuffer) > 0) {
        directBlockBuffer.flip();
        buffer.put(directBlockBuffer);
        directBlockBuffer.compact();
    }
    // close data file
    buffer.flip();
    fc.write(buffer);
    fc.close();
    out.close();
    // end of code
    Any ideas?
    Thanks
    AST

    I don't understand how the "write" result will help read the whole data.
    Anyway, I changed the code so the SocketChannel reads in smaller chunks (~8KB) and the FileChannel writes on every read, but the data stream is cut again (to ~5KB, no matter what size of file I send).
    In the updated code, when I try to compare socketChannel.read() to -1, I get an endless loop.
    I'm basically trying to write a POP3/SMTP server program; this part of the code handles an attachment that is received by the SocketChannel in one unit (i.e. 1+ MB of data; the other SMTP commands/lines are no more than 27 chars and simple to handle).
    Therefore I need to be ready to accept a large amount of data into the buffer and write it to the FileChannel. (In the POP3 thread I'm using MappedByteBuffer successfully.)
    Updated code:
    ByteBuffer directBlockBuffer = ByteBuffer.allocateDirect(8192);
    while (clientChannel.read(directBlockBuffer) > 0 && directBlockBuffer.hasRemaining()) {
        directBlockBuffer.flip();
        fc.write(directBlockBuffer);
        directBlockBuffer.clear();
    }
    I think based on the API my code is logical (and fine for small files), but what about handling bigger files (up to 5MB)?
    Thanks,
    AST 

  • DSS problems when publishing large amount of data fast

    Has anyone experienced problems when sending large amounts of data using the DSS? I have approximately 130 to 150 items that I send through the DSS to communicate between different parts of my application.
    There are several loops publishing data. One publishes approximately 50 items at a rate of 50 ms, another about 40 items with a 100 ms publishing rate.
    I send a command to a subprogram (125 ms) that reads and publishes the answer on a DSS URL (approx. 125 ms). So that is one item on the DSS for about 250 ms. But this data is not seen in my main GUI window that reads the DSS URL.
    My questions are:
    1. Is there any limit in speed (frequency) for data publishing in DSS?
    2. Can DSS become unstable if loaded too much?
    3. Can I lose/miss data in any situation?
    4. In the DSS Manager I have doubled MaxItems and MaxConnections. How will this affect my system?
    5. When I run my full application I have experienced the following error: Fatal Internal Error: "memory.ccp", line 638. Can this be a result of my large application and the heavy load on the DSS? (see attached picture)
    Regards
    Idriz Zogaj
    Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
    Memory Professional
    direct: +46 (0) - 734 32 00 10
    http://www.zogaj.se

    LuI wrote:
    >
    > Hi all,
    >
    > I am frustrated with VISA serial comm. It looks so neat and it's
    > fantastic what it's supposed to do for a developer, but sometimes one
    > runs into trouble very deep.
    > I have an app where I have to read large amounts of data streamed by
    > 13 µCs at 230 kBaud. (They do not necessarily need to stream all at the
    > same time.)
    > I use either a Moxa multiport adapter C320 with 16 serial ports or -
    > for test purposes - a Keyspan serial-2-USB adapter with 4 serial
    > ports.
    Does it work better if you use the serial port(s) on your motherboard?
    If so, then get a better serial adapter. If not, look more closely at VISA.
    Some programs have issues on serial adapters but run fine on a regular serial port. We've had that problem recently.
    Best, Mark

  • Best way to handle large amount of text

    hello everyone
    My project involves handling large amounts of text (from conferences and reports).
    Most of them are in MS Word. I can turn them into RTF format.
    I don't want to use scrolling. I prefer turning pages (next, previous, last, contents), which means I need to break them into chunks.
    Currently the process is awkward and slow.
    I know there would be lots of people working on similar projects.
    Could anyone tell me an easy way to handle the text: bring it into the cast and break it up?
    Any ideas would be appreciated.
    thanx
    ahmed

    Hacking up a document with Lingo will probably lose the RTF formatting information.
    Here's a bit of code to find the physical position of a given line of on-screen text (counting returns is not accurate with word-wrapped lines).
    This strategy uses charPosToLoc to get the actual position for the text member's current width and font size:
    maxHeight = 780 -- arbitrary display height limit
    T = member("sourceText").text
    repeat with i = 1 to T.line.count
      endChar = T.line[1..i].char.count
      lineEndlocV = charPosToLoc(member "sourceText", endChar).locV
      if lineEndlocV > maxHeight then -- found the "1 too many" line
        -- extract the identified lines from "sourceText"
        -- perhaps repeat the parse with the remaining part of "sourceText"
        singlePage = T.line[1..i - 1]
        member("sourceText").text = T.line[i..99999] -- put remaining text back into the source text member
      end if
    end repeat
    If you want, you could use one of the roundabout ways to display PDF in Director. There might be some batch PDF production tools that can create your pages in a nicely scalable PDF format.
    I think FlashPaper documents can be adapted to Director.

  • Firefox is using large amounts of CPU time and disk access, and I need to know how to shut down most of this so I can actually use the browser.

    Firefox is a very busy piece of software. It's using large amounts of CPU time and disk access. It puts my usage at low priority, so I have to wait for some time to be able to use my pointer or keyboard. I don't know what it uses all that CPU and disk access time for, but it's of no use to me. It often takes off with massive use of resources when I'm not doing anything, and I may not have use of my pointer for several minutes. How can I shut down most of this so I can use the browser to get my work done? I just want to use the web site access part of the software and drop all the extra. I don't want Firefox to be able to recover after a crash. I just want to browse with a minimum of interference from Firefox. I would think that this is the most commonly asked question.

    Firefox consumes a lot of CPU resources
    * https://support.mozilla.com/en-US/kb/Firefox%20consumes%20a%20lot%20of%20CPU%20resources
    High memory usage
    * https://support.mozilla.com/en-US/kb/High%20memory%20usage
    Check and tell if it's working.

  • Large Amount of Data in JSF

    Hello,
    I am using the Table Group component for displaying data in my application designed in Java Studio Creator.
    I have enabled paging on the component. I use a CachedRowSet on the bean for the page for getting the data. This works very well at the moment in my development environment, where I am testing with a small amount of data.
    I was wondering how this component performs with very large amounts of data (>75,000 rows). I noticed that there is a button available for users to retrieve all the rows. So I was wondering, apart from that instance, when viewing in paged mode does the component get all the results from the database every time?
    Which component would be best suited for displaying large amounts of data in a table format?
    Thanks In Advance!!

    Thanks for your reply. The table control that I use does have paging as a feature and I have enabled it. It still takes time to load the data initially.
    I wonder if it has to do with the logic of paging. How do you specify which set of 20 records to extract in SQL?
    Thanks for your help!!
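    On the question of extracting a specific set of 20 records in SQL, the classic Oracle pattern pages with ROWNUM in a nested query. A minimal sketch, with my_table and my_col as placeholder names:
    -- Rows 21-40 of the ordered result set (the second "page" of 20).
    SELECT *
      FROM (SELECT t.*, ROWNUM rn
              FROM (SELECT * FROM my_table ORDER BY my_col) t
             WHERE ROWNUM <= 40)
     WHERE rn > 20;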

  • Advice needed on how to keep large amounts of data

    Hi guys,
    I'm not sure what the best way is to make large amounts of data available to my Android app on the local device.
    For example, records of food ingredients, in the hundreds?
    I have read and successfully created .db's using this tutorial:
    http://help.adobe.com/en_US/AIR/1.5/devappsflex/WS5b3ccc516d4fbf351e63e3d118666ade46-7d49.html
    However, to populate the database I use Flash? So this kind of defeats the purpose of it. There is no point in me shifting a massive array of data from Flash to a SQL database when I could access the data directly from the AS3 array.
    So maybe I could create the .db with an external program? But then how would I include that .db in the APK file and deploy it to the user's Android device?
    Or maybe I create an AS3 class with an XML object in it and use that as a means of data storage?
    Any advice would be appreciated

    You can use any means you like to populate your SQLite database, including using external programs, (temporarily) embedding a text file with SQL statements, executing some SQL from AS3 code etc etc.
    Once you have populated your db, deploy it with your project:
    http://chrisgriffith.wordpress.com/2011/01/11/understanding-bundled-sqlite-databases-in-air-for-mobile/
    Cheers, - Jon -
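    For example, the bundled database could be built and filled outside AIR with plain SQL before packaging. A sketch only, with the ingredients table and its columns assumed:
    -- Run once in any SQLite tool; the resulting .db file is what gets bundled with the app.
    CREATE TABLE ingredients (
      id   INTEGER PRIMARY KEY,
      name TEXT NOT NULL,
      kcal REAL
    );
    INSERT INTO ingredients (name, kcal) VALUES ('Flour', 364);
    INSERT INTO ingredients (name, kcal) VALUES ('Sugar', 387);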

  • Is there a way to put a large amount of music on your iPod without having to keep all the files in iTunes on your computer as well? I want to put my entire music collection (including cds) on my iPod but don't want to take up the space on my computer.

    Is there a way to put a large amount of music on the iPod without having to keep all the files in iTunes as well? I want to use my iPod as an external drive and put all of my music on it without taking up the space on my computer. I also don't want to lose all my files every time I plug the iPod into my computer. Is this possible? Is there a way to avoid using iTunes and only use the iPod as an external drive?

    You cannot put music onto your iPod without using iTunes; that's what iTunes is for.
    It's also not a good idea to wipe the music from your computer and having it only on your iPod. We see countless posts here from people who have done just that - and then lost everything when the iPod needs a Restore. Even if you never need to Restore your iPod, what happens when you eventually replace the iPod? You'll be back here asking how to get the music from one iPod to another. That's not easy to do, we see countless posts about that too!
    A much better idea is to buy an external drive (a proper external drive, not simply an iPod) and put your large amount of music onto that drive. Then point your iTunes Library to that drive. However, you need to remember two things:
    You still need a backup of that Library.
    Using an external drive as your iTunes Library means that the drive must be connected and ready to read before starting your iTunes programme. If it isn't, then iTunes will look on the C: drive - and you will find no music in your Library. (Once again, lots of posts about that as well!)

  • I just want to ask if there is any way I can fix my iPod touch 4G from water damage. I don't think a large amount of water entered, but the iPod isn't working at all, and I tried charging it but it didn't work, so please help me, and thank you.

    I just want to ask if there is any way I can fix my iPod touch 4G from water damage. I don't think a large amount of water entered, but the iPod isn't working at all, and I tried charging it but it didn't work, so please help me, and thank you.

    Probably dead if you have tried to charge it.
    You should never try to charge or turn on a wet electronic device.
    You should let it dry for a week or so, then try.

  • How to send large amount of XML data in one CLOB variable

    Hi,
    I am sending a large amount of XML data to a TCP/IP port in one CLOB variable.
    My requirement is to send all the data in one go in a single CLOB variable.
    But that CLOB variable is not sufficient to hold all the data.
    Please suggest a solution.
    Thanks in advance

    Hi Here is my code:
    CREATE OR REPLACE PACKAGE BODY APPS.XXMB_WIP_PROD_TAG_DOOR_PKG
    AS
    PROCEDURE xxmb_get_xml_data_1270 (
    -- errbuf OUT VARCHAR2,
    -- retcode OUT NUMBER,
    p_org IN VARCHAR2,
    p_limit_to_global IN VARCHAR2,
    p_label IN VARCHAR2,
    p_printer IN VARCHAR2,
    p_quantity IN VARCHAR2,
    p_print_method IN VARCHAR2,
    p_enable_release IN VARCHAR2,
    p_enable_serial_no IN VARCHAR2,
    p_release IN VARCHAR2,
    p_rep_group IN VARCHAR2,
    p_cart_type IN VARCHAR2,
    p_cart_no_from IN VARCHAR2,
    p_cart_no_to IN VARCHAR2,
    p_serial_no IN VARCHAR2
    AS
    CURSOR c_xml_data_door (
    p_org IN VARCHAR2,
    p_label IN VARCHAR2,
    p_printer IN VARCHAR2,
    p_quantity IN VARCHAR2,
    p_print_method IN VARCHAR2,
    p_rep_group IN VARCHAR2,
    p_release IN VARCHAR2,
    p_cart_type IN VARCHAR2,
    p_cart_no_from IN VARCHAR2,
    p_cart_no_to IN VARCHAR2,
    p_serial_no IN VARCHAR2
    IS
    SELECT xxasa.item_id AS item_id, xcs.serial_number AS serial_number,xxcpf.cart_type,xcs.destination_cart_num cart,xcs.destination_slot_num slot
    CURSOR c_product_detail (
    l_product IN NUMBER,
    l_serial_num IN VARCHAR2,
    p_limit_to_global IN VARCHAR2
    IS
    SELECT xcra_specie.reference_id AS reference_id,
    xcra_ege.attribute_value AS ege, xcs.item_id AS item_id,
    AND msib.inventory_item_id = l_product
    and xcs.organization_id = nvl(p_org, xcs.organization_id)
    AND xcs.serial_number = NVL (l_serial_num, xcs.serial_number);
    /*-------------------------------------------------------+
    | Cursor to fetch the data for special Message Label |
    +-------------------------------------------------------*/
    CURSOR c_count (p_item_id IN NUMBER)
    IS
    SELECT xcrav.attribute_value, xcs.serial_number, xcs.cabinet_number
    FROM xxmb_czmfg_ref_attributes xcrav,
    cz_config_attributes cca,
    AND msib.organization_id = xcs.organization_id
    AND msib.inventory_item_id = xcs.item_id;
    /*--------------------------+
    | Common variables |
    +--------------------------*/
    v_limit_to_global VARCHAR2 (100);
    l_label_count NUMBER := 1;
    total_rec NUMBER;
    l_rewrite VARCHAR2 (1) := 'N';
    l_file_count NUMBER := 1;
    l_separate_line VARCHAR2 (10);
    BEGIN
    fnd_profile.get ('WMS_LABEL_OUTPUT_DIRECTORY', l_output_dir);
    fnd_profile.get ('WMS_LABEL_FILE_PREFIX', l_output_file_prefix);
    l_request_id := apps.fnd_global.conc_request_id;
    l_output_file_name :=
    l_output_file_prefix || l_request_id || l_file_end;
    l_dir_seperator := '/';
    IF (INSTR (l_output_dir, l_dir_seperator) = 0)
    THEN
    l_dir_seperator := '\';
    END IF;
    v_label := p_label;
    v_printer := p_printer;
    v_quantity := p_quantity;
    V_LIMIT_TO_GLOBAL := P_LIMIT_TO_GLOBAL;
    L_XML_CONTENT := '<?xml version="1.0" encoding="UTF-8" ?>';
    L_XML_CONTENT := L_XML_CONTENT || '<!DOCTYPE labels SYSTEM "label.dtd">';
    L_XML_CONTENT := L_XML_CONTENT || '<labels>';
    FOR r_xml_data_door IN c_xml_data_door (p_org,
    p_label,
    p_printer,
    p_quantity,
    p_print_method,
    p_rep_group,
    p_release,
    p_cart_type,
    p_cart_no_from,
    p_cart_no_to,
    p_serial_no
    LOOP
    -- dbms_output.put_line ( 1 );
    FOR r_product_detail IN
    c_product_detail (r_xml_data_door.item_id,
    r_xml_data_door.serial_number,
    v_limit_to_global
    LOOP
    -- dbms_output.put_line ( 2 );
    -- l_xml_content := '<?xml version="1.0" encoding="UTF-8" ?>';
    -- l_xml_content := l_xml_content || '<!DOCTYPE labels SYSTEM "label.dtd">';
    -- l_xml_content := l_xml_content || '<labels>';
    fnd_file.put_line (fnd_file.LOG, 'label cnt: ' || l_label_count);
    dbms_output.put_line (l_label_count);
    L_XML_CONTENT := L_XML_CONTENT || '<label _FORMAT='
    || '"'
    || 'lib://FRD/'
    || v_label
    || '"'
    || ' _PRINTERNAME='
    || '"'
    || v_printer
    || '"'
    || ' _QUANTITY='
    || '"'
    || v_quantity
    || '"'
    || '>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Color">'
    || R_PRODUCT_DETAIL.COLOR
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT ||'<variable name= "Model">'
    || R_PRODUCT_DETAIL.model
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Build_Date">'
    || R_PRODUCT_DETAIL.BUILD_DATE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Assy_Cart">'
    || R_PRODUCT_DETAIL.ASSY_CART
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Assy_Slot">'
    || R_PRODUCT_DETAIL.ASSY_SLOT
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Assy_Line">'
    || R_PRODUCT_DETAIL.ASSY_LINE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Finish_Cart">'
    || R_PRODUCT_DETAIL.FINISH_CART
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Finish_Slot">'
    || R_PRODUCT_DETAIL.FINISH_SLOT
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Serial_Number">'
    || R_PRODUCT_DETAIL.SERIAL_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Serial_Number_Barcode">'
    || R_PRODUCT_DETAIL.SERIAL_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Specie">'
    || R_PRODUCT_DETAIL.SPECIE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT ||'<variable name= "Truck_Group">'
    || R_PRODUCT_DETAIL.TRUCK_GROUP
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Label_Sequence_No">'
    || L_LABEL_COUNT
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT ||'<variable name= "WIP_Cart">'
    || R_PRODUCT_DETAIL.WIP_CART
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "WIP_Slot">'
    || R_PRODUCT_DETAIL.WIP_SLOT
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Cabinet_Sequence_No">'
    || R_PRODUCT_DETAIL.CAB_SEQ_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "RAW_PART_NO">'
    || R_PRODUCT_DETAIL.RAW_PART_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "JC">'
    || R_PRODUCT_DETAIL.JC
    || '</variable>' ;
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "QC">'
    || R_PRODUCT_DETAIL.QC
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Thickness">'
    || R_PRODUCT_DETAIL.THICKNESS
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Width">'
    || R_PRODUCT_DETAIL.width
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Length">'
    || R_PRODUCT_DETAIL.length
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Overlay">'
    || R_PRODUCT_DETAIL.OVERLAY
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Options">'
    || R_PRODUCT_DETAIL.OPTIONS
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Stop">'
    || R_PRODUCT_DETAIL.stop
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Profile_No">'
    || R_PRODUCT_DETAIL.PROFILE_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Door_Style">'
    || R_PRODUCT_DETAIL.DOOR_STYLE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Glaze">'
    || R_PRODUCT_DETAIL.GLAZE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Shape">'
    || R_PRODUCT_DETAIL.SHAPE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Glass">'
    || R_PRODUCT_DETAIL.GLASS
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Hinge_Side">'
    || R_PRODUCT_DETAIL.HINGE_SIDE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Hinge_Type">'
    || R_PRODUCT_DETAIL.HINGE_TYPE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "EGE">'
    || R_PRODUCT_DETAIL.EGE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Door_Style_Code">'
    || R_PRODUCT_DETAIL.DOOR_STYLE_CODE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Finish_Technique">'
    || R_PRODUCT_DETAIL.FINISH_TECHNIQUE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Hinge_Location">'
    || R_PRODUCT_DETAIL.HINGE_LOCATION
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Construction_Type">'
    || R_PRODUCT_DETAIL.CONSTRUCTION_TYPE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_Type">'
    || R_PRODUCT_DETAIL.PANEL_TYPE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_Profile_No">'
    || R_PRODUCT_DETAIL.PANEL_PROFILE_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Rail_Profile_No">'
    || R_PRODUCT_DETAIL.RAIL_PROFILE_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Rail_1_Length">'
    || R_PRODUCT_DETAIL.RAIL_1_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Stile_Profile_No">'
    || R_PRODUCT_DETAIL.STILE_PROFILE_NO
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Rail_2_Length">'
    || R_PRODUCT_DETAIL.RAIL_2_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Stile_1_Length">'
    || R_PRODUCT_DETAIL.STILE_1_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Stile_2_Length">'
    || R_PRODUCT_DETAIL.STILE_2_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_1_Width">'
    || R_PRODUCT_DETAIL.PANEL_1_WIDTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_1_Length">'
    || R_PRODUCT_DETAIL.PANEL_1_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_2_Width">'
    || R_PRODUCT_DETAIL.PANEL_2_WIDTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Panel_2_Length">'
    || R_PRODUCT_DETAIL.PANEL_2_LENGTH
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT ||'</label>';
    /*-----------------------------------------+
    | Handling XML data for special message |
    +-----------------------------------------*/
    FOR rec_count IN c_count (r_product_detail.item_id)
    LOOP
    L_XML_CONTENT := L_XML_CONTENT || '<label _FORMAT='
    || '"'
    || 'lib://FRD/SpecMessage_Door.btw'
    || '"'
    || ' _PRINTERNAME='
    || '"'
    || v_printer
    || '"'
    || ' _QUANTITY='
    || '"'
    || v_quantity
    || '"'
    || '>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Serial_Number">'
    || REC_COUNT.SERIAL_NUMBER
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Special_Message">'
    || REC_COUNT.ATTRIBUTE_VALUE
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT || '<variable name= "Cabinet_Sequence_No">'
    || REC_COUNT.CABINET_NUMBER
    || '</variable>';
    L_XML_CONTENT := L_XML_CONTENT ||'</label>';
    EXIT WHEN c_count%NOTFOUND;
    end LOOP;
    -- L_XML_CONTENT := L_XML_CONTENT || '</labels>';
    fnd_file.put_line (fnd_file.LOG, l_xml_content);
    dbms_output.put_line ( l_xml_content );
    L_LABEL_COUNT := L_LABEL_COUNT + 1;
    -- apps.inv_print_request.sync_print_tcpip (l_xml_content,
    -- l_job_status,
    -- l_printer_status,
    -- l_status_type,
    -- l_return_status,
    -- l_return_msg
    END LOOP;
    END LOOP;
    l_xml_content := l_xml_content || '</labels>';
    fnd_file.put_line (fnd_file.LOG, l_xml_content);
    apps.inv_print_request.sync_print_tcpip (l_xml_content,
    l_job_status,
    l_printer_status,
    l_status_type,
    l_return_status,
    l_return_msg
    L_XML_CONTENT := null;
    /*--------------------------------------------------------------------------------------+
    | APPS.INV_PRINT_REQUEST.SYNC_PRINT_TCPIP will send the XML data to TCP/IP Port |
    +--------------------------------------------------------------------------------------*/
    fnd_file.put_line (fnd_file.LOG,
    'Printer Status:' || ' ' || l_printer_status
    fnd_file.put_line (fnd_file.LOG,
    'Return Status:' || ' ' || l_return_status
    fnd_file.put_line (fnd_file.LOG,
    'Return Message:' || ' ' || L_RETURN_MSG
    COMMIT;
    EXCEPTION
    WHEN OTHERS
    THEN
    fnd_file.put_line
    (fnd_file.LOG,
    'Unexpected error in the xxmb_get_xml_data_1270 procedure, error is : '
    || SQLERRM
    || ', '
    || SQLCODE
    END xxmb_get_xml_data_1270;
    END xxmb_wip_prod_tag_door_pkg;
    /
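    On the original question of the CLOB not being sufficient: a CLOB itself can hold gigabytes, but building it by repeated string concatenation does not scale well. A hedged sketch of assembling the label XML with DBMS_LOB instead; the variable names here are placeholders, not the package's own:
    DECLARE
      l_xml  CLOB;
      l_part VARCHAR2(32767);
    BEGIN
      -- temporary CLOB, cached, freed at the end of the call
      DBMS_LOB.CREATETEMPORARY(l_xml, TRUE, DBMS_LOB.CALL);
      l_part := '<?xml version="1.0" encoding="UTF-8" ?><labels>';
      DBMS_LOB.WRITEAPPEND(l_xml, LENGTH(l_part), l_part);
      -- ... build each <label> element into l_part inside the cursor loops and append it the same way ...
      l_part := '</labels>';
      DBMS_LOB.WRITEAPPEND(l_xml, LENGTH(l_part), l_part);
      -- hand the finished CLOB to the print API here, then release the temporary LOB
      DBMS_LOB.FREETEMPORARY(l_xml);
    END;
    /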

  • My iMac running 10.7.5 crashes when copy and paste large amounts of information like a picture.

    My iMac running 10.7.5 crashes when I store a large amount of data, like copying and pasting a picture. It also has started being painfully slow to open the first page in Safari. Here is some information a program gathered on my computer. -Thanks!
    Problem description:
    iMac, 10.7.5. Crashes when copying & pasting large amounts, and slow to open the first web page, then back to normal speed.
    EtreCheck version: 2.1.8 (121)
    Report generated April 13, 2015 3:05:27 PM EDT
    Download EtreCheck from http://etresoft.com/etrecheck
    Click the [Click for support] links for help with non-Apple products.
    Click the [Click for details] links for more information about that line.
    Hardware Information: ℹ️
        iMac (27-inch, Late 2009) (Technical Specifications)
        iMac - model: iMac10,1
        1 3.33 GHz Intel Core 2 Duo CPU: 2-core
        8 GB RAM
            BANK 0/DIMM0
                2 GB DDR3 1067 MHz ok
            BANK 1/DIMM0
                2 GB DDR3 1067 MHz ok
            BANK 0/DIMM1
                2 GB DDR3 1067 MHz ok
            BANK 1/DIMM1
                2 GB DDR3 1067 MHz ok
        Bluetooth: Old - Handoff/Airdrop2 not supported
        Wireless: Unknown
    Video Information: ℹ️
        ATI Radeon HD 4670 - VRAM: 256 MB
            iMac 2560 x 1440
    System Software: ℹ️
        Mac OS X 10.7.5 (11G63) - Time since boot: 14 days 7:47:18
    Disk Information: ℹ️
        ST31000528AS disk0 : (1 TB)
            disk0s1 (disk0s1) <not mounted> : 210 MB
            Macintosh HD (disk0s2) / : 999.35 GB (777.11 GB free)
            Recovery HD (disk0s3) <not mounted>  [Recovery]: 650 MB
        OPTIARC DVD RW AD-5680H 
    USB Information: ℹ️
        Sunplus Innovation Technology. USB to Serial-ATA bridge 1 TB
            disk2s1 (disk2s1) <not mounted> : 210 MB
            Time Machine Backups (disk2s2) /Volumes/Time Machine Backups : 999.86 GB (143.50 GB free)
        Apple Inc. Built-in iSight
        Apple Internal Memory Card Reader
        Apple Computer, Inc. IR Receiver
        HP Deskjet 9800
        Apple Inc. BRCM2046 Hub
            Apple Inc. Bluetooth USB Host Controller
    Kernel Extensions: ℹ️
            /Library/Extensions
        [loaded]    org.virtualbox.kext.VBoxDrv (4.2.18) [Click for support]
        [loaded]    org.virtualbox.kext.VBoxNetAdp (4.2.18) [Click for support]
        [loaded]    org.virtualbox.kext.VBoxNetFlt (4.2.18) [Click for support]
        [loaded]    org.virtualbox.kext.VBoxUSB (4.2.18) [Click for support]
            /Library/Parallels/Parallels Service.app
        [loaded]    com.parallels.kext.prl_hid_hook (7.0 15107.796624) [Click for support]
        [loaded]    com.parallels.kext.prl_hypervisor (7.0 15107.796624) [Click for support]
        [loaded]    com.parallels.kext.prl_netbridge (7.0 15107.796624) [Click for support]
        [loaded]    com.parallels.kext.prl_vnic (7.0 15107.796624) [Click for support]
            /System/Library/Extensions
        [loaded]    com.parallels.kext.prl_usb_connect (7.0 15107.796624) [Click for support]
    Startup Items: ℹ️
        HP IO: Path: /Library/StartupItems/HP IO
        ParallelsTransporter: Path: /Library/StartupItems/ParallelsTransporter
        VirtualBox: Path: /Library/StartupItems/VirtualBox
        Startup items are obsolete in OS X Yosemite
    Launch Agents: ℹ️
        [running]    com.hp.devicemonitor.plist [Click for support]
        [loaded]    com.hp.help.tocgenerator.plist [Click for support]
        [loaded]    com.parallels.desktop.launch.plist [Click for support]
        [loaded]    com.parallels.DesktopControlAgent.plist [Click for support]
        [running]    com.parallels.vm.prl_pcproxy.plist [Click for support]
    Launch Daemons: ℹ️
        [loaded]    com.adobe.fpsaud.plist [Click for support]
        [loaded]    com.hikvision.iVMS-4200.plist [Click for support]
        [loaded]    com.microsoft.office.licensing.helper.plist [Click for support]
        [running]    com.parallels.desktop.launchdaemon.plist [Click for support]
    User Launch Agents: ℹ️
        [loaded]    com.adobe.ARM.[...].plist [Click for support]
        [running]    com.akamai.single-user-client.plist [Click for support]
        [loaded]    com.google.keystone.agent.plist [Click for support]
        [not loaded]    org.virtualbox.vboxwebsrv.plist [Click for support]
    User Login Items: ℹ️
        iTunesHelper    UNKNOWN  (missing value)
        GrowlHelperApp    Application  (/Users/[redacted]/Library/PreferencePanes/Growl.prefPane/Contents/Resources/GrowlHelperApp.app)
        Microsoft AU Daemon    Application  (/Applications/Microsoft AutoUpdate.app/Contents/MacOS/Microsoft AU Daemon.app)
        Air Mouse Server    UNKNOWN  (missing value)
        CrossOver CD Helper    UNKNOWN  (missing value)
        Pages    UNKNOWN  (missing value)
        Dropbox    Application  (/Applications/Dropbox.app)
        Wondershare Helper Compact    Application  (/Users/[redacted]/Library/Application Support/Helper/Wondershare Helper Compact.app)
        AdobeResourceSynchronizer    Application Hidden (/Applications/Adobe Reader.app/Contents/Support/AdobeResourceSynchronizer.app)
        HP Scheduler    Application  (/Library/Application Support/Hewlett-Packard/Software Update/HP Scheduler.app)
    Internet Plug-ins: ℹ️
        JavaAppletPlugin: Version: 14.9.0 - SDK 10.7 Check version
        FlashPlayer-10.6: Version: 16.0.0.305 - SDK 10.6 [Click for support]
        QuickTime Plugin: Version: 7.7.1
        AdobePDFViewerNPAPI: Version: 11.0.10 - SDK 10.6 [Click for support]
        Flash Player: Version: 16.0.0.305 - SDK 10.6 Outdated! Update
        AdobePDFViewer: Version: 11.0.10 - SDK 10.6 [Click for support]
        SharePointBrowserPlugin: Version: 14.4.8 - SDK 10.6 [Click for support]
        Google Earth Web Plug-in: Version: 6.0 [Click for support]
        Silverlight: Version: 4.0.60531.0 [Click for support]
        iPhotoPhotocast: Version: 7.0
    Safari Extensions: ℹ️
        1-ClickWeather
        Reload Button
    3rd Party Preference Panes: ℹ️
        Akamai NetSession Preferences  [Click for support]
        Flash Player  [Click for support]
        Growl  [Click for support]
        MacFUSE  [Click for support]
    Time Machine: ℹ️
        Time Machine not configured!
    Top Processes by CPU: ℹ️
             2%    WindowServer
             1%    prl_disp_service
             0%    fontd
             0%    AdobeReader
             0%    ODSAgent
    Top Processes by Memory: ℹ️
        120 MB    mds
        112 MB    AdobeReader
        94 MB    WindowServer
        94 MB    Finder
        69 MB    loginwindow
    Virtual Memory Information: ℹ️
        5.44 GB    Free RAM
        1.78 GB    Active RAM
        464 MB    Inactive RAM
        901 MB    Wired RAM
        1.64 GB    Page-ins
        0 B    Page-outs

    I believe that insufficient RAM may be the source of some of your problems. With somewhere between 4 and 8 GB of RAM you will experience smoother computing, and 3 GB doesn't seem right, so you might want to learn more by going to this site:
    http://www.crucial.com/store/drammemory.aspx
    I don't know what's happening with your optical drive, but it seems you use your drive quite a bit. In that case, look into a lens cleaner for your machine. It's inexpensive and works quite well.
    I hope you'll post here with your results!

  • Create new table , 1 column has large amount of data, column type ?

    Hi, sure this is a noob question but...
    I exported a table from SQL Server and want to replicate it in Oracle.
    It has 3 columns; one of the columns holds the configuration for reports, so each field in that column has a large amount of text in it, some upwards of 30k characters.
    What kind of column do I make this in Oracle? I believe VARCHAR2 has a 4k limit.
    creation of the table...
    CREATE TABLE "RPT_BACKUP" (
      "NAME" VARCHAR2(1000 BYTE),
      "DATA",
      "USER" VARCHAR2(1000 BYTE)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER DEFAULT DIRECTORY "XE_FTP"
      ACCESS PARAMETERS (
        records delimited BY newline
        skip 1
        fields terminated BY ',' OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL )
      LOCATION ( 'REPORT_BACKUP.csv' ))
    Edited by: Jay on May 3, 2011 5:42 AM
    Edited by: Jay on May 3, 2011 5:43 AM

    Ok, tried that, but now when I do a select * from the table I get:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-30653: reject limit reached
    ORA-06512: at "SYS.ORACLE_LOADER", line 52
    29913. 00000 - "error in executing %s callout"
    *Cause:    The execution of the specified callout caused an error.
    *Action:   Examine the error messages and take appropriate action.
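    For the >30k character column, the usual choice in Oracle is CLOB (VARCHAR2 is limited to 4000 bytes in SQL). A sketch of an ordinary heap table variant, reusing the column names from the posted DDL but otherwise an assumption:
    -- Non-external table with the wide report-configuration column declared as CLOB.
    CREATE TABLE rpt_backup_tab (
      "NAME" VARCHAR2(1000 BYTE),
      "DATA" CLOB,
      "USER" VARCHAR2(1000 BYTE)
    );
    CLOB columns can then be read and written with DBMS_LOB or, in most client tools, like ordinary text.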

  • Looking for ideas for transferring large amounts of data between systems

    Hello,
    I am looking for ideas based on best practices for transferring Large Amounts of Data in and out of a Netweaver based application.
    We have a new system we are developing in Netweaver that will utilize both the Java and ABAP stack, and will require integration with other SAP and 3rd Party Systems. It is a standalone product that doesn't share any form of data store with other systems.
    We need to be able to support 10s of millions of records of tabular data coming in and out of our system.
    Since we need to integrate with so many different systems, we are planning to use RFC as our primary interface in and out of the system. As it turns out, RFC is not good at dealing with this large an amount of data being pushed through a single call.
    We have considered a number of possible ideas; however, we are not very happy with any of them. I would like to see what the community has done in the past to solve problems like this, as well as how SAP currently solves this problem in other applications like XI, BI, ERP, etc.

    Primoz wrote: Do you use KDE (Dolphin) 4.6 RC or 4.5?
    Also, I've noticed that if I move/copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
    Also, run Dolphin from a terminal to try and see what the problem is.
    Hope that helps at least a bit.
    Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking.
    Because I thought that Dolphin is just a "little" wrapper around the cp/mv/cd/ls applications/commands.
