Oracle 9i running out of memory

Folks!
I have a simple 3-table schema with a few thousand entries in each table. After dedicating gigabytes of hard disk space and 50% of my 1+ GB of memory, I run a few simple Oracle Text "contains" searches (see below) on these tables, and Oracle seems to grow by some 25 MB after each query (which typically returns less than a dozen rows) until it eventually runs out of memory and I have to reboot the system.
This is on Solaris 9/SPARC with Oracle 9.2. My query is a simple outer join. I think the memory growth is related to the Oracle Text index/caching, since memory utilization seems pretty stable with simple like '%xx%' queries.
"top" shows a dozen or so processes, each with about 400 MB RSS/SIZE. It has been a while since I did Oracle DBA work, but I am not doing anything special here. The database has all the default settings you get when you create an Oracle database.
I have played with SGA sizes, and no matter how large or small I make the SGA/PGA, Oracle runs out of memory and crashes the system. Pretty poor for an enterprise database to die like that.
Any clue on how to arrest this fatal memory growth in Oracle 9i R2?
Thanks a lot.
-Sanjay
PS: The query is:
SELECT substr(sdn_name,1,32) as name, substr(alt_name,1,32) as alt_name, sdn.ent_num, alt_num, score(1), score(2)
FROM sdn, alt
where sdn.ent_num = alt.ent_num(+)
and (contains(sdn_name,'$BIN, $LADEN',1) > 0 or
contains(alt_name,'$BIN, $LADEN',2) > 0)
order by ent_num, score(1), score(2) desc;
There are the following two indexes on the two tables:
create index sdn_name on sdn(sdn_name) indextype is ctxsys.context;
create index alt_name on alt(alt_name) indextype is ctxsys.context;

I am already using MTS.
The init.ora file is attached below.
Maybe I should repost this with the subject "memory leak in Oracle" to catch developer attention. I posted this a few weeks back in the Oracle Text group and got no response there either.
Thanks for your help.
-Sanjay
# Copyright (c) 1991, 2001, 2002 by Oracle Corporation
# Cache and I/O
db_block_size=8192
db_cache_size=33554432
db_file_multiblock_read_count=16
# Cursors and Library Cache
open_cursors=300
# Database Identification
db_domain=""
db_name=ofac
# Diagnostics and Statistics
background_dump_dest=/space/oracle/admin/ofac/bdump
core_dump_dest=/space/oracle/admin/ofac/cdump
timed_statistics=TRUE
user_dump_dest=/space/oracle/admin/ofac/udump
# File Configuration
control_files=("/space/oracle/oradata/ofac/control01.ctl", "/space/oracle/oradata/ofac/control02.ctl", "/space/oracle/oradata/ofac/control03.ctl")
# Instance Identification
instance_name=ofac
# Job Queues
job_queue_processes=10
# MTS
dispatchers="(PROTOCOL=TCP) (SERVICE=ofacXDB)"
# Miscellaneous
aq_tm_processes=1
compatible=9.2.0.0.0
# Optimizer
hash_join_enabled=TRUE
query_rewrite_enabled=FALSE
star_transformation_enabled=FALSE
# Pools
java_pool_size=117440512
large_pool_size=16777216
shared_pool_size=117440512
# Processes and Sessions
processes=150
# Redo Log and Recovery
fast_start_mttr_target=300
# Security and Auditing
remote_login_passwordfile=EXCLUSIVE
# Sort, Hash Joins, Bitmap Indexes
pga_aggregate_target=25165824
sort_area_size=524288
# System Managed Undo and Rollback Segments
undo_management=AUTO
undo_retention=10800
undo_tablespace=UNDOTBS1
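Before the box falls over, it may help to see where the memory is actually going. A diagnostic sketch, assuming you can still connect as a DBA user (the PGA columns on v$process exist from 9i onward); run it between CONTAINS queries and watch which processes keep climbing:

```sql
-- Snapshot per-process PGA usage; re-run after each CONTAINS query
-- and watch for processes whose allocation keeps growing.
SELECT p.spid,
       p.program,
       ROUND(p.pga_used_mem  / 1024 / 1024, 1) AS pga_used_mb,
       ROUND(p.pga_alloc_mem / 1024 / 1024, 1) AS pga_alloc_mb,
       ROUND(p.pga_max_mem   / 1024 / 1024, 1) AS pga_max_mb
FROM   v$process p
ORDER  BY p.pga_alloc_mem DESC;
```

If the shared servers are the ones growing, a quick test is to repeat the CONTAINS queries over a dedicated-server connection and see whether the growth follows.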

Similar Messages

  • Generating large amounts of XML without running out of memory

    Hi there,
    I need some advice from the experienced XDB users around here. I'm trying to map large amounts of data inside the DB (Oracle 11.2.0.1.0), and by large I mean files up to several GB. I compared the "low level" mapping via PL/SQL in combination with ExtractValue/XMLQuery with the elegant XML view mapping, and the best performance came from the view mapping using the XMLTABLE XQuery PATH constructs. So now I have a view that sits on several BINARY XMLTYPE columns (where the XML files are stored) for the mapping, and another view above this mapping view that constructs the nested XML result document via XMLELEMENT(), XMLAGG(), etc. Example code for better understanding:
    CREATE OR REPLACE VIEW MAPPING AS
    SELECT  type, (...)  FROM XMLTYPE_BINARY,  XMLTABLE ('/ROOT/ITEM' passing xml
         COLUMNS
          type VARCHAR2(50) PATH
            'for $x in .
             let $one := substring($x/b012,1,1)
             let $two := substring($x/b012,1,2)
             return
               if ($one eq "A") then "A"
               else if ($one eq "B" and not($two eq "BJ")) then "AA"
               else if (...)
    CREATE OR REPLACE VIEW RESULT AS
    select XMLELEMENT("RESULTDOC",
                     (SELECT XMLAGG(
                             XMLELEMENT("ITEM",
                                          XMLFOREST(
                                               type "ITEMTYPE",
    ) as RESULTDOC FROM MAPPING;
    Now all I want to do is materialize this document by inserting it into an XMLTYPE table/column.
    insert into bla select * from RESULT;
    Sounds pretty easy, but I can't get it to work: the DB seems to load a full DOM representation into RAM every time I perform a select, an insert into, or use the xmlgen tool. This representation takes more than 1 GB for a 200 MB XML file, and eventually I run out of memory with:
    ORA-19202: Error occurred in XML PROCESSING
    ORA-04030: out of process memory
    My question is: how can I get the result document into the table without memory exhaustion? I thought the DB would be smart enough to generate some kind of serialization/data stream to perform this task without loading everything into RAM.
    Best regards

    The file import is performed via JDBC; CLOB and binary storage work up to several GB, while OR storage gives me ORA-22813 when loading files with more than 100 MB. I use a plain prepared statement (the id should really be a bind variable too):
        File f = new File( path );
        PreparedStatement pstmt = CON.prepareStatement(
            "insert into " + table + " values (?, XMLTYPE(?))" );
        pstmt.setString( 1, id );
        pstmt.setClob( 2, new FileReader( f ), (int) f.length() );
        pstmt.executeUpdate();
        pstmt.close();
    DB version is 11.2.0.1.0, as mentioned in the initial post.
    But this isn't my main problem; the one above is. I prefer using binary XMLType anyway, as it's much easier to index. Anyone have an idea how to get the large document from the view into an XMLType table?
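    One possible workaround, sketched under assumptions (a single XMLTYPE column in the target table bla, and the MAPPING view as defined above): append each ITEM to a temporary CLOB one row at a time, so only one fragment is serialized in memory at once, instead of letting XMLAGG build the whole document:

    ```sql
    -- Hypothetical sketch: stream ITEM fragments into a temporary CLOB
    -- instead of materializing one giant XMLAGG document in memory.
    DECLARE
      l_doc CLOB;
    BEGIN
      DBMS_LOB.createtemporary(l_doc, TRUE);
      DBMS_LOB.writeappend(l_doc, 11, '<RESULTDOC>');
      FOR r IN (SELECT XMLELEMENT("ITEM",
                         XMLFOREST(m.type AS "ITEMTYPE")) AS item_xml
                FROM   mapping m)
      LOOP
        -- getclobval() serializes just this one fragment
        DBMS_LOB.append(l_doc, r.item_xml.getclobval());
      END LOOP;
      DBMS_LOB.writeappend(l_doc, 12, '</RESULTDOC>');
      INSERT INTO bla VALUES (XMLTYPE(l_doc));
      COMMIT;
      DBMS_LOB.freetemporary(l_doc);
    END;
    /
    ```

    Whether this avoids the ORA-04030 depends on the fragment size per row; it trades the single XMLAGG/DOM materialization for row-at-a-time LOB appends.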

  • I am running out of memory on my hard drive and need to delete files. How can I see all the files/applications on my hard drive so I can see what is taking up a lot of room?

    I am running out of memory on my hard drive and need to delete files. How can I see all the files/applications on my hard drive so I can see what is taking up a lot of room?
    Thanks!
    David

    Either of these should help.
    http://grandperspectiv.sourceforge.net/
    http://www.whatsizemac.com/
    Or search 'disk size' in the App Store.
    Be careful with what you delete and have a backup BEFORE you start. You may also want to reboot to try to free any memory that may have been written to disk.

  • How can I avoid running out of memory when creating components dynamically

    Hello everyone,
    Recently, I am planning to design a web application. It will be used by all middle school teachers in a region to make examination papers and it must contain the following main functions.
    1) Generate test questions dynamically. For instance, a teacher who logs on to the web application will only see a select-one menu and a Next Quiz button. The former determines the number of options for the current multiple/single-choice question; the latter creates the appropriate input text elements according to the selected option number. That is to say, if the teacher selects 4 in the menu and presses the Next Quiz button, 5 input text form elements will appear. The first one is for the question to be asked, such as "1. What is the biggest planet in the solar system?"; the others are optional answers like a) Uranus, b) Saturn, c) Jupiter, d) Earth. Each answer stands for an input text element. When the teacher fills in the fourth answer, another select-one menu and Next Quiz button will emerge on the fly just under this answer, allowing the teacher to make the second question. The same thing repeats for the following questions.
    2) Undo and Redo. Whenever a teacher wants to roll back or redo what he has done, he just presses the Undo or Redo button. In the previous example, if the teacher selects the third answer and presses the Delete button to drop this answer, it will delete both the literal string content and the input text element, changing answer d to c automatically. If, after that, he decides to get back the original answer c, Jupiter, he can just click the Undo button as if he hadn't made the deleting operation.
    3) Save unfinished work on the client side. If a teacher has done half of his work, he can press the Save button to store what he has done on his own computer. The reason for doing so is simply to alleviate the burden on the server. Although all finished test papers must be saved in a database on the server, unfinished papers are sometimes dropped forever, or only become the final test papers after several months, so keeping them on the server would waste its disk space. The next time, the teacher can press the Restore button on the page to load the previously stored part of the test paper from his own computer and continue to finish the whole paper.
    4)Allow at least 1,000 teachers to make test papers at the same time. The maximum question number per examination paper is 60.
    Here are my two rough solutions,
    A. Using JSF.
    B. Using JavaScript and plain JSP without JSF.
    The comparison of the two solutions:
    1) Both schemes can implement the first and second requirements. In a JSF page I could add a standard panelGrid tag and use its binding attribute. In the backing bean, the method specified by the binding attribute is responsible for generating HtmlInput objects and adding them to the HtmlPanelGrid object on the fly. Every HtmlInput object corresponds to a question subject or an optional answer. The method is called by an actionListener, registered on the Next Quiz commandButton and triggered by clicking that button on the client side. JSF also makes the HtmlInput objects easy to manage, e.g. panelGrid.getChildren().add(htmlInput) and panelGrid.getChildren().remove(htmlInput) correspond to undoing the deletion of an optional answer and redoing that deletion, respectively. I know JavaScript can also achieve these goals, but it could be more complex, since I don't know JavaScript well.
    2) I cannot find a way to meet the third demand right now. I am eager to hear your suggestions.
    3) Using JSF, I think, can't allow 1,000 teachers to work on their own papers at the same time, because in this scenario, supposing each questionnaire has 60 questions and 4 answers per question, there will be approximately 300,000 HtmlInput objects (1,000 x 60 x (4+1)) created on the server side. The server would undoubtedly run out of memory. To make things better, we could use a custom component rendered as a whole question including all its optional answers; that is, one custom component on the server side stands for a whole question on the client side. Even so, about 60,000 (1,000 x 60) of these custom components would be created progressively and dynamically, plus other UISelectOne and UICommand objects, which most servers can't afford either. Do I have to use JavaScript to avoid occupying the server's memory in this way? If so, I have to go back to JavaScript and plain JSP without JSF.
    Thank you in advance!
    Best Regards, Ailsa
    2007/5/4

    Thank you for your quick response, BalusC. I really appreciate your answer.
    Yes, you are right. If I manually code the same amount of those components in the JSF pages instead of generating them dynamically, the server will still run out of memory. That is to say, JSF pages might not accommodate a great deal of concurrent visiting. If I upgrade the server to just allow 1,000 teachers making their own test papers at the same time, but when over 2,000 students take the same questionnaire simultaneously, the server will need another upgrading. So I have to do what you have told me, using JS+DOM instead of upgrading the server endlessly.
    Best Regards, Ailsa

  • Lightroom 5 permanently runs out of memory

    Lightroom 5 on Windows 7 32 Bit and 8 Gigabytes of memory (more than the 32 Bit system can use) permanently runs out of memory when doing some more complex edits on a RAW file, especially when exporting to 16 Bit TIFF. The RAW files were created by cameras with 10 up to 16 megapixel sensors with bit depths between 12 and 14.
    After exporting one or two images to 16 Bit uncompressed TIFF an error message "Not enough memory" will be displayed and only a Lightroom restart solves that - for the next one to two exports. If an image has much brush stroke edits, every additional stroke takes more and more time to see the result until the image disappears followed by the same "Not enough memory" error message.
    A tab character in the XMP sidecar file is *not* the reason (I ensured that), as mentioned in another post. It seems that Lightroom in general does not allocate enough memory and frees allocated memory too little and too late.
    Please fix this bug; it's not productive to permanently quit and restart Lightroom when editing/exporting a few RAW files. Versions prior to Lightroom 4 did not have this bug.
    P.S. I'm posting here because it was not possible to post at http://feedback.photoshop.com/photoshop_family/topics/new It's very bad design to let a user spend much time writing and then say "Log in", when logging in with the Adobe ID and password does not work (creating accounts on Facebook etc. is not an acceptable option; the Adobe ID should be enough). Also, a bug tracker such as Bugzilla would be a much better tool for improving the software and finding relevant issues to avoid duplicate postings.

    First of all: I personally agree with your comments regarding the feedback webpage. But that is out of our hands since this is a user-to-user forum, and there is nothing we users can do about it.
    Regarding your RAM: You are running Win7 32-bit, so 4 GB of your 8 GB of RAM sit idle since the system cannot use it. And, frankly, 4 GB is very scant for running Lr, considering that the system uses 1 GB of that. So there's only 3 GB for Lr - and that only if you are not running any other programs at the same time.
    Since you have a 8 GB system already, why don't you go for Win7 64-bit. Then you can also install Lr 64-bit and that - together with 8 GB of RAM - will bring a great boost in Lr performance.
    Adobe recommends running Lr in the 64-bit version. For more of their suggestions on improving Lr performance, see here:
    http://helpx.adobe.com/lightroom/kb/performance-hints.html?sdid=KBQWU
    for more: http://forums.adobe.com/thread/1110408?tstart=0

  • I want to use Meteor app but run out of memory.

    Is there a way I can store data in the cloud, freeing up space to work in any given app? I run out of space all the time. I use lots of apps with different media, so
    can I, say, do some work in one app, save the files to the cloud, and then reload them later when I need them? Basically, I feel I should keep only the apps on my iPad, and in each separate app save the work to the cloud as I finish for the day, to keep my storage uncluttered. Any help would be appreciated. I've spent a lot on apps but keep running out of memory. Thanks.

    Only if the individual apps support saving to the cloud; otherwise, no. There is no user access to the iCloud storage area.
    It's only there for backups and data synchronization between certain apps that support it.

  • I have a file where I am running out of memory can anyone take a look at this file and see?

    I am trying to make this file 4'x8'.
    Please let me know if anyone can help me get this file to that size.
    I have a quad-core processor with 6 GB of RAM and have gotten the file to 50"x20", but I run out of memory shortly thereafter.  Any help would be appreciated.
    Thanks,

    Where to begin? You should look into using a swatch pattern instead of those repeating circles. Also, I see that each circle in your pattern is actually a stack of four circles, but I see no reason why. Perhaps Illustrator is choking on the huge number of objects required to make that pattern as you have constructed it.
    Here is a four foot by eight foot Illustrator file using a swatch pattern. Note that, despite the larger dimensions, the file is less than one sixteenth the size.

  • My mac's run out of memory and I can't find the culprit!

    Hi, I'm in serious need of some help! I'm sure this is simple, but I'm about to break down over it – I use my mac for everything. I've got a 200gb 2009 macbook (running iOS7), and it's told me it's run out of memory. The storage tab in 'about this mac' tells me 108GB is being used for video – but I can't find them! My iPhoto has about 17GB of movies, my iTunes has around 20GB, and I've got maybe another 10GB in files within finder – but that's still only half the videos my mac is saying it has? How do I find the rest? I've got 80GB being used by 'other' as well – is that just pages and numbers documents, along with the iOS? Is there a way of finding exactly what all my memory's being allocated to?
    I've got the entire mac backed up on an external hard drive, but I'm terrified of deleting anything from the mac in case that fails. I plan on getting a second external HD, but even then I think I'll be too worried (I've heard about so many hard drives continuously failing). How does anyone manage all their stuff?!?
    Thank you in advance, for any help you can offer.

    Just a slight correction to start: you're not running iOS 7. You're running a version of OS X; iOS is for mobile devices like iPhones and iPads. To find out which version of OS X you're running, click the Apple menu at the top left and select About This Mac.
    This http://pondini.org/OSX/LionStorage.html should help you understand "Other".

  • HT201317 dear, i have a question please.  if i want to keep a folder of pictures in my icloud account; then delete this folder off my phone would it be deleted from icloud account too?? cause im running out of memory on my iphone

    Dear all, I have a question, please. If I want to keep a folder of pictures in my iCloud account and then delete this folder off my phone, would it be deleted from my iCloud account too? I ask because I'm running out of memory on my iPhone.
    Thanks

    You can't use iCloud to store photos. There's a service called Photo Stream which is used to sync photos from one device to another, but it can't be used to store photos permanently.
    Always sync photos from a device to your computer. Then you can delete them from the device.
    Read this...  import photos to your computer...
    http://support.apple.com/kb/HT4083

  • What causes my iphone 5s to keep on running out of memory with any new download. I recently updated to  ios 8.2

    What causes my iphone 5s to keep on running out of memory without any new download. I recently updated to  ios 8.2

    I meant to say without downloading or receiving anything

  • WARNING:Oracle process running out of OS kernel I/O resources (1)

    Hi,
    on my server, an IBM Power6 with Oracle 10.2.0.4, the dbwr trace reports some warnings like this:
    *** 2010-08-31 06:26:46.574
    Warning: lio_listio returned EAGAIN
    Performance degradation may be seen.
    WARNING:Oracle process running out of OS kernel I/O resources (1)
    *** 2010-09-01 07:11:38.691
    Warning: lio_listio returned EAGAIN
    Performance degradation may be seen.
    WARNING:Oracle process running out of OS kernel I/O resources (1)
    The AWR for this period of time reports:
    Top 5 Timed Events
    Event                     Waits  Time (s)  Avg wait (ms)  % Total Call Time  Wait Class
    db file sequential read  509,435    2,610              5               42.3  User I/O
    CPU time                            1,714                              27.8
    log file sync             55,309    1,146             21               18.6  Commit
    log file parallel write   60,498      937             15               15.2  System I/O
    db file parallel write    27,166      295             11                4.8  System I/O
    The workload was a sqlldr run with rows=100000.
    The warnings in the trace file worry me.
    The message
    Warning: lio_listio returned EAGAIN
    seems to depend on aioserver being low: 10 instead of the 50 suggested by the formula (10 x 40 disks / 8 CPUs).
    Can anyone help me?

    In note 443368.1:
    " If you are using JFS/JFS2, then set the initial value to *10 times the number of logical disks divided by the number of CPUs*."
    pstat -a | grep -c aios
    161
    lsattr -E -l aio0
    autoconfig available STATE to be configured at system restart True
    fastpath enable State of fast path True
    kprocprio 39 Server PRIORITY True
    maxreqs 4096 Maximum number of REQUESTS True
    maxservers 10 MAXIMUM number of servers per cpu True
    minservers 1 MINIMUM number of servers True
    Edited by: Davy on 1-Sep-2010 3.04

  • WARNING:Oracle process running out of OS kernel I/O resources

    Hi!
    I am getting below warning in the dbwr trace files almost daily at different times:
    WARNING:Oracle process running out of OS kernel I/O resources
    We are using :
    SuSE Linux Enterprise Edition 10 Sp 2
    OS Kernel --> 2.6.16.60-0.39.3-smp
    Oracle Database 10.2.0.2 64-bit with only 5380055 bug fix applied
    Storage: IBM DS 8300
    fs.aio-max-nr = 65536
    Please help in resolving this issue.
    Regards,
    Raju

    Hi Mark,
    Thank you for your quick reply.
    Doc #396057.1 (SuSE 10.2 OS issue) --> This note says "The archiver is getting stuck with ORA-19502 and ORA-27061", whereas in my case I'm not getting any ORA- errors in the alert log or anywhere else; I'm only getting these warnings in the dbwr trace files, and only intermittently.
    I have consulted my system team regarding disk I/O; they said we are using very high-end storage and there are no errors or warnings in the OS system logs.
    # 415872.1 --> this note says "HUNG DATABASE INSTANCE IF LINUX KERNEL MISS AIO REQUEST". This is not the situation in our case.
    # 6687381.8 --> says the versions confirmed as affected are 10.2.0.3 and 10.2.0.4.
    I'm unable to find anything which matches my situation; please help.
    regards,
    raju

  • Oracle process running out of OS kernel I/O resources

    Hi All,
    Here is our platform:
    Oracle Version: 10.2.0.4.0
    O/S: Linux x86_64
    We are getting the below Error Msg:
    ERROR:
    ===================================================================
    WARNING:io_submit failed due to kernel limitations MAXAIO for process=128 pending aio=123
    WARNING:asynch I/O kernel limits is set at AIO-MAX-NR=1048576 AIO-NR=64384
    WARNING:Oracle process running out of OS kernel I/O resources
    ===================================================================
    To avoid this we recently upgraded Oracle to 10.2.0.4.0 and also applied the interim patches below, but we are still getting the same error:
    p6051177_10204_Linux-x8664.zip
    p6024730_10204_Linux-x8664.zip
    p5935935_10204_Linux-x8664.zip
    p5923486_10204_Linux-x8664.zip
    p5895190_10204_Linux-x8664.zip
    p5880921_10204_Linux-x8664.zip
    p5756769_10204_Linux-x8664.zip
    p5747462_10204_Linux-x8664.zip
    p5561212_10204_Linux-x8664.zip
    p6944036_10204_Linux-x8664.zip
    p6826661_10204_Linux-x8664.zip
    p6775231_10204_Linux-x8664.zip
    p6768114_10204_Linux-x8664.zip
    p6679303_10204_Linux-x8664.zip
    p6645719_10204_Linux-x8664.zip
    p6452766_10204_Linux-x8664.zip
    p6379441_10204_Linux-x8664.zip
    p6324944_10204_Linux-x8664.zip
    p6313035_10204_Linux-x8664.zip
    p6151380_10204_Linux-x8664.zip
    p6084232_10204_Linux-x8664.zip
    p6082832_10204_Linux-x8664.zip
    p7573151_10204_Linux-x8664.zip
    p7522909_10204_Linux-x8664.zip
    p7513673_10204_Linux-x8664.zip
    p7300608_10204_Linux-x8664.zip
    p7287289_10204_Linux-x8664.zip
    p7149863_10204_Linux-x8664.zip
    p7027551_10204_Linux-x8664.zip
    p6972843_10204_Linux-x8664.zip
    p6954829_10204_Linux-x8664.zip
    p7592168_10204_Linux-x8664.zip
    p8201796_10204_Linux-x8664.zip
    p7592346_10204_Linux-x8664.zip
    p6705635_10204_Generic.zip
    p7252962_10204_Generic.zip
    Do we need to set any Oracle parameters?
    Thanks And Regards:
    Giridhar N

    Hi,
    in limits.conf I am seeing the below values:
    # Added for SAP on 2008-07-17 15:25:47 UTC
    @sapsys          soft    nofile          48000
    @sapsys          hard    nofile          48000
    @sdba            soft    nofile          32800
    @sdba            hard    nofile          32800
    @dba             soft    nofile          32800
    @dba             hard    nofile          32800
    Still I am getting the same error:
    WARNING:io_submit failed due to kernel limitations MAXAIO for process=128 pending aio=127
    WARNING:asynch I/O kernel limits is set at AIO-MAX-NR=65536 AIO-NR=65536
    WARNING:Oracle process running out of OS kernel I/O resources (1)

  • Oracle process running out of OS kernel I/O resources (1)

    Hello All,
    I am getting the "Oracle process running out of OS kernel I/O resources (1)" error message in the DBWxx.trc files on the AIX 5.3 platform. Can anybody help me, please?
    Regards,
    Ajay

    What Oracle version? What, if any, error messages follow that message in the trace files? Anything in the alert.log?
    Did you try an MOS search? How about a Google search?
    This seems to be a pretty well known error.
    -Mark

  • Running out of memory building csv file

    I'm attempting to write a script that runs a query on my
    database. It will generally be working with about 10,000-15,000
    records. It then checks to see if a certain file exists. If it
    does, it adds the record to an array. When it's done looping
    over all the records, it takes the array that was created and
    outputs a CSV file (usually with about 5,000-10,000 lines).
    But before that ever happens, it runs out of memory. What can I
    do to keep it from running out of memory?

    quote:
    Originally posted by:
    nozavroni
    I'm attempting to write a script that does a query on my
    database. It will generally be working with about 10,000 - 15,000
    records. It then checks to see if a certain file exists.
    Sounds pretty inefficient to me. Is there no way you can
    modify the query so that it only selects the records for which the
    file exists?
