Slow searches in large file ...

The find function is almost unusable as is in my large file, because the program won't let me finish typing in the search field once it starts looking for the first string of text I type. I have enabled the whole words option, but this doesn't help much. I tried the "use selection for find" option in the menu, but it is always disabled, so I'm not sure if it is of any use...
Any ideas how to get search to wait until I have typed the whole search term before it starts to bog down?
Thanks
jcm

Thanks for the reply.
The file is 7 MB, 1000 rows, columns out to BB.
I don't mind that the search is slow, but when I want to search for "johnson", for example, it stops me from typing after "jo", then again after "john", then again after "johns", and so on. The best workaround so far is to type "johnson" into a blank cell somewhere and then paste it into the search field... but if that's the answer, it's back to Excel.

Similar Messages

  • 4.2.3/.4 Data load wizard - slow when loading large files

    Hi,
    I am using the data load wizard to load CSV files into an existing table. It works fine with small files of up to a few thousand rows. When loading 20k rows or more, the loading process becomes very slow. The table has a single numeric column as the primary key.
    The primary key is declared at "Shared Components" -> Logic -> "Data Load Tables" and is recognized as "pk(number)" with "case sensitive" set to "No".
    While loading data, this configuration leads to the execution of the following query for each row:
    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
    which can be found in the v$sql view while loading.
    This makes the loading process slow, because the UPPER function prevents any index from being used.
    It seems that the "case sensitive" setting is not evaluated.
    Dropping the numeric index on the primary key and using a function-based index instead does not help.
    Explain plan shows an implicit "to_char" conversion:
    UPPER(TO_CHAR("PK")) = UPPER(:UK_1)
    This conversion does not appear in the query itself, but maybe it is necessary for the function-based index to work.
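    For reference, a function-based index built on that same expression would look something like the sketch below (reusing the table and column names quoted above; the index name is illustrative, and whether the wizard's generated query would then actually use it is exactly what seems not to work here):
    create index pd_if_csv_row_fbi on "KLAUS"."PD_IF_CSV_ROW" (upper(to_char("PK")));
    -- gather statistics so the optimizer will consider the new index
    exec dbms_stats.gather_table_stats('KLAUS', 'PD_IF_CSV_ROW');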
    Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
    Best regards
    Klaus

    Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
    If all of the CSV files are identical:
    use the Excel2Collection plugin (Process Type Plugin - EXCEL2COLLECTIONS)
    create a VIEW on the collection (makes it easier elsewhere)
    create a procedure (in a Package) to bulk process it.
    The most important thing is to have, somewhere in the Package (i.e. your code that is not part of APEX), information that clearly states which columns in the Collection map to which columns in the table, the view, and the variables (APEX_APPLICATION.g_fxx()) used for Tabular Forms.
    MK
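    A rough sketch of the bulk step described above, assuming the Excel2Collection plugin has parsed the rows into an APEX collection named 'CSV_DATA' (the collection, table and column names here are illustrative, not taken from the original post):
    -- APEX_COLLECTIONS exposes the parsed rows as columns c001, c002, ...;
    -- map them onto the target table in a single set-based insert.
    insert into pd_if_csv_row (pk, col_a, col_b)
    select to_number(c001), c002, c003
      from apex_collections
     where collection_name = 'CSV_DATA';
    commit;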

  • Spot Healing Brush Content-Aware SLOW! on large files

    I updated from Photoshop CC to CC 2014. When working with ultra-large-format scans (more than 3 gigapixels), the Spot Healing Brush set to Content-Aware takes unusually long (5-7 seconds to retouch small dust spots). In Photoshop CC the same Content-Aware Spot Healing Brush actions were executed instantly, without any perceived delay, once I had set a source for the Clone Stamp tool (which I found weird, but doing so really did make the Spot Healing Brush faster on large files). On smaller images it's still very fast in CC 2014, but not on the 3.5-gigapixel images I'm working on at the moment.
    I know there have been some changes/improvements to the Content-Aware tools, but in this case they are seriously hurting performance. I haven't noticed any other slowdowns in processing performance with the update.
    I'm working on a dual-CPU (2 x 2.4 GHz 6-core) and 128 GB RAM Windows 8.1 machine. I've allowed Photoshop to use up to 120 GB RAM.
    Is there any way to reset the Content-Aware Spot Healing Brush to work as fast as it had before the CC 2014 update?

    My performance settings are the same for CC and CC 2014.
    I assume the problem is introduced by the changes/improvements to the Content-Aware tools, and on normal file sizes it might not matter to the user whether the operation takes 25 or 200 ms. But on large files it seriously slows down the retouching work.
    So far I've been doing as you suggested and using CC for that type of work. But since it takes about 90 minutes to save these huge files, and AFAIK Photoshop CC and CC 2014 cannot be run as separate instances simultaneously, it's quite inconvenient not being able to use CC 2014 for other work while CC is saving the file.
    If the new Content-Aware functions take that much more processing, it would be great to have a choice (similar to the choice of RAW engine used in ACR). However, if the new Content-Aware functions are simply not yet optimized for performance, then it's a bug.
    I will file a Feature Request/Bug Report and ask if there is anything that can be done about it in Photoshop CC 2014.
    Thanks for your help, Eugene.

  • CFB slowing up on large files

    G'day
    This is just an observation more than anything else.
    The performance of the release version of CFB has been much better than the beta versions, which is heartening.  During the beta I had to have as much background functionality as possible switched off (e.g. workspace rebuilding, syntax checking, etc.) to even get work done, but on the release version I've been able to leave everything on, and it's been OK thus far.
    I'm in the process of writing a CFC which has grown rather large (I might need to do some refactoring, but I'll get to that a bit further down the track...).  CFB seemed to handle it fine up until about 1400-1500 lines, but I've now tipped 2000 lines (92 kB) and CFB is beginning to struggle.  Syntax highlighting is so slow now that it's always wrong for a while until CFB catches up, and even data entry seems to have slowed down - I guess this is due to syntax checking or something - but I am typing faster than CFB can render the characters I've typed.  I'm a reasonably fast typist, but when I'm coding I'm not spectacularly fast.  Sometimes I need to pause to think about what I'm doing ;-)
    Unfortunately I do not have CFEclipse installed on this machine to compare performance.
    Now... I guess I expect the thing to slow down eventually as files grow, but as it currently stands I expect to start getting frustrated with having to wait around for CFB to catch up with me if it slows down much more.  I suspect this file will grow by another 500-odd lines yet.
    Adam

  • Slow Search on large spreadsheets

    The progressive search does not work on any large spreadsheet. If I want to search for "peanut", Numbers seems to actually perform five separate searches as I type: "pe", "pea", "pean", "peanu", "peanut". That is just crazy. If I wanted to search for "pe" or "pean" I would have entered that.
    The net effect of this is that on a large spreadsheet, say 1000 rows, the search takes forever, pegging my MacBook Pro CPU (well, only one of the CPUs, as Numbers is unfortunately not taking advantage of both!).
    Apple, please redesign the search feature to have better performance. I have had to move my spreadsheet back to Excel on my Windows test machine to work on any large spreadsheets.

    MB,
    It's a well-known issue that large Numbers documents are slow. You can register your complaint with Apple by using the Feedback menu item under Numbers.
    Jerry

  • Very Slow performance with large files

    Using iTunes with my Apple TV, I've been in the slow and painful process of digitizing my DVD library, and when converting the LOTR (extended edition) trilogy I ran into a problem post-conversion. The files play fine in QuickTime 7.3.1 and I can add them to the iTunes library, but when I attempt to edit any information within iTunes and then save, iTunes freezes for several minutes before either working or crashing (odds are around 50/50). If I just add the file to the library and try to play it, the movie doesn't show up on the Apple TV either, which is even stranger.
    Output format of the movie: MP4/H.264, native 720x480 resolution, 23.97 fps, 2 Mbps video stream, 128k audio stream (limit of WinAVI).
    Output Size: 4.4GB
    Length: 4hours 24minutes
    Software versions: iTunes 7.3.1, QuickTime 7.3.1
    OS: Windows XP Pro SP2(current patch level as of 7/15).

    It is possible that iTunes has a 4 GB limit. I'm trying to shed a little light on the problem, because iTunes Help doesn't say.
    Cheers

  • Is illustrator running slow because of large file size?

    Is my Illustrator running slow? I have an i7 3770K, 16 GB RAM, a GTX 660i 2 GB, an H100i, an AX1200i and a Maximus Formula V. I'm running dual 24" 1920x1080 monitors. The file is 800x600 pixels with 5 layers and 100+ paths in each layer. Is the program running slow because the file is too big, or is there a problem with Illustrator?

    If you have an internal HD available, you can select it as a scratch disk...
    The dialog I'm referring to is from CS3, but you have something similar in your Preferences.
    If you are running out of disk space, then there isn't enough scratch space to perform certain operations efficiently, and that could slow things down considerably.  Just a thought.

  • Very slow in sending large file email

    I am having a problem sending email with attachments of around 8 MB in size; the email cannot be sent out. Is there any setting or solution that can help with this?

    I have been experiencing similar problems with slow downloads from my WS2012 server running Mercury Mail to our Outlook 2010 clients.
    I finally sussed this one out.
    There is much talk out there that antivirus products no longer scan email. In the words of another post:
    "There's no need to scan email, only attachments, and that is done by all modern antimalware programs as the files are written to the PC file system. All of the major AV vendors have been telling their customers for several years now that 'email scanning' is no longer necessary; it's a holdover from the early days of AV programs, when Microsoft hadn't yet created the API (Application Programming Interface) sets that allow the AV programs to scan a file before any user access (including email programs) is allowed."
    That being said, the AV I have installed on my WS2012 server (SC2012EP) still looks at the application. So what I did was exclude the Mercury Mail application in SC2012EP and test it.
    An 8.9 MB email which took 4 minutes to download now takes only 9 seconds.
    Problem solved!!
    I hope it helps you.

  • Search for big files

    I have a MacBook Pro and I am running out of hard disk space. I used to just create a smart folder that contained files larger than 100 MB. Now I can't see the size in the smart folder view. Is there any way to make this happen? Did Apple seriously not include an option to search for large files and display the file size? If that is the case, it is a serious problem.
    Ken

    I should mention a "not really a workaround" thing you can do to help, at least a little:
    When you have your list of files larger than whatever, select the first item, then bring up the "Show Inspector" window (Command-Option-I) and place it next to your results window. It shows the actual size of the selected file. Click back on the results window if necessary (this is ANOTHER annoyance: Finder windows sometimes lose focus to the Info window without any visual feedback from either window that this has happened, and sometimes they don't). You can now use the arrow keys to move up and down the list and read the size of each file in the Info window. Not really what one wants, but at least you can locate the biggest files.
    Francine Schwieder

  • Slow large file transfer speed with LaCie 1T firewire 800 drives

    I am transferring two large files (201 GB and 95 GB) from one LaCie 1 TB FireWire external drive to another (using separate connections to a PCI Express FireWire 800 card in my Quad G5). The transfer time is incredibly slow: over four hours for the 201 GB file and over 2 hours for the 95 GB file.
    Does anyone have any ideas why this is so slow or what I might try to speed up the transfer rates?
    Thank you.
    G5 Quad 2.5 8GB DDR2 SDRAM two SATA 400GB   Mac OS X (10.4.5)  

    You posted this in the Powerbook discussion forum. You may want to post it in the Power Mac G5 area, located at http://discussions.apple.com/category.jspa?categoryID=108

  • I need to update to the latest version of Snow Leopard (currently running v10.6). Because of where we live, the slow download speed for such a large file has kept me from downloading the update. What can I do short of hooking up the computer elsewhere?

    Do you ever visit a friend or relative with a Mac who has faster internet? Maybe the local library has Macs on a fast line. If so, get a USB thumb drive and put this link on it from your computer:
    http://support.apple.com/kb/DL1399
    Then put this link on the drive:
    http://support.apple.com/kb/DL1429
    When you are at a place that has a Mac with a decent download speed, insert the thumb drive in that Mac, click on the first link ("DL1399") and direct the download to your thumb drive. Then do the same thing with the second link.
    The installer files will now be on the thumb drive. When you get home, drag them from the thumb drive to your desktop and install the Combo update first.

  • DW MX 2004 Slow with large files?

    I'm using DW MX 2004 with a static website with a few thousand files.
    It's very slow when opening multiple files at the same time or opening a large html file.
    But DW4 is fine, nice and quick compared with DW MX 2004.
    Is there any way to help DW MX 2004 work better with these files?
    Many thanks, Craig.

    Many thanks.
    But I'm already running the 7.0.1 update.
    "Randy Edmunds" <[email protected]> wrote in
    message
    news:eggcl0$6ke$[email protected]..
    > Be sure that you've installed the 7.0.1 updater. There
    were a few
    > performance fixes, especially on the Mac, that may help
    your workflow.
    >
    > HTH,
    > Randy
    >
    >
    >> I'm using DW MX 2004 with a static website with a
    few thousand files.
    >> It's very slow when opening multiple files at the
    same time or
    >> opening a large html file.
    >> But DW4 is fine, nice and quick compared with DW MX
    2004.
    >>
    >> Is there any way to help DW MX 2004 work better with
    these files?

  • Numbers 09 is slow dealing with relatively large files

    As some of you might know, Excel 2008 is painfully slow when opening relatively large files (about 12000 rows and 28 columns) that have simple 2D x-y charts. FYI, Excel 2004 doesn't have the same problem (and Excel 2003 in the XP world is even better). I purchased iWork '09 hoping that Numbers '09 would help, but unfortunately I have the same problem. iWork '09 takes more than 5 minutes to open the file - something the older versions of Excel could do in seconds. When the file opens up, it is impossible to manipulate it. I have a MacBook with a 2.4 GHz Intel Core 2 Duo and 4 GB of RAM running OS X (version 10.5.6).
    Has anybody else experienced the same problem? If so, is there a bug in iWork '09, or is it because it isn't meant to deal with large files?
    I appreciate your response.

    Numbers '08 was very slow.
    Numbers '09 is not so slow, but it's not running like old-fashioned spreadsheets.
    I continue to think that it's a side effect of the use of XML to describe the document.
    We may hope that the developers will discover programming tricks to speed up the beast.
    Once again, I really don't understand why users buy an application before testing it with the FREE 30-day demo available from Apple's web page.
    Yvan KOENIG (from FRANCE, Sunday 8 March 2009, 13:13:18)

  • COMPSUMM.BOX Slow to Process Larger SUM Files

    Hey Everyone,
    We have a large SCCM 2007 environment (over 2000 sites, 80,000+ clients) and we are finding that the Central Site doesn't appear to be able to keep up with the influx of messages coming into COMPSUMM.BOX.  We can move all of the files out and feed them back in slowly, but even if we do nothing but clear out the files, the inbox gets backlogged again.
    In this inbox there is a combination of SVF and SUM files.  The SVF files are small and process without any issues.  The problem is the SUM files.  SUM files up to about 500 KB appear to process OK and don't hold up the overall inbox processing too much, but the problem is when the 2 MB, 3 MB, 4 MB and 5 MB SUM files come in.  As soon as they hit, the inbox comes to a crawl.  It takes a significant amount of time to process each of these larger files (15+ minutes each), and while it's processing one, more come in, then it gets to the next one, and then more come in, and by the end of the day we can end up with as many as 300,000 files in this folder.  Eventually it seems to process a lot of them and occasionally even gets caught up (down to only 10,000 files), but in busier periods it needs help from us moving some of the files out.  Also, the large SUM files seem to process at a higher priority and leave other items sitting there not moving while the system works on the larger files.  What process generates the larger files flowing through here, and how often?
    My question is: is there anything we can do to reduce the amount of data flowing through this specific inbox so it is able to better keep up with the load?  Can we change the processing priority so that the larger SUM files don't automatically jump to the front of the queue?  I know we can change the schedule for the "Site System Status Summarizer", but there doesn't seem to be such a schedule for the "Component Status Summarizer".  I don't want to turn off replicating these messages from the child primary sites, but we also don't want to be dealing with a constant backlog situation either.
    Any suggestions are much appreciated.
    Thanks!
    -Jeff

    Hi Garth,
    I wasn't saying that the larger files process first because they are larger files, but rather that SUM files appear to process at a higher priority than SVF files in this inbox.  Once one of the large SUM files hits its turn in the queue, COMPSUMM.BOX starts growing.  There were 88,000 files in here when I came in this morning, and it still had files from the 17th that hadn't processed.  I moved out all the large files (1000+ of them) and all of the other files in the inbox processed.  Move the 1000 back in... the backlog begins again.  We can have 300 SVF files come in and one SUM file, and the SUM file will begin processing immediately while the SVF files don't appear to process until the SUM file is done.  I was thinking more along the lines that the content of the SUM files may be prioritized higher than others.
    By 2000 sites, yes, CM07 Primary and Secondary Sites: 19 Primary and 2286 Secondary to be exact.
    No errors in COMPSUMM.LOG.  Just very busy processing.
    HINV = Every 4 Days
    SINV = Every 7 Days
    SWM = Every 7 Days
    Heartbeat = Every 7 Days
    Discovery/Inventory isn't an issue... no DDR or MIF backlogs.  We don't run AD System Discovery, only Group Discovery, and it runs on the tier-two primary sites once a month, staggered by region.
    CPU is steady at about 50% with SMSEXEC using about 30%.
    Memory is steady at around 10GB of the 36GB Total
    SQL is interesting...  SQL is a cluster on a dedicated server with 48 GB of physical memory, running SQL Server 2008 SP3 64-bit.  SQL is configured to be able to use up to 42 GB of memory, leaving 6 GB for the OS.  In Task Manager, sqlservr.exe shows it is using about 800-900 MB most of the time.  However, the status bar in Task Manager shows Physical Memory: 98%, consistently.  I ran a TASKLIST and exported the results to view in Excel, but when I total everything there it is only about 2.1 GB.  Hmm...  Running RAMMap.exe, it shows 43 GB of memory allocated as AWE, yet AWE is NOT enabled in SQL.  From another Google search this appears to be something others have seen as well, but I'm not finding any good solution, or any really clear indication that this is actually a problem as opposed to simply how Win2K8 is managing memory.  I don't like seeing the server showing 98% memory usage though.  I'm going to continue to look at this further and feed the info back to the team.
    I have not performed the DBCC commands on the SQL database (yet) but will do so.
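    For reference, the sort of checks usually meant by that, sketched in T-SQL with a placeholder database name ('SMS_ABC' stands in for the actual site database; the memory query simply shows the 42 GB cap mentioned above):
    -- show the current max server memory setting
    exec sp_configure 'show advanced options', 1;
    reconfigure;
    exec sp_configure 'max server memory (MB)';
    -- basic consistency check of the site database
    dbcc checkdb (N'SMS_ABC') with no_infomsgs;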
    Thanks!
    -Jeff

  • CS6 very slow saving large files

    I have recently moved from PS CS5 to CS6 and have noticed what seems to be an increase in the amount of time it takes to save large files. At the moment I am working with a roughly 8 GB .psb, and a save takes about 20 minutes. For this reason I have had to disable the new autosave features; otherwise it just takes far too long.
    CS5 managed to save larger files more quickly and with less memory available to it. Looking at system resources while Photoshop is trying to save, it is not using its full allocation of RAM specified in the performance preferences, and there is still space free on the primary scratch disc. The processor is barely being used and disc activity is minimal (Photoshop might be writing at 4 MB/s max, often not at all, according to Windows).
    I am finding the new layer filtering system invaluable so would rather not go back to CS5. Is this is a known issue or is there something I can do to speed up the saving process?
    Thanks.

    Thanks for the quick replies.
    Noel: I did actually experiment with turning off 'maximize compatibility' and compression, and it had both good and bad effects. On the plus side it did reduce the save time somewhat, to somewhere a little over 10 minutes. However, it also had the effect of gobbling up ever more RAM while saving, leaving me with only a few hundred MB free during the save process. This is odd in itself, as it actually used more RAM than I had allocated in the preferences. The resulting file was also huge, almost 35 GB. Although total HD space isn't a problem, for backing up externally and sharing with others this would make things a real headache.
    Curt: I have the latest video driver and keep it as up to date as possible.
    Trevor: I am not saving to the same drive as the scratch discs, although my primary scratch disc does hold the OS as well (it's my only SSD). The secondary scratch disc is a normal mechanical drive, entirely separate from where the actual file is held. If the primary scratch disc were entirely filled during the save process I would be concerned that that was the issue, but it's not.
    Noel: I have 48 GB, with Photoshop allowed access to about 44 GB of that. FYI, my CPUs are dual Xeon X5660s, and during the save process Photoshop isn't even using one full core, i.e. less than 4% CPU time.
