Huge memory usage while parsing a large number of files
Hi,
I use the Oracle 8i XML parser class in my app. The problem is that when I parse a large number of XML files, memory usage grows huge, to more than 100 MB. Even worse, I sometimes see the "virtual memory low" dialog. Could anybody give me some idea about this? Any reply is welcome. Thanks a lot.
Hi Mike. Yes, I do have this enabled, but this should only be requesting a login to view the normal (open) applications. What I end up getting, after I log in, is all the applications restarting. In the case of the synchronising files, the database has to rebuild itself every time (160,000 files), and nothing has been happening while logged out. Using the Caffeine app I have still enabled the screensaver using a hot corner and had to log back in after 10 minutes, but all the applications were still running, including the synchronisation. I am pretty sure my laptop has not gone into sleep mode, or it wouldn't have to reopen the apps, but it also has not shut down, as I get a login box as soon as I press any key. I can't understand the problem, and what makes it even stranger is that it is happening on two separate MacBooks at the same time. Thanks for the suggestion – really appreciated.
OldGnome - thanks for the suggestion for the other app. I chose Caffeine as it has really good user reviews, whereas the other one doesn't seem to have any, but I might still try it. Again, thanks for your help.
Similar Messages
-
How do I go about deleting a large number of files at the same time? Where's the easiest place to do it?
A bit vague as to what you intend, but the simple answer is to select all the files you want to delete then either drag to the Trash or CTRL- or RIGHT-click on the selection and choose Move to Trash from the contextual menu.
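If the Finder chokes on a really big selection, the same cleanup can be done from Terminal. A minimal sketch, using a scratch folder stood up just for illustration (substitute your own path, and note this bypasses the Trash, so there is no undo):

```shell
# Scratch folder stands in for the real one; substitute your own path.
TARGET="$(mktemp -d)"
touch "$TARGET/one.txt" "$TARGET/two.txt"

# Count what would be removed first, as a sanity check.
find "$TARGET" -type f | wc -l    # prints 2

# Then delete; -delete avoids spawning one rm process per file.
find "$TARGET" -type f -delete
find "$TARGET" -type f | wc -l    # prints 0
```

For tens of thousands of files this is far faster than watching the Finder count items into the Trash.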
-
I have lost a large number of files after upgrading to macOS Yosemite! Any help on how to restore the files would be much appreciated.
Import them from the backup you made before choosing to upgrade to Yosemite.
Cheers
Pete -
Rman backup failure, and is generating a large number of files.
I would appreciate some pointers on this if possible, as I'm a bit of an rman novice.
Our rman backup logs indicated a failure, and in the directory where it puts its files there appeared a large number of files for the 18th, which was the date of the failure. Previous days' backups generated 5 files of moderate size. When it failed it generated between 30 and 40 GB of files (it looks like one for each database file).
The full backup is early Monday morning, and the rest are incrementals:
I have placed the rman log, the script and a the full directory file listing here : http://www.tinshed.plus.com/rman/
Thanks in advance - George
-rw-r----- 1 oracle dba 1073750016 Jan 18 00:03 database_f734055071_s244_s1
-rw-r----- 1 oracle dba 1073750016 Jan 18 00:03 database_f734055096_s245_s1
-rw-r----- 1 oracle dba 1073750016 Jan 18 00:03 database_f734573008_s281_s1
-rw-r----- 1 oracle dba 1073750016 Jan 18 00:03 database_f734055045_s243_s1
-rw-r----- 1 oracle dba 524296192 Jan 18 00:03 database_f734055121_s246_s1
-rw-r----- 1 oracle dba 1073750016 Jan 18 00:03 database_f734055020_s242_s1
-rw-r----- 1 oracle dba 4294975488 Jan 18 00:02 database_f734054454_s233_s1
-rw-r----- 1 oracle dba 4294975488 Jan 18 00:02 database_f734054519_s234_s1
-rw-r----- 1 oracle dba 4294975488 Jan 18 00:02 database_f734054595_s235_s1
-rw-r----- 1 oracle dba 4294975488 Jan 18 00:02 database_f734054660_s236_s1
-rw-r----- 1 oracle dba 4294975488 Jan 18 00:02 database_f734054725_s237_s1
-rw-r----- 1 oracle dba 4294975488 Jan 18 00:02 database_f734054790_s238_s1
-rw-r----- 1 oracle dba 209723392 Jan 18 00:02 database_f734055136_s247_s1
-rw-r----- 1 oracle dba 73408512 Jan 18 00:02 database_f734055143_s248_s1
-rw-r----- 1 oracle dba 67117056 Jan 18 00:02 database_f734055146_s249_s1
-rw-r----- 1 oracle dba 4194312192 Jan 18 00:02 database_f734054855_s239_s1
-rw-r----- 1 oracle dba 2147491840 Jan 18 00:02 database_f734054975_s241_s1
-rw-r----- 1 oracle dba 3221233664 Jan 18 00:02 database_f734054920_s240_s1
drwxr-xr-x 2 oracle dba 4096 Jan 18 00:00 logs
-rw-r----- 1 oracle dba 18710528 Jan 17 00:15 controlfile_c-1911789030-20110117-00
-rw-r----- 1 oracle dba 1343488 Jan 17 00:15 database_f740621746_s624_s1
-rw-r----- 1 oracle dba 2958848 Jan 17 00:15 database_f740621745_s623_s1
-rw-r----- 1 oracle dba 6415990784 Jan 17 00:15 database_f740620829_s622_s1
-rw-r----- 1 oracle dba 172391424 Jan 17 00:00 database_f740620814_s621_s1
george3 wrote:
Ok, perhaps it's my understanding of RMAN that is at fault. From the logs:
Starting recover at 18-JAN-11
channel m1: starting incremental datafile backup set restore
channel m1: specifying datafile copies to recover
recovering datafile copy file number=00001
name=/exlibris1/rmanbackup/database_f734055020_s242_s1
recovering datafile copy file number=00002
name=/exlibris1/rmanbackup/database_f734055045_s243_s1
it seems to make backup copies of the datafiles every night, so the creation of these large files is normal?
The results above indicate that you have a full (incremental level 0) backup (copies of all datafiles), plus an update/recover step that applies the level 1 incremental backup to those copies. So the incremental backup */exlibris1/rmanbackup/database_f734055045_s243_s1* was applied to the full backup copies, and the size should be normal.
Why is it making copies of the datafiles even on days of incrementals?
Because after each level 1 backup is taken it needs to be applied, so every day one incremental backup is merged into the datafile copies. -
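The nightly behaviour being described matches Oracle's "incrementally updated backup" pattern: keep image copies of every datafile, then roll each night's level 1 incremental into them. A minimal sketch of such an RMAN script (the tag name is hypothetical, not taken from George's actual script):

```
RUN {
  RECOVER COPY OF DATABASE WITH TAG 'nightly_incr';
  BACKUP INCREMENTAL LEVEL 1 FOR RECOVER OF COPY WITH TAG 'nightly_incr' DATABASE;
}
```

With this pattern, per-datafile image copies on disk are expected, which would explain the many large `database_f..._s..._s1` files in the listing.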
How to copy very large number of files from one drive to another???
I'm a fairly experienced Mac user for serveral years but this problem really has me stumped.
I'm trying to copy or move 152,000 files from one external drive to another drive. I can highlight (Cmd - A) all the files on the first drive and drag them to the second drive but Finder always shows 32,768 files being copied no matter what I try.
Any and all suggestions on how to move/copy a large number of files from one external drive to another are gratefully appreciated.
Thank you in advance,
Mack
I would use the command line tool rsync.
For instance with: rsync -av source-dir destination-dir
-a The files are transferred in "archive" mode, which ensures that symbolic links, devices, attributes, permissions, ownerships, etc. are preserved in the transfer.
-v Verbose, so you see the progress.
Rsync is fast and really, really powerful, and is often used in shell scripts and the like to automatically back up and/or sync stuff. Google a bit for more info. -
Memory Leakage while parsing and schema validation
It seems there is some kind of memory leak. I was using XDK 9.2.0.2.0. Later I found a topic on this forum where they (Oracle) acknowledged a memory leak and said the bug was fixed in the 9.2.0.6.0 XDK. Using the truss command on Solaris, I could see that each call to the parser and schema validation opened file descriptors for the lpxus and lsxus files, and these were never closed; they kept being opened with each call to the parser. After making many calls, I got the error message "could not open file Result.xsd (0202)". I am using one instance of Parser and Schema, and I do a cleanup of the parser after each parse.
Later I downloaded 9.2.0.6.0.
The above problem with the parser was solved, but the problem continued for schema validation, even with the latest beta release 10.
This has caused us great trouble. Please could you look into whether there is some sort of leak, and advise if you have any solution.
Code---
This below code is called multiple times
char* APIParser::execute(const char* xmlInput) {
    char* parseResult = parseDocument(xmlInput);
    //if (strcmp(parseResult, (const char*)"") == 0) {
    if (parseResult == NULL) {
        parseResult = getResult();
        parser.xmlclean();
        return parseResult;
    } else {
        return parseResult;
    }
}
Parser and schema are initialised in the constructor and terminated in the destructor.
Hi, here is the complete test case
#include <iostream>
#include <cstdlib>
#ifndef ORAXML_CPP_ORACLE
# include <oraxml.hpp>
#endif
using namespace std;
#define FAIL { cout << "Failed!\n"; return; }
void mytest(int count)
{
    uword ecode;
    XMLParser parser;
    Document *doc;
    Element *root, *elem;
    if ((ecode = parser.xmlinit()))
    {
        cout << "Failed to initialize XML parser, error " << ecode << "\n";
        return;
    }
    cout << "\nCreating new document...\n";
    if (!(doc = parser.createDocument((oratext *) 0, (oratext *) 0, (DocumentType *) 0)))
        FAIL
    if (!(elem = doc->createElement((oratext *) "ROOT")))
        FAIL
    string test = "Elem";
    for (int i = 0; i < count; i++)
    {
        //test = "Elem" + string(ltoa(i));
        if (!(elem = doc->createElement((oratext *) "element")))
            FAIL
        if (!doc->appendChild(elem))
            FAIL
    }
    //doc->print();
    //parser.xmlclean();
    parser.xmlterm();
}
int main(int argc, char* argv[])
{
    int count = atol(argv[1]);
    mytest(count);
    char c;
    cout << "check memory usage and press any key" << endl;
    cin >> c;
    return 0;
}
-------------------------------------------cut here-----
Now, I can't use the XDK 10g because I work on an HP-UX machine. I have tried the above program with a count of 1000000; the memory usage at the end was around 2 gigabytes.
Could someone please help me? :(
Thank you. -
Ai CS4 & CS5: Can't move objects, huge memory usage. Tips?
Okay guys this is my first post here, so bear with me.
I'm just wondering if anyone can give guidance or refer me somewhere I can find a solution. I have both CS4 and CS5 of Ai installed on my Mac. Everything works just great until I attempt to move an object. Something as simple as a single line will freeze my system if I attempt to move it from one location to another.
It is odd. I can scale/transform objects without any issue. It's just when I click an object to drag it to another location that my system will freeze for about 5 seconds. The object still won't move. I opened up Activity Monitor and noticed my memory usage skyrocketed as soon as I tried to move the object.
Are there configurations for memory allocation in Illustrator like there are in Photoshop?
(PS I have 3GB RAM and a 120GB HDD with 30GB free space, 2.0 Intel Core 2 processor, it's a 2007 Mac Mini)
How big is the file? And is it complex? If so, then you simply do not have the resources. Also you might have a font cache issue.
But I think it's the lack of resources. -
Upgraded to FF8 from FF7, still had huge memory issues. Running it on 32-bit Vista. Downgraded to 3.6, lost all bookmarks! Then tried to upgrade to FF8; bookmarks still not there. Downgraded again to FF 3.6 as FF8 was killing my memory.
Your Bookmarks are stored in a single file, '''places.sqlite''', in your '''Profile folder'''. To open your Profile folder go to '''Help''' then '''Troubleshooting Information,''' then next to "'''Profile Directory'''" click the "'''Open Containing Folder'''" button to open the Profile Folder.
http://support.mozilla.com/en-US/kb/Profiles
see also:
http://kb.mozillazine.org/Transferring_data_to_a_new_profile_-_Firefox#Suggested_profile_contents_to_transfer
thank you
Please mark as "Solved" the answer that really solved the problem, to help others with a similar problem. -
Bridge freezes when applying camera raw settings to large number of files
I have a folder with 32,000 frames from a time-lapse project that I'm doing. I'd like to apply the same camera raw adjustments to all of these files, and so follow one of the two methods below - neither of which are working with such a large set of files:
Method 1:
Select all files, open in camera raw.
Apply changes to the first image
Select all images, then synchronize the changes to all other images.
Unfortunately selecting "Open in Camera Raw..." just leaves Bridge hanging. It's using a lot of memory and processor time, so I would assume it's working, but there's no progress bar or similar to indicate that it is.
Method 2:
Open first file in Camera Raw
Apply all changes, and exit camera raw.
Select all files in Bridge, right click on one and select "Develop Settings -> Previous Conversion"
Unfortunately the final step again leaves Bridge just thinking. I waited around an hour, closed Bridge, but unfortunately when I opened it again there was no sign that it had copied the camera raw settings onto any of the other files. My computer's pretty slow (see below), but it should have managed to write at least one XMP file in an hour.
Question:
Does anyone have any workarounds other than repeating the process 32 times for 1000 images at a time?
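One possible workaround, assuming Camera Raw is set to save adjustments in .xmp sidecar files next to the raws rather than in the central database: adjust the first frame only, then replicate its sidecar across the shoot with a shell loop instead of asking Bridge to synchronise 32,000 items. All the filenames below are hypothetical, and the sidecar content here is a placeholder:

```shell
# Stand-in shoot folder; substitute your real time-lapse directory.
SHOOT="$(mktemp -d)"
touch "$SHOOT/frame0001.CR2" "$SHOOT/frame0002.CR2" "$SHOOT/frame0003.CR2"
echo '<x:xmpmeta/>' > "$SHOOT/frame0001.xmp"   # sidecar holding your adjustments

# Copy the first frame's sidecar to every frame that lacks one.
for f in "$SHOOT"/frame*.CR2; do
  base="${f%.CR2}"
  [ -e "$base.xmp" ] || cp "$SHOOT/frame0001.xmp" "$base.xmp"
done
ls "$SHOOT"/*.xmp   # one sidecar per frame
```

Since identical settings are wanted for every frame, a byte-for-byte copy of the sidecar is all the synchronise step would produce anyway, and the loop never loads Bridge's cache at all.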
System Info From Photoshop:
Adobe Photoshop Version: 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00) x64
Operating System: Windows Vista 64-bit
Version: 6.0 Service Pack 2
System architecture: AMD CPU Family:15, Model:15, Stepping:2 with MMX, SSE Integer, SSE FP, SSE2, SSE3
Physical processor count: 1
Processor speed: 1808 MHz
Built-in memory: 8062 MB
Free memory: 4508 MB
Memory available to Photoshop: 7140 MB
Memory used by Photoshop: 70 %
Image tile size: 132K
Image cache levels: 4
OpenGL Drawing: Disabled.
OpenGL Drawing Mode: Basic
OpenGL Allow Normal Mode: False.
OpenGL Allow Advanced Mode: False.
OpenGL Allow Old GPUs: Not Detected.
Video Card Vendor: NVIDIA Corporation
Video Card Renderer: GeForce 6150SE nForce 430/PCI/SSE2
Display: 1
Display Bounds:= top: 0, left: 0, bottom: 1024, right: 1280
Video Card Number: 1
Video Card: NVIDIA GeForce 6150SE nForce 430
OpenCL Unavailable
Driver Version: 8.17.12.7533
Driver Date: 20110521050100.000000-000
Video Card Driver: nvd3dumx.dll,nvd3dum
Video Mode: 1280 x 1024 x 4294967296 colors
Video Card Caption: NVIDIA GeForce 6150SE nForce 430
Video Card Memory: 128 MB
Video Rect Texture Size: 4096
Serial number: Tryout Version
Application folder: C:\Program Files\Adobe\Adobe Photoshop CS6 (64 Bit)\
Temporary file path: C:\Users\MARTIN~1\AppData\Local\Temp\
Photoshop scratch has async I/O enabled
Scratch volume(s):
H:\, 279.5G, 50.4G free
D:\, 149.0G, 108.1G free
E:\, 74.5G, 46.1G free
Required Plug-ins folder: C:\Program Files\Adobe\Adobe Photoshop CS6 (64 Bit)\Required\
Primary Plug-ins folder: C:\Program Files\Adobe\Adobe Photoshop CS6 (64 Bit)\Plug-ins\
Additional Plug-ins folder: not set
Installed components:
A3DLIBS.dll A3DLIB Dynamic Link Library 9.2.0.112
ACE.dll ACE 2012/01/18-15:07:40 66.492997 66.492997
adbeape.dll Adobe APE 2012/01/25-10:04:55 66.1025012 66.1025012
AdobeLinguistic.dll Adobe Linguisitc Library 6.0.0
AdobeOwl.dll Adobe Owl 2012/02/09-16:00:02 4.0.93 66.496052
AdobePDFL.dll PDFL 2011/12/12-16:12:37 66.419471 66.419471
AdobePIP.dll Adobe Product Improvement Program 6.0.0.1642
AdobeXMP.dll Adobe XMP Core 2012/02/06-14:56:27 66.145661 66.145661
AdobeXMPFiles.dll Adobe XMP Files 2012/02/06-14:56:27 66.145661 66.145661
AdobeXMPScript.dll Adobe XMP Script 2012/02/06-14:56:27 66.145661 66.145661
adobe_caps.dll Adobe CAPS 5,0,10,0
AGM.dll AGM 2012/01/18-15:07:40 66.492997 66.492997
ahclient.dll AdobeHelp Dynamic Link Library 1,7,0,56
aif_core.dll AIF 3.0 62.490293
aif_ocl.dll AIF 3.0 62.490293
aif_ogl.dll AIF 3.0 62.490293
amtlib.dll AMTLib (64 Bit) 6.0.0.75 (BuildVersion: 6.0; BuildDate: Mon Jan 16 2012 18:00:00) 1.000000
ARE.dll ARE 2012/01/18-15:07:40 66.492997 66.492997
AXE8SharedExpat.dll AXE8SharedExpat 2011/12/16-15:10:49 66.26830 66.26830
AXEDOMCore.dll AXEDOMCore 2011/12/16-15:10:49 66.26830 66.26830
Bib.dll BIB 2012/01/18-15:07:40 66.492997 66.492997
BIBUtils.dll BIBUtils 2012/01/18-15:07:40 66.492997 66.492997
boost_date_time.dll DVA Product 6.0.0
boost_signals.dll DVA Product 6.0.0
boost_system.dll DVA Product 6.0.0
boost_threads.dll DVA Product 6.0.0
cg.dll NVIDIA Cg Runtime 3.0.00007
cgGL.dll NVIDIA Cg Runtime 3.0.00007
CIT.dll Adobe CIT 2.0.5.19287 2.0.5.19287
CoolType.dll CoolType 2012/01/18-15:07:40 66.492997 66.492997
data_flow.dll AIF 3.0 62.490293
dvaaudiodevice.dll DVA Product 6.0.0
dvacore.dll DVA Product 6.0.0
dvamarshal.dll DVA Product 6.0.0
dvamediatypes.dll DVA Product 6.0.0
dvaplayer.dll DVA Product 6.0.0
dvatransport.dll DVA Product 6.0.0
dvaunittesting.dll DVA Product 6.0.0
dynamiclink.dll DVA Product 6.0.0
ExtendScript.dll ExtendScript 2011/12/14-15:08:46 66.490082 66.490082
FileInfo.dll Adobe XMP FileInfo 2012/01/17-15:11:19 66.145433 66.145433
filter_graph.dll AIF 3.0 62.490293
hydra_filters.dll AIF 3.0 62.490293
icucnv40.dll International Components for Unicode 2011/11/15-16:30:22 Build gtlib_3.0.16615
icudt40.dll International Components for Unicode 2011/11/15-16:30:22 Build gtlib_3.0.16615
image_compiler.dll AIF 3.0 62.490293
image_flow.dll AIF 3.0 62.490293
image_runtime.dll AIF 3.0 62.490293
JP2KLib.dll JP2KLib 2011/12/12-16:12:37 66.236923 66.236923
libifcoremd.dll Intel(r) Visual Fortran Compiler 10.0 (Update A)
libmmd.dll Intel(r) C Compiler, Intel(r) C++ Compiler, Intel(r) Fortran Compiler 10.0
LogSession.dll LogSession 2.1.2.1640
mediacoreif.dll DVA Product 6.0.0
MPS.dll MPS 2012/02/03-10:33:13 66.495174 66.495174
msvcm80.dll Microsoft® Visual Studio® 2005 8.00.50727.6195
msvcm90.dll Microsoft® Visual Studio® 2008 9.00.30729.1
msvcp100.dll Microsoft® Visual Studio® 2010 10.00.40219.1
msvcp80.dll Microsoft® Visual Studio® 2005 8.00.50727.6195
msvcp90.dll Microsoft® Visual Studio® 2008 9.00.30729.1
msvcr100.dll Microsoft® Visual Studio® 2010 10.00.40219.1
msvcr80.dll Microsoft® Visual Studio® 2005 8.00.50727.6195
msvcr90.dll Microsoft® Visual Studio® 2008 9.00.30729.1
pdfsettings.dll Adobe PDFSettings 1.04
Photoshop.dll Adobe Photoshop CS6 CS6
Plugin.dll Adobe Photoshop CS6 CS6
PlugPlug.dll Adobe(R) CSXS PlugPlug Standard Dll (64 bit) 3.0.0.383
PSArt.dll Adobe Photoshop CS6 CS6
PSViews.dll Adobe Photoshop CS6 CS6
SCCore.dll ScCore 2011/12/14-15:08:46 66.490082 66.490082
ScriptUIFlex.dll ScriptUIFlex 2011/12/14-15:08:46 66.490082 66.490082
tbb.dll Intel(R) Threading Building Blocks for Windows 3, 0, 2010, 0406
tbbmalloc.dll Intel(R) Threading Building Blocks for Windows 3, 0, 2010, 0406
TfFontMgr.dll FontMgr 9.3.0.113
TfKernel.dll Kernel 9.3.0.113
TFKGEOM.dll Kernel Geom 9.3.0.113
TFUGEOM.dll Adobe, UGeom© 9.3.0.113
updaternotifications.dll Adobe Updater Notifications Library 6.0.0.24 (BuildVersion: 1.0; BuildDate: BUILDDATETIME) 6.0.0.24
WRServices.dll WRServices Friday January 27 2012 13:22:12 Build 0.17112 0.17112
wu3d.dll U3D Writer 9.3.0.113
Required plug-ins:
3D Studio 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Accented Edges 13.0
Adaptive Wide Angle 13.0
ADM 3.11x01
Angled Strokes 13.0
Average 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Bas Relief 13.0
BMP 13.0
Camera Raw 7.0
Chalk & Charcoal 13.0
Charcoal 13.0
Chrome 13.0
Cineon 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Clouds 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Collada 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Color Halftone 13.0
Colored Pencil 13.0
CompuServe GIF 13.0
Conté Crayon 13.0
Craquelure 13.0
Crop and Straighten Photos 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Crop and Straighten Photos Filter 13.0
Crosshatch 13.0
Crystallize 13.0
Cutout 13.0
Dark Strokes 13.0
De-Interlace 13.0
Dicom 13.0
Difference Clouds 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Diffuse Glow 13.0
Displace 13.0
Dry Brush 13.0
Eazel Acquire 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Embed Watermark 4.0
Entropy 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Extrude 13.0
FastCore Routines 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Fibers 13.0
Film Grain 13.0
Filter Gallery 13.0
Flash 3D 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Fresco 13.0
Glass 13.0
Glowing Edges 13.0
Google Earth 4 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Grain 13.0
Graphic Pen 13.0
Halftone Pattern 13.0
HDRMergeUI 13.0
IFF Format 13.0
Ink Outlines 13.0
JPEG 2000 13.0
Kurtosis 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Lens Blur 13.0
Lens Correction 13.0
Lens Flare 13.0
Liquify 13.0
Matlab Operation 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Maximum 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Mean 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Measurement Core 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Median 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Mezzotint 13.0
Minimum 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
MMXCore Routines 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Mosaic Tiles 13.0
Multiprocessor Support 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Neon Glow 13.0
Note Paper 13.0
NTSC Colors 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Ocean Ripple 13.0
Oil Paint 13.0
OpenEXR 13.0
Paint Daubs 13.0
Palette Knife 13.0
Patchwork 13.0
Paths to Illustrator 13.0
PCX 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Photocopy 13.0
Photoshop 3D Engine 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Picture Package Filter 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Pinch 13.0
Pixar 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Plaster 13.0
Plastic Wrap 13.0
PNG 13.0
Pointillize 13.0
Polar Coordinates 13.0
Portable Bit Map 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Poster Edges 13.0
Radial Blur 13.0
Radiance 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Range 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Read Watermark 4.0
Reticulation 13.0
Ripple 13.0
Rough Pastels 13.0
Save for Web 13.0
ScriptingSupport 13.0
Shear 13.0
Skewness 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Smart Blur 13.0
Smudge Stick 13.0
Solarize 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Spatter 13.0
Spherize 13.0
Sponge 13.0
Sprayed Strokes 13.0
Stained Glass 13.0
Stamp 13.0
Standard Deviation 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Sumi-e 13.0
Summation 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Targa 13.0
Texturizer 13.0
Tiles 13.0
Torn Edges 13.0
Twirl 13.0
U3D 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Underpainting 13.0
Vanishing Point 13.0
Variance 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Variations 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Water Paper 13.0
Watercolor 13.0
Wave 13.0
Wavefront|OBJ 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
WIA Support 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
Wind 13.0
Wireless Bitmap 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
ZigZag 13.0
Optional and third party plug-ins: NONE
Plug-ins that failed to load: NONE
Flash:
Mini Bridge
Kuler
Installed TWAIN devices: NONE
The files are only 2 MP each, and I have the same issue when I apply the changes to RAW files and to JPG files. The error occurs before any changes are applied, before the images would need to be re-cached, as when the error doesn't occur the dialogue box I mentioned pops up asking what to change.
Your specs sound more than enough, I would think. Have you set the option to write the cache to folders when possible, and if so, could you uncheck that option?
I don't have experience with these kinds of numbers, but I do dump the Bridge cache manually on a regular basis (every new cycle it seems to be more stable, so I leave longer intervals between cache dumps); my main folder usually contains around 6 to 7 K of DNG files from a 21 MP dSLR. This starts caching without problems but takes a few hours before it's done.
I can't use the cache-per-folder option due to a longstanding problem with an error message about replacing the CacheT file, hence my question about your setting.
Also, I have set the previews option to Always HQ, and in the prefs I have set the option to build monitor-size previews.
As said, I don't know about the large numbers, but the 2 MP files are very small, so it should be possible for Bridge, I would think?
Did you also try creating a new folder and placing the files in there? Or dividing them into three folders for testing? -
MBPro crashes when transferring large number of files
I have a MBPro 2008, and recently when I transfer a large number of video files [over 5GB] my computer will suddenly crash in the middle of the transfer.
I'm usually not using it while it's transferring. This is a very recent problem.
Sincerely,
stef
How large is your HD and how much space do you have left?
Try the following:
Disconnect all devices from the computer then do the following:
Boot up from your Snow Leopard Install DVD while holding down the "c" key.
Select the language you wish to use, and then choose "Disk Utility" from the menu bar.
Your HD should appear in a panel on the left hand side of the window which opens. Select it and then click on the "repair disk" button in the right hand section of the window.
Once this process has been completed restart your computer and repair permissions directly from Disk Utility.
If Disk Utility is unable to complete the repair, you will need to use a stronger 3rd party utility like DiskWarrior, Techtool PRO (not Deluxe) or Drive Genius. -
Copying large number of files with Finder
When you copy lots of files and folders with Finder, it appears to count up the number of "items" and then start copying and at the same time the number of items remaining gets smaller and smaller as the copy process proceeds.
My question is what are the "items" that Finder is referring to? I did this on a folder with several folders and files and it came up with like about 45K items, but if I open up a Terminal window and do a "find . -print | wc" I get like about 155K lines, which I would have thought were the "items" that Finder is referring to, but obviously not.
So what is it that Finder means when it talks about "items" when copying things around?
If it matters I was copying the files from external drive to another external drive. And both external drives were physically attached to the two USB ports on a MacBook Pro running Mac OS X 10.5.8
Also, is there a way to find out the number of "items" that Finder just copied? Sometimes this number goes away so quickly, before it registers on my eyeball/brain synapses. Darn those fast computers! I can go to the Finder's "Edit" menu and undo the last operation, but is there somewhere that the number of items last processed can be looked at?
And one last Finder question: does the progress bar that the Copy operation shows measure the number of items, or the magnitude of the amount of MB it has copied or is about to copy? So a copy of two items of 10GB each would progress in three discrete jumps (0%, 50% and 100%) and go much quicker, versus a copy of 10 billion items of two bytes each, even though a total of 20GB were transferred in both cases?
Thanks...
-Bob
ReallyDeepYogurt wrote:
My question is what are the "items" that Finder is referring to? I did this on a folder with several folders and files and it came up with like about 45K items, but if I open up a Terminal window and do a "find . -print | wc" I get like about 155K lines, which I would have thought were the "items" that Finder is referring to, but obviously not.
This is a guess, but I'd think that the Finder considers a "package" to be one item, while the "find" command counts each file within a package. -
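That guess can be checked from Terminal. As a sketch using a throwaway folder (real counts depend on your data): find's -prune makes each .app bundle count as a single entry without descending into it, which should land much closer to the Finder's "items" figure than a plain find:

```shell
# Build a toy tree: two loose files plus a fake .app bundle with a file inside.
TOP="$(mktemp -d)"
mkdir -p "$TOP/Demo.app/Contents/MacOS"
touch "$TOP/a.txt" "$TOP/b.txt" "$TOP/Demo.app/Contents/MacOS/demo"

# Count every node, the way a plain find does:
find "$TOP" | wc -l                                         # prints 7

# Count with each .app bundle collapsed to one item:
find "$TOP" -name '*.app' -prune -print -o -print | wc -l   # prints 4
```

If the Finder number matches the second count on your real folder, packages being treated as one item is indeed the explanation.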
Can't Empty Trash With Large Number of Files
Running OS X 10.8.3
I have a very large external drive that had a Time Machine backup on the main partition. At some point, I created a second partition, then started doing backups on the new partition. On Wed, I finally got around to doing some "housecleaning" tasks I'd been putting off. As part of that, I decided to clean up my external drive. So... I moved the old, unused and unwanted Backups.backupdb that used to be the Time Machine backup, and dragged it to the Trash.
Bad idea.
Now I've spent the last 3-4 days trying various strategies to actually empty the trash and reclaim the gig or so of space on my external drive. Initially I just tried to "Empty Trash", but that took about four hours to count up the files just to "prepare to delete" them. After the file counter stopped counting up, and finally started counting down... "Deleting 482,832 files..." "Deleting 482,831 files..." etc, etc... I decided I was on the path to success, so left the machine alone for 12-14 hours.
When I came back, the results were not what I expected. "Deleting -582,032 files..." What the...?
So after leaving that to run for another few hours with no results, I stopped that process. Tried a few other tools like Onyx, TrashIt, etc... No luck.
So finally decided to say the **** with the window manager, pulled up a terminal, and cd'ed to the .Trash directory for my UID on the USB volume and did a rm -rfv Backups.backupdb
While it seemed to run okay for a while, I started getting errors saying "File not found..." and "Invalid file name..." and various other weird things. So now I'm doing a combination of rm -rfing individual directories, and using the finder to rename/cleanup individual Folders when OSX refuses to delete them.
Has anyone else had this weird overflow issue with deleting large numbers of files in 10.8.x? It doesn't seem like things should be this hard...
I'm not sure I understand this bit:
If you're on Leopard 10.5.x, be sure you have the "action" or "gear" icon in your Finder's toolbar (Finder > View > Customize Toolbar). If there's no toolbar, click the lozenge at the upper-right of the Finder window's title bar. If the "gear" icon isn’t in the toolbar, selectView > Customize Toolbar from the menubar.
Then use the Time Machine "Star Wars" display: Enter Time Machine by clicking the Time Machine icon in your Dock or select the TM icon in your Menubar.
And this seems to defeat the whole purpose:
If you delete an entire backup, it will disappear from the Timeline and the "cascade" of Finder windows, but it will not actually delete the backup copy of any item that was present at the time of any remaining backup. Thus you may not gain much space. This is usually fairly quick
I'm trying to reclaim space on a volume that had a time machine backup, but that isn't needed anymore. I'm deleting it so I can get that 1GB+ of space back. Is there some "official" way you're supposed to delete these things where you get your hard drive space back? -
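For what it's worth, 10.7 and later do ship a supported command-line route for exactly this. A sketch with a made-up volume and snapshot path (point it at your own Backups.backupdb, and expect it to take a while on 480K+ files):

```shell
# Deletes the named Time Machine snapshot the supported way (macOS 10.7+).
sudo tmutil delete /Volumes/MyExternal/Backups.backupdb/MyMac/2012-01-01-000000
```

tmutil understands Time Machine's hard-link structure, so it reclaims space correctly where a plain rm -rf on Backups.backupdb tends to choke, as described above.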
Adding a custom stamp to a large number of files.
I need to build a batch process to add a custom stamp to a large number of PDF files. I want to do this without displaying the files. I'm not sure where to go to learn how to do this. I've found some sample JavaScript code for the stamps but don't really understand how to invoke the code in a batch process. Thanks for the guidance. Bianca
It should be noted the "Batch Processing" is only available with the Professional or better versions.
See "JavaScript in Batch Processing" by Thom Parker, http://www.acrobatusers.com/tech_corners/javascript_corner/tips/2006/batch_process/ , or "Intro to Acrobat 5 Batch Processing" by Dr. D. P. Story, http://www.planetpdf.com/developer/article.asp?ContentID=Intro+to+Acrobat+5+Batch+Processing+&gid=6518 .
You also should look at "Automating placement of annotations" by Thom Parker, http://www.acrobatusers.com/tutorials/2007/10/auto_placement_annotations/ . -
File Bundle with large number of files failed
Hi!
Well, I thought problems would appear. We have some apps that we distribute just by copying a large number of files (not large in size) to Windows (XP Pro, usually) machines. These are programs which run from a directory without any special need for installation. A happy situation for an admin, from one side. In ZfD 4.0.1 we installed such an app on one of the machines, took a snapshot via the special app (who remembers it), copied the files to a (NetWare) server share, gave rights to the device (~ workstation) and associated it with the ws via eDir and ... voila, on the next restart or whatsoever the app was there. Very nice, indeed, I miss this!
So, I tried to make this happen on ZCM 10 (on SLES 11). Made the app, sorry, bundle, uploaded the files (the first time it got stuck, the second time it completed, around 7,500 files) and made the distribution/launch association to the ws (~device). And ... got errors. Several entries from the log as examples below.
Any ideas?
More thanks, Alar.
Error: [1/8/10 2:41:53 PM] BundleManager BUNDLE.UnknownExceptionOccurred An Unknown exception occurred trying to process task: Novell.Zenworks.AppModule.LaunchException: Exception of type 'Novell.Zenworks.AppModule.LaunchException' was thrown.
at Novell.Zenworks.AppModule.AppActionItem.ProcessAct ion(APP_ACTION launchType, ActionContext context, ActionSetResult previousResults)
Error: [1/8/10 2:41:54 PM] BundleManager ActionMan.FailureProcessingActionException Failed to process action: Information for id 51846d2388c028d8c471f1199b965859 has not been cached. Did you forget to call CacheContentInfo first?

ZCM 10 is not efficient at handling that number of files in a single bundle when they are in the content repo.
Suggestions include zipping the files, uploading the zip to the content repo, and then downloading and extracting the zip as part of the bundle.
Or use the "Copy Directory" option to copy the files from a network source directly, as you did in ZDM.
On 1/8/2010 8:56 AM, NovAlf wrote:
LR 3.2 crashing when writing XMP changes to large number of files
I've been making some minor changes to a large number of images in bulk and have had LR 3.2 stop responding and crash numerous times. The changes do seem to be applied to the database; it's just the writing to XMP for a large number of images that crashes (it writes to many, but eventually crashes, which is a problem because I'm not sure which ones have been saved and which haven't).
The files are only 2 MP each, and I have the same issue whether I apply the changes to RAW files or JPG files. The error occurs before any changes are applied, before the images would need to be re-cached, because when the error doesn't occur, the dialogue box I mentioned pops up asking what to change.
Your specs sound more than enough, I would think. Do you have the option set to write the cache to folders when possible, and if so, could you uncheck that option?
I don't have experience with these kinds of numbers, but I do dump the Bridge cache manually on a regular basis (with every new release cycle it seems to be more stable, so I leave longer intervals between cache dumps), and my main folder usually contains around 6 to 7 K DNG files from a 21 MP dSLR. This starts caching without problems, but it takes a few hours to finish.
I can't use the cache-per-folder option due to a longstanding problem with an error message asking to replace the CacheT file, hence my question about your setting.
I also have previews set to Always High Quality, and in the preferences I have set the option to build monitor-size previews.
As said, I don't know about such large numbers, but 2 MP files are very small, and it should be possible for Bridge, I would think.
Did you also try to create a new folder and place the files in there? Or divide them into three folders for testing?
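One generic way to narrow down "which ones have saved" — not a Lightroom feature, just a workflow sketch — is to write XMP in fixed-size batches, so a crash only leaves the current batch in doubt rather than the whole selection. The batch size of 500 below is an arbitrary assumption:

```javascript
// Generic batching helper (not a Lightroom API): split a large
// selection into fixed-size batches so a crash mid-write only
// leaves one batch in doubt instead of thousands of files.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// e.g. chunk(allPhotos, 500).forEach(function (batch, i) {
//   /* select batch in the grid, Ctrl+S to save metadata, note i */
// });
```

The same idea applies to your "divide them into three folders" suggestion — it bounds how much work is lost or uncertain per crash.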