LARGE db_cra_log.LDF file

The db_cra_log.LDF database log file on my ICD server (version 3.1(3)) is about 16.5 GB now and is filling up the hard drive. I found a document that may solve my problem here: http://www.cisco.com/en/US/products/sw/custcosw/ps1846/prod_troubleshooting_guide_chapter09186a0080193f35.html#1070884
My question is, does anyone know of any adverse effects from following this procedure? If the file becomes corrupt or something like that, what is the impact? Should this only be done during a maintenance window?

Hello,
It seems that your SharePoint is installed in standalone mode, so the databases are on a SQL Server Express instance.
Use SQL Server Management Studio (from another server, your FIM SQL Server for example) and connect to FIMPortalServer\SQLEXPRESS. You will then be able to perform the actions described at your link.
Regards,
Sylvain
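
For reference, the procedure in that Cisco document amounts to truncating and then shrinking the transaction log. Below is a minimal T-SQL sketch of the same idea, under the assumption that ICD 3.1(3) is using its bundled SQL Server 2000/MSDE instance, that the database is named db_cra, and that the log's logical file name is db_cra_log; all three are assumptions, so confirm the names with sp_helpfile before shrinking anything.

-- Confirm the logical file names (db_cra and db_cra_log below are assumptions)
USE db_cra
EXEC sp_helpfile
GO

-- Truncate the inactive portion of the transaction log (SQL Server 2000-era syntax)
BACKUP LOG db_cra WITH TRUNCATE_ONLY
GO

-- Shrink the physical db_cra_log.LDF file down to roughly 100 MB
DBCC SHRINKFILE (db_cra_log, 100)
GO

Truncating the log this way breaks the log backup chain, so taking a full database backup afterwards, and doing all of this inside a maintenance window, is the safer course.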

Similar Messages

  • Can I recover a damaged SQL Server 2008 database with the undamaged .mdf and .ldf files?

    Their original SQL Server was 2008, service pack unknown, installed across the D: and C: drives. A power spike corrupted their O/S on the C: drive and someone reinstalled both the O/S and SQL Server, which is now SQL 2008 SP4. They have intact .mdf and .ldf files for all the system databases. Is there some way they can reconnect the user databases using the intact copies of the previous system databases? I have heard that if SQL Server is stopped, the previous master and msdb .mdf and .ldf files are moved into place, and the server is restarted, then any previous user database .mdf and .ldf files can be accessed by the SQL Server.
    Is this the case, or are there details missing?

    Hello! Try these steps:
    1. Open your SQL Server Management Studio console. The application shortcut is in the SQL Server folder on the Windows Start menu.
    2. Enter the system administrator user name and password. SQL Server's administrator user name is "sa"; this account has the privileges required to restore the database. If you're restoring on a hosting provider's server, use the administrator user name and password they supplied for your account.
    3. Right-click your database name and select "Attach." In the new window that opens, click the "Add" button to open a dialog box.
    4. Select your MDF file and press the "OK" button. It may take several minutes to attach the database if it is a large file. Once the process is finished, browse your tables to verify the data. The database is now restored.
    If nothing helped, try:
    https://www.youtube.com/watch?v=1cbOYdvBW2c
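    If the Management Studio GUI route is not an option, the same attach can be done in T-SQL. This is only a minimal sketch: the database name and file paths below are hypothetical placeholders, so point FILENAME at wherever the intact .mdf/.ldf copies actually live.
    -- Attach an existing data/log file pair as a user database
    -- (MyUserDB and the D:\SQLData paths are placeholders)
    CREATE DATABASE MyUserDB
        ON (FILENAME = 'D:\SQLData\MyUserDB.mdf'),
           (FILENAME = 'D:\SQLData\MyUserDB_log.ldf')
        FOR ATTACH;
    GO
    If only the .mdf survived, CREATE DATABASE ... FOR ATTACH_REBUILD_LOG can be used instead to rebuild a missing log file.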

  • Transferring a large volume of files from Mac to PC?

    Hi, I have a Mac with OS X 10.4.8 and a PC, and I need to transfer a large amount of files (around 200 GB) from the Mac to the PC.
    I have lots of external hard drives, which are all used by either the PC or the Macs. The ones formatted for the Macs cannot be read at all by the PC without some expensive software, and the PC-formatted one I have appears to be readable on the Mac, but when I try to copy files onto it or change anything, I am unable to change/add anything due to permissions problems.
    Not sure what to do. I have quite a few hard drives with plenty of space, but if they are all in the wrong format I can't really wipe/reformat them easily with files on them, nor do I want to buy yet another HDD just to transfer some files....
    Any ideas/advice?

    https://discussions.apple.com/docs/DOC-3003

  • Best way to format my 16GB flash drive for Mac and PC for transferring large PowerPoint files?

    Best way to format my 16GB flash drive for Mac and PC for transferring large PowerPoint files?

    Format the flash drive as exFAT for transferring files between Mac and PC.
    FORMAT TYPES
    FAT32 (File Allocation Table)
    Read/Write FAT32 from both native Windows and native Mac OS X.
    Maximum file size: 4GB.
    Maximum volume size: 2TB
    You can use this format if you share the drive between Mac OS X and Windows computers and have no files larger than 4GB.
    NTFS (Windows NT File System)
    Read/Write NTFS from native Windows.
    Read only NTFS from native Mac OS X
    To Read/Write/Format NTFS from Mac OS X, here are some alternatives:
    For Mac OS X 10.4 or later (32 or 64-bit), install Paragon (approx $20) (Best Choice for Lion)
    Native NTFS support can be enabled in Snow Leopard and Lion, but is not advisable, due to instability.
    AirPort Extreme (802.11n) and Time Capsule do not support NTFS
    Maximum file size: 16 TB
    Maximum volume size: 256TB
    You can use this format if you routinely share a drive with multiple Windows systems.
    HFS+ (Mac format: Hierarchical File System, a.k.a. Mac OS Extended (Journaled); don't use case-sensitive)
    Read/Write HFS+ from native Mac OS X
    Required for Time Machine or Carbon Copy Cloner or SuperDuper! backups of Mac internal hard drive.
    To Read HFS+ (but not Write) from Windows, Install HFSExplorer
    Maximum file size: 8EiB
    Maximum volume size: 8EiB
    You can use this format if you only use the drive with Mac OS X, or use it for backups of your Mac OS X internal drive, or if you only share it with one Windows PC (with MacDrive installed on the PC)
    exFAT (FAT64, a.k.a. Extended File Allocation Table)
    Supported in Mac OS X only in 10.6.5 or later.
    Not all Windows versions support exFAT.
    AirPort Extreme (802.11n) and Time Capsule do not support exFAT
    Maximum file size: 16 EiB
    Maximum volume size: 64 ZiB
    You can use this format if it is supported by all computers with which you intend to share the drive.  See "disadvantages" for details.

  • Hi, I am using a USB-8476s to communicate with a slave unit in a LIN network using LabVIEW 7.1. Can anyone tell me how I can send a header frame plus 1 byte of data to the slave in a LIN network, or how do I use the .ldf file? I want to read responses from the slave

    Hi, I am using a USB-8476s to communicate with a slave unit in a LIN network. Can anyone tell me how I can send a header frame plus 1 byte of data to the slave in a LIN network, or how do I communicate with the slave using LabVIEW 7.1?
    I want to read responses from the slave. When I tried the LabVIEW example programs, or even MAX, while performing some switching action on my slave, I get the response "Device inactive" with a timestamp, but no data.
    I also have the LIN Description File. Can you suggest how to use the .ldf file?
    I am at a customer site, and it would be a great help if you could advise me as soon as possible. Thank you

    You may use the LDF Starter Kit to make use of the LDF information in your application:
    http://joule.ni.com/nidu/cds/view/p/id/665/lang/en

  • Bridge freezes when applying camera raw settings to large number of files

    I have a folder with 32,000 frames from a time-lapse project that I'm doing. I'd like to apply the same Camera Raw adjustments to all of these files, and so I follow one of the two methods below - neither of which works with such a large set of files:
    Method 1:
    Select all files, open in camera raw.
    Apply changes to the first image
    Select all images, then synchronize the changes to all other images.
    Unfortunately, selecting "Open in Camera Raw..." just causes Bridge to hang. It's using a lot of memory and processor time, so I would assume it's working, but there's no progress bar or anything similar to indicate that it is.
    Method 2:
    Open first file in Camera Raw
    Apply all changes, and exit camera raw.
    Select all files in Bridge, right click on one and select "Develop Settings -> Previous Conversion"
    Unfortunately, the final step again leaves Bridge just thinking. I waited around an hour, closed Bridge, but unfortunately when I opened it again there was no sign that it had copied the Camera Raw settings onto any of the other files. My computer's pretty slow (see below) but it should have written at least one XMP file in an hour.
    Question:
    Does anyone have any workarounds other than repeating the process 32 times for 1000 images at a time?
    System Info From Photoshop:
    Adobe Photoshop Version: 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00) x64
    Operating System: Windows Vista 64-bit
    Version: 6.0 Service Pack 2
    System architecture: AMD CPU Family:15, Model:15, Stepping:2 with MMX, SSE Integer, SSE FP, SSE2, SSE3
    Physical processor count: 1
    Processor speed: 1808 MHz
    Built-in memory: 8062 MB
    Free memory: 4508 MB
    Memory available to Photoshop: 7140 MB
    Memory used by Photoshop: 70 %
    Image tile size: 132K
    Image cache levels: 4
    OpenGL Drawing: Disabled.
    OpenGL Drawing Mode: Basic
    OpenGL Allow Normal Mode: False.
    OpenGL Allow Advanced Mode: False.
    OpenGL Allow Old GPUs: Not Detected.
    Video Card Vendor: NVIDIA Corporation
    Video Card Renderer: GeForce 6150SE nForce 430/PCI/SSE2
    Display: 1
    Display Bounds:=  top: 0, left: 0, bottom: 1024, right: 1280
    Video Card Number: 1
    Video Card: NVIDIA GeForce 6150SE nForce 430
    OpenCL Unavailable
    Driver Version: 8.17.12.7533
    Driver Date: 20110521050100.000000-000
    Video Card Driver: nvd3dumx.dll,nvd3dum
    Video Mode: 1280 x 1024 x 4294967296 colors
    Video Card Caption: NVIDIA GeForce 6150SE nForce 430
    Video Card Memory: 128 MB
    Video Rect Texture Size: 4096
    Serial number: Tryout Version
    Application folder: C:\Program Files\Adobe\Adobe Photoshop CS6 (64 Bit)\
    Temporary file path: C:\Users\MARTIN~1\AppData\Local\Temp\
    Photoshop scratch has async I/O enabled
    Scratch volume(s):
      H:\, 279.5G, 50.4G free
      D:\, 149.0G, 108.1G free
      E:\, 74.5G, 46.1G free
    Required Plug-ins folder: C:\Program Files\Adobe\Adobe Photoshop CS6 (64 Bit)\Required\
    Primary Plug-ins folder: C:\Program Files\Adobe\Adobe Photoshop CS6 (64 Bit)\Plug-ins\
    Additional Plug-ins folder: not set
    Installed components:
       A3DLIBS.dll   A3DLIB Dynamic Link Library   9.2.0.112  
       ACE.dll   ACE 2012/01/18-15:07:40   66.492997   66.492997
       adbeape.dll   Adobe APE 2012/01/25-10:04:55   66.1025012   66.1025012
       AdobeLinguistic.dll   Adobe Linguisitc Library   6.0.0  
       AdobeOwl.dll   Adobe Owl 2012/02/09-16:00:02   4.0.93   66.496052
       AdobePDFL.dll   PDFL 2011/12/12-16:12:37   66.419471   66.419471
       AdobePIP.dll   Adobe Product Improvement Program   6.0.0.1642  
       AdobeXMP.dll   Adobe XMP Core 2012/02/06-14:56:27   66.145661   66.145661
       AdobeXMPFiles.dll   Adobe XMP Files 2012/02/06-14:56:27   66.145661   66.145661
       AdobeXMPScript.dll   Adobe XMP Script 2012/02/06-14:56:27   66.145661   66.145661
       adobe_caps.dll   Adobe CAPS   5,0,10,0  
       AGM.dll   AGM 2012/01/18-15:07:40   66.492997   66.492997
       ahclient.dll    AdobeHelp Dynamic Link Library   1,7,0,56  
       aif_core.dll   AIF   3.0   62.490293
       aif_ocl.dll   AIF   3.0   62.490293
       aif_ogl.dll   AIF   3.0   62.490293
       amtlib.dll   AMTLib (64 Bit)   6.0.0.75 (BuildVersion: 6.0; BuildDate: Mon Jan 16 2012 18:00:00)   1.000000
       ARE.dll   ARE 2012/01/18-15:07:40   66.492997   66.492997
       AXE8SharedExpat.dll   AXE8SharedExpat 2011/12/16-15:10:49   66.26830   66.26830
       AXEDOMCore.dll   AXEDOMCore 2011/12/16-15:10:49   66.26830   66.26830
       Bib.dll   BIB 2012/01/18-15:07:40   66.492997   66.492997
       BIBUtils.dll   BIBUtils 2012/01/18-15:07:40   66.492997   66.492997
       boost_date_time.dll   DVA Product   6.0.0  
       boost_signals.dll   DVA Product   6.0.0  
       boost_system.dll   DVA Product   6.0.0  
       boost_threads.dll   DVA Product   6.0.0  
       cg.dll   NVIDIA Cg Runtime   3.0.00007  
       cgGL.dll   NVIDIA Cg Runtime   3.0.00007  
       CIT.dll   Adobe CIT   2.0.5.19287   2.0.5.19287
       CoolType.dll   CoolType 2012/01/18-15:07:40   66.492997   66.492997
       data_flow.dll   AIF   3.0   62.490293
       dvaaudiodevice.dll   DVA Product   6.0.0  
       dvacore.dll   DVA Product   6.0.0  
       dvamarshal.dll   DVA Product   6.0.0  
       dvamediatypes.dll   DVA Product   6.0.0  
       dvaplayer.dll   DVA Product   6.0.0  
       dvatransport.dll   DVA Product   6.0.0  
       dvaunittesting.dll   DVA Product   6.0.0  
       dynamiclink.dll   DVA Product   6.0.0  
       ExtendScript.dll   ExtendScript 2011/12/14-15:08:46   66.490082   66.490082
       FileInfo.dll   Adobe XMP FileInfo 2012/01/17-15:11:19   66.145433   66.145433
       filter_graph.dll   AIF   3.0   62.490293
       hydra_filters.dll   AIF   3.0   62.490293
       icucnv40.dll   International Components for Unicode 2011/11/15-16:30:22    Build gtlib_3.0.16615  
       icudt40.dll   International Components for Unicode 2011/11/15-16:30:22    Build gtlib_3.0.16615  
       image_compiler.dll   AIF   3.0   62.490293
       image_flow.dll   AIF   3.0   62.490293
       image_runtime.dll   AIF   3.0   62.490293
       JP2KLib.dll   JP2KLib 2011/12/12-16:12:37   66.236923   66.236923
       libifcoremd.dll   Intel(r) Visual Fortran Compiler   10.0 (Update A)  
       libmmd.dll   Intel(r) C Compiler, Intel(r) C++ Compiler, Intel(r) Fortran Compiler   10.0  
       LogSession.dll   LogSession   2.1.2.1640  
       mediacoreif.dll   DVA Product   6.0.0  
       MPS.dll   MPS 2012/02/03-10:33:13   66.495174   66.495174
       msvcm80.dll   Microsoft® Visual Studio® 2005   8.00.50727.6195  
       msvcm90.dll   Microsoft® Visual Studio® 2008   9.00.30729.1  
       msvcp100.dll   Microsoft® Visual Studio® 2010   10.00.40219.1  
       msvcp80.dll   Microsoft® Visual Studio® 2005   8.00.50727.6195  
       msvcp90.dll   Microsoft® Visual Studio® 2008   9.00.30729.1  
       msvcr100.dll   Microsoft® Visual Studio® 2010   10.00.40219.1  
       msvcr80.dll   Microsoft® Visual Studio® 2005   8.00.50727.6195  
       msvcr90.dll   Microsoft® Visual Studio® 2008   9.00.30729.1  
       pdfsettings.dll   Adobe PDFSettings   1.04  
       Photoshop.dll   Adobe Photoshop CS6   CS6  
       Plugin.dll   Adobe Photoshop CS6   CS6  
       PlugPlug.dll   Adobe(R) CSXS PlugPlug Standard Dll (64 bit)   3.0.0.383  
       PSArt.dll   Adobe Photoshop CS6   CS6  
       PSViews.dll   Adobe Photoshop CS6   CS6  
       SCCore.dll   ScCore 2011/12/14-15:08:46   66.490082   66.490082
       ScriptUIFlex.dll   ScriptUIFlex 2011/12/14-15:08:46   66.490082   66.490082
       tbb.dll   Intel(R) Threading Building Blocks for Windows   3, 0, 2010, 0406  
       tbbmalloc.dll   Intel(R) Threading Building Blocks for Windows   3, 0, 2010, 0406  
       TfFontMgr.dll   FontMgr   9.3.0.113  
       TfKernel.dll   Kernel   9.3.0.113  
       TFKGEOM.dll   Kernel Geom   9.3.0.113  
       TFUGEOM.dll   Adobe, UGeom©   9.3.0.113  
       updaternotifications.dll   Adobe Updater Notifications Library   6.0.0.24 (BuildVersion: 1.0; BuildDate: BUILDDATETIME)   6.0.0.24
       WRServices.dll   WRServices Friday January 27 2012 13:22:12   Build 0.17112   0.17112
       wu3d.dll   U3D Writer   9.3.0.113  
    Required plug-ins:
       3D Studio 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Accented Edges 13.0
       Adaptive Wide Angle 13.0
       ADM 3.11x01
       Angled Strokes 13.0
       Average 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Bas Relief 13.0
       BMP 13.0
       Camera Raw 7.0
       Chalk & Charcoal 13.0
       Charcoal 13.0
       Chrome 13.0
       Cineon 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Clouds 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Collada 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Color Halftone 13.0
       Colored Pencil 13.0
       CompuServe GIF 13.0
       Conté Crayon 13.0
       Craquelure 13.0
       Crop and Straighten Photos 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Crop and Straighten Photos Filter 13.0
       Crosshatch 13.0
       Crystallize 13.0
       Cutout 13.0
       Dark Strokes 13.0
       De-Interlace 13.0
       Dicom 13.0
       Difference Clouds 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Diffuse Glow 13.0
       Displace 13.0
       Dry Brush 13.0
       Eazel Acquire 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Embed Watermark 4.0
       Entropy 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Extrude 13.0
       FastCore Routines 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Fibers 13.0
       Film Grain 13.0
       Filter Gallery 13.0
       Flash 3D 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Fresco 13.0
       Glass 13.0
       Glowing Edges 13.0
       Google Earth 4 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Grain 13.0
       Graphic Pen 13.0
       Halftone Pattern 13.0
       HDRMergeUI 13.0
       IFF Format 13.0
       Ink Outlines 13.0
       JPEG 2000 13.0
       Kurtosis 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Lens Blur 13.0
       Lens Correction 13.0
       Lens Flare 13.0
       Liquify 13.0
       Matlab Operation 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Maximum 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Mean 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Measurement Core 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Median 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Mezzotint 13.0
       Minimum 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       MMXCore Routines 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Mosaic Tiles 13.0
       Multiprocessor Support 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Neon Glow 13.0
       Note Paper 13.0
       NTSC Colors 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Ocean Ripple 13.0
       Oil Paint 13.0
       OpenEXR 13.0
       Paint Daubs 13.0
       Palette Knife 13.0
       Patchwork 13.0
       Paths to Illustrator 13.0
       PCX 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Photocopy 13.0
       Photoshop 3D Engine 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Picture Package Filter 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Pinch 13.0
       Pixar 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Plaster 13.0
       Plastic Wrap 13.0
       PNG 13.0
       Pointillize 13.0
       Polar Coordinates 13.0
       Portable Bit Map 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Poster Edges 13.0
       Radial Blur 13.0
       Radiance 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Range 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Read Watermark 4.0
       Reticulation 13.0
       Ripple 13.0
       Rough Pastels 13.0
       Save for Web 13.0
       ScriptingSupport 13.0
       Shear 13.0
       Skewness 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Smart Blur 13.0
       Smudge Stick 13.0
       Solarize 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Spatter 13.0
       Spherize 13.0
       Sponge 13.0
       Sprayed Strokes 13.0
       Stained Glass 13.0
       Stamp 13.0
       Standard Deviation 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Sumi-e 13.0
       Summation 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Targa 13.0
       Texturizer 13.0
       Tiles 13.0
       Torn Edges 13.0
       Twirl 13.0
       U3D 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Underpainting 13.0
       Vanishing Point 13.0
       Variance 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Variations 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Water Paper 13.0
       Watercolor 13.0
       Wave 13.0
       Wavefront|OBJ 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       WIA Support 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       Wind 13.0
       Wireless Bitmap 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00)
       ZigZag 13.0
    Optional and third party plug-ins: NONE
    Plug-ins that failed to load: NONE
    Flash:
       Mini Bridge
       Kuler
    Installed TWAIN devices: NONE

    The files are only 2 MP each, and I have the same issue when I apply the changes to raw files and JPG files. The error occurs before any changes are applied, before the images would need to be re-cached, since when the error doesn't occur the dialogue box I mentioned pops up asking what to change.
    Your specs sound more than enough, I would think. Have you set the option to write the cache to folders when possible, and if so, could you uncheck that option?
    I don't have experience with these kinds of numbers, but I do dump the Bridge cache manually on a regular basis (every new cycle it seems to be more stable, so I leave longer intervals between cache dumps). My main folder usually contains around 6 to 7 thousand DNG files from a 21 MP dSLR; this starts caching without problems but takes a few hours to finish.
    I can't use the cache-to-folder option due to a long-standing problem with an error message about replacing the CacheT file, hence my question about your setting.
    Also, I have set previews to Always High Quality and, in the preferences, set the option to build monitor-size previews.
    As said, I don't know about such large numbers, but 2 MP files are very small and it should be possible for Bridge, I would think.
    Did you also try creating a new folder and placing the files in there, or dividing them into three folders for testing?

  • Why is menu size so much larger than the files?

    I am somewhat perplexed - I'm making a single-layer DVD with a simple, single-screen theme (no animation) that has one drop zone. The movie has 28 chapters, so there are five scene-selection menu pages, but so far I've added only one montage of photos to each menu page - each is "mobile" quality from the media browser and averages only 10-20 MB - plus one song. On average these movie clips are 1:00 to 2:00 minutes long and the music is cut to that length. So, all told, it's probably less than 110 MB in files - but when I look at the project properties, the menus are 3301 MB!!!
    The movie itself is only 1.1 GB, so what is going on?
    It's preventing me from burning the project, and I've got no idea why the menu size is so much larger than the file sizes.
    Can you please help me figure out what to do?
    Alexa

    The 2 min clips were natively 2 min - I had each one separate and actually converted to media browser in "mobile" size (b/c the drop zone was 4 x 6 in size in the theme). They were tiny - in most cases I was shortening the audio (e.g. the song was 4 minutes and I was setting the loop to only 1:30 b/c that's how long the video clip was on the menu).
    BUT, to resolve the question (I always like to post the answer) - I ended up duplicating the project to try to reimport the video. It was a fresh iDVD project, and I happened to click on the project tree of screens - and, lo and behold, the ENTIRE menu had been duplicated for some reason (and the movie, actually). I went back to the original and it was the same! I have no idea how that would happen - do you? It only had one main menu screen - and then an entire duplicate menu, which I wouldn't even have known how to access if I hadn't seen it in the project menu tree.
    The only thing I can think of is that at one point I added a title menu link to the scene selection - it gave me a warning that my menu was more than 12 minutes and asked whether I wanted to fix it now or ignore it and fix it later, which confused me because it was under 12 minutes, but I clicked ignore. Does that create an alternate title menu and send people back to something else?
    Anyway, I deleted the entire extra scene selection menus (5 of them) and it was back to under 4 GB.
    So, I was able to burn, but still wondering about creating the "title menu" link on scene selections? It drives me crazy that it doesn't automatically do that so I like to add "main menu" links.
    Thanks for your help!
    Alexa

  • Can't Empty Trash With Large Number of Files

    Running OS X 10.8.3
    I have a very large external drive that had a Time Machine backup on the main partition. At some point, I created a second partition, then started doing backups on the new partition. On Wed, I finally got around to doing some "housecleaning" tasks I'd been putting off. As part of that, I decided to clean up my external drive. So... I moved the old, unused and unwanted Backups.backupdb that used to be the Time Machine backup, and dragged it to the Trash.
    Bad idea.
    Now I've spent the last 3-4 days trying various strategies to actually empty the trash and reclaim the gig or so of space on my external drive.  Initially I just tried to "Empty Trash", but that took about four hours to count up the files just to "prepare to delete" them. After the file counter stopped counting up, and finally started counting down... "Deleting 482,832 files..." "Deleting 482,831 files..." etc, etc...  I decided I was on the path to success, so left the machine alone for 12-14 hours.
    When I came back, the results were not what I expected. "Deleting -582,032 files..."  What the...?
    So after leaving that to run for another few hours with no results, I stopped that process.  Tried a few other tools like Onyx, TrashIt, etc...  No luck.
    So I finally decided to say the **** with the window manager, pulled up a terminal, cd'ed to the .Trash directory for my UID on the USB volume, and ran rm -rfv Backups.backupdb.
    While it seemed to run okay for a while, I started getting errors saying "File not found..." and "Invalid file name..." and various other weird things. So now I'm doing a combination of rm -rf'ing individual directories and using the Finder to rename/clean up individual folders when OS X refuses to delete them.
    Has anyone else had this weird overflow issue with deleting large numbers of files in 10.8.x? Doesn't seem like things should be this hard...

    I'm not sure I understand this bit:
    If you're on Leopard 10.5.x, be sure you have the "action" or "gear" icon in your Finder's toolbar (Finder > View > Customize Toolbar). If there's no toolbar, click the lozenge at the upper-right of the Finder window's title bar. If the "gear" icon isn't in the toolbar, select View > Customize Toolbar from the menu bar.
    Then use the Time Machine "Star Wars" display:  Enter Time Machine by clicking the Time Machine icon in your Dock or select the TM icon in your Menubar.
    And this seems to defeat the whole purpose:
    If you delete an entire backup, it will disappear from the Timeline and the "cascade" of Finder windows, but it will not actually delete the backup copy of any item that was present at the time of any remaining backup. Thus you may not gain much space. This is usually fairly quick
    I'm trying to reclaim space on a volume that had a time machine backup, but that isn't needed anymore. I'm deleting it so I can get that 1GB+ of space back. Is there some "official" way you're supposed to delete these things where you get your hard drive space back?

  • Please add the ability to add multiple folders to the assets folder in order to better organize large numbers of files.

    Please add the ability to add multiple folders to the assets folder in order to better organize large numbers of files.

    Hello KDLadage
    Thank you for your recommendation. I understand the challenges of managing large numbers of files on the My Files page. I also understand the need to preserve project files.
    Perhaps a compromise would be to create an Archive tab under My Files. Previous versions and retired project files could then be automatically moved into this holding area when a new version is published, thus preserving the files in a separate area that is still accessible to the author.
    I will submit this suggestion to our product management team to consider as a future enhancement.

  • How to process large input CSV file with File adapter

    Hi,
    could someone recommend the right BPEL way to process a large input CSV file (4 MB or more, with at least 5000 rows) with the File Adapter?
    My idea is to receive data from the file (poll the UX directory for a new input file), transform it, and then export it to one output CSV file (the input for another system).
    I developed my process that consists of:
    - File adapter partnerlink for read data
    - Receive activity with checked box to create instance
    - Transform activity
    - Invoke activity for writing to output CSV.
    I tried this with a small input file and everything was OK, but now when I try to use the complete input file, the process doesn't start and automatically goes to the OFF state in the BPEL console.
    Could I use the MaxTransactionSize parameter as in the DB adapter, should I batch the input file, or is there another way that could help?
    Any hints? I have to solve this problem by this Thursday.
    Thanks,
    Milan K.

    This is a known issue. Martin Kleinman has posted several issues on the forum here with a similar scenario using ESB. This can only be solved by thoroughly tuning the BPEL application itself and throwing big hardware at it.
    Also, switching to the latest 10.1.3.3 version of the SOA Suite (assuming you haven't already) will show some improvements.
    HTH,
    Bas

  • How do I go about deleting a large number of files at the same time? Where's the easiest place to do it?

    How do I go about deleting a large number of files at the same time? Where's the easiest place to do it?

    A bit vague as to what you intend, but the simple answer is to select all the files you want to delete then either drag to the Trash or CTRL- or RIGHT-click on the selection and choose Move to Trash from the contextual menu.

  • WebDAV Write Unreliability - Large non-XML Files

    I am unable to get Oracle's WebDAV implementation on Windows XP to reliably store large (1 megabyte or more) files that are not XML without unexplained errors occurring, which result in only partial files being transferred.
    For testing I started with the repository folders created by the XDB Basic Demo, using Windows Explorer's WebDAV client to insert files of varying sizes. Files in excess of 1 megabyte fail more often than they succeed, although I have occasionally succeeded in storing files of many megabytes. Because they were handy, I mainly used ZIP files to insert into the Oracle XML DB repository. When the files were renamed to the .xml extension, they always succeeded.
    To eliminate client error, I have also used DavExplorer with the same symptoms observed. No problems were experienced retrieving large files from the Oracle XML DB Repository. My tests were performed using Oracle 9.2.0.5 and 10.1.0.2. The same problem did not occur when using Oracle's FTP interface, and when using the "native" Apache 1.3 and Apache WebDAV implementations.
    This failure is a major problem for our application, which would otherwise fit very well with Oracle XML DB features. Our application provides an authoring environment for a large collection of small XML files, but it also has to allow storage of some related GIF, PDF and Microsoft Office files. Our customers are large corporations, who would prefer to use Oracle for this requirement and for related data processing requirements.
    In case it is relevant, we are not restricted to running Oracle on XP; Unix platforms are a more likely option for our customers.
    I could not find details of the error logged by the Oracle software, using either database version. The only information I have gathered about the failure was provided by adding the following line to sqlnet.ora:
    TRACE_LEVEL_SERVER=4
    The trace output for a failure is as follows:
    [19-MAY-2004 13:46:59:735] nsopen: opening transport...
    [19-MAY-2004 13:46:59:735] nsopen: transport is open
    [19-MAY-2004 13:46:59:735] nsopen: global context check-in (to slot 1) complete
    [19-MAY-2004 13:46:59:735] nsanswer: deferring connect attempt; at stage 1064
    [19-MAY-2004 13:46:59:735] nsc2addr: (ADDRESS=(PROTOCOL=tcp)(DEV=340)(HOST=132.223.134.163)(PORT=8080))
    [19-MAY-2004 13:46:59:735] nttgetport: port resolved to 8080
    [19-MAY-2004 13:46:59:735] nttbnd2addr: using host IP address: 132.223.134.163
    [19-MAY-2004 13:46:59:735] nstimarmed: no timer allocated
    [19-MAY-2004 13:46:59:735] nsclose: global context check-out (from slot 1) complete
    [19-MAY-2004 13:46:59:735] nttcnp: Validnode Table IN use; err 0x0
    [19-MAY-2004 13:46:59:735] nsopen: opening transport...
    [19-MAY-2004 13:46:59:735] nttcnp: Validnode Table IN use; err 0x0
    [19-MAY-2004 13:46:59:735] nttcnp: Validnode Table IN use; err 0x0
    [19-MAY-2004 13:46:59:735] nttcnp: getting sockname
    [19-MAY-2004 13:46:59:735] nttcnp: getting peername
    [19-MAY-2004 13:46:59:735] nttcon: set TCP_NODELAY on 296
    [19-MAY-2004 13:46:59:745] nsopen: transport is open
    [19-MAY-2004 13:46:59:745] nsopen: global context check-in (to slot 1) complete
    [19-MAY-2004 13:46:59:745] nstoControlATO: ATO disabled for ctx=0x055D6FC4
    [19-MAY-2004 13:46:59:745] nsevdansw: exit
    [19-MAY-2004 13:46:59:745] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:745] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:745] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:745] nstoControlATO: ATO disabled for ctx=0x055D3548
    [19-MAY-2004 13:46:59:745] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:745] nstoControlATO: ATO disabled for ctx=0x055D3548
    [19-MAY-2004 13:46:59:745] snttcallback: op = 5, bytes = 232, err = 0
    [19-MAY-2004 13:46:59:745] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:745] nstoControlATO: ATO disabled for ctx=0x055D3548
    [19-MAY-2004 13:46:59:755] snttcallback: op = 5, bytes = 296, err = 0
    [19-MAY-2004 13:46:59:755] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:755] nstoControlATO: ATO disabled for ctx=0x055D3548
    [19-MAY-2004 13:46:59:755] snttcallback: op = 5, bytes = 0, err = 0
    [19-MAY-2004 13:46:59:755] ntt2err: Read unexpected EOF ERROR on 296
    [19-MAY-2004 13:46:59:755] snttcallback: op = 5, bytes = 0, err = 0
    [19-MAY-2004 13:46:59:755] ntt2err: Read unexpected EOF ERROR on 296
    [19-MAY-2004 13:46:59:755] snttcallback: op = 5, bytes = 0, err = 0
    [19-MAY-2004 13:46:59:755] ntt2err: Read unexpected EOF ERROR on 296
    [19-MAY-2004 13:46:59:755] snttcallback: op = 5, bytes = 0, err = 0
    [19-MAY-2004 13:46:59:755] ntt2err: Read unexpected EOF ERROR on 296
    [19-MAY-2004 13:46:59:755] nsdo: transport read error
    [19-MAY-2004 13:46:59:755] nserror: nsres: id=1, op=68, ns=12537, ns2=12560; nt[0]=507, nt[1]=0, nt[2]=0; ora[0]=0, ora[1]=0, ora[2]=0
    [19-MAY-2004 13:46:59:755] nstimarmed: no timer allocated
    [19-MAY-2004 13:46:59:765] nsclose: closing transport
    [19-MAY-2004 13:46:59:765] nsclose: global context check-out (from slot 1) complete
    [19-MAY-2004 13:46:59:765] nsopen: opening transport...
    [19-MAY-2004 13:46:59:765] nsopen: transport is open
    [19-MAY-2004 13:46:59:765] nsopen: global context check-in (to slot 1) complete
    [19-MAY-2004 13:46:59:765] nsanswer: deferring connect attempt; at stage 1064
    [19-MAY-2004 13:46:59:765] nsc2addr: (ADDRESS=(PROTOCOL=tcp)(DEV=340)(HOST=132.223.134.163)(PORT=8080))
    [19-MAY-2004 13:46:59:765] nttgetport: port resolved to 8080
    [19-MAY-2004 13:46:59:765] nttbnd2addr: using host IP address: 132.223.134.163
    [19-MAY-2004 13:46:59:765] nstimarmed: no timer allocated
    [19-MAY-2004 13:46:59:765] nsclose: global context check-out (from slot 1) complete
    [19-MAY-2004 13:46:59:765] nttcnp: Validnode Table IN use; err 0x0
    [19-MAY-2004 13:46:59:765] nsopen: opening transport...
    [19-MAY-2004 13:46:59:765] nttcnp: Validnode Table IN use; err 0x0
    [19-MAY-2004 13:46:59:765] nttcnp: Validnode Table IN use; err 0x0
    [19-MAY-2004 13:46:59:765] nttcnp: getting sockname
    [19-MAY-2004 13:46:59:765] nttcnp: getting peername
    [19-MAY-2004 13:46:59:765] nttcon: set TCP_NODELAY on 296
    [19-MAY-2004 13:46:59:765] nsopen: transport is open
    [19-MAY-2004 13:46:59:765] nsopen: global context check-in (to slot 1) complete
    [19-MAY-2004 13:46:59:765] nstoControlATO: ATO disabled for ctx=0x055D6FC4
    [19-MAY-2004 13:46:59:765] nsevdansw: exit
    [19-MAY-2004 13:46:59:765] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:765] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:765] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:765] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:765] nstoControlATO: ATO disabled for ctx=0x055D3548
    [19-MAY-2004 13:46:59:765] snttcallback: op = 5, bytes = 302, err = 0
    [19-MAY-2004 13:46:59:765] ntt2err: soc 296 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:46:59:775] snttcallback: op = 5, bytes = 8192, err = 0
    [19-MAY-2004 13:46:59:775] snttcallback: op = 5, bytes = 8192, err = 0
    [19-MAY-2004 13:46:59:775] snttcallback: op = 5, bytes = 8192, err = 0
    [19-MAY-2004 13:46:59:775] snttcallback: op = 5, bytes = 8192, err = 0
    [19-MAY-2004 13:47:00:246] nstoControlATO: ATO disabled for ctx=0x055D3548
    [19-MAY-2004 13:47:00:246] nstimarmed: no timer allocated
    [19-MAY-2004 13:47:00:246] nsclose: closing transport
    [19-MAY-2004 13:47:00:366] nsclose: global context check-out (from slot 1) complete
    The trace for a successful transfer is as follows:
    [19-MAY-2004 13:57:28:249] nsopen: opening transport...
    [19-MAY-2004 13:57:28:249] nsopen: transport is open
    [19-MAY-2004 13:57:28:249] nsopen: global context check-in (to slot 1) complete
    [19-MAY-2004 13:57:28:249] nsanswer: deferring connect attempt; at stage 1064
    [19-MAY-2004 13:57:28:249] nsc2addr: (ADDRESS=(PROTOCOL=tcp)(DEV=352)(HOST=132.223.134.163)(PORT=8080))
    [19-MAY-2004 13:57:28:249] nttgetport: port resolved to 8080
    [19-MAY-2004 13:57:28:249] nttbnd2addr: using host IP address: 132.223.134.163
    [19-MAY-2004 13:57:28:249] nstimarmed: no timer allocated
    [19-MAY-2004 13:57:28:249] nsclose: global context check-out (from slot 1) complete
    [19-MAY-2004 13:57:28:249] nttcnp: Validnode Table IN use; err 0x0
    [19-MAY-2004 13:57:28:249] nsopen: opening transport...
    [19-MAY-2004 13:57:28:249] nttcnp: Validnode Table IN use; err 0x0
    [19-MAY-2004 13:57:28:249] nttcnp: Validnode Table IN use; err 0x0
    [19-MAY-2004 13:57:28:249] nttcnp: getting sockname
    [19-MAY-2004 13:57:28:249] nttcnp: getting peername
    [19-MAY-2004 13:57:28:249] nttcon: set TCP_NODELAY on 1420
    [19-MAY-2004 13:57:28:249] nsopen: transport is open
    [19-MAY-2004 13:57:28:249] nsopen: global context check-in (to slot 1) complete
    [19-MAY-2004 13:57:28:249] nstoControlATO: ATO disabled for ctx=0x055D6FC4
    [19-MAY-2004 13:57:28:249] nsevdansw: exit
    [19-MAY-2004 13:57:28:249] nstoControlATO: ATO disabled for ctx=0x055D3548
    [19-MAY-2004 13:57:28:249] ntt2err: soc 1420 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:57:28:249] ntt2err: soc 1420 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:57:28:269] ntt2err: soc 1420 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:57:28:269] ntt2err: soc 1420 error - operation=5, ntresnt[0]=524, ntresnt[1]=997, ntresnt[2]=0
    [19-MAY-2004 13:57:28:369] nstoControlATO: ATO disabled for ctx=0x055D3548
    [19-MAY-2004 13:57:31:374] nstimarmed: no timer allocated
    [19-MAY-2004 13:57:31:384] snttcallback: op = 5, bytes = 0, err = 995
    [19-MAY-2004 13:57:31:384] ntt2err: soc 1420 error - operation=5, ntresnt[0]=530, ntresnt[1]=995, ntresnt[2]=0
    [19-MAY-2004 13:57:31:384] snttcallback: op = 5, bytes = 0, err = 995
    [19-MAY-2004 13:57:31:384] ntt2err: soc 1420 error - operation=5, ntresnt[0]=530, ntresnt[1]=995, ntresnt[2]=0
    [19-MAY-2004 13:57:31:384] snttcallback: op = 5, bytes = 0, err = 995
    [19-MAY-2004 13:57:31:384] ntt2err: soc 1420 error - operation=5, ntresnt[0]=530, ntresnt[1]=995, ntresnt[2]=0
    [19-MAY-2004 13:57:31:384] snttcallback: op = 5, bytes = 0, err = 995
    [19-MAY-2004 13:57:31:384] ntt2err: soc 1420 error - operation=5, ntresnt[0]=530, ntresnt[1]=995, ntresnt[2]=0
    [19-MAY-2004 13:57:31:384] nsclose: closing transport
    [19-MAY-2004 13:57:31:484] nsclose: global context check-out (from slot 1) complete

    Thanks Mark.
    Unfortunately I'm not currently in a position to test on Unix, but understand that I need to do so. It may be a couple of weeks before I can test on Linux, as "IT Support" will need to install an OS version officially supported by Oracle. If we win the work we'll have the required hardware, but we have to demonstrate it working first.
    Note, in my test circumstance transfers work fine using FTP, and only fail using WebDAV when the file extension is not .xml.

  • I have my camera (Canon 5D Mark 2) set to take both JPEG Large and Raw files with each shot. I uploaded the images from the card to my Pro (Aperture 3) and while the import info said 1500 images were uploaded, I can't find the RAW images. Aperture put about 700 images in an untitled project folder, but all the images are JPEGs

    I have my camera (Canon 5D Mark 2) set to take both JPEG Large and Raw files with each shot. I uploaded the images from the card to my Pro (Aperture 3) and while the import info said 1500 images were uploaded, I can't find the RAW images.  Aperture put about 700 images in an untitled project folder, but all the images are the JPEGs.  What am I missing?
    Thanks,
    upsjdris

    Have you checked your "Import" settings for "Raw&Jpeg" pairs in the "Import" panel?
    You can set Aperture to import raw, jpeg, or raw&jpeg.
    If you imported Raw&Jpeg, but have set Aperture to use the Jpeg as original, you will see the imported image as Jpeg image, not as a raw image, even if the raw has also been imported. You can switch between Raw and Jpeg originals for selected images from the Photos menu:
    Photos > Use Raw as original.
    Regards
    Léonie

  • Optimal Locations for SQL 2005 TempDB & ReportServerTempDB MDF & LDF Files

    Hello.  I have SQL Server 2005 SP4 installed on Windows Server 2003 R2 Standard SP2. 
    The server has four CPU cores.  The server was configured with the following drive configuration.
    RAID 1 - System drive
    RAID 5 - SQL MDF files
    RAID 1 - SQL LDF files
    There is currently one of each of the following files on the server. 
    The MDF files are on the RAID 5 array, while the LDF files are on the RAID 1 array for SQL LDF files.
    TempDB.mdf                                    
    ReportServerTempDB.mdf
    TempLog.ldf                                     
    ReportServerTempDB_log.ldf
    It is my understanding that having one TempDB per core is a best practice, so I should create three more TempDBs.
    Does that also apply to ReportServerTempDB? Given the drive configuration of this server, where should all of the required MDF and LDF files for the TempDBs and ReportServerTempDBs be located?

    You cannot create more than one tempdb in any SQL instance. The recommendation is to add tempdb *data files* depending on the number of cores, up to a maximum of 8, and obviously this depends on the environment. This recommendation is only for tempdb; it does not apply to ReportServerTempDB, which is a completely different database and is not used the way tempdb is.
    http://www.sqlskills.com/blogs/paul/a-sql-server-dba-myth-a-day-1230-tempdb-should-always-have-one-data-file-per-processor-core/
    http://blogs.msdn.com/b/cindygross/archive/2009/11/20/compilation-of-sql-server-tempdb-io-best-practices.aspx
    Now, it is good to keep tempdb on its own drive, but because you don't have a dedicated one, you can simply put the tempdb data files on the data-file drive and the log file on the log-file drive. You might want to consider moving them to a separate disk if you see I/O contention. (A sample ALTER DATABASE command follows this reply.)
    Regards, Ashwin Menon My Blog - http:\\sqllearnings.com
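    To illustrate the "additional tempdb data files" point above, here is a minimal T-SQL sketch; the file name, path, size, and growth values are hypothetical, and on this four-core server the linked guidance would mean up to four equally sized tempdb data files in total, while ReportServerTempDB is left alone.
    -- Add one extra, equally sized tempdb data file
    -- (tempdev2 and the F:\SQLData path are placeholders; repeat for tempdev3, tempdev4)
    ALTER DATABASE tempdb
    ADD FILE (NAME = tempdev2,
              FILENAME = 'F:\SQLData\tempdb2.ndf',
              SIZE = 1024MB,
              FILEGROWTH = 256MB);
    GO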

  • ADT Not Working With Large Amount Of Files

    Hi,
    I posted a similar issue in the Flash Builder forum; it seems these tools are not meant for anything too big. Here is my problem: I have an application that runs on the iPad just fine; however, I cannot seem to package the resource files it uses, such as SWF, XML, JPG, and FLV. There are a lot of these files, maybe around a thousand. This app won't be on the App Store and is just used internally at my company. I tried to package it using Flash Builder, but it couldn't handle it, so now I have tried the adt command-line tool. The problem now is that it is running out of memory:
    Exception in thread "main" java.lang.OutOfMemoryError
            at java.util.zip.Deflater.init(Native Method)
            at java.util.zip.Deflater.<init>(Unknown Source)
            at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:428)
            at com.adobe.air.ipa.IPAOutputStream.addFile(IPAOutputStream.java:338)
            at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:273)
            at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:247)
            at com.adobe.air.ADTOutputStream.addFile(ADTOutputStream.java:367)
            at com.adobe.air.ipa.IPAOutputStream.addFile(IPAOutputStream.java:161)
            at com.adobe.air.ApplicationPackager.createPackage(ApplicationPackager.java:67)
            at com.adobe.air.ipa.IPAPackager.createPackage(IPAPackager.java:163)
            at com.adobe.air.ADT.parseArgsAndGo(ADT.java:504)
            at com.adobe.air.ADT.run(ADT.java:361)
            at com.adobe.air.ADT.main(ADT.java:411)
    I tried increasing the Java memory heap size to 1400m, but that did not help. What is disturbing, though, is why, in this day and age, the adt tool is written solely for 32-bit Java runtimes, when you can no longer buy a 32-bit computer. Almost everything is 64-bit now. Being able to utilize a 64-bit Java runtime would eliminate this.
    Has anyone else had to deal with packaging a large number of files within an iPad app?
    thanks

    Hi,
    Yes the bug number is 2896433, logged against Abobe Air. I also logged one against Flash Builder for the same issue basically, FB-31239. In the tests I have done here, both issues are the result of including a large number of files in the package. By reducing the number of files included, these bugs do not appear.
    I would like to try out the new SDK, but it won't be available for a few more weeks, correct?
    The main reason I want to include all these files is because the iPads will not always have a wireless connection.
    thx
