Edit big files into smaller parts or "chapters"

Hi,
New person on the board here.
Let's see if I can explain what I'm after.
I have three large MP3 files of an hour's playing time each. These all came from a radio show I took part in. I now want to edit these so that I can burn them to normal CDs and so that you can skip between songs, chatter and commercial breaks without getting a 2-second silent break in between.
Does anyone know if this can be done in GB3 and if so how?

GB is really not the best tool for that, I'd recommend an Audio Editor for, well, editing Audio files:
http://thehangtime.com/gb/gbfaq2.html#audioeditors

Similar Messages

  • How to split a big file into several small files using XI

    Hi,
    Is there any way to split a huge file into small files using XI.
    Thanks
    Mukesh

    There is a how-to guide for using file adapters; an "sdn search" will get you the document.
    Based on that, read the file into XI, use strings to store the file content, and do the split.
    Here is some code to get you started:
    ===========
    Pseudocode:
    ===========
    // This can be passed in - however many output files the source is split across
    numOutputFiles = 5;
    // Open the same number of file handles as numOutputFiles specifies
    open file1, file2, file3, file4, file5;
    // Create an array to hold references to those file handles
    Array[5] fileHandles = {file1, file2, file3, file4, file5};
    // Initialize a loop counter
    int loopCounter = 0;
    // A temporary holder to "point" the output stream at the correct output file
    File currentOutputFile = null;
    // Loop through the input file, writing each line to the next output file in turn
    while ((line = sourceFile.nextLine) != null) {
        // Cycle through the array indices 0..numOutputFiles-1
        currentOutputFile = fileHandles[loopCounter % numOutputFiles];
        currentOutputFile.write(line);
        loopCounter++;
    }
    regards
    krishna
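For reference, the pseudocode above can be sketched as a self-contained Java program. The class name, the ".partN" file-naming scheme, and the temp-file demo are assumptions for illustration only - they are not part of XI's file adapter API.

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;

public class FileSplitter {
    // Split the source file's lines round-robin across numOutputFiles parts.
    // Output files are named <source>.part1 .. <source>.partN (a naming
    // convention assumed for this sketch).
    public static List<Path> split(Path source, int numOutputFiles) throws IOException {
        List<Path> parts = new ArrayList<>();
        List<BufferedWriter> writers = new ArrayList<>();
        try {
            for (int i = 1; i <= numOutputFiles; i++) {
                Path p = Paths.get(source.toString() + ".part" + i);
                parts.add(p);
                writers.add(Files.newBufferedWriter(p));
            }
            int loopCounter = 0;
            try (BufferedReader in = Files.newBufferedReader(source)) {
                String line;
                while ((line = in.readLine()) != null) {
                    // Cycle through indices 0..numOutputFiles-1
                    BufferedWriter out = writers.get(loopCounter % numOutputFiles);
                    out.write(line);
                    out.newLine();
                    loopCounter++;
                }
            }
        } finally {
            for (BufferedWriter w : writers) w.close();
        }
        return parts;
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("source", ".txt");
        Files.write(src, Arrays.asList("line1", "line2", "line3", "line4", "line5"));
        List<Path> parts = split(src, 2);
        // Lines 1, 3, 5 land in part 1; lines 2 and 4 in part 2
        System.out.println(Files.readAllLines(parts.get(0)));
        System.out.println(Files.readAllLines(parts.get(1)));
    }
}
```

Each input line lands in the next output file in turn, so five lines split across two files as lines 1/3/5 and 2/4.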

  • Edited RAW files used to make a photo book

    I am posting a question that I mistakenly posted in the wrong support community. 
    I have photos in both RAW and JPEG format. I know that when a RAW file is edited it is saved as a JPEG. I noticed that the size of the edited RAW file is often smaller than that of the same JPEG photo (sometimes it's up to 2 MB or more of difference). Believing that a larger photo has more detail, do you recommend using one file format over the other (edited RAW vs. JPEG) to create an iBook? What is the smallest photo size recommended for a photo book?
    I use iPhoto '11.
    Thanks.

    I know that when a RAW file is edited it is saved as a JPEG.
    It hasn't been since iPhoto '08. When you process a Raw the decisions you make are recorded in the database and a Preview is made. This Preview is what's used in Media Browsers and the like - essentially any time you share a photo except via File -> Export. But the edits are never fixed or committed.
      I noticed that the size of the edited RAW file is often smaller than the same JPEG photo. (Sometimes it's up to 2MB or more difference.)
    That's the difference between the camera's jpeg and the iPhoto Preview. The Preview is a medium quality version - a sort of "good for most things" version. If you export you can create much bigger files.
    Believing that a larger size photo has more detail
    Not necessarily so... It is less compressed, that's all.
    do you recommend using one file format over the other (edited RAW v. JPEG) to create an iBook?
    There's no right answer. No matter what, you can't print the Raw, so it's either going to be the camera Jpeg or one created by iPhoto from the Raw. If you don't want to process the image yourself, or if you feel what you can do with the image is no better than the camera's, it might be worth asking yourself why you're shooting Raw at all. And if you feel that what you do with the Raw is better than what the camera creates, you might ask yourself why you're shooting the Jpeg at all.
    What will print best is the best shot, regardless of whether the camera processed it or you did.

  • HP C3183 Doesn't Print Big Files

    I've bought a new computer with Windows 7 and installed the HP software and drivers from the official web site. The problem is that the device doesn't print big files, only very small files, approx. 5 KB. When I try to print a file larger than 5 KB, it simply doesn't do anything with the files collected in the print queue. I've tried to reinstall the drivers several times, but with the same result. Did anybody have this kind of problem? Maybe there is a problem with the printer's internal memory?

    Hello and thank you for your updated information.
    I understand that when you turn off your computer, you are unable to print. You shouldn't have to restart the Print Spooler every time in order to print. You may have a problem with the Print Spooler on your computer. Please follow these steps:
    First, I would recommend bypassing the Print Spooler to verify if you are able to print:
    1) Go into Devices and Printers
    2) Right-click on the Photosmart C3183 printer
    3) Click on Printer Properties, then click on the Advanced tab (very top)
    4) Click on "Print directly to the printer", then click Apply
    5) Try to print a document
    If this step doesn't work, then follow this entire document on Print Spooler Keeps stopping Automatically. I know you are not receiving this error code, but it has valid troubleshooting steps to perform. Make sure the "Startup Type" is set to Automatic.
    Also, try this tool from Microsoft here, which will diagnose and fix printing problems.
    Please post your results, as I will be looking forward to hearing from you.
    I worked on behalf of HP.

  • Small file to big file

    Hi All,
    Can anyone tell me how to convert a smallfile tablespace to a bigfile tablespace? Below is my database version:
    Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - 64bit
    PL/SQL Release 10.2.0.2.0 - Production
    CORE 10.2.0.2.0 Production
    TNS for Solaris: Version 10.2.0.2.0 - Production
    NLSRTL Version 10.2.0.2.0 - Production

    DUPLICATE*
    ===============================================
    small file to big file
    ===============================================
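Since the thread was closed without an answer: in 10g there is no in-place conversion of a smallfile tablespace; the usual approach is to create a new bigfile tablespace and move the segments into it. A sketch, where the tablespace name, datafile path, size, and object names are all hypothetical:

```sql
-- Create a bigfile tablespace (one large datafile instead of many small ones)
CREATE BIGFILE TABLESPACE users_big
  DATAFILE '/u01/oradata/users_big01.dbf' SIZE 20G;

-- Move each segment across, then rebuild its indexes
ALTER TABLE my_table MOVE TABLESPACE users_big;
ALTER INDEX my_table_pk REBUILD TABLESPACE users_big;
```

For large production tables, an online alternative such as DBMS_REDEFINITION avoids the downtime that MOVE implies.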

  • Big File vs Small file Tablespace

    Hi All,
    I have a doubt and just want to confirm which is better: using one big datafile (a bigfile tablespace) or many small datafiles for a tablespace. I think it is better to use a bigfile tablespace.
    Kindly help me out on whether I am right or wrong, and why.

    GirishSharma wrote:
    Aman.... wrote:
    Vikas Kohli wrote:
    With respect to performance i guess Big file tablespace is a better option
    Why ?
    If you allow me to post, I would like to paste the below text from my first reply's doc link please :
    "Performance of database opens, checkpoints, and DBWR processes should improve if data is stored in bigfile tablespaces instead of traditional tablespaces. However, increasing the datafile size might increase time to restore a corrupted file or create a new datafile."
    Regards
    Girish Sharma
    Girish,
    I find it interesting that I've never found any evidence to support the performance claims - although I can think of reasons why there might be some truth to them and could design a few tests to check. Even if there is some truth in the claims, how significant or relevant might they be in the context of a database that is so huge that it NEEDS bigfile tablespaces ?
    Database opening:  how often do we do this - does it matter if it takes a little longer - will it actually take noticeably longer if the database isn't subject to crash recovery ?  We can imagine that a database with 10,000 files would take longer to open than a database with 500 files if Oracle had to read the header blocks of every file as part of the database open process - but there's been a "delayed open" feature around for years, so maybe that wouldn't apply in most cases where the database is very large.
    Checkpoints: critical in the days when a full instance checkpoint took place on the log file switch - but (a) that hasn't been true for years, (b) incremental checkpointing made a big difference to the I/O peak when an instance checkpoint became necessary, and (c) we have had a checkpoint process for years (if not decades) which updates every file header when necessary rather than requiring DBWR to do it.
    DBWR processes: why would DBWn handle writes more quickly - the only idea I can come up with is that there could be some code path that has to associate a file id with an operating system file handle of some sort and that this code does more work if the list of files is very long: very disappointing if that's true.
    On the other hand, I recall many years ago (8i time) crashing a session when creating roughly 21,000 tablespaces for a database, because some internal structure relating to file information reached the 64MB hard limit for a memory segment in the SGA. It would be interesting to hear if anyone has recently created a database with the 65K+ file limit - and whether it makes any difference whether that's 66 tablespaces with about 1,000 files each, or 1,000 tablespaces with about 66 files each.
    Regards
    Jonathan Lewis

  • Can't send large files (5 MB or bigger) from my phone unless it's plugged into power

    I have to send large files, like 5 MB or bigger, via mobile phone, but in those cases the phone tells me to plug in power. And when I do that, there is no problem sending the file. For smaller files there is no problem. How can I solve this? Because when I'm not at home I can't plug in the power.

    hi,
    I am sending a file from server to client, i.e. the client will request a file and the service will send it back. No socket connection is there; I am using JBoss and Apache Axis.
    Please help me out.
    Rashi

  • JDEV TP3 freezes when editing big .jspx files

    Hi,
    I have mainly worked with small files in JDEV TP3. At the moment I am testing the ADF Faces Rich table component. I have a huge table - the markup is over 700 lines long - and when I edit this file JDeveloper freezes for several seconds, repeatedly. This happens during normal text input, copy and paste, and selecting.
    I tried to run JDeveloper with JDK 1.5 and 1.6.4. No difference. Closing and reopening the program makes no difference. On all the other, smaller files (< 100 lines) everything works fine.

    Hi,
    any chance you can provide a testcase ?
    Frank

  • How to edit big amount of files correctly?

    Hi.
    During summer I've captured 120 GB of video material on my GoPro camera.
    I started to add the files I want to take clips from as sequences and marked the parts I would like to cut later - it took me a while, but this way I knew where the best parts were. I've made around 50-60 sequences that way and started to make a clip out of them (also as a sequence).
    Sadly, my laptop can't handle Premiere after a few minutes of running anymore. It freezes, and the Importer Process Server process uses way too much memory, so I basically can't do anything other than shutting the program down and reopening it.
    I was wondering if there is a better way to do it.
    How do you guys edit your files?
    I would like to hear some suggestions from more experienced users.
    I'm using a Dell XPS 702x with 6 GB of RAM on board.

    >GoPro camera
    Unless you have the latest PPro CC version (which I have read works with "native" GoPro files) you need to convert to an easier-to-edit format
    GoPro http://forums.adobe.com/message/4486784
    -says Mac (Win process should be the same) but even GoPro says to convert
    -http://gopro.com/how-to-prepare-video-files-for-editing-apple-mac/
    -use to convert http://gopro.com/software-app/cineform-studio/

  • Problem editing .mov files in CS6

    Hi Adobe folk,
    I've imported .mov files from my Canon 7D camera into Adobe Premiere CS6, but after doing some editing the footage in the playback screen is very jerky/slow. I'm working on a brand new iMac running OS X with 4 GB of RAM, so I have no idea why the playback would be so slow.
    It's an urgent project that I'm working on, so any solutions would be much appreciated.
    Thanks :-)

    All .mov files aren't the same thing, though. QuickTime/.mov files are just the "container"; the actual "codec" inside is what determines basically everything other than the file extension itself. The type of .mov file you're attempting to edit uses the h.264 codec, which puts a TON of strain on your CPU. Converting the files to ProRes would make playback better, but will produce huge files, and if you start stacking several layers of video on top of each other it will start to play back in a choppy manner, because you're not using a RAID array. In this case I don't think more RAM will help that much, although it should make your overall experience much better.
    Also, how many layers of video are you attempting to edit on the same track, and what effects are you using? Your system should be able to at least play back a single video clip inside Premiere without issue.
    Once you post what the resolution is, I'll be able to recommend what codec to use, because for certain resolutions some codecs tend to be more efficient than others. Basically, unless you have FCP on your machine you can't convert to ProRes; however, if you don't have Final Cut, no worries, because you can use the Cineform codec, which is just as good as ProRes. The Cineform codec is also free nowadays, which is great. But honestly, I think you're most likely going to need to transcode the footage to a lighter codec. You don't need to worry about the files being too big: even though it seems like a smaller file would be easier for your machine to handle, it's actually quite the opposite, unless you start hitting an HDD bottleneck - which won't happen in most cases (depending on resolution etc., or unless the video is totally uncompressed HD).
    Here is the basic reason why the less compressed a video clip is, the easier you can edit the file without straining the CPU. Video clips that use "GOPs", or Groups of Pictures, use
    I - frames
    B - frames
    P - frames        (With MPEG-1 there are also D-frames, but they're not used anymore to my knowledge)
    Anyway, when your CPU plays back an h.264 clip it has to decode the video, and the longer the GOP is, the harder your CPU has to work; h.264 footage is almost always long-GOP because it's so heavily compressed, unless it's I-frame only. Basically, the further apart the I-frames are, the harder your CPU must work to decode the clip. When dealing with footage that is much less compressed, your CPU is basically free to process other stuff, such as heavy CC or other effects.
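As a concrete illustration of the transcoding advice above, here is a hypothetical ffmpeg command (ffmpeg is not mentioned in the thread; the filenames are placeholders) that rewraps an H.264 .mov into an intra-frame codec, so every frame is an I-frame and the CPU no longer has to walk a long GOP to decode each picture:

```shell
# Transcode long-GOP H.264 to intra-frame ProRes (bigger files, lighter decode).
# -profile:v 3 selects the HQ profile; audio goes to uncompressed PCM.
ffmpeg -i input.mov -c:v prores_ks -profile:v 3 -c:a pcm_s16le output_prores.mov
```

The same idea applies to Cineform via GoPro's converter linked earlier in the thread; either way the point is trading disk space for decode effort.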

  • How to edit raw files' metadata directly?

    Is it possible to edit raw files' metadata directly?  If so, how?  (I am using Bridge CS6 v5.0.2.4 x64).  In my case, raw files are from Nikon digital cameras, i.e., .NEF.
    By directly, I mean the original file is changed, rather than the sidecar .xmp file.
    This is necessary under some circumstances where it may be required to prove ownership or origin of a photo.
    Help most appreciated.
    Thank you.

    So how do you prove an image is yours?
    I do understand the ownership problem, but I would also point to the impossibility of hard proof with current technology.
    As said, I put my energy into good clients, and for my images that are used on personal websites without my permission I really don't bother.
    I don't know where you live or what your country has in its law regarding copyright. I also don't know to what extent you want to go and how much money is involved in your problem.
    I never had to do it myself, but I have heard of several cases in the Netherlands. If you are certain it is your image and it is used without your permission and without paying you, there are normal ways to seek justice.
    First call the person or company that wronged you, ask for information and explain; then put this in writing with the request of a reasonable fee. If no luck, seek a lawyer for a first letter with a warning of extra costs and a fine.
    If you want (depending on the law in your country, but most western countries do respect copyright) you can take it to court. Be sure to have the original raw files at hand (the whole series makes it more believable) and, if needed and possible, seek witnesses who saw you taking the pictures.
    But it can be a long and hard road that is certainly not suitable for a small case. Then again, most of the cases have been decided in favor of the photographer, with the company that first refused to pay left with the usual fee to pay, including a fine (mostly the same as the fee or more), and having to pay the costs of the court…
    Also keep in mind this is not good advertising for you as a photographer; especially if the fine is big, the word will spread around in the business. It is always problematic in these cases, hence my advice to invest in good clients...

  • Gimp: big files [solution inside]

    i'm trying to glue images together into a bigger one (each of them is about 3000x2000px and there are 5 of them)
    what i tried:
    - open all 5 files in gimp
    - create a new image in gimp, with the size 8000x8000px
    this worked, but then if i copy one of the smaller images, go to the bigger one and paste it, the system becomes horribly slow and the hdd is working a lot --- gimp then redraws windows extremely slowly
    i have 768MB ram and only about 300 is occupied while trying this
    i think gimp is using only a part of the memory and then has problems working
    why is gimp not using more memory?
    also, if you know another method to glue pics together, i would be glad to hear it, as i'm a newbie in this sector
    what i did as a workaround: i scaled all the pics down to a smaller size and then glued them together. you can see the product here:
    http://daperi.home.solnet.ch/uni/bio4/p … logie.html
    (click on the first image)
    the original resolution is much higher, and the target is to use the full resolution to glue them into a much better image
    any suggestions welcome
    thanx in advance

    Dusty wrote: Is File-->Preferences-->Environment-->Tile cache size what you're looking at?
    it was set to 64MB !!! i changed it to 400 and now gimp works normally again, also with big files - thanx a lot
    Dusty wrote: Another option is to edit smaller files. :-D
    this is not an option but a workaround for me, because i need to glue together images taken under the microscope, which must keep their resolution and size (to keep the details) - with this method, i can reconstruct the whole probe i looked at under the microscope as one picture on the computer, which gives great possibilities for archiving it --- but the trouble is that each pic is about 6*10^6px, and if you glue 10 such pics, you obviously need more than 64mb :-)
    thanx for helping
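For reference, the same tile-cache setting can be raised by editing GIMP's config file directly instead of the Preferences dialog. The path and value below are assumptions for a typical GIMP 2.x install on Linux; check your own version's config directory:

```
# In ~/.gimp-2.x/gimprc (exact path depends on the GIMP version):
# raise the tile cache so large compositions stay in RAM instead of swapping
(tile-cache-size 400M)
```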

  • Photoshop CC slow in performance on big files

    Hello there!
    I've been using PS CS4 since release and upgraded to CS6 Master Collection last year.
    Since my system broke down some weeks ago (the RAM failed), I gave Photoshop CC a try. At the same time I moved into new rooms and couldn't get my hands on the DVD of my CS6, resting somewhere at home...
    So I tried CC.
    Right now I'm using it with some big files. File size is between 2 GB and 7.5 GB max (all PSB).
    Photoshop seemed to run fast in the very beginning, but for a few days it has been so unbelievably slow that I can't work properly.
    I wonder if it is caused by the growing files or some other issue with my machine.
    The files contain a large amount of layers and Masks, nearly 280 layers in the biggest file. (mostly with masks)
    The images are 50 x 70 cm @ 300 dpi.
    When I try to make some brush strokes on a layer mask in the biggest file, it takes 5-20 seconds for the brush to draw... I couldn't figure out why.
    And it doesn't depend on the brush size as much as you might expect... even very small brushes (2-10 px) show this issue from time to time.
    Also, switching masks (gradient maps, selective color or levels) on and off takes ages to be displayed, sometimes more than 3 or 4 seconds.
    The same with panning around in the picture, zooming in and out, or moving layers.
    It's nearly impossible to work on these files in time.
    I've never seen this on CS6.
    Now I wonder if there's something wrong with PS or the OS. But: I've never been working with files this big before.
    In march I worked on some 5GB files with 150-200 layers in CS6, but it worked like a charm.
    SystemSpecs:
    I7 3930k (3,8 GHz)
    Asus P9X79 Deluxe
    64GB DDR3 1600Mhz Kingston HyperX
    GTX 570
    2x Corsair Force GT3 SSD
    Wacom Intuos 5 M Touch (I have some issues with the touch from time to time)
    WIN 7 Ultimate 64
    all systemupdates
    newest drivers
    PS CC
    System and PS are running on the first SSD, scratch is on the second. Both are set to be used by PS.
    RAM is allocated 79% to PS, cache levels are set to 5 or 6, history states are set to 70. I also tried different cache tile sizes from 128K to 1024K, but it didn't help a lot.
    When I open the largest file, PS takes 20-23 GB of RAM.
    Any suggestions?
    best,
    moslye

    Is it just slow drawing, or is actual computation (image size, rotate, GBlur, etc.) also slow?
    If the slowdown is drawing, then the most likely culprit would be the video card driver. Update your driver from the GPU maker's website.
    If the computation slows down, then something is interfering with Photoshop. We've seen some third party plugins, and some antivirus software cause slowdowns over time.

  • Adobe Photoshop CS3 crashes each time it loads a big file

    I was loading a big batch of photos from iMac iPhoto into Adobe Photoshop CS3 and it kept crashing, yet each time I reopen Photoshop it loads the photos again and crashes again. Is there a way to stop this cycle?

    I don't think that too many users here actually use iPhoto (even the Mac users)
    However, Google is your friend. A quick search came up with some other non-Adobe forum entries:
    .... but the golden rule of iPhoto is NEVER EVER MESS WITH THE IPHOTO LIBRARY FROM OUTSIDE IPHOTO. In other words, anything you might want to do with the pictures in iPhoto can be done from *within the program,* and that is the only safe way to work with it. Don't go messing around inside the "package" that is the iPhoto Library unless you are REALLY keen to lose data, because that is exactly what will happen.
    .....everything you want to do to a photo in iPhoto can be handled from *within the program.* This INCLUDES using a third-party editor, and saves a lot of time and disk space if you do this way:
    1. In iPhoto's preferences, specify a third-party editor (let's say Photoshop) to be used for editing photos.
    2. Now, when you right-click (or control-click) a photo in iPhoto, you have two options: Edit in Full Screen (ie iPhoto's own editor) or Edit with External Editor. Choose the latter.
    3. Photoshop will open, then the photo you selected will automatically open in PS. Do your editing, and when you save (not save as), PS "hands" the modified photo back to iPhoto, which treats it exactly the same as if you'd done that stuff in iPhoto's own editor and updates the thumbnail to reflect your changes. Best of all, your unmodified original remains untouched so you can always go back to it if necessary.

  • Can't move big files also with Mac OS Extended Journaled!

    Hi all =)
    This time I can't really understand.
    I'm trying to move a big file (an app 9 GB large) from my MacBook's Applications folder to an external USB drive, HFS+ formatted, but at about 5.5 GB the process stops and I get this error:
    "The Finder can't complete the operation because some data in "application.app" can't be read or written. (error code -36)"
    I tried to search for this error code over the internet with no results.
    I tried transferring the same file to a different drive (which is also HFS+) but I still get the same error code.
    Both drives have plenty of available free space.
    I also tried different USB cables.
    The app in question has just been fully downloaded, with no errors, from the App Store.
    What should I try now? Any suggestion welcome... this situation is so frustrating!
    Thanks

    LowLuster wrote:
    The Applications folder is a System folder and there are restrictions on it. I'm not sure why you are trying to copy an App out of it to an external drive. Try copying it to a folder on the ROOT of the internal drive first, a folder you create using a Admin account, and then copying that to the external.
    Thanks a lot LowLuster, you actually solved my situation!
    But apparently in this "forum" you can't change the chosen answer if by mistake you clicked on the wrong one, and you can't edit your messages, and you even can't delete your messages... such freedom down here, jeez!
