Keeping two very large datastores in sync

I'm looking at options for keeping a very large (potentially 400GB) TimesTen (11.2.2.5) datastore in sync between a Production server and a [warm] Standby.
Replication has been ruled out because it doesn't support compressed tables, nor the kind of tables our closed-source application creates (tables without non-null primary keys).
I've done some testing with smaller datastores to get indicative numbers, and a 7.4GB datastore (according to dssize) resulted in a 35GB backup set (using ttBackup -type fileIncrOrFull). Is that large increase in volume expected, and would it extrapolate up for a 400GB datastore (a 2TB backup set??)?
I've seen that there are incremental backups, but to keep our standby warm we'll be restoring these backups, and from what I've read & tested only a ttDestroy/ttRestore is possible, i.e. a complete restore of the whole datastore each time, which is time-consuming. Am I missing a smarter way of doing this?
Other than building our application to keep the two datastores in sync, are there any other tricks we can use to efficiently keep the two datastores in sync?
Random last question - I see "datastore" and "database" (and to an extent, "DSN") used apparently interchangeably - are they the same thing in TimesTen?
Update: the 35GB compresses down with 7za to just over 2.2GB, but takes 5.5 hours to do so. If I take a standalone fileFull backup it is just 7.4GB on disk, and completes faster too.
thanks,
rmoff.

This must be an Exalytics system, right? I ask this because compressed tables are not licensed for use outside of an Exalytics system...
As you note, replication is not currently possible in an Exalytics environment, but that is likely to change in the future, and then it will definitely be the preferred mechanism for this. There is really no other viable way to do this than through the application.
With regard to your specific questions:
1.   A backup consists primarily of the most recent checkpoint file plus all log files/records that are newer than that file. So, to minimise the size of a full backup, ensure that a checkpoint occurs (for example 'call ttCkpt' from a ttIsql session) immediately prior to starting the backup (see the sketch after this list).
2.   No, only complete restore is possible from an incremental backup set. Also note that due to the large amount of rollforward needed, restoring a large incremental backup set may take quite a long time. Backup and restore are not really intended for this purpose.
3.   If you cannot use replication then some kind of application level sync is your only option.
4.   Datastore and database mean the same thing - a physical TimesTen database. We prefer the term database nowadays; datastore is a legacy term. A DSN is a different thing (Data Source Name) and should not be used interchangeably with datastore/database. A DSN is a logical entity that defines the attributes for a database and how to connect to it. It is not the same as a database.
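Regarding point 1, here is a minimal sketch of forcing that checkpoint programmatically before kicking off ttBackup. The DSN name ("my_dsn") and connection details are placeholders, and the driver class name assumes the standard TimesTen JDBC driver; 'call ttCkpt' is the same built-in procedure you would call from ttIsql.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class CheckpointBeforeBackup {
    public static void main(String[] args) throws Exception {
        // Driver class name and direct-mode URL are assumptions for a typical install;
        // a client/server setup would use jdbc:timesten:client:...
        Class.forName("com.timesten.jdbc.TimesTenDriver");
        try (Connection conn = DriverManager.getConnection("jdbc:timesten:direct:dsn=my_dsn")) {
            try (CallableStatement ckpt = conn.prepareCall("{ call ttCkpt }")) {
                ckpt.execute(); // equivalent to 'call ttCkpt' in ttIsql
            }
            System.out.println("Checkpoint done; safe to start ttBackup now.");
        }
    }
}

A backup script could run this (or simply the ttIsql one-liner) immediately before invoking ttBackup, so the backup set is dominated by the fresh checkpoint file rather than a long log tail.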
Chris

Similar Messages

  • In Mail, one mailbox for Recovered Message (AOL) keeps showing 1 very large message that I cannot delete. How can I get rid of this recurring problem, please?

    In Mail on iMac, successfully running OS X Lion, one mailbox on My Mac for "Recovered Messages (from AOL)" keeps showing 1 very large message (more than 20 Mb) that I just cannot seem to delete. Each time I go into my In Box, the "loading" symbol spins and the message appears in the "Recovered Messages" mailbox. How can I get rid of this recurrent file, please?
    At the same time, I'm not receiving any new mails in my In Box, although, if I look at the same account on my MacBook Pro, I can indeed see the incoming mails (but on that machine I do not have the "recovery" problem).
    The help of a clear-thinking Apple fan would be greatly appreciated.
    Many thanks.
    From Ian in Paris, France

    Ian
    I worked it out.
    Unhide your hidden files (I used a widget from http://www.apple.com/downloads/dashboard/developer/hiddenfiles.html)
    Go to your HD.
    Go to Users.
    Go to your House (home)
    there should be a hidden Library folder there (it will be transparent)
    Go to Mail in this folder
    The next folder ( for me ) is V2
    Click on that and the next one will be a whole list of your mail servers, and one folder called Mailboxes
    Click on that and there should be a folder called recovered messages (server).mbox
    Click on that; inside is a random numbered/lettered folder -> data
    In that data folder is a list of random numbered folders (i.e. a folder called 2, one called 9, etc.) and in EACH of these, another numbered folder, and then a folder called messages.
    In the messages folder, delete all of the .emlx files (I think that's what they were, from memory; sorry, I forgot as I already deleted my trash after my golden moment).
    This was GOLDEN for me. Reason being, when I went to delete my "recovered file" in Mail, it would give me an error message "cannot delete 2500 files". I knew it was only 1 file, so this was weird. Why 2500 files? Because if you click on the .emlx files like I did, hey presto, it turned out that they were ALL THE SAME MESSAGE = 2500 times: in each of those random-numbered folders, in their related messages folder.
    Now remember - DON'T delete the folder; make sure you have gone to the messages folder, found all those pesky .emlx files and deleted THOSE, not the folder.
    It worked for me. No restarting or anything. And recovered file. GONE.
    Started receiving and syncing mail again. Woohoo.
    Best wishes.

  • How to keep two HD's in sync?

    How to keep two HD's in sync? They both contain my Sample Libraries, one is in my desktop the other in my laptop…essentially they contain the same items, but, the thing is: when I'm working on one I begin to delete, modify and add to the pool, and so the two become out of sync.
    It's hard to know what exactly I've changed where. (So, what I usually do is take out things that I edit and re-save them separately, just in case, but this creates vast quantities of unnecessary duplicates).
    So, what I'd like, would be if I could just keep both of the in sync somehow?
    Here's what I'm thinking:
    What if I was to use Carbon Copy Cloner, and just clone one onto the other, back and forth regularly? (Using incremental backup BTW) Would that be a good idea? I'd be worried that things might get corrupted by being overwritten/modified so regularly? Or would that be OK? I mean what if I had changed a small portion of a sample on one HD and then CCC'd onto the other, is it going to detect such a change? And will it always safely copy it across?
    Thank you so much for any help.

    emirium wrote:
    It's hard to know what exactly I've changed where.
    Here's what I'm thinking:
    What if I was to use Carbon Copy Cloner, and just clone one onto the other, back and forth regularly? (Using incremental backup BTW)
    If you are very strict about changing just one drive at a time, you can use CCC or ChronoSync and I think it will be fine.
    BUT... if you are talking about making changes to both drives and trying to reconcile them, then you need ChronoSync or something similar, because they allow you to preview exactly which files will be changed, in which direction, before you commit. I use a different utility (FoldersSynchronizer) for this purpose; it looks like ChronoSync achieves this through its Analyze window. At times I have prevented unintentional data loss by always previewing the changes (a toy sketch of that preview idea follows).
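    To make that "preview, don't copy" idea concrete, here is a toy sketch (the paths are placeholders, and this is not how CCC or ChronoSync actually work internally) that walks two folders and reports differences without modifying anything:

    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.nio.file.*;
    import java.util.stream.Stream;

    public class SyncPreview {
        public static void main(String[] args) throws IOException {
            Path a = Paths.get("/Volumes/SamplesDesktop"); // placeholder paths
            Path b = Paths.get("/Volumes/SamplesLaptop");
            try (Stream<Path> files = Files.walk(a)) {
                files.filter(Files::isRegularFile).forEach(pa -> {
                    Path rel = a.relativize(pa);
                    Path pb = b.resolve(rel);
                    try {
                        if (!Files.exists(pb)) {
                            System.out.println("only on A: " + rel);
                        } else if (!Files.getLastModifiedTime(pa)
                                         .equals(Files.getLastModifiedTime(pb))) {
                            System.out.println("differs: " + rel);
                        }
                        // (a real tool would also walk b for files missing from a)
                    } catch (IOException e) {
                        throw new UncheckedIOException(e); // surface errors from the lambda
                    }
                });
            }
        }
    }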

  • Can't sync very large .mp4 file to iPhone

    I have tried to sync a very large .mp4 video (>300MB) to my iPhone but to no avail. Could it be that the file is simply too large or is there another reason?

    No, I have 1 gig movies on my iPhone. I would check to make sure that the movie is selected to sync in your Movies or TV Shows section. Plug your iPhone into your computer, open iTunes, then click on your iPhone, click on Movies, and then click on TV Shows.

  • I am thinking of buying a "new" iPad but I have a very large iPhoto library which I would like to keep intact and be able to view on my new iPad.  Is there a way to use a external storage device to store the iPhoto library and view it on the iPad?


    You can't use an external hard drive like you would with a computer.
    You can use a USB flash drive & the camera connection kit.
    Plug the USB flash drive into your computer & create a new folder titled DCIM. Then put your movie/photo files into the folder. The files must have a filename exactly 8 characters long (no spaces) plus the file extension (i.e., my-movie.mov; DSCN0164.jpg).
    Now plug the flash drive into the iPad using the camera connection kit. Open the Photos app, and the movie/photo files should appear & you can import. (You cannot export using the camera connection kit.)
    When you first connect the USB flash drive, you will only see thumbnails of the pics on the iPad; you have to import the pic file to see the full size. With your large library of pics, you will have to do this repeatedly and then delete pics so you don't exceed your memory capacity.
     Cheers, Tom

  • Keeping two iTunes Media Folders (and libraries) in sync

    Hi all,
    Here's my situation... I can't be the only person looking for an answer to this. (Can I?)
    At home I have a Mac Mini with my music collection on it managed by iTunes. I have the same at work. I also have ChronoSync set up to nightly synchronize my iTunes Media folders. Obviously, this is just step one.
    What I'd ultimately like is to be able to change files or add media at either location (through iTunes) and not only have the files moved by ChronoSync but also have the iTunes libraries updated. Is there a way to do this via AppleScript or some other solution? I'm looking to keep two iTunes Media folders mirrored and have iTunes' library updated at the same time (or on a regular basis).
    Thanks in advance!
    A

    It sounds like a question for whoever wrote Chronosync.  I have never heard of it and have no idea what it is or what it does.  Generally you ask questions related to third party software on the software's own support forum.  What you need it to do is synchronize the whole iTunes folder, not just media files.  The library.itl file contains most of the actual playlist and content data but things also get updated in the artwork folder, and some other files (mostly also in the iTunes folder).
    What are the iTunes library files? - http://support.apple.com/kb/HT1660
    More on iTunes library files and what they do - http://en.wikipedia.org/wiki/ITunes#Media_management
    What are all those iTunes files? - http://www.macworld.com/article/139974/2009/04/itunes_files.html
    Where are my iTunes files located? - http://support.apple.com/kb/ht1391
    Apple's solution is to have you buy all your music from Apple and use the automatic download of new purchases setting on both computers. Of course it isn't a complete solution. For example, if you change the rating on one computer it won't change on the other, since all you're doing is downloading a file from a third place (the store) to both other places (computers).

  • I am trying to use two very long (2.5 hour HD) clips in multicam but FCPX only places one clip right after the other and does not create multicam. Is there a limit on clip size for multicam?

    I am trying to use two very long (2.5 hour HD) clips and a 2.5 hour audio clip in multicam. One clip and the audio will create a multiclip, but two clips will not. FCPX simply adds the second clip on at the end of the first. There must be some sort of limitation on clip length. Is that correct, and if so what is it?

    The Angle field is not shown in the Inspector's default Basic View. With the clip selected in the event viewer, select the Info tab in the inspector. Change the view of the inspector from the default Basic View to the General View to be able to see the Angle field.
    I assume you want to use the audio to do the sync. If FCP X has trouble syncing the audio, you can set a marker at the same chronological point in each clip before creating the multiclip. Then set Angle Synchronization to First Marker on the Angle and keep Use audio for synchronization checked. Your marker only needs to be close to get FCP X started. It will then use the audio to get final sync.

  • Can iCloud be used to synchronize a very large Aperture library across machines effectively?

    Just purchased a new 27" iMac (3.5 GHz i7 with 8 GB and 3 TB fusion drive) for my home office to provide support.  Use a 15" MBPro (Retina) 90% of the time.  Have a number of Aperture libraries/files varying from 10 to 70 GB that are rapidly growing.  Have copied them to the iMac using a Thunderbolt cable starting the MBP in target mode. 
    While this works I can see problems keeping the files in sync.  Thought briefly of putting the files in DropBox but when I tried that with a small test file the load time was unacceptable so I can imagine it really wouldn't be practical when the files get north of 100 GB.  What about iCloud?  Doesn't appear a way to do this but wonder if that's an option.
    What are the rest of you doing when you need access to very large files across multiple machines?
    David Voran

    Hi David,
    dvoran wrote:
    Don't you have similar issues when the libraries exceed several thousand images? If not what's your secret to image management.
    No, I don't.
    It's an open secret: database maintenance requires steady application of naming conventions, tagging, and backing-up.  With the digitization of records, losing records by mis-filing is no longer possible.  But proper, consistent labeling is all the more important, because every database functions as its own index -- and is only as useful as the index is uniform and holds content that is meaningful.
    I use one, single, personal Library.  It is my master index of every digital photo I've recorded.
    I import every shoot into its own Project.
    I name my Projects with a verbal identifier, a date, and a location.
    I apply a metadata pre-set to all the files I import.  This metadata includes my contact inf. and my copyright.
    I re-name all the files I import.  The file name includes the date, the Project's verbal identifier and location, and the original file name given by the camera that recorded the data.
    I assign a location to all the Images in each Project (easy, since "Project" = shoot; I just use the "Assign Location" button on the Project Inf. dialog).
    I _always_ apply a keyword specifying the genre of the picture.  The genres I use are "Still-life; Portrait; Family; Friends; People; Rural; Urban; Birds; Insects; Flowers; Flora (not Flowers); Fauna; Test Shots; and Misc."  I give myself ready access to these by assigning them to a Keyword Button Set, which shows in the Control Bar.
    That's the core part.  Should be "do-able".  (Search the forum for my naming conventions, if interested.)  Of course, there is much more, but the above should allow you to find most of your Images (you have assigned when, where, why, and what genre to every Image). The additional steps include using Color Labels, Project Descriptions, keywords, and a meaningful Folder structure.  NB: set up your Library to help YOU.  For example, I don't sell stock images, and so I have no need for anyone else's keyword list.  I created my own, and use the keywords that I think I will think of when I am searching for an Image.
    One thing I found very helpful was separating my "input and storage" structure from my "output" structure.  All digicam files get put in Projects by shoot, and stay there.  I use Folders and Albums to group my outputs.  This works for me because my outputs come from many inputs (my inputs and outputs have a many-to-many relationship).  What works for you will depend on what you do with the picture data you record with your cameras.  (Note that "Project" is a misleading term for the core storage group in Aperture.  In my system they are shoots, and all my Images are stored by shoot.  For each output project I have (small "p"), I create a Folder in Aperture, and put Albums, populated with the Images I need, in the Folder.  When these projects are done, I move the whole Folder into another Folder, called "Completed".)
    Sorry to be windy.  I don't have time right now for concision.
    HTH,
    --Kirby.

  • HELP!! Very Large Spooling / File Size after Data Merge

    My question is: If the image is the same and only the text is different why not use the same image over and over again?
    Here is what happens...
    Using CS3 and XP (P4 2.4GHz, 1GB RAM, 256MB video card) I have taken a postcard PDF (the backside), placed it in a document, then drawn a text box. Then I select a data source and put the fields I wish to print (Name, address, zip, etc.) in the text box.
    Now, under the Create Merged Document menu I select Multiple Records and then use the Multiple Records Layout tab to adjust the placement of this postcard on the page. I use the preview multiple records option to lay out 4 postcards on my page. Then I merge the document (it has 426 records).
    Now that my merged document is created with four postcards per page and the mailing data on each card, I go to print. When I print the file it spools up huge! The PDF I originally placed in the document is 2.48 MB but when it spools I can only print 25 pages at a time and that still takes FOREVER. So again my question is... If the image is the same and only the text is different why not use the same image over and over again?
    How can I prevent the gigantic spooling? I have tried putting the PDF on the master page and then using the document page to create the merged document, and still the same result. I have also tried creating a merged document with just the addresses and then adding the PDF on the Master page afterward but again, huge file size while spooling. Am I missing something? Any help is appreciated :)

    The size of the EMF spool file may become very large when you print a document that contains lots of raster data
    Article ID : 919543
    Last Review : June 7, 2006
    Revision : 2.0
    SYMPTOMS
    When you print a document that contains lots of raster data, the size of the Enhanced Metafile (EMF) spool file may become very large. Files such as Adobe .pdf files or Microsoft Word .doc documents may contain lots of raster data. Adobe .pdf files and Word .doc documents that contain gradients are even more likely to contain lots of raster data.
    CAUSE
    This problem occurs because Graphics Device Interface (GDI) does not compress raster data when the GDI processes EMF spool files and generates EMF spool files.
    This problem is very prominent with printers that support higher resolutions. The size of the raster data increases by four times if the dots-per-inch (dpi) in the file increases by two times. For example, a .pdf file of 1 megabyte (MB) may generate an EMF spool file of 500 MB. Therefore, you may notice that the printing process decreases in performance.
    RESOLUTION
    To resolve this problem, bypass EMF spooling. To do this, follow these steps:
    1. Open the properties dialog box for the printer.
    2. Click the Advanced tab.
    3. Click the Print directly to the printer option.
    Note: This will disable all print processor-based features, such as the following:
    N-up
    Watermark
    Booklet printing
    Driver collation
    Scale-to-fit
    STATUS
    Microsoft has confirmed that this is a problem in the Microsoft products that are listed in the "Applies to" section.
    MORE INFORMATION
    Steps to reproduce the problem
    1. Open the properties dialog box for any inbox printer.
    2. Click the Advanced tab.
    3. Make sure that the Print directly to the printer option is not selected.
    4. Click to select the Keep printed documents check box.
    5. Print an Adobe .pdf document that contains many groups of raster data.
    6. Check the size of the EMF spool file.

  • Keep (the same) folder content in sync with another mac

    Hi all,
    A friend and I work on the same project. We each have a folder on our Mac with all our files and subdirectories. We want to keep the work in both folders synced.
    How we can do this?
    What happens if we modify the same file?
    Thanks in advance,
    Filippo

    If you have a dotmac account you could use syncing to keep the files in sync with each other.
    You would have to turn on iDisk Syncing. Then put the files you are working on in that folder.
    Sync the other machine with iDisk and have iDisk Syncing switched on on that machine too, but have one machine set up as the 'Master' and one machine set up to sync manually. The Master machine should sync whenever any file changes; then you can get in touch with your friend and tell him when to sync, because if both machines sync automatically they might sync at the same time, causing an error.
    When the other machine does a 'Manual' sync, the files which have been replaced or updated will be changed to the new ones. He could sync manually when he has worked on a file too, and on yours it would update automatically.
    You would have to have a very good method in place and keep in contact regularly, but I think that would work. That is, if the machines are not in the same building? I assume they're not, as you would just have them shared on the network, wouldn't you?
    Mark

  • Movie very large (So Close yet so Far away)

    Ok so I'm about seconds away from finishing up my feature film. I have the menu and it's very large. The movie itself is 1 hour and 27 minutes. And there are 5 features adding up to about another 1 hour and 30 minutes.
    I compressed everything but still the actual movie takes up 3GB of the disc. And with everything it is about 5 GB.
    How do I make all of this fit in the quickest time possible?

    Have you tried using QuickTime directly? You can export from there as MPEG2 as well... you just don't get the bells and whistles that you get with Compressor.
    If you are likely to do a lot of this kind of work then I can really recommend that you invest in a different encoder as well - BitVice or MegaPEG.X are the two to use on a Mac. Both are a couple of hundred dollars or so... and well worth it!
    About Compressor not starting - if you look in the Compressor forum there are umpteen references to this problem. For the most part they can be resolved by re-starting your machine, which in turn re-starts the qmaster service. However, you can also do this from Terminal if you need to. This from a post a long time ago... not from me, I add:
    Launch Terminal (Application>Utilities>Terminal) and type this:
    sudo /Library/StartupItems/Qmaster/Qmaster
    Then hit return, then enter your administrative password. Then you should get a message about Qmaster services starting up.
    Re-launch Compressor, see if it works.
    Or
    go to finder, go, go to folder
    type /etc/
    view as icons
    locate "hostconfig" and COMMAND then click on fileand drag to desktop. To do this, you MUST press command FIRST then while holding it down click the file and drag to desktop – if you’ve clicked on the file first, the command click you will unselect the file. If you don't get it the first time, keep trying.
    Once on the desktop, rename the file to "hostconfig.old" all lower case
    COMMAND drag the newly named file back into the etc folder.
    That's it! Restart your machine and start-up compressor.
    If those two don't solve it, you'll need to hunt around for other solutions, I'm afraid.

  • Jtree - Problem creating a very large one

    Hi,
    I need to create a multi-level JTree that is very large (around 600-650 nodes all in all). At present I am doing this by creating DefaultMutableTreeNodes and adding them one by one to the tree. It takes a very long time to complete (obviously).
    Is there any way of doing this faster? The tree is being constructed from run-time data.
    Thanks in advance,
    Regards,
    Achyuth

    Two thoughts. First, I create 100's of nodes and it's pretty fast. It could be how you are creating/adding them. Make sure your code is not the bottleneck. If you are traversing the entire tree or some other odd way of doing it, that could be the problem. I only say that because I am able to create an object structure, parse XML as I create that structure and create all the nodes, and it's less than 1 second on a PIV 1.7GHz system. Maybe on slower systems it's a lot slower, but for me it's up immediately.
    Another way, however, is to keep a Map of "paths" to child items in memory. As you build your object from wherever, use a Map to keep path info. Now, add listeners to the tree for expand/collapse. Each time it is about to expand, build up the nodes and add them to the expanding node at that point. If it collapses, discard them. This does one of two things. First, for large trees, you aren't wasting tons of time building the tree at the start, and more so, it's probably likely that all those nodes aren't going to be expanded/shown right away. Second, it conserves resources. If your tree MUST open in a fully expanded state, well, then you may be out of luck. But I would much prefer expanding and building the child nodes at that moment, rather than do the whole thing first (see the sketch below).
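    A minimal sketch of that lazy-expansion approach, using Swing's stock classes. The child lookup here is a stand-in loop; real code would pull children from the Map of paths described above:

    import javax.swing.JFrame;
    import javax.swing.JScrollPane;
    import javax.swing.JTree;
    import javax.swing.SwingUtilities;
    import javax.swing.event.TreeExpansionEvent;
    import javax.swing.event.TreeWillExpandListener;
    import javax.swing.tree.DefaultMutableTreeNode;
    import javax.swing.tree.DefaultTreeModel;

    public class LazyTree {
        // Give every unexpanded node one dummy child so it shows an expand handle.
        private static DefaultMutableTreeNode nodeWithDummy(String name) {
            DefaultMutableTreeNode n = new DefaultMutableTreeNode(name);
            n.add(new DefaultMutableTreeNode("loading..."));
            return n;
        }

        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                DefaultMutableTreeNode root = nodeWithDummy("root");
                DefaultTreeModel model = new DefaultTreeModel(root);
                JTree tree = new JTree(model);

                tree.addTreeWillExpandListener(new TreeWillExpandListener() {
                    @Override public void treeWillExpand(TreeExpansionEvent e) {
                        DefaultMutableTreeNode node =
                                (DefaultMutableTreeNode) e.getPath().getLastPathComponent();
                        node.removeAllChildren();       // drop the dummy (and any stale children)
                        for (int i = 0; i < 10; i++) {  // stand-in for the real child lookup
                            node.add(nodeWithDummy(node + "/child" + i));
                        }
                        model.nodeStructureChanged(node);
                    }
                    @Override public void treeWillCollapse(TreeExpansionEvent e) {
                        // optionally discard children here to conserve resources
                    }
                });

                JFrame f = new JFrame("Lazy JTree");
                f.add(new JScrollPane(tree));
                f.setSize(300, 400);
                f.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                f.setVisible(true);
            });
        }
    }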

  • Can I get a very large (72GB) SSD?

    Can I get a very large (72GB) SSD? Price is currently no issue. Speed and capacity are the two highly important factors.

    This question is not particularly related to OS X Server, so you probably won't get the widest audience for your storage hardware question here; Apple doesn't offer SSD storage arrays, and big FC SAN arrays aren't all that common on OS X and OS X Server systems.
    As for your question...
    The HP 3PAR StoreServ 10000 SSD array will support up to 3.2 petabytes (that's ~3,200 terabytes), but you're probably going to need somebody to test and qualify it for you, or you'll have to qualify it yourself — primary connection into that would be FC SAN via 8 Gbps FC SAN HBA.    Here are some smaller arrays; 3PAR 7450 220 TB and this 3PAR 7000 (not sure of the capacity there).
    Contact ATTO, Promise, or one of the other organizations that specialize in high-end storage for OS X if you want somebody that can work with you here, to qualify and support this configuration should you need that — they'll also be involved in establishing a connection from your Mac to the FC SAN, via Thunderbolt or via FC SAN PCIe HBA or such.
    If price is an object, and if this question is related to this how much storage do I need? question, these 3PAR SSD arrays are very likely massive capacity and budget overkill — SSD arrays aren't yet cheap enough to provide archival storage for most places, and your OS X Systems probably can't even generate enough I/O to keep these 3PAR arrays busy.   (Enterprise-class gear such as these 3PAR arrays can be fairly gonzo in its bandwidth and I/O capabilities.)
    For typical backup configurations I work with, that's usually either a decent-sized disk array — possibly with a mix of flash cache, if necessary — and enough spindles on the disks to get you the I/O bandwidth necessary, then possibly with tape backups for longer-term and off-site storage. 
    Most folks that really need I/O performance will use SSDs for secondary storage (after using main memory and processor caches for all they're worth), and then use bigger (and somewhat slower) disks for tertiary storage (with RAID-10 or virtual RAID across parallel disks for speed and hardware reliability — SATA disks are fairly common here, because they're big and cheap), and then out to LTO Ultrium tape (LTO-6 provides ~2.5 TB per cartridge uncompressed, and claims ~2.5:1 compression) or maybe uploaded to Amazon Glacier or similar cloud services for long-term and off-site storage requirements.   (I've yet to work at a site that's using SSD for classic backups, though there are sites that are using these arrays for database transaction logs and other backup-like purposes.)
    Also look at archiving the data to a remote site, if it's important data.  Either tapes or (less desirably) disks via courier, or transferred via a fat network link, if you can afford that.

  • Iphone 3GS displaying icons very largely.......

    I am facing a very strange problem: my iPhone model 3GS is displaying icons on the screen much larger in size than the usual format.
    I tried to solve the problem by switching it on/off, and tried the settings options.
    But I did not go for restoring the factory settings, because of the doubt that I may lose all the contacts saved in the phone.
    Please advise any solution, and clear up the doubt about losing the saved contacts if I apply the factory reset option.

    If you select erase all content and settings on your iPhone, this does exactly as described - it will erase all content and settings - everything but the firmware/included software.
    If you restore your iPhone with iTunes as a new iPhone or not from your iPhone's backup, the same applies.
    Contacts are designed to be synced with a supported application on your computer. With Windoze, this can be with Outlook 2003 or 2007 along with syncing calendar events, or with Windows Contacts for syncing contacts only - the address book used by Windows Mail with Vista.
    If you aren't currently using a supported application on your computer for contacts (the address book is empty), enter one contact in the application on your computer before syncing contacts with it. Make this contact up if needed; it can be deleted later. This will provide a merge prompt with the first sync for this data, which you want to select. Contacts between the two will be kept synchronized with following syncs. Syncing contacts with a supported application on your computer is selected under the Info tab for your iPhone sync preferences with iTunes. You can do this before restoring your iPhone with iTunes, which will also give you access to your contacts on your computer.

  • Editing very large MP3 Files

    Hello,
    Sorry for my bad English! I am French!
    I recently bought a Sony digital recorder (ICD-PX312) that records MP3 files.
    I recorded my noisy neighbour, in continuous mode for three days. (I was not at home)
    The PX312 recorder has produced several files whose maximum size is 511,845 kB, corresponding to a length of 24h 15min.
    Every file, checked with MediaInfo, has the same properties (48kbps, 44.1kHz).
    I can read them with VLC media player without any problem, but I need to edit these files to keep only the interesting parts.
    If I open (drag and drop or open file) these files with Audition 1.0 (I came from Cool-Edit), I found a very strange behavior.
    The 24h 15min files are opened with a length of 1 hour and a half or so!
    I gave it a try opening with Audition 3.0 and the result is strange too, and more "funny":
    The length of one 24h 15min file is 1 hour and a half.
    The length of another 24h 15min file is 21 hours or so.
    In the Audition TMP directory, Audition has created several 4 GB temporary files that correspond to the limit of WAV files.
    I made a test with a 128kbps, 44.1kHz file. This 511,845 kB file is 9h 5min long.
    Audition reads it as 4h 40min.
    The TMP file is 2,897,600 kB, far below the 4 GB limit.
    It seems Audition 1 and 3 (I read CS5 has the same problem too) do not split the WAV conversion very well.
    Is it due to Audition itself or to the Fraunhofer MP3 codec?
    Is there any workaround from Adobe?
    As I am an AVS4YOU client, I tried AVS Audio Editor. It opens every file with the correct length but does not have the needed editing functions. That demonstrates that it is possible to split very large files.
    Many thanks in advance for any help or idea, because Audition is the editor I need!
    Best Regards

    SteveG wrote:
    It's not their 'bug' at all. MP3 files work fine, and Adobe didn't invent the WAV file format, Microsoft did. It's caused by the 32-bit unsigned integer used to record the file size in the header incorporated into WAV files, which limits their size to one that can be described within that limit - 4GB.
    Maybe I was not clear enough in my explanation.
    I agree partly with you.
    I agree with you when you write that the 4 GB limit is inherent to the Microsoft WAV format.
    When reading 24h MP3 files, Audition 3.0 creates, in the TMP folder, two chunks of 4 GB and another smaller chunk, to try to overcome the limit. This cutting process is exactly what I am expecting from such software.
    The problem - and the bug - is that the duration that is "extracted" in Audition is smaller than that of the original MP3 (e.g. 21h 30min instead of 24h 15min). Some part of the original MP3 file has been skipped in the cutting/conversion process.
    I don't think Microsoft is involved in this "cutting" process, which belongs to Adobe and/or Fraunhofer. This is why I am surprised.
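    For anyone wondering where the 4 GB figure in SteveG's explanation comes from, here is a quick illustrative calculation (plain arithmetic, not anything from Audition's or Microsoft's code):

    public class WavLimit {
        public static void main(String[] args) {
            // The RIFF/WAV header stores the chunk size as a 32-bit unsigned integer,
            // so the largest byte count it can describe is 2^32 - 1.
            long maxWavBytes = 0xFFFFFFFFL; // 4,294,967,295 bytes
            System.out.printf("Max WAV size: %,d bytes (~%.2f GiB)%n",
                    maxWavBytes, maxWavBytes / (1024.0 * 1024 * 1024));
            // ~24h of MP3 decoded to uncompressed PCM far exceeds this, which is
            // consistent with Audition writing several ~4 GB temp chunks as described above.
        }
    }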
