Handling large projects in FCPX

I am working on a documentary that will end up at about 60 minutes in HD quality. The project is now about 25 minutes. Working with it becomes progressively slower as the files get too big, so the background jobs take too much time.
I am considering splitting the project into 3 or 4 other projects and then putting them together at the end. Is that a good idea?
Or is it better to split the current project into compound clips and then edit the compound clips separately on the timeline?
The problem in that case is that if the duration of your compound clip increases (or decreases) during the editing process, you get into trouble when returning to the original project. What is the best workflow for large projects?

THX Russ. Indeed I have a lot of footage, and I am using 2 Thunderbolt external drives of several TB each. The project itself is on one of the hard discs of my 27-inch iMac (mid-2011, OS X Yosemite, 3.4 GHz Intel Core i7 processor, AMD Radeon HD 6970M 2048 MB video card).
I have just split the project into several compound clips covering the various episodes. So at this moment the timeline of the project is very simple: just a couple of compound clips separated by black generator clips. Only the part at the end of the project, which I am working on, is not (yet) a compound clip. It helps a lot; the background rendering is only engaged with the part I am working on! So I think this works fine for a large project, without the need to copy and paste several projects together later on, with the risk of losing parts during the process (which I had before!!!).
The trick is that if you want to change something in the previous compound clips, you have to take care to open the source compound clip that is listed in the library! You open that in the timeline and you can change whatever you want, no matter whether the duration changes. When you are done, you open the project in the timeline and the changes you have made in your compound clip are nicely copied into the project.
The mistake I made was that I opened the compound clip in the project on the timeline and then tried to edit it. Then you can easily get into trouble if the duration of the clip changes in the editing process!

Similar Messages

  • RH7 and large project file handling

Hello
Does RoboHelp 7 handle a larger number of topic files than earlier versions? I am investigating whether it might be worthwhile upgrading from X5 to 7, and the possibility of being able to handle a project with over 3000 topics would be a distinct advantage.
Thanks in advance
M.

Welcome to the forum.
I don't think any change has been made there, but why would it need to be? 3,000 topics has never been an issue. I am running merged webhelp and two of the projects have around 4,000 topics. RH7 makes it easier to work with the topics. See my site if you want some more information about it.

  • Very large project management - need advice?

I am editing a series of six one-hour episodes. I am doing this with a single library and about 20 events. Each event holds the footage keywords for the original ProRes source files. I have the 6 major episode timelines created and also about 20 more timelines for interviews (I have 80 interviews that are subtitled). My FCP library file is 1.82 TB in size.
I am running a new Mac Pro and a new 8 TB RAID 0 drive (2x4 TB). This started to freeze once or twice a day while editing.
I am now thinking of making this into 6 libraries, one for each episode, with just the 3 or 4 timelines (sequences) needed for each episode. Would this help? I read that FCPX cannot handle a large project like mine and long timelines?
    Thanks.

Jon Braeley wrote:
My source files are 99% ProRes, so I leave the files in place and do not optimize.
Just to point out that FCP X should not allow optimizing of iFrame media, which of course ProRes is.
Have you seen any info on splitting into smaller libraries? I have not seen anything on this.
I just recently did this to reverse course on an early organizational decision I'd come to regret. Basically I created new libraries and moved events to them from within the app.
But I would definitely consider Luis' suggestion about consolidating and working with external media.
Russ

  • Large project in Premiere CS3 causes Windows XP64 to reboot

    Ok...I have a rather sizeable project going in Premiere CS3.  It contains a rather large number of animations created in Autodesk Maya (most between 10 seconds and 3 minutes long) and a large number of stills, as well as scrolling background (a .mov file I created using Photoshop and Premiere) and a soundtrack created by myself (in Cakewalk/Sonar).  The total project is about 18 1/2 minutes in length....around 8 video tracks and 3 audio tracks at the moment. The main problem is that when I start working on it...either importing files and especially moving files around in the project or even just hitting the playback button, the system just reboots.  I have seen the infamous Windows "blue screen of death" flash briefly once or twice, but it doesn't hang...just goes right to the system BIOS screen.  The ONLY thing that seems to help is basically saving the project like every 30 seconds.  Import a file, save, conform file to widescreen, save, bring file into project, save, move file around a bit, save, add crossfade, save....you get the idea.
Now before I go too much further here, I suspect the problem isn't actually with Adobe/Premiere, as I had this happen a few times while I was creating the soundtrack in Sonar 4 as well.  The soundtrack is basically a pseudo-orchestral piece and is likewise around 18 1/2 minutes long...currently with around 42 tracks of audio and 5 MIDI tracks.  The audio has been sampled down to a standard stereo WAV file for the import into Premiere. And yes, as the audio portion of this project got bigger and bigger, it caused a few reboots as well (not to mention drop-outs, stutters and all the other fun stuff associated with audio production).  With that, I was able to "compact" the Sonar file (basically a Sonar project defrag) and the reboot problem seems to have gone away on that end.
The system itself SHOULD be able to handle this project...not top of the line, but pretty beefy: 2nd generation Intel i5 quad core, 16 gigs of Kingston DDR-1600 RAM, Tascam US-428 for the audio and a good 2+ terabytes across 5 hard drives (3 internal 3 Gb-per-second SATAs, 2 USB).  Also has a 4 gig HIS video card with ATI chipset (still can't afford a FirePro yet).  While Premiere is installed on the (comparatively dinky) 160 gig C drive, I'm using the 500 gig internal E drive and a 500 gig USB drive as my work drives.  Most of the Maya animations are on the internal E drive but the rest...the stills and Premiere files and such...are on the USB drive.  I.e., the system itself should have plenty of rendering power for this.  Also, I have replaced the C drive with a nearly identical twin.  After a recent crash, I finally broke down and got a backup C drive...in other words, I'm sure the drive(s) are a-ok...I even put a brand new 500 watt power supply in just in case.
I have already re-installed XP64...a couple of times in fact, as just before I started this project one of those god-forsaken MS "critical updates" crashed the crap out of my computer.  Then on the first reinstall, something got hosed in the .NET framework during the "night of a thousand updates", so I had to scrap it and start the reinstall all over again.  Likewise, I reinstalled Premiere CS3 (along with everything else, including Maya) once I finally had the computer running again.
A quick note here: I did create my own custom 1080 Hi-Def widescreen settings for Premiere since CS3 didn't appear to have any such thing.  For short projects they seem to work just fine...although admittedly, the render time is a serious b*tch.  Likewise, all the Maya animations have been rendered out at 1080 widescreen, production quality (yea...the render time there was a killer too).
Another quick note:  While I'm trying to use Premiere exclusively while I'm working, on occasion I have had to open either Photoshop or Maya to deal with a couple of stills...and when I open something else, I get a "Windows virtual memory too low" warning (16 gigs of RAM and the virtual memory is too low??).  Otherwise, except for Avira (virus scanner) I don't really have anything else running in memory...no screen savers, no desktop pictures, no Instant Messenger crap, etc.  Never cared for bells and whistles and such, so the system is pretty clean in that regard (and I do keep up on my defrags as well).
Finally, while I -am- grateful for any helpful suggestions regarding this, please do NOT simply tell me to upgrade something (Premiere, Windows, etc.), because point blank: it ain't gonna happen.  Aside from the fact that Adobe seems to have made Premiere CS4 and up much more difficult to use (I have -many- niggles about the newer interfaces there...using CS5.5 at the college and it's REALLY annoying), and aside from my wife being an MS programmer/developer and having to listen to her complain about Windows 7 and 8, we're in the process of buying a new house, so any and all "disposable income" is nil...and will be for quite some time.  No money...zero...nada...i.e. no upgrades.  Further, while granted this is certainly the biggest project I've done to date, I've run this setup/configuration for at least the past 3 years now without a hitch...this all seems to have started with that last MS critical update (at the risk of sounding paranoid, I'm really starting to think they slid a nasty into that last one to FORCE people to upgrade since they're discontinuing XP64 support soon).
Alrighty...I know that's a lot, but I wanted to provide as many details as I could here in hopes that someone can help me get a handle on this reboot issue.  It's getting to be REALLY annoying having to save every 30 seconds or so.  I have searched the help files, but got tired of randomly sorting through all the blather and not finding anything helpful.
    I'm grateful for any help here...thanks!
    Jim

Well, I'm about 98% sure it isn't a hardware problem.  I had already pulled the RAM and tried some backup RAM I have sitting in the box (8 gigs of Corsair), went through the hard drives weeks ago with Chkdsk and everything was a-ok there (a couple of bad clusters on the old C drive, but that has since been replaced and is on the shelf as a backup), the motherboard seems to be fine, I re-seated all cards and cables, etc.  Again, I even slapped a new 500 watt PS in just in case (the old one was a 400 watt, and with all the crap I have on this system, I thought it might have been a voltage drop-off issue...nada).   I did have a bit of a heat problem a while back...one of the case cooling fans was going bad and the CPU was getting just hot enough to trip the system (about 5 degrees over).  That said, that also shows up in the BIOS monitor and was easy enough to track down.  Either way, the bad cooling fan has been replaced and I even added a "squirrel cage" style fan as well...the system is running nice and cool now.
BTW...I used to be A+ certified on both PCs and Macs when I was a hardware tech (not to mention HP, IBM and Lexmark certified, Cat 5 installer, etc.), so I like to think I know my hardware fairly well.  Not just a weekend warrior on that front...I used to do it for a living.
I had also already tried manually resizing the paging file...tried different sizes, tried spreading it out across the drives, etc.  It didn't really make a difference either way, and again, the only time the "virtual memory low" warning comes up is if I try to open Photoshop or Maya while this big project is open in Premiere.  That said, Photoshop and Maya both tend to be pigs when it comes to memory/RAM (Photoshop especially).  It doesn't seem to happen with Illustrator or any other program...just PS and Maya if this project is open in Premiere.
One other oddity I have noticed recently is actually with Maya.  Before all of this started...in other words, before that last MS critical update and 3 days of reinstalling software...I used to be able to do a background render with Maya and still keep working on stuff.  I'd fire off a batch render and then I could move on to other things...editing images in Photoshop (CS5 there, if it matters), surfing the web and even doing some light stuff in Premiere.  Since this problem has been occurring, however, I have noticed that the Maya batch renders are REALLY slowing down the system big time.  It's difficult to even check my email while a batch is running.  Likewise, I have noticed that since I upgraded Maya from 2011 to 2013, Maya has a nasty tendency to freeze during a batch render.  If I have a long batch render, I'll set it up to run overnight...when I get up the next morning, the render itself will still be running (you can see Mayabatch.exe still going in the processes under Task Manager) but Maya is totally frozen up.  Now, I did that update at the same time as the rest of this mess happened, so I don't know if these problems are due to the Maya upgrade or the (apparent) Windows problem...I'm inclined to believe the latter.
BTW...after having a project file get corrupted yesterday during one of these shutdowns, I've been double-saving the files (so as to have a backup) and that too seems to have made things a bit better.  It hasn't actually shut down on me when I've been doing this.  That said, it's also a serious pain in the butt having to save that way.
With that, looking through the rest of Hunt's article, I did kill the indexing services...at least on the work drives.  It's about the only thing there that I haven't tried, so we'll see what happens this afternoon.  I seriously doubt that would be the problem, though, as usually that would give an error or a lockup and not an instant shutdown, but we'll see.

  • Best way to handle large amount of text

Hello everyone,
My project involves handling a large amount of text (from conferences and reports).
Most of the documents are in MS Word. I can turn them into RTF format.
I don't want to use scrolling. I prefer turning pages (next, previous, last, contents), which means I need to break the text into chunks.
Currently the process is awkward and slow.
I know there must be lots of people working on similar projects.
Could anyone tell me an easy way to handle the text, bring it into the cast and break it up?
Any ideas would be appreciated.
Thanks,
Ahmed

Hacking up a document with Lingo will probably lose the RTF formatting information.
Here's a bit of code to find the physical position of a given line of on-screen text (counting returns is not accurate with word-wrapped lines). This strategy uses charPosToLoc to get the actual position for the text member's current width and font size:
maxHeight = 780 -- arbitrary display height limit
T = member("sourceText").text
repeat with i = 1 to T.line.count
  endChar = T.line[1..i].char.count
  lineEndlocV = charPosToLoc(member("sourceText"), endChar).locV
  if lineEndlocV > maxHeight then -- found the "1 too many" line
    -- extract the identified lines from "sourceText"
    singlePage = T.line[1..i - 1]
    -- put the remaining text back into the source text member,
    -- then perhaps repeat the parse with the remaining part
    member("sourceText").text = T.line[i..99999]
    exit repeat
  end if
end repeat
Alternatively, you could use one of the roundabout ways to display PDF in Director. There might be some batch PDF production tools that can create your pages in a nicely scalable PDF format.
I think FlashPaper documents can be adapted to Director.
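Outside Director, the page-splitting idea in the Lingo snippet above can be sketched in a few lines of Python. This is a simplified stand-in, not the Lingo technique itself: instead of measuring rendered line positions with charPosToLoc, a fixed lines-per-page count plays the role of the height test, and the function name and page size are made up for the example:

```python
def paginate(text, lines_per_page=40):
    """Split text into pages of at most lines_per_page lines.

    A simplified stand-in for the Lingo height test: instead of
    measuring rendered positions with charPosToLoc, we just count
    lines per page.
    """
    lines = text.splitlines()
    return ["\n".join(lines[i:i + lines_per_page])
            for i in range(0, len(lines), lines_per_page)]

# Demo: 100 lines at 40 per page gives 3 pages (40, 40, 20 lines)
pages = paginate("\n".join(f"line {n}" for n in range(100)))
```

Each element of the result is one "page" that could be loaded into the display member when the user presses next or previous.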

  • Best practices code structure for large projects?

Hi, I come from the Java world, where organizing your code is handled conveniently through packages. Is there an equivalent in Xcode/Objective-C? I'd rather not lump all my observers, entities, controllers, etc. in one place under "Classes"...or maybe it doesn't matter...
    If anyone could point me to a document outlining recommended guidelines I'd appreciate it.
    Thanks! Jon

If you have a small project, you can set up Groups in Xcode to logically organize your files. Those Groups do not necessarily have to correspond to any directory structure. I have all my source files in one directory but organize them into Groups in Xcode.
If you have a larger project, you can do the same thing, but with the code organized into actual directories. Groups can be defined to be relative to a particular directory.
If you really do have a large project, you should organize things the same way as in Java. Your "packages" would just be libraries, either static or dynamic.
As far as official guidelines go, there really aren't any. It would be best to stick to the Cocoa Model-View-Controller architecture if that is the type of application you are working on. For other software, you can do it however you want, including following something like Sun's guidelines if you want.

  • Cp 5 crashes during publishing of large projects

I have large Captivate 5 projects with 300-450 slides divided into 2 to 4 slide groups.  When I publish the SWF file, the publishing process always freezes at 70% and the application crashes (I can't hit Cancel in the progress window).
    This is not due to any particular file; many of my large projects run into this problem.
    This is probably not due to a particular slide since all the projects freeze at 70% and I can publish each slide group in its own project without any trouble.
    The projects are straightforward demonstrations; no quizzing.
Scouring the net, I've had trouble finding a resolution to the problem. I use Captivate 5.0.3.631.
    Hoping for ideas. Thanks for reading
    -Patrick
    PS.
    I've tried:
    > In publish dialog "Advanced Options", to enable "force re-publish all the slides"
    > In Preferences > "SWF Size and Quality", to disable "Advanced Project Compression"
    > In Preferences > "SWF Size and Quality", to disable "Compress SWF File"
    > In Preferences > "SWF Size and Quality", to enable "Compress Full Motion Recording SWF file"
    > to remove the slide groups and leave the slides ungrouped
    > In Publish dialog, to publish as .f4v instead of swf
    > to hide all the slides in a slide group in order to reduce the publish SWF to 200 or 250 slides.
    None of these have worked.

    Vikranth --
    Through process of elimination, I have found the offending slide -- Slide 55 in the Captivate 5 version, which incorporates several fill-in-the-blank questions.  When I hide that one slide, everything previews and/or publishes as it should.  I had my colleague test her original Captivate 4 version in Captivate 4, and that slide worked just fine (it is however a slightly later slide number) -- the entire project previewed as expected.  I then had her import her orginal file into Captivate 5 and do the same thing, and it crashed for her as well.
    So, it must be something about the way that particular slide is setup that screws up somehow in the conversion from a Captivate 4 file to a Captivate 5 file.  I'll just handle that question differently to fix the issue at hand, but do you have any idea why this problem might have occured in the first place?  It seems that the patch didn't correct this situation as it corrected most of the other Captivate 4 to Captivate 5 issues.
    Thanks to everyone for their help.  I had to play detective and isolate the issue, and haven't solved it, but at least now I have a starting point to try and do the slide in a different way to work around the issue.

  • Importing FCP7 projects in FCPX 10.1.1

    Hello
    I just installed Mavericks and FCPX 10.1.1 on another boot drive to test if I can handle the switch to FCPX.
    Is it possible to import any FCP7 projects into FCPX ?
    Thought it's best to ask before trying it and messing something up. They are pretty intricate edits for professional use.
    Thanks
    Nick

You need to read the info from Assisted Editing, particularly "What loses something in translation" and "What gets lost in translation". Since you are exporting an XML from FCP7, there are some things in the original sequence, such as effects on clips, that may or may not make it to FCP X.
Here is their info on what does and does not transfer:
    http://assistedediting.intelligentassistance.com/7toX/about.html
    MtD

  • Encore DVD 2.0 memory errors on large project

    Hello,
    I'm compiling a DVD for commercial distribution using Encore 2.0.
    It's a large project in that there are 4 x 3 minute video clips, 2000 timeline-arranged audio files, motion menus with audio and quite a lot of buttons.
Encore is misbehaving quite badly - it has problems with audio clips that are not in the standard format (48,000 Hz, 768 kbps bitrate, PCM audio), and the project only previews briefly before 'internal memory error 0x000002' comes up.
The workstation is an AMD64 dual-core 3800 on a Gigabyte skt 939 board, with a GeForce 6600 graphics card and 2 GB of DDR RAM. So no hardware problems, I think. And yes, it is an SSE2 processor - it'd have to be to run Encore at all.
    I can't even build a DVD image due to the apparent size of the project. My client is certain that all of the audio timelines are necessary to the project, but I know that deleting them and cutting the size of the project down will cure the problems I'm having.
    I don't know if there is any obvious reason that Encore can't handle the project - Adobe Updater does not have any new updates for Encore so I assume that this is as good as I'm gonna get.
    Any similar experiences? Any ideas? Thanks in advance,
    Ed Brown
    myweb.tiscali.co.uk/iline
    [email protected]

You're right - some of the files are mono, and I have been batch-processing them up to 48 kHz stereo, thus doubling the bitrate.
    Originally the audio files were recorded in-camera during interviews. The audio of these interviews was chopped up and edited, and animation was produced to complement the selected sections of the audio interviews. Now we're putting the animation series (called Terra 2050) onto DVD and I'm in charge of the project. For the DVD extras, we wanted to include the full audio interviews.
    So I set up a basic DVD with Play All, Select Episode etc. Then I started to add a timeline for each person, dumped all of that person's interview onto the audio track, made a PSD of that person's name, so by choosing "Colin Pillinger" on the Interviewees menu, the DVD should jump to a timeline that plays a still saying "Colin Pillinger - Professor of Planetary Science" while Colin's interview plays over the top.
    The audio files are only half the problem, however.
    The structure is as follows:
    Top menu: Play All - Select Episode - Extras Menu
    Extras Menu: Behind the Scenes - Audio Interviews
    Audio Interviews: (a long list of buttons, with each person's name on each button, which links to that person's timeline)
    Each timeline contains; video track contains a PSD of that person's name; audio track contains several audio clips of that person's interview.
    So I do not have more than 99 timelines or 99 chapters in any one timeline. I think the project is too large to be cached in one go for Preview purposes, but I know it all works properly according to the Flowchart/Check Project windows.
    Should I just switch to Sonic Scenarist and stick my head in a DVD architecture manual for a month? Encore doesn't seem to handle 'large' (read professional quality) projects at all well.
    Thanks so much for your input - I really appreciate your help.
    Ed

  • How to create event handler in project online

How do I create a remote event handler for Project Online?
I want to create an OnProjectCreating event handler using CSOM. Need help.

    Hi Abidulla,
    Here is a good post from UMT for you to start.
    http://www.umtsoftware.com/blog/2013/08/01/project-server-2013-remote-event-handlers/
    Hope this helps,
    Guillaume Rouyre, MBA, MCP, MCTS |

  • Ways to handle large volume data (file size = 60MB) in PI 7.0 file to file

    Hi,
In a file-to-file scenario (flat file to XML file), the flat file is picked up by FCC and then sent to XI. In XI it performs message mapping and then an XSL transformation in sequence.
The scenario works fine for small files (up to 5 MB), but when the input flat file is more than 60 MB, XI shows lots of problems, like (1) a JCo call error, or (2) sometimes XI even stops and we have to start it manually again for it to function properly.
Please suggest some way to handle large volumes (file size up to 60 MB) in a PI 7.0 file-to-file scenario.
    Best Regards,
    Madan Agrawal.

Hi Madan,
If every record of your source file is processed in a target system, maybe you could split your source file into several messages by setting this up in the Recordsets per Message parameter.
However, you just want to convert your .txt file into an .xml file. So first try setting the EO_MSG_SIZE_LIMIT parameter in SXMB_ADM.
This could solve the problem in the Integration Engine, but the problem will persist in the Adapter Engine, I mean, the JCo call error...
Take into account that the file is first processed in the Adapter Engine (File Content Conversion and so on) and is then sent to the pipeline in the Integration Engine.
Carlos
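The record-splitting idea Carlos describes (one big flat file becoming several smaller messages) can be illustrated outside PI with a short Python sketch. This is only an analogy, not PI code: the records_per_chunk argument plays the role of the Recordsets per Message parameter, and the function name is made up:

```python
import tempfile

def split_records(path, records_per_chunk):
    """Yield lists of at most records_per_chunk lines from a flat file;
    each list could be sent onward as its own, smaller message."""
    chunk = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            chunk.append(line.rstrip("\n"))
            if len(chunk) == records_per_chunk:
                yield chunk
                chunk = []
    if chunk:                      # emit the final, possibly partial chunk
        yield chunk

# Demo: a flat file of 10 records split into chunks of 4
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("\n".join(f"record {n}" for n in range(10)))
chunks = list(split_records(f.name, 4))  # chunk sizes 4, 4, 2
```

The point is that the integration layer only ever holds one small chunk in memory at a time instead of the whole 60 MB file.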

  • How do I handle large resultsets in CRXI without a performance issue?

    Hello -
    Problem Definition
    I have a performance problem displaying large/huge resultset of data on a crystal report.  The report takes about 4 minutes or more depending on the resultset size.
    How do you handle large resultsets in Crystal Reports without a performance issue?
    Environment
    Crystal Reports XI
    Apache WebSvr 2.X, Jboss 4.2.3, Struts
    Java Reporting Component (JRC),Crystal Report Viewer (CRV)
    Firefox
    DETAILS
    I use the CRXI thick client to build my report (.rpt) and then use it in my webapplication (webapp) under Jboss.
    User specifies the filter criteria to generate a report (date range etc) and submits the request to the webapp.  Webapp  queries the database, gets a "resultset".
    I initialize the JRC and CRV according to all the specifications and finally call the "processHttpRequest" method of Crystal Report Viewer to display the report on browser.
    So.....
    - Request received to generate a report with a filter criteria
    - Query DB to get resultset
    - Initialize JRC and CRV
    - finally display the report by calling
        reportViewer.processHttpRequest(request, response, request.getSession().getServletContext(), null);
The performance problem is within the last step.  I put logs everywhere and noticed that the database query doesn't take too long to return the resultset.  Everything processes pretty quickly until I call processHttpRequest of CRV.  This method just hangs for a long time before displaying the report in the browser.
    CRV runs pretty fast when the resultset is smaller, but for large resultset it takes a long long time.
I do have subreports and use Crystal Reports formulas on the reports.  Some of them are used for grouping as well.  But I don't think subreports are the real culprit here, because I have some other reports that don't have any subreports, and they too get really slow displaying large resultsets.
    Solutions?
    So obviously I need a good solution to this generic problem of "How do you handle large resultsets in Crystal Reports?"
    I have thought of some half baked ideas.
A) Use external pagination and fetch data only for the current page being displayed.  But for this, CRXI must allow me to create my own buttons (previous, next, last), so I can control the click event and fetch data accordingly.  I tried capturing events by registering the event handler "addToolbarCommandEventListener" of CRV.  But my listener gets invoked "after" the processHttpRequest method completes, which doesn't help.
Somehow I need to be able to control the UI by adding my own previous page, next page, and last page buttons and controlling their click events.
B) Automagically have CRXI use JavaScript functionality to allow browser-side page navigation.  So maybe the first time it'll take 5 minutes to display the report, but once it's displayed, the user can go to any page without sending the request back to the server.
C) Try using Crystal Reports 2008.  I'm open to using this version, but I couldn't figure out if it has any features that can help me do external pagination or anything else that can handle large resultsets.
    D) Will using the Crystal Reports Servers like cache server/application server etc help in any way?  I read a little on the Crystal Page Viewer, Interactive Viewer, Part Viewer etc....but I'm not sure if any of these things are going to solve the issue.
    I'd appreciate it if someone can point me in the right direction.

Essentially the answer is: use smaller resultsets, or pull from the database directly instead of using resultsets.
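As a generic illustration of "smaller resultsets", here is a minimal Python/sqlite3 sketch of external pagination: each request fetches only one page of rows with LIMIT/OFFSET rather than pulling the whole table into memory. The table and column names are made up for the example, and this is not Crystal Reports API code:

```python
import sqlite3

def fetch_page(conn, page, page_size):
    """Fetch only one page of rows; the report layer then works with
    this small resultset instead of the full table."""
    cur = conn.execute(
        "SELECT id, amount FROM orders ORDER BY id LIMIT ? OFFSET ?",
        (page_size, page * page_size),
    )
    return cur.fetchall()

# Demo with an in-memory table of 10 rows
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders (amount) VALUES (?)",
                 [(i * 1.5,) for i in range(10)])
page0 = fetch_page(conn, 0, 4)  # first full page of 4 rows
page2 = fetch_page(conn, 2, 4)  # last, partial page of 2 rows
```

In the web application, the page number would come from the user's previous/next clicks, so each request touches only page_size rows.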

  • Best practices for handling large messages in JCAPS 5.1.3?

    Hi all,
We have run into problems while processing large messages in JCAPS 5.1.3. Or rather, they are not that large really, only 10-20 MB.
    Our setup looks like this:
We retrieve flat-file messages from an FTP server. They are put onto a JMS queue and are then converted to and from different XML formats in several steps, using a couple of jcds with JMS queues between them.
It seems that we can handle one message at a time, but as soon as we get two of these messages simultaneously, the logicalhost freezes and crashes in one of the conversion steps without any error message reported in the logicalhost log. We can't relate the crashes to a specific jcd, and it seems that memory consumption increases A LOT for the logicalhost process while handling the messages. After a restart of the server, the messages that are in the queues are usually converted OK. Sometimes, however, we have seen that some messages seem to disappear. Scary stuff!
    I have heard of two possible solutions to handle large messages in JCAPS so far; Splitting them into smaller chunks or streaming them. These solutions are however not an option in our setup.
    We have manipulated the JVM memory settings without any improvements and we have discussed the issue with Sun's support but they have not been able to help us yet.
    My questions:
    * Any ideas how to handle large messages most efficiently?
* Any ideas why the crashes occur without any error messages in the logs?
    * Any ideas why messages sometimes disappear?
    * Any other suggestions?
    Thanks
    /Alex

* Any ideas how to handle large messages most efficiently?
Strictly speaking, if you want to send the entire file content in the JMS message, then I don't have an answer to this question. Generally we use the following process: after reading the file from the FTP location, we just archive it in a local directory and send a JMS message to the queue which contains the file name and file location. In most places we never send the file content in the JMS message.
* Any ideas why the crashes occur without error messages in the logs?
Whenever the JMS IQ Manager's memory usage gets high, logicalhosts stop processing. I will not say it is down; they stop processing, or processing might take a lot of time.
* Any ideas why messages sometimes disappear?
Unless persistence is enabled, I believe there is a high chance of losing a message when a logicalhost goes down. This is not always the case, but we have faced a similar issue when the IQ Manager was flooded with a lot of messages.
* Any other suggestions?
If the file is large, it is better to stream the file to a local directory from the FTP location and send only the file location in the JMS message.
Hope this helps.
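The approach described above, archive the file locally and send only its name and location in the JMS message, is often called the claim-check pattern. A language-neutral Python sketch of the round trip (queue.Queue and a temp directory stand in for the real JMS queue and archive; all names are illustrative, not JCAPS APIs):

```python
import pathlib
import queue
import shutil
import tempfile

# Stand-ins for the real infrastructure: queue.Queue for the JMS queue,
# a temp directory for the local archive.
jms_queue = queue.Queue()
archive_dir = pathlib.Path(tempfile.mkdtemp(prefix="archive_"))

def producer(source_path):
    """Archive the file locally and enqueue only its name and location,
    not its content (the claim check)."""
    dest = archive_dir / pathlib.Path(source_path).name
    shutil.copy(source_path, dest)
    jms_queue.put({"file": dest.name, "location": str(archive_dir)})

def consumer():
    """Read the small message, then fetch the payload from the archive."""
    msg = jms_queue.get()
    return (pathlib.Path(msg["location"]) / msg["file"]).read_text()

# Demo round trip
src = pathlib.Path(tempfile.mkdtemp(prefix="ftp_")) / "bigfile.txt"
src.write_text("pretend this is a 20 MB payload")
producer(str(src))
received = consumer()
```

The queue only ever carries tiny messages, so broker memory stays flat no matter how large the files get.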

  • Handling large messages with MQ JMS sender adapter

Hi.
I'm having trouble handling large messages with an MQ JMS sender adapter.
The messages are around 35-40 MB.
Are there any settings I can adjust to make the communication channel work?
    Error message is:
    A channel error occurred. The detailed error (if any) : JMS error:MQJMS2002: failed to get message from MQ queue, Linked error:MQJE001: Completion Code 2, Reason 2010, Error Code:MQJMS2002
The communication channel works fine with small messages!
I'm on SAP PI 7.11; the MQ driver is version 6.
Best Regards,
Peter

The problem solved itself when the MQ server crashed and restarted.
I did find a note that might have been useful:
    Note 1258335 - Tuning the JMS service for large messages or many consumers
    A relevant post as well: http://forums.sdn.sap.com/thread.jspa?threadID=1550399

  • How do I select all clips in a large project with out individually select each clip?

I have a large project with ~1000 clips (still JPGs).  My question is: how do I select all of the clips without individually selecting each clip?

    Chris
Thanks for the reply with the additional information.
The suggestion given would be applicable to your Premiere Elements 12 on Windows 7.
As an aside, have you updated 12 to 12.1 yet? If not, please do so using an opened project's Help Menu/Update. The major perk of the update is that it corrects an Expert workspace Text/Style issue. There are other, less defined advantages included.
Please let us know if the suggestion for Select All worked for you.
Thanks.
    ATR
