Filming to film editing process - best practice?

Hey everyone,
This is an odd question, and I'm not really sure where to start looking. Basically, I've been doing a lot of film editing for my company. They send out two people to do the filming, then hand me the raw footage to edit. I am given a rough storyboard as well.
Now, my problem is this: because the filming is done in whatever order gives easiest access at the time (so it won't necessarily match the storyboard), and I'm not involved in the actual filming (so I don't know roughly when things were shot), it takes me far longer to go through all the footage and figure out what I need.
So does anyone have any good links or advice on how best to bridge these two processes? Or any experience in the industry that could help me out?
I get very tight deadlines and don't really have time to trawl through hours of video, only to forget what I found and get lost, so that I have to search through it all again.
Thank you for your help.

Steven L. Gotz wrote:
Or, learn to use Adobe Prelude. I don't use it, but those who do might chime in here with their own opinions on the subject.
It's been a little while since I've worked extensively in Prelude so some of the features and details may have changed but the workflow is the same so I'll give it a shot...
Prelude is an app specifically designed to log and manage your footage before (and during) the editing process. There's a lot you can do with it to edit metadata, add markers, or selectively ingest footage, but at its most basic level it's good for marking things up into subclips and then ordering them as a sort of rough cut to bring directly into Premiere Pro. But of course that all takes time too, so depending on your needs and proficiency level, it might still be better to just use bins to organize it all in Premiere Pro and edit directly from that.
Any way you cut it, 'logging' is a huge job, big enough that we developed a whole app just to help people do it effectively. That's also why production houses often have dedicated logging pros (though it still often falls to the editor in a big way). The production people in the field could certainly ease your plight by organizing things a little as they go, but that sort of work comes with its own heavy pressures and priorities, so that probably won't happen.
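On the practical side, even a crude shot log helps bridge the gap. As an illustration only (this is a hypothetical helper, not a Prelude feature; the folder layout and column names are my assumptions), a small script can scan a folder of footage and emit a CSV log template that the camera crew, or you on a first viewing pass, can fill in against the storyboard:

```python
import csv
from pathlib import Path

def build_log_template(media_dir, out_csv):
    """Scan a media folder and write a CSV shot-log template.

    One row per clip; the 'scene' / 'keep' / 'notes' columns are left
    blank for whoever logs the footage to fill in against the storyboard.
    """
    rows = [{"file": clip.name, "scene": "", "keep": "", "notes": ""}
            for clip in sorted(Path(media_dir).glob("*.*"))]
    with open(out_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["file", "scene", "keep", "notes"])
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

Once the scene column is filled in, sorting the CSV by scene effectively puts the footage into storyboard order before you ever open the NLE, so you stop re-searching the same clips.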

Similar Messages

  • Deadline Branche in Correlation Process - Best Practice

    Hello,
    I have an integration process with a correlation: there is an asynchronous send step which activates a correlation, and afterwards an asynchronous receive step that uses that correlation.
    Furthermore I have a deadline branch to cancel the process after 24 hours.
    My question now is:
    There could be (rare) cases where a message arrives later than 24 hours. According to my understanding, the received message will then block the inbound queue, as no active correlation can be found anymore. Is this correct? How can I avoid this situation? I guess a blocked queue would also block other messages that are sent to the integration process.
    What would be best practice to handle such a scenario? I could leave the process instance open for 1 month; however, this might have a significant impact on system performance.
    Thank you for your advice.

    There could be (rare) cases where a message arrives later than 24 hours, so according to my understanding the received message will block the inbound queue as no active correlation can be found anymore
    A "no correlation found" error will occur only when the BPM instance is running and the message tries to enter the relevant receive step (not the first one).
    However, once the process is cancelled, you need not worry about the message going into the queue and blocking the BPM queue.
    Regards,
    Abhishek.

  • Idoc processing best practices - use of RBDAPP01 and RBDMANI2

    We are having performance problems in the processing of inbound idocs.  The message type is SHPCON, and transaction volume is very high.  I am a functional consultant, not an ABAP developer, but will try my best to explain our current setup.
    1)     We have a number of message variants for the inbound SHPCON message, almost all of which are set to trigger immediately upon receipt under the Processing by Function Module setting.
    2)      For messages that fail to process on the first try, we have a batch job running frequently using RBDMANI2.
    Almost every day we have instances of the RBDMANI2 job getting stuck and running for a very long time. We frequently have multiple SHPCON idocs coming in containing the same material number, and idocs frequently fail because the material in the idoc has become locked. Once the stuck batch job is cancelled and the job starts running again normally, the materials unlock and the failed idocs begin processing. The variant for the RBDMANI2 batch job is currently set with a packet size of 1 and without parallel processing enabled.
    I am trying to determine the best practice for processing inbound idocs such as this for maximum performance in a very high volume system.  I know that RBDAPP01 processes idocs in status 64 and 66, and RBDMANI2 is used to reprocess idocs in all statuses.  I have been told that setting the messages to trigger immediately in WE20 can result in poor performance.  So I am wondering if the best practice is to:
    1)     Set messages in WE20 to Trigger by background program
    2)     Have a batch job running RBDAPP01 to process inbound idocs waiting in status 64
    3)     Have a periodic batch job running RBDMANI2 to try and clean up any failed messages that can be processed
    I would be grateful if somebody more knowledgeable than myself on this can confirm the best practice for this process and comment on the correct packet size in the program variant and whether or not parallel processing is desirable.  Because of the material locking issue, I felt that parallel processing was not desirable and may actually increase the material locking problem.  I would welcome any comments.
    This appeared to be the correct area for this discussion based upon other discussions.  If this is not the correct area for this discussion, then I would be grateful if the moderator could re-assign this discussion to the correct area (if possible) or let me know the best place to post it.  Thank you for your help.

    Hi Bob,
    Not sure if there is an official best practice, but note 1333417 (Performance problems when processing IDocs immediately) does state that immediate processing is not a good option for high volumes.
    I'm hoping that for SHPCON there is no dependency in the IDoc processing (i.e. it's not important whether they're processed in the same sequence or not); otherwise it'd add another level of complexity.
    In the past, for high-volume IDoc processing, we scheduled a background job with RBDAPP01 (with parallel processing) and RBDMANIN as a second step in the same job to re-process the IDocs that failed due to locking issues. RBDMANI2 has a parallel processing option, but it was not needed in our case (in fact, we specifically would not want to parallel-process the errors, to avoid running into a lock issue again). In short, your steps 1-3 are correct, but 2 and 3 should rather be in the same job.
    Also I believe we had a designated server for the background jobs, which helped with the resource availability.
    As a side note, you might want to confirm that the performance issues are caused only by the high volume. An ABAPer or a Basis admin should be able to run a performance trace. There might be an inefficiency in the process that could be adding to the performance issue as well.
    Hope this helps.

  • Re engineering of existed process / Best Practice (customization)

    Hi all of you,
    We are implementing SAP ECC 6.0 for one of our clients. The client is asking us to compare their existing business processes with Best Practice / standard processes and, based on the result, to prepare a GAP analysis between the existing processes and best practice for their business.
    SAP itself embodies best practice in the respective domains / business processes; by implementing SAP ERP, the client will get best practice for their business processes, as far as I know. But the thing is, how can I explain to the client that SAP provides the best practice, and on what basis should the client consider the SAP process the Best Practice for their business?
    Please give me a solution
    Regards,
    Ramki

  • Order Process Best Practice Suggestions?

    Hey CF World,
    I have to revamp an online order process. The process is broken into 4 steps.
    The app as it exists today was built by a different developer, and for the life of me, I have wasted about 5 hours trying to figure out exactly what that person is doing in the code, just so I can make some basic tweaks to the process.
    Could anyone offer what might be considered today's best practice for a step by step order process?
    The thought is, if the user could complete step 1, upon clicking next the data elements of the form would be validated and then they would be taken to step 2, etc, etc... until the end where upon submission, the order would then be written to the database and next process triggered internally.
    Should I have one page that upon submit of step 1 cycles back to itself, processes the data and then loads a separate div of info for step 2 or...?
    Any suggestions would be great.  Thank you so much in advance for your help, I sincerely appreciate it.
    Ciao'
    D.

    Hello,
    Thank you so much for that. Let me qualify a few things as I probably should have in the first place. (my apologies)
    Coldfusion 8
    SQL Server  2005
    There is no payment or credit card information being provided.
    The user comes online, goes through a basic order process for some work to be done. As mentioned, it is a multi step process for gathering their information.
    Once the entire order is in and all the fields validated along the way to ensure they were populated where required, the order is to be written into the pending orders table and an email is sent to the branch closest to the customer notifying them of the new order with a link into the details. The branch then calls them directly to confirm the details of the order before activating it.
    So, the code I received is next to impossible to follow; for the life of me I cannot figure out what the former developer has done. I need to make some changes to the process, and if I can't even follow the flow to figure out where to make my changes, that could pose a problem.
    I have not coded too much in Coldfusion for the past two years but did so quite extensively before that. I totally agree on the CFTransaction suggestion. I guess what I was looking for is, are there any best practices for coding that I should be aware of, especially considering what I want to accomplish? Previously we used the "fusebox" concept of coding and had most of our code in CustomTags in a very reusable and easy to follow structure and flow.
    Any thoughts/suggestions would be great! Thank you very much!
    D.
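    The validate-then-advance flow described above is language-agnostic. Here is a minimal sketch in Python (the field names and step layout are illustrative, not taken from the original app) of the same idea: each step validates only its own fields, validated data accumulates in a per-user session, and nothing is written to the database until the final step succeeds.

```python
# Required fields per wizard step (illustrative names, not the real form).
STEP_FIELDS = {
    1: ["name", "email"],
    2: ["address", "city"],
    3: ["service_type"],
    4: ["confirm"],
}

def validate_step(step, form):
    """Return the list of required fields missing from this step's form data."""
    return [f for f in STEP_FIELDS[step] if not form.get(f)]

def submit_step(session, step, form):
    """Validate one step; on success merge its data into the session.

    Only after the last step succeeds would the caller write the order
    to the database and trigger the branch-notification email.
    """
    missing = validate_step(step, form)
    if missing:
        return {"ok": False, "missing": missing, "next_step": step}
    session.update({f: form[f] for f in STEP_FIELDS[step]})
    done = step == max(STEP_FIELDS)
    return {"ok": True, "next_step": None if done else step + 1}
```

    In ColdFusion the same shape maps naturally onto a single page that posts back to itself: the session scope plays the role of the dict, and the final step wraps the database insert in a cftransaction.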

  • Film Edit Process Q's & Reconnect Conflicts

    I am one of two assistant editors on an independent feature film, shot on 35mm and transferred to Beta SP PAL for the edit. FTL files (from the lab) are imported, and Batchlist and Database folders are created. Our clips are then digitized from the batchlists in FCP. We edit at 24fps; video is converted through Cinema Tools and merged with the audio tracks when synced in FCP (we are using the basic sync-by-clap system).
    Assistants have now started editing on multiple computers. We copy the editor's project, all of the respective folders (FTL, BATCH CAPTURE, DATABASE, etc.), the media, and separate sound folders to our two 250-500GB LaCie FireWire drives. We work separately, finish a sequence, then take the project files back to the editor's station, copy bins/sequences into his project, and reconnect to the media that sits on his drives. Video always reconnects fine. When working with sync clips and reconnecting audio, I get this window: "some attributes of one or more files you have chosen do not match attributes of the original". 99% of the time the attribute conflict is listed as "media start and end"; sometimes additional attributes ("reel, rate") are listed. I have the option of choosing "connect" anyway, and I've done a test with this, and even with "conflicts" sync appears fine. I want to be sure that everything goes smoothly with our audio post (the sound designer is working on a separate system from our exported OMFs), and I want to be sure that we output a good EDL (we will be making a film cut to scan into "Luster", a DI system, and printing to negative from there).
    Also, what should we watch out for in the work process described? What are some pitfalls I could be missing in copying media around, taking media off the assistant edit drives, etc., as we work? Also, for organizational purposes, I want to rename the media folders (capture folders) where the clips were digitized on the editor's machine (I'd like to reorganize the media in general; the initial organization is untidy). Will this be a problem, again mainly with regard to the EDL? Forgive some ignorance; this is my first time working with film in the edit. I appreciate any feedback. Thanks-
    amysands

    Hi Jim - thanks for your advice; I'm wondering if I should have heeded your warning earlier. I expressed concerns to Steve about EDL export, and it turns out I ran into some problems recently when doing an EDL test. If you have some time, I'd greatly appreciate any information or help you can offer. Here is my long-winded explanation of what happened (again, I'm an assistant on an independent feature film, finally getting back to the forum after being buried for a bit).
    OUR PROCESS:
    1) Film format: 35mm 3-perf (plus some Super 16mm).
    2) Transferred to BetaSp PAL and Digibeta PAL.
    3) FTL files (from telecine) copied and imported to CINEMA TOOLS; Database created; Batch Lists created.
    4) Batch Lists imported to FCP, media digitized at 25fps. Media converted through Cinema Tools to 24fps. Media then “reconnected” in FCP.
    5) Audio (at 24fps) imported and synced with video (using the basic sync-by-clap system). Clips are “merged” as we sync. The edit occurs in 24fps timelines.
    6) Finishing with DI - film negatives to be scanned, film print created.
    EQUIPMENT:
    FCP 5.0
    Cinema Tools
    G5, dual processor
    Mac OS X 10.4.2
    BlackLink Card
    (assistants also working with copied media on an eMac and a G4 laptop; projects transferred back to the G5 system)
    REDIGITIZING AND EDL ERRORS:
    Our offline footage was digitized at DV-PAL compression from Beta SP (PAL). We had also telecined to matching Digibeta (PAL) tapes. Recently we tried to re-digitize for a trailer capturing at 8 bit compression. We brought in a Sony Digibeta deck with SDI cable.
    Several problems occurred:
    I tried first to export an EDL from FCP. In a new project, I imported the EDL and started to re-digitize. The system immediately searched for timecode nowhere near what was on the tape (not even the same hour).
    I was able to batch digitize from the sequence (still in a 24fps timeline). The media then had to be conformed to 24fps through Cinema Tools. The in/out points for the most part would match. However, maybe because our FTL codes were not broken into individual takes, huge source clips needed to be recaptured. For example, for a 30-second clip in the timeline, 6 minutes of footage was digitized. We had to capture large source files corresponding to our smaller edited clips, which took up more space and time than we had available.
    Even after conforming the media to 24fps, some clips did not match the in/out points of the original sequence. 15% of the clips were not even batch digitized; tape numbers or clip names were ignored altogether in the batch digitize queue. This occurred even when selecting the “digitize online clips” function.
    A separate issue, but mastering back to Digibeta was also unsuccessful. We tried rendering the sequence out at 25fps (which took a huge amount of time) and then discovered that we could still not print to tape on the Digibeta deck. It would not output at all; we got only a fuzzy green screen. We have tried both component/composite and SDI, but it's the same each time. We've also tried to output to a Beta SP deck, with no result. In the end we exported QuickTimes and output via FireWire to DVCAM. I have since read that mastering back to Beta is commonly problematic.
    Any insights as to why the digitization is erratic? Why is the EDL not working? The EDL is the most concerning problem: we will eventually need both to work with high-res footage and to scan our negatives for the DI and print to film. Should I create an EDL through Cinema Tools? Am I missing something in the EDL settings, maybe? Trying to do some quick reading on this as well. Thanks, guys, for all the help -
    Amy

  • Transferring the film to video best practice and combining PAL and NTSC

    Could anyone help me with the following 2 questions that I was asked in our small school video lab? I don't really have much experience with negative film and NTSC. Thank you so much.
    1. "I may be going back to the film negative to cut it, based on the FCP EDL. This means that Final Cut has to maintain perfect synch. I know that with AVID, it's more reliable to transfer the film to video at 25 fps rather than 24 fps. Do you have any idea whether this is also the case with Final Cut??"
    2. "Some of my source materials is on PAL and some is on NTSC. Is that going to be a nightmare?? Will I be able to convert from one to the other when I import?? Or will I need to get the NTSC miniDV tapes transfered to PAL so that your PAL deck can read them? "
    we normally use PAL (In UK).

    1. This is where Cinema Tools comes into play. It can conform your edit list from FCP back to film.
    There is a wealth of information in the Cinema Tools handbook and Help menu item.
    Someone else might be able to contribute more information, my experience with CT is very limited.
    2. Some decks are switchable between PAL and NTSC. If yours can do this, then you can capture your footage in a preliminary project and convert it for free with [JES Deinterlacer|http://www.xs4all.nl/~jeschot/home.html], which does a decent job, or for $100 with [Nattress Standards Conversion|http://www.nattress.com/Products/standardsconversion/standardsconversion.htm], which does a very good job. Both will take some time, so it's best to capture only what you really need.
    The best possible conversion is done with dedicated hardware solutions such as those offered by Snell & Wilcox. Real time with excellent results. This would be the way to go if you have a lot of material or if your deck is not PAL - NTSC switchable.
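    A quick numeric sketch of the 25fps-versus-24fps relationship behind question 1 (the numbers are illustrative; Cinema Tools does the real conforming work): the same frames are simply played back at a different rate, so a 25fps transfer conformed to 24fps keeps frame-for-frame correspondence with the negative while the running time stretches by a factor of 25/24, about 4.2%.

```python
def conform_duration(frames, capture_fps=25, conform_fps=24):
    """Duration (seconds) of the same frames before and after conforming to a new rate."""
    return frames / capture_fps, frames / conform_fps

# 1500 frames run 60 s at 25 fps but 62.5 s once conformed to 24 fps:
# no frames are added or dropped, so the mapping back to film frames is exact.
```

    Because no frames are created or discarded, an edit list cut on the conformed material still maps one-to-one back to film frames, which is a key reason the 25fps transfer route is considered reliable for going back to the negative.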

  • Af:dialog model-restore / cancel-button processing best practice ?

    Using JDev 11.1.1.3. I have an af:dialog running in an af:popup which contains auto-submit components (for cross-component enablement, validation, etc.). My question is: what are the preferred ways of discarding model changes submitted through popup processing when the af:dialog cancel button is pressed by the user? I figured that using a task flow for the popup content could be an option, using the task flow savepoint-restore feature, but that looks more like a database restore than a model restore. I want to be able to restore the model content to the way it looked before the popup executed, without requiring a submit to the database. How is this most commonly and best achieved?
    Thanks,

    Taskflow savepoints are not database savepoints. A transactional BTF can be configured to issue automatic savepoints at TF entry and eventually to "roll back" to them at the TF exit. The internal implementation uses the ApplicationModule's passivation/activation mechanism to passivate the AM state at the TF entry and eventually to activate the AM state at the TF exit, back to the state passivated at entry. In this way it is as if you had not made any modifications in ADF BC, so your model layer will be restored to the state before TF entry. (Of course, you must not perform any DB commits during the lifetime of this TF.) I have successfully used this mechanism for the same goal you are asking about.
    Also, there are savepoints managed by the ADF Controller, but I can be of little help here because I have never used them. I suspect that this mechanism could be what you need, so you may have a look here for more details:
    Adding Save Points to a Task Flow
    and in this thread:
    {thread:id=2128956}
    Dimitar

  • PL/SQL After submit process - best practice?

    I have an after-submit process which fires a PL/SQL procedure. In this PL/SQL procedure I do some updates and would also like to generate some XML output and send it to the browser so that the user can save it to a file. What I'm asking is: what is the proper way to handle this?
    I realize that starting the procedure from an "after submit process" is too late. If I understand correctly, the page is already rendered at that time, so htp.p output from the PL/SQL procedure does not show (but the procedure is executed). So I create a branch to the PL/SQL procedure (after the button is pressed). That way the procedure actually creates a new window and I can use the htp.p functions. Although now I have trouble closing the window, I hope I can manage that.
    Is there some other, better way to do the export? Maybe a JavaScript popup and calling the procedure from there? Any suggestions?
    Thanks!
    Marko

    How should I send this content to user so that his browser recognize this as a file (for opening or saving)?
    Put that code in an onLoad process, similar to how Scott shows at http://spendolini.blogspot.com/2006/04/custom-export-to-csv.html
    With this in place, when you issue a show request on that page, your generated content will be offered by the browser using a open/save dialog box.

  • Best practice video conversion from download

    I am looking for best practice for video conversions.
    I am downloading adobe recordings via this method:
    http://server.adobeconnect.com/xyz/output/filename.zip?download=zip
    From here, I have been converting the FLVs using either freemake video converter or FLV converter. I have tried converting into AVI (XVID), MOV, WMV, etc. (I need the file to be under 600 MB for an hour of recording, therefore it is going to need some type of compression).
    My goal is to import the video into Sony Vegas Pro 10 for further editing. I have found that whatever method I use, the video and audio do not sync properly about 50% of the time; the video track usually runs longer than the audio. There are also other assorted errors, such as the video simply freezing halfway through.
    I have been using Connect for a few years now, but with each update (Connect 8, 9, etc.) I find that the problems are getting worse. At this point I am just wasting time trying to convert into various formats using various codecs, trying to luck upon one where the video is at least without error.
    What methods are others using to convert the FLV to a workable editable format?
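    One route worth trying (my suggestion, not an official Connect workflow) is to batch the FLVs through the ffmpeg command-line tool rather than a GUI converter, since its audio resampler can often repair mild A/V drift. The sketch below only builds the commands and assumes ffmpeg is installed; the flags shown are a typical first attempt, not guaranteed to fix every damaged recording.

```python
from pathlib import Path

def flv_convert_commands(src_dir, dst_dir, crf=23):
    """Build one ffmpeg command per FLV, re-encoding to H.264/AAC in MP4.

    The aresample=async=1 audio filter stretches/squeezes audio slightly
    to match its timestamps, a common remedy for gradual A/V drift.
    """
    cmds = []
    for flv in sorted(Path(src_dir).glob("*.flv")):
        out = Path(dst_dir) / (flv.stem + ".mp4")
        cmds.append([
            "ffmpeg", "-i", str(flv),
            "-c:v", "libx264", "-crf", str(crf),   # quality-based video encode
            "-c:a", "aac",                         # re-encode audio
            "-af", "aresample=async=1",            # resample audio to fight drift
            str(out),
        ])
    return cmds
```

    At CRF 23, an hour of screen-capture video typically lands well under the 600 MB budget mentioned above, though the exact size depends on the content.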

    Can't the FLV files be changed into many different formats through Apple's Compressor or Adobe Media Encoder? These formats can then be opened in standard video editing software for editing.
    mach5kel replied on 05/21/12:
    Yes, I use this as a last resort, as the quality of capture this way is significantly lower, and it is a much more time-consuming process. I sometimes have over 50 parts of one hour of video. To use Camtasia, you first need to record it, then it must be saved in a Camtasia format, and then lastly rendered into AVI or WMV, so it does take a while. Really, I feel there shouldn't be so many errors in the conversion process, but I am finding the FLV recordings themselves have problems. The last file I looked at has serious issues with audio and video sync even when playing the recording back in Adobe Connect; a problem that is all too common.

  • Media organisation for fiction film editing

    Hello.
    If anyone has experience with FCPX and fiction film editing, I would appreciate it if you could help me find out the best practice for organising and labeling the material.
    I started editing on flatbeds, got used to Media Composer, and was probably one of the first people to use FCP for cinema fiction film editing. I recently bought FCPX, went through some tutorials and played a little, but I'm still not at home with it.
    The projects I'm considering would be shot with either ARRI Alexa or Canon DSLR, thus each shot comes as a separate file, and would be labeled by the assistant according to marks on the slate. (We use Scene/Shot-Take system e.g. 5/3-2x). Sound is coming in separately as WAV files which need to be synced to the picture.
    Also, if you have any comments on editing fiction narrative films with X, I'd be interested. I was quite distracted today when I realised that FCPX doesn't remember where I stopped watching a clip when I come back to it after taking a look at some other take or shot!
    Also during today's experimenting it seemed awkward that MatchFrame (Reveal clip in Event browser) doesn't open the collection from where I actually inserted the clip.
    Thanks.
    Mato

    Hello Matoid,
    I am now facing the same situation that you did, and was wondering if you came up with a workflow for editing a feature-length film in FCPX.
    I would specifically appreciate any advice on the following:
    All the dailies are synchronized QT files organized by shooting day.  For instance, a file named Shooting Day 1 will contain all the synchronized dailies that were shot that day.  The files are in the following format:
    Dimensions: 1920 x 1080
    Codecs: Apple ProRes 422, Linear 422
    Color Profile: HD (1-1-1)
    Audio Channels: 2
    Since I come from the old school of editing, I would like to organize all the material by scene, shot and take number, so the first question is, how should I go about splitting the files into single takes?
    What I’ve done so far with the first two days of shooting is to create an event for each day, then select each take within the event, create a compound clip, and label it according to its slate info (Scene # + Shot # + Take #). Once I break up the Shooting Day event into several scene clips, I create a new event, label it by Scene #, and then copy all the clips from the Shooting Day event that correspond to the Scene event. Then I go back and delete all the clips from the Shooting Day event, and lastly I also delete the Shooting Day event to make sure I am not doubling up on clips.
    This seems to work, but I feel there must be a more effective way of doing it.
    Another question I have is, should I assemble each sequence into its own story line and then string all the storylines together for the completed film, or should I just assemble everything in one story line?
    I am sure that more questions will come up as the project evolves, but for the time being any advice on how to go about organizing the project would help me a lot and would be immensely appreciated.
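    As a side note, because the Scene/Shot-Take slate labels mentioned earlier (e.g. 5/3-2x) are machine-parsable, the bucketing-by-scene step can be automated outside FCPX before keywording or renaming the clips. A hypothetical helper (the label grammar is my guess at the convention described; adjust the pattern to your slates):

```python
import re

# Scene/Shot-Take labels, e.g. "5/3-2x": scene 5, shot 3, take 2,
# with any trailing letters (like a circled-take mark) kept as a flag.
SLATE = re.compile(r"^(\d+)/(\d+)-(\d+)([a-z]*)$")

def parse_slate(label):
    """Split a slate label into its scene/shot/take components."""
    m = SLATE.match(label)
    if not m:
        raise ValueError(f"unrecognised slate label: {label}")
    scene, shot, take, flag = m.groups()
    return {"scene": int(scene), "shot": int(shot),
            "take": int(take), "flag": flag}

def group_by_scene(labels):
    """Bucket clip labels by scene number, ready to become per-scene events."""
    scenes = {}
    for label in labels:
        scenes.setdefault(parse_slate(label)["scene"], []).append(label)
    return scenes
```

    Grouping the labels first, then creating one event (or keyword collection) per scene from the result, avoids the copy-then-delete shuffle between Shooting Day events described above.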

  • Plz give me work flow of film editing in Final Cut Pro..

    Hello Everyone...
    I am basically an Avid film editor. For the past 2 years I have been using Final Cut Pro. Recently I finished Final Cut Pro Level 1 & T3. Right now I am working as an application support engineer, especially for Final Cut Pro.
    I want to know the workflow of film editing in Final Cut Pro. I am very interested in doing film editing in FCP, but the workflow coming from Avid to FCP is totally different.
    I tried few options like using FCP & Cinema Tools..
    1. Captured 25fps clips and conformed them to 24fps in Cinema Tools.
    2. In FCP the Timeline frame rate is 24fps
    But I am not able to understand..
    1. What is Database?
    2. Can't we capture 24fps in FCP?
    3. How do I synchronize captured clips with the Cinema Tools database?
    4. How do I link the captured clips into the Cinema Tools database?
    5. Where do I enter the KN Start & KN End?
    6. What about Pull List & Cut List?
    Please give me the details of film techniques in Final Cut Pro.
    Thanks in advance....

    I knew I'd find this same post here. That makes three versions of the same list of questions.
    Posting three times here implies you have posted this set of questions in many other FCP forums around the Net. If that is so, do us all the kindness of returning to all of them and closing them. Include the suggestions you have received and the solutions that were successful for you. That way your multiple threads will show up on search inquiries when other confused FCP newbies try to start complex film projects.
    It's a bummer you cannot find these answers in the FCP manuals; I'm sure you looked there first. But since we run on volunteers around here, it may be several days before someone with enough practical Cinema Tools experience can stop by and handle them for you. Please have some patience.
    bogiesan

  • Best Practice Re-install Process?  (Xcelsius 2008 Engage)

    The main impetus of this post is now to request a best-practice reinstall process. E.g., what files/directories or registry keys should be deleted after uninstalling under Windows XP? I'm running Xcelsius 2008 5.1.1.0 Build 12,1,1,344 (Windows XP SP2). I want to clean it up and reinstall completely, ideally without also reinstalling Office 2007.
    Earlier today, I had loaded up an otherwise healthy project and all of a sudden my edits were not taking.  I could perform the change in the Component Properties window but the change did not translate in the project display area.
    I had closed and reopened this project several times, killed both "Xcelsius" and "EXCEL" processes, rebooted the machine a couple of times, all to no avail.  I was totally stuck, it seems.  Then, it randomly corrected itself after the Xth close/reboot/reload.  I'm past the issue now it seems, but please consider this an official bug report.  These issues are no longer a mere minor nuisance, and I am looking forward to a nice big update in the next Fix Pack release, which I assume is right around the corner (hint, hint).
    Thanks.
    Edited by: f l on Nov 17, 2008 9:20 PM

    f l,
    I'm not sure deleting keys from the registry is ever a best practice, however Xcelsius has listings in:
    HKEY_CURRENT_USER > Software > Business Objects > Xcelsius
    HKEY_LOCAL_MACHINE > SOFTWARE > Business Objects > Suite 12.0 > Xcelsius
    The current user folder holds temporary settings, such as how you've modified your interface.
    The local machine folder holds more important information.
    As always, it's recommended that you backup the registry and/or create a restore point before modifying or deleting any keys.
    As for directories, the only directory Xcelsius uses is the one you install to.  It also places some install logs in the temp directory, but they have no effect on the application.

  • Best Practice Process Chains

    Hi All,
    What is the best-practice process chain recommended by SAP, mainly for FICO-related loads?
    Does anyone have the structure of the FI staging and reporting layer process chains, step by step?
    Thanks in Advance
    Regards
    con
    Edited by: con on May 21, 2009 10:01 PM

    Hello con ,
    Two things to follow:
    1) Have a Change Run step once the master data load completes, for all the InfoObjects which you have loaded.
    The Change Run activates the master data and adapts all aggregates for newly loaded master data and hierarchies.
    2) If you want to replace existing data in the DataTarget completely, first delete the data in the DataTarget (also in the PSA, if present) and load afterwards. This will help improve the timing parameters.
    Hope this helps you!!!
    - Nandita

  • Best practices to modify process chain

    Hi,
    What is the best practice for modifying a process chain: directly in production, or in development and then transported back?
    Should query performance tuning settings, such as read modes and cache settings, be done in production?
    Thanks
    nikhil

    Hi Nikhil,
    The best practice for modifying process chains is to make the change in the Development system and transport it to Quality and Production. But if you are making a simple change, like changing the scheduling time (a change of scheduled date and time by editing the variant), you can do it directly in Production. If you are adding new steps to the process chain, it is better to make the change in Dev and move it to Prod.
    Also, once the change reaches Production, you may need to open the chain in edit mode and activate it manually.
    It is a common issue for process chains containing delta DTPs that the chain is saved by transports in the modified (M) version and not in the active version.
    The query read mode and cache settings you can do in Development and move to Production.
    But the pre-filling of the cache using broadcaster settings can be done directly in Production.
    Also, creating cube aggregates to improve queries can be done directly in Production rather than creating them in Development and transporting to Prod.
    Again, it depends on the project: in some projects they make changes directly in Prod, while in others changes in Prod are not allowed. I think it is always better to start the changes in Dev and move them to Production.
    Hope it helps,
    Thanks,
    Vinod-
