Master Data Workflow Best Practice

Hi,
We are looking for guidance on building a workflow for Material Master. It is a complex workflow with more than 25 steps, parallel branches and custom role assignment. We are dealing with a high volume of data both at design time (more than 350 fields in a form) and at runtime, and the data spans ECC and CRM systems.
We are considering Adobe Interactive Forms (Java) for the UI and Guided Procedures for the workflow, but SAP advises against using GP for complex workflows of more than 15 steps, for performance reasons.
Does anyone know of scenarios where complex workflows were built in GP and ran into performance issues?
Thanks,
Krishna.

Hi,
I think MDM suits your requirement.
Check this [Link1|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/50f1c01b-972d-2c10-3d9d-90887014fafb] [Link2|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/60559952-ff62-2910-49a5-b4fb8e94f167] [Link3|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/90990743-91c0-2a10-fd8f-fad371c7ee40] & [Link4|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/60fd28c5-62ea-2a10-2093-be4fedf1bc7d].
Regards,
Surjith

Similar Messages

  • Building complex flash game in Flash Builder 4 - Workflow/Best Practices

    I'm investigating switching to Flash Builder 4 for building a complex game that currently lives purely inside Flash CS4. CS4 is a pretty terrible source code editor and debugger. It's also quite unstable. Many crashes caused by bad behavior in the SWF will take out the entire IDE, so they are almost impossible to debug. And I've heard other horror stories. To be clear, for this project I'm not interested in the Flex API, just the IDE.
    Surprisingly, it seems Flash Builder 4 isn't really set up for this type of development.  I was hoping for an "Import FLA" option that would import my Document Class, set it as the main entry point, and figure out where other assets live and construct a new project.  What is the best workflow for developing a project like this?
    What I tried:
    -Create a new ActionScript Project in the same directory where my CS4 project lives
    -Set the primary source file to match the original project's source file and location
    -Set my main FLA as "export to SWC", and added the "SWC PATH" to my Flash Builder 4 project.
    -Compile and run... received many errors due to references to stage instances. I changed these to GetChildByName("stagename"). Instead, should I declare them as members of the main class? (This would mimic what Flash CS4 does.)
    -My project already streams in several external SWFs. I set these to "Export SWC" to get compile-time access to classes and variables. This works fine in CS4; the loaded SWFs behave as if they were in the native project. Is the same recommended with FB4?
    -Should I also be setting the primary FLA as "export to swc"?  If not, how do I reference it from flex, and how does flex know which fla it should construct the main stage with?
    Problems:
    -I'm getting a crash inside a class that is compiled in one of the external SWF's (with SWC).  I cannot see source code for the stack inside this class at all.  I CAN see member variables of the class, so symbol information exists.  And I do see the stack with correct function names.  I even see local variables and function parameters in the watch window! But no source.  Is this a known bug, or "by design"? Is there a workaround?  The class is compiled into the main project, but I still cannot see source.  If FLEX doesn't support source level debugging of SWC's, then it's pretty useless to me.   The project cannot live as a single SWF.  It needs to be streaming and modular for performance and also work flow. I can see source just fine when debugging the exact same SWC/SWF through CS4.
    -What is the expected workflow with artists/designers working on the project? Currently they just have access to all the latest source, and to test changes they run right through Flash. Will they be required to license Flash Builder as well so they can test changes? Or should I be distributing the main "engine" as a SWF, and having it reference other SWF files that artists can work on? Then they compile their SWF in CS4, and to test the game, they can load the SWF I distribute.
    A whitepaper on this would be awesome, since I think a lot of folks are trying to go this direction.  I spent a long time searching the web and there is quite a bit of confusion on this issue, and various hacks/tricks to make things work.  Most of the information is stale from old releases (AS2!).
    If I can establish a clean workflow, I would happily adopt Flash Builder 4 as the new development tool for all the programmers. It's a really impressive IDE with solid performance, functional intellisense, a rich and configurable interface, a responsive debugger... I could go on and on. One request is shipping with a "Visual Studio keyboard layout" for us C++ nerds.
    Thanks very much for reading this novel!

    Flash Builder debugging is a go! Boy, I feel a bit stupid, you nailed the problem Jason - I didn't have "Permit debugging" set. I didn't catch it because debugging worked fine in CS4, which doesn't obey this flag, even for externally loaded SWF files (I think as long as it has direct access to the SWC). Ugh.
    I can now run my entire, multi-SWF, complex project through FB with minimal changes. One question I do have:
    In order to instantiate stage instances and call the constructor of the document class, I currently load the SWF file with LoaderContext. I'm not even exporting an SWC for the main FLA (though I may, to get better intellisense). Is this the correct way of doing it, or should I be using some other method to pull it into Flex? They seem to do the same thing.
    The one awful part about this workflow is that since almost all of my code is currently tied to symbols and lives in the SWF, any change I make to code must first be recompiled in CS4, then I have to switch back to FB. Over time I'm going to restructure the whole code base to remove the dependency of having library symbols derive from my own custom classes. It's just a terrible workflow for both programmers and artists alike. CS5 will make this better, but still not great. Having a clean code base and abstracted-away assets that hold no dependencies on the code seems like the way to go with Flash. Realistically, in a complex project, artists/designers don't know how to correctly set up symbols to derive from classes anyway; it must be done by a programmer. This will allow for tighter error checking and less guesswork. Any thoughts on this?
    Would love to beta test CS5 FYI seeing as it solves some of these issues.
    Date: Thu, 21 Jan 2010 15:06:07 -0700
    Subject: Building complex flash game in Flash Builder 4 - Workflow/Best Practices
    How are you launching the debug session from Flash Builder? Which SWF are you pointing to?
    Here's what I did:
    1) I imported your project (File > Import > General > Existing project...)
    2) Created a launch configuration (Run > Debug Configuration) as a Web Application pointing to the FlexSwcBug project
    3) In the launch config, under "URL or path to launch" I unchecked "use default" and selected the SWF you built (I assume from Flash Pro: C:\Users\labuser\Documents\FLAs\FlexSwcBug\FlexSwcBugCopy\src\AdobeBugExample_Main.swf)
    4) Running that SWF, I get a warning "SWF Not Compiled for Debugging"
    5) No problem here. I opened Flash Professional to re-publish the SWF with "Permit debugging" on
    6) Back In Flash Builder, I re-ran my launch configuration and I hit the breakpoint just fine
    It's possible that you launched the wrong SWF here. It looks like you set up DocumentClass as a runnable application. This creates a DocumentClass.swf in the bin-debug folder and by default, that's what Flash Builder will create a run config for. That's not the SWF you want.
    In AdobeBugExample_Main.swc, I don't see where classCrashExternal is defined. I see that classCrashMainExample is the class and symbol name for the blue pentagon. Flash Builder reads the SWC fine for me. I'm able to get code hinting for both classes in the SWC.
    Jason San Jose
    Quality Engineer, Flash Builder

  • Workflow Best Practices for 11i and R12

    A workflow best practices document targeted more towards system administrators has been published. Please read MetaLink note 453137.1. Please note that this is not a troubleshooting guide; it is more a set of practices that can lead to a healthier workflow system.
    Any suggestions/improvements from the field are welcome.

    Hi Narayan;
    Please see:
    Interesting Documents Concerning E-Business Suite 11i to R12 Upgrades [ID 850008.1]
    Upgrade Advisor: E-Business Suite (EBS) Technology Stack Upgrade from 11.5.10.2 to 12.1.2 [ID 253.1]
    Those should help you.
    I also suggest using the search mechanism on the EBS forum; you can find our previous topics which mention the same and similar issues.
    Regards
    Helios

  • Unicode Migration using National Characterset data types - Best Practice ?

    I know that Oracle discourages the use of the national character set and the national character set data types (NCHAR, NVARCHAR), but that is the route my company has decided to take, and I would like to know the best practice regarding this, specifically in relation to stored procedures.
    The database schema is being converted by changing all CHAR, VARCHAR and CLOB data types to NCHAR, NVARCHAR and NCLOB data types respectively and I would appreciate any suggestions regarding the changes that need to be made to stored procedures and if there are any hard and fast rules that need to be followed.
    Specific questions that I have are :
    1. Do CHAR and VARCHAR parameters need to be changed to NCHAR and NVARCHAR types ?
    2. Do CHAR and VARCHAR variables need to be changed to NCHAR and NVARCHAR types ?
    3. Do string literals need to be prefixed with 'N' in all cases ? e.g.
    in variable assignments - v_module_name := N'ABCD'
    in variable comparisons - IF v_sp_access_mode = N'DL'
    in calls to other procedures passing string parameters - proc_xyz(v_module_name, N'String Parameter')
    in database column comparisons - WHERE COLUMN_XYZ = N'ABCD'
    If anybody has been through a similar exercise, please share your experience and point out any additional changes that may be required in other areas.
    Database details are as follows. The application is written in COBOL and is also being changed to be Unicode compliant:
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    NLS_CHARACTERSET = WE8MSWIN1252
    NLS_NCHAR_CHARACTERSET = AL16UTF16

    ##1. While doing a test conversion I discovered that VARCHAR parameters need to be changed to NVARCHAR2 and not VARCHAR2; the same applies to VARCHAR variables.
    VARCHAR columns/parameters/variables should not by used as Oracle reserves the right to change their semantics in the future. You should use VARCHAR2/NVARCHAR2.
    ##3. Not sure I understand, are you saying that unicode columns(NVARCHAR2, NCHAR) in the database will only be able to store character strings made up from WE8MSWIN1252 characters ?
    No, I meant literals. You cannot include non-WE8MSWIN1252 characters into a literal. Actually, you can include them under certain conditions but they will be transformed to an escaped form. See also the UNISTR function.
    ## Reason given for going down this route is that our application works with SQL Server and Oracle and this was the best option
    ## to keep the code/schemas consistent between the two databases
    First, you have to keep two sets of scripts anyway because syntax of DDL is different between SQL Server and Oracle. There is therefore little benefit of just keeping the data type names the same while so many things need to be different. If I designed your system, I would use a DB-agnostic object repository and a script generator to produce either SQL Server or Oracle scripts with the appropriate data types or at least I would use some placeholder syntax to replace placeholders with appropriate data types per target system in the application installer.
    ## I don't know if it is possible to create a database in SQL Server with a Unicode characterset/collation like you can in Oracle, that would have been the better option.
    I am not an SQL Server expert but I think VARCHAR data types are restricted to Windows ANSI code pages and those do not include Unicode.
    -- Sergiusz
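    If the application layer talks to these columns over JDBC, most of the literal questions go away when values are bound as parameters rather than embedded in the SQL text. Below is a minimal sketch, not part of the original discussion, using the standard JDBC setNString/getNString calls; the table, column and connection details are hypothetical:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class NVarcharBindExample {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details; adjust for your environment.
            // With the Oracle driver, the defaultNChar=true connection property can also
            // force character bind variables to use the national character set.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger")) {

                // Bind the Unicode value instead of embedding an N'...' literal in the SQL text,
                // so the statement text itself stays within the database character set.
                try (PreparedStatement ins = conn.prepareStatement(
                        "INSERT INTO my_table (id, description) VALUES (?, ?)")) {
                    ins.setInt(1, 1);
                    ins.setNString(2, "Unicode text \u00e9\u4e2d\u6587"); // bound as NVARCHAR
                    ins.executeUpdate();
                }

                // Read it back with getNString to keep the value in the national character set.
                try (PreparedStatement sel = conn.prepareStatement(
                        "SELECT description FROM my_table WHERE id = ?")) {
                    sel.setInt(1, 1);
                    try (ResultSet rs = sel.executeQuery()) {
                        while (rs.next()) {
                            System.out.println(rs.getNString(1));
                        }
                    }
                }
            }
        }
    }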

  • NLS data conversion – best practice

    Hello,
    I have several tables that originate from a database with a single-byte character set. I want to load the data into a database with a multi-byte character set like UTF-8 and, in the future, be able to use the Unicode version of Oracle XE.
    When I'm using DDL scripts to create the tables on the new database, and after that trying to load the data, I receive a lot of error messages regarding the size of the VARCHAR2 fields (which, of course, makes sense).
    As I understand, I can solve the problem by doubling the size of the VARCHAR2 fields: VARCHAR2(20) will become VARCHAR2(40) and so on. Another option is to use the NVARCHAR2 datatype and retain the correlation with the number of characters in the field.
    I never used NVARCHAR2 before, so I don't know if there are any side effects on the pre-built APEX processes like Automatic DML, Automatic Row Fetch and the like, or on the APEX data import mechanism.
    What will be the best practice solution for APEX?
    I'd appreciate any comments on the subject,
    Arie.

    Hello,
    Thanks Maxim and Patrick for your replies.
    I started to answer Maxim when Patrick's post came in. It's interesting, as I tried to change this nls_length_semantics parameter once before, but without any success. I even wrote an APEX procedure to run over all my VARCHAR2 columns and change them to something like VARCHAR2(20 CHAR). However, I wasn't satisfied with this solution, partially because of what Patrick said about developers forgetting the full syntax, and partially because I read that some of the internal procedures (mainly with LOBs) do not support this character mode and always work in byte mode.
    Changing the nls_length_semantics parameter seems like a very good solution, mainly because, as Patrick wrote, " The big advantage is that you don't have to change any scripts or PL/SQL code."
    I'm just curious, what technique does APEX use to run on all the various single-byte and multi-byte character sets?
    Thanks,
    Arie.
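    As a small illustration of the nls_length_semantics approach discussed above, here is a minimal sketch (connection details and table name are hypothetical) that switches the session to character semantics before running the DDL, so VARCHAR2(20) is interpreted as 20 characters rather than 20 bytes:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CharSemanticsExample {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/XE", "apex_user", "secret");
                 Statement stmt = conn.createStatement()) {

                // Interpret lengths in column definitions as characters, not bytes,
                // for all DDL executed in this session.
                stmt.execute("ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR");

                // VARCHAR2(20) now means 20 characters, so a 20-character Unicode
                // value fits even when it needs more than 20 bytes in the database character set.
                stmt.execute("CREATE TABLE demo_customers ("
                        + " id NUMBER PRIMARY KEY,"
                        + " name VARCHAR2(20))");
            }
        }
    }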

  • Data access best practice

    The Oracle web site has an article about 9iAS best practices. Predefining column types in the select statement is one of the topics. The details follow.
    3.5.5 Defining Column Types
    Defining column types provides the following benefits:
    (1) Saves a roundtrip to the database server.
    (2) Defines the datatype for every column of the expected result set.
    (3) For VARCHAR, VARCHAR2, CHAR and CHAR2, specifies their maximum length.
    The following example illustrates the use of this feature. It assumes you have
    imported the oracle.jdbc.* and java.sql.* interfaces and classes.
    // ds is a DataSource object; assumes oracle.jdbc.* and java.sql.* are imported
    Connection conn = ds.getConnection();
    PreparedStatement pstmt = conn.prepareStatement("select empno, ename, hiredate from emp");
    // Avoid a round trip to the database to describe the columns
    ((OraclePreparedStatement) pstmt).defineColumnType(1, Types.INTEGER);
    // Column #2 is a VARCHAR, we need to specify its max length
    ((OraclePreparedStatement) pstmt).defineColumnType(2, Types.VARCHAR, 12);
    ((OraclePreparedStatement) pstmt).defineColumnType(3, Types.DATE);
    ResultSet rset = pstmt.executeQuery();
    while (rset.next()) {
        System.out.println(rset.getInt(1) + "," + rset.getString(2) + "," + rset.getDate(3));
    }
    // Release the JDBC resources when done
    rset.close();
    pstmt.close();
    conn.close();
    Since I'm new to 9iAS, I'm not sure whether it's true that 9iAS really does an extra round trip to the database just for the data types of the columns and then another round trip to get the data. Can anyone confirm this? Also, the above example uses Oracle proprietary interfaces.
    Is there any way to trace the database activity on the application server side without using an enterprise monitoring tool? WebLogic can dump all database activity to a log file so that it can be reviewed.
    thanks!
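    On the tracing question: one generic option, independent of 9iAS, is to wrap the JDBC Connection in a logging proxy so every statement that gets prepared is written to a log. A minimal sketch, purely illustrative (the LoggingConnection class below is my own, not part of 9iAS or the Oracle driver):

    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.Proxy;
    import java.sql.Connection;

    public final class LoggingConnection {
        // Wraps a JDBC Connection so every prepareStatement/prepareCall is logged.
        public static Connection wrap(final Connection target) {
            InvocationHandler handler = (proxy, method, args) -> {
                if (method.getName().startsWith("prepare") && args != null && args.length > 0) {
                    System.out.println("SQL: " + args[0]);   // log the statement text
                }
                return method.invoke(target, args);          // delegate to the real connection
            };
            return (Connection) Proxy.newProxyInstance(
                    Connection.class.getClassLoader(),
                    new Class<?>[] { Connection.class },
                    handler);
        }
    }

    Usage would then be something like Connection conn = LoggingConnection.wrap(ds.getConnection()); before preparing any statements.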

    Dear Srini,
    Data-level security is not an issue for me. I have already implemented it, and so far not a single bug has been caught in testing.
    It's about object-level security, and that too for 6 different types of users demanding different reports, i.e. the columns and detailed drill-downs are different.
    Again, these 6 types of users can be read-only users or power users (who can do ad hoc analysis), for example BICONSUMER and BIAUTHOR.
    So I need help regarding that, as we have to take a decision soon.
    thanks,
    Yogen

  • Data Migration Best Practice

    Is there a clear-cut best practice procedure for conducting data migration from one company to a new one?

    I don't think there is a clear-cut answer for that. Best practice would always be relative; it varies dramatically depending on many factors. There is no magic bullet here.
    One exception to the above: you should always use tab-delimited text format. It is a DTW-friendly format.
    Thanks,
    Gordon

  • Workflow best practices

    Hi all,
    I am new to workflow, and I would like to know what the best practices in WF design are, for example locking objects, checking existence, and so on. It would be great if one of you could provide me with snapshots of the skeleton of a generic WF for change or approval, for example. Thank you in advance.
    Best regards,
    Haku

    Hi,
    Normally there are lots of business objects related to the business-specific scenario for which you're modelling a workflow.
    For example, for employees (Personnel Administration module) there are the business objects BUS1065 and EMPLOYEET (and of course others such as FAMILY).
    Always look at these first to check if a method can be used, or which BO to extend (delegation).
    In many cases there are also standard workflows supplied by SAP which can cover the functional requirement, or which can serve as a basis.
    As for locking objects in your specific case (I'm not at work right now, so this is from the top of my head): enqueueing and dequeueing an employee is a method of BUS1065/EMPLOYEET (there should also be a task for it). When building HR-PA flows, you could include an enqueue of the person as a first step; if it fails, you could create a loop around it which waits for, say, 5 minutes and then tries again. If after 20 attempts (or fewer) you still cannot enqueue the employee, you can send a message to someone. This could be implemented in every workflow you make, and I recommend it.
    Kind regards, Rob Dielemans
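    The enqueue-and-retry loop Rob describes is a generic pattern. Here is a minimal sketch in Java, for illustration only: the lock and notification calls are hypothetical placeholders, whereas the real workflow would use the enqueue method/task of BUS1065 or EMPLOYEET, a wait step instead of a sleep, and a send-mail step for the notification.

    import java.time.Duration;

    public class EnqueueRetryExample {

        // Hypothetical stand-in for the enqueue method of the business object;
        // returns true if the lock could be obtained.
        static boolean tryLockEmployee(String personnelNumber) {
            return Math.random() > 0.5; // placeholder for the real enqueue call
        }

        static void notifyAdministrator(String personnelNumber) {
            System.out.println("Could not lock employee " + personnelNumber);
        }

        public static void main(String[] args) throws InterruptedException {
            String personnelNumber = "00001234";
            int maxAttempts = 20;                       // give up after 20 tries
            Duration wait = Duration.ofMinutes(5);      // wait 5 minutes between tries

            boolean locked = false;
            for (int attempt = 1; attempt <= maxAttempts && !locked; attempt++) {
                locked = tryLockEmployee(personnelNumber);
                if (!locked) {
                    Thread.sleep(wait.toMillis());      // in a workflow this would be a wait step
                }
            }

            if (locked) {
                System.out.println("Employee locked, continue with the approval steps");
                // ... remaining workflow steps, then dequeue the employee
            } else {
                notifyAdministrator(personnelNumber);   // send a message after the retries fail
            }
        }
    }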

  • Dropbox workflow best practices?

    I'm looking for best practices on using Dropbox for an InDesign/InCopy workflow. Specifically, in my own testing with two computers, I've realized that the lockout function does not work when using Dropbox, so multiple people can edit the same file simultaneously... which is obviously problematic. Suggestions for how to avoid this? If I create a "lock" folder and move files in there before editing them, will my designer's Dropbox refresh fast enough to prevent him from also opening that file?
    How are your own Dropbox workflows set up, for those of you that use it? Are there other hiccups, hangups or landmines I should watch out for?

    Well the issues are what you stated yourself, namely that Dropbox doesn't copy the lock file across, and if you manually "locked" it by shifting it into a different folder another user could still grab it before Dropbox had shifted their copy.
    In that scenario though, you could speed up the shift by making sure that the InDesign file was updated on both computers first.  To do that you'd have to save the file, wait until it was uploaded (you get the green tick) and wait at least the same amount of time again (for it to download on the other computer) then shift it.  This means that Dropbox would shift it quicker because that is the only action it is transferring between computers.
    Another thing to be aware of, though, is that if your Dropbox is slogging away at another task (say uploading a folder of linked images) it might push that shift down the queue.
    In conclusion I don't think it's realistic to expect Dropbox to stop two users editing the same file, you need some other project structure to achieve that.

  • Audio workflow - Best practices?

    For the past week, I've been working diligently creating lots of content with Captivate. I have a question about workflow. Since Captivate captures individual screenshots to generate the movie, what is the best way to lay down audio? For example, I may have several slides dealing with some configuration of the software that takes place through a right-click context menu. Each click to get through that menu is a separate slide. Is there a way to get the audio to span several slides or do I just put a short audio clip on each one?

    Hi there
    I'm not sure if it's necessarily "Best practice", but Captivate normally likes to have the audio narration at the slide level.
    However, if you have a few slides where it doesn't make sense to do this, note that you can record a longer audio clip (maybe on a blank slide) then assign the audio to the slide where it should begin. If the clip is longer than the slide, Captivate will offer a dialog allowing you to choose how the clip will be split. You would then choose the option where it simply spills into subsequent slides.
    Cheers... Rick

  • Long-form workflow best practice? (sub clips)

    I am struggling with the new media management and organizational challenges of FCP X.
    Am I the only person in the FCP community who used sub clips?  In all the extensive belly aching about the new FCP, I've yet to see anyone list the death of subclips as a dealbreaker.
    This leaves me to believe that even in the pre-FCP X world people used a different (better) workflow than the one I used for over a decade.
    Tom Wolsky tells me that I'm trying to force the new software to work like the old...  Maybe, but I think it's reasonable to expect that an "upgrade" means that the new version of a software will have all the functionality of the software it's replacing with new capabilities added....  What we have, however, are lots of new (and admittedly) very desirable new capabilities with (infuriatingly) many of the capabilities most of us thought were crucial to our editing approaches stripped entirely.  Of course there's the promise that "some" of these capabilities will return at some future, unspecified date.
    Enough of my mini-rant, here's the question....
    If I shoot 22 hours of raw footage and hope to mine 90 minutes for the final cut, which has approximately 45 scenes, how should I use these radical new organization features to be able to easily locate my clips?
    In the old way, I'd get my Sony PMW-EX3 clips into FCP (not intuitive--another question, when will FCP accept the native SxS files?), then look at each of those master clips and cut those clips into sub clips.
    Each subclip would have a unique name. And I would create a series of folders for the subclips. Very clean, very organized, and when I need a cutaway shot of the dog barking in scene 22, I look at folder 22, open it and there it is... "Dog Barking."
    The new FCP X uses Favorites or Keywords... both approaches inferior (in my opinion) to my old workflow...  So what am I missing?  How is the new way better?  Was there a better way before that I was just missing???
    And now I'm getting the whiff of an even bigger problem...  I haven't encountered it myself but others are finding that once a project exceeds twelve minutes, things become terribly unstable.
    Oh, well.  Growing pains, I suppose.  And people are correct, I can just use FCP 7...  but I had such hopes... such hopes....
    Dale

    Yes, I know FCP X is an upgrade in name only.  That's part of Apple's PR headache over this whole debacle.  When people launch a program and 10.0 Final Cut Pro X stares them in the face, they are justified to expect that it's a true update (upgrade) of Final Cut Pro.  And with Apple overselling the product.... oh, well, it is what it is.
    Most of us would have been happy with Final Cut Pro 8, a true 64 bit upgrade of the program that was working for us.
    Yes, Favorites and Keywords are similar to Subclips but they are still so tied to the Master Clip.. If I Favorite 10 clips and sort by favorites, I get the Master Clips and have to hit the triangle beside them to see the Favorites.  Even if all 10 Favorites are from the same Master Clip, I get ten redundant lines I have to wade through....
    The old way was better... One bin with ten uniquely named subclips.  Clear and clean....
    The new program does not seem informed by anyone who ever made a movie longer than five minutes... and evidently (from all reviews) it wasn't influenced by anyone who considers himself a professional editor.  (I do not consider myself a professional editor, just an independent movie maker who writes, directs, acts, edits, etc., and who happily used Final Cut Pro for over a decade.)
    I'm hanging around for FCP X 10.1... I've never seen Apple so mishandle a situation...  On a much smaller scale, it's Apple's Vista moment....
    Thanks for responding...
    Dale

  • Data Load Best Practice.

    Hi,
    I need to know the best way to load data from a source. Is SQL load the best way, or is using data files better? What are the inherent advantages and disadvantages of the two approaches?
    Thanks for any help.

    I have faced a scenario that I will explain here.
    I had an ASO cube, and data was loaded from a text file on a daily basis; the data volume was huge. There were problems in the data file as well as in the master file (the file used for dimension building).
    The data and master files contained special characters like ‘ , : ~ ` # $ %, blank spaces and tab spaces. Even the ETL process could not remove these, because they occur within the data itself.
    Sometimes comments or database errors were also present in the data file.
    I had problems building a rule file with a different delimiter; most of the time I would find the same character within the data that was being used as the delimiter. This increases the number of data fields and Essbase gives an error.
    So I used a SQL table for the data load. A launch (staging) table is created and the data is populated into this table. All errors are removed there before the data is loaded into Essbase.
    That was my scenario (in this case I found the SQL load the better of the two options).
    Thanks
    Dhanjit G.
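    Dhanjit's approach is essentially a staging-and-cleansing step before the Essbase load. As a rough illustration only (the file names and the set of characters to strip are assumptions, and this is no substitute for a proper rule file or the SQL staging table he describes), a pre-processing pass over a tab-delimited extract could look like this:

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class CleanExtract {
        public static void main(String[] args) throws IOException {
            Path in = Path.of("daily_extract.txt");        // hypothetical raw data file
            Path out = Path.of("daily_extract_clean.txt"); // cleansed copy for the load

            try (BufferedReader reader = Files.newBufferedReader(in);
                 BufferedWriter writer = Files.newBufferedWriter(out)) {
                String line;
                while ((line = reader.readLine()) != null) {
                    // Skip comment lines that sometimes end up in the extract.
                    if (line.startsWith("#")) {
                        continue;
                    }
                    // Strip characters that collide with common rule-file delimiters,
                    // but keep the tab that separates the fields themselves.
                    String cleaned = line.replaceAll("['\u2018\u2019,:~`#$%]", "");
                    writer.write(cleaned);
                    writer.newLine();
                }
            }
        }
    }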

  • Data Mining Best Practices

    Our organization is just beginning to use Data Mining in BI. We're trying to understand what the typical protocol is for moving data models into production? Is it standard practice to create the data models directly in the Production system or are these changes typically transported into the Production system? I have been unable to find any information on this in my research and would appreciate any input to help guide our decisions.
    Thanks,
    Nicole Daley

    Hi There,
    You're on the right track, here are a few additional guidelines:
    1. Determine your coverage levels along with the desired minimum data rate required by your application(s). Disabling lower data rates does have a significant impact on your coverage area.
    2. You have already prevented 802.11b clients by disabling 1, 2, 5.5 and 11 Mbps -- so that piece is taken care of.
    3. Typically, we see deployments having the lowest enabled data rate set to mandatory. This allows for the best client compatibility. You can also have higher mandatory rates, but then you need to confirm that all client devices will in fact support those higher rates. (Most clients do, but there are some exceptions). Worth noting here is that multicast traffic will be sent out at the highest mandatory data rate -- so if you have the need for higher bandwidth multicast traffic, you may want to have another data rate(s) set as mandatory.
    -Patrick Croak
    Wireless TAC

  • Timing beats for music video - workflow best practices?

    Hi all,
    I am currently working on a short video that needs to be timed to music. Obviously in some respects a tool like Final Cut or Premiere would be more suited to the task, but for reasons of the technical specification of the output, it has to be done in After Effects.
    So I am trying to nail down a good workflow for timing various elements of the project to the beat of the soundtrack. What I'm doing so far is, I have expanded the waveform of the soundtrack so I can get a rough visual sense of where obvious changes happen. I placed a marker on the exact frame where the drum kick happens, where I would want to do something cool with the video.
    This is obviously of limited utility, because although the beat does coincide with one of the peaks of the waveform, it's not apparent from looking at it which one it would be. It's also not quite possible to ascertain the exact frame on which the beat happens using this method, as each peak is several frames long and includes several elements of the audio.
    So the next thing I tried is simply listening to the track with RAM-preview, and pressing the * key to place markers on the soundtrack on the beat. Maybe some of you are better at rhythm games than I could hope to be, but I find that while this lets me put down a lot of markers quickly, it's very inexact.
    Which, alright, is to be expected - I might not get every marker on the exact frame on the first try, but I can go back and slide them a few frames, right? But the problem here is with scrubbing audio, and with visual feedback with the playhead. When you RAM preview, the playhead does not update on every frame - it updates only every seven or eight frames or so. Which means that if I watch the timeline to see if my markers are in the right place, I can't tell when the playhead passes over them.
    So instead, I hold down the Ctrl button and scrub. This kinda works, but the playback of the audio is quite choppy and discontiguous. As such it's hard to get a sense of each frame as part of the song, and even if I do hear a beat it's hard to recognize it as such (this technique works better in Final Cut somehow, for some reason it manages to scrub while sounding less discontiguous). So then what I do is I stop on a frame I think might possibly be the beat, and I hold Ctrl and Left Click on the playhead, which loops a short segment (five frames?) of the audio. Then I do the same for a few frames before and afterward, just to make sure I targeted the right frame. Now, finally, I have the marker in the right place for a single beat.
    There's got to be a better way.
    I wonder if anyone who has been faced with this problem has figured out a more effective workflow? I am thinking there might be a separate piece of software I might use that lets me go through the audio and place markers better - and then allows me to somehow export marker data to After Effects? Or maybe another workflow I haven't thought of.
    Your thoughts?

    Cutting music accurately to beats is usually a simple matter of math. Most music, especially music with a strong beat, runs at a specific number of beats per minute, and that's usually easy to figure out. There are times when you have a tempo shift, a fermata, an accelerando or ritardando, or some other change, but these are usually of short duration and pretty easy to fix. Figure out the tempo and you're about 90% there.
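    To put that math into numbers: here is a minimal sketch, assuming a hypothetical 120 BPM track in a 29.97 fps comp, that prints the frame each beat lands on, which is where the markers would go.

    public class BeatFrames {
        public static void main(String[] args) {
            double bpm = 120.0;              // hypothetical tempo of the track
            double fps = 29.97;              // hypothetical comp frame rate
            int beatsToMark = 16;

            double framesPerBeat = fps * 60.0 / bpm;   // 29.97 * 60 / 120 = about 15 frames per beat
            for (int beat = 0; beat < beatsToMark; beat++) {
                long frame = Math.round(beat * framesPerBeat);
                System.out.printf("beat %2d -> frame %d%n", beat + 1, frame);
            }
        }
    }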
    Another tool, and one that I use quite often is Sound Keys from Trapcode. This is much more versatile than using the built in Convert audio to keyframes.
    If I have a fairly long segment that has a bunch of quick beats I will sometimes create a small solid, add Hue/Saturation to the small solid set to colorize, add the expression index * 140 to the colorize hue, trim the length of the layer to the length of a beat, then duplicate the layer for as many beats as I want to cover, sequence the layers, and then use that layer sequence as a guide.
    You could also use a loopOut expression on rotation to rotate a small layer, say 45º, on every beat. For example, if you had a beat every 20 frames, set 3 hold keyframes: one at the first frame for 0º, one at frame 20 for 40º, then one at frame 40 for 0º, and simply add the expression loopOut("cycle") to rotation. Now the little cube would snap to a different angle on every beat.
    These tricks will help you with some fast cuts right to the beat, but let me give you some advice on cutting to music that I learned 42 years ago (yes, I am an old guy) when I was learning to edit to music at my first job in film. My mentor had been in the business since 1952, had worked as an editor on a ton of feature films and a zillion commercials. He told me that if you precisely match the beat, which was pretty easy to find and mark with a wax pencil on mag stock, your cuts will look out of sync. The problem with that precision is that the eye and the ear are not precisely in sync. The lead or lag time depends on the shot, where the shot is taking your eye, where the cut moves your eye, and the mood the music is creating. The best technique is to listen to the music and place markers as you tap out the beat. In the old days I did this by tapping a wax pencil on the mag stock as it ran through the Steenbeck or Moviola; today I do it by tapping the * key, then I cut and adjust the cut point letting the shot dictate when the music and beat line up emotionally. You'll get a better result and your work will go much faster than fussing around trying to precisely match the beat with every cut. I have shifted cuts as many as 4 or 5 frames ahead of or behind the beat to get a sequence to feel right.
    I do use the two techniques I listed above to cut short segments of very tight cuts and they help save time, but setting your comp resolution to auto and your comp Zoom factor to 25%, ram preview to "From current time," and just cutting to the beat by feel will give you a much more pleasing and emotional response to your sequence than precisely and mathematically exactly matching the beat. The final piece of advice I can give you about cutting music is to make the first cut, then walk away for at least a half hour, then come back and look at what you have done. You'll instantly see where you need to make adjustments and it will take you less than half the time to cut your sequence.

  • Help!  (Data Recovery best practices question)

    Recently my fiancé's MacBook (the first white model that used an Intel chipset) running 10.6.2 began to behave strangely (slow response time, hanging while applications launch, etc.). I decided to take an old external USB HD I had lying around and format it on my MBP in order to Time Machine her photos and iTunes library. Time Machine would not complete a backup and I could not get any of the folders to copy through Finder (various file corruption errors). I assumed it could be a permission issue, so I inadvertently fired up my 10.5 disk and did a permission repair. Afterwards the disk was even more flaky (which I believe was self-inflicted when I repaired with 10.5).
    I've since created a 10.6.2 bootable flash key and went out and bought Disk Warrior (4.2). I ran a directory repair and several disk util repairs but was still unable to get the machine to behave properly (and unable to get time machine to complete). Attempting to run permission repairs while booted to USB or the Snow Leopard install disk resulted in it hanging at the '1 minute remaining' for well over an hour. My next step was to re-install Snow Leopard but the install keeps failing after the progress bar completes.
    As it stands now the volume on the internal HD is not bootable and I'm running off my USB key boot drive, using 'cp -R *' in Terminal to copy her user folder onto the external USB hard drive. It seems to be working, but it's painfully slow (somewhere along the lines of maybe 10 MB per half hour, with 30 GB to copy). I'm guessing this speed has to do with my boot volume running off a flash drive.
    I'm thinking of running out and grabbing a FireWire cable and doing a target boot from my MBP, hoping that would be a lot faster than what I'm experiencing now. My question is, would that be the wisest way to go? My plan of action was to grab her pictures and music, then erase and reformat the drive. Is it possible that I could try something else with Disk Warrior? I've heard a lot of good things about it, but I fear that I did a number on it when I accidentally ran the 10.5 permission repair on the volume.
    Any additional help would be appreciated, as she has years of pictures on there that I'd hate to see her lose.

    That sounds like a sensible solution, although you need not replace the original drive. Install OS X on the external drive, boot from it and copy her data. Then erase her drive and use Disk Utility's Restore option to clone the external drive to the internal drive. If that works then she should continue using the external drive as a backup so the next time this happens she can restore from the backup.
    For next time: repairing permissions is not a troubleshooting tool. It's rarely of any use, and it does not repair permissions in a Home folder. If a system is becoming unresponsive or just slower, there are other things you should do. See the following:
    Kappy's Personal Suggestions for OS X Maintenance
    For disk repairs use Disk Utility. For situations DU cannot handle the best third-party utilities are: Disk Warrior; DW only fixes problems with the disk directory, but most disk problems are caused by directory corruption; Disk Warrior 4.x is now Intel Mac compatible. TechTool Pro provides additional repair options including file repair and recovery, system diagnostics, and disk defragmentation. TechTool Pro 4.5.1 or higher are Intel Mac compatible; Drive Genius is similar to TechTool Pro in terms of the various repair services provided. Versions 1.5.1 or later are Intel Mac compatible.
    OS X performs certain maintenance functions that are scheduled to occur on a daily, weekly, or monthly period. The maintenance scripts run in the early AM only if the computer is turned on 24/7 (no sleep.) If this isn't the case, then an excellent solution is to download and install a shareware utility such as Macaroni, JAW PseudoAnacron, or Anacron that will automate the maintenance activity regardless of whether the computer is turned off or asleep. Dependence upon third-party utilities to run the periodic maintenance scripts had been significantly reduced in Tiger and Leopard. These utilities have limited or no functionality with Snow Leopard and should not be installed.
    OS X automatically defrags files less than 20 MBs in size, so unless you have a disk full of very large files there's little need for defragmenting the hard drive. As for virus protection there are few if any such animals affecting OS X. You can protect the computer easily using the freeware Open Source virus protection software ClamXAV. Personally I would avoid most commercial anti-virus software because of their potential for causing problems.
    I would also recommend downloading the shareware utility TinkerTool System that you can use for periodic maintenance such as removing old logfiles and archives, clearing caches, etc. Other utilities are also available such as Onyx, Leopard Cache Cleaner, CockTail, and Xupport, for example.
    For emergency repairs install the freeware utility Applejack (not compatible with Snow Leopard.) If you cannot start up in OS X, you may be able to start in single-user mode from which you can run Applejack to do a whole set of repair and maintenance routines from the commandline. Note that AppleJack 1.5 is required for Leopard. AppleJack is not compatible with Snow Leopard.
    When you install any new system software or updates be sure to repair the hard drive and permissions beforehand. I also recommend booting into safe mode before doing system software updates.
    Get an external Firewire drive at least equal in size to the internal hard drive and make (and maintain) a bootable clone/backup. You can make a bootable clone using the Restore option of Disk Utility. You can also make and maintain clones with good backup software. My personal recommendations are (order is not significant):
    1. Retrospect Desktop (Commercial - not yet universal binary)
    2. Synchronize! Pro X (Commercial)
    3. Synk (Backup, Standard, or Pro)
    4. Deja Vu (Shareware)
    5. Carbon Copy Cloner (Donationware)
    6. SuperDuper! (Commercial)
    7. Intego Personal Backup (Commercial)
    8. Data Backup (Commercial)
    9. SilverKeeper 2.0 (Freeware)
    10. MimMac (Commercial)
    11. Tri-Backup (Commercial)
    Visit The XLab FAQs and read the FAQs on maintenance, optimization, virus protection, and backup and restore.
    Additional suggestions will be found in Mac Maintenance Quick Assist.
    Referenced software can be found at www.versiontracker.com and www.macupdate.com.
