HD in CS3; Best Practice?

I have a new Sony HVR-Z5U camera.  I use CS4, so my Sony HD footage (1080i/60i, file type .M2T) imports directly with no problem.  However, a couple of days ago I shot some footage for a friend who is still using CS3.  What do I need to do to my 1080i/60i M2T footage so he can import it into his CS3 with minimal quality loss?
In CS4 Media Encoder, I see I can convert the M2T files to AVI, but the AVIs are 720x480.  The HD M2T files have a much larger frame size, so I'm concerned that converting them to a standard-definition AVI may degrade the quality.
What would you suggest I do if my goal is to provide my friend with the highest-quality footage possible that will still import into CS3?
TIA,
jd

It's not a free option, but you could consider NeoScene from Cineform. Excellent quality.
http://www.cineform.com/neoscene/

Similar Messages

  • Best practice for smooth workflow in PrE?

    Hi all.  I've been an FCP user for many, many years, but I'm helping an artist friend of mine with a Kickstarter video...and he's insistent that he's going to do it himself on his Dell laptop running Win7 and PrE (I believe v11, from the CS3 package)...so I'm turning to the forum here for some help.
    In Apple Land (that is, those of us still using FCP 7), we take all our elements in whatever format they're delivered to us and transcode them to ProRes, DVCPro HD or XDCAM...it just makes it easier not to deal with mixed formats on the timeline (please, no snarky comments about that, OK? I turn out broadcast work every week doing this, so this method's got something going for it...).  However, when I fired up PrE I saw that you can edit in all sorts of formats, including long-GOP formats like .mts and .mp4 files that I wouldn't dream of working with natively in FCP...I don't enjoy staring at spinning beach balls that much. 
    Now, remembering that he's working with a severely underpowered laptop, with 2 GB of RAM and a USB2 connection to his 7200 rpm "video" drive...and also considering that most of the video he'll be using will come in two flavors (AVCHD from a Canon Vixia 100, and HDV from a Canon EX-something or other), what would be the best way to proceed to maximize the ease with which he can edit?  I'm thinking that transcoding to something like Motion-JPEG or some other intra-frame compressed AVI format would be the way to go...it's a short video and he won't have that much material, so file size inflation isn't an issue...speed and ease of processing the video files on the timeline (or do they call it a "Sceneline"?) is.
    Any advice (besides "buy another computer") would be appreciated...

    Steve, thanks, this is helping me now.
    Steve, thanks, this is helping me now.
    I mention MJPEG because, as an intraframe compression method, it's less processor-intensive than GOP-style MPEG compression.  Again, my point of reference is the Mac/FCP7 world (so open my eyes as to how an Intel processor running Win7 would work differently), but over there best practice says NOT to edit in a GOP-based codec (XDCAM being the exception that proves the rule, e.g., render times), but to transcode everything from, say, H.264 or AVC-whatever into ProRes.  YES, I know PrE (and PPro) doesn't use ProRes...not asking that.  But, at least at this juncture, any sort of hardware upgrade is out of the question...this is what he's gonna be using to edit.  Now if I were going to be using an underpowered Mac laptop to try to edit, I most certainly would not try to push native AVCHD .mts files or native H.264 files through it...those don't even work well with the biggest Mac Pro towers.  What is it about PrE that allows it to efficiently work with these processor-intensive formats?  This is the crux of the issue, as I'm going to advise my friend to "work this way" and I don't want to send him down the garden path of render hell...
    And finally, your advice to run tests is well-given...since I have no experience with PrE and his computer, I guess that's where we'll start...

  • Titling Best Practices

    Hello
    Working with AE and Premiere, I am starting to wonder what the best way of titling my videos in Premiere is.
    When I use AE I can get all sorts of fancy results but sometimes the quality of the edges of the fonts and stuff looks jaggy.
    My project is video from a consumer camera, a Panasonic PV-GS150 to be exact, and although it is a good camera, the video isn't top quality.
    After I capture my footage in Premiere I get it laid out, and then I use AE to create compositions that I import and drop into the video project in Premiere, placing them over the footage to create my titles. I have had differing success: sometimes it is super crisp and sometimes it shows a case of the jaggies.
    Could this be caused by interlacing, or by the type of scan that I set the render of the video to? I have been using progressive scan lately, and I think the video quality of some of my latest projects has been good.
    Being a perfectionist, I would like to know if there is something I am doing wrong. Is there a setting that would cause a little jagginess, or some process that I could be doing differently?
    I was thinking about taking the section of video where I want my titles, putting it in the AE project, and then putting the AE comp into the Premiere timeline in place of where I want the titles to go.
    What is the best way?

    Have you read the
    "Best practices for creating text and vector graphics for video" section of After Effects CS3 Help on the Web?
    If you don't feel like clicking, here's the body of that document:
    "Text that looks good on your computer screen as you are creating it can sometimes look bad when viewed in a final output movie. These differences can arise from the device used to view the movie or from the compression scheme used to encode the movie. The same is true for other vector graphics, such as shapes in shape layers. Keep the following in mind as you create and animate text and vector graphics for video:
    * You should always preview your movie on the same sort of device that your audience will use to view it, such as an NTSC video monitor. (See Preview on an external video monitor.)
    * Avoid sharp color transitions, especially from one highly saturated color to its complementary color. Sharp color transitions are difficult for many compression schemes, such as those used by the MPEG and JPEG standards, to encode. This can cause visual noise near sharp transitions. For analog television, the same sharp transitions can cause spikes outside the allowed range for the signal, also causing noise.
    * When text will be over moving images, make sure that the text has a contrasting border (such as a glow or a stroke) so that the text is still readable when something the same color as the fill passes behind the text.
    * Avoid thin horizontal elements, which can vanish from the frame if they happen to be on an even scan line during an odd field, or vice versa. The height of the horizontal bar in a capital H, for example, should be three pixels or greater. You can thicken horizontal elements by increasing font size, using a bold (or faux bold) style, or applying a stroke. (See Formatting characters.)
    * When animating text to move vertically (for scrolling credits, for example), move the text vertically at a rate in pixels per second that is an even multiple of the field rate for the interlaced video format. This prevents a kind of twitter that can come from the text movement being out of phase with the scan lines. For NTSC, good values include 0, 119.88, and 239.76 pixels per second; for PAL, good values include 0, 100, and 200 pixels per second.
    Apply the Autoscroll - Vertical animation preset in the Behaviors category to quickly create a vertical text crawl.
    Fortunately, many problems with text in video and compressed movie formats can be solved with one simple technique: Apply Fast Blur to the text layer, with a Blurriness setting between 1 and 2. A slight blur can soften color transitions and cause thin horizontal elements to expand."

  • Best practice for updating SL to 10.6.8

    I recently purchased a 2009 iMac with the Snow Leopard upgrade installed (OS 10.6).
    It looks as though there are two updates; should I install both, or can I just install the latest? I'd appreciate being directed to any best-practices discussions.
    FYI I will want to install Rosetta for older applications, CS3 & CS4, that I need for old client files. Thanks.
    Ali

    Buy one. Anything you want to keep shouldn't be on only one drive; problems may occur at any time, and are particularly likely to occur during an OS update or upgrade.

  • Best Practices question re Windows XP & Parallels 4.0 installation

    To Apple Gurus here:
    I am a new convert from Windows to Mac. I just bought a MacBook Pro (4 GB RAM / 320 GB disk, 2.4 GHz) and a copy of Parallels 4.0. I have an OEM copy of Windows XP Pro and Photoshop CS4 for Windows. The question before me is what sequence I should follow when installing Windows and Parallels. Logically, I think I should install:
    1) Windows XP using Boot Camp first,
    2) then install PhotoShop CS3 for Windows in the Windows partition
    3) then install MS Office
    4) and finally install Parallels 4.0.
    Is this the right sequence or indeed a "Best Practices" scenario?
    Any tips for a 'Best Practice' installation will be highly appreciated.
    Also, is anyone here using the SAP GUI for Mac OS-X & Citrix Presentation Server Client for Mac OS 10.0 (now renamed XenApp)?

    First, my creds: I don't consider myself an Apple guru. I have been running a MB since last December, and at that time I installed Parallels 3.0. If I remember correctly, after installing Parallels I installed Windows Vista, and then Office, and while I was impressed to be able to run MS Office on a MB, it took what I considered to be TOO long to load, and then the performance was not that great. So mostly I've stayed on the Mac side of the operation and only loaded Parallels if I had to run some MS program.
    About a week ago I got an offer from Parallels to buy 4.0 at an upgrade price of $40. I went with the box version since it was the same price as the download version. Tonight I got my courage up to do the upgrade. I was leery because I thought I might have to reinstall all my MS stuff (Office Pro, etc.). When I put the disk in to install the program, I received a message saying there was a later edition available, with the option to download it or install the box edition. After a few minutes of thought, I decided to do the download version. I would still recommend getting the box version, since you get a manual with it, although the download version comes with a PDF manual.
    When I finished, I then clicked on upgrade/install and the installation proceeded without much input from me. Lo and behold, the installation finished and it booted up to my previous Vista installation with all my programs intact.
    So far, I must say I'm VERY impressed with this Parallels upgrade. It seems to load much faster, the programs are more responsive, Vista so far seems very stable, and the ability to switch back and forth between Windows and OS X is totally better. From what I've seen so far, I would highly recommend that anyone using Parallels 3.0 get this upgrade. While I've only been using it a few hours, it seems like the best upgrade for ANY program/system (Windows 95 --> Vista) that I've ever done.
    A few months ago I saw a piece on an upgraded version of Fusion which stated that it moved Fusion ahead of Parallels. If that were so, I think the ball must be back in Parallels' court with 4.0.

  • Logical level in Fact tables - best practice

    Hi all,
    I am currently working on a complex OBIEE project/solution where I am going straight to the production tables, so the fact (and dimension) tables are pretty complex, since I am using multiple sources in the logical tables to increase performance. Anyway, what I often struggle with is the Logical Levels (on the Content tab), where the level of each dimension is to be set. In a star schema (one-to-many) this is pretty straightforward and easy to set up, but when the Business Model (and physical model) gets more complex, I sometimes struggle with the aggregates and with getting them to work/appear with different dimensions. (Using the menu "More" - "Get levels" does not always give the best solution... far from it.) I have some combinations of left and right outer joins as well, making it even more complicated for the BI server.
    For instance, I have about 10-12 different dimensions - should all of them always be connected to each fact table, either at the Detail or the Total level? I can see the use of the logical levels when using aggregate fact tables (on quarter, month, etc.), but is it better just to skip the logical level setup when no aggregate tables are used? Sometimes it seems like that is the easiest approach...
    Does anyone have a best practice concerning this issue? I have googled for this but I haven't found anything good yet. Any ideas/articles are highly appreciated.

    Hi User,
    You asked: "For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table? Either on Detail or Total level." It is not necessary to connect every dimension; it depends on the report you are creating. As a best practice, though, we should maintain them all at the Detail level when you have defined the join conditions in the physical layer.
    For example, for the sales fact table, if you want to report at the ProductDimension.ProductName level, then you should use the Detail level; otherwise use the Total level (at the Product or Employee level).
    From the Admin Guide ("Get Levels" definition): "Get Levels. (Available only for fact tables) Changes aggregation content. If joins do not exist between fact table sources and dimension table sources (for example, if the same physical table is in both sources), the aggregation content determined by the Administration Tool will not include the aggregation content of this dimension."
    thanks,
    Saichand.v

  • Best practices for setting up users on a small office network?

    Hello,
    I am setting up a small office and am wondering what the best practices/steps are to set up and manage the admin, the user logins, and the sharing privileges for the setup below:
    Users: 5 users on new iMacs (x3) and upgraded G4s (x2)
    Video Editing Suite: Want to connect a new iMac and a Mac Pro, on an open login (multiple users)
    All machines need to be able to connect to the network, the peripherals, and the external hard drive. I would also like to set up drop boxes to easily share files between the computers (I was thinking of using the external hard drive for this).
    Thank you,

    Hi,
    Thanks for your posting.
    When you install AD DS in the hub or staging site, disconnect the installed domain controller, and then ship the computer to the remote site, you are disconnecting a viable domain controller from the replication topology.
    For more detailed information, please refer to:
    Best Practices for Adding Domain Controllers in Remote Sites
    http://technet.microsoft.com/en-us/library/cc794962(v=ws.10).aspx
    Regards.
    Vivian Wang

  • Add fields in transformations in BI 7 (best practice)?

    Hi Experts,
    I have a question regarding transformation of data in BI 7.0.
    Task:
    Add new fields in a second level DSO, based on some manipulation of first level DSO data. In 3.5 we would have used a start routine to manipulate and append the new fields to the structure.
    Possible solutions:
    1) Add the new fields to first level DSO as well (empty)
    - Pro: Simple, easy to understand
    - Con: Consumes disk space, degrades performance when writing to the first level DSO
    2) Use routines in the field mapping
    - Pro: Simple
    - Con: Hard to performance optimize (we could of course fill an internal table in the start routine and then read from this to get some performance optimization, but the solution would be more complex).
    3) Update the fields in the End routine
    - Pro: Simple, easy to understand, can be performance optimized
    - Con: We need to ensure that the data we need also exists (i.e. if we have one field in DSO 1 that we only use to calculate a field in DSO 2, this would also have to be mapped to DSO 2 in order to exist in the routine).
    Does anybody know what the best practice is? Or do you have any experience regarding what you see as the best solution?
    Thank you in advance,
    Mikael

    Hi Mikael.
    I like the 3rd option and have used it many, many times.  In answer to your question:
    Update the fields in the End routine
    - Pro: Simple, easy to understand, can be performance optimized - Yes, I have read about and tested this, and it works faster.  There is an OSS consulting note out there indicating the speed of the end routine.
    - Con: We need to ensure that the data we need also exists (i.e. if we have one field in DSO 1 that we only use to calculate a field in DSO 2, this would also have to be mapped to DSO 2 in order to exist in the routine). - Yes, but by using the result package, the manipulation can be done easily.
    Hope it helps.
    Thanks,
    Pom

  • Temp Tables - Best Practice

    Hello,
    I have a customer who uses temp tables all over their application.
    This customer is a novice and the app has its roots in VB6. We are converting it to .NET.
    I would really like to know the best practice for using temp tables.
    I have seen code like this in the app.
    CR2.Database.Tables.Item(1).Location = "tempdb.dbo.[##Scott_xwPaySheetDtlForN]"
    That seems to work, though I do not know why the full tempdb.dbo.[## prefix is required.
    However, when I use this in the new report I am building, I get runtime errors.
    I also tried this:
    CR2.Database.Tables.Item(1).Location = "##Scott_xwPaySheetDtlForN"
    I did not get errors, but the data returned was not what I expected.
    Before I delve into different ways to do this, I could use some help with a good pattern to use.
    thanks

    Hi Scott,
    Are you using the RDC still? It's not clear but looks like it.
    We had an API that could piggyback the HDBC handle in the RDC (craxdrt.dll), but that API is no longer available in .NET. Also, the RDC is not supported in .NET, since .NET uses the framework and the RDC is COM.
    The workaround is to copy the temp data into a data set and then set the location to the data set. There is no way that I know of to get to tempdb from .NET. The reason is that there is no CR API to set the owner of the table to the user, and MS SQL Server locks tempdb so that user has exclusive rights on it.
    Thank you
    Don

  • Best Practice for Significant Amounts of Data

    This is basically a best-practice/concept question and it spans both Xcelsius & Excel functions:
    I am working on a dashboard for the US Military to report on some basic financial transactions that happen on bases around the globe.  These transactions fall into four categories, so my aggregation is as follows:
    Year,Month,Country,Base,Category (data is Transaction Count and Total Amount)
    This is a rather high level of aggregation, and it takes about 20 million transactions and aggregates them into about 6000 rows of data for a two year period.
    I would like to allow the users to select a Category and a country and see a chart which summarizes transactions for that country ( X-axis for Month, Y-axis Transaction Count or Amount ).  I would like each series on this chart to represent a Base.
    My problem is that 6000 rows still appears to be too many rows for an Xcelsius dashboard to handle.  I have followed the Concatenated Key approach and used SUMIF to populate a matrix with the data for use in the chart.  This matrix would have Bases as row headings (only those within the selected country) and Months as column headings.  The data would be COUNT. (I also need the same matrix with dollar amounts as the data.)
    In Excel this matrix works fine and seems to be very fast.  The problem is with Xcelsius.  I have imported the spreadsheet but have NOT even created the chart yet, and Xcelsius is CHOKING (and crashing).  I changed Max Rows to 7000 to accommodate the data.  I placed a simple combo box and a grid on the canvas (but NO CHART yet) and the dashboard takes forever to generate and is REALLY slow to react to a simple change in the combo box.
    So, I guess this brings up a few questions:
    1)     Am I doing something wrong and did I miss something that would prevent this problem?
    2)     If this is standard Xcelsius behavior, what are the Best Practices to solve the problem?
    a.     Do I have to create 50 different Data Ranges in order to improve performance (i.e. Each Country-Category would have a separate range)?
    b.     Would it even work if it had that many data ranges in it?
    c.     Do you aggregate it as a crosstab (Months as column headings) and insert that crosstabbed data into Excel?
    d.     Other ideas that I'm missing?
    FYI:  These dashboards will be exported to PDF and distributed.  They will not be connected to a server or data source.
    Any thoughts or guidance would be appreciated.
    Thanks,
    David

    Hi David,
    I would leave your query
    "Am I doing something wrong and did I miss something that would prevent this problem?"
    to the experts/ gurus out here on this forum.
    From my end, you can follow
    TOP 10 EXCEL TIPS FOR SUCCESS
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/204c3259-edb2-2b10-4a84-a754c9e1aea8
    Please follow the Xcelsius Best Practices at
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a084a11c-6564-2b10-79ac-cc1eb3f017ac
    In order to reduce the size of xlf and swf files follow
    http://myxcelsius.com/2009/03/18/reduce-the-size-of-your-xlf-and-swf-files/
    Hope this helps to a certain extent.
    Regards
    Nikhil

  • Best-practice for Catalog Views ? :|

    Hello community,
    A best practice question:
    The situation: I have several product categories (110), several items in those categories (4000), and 300 end users.   I would like to know the best practice for segmenting the catalog.   I mean, some users should only see categories 10, 20 & 30; other users only category 80, etc.   The problem is how I can implement this.
    My first idea is:
    1. Create 110 procurement catalogs (one for every product category).   Each catalog should contain only its product category.
    2. Assign, in my Org Model at the user level, all the "catalogs" that the user should access.
    Do you have any ideas on how to improve this?
    Saludos desde Mexico,
    Diego

    Hi,
    Your way of doing it will work, but you'll get maintenance issues (too many catalogs, and a catalog link to maintain for each user).
    The other way is to build your views in CCM and assign these views to the users, either on the roles (PFCG) or on the user (SU01). The problem is that with CCM 1.0 this is limited, because you'll have to assign the items to each view one by one (no dynamic or mass processes); it has been enhanced in CCM 2.0.
    My advice:
    - Challenge your customer about views, and try to limit the number of views, with, for example, strategic and non-strategic.
    - With CCM 1.0, stick to the procurement catalogs, or implement BAdIs to assign items to the views (I have experience with this; it works, but it is quite difficult), and keep the number of views limited.
    Good luck.
    Vadim

  • Best practice on sqlite for games?

    Hi Everyone, I'm new to building games/apps, so I apologize if this question is redundant...
    I am developing a couple of games for Android/iOS, and was initially using a regular (unencrypted) SQLite database. I need to populate the database with a lot of info for the games, such as levels, store items, etc. Originally, I was creating the database with SQLite Manager (Firefox), and then when I install a game on a device, it copies that pre-populated database to the device. However, if someone was able to access that app's database, they could feasibly add unlimited coins to their account, unlock every level, etc.
    So I have a few questions:
    First, can someone access that data in an APK/IPA app once downloaded from the app store, or is the method I've been using above secure and good practice?
    Second, is the best solution to go with an encrypted database? I know Adobe Air has the built-in support for that, and I have the perfect article on how to create it (Ten tips for building better Adobe AIR applications | Adobe Developer Connection) but I would like the expert community opinion on this.
    Now, if the answer is to go with encrypted, that's great - but, in doing so, is it possible to still use the copy function at the beginning or do I need to include all of the script to create the database tables and then populate them with everything? That will be quite a bit of script to handle the initial setup, and if the user was to abandon the app halfway through that population, it might mess things up.
    Any thoughts / best practice / recommendations are very appreciated. Thank you!

    I'll just post my own reply to this.
    What I ended up doing was creating a script that self-creates the database and then populates the tables (as unencrypted... the encryption portion is commented out until store publishing). It's a tremendous amount of code, completely repetitive with the exception of the values I'm entering, but you can't do an insert loop or a multi-row insert statement in AIR's SQLite, so the best move is to create everything line by line.
    This creates the database, and since it's not encrypted, it can be tested using Firefox's SQLite Manager or some other database program. Once you're ready for deployment to the app stores, you simply modify the above setup to use encryption instead of the unencrypted method used for testing.
    So far this has worked best for me. If anyone needs some example code, let me know and I can post it.
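    For anyone who wants to see the shape of that create-then-populate pattern outside of AIR, here is a minimal sketch in Java using the sqlite-jdbc driver. This is only an illustration under assumptions: the original project is ActionScript/AIR, and the database file, table names, and columns below are made up.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    // Sketch of a self-creating, line-by-line populated game database
    // (hypothetical schema; requires the sqlite-jdbc driver on the classpath).
    public class GameDbBootstrap {
        public static void main(String[] args) throws SQLException {
            try (Connection conn = DriverManager.getConnection("jdbc:sqlite:levels.db");
                 Statement st = conn.createStatement()) {
                // 1) Create the schema on first run.
                st.executeUpdate("CREATE TABLE IF NOT EXISTS levels ("
                        + "id INTEGER PRIMARY KEY, name TEXT, unlock_cost INTEGER)");
                st.executeUpdate("CREATE TABLE IF NOT EXISTS store_items ("
                        + "id INTEGER PRIMARY KEY, name TEXT, price INTEGER)");
                // 2) Populate it one INSERT at a time, mirroring the line-by-line
                //    approach described above.
                st.executeUpdate("INSERT OR IGNORE INTO levels VALUES (1, 'Tutorial', 0)");
                st.executeUpdate("INSERT OR IGNORE INTO levels VALUES (2, 'Forest', 100)");
                st.executeUpdate("INSERT OR IGNORE INTO store_items VALUES (1, 'Coin pack', 99)");
            }
        }
    }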

  • Best Practice Table Creation for Multiple Customers, Weekly/Monthly Sales Data in Multiple Fields

    We have a homegrown Access database, originally designed in 2000, that now has a SQL back end.  The database has not yet been converted to a newer format such as Access 2007, since at least 2 users are still on Access 2003.  It is fine if suggestions will only work with Access 2007 or higher.
    I'm trying to determine if our database is the best place to do this or if we should look at another solution.  We have thousands of products, each with a single identifier.  There are customers who provide us regular sales reporting for what was sold in a given time period -- weekly, monthly, quarterly and yearly time periods being most important.  This reporting may or may not include all of our product identifiers.  The reporting is typically based on calendar-defined timing, although we have some customers who have their own calendars which may not align to a calendar month or calendar year, so recording the time period can be helpful.
    Each customer's sales report can contain anything from 1,000-20,000 rows of products per report.  Each customer report is different, and they typically have between 4-30 columns of data for each product; headers are consistently named.  The product identifiers included may vary by customer and even within each report for a customer; the data in the product identifier rows changes each week.  Headers include a wide variety of data such as overall on hand, overall on order, unsellable on hand, returns, on-hand information for each location or customer grouping, sell-through units information for each location or customer grouping for that given time period, sell-through dollars information for each location or customer grouping for that given time period, sell-through units information for each location or customer grouping for a cumulative time period (same thing for dollars), warehouse on hands, warehouse on orders, the customer's unique categorization of our product in their system, the customer's current status code for that product, and so on.
    Currently all of this data is stored in a multitude of Excel spreadsheets (by customer, division and time period).  Due to the overall volume of information and number of Excel sheets, cross-referencing can take considerable time.  Is it possible to set up tables for our largest customers so I can create queries and pivot tables to more quickly look at sales-related information by category, by specific product(s), by partner, by specific products or categories across partners, by specific products or categories across specific weeks/months/years, etc.?  We do have a separate product table, so only the product identifier or a junction table may be needed to pull in additional information from the product table with queries.  We do need to maintain the sales reporting information indefinitely.
    I welcome any suggestions, best practice or resources (books, web, etc).
    Many thanks!

    Currently all of this data is stored in a multitude of Excel spreadsheets (by customer, division and time period).  Due to overall volume of information and number of Excel sheets, cross-referencing can take considerable time.  Is it possible to set-up tables .....
    I assume you want to migrate to SQL Server.
    Your best course of action is to hire a professional database designer for a short period like a month.
    Once you have the database, you need to hire a professional DBA to move your current data from Access & Excel into the new SQL Server database.
    Finally you have to hire an SSRS professional to design reports for your company.
    It is also beneficial if the above professionals train your staff while building the new RDBMS.
    Certain senior SQL Server professionals may be able to do all 3 functions in one person: db design, database administration/ETL & business intelligence development (reports).
    Kalman Toth Database & OLAP Architect

  • Best Practice to fetch SQL Server data and Insert into Oracle Tables

    Hello,
    I want to read SQL Server data every half an hour and write it into Oracle tables (in two different databases). What is the best practice for doing this?
    We do not have any database links from Oracle to SQL Server or vice versa.
    Any help is highly appreciated.
    Thanks

    Well, that's easy:
    use a TimerTask to do the following every half an hour:
    - open a connection to sql server
    - open two connections to the oracle databases
    - for each row you read from the sql server, do the inserts into the oracle databases
    - commit
    - close all connections
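    As a rough sketch of that recipe, here is what it could look like with java.util.Timer/TimerTask and plain JDBC. The connection URLs, credentials, table and column names are placeholders, the SQL Server and Oracle JDBC drivers are assumed to be on the classpath, and a real job would add batching, logging, and retry handling.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Timer;
    import java.util.TimerTask;

    // Half-hourly copy job following the steps listed above.
    // All URLs, credentials, and table/column names are placeholders.
    public class SqlServerToOracleCopy {

        private static final long HALF_HOUR_MS = 30L * 60 * 1000;

        public static void main(String[] args) {
            // A non-daemon Timer keeps the JVM alive and fires every 30 minutes.
            new Timer("copy-job").scheduleAtFixedRate(new TimerTask() {
                @Override
                public void run() {
                    copyOnce();
                }
            }, 0, HALF_HOUR_MS);
        }

        private static void copyOnce() {
            String insertSql = "INSERT INTO target_table (id, name) VALUES (?, ?)";
            // Open a connection to SQL Server and one to each Oracle database.
            try (Connection src = DriverManager.getConnection(
                         "jdbc:sqlserver://srchost;databaseName=srcdb", "user", "pwd");
                 Connection ora1 = DriverManager.getConnection(
                         "jdbc:oracle:thin:@orahost1:1521:db1", "user", "pwd");
                 Connection ora2 = DriverManager.getConnection(
                         "jdbc:oracle:thin:@orahost2:1521:db2", "user", "pwd");
                 Statement read = src.createStatement();
                 ResultSet rs = read.executeQuery("SELECT id, name FROM source_table");
                 PreparedStatement ins1 = ora1.prepareStatement(insertSql);
                 PreparedStatement ins2 = ora2.prepareStatement(insertSql)) {

                ora1.setAutoCommit(false);
                ora2.setAutoCommit(false);
                // For each row read from SQL Server, insert into both Oracle databases.
                while (rs.next()) {
                    ins1.setInt(1, rs.getInt("id"));
                    ins1.setString(2, rs.getString("name"));
                    ins1.executeUpdate();
                    ins2.setInt(1, rs.getInt("id"));
                    ins2.setString(2, rs.getString("name"));
                    ins2.executeUpdate();
                }
                // Commit on both targets; the try-with-resources block then
                // closes all connections.
                ora1.commit();
                ora2.commit();
            } catch (Exception e) {
                e.printStackTrace(); // a real job would log and retry
            }
        }
    }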

  • Best Practice for Image placement and Anchored Frames for use in Robohelp 9

    Hi,
    I'm looking for best practices on how to lay out my images in FrameMaker 10 so that they translate correctly to RoboHelp 9.  I currently have images inside anchored frames that "run into" the right side of my text. I've adjusted the size of the anchored frame so that my text flows correctly around the image. Everything looks good in FrameMaker! Yeah! The problem is that when I link my FrameMaker document to RoboHelp, the text does not flow around my image in the same manner. On a couple of RoboHelp screens the image is running into the footer. I'm wondering if I should be using tables in FrameMaker in order to get the page layout that I'm looking for. Also, I went back and forth... is this a FrameMaker question or a RoboHelp question? Any assistance would be greatly appreciated.

    I think Jeff is meaning this section of the RoboHelp forums:
    http://forums.adobe.com/community/robohelp/robohelp_framemaker
