What is the best approach to multi-platform development on Flash Pro CS6?

Hello everyone,
I have been writing Flash games for a while and now I would like to write games for browser, Android, and iOS.  I am very comfortable with AS3 (coming from Java), have read through books and tutorials, and have taken online training from Lynda.com, you name it.  I would like to hear from real developers some best-practice approaches to the 'write once, deploy anywhere' feature of Flash Professional.  Namely, I would like to know:
Should I use timeline code?
Should I include third-party engines like TweenLite?  For example, if I include the TweenLite com folder in my app or game, will that compile to native iOS code?
How should I approach sound clips and music?
These are the main things I would like to know.  Feel free to add any hands-on knowledge you would offer me and the other users of this forum.  I think this is a great discussion to have, one that would eliminate a lot of the 'fire extinguishing' that takes place on help forums.  I am running CS6 Web Premium on Windows 7.

Thanks for chiming in, AV
No doubt about it.  There are, however, a lot of caveats to using timeline code: always remembering to copy/paste into Notepad and save in case of a corrupted FLA, being limited to only one frame vs. the tentacles of a document class, weight (it seems all the code loads and runs without waiting to be called), and the list goes on and on.  You get the point. 
Timeline code is like a filing cabinet, as you put it, whereas the class approach is like books on a shelf.  I think Kglad's opinion was specific to the question I asked, bearing in mind that I was making games.  In the case of games, timeline code can become very expensive.  I try to reuse code wherever I can to keep things light.
My approach, so far, is still mostly class-based, but I almost always use timeline code for animations, varying from five to thirty lines of code on average, I would say.  Then I just call pre-animated symbols from my document class.  Seems to be working so far.
Thanks for sharing that timeline code is porting nicely into .ipa.  That is a huge reassurance!  That was actually my main concern.  Android is still using a virtual machine (so I am not too worried about that) but I heard that publishing to .ipa converts to native code.  Hats off to the team who pulled that off.
And like you said, there are definitely "a lot of ways to skin the cat."  Happy coding.

Similar Messages

  • What is THE best video file format to import into Premiere Pro CS6?

    What is THE best video file format to import into Premiere Pro CS6? I am tired of the guesswork of "this might work if you do this." Just tell me which is, flat out, the easiest for Premiere to read. I generally animate in Adobe Flash, and sometimes work with a live camera. Premiere seems to pick and choose which video file it wants. I import from the File drop-down menu in the upper left-hand corner, then move the video file to the timeline. Under the "scrubber" there is a thin red line (or, when rendered, a green one). The videos I put in the timeline show a thin yellow line, which means Premiere is confused. So, in basic terms, which video format is MOST commonly accepted, with the least amount of trouble/hassle?

    Premiere now has a function where you can import a clip and drag it onto a timeline, and it will prompt you with a message about changing the timeline settings to match your clip (or, if there is no timeline, it will automatically create one that matches your clip). This is in CC, of course, so you might consider upgrading. They have fixed a lot of compatibility issues from CS6 to now. If you don't want to get all of CC for 60/month, then you can at least shell out 20/month on Premiere. It's worth it. They've made it so you don't have to worry about 99% of all video formats. They just work.

  • What is the best approach to converting LV7.1 tags to LV2012 shared variables in multiple VIs?

    What is the best approach to upgrading from LV7.1/DSC tags to LV2012/DSC shared variables, in multiple VIs running on multiple platforms? Our system is composed of  about 5 PCs running Windows 2000/LV7.1 Runtime, plus a PLC, and a main controller running XP/SP3/LV2012. About 3 of the PCs publish sensor information via tags across the LAN to the main controller. Only the main controller is currently being upgraded. Rudimentary questions:
    1. Will the other PCs running the 7.1 RTE (with tags) be able to communicate with the main controller running 2012 (shared variables)?
    2. Is it necessary to convert from tags to shared variables, or will the deprecated legacy tag VIs from LV7.1 work in LV2012?
    3. Will all the main controller VIs need to be incorporated into a project in order to use shared variables?
    4. Is the only way to do this to find all tag items and replace them with shared variable items?
    Thanks in advance for any information and advice!
    lb

    Hi lb,
    We're glad to hear you're upgrading, but because there was a fundamental change in architecture since version 7.1, there will likely be some portions that require a rewrite. 
    The RTE needs to match the version of DSC you're using.  Also, the tag architecture used in 7.1 is not compatible with the shared variable approach used in 2012.  Please see the KnowledgeBase article Do I Need to Upgrade My DSC Runtime Version After Upgrading the LabVIEW DSC Module?
    You will also need to convert from tags to shared variables.  The change from tags to shared variables took place in the transition to LabVIEW 8.  The KnowledgeBase article Migrating from LabVIEW DSC 7.1 to 8.0 gives the process for changing from tags to shared variables. 
    Hope this gets you headed in the right direction.  Let us know if you have more questions.
    Thanks,
    Dave C.
    Applications Engineer
    National Instruments

  • What's the best approach for handling about 1300 connections in Oracle?

    What's the best approach for handling about 1300 connections in Oracle 9i/10g through a Java application?
    1. Using separate schemas for the various types of users. (We can store only the relevant data in a particular schema, and the number of records per table can be reduced by replicating tables, but we would have to maintain all the data in another schema as well. We would then need to update two schemas in a given session, because we would keep a separate schema for one user and another schema for all the data, and that may cause update problems.)
    OR
    2. Using a single schema for all users.
    Note: All users may access the same tables, and there may be many more records than in the previous case.
    Which is the best case?
    Please give your valuable ideas.

    It is true, but I want a solution from you all.
    That's like saying: "I want you to tell me how to fix my friend's car."
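    For what it's worth, option 2 (a single schema shared by all users) is usually made concrete by tagging the shared tables with a user key; a minimal sketch, with invented table and column names, just to illustrate the idea:
    -- option 2: one schema, shared tables, each row tagged with its owning user
    create table orders
    ( order_id number primary key
    , user_id  number not null     -- which user owns the row
    , details  varchar2(200)
    );
    -- an index on the user key keeps per-user queries fast despite the larger row counts
    create index orders_user_idx on orders (user_id);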

  • What are the best approaches for mapping re-start in OWB?

    What are the best approaches for mapping re-start in OWB?
    We are using OWB repository 10.2.0.1.0 and OWB client 10.2.0.1.31. The Oracle version is 10 G (10.2.0.3.0). OWB is installed on Linux.
    We have a number of mappings, and we have built process flows for them as well.
    I would like to know the best approaches to incorporate re-start options in our process, i.e. handling the failure of a mapping in a process flow.
    How do we re-cycle failed rows?
    Are there any built-in features/best approaches in OWB to implement the above?
    Do the runtime audit tables help us to build a re-start process?
    If not, do we need to maintain our own (custom) tables to hold such data?
    How have our forum members handled the above situations?
    Any ideas?
    Thanks in advance.
    RI

    Hi RI,
    "How many mappings (range) do you have in a process flow?"
    Several hundreds (100-300 mappings).
    "If we have three mappings (e.g. m1, m2, m3) in a process flow, what will happen if m2 fails?"
    Suppose the mappings are connected sequentially (m1 -> m2 -> m3). When m2 fails, the process flow is suspended (the transition to m3 will not be performed). You should remove the cause of the error (modify the mapping and redeploy, correct the data, etc.) and then repeat the execution of mapping m2 from the Workflow Monitor - open the diagram with the process flow, select mapping m2, click the Expedite button, and choose the option Repeat.
    "On re-start, will it run m1 again and then m2 and so on, or will it re-start at row 1 of m2?"
    You can specify the restart point. "At row 1 of m2" - I don't understand what you mean. All mappings run in set-based mode, so in case of error all table updates are rolled back (there are several exceptions - for example, multiple target tables in a mapping without correlated commit, or an error in post-mapping - so you must carefully analyze the results of the error).
    "What will happen if m3 fails?"
    The process is suspended and you can restart execution from m3.
    "By running without failover and with max. number of errors = 0, you achieve recycling of failed rows down to zero (0)."
    These settings guarantee only two possible results of a mapping - SUCCESS or ERROR.
    "What is the impact if we have a large volume of data?"
    In my opinion, for large volumes set-based mode is the preferred processing mode. With this mode you have the full range of enterprise features of the Oracle database - parallel query, parallel DML, nologging, etc.
    Oleg

  • Newbie: What is the best approach to integrate BO Enterprise into web app

    Hi
    1. I am very new to Business Objects and .NET. I need to know what's the best approach when integrating BO into my web app, i.e. which SDK do I use?
    For now I want to provide very basic viewing functionality for the following reports:
    -> Crystal Reports
    -> Web Intelligence Reports
    -> PDF Reports
    2. Where do I find a standalone install for the Business Objects Enterprise XI .NET providers? I only managed to find the wssdk, but I can't find the others. Business Objects Enterprise XI does not want to install on my machine (development) - it installed fine on the server - so I was hoping I could find a standalone install.

    To answer question one: you can use the Enterprise .NET SDK for each, though for viewing Webi documents it is much easier to use the openDocument method of URL reporting.
    The Crystal Reports and PDF instances can be viewed easily using the SDK.
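    For reference, an openDocument link for a Webi document typically looks something like the line below; treat the exact path as an assumption, since it differs between the Java and .NET deployments of XI:
    http://<server>:<port>/OpenDocument/opendoc/openDocument.jsp?sIDType=CUID&iDocID=<document CUID>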
    Here is a link to the Developer Library:
    [http://devlibrary.businessobjects.com/]
    VB.NET XI Samples:
    [http://support.businessobjects.com/communityCS/FilesAndUpdates/bexi_vbnet_samples.zip.asp]
    C# XI Samples:
    [http://support.businessobjects.com/communityCS/FilesAndUpdates/bexi_csharp_samples.zip.asp]
    Other samples:
    [https://boc.sdn.sap.com/codesamples]
    I answered the provider question on your other thread.
    Good luck!
    Jason

  • What's the best approach to work with Excel, CSV files

    Hi gurus, I have a question for you. In your experience, what's the best approach to work with Excel or CSV files that have to be uploaded through DataServices to your datawarehouse?
    Let's say your end user, who is not a programmer, creates a group of 4 Excel files with different calculations on a monthly basis, so that a set of reports can be generated from the datawarehouse once the files have been uploaded to tables in your DWH. The calculations vary from month to month. The user doesn't have a front-end to upload the Excel files directly to DataServices. The end user also needs to keep track of which person uploaded the files for a given month.
    1. The end user should place their 4 Excel files in a shared directory that is visible to DataServices.
    2. DataServices will execute a scheduled job that reads the four files and uploads them to the datawarehouse at a determined time, let's say at 9:00 pm.
    It makes me wonder... what happens if the user needs to present their reports immediately and can't wait until 9:00 pm? Is it possible for the end user to execute some kind of action (outside the DataServices environment) so DataServices "could know" that it has to process those files right now, instead of waiting for the night schedule?
    Is there a way for DS to track who uploaded those files?
    Would it be better to build a front-end for the end user so they can upload their four files directly to the datawarehouse?
    Waiting for your comments to resolve this dilemma
    Best Regards
    Erika

    Hi,
    There are functions in DS that capture input files automatically. You could use the file_exists() or wait_for_file() function to do that. Schedule the job to run every few minutes and, if the file exists, run it. This could be done by using a certain file name with a date and timestamp etc., or by moving the old files to an archive after each run and letting DS wait for new files to show up.
    Check this - Selective Reading and Postprocessing - Enterprise Information Management - SCN Wiki
    Hope this helps.
    Arun

  • What is the best approach to process data on a row-by-row basis?

    Hi Gurus,
    I need to code a stored proc to process sales_orders into invoices. I think I must do a row-by-row operation, but if possible I don't want to use a cursor. The algorithm is below:
    for all sales_orders with status = "open"
        check credit limit
        if over credit limit -> insert row into log_table; process next order
        check for overdue invoices
        if there is an overdue invoice -> insert row into log_table; process next order
        check all order_items for stock availability
        if an item does not have enough stock -> insert row into log_table; process next order
        if all checks above are passed:
            create invoice (header + details)
    end for
    What is the best approach to process data on a row-by-row basis like the above?
    Thank you for your help,
    xtanto

    Processing data row by row is not the fastest method out there. You'll be sending many more SQL statements to the database than needed. The advice is to use SQL, and if that is not possible or too complex, to use PL/SQL with bulk processing.
    In this case a SQL only solution is possible.
    The example below is oversimplified, but it shows the idea:
    SQL> create table sales_orders
      2  as
      3  select 1 no, 'O' status, 'Y' ind_over_credit_limit, 'N' ind_overdue, 'N' ind_stock_not_available from dual union all
      4  select 2, 'O', 'N', 'N', 'N' from dual union all
      5  select 3, 'O', 'N', 'Y', 'Y' from dual union all
      6  select 4, 'O', 'N', 'Y', 'N' from dual union all
      7  select 5, 'O', 'N', 'N', 'Y' from dual
      8  /
    Table created.
    SQL> create table log_table
      2  ( sales_order_no number
      3  , message        varchar2(100)
      4  )
      5  /
    Table created.
    SQL> create table invoices
      2  ( sales_order_no number
      3  )
      4  /
    Table created.
    SQL> select * from sales_orders
      2  /
            NO STATUS IND_OVER_CREDIT_LIMIT IND_OVERDUE IND_STOCK_NOT_AVAILABLE
             1 O      Y                     N           N
             2 O      N                     N           N
             3 O      N                     Y           Y
             4 O      N                     Y           N
             5 O      N                     N           Y
    5 rows selected.
    SQL> insert
      2    when ind_over_credit_limit = 'Y' then
      3         into log_table (sales_order_no,message) values (no,'Over credit limit')
      4    when ind_overdue = 'Y' and ind_over_credit_limit = 'N' then
      5         into log_table (sales_order_no,message) values (no,'Overdue')
      6    when ind_stock_not_available = 'Y' and ind_overdue = 'N' and ind_over_credit_limit = 'N' then
      7         into log_table (sales_order_no,message) values (no,'Stock not available')
      8    else
      9         into invoices (sales_order_no) values (no)
    10  select * from sales_orders where status = 'O'
    11  /
    5 rows created.
    SQL> select * from invoices
      2  /
    SALES_ORDER_NO
                 2
    1 row selected.
    SQL> select * from log_table
      2  /
    SALES_ORDER_NO MESSAGE
                 1 Over credit limit
                 3 Overdue
                 4 Overdue
                 5 Stock not available
    4 rows selected.
    Hope this helps.
    Regards,
    Rob.
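    If a single SQL statement ever becomes too complex, the "PL/SQL with bulk processing" route mentioned above could look roughly as follows; this is only a sketch against the same oversimplified tables, not part of Rob's solution:
    declare
      -- select the orders that pass all three checks in one scan
      cursor c_orders is
        select no
        from   sales_orders
        where  status = 'O'
        and    ind_over_credit_limit = 'N'
        and    ind_overdue = 'N'
        and    ind_stock_not_available = 'N';
      type t_no_tab is table of sales_orders.no%type;
      l_nos t_no_tab;
    begin
      open c_orders;
      loop
        -- fetch and insert in batches of 1000 rows instead of row by row
        fetch c_orders bulk collect into l_nos limit 1000;
        exit when l_nos.count = 0;
        -- save exceptions lets the batch continue past bad rows; if any fail,
        -- ORA-24381 is raised and sql%bulk_exceptions lists them (handle it in real code)
        forall i in 1 .. l_nos.count save exceptions
          insert into invoices (sales_order_no) values (l_nos(i));
      end loop;
      close c_orders;
      commit;
    end;
    /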

  • What is the best approach to install BI statistics in SAP BI?

    Hello All,
    What is the best approach to install BI statistics in SAP BI?
    By collecting objects in the standard BI Content (the 0TCT* objects), or
    by executing some standard tcodes?
    Regards,
    Siva

    The best approach depends on the version of your BW system; follow the installation steps in the notes below:
    BW 3.x:
    309955 - BW statistics - Questions, answers and errors
    BW 7.x
    934848 - Collective note: (FAQ) BI Administration Cockpit
    Cheers,
    m./

  • What's the best approach/program for finding and eliminating duplicate photos on my hard drive?

    What's the best approach/program for finding and eliminating duplicate photos on my hard drive? I have a "somewhat" older version of iPhoto (5.0.4), and it doesn't seem to offer anything like that except during the importing phase of syncing my phone...

    I wonder, is there room to transfer them to your phone, and then back, so that the import can filter them?

  • What is the best approach to capture TBOMs for an SAP SRM system/functionality?

    Hello SCN Community,
    It would be much appreciated if somebody could share some information about the following....
    What is the best approach to create TBOMs for an SAP SRM system? The SRM functionality basically consists of multiple ABAP Web Dynpros that are connected into a process via an SAP Portal (as I understand it). The entry point to the SRM functionality is via the SAP Portal.
    Do I first have to create a link to the Portal via an SAP Web Application link in SOLAR01 and then start recording? Will it record only the portal objects or also the ABAP Web Dynpro objects?
    Or do I have to list all the separate ABAP Web Dynpros in SOLAR01 and use those as a starting point?
    I am myself more familiar with the more classical SAP ABAP ECC systems and transactions.  I could hardly find any information on the use of BPCA and the required TBOMs in the area of SRM... Any help would be much appreciated!
    Kind Regards,
    Guido Jacobs

    Hi Guido,
    a new blog was released today, maybe this helps:
    BPCA - Powerful Risk Eliminator
    Best Regards,
    Christoph

  • What is the best approach to setup intranet and internet sites in SharePoint 2013?

    I am planning to set up an internet and an intranet website for one of our clients.  What is the best approach to set up this kind of environment?
    Some of the users (registered users) from the internet should be able to access information on the intranet site.  I have created two web applications, one for the intranet and one for the internet.  Is that the right way to go forward?
    Thanks in advance! :)
    LM

    Hi Laemon,
    Creating two separate web applications, one for the Internet site and the other for the intranet, is the right thing to do.
    1. Properly planning the creation of your web application, site collection, and website is of the utmost importance to ensure you build your site in a professional and recommended way. Go through this article from TechNet, which will help you plan your site in SharePoint 2013:
    https://technet.microsoft.com/en-us/library/cc263267.aspx
    2. Planning and choosing the right authentication type is also a very important decision. I recommend you go through the article below if you have not already done so.
    Plan for user authentication methods in SharePoint 2013
    3. Plan for licensing for your SharePoint 2013 Internet Facing Website.
    Licensing Internet Sites Built on SharePoint 2013
    SharePoint 2013 licensing for Internet facing sites
    4. To grant registered users access to the intranet site (as you mentioned in your question): if you created both web applications in the same farm (same domain), then granting access is easy using site permissions, with Windows authentication enabled for both web applications. If the web applications are created on different domains and there is a two-way trust in place, and the SharePoint servers have the necessary port access to the remote domain's domain controller, then it is automatic. If it is a one-way trust, then you need to follow these directions:
    http://technet.microsoft.com/en-us/library/cc263460(v=office.12).aspx
    If there is no domain trust in place, then you either need to create one or look at alternative technologies, such as ADFS.
    Please remember to upvote if it helps you, or click 'Mark as Answer' if the reply answers your query.

  • What is the best approach to insert millions of records?

    Hi,
    What is the best approach to insert millions of records into a table?
    If an error occurs while inserting, how do we know which record failed?
    Thanks & Regards,
    Sunita

    Hello 942793
    There isn't a "best approach" if you do not provide us with the requirements and the environment...
    It depends on what "best" means for you.
    Questions:
    1.) Can you disable the constraints / unique indexes on the table?
    2.) Is there a possibility to run parallel queries?
    3.) Do you need to know which rows could not be inserted when the constraints are enabled, or is that not necessary?
    4.) Do you need it fast, or do you have time to do it?
    What does "best approach" mean for you?
    Regards,
    David
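    On question 3: if you do need to know which rows failed, Oracle's DML error logging (available in 10g Release 2 and later) is one standard option. A minimal sketch, with invented table names:
    -- one-time setup: creates the log table err$_target_table by default
    execute dbms_errlog.create_error_log('TARGET_TABLE');
    -- failed rows are diverted to the error log instead of aborting the whole insert
    insert into target_table
    select * from staging_table
    log errors into err$_target_table ('batch_1') reject limit unlimited;
    commit;
    -- afterwards, see exactly which rows failed and why
    select ora_err_number$, ora_err_mesg$ from err$_target_table;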

  • What is the best approach to take a daily backup of an application from a CQ5 server?

    Hello,
    How do we maintain a daily backup of the data from the server?
    What is the best approach?
    Regards,
    Satish Sapate.

    The link Ryan shared should give enough information. 
    In case you are backing up a large repository, note that the Data Store holds large binaries, which are stored only once. To reduce the backup time, remove the datastore from the backup by following [1] (CQ 5.3 example):
    [1] In order to remove the datastore from the backup you will need to do the following:
    Assuming your repository is under /website/crx/repository and you want to move your datastore to /website/crx/datastore
        stop the crx instance
        mv /website/crx/repository/shared/repository/datastore /website/crx/
        Then modify repository.xml by adding the new path configuration to the DataStore element.
    Before:
    <DataStore class="org.apache.jackrabbit.core.data.FileDataStore">
    <param name="minRecordLength" value="4096"/>
    </DataStore>
    After:
    <DataStore class="org.apache.jackrabbit.core.data.FileDataStore">
    <param name="path" value="/website/crx/datastore"/>
    <param name="minRecordLength" value="4096"/>
    </DataStore>
    After doing this then you can safely run separate backups on the datastore while the system is running without affecting performance very much.
    Following our example, you could use rsync to back up the datastore:
    rsync -av --ignore-existing /website/crx/datastore /website/backup/datastore

  • What is the best approach to generate control numbers from bpel?

    1. If we want to control ISA/GS/ST control numbers from BPEL, what is the best approach to do that?
    2. How do we generate these control numbers, and where do we store them to get a sequence out of them?
    Thanks,
    Kathar

    Internally, Oracle B2B uses a DB sequence for generating the control numbers. It is the best approach, but at the same time it is not very straightforward, especially in the case of a clustered database. So you may carefully implement the same with BPEL.
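    To make the sequence approach concrete, here is a minimal sketch; the sequence name, and calling it from BPEL (e.g. through a database adapter), are assumptions for illustration, not B2B's internal objects:
    -- one sequence per control-number type (the ISA interchange control number, ISA13, is a 9-digit field)
    create sequence isa_control_no_seq
      start with 1
      maxvalue 999999999
      cycle     -- wrap around once the range is exhausted
      cache 20; -- consider ORDER if strict ordering across RAC nodes matters
    -- each outbound interchange asks the database for the next number
    select isa_control_no_seq.nextval from dual;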
    Hi Anuj,
    If we let B2B generate the control numbers in a clustered environment, are there any settings we have to do?
    "So you may carefully implement the same with BPEL. BTW, what is the use case behind this?"
    We were thinking about using this to send out duplicate messages to two TPs, but we decided to go with a Java callout, as you suggested in another thread.
    Thanks!
    Kathar
