Best practices for audio files in your app

Hello All,
I was just curious what everyone is doing to keep audio quality up in their app while keeping the file size down. Here are a few questions:
Is it better to compress the files, like MP3s, or leave them uncompressed as .wav? Is .caf really better than .wav?
What have you found to be the best "sweet spot" for sample rate, etc., for good quality and lower file size in an iPhone app?
What programs can you recommend for the final conversions needed to keep the file size low while the file still works on the iPhone? As an example, when I make a .wav file by bouncing a mix from my PC-based Pro Tools, it creates a .wav file that for some reason throws an error when used in an app; but if I take that same .wav file, open it in freeware like Audacity, and save it as a .wav again, it reduces the file size and "fixes" whatever the issue was.
Any help you can offer would be very much appreciated. Thank you.

Well, I'm not sure any of the benefits of .caf will be apparent to your app, given that you are playing rather than recording and you are using PCM. I think you can go directly from .wav to .caf with /Developer/Examples/CoreAudio/SimpleSDK/ConvertFile, but I'm not sure. However, if you can produce compressed audio you will save a lot of space, and you can experiment with sample rates to find a good size/quality tradeoff. I do not know what tools there are to record compressed audio...
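If you are on a Mac, one hedged option for the final conversion step is the afconvert command-line tool that ships with the developer tools; the file names, the 22.05 kHz rate, and the bit rate below are just illustrative values, not something recommended in this thread:
    afconvert -f caff -d ima4 input.wav output.caf            # IMA4-compressed CAF, a common choice for iPhone sound effects
    afconvert -f caff -d LEI16@22050 input.wav output.caf     # uncompressed 16-bit PCM CAF, downsampled to 22.05 kHz
    afconvert -f m4af -d aac -b 128000 input.wav output.m4a   # AAC at roughly 128 kbps, better suited to longer music
As I understand the hardware of that era, the device's hardware-assisted decoder handles only one compressed stream at a time, so short, frequently triggered effects are often left as PCM or IMA4 while longer background music gets the heavier compression.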

Similar Messages

  • What are best practices for packaging and deploying J2EE apps to iAS?

    We've been running a set of J2EE applications on a pair of iAS SP1b for about a year and it has been quite stable.
    Recently, however, we have had a number of LDAP issues, particularly when registering and unregistering applications (registering EAR files sometimes fails the 1st time but may work the 2nd time). We've also noticed, very occasionally, that old versions of classes find their way onto our machines.
    What is considered to be best practice in terms of packaging and deployment, specifically:
    1) Packaging - using the deployTool that comes with iAS6 SP1b to package is a big manual task, especially when you have 200+ jsp files. Are people out there using this or are they scripting it with a build tool such as Ant?
    2) Deploying an existing application to multiple iAS instances. Are you unregistering the old application and then re-registering the new one? Are you shutting down iAS while doing the deployment?
    3) Deploying ear files can take 5 to 10 mins, is this normal?
    4) In a clustered scenario where HTTPSession is shared what are the consequences of doing deployments to data stored in session?
    Thanks in advance for your replies
    Owen

    You may want to consider upgrading your application server environment to a newer service pack. There are numerous enhancements involving the deployment tool and the runtime layout of your application that make it clear where your application is loading its files from.
    If you have a long-running application server environment, with lots of deployments under your belt, you might start to notice slowdowns in deployment and KJS start time. Generally this is due to garbage accumulating in your iAS registry.
    You can do several things to resolve this. The most complete solution is to reinstall the application server. This will guarantee a clean LDAP registry. Of course, you've got to re-establish your configurations and redeploy your applications. When done, back up your application server install space with the application server and directory server off. You can use this backup to return to a known configuration at some future time.
    For the second method: BE CAREFUL - BACKUP FIRST
    There is a more exhaustive solution that involves examining your deployed components to determine the active GUIDs. You then search the NameTrans section of the registry for Applogic Servlet * and Bean * entries that represent previously deployed components but are not represented in the set of active GUIDs. Record these older GUIDs and remove them from ClassImp and ClassDef. Finally, remove the older entries from NameTrans.
    Best practices for deployment depend on your particular environmental needs. Many people utilize ANT as a build tool. In later versions of the application server, complete ANT scripts are included that address compiling, assembly and deployment. Ant 1.4 includes iAS specific targets and general J2EE targets. There are iAS specific targets that can be utilized with the 1.3 version. Specialized build targets are not required however to deploy to iAS.
    Newer versions of the deployment tool allow you to specify that JSPs are not to be registered automatically. This can be significant if deployment times lag. Registered JSP's however benefit more fully from the services that iAS offers.
    2) In general it is better to undeploy then redeploy. However, if you know that you're not changing GUIDs, recreating an existing application with new GUIDs, or removing registered components, you may avoid the undeploy phase.
    If you shut down the KJS processes during deployment you can eliminate some additional workload on the LDAP server, which really gets pounded during deployment. This is because the KJS processes detect changes and do registry loads to repopulate their caches. This can happen many times during a deployment and does not provide any benefit.
    3) Deploying can be a lengthy process. There have been performance improvements from service pack to service pack, but unfortunately you won't see dramatic drops in deployment times.
    One thing you can do to reduce deployment times is to understand the type of deployment. If you have not manipulated your deployment descriptors in any way, then there is no need to deploy. Simply drop your newer bits into the runtime space of the application server. In later service packs this means exploding the package (EAR, WAR, or JAR) into the appropriate subdirectory of the APPS directory (see the sketch after this list).
    4) If you've changed the classes of objects that have been placed in HTTPSession, you may find that you can no longer utilize those objects. For that reason, it is suggested that objects placed in session be kept as simple as possible in order to minimize this effect. In general, however, it is not a good idea to change a web application during the life span of a session.
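    Here is a hedged sketch of the "drop the bits in place" approach from point 3; the install path and application name are placeholders I made up, so adjust them to your own layout:
        cd /opt/ias6/ias/APPS/myWebApp      # placeholder: the APPS subdirectory for this application
        jar -xvf /tmp/myWebApp.war          # explode the rebuilt package over the runtime copy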

  • Best practices for applying sharpening in your workflow

    Recently I have been trying to get a better understanding of some of the best practices for sharpening in a workflow.  I guess I didn't realize it but there are multiple places to apply sharpening.  Which are best?  Are they additive?
    My typical workflow involves capturing an image with a professional digital SLR in either RAW or JPEG or both, importing into Lightroom and exporting to a JPEG file for screen or printing both lab and local. 
    There are three places in this workflow to add sharpening: in the SLR, manually in Lightroom, and during export to a JPEG file or when printing directly from Lightroom.
    It is my understanding that sharpening is not added to RAW images even if you have enabled sharpening in your SLR.  However, sharpening will be added to JPEGs by the camera.
    Back to my question: is it best to add sharpening in the SLR, manually in Lightroom, or to wait until you export or output to your final JPEG file or printer?  And are the effects additive?  If I add sharpening in all three places, am I probably over-sharpening?

    You should treat the two file types differently. RAW data never has any sharpening applied by the camera; only JPEGs do. Sharpening is often treated as a workflow with three steps (see here for a foundational article about this idea).
    I. A capture sharpening step that corrects for the loss of sharp detail due to the Bayer array and the antialias filter and sometimes the lens or diffraction.
    II. A creative sharpening step where certain details in the image are "highlighted" by sharpening (think eyelashes on a model's face), and
    III. output sharpening, where you correct for loss of sharpness due to scaling/resampling or for the properties of the output medium (like blurring due to the way a printing process works, or blurring due to the way an LCD screen lays out its pixels).
    All three of these are implemented in Lightroom. I. and III. are essential and should basically always be performed; II. is up to your creative spirits. I. is the sharpening you see in the Develop panel. You should zoom in at 1:1 and optimize the parameters. The default parameters are OK but fairly conservative. Usually you can increase the mask value a little so that you're not sharpening noise, and play with the other three sliders. Jeff Schewe gives an overview of a simple strategy for finding optimal parameters here. This is for ACR, but the principle is the same. Most photos will benefit from a little optimization. Don't overdo it; just correct for the softness at 1:1.
    Step II, as I said, is not essential, but it can be done using the local adjustment brush, or you can go to Photoshop for this. Step III, however, is very essential. This is done in the Export panel, the Print panel, or the Web panel. You cannot really preview these things (especially the print-directed sharpening), and it will take a little experimentation to see what you like.
    For jpeg, the sharpening is already done in the camera. You might add a little extra capture sharpening in some cases, or simply lower the sharpening in camera and then have more control in post, but usually it is best to leave it alone. Step II and III, however, are still necessary.

  • Best Practices for Exporting Files??

    I'm new to Premiere (coming from FCP).  I used Premiere months ago to compress some ProRes files to h.264 files for the web.  I sent the files through Media Encoder and everything seemed fine.  However, I realized after several weeks that the audio in all of the files was a few frames out of sync.  Having not been a Premiere user at the time I did not do much research and decided to just use MPEG Streamclip from then on.
    Now that I'm learning how to use Premiere, I looked up the issue on the forums and found that many people have had similar issues with the audio being out of sync after exporting. However, there are tons of different scenarios in which it seems to be occurring.  The one common variable that I've noticed (among many of the threads, but not all) is that many of the people are exporting to a QuickTime format.
    While I don't remember all the details of my export and sequence settings from my issue months ago (so I don't want to address that specific case), I am curious what some "best practices" are when exporting from Premiere Pro. Is there any advantage/disadvantage to using AME rather than exporting directly from Premiere Pro? In general, I will just be exporting H.264 files for the web, MPEG-2 for DVD, and ProRes 422 for After Effects (or sometimes to bring into MPEG Streamclip).
    I shoot almost entirely in AVCHD, and usually at 1080p 30fps.  I'm running CS5 on a Macbook Pro 15" 2.0 Quad Core i7 8GB RAM.
    While the question may seem broad, my main concern that I want to avoid is having the audio out of sync.  But also I just want to know of any important details to keep in mind to prevent other issues.
    Thanks,
    Mike

    > I'm running CS5...
    What specific version? We're up to 5.0.4 now.
    There have been bug fixes for audio/video synch in the updates. One of the fixes was for a bug in the conforming of audio and indexing of MPEG files, so you need to delete your media cache files and let Premiere Pro create new ones for this fix to take effect.
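    If it helps, the media cache normally lives in a user-level Adobe folder; these locations are from memory rather than from this thread, so verify them on your system before deleting anything:
        ~/Library/Application Support/Adobe/Common/Media Cache Files/
        ~/Library/Application Support/Adobe/Common/Media Cache/
    Quit Premiere Pro first, delete (or rename) those folders, and the cache will be rebuilt on the next launch; the Clean button under Preferences > Media does roughly the same job for the cache database.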

  • EP Upgrade - SP14 - Best Practice for Modification File Comparison

    SDN  Experts -
    We are upgrading our EP from SP14 to SP16.  SAP offers a file "diff" tool to assist in re-applying our mods on top of the new code stack, but it is only useful for Java application files.
    We are looking for best practices in Portal upgrades to do the following:
    - Identify all files that we have modified on existing SP
    - Diff all source code files (java, XML, GUI, other) between Current SP14 and SP16
    We are also looking for documentation that identifies the local directory structure for NWDS.  This would aid us in creating a batch process to "diff" our source code libraries.
    Any recommendations are appreciated.
    Thanks

    I'm not really getting your question, because you already state what to do:
    We are looking for best practices in Portal upgrades to do the following:
    Identify all files that we have modified on existing SP
    Diff all source code files (java, XML, GUI, other) between Current SP14 and SP16
    I guess you should know from the documentation what has changed? Then start diffing the code and recompile or repackage. NWDS also has diff functionality; a plain recursive diff works too (see below).
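    As a minimal sketch of the batch approach, assuming you have both source trees checked out locally (the directory names here are placeholders):
        diff -ru ep_sp14_src/ ep_sp16_src/ > sp14_to_sp16.diff   # recursive, unified diff of the two trees
        grep '^Only in' sp14_to_sp16.diff                        # files that exist in only one of the two SPs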
    Good luck,
    Benjamin

  • Best Practice for Flat File Data Uploaded by Users

    Hi,
    I have the following scenario:
    1.     Users would like to upload data from flat file and subsequently view their reports.
    2.     SAP BW support team would not be involved in data upload process.
    3.     Users would not go to RSA1 and use InfoPackages & DTPs. Hence, another mechanism for data upload is required.
    4.     Users consist of two groups, external and internal. External users would not have access to the SAP system; however, access via a portal is acceptable.
    What best practices should we adopt for this scenario?
    Thanks!

    Hi,
    I can share what we do in our project.
    We get the files from the web onto the application server, into a path dedicated to this process. The file placed on the server has a naming convention based on your project; you can name it. Every day a file with the same name is placed on the server with different data. The path in the InfoPackage is fixed to that location on the server. After this, the process chain triggers and loads the data from that particular path, which is fixed on the application server. After the load completes, a copy of the file is taken as a backup and then deleted from that path.
    So this happens everyday.
    Rgds
    SVU123
    Edited by: svu123 on Mar 25, 2011 5:46 AM

  • What are some of your best practices for using videos in your marketing campaigns?

    As video becomes more and more popular, what are some best practices you have for effectively using video in your marketing campaigns? Any cool examples?

    Just saw this topic.
    We use video a great deal in our campaigns.  One of our best ones, though, was for an offering that the written word (even with graphics) couldn't convey as quickly, succinctly, or visually as a short commercial could.  We therefore created a simple video with professional VO work and blasted it out via email and social networks.  The video clearly communicated our product and its value to our prospects.
    In return we got an enormous response of form submits from individuals wanting to download the product's overview documentation.  It is still one of our most watched videos.
    We are now looking at adding more value-add videos to go along with the current 'product information' stable to help generate more soft-sell opportunities.
    We are firm believers in using video.  You can't just put anything out there and expect it to do well simply because it is a video.  You must plan it carefully and think the messaging and offered value through just as much as for any other outbound offering.  Time is still valuable, and your prospects are always asking "what's in it for me?"  Once they get something that they believe wastes their time, they will be more skeptical in the future about your videos or any other messaging.
    Keeping trust and relevance is essential to generating value with video.

  • Best practice for making changes to Oracle apps business views and BAs/folders

    HI
    The Oracle BI solution comes with pre-defined Business Views (database views) as well as Business Areas and folders. If we want to customize those database views or the BAs and folders, what is the best practice to avoid losing the changes during any upgrade?
    For example, the out-of-the-box Oracle Order Management BA that we are using heavily needs some additional fields added to the Order Header and Order Lines folders, and we also want to add some custom folders to this BA.
    If we make the changes to the database views behind this BA, will they be lost during the upgrade, or do we have to copy (duplicate) those views, update them, and create a custom BA and folders against those views?
    Thanks

    Hi,
    If you are adding new folders then just add them to the Oracle Business Area. The business area is just a collection of folders. If the business area was changed in an upgrade the new folder would not be deleted.
    If you want to add fields to the existing folders/views then you have 2 options. Add the field to the defining base view (these are the views beginning OEBV and OEFV) and then regenerate the business views. This may be overwritten if the view is upgraded, but that is unlikely.
    Alternatively, copy the view to create a new version and then map the old folder to the new view and refresh. You may need to re-map the folder if the folder is upgraded, but at least you have a single folder used by both Oracle and custom reports.
    Rod West

  • Best Practice For Secure File Sharing?

    I'm a newbie to both OS X Server and file sharing protocols, so please excuse my ignorance...
    My client would like to share folders in the most secure way possible. I was thinking the best way might be for them to VPN into the server and then view the files through the VPN tunnel; my only issue with this is that I have no idea how to open up File Sharing to ONLY allow users who are connecting from the VPN (i.e. from inside the internal network)... I don't see any options in Server Admin to restrict users in that way....
    I'm not afraid of the command line, FYI, I just don't know if this is:
    1. Possible!
    And 2. The best way to ensure secure AND encrypted file sharing via the server...
    Thanks for any suggestions!

    my only issue with this is that I have no idea how to open up File Sharing to ONLY allow users who are connecting from the VPN
    Simple - don't expose your server to the outside world.
    As long as you're running on a NAT network behind some firewall or router that's filtering traffic, no external traffic can get to your server unless you setup port forwarding - this is the method used to run, say, a public web server where you tell the router/firewall to allow incoming traffic on port 80 to get to your server.
    If you don't setup any port forwarding, no external traffic can get in.
    There are additional steps you can take - such as running the software firewall built into Mac OS X to tell it to only accept network connections from the local network, but that's not necessary in most cases.
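    If you do want that extra layer, the old ipfw firewall can express it. This is only a hedged sketch; the subnet is an assumption about your LAN, and 548 is the AFP file sharing port:
        sudo ipfw add 1000 allow tcp from 192.168.1.0/24 to any dst-port 548 in   # allow AFP from the local subnet
        sudo ipfw add 1100 deny tcp from any to any dst-port 548 in               # drop AFP from everywhere else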
    And 2. The best way to ensure secure AND encrypted file sharing via the server...
    VPN should take care of most of your concerns - at least as far as the file server is concerned. I'd be more worried about what happens to the files once they leave the network - for example have you ensured that the remote user's local system is sufficiently secured so that no one can get the documents off his machine once they're downloaded?

  • Best practices for ZFS file systems when using live upgrade?

    I would like feedback on how to lay out the ZFS file systems to deal with files that are constantly changing during the Live Upgrade process. For the rest of this post, let's assume I am building a very active FreeRadius server with log files that are constantly updating and must be preserved in any boot environment during the LU process.
    Here is the ZFS layout I have come up with (swap, home, etc omitted):
    NAME                                USED  AVAIL  REFER  MOUNTPOINT
    rpool                              11.0G  52.0G    94K  /rpool
    rpool/ROOT                         4.80G  52.0G    18K  legacy
    rpool/ROOT/boot1                   4.80G  52.0G  4.28G  /
    rpool/ROOT/boot1/zones-root         534M  52.0G    20K  /zones-root
    rpool/ROOT/boot1/zones-root/zone1   534M  52.0G   534M  /zones-root/zone1
    rpool/zone-data                      37K  52.0G    19K  /zones-data
    rpool/zone-data/zone1-runtime        18K  52.0G    18K  /zones-data/zone1-runtime
    There are 2 key components here:
    1) The ROOT file system - This stores the / file systems of the local and global zones.
    2) The zone-data file system - This stores the data that will be changing within the local zones.
    Here is the configuration for the zone itself:
    <zone name="zone1" zonepath="/zones-root/zone1" autoboot="true" bootargs="-m verbose">
      <inherited-pkg-dir directory="/lib"/>
      <inherited-pkg-dir directory="/platform"/>
      <inherited-pkg-dir directory="/sbin"/>
      <inherited-pkg-dir directory="/usr"/>
      <filesystem special="/zones-data/zone1-runtime" directory="/runtime" type="lofs"/>
      <network address="192.168.0.1" physical="e1000g0"/>
    </zone>
    The key components here are:
    1) The local zone / is shared in the same file system as global zone /
    2) The /runtime file system in the local zone is stored outside of the global rpool/ROOT file system in order to maintain data that changes across the live upgrade boot environments.
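    For reference, here is a hedged sketch of the commands that would produce this layout (dataset names match the listing above; the zonecfg session is from memory, so double-check the syntax):
        zfs create -o mountpoint=/zones-data rpool/zone-data
        zfs create rpool/zone-data/zone1-runtime
        zonecfg -z zone1
        zonecfg:zone1> add fs
        zonecfg:zone1:fs> set dir=/runtime
        zonecfg:zone1:fs> set special=/zones-data/zone1-runtime
        zonecfg:zone1:fs> set type=lofs
        zonecfg:zone1:fs> end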
    The system (local and global zone) will operate like this:
    The global zone is used to manage zones only.
    Application software that has constantly changing data will be installed in the /runtime directory within the local zone. For example, FreeRadius will be installed in: /runtime/freeradius
    During a live upgrade the / file system in both the local and global zones will get updated, while /runtime is mounted untouched in whatever boot environment that is loaded.
    Does this make sense? Is there a better way to accomplish what I am looking for? Is this setup going to cause any problems?
    What I would really like is to not have to worry about any of this and just install the application software wherever the software supplier defaults it to. It would be great if the system somehow magically knew to leave my changing data alone across boot environments.
    Thanks in advance for your feedback!
    --Jason

    Hello "jemurray".
    Have you read this document? (page 198)
    http://docs.sun.com/app/docs/doc/820-7013?l=en
    Then the solution is:
    01.- Create an alternate boot environment
    a.- In a new rpool
    b.- In the same rpool
    02.- Upgrade this new environment
    03.- Then, I've seen that you have the "radius zone" as a sparse zone (is that right?), so when you upgrade the alternate boot environment you will, at the same time, be upgrading the "radius zone".
    This may sound easy, but you should be careful; please try this in a development environment first.
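    A hedged sketch of those steps with the Live Upgrade commands (the BE name and the install-media path are placeholders):
        lucreate -n newBE                          # alternate boot environment in the current rpool
        lucreate -n newBE -p rpool2                # ...or in a different root pool
        luupgrade -u -n newBE -s /cdrom/cdrom0     # upgrade the inactive BE from the install media
        luactivate newBE                           # make it the BE used at the next boot
        init 6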
    Good luck

  • Best practices for saving files when emailing to the media?

    I'm pretty much an Ai imposter. I have Illustrator cs4 (14.0.0) and the way I use it is to cobble my designs together from iStock vector files and a lot of trial and error. Occasionally my husband throws me a lifeline because he is skilled with PhotoShop. As a small business owner, I am dying to find the time to take a course but right now I am just trying to float around the forums and learn what I can. Basically I flail about near the computer and the design sloooowly and somewhat mysteriously comes together. It's getting faster and is kind of fun, unless I get stuck.
    This week I had a problem with an ad I designed for a very small community orchestra's printed programs. I sent them a PDF first, and when they viewed it on their screens in InDesign they saw the whole image, but when it printed it omitted an element (a sunburst in the background). They thought it had something to do with that element being in color rather than greyscale (though there were other elements that survived and were the exact same color, so I was skeptical). I sent a greyscale file, no luck. I sent them the .ai file, but that apparently "crashed" their InDesign, and now they believe I've sent a corrupted file. They aren't very Adobe-savvy, either.
    I've designed and emailed no fewer than 9 other ads to other print & online media organizations this year and never had a problem. The file looks fine to me in all versions I open/upload/email to myself.
    You can see it here if you like: http://www.scribd.com/doc/27776704/RCMA-for-CSO-Greyscale
    So here are my specific questions:
    1. How SHOULD I be saving this stuff? Is pdf the mark of a rookie?
    2. What settings should I be looking at when/before saving? I read about overprint, for example, and did try that with one of the versions I sent them to no avail. I don't really know what that does, so I was just trying a hail mary there anyway.
    Thanks for your time!

    PDF is the modern way of sending files, but in this case you might want to select the art in question and go to Object > Flatten Transparency,
    then save it as a PDF or as an .ai file. When sending the file to them, zip it if they have Windows-based computers, or use StuffIt if they have Macs; actually, zip is good for both.
    It is safer to send it as an archive than as a bare .ai file.

  • Best practice for property file

    I am trying a small piece of code which uses values from a property file.
    public void doPost(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, java.io.IOException
    {
        // load the properties file from the web app's WEB-INF directory
        InputStream inputStream = getServletContext().getResourceAsStream("/WEB-INF/test.properties");
        PrintWriter out = response.getWriter();
        Properties prop = new Properties();
        prop.load(inputStream);
        inputStream.close();
        String nameprop = prop.getProperty("name");
        out.println(nameprop);
    }
    I am getting the value from the property file, so it's perfectly OK as far as test code goes. The code above opens the InputStream on every request, which is not needed, so I modified it and put the loading inside the init() method. That also works, and I believe it will give me somewhat better performance.
    But can I extend it further? Could I load the property file at application level and have all my servlets make use of it?
    What is the normal practice for reading a property file with good performance? I searched the forum and found some info there, but could not figure out exactly how people are doing it.
    Thanks in advance,
    regards
    Manisha

    Thanks to all posters,
    As mentioned by duffymo, I tried some code which I tested on my machine, and it is working fine. But I just want to confirm that what I understood and wrote is correct.
    I wrote the 1st servlet to load the property file and store it in the servlet context; the 2nd servlet is just to test it.
    1st Servlet:
    package common;
    import java.io.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    import java.util.*;
    public class TestPropertyfiles_1 extends HttpServlet
    {
        public void init(ServletConfig config) throws ServletException
        {
            super.init(config);
            try {
                // load the properties once at startup and cache them in the servlet context
                String PF_PATH = "/WEB-INF/test.properties";
                Properties prop = new Properties();
                InputStream inputStream = config.getServletContext().getResourceAsStream(PF_PATH);
                prop.load(inputStream);
                inputStream.close();
                getServletContext().setAttribute("pf", prop);
            } catch (Exception e) {
                // note: swallowing the exception here hides configuration problems
            }
        }

        public void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, java.io.IOException
        {
            PrintWriter out = response.getWriter();
            Properties tmpprop = (Properties) getServletContext().getAttribute("pf");
            String nameprop = tmpprop.getProperty("name");
            out.println("name from property file " + nameprop);
        }
    }
    2nd servlet:
    package common;
    import java.io.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    import java.util.*;
    public class TestPropertyfiles_2 extends HttpServlet
    {
        public void init(ServletConfig config) throws ServletException
        {
            super.init(config);
        }

        public void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, java.io.IOException
        {
            PrintWriter out = response.getWriter();
            // read the properties that TestPropertyfiles_1 cached in the servlet context
            Properties tmpprop = (Properties) getServletContext().getAttribute("pf");
            String nameprop = tmpprop.getProperty("name");
            out.println("name from property file - no 2 " + nameprop);
        }
    }
    This was initially giving a problem if I accessed the 2nd servlet before the 1st servlet. I made some changes inside web.xml:
    <servlet>
    <servlet-name>common.TestPropertyfiles_1</servlet-name>
    <servlet-class>common.TestPropertyfiles_1</servlet-class>
    <load-on-startup>1</load-on-startup>
    </servlet>
    And then all was OK.
    One thing came to my mind: I could have one common servlet just for all initialisation; it would not have any doGet/doPost.
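    As a variation on that idea, a ServletContextListener does the same application-level initialisation without any servlet at all; this is only a sketch, and the class name is made up:
        package common;
        import java.io.InputStream;
        import java.util.Properties;
        import javax.servlet.ServletContextEvent;
        import javax.servlet.ServletContextListener;
        public class PropertyFileListener implements ServletContextListener
        {
            public void contextInitialized(ServletContextEvent event)
            {
                try {
                    // runs once when the web application starts, before any servlet handles a request
                    InputStream in = event.getServletContext().getResourceAsStream("/WEB-INF/test.properties");
                    Properties prop = new Properties();
                    prop.load(in);
                    in.close();
                    event.getServletContext().setAttribute("pf", prop);
                } catch (Exception e) {
                    throw new RuntimeException("Could not load /WEB-INF/test.properties", e);
                }
            }

            public void contextDestroyed(ServletContextEvent event)
            {
                // nothing to clean up
            }
        }
    It is registered in web.xml with a listener element instead of a servlet/load-on-startup pair:
        <listener>
            <listener-class>common.PropertyFileListener</listener-class>
        </listener>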
    regards
    Manisha

  • Best practice for audio?

    I am building a website for an historical novel (http://www.bearriverbooks.com/index.html) and wish to include a recording of a radio interview with the authors. I work on the web very sporadically – picking up as I go along.
    What I have read so far suggests that a SWF file with attached audio is likely to reach the largest audience, due to the widespread availability of the Flash Player.  (I've heard 95%.  Is this real?)  If I go this route, it looks like I will have to roll my own controller in ActionScript, which would be fun, but I'm under time constraints and would have to pick up AS's syntax in a hurry.
    An alternative would be to link to a QT audio-only .mov file.  Presumably, this would give me an instant controller, but I'm not sure how many listeners I would lose over on the Windows side.  Anyone know how widespread the QT Player is over there in PC land?  Any comments on preferred compression?  (Voice-level mono is okay. The music in the clip is incidental.)
    Or, it could be that both of the above options may be foolish, and a simple and effective solution has eluded me because I'm basically a thrill-seeking newbie to web delivery.
    In any event, I'd really appreciate a little nudge in the right direction.
    TIA
    Richard Hurley
    Grass Valley MultiMedia

    I've used the JPlayer Audio Playlist generator before for a little site for a local singer / songwriter.
    The generator:
    http://chapmanit.thruhere.net/nick/source/v_0.2/
    It was quite straightforward to restyle for the site:
    http://www.rachel-worrall.co.uk/music/
    Once it's set up, it's easy to change the tracks included in the playlist - simply add or remove them from the defined folder on the server.
    Having said that, I'm going to bookmark Nancy's links; along with Murray, they're possibly the two most consistently helpful people posting here.

  • Best practices for protecting files from ransomware?

    If you don't know what CryptoWall and such ransomware is, you are lucky. For now.
    This is probably more of a desktop security issue, but I'd like some ideas for file server protection.
    A corporate office got lucky today: only the files on one PC were infected, and the network file shares the user had access to were lost - but they were backed up, hence the "lucky".
    But it was scary enough that they want to know what Microsoft wants us to do to prevent this in the future. The user was not an admin on the local machine, so we are not sure how it was installed (I've read people get it different ways).
    We have SCCM Endpoint Protection and obviously it didn't help. It did actually stop a password-stealing utility from installing around the same time, but it didn't stop us from having thousands of files rendered useless for many hours today.
    It was suggested not to use mapped network drives, but I think one share was hit without a mapping (still waiting for confirmation). And I think anywhere the malware finds a path to a share, e.g. under Favorites, could be attacked.
    Suggestions please.
    Thank you!

    You can try this.
    http://www.thirdtier.net/2013/10/cryptolocker-prevention-kit/

  • Best practice for importing non-"Premiere-ready" video files

    Hello!
    I work with internal clients that provide me with a variety of different video types (it could be almost ANYTHING: WMV, MP4, FLV).  I of course ask for AVIs when possible, but unfortunately, I have no control over the type of file I'm given.
    And, naturally, Premiere (just upgraded to CS5) has a hard time dealing with these files.  Unpredictable, ranging from working fine to not working at all, and everything in between.  Naturally, it's become a huge issue for turnaround time.
    Is there a best practice for preparing files for editing in Premiere?
    I've tried almost everything I can think of: converting the file(s) to AVIs using a variety of programs/methods.  Most recently, I tried creating a Watch Folder in Adobe Media Encoder and setting it for AVI with the proper aspect ratio.  It makes sense to me that that should work: using an Adobe product to render the file into something Premiere can work with.
    However, when I imported the resulting AVI into Premiere, it gave me the Red Line of Un-renderness (that is the technical term, right?), and had the same sync issue I experienced when I brought it in as a WMV.
    Given our environment, I'm completely fine with adding render time to the front-end of projects, but it has to work.  I want files that Premiere likes.
    THANK YOU in advance for any advice you can give!
    -- Dave

    I use an older conversion program (my PrPro has a much older internal AME, unlike yours), DigitalMedia Converter 2.7. It is shareware, and has been replaced by Deskshare with newer versions, but my old one works fine. I have not tried the newer versions yet. One thing that I like about this converter is that it ONLY uses System CODEC's, and does not install its own, like a few others. This DOES mean that if I get footage with an oddball CODEC, I need to go get it, and install it on the System.
    I can batch process AV files of most types/CODECs and convert to DV-AVI Type II with 48 kHz 16-bit PCM/WAV audio at 29.97 FPS (I am in NTSC land). So far, 99% of the resultant converted files have been perfect, whether from DivX, WMV, MPEG-2, or almost any other format/CODEC. If there is any OOS (out-of-sync audio), my experience has been that it is a constant offset, so I just have to adjust the sync by a few frames, and that takes care of things.
    In a few instances, the PAR flag has been missed (Standard 4:3 vs Widescreen 16:9), but Interpret Footage has solved those few issues.
    Only oddity that I have observed (mostly with DivX, or WMV's) is that occasionally, PrPro cannot get the file's Duration correct. I found that if I Import those problem files into PrElements, and then just do an Export, to the same exact specs., that resulting file (seems to be 100% identical, but something has to be different - maybe in the header info?) Imports perfectly into PrPro. This happens rarely, and I have the workaround, though it is one more step for those. I have yet to figure out why one very similar file will convert with the Duration info perfect, and then a companion file will not. Nor have I figured out exactly what is different, after running through PrE. Every theory that I have developed has been shot down by my experiences. A mystery still.
    AME works well for most, as a converter, though there are just CODEC's, that Adobe programs do not like, such as DivX and Xvid. I doubt that any Adobe program will handle those suckers easily, if at all.
    Good luck,
    Hunt
