Mastering Question: The Best Limiter for Final Limiting

This is probably a stupid question, but I thought I'd ask anyway.
When your final mix is done and you want to bring everything up to "just before the red," which plugin is best for this?
After playing around with all of Logic's limiters, I have yet to find one that does exactly this without hurting sound quality.
anyone?

the best limiter is the one you can't hear working.
i've tried most of the available ones, and for the most part, would never depend on any.
at least do a mix without any for reference.
there's a certain skill in NOT having to depend on such devices.
best, david r.
Sounds good (the answer, and probably your results), but there is one reason that might justify using a limiter anyway.
Even when you have balanced everything to keep the whole song within the desired dynamic boundaries, there is no guarantee that peak values will never add up to unacceptable levels, since you don't have full control over all the phase correlations between your tracks (or the voices of virtual instruments, or the delay lines of certain effects). You can run (or bounce) a song several times and end up with several different overall peak levels. When recording to tape, magnetic saturation will swallow such overshoots, but in a digital system a limiter can help you out here.
So to me the best approach seems to be to set all levels properly so that no limiter is needed - and then to use one anyway for the worst case and hope it never steps in.
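That phase dependence is easy to demonstrate numerically. A minimal NumPy sketch (hypothetical tone and levels, not any particular DAW's behavior): two tracks at -6 dBFS each can sum to anywhere between silence and full scale depending on their relative phase.

```python
import numpy as np

# Two tracks carrying the same 1 kHz tone at -6 dBFS (amplitude 0.5) each;
# the summed peak depends entirely on their relative phase.
fs, f, amp = 44100, 1000.0, 0.5
t = np.arange(fs) / fs

for phase in (0.0, np.pi / 2, np.pi):
    mix = amp * np.sin(2 * np.pi * f * t) + amp * np.sin(2 * np.pi * f * t + phase)
    print(f"phase {phase:.2f} rad -> summed peak {np.max(np.abs(mix)):.3f}")
```

In phase the peaks hit full scale, at 90 degrees they reach about 0.707, and at 180 degrees they cancel, which is why two bounces of the same song can show different overall peak levels.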

Similar Messages

  • XML to InDesign workflow question - the best XML editor or doc creator

    My department is considering the switch from our team members working directly in InDesign to a process where their work is imported to InDesign as xml content, then formatted automatically.
    Does anyone with relevant experience have ideas as to the best programs for creating the XML documents? FrameMaker seems like a candidate; however, I am wondering about open source or other options.
    Thank you

    You probably want to look at InDesign's Data Merge feature.
    Unfortunately it won't accept XML directly, so you'll have to convert your XML to CSV or TSV. But that's pretty easy to do.
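For the XML-to-CSV conversion step, here is a minimal Python sketch using only the standard library (the element and field names are hypothetical; adapt them to your actual XML schema):

```python
import csv
import xml.etree.ElementTree as ET

# Hypothetical source XML; real content would come from your authoring tool
xml_text = """<records>
  <record><title>First story</title><body>Hello</body></record>
  <record><title>Second story</title><body>World</body></record>
</records>"""

root = ET.fromstring(xml_text)
fields = ["title", "body"]

# Data Merge reads a delimited file whose first row names the fields
with open("merge_data.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(fields)
    for rec in root.iter("record"):
        writer.writerow([rec.findtext(tag, default="") for tag in fields])
```

The same pattern works for TSV by passing `delimiter="\t"` to `csv.writer`.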

  • The best SD final movie quality, shooting with HD camera

    I have a Canon XH A1, a great camera for HD, but I need a final SD movie, and I have a problem.
    I tried shooting HD, capturing into FCP as HD, and editing in HD. When I exported to SD NTSC 16x9 I got jagged edges and, in some places, pixelation. If I shoot SD with the Canon, I don't get the details. The grass, for example, looks "muddy" and loses detail.
    Is there a way to make SD look better when converting from HD shot video?
    Thanks,
    Message was edited by: Girshon Rutstein

    SD...what SD? DV? Uncompressed 8-bit? 10-bit? DV50? ProRes? There are many many SD codecs. So what codec are you using? How are you exporting?
    Here's the rub...you said you shot SD with the Canon, so I assume you mean DV. So I also assume that that is an HDV camera then. No, you will not see the same details in DV that you will with HDV...the compression is different. DV is a highly compressed SD format. Even if you shoot HD then export to DV, that detail will be lost. That is just the way things are. DV50 and uncompressed formats offer more detail, but then this all boils down to...what are you exporting TO? If DV, then there is nothing to be done.
    Shane

  • What is the best way to back up iPhoto and create disc space

    What is the best way to back up iPhoto, and then to move the iPhoto folder from the MacBook onto the external Apple Time Capsule?

    Two different questions.
    The best way to back up is to use Time Machine or another automatic backup program.
    However, this does not let you reduce space, since a backup needs to be a copy of your existing library; as soon as you change the library you no longer have a backup, and sooner or later the photos will be gone from the backup too.
    You need to run the iPhoto library on an external drive, or have a complete library on an external drive and a smaller library on your internal drive, using iPhoto Library Manager - http://www.fatcatsoftware.com/iplm/ - to move photos between the two libraries.
    Moving the iPhoto library is safe and simple: quit iPhoto and drag the iPhoto library, intact as a single entity, to the external drive; hold down the Option key and launch iPhoto, using the "select library" option to point to the new location on the external drive; fully test it, and then trash the old library on the internal drive (test one more time prior to emptying the trash).
    And be sure that the external drive is formatted Mac OS Extended (Journaled) (iPhoto does not work with drives in other formats) and that it is always available prior to launching iPhoto.
    And back up soon and often: having your iPhoto library on an external drive is not a backup, and if you are using Time Machine you need to check and be sure that TM is backing up your external drive.
    LN

  • Is DIAdem the best solution to store test limits and parameters

    I understand that DIAdem is the best repository for test results for analysis and reporting. However, can I make use of the database structure to store test limits and instrument setup values as a TDM file that I can call upon in a VI?
    If so, is it possible to extract from the TDM file only the values that I require?
    For example.
    Widget1 test requires min/max values A,B,C and instrument setup values for condition 1,2,3
    Widget2 test requires min/max values A,B,C and instrument setup for condition 4,5,6
    If this is the case, would it make sense to have one single TDM file with all values and parameters, or separate TDM files for Widget1, Widget2, etc.?
    Regards
    Chris

    Hi Chris,
    This question has come up in one form or another many times over the last several years, actually predating the DataFinder.  I think the honest answer is that the jury is still out as to whether DIAdem and the DataFinder offer a good general-purpose data management and reporting combination for manufacturing test data.  Certainly I have created numerous proofs-of-concept to show prospective customers that it can work and what they can expect from DIAdem and the DataFinder with their data.  There are some cases where I think DIAdem and the DataFinder are hands-down the best tool on the market, and there are other situations where DIAdem and the DataFinder can still do the job but it's a bit of a stretch, and then there are high-end situations where other tools are the better fit.
    I'll try to keep this reasonably short, but I think this is a question many will be interested in reading about, so I want to go ahead and post a few points to clarify those "different situation" comments above.  DIAdem and the DataFinder become a much more compelling fit when you run lots of different types of tests, such that the number of results is variable from one test to another, or the result quantity names and units are different from test to test, or the analysis/reporting required is different from test to test.  High mix data plays to the strengths of DIAdem and the DataFinder, because the DataFinder is a self-configuring and expanding data base, and DIAdem is a general purpose analysis and reporting tool and is very flexible.  If your acquired data has a very static format, on the other hand, then it would take much less effort to set up and maintain your own relational data base and create one analysis and reporting solution that would never need to be expanded.
    DIAdem and the DataFinder also excel when the overall data amount is not astronomical and when the data consumers are all on the same LAN and are willing to install DIAdem to look at the data or create reports.  Many times in a manufacturing test setting a web interface is requested so that the data consumers can be anywhere in the world and need have only a web browser to interact with the data.  DIAdem and the DataFinder are fully compatible with Citrix and other Windows Terminal Services layers, but that does not come built in with DIAdem 10.2 and the DataFinder Server Edition 1.1.  Both DIAdem and the DataFinder can handle VERY large data sets, but if you need high-end server options such as database backups, triple redundancy, complicated user-definable data access privileges, etc., then a conventional relational database such as SQL Server or Oracle is going to have more to offer.
    Note that the DataFinder database is an indexed compendium of information stored in flat files and is always tied to those flat files.  If you add, remove, or edit a data file, the DataFinder will automatically update the corresponding records-- there is no way to sever this link and use the DataFinder in a file-less mode.  The best way to organize your manufacturing data in these data files is to expose your single-value, named data results as properties on the file, group, or channel level.  This enables you to query on the values of those properties.  TDM files used for this purpose will index very quickly but will have a large footprint on disk, since you're storing this information in XML.  If file size is an issue, TDMS files index at least as fast and have a smaller disk footprint.  ASCII files are actually surprisingly efficient for storing manufacturing data.  If you happen to be using TestStand, note that there is an ATML DataPlugin on www.ni.com/dataplugins which will automatically expose your measurements as properties in DIAdem and the DataFinder.
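As a rough illustration of that property-based organization, here is a plain Python/JSON sketch standing in for a single TDM file with one group per widget (the exact TDM API depends on your toolchain, and all names and values here are hypothetical):

```python
import json

# JSON standing in for one TDM file: one group per widget, with limits and
# instrument setup stored as named properties so each value can be read individually
store = {
    "Widget1": {"limits": {"A": [0.1, 0.9], "B": [1.0, 2.0], "C": [5, 10]},
                "setup": {"conditions": [1, 2, 3]}},
    "Widget2": {"limits": {"A": [0.2, 0.8], "B": [1.5, 2.5], "C": [6, 12]},
                "setup": {"conditions": [4, 5, 6]}},
}

with open("test_limits.json", "w") as f:
    json.dump(store, f, indent=2)

# Pull out only the values one test needs, analogous to reading selected
# properties from the file rather than loading everything
with open("test_limits.json") as f:
    data = json.load(f)
widget1_min, widget1_max = data["Widget1"]["limits"]["A"]
print(widget1_min, widget1_max)   # -> 0.1 0.9
```

A single file with per-widget groups keeps everything queryable in one place; separate files per widget work equally well if tests never need to compare limits across widgets.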
    Ask if you have additional questions,
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments

  • Engineer question: The limitations of auto-align

    I have a question about the capability limitations of auto-align that I'm hoping one of the Photoshop engineers - or perhaps another qualified member of the community - might be able to field.
    I've been trying, unsuccessfully, to use PS CS5 to precisely align groups of star photographs, as many people do with other apps. I need to align all of the images in a group exactly - to a pixel-level of precision - and combine them into a single flattened image. I have to do this with hundreds of groups consisting of thousands of individual image files, so when CS5 announced the auto-stack and auto-align features, this seemed like a nearly ideal solution.
    Yet after stacking the different images together, and despite the fact that each layer is very nearly identical to the others, PS always rejects them with the standard dialog that it couldn't find sufficient overlap ("Needs to have at least 40% overlap", etc.). I've tried this with dozens of different star groups, each >99% identical in visual content, and always get the same rejection.
    One immediate thought was whether Photoshop was having too much trouble distinguishing individual stars from the noise typical of astrophotos like these. Applying Noise Reduction first doesn't help: when PS digs into the images to find alignable content or shapes, it may see them as more different than they appear to the eye, and it may be having trouble seeing the stars as anything but noise. I just don't know enough about the auto-align function to really say what's failing here.
    There are many other specialty apps that perform this function (PixInsight, RegiStax, etc.), but as a dedicated professional user of the Master Collection products, I was hoping to find out from an engineer whether this type of image does in fact exceed the native image-recognition capabilities of auto-align. Alternately, if there are additional steps or preliminary adjustments that might assist the algorithm in its processing, I'd be interested to hear about those. I'm also willing to post reference images if that's an aid to analysis.
    Many thanks.
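For what it's worth, pixel-level shifts between near-identical star frames can be estimated outside Photoshop with phase correlation, the standard registration technique the specialty apps build on. A minimal NumPy sketch on synthetic data (this is not Photoshop's auto-align algorithm, and it only recovers whole-pixel translations, not rotation):

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (row, col) shift taking ref to img via phase correlation."""
    cross = np.fft.fft2(img) * np.conj(np.fft.fft2(ref))
    cross /= np.abs(cross) + 1e-12          # keep phase only (whitening)
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts past the halfway point around to negative values
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

# Synthetic star field: a few bright pixels on top of noise
rng = np.random.default_rng(0)
ref = rng.normal(0, 0.05, (128, 128))
for y, x in [(20, 30), (64, 90), (100, 15)]:
    ref[y, x] = 1.0
shifted = np.roll(ref, (3, -5), axis=(0, 1))
print(estimate_shift(ref, shifted))   # -> (3, -5)
```

Because the whitening step discards amplitude, the method tends to be robust to the broadband noise that seems to confuse auto-align on star fields.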

    Thanks for the quick reply, Chris.
    Is there any current workaround here, or, from your vantage, do you feel that the existing capabilities aren't there yet?
    Lastly, and without meaning to ask you to reveal anything about future release specifics, do you feel that a functionality capable of handling star alignment at the pixel level is a likely near-term possibility, a far-off possibility, or simply off the table?

  • What is the best method to have multiple string limits in a single test

    Hello All,
    I need a little advice. I have two strings that need to be evaluated as pass criteria for a single test step (Numeric/String Limit Test). What would be the best approach to this?
    The strings are "00000000 00000010" and  "00000000 00000000"

    A couple of ideas:
    1) Concatenate the strings. Both when storing them in the data source and in your limits for the values you expect.
    2) Use arrays for both the data source and limit and loop on the step requiring all iterations to pass and using the loop index variable in your expressions for your string limit test. For example, for the limit expression you would have:
    Locals.expectedValues[RunState.LoopIndex]
    and for the data source expression you would have:
    Locals.stringMeasurements[RunState.LoopIndex]
    Then you would set the loop options on the step to loop the required number of times. You can even use the expression function GetNumElements(Locals.expectedValues) in your loop-while expression to set the number of iterations dynamically.
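The logic of option 2 can be sketched in plain Python (TestStand evaluates its own expression language, so this only illustrates the loop-and-index idea):

```python
# Mirrors Locals.expectedValues / Locals.stringMeasurements from the
# limit and data-source expressions above
expected = ["00000000 00000010", "00000000 00000000"]
measured = ["00000000 00000010", "00000000 00000000"]

# One loop iteration per element; the step passes only if every iteration passes
iteration_passed = [m == e for m, e in zip(measured, expected)]
step_passed = len(measured) == len(expected) and all(iteration_passed)
print(step_passed)   # -> True
```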
    Hope this helps,
    -Doug

  • 2010 iMac 2.93 i7 27" or 2011 2.8 i7 21.5" - which will suit better long-term?  I do a fair amount of audio, video editing and photoshop, but I have a limited budget.  What's going to give the best bang-for-buck for the next 3-5 yrs?

    I do a fair amount of audio, video editing and photoshop, but I have a limited budget.  What's going to give the best bang-for-buck for the next 3-5 yrs?  My current machine is a 13" Macbook unibody 2.4 Core 2 Duo w/ 4Gb Ram so it's time to move forward with more power and screen real estate!

    Hello, Jeff
    I could never edit on a 13" screen. I'm currently using a 17" MBP i7 Early 2011 as a fast replacement to my aged 20" Intel iMac.
    Both systems are not that far apart on stats and you will find that processing HD video will rely highly on the read/write to your storage. Myself, I'd be eye balling the 2011 for the Thunderbolt port so HD Video export/compression doesn't take forever! Currently processing a finished HD project for DVD uses at most 20% of my total CPU capacity. The FW800 drive is the big bottleneck! (I know I need at least a RAID to see a real speed boost).
    How good your eyes are and your usage style will dictate whether the difference in screen size makes a difference to you.

  • How can I control the image size when I export from iPhoto? The choice is too limited. I need to send a photo under 3 MB, but if I choose high quality it is only 1.1, and I need to keep the best quality I can. Thanks for help.

    How can I control the image size when I export from iPhoto? The choice is too limited. I need to send a photo under 3 MB, but if I choose high quality it is only 1.1, and I need to keep the best quality I can. Thanks for help.

    Any image can only be as large as the Original. With a program like Photoshop you can UpRes an image and get it to a bigger size, larger files size as well, but the actual quality of the image will be degraded, depending on the UpRes system and the original quality of the image.
    iPhoto is not the program to be doing that in and I don't think it even has that option.
    So I suspect the image you are trying to send isn't much bigger than what you are getting. You can also try exporting it from iPhoto to your desktop and see what size you end up with. If it is still that 209KB +/- file size, then that is the size of the original image.

  • Challenge - Where is the best place to host reports given the following limitations?

    Hello All,
    I am going to box everyone in a little bit here but where is the best place to host reports given the following restrictions:
    Cannot be within a BPF
    Cannot be accessed via the EPM folders
    The crux here is that some of our end users just want to go and run reports, much like you would in a traditional BI environment. Given that BPC has no concept of a portal page for hosting such reports, we need the next best thing. Currently we are posting reports to the documents folder, but the downside of this is that we need a distinct copy of each report for each environment (DEV, QA, PROD), which is quite unfortunate. Coming from a traditional BI background, this seems ridiculous.
    Cheers,

    We use EPM folders to save the reports.
    Perhaps for your requirement you should look at Web forms and Web reports.

  • Switching to mac, questions on best place to host reference masters

    I am getting ready to switch from PC to Mac and have a question I am hoping I could get some advice on (I have a lot of questions actually, but will start with this one).
    I have a large library of photos and videos and plan on getting an imac for my wife and macbook pro for me (I do the tagging/organizing and video editing).
    Based on what I read, I am planning on keeping the masters on a separate drive and the aperture library on my macbook pro.
    Question is, am I better off keeping the masters on the iMac and accessing that folder as an external folder from my MacBook Pro, or should I use a NAS drive for them?
    I have an older NAS that I can probably reformat for this purpose, but I often run into complications using a NAS (offsite backup issues, connectivity issues, can't use Time Machine, ...) and have read on the forums about people having issues writing to files on NAS's...
    I have a buffalo terastation pro ii, it is about 3 years old.  It is 2 terabytes.  I would get the imac with 2 terabytes and the 256gig ssd drive so the 2 tera could be dedicated to media.
    Thanks in advance for any advice, appreciate it.
    Peter

    plampione wrote:
    I am about to buy the macs, so both the desktop and macbook pro will be fast.  I plan on buying the fastest processor, 8gig of ram on the macbook, debating 8 or 16 on the imac, though from the previous thread sounds like it should be 16, and ssd on both (imac with 256 ssd + 2 TB HD, macbook with 512 ssd).
    I suggest ordering both Macs with the minimum RAM and adding 3rd party RAM yourself.  The task is trivial and will save you heaps.  I bought my iMac with 4GBs (that's two 2GB modules plus two empty slots), and then added two 4GB modules into the empty slots.  So I have 12GBs.  My plan is to try to live with just 12GBs until 8GB modules become affordable, and then replace the two original 2GB modules with two 8GB modules, bringing it to 24GBs.  If needed, I'd ultimately go to 32GBs by replacing the two 4GB modules I have just installed.
    I think the MacBook Pro (MBP) still has only two slots, so using 4GB modules, you are limited to 8GBs.  If the architecture of the MBP supports it, you may be able to go to 16GB when 8GB modules become affordable.  But you'd need to check this.
    - using an attached harddrive on the macbook pro - Definitely a reasonable idea, but I was hoping to be able to bring up the library from anywhere in the house.  I do have gigabit ethernet throughout the house and can wire in as needed, but again, was hoping to be able to sit on the sofa and do some tagging.
    Another way to achieve this still has your Aperture Library and Masters on the iMac as I recommended above.  You could use the MBP to share the screen of the iMac.  It's a standard OS X function, no software to buy.  You enable sharing on the iMac and log in from the MBP.  That way you have the function on your sofa, but the power and capacity of the iMac for your actual work.  This works fine over wifi for me via an Airport Extreme using 802.11N
    - Backup - by putting the masters on the imac i was planning on using time machine to back them up onsite, and crashplan to backup offsite.
    Good plan.  TM and Crashplan are good products and free.
    I'd also recommend making a second onsite backup by cloning to a pair of external drives, and periodically send one of these drives offsite.  You may find that Crashplan for offsite backup over the internet is too slow if you have a big photo shoot.
    What if I plug into the gigabit ethernet whenever I am going to run aperture?  Would that allow the imac/macbook model to work?  Concerned about the performance issues described above though, even for the machines themselves.  Would copying the library nightly to my wife's computer work?  How big does a 20+k library get with referenced masters?
    Size of the library depends on many factors such as choices for previews, and the amount of editing (adjustments) you do, but here is a data point.  I have 47K images, many RAW and the Library was about 60GBs, the referenced Masters another 280GBs.  I say "was" because I have let it grow by importing recent images into it.
    Today we are on PCs and I use windows live photo gallery and tag all my photos, which are stored on a NAS, my wife also runs WLPG and can see all the tagging I do.  This works pretty well and is simple, I just import from the camera, show all 'untagged' photos/videos, and tag them.  I also move them to a basic folder structure to be safe.
    I understand this is not as feasible on mac, and if you are wondering why I am switching, it is because I have been having a lot of stability problems with the pc's and have had enough.  I use iphones, ipads, and apple tv's.  Recently got a macbook air and really like it.  Just need to figure out how to manage the photo/video library.
    A lot of my concerns about your plan are because I'm impatient.  On my MBP, it was taking about 5 seconds to render each image as I reviewed them.  I typically take a very short time to tag the images, so the computer was slower than me.  Now, with the iMac, I'm the slow link in the chain.
    It may be that you are used to a leisurely tagging process, but I have no experience with WLPG.
    The great things about Aperture that I really love, having come from a Photoshop background are that Aperture is always non-destructive in its editing and it is non-modal in the UI.
    There is a price to be paid for non-destructive editing.  Aperture often has to go back to the Master and then apply your adjustments.  This takes resources.  But it saves you time.  And you don't have to muck around creating another external folder structure yourself, as it seems you do with WLPG.  I just import my images into "Projects" in Aperture (the Masters are Managed), and when I want to selectively change Managed Masters to Referenced Masters, I just tell Aperture to do this and that it should utilize my Project structure as the external folder structure.  Aperture does all the folder creation for me, yippee!
    Finally, you may even find that the MacBook Air is good enough to do the tagging on with the shared screen approach I mentioned above, and then maybe you don't have to buy a MBP.
    HTH.  Happy to discuss further

  • Looking for an app for a Bluetooth headset; the type I want to use on my iPhone is a Jawbone. Question is, out of the many available in the store, which one would be the best and most reliable?

    Looking for a Bluetooth app for my iPhone. Out of the many in the App Store, which one(s) would be the best and most reliable? Currently trying to set up an older Jawbone.

    You are asking several different questions. If you need to store your photos, music, and movies on an external volume, you certainly can. Any externally connected hard disk drive will work, connected either directly to your Mac or to your Time Capsule as a shared volume.
    You should not rely upon using that as a backup device though. Although you certainly may use it for both purposes, it is a better idea to have dedicated backup devices for a variety of reasons not limited to redundancy. You would not want to simultaneously lose all your pictures as well as your backup. If they are all on the same device, that could happen. Furthermore, a backup cannot back up the volume on which it is running.
    As for adding an Extreme or Express, using its LAN port for your iMac, and then enable Internet sharing so you can effectively use the iMac as a "hotspot", you can do that too, but I am unclear on what benefit you believe this arrangement would convey for you.
    An Extreme's Guest network is separate from its Main network; that is the reason for having it.

  • Question about GCU limits and Nintendo Amiibo figures

    I noticed that the Nintendo Amiibo figures on the Best Buy website qualify for the 20% Gamers Club Unlocked discount.  My question is, how does this work with the 3 per title per system per year limit?  There are twelve different figures available and I would like to preorder all of them from Best Buy, but the potential to get my account suspended has me scared off.
    I know that I could avoid this by not using my GCU membership on any figures past the first three.  The unfortunate thing about that option is that it would preclude me from getting any My Best Buy Rewards points for them since my MyBB account is tied to GCU.  That would be a bummer, and seems like a major oversight/limitation of the program.

    Hi MetalSlugger,
    I’m very happy to hear that you’re interested in pre-ordering all of the Nintendo Amiibo figures at Best Buy! Being a Gamers Club Unlocked (GCU) member, it’s completely understandable that you’d want to utilize your GCU benefits while pre-ordering the figures. It looks like everyone in this thread has done a fantastic job with researching other threads in regards to using GCU benefits on figure purchases and it seems like you’ve reached the correct conclusion.
    Since the Amiibo figures all have an individual SKU, then you should be just fine with ordering up to three of any given figure while using your GCU benefits. Furthermore, you should have no issue doing this for every figure. The fact that JokerBingo has been doing this with Skylanders and Disney Infinity figures further shows this!
    Thanks for posting your inquiry here on the forums and please let us know if you have any further questions.
    Cheers,
    Brian|Senior Social Media Specialist | Best Buy® Corporate

  • How to get the table of value field? and can we expand the technical limits

    Dear
    I have created a value field in COPA with KEA6. Now I need the table in which the value fields are saved, but I have tried hard to find it and failed. Can anyone help me? Please tell me how to find the table of a value field.
    And another question: can we extend the technical limit on the number of value fields in ECC 6.0?
    We have a note for R.4.x Please see below:
    OSS note 160892
    You can display the length of a data record using Transaction KEA0 ('Maintain Operating Concern'). After you have navigated to the 'Characteristics Screen' or to the 'Value field Screen' choose menu path 'Extras -> Technical Limits'.
    The maximum displayed here under 'Length in bytes on the DB' is the maximum length permitted by the Dictionary. The reserve required for the release upgrade must be subtracted from this value.
    To increase the allowed number of the value fields, increase the value that is assigned to field ikcge-bas_max_cnt (FORM init_ikcge_ke USING fm_subrc, approx. line 165) in Include FKCGNF20. It specifies the number of the possible value fields. The corresponding part of the source code is attached to the note as a correction.
    David Sun
    Regards!

    How do we extend the limit on the number of value fields? Please see the original question.

  • (workflow question) - What is the best way to handle audio in a large Premiere project?

    Hey all,
    This might probably be suitable for any version of Premiere, but just in case, I use CS4 (Master Collection)
    I am wrestling in my brain about the best way to handle audio in my project to cut down on the time I am working on it.
    This project I just finished was a 10 minute video for a customer shot on miniDV (HVX-200) cut down from 3 hours of tape.
    I edited my whole project down to what looked good, and then I decided I needed to clean up all the Audio using Soundbooth, So I had to go in clip by clip, using the Edit in SoundBooth --> Render and Replace method on every clip. I couldn't find a way to batch edit any audio in Soundbooth.
    For every clip, I performed similar actions---
    1) both tracks of audio were recorded with 2 different microphones (2 mono tracks), so I needed only audio from 1 track - I used SB to cut and paste the good track over the other track.
    2) amplified the audio
    3) cleaned up the background noise with the noise filter
    I am sure there has to be a better workflow option than what I just did (going clip by clip). Can someone give me some advice on how best to handle audio in a situation like this?
    Should I have just rendered out new audio for the whole tape I was using, and then edit from that?
    Should I have rendered out the audio after I edited the clips into one long track and performed the actions I needed on it? or something entirely different? It was a very slow, tedious process.
    Thanks,
    Aza

    Hi, Aza.
    Given that my background is audio and I'm just coming into the brave new world of visual bits and bytes, I would second Hunt's recommendation regarding exporting the entire video's audio as one wav file, working on it, and then reimporting. I do this as one of the last stages, when I know I have the editing done, with an ear towards consistency from beginning to end.
    One of the benefits of this approach is that you can manage all audio in the same context. For example, if you want to normalize, compress or limit your audio, doing it a clip at a time will make it difficult for you to match levels consistently or find a compression setting that works smoothly across the board. It's likely that there will instead be subtle or obvious differences between each clip you worked on.
    When all your audio is in one file you can, for instance, look at the entire waveform, see that limiting to -6 dB would trim off most of the unnecessary peaks, trim it down, and then normalize it all. You may still have to do some tweaking here and there, but it gets you much farther down the road, much more easily. Same goes for reverb, EQ or other effects where you want the same feel throughout the entire video.
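That limit-then-normalize pass can be sketched on a raw sample array with NumPy (a crude hard clip at -6 dB standing in for a real limiter, which would use lookahead and smooth gain reduction; the signal values are made up):

```python
import numpy as np

def db_to_lin(db):
    return 10.0 ** (db / 20.0)

# Hypothetical mixed-down audio: a quiet tone with a few stray peaks
audio = 0.2 * np.sin(2 * np.pi * 440 * np.arange(44100) / 44100.0)
audio[[1000, 5000, 9000]] = [0.9, -0.8, 0.95]      # the unnecessary peaks

# 1) Limit: clip everything above -6 dBFS (crude stand-in for a real limiter)
ceiling = db_to_lin(-6.0)                           # ~0.501
limited = np.clip(audio, -ceiling, ceiling)

# 2) Normalize: bring the new peak up to just under full scale (-0.3 dBFS)
normalized = limited * (db_to_lin(-0.3) / np.max(np.abs(limited)))
print(round(float(np.max(np.abs(normalized))), 3))  # -> 0.966
```

Doing both operations on the single exported file is what keeps the gain change consistent from the first clip to the last.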
    Hope this helps,
    Chris
