Problem with Batch Compressing

I may be doing something wrong but when I'm batch compressing (15 short films in this instance) Compressor wants to call each one the same name as the first one, even though they are uniquely named. Is this some odd quirk of the programme?
For example if one is called 'film' at the top, when I send the rest off to compress, it puts 'film' in the title box of the others.
Can anyone shed any light on this for me?
Cheers.

Well although it seemed to want to title all of them the same as the first one, they did all have unique names at the end of the job.
So no worries!
K.

Similar Messages

  • Problem with batch in outbound delivery

    Hi gurus... I'm a newbie in SD and I have the following problem.
    I confirmed an outbound delivery that needs a batch, without the batch.
    Obviously I can't post the goods issue, but when I do the shipment,
    it creates the billing document, yet it doesn't post the goods issue because it says I need a batch on the item. How can I prevent the shipment from creating the billing document without the batch?

    Jorge,
    Here I am giving you more background on batch determination, so that it will be easier for you to correct the issue as you are new.
    On batch determination: the whole process, and how it is determined automatically in the order.
    1) Normally we use batch determination at delivery level, because at the time of the order the batch may or may not have been created yet. For this, the material should be configured for batches, and batch determination should be checked in the sales views of the material master.
    2) Batch determination during order creation:
    For this you need to maintain classes for your material. Depending on the manufacturing process you can define the characteristics for your material.
    Ex: purity for medicines, resistance for electric items.
    You need to create a class (you might have to create a new class type) which incorporates the characteristic.
    First create the characteristic using CT04, and then create the class including this characteristic using CL02.
    Then enter this class in the Classification view of your material master.
    Then create a batch for the particular plant and storage location using MSC1N. Give the value of the characteristics in this batch.
    Then go to SPRO -> Logistics General -> Batch Management and maintain the condition technique (procedure, strategy types, assignment to sales docs, etc.).
    Then create the batch determination record using VCH1.

  • [2.6.37] Problems with module compression

    Hello,
    during the last days I had problems installing the carl9170-driver package from the AUR. It is a package based on compat-wireless.
    Some other people had similar problems with other compat-wireless AUR packages.
    All the packages that are based on compat-wireless provide new versions of already existing kernel modules. They put those modules under /lib/modules/kernel-version/updates.
    After installing, depmod -a should find the new modules in the /updates folder, and after a reboot you can use the updated wireless drivers.
    This does not work for kernel 2.6.37 because all the in-kernel modules are .gz compressed, but the carl9170-driver package produces uncompressed modules.
    I guess that depmod is not able to work correctly if you mix compressed modules with uncompressed ones.
    I found this https://patchwork.kernel.org/patch/4039/ and it seems that there are configuration options for whether you get a compressed or uncompressed module when you run "make modules_install" in a PKGBUILD.
    This configuration is made in "Makefile.modinst".
    So the question is: is this a problem with Arch's kernel? I think when compressed in-kernel modules are used, external modules should be built the same way.

    Thanks for your post. I've been reporting this problem for weeks:
    https://aur.archlinux.org/packages.php?ID=41472
    https://aur.archlinux.org/packages.php?ID=41485
    https://bugs.archlinux.org/task/22633
    I solved the problem by going to /lib/modules/2.6.37ARCH/updates...iwlwifi and then ran:
    cd /lib/modules/2.6.37ARCH/updates
    gzip -r *
    depmod
    Then I reloaded the modules, and finally depmod reported that the compat-wireless one was loaded.
    The above instructions only worked with compat-wireless-patched from the AUR. I'm still unable to use compat-wireless-patched-daily even with the above trick (no module gets loaded at all).
    Last edited by alphazo (2011-02-20 19:54:59)

  • Problems with parallel compression

    I'm having some problems utilizing parallel compression on some drums. Here's the setup:
    -9 tracks of audio (mixed, EQ'ed, etc. ) outputting to a Bus, which serves as the master drum fader
    -master drum fader has very light EQ and compression, and a limiter
    -audio tracks also have a (post-fader) send going to a separate bus for parallel compression
    -parallel comp bus is heavily compressed, slightly EQ'ed
    -plugins used are all Logic's: limiter, compressor, channel EQ
    The problem is that as I increase the parallel fader, the bass drum loses its punch. It almost sounds like some kind of phase cancellation. The snare seems to lose a little presence as well. The cymbals, on the other hand, sound great. So if I leave the parallel fader very low, I'm not getting the compression I want out of the cymbals. Any ideas?
    I'm still pretty new to mixing, and definitely to parallel compression, so it seems likely a problem with my plugin settings. I got some advice from a friend of mine, and tweaked a little from there, but here are the settings of the compressor on the parallel bus:
    Threshold: -17 dB
    Ratio: 10:1
    Attack: 10 ms
    Release: 11 ms
    Knee: 0.0
    Peak Mode
    Thanks for your help.

    I guess I'm just a little stuck on the usual parameters (ratio, threshold, attack/release, etc.), though I realize the Warmer doesn't operate in quite the same way.
    Right. It is a bit confusing and there are only some controls which have usual plug-in equivalents. Look at VM as being a limiter, which in turn has an infinite ratio. The compression element of VM (as I understand it) affects the amount of compression underneath the ceiling. So the knee control affects that compression depth, rather than the compression curve, related to threshold, as seen in conventional compressors, like a sub threshold adjustment.
    the knee sets the amount of compression applied to signals leading up to the ceiling, right?
    Exactly. The ceiling, you would think, is the limiter ceiling for the output, a bit like Logic's Adaptive Limiter output. It is really a way of moving the output level to something other than 0 dBFS, and it works with the saturation settings on the VM's back panel and the way it handles the input side of VM. So I usually leave the ceiling at zero (as it's not being used for mastering) and change the drive, knee and saturation to taste. So the metering behavior you see makes sense to me: compare pre and post, adjust the drive, and it is more visually in step. I think it's more like a threshold than anything else. The knee also affects VM's make-up gain, which you can balance with the drive knob to get the right amount of coloration.
    Drive is, as you say, just an input level.
    I always have FAT on. When used on drums, I change the rear saturation to less high and bass, more mid. I also change the freq. settings to be more active in the low mids, but this changes for each kit. I usually have brick wall and Auto (for release, not auto gain) on, with release set at X1.
    The Mix control is mysterious to me. You would think that it works so that you can change the mix between an unprocessed input to the VM output and therefore use it on a drum submix without a separate bus for parallel compression, but it sounds different to me (and much better) when I use it on a bus set to 100%, so that's what I do, with some EQ before VM.

  • Problem with batch management indicator

    Hi Gurus,
    I have an issue with batch management.
    There is one material which was not batch managed. The requirement was to make it batch managed. There were no open purchase orders, and the only thing pending was the stock in the present and previous periods. The stock quantity in the previous period and the present period matched (470 kg). I opened the previous posting period, used movement type 551 and scrapped the stock. There was then zero stock for both the current and the previous period.
    I changed the batch management indicator successfully. Now the issue is that while I am trying to cancel the material document:
    1. Now that the material is batch managed, while trying to cancel the material document using MIGO, the system prompts for a batch to be entered. But the batch field is greyed out and I am not able to enter a batch.
    2. I have tried to cancel the material document using MBST and MIGO. The system prompts for a batch to be entered but does not allow me to do so, as the batch field is greyed out.
    Any pointers will be appreciated.
    Many Thanks,
    Sajin

    Hi Sajin,
    What is a reason to cancel a material document? When was that material document created? Before or after flagging the indicator in the material master?
    Ilya.

  • Dear all i have a problem with batch selection

    Hello ,
    I have this problem: I can't automatically assign batches to some promotional material items; we have generic batches according to the price of these items. When I deliver the order I have to enter the batch manually.
    Can any of you help me?
    I tried to classify the batch with transaction MSC2N, but these changes are not saved, and when I start another order I must enter the batch manually again.
    Thanks

    For Batch determination in Delivery:
    - Please check whether the material masters for these materials have been extended with the Classification view using class 023.
    - Check in VCH1 whether the search strategy records are maintained.
    There is also a detailed document here :
    Automatic Batch Determination Based on Shelf Life

  • Problem with Batch Updates

    Hi All,
    I have a requirement where I have to use batch updates. But if there's a problem with one of the updates, I want the rest of the updates to ignore it and continue with the next statement in the batch. Is this not one of the features of batch updates in JDBC 2? I have been trying to accomplish this for 2 days now. Can anyone please help me with this?
    FYI, I have tried the following.
    I have 3 test updates in my batch.
    The 2nd statement is an erroneous statement (deliberate); the other 2 statements are fine. It appropriately throws a BatchUpdateException, but when I check the arrays of ints returned by executeBatch() and by BatchUpdateException.getUpdateCounts(), both have size 0. If I remove the erroneous statement, the behaviour is as expected.
    Thanks in advance,
    Bharani

    The next paragraph of the same API doc:
    If the driver continues processing after a failure, the array returned by the method BatchUpdateException.getUpdateCounts will contain as many elements as there are commands in the batch, and at least one of the elements will be the following:
    A value of -3 (Statement.EXECUTE_FAILED) -- indicates that the command failed to execute successfully; this occurs only if a driver continues to process commands after a command fails.
    A driver is not required to implement this behavior.
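Whether a driver stops at the first failure or keeps going is driver-specific, so code should be prepared for both shapes of the counts array. Here is a minimal sketch (the class and method names are just for illustration) that interprets one entry of that array using only the standard JDBC constants:

```java
import java.sql.Statement;

public class BatchOutcome {
    // Maps one entry of the array returned by executeBatch() or by
    // BatchUpdateException.getUpdateCounts() to a readable outcome.
    public static String classify(int updateCount) {
        if (updateCount == Statement.EXECUTE_FAILED)   // -3: command failed
            return "failed";
        if (updateCount == Statement.SUCCESS_NO_INFO)  // -2: ok, row count unknown
            return "ok (row count unknown)";
        return "ok (" + updateCount + " rows)";
    }

    public static void main(String[] args) {
        // e.g. a driver that kept going after the 2nd statement failed:
        int[] counts = {1, Statement.EXECUTE_FAILED, 1};
        for (int i = 0; i < counts.length; i++)
            System.out.println("statement " + (i + 1) + ": " + classify(counts[i]));
    }
}
```

With a driver that stops at the first failure, getUpdateCounts() may instead return fewer elements than there were commands (possibly none), which matches the empty array Bharani observed.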

  • Problem with Inventory compression - 0RT_C01

    Hello experts,
    We are on NW04S, an implementation for a retail customer.
    For inventory data I am using the BX, BF and UM datasources.
    Now the problem: if I do compression of the data load at cube level (0RT_C01, the standard cube for retail), as explained in the How To... Inventory document, I get totally wrong values for closing balance (0CPTOTSTOBU), outflow issues (0CPTSTISBU) and inflow receipts (0CPTSTREBU).
    But the values come out correctly if we don't do the compression at all.
    Now my question: as per the How To... Inventory document we have to do the compression to get the right values, but if I do so I get wrong data, and if I don't do the compression I get the right data.
    Why is this happening? Has anyone faced this kind of issue before?

    Hello Ashu,
    Thanks for your reply.
    As you said, "0RT_C01 doesn't support the BX datasource for stock initialization. Instead, SAP recommends using 2LIS_40_S278 for stock initialization." If SAP doesn't recommend using BX, then why has SAP delivered the BX datasource with 0RT_C01?
    As you are also using the same cube: are you compressing the data (using marker updates)? And if yes, are you getting the right values after compression?
    Or are you using 2LIS_40_S278 for stock initialization? If yes, is it working fine in your case?
    Thanks
    Jagpreet.

  • Cs6 problem with batch from raw to jpg

    hello
    I recently have a problem when I want to batch process raw files in Bridge and convert them to jpg. It gives me an error message: The command "Save" is not currently available. (-25920)
    I have tried all possible ways but it doesn't work. I tried Process Files - Script - Image Processor, but that doesn't work either. A different computer works just fine with the same files and the same workflow.
    Just a few weeks back I was able to do all my work without problems. Please HELP, HELP, HELP
    Bibi

    Andrew Shalit wrote:
    I used iPhoto for many years and found it to be cumbersome and more trouble than it was worth.  It did not let me manage, organize, rate, tag, and edit my photos in any way approaching convenience. 
    Hate to say this for the 3rd time, but use Canon's "DPP" (Digital Photo Professional).
    With it you can "manage, organize, rate, tag, and edit [your] photos" as well as batch convert to jpeg or other formats.

  • Problem with Batch Resizing in CS3

    I have a bunch of .jpeg pictures I want to resize.  I created a new action, I went to File - Automate - Fit Image and set the size to 1920x1080, closed but did not save the image. Then I stopped the action.
    Then I went to File - Automate - Batch and set the Action to my new action, set up the source and destination folders and clicked OK.
    However, after the first picture in the source folder is resized, the JPEG quality window opens, and when I click OK it just keeps opening after each picture is resized.
    Is there any way to prevent this JPEG quality window from opening each time, so the action just resizes all the pictures? I'm using Photoshop CS3.
    Thanks in advance,
    John Rich

    That could be a video card problem, so run a check on it.
    If changing drives, be sure to deactivate CS3 before doing this (in the Help menu), then activate it once it is installed on the new HD. It is best to do this with the install disks rather than just a file transfer.

  • Problems With Batch Actions

    I have two actions that I built. They have worked fine for the past several months to a year. Now, all of a sudden, the actions will not work when run as a batch, but they will work when run on a single photo. Any ideas? I'm not even sure where to begin troubleshooting this problem.
    Thank you!!
    Amanda

    I'm using CS6, which is the same version the actions were created in. I use two different actions: the first one resizes a photo, and the second one adds a frame around the photo with our company logo on it. The second one is the one that is not working. It works fine on a single photo but not in a batch. I've attached a sample: the first one was run on a single photo and the second one was run in a batch.

  • Problems with batch in FCP 5

    I have a large project with many clips and sequences.
    I don't want to batch everything right now.
    Can I disable the "add additional items found" window that shows up every time you're about to batch something?
    This function takes me about 2-3 minutes just to go through all the items in my large project (140 MB).
      Mac OS X (10.3.9)  

    I must say, it's a bit difficult to understand your problem exactly if you don't give many details, and details also make it more inviting for someone to offer help.
    Look, your project is wayyyy!!! too big. But maybe it needs to be, who knows? Either way, I think you're talking about "batch capturing" only select clips at a time. If so, create a separate bin and drag only those clips inside it. Make that bin your logging bin (Control > Set Logging Bin). The program will disregard the other clips. I've done this myself on huge projects.
    If this is not what you're referring to, then refer to the top of this post.
    Peace

  • Problem with batch inputs

    Dear All
    Could you advise me? We have many batch input sessions stored in our system, and some of them are very old.
    What is SAP's recommendation?
    How long should batch input sessions be stored?
    Maybe you have some experience in this matter and can advise us?
    Thank you in advance for your help.
    Best regards
    Maja

    Hi Maja,
    If you double-click on each batch input session, you will find the status (Error, Processed, No Message). You can delete the batch input sessions whose status is Error or No Message; those are no longer in use. Even if you leave them, it is not a problem.
    You can find the status in the 2nd column; it shows symbols (Execute, Create and Error), and based on those you can also determine the status.
    Kumar

  • Problem with batch import and ReadAloud flag

    We use the batch import to get our books into the Content Server. We have a couple of books where we want to allow the read-aloud feature. According to the manual, setting the flag acs:RightReadAloud to -1 in the XML import file should take care of this, but it doesn't work for us. I have tried setting the flag to -1 and 0, but the result is the same.
    Below is an excerpt from our XML file:
        <acs:PublishInfo>
            <acs:RightReadAloud>-1</acs:RightReadAloud>
    Any suggestions?

    What version of ACS are you using and what process are you using?
    The following should work:
    - export the XML for the books you want to change
    - modify the XML tags for acs:RightReadAloud from 0 to -1
    - make sure the values for the NotificationType tags are set to 04 (update) rather than 03 (replace):
        <Product>
            <NotificationType>04</NotificationType>
    - import the XML
    Do a new search for the books in ACS (to make sure you're seeing the new values) and check that the read-aloud rights have changed.
    There is a problem in the importer which could cause an update like this to fail. See:
    http://support.adobe.com/devsup/devsup.nsf/docs/53803.htm

  • Sequencing problem with Batch Writing

    I'm using TopLink 9.0.3.5 and I want to use the registerAllObjects method on the UnitOfWork, instead of registering each object individually, to batch-write records to the db. Here is a snippet of what I'm doing:
    Session session = ToplinkUtils.getSession();
    session.getLogin().useBatchWriting();
    session.getLogin().dontUseJDBCBatchWriting();
    session.getLogin().setSequencePreallocationSize(200);
    session.getLogin().setMaxBatchWritingSize(200);
    session.getLogin().bindAllParameters();
    session.getLogin().cacheAllStatements();
    session.getLogin().useNativeSequencing();
    UnitOfWork uow = ToplinkUtils.getActiveUnitOfWork(userId, ip);
    List allObjects = new ArrayList();
    for (...) { // loop over the records to insert
        Notification dao = (Notification)
            ToplinkUtils.createObject(Notification.class.getName());
        dao.setName("someName");
        dao.setAddress("someAddress");
        allObjects.add(dao);
    }
    uow.registerAllObjects(allObjects);
    This is the error I'm getting:
    2007-03-06 15:28:40,482 DEBUG (11776:9:127.0.0.1) TOPLINK - JTS#beforeCompletion()
    2007-03-06 15:28:40,482 DEBUG (11776:9:127.0.0.1) TOPLINK - SELECT GMS_NOTIFICATION_SEQ.NEXTVAL FROM DUAL
    2007-03-06 15:28:40,497 DEBUG (11776:9:127.0.0.1) TOPLINK - INSERT INTO GMS_NOTIFICATION ...
    2007-03-06 15:28:40,716 DEBUG (11776:9:127.0.0.1) TOPLINK - EXCEPTION [TOPLINK-4002] (TopLink - 9.0.3.5 (Build 436)): oracle.toplink.exceptions.DatabaseException
    EXCEPTION DESCRIPTION: java.sql.SQLException: ORA-00001: unique constraint (GMSG2K.GMS_NOTIFICATION_PK) violated
    It appears that the next sequence number is acquired for the primary key and a record is added, but when TopLink tries to add the next record either a new sequence number is not acquired or the same sequence number is used again.
    Do I need to set up a table in memory to acquire the sequences? Can anyone give me some guidance?
    thanks

    registerAllObjects() does not do anything special it just calls registerObject(), it does not affect batching.
    In 9.0.3 batch writing was not supported with parameter binding, so this is why you are not seeing batching. This support was added in 9.0.4.
    In 9.0.4 you should use,
    Session session = ToplinkUtils.getSession();
    session.getLogin().useBatchWriting();
    session.getLogin().setSequencePreallocationSize(200);
    session.getLogin().setMaxBatchWritingSize(200);
    session.getLogin().bindAllParameters();
    session.getLogin().cacheAllStatements();
    session.getLogin().useNativeSequencing();
    where setMaxBatchWritingSize(200) means 200 statements.
    In 9.0.3 you can only use dynamic batch writing, which may not perform well. For dynamic batch writing, setMaxBatchWritingSize(200) means 200 characters (the size of the SQL buffer) so is too small, the default is 32,000.
    If you are concerned about performance enabling sequence preallocation would also be beneficial.
    If you still get timeouts, you may consider splitting your inserts into smaller batches (100 per each unit of work) instead of all in a single transaction.
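To make the sequence-preallocation suggestion above concrete, here is a rough sketch of the idea (not TopLink's actual implementation; the class name and the supplier wiring are illustrative): the database sequence is assumed to use INCREMENT BY equal to the preallocation size, so each NEXTVAL round-trip reserves a whole block of IDs that are then handed out locally:

```java
import java.util.function.LongSupplier;

// Sketch of sequence preallocation: one database round-trip reserves a block
// of `blockSize` IDs, which are then handed out locally with no further I/O.
// Assumes the database sequence increments by blockSize, so each call to
// nextFromDb returns the top of a freshly reserved block.
public class PreallocatingSequence {
    private final LongSupplier nextFromDb; // e.g. SELECT seq.NEXTVAL FROM DUAL
    private final long blockSize;
    private long next = 1, top = 0;        // current block is (top - blockSize, top]

    public PreallocatingSequence(LongSupplier nextFromDb, long blockSize) {
        this.nextFromDb = nextFromDb;
        this.blockSize = blockSize;
    }

    public synchronized long nextId() {
        if (next > top) {                  // block exhausted: reserve a new one
            top = nextFromDb.getAsLong();
            next = top - blockSize + 1;
        }
        return next++;
    }
}
```

With setSequencePreallocationSize(200), inserting 200 rows then costs a single NEXTVAL query instead of one per row.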
