Compressor post-processing applescript and cluster
I'm having a lot of trouble using a post-processing AppleScript. I have a preset that pumps out an MP3 from an HDV .mov; the post-processing script does some funky stuff and should pump out an XviD .avi. When I submit the job to "This Computer" it works fine: I get the .mp3, the script gets kicked off, and it generates .avi files. When the job is submitted to our cluster, however, the job fails immediately and generates neither the .mp3 nor the .avi files. The log has the following:
<----snip---->
pid="250" msg="Hard-linked source file to cluster storage."/>
<log tms="226012897.974" tmt="02/29/2008 13:21:37.974" pid="250" msg="Job resource error: System Error = 1 {Operation not permitted}Failed to read filethread terminated due to exception (remote exception). This is likely caused by missing or renamed watermark filter file."/>
<mrk tms="226012897.975" tmt="02/29/2008 13:21:37.975" pid="250" kind="end" what="service-request" req-id="D729F684-764C-439E-9A06-888BB4C9405C:1" msg="Preprocessing job request error (exception=Job resource: System Error = 1 {Operation not permitted}Failed to read filethread terminated due to exception (remote exception))."></mrk>
</logs>
</service>
</services>
<----snip---->
All requisite files have been placed either on a global filesystem (an OD home directory, for the post-processing script itself) or copied to every node (a requirement of the AppleScript). The submitting user has full read/write permissions on the working directory and all subfolders.
Any ideas?
In case anyone runs up against this one, I found a solution/workaround. It turns out that when submitting the job to the queue, qadmin would copy the AppleScript over to the shared folder. During this copy it would change the file's attributes such that the OS thought the .app was a Classic application and would refuse to run it.
Workaround: in Compressor preferences, set "Cluster Options:" to "Never copy source to cluster".
Luckily this works for our setup, since we assume all source media and scripts are already on a global filesystem.
Similar Messages
-
PI post processing - configure the role of Integration service fails
I am running the post-processing template, and it keeps failing. So I have a couple of questions:
Can I rerun the PI template multiple times until the entire process completes successfully? Must every process complete successfully? It shows 100%, but it is red....
The process errors out on "Configure the role of the Integration Service" with the error "max. no. of 100 conversations exceeded"; SXMB_SET_ROLE_TO_IS AbapConfigurationWriter FAILED.
Any help would be greatly appreciated.
Thanks
Mikie
Hi,
>>>><i>Can I rerun the PI template multiple times until the entire process completes successfully? Must every process complete successfully? I have 100% but it is red....</i>
If one template fails and you simply rerun it, it won't be successful. If it shows 100% but is red, don't worry: that is a success.
Template failures can occur if you have not configured the SLD first. First execute all the templates under NWA, then try executing all the PI templates.
The last resort is to do the configuration manually using the XI installation guide; that is sure to work.
Regards,
Ramesh P -
SPRO - Create Destination for Import Post-Processing
Hi,
What is the significance of this setting in SPRO?
SAP IMG -> Business Intelligence -> Transport Settings -> Create Destination for Import Post-Processing
How, and in what situations, do we need to fill in the entries for this setting?
Thanks
Hi George,
If you don't create a destination here in the target system for transports, your process chains will not get activated after a transport. I am not sure whether it has any other implications.
regards,
Sumit -
Authentication post processing+goto URL
Hi All,
How can I get the goto URL for a user that has been authenticated by SAM 7.1, using an authentication post-processing class?
I have to check a user attribute in the post-processing class and redirect the user to the SIM change-password page, as per the requirement.
I want to pass the same goto URL on to the change-password page, so that the user can be redirected to the originally requested page after a successful password change.
I tried request.getParameter("goto"), but it gives me an encrypted value which I am not able to use. I searched on Google and found that ssoToken.getProperty("FullLoginURL") works for some people, but unfortunately not for me :(
If somebody has a similar kind of implementation, please let me know the solution.
I would appreciate any help in this regard.
Edited by: Amit_Bansal on Feb 24, 2009 1:55 PM
I am certain there could be, if you have applications that send sensitive data in the URL. We do not. If you have some old application that sends the user password in the URL ... I could not think of a case we had that would require the string to be encoded, so we removed it. The other alternative would be to decode the string, store it in the session, and then retrieve it from the session. It seemed like a bit too much work any way you sliced it, so we unencoded it. I am sure there is some security expert somewhere rolling in his grave, but I could not think of an instance in our architecture that would require it, and it made troubleshooting almost impossible with "normal" tools when it was encoded.
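The decode-and-store alternative described above is easy to sketch. The original thread is Java (java.net.URLDecoder.decode would be the equivalent call there), but here is a language-neutral sketch in Python; it assumes the goto value is merely percent-encoded rather than encrypted (if it is actually encrypted, percent-decoding alone will not recover it), and the function name is illustrative:

```python
from urllib.parse import unquote, urlparse

def recover_goto(raw_goto):
    """Percent-decode a 'goto' redirect parameter and sanity-check it
    before using it as a redirect target (hypothetical helper)."""
    decoded = unquote(raw_goto)
    scheme = urlparse(decoded).scheme
    # Only accept http(s) targets, to avoid open-redirect surprises.
    if scheme not in ("http", "https"):
        raise ValueError("unexpected goto scheme: %r" % scheme)
    return decoded

print(recover_goto("https%3A%2F%2Fportal.example.com%2Fchpwd%3Fnext%3D1"))
# -> https://portal.example.com/chpwd?next=1
```

Once decoded, the URL can be stashed in the session and retrieved after the password change completes, as the reply suggests.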
-
Post processing after system copy from standalone to cluster enviornment
Hi,
I have done a system copy (backup/restore method) from a standalone quality BW system (QBW) to a cluster environment (TBW).
The steps I followed are:
. Set up the machines in the cluster environment (Windows 2003 EE Server, SQL Server 2005 SP3)
. Backed up the current database on the source system
. Gave a different SID (TBW) during the system copy installation; it is a dual-stack installation
. Restored the database on the target system, selecting the backup/restore method during the system copy database installation step
The installation completed successfully, and I have some doubts regarding the post-processing steps that follow.
We want to retain the same client in the target system; for this I need to rename the logical system name. Kindly assist me with changing the logical system name for the BW delta extraction from the extraction server to the BW system, and with the other steps I have to follow after the system copy.
Thanks
Arun Padikkal
Hi Arun,
You should probably post this question in a basis forum.
Cheers,
Diego -
Best way to stream lots of data to file and post process it
Hello,
I am trying to do something that seems like it should be quite simple, but I am having some difficulty figuring out how to do it. I am running a test that has over 100 channels of mixed sensor data. The test will run for several days or longer at a time, and I need to log/stream data at about 4 Hz while the test is running. The data I need to log is a mixture of different data types: a time stamp, several integer values (both 32- and 64-bit), and a lot of floating-point values. I would like to write the data to file in a very compressed format, because the test is scheduled to run for over a year (stopping every few days) and the data files can get quite large. I currently have a solution that simply bundles all the data into a cluster, then writes/streams the cluster to a binary file as the test runs. This approach works fine but involves some post-processing to convert the data into a format, typically a text file, that can be worked with in programs like Excel or DIAdem. After the files are converted to text they are, no surprise, a lot larger (about 3 times) than the original binary files.
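For reference, the stream-a-cluster-to-binary approach described above amounts to writing one fixed-size record per scan. Here is a minimal sketch in Python using the struct module (the original setup is LabVIEW, so this is a stand-in); the exact field counts — a float64 timestamp, two 32-bit and two 64-bit integers, 90 float32 readings — are assumptions based on the description:

```python
import struct

# Assumed record layout per scan: float64 timestamp, two 32-bit ints,
# two 64-bit ints, 90 float32 sensor readings ('<' = little-endian).
N_FLOATS = 90
RECORD = struct.Struct("<d 2i 2q %df" % N_FLOATS)

def pack_scan(timestamp, ints32, ints64, floats):
    """Pack one scan into a fixed-size binary record."""
    return RECORD.pack(timestamp, *ints32, *ints64, *floats)

def unpack_scan(buf):
    """Recover (timestamp, ints32, ints64, floats) from one record."""
    f = RECORD.unpack(buf)
    return f[0], list(f[1:3]), list(f[3:5]), list(f[5:])

# Streaming at 4 Hz: open once in append mode, write one record per scan.
# with open("run.bin", "ab") as log:
#     log.write(pack_scan(t, [a, b], [c, d], readings))
```

Because every record is RECORD.size bytes, a post-processing tool or a DIAdem DataPlugin can seek straight to scan i at offset i * RECORD.size without parsing the whole file.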
I am considering several options to improve my current process. The first option is writing the data directly to a TDMS file, which would allow me to quickly import the data into DIAdem (or Excel with a plugin) for processing/visualization. The challenge I am having (note: this is my first experience working with TDMS files, and I have a lot to learn) is that I cannot find a simple way to write/stream all the different data types into one TDMS file and keep each scan of data (containing different data types) tied to one time stamp. Each time I write data to file, I would like the write to contain a time stamp in column 1, integer values in columns 2 through 5, and floating-point values in the remaining columns (about 90 of them). Yes, I know there are no columns in binary files, but this is how I would like the data to appear when I import it into DIAdem or Excel.
The other option I am considering is writing a custom DataPlugin for DIAdem that would allow me to import the binary files I am currently creating directly into DIAdem. If someone could suggest which option would be best, I would appreciate it. Or, if there is a better option that I have not mentioned, feel free to recommend it. Thanks in advance for your help.
Hello,
Here is a simple example; of course, here I only create one value per iteration of the while loop, for simplicity. You can also set properties of the file, which can be useful, and set up different channels.
Besides, you can use multiple groups to gain more flexibility in data storage. You can think of channels as columns and groups as sheets in Excel; that is how you will see your data when you import the TDMS file into Excel.
I hope it helps; of course, there are many more advanced features of TDMS files, so read the help docs! -
Direct Link from Premiere Pro to SpeedGrade won't open and stops at the post-processing bar.
I've just installed the new version of SpeedGrade, and every time I try to open a Direct Link from Premiere Pro to SpeedGrade, it opens and starts to load, but when the post-processing bar appears it stops there and doesn't open. I've gone back to the original versions of both programmes and the same thing happens. HELP!
Hi Vinay,
1. I am using third-party plug-ins, and I understand that might be causing an issue, but I've also tried opening a new project with nothing in it at all, and that won't open either.
2. Premiere Pro quits after the project loading bar, or after I create a new project.
3. I used a lot of file types: mov, mts, psd, ai, png, jpeg, and AE compositions through Dynamic Link.
4. The project has about five sequences. They range from 1 minute to 4 minutes.
Thanks! -
Tables and views which are affected during the SAP license post-processing
Hi,
Can anyone tell me which tables and views are affected during the SAP license post-processing step in an SAP 4.7 installation on Oracle?
Regards,
Abhishek
Hi,
There is no active table with the name MLICHECK; the table is not active in the dictionary.
What should I do now? I want to see the license data of the SAP system in a table view. -
Post processing records in MF47 and COGI
Hi All,
I have a query....
In standard SAP, will the post-processing records created by Repetitive Mfg. be visible in transaction COGI?
And will the post-processing records created by Discrete Mfg. be visible in transaction MF47?
Regards,
Vinayak.
Hi,
In general, for Discrete Mfg the post-processing records are checked and cleared in transaction COGI, whereas for REM it is MF47.
You will be able to view the REM post-processing records in MF47; that is standard SAP behaviour, hence I can say there is no bug in your system.
Hope this will help you.
Regards
radhak mk -
My post-processing times have tripled, and the Develop tasks take multiple seconds for one adjustment. Please help.
-
I have a few questions regarding audio post-processing done outside of PrPro. The content is music that was recorded at 48 kHz/24-bit. After post-processing it is in a 32-bit file, which is then combined with a video clip in Premiere.
1) Does Premiere change the audio bit rate when it compresses video clips during the DVD burn process?
2) Does Premiere add dithering to the audio signal, prior to the compression process?
3) Is it desirable to add dithering to audio clips that are added to video clips?
Thanks,
Steve
Hi Hunt,
Thanks. A couple of questions: I am not considering DVD-Audio, but audio that has been recorded off-camera, edited, and then combined with the corresponding video clips.
1) Does 24-bit audio get converted to 16-bit audio by Premiere? If so, this suggests that adding dither to audio files used within video clips could be beneficial.
2) I am guessing BD refers to Blu-ray. Yes? If so, is it advisable to up-sample audio clips, or to leave them at the sample rate at which they were recorded?
Thanks,
Steve -
Reinstall EP6SP2P4 and then post processing hangs AGAIN!!!!
I reinstalled EP SP2 Patch 4 on Solaris; it runs through the install until the very end and then just hangs... and the reason it hangs is that the server cannot start. All the logs (PCD etc.) show successful deployment of the applications during the install;
it just never completes the portal install post-processing.
This never happens on NT. What is the deal with Solaris? I would never have believed that an SAP NT product is more stable, but clearly the Solaris one is a POS!!!!
John Ryan
I have checked the deployment logs and the other logs:
Loading service: com.sap.portal.runtime.system.repository|application_repository
Loading service: com.sap.portal.umeregistration|ume_registration
Loading service: com.sap.portal.usermanagement|usermanagement
Loading service: com.sap.portal.usermanagement|user_management_engine
Loading service: com.sap.portal.usermanagement|security
Aug 13, 2004 12:36:10... com.sapportals.config.tools.upgrade.Upgrader [Client_Thread_6] Warning: Upgrading Configuration Archive named "portal" with version "6.0.2.0"
Aug 13, 2004 12:36:20... com.sapportals.config.tools.upgrade.Upgrader [Client_Thread_6] Warning: Configuration upgrade completed
Loading additional applications:
Loading services:
Portal initialization done.
Everything looks fine, but why doesn't the server come up?
The port is open and no other process is using it.
It just does not make sense.
John -
Red saturation problems and post-processing
I see problems with the color red saturating out quickly. This often causes detail to be lost. I have pictures of a red bird where the jpeg produced by the camera is usually better, showing more detail, than anything I can produce from the raw file. I have tried limiting saturation using the hue/saturation tool (PSE12) but that gives unacceptable results as the color gets lost when you reduce saturation.
My best workaround so far is to start with the jpeg file, open it as a raw file, and do the post-processing that way. That has given pictures with more detail than starting with the original raw file. Is there anything else I can try? I find it hard to believe the camera processor does better than PSE12 in these cases.
I think I'm missing something. Now I see the code and DCs inside the DTR. However, when I try to import into NWDS, no DCs come in (the software components exist, but there are no DCs in them). Additionally, the CBS web UI reports that my software components do not contain any DCs, even though I see them in the DTR. What can I look at to determine what I'm missing here?
Thought I'd add some more info: after applying the support packs, we imported the new SAPBUILD, SAP_JTECH, and SAP_jee SCAs into our track, as we required some functionality from the newer build SCA. We also re-imported our old archives into the system by manually checking them in, assuming this would fix the problem of not seeing the source in the NWDS or the DTR. After the import, the CBS no longer sees our custom DCs, but the DTR does (in both the active and inactive workspaces). When importing the dev configuration into the NWDS, our custom DCs no longer appear, but SAP's standard SCAs do.
Message was edited by:
Eric Vota -
File Being processed in two cluster nodes
Hi ,
We have two cluster nodes, and when my adapter picks up a file, the file gets processed on both cluster nodes.
I believe the file should get processed on one of the cluster nodes, but not on both.
Has anyone faced this kind of situation in a project with multiple cluster nodes?
Thanks,
Chandra.
Hi Chandra,
Did you get a chance to see this post? It may help:
Processing in Multiple Cluster Nodes
Regards,
Sandeep -
Post Process Event Handler not getting user's CURRENT_STATE for a UDF field
I have a post-process event handler in OIM R2 BP04 which runs on trusted reconciliation; it compares the user's current ("CURRENT_USER") and new ("NEW_USER_STATE") states and derives business logic from them.
The problem I am facing is that it is not able to get the user's CURRENT_USER state for a UDF (EMAIL_LIST) field: the value comes back as null, which breaks the business logic. The same event handler works on the TEST and QA (4-node cluster) environments but does not work on the PROD environment (4-node cluster).
The one thing done differently here was that the event handler was not present during the initial recon; after the initial load of the users, I manually executed a SQL query which updated the EMAIL_LIST field for all users.
I think that since EMAIL_LIST was not populated during the initial recon but only through the SQL update, the orchestration inter-event data does not contain the email list, and so it comes back as null.
But I am seeing the same behavior for new records as well, which were created and then updated after the event handler was registered.
Please reply, if you have encountered something similar.
Thnx
Akshat
Yes, I need the old state, which is:
Identity[] oldUserStatesIdntArr = (Identity[]) interEventData.get("CURRENT_USER");