Stereo 3D Rig: Compositing with S3D-Footage & C4D-Renders

Hello @ all!
I do a research for stereoscopic workflows with the new Stereo 3D Rig in Adobe After Effects CS5.5.
I've searched the web thoroughly and think I've seen just about every resource on stereoscopic workflows that is out there.
I even bought the Video2Brain tutorial hosted by Angie Taylor.
Still, a few questions remain!
1. How can I achieve a horizontal image translation (HIT) with a parallel camera setup (without converging) to adjust my parallax?
Background information: I'm trying to composite some text layers with real footage shot with a parallel setup. I know how to set the footage parallax independently with expressions - that's not the problem!
I don't want the Stereo 3D Rig to converge when adjusting the parallax for my compositing elements (the text-layers).
I tried a mix of the Stereo Scene Depth parameter and the Scene Convergence parameter, and I also tried the expression that avoids moving the convergence plane, but I can't get the result I want. The reason is that I have to decrease the Stereo Scene Depth to about 0.1%, which is too little, but otherwise the deviation between the left and right views is too big. The coordinates of the text layers are fine.
Of course I could just use the Converge Cameras option (as everybody does in all the stereoscopic workflow resources available on the web), but this creates strong keystoning effects and in a way works against the principles of stereoscopy.
Converging is also not recommended by the Adobe Help pages:
You can also use parallel virtual cameras. This technique is useful if you need to match live footage and add digital elements to that scene. Keeping the virtual camera orientations consistent with the cameras used in the footage helps to keep the perspectives of the digital elements and that of the stereo footage aligned.
[…]Real scenes are almost always shot with parallel cameras. Keep this in mind if you are trying to mix and match live footage with digital elements. If your scene consists of only 3D elements in After Effects, then it is probably safe and preferable to use converged cameras.
[Source]
Understanding stereoscopic 3D in After Effects
http://kb2.adobe.com/cps/898/cpsid_89868.html
2. What about the Stereo 3D Rig for compositing purposes with footage rendered out of Cinema 4D?
I'm trying to figure out a stereoscopic workflow in After Effects that lets me make use of the composition project file I exported out of Cinema 4D.
After importing the file I have my left and right cameras and some null objects as placeholders, which are supposed to be replaced by other 3D elements in After Effects (e.g. text layers).
Actually, the cameras already have the right data and could be used, but the Stereo 3D Rig of course creates the left and right eye views automatically based on the Master Cam.
Does anyone have an idea for using the Stereo 3D Rig in a stereoscopic workflow for compositing with data exported out of Cinema 4D?
Or is it better, in this case, to create a structure of compositions that only uses the cameras exported from Cinema 4D?
Any help is appreciated!
Best,
Paul

Hi Paul,
1.
Sounds like you're hitting a limitation of the 3D Glasses effect. The HIT (or Scene Convergence in 3D Glasses) works at the pixel level and does not calculate sub-pixel values.
Here are some things you can try.
Use this expression on the 3D Glasses Scene Convergence parameter to cancel out your Stereo Scene Depth more robustly and allow you to change your scene convergence either in pixels or in % of comp width:
if ( effect("ADBE Stereo 3D Controls")("ADBE Stereo 3D Controls-0006") == 0 ) {
    // rig is not converging: read the Stereo Scene Depth value so it can be cancelled out
    offset = effect("ADBE Stereo 3D Controls")("ADBE Stereo 3D Controls-0003");
    if ( effect("ADBE 3D Glasses2")(5) == 1 ) {
        // 3D Glasses convergence units are set to pixels, so convert the % value to pixels
        offset *= thisComp.width / 200;
    }
    value - offset;
} else {
    value;
}
Now you can switch your 3D glasses units to pixels and change your convergence by 1 pixel at a time, instead of 0.1% of comp width.
If that is not enough control and you need sub-pixel accuracy, you will need to recreate what the 3D Glasses Scene Convergence is doing manually using precomps. Try precomposing your left- and right-eye comps and then hooking an expression up to those precomp layers' X Position: the left comp moves right and the right comp moves left. Put the driving slider right next to 3D Glasses, and now you should be able to adjust the convergence in sub-pixel steps. Another benefit of this method is that it doesn't crop; you will only get a single image at the edges, though. A rough sketch of such an expression follows below.
(This is a bug in 3D Glasses; it should really operate with sub-pixel values, but we ran out of time to do this feature.)
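For example (the slider and layer names here are hypothetical, not part of the rig): add a Slider Control called "HIT (px)" to a null named "HIT Control" in the output comp, apply this expression to the Position of the nested left-eye precomp layer, and use the same expression with the sign of shift flipped on the right-eye precomp layer.
// Position expression on the left-eye precomp layer (use value[0] - shift on the right eye)
shift = thisComp.layer("HIT Control").effect("HIT (px)")("Slider") / 2;
[value[0] + shift, value[1]]; // Position accepts fractional values, so sub-pixel HIT works
Because each eye moves by half the slider value in opposite directions, the total horizontal image translation equals the slider value.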
2.
I'm not as familiar with the C4D workflow. Can you import their stereo cameras directly into AE? If so, you can set your Master Cam up to follow one of their cameras, set the mode to Left or Right hero, and then match their convergence settings. If you don't want to deal with matching settings, you could try plopping their cameras into the left- and right-eye comps above the AE cameras so the rig uses theirs as the active cameras, but then you can't make quick changes to the camera without changing both cameras identically.
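If an expression link is easier than matching by hand, a rough sketch (the layer name "C4D Left Camera" is hypothetical, and only Position is shown; orientation or point of interest would need the same treatment, or simply parent the Master Cam to the imported camera):
// Expression on the rig's Master Cam > Transform > Position
thisComp.layer("C4D Left Camera").transform.position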
If you're exporting left and right footage, you can follow the guidance in the document you mentioned.
Let me know if this helps out or you have further questions.
Amir
After Effects Engineering (the guy that made the stereo rig and who is responsible for your pain)

Similar Messages

  • Premiere Pro - replace After Effects composition with original footage?

    I have a Premiere Pro project where I have replaced some clips with linked After Effects compositions (= doing the grading and stabilization in AE).
    Is there a way of undoing this and easily reverting to the original clips in the place of the linked compositions? I'm thinking of a command that would simply restore the original clips.
    Thanks!

    No such animal, I'm afraid, though it would be nice. You have a couple options:
    Always, always, always be sure to duplicate your sequence and stash it away somewhere in your project before doing the Replace. That way you have something to go back to if you need it. Always.
    Immediately after initiating a Replace (and letting the comp be created in AE with all footage imported), Edit > Undo (Ctrl+Z) the action in Premiere. This will undo the Replace and remove the comp from Premiere, BUT the comp will still be created and all footage placed in it in After Effects. After you do your work in AE, just drag the AE comp into Premiere, and plop it on a track above your original edit. That way, you can easily tweak the original edit in Premiere, and then copy and paste stuff back in AE to your DL'ed comp, if you need to.
    This will have varying degrees of success, but may be your only salvage if it's too late for the first two: you can copy and paste your footage items from your AE comp into a new Premiere Pro sequence, or you can use File > Export > Premiere Pro Project from AE, and then import that into Premiere. Either way, it's going to be a little ugly due to the way that AE and Premiere work differently with their respective timelines (layers vs. tracks), but it can save your bacon when you need it. I've actually used this method to recover hopelessly busted Premiere Pro projects, as well; it takes some rebuilding, but at least you don't lose everything.
    Make a feature request for your initial concept: Adobe - Feature Request/Bug Report Form

  • Need help with compositing DPX red footage and workflow

    Hi guys... sorry to repost this. I've tried in many other forums but haven't gotten a straight answer yet; I'm hoping anyone on this page can give advice.
    Recently, I was asked by a friend to help on a low budget project by doing some roto and paint work, and some simple composites in After Effects. The question I have is regarding the specific workflow when it comes to dealing with Red footage in AE. Currently, the project, which is very small, is getting its fine cut done by the editor, and their post supervisor asked how I would like the plates to be delivered for comping. I requested 4K 10-bit DPX files set to Red Gamma/Red Log Film color space, which I understand to be standard for ingesting.
    The question I have is: what is the best workflow setup for this in AE? Because AE was recently upgraded with CC 2014, there are several new plug-ins that I am not familiar with regarding importing DPX or OpenEXR files, and the precise color space to work in so I can have a "normalized" project to comp in - basically Log to Lin on the way in, and then exporting Lin to Log on the way out.
    This is what I have gathered, and again, I’m not sure if this is correct, so please let me know your thoughts on the best way to work with this:
    Basically I was told to work in sRGB color space in AE in order to work with my VFX comps and elements in the "normalized" sRGB linear colorspace, so go "Log to Lin":
    1. Import the red footage as 10 bit DPX sequence into AE (with a RedLog Film color gamma applied)…the project settings should be set at 32 bit depth and the working space set to sRGB
    2. Right Click on the DPX footage and select “Interpret Footage” select “Main” and then go to the “Color Management” Tab
    3. At the “Assign Profile” Box, select “Universal Camera Film Printing Density”. This should “normalize” the footage in the RGB colorspace of the project, and I should be able to see a Rec709 (sRGB) image instead of flat Log.
    For the Rendered Output from AE for the colorist…..
    Best to render out two different formats, one in Log space for the colorist, and one in a 1080 QT linear space for approvals from the Director
    1. In the AE Render queue, set up two output modules…go to “Output Module Settings” and select “Color Management” tab, then select the “Output Profile”.
    2. Set the profile to “Universal Camera Film Printing Density”, and the file format to DPX…or whichever the colorist requests as the file format (png, Tiff, etc)
    3. The second module can be set to sRGB in QT for approvals
    Does this make sense? Is there a better way? Am I on point? Thanks so much, I really want to have this nailed down before I continue.

    Jason talks about merging clips right at the beginning of the video.
    http://tv.adobe.com/watch/cs-55-production-premium-feature-tour-/adobe-creative-suite-55-production-premium-feature-tour-overview/

  • After effects: creating a stereo 3D rig.

    Hey guys,
    I'm trying to put together two pieces of footage that I shot with a stereo rig. I have imported my footage into After Effects and then created a new layer. After this I go to Layer -> Camera -> Create Stereo 3D Rig.
    Every time I do this I get the following error:
    "After Effects error: internal verification failure, sorry! {unexpected match name searched for in group} (29 :: 0)"
    Does anyone have any suggestions on how to fix this?
    Also, once the error appears and I press "OK" it simply pops back up, and I need to force quit the program in order to get rid of it!
    I am running the latest version of Adobe After Effects CC on a Mac Pro.

    See this:
    3d camera rig error 29::0, how do I resolve this?

  • Understanding compositions with different frame rate

    Hey everyone,
    I am working with AVCHD footage with a frame rate of 25.
    If I drop my footage into a new composition, After Effects sets everything up automatically.
    I'd now like to know what happens to my AVCHD clip if I change the frame rate of my composition from 25 to 24. What exactly happens to the missing frame? Does AE just get rid of it?
    Does it make a difference if I just change my comp settings from 25 to 24, or if I drop my 25 comp into a new one at 24?
    Thinking of it the other way around, if I change my frame rate from 25 to 50, does that mean that AE doubles the frames of my files?
    In general, do the changes have any impact on the speed of my footage?
    I'm looking forward to your reply. Much appreciated.
    Best,
    Alex
    PS: Does this apply to Premiere Pro as well?

    When you create a layer in a composition, and that layer is based on a video footage item, the composition checks at each composition frame to see (sample) the data from the footage item. If there isn't a single frame of data from the footage item that exactly lines up with the frame in the composition, then the composition samples from the footage frames on either side and does one of several things: it can just pick one of the two frames, it can mix them together, or it can do something much smarter and try to reconstruct the image that would have existed between the two frames. What it does depends on what frame blending setting you're using.
    When you're using a footage item that has a higher frame rate than your composition, and both frame rates are relatively high, you don't need to worry about this at all, as the default settings tend to work just fine.
    To sum up, changing the frame rate of the composition just changes how often the composition samples the image data from the footage item. It does nothing to the footage item itself and doesn't affect how fast the video from that footage item plays in a layer.
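    To make the sampling idea concrete, here is a small illustrative sketch (plain JavaScript, not an actual After Effects API) of what nearest-frame sampling looks like for 25 fps footage in a 24 fps comp with no frame blending:
    // Each comp frame grabs the source frame closest in time, so over one second
    // roughly one of the 25 source frames never gets shown.
    var compFps = 24, srcFps = 25;
    for (var f = 0; f < compFps; f++) {
        var t = f / compFps;                   // time of this comp frame, in seconds
        var srcFrame = Math.round(t * srcFps); // nearest 25 fps source frame sampled at that time
        // e.g. comp frame 12 lands at t = 0.5 s and samples source frame 13, so source frame 12 is skipped
    }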
    To interpret a footage item as being at another frame rate, or to conform a footage item to a frame rate, you have to use the Interpret Footage dialog box.
    For more information, see "Frame rate".

  • Why is my Hardware Accelerate Composition, Layer, and Footage Panels check box disabled

    Hello anyone!
    I have all the requirements to meet Adobe's standards for GPU requirements, but this check box (Hardware Accelerate Composition, Layer, and Footage Panels) [Edit > Preferences > Display] is disabled for me. Why? Do I need to worry about this setting? Does it really matter? Any information on this topic would be helpful. Thanks in advance for your feedback.
    GPU Information:
    Fast Draft: Available
    Texture Memory: 924.00 MB
    Ray-tracing: GPU
    OpenGL
    Vendor: NVIDIA Corporation
    Device: GeForce GTX 470/PCIe/SSE2
    Version: 3.0.0
    Total Memory: 1.20 GB
    Shader Model: 4.0 or later
    CUDA
    Driver Version: 4.2
    Devices: 1 (GeForce GTX 470)
    Current Usable Memory: 608.00 MB (at application launch)
    Maximum Usable Memory: 1.25 GB
    Computer: Windows 7 64bit SP1, 24GB MEM, Intel i7
    Feature support levels
    There are three tiers or levels, from lowest to highest requirements, of support:
    Level 1: For OpenGL SwapBuffer:
    This level simply requires a GPU that can do OpenGL 1.5, or greater, with Shader Model 3.0, or greater. Most ATI and NVIDIA cards, and the Intel HD Graphics 3000 chipset (available in the MacBook Air, Mac Mini, various Windows machines, etc.) and 4000 (Windows only at this time) are supported. If your GPU does not support these requirements, software OS blitting like CS5.5 occurs, and there are improvements for software blitting in After Effects CS6, as well.
    Level 2: For Fast Draft previews, Hardware BlitPipe, and Cartoon GPU acceleration:
    Includes Level 1 features. This level requires OpenGL 2.0, or greater (with Shader Model 4.0, or greater, on Windows), and 256 MB, or greater, of texture memory. Most ATI and NVIDIA cards released in the past five years, plus the Intel HD Graphics 3000/4000, support this level.
    If your GPU does not support these requirements, these features will be disabled:
    Fast Draft mode
    The "Hardware Accelerate Composition, Layer, and Footage Panels" preference.
    The Cartoon effect's "Use OpenGL When Available" option (the Cartoon effect then runs on the CPU).
    Level 3: For Ray-traced 3D rendering on the GPU:
    Includes Level 1 & 2 features (for machines with attached monitors). This level requires a supported NVIDIA GPU and 512 MB, or greater, of texture memory. For a current list of supported GPUs, see the Adobe website.

    Well, you should be concerned that it isn't available  - there is still some configuration issue with your graphics card - but you can live without it most of the time.
    Mylenium

  • Hardware Accelerate Composition, Layer, and Footage Panels - optimize

    Hello.
    The AE preference for "Hardware Accelerate Composition, Layer, and Footage Panels" makes my deep compositions workable - for a limited time, and then simple selection tasks slow down substantially, as if I had turned this checkbox off. What can I upgrade to have more "fast time" with After Effects? Thanks for any insight into how to make my system work. Also, this may be my last Mac.... Is anyone out there able to run hundreds or thousands of layers in After Effects on a PC without having simple select tasks slow down?
    • simple selection task = "select all" in a single open composition with 10 layers or with 400 layers, although only in a project which contains deep compositions.
    • This slowdown exhibits with or without caps lock enabled
    • This slowdown exhibits before and after purge all
    • CC, CC 2014 latest update | Mac OS 10.8.5 | MacPro5,1 | 2 x 2.66 GHz 6-Core Intel Xeon | 96 GB 1333 MHz DDR3 ECC | 3 separate Mercury Accelsior > 500 MB/s read/write for System & Apps, Projects & Media and Cache | NVIDIA Quadro 4000 2048 MB | CUDA Driver: latest update | Wacom Intuos 5 | LED Cinema Display | Dell 2407WFP

    You are using two graphics cards of different models? That's probably why. I've really tried to see a difference with that switch and it's not measurable, so don't worry about it!
    - Jonas Hummelstrand
    http://generalspecialist.com/

  • How do I use edge commons composition loader to load multiple compositions with a next and back button?

    I am working on an interactive book and have set up each page as a separate composition in edge.
    I am using  the edge commons JS library to load multiple compositions into a main composition.
    You can see how this works here: Edge Commons - Extension Library for Edge Animate and Edge Reflow | EdgeDocks.com
    The way the Edge Commons tutorial is set up requires a button for each composition I want to load. I am interested in loading multiple compositions with "next" and "back" buttons, and "swipe left"/"swipe right" gestures on the content symbol that each composition is loaded into. I also need the swipe features on the content symbol not to interfere with the interactive elements of the loaded composition.
    Please suggest a solution that will work without adding additional scripts beyond edge commons and jquery.

    Sort of. I'm using this code inside an action for a button symbol. But it doesn't work perfectly. Trying to debug it.
    Let me know if you have any luck.
    //Check to see if pageCounter already exists
    if (typeof EC.pageCounter === 'undefined') {
        // it doesn't exist, so initialize it to the first page
        EC.pageCounter = 2;
    }
    //check if the page number is only 1 digit and pad a leading 0 if so
    if (EC.pageCounter < 10) {
        EC.pageCounterString = "0" + EC.pageCounter;
        //e.g. 01 ... 09
    } else {
        EC.pageCounterString = "" + EC.pageCounter;
    }
    EC.loadComposition(EC.pageCounterString + "/publish/web/" + EC.pageCounterString + ".html", sym.$("container"));
    EC.pageCounter = EC.pageCounter + 1;
    //TODO for the back button: decrement instead of increment
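    A rough sketch of the matching back-button action (assumptions only, reusing EC.loadComposition and the "container" symbol from the snippet above; note that the next-button code increments after loading, so EC.pageCounter points at the page that would load next):
    // Hypothetical "back" action for the previous-page button
    if (typeof EC.pageCounter !== 'undefined' && EC.pageCounter > 3) {
        EC.pageCounter = EC.pageCounter - 2;   // jump back past the page currently on screen
    } else {
        EC.pageCounter = 2;                    // clamp to the first page
    }
    EC.pageCounterString = (EC.pageCounter < 10) ? "0" + EC.pageCounter : "" + EC.pageCounter;
    EC.loadComposition(EC.pageCounterString + "/publish/web/" + EC.pageCounterString + ".html", sym.$("container"));
    EC.pageCounter = EC.pageCounter + 1;       // keep the "points at the next page" convention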

  • Is there any 5.1 sound card with stereo mix that works with a MacBook Air, such as the Creative Labs SoundBlaster X-Fi Surround 5.1 Pro Entertainment System? And if there is another sound card that would work, can you tell me what it is, please? Thank you.


    Okay, I did my best searching and I couldn't find anything that matched all your descriptors. I did do a search on both decoder numbers. SB0256 returned very few results, but with SB0466 I was able to find this eBay listing:
    http://cgi.ebay.com/ws/eBayISAPI.dll...6_fvi%3D&_rdc=
    This card isn't a Champion like you suggested. And with my complete lack of knowledge, I don't know what an I/O drive bay means. But since it is PCI and the decoder matches the number you gave me, this card should be adequate, right?
    Whether it works or not, my search indicated it might be very hard for me to find that card (or any PCI card for that matter). Everything I saw was out of stock. I know this is a forum for Creative, but are there any other manufacturers that will allow me to accomplish my goal, either with current cards, or discontinued cards? Thanks again
    Oh, and I will vote on that thread. I probably won't wait for them to release the decoder since I need it now, but I still believe Creative should be giving the customer what they want

  • Working with 48fps footage in Premiere CC

    Hi there,
    I am currently working on a 24fps project with R3D footage in Premiere.
    Some of the footage for a new sequence I am working on was shot at 48fps (and will need to be exported to 24fps while retaining 48fps' slow motion effect) and I am unsure how to go about this.
    Can Premiere play back in the program monitor at 48fps? Can it interpret/render 48fps footage to export to 24fps while retaining the original attributes of the 48fps footage? Or would I need to use additional software to accomplish this?
    Ideally I would like to be able to edit/playback the footage at the correct frame rate within Premiere as I need to synchronise it with music to see which segments of the clips I have will work best.
    Any advice is greatly appreciated.
    Thanks!

    If I'm understanding you right, your R3D footage has a baseline of 24fps (or 23.976fps) but was overcranked in camera to 48fps, meaning it should play back at half the speed of real time. I don't know much about camera settings, but normally the camera processes this internally and then spits it out at 24fps even though it was actually overcranked for slow motion. The Arri Alexa does this internally; I have worked with R3D slow-mo before and I know it processes it internally as well.
    Whether the Red camera has a setting to turn that off, I do not know. The important part is: is the footage 24fps or 48fps at its BASELINE, and how much excess fps was set to overcrank in the camera settings (i.e. was it set to 'squeeze' 48fps into 24fps)? Normally Premiere Pro CC is good at reading the footage's baseline fps value in the browser.
    Interpreting footage inside Premiere Pro CC is easy: bring your footage into the browser, right-click on it, go to Modify > Interpret Footage, change the fps interpretation in the menu at the top from 48fps to 24fps (or whatever baseline you want), and click OK. Drop the clip onto a timeline matching the rest of your baseline fps and it should be slow-mo. You should see the clip's timecode length double in the browser metadata.
    Hope that helps; let me know how it goes. It would be useful if you could take screen grabs of the metadata so we can see what you're looking at.
    Regards

  • In version 10.1 of FCPX, how do I edit in proxy and then, when I am finished, share with optimized footage? It was easy in the 10.9 version, but I can't figure this out in 10.1. Please help.


    The switch is in the upper right of the viewer.

  • Composite with dependency not working after soa server restart

    Hello,
    I have a composite application that invokes another composite application. After deploying, both work fine. But after restarting the SOA server, the one with the dependency does not work any more. In the SOA server log I get the following:
    <Jul 1, 2010 11:45:29 AM EEST> <Error> <oracle.integration.platform> <SOA-20003> <Unable to register service.
    oracle.fabric.common.FabricException: Error in getting XML input stream: http://Yacico:8001/soa-infra/services/default/validationForCC/getStatusByCC?WSDL: Response: '503: Service Unavailable' for url: 'http://Yacico:8001/soa-infra/services/default/validationForCC/getStatusByCC?WSDL'
    at oracle.fabric.common.metadata.MetadataManagerImpl.getInputStreamFromAbsoluteURL(MetadataManagerImpl.java:276)
    Caused By: java.io.FileNotFoundException: Response: '503: Service Unavailable' for url: 'http://Yacico:8001/soa-infra/services/default/validationForCC/getStatusByCC?WSDL'
    <Jul 1, 2010 11:45:30 AM EEST> <Error> <oracle.integration.platform> <SOA-20020> <Deployment of composite "POProcessing" failed: Unable to find a WSDL that has a definition for service {http://oracle.com/sca/soapservice/POProcessing/POProcessing/receivePO}receivePO and port execute_pt. Please make sure that the port attribute for the binding defined in the composite file is correct by checking the namespace, service name, and port name. In addition, check that the WSDL associated with the binding namespace is imported and currently reachable (check the import nodes at the top of the composite file). Finally, validate the HTTP proxy settings for the server..>
    So POProcessing does not work any more after a server restart. validationForCC still works fine after the restart. The URL http://...validationForCC/getStatusByCC?WSDL points to the WSDL file and is reachable from a browser.
    I use soa suite 11g patch set 2 (11.1.1.3) running on redhat enterprise linux 5.
    Any idea what is the problem?
    Is it somehow possible to configure which composites are started first during server startup?
    regards, Matti
    Edited by: user10197965 on Jul 1, 2010 2:28 AM

    Yes, I did that. I'm not all that happy about this as a solution either, but it's better than making multiple copies.
    We have since found out that this is a known bug and that it is fixed in some, but not all, deployments.
    -------- see below -------------
    Composites With WSDL Dependencies Fail To Deploy Following SOA Server Restart [ID 1272070.1]          
    Modified: Jul 19, 2012   Type: PROBLEM   Status: MODERATED   Priority: 3
    In this Document
         Symptoms
         Cause
         Solution
         References
    This document is being delivered to you via Oracle Support's Rapid Visibility (RaV) process and therefore has not been subject to an independent technical review.
    Applies to:
    Oracle SOA Platform - Version 11.1.1.3.0 and later
    Information in this document applies to any platform.
    Symptoms
    A SOA Project has an external reference to a Web Service or a reference to another Composite.
    When the SOA Server is started, the Composite tries to access the WSDL of its referenced Web Service in order to load data structures. If SOA can not find the WSDL then the Composite fails to load/deploy.
    Once this happens the Composite can not be started or shut down from the Enterprise Manager Fusion Middleware Control application.
    Related Error Messages:
    [ERROR] [SOA-20020] ... Unable to find a WSDL that has a definition for service ... Please make sure that the port attribute for the binding defined in the composite file is correct by checking the namespace, service name, and port name. In addition, check that the WSDL associated with the binding namespace is imported and currently reachable (check the import nodes at the top of the composite file). Finally, validate the HTTP proxy settings for the server.]
    javax.wsdl.WSDLException: WSDLException: faultCode=INVALID_WSDL: Error reading import of oramds
    Cause
    When the SOA Server is restarting, the Composite can not access the WSDL of its referenced Web Service (it is not available).
    In the first scenario there are two Composites on the same server: CompositeA and CompositeB.
    In SOA Suite 11g there is no possibility to specify the load order for the composites. If CompositeA references CompositeB and CompositeA is loaded first, then it can not access the WSDL from CompositeB and the issue occurs.
    In the second situation there is one Composite on the server (CompositeA) which references an external WebService (ExternalWS). If the ExternalWS is not available when the SOA Server is starting then this issue occurs.
    Solution
    Solution 1
    Redeploy the affected Composite into the SOA Server.
    You can do that, but it is not advisable in a production environment, and in a development environment it will take a lot of time to redeploy the composite manually.
    Solution 2
    Copy the abstract WSDL locally into the project.
    This is documented in these articles in our Knowledge base:
    •     Document:1155033.1 Node Restart Cause Composites To Become Unavailable. Response: '503: Service Unavailable'
    •     Document:1151973.1 Boot Order Of The Composites Upon Soa Suite Restart
    Steps:
    o     a. Copy and use the WSDL file in the Project
    o     b. Edit the WebService Adapter
    o     c. Change the "WSDL URL" to point to the WSDL copied into the project
    o     d. Redeploy the Project
    o     e. Make this configuration with all the Projects that have a references with other Web Services
    Solution 3
    Use shared artifacts in Metadata Service (MDS). A WSDL used by more than one composite is a shared artifact by definition. If the WSDL structure is changed (which does not happen frequently in a production environment), you will normally deploy the composite with a new version, because overwriting it would break your production environment. Moreover, in a development environment you will need a proper process in place where different developers access the same artifacts within MDS. In addition, when you use external WSDLs (owned by third parties), you will have a proper process/agreement in place to be notified about modifications affecting your applications.
    1.     In order to use shared artifacts a MDS connection must be configured in jDeveloper:
    http://download.oracle.com/docs/cd/E14571_01/integration.1111/e10224/sca_lifecycle.htm#SOASE85488
    2.     Deploy the shared artifacts:
    o     a. Create a JAR profile and include the artifacts to share
    o     b. Create a SOA bundle that includes the JAR profile
    o     c. Deploy the SOA bundle to the application server
    http://download.oracle.com/docs/cd/E14571_01/integration.1111/e10224/sca_lifecycle.htm#SOASE85472
    If the shared artifacts (WSDLs) needed are from other composites this step can be skipped
    3.     Create a new WebService
    o     a. When completing the "WSDL URL" click on "Find existing WSDL's"
    o     b. Select "Resource Palette"
    o     c. Go to the SOA-MDS
    o     d. Select a WSDL from a Composite or the one deployed at step 2.
    Known Restriction 1
    The port and the location for the WSDL reference are not completed automatically by JDeveloper in the composite.xml file. This information must be inserted manually. This issue is raised in Bug:10287325 and the fix is available for SOA 11g PS2+.
    Known Restriction 2
    Another issue regarding MDS caching is raised in Bug:10218147 - the MDS cache is not refreshed when an artifact is deployed or deleted.
    In order to refresh this cache the server must be restarted. The fix for this Bug resolves the issue, but in a production environment a server restart will remain the preferred option. The reason is that in a production environment you never redeploy single composites under the same version after artifacts (WSDLs) have been modified.
    The fix for Bug:10218147 can be an acceptable solution in a development environment, where redeployment would be quicker than restarting the server.
    Bug:10218147 is available for SOA 11gPS2+.
    To find out more information about how to use Shared Metadata check the following documentation:
    Oracle Fusion Middleware Developer's Guide for Oracle SOA Suite 11g
    41 Deploying SOA Composite Applications
    41.7.3 Deploying and Using Shared Metadata Across SOA Composite Applications in Oracle JDeveloper
    URL:
    http://download.oracle.com/docs/cd/E14571_01/integration.1111/e10224/sca_lifecycle.htm#CACFEAJJ
    Solution 4
    The last solution is to use a UDDI (Universal Description, Discovery and Integration) registry.
    Oracle recommends Oracle Service Registry 11g (OSR). The advantage is that you can use OSR and SOA Suite in a Shared WebLogic Domain.
    http://www.oracle.com/technetwork/middleware/registry/overview/index.html
    http://blogs.oracle.com/governance/2010/05/oracle_service_registry_11gr1.html
    To find out more information about Oracle Service Registry check the following documentation:
    Oracle Fusion Middleware Administrator's Guide for Oracle SOA Suite 11g
    33 Configuring Service and Reference Binding Components
    33.1.3 Changing the Endpoint Reference and Service Key for Oracle Service Registry Integration
    http://download.oracle.com/docs/cd/E15523_01/integration.1111/e10226/bc_config.htm#SOAAG37248
    Oracle Fusion Middleware Developer's Guide for Oracle SOA Suite 11g
    A BPEL Process Activities and Services
    A.4 Publishing and Browsing the Oracle Service Registry
    http://download.oracle.com/docs/cd/E15523_01/integration.1111/e10224/bp_appx_ref.htm#SOASE85561
    Oracle Fusion Middleware Service Registry 11g
    http://download.oracle.com/otndocs/tech/soa/OSR11gR1ProductDocumentation.pdf
    Solution 5
    In case you have BPM Components in the Composites deployed check the following note:
    Document 1317803.1 Soa Suite Composite Fails To Deploy Upon Restart Of Managed Server
    The issue is caused by the Bug:11822470 SOA SUITE COMPOSITE FAILS TO DEPLOY UPON RESTART OF MANAGED SERVER
    References
    BUG:10218147 - WSDL CHANGES NEED SOA SUITE SERVER RESTART
    BUG:10278478 - WHEN SOA SERVER IS RESTARTED, SOME SOA COMPOSITES COULD NOT LOAD/BE DEPLOYED
    BUG:10287325 - ABSTRACT WSDL NOT AVAILABLE WHEN COMPOSITE STARTED
    BUG:10311698 - WHEN SOA SERVER IS RESTARTED, SOME SOA COMPOSITES CAN NOT BE STARTUP / SHUTDOWN
    BUG:11822470 - SOA SUITE COMPOSITE FAILS TO DEPLOY UPON RESTART OF MANAGED SERVER
    @ BUG:9267312 - MDS ARTIFACTS ARE STILL CACHED AFTER DELETING
    @ BUG:9708488 - AFTER SOA RESTART, ALL PROCESSES HAVE TO BE REDEPLOYED
    @ BUG:9749845 - SCHEMA CACHE STARTS EMPTY AFTER RE-START, BUT NOT AFTER DEPLOYMENT
    NOTE:1151973.1 - Boot Order Of The Composites Upon Soa Suite Restart
    NOTE:1155033.1 - Node Restart Cause Composites To Become Unavailable. Response: '503: Service Unavailable'
    NOTE:1317803.1 - Soa Suite Composite Fails To Deploy Upon Restart Of Managed Server
    Bug 11822470 : SOA SUITE COMPOSITE FAILS TO DEPLOY UPON RESTART OF MANAGED SERVER                    
                   Bug Attributes
    Type: B - Defect                              Fixed in Product Version: 11.1.1.6
    Severity: 2 - Severe Loss of Service          Product Version: 11.1.1.4
    Status: 80 - Development to QA/Fix Delivered Internal          Platform: 912 - Microsoft Windows (32-bit)
    Created: Mar 1, 2011                          Platform Version: 2003
    Updated: Oct 12, 2012                         Base Bug: N/A
    Database Version: N/A                         Affects Platforms: Generic
    Product Source: Oracle
    Abstract: SOA SUITE COMPOSITE FAILS TO DEPLOY UPON RESTART OF MANAGED SERVER
    *** 03/01/11 08:07 am ***
    Customer has a project consisting of two services for interaction with the process and one reference to an external service.
    On deployment of the process to an Enterprise Environment, the process works
    as expected. However, upon a restart of the managed server, the process will
    then fail to deploy.
    Workaround used is to redeploy the project again.
    Bug 10278478 : WHEN SOA SERVER IS RESTARTED, SOME SOA COMPOSITES COULD NOT LOAD/BE DEPLOYED                    
                   Bug Attributes
    Type: B - Defect                              Fixed in Product Version:
    Severity: 2 - Severe Loss of Service          Product Version: 11.1.1.3.0
    Status: 92 - Closed, Not a Bug                Platform: 226 - Linux x86-64
    Created: Nov 10, 2010                         Platform Version: RED HAT ENTERPRISE LINUX 5
    Updated: Dec 10, 2010                         Base Bug: N/A
    Database Version: N/A                         Affects Platforms: Generic
    Product Source: Oracle
    Abstract: WHEN SOA SERVER IS RESTARTED, SOME SOA COMPOSITES COULD NOT LOAD/BE DEPLOYED
    Detailed Problem Description
    ====================
    When a SOA composite has dependencies on other SOA composites located on the same server, and the dependency is not yet loaded, the SOA composite will not be loaded, and there is no possibility for the Enterprise Manager console to restart it (EM crashes). The only way to make it work is to redeploy the SOA composite from JDeveloper, which is not acceptable in a production environment.
    The composite that has the issue cannot be started up or shut down, although Enterprise Manager does not show any errors for it.

  • Error importing composite with business rules into SVN

    Hello,
    When I import a composite with business rules into Tortoise SVN I get below error.
    Error: Commit blocked by pre-commit hook (exit code 1) with output:
    Error: Path
    Error: '/trunk/ProjectName_SCA/.rulesdesigner/jaxb_classes/com/ProjectName/package-info.class'
    Error: is restricted for commit by pattern '\.class$' for the current user.
    I could import other composites (without business rules).
    Thanks

    Further investigation bears this problem out.
    Oracle support recommend wrapping the SimpleType in a ComplexType. This does work, but now I have an extra wrapper element to deal with. I either have to use the wrapped type in my other complex, composed Types and/or add an external wrapping element when trying to create Business Services in BPM to call the BusinessRules I've created.
    This is a bit messy.
    To be clear, this does not seem to be an issue with Business Rules; the BR editor and generation of Facts (including simple restricted types -> JAXB 2.0/Java Enumerations) seems to work correctly. There seems to be an issue exposing DFs as Services. The code which generated the WSDL and its supporting types seems to choke on restricted SimpleTypes.
    As a side note, it seems that HumanTasks have a similar limitation
    Edited by: wylderbeast on May 31, 2011 3:27 PM

  • 11G: Error invoking adf-binding service in composite with JAVA API

    Hello,
    I'm trying to invoke an asynchronous composite via the Java API. My composite has two services, a WS and an ADF-BC service; both are wired to a Mediator that connects to two BPEL processes depending on two rules.
    I need to invoke a process depending on the input: if the input is A I invoke process 1, and if the input is B I invoke process 2. I use the Mediator instead of a switch because this is an example for building a more complex decision system later.
    R1 and R2 are two static rules in the Mediator.
    R1 = $in.payload/client:process/client:tiposiniestro = 'A'
    R2 = $in.payload/client:process/client:tiposiniestro = 'B'
    And my XSD is:
    <schema attributeFormDefault="unqualified"
         elementFormDefault="qualified"
         targetNamespace="http://xmlns.oracle.com/POC_jws/ProcesoApertura/ProcesoApertura"
         xmlns="http://www.w3.org/2001/XMLSchema">
    <element name="process">
              <complexType>
                   <sequence>
                        <element name="tiposiniestro" type="string"/>
                   </sequence>
              </complexType>
         </element>
         <element name="processResponse">
              <complexType>
                   <sequence>
                        <element name="result" type="string"/>
                   </sequence>
              </complexType>
         </element>
    </schema>
    If I invoke with SOAP, sending the message:
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
         <soap:Body xmlns:ns1="http://xmlns.oracle.com/POC_jws/ProcesoApertura/ProcesoApertura">
              <ns1:process>
                   <ns1:tiposiniestro>A</ns1:tiposiniestro>
    </ns1:process>
    </soap:Body>
    </soap:Envelope>
    the composite works. But if I try to invoke it with Java, with the code:
    String payload5 = "<process xmlns=\"http://xmlns.oracle.com/POC_jws/ProcesoApertura/ProcesoApertura\">" +
         "     <tiposiniestro>A</tiposiniestro>" +
         "</process>";
    String conversationId = UUID.randomUUID().toString();
    NormalizedMessage nm = new NormalizedMessageImpl();
    nm.addProperty(NormalizedMessage.PROPERTY_CONVERSATION_ID, conversationId);
    Map<String,Object> payload = nm.getPayload();
    payload.put("payload", pPayLoad);
    nm.setPayload(payload);
    service.post(pOperation, nm);
    I get the following error:
    Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.w3c.dom.Element
    How can I send the message via the Java API using the Mediator in my composite?
    Is there any important difference in the message format between sending a request with SOAP and with Java?
    Thanks in advance.
    Edited by: user7239022 on 05-ene-2010 7:38
    Edited by: user7239022 on 05-ene-2010 7:40
    Edited by: user7239022 on 05-ene-2010 7:49
    Edited by: user7239022 on 05-ene-2010 8:41

    Talking about the dynamic rules in the Mediator, Oracle documentation says:
    "As of now, only SOAP bindings are supported. There is a dummy SOAP binding in the composite.xml file. This endpoint is overridden by Mediator in runtime through NM property. So, outbound services can be called only over SOAP."
    Is it valid for static rules as well? I mean, how can I use the Java API to invoke a composite with a Mediator as the first element? I always get the message:
    +"java.lang.ClassCastException: java.lang.String cannot be cast to org.w3c.dom.Element"+
    Thanks again

  • Trouble with P2 Footage QT Exports

    Hi,
    My P2 footage looks fine in the timeline but when I export QT's (w/ "current settings" marked) the aspect ratio changes and they look squished. The QT info pane states that the video is in fact 1080, matching the clip info and sequence settings. I don't understand what's happening.
    One note: I thought the DP was going to shoot 720, so one thing I'm wondering is if I somehow made a mistake when I brought in the footage, as it was the first time I had logged and transferred P2 footage (I'm used to working with XDCam EX footage where I've never had a problem like this).
    Can anyone help me troubleshoot this?
    Thank you in advance.
    -Cherry

    Thank you for the advice, but I don't see any settings to change in QuickTime Player 10, and now the same thing has happened to my client, who of course also defaults to QuickTime Player 10. Thankfully she has QuickTime Player 7 as a back-up, which plays the P2 footage QuickTime at the correct 16x9 aspect ratio.
    I've seen several references online to this problem with P2 footage - but no solution other than to use another media player. Apparently the embedded 16x9 flag/value is somehow lost during QT export with P2 footage. When this happens, the file is assumed to be in a 4:3 aspect ratio. I've pored through the settings in FCP and don't see anything that I could change to help this.
    Just wondering if anyone knows of a fix - either in QT or in FCP!
    Thanks in advance!
