DV Audio issue. Two sources (DV deck and ADVC-1394), only one "DV Audio" choice

I recently purchased an ADVC-1394 for a webcasting project. It worked really well with QuickTime Broadcaster and in a web-embedded Flash broadcaster I created. In QuickTime Pro, under Preferences > Recording, it shows up as ADVC-1394, and my DV deck (plugged in via FireWire) shows up as 1394-DV. But in Final Cut Pro 5.1.4 all I get for input audio is DV Audio, which comes from the ADVC-1394 audio inputs. I attempted to create my own capture preset, but the same thing happens. I also re-installed FCP with no luck. When I take the card out everything works fine, but with the card in I get no audio from a FireWire source.
Once the project was over, I decided to wipe the computer to try to solve this problem and some others that started when I upgraded to Leopard. After the re-install all my other problems are solved except this one.
I know the ADVC-1394 is not on the compatible hardware list for FCP, but both Canopus and some other research indicate that the ADVC-1394 uses the same chipset as the ADVC-100, which is on the list.
So my question...
How can I get the ADVC audio to work correctly as a capture source, independent of the FireWire DV deck?
So far my workaround is to plug the RCA audio from the deck into the audio inputs on the card, but I feel this could work better. I have tried the card in a G4 as well, with a similar outcome.

Weird. I will try at work later. Apparently it does support multiple DV source video connections, because I can pick and choose which video I import, just not the audio. And why would QuickTime Pro support this and not Final Cut?!
BTW, my onboard FireWire port is cooked, so I'm using a PCI card for FireWire. Very strange, but I switched the cards and now I get full control of the timecode and I can hear the sound from the FireWire deck, but no video. It seems like the position of the PCI cards on the bridge actually matters.
I never had this kind of problem with my Miglia Alchemy card. Alas, Miglia still doesn't support Leopard. The only reason I'm still using the card is that my deck has a menu that can only be accessed through the RCA ports. Also, I like to play PlayStation when there is nothing to do at work.
Still trying, thanks for the tip.

Similar Messages

  • Problem BW: DTP lost source system mapping and new one created?

    I had a little question concerning BI content activation: when you have to activate objects and there are two source systems, one of them already with a working flow, do you have to check both source systems in the source system selection screen if there are shared objects? I was afraid that if I only checked the one I wanted, the existing one would disappear. I don't want to touch the existing one, as it is a 3.5 flow that has been migrated to 7.0.
    I also have another question, which is a worse scenario and has happened recently! Imagine you originally have one source system in your BW DEV environment (towards QA) with a certain number of DataSources connected to it, as well as some DTPs. A new source system was then added to this BW DEV (towards PROD). Do you think adding this new source system could have an impact on the mapping of the DTPs to the DataSources? In fact, some of our DTPs have been modified and are now pointing to the incorrect DataSource/source system. These DTPs were created manually, as were the transformations, since it was a former 3.5 flow. Do you know a way to bring a DTP back to its previous connection, maybe a table that points to the correct source system, or a function module? RSBKDTP and the RSBK* tables do not store this info (only descriptions), and RSRV doesn't do it either.
    Please let me know if you have had a similar case! Points are waiting and the gratitude of my boss...lol

    Any ideas?

  • Issues on Source Systems ECC and BI

    Dear Friends,
    I'm trying to get data (extractions) from ECC 6.0 QA 110 into BI DEV 100. But I only have a source system connection to ECC 6.0 DEV, which has no data; that connection is working very well. I asked the Basis consultant to create another connection between BI DEV 100 and ECC 6.0 QA 110, using RFC connections with BIREMOTE and SAP_ALL, between ECC and BI. He has already done it, and I can see the entry in SPRO under SAP NetWeaver > SAP BIW > Links to Other Systems > General Connection Settings > Assign Logical System to Client, on both sides, in BI and ECC. The question is: do I need to do any configuration on the BI side to activate (create?) this new source system ECC 6.0 QA 110? I hope I'm clear. Thanks in advance.
    Antony

    Hi Frank,
    I can't imagine any issues compared to other installations. The only thing you need to take care of is that in BI you can only work in one client, while in your ECC system you can have multiple clients to work in. These will be treated as different source systems by your BW.
    regards
    Siggi

  • Merge multiple source tables' date ranges into one target table

    The requirement is to merge multiple source tables (each with its own set of start and end dates) into one target table with a single set of start and end dates, carrying the date-relevant column values from each source table.

    Payment source table
    start date    end date      payment
    1/1/2015      12/31/2015    30
    1/1/2016      12/31/9999    60

    Position source table
    start date    end date      position
    1/1/2015      12/31/2016    10
    1/1/2017      12/31/9999    20

    Target table
    start date    end date      payment    position
    1/1/2015      12/31/2015    30         10
    1/1/2016      12/31/2016    60         10
    1/1/2017      12/31/9999    60         20

    What transformation(s) will be best to use to handle this requirement? Thanks, Lei
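    One way to express the required logic, as a plain SQL sketch (Oracle-style syntax; the table and column names below are illustrative, and both sources are assumed to hold contiguous, gap-free validity ranges as in the example): build the combined timeline from the union of all start dates, then join each source on the interval it covers.

      -- Sketch only: payment_src/position_src and the payment_amt/position_cd columns are assumed names.
      WITH starts AS (
        -- every distinct start date from either source opens a new target interval
        SELECT start_date FROM payment_src
        UNION
        SELECT start_date FROM position_src
      ),
      timeline AS (
        -- each interval runs up to the day before the next start date (or the high date)
        SELECT start_date,
               NVL(LEAD(start_date) OVER (ORDER BY start_date) - 1,
                   DATE '9999-12-31') AS end_date
        FROM   starts
      )
      SELECT t.start_date, t.end_date, p.payment_amt, q.position_cd
      FROM   timeline t
      JOIN   payment_src  p ON t.start_date BETWEEN p.start_date AND p.end_date
      JOIN   position_src q ON t.start_date BETWEEN q.start_date AND q.end_date
      ORDER  BY t.start_date;

    In a graphical ETL tool the same idea roughly maps to a union of the start dates, a lookup or join back to each source on its date range, and the LEAD step done with a sort plus a previous/next-row comparison.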

    Thanks Karen,
    that was exactly what i was hoping for.
    Maybe it could be made easier/less confusing if the Mapping Workbench just made you choose a target table. But maybe this is not useful if the table contains two foreign keys to the same table. Or maybe this should just be put somewhere in the documentation.
    Regards,
    Robert
    Hi Donald,
    Fortunately I'm my own DBA, so I don't have any problems ;-). However, I'm certainly interested in the reasons for not having such a conditional foreign key.
    Actually, the foreign key isn't conditional; the condition is that either the field using the FK must be filled, or a free-format field. The reasoning is that we have a list of known towns, and if the address is local, a town from that list must be chosen. If the address is outside the country, a town can simply be typed in (no list).
    Concerning the aggregate, all fields are always used. There are no neediness flags anywhere. The aggregate contains three fields, which are mapped as direct (two fields) or one-to-one (the FK). All 'parents' contain these three fields.
    Regards,
    Robert

  • HT204053 When I set up my iPhone 5 I used a secondary e-mail address during setup of my Apple ID. I did not remember I already had an Apple ID from my laptop. Now I have two Apple IDs and the one on my phone is the one with an e-mail address that I n

    I need help with Apple IDs. I have two, with two different e-mail addresses. I only want to have one! How do I delete the account on my iPhone 5 and use the other account I have on my laptop?
    Thanks.

    Go to Settings/General/Reset and choose Reset All Content and Settings. This will erase everything on the iPhone and allow you to set it up again as new with the proper Apple ID.

  • Mapping issue: two different IDoc P19 segments to one flat file line

    Hello,
    Scenario: for each P01 segment (without UEPOS) in an order IDoc I want to create one line in a plain text file.
    Problem: one P01 segment has two P19 segments with different qualifiers. I want to map both P19.IDTNR fields (with their different qualifiers) to the same record in my plain text file.
    How can I do this?
    My "if statement" does not work. I check the qualifier in the P19 segment, but when one field in the plain text record is filled with a P19.IDTNR, the second P19.IDTNR field stays empty.
    Please advise.
    Best Regards,
    Erik van Lisdonk

    Erik,
    you should get both values if you change the context of your input fields one level higher (for the IDTNR and for the qualifier that is checked in the if-statement).
    The context needs to be set to P01 for all fields; I assume it is currently set to P19.
    Greetings
    Stefan Hilpp

  • Process chain - Two source system issue

    Hi Experts,
    I face a problem when I transport a process chain from BW DEV to Quality.
    I have two source systems in Quality, ECI and EXS. EXS is a sandbox source system which we no longer use.
    When I transport the process chain from DEV to QA, two DTPs are automatically added to the process chain in Quality, one for each source system (ECI and EXS), whereas I do not want data to be loaded from the EXS client.
    I checked the BW DEV system; it has only one DTP in the process chain.
    Now I want to eliminate the extra DTP in the process chain in Quality.
    Please suggest.
    Regards,
    Srini

    I don't think a DTP for the EXS system gets created automatically when you transport the PC to QA if it doesn't actually exist in DEV.
    Please check your DEV system once again; there might be a DTP from the EXS system there as well. Otherwise, go to the released transport request and check its contents --> if you find two DTPs there, then both exist in the DEV system.
    Anyway, if you want to delete the other DTP from the PC in QA, you can delete it directly from the PC itself: go to change mode of the PC, select the DTP you want to delete --> right-click and click on Remove Process --> then activate the PC.
    Note: you need authorization to edit the PC in QA; only then will you be able to make the changes.

  • Color Balance Issue importing from DV Deck Via Firewire...

    The fact that searching Google and this forum has led me to NO results makes me wonder whether anyone has experienced, or is experiencing, similar issues.
    For some time now I have been having this issue, which I just realized is not normal. I have a Sony DVCAM deck which I use to import via FireWire into Premiere CS5.5. Attached to the Sony deck I also have a small monitor connected to the composite video output.
    The color and white balance look fine on the small monitor; however, on the import via FireWire there is a tinge through it, almost like an incorrect white balance.
    YES, I have calibrated and re-calibrated my computer monitors using a Spyder3, and when viewing photos and other video not imported through the deck the color is great. It is NOT a problem with the deck, because the SAME thing happens with a friend's deck. I am just using the standard FireWire import on the ASUS motherboard, although I do have a Blackmagic Studio2 card. This does not really come into it during FireWire import.
    As I said, now that I've realized it, this has been happening for 2 years or more. I was always color correcting the cameraman's footage thinking his white balance was out, but after attaching an external monitor to the deck during FireWire import, I notice the color is fine on that monitor. I am in PAL land (Australia), but it happens with NTSC footage as well...
    Anyone ever experienced this or know of this issue????
    Any advice would be greatly appreciated...
    Thanks Tony.

    The footage is mainly from the same camera, although occasionally I edit two-camera shoots, so the other camera is a different model, but this also has a different colour balance after being imported. I have used two identical Sony decks, and my latest acquisition is a Sony HDV deck that will naturally play both. I get the same results from all three decks.
    Yes, I have rendered footage with no correction or effects and watched it back on a Sony TV/monitor. The colour is out.
    I suppose the tinge looks exactly like an incorrect colour temperature setting, but I know it is correct, because when I play the deck back directly into a TV through composite it is fine. For example, warm-tone beige curtains in a reception centre look like they have a bad green cast. Skin tones are nowhere near natural, again with a green cast. I am not sure what is going on.
    But yes, it has been there for a couple of years, and it just hit me like a ton of bricks after getting this new deck. On previous decks I never had a monitor connected to the composite output and only monitored through FireWire, so I always assumed the white balance on the cameraman's camera was out. When I set this deck up, I put a monitor on the composite output. Yesterday, as I was importing source footage, I glanced over and it amazed me to see that the composite monitor on the deck had beautiful, rich, accurate colour on it, whilst what was coming in on the PC screen, even after playing it back on the TV/monitor, was horrible.
    So I thought: hell, all these years I have been spending hours colour correcting footage that was OK to start with but lost something in the transfer?? HOW?? I haven't got a clue. It really doesn't make sense and I am still trying different experiments. I am now going to import through the component and S-Video outputs of the deck to see what results I get. I will post that later. Thank you all.

  • Two source systems in RSA1

    Hello All,
    I have the following scenario.
    I have two source systems, ECDCLNT200 and ECDCLNT230, and one BI system, BIDCLNT200. Client 200 is used for configuration and client 230 is used for data loading and validation. First I created source system ECDCLNT230 in RSA1, and it appears under the SAP menu; but when I try to create ECDCLNT200 under the SAP menu, it automatically ends up under the BI menu. I deleted it and tried again, but the same problem remains.
    If I keep the current configuration, meaning ECDCLNT200 under the BI menu, will it create any problem in the future? Or, if I want to create ECDCLNT200 under the SAP menu, what other option do I have?
    I tried both ways: first I right-clicked on the main node and selected Create Source System, and the second time I right-clicked on SAP and selected Create Source System, but I couldn't create it.
    Please let me know the solution.
    Regards,
    Komik Shah

    Thanks Vincent,
    I got the solution through an SAP Note.
    I have assigned the points.
    Thank you again
    Regards,
    Komik Shah

  • Have two Apple IDs and one won't reset its password

    I have two Apple IDs set up, and when one gets locked out I go to the Apple Support site and enter the ID information, but I never receive the confirmation e-mail to reset it. I have checked both registered e-mail addresses and I don't seem to get anything from Apple.
    This is a recurring problem.
    Help please!

    All content and data is forever associated with the Apple ID you used to download it. You should have reset the password for the original ID.
    iforgot.apple.com

  • One target table is loaded from two different sources with the same columns, but one source is in a database and the other is in a flat file.

    Hope you all are doing good.
    I have a business requirement to be implemented in ODI 11g. Here it is: I am trying to load a target table from two sources having the same column names, but one source is in file format and the other is in an Oracle database.
    What I think is that I will create two mappings in the same interface, using a union between the sources. But I am not sure how the interface would connect to two different logical architectures in order to reach the two different sources.
    Thanks,
    SM

    You are on the right track, this can all be done in a single interface. Do the following
    1) Pull your file data model into the source designer and your target table model into the target pane.
    2) Map all the relevant columns
    3) In the source designer create a new dataset and choose the UNION join type (this will create a separate tab in the source designer pane)
    4) Select the new dataset tab in the source designer pane and pull your source oracle table data model into the source designer. Map all the relevant columns to the target
    5) Make sure that your staging location is defined on a relational technology, i.e. in this case the target would be an ideal candidate, as that is where ODI will stage the data from both the file and Oracle sources and perform the UNION before loading the target.
    If you want to look at some pretty screenshots showing the steps above take a look at http://odiexperts.com/11g-oracle-data-integrator-part-611g-union-minus-intersect/
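    For what it's worth, once both datasets are mapped, the interface boils down to a single UNION at the staging location, roughly like the sketch below (the C$_ work-table name and the columns are made up for illustration, not what ODI will actually generate for your models):

      -- file rows are first staged into a C$_ work table by the file LKM,
      -- then combined with the Oracle source and loaded to the target
      INSERT INTO target_table (cust_id, cust_name)
      SELECT cust_id, cust_name FROM c$_file_source      -- staged file data
      UNION
      SELECT cust_id, cust_name FROM src_oracle_table;   -- rows read directly from Oracle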

  • Join two source tables and replicate into a target table with a BLOB

    Hi,
    I am working on an integration to source transaction data from a legacy application to an ESB using GG.
    What I need to do is join two source tables (to de-normalize the area_id) to form the transaction detail, then transform by concatenating the transaction detail fields into a values-only CSV, and replicate it into the target ESB IN_DATA table's BLOB content field.
    Based on what I have researched, a lookup that joins two source tables requires SQLEXEC, which doesn't support BLOBs.
    What alternatives are there and what GG recommend in such use case?
    Any helpful advice is much appreciated.
    thanks,
    Xiaocun
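    Stripped of the GoldenGate mechanics, the de-normalised row being asked for amounts to the join below (purely a sketch; apart from area_id, the table and column names are assumptions):

      -- join the detail row to its area to resolve area_id, then flatten the
      -- detail fields into a values-only CSV string for the BLOB payload
      SELECT t.trans_id,
             t.trans_date || ',' || t.amount || ',' || a.area_name AS csv_detail
      FROM   trans t
      JOIN   area  a ON a.area_id = t.area_id;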

    Xiaocun,
    Not sure what your data looks like, but it's possible that the comma-separated value (CSV) requirement may be solved by something like this in your MAP statement:
    colmap (usedefaults,
    my_blob = @STRCAT (col02, ",", col03, ",", col04))
    Since this is not 1:1 you'll be using a sourcedefs file, which is nice because it will do the datatype conversion for you under the covers (also a nice trick when migrating long raws to blobs). So col02 can be varchar2, col03 a number, and col04 a clob and they'll convert in real-time.
    Mapping two tables to one is simple enough with two MAP statements; the harder challenge is joining operations from separate transactions, because OGG is operation-based and doesn't work on aggregates. It's possible you could end up using a combination of built-in parameters and functions with SQLEXEC and SQL/PL/SQL for more complicated scenarios, all depending on the design of the target table. But you have several scenarios to address.
    For example, is the target table really a history table or are you actually going to delete from it? If just the child is deleted but you don't want to delete the whole row yet, you may want to use NOCOMPRESSDELETES & UPDATEDELETES and COLMAP a new flag column to denote it was deleted. It's likely that the insert on the child may really mean an update to the target (see UPDATEINSERTS).
    If you need to update the LOB by appending or prepending new data then that's going to require some custom work, staging tables and a looping script, or a user exit.
    Some parameters you may want to become familiar with if not already:
    COLS | COLSEXCEPT
    COLMAP
    OVERRIDEDUPS
    INSERTDELETES
    INSERTMISSINGUPDATES
    INSERTUPDATES
    GETDELETES | IGNOREDELETES
    GETINSERTS | IGNOREINSERTS
    GETUPDATES | IGNOREUPDATES
    Good luck,
    -joe

  • There are two CDC methods - Source based CDC and Target based CDC which one

    There are two CDC methods in Data Services, source-based CDC and target-based CDC; which one is better performance-wise?
    If there is any document available which compares both and provides info on performance etc., that would be helpful.
    thank you for the helpful info.
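    As a rough illustration of the difference (made-up table names, just a sketch): target-based CDC has no change log to read, so every run effectively compares the complete source extract against the target table, essentially:

      -- full comparison of the staged extract against the target on every run
      MERGE INTO dw_customer t
      USING stg_customer_extract s
         ON (t.customer_id = s.customer_id)
      WHEN MATCHED THEN
        UPDATE SET t.name = s.name, t.city = s.city
      WHEN NOT MATCHED THEN
        INSERT (customer_id, name, city)
        VALUES (s.customer_id, s.name, s.city);

    Source-based CDC instead reads only the changed rows from the source's logs or timestamps, so the work is proportional to the size of the delta rather than the full table, which is why it generally performs better.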

    Like Suneer mentioned, CDC is for better performance.
    Following link would be helpful.
    http://wiki.sdn.sap.com/wiki/display/BOBJ/Extractors-Source-Based+CDC
    http://wiki.sdn.sap.com/wiki/display/BOBJ/Extractors-Target-Based+CDC
    Thanks,
    Arun

  • 1Z0-052. Two big cert guides and official documentation - why so many contradictions?

    In this discussion I will compare two 1Z0-052 certification preparation guides (one of them is branded "Oracle Press"!!!) with the official Oracle Database 11g Release 2 (11.2) documentation (the exam 1Z0-052 has been validated against this documentation), in an effort to understand why these sources have so many contradictions (direct and indirect).
    Begin with this one :
    1) "(Oracle Press) John Watson - OCA Oracle Database 11g: Administration I Exam Guide (Exam 1Z0-052)", "Chapter 2: Exploring the Database Architecture", page 54 :
    "Note that from release 8i onward, checkpoints do NOT occur on log switch..."
    2) "Oracle Database Concepts 11g Release 2 (11.2) Part Number E25789-01", "13 Oracle Database Instance", "When Oracle Database Initiates Checkpoints" :
    Thread checkpoints OCCUR in the following situations:
    - Consistent database shutdown
    - ALTER SYSTEM CHECKPOINT statement
    - Online redo log switch
    3) "Biju Thomas - Oracle Database 11g Administrator Certified Associate study guide", "Chapter 8 Introducing Oracle Database 11g Components and Architecture", page 413 :
    "Checkpoints OCCUR automatically when an online redo log file is full (a log switch happens)."
    My question is :
    Does checkpoint occur or not on log switch ?
    Many other contradictions follow...

    Hi John,
    it will be my sixth certification in the last 3 years (I have all the major certifications in Java)... But it is the first time I have seen the official documentation of a product not completely match the certification preparation guide (I'm not going to say that your book is wrong; I'm talking about complete compliance).
    I cannot ask my questions in another Oracle DBA forum, because there the responses are most frequently "If you look at the official documentation you will find that...", and now I know that the official documentation sometimes mismatches the exam.
    You (and I) certainly know that the certification exam is multiple-choice... If I find the question:
    When do checkpoints occur (select all valid answers)?
    1) Consistent database shutdown
    2) Issue of "ALTER SYSTEM CHECKPOINT" statement
    5) Online redo log switch
    after your explanation (see your first reply) I now know what I must check and what I must not check, but before your explanation, with only a small knowledge of the database, how could I be sure that your information was correct if the official documentation says the contrary?
    If I find the question:
    What is the checkpoint position ?
    1) is the SCN (the System Change Number) in the redo stream where instance recovery must begin
    2) ...
    3) ...
    If I read only your book, I do not see SCN, I see RBA. How can I know that SCN and RBA are the same thing if neither the Oracle documentation nor your book says so? What other sources or guides must I have?
    And what about the location of the alert_SID.log file?
    What do you suggest about this exam question:
    Which initialization parameter determines the location of the alert log file?
    1) DIAGNOSTIC_DEST
    2) BACKGROUND_DUMP_DEST
    3) ALERT_LOG_DEST
    4) USER_DUMP_DEST
    If I read your book I select "BACKGROUND_DUMP_DEST", because in your book I see:
    "The alert log is a continuous record of critical operations applied to the instance and the database. Its location is determined by the instance parameter BACKGROUND_DUMP_DEST, and its name is alert_SID.log, where SID is the name of the instance.... The directory ADR_HOME/trace (ADR is automatic diagnostic repository) is used as the default value for the instance parameters USER_DUMP_DEST (trace files generated by user sessions) and BACKGROUND_DUMP_DEST (the alert log and trace files generated by background processes)."
    But if I read the official Oracle documentation I select "DIAGNOSTIC_DEST", because I see:
    "The parameter BACKGROUND_DUMP_DEST is ignored by the new diagnosability infrastructure introduced in Oracle Database 11g Release 1 (11.1), which places trace and core files in a location controlled by the DIAGNOSTIC_DEST initialization parameter."
    Therefore, if I read your book I select BACKGROUND_DUMP_DEST, and if I read the documentation I select DIAGNOSTIC_DEST.
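    As an aside (not from either book): on a live 11g instance you can check both parameters, and the actual alert log directory, directly; a quick sketch in plain SQL, where the returned paths will of course differ per system:

      -- in 11g BACKGROUND_DUMP_DEST still exists but is deprecated in favour of DIAGNOSTIC_DEST
      SELECT name, value
      FROM   v$parameter
      WHERE  name IN ('diagnostic_dest', 'background_dump_dest');

      -- the text alert log (alert_SID.log) sits in the ADR trace directory
      SELECT value FROM v$diag_info WHERE name = 'Diag Trace';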
    You must understand me: my only goal is to be as precise as possible, since this precision is required for the exam. I also have no free time to waste: during the week I'm working and preparing for this OCA DBA certification, and during the weekend I'm writing my mathematics book for preschool children (it's also about 700 pages)...
    Excuse me and thank you,
    Dmitri.

  • Any known compatibility issues with disabling onboard sound and using

    I have the K7N2 board and disabled onboard sound in the BIOS to use my Turtle Beach Santa Cruz. Sound seems to work fine in Windows XP and in the few games I have installed so far. My problem is with UT2003. When I try to launch the game I get the splash screen, then it simply goes away. When I checked the system folder and looked at the debug log, it shows the following.
    Exception code: c0000005
    Exception address: 77f51f2c 0001:00000f2c C:\WINDOWS\System32\ntdll.dll
    Call stack:
       77f51f2c 0001:00000f2c C:\WINDOWS\System32\ntdll.dll
       77f5268e 0001:0000168e C:\WINDOWS\System32\ntdll.dll
       77f52631 0001:00001631 C:\WINDOWS\System32\ntdll.dll
       7639131d 0001:0000031d C:\WINDOWS\System32\IMM32.dll
       77f5e52d 0001:0000d52d C:\WINDOWS\System32\ntdll.dll
       77e798cc 0001:000188cc C:\WINDOWS\system32\kernel32.dll
       77e7990f 0001:0001890f C:\WINDOWS\system32\kernel32.dll
       7c00442a 0001:0000342a C:\Program Files\Unreal Tournament 2003\System\MSVCR70.dll
       77e814c7 0001:000204c7 C:\WINDOWS\system32\kernel32.dll
    I am able to run UT in safe mode without sound. I yanked my Santa Cruz card, uninstalled the drivers, and tried with the onboard sound, but still had the same problem. Since disabling the sound allows me to play, it has to be something to do with either the onboard sound, my card, or some sort of incompatibility between the two. Anyone know of any issues? On a side note: if I were to wipe my drive and use the onboard sound, is its quality as good as using a separate card? Any help would be appreciated.

    Just a follow-up... I figured I could spend countless hours trying to figure out why UT wouldn't load, or spend 3 hours wiping the disk and reinstalling software. The latter proved to be worth it. My preference would have been not to install any of the audio drivers at all, but since they are bundled with the system drivers there is really no choice. The one thing I did differently was not to install the live drivers from the disk that came with my board, but rather update to the newer nForce2 system driver dated 2/25/2003.
    So the install went: XP Pro, SP1, video drivers, newest system drivers, disable onboard sound in the BIOS, Santa Cruz drivers, then UT2003, and the game loads fine. I still get the message about the game port not having enough resources, but that can be ironed out.
    Hopefully this information can help someone else out who might experience conflicts going forward.
    A side note: I find that with two network cards (3Com and onboard), XP will spend about three minutes non-responsive after it loads as it looks for the other network connection. Running the network wizard on box 1 (connected to the internet), creating the network disk, and running it on the other machine allows the onboard card to see the network and eliminates the painful lull between XP loading and the time I can do anything.
    Thanks for the timely responses. I'm really enjoying this board. So many options, so little time.
