Question about 24p capture

I have a friend who shot with a 24p DV cam, and I was wondering: do I need to use his 24p DV cam to capture, or can I capture using my Panasonic cam (which is not 24p) in Final Cut Pro? I realize I have to set my capture presets for 24p, but I don't have a regular DV deck.

Sorry, Tim, but that brief article you read is not what I would call research at all. It tells you in a very cursory fashion what to do to capture video shot at 24p Advanced. What it does not tell you is anything about what 24p is, how it's recorded, the various pulldown schemes, what a pulldown does, why it's needed, and why or when you should remove it. You need to know that 24p is actually vanilla DV NTSC with a pulldown, because the NTSC standard requires video to play at 29.97 fps, no ifs, ands, or buts about it. How you remove the pulldown to get back to 23.976 fps depends on which flavor of 24p you shot, and quite honestly, for years now we've been reading posts by people who have no clue which flavor they shot, or who "accidentally" shot both and need to combine them in the same sequence. The destination can be the same, but the routes taken to get there are not.
Adam Wilt's article and Graeme's article are far superior to that Ripple Training blurb.
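
To put numbers on that, here is a small back-of-the-envelope sketch (plain Java, just arithmetic) of why 24p recorded to DV NTSC needs a pulldown at all: the NTSC rate is exactly 5/4 of the film-style rate, so every four progressive frames have to be spread across five recorded frames.

    public class PulldownMath {
        public static void main(String[] args) {
            // NTSC-family rates are exact integer ratios, not the rounded 23.98 / 29.97.
            double filmRate = 24000.0 / 1001.0; // ~23.976 fps ("24p")
            double ntscRate = 30000.0 / 1001.0; // ~29.97 fps (DV NTSC)

            System.out.printf("24p rate:  %.5f fps%n", filmRate);
            System.out.printf("NTSC rate: %.5f fps%n", ntscRate);

            // The ratio is exactly 5:4, which is why a pulldown maps every
            // 4 progressive frames onto 5 recorded NTSC frames (10 fields).
            System.out.printf("NTSC / 24p ratio: %.4f (= 5/4)%n", ntscRate / filmRate);
        }
    }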

Similar Messages

  • From USB to IEEE-1394: question about HD capture, editing, sharing.

    I’m new to editing video but have a question about capturing video into Premiere Elements 7. I have been using a Canon HF11, which shoots only in HD and only has USB output to copy video to my desktop. I don’t have a Blu-ray burner yet, but I have burned a few projects to DVD, shared video on YouTube, and saved files in .avi and .mpeg to view on computer screens. Obviously, though, I’m missing out on sharing HD on disc at this time. I just bought a Canon XH A1s, which shoots both HD and SD on miniDV tapes and uses an IEEE-1394 cable for capturing (to PE7). I did a 3-minute test project in SD and burned it to a DVD. I was very pleased with the results and the ease of editing. I don’t have any HD tapes yet, but when I do, what differences will I experience in shooting HD and then capturing, editing, and sharing? My current HD process with the HF11 is slow, with occasional low-resource warnings and less than pleasing results on DVD discs. Thanks for any and all comments and/or suggestions.

    Your HF11 is AVCHD format and the Canon XH A1 is HDV MPEG2. AVCHD takes a lot more computer horsepower to edit than HDV MPEG2... the main reason being that AVCHD is more highly compressed, giving smaller file sizes. This is also why you get the low-resource and memory alarms while editing.
    To shoot HD you do not need HD miniDV tapes; you can use standard miniDV tapes. The HD tapes are supposed to give fewer drop-outs than the standard tapes, but they are considerably more expensive. With DV-AVI a drop-out would hardly be noticed, but because HDV MPEG2 is a compressed format and the picture relies on information spread across a number of frames, any drop-out can give a second or so of bad video. That said, I have had very few issues with drop-outs. If you are doing a wedding or something important that you do not want to risk, it is probably better to use an HD miniDV tape.
    In the past the advice has been to shoot in high definition to get the best quality, then downconvert in the camcorder and capture in DV-AVI... the reason being that older versions of Premiere Elements did not do a good job of the down-conversion. Reports on PE7 indicate it is better and that you could use an HDV workflow and burn to DVD as the last stage. This also allows you to export a high-definition version of your video to view on a monitor where you can appreciate the higher resolution. Or, in the future when you have a Blu-ray burner and player, you can burn the file to disc... saving you from doing all the editing over again.
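
    To put rough numbers on the file sizes involved, here is a small back-of-the-envelope sketch using nominal video bitrates (DV and HDV are both about 25 Mbit/s; AVCHD at its highest common setting is around 24 Mbit/s). These are ballpark figures only, with audio and container overhead ignored; the point is that AVCHD fits a much bigger picture into roughly the same number of bits, which is exactly why it is harder to decode.

        public class FootprintEstimate {
            // Rough per-hour storage for a given nominal video bitrate (audio/overhead ignored).
            static double gbPerHour(double megabitsPerSecond) {
                return megabitsPerSecond / 8.0 * 3600.0 / 1024.0; // Mbit/s -> MB/s -> MB/hour -> GB/hour
            }

            public static void main(String[] args) {
                System.out.printf("DV    (~25 Mbit/s): %.1f GB/hour%n", gbPerHour(25));
                System.out.printf("HDV   (~25 Mbit/s): %.1f GB/hour%n", gbPerHour(25));
                System.out.printf("AVCHD (~24 Mbit/s): %.1f GB/hour%n", gbPerHour(24));
                // Similar footprints on disk, but AVCHD (H.264) carries far more picture
                // per bit than DV or HDV MPEG2, which is what makes it CPU-hungry to edit.
            }
        }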

  • This sounds like a silly question about podcast capture.

    To be honest, I'm not a native English speaker and I don't know much about computers. I got my new MacBook yesterday, and the whole point of getting it is to make podcasts. While I was in Account Setup, it asked me to enter a Podcast Producer server to connect to. What should I put there?
    Many thanks.
    Theresa

    I have the same question!

  • Question About HDV Capture / Timeline

    Hello,
    I am capturing HDV from an HV40. Whenever I bring a clip into the timeline, the audio is not a stereo-linked pair; it comes in as two individual mono tracks. When I make it stereo-linked, I then have to set the pan back to zero. That's a lot of extra steps just to control audio volume. How do I make it stereo-linked at capture so I don't have to do all of this afterwards?
    My capture settings are:
    Sequence Preset: HDV1080i60
    Capture Preset: HDV
    Device Control Preset: HDV FireWire Basic
    Thank you for your help.

    Stereo linking is selected in the clip settings in the capture window.
    You do not have to reset the pan to zero. -1 is correct for a stereo pair; 0 is not.

  • HD 24p capture

    I just asked a simple question about how I could capture 24p footage, and the only response I get is a question asking which camera it was shot on!
    Is that all you experts have got?

    A1lens wrote:
    My simple question was that I have some HD footage shot at 24p, but I don't have the camera (Sony HVR-S270) with me anymore... Sony's M25U doesn't play it... is there any other way I can capture it?
    Part of the issue is that you did not mention that in the other thread http://discussions.apple.com/message.jspa?messageID=7446514#7446514
    Even when you mentioned the camera, the fact that you no longer have it changes things, as you noted. I have not shot with that camera, but I am sure others here may have solutions.
    Though it sometimes takes a bit of time to get answers, or replies to answers, people do get around to it. Remember, many of the people here are professionals who do a lot of good simply by being the type who help others out, but sometimes work, life and other things get in the way.
    No one is being prideful; just remember that when people ask questions, it is to try to help. And the more details you provide about set-ups and workflow, the easier it is to narrow down the questions ahead of time.

  • Help with 24p capture/sequence settings and interlace artifacts

    We are trying to capture SD video shot on a Panasonic DVX100B at 24p (not Advanced) using FCP 6.0.2.
    The video was shot anamorphic (squeeze) through the camera.
    We have tried a variety of capture and sequence settings and continue to get varying degrees of artifacts (lines) at points of motion (e.g. when a hand moves).
    We have tried:
    Capture Easy set-up 29.97
    Capture Anamorphic 29.97
    We have tried changing field dominance to none.
    We have experimented with both interlace and progressive scan.
    We have experimented with different playback units, including the camera it was shot with.
    Would love some advice on the best capture/sequence settings. Playback will be computer/video projector.

    It sounds like you've been misled about what 24p (as opposed to 24p Advanced) actually is on tape - it's interlaced 29.97 footage. So what you're seeing is normal - there should be three progressive frames followed by two interlaced frames (since it's pulldown added to 23.98 material).
    So, to get to the heart of your question, you are capturing correctly - Anamorphic 29.97.
    If you want true progressive output, strip the pulldown - use Cinema Tools - and edit as 23.98. Otherwise, if you can't go 23.98, you're stuck with what you've got.
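
    To see where those interlaced-looking frames come from, here is a small illustrative sketch of the standard 2:3 pulldown cadence (plain Java; the A-D labels are just for illustration). Four progressive source frames are spread across ten fields, and when the fields are paired back up into 29.97 video frames, two of the five end up carrying fields from two different source frames - those are the ones that show combing on motion.

        public class PulldownCadence {
            public static void main(String[] args) {
                // Standard 2:3 pulldown: source frames A, B, C, D contribute 2, 3, 2, 3 fields.
                String[] source = { "A", "B", "C", "D" };
                int[] fieldsPer = { 2, 3, 2, 3 };

                // Build the field sequence: A A B B B C C D D D
                StringBuilder fields = new StringBuilder();
                for (int i = 0; i < source.length; i++) {
                    for (int j = 0; j < fieldsPer[i]; j++) {
                        fields.append(source[i]);
                    }
                }

                // Pair consecutive fields into NTSC frames and flag the mixed ones.
                for (int f = 0; f < fields.length(); f += 2) {
                    char top = fields.charAt(f);
                    char bottom = fields.charAt(f + 1);
                    String note = (top == bottom) ? "progressive" : "interlaced (fields from two source frames)";
                    System.out.printf("video frame %d: %c/%c -> %s%n", f / 2 + 1, top, bottom, note);
                }
            }
        }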

  • Questions about editing with io HD or Kona 3 cards

    My production company is switching from Avid to Final Cut Pro. I have a few editing-system questions (not about ingesting and outputting, just about systems for the actual editors - we will have Mac Pros with either a Kona 3 or an Io HD for ingest and output).
    1) Our editors work from home, so they will most likely be using MacBook Pros - Intel Core 2 Duo 2.6GHz, 4GB machines with eSATA drives - to work on uncompressed HD. Will they be able to work more quickly in FCP if they are using the new Mac Pro 8-core (2 quad-core 2.8GHz Intel Xeon), or will the MacBook Pros be able to hold their own editing hour-long documentaries in uncompressed HD?
    2) Will having an AJA Kona 3 (if we get the editors Mac Pros) or an Io HD (for the MacBook Pros) connected be a significant help to the editors and their process? Will it speed up their work? Will it allow them to edit sequences without having to render clips of different formats? Or will they be just as well off editing without the Io HD?
    I'm just trying to get a better understanding of the necessity of the AJA hardware in terms of helping the editors do what they have to do with projects that have been shot on many formats - DVCPRO tapes, Aiptek cameras that create QuickTime files, and P2 footage.
    Thanks

    1. With the Io HD, laptops become OK for working with ProRes and simple eSATA setups. Without the Io, they can't view externally on a video monitor (a must in my book). It will not speed up rendering a ton, nor will it save renders of mixed formats. The idea is to get all source footage to ProRes with the Io; the Io then also relieves the CPU from having to convert ProRes into something you can monitor externally on a video monitor, and it can record back to any tape format you want... all in real time.
    2. A Kona 3 in a tower would run circles around render times on a laptop... no matter what the codec, but the Kona does not really speed renders up. That's a function of the CPU and how fast it is (more CPUs at faster speeds will speed up render times).
    I'd recommend you capture to ProRes with the Io's or the Kona 3 and don't work in uncompressed HD. You gain nothing quality-wise doing it at all, and you only use up a ton of disk space (6 times the size, in fact) capturing and working in uncompressed HD, which, from your post, you're not shooting anyway. The lovely thing about ProRes is that it's visually lossless, efficient, and speeds up the editing process. Mixing formats can be done, but it's better to go to ProRes for all source footage and edit that way.
    With either the Kona or the Io, you can then output to uncompressed HD tape... that's what they do for you no matter what codec you've edited in. ProRes is designed to be the codec of choice for all HD projects, especially when you're shooting different formats... Get them all singing the same tune in your editing stations and you'll be a much happier camper. The only reason to buy laptops is portability... otherwise you're much better off with towers and the Kona 3, speed-wise.
    Jerry
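
    A rough sketch of the disk-space point, assuming nominal data rates (uncompressed 10-bit 4:2:2 1080i at very roughly 1.3 Gbit/s versus ProRes 422 HQ at about 220 Mbit/s); actual rates vary with format and content, but the ratio lands right around the "6 times" figure mentioned above.

        public class DataRateCompare {
            public static void main(String[] args) {
                // Nominal data rates in Mbit/s (assumed round numbers, not measured values).
                double uncompressedHd = 1300.0; // ~10-bit 4:2:2 1080i/29.97
                double proRes422Hq = 220.0;     // ProRes 422 HQ at 1080i/29.97

                double uncompressedGbPerHour = uncompressedHd / 8.0 * 3600.0 / 1024.0;
                double proResGbPerHour = proRes422Hq / 8.0 * 3600.0 / 1024.0;

                System.out.printf("Uncompressed HD: ~%.0f GB/hour%n", uncompressedGbPerHour);
                System.out.printf("ProRes 422 HQ:   ~%.0f GB/hour%n", proResGbPerHour);
                System.out.printf("Uncompressed is roughly %.1fx the size of ProRes 422 HQ%n",
                        uncompressedHd / proRes422Hq);
            }
        }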

  • A Question about RAW and Previews

    I have just recently started shooting in RAW (mostly for the post-production editing abilities - I am an avid amateur photographer bent on learning as much as I can). I set my camera to capture in RAW + L. I don't know why I feel like I want it to capture both the RAW and JPEG files, which leads me to my first question: is it necessary to have the camera capture both the RAW and the Large JPEG? I am assuming the answer is no, since after importing the RAW file into Aperture you could always export a JPEG if you wanted one? So there is no need to fill up your internal storage (if using managed masters) with the extra JPEG? Is this thinking correct?
    Next, if you do import RAW-only files and then want to export certain images, do you have a choice to export the original RAW image? It seems that it only allows you to export a JPEG Original Size. To answer my own question, perhaps you have to export the Master in order to export the full RAW file when exporting? If you want to export a JPEG, you have to export not the Master, but a version of the Master? Is this correct?
    Lastly, I wanted to ask a question about Previews. I have my preferences set so that previews have the highest quality with no limits to size. What is the significance of setting it this way? I just assumed that if I wanted to share an image at the highest quality without exporting it, this was the way to go. Is there any validity to this? The reason I ask is that I don't want to have all of these high quality previews taking up internal disk space if I really don't need to. Is there a way to change the preview size once previews are created? Meaning, if you have it set to generate low quality previews, can you change it dynamically to high and vice versa?
    I know this is a lot in one post. Thanks for tackling it.
    Mac

    You can change the quality of the Previews in the Preferences -> Previews tab.
    You can regenerate Previews with the Delete and Update Previews under the Images menu.
    Regards
    TD

  • Questions about SRM PO in Classic scenario

    Hello All
    I have a number of questions about the SRM PO in the classic scenario.
    1) If the backend PO is changed in ECC, i.e. if any quantity is added, can we have an approval workflow for the change?
    We currently have release strategies for other POs in ECC. How do we accommodate the PO changes only?
    Our requirement is not to have an approval initially when the PO is created, but only for the changes.
    2) If the PO is sent as XML to the vendor, is it possible to capture the PO response in ECC? What are the prerequisites for this to happen? Is SAP XI required for this?
    3) In case the PO is cancelled/reduced, does the balance go back to the SRM sourcing cockpit?
    We are using SRM 7.0
    Regards
    Kedar

    Hi,
    1) If the backend PO is changed in ECC, i.e. if any quantity is added, can we have an approval workflow for the change? We currently have release strategies for other POs in ECC. How do we accommodate the PO changes only? Our requirement is not to have an approval initially when the PO is created, but only for the changes.
    Sol: In ECC 6.0, if the PO is changed and a release strategy exists in ECC 6.0, then it follows the ECC 6.0 approval route.
    2) If the PO is sent as XML to the vendor, is it possible to capture the PO response in ECC? What are the prerequisites for this to happen? Is SAP XI required for this?
    Sol: XI is mandatory.
    3) In case the PO is cancelled/reduced, does the balance go back to the SRM sourcing cockpit?
    Sol: Once a PO is created in ECC 6.0 for the PR in the sourcing cockpit, cancelling/reducing it will not update the sourcing cockpit in SRM.
    E.g. a PR for 100 nos is in the SRM sourcing cockpit, for which you have created a PO for 40 nos in ECC 6.0; for the remaining 60 nos of the PR, you can create another PO in ECC 6.0.
    Regards
    Ganesh

  • Question about "Native ISO" and Color Grading in PP

    I have a question about "Native ISO" in the real world and how it relates to color grading.  I was shooting 35mm film before all these digital cameras became flat-out amazing practically overnight.  Then the goal was always to shoot with the lowest ISO possible to achieve the least amount of grain (unless you were making an artistic decision to get that look).  If I was shooting outside plus had a nice lighting package I'd shoot 5201/50 ASA (Daylight) and 5212/100 ASA (Tungsten) 99 times out of 100.
    I've recently been shooting a lot with the Blackmagic 4K and have read that its "Native ISO" is 400.  Because of my film background this seems counter-intuitive.  Yesterday I was shooting for a client and had the camera at f16 with ISO 200.  Because of what I'd read, I was tempted to stop down to f22 and change my ISO to 400... but the "little film voice in my head" just wouldn't let me do it.  It kept telling me, "Higher ISO means more noise... stay at 200 and you will get a cleaner image."
    So how does it work with "Native ISO"?  Should I really shoot at ISO 400 every chance I get in order to capture the best image for how the camera is calibrated?  Will it really give me more latitude when color grading?  Or would I still get a cleaner image staying at ISO 200?  I've Googled around quite a bit, but haven't found any articles that answer this specific question.  Would love to hear from someone who knows a bit more on the subject or has a link that could point me in the right direction.
    Thanks much.
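
    As an aside on the exposure bookkeeping in that f16/ISO 200 versus f22/ISO 400 choice: closing down one stop halves the light, and doubling the ISO adds one stop of gain back, so the two settings meter out the same. Here is a small sketch of that arithmetic (illustrative values only; it says nothing about which setting the sensor itself prefers, which is the real question here).

        public class ExposureStops {
            // Stops of light admitted relative to f/1.0 (each doubling of the f-number costs two stops).
            static double apertureStops(double fNumber) {
                return -2.0 * (Math.log(fNumber) / Math.log(2.0));
            }

            // Stops of gain relative to an arbitrary base ISO.
            static double isoStops(double iso, double baseIso) {
                return Math.log(iso / baseIso) / Math.log(2.0);
            }

            public static void main(String[] args) {
                double a = apertureStops(16) + isoStops(200, 100); // f16 @ ISO 200
                double b = apertureStops(22) + isoStops(400, 100); // f22 @ ISO 400

                System.out.printf("f16 @ ISO 200: %.2f stops%n", a);
                System.out.printf("f22 @ ISO 400: %.2f stops%n", b);
                // The small residual is only because marked f-numbers are rounded (f22 is really ~f/22.6).
                System.out.printf("difference:    %.2f stops (effectively the same exposure)%n", a - b);
            }
        }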

    Hey, shooter ... yea, interesting discussion and always nice to learn. Great pic, too!
    jamesp2 ...
    Great answer. I've followed quite a bit of the discussion about the BM cams as well, one does feel a need to check out the possibilities for that next beastie one will need to acquire. But ... which one?
    I've always been a bit of a hard-case about testing, testing, testing. For instance, what happens with the dome down, or using a flat diffuser versus the dome in the up position, when metering? Back in the film days, we had our own lab and did our own printing, as well as the um ... difficult images ... from other studios. I needed to know how to get exactly the same diffuse highlight no matter whether it was a "standard" 3:1 studio lighting shot or a near-profile with no fill that needs dark shadows. I tested and burned through boxes of medium-format Polaroid and 120 film and a lot of color paper. The finding? To get the same print time no matter the contrast or lighting style, it needed to be metered either with the flat disc (Minolta) or dome-down (Sekonic), held at the highlight location and pointed at the main light source. I could meter and nail the exposure every time. Ahh no, insist so many ... one must have the dome on/up and pointed at the camera! Right. Do that, change the contrast, and see what happens to your diffuse forehead highlight on a densitometer ... and see how your printing exposure times change. Oh, and you've just moved your center-of-exposure up or down on the film's H&D curve, which will also change the way the shadows & highlights print. In truth, though it was subtle, we realistically had no more latitude for a best-case image with pro neg film than one had with chromes. You could probably get away with being "off" more easily, but it still wasn't dead-on.
    So wading into video ... oi vey, you may have noticed the things claimed here there & everywhere ... this setting is God's Gift to Humanity but no, it's total crap ... this sensor is totally flawed but someone else is certain it's the finest piece out there. Yes, opinions will be all over ... but ... in film, it was the densitometer. In video, it's the scopes. Truth. And getting to that can be a right pain. I've seen quite a few contradictory comments about using the BM cams in film mode and also at ISO 200. Yours above gives the most ... comforting? ... explanation (for me) because of your reference to your scopes & the waveform patterns. Thank you.
    Love to learn ...
    Neil

  • Questions about the content of download meeting recording .zip file

    I tried posting this on the resurrected Connect forum, but my Adobe ID wasn't recognized there....
    Concerning the files that are included in the .zip file of the meeting recording that can be downloaded:
    1) Is there any documentation describing the files and their contents (i.e. what each file represents, and what each XML element and attribute in those files represents)?
    2) Are there any files that capture mouse movement on a shared desktop?
    Thank you!

    Hi Sean,
    Regarding your first post:
    Thanks Jorma! I don't have access to an FMS build at the moment but I'm quite certain it's there. As for contacting Jaydeep, I am 90% sure he authorized us to broadcast his email on here if folks had questions about the tool, but, in the case that I'm wrong and he didn't - I'm going to double-check first.
    Regarding your most recent post..
    "To be clear, the most critical goal I'm trying to accomplish is to create an automated process that will download the recording meeting at its highest quality in a consistent and reliable manner".
    I personally believe this is possible; unfortunately, I haven't seen it done yet. If your recording contains:
    - audio
    - a camera feed
    - screensharing
    Then I think you might be able to get this going. If it contains shared content, like a shared PPT, this gets trickier.
    "To do this, of course, I have to reproduce some of the functionality that Connect provides, starting and combining video and audio streams according to the instructions in the control files."
    Exactly right. If your recording didn't contain shared content, then all you've got on your hands are a bunch of audio/video files that you could edit together as you wanted with your favourite video editing tool. If it contains shared content, here's (at a high level) what's happening.
    For shared PPTs or FTContent files:
    First (for version 9 recordings only), Connect reads the information on the shared content's location and SCO within mainstream and indexstream and validates it before loading it. I don't recall this happening to the same extent with version 8 or earlier, but maybe it did. Now, if the content is validated (i.e. Connect can find it), the share pod will display as black; if it isn't, you get an empty pod with a message like "No content is being shared" or something like that.
    Connect then looks at the actual FTContent file and loads the content that is to be shared using the file path and SCO ID listed in there. It's important to note that the SCO ID and file path in there will likely not be the same as those of the original file you uploaded to your room; it's a new SCO ID (I believe SCOs of this type are called referenced SCOs) and a new path.
    Now... if I were going to build some sort of player that would play all these files on one screen to make a recording... I might not want to use Connect's code here. If you know the file path to the shared content (from FTContent), you could easily view it with the content URL (conveniently also in FTContent). I'm not a coder, but I'm envisioning something like Presenter's GUI, where you've got the presentation's content in the main area and a video file (if there is one) playing back on the side.
    Anyways, food for thought if you want to try to go about this. Connect recordings are incredibly complex and they come with a big learning curve, but if you can make sense of them the knowledge is quite valuable.
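
    There is no official schema documentation that I know of, so the practical approach is to poke at the archive yourself. Here is a minimal sketch (plain Java; "recording.zip" is a placeholder for whatever your downloaded archive is called) that lists the entries in the .zip and prints the root element of each XML file, which is usually enough to spot the indexstream/FTContent-style control files discussed above.

        import java.util.Enumeration;
        import java.util.zip.ZipEntry;
        import java.util.zip.ZipFile;
        import javax.xml.parsers.DocumentBuilderFactory;

        public class InspectRecordingZip {
            public static void main(String[] args) throws Exception {
                DocumentBuilderFactory xmlFactory = DocumentBuilderFactory.newInstance();
                // Path to the downloaded meeting recording archive (adjust to your own file).
                try (ZipFile zip = new ZipFile("recording.zip")) {
                    Enumeration<? extends ZipEntry> entries = zip.entries();
                    while (entries.hasMoreElements()) {
                        ZipEntry entry = entries.nextElement();
                        System.out.print(entry.getName() + "  (" + entry.getSize() + " bytes)");
                        if (entry.getName().toLowerCase().endsWith(".xml")) {
                            // Peek at the root element so the control files are easy to identify.
                            String root = xmlFactory.newDocumentBuilder()
                                    .parse(zip.getInputStream(entry))
                                    .getDocumentElement().getNodeName();
                            System.out.print("  root: <" + root + ">");
                        }
                        System.out.println();
                    }
                }
            }
        }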

  • Questions about ActiveX Bridge and SWT

    Hi!
    A few weeks ago I found the www.reallyusefulcomputing.com site, which answered my question about Java and MS Word macros. Now I've hit two hurdles that I am not able to overcome.
    First, when I build a simple application that uses internal libraries, am I compelled to place the jars in the lib/ext directory of the Java runtime directory? I have had a lot of difficulty getting a suitable Class-Path into the manifest file. Why does the application ignore the manifest entry?
    The second problem (and the biggest one) is that I am trying to build an SWT JavaBean application, but I have problems at runtime. Is it possible to use this technology for my software, or am I compelled to use AWT and Swing?
    I really hope you can help me.
    Thank you in advance.

    hi,
    I have to catch events from an Excel sheet in my Java code - events like a change in a cell value or a click on some user-defined button, etc. I have written the following code, but it does not give me any events. It simply opens the specified file in Excel in a separate window, but when I click on the sheet or change some value, no event is captured by my code. Can anyone please tell me how to go about it?
    import org.eclipse.swt.SWT;
    import org.eclipse.swt.SWTException;
    import org.eclipse.swt.layout.FillLayout;
    import org.eclipse.swt.ole.win32.OLE;
    import org.eclipse.swt.ole.win32.OleAutomation;
    import org.eclipse.swt.ole.win32.OleControlSite;
    import org.eclipse.swt.ole.win32.OleEvent;
    import org.eclipse.swt.ole.win32.OleFrame;
    import org.eclipse.swt.ole.win32.OleListener;
    import org.eclipse.swt.ole.win32.Variant;
    import org.eclipse.swt.widgets.Display;
    import org.eclipse.swt.widgets.Menu;
    import org.eclipse.swt.widgets.Shell;

    public class EventTry2 {
        private Shell shell;
        private static OleAutomation automation;
        static OleControlSite controlSite;

        // Excel dispatch IDs for the worksheet/workbook events to sink.
        protected static final int Activate = 0x00010130;
        protected static final int BeforeDoubleClick = 0x00010601;
        protected static final int BeforeRightClick = 0x000105fe;
        protected static final int Calculate = 0x00010117;
        protected static final int Change = 0x00010609;
        protected static final int Deactivate = 0x000105fa;
        protected static final int FollowHyperlink = 0x000105be;
        protected static final int SelectionChange = 0x00010607;

        public void makeVisible() {
            Variant[] arguments = new Variant[1];
            arguments[0] = new Variant(true);
            // Visible = true (dispid 558)
            automation.setProperty(558, arguments);
            // EnableEvents = true (dispid 1212)
            boolean b = automation.setProperty(1212, arguments);
            System.out.println(b);
        }

        public Shell open(Display display) {
            this.shell = new Shell(display);
            this.shell.setLayout(new FillLayout());
            Menu bar = new Menu(this.shell, SWT.BAR);
            this.shell.setMenuBar(bar);
            OleFrame frame = new OleFrame(shell, SWT.NONE);
            try {
                controlSite = new OleControlSite(frame, SWT.NONE, "Excel.Application");
                this.shell.layout();
                // OLE verbs are discrete commands, not OR-able flags: activate in place.
                boolean a2 = (controlSite.doVerb(OLE.OLEIVERB_INPLACEACTIVATE) == OLE.S_OK);
                System.out.println("Activated::\t" + a2);
            } catch (SWTException ex) {
                System.out.println(ex.getMessage());
                return null;
            }
            automation = new OleAutomation(controlSite);
            // Make the application visible and enable events.
            makeVisible();
            System.out.println("Going to create Event listener");
            OleListener eventListener = new OleListener() {
                public void handleEvent(OleEvent event) {
                    System.out.println("EVENT TYPE==\t" + event.type);
                    switch (event.type) {
                        case Activate:
                            System.out.println("Activate Event");
                            break;
                        case BeforeDoubleClick:
                            System.out.println("BeforeDoubleClick Event");
                            break;
                        case BeforeRightClick:
                            System.out.println("BeforeRightClick Event");
                            break;
                        case Calculate:
                            System.out.println("Calculate Event");
                            break;
                        case Change:
                            System.out.println("Change Event");
                            break;
                        case Deactivate:
                            System.out.println("Deactivate Event");
                            break;
                        case FollowHyperlink:
                            System.out.println("FollowHyperlink Event");
                            break;
                        case SelectionChange:
                            System.out.println("SelectionChange Event");
                            break;
                    }
                    // Release the Variants that were passed with the event.
                    Variant[] arguments = event.arguments;
                    for (int i = 0; i < arguments.length; i++) {
                        System.out.println("@@");
                        arguments[i].dispose();
                    }
                }
            };
            System.out.println("outside");
            OleAutomation sheetAutomation = this.openFile("C:\\Book1.xls");
            controlSite.addEventListener(sheetAutomation, Activate, eventListener);
            controlSite.addEventListener(sheetAutomation, BeforeDoubleClick, eventListener);
            controlSite.addEventListener(sheetAutomation, BeforeRightClick, eventListener);
            controlSite.addEventListener(sheetAutomation, Calculate, eventListener);
            controlSite.addEventListener(sheetAutomation, Change, eventListener);
            controlSite.addEventListener(sheetAutomation, Deactivate, eventListener);
            controlSite.addEventListener(sheetAutomation, FollowHyperlink, eventListener);
            controlSite.addEventListener(sheetAutomation, SelectionChange, eventListener);
            shell.open();
            return shell;
        }

        public OleAutomation openFile(String fileName) {
            // Workbooks collection of the Excel Application object (dispid 0x023c).
            Variant workbooks = automation.getProperty(0x0000023c);
            Variant[] arguments = new Variant[1];
            arguments[0] = new Variant(fileName);
            System.out.println("workbooks::\t" + workbooks);
            int[] rgdispid = workbooks.getAutomation().getIDsOfNames(new String[] { "Open" });
            int dispIdMember = rgdispid[0];
            Variant workbook = workbooks.getAutomation().invoke(dispIdMember, arguments);
            System.out.println("Opened the Work Book");
            try {
                Thread.sleep(500);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            int id = workbook.getAutomation().getIDsOfNames(new String[] { "ActiveSheet" })[0];
            System.out.println(id);
            Variant sheet = workbook.getAutomation().getProperty(id);
            return sheet.getAutomation();
        }

        public static void main(String[] args) {
            Display display = new Display();
            Shell shell = (new EventTry2()).open(display);
            while (!shell.isDisposed()) {
                if (!display.readAndDispatch()) {
                    display.sleep();
                }
            }
            controlSite.dispose();
            display.dispose();
            System.out.println("-----------------THE END-----------------------------");
        }
    }

  • Question about repository

    Hi
    I have some questions about repositories, and if anyone could answer them I'll be very grateful.
    What does it contain? Data, or objects without data?
    How does it work together with a database? I mean, when there is a DML statement, where are those changes applied? In the database, in the repository, both...?
    The first time you install a repository, how do you get data into it? I mean, do you have to import objects from the schemas, or is that unnecessary if both are in the same database...?
    And if any of you knows a good guide or any doc for learning more about using a repository with an Oracle database, please tell me.
    Thank you very much.
    Gerardo

    Oracle Designer is a CASE tool, that is, a tool for capturing and analysing requirements, designing a system and (optionally) generating it. This is all data, which is stored in the repository. Think of it as any other application.
    Once analyzed, designed and modelled, the deployed application will run against some other database, not the repository.
    "I mean, do I have to do the changes on the database or in the Designer tools?" As a DBA I would not expect you normally to touch the repository, except for the usual things (checking tablespace growth, etc.). Are you also supposed to be the repository administrator?
    Anyway, the data model and integrity rules in the repository are very complicated, and you do not want to tool around with SQL in the repository; use the Designer tools. However, Oracle do provide a suite of PL/SQL packages - the Repository API - which allow us to manipulate the data programmatically. This can be very handy when we need to do repetitive tasks on a large number of elements.
    Cheers, APC

  • Some questions about communicating with a USB RAW device

    I have some questions about USB communication. I want to make my VI communicate with a TI DSP over USB. The USB driver on the DSP side is done, and there is a test program written in VC plus a driver folder (with a .inf and a .sys file). When I install the driver and run the test program, the driver program on the DSP runs correctly. Now I want to write a VI that does the same thing as the test program, so I first uninstalled the driver on the PC, then installed the DSP in NI-VISA according to "Using NI-VISA 3.0 to Control Your USB Device - Tutorial - Instrument Drivers". When I sent the standard control request using the VISA test panel, the status below happened. I don't know what is wrong with my steps.
    Dev  Phase  Data                       Info           Time   Cmd.Phase.Ofs
     15  CTL    80 06 03 00 - 00 00 04 00  GET DESCRIPTR  5.2sc        56.1.0
     14  CTL    80 06 03 00 - 00 00 04 00  GET DESCRIPTR   11us        57.1.0
     14  USTS   00 00 01 c0                canceled       2.0sc        57.2.0
     15  USTS   00 00 01 c0                canceled         5us        56.2.0
    PS: the software I used to capture the data is BUSHOUND.
    1. Do I have to install the .sys driver file in VISA? How can I install the driver file without losing the device in MAX?
    2. Someone told me it must be done by calling .dll files in LabVIEW, but I want to know whether LabVIEW can implement the function directly without calling a .dll file.
    Thanks for any reply!

    逍遥浪子 wrote:
    I have some questions about USB communication. I want to make my VI communicate with a TI DSP over USB. The USB driver on the DSP side is done, and there is a test program written in VC plus a driver folder (with a .inf and a .sys file). When I install the driver and run the test program, the driver program on the DSP runs correctly. Now I want to write a VI that does the same thing as the test program, so I first uninstalled the driver on the PC, then installed the DSP in NI-VISA according to "Using NI-VISA 3.0 to Control Your USB Device - Tutorial - Instrument Drivers". When I sent the standard control request using the VISA test panel, the status above happened (captured with BUSHOUND). I don't know what is wrong with my steps.
    1. Do I have to install the .sys driver file in VISA? How can I install the driver file without losing the device in MAX?
    2. Someone told me it must be done by calling .dll files in LabVIEW, but I want to know whether LabVIEW can implement the function directly without calling a .dll file.
    This thread already contains a related answer and explains what a .sys driver is.
    While you could theoretically use the Call Library Node to call all the necessary Win32 API kernel functions to connect to a device driver, this would be very cumbersome and not doable without a really good understanding of C programming. Writing an interface DLL instead won't require any more C programming knowledge, but it will give you a clean interface to that device driver which could eventually be used in other programming environments as well.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Questions about Real Application Testing(RAT)

    Hi All,
    We have a production database running on 10gR3 on a server with local drives, while we have an Oracle 11gR2 DB running on a server with NFS mounts (using an S7310 - AmberRoad), i.e. faster and better storage.
    We captured the load on 10gR3 and replayed it on 11gR2. We noticed the following:
    (1) Replay is considerably slow even though the Oracle 11gR2 instance has faster storage. We suspect it may have something to do with the buffer cache / SGA, because there is nothing in the cache on the target (we didn't shut down the 10gR3 DB for capture) - what should we do then?
    (2) To make sure we could take advantage of the cache, we replayed the load a second time right after the first replay, and to our surprise everything ran. So we are wondering how that is possible, since we did not restore the DB because we do not want to wipe out the cache (a chicken-and-egg situation)? Does Oracle roll back the changes after the replay?
    (3) Do we have to restore the database on the target every time we do a replay? But if we do that, then we won't have anything in the SGA.
    So we need your advice, and we would also like to know how everyone else is doing this testing.
    Regards,
    RJiv.

    DB Replay's workload capture facility allows you to either start capture from a closed (mounted) database (capture starts upon opening the DB), or to begin capture mid-stream during normal activity. Starting capture on the production system from a closed database eliminates the divergence in performance resulting from a primed cache, as well as possible data divergence issues from open, partially-completed transactions at the time the capture started.
    For many customers, it will clearly not be possible to close their database during peak periods (!!)
    One way to address the cache-priming issue is to start capture in production from a closed state during a period of low activity, and then allow capture to run through the peak period.
    Another approach is to start capture mid-stream with the DB open and to run capture for a long period (long enough to stabilize the cache). When performing the replay, begin a new AWR snapshot after the cache has stabilized.
    Your question about running the replay again after the first replay is done is confusing. Of course you will not get meaningful data from that, since replay must begin from the capture-start SCN. If you run replay twice in a row without reverting the database to the capture-start SCN, it will be applying meaningless changes to a database in a state that is unlike that of the original. You will be testing the data-error codepath instead of real performance.
    It is typical to enable database flashback on the replay database so that it can be repeatedly reverted to the capture-start SCN for testing under a variety of scenarios.
    Regards,
    Jeremiah Wilton
    Blue Gecko, Inc.
    http://www.bluegecko.net
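
    To make that last point concrete, here is a minimal sketch of the restore-point approach (hypothetical connection details and restore-point name; the SQL itself is standard Oracle syntax). The guaranteed restore point is created on the replay database before the first replay; the FLASHBACK DATABASE step has to be run with the database mounted, typically from SQL*Plus or RMAN as SYSDBA, so it appears only as a comment here.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        public class PrepareReplayRestorePoint {
            public static void main(String[] args) throws Exception {
                // Hypothetical connection details for the replay (test) database.
                String url = "jdbc:oracle:thin:@testhost:1521/testdb";
                try (Connection conn = DriverManager.getConnection(url, "system", "change_me");
                     Statement stmt = conn.createStatement()) {

                    // Create a guaranteed restore point before the first replay so the
                    // database can be reverted to the capture-start state between runs.
                    stmt.execute("CREATE RESTORE POINT before_replay GUARANTEE FLASHBACK DATABASE");

                    // After a replay run, revert the database (as SYSDBA, with the DB mounted,
                    // from SQL*Plus or RMAN rather than a normal JDBC session):
                    //   SHUTDOWN IMMEDIATE;
                    //   STARTUP MOUNT;
                    //   FLASHBACK DATABASE TO RESTORE POINT before_replay;
                    //   ALTER DATABASE OPEN RESETLOGS;
                }
            }
        }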
