How to use an external trigger on an electrometer (KEITHLEY 6517A)

I'm really new to LabVIEW, so please be patient with me...
I'm using a Keithley electrometer to acquire data from a dosimeter. We are going to do some measurements on a pulsed source, and before the pulse there's a trigger signal. I don't know the size of the trigger; I'll only find that out when I start my measurements (in another lab). So I would like to implement a case where the user can enter the value of the external trigger, so this information is sent to the equipment before starting the measurement.
What I would like to know is: can someone give me a hint on how to do that? I'm using GPIB and LabVIEW 7.1, and I have already downloaded all the 6517A libraries for LabVIEW.
Thank you all

The external trigger on a 6517 is a TTL signal. Any pulse higher than 3.4V will trigger it. You cannot program the threshold level.
You can set the delay programmatically with the TRIG:DEL <n> command.
Ref: page 2-81 and following of the User Manual.
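For reference, here is a minimal text-based sketch of how those commands could be sent over GPIB using Python and PyVISA (the original poster works in LabVIEW, so treat this only as an illustration of the message flow; the GPIB address and every command string except TRIG:DEL are assumptions to be checked against the manual):

import pyvisa

rm = pyvisa.ResourceManager()
ke6517 = rm.open_resource("GPIB0::27::INSTR")       # assumed GPIB address

trigger_delay_s = float(input("Trigger delay in seconds: "))   # value chosen by the user

ke6517.write("*RST")                                # reset to a known state
ke6517.write(":TRIG:SOUR EXT")                      # assumed: use the external TTL trigger input
ke6517.write(f":TRIG:DEL {trigger_delay_s}")        # delay command cited in the reply above
print(ke6517.query(":READ?"))                       # assumed: take and return one reading

The same writes can be done in LabVIEW with the GPIB Write VI or the 6517A driver VIs; only the delay is programmable, not the trigger threshold.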

Similar Messages

  • Data-acquisition with NI 6036E DAQ card & GPIB using an external trigger

    Hi all,
    I hope somebody could give me some help with the following and answer some questions:
    Simple system description:
    Labview 6.1
    PCI-GPIB card
    6036E DAQ card
    In my system, I am using an external analog trigger signal (A) for continuous data-acquisition. Characteristics of the analog trigger signal (A) are: ~40 Hz, signal height +1.48V, triggered by rising edge (the analog trigger signal (A) could be changed to a TTL signal). Each data-acquisition is done within ~1.0 ms after the rising edge of the trigger pulse. The timing of the data-acquisition and analyzing procedure is controlled by execution in a sequence structure placed in a loop.
    Now, I connected a power meter to the system, to measure the laser power during the data-acquisition. The power meter has two options to provide the laser power data:
    a) via analog signal output (voltage corresponds to laser power in watts)
    b) via GPIB (direct output reading of laser power in watts).
    Problem:
    During a certain point in my data-acquisition sequence structure (defined by a frame), I want to use the next occurring analog trigger signal (A) to acquire 1 value from the power meter.
    How do I do this in Labview programming for the following two situations?
    a) If I connect the analog output from the power meter to an analog input channel of the 6036E DAQ card. The analog trigger (A) would be connected to a second analog input channel (In case the analog trigger signal (A) is changed to a TTL signal it would be connected to the PFI0/Trig input pin on the DAQ card).
    b) If I use the GPIB connection of the power meter. The analog trigger (A) would be connected to a second analog input channel (In case the analog trigger signal (A) is changed to a TTL signal it would be connected to the PFI0/Trig input pin on the DAQ card).
    Another possibility would be to trigger the power meter directly, so it constantly outputs power meter values at ~40 Hz. How could I then acquire 1 power meter value (at a certain time in my sequence structure) via an analog input on the DAQ card or via GPIB?
    Additional questions:
    How do I configure the PFI0/Trig pin on the 6036E DAQ board individually as an INPUT?
    How do I use an analog trigger signal (A) as counting signal for a loop, or as an activation signal for a sequence structure which includes GPIB commands?
    It would be very nice if somebody could give me some help.
    Kind regards,
    beam

    Hi beam,
    I just want to verify that I understand your situation correctly:
    An external trigger signal (A) is wired to one of your input channels (e.g. CH0) to trigger data acquisition on a second channel (e.g. CH1). Your power meter is connected to an analog input channel, which you would like to trigger with a certain rising edge at some point in your sequence structure.
    Problem:
    During a certain point in my data-acquisition sequence structure (defined by a frame), I want to use the next occurring analog trigger signal (A) to acquire 1 value from the power meter.
    How do I do this in Labview programming for the following two situations?
    a) If I connect the analog output from the power meter to an analog input channel of the 6036E DAQ card. The analog trigger (A) would be connected to a second analog input channel (In case the analog trigger signal (A) is changed to a TTL signal it would be connected to the PFI0/Trig input pin on the DAQ card).
    If a task has been configured to acquire signal from one analog channel, it's not possible to run a second analog input task or to add a second channel on the fly. You had mentioned that it's possible to read from the instrument through GPIB. Is it possible to perform a software trigger such that at a certain frame of your structure, when the trigger signal A reaches voltage "x", a GPIB command is written to your power meter to query a measurement reading?
    Additional questions:
    How do I configure the PFI0/Trig pin on the 6036E DAQ board individually as an INPUT?
    You do not need to explicitly configure the PFI0 line as an input. If you want to use it as an input such that it acts as an analog trigger, simply wire the trigger signal to this pin. When configuring the trigger in your software, specify PFI0 as the trigger source.
    How do I use an analog trigger signal (A) as counting signal for a loop, or as an activation signal for a sequence structure which includes GPIB commands?
    You can try using the Limit VI to find out when the trigger signal reaches a certain level, and count how many times this level is reached. Similarly, you can use this as the condition to execute GPIB commands.
    Hope this helps,
    Lesley
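    A rough Python sketch of Lesley's software-trigger idea, for illustration only (the thread itself is LabVIEW 6.1 with Traditional DAQ; the device name, channel, threshold level, GPIB address and the power meter's query string are all assumptions):

    import nidaqmx
    import pyvisa

    TRIGGER_LEVEL_V = 1.0                                  # threshold for the ~1.48 V trigger pulse

    rm = pyvisa.ResourceManager()
    power_meter = rm.open_resource("GPIB0::5::INSTR")      # assumed GPIB address of the power meter

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai1")   # trigger signal (A) wired to ai1
        while True:
            if task.read() > TRIGGER_LEVEL_V:              # software "Limit"-style level check
                print("Laser power:", power_meter.query("READ?"))   # assumed query command
                break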

  • How to use an external start button for 6024E w/ sc-2345, I would like it to function like a start button in LabView but I don't know how to connect the external button to the sc-2345's connector block

    I am confused as to how to connect my external start button. I would like it to function like a start button on the front panel of a VI. I would like to use the +5V on pin 14, but I really don't know where to go from here. Any guidance for a novice would be most appreciated.

    phod,
    This is the LabVIEW Real-Time forum, so I suggest that in the future that you post this type of question to the Multifunction DAQ forum.
    For the simplest solution you will have to connect your button to a digital line of your board, consult the sc-2345 user manual for a diagram of where these lines are exposed. You will have to connect your start button in series with a line that is high, such as your 5V pin or another digital line. Then connect this to a digital line that will be your start trigger.
    Your program can poll the digital line that is connected to your button in a while loop, and when it goes high, it lets the rest of the program execute. For the programming, I suggest you take a look at the shipping examples that come with LabVIEW. If you have LabVIEW 7.0, go to Help>>Find Examples, then open Hardware Input Output>>Traditional DAQ>>Digital Input and Output>>E Series for some examples of digital I/O programming with E-Series boards.
    Hope that gets you started.
    Gerardo
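    A quick sketch of the polling approach Gerardo describes, written with the modern nidaqmx Python API rather than Traditional DAQ (device and line names are assumptions; the button pulls the digital line up to +5 V when pressed):

    import time
    import nidaqmx

    with nidaqmx.Task() as task:
        task.di_channels.add_di_chan("Dev1/port0/line0")   # line wired to the start button
        print("Waiting for the start button...")
        while not task.read():                             # poll until the line reads high
            time.sleep(0.01)                               # small delay so the loop doesn't hog the CPU
        print("Start button pressed -- run the rest of the program.")

    The LabVIEW version is the same idea: read the digital line inside a While Loop and exit the loop when it goes high.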

  • Multiple recording of waveforms using an external trigger signal

    Dear all,
    Maybe somebody could help me with the following:
    System:
    Windows 2000,Pentium III,~800-900MHz,32-bit master PCI bus,256 MB RAM
    PCI-NI 5122 Digitizer
    PCI-GPIB card
    PCI-DAQ card
    Labview 7.1
    NI Scope 2.7
    Problem:
    I want to acquire multiple waveforms (e.g. 100) using an external TTL trigger signal of ~40Hz. I have to acquire 1 waveform per trigger signal i.e. 100 subsequent trigger signals will acquire 100 subsequent waveforms.
    After, or during, the waveform acquisition (?), I have to transfer the waveforms as 100 1D-arrays from the onboard memory of the digitizer through my fitting routine (one-by-one!). The fitting routine accepts one waveform (1D-array) at a time, fits the waveform and calculates 1 fit-value per waveform. So, when I pass the acquired 100 waveforms subsequently through the fitting routine, I need to get out a 1D-array with 100 fit-values. This 1D-array with the 100 fit values will be further processed and a final value will be saved.
    This procedure will repeat itself at different points in a sequence structure. The sequence structure will be located in a loop and running for several hours!
    Please see attached program flow diagram.
    Questions:
    1. When I want to acquire 100 waveforms using 100 subsequent trigger pulses, the attached VI (Multiple_record VI) gives me the following error message:
    ERROR
    "Possible reason(s):
    Driver Status: (Hex 0xBFFA4009)
    A previous acquisition is still in progress. If you are attempting to change an attribute, note you can only change fetch attributes while an acquisition is still in progress.
    Status Code: -1074118647
    What am I doing wrong? Is there a better and more efficient way to do this?
    2. How can I transfer the 100 waveforms from the digitizer, one-by-one as 1D-arrays, through my fitting routine? A waveform will have ~2000-3000 data pts. (DBL type). Remember: I DO NOT HAVE TO SAVE THE WAVEFORMS!
    3. Which way is faster, and more efficient (memory) for my PC system to transfer the waveforms from the onboard memory of the digitizer to the fitting routine?:
    a) Start transfer of already acquired waveforms (1D-arrays), one-by-one through the fitting routine, during the waveform acquisition process is still in progress?
    b) First, acquire the 100 waveforms (1D-arrays), store them in the onboard memory (digitizer) and then transfer them one-by-one through the fitting routine?
    4. How can I make sure that after the waveforms were transferred through the fitting routine, the onboard memory of the digitizer is EMPTY (clear waveforms which have already been fetched) and ready to repeat the waveform acquisition at a different point of the sequence structure?
    5. I do not want to display each acquired waveform because I think it will slow down the acquisition process. Instead, I want to display the last of the acquired 100 waveforms, so I can check my fit and maybe adjust the fit-parameters. How can I do that?
    I think the important thing is to perform this process in the fastest way possible with the best use of the PC resources (memory etc.) because this process will run for hours. Any suggestions?
    It would be very nice if somebody could give me some help with this.
    Kind regards,
    beam
    Attachments:
    program flow diagram.vi ‏11 KB

    The method you use to analyze your data is fine. The only thing I would watch out for is memory problems. If you keep with the numbers you gave earlier, 100 waveforms at 3000 pts/wfm, you will be transferring 2.4MBytes of data per set. This is way over the 1MByte buffer NI-SCOPE uses for data transfer. It will work, but it might be slow. If you have problems, switch to niScope Fetch WDT.vi for your acquisition and move it inside the loop. Before you call the fetch VI, set the record it will fetch by using the property node and setting Fetch->Fetch Record Number using your loop index.
    A couple of other performance tips I think you probably know. The graph inside the loop will slow you down a lot. In addition, the graph of all the data outside the loop will also take a fair amount of time (plotting 2.4MBytes of data). If you want to see just the last waveform, pop-up on the terminal going out of the loop and disable indexing so only the last waveform goes through, not all of them.
    Your analysis method should work fine. You may consider doing an I16 fetch instead of the WDT fetch you are currently using. This will reduce your memory usage by a factor of four, provided you don't immediately convert it to a double array. If your analysis is actually finding a max/min, this is faster on an integer array than a double. You can scale the integers using the numbers in the information cluster. It is much faster to scale one integer at the end than a whole array. That said, the I16 fetch is not particularly faster than the WDT fetch, just less memory. This choice will depend on your analysis.
    The min record length is the minimum number of points the device will acquire. The actual record length is different if the acquisition rate you ask for is not one of the acquisition rates the device is capable of. The acquisition rate is then coerced to the next highest rate and the record length increased so your acquisition time remains constant. This behavior is part of the IVI specification for digitizers. Using your VI as an example, you ask for a 300 kS/s acquisition rate and 800 data points. What you actually get is 300,300.3003 S/s and 801 data points. The sample rate is determined by an integer divisor from 100 MHz, in this case 100,000,000/333. So, if you really want a certain number of data points, you need to set the sample rate to a physically realizable one. Alternately, you can just fetch the number you want instead of all the points, realizing your sample window will be a bit shorter than you asked for.
    You are correct. To see anything in the records done, you need to delay. However, in this code, the records done output is really not doing anything. The fetch won't fetch until everything is done, so you know all is finished when your data shows up. I would just delete it. If you do want to put a delay in, wrap a sequence structure around the records done query. Add a frame before the current frame and drop a Wait (ms) delay into it. You can find this primitive in the Time & Dialog palette. This is an example of when the sequence is actually useful. The delay VI has no good way of enforcing data flow, so the sequence does it.
    The way you have implemented it, you don't need to poll acquisition status. To get acquisition status, use niScope Acquisition Status.vi. You would put this in a WHILE loop before your acquire and exit the WHILE loop when the acquisition was done (or an error occurred) to proceed to the acquisition. Make sure you also put an appropriate delay in the WHILE loop or it will eat your entire processor capacity. The delay should be based on how long you expect the acquisition to take.
    You should definitely put your fetch code in a subVI if you are going to call it more than once. LabVIEW makes it easy. Select the code you need to make into a subVI, then select Edit->Create SubVI. Don't forget to put an icon and documentation in the subVI. You will thank yourself later.
    Let me know if you have any more questions...
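    For what it's worth, here is a rough sketch of the record-by-record fetch pattern described above, using the nimi-python niscope module (the thread itself is LabVIEW with NI-SCOPE 2.7; the resource name, vertical range, record length and trigger source string are assumptions, and max() stands in for the real fitting routine):

    import niscope

    NUM_RECORDS = 100

    with niscope.Session("PXI1Slot3") as session:
        session.configure_vertical(range=1.0, coupling=niscope.VerticalCoupling.DC)
        # ~2000-3000 points per record, one record per external trigger pulse
        session.configure_horizontal_timing(min_sample_rate=300000, min_num_pts=2500,
                                            ref_position=0.0, num_records=NUM_RECORDS,
                                            enforce_realtime=True)
        session.configure_trigger_digital("VAL_EXTERNAL")    # ~40 Hz TTL trigger (assumed source name)
        fit_values = []
        with session.initiate():
            for record in range(NUM_RECORDS):
                wfm = session.channels[0].fetch(record_number=record, num_records=1)[0]
                fit_values.append(max(wfm.samples))           # stand-in for the fitting routine
        print(fit_values)

    Fetching inside the loop and keeping only one fit value per record avoids holding all 100 waveforms in memory at once, which is the point of the Fetch Record Number advice above.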

  • How to use bapi /external service directly in application service

    Hi
    I have to use BAPI_ALM_ORDER_MAINTAIN in my CAF application.
    I have imported it as an external service. As there are problems mapping its input fields to an entity service, it has to be consumed directly via an application service.
    But I have never done this before. Can someone give me an example of how to do it?
    If someone can give an example of how to code the call to the BAPI in the application service, it would be great, as I have a time constraint.
    Thanks ,
    Points assured for help.

    Hi Vivek,
    As you asked for an example to call a bapi directly from Application Service you can follow this link.
    <a href="http://help.sap.com/saphelp_nw04s/helpdata/en/44/57fff3b10b3672e10000000a114a6b/frameset.htm">Using Generated External Proxy in Application Service</a>
    Thanks and Regards
    Avijit

  • How to use an external drive with 2 Macs?

    I'm using an external drive for my LR 5.4 files and can download and view them without any issues on my MacBook. However, when the same drive is attached to my iMac, I cannot download or view any files on the external drive that were added after March of this year. Any ideas on how to go about resolving this issue?

    The following screenshot might be helpful. It shows where in Lr preferences you can configure which catalog Lr uses as the default. If you need to direct it to an external drive, use the 'Other' option to point to this drive; then, the next time Lr launches, set it back to 'Load most recent catalog'.

  • How to use an external hard drive on a MacBook Air?

    Hi there.
    Recently I bought a portable USB external hard drive for my MacBook Air due to my full startup disk. I also bought multiple disks for the hard drive. I inserted the disk, hooked up the USB, and did everything properly. When I first hooked it up, I lost all the internet files that I had saved. All images, videos and downloads (such as Minecraft and other gaming downloads) from the internet were gone and I was unable to find them.
    Now, I don't want to make the wrong move and lose all of my internet files again, so I am asking how to transfer files SAFELY to the USB drive without any files being deleted. I transferred files before, but they were just shortcuts to the original files, so I am also asking how to fix that as well. My Dad was a computer technician, although he didn't work for Apple, so he doesn't know how to use the USB drive.
    I've tried everything to find my internet files, but nothing. If there is a way to get them back, I'd be glad if you shared that with me. My dock files aren't working due to this insanely full startup disk, so I'd appreciate it if you could tell me ASAP how to use the USB, how to SAFELY transfer files without them turning into shortcuts, and how to get my internet files back.
    Thank you very much.
    P.s My OS X version is 10.8.3 (Not with Mountain Lion)

    You cannot format a drive so that files will be automatically put on that drive when you run out of room on another drive. Transferring the files must be done specifically by you. You will need to learn how to manage free space on the SSD because 64 GBs is a bit small for much more than OS X.
    Drive Partition and Format
    1. Open Disk Utility in your Utilities folder.
    2. After DU loads select your hard drive (this is the entry with the mfgr.'s ID and size) from the left side list. Click on the Partition tab in the DU main window.
    3. Under the Volume Scheme heading set the number of partitions from the drop down menu to one. Click on the Options button, set the partition scheme to GUID then click on the OK button. Set the format type to Mac OS Extended (Journaled.) Click on the Partition button and wait until the process has completed.
    4. Select the volume you just created (this is the sub-entry under the drive entry) from the left side list. Click on the Erase tab in the DU main window.
    5. Set the format type to Mac OS Extended (Journaled.) Click on the Security button, check the button for Zero Data and click on OK to return to the Erase window.
    6. Click on the Erase button. The format process can take up to several hours depending upon the drive size.

  • How to use my external processor effects?

    I have a lexicon processor and I want to know how to use it with logic.
    Thanks
    G5, Mac OS X (10.4.4)

    I hope this info can help you.
    The main problem is that Logic doesn't have a delay-compensation plugin like the "External FX" plugin in Nuendo or Cubase.
    Here is the info on compensating for the delay caused by the D/A and A/D converters:
    6.30 How do I set-up record / playback / monitoring delay?
    Subject: Record delay vs send/return using external effect units
    You'd think that with modern multichannel audio interfaces and a modern, professional audio sequencer like Logic, this would be a piece of cake, right? It turns out that there are a number of potential traps, all to do with Logic's highly inadequate record delay compensation. What follows is a run through of the general setup procedure, using my RME Multiface.
    Basic Record/Playback delay Setup
    Set up a 4/4 audio click track (trim the sample starts so they are right on the beats). Use real audio, not a software instrument. Rerecord the main outs to a new track via a hardware loopback cable (i.e. cable your audiocard's outputs into its inputs). Measure the clicks with reference to bars/beats in the Sample Editor - the clicks should be (but may not be) recorded right on the beat.
    If the audio driver has a record delay parameter in samples, use that to adjust. If not - use the ASIO Buffer delay "IN" for coarse adjustment (multiples of buffer). Leave the ASIO Buffer delay "OUT" at zero. Use the main driver playback delay for final fine adjustment (samples). Do NOT use the Arrange window's delay parameter - it's in ticks and thus tempo-dependent!
    So if you are playing in time with a prerecorded track, your playing is recorded in the correct position to preserve your exact timing on playback.
    The use of the playback delay to compensate is not ideal, as it will mess with the playback timing vs displayed position of any audio - this has an adverse effect on fine editing, and MIDI-to-audio sync - but Logic unfortunately does not provide a sample-accurate record delay adjustment, and has not done so since version 3.5...
    Effect Return/Record Setup
    You definitely want to avoid monitoring external FX returns through Logic if at all possible, since that would add 2x the audio buffer worth of latency to the return, which will adversely mess with the sound of any time-based effects (i.e. just about everything). So, monitor your external FX returns at source, or through direct hardware monitoring - in my case that's through RME TotalMix routed to the main outs for (near) zero latency.
    You probably want to be able to record the external FX outputs into Logic (to free up the FX for other uses) and have it play back exactly as it was monitored, right?
    Using your 4/4 audio click track, panned left, send to an external delay (approx 1/2 beat, single repeat). Pan the delay return R at source and monitor as above. Rerecord the main outs to a new track via a hardware loopback cable. Measure the number of samples from a click to its delay in the Sample Editor.
    Now record the delay return signal to a new track. Play back just the audio click panned left & the recorded delay panned right. Rerecord the main outs to a new track via the hardware loopback cable. Measure the number of samples from click to delay in the Sample Editor.
    Both measurements must be the same for accurate recording of FX returns, but it's likely they won't be, probably because the playback delay has been messed with. Compensate by inserting a sample delay on the FX return Input Objects in Logic. Since you're not monitoring the FX returns through Logic anyway, the input delay will be recorded but not monitored (in 5.2+, but if you use an earlier version you're hosed since effects are not recorded).
    Record the delay return again, play it back with the click, rerecord the outputs and measure again. Adjust the input sample delay until the recorded-&-played-back delay position is identical to the monitored delay.
    Live Input Monitoring/Recording
    For accurate timing, monitor any live inputs at source or through direct hardware monitoring (TotalMix), the same as for FX returns. When recorded, playback timing will be accurate. Sends from live inputs to external FX can also be applied in at source or in TotalMix. No problem.
    Live Input Monitoring Through Logic
    Here's where the problem starts. If you also plan to monitor some live inputs through Logic - to add Logic FX, or to control & automate the live inputs, or to add live inputs to a bounce, etc - then you'll be monitoring the live inputs with a latency of 2x the audio buffer size (or more if applied processing induces further delays). Therefore when you record a live input, on playback it will be early by that amount, since the record/playback delay is set up to compensate for zero-latency monitoring. What you heard live is not what you get on playback.
    There's no set-&-forget way around this, since Logic won't let you apply a sample delay to an input without the delay also being applied to the input's monitor output from Logic. So using the same input delay trick you applied to recording the FX returns won't work - you'll wind up monitoring with even more delay, which will need yet more input compensation, and so on. You can't use the record/playback delay to compensate, because that would screw up the recording of source-monitored material.
    You could conceivably monitor everything through Logic at all times, and use a single record/playback delay to compensate for all of it (with the editing & MIDI-to-audio sync shortcomings discussed, but on a larger scale as considerably more compensation is required), but that will screw with the sound of any time-based external FX as noted above.
    So assuming you stick with source-monitoring the external FX returns, there are 2 options for correct playback timing of any recorded tracks that were monitored through Logic while recording:
    1 - Output all such tracks to a bus, and place a Sample delay on that bus to compensate. This will correct the playback you hear, but it won't correct the bad positioning of the audio.
    2 - Physically move the recorded audio later to compensate. Uh-oh - Logic's Arrange window is not sample-accurate. And it's not possible to move a newly recorded audio region later in the Sample Editor without adding samples to the start of the file, which is a tedious process if you've just recorded a number of tracks. You'll just have to get it as close as you can in the Arrange - bear in mind that ticks are tempo dependent, so you have to calculate the number of ticks based on the current song tempo (let's not even begin to discuss Audio vs tempo changes in Logic), or use the smallest SMPTE nudge available. No fun at all.
    If anyone has any other suggestions, I'm all ears...
    So what about OS-X?
    In Logic under OS X, the CoreAudio driver setup panel doesn't have any record/playback delay setup. If you're optimistic, you might interpret that as an indication that it's all done automatically by CoreAudio and the driver. But given Emagic's history in this area, what's the bet it's currently a big fat inaccurate mess? Rumblings from the Mobile i/o list seem to indicate this...
    G5 dual 2,5   Mac OS X (10.4.4)  
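    As a back-of-the-envelope illustration of the "2x the audio buffer" monitoring latency discussed above, and of the matching sample-delay compensation, here is a tiny helper (the buffer size and sample rate are example values only, not anything from the original post):

    def monitoring_latency(buffer_size_samples: int, sample_rate_hz: float) -> tuple[int, float]:
        """Return (latency in samples, latency in ms) for monitoring through the host."""
        latency_samples = 2 * buffer_size_samples              # one buffer in + one buffer out
        latency_ms = 1000.0 * latency_samples / sample_rate_hz
        return latency_samples, latency_ms

    samples, ms = monitoring_latency(buffer_size_samples=256, sample_rate_hz=44100.0)
    print(f"Move or delay the recorded track by {samples} samples (~{ms:.1f} ms).")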

  • How to use PXI Star Trigger for PXIe-5663 in PXIe-1075 chassis

    HI all,
    I have this system configuration:
    PXIe-8135 controller. Windows 7 64-bit, RFSA 2.7.5. NI-SYNC 3.4.1
    PXIe-1075 chassis
    PXIe-5663 (2x)
    PXIe-6672 Timing & Sync Card (slot #10)
    I want to trigger the recording of my Digitizer with an external trigger.
    The External Trigger is connected to PFI0 of the PXIe-6672 Timing card.
    Then, the PXIe-6672 card routes the trigger to the backplane of the PXIe-1075 (Destination "NISYNC_VAL_PXITRIG0").
    The PXIe-5663 are triggered with “NIRFSA_VAL_PXI_TRIG0_STR” as the source.
    The trigger fires my PXIe-5663 correctly, but the timing is not tight (> 5ns).
    I would like to use the PXI Star trigger instead; I think that I should be able to achieve much better synchronization with this.
    But NI-RFSA won't let me do this:
    When I try to call
    "niRFSA_ConfigureDigitalEdgeStartTrigger(rfsa_sess​ion, NIRFSA_VAL_PXI_STAR_STR , NIRFSA_VAL_RISING_EDGE)", I get the error:
    "Specified Route Cannot Be Satisfied, Because the Hardware Does Not Support It"
    I don't understand why the PXIe-5663E would not be able to use that Route.
    Any idea?
    Regards,
    Serge
    Serge Malo, ing.
    Concepteur logiciel
    Software Developer
    T (514) 842-7577 x648 | [email protected]

    That explanation isn't quite right. Usually, even PXIe modules have a connection to PXI_Star. The PXIe standard added the PXIe_DStar trigger buses, and it also preserved the PXI_Star bus from the PXI standard.
    However, there is an additional twist in this situation. I'm assuming that your PXIe-5663 includes a PXIe-5622 as the digitizer. It turns out that a synchronization technique called NI-TClk has eliminated the need for our more recent digitizers to rely on triggering from PXI_Star. I was able to find some documentation that includes this information, here and here. Given that, I think you have two options that should result in better synchronization.
    The first option is to use TClk; I found an example program that demonstrates using TClk to achieve phase-coherent signal acquisition across two 5663s. The second option is to use cables of matched length to connect two PFI front panel terminals of the timing board (6672) to the PFI1 front panel terminals of the digitizers (5622). The timing board would accept the external trigger on PFI0 and then issue triggers on PFI1 and PFI2 with around 500 ps of skew (manual, page A-4). The digitizers would use NIRFSA_VAL_PFI1_STR as the trigger. I hope one of these solutions will meet the demands of your particular application.
    I will also follow up with the owners of the RFSA product documentation to see if we can include a note about why PXI_Star is not supported in some cases.
    James Blair
    NI R&D
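    For anyone curious what the TClk option looks like in code, here is a very rough Python sketch built from the nimi-python niscope and nitclk modules as a stand-in (the thread itself uses the NI-RFSA C API, which nimi-python does not wrap; resource names, ranges, and the PFI trigger source string are all assumptions):

    import niscope
    import nitclk

    with niscope.Session("PXI1Slot4") as scope1, niscope.Session("PXI1Slot5") as scope2:
        sessions = [scope1, scope2]
        for s in sessions:
            s.configure_vertical(range=1.0, coupling=niscope.VerticalCoupling.DC)
            s.configure_horizontal_timing(min_sample_rate=50e6, min_num_pts=1000,
                                          ref_position=50.0, num_records=1,
                                          enforce_realtime=True)
        # External trigger into the first module; TClk is expected to distribute it to the other.
        scope1.configure_trigger_digital("VAL_PFI_1")        # assumed PFI terminal name
        scope2.configure_trigger_immediate()
        nitclk.configure_for_homogeneous_triggers(sessions)
        nitclk.synchronize(sessions, 200e-9)                 # minimum TClk period in seconds
        nitclk.initiate(sessions)
        waveforms = [s.channels[0].fetch()[0] for s in sessions]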

  • How to use 2 Time trigger UI elements

    Hi Experts,
      My user doesn't want to see the timeout error in the Web Dynpro application after leaving the browser idle for hours. And one more thing: he wants a popup every 15 minutes to show a message on the screen.
    So, I have used 2 Time Trigger UI elements to achieve this functionality,
          1st one(Time Trigger UI Element) is used to show the pop up message and It is working fine to show the message in a Pop Up .
          2nd one is used to handle the timeout error over hours, and this is not working as expected if the user leaves the browser for hours.
    I hope you understand my requirement and issue.
    Could you please suggest me how to achieve this functionality.
    1000 Thanks in advance.
    Regards,
    Giri

    Hi,
    If I set 3 hours on the 2nd Time Trigger UI element and test after 2 hours, the browser is still throwing the timeout error message.
    After 2 hrs, is the timeout error thrown by the timed trigger or by the standard session timeout? I believe that after 2 hrs the standard session timeout is kicking in. You need to set the delay time to less than your session timeout.
    You can get the session timeout (in minutes) using wdr_task=>server->session_timeout, and you can get the application server timeout using:
    DATA: name  TYPE pfeparname,
          value TYPE pfepvalue.
    name = 'rdisp/plugin_auto_logout'.              " profile parameter
    CALL 'C_SAPGPARAM' ID 'NAME'  FIELD name
                       ID 'VALUE' FIELD value.      " value now contains the timeout
    Regards,
    Kiran

  • How to use the external encoder

    Can someone please tell me how to open the external encoder so I can copy and paste links into it to use my web cam?
    Thanks

    Find a forum for whatever product this involves and post your question there.
    Here is a link to a page that has links to all Adobe forums...
    Forum links page:
    https://forums.adobe.com/welcome

  • How to use an external XSLT engine for Oracle BPEL Process Manager

    Hi,
    Is there a way to use an external XSLT engine instead of the built-in one provided in Oracle BPEL Process Manager?
    The reason is to perform some XSL Transformations that use an OWL Query language.
    Thanks!

    Yes. You can write your own XPath function which connects to an external XSLT engine and passes in your document:
    <copy>
      <from expression="my:myxslt-processor(bpws:getVariableData('var1','part'))"/>
      <to variable="v2" part="payload"/>
    </copy>
    The following thread discusses how to create an extension XPath function:
    http://forums.oracle.com/forums/thread.jspa?forumID=212&threadID=305548

  • How to use an external hard drive for your music storage

    How do I use my external hard drive as my storage for music? I have already saved my music on my external drive and I have my iTunes program on the computer; I'm just wondering what the next step is. Also, I want to make sure I do not use my computer's hard drive for any sizable storage.

    Sorry to butt in, but this appears to be the best thread to pose my question/problem.
    I have successfully migrated my iTunes library to my external hard drive. The last part of the instructions tells me to delete the iTunes music folder from my hard drive to free up space.
    However, I've not done this because I would like to keep a reduced number of music files on my MacBook so that I can listen to music when I'm away from my external hard drive.
    I've managed to delete the files I don't want stored on my hard drive, and when I open iTunes without the external hard drive, it defaults to the original location. I'm sure this is not good practice, because while I can now play music, the information in iTunes is not right.
    What is the best way of storing and playing some of my iTunes library on my hard drive, while using the external drive as a 'master'? Can it be done?
    MacBook   Mac OS X (10.4.7)   2GB RAM, 120GB HD, 2GHZ

  • With DAQmx, how to use AO start trigger for AO/AI synchronization with finite AI sampling

    I am a new user to DAQmx and I am trying to synchronize AI (finite samples) with AO in LabVIEW 7.1 using a PCI 6229 card. I want to generate a finite waveform (AO) and, subsequently, collect a finite number of voltage samples (AI). I would like to repeat the AO-AI cycles in a while loop.
    Alternatively, I could use an infinite AO generation and collect a finite number of voltage samples on AI, but always exactly at the same spot of the AO buffer.
    Using traditional DAQ and a 6024E card, I used a counter triggered by AO start trigger signal (example attached). I have problems with translating this example into DAQmx.
    Please help!
    Ruber
    Attachments:
    AIAODelay_traditional_Eseries.vi ‏155 KB

    Lesley,
    Thank you very much for your suggestion. Late last night I actually tried to use the AI start trigger instead of the AO start trigger and it worked (since I tried to start the AI & AO simultaneously, it does not matter what triggers what), even in the loop. The devil is in the details, as I had to carefully wire the number of data points and what to place inside/outside the loop.
    The problem with shared clock is that I need to sample the AI and AO at different rates but using AO and AI clock separately did not seem to affect the performance.
    I still want to try to use the AO start trigger (as you suggest) because I would like to delay the AI by a few ms from the AO. Is there a simple way to do that?
    I suppose, switching from traditional DAQ to DAQmx requires your brain to be rewired - after playing with it for a couple of days and nights I developed a "feeling" for it. One of the differences was that, in order to use this example in the loop, one has to use 'stop task' inside and 'clear task' outside the loop.
    Thanks again!
    Radek Uberna
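    A rough sketch of the AO-start-trigger arrangement with a delayed AI, using the nidaqmx Python API for illustration (the thread itself is LabVIEW 7.1 DAQmx; device and channel names, rates, and the 5 ms delay are assumptions, and the start-trigger delay property may not be supported on every board, in which case a delayed counter pulse is the fallback):

    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType, DigitalWidthUnits

    with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
        ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
        ao.timing.cfg_samp_clk_timing(10000, sample_mode=AcquisitionType.FINITE,
                                      samps_per_chan=1000)
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        ai.timing.cfg_samp_clk_timing(20000, sample_mode=AcquisitionType.FINITE,
                                      samps_per_chan=2000)         # AI and AO rates can differ
        # Arm the AI task off the AO start trigger, delayed by a few ms:
        ai.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/ao/StartTrigger")
        ai.triggers.start_trigger.delay_units = DigitalWidthUnits.SECONDS
        ai.triggers.start_trigger.delay = 0.005

        ao.write(np.sin(np.linspace(0, 2 * np.pi, 1000)), auto_start=False)
        ai.start()                        # waits for the AO start trigger
        ao.start()                        # starting the generation fires the trigger
        data = ai.read(number_of_samples_per_channel=2000, timeout=10.0)

    For repeated AO-AI cycles, stop (not clear) both tasks inside the loop and restart them, as noted above.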

  • How to use an external clock to acquire analog data?

    I need to acquire data using an external clock which has a variable frequency (in a small range). I am using a PCMCIA 6062 board. Does it accept an external clock?

    The 6062 can acquire data based on an external clock. If you are programming in LabVIEW, take a look at the shipping example called Cont Acq&Graph ExtScanClk.vi. If you are using C or VB, there are examples in the NI-DAQ/Examples folder.
    Regards,
    Brent Runnels
    Applications Engineer
    National Instruments
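    For comparison, the same idea with the modern nidaqmx Python API looks roughly like this (device name, PFI terminal and rate are assumptions; the 6062E itself is a Traditional DAQ-era card, so treat this only as an illustration of the concept):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        # 'rate' is only the expected maximum of the variable external clock;
        # the actual timing comes from the signal wired to PFI7.
        task.timing.cfg_samp_clk_timing(rate=100000, source="/Dev1/PFI7",
                                        sample_mode=AcquisitionType.CONTINUOUS)
        task.start()
        data = task.read(number_of_samples_per_channel=1000)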

    Hi, I am new to BADI. I am working on SAP Auto-Id infrastructure 4.0 (used in RFID projects)  there is a Custom BADI which is implemented in one SAP AII system. I need to copy this to other SAP AII system. How to create a Transport Request for Custom