Software Trigger from CC1 on PCIe-1433

Dear Sir or Madam,
I am running into trouble with NI Measurement & Automation Explorer (MAX) and my PCIe-1433. I would like to generate a software trigger from this software to my Camera Link camera, but I am not very familiar with MAX. Is it possible to generate a software trigger? If so, I would like to know how to do it.
Best regards,

Hello,
Check out this forum post. It goes through the steps necessary to set up triggering in MAX.
-Erik S
Applications Engineer
National Instruments
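
If CC1 is mapped to External 0 in MAX, the NI-IMAQ C API can generate the pulse train in software; the "PCIe-1433: pulse train to CC1, in C++?" thread below arrives at this recipe. A minimal sketch (the interface name "img0", the 1 ms timing, and the omitted error handling are assumptions):

#include "niimaq.h"

int main(void)
{
    INTERFACE_ID ifid;
    SESSION_ID   sid;
    PULSE_ID     pulseId;

    imgInterfaceOpen("img0", &ifid);   /* interface name as shown in MAX */
    imgSessionOpen(ifid, &sid);

    /* 50 MHz timebase: 50000 ticks = 1 ms delay and 1 ms width (~500 Hz),
       started immediately and driven onto external trigger line 0 (CC1). */
    imgPulseCreate(PULSE_TIMEBASE_50MHZ, 50000, 50000,
                   IMG_IMMEDIATE, IMG_TRIG_POLAR_ACTIVEH,
                   IMG_EXT_TRIG0, IMG_PULSE_POLAR_ACTIVEH,
                   PULSE_MODE_TRAIN, &pulseId);
    imgPulseStart(pulseId, sid);

    /* ... acquire while the camera is being triggered ... */

    imgPulseStop(pulseId);
    imgPulseDispose(pulseId);
    imgClose(sid, TRUE);
    imgClose(ifid, TRUE);
    return 0;
}

Note from the thread below: the camera itself must also be put into external-trigger mode; the poster ended up doing this via serial commands in the camera file.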

Similar Messages

  • How to connect external trigger PCIe-1433?

    I am having an issue getting the external trigger to work using a Camera Link camera, a PCIe-1433 card, and a signal generator. The signal generator is connected to the SMB connector on the 1433 card and is providing a 20 Hz signal with a 20 ms pulse width at TTL voltages (0-4 V). From memory, we are using the MAX software, and under the Acquisition tab I have CC1 set to External; however, the camera is not receiving this external trigger.
    I've attached a rough diagram of our setup.
    What could my issues be?  My thoughts are:
    1) Are there other settings in MAX I need to verify? I know the SMB connector can be either an input or an output. How can I make sure it is set to input?
    2) My input signal is 0-4 V, which is TTL, but does the signal need to be 0-5 V?
    Any thoughts would be greatly appreciated.  Thank you.
    Attachments:
    ExternalTrigger.jpg ‏29 KB

    Hello tuckturn
    Thank you very much for getting in touch with us. 
    1)  In Measurement and Automation Explorer, the default is an input. You would need to use LabVIEW to change the SMB connector to be an output. Can you please show me a screenshot of your camera attributes in Measurement and Automation Explorer?
    2)  Where does this input signal come from?  Do you have a 5V TTL compliant output to test this?  Can you please provide me the specification for whatever device is outputting the voltage?
    Thanks again.
    Sincerely,
    Greg S.
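
    For reference, on the C API side the SMB direction follows whether the line is being driven. A hedged fragment, assuming an open NI-IMAQ session sid and constants from niimaq.h (the LabVIEW equivalent is the IMAQ Trigger Drive VI):

    /* Drive external line 0 with an internal signal: the SMB becomes an output. */
    imgSessionTriggerDrive2(sid, IMG_SIGNAL_EXTERNAL, 0,
                            IMG_TRIG_POLAR_ACTIVEH, IMG_TRIG_DRIVE_FRAME_START);
    /* Disable the drive to tristate the line so it can be used as an input. */
    imgSessionTriggerDrive2(sid, IMG_SIGNAL_EXTERNAL, 0,
                            IMG_TRIG_POLAR_ACTIVEH, IMG_TRIG_DRIVE_DISABLED);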

  • How to connect external trigger PCI-1433

    Hi all:
    I have run into the same problem as this old post: https://forums.ni.com/t5/Machine-Vision/How-to-connect-external-trigger-PCIe-1433/m-p/1677560/highli...
    However, that post did not give a solution.
    I am having an issue getting the external trigger (from a PCI-6259) to work with a Camera Link camera and a PCIe-1433 card. The PCI-6259 is connected to a BNC-2111, and the output trigger signal from CTR0 is connected to the SMB connector on the 1433 card, providing a pulse train at TTL voltages (0-5 V). Using the MAX software, under the Acquisition tab I have CC1 set to External 0; however, the camera is not receiving this external trigger. The camera is a Basler 4096-140km.
    I measured the output signal from CTR0, and the signal looks good.
    I tried an RTSI trigger before and it failed; I thought the SMB route would be more straightforward, but it seems I was wrong.
    Looking forward to your reply!
    Attachments:
    camera configuration.jpg ‏168 KB
    External Camera Trigger(SMB).vi ‏57 KB

    Bobjojo,
    You will actually need to configure the properties of both the camera and the frame grabber to take advantage of this triggering mode. I found a document that contains a good rundown of the process to acquire in this mode (linked below). As far as the programming for the acquisition is concerned, the frames will be built at the driver level for the specified frame height. This means that the IMAQ driver will composite the line scans for you, and any simple acquisition (the document shows the Vision Acquisition Express VI) will be able to pull the image into LabVIEW.
    www.ni.com/white-paper/13786/en/pdf
    Karl G.
    Applications Engineer
    ni.com/support

  • PCIe-1433: pulse train to CC1, in C++?

    I'm sure this should be a simple question and there's a lot of good information from previous threads but I can't quite put it together.
    I have a PCIe-1433 and a Point Grey Gazelle; everything works okay, and in the Camera File Generator, I can put the camera into bulb-shutter mode and correctly control the exposure. So, through the camera file, control of CC1 works well.
    But now I want to programmatically generate a simple pulse train to CC1 of this camera. CC1 is already set as "External" line 0 in MAX. I believe I am correctly putting the camera into external trigger mode, because the frame rate drops to 0 (or the timeout rate). The commands are a bit confusing to me; currently I am using:
    // Pulse train on the 50 MHz timebase (50000 ticks = 1 ms delay, 1 ms width),
    // started immediately (IMG_SIGNAL_STATUS / IMG_IMMEDIATE) and routed to
    // external line 0.
    imgPulseCreate2(PULSE_TIMEBASE_50MHZ, 50000, 50000,
                    IMG_SIGNAL_STATUS, IMG_IMMEDIATE, IMG_TRIG_POLAR_ACTIVEH,
                    IMG_SIGNAL_EXTERNAL, 0, IMG_PULSE_POLAR_ACTIVEH,
                    PULSE_MODE_TRAIN, &pulseId);
    imgPulseStart(pulseId, sessionId);
    No errors are reported (after some trial and error), but I don't believe it's creating a pulse. I need to do this programmatically because I will need to vary the pulse rate to something the camera can support for different ROIs, although if there is a combination of using the camera file and code, I would be happy to use that.
    Any help would be greatly appreciated!

    GREAT! Thank you, Eric; starting with your tip and then some more trial and error, I've got it going now. A few things:
    * The attribute I was trying to modify actually only involved a pulse width, but it still wanted me to stop the acquisition.
    * The solution ended up being to code the camera mode (set by serial commands to first set the trigger source to CC1, then set the trigger mode to single) in the camera file.
    For some reason that I'll figure out one day, sending these same codes with serial write just did not work at all. Even if I told the camera file not to send any serial codes at all, ever, sending the same two commands myself didn't seem to work (even though I got acknowledgement from the camera). I'd like to solve this because I believe that calling AcquisitionStart resends these codes (for example, when I set up for a sequence acquisition) but this is redundant and slows things down.
    * So now I tell the camera file to send "trsrc line1\r" and "trm single\r", but with no pulse output. Then in the code, I use the commands below to set the pulse.
    // Same pulse train (70000 ticks = 1.4 ms delay and width on the 50 MHz
    // timebase), now routed to external trigger line 0 via imgPulseCreate:
    imgPulseCreate(PULSE_TIMEBASE_50MHZ, 70000, 70000,
                   IMG_IMMEDIATE, IMG_TRIG_POLAR_ACTIVEH,
                   IMG_EXT_TRIG0, IMG_PULSE_POLAR_ACTIVEH,
                   PULSE_MODE_TRAIN, &pulseId);
    imgPulseStart(pulseId, sessionId);
    then, to change the exposure time:
    imgPulseUpdate(pulseId, sessionId, delay, width);

  • PCIe-1433, extension boards and multiple trigger outputs

    Hi All,
    I am using two cameras and three LED lighting bars to take images of fabric from a conveyor. The cameras are interfaced with an NI PCIe-1433 card which has two extension boards attached to it. To synchronise the triggering of both the cameras and the LEDs I am using PhaseA of a quadrature encoder signal located on the conveyor. I need to produce six pulses to provide the external triggers at various points of the PhaseA waveform.
    I understand that the PCIe-1433 only has 1 external output that can be used as a trigger although with the extension board this increases to 3. I am assuming that even though I have two extension boards on the same PCIe-1433 card I am unable to use the additional 3 external outputs, which would give me the total of 6 I require?
    Any guidance would be appreciated
    Kmor

    Hi Kmor,
    Can you tell me what extension boards you are using?
    Fouad

  • Software trigger with PCI-5122

    I am going to program a board with a PCI-6552 and, in the meantime, capture samples of the board's analog output at specific times with a PCI-5122.
    Is there any way to use a software trigger and the data pattern of the PCI-6552 (e.g., define a software event) to trigger the PCI-5122? If it is possible, how can the software event be defined and connected to the "Send Software Trigger" block?

    When using a software trigger with the 5122, you generate the trigger whenever you call the "Send Software Trigger" function. You can program the software events that cause this function to be called in any way you want.
    In the simplest case, you could place the "Send Software Trigger" function in a while loop and only execute it when the user presses a button.
    Jack
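
    For reference, the same pattern in the NI-SCOPE C API looks roughly like the sketch below (the resource name, channel, and timing values are placeholders; error handling omitted):

    #include "niScope.h"

    int main(void)
    {
        ViSession vi;
        niScope_init("PXI1Slot3", VI_TRUE, VI_TRUE, &vi);   /* placeholder resource name */
        niScope_ConfigureVertical(vi, "0", 10.0, 0.0, NISCOPE_VAL_DC, 1.0, VI_TRUE);
        niScope_ConfigureHorizontalTiming(vi, 100000.0, 1000, 50.0, 1, VI_TRUE);
        niScope_ConfigureTriggerSoftware(vi, 0.0, 0.0);     /* arm for a software trigger */
        niScope_InitiateAcquisition(vi);

        /* ... when your software event fires (e.g., the 6552 pattern condition) ... */
        niScope_SendSoftwareTriggerEdge(vi, NISCOPE_VAL_SOFTWARE_TRIGGER_START);

        niScope_close(vi);
        return 0;
    }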

  • RTSI cable configuration with PCIe 1433 card

    Hi All:
    I am trying to trigger my line scan camera with an RTSI signal.
    The layout: the trigger signal is generated by a PCI-6259 card (Dev2/ctr0). The line camera is connected to a PCIe-1433 card, and the two cards are connected with an RTSI cable.
    The problem is how to register or configure the PCIe-1433 with the RTSI cable in MAX; it seems I can only add the PCI card under the RTSI cable...
    If that is true, how should I route the trigger signal to the camera, e.g., "route Dev2/ctr0 to trigger line 0"?
    Below is a screenshot of my MAX configuration. Attached is my VI.
    Thanks a lot.
    Bob
    Attachments:
    Triggered with RTSI from DAQ.vi ‏63 KB
    camera configuration.jpg ‏218 KB

    Thanks a lot, the post helped.
    But I am still confused about the details:
    For the camera and PCIe-1433 configuration, as the attached pictures show: RTSI line "2" is configured as the Camera Control Line, and the exposure control of the camera is defined as "triggered".
    In the previously attached VI, you may find that IMAQ Configure Trigger is used to define the trigger type as "RTSI" and the trigger number as "2".
    As for the trigger signal from the PCI-6259 (Dev2), a digital pulse is generated and the output terminal is defined as ctr0 (it seems only ctr0 or ctr1 is an option).
    So I still do not know how to route the signal from "Dev2/ctr0" to "Imag0 CCL 2" in LabVIEW.
    By the way, I read the online example "LL Triggered Ring using DAQmx Pulse Train", http://www.ni.com/example/29476/en/. Within this VI, the DAQmx Connect Terminals VI is used to route the counter's output to an RTSI pin so the signal can be used by the IMAQ device. However, Dev2/ctr0 is not recognized as a source terminal; only items such as PFI, APFI, or RTSI are available, and the destination terminal is limited to similar options, which sounds like just internal connections within Dev2.
    Thanks a lot.
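
    The routing step usually trips people up here: in DAQmx, the counter's routable terminal is its internal output terminal (Ctr0InternalOutput), not the channel name Dev2/ctr0. A hedged C sketch (the RTSI line number and pulse parameters are placeholders):

    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task;
        DAQmxCreateTask("", &task);

        /* Continuous 20 Hz pulse train on Dev2/ctr0 (frequency/duty are placeholders). */
        DAQmxCreateCOPulseChanFreq(task, "Dev2/ctr0", "",
                                   DAQmx_Val_Hz, DAQmx_Val_Low, 0.0, 20.0, 0.5);
        DAQmxCfgImplicitTiming(task, DAQmx_Val_ContSamps, 1000);

        /* Route the counter's internal output onto RTSI 2 so the 1433 can see it. */
        DAQmxConnectTerms("/Dev2/Ctr0InternalOutput", "/Dev2/RTSI2",
                          DAQmx_Val_DoNotInvertPolarity);

        DAQmxStartTask(task);
        return 0;
    }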

  • NI PCIe-1433 Camera Link frame grabber programming using VC++

    Hello, I have a Basler Camera Link-compatible camera and use the NI PCIe-1433 Camera Link frame grabber as the image acquisition device.
    Right now, I want to develop some applications on that camera and need to do programming on the NI PCIe-1433 by VC++.
    So I want to know where I can find the software development kit for the NI PCIe-1433. For example, I want to control the camera to grab an image using the different functions (such as imgGrab() or something like that) from that software development kit. If there is some sample code, that would be even better.

    Hi Wrsbj,
    The IMAQ functions that you mentioned are contained in the IMAQ driver, which is used to acquire from Camera Link frame grabbers. I would also check the compatibility chart to make sure that you get the version you need. As for example programs, you can take a look at this KnowledgeBase article concerning building and running the IMAQ examples in VC++.
    David S.
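
    For a quick start, a minimal grab loop in VC++ with the NI-IMAQ C API might look like the sketch below (the interface name "img0" and the loop count are assumptions; error handling omitted):

    #include "niimaq.h"

    int main(void)
    {
        INTERFACE_ID ifid;
        SESSION_ID   sid;
        void*        buffer = NULL;      /* driver-allocated when passed as NULL */

        imgInterfaceOpen("img0", &ifid); /* device name as it appears in MAX */
        imgSessionOpen(ifid, &sid);

        imgGrabSetup(sid, TRUE);         /* start a continuous acquisition */
        for (int i = 0; i < 10; i++) {
            imgGrab(sid, &buffer, TRUE); /* wait for the next frame */
            /* ... process buffer ... */
        }

        imgClose(sid, TRUE);             /* TRUE frees driver-allocated resources */
        imgClose(ifid, TRUE);
        return 0;
    }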

  • HSDIO conditionally fetch hardware compare sample errors (script trigger to flag whether or not to wait for software trigger)

    I am moderately new to Labview and definitely new to the HSDIO platform, so my apologies if this is either impossible or silly!
    I am working on a system that consists of multiple PXI-6548 modules that are synchronized using T-CLK and I am using hardware compare.  The issue I have is that I need to be able to capture ALL the failing sample error locations from the hardware compare fetch VI... By ALL I mean potentially many, many more fails than the 4094 sample error depth present on the modules.
    My strategy has been to break up a large waveform into several subsets that are no larger than 4094 samples (to guarantee that I can't overflow the error FIFO) and then fetch the errors for each block. After the fetch is complete, I send a software reference trigger that is subsequently exported to a script trigger that tells the hardware it is OK to proceed (I do this because my fetch routine is in a while loop, and LabVIEW says that the "repeated capability has not yet been defined" if I try to use a software script trigger in a loop).
    This works fine, but it is also conceivable that I could have 0 errors in 4094 samples.  In such a case what I would like to do is to skip the fetching of the hardware compare errors (since there aren't any) and immediately begin the generation of the next block of the waveform.  That is, skip the time where I have to wait for a software trigger.
    I tried to do this by exporting the sample error event to a PFI and looping that PFI back in to generate a script trigger.  What I thought would happen was that the script trigger would get asserted (and stay asserted) if there was ever a sample error in a block, then I could clear the script trigger in my script.  However, in debug I ended up exporting this script trigger back out again and saw that it was only lasting for a few hundred nanoseconds (in a case where there was only 1 injected sample error)... The sample error event shows up as a 1-sample wide pulse.
    So, my question is this:  is there a way to set a flag to indicate that at least one sample error occurred in a given block  that will persist until I clear it in my script?  What I want to do is below...
    generate wfmA subset (0, 4094)
    if scriptTrigger1
      clear scriptTrigger1
      wait until scriptTrigger0
    end 
    clear scriptTrigger0
    generate wfmA subset (4094, 4094)
    I want scriptTrigger1 to be asserted only if there was a sample error in any block of 4094 and it needs to stay asserted until it is cleared in the script.  scriptTrigger0 is the software trigger that will be sent only if a fetch is performed.  Again, the goal being that if there were no sample errors in a block, the waiting for scriptTrigger0 will not occur.
    I am probably going about it all wrong (obviously since it doesn't work), so any help would be much appreciated!

    Please disregard most of my previous post... after some more debug work today, I have been able to achieve the desired effect at slower frequencies. I also straightened out my script:
    generate wfmA
    if scriptTrigger1
      clear scriptTrigger0
      wait until scriptTrigger0
    end if
    generate wfmA
    scriptTrigger1 = sample error event flag
    scriptTrigger0 = software trigger (finished fetching error backlog in SW)
    However, I am still having a related issue.
    I am exporting the Sample Error Event to a PFI line, looping that back in on another PFI line, and having the incoming version of the Sample Error Event generate a script trigger.  My stimulus has a single injected sample error for debug. For additional debug I am exporting the script trigger to yet another PFI; I have the sample error event PFI and the script trigger PFI hooked up to a scope.
    If I run the sample clock rate below ~133 MHz, everything works... I can see the sample error event pulse high for one clock period, and the script trigger stays around until it is consumed by my script's if statement.
    Once I go faster than that, the script trigger only occasionally catches the sample error event. The faster I go, the less often it is caught. If I widen the error out to be 2 samples wide, then it works every time, even at 200 MHz.
    I have tried PFI0-3 and the PXI lines as the output terminal for the sample error event and they all have the same result (this implies the load from the scope isn't the cause).
    I don't know what else to try. I can't oversample my waveform because I need to run at a true 200 MHz. I don't see anything that would give me any other control over the sample error event in terms of its pulse width, or a way to export it directly to a script trigger instead of how I'm doing it.
    Any other ideas?
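
    For completeness, the software-trigger half of that handshake in the niHSDIO C API might look like the fragment below (a sketch; I am assuming niHSDIO_SendSoftwareEdgeTrigger and an open generation session vi whose script waits on scriptTrigger0):

    #include "niHSDIO.h"

    /* After fetching the hardware-compare error backlog on the host, release
       the generation script parked on "wait until scriptTrigger0". */
    niHSDIO_SendSoftwareEdgeTrigger(vi, NIHSDIO_VAL_SCRIPT_TRIGGER, "scriptTrigger0");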

  • Why can I not find the primary video setting in Bios so I can change it from integrated to PCIE?

    I have purchased a new video card to add to my computer, which has integrated graphics. When I installed the card I got a blank screen, so I removed the card, restarted the computer, and entered the BIOS to try to change the primary video from integrated to PCIE. The problem is I cannot find the option to make the change. I can select Advanced Chipset, but there is no primary video option, and as far as I can tell there is no way to switch from integrated to PCIE. I have researched this heavily and am now resorting to this post, as I cannot find a way to get the video card to work. I have listed the options in my system's BIOS below. Thanks for looking.
    BIOS Version 2.14.1219
    Options: Main, Advanced, Power, Security, Boot Options, Exit
    Main: System info, Date, Time
    Advanced:
      Miscellaneous: AHCI Port 1 Not Present; AHCI Port 2 Not Present; AHCI Port 3 Not Present; AHCI Port 4 HL-DT-ST DVDRAM GH7ON; AHCI Port 5 Not Present
      Advanced Chipset Configuration: Intel EIST [Enabled]; Intel Turbo Boost [Enabled]; Intel AES-NI [Disabled]; Intel XD BIT [Enabled]; DVMT Mode [DVMT]; DVMT Memory Size [256mb]
      Integrated Peripherals: Onboard SATA Controller [Enabled]; Onboard USB Controller [AHCI]; Legacy USB Support [Enabled]; USB Storage Emulation [Auto]; Onboard Audio Controller [Enabled]; Onboard LAN Controller [Enabled]; Onboard LAN Option ROM [Disabled]
      PC Health Status: {Health Status info}; Smart Fan [Enabled]
    Power: ACPI Suspend Mode [3(STR)], Deep Power Off Mode, Power On By RTC Alarm, Power on by PCIE Alarm, Power on by Onboard LAN, Wake up by PS2 Keyboard/Mouse, Wake up by USB Keyboard/Mouse, Restore on AC Power Loss
    Security: Supervisor password, Username Password, Change Supervisor Password
    Boot Options: 1st Boot Device, 2nd Boot Device, 3rd Boot Device, 4th Boot Device, 5th Boot Device, EFI Device Priority, Hard Disk Drive Priority, Optical Disk Drive Priority, Removable Device Priority, Network Device Priority, Quiet Boot On, Halt On
    Exit


  • How do I get LV2009 to see my PCIE-1433?

    Hi all - I think this problem is just a matter of picking the right download. I'm trying to modify a LabVIEW application that was installed on a demo system provided by a vendor. They very kindly provided all their source code, but it's written in version 2009, and they didn't include the development environment. I downloaded 2009 SP1 and installed it using my 2014 SP1 license. So far, so good.
    When I opened the VI, LabVIEW couldn't find any of the IMAQ sub-VIs. Eventually, I figured out that I needed to install the 2009 version of the Vision Acquisition Software. I downloaded and installed it, and that took care of most of the missing VIs. There were still a few that required the Vision Development Module, but I don't have that license, so I just deleted those and tried to run the VI. I got errors that it couldn't open the frame grabber. It turned out I had uninstalled the driver for the PCIe-1433.
    So I downloaded the driver package and installed it, and the PCIe-1433 is still not recognized. Now I'm stuck. I know this system should work somehow, because the vendor's application worked before I started messing with it.
    Any ideas, folks? Should I install everything in a different order? (Above is the actual order in which I installed things.)
    thanks!
    MADman

    It seems that the PCIe-1433 needs the IMAQ 4.4 driver or later, which is included in VAS 2010.03. You will need this driver and LabVIEW 32-bit.
    Getting Started with the NI PCIe-1433
    http://www.ni.com/pdf/manuals/374000a.pdf
    What Versions of NI-IMAQ, NI-IMAQdx, and NI-IMAQ I/O Come with My Version of Vision Acquisition Software?
    http://digital.ni.com/public.nsf/allkb/6C42133468D66324862578BC00655CF8
    NI-IMAQ Compatibility with Different LabVIEW Versions
    http://digital.ni.com/public.nsf/allkb/DB928F6D5E9D6B97862579A7006B2850
    Thanks,
    Frank
    Application Engineer
    National Instruments

  • Where to download software or driver for NI PCI-8430/16 (RS232)

    Where can I download software or a driver for the NI PCI-8430/16 (RS232)?

    Try here: NI-Serial 4.0
    I just did a search from ni.com for "ni-serial download" and that was the first link.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • Camera File Generator Won't Run with MAX + PCIe-1433

    Hi,
    I am a camera developer trying to integrate a couple of new Camera Link cameras with the NI PCIe-1433 frame grabber. I have installed NI MAX 14.0 and version 3.0.0 of the Camera File Generator.
    I am trying to build a new camera file for our company's cameras.
    NI MAX shows the NI PCIe-1433 device when it launches.
    When I try to launch the Camera File Generator, it fails to load with the following message:
    "You must have NI-IMAQ 4.6.0 or higher installed before running the NI Camera File Generator.  NI-IMAQ 4.6.0 is installed with Vision Acquisition Software 2011."
    The MAX configuration tree shows I have version 14.0 of the NI-IMAQ software available.
    Is there any sort of developer's guide for creating a camera file to allow me to use my camera? I need to write some serial commands to the unit and thought that the Camera File Generator would support this....
    Thanks.
    -Mike

    Hi mawillia,
    The latest version of the IMAQ driver is 4.9, and it would show up where you said, under the Software tab in NI MAX. Do you have this latest version installed? If you like, feel free to expand your Software tab, take a screenshot, and post it here.
    Also, you are correct that the serial commands can be set in the camera file.  Here is a link to our support page for the Camera File Generator: http://sine.ni.com/psp/app/doc/p/id/psp-723/lang/en
    Within it is a whole host of various KnowledgeBase articles.  This one seems particularly helpful:
    What framegrabber specifications do I need to know to create my camera file?:  http://digital.ni.com/public.nsf/allkb/9B89C9FA43A6973A86257A62004658D5
    Unfortunately, I couldn't find any articles specific to coding the serial commands for your camera in the camera file. But I recommend opening one of the bundled camera files (I'm looking at the Basler acA2000-340kc (Base).icd) and, within it, clicking the Camera Control tab. The serial commands for each attribute will show up on the right side, and you can use this as an example when creating yours.
    Let me know if you have any further questions!
    Julian R.
    Applications Engineer
    National Instruments

  • How can I use a PCIe-1433 and a Basler line camera to construct an image and process it?

    Hello everyone! I am new to machine vision, and I have two questions. First, how can I use a line scan camera and the 1433 to acquire a 2-D image? Second, if I want to process the 1-D signal, how can I do it? I look forward to your answers. Thank you!

    What you are trying to do is not something you can easily do in a forum, but I can give you the basics.
    You have a Camera Link frame grabber (PCIe-1433) and a line scan camera.
    The frame grabber needs to be configured to understand how to communicate with the camera. You need a camera file from the camera vendor. This file might already be installed; you can use MAX to check, under Devices/IMAQ.
    Assuming you can get to the point where you can communicate with the camera, you need to configure your acquisition.
    Since the camera is a line scan camera, it captures image data one pixel line at a time. You will need to configure the frame grabber to determine how many lines per image you want. The low limit is usually 1, and the high limit is based either on the frame grabber's internal memory or on that of the computer.
    If you are using LabVIEW, you will need to open a session to the camera, acquire the image(s), and then close the session when you are done (see the C sketch after this post for the same flow).
    Once the image(s) are captured, you can perform analysis using the NI Vision Toolkit.
    Sorry, but there is no way to condense years into one post.
    Machine Vision, Robotics, Embedded Systems, Surveillance
    www.movimed.com - Custom Imaging Solutions
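
    Following up on the reply above, here is a minimal sketch of that flow in the NI-IMAQ C API (the interface name "img0", the 1024-line height, and the omitted error handling are assumptions for illustration):

    #include "niimaq.h"

    int main(void)
    {
        INTERFACE_ID ifid;
        SESSION_ID   sid;
        void*        buffer = NULL;      /* driver-allocated when passed as NULL */

        imgInterfaceOpen("img0", &ifid); /* interface name as shown in MAX */
        imgSessionOpen(ifid, &sid);

        /* Build 2-D frames from the 1-D line scans: ask the driver to
           composite 1024 lines per image (the value is an assumption). */
        imgSetAttribute2(sid, IMG_ATTR_ROI_HEIGHT, 1024);

        imgSnap(sid, &buffer);           /* one composited 2-D frame */
        /* ... process buffer (e.g., per-line 1-D analysis or 2-D vision) ... */

        imgClose(sid, TRUE);             /* TRUE frees driver-allocated resources */
        imgClose(ifid, TRUE);
        return 0;
    }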

  • Help: Can you use a software trigger on a digital line?

    Hi:
    We have a legacy DAQCard-700, which does not support hardware triggering. We have a trigger from our instrument that is 5 V+, and we would like to trigger when it drops to 0 V. We have several questions.
    1. Do we need to invert the signal in MAX, or is that only for voltages that are negative?
    2. Is there a good example among the shipping examples where the state of a digital line is used to trigger an analog acquisition? When we tried using the digital trigger examples, we got an error saying that the hardware did not support that mode.
    3. We have an analog software trigger mode (based on the Analog Software Trigger example) working, which we would like to modify to read the digital line, but we have done very little with digital I/O.
    4. The digital trigger has been assigned a virtual channel of dtrg.
    Any help would be much appreciated.
    Thanks in advance,
    Pete

    Thanks Doug,
    If we read the digital line instead of an analog line, would it improve the accuracy of the triggering? Everything works now, except we have a little bit of timing jitter, within about 1 data point when scanning at 50 kHz. However, I've never done any digital I/O with LabVIEW, so maybe I should work through some of the tutorials if you think this might solve the jitter problem. Would checking the state of the digital line allow a faster response with softtrig? That, I guess, is my question.
    Pete
    "Doug Norman" wrote in message
    news:[email protected]..
    > Hello Pete,
    >
    > You are correct that the DAQCard-700 has no digital (or analog)
    > hardware trigger. The analog trigger example that is working for you
    > is using conditional retrieval. This is where data is always being
    > acquired and the driver looks at the values to determine when to
    > "trigger" and read the data into LabVIEW. To answer your questions:
    > 1. I don't think you need to invert the signal. This is for when you
    > want a digital low (below 0.8 volts) to show up as a digital high, and
    > a high (above 2.0 volts) to be read as a low.
    > 2. I don't know of a good example. You would basically have to
    > monitor the digital line. When it goes from high to low you would
    > then start your analog acquisition.
    > 3. I think this could be your best bet. If you have enough analog
    > input lines, why not just connect this digital signal as one of your
    > analog inputs. Then use this example to trigger when the 5 volts
    > drops to 0. It won't hurt to acquire your digital signal on an analog
    > input along with your other data.
    > 4. I don't understand this question.
    >
    > Best Regards,
    >
    > Doug Norman
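
    Doug's "conditional retrieval" approach, where the software watches the incoming samples and declares the trigger itself, is easy to prototype on the host side. A minimal sketch of the falling-edge search (the threshold and types are assumptions):

    #include <stddef.h>

    /* Return the index of the first falling edge through `threshold`
       (e.g., the 5 V -> 0 V transition), or -1 if none is found. In a
       conditional-retrieval scheme, samples before this index are
       discarded and the "triggered" data starts here. */
    long find_falling_edge(const double *samples, size_t n, double threshold)
    {
        for (size_t i = 1; i < n; i++) {
            if (samples[i - 1] >= threshold && samples[i] < threshold)
                return (long)i;
        }
        return -1;
    }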
