Multiple producers, one consumer

Greetings,
I am developing an application where data is acquired at different rates by different processes, but I want to store it all in one file. How can I store this data if I have multiple producers and one consumer?

I took a look at your code and you might run into problems with the various ACK messages you send. You send them in parallel based on the data you read, but there is no guarantee which one will get sent first. If you need to ensure the order of the ACKs, then I don't think you want to process the received data in parallel like that. Also, as previously noted, your consumer loop will ONLY operate when there is data at BOTH queues.

For a given consumer task I prefer to use a single queue and define a more generic message structure that allows multiple producers to post data of differing types. If you have a limited and finite set of data types you can use a type-defined cluster; this means that for each message one or more elements of the cluster will have no meaning. Alternatively, you can use a variant as the data type, with the message ID indicating how the data should be interpreted. If you want an LVOOP solution you can define a base message class, where again part of the message is the ID. The child classes would have the necessary methods for dealing with the specific messages and data types, and your queue elements are then instances of the message class.
Mark Yedinak
"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
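
As a rough, language-agnostic illustration of the single-queue idea above (a message ID plus a variant-like payload, posted by several producers and handled by one consumer), here is a minimal Java sketch; all names and data types are invented for the example:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class SingleQueueDemo {
    // The message ID tells the consumer how to interpret the payload,
    // much like a type-defined cluster with an ID plus a variant.
    enum MsgId { TEMPERATURE, PRESSURE, SHUTDOWN }

    record Message(MsgId id, Object payload) {}

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Message> queue = new ArrayBlockingQueue<>(100);

        // Two producers running at different rates, both posting to the same queue.
        Thread fastProducer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    queue.put(new Message(MsgId.TEMPERATURE, 20.0 + i));
                    Thread.sleep(100);
                }
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        Thread slowProducer = new Thread(() -> {
            try {
                for (int i = 0; i < 3; i++) {
                    queue.put(new Message(MsgId.PRESSURE, 1.0 + i));
                    Thread.sleep(250);
                }
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });

        // One consumer: dequeues everything and "writes it to the file"
        // (printing here) based on the message ID.
        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    Message m = queue.take();
                    if (m.id() == MsgId.SHUTDOWN) break;
                    System.out.println(m.id() + " -> " + m.payload());
                }
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });

        fastProducer.start(); slowProducer.start(); consumer.start();
        fastProducer.join(); slowProducer.join();
        queue.put(new Message(MsgId.SHUTDOWN, null)); // tell the consumer to stop
        consumer.join();
    }
}

The consumer never has to wait on more than one queue, and each producer can run at whatever rate it likes.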

Similar Messages

  • Business Package on Producer and Consumer in a FPN

    Hi Gurus,
    I need to know if we need to install the same Business Packages on both the producer and the consumer, or is it enough to install the Business Package only on the producer....
    Regards!

    What's your scenario? Typically one would install business packages and other content on the producer (or on multiple producers), federate it with a consumer portal, and provide access via the consumer.
    For example, if you have a box which hosts ESS content, another box which hosts BI content, and a global portal which is the box all the end users will be accessing, then you simply have to install the BP for ESS on the ESS box, create the BI content on the BI box, and display all of this content via the consumer (global portal).
    Hope this helps.
    Good Luck!!
    GLM

  • Performance issues when using AQ notification with one consumer

    We have developed a system to load data from a reservation database into a reporting database.
    At a certain point in the process, a message with the identifier of the reservation is enqueued to a (multi-consumer) queue on the same DB and then propagated to a similar queue on the REP database.
    This queue (multi-consumer) has AQ notification enabled (with one consumer) which calls the queue_callback procedure which
    - dequeues the message
    - calls a procedure to load the Resv data into the Reporting schema (through DB link)
    We need each message to be processed ONLY ONCE, hence the use of a single subscriber (consumer).
    But when load testing our application with multiple threads, the number of records created in the Reservation database becomes quite large, meaning a large number of messages going through the first queue and propagating (very quickly) to the second queue.
    The messages are not processed fast enough by the 2nd queue (notification), which falls behind.
    I would like to keep using notification as processing is automatic (no need to set up dbms_jobs to dequeue etc..) or something similar
    So, having read some articles, I feel I need to use either:
    - multiple subscribers to the 2nd queue, where each message is processed by only one subscriber (using a rule: say 10 subscribers S0 to S9, with Si processing messages whose identifier ends in i).
    The problem with this is that there is an attempt to process the message for each subscriber, isn't there?
    - or a different dequeuing method where many processes are used in parallel, with each message processed by only one subscriber.
    Does anyone have experience and recommendations to make on how to improve throughput of messages?
    Rgds
    Philippe

    Hi, thanks for your interest
    I am working with 10.2.0.4
    My objective is to load a subset of the reservation data from the tables in the first DB (Reservation-OLTP-150 tables)
    to the tables in the second DB (Reporting - about 15 tables at the moment), without affecting performance on the Reservation DB.
    Thus the choice of Advanced Queuing (asynchronous).
    - I have 2 similar queues in 2 separate databases (Reservation AND Reporting)
    The message payload is the same on both (the identifier of the reservation)
    When a certain event happens on the RESERVATION database, I enqueue a message on the first database
    Propagation moves the same message data to the second queue.
    And there I have notification sending the message to a single consumer, which:
    - calls dequeue
    - and the data load procedure, which load this reservation
    My performance difficulties start at the notification but I will post all the relevant code before notification, in case it has an impact.
    - The 2nd queue was created with a script containing the following (similar script for the first queue):
    dbms_aqadm.create_queue_table( queue_table        => '&&CQT_QUEUE_TABLE_NAME',
                                   queue_payload_type => 'RESV_DETAIL',
                                   comment            => 'Report queue table',
                                   multiple_consumers => TRUE,
                                   message_grouping   => DBMS_AQADM.NONE,
                                   compatible         => '10.0.0',
                                   sort_list          => 'ENQ_TIME',
                                   primary_instance   => '0',
                                   secondary_instance => '0');
    dbms_aqadm.create_queue ( queue_name  => '&&CRQ_QUEUE_NAME',
                              queue_table => '&&CRQ_QUEUE_TABLE_NAME',
                              max_retries => 5);
    - ENQUEUING on the first queue (snippet of code)
    o_resv_detail DLEX_AQ_ADMIN.RESV_DETAIL;
    o_resv_detail:= DLEX_AQ_ADMIN.RESV_DETAIL(resvcode, resvhistorysequence);
    DLEX_RESVEVENT_AQ.enqueue_one_message (o_resv_detail);
    where DLEX_RESVEVENT_AQ.enqueue_one_message is :
    PROCEDURE enqueue_one_message (msg IN RESV_DETAIL)
    IS
       enqopt     DBMS_AQ.enqueue_options_t;
       mprop      DBMS_AQ.message_properties_t;
       enq_msgid  dlex_resvevent_aq_admin.msgid_t;
    BEGIN
       DBMS_AQ.enqueue (queue_name         => dlex_resvevent_aq_admin.c_resvevent_queue,
                        enqueue_options    => enqopt,
                        message_properties => mprop,
                        payload            => msg,
                        msgid              => enq_msgid);
    END;
    - PROPAGATION: The message is dequeued from 1st queue and enqueued automatically by AQ propagation into this 2nd queue.
    (using a call to the following 'wrapper' procedure)
    PROCEDURE schedule_propagate (
       src_queue_name IN VARCHAR2,
       destination    IN VARCHAR2 DEFAULT NULL)
    IS
       sprocname dlex_types.procname_t := 'dlex_resvevent_aq_admin.schedule_propagate';
    BEGIN
       DBMS_AQADM.SCHEDULE_PROPAGATION(queue_name  => src_queue_name,
                                       destination => destination,
                                       latency     => 10);
    EXCEPTION
       WHEN OTHERS
       THEN
          DBMS_OUTPUT.put_line (SQLERRM || ' occurred in ' || sprocname);
    END schedule_propagate;
    - For 'NOTIFICATION': ONE subscriber was created using:
    EXECUTE DLEX_REPORT_AQ_ADMIN.add_subscriber('&&STQ_QUEUE_NAME','&&STQ_SUBSCRIBER',NULL,NULL, NULL);
    this is a wrapper procedure that uses:
    DBMS_AQADM.add_subscriber (queue_name => p_queue_name, subscriber => subscriber_agent );
    Then notification is registered with:
    EXECUTE dlex_report_aq_admin.register_notification_action ('&&AQ_SCHEMA','&&REPORT_QUEUE_NAME','&&REPORT_QUEUE_SUBSCRIBER');
    - job_queue_processes is set to 10
    - The callback procedure is as follows
    CREATE OR REPLACE PROCEDURE DLEX_AQ_ADMIN.queue_callback (
       context   RAW,
       reginfo   SYS.AQ$_REG_INFO,
       descr     SYS.AQ$_DESCRIPTOR,
       payload   RAW,
       payloadl  NUMBER)
    IS
       s_procname            CONSTANT VARCHAR2 (40) := UPPER ('queue_callback');
       r_dequeue_options     DBMS_AQ.DEQUEUE_OPTIONS_T;
       r_message_properties  DBMS_AQ.MESSAGE_PROPERTIES_T;
       v_message_handle      RAW(16);
       o_payload             RESV_DETAIL;
    BEGIN
       r_dequeue_options.msgid         := descr.msg_id;
       r_dequeue_options.consumer_name := descr.consumer_name;
       DBMS_AQ.DEQUEUE(
          queue_name         => descr.queue_name,
          dequeue_options    => r_dequeue_options,
          message_properties => r_message_properties,
          payload            => o_payload,
          msgid              => v_message_handle);
       -- Call procedure to load data from the reservation database to the Reporting DB through the DB link
       dlex_report.dlex_data_load.load_reservation
          ( in_resvcode            => o_payload.resv_code,
            in_resvHistorySequence => o_payload.resv_history_sequence );
       COMMIT;
    END queue_callback;
    - I noticed that messages are not taken out of the 2nd queue; I guess I would need to use the REMOVE dequeue option to delete messages from the queue?
    Would this be a large source of performance degradation after just a few thousand messages?
    - The data load through the DB link may be a little intensive, but I feel that doing things in parallel would help.
    I would like to understand whether Oracle has a way of dequeuing in parallel (with or without the use of notification).
    In the case of multiple subscribers with notification, does the 'job_queue_processes' value have an impact on the degree of parallelism? If not, what setting does?
    And is there a way supplied by Oracle to set the queue to notify only one subscriber per message?
    Your advice would be very much appreciated
    Philippe
    Edited by: user528100 on Feb 23, 2009 8:14 AM

  • How do I split a single event with many clips into multiples events, one event per date?

    I archived the video from my AVCHD camera into a Final Cut video archive. Later I imported this into iMovie. All the clips from this archive (spanning several months) are dumped into a single event. If I recall, older versions of iMovie would import video into separate events for each day. Then I would go through and give each event a meaningful name (e.g., "Mary's Birthday").
    This is what I would like. Is there a way to split this very long event into multiple events, one for each day that is present in the event?

    Not automatically, but with iMovie 10 you can create new events with any name you like and then move clips into them from other events (just like FCP 10.1). See:
    http://help.apple.com/imovie/mac/10.0/#mov1d890f00b
    You can also choose to display clips in an event grouped by date.
    Geoff

  • How do I update a waveform chart with multiple plots one point at a time?

    Hi,
    I am updating multiple plots one point at a time. That is, every time I update the chart I have a value for only one of the plots. If I put in a cluster with NaN for each of the plots except the one with new data, I only get individual points on the chart. I would rather have them connected.
    Any clues?  The only thing I can think of is to delay the update and do interpolation between samples - too complicated for me.
    Thanks

    Are the points within a given plot equally spaced in time? If so, perhaps you need to maintain your own history of the data and use a waveform graph, rewriting the history data to the graph each time. If the points are unequally spaced, then you will need to do it as an XY graph.

  • Uploading multiple items one at a time - file view resets

    In my work, I need to upload many files, one at a time, using an interface through the browser.
    When I go to upload multiple files, one at a time, using 10.6.2, the Finder takes me back one or two levels down from the file location between uploads. In 10.4, the Finder always took me back to the folder last accessed.
    Is there a Preference, Shareware like TinkerTool, or a Terminal Command I can use so that I always go back to the folder last accessed?

    OK. Wow. I did not realize the question was so obtuse. Feel free to ask for clarification (no reason to be nasty or flame me).
    "what mechanism is being used"
    In my original post, I did say I was uploading via a browser. So I don't know if this clarifies for you or not.
    Using Firefox or Safari.
    Uploading photos, one at a time, to a website that uses the browser (rather than FTP).
    In OS 10.4.11, after each photo, I would upload a second one and the dialog box would open up to the same folder as the previous upload, greatly simplifying the process.
    In OS 10.6.3, after each photo, I would go to upload a second photo and the dialog box would open one level down, or sometimes seemingly randomly, two levels down. This requires me to navigate back to the folder with the photos which greatly increases the amount of time between each upload.
    It would be helpful if I could show photos to explain.
    I am not certain that this really clarifies my original posting. Does this make sense or do you need more info?

  • Fastest way to add multiple images, one after the other, in DW5.5

    There has to be a faster way to add multiple images, one after the other, to a web page rather than just dragging each one from the Files tab within DW 5.5.
    I was a little surprised I couldn't select multiple images, drag them as a block, and have DW just insert img tags with empty alts and the source paths, but maybe there are some options I just haven't found?
    I tried using Bridge to generate a quick HTML slideshow/gallery but it creates scads of divs and such... I just have to update an older site and was looking for an efficient way to add a bunch of images (random names, ugh, so just copying/pasting one and changing the names is still more manual) that lie next to each other, no breaks in between... it seems an ideal situation for automation.
    I'd settle for an external script or app or something because they come in blocks of 25 approx and I have all my actions in PS set up to size appropriately which is nice so I just wondered if there was something good for this low-level production task.
    Thanks

    Thanks. Yes, I started writing a script but I just wondered if there were hidden gems in DW....so many programs do more than they are intended to do nowadays, I had to ask people who use it more!
    Actually, the fastest way was to just dupe one line of code, paste it x times, and then use the pick whip to point and drag. That at least cut the menu out of the equation.
    I thought I remembered doing a batch insert years ago... maybe in GoLive, or before Adobe had DW? DW and GoLive are the only GUI HTML editors I've ever used... or maybe it was just something out of an earlier Photoshop... they've taken good stuff out and put good stuff in so many times that I can only remember so much!

  • Bursting - OUTPUT sends one big report, instead of multiple individual ones

    Bursting - OUTPUT sends one big report instead of multiple individual ones:
    The data file below shows that the report has the proper tags for PP_NUMBER, which should split the job into individual files to be emailed; instead it creates one large file and emails it to the last email account.
    Please help!
    <BCG_PPINVOICE_UK_ALL>
    - <LIST_G_PP_NUMBER>
    - <G_PP_NUMBER>
    <PP_NUMBER>PP100</PP_NUMBER>
    + <LIST_G_INVOICE>
    <PP_TOTAL>-1965</PP_TOTAL>
    <AMOUNT_BEFORE_START>0</AMOUNT_BEFORE_START>
    <PP_INVOICE_CONTACT>HPA - Michael Palmer</PP_INVOICE_CONTACT>
    <ADDRESS1>Porton Down</ADDRESS1>
    <ADDRESS2 />
    <ADDRESS3 />
    <ADDRESS4 />
    <CITY>Salisbury</CITY>
    <COUNTRY>United Kingdom</COUNTRY>
    <EMAIL_ADDRESS>[email protected]</EMAIL_ADDRESS>
    </G_PP_NUMBER>
    - <G_PP_NUMBER>
    <PP_NUMBER>PP101</PP_NUMBER>
    + <LIST_G_INVOICE>
    <PP_TOTAL>0</PP_TOTAL>
    <AMOUNT_BEFORE_START>0</AMOUNT_BEFORE_START>
    <PP_INVOICE_CONTACT>Imperial College - Prof Wells</PP_INVOICE_CONTACT>
    <ADDRESS1>Level 3</ADDRESS1>
    <ADDRESS2>Sherfield Building</ADDRESS2>
    <ADDRESS3 />
    <ADDRESS4 />
    <CITY>London</CITY>
    <COUNTRY>United Kingdom</COUNTRY>
    <EMAIL_ADDRESS>[email protected]</EMAIL_ADDRESS>
    </G_PP_NUMBER>
    - <G_PP_NUMBER>
    <PP_NUMBER>PP102</PP_NUMBER>

    This is E-Business - XML Publisher (not Enterprise).
    My Control file is as follows:
    <xapi:requestset type="bursting">
    <xapi:request select="/BCG_PPINVOICE_UKB_ALL/LIST_G_PP_NUMBER/G_PP_NUMBER/PP_NUMBER">
    <xapi:delivery>
    <xapi:email id="/G_PP_NUMBER/${PP_NUMBER}" server="localhost" port="25" from="[email protected]" reply-to="[email protected]">
    <xapi:message id="/G_PP_NUMBER/${PP_NUMBER}" to="${EMAIL_ADDRESS}" attachment="true" subject="Becmanc Coulter Genomics - Prepaid Account Balance ">Dear Prepaid Account holder Please find attached the current statement of your account. If you have any queries regarding the statement, or wish to make a top up of the account please e-mail me directly at [email protected] Kind regards UK Finance Team</xapi:message>
    </xapi:email>
    </xapi:delivery>
    <xapi:document output="PP_STATEMENT_${PP_NUMBER}" output-type="pdf" delivery="/G_PP_NUMBER/${PP_NUMBER}">
    <xapi:template type="rtf" locale="" location="xdo://XBOL.BCG_PPINVOICE_UKB.en.US/?getSource=true"/>
    </xapi:document>
    </xapi:request>
    </xapi:requestset>
    Edited by: user9015277 on Sep 3, 2010 10:29 AM

  • Using a queue with only one consumer

    Hello,
              I would like to know how I can setup a system with a JMS Queue and only one
              consumer.
              I think I can create a MDB and specify in the DD that I want only one
              instance of the EJB so that I can have only one consumer.
              Can anyone confirm this assertion ?
              regards,
              Dom
              

              Dominique Jean-Prost wrote:
              > Hello,
              >
              >
              > I would like to know how I can setup a system with a JMS Queue and only one
              > consumer.
              >
              > I think I can create a MDB and specify in the DD that I want only one
              > instance of the EJB so that I can have only one consumer.
              >
              > Can anyone confirm this assertion ?
              Yes.
              For more information on configuring MDB concurrency see
              the JMS Performance Guide white-paper here:
              http://dev2dev.bea.com/technologies/jms/index.jsp
              >
              > regards,
              >
              > Dom
              >
              >
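    The MDB-with-one-instance approach above works (in WebLogic, MDB concurrency is typically capped via the max-beans-in-free-pool element in weblogic-ejb-jar.xml). As an alternative illustration, a standalone JMS client gets the same effect simply by opening a single consumer on the queue; a minimal JMS 1.1 sketch, with hypothetical JNDI names:

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.JMSException;
    import javax.jms.MessageConsumer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    public class SingleConsumerClient {
        public static void main(String[] args) throws Exception {
            InitialContext ctx = new InitialContext(); // provider URL/credentials via jndi.properties

            // Hypothetical JNDI names - use whatever your JMS module actually defines.
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/MyConnectionFactory");
            Queue queue = (Queue) ctx.lookup("jms/MyQueue");

            Connection connection = cf.createConnection();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

            // Exactly one consumer: the provider delivers each queue message to this
            // consumer only, and onMessage() runs serially for the session.
            MessageConsumer consumer = session.createConsumer(queue);
            consumer.setMessageListener(message -> {
                try {
                    if (message instanceof TextMessage) {
                        System.out.println("Got: " + ((TextMessage) message).getText());
                    }
                } catch (JMSException e) {
                    e.printStackTrace();
                }
            });

            connection.start(); // begin delivery
        }
    }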
              

  • Producer and Consumer Issue

    Hi,
    Do we need to consume all the portles of the producer from portal admin console?
    or is there a way to use the .Portal file directly as all the portlets of the producer are consumed already?
    Edited by: user8894463 on Dec 15, 2009 2:34 PM

    Hi
    1. In 10.x (not sure about 9.x), from the Workshop IDE you can create standalone Books and Pages that can be consumed as a whole on the consumer side. From the IDE, right-click the project and select New -> Other -> expand Weblogic Portal, and you should see "Book" and "Page". Once this is done, you can add all your portlets to them. Now on the consumer side, when you register the producer, you do see these Books and Pages, so if you consume the full Book or Page you get all of its portlets as well. This is very handy compared with creating and consuming "n" portlets on the consumer manually. Also, if your consumer portal is created from the Workshop IDE, you can choose the same menu option and select "Remote Book" or "Remote Page", which asks for the remote producer URL and consumes that entire book or page. The key point is that on the producer side you do not create these Books and Pages in the .portal file; instead use the first option above (Create New -> Other -> Book or Page).
    2. By default, when you register a producer, it just shows the list of all portlets from the producer. It is up to you on the consumer side which portlets to consume, creating the corresponding proxy portlets. By default, all portlets created from WLP are remotely consumable.
    The concept of creating standalone Books and Pages and consuming them as a whole on Consumer side is really cool.
    HTH
    Ravi Jegga
    Edited by: Ravi Jegga on Dec 15, 2009 6:25 PM

  • Multiple parameters one row for each record

    I have this query, and I want to allow the user to enter multiple MATL_CODE_MOD values as parameters. The problem is that the GURMAIL table has multiple records, one for each GURMAIL_MATL_CODE_MOD, so it will give me duplicates. How can I code this so that the user can enter multiple MATL_CODE_MOD codes and I still get only one row for each record?
    and GURMAIL_MATL_CODE_MOD in ('8IBC','8IBD')
    select
    spriden_id,        
    spbpers_name_suffix   name_suffix,
    spbpers_name_prefix   name_prefix,
    spriden_last_name     last_name,
    spriden_first_name    first_name,
    spriden_mi            mi,
    srbrecr_term_code,
    srbrecr_majr_code,
    srbrecr_program_1,
    saturn_midd.utlq.f_matl_code_type(srbrecr_pidm)
    FROM
    saturn.srbrecr,
    saturn.spriden,
    saturn.spbpers,
    SATURN.SPRADDR,
    general.gurmail
    WHERE
    spriden_pidm = srbrecr_pidm
    and gurmail_pidm = spriden_pidm
    AND spriden_pidm = spraddr_pidm 
    --and srbrecr_term_code = decode(p_term  ,'%',SRBRECR_TERM_CODE,p_term)
    and spbpers_pidm = spriden_pidm
    and spriden_change_ind is null
    --and gurmail_matl_code_mod
    and SRBRECR_PROGRAM_1 = 'CMQ'
    and GURMAIL_MATL_CODE_MOD in ('8IBC','8IBD')
    --AND SPRADDR_ATYP_CODE = 'CM'
    AND SPRADDR_STREET_LINE1 IS NOT NULL;

    Hi,
    SELECT DISTINCT ...will keep duplicate rows out of the result set.
    Depending on your data, another possibility might be to not join the table that contains GURMAIL_MATL_CODE_MOD. Do an EXISTS sub-query on that table instead.
    If you need more help, post some sample data from all the tables, and the results you want from that data.
    Simplify, if possible. You can probably post a problem with 2 or 3 tables, each with 2 or 3 columns, that will have the same answer.
    Edited by: Frank Kulash on Apr 22, 2009 2:28 PM

  • Send data from producer to consumer

    Hi,
    I am trying out a sample project with WebLogic as both producer and consumer. I could send data from the consumer to the producer using interceptors. Now, my requirement is to send data from the producer back to the consumer. I can do this using SimpleStateHolder, but I want a generic way, like passing information in cookies/HTTP request headers from the producer, etc., and retrieving it at the consumer. This is because in my actual implementation the producer will be WebSphere.
    So, please can anyone tell me how to send data from producer to consumer ???
    Thanks,
    Anu

    Hello Anu,
    I believe WLP 10.2 requires some patches to get the consumer
    interceptor to properly access cookies coming from the producer, so
    that is probably why you aren't seeing the cookies. After reading your
    use-case, I don't think you will need to get these patches to get your
    use-case to work, but if you are still interested in the patches, I can
    find out the details for you.
    The reason the redirect code you posted isn't working is because the
    response has already been committed during the getMarkup operations and
    it is too late to redirect the page to a different URL, so the redirect
    is being ignored.
    The good news is that the functionality you want should all be fairly simple to implement.
    When the user opens the remote portlet, makes a selection and then
    submits it, the first thing that will happen is a WSRP
    BlockingInteraction call to the producer, to let the remote portlet
    know that a user has interacted with it. This happens before the
    GetMarkup operation, during a time when it is still legal to redirect
    to a different URL. In fact, the portlet on the producer is allowed to
    send a response indicating that the page should be redirected to a
    different URL.
    So in your remote portlet, you can have it look at the form values that
    were submitted during the BlockingInteraction call, and if the user
    selected the particular value that should be redirected to another
    portlet, the remote portlet can request a redirect. The only problem
    here is that your remote portlet doesn't know the URL to the portlet it
    wants to redirect to on the consumer side, but you can handle that in
    an interceptor on the consumer.
    So, rather than implementing the IGetMarkupInterceptor, use the IBlockingInteractionInterceptor. For example:
    public class SampleInterceptor implements IBlockingInteractionInterceptor
    {
        // Other methods need to be implemented to do nothing...
        public Status.PostInvoke postInvoke(IBlockingInteractionRequestContext requestContext,
                                            IBlockingInteractionResponseContext responseContext)
        {
            String redirectUrl = responseContext.getRedirectURL();
            if (redirectUrl != null)
            {
                // The producer portlet wants to redirect - substitute the right consumer URL
                PageURL pageUrl =
                    PageURL.createPageURL(requestContext.getHttpServletRequest(),
                                          requestContext.getHttpServletResponse(),
                                          "voipTrunk_portal_page_11_page_12_page_13");
                responseContext.setRedirectURL(pageUrl.toString());
            }
            // ... then return the appropriate Status.PostInvoke value to continue the chain
        }
    }
    This should be all that you need to do on the consumer side. It will
    automatically redirect for you to the URL you set in the interceptor,
    since the producer portlet requested a redirect.
    On the producer side, you will need to have the portlet send the
    redirect request during the BlockingInteraction operation. How you do
    this depends on what portlet type you are using and what producer you
    are using. For example, in WLP using a JSP portlet, you would need to
    use a backing file on the portlet, and have that class implement the
    JspBacking class:
    http://edocs.beasys.com/wlp/docs102/javadoc/com/bea/netuix/servlets/controls/content/backing/JspBacking.html
    Then, in the handlePostbackData method you would look for the special value and redirect if it exists, such as:
    public boolean handlePostbackData(HttpServletRequest request, HttpServletResponse response)
    {
        String paramValue = request.getParameter("paramName");
        if ((paramValue != null) && paramValue.equals("specialValue"))
        {
            // Need to send a redirect request
            PortletBackingContext pbc = PortletBackingContext.getPortletBackingContext(request);
            pbc.sendRedirect("http://anyUrlWillWork");
            return(true);
        }
        return(false);
    }
    Since the consumer interceptor is changing the redirect URL, any
    redirect URL the producer sends will work- it just needs to look like a
    valid, absolute URL to pass some simple checks on the producer.
    Backing files are documented here: http://e-docs.bea.com/wlp/docs102/portlets/building.html#wp1077130
    As I mentioned before, different portlet types and producers would do
    this differently. For example, I don't think WebSphere has backing
    files, and in JSR168 portlets a backing file is not needed- you would
    do the equivalent code in the JSR168 portlet's processAction() method
    (using javax.portlet.ActionResponse.sendRedirect(String URL) to send
    the redirect request). JSR168 portlets should work in both WLP and
    WebSphere, but I don't know the details of the portlet type you want to
    use on the WebSphere producer; if it isn't a JSR168 portlet, WebSphere
    must have some way equivalent to the backing file's handlePostbackData
    method to participate in a WSRP BlockingInteraction operation and
    request a redirect.
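    For the JSR168 route just described, a minimal sketch might look like the following (the portlet class and parameter names are made up for illustration):

    import java.io.IOException;
    import javax.portlet.ActionRequest;
    import javax.portlet.ActionResponse;
    import javax.portlet.GenericPortlet;
    import javax.portlet.PortletException;

    // Hypothetical JSR168 portlet: processAction() is the counterpart of the
    // backing file's handlePostbackData(), and it runs during the WSRP
    // BlockingInteraction operation, when a redirect is still allowed.
    public class RedirectingPortlet extends GenericPortlet
    {
        @Override
        public void processAction(ActionRequest request, ActionResponse response)
                throws PortletException, IOException
        {
            String paramValue = request.getParameter("paramName");
            if ("specialValue".equals(paramValue))
            {
                // Any valid absolute URL works; the consumer-side interceptor rewrites it.
                response.sendRedirect("http://anyUrlWillWork");
            }
        }
    }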
    No cookies or headers are required though, so I don't think you would
    need the patches to WLP 10.2 for the interceptor dealing with cookies.
    Hope this helps,
    Kevin

  • Drawback of putting producer and consumer in one loop. [Ethernet IP]

    The "Create Assembly Instance" exapmple vi have two separte loops. One for the input (producing data) and the other for the output (consuming data). Would it be possible to combine everything into one loop? Are there any drawbacks to using one loop for the input and the output?
    and i know one drawbacks will be that both consumer and producer will have to have the same rate.. 

    That's an interesting piece of code.  It's not so much a producer/consumer as it is two separate loops handling input and output.  Take a look at this link to see what producer/consumer is: http://www.ni.com/white-paper/3023/en/  You can also find the code by going to File->New and opening the "From Template" folder under VI.
    What controls your state machine? With events not being possible on an RT system, I'd expect you'd have some form of polling. Polling would still use a time period between polls. Granted, the network requirement would definitely break determinism, as you've already noted. If you're doing this, why not put it into a timed loop? Timed loops aren't always deterministic. That's why there is a "Finished Late?" terminal. It's not the loop that makes the code deterministic, it's the way you put your program together. By moving it into a timed loop, you gain the ability to give it a priority. I'll explain why this is important in a minute.
    You're welcome to combine the two into a single loop if you still meet your timing requirements.  That's a design choice that is up to you.  I don't know what your other Ethernet/IP "stuffs" is, but I'd likely combine this into my output loop if possible.  I'm assuming it has something to do with the data you care to send.
    The overhead from the loop isn't enough to worry about the workload on the CPU. Ultimately, the code within the loop determines how rough you're being on the CPU. That's true in one loop or in four loops. Splitting code into multiple loops just lets you prioritize code. If everything is in a single loop, it all must run before the next iteration. If your code is split into ten loops, as an example, only what is inside each loop must run on that loop's iteration. Using priority, as you've mentioned, ensures you determine which loop runs first. Let's say that loop completes and gives the other loops time to run on the CPU. Before they complete, the loop wants to run again. The CPU will go back to that loop and run it. By splitting the loops up, you've ensured this piece of your code will run even if the CPU can't handle processing all of the code within that period. Rather than hurting determinism, you've aided it. The parts of your code that you aren't worried about being deterministic happen when the CPU has time for them. The parts that you NEED to be deterministic happen deterministically and push those other parts of the code out of the way.
    Looking at the example code you're showing, I'd really want to know what it is you plan to do with the code and what you need to be deterministic.  I'd assume you plan to read the data, process it, and send a corresponding output.  If you need ALL of this to be deterministic, I'd put it within a single loop or use queues to send data from the input loop (commented as consumer) to the output loop (commented as producer).  This decision would really just depend on how fast you care to acquire data and how deterministic you desire the output to be.  Without using the queues, you create something called "race conditions."  When you send an output, is that related to the newest input or one before it?  You simply cannot tell.  
    Jeff B.
    Applications Engineer
    National Instruments
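    To make the race condition Jeff mentions concrete outside of LabVIEW, here is a rough Java sketch (all names invented): the input loop passes every sample through a queue, so each output is computed from a known input rather than from whatever value a shared variable happens to hold at that moment.

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class InputOutputLoops {
        // Each sample carries its index, so the output loop knows exactly which input it is acting on.
        record Sample(int index, double value) {}

        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<Sample> queue = new LinkedBlockingQueue<>();

            // "Input loop": acquires data (simulated here) and enqueues it.
            Thread inputLoop = new Thread(() -> {
                try {
                    for (int i = 0; i < 10; i++) {
                        queue.put(new Sample(i, Math.sin(i / 10.0)));
                        Thread.sleep(50); // simulated acquisition period
                    }
                    queue.put(new Sample(-1, 0.0)); // sentinel: no more data
                } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            });

            // "Output loop": dequeues each sample and writes the corresponding output.
            // Output i is always derived from input i - no ambiguity about which reading it matches.
            Thread outputLoop = new Thread(() -> {
                try {
                    while (true) {
                        Sample s = queue.take();
                        if (s.index() < 0) break;
                        System.out.printf("output for sample %d = %.3f%n", s.index(), 2.0 * s.value());
                    }
                } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            });

            inputLoop.start();
            outputLoop.start();
            inputLoop.join();
            outputLoop.join();
        }
    }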

  • One producer two consumer loops, dequeue same element in both consumers

    Hi!
    What is the best way for the following:
    I enqueue data in my producer loop. I need this data dequeued in both of my consumer loops, but I want to dequeue the same element in both loops.
    Of course, if I put a dequeue in both loops, then the second consumer loop will lose the odd elements and the first consumer loop will lose the even elements.
    thanks!

    Blokk wrote:
    Hi!
    What is the best way for the following:
    I enqueue data in my producer loop. I need this data dequeued in both of my consumer loops, but I want to dequeue the same element in both loops.
    Of course, if I put a dequeue in both loops, then the second consumer loop will lose the odd elements and the first consumer loop will lose the even elements.
    thanks!
    This makes very little sense - that is, the problem is stated in such a way as to preclude an informative response.
    Gerd wrote:
    either create two queues or use notifiers...
    Best regards,
    GerdW
    Usually Gerd gives good advice, but on this one I'm going to pick on him just a bit - I bet he rushed in just a tad without thinking about the premise of the OP's question - and Gerd, I don't mean to sound mean, my apologies.
    The question presupposes that to use a queue element twice in parallel it must be read twice. This is false and led to less than optimal advice. What about a template like the one shown in this snippet?
    We can certainly dequeue once and spawn as many independent actions as we need within a single consumer loop! Much more scalable than creating a queue for each action.
    Jeff
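    The dequeue-once idea is easy to picture in any language. As a rough Java sketch (names are made up), a single consumer takes each element off the queue exactly once and then hands the same value to as many independent handlers as needed:

    import java.util.List;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.function.Consumer;

    public class FanOutConsumer {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<Double> queue = new LinkedBlockingQueue<>();

            // Two independent "actions" that both need every element.
            List<Consumer<Double>> handlers = List.of(
                v -> System.out.println("logger saw " + v),
                v -> System.out.println("display saw " + v)
            );

            // Producer loop: enqueue a few values, then a NaN sentinel to stop.
            Thread producer = new Thread(() -> {
                try {
                    for (int i = 1; i <= 5; i++) queue.put((double) i);
                    queue.put(Double.NaN);
                } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            });

            // Single consumer loop: each element is dequeued exactly once and
            // passed to every handler, so neither action "loses" any elements.
            Thread consumer = new Thread(() -> {
                try {
                    while (true) {
                        double v = queue.take();
                        if (Double.isNaN(v)) break;
                        handlers.forEach(h -> h.accept(v));
                    }
                } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            });

            producer.start();
            consumer.start();
            producer.join();
            consumer.join();
        }
    }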

  • ODSEE 7 - 11.1.1.3.0 Replication from multiple masters(not multimaster) into one consumer (into same DIT)

    Hi All,
    Would you be able to help me regarding a replication question?
    We have an existing LDAP topology where we maintain masters and consumers.
    We have a request to expose (if it is possible) additional suffixes in the current DIT on the consumer side.
    Here is the situation :
    What do you think? Is it possible to do it this way?
    The goal is to get the objects from ou=europe and ou=us and from ou=company as well when the search is on the ou=company,dc=example,dc=com with scope =2 (subtree)
    Thank you for your help
    regards
    Laszlo

    Hi Laszlo,
    thank you for the additional clarification; in that scenario, adding the two sub-suffixes and creating the replication from the other masters (ou=europe and ou=us) shouldn't be an issue, as long as you have created the same structure on the other masters as well.
    Basically, you could have on all the masters (company, europe and us) the root suffix, which will always be ou=company,dc=example,dc=com; on the "europe" and "us" directories it will be just a kind of empty placeholder, whereas on the "company" directories it will be fully populated:
    Master "Company" 1 - root suffix: ou=company,dc=example,dc=com                [This sub-suffix will contain the data and will be replicated]
    Master "Company" 2 - root suffix: ou=company,dc=example,dc=com                [This sub-suffix will contain the data and will be replicated]
    Master "Europe" 1 - root suffix: ou=company,dc=example,dc=com                    [This suffix will remain mostly empty and not replicated]
    Master "Europe" 1 - sub-suffix: ou=europe,ou=company,dc=example,dc=com    [This sub-suffix will contain the data and will be replicated]
    Master "Europe" 2 - root suffix: ou=company,dc=example,dc=com                    [This suffix will remain mostly empty and not replicated]
    Master "Europe" 2 - sub-suffix: ou=europe,ou=company,dc=example,dc=com    [This sub-suffix will contain the data and will be replicated]
    Master "US" 1 - root suffix: ou=company,dc=example,dc=com                   [This suffix will remain mostly empty and not replicated]
    Master "US" 1 - sub-suffix: ou=us,ou=company,dc=example,dc=com          [This sub-suffix will contain the data and will be replicated]
    Master "US" 2 - root suffix: ou=company,dc=example,dc=com                   [This suffix will remain mostly empty and not replicated]
    Master "US" 2 - sub-suffix: ou=us,ou=company,dc=example,dc=com          [This sub-suffix will contain the data and will be replicated]
    Replication:
    ou=company,dc=example,dc=com:
    msco1 <---MMR--> msco2
    msco1 ---> cons01, 02, ... 16
    msco2 ---> cons01, 02, ... 16
    ou=europe,ou=company,dc=example,dc=com
    mseu1 <---MMR--> mseu2
    mseu1 ---> cons01, 02, ... 16
    mseu2 ---> cons01, 02, ... 16
    ou=us,ou=company,dc=example,dc=com
    msus1 <---MMR--> msus2
    msus1 ---> cons01, 02, ... 16
    msus2 ---> cons01, 02, ... 16
    HTH,
    marco
