Reading data from a UIBB reused in multiple tabs in OIF

Hello experts,
I have an application using OIF with three tabs. A specific UIBB needs to be embedded in all three tabs. The UIBB has an input field that has to be filled by the user.
Now I need to read that field from all three instances of this UIBB, one per tab, through FPM.
I was able to do it by creating three different usages of the UIBB in another WD component: I could read the data by mapping the context node of each usage to three separate nodes created in the main application.
But I have no idea how to do it using FPM. Please help.
Thanks in Advance...
Satyajit

Hello Indu,
Thanks for the quick response....
Firstly, even though it is the same UIBB, is the data different in each UIBB?
The UIBB has a user input field... yes, it can be different.
If yes, you need three interface views, one for each UIBB.
I don't want to go for this, as I'm embedding the window of a specific WD component as a UIBB and we don't want to replicate it three times with different contexts.
It can also be done with a single interface view and three nodes, where you have to bind the appropriate node to the UIBB as soon as it becomes active. A little tedious.
Could you please elaborate on how to do this in FPM? I just need to get the data from the three instances in the three tabs.
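In case it helps, here is one way to approach this in FPM. This is only a minimal sketch, not standard FPM functionality: the buffer class ZCL_UIBB_VALUE_BUFFER, the node/attribute names 'INPUT'/'VALUE' and the MV_TAB_ID attribute are all hypothetical. The idea is that every usage of the UIBB pushes its input value into a small shared buffer, keyed by the tab it belongs to, whenever FPM calls the UIBB's FLUSH method; the main application then reads all three values from that buffer.

" Hypothetical helper: a tiny static buffer, keyed by tab id
CLASS zcl_uibb_value_buffer DEFINITION.
  PUBLIC SECTION.
    CLASS-METHODS set_value IMPORTING iv_tab_id TYPE string
                                      iv_value  TYPE string.
    CLASS-METHODS get_value IMPORTING iv_tab_id TYPE string
                            RETURNING VALUE(rv_value) TYPE string.
  PRIVATE SECTION.
    TYPES: BEGIN OF ty_entry,
             tab_id      TYPE string,
             field_value TYPE string,
           END OF ty_entry.
    CLASS-DATA gt_entries TYPE SORTED TABLE OF ty_entry WITH UNIQUE KEY tab_id.
ENDCLASS.

CLASS zcl_uibb_value_buffer IMPLEMENTATION.
  METHOD set_value.
    DATA ls_entry TYPE ty_entry.
    DELETE gt_entries WHERE tab_id = iv_tab_id.
    ls_entry-tab_id      = iv_tab_id.
    ls_entry-field_value = iv_value.
    INSERT ls_entry INTO TABLE gt_entries.
  ENDMETHOD.

  METHOD get_value.
    DATA ls_entry TYPE ty_entry.
    READ TABLE gt_entries INTO ls_entry WITH TABLE KEY tab_id = iv_tab_id.
    rv_value = ls_entry-field_value.
  ENDMETHOD.
ENDCLASS.

" In the reused UIBB, e.g. in the FLUSH method of the Web Dynpro
" interface IF_FPM_UI_BUILDING_BLOCK (sketch; 'INPUT'/'VALUE' stand for
" your own context node and attribute, MV_TAB_ID for whatever
" identifies the usage, e.g. a configuration/application parameter):
METHOD flush.
  DATA: lo_node  TYPE REF TO if_wd_context_node,
        lv_value TYPE string.
  lo_node = wd_context->get_child_node( name = 'INPUT' ).
  lo_node->get_attribute( EXPORTING name  = 'VALUE'
                          IMPORTING value = lv_value ).
  zcl_uibb_value_buffer=>set_value( iv_tab_id = mv_tab_id
                                    iv_value  = lv_value ).
ENDMETHOD.

" The embedding application can then read each instance back, e.g.:
DATA lv_tab1_value TYPE string.
lv_tab1_value = zcl_uibb_value_buffer=>get_value( iv_tab_id = 'TAB1' ).

Whether a static helper class like this or the FPM shared data concept is the cleaner option depends on your application; treat the above only as a starting point.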

Similar Messages

  • Best way to control and read data from multiple instruments?

    Hello,
    I'm building an application to test power supplies involving multiple pieces of equipment over a couple of different comm buses. The application will also send control instructions to some of the instruments and read data from some of the instruments. The reading and control profiles will not run on the same schedule (variable-length control steps, configurable read interval).
    I was thinking of using a queued state machine (producer/consumer) for the control profile and another to read the data, but I was concerned that there would be collisions between sending control commands and read commands to the same instrument. Is there a suggested design pattern for implementing something like this?
    Timing of the commands isn't critical down to the millisecond, but I need to collect reasonably accurate timestamps when the data is read. The same is true for the control commands.
    Here are the instruments I'm going to use, whether each is used for control, reading, or both, and the communication method:
    Instrument | Function | Comm method
    Power Supply | read data | communicates via PMBus adapter
    PMBus to USB Adapter | read data | USB (non-VISA)
    Switch | control relays | USB (VISA)
    Power Dist. Unit | read data / control outlets | SNMP (Ethernet)
    Electronic Load | read data / control load | GPIB (VISA)
    Thermal Chamber | read data / control temp | Ethernet (VISA)
    Thanks,
    Simon

    Hello, there is a template in LabVIEW called "Continuous Measurement and Logging".
    It can give you some idea of how to properly decouple the "GUI event handler" top loop (where your Event structure is) from the DAQ and other loops.
    You do not need to replicate the example exactly, but you can find nice tricks in it which can help you at certain points.
    The second loop from the top is the "UI message loop". It handles the commands coming from the top loop and, depending on the local state machine and other possible conditions and states, it can command the other loops below.
    During a normal run the different instrument loops just do the data reading, but if you send a control command from the top loop to a certain instrument loop (use a separate queue for every instrument loop), that loop will dequeue the control command, execute it, and go back to data-reading mode (in data-reading mode the loop automatically enqueues a "data read" command to itself). This way the control and data-read operations happen serially, not in parallel, which is the collision you want to avoid (some instrument drivers can even handle parallel access, but it will not happen truly in parallel anyway...).
    In every instrument loop, when you read a value, attach a timestamp to it and send this timestamp/value pair to the data-logging loop via a queue. You can use a cluster containing 3 items: source (instrument), timestamp, and value. All the instrument loops can use the same "data logging" queue. In the data-logging while loop, after dequeuing, you can unbundle the cluster and save the timestamp and data value into the required channel (a different channel for every instrument) of a TDMS file.
    (Important: NEVER use 2 Event structures in the same top level "main" VI!)

  • Read data from multiple sheets

    Hi,
    Can anyone tell me how to read data from multiple excel sheets?
    Thanks.

    Hi,
    Go through this link, it may help you:
    https://www.sdn.sap.com/irj/scn/wiki?path=/display/snippets/readmultiplesheetsofanExcelfileintoSAPthroughABAP&focusedCommentId=92930268
    Thanks

  • How do I read data from a DMM or DC Power Supply at a specified rate?

    I have a PXI system with 4071 DMMs and 4110 DC Power Supplies. I want to be able to measure the power consumption of my DUT as it performs various operations. This is what my process flow looks like:
    1. Configure DMM and Power Supply 
    2. Wait for DUT to go into a certain mode
    3. Start acquiring data from the DMM and PS.
    4. Wait for DUT to get out of this mode
    5. Stop acquiring data from the DMM and PS.
    Since the duration of step #4 is uncertain, I cannot tell the units to collect a predetermined number of samples and give it back to me. Instead I have to take readings as long as it's required. 
    I'm attaching screenshots of my setup and read process. The reading is done inside a timed loop which is running at 1kHz.
    The problem is that using niDMM Read Multi-Point or niDCPower Measure Multiple takes 5-9ms before I get a reading, so I'm not really getting 1ms resolution in my data. I'm sure there are other folks who have had the need to read data from these devices with better time-resolution, so if anyone can point me in the correct direction, it would be great. I'm pretty sure I'm not doing the correct thing here.
    Attachments:
    1-Setup.PNG (12 KB)
    2-Read.PNG (9 KB)

    What is really confusing to me is why you are even using the multi-point function when you set the sample count to 1. If you want multiple points, request multiple samples and let the DMM acquire them at a rate you specify. If you want a single sample, use the normal Read. You also need to pay attention to how fast the DMM can acquire. If you want the fastest sample rates, you might be a lot better off with a DAQ board.

  • Reading data from input stream on unix

    I have a program that reads data from the input stream of a socket. If the data is over 1500 bytes it is sent in multiple TCP packets. What's weird is that if I run the program in a Windows environment it waits till it receives all the packets, but when I run the same program in a unix environment it only reads the first packet and goes on without waiting for the remaining TCP packets!
    The line that reads from the input stream is
    datalen = inStr.read(byteBuffer);
    Is there any way I can make it wait till it receives all the packets on the unix system? I do not understand why it works fine on Windows in this case but not on unix.
    I'd appreciate any help.
    Thanks

    Try using a DataInputStream with the readFully() method. readFully() blocks until the requested number of bytes has been read; a plain read() on a TCP stream may return after only part of the data has arrived, which is what you are seeing.

  • Reading Data from CSV file

    Hi Guys,
    I am trying to read data from a CSV file character by character. What's the best way to do this? Any examples around?
    Thanks
    tzaf

    Does this mean your file will have multiple lines? And each line would indicate a new record? If so, you should use the BufferedReader and take in each line as a String. Then you can use the StringTokenizer to separate your string into tokens using the comma as your delimiter. From there you can convert the string tokens into whatever form you like, but by default they are already in String form. To make it an Integer I'd use Integer.parseInt().
    I will show you some code on how to get the values from your file using the BufferedReader and the StringTokenizer but what you do with those values afterwards I'm going to leave up to you.
    // requires: import java.io.*; import java.util.StringTokenizer;
    File csvfile = new File("myfile.csv");
    try
    {
         BufferedReader fileIn = new BufferedReader( new FileReader( csvfile ));
         String readLine;     // stores a line from the file as a string
         while( (readLine = fileIn.readLine()) != null )
         {
              StringTokenizer tokens = new StringTokenizer( readLine, ",", false );
              // false means you don't want the commas returned as tokens, only the values
              while( tokens.hasMoreTokens() )
              {
                   String aValue = tokens.nextToken();
                   // ... do what you want with the value
                   // ... convert to Integer or whatever
                   System.out.println( aValue ); // printing the value to the screen
              }
         }
         fileIn.close();
    }
    catch( IOException e )
    {
         e.printStackTrace();
    }
    Good luck,
    .kim

  • Do I have to use lock when I am reading data from a table

    Hi,
    When I am reading data from a table, do I have to set a lock on that table?
    Is it necessary to set a lock on a table when I am only reading data from it?
    When I am updating the table, do I have to set a lock on the table?
    If yes, then what sort of lock: read lock, write lock, or shared lock?
    Regards,
    Sushanth H.S.

    Check it out:
    Lock objects are used in SAP to avoid inconsistency while data is being inserted or changed in the database.
    SAP provides three types of lock objects:
    - Read lock (shared lock)
    Protects read access to an object. The read lock allows other transactions read access, but not write access, to the locked area of the table.
    - Write lock (exclusive lock)
    Protects write access to an object. The write lock allows other transactions neither read nor write access to the locked area of the table.
    - Enhanced write lock (exclusive lock without cumulation)
    Works like a write lock, except that the enhanced write lock also protects against further accesses from the same transaction.
    You can create a lock object through transaction SE11; enter any meaningful name starting with EZ, for example EZTEST_LOCK.
    Use: in almost all transactions, when you open an object in change mode, SAP does not allow any other user to open the same object in change mode.
    Example: in HR, when we enter a personnel number in the master data maintenance screen, SAP does not allow any other user to change the same personnel number.
    Technically:
    When you create a lock object, the system automatically creates two function modules:
    1. ENQUEUE_<lock object name>: to set the lock (insert the object into the lock queue).
    2. DEQUEUE_<lock object name>: to release the lock set by the above FM.
    You have to use these function modules in your program.
    Check this link for an example:
    http://help.sap.com/saphelp_nw04s/helpdata/en/cf/21eea5446011d189700000e8322d00/content.htm
    TABLES: vbak.

    CALL FUNCTION 'ENQUEUE_EZLOCK3'
      EXPORTING
        mode_vbak      = 'E'
        mandt          = sy-mandt
        vbeln          = vbak-vbeln
        x_vbeln        = ' '
        _scope         = '2'
        _wait          = ' '
        _collect       = ' '
      EXCEPTIONS
        foreign_lock   = 1
        system_failure = 2
        OTHERS         = 3.

    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
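    After the update is committed, release the lock again with the matching DEQUEUE call (a sketch following the generated module for the same EZLOCK3 example, same key fields as above):
    CALL FUNCTION 'DEQUEUE_EZLOCK3'
      EXPORTING
        mode_vbak = 'E'
        mandt     = sy-mandt
        vbeln     = vbak-vbeln.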
    Normally ABAPers create the lock objects, because we know when, how, and where to lock the object; after completing our updates we unlock the objects in the tables again.
    http://help.sap.com/saphelp_nw04s/helpdata/en/cf/21eea5446011d189700000e8322d00/content.htm
    Purpose: if multiple users try to access a database object, inconsistency may occur. The locking mechanism is used to avoid that inconsistency while still letting multiple users access the database objects.
    Steps: first we create a lock object in SE11, say for table MARA. It will generate two function modules:
    1. ENQUEUE_<lock object>
    2. DEQUEUE_<lock object>
    Before updating the table we first lock it by calling the ENQUEUE function module, and after updating we release the lock with the DEQUEUE function module.
    Select the radio button "Lock object".
    Give a name starting with EZ or EY.
    Example: EYTEST
    Press the Create button.
    Give the short description.
    Example: Lock object for table ZTABLE.
    In the Tables tab, give the table name.
    Example: ZTABLE
    Save and generate.
    Your lock object is now created. You can see the lock modules:
    in the menu, Goto -> Lock Modules, you can see the ENQUEUE and DEQUEUE function modules.

  • Anyone tried this - Extract data from HANA Live reuse views into BW?

    Hello Experts,
    I've read from this blog http://scn.sap.com/community/bw-hana/blog/2014/05/26/go-hybrid--sap-hana-live-sap-bw-data-integration that this scenario
    > "Loading of data into BW using Reuse Layer of SAP HANA Live as data source (Extract data from HANA Live reuse views into BW)" is possible.
    Does anyone have a step-by-step guide on how to do this? Can you please share?
    Regards,
    DaSaint

    Hi DaSaint,
    it's best to check the online documentation:
    Notes about transferring data from SAP HANA using ODP - Modeling - SAP Library
    Best regards,
    Andreas

  • The attempt to read data from the server failed

    Today I was looking into an error with several IMAP accounts in Apple Mail:
    The attempt to read data from the server "<<servername.tld>>" failed.
    At first I thought this was an Apple Mail problem, as the accounts in question seemed to work just fine when not used in combination. One possible answer to this problem is rather short:
    Apple Mail uses IMAP caching, which uses more than 4 connections at the same time to the mail server. Some mail servers (like courier-imap in its default configuration) do not allow that many connections from the same IP address. The more accounts you try to check at the same time, the higher this number of connections gets. That means that while you can probably check one account for new mail, the second will ultimately fail for no obvious reason. The only solution to this problem is to raise the number of connections allowed by your IMAP server software. This solution only applies to people who have root access to their mail server.
    In courier-imap you have to edit /etc/courier-imap/imapd
    and change MAXPERIP=4 to a higher number (5 to 10 times the number of accounts you want to check simultaneously)
    and change MAXDAEMONS=40 to a higher number (with only one user 200 might work, whereas if you serve multiple users, something like 500 or higher might be better suited).
    Of course, increasing these numbers increases the load on your server; this is why the restrictions are in place.
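    For example, after such a change the relevant lines in /etc/courier-imap/imapd would read something like this (the exact values are only an illustration, sized as described above):
    MAXPERIP=40
    MAXDAEMONS=200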

    MY SOLUTION REPOSTED FROM ANOTHER THREAD:
    I've just solved a similar issue.
    I have a dedicated server running Plesk 9.5 and when I upgraded to iLife 11 and Snow Leopard this error appeared. I could quickly click "get mail" and I'd get all my mail, but only 3-4 of my 9 mail accounts would connect. The others would have the error:
    "The server error encountered was: The attempt to read data from the server..."
    I found solutions for those using IMAP mail:
    modify the /etc/courier-imap/imapd configuration file and change MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40. This allows all the machines behind my home firewall to connect to multiple accounts on the e-mail server with mailbox caching enabled.
    I'd made this change on my server but it didn't seem to have any effect. It dawned on me that I'm using POP, not IMAP. So I found in /etc/courier-imap/pop3d the same settings. I changed the MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40 and voila, all my connections concurrently worked.
    This has taken me more than two days to fix and I hope posting this helps someone else with the same issue.

  • How to pass data from one  UIBB to anothere UIBB in OIF/GAF - FPM

    Dear all,
    As of now I am using FPM only for displaying data from different components.
    Now I would like to pass data from one UIBB to another by calling the second UIBB on the action of a button (instead of using the standard path in the GAF scenario).
    How could I achieve this? Is there a different mechanism for UIBBs of a single component versus UIBBs from different components?
    It would be great if someone could explain or help me regarding this issue for both OIF and GAF FPM applications.
    Thanks in Advance.
    Best Regards,
    Kranthi kumar Palle.

    I've actually combined these two approaches to data sharing: I've passed a class reference in the shared data component. This is nice (in my opinion) because it is very obvious where the data is coming from and who it is shared with. It also means that there is not a huge overhead in passing data through the shared context, because you are just replicating a reference to a class instance.
    And you don't have to deal with singleton classes :-). So if you want/need to extend your implementation at a later date (for example embedding multiple instances of the same "app" in the one window, or suspending and resuming to another instance of the same app) you can then do this. (NB: suspend/resume to launch another FPM app does not work because of this, amongst other things.)
    NB: a shared data component need not be faceless! I certainly have "shared data" UIBBs that also have UI components. Possibly not best practice, but it certainly can be done.
    Cheers,
    Chris
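    A minimal sketch of the pattern Chris describes (all names are hypothetical, and the actual binding of the reference to the shared data component's context is left out): the shared data component holds a reference to a plain ABAP class, and every UIBB that declares the shared-data usage works with that same instance, so only the reference is replicated, never the data.
    " Hypothetical data holder that all UIBBs work on
    CLASS zcl_app_shared_data DEFINITION.
      PUBLIC SECTION.
        DATA mv_customer_id TYPE string.
        DATA mt_messages    TYPE string_table.
    ENDCLASS.

    " In the shared data component, created once (e.g. in WDDOINIT) and
    " bound to a context attribute of TYPE REF TO zcl_app_shared_data
    " that the UIBBs see via their shared-data usage:
    DATA go_shared TYPE REF TO zcl_app_shared_data.
    CREATE OBJECT go_shared.

    " UIBB 1 writes through the reference (e.g. in an FPM event handler):
    go_shared->mv_customer_id = 'C-4711'.

    " UIBB 2 later reads the very same instance; nothing is copied:
    DATA lv_id TYPE string.
    lv_id = go_shared->mv_customer_id.
    This matches the low overhead Chris mentions: only the reference travels through the shared context.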

  • Read data from serial port or TCP port of frontend PC

    Hello Friends,
    I have a requirement to read data from a device connected to the frontend PC, which will provide meter reading data.
    The vendor has given me two options:
    1. The device can be connected to a serial port and data transfer will be done through the MODBUS RTU protocol. In that case the data needs to be captured from the serial port.
    2. The device can be connected to a TCP port and a socket program can be provided for data transfer. In that case SAP will act as a client and communicate with the TCP port.
    There will be multiple workstations with individual meters connected to them.
    I am aware of text file interfacing through frontend tools using custom code in VB, Java, or others.
    Is there any solution available to achieve the above using ABAP other than a text file, i.e. some form of direct communication?
    I am using ECC 6.0.

    Hello,
    Socket programming is not available in ABAP, but you may use RFC for this.
    Use the links below for more details:
    [Link 1|http://help.sap.com/printdocu/core/print46c/en/data/pdf/BCFESDE2/BCFESDE2.pdf]
    [Link 2|http://forums.sdn.sap.com/thread.jspa?threadID=1820233]
    Regards,
    Abhishek
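    For illustration of the RFC route (the destination name, function module, and parameters below are all hypothetical): the program on the frontend PC registers itself at the SAP gateway as an RFC server under a TCP/IP destination maintained in SM59, handles the actual serial/socket communication with the meter, and ABAP then calls it like any other remote function:
    TYPES: BEGIN OF ty_reading,
             meter_id      TYPE c LENGTH 20,
             reading_value TYPE p LENGTH 8 DECIMALS 3,
             tstamp        TYPE c LENGTH 14,
           END OF ty_reading.

    DATA: lt_readings TYPE STANDARD TABLE OF ty_reading,
          lv_msg      TYPE c LENGTH 255.

    " 'METER_PC' = TCP/IP RFC destination (SM59) pointing to the
    " registered external server program on the workstation
    CALL FUNCTION 'Z_GET_METER_READINGS' DESTINATION 'METER_PC'
      TABLES
        t_readings            = lt_readings
      EXCEPTIONS
        communication_failure = 1 MESSAGE lv_msg
        system_failure        = 2 MESSAGE lv_msg
        OTHERS                = 3.

    IF sy-subrc <> 0.
      WRITE: / 'RFC call failed:', lv_msg.
    ENDIF.
    The external side can be built with the NW RFC SDK or JCo; SAP itself never opens the serial port or socket.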

  • Reading data from Lookup [qualified flat] (multi-valued) table

    I have a main table 'ABC' which has a lookup [qualified flat] (multi-valued) field 'XYZ' along with some other text fields. I am able to read data from table ABC including the lookup field data, but I do not see the relation between the main record and the lookup records.
    For example, for a contract in table 'ABC' there are multiple line items in the lookup table: I have a contract with 2 line items in 'ABC', but 4 line items in the lookup table (2 records per line item, each with an item number). Now, when I read the data from table 'ABC' for that contract, I get all 4 lines in the result table, but the item number is blank even though there is an item number field in table 'XYZ'.
    Below is the sample code:
    wa_query-parameter_code  = Contract ID Field.  " placeholder for the contract ID field code
    wa_query-operator        = 'EQ'.
    wa_query-dimension_type  = 1.  " mdmif_search_dim_field
    wa_query-constraint_type = 8.  " mdmif_search_constr_text
    wa_string = Contract Number.                   " placeholder for the contract number
    GET REFERENCE OF wa_string INTO wa_query-value_low.
    APPEND wa_query TO lt_query.
    CLEAR wa_query.

    CALL METHOD lcl_api->mo_core_service->query
      EXPORTING
        iv_object_type_code = 'ABC'
        it_query            = lt_query
      IMPORTING
        et_result_set       = lt_result_set_key.

    LOOP AT lt_result_set_key INTO ls_result_set_key.
      lt_keys = ls_result_set_key-record_ids.
    ENDLOOP.

    " contract price for lower bound
    ls_result_set_definition-field_name = 'XYZ'.
    APPEND ls_result_set_definition TO lt_result_set_definition.

    CALL METHOD lcl_api->mo_core_service->retrieve
      EXPORTING
        iv_object_type_code      = 'ABC'
        it_result_set_definition = lt_result_set_definition
        it_keys                  = lt_keys
      IMPORTING
        et_result_set            = lt_result_set.
    In table lt_result_set I see 4 records, but I do not know which 2 records belong to item 1 of the contract and which 2 belong to item 2.
    Can anyone help me resolve this? Any kind of help is appreciated.
    Thanks,
    Shekar

    I do not quite understand your problem statement,
    but if you have items in the main table to which the qualified lookup (QLUT) entries are linked,
    then when you read the main table record you get link IDs at the record level, and those give you the relation to the qualified lookup values.
    I mean, for your case you should get only the 2 qualified lookup values that are linked to the selected record in the main table.
    I am not sure it is correct that you get "unlinked" values in the QLUTs;
    you should see 2 values in lt_result_set and not 4. Check whether you are using the link IDs properly.
    hth
    thanks
    -Adrivit

  • Server error: "The attempt to read data from the server '(null)' failed"

    Multiple times during each day my client (Mail.app) puts up a little exclamation mark "!" next to the mail account hosted on our Leopard Server. Clicking on this little alert icon pops up a message that reads:
    There may be a problem with the mail server or network. Verify the settings for account “Leopard Server Account” or try again.
    The server returned the error: The attempt to read data from the server “(null)” failed.
    I can make the "!" go away by choosing Mailbox>Synchronize>Leopard Server Account. And everything seems peachy but it inevitably pops up again in another hour or two. It's annoying because I'm not sure if mail is getting through or not when the "!" is up.
    Any ideas why this is happening?

    MY SOLUTION REPOSTED FROM ANOTHER THREAD:
    I've just solved a similar issue.
    I have a dedicated server running Plesk 9.5 and when I upgraded to iLife 11 and Snow Leopard this error appeared. I could quickly click "get mail" and I'd get all my mail, but only 3-4 of my 9 mail accounts would connect. The others would have the error:
    "The server error encountered was: The attempt to read data from the server..."
    I found solutions for those using IMAP mail:
    modify the /etc/courier-imap/imapd configuration file and change MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40. This allows all the machines behind my home firewall to connect to multiple accounts on the e-mail server with mailbox caching enabled.
    I'd made this change on my server but it didn't seem to have any effect. It dawned on me that I'm using POP, not IMAP. So I found in /etc/courier-imap/pop3d the same settings. I changed the MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40 and voila, all my connections concurrently worked.
    This has taken me more than two days to fix and I hope posting this helps someone else with the same issue.

  • Attempt to read data from the server failed...

    I'm getting the following error on one of my email accounts:
    There may be a problem with the mail server or network. Verify the settings for account “[my account]” or try again.
    The server returned the error: The attempt to read data from the server “[my account]” failed.
    I'd like to find out more about the error, so that I can start to track this down and see if there is a problem with the mail server settings, or if the problem is on my end.
    I don't see any reference to this error in the logs – would it be somewhere else?

    I've just solved a similar issue.
    I have a dedicated server running Plesk 9.5 and when I upgraded to iLife 11 and Snow Leopard this error appeared. I could quickly click "get mail" and I'd get all my mail, but only 3-4 of my 9 mail accounts would connect. The others would have the error:
    The server error encountered was: The attempt to read data from the server...
    I found solutions for those using IMAP mail:
    modify the /etc/courier-imap/imapd configuration file and change MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40. This allows all the machines behind my home firewall to connect to multiple accounts on the e-mail server with mailbox caching enabled.
    I'd made this change on my server but it didn't seem to have any effect. It dawned on me that I'm using POP, not IMAP. So I found in /etc/courier-imap/pop3d the same settings. I changed the MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40 and voila, all my connections concurrently worked.
    This has taken me more than two days to fix and I hope posting this helps someone else with the same issue.

  • How to read data from analog input and export it to a file using Components Works

    I've made a simple application (Delphi 5 using the ActiveX controls) that reads data from analog input channel 0. After I read the data, I want to export it to a file using ComponentWorks with a 6035E NI-DAQ board. (I'm not using LabVIEW and don't want to use it.)
    How can I do that???
    Regards,
    Francis

    Here's some sample code to do this...
    P.S.: If you read only one channel this code won't work.
    // Buffer is the 2-D variant array returned by the ComponentWorks
    // analog input read (first dimension = channels, second = samples);
    // WriteToDisk is your own routine that writes one value to the file.
    function MyFunction(const Buffer: OleVariant): Integer;
    var
      i, j, ChannelCount, RowCount: Integer;
    begin
      // you are reading multiple channels
      ChannelCount := VarArrayHighBound(Buffer, 1);
      RowCount := VarArrayHighBound(Buffer, 2);
      for i := 0 to ChannelCount do
        for j := 0 to RowCount do
          WriteToDisk(Buffer[i, j]);
      Result := 0;
    end;
