B2MML with real time data

Hi All,
I am looking for some B2MML files containing real-time data - specifically, the exact B2MML (maintenance and production information) that comes from an MIS.
If anyone has any information on this, please let me know.
Thanks in advance,
Ramshanker

Hi, Ramshanker.
In the cases where the plant system is directly delivering B2MML documents (such as with certain products from Wonderware, Siemens, and some others), using the data in xMII is quite straightforward.
1) If the B2MML message is not "standard" B2MML (be aware that very few plant vendors use the baseline standard - most add a number of vendor-specific extensions), you can make the vendor's schemas or B2MML example documents available to xMII by placing them in a ReferenceDocument folder on the server.
2) In xMII BLS, create a transaction (let's assume you called it B2MMLHandler) with one input parameter, of type "XML" (let's assume you called it "B2MMLDocument").  Add a "ReferenceSchema" or "ReferenceDocument" action, and configure it to load the vendor's B2MML format.  Use the "assign reference document" menu option to assign this format to the transaction input.
3) You can now use the B2MML document to perform analytics or to map from the B2MML document into any RFC or BAPI calls or ESA service calls from within BLS.
4) The plant system should "post" the XML data to an xMII transaction using the "Runner" URL, as in:
http://<yourserver>/Lighthammer/Runner?Transaction=B2MMLHandler&InputParameter=B2MMLDocument
Of course, if the MES system isn't capable of posting the documents and they are available on the file system, you can use the xMII file actions (or remote file access via FTP) to access and load the B2MML documents, perhaps at a scheduled interval.
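For illustration only, a minimal sketch in Python of how an external system might post a B2MML file to the Runner URL above. The host name, the file name, and the assumption that the raw XML goes in the POST body assigned to the "B2MMLDocument" input are placeholders that should be verified against your xMII version.

# Sketch: post a B2MML document to the xMII Runner URL shown above.
# Host, file name, and the body-to-input mapping are assumptions.
import urllib.request

RUNNER_URL = ("http://yourserver/Lighthammer/Runner"
              "?Transaction=B2MMLHandler&InputParameter=B2MMLDocument")

with open("maintenance_order_b2mml.xml", "rb") as f:   # hypothetical B2MML file
    xml_body = f.read()

req = urllib.request.Request(
    RUNNER_URL,
    data=xml_body,
    headers={"Content-Type": "text/xml"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read()[:200])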
Hope this helps get you started in the right direction.
Best regards,
Rick Bullotta

Similar Messages

  • Problem with Real Time Data Acquisition

    Hello,
    I tried to build an RDA data flow according to the SAP Help and this blog (/people/kamaljeet.kharbanda/blog/2006/11/13/real-time-data-acquisition-bi2004s), but I ran into a problem. Everything seems to be fine, but even though the daemon is running, no new data is added to the DSO. I'll explain in detail what steps I took and hope someone can pinpoint my error(s).
    The system is BW4 on SPS 17. My DataSource as well as my DataStore object are in the same system.
    1.) Created a transparent table with a timestamp-field for generic delta as a local object.
    2.) Created a generic datasource from view/table (used the above table), marked it as realtime-enabled and used timestamp as delta-specific field.
    3.) Replicated the metadata for my datasource and activated it.
    4.) Created a DataStore Object with InfoObjects corresponding to the table fields
    5.) Created a transformation between DataSource and DSO
    6.) Created a DTP from Data Source to DSO. The type is realtime-DTP.
    7.) Created an InfoPackage for Delta-Initial Load. Update mode: Initialize Delta Process -> Initialization with Data Transfer.
    8.) Then I changed my DTP (step 6) to "normal DTP" and executed it, then activated the data in the DSO. The datasets from the table (which I filled with an ABAP program) were successfully loaded and activated. So now my DSO contains 5 datasets.
    9.) Changed the DTP back to "realtime DTP"
    10.) Created an InfoPackage for RDA. Adapter is set to "Realtime extraction from SAP System"
    11.) Created a daemon, assigned the IP for RDA and the DTP to the daemon and started it up.
    12.) Executed my ABAP program to add 3 datasets to the table which is used as a datasource.
    So now there are 8 datasets in my source table, but the daemon won't load them. My RDA monitor (transaction RSRDA) shows my daemon, the assigned InfoPackage and the assigned DTP (status green). Under the InfoPackage there is one yellow request (which should be right according to the help), but unlike the picture in the blog (see link above) there is no request under the DTP.
    The job overview (SM37) shows an active batch job and some jobs that are ready (every 10 minutes a new batch starts and the old one is closed). The job log shows that the daemon runs but loads no data...
    I think it might be a problem with the delta extraction somewhere, because the initial load works fine and there are no errors anywhere.
    Edit: perhaps my settings in the DataSource under the Extraction tab are wrong?
    Delta process: AIE After Images By Extractor (can't change that)
    Direct Access: Allowed
    Real Time: Real-Time DA Supported
    Adapter: Real-Time Data Extraction (I also tried "Access to SAP Data through Service API", which seems pointless)
    Data Format: Fixed Length
    Can anyone explain what these fields mean and which ones I should use?

    Hi again. Another thing I'd like to try is to simply activate a Business Content datasource which is real-time enabled and to which I can dynamically add some values, so that I can test whether RDA works. But I have no idea which Business Content datasource would be applicable here - any ideas on that?

  • The problem with Real-Time Data Acquisition

    Hi all,
    I created a DataSource with a generic delta: the delta-specific field points to a timestamp, and in the settings I checked Real-Time Enabled. After that I created the modeling in BW, did the initial update as described in the SAP documentation, created one more InfoPackage with the adapter "Real-Time Extraction from SAP", assigned the InfoPackage and DTP to a daemon, and started the daemon job.
    The request stays yellow the whole time and doesn't bring in a single changed record from R/3. Does anybody know what the cause could be? Maybe I forgot to check something somewhere, or do I need to start a trigger?

    The following setup is working fine for me.
    In R/3, for Generic Delta:
            Field Name - AEDTM
            0CALDAY -- Radio Button - 'ON'
            Upper Limit - 0
            Lower Limit - 2
            Real-Time Enabled - Check Box - 'ON'
            New Status for Changed Records - Radio Button - 'ON'
    In BI
        Create infopackage (init) for replicated datasource
        Execute the DTP to load this init data to DSO.
        Do NOT activate DSO
        Now create another InfoPackage (real-time enabled) and create a real-time DTP.
    In RSRDA
        Create a Daemon
        Assign InfoPack & DTP to Daemon.
        Now start the daemon... Here the message says something like 'one of the requests is not activated'.
        Now select the DTP under the daemon --> use menu Edit --> Repair Chain --> Execute.
        (If the repair chain is not there yet, then use menu Edit --> Repair Chain --> Generate, and
         once this is done, execute it using the above path.) Once the chain finishes,
        start the daemon again and refresh.
        Now you can see an open request (yellow) under the InfoPackage.
        As soon as new records get loaded into the PSA, an open request starts under the DTP.
        (Note: if no new data is available in the PSA, no open request starts under the DTP.)
    Also note that RDA requests (in PSA & DSO Manage) should always be yellow; that means real-time data acquisition is working.

  • Issue with Real-Time data acquisition

    Hello,
    I'm using RDA to extract a non-standard transparent table (Z*) from ECC to BI. When I launch my daemon, my InfoPackage and my DTP turn yellow for only 2-3 seconds and then green right after, and no data comes from ECC.
    If I check the daemon log I can see the following lines:
    Job started
    Step 001 started (program RSCRT_RDA_DEMON, variant &0000000001977, user ID XXXXXX)
    InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL for daemon 01 is locked
    Setting status of InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL
    Status set (2 -> 3)
    Start of upload for InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL for daemon 01
    Request 242.373 opened, status set (0 -> 1)
    InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL: Deleting TIDs
    InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL: TIDs successfully deleted
    Data records deleted in the queue, status set (1 -> 4)
    Data package 000001 opened, status set (4 -> 2)
    The current application triggered a termination with a short dump.
    Upload finished for InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL for daemon 01
    Setting status of InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL
    Status set (3 -> 2)
    A short dump is triggered in ECC: MESSAGE_TYPE_X in FM RSC3_DEMON_UPL. Why?
    I have built another extractor (on an HR infotype) and it works fine.
    Thanks for your help,

    The first tip is not to configure the DAQ task and close it on every iteration of the while loop. Configure the task before the loop starts and close it after the loop. Only stop and reconfigure the task in the event that any of the configuration data changes.
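    For illustration only, the configure-once pattern described above might look roughly like this in Python with the nidaqmx package (the device/channel name "Dev1/ai0" and the loop length are placeholders, not taken from the original post):

    # Sketch of the "configure before the loop, close after the loop" pattern.
    # Channel name and sample count are assumptions for illustration.
    import nidaqmx

    task = nidaqmx.Task()
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # configure once, before the loop

    try:
        for _ in range(1000):        # acquisition loop
            sample = task.read()     # read inside the loop without reconfiguring
            print(sample)
    finally:
        task.close()                 # close once, after the loop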

  • Experience with Real-time Data Acquisition (RDA)

    Hello all,
    We are looking into using the RDA functionality to quickly move data from ECC to BW and would like to talk to anyone who has had experience using this feature (did it work well, limitations, what to look out for, etc.). We would appreciate it if you could share your experiences.
    Thanks,
    - Rao

    Hi,
    The link below might be useful to you:
    [http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b05895f6-a4bc-2c10-bbad-96d8a341de78?QuickLink=index&overridelayout=true&45896020788918]
    Regards,
    Durgesh.

  • RMAN in active data guard with real time apply ON without a catalog DB

    I would like to know whether I can take RMAN backups (both level 0 and incremental) in an Active Data Guard standby with real-time apply ON, without a catalog database. The version is 11.2.0.3.2. Will it make a difference to the standby control file syncing with the primary? Can I use these backups on the primary to restore and recover the database (including the control file backup)?

    Thank you very much msberg. Could you please answer the queries below as well?
    Should I cancel real-time apply before starting RMAN?
    Will it make a difference to the standby control file syncing with the primary? Is it possible to use these backups on the primary to restore and recover the database (including the control file backup)? If possible, how would I do that?

  • How to count occurrences of a certain string in incoming real-time data? Also displaying RTC data. Current VI included

    I use LabVIEW Student Express 7 on a Windows XP system.
    Time frame: we are doing final integrations for our balloon experiment today. We just got told that the press wants to view real-time data, which we haven't programmed for. I need help to get a working VI at the latest by 25.02.2004 before 08:00 in the morning (GMT+1).
    Note on license
    It is a student balloon flight, and the data will not be used in scientific work, so I am not breaking any license agreements (I hope).
    Problem synopsis:
    The balloon continually transmits data at 9600 baud. The data is a semi-repeating header followed by a constant-length data package arranged like this:
    BEXUS[h][m][s]BEXUS[h][m][s]
    [Raw binary data, 7channels*8sub-channels*8bits]
    What the ground station is doing right now:
    Take all incoming data and save (append) it to a file without any data handling. (We figured we would do post-processing.)
    What I need to change in less than 24 hours:
    - Add a "package" counter
    - Add a display of the clock data (RTC)
    How I planned to implement the changes:
    -RTC display:
    The RTC data is in BCD format, which means that if you look at the data as hex numbers, you get the hours, minutes and seconds out in "clear text"; that is, 12 hours is 0x12 hex. I figured I could do a Match Pattern on BEXUS and pass the "after substring" to another Match Pattern on BEXUS, then feed the "before substring" to a Type Cast VI (casting string to U8) and display that, which should give me "123000" for the time 12:30:00... I couldn't get it to work at all when I tried the supplied "beta" VI.
    - Package counter:
    Counting how many times BEXUS is detected and dividing by 2. I don't know how to do this. I've looked on the forum (a good thread on the problem: "how do I count the number of *'s in a string"), but those solutions use either loops or arrays... and I'm not sure how this works when I'm getting the data in at real time. I can't make an array and then count it, since the array would grow fast and possibly interfere with saving the data. Saving the data is critical - without that file we can't do post-processing.
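    For what it's worth, here is a rough sketch of that counting and BCD-decoding logic in Python rather than LabVIEW. It assumes the BEXUS[h][m][s]BEXUS[h][m][s] framing described above; the file name BEXTEST.bin is taken from the attachment mentioned later in the thread.

    # Rough sketch of the package counter and RTC decoding described above.
    MARKER = b"BEXUS"

    def bcd_time(raw: bytes) -> str:
        # BCD: each byte's hex digits are the clock digits, e.g. 0x12 -> "12".
        return ":".join(f"{b:02x}" for b in raw)

    with open("BEXTEST.bin", "rb") as f:
        data = f.read()

    # Package counter: the marker appears twice per package, so divide by 2.
    packages = data.count(MARKER) // 2

    # RTC display: the three bytes following the last marker are h, m, s in BCD.
    last = data.rfind(MARKER)
    rtc = bcd_time(data[last + len(MARKER): last + len(MARKER) + 3])

    print(f"packages received: {packages}, last RTC: {rtc}")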
    Since my time is so limited (I'm not even supposed to do the ground station software, but they called on me at the last minute because no one else had time/wanted to/could do it), I hope you can make an exception and provide me with working VIs (based on the one I have attached) so that I can show something to the press! (Free commercial for NI, since the student version shows the National Instruments watermark on all VIs! Possible TV time!)
    Thanks!
    PS: even if you are too late (after the 25th), post anyway!
    Why:
    -I can learn from it
    -the launch might be delayed due to weather conditions
    -others might find it amusing!
    Thanks again!
    Attachments:
    BexusII_groundstation.vi (46 KB)

    I have a valid example data file attached to this thread.
    If you open BEXTEST.bin in a hex-editor of your choice, you'll see the BEXUS as 42 45 58 55 53 and then the time as 00 28 09 etc.
    I couldn't get Joe Guo's VI to work. It doesn't count packages correctly, and the time is not displayed correctly either.
    The file was saved using a straight save to file VI.
    The data is from actual launch-area tests performed a few minutes ago. The time displayed is "on time", i.e. how long the gondola has been powered up.
    I have a spare T-junction, so I can hook into the balloon's real-time data as we fly, in case anyone cares to see if they can figure out why the latest version of Joe Guo's program is not displaying correctly.
    I will monitor this thread during and after the flight to see if anyone can make it in time!
    Thanks for the great effort!!
    Attachments:
    bextest.bin (53 KB)

  • Error "cannot load request real time data targets" for new cube in BI 7.

    Hi All,
    We have recently upgraded our SCM system from 4.1 to SCM 7.0, which incorporates BI 7.0.
    I am using BI 7.0 for the first time and have the following issue:
    I created a new InfoCube and a flat-file DataSource and successfully created the transformation and Data Transfer Process. Everything looked fine. I added the flat file, checked the preview and could see data. Now when I start the job to load data into the InfoCube, the following error is shown: "cannot load request real time data targets".
    I checked the cube type in the InfoCube settings and it shows as Standard. When I double-clicked on the error, the following message showed up:
    You are trying to load data into a real-time InfoCube using a DTP.
    This is only possible if the correct load settings have been defined for the InfoCube.
    Procedure
    In the object tree of the Data Warehousing Workbench, call Load Behavior of Real-Time InfoCube from the context menu of the InfoCube. Switch load behavior to Transactional InfoCube can be loaded; planning not allowed.
    I do not understand what is meant here or how to change the setting. Can someone advise and walk me through it?
    Thanks
    KV

    Hi Kverma,
    Real-time InfoCubes can be filled with data using two different methods: using the transaction for entering planning data, or using BI staging, in which case planning data cannot be loaded simultaneously. With a real-time cube you can select which method you want to use for the update:
    Real Time data Target can be loaded With Data; Planning not allowed &
    Real Time data Target can be Planned; Data loading not allowed
    You can change this behaviour by right-clicking on the cube, selecting "Change Real-Time Load Behaviour" and choosing the first option. You will then be able to load the data.
    Regards,
    Kams

  • Error when activating Real Time Data Source

    Hi..
    I created a generic DataSource in the R/3 system via RSO2, with the Real-Time checkbox enabled in the settings for the generic delta. I saved the DataSource and replicated it into the BW system.
    When I tried to activate this real-time DataSource in BW, it said "Error when activating DataSource".
    I am on BI 7.0, with PI_BASIS: 2005_1_700.
    Please let me know if you need any further information.
    Thanks,
    Sai.

    Hi sai,
    Can you specify how you created your generic DataSource - using a DB table/view or an InfoSet query?
    am

  • Need Suggestion On Real Time Data Access

    Hi,
    Our application (let's say A) runs on 8.1.7.4.0.
    Another application (let's say B) runs on 10.2.0.4.0.
    Applications A and B exchange data through DB links.
    Both applications perform SELECTs and DML on each other's tables.
    Application A has views (approx. 10 views) on application B's tables through DB links.
    Application B also has views (approx. 50 views) on application A's tables through DB links.
    Application A has stored procedures (approx. 10 SPs) through which it inserts/updates two of application B's tables - approx. 15K calls of those procedures per day.
    Application B also calls some of application A's procedures to insert/update data in application A's tables.
    Currently both applications' data centers are in the same place, but there is a proposal/need to move application B's data center far away from the current location. Because of this we assume that DB links over the WAN will create performance issues, and we are searching for a probable replacement.
    After a couple of discussions we have identified approximately five of application A's tables which are frequently accessed by application B and expected to provide real-time data.
    Similarly, application A expects real-time data from two of application B's tables.
    The 'GoldenGate' mechanism came up in our discussion, but it's expensive.
    Can we replace these stored procedure calls/DML with web service calls or something similar?
    Can anyone suggest what kind of approach we should take here?

    The views residing on application A's side use many internal tables and one or two of application B's tables.
    I don't have any idea about multi-master replication... is it two-way replication? Are you aware of any documentation I can refer to? How costly is it (if you can give an idea)?
    Also, among the many tables, we need near real-time (2-minute interval) replication on 5 tables.
    MVs again depend on DB links, which is exactly what we are trying to replace (assuming a performance issue).
    With the fast refresh method (let's say a 2-minute interval) on MVs, do you think it could cause performance issues, especially when the data centers are at different locations?

  • Real-time data acquisition for HR datasources

    Dear Experts,
    I have a couple of questions on real-time data acquisition...
    a) Can you tell me if any standard HR datasources support real-time data acquisition?
    b) Can we apply real-time data acquisition for generic datasources? If yes, is there any difference in the process compared to working with business content datasources when using real-time data acquisition?
    Hope you can provide some answers...as always, points will be awarded for answers that effectively address the questions.
    Thanks a bunch.
    K

    Hi Karthik,
    a) The decision to go for an SAP RemoteCube depends on the volume of data and the frequency of queries. It is advisable not to use the SAP RemoteCube if the data volume is high and the query frequency is also high. Which module in HR are you implementing? In HR, the data volume is generally high. So if you go for loading the data from R/3 every 3 hours, you are safe as long as the loading volumes are low. For example, for implementing Time Management, I would not advise frequent loads as it is a time-consuming process. So make a decision along the lines mentioned above. If the data volume is not high, I would prefer the SAP RemoteCube as it will reduce the effort of managing the loads.
    b) I mentioned the FM extractor just for the sake of control over data selection. You can even go for a view/table extractor.
    Hope this helps.
    Thanks and Regards
    Subray Hegde

  • Call web service from OBI dashboard to show real-time data alongside data from DW

    Hello,
    OBI = 11.1.1.7.0
    We have a requirement to pull in real-time data from an external, secure web service (WS) into an OBI report. The OBI report also contains data from the data warehouse (DW), along with a calculation that is derived by multiplying the DW and WS data. In theory, the behavior we're looking for is similar to the Action Link "Invoking a web service", whereby the call is made to the web service for each row of data. How would it be possible to achieve this result on an OBI report?
    Desired Result of OBI Report
    Product | Qty  | Price | Potential Revenue
    TV      | 100  | $500  | $50000
    VCR     | 22   | $75   | $1650
    DVD     | 2500 | $20   | $50000
    Breakdown of the data source for each column on the OBI report
    Product | Qty | Price | Potential Revenue
    DW      | WS  | DW    | Calculated on the fly (Qty * Price), same for every row
    Thank you,
    Mike

    Does anyone know if it's possible to call a web service from an OBI report to populate a column's value? Thank you for your input.
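    For illustration only (this is not an OBIEE feature, just the row-level logic described in the question): fetching the quantity per product from a hypothetical web service and deriving Potential Revenue = Qty * Price could look like this in Python. The endpoint URL and JSON field names are invented for the sketch.

    # Hypothetical sketch of the per-row merge: price from the warehouse,
    # quantity from a web service call per row, revenue computed on the fly.
    import json
    import urllib.request

    DW_ROWS = [                       # rows as they might come from the DW
        {"product": "TV", "price": 500.0},
        {"product": "VCR", "price": 75.0},
        {"product": "DVD", "price": 20.0},
    ]

    def fetch_qty(product: str) -> int:
        # One call per row, mirroring the "invoke a web service per row" idea.
        url = f"https://example.com/inventory/qty?product={product}"  # placeholder
        with urllib.request.urlopen(url) as resp:
            return int(json.load(resp)["qty"])

    for row in DW_ROWS:
        qty = fetch_qty(row["product"])
        print(row["product"], qty, row["price"], qty * row["price"])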

  • About DTP with real-time access.

    Hello Gurus,
    The setting for error handling only has an impact when repairing a DTP request (repair) and during the conversion of the DTP to a standard DTP (for example, to correct an error during extraction).
    Will you please give a simple scenario to explain the above words?
    Thank you very much.
    Fran

    Hi,
    Use :
    With SAP NetWeaver 7.0, SPS 14, the following changes and enhancements are available for real-time data acquisition:
    ●      Changes to the menu and the context menu of the monitor for real-time data acquisition
    The menu and the context menus for the individual objects in the monitor for real-time data acquisition have been standardized and enhanced. In particular, it is now possible to assign daemons and data transfer processes on various levels using context menu entries. For a complete overview of the functions in the menu and in the context menus, see Monitor for Real-Time Data Acquisition.
    ●      Assignment of daemons for InfoPackages and data transfer processes (DTP)
    To assign InfoPackages and data transfer processes to a daemon, you can call the monitor for real-time data acquisition in the following ways:
    ○       In the Data Warehousing Workbench using the respective context menu entry Assign RDA Daemon.
    ○       In InfoPackage maintenance using the Schedule tab page, and in data transfer process maintenance using the Execute tab page.
    The button names for jumping to the monitor for real-time data acquisition have changed here. To jump to the monitor, choose Assign Daemon.
    ●      Repair process chains for repairing a broken data transfer or a missing delta in the DataStore object
    In certain situations, a gap in the delta of the DataStore object can occur if there is a closed request in the PSA but there is no corresponding request in the DataStore object. For example, this is the case if a DTP request has terminated due to an error in the transformation. Here, you can create a repair process chain to repair a missing or broken update from the PSA. The repair process chain contains a standard DTP as well as any further processes required for subsequent processing (such as activating the data in the DataStore object or subsequent process chains). When the repair process chain is executed, the complete delta is loaded from the source into the DataStore object and processes for activation and further processing are executed if required.
    ●      Process types for starting and stopping real-time data acquisition
    You can use process chains to control real-time data acquisition using process types Start Real-Time Data Acquisition (RDA) Load Process and Stop Real-Time Data Acquisition (RDA) Load Process.
    More Info :
    http://help.sap.com/saphelp_nw70/helpdata/en/47/2731751c2a2dede10000000a1553f7/frameset.htm
    Regards
    Ram.

  • RDA (real-time data acquisition)

    Can anyone please tell me how the process goes in RDA (real-time data acquisition), i.e. what the internal steps are?

    Hi,
    Real-Time Data Acquisition - BI 2004s
    Real-time data acquisition supports tactical decision-making. It also supports operational reporting by allowing you to send data to the delta queue or PSA table in real-time.
    You might have complex reports in your BI system which help in making decisions on the basis of data from your transactional system. Sometimes (quarter closure, month end, year end...) a single change in the transactional data can change your decision, and it is very important that each record of the company's transactional data is reflected in the BI system at the same time as it gets updated in the transactional system.
    Using the new Real-Time Data Acquisition (RDA) functionality of the NetWeaver BI 2004s system, we can now load transactional data into the SAP BI system every single minute. If your business is demanding real-time data in SAP BI, you should start exploring RDA.
    The source system for RDA can be an SAP system or any non-SAP system. SAP provides most of the standard DataSources as real-time enabled.
    The other alternative for RDA is web services; web services are usually meant for non-SAP systems, but for testing purposes I am implementing a web service (RFC) in an SAP source system here.
    An example would be a production line where the business wants information about defective products in real time, so that production can be stopped before more defective goods are produced.
    In the source system, the BI Service API must be at least Plug-In-Basis 2005.1, or Plug-In 2004.1 SP10 for 4.6C source systems.
    Refer:
    Real-Time Data Acquisition -BI@2004s
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f80a3f6a983ee4e10000000a1553f7/content.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3db14666-0901-0010-99bd-c14a93493e9c
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3cf6a212-0b01-0010-8e8b-fc3dc8e0f5f7
    http://help.sap.com/saphelp_nw04s/helpdata/en/52/777e403566c65de10000000a155106/content.htm
    https://www.sdn.sap.com/irj/sdn/webinar?rid=/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    Regards
    Tg

  • How to Integrate real time data between 2 database servers

    I have a scenario where the database (DB2/400) is maintained by an AS/400 application, and my new website application, based on the J2EE platform, accesses the same database, but the performance is very low. So we have thought of introducing a new Oracle database which will be accessed by the J2EE application, and all the data from the DB2/400 database will be replicated to the Oracle database. In that scenario the only problem is real-time data exchange between the two databases. How do we achieve that, considering that both the AS/400 application and the J2EE website application run in parallel and access the same information lying in the DB2/400 database? We also have to look at transaction management.
    Thanks
    Panky
    DrClap replied (reply 1 of 2):
    You certainly wouldn't use XML for this.
    The process you're looking for is called "replication". Ask your database experts about it.
    I predict that after you spend all the money to install Oracle and hire consultants to make it replicate the DB2/400 database, your performance problem will be worse.
    panks replied (reply 2 of 2):
    Yeah, I know that it's not an XML solution.
    Replication is one of the options, but the AS/400 application which uses the DB2/400 DB is heavily loaded, and the proposed website also uses the same database for retrieval and updates. All the inventory is maintained in the DB2/400 database, so I have thought of introducing a new Oracle database which will be accessed by the new website and will have all the relevant table structures along with the data from the DB2/400 application. Now, whenever there is an order placement from the new website, it should first update the Oracle database, and then this data should also migrate to the DB2/400 application in real time, so that the main inventory lying on the DB2/400 is updated on a real-time basis, because order placement is also possible from the AS/400 application. The user of the AS/400 application should not get wrong data.
    Is it possible to use MQ products?
    -Panky

    Hi,
    The answer to your question is not easy. Synchronizing, integrating, or replicating data between 2 (or more) database servers is a very complicated task, even though it doesn't look like it.
    Firstly, I would recommend creating a good analysis of the data flow.
    Important things are:
    1) What is the primary side for data creation? In other words, on which side - DB2 or Oracle - is the primary data (created there), and on which side is the secondary data (just copies)?
    2) On which side is the data changed - only on the DB2 side, only on the Oracle side, or on both sides?
    3) Is there data that is changed on both sides concurrently? If so, how should conflicts be resolved?
    4) What does "real time" mean here? Is it within 1 ms, 1 s, 1 min, or 1 hour?
    5) What should be done when replication does not work? I mean a replication crash, etc.
    BTW. The word "change" above means INSERT, UPDATE, DELETE commands.
    The analysis should be done for every column in every table. When the analysis is ready, you can select the best system for your solution (Oracle replication, Sybase Replication Server, MQ, EJB, or your own proprietary solution). Without the analysis it will IMHO be a shot in the dark.
