Best data provisioning tool for very large amounts of data updated in real time?

We have a few hundred million entries of data a day that must be replicated to SAP HANA in real time. What would be the best option?

Hi Wayne,
If you are looking for real-time replication, then SLT (SAP Landscape Transformation Replication Server) is the best option. What is the source system for this replication?
Regards,
Chandu.

Similar Messages

  • Looking for ideas for transferring large amounts of data between systems

    Hello,
    I am looking for ideas, based on best practices, for transferring large amounts of data in and out of a NetWeaver-based application.
    We have a new system we are developing in NetWeaver that will utilize both the Java and ABAP stacks and will require integration with other SAP and 3rd-party systems. It is a standalone product that doesn't share any form of data store with other systems.
    We need to be able to support tens of millions of records of tabular data coming in and out of our system.
    Since we need to integrate with so many different systems, we are planning to use RFC as our primary interface in and out of the system. As it turns out, RFC does not cope well with this much data being pushed through a single call.
    We have considered a number of possible ideas, but we are not very happy with any of them. I would like to see what the community has done in the past to solve problems like this, as well as how SAP currently solves it in other applications like XI, BI, ERP, etc.
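    One pattern that comes up for this kind of volume is paging the data across many smaller RFC calls rather than pushing it all through one huge call. Below is a minimal sketch in Python using SAP's pyrfc connector; the function module Z_GET_RECORDS and its IV_OFFSET / IV_LIMIT / ET_DATA parameters are hypothetical stand-ins for whatever paged interface you would build, and the connection details are placeholders.

    from pyrfc import Connection  # SAP's Python RFC connector

    # Connection parameters are placeholders for a real system.
    conn = Connection(ashost="appserver", sysnr="00", client="100",
                      user="RFC_USER", passwd="secret")

    def process(rows):
        # Stand-in for whatever the receiving side does with one page.
        print("received %d rows" % len(rows))

    # Page through the data in fixed-size chunks instead of one giant call.
    offset, page_size = 0, 50000
    while True:
        result = conn.call("Z_GET_RECORDS", IV_OFFSET=offset, IV_LIMIT=page_size)
        rows = result["ET_DATA"]
        process(rows)
        if len(rows) < page_size:  # a short page means the data is drained
            break
        offset += page_size

    Each call then stays well under RFC's practical payload limits, and the loop can be parallelised across several work processes if throughput matters.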

    Primoz wrote: Do you use KDE (Dolphin) 4.6 RC or 4.5?
    Also, I've noticed that if I move/copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
    Also, run Dolphin from a terminal to try to see what the problem is.
    Hope that helps at least a bit.
    Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking, because I thought Dolphin was just a "little" wrapper around the cp/mv/cd/ls commands.

  • How can we suggest a new DBA OCE certification for very large databases?

    What web site can we visit, or what phone number can we call, to suggest creating a VLDB OCE certification?
    The largest databases that I have ever worked with were barely over 1 trillion bytes.
    Some people have told me that the work of a DBA changes completely when you have a VERY LARGE DATABASE.
    I could guess that some of the following configuration topics might be on it:
    * Partitioning
    * Parallelism
    * Larger block sizes - DSS vs. OLTP
    * etc.
    Where could I send in a recommendation?
    Thanks Roger

    I wish there were some details about the data warehousing OCE.
    Look at the topics for exam 1Z0-515. Assume that the 'lightweight' topics (like Best Practices) will go and that more technical topics will be added.
    Oracle Database 11g Data Warehousing Essentials | Oracle Certification Exam
    Overview of Data Warehousing
      Describe the benefits of a data warehouse
      Describe the technical characteristics of a data warehouse
      Describe the Oracle Database structures used primarily by a data warehouse
      Explain the use of materialized views
      Implement Database Resource Manager to control resource usage
      Identify and explain the benefits provided by standard Oracle Database 11g enhancements for a data warehouse
    Parallelism
      Explain how the Oracle optimizer determines the degree of parallelism
      Configure parallelism
      Explain how parallelism and partitioning work together
    Partitioning
      Describe types of partitioning
      Describe the benefits of partitioning
      Implement partition-wise joins
    Result Cache
      Describe how the SQL Result Cache operates
      Identify the scenarios which benefit the most from Result Set Caching
    OLAP
      Explain how Oracle OLAP delivers high performance
      Describe how applications can access data stored in Oracle OLAP cubes
    Advanced Compression
      Explain the benefits provided by Advanced Compression
      Explain how Advanced Compression operates
      Describe how Advanced Compression interacts with other Oracle options and utilities
    Data integration
      Explain Oracle's overall approach to data integration
      Describe the benefits provided by ODI
      Differentiate the components of ODI
      Create integration data flows with ODI
      Ensure data quality with OWB
      Explain the concept and use of real-time data integration
      Describe the architecture of Oracle's data integration solutions
    Data mining and analysis
      Describe the components of Oracle's Data Mining option
      Describe the analytical functions provided by Oracle Data Mining
      Identify use cases that can benefit from Oracle Data Mining
      Identify which Oracle products use Oracle Data Mining
    Sizing
      Properly size all resources to be used in a data warehouse configuration
    Exadata
      Describe the architecture of the Sun Oracle Database Machine
      Describe configuration options for an Exadata Storage Server
      Explain the advantages provided by the Exadata Storage Server
    Best practices for performance
      Employ best practices to load incremental data into a data warehouse
      Employ best practices for using Oracle features to implement high performance data warehouses

  • Grid Control Architecture for Very Large Sites: New Article published

    A new article on Grid Control was published recently:
    Grid Control Architecture for Very Large Sites
    http://www.oracle.com/technology/pub/articles/havewala-gridcontrol.html

    Oliver,
    Thanks for the comments. The article is based on practical experience. If one were to recommend a pool of 2 management servers for a large corporate with 1,000 servers, that would mean that if 1 server were brought down for any maintenance reason (for example, applying an EM patch), the entire EM workload would fall on the remaining management server. So it is better to have 3 management servers instead of 2 when the EM system is servicing so many targets. Otherwise the DBAs would be a tad angry, since the single remaining management server would not be able to service them properly during the maintenance fix on the first one.
    The article ends with these words: "You can easily manage hundreds or even *thousands* of targets with such an architecture. The large corporate which had deployed this project scaled easily up to managing 600 to 700 targets with a pool of just three management servers, and the future plan is to manage *2,000 or more* targets which is quite achievable." The 2,000 or more is based on the same architecture of 3 management servers.
    So as per the best-practice document, 2 management servers would be fine for 1,000 servers, although I would still advise 3 servers in practice.
    For your case of 200 servers, it depends on the level of monitoring you are planning to do and the type of database management activities the DBAs will perform. For example, if the DBAs are planning on creating standby databases now and then through Grid Control, running backups daily via Grid Control, cloning databases in Grid Control, patching databases in Grid Control and so on, I would definitely advise a pool of 2 servers in your case. 2 is always better than 1.
    Regards,
    Porus.

  • Read data in real time and save as an Excel file

    Hi,
    I want to write a LabVIEW program that is able to read data in real time from a Varian vacuum multi-gauge and save it as an Excel file.
    It is using RS232 port.
    Can anyone give me some examples or point me in the right direction?
    I am a beginner with LabVIEW. I hope someone can help me.
    Thank you very much!!
    Joanne

    Thanks for your reply.
    I just used MAX (Measurement & Automation Explorer) to verify that the RS-232 port is operational.
    However, there is an error (please refer to the attachment).
    One possible reason is that in MAX I am trying the default query *IDN?, but it doesn't work.
    I read the vacuum multi-gauge manual, but I don't know which command I should use...
    I have attached the manual; can you tell me which command I should use?
    Or can you tell me another possible reason for this error code?
    Thank you very much.
    Joan
    Attachments:
    Varian Multi-Gauge Controller.pdf (2747 KB)
    error1.JPG (111 KB)
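    Until the correct command is known, it can help to take MAX out of the loop and talk to the port directly. Here is a minimal sketch in Python with pyserial; the port name, baud rate, terminator, and especially the command string are assumptions that must be replaced with the values from the Varian manual.

    import serial  # pyserial

    # Port settings are assumptions; match them to the controller's manual.
    port = serial.Serial("COM1", baudrate=9600, timeout=2)

    # "READ" is a hypothetical placeholder; substitute the real command
    # (and its terminator) documented in the multi-gauge manual.
    port.write(b"READ\r")
    reply = port.read_until(b"\r")  # gives up after the 2-second timeout
    print(repr(reply))              # empty bytes -> wrong command, baud, or wiring
    port.close()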

  • Viewing data in real-time and saving on-demand - USB4000 Spectrometer

    Hey,
    I am trying to create a program with a USB4000 fiber-optic spectrometer to allow the user to view the data in real time on the front panel (yet to be implemented) and also be able to save the data to an Excel datasheet, or another format if more appropriate. I am reasonably new to LabVIEW and have limited experience with saving data to Excel.
    I am currently trying to build upon one of the example VIs given with the drivers. Currently the VI produces data like the following:
    Wavelength (nm)    Spectrum Data
    xyz                xyz
    xyz                xyz
    xyz                xyz
    xyz                xyz
    I want my VI to append a new column every iteration, but I think it can't because I am feeding it a 2-dimensional array, and I am unsure how to get around this.
    The current VI starts off by creating the headers and then appends the spectrometer data.
    Any help would be much appreciated, apologies if I am unclear about anything.
    Attachments:
    USB4000 15-07-14.vi ‏18 KB

    OK, a couple of comments. First, it is easier to build a 2D array by appending rows and then transposing the array before saving it. Also, it is much more efficient to preallocate the array if you know how many columns the data is going to have.
    Second, adding a new column or row isn't a problem, because an additional column (or row) doesn't affect the number of dimensions. A quick sketch of this pattern follows below.
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps
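    Since LabVIEW is graphical, here is the suggestion above as a Python/NumPy sketch for reference: preallocate, fill one row per acquisition, and transpose once before saving. The 3648-sample width and the acquire_spectrum stub are assumptions standing in for the real driver call.

    import numpy as np

    n_wavelengths = 3648  # assumed detector width; use your spectrometer's value
    n_iterations = 100    # known in advance, so the array can be preallocated

    def acquire_spectrum():
        # Stub standing in for the spectrometer driver call.
        return np.random.rand(n_wavelengths)

    # One row per acquisition; transpose once at the end so that each
    # saved column is a single spectrum.
    data = np.empty((n_iterations, n_wavelengths))
    for i in range(n_iterations):
        data[i, :] = acquire_spectrum()

    np.savetxt("spectra.csv", data.T, delimiter=",")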

  • Confirmation date in real time

    Hi, is it possible to restrict the confirmation date to real time, meaning the system should accept only present-date confirmations in the CO11N screen, not confirmations dated yesterday? Please advise how to set this up.

    Hi,
    If you know the exit name, go to transaction CMOD.
    Choose menu Utilities -> SAP Enhancements. Enter the exit name and press Enter.
    You will now come to a screen that shows the function module exits for the exit.
    Using the project management of SAP enhancements, we want to create a project to enhance transaction CO11N:
    Go to transaction CMOD.
    Create a project called ZCO11N.
    Choose the Enhancement Assignments radio button and press the Change button.
    Please check these links, which will help you learn more about user exits:
    http://sap-img.com/abap/a-short-tutorial-on-user-exits.htm
    http://sap-img.com/abap/what-is-user-exits.htm
    http://sap-img.com/abap/what-is-the-difference-between-smod-and-cmod.htm
    http://sap-img.com/ab038.htm
    http://help.sap.com/saphelp_46c/helpdata/en/4a/5b75387be80518e10000009b38f889/frameset.htm
    Hope this will help.
    Regards,
    R.Brahmankar

  • Export journalized data in real time

    Hello,
    I must pull journalized data in real time out of an Oracle DB source.
    My package has an OdiWaitForData step and an interface that triggers only when there is journalized data in my source table. The interface loads data into the target Oracle table in real time, and it works.
    Is it possible to send journalized data out of the Oracle database in real time? Maybe as insert/update statements in a SQL file, as a CSV file, or through the OdiOSCommand tool somehow?
    Regards

    Hi,
    have you already looked at LogMiner?
    Cezar Santos
    www.odiexperts.com
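    To make the LogMiner suggestion concrete: the SQL_REDO column of V$LOGMNR_CONTENTS contains reconstructed insert/update statements, which is exactly what you could dump into a SQL file. Below is a minimal sketch using Python and the cx_Oracle driver; the connection string, log file path, and schema/table names are placeholders, and supplemental logging must be enabled on the source for the redo to be usable.

    import cx_Oracle  # Oracle driver; details below are placeholders

    conn = cx_Oracle.connect("user/password@source-db")
    cur = conn.cursor()

    # Register a redo log and start LogMiner with the online catalog.
    cur.execute("""
        BEGIN
            DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => :log,
                                    OPTIONS => DBMS_LOGMNR.NEW);
            DBMS_LOGMNR.START_LOGMNR(
                OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
        END;""", log="/u01/oradata/redo01.log")

    # Dump the reconstructed DML for one table into a .sql file.
    with open("changes.sql", "w") as out:
        for (sql_redo,) in cur.execute(
                "SELECT SQL_REDO FROM V$LOGMNR_CONTENTS "
                "WHERE SEG_OWNER = :o AND SEG_NAME = :t",
                o="MYSCHEMA", t="MYTABLE"):
            out.write(sql_redo + "\n")

    cur.execute("BEGIN DBMS_LOGMNR.END_LOGMNR; END;")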

  • Oracle Data Integrator - Real Time Integration

    Hi,
    I want to know whether there is any possibility of integrating data in real time using Oracle Data Integrator.
    If yes, does it affect the OLTP system's performance? (Could it read from DB logs, etc.?)
    Thanks..

    Using ODI with Logminer-based CDC will affect performance on the source system more than using Oracle GoldenGate. Let me explain why:
    When using ODI's Logminer-based journalisation, the Logminer functionality moves the primary key of each affected row into the journal table. When you are then ready to move the changed data, running an interface reads the journal view, which joins the journal table (primary keys) to the source data (dealing with deleted rows) and optimises out duplicate rows in order to bring across the then-current state of the data, which can then be loaded into the target system; on completion, the moved rows are removed from the journal table. The data appears in the journal table as soon as Logminer puts it there, which may mean a lag of up to two minutes with the asynchronous setting, whereas synchronous Logminer applies "system triggers" to the table, with the consequent overhead.
    With Oracle GoldenGate, committed transactions are read from the log as soon as the log writer puts the commit into the log. All the data is picked up from the log at that point. It is then written to GoldenGate's trail-file system, which can be propagated to multiple other systems, potentially with sub-second latency and with minimal impact on the source system thanks to its efficient reading and writing mechanisms. One other consequence of using Oracle GoldenGate is that you get every change to the data, not just the optimised then-current state at the time of the move.
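    As a toy model of the difference just described (pure illustration in Python, not either product's API): journal-based capture collapses the change stream to the then-current state per primary key, while log-based capture keeps every committed change.

    # Hypothetical change events in commit order: (primary_key, operation, value).
    events = [
        (1, "INSERT", "a"),
        (1, "UPDATE", "b"),
        (2, "INSERT", "x"),
        (1, "UPDATE", "c"),
        (2, "DELETE", None),
    ]

    # Log-based capture (GoldenGate-style): propagate every committed change.
    log_stream = list(events)

    # Journal-based capture (ODI-style): deduplicate down to the then-current
    # state per key; deleted rows are dropped entirely.
    state = {}
    for key, op, value in events:
        if op == "DELETE":
            state.pop(key, None)
        else:
            state[key] = value

    print(len(log_stream))  # 5 change records
    print(state)            # {1: 'c'} - one consolidated current row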
    Hope this explanation helps.

  • Receiving very large amounts of data via ibrd in Visual Basic

    I am trying to read anything from a single byte up to 2 MB of data back into a Visual Basic program. Until now I have been using something like:
    Dim Response As String * 60000
    ibrda hDevice, Response
    which works fine up to the magic 65,536-ish string size limit, but I want more!
    Ideally, reading into a very large byte array would suffice, e.g.:
    Dim ByteBuffer(0 To 2000000 - 1) As Byte
    ibrd hDevice, ByteBuffer
    But I cannot find a way to get the gpib-32.dll to accept the ByteBuffer as a destination in Visual Basic, even though ibrd32 is declared in vbib-32.bas as accepting Any type as a destination and a Long as a count, implying that the 65536 limit doesn't apply to that call.
    It may be possible to use repeated ibrd calls until one such call fails to fill the buffer, concatenating the results each time, but this seems a crude solution when the DLL would appear to be able to do the job in one go.
    Using ibrdf may work, but I would rather not use a file as intermediate storage.
    TIA

    I'm wondering if that 65536 figure is a VB cap for the String type. Referring to the language interface:
    Declare Function ibrd32 Lib "Gpib-32.dll" Alias "ibrd" (ByVal ud As Long, sstr As Any, ByVal cnt As Long) As Long
    Sub ibrd(ByVal ud As Integer, buf As String)  ' <---
        Dim cnt As Long
        cnt = CLng(Len(buf))
        Call ibrd32(ud, ByVal buf, cnt)
    End Sub
    isn't buf a string type?
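    For what it's worth, the repeated-read idea from the question is sound and is what many wrappers do internally. Here is a language-agnostic sketch of that loop in Python, where read_chunk stands in for a single ibrd-style call and the 60,000-byte chunk size mirrors the string workaround above.

    import io

    def read_all(read_chunk, chunk_size=60000):
        # Request chunk_size bytes repeatedly and concatenate the results;
        # a short (or empty) chunk signals the end of the transfer.
        parts = []
        while True:
            chunk = read_chunk(chunk_size)
            parts.append(chunk)
            if len(chunk) < chunk_size:
                break
        return b"".join(parts)

    # Stub demonstration with a 2 MB in-memory "device".
    device = io.BytesIO(b"\x00" * 2000000)
    data = read_all(device.read)
    assert len(data) == 2000000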

  • Setting resolution, deciding file type, for very LARGE Canvas prints. 36MP camera.

    Okay, so I noticed my Lightroom resolution was set to 240 PPI. I changed it to 300 because I read that 300 is standard for prints. What would I need to enter in the resolution box on export for a very large canvas?
    Is it better to choose TIFF instead of JPEG for prints of this quality? If not, what should I choose?
    I am using a Sony A7R full-frame 36.4 MP camera, and with some of the sharp Zeiss lenses there is really no noticeable pixelation when I zoom in to 100 percent. The A7R is said to have one of the best sensors on the market today; it's supposed to be like the Nikon D800E, but apparently it has some advantages.
    In other words, I want to export at as high a quality as possible for the canvas. File size is not an issue.

    Changing the resolution setting does absolutely nothing to the digital image. This is a common misconception. The only thing that counts in the digital image is the pixel dimensions. Regardless of what the PPI setting is (240, 300, 600, whatever) the image still has the same number of pixels. To determine what you need for a print of any size it is necessary to multiply the inches by the desired pixels per inch. Suppose you want a 16 x 20" print at 300 pixels per inch. The math would be something like this:
    300x16 = 4800 pixels
    300x20 = 6000 pixels
    So to print a 16 x 20" print you would need an image that is 4800 x 6000 pixels. And the PPI setting can be anything you want it to be because it has no effect on the image.
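    The same arithmetic as a small Python helper, for any print size (the 7360 x 4912 figure is the A7R's approximate pixel output and is an assumption, not something from the thread):

    def required_pixels(width_in, height_in, ppi=300):
        # Pixel dimensions needed for a print at the given size and PPI.
        return round(width_in * ppi), round(height_in * ppi)

    print(required_pixels(16, 20))  # (4800, 6000)
    # A 36.4 MP Sony A7R frame is roughly 7360 x 4912 pixels, so a
    # 16 x 20" print at 300 PPI is within reach (after cropping to 4:5).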

  • Best video capture tool for Panasonic AG-DVX100A

    I'm trying to decide on the best video capture/media converter tool for my Panasonic AG-DVX100A. The choice is between the Grass Valley ADVC 110 Media Converter and the Blackmagic Design video device for Mac OS X. Has anyone used either or both of these? What advantages does each have over the other? Thanks for your time and advice.

    Hi -
    The Grass Valley ADVC 110 Media Converter requires FireWire.
    I have and use the Blackmagic Video Recorder, which is USB-based:
    http://www.bhphotovideo.com/c/product/558914-REG/BlackmagicDesign_VIDREC_Video_Recorder_USBCapture.html
    BUT it converts the video to H.264, which is a very difficult format to edit in. It is primarily designed to let you load video onto your computer for viewing, posting to YouTube, or other very simple uses.
    If you intend to edit these files with iMovie, you might be able to get away with it (although importing the files will be slow); if you plan to edit with Final Cut Express or Final Cut Pro, you will need to convert them to an editable format.
    You might look at taking your camera somewhere to a facility that can properly capture the DV material on your Mini-DV tapes and put them on an external hard disk for you.
    Then you could plug in that disk on your computer and obtain access to the files that way.
    Hope this helps.

  • Best usage reporting tool for SharePoint 2013 (on-premises)

    Please suggest the best reporting tool for SharePoint 2013 on-premises.
    I need site usage summaries for any time in the last year.
    I need library/list usage summaries for any time in the last year, etc.
    Also peak hits and unique numbers of users across all levels.
    Thanks, Ram Ch

    Hi Ram,
    There are two links, “Popularity Trends” and “Popularity and Search Reports”, in the site settings. By clicking on these two links you can view the usage reports in SharePoint 2013.
    More references:
    http://technet.microsoft.com/en-us/library/jj715890(v=office.15).aspx
    http://sureshpydi.blogspot.com/2013/06/usage-reports-and-popularity-trends-in.html
    http://blogs.msdn.com/b/chandru/archive/2013/08/31/sharepoint-2013-web-analytics-report-where-is-it.aspx
    http://www.prweb.com/releases/2012/8/prweb9821144.htm
    Amit Kotha

  • Using SRM for very large contracts and contract management

    We are doing an SRM 7.01 implementation project. SRM will be used primarily for outsourced contract management. The contracts are all services associated with facilities (plant) maintenance and also support services like cleaning or catering.
    They have very large numbers of individually priced items (e.g. 10,000) per contract. The item price depends on the location where the work is expected to be performed. The location is represented by an SAP RE-FX architectural object. A service can be priced at any level of the hierarchy, e.g. service A is priced the same across the whole state, but service B is priced per campus.
    q1. SAP advises that there are performance limitations on SRM contracts with more than 2,000 lines. Has anyone experience with a solution that provides very large contracts in SRM? How did you do it, please?
    q2. SAP advises using the plant to represent the location for pricing purposes, but this would result in a very large number of plants. Has anyone experience with alternative solutions for variable location-based pricing in SRM contracts, i.e. integrating the RE-FX architectural object or similar into contract and PO line items?
    thanks very much

    Hi Prakash,
    SRM does provide contract management functionality in the form of purchase contracts and global outline agreements, but it is used as part of sourcing for materials and services. The materials or services have contracts against some given target value, against which POs are released. A contract is based on a material number (either a material or a service) which is used as a source of supply during creation of the shopping cart. It might not really fit the scenario of carriers and freight forwarders, but it can still be customized for this kind of use.
    The contract management functionality in the R/3 space can also be looked at for this purpose.
    Reg
    Sachin

  • Now I understand the reason for very large screen monitors

    I now understand why many people want to buy the new very-large-screen Macs. I have always loved my 17" flat-screen iMac G4, and since the picture itself is the same and the only thing that changes is the real estate around it, I always felt that getting a larger monitor would be an exercise in self-indulgence.
    Well, now I see that all of that surrounding real estate would be very useful for putting folders, documents, pictures, etc. on the screen and having them visible and accessible. For someone making a web page, and I imagine also for making iMovies, which I will soon do for my website, a 19" or larger screen would be VERY helpful. But when a person buys a new computer, all of that information has to be transferred... and that is enough to make a person stick with the 17" monitor (that and the price of a 21").
    — Lorna in Southern California

    Have you tried using Exposé to make that smaller screen more expansive? You can drag from one window to another with an Exposé transition in between. While not a replacement for a larger screen, it does help when I need it.
    Ken, I've had Tiger for about a week and the only things I've been working with are iWeb and iPhoto! Later I will explore Exposé. It sounds like they were trying to help us out and that's good.
    — Lorna in Southern California
