Real-time application and a large amount of data

Hi,
I have a real-time application that needs to allocate a large amount of data in memory (more than 5 GB). I'm using version 1.6 of the JRE and I'm planning to migrate to 1.8, because the application experiences a lot of "stop the world" pauses every day. I've spent a lot of time analysing and testing all the parameters and policies of the GC.
The question is: with version 1.8, do the pauses in the application caused by the GC cleaning process disappear?
Thanks.

Just noting that the GC only really needs to do something when objects are created and then are no longer in use. So if your application needs a large amount of memory, then keeping it rather than discarding it might be a better solution.
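A note on the original question: moving to 1.8 by itself will not make the stop-the-world pauses disappear. The default collector in Java 8 is still the parallel (throughput) collector; G1 (-XX:+UseG1GC) can be tuned toward a pause-time goal, but it still has stop-the-world phases. One way to apply the advice above is to allocate the large buffers once and reuse them, so the steady state creates almost no garbage. A minimal sketch (the queue-based pool and byte[] buffers are illustrative assumptions, not from the thread):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Pre-allocates fixed-size buffers and recycles them, so the old
// generation stays stable and the collector has little to do.
public class BufferPool {
    private final BlockingQueue<byte[]> pool;

    public BufferPool(int buffers, int bufferSize) {
        pool = new ArrayBlockingQueue<byte[]>(buffers);
        for (int i = 0; i < buffers; i++) {
            pool.add(new byte[bufferSize]); // all memory allocated up front
        }
    }

    public byte[] acquire() throws InterruptedException {
        return pool.take();                 // blocks until a buffer is free
    }

    public void release(byte[] buffer) {
        pool.offer(buffer);                 // hand the buffer back for reuse
    }
}

Sized at startup (say, a few thousand 1 MB buffers for a 5 GB working set), a pool like this gives the collector a stable old generation instead of a constant churn of multi-megabyte allocations.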

Similar Messages

  • JSP and large amounts of data

    Hello fellow Java fans
    First, let me point out that I'm a big Java and Linux fan, but somehow I ended up working with .NET and Microsoft.
    Right now my software development team is working on a web tool for a very important microchip manufacturing company. This tool handles large amounts of data; some of our online reports generate more than 100,000 rows, which need to be displayed in a web client such as Internet Explorer.
    We make use of Infragistics, which is a set of controls for .NET. Infragistics allows me to load data fetched from a database into a control they call UltraWebGrid.
    Our problem comes up when we load large amounts of data into the UltraWebGrid, sometimes 100,000+ rows; during this loading our IIS server's memory gets exhausted, and it can take up to 5 minutes for the server to finish processing and display the 100,000+ row report. We have already established that the database server (SQL Server) is not the problem; our problem is the IIS web server.
    Our team is now considering migrating this web tool to Java and JSP. Can you help me with some links, information, or past experiences you have had with loading and displaying amounts of data like the ones we handle in JSP? Help will be greatly appreciated.

    Who in the world actually looks at a 100,000-row report?
    Anyway, if I were you and I had to do it because some clueless management person decided it was a good idea... I would write a program that, once a day, week, year, or whatever your time period is, produced the report (maybe as a PDF, though you could do it in HTML if you really must have it that way) and have it as a static file that you link to from your app, as in the sketch below.
    Then the user will just have to wait while it downloads, but the web server or web application server will not be bogged down trying to produce that monstrosity.
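    A minimal sketch of that "produce it once, serve it statically" idea (the output path, the 24-hour period, and the report body are illustrative assumptions; inside a servlet container you would more likely use a container-managed scheduler):

    import java.io.FileWriter;
    import java.io.IOException;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class NightlyReport {
        public static void main(String[] args) {
            // Regenerate the static report once a day; the web app just
            // links to the finished file instead of building it per request.
            ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
            scheduler.scheduleAtFixedRate(NightlyReport::writeReport, 0, 24, TimeUnit.HOURS);
        }

        static void writeReport() {
            try (FileWriter out = new FileWriter("/var/www/reports/daily.html")) {
                out.write("<html><body><table>\n");
                // ... stream the 100,000+ rows from the database here,
                // one row at a time, so memory stays flat ...
                out.write("</table></body></html>\n");
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }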

  • Can LabVIEW handle fast and large amounts of data?

    Hello,
    I want to develop a server for a GPS tracking system. This system will be communicating with about a thousand devices at a time. My question is whether LabVIEW can be used to implement a server on the Linux platform to handle such fast and heavy transactions without the software/system hanging. I can use a multi-core processor, probably a Core 2 Quad, to build this application.
    Please let me know whether LabVIEW can be used to design this type of application.
    Hasan Baig

    Scott W wrote:
    As discussed previously, you should be able to implement your application using LabVIEW, but you should take care to design the code properly. From what I understand, you need LabVIEW to be able to process ~256 kB/s. LabVIEW will have no problem doing that, but the problem may lie in getting the information from ~1000 computers in 5 seconds. This will all be determined by:
    1. Your network hardware and setup.
    2. The code used to pull the data from the network.
    There is a library available for using Modbus with LabVIEW; I suggest reading up on it and learning more about the library's API. You can find all that information here: http://sine.ni.com/nips/cds/view/p/lang/en/nid/201711
    If you would rather have an easy, out-of-the-box solution that does not require too much time, there is the DSC module available for LabVIEW. This greatly simplifies your application and also makes data communication very efficient. You can find more information about the DSC module here: http://sine.ni.com/nips/cds/view/p/lang/en/nid/1010
    Just as a point of advice, the DSC module will most likely save you anywhere from 1 to 2 weeks of solid work, depending upon your experience with LabVIEW.
    A couple of notes on the previous (sorry Scott, but people tell me they appreciate me being honest):
    1) The Modbus library at one time (I don't know if it has been fixed) was a CPU hog because there were no "zero ms waits" in any of the loops. It takes some time, but it can be fixed.
    2) Although DSC may save 1-2 weeks if we don't know LV, it ends up costing me 1-2 weeks of effort every time I go through a major upgrade (I have been supporting DSC apps since BridgeVIEW 2.1). Plus, since all of the DSC functions are hidden, protected, buried... I have no option to fix DSC. I also believe the Modbus in DSC comes through the Lookout protocol driver, which is supported out of (?) Singapore, so I am reduced to e-mail support. Does DSC run on Linux? The file system used to support DSC is complex; it is hard to find out what does what, and when I last checked, security software stepped on its functionality with no sign that logging had stopped until we went back to look for it and found it was not working. It cost one of my customers almost $5K to pay me to troubleshoot why the DSC history was failing on five of their machines. In the end all I could tell them was, "Unplug the first network cable before starting a test." And as to the speed of DSC... I find it interesting that the white paper comparing DSC with DataSocket reads has disappeared from the NI site.
    So....
    Code it yourself so you can fix it.
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction
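    To make the "code it yourself" option concrete: the load discussed above is roughly 1000 devices polled every 5 seconds at ~256 kB/s aggregate, which a plain thread pool handles comfortably. A minimal sketch, written in Java rather than LabVIEW since this is a text medium (the 32-thread pool, the timeouts, and the raw-socket read are illustrative assumptions, not the Modbus protocol itself):

    import java.net.InetSocketAddress;
    import java.net.Socket;
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class DevicePoller {
        // 32 workers cycling through 1000 devices stay inside a 5-second
        // window as long as a single poll completes in ~150 ms.
        private final ExecutorService pool = Executors.newFixedThreadPool(32);

        public List<Future<byte[]>> pollAll(List<InetSocketAddress> devices) {
            List<Future<byte[]>> results = new ArrayList<Future<byte[]>>();
            for (final InetSocketAddress addr : devices) {
                results.add(pool.submit(() -> poll(addr)));
            }
            return results;
        }

        private byte[] poll(InetSocketAddress addr) throws Exception {
            try (Socket s = new Socket()) {
                s.connect(addr, 1000); // 1 s connect timeout
                s.setSoTimeout(1000);  // 1 s read timeout
                byte[] buf = new byte[256];
                int n = s.getInputStream().read(buf);
                return n > 0 ? Arrays.copyOf(buf, n) : new byte[0];
            }
        }
    }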

  • What kinds of real-time problems and issues arise in data modelling

    Hello experts,
    I am not a BW expert, so I just want to gather some knowledge on problems and issues related to data modelling. If anybody has a document or a link, can you please send it to [email protected] or reply to this post? I promise to reward proper answers; I hope that is the way I say thanks for your help.
    kishore

    hi kishore,
    The first and most important problem is to understand the client's requirements. Clients are sometimes not even aware of how to use BW effectively, as they are very used to traditional SAP R/3 reporting, so most of the time we have to suggest options to the client.
    If we come to the technical aspect, identifying the key figures and characteristics is the most important task, along with how you map them to the available InfoObjects. Attributes also play a major role, for example which attribute is better used as a navigational attribute and which as a display attribute.
    There is a long list of data modelling issues; it's best to refer to the Mastering BIW book by McDonald. The SAP Help Library also has good material.
    regards,
    kamaljeet

  • How to download an FPGA VI along with a real-time application

    Hello
    I am targeting a cRIO-9012 controller with a cRIO-9102 chassis. I downloaded the FPGA VI to flash memory, then built a real-time application and set it as startup, but there were no signals on the module I/Os which are handled by the FPGA VI.
    Also, the shared variables of that application are not accessible when it runs standalone. Please give me the steps that should be followed to access the shared variables of a standalone real-time system, and help me resolve these problems.
    Best Regards
    Mani

    Hi Mani,
    What modules do you have?  What kinds of signals are you measuring?
    Have you deployed your shared variable library on your host PC?
    Regards,
    Jeremy_B
    Applications Engineer
    National Instruments

  • Is there any way to connect a Time Capsule to a MacBook Pro directly via USB? I have a large amount of data that I want to back up and it is taking a very long time (35 GB is taking 3 hrs; I have 2 TB of files in total).

    I want to use the Time Capsule as backup for an archive which is currently stored on a 2 TB WESC HD.

    No, you cannot back up via a direct USB connection.
    But gigabit ethernet is much faster anyway. Are you connected directly by ethernet?
    Is the drive you are backing up from plugged into the TC? That will slow it down something chronic; plug that drive in by its fastest connection method. WESC, sorry, I have no idea; if it has ethernet, use that, otherwise USB direct to the computer. Always think about which way the files come and go: since you are copying from the computer, everything has to go that way, and it makes things slower if they go over the same cable, if you catch the drift.
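    For a rough sense of the numbers involved (back-of-the-envelope, using the figures from the question):

    \frac{35\ \text{GB}}{3\ \text{h}} \approx \frac{35{,}000\ \text{MB}}{10{,}800\ \text{s}} \approx 3.2\ \text{MB/s}, \qquad \frac{2\ \text{TB}}{3.2\ \text{MB/s}} \approx 625{,}000\ \text{s} \approx 7\ \text{days}, \qquad \frac{2\ \text{TB}}{100\ \text{MB/s}} \approx 20{,}000\ \text{s} \approx 5.5\ \text{h}

    So the observed rate is wireless-class; a wired gigabit link (roughly 100 MB/s in practice) turns a week-long first backup into an overnight one.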

  • Pulling large amounts of data using OData and the client API takes a long time in Project Server 2013

    We are trying to pull large amounts of data in Project Server 2013 using both client API and OData calls, but it seems to take a long time. How is this done?
    In Project Server 2010 we did this by creating SQL views in the reporting database and, for lists, a view in the content database. Our IT dept is saying we can't do this anymore. How does a view in the Project database or the content database create issues, as long as we don't add a field to a table? And how is one supposed to do this without creating a view?

    Hello,
    If you are using Project Server 2013 on premises, I would recommend using T-SQL against the dbo schema in the Project Web App database for your reports; this will be far quicker than the APIs. You can create custom objects in the dbo schema; see the link below:
    https://msdn.microsoft.com/en-us/library/office/ee767687.aspx#pj15_Architecture_DAL
    It is not supported to query the SharePoint content database directly with T-SQL or add any custom objects to the content database.
    Paul
    Paul Mather | Twitter | http://pwmather.wordpress.com | CPS | MVP | Downloads
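    To illustrate the T-SQL route Paul describes, a hedged sketch reading one of the reporting views over JDBC (the connection string is an assumption, and while MSP_EpmProject_UserView with a ProjectName column is a typical reporting view in the Project Web App database, verify the names against your own dbo schema):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class ProjectReport {
        public static void main(String[] args) throws SQLException {
            String url = "jdbc:sqlserver://dbserver;databaseName=ProjectWebApp;integratedSecurity=true";
            // One round trip straight to the reporting view, instead of
            // paging through the OData or client-side APIs.
            String sql = "SELECT ProjectName FROM dbo.MSP_EpmProject_UserView";
            try (Connection con = DriverManager.getConnection(url);
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(sql)) {
                while (rs.next()) {
                    System.out.println(rs.getString("ProjectName"));
                }
            }
        }
    }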

  • cRIO and NI 9234 modules not communicating through FPGA with accelerometers; FPGA connected to a real-time application, which is in turn connected to shared variables linked to a Modbus slave

    Hi,
    I have a CompactRIO with a 4-slot chassis; attached to that chassis are three NI 9234 modules. They are linked through FPGA to a real-time application, and then through shared variables in the low-speed loop to a Modbus slave to communicate with the DCS. The NI 9234s have accelerometers connected to them, with the IEPE AC-coupled option set on the C modules. My problem: the real-time application seems to be running okay (even when power loss occurs it restarts with no problem) and the FPGA writes the bin files to the portable hard drive fine, but without an accelerometer connected I get low noise readings, and as soon as I connect an accelerometer to any one of the 10 inputs it just goes to a fixed number (0.03125); as soon as I disconnect it again, it reverts to reading noise. I have run a scan on the modules and only get a spike when I connect or disconnect the accelerometer. I have tested the voltage at the pins of the module and I get 22 volts DC, which makes it more likely that the hardware is not the problem and that software is maybe causing this to hang up. I attach the project and files for your perusal. I also made a new project which in scan mode directly linked the module input to a shared variable, with the same scenario again. Help would be much appreciated.
    Many thanks
    Jason
    Attachments:
    logger 2plusmodbus2.zip (679 KB)

    When using waveform acquisition with the 9234s, we recommend the following FPGA and RT template:
    http://sine.ni.com/nips/cds/view/p/lang/en/nid/209114
    It can be extended as a data logger with:
    http://zone.ni.com/devzone/cda/epd/p/id/6388
    or by using shared variables combined with the scan engine:
    http://zone.ni.com/devzone/cda/tut/p/id/9851
    The FPGA code in all of these, as well as the RT framework, has been used successfully by thousands of users. I would recommend giving these a try.
    Preston Johnson
    Principal Sales Engineer
    Condition Monitoring Systems
    Vibration Analyst III - www.vibinst.org, www.mobiusinstitute.com
    National Instruments
    [email protected]
    www.ni.com/mcm
    www.ni.com/soundandvibration
    www.ni.com/biganalogdata
    512-683-5444

  • Which Java collection for a large amount of data and a user-customizable record?

    I'm trying to write an application which operates on a large amount of data. I want the user to be able to customize the data structure (record) from different types of variables (float, int, bool, string, enums). These records should be stored in some kind of array. Size of a record: 1-200 variables; size of the array of those records: about 100,000 items (one record every second through a whole day). I want these data stored in some embedded database (SQLite, HSQLDB), accessed using simple JDBC. Could you give me some advice on how to design those data structures? Sincerely yours :)
    OK, maybe I should give an example. This is some C++ code.
    I made an interface (with the missing pieces filled in so it compiles):
    #include <string>
    #include <vector>
    using std::string;
    typedef unsigned char BYTE;
    struct ParamType { enum Type { INTEGER, FLOAT, BOOL, STRING }; };
    class CParamI {
    public:
         virtual ~CParamI() {}
         virtual string toString() = 0;
         virtual void addValue( CParamI *src ) = 0;
         virtual void setValue( CParamI *src ) = 0;
         virtual BYTE getType() = 0;
    };
    Then I made a template class derived from the interface CParamI:
    template <class T>
    class CParam : public CParamI {
    public:
         CParam();
         void setValue( T val ) { value = val; }
         T getValue() { return value; }
         string toString() { return string(); } // stub
         BYTE getType() { return itemType; }
         void addValue( CParamI *src ) { setValue( src ); }
         void setValue( CParamI *src ) {
              if ( itemType == src->getType() ) {
                   CParam<T> *ptr = static_cast<CParam<T>*>( src );
                   value = ptr->value;
              }
         }
    private:
         BYTE itemType;
         T value;
    };
    A sample constructor for the <int> specialisation:
    template<> CParam<int>::CParam() {
         itemType = ParamType::INTEGER;
    }
    This solution makes it possible to build a collection of CParamI pointers:
    std::vector<CParamI*> myCollection;
    CParam<int> *pi = new CParam<int>();
    pi->setValue(10);
    myCollection.push_back( pi );
    Is this a correct solution? My main problem is getting data back out of the collection: I have to check the data type using the getType() method of the CParamI interface.
    Could you please give me some advice, some idea of how to do this right in Java?

    If you have the requirement of being able to configure on the fly, then what I've done in the past is just put everything into data pairs in a list: something along the lines of (<Vector>, <String>), where the Vector stores your data and the String contains its data type. I would then make a checker to validate the input against the SQL datatypes that I want to support on the project. It's not a big deal with the amount of data you are talking about.
    The problem you're going to have is when you try to allow dynamic definition, on the fly, of data being input to a table that has already been defined. Your DB will not support that, unless you just store that data pair, which I do not suggest.
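    A rough Java counterpart of the C++ sketch above, as a minimal sketch rather than a finished design (the ParamType/Param/RecordDemo names are illustrative, not from the thread): keep a type tag next to a generic value, hold the fields in a List, and switch on the tag to recover the concrete type.

    import java.util.ArrayList;
    import java.util.List;

    public class RecordDemo {
        // Type tag for each user-defined field.
        enum ParamType { INTEGER, FLOAT, BOOL, STRING }

        // One field of a user-customised record: a type tag plus a value.
        static class Param<T> {
            private final ParamType type;
            private T value;

            Param(ParamType type, T value) {
                this.type = type;
                this.value = value;
            }

            ParamType getType() { return type; }
            T getValue() { return value; }
            void setValue(T v) { value = v; }
        }

        public static void main(String[] args) {
            // A record is a list of typed fields; ~100,000 records
            // (one per second for a day) is well within reach of a List.
            List<Param<?>> record = new ArrayList<Param<?>>();
            record.add(new Param<Integer>(ParamType.INTEGER, 10));
            record.add(new Param<String>(ParamType.STRING, "pump #3"));

            for (Param<?> p : record) {
                switch (p.getType()) { // recover the concrete type
                    case INTEGER:
                        System.out.println("int: " + p.getValue());
                        break;
                    case STRING:
                        System.out.println("str: " + p.getValue());
                        break;
                    default:
                        System.out.println(p.getValue());
                }
            }
        }
    }

    The same tag can then drive the mapping to SQL types when the records are written to SQLite/HSQLDB over JDBC.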

  • On iOS 7.0.2, every time I unlock my phone while my music is playing, it skips forward or backward, by both small and large amounts of time.


    Hi cowboyincognito,
    Thanks for visiting Apple Support Communities.
    If you don't have the option to play music by genre on your iPhone, try these steps to find the Genre option:
    Browse your music library.
    You can browse your music by playlist, artist, or other category. For other browse options, tap More. Tap any song to play it.
    To rearrange the tabs in the Music app, tap More, then tap Edit and drag a button onto the one you want to replace.
    Best Regards,
    Jeremy

  • Is the only way to import a large amount of data and database objects into a primary database to shut down the standby, turn off archive log mode, do the import, and then rebuild the standby?

    I have a primary database into which I need to import a large amount of data and database objects. 1) Do I shut down the standby? 2) Turn off archive log mode? 3) Perform the import? 4) Rebuild the standby? Or is there a better way or a best practice?

    Instead of rebuilding the (whole) standby, you can take an incremental (from SCN) backup from the primary and restore it on the standby. That way, for example:
    a. If only two out of 12 tablespaces are affected by the import, the incremental backup would effectively contain only the blocks changed in those two tablespaces (plus some changes in system and undo), provided that there are no other changes in the other ten tablespaces.
    b. If the size of the import is only 15% of the database, the incremental backup to restore to the standby is correspondingly small.
    Hemant K Chitale

  • Error message when retrieving and displaying large amounts of data

    Hello,
    I am querying my database (MySQL) and displaying the data in a DataGrid (note that I am using Flex 2.0).
    It works fine when the amount of data populating the grid is small, but when I have a large amount of data I get the following error messages and the grid is not populated.
    ERROR 1
    faultCode:Server.Acknowledge.Failed
    faultString:'Didn't receive an acknowledge message'
    faultDetail: 'Was expecting mx.messaging.messages.AcknowledgeMessage, but received null'
    ERROR 2
    faultCode:Client.Error.DeliveryInDoubt
    faultString:'Channel disconnected'
    faultDetail: 'Channel disconnected before an acknowledge was received'
    Note that my DataGrid is populated when I run the query on my server, but it does not work on my client PCs.
    Your help would be greatly appreciated.
    Awaiting a reply.
    Regards

    Hello,
    I am using remote object services, with a component (ColdFusion) as the destination.

  • My phone is using large amounts of data. When I go to System Services, it's my mapping services that are causing it. What are mapping services and how do I switch them off? I really need help.


    I have the same problem. I switched off Location Services, maps in data, whatever else maps could be involved in, and then just last night it chewed through 100 MB. I'm also on Vodacom, so I'm seeing a pattern here somehow. Siri was switched on, however, so I've switched it off now and will see what happens. But I'm going to go to both Apple and Vodacom this afternoon, because this must be sorted out; it's a serious issue we have on our hands, and some uproar needs to be made against those responsible!

  • Large Amount of Data in JSF

    Hello,
    I am using the Table Group component for displaying data in my application, designed in Java Studio Creator.
    I have enabled paging on the component and use a CachedRowSet on the page's bean to get the data. This works very well at the moment in my development environment, where I am testing with a small amount of data.
    I was wondering how this component performs with very large amounts of data (>75,000 rows). I noticed that there is a button available for users to retrieve all the rows. Apart from that instance, when viewing in paged mode, does the component fetch all the results from the database every time?
    Which component would be best suited for displaying large amounts of data in a table format?
    Thanks In Advance!!

    Thanks for your reply. The table control that I use does have paging as a feature and I have enabled it, but it still takes time to load the data initially.
    I wonder if that is down to the logic of the paging: how do you specify which set of 20 records to extract with SQL?
    Thanks for your help!!
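    On the follow-up question of how to extract a given set of 20 records with SQL: push the paging into the query itself, so only that page ever crosses the wire. A minimal JDBC sketch (the customers table and name column are invented for illustration; LIMIT/OFFSET is MySQL/PostgreSQL syntax, while SQL Server and Oracle use ROW_NUMBER() or OFFSET ... FETCH instead):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;

    public class PagedQuery {
        // Fetches one page of rows; only pageSize rows leave the database.
        public static List<String> fetchPage(Connection con, int page, int pageSize)
                throws SQLException {
            String sql = "SELECT name FROM customers ORDER BY name LIMIT ? OFFSET ?";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setInt(1, pageSize);
                ps.setInt(2, page * pageSize); // page 0 = rows 1-20, page 1 = 21-40, ...
                try (ResultSet rs = ps.executeQuery()) {
                    List<String> rows = new ArrayList<String>();
                    while (rs.next()) {
                        rows.add(rs.getString("name"));
                    }
                    return rows;
                }
            }
        }
    }

    A stable ORDER BY matters here: without it, the database is free to return rows in a different order on each page.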

  • DSS problems when publishing large amounts of data fast

    Has anyone experienced problems when sending large amounts of data using the DSS? I have approximately 130 to 150 items that I send through the DSS to communicate between different parts of my application.
    There are several loops publishing data: one publishes approximately 50 items at a rate of 50 ms, another about 40 items at a 100 ms publishing rate.
    I send a command to a subprogram (125 ms) that reads and publishes the answer on a DSS URL (approx. 125 ms), so each item is on the DSS for about 250 ms. But this data is not seen in my main GUI window that reads the DSS URL.
    My questions are:
    1. Is there any limit in speed (frequency) for data publishing in DSS?
    2. Can DSS become unstable if loaded too heavily?
    3. Can I lose/miss data in any situation?
    4. In the DSS Manager I have doubled MaxItems and MaxConnections. How will this affect my system?
    5. When I run my full application I have experienced the following error: Fatal Internal Error: "memory.cpp", line 638. Can this be a result of my large application and the heavy load on the DSS? (See the attached picture.)
    Regards
    Idriz Zogaj
    Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
    Memory Professional
    direct: +46 (0) - 734 32 00 10
    http://www.zogaj.se

    LuI wrote:
    >
    > Hi all,
    >
    > I am frustrated with VISA serial comm. It looks so neat and it's
    > fantastic what it is supposed to do for a developer, but sometimes one
    > runs into trouble very deep.
    > I have an app where I have to read large amounts of data streamed by
    > 13 µCs at 230 kBaud. (They do not necessarily need to stream all at the
    > same time.)
    > I use either a Moxa multiport adapter C320 with 16 serial ports or,
    > for test purposes, a Keyspan serial-to-USB adapter with 4 serial
    > ports.
    Does it work better if you use the serial port(s) on your motherboard?
    If so, then get a better serial adapter. If not, look more closely at VISA.
    Some programs have issues on serial adapters but run fine on a regular serial port. We've had that problem recently.
    Best, Mark
