Real-time process slowly filling up Postgres DB space

I have a real-time job which takes data from a JMS queue, processes the data, and then applies inserts/updates to an Oracle DB. In the task definition for the process, Results drill down is set to "None", yet the Postgres DB grows substantially as the job runs. Also, the following SQL seems to return one additional row per JMS message processed (and since pg_stat_user_tables has one row per table, that suggests a new results table per message)...
select count(*) from pg_stat_user_tables where schemaname = 'results'
Does anybody have any clues as to what may be causing the DB to grow?
Cheers
Jon
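
For anyone hitting the same thing, here is a minimal diagnostic sketch (plain JDBC; the connection URL, credentials and the 20-row limit are placeholders for your own environment, not EDQ-specific values) that lists the largest tables in the 'results' schema, so you can see whether a new results table really is being created per message and which relations are consuming the space. It assumes the PostgreSQL JDBC driver is on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ResultsSchemaSizes {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- point these at the results database.
        String url = "jdbc:postgresql://localhost:5432/resultsdb";
        try (Connection con = DriverManager.getConnection(url, "user", "password");
             Statement st = con.createStatement();
             // One row per ordinary table in the 'results' schema, largest first.
             ResultSet rs = st.executeQuery(
                 "SELECT c.relname, pg_size_pretty(pg_total_relation_size(c.oid)) AS total_size " +
                 "FROM pg_class c JOIN pg_namespace n ON n.oid = c.relnamespace " +
                 "WHERE n.nspname = 'results' AND c.relkind = 'r' " +
                 "ORDER BY pg_total_relation_size(c.oid) DESC LIMIT 20")) {
            while (rs.next()) {
                System.out.printf("%-40s %s%n", rs.getString("relname"), rs.getString("total_size"));
            }
        }
    }
}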

It looks like we're going to have to - Postgres has just died...
WARNING: 22-May-2014 04:24:27: Unable to index table : 'DNM_819_776_1_ingr'
WARNING: 22-May-2014 04:24:27: A database error has occurred : ERROR: could not create relation base/25589/3549571: File too large. (Code: 200,302)
com.datanomic.director.results.database.exception.sql.ResultsSQLException: A database error has occurred : ERROR: could not create relation base/25589/3549571: File too large. (Code: 200,302)
at com.datanomic.director.results.database.translator.MapErrorCodes.mapException(MapErrorCodes.java:70)
at com.datanomic.director.results.database.AbstractTableDAO.executeSQL(AbstractTableDAO.java:66)
at com.datanomic.director.results.database.AbstractTableDAO.executeSQL(AbstractTableDAO.java:39)
at com.datanomic.director.results.database.TableInsertDao.addIndexes(TableInsertDao.java:291)
at com.datanomic.director.results.database.TableInsert.close(TableInsert.java:423)
at com.datanomic.director.results.database.TableInsert.close(TableInsert.java:301)
at com.datanomic.director.match.runtime.data.writers.AbstractDBWriter.close(AbstractDBWriter.java:173)
at com.datanomic.director.match.runtime.data.realtime.ResultsBucket.finishRealtimeBuckets(ResultsBucket.java:63)
at com.datanomic.director.match.runtime.RealtimeHandler.finalizeDBStore(RealtimeHandler.java:623)
at com.datanomic.director.match.munger.MatchRealtimeExecutor.doTheStuff(MatchRealtimeExecutor.java:303)
at com.datanomic.director.runtime.engine.RuntimeProcessMunger$MungerExecutable.execute(RuntimeProcessMunger.java:872)
at com.datanomic.utils.execution.Parallelizer$Worker.run(Parallelizer.java:210)
at com.datanomic.utils.execution.Parallelizer$Worker.runHere(Parallelizer.java:156)
at com.datanomic.utils.execution.Parallelizer.run(Parallelizer.java:85)
at com.datanomic.director.runtime.engine.RuntimeProcessMunger.execute(RuntimeProcessMunger.java:459)
at com.datanomic.utils.execution.Parallelizer$Worker.run(Parallelizer.java:210)
at java.lang.Thread.run(Thread.java:722)
Do you recognise this as the ultimate failure mode of too much data being written to the Postgres DB?
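
If the bundled Postgres is down, a filesystem-level check can still show which relation files have grown. The sketch below (plain Java NIO; the data-directory path is a placeholder) walks the base directory under the Postgres data directory -- the directory that contains the file named in the error, base/25589/3549571 -- and prints the twenty largest files.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Comparator;
import java.util.stream.Stream;

public class LargestRelationFiles {
    public static void main(String[] args) throws IOException {
        Path base = Paths.get("/path/to/postgres/data/base"); // placeholder PGDATA/base
        try (Stream<Path> files = Files.walk(base)) {
            files.filter(Files::isRegularFile)
                 .sorted(Comparator.<Path>comparingLong(LargestRelationFiles::size).reversed())
                 .limit(20)
                 .forEach(p -> System.out.printf("%12d bytes  %s%n", size(p), base.relativize(p)));
        }
    }

    private static long size(Path p) {
        try {
            return Files.size(p);
        } catch (IOException e) {
            return 0L; // unreadable file: treat as empty rather than aborting the scan
        }
    }
}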

Similar Messages

  • SAP Script real-time process

    Hi, this is Siva. I want the SAP Script real-time process.


  • Can an Acrobat plugin be involved in real-time processing?

    Hi,
    I am implementing an Acrobat plug-in solution for color separation and generation of a PRN file for a given PDF file.
    Is the plug-in architecture suitable for real-time processing and RIPping? My concern is that the Acrobat process
    is involved in executing each and every plug-in HFT function; given that, would my plug-in be able to respond
    in real time?
    Please advise.
    Thanks & Regards,
    Abdul Rasheed.

    First and foremost, Acrobat can NOT be used on a server, right?  So this is going to be a user-invoked process, correct?
    Beyond that, what do you think would be a problem?

  • Real time processing in formula node

    Hello all,
    I'm using LabVIEW for data acquisition of an analog signal. My sampling frequency is in the kHz range. If I have to process these data in a Formula Node, what type of variable do I need to define in the Formula Node?
    For example, if I want to add up the magnitudes of all the samples, squaring each sample and storing the running sum, how do I write the Formula Node?
    I'm an amateur in LabVIEW, so a beginner-level explanation would help.
    Thanks in advance.

    pramodantony wrote:
    If I have to process these data in a Formula Node, what type of variable do I need to define in the Formula Node?
    Who forces you to use the Formula Node? You simply need to process the data; the means should be irrelevant.
    Acquired data in LabVIEW can take many forms (arrays, waveforms, dynamic data, etc.). Can you show us what you have?
    LabVIEW Champion. Do more with less code and in less time.
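
    A quick sketch of the computation being described, in plain Java rather than a Formula Node (LabVIEW is graphical, so this is only meant to pin down the logic): square each incoming sample and keep a running sum. In LabVIEW the accumulator would typically live in a shift register rather than a local variable.

    public class RunningSumOfSquares {
        public static void main(String[] args) {
            // Stand-in for a block of acquired samples.
            double[] samples = {0.5, -1.2, 3.0, 0.25};
            double sumOfSquares = 0.0;
            for (double s : samples) {
                sumOfSquares += s * s; // add this sample's squared magnitude to the running total
                System.out.printf("sample=%6.2f  running sum of squares=%8.4f%n", s, sumOfSquares);
            }
        }
    }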

  • I have a problem with playback in real time

    Playback in real time is slow; I have a Blackmagic DeckLink Duo card.
    What is the problem? Should I change the settings on the project?
    My English isn't good; I hope you will understand me.
    Thank you.

    >Is mov file 1920 x 1080
    That doesn't really say much - what is the FORMAT of the media?  Look at the Item Properties.  Posting a screen shot of the Item Properties would help.  Do the same with the Sequence Settings.
    >how can i change FCP's RT settings to Dynamic.
    Use the drop-down RT menu near the upper left-hand corner of the Timeline.
    Also please answer the other questions:
    Where is the media stored?
    How is that hard drive connected to your Mac?
    How much available free space on each hard drive (percentage)?  Please list free space individually for each hard drive.
    -DH

  • Continuous data acquisition and analysis in real time

    Hi all,
    This is a VI for the continuous acquisition of an ECG signal. As far as I understand, the DAQmx Analog Read VI needs to be placed inside a while loop so it can acquire the data continuously. I need to perform filtering and analysis of the waveform in real time. The way I set up the block diagram means that the data stays in the while loop, and as far as I know the data will only be transferred out through the data tunnels once the loop finishes executing; clearly this is not real-time data processing.
    The only way I can think of fixing this problem is by placing another while loop that covers the filtering-stage VIs and using some sort of shift register to pass the data to the second loop. My questions are whether or not this would introduce some sort of delay, and whether or not this would be considered real-time processing. Would it be better to place all the VIs (acquisition and filtering) inside one while loop, or is this bad programming practice? Another function I need is to save the data in a file, but only when the user wants to do so.
    Any advice would be appreciated.

    You have two options:
    A.  As you mentioned, you can place code inside your current while loop to do the processing.  If you are clever, you won't need to place another while loop inside your existing one (nested loops).  But that totally depends on the type of processing you are doing.
    B.  Create a second parallel loop to do the processing.  This decouples the processes to ensure that the processing does not hinder your acquisition.  See here for more info.
    Your choice really depends on the processing that you plan to perform.  If it is processor-intensive, it might introduce delays as you mentioned.
    I would recommend you first try to place everything in the first loop, and see if your DAQ buffer overflows (you can monitor the buffer backlog while it's running).  If so, then you should decouple the processes into separate loops.
    Whether or not "this would be considered to be real time processing" is a loaded question.  Most people on these forums will say that your system will NEVER be real-time because you are using a desktop PC to perform the processing (note: I am assuming this is code running on a laptop or desktop?).  It is not a deterministic system, and your data is already "old" by the time it exits your DAQ buffer.  But the answer to your question really depends on how you define "real-time processing".  Many lay people will define it as the processing of "live data"... but what is "live data"?
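
    A sketch of option B in ordinary Java (the acquisition is simulated; on real hardware the producer would call your DAQ read instead of sleeping): two independent loops decoupled by a bounded queue, so slow analysis cannot stall acquisition, and the queue backlog tells you whether the consumer is keeping up.

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class ProducerConsumerDaq {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<double[]> queue = new ArrayBlockingQueue<>(100); // bounded buffer between the loops

            Thread producer = new Thread(() -> {
                try {
                    for (int block = 0; block < 50; block++) {
                        double[] chunk = new double[1000]; // pretend this came from the DAQ buffer
                        Thread.sleep(10);                  // simulated acquisition time
                        queue.put(chunk);                  // blocks only if the consumer falls far behind
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            Thread consumer = new Thread(() -> {
                try {
                    for (int block = 0; block < 50; block++) {
                        double[] chunk = queue.take();     // wait for the next block
                        double mean = 0;
                        for (double v : chunk) mean += v;  // stand-in for filtering/analysis
                        mean /= chunk.length;
                        System.out.println("processed block, mean=" + mean + ", backlog=" + queue.size());
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
            producer.join();
            consumer.join();
        }
    }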

  • TCP/IP Connecting with Real Time Controller

    I have a host running LabVIEW on Windows XP and a real-time embedded controller on a PXI chassis that acts as the server.  When the real-time target is started it automatically goes into listen mode and listens for a connection from the host.  The host opens a connection.  After a valid connection is open, the real-time side goes into a TCP Read and the host can then send commands that the real-time target processes and sends to the FPGA on the PXI chassis.
    Now the problem I'm having is how to handle the case when a TCP connection is lost.  I can have the TCP Read on the real-time side error on a timeout and then go into listen mode, but this isn't very logical because then the host would have to reconnect each time a timeout occurs.  If instead I make the TCP Read timeout infinite and the connection is lost (let's say I unplug the Ethernet cable and plug it back in), then I cannot recover from this and the real-time target needs to be rebooted.
    I've tried sending the real-time target into listen mode if the error code is anything other than a timeout error (code 56), and having it go back to TCP Read mode if it is a timeout error.  But if the connection is lost by physical means (such as me pulling the Ethernet wire and plugging it back in), the real-time target never sees that the connection is invalid.  The host, on the other hand, can detect it because it gets an error when it tries to write.
    So my question is:
    Is there any way to prevent an infinite loop that needs a reboot, and at the same time prevent the host from reconnecting every time there is a timeout?

    Hi SJeane,
    I apologize for taking so long to respond, but I wanted to test this on my end.  In doing so, I realized that using the RT Reboot Controller.vi after the connection is lost does not work because the message to reboot cannot be relayed to the target without communication!  Thus, to solve this problem, we have to approach it a different way.  You mentioned that you tried programmatically clearing errors, but did you try to reestablish connection after clearing the errors?  I tested this on my end with a FieldPoint controller, and the attached VIs resumed operation even after unplugging/replugging the Ethernet cable (no reboot).  Will this solution work for you?
    Peter K.
    National Instruments
    Attachments:
    Reestablish.zip ‏39 KB
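
    For what it's worth, the same listen/read/re-listen idea can be sketched with plain Java sockets (not LabVIEW RT; the port number and the line-based framing are placeholders): an outer loop that always returns to listening when a connection dies, plus a finite read timeout so the server does not block forever on a link that silently went away.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.ServerSocket;
    import java.net.Socket;
    import java.net.SocketTimeoutException;

    public class ReListeningServer {
        public static void main(String[] args) throws IOException {
            try (ServerSocket server = new ServerSocket(6340)) {   // placeholder port
                while (true) {                                     // outer loop: always go back to "listen"
                    System.out.println("Listening for a host connection...");
                    try (Socket client = server.accept();
                         BufferedReader in = new BufferedReader(
                                 new InputStreamReader(client.getInputStream()))) {
                        client.setSoTimeout(5000);                 // 5 s read timeout instead of infinite
                        while (true) {
                            try {
                                String command = in.readLine();
                                if (command == null) break;        // peer closed cleanly: listen again
                                System.out.println("Handling command: " + command);
                            } catch (SocketTimeoutException idle) {
                                // No traffic for 5 s: keep the connection, or ping the host here.
                            }
                        }
                    } catch (IOException broken) {
                        // Accept/read failed (e.g. link dropped): fall through and listen again.
                        System.out.println("Connection lost: " + broken.getMessage());
                    }
                }
            }
        }
    }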

  • CRIO-9067 Real-time unexpected error restart (Hex 0x661)

    Hi all,
    I recently moved to LV2014 SP1 and NI-RIO 14.5 to support development of a cRIO-9067 Linux RT controller. The past two days of development have seen LabVIEW crash numerous times (eight at last count), along with numerous VIs being in an undeployable state due to some unseen error (there are no broken run arrows anywhere, and a LabVIEW restart seems to cure what was broken).
    The above issues are tolerable, if somewhat annoying. What's concerning is the most recent error I received this morning. I had just run the top-level VI from source (and verified it was running), went to make myself a coffee and came back to this error:
    LabVIEW: (Hex 0x661) The LabVIEW Real-Time process encountered an unexpected error and restarted automatically.
    The VI wouldn't have been running for more than 10 minutes. I've since tried to reproduce the error without luck.
    This is somewhat concerning as the application for this cRIO will be 24/7 process control. Is this a known issue with the newer Linux RT controllers? Is there anything that can be done to detect the error in LabVIEW?
    Below is a screenshot of the software installed on the controller:

    You should be able to access the error logs on your controller through MAX.
    Is There an Error Log File for My Real-Time Controller? - http://digital.ni.com/public.nsf/allkb/E734886E027D0B6586256F2400761E30?OpenDocument
    How Do I Locate the Error Logs on My cRIO if I Don't Have MAX? - http://digital.ni.com/public.nsf/allkb/9D2F9D4F8C834D678625766D00633837
    Could you post the error log files for the CompactRIO targets that reproduce the error?
    It's possible that a memory leak is occurring someplace in your program that causes the crash.
    Also, are you using the System Config Set Time VI anywhere in your application? There has been a reported issue with this VI in relation to this error, but the hex error code is not common. We can try to cross-compare the issue internally to see if there are similarities to the reported case.
    Additionally, is this crash repeatable with other code, say a simple shipping example? I would imagine that the crash is related to a routine called in your program, but it's possible there is a corruption in the software installation on your target.
    Also, what is ProcessCommandMessage.vi responsible for/doing in your application?
    Will M.
    Applications Engineer
    National Instruments

  • Real time JVM  Implementation

    Hi All,
    Some students at our university (myself included) would like to implement a real-time JVM on Linux.
    We have a solid grounding in Linux as well as in Java, but this is a very new and interesting topic for us.
    Can anyone advise on how to start the assignment, i.e. the implementation of a real-time JVM on Linux?
    Thanks and Regards
    tapas

    Just guessing...
    There are two main obstacles to real-time processing in Java.
    First, the garbage collector: it runs at unpredictable times and ties up the rest of the application for unpredictable amounts of time, so a real-time system needs a different type of solution.
    Second, threads: because threads preempt each other, they can also produce unpredictable behavior. The usual solutions involve either no threads at all or some way to preclude interruption.
    Finally, I suspect that if you do a literature search you will be able to find some papers on this subject.
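
    A small sketch of that nondeterminism in plain Java on a stock JVM: one thread churns out garbage to provoke collections while a "periodic" 10 ms task measures how late it wakes up. The numbers vary by machine, but the worst-case lateness is exactly the kind of jitter a real-time JVM has to bound.

    import java.util.ArrayList;
    import java.util.List;

    public class JitterDemo {
        public static void main(String[] args) throws InterruptedException {
            Thread garbageMaker = new Thread(() -> {
                List<byte[]> junk = new ArrayList<>();
                while (!Thread.currentThread().isInterrupted()) {
                    junk.add(new byte[64 * 1024]);         // allocate steadily...
                    if (junk.size() > 1000) junk.clear();  // ...and let the collector reclaim it
                }
            });
            garbageMaker.setDaemon(true);
            garbageMaker.start();

            long periodNanos = 10_000_000L;                // intended 10 ms period
            long next = System.nanoTime() + periodNanos;
            long worstLatenessNanos = 0;
            for (int i = 0; i < 1000; i++) {
                long sleepNanos = next - System.nanoTime();
                if (sleepNanos > 0) {
                    Thread.sleep(sleepNanos / 1_000_000, (int) (sleepNanos % 1_000_000));
                }
                long lateness = System.nanoTime() - next;  // how late did this period start?
                worstLatenessNanos = Math.max(worstLatenessNanos, lateness);
                next += periodNanos;
            }
            System.out.printf("Worst wake-up lateness: %.3f ms%n", worstLatenessNanos / 1e6);
        }
    }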

  • Real time solution using Documaker

    Hi all,
    We are using Documaker 10.3. Currently we have a flat file as input to Documaker, which creates the forms in our daily batch. Now we are planning to generate a few forms in real time, i.e. on the click of a button in a source system a document should be generated.
    Right now all I can think of is creating a flat file from the source system and generating a form (a miniature batch), though I'm not satisfied with this approach. Is there a better way of dealing with this kind of situation? Is there any facility in Documaker (which I am obviously not aware of) that can take care of this?
    Thanks in advance

    Venkata,
    You can certainly continue to use the flat-file approach if it satisfies the requirements for your real-time processing. If the source system is not able to generate a "batch of one" extract file, then you will need to explore other methods of generating your input for Documaker, which are going to be specific to your source system -- can you elaborate on this?
    When running Documaker in real-time mode, it's a fairly simple process using components in the Oracle Documaker suite -- specifically Docupresentment. However, with pre-11.4 versions of Documaker, the licensing for Docupresentment was handled separately, so you will need to ensure you are licensed for Docupresentment to use it.

  • Process Chain for Real-Time Daemon

    Please help, I am stuck. I followed the steps on SDN, but one step is missing: how do I create the process chain now?
    I created the following:
    DSO connected to DataSource via transformation,
    real-time InfoPackage,
    real-time DTP
    assigned to the DataSource, and assigned the DS, IP and DTP to the daemon in RSRDA. I have also started it manually via "Start All IPs". But how do I set up the process chains now?
    Please help me step by step with the process chain, since I am new to using daemons in process chains.
    Thanks
    Soniya

    Hi,
    refer to this:
    CREATION OF PROCESS CHAINS
    Process chains are used to automate the loading process.
    They are used in all applications, as you cannot schedule hundreds of InfoPackages manually every day.
    Metachain
    Steps for a metachain:
    1. Start (in this variant, set your schedule times for the metachain)
    2. Local Process Chain 1 (say it's a master data process chain - go into the start variant of this chain (a sub-chain, like any other chain) and check the second radio button "Start using metachain or API")
    3. Local Process Chain 2 (say it's a transaction data process chain - do the same as in step 2)
    Steps for process chains in BI 7.0
    For a Cube:
    1. Start
    2. Execute InfoPackage
    3. Delete Indexes for Cube
    4. Execute DTP
    5. Create Indexes for Cube
    For a DSO:
    1. Start
    2. Execute InfoPackage
    3. Execute DTP
    4. Activate DSO
    For an InfoObject:
    1. Start
    2. Execute InfoPackage
    3. Execute DTP
    4. Attribute Change Run
    Data to a Cube through a DSO:
    1. Start
    2. Execute InfoPackage (loads up to the PSA)
    3. Execute DTP (to load the DSO from the PSA)
    4. Activate DSO
    5. Further Processing
    6. Delete Indexes for Cube
    7. Execute DTP (to load the Cube from the DSO)
    8. Create Indexes for Cube
    3.x
    Master data loading (Attributes, Texts, Hierarchies)
    Steps:
    1. Start
    2. Execute InfoPackages (say you are loading 2 InfoObjects - just have them all in parallel)
    3. You might want to load in sequence: Attributes - Texts - Hierarchies
    4. AND process (connecting all InfoPackages)
    5. Attribute Change Run (add all relevant InfoObjects)
    The flow, roughly, is:
    Start
    -> InfoPackage1A (Attr) | InfoPackage2A (Attr)
    -> InfoPackage1B (Txts) | InfoPackage2B (Txts)
    -> InfoPackage1C (Txts)
    -> AND processor (connects InfoPackage1C & InfoPackage2B)
    -> Attribute Change Run (add InfoObject 1 & InfoObject 2 to this variant)
    For a Cube:
    1. Start
    2. Delete Indexes for Cube
    3. Execute InfoPackage
    4. Create Indexes for Cube
    For a DSO:
    1. Start
    2. Execute InfoPackage
    3. Activate DSO
    For an InfoObject:
    1. Start
    2. Execute InfoPackage
    3. Attribute Change Run
    Data to a Cube through a DSO:
    1. Start
    2. Execute InfoPackage
    3. Activate DSO
    4. Further Processing
    5. Delete Indexes for Cube
    6. Execute InfoPackage
    7. Create Indexes for Cube

  • Design requirement for processing IDocs in real time

    Hi Gurus,
    We have a requirement to design for processing IDocs in real time, i.e. all the IDocs must be stored and processed within a given timeline without using a BPM tool. We have suggested using cache storage or storing the files in the file adapter.
    Please suggest a good approach that does not compromise the performance of PI.
    Regards
    Shankar

    Hi Raj,
    Please find the example below for your reference. Let's say:
    System A --> PI --> System B
    System A sends multiple files, both on a schedule (daily or weekly) and in real time (as and when a record is created), to System B (CRM) in the form of IDocs via PI.
    We need to club the real-time files together and send them to System B. How best can we do this without using BPM?
    Regards
    Shankar

  • DPC spike at Windows automatic maintenance startup. Cannot leave the computer alone while processing real-time streams.

    This is what happens when I leave the computer idle for a while and the Windows automatic maintenance starts:
    Driver file: ntoskrnl.exe (NT Kernel & System) - ISR count: 0, DPC count: 50764, highest execution: 0.235854 ms, total execution: 332.950426 ms
    A DPC spike is generated by ntoskrnl.exe, causing dropouts in real-time streams.
    JTS

    Hi Fjtorsol,
    We hope your issue has been resolved; if you've found a solution by yourself, you could share it with us and we will mark it as the answer.
    Such high Deferred Procedure Call (DPC) latencies are usually caused by certain drivers. If it is caused by automatic maintenance, please re-check your task schedule for tasks marked as "when computer is idle". As suggested by MVP ZigZag, please use the Microsoft Windows Performance Analyzer from the Windows Assessment and Deployment Kit (ADK) to identify the cause of any DPC latency spikes:
    https://www.microsoft.com/en-gb/download/details.aspx?id=39982
    The DPC CPU Usage Summary Table will open containing a list of drivers/programs. This list is already sorted correctly (by the Actual Duration column), so the process at the very top of the list is likely to be the cause of your problem.
    Regards
    D. Wu
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • How to make 1D Array but with only one element filled in Real Time

    Hi folks,
    here I am with another question. I want to implement a prediction discrete state-space observer which is going to run on a cRIO real-time target. I am going to do it just like the example which comes with LabVIEW.
    I have some questions regarding the inputs and outputs, which in the example are "dummy" values.
    My model is a SISO model, but the "Construct SS Model" function returns the parameters (the A, B, C, D matrices) as 2D arrays, so once you connect the model cluster into the Discrete Observer model, it takes y and u as 1D arrays despite the fact that it is a SISO model.
    I realized that the function I am using in the simulations uses 1D arrays with only one element filled.
    Does anyone know how to implement such 1D arrays in real time? I guess the way to do it is to preallocate an array of zeros of size 1, recirculate it through a shift register, and replace the element with my real input and output; but in the dummy VI they are using a simple "Build Array" function.

    OK, I did it that way. But I am facing another problem right now...
    At some point the Discrete Observer returns a NaN array; you can see the code in the snippet.
    I get rid of that component by component, but the observer gets "stuck" in it. So my control law is zero... but the state estimate is NaN.
    I am also attaching the VI.
    I do not know why, since everything runs well in the simulation program. Any thoughts? Maybe the internal numeric precision of the state-space model?
    Attachments:
    RT - Pole Placement + Complete Observer.vi ‏40 KB
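
    The pattern described above, sketched in plain Java rather than LabVIEW (observerStep, readControlOutput and readMeasurement are hypothetical stand-ins for the observer update and the per-cycle I/O): allocate the one-element arrays once, then overwrite their single element each cycle instead of building a new array, which avoids per-iteration allocations on a real-time target.

    public class SingleElementBuffer {
        public static void main(String[] args) {
            double[] u = new double[1]; // preallocated once, reused every cycle
            double[] y = new double[1];
            for (int k = 0; k < 10; k++) {
                u[0] = readControlOutput(k);   // hypothetical per-cycle control output
                y[0] = readMeasurement(k);     // hypothetical per-cycle measurement
                observerStep(u, y);            // hypothetical observer update taking 1-D arrays
            }
        }

        private static double readControlOutput(int k) { return Math.sin(0.1 * k); }
        private static double readMeasurement(int k) { return Math.cos(0.1 * k); }
        private static void observerStep(double[] u, double[] y) {
            System.out.printf("u=%.3f  y=%.3f%n", u[0], y[0]);
        }
    }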

  • How to update the HTML file so that we can Control our process in real time

    After following the three steps below, as per the Lookout 4 online help, I am unable to monitor and control the process in HTML format, which was exported manually on the Lookout server:
    1) Creating a Web Client Page in Lookout
    2) Downloading a Lookout Web Client
    3) Setting Up Your Own Web Server
    My browser shows only the instance that I uploaded manually, without any updates.
    Problem: how do I automatically update/refresh the HTML file so that we can monitor/control our process in real time, in bi-directional mode?

    Hi,
    It seems like your process is not updating. When you create a Web Client, it uses ActiveX, which lets you control the Lookout process fully. Make sure that you run the process. You can do this by pressing Ctrl+Space, which puts it in run mode. Perhaps then you will see your graphs, etc. updating.
    Also, please refer to page 11-1 of the Users Manual linked below:
    http://www.ni.com/pdf/manuals/322390a.pdf
    What kind of Web Server are you using? Make sure all the settings in it are done properly. If you have LabVIEW, you can use the LabVIEW Web Server.
    Hope this information is helpful. Please let us know if you have any further questions.
    Regards,
    A Saha
    Applications Engineer
    National Instruments
    Anu Saha
    Academic Product Marketing Engineer
    National Instruments

Maybe you are looking for

  • Invoice is not being imported while running request: Payables Open Interface

    Dear all, we are on release R12. We need to import legacy invoices into Oracle E-Business through the import program. Right now we are working on importing the invoices through the request mentioned in the subject, but after running this request the system is not being import

  • SSD freezes for a few days then returns to normal and smart OK

    Hello <EDIT> The freezes have stopped as suddenly as they had started. Logs show that all SMART tests passed successfully for the last three days: # journalctl |grep self-test juil. 26 05:05:25 llewellyn smartd[442]: Device: /dev/sdb [SAT], previ

  • Pictures loading with question marks

    Lately whenever I load websites like Pinterest or Netflix, the pictures never load. Instead, blue boxes with white question marks load. My sister has a Mac as well and this has been happening to her too. How do I fix this?

  • Strange problem when copying files to NAS

    Hello all. I have a strange problem when trying to copy files from my Intel iMac to my Seagate NAS. I am copying files from my HD, and when the file is nearly copied I get an error message saying that the file can't be copied because it is already in

  • Concurrent invocations to an undefined number of partner links

    Hi everybody! I'm getting introduced to BPEL and I have a problem describing my first process: I need something (a structure, container, etc.) that allows me to make an undefined number (n) of concurrent invocations to an undefined