How to handle large data during acquisition? BNC 2110

I want to acquire data using a BNC 2110; I am writing software in VB 6. We will use 3 channels. We are supposed to scan about 10,000 points before AcquiredData is triggered. In all, we will need to scan 10000 * 1000 * 1000 points before the data is put into a binary file. Can anybody let me know how to handle this large number of points?

Hello Vjuno,
In order to acquire 10,000,000,000 points, you are going to have to stream the data to your hard drive as you go.  To do this you'll need to write the data you read to a file on each loop iteration.  In general it is good practice to make your "samples to read" at least 10% of your sample rate (that is, at least 0.1 seconds of data per read) to avoid overflowing buffers; depending on your computer, you may be able to go faster.  I made an example program in LabVIEW and was able to read 10,000 points at a time from each of 3 analog inputs at 333 kHz and write the values to file without overflowing a buffer.  However, even opening a web browser while the code was running was enough to delay the VI long enough for the buffer to overflow.
You can use the DAQmx Configure Input Buffer call to increase the buffer size and account for spikes in CPU usage from other processes, and you should also monitor the "Available Samples Per Channel" property to make sure samples aren't steadily accumulating in your buffer.  Since you want to acquire 10 billion samples at 1 MHz, this acquisition will take several hours; if you're not able to keep the buffer empty, that will become apparent well before the end of your acquisition.  By monitoring the samples in the buffer you can tell whether you're pulling them out fast enough; if this number is steadily increasing, you should either reduce the sample rate or increase the number of samples to read on each DAQmx Read call.
In my example program I wrote to a TDMS (binary) file and used a PCI-6251.
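The pattern itself is language-agnostic. As a minimal sketch in Java of the read-then-write loop (the DaqTask interface is a hypothetical stand-in for the driver calls; in VB 6 the equivalent calls go through the NI-DAQ ActiveX/DLL interface):

import java.io.BufferedOutputStream;
import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class StreamToDisk {

    // Hypothetical driver wrapper: read() stands in for DAQmx Read,
    // backlog() for the "Available Samples Per Channel" property.
    interface DaqTask {
        double[] read(int samplesPerChannel);
        int backlog();
    }

    static void acquire(DaqTask task, long totalSamples) throws IOException {
        final int chunk = 10_000; // at least 10% of the per-channel sample rate
        try (DataOutputStream out = new DataOutputStream(
                new BufferedOutputStream(new FileOutputStream("data.bin")))) {
            long written = 0;
            while (written < totalSamples) {
                double[] block = task.read(chunk); // drain one chunk from the buffer
                for (double sample : block) {
                    out.writeDouble(sample);       // stream straight to the binary file
                }
                written += block.length;
                // A steadily growing backlog means the loop can't keep up:
                // reduce the sample rate or increase the chunk size.
                if (task.backlog() > 5 * chunk) {
                    System.err.println("Buffer backlog: " + task.backlog() + " samples");
                }
            }
        }
    }
}

The key point is that every iteration both drains the driver buffer and writes to disk, so nothing accumulates in memory.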
I hope this helps, and have a good night.
Cheers,
Brooks

Similar Messages

  • How to handle large data in file adapter

    We have a scenario Proxy -> PI -> File Server using the File adapter.
    The File adapter uses FCC for conversion.
    Recently we had wave 2 products go live, and suddenly we have an increase in message volume for this interface, due to which the File adapter is not performing well: PI slows down or frequently disconnects from the file server. As a result, we either get duplicate records in the file or the file format created is wrong.
    The file size is somewhere around 4.07 GB, which I also think is quite high for PI to handle.
    Can anybody suggest how we can handle such large data?
    Regards,
    Vikrant

    Check this Blog for Huge File Processing:
    Night Mare-Processing huge files in SAP XI
    You can also take a look at this blog about high-volume messages:
    Step-by-Step Guide in Processing High-Volume Messages Using PI 7.1's Message Packaging
    PI Performance Tuning Best Practice:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2016a0b1-1780-2b10-97bd-be3ac62214c7?QuickLink=index&overridelayout=true&45896020746271

  • How to handle large data sets?

    Hello All,
    I am working on an editable form document. It uses a flowing subform with a table. The table may contain up to 50k rows, and the generated PDF may take up to 2-4 GB of memory; in some cases Adobe Reader fails and "gives up" opening these large data sets.
    Any suggestions? 

    On 25.04.2012 01:10, Alan McMorran wrote:
    > How large are you talking about? I've found QVTo scales pretty well as
    > the dataset size increases but we're using at most maybe 3-4 million
    > objects as the input and maybe 1-2 million on the output. They can be
    > pretty complex models though so we're seeing 8GB heap spaces in some
    > cases to accommodate the full transformation process.
    Ok, that is good to know. We will be working in roughly the same order
    of magnitude. The final application will run on a well equipped server,
    unfortunately my development machine is not as powerful so I can't
    really test that.
    > The big challenges we've had to overcome is that our model is
    > essentially flat with no containment in it so there are parts of the
    We have a very hierarchical model. I still wonder to what extent EMF and
    QVTo at least try to let go of objects which are not needed anymore and
    allow them to be garbage collected?
    > Is the GC overhead limit not tied to the heap space limits of the JVM?
    Apparently not, quoting
    http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html:
    "The concurrent collector will throw an OutOfMemoryError if too much
    time is being spent in garbage collection: if more than 98% of the total
    time is spent in garbage collection and less than 2% of the heap is
    recovered, an OutOfMemoryError will be thrown. This feature is designed
    to prevent applications from running for an extended period of time
    while making little or no progress because the heap is too small. If
    necessary, this feature can be disabled by adding the option
    -XX:-UseGCOverheadLimit to the command line."
    I will experiment a little bit with different GCs, namely the parallel GC.
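    For reference, a hedged example of how the flags mentioned above are passed on the command line (the heap size and main class are placeholders):

    java -Xmx8g -XX:+UseParallelGC -XX:-UseGCOverheadLimit com.example.Transform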
    Regards
    Marius

  • How to handle user exits while using a BAPI

    Hi experts, can anyone help me with how to handle user exits while using a BAPI? Do we need to handle them explicitly, or will the standard BAPI take care of it?
    Regards,
    Hari Krishna

    If you have added some fields using append structures for screen enhancements, then you have to use the appropriate user exits to fill this data when calling the BAPI.  Some BAPIs have EXTENSION structures for passing custom data, which can be processed using user exits or enhancements.
    Regards
    Vinod

  • How to handle large result set of a SQL query

    Hi,
    I have a question about how to handle large result set of a SQL query.
    My query returns more than a million records. However, the Query Template has a "row count" parameter: if I don't specify it, by default only 100 records are returned in the query result, and if I do specify it, the result is limited to that number.
    Is there any way to get around this row count issue? I don't want any restriction on the number of records returned by a query.
    Thanks a lot!

    No human can manage that much data...in a grid, a chart, or a direct-connected link to the brain. 
    What you want to implement (much like other customers with similar requirements) is a drill-in and filtering model that helps the user identify and zoom in on data of relevance, not forcing them to scroll through thousands or millions of records.
    You can also use a time-based paging model so that you only deal with one time "slice" per request (e.g. an hour, a day, etc.) and provide a scrolling window.  This is commonly how large datasets are dealt with in applications.
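    As a hedged sketch of that time-slice model in JDBC (the table and column names are made up for illustration):

    import java.sql.*;

    public class TimeSlicePager {

        // Fetch one bounded time "slice" of rows instead of the full
        // million-row result set; the caller advances the window to scroll.
        public static void printSlice(Connection conn, Timestamp from, Timestamp to)
                throws SQLException {
            String sql = "SELECT event_time, value FROM measurements "
                       + "WHERE event_time >= ? AND event_time < ? "
                       + "ORDER BY event_time";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setTimestamp(1, from);
                ps.setTimestamp(2, to);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getTimestamp(1) + " " + rs.getDouble(2));
                    }
                }
            }
        }
    }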
    I would suggest describing your application in more detail, and we can offer design recommendations and ideas.
    - Rick

  • How to handle time & date

    Can anyone tell me how to handle time and date correctly, using Calendar, GregorianCalendar, TimeZone, and Locale?
    Thank you very much.

    This is too large a topic to discuss in depth here. Here is a link to a tutorial on times and dates, and a search link that references many documents on the subject.
    http://java.sun.com/docs/books/tutorial/i18n/format/dateintro.html
    http://onesearch.sun.com/search/developers/index.jsp?and=calendar+&nh=100&phr=how+to&qt=&not=&field=&since=&col=javatecharticles&col=javatutorials&col=devall&rf=0&Search.x=20&Search.y=7
    When you have specific questions, just ask.
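    As a small starting point, here is a sketch that combines the four classes mentioned (the zone and locale are arbitrary examples):

    import java.text.DateFormat;
    import java.util.Calendar;
    import java.util.GregorianCalendar;
    import java.util.Locale;
    import java.util.TimeZone;

    public class DateDemo {
        public static void main(String[] args) {
            // The current instant, interpreted in a specific zone and locale.
            Calendar cal = new GregorianCalendar(
                    TimeZone.getTimeZone("America/New_York"), Locale.US);
            DateFormat fmt = DateFormat.getDateTimeInstance(
                    DateFormat.LONG, DateFormat.LONG, Locale.US);
            fmt.setTimeZone(cal.getTimeZone());
            System.out.println(fmt.format(cal.getTime()));
        }
    }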

  • In BDC, how do you handle header data and item data

    In BDC, how do you handle header data and item data?

    Raja,
    Can you be more clear?
    Usually you load the header data once, then loop over the item data and load it.
    This example should help you.
    http://www.sap-img.com/abap/bdc-example-using-table-control-in-bdc.htm
    Regards,
    Ravi
    Note - Please mark all the helpful answers

  • Can Express VIs handle large data

    Hello,
    I'm facing a problem handling large data using Express VIs. The input to each Express VI is a large waveform of 2M samples, and I am using 4 such Express VIs, each with 2M samples, connected in parallel. Processing this data takes much longer with the Express VIs than with general VIs or subVIs. Can anybody explain why the processing takes so much time? My understanding is that displaying large data in LabVIEW is not efficient, and since the Express VIs have an internal display in the form of the configure dialog box, I suspect most of the processing time is spent plotting the data on the graph of the configure dialog box. If this is correct, is there any solution to overcome this?
    Waiting for a reply.
    Thanks in advance

    Hi sayaf,
    I don't understand your reasoning for not using the "Open Front Panel"
    option to convert the Express VI to a standard VI. When converting the
    Express VI to a VI, you can save it with a new name and still use the
    Express VI in the same VI.
    By the way, have you heard about the NI LabVIEW Express VI Development Toolkit? That is the choice if you want to be able to create your own Express VIs.
    NB: Not all Express VIs can be edited with the toolkit - you should mainly use the toolkit to develop your own Express VIs.
    Have fun!
    - Philip Courtois, Thinkbot Solutions

  • How to handle the date attribute, passing a parameter from one page to another

    Hi friends,
    I want to pass a date attribute from one page to another page.
    I am passing it like below; in the JDev log window I get the error below.
    String StatusUpdateDate = row.getAttribute("StatusUpdateDate");
    params.put("StatusUpdateDate", StatusUpdateDate);
    Error(121,50): incompatible types; found: java.lang.Object, required: java.lang.String
    If instead I pass it like below, then while moving from one page to another I get the following error in the application:
    String StatusUpdateDate = row.getAttribute("StatusUpdateDate").toString();
    Status Update Date - JBO-25009: Cannot create an object of type:oracle.jbo.domain.Date with value:26-MAR-2009
    Please, can anyone suggest how to handle this error?
    Thanks and Regards,
    vamshi

    Hi Pratap, thanks for your help.
    It was my mistake: previously the property was varchar2. Now I have changed everything as you suggested, but I am still getting an error. This is my code:
    AM CODE-
    public void xxselection(String Name, String Email, String Product, String Region, DATE StatusUpdateDate)
    {
        DetailVOImpl vo1 = getDetailVO1();
        vo1.initQuery2(Name);
        Row detailRow = vo1.createRow();
        detailRow.setAttribute("Name", Name);
        detailRow.setAttribute("Email", Email);
        detailRow.setAttribute("Product", Product);
        detailRow.setAttribute("Region", Region);
        detailRow.setAttribute("StatusUpdateDate", StatusUpdateDate);
        vo1.last();
        vo1.next();
        vo1.insertRow(detailRow);
        detailRow.setNewRowState(Row.STATUS_INITIALIZED);
    }
    Controller- Process Form Request- Source page
    if (pageContext.getParameter("Detail") != null)
    {
        String Name = row.getAttribute("Name").toString();
        String Email = row.getAttribute("Email").toString();
        String Product = row.getAttribute("Product").toString();
        String Region = row.getAttribute("Region").toString();
        DATE StatusUpdateDate = (DATE)row.getAttribute("StatusUpdateDate");
        HashMap params = new HashMap();
        params.put(" Name", Name);
        params.put("Email", Email);
        params.put("Product", Product);
        params.put("Region", Region);
        pageContext.putTransactionTransientValue("StatusUpdateDate", StatusUpdateDate); // as you suggested
        pageContext.forwardImmediately("OA.jsp?page=/xxm/oracle/apps/pos/stg/webui/DetailStagePG",
                                       null,
                                       OAWebBeanConstants.KEEP_MENU_CONTEXT,
                                       null,
                                       params,
                                       true, // retain AM
                                       OAWebBeanConstants.ADD_BREAD_CRUMB_NO);
    }
    Another page controller - Process Request - destination page:
    String Name = pageContext.getParameter("Name");
    String Email = pageContext.getParameter(" Email");
    String Product = pageContext.getParameter("Product");
    String Region = pageContext.getParameter("Region");
    DATE StatusUpdateDate=(DATE)pageContext.getTransactionTransientValue("StatusUpdateDate");
    Timestamp tstmpStatusDate=StatusUpdateDate.timestampValue();
    System.out.println("tstmpStatusDate"+tstmpStatusDate);
    Serializable[] parameters1 = {Name,Email,Product,Region,tstmpStatusDate};
    am.invokeMethod("xxselection", parameters1);
    Error - I get this while running the application, moving from page to page:
    No method with signature - No method with signature - xxselection(class java.lang.String, class java.lang.String, class java.lang.String, class java.lang.String, class java.lang.String)
    Everything is getting passed except the DATE attribute; please check the code and update me.
    Thanks in advance,
    vamshi
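    A hedged note on the error above: invokeMethod looks the method up by the runtime classes of the Serializable parameters, so after switching to a Timestamp it can no longer find a five-String signature. One sketch of a fix, assuming your OAF version has the invokeMethod overload that takes explicit parameter types, is to declare the AM method with a java.sql.Timestamp parameter and pass the types along:

    // Sketch only, not verified against your OAF version. Declare the AM method as
    //   public void xxselection(String Name, String Email, String Product,
    //                           String Region, java.sql.Timestamp StatusUpdateDate)
    // and invoke it with explicit parameter types so the lookup matches:
    Serializable[] parameters1 = {Name, Email, Product, Region, tstmpStatusDate};
    Class[] parameterTypes1 = {String.class, String.class, String.class,
                               String.class, java.sql.Timestamp.class};
    am.invokeMethod("xxselection", parameters1, parameterTypes1);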

  • How to handle large heap requirement

    Hi,
    Our application requires a large amount of heap memory to load data into memory for further processing.
    The application is load balanced, and we want to share the heap across all servers so one server can use the heap of another.
    Server1 and Server2 have 8 GB of RAM and Server3 has 16 GB of RAM.
    If a request comes to Server1 and it requires more heap memory to load data, can Server1 use Server3's heap memory in this scenario?
    Is there any mechanism/product which allows us to share heap across all the servers? Or is there any other way to handle the large heap requirement?
    Thanks,
    Atul

    user13640648 wrote:
    Hi,
    Our Application requires large amount of heap memory to load data in memory for further processing.
    Application is load balanced and we want to share the heap across all servers so one server can use heap of other server.
    Server1 and Server2 have 8GB of RAM and Server3 has 16 GB of RAM.
    If any request comes to server1 and if it requires some more heap memory to load data, in this scenario can server1 use serve3’s heap memory?
    Is there any mechanism/product which allows us to share heap across all the servers? OR Is there any other way to handle large heap requirement issue?
    That isn't how you design it (based on your brief description.)
    For any transaction A you need a set of data X.
    For another transaction B you need a set of data Y which might or might not overlap with X.
    The set of data (X or Y) is represented by discrete hunks of data (form is irrelevant) which must be loaded.
    One can preload the server with this data or do a load on demand.
    Once in memory it is cached.
    One can refine this further with alternative caching strategies that define when loaded data is unloaded and how it is unloaded.
    JEE servers normally support this in a variety of forms. But one can custom code it as well.
    JEE servers can also replicate cached data across server instances. Custom code can do this but it is more complicated than doing the custom caching.
    A load balanced system exists for performance and failover scenarios.
    Obviously in a failover situation a "shared heap" would fail completely (as asked about) because the other server would be gone.
    One might also need to support very large data sets. In that case something like Memcached (google for it) can be used. There are commercial solutions in this space as well. This allows for distributed caching solutions which can be scaled.
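    As a minimal sketch of the load-on-demand-plus-cache idea using only the JDK (loadData is a hypothetical stand-in for whatever source holds the discrete hunks of data):

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class DataCache {

        private final Map<String, byte[]> cache = new ConcurrentHashMap<>();

        // Load on demand: the first request for a key pays the load cost;
        // later requests for the same key are served from memory.
        public byte[] get(String key) {
            return cache.computeIfAbsent(key, this::loadData);
        }

        // Hypothetical stand-in for loading one hunk of data
        // (database read, file read, remote call, ...).
        private byte[] loadData(String key) {
            return new byte[0];
        }
    }

    A real deployment would add an eviction strategy on top of this, and for very large data sets the map would move out of process into something like Memcached.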

  • In BAPI PO creation, how to handle error data

    Hi friends,
    In BAPI PO creation, when uploading the data, how do we handle/capture error data?
    arun

    Hi,
    After the program completes, the IT_RETURN table will have all the messages in it.
    Loop over the IT_RETURN internal table and display the data.
    Regards
    Sudheer

  • How to handle the Date & Time Object?

    Hi,
    I have a big problem. I am working on a J2EE application (using JSPs and servlets) running on the JBoss app server. The server is located in India, but there are two offices, one in India and one in the USA. I want to handle the date and timestamp objects in a common way, but each office should be able to calculate its own date and timestamp (e.g., India, USA). How can I solve this issue?
    Help appreciated!
    P.Saravanan


  • How to handle large images?

    Hi,
    Does anyone know how to handle big JPEG images (1280*960) so that they can be presented in a MIDlet?
    The problem is that the images require so much memory that they can't be decoded to an Image object with the Image.createImage method. One solution would be to extract the thumbnail image from the EXIF headers; unfortunately, at least images taken with a Nokia 6680 don't contain a thumbnail in their EXIF headers.
    So the only solution seems to be to decode the byte presentation of the image and resize it before creating an Image object.
    Does anybody know of a library for this, or have tips on where to start?
    Br, Ilpo

    Hi,
    I think it is not possible. My application contains a file browser (which uses JSR-75). The user can use the browser to select an image either from phone memory or from the memory card. After the selection I would like to present the selected image so that the user can be sure it is the right one. The selected image will then be sent to the server side with some additional data for further processing (but that is another story).
    Now the problem is that, for example, with a Nokia 6680 the user can take images as big as 1280*960, and I can't present them any more because of the memory restrictions. With a 640*480 image there is no problem, because I can create an Image object and then use a simple algorithm to resize the image for presentation.
    Br, Ilpo
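    For the 640*480 case that does fit in memory, the resize step might look like this sketch (it assumes MIDP 2.0 for Image.getRGB and createRGBImage, and uses nearest-neighbour scaling):

    import javax.microedition.lcdui.Image;

    public class ImageScaler {

        // Nearest-neighbour scale of an already-decoded Image (MIDP 2.0).
        public static Image scale(Image src, int newW, int newH) {
            int srcW = src.getWidth();
            int srcH = src.getHeight();
            int[] srcPix = new int[srcW * srcH];     // ~1.2 MB for 640*480
            src.getRGB(srcPix, 0, srcW, 0, 0, srcW, srcH);
            int[] dstPix = new int[newW * newH];
            for (int y = 0; y < newH; y++) {
                int sy = y * srcH / newH;            // nearest source row
                for (int x = 0; x < newW; x++) {
                    int sx = x * srcW / newW;        // nearest source column
                    dstPix[y * newW + x] = srcPix[sy * srcW + sx];
                }
            }
            return Image.createRGBImage(dstPix, newW, newH, true);
        }
    }

    The same arithmetic on a 1280*960 source would need a ~4.7 MB int array on top of the decoded Image, which is exactly where the memory restriction bites.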

  • How to handle an empty DATS field received in a SAP RFC response

    Hi All,
    I am invoking a SAP RFC which gives me a DATS field in the response.
    A valid DATS field is successfully received by my pipelines,
    but when an empty DATS field is received, my pipeline fails and I get an error.
    How do I handle the empty DATS field from SAP?

    Hi Anant,
    This is because the legacy SAP adapter accepted RFC messages with the date field empty. In the new version the same call results in an error, because the WCF-SAP adapter doesn't allow blank XML nodes.
    You need to use the below custom pipeline component as a workaround.
    Refer:
    Pipeline component for enabling legacy behavior in WCF-SAP adapter.
    Rachit
    Please mark as answer or vote as helpful if my reply helps.

  • How to handle the date error?

    Actually I am using three dropdown lists: day, month, and year.
    Day contains 31 items with the values 1, 2, 3, ..., 31;
    month contains Jan, Feb, ..., Dec;
    and year contains 1981, 1982, ..., 2007.
    I concatenate these three to form a date:
    d := :day||'-'||:month||'-'||:year;
    Now my problem is that when I select a non-existing date such as 31 Jun 1980 or 30 Feb 1980, it throws
    "Unhandled trigger ..... "
    Please tell me how to handle it.

    There are a number of different exceptions you can get with dates.
    SELECT To_Date('10132007','DDMMYYYY') FROM dual;
    ORA-01843: not a valid month
    SELECT To_Date('99122007','DDMMYYYY') FROM dual;
    ORA-01847: day of month must be between 1 and last day of month
    SELECT To_Date('10-FEB-2007','DDMMYYYY') FROM dual;
    ORA-01858: a non-numeric character was found where a numeric was expected
    For this I would just trap the ORA-01847 and 'when others' exceptions.
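    If the same validation were done in Java instead (a sketch for comparison), disabling lenient parsing rejects the impossible dates up front, before the concatenated string ever reaches the database:

    import java.text.ParseException;
    import java.text.SimpleDateFormat;

    public class DateCheck {

        public static boolean isValidDate(String d) {
            SimpleDateFormat fmt = new SimpleDateFormat("dd-MM-yyyy");
            fmt.setLenient(false); // reject 31-06-1980, 30-02-1980, ...
            try {
                fmt.parse(d);
                return true;
            } catch (ParseException e) {
                return false;
            }
        }

        public static void main(String[] args) {
            System.out.println(isValidDate("30-06-1980")); // true
            System.out.println(isValidDate("31-06-1980")); // false: June has 30 days
        }
    }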
