Maximum data rate RT host can handle (sbRIO 9602)

Hi all, I have an sbRIO-9602.
In my application the FPGA reads data from a custom module at 1 MHz, and I'm wondering whether
the RT host is capable of reading that data and displaying it on a graph without losing samples.
I'd like to know this before looking into how to implement any buffered/producer-consumer/DMA transfer, because
if the RT host simply can't keep up with this data rate, I will sooner or later get a buffer overflow.
So far my idea is to copy 1000 contiguous samples into a memory block in one shot, read that memory from the RT host, and display it on a waveform graph.
But again, if it is possible to display the acquired data continuously, that is obviously better.
Thanks in advance.
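For what it's worth, the arithmetic behind the chunked approach can be sketched outside LabVIEW; here is a minimal Python stand-in (the buffer depth of 16 and the chunk count of 8 are made-up numbers, and a Queue stands in for the DMA FIFO):

```python
# Back-of-the-envelope check, in Python rather than LabVIEW: reading
# in chunks divides the RT loop rate, and a bounded FIFO models the
# FPGA-to-host buffer. The buffer depth (16) and chunk count (8) are
# made-up numbers for illustration.
import queue

SAMPLE_RATE_HZ = 1_000_000
CHUNK_SIZE = 1_000                             # samples per host read
host_loop_rate = SAMPLE_RATE_HZ / CHUNK_SIZE   # host loop only runs at 1 kHz

fifo = queue.Queue(maxsize=16)

# Producer (FPGA side): deposit whole chunks, not single samples.
# If the host fell behind, put_nowait would raise queue.Full -
# exactly the buffer overflow the post worries about.
for i in range(8):
    fifo.put_nowait([i] * CHUNK_SIZE)

# Consumer (RT host side): drain chunk by chunk and "display" each one.
moved = 0
while not fifo.empty():
    moved += len(fifo.get_nowait())

print(host_loop_rate)  # 1000.0
print(moved)           # 8000 samples transferred without loss
```

The point is only that a host loop at 1 kHz (or 100 Hz with 10,000-sample chunks) is far easier to sustain than one at 1 MHz; the real feasibility still depends on the DMA FIFO depth you configure.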

Hi, thank you all for your suggestions.
In my RT host I read 10,000 values every cycle, so the loop rate drops to 1 MHz / 10,000 = 100 Hz, I think.
What I want to do now is detect peaks in my data and build a histogram of those peaks in RT.
I've placed a Peak Detector VI that takes my array of 10,000 values as input and generates an array containing only the peaks, but I don't understand how to use the Histogram PtByPt VI. It works on a single data value at a time, so to build a meaningful histogram I would have to wait 10,000 cycles, and then it may not be as fast as 100 Hz.
Previously I tried to detect peaks and build the histogram directly on the FPGA to be as fast as possible given the 1 MHz acquisition, but now, with 10,000-value chunks, I wonder whether I can do this on the RT host side instead.
Any suggestion will be really appreciated.
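One observation: since a histogram is just an array of per-bin counts, it can be updated once per chunk rather than one point at a time, so you don't need to wait 10,000 cycles. A rough Python stand-in for the idea (the threshold, bin edges, and the 3-point peak test are made-up substitutes for whatever the Peak Detector and Histogram PtByPt VIs actually do):

```python
# Rough Python stand-in for per-chunk peak detection feeding a running
# histogram. The threshold, bin edges, and 3-point peak test are
# made-up values for illustration; because the histogram is just
# per-bin counts, it updates once per 10,000-sample chunk and so
# keeps up with the 100 Hz chunk rate.
N_BINS, LO, HI = 10, 0.0, 10.0
hist = [0] * N_BINS                  # accumulates across chunks

def find_peaks(chunk, threshold):
    """Local maxima above threshold (simple 3-point test)."""
    return [chunk[i] for i in range(1, len(chunk) - 1)
            if chunk[i] > threshold and chunk[i - 1] < chunk[i] > chunk[i + 1]]

def accumulate(hist, peaks):
    """Add each peak to its bin; values above HI clamp to the last bin."""
    width = (HI - LO) / N_BINS
    for p in peaks:
        hist[min(int((p - LO) / width), N_BINS - 1)] += 1

# One loop iteration: a (toy) chunk in, histogram updated in place.
chunk = [0, 5, 0, 0, 7.5, 0, 2, 0, 9, 0]
peaks = find_peaks(chunk, threshold=1.0)
accumulate(hist, peaks)
print(peaks)  # [5, 7.5, 2, 9]
print(hist)   # one count each in bins 2, 5, 7 and 9
```

In LabVIEW terms this would mean keeping the histogram array in a shift register and adding the whole peaks array to it each iteration, instead of feeding Histogram PtByPt one value at a time.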

Similar Messages

  • iPad 2 video data rate: what's the maximum data rate it can play back?

    I'm creating a video file I want to play at 1920 x 1080 on the iPad 2, and I'm not sure what the maximum data rate should be.

    Frankiedude wrote:
    I'm creating a video file I want to play at 1920 x 1080 on the iPad 2, and I'm not sure what the maximum data rate should be.
    Depends upon the format of the video. See:
    http://www.apple.com/ipad/specs/

  • Upload Maximum Data Rate Lower Than Upload Data Ra...

    Hello there,
    I have just brought up my BT Home Hub 5 stats, and my upload data rate is higher than my upload maximum data rate. Is this normal? My upload noise margin has dropped to 4.9 - 5.1 dB. I've had BT Infinity for just over 2 days and was wondering if this is normal for FTTC connections?
    Many thanks,
    William

    Normally the data rate will be lower than the maximum data rate, but yours is so close that I would not bother unless it starts to affect your connection, which is unlikely.

  • Maximum data rate

    Good Evening, 
    We recently had a VDSL connection installed to upgrade our speeds from 6 Mbps to around 15 Mbps on a good day.
    BT Wholesale says the IP profile is 14.12 Mbps, but the maximum data rate in the hub stats is much higher. Could someone help with this, please?
    And just to say its good to be back on the forums again. 
    Kind regards, 
    Dominic
    Here are the stats:
    5. VDSL uptime:
    3 days, 13:25:37
    6. Data rate:
    1296 / 14583
    7. Maximum data rate:
    1629 / 23612
    8. Noise margin:
    6.2 / 6.2
    9. Line attenuation:
    0.0 / 31.4

    dombullion wrote:
    How do my stats look? Could I be getting faster speeds?
     VDSL uptime:
    4 days, 07:07:57
    6. Data rate:
    1296 / 14583
    7. Maximum data rate:
    1691 / 23990
    8. Noise margin:
    6.4 / 6.3
    9. Line attenuation:
    0.0 / 31.4
    10. Signal attenuation:
    0.0 / 25.5
    No, it looks like you are getting the best speed; your noise margin is near 6 dB, which is the minimum the connection likes to be at. It appears that DLM is doing its job right.

  • Maximum file size that OSB can handle for JMS

    Hi,
    We have a requirement to process 60 MB XML files over JMS in Oracle Service Bus. While prototyping this, we are running into heap-space errors.
    Can somebody let me know the maximum file size we can process? The scenario is as below.
    JMS --> OSB --> JMS --> OSB --> JMS
    Thanks

    If you don't need to access the entire content of the message, you can try using content streaming
    (see this OSB - Iterating over large XML files with content streaming discussion).
    See also http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/userguide/context.html#wp1110513 for best practices.
    Otherwise, I have noticed myself that OSB is very memory-hungry when loading large messages. I had trouble even with a 5 MB binary file - loading it would take 500 MB of RAM. In such cases I would rather look at ETL tools such as Oracle Data Integrator or the open-source Pentaho.
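    OSB's content streaming is a proxy-service configuration setting, but the underlying memory argument holds for any pull-based XML parser; a small Python sketch of the principle using the standard library's iterparse (the <orders>/<order> element names are invented):

```python
# Streaming illustration with Python's stdlib parser: handle elements
# one at a time and free them as you go, so memory stays bounded no
# matter how large the document is. (This only demonstrates the
# principle behind content streaming, not OSB itself.)
import io
import xml.etree.ElementTree as ET

xml_data = "<orders>" + "".join(f"<order id='{i}'/>" for i in range(1000)) + "</orders>"

count = 0
for event, elem in ET.iterparse(io.StringIO(xml_data), events=("end",)):
    if elem.tag == "order":
        count += 1
        elem.clear()  # discard the parsed subtree instead of keeping the whole tree
print(count)  # 1000
```

    The trade-off is the same in OSB: with streaming you can route and iterate over the payload, but you give up random access to the full message tree.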

  • Data Rate vs Maximum Data Rate

    Can anyone tell me if my speed will increase via DLM if my maximum data rate continues to climb? I used to have 70 Mb downstream with no problem, but after lots of attempts to get BT to resolve an issue I left it (I thought I would just let my contract expire) when it dropped to 57 max. Now the maximum data rate has increased from 65000 to the value below, and as a result my speeds have increased to up to 62 down again. There is, however, a big discrepancy between the two figures; does this mean my speed will continue to increase if the maximum keeps climbing? Why the big difference between the two?
    Thanks in advance.

    Within 10 days if the cause has been fixed, I'd say

  • Maximum number of volumes "dbmsrv" can handle

    We plan to reorganize our production ERP system. Our SAN has 176 disks. To distribute the load evenly across all the physical disks, I want to create as many volumes as possible for the database.
    I know from the past that "dbmsrv" wasn't able to handle that many volumes: it crashed because the answer packet was too big. One could add volumes, but neither DBStudio nor DBMGui was able to show all the devices.
    Is this problem still present in 7.7.7.06?
    Markus

    Hello Markus,
    this issue had been fixed and is not present in 7.7.07.x :
    [http://maxdb.sap.com/webpts?wptsdetail=yes&ErrorType=0&ErrorID=1153616]
    regards,
    Lars

  • Maximum image file size Flash can handle?

    Does anyone know the maximum image file size Flash can load
    using MovieClipLoader?
    I tried a 10.5 MB PNG file; Flash thinks it is loaded, but
    the actual image never shows up.
    Thanks,

    Well, it is good for that client, but it still won't work in
    Flash, sorry. You will need to slice the picture into
    chunks of at most 2880 x 2880 and reassemble the tiles in Flash.
    You may also run into an issue where things sometimes get lost
    if there is more than about 16,000 pixels difference in
    their coordinates.
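    The tiling arithmetic itself is straightforward; a hypothetical Python helper (the function name and example dimensions are mine) that computes the tile rectangles for the 2880 x 2880 limit:

```python
# Hypothetical helper (the function name and example dimensions are
# made up): compute the tile rectangles needed to slice an image into
# pieces no larger than Flash's 2880 x 2880 bitmap limit.
MAX_SIDE = 2880

def tile_rects(width, height, max_side=MAX_SIDE):
    """Yield (x, y, w, h) rectangles covering a width x height image."""
    for y in range(0, height, max_side):
        for x in range(0, width, max_side):
            yield (x, y, min(max_side, width - x), min(max_side, height - y))

# A 6000 x 3000 source image needs a 3 x 2 grid of tiles:
tiles = list(tile_rects(6000, 3000))
print(len(tiles))   # 6
print(tiles[-1])    # (5760, 2880, 240, 120) - the bottom-right remainder
```

    Each rectangle can then be cropped out in an image editor and re-placed at its (x, y) offset in Flash.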

  • Is there a recommended maximum number of jobs QMaster can handle?

    In our setup we need to render some 4,000 - 8,000 short video segments (only a few seconds each). Right now we are submitting them all one after the other via the command line to a cluster (QAdministrator has been set to allow 10,000 batches in the queue).
    Has anyone experienced problems with queues this large? When rendering, QMaster randomly dies - mostly at around 1,000 batches. The batches remain in the spool, but I need to reinstall the QMaster Service Node package to get everything off and running again.
    I don't like the idea of having to babysit the queue, and submitting each batch only after the previous one finishes negates the whole idea of using clusters.
    I've thought about adding more equipment to clear out the queue faster, but I don't want to recommend this route if it doesn't work, as that would be a huge expense wasted.
    It's almost as if QMaster has a memory leak, but that's just a guess.
    Any thoughts or wisdom would be welcome.

    I would imagine most people submitting such a large number of jobs are using something more robust, like Qube:
    http://www.pipelinefx.com/products/qube-film.php
    Qmaster is very buggy and would need a complete rewrite before it could be qualified for use in such an environment.

  • Transfer data rate

    OK, so I have resolved an older problem, only to get dropped frames all the time during playback. How do I find out the maximum transfer data rate my system can handle? Any suggestions on how to get around an issue like that? If I want clean, crisp graphics, how can I avoid the DV codec - which I understand is awful for graphics - yet still be able to play my sequence back without dropped frames?

    Hi Kristin,
    If I read correctly, you moved everything off your internal drive to an external. This external drive - is it a SATA drive, or FireWire? You mentioned SATA before, so I'm inclined to think that's what it is. If it IS an external Serial ATA drive, when you got it, did you zero all data on the drive before using it?
    Also, when you started using this new external, did you change your capture/render drive settings in your prefs in FCP? If not, they may still be set to your system drive, the default, instead of your new external hard drive. If this is the case, change it in FCP, save, and quit. TRASH the render files and move all the capture files from the system disk. re-open FCP, reconnect the moved capture files (if any), and of course, you'll have to re-render most of your timeline. when you do this, it'll render everything to the new hard drive (SATA, yes?), and you SHOULD be good to go.
    Now, as for interfacing with a monitor or an external device - are you going through FireWire to a camera or deck and then to your video monitor? If so, no matter what, you won't be able to view anything but a single frame at a time with uncompressed video. The only other thing I can think of is that your camera/deck/external video device is powered on while you're trying to do this. To monitor uncompressed video externally, you need a pretty spiffy video capture card like the Kona; internally, you should have no issues. So make sure any external device except a hard drive is turned OFF and disconnected.
    If none of this works, what you may want to do is change your sequence settings back to NTSC/DV for compression, and when you're totally done editing your project, you change them BACK to uncompressed, render, and export an uncompressed file to an external hard drive, take that drive to a video production company, have them slap it on a DigitalBeta tape as a master, and you're good to go.
    Hope this helps...
    -Kris

  • Query to find the second maximum date in a table

    Please give me the query to find the second maximum date in a table.

    You can try this:
    SELECT empno
          ,hiredate
      FROM emp a
     WHERE 2 = (SELECT COUNT(DISTINCT hiredate)
                  FROM emp b
                 WHERE b.hiredate >= a.hiredate);
    or this (ROWNUM is assigned before ORDER BY, so the ordering must go in an inner view, descending so row 2 is the second latest date):
    SELECT empno
          ,hiredate
      FROM (SELECT ROWNUM row_num
                  ,empno
                  ,hiredate
              FROM (SELECT empno
                          ,hiredate
                      FROM emp
                  ORDER BY hiredate DESC))
     WHERE row_num = 2;
    Regards
    Arun

  • Finding maximum Date in Cube

    Hi,
    How can I find the maximum date among all the dates of a date characteristic in the cube data?
    I want to restrict a selection in my query to this maximum date. So how can I fetch that date into a variable?

    Hi,
    You can call the function module RSDRI_INFOPROV_READ in your exit for the variable, read the dates, and find the maximum. You have to pass the name of the InfoCube you want to read. However, writing this code is a slightly tedious job, so consider whether there is another way to meet your requirement.
    If you want to use the function module, tell me and I'll send you a piece of code that will give you an idea.
    Thanks
    Mansi

  • Information on how much SAP Business One can handle in throughput

    Hi All,
    Is there any information on how much data SAP Business One can handle? I know that in theory only HDD space really sets the limit, but thinking about performance: what setups do you have out there in terms of users, transactions per day, number of items, BPs, etc.?
    Example:
    We have a potential customer who currently has 50,000 monthly subscriptions (10 invoice lines per subscription) with an expected increase of 20,000 subscriptions a year.
    So the first year they will have 50,000 * 12 * 10 = 6 million invoice lines (plus 2.4 million additional lines for each year of growth), and after 3 years around 110,000 subscriptions.
    Can SAP Business One handle this amount of data with any performance at all (the server is not yet known)? In other words, when does SAP Business One become too small, so that we need to move to a bigger system?
    Are there any task forces at SAP we can contact about this?

    Hi Rasmus,
    You may check:
    Discovered performance issues? Refer to our Performance Landing Page
    Root Cause Analysis? What Root Cause Analysis?
    Hope you can access the link given in the thread.
    Thanks,
    Gordon

  • Data rate stuck at 35000

    I'm guessing it's either the DLM, a fault, or a combination of both. In the lead-up to this I was getting a lot of seemingly DNS-related errors last week - so I'm not sure if this has influenced the DLM - certain web pages would pause substantially before loading, etc.
    BT Infinity 2 - installed in April.
    The HH5 never seems to last more than about a day without rebooting/renegotiating, but the speed has always stayed at 44-46 Mb and the two noise margins are always between 6.0 and 6.2.
    Now, though, my negotiated speed has dropped to a strangely "round" 35000 and my noise margin is way up - presumably showing there is still margin for it to be negotiated faster - so why 35000?
    Any help appreciated - thanks.
    1. Product name: BT Home Hub
    2. Serial number: +068343+NQ41060163
    3. Firmware version: Software version 4.7.5.1.83.8.173.1.6 (Type A) Last updated 10/05/14
    4. Board version: BT Hub 5A
    5. VDSL uptime: 0 days, 00:16:13
    6. Data rate: 12627 / 35000
    7. Maximum data rate: 12638 / 49739
    8. Noise margin: 6.0 / 8.4
    9. Line attenuation: 0.0 / 20.8
    10. Signal attenuation: 0.0 / 20.2
    <edit> I just found this when searching for faults on my line - so perhaps the work below caused the drop?
    Reported Thu 04/12/2014 at 10:09 PM
    Issue Ended Fri 05/12/2014 at 12:33 PM
    We've fixed the recent problem in your area. To get your broadband running smoothly again, restart your BT Home Hub. Just turn it off at the mains, wait a few seconds, then turn it back on again. Give it a couple of minutes to finish starting up.

    Are BT still using DLM now, then? See:
    https://community.bt.com/t5/BT-Infinity-Speed-Connection/Court-case-forces-Openreach-to-turn-off-Dyn...
    http://www.thinkbroadband.com/news/6726-assia-court-case-forces-openreach-to-turn-off-dlm.html
    Best regards,
    dfenceman

  • Max size of spreadsheet Numbers can handle?

    I have a 46.9 MB Excel spreadsheet. I have Excel 2004 for the Mac, but this spreadsheet takes about 4 to 5 times longer to recalc on my Mac mini at home than on my IBM ThinkPad at work.
    I thought I would try Numbers for 30 days to see if it would recalc faster, but Numbers says the spreadsheet is too large for it to open. I have several other spreadsheets that are even larger.
    If these are too large, then there's no point in buying the program. I had hoped it would handle them, because NeoOffice is extremely slow to open the spreadsheet (it keeps wanting to adjust the row height before it will open, which takes forever).

    The question was in the subject line: is there a maximum size spreadsheet that Numbers can handle? I would hope it's bigger than 46.9 MB (the spreadsheet I'm working with). I tried NeoOffice, but it takes forever to open the spreadsheet because it's always trying to adjust the row height - I've waited several minutes for it to open before finally giving up.
    So I thought I would try Numbers instead, but it can't open a 46.9 MB spreadsheet (I get an error that the worksheet is too large), even with 2 GB of RAM and nothing else open at the time. Usually the limiting factor is RAM, not the program, and with 2 GB of RAM a 46.9 MB spreadsheet should not be a problem. May I ask those more experienced with the program: what is the largest spreadsheet you have worked with in Numbers? If Numbers can't open a 46.9 MB spreadsheet, it would be limited to only small jobs.
