DAQmx buffer + conversion problem

Hello,
Here is my problem:
I am acquiring accelerometer signals with a 9234 module in a CompactDAQ 9174 chassis. I want to acquire samples continuously. I have signal processing that produces a spectrum in dB, where I average 12 spectra for display.
In continuous mode I can only run the check once; after that it reports a buffer problem. So I would like to know how to reset the buffer before each check, or get some help on this subject!
In addition, I would like to generate the LabVIEW code instead of using the Express VI, so I can access the sensitivity and the acquisition parameters. But there I have a problem too: as you can see in the attached image, the DAQ Assistant converts correctly, but when I run the check nothing happens. I think there is a problem with the loop!
So, in short, I am attaching an image and the VIs. The main goal is to make my check of the 12 averaged spectra faster by using continuous acquisition and emptying the buffer, rather than acquiring N samples, which works very well but takes far too long!
Thanks, and don't hesitate to contact me for more information.
Arnaud
Attachments:
Convertion DAQmx.jpg ‏233 KB
Acquisition signal Ambiance.zip ‏233 KB

I just fixed one problem: when I generated the code and ran the check, I never exited the loop, so I wired a True constant and now it works!
However, I still have the same speed problem: with the Express VI, my processing of the 12 spectra takes 400 ms, but as soon as I generate the code it blows up and takes 2 s to do the same work!
Someone will have to explain why, because I'm lost here!
Thanks

Similar Messages

  • DAQmx buffer property node not available

    The following question was posted to the 'Multifunction DAQ' forum. Since I received no responses, I'm trying it here:
    I transferred an intact VI from a desktop PC to a laptop.
    On the laptop, however, the VI does not compile since the
    property node "DAQmx Buffer" is not available. I suspect that I maybe
    did not install the full DAQmx suite on the laptop but I don't know
    which part might be missing.
    (On both machines there is no DAQ hardware installed since I was just trying to do some editing.)
    Is my suspicion correct? And how do I find out what's missing?
    If my suspicion is not correct: what else could it be?
    TIA
    Franz

    If I may guess: you should check which versions of DAQmx are installed on both machines!
    Ton

  • DAQmx buffer size/write size

    Guys, please forgive my posting a link here for a question I am asking on the DAQmx board. Like myself, many people with relevant knowledge may not frequent that board, so if you may be able to help, please take a look.
    http://forums.ni.com/t5/Multifunction-DAQ/DAQmx-buffer-size-write-size/td-p/2717349

    Oops. Just realized my very silly mistake: I forgot to add the Start Task VI. I did so and it works as designed.

  • How to ignore DAQmx buffer errors?

    Hi,
    In my application, I have to start the data acquisition of analog inputs (PCI-6025E) so that they can be read whenever the user wishes through a user interface VI. In addition to this, I also have a background thread that reads two of these AI ports every 250 ms. However, by the time the buffer is read it has probably been overwritten at least once, and then it throws an error window saying the buffer was overwritten. I think the buffer is a circular buffer, and if so, the buffer being overwritten doesn't affect my application. So, is there a way I can stop this error window from popping up?
    Thanks,
    Sharmila

    There might be a couple of ways out of this. If you create a functional global (also known as a LabVIEW 2 style global), you can have it written to by your DAQ portion and read from elsewhere. You can make this LV2 global act as a circular buffer, allowing you to read from it whenever you need to, while allowing the DAQ to write to it whenever it needs to. There has been a lot of discussion on the construction of LV2 globals, so you should be able to find the information.
    Additionally, in recent versions of LabVIEW there is an option (under the Tools > Options > Block Diagram menu) that enables automatic error handling. This causes an error dialog box to pop up in any VI that has error handling you haven't "handled" by wiring the error out to something else. Unchecking this may prevent the popup; I don't know whether the VI generating it will then just continue or whether it, or some earlier VI, will need to be reset. This is a useful feature, particularly in development and debugging, although I prefer to handle errors intentionally when designing my code, as it forces you to think about the various possible input cases that might fall outside of what you really wanted to happen.
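The LV2-global circular buffer described above is a LabVIEW construct, but the underlying idea can be sketched in Python (an illustration of the concept only, with hypothetical names; not the poster's VI):

```python
from collections import deque

class CircularBuffer:
    """Fixed-size buffer: the writer (DAQ side) appends freely, and
    the oldest samples are silently overwritten once capacity is
    reached, so the reader never hits an 'overwritten' error."""
    def __init__(self, capacity):
        self._buf = deque(maxlen=capacity)

    def write(self, samples):
        self._buf.extend(samples)   # overwrites oldest entries when full

    def read_all(self):
        data = list(self._buf)
        self._buf.clear()
        return data

buf = CircularBuffer(capacity=4)
buf.write([1, 2, 3])
buf.write([4, 5, 6])       # 1 and 2 fall off the front
print(buf.read_all())      # -> [3, 4, 5, 6]
```

The reader drains whatever is present when it gets around to it, which is exactly the behavior wanted in the question: overwrites are expected, not errors.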
    Putnam Monroe
    Certified LabVIEW Developer
    Senior Test Engineer
    Currently using LV 6.1-LabVIEW 2012, RT8.5
    LabVIEW Champion

  • USB-9162 vs cDAQ-9174

    Hello,
    I have already created two other posts about my problem, but this time I'm going after the product itself!
    http://forums.ni.com/t5/Discussions-de-produit-de-NI/Probl%C3%A8me-DAQmx-buffer-convertion/td-p/1187...
    http://forums.ni.com/t5/Discussions-de-produit-de-NI/cDAQ-rapidit%C3%A9/m-p/1184553#M35939
    My application takes 250 ms to do its signal processing when running with the NI USB-9162, and 750 ms when running with the cDAQ-9174, with exactly the same acquisition, processing, and display parameters!
    Do you see the problem?
    Hence my question: is data transfer over the Hi-Speed USB 9162 faster than over the cDAQ 9174?
    If so, how come? And is there a way to overclock the cDAQ?
    I am highlighting this problem because, if it turns out to be real, the usefulness of my hardware would be called into question, and it would lead me to reconsider my opinion of NI!
    Thanks
    Arnaud

    Actually, this concerns only the acquisition time, because I tried without the signal processing and the problem is still the same.
    Actually, I started from an Express VI! Then I generated the code to understand how it worked and to get access to parameters such as sensitivity!
    Generating the code creates a sub-VI (see the first link in my other post) containing all the DAQmx blocks (create, ..., clear). Through various tests, I have noticed that it is this VI that is slower, or at least what happens inside it is slower, depending on whether it runs on the USB-9162 or the cDAQ-9174!
    Hence the question about the speed of data exchange between the two USB devices!

  • DAQmx Sample Clock (on-board clock) effects

    Why would I get slightly different readings when using the Sample Clock (onboard clock) versus not using it? On several different cards, I've noticed slightly lower analog input levels when the Sample Clock is used.
    thanks
    Richard

    Thanks Doug. In LabVIEW, on a SCXI-1102 and a 1502. (DAQ is a PCI-6221).
    I understand the benefit of the DAQmx buffer that gets set up when you use the Sample Clock, but if I'm just taking "1Chan 1Samp" samples, I wouldn't expect a difference. The sample taken with the Sample Clock included in the task is a few tenths of a volt less than the reading taken from a task not including the Sample Clock (on a ~1 V signal).
    Message Edited by Broken Arrow on 01-14-2009 04:21 PM
    Richard

  • Compare daqmx event with dynamic data

    Good evening everyone!
    Our project is now facing a problem with LabVIEW.
    For our project, we need to use a DAQ device to acquire current for our SOC. We want to use a virtual channel so people can choose channels directly from the front panel. But when we use DAQmx Create Virtual Channel, the task output is a DAQmx type that cannot be used for further calculations (such as multiplying to get our SOC) and cannot be compared with a constant.
    Could anybody help me with this question? How can I extract the value from the DAQmx task output, or convert it to dynamic data directly?

    So you changed your DAQ Assistant to attempt to use the DAQmx API.  Normally, I would say "Good for you".  But if the code was working, why change it?
    Mimic what I did with the Analog Output.  See how I configured before the loop, wrote inside the loop, and closed after the loop?  You should do the same thing with the Analog Input, except you do a DAQmx Read instead of a write.

  • CAN CONVERTER

    Hello,
    Following the problem I ran into ("CAN Converter error: invalid number of data bytes") when converting my "Vector ASCII" (.asc) files with my bus database configuration (.dbc), topic dated 02-07-2011, your reply by private message was: "R&D has discovered the problem-- some of the hex codes in your log file are only 1 character (instead of the expected 2 characters).  They think they can expand the CAN Converter to read 1 character hex codes as well".
    DIAdem R&D is adapting the CAN Converter DLL to work with your submitted CAN log and database files. We hope to be able to send you an updated DLL soon that you can use in DIAdem 2010 SP1.
    Have you produced the updated DLL?
    Regards

    Hi POPPOTE,
    My apologies for the long delay.  R&D has indeed addressed this issue, and it is scheduled to release with DIAdem 2012 SP1.  I'm not sure when the SP1 will release though, so I'd like to send you the DLL behind the Bus Log Converter so you can try out the solution on your existing DIAdem 2012.  Do you in fact have access to DIAdem 2012, or are you still using an earlier version, in which case which version?  It would be most convenient for R&D to provide only a solution for DIAdem 2012 and later versions.
    You can email me directly at [email protected]
    Regards,
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments

  • DAQmx 7.4 Input.OnbrdBufSize reports incorrect values

    I'm using a PXI-1042 chassis with six PXI-6115 DAQ boards. I'm using the DAQmx buffer property node to read the size of the onboard buffer for each card, but I'm getting incorrect results. I know what the value should be (16.7 million samples). However, I get a value of 2.14 billion samples for two or three of the cards and a value of 16.7 million for the rest. If I reboot the system, the values stay the same but are returned from different cards. For example: if I read the onboard buffer size from each card, the values might be 2.14 Gs, 2.14 Gs, 16.7 Ms, 16.7 Ms, 2.14 Gs, 16.7 Ms for cards 1 through 6 respectively. I then reboot the system and get: 16.7 Ms, 2.14 Gs, 16.7 Ms, 16.7 Ms, 2.14 Gs, and 16.7 Ms.
    Does anyone have any suggestions as to why I'm getting the wrong values?
    Any suggestions would be greatly appreciated.
    Thank you,
    Jeremy

    Alex,
    Thanks for responding. I am using an embedded controller, the PXI-8187. To get the value of 16.7 Ms, I'm using the DAQmx buffer property node and asking for the onboard buffer size of each PXI-6115 module. I've tried getting the property two ways. First, I created a DAQmx channel constant and programmatically created a task, then queried the onboard buffer size. Second, I created six tasks in MAX, each corresponding to a different 6115 module, and called all six in parallel. Both methods yielded the same result. In response to your second question: I cannot duplicate the problem with just one board. For example, I uninstalled all six cards and removed their registry entries. I then installed one at a time, rebooting the system each time to see the results. With just one board installed, I didn't seem to have a problem. As soon as I installed the second board, I began getting the weird value of 2 Gs. I proceeded to install one board at a time. It seems that every time I reboot the computer I get different values.
    I talked with NI. They can't duplicate the error. They suggested that I may need to reinstall my drivers.
    Jeremy

  • Onboard buffer size

    Hi, can anyone tell me the difference between the "buffer size" and "onboard buffer size" under the DAQmx buffer>>output property?
    Thanks,
    David
    www.controlsoftwaresolutions.com

    OnbrdBufSize is the fast onboard memory every DAQ card has to store the acquired data; it's like RAM in a PC. For example, the specifications show that the NI PCI-6120 has 128 MB of this memory, which corresponds to some number of samples the buffer can store (that is the number you will read). This also explains why you can read but not set this property.
    Output.BufSize: before you start the data acquisition, a software buffer is defined based on the task you perform. I think it just allocates part of the fast memory of the DAQ card. The reason the buffer is set (either by the user or automatically) is to have a defined memory space for each task. Imagine you have several tasks using the card, and each of them needs some of the card's memory. If I am wrong, somebody correct me.
    If your acquisition is finite (sample mode on the Timing function/VI set to Finite Samples), NI-DAQmx allocates a buffer equal in size to the value of the samples per channel attribute/property. For example, if you specify samples per channel of 1,000 samples and your application uses two channels, the buffer size would be 2,000 samples. Thus, the buffer is exactly big enough to hold all the samples you want to acquire.
    If the acquisition is continuous (sample mode on the Timing function/VI set to Continuous Samples), NI-DAQmx allocates a buffer equal in size to the value of the samples per channel attribute/property, unless that value is less than the value listed in the following table. If the value of the samples per channel attribute/property is less than the value in the table, NI-DAQmx uses the value in the table.
    Sample Rate            Buffer Size
    No rate specified      10 kS
    0-100 S/s              1 kS
    100-10,000 S/s         10 kS
    10,000-1,000,000 S/s   100 kS
    >1,000,000 S/s         1 MS
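The rule quoted above can be paraphrased as a small helper (a sketch of the documented behaviour as stated in the table, not NI code; the function name is made up):

```python
def default_buffer_size(samples_per_channel, sample_rate=None, finite=True):
    """Approximate NI-DAQmx's default input buffer sizing, following
    the rules quoted above (result is in samples per channel)."""
    if finite:
        # Finite acquisition: buffer exactly fits the requested samples.
        return samples_per_channel
    # Continuous acquisition: at least the documented minimum for the rate.
    if sample_rate is None:
        minimum = 10_000
    elif sample_rate <= 100:
        minimum = 1_000
    elif sample_rate <= 10_000:
        minimum = 10_000
    elif sample_rate <= 1_000_000:
        minimum = 100_000
    else:
        minimum = 1_000_000
    return max(samples_per_channel, minimum)

print(default_buffer_size(1_000, finite=True))           # -> 1000
print(default_buffer_size(1_000, 25_000, finite=False))  # -> 100000
```

So for a continuous task, asking for fewer samples per channel than the tabulated minimum still gets you the minimum, which is why a small "samples to read" setting does not shrink the buffer.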
    LV 2011, Win7

  • Error -50405

    Hello,
    I have set up an acquisition and measurement rig with the following hardware:
    LabVIEW 2010, v10.0
    1x NI cDAQ-9188 with modules: 9263, 9481, 9435, 9207 and 9401
    1x NI cDAQ-9188 with modules: 9217 and 9213
    MAX v4.7.1f8
    My program is built as follows: a timed loop runs every 2 seconds; inside this loop, a sequence structure performs various actions: display, channel reads, calculations (energy balances), control, saving. In parallel with this loop, another loop, timed at 30 s, launches a TRNSYS simulation.
    My program works very well until, all of a sudden, the acquisition can no longer keep up. I have the impression that the PC can no longer communicate with the cDAQ. This error is random and appears once or twice a day. After many tests, I finally ran NI Spy. I am attaching the result; you will see that everything is OK until 17:43.
    Does anyone have an idea what I can do to avoid this error?
    For information, my system is disconnected from the company network, so it is not something external.
    Many thanks in advance for your help.
    Cath

    Hello Cath,
    I understand the problems you are having with your cDAQ system. First of all, I'll apologize if my French is not very good, as I am replying from the UK branch, but I will do my best!
    I have dug through our resources here and found some information on the error you are encountering.
    Error -50405 is a timeout error that can occur on both USB- and Ethernet-based systems. There are two registry keys on the host computer that can be modified to adjust the length of time before error -50405 appears. By allowing more time, the system will be more tolerant of variations on the network.
    The following steps should resolve the problem:
    1. Open the Windows Registry and navigate to:
    "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\niemrk\Parameters\"
    (I am not 100% sure what the names will be on a French copy of Windows.)
    If the Parameters key does not exist, right-click niemrk > New > Key and rename the new key "Parameters".
    2. If the Parameters key already contains DWORDs for 'ConnectionTimeout' and 'DiscoveryTimeout', skip this step. Right-click Parameters > New > DWORD Value. Name the new DWORD "ConnectionTimeout". Repeat this step, adding a second DWORD named "DiscoveryTimeout".
    3. Once both DWORDs have been created, double-click each one, select the Decimal radio button, enter a number in the Value data field, and click OK. The number you enter is the number of milliseconds before the timeout error appears. The default value is 2500 ms. A guideline for these values:
    Timeout = 12 MB / (number of channels * scan rate * bytes per scan)
    So, for a task with 4 analog input channels at 25 kS/s on a 16-bit module:
    12 MB / (4 * 25 kS/s * 2 B) = 60 seconds.
    4. It is probably a good idea to increase the DAQmx input buffer size to accommodate the increased timeout length. Before the DAQmx task is started, place a DAQmx Configure Input Buffer VI. The buffer size to wire in (in samples per channel) can be determined from the timeout length you configured:
    Buffer Size = sample rate * number of seconds before timeout
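As a sanity check, the two guideline formulas above can be evaluated together (illustrative helper functions, assuming the 12 MB constant from step 3; the names are made up):

```python
def timeout_seconds(n_channels, scan_rate_hz, bytes_per_sample,
                    pool_bytes=12_000_000):
    """Guideline timeout from step 3: 12 MB divided by the data rate."""
    return pool_bytes / (n_channels * scan_rate_hz * bytes_per_sample)

def input_buffer_size(scan_rate_hz, timeout_s):
    """Step 4: buffer size (samples per channel) covering the timeout."""
    return int(scan_rate_hz * timeout_s)

# 4 AI channels, 25 kS/s, 16-bit (2-byte) module, as in the worked example:
t = timeout_seconds(4, 25_000, 2)
print(t)                              # -> 60.0 seconds
print(input_buffer_size(25_000, t))   # -> 1500000 samples per channel
```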
    Hopefully that will stop the error you are encountering. If there is anything else I can help you with, please don't hesitate to post. I hope it all makes sense; sorry if my French isn't very good!
    Regards,
    Oli
    LabVIEW Student Ambassador
    National Instruments UK

  • How to Optimize SCXI 1600 for speed with Thermocouples

    I'm working on a data acquisition system for my engineering firm and I'm trying to find a way to use our new thermocouple system as fast as possible.
    The requirements for the DAQ process are:
    Read 32 voltage channels from a PCI-6071E card
    Read 32 thermocouple channels from a SCXI-1600 with an 1102C accessory
    Complete the entire operation in under 5ms (this is so other parts of the program can respond to the incoming data quickly and trigger safety protocols if necessary)
    Using LabVIEW 7.1 and MAX 4.4, I've got the voltage channels working to my satisfaction (with traditional DAQ VIs) and the rep rates I measure when I run the program are around 1ms (I do this by putting the DAQ code in a loop and reading the millisecond timer every time through that loop, then calculating the average time between loop executions).  I have been trying to get similar performance from the thermocouple channels using DAQ Assistant and DAQmx.  Some of the problems I've encountered are:
    Very slow rep rates with 1-sample and N-sample acquisition modes (300-500ms)
    Good rep rates when I switch to continuous mode, but then I get buffer overflow error -200279.
    When I attempted to correct that error by setting the DAQmx buffer to overwrite unread data and only read the most recent sample, the calculated sample rate went to 20ms.  It was around 8ms when I left the error unhandled and continued acquisition.
    At this point I'm out of ideas and am just looking for something to try and optimize the DAQ process for speed, as much as is possible.
    Thank you for any help.

    I would be interested in checking out your code to see if there is anything I can recommend changing. However, I do have a few general ideas for improving your performance. These recommendations are purely based on what could be slowing down your program, because I am not sure exactly how you have everything set up.
    - Are you setting up the task and closing the task each time you read from your DAQ card? The way around this is to have only the DAQmx Read VI inside the while loop, so no time is spent opening and closing the task on each iteration.
    - Try using a producer/consumer architecture. This architecture uses queues and separates the acquisition from the post-processing. Here is a link on how to set up this architecture and some information on when to use it.
    Application Design Patterns: Producer/Consumer
    http://zone.ni.com/devzone/cda/tut/p/id/3023 
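The producer/consumer idea suggested above can be sketched outside LabVIEW as well; here is a minimal Python analogue with the acquisition simulated (all names are hypothetical, not the poster's code):

```python
import queue
import threading

def producer(q, n_chunks):
    # Simulates the DAQ loop: push chunks of samples as they arrive.
    for i in range(n_chunks):
        q.put([i] * 4)          # stand-in for a block of acquired samples
    q.put(None)                 # sentinel: acquisition finished

def consumer(q, results):
    # Post-processing loop: drains the queue at its own pace.
    while True:
        chunk = q.get()
        if chunk is None:
            break
        results.append(sum(chunk))  # stand-in for the real processing

q = queue.Queue()
results = []
prod = threading.Thread(target=producer, args=(q, 5))
cons = threading.Thread(target=consumer, args=(q, results))
prod.start(); cons.start()
prod.join(); cons.join()
print(results)                  # -> [0, 4, 8, 12, 16]
```

The point of the pattern is that the acquisition loop never waits on the processing: the queue absorbs bursts, so slow post-processing cannot cause a buffer overflow on the acquisition side.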
    Message Edited by Jordan F on 02-06-2009 04:35 PM
    Regards,
    Jordan F
    National Instruments

  • Configure Logging VI start and stop without losing or dropping data

    Hi there,
    I am currently using an M-series PCI-6280 and a PCI-6601 counter card to take measurements in LabVIEW 2010.
    I have 3 tasks set up, 2 linear encoder counter tasks on the 6601 and 1 analog input task with 4 x inputs on my 6280.  These are all timed with a counter on the 6601 through an RTSI line.
    On all of these tasks, I have a similar set-up as the picture in http://zone.ni.com/devzone/cda/tut/p/id/9574 except they are all set up as "Log" only and "create/replace" file for the configure logging.vi.  
    Since I want to have the encoders continuously locate my position, I never "stop" the counter tasks.  However, I need to use the data acquired after every move along the path to obtain and process the location and the subsequent analog input data points I have.  What I have done, is I would "stop.vi" my sample clock on the counter, then open the tdms file, read the tdms file, get the array from there, close the tdms file, then "run.vi" the sample clock on the counter.  However, I found when I stop the sample clock, not ALL the data points have been streamed into the file.  
    For example, say I move my carriage 100 mm to the right while logging, then stop the sample clock, obtain the array from the TDMS file, and find the size of this array, called "X", for future use. Then I move the carriage back to the origin WITHOUT starting the clock again. THEN I start the sample clock again and move 100 mm to the left. After this is done, I again stop the sample clock and obtain the data from my TDMS file, starting the read offset at "X+1". I EXPECT that this array SHOULD start at 0 and then move on to 100 mm to the left. However, what happens is that the data starts at 100 mm from the right for a few data points, then JUMPS to 0, then moves to the left as expected.
    I propose that this means that when I stop the sample clock, the data from the counters (buffer?) has not all been streamed to the TDMS file yet, and when I start the clock again, the remaining data gets dumped in. Here I am confused, since I thought that "configure logging.vi" streams directly into the file without a buffer.
    I was wondering if I am doing anything wrong, expecting the wrong things, and if there is a way to implement what I want. That is, is there a way to flush the remaining data in the "buffer" (I tried the Flush TDMS VI, but that didn't work and I didn't think it was relevant) after I stop the clock, or is my implementation completely off?
    Thanks for reading,
    Lester
    PS. Having a way to read the most recent point from the counters would be awesome too. However, every time I set the offset/relative-to to most current, it says that I can't do that while logging, even though I saw a community post saying that it works with analog inputs. Maybe this is off topic, but currently I'm just trying to get the arrays from TDMS working. Thanks!

    Hello, Lester!
    There are a few solutions for you here.
    First, since you don't need the excess data points, you can set the "DAQmx Read" Property Node to "Overwrite" any data from previous acquisitions. 
    To do this, first select a "DAQmx Read Property Node" from the functions palette. The first property should default to "RelativeTo." Click on this, and instead select "OverWrite Mode" (see attached). Next, right-click on the input, create constant, and select "Overwrite Unread Samples" from the drop-down enum. After wiring the other inputs and outputs on the node, This should accomplish what you're looking for. If for some reason it doesn't, consider the following options.
    Your second option is to limit the buffer size. If you know how long you want to acquire data for, you can use the sampling rate (it looks like 5000Hz, in your case) to determine the number of samples to acquire. Simply use the "DAQmx Buffer Property Node" and input the number of samples desired. When the required number of samples is acquired, the buffer will begin to overwrite the old samples. This, of course, is not a very flexible method, and obviously requires a pre-set number of data points.
    Your final option is to do a brute-force open and close of your task with each acquisition. Simply use the "Clear Task" and "Create Task" functions with every new set of data desired. This should clear the buffer, but it will be fairly slow (especially if you eventually change your program to take many sets of data very quickly).
    As mentioned, the "OverWrite" property in the "DAQmx Read" node should take care of this for you, but feel free to try the other options if they better suit your needs.
    Let us know if you have any further questions!
    Best,
    Will H | Applications Engineer | National Instruments

  • Show image problem

    Hello guys,
    I need your help urgently.
    What I am trying to do is display a student record (i.e. number, name, grades, etc.) from a table, and I also need to display the student's photo as inline info with the record, in a separate region.
    What I did:
    1. I created a procedure as follows:
    PROCEDURE retreive_img_data(p_id varchar2) AS
    Lob_field BLOB;
    buffer RAW(32767);
    offset BINARY_INTEGER:=1;
    buffer_size NUMBER:=32767;
    BEGIN
    -- retreive the LOB locator into the PL/SQL locator variable lob_field
    SELECT photo
    INTO lob_field
    FROM students
    WHERE std_number=p_id;
    -- The mime header is sent first before sending the image
    --content
    OWA_UTIL.MIME_HEADER('image/jpg');
    -- read the LOB content in a loop and send it across to the browser
    LOOP
    DBMS_LOB.READ(lob_field,buffer_size,offset,buffer);
    -- convert the raw content read into varchar2 and send it to the browser
    htp.p(UTL_RAW.CAST_TO_VARCHAR2(buffer));
    offset:=offset+buffer_size;
    END LOOP;
    -- Catch the no_data_found exception.This is raised by the dbms_lob.read
    -- procedure
    -- when the end of the LOB is reached and there are no more bytes to read
    EXCEPTION
    WHEN NO_DATA_FOUND
    THEN null;
    END retreive_img_data;
    2. created a HTML region and its source is :
    <IMG SRC="show_image?p_id=&P3_STD_NUMBER." width="70" height="80" alt="No Picture">
    where P3_STD_NUMBER is a page field contains the student number.
    But when I run the page, no image is there. I have checked that there is a photo in the database for the student I used to test the page.
    Any help on where the error is, or on how to do it the right way, is appreciated.
    thanks

    Sergio,
    Forgive me; I have migrated my application to 10g and I am facing the same problem again: my images are in the database but they don't show up. I am sure the NLS_LANG for the DAD is correct. Anyway, this is my DB NLS:
    1* select * from nls_database_parameters
    SQL> /
    PARAMETER VALUE
    NLS_LANGUAGE AMERICAN
    NLS_TERRITORY AMERICA
    NLS_CURRENCY $
    NLS_ISO_CURRENCY AMERICA
    NLS_NUMERIC_CHARACTERS .,
    NLS_CHARACTERSET AR8MSWIN1256
    NLS_CALENDAR GREGORIAN
    NLS_DATE_FORMAT DD-MON-RR
    NLS_DATE_LANGUAGE AMERICAN
    NLS_SORT BINARY
    NLS_TIME_FORMAT HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY $
    NLS_COMP BINARY
    NLS_LENGTH_SEMANTICS BYTE
    NLS_NCHAR_CONV_EXCP FALSE
    NLS_NCHAR_CHARACTERSET AL16UTF16
    NLS_RDBMS_VERSION 10.1.0.2.0
    and this is my DAD
    Alias /i/ "E:\OracleHomeBI\marvel\images/"
    <Location /pls/htmldb>
    SetHandler pls_handler
    Order deny,allow
    Allow from all
    AllowOverride None
    PlsqlDatabaseUsername HTMLDB_PUBLIC_USER
    PlsqlDatabasePassword htmldb_public_user
    PlsqlDatabaseConnectString oracledb:1521:yudb ServiceNameFormat
    PlsqlDefaultPage htmldb
    PlsqlDocumentTablename wwv_flow_file_objects$
    PlsqlDocumentPath docs
    PlsqlDocumentProcedure wwv_flow_file_manager.process_download
    PlsqlAuthenticationMode Basic
    PlsqlNLSLanguage AMERICAN_AMERICA.AR8MSWIN1256
    </Location>
    -- Need your help

  • This should be very simple...

    Using a USB-9171 chassis, an NI-9411 module, and LabVIEW 2012.
    I have one reflective optical reader wired to the 9411.
    It is mounted on the side of a conveyor bed to detect boxes passing by.
    I've been trying to set up a VI to count the number of boxes that pass the optical reader (that part is easy), but also to give me a gauge showing boxes per minute (which could be as low as 1 box per minute or as high as 60 boxes per minute).
    Any help?

    You could set up a frequency measurement task to do this.
    The tricky part is that the read is a blocking call--it waits until you have a sample available which only happens when a box passes by (starting with the 2nd box, the frequency measurement is computed by inverting the time between consecutive boxes).  You don't want to be stuck inside the DAQmx Read call for up to 1 minute.  A few ways to go about this:
    You can set the read timeout to some low value and discard the timeout error (-200284).  If you're using too much CPU you might need to add a wait to your loop in the timeout case (or maybe this is one of the rare times that setting the DAQmx Read Wait Mode to "Sleep" might actually help).
    You could poll the available samples per channel property to see when to read, but unfortunately polling available samples per channel does not query the onboard FIFO or initiate a transfer back to the software buffer (last I checked), so you'll be reading "0" for quite some time until the on-board FIFO starts filling up, at which point a large chunk of data will be transferred back to the DAQmx buffer in PC memory at once.  Note that this does not apply to PCI/PCIe DAQ devices which transfer data to the PC buffer pretty much as soon as possible--USB and Ethernet devices try to minimize the overhead of unnecessary data transfers but in this case it is a hindrance.
    Using the DAQmx Every N Samples event has the same problem as #2.
    Reading -1 samples returns whatever is in the DAQmx buffer in PC memory, so it also has the same problem as #2 (the read will return 0 samples until the hardware decides to transfer the data over to the PC).
    In your case, you might have luck with #2, #3, or #4 by setting the CI.DataXferReqCond to "Onboard Memory not Empty", but without being able to validate any of these workarounds I think I'm just going to recommend you use suggestion #1:
    You can either run this task in addition to your current edge count task (there are 4 counters on the 9171 which should be plenty given you can only have 1 module), or you can run it instead of it (poll back the Total Samples per Channel and add 1 to determine the count of boxes, however you wouldn't be able to distinguish between 0 and 1 boxes this way since the first sample is returned after the 2nd box passes).
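The frequency measurement described above just inverts the time between consecutive boxes; scaling that to boxes per minute is a one-line conversion (hypothetical helper, not DAQmx code):

```python
def boxes_per_minute(period_s):
    """Convert the measured time between consecutive boxes (seconds)
    into a rate in boxes per minute."""
    if period_s <= 0:
        raise ValueError("period must be positive")
    return 60.0 / period_s

print(boxes_per_minute(1.0))    # one box per second -> 60.0 boxes/min
print(boxes_per_minute(60.0))   # one box per minute -> 1.0 boxes/min
```

Note, as John says, that the first valid period is only available after the second box, so the gauge has nothing to show until two boxes have passed.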
    Best Regards,
    John Passiak
