LabVIEW and Finite Element Analysis

I would like to know how people have used LabVIEW for/with finite element modeling (FEM), either LabVIEW itself or LabVIEW interacting with third-party FEM software.
Sincerely,
Don

Don,
I have not seen applications built in LabVIEW that perform finite element modeling. While I am not entirely familiar with the intricacies of FEM, I do know that the process entails a great deal of advanced mathematics. LabVIEW does have VIs for performing various math functions, including differential equation solvers. If you want to use LabVIEW for FEM, you could also consider using the MathScript Node and the MathScript Window to write m-script code for these mathematical functions.
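As a rough text-based illustration of the kind of math FEM entails (a minimal sketch only, written in Python as a stand-in for m-script; the mesh size, material values, and variable names below are hypothetical), here is a 1-D bar problem: assemble per-element stiffness matrices into a global system and solve it.

```python
import numpy as np

# Minimal 1-D finite element sketch (hypothetical example): an elastic bar
# of length L, fixed at x = 0, with an axial point load P at x = L,
# discretized into n linear two-node elements.
L, n = 1.0, 10          # bar length [m], number of elements
E, A = 210e9, 1e-4      # Young's modulus [Pa], cross-section area [m^2]
P = 1000.0              # end load [N]
h = L / n               # element length
k = (E * A / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness

K = np.zeros((n + 1, n + 1))        # global stiffness matrix
for e in range(n):                  # assemble element contributions
    K[e:e + 2, e:e + 2] += k

f = np.zeros(n + 1)
f[-1] = P                           # load at the free end
u = np.zeros(n + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])  # apply u(0) = 0 and solve

# Analytical tip displacement for comparison: u(L) = P*L / (E*A)
print(u[-1], P * L / (E * A))       # both ~4.7619e-05
```

Real FEM packages do the same thing with sparse matrices and 2-D/3-D elements; the point is only that the core computation is linear algebra, which LabVIEW's math VIs or MathScript can express.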
I may be able to provide additional information if you could provide an example of your FEM application. How intricate is the model for the object that you will be modeling? For example, will you be analyzing forces and torques on a beam, or will you be modeling an asymmetrical collision of a car into a wall? Do you know what sort of calculations your application will entail?
At this time, LabVIEW does not have an easy means of communicating with third-party FEM software. Again, I would ask in what respect you would like to interact with other software. Do you want to see a visual representation of the physical object that you are analyzing? LabVIEW can easily interact with SolidWorks software for 3D modeling, but not with a software package that performs finite element analysis. Would you like to develop a front panel for an analysis application that runs in another software package? This may be possible if we discuss the details further.
Finally, I can encourage you to submit any product suggestions at the National Instruments Product Suggestion Center. Our LabVIEW developers carefully consider each of these suggestions, and any input you have as to how our application can better serve you would be greatly valued.
I apologize that I do not have a more concrete answer for you, but I hope this information was useful. Please let me know if you would like to discuss this application further. Thanks,
Mike
National Instruments
Applications Engineer

Similar Messages

  • DIAdem: loading finite element analysis software grid files

    Hi, all.
    I want to use DIAdem to read grid files from finite element analysis software. I know DataPlugins do not support them. I think the files might be read through the GPI, but I don't know how to do it. Could anyone tell me?
    LabVIEW 7.0, 8.0, 8.6, 8.6.1, 2011

    Hello kkjmt,
    could you please describe what type of data you want to read from the file? Is it geometry information describing the geometry of the mesh, or do you want to read result data, e.g., temperature calculated at a vertex of the mesh? Or both? In any case, it would be helpful to see an example of the data file.
    Andreas

  • Can WPF be used to perform finite element analysis?

    Hi All,
    We are using C#.NET and WPF to build a visualization application to mathematically design 3D models for optimization and interface it with Pro/E Wildfire. We are able to derive the unstructured wireframe of the model, but would like to view it with a structured (uniform rectangular) wireframe. We also want to see if there are ways in WPF to perform stress/force analysis on the 3D model. Is there any external API compatible with WPF that can be made use of?
    Every single direction/clue is appreciated.
    Thanks and regards,
    RM

    Hi RM79,
    I am marking your issue as "Answered". If you have new findings about your issue, please let me know.
    Best regards,
    Sheldon _Xiao[MSFT]
    MSDN Community Support | Feedback to us
    Microsoft
    Please remember to mark the replies as answers if they help and unmark them if they provide no help.

  • Finite element mesh?

    Hi,
    I am trying to draw a structure that I usually model using finite element analysis. In order to get high-quality illustrations, I was hoping to redraw it in Illustrator. My problem is that I'd need a fill that creates only intact squares; in FE terms this is called an advancing-front mesher, i.e., squares are created starting at the outline and then work their way towards the centre. Is there a way to do that in Illustrator? A standard "grid" fill doesn't work, as it produces incomplete squares at the edges.
    Thanks for the help!

    Well, here is another way that might work for you. I made a grid as before, but this time a narrow one with, say, 4 columns and 30 rows. I expanded the grid, though I'm not sure that part is necessary. Then I went to Object > Envelope Distort > Make with Mesh and gave it one column and 4 rows; of course it could be one, two, or three rows, or more if you wish.
    Then I used the Direct Selection tool to form the shape I wanted: rather than try to make the grid conform to the shape, I simply shaped the grid.

  • How large can a LabVIEW queue be, in elements and bytes?

    How large can a LabVIEW queue be, in elements and bytes?

    rocke72 wrote:
    How large can a LabVIEW queue be, in elements and bytes?
    In elements, it is likely something like 2^31. In bytes, it is most probably around the same number or more, depending on exactly how the different queue elements are stored. I think they are stored as independent pointers, so it could theoretically be more than 2^31. In practice, however, allocating that much memory in LabVIEW will always cause problems on today's computers. Without a 64-bit CPU and OS, going above 2 GB of memory per application is very hard and, as far as I know, not supported by LabVIEW.
    Also, allocating many chunks of memory (like a few million queue elements holding strings) will definitely slow down your system tremendously; it will work, but the OS memory manager will really be stress-tested.
    Rolf Kalbermatter
    Message Edited by rolfk on 06-09-2006 12:24 AM
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions
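Rolf's sizing argument can be made concrete with some back-of-envelope arithmetic (the per-element storage model here is an assumption for illustration, not documented LabVIEW internals):

```python
# Rough memory estimate for a queue, assuming each element is stored as an
# independent pointer to its payload (an assumption, not NI documentation).
def queue_bytes(n_elements, payload_bytes, pointer_bytes=4):
    """Approximate footprint of a queue holding n_elements items."""
    return n_elements * (pointer_bytes + payload_bytes)

# A few million short strings already reach hundreds of megabytes,
# uncomfortably close to the ~2 GB per-process limit of a 32-bit OS.
mb = queue_bytes(5_000_000, payload_bytes=60) / 2**20
print(f"{mb:.0f} MB")   # ~305 MB for 5 million 60-byte strings
```

Even at a modest payload size, a few million elements consume hundreds of megabytes, so a 32-bit process approaches its ~2 GB ceiling long before the theoretical 2^31-element limit matters.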

  • Controlling a heating/cooling element with LabVIEW and a USB-6008 DAQ card

    Greetings All
    I'm looking for a heating/cooling type of element that I can control with LabVIEW and my USB-6008 card. Heating/cooling will just be for clean water, and the temperature range will be from 0 °C to 100 °C.
    Thanks
    The heating and cooling elements can be separate products. Any recommendations?

    A simple kettle element will of course be sufficient for up to 100 °C.
    To get the temperature down you need a cooling system; what springs readily to mind, of course, is a refrigerator.
    Just a note of warning: water and electricity are dangerous bedfellows.
    On that note, perhaps one of those vortex air units would be safer. They blow hot in one direction and cold in the other, nominally to 100 °C.
    see: -
    http://www.airtxinternational.com/how_vortex_tubes_work.php

  • Are there any 3rd-party LabVIEW and/or TestStand code analysis tools?

    I am looking for any 3rd-party tools to analyze LabVIEW and/or TestStand code: tools that will check for memory leaks, timing, and other code performance problems. I know LabVIEW has the Profiler tool, but I am looking for 3rd-party tools.

    Hi wbolton,
    I am personally not aware of any 3rd party tools that do the same thing as the Profiler, but I can say that if you'd like to analyze a .VI or .seq file, the most straightforward way to do so would be with the appropriate National Instruments software.
    Regards,
    Dan Richards
    Dan Richards
    Certified LabVIEW Developer

  • Data logging optimization function in LabVIEW and SignalExpress

    Hello, everyone! A colleague and I have assembled the attached VI, which is used to control a plant growth chamber, and I could use some advice.
    I would like to add data logging capabilities, ideally to note (1) environmental conditions, such as temperature and CO2 level, and (2) when certain situations occur, such as high CO2 or low pressure (which results in a change to the "Case Structure for CO2 and Pressure").
    The chamber runs for weeks at a time, though, so I have a constraint. Instead of continuously logging all the environmental data (which would yield giant file sizes), I want to be able to take "snapshots", say every 5 minutes, so I can examine the chamber's condition over a long period of time.
    I attempted to use the Time Delay function to control the Write Measurements function (as in LabVIEW's Cycle Analysis example, which I've also attached), but Time Delay halted my entire program. I also tried the Wait (ms) function, but had no success.
    I recently discovered the powerful data review and reporting tools that SignalExpress has to offer. Ideally, I would like to use a SignalExpress Express VI to record the measurements instead of the more primitive Write Measurements, but I'm not sure how to implement this in my VI. For example, would I make a data acquisition VI in SignalExpress for all my sensors, export it to LabVIEW, and use that to replace the current DAQ structure in my VI? (Because the chamber VI is interactive, I'm pretty sure that porting the LabVIEW VI to SignalExpress wouldn't work out well.) And I'm still not sure how to control how often it records measurements.
    Any advice would be much appreciated!
    Attachments:
    Chamber VI.vi ‏789 KB
    Cycle Analysis.vi ‏300 KB

    Hi, Sarah. Yes, I definitely plan to use LabVIEW for the majority of my data acquisition and logging. I apologize; I think I probably should have broken my post into two separate posts for clarity, as my major concern is being able to set how often LabVIEW logs the environmental data (viewing the logged data in SignalExpress would be nice, but not a requirement).
    As I explained before, I'd really like to be able to set up a system that records the environmental conditions at regular time intervals, say every 30 minutes, or when my "Case Structure for CO2 and Pressure" status changes (such as from "Normal Conditions" to "High CO2").
    As an example, attached is a Boolean structure (TimedWriteMeasurements.vi) that I came up with to record a simulated signal every 5 seconds (5000 ms). Once I can get it to work, I will connect the Boolean to my environmental sensor outputs and the CO2 and Pressure case structure in the Chamber VI.vi that I attached in my first post.
    My problem is I can't seem to get the Trigger and Gate function to work the way I'd like. I'd like the Wait (ms) function to trigger the Trigger and Gate function every 5 seconds, changing the case structure in TimedWriteMeasurements.vi to True and thereby causing the Write Measurements function to record the environmental conditions. Then I'd like the case structure to reset to False until it's triggered to True in another five seconds. Can anyone point me in the right direction?
    Attachments:
    TimedWriteMeasurements.vi ‏84 KB
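The timing logic being attempted here can be sketched in Python as a stand-in for the LabVIEW wiring (names and values are hypothetical): rather than a blocking Time Delay, check elapsed time inside the free-running loop and log only when the interval has passed or the alarm status changes.

```python
# Non-blocking "snapshot" pattern: the loop keeps running at full speed;
# a snapshot is written only when it is due or the status just changed.
SNAPSHOT_INTERVAL_S = 5 * 60              # one snapshot every 5 minutes

def should_log(now, last_log, status, last_status,
               interval=SNAPSHOT_INTERVAL_S):
    """True when a snapshot is due or the alarm status just changed."""
    return (now - last_log) >= interval or status != last_status

# Simulated main loop; t = 0 is the initial snapshot time.
last_log, last_status, logged = 0.0, "Normal Conditions", []
for now, status in [(120.0, "Normal Conditions"),   # too early: skipped
                    (180.0, "High CO2"),            # status change: logged
                    (480.0, "High CO2")]:           # interval elapsed: logged
    if should_log(now, last_log, status, last_status):
        logged.append((now, status))    # stand-in for Write Measurements
        last_log = now
    last_status = status

print(logged)   # [(180.0, 'High CO2'), (480.0, 'High CO2')]
```

In LabVIEW terms, this corresponds to keeping a "last log time" in a shift register, comparing it against the current tick count each iteration, and wiring the comparison result to a case structure around Write Measurements, so nothing ever blocks the main loop.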

  • Bug or Feature? Array Custom Contextual Menu Destroys "Delete" and "Insert" Element!

    Steps to Reproduce:
    - Create an Array (doesn't matter of what type).
    - Ctrl-M (to switch to Run Mode)
    - Notice that:
           - when you Right-Click on the Array borders, you have access to an "Empty Array" menu item among other things
           - when you Right-Click in an Array element, you have access to an "Insert Element Before" and a "Delete Element" menu item among other things
    - Now switch back to Edit Mode and modify the contextual menu in the following way:
            - Advanced>>Run-Time Shortcut Menu>>Edit...
            - Edit>>Copy Entire Menu
            - Switch to "Custom" menu (instead of "Default"): the menu disappears and is replaced by a single ??? item
            - Edit>>Paste: The default menu reappears with the ??? on top
            - Create your favorite custom menu item by editing the ??? item (say: Do Nothing)
    - Save the menu with the control and switch to Run Mode (Ctrl-M).
    - Now try the first 3 steps above: wherever you right-click, you have access to the Custom Menu, but the Array Element contextual menu is GONE.
    In other words, you cannot (it seems) define a custom contextual menu for an array without destroying the default contextual menu for its elements.
    Therefore, if you want to preserve the ability to Insert and Delete Elements in an array, you have to add these two items to the Array contextual menu and juggle with the position of the right-click to figure out whether or not to display them...

    Well, I created such a control in LabVIEW 2012 and added part of the default menu, in the way you described, with copy-paste as a submenu of an Edit entry in my custom menu, and it did not disappear:
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Merging LabVIEW and MathScript: array question (tia sal2)

    Greetings,
    I really like the MathScript functionality; it's just taking a while to learn how to merge LabVIEW and MathScript together. I can get an XY graph out which plots the correct wave function (thanks to the forum's help). The problem is going from a 1-D array of clusters of 2 elements to a 1-D array of waveform (DBL), so I can get a sound representation of the waveform.
    Anyone have any suggestions?
    Tia sal2
    Attachments:
    mathscript formula to sound with waveform graph.vi ‏248 KB

    Sorry, not sure what happened; internet gremlins, maybe... Here's the VI and graphic image.
    Second try. Hopefully this will work.
    Attachments:
    mathscript formula waveform graph.vi ‏1 KB
    image_of_mathscript_to_labview.jpg ‏108 KB
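For what it's worth, the conversion being asked about can be sketched in Python as a stand-in for the LabVIEW wiring (the assumed approach, with made-up sample data: unbundle each cluster into its X and Y parts, then build a waveform from t0, dt, and the Y array):

```python
import numpy as np

# A 1-D array of (x, y) clusters, as produced by the XY-graph wiring.
pairs = [(0.000, 0.0), (0.001, 0.31), (0.002, 0.59), (0.003, 0.81)]

x = np.array([p[0] for p in pairs])     # time stamps from the cluster
y = np.array([p[1] for p in pairs])     # amplitudes: the waveform's Y array
t0 = x[0]                               # waveform start time
dt = x[1] - x[0]                        # assumes uniform sampling in x

print(t0, dt, y)
```

In LabVIEW terms, the Y array together with t0 and dt would go into Build Waveform, whose output can then feed a sound output VI; the key assumption is that the X values are uniformly spaced, since a waveform carries only a single dt.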

  • Why are Photoshop and Premiere Elements so laggy?

    I have an i7-3770 computer running Windows 8 with 32 GB RAM and a 128 GB SSD + 2 TB hard drive. I run plenty of other intensive programs without any problems, but for some reason both Premiere Elements and Photoshop Elements run stupidly slowly. What's up with this?

    First, check to see if the auto-analyzer is running (Organizer > Preferences > Media Analysis) and see if shutting it down makes any difference. Also, if you've imported a huge number of photos (like in the thousands), PSE will run slowly until the Organizer is done generating thumbnails. Sometimes it helps to let it run overnight to finish up.

  • Error 1097 ("...this might have corrupted LabVIEW's memory...") appears after exiting and restarting LabVIEW

    I have a problem when I use the "Tools > Import Shared Library (.dll)" wizard to generate an API from C.
    The API works correctly when I complete the wizard, but after I exit LabVIEW and restart it to run the API again, the error 1097 message ("...this might have corrupted LabVIEW's memory...") appears.
    The only solution so far is to re-run or update the wizard to regenerate the API.
    I tried to find the answer in
    http://digital.ni.com/public.nsf/allkb/58596F5D41CE8EFB862562AF0074E04C?OpenDocument
    The article includes the paragraph:
    "LabVIEW does not crash until it is closed
    The most likely problem is that the DLL function being called has corrupted the memory. If you pass arrays or strings to the DLL, the DLL function cannot dynamically resize the array. Writing beyond the last element of the array or string could corrupt the memory and this may not be obvious until LabVIEW is closed."
    But it doesn't match my case. Does anyone have an idea about this? I attach my .h files for reference.
    Attachments:
    test_api.h ‏1 KB
    link_test_api.h ‏1 KB

    wewe1215 wrote:
    I have a problem when I use the "Tools > Import Shared Library (.dll)" wizard to generate an API from C.
    The API works correctly when I complete the wizard, but after I exit LabVIEW and restart it to run the API again, the error 1097 message ("...this might have corrupted LabVIEW's memory...") appears.
    The only solution so far is to re-run or update the wizard to regenerate the API.
    I tried to find the answer in
    http://digital.ni.com/public.nsf/allkb/58596F5D41CE8EFB862562AF0074E04C?OpenDocument
    The article includes the paragraph:
    "LabVIEW does not crash until it is closed
    The most likely problem is that the DLL function being called has corrupted the memory. If you pass arrays or strings to the DLL, the DLL function cannot dynamically resize the array. Writing beyond the last element of the array or string could corrupt the memory and this may not be obvious until LabVIEW is closed."
    But it doesn't match my case. Does anyone have an idea about this? I attach my .h files for reference.
    Which of the two functions do you call? How?
    The documentation of InitSocketEx() seems to indicate a parameter order that is reversed from what the function prototype shows!
    I indeed do not see many possibilities for passing in too small a buffer that the DLL function might overwrite past the end. This really only leaves one more possibility:
    Your DLL is doing something largely illegal somehow. Maybe something as silly as storing the reference to the ipAdress instead of the address itself. The memory passed as parameters into a C function is generally only valid for the duration of the call. This is especially true with LabVIEW, since LabVIEW will reallocate and deallocate memory buffers frequently as soon as they are not used anymore, and the parameter to a Call Library node is considered to be no longer required after the function returns.
    Rolf Kalbermatter
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions
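Rolf's two rules (the callee cannot grow a caller-supplied buffer, and pointers passed in are only valid for the duration of the call) can be illustrated with Python's ctypes as a stand-in for the Call Library node (buffer size and payload are made up):

```python
import ctypes

# Rule 1: the caller must allocate a buffer big enough for everything the
# library will write into it, because the callee cannot resize it; writing
# past the end is exactly the heap corruption behind error 1097.
payload = b"192.168.0.10"
buf = ctypes.create_string_buffer(64)    # caller-side allocation, 64 bytes

# ctypes.memmove plays the role of a DLL writing into the caller's buffer.
assert len(payload) < len(buf.raw)       # would corrupt memory otherwise
ctypes.memmove(buf, payload, len(payload))
print(buf.value)                         # b'192.168.0.12'-style C string

# Rule 2: copy the data out if it must outlive the call; the pointer that
# was passed in is only guaranteed valid while the call is in progress.
saved = bytes(buf.value)
```

A DLL that keeps the passed-in pointer around after returning will read or write memory LabVIEW has long since reused, which fits the "works until restart" symptom described above.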

  • LabVIEW and Vision (8.2) 2D FFT - Some differences noted

    Hello:
    I am attaching a small LLB showing the 2D FFT results obtained using both the LabVIEW and the Vision (8.2) functions. Some differences in the results are seen. The Vision results seem to have more components (move the scroll bars to get to the approximate center of the Vision container), while the LabVIEW results show only one non-zero pixel near the center. I wanted to know if someone can explain the differences seen.
    Given that I might trust the Vision results more, I need to either display them in a Vision container of the size shown that autoscales X and Y and does not show scrollbars (i.e., behaves like an intensity graph), or be able to convert the Vision container results so they can be displayed in an intensity graph. I don't know how to get the Vision container to do the former, and since the Vision container results are in complex format, it is not clear to me how to do the latter conversion.
    Any thoughts are appreciated.
    Sincerely,
    Don
    Attachments:
    2D FFT Comparison.llb ‏2184 KB

    Hi Gavin:
    The end of my post above proposes to do exactly what you state: converting the Vision container to an array using IMAQ Image to Array. But how do you do it? Remember, the Vision container is of type complex; you do not have the option to specify complex when using that function. Run the attached and see the error one gets.
    It is not clear to me from the references you cite why we should get different answers between the two functions. Does this mean that when I go to another function library, such as from Visual Basic or C++, I would get two more different answers? There is some subtle difference between the functions that only the R&D department can probably tell us about. On something gross like the 2D FFT example (C:\Program Files\National Instruments\LabVIEW 8.2\examples\analysis\dspxmpl.llb\2D FFT of a Pulse.vi) included with LabVIEW, the results are substantially the same.
    Sincerely,
    Don
    Attachments:
    2D FFT Comparison.llb ‏2192 KB
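One hedged guess, not a confirmed diagnosis of the Vision-versus-LabVIEW discrepancy: two FFT implementations can return the same spectrum in different layouts, e.g. DC in the corner versus shifted to the center, which makes the magnitude images look very different even though the data agree. A small NumPy sketch of that effect:

```python
import numpy as np

# Same 2-D FFT, two conventional layouts of the result.
pulse = np.zeros((8, 8))
pulse[3:5, 3:5] = 1.0                       # small centered pulse

spectrum = np.fft.fft2(pulse)               # DC term at index [0, 0]
centered = np.fft.fftshift(spectrum)        # DC term moved to the center

# Identical data, very different-looking magnitude images:
print(np.argmax(np.abs(spectrum)))          # 0  -> peak in the corner
print(np.argmax(np.abs(centered)))          # 36 -> row 4, col 4: the center
```

If one of the two libraries applies such a shift (or a scaling convention) and the other does not, the displays would disagree exactly as described while both results remain correct; comparing a shifted copy of one against the other would confirm or rule this out.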

  • LabVIEW and multicore technology

    Greetings
    The scenario I'm in now is as follows:
    I'm a first-year PhD student trying to find a new, innovative, and impressive project in the area of wireless communications systems. While doing my literature review and exploring my favorite knowledge base, the www.ni.com website, I happened to read some titles about the new multicore technology. After watching most of NI's webcasts, I got many ideas on how to exploit this technology together with LabVIEW to boost my yet-undefined PhD wireless communications project; in other words, I want to speed up an existing wireless technology (e.g., beamforming algorithms for smart antennas). When I met my supervisor, I gave him a quick demo, and he wondered which wireless topic we could apply multicore technology and LabVIEW to; I suggested "improving wireless systems using multicore technology and LabVIEW coding." But a class-A student told us that the multithreading problems of multicore technology have already been solved by the software giants, and that there is nothing special about using LabVIEW for parallel processing, since many text-based languages can do the same because their compilers are already designed for multicore systems. That depressed me, because I intend to use LabVIEW, my favorite software, with multicore technology to simulate the performance of a wireless system and publish a paper on the topic.
    In short, I wish you would guide me and tell me anything related to multicore technology with LabVIEW: any new problem you would like to see simulated in LabVIEW so that I can work on it. Please also point me to the latest multicore updates and webcasts (I have seen almost all the webcasts now available on the NI website). Any suggestion, any feedback, any new PhD research idea is welcome and appreciated.
    Thanks a lot.
    Please help me find something related to wireless, anything new for research... please.
    Labview Lover

    Labview Lover wrote:
    LabVIEW (my most favorite software) with multicore tech to simulate the performance of a wireless system, and publish a paper about this topic.
    Why not do a project using LabVIEW to create different models of wireless technologies, and create simulations to evaluate their potential performance? The benefit of multicore programming would be to optimize the speed at which you can run simulations of the models. For instance, you can include spectral analysis using LabVIEW.
    One idea would be to do the above for RAKE receivers or software-defined radios. There are many things you can research and investigate using LabVIEW. You can build the hardware and compare the physical system to the model. There are lots of project ideas in this area.
    R

  • Help! How do I get the recovery time of the transient response of a power supply with the LabVIEW Base package, without the analysis option?

    How do I get the recovery time of the transient response of a power supply with the LabVIEW Base package, without the analysis option? Does anyone have an idea, or some similar function subVIs?
    Recovery time of the transient response is defined as the time from the beginning of the transient to the point on the waveform where the voltage has fallen to within 10 percent of the overshoot. The waveform is something like a pulse with a soft slope.

    I recommend plotting your data on a graph on paper. Take a look at the data, and determine what is unique about the point you are looking for. Look for how you can teach your program to look for this point.
    I have written several algorithms that do something similar, one in fact being for a power supply, the other being for RPM. Neither algorithm used any advanced analysis tools. They are just a matter of determining, mathematically, when you achieve what you are looking for. Just sit down with your graph (I recommend multiple copies) and draw horizontal and vertical lines to determine when you get to the point you are looking for. You will probably have to reverse the array and start from the end, so think in those terms.
    If you have trouble, email me a bitmap of the graph and what you are looking for, and I will try to be of further assistance. Don't do that, however, until you have given this a few tries. Your solution will probably involve a lot of logic on analog levels.
    Good luck
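The approach described above can be sketched as follows (assumptions: the waveform has settled by its last sample, and "recovery" means re-entering a band of +/-10 % of the peak deviation around the settled value; the synthetic waveform is made up). Following the reply's hint, the search runs from the end of the array backwards.

```python
import numpy as np

def recovery_time(t, v, transient_start):
    """Time from transient start until v stays inside the 10 % band."""
    v_final = v[-1]                          # settled value
    overshoot = np.max(np.abs(v - v_final))  # peak deviation from final value
    band = 0.10 * overshoot                  # 10 % recovery band
    outside = np.abs(v - v_final) > band
    last_out = np.flatnonzero(outside)[-1]   # last sample still outside
    return t[last_out + 1] - transient_start

# Synthetic example: exponential settling after a transient at t = 0.
t = np.linspace(0.0, 5.0, 5001)              # 1 ms sample spacing
v = np.exp(-t)                               # deviation decays toward ~0
print(round(recovery_time(t, v, 0.0), 3))    # 2.244
```

No analysis-package VIs are needed: the whole algorithm is a max, a comparison against a threshold, and an index search, all available in the Base package's array and comparison functions.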
