Changes in LabVIEW VI

Dear all,
I use a path control to define the file location. However, when I change the value of that path and save, the program tells me it is saved.
Yet the next time I open the VI, it still shows the old value. What's wrong there?
Thanks.

In order to save a change in the value of a control, you have to go to the Edit menu and select 'Make Current Values Default' (or 'Make Selected Values Default'). Then you do a save.

Similar Messages

  • Sir, I am using DataSocket Read to communicate with Java. My problem is that because I use a while loop to see whether the value has changed, LabVIEW consumes all the processor's time. I want something like an event, so that the while loop is not running continuously

    Sir, I have put in a lot of effort, but I am not able to solve my problem with either notifiers or the Occurrence functions; I probably do not know how to use these synchronisation tools.

    Hi Sam,
    I want to pass along a couple of tips that will get you more and better response on this list.
    1) There is an unwritten rule that says more "stars" are better than just one star. Giving a one-star rating will probably eliminate that responder from the individuals that are willing to answer your question.
    2) If someone gives you an answer that meets your needs, reply to that answer and say that it worked.
    3) If someone suggests that you look at an example, DO IT! LV comes with a wonderful set of examples that demonstrate almost all of the core functionality of LV. Familiarity with all of the LV examples will get you through about 80% of the Certified LabVIEW Developer exam.
    4) If you have a question, first search the examples for something that may help you. If you can not find an example that is exactly what you want, find one that is close and post a question along the lines of "I want to do something similar to example X, how can I modify it to do Y".
    5) Some of the greatest LabVIEW minds offer their services and advice for free on this exchange. If you treat them well, they can get you through almost every challenge that can be encountered in LV.
    6) If English is not your native language, post your question in the language you favor. There is probably someone around that can help. "We're big, we're bad, we're international!"
    Trying to help,
    Welcome to the forum!
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction
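    The busy-wait problem in the original question is worth a concrete illustration. LabVIEW's Occurrences and Notifiers are blocking waits: the loop sleeps until the value changes instead of burning CPU checking it. The same idea sketched in Python (an illustration of the concept only, not LabVIEW code; threading.Event plays the role of the Occurrence):

    ```python
    import threading

    value = 0
    changed = threading.Event()  # stands in for a LabVIEW Occurrence/Notifier

    def producer():
        """Simulates the other side (e.g. Java via DataSocket) updating the value."""
        global value
        value = 42
        changed.set()            # fire the "occurrence": wake any waiting loop

    def wait_for_change(timeout=5.0):
        """Blocks without a busy loop (near-zero CPU) until a change is signaled."""
        changed.wait(timeout)    # the thread sleeps here instead of spinning
        return value

    t = threading.Thread(target=producer)
    t.start()
    result = wait_for_change()
    t.join()
    ```

    The difference from a polling while loop is that wait() suspends the thread, so the scheduler gives the processor to other work until set() is called.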

  • Change/Remove LabVIEW 6.02

    After updating LabVIEW 6.0i Professional to version 6.02, I cannot change/remove LabVIEW using Control Panel \ Add/Remove Programs in Windows 2000. The error message from Windows is "Error applying transforms. Verify that the specified transform paths are valid."
    What can I do with this issue?

    Hi TXu
    Try the steps described on this site:
    http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/9f563a9a1b40dfcb86256a1700547acb?OpenDocument
    Regards,
    Luca P.
    Applications Engineer
    National Instruments

  • Cannot change radix in LabVIEW 2010

    Good Afternoon,
    I'm trying to change the radix of an integer constant to hex display in LabVIEW 2010, as discussed in this post:
    http://forums.ni.com/t5/LabVIEW/How-can-I-create-a-hexadecimal-constant-in-LabVIEW/td-p/1008715
    However, in the right click context menu I do not see a "Hex" option.
    I was able to copy/paste the integer from the forum-posted VI and use it in my application, but I need to know how to change the display to hexadecimal.
    And I don't want to use a hex-string-to-integer conversion; I should be able to input a hex number directly into a numeric constant, just like in C.
    Thank you,

    In the menu you have to go to Display Format to change the radix.  Then you have to go to Appearance to show the radix.  This last step should be easier.
    http://forums.ni.com/t5/LabVIEW-Idea-Exchange/Make-it-easier-to-show-the-radix-when-you-change-it/id...
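    The distinction the reply relies on, the value of a constant versus the radix it is displayed in, is the same one text languages make. A quick sketch in Python (not LabVIEW; just to show that 0xFF and 255 are one value with two displays, exactly like changing the radix on the Display Format page):

    ```python
    x = 0xFF                    # hex literal typed directly, just like in C
    y = 255                     # the same value entered in decimal

    hex_display = f"0x{x:02X}"  # changing the "radix" only changes the display
    dec_display = str(x)        # same value, decimal display
    ```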

  • Losing changes in LabVIEW 8--bug report

    Don't know if this has been reported yet or not, but it was driving me crazy watching my coding changes evaporate until I realized it wasn't me.
    To reproduce:
    Create a blank project and a new VI
    Add FP.Open VI Property node with a False Constant
    Run the VI
    All changes will be LOST
    After nearly going mad and tracking down the root cause, I find the newly added note:  "National Instruments recommends using the Front Panel:Open [or Close] method instead of this property."
    So you can try that too if you want, which has the same results:
    Create a blank project and a new VI
    Add FP.Close Method
    Run the VI
    All changes will be LOST
    *POOF*  If the VI was an Untitled new VI that hadn't been saved, it will just vanish from the project list.

    You're correct. I forgot one step to make the "This is how we close instances of VI templates opened to avoid the Save dialog" statement true. The VI must open a reference to itself and pass that to the node to avoid the Save dialog. 7.1 does indeed pop up the Save dialog if you just have the node set to close.
    So now the question is: did NI intentionally change this or not? I've found other changes that I thought were bugs but that they had intentionally changed. See this thread for one of them.
    I'll see if I can get the attention of an AE to look into this.
    BTW, I do hit (Ctrl+S) before I run anything, because I've had LabVIEW disappear when I hit Run.
    Ed
    Message Edited by Ed Dickens on 05-03-2006 03:56 PM
    Ed Dickens - Certified LabVIEW Architect - DISTek Integration, Inc. - NI Certified Alliance Partner
    Using the Abort button to stop your VI is like using a tree to stop your car. It works, but there may be consequences.

  • NI-DAQmx 7.4 Changes for LabVIEW Real-Time

    Hello LabVIEW Real-Time and DAQmx users,
    National Instruments has recently released NI-DAQmx 7.4. Changes introduced in NI-DAQmx 7.4 provide increased flexibility when performing hardware-timed single-point operations on real-time platforms. The Developer Zone Tutorial: DAQmx Hardware-Timed Single Point Lateness Checking has been created to help explain these changes. Information is also available in the NI-DAQmx Help.
    In order to provide these changes in NI-DAQmx 7.4, the behavior of 'DAQmx Is Read or Write Late.vi', used in previous versions, had to be changed. In NI-DAQmx 7.4, 'DAQmx Is Read or Write Late.vi' is obsolete and no longer performs any lateness checking. Applications upgrading to NI-DAQmx 7.4 should use one of the lateness-checking options discussed in the above tutorial.
    Regards,
    JR Andrews
    NI

    Hello,
    I changed the program and added a case structure to configure each channel according to its type. In order to go through all the channels in the configuration file, I put it in a For Loop.
    It seems to be working; however, when I start the acquisition, I only see one channel, and that is the last one that was configured.
    Here is the original file, configures all, but only works for AI Voltage:
    Here is the modified one:
    What am I doing wrong?
    Thanks!
    Attachments:
    ConfigureFor Channels.PNG ‏131 KB

  • Change TS LabVIEW Server from LabVIEW OI?

    I have tried the NI-recommended way to select the LabVIEW server programmatically from TestStand, which works fine, e.g.
     RunState.Engine.GetAdapterByKeyName("G Flexible VI Adapter").AsLabVIEWAdapter.SetServerInfo(LabVIEWServer_RTEServer, "C:\\Program Files\\National Instruments\\Shared\\LabVIEW Run-Time\\8.6\\lvrt.dll")
    But I would like to call the same functionality from a LabVIEW Operator Interface...
    After I get the engine reference, I can use an Invoke Node for GetAdapterByKeyName, but I am struggling with the next part (AsLabVIEWAdapter.SetServerInfo): I can't see these methods. Any tips/examples on how to do this?
    Thanks.

    Simon,
    Thank you for providing CIM1 with this VI! I have made a few improvements to the VI and attached it below. You don't actually have to use the Adapter.AsPropertyObject method at all, you can directly connect the Adapter reference to the Variant to Data VI and cast it to a LabVIEW Adapter.
    Also, the logic that was used for determining the type of LabVIEW server seems incorrect. According to the LabVIEWServerTypes Enumeration Help, the LabVIEWServer_ExecServer enum should be used for the LabVIEW development environment or a LabVIEW executable that registers itself as a LabVIEW ActiveX Automation Server. The LabVIEWServer_RTEServer enum should only be used for the LabVIEW Run-Time Engine. In your code it seemed like you were setting the LabVIEWServer_RTEServer enum for both the LabVIEW Run-Time Engine and a LabVIEW executable server. I've modified this portion of the VI as well to behave correctly.
    Let me know if you have any questions.
    Manooch H.
    National Instruments
    Attachments:
    ConfigureLabVIEWAdapter.vi ‏13 KB

  • Changes in labview.lib between 6.0.2 and 6.1?

    I recently installed 6.1 and tried to run our test system. The error DSPtrAndHand occurred in labview.lib and LabVIEW shut down. I looked at the labview.lib files in both 6.0.2 and 6.1, and the lib file was twice as big in 6.0.2. I am not an expert on the system, but I think that we have external code that calls the function DSPtrAndHand, which seems to be some kind of memory copy. This function seems to have disappeared in 6.1. We have tried to recompile with the new labview.lib, and the linker warns that DSPtrAndHand cannot be found.
    Br
    Johan N

    Yes, this function has disappeared in LabVIEW 6.1. I'm not sure as to the reason, and considering that it was documented in the CIN Reference Manual in earlier versions, I was rather surprised to see that.
    However, the workaround is quite simple: create your own function to do that particular operation:
    MgErr DSPtrAndHand(UPtr p, UHandle h, int32 size)
    {
        MgErr err;
        int32 len;

        /* Validate the arguments before touching the handle. */
        if (!h || !*h || !p || !size)
            return mgArgErr;
        len = DSGetHandleSize(h);
        err = DSSetHandleSize(h, len + size);
        if (!err)
            MoveBlock(p, *h + len, size);  /* append the bytes at p to the end of the handle */
        return err;
    }
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • FP default-values and diagram "constants" can change in LabVIEW 6.1 and 7.1

    Hi Folks,
    I'm posting here [instead of a bug report] first, in case this isn't really a bug.
    I created a bunch of similar VIs with type-def cluster inputs and, on each VI, gave the clusters a default value. I'm finding that when these typedefs are changed by removing an element, the default FP values are corrupted. In my case it gets worse. When I used these VIs on diagrams, I frequently created a cluster constant derived from the VI's cluster control (described above). The remaining elements in the diagram "constants" are also changing.
    Regards
    P.S.  Sorry if topic has already been discussed! (I did search a "bit", first...)   
    Message Edited by Dynamik on 10-28-2005 10:44 PM
    When they give imbeciles handicap-parking, I won't have so far to walk!
    Attachments:
    Untitled12.vi ‏17 KB
    Untitled.ctl ‏7 KB

    This sounds like it could be a bug AND normal operation.
    There are bugs in LV 7.1, 7.0 and possibly earlier that cause LV to choose the wrong value when bundling and un-bundling by name.
    I have been told these are fixed in LV 8.0
    See this thread for more details on the bug.
    http://forums.ni.com/ni/board/message?board.id=170&message.id=105455&jump=true
    Now as far as the constants that are based on the typedef changing, that is normal behaviour. As Odd_Modem mentioned, doing an explicit bundle-by-name is the way to handle this. If you use the same "constant" repeatedly in your code, a sub-VI that does the bundling makes your code a lot easier to read.
    Here is a code snippet
    and the source for that example can be found in this thread.
    http://forums.ni.com/ni/board/message?board.id=170&message.id=148471#M148471
    Ben
    Message Edited by Ben on 10-29-2005 10:42 AM
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Analog signal change (module 9381) in LabVIEW when activating switch on relay module (NI 9481) on separate circuit

    We are getting analog voltage changes in LabVIEW on several different inputs (not found with a multimeter). It happens when we activate the NI 9481 relay module with a 24 V DC signal to power a solenoid. We don't understand why this happens, or why it shows on three different inputs in LabVIEW but not when testing the points with a meter.

    That picture is of little help, since the devices will report what they measure, whether it is the voltage of interest or not.
    To find the root cause of that effect, you 'just' have to follow the current. However, current, and in particular _fast-changing current_, can take funny, not always obvious paths.
    That's why I asked for details of the setup, and every wire counts
    (and ground is a con......   see sig)
    An as-fast-as-possible sampled voltage channel while a switch is activated (say 20 ms) would help too.
    Greetings from Germany
    Henrik
    LV since v3.1
    “ground” is a convenient fantasy
    '˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'

  • LabVIEW memory management changes in 2009-2011?

    I'm upgrading a project that was running in LV 8.6. As part of this, I need to import a customer database and fix it. The DB has no relationships in it, and the new software does, so I import the old DB, create the relationships, fix any broken ones, and write out to the new DB.
    I started getting memory crashes in the program, so I started looking at Task Manager. The LabVIEW 8.6 code on my machine will peak at 630 MB of memory when the database is fully loaded. In LabVIEW 2011, it varies. The lowest I have gotten it is 1.2 GB, but it will go up to 1.5 GB and crash. I tried LV 2010 and LV 2009 and see the same behavior.
    I thought it might be the DB toolkit, as it looks like it had some changes made to it after 8.6, but that wasn't it (I copied the LV 8.6 version into 2011 and saw the same problems). I'm pretty sure it is a difference in how LabVIEW is handling memory in these subVIs. I modified the code to still do the DB SELECTs but do nothing with the data, and there is still a huge difference in memory usage.
    I have started dropping memory-deallocation VIs into the subVIs, and that is helping, but I still cannot get back to the LV 8.6 numbers. The biggest savings was from dropping one into the DB toolkit's fetch subVI.
    What changed in LabVIEW 2009 to cause this change in memory handling? Is there a way to address it?

    I created a couple of VIs which will demonstrate the issue.
    For Memory Test 1, here's the memory (according to Task Manager):
                      Pre-run     Run 1      Run 2      Run 3
    LabVIEW 8.6        55504     246060     248900     248900
    LabVIEW 2011       93120     705408    1101260    1101260
    This gives me the relative memory increase of:
                    Delta Run 1  Delta Run 2  Delta Run 3
    LabVIEW 8.6         190556       193396       193396
    LabVIEW 2011        612288      1008140      1008140
    For Memory Test 2, it's the same except I drop the array of variants:
                      Pre-run     Run 1      Run 2      Run 3
    LabVIEW 8.6        57244      89864      92060      92060
    LabVIEW 2011       90432     612348     617872     621852
    This gives us deltas of:
                    Delta Run 1  Delta Run 2  Delta Run 3
    LabVIEW 8.6          32620        34816        34816
    LabVIEW 2011        521916       527440       531420
    What I found interesting in Memory Test 1 was that LabVIEW used more memory for the second run in LV 2011 before it stopped. I started with Test 1 because it more resembled what the DB toolkit was doing, since it passes out variants that I then convert. I thought maybe LabVIEW didn't store variants internally the same way any more. I dropped the indicator thinking it would make a huge difference in Memory Test 2, and it didn't make a huge difference.
    So what is happening? I see similar behavior in LV 2009 and LV 2010. LV 2009 was the worst (significantly); LV 2010 was slightly better than 2011, but still significantly worse than 8.6.
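    The before/after bookkeeping behind these tables (sample memory, run the workload, sample again, diff) is worth making explicit. A minimal Python sketch of the same measurement idea, using tracemalloc instead of Task Manager (the workload function is a made-up stand-in, not the DB code from this post):

    ```python
    import tracemalloc

    def run_workload():
        """Stand-in for one 'Run' of the memory test: allocate some data."""
        return [bytearray(1024) for _ in range(1000)]  # roughly 1 MB

    tracemalloc.start()
    before, _ = tracemalloc.get_traced_memory()  # the "Pre-run" column
    data = run_workload()
    after, _ = tracemalloc.get_traced_memory()   # the "Run 1" column
    delta = after - before                       # the "Delta Run 1" column
    tracemalloc.stop()
    ```

    Repeating the run-and-sample step gives the Run 2 and Run 3 columns; if the deltas keep growing across identical runs, something is holding on to memory.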
    Attachments:
    Memory Test.vi ‏8 KB
    Memory Test2.vi ‏8 KB

  • How can I change the termination character (Hex 00) in LabVIEW?

    I'm talking to a 3rd-party device over USB, where I make function calls to their DLL. I send and receive hex commands as the string data type. I can send without problems; however, when I read back a string that should be hex "FF 00 FF", I get back "FF", because LabVIEW sees 00 as a termination character, so I don't get any data past the 00. Likewise, if I'm supposed to read "00 FF", I get back nothing! The KB linked below explains that LabVIEW sees 00 as a termination character. It shows how, with serial, you can change the termination character to something else. I don't have this option with my 3rd-party USB device. Is there a way to change what LabVIEW sees as its termination character?
     http://digital.ni.com/public.nsf/allkb/9A3589A05F21A1B186256CE9006448D4

    That article is talking about two things. The first has to do with the fact that in C, strings are terminated using 0x00. The second part has to do with a serial port read terminating once it sees 0x00. Based on what you described, I don't believe the second part applies to you, since you are calling a third-party DLL, so it's doing the serial port read. Is this correct? Or are you actually doing the serial port read yourself using VISA functions? If you are just calling their DLL, then the issue has to do with the string being passed back from the DLL to LabVIEW. You're probably passing in a string datatype. You should simply change this to a byte array so you get all the values.
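    The truncation described here is easy to reproduce outside LabVIEW. Anything that treats the DLL's buffer as a C string stops at the first 0x00 byte, while a length-counted byte array keeps every byte. A small Python sketch of both views of the same buffer (using ctypes to mimic the C-string reading; the byte values are the ones from the question):

    ```python
    import ctypes

    buf = b"\xff\x00\xff"          # what the device actually sends back

    # C-string view: reading stops at the first NUL (0x00) byte,
    # which is exactly the "FF instead of FF 00 FF" truncation above.
    as_c_string = ctypes.c_char_p(buf).value

    # Byte-array view: the length is carried separately, so nothing is lost.
    as_byte_array = list(buf)
    ```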

  • LabVIEW 2014 array constant change

    Why did the "graying out" of array constants change between LabVIEW 2013 and LabVIEW 2014? In 2014 the effect is much more subtle, and it's far harder to tell the number of elements in an array constant. Please change this back ASAP!

    My only thought was that they tried to make the disabled elements more readable.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines
    Attachments:
    Disabled Elements.PNG ‏24 KB

  • Excel activex call changes between Office 2000 and Office XP. How does one manage that?

    I have several ActiveX calls from within a VI. One in particular is the Excel Cell Value property node in Office 2000. MS has decided to call it Excel Cell Value2 in Office XP.
    I have built an exe on a machine with Office 2000 and can run the code on a machine with Office XP, but I cannot build on the machine with Office XP. I can also run a VI with that call on the Office XP box, but if I mass compile the VI I get the broken arrow.
    I guess I am confused as to why it can run but not compile. If the ActiveX call is not there for the compile, why is it there for the run? If I can expect this, does it work in reverse, where I build on an Office XP box and run on an Office 2000 box?

    These problems you are experiencing do stem from the ActiveX changes between Excel versions. When you mass compile, LabVIEW checks whether everything is linked correctly, including the ActiveX portions of your code. Since there are slight changes, the LabVIEW compiler detects that something is wrong but cannot isolate the problem. You might need to force a recompile of that VI. Check out this link.
    http://digital.ni.com/public.nsf/websearch/50D06DEE8B9DC018862565A0006742F2?OpenDocument
    Hope this helps!
    Andy F.
    National Instruments

  • I would like to read a text file in which the decimal numbers use dots instead of commas. Is there a way of converting this in LabVIEW, or how can I get the program to interpret the figures in the correct way?

    The program doesn't interpret the figures from the text file in the correct way, since the numbers contain dots instead of commas. Is there a way to fix this in LabVIEW, or do I have to change the files before reading them in the program? Thanks beforehand!

    You must go into the LabVIEW Options menu, where you can select 'Use localized decimal point' in the Front Panel submenu (LV6i).
    If you use the "From Exponential/Fract/Eng" VI, you are able to select this option (with a boolean input) without changing the LabVIEW parameters.
    Lange Jerome
    FRANCE
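    For comparison, here is how the same dot-versus-comma problem is typically handled in a text language. A Python sketch (the helper name is mine, not from any LabVIEW VI): parse with a known, fixed decimal separator instead of whatever the system locale expects.

    ```python
    def parse_number(text, decimal_sep="."):
        """Parse a number whose decimal separator is known in advance,
        independent of the system locale settings."""
        return float(text.strip().replace(decimal_sep, "."))

    # The file uses dots, even on a comma-locale system:
    values = [parse_number(s) for s in ["3.14", "2.5", "100.0"]]

    # The same helper reads comma-separated figures when told so:
    comma_value = parse_number("3,14", decimal_sep=",")
    ```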
