Using LabView, Arduino, and Easy-Driver to control a Stepper Motor

Hello fair folks of the LabView forums!
I am a humble engineering student in need of some advice and input on a project I am working on.
I’ve also done a fair amount of reading on the LabView forums, where people have tried to use LabView and Arduino to control stepper motors, and I’ve used as much information from each thread as I could. However, the problem I'm encountering now seems to be unique to my VI configuration.
So, here is what has been done thus far:
-Installed LabView Interface for Arduino [LIFA]
-Installed the necessary VISA drivers for LabView to communicate with the Arduino
-Uploaded firmware to Arduino to allow it to communicate with LabView
-Properly wired Stepper Motor with Easy Driver and Power Supply; I have verified this by first using the Arduino independently of LabView, using example code found here.
-I have created a VI and have just recently gotten it to run without reporting any errors. While I suspect the behavior I’m encountering has to do with something inside the VI, it is difficult to determine, as there are no errors being reported!
But here is the general problem I am encountering:
-Connect Joystick to USB Port, connect Arduino to USB Port.
-Plug in outlet for Power Supply
-Open LabView VI
-Run LabView VI
-LabView successfully detects the Joystick and the Arduino.
-Tilt the Joystick; the Stepper Motor moves in the proper direction, but it takes only a single step. If the joystick is held, the motor does not move further. If it is returned to zero and then tilted again, the motor again takes only a single step.
I know that in the example code, the Arduino issued the step command by writing the signal from LOW to HIGH, and varied the speed by how often it did this.
I assumed that the Write-PWM feature would do this same thing, but perhaps my assumption is wrong. I will continue to tinker with this myself, but I would be extremely grateful for any insight you might be able to lend.
I’m thinking that, if not the Arduino Write-PWM feature, perhaps a simple timing sequence could be used to alternate between writing 0 and 1, with the period of the sequence scaled to the X-axis value from the Joystick. But I am open to suggestions, and certainly appreciate any thoughts you may have to offer!
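To make that idea concrete, here is a rough sketch of the timing loop I have in mind, written as a plain Arduino program rather than a VI (the pin numbers are assumptions, and in my actual setup the joystick value would arrive from LabView over serial instead of analogRead):

```cpp
// Sketch of the step-pulse timing idea (not my actual VI). Assumes the
// Easy Driver STEP and DIR inputs are wired to pins 9 and 8.
const int stepPin = 9;   // Easy Driver STEP input
const int dirPin  = 8;   // Easy Driver DIR input

void setup() {
  pinMode(stepPin, OUTPUT);
  pinMode(dirPin, OUTPUT);
}

void loop() {
  // Stand-in for the joystick X-axis; in my setup this number would come
  // from LabView over the serial link instead of an analog pin.
  int x = analogRead(A0) - 512;          // roughly -512..511, 0 = centered

  if (abs(x) > 20) {                     // small dead band around center
    digitalWrite(dirPin, x > 0 ? HIGH : LOW);

    // Larger deflection -> shorter half-period -> faster stepping.
    unsigned int halfPeriodUs = map(abs(x), 20, 512, 4000, 400);

    digitalWrite(stepPin, HIGH);         // the driver steps on the rising edge
    delayMicroseconds(halfPeriodUs);
    digitalWrite(stepPin, LOW);
    delayMicroseconds(halfPeriodUs);
  }
}
```

The key point, as I understand it, is that the Easy Driver moves the motor one step per LOW-to-HIGH transition on STEP, so a line held at a constant level produces at most one step no matter how long it is held.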
Attached are the VI used in this setup, a picture of said VI, and a rough sketch of the hardware configuration.
Thank you!
Attachments:
Arduino Stepper Control.vi ‏1224 KB
VI Picture.png ‏82 KB
Configuration Sketch.png ‏522 KB

Hi danjifraga,
I am not so familiar with the Arduino toolkit functions, but you may have better luck looking at the Arduino page at:
www.ni.com/arduino
I'll ask around on Monday to see if anyone is more familiar with the toolkit.
Good luck,
Brian
Brian G.

Similar Messages

  • Problems with Serial Communication using Labview 6 and Solaris 8

    I am working on a driver for a temperature controller, but I am stuck at the very basics. I am using LabVIEW 6 and the platform is Solaris 8 on a Sun Ultra 60 workstation. I cannot get the serial communication to work. When I am running raw (uncompiled) code it works (I can read from and write to ttya and ttyb), but once compiled I get error code 37 (device not found). I have tried the following steps to fix this, with no luck:
    1) I made sure that the "serpdrv" file is in the same folder as the executable. I also made sure the serpdrv file is added as a support file when building the app.
    2) I changed from using the traditional serial VIs to LabVIEW 6's new VISA functions. With these "new" VIs, when I try to initialize the VISA device and wire a control to the "visa reference" input, only one serial port shows up (ASRL2; ASRL1 is missing). I am not sure if this is part of the same problem or a whole new issue.
    3) I reinstalled both VISA and the LabVIEW 6.0.2 update, hoping this would help, with no luck.
    4) I placed the following entry into the ".labviewrc" file:
    labview.serialdevices: "/dev/ttya:/dev/ttyb"
    If anybody has had the same problem I would love to hear about it, along with any solutions you may have found.
    Jamie Shea

    Hi Jamie,
    1. Do you have the NI-VISA driver installed on the machine on which you are running this executable? If you are trying to run the executable on the same machine on which the development program ran fine, then you can ignore this point.
    2. If you have made all the changes suggested by other discussions related to this topic, then try changing the Port input to VISA Serial Configure.vi from a control to a constant. In some cases, I have seen this do the trick. I think this point should solve your problem; if it does, do tell me. :-))

  • How can I control 3 stepper motors at the same time?

    I want to control 3 stepper motors and 30 electro-valves.
    I only had a few lessons in college on how to control one in C.
    Now I need to do this in LabVIEW using a board from NI.
    How can I control the three motors at the same time?
    What is the best board to do this? Maybe the NI PCI-6034?
    But will I need more than one?

    At least some of the NI motion controller boards have up to 32 digital lines which can be set to input or output. I did not write the software for our systems, but we use quite a few of the digital lines and have no problems reading the digital inputs or setting the digital outputs. The digital outputs also offer a high current-sink capability and can usually drive external drives without any interface circuits.
    The digital lines are available on a separate connector of the NI boards, and break-out boxes (with screw terminals) are also available. I do not see any serious problem doing your control tasks with just one NI motion controller board, as long as you do not have any special requirements (high sampling speed, high-frequency pulse output, etc.) for the digital lines.

  • What's the type of control for a stepper motor?

    Hi,
    I'm using a PXI-7358 controller, a UMI-7774, an Industrial Device NextStep third-party microstepping drive, a SANYO DENKI stepper motor (type 103-8932-6421, NEMA 42), and an incremental encoder with 3600 ppr (14400 counts/rev). I want to know which type of onboard control NI uses to control the motor's step position. In the case of a servo motor the onboard control is a PID; in the case of a stepper motor, what type of control is used? Which MAX parameters should be set for the project's specifics (overshoot, settling time, rise time, etc.)? What do "Pull-in Window" and "Pull-in Tries" mean in MAX? Finally, is it possible to control the stepper motor with a user's own control algorithm, excluding the onboard control?
    Thanks for your patience,
    Best regards
    Lorenzo

    >
    Matt wrote:
    > Go to SE24.  Type in cl_dd_document and press enter.  Select the methods tab.  Look for the method "CONSTRUCTOR".  Double click on it.   Click on SIGNATURE button.  The types of the parameters are clearly seen.
    >
    > matt
    TYPE and VALUES OF TYPE are different things. For example, TYPE C is CHAR; VALUES of this type are A, B, C, ..., 1, 2, 3.
    Thus, to return to the question:
    TYPES: sdydo_attribute(50) TYPE c
    TYPE: sdydo_attribute
    VALUES: ???
    Maybe the value ABRAKADABRA is correct?

  • How can I control 3 stepper motors w/ amplifier by sending TTL pulses from DIO96

    We have an NI PCI-DIO-96 board and we have 3 stepper motors with amplifiers and encoders. We want to control the motors by sending TTL pulses directly from the DIO-96 board to the amplifiers. Is this possible? What would you recommend if this is not a good approach? Thanks in advance.
    Roman Zeylikovich

    Roman,
    Thank you for contacting National Instruments. While using a motion controller would be the recommended approach for any type of motion application, you may be able to use your DIO board to generate a TTL pulse train to control the step and direction inputs of your drive and motor. You will need to make sure that the current sinking and sourcing specification for the DIO-96 fits your stepper motor. That board is not designed to source very much current at all so this is one issue you will need to verify. Also, this board does not have any counter/timer logic that can handle quadrature encoder inputs.
    Again, this digital device is probably not the best solution to control a stepper motor but, depending on your hardware, it could be configured to work properly. The PCI-7334 is a low-cost stepper motion controller that can control up to 4 axes and is designed to easily accommodate these types of applications. You can browse through more information on our motion controller boards at the following website:
    http://sine.ni.com/apps/we/nioc.vp?cid=3809&lang=US
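    To make the step/direction idea concrete, a minimal software-timed sketch might look like the one below. Note that write_dio_line is a hypothetical stand-in for whatever digital-output call your driver software provides for the DIO-96 (it is not a real NI API), and the timing numbers are only examples:

    ```cpp
    #include <chrono>
    #include <cstdio>
    #include <thread>

    // Hypothetical stand-in for the board's digital-output write call; a real
    // program would call the appropriate NI driver function here instead.
    void write_dio_line(int line, bool level) {
        std::printf("line %d -> %d\n", line, level ? 1 : 0);
    }

    // Move 'steps' steps by bit-banging STEP/DIR lines: set the direction once,
    // then produce one LOW->HIGH->LOW pulse per step at the requested rate.
    void move_stepper(int dir_line, int step_line, bool forward, int steps,
                      std::chrono::microseconds half_period)
    {
        write_dio_line(dir_line, forward);            // direction first
        std::this_thread::sleep_for(half_period);     // give DIR time to settle

        for (int i = 0; i < steps; ++i) {
            write_dio_line(step_line, true);          // rising edge = one step
            std::this_thread::sleep_for(half_period);
            write_dio_line(step_line, false);
            std::this_thread::sleep_for(half_period);
        }
    }

    int main() {
        // Example: 200 steps (one revolution of a 1.8-degree motor) forward at
        // a 2 ms half-period, i.e. roughly 250 steps per second.
        move_stepper(0, 1, true, 200, std::chrono::milliseconds(2));
    }
    ```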
    Let us know if you have any more questions or comments.
    Regards,
    Michael
    Applications Engineer
    National Instruments

  • Complete hardware simulation using LabView, Multisim, and MAX (easy answer accepted!)

    Hello, all!
    Sorry, I'm new, and I've checked around for a definitive answer on this, but I'm not 100% sure. I'm learning LabView for an upper-division Physics class. We're using NI hardware (DAQmx) and a mix of lab hardware - primarily basic stuff such as voltmeters, oscilloscopes, and breadboards with simple components. I'm also doing some work with NIM instrumentation, but that's secondary to my needs here. So, when I'm away from school, is it possible to do a complete simulation of my classwork using LabView, Multisim (for my breadboard), and Measurement & Automation Explorer (for the DAQmx device)? I know that I can create a circuit and drop it into LabView, but I'm not so sure about the DAQ. I'm hoping for something that's a "seamless" recreation of what I'm doing in class. I can take a simple "yes" or "no"; as long as I know that it's possible, I can look for the solution.
    Thanks for the help!
    Solved!
    Go to Solution.

    I have an easy answer and a harder answer:
    Easy Answer
    You can simulate almost any NI instrument in MAX.  Right click the Devices and Interfaces item in the left pane and select Create New...  Unfortunately, this limits you to whatever the driver designer thought would be a good signal.  It is good for simple testing, but you will rapidly run into its limitations.
    Harder Answer
    Use LabVIEW classes to make a hardware abstraction layer. This may seem like an advanced topic for a beginning LabVIEW programmer, and it is, but it also neatly solves the problem of switching between simulation VIs and real acquisition VIs without writing a bunch of switching code. In short, you create a LabVIEW class which has the interface you want for your data acquisition. This can be your simulation code. You then create a child class which has exactly the same interface, but uses the DAQmx/NI-SCOPE/NI-DMM/etc. driver that you really want to use. Switching between the two is as simple as selecting the class you want to use at run time. This is a lot of info in a short time. If you want to go this route, read the LabVIEW help on LabVIEW classes and work through the examples. I would encourage you to do this, since the sooner you learn how to effectively use object-oriented LabVIEW, the easier your life will be.
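    LabVIEW classes are graphical, so there is no text code to paste here, but the pattern is the familiar one from other object-oriented languages. Purely as an analogy (all of the names below are invented, and it is C++ rather than LabVIEW), the hardware abstraction layer amounts to something like this:

    ```cpp
    #include <cmath>
    #include <cstdio>
    #include <memory>
    #include <vector>

    // Parent class: the interface the rest of the application programs against.
    class Acquisition {
    public:
        virtual ~Acquisition() = default;
        virtual std::vector<double> read(int samples) = 0;
    };

    // Simulation implementation: generates a synthetic sine wave.
    class SimulatedAcquisition : public Acquisition {
    public:
        std::vector<double> read(int samples) override {
            std::vector<double> data(samples);
            for (int i = 0; i < samples; ++i)
                data[i] = std::sin(2.0 * 3.141592653589793 * i / samples);
            return data;
        }
    };

    // Real-hardware implementation: same interface, but it would wrap the
    // DAQmx/NI-SCOPE/etc. driver calls instead of synthesizing data.
    class HardwareAcquisition : public Acquisition {
    public:
        std::vector<double> read(int samples) override {
            // Real driver calls would go here.
            return std::vector<double>(samples, 0.0);
        }
    };

    int main() {
        bool simulate = true;   // the only place the choice is made
        std::unique_ptr<Acquisition> daq =
            simulate ? std::unique_ptr<Acquisition>(new SimulatedAcquisition())
                     : std::unique_ptr<Acquisition>(new HardwareAcquisition());

        for (double v : daq->read(8)) std::printf("%f\n", v);
    }
    ```

    The application only ever talks to the parent interface, so swapping simulation for real hardware is a single run-time choice, which is exactly what the LabVIEW class hierarchy buys you.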
    As a further bit of information, most of the measurement instrument drivers (e.g. NI-SCOPE, NI-FGEN, NI-DMM, etc.) allow you to simulate an instrument if you use the Open With Options VI instead of the standard Open VI. The inputs are rather arcane, but they get the job done.
    Let us know if you have further questions.
    This account is no longer active. Contact ShadesOfGray for current posts and information.

  • I use LabView 6i and the Dialog Tab Control, but I can't put colors on it; has anyone tried this before?

    The Dialog Tab Control works very well, but I could not paint it with the Front Panel tools or through its properties. Does anyone know about this? Thanks!

    THANK YOU!!! I have spent hours searching for the answer to this question. I didn't realize that there were two different tab controls! This is just another example of why I use NI products and software. Where else will you get this kind of product support?
    Dave Green
    Process Systems

  • Need Expert's Advice - How to use LabView Efficiently and to increase Readability

    My application is fairly complex. It is a real world testing applications that simultaneously controls 16 servo motors running various stress testing routines asynchronously and all at the same time. The application includes queues, state machines, sub VI's, dynamically launched VI's, subpanels, semaphores, XML files, ini files, global variables, shared variables, physical analog and digital interfaces and industrial networking. Just about every technique and trick that LabView 2010 has to offer and the kitchen sink as well.
    Still I am not happy with the productivity that LabVIEW 2010 has provided, nor the readability of my final product.
    Sometimes there are too many wires. Many of my state machines have a dozen or more wires just going from input to output, doing nothing, just because one or two states in the machine need that variable. Yeah, I could spend a lot of time bundling and unbundling and rebundling those values, but I don't think that would improve things much.
    We have had a long discussion about the use or misuse of local variables in this forum and I don't want to repeat that here. I use them sparingly, where I think it is relatively safe to do so. I also hit a bug whenever I try to copy some code that contains one or more local variables. On pasting the code with local variables, the result is something other than what I expected, though I am not sure exactly what. I have to undo the paste and rebuild the code one object at a time.
    I am also having trouble using Variable Property Nodes. When I cut and paste them, they often lose their reference object and I have to go back into the code and redo the Link To on each one. That wastes a lot of time and effort.
    Creating subVIs is often not appropriate when the code makes many references to objects on the Front Panel. Some simple code will turn into a bunch of object references and dereferences, which also tends to take a lot of work to clean up and often does not help overall readability. I use subVIs when appropriate, but because of the interface overhead, not as often as I would like to. My application already has over 150 subVIs.
    The LabView Clean Up Diagram function often works poorly. It leaves way too much empty space between objects, making my diagrams 3 to 4 24-inch screens wide. That is way too much, and it is difficult to navigate effectively. The Clean Up function puts objects in strange places relative to other objects used nearby. It does a poor job of routing wires and often makes deciphering diagrams more difficult rather than easier.
    My troubleshooting strategies don't work well for large diagrams and complex applications. The diagrams are so complex that execution highlighting may take 20 minutes for a single pass. Probes help, but breakpoints aren't of much use, because single-stepping afterwards often takes you somewhere else in the same diagram. I can't follow the logic well doing this.
    Using structures, I may have Case structures nested 5 to 10 levels deep inside some Event Structure inside a While Loop. Difficult to work with and not very readable.
    All in all, I can make it work, but I am not happy about the end result.
    I am hoping to benefit from some expert advice from those who are experienced in producing large, complex applications efficiently, debugging efficiently, and producing readable diagrams that they are proud of.
    Can anyone offer their advice on how best to use the LabView features to achieve these results in complex applications? I hope that you can help show me the light.

    I'm not an expert but I'm charged out as one at work.
    I am off today, so I'll share some thoughts that may help, or possibly inspire others to chime in. I have tried to continually improve my code in those areas and would greatly welcome others sharing their approaches and insights.
    Note:
    I do refactoring services to help customers in this situation. What I will write does not represent what we do in a code review, since our final deliverable is a complete final design, and that is beyond the scope of this reply.
    I'll comment on your points.
    dbaechtel wrote:
    My application is fairly complex. ...
    While watching slow-motion replays of Olympic figure skating, I learned how subtleties in the way the launching skate is planted while entering a jump can make the difference between a good jump and a bad one.
    In software, we plant our foot when we turn from design to development. I have to admit that there were a couple of times when I moved from design to development too early and found myself in a situation like the one you have described.
    How do you know when the design is done?
    Waterfall says "cross every 't' and dot every 'i'", Agile says "code now, worry about design later", and Bottom-up says "the demo works, why bother designing?" (Please feel free to comment on these over-simplifications, gang.)
    My answer is not much more helpful for those new to LabVIEW.
    My design work is done when my design diagrams are more complicated than the LabVIEW diagrams they describe.
    dbaechtel wrote:
     simultaneously controls 16 servo motors running various stress testing routines asynchronously and all at the same time. The application includes ...and the kitchen sink as well.
    Have you posted any design documents you have? These will help greatly in letting us understand your application. More on diagrams later.
    Any time I see multiple "variations on a theme", I think LVOOP (LabVIEW OOP). I'll spare you the LVOOP sales pitch, but I will testify that once you get your first class cloned off and running as a sibling (or child), you'll appreciate how nice it is to be able to use LVOOP.
    Disclaimer:
    If you don't already have an OOP frame of mind, the learning curve will be steep.
    dbaechtel wrote:
    Still I am not happy with the productivity that LabVIEW 2010 has provided, nor the readability of my final product.
    Sometimes there are too many wires....going from input to output, doing nothing,... spend alot of time bundling and unbundling and rebundling those values, but I don't think that would improve things much.
     Full disclaimer:
    I used to be of the same opinion and even used performance arguments to make my point. I have since changed my mind.
    Let me illustrate (hopefully). This link (if it works for you, use the left-hand pane to navigate the hierarchy) shows an app I wrote about 10 years ago, in my early days of routing wires. Even the "Main" VI started to suffer from too many wires, as this preview from that link shows.
    Clustering related data values using Type Definitions is the first method I would urge. This makes it easier to find the VIs that use the type def via Browse Relationships >> Callers. If I implement my code correctly, any problem I believe is associated with a particular piece of data that is a type def has to be in one of the VIs that use that type def, which makes it easier to maintain.
    When I wrote "related data" I am referring to data normalization rules (which my wife knows and I picked up from her; I claim no expertise in this area), where only values that are used together are grouped. E.g., a cluster named File contains "Path" and "Refnum" but not "PhaseOfMoon". This works out nicely when first creating sub-VIs, since all of the data related to file operations is right there when I need it, and it leads into the next concept ...
    When I look at a value in a shift register on the diagram, taking up space, that is only used in a small subset of states, I consider using an Action Engine. This moves the wire from the current diagram into the Action Engine (AE) and cleans up the diagram. The AE brings with it built-in protection, so provided I keep all of the operations related to the type def inside the AE, I am protected when I start using multiple threads that need that data (trust me, it may not make a difference now, but end users are clever). So that extra wire is effectively encapsulated and abstracted away from the diagram you are looking at.
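    Type defs and Action Engines are graphical constructs, so the following is only a rough analogy in C++ (every name below is invented), but it shows the two ideas together: a type-defined cluster of related values, and a single routine that owns that state and exposes a fixed set of actions:

    ```cpp
    #include <cstdio>
    #include <mutex>
    #include <string>

    // Analogy for a type-defined cluster: only values that are used together.
    struct FileInfo {
        std::string path;
        int         refnum = -1;
    };

    enum class FileAction { Open, Write, Close };

    // Analogy for an Action Engine / functional global: the state lives in a
    // static variable (the uninitialized shift register) and every operation
    // goes through this one function (the case structure), serialized by a lock.
    void file_engine(FileAction action, const std::string& arg = "")
    {
        static FileInfo   state;     // persists between calls
        static std::mutex lock;      // AE-style "one caller at a time"
        std::lock_guard<std::mutex> guard(lock);

        switch (action) {
        case FileAction::Open:
            state.path   = arg;
            state.refnum = 3;        // stand-in for a real open
            break;
        case FileAction::Write:
            std::printf("write '%s' to %s\n", arg.c_str(), state.path.c_str());
            break;
        case FileAction::Close:
            state = FileInfo{};
            break;
        }
    }

    int main() {
        file_engine(FileAction::Open,  "C:\\data\\log.txt");
        file_engine(FileAction::Write, "hello");
        file_engine(FileAction::Close);
    }
    ```

    The wire that used to be dragged through every state now lives inside the engine, and because every access goes through the same serialized routine, parallel callers cannot race on it.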
    But I said earlier that I would not sell LVOOP, so I'll show you what LVOOP-based LV apps look like, to contrast with what I was doing ten years ago in that earlier link. This is what the top-level VI looks like.
    And this is the Analysis mode of that app.
    I suppose I should not mention that LVOOP has wizards that automatically create the sub-VIs (accessors) that bundle/unbundle the clusters, should I?
    Continuing...
    dbaechtel wrote:
    We have had a long discussion about the use or misuse of Local variables...I also have a bug whenever I try and copy some code...
    If you can simplify the code and duplicate the bug, please do so. We can get it logged and fixed.
    dbaechtel wrote:
    I am also having trouble using trouble using Variable Property Nodes....
    That sounds like a usage issue. Posting code to illustrate the process will let us take a shot at figuring out what is happening.
     dbaechtel wrote:
    Creating subVIs is often not appropriate... My application already has over 150 sub VIs.
    "Back in the day..." LV would not even try to create a sub-VI that involved controls/indicators. I use sub-VIs to maintain a common GUI often but I do it on purpose and when I find myself creating a sub-VI that involves a control/indicator, I hit ctrl-z immediately! 
    I figure a way around it (AE ?) and then do the sub-VI.
    Judging by your brief explanation and assuming you do a LVOOP implementation, I would estimate that app need 750-1500 VIs. 
     dbaechtel wrote:
    The LabView Clean Up Diagram function often works poorly.... 
    The clean-up works fine for how I use it. After throwing together "scratch code" and debugging the "rat's nest", I'll hit clean-up as a first step. It guesses well enough on simple diagrams, and in some cases it inspires me to structure the diagram in a way I may not have thought about. If I don't like it, ctrl-z.
    Good design and modular implementation lead to smaller diagrams that just don't need three screens.
     dbaechtel wrote:
    My troubleshooting strategies don't work well for large diagrams and complex applications....Can anyone offer their advice on how best to use the LabView features to achieve these results in complex applications? I hope that you can help show me the light.
    Smaller diagrams single-step faster, since the sub-VIs run at full speed. I cringe thinking about a 3-screen diagram with multiple probes open (shiver!).
    Re: nested structures
    Sub-VIs (wink, wink, nudge, nudge).
    If it works, you have proven the concept is possible. That is the first step in an application.
    I hope that gives you some ideas.
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Using Labview speaker and microphone with multisim

    I've just installed the student version of Circuit Design Suite
    10.0.  When I try to use the supplied Labview speaker and microphone all of
    the controls are greyed out.  I'm able to use the Labview signal analyzer
    and the signal generator just fine.
    I'm using Windows XP Pro and have a 2.4 GHz Pentium 4 processor with 1.5 GB
    of RAM.
    Is there something I can do to allow me to use the speaker and
    microphone?
    Rich Green

    The speaker and microphone are not real-time components; you need to give them time to acquire the data, and only when they finish will the button become active. For this feature to work, you need an actual microphone connected to your machine; if you talk into it, it will capture your voice and can send it out through your computer speaker.
    Tien P.
    National Instruments

  • Hi, I'm new to Labview... How do I export data to a spreadsheet from an Agilent 8510C using Labview 6 and the VI from the NI site?

    Hi, I'm trying to export data to a spreadsheet from an Agilent 8510c to a laptop(Win XP) using a PCMCIA-GPIB card, Labview 6 and the VI from the NI site. How do I do this? Ideally I'd like the Magnitude and Phase Data, as pairs for each frequency point, but the Real and Imaginary pairs can work just as well. Any help would be greatly appreciated.

    When you say the "VI from the NI site", I'm going to assume that you mean the HP8510C driver. If that's the case, then VI called HP8510C Application Example should be your starting point. It returns two 1D arrays - real raw data and imaginary raw data. These can be combined with a Build Array function and the resulting 2D array can be wired to the Write to Spreadsheet File function. This will create a tab or comma separated file that Excel or any other spreadsheet program can import. The very simplest program would look something like the attached picture. You may need to modify the instrument program to suit your requirements but this should get you started.
    Attachments:
    HP8510C_and_Spreadsheet_Write.jpg ‏5 KB

  • Is it possible to use the parallel port to control a stepper motor (compumotor s6-drive)?

    I'm using a Compumotor S6 drive and I was wondering if I could somehow use Labview to program the parallel port to send the required step and forward/backward signals to the controller and motor. All I need it to do is go back and forth at a user-defined rate. Considering I know very little about Labview, this is a daunting task indeed.

    Hello Tiano!
    I found an example on the www.ni.com website named "Reading from the Bi-directional Parallel Port."
    Here is the link: http://sine.ni.com/apps/we/niepd_web_display.DISPLAY_EPD4?p_guid=B123AE0CBA4C111EE034080020E74861&p_node=DZ52058&p_submitted=N&p_rank=&p_answer=&p_source=External
    Hope this and the other documents help you along your way!
    BB_Phil

  • How can I control a stepper motor drive using a DAQ card?

    I need to control a simple CW/CCW stepper motor drive using a DAQ card. I simply need to output a 5 volt signal and then a drop in the signal, and so on, 200 times (or steps) for 1 revolution. I have designed the basic program, but do not know how to output the appropriate signal.
    Cheers,
    Matt

    What DAQ card do you have? Ideally, one with a counter output to generate the steps, where you can control the frequency. If you try to start the motor too fast (with high acceleration or an abrupt application of fast pulses) it may stall. Search in Help >> Find Examples for 'frequency' and 'pulse' for examples of frequency generation. You can also use a digital output on the DAQ card to control direction.
    If the DAQ card does not have a counter, you could use a software-timed loop to toggle a digital output and generate the pulse train. It may not be fast or accurate enough for your application; I cannot tell from the info you have given.
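    As a rough feel for the rates involved (plain arithmetic, with the target speed picked only as an example): 200 steps per revolution at 60 RPM is 200 pulses per second, i.e. 5 ms per step, which is already close to the limit of what a software-timed loop on a desktop OS can toggle reliably:

    ```cpp
    #include <cstdio>

    int main() {
        const double steps_per_rev = 200.0;   // from the question above
        const double rpm           = 60.0;    // example target speed, not a given

        double steps_per_sec = steps_per_rev * rpm / 60.0;  // pulse rate in Hz
        double period_ms     = 1000.0 / steps_per_sec;      // time per full step

        std::printf("pulse rate: %.0f Hz, period per step: %.1f ms\n",
                    steps_per_sec, period_ms);
    }
    ```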
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
    "It’s the questions that drive us.”
    ~~~~~~~~~~~~~~~~~~~~~~~~~~

  • How do I use both SSD and HDD drives on a MacBook Pro together

    I have a mid-2012 MacBook Pro 15 inch. I replaced the HDD with an SSD and put the HDD in the optical bay. I was wondering what the best way to use both together is.

    Use your SSD for your system, applications, working projects, and as a scratch disk (if you're editing photos or video, for example), and use the HD to store your user files (Documents) and infrequently accessed files.
    The only disadvantage that I see with a setup like yours is backups. You can use clones to back up both drives, and Time Machine to - at least - back up your boot drive, but you're going to run the risk of losing data if you don't clone your HD. That's a minimum of two clones for one computer (I shouldn't be the one to talk about excessive clones, though, as I keep three clones of my SSD, one clone of my 2TB 'working' drive, and two Time Machine backups of my SSD).
    But that's the way I'd treat the drive in the opti-bay - just as an 'internal' external drive.
    Clinton

  • Process Failure when communicating over MODBUS using LabVIEW 2011 and DSC

    I'm currently trying to read from a PLC's holding registers using MODBUS/TCP. I've confirmed that the PLC is updating the values and responding to MODBUS communication correctly using a third party program called Modbus Poll. However, when I try to poll the PLC using LabVIEW's shared variable engine, I am unable to read any values from the same addresses that I'm viewing with Modbus Poll.
    My setup simply consists of a PC connected directly to the PLC over Ethernet, with no router in between. I am using LabVIEW 2011 SP1 with the DSC module.
    I opened the NI Distributed Systems Manager to view the status of all shared variables in the Modbus library that I created and I've noticed that the CommFail bit is permanently set to "true". All other variables with a "read" access mode report "Process Failure". I've tried restarting the process as well as stopping and starting the local variable engine with no success. I've also restarted my computer several times to see if any services were failing, but this does not seem to have fixed the problem.
    I finally resorted to monitoring communications over the network card that I have the PLC plugged into via Ethernet using Wireshark and I've found that while Modbus Poll is communicating with the PLC, many MODBUS and TCP packets are sent and received. However, when solely using LabVIEW or the NI DSM to communicate with the PLC, there does not appear to be any communication over the network card.
    Something that may be worth noting is that I was able to communicate with the PLC and read values from it with the DSM on just one occasion, when I first figured out which addresses I should be reading from. It all stopped working shortly thereafter. Prior to this, "CommFail" was not usually set to "true" with my current configuration. Thinking that it was my firewall, I have since turned my firewall off, but this seems to have had no effect on the problem either.
    Any help on this matter would be appreciated.
    Solved!
    Go to Solution.

    Just a thought, but I think the register addresses used by LabVIEW are one off from the actual register number. I was using a cRIO as a Modbus I/O Server and had to shift the register addresses by 1 to get things to work correctly (I can't recall if it was +1 or -1; the Modbus protocol addresses registers from zero, so, for example, holding register 40001 goes out on the wire as address 0, which is a classic source of off-by-one confusion). This is documented somewhere on ni.com, but I can't seem to find it now. But here is another link that may help:
    http://zone.ni.com/reference/en-XX/help/371618E-01/lvmve/dsc_modbus_using/
    Dan

  • Restrictions for using sql commands and operators in loader control file

    Hi ,
    I suppose that there are a lot of restrictions and limitations when using SQL commands and operators in loader control files; for example, it seems I cannot use OR with a CASE statement, and it also seems there is a certain length limit for the CASE.
    So guys, what are the common limitations and restrictions to be avoided in the loader control file?
    Your efforts are highly appreciated
    Ash

    Hi Ash,
    if you need to do more complicated logic, it's better to define the file to be loaded as an external table. You can then use any SQL function you like against the external table rather than messing around with what you can and can't do in a sqlldr control file.
    You can use the external_table option of sqlldr to generate the definition.
    Regards,
    Harry
    http://dbaharrison.blogspot.com/
