Problem in data acquisition for cRIO-9076 with C Series drive interface module NI-9516

I am using LabVIEW for my project, i.e., speed control (using PID) of a motor, and want to create a VI for it.
The specifications of the products being used are as follows:
1) Motor: AKM24F (dc motor)
2) CompactRIO: cRIO-9076
3) C Series Servo Drive Interface: NI-9516
I am facing a problem with the real-time interface between the motor and the PID block in LabVIEW, specifically in the data acquisition part. Please suggest a way to acquire the analog speed data from the motor in the VI and to send the command signal back to the motor.
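For context, a minimal sketch of the closed-loop structure in question (plain Python, not LabVIEW; read_velocity and write_command are hypothetical placeholders for the drive I/O, which on a cRIO-9076 with an NI-9516 would be SoftMotion read/write nodes inside a Timed Loop):

    import time

    def pid_step(setpoint, feedback, state, kp, ki, kd, dt):
        # One PID iteration; state carries the integral and the previous error.
        error = setpoint - feedback
        state["integral"] += error * dt
        derivative = (error - state["prev_error"]) / dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative

    def run_speed_loop(read_velocity, write_command, setpoint_rpm,
                       kp=0.5, ki=0.1, kd=0.0, dt=0.01, duration=5.0):
        # read_velocity/write_command are placeholders for the drive interface I/O.
        state = {"integral": 0.0, "prev_error": 0.0}
        t_end = time.time() + duration
        while time.time() < t_end:
            feedback = read_velocity()            # velocity feedback from the drive
            command = pid_step(setpoint_rpm, feedback, state, kp, ki, kd, dt)
            write_command(command)                # velocity/torque command to the drive
            time.sleep(dt)                        # fixed loop period (Timed Loop in LabVIEW)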

What is the priority of the VI you're running? I'd be concerned that maybe you've starved out the Ethernet transmit thread or something.
-Danny

Similar Messages

  • A problem with data acquisition in LV 7.1 (transition from Traditional NI-DAQ to NI-DAQmx)

    Hi to everyone,
    I have a problem with data acquisition in LV 7.1.
    I made the transition from Traditional NI-DAQ to NI-DAQmx in my LabVIEW application.
    The problem is that when I acquire data with Traditional NI-DAQ (without writing anywhere -
    just reading), there is no scan backlog. But when I acquire data in the application whose
    acquisition is based on DAQmx, the scan backlog indicator shows numbers from 20 to 50 for
    about 6 minutes, and then that number increases quite quickly until I get an error (unable to
    acquire data. The data was overwritten).
    Acquisition settings are the same in both cases. When I acquire with DAQmx I use global
    channels. Could the reason for this phenomenon lie in the global channels' data processing? It seems
    strange that it flows quite smoothly for about 6 minutes and then gets stuck.
    Best regards,
    Ero

    If you have an old DAQ unit it may not be compatible with DAQmx. Which DAQ unit do you have? I think NI has a list showing which DAQ driver you can use with your card.
    Besides which, my opinion is that Express VIs, like Carthage, must be deleted.
    (Sorry, no LabVIEW "brag list" so far)
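    As a side note, the same keep-up principle can be sketched with the modern nidaqmx Python API (only an illustration, not LV 7.1 global channels; the device name "Dev1/ai0" and the 1 kHz rate are placeholders): read everything available on every iteration so the backlog cannot grow until the buffer is overwritten.

        import time
        import nidaqmx
        from nidaqmx.constants import AcquisitionType, READ_ALL_AVAILABLE

        with nidaqmx.Task() as task:
            task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
            task.timing.cfg_samp_clk_timing(rate=1000, sample_mode=AcquisitionType.CONTINUOUS)
            task.start()
            for _ in range(600):
                # Empty the buffer every iteration so the backlog stays near zero.
                data = task.read(number_of_samples_per_channel=READ_ALL_AVAILABLE)
                # ...process or log `data`; heavy processing belongs in another loop/thread.
                time.sleep(0.1)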

  • Real-time data acquisition for HR datasources

    Dear Experts,
    I have a couple of questions on real-time data acquisition...
    a) Can you tell me if any standard HR datasources support real-time data acquisition?
    b) Can we apply real-time data acquisition for generic datasources? If yes, is there any difference in the process compared to working with business content datasources when using real-time data acquisition?
    Hope you can provide some answers...as always, points will be awarded for answers that effectively address the questions.
    Thanks a bunch.
    K

    Hi Karthik,
    a) The decision to go for an SAP RemoteCube depends on the volume of data and the frequency of queries. It is advisable not to use the SAP RemoteCube if the data volume is high and the query frequency is also high. Which module in HR are you implementing? In HR, the data volume is generally high. So, if you go for loading the data from R/3 every 3 hours, you are safe as long as the loading volumes are low. For example, for implementing Time Management, I would not advise frequent loads, as it is a time-consuming process. So, make a decision along the lines mentioned above. If the data volume is not high, I would prefer the SAP RemoteCube, as it will reduce the effort of managing the loads.
    b)I mentioned FM extractor just for the sake of control of data selection. You can even go for view/table extractor.
    Hope this helps.
    Thanks and Regards
    Subray Hegde

  • Problem in data acquisition

    For my project I have implemented a LabVIEW program that does data acquisition. We have LabVIEW 6i installed on a PXI-8170. The DAQ card we use is the NI 6030E multifunction I/O card.
    I have implemented a VI to acquire, plot, and record multiple analog input channels. I use the AI Acquire Waveforms VI to acquire waveforms from 4 channels in a single run and the Index Array function to extract the data for each channel.
    For plotting the waveforms, I use a waveform chart for each channel. I put all these functions in a While Loop and use a stop push button to stop the acquisition. I also put another push button named Save to disk in the loop; by pressing it we can save data to a file using the Write to Spreadsheet File VI, located in a case structure outside the While Loop. I save the data as a 2D array by connecting the waveforms output of the AI Acquire Waveforms VI to the input of the Write to Spreadsheet File VI.
    But I have some problems that I cannot solve yet, which are listed below:
    1- I want to record all the data that I acquire for a specific time, but I cannot. I can only save a specific part of the waveforms, which I think is determined by the number of samples/ch and scan rate controls.
    For example, I run the acquisition for about 1:30 minutes and save it, but when I plot the data saved in the spreadsheet file, it is only the last 20 seconds. So I cannot save all the data that I try to acquire.
    2- I want to record data for a specific time automatically. I mean that when I run the VI and press the Save to disk push button, the VI should automatically start saving data for a specified period of time and then automatically stop. (How do I define a timer to control the VI?)
    3- I want to start each acquisition from a zero time base. I mean that when I do the acquisition for the first time and then want to do it a second time, the previous acquisition should be reset or cleared, and the time axis of each chart should start again from zero rather than continuing from the previous time. That means I need to clear the chart.
    For this problem I tried to use the AI Clear VI, but it didn't work. Maybe I don't know how to use it.
    Attachments:
    test4chPPG1.vi (49 KB)

    I need your hints about these 3 problems.
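    Regarding problems 1 and 2, the general pattern is to append every acquired chunk to the file inside the loop and to stop the loop after a fixed duration, rather than writing a single buffer once at the end. A minimal sketch of that pattern with the modern nidaqmx Python API (not the Traditional NI-DAQ VIs used in the post; "Dev1/ai0:3", the rate, and the file name are placeholders):

        import csv, time
        import nidaqmx
        from nidaqmx.constants import AcquisitionType

        RATE = 1000          # samples per second per channel (placeholder)
        DURATION_S = 90      # total recording time, i.e. the timed stop of problem 2

        with nidaqmx.Task() as task, open("run.csv", "w", newline="") as f:
            writer = csv.writer(f)
            task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")   # 4 channels
            task.timing.cfg_samp_clk_timing(rate=RATE, sample_mode=AcquisitionType.CONTINUOUS)
            task.start()
            t0 = time.time()
            while time.time() - t0 < DURATION_S:
                # Append ~0.5 s of data per iteration, so nothing is discarded (problem 1).
                chunk = task.read(number_of_samples_per_channel=RATE // 2)
                for row in zip(*chunk):      # one row per scan, one column per channel
                    writer.writerow(row)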

  • Data Acquisition for Position Transducer

    I just purchased an Analog Position Transducer from SpaceAge Control and was wondering if anyone had any code to gather the data. I must admit I'm lost when it comes to this; I thought the transducer would come like an Instron, where you just plug it into the computer. I have a six-pronged plug that exits the device. I have a couple of TI data acquisition boards at my disposal. I am hoping someone has used a similar product or has some advice. Thank you.
    Keith

    We have lots of code to gather data in all sorts of ways. Building measurement applications is what we do. Since you are lost when it comes to this, you would do well to hire a guide in the form of an Alliance Member company. We do this stuff every day.
    Your TI DAQ boards may do the trick, but you would have to show someone the specs for them. Also, you need to think about what kind of speed you require and where you want the position data to go. Table? Plot? Database? A good LabVIEW developer can set you up with some code that does exactly what you want.
    Daniel L. Press
    PrimeTest Corp.
    www.primetest.com

  • Problem with data exchange for a sales order from CRM to R/3

    Dear Friends:
    I did the data exchange for a sales order from CRM to R/3 today; the problem's details are as follows:
    When I save a sales order in CRM (version 5.0), it automatically generates a BDoc whose type is BUS_TRANS_MSG, but the BDoc status is always "Sent to receivers (not all have confirmed)", and the original order in CRM cannot be changed. It reports "Document is being distributed - changes are not possible", so I checked the order status analysis in detail. It presents two error messages: "Event 'BEFORE_CHANGE', attribute 'FINI': Error code for function module 'CRM_STATUS_BEFORE_COMPLETED_EC'" and "Item is not yet completed in OLTP system". So I checked the order in R/3; it has already been created, without any error messages.
    Could you tell me how to solve this? Thanks for any ideas.

    Hi Benjamin,
    When performing uploads to R/3 from CRM, there is a response from the OLTP system that is sent back to the CRM Middleware to confirm that the data records were received and processed correctly.
    Here is a checklist you can run through to verify that the connections, systems, and objects that are needed are all in place:
    <b>On R/3 system:</b>
    - Check R/3 outbound queue (transaction SMQ1) for any entries that are not reaching CRM.
    - Check that all RFC destinations on R/3 are defined correctly and are pointing to CRM
    - Check the CRMCONSUM table in R/3 to ensure CRM is registered as a consumer
    - Check the CRMRFCPAR table in R/3 to ensure that order objects are valid for exchange between R/3 and CRM
    - Check for any short dumps in R/3 (ST22/ST21)
    <b>On CRM:</b>
    - Are there entries stuck in the inbound queue (SMQ2) with R3AU* names?
    - What does the CRM Middleware Trace show (SMWT)?  Sometimes this has more detail than the specific BDoc overview (SMW01)
    - Check for short dumps in CRM (ST22)
    Let us know what else you uncover and we can work from there.
    Brad

  • Problem in data sources for transaction data through flat file

    Hello Friends,
    While creating the data sources for transaction data through a flat file, I am getting the following error: "Error 'The argument '1519,05' cannot be interpreted as a number' while assigning character to application structure", Message no. RSDS016.
    If any one come across this issue, please provide me the solution.
    Thanks in Advance.
    Regards
    Ravi

    Hello,
    just for information:
    I had the same problem.
    I changed the field type from CURR to DEC and set the format to external instead of internal.
    Then the import from the flat file worked fine.
    Thank you.
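    If changing the field definition is not an option, another workaround is to normalize the decimal commas in the flat file before loading it. A small pre-processing sketch in Python (the column index, delimiter, and file names are assumptions):

        import csv

        AMOUNT_COL = 3    # position of the amount field in the file (assumed)

        with open("input.csv", newline="", encoding="utf-8") as src, \
             open("output.csv", "w", newline="", encoding="utf-8") as dst:
            reader = csv.reader(src, delimiter=";")
            writer = csv.writer(dst, delimiter=";")
            for row in reader:
                # "1.519,05" -> "1519.05": drop thousands separators, then swap the comma.
                row[AMOUNT_COL] = row[AMOUNT_COL].replace(".", "").replace(",", ".")
                writer.writerow(row)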

  • IDOC: Problem with data filter for IDOC extension field

    Hello!
    I've created an IDoc extension for the basic type DEBMAS06 that works fine. Now I want to use a data filter for one field (company code) of my segment. Every segment with a company code different from 100 should be filtered out and not sent to the other client. But what happened is that for all customers that have at least one company code different from 100, all segments, including the one with company code 100, were deleted, and an error "Segment ... does not exist for message type DEBMAS" appeared on the screen.
    Does anyone have any ideas about this problem?

    Not sure about changes to be made at the filtering options.
    An alternative would be sending the data to XI as it is and perform the mapping to remove the unnecessary segments.
    Disadvantage: Unnecessary processing of segment would be done at XI.
    Advantage: The integration logic would be completely handled by XI.
    Regards,
    Prateek

  • Network Streaming data acquisition (PC-cRIO)

    Hello,
    I have written some code using LabVIEW's network streaming option to transfer data from the cRIO to my computer and then save this data in a .csv file. I wanted to test it but I'm confused about how to run or deploy it.
    Right now I have the code divided into two VIs, one for each endpoint. I'm assuming the writer endpoint corresponds to the cRIO and the reader to my PC.
    So my questions are:
    - Where exactly in the project explorer am I supposed to add these files? (Please check the attachment and tell me if it is correct.)
    - What do I need to deploy to the cRIO, and how do I run the whole application?
    - Can I build a UI for both VIs (writer and reader), or, because the writer is running on the cRIO, can I only see the front panel of the reader?
    Thank you very much!
    Attachments:
    project explorer.jpg (37 KB)

    It looks like you have the files in the right place. Just try to run each VI in its proper context. If you are connected to your cRIO, it will deploy and run for you.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines
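    For a rough idea of the writer/reader split, here is a conceptual stand-in in Python using plain TCP sockets (not LabVIEW Network Streams; the host, port, and file name are placeholders): the writer runs on the target and pushes one record per line, and the reader runs on the PC and appends every line to a .csv file.

        import socket

        def reader(host="0.0.0.0", port=6341, path="stream.csv"):
            # PC side: accept one connection and log every received line to a CSV file.
            with socket.create_server((host, port)) as srv, open(path, "w") as f:
                conn, _ = srv.accept()
                with conn, conn.makefile("r") as lines:
                    for line in lines:           # one "timestamp,value" record per line
                        f.write(line)

        def writer(host, port=6341, samples=()):
            # Target side: connect to the PC and send each (timestamp, value) as one line.
            with socket.create_connection((host, port)) as conn:
                for t, value in samples:
                    conn.sendall(f"{t},{value}\n".encode())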

  • Problem with data acquisition and motion control

    I have PCI-6024E and PCI-7342 cards in a single PC. Servo motion control works fine when it runs alone or is accompanied by single-point data acquisition in a while loop, but when I start data acquisition with a specified sample rate, the motor moves with interruptions.
    Does anybody know what the problem is? Is it possible to fix it?

    How are you performing your motion? Is it a position move or a velocity move? How are you triggering the data acquisition? Using breakpoints on the 7342? If so, are you using single breakpoints or modulo breakpoints? How are you configuring the data acquisition rate? Are you triggering just the start of the acquisition and then using the DAQ scan clock, or are you using the motion controller to generate the scan clock itself?

  • Query about Leaving Date and Reason for Leaving fields in an R/3 report of the HR module

    Hi Experts,
    I have a report in R/3 for HR. In the report I am getting fields like leaving date and reason for leaving.
    When I pressed F1 on the leaving date and reason fields, I found the field names FIREDATE and MGTXT.
    I checked the DD03L table for these field names and found no related tables, only some structures.
    If I create a generic DataSource that includes these fields, how will the data be obtained for them?
    I find no relevant tables for these fields; in the R/3 report, F1 shows the following:
    Text
    Leaving date
    Definition
    Date on which an employee leaves the company.
    Dependencies
    The following infotypes are taken into account when the value is created:
    P0000 - Actions
    P0001 - Organizational Assignment
    But P0000 and P0001 above are structures; how can the leaving date be populated in the generic DataSource that I may have to create?
    Thanks
    Vamsi

    The easiest way to accomplish that would be to add some logic in a custom controller (i.e., some Java code), which is a little more complex than regular personalizations.

  • Problem with date format for the ISO standard

    Oracle 8.0.4e
    When I try to execute these scripts on Oracle 8.0.4, I get different results.
    Why?
    What should I make of the ISO date standard?
    select to_char(to_date('2005/01/01','YYYY/MM/DD'),'YYYY/MM/DD') from dual;
    #2005/01/01
    select to_char(to_date('2005/01/01','YYYY/MM/DD'),'IYYY/MM/DD') from dual
    #2004/01/01
    --------------

    You'll have to look at the ISO year and ISO week at the same time; otherwise it just makes no sense.
    Example:
    select to_char(to_date('26.12.2004','dd.mm.yyyy'),'IW-IYYY') from dual;
    52-2004
    select to_char(to_date('01.01.2005','dd.mm.yyyy'),'IW-IYYY') from dual;
    53-2004
    select to_char(to_date('09.01.2005','dd.mm.yyyy'),'IW-IYYY') from dual;
    01-2005
    Now be careful: there is no such thing as an ISO month, quarter, or half-year.
    The best way to go is to make your own calendar tables, i.e., declare week 53-2004 part of your special December 2004, etc.
    The very creator of all worlds didn't think about IT problems, but he certainly thought about jobs for IT professionals ;-)
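    As an independent cross-check of the results above, Python's datetime module follows the same ISO 8601 week-numbering rules:

        from datetime import date

        # isocalendar() returns (ISO year, ISO week, ISO weekday).
        for d in (date(2004, 12, 26), date(2005, 1, 1), date(2005, 1, 9)):
            iso_year, iso_week, _ = d.isocalendar()
            print(d, "->", f"{iso_week:02d}-{iso_year}")
        # 2004-12-26 -> 52-2004
        # 2005-01-01 -> 53-2004   (Jan 1, 2005 still belongs to ISO year 2004)
        # 2005-01-09 -> 01-2005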

  • Problem in Data extraction for NEW GL DSO 0FIGL_O10

    Hi ,
    I am facing a problem in the extraction of records from SAP to BW.
    I have installed the Business Content for the New GL DSO 0FIGL_O10.
    When I extract the data from SAP R/3 to this DSO (0FIGL_O10), the records are getting overwritten.
    For example, when I go to the Manage option (InfoProvider administration), the transferred records and the added records are not the same; the added records are fewer than the transferred records.
    This is happening because of the key field definitions.
    I have 16 characteristics in the key field area, which is the maximum I can have, but in some cases the incoming data is unique only on more than these 16 characteristics.
    As a result, the data gets added up in the DSO, and hence my balances do not match SAP R/3 for the G/L accounts.
    There are a total of 31 characteristics in the DataSource (0FI_GL_10), of which I can include 16 in the key field area.
    Please suggest some solution.
    Regards,
    Nilesh Labde

    Hi,
    For safety, the delta process uses a lower interval setting of one hour (this is the default setting). In this way, the system always transfers all postings made between one hour before the last delta upload and the current time. The overlap of the first hour of a delta upload causes any records that are extracted twice to be overwritten by the after image process in the ODS object with the MOVE update. This ensures 100% data consistency in BW.
    But you can achieve your objective in a different manner:
    Make a custom InfoObject ZDISTINCT and populate it in the transformation using ABAP code. In the ABAP routine, try to compound (concatenate) the values from different characteristics so that one compounded characteristic is made. Use ZDISTINCT in your DSO as a key.
    Just a thought; maybe it can solve your problem.
    Ravish.
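    For illustration only (Python rather than the ABAP transformation routine described above, and the field names are assumptions), the compounding idea is simply to join several characteristic values into one string so that a single key field can stand in for many:

        def build_zdistinct(record, fields=("doc_number", "item", "fiscal_period", "ledger")):
            # Join the chosen characteristic values with a separator into one key value.
            return "/".join(str(record[f]) for f in fields)

        row = {"doc_number": "4900001234", "item": "001", "fiscal_period": "2011007", "ledger": "0L"}
        print(build_zdistinct(row))   # 4900001234/001/2011007/0L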

  • Sync data acquisition for Arduino and cDAQ

    Hello,
    I am using a LabVIEW cDAQ system with thermocouple (NI 9214) and digital input (NI 9411) modules, as well as an Arduino Uno, to read a few signals and do some data processing. I would like a 0.2-0.5 Hz sampling frequency such that all the data collected is synced to the same timestamp. I have about 30 thermocouples (2 NI 9214 units), 2 TTL signals (approx. 1 Hz frequency), and the Arduino is acquiring an analog voltage signal.
    Any examples or guidance would be appreciated.
    Kind Regards,
    Anna

    Hello Anna,
    Just to make sure I understand correctly: you wish to acquire all data (analog, digital, thermocouple) at 0.2-0.5 Hz and have each data point taken along with a timestamp at the same point in time? In other words, at 0.5 Hz, you will read 30 TC measurements, 1 point of each TTL signal, and one point of the analog signal every 2 seconds, together with the associated timestamp for that acquisition?
    Since the timing is relatively slow, you would likely be able to do this fairly well in software alone.  The following Community Example...
    https://decibel.ni.com/content/docs/DOC-9543
    ...uses the producer/consumer architecture to collect Analog Waveform data in one loop (you would acquire analog and digital), and then break it out in the consumer loop to write a timestamp (from the waveform) to the first column of a 2D array, and the corresponding data points to the subsequent columns, which you could then write to a file, display in a table, etc.
    Regards,
    National Instruments
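    A minimal sketch of that producer/consumer pattern in Python (read_thermocouples, read_ttl, and read_arduino are hypothetical placeholders for the actual cDAQ and Arduino reads): one shared timestamp is taken per slow-rate scan, and each row is written with the timestamp first and the data columns after.

        import csv, queue, threading, time

        samples = queue.Queue()

        def producer(read_thermocouples, read_ttl, read_arduino, period_s=2.0, n=100):
            for _ in range(n):
                t = time.time()                              # one shared timestamp per scan
                row = [t] + read_thermocouples() + read_ttl() + [read_arduino()]
                samples.put(row)
                time.sleep(period_s)
            samples.put(None)                                # sentinel: acquisition finished

        def consumer(path="log.csv"):
            with open(path, "w", newline="") as f:
                writer = csv.writer(f)
                while (row := samples.get()) is not None:
                    writer.writerow(row)                     # timestamp first, data columns after

        # threading.Thread(target=producer, args=(tc_read, ttl_read, ai_read)).start()
        # threading.Thread(target=consumer).start()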

  • Problem setting data collection for SLD

    Good day
    I have the following situation.
    A Solution Manager 7.1 installation, just recently patched to SP11. Previously I had the SLD on this system. A colleague then informed me that this could be a problem in the future, as my ECC systems are on Basis 731, which is a higher level than the 702 on the Solution Manager system. So I installed a separate Java stack system on the same server as the Solution Manager.
    The problem I am faced with now is that I can't get the data collection (RZ70) to connect with the new Java system that I have installed. When I go into the SLD and Administration, I see under Details > Data Suppliers that the server is specified as the gateway host, with a gateway port of 3303. On the Solution Manager system it specifies the gateway host as "localhost" and the gateway port as sapgw00.
    So, from RZ70 I specify the gateway service as sapgw03 and then I get this result:
    0: DSIMSID001_SID_00 : Collection of SLD data finished
    0: DSIMSID001_SID_00 : Data collected successfully
    0: DSIMSID001_SID_00 : RFC data prepared
    0: DSIMSID001_SID_00 : Used RFC destination: SLD_NUC
    0: DSIMSID001_SID_00 : RFC call failed: Error when opening an RFC connection (CPIC-CALL: 'ThSAPOCMINIT' : cmRc=17 thRc=2
    0: DSIMSID001_SID_00 : Batch job not scheduled
    And from the SLD_NUC entry in SM59 I get
    Logon    Connection Error
    Error Details    Error when opening an RFC connection (CPIC-CALL: 'ThSAPOCMINIT' : cmRc=17 thRc=2
    Error Details    ERROR: SAP gateway communication error (Is SAP gateway closed?)
    Error Details    LOCATION: SAP-Server DSIMSID001_SID_00 on host DSIMSID001 (wp 1)
    Error Details    COMPONENT: CPIC
    Error Details    COUNTER: 47
    Error Details    MODULE:
    Error Details    LINE:
    Error Details    RETURN CODE: 239
    Error Details    SUBRC: 0
    Error Details    RELEASE: 721
    Error Details    TIME: Mon Mar 16 14:04:05 2015
    Error Details    VERSION:
    Can anyone give me a few ideas about what may be wrong here, please?
    Regards
    Ray Phillips

    Hi Ray,
    Please check whether the SLD configuration below has been done. You can use either of the two links below, depending on how you want to configure your SLD.
    Configure SLD for JCo and Creation of JCo Destinations - Portal - SCN Wiki
    or
    How to Create & Configure Local SLD on SAP NetWeaver 7.3 AS Java after Installation
    If your SLD server is not a local SLD, then make sure that the host file entries are present in /etc/hosts and /etc/services on both the SLD system and the remote system.
