Sample data using SFTP

Hi all,
I am using jsch-1.0.28.jar and connecting to a server over the SFTP protocol with the methods in com.jcraft.jsch.ChannelSftp. With this class and its methods I am able to connect, get a remote file to the local system, list files, and so on.
But I only need a sample of the data from this file; I am not interested in downloading the whole file.
How do I make this possible?
If we use SCP or FTP instead, we have a provision to run a command on the shell that is opened, and we can read the sample data from the resulting output stream.
But for SFTP there is no method which runs an arbitrary command; there are only methods which execute specific operations and give us the result.
Can anybody help me read a sample of the data from the remote system using SFTP?
Thanks,
Vijay Sunder

VijaySunder wrote:
But I only need a sample of the data from this file; I am not interested in downloading the whole file. How do I make this possible?

If you are using SFTP, it is not possible.
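That said, if what is needed is only the first part of the file rather than a literal remote command, one workaround worth trying is the ChannelSftp.get(String) overload that returns an InputStream: read the first few kilobytes and then close the stream, so only that much of the file is transferred. Below is a minimal sketch, assuming your JSch version provides this overload; the host, credentials, remote path, and 4 KB sample size are placeholders, not values from the original post.

import java.io.InputStream;
import java.util.Arrays;
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class SftpSampleRead {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- replace with your own.
        JSch jsch = new JSch();
        Session session = jsch.getSession("user", "host", 22);
        session.setPassword("password");
        session.setConfig("StrictHostKeyChecking", "no"); // for illustration only
        session.connect();

        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect();
        try {
            // get(String) returns an InputStream over the remote file;
            // reading only the first bytes avoids pulling down the whole file.
            InputStream in = sftp.get("/remote/path/to/file.txt");
            try {
                byte[] buffer = new byte[4096];   // sample size: first 4 KB
                int read = in.read(buffer);
                if (read > 0) {
                    System.out.println(new String(Arrays.copyOf(buffer, read)));
                }
            } finally {
                in.close();                       // stop the transfer here
            }
        } finally {
            sftp.disconnect();
            session.disconnect();
        }
    }
}

Alternatively, since an SSH session is already available, a ChannelExec running a command such as head -c 4096 on the remote file gives a similar result, if executing shell commands is permitted on that server.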

Similar Messages

  • Sample Data Used in Personnel Subarea and Employee Subgroup

    Hi HR Guru,
    Can anyone give me some sample values you have used for Personnel Subarea and Employee Subgroup? Our company is looking into an HR implementation, and we just wondered how other SAP customers have configured these fields.
    Thanks!
    RL

    Hi Reiling,
    There are plenty of threads here that describe the concept of personnel area/subarea and employee group/subgroup. You can also check the SAP documentation.
    Based on that and your organization's setup, you should be able to chalk out the personnel structure.
    Donnie

  • Sending empty files using SFTP Adapter

    I am trying to send empty files using the SFTP adapter. The interface has to send the file whether it is empty or contains data. I am using BizTalk Server 2013 R2. Is this a bug, or is there already a hotfix for this issue?

    The issue here is not that your SFTP send cannot handle 0 KB files, but the file receive adapter that is receiving the file. The file adapter deletes 0 KB files and does not transmit them further.
    If you have an FTP receive location, for example, you should be able to send 0 KB files.
    If you have a custom file receive adapter, is it handling 0 KB files?
    Regards
    When you see answers and helpful posts, please click Vote As Helpful, Propose As Answer, and/or Mark As Answer.

  • How can I quickly sample data from 200 channels using DAQmx?

    Hello,
    I am trying to sample thermocouple data using an SCXI-1000 chassis with two SCXI-1100 modules and one SCXI-1102B, all of which have a 1300 isothermal terminal block. The computer I am using has a PCI-6052E DAQ card and 1 GB of RAM. I will eventually need to upgrade to an SCXI-1001 chassis with enough modules to collect about 175-200 different signals. The problem is that I am testing with just the 96 channels I have available and the program runs very slowly, with a maximum sample rate of only 0.026 Hz. Currently I have all the channels being read by one DAQmx VI. I've looked through the example programs and tutorials, but nothing there deals with large numbers of channels. Is there a way to improve the sample rate to around 1 Hz with this many channels?
    Thanks in advance

    Check the filter configuration on your SCXI-1100. The product page says that if you set the filter to 4 Hz, you'll only get 3 samples/second. The slowest module in the chassis determines the sample rate. If one of your modules is configured that way, you'll get a maximum rate of approximately 3 Hz / 96 channels = 0.031 Hz per channel, which is pretty close to what you've reported. The filter is set by a jumper on the module; see the manual under "Filter Selection."
    You might want to look at "Determining Maximum Scan Rate for Multiple SCXI Modules" as well.

  • Using logic to decide whether or not to bring in (a sample) data.

    I have a report that works like this:
    It's broken into two different queries that are joined by data links. The first query grabs the most recent data (using the max date) based on the GROUP BY at the bottom of the query.
    select max(ops$penlims.get_colldate(b.sample_Id)) fdate,
    b.user_sampleid fuid, a.method fmethod, ops$Penlims.get_collpt_consult(b.sample_id) as ftype
    from nais_tasks a, nais_samples b
    where a.sample_id = b.sample_id
    and ops$penlims.get_colldate(b.sample_id) between :inStartDate AND :inEndDate
    and b.user_sampleid like :well
    and ops$penlims.get_collpt_consult(b.sample_id) like :pType
    and ops$penlims.get_collpt_consult(b.sample_id) not like '%EFF%'
    and a.condition in ('APPROVED')
    and not a.method in ('BALANCE PROGRAM','528 (VERSION 1)')
    and a.method not like '300%ANALY%ER%'
    group by b.user_sampleid, ops$penlims.get_colldate(b.sample_Id), a.method, ops$Penlims.get_collpt_consult(b.sample_id)
    order by user_sampleid, ops$penlims.get_colldate(b.sample_Id), a.method, ops$penlims.get_collpt_consult(b.sample_id)
    The second query is joined by a data link to the first query and is primarily just the part of the report that brings in the data to populate the report.
    Here is the second query:
    select distinct a.sample_id SSID, a.user_sampleid SUID, c.number_value, b.operation,
    ops$penlims.get_colldate(a.sample_id) SDATE, ops$penlims.get_collpt_consult(a.sample_id) STYPE, b.method SMETHOD,
    b.task_id STID, c.units sunits, b.condition TCond, c.condition RCond,
    ops$penlims.get_colltime(a.sample_id) STIME,
    c.component,
    ops$penlims.get_conc(result_id) gconc,
    ops$penlims.get_collpt_consult(a.sample_id) gcollpt
    from nais_results c, nais_tasks b, nais_samples a
    where a.sample_id = b.sample_id
    and b.task_id = c.task_id
    AND ops$penlims.get_colldate(a.sample_id) between :inStartDate AND :inEndDate
    and user_sampleid like :well
    and ops$penlims.get_collpt_consult(a.sample_id) LIKE :pType
    and c.condition in ('APPROVED')
    and b.condition in ('APPROVED')
    and ops$penlims.get_collpt(a.sample_id) LIKE '%PROD%'
    and lower (component) not like '%std%'
    My problem is this: my boss is happy with the report in that it brings in the most recent (ops$penlims.get_collpt_consult) sample type for each well (user_sampleid) and groups/orders them correctly. Today he asked: if he passes a parameter such as %RAW% into :pType and the report returns both RAW/HOP and STATION RAW for different dates (each the max date for its respective sample type), is there a way to take the RAW/HOP sample over the STATION RAW when both exist for that particular well (user_sampleid), and keep the STATION RAW sample in the report only when no RAW/HOP sample exists? I can clarify any questions, and I could probably write a cursor to do this. Is there a faster way than writing a cursor, maybe a slight logic or program-unit modification I could make?
    Thanks for even reading this far and thanks for ANY input!!
    Mike Mullowney

    I was just reading a thread about dpi on the DW forum, and I can also speak from experience: it was a long thread, and several posts ran tests showing that dpi does not really make a big difference, whether it's 72 or 300 dpi.
    Dpi (dots per inch) generally comes into play with print. Some argued that 72 versus 300 used up more bandwidth, but I did not see any significant difference when I tested the same graphic pixel size at 72 dpi and at 300 dpi. The number of pixels in an image (ppi) is what matters for monitor display.
    For me, in most situations, dynamic or embedded, I decide which way to go by how many pictures I need to show.
    Another solution I have used is to create your slide show using Flash video, which will buffer and can use many photos.
    Hope any of this helps.
    INVT

  • Need Sample Code in C# to Insert, Update, and Query Data Using W 2.0 WSDL

    Hi,
    Can anyone please share sample code in C# to insert, update, and query data using the W 2.0 WSDL?
    Thanks in advance

    I have found the solution.
    You need to add the following line for non-string data types:
    objOutreachUpdateList.Opportunity[0].IndexedNumber0Specified = true;

  • Time doesn't match sampled data?

    Hello all experts,
    I wrote LabVIEW code that reads data from a USB-6211 and saves it, together with time instants, to a text file, but the time instants don't correspond to the sampled data. The time values are generated by the Elapsed Time VI; after a Build Array with the data read from the DAQ, they are fed to Write to Text File. The test signal is 10 Hz, but the text file yields a 0.2 Hz signal. How can I synchronize them?
    Any tips are highly appreciated.
    win2

    Don't use the "elapsed time" express VI for precision timings. It seems to have limited resolution (internally, it converts a timestamp to DBL).
    You can use e.g. the tick count to keep track of the time. See the attached comparison. (There will still always be some subtle differences due to the software timing.)
    LabVIEW Champion. Do more with less code and in less time.
    Attachments:
    usb6211_forumMOD.vi ‏42 KB

  • Creating a sample report using JAVA SDK

    Hi,
    I am trying to create a sample report using the Java SDK.
    I select 4 "free cells" and pass 4 different strings to them.
    I also set the font colour and size. When I run the class and try to view the report in InfoView, I only see blank blocks without any data. If I then edit the report from InfoView and save the changes, I am able to see the data.
    My issue is: why am I not able to see the data when I run the Java code?
    Please find the code below.
    package com;
    import java.awt.Color;
    import java.io.FileOutputStream;
    import java.util.ArrayList;
    import java.util.Date;
    import java.util.List;
    import com.businessobjects.rebean.wi.BinaryView;
    import com.businessobjects.rebean.wi.DataProvider;
    import com.businessobjects.rebean.wi.DataProviders;
    import com.businessobjects.rebean.wi.DataSource;
    import com.businessobjects.rebean.wi.DataSourceObject;
    import com.businessobjects.rebean.wi.DocumentInstance;
    import com.businessobjects.rebean.wi.DocumentLocaleType;
    import com.businessobjects.rebean.wi.FontImpl;
    import com.businessobjects.rebean.wi.FreeCell;
    import com.businessobjects.rebean.wi.HTMLView;
    import com.businessobjects.rebean.wi.OutputFormatType;
    import com.businessobjects.rebean.wi.PageHeaderFooter;
    import com.businessobjects.rebean.wi.Query;
    import com.businessobjects.rebean.wi.Recordset;
    import com.businessobjects.rebean.wi.Report;
    import com.businessobjects.rebean.wi.ReportBody;
    import com.businessobjects.rebean.wi.ReportCell;
    import com.businessobjects.rebean.wi.ReportContainer;
    import com.businessobjects.rebean.wi.ReportElement;
    import com.businessobjects.rebean.wi.ReportEngine;
    import com.crystaldecisions.sdk.framework.CrystalEnterprise;
    import com.crystaldecisions.sdk.framework.IEnterpriseSession;
    import com.crystaldecisions.sdk.framework.ISessionMgr;
    import com.crystaldecisions.sdk.occa.infostore.IInfoObject;
    import com.crystaldecisions.sdk.occa.infostore.IInfoObjects;
    import com.crystaldecisions.sdk.occa.infostore.IInfoStore;
    import com.crystaldecisions.sdk.plugin.CeKind;
    public class Aug7th {
        /**
         * @param args
         */
        public static void main(String[] args) {
            // TODO Auto-generated method stub
            String CMS = "pundl8136:6400";
            String userID = "srivas";
            String password = "morcom123";
            String auth = "secEnterprise";
            List<String> entire = new ArrayList<String>();
            List<String> country = new ArrayList<String>();
            List<String> resort = new ArrayList<String>();
            IEnterpriseSession enterpriseSession;
            try {
                ISessionMgr mySessionMgr = CrystalEnterprise.getSessionMgr();
                enterpriseSession = mySessionMgr.logon(userID, password, CMS, auth);
                if (enterpriseSession != null) {
                    // Create and store useful objects for the session.
                    IInfoStore iStore = (IInfoStore) enterpriseSession.getService("InfoStore");
                    ReportEngine reportEngine = (ReportEngine) enterpriseSession.getService("WebiReportEngine");
                    IInfoObject infoView = null;
                    String str = "SELECT SI_ID, SI_NAME, SI_PARENTID FROM CI_INFOOBJECTS WHERE (SI_KIND = '" + CeKind.WEBI + "' OR SI_KIND='FullClient') " +
                        "AND SI_INSTANCE = 'false' AND SI_NAME='Structure Test_001_Java' ORDER BY SI_NAME ASC ";
                    //String str = "SELECT SI_ID, SI_NAME, SI_PARENTID FROM CI_INFOOBJECTS ORDER BY SI_NAME ASC ";
                    IInfoObjects objInfoObjectsWIDs = (IInfoObjects) iStore.query(str);
                    System.out.println(objInfoObjectsWIDs.size());
                    IInfoObject objInfoObjectWID = (IInfoObject) objInfoObjectsWIDs.get(0);
                    DocumentInstance doc = reportEngine.openDocument(objInfoObjectWID.getID());
                    DataProviders dps = doc.getDataProviders();
                    // Retrieve the 1st data provider
                    DataProvider dp = dps.getItem(0);
                    // Retrieve the universe objects
                    DataSource ds = dp.getDataSource();
                    Query q = dp.getQuery();
                    // 0: assume the query has one flow
                    Recordset rs = dp.getResult(0);
                    rs.first();
                    // Print the column types. They can be Integer, String, or Date.
                    for (int i = 0; i < rs.getColumnCount(); i++) {
                        Class c = rs.getColumnType(i);
                        StringBuffer sbt = new StringBuffer();
                        if (c.equals(Integer.class))
                            sbt.append("Integer");
                        if (c.equals(String.class))
                            sbt.append("String");
                        if (c.equals(Date.class))
                            sbt.append("Date");
                        sbt.append(";");
                        System.out.println(sbt.toString());
                    }
                    System.out.println(rs.getColumnCount());
                    while (!rs.isLast()) {
                        // column names
                        StringBuffer sbn = new StringBuffer();
                        StringBuffer sbd = new StringBuffer();
                        for (int j = 0; j < rs.getColumnCount(); j++) {
                            sbn.append(rs.getColumnName(j).toString());
                            sbn.append(";");
                        }
                        System.out.println("sbn " + sbn.toString());
                        // data
                        for (int k = 0; k < rs.getColumnCount(); k++) {
                            sbd.append(rs.getCellObject(k).toString());
                            sbd.append(";");
                            entire.add(rs.getCellObject(k).toString());
                        }
                        System.out.println("sbd " + sbd.toString());
                        rs.next();
                    }
                    System.out.println(entire.size());
                    // Split the flat result list into alternating country/resort values.
                    for (int i = 0; i < entire.size(); i++) {
                        country.add(entire.get(i));
                        i++;
                        System.out.println("entireList " + entire.get(i));
                        resort.add(entire.get(i));
                    }
                    DataSourceObject city = ds.getClasses().getChildByName("Country");
                    DataSourceObject resorts = ds.getClasses().getChildAt(1);
                    dp.runQuery();
                    ReportContainer report = doc.createReport("Resort");
                    PageHeaderFooter header = report.getPageHeader();
                    FreeCell headerCell = header.createFreeCell("Resort Report");
                    PageHeaderFooter footer = report.getPageFooter();
                    FreeCell footerCell = footer.createFreeCell("Report Ends");
                    ReportBody body = report.createReportBody();
                    for (int k = 0; k < resort.size(); k++) {
                        FreeCell res = body.createFreeCell(resort.get(k));
                        res.getAttachTo();
                        res.setHeight(15d);
                        res.setWidth(30d);
                        Color c = new Color(255, 255, 255);
                        Color c1 = new Color(255, 0, 0);
                        FontImpl fnt = (FontImpl) res.getFont();
                        fnt.getDecoration().setTextColor(c1);
                        res.setFont(fnt);
                        //res.deleteAttachment();
                        //res.setAttachTo(body,VAnchorType.BOTTOM,HAnchorType.NONE);
                    }
                    doc.applyFormat();
                    doc.refresh();
                    final String l_docToken = doc.getStorageToken();
                    final DocumentInstance l_docToSave = reportEngine.getDocumentFromStorageToken(l_docToken);
                    doc.saveAs("mor31", 835, null, null);
                    doc.closeDocument();
                    str = "SELECT SI_ID, SI_NAME, SI_PARENTID FROM CI_INFOOBJECTS WHERE (SI_KIND = '" + CeKind.WEBI + "' OR SI_KIND='FullClient') " +
                        "AND SI_INSTANCE = 'false' AND SI_NAME='mor31' ORDER BY SI_NAME ASC ";
                    objInfoObjectsWIDs = (IInfoObjects) iStore.query(str);
                    System.out.println(objInfoObjectsWIDs.size());
                    objInfoObjectWID = (IInfoObject) objInfoObjectsWIDs.get(0);
                    DocumentInstance doc1 = reportEngine.openDocument(objInfoObjectWID.getID());
                    String token = doc1.getStorageToken();
                    DocumentInstance doc2 = reportEngine.getDocumentFromStorageToken(token);
                    doc2.saveAs("123123", 835, null, null);
                    // doc.refresh();
                    // doc.save();
                }
                enterpriseSession.logoff();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    duplicate post:
    Sample report using JAVA SDK

  • How can you tell how fast you are sampling when using digital inputs to the PCI6014

    I am currently sampling information via the digital inputs of the PCI-6014. However, in order to perform an FFT, and even to use the "Amplitude and Phase Spectrum" VI, I need to know my sampling rate so that I can wire this constant to dt. I have attached the VI that I created as well as some sample data. Can you please help me configure it so that I obtain a correct display of amplitude and phase? Thanks.
    Attachments:
    DISPLAY3 ‏40 KB
    proper_sampling_at_1Hz.txt ‏1 KB

    Hello,
    Thanks for contacting National Instruments.
    I found some information that I feel will help you to calculate the sampling rate, so that you'll be able to use this constant in your VI. Please see the link below.
    http://digital.ni.com/public.nsf/websearch/5782F1B396474BAF86256A1D00572D6E?OpenDocument
    I hope this helps you. Please let me know if you need any further support. Have a great day!
    Regards,
    Joe Des Rosier
    Applications Engineering

  • How to renormalize number of flows in Netflow Sampled data

    Hi,
    I am working on extrapolation (renormalization) of bytes/packets/flows from randomly sampled (1 out of N packets) collected data. I believe bytes/packets can be renormalized by multiplying the bytes/packets value in an exported flow record by N.
    Now I am trying to extrapolate the number of flows. So far I have not found any information on it. Do you have any idea how flows can be renormalized from sampled data?
    At the same time, I have some doubts regarding this concept altogether:
    1. In packet sampling, we do not know how many flows got dropped. Even the router cache will not have entries for dropped flows.
    2. In flow sampling, the router cache will maintain entries for all the flows, and there may be some way to know how many actual flows there were. But again, there is no way to know the values of individual attributes of missed flows, such as srcip/dstip/srcport/dstport (even though they are in the flow cache).
    3. In the case of sampling (1 out of N packets), we multiply #packets and #bytes by N to arrive at an estimate of the total packets and bytes. When we multiply by N, it means we have also accounted for the packets that were NOT sampled. So all the packets that flowed between source and destination have been accounted for; then there are no missed flows, are there? And if there do exist some missed flows, then multiplying by N to extrapolate the number of packets/bytes is not correct.
    4. What is the use of the count of flows anyway? The number of flows may vary depending upon configuration such as the active timeout, so it does not provide any information about the actual traffic between source and destination, unlike the number of packets and bytes.
    Please share your thoughts.
    Thanks,
    Deepak
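    To make the byte/packet extrapolation described above concrete, here is a minimal sketch in plain Java (class and field names are hypothetical, not from any collector API) that scales the counters of a 1-out-of-N sampled flow record by N. It deliberately does not attempt to recover the number of missed flows, which is the open question in this thread.
    // Minimal sketch: renormalize byte/packet counters from 1-out-of-N sampled
    // flow records by multiplying by the sampling interval N.
    public class SampledFlowRecord {
        final long packets;   // packets counted in the exported (sampled) record
        final long bytes;     // bytes counted in the exported (sampled) record

        SampledFlowRecord(long packets, long bytes) {
            this.packets = packets;
            this.bytes = bytes;
        }

        /** Estimate the original traffic volume represented by this record. */
        SampledFlowRecord renormalize(int samplingIntervalN) {
            return new SampledFlowRecord(packets * samplingIntervalN, bytes * samplingIntervalN);
        }

        public static void main(String[] args) {
            // A record exported under 1-out-of-100 packet sampling.
            SampledFlowRecord sampled = new SampledFlowRecord(12, 9000);
            SampledFlowRecord estimated = sampled.renormalize(100);
            System.out.println("Estimated packets: " + estimated.packets); // 1200
            System.out.println("Estimated bytes: " + estimated.bytes);     // 900000
            // The number of flows cannot be recovered the same way: flows whose
            // packets were never sampled leave no exported record at all.
        }
    }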

    The simplest way is to call GetTableCellRangeValues with VAL_ENTIRE_TABLE as the range, then sum the array elements.
    But I don't understand your comment on the checksum, so this may not be the most appropriate method for your actual needs: can you explain what you mean?
    Proud to use LW/CVI from 3.1 on.
    My contributions to the Developer Zone Community
    If I have helped you, why not give me a kudos?

  • I'm doing a scan around a line by sampling data 360 degrees for every value of z. How should I plot the data?

    I'm doing a scan around a line by sampling data 360 degrees for every value of z (z is the position on the line). That means I have a double for-loop in which I collect the data. The problem comes when I try to plot the data. How should I do it?

    Jonas,
    I think what you want is a 3D plot of a cylinder. I have attached an example using a parametric 3D plot.
    You will probably want to duplicate the points for the first theta value to close the cylinder. I'm not sure what properties of the graph can be manipulated to make it easier to see.
    Bruce
    Bruce Ammons
    Ammons Engineering
    Attachments:
    Cylinder_Plot_3D.vi ‏76 KB

  • For this sample data, how do I fulfill my requirement?

    For this sample data, how do I fulfill my requirement?
    with temp as
    (
    select 'MON' WEEKDAY,'9-10' TIMING,'I' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'9-10' TIMING,'II' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'9-10' TIMING,'III' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'10-11' TIMING,'I' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'10-11' TIMING,'II' CLASS FROM DUAL UNION
    select 'TUE' WEEKDAY,'9-10' TIMING,'I' CLASS FROM DUAL UNION
    select 'TUE' WEEKDAY,'9-10' TIMING,'II' CLASS FROM DUAL
    )
    select ?? (what will be the query ??)
    How can I get output data in this way:
    WEEKDAY  TIMING  CLASS
    MON      9-10    I,II,III
    MON      10-11   I,II
    TUE      9-10    I,II

    If you are on 11g, you can use LISTAGG:
    with temp as
    (
    select 'MON' WEEKDAY,'9-10' TIMING,'I' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'9-10' TIMING,'II' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'9-10' TIMING,'III' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'10-11' TIMING,'I' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'10-11' TIMING,'II' CLASS FROM DUAL UNION
    select 'TUE' WEEKDAY,'9-10' TIMING,'I' CLASS FROM DUAL UNION
    select 'TUE' WEEKDAY,'9-10' TIMING,'II' CLASS FROM DUAL
    )
    select
    WEEKDAY,
    TIMING,
    LISTAGG(CLASS,',') WITHIN GROUP (order by CLASS) as class_aggregate
    from temp
    GROUP by WEEKDAY,TIMING;
    WEEKDAY       TIMING     CLASS_AGGREGATE
    MON           9-10       I,II,III
    MON           10-11      I,II
    TUE           9-10       I,II
    Other techniques for different versions are also mentioned here:
    http://www.oracle-base.com/articles/misc/StringAggregationTechniques.php#listagg

  • BPC 5 - Best practices - Sample data file for Legal Consolidation

    Hi,
    we are following the steps indicated in the SAP BPC Best Practice: http://help.sap.com/bp_bpcv151/html/bpc.htm
    A Legal Consolidation prerequisite is the sample data file that we do not have: "Consolidation Finance Data.xls"
    Does anybody have this file or know where to find it?
    Thanks for your time!
    Regards,
    Santiago

    Hi,
    From this address [https://websmp230.sap-ag.de/sap/bc/bsp/spn/download_basket/download.htm?objid=012002523100012218702007E&action=DL_DIRECT] you can obtain a .zip file for the Best Practice, including all scenarios and the CSV files used in those scenarios under the misc directory.
    Consolidation Finance Data.txt is in there as well.
    Regards,
    ergin ozturk

  • Is it possible to read digital data using an external clock (PCI-6259 M)?

    I'm using an NI PCI-6259 M Series card and trying to write my program in VC++ 6.0 using the functions in the DAQmx driver.
    Question 1: Not all functions listed in the NI-DAQmx C Reference Help seem to be supported by my NI card. Where can I find information about which functions are supported?
    Question 2: I want to read data from a device that clocks out data on the falling edge of a clock signal. The clock signal and the data signal are routed to two DIO terminals on the NI card. The question is whether it is possible to read the data using that clock as a sample clock. See the two code examples below, which don't work: in both cases 10 samples are read at once, even when the external clock is not present.
    Example 1
    // Create tasks
    Status = DAQmxCreateTask("", &m_ReadTrimTask);
    // Set up read task
    status = DAQmxCreateDIChan(m_ReadTrimTask, "Dev1/port2/line0", "", DAQmx_Val_ChanPerLine);
    status = DAQmxCfgChangeDetectionTiming(m_ReadTrimTask,"Dev1/port2/line6","Dev1/port2/line6",DAQmx_Val_FiniteSamps, 10);
    // Read data
    int32 sampsPerChanRead, numBytesPerSamp;
    status = DAQmxReadDigitalLines(m_ReadTrimTask, 10, 10.0, DAQmx_Val_GroupByChannel, result, 10, &sampsPerChanRead, &numBytesPerSamp ,NULL);
    Example 2
    // Create tasks
    Status = DAQmxCreateTask("", &m_ReadTrimTask);
    // Set up read task
    status = DAQmxCreateDIChan(m_ReadTrimTask, "Dev1/port2/line0", "", DAQmx_Val_ChanPerLine);
    status = DAQmxSetSampTimingType(m_ReadTrimTask, DAQmx_Val_SampClk);
    status = DAQmxSetSampClkRate(m_ReadTrimTask, 1000.0);
    status = DAQmxSetSampClkActiveEdge(m_ReadTrimTask, DAQmx_Val_Falling);
    status = DAQmxSetSampClkSrc(m_ReadTrimTask, " Dev1/port2/line6");
    // Read data
    int32 sampsPerChanRead, numBytesPerSamp;
    status = DAQmxReadDigitalLines(m_ReadTrimTask, 10, 10.0, DAQmx_Val_GroupByChannel, result, 10, &sampsPerChanRead, &numBytesPerSamp ,NULL);

    Hello Magnus,
    Thank you for contacting National Instruments.
    "Question1: Not all functions listed in the NI-DAQmx C Reference Help seems to be supported by my NI-card, where can I find information about which of the functions that are supported?"
    The best place to look for this information would be the M Series Help Manual. There you can find the features of your PCI-6259 and what operations it supports.
    "Question2: I want to read data from a device that clock out data on the falling edge of a clock signal. The clock signal and the data signal are routed to two DIO terminals on the NI-card. The question is if it is possible to read data using the clock as a sample clock? See two code examples below that doesn’t work. In both cases 10 samples are read at once, even if the external clock is not present."
    Look at the "ContReadDigChan-ExtClk_Fn.c" example project which ships with the NI-DAQ driver. This is located at: C:\Program Files\National Instruments\NI-DAQ\Examples\DAQmx ANSI C\Digital\Read Values\Cont Read Dig Chan-Ext Clk.
    You will have to make some minor modifications to convert this to a finite acquisition, but that is simply a matter of changing the "sampleMode" parameter of the DAQmxCfgSampClkTiming() function. You will also have to route your clock signal to a PFI line and specify which line in your code.
    I hope this helps.
    Sean C.
    Applications Engineering
    National Instruments

  • Problem Loading Data Using UTL File Package

    Hi Friends,
    My database is Oracle 10gR2 and the OS is Windows.
    I have one Excel (CSV) file which contains 10 fields.
    My requirement is to load data from this one file into two tables:
    master and detail.
    Below are sample data and the structures for them.
    Excel file format
    TEST.CSV
    Srno  Empno  Empname  City       Challanno  Challandate  Materialno  Materialname  Materialqty  Materialcost
    1     232    raj      Hyderabad  533        20/04/2010   11          abc           34           10
    1     232    raj      Hyderabad  533        20/04/2010   12          aa            4            110
    1     231    ram      Baroda     533        20/04/2010   14          abcd          33           210
    Master table
    empno
    Challanno
    challandate
    Detail table
    empno
    materialno
    materialname
    materialqty
    materialcost
    My question is: while reading the first line, if its empno is new, then a record is entered in the master table for the first time, and the remaining records for the same empno are entered in the detail table. When the empno changes, a new entry is made in the master table and the associated records are entered in the detail table.
    So in this case, for empno 232 the master table would have
    232,533,20/4/2010
    and the detail table for empno 232 would have 2 records:
    232,11,abc,34,10
    232,12,aa,4,110
    I am using the UTL_FILE package to achieve this, as the file is on the server.
    Kindly help me to proceed with this.
    I really appreciate your help.

    sai121 wrote:
    It's ok if you don't want to reply, sir... but that's what I'm told to do, and I have 4 years of industry experience too... but you can't argue with the boss, you know that, right?

    It's not that people don't want to reply; it's that what you're asking for is something achieved very simply in a few lines of code using external tables, but it is a convoluted and complex thing to do using UTL_FILE. So why would anybody want to waste their time giving you a load of code to achieve what you want when they know it's the wrong approach anyway?
    I've been computer programming for 28 years (jeez, has it really been that long :D), so I wouldn't be telling you that you're doing it the wrong way without knowing that there are better ways to do it and that you're asking for the wrong way. Speak with your boss; tell him that you've been recommended to use external tables instead, because they're the right way to read such data and UTL_FILE is not the right approach.
