Java servlet: how to store a large result set across multiple requests in a web session

Hi, I am writing a Java servlet to process some large data.
Here is the process:
1) the user submits a query,
2) the servlet returns a lot of results for the user to make a selection,
3) the user submits their selections (with checkboxes),
4) the servlet sends back the complete selected items in a file.
The part I have trouble with (I am new to servlets) is how to store the results ArrayList (or Vector) after step 2 so I don't need to re-run the search in step 4.
I think a session may be helpful here, but from what I read in the tutorials, a session seems to store only small items rather than large datasets. Is it possible for a session to store a large dataset? Can you point me to an example or provide some example code?
Thanks for your attention.
Mike

I don't know your setup - do you connect to a database and store the result set there?
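An HttpSession can hold any Java object, including a large ArrayList; the practical limits are server heap memory (and session-replication cost in a cluster) rather than the API itself, so for very large result sets people often cache only keys/IDs or page through the database instead. Below is a minimal sketch of the four-step flow; runSearch() and the "item" checkbox parameter are made-up placeholders, not part of any real API:

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

public class SearchServlet extends HttpServlet {

    // Step 2: run the query once and park the full result list in the session.
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        List<String> results = runSearch(req.getParameter("query")); // placeholder search
        HttpSession session = req.getSession(true);
        session.setAttribute("searchResults", results);
        // ... render the checkbox form from 'results' here ...
    }

    // Step 4: pull the cached list back out and return only the selected rows.
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        @SuppressWarnings("unchecked")
        List<String> results = (List<String>) req.getSession().getAttribute("searchResults");
        if (results == null) {
            resp.sendError(HttpServletResponse.SC_GONE, "Session expired - please search again");
            return;
        }
        String[] selected = req.getParameterValues("item"); // checkbox values = list indexes
        if (selected == null) {
            selected = new String[0];
        }
        resp.setContentType("text/plain");
        for (String idx : selected) {
            resp.getWriter().println(results.get(Integer.parseInt(idx)));
        }
        req.getSession().removeAttribute("searchResults"); // free the memory when done
    }

    private List<String> runSearch(String query) {
        return new ArrayList<String>(); // stand-in for the real search
    }
}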

Similar Messages

• How to store the data in a .txt file into a JTable in Java Swing

    Hi sir,
    Here i want to know how to store the data's of a .txt file
    in to a JTable in java swing.
    Where here the .txt file like for eg,spooler.txt is in the server and from there it will come to my client machine what i have to do is to take that .txt file and store the datas of the .txt file
    in a JTable.This is what i want.So pls. do help and provide the code as well.I will be thankful.Since i am involved in a project which involves this it is Urgent.
    Thanx,
    m.ananthu

    You can't just display data from a text file in a JTable. You have to understand the structure of the data so that you can parse it and create a table model that can access the data in a row/column format. Here is an example of a simple program that does this:
    http://forum.java.sun.com/thread.jsp?forum=57&thread=315172
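    For reference, here is a minimal sketch of that idea, assuming spooler.txt is tab-delimited with a header row (the file name and delimiter are assumptions - adjust the split for the real format):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import javax.swing.JFrame;
    import javax.swing.JScrollPane;
    import javax.swing.JTable;
    import javax.swing.table.DefaultTableModel;

    public class TextFileTable {
        public static void main(String[] args) throws IOException {
            List<String[]> rows = new ArrayList<String[]>();
            BufferedReader reader = new BufferedReader(new FileReader("spooler.txt"));
            String line;
            while ((line = reader.readLine()) != null) {
                rows.add(line.split("\t"));       // parse each line into columns
            }
            reader.close();

            String[] header = rows.remove(0);      // first line holds the column names
            DefaultTableModel model = new DefaultTableModel(rows.toArray(new Object[0][]), header);

            JFrame frame = new JFrame("spooler.txt");
            frame.add(new JScrollPane(new JTable(model)));
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.pack();
            frame.setVisible(true);
        }
    }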

• How to store XML data into a file in XML format through a Java program?

    Hi friends,
    Please let me know:
    how can I store XML data into a file in XML format through a Java program?
    Thanks.
    We can discuss further on messenger.
    Avanish Kumar Singh
    Software Engineer,
    Samsung India Development Center,
    Bangalore--560001.
    [email protected]

    Hi, I need to write the data from an XML file to a Microsoft SQL Server database!
    I got a piece of code from the net which allows me to parse the file:
    import java.io.IOException;
    import org.xml.sax.*;
    import org.xml.sax.helpers.*;
    import org.apache.xerces.parsers.SAXParser;

    public class MySaxParser extends DefaultHandler {

        private static int INDENT = 4;
        private static String attList = "";
        private int idx = 0;

        public static void main(String[] argv) {
            if (argv.length != 1) {
                System.out.println("Usage: java MySaxParser [URI]");
                System.exit(0);
            }
            String uri = argv[0];
            try {
                XMLReader parser = XMLReaderFactory.createXMLReader("org.apache.xerces.parsers.SAXParser");
                MySaxParser MySaxParserInstance = new MySaxParser();
                parser.setContentHandler(MySaxParserInstance);
                parser.parse(uri);
            } catch (IOException ioe) {
                ioe.printStackTrace();
            } catch (SAXException saxe) {
                saxe.printStackTrace();
            }
        }

        public void characters(char[] ch, int start, int length) throws SAXException {
            String s = new String(ch, start, length);
            if (ch[0] == '\n') {
                return;
            }
            System.out.println(getIndent() + " Value: " + s);
        }

        public void endDocument() throws SAXException {
            idx -= INDENT;
        }

        public void endElement(String uri, String localName, String qName) throws SAXException {
            if (!attList.equals("")) {
                System.out.println(getIndent() + " Attributes: " + attList);
                attList = "";
            }
            System.out.println(getIndent() + "end element: " + localName);
            idx -= INDENT;
        }

        public void startDocument() throws SAXException {
            idx += INDENT;
        }

        public void startElement(String uri,
                                 String localName,
                                 String qName,
                                 Attributes attributes) throws SAXException {
            idx += INDENT;
            System.out.println('\n' + getIndent() + "start element: " + localName);
            if (localName.compareTo("Machine") == 0) {
                System.out.println("YES");
            }
            if (attributes.getLength() > 0) {
                idx += INDENT;
                for (int i = 0; i < attributes.getLength(); i++) {
                    attList = attList + attributes.getLocalName(i) + " = " + attributes.getValue(i);
                    if (i < (attributes.getLength() - 1)) {
                        attList = attList + ", ";
                    }
                }
                idx -= INDENT;
            }
        }

        private String getIndent() {
            StringBuffer sb = new StringBuffer();
            for (int i = 0; i < idx; i++) {
                sb.append(" ");
            }
            return sb.toString();
        }
    } // END PRGM
    Now, I am not a very good Java dev and I need to find a solution to this problem within one week.
    The next step is to write the data to the DB.
    I am sending an example of my file:
    <Start>
    <Machine>
    <Hostname> IPCServer </Hostname>
    <HostID> 80c04499 </HostID>
    <MachineType> sun4u [ID 466748 kern.info] Sun Ultra 5/10 UPA/PCI (UltraSPARC-IIi 360MHz) </MachineType>
    <CPU> UltraSPARC-IIi at 360 MHz </CPU>
    <Memory> RAM : 512 MB </Memory>
    <HostAdapter>
    <HA> kern.info] </HA>
    </HostAdapter>
    <Harddisks>
    <HD>
    <HD1> c0t0d0 ctrl kern.info] target 0 lun 0 </HD1>
    <HD2> ST38420A 8.2 GB </HD2>
    </HD>
    </Harddisks>
    <GraphicCard> m64B : PCI PGX 8-bit +Accel. </GraphicCard>
    <NetworkType> hme0 : Fast-Ethernet </NetworkType>
    <EthernetAddress> 09:00:30:C1:34:90 </EthernetAddress>
    <IPAddress> 149.51.23.140 </IPAddress>
    </Machine>
    </Start>
    Note that I can have more than one machine (meaning that I have to loop through the file to be able to write to the DB).
    Can you tell me what to do?
    Even better - do you have a piece of code that will help me understand and implement the database-writing portion?
    I badly need help here.
    Thanks.
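    No reply was posted in this excerpt, but the database-writing step being asked about is plain JDBC once the SAX handler has collected the fields of one <Machine> element. A rough sketch, assuming the Microsoft SQL Server JDBC driver and an invented MACHINE table (table, columns and connection details are all placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class MachineWriter {
        // Insert one parsed <Machine> element; call this e.g. from endElement("Machine").
        public static void insertMachine(String hostname, String hostId, String ipAddress) throws Exception {
            String url = "jdbc:sqlserver://localhost:1433;databaseName=inventory"; // assumed connection details
            try (Connection con = DriverManager.getConnection(url, "user", "password");
                 PreparedStatement ps = con.prepareStatement(
                         "INSERT INTO MACHINE (HOSTNAME, HOSTID, IPADDRESS) VALUES (?, ?, ?)")) {
                ps.setString(1, hostname);
                ps.setString(2, hostId);
                ps.setString(3, ipAddress);
                ps.executeUpdate();   // one row per <Machine>, so looping over the file just repeats this call
            }
        }
    }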

  • How do you share Aperture file across multiple users on same Mac?

    How do you share Aperture file across multiple users on same Mac? Seems this should be a preferences choice.

    When you share your library between users, you may run into permission and ownership problems if both users are editing the Aperture library and not only reading it. To avoid that, it helps to put the Aperture library onto a separate disk or a separate partition of your hard drive. For a separate partition or disk you can enable the "Ignore ownership on this volume" flag. Then all users can access the library as owners of this library.
    You might try to put the Aperture library into a shared folder on your Mac, but that has caused problems recently, e.g. when the library also contains video files.
    Regards
    Léonie

  • How to store binary data in Java

    I need to show news in my web application. I am retrieving the news from a feed server, in which the field GEN_UNICODE_MESSAGE_1 gives the news data. The news consists of both English and Arabic text. How should I store the binary content in Java, and how can I display the news in a JSP?
    I tried storing the binary content in a ByteArrayInputStream and appending it using a StringBuilder, but I didn't get the desired output. The output is seen as ÇáÎáíÌíÉ ááÇÓÊËãÇÑÇáãÌãæÚÉÇáÔÑßÉÇáÇãÓÈÊãÈÑÓÈÊãÈÑÅÌãÇá. Please help me.
    Here GEN_UNICODE_MESSAGE_1 is of type BINARY.
    try {
        BufferedInputStream in = new BufferedInputStream(new ByteArrayInputStream((byte[]) msg.getField("GEN_UNICODE_MESSAGE_1").value.get(h)));
        //ByteArrayInputStream in = new ByteArrayInputStream((byte[]) msg.getField("GEN_UNICODE_MESSAGE_1").value.get(h));
        StringBuilder builder = new StringBuilder();
        byte[] buff = new byte[1024];
        int len;
        while ((len = in.read(buff)) != -1) {
            builder.append(new String(buff, 0, len));
        }
        String incomingmsg = builder.toString();
    } catch (IOException e) {
    }
    Edited by: Alance on May 7, 2010 10:12 PM

    BufferedInputStream bis = new BufferedInputStream(new ByteArrayInputStream((byte[]) msg.getField("GEN_UNICODE_MESSAGE_1").value().toString().getBytes()));
    BufferedReader br = new BufferedReader(new InputStreamReader(bis));
    What on earth is all this?
    1. What is the type of msg.getField("GEN_UNICODE_MESSAGE_1").value()?
    2. Whatever that type is, you are then converting it to a String.
    3. You are then converting that to bytes.
    4. You are then casting that to byte[], which it already is,
    5. You are then wrapping a ByteArrayInputStream around that.
    6. ... and an InputStreamReader round that ...
    7. ... and a BufferedReader around that ...
    8. ... and reading lines
    9. ... and appending them to a StringBuilder
    10. ... and converting that to a String.
    Which you already had at step 2.
    String incoming = msg.getField("GEN_UNICODE_MESSAGE_1").value().toString();
    Not sure whether that's correct given the charset issues, but if not, the problem starts at step 2. In any case steps 3-10 add precisely nothing.
    This is all pointless. The key question is (1). Answer that first.
    2. The type of String.getBytes()
    StringBuilder sb = new StringBuilder();
    String line = null;
    while ((line = br.readLine()) != null) {
        sb.append(line + "\n");
    }
    br.close();
    String incomingmsg = sb.toString();
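    The root issue in this thread is character decoding rather than stream plumbing: building a String from raw bytes without naming a charset uses the platform default, which is what produces output like the sample above (Arabic bytes decoded with a Western charset look exactly like that). A minimal sketch; the charset name here is only a guess, so replace it with whatever the feed actually uses:

    import java.nio.charset.Charset;

    public class FeedDecoder {
        // Decode the raw GEN_UNICODE_MESSAGE_1 bytes with an explicit charset instead of
        // the platform default. "windows-1256" is an assumption based on the garbled sample;
        // try "UTF-8" or "UTF-16" if the feed documentation says otherwise.
        public static String decode(byte[] raw) {
            return new String(raw, Charset.forName("windows-1256"));
        }
    }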

  • How to store measurement data in a single database row

    I have to store time-series data in a database and do some analysis in Matlab later. The question might therefore be more of a database question than a DIAdem one; nevertheless I'm interested whether anyone has best practices for this issue.
    I have a database which holds lifecycle records for certain components of the same nature and different variants. Depending on the variant I have test setups which record certain data. There is a common set of component properties and a varying number of environmental measurements to be included. The duration of data acquisition also varies across the variants.
    Plain tables therefore appear to be a non-optimal solution for storing the data, because the needed number of columns is unknown. Additionally I cannot create individual tables for each sample of a variant; that would produce too many tables.
    So there are different approaches I have thought of:
    Saving the TDM and TDX files as text or as BLOBs
    This makes it necessary to use intermediate files.
    Saving the data as XML text
    I don't know yet whether I can process XML data in Matlab.
    Has anybody any advice on this problem?
    Regards
    Chris

    Chris
    Sorry for the lateness in replying to your post. 
    I have done quite a bit of using a Database to store test results.  (In my case this was Oracle DB, using a system called ASAM-ODS)
    My 2 Cents:
    Three functions were needed by my users: 1) to search for and find the tests, 2) to take the list of tests and process the data into a report/output summary, and 3) if the file size is large, to import the data quickly into the analysis tool, which speeds up processing.
    1) Searching for test results. This all depends on what parameters are of value for searching. In practice this is a small list of values (usually under 20), but I have had great difficulty getting users to agree on what these parameters are; they tend to want to search for anything. The organization of the searching parameters has a direct relationship to how you can search. The NI DataFinder does a nice job of searching for parameters, as long as the parameter data are stored in properties of channel groups or channels. It does not search or index values stored in the channels themselves.
    Another note: given that these are searchable parameters, it helps greatly if they have controlled entry, so that the parameters are consistent over all tests and not dependent on free-form entry by each operator. Searching becomes impossible if the operators enter dates/names in wildly different formats.
    2) A similar issue exists if you put the values into a database. (I will use the database terms of table, column (parameter) and row (an instance of data that would be one test record).)
    The SQL SELECT statement can do fast finds if you store the searchable parameters as columns of a table, with one row for each test record. The files I worked with have more than 2000 parameters; making a table that stores all of these, and can be searched on all of these, means a very large number of columns. I did not like this approach, as it has substantial maintenance time, and when changes are made, things get ugly quickly.
    3) This is where having the file format be something that the analysis tool can quickly load is beneficial, especially if the data files are large. In DIAdem's case, it reads TDM/TDMS files very quickly into the Data Portal. It can also read MDF or HDF files, but these are hierarchical structures that require custom code to traverse the information and get it into the Data Portal for use in analysis/reporting. (That takes more time to read the data, but you have much more flexibility in the data structure than with the two-level TDM/TDMS format.)
    My personal preferences:
    I would not want to put the test data itself into a table row. Each of the columns would be fixed and the table would be very wide.
    I personally like to put the test data into a file, like TDMS, MDF, or HDF, and have the database hold an entry referencing that file. The values that go into the database are just the parameters used for test searching, either in DataFinder or in SQL commands in the user interface.
    Hopefully these comments help with your tasks.
    Respectfully,
    Paul
    tdmsmith.com
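    To make the "narrow search table plus file reference" layout described above concrete, here is a small sketch in JDBC/DDL form; every column name is invented for illustration, and the search columns would be whatever short list of parameters your users finally agree on:

    import java.sql.Connection;
    import java.sql.Statement;

    public class TestRecordSchema {
        // A narrow table: only the agreed search parameters plus a pointer to the
        // TDMS/MDF/HDF file that holds the full measurement series.
        public static void createTable(Connection con) throws Exception {
            try (Statement st = con.createStatement()) {
                st.execute("CREATE TABLE test_record ("
                        + " test_id    NUMBER PRIMARY KEY,"
                        + " component  VARCHAR2(64),"
                        + " variant    VARCHAR2(64),"
                        + " operator   VARCHAR2(64),"
                        + " test_date  DATE,"
                        + " data_file  VARCHAR2(512))");  // path/URI of the file with the full time series
            }
        }
    }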

• How to store XML data fragments that will not be queried?

    Hello,
    A Delphi client application communicates with a Java server application via XML messages. The client sends XML messages over HTTP POST. The Java servlet gets an XML message, parses it, performs the requested action (select/insert/update/delete), generates the resulting response and sends it back to the client.
    I use Oracle DB XE 10.2.
    For example, the client sends a request to the server to append a certain order with new product info:
    Request:
    <?xml version="1.0" encoding="UTF-8" ?>
    <Request OrderID="123123123" Action="NewProduct">
    <Product TempProdID="2" ProdName="L01" VisualID="1" Amount="1" TechClass="1" TechSubject="1" TechVersion="0" TechName="TestTech" ElemOk="0">
    <Modified UserID="XXX" UserGroup="XXX" GroupLevel="0" />
    <QuickInfo>
    <ProdIcon Format="PNG"
    Encoding="Base64">iVBORw0KGgoAAAANSUhEUgAAAEAAAABACAIAAAA
    lC+aJAAAAA3RSTlP////6yOLMAAAAvElEQVR42u3aQQ6EIAwAQP7/afe0
    mo1mBVur0emJgwGmRDFNWwvH9I153OpjyoisefqXW3afm4WypP+MgomvT
    z8AAAAAAAAAAAAAAMAzAClzAWQAdvexfqATEKmA/Fm0rYs5ozvoAWyWj4
    ZqJ9efQKR8BJAHOPEdKAAc/lLdAhC/K68EBG+JWwAixfABgF8Jf6MAAAA
    AAAAAAAAAAAAALwRUGgAAAAAAsgGJ3cfVrcfFl2jiIZzV+V7Zd/8BOtNi
    0MnJ58oAAAAASUVORK5CYII=
    </ProdIcon>
    <Parameters />
    </QuickInfo>
    <TechProduct CurVer="1" MinVer="1" TotalItems="330" ItemNodeSize="55074">
    <SubItem Class="TPlGraph" BindID="00B9C004">
    <PropList />
    <SubItem Ident="Profiles" Class="TProfileList" Child="1" BindID="0188598C">
    <PropList />
    <SubItem Class="TPlProfile" Child="1" BindID="018CA6CC">
    <PropList>
    <Property PropIdent="ProfBaze0X" ValText="0" />
    <Property PropIdent="ProfBaze0Y" ValText="990" />
    <Property PropIdent="ProfBaze1X" ValText="990" />
    <Property PropIdent="ProfHier0" ValText="0" />
    <Property PropIdent="InBorder" ValIdent="None" ValText="N&#279;ra">
    <ExtValue ColIdent="ItemCode" Value="None" />
    <ExtValue ColIdent="Type" Value="" />
    <ExtValue ColIdent="Filter" Value="" />
    <ExtValue ColIdent="Width" Value="0" />
    <ExtValue ColIdent="TechCall" Value="" />
    </Property>
    </PropList>
    </SubItem>
    </SubItem>
    </SubItem>
    </TechProduct>
    </Product>
    </Request>
    I use DOM parsers to parse the received requests, extract certain info and insert it into relational tables using standard SQL queries.
    My question is: what is the best way to store XML data fragments that are not required to be stored relationally? I need to save the content of the <TechProduct> node from the example above in a relational table's column. There will be no need to query this column and no need to use relational views. I will use it only when the client application requests to modify a certain order's product; then I will have to send the same <TechProduct> node back in the XML response.
    So what column type should I use? CLOB? XMLType? Is it better to use object types? Do I have to register an XML Schema for better performance? The size of the fragment can be ~2 MB.
    Thanks for your help
    Message was edited by:
    Kichas

    Thank you for the reply.
    As you suggested, I will use XMLType storage as a CLOB (without an XML Schema).
    As I mentioned before, I use a Java servlet, deployed on the Tomcat web server, to receive XML messages from the client application via HTTP POST.
    I use these libs to get the XML payload and parse it into a Document:
    import org.w3c.dom.*;
    import org.xml.sax.InputSource;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.parsers.DocumentBuilder;
    And here is the code:
    public void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        try {
            // get the XML payload and parse it into a Document
            DocumentBuilderFactory docBuilderFactory = DocumentBuilderFactory.newInstance();
            DocumentBuilder docBuilder = docBuilderFactory.newDocumentBuilder();
            Document dom;
            InputSource input = new InputSource(request.getInputStream());
            dom = docBuilder.parse(input);
        } catch (Exception ex) {
            System.out.println("Exception thrown in XmlService");
            ex.printStackTrace();
            throw new ServletException(ex);
        }
    }
    I create a relational table that contains an XMLType column:
    CREATE TABLE xwarehouses (
    warehouse_id NUMBER,
    warehouse_spec XMLTYPE)
    XMLTYPE warehouse_spec STORE AS CLOB;
    Now I want to insert the whole DOM Document into the XMLType column, so I do this:
    import oracle.xdb.XMLType;
    String SQLTEXT = "INSERT INTO XWAREHOUSES (WAREHOUSE_ID, WAREHOUSE_SPEC) VALUES (?, ?)";
    XMLType xml = XMLType.createXML(con,dom);
    PreparedStatement sqlStatement = con.prepareStatement(SQLTEXT);
    sqlStatement.setInt(1,2);
    sqlStatement.setObject(2,xml);
    sqlStatement.execute();
    sqlStatement.close();
    dom is the Document, that I got from HTTP Request input stream.
    My servlet throws an exception:
    java.lang.NoClassDefFoundError: oracle/xml/parser/v2/XMLParseException
    at XmlService.GetMatListServiceHandler.processRequest(GetMatListServiceHandler.java:111)
    at XmlService.XmlServiceHandler.handleRequest(XmlServiceHandler.java:43)
    at XmlService.XmlServiceServlet.doPost(XmlServiceServlet.java:69)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:716)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:809)
    Why does it need oracle.xml.parser.v2.XMLParseException? I don't use the Oracle parser. Does this code line throw the exception (I am not able to debug my code, because I have not configured JDeveloper for remote debugging):
    XMLType xml = XMLType.createXML(con,dom);
    Does it reparse the given dom Document, or what? When I deploy xmlparserv2.jar to Tomcat, everything is OK and the application inserts the XML data into the XMLType column.
    Is there another way to insert a whole org.w3c.dom Document or Document fragment (which is already parsed) into an XMLType column? Can you provide any sample code?
    The Document may contain national symbols, so they should be stored correctly and later retrieved.
    Many thanks.
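    One way to sidestep the oracle.xdb / xmlparserv2 dependency entirely (a sketch only, not verified against this exact setup) is to serialize the DOM Document with the JDK's own Transformer and let the database construct the XMLType from the text via the XMLTYPE() constructor. A character-stream bind is used here because ~2 MB documents can exceed plain string-bind limits on some driver versions:

    import java.io.StringReader;
    import java.io.StringWriter;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import javax.xml.transform.OutputKeys;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;

    public class XmlTypeInsert {
        // Serialize the DOM in Java and let Oracle build the XMLType from the text.
        // National characters survive as long as serialization and the JDBC driver agree on UTF-8.
        public static void insert(Connection con, int id, Document dom) throws Exception {
            StringWriter sw = new StringWriter();
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.ENCODING, "UTF-8");
            t.transform(new DOMSource(dom), new StreamResult(sw));
            String xmlText = sw.toString();

            String sql = "INSERT INTO XWAREHOUSES (WAREHOUSE_ID, WAREHOUSE_SPEC) VALUES (?, XMLTYPE(?))";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setInt(1, id);
                ps.setCharacterStream(2, new StringReader(xmlText), xmlText.length());
                ps.executeUpdate();
            }
        }
    }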

• How to store the data in the database instead of the memory engine when using an XML file

    Hi Experts,
    I have a question about XML-to-database data transfer.
    While transferring data from XML to the database, the tables built from the XML files are held in the in-memory engine. My aim is, instead of holding them in the memory engine, to store the data in the database itself.
    I am assuming that when the data size increases, say to a huge amount of data, holding that much data in memory is a big issue; so to avoid this, is there any way to store this data directly at the database level?
    Please understand my requirement, or let me know and I will give a fuller explanation of it; please assist me with how to do this job.
    Thx,
    Sahadeva.

    Hi Experts,
    For this job I have applied the
    jdbc:snps:xmll?f=/mydata/xml/file.xml&s=wrk_admin&dp_driver=oracle.jdbc.driver.OracleDriver&dp_url=jdbc:oracle:thin:@localhost:1521:xel&dp_user=wrk_admin&dp_password=bPyXS2eRXw8fWnKEmTYSEf&dp_schema=wrk_admin&dp_doc=Y
    My schema name is wrk_admin,
    the XML file name is mydata,
    the JDBC URL points to localhost:1521:xe,
    and the encoded password I generated from the command line: c:\oracle\oracle_odi1\oracledi\agent\bin\encode wrk_admin
    The above are the details I have applied in ODI Studio for creating the data server.
    After applying this I got the following error.
    Could you please correct me here if I am wrong...
    jdbc:snps:xml?param1=value1&param2=value2&...
         at oracle.odi.jdbc.datasource.LoginTimeoutDatasourceAdapter.doGetConnection(LoginTimeoutDatasourceAdapter.java:133)
         at oracle.odi.jdbc.datasource.LoginTimeoutDatasourceAdapter.getConnection(LoginTimeoutDatasourceAdapter.java:62)
         at com.sunopsis.sql.SnpsConnection.testConnection(SnpsConnection.java:1118)
         at com.sunopsis.graphical.dialog.SnpsDialogTestConnet.getLocalConnect(SnpsDialogTestConnet.java:420)
         at com.sunopsis.graphical.dialog.SnpsDialogTestConnet.localConnect(SnpsDialogTestConnet.java:860)
         at com.sunopsis.graphical.dialog.SnpsDialogTestConnet.jButtonTest_ActionPerformed(SnpsDialogTestConnet.java:806)
         at com.sunopsis.graphical.dialog.SnpsDialogTestConnet.connEtoC1(SnpsDialogTestConnet.java:165)
         at com.sunopsis.graphical.dialog.SnpsDialogTestConnet.access$1(SnpsDialogTestConnet.java:161)
    Thx,
    Sahadeva.

• How to store the data captured from an oscilloscope into an MS Access database table

    Hi All,
    In my application I tried to save the data captured from the scope, but could store only 200 bytes. I've used the Memo data type for the field in the database. I want to store all the data (4 channels) in one field, retrieve it back, and display it on the XY graph.
    Thanks in advance.
    Regards,
    Shrini

    kramish wrote:
    > I'm pretty sure that Access does support JDBC
    No it does not. It supports ODBC.
    Just doing a quick Google came up with some pages:
    http://blog.taragana.com/index.php/archive/access-microsoft-access-database-from-java-using-jdbc-odbc-bridge-sample-code/
    http://www.javaworld.com/javaworld/javaqa/2000-09/03-qa-0922-access.html
    Both articles explain how to use the JDBC-ODBC bridge. I think I've seen a pure JDBC driver for Access, but it wasn't from Microsoft and it wasn't free.
    Kaj
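    For completeness, the bridge approach from those links looked roughly like the sketch below. Note that the JDBC-ODBC bridge shipped only with older JREs (it was removed in Java 8), and the DSN and table name here are assumptions:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class AccessViaOdbc {
        public static void main(String[] args) throws Exception {
            // "scopeData" is an assumed ODBC DSN pointing at the .mdb file.
            Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
            try (Connection con = DriverManager.getConnection("jdbc:odbc:scopeData");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT * FROM Measurements")) {  // assumed table name
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }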

  • How to handle large data sets?

    Hello All,
    I am working on an editable form document. It uses a flowing subform with a table. The table may contain up to 50k rows, and the generated PDF may take up to 2-4 GB of memory; in some cases Adobe Reader fails and "gives up" opening these large data sets.
    Any suggestions? 

    On 25.04.2012 01:10, Alan McMorran wrote:
    > How large are you talking about? I've found QVTo scales pretty well as
    > the dataset size increases but we're using at most maybe 3-4 million
    > objects as the input and maybe 1-2 million on the output. They can be
    > pretty complex models though so we're seeing 8GB heap spaces in some
    > cases to accommodate the full transformation process.
    Ok, that is good to know. We will be working in roughly the same order
    of magnitude. The final application will run on a well equipped server,
    unfortunately my development machine is not as powerful so I can't
    really test that.
    > The big challenges we've had to overcome is that our model is
    > essentially flat with no containment in it so there are parts of the
    We have a very hierarchical model. I still wonder to what extent EMF and
    QVTo at least try to let go of objects which are not needed anymore and
    allow them to be garbage collected?
    > Is the GC overhead limit not tied to the heap space limits of the JVM?
    Apparently not, quoting
    http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html:
    "The concurrent collector will throw an OutOfMemoryError if too much
    time is being spent in garbage collection: if more than 98% of the total
    time is spent in garbage collection and less than 2% of the heap is
    recovered, an OutOfMemoryError will be thrown. This feature is designed
    to prevent applications from running for an extended period of time
    while making little or no progress because the heap is too small. If
    necessary, this feature can be disabled by adding the option
    -XX:-UseGCOverheadLimit to the command line."
    I will experiment a little bit with different GC's, namely the parallel GC.
    Regards
    Marius

  • Newbie: How to store/analyze data in a time critical path

    Hi,
    I monitor two CAN channels with LabVIEW. For the monitoring I use a special PCMCIA card and a DLL from the card manufacturer.
    I get a lot of messages from each channel and I have to compare them.
    To get all messages I have to call the DLL in a while loop.
    But storing the data in two 2D arrays and analyzing it within the same loop is too slow.
    So how can I do it? Some kind of hash table, queue, or parallel tasks - one for collecting the data and one for analyzing it?
    And how do I handle the two CAN channels?
    As you can see I'm new to LabVIEW and I'm looking for a good program architecture...
    Thanks,
    Thomas

    Hi Thomas,
    You raise a good point, and I'm happy to see that you want to make your LabVIEW code more efficient! In general, you want to avoid building up large arrays in a while loop (as you've already alluded to). I believe a queue is an appropriate data structure to offload the processing of the data to another loop. We usually call this a producer-consumer type architecture, the details of which you can find here:
    Application Design Patterns: Producer/Consumer
    http://zone.ni.com/devzone/conceptd.nsf/webmain/c54badadd8bbde4286256c5200533b80
    Your producer loop will be the code acquiring from the CAN channels using your DLL. The queue will pass data from the producer to the consumer, where you will perform whatever analysis you need.
    Have a good one!
    Charlie S.
    Visit ni.com/gettingstarted for step-by-step help in setting up your system
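    A LabVIEW block diagram can't be pasted as text here, but the producer/consumer structure described above maps directly onto a queue in any language. The sketch below shows the same shape in Java purely to illustrate the architecture; acquireFrame() and analyze() are placeholders for the DLL call and the comparison logic:

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class ProducerConsumerSketch {
        public static void main(String[] args) {
            BlockingQueue<double[]> queue = new ArrayBlockingQueue<double[]>(1024);

            // Producer loop: only acquires frames and enqueues them.
            Thread producer = new Thread(() -> {
                try {
                    while (true) {
                        queue.put(acquireFrame());   // blocks if the consumer falls behind
                    }
                } catch (InterruptedException ignored) {
                }
            });

            // Consumer loop: dequeues frames and does the comparison/analysis.
            Thread consumer = new Thread(() -> {
                try {
                    while (true) {
                        analyze(queue.take());
                    }
                } catch (InterruptedException ignored) {
                }
            });

            producer.start();
            consumer.start();
        }

        private static double[] acquireFrame() { return new double[8]; } // placeholder for the CAN/DLL read
        private static void analyze(double[] frame) { /* placeholder for the comparison */ }
    }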

• How to store the data coming from a network analyser into a text or Excel file

    Hi everyone,
    I'm using an Agilent 8719ET network analyser and wish to store the data coming from the network analyser in a text file or an Excel file.
    Presently I'm able to get the data on a LabVIEW graph using GPIB. Can anyone suggest how to go ahead after the "collect data" sub-VI? How can the data be stored in a file in addition to being shown on the graph?
    Attached is the VI for kind consideration.
    Looking for help.
    Regards
    Rohit
    Attachments:
    Agilent 87XX Series Exceed Max Meas.vi ‏43 KB

    First let me say that your code really looks pretty good. The data handling could be made more efficient by calculating the number of data points that will be in the completed dataset and preallocating the entire array - but depending on your answers to my questions, the logic in the lower shift register may be going away, so we won't worry about that right now.
    The thing I need to know before addressing the data storage question is: each time you call "Collect and Display Data.vi", how many elements are in the array? Are you reading single data points, or a group of data? (BTW: if the answer to that question is obvious based on the way the other VIs are set up, I don't have the drivers so I can't tell what the setup values are.) Second, how fast does the loop iterate? Are we talking msec per loop? Seconds? Fortnights?
    The issues here are two-fold: how much data? and how fast is it coming? The answers to these will tell you how to save the data.
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps

  • How to store the data of a file into an ArrayList?

    Hi everyone,
    I want to know: if I have a file, and the data in the file are of type int, type String and so on,
    and I have an ArrayList which is of type Question,
    how do I store the data of that file in the ArrayList?
    I tried to use a while loop (using hasNextLine() to read the data line by line),
    but I cannot add data which is not of type Question to the ArrayList.
    Can you tell me what I should do, please?
    I also wonder about this:
    the data in the file are of many types, but when I try to read them with nextLine(), they all turn out to be of type String. Why?
    Thank you.
    Edited by: Terry001 on Apr 30, 2008 1:13 PM

    No, a line in the file is just part of a question.
    The format of the file is like this:
    <question type code>               : String
    <question point value>             : int
    <question category>                : String
    <question difficulty level>        : int
    <question text>                    : String
    <question correct answer>          : String
    <optional question-specific data>  : String
    <question terminator>              : String
    And here is an example
    TF //TrueFalseQuestion
    5 //points value
    None //category
    3 //difficulty level
    The capital of the United States is Washington, D.C. //question text
    True // answer
    *** //question terminator
    I created an ArrayList in the Test class:
    private ArrayList<Question> questions; // Create inside constructor

        public Test (String name, String instr)
        {
            testName = name;
            scoreEarned = 0;
            scorePossible = 0;
            instructions = instr;
            questions = new ArrayList<Question>(); //[MAX_NUMBER_OF_QUESTIONS];
        }
    And I tried to use the following method to store the data of the file in the ArrayList:
    // This method loads a set of questions from a plain text file
        public void loadQuestionsFromFile(String fileName) throws FileNotFoundException
        {
            try
            {
                File fileReader = new File("input.txt");
                Scanner sc = new Scanner(fileReader);
                while (sc.hasNextLine())
                {
                    // I don't know how to pass the data I got from nextLine() to the ArrayList because the values are of different types
                }
            }
            catch (FileNotFoundException a)
            {
                System.out.println(a);
            }
        }
    As you all said, I should create a Question object in the while loop:
    Question temp = new Question();
    But I have no idea how to pass the int and String values to the Question object.
    Thank you
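    Since the follow-up (how to turn one seven-line record into a Question) is left open here, one possible shape for the loop is sketched below. The Question constructor is a stand-in invented for illustration, so map the parsed fields onto whatever your real class expects:

    import java.io.File;
    import java.io.FileNotFoundException;
    import java.util.ArrayList;
    import java.util.Scanner;

    public class QuestionLoader {
        // Reads records in the posted format: type, points, category, difficulty,
        // text, answer, optional extra lines, then the "***" terminator.
        public static ArrayList<Question> load(String fileName) throws FileNotFoundException {
            ArrayList<Question> questions = new ArrayList<Question>();
            Scanner sc = new Scanner(new File(fileName));
            while (sc.hasNextLine()) {
                String type = sc.nextLine().trim();
                int points = Integer.parseInt(sc.nextLine().trim());     // numeric lines are parsed to int
                String category = sc.nextLine().trim();
                int difficulty = Integer.parseInt(sc.nextLine().trim());
                String text = sc.nextLine().trim();
                String answer = sc.nextLine().trim();
                StringBuilder extra = new StringBuilder();               // optional data up to "***"
                String line;
                while (sc.hasNextLine() && !(line = sc.nextLine().trim()).equals("***")) {
                    extra.append(line).append('\n');
                }
                questions.add(new Question(type, points, category, difficulty, text, answer, extra.toString()));
            }
            sc.close();
            return questions;
        }

        // Minimal stand-in so the sketch compiles; replace with your real Question class.
        public static class Question {
            Question(String type, int points, String category, int difficulty,
                     String text, String answer, String extra) { /* store the fields as needed */ }
        }
    }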

  • How to store bit data in VARCHAR(4000) field?

    Hi.
    Please help!
    We are porting some C/C++ software with embedded SQLs from NT/DB2 to Linux/Oracle.
    On NT/DB2 we have a table that stores file data in VARCHAR(4000) blocks.
    The table looks like this:
    CREATE TABLE FileData (filetime as timestamp not null, idx int not null, datablock varchar(4000) FOR BIT DATA not null, primary key (filetime, idx) );
    As you can see, DB2 has an appropriate field modifier - "FOR BIT DATA" - which makes DB2 store the data as-is, without converting characters.
    I need to know whether it is possible to do the same in Oracle as in DB2.
    Does Oracle have some kind of field modifier like "FOR BIT DATA" in DB2?
    If not, how can I achieve the same in Oracle?
    The current problems are:
    1) when the application imports a file with some national chars, Oracle stores "?" in the database in place of the national chars.
    2) another piece of the problem - if the file is more than 4000 bytes long, it reports the ORA-01461 error (it is trying to expand some chars to UTF-8 sequences that do not fit in a single char, so finally the data does not fit the field size).
    So, it seems that it cannot handle national chars at all. :-\
    For details please see the enclosed [C code|http://dmitry-bond.spaces.live.com/blog/cns!D4095215C101CECE!1606.entry] ; it shows how data is written to the table.
    In other places we also need to read data from the FILEDATA table and store it back to a file (other filename, other location).
    Here is a summary of the field-datatype variants I have tried for the "datablock" field:
    1) VARCHAR2, RAW, LONG RAW, BLOB - do not work! All report the same error - ORA-01461.
    2) CLOB, LONG - both work fine, but(!) both still return "?" instead of national chars when the data is read back.
    Hint: how I tried these field types - I just dropped the "FileData" table, created it with a different type for the "datablock" field and ran the same application to test it.
    I think I need to explain what the problem is - we do not provide direct access to the Oracle database; we use some middleware. The middleware is C/C++ software which also has an interface for dynamic SQL execution. So it is exactly this middleware (which typically runs on the same server as Oracle) that receives the "?" instead of the national chars! But we need it to return all data AS-IS(!) without any changes!!! That is why I asked whether Oracle has any option to store byte data as-is.
    THE BIG QUESTION - HOW CAN WE DO THIS?!
    Another thing I need to explain - it is OK to use Oracle-specific SQL ONLY IF THERE IS REALLY NO WAY TO ACHIEVE THIS WITH STANDARD SQL! Please.
    So, please look at the C code (at the link posted above) and tell me whether it is possible to make the VARCHAR approach we are using at the moment work in Oracle.
    If not - please describe what options we have for Oracle.
    Regards,
    Dmitry.
    PS. It is Oracle 11gR2 on CentOS 5.4, all installed with default settings, so the Oracle DB encoding is "AL32UTF8".
    The C/C++ application is built as an ANSI/ASCII application (non-Unicode), so sizeof(char) = 1.
    The target Oracle DB (I mean the one which will be used at the customer site) is Oracle 10g, so the solution needs to work on Oracle 10g.

    P. Forstmann wrote:
    There is some contradiction in your requirements:
    - if you want to store data as is without any translation use RAW or BLOB
    - if you want to store national character data try to use NVARCHAR2 or NCLOB.
    Seems you did not understand the problem. OK, I'll try to explain. Please look at the code sample I provided in the original question
    (I just added the expanded data structures there; sorry, I forgot to publish them when I posted the original question):
    EXEC SQL BEGIN DECLARE SECTION;
      struct {
        char timestamp[27];
        char station[17];
        char filename[33];
        char task[17];
        char orderno[17];
        long filelen;
      } gFilehead;
      struct {
        char timestamp[27];
        long idx;
        struct {
          short len;
          char arr[4001];
        } datablock;
      } gFiledata;
    EXEC SQL END DECLARE SECTION;

    #define DATABLOCKSIZE 4000

    #ifdef __ORACLE
      #define VARCHAR_VAL(vch) vch.arr
    #elif __DB2
    #endif

    short dbWriteFile( char *databytes, long datalen )
    {
      short nRc;
      long movecount;
      long offset = 0;

      gFilehead.filelen = gFilehead.filelen + datalen;
      while ((datalen + gFiledata.datablock.len) >= DATABLOCKSIZE)
      {
        movecount = DATABLOCKSIZE - gFiledata.datablock.len;
        memcpy(&VARCHAR_VAL(gFiledata.datablock)[gFiledata.datablock.len], databytes, movecount);
        gFiledata.datablock.len = (short)(gFiledata.datablock.len + movecount);
        exec sql insert into filedata (recvtime, idx, datablock)
          values(
            :gFiledata.recvtime type as timestamp,
            :gFiledata.idx,
            :gFiledata.datablock /* <--- ORA-01461 appears here */
          );
        nRc = sqlcode;
        switch (nRc)
        {
          case SQLERR_OK: break;
          default:
            LogError(ERR_INSERT, "filedata", IntToStr(nRc), LOG_END);
            exit(EXIT_FAILURE);
        }
        offset = offset + movecount;
        datalen = datalen - movecount;
        gFiledata.idx = gFiledata.idx + 1;
        memset(&gFiledata.datablock, 0, sizeof(gFiledata.datablock));
        databytes = databytes + movecount;
        gFiledata.datablock.len = 0;
      }
      if (datalen + gFiledata.datablock.len)
      {
        memcpy(&VARCHAR_VAL(gFiledata.datablock)[gFiledata.datablock.len], databytes, datalen);
        gFiledata.datablock.len = (short)(gFiledata.datablock.len + datalen);
      }
      return 0;
    }
    So, the thing we need is - to put some data into the "datablock" field of the following structure:
      struct {
        char timestamp[27];
        long idx;
        struct {
          short len;
          char arr[4001];
        } datablock;
      } gFiledata;
    Then insert it into a database table using static SQL like this:
        exec sql insert into filedata (recvtime, idx, datablock)
          values(
            :gFiledata.recvtime type as timestamp,
            :gFiledata.idx,
            :gFiledata.datablock /* <--- ORA-01461 appears here */
          );
    And then expect to read exactly the same data back!
    The problems are:
    1) Oracle decides to convert the data we are inserting (why? and how do we disable the conversion?!)
    2) even if it inserts the data (the CLOB and LONG field datatypes work fine when inserting data with such static SQL and such a host variable), the data then becomes unreadable! (why?! how do we make it readable?!)
    P. Forstmann wrote:
    ORA-01461 could mean that you have a wrong data type for the bind variable in your client code.
    It was not me who decided that the host variable is a "LONG datatype value" - Oracle made that decision instead of me. And that is the problem!
    It looks like Oracle reacts to any char code >= 0x80.
    So, assume I run this code:
    // if Tab1 was created as "CREATE TABLE Tab1 (value VARCHAR(5))" then
    char szData[10] = "\x41\x81\x82\x83\x84";
    EXEC SQL INSERT INTO Tab1 (value) VALUES (:szData);
    Oracle will report the ORA-01461 error!
    EXACTLY THIS IS THE PROBLEM I HAVE DESCRIBED IN MY ORIGINAL QUESTION.
    So, the problem is - why does Oracle make such a decision instead of me?! How can we make Oracle insert data into a table AS-IS?
    What other type of host variable should we use to make Oracle treat the data as binary?
    void*? unsigned char? Could you please provide any examples?
    OK, you did recommend "use the RAW datatype". But the RAW datatype is limited to 2000 bytes only - we need 4000! So it does not match our needs at all.
    Also you mentioned "use BLOB" - but testing shows that Oracle reports the same ORA-01461 error on inserting data into a BLOB field from such a host variable! (see the code I posted)
    What else can we do?
    Change the type of the host variables? BUT HOW?!
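    For comparison only - this is not a Pro*C answer - here is what the same "store the bytes as-is" requirement looks like from plain JDBC, where a BLOB column plus a binary bind (setBytes/getBytes) bypasses all character-set conversion. This sketch assumes the datablock column is recreated as BLOB; the table and column names are reused from the post:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Timestamp;

    public class FileDataBlob {
        // With a BLOB column and setBytes() the driver performs no NLS translation,
        // so the bytes come back exactly as written.
        public static void writeBlock(Connection con, Timestamp fileTime, int idx, byte[] block) throws Exception {
            try (PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO filedata (filetime, idx, datablock) VALUES (?, ?, ?)")) {
                ps.setTimestamp(1, fileTime);
                ps.setInt(2, idx);
                ps.setBytes(3, block);        // binary bind: no character-set conversion
                ps.executeUpdate();
            }
        }

        public static byte[] readBlock(Connection con, Timestamp fileTime, int idx) throws Exception {
            try (PreparedStatement ps = con.prepareStatement(
                    "SELECT datablock FROM filedata WHERE filetime = ? AND idx = ?")) {
                ps.setTimestamp(1, fileTime);
                ps.setInt(2, idx);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getBytes(1) : null;
                }
            }
        }
    }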

  • How to handle large data in file adapter

    We have a scenario Proxy -> PI -> File Server using the File adapter.
    The File adapter uses FCC for conversion.
    Recently Wave 2 products went live and suddenly this interface has a much higher message volume, due to which the File adapter is not performing well: PI slows down or frequently disconnects from the file server. As a result we either get duplicate records in the file or the created file format is wrong.
    The file size is somewhere around 4.07 GB, which I also think is quite high for PI to handle.
    Can anybody suggest how we can handle such large data?
    Regards,
    Vikrant

    Check this Blog for Huge File Processing:
    Night Mare-Processing huge files in SAP XI
    You can also take a look at this blog about high-volume messages:
    Step-by-Step Guide in Processing High-Volume Messages Using PI 7.1's Message Packaging
    PI Performance Tuning Best Practice:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2016a0b1-1780-2b10-97bd-be3ac62214c7?QuickLink=index&overridelayout=true&45896020746271
