DocumentBuilder timeout - parse(url)

Is there any way to configure a timeout when parsing from a URL with DocumentBuilder?

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

final DocumentBuilderFactory docFactory = DocumentBuilderFactory.newInstance();
final DocumentBuilder docBuilder = docFactory.newDocumentBuilder();
Document document = docBuilder.parse(url);

I can't seem to find a reference anywhere.

tony_murphy wrote:
Is there any way to configure a timeout when parsing from a URL with DocumentBuilder? I can't seem to find a reference anywhere.

I think that is not the job of DocumentBuilder; DocumentBuilder is capable of parsing a number of sources to build a Document.
Try another way, parsing an InputStream instead. Check it out below:
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

try {
     URL url = new URL(strURL);                                            // URL to invoke
     HttpURLConnection urlCon = (HttpURLConnection) url.openConnection();  // Open a connection to the URL
     urlCon.setReadTimeout(1000);                                          // Read timeout in milliseconds (1 second here)
     urlCon.connect();                                                     // Connect
     InputStream iStream = urlCon.getInputStream();                        // Open an InputStream to read from
     DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
     DocumentBuilder builder = factory.newDocumentBuilder();
     Document doc = builder.parse(iStream);                                // Parse the InputStream
} catch (Exception ex) {
     // TODO: exception handling
}

Note: Perform proper exception handling.
Thanks,
Tejas
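For completeness, here is a self-contained sketch of the same idea (the class and method names are my own, and the timeout values are only illustrative). Setting a connect timeout in addition to the read timeout requires Java 5 or later:

```java
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class TimedXmlFetch {
    // Parses XML from a URL with explicit connect/read timeouts (in ms).
    // Using URLConnection (not HttpURLConnection) so file: URLs work too.
    public static Document parse(String urlString, int connectMs, int readMs)
            throws Exception {
        URLConnection conn = new URL(urlString).openConnection();
        conn.setConnectTimeout(connectMs); // fail fast if the host is unreachable
        conn.setReadTimeout(readMs);       // fail fast if the server stops sending
        InputStream in = conn.getInputStream();
        try {
            DocumentBuilder builder =
                    DocumentBuilderFactory.newInstance().newDocumentBuilder();
            return builder.parse(in);
        } finally {
            in.close();
        }
    }
}
```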

Similar Messages

  • Has anybody used DocumentBuilder.parse(URL)

    Hi friends
    Has anybody used DocumentBuilder.parse(URL) to make a Document object?
    I have the following piece of code
    <code>
    DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
    DocumentBuilder builder = factory.newDocumentBuilder();
    System.out.println("| Start | Fetch xml"); // ------------ 1
    Document document = builder.parse("http://some url that gives back XML");
    System.out.println("| Stop | Fetch xml"); // ------------- 2
    </code>
    Now the problem is that once in a while the code will hang at point 1 and never reach point 2. Exception handling has been done, but no exceptions are being logged; the code simply hangs.
    Please let me know if you have also faced the same or a similar problem.
    Thanks in anticipation,
    Partha.

    Does the same thing happen with a file URL instead of an http URL? Try
    Document document = builder.parse("file://c:/xmlFile");
    instead of
    Document document = builder.parse("http://some url that gives back XML");

  • DOMParser.parse(URL) hangs

    Anytime I call DOMParser.parse(URL) where the URL is of type "http://", the parse call hangs (as near as I can tell) indefinitely. Are URLs of this type not supported? Is there a workaround for this problem?

    No. Within the same class, the following DOES work:
    DOMParser dp = new DOMParser();
    dp.setErrorStream(new PrintWriter(errs));
    // Set Schema Object for Validation
    dp.setXMLSchema((XMLSchema)((new XSDBuilder()).build(schema.location)));
    Note that schema.location is a String like "http://www.wherever.com/file.xsd" which points to the web server that is hanging on DOMParser.parse(URL);

  • Set timeout for URL connection in java1.4

    hi
    I want to set a timeout for a URL connection in Java 1.4.
    Java 5.0 provides the setConnectTimeout(int) and setReadTimeout(int) methods, but how should it be done in a 1.4 environment?
    Thanks in advance
    sneha

    sun.net.client.defaultConnectTimeout (default: -1)
    sun.net.client.defaultReadTimeout (default: -1)
    See the [Networking Properties|http://java.sun.com/j2se/1.4.2/docs/guide/net/properties.html].
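    As a sketch, those properties can be set from code before the first connection is opened. The class name and the 5000 ms value below are my own, purely illustrative:

```java
public class LegacyTimeouts {
    // Java 1.4 has no setConnectTimeout/setReadTimeout on URLConnection.
    // These Sun-specific system properties (documented in the Networking
    // Properties guide) set JVM-wide default connect/read timeouts, in
    // milliseconds, for the built-in HTTP protocol handler. They must be
    // set before the first connection is made.
    public static void apply(int millis) {
        String v = String.valueOf(millis);
        System.setProperty("sun.net.client.defaultConnectTimeout", v);
        System.setProperty("sun.net.client.defaultReadTimeout", v);
    }
}
```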

  • Datasocket Server looses connections after a few hours "Parsing URL"

    Hi there,
    I need some help cause the DataSocket server is doing some bad stuff...!
    The server runs on a WinNT 4.0 platform as a datalogger. It works fine, but after a few hours the server seems to lose its connections somehow; the DataSocket status of the front-panel elements says: "Connecting: Parsing URL".
    The server has about 5 DINT variables and about 3 clusters containing INT variables (size about 30).
    I have really run out of ideas about what is happening on this "old" machine; maybe someone has had problems like mine and solved them...
    Thanks for hints and tips!

    Hello,
    what service packs for Windows NT have you installed?
    The DataSocket server crashes after a short period of time when the diagnostics window is open (Tools->Diagnostics). Disable the "Auto-Refresh"
    option in the Diagnostics dialog from the Options menu.
    regards
    P.Kaltenstadler
    National Instruments

  • How to set TimeOut in URL

    Hi everyone,
    " URL u = new URL(url);
    InputStream is = u.openStream();
    BufferedReader dis = new BufferedReader(new InputStreamReader(is)); "
    My problem is:
    1. If the host is down, my program waits for the default time to connect.
    2. After connecting to the server, there is sometimes a problem getting a response from the host,
    and my program waits a long time for the response.
    I want to change these two timeouts. Also, I'm using JDK 1.3; if it were 1.5 I could use the setConnectTimeout method.
    If you have a solution, please let me know.
    Thanks
    JaiGanesh.R

    http://coding.derkeiler.com/Archive/Java/comp.lang.java.programmer/2004-01/3271.html
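    The usual pre-1.5 workaround is to do the read on a worker thread and abandon it after a deadline. A rough JDK 1.3-style sketch (the class and method names are my own), with the caveat that the abandoned thread may linger until its socket dies:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;

public class TimedFetch {
    // Reads the content of a URL on a worker thread and gives up if the
    // read has not finished within timeoutMillis.
    public static String fetch(final String spec, long timeoutMillis)
            throws IOException {
        final StringBuffer body = new StringBuffer();
        final IOException[] failure = new IOException[1];
        Thread worker = new Thread() {
            public void run() {
                try {
                    BufferedReader in = new BufferedReader(
                            new InputStreamReader(new URL(spec).openStream()));
                    String line;
                    while ((line = in.readLine()) != null) {
                        body.append(line);
                    }
                    in.close();
                } catch (IOException e) {
                    failure[0] = e; // surface the error to the caller
                }
            }
        };
        worker.setDaemon(true); // don't keep the JVM alive for a stuck read
        worker.start();
        try {
            worker.join(timeoutMillis);
        } catch (InterruptedException ignored) {
        }
        if (worker.isAlive()) {
            throw new IOException("Timed out after " + timeoutMillis + " ms: " + spec);
        }
        if (failure[0] != null) {
            throw failure[0];
        }
        return body.toString();
    }
}
```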

  • Parsing URL query parameters

    Hi all, I keep running into situations where I need to parse the parameters from the query string of a URL. I'm not using Servlets or anything like that.
    After much frustration with not being able to find a decent parser, I wrote one myself, and thought I would offer it to others who might be fighting the same thing. If you use these, please keep the source code comments in place!
    Here it is:
    /*
     * Written by: Kevin Day, Trumpet, Inc. (c) 2003
     * You are free to use this code as long as these comments
     * remain intact.
     *
     * Parse parameters from the query segment of a URL.
     * Pass in the query segment, and it returns a Map
     * containing an entry for each name.
     * Each map entry is a List of values (it is legal to
     * have multiple name-value pairs in a URL query
     * that have the same name!)
     *
     * Requires: java.util.*, java.net.URLDecoder,
     * java.io.UnsupportedEncodingException
     */
    static public Map getParamsFromQuery(String q) throws InvalidParameterException {
         /*
          * The query, q, can be of the form:
          * <blank>
          * name
          * name=
          * name="value"
          * name="value"&
          * name="value"&name2
          * name="value"&name2="value2"
          * name="value"&name="value"
          * name="value & more"&name2="value"
          */
         Map params = new HashMap();
         StringBuffer name = new StringBuffer();
         StringBuffer val = new StringBuffer();
         StringBuffer out = name; // currently reading the name (switches to val after '=')
         boolean inString = false;
         int i = -1;
         int qlen = q.length();
         while (i < qlen) {
              char c = ++i < qlen ? q.charAt(i) : '&'; // treat end of input as a final '&'
              if (inString) {
                   if (c != '\"')
                        out.append(c);
                   else
                        inString = false;
              } else if (c == '&') {
                   String nameStr = cleanEscapes(name.toString());
                   String valStr = cleanEscapes(val.toString());
                   List valList = (List) params.get(nameStr);
                   if (valList == null) {
                        valList = new LinkedList();
                        params.put(nameStr, valList);
                   }
                   valList.add(valStr);
                   name.setLength(0);
                   val.setLength(0);
                   out = name;
              } else if (c == '=') {
                   out = val;
              } else if (c == '\"') {
                   inString = true;
              } else {
                   out.append(c);
              }
         }
         if (inString)
              throw new InvalidParameterException("Unexpected end of query string " + q + " - Expected '\"' at position " + i);
         return params;
    }

    static private String cleanEscapes(String s) {
         try {
              return URLDecoder.decode(s, "UTF-8");
         } catch (UnsupportedEncodingException e) {
              e.printStackTrace();
         }
         return s;
    }
    You'll also need to create a new Exception class called InvalidParameterException.
    Cheers,
    - Kevin

    Because javax.servlet.* is not included in the standard Java distribution. I am writing my own mini-web server interface for applications and don't want to add that dependency nastiness just to get a parser...

    Sounds like an interesting project. Have you thought about implementing the Servlet API (or a subset)? It is well known to developers, and I don't think it would require that much extra work, but that would depend on how mini "mini" is, of course =).

    OK, I was just curious.

  • Solution to parse URL query parameters

    I would like to parse query parameters in a URL GET request. I want to know if there is an efficient solution.
    e.g. http://www.google.com?search=red&query=blue
    I want to get the strings "search", "red", "query", "blue". I am not sure whether using StringTokenizer is an efficient solution.
    Thanks
    Jawahar

          StringTokenizer st = new StringTokenizer("http://www.google.com?search=red&query=blue", "?&=", true);
          Properties params = new Properties();
          String previous = null;
          while (st.hasMoreTokens()) {
             String current = st.nextToken();
             if ("?".equals(current) || "&".equals(current)) {
                // ignore the delimiters themselves
             } else if ("=".equals(current)) {
                params.setProperty(URLDecoder.decode(previous), URLDecoder.decode(st.nextToken()));
             } else {
                previous = current;
             }
          }
          params.store(System.out, "PARAMETERS");

  • Parse url

    Hi all,
    I am looking for some example code on how to select out certain data from a URL and save it to a text file. I can easily save the entire page to a text file, but I only need some of the data - for example, the information between tag "A" and tag "B".
    Much thanks in advance!
    Alex

    So what you want is not to parse a URL, but to parse HTML content and save it...
    Anyway, use some kind of HTML parsing library, like htmlparser - http://htmlparser.sourceforge.net/

  • How to parse URL Data into an NSString Array in iphone application

    Hi Every one
    I am a newbie to iPhone programming. I am having a problem with reading data and displaying it in a table view. My application has to be designed like this: there is a CSV file on the server machine, and I have to access that URL line by line. Each line consists of 8 comma-separated values; consider that each line has a first name, second name, and so on. I have to split the data on commas and newlines and store the fields in arrays (first names, second names, and so on). Then I have to display the first name and second name combined in the UITableView. Can anyone provide me with an example of how to do it? I know I am asking for the solution, but I ran into problems handling the connection methods and the parsing separately. Your help is much appreciated.
    Thanks

    What does that have to do with a URL?
    The only thing that doesn't sound good is the "array of first names" and "array of second names". For each row, extract all the fields and store them in an NSDictionary. Add a derived field consisting of the first name concatenated with the last name. That will be easy to display in a table.

  • Parsing URL issue

    **** WARNING ADULT RELATED LINKS ON THIS ISSUE ****
    I am having an issue with Firefox parsing part of the URL.
    Example: http://trial.jenndoll.com/bonus.php?fc=2
    Firefox is stripping away the ?fc=2 from the above URL. I have tested other browsers, and they handle it properly.

    Did you ever find a way to provision a resource using SPML?
    I'm facing the same problem at the moment..
    Regards,
    Tine

  • Parsing url from HTML file

    hello
    I have an HTML page converted to text format. I need to parse the URLs present in that file.
    I got all the URLs, but I want to extract only a specific URL, i.e., the URL in the anchor tag,
    e.g., like <a href = "http://www.java.sun.com"> Java </a>
    If I need only the URL in the anchor tag from that file, what would be the method?
    Any help would be great
    Thanks in advance
    k

    Thank you very much for your prompt reply and for the link you have provided.
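    For a quick-and-dirty alternative to a full HTML parser, a regular expression can pull out the href values of anchor tags. The class name and pattern below are my own sketch, and it only handles double-quoted href attributes:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AnchorExtractor {
    // Collects the href value of every <a ... href="..."> tag in the input.
    // A regex is fine for simple, well-formed pages; a real HTML parser
    // (e.g. the htmlparser library mentioned above) is more robust.
    public static List<String> extractHrefs(String html) {
        List<String> urls = new ArrayList<String>();
        Matcher m = Pattern.compile("<a\\s+[^>]*href\\s*=\\s*\"([^\"]*)\"",
                Pattern.CASE_INSENSITIVE).matcher(html);
        while (m.find()) {
            urls.add(m.group(1)); // group 1 is the quoted URL
        }
        return urls;
    }
}
```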

  • Parsing url iview variables

    Hi all,
    I need to create a URL iView that redirects to a target page, passing one variable.
    How can I call the URL iView with one variable and propagate this variable to the target page?
    thanks in advance,
    david

    Hi David,
      While creating the URL iView, pass your variable at the end of the URL, like http://www.yahoo.com?myVariable. In the target, you can get the myVariable value using the getParameter method. Hope this helps.
    Regards,
    Venkatesh. K
    /* P.S: Consider Points if it is useful */

  • Session Timeout and Url Redirect in BlazeDS?

    We have a JSF2 webapp with Flex 4 integrated.
    Question:
    1. What parameters can we pass in web.xml to make Flex 4 redirect to the login page when the session times out, instead of giving an AMF communication error?
    Thanks,
    User.

    Hi, I am also struggling with the same problem. Have you found any solution?

  • ReferenceError: loadScript is not defined:parse url

    I get this error every time I try to open a saved .htm page. This never happened previously (I have been using that saved page for a few months now). Please tell me how to get rid of it.

    Start Firefox in <u>[[Safe Mode|Safe Mode]]</u> to check if one of the extensions (Firefox/Tools > Add-ons > Extensions) or if hardware acceleration is causing the problem (switch to the DEFAULT theme: Firefox/Tools > Add-ons > Appearance).
    *Do NOT click the Reset button on the Safe Mode start window or otherwise make changes.
    *https://support.mozilla.org/kb/Safe+Mode
    *https://support.mozilla.org/kb/Troubleshooting+extensions+and+themes
