Does the parser work with large XML files?

Is there a restriction on the XML file size that can be loaded into the parser?
I am getting an out-of-memory exception reading in a large XML file (10MB) using the following code:
DOMParser parser = new DOMParser();
URL url = createURL(argv[0]);
parser.setErrorStream(System.err);
parser.setValidationMode(true);
parser.showWarnings(true);
parser.parse(url);
Win NT 4.0 Server
Sun JDK 1.2.2
===================================
Error output
===================================
Exception in thread "main" java.lang.OutOfMemoryError
at oracle.xml.parser.v2.ElementDecl.getAttrDecls(ElementDecl.java, Compiled Code)
at java.util.Hashtable.<init>(Unknown Source)
at oracle.xml.parser.v2.DTDDecl.<init>(DTDDecl.java, Compiled Code)
at oracle.xml.parser.v2.ElementDecl.getAttrDecls(ElementDecl.java, Compiled Code)
at oracle.xml.parser.v2.ValidatingParser.checkDefaultAttributes(ValidatingParser.java, Compiled Code)
at oracle.xml.parser.v2.NonValidatingParser.parseAttributes(NonValidatingParser.java, Compiled Code)
at oracle.xml.parser.v2.NonValidatingParser.parseElement(NonValidatingParser.java, Compiled Code)
at oracle.xml.parser.v2.ValidatingParser.parseRootElement(ValidatingParser.java:97)
at oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:199)
at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:146)
at TestLF.main(TestLF.java:40)
null

We have a number of test files of that size and they work without a problem. However, using the DOMParser does require significantly more memory than your document size.
What is the memory configuration of the JVM that you are running? Have you tried increasing it? Are you using our latest version, 2.0.2.6?
Oracle XML Team
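To see what the JVM actually has to work with, the standard java.lang.Runtime API can report the current heap from inside the program (a small sketch; the MemCheck class name is made up for illustration):

```java
public class MemCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // Heap currently reserved by the JVM, and how much of it is unused.
        // If total memory tops out well below the expanded size of your DOM
        // tree, launch with a larger heap, e.g. "java -mx256m TestLF file.xml"
        // (-mx is the JDK 1.2-era spelling of -Xmx).
        System.out.println("total bytes: " + rt.totalMemory());
        System.out.println("free bytes:  " + rt.freeMemory());
    }
}
```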

Similar Messages

  • Speed up Illustrator CC when working with large vector files

    Mostly-raster files up to 350 MB run fast in Illustrator CC, while vector files of 10 MB are a pain in the *blieb* (e.g. zooming & panning). When reading a file it seems to freeze around 95% for a few minutes. Memory usage goes up to 6 GB; processor usage 30-50%.
    Are there ways to speed things up while working with large vector files in Illustrator CC?
    System:
    64-bit Windows 7 Enterprise
    Memory: 16 GB
    Processor: Intel Xeon 3.7 GHz (8 threads)
    Graphics: nVidia Geforce K4000

    Files with large numbers of vector points will put a strain on the fastest of computers, but any speed increase we can get you can save you lots of time.
    Delete any unwanted stray points using Select >> Object >> Stray Points.
    Optimize performance | Windows
    Did you draw this yourself, and is the file as clean as can be? Are there any repeated paths underneath your art, left over from live tracing or stock-art sites, which do not need to be there?
    Check the Control Panel >> Programs and Features, sort by recently installed, and uninstall anything suspicious.
    Sorry, there is no short or single answer to this; as per the previous poster, using layers effectively and working in outline mode when possible might be the best you can do.

  • How well does the Ipad2 work with the verizon 4g hotspot?

    How well does the ipad2 work with the verizon 4g hotspot?

    WoW Macbook Pro 2011 15" i7 2.3Ghz quad core AMD 6750 hi-res AG OSX gameplay with FPS and temps 
    MacBook Pro 17in. Review (Early 2011) 
    http://www.youtube.com/watch?v=hiEDf_l0PqY

  • Does the curricula work with or without the SAP software?

    Hi! As a faculty member... does the curricula work with or without the SAP software? Is it better to lecture with the software or can I get by without it?
    If I'm part of a member university how many students can I provide access to the software? Are there any limitations?
    Also, I'm based in Philadelphia with connections to the 2nd largest university... Can I pick my UCC or will it be assigned to me? I'm open to either one!
    Many thanks!
    Richard

    Hi Everyone,
    One key intent of this community is to present curriculum resources for lectures of either type - with access to SAP software for the professor and students in classroom and lab environments, OR without access to the software.
    The former, with SAP software access, requires formal membership in our long-standing University Alliances Program (UAP). With this membership, professors receive special access to curriculum elements designed to depend on use of SAP software (and of course access to everything in our community).
    The latter, without SAP software access, simply requires registration in this new University Alliances Community (UAC). This community makes many resources available to professors to integrate into their business and IT-related lectures and projects - including pre-recorded demos, videos, expert and analyst reports, articles, white papers, customer reviews, references, case studies, etc.
    The current state of the site is like a start-up: there is a lot on offer, but we also expect that there is a lot more out there to come. We know there are many elements which we can make available to the worldwide community once they are contributed. We are actively approaching professors in our networks now for items.
    We would be grateful for anyone we have not contacted to come forward and offer content of their own, for the UAP members or for the greater UAC.
    Thanks and Best Wishes
    Bob LoBue

  • Does the SMPTETTPlugin work with Strobe?

    Does the SMPTETTPlugin work with Strobe, or just OSMF?
    http://blogs.adobe.com/accessibility/2011/10/new-work-on-closed-captioning.html

    **Update**
    I gave MAC filtering a try. It worked great without the range extender. However, for the range extender to work, its MAC address must be set up in the router for access. When I tried to connect to the network on a computer without a registered MAC, and with the range expander on, it let me pass right through.
    So I'm assuming that since I was connecting to the router through the range expander, which had a registered MAC, I was able to get access. So wireless MAC filtering was a no-go, since anybody wanting to connect could do so via the range extender.
    For me... WEP/WPA encryption doesn't work. Wireless MAC filtering doesn't work. But what DOES work is not broadcasting the SSID while not having any wireless security encryption in place. Nobody can wirelessly connect to your network without knowing the SSID, which in a way is more or less like a password. Just remember to change the SSID on both the router and the extender to something that you, or anybody else in the area, has not used in the past.
    I hope all this info is helpful to somebody.

  • Problems with Large XML files

    I have tried increasing the memory pool using the -mx and -ms options. It doesn't work. I am using your latest XML parser for Java v2. Please let me know if there are specific options I should be using.
    Thanx,
    -Sameer
    We have a number of test files of that size and they work without a problem. However, using the DOMParser does require significantly more memory than your document size.
    What is the memory configuration of the JVM that you are running? Have you tried increasing it? Are you using our latest version, 2.0.2.6?
    Oracle XML Team

    You might try using a different JDK/JRE - either a 1.1.6+ or 1.3 version, as 1.2 in our experience has the largest footprint. If this doesn't work, can you give us some details about your system configuration? Finally, you might try the SAX interface, as it does not need to load the entire DOM tree into memory.
    Oracle XML Team
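    The SAX route mentioned above can be sketched with the standard JAXP API (bundled from JDK 1.4; earlier JDKs need the JAXP jars on the classpath). The SaxCount class and its element-counting handler are illustrative only; the point is that SAX fires callbacks while reading, so the whole document never sits in memory as a DOM tree:

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.InputStream;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;

    public class SaxCount {
        // Stream the document and count start-element events; memory use stays
        // flat regardless of document size because no tree is built.
        public static int countElements(InputStream in) throws Exception {
            final int[] n = {0};
            DefaultHandler handler = new DefaultHandler() {
                @Override
                public void startElement(String uri, String local, String qName, Attributes atts) {
                    n[0]++;  // react to each element as it streams past
                }
            };
            SAXParserFactory.newInstance().newSAXParser().parse(in, handler);
            return n[0];
        }

        public static void main(String[] args) throws Exception {
            String xml = "<root><a/><b/></root>";
            System.out.println(countElements(new ByteArrayInputStream(xml.getBytes("UTF-8"))));
        }
    }
    ```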

  • Working with large Artboards/Files in Illustrator

    Hello all!
    I'm currently designing a full size film poster for a client. The dimensions of the poster are 27" x 40" (industry standard film poster).
    I am a little uncertain in working with large files in Illustrator, so I have several problems that have come up in several design projects using similar large formats.
    The file size is MASSIVE. This poster uses several large, high-res images that I've embedded. I didn't want them to pixelate, so I made sure they were high quality. After embedding all these images, along with the vector graphics, the entire .ai file is 500MB. How can I reduce this file size? Can I do something with the images to make the .ai file smaller?
    I made my artboard 27" x 40" - the final size of the poster. Is this standard practice? Or when designing for a large print format, are you supposed to make a smaller, more manageable artboard size, and then scale up after to avoid these massive file sizes?
    I need to upload my support files for the project, including .ai and .eps - so it won't work if they're 500MB. This would be good info to understand for all projects I think.
    Any help with this would be appreciated. I can't seem to find any coherent information pertaining to this problem that seems to address my particular issues. Thank you very much!
    Asher

    Hi Asher,
    It's probably those high-res images you've embedded. Firstly, be sure your images are only as large as you need them. Secondly, a solution would be to use linked images while you're working instead of embedding them into the file.
    Here is a link to a forum with a lot of great discussion about this issue, to get you started: http://www.cartotalk.com/lofiversion/index.php?t126.html
    And another: http://www.graphicdesignforum.com/forum/archive/index.php/t-1907.html
    Here is a great list of tips that someone in the above forum gave:
    -Properly scale files.  Do not take a 6x6" file and then use the scaling tool to make it 2x2".  Instead, scale it in Photoshop to 2x2 and reimport it.  Make a rule: over 20%, bring it back into Photoshop for rescaling.
    -Check resolutions.  600 dpi is not going to be necessary for such-and-such printer.
    -Delete unused art.  Sloppy artists may leave old unused images under another image.  The old one is not being used, but it still takes up space, unnecessarily inflating your file.
    -Choose to link instead of embed.  This is your choice.  Either way you still have to send a large file, but many times linking is less total MB than embedding.  Also, linking works well with duplicated images: that way multiple uses link to one original, whereas embedding would make copies.
    -When you are done, use compression software like ZIP or SIT (StuffIt):
    http://www.maczipit.com/
    Compression can reduce file sizes a lot, depending on the files.
    This business deals with a lot of large files.  Generally people use FTP to send large files, or plain old CD.  Another option is segmented compression.  Something like WinRAR/MacRAR or DropSegment (piece of StuffIt Deluxe) compresses files, then breaks them up into smaller, manageable pieces.  This way you can break up a 50MB file into, say, 10 x 5MB pieces and send them 5MB at a time.
    http://www.rarlab.com/download.htm
    *Make sure your client knows how to uncompress those files.  You may want to link them to the site to download the software.
    Good luck!

  • Software available for working with large video files?

    Hello,
    I'm working in PP CS6. I was wondering if there are any workarounds or 3rd party plugins/software that
    make working with really large video files easier and faster?
    Thanks.
    Mark

    Hi Jeff,
    Thanks for helping. This is the first time I shot video with my Nikon D5200. It was only a 3 minute test clip
    set at the highest resolution, 1920x1080-60i. I saw the red line above the clip in PP CS6 and hit the enter
    key to render the clip.
    It took almost 18 minutes or so to render the clip. This is probably normal but I was wondering if there is
    a way to reduce the file size so it doesn't take quite as long to render. I just remember a few years back
    that when the Red camera was out, guys were working with really huge files and there was a program
    from Cine something that they used to reduce the file size and make it more manageable when editing.
    I could be mistaken. I've been out of the editing loop for a few years and am just getting back into it.
    Thanks.
    Mark
    Here's my PC's components list you asked for:
    VisionDAW 4U 8-Core Xeon Workstation
      2 Intel QUAD-Core Xeon 5365-3.0GHz, 8MB, 1333MHz Processors
      16GB 667MHz Fully Buffered Server Memory Modules (2x2GB)
      Microsoft® Windows® Windows 7 Ultimate (x64)
      WDC 250GB, Ultra ATA100, 7200 rpm, 8MB Buffer Main OS HD
      2 WDC 750GB, SATA II, 7200 RPM, 16MB Buffer HD (RAID 0)
      2 WDC 750GB, SATA II, 7200 rpm, 16MB Buffer HD (Samples)
      2 WDC 1TB Enterprise Class, SATA II, 7200 RPM, 32MB Buffer Hard Drive
      MOTU 24 I/O (main) / MOTU 2408mk3 (slave)
      Plextor PX-800A 18X Dbl. Layer DVD+/-RW Optical Drive
      Buffalo Blu-ray Drive (External) BR-816SU2
      Front Panel USB Access
      Integrated FireWire (1394a) interface
      Thermaltake Toughpower 850W Power Supply
      3xUAD1 Universal Audio Cards
      NVIDIA QUADRO FX 1800 / Memory 768 MB GDDR3
      CUDA Parallel Processor Cores / 64
      Maximum Display Resolution Digital @60Hz = 2560x1600
      Memory Interface 192Bit
      Memory Bandwidth (GB/sec) / 38.4 GB/sec
      PCI-Express, DUAL-Link DVI 1
      Digital Outputs 3 (2 out of 3 active at a time)
      Dual 25.5" Samsung 2693HM LCD HD Monitors

  • Advice for working with large clip files

    A few years ago I made a movie using iMovie 2. At the time I was working with clips recorded on one miniDV disc. I am now ready to begin a project (just bought iLife '06) that is much larger. I have roughly ten 60-minute miniDV discs. I am getting nervous about the amount of memory needed to import and work with this many clips.
    Is it possible to import the clips to my external fire wire HD? Can I still work in iMovie while the project lives on the ext firewire drive? Can anyone tell me roughly how much memory is needed for larger projects like this? (I expect the final project will be 30-40 minutes).
    Since the footage all comes from our European concert tour, it easily divides into 3 separate sections - would it be easier to create 3 different iMovie projects? Is it possible to combine them in the end to create one film?
    Thank you so much for your help with this.
    Sincerely,
    Bob Linscott

    Is it possible to import the clips to my external
    fire wire HD? Can I still work in iMovie while the
    project lives on the ext firewire drive?
    Should be fine. I've been editing 4 hours worth of footage down to about 50 minutes, on a project stored on a FireWire 800 drive. FireWire 400 (if that's what you're using) is half the speed, but should still be fine, I think.
    So, create a new project, save it to your external drive, and then import your footage into the project.
    Can anyone
    tell me roughly how much memory is needed for larger
    projects like this? (I expect the final project will
    be 30-40 minutes).
    If you mean hard drive space for the project, I think you're talking about 11 GB per hour: so, in short, a lot. I think you could import your footage, edit a bit, and empty your trash to free up space, but iMovie does seem to keep your originals fairly relentlessly, so I'm not entirely sure.
    Watch out for extracting audio from clips too: I've been doing this in order to do cross-fades, but it increases the size of your movie. I ended up buying a second, bus-powered 80 GB external hard drive just to hold the iMovie project
    Since the footage all comes from our European concert
    tour, it easily divides into 3 separate sections -
    would it be easier to create 3 different iMovie
    projects? Is it possible to combine them in the end
    to create one film?
    This could well be a good idea. Once your project gets over a certain length, you may experience the "herky jerky" playback issue (search for "herky jerky" on these forums), making editing difficult. So three separate projects might be easier.
    To combine them all at the end, you'll want to export the 2nd and 3rd projects as full quality DV (Share > QuickTime > Compress Movie For: Full Quality), then import them (File > Import) into your first project, and add them to the timeline.

  • What does the black screen with large exclamation point mean?

    I'm trying to edit, or even just select a photo in iPhoto, but can't get it to come up - just the black screen with a large exclamation point. What does that mean, and how do I get back control of my photos?

    It means your database is corrupted and that iPhoto has lost the link between the thumbnail in the iPhoto Window and the actual file.
    Option 1
    Back up and try rebuilding the library: hold down the Command and Option (or Alt) keys while launching iPhoto. Use the resulting dialogue to rebuild. Choose to Rebuild iPhoto Library Database from automatic backup.
    If that fails:
    Option 2
    Download iPhoto Library Manager and use its rebuild function. This will create a new library based on data in the albumdata.xml file. Not everything will be brought over - no slideshows, books or calendars, for instance - but it should get all your albums and keywords, faces and places back.
    Because this process creates an entirely new library and leaves your old one untouched, it is non-destructive, and if you're not happy with the results you can simply return to your old one.
    Regards
    TD

  • TransformerHandler throws OutOfMemoryError with large xml files

    I'm using TransformerHandler to convert content to SAX events and transform it using XSLT into an XML file.
    The problem is that for a large amount of content I get an OutOfMemoryError.
    It seems that the content is kept in memory and only flushed when I call handler.endDocument();
    I tried using auto-flush writers as the Result, and calling the flush() method myself, but nothing helped.
    Here is the example - please help!
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.sax.SAXTransformerFactory;
    import javax.xml.transform.sax.TransformerHandler;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;
    import org.xml.sax.helpers.AttributesImpl;
    public class Test {
         /**
          * Test handler memory usage.
          * @param loops no. of loops - when large enough - OutOfMemoryError!
          * @param xsltFilePath xslt file
          * @param targetXmlFile output xml file
          * @throws Exception
          */
         public static void testHandlerMemUsage(int loops, String xsltFilePath, String targetXmlFile) throws Exception {
              // verify SAX support
              TransformerFactory factory = TransformerFactory.newInstance();
              if (!factory.getFeature(SAXTransformerFactory.FEATURE))
                   throw new UnsupportedOperationException("SAX transformations not supported");
              TransformerHandler handler =
                   ((SAXTransformerFactory) factory).newTransformerHandler(new StreamSource(xsltFilePath));
              handler.setResult(new StreamResult(targetXmlFile));
              handler.startDocument();
              handler.startElement(null, "root", "root", new AttributesImpl());
              // loop
              for (int i = 0; i < loops; i++) {
                   handler.startElement(null, "el-" + i, "el-" + i, new AttributesImpl());
                   handler.characters("value".toCharArray(), 0, "value".length());
                   handler.endElement(null, "el-" + i, "el-" + i);
              }
              handler.endElement(null, "root", "root");
              //System.out.println("end document");
              // only after endDocument() does output start to appear..
              handler.endDocument();
              //System.out.println("ended document");
         }
         public static void main(String[] args) throws Exception {
              System.out.println("--starting..");
              testHandlerMemUsage(500000, "/copy.xslt", "/testHandlerMemUsage.xml");
              System.out.println("--we are still here -- increase loops..");
         }
    }

    Did you try increasing memory when starting Java with the -Xmx parameter? Java uses only 64MB by default, so you might need to increase it to e.g. 256MB for your XML to work.

  • Working with local xml file

    Hi, I would like to work with a local XML file, but I don't want to hard-code a specific location, as I would like my application to run on a few different machines.
    Is there a special place within Netbeans project structure that I can place such files?
    Say if I try and load a file "somefile.xml" - where is it going to look first?
    I have a <default package> with my settings.xml in there. How can I reference this within my app?
    Edited by: 993541 on Mar 16, 2013 9:25 AM

    993541 wrote:
    Is there a special place within Netbeans project structure that I can place such files? I don't know the NetBeans project structure at all, but I'd suggest the Maven project structure, which is widely used: http://maven.apache.org/guides/introduction/introduction-to-the-standard-directory-layout.html
    There you would place such a file in the <tt>main/resources</tt> folder.
    Say if I try and load a file "somefile.xml" - where is it going to look first? It's going to look for it wherever you tell your program to.
    When created as <tt>new File("somefile.xml")</tt> it will be searched for in the <i>current working directory</i>. The problem with that is that it is unreliable what this will be at runtime (once your program leaves your IDE...).
    You should instead get it via <tt>getClass().getResource("somefile.xml")</tt>, but in this case the file must be present on the classpath in the same package as the class acquiring it. Adding a <tt>'/'</tt> in front of the file name expects it in the root directory of a classpath entry.
    I have a <default package> with my settings.xml in there. How can I reference this within my app? <tt>getClass().getResource("/settings.xml")</tt>
    But in case you include it in the delivery jar file it will not be writable. Also, you cannot expect the installation folder of your app to be writable. On startup your program should copy its (default) settings to a writable place like <tt>new File(System.getProperty("user.home"), ".myApp/settings.xml")</tt> and modify it there.
    bye
    TPD
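    The copy-on-first-run idea above can be sketched like this (the Settings class, the .myApp folder, and the built-in default are all illustrative; it seeds a writable settings file from the bundled /settings.xml classpath resource when one is present):

    ```java
    import java.io.File;
    import java.io.InputStream;
    import java.nio.file.Files;

    public class Settings {
        static final String DEFAULT_XML = "<settings/>";  // fallback if no bundled copy

        // Return a writable settings file under the given home directory,
        // seeding it on first run from the classpath resource /settings.xml
        // (e.g. packed in the delivery jar) or from a built-in default.
        public static File ensureSettings(File home) throws Exception {
            File target = new File(home, ".myApp/settings.xml");
            if (!target.exists()) {
                target.getParentFile().mkdirs();
                try (InputStream in = Settings.class.getResourceAsStream("/settings.xml")) {
                    if (in != null) {
                        Files.copy(in, target.toPath());
                    } else {
                        Files.write(target.toPath(), DEFAULT_XML.getBytes("UTF-8"));
                    }
                }
            }
            return target;
        }

        public static void main(String[] args) throws Exception {
            File home = Files.createTempDirectory("demo-home").toFile();
            System.out.println(ensureSettings(home).exists());
        }
    }
    ```

    In a real app you would pass <tt>new File(System.getProperty("user.home"))</tt> as the home directory.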

  • Speeding up a dualcore G5 to work with large Photoshop files

    Hi
    I have a 2.3Ghz dualcore G5 (the one that came out late 2005).
    I'm working on a bunch of large format film scans (500Mb +) in Photoshop CS2, and I'm trying to speed things up.
    This last week I've installed two upgrades that have helped get things moving - first I upgraded the RAM from 4.5Gb to 12 Gb, then I installed a second hard drive (Seagate 7200.11 500Gb, with jumper in place) to use as a dedicated scratch disk.
    Both upgrades have given a significant speed boost, but I'm now wondering what else I can do????
    I want to speed up the time that it takes to open and save these large scans.
    My first thought was to buy a second Seagate 500Gb drive as a replacement for the original 250Gb WD drive. I would then have two 500Gb internal drives that I could configure as a RAID 0 array with disk utility. I would then clone my original 250Gb onto the RAID 0 drives with Super Duper.
    Wouldn't such a set-up double the speed of opening and saving large Photoshop files?
    I realise that with RAID 0 there is an increased chance of data loss from disk failure (double the chance?), but if I back up daily to a 1Tb external that should be ok, no?
    Or should my next move to be to utilise the PCI-E slots (which I don't really understand)????
    Thanks for any advice you can offer.
    Richard

    In my G5 Quad, I find the fastest Photoshop performance overall- especially with large-file open and saves, occurs when the setup is as follows:
    Startup disk is a 2 x 150G Raptor RAID0 which contains the system, apps, and all user files- including the image file being worked on.
    PS scratch is then pointed either to a third fast drive or to an external RAID0 array. This setup is substantially faster than:
    ....the safer method of using one Raptor for system and apps, and the other for user and image files, with a third drive for PS scratch.
    With a really fast large scratch disk, you can sometimes put the image file on the same volume as the scratch file, open and save it from there, and you'll get a bit of an additional boost so long as the scratch disk is faster than the startup disk.
    For CS3 (I believe it works for CS2 as well), a performance plugin called DisableScratchCompress.plugin is available which can also speed things up with a fast scratch disk. I found that using this plugin speeded things up but only if the image file was opened/saved from the first config I mentioned; it was slower if placed on the scratch disk while using the DSC plugin.
    More here: Photoshop Acceleration Basics
    Of course if you stripe a disk with data, be sure to frequently back it up..:)

  • Does the timecapsule work with mac OS X 10. 4. 11?

    Does it work with 10.4.11, or only with Leopard? If it does work with 10.4.11, what features does Leopard have that 10.4.11 does not?

    Not exactly. The USB cable is only to connect printers and HDDs (or a combination of both using a USB hub). You will connect either to the network that the Time Capsule creates, or to the network that it is a part of (if you have it set to join an existing network) via ethernet or wifi. Then you should be able to see and access the HDD on the TC as a network attached storage drive (NAS). This means you can mount the TC on your desktop and use it as a regular HDD. It's at this point that you can copy the files over to it, or use a program such as SuperDuper and tell it to use the TC's HDD as its target.

  • Why does BTAHL7 Only works with Tutorial HL7 File?

    I have been trying for days to determine the issue.
    I've only been able to get the sample file in the end-to-end tutorial to work.  Every other HL7 file I try to parse, no matter the schema, doesn't work.  They all say "Data at the root level is invalid. Line 1, position 1.", and
    the file just goes through to the send pipe as it came in.  If I change to BTAHL72XPipelines I get "Reason: Message had parse errors, can not be serialized", even though the schemas all exist in the application as they are supposed to.  All
    I want to do is pass them through, but I can't.  And the ACK/NACK statements all come through without errors and provide back to me the appropriate MSH with MSA = CA.
    I found a valid HL7 file online and tried it with the appropriate schema, and even that doesn't work.
    MSH|^~\&|EPIC|EPICADT|SMS|SMSADT|199912271408|CHARRIS|ADT^A04|1817457|D|2.5|
    PID||0493575^^^2^ID 1|454721||DOE^JOHN^^^^|DOE^JOHN^^^^|19480203|M||B|254 MYSTREET AVE^^MYTOWN^OH^44123^USA||(216)123-4567|||M|NON|400003403~1129086|
    NK1||ROE^MARIE^^^^|SPO||(216)123-4567||EC|||||||||||||||||||||||||||
    PV1||O|168 ~219~C~PMA^^^^^^^^^||||277^ALLEN MYLASTNAME^BONNIE^^^^|||||||||| ||2688684|||||||||||||||||||||||||199912271408||||||002376853
    Any thoughts?

    Yes, I get various events.
    Event ID 4136 on the BizTalk Accelerator for HL7, and Event IDs 5720 & 5754 on BizTalk Server, all three saying the message had parse errors and cannot be serialized.    Also on BizTalk Accelerator for HL7 I get 8706 saying
    Unable to log message "Error happened in body during parsi..." to the event log due to exception: 
    "No connection could be made because the target machine actively refused it 127.0.0.1:4000".
    Can anyone get the above HL7 file to parse through?
    UPDATE:
    Okay, so the above error led me to the link below, which says to turn on the HL7 logging service; then I received the following errors.
    http://www.biztalkgurus.com/biztalk_server/biztalk_blogs/b/biztalksyn/archive/2009/07/02/no-connection-could-be-made-because-the-target-machine-actively-refused-it.aspx
    Error happened in body during parsing 
    Error # 1
    Alternate Error Number: 301
    Alternate Error Description: The message has an invalid first segment
    Alternate Encoding System: HL7-BTA
    Error # 2
    Segment Id: PID
    Sequence Number: 1
    Field Number: 5
    Error Number: 207
    Error Description: Application internal error
    Encoding System: HL79999
    Alternate Error Number: Z100
    Alternate Error Description: Trailing delimiter found
    Alternate Encoding System: HL7-BTA
    Error # 3
    Segment Id: PID
    Sequence Number: 1
    Field Number: 6
    Error Number: 207
    Error Description: Application internal error
    Encoding System: HL79999
    Alternate Error Number: Z100
    Alternate Error Description: Trailing delimiter found
    Alternate Encoding System: HL7-BTA
    Error # 4
    Segment Id: PID
    Sequence Number: 1
    Field Number: 18
    Error Number: 102
    Error Description: Data type error
    Encoding System: HL79999
    Error # 5
    Segment Id: PID
    Sequence Number: 1
    Field Number: 19
    Error Number: 207
    Error Description: Application internal error
    Encoding System: HL79999
    Alternate Error Number: Z100
    Alternate Error Description: Trailing delimiter found
    Alternate Encoding System: HL7-BTA
    Error # 6
    Segment Id: NK1
    Sequence Number: 1
    Field Number: 1
    Error Number: 101
    Error Description: Required field is missing
    Encoding System: HL79999
    Error # 7
    Segment Id: NK1
    Sequence Number: 1
    Field Number: 2
    Error Number: 207
    Error Description: Application internal error
    Encoding System: HL79999
    Alternate Error Number: Z100
    Alternate Error Description: Trailing delimiter found
    Alternate Encoding System: HL7-BTA
    Error # 8
    Segment Id: NK1
    Sequence Number: 1
    Field Number: 34
    Error Number: 207
    Error Description: Application internal error
    Encoding System: HL79999
    Alternate Error Number: Z100
    Alternate Error Description: Trailing delimiter found
    Alternate Encoding System: HL7-BTA
    Error # 9
    Segment Id: PV1
    Sequence Number: 1
    Field Number: 3
    Error Number: 102
    Error Description: Data type error
    Encoding System: HL79999
    Error # 10
    Segment Id: PV1
    Sequence Number: 1
    Field Number: 3
    Error Number: 207
    Error Description: Application internal error
    Encoding System: HL79999
    Alternate Error Number: Z100
    Alternate Error Description: Trailing delimiter found
    Alternate Encoding System: HL7-BTA
    Error # 11
    Segment Id: PV1
    Sequence Number: 1
    Field Number: 7
    Error Number: 207
    Error Description: Application internal error
    Encoding System: HL79999
    Alternate Error Number: Z100
    Alternate Error Description: Trailing delimiter found
    Alternate Encoding System: HL7-BTA
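    Most of the errors above are the same Z100 "Trailing delimiter found" complaint: fields like PID-5 (DOE^JOHN^^^^) end in a run of empty components. A hypothetical pre-processing helper (not part of BTAHL7) that strips such trailing delimiters from a field before the message enters the pipeline could look like this:

    ```java
    public class Hl7Trim {
        // Drop trailing occurrences of a delimiter from one HL7 field,
        // e.g. "DOE^JOHN^^^^" -> "DOE^JOHN", which is what the parser's
        // Z100 "Trailing delimiter found" errors object to.
        public static String stripTrailing(String field, char delim) {
            int end = field.length();
            while (end > 0 && field.charAt(end - 1) == delim) {
                end--;
            }
            return field.substring(0, end);
        }

        public static void main(String[] args) {
            System.out.println(stripTrailing("DOE^JOHN^^^^", '^')); // DOE^JOHN
        }
    }
    ```

    The same routine would be applied per field with the '^' component delimiter, and per segment with the '|' field delimiter, before handing the message to the pipeline.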
