Huge file processing in PI

Hi Experts,
1. I have seen blogs which explain how to process huge files for File and SFTP:
SFTP Adapter - Handling Large File
File/FTP Adapter - Large File Transfer (Chunk Mode)
Here we also have the constraint that we cannot do any mapping; it has to be EOIO QoS.
Would it be possible to process a 1 GB file and do mapping? Which hardware factor determines whether the system is capable of processing a large file with mapping?
Is it the number of CPUs, the application servers (Java and ABAP), the number of server nodes, or the Java heap size?
If my system is able to process a 10 MB file with mapping, there must be something that determines this capability.
This kind of huge file processing will only fit certain scenarios. For example, a proxy-to-SOAP scenario with a 1 GB message exchange does not make sense; I have no idea whether any web service would handle such a huge file.
2. Suppose PI is able to process a 50 MB message with mapping. What options do we have in PI to increase the performance?
I have come across these two points many times during the design phase of my project and am looking for your suggestions.
Thanks.

Hi Ram,
You have not mentioned what sort of integration it is; you just mentioned FILE, so I presume it is a File-to-File scenario. In that case, on PI 7.11 I was able to process a 100 MB file (more than 1 million records) with mapping (the file is a delta extract in SAP ECC AL11). In the sender file adapter I chose recordsets per message and processed the messages in small pieces. Please note this is not the actual standard chunk mode: the initial run of the sender adapter loads the 100 MB file into memory, and after that messages are sent to the Integration Engine based on the recordsets-per-message setting. If the file is larger than 100 MB, PI Java starts bouncing because of memory issues. We later redesigned the interface as an asynchronous proxy-to-file scenario, where the proxy sends the messages to PI in chunks; in a single run it sends 5,000 messages.
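Roughly, the batching idea looks like this in plain Java (not SAP-specific; the 5,000-record batch size, the file name, and the send() helper are only placeholders for illustration): read the file line by line and hand off fixed-size batches instead of loading everything at once.
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class BatchReader {
    private static final int BATCH_SIZE = 5000;   // illustrative batch size

    public static void main(String[] args) throws IOException {
        List<String> batch = new ArrayList<>(BATCH_SIZE);
        try (BufferedReader reader = Files.newBufferedReader(Paths.get("delta_extract.txt"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                batch.add(line);
                if (batch.size() == BATCH_SIZE) {
                    send(batch);        // in PI terms, one message per recordset batch
                    batch.clear();
                }
            }
        }
        if (!batch.isEmpty()) {
            send(batch);                // flush the last partial batch
        }
    }

    private static void send(List<String> records) {
        // Placeholder: here one message/recordset would be built from the records.
        System.out.println("sending batch of " + records.size() + " records");
    }
}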
For PI 7.11 I believe there is a memory limitation per cluster node: each cluster node cannot have more than 5 GB, and processing again depends on the number of Java application servers. I think this is no longer a limitation from PI 7.30 onwards, where we can use 16 GB of memory per cluster node.
> This kind of huge file processing will only fit certain scenarios. For example, a proxy-to-SOAP scenario with a 1 GB message exchange does not make sense; I have no idea whether any web service would handle such a huge file.
If I understand this correctly: for asynchronous communication, 1 GB of data can definitely be sent to a web service, but the messages from the proxy should be sent to PI in batches. The same idea may work for synchronous communication as well, although timeouts on the receiver channel will be the next issue. Increasing timeouts globally is not best practice, but if you are on 7.30 or a later version you can increase timeouts specifically for your scenario.
To handle a 50 MB file, make sure you have additional Java application servers. I don't remember exactly how many application servers we had in my case to handle the 100 MB file.
Thanks

Similar Messages

  • How to load huge size file in database

    Hi,
    I have a requirement to load files into the database and download them as well. Can you tell me the best method to do this in PL/SQL?
    Thanks
    Sudhir

    The file upload will be a tad ugly.
    You need to create an upload form/page in Apex that allows the user to upload the file. The Apache location configuration for that Database Access Descriptor (DAD) will include a parameter called PlsqlDocumentTablename - and this parameter specifies the name of the Oracle (or Apex) table to use for uploads.
    The default document table is WWV_FLOW_FILE_OBJECTS$.
    The 150 MB file will thus be a LOB in a row in the document table.
    You can add a post-submission PL/SQL process to that Apex upload page to process the uploaded document. Because of the size of the uploaded file, I suggest that you start a batch process (via the DBMS_JOB or DBMS_SCHEDULER interfaces) to process the uploaded file.
    Okay, now for the processing. This is the ugly part: the file is a LOB column in the database, but you want to use it with SQL*Loader or as an external table. You could write your own CSV parser in PL/SQL, but I do not recommend that approach for large files - SQL*Loader and external tables are a lot more flexible. I have not yet seen a way to define an external table on a CLOB, so this means writing code using UTL_FILE to unload the 150 MB LOB contents to an actual external file. Once that is done, you can use an external table on that external file to load it.
    As for sending files to the client via Apex: this is done using the WPG_DOCLOAD.download_file() call to stream the LOB to the client browser. E.g. a basic example:
    --// URL example: http://my-server.my-domain.com/MyDAD/MySchemaName.StreamFile?fileID=1234
    create or replace procedure StreamFile( fileID number ) AUTHID DEFINER is
            mimeType        varchar2(48);
            fileName        varchar2(400);
            lobContent      BLOB;
    begin
            --// read the LOB from a table (or dynamically create it)
            select
                    f.filename,
                    f.mime_type,
                    f.blob_content
                            into
                    fileName,
                    mimeType,
                    lobContent
            from    FLOWS_FILES.WWV_FLOW_FILE_OBJECTS$ f
            where   f.id = fileID;
            --// format a basic HTTP header that describes the file stream send
            OWA_UTIL.mime_header( mimeType, FALSE );        -- e.g. text/csv text/plain text/html image/gif
            HTP.p( 'Content-Disposition: attachment; filename='||fileName );
            HTP.p( 'Content-Length: ' || DBMS_LOB.GetLength(lobContent) );
            OWA_UTIL.http_header_close;
            --// now write the BLOB as a mime stream using the Web Procedural Gateway's
            --// doc load API
            WPG_DOCLOAD.download_file( lobContent );
    exception when OTHERS then
            --// Decide what HTML to generate (if at all) if there is a failure
            HTP.prn( 'StreamFile() failed with '||SQLERRM(SQLCODE) );
    end;
    /

  • JDOM parser fails to parse Huge Size Xml Files ??? Unable to Parse ???

    Hi All,
    When I transform or parse XML files of huge size, I get a java.lang.OutOfMemoryError for all the huge XML files. It works fine for small files.
    I have 2 GB of RAM in my system. I have also set the heap size for the JVM:
    -Xms512M -Xmx1800M (or) -Xms512M -Xmx1500M
    I would like to know: what are the drawbacks of the JDOM parser?
    What is the maximum size of XML that JDOM can parse?
    I read a long time ago that parsers have certain limitations. Can anybody tell me what the limitations of the parsers are?
    Please help. I'm currently using jdom.jar to parse the XML files.
    Thanks,
    J.Kathir

    > Hi All, when I transform or parse XML files of huge size, I get a java.lang.OutOfMemoryError for all the huge XML files. It works fine for small files.
    Of course it does.
    > I have 2 GB of RAM in my system. I have also set the heap size for the JVM: -Xms512M -Xmx1800M (or) -Xms512M -Xmx1500M
    You can't always get what you want. Your JVM is competing with all the other processes on your machine, including the OS, for RAM.
    > I would like to know: what are the drawbacks of the JDOM parser?
    You just discovered it: all DOM parsers create an in-memory representation of the entire document. A SAX parser does not.
    > What is the maximum size of XML that JDOM can parse?
    The maximum amount of memory that you have available.
    > I read that parsers have certain limitations. Can anybody tell me what they are?
    See above.
    > Please help. I'm currently using jdom.jar to parse the XML files.
    You can try a SAX parser or a streaming parser. Maybe those can help.
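    For example, a minimal SAX sketch (the file name and the "item" element name are only illustrative) that counts elements without ever building the document tree in memory:
    import java.io.File;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;

    public class CountItems {
        public static void main(String[] args) throws Exception {
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            // The handler sees one event at a time; the document is never held in memory as a tree.
            parser.parse(new File("huge.xml"), new DefaultHandler() {
                private int count;
                @Override
                public void startElement(String uri, String localName, String qName, Attributes atts) {
                    if ("item".equals(qName)) {   // "item" is an illustrative element name
                        count++;
                    }
                }
                @Override
                public void endDocument() {
                    System.out.println("items seen: " + count);
                }
            });
        }
    }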

  • Huge HD file sizes in iMovie when importing AVCHD files from HD Camcorder

    Although several answers have been given on different ways to import HD files from an AVCHD camcorder, in this case a Sony one, the issue is what to do with the HUGE files it creates. I recorded 2 hours of HD content in about 30 GB on the camcorder. I can burn those native AVCHD files to DVDs (it took 6 of them) and play them on a Blu-ray player in HD. To test editing, I took a sample 50 MB file that was about 3 minutes in length. Converted to a QuickTime file at the same 1920x1080 quality, it became about 800 MB; it looked great, but how do I store or play a 20-minute version of it? iMovie is inadequate for anything other than producing smaller files for an iPod or at lower quality, unless the movie is going to be very short if done in true HD quality. One of the reasons I took the Mac plunge was the better multimedia capabilities. In this case, Apple (iMovie and Final Cut) is behind.
    I ended up shifting my HD editing back to a PC with $100 Sony Vegas software that can edit the AVCHD files and author in that format. I can produce a DVD with about 25 minutes of edited HD in AVCHD format and play it. I can also store the finished product in much less space using AVCHD files. I have a 1 TB drive on my Mac that would be consumed if I worked in iMovie, converting all these AVCHD files and keeping them in HD quality, since it looks like it needs to expand them to 12-16 times the AVCHD file size.
    Yes, I know I could lower the quality to work with the files more easily in iMovie, but that makes no sense to me. I want to record and edit in the HD format. These AVCHD cameras seem to be becoming a standard since they are so efficient at storing HD content, so iMovie needs to figure out how to edit and store them the same way. I am very disappointed in how iMovie handles HD; to me it is NOT very good at it, with no good way to store or distribute the result.
    Am I missing something here? I have spent hours looking for a different answer, but have not found one yet. Maybe the next version of iMovie will be better. I like using it for SD DVD authoring, but for HD it did not work at all for me.
    Jon

    Interesting that you say it is not meant for editing, when there are about half a dozen editors for Windows that work on these exact files. I know that with Sony Vegas I have the same level of control, with frame precision, as I do with iMovie, so I think there must be more to the story of why Apple has not been able to support it like other software makers on Windows have. Don't get me wrong, I would love to do the work in iMovie and not have to learn something new. I don't have the storage space to use Apple's HD format: those 2 hours of HD in AVCHD would turn into about 500 GB, and then what do I put it on to play it? If the Blu-ray standard calls for 25 or 50 GB discs and we can get full two-hour movies on them today, then they must be using a file format similar to AVCHD to fit on a disc; they cannot be using anything close to what Apple uses for HD files - it is just simple logic and math. So why doesn't Apple figure it out before everyone who starts working in HD comes to the same conclusion: that the Apple QT file format for HD is going to be an issue with space - you can create content, but cannot afford the disc space to store or play it.

  • Need help -To Restrict Huge temp file, which grows around 3 GB in OBIEE 11g

    Hi Team,
    I am working on OBIEE 11.1.1.5 for a client-specific BI application. We have an issue concerning massive space consumption in the Linux environment where OBIEE 11g is installed whenever we try to run certain detail-level drill-down reports. While investigating, we found that whenever a user runs a drill-down report, a temp file named nQS_xxxx_x_xxxxxx.TMP is created and keeps growing in size under the folder structure below:
    *<OBIEE_HOME>/instances/instance1/tmp/OracleBIPresentationServicesComponent/coreapplication_obips1/obis_temp/*
    This temp file grows as large as around 3 GB and is erased automatically when the drill-down report output is displayed in the UI. Hence, when multiple users simultaneously try to access these sorts of drill-down reports, the environment runs out of space.
    Regarding the drill down reports:
    * The drill-down report has around 55 columns, is configured to display only 25 rows on the screen, and allows the user to download the whole data set as Excel output.
    * The total rows fetched by the query range from 1,000 to above 100k. The temp file size grows with the rows fetched, i.e., if the query fetches around 4,000 rows, a temp file of around 60 MB is created and erased when the report output is generated on screen (similarly, for around 100k rows, the temp file grows up to 3 GB before it gets deleted automatically).
    * The report output has only one table view alongside the Title & Filters views. (No pivot table view is used to generate this report.)
    * The caches for the BI Server & BI Presentation Services are not configured or not enabled.
    My doubts or Questions:
    * Is there any way to control or configure this temp file generation in OBIEE 11g?
    * Why does the growing temp file get deleted automatically immediately after the report output is generated on screen? Is there any default server-specific setting governing this behaviour?
    * According to certain OBIEE 10g articles, temp file generation is quite normal for large pivot-table-based reports because of the huge in-memory calculations involved. However, we have used only a table view in the output and it still creates huge temp files. Is this behaviour normal in OBIEE 11g? If not, can anyone please suggest any specific settings to avoid generating these huge files, or at least to generate a compressed temp file?
    * Is any other workaround available for generating a report of this type without generating temp files in the environment?
    Any help/suggestions/pointers or document references in this regard will be much appreciated. Please advise.
    Thanks & Regards,
    Guhan

    Hello Guhan,
    The temp files are used to prepare the final result set for OBI Presentation Server processing, so as long as your dataset is big the temp files will also be big. You can only avoid this by reducing your dataset, for example by filtering your report.
    You can also control the size of your temp files by reducing the work done by the BI Server. By this I mean: if you are using functions such as sorting that can be handled by your database, push them down to the DB.
    Once the report is finished, the BI Server automatically removes the temp files because they are no longer needed. You can see them as files used for internal calculations; once the calculations are done, the server gets rid of them.
    Hope this helps
    Adil

  • Reader 10.1 update fails, creates huge log files

    Last night I saw the little icon in the system tray saying an update to Adobe Reader was ready to be installed.
    I clicked it to allow the install.
    Things seemed to go OK (on my Windows XP Pro system), although very slowly, and it finally got to copying files.
    It seemed to still be doing something and was showing that it was copying file icudt40.dll.  It still displayed the same thing ten minutes later.
    I went to bed, and this morning it still showed that it was copying icudt40.dll.
    There is no "Cancel" button, so this morning I had to stop the install through Task Manager.
    Now, in my "Local Settings\TEMP" directory, I have a file called AdobeARM.log that is 2,350,686 KB in size and a file MSI38934.LOG that is 4,194,304 KB in size.
    They are so big I can't even look at them to see what's in them.  (Too big for Notepad.  When I tried to open the smaller log file, AdobeARM.log, with Wordpad it was taking forever and showing only 1% loaded, so after five minutes, I terminated the Wordpad process so I could actually do something useful with my computer.)
    You would think the installer would be smart enough to stop at some point when the log files begin to get enormous.
    There doesn't seem to be much point to creating log files that are too big to be read.
    The update did manage to remove the Adobe Reader X that was working on my machine, so now I can no longer read PDF files.
    Maybe I should go back Adobe Reader 9.
    Reader X never worked very well.
    Sometimes the menu bar showed up, sometimes it didn't.
    PDF files at the physics e-print archive always loaded with page 2 displayed first.  And if you forgot to disable the look-ahead capability, you could get banned from the e-print archive site altogether.
    And I liked the user interface for the search function a lot better in version 9 anyway.  Who wants to have to pop up a little box for your search phrase when you want to search?  Searching is about the most important and routine activity one does, other than going from page to page and setting the zoom.

    Hi Ankit,
    Thank you for your e-mail.
    Yesterday afternoon I deleted the > 2 GB AdobeARM.log file and the > 4.194 GB
    MSI38934.LOG file.
    So I can't upload them.  I expect I would have had a hard time doing so
    anyway.
    It would be nice if the install program checked the size of the log files
    before writing to them and gave up if the size was, say, three times larger
    than some maximum expected size.
    The install program must have some section that permits infinite retries or
    some other way of getting into an endless loop.  So another solution would be
    to count the number of retries and terminate after some reasonable number of
    attempts.
    Something had clearly gone wrong and there was no way to stop it, except by
    going into the Task Manager and terminating the process.
    If the install program can't terminate when the log files get too big, or if
    it can't get out of a loop some other way, there might at least be a "Cancel"
    button so the poor user has an obvious way of stopping the process.
    As it was, the install program kept on writing to the log files all night
    long.
    Immediately after deleting the two huge log files, I downloaded and installed
    Adobe Reader 10.1 manually.
    I was going to turn off Norton 360 during the install and expected there
    would be some user input requested between the download and the install, but
    there wasn't.
    The window showed that the process was going automatically from download to
    install. 
    When I noticed that it was installing, I did temporarily disable Norton 360
    while the install continued.
    The manual install went OK.
    I don't know if temporarily disabling Norton 360 was what made the difference
    or not.
    I was happy to see that Reader 10.1 had kept my previous preference settings.
    By the way, one of the default settings in "Web Browser Options" can be a
    problem.
    I think it is the "Allow speculative downloading in the background" setting.
    When I upgraded from Reader 9 to Reader 10.0.x in April, I ran into a
    problem. 
    I routinely read the physics e-prints at arXiv.org (maintained by the Cornell
    University Library) and I got banned from the site because "speculative
    downloading in the background" was on.
    [One gets an "Access denied" HTTP response after being banned.]
    I think the default value for "speculative downloading" should be unchecked
    and users should be warned that one can lose the ability to access some sites
    by turning it on.
    I had to figure out why I was automatically banned from arXiv.org, change my
    preference setting in Adobe Reader X, go to another machine and find out who
    to contact at arXiv.org [I couldn't find out from my machine, since I was
    banned], and then exchange e-mails with the site administrator to regain
    access to the physics e-print archive.
    The arXiv.org site has followed the standard for robot exclusion since 1994
    (http://arxiv.org/help/robots), and I certainly didn't intend to violate the
    rule against "rapid-fire requests," so it would be nice if the default
    settings for Adobe Reader didn't result in an unintentional violation.
    Richard Thomas

  • Large File Processing Problem

    Hi Group,
    I am facing a problem in XI while processing a 48 MB file through the file adapter; I have used content conversion in the design.
    I am using a normal 64-bit operating system with a maximum heap size of 2 GB, and I am still facing the problem. Can anybody tell me how much heap size is required to process a 48 MB file through XI?

    Hi,
    Refer to the following SAP Notes (for these, go to www.service.sap.com/notes):
    File adapter faq- 821267
    Java Heap - 862405
    for java settings- 722787
    These blogs may give some insight:
    /people/sap.user72/blog/2004/11/28/how-robust-is-sap-exchange-infrastructure-xi
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    By the way, if the error says that a trailer is missing, where exactly are you getting this error?
    Regards,
    Moorthy

  • Reading  huge xml files in OSB11gR1(11.1.1.6.0)

    Hi,
    I want to read a huge XML file of about 1 GB in size in OSB (11.1.1.6.0).
    I will be creating a (JCA) file adapter in JDeveloper and importing the artifacts into OSB.
    Please let me know the maximum file size that can be handled in OSB.
    Thanks in advance.
    Regards,
    Suresh

    Depends on what you intend to do after reading the file.
    Do you want to parse the file contents and may be do some transformation? Or do you just have to move the file from one place to another for ex. reading from local system and moving to a remote system using FTP?
    If you just have to move the file, I would suggest using JCA File/FTP adapter's Move operation.
    If you have to parse and process the file contents within OSB, then it may be possible depending on the file type and what logic you need to implement. For ex. for very large CSV files you can use JCA File Adapter batching to read a few records at a time.

  • How to break up a huge XML file and generate serialized JSP pages

    I have a huge xml file, with about 100 <item> nodes.
    I want to process this xml file with pagination and generate jsp pages.
    For example:
    Display items 0 to 9 on page1.jsp, 10 to 19 on page2.jsp, and so on...
    Is it possible to generate JSP pages like this?
    I have heard of Velocity but don't know if it is the right technology for this kind of job.

    Thank you for your reply. I looked at the display tag library and it looks pretty neat, with a lot of features; I could definitely use it in a different situation.
    The XML file size is about 1.35 MB, and the size is unpredictable; it could shrink or grow periodically depending on the number of items available.
    I was hoping to create a documentation-style set of static pages from the XML feed instead of having one JSP with dynamic pages.
    I was looking at Anakia: http://jakarta.apache.org/velocity/docs/anakia.html - maybe it has features that enable me to create static pages, but I am not very sure.
    I think for now I will transform the XML with an XSL file and pass the page numbers as input parameters to the XSL file.
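    Something along these lines with plain JAXP might work (the file names and the pageNumber/itemsPerPage parameter names are only assumptions; the stylesheet would need matching xsl:param declarations): run the same stylesheet once per page, passing the page number as a parameter and writing each result to its own file.
    import java.io.File;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerException;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class PaginatedTransform {
        public static void main(String[] args) throws TransformerException {
            TransformerFactory factory = TransformerFactory.newInstance();
            Transformer transformer = factory.newTransformer(new StreamSource(new File("items.xsl")));
            int itemsPerPage = 10;
            int totalPages = 10;   // e.g. 100 items / 10 per page
            for (int page = 1; page <= totalPages; page++) {
                // Pass the current page number and page size to the stylesheet as parameters.
                transformer.setParameter("pageNumber", page);
                transformer.setParameter("itemsPerPage", itemsPerPage);
                transformer.transform(new StreamSource(new File("items.xml")),
                                      new StreamResult(new File("page" + page + ".jsp")));
            }
        }
    }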

  • MSI Control Service is consuming the huge Paging file

    Hi
    I've newly installed Windows 8.1 64-bit (ver 6.3, Build 9600) on an S99A Krait Edition board with an Intel Core i7 5930K and 16 GB of memory. After one week I noticed some unexplained, huge usage of shared memory and the paging file (10-20 GB).
    I've tried stopping various services and processes; the paging usage did not decrease until I stopped MSICTL_CC (MSIControlService.exe). Apparently the MSIControlService.exe process itself does not seem to be using such a huge amount of shared memory and paging file, and I cannot find any suspicious symptom except for the very large number of handles held by MSIControlService.exe.
    MSI Command Center version is 1.0.0.94.
    After I stopped MSICTL_CC, there is no such symptom of an increasing huge paging file or huge memory usage.
    What I want to know:
    Is there any problem if MSICTL_CC is not started?
    Is there any possibility that this issue will be fixed?
    Best Regards,

    > What is your test configuration? Where did you install your Windows?
    > What exact settings did you modify for UEFI and the PCIe M.2/PCH setting?
    I don't use a test configuration; I just use the PC for normal consumer usage.
    I installed Windows 8.1 64-bit onto a PCIe SSD
    (Intel 750 Series SSDPEDMW400G4R5 HHHL (CEM2.0) 400GB PCIe NVMe 3.0 x4 MLC Internal Solid State Drive (SSD)) as the C drive.
    The M.2 slot holds an M.2 PCI Express SSD (Samsung XP941).
    To use the PCIe SSD as the boot drive, I chose UEFI boot, and for M.2 I set PCIe rather than PCH SATA.
    > Do you mean to say that you just open Command Center and then modify fan control using Smart Fan?
    Yes, that is what I've done in Command Center.
    >> And then you will need to shut down your system for one day, then check MSISuperIOService.exe again in Task Manager, and you will see the huge increase in memory?
    Physical memory usage is normal. Virtual memory is getting huge (currently 13 GB) while physical used memory is just around 4 GB, two days after restarting the MSI control services.
    The two processes MSISuperIOService.exe and especially MSIControlService.exe have the problem that their handle counts (in Task Manager) are very high compared to all other processes.
    Current handle counts:
    MSIControlService.exe: 198,700
    MSISuperIOService.exe: 23,183
    MSIClockService.exe: 8,251
    Among the other processes, svchost.exe has the highest handle count at only around 2,500; that is the most on my PC apart from the above three processes.
    >> Are you using the latest BIOS version?
    I am now using N20, which I believe came with the S99A Krait Edition. If it is old, there is no problem upgrading, but this issue seems to be caused by the application, so a BIOS upgrade seems unlikely to help.

  • Large file processing in XI 3.0

    Hi,
    We are trying to process a large file of ~280 MB and we are getting timeout errors. I followed all the required tuning for memory and heap sizes and the problem still exists. I want to know if installing a decentral adapter engine just for this file processing might solve the problem, which I doubt.
    Based on my personal experience, there might be a file size limit for processing in XI of maybe up to 100 MB, with minimal mapping and no BPM.
    Any comments on this would be appreciated.
    Thanks
    Steve

    Hi Debnilay,
    We do have a 64-bit architecture and we still have the file processing problem. Currently we are splitting the file into smaller chunks and processing them, but we want to process the file as a whole.
    Thanks
    Steve

  • Large file processing in file adapter

    Hi,
    We are trying to process a large file of ~280 MB and we are getting timeout errors. I followed all the required tuning for memory and heap sizes and the problem still exists. I want to know if installing a decentral adapter engine just for this large file processing might solve the problem, which I doubt.
    Based on my personal experience, there might be a file size limit for processing in XI of maybe up to 100 MB, with minimal mapping and no BPM.
    Any comments on this would be appreciated.
    Thanks
    Steve

    Dear Steve,
    This might help you,
    Topic #3.42
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/70ada5ef-0201-0010-1f8b-c935e444b0ad#search=%22XI%20sizing%20guide%22
    /people/sap.user72/blog/2004/11/28/how-robust-is-sap-exchange-infrastructure-xi
    This sizing guide and its memory calculations will be useful for dealing with this issue further:
    http://help.sap.com/bp_bpmv130/Documentation/Planning/XISizingGuide.pdf#search=%22Message%20size%20in%20SAP%20XI%22
    File Adapter: size of your processed messages
    Regards
    Agasthuri Doss

  • Photoshop elements 8 not proceeding at multiple file processing

    Hi everyone,
    I've got the following problem with Photoshop Elements 8. I have worked with it for a long time, using the multiple file processing functionality a lot. I do quite a lot of big shoots and I only use the raw settings for processing. After that I let Photoshop Elements do the rest of the work, which is converting all my raw images into JPEGs and resizing them to prepare them for uploading. These shoots often contain more than 200 photographs.
    As of this last weekend, when I start up the multiple file processing window it starts opening the first file but does not proceed. This means that I have to edit all the images myself and resize them to the proper size, etc. I really don't want to do this; this functionality is one of the main reasons why I use Photoshop. Anyone got any idea why it suddenly stopped working? Has anyone else experienced this and solved it?
    Thanks a lot.
    Ben

    One thing to check is the bit depth that Camera Raw is set to.
    Open a raw photo in the Camera Raw dialog and look along the bottom of the dialog where it says bit depth. If it says 16 bit, change it to 8 bit and press Done.
    Photoshop Elements sometimes won't process 16-bit files using Process Multiple Files.
    MTSTUNER

  • Word docs become huge RTF files in RoboHelp 8.02

    Every Word document that I add to the RoboHelp 8 project becomes huge in file size. For example, a 100 KB Word 2007 document that consists of a couple of pages of text with a couple of small JPEG images (around 50 KB each) becomes a 15-30 MB RTF file in RoboHelp after importing. The same happens if I create the page/document from within RoboHelp. Is there a setting to keep the RTF files from swelling to such gigantic sizes?

    I found out what the problem was. For some reason, the RTF files swell up when a JPEG image is added and embedded into the Word document. I created a test document, converted some of the JPEG images to bitmap (BMP) files, and used the image import tool that comes with RoboHelp. The RTF documents stayed tiny even when the same images were added, now as BMPs. The difference is that the RoboHelp image tool creates a link to the bitmap image as opposed to embedding it in the document, which is what happened with the JPEGs. I read about this in another post in one of the RoboHelp forums.
    Thanks!

  • Reading huge flat file through SOAP adapter

    Hi Everybody,
    In one of our interfaces we need to read a big flat file into XI using the SOAP adapter on the sender side, and we are using a Java mapping to convert it into XML. Before that, I need to split this flat file into multiple files in the first message mapping, and in the second mapping we have to write a Java mapping to do the flat-file-to-XML conversion. But I got stuck reading this big flat file into XI, as I need to declare some data type to read the entire file. Can anybody tell me how I can do this? Is it possible at all with the SOAP adapter?
    Thanks
    raj

    Hi Vijay,
    Thanks for your prompt reply. For certain reasons I am not allowed to use the file adapter; I can only use the JMS adapter or the SOAP adapter. We tried a few scenarios with JMS content conversion, but the scenario I am asking about here is complex at multiple levels and I can't even use JMS in this case. So we are thinking of reading the whole file using the SOAP adapter, then splitting the file into multiple files using a Java mapping (since the file can be huge), and at the next level using another mapping to do the content conversion. I have to experiment to see whether this is a feasible solution, because when you declare the following at the sender side:
    <ffdata_MT>
    <Recordset>
        <ROW>    String type
    When you declare it like this and send the flat file using the SOAP adapter on the sender side, we get the whole file we sent as a string in the "ROW" field. Inside the Java mapping I need to see whether I can split this in XI, so that I can use the split files in the next mapping for content conversion. I hope I am clear now; I want to know whether this is a feasible solution or not.
    I would really appreciate it if somebody could give some ideas on this (a rough sketch of the splitting idea follows below).
    Thanks
    raj
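    A rough plain-Java sketch of the splitting idea raj describes (this is not the actual PI mapping API; the chunk size and sample payload are assumptions): take the whole flat-file payload as one string and cut it into smaller payloads on line boundaries.
    import java.util.ArrayList;
    import java.util.List;

    public class FlatFileSplitter {

        /** Splits a flat-file payload into chunks of at most maxLines lines each. */
        public static List<String> split(String payload, int maxLines) {
            String[] lines = payload.split("\r?\n");
            List<String> chunks = new ArrayList<>();
            StringBuilder chunk = new StringBuilder();
            int linesInChunk = 0;
            for (String line : lines) {
                chunk.append(line).append('\n');
                if (++linesInChunk == maxLines) {
                    chunks.add(chunk.toString());
                    chunk.setLength(0);
                    linesInChunk = 0;
                }
            }
            if (chunk.length() > 0) {
                chunks.add(chunk.toString());   // flush the last partial chunk
            }
            return chunks;
        }

        public static void main(String[] args) {
            String payload = "row1\nrow2\nrow3\nrow4\nrow5\n";
            for (String c : split(payload, 2)) {
                System.out.println("---- chunk ----\n" + c);
            }
        }
    }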
