Big File issue

Hi Group,
When we try to process a big file (around 40 MB), the source file adapter does not pick up the file. We are using AAE. Can anybody suggest whether we need any additional tuning?
Regards,
Rajiv

Hi Rajiv,
As Michal said above, 40 MB is not a big file, but if you need to tune your server you can set the parameters below on the PI server.
•     UME parameters: we may need to look into the pool size and pool max wait parameters - the recommended UME values (like poolmaxsize=50, poolmaxwait=60000)
•     Tuning parameters: we may need to define the message size limit (like EO_MSG_SIZE_LIMIT = 0000100) under the tuning category
•     ICM parameters: we may need to consider the ICM parameters (e.g. icm/conn_timeout = 900000, icm/HTTP/max_request_size_KB = 2097152), as sketched below
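For reference, those two ICM values would go into the instance profile roughly like this - just a sketch reusing the numbers above, since the right sizing depends on your system:

# instance profile excerpt (illustrative values from this thread)
icm/conn_timeout = 900000                  # connection timeout in ms (= 15 min)
icm/HTTP/max_request_size_KB = 2097152     # max HTTP request size in KB (= 2 GB)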
Thanks and Regards,
Naveen.

Similar Messages

  • Photoshop CC slow in performance on big files

    Hello there!
    I've been using PS CS4 since release and upgraded to CS6 Master Collection last year.
    Since my OS broke down some weeks ago (RAM broke), I gave Photoshop CC a try. At the same time I moved into new rooms and couldn't get my hands on the DVD of my CS6, resting somewhere at home...
    So I tried CC.
    Right now I'm using it with some big files. File size is between 2 GB and 7.5 GB max (all PSB).
    Photoshop seemed to run fast in the very beginning, but for a few days now it's been so unbelievably slow that I can't work properly.
    I wonder if it is caused by the growing files or some other issue with my machine.
    The files contain a large number of layers and masks, nearly 280 layers in the biggest file (mostly with masks).
    The images are 50 x 70 cm at 300 dpi.
    When I try to make some brush strokes on a layer mask in the biggest file, it takes 5-20 seconds for the brush to draw... I couldn't figure out why.
    And it doesn't depend on the brush size as much as you might expect... even very small brushes (2-10 px) show this issue from time to time.
    Also, switching masks on and off (gradient maps, selective color or levels) takes ages to display, sometimes more than 3 or 4 seconds.
    The same with panning around in the picture, zooming in and out, or moving layers.
    It's nearly impossible to work on these files in a reasonable time.
    I've never seen this on CS6.
    Now I wonder if there's something wrong with PS or the OS. But I've never worked with files this big before.
    In March I worked on some 5 GB files with 150-200 layers in CS6, and it worked like a charm.
    System specs:
    i7-3930K (3.8 GHz)
    Asus P9X79 Deluxe
    64GB DDR3 1600Mhz Kingston HyperX
    GTX 570
    2x Corsair Force GT3 SSD
    Wacom Intuos 5 M Touch (I have some issues with the touch from time to time)
    WIN 7 Ultimate 64
    all system updates
    newest drivers
    PS CC
    System and PS are running on the first SSD, scratch is on the second. Both are set to be used by PS.
    79% of the RAM is allocated to PS, the cache level is set to 5 or 6, and history states ("protocol objects") are set to 70. I also tried different cache tile sizes from 128K to 1024K, but it didn't help a lot.
    When I open the largest file, PS takes 20-23 GB of RAM.
    Any suggestions?
    best,
    moslye

    Is it just slow drawing, or is actual computation (image size, rotate, GBlur, etc.) also slow?
    If the slowdown is drawing, then the most likely culprit would be the video card driver. Update your driver from the GPU maker's website.
    If the computation slows down, then something is interfering with Photoshop. We've seen some third-party plugins and some antivirus software cause slowdowns over time.

  • Cfspreadsheet and reading the xlsx files issue

    I am trying to process large Excel files (65,000 rows) into the DB. So I try to use the new cfspreadsheet tag to read in the files 5000 rows at a time and then process the query into the tables. The process works fine when reading the pre-2003 format ("xls") files, but not when reading the new "xlsx" type of files. The server is running out of memory trying to read those files for some reason. Is this a known bug in ColdFusion 9.0?? Has anyone experienced similar issues?
    Thanks

    It could be a POI issue (not likely, admittedly), or CF's usage thereof.
    If you check the POI archives, you should see a LOT of memory issues with large xlsx files, usually related to how the file is processed. IIRC the standard way of loading a spreadsheet, i.e. new XSSFWorkbook(name), just reads the whole file into memory in one shot. So with all the bloated XML and parsing going on, it is very prone to OOM errors.
    For big files, they actually recommend using the event API instead, because of its lower footprint (see the sketch below). I used it a couple of months ago for a side project and it made a huge difference! I did some basic profiling, and the standard method (usermodel) was definitely more prone to big spikes in memory and OOM errors. In contrast, memory usage with the event API was far lower and more stable.
    Reason for bringing all of this up...? Well, depending on how CF actually reads/loads spreadsheets internally, maybe the problem is a little bit of both..? Just my $0.02
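    A minimal sketch of that event-API pattern, assuming Apache POI's XSSFReader, with a bare DefaultHandler as a stand-in (a real content handler would resolve shared strings and collect cell values):

    import java.io.InputStream;
    import java.util.Iterator;
    import javax.xml.parsers.SAXParserFactory;
    import org.apache.poi.openxml4j.opc.OPCPackage;
    import org.apache.poi.openxml4j.opc.PackageAccess;
    import org.apache.poi.xssf.eventusermodel.XSSFReader;
    import org.xml.sax.InputSource;
    import org.xml.sax.XMLReader;
    import org.xml.sax.helpers.DefaultHandler;

    public class StreamXlsx {
        public static void main(String[] args) throws Exception {
            // Open the package read-only; nothing buffers the whole workbook
            try (OPCPackage pkg = OPCPackage.open(args[0], PackageAccess.READ)) {
                XSSFReader reader = new XSSFReader(pkg);
                XMLReader parser = SAXParserFactory.newInstance()
                        .newSAXParser().getXMLReader();
                parser.setContentHandler(new DefaultHandler()); // swap in a real row handler
                Iterator<InputStream> sheets = reader.getSheetsData();
                while (sheets.hasNext()) {
                    try (InputStream sheet = sheets.next()) {
                        parser.parse(new InputSource(sheet)); // sheet XML streamed via SAX
                    }
                }
            }
        }
    }

    The point is that each sheet is parsed as a SAX stream, so memory stays roughly flat instead of growing with file size the way new XSSFWorkbook(name) does.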
    -Leigh

  • VERY big files (!!) created by QuarkXPress 7

    Hi there!
    I have a "problem" with QuarkXPress 7.3 and I don't know if this is the right forum to ask...
    Anyway, I have created a document, about 750 pages, with 1000 pictures placed in it. I have divided it into 3 layouts.
    I'm saving the file and the file created is 1.20 GB!!!
    Isn't that a very big file for QuarkXPress??
    In that project there are 3 layouts. I tried to make a copy of that file and delete 2 of the 3 layouts, and the project's file size is still the same!!
    (Last year, I created (almost) the same document, and as I check that document now, its size is about 280 MB!!)
    The problem is that I have "autosave" on (every 5 or 10 minutes) and it takes some time to save!
    Can anyone help me with that??
    Why has Quark made SO big a file???
    Thank you all for your time!

    This is really a Quark issue and better asked in their forum areas. However, have you tried to do a Save As and see how big the resultant document is?

  • OVM Manager 3.1.1 CLI clone big files error

    Hello,
    We use the OVM Manager CLI for backup purposes. During a clone operation on a big file, after about 10 to 12 minutes we get an error:
    Error Msg: com.oracle.odof.exception.PermissionException: Exchange is not connected
    But the clone operation continues successfully.
    So from the client's point of view the operation failed, but on the server it succeeded.
    Clone command is as follows:
    ssh admin@ovmm -p 10000 "clone VirtualDisk id=$VM_FILE_ID target=$VM_FILE_REPO_ID cloneType=Sparse"
    The whole message:
    Command: clone VirtualDisk id=0004fb00001200004d032c969c42095d.img target=0004fb000003000072340b1e2eb70904 cloneType=Sparse
    Status: Failure
    Time: 2013-01-18 14:42:58.955
    Error Msg: com.oracle.odof.exception.PermissionException: Exchange is not connected
    Fri Jan 18 14:42:58 CET 2013
    OVM> Failed to complete command(s), error happened. Connection closed.
    We blamed timeouts, but after setting higher values nothing has changed.
    Thank you in advance
    Gregory

    This really sounds like that nasty timeout issue, which had its first appearances in the early builds of OVM 3.0.3 and prevented e.g. the creation of rather large storage repositories over iSCSI when using a standard 1 GBit connection. The command would simply time out on the OVMM after 120 secs, rather than waiting for the ovs-agent to report back…
    In OVM 3.1.1 (368) this timeout had been upped to 10 mins, but I knew that this would only last for months… and I told Oracle Support so, but eh… you know… ;)

  • WIFI dies when transferring big files over the network

    Hi, I have a Dell Inspiron 1520 with the ipw3945 driver. When I transfer big files over the network, my wifi dies. Does anybody know what it could be? If you need additional information, feel free to ask...
    Thanks in advance

    I guess we shall see. I can't imagine we are the only 7 with these issues. Mine is mainly just lag... and overall slowness. I hope to work through this. If it lasts much longer I will have to revert all 3 of my lappies back to FC, which I really do not want to do. But on the lappies I connect via wireless 90% of the time and need it to work.
    By the way, how are you guys configuring your wifi? I am using the iwl drivers and wicd.
    Well, I did some more testing and loaded up netcfg and still get the same results, so that rules out that part. I also did some packet captures and I am getting a bunch of TCP retransmissions followed by TCP segment lost followed by a reset, which in turn kills my RDP session (Remote Desktop Protocol). I also went back several versions of the ucode driver and still get the same results, so I guess it seems to be a kernel issue. Back to FC I go. Damn shame...

  • Sudhir Choudhrie - Can't Download Big Files

    Hi, this is Sudhir Choudhrie. I've chosen Firefox for my daily browsing and downloading needs because it has a function that lets you pause a running download and resume it whenever you want, even the next day, without needing any third-party trial download manager. But nowadays I find it difficult to download big files. Whenever I try to resume a paused download, I can't download that file again. Please tell me if it's a website error or some error in Firefox. Thanks,

    Hi,
    Thank you for your question. I do not know the answer to this one; however, I would be happy to test a download URL. If this is an issue with downloads, it is also possible to troubleshoot it: [https://support.mozilla.org/en-US/kb/cant-download-or-save-files]

  • Big files support

    Please add big file support. I would like to use Lightroom to manage pictures of about 15000x10000 pixels (from drum scans).

    I also had issues opening AVCHD files on case-sensitive filesystems.
    You can try these steps to solve the issue:
    Right click on the AVCHD file and select 'Show Package Contents'
    Right click on the BDMV file and select 'Show Package Contents'
    Rename index.bdm to INDEX.BDM (it seems that all files in the BDMV folder should be in UPPERCASE for QuickTime to open the AVCHD file).
    Go back twice and try again to double-click on the AVCHD file (QuickTime should open a window showing the multiple clips, as expected).

  • BCC FlexUI big file import timeout.

    Hi mates.
    We are facing an issue with BCC when importing a big file with cross-shells. During the import process for the file, the first screen is shown (Step 1 of 2) and, as the file takes a long time to import (2-3 minutes for a large number of assets), the second screen is never shown even though the process ends successfully. We are using ATG 10.0.3.
    Have you faced any similar problem? We have wrapped projectAssets.jsp with transaction marks, but it didn't work. Is there any FlexUI time-out configuration we could set up to get to the second screen even when the file takes long to import?
    Thank you very much.
    Regards.

    Hi Joel.
    Thank you very much for your prompt response. We have monitored the process from the moment the BCC user clicks the Import button to make sure we don't have the problem you describe. The import process finishes successfully in the back end after a couple of minutes, and the transaction timeout is set up properly in WebLogic to allow long-running transactions in the customer's environment, so we are not having connection timeouts. The thing is that the callback to the FlexUI is missing, probably because the interface listens for a callback only for a period of time shorter than the time the file needs to be imported; thus the interface doesn't update its state and hangs at Step 1 even though the import has completed and it should reach Step 2. We opened an SR about this case (3-6766177811) and they recommended applying a change to projectAssets.jsp, wrapping it in a transaction. We've tested the change and it doesn't work; we are still facing the same issue when importing big files.
    I am completely new to Flex, so I don't know much about how this behaves. Is there any parameter related to a listener timer or something similar that could be set up in FlexUI to get callbacks from long-running transactions?
    Thank you very much for your support.
    Kind Regards.
    Felix Rodriguez.

  • Not enough space on my new SSD drive to import my data from Time Machine backup; how can I import my latest backup minus some big files?

    I just got a new 256 GB SSD drive for my Mac. I want to import my data from my Time Machine backup, but it's larger than 256 GB since it used to be on my old optical drive. How can I import my latest backup while keeping some big files out on the external drive?

    Hello Salemr,
    When you restore from a Time Machine backup, you can tell it not to transfer folders like Desktop, Documents, Downloads, Movies, Music, Pictures and Public. Take a look at the article below for the steps to restore from your backup.
    Move your data to a new Mac
    http://support.apple.com/en-us/ht5872
    Regards,
    -Norm G. 

  • Adobe Photoshop CS3 collapses each time it loads a big file

    I was loading a big file of photos from iMac iPhoto into Adobe Photoshop CS3 and it kept collapsing, yet each time I reopen Photoshop it loads the photos again and collapses again. Is there a way to stop this cycle?

    I don't think that too many users here actually use iPhoto (even the Mac users)
    However, Google is your friend. A quick search came up with some other non-Adobe forum entries:
    .... but the golden rule of iPhoto is NEVER EVER MESS WITH THE IPHOTO LIBRARY FROM OUTSIDE IPHOTO. In other words, anything you might want to do with the pictures in iPhoto can be done from *within the program,* and that is the only safe way to work with it. Don't go messing around inside the "package" that is the iPhoto Library unless you are REALLY keen to lose data, because that is exactly what will happen.
    ..... everything you want to do to a photo in iPhoto can be handled from *within the program.* This INCLUDES using a third-party editor, and it saves a lot of time and disk space if you do it this way:
    1. In iPhoto's preferences, specify a third-party editor (let's say Photoshop) to be used for editing photos.
    2. Now, when you right-click (or control-click) a photo in iPhoto, you have two options: Edit in Full Screen (ie iPhoto's own editor) or Edit with External Editor. Choose the latter.
    3. Photoshop will open, then the photo you selected will automatically open in PS. Do your editing, and when you save (not save as), PS "hands" the modified photo back to iPhoto, which treats it exactly the same as if you'd done that stuff in iPhoto's own editor and updates the thumbnail to reflect your changes. Best of all, your unmodified original remains untouched so you can always go back to it if necessary.

  • I've doubled my RAM but still can't save big files in photoshop...

    I have just upgraded my RAM from the shipped 4 GB to 8 GB, but I still can't save big images in Photoshop. It seems to have made no difference to the speed of general tasks and saving, and I can't save big files in Photoshop at all. I have already moved massive amounts of files off my computer and onto my external hard drive, so there is much less on my computer now, and twice the RAM, but it is making no noticeable difference. When I click Memory under 'About This Mac' it shows that the RAM is installed and there is now twice the memory. Could this be something to do with Photoshop? I'm running CS6.

    Also, I just converted 220 cm into inches, and this is roughly 86.6 inches - over 7 feet in length.
    With an image that large, you may want to consider downsampling to a lower DPI; it will make images of this size much easier to work on and process, and easier for your print house to handle as well.
    You might want to consider working with these rather large images at 225 DPI instead of 300 if resolution at close distances is still a concern.
    Or what you could try is working with the images at 300 DPI, but then saving/exporting them as a JPEG image at the highest image quality setting.
    I do a lot of projects where I use a high-resolution JPEG image to save on image processing overhead.
    The final printed images still come out pretty clear, clean and crisp without having to process those large files at a print house or having to deal with using that full-resolution image in a page layout or illustration program.
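    To put rough numbers on that (a back-of-the-envelope estimate using the 220 cm edge from above): at 300 DPI the long edge is about 86.6 x 300 ≈ 25,980 px, while at 225 DPI it is about 86.6 x 225 ≈ 19,485 px. Pixel count scales with the square of the resolution, so 225 DPI keeps only (225/300)^2 ≈ 56% of the pixels - the flattened image data is nearly halved before you even count layers.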

  • Error in loading big file

    Hi All,
    I have an application on WL8.1 with SP3, and the database is SQL Server 2000. I have the code below to load a local file into the database. The data type in the database is image.
    PreparedStatement pStatement = null;
    InputStreammyStream myStream = new InputStream();
    myStream.setEmbeddedStream( is );
    pStatement.setBinaryStream( 1, myStream, -1 );
    pStatement.executeUpdate();
    pStatement.close();
    pStatement = null;
    is is an InputStream from a local file, and the SQL statement is
    insert into file_content(content) values(?)
    This works fine for files smaller than 150 MB, but for big files (>150 MB) it doesn't work, and I got the error message below:
    <Feb 11, 2005 12:00:41 PM PST> <Notice> <EJB> <BEA-010014> <Error occurred while attempting to rollback transaction: javax.transaction.SystemException: Heuristic hazard: (weblogic.jdbc.wrapper.JTSXAResourceImpl, HeuristicHazard, (javax.transaction.xa.XAException: [BEA][SQLServer JDBC Driver]Object has been closed.))
    javax.transaction.SystemException: Heuristic hazard: (weblogic.jdbc.wrapper.JTSXAResourceImpl, HeuristicHazard, (javax.transaction.xa.XAException: [BEA][SQLServer JDBC Driver]Object has been closed.))
    at weblogic.transaction.internal.ServerTransactionImpl.internalRollback(ServerTransactionImpl.java:396)
    at weblogic.transaction.internal.ServerTransactionImpl.rollback(ServerTransactionImpl.java:362)
    Can anybody help? Thanks in advance.

    I already answered this as to the cause in the MS newsgroups... It's the DBMS choking so badly on the size of your image file that it actually kills the connection. Note that the DBMS has to save the whole image to the log as well as to the DBMS table. The fundamental response is to get DBA help to configure the DBMS to be able to do what you want. If you want and can, you could split your image column into multiple image columns, split your data into 100 MB chunks, enter an empty row and then update it column by column, and then concatenate the data again in the client when you get it out, etc. (a rough sketch of that follows below), but it is much better to post to the MS server newsgroups for ideas.
    Joe
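    A minimal sketch of that column-by-column workaround, assuming a hypothetical table file_content(id, content1, content2, content3) where each image column holds up to ~100 MB (the table layout, key handling and chunk size are illustrative only, not from the thread):

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class ChunkedImageInsert {
        static final int CHUNK = 100 * 1024 * 1024; // ~100 MB per image column

        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(args[0]); // JDBC URL
                 InputStream in = new FileInputStream(args[1])) {        // source file
                // 1. Insert an empty row, so each chunk becomes a separate,
                //    smaller statement instead of one 150+ MB insert.
                try (PreparedStatement ins = conn.prepareStatement(
                        "insert into file_content (id) values (?)")) {
                    ins.setInt(1, 1);
                    ins.executeUpdate();
                }
                // 2. Fill one image column at a time with the next chunk.
                for (String col : new String[] {"content1", "content2", "content3"}) {
                    byte[] chunk = in.readNBytes(CHUNK);
                    if (chunk.length == 0) break; // file fully consumed
                    try (PreparedStatement upd = conn.prepareStatement(
                            "update file_content set " + col + " = ? where id = ?")) {
                        upd.setBytes(1, chunk);
                        upd.setInt(2, 1);
                        upd.executeUpdate();
                    }
                }
                // Reading it back is the mirror image: select the columns in
                // order and concatenate the bytes on the client.
            }
        }
    }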

  • What are the good ways to send a big file (20MB-100MB) to my friend?

    What are the good ways to send a big file (20MB-100MB) to my friend?
    Thanks in advance

    If this is over the internet, iChat is probably your best bet,
    but if you just want a transfer,
    plug a FireWire cable into both of your computers, shut down one of them, hold "T" and press the power button, and the restarted computer should pop up as an external drive on the second computer.

  • Can't move big files even with Mac OS Extended (Journaled)!

    Hi all =)
    This time I really can't understand it.
    I'm trying to move a big file (an app 9 GB large) from my MacBook's Applications folder to an external USB drive, HFS+ formatted, but at about 5.5 GB the process stops and I get this error:
    "The Finder can't complete the operation because some data in "application.app" can't be read or written. (error code -36)"
    I tried searching for this error code on the internet with no results.
    I tried transferring the same file to a different drive (which is also HFS+), but I still get the same error code.
    Both drives have plenty of free space.
    I also tried different USB cables.
    The app in question has just been fully downloaded from the App Store with no errors.
    What should I try now? Any suggestion welcome... this situation is so frustrating!
    Thanks

    LowLuster wrote:
    The Applications folder is a System folder and there are restrictions on it. I'm not sure why you are trying to copy an app out of it to an external drive. Try copying it to a folder on the root of the internal drive first, a folder you create using an Admin account, and then copying that to the external drive.
    Thanks a lot LowLuster, you actually solved my situation!
    But apparently in this "forum" you can't unmark the chosen answer if you clicked on the wrong one by mistake, you can't edit your messages, and you can't even delete your messages... such freedom down here, jeez!
