Using XI to FTP large files

Hi Folks,
I have a scenario in which I need to transfer a flat file to an external system. No mapping is required. We are not using BPM. I read Michael's comments on using Java proxies to transfer large files. I know that we can use standard Java IO APIs to copy the file over. However, I don't know how to implement this.
In my scenario an SAP transaction will create the file. I just need XI to pick it up and FTP it to another server. Can you point me in the right direction as to how I should go about implementing this?
1. I assume I will still have to use the File Adapter to pick up the file, right?
2. Then I use a Java server proxy to FTP it to the target system?
3. In order to generate the proxy I need a message interface. Should I use a dummy message interface as my inbound and outbound that points to a dummy message type?
Can someone provide me a sample?
Thanks,
Birla

Hi Nilesh,
Thanks for the reply and the link. However, the blog doesn't provide a solution to my problem. I was asking whether XI can pick up a large file (say 200 MB) and FTP it to an external system without doing content conversion.
I already read these blogs.
FTP_TO_FTP
/people/william.li/blog/2006/09/08/how-to-send-any-data-even-binary-through-xi-without-using-the-integration-repository
/people/shabarish.vijayakumar/blog/2006/04/03/xi-in-the-role-of-a-ftp
These blogs suggest that I can use a Java proxy to achieve better performance.
I just don't know how to implement it.
Any help would be much appreciated.
Thanks,
Birla.
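The core of the Java-proxy suggestion is to stream rather than slurp: copy the file in fixed-size chunks so a 200 MB payload never has to fit in memory. A minimal sketch (in Python for brevity; a Java server proxy would do the same thing with java.io streams, and the function name and buffer size here are illustrative, not XI APIs):

```python
CHUNK = 64 * 1024  # 64 KB buffer; tune for your network


def stream_copy(src, dst, chunk_size=CHUNK):
    """Copy the readable `src` to the writable `dst` chunk by chunk,
    so memory use stays constant regardless of file size.
    Returns the number of bytes copied."""
    total = 0
    while True:
        block = src.read(chunk_size)
        if not block:
            break
        dst.write(block)
        total += len(block)
    return total
```

With Python's standard FTP client the same streaming happens via `ftplib.FTP.storbinary("STOR target", open(path, "rb"), blocksize=CHUNK)`, which reads and sends one buffer at a time.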

Similar Messages

  • Photoshop hangs temporarily using brush tool on large files

When using the brush tool on layer masks on very large files (2-6 GB), Photoshop will stop responding for a minute or two, seemingly randomly. The app has not crashed, but I cannot paint, change brush size, or interact with palettes. I've noticed that the CPU usage is exactly 79% when this happens.
    I've had this problem on my older mac - a 2008 8x2.8g mac pro with 24g ram, and my current machine, a 2009 mac pro 8x2.93ghz with 64g memory, using the boot drive (2t with about 1.2t free) as a scratch disk.
    After a minute or two, especially if I try to click on a palette, ps comes back to life. Not fatal, but annoying.

Try to improve performance under Photoshop > Preferences > Performance.

  • Need help using JTextArea to read large file

    Hi here is the deal.
I've got a large file (about 12 MB) of raw data (all of the numbers are basically doubles or ints) which was formed with ObjectOutputStream.writeInt/writeDouble (I say this to make clear the file has no ASCII whatsoever).
    Now I do the file reading on a SwingWorker thread where I read the info from the file in the same order I put it originally.
I need to convert it to strings and visualize it on a JTextArea. It starts working; however, at one point (56% to be exact, since I know exactly the number of values I need to read) it stops working. The program doesn't freeze (probably because the other worker thread froze) and I get no exceptions (even though I'm catching them) and no errors.
    Does anyone have any idea of what the problem could be?
    Thank you very much in advance.
PS: I don't know if it matters, but I'm using ObjectInputStream with the readInt/readDouble functions to get the values and then turning them into strings and adding them to the JTextArea.

    I can put up the code.
    I don't have it with me right now but I'll do it later.
    Thank you.
Second: I need to debug a function approximation that uses a method that has to manage that many numbers. If I don't put them into a txt file and read it, there is no way I will know where the problems are, if any. And yes, I can look at the txt file and figure out problems; it's not that hard.
    What I'll try to do is to write directly to regular txt file instead of doing it to the JTextArea.
    Thank you for your help and I'll post back with the code and results.
PS: I don't know what profiling is, would you mind telling me?
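The "write straight to a text file" plan above can be sketched like this (Python for brevity, Swing specifics set aside; note this assumes a flat run of big-endian doubles, which is the per-value wire format writeDouble uses — a real ObjectOutputStream also writes stream headers and block markers, so offsets in the actual file differ):

```python
import struct


def dump_doubles(binary_in, text_out, count):
    """Decode `count` big-endian IEEE-754 doubles from `binary_in`
    and write one value per line to `text_out`, never holding
    the whole data set in memory."""
    for _ in range(count):
        (value,) = struct.unpack(">d", binary_in.read(8))
        text_out.write(f"{value!r}\n")
```

Streaming each decoded value straight to disk avoids both the JTextArea's document growth and the giant intermediate string.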

NetWare 6.5 SP7 Breaks FTP of Large Files

Has anyone had an issue with FTP on NetWare where, after loading SP7, we can no longer FTP files that are larger than 4 GB? Any help would be nice.

    Mike,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Visit http://support.novell.com and search the knowledgebase and/or check all
    the other self support options and support programs available.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://forums.novell.com)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://support.novell.com/forums/faq_general.html
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://support.novell.com/forums/

What file type should I use for reading a large file of data?

    I want to store data in a file and access them later. The data file could reach 500 to 1000 MBytes, as a binary file.
    I may have memory problems.
    I do not need the whole data in memory, only a few data each time, for my calculations.
If I use a binary file, I cannot read only a few values; I have to load the whole file and then read the data I need.
Is this correct?
If I use another type of file, can I read only a few bytes without loading the whole file?
Maybe a TDMS file?

I would probably use a TDMS file for this, since it could also be read into Excel if it were small enough - just more flexibility. But you can also do it using binary types. You don't have to read the entire file when using binary files. See below.
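The point about binary files supports random access: because every value occupies a fixed number of bytes, you can seek straight to the one you need instead of loading 500-1000 MB. A sketch of the principle (Python rather than LabVIEW, where the equivalent is setting the file position before a binary read; the function name is illustrative):

```python
import struct

DOUBLE = struct.Struct("<d")  # little-endian doubles; must match the writer


def read_double_at(path, index):
    """Seek straight to the index-th double in a flat binary file
    and read only those 8 bytes."""
    with open(path, "rb") as f:
        f.seek(index * DOUBLE.size)
        return DOUBLE.unpack(f.read(DOUBLE.size))[0]
```

The same offset arithmetic works for any fixed-size record; only variable-length formats force a full scan.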

I am trying to send a large file and it takes over 15 hours and still doesn't send the complete file

I am sending a large file over the internet to Graphistudio. When I try to send the file it tells me it is going to take 19 hours. When I have sent these files in the past it has never taken this long. When it does finish sending, it doesn't even send the whole file, and bits are missing.

What method or app are you using to send the "large" file? Emails, for instance, are often limited to a 35 or 40 MB maximum size by the ISP. You would need to use an FTP setup to send a larger file. And what is the size of the file in MB or GB? You might check out the following free FTP apps for Mac:
       http://mac.appstorm.net/roundups/internet-roundup/top-7-free-ftp-clients-for-mac/

  • Can't Copy Large Files To External Drive

This is driving me nuts. I'm running 10.4.11 with an external Newer Tech Ministack 320 GB hard drive that's only ten weeks old. Yesterday I downloaded some large (50 MB or so) QuickTime files to my desktop, which is to say my internal drive. I then tried to copy them to my external drive, where I keep my movie files because they take up so much room. One copied over without any trouble, but numerous attempts to copy the others result in the process freezing after anywhere from 1/4 to 3/4 of the data has been copied, or else I get the "error 36" message. My first thought was that the external drive was failing, but I can copy small movie files (10 MB or so) back and forth between the two drives with no problem. What's the point of having a large external drive if you can't use it for your large files? Repaired permissions on both drives, no improvement.

    I have repaired permissions and run disk repair on the internal drive as well. Yes, I've played the clips in their entirety, but I'm going to do it again and look for any hiccups. That's an interesting theory and I want to check it out.
    It has occurred to me that a workaround, although it by no means addresses the underlying problem, is to use QuickTime Pro to chop the clips up into several smaller files, copy them to the external drive, and then reassemble them via QuickTime Pro on the external drive.
    FWIW, I neglected to mention (although I don't know if it's pertinent or not) that immediately after installing the external drive ten weeks ago I used SuperDuper to clone my entire internal drive onto the external firewire drive. I then erased the movies from the internal drive, which was bumping up against maximum capacity. I am able to boot from the external drive, and I want to retain this capacity should the internal drive (which is, after all, six and a half years old) fail.

  • FTP Get File.vi not responding

I'm trying to use the VI "FTP Get File.vi" (from the Internet Toolkit) to retrieve a specific file from an FTP site - only that, for the moment. The FTP site is set up, and I checked and double-checked that it was working fine (using FileZilla) and that the stated directories and files are correctly spelled, etc. When I run the attached VI I get a darkened arrow icon indicating the program is running. No errors are generated. If I change ANYTHING on the remote path, such as an incorrect spelling of the name of the file I want to download, I do get an error and the program stops. Evidently it knows the file is there, but something is not working. I include in the attached VI the information, including the password for the remote FTP site, to make it easier to test, although you will probably have to create a local directory to receive the copied file at "C:\TELEMEDICINE" or else edit the local path however you want. I will change the site password, and perhaps the name, after we get this figured out, but I figured it might make helping me easier for now. Possibly I am misunderstanding something about how this VI is supposed to be used, but I have read the help documentation and everything similar I could find on the web. Possibly it is something stupid.
    Some other related information that may or may not help. 
(0) I tried the FTP browser VI, which is one of the example VIs included with LabVIEW (do a search on "FTP" under examples). It worked great - just like FileZilla - so the toolkit is working.
(1) The remote file is a .lvm data file, and is pretty small - 124 KB. So if the file transfer is working it should take a very short time (as it does using FileZilla).
(2) I did try letting the program run for many minutes on the off chance something was very slow, but the local file never appeared anywhere, and the program was just frozen in RUN mode, with no errors ever, so long as I have the remote path correct, as I stated above.
(3) I checked and made sure the security permissions were set (I'm only using Windows Firewall and security on the local computer). LabVIEW had full permissions.
(4) I'm using LabVIEW 2009, but I got all the update disks. Not sure if I bothered to go through the full update on this computer (I thought I did but it still says 2009). Probably doesn't matter either way.
    Attachments:
    CLIENT2.vi ‏10 KB

I do specify the whole file path, complete with the name of the file and its extension (see original post attachment). Is that what you are referring to?
I suppose I should update with some more recent information here to save people some time. The problem is NOT solved at all, so I would still appreciate any info that may help. However, we now know that the program works exactly the same, apparently, for me (on a Vista machine and a Windows 7 machine) and for Frank from tech support. The program works sometimes, and then inexplicably stops working. The timings of these failures don't appear to be connected to anything we did - things stop working for a good while after we have done nothing, for example while the computer is sleeping. I haven't determined if the "freeze-ups" happen on different computers all at the same time. I'm working on that one.
So, the conclusion is that it must (used advisedly) be something related to the only common factors - either the FTP server at my web provider, or the VIs in question are not quite right and this only shows up now and then for some reason. I believe that I set everything up correctly on the FTP site. I haven't changed anything lately, and have used numerous FTP browsers, such as FileZilla, with that site for a very long time (years) with no issues ever. It is hard to think of a locus for the source of the problem other than one of those two.
I'm planning to call the company serving my FTP site at some point, but what am I going to ask them other than whether my FTP site is configured correctly? It says it is. It has always worked. It works using the LabVIEW example browser (which uses different VIs), even when "FTP Get File.vi" and "FTP Get Buffer.vi" do not work at the same moment.

  • Microstate acctg. in large file compilation

I am trying to enable microstate accounting by using the following code:
     #include <procfs.h>
     #include <fcntl.h>   /* open, O_WRONLY */
     #include <stdio.h>   /* sprintf */
     #include <unistd.h>  /* getpid, write */
     sprintf(procname, "/proc/%d/ctl", (int)getpid());
     ctlfd = open(procname, O_WRONLY);
     ctl[0] = PCSET;
     ctl[1] = PR_MSACCT;
     write(ctlfd, ctl, 2*sizeof(long));
My application needs to be compiled with the large file compilation environment flags. But the compiler (Sun Studio 8 on Solaris 8) gives me an error:
    "Cannot use procfs in the large file compilation environment".
    Is there any way to get around this problem?
    Or is there any other way to enable microstate accounting that would work with the large file compilation?

    Your question is not about C++, but about Solaris process accounting.
    Inspection of the procfs header shows that it requires a consistent ILP32 (32-bit int, long, and pointer) or LP64 (64-bit long and pointer) model. It won't work with the ILP32 model modified with 64-bit file access. The error message is coming from the procfs header.
    If you can build your application as a 64-bit program, using -xarch=v9, it should compile. But your application needs to be 64-bit clean, with no hidden ILP32 assumptions.
    Lint provides options for identifying C code that is not 64-bit clean. The C++ compiler has an -xport64 option for the same purpose. Refer to the C and C++ Users Guides for details.

  • Faster alternative to cfcontent for large file downloads?

I am using cfcontent to securely download files so the user cannot see the path the file is stored in. With small files this is fine, but with large 100 MB+ files it is much, much slower than a straight HTML anchor tag. Does anyone have a fast alternative to using cfheader/cfcontent for large files?
    Thanks

You should be able to use Java to handle this, either through a custom tag, or you might be able to call the Java classes directly in ColdFusion. I don't know much about Java, but I found this example on the Web and got it to work for uploading files. Here's an example of the code. Hope it gives you some direction:
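The original example was not preserved in this thread. As a rough, language-neutral sketch of the usual fix (shown in Python; the function name and buffer size are illustrative): instead of buffering the whole file as cfcontent does, hand the server an iterator of chunks, so the real path stays hidden but memory use stays flat.

```python
def file_chunks(path, chunk_size=64 * 1024):
    """Yield the file's bytes in fixed-size chunks, suitable as a
    streaming response body: the server sends each chunk as it is
    read, never holding the 100 MB+ payload in memory at once."""
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk_size)
            if not block:
                return
            yield block
```

Most servlet/WSGI-style stacks accept such an iterable (or an equivalent input stream copied to the response output stream in a loop) as the response body.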

  • FTP and HTTP large file ( 300MB) uploads failing

We have two IronPort Web S370 proxy servers in a WCCP transparent proxy cluster. We are experiencing problems with users who upload large video files, where the upload will not complete. Some of the files are 2 GB in size, but most are in the hundreds-of-megabytes range. Files smaller than a hundred MB seem to work just fine. Some users are using the FTP proxy and some are using HTTP methods, such as YouSendIt.com. We have tried explicit proxy settings with some improvement, but it varies by situation.
    Is anyone else having problems with users uploading large files and having them fail?  If so, any advice?
    Thanks,
       Chris

    Have you got any maximum sizes set in the IronPort Data Security Policies section?
    Under Web Security Manager.
    Thanks
    Chris

  • FTP Sender Adapter with EOIO for Large files

    Hi Experts,
I am using XI 3.0.
My scenario is File -> XI -> File. I need to pick the files up from an FTP server; there are around 50 files, each of 10 MB in size.
I have to pick the files up from the FTP folder in the same order as they were put into the folder, i.e., FIFO.
So in the sender FTP communication channel I am specifying:
QoS = EOIO
Queue name = ACCOUNT
Will the files be picked up in FIFO order into the queue XBTO_ACCOUNT?
So my question is:
What is the procedure to specify the parameters so that it will process in FIFO?
And what would be the best practice to achieve it from a performance point of view?
    Thanks
    Sai.

Hi spantaleoni,
I want to process the files using the FTP protocol in FIFO order, i.e., the file placed first in the folder should be picked up first, the next one after that, and so on.
So if I use FTP, then to process the files in sequence (FIFO), would the processing parameters
QoS = EOIO
Queue name = ACCOUNT
process the files in FIFO order?
And to process large files (10 MB in size), what would be the best polling interval in seconds?
    Thanks,
    Sai
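Outside XI, FIFO pick-up amounts to ordering the files oldest-first before processing; inside XI, QoS = EOIO with a fixed queue name is what serializes delivery in that order. A minimal sketch of the ordering rule itself (illustrative Python, not an XI API):

```python
import os


def fifo_order(folder):
    """Return the folder's files sorted oldest-first by modification
    time - the order in which they were placed into the folder,
    assuming they have not been rewritten since."""
    paths = [os.path.join(folder, n) for n in os.listdir(folder)]
    return sorted((p for p in paths if os.path.isfile(p)),
                  key=os.path.getmtime)
```

Note the assumption in the docstring: mtime ordering only matches arrival order if files are written once and left alone, which is why adapters usually also require a "minimum age" before pick-up.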

  • Processing Large Files using Chunk Mode with ICO

    Hi All,
I am trying to process large files using an ICO. I am on PI 7.3 and I am using a new feature of PI 7.3 to split the input file into chunks.
And I know that we cannot use mapping while using chunk mode.
While trying I noticed the points below:
1) I created a Data Type, Message Type, and interfaces in the ESR and used them in my scenario (no mapping was defined); sender and receiver DTs were the same.
Result: the scenario did not work. It created only one chunk file (.tmp file) and terminated.
2) I used a dummy interface in my scenario and it worked fine.
So, please confirm whether we should always use dummy interfaces in the scenario while using chunk mode in PI 7.3, or is there something that I am missing?
    Thanks in Advance,
    - Pooja.

    Hello,
    According to this blog:
    File/FTP Adapter - Large File Transfer (Chunk Mode)
    The following limitations apply to the chunk mode in File Adapter
As per the above screenshots, the split never considers the payload; it's just a binary split. So the following limitations apply:
Only for File Sender to File Receiver
No Mapping
No Content-Based Routing
No Content Conversion
No Custom Modules
Probably you are doing content conversion; that is why it is not working.
    Hope this helps,
    Mark
    Edited by: Mark Dihiansan on Mar 5, 2012 12:58 PM
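The "just a binary split" point can be made concrete with a small sketch (illustrative Python; the part-file naming is made up): chunk mode cuts the byte stream at a fixed size with no knowledge of the payload, which is exactly why mapping and content conversion cannot apply to the chunks.

```python
def split_binary(path, chunk_bytes):
    """Cut a file into numbered .partNNN files of at most chunk_bytes
    each, by a plain binary cut - records and lines may be severed
    mid-way, just as in chunk mode."""
    parts = []
    with open(path, "rb") as f:
        i = 0
        while True:
            block = f.read(chunk_bytes)
            if not block:
                break
            part = f"{path}.part{i:03d}"
            with open(part, "wb") as out:
                out.write(block)
            parts.append(part)
            i += 1
    return parts
```

Concatenating the parts in order reproduces the original file byte for byte, which is all the receiver side of chunk mode does.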

  • Handling large files with FTP in XI

    Hi All,
I have a file scenario where I have to post a file with a size of more than 500 MB and with up to 700 fields in each line.
The other scenario worked fine when the file size was below 70 MB with fewer fields.
Could anyone help me in handling such a scenario, with a large file size and without splitting the file?
    1) From your previous experience did you use any Tools to help with the development of the FTP interfaces?
    2) The client looked at ItemField, but is not willing to use it, due to the licensing costs. Did you use any self-made pre-processors?
    3) Please let me know the good and bad experiences you had when using the XI File Adapter?
    Thanks & Regards,
    Raghuram Vedagiri.

500 MB is huge. XI will not be able to handle such a huge payload, for sure.
Are you using XI as a mere FTP, or are you using content conversion with mapping etc.?
1. Either use splitting logic to split the file outside XI (using scripts) and then have XI handle these files,
2. or size your hardware (Java heap etc.) to make sure that XI can handle this file (not recommended, though). SAP recommends a size of 5 MB as the optimum.
    Regards
    Bhavesh
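Option 1 above (split outside XI with a script) can be sketched as follows. Because each record here is one line with up to 700 fields, splitting on line boundaries keeps records intact, unlike a blind byte split; the part-file naming and part size are illustrative:

```python
def split_by_lines(path, lines_per_part):
    """Write consecutive groups of whole lines to numbered .NNN files,
    so no record is ever cut in half."""
    parts, batch, i = [], [], 0

    def flush():
        nonlocal batch, i
        if batch:
            part = f"{path}.{i:03d}"
            with open(part, "w") as out:
                out.writelines(batch)
            parts.append(part)
            batch, i = [], i + 1

    with open(path) as f:
        for line in f:
            batch.append(line)
            if len(batch) == lines_per_part:
                flush()
    flush()  # remainder
    return parts
```

XI then polls the directory and processes each manageable part as an ordinary small message.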

  • Using JCA FTP and File Adapters to transfer files as attachments

I am trying to develop a flow that picks a file up using the JCA FTP adapter and writes it using the JCA File Adapter. I can get this to work using opaqueElement, but the issue I have is that the files that will be transferred are regularly very large (> 1 GB), and using opaqueElement simply results in an OutOfMemoryException.
Reading the adapter documentation, it seems that reading/writing the file as an attachment may be an answer. I therefore did the following:
. Created FTP Read and File Write adapters in JDeveloper
. Imported these into OEPE
. Created a Proxy Service and Business Service and wired them together
The problem I am having is that even though there is no error and the file is being read on the FTP server, the resulting file only contains the attachment details; it is not a copy of the file that was picked up from the FTP server.
I am assuming that this behaviour is not correct and that the file should be transferred in its entirety. Unfortunately the documentation is not very detailed and I cannot find any examples on the internet.
    Can anyone offer any advice or even better suggest where I can find an example.
    Regards
    My JCA adapters are as follows:
    FileWriteAsAttachmentAdapter_file
    <adapter-config name="FileWriteAsAttachmentAdapter" adapter="File Adapter"
         wsdlLocation="FileWriteAsAttachmentAdapter.wsdl"
         xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
    <connection-factory location="eis/HAFileAdapter"/>
    <endpoint-interaction portType="Write_ptt" operation="Write">
    <interaction-spec className="oracle.tip.adapter.file.outbound.FileInteractionSpec">
    <property name="PhysicalDirectory" value="C:\fileserver\download\toprocess"/>
    <property name="Append" value="false"/>
    <property name="FileNamingConvention" value="attachment_%SEQ%.csv"/>
    <property name="NumberMessages" value="1"/>
    </interaction-spec>
    </endpoint-interaction>
    </adapter-config>
    FileWriteAsAttachmentAdapter.wsdl
    <wsdl:definitions
    name="FileWriteAsAttachmentAdapter"
    targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/file/AtatchmentApp/Project1/FileWriteAsAttachmentAdapter"
    xmlns:jca="http://xmlns.oracle.com/pcbpel/wsdl/jca/"
    xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/"
    xmlns:tns="http://xmlns.oracle.com/pcbpel/adapter/file/AtatchmentApp/Project1/FileWriteAsAttachmentAdapter"
    xmlns:imp1="http://xmlns.oracle.com/pcbpel/adapter/file/attachment/"
    xmlns:plt="http://schemas.xmlsoap.org/ws/2003/05/partner-link/"
    >
    <plt:partnerLinkType name="Write_plt" >
    <plt:role name="Write_role" >
    <plt:portType name="tns:Write_ptt" />
    </plt:role>
    </plt:partnerLinkType>
    <wsdl:types>
    <schema targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/file/attachment/" xmlns="http://www.w3.org/2001/XMLSchema"
    elementFormDefault="qualified" >
    <element name="attachmentElement" >
    <complexType>
    <attribute name="href" type="string" />
    </complexType>
    </element>
    </schema>
    </wsdl:types>
    <wsdl:message name="Write_msg">
    <wsdl:part name="body" element="imp1:attachmentElement"/>
    </wsdl:message>
    <wsdl:portType name="Write_ptt">
    <wsdl:operation name="Write">
    <wsdl:input message="tns:Write_msg"/>
    </wsdl:operation>
    </wsdl:portType>
    </wsdl:definitions>
    FtpGetAsAttachmentAdapter_ftp.jca
    <adapter-config name="FtpGetAsAttachmentAdapter" adapter="FTP Adapter"
         wsdlLocation="FtpGetAsAttachmentAdapter.wsdl"
         xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
    <connection-factory location="eis/ftp/TlmsHAFtpAdapter"/>
    <endpoint-activation portType="Get_ptt" operation="Get">
    <activation-spec className="oracle.tip.adapter.ftp.inbound.FTPActivationSpec">
    <property name="AsAttachment" value="true"/>
    <property name="DeleteFile" value="true"/>
    <property name="MinimumAge" value="300"/>
    <property name="PhysicalDirectory" value="/ftp/tlms2atas"/>
    <property name="Recursive" value="false"/>
    <property name="FileModificationTime" value="FileSystem"/>
    <property name="PollingFrequency" value="60"/>
    <property name="FileType" value="ascii"/>
    <property name="IncludeFiles" value=".*[_][0-9][0-9][-][0-9][0-9][-][0-9][0-9][_].*\.csv"/>
    <property name="UseHeaders" value="false"/>
    <property name="ModificationTimeFormat" value="dd/MM/yyyy HH:mm"/>
    </activation-spec>
    </endpoint-activation>
    </adapter-config>
    FtpGetAsAttachmentAdapter.wsdl
    <wsdl:definitions
    name="FtpGetAsAttachmentAdapter"
    targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/ftp/AtatchmentApp/Project1/FtpGetAsAttachmentAdapter"
    xmlns:jca="http://xmlns.oracle.com/pcbpel/wsdl/jca/"
    xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/"
    xmlns:tns="http://xmlns.oracle.com/pcbpel/adapter/ftp/AtatchmentApp/Project1/FtpGetAsAttachmentAdapter"
    xmlns:attach="http://xmlns.oracle.com/pcbpel/adapter/file/attachment/"
    xmlns:pc="http://xmlns.oracle.com/pcbpel/"
    xmlns:plt="http://schemas.xmlsoap.org/ws/2003/05/partner-link/"
    >
    <plt:partnerLinkType name="Get_plt" >
    <plt:role name="Get_role" >
    <plt:portType name="tns:Get_ptt" />
    </plt:role>
    </plt:partnerLinkType>
    <wsdl:types>
    <schema targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/file/attachment/" xmlns="http://www.w3.org/2001/XMLSchema"
    elementFormDefault="qualified" >
    <element name="attachmentElement" >
    <complexType>
    <attribute name="href" type="string" />
    </complexType>
    </element>
    </schema>
    </wsdl:types>
    <wsdl:message name="Get_msg">
    <wsdl:part name="attach" element="attach:attachmentElement"/>
    </wsdl:message>
    <wsdl:portType name="Get_ptt">
    <wsdl:operation name="Get">
    <wsdl:input message="tns:Get_msg"/>
    </wsdl:operation>
    </wsdl:portType>
    </wsdl:definitions>

System resources are finite and have a threshold limit for processing. Oracle SOA Suite, being dependent on system resources, also has certain size limitations, largely due to the underlying resources, beyond which the system cannot process incoming requests.
For example, Oracle JCA Adapters can process large payloads, but Oracle BPEL PM consumes huge amounts of memory when processing large payloads, which can cause OutOfMemory conditions and affect the whole system.
You can try the File ChunkedRead option. This will read your large CSV file in chunks, so the chance of running out of memory is lower. You can define your own chunk size.
This is a feature of the Oracle File and FTP Adapters that uses an invoke activity within a while loop to process the target file. This feature enables you to process arbitrarily large files. See the section "4.5.5 Oracle File Adapter ChunkedRead" at the link below:
http://www.orastudy.com/oradoc/selfstu/fusion/integration.1111/e10231/adptr_file.htm#BABDIABG
    Thanks,
    Ashu
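The ChunkedRead idea in miniature: iterate the large file in bounded batches of lines inside a loop, so each pass touches only a slice of the data. (Generic Python sketch; the Oracle adapter configures this declaratively rather than in code, and the batch size here is illustrative.)

```python
def iter_batches(fileobj, batch_size):
    """Yield lists of at most batch_size lines from fileobj - the
    while-loop-over-chunks pattern ChunkedRead applies to a CSV."""
    batch = []
    for line in fileobj:
        batch.append(line)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch
```

The caller processes and discards one batch at a time, so peak memory is bounded by the batch size rather than the file size.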
