File handling issues

Hi All,
I have the following doubts about processing the file below.
Example:
00BSD003062                                                  
01UN100010000000010306190005400000151778          
02UN100010200001193633970001          A19000003285N
02UN100010200002183557200001          A19000002236N
1) In the second record, I want to map only the first 5 characters to one field and ignore the rest. Is it possible in XI to ignore the unwanted characters, and how?
2) Taking 1000102 as the account number, I need to reconcile it: the amounts in the two rows above (rows 3 and 4) should be added, and only the result should be posted. Is this possible? Please guide me on how to do it.
3) Is audit logging possible, i.e. counting how many records were posted successfully at the end of file processing?
4) Is sorting possible in XI, and how?
Thanks in advance.
Regards,
venu

> 1) In the second record, I want to map only the first 5 characters
> to one field and ignore the rest. Is it possible in XI to ignore
> the unwanted characters, and how?
Explore the fieldFixedLengths option in your content conversion configuration. This link has the details: http://help.sap.com/saphelp_nw04/helpdata/en/2c/181077dd7d6b4ea6a8029b20bf7e55/content.htm
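The effect of fieldFixedLengths can be sketched in plain Java. The widths below are hypothetical (chosen to pull the record type, qualifier, and 7-character account number out of the sample record); everything beyond the listed widths is simply ignored, which is how the unwanted trailing characters drop out:

```java
// Sketch of fixed-length splitting, as the File adapter's content
// conversion does with fieldFixedLengths. Widths here are hypothetical.
public class FixedLengthSplit {
    // Split a record into fields of the given widths; characters beyond
    // the last width (the "unwanted" rest) are ignored.
    static String[] split(String record, int[] widths) {
        String[] fields = new String[widths.length];
        int pos = 0;
        for (int i = 0; i < widths.length; i++) {
            fields[i] = record.substring(pos, Math.min(pos + widths[i], record.length()));
            pos += widths[i];
        }
        return fields;
    }

    public static void main(String[] args) {
        String record = "02UN100010200001193633970001          A19000003285N";
        String[] f = split(record, new int[] {2, 2, 7});
        System.out.println(f[0] + "|" + f[1] + "|" + f[2]); // prints 02|UN|1000102
    }
}
```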
> 2) Taking 1000102 as the account number, I need to reconcile it:
> the amounts in the two rows above (rows 3 and 4) should be added,
> and only the result should be posted. Is this possible? Please
> guide me on how to do it.
Before posting to your target application, you can always manipulate the payload in a mapping program.
If you want to do it while reading the file itself, you will have to write a module processor.
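To illustrate the mapping-side reconciliation (plain Java, not the actual XI mapping API; the account and amount field positions are assumptions about the sample layout), group the detail records by account number and sum the amounts so only one total per account is posted:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the reconciliation step: sum amounts per account number.
// Field positions (account at offset 4, amount at offset 16) are
// hypothetical; adapt them to the real record layout.
public class Reconcile {
    static Map<String, Long> sumByAccount(String[] records) {
        Map<String, Long> totals = new LinkedHashMap<>();
        for (String rec : records) {
            String account = rec.substring(4, 11);               // e.g. "1000102"
            long amount = Long.parseLong(rec.substring(16, 28)); // assumed amount column
            totals.merge(account, amount, Long::sum);
        }
        return totals;
    }

    public static void main(String[] args) {
        String[] recs = {
            "02UN100010200001193633970001          A19000003285N",
            "02UN100010200002183557200001          A19000002236N"
        };
        // One aggregated total for account 1000102
        System.out.println(sumByAccount(recs));
    }
}
```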
> 3) Is audit logging possible, i.e. counting how many records were
> posted successfully at the end of file processing?
One option is to write a module processor, deploy it as an add-on to your file adapter,
parse the payload, and get the record count.
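The counting such a module (or a mapping UDF) would do can be sketched like this, assuming the detail records are the payload lines starting with "02", as in the sample file:

```java
// Sketch of audit-style record counting over the file payload.
// The "02" detail-record prefix is an assumption from the sample file.
public class RecordCounter {
    static int countDetailRecords(String payload) {
        int count = 0;
        for (String line : payload.split("\\r?\\n")) {
            if (line.startsWith("02")) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        String payload = "00BSD003062\n"
                + "01UN100010000000010306190005400000151778\n"
                + "02UN100010200001193633970001          A19000003285N\n"
                + "02UN100010200002183557200001          A19000002236N\n";
        System.out.println(countDetailRecords(payload)); // prints 2
    }
}
```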
> 4) Is sorting possible in XI, and how?
I doubt this feature is available; the file adapter parses the file from line 1 to EOF.
Explore sort options using standard functions/UDFs in your message mapping.
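A minimal sketch of such a sort; in a real message-mapping UDF the array would arrive as the input value queue rather than being built by hand:

```java
import java.util.Arrays;

// Sketch of sorting inside a mapping UDF: the adapter reads the file
// top-to-bottom, but the value queue can be sorted in the mapping.
public class SortUdf {
    static String[] sortQueue(String[] input) {
        String[] sorted = input.clone();
        Arrays.sort(sorted); // natural (lexicographic) order
        return sorted;
    }

    public static void main(String[] args) {
        String[] accounts = {"1000103", "1000101", "1000102"};
        System.out.println(Arrays.toString(sortQueue(accounts)));
        // prints [1000101, 1000102, 1000103]
    }
}
```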
Regards
Saravana

Similar Messages

  • Duplicate File Handling Issues - Sender File Adapter - SAP PO 7.31 - Single Stack

    Hi All,
    We have a requirement to avoid processing of duplicate files. Our system is PI 7.31 Enh. Pack 1 SP 23. I tried using the 'Duplicate File Handling' feature in the Sender File Adapter, but things are not working as expected: I processed the same file again and again, and PO creates successful messages every time rather than generating alerts/warnings or deactivating the channel.
    I went through the link "Michal's PI tips: Duplicate handling in file adapter - 7.31". I have maintained similar settings but am unable to achieve the functionality. Is there anything I am missing, or is any setting required apart from the 'Duplicate File Handling' checkbox and a threshold count?
    Any help will be highly appreciated.
    Thanks,
    Abhishek

    Hello Sarvjeet,
    I had to write a UDF in message mapping to identify duplicate files and throw an exception. In my case, I had to compare the file load directory (source directory) with the archive directory to identify whether the new file is a duplicate. I'm not sure if this is the same case for you. See if the code below helps (I used parameterized mapping to pass the file locations in from the Integration Directory rather than hard-coding them in the mapping):
    AbstractTrace trace = container.getTrace();
    double archiveFileSize = 0;
    double newFileSizeDouble = Double.parseDouble(newFileSize);
    String archiveFile = "";
    String archiveFileTrimmed = "";
    int var2 = 0;
    File directory = new File(directoryName);
    File[] fList = directory.listFiles();
    Arrays.sort(fList, Collections.reverseOrder());
    // Traverse all files in the archive directory
    for (File file : fList) {
        // Only consider files, not sub-directories
        if (file.isFile()) {
            trace.addInfo("Filename: " + file.getName() + " :: Archive File Time: " + Long.toString(file.lastModified()));
            archiveFile = file.getName();
            // Drop the first 20 characters (archive prefix) before comparing names
            archiveFileTrimmed = archiveFile.substring(20);
            archiveFileSize = file.length();
            if (archiveFileTrimmed.equals(newFile) && archiveFileSize == newFileSizeDouble) {
                var2 = var2 + 1;
                trace.addInfo("Duplicate File Found. " + newFile);
                if (var2 == 2) {
                    break;
                }
            }
        }
    }
    if (var2 == 2) {
        var2 = 0;
        throw new StreamTransformationException("Duplicate File Found. Processing for the current file is stopped. File: " + newFile + ", File Size: " + newFileSize);
    }
    return Integer.toString(var2);
    Regards,
    Abhishek

  • CSV File Handling Issue

    Hi All,
    We have IDoc to File(CSV) Scenario.
    Target field values contain the comma (,) character, and the field separator is also a comma (,).
    Fields containing commas disturb the file generation sequence.
    eg.,
    Header
    field 1, field 2, field 3, field 4
    field 1=test
    field 2=sample
    field 3=firstname,lastname
    field4 = address
    Output CSV:
    field1, field2, field3, field4
    test,sample,firstname,lastname,address
    Field 3's value has been split in two. How do I handle this case? Kindly help.
    Best Regards,
    Suresh S

    Hi,
    Including double quotes at mapping level, together with the following FCC parameters, helped resolve the issue.
    However, we then need to strip the double quotes from the field again before posting it to the end application, which can be handled through FTP module-level configuration.
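The quoting rule applied at mapping level is RFC 4180 style: wrap a field in double quotes when it contains the separator, and double any embedded quotes. A small sketch of that rule (plain Java, not the actual mapping function):

```java
// Sketch of RFC 4180-style CSV quoting for fields containing commas.
public class CsvQuote {
    static String quote(String field) {
        if (field.contains(",") || field.contains("\"")) {
            return "\"" + field.replace("\"", "\"\"") + "\"";
        }
        return field;
    }

    public static void main(String[] args) {
        String row = String.join(",",
            quote("test"), quote("sample"), quote("firstname,lastname"), quote("address"));
        System.out.println(row);
        // prints test,sample,"firstname,lastname",address
    }
}
```

With this, a consuming application that honors RFC 4180 reads field 3 back as one value, which is why the quotes then have to be stripped only for consumers that do not.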
    Does anyone have an idea of a standard adapter module that handles this requirement?
    Best Regards,
    Suresh S

  • File handling issue

    Dear All,
    I have a test program, say ZTEST.
    The program sends data to the application server; the file path is '/usr/sap/tmp/ZFILE.txt'.
    I have 2 files,
    first file contains data like
    1
    2
    3
    Second file contains data like
    A
    B
    C
    If I execute the program ZTEST for the first file, the data 1 2 3 is written and '/usr/sap/tmp/ZFILE.txt' contains
    1
    2
    3
    If I then execute the program ZTEST for the second file, the data A B C is written and '/usr/sap/tmp/ZFILE.txt' contains
    A
    B
    C
    If I execute the program ZTEST for the first file and the second file simultaneously, either 1 2 3 or A B C should be written, but mixed-up data is sent to '/usr/sap/tmp/ZFILE.txt'. The file then contains
    1
    B
    3
    or
    A
    2
    C
    But I want the data sent in one of the formats below:
    1
    2
    3
    or
    A
    B
    C
    Please let me know if you have any suggestions...
    Thanks in advance
    Kind regards,
    M.Rayapureddy

    Try changing the properties of the dataset at run time using SET DATASET:
    TYPE-POOLS:
      dset.
    DATA:
      dsn  TYPE STRING,
      fld  TYPE STRING,
      attr TYPE dset_attributes.
    OPEN DATASET dsn IN LEGACY TEXT MODE FOR INPUT.
    attr-changeable-indicator-code_page = 'X'.
    attr-changeable-code_page           = '0100'.
    attr-changeable-indicator-repl_char = 'X'.
    attr-changeable-repl_char           = '*'.
    SET DATASET dsn ATTRIBUTES attr-changeable.
    READ DATASET dsn INTO fld.
    WRITE / fld.
    CLOSE DATASET dsn.
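The interleaved output itself comes from two jobs appending to the same server file concurrently. The usual cure is to serialize access, e.g. with an ABAP ENQUEUE lock before OPEN DATASET. The idea, sketched in Java rather than ABAP purely for illustration, is to hold an exclusive lock on the file for the whole write:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Sketch of exclusive-lock writing: while one writer holds the lock,
// a second writer blocks, so lines 1/2/3 and A/B/C can never interleave.
public class ExclusiveWrite {
    static void writeAll(Path file, String[] lines) throws IOException {
        try (FileChannel ch = FileChannel.open(file,
                StandardOpenOption.CREATE, StandardOpenOption.WRITE,
                StandardOpenOption.TRUNCATE_EXISTING);
             FileLock lock = ch.lock()) { // blocks until we own the file
            for (String line : lines) {
                ch.write(ByteBuffer.wrap((line + "\n").getBytes(StandardCharsets.UTF_8)));
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path f = Files.createTempFile("zfile", ".txt");
        writeAll(f, new String[] {"1", "2", "3"});
        System.out.print(Files.readString(f)); // prints 1, 2, 3 on separate lines
    }
}
```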

  • Error in creating IO file handles for job (number 3152513)

    Hi All -
    I am using Tidal 5.3.1.307. And the Windows agent that is running these jobs is at 3.0.2.05.
    Basically, the error in the subject was received when starting a particular job after it was cancelled, and for a couple of other jobs a few days before. These jobs have run successfully in the past.
    This particular job had been running for 500+ minutes when its estimated run time is 40 minutes. At that point it would not allow a restart of the job; it just stayed in a launched status.
    Trying to figure out what causes this error.
    Error in creating IO file handles for job 3152513
    Note - that said, when we saw 2 instances of this process running at the same time, we noticed some blocking on the DB side of things.
    Trying to figure out if this is a known tidal issue or a coding issue or both.
    Another side note: after cancelling the 2nd rerun attempt, the following error was encountered: "Error activating job, Duplicate."
    When we did receive the "Error in creating IO file" message, the job did actually restart, but Tidal lost its hooks into it and the query was still running as an orphan on the DB server.
    Thanks All!

    The server to reboot is the agent server.  You can try stopping the agent and then manually deleting the file.  That may work.  When the agent is running the agent process may keep the file locked, so rebooting may not be sufficient.
    The numerical folders are found as sub-directories off of the services directory I mentioned.  I think the numbers correspond to the job type, so one number corresponds to standard jobs, another to FTP jobs.  I'd just look in the numbered directories until you find a filename matching the job number.
    The extensions don't really matter since you will want to delete all files that match your job number.  There should only be one or two files that you need to delete and they should all be in the same numbered sub-directory.
    As to the root cause of the problem, I can't really say since it doesn't happen very often.  My recollection is that it is either caused by a job blowing up spectacularly (e.g. a memory leak in the program being launched by Tidal) or someone doing something atypical with the client.

  • Illustrator CS4 .pdf files - two issues

    Issue 1: In Bridge I cannot open an .ai or .pdf file in Illustrator by clicking a thumbnail. Is 'drag and drop' into Illustrator from Bridge the sole method (apart from not using Bridge)?
    Issue 2: After exporting an .ai file to .pdf (Acrobat Pro CS4), a 'File Association' alert is shown. In previous versions of Illustrator, the newly created .pdf file would open in Acrobat for review/email forwarding etc. when 'preview in Acrobat' was selected... I am confused about the correct settings in the File Associations panel in Bridge (Edit/Preferences/File Handling): under Illustrator it shows Illustrator, and in the drop-down InDesign and two other programmes I do not use. If someone can advise me how to add Acrobat to that drop-down I would be most grateful.
    Peter Ward

    I don't think it could be corrupt preferences. I've tried trashing the prefs as well as reinstalling CS4.
    I've noticed that it has something to do with an object that has a stroke. If I create a new box without a stroke, then my initial measurements are correct. However, if I add a stroke, my measurements change according to the stroke weight. I've looked for ways to turn this off but cannot seem to find anything. I think this problem has something to do with the overall appearance of things.
    Which leads me to another issue I'm having... What happened to the "Filter" menu? When I try to round corners on a rectangle, it applies the round corners as an effect ("appearance"). I have to expand the appearance in order for the corners to actually be rounded. Why did this change? I want to change the actual object the first time I create it, not take extra steps; this can waste a lot of time. I think this same issue is what is going on with the stroke weight. I don't want my object size to increase when I increase my stroke weight; if I wanted that, I would create outlines. Can this be turned off? If not, I will have to continue using CS3 until Adobe fixes this problem.

  • How do I know if this variable is a file handle?

    G'day
    (This has also been posted on StackOverflow)
    Say I have this code:
    function doFileStuff(){
        var file = "";
        try {
            file = fileOpen(filePath);
            // do stuff with file
        }
        finally {
            fileClose(file);
        }
    }
    If the fileOpen() call fails, the fileClose() call will error. What I need to do is this sort of thing (pseudocode):
    if (isFile(file)){
        fileClose(file);
    }
    I know I can test whether file is still an empty string, and that works for me here, but it's not testing what I should be testing: whether file is a file handle. I can check the object's Java class, but that again sounds a bit hacky to me; there should be a CFML way of doing it.
    There should be something like just isFile(), shouldn't there? I can't find anything like this in the docs.
    Any thoughts / tips? I have gone into more depth in my investigations on my blog. it's too wordy for here.
    Cheers for any help.
    Adam

    That would just defer the issue: fileOpen() doesn't return a boolean, so I can't go:
    if (fileOpen(filePath)){
        fileClose(file);
    }
    fileOpen() returns a file object, or nothing if it fails. The whole point is to identify whether it's a file. That's the question.
    As per my original post, it's dead easy to work around, provided one leverages known side effects of the situation (original variable state; that if it's a file it exposes some public properties; that one can do a getClass() on it via Java, etc.), but one shouldn't have to work around something as fundamental as this. So I was wondering if I had missed something.
    Seemingly not (based on feedback I've had from various quarters).
    Adam
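For comparison, the guard Adam is describing looks like this in Java (a hypothetical file-reading helper; here the null check, or try-with-resources, plays the role the missing isFile() test would play in CFML):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch of the guard under discussion: initialise the handle variable,
// and only close it if the open actually succeeded, so a failed open
// never triggers a second error in the finally block.
public class SafeClose {
    static String readFirstLine(Path path) throws IOException {
        BufferedReader reader = null;
        try {
            reader = Files.newBufferedReader(path);
            return reader.readLine();
        } finally {
            if (reader != null) { // the "is it really a handle?" check
                reader.close();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path p = Files.createTempFile("demo", ".txt");
        Files.writeString(p, "hello\nworld\n");
        System.out.println(readFirstLine(p)); // prints hello
    }
}
```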

  • Agent Unreachable, collection status: file handles exhausted

    Hi, I have a problem with management agent. Status of agent in grid control is Agent Unreachable. Here is an output of emctl status agent:
    Oracle Enterprise Manager 10g Release 5 Grid Control 10.2.0.5.0.
    Copyright (c) 1996, 2009 Oracle Corporation. All rights reserved.
    Agent Version : 10.2.0.5.0
    OMS Version : 10.2.0.5.0
    Protocol Version : 10.2.0.5.0
    Agent Home : /opt/oracle/agent10g
    Agent binaries : /opt/oracle/agent10g
    Agent Process ID : 26832
    Parent Process ID : 26821
    Agent URL : https://bs11.xxxx.lan:3873/emd/main/
    Repository URL : https://gridcontrol.xxxx.lan:1159/em/upload
    Started at : 2010-07-06 14:24:30
    Started by user : oracle
    Last Reload : 2010-07-06 14:31:50
    Last successful upload : 2010-07-06 15:11:47
    Total Megabytes of XML files uploaded so far : 51.87
    Number of XML files pending upload : 4
    Size of XML files pending upload(MB) : 0.01
    Available disk space on upload filesystem : 60.35%
    Collection Status : File handles exhausted
    Last successful heartbeat to OMS : 2010-07-06 15:11:52
    I wonder what this message means: "Collection Status : File handles exhausted". I can't find any solution to my problem; I tried restarting the agent, clearstate, upload, resynchronization... all of this did nothing.

    Did you already check: Master Note for 10g Enterprise Manager Grid Control Agent Performance & Core Dump issues [ID 1087997.1]
    On http://support.oracle.com
    https://support.oracle.com/CSP/ui/flash.html#tab=KBHome%28page=KBHome&id=%28%29%29,%28page=KBNavigator&id=%28bmDocTitle=Master%20Note%20for%2010g%20Enterprise%20Manager%20Grid%20Control%20Agent%20Performance%20&%20Core%20Dump%20issues&bmDocDsrc=KB&bmDocType=BULLETIN&bmDocID=1087997.1&viewingMode=1143&from=BOOKMARK%29%29
    Regards
    Rob
    http://oemgc.wordpress.com

  • Invalid File Handle - Windows 7 clients talking to Mac OS 10.5.8  server

    Hello
    I have a file-sharing volume set up under 10.5.8 Server; however, with the addition of Windows 7 clients I'm noticing lots of errors while trying to copy to or from that volume (shared via SMB and AFP) from a Windows machine...
    My Mac OS clients report no issues.
    My Windows clients report "Invalid File Handle".
    any fixes?
    many thanks in advance!

    http://www.laurentnomine.com/2009/09/invalid-file-handle-when-copying-files-from-os-x-leopard-10-5-to-vista7/

  • Browser File Handling

    Hi Folks,
    I have an existing web application where Browser File Handling is currently set to Permissive.
    Requirement: force PDF files to be downloaded (save option), with no opening in the browser.
    I have now changed the setting to Strict, but PDF files still open in the browser. Please let me know how to force download for PDF files only.
    Please give a solution asap.
    Thanks

    Hi,
    From your description, I understand that you want to download PDF files without opening them in the browser.
    I reproduced your issue by following your steps. To resolve it, please try the points below:
    1. Test your case in another browser.
    2. Test on another computer with the same environment.
    3. Clear the cookies of the current browser, then test again.
    Best Regards
    Vincent Han
    TechNet Community Support

  • File handling in Lightroom is pretty, terribly, awful?

    The current workflow in terms of importing and mostly- organising your catalog/files is pretty damn painful as I find it. Is it just me, or are others running into similar issues?
    First, let me state that I'm really fond of the develop features of Lightroom, and have been using the programme since version 1.
    Lately however, I'm running into seriously annoying situations using Lightroom:
    - Importing/moving duplicate filename errors:  I managed to actually lose photographs while trying to import a catalog maintained by a colleague who used different import and photo enumeration (camera) settings. This is something I hoped Lightroom would gracefully manage in the background; in this example his camera reset its count on card format. As he used date-based folder import, he evaded filename conflicts because all identically named photos resided in separate folders by default. However, converting that folder/file structure to mine, which has one file folder per project/shoot, resulted in thousands and thousands of file name conflicts. This simply baffled me. I had to go through tons of hoops in order for Lightroom to allow two identically named photographs to exist in the same folder. Not only would Lightroom not identify and warn that, though the files were named the same, these photographs were NOT identical; it would simply give a list of 'errors' caused by the duplicate names and leave it at that. Why on earth would Lightroom not offer to rename the 'conflicting' files? Or even better, offer the possibility to create unique filenames upon import?
    What do you reckon the OS thinks when multiple files with the same name gets put in the trashcan? Right... Try and undo that one!
    - Can't copy:  Hmmm? Can we not imagine that photographers may want to maintain different physical copies of photographs on separate media? Imagine that I'd like to keep a subset of my archive on a server or external hard drive. How is it possible that this is ...impossible? As much as I love the Virtual Copy in its capacity to save HD space, we'd very much like to have the opposite too: a real copy with 2 different entries in the catalog, or a Physical/Linked Copy that has one negative but two (raw) source files.
    - Moving/File Operations themselves:  So imagine I'm moving all the folders of one superfolder to a differently named one on a server. Because the root folder on the server already exist, I select the 40 folders I want to move and drag them to the folder on the server. Guess what. It starts copying the files to all folders- in parallel. The nightmare of any spinny disks, Lightroom creates 40 folders and as if distributing candy amongst kids starts putting one file in each. Then on to file no. 2. Everyone got 2 files? Yes? Here comes number three. 43 operations in process. WAT?
    Sorry, I'm letting off some steam whilst waiting until these 43 operations complete. (Yes, I don't dare to cancel these operations, because.. oh )
    And I'm probably doing something wrong or perhaps I don't grasp the logic properly. I'd be looking forward to any tips/hints.
    Cheers,
    T

    I've never had such problems, but I've sure heard about them (or problems like them) a fair amount over the years.
    If you can report specific problems to Adobe here (1 problem per report), there is a chance it will do some good:
    Recently active topics in Photoshop Family about Photoshop Lightroom
    Note: one thing I realized way back is that I did not want to leave anything up to Lightroom (or chance), file-handling-wise, and so I always ensure my naming convention does not result in duplicate files. I NEVER want a -2 added to any of my files (ok, sometimes when testing, but otherwise: not). So anyway, to make a long story short, I recommend conventions and a workflow which avoid the drama as much as possible.. - good luck (sorry I've not been more help..).
    Rob

  • I am having a file moving issue with Adobe Reader XI

    I am having a file moving issue with Adobe Reader XI on a Windows 7 32-bit system. When Adobe Reader is set as the default program for PDF files, moving files from one directory to another takes over a minute, regardless of the size of the file. When the default handler is set to another program, like Nuance or PDF Creator, the move is instant. What in Adobe Reader is causing this, and what can I do to speed up file moving? Any help will be greatly appreciated. Thank you.

    I am using Windows 7 home edition and using Internet Explorer 10
    The PDF's are online on websites that I am trying to open.  Both on the site as well as trying in a new window.
    Thank you
    Jeff

  • Peak file handling problem?

    At first I thought this was a video playback issue, but in fact it's more like a peak file handling or priority problem. If you insert several long audio files (>20 mins) into the multitrack you will soon see that scrolling by ctrl + mouse wheel or dragging the scroll bar at certain resolutions will disrupt metering, time display, everything. And apart from being terribly annoying, it's also a total disregard of obvious priorities. With such a program, seamless audio and video playback must have the highest priority (= THE Output), then metering, then time display. Redrawing the clips on the timeline should come at the end of the list, and under no circumstances may it hinder the functioning of the primary outputs, as it does now (on the PC and Mac alike, but it's worse on the former).
    I firmly believe that this issue should be addressed first, before replacing any "missing" features, because this is a fundamental conceptual problem.

    Hi MC and RJ, thanks for the input. I actually tried two different machines. One was a Gigabyte EP35-DS4 with an E8500 proc, 2 gig mem, nVidia 7300GT graphics, with XP, with the mobo audio as well as the MOTU 828mk3 (both ASIO and MME modes). The other was a Gigabyte H55M-USB3 with Core i5 660, built-in Intel gfx, 2 gig ram, an almost perfectly clean and empty Vista 32, and the mobo soundchip. Both machines acted roughly the same. The audio files I used were ordinary 16-bit stereo files, mostly about 40 minutes long each (different ones). I install the trial (everything at default), load two files into the multitrack (no fx, no routing, no nothing), hit play, and with a timeline window of about 50 seconds, I drag the scroll bar from the left extreme to the right extreme at a speed such that it travels for 10-15 seconds. While I'm doing this, the metering and the time display (as well as the video playback, if there's a video loaded) become jerky, at times even freezing for half a second or so (audio is undisturbed). Sometimes better, sometimes worse. With mono files it's less pronounced.
    But even in simple cases, if I just hit play and not touch the scroll bar, when the red play cursor reaches the right side, and the timeline is redrawn, sometimes this is enough to momentarily freeze the video playback.
    My colleague says he saw some of this with the beta on his Mac (I don't have the specs).
    MC, are you saying you don't experience this behaviour?

  • File Handler Leak?

    After we upgraded from 3.0 to 4.0.71, we see multiple file handles opened for the last .jdb file. We have 100+ clients, and each client has its own environment, which caused us to hit the file handle limit on Linux.
    We are using
    OS: Linux 2.6.18-92.1.6.el5 #1 SMP Fri Jun 20 02:36:06 EDT 2008 x86_64 x86_64 x86_64 GNU/Linux
    The environment is writable.
    I would like to confirm whether this is related to the issue fixed in 4.0.92:
    Fix a problem where cleaned and deleted log files could accumulate in the log cleaner's backlog, or list of files to be cleaned. This occurs when multiple cleaner threads are configured. The impacts of this problem are:
    1. The EnvironmentStats.getCleanerBacklog stat is incorrect, which could lead the application to unnecessarily increase the number of cleaner threads.
    2.
    3. If EnvironmentConfig.CLEANER_MAX_BATCH_FILES is set to a non-zero value, log cleaning is disabled when the number of deleted files in the backlog reaches this limit.
    [#18179]
    or whether it is another bug or expected behavior.
    To be specific:
    I ran the lsof for the berkeley DB directory:
    java 13665 mybuys 147r REG 253,0 141869 25927939 /berkeley-db/00000000.jdb
    java 13665 mybuys 148r REG 253,0 141869 25927939 /berkeley-db/00000000.jdb
    java 13665 mybuys 279rw REG 253,0 0 25927936 /berkeley-db/je.info.0.lck
    java 13665 mybuys 280w REG 253,0 0 25927937 /berkeley-db/je.info.0
    java 13665 mybuys 281uw REG 253,0 0 25927938 /berkeley-db/je.lck
    java 13665 mybuys 282r REG 253,0 141869 25927939 /berkeley-db//00000000.jdb
    and note that there are 3 instances of 00000000.jdb. I saw only one file handle to this file just after a server restart. We have a periodic process to refresh the Berkeley DB from a data feed, and we see two handles added with each refresh. The environment handle, entityStore, and primaryIndex are all singletons per client.
    Can someone explain why multiple handles need to be opened for one .jdb file (and it seems always the last one)? Is there a workaround?
    Edited by: JoshWo on Mar 2, 2010 3:14 PM

    Everything you asked about is per environment, not per environment handle. In general, we never allocate expensive resources like file handles for each environment handle.
    For us, it is not critical to have many file handles, but we just need to be able to estimate before deployment so the OS limit can be set correctly (I believe the max is 40K per process on Linux). Any formula to publish?
    For any environment:
    <li> Read-only handles: You can set the maximum to 3, but that is the smallest maximum you can specify. If you only have one log file then only one handle will be open. But if you have 3 log files, then 3 file handles may be opened.
    <li> je.lck: 1 handle is always open.
    For a read-write environment, there are the following 3 additional handles (added to the above):
    <li> Write handle: 1 handle for writing.
    <li> Fsync handle: 1 handle for fsync.
    <li> je.lck: 1 additional handle for exclusive locking.
    Also, you mentioned "maximum number of open-for-read handles is reached". Can you point to a documentation link on this parameter, or elaborate more here? Is it per environment handle or per environment?
    EnvironmentConfig.LOG_FILE_CACHE_SIZE is documented in the javadoc, in the je package.
    So overall (if you set LOG_FILE_CACHE_SIZE to 3), you should assume:
    -- 4 handles for a read-only environment
    -- 7 handles for a read-write environment
    I suggest you test this. I'm giving you these numbers by reading the code, not by testing.
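A back-of-envelope sketch of those numbers: with LOG_FILE_CACHE_SIZE set to 3, that is 4 handles per read-only environment and 7 per read-write environment, so the single-environment-per-client deployment above can budget its OS limit as:

```java
// Sketch of budgeting the OS file-handle limit from the per-environment
// counts given above (assumes LOG_FILE_CACHE_SIZE = 3).
public class HandleBudget {
    static int handlesNeeded(int clients, boolean readWrite) {
        // read-write adds write, fsync, and an extra je.lck handle
        int perEnvironment = readWrite ? 7 : 4;
        return clients * perEnvironment;
    }

    public static void main(String[] args) {
        // 100 clients, each with one read-write environment
        System.out.println(handlesNeeded(100, true)); // prints 700
    }
}
```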
    --mark

  • How to Resolve "Bad File Handle" Problem

    I am using Adobe Acrobat 9.0 and have inadvertently deleted Distiller from my computer. When trying to save an edited PDF, I get the message "Cannot save file. Bad File Handle".
    What shall I do to resolve my problem?  Is this a result of deleting Distiller?  Thank you.

    Hi festuss,
    Can you please let us know how you removed Adobe Distiller from your computer? Also, please let us know the operating system and the exact version of Adobe Acrobat 9.
    This issue can be related to removing Distiller. You can try a repair, or uninstalling and re-installing Adobe Acrobat.
