Special character £ and € sign issue in File adapter

Hi Experts,
Kindly help me out. I am using the file adapter with file encoding ISO-8859-1. It converts € in the file data to "&#8364;" and £ to "&#163;".
Does ISO-8859-1 support all of these special characters, or do I need to use some other encoding?
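For reference, ISO-8859-1 does contain the pound sign (U+00A3), but it does not contain the euro sign; € exists in ISO-8859-15 and in Unicode encodings such as UTF-8. A minimal stand-alone Java check (a sketch, not adapter code) makes this visible:

import java.nio.charset.Charset;

public class EncodingCheck {
    public static void main(String[] args) {
        Charset latin1 = Charset.forName("ISO-8859-1");
        Charset latin9 = Charset.forName("ISO-8859-15");
        Charset utf8 = Charset.forName("UTF-8");
        // The pound sign U+00A3 is part of ISO-8859-1; the euro sign U+20AC is not
        System.out.println("ISO-8859-1 can encode pound sign: " + latin1.newEncoder().canEncode('\u00A3'));  // true
        System.out.println("ISO-8859-1 can encode euro sign:  " + latin1.newEncoder().canEncode('\u20AC'));  // false
        System.out.println("ISO-8859-15 can encode euro sign: " + latin9.newEncoder().canEncode('\u20AC'));  // true
        System.out.println("UTF-8 can encode euro sign:       " + utf8.newEncoder().canEncode('\u20AC'));    // true
    }
}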
Regards,
Nutan

Hi Ramesh,
Thanks for the reply. I have done the same. This is the code:
While testing the mapping, the euro sign is displayed as €, and per the Java mapping it gets converted to &#8364;. What changes do I need to make? Do I have to handle the '#' in the code or not?
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Map;
import com.sap.aii.mapping.api.StreamTransformation;

public class HandleSpecial implements StreamTransformation {

     private Map param;

     public void setParameter(Map param) {
          this.param = param;
     }

     public void execute(InputStream in, OutputStream out) {
          try {
               int read_data;
               int read_nxt_data;
               while ((read_data = in.read()) != -1) {
                    if (read_data != '&') {
                         out.write(read_data);
                    } else {
                         // Peek at the next byte: only rewrite when '&' is followed by '#'
                         // (assumes the input stream supports mark/reset)
                         in.mark(1);
                         read_nxt_data = in.read();
                         if (read_nxt_data != '#') {
                              in.reset();
                              out.write(read_data);
                         } else {
                              out.write("&#".getBytes());
                         }
                    }
               }
               out.flush();
          } catch (Exception e) {
               e.printStackTrace();
          }
     }
}
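Note that, as written, the stream copy above leaves "&#..." sequences in the output unchanged. If the goal is instead to turn numeric references such as &#8364; back into the real characters before writing (and the output encoding supports them), a string-level decode is one option. The following is only an illustrative sketch, not the posted mapping; it assumes the payload is small enough to hold as a String:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class EntityDecoder {
    private static final Pattern NUMERIC_REF = Pattern.compile("&#(\\d+);");

    // Replaces numeric character references (e.g. &#8364;) with the characters they stand for
    public static String decode(String input) {
        Matcher m = NUMERIC_REF.matcher(input);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            int codePoint = Integer.parseInt(m.group(1));
            m.appendReplacement(sb, Matcher.quoteReplacement(new String(Character.toChars(codePoint))));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(decode("Price: &#8364;100 / &#163;85")); // Price: €100 / £85
    }
}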
Regards,
Nutan

Similar Messages

  • Cell Definition and Reverse sign Issue after upgrade

    Hi ,
    We are having an issue after upgrading to BI 7 from 3.1c.
    There is a collision between cell definitions and the reverse sign setting in the Query Designer.
    The issue: we have a key figure "Net Days AR" which has Reverse Sign checked under its properties. In 3.1c, when we ran the report, Net Days AR would get a value like -40, for example; but now, after the upgrade to BI 7 SP 16, we get 40. The negative sign is missing. The formula is correct.
    I checked whether we have any cell definitions defined on this and found cell definitions on this key figure.
    Values come out correct if there are no cell definitions.
    Thanks in Advance and Points will be awarded.
    Thanks,
    Reddy

    We have the same problem. In BI 2004s with BEx 3.x, once a cell reference is defined, the reverse sign flag is ignored.
    I opened a message with SAP. So far there is no solution, and I think we will never receive one. Here are the answers we received:
    SAP
    I have to say this is a limitation of the 3.5 front end.
    In Designer 3.5, it works this way:
    When you define a cell reference, this cell is blocked from column property changes. No matter how you change the column key figure property, it keeps the status it had when you defined it.
    If you want the defined cell to pick up the property, you need to delete the cell
    reference, change the property, then redefine the reference.
    I agree this is very inconvenient; that is why I would recommend using Query Designer 7.0 if possible. With the 7.0 front end, you can define the properties of defined cells separately. It is more flexible.
    ME
    At the moment it is not possible to migrate to 7.0 frontend.
    I tried to delete the cell reference, change properties and redefine the reference. I tried to create new columns, define cell properties, save and define the cell reference. I tried several different combinations...
    No way, as soon as I define the cell reference, column properties are ignored.
    1) How can we restore the correct behavior of a query in the 3.5 front end?
    2) If we find a way to restore it, we need a smart procedure to modify all our existing queries (including how to identify those affected by this issue).
    SAP
    For the 3.5 front end, you need to do the following:
    1. Delete the cell reference.
    2. Change the column property (such as ticking sign reversal on or off).
    3. Redefine the cell (the cell will then carry the new sign status).
    4. Define the formula based on the cell.
    After step 3, you can't change the cell property no matter what you do with the
    column property. This is the designed behavior.
    If you want to change the property after step 3, you need to start
    all over from step 1.
    ME
    I did it.
    It doesn't work. When I define the cell reference, the column properties are ignored.
    SAP
    It seems there is some issue in the front end.
    Are you using the 3.5 front end with the latest patch?
    Could you please check the front end installation with sapbexc.xla (described in note 197460)? If the check shows problems, do the following:
    - Remove your front end via 'START' - 'PROGRAMS' - 'SAP FRONTEND' - 'REMOVE SAP FRONTEND'.
    - Install the latest front end and GUI patch (according to note 496977 or 496989) and check the installation again.
    ME
    I don't think so. I am using:
    Sapgui 7.10 Patch 8
    Bex 3.50 Patch 3
    Bex 7.x Patch 5
    to be continued...

  • Special character standardization and cleansing

    For foreign character transliteration: can OEDQ examine a field's entry character by character, or would it need to do a token-by-token analysis against a reference list? Either way I'd like to use a reference list containing special characters, but it would be more convenient to use a pure list of special characters rather than a long list of words (tokens) containing those characters.

    Hi Mike,
    Thank you so much for taking time out to answer. Yes these questions are based on doing work in UCM.
    Let me dive a little deeper and get specific.  I think what you're saying is that OEDQ has OOTB functionality that will support any character set, just some (Double-Byte, Arabic) take a little more work than others.  What I want to do in this case is have Polish special characters appear in the UI in their Polish form.  I don't want them substituted for normalized English, or any other characters.
    In the matching process we don't want these special character tokenized attributes to enter the auto-match with similar (but different) English versions automatically.  We will probably want them to enter the suspect match queue for data steward (librarian) review. That is a matter of the match rules tuning process.  Which will be the next thing I'm querying this forum about.
    What I think you are saying in the previous reply is that OEDQ OOTB functionality has the ability to publish Polish special characters in the UI and we can adjust the matching rules during the tuning process to ensure that a data steward checks a special character vs. non-special character exact match?
    Thanks-Aaron

  • Special Character Validation and Reset Focus

    Hi All,
    I have developed a Web Dynpro Java application which carries many input fields.
    Client-side validation needs to be handled for special character restrictions on each of the fields.
    Validation handled through the onEnter or onChange events of a particular field shows the exception, but clicking other fields causes the exception to be ignored.
    My Requirement goes this way.
    field1
    field2
    field n
    Validation Button
    On clicking the Validation button, an exception should be raised for every field that has invalid characters.
    It should be displayed in the message area (using the MessageManager) and carry the name of the field causing the exception. Clicking the exception should take the cursor to the erroneous field.
    Kindly share the methods of any standard interface that does this job, or custom methods if any.
    Best Regards,
    Suresh S

    try WDPermanentMessage
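    To separate the character check itself from the Web Dynpro wiring, here is a hedged plain-Java sketch of the per-field validation that the button's action handler could run. The allowed-character pattern, class and field names are assumptions for the example; reporting the result (via the MessageManager or the WDPermanentMessage mentioned above) is not shown:

    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.regex.Pattern;

    public class SpecialCharValidator {
        // Assumption for the example: only letters, digits, spaces, '-' and '_' are allowed
        private static final Pattern ALLOWED = Pattern.compile("[A-Za-z0-9 _-]*");

        // Returns one message per offending field, to be reported via the MessageManager
        public static Map<String, String> findInvalidFields(Map<String, String> fields) {
            Map<String, String> errors = new LinkedHashMap<String, String>();
            for (Map.Entry<String, String> e : fields.entrySet()) {
                String value = e.getValue() == null ? "" : e.getValue();
                if (!ALLOWED.matcher(value).matches()) {
                    errors.put(e.getKey(), "Field '" + e.getKey() + "' contains invalid characters");
                }
            }
            return errors;
        }

        public static void main(String[] args) {
            Map<String, String> fields = new LinkedHashMap<String, String>();
            fields.put("field1", "plain text");
            fields.put("field2", "bad&value#");
            System.out.println(findInvalidFields(fields));
        }
    }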

  • Production issue - Sender File adapter not picking up the files from folder

    Hi Guys,
    Ever since the upgrade from XI 3.0 to PI 7.1, we have come across instances of a weird error.
    This time again (now the 3rd time, on a random basis), in our production PI server, we have the files lying in the source directory folder on the server.
    I can see the files lying there in AL11.
    However, it looks like the file polling has just stopped, and the channel shows as blank in channel monitoring.
    I have checked in SXMB_MONI and there are no messages since the morning.
    I have tried creating a replica of the current channel, but it is not working.
    This is the production server and this has already created production issues.
    I have checked the SDN forum but am not able to find the details.
    Please help me.
    I am anyway going to raise the issue with SAP now.
    Regards,
    Archana

    Hi Prateek,
    I have been trying all sorts of things since the morning and then checked the file permissions.
    The file permissions were incorrect compared to the other files that were processed successfully today.
    Somehow the permissions were changed on the server and the interface channel was not able to poll the files.
    I got the permissions changed back to 666 and all the files were picked up within a minute.
    I got the folder checked, and it seems the permissions were changed very early in the morning; we are trying to find out how it happened and who did it.
    However, another question I had: this sender file adapter was polling the source directory and deleting the files from there.
    I would have expected that if the channel could not access a file because of its permissions, it would have thrown something like a file permissions error.
    But there was not a single error in the channel monitoring.
    How can we configure it in a better way so that we at least get some kind of error indication?
    Please advise.
    Regards,
    Archana
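    As a side note on the last question: until the channel itself gives an indication, a small stand-alone check of the source directory can at least flag files the adapter's OS user cannot read. A hedged sketch in plain Java, run outside PI; the directory path is a placeholder:

    import java.io.File;

    public class PermissionCheck {
        public static void main(String[] args) {
            // Placeholder path; point this at the adapter's source directory
            File dir = new File("/interface/inbound");
            File[] files = dir.listFiles();
            if (files == null) {
                System.out.println("Directory not readable or does not exist: " + dir);
                return;
            }
            for (File f : files) {
                if (f.isFile() && !f.canRead()) {
                    System.out.println("Not readable (check permissions): " + f.getName());
                }
            }
        }
    }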

  • Regarding issue sender file adapter in clustered  environment(PI 7.0)

    Hi Experts,
    We are using a sender file adapter in a clustered environment (there are 6 J2EE cluster nodes in the XI system) for an interface. The file sender communication channel for this interface is scheduled to run twice every day. Recently the channel stopped polling (picking up the files from the source directory) and I don't see any error.
    When I open communication channel monitoring in RWB, select the file sender communication channel and run it manually, none of the cluster nodes poll for the file.
    I have tried editing the communication channel in the Integration Directory and activating it, but it does not pick up the file.
    Can you let me know how the issue can be resolved?
    Thanks
    -Kaushik
    Edited by: Kausik M on Dec 18, 2008 4:13 AM

    Kausik,
    A computer cluster is a group of linked computers, working together closely so that in many respects they form a single computer. The components of a cluster are commonly, but not always, connected to each other through fast local area networks. Clusters are usually deployed to improve performance and/or availability over that provided by a single computer, while typically being much more cost-effective than single computers of comparable speed or availability.
    Clearly your cluster nodes are out of sync.
    It is possible that your FTP server went down for a while, and that in the FTP machine's profile the entry for the XI server is not permanent.
    1. Try to ping the FTP site from the XI server.
    By the way, are you getting any error message in RWB?
    Regards,
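    As a quick way to test point 1 from the XI host when ping or telnet is not available, a small socket check works too. This is only a connectivity sketch, not adapter configuration; hostname and port below are placeholders:

    import java.net.InetSocketAddress;
    import java.net.Socket;

    public class FtpReachabilityCheck {
        public static void main(String[] args) {
            String host = "ftp.example.com"; // placeholder host
            int port = 21;                   // standard FTP control port
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(host, port), 5000);
                System.out.println("Reachable: " + host + ":" + port);
            } catch (Exception e) {
                System.out.println("Not reachable: " + e.getMessage());
            }
        }
    }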

  • Duplicate File Handling Issues - Sender File Adapter - SAP PO 7.31 - Single Stack

    Hi All,
    We have a requirement to avoid processing of duplicate files. Our system is PI 7.31 Enh. Pack 1 SP 23. I tried using the 'Duplicate File Handling' feature in the sender file adapter, but things are not working out as expected. I processed the same file again and again, and PO creates successful messages every time rather than generating alerts/warnings or deactivating the channel.
    I went through the link "Michal's PI tips: Duplicate handling in file adapter - 7.31". I have maintained similar settings but am unable to get the functionality to work. Is there anything I am missing, or any setting required apart from the Duplicate File Handling checkbox and a threshold count?
    Any help will be highly appreciated.
    Thanks,
    Abhishek

    Hello Sarvjeet,
    I had to write a UDF in message mapping to identify duplicate files and throw an exception. In my case, I had to compare the file load directory (source directory) with the archive directory to identify whether the new file is a duplicate or not. I'm not sure if this is the same case with you. See if the below helps. (I used parameterized mapping to pass the file locations from the Integration Directory rather than hard-coding them in the mapping.)
    AbstractTrace trace = container.getTrace();
    double archiveFileSize = 0;
    double newFileSizeDouble = Double.parseDouble(newFileSize);
    String archiveFile = "";
    String archiveFileTrimmed = "";
    int var2 = 0;
    File directory = new File(directoryName);
    File[] fList = directory.listFiles();
    Arrays.sort(fList, Collections.reverseOrder());
    // Traverse all entries in the archive directory
    for (File file : fList) {
        // Only consider regular files
        if (file.isFile()) {
            trace.addInfo("Filename: " + file.getName() + " :: Archive File Time: " + Long.toString(file.lastModified()));
            archiveFile = file.getName();
            // Drop the first 20 characters (archive file name prefix) before comparing names
            archiveFileTrimmed = archiveFile.substring(20);
            archiveFileSize = file.length();
            if (archiveFileTrimmed.equals(newFile) && archiveFileSize == newFileSizeDouble) {
                var2 = var2 + 1;
                trace.addInfo("Duplicate File Found. " + newFile);
                if (var2 == 2) {
                    break;
                }
            }
        }
    }
    if (var2 == 2) {
        var2 = 0;
        throw new StreamTransformationException("Duplicate File Found. Processing for the current file is stopped. File: " + newFile + ", File Size: " + newFileSize);
    }
    return Integer.toString(var2);
    Regards,
    Abhishek

  • Problem with special characters like å, ä, ö in Sender JMS Adapter

    Hi,
    Problem:
    The sender JMS adapter throws a transformation error when the file includes Western European characters such as å, ä, ö. Because of this, the data is not picked up by the adapter.
    The scenario is JMS -> XI -> Proxy.
    If the file does not include the Western European characters, everything works fine, but when the file includes characters such as å, ä, ö we get the error.
    Our Efforts:
    We have tried ISO-8859-1 in the JMS module tab as shown below.
    1. Transfer.ContentType text/xml;charset=ISO-8859-1
        and also
    2. Transfer.ContentType application/octet-stream;charset=ISO-8859-1
    Neither of them worked.
    Error in CC:
    In sender CC monitoring we get the error below.
    Error while processing message 'aa157082-b064-4421-0fc3-c286d2732093'; detailed error description: com.sap.aii.adapter.jms.api.channel.filter.MessageFilterException: Error converting Message: sun.io.MalformedInputException: TransformException: Error converting Message: 'sun.io.MalformedInputException' at com.sap.aii.adapter.jms.core.channel.filter.SendToModuleProcessorFilter.filter(SendToModuleProcessorFilter.java(Compiled Code)) ...
    Any suggestion in this regard will be a great help.
    Regards,
    Sarvesh

    > I think you have to figure out first which codepage you really have, not just try things. Maybe you can ask someone or check with a hex editor.
    Hi Stefan,
    Finally we solved the problem.
    As you suggested above, to figure out the codepage we asked our MQ team and found that they are using ISO-8859-1. In XI we also tried multiple combinations (ISO-8859-1, UTF-8 and many more), but without success.
    Finally the MQ team changed their encoding to UTF-8 and in XI we used ISO-8859-1, and we succeeded in picking up the data with the special characters. Even though the special characters look distorted within XI, at the receiver end they are in proper shape.
    Thanks a lot for your help.
    Regards,
    Sarvesh
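    For what it's worth, the original error can be reproduced outside the adapter: bytes written as ISO-8859-1 are not valid UTF-8 for å, ä, ö, which is exactly the malformed-input situation reported above. A minimal plain-Java sketch, not adapter code:

    import java.nio.ByteBuffer;
    import java.nio.charset.CodingErrorAction;
    import java.nio.charset.StandardCharsets;

    public class CharsetMismatchDemo {
        public static void main(String[] args) throws Exception {
            byte[] latin1Bytes = "åäö".getBytes(StandardCharsets.ISO_8859_1);
            // Decoding those bytes as UTF-8 fails, which matches the error the adapter reported
            StandardCharsets.UTF_8.newDecoder()
                    .onMalformedInput(CodingErrorAction.REPORT)
                    .decode(ByteBuffer.wrap(latin1Bytes)); // throws MalformedInputException
        }
    }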

  • Polling issue in File Adapter scenario

    Hello Experts,
    I have a scenario wherein I am using a batch command to rename a file before processing it, via "run operating system command before message processing". The command works fine and renames the file, but the issue is that the renamed file is not processed in the same polling interval.
    In next polling interval the file is processed as expected.
    Can you please let me know the reason why this might be happening?
    Thanks.

    Hi Ravi,
    First the file is picked and then the pre-message processing OS command is executed and then the message is processed. Processing of the message is not dependent on whether the OS command gets executed successfully or not. Basically, PI issues the OS command and then processes the message.
    So if you have renamed the file using OS command it should be picked up in the subsequent interval.
    Hope it helps!
    Cheers,
    Anand

  • How to write Header and Footer elements in Write file Adapter

    Hi,
    I have a requirement to write a file. The output file contains header and footer elements; how can we write these elements? These elements are fixed for all files and do not come from any input. Below is a sample file.
    $begintable
    name,Id,Desg
    ad,12,it
    $endtable

    Hi,
    I have created the XSD for you, and I created a sample SOA composite which writes the file just like you want. The XSD below can write a file with one header record, multiple data records and one trailer record.
    <?xml version="1.0" encoding="UTF-8" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
                xmlns:tns="http://TargetNamespace.com/WriteFile"
                targetNamespace="http://TargetNamespace.com/WriteFile"
                elementFormDefault="qualified" attributeFormDefault="unqualified"
                nxsd:version="NXSD" nxsd:stream="chars" nxsd:encoding="UTF-8">
      <xsd:element name="Root-Element">
        <xsd:complexType>
          <!--xsd:choice minOccurs="1" maxOccurs="unbounded" nxsd:choiceCondition="terminated" nxsd:terminatedBy=","-->
          <xsd:sequence>
            <xsd:element name="RECORD1">
              <xsd:complexType>
                <xsd:sequence>
                  <xsd:element name="header" type="xsd:string"
                               nxsd:style="terminated" nxsd:terminatedBy="${eol}"
                               nxsd:quotedBy='"'/>
                </xsd:sequence>
              </xsd:complexType>
            </xsd:element>
            <xsd:element name="RECORD2" minOccurs="1" maxOccurs="unbounded">
              <xsd:complexType>
                <xsd:sequence>
                  <xsd:element name="data1" type="xsd:string"
                               nxsd:style="terminated" nxsd:terminatedBy=","
                               nxsd:quotedBy='"'/>
                  <xsd:element name="data2" type="xsd:string"
                               nxsd:style="terminated" nxsd:terminatedBy=","
                               nxsd:quotedBy='"'/>
                  <xsd:element name="data3" type="xsd:string"
                               nxsd:style="terminated" nxsd:terminatedBy="${eol}"
                               nxsd:quotedBy='"'/>
                </xsd:sequence>
              </xsd:complexType>
            </xsd:element>
            <xsd:element name="RECORD4" nxsd:conditionValue="$endtable">
              <xsd:complexType>
                <xsd:sequence>
                  <xsd:element name="trailer" type="xsd:string"
                               nxsd:style="terminated" nxsd:terminatedBy="${eol}"
                               nxsd:quotedBy='"'/>
                </xsd:sequence>
              </xsd:complexType>
            </xsd:element>
          </xsd:sequence>
        </xsd:complexType>
      </xsd:element>
    </xsd:schema>
    Hope this helps,
    N

  • Issue in File adapter

    hi all,
    In my BPEL process, the source file contains all the records in one single line. Each record has only one field, and the records are delimited by ':'. I am unable to read these multiple records and insert them into a table.
    Can someone please guide me on how to read multiple records from a single (horizontal) line?
    Thanks,
    Kamal.

    In "*Native Format Builder - Step 5 of 7 : Specify Delimiters*" Dialog page
    enter ":" in the Records Delimited by field.
    --Prasanna                                                                                                                                                                                                                                                                                   

  • Content Conversion Issue - sender File adapter..!!

    Hi All ,
    Input file:
    GRP|HD|7001|7001A00443|012|
    GRP-LN|DTL|1|ZTAS|3|002|209782010|0001|EN
    GRP-LN|TXT|Customer: KR Test, Case 3
    GRP-LN|TXT|Power            : -2.25
    GRP-LN|DTL|2|ZTAS|4|002|209782035|0001|EN
    GRP-LN|TXT|Customer: CL Test, Case 4
    GRP-LN|TXT|Sphere Power            : -2.25
    T     7
    I have 4 segments (Header, Detail, Text, Trailer) with "|" as the field separator, and HD, DTL, TXT, T are the key fields.
    I need to get the XML structure in the hierarchy below. The Text segment has to come under the Detail structure. Header, Detail and Trailer have to stay in the same position.
    Recordset
    - Header
    - Detail
      - Text
    - Trailer
    After content conversion, I get the XML structure in the format below. All the segments come out at the same hierarchy level.
    Recordset
      -Header
      -Detail
      -Text
      -Trailer
    The Text segment has to come under the Detail segment. I used the parameters below for FCC.
    Recordset Structure: Header,1,Detail,,Text,,Trailer,1
    Recordset Sequence: Variable
    Header.fieldSeparator
    Header.keyFieldValue
    Header.fieldNames
    Detail.fieldSeparator
    Detail.keyFieldValue
    Detail.fieldNames
    Text.fieldSeparator
    Text.keyFieldValue
    Text.fieldNames
    Trailer.fieldSeparator
    Trailer.keyFieldValue
    Trailer.fieldNames
    Kindly suggest what went wrong and how to solve it with content conversion.
    Thanks
    Deepthi

    Hi All,
    Thank you for your replies.
    I thought of doing it in mapping. My map is like:
    DTL -> E1EDP01
    TXT -> EIEDPT2
    According to my scenario, whenever a DTL comes then E1EDP01 has to come, and EIEDPT2 has to repeat until the next DTL comes.
    Example: each DTL followed by 4 TXT segments. So the output will be:
    DTL -> E1EDP01
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    DTL -> E1EDP01
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    Right now I am not able to generate the above target XML. Every time I get all the TXT segments under one DTL segment, like below:
    DTL -> E1EDP01
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    TXT -> EIEDPT2
    Any suggestions on how we can get rid of this in the mapping?
    Thanks
    Deepthi
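    The grouping rule itself (every TXT belongs to the most recent DTL) is easy to state outside the graphical mapping. The following is only a plain-Java sketch of that rule, not a PI UDF; class and method names are made up for the illustration:

    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    public class DetailTextGrouper {
        // Groups each TXT line under the most recent DTL line
        public static Map<String, List<String>> group(List<String> lines) {
            Map<String, List<String>> detailToTexts = new LinkedHashMap<String, List<String>>();
            String currentDetail = null;
            for (String line : lines) {
                if (line.startsWith("GRP-LN|DTL")) {
                    currentDetail = line;
                    detailToTexts.put(currentDetail, new ArrayList<String>());
                } else if (line.startsWith("GRP-LN|TXT") && currentDetail != null) {
                    detailToTexts.get(currentDetail).add(line);
                }
            }
            return detailToTexts;
        }

        public static void main(String[] args) {
            List<String> lines = new ArrayList<String>();
            lines.add("GRP-LN|DTL|1|ZTAS|3|002|209782010|0001|EN");
            lines.add("GRP-LN|TXT|Customer: KR Test, Case 3");
            lines.add("GRP-LN|TXT|Power            : -2.25");
            lines.add("GRP-LN|DTL|2|ZTAS|4|002|209782035|0001|EN");
            lines.add("GRP-LN|TXT|Customer: CL Test, Case 4");
            System.out.println(group(lines)); // each DTL key maps to its own TXT lines
        }
    }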

  • Issues with File Adapter

    Hi!
    I am working on a File (source.csv) -> XI -> File (target.csv) scenario.
    In my source file I have 4 fields, which are mapped to the target file; the target contains 2 additional fields mapped with constant values. So I should get 6 fields at the receiver end.
    Source structure:
    CharacterName
    Lowerlimit
    Upperlimit
    Targetvalue
    Target structure:
    CharacterName
    Lowerlimit
    Upperlimit
    Targetvalue
    Plant (constant value - 1000)
    Status (constant value - Released)
    But while using the file adapter, only the 4 fields of my source file show up at the receiver end, not the ones mapped with constant values (this is without using FCC).
    When I use FCC in the file adapter, XI does not pick up the file. These are the parameters I am defining in the sender adapter:
    Document Name: file_rece (not sure whether it should be the same as my message type name)
    Recordset Name: Record (not sure about naming conventions)
    Recordset Structure: main,1
    In parameters:
    main.fieldNames: CharacterName,Lowerlimit,Upperlimit,Targetvalue
    main.fieldSeparator: ,
    main.endSeparator: 'nl'
    Receiver adapter:
    Recordset Structure: main,1
    main.fieldSeparator: ,
    main.endSeparator: 'nl'
    I have gone through some previous posts but am not able to resolve it.
    Regards
    Parth

    Hi! Bhavesh
    <b>This is the source Data type xml format generated in IR:</b>
    <?xml version="1.0" encoding="ISO-8859-1"?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http:
    bSU1" targetNamespace="http:
    bSU1">
         <xsd:element name="bsu_source" type="Bbsu_target_dt" />
         <xsd:complexType name="Bbsu_target_dt">
              <xsd:annotation>
                   <xsd:appinfo source="http://sap.com/xi/TextID">
                   853e43e0755211dba7bf001560a58e89
                   </xsd:appinfo>
              </xsd:annotation>
              <xsd:attribute name="CharacterName" type="xsd:string">
                   <xsd:annotation>
                        <xsd:appinfo source="http://sap.com/xi/TextID">
                        a9f9a8a073a911db9df8f654ac1116d8
                        </xsd:appinfo>
                   </xsd:annotation>
              </xsd:attribute>
              <xsd:attribute name="LowerLimit" type="xsd:integer">
                   <xsd:annotation>
                        <xsd:appinfo source="http://sap.com/xi/TextID">
                        a9f9a8a173a911dbaaabf654ac1116d8
                        </xsd:appinfo>
                   </xsd:annotation>
              </xsd:attribute>
              <xsd:attribute name="UpperLimit" type="xsd:integer">
                   <xsd:annotation>
                        <xsd:appinfo source="http://sap.com/xi/TextID">
                        a9fb2f4073a911db868ef654ac1116d8
                        </xsd:appinfo>
                   </xsd:annotation>
              </xsd:attribute>
              <xsd:attribute name="TargetValue" type="xsd:integer">
                   <xsd:annotation>
                        <xsd:appinfo source="http://sap.com/xi/TextID">
                        a9fb2f4173a911db9ed2f654ac1116d8
                        </xsd:appinfo>
                   </xsd:annotation>
              </xsd:attribute>
         </xsd:complexType>
    </xsd:schema>

  • JSP&Special Character

    Hi all,
    I have a problem related to special characters. I have a list with a set of parameters. One of the parameters contains a special character (an apostrophe). I'm using <bean:write ..... /> to display it, and for validation of the parameter I'm passing the value as a String argument to a function. But due to the special character, the JavaScript function is not being invoked.
    Does anybody have an answer for this?
    Any further input will be highly appreciated.

    There are a couple of possible workarounds:
    - escape the quotes
    - replace special characters by their &#<char_code>; value
    - delimit your parameters in JavaScript with double quotes, so the single quote won't be interpreted (that only defers the problem, I agree)
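    A hedged plain-Java sketch of the first two suggestions (escaping the apostrophe and replacing non-ASCII characters with their &#code; form before the value is embedded in the JavaScript call); the class name is made up for the example:

    public class JsParamEscaper {
        // Escapes apostrophes and replaces non-ASCII characters with &#code; references
        public static String escapeForJs(String value) {
            StringBuilder sb = new StringBuilder();
            for (char c : value.toCharArray()) {
                if (c == '\'') {
                    sb.append("\\'");
                } else if (c > 127) {
                    sb.append("&#").append((int) c).append(';');
                } else {
                    sb.append(c);
                }
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            System.out.println(escapeForJs("O'Brien café")); // O\'Brien caf&#233;
        }
    }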

  • File Adapter write XML file with special characters

    Hi,
    I have a BPEL process which uses the DB adapter to retrieve records and outputs an XML document using the file adapter.
    In the table, some records contain special characters such as ">", "<", ".". I was expecting the file adapter to convert them to escaped characters, but it didn't happen; it just put the same values into the XML file, which causes the other system to reject the file. Does anyone know how to resolve this?
    The version of SOA is 10.1.3.4
    Thanks for the help.
    Calvin
    Edited by: user12018221 on May 25, 2011 1:48 PM

    One option is to specify validateXML on the partner link (the one that describes the file adapter endpoint), as shown here:
    <partnerLinkBinding name="StarLoanService">
    <property name="wsdlLocation"> http://<hostname>:9700/orabpel/default/StarLoan/StarLoan?wsdl</property>
    <property name="validateXML">true</property>
    </partnerLinkBinding>
    hth clemens
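    If validateXML is not an option, another route is to escape the reserved characters before the payload reaches the file adapter. A minimal sketch of the escaping itself (an assumed helper, not part of the SOA product); note that '&' must be replaced first:

    public class XmlTextEscaper {
        // Escapes the XML-reserved characters in text content; '&' is handled first
        public static String escape(String text) {
            return text.replace("&", "&amp;")
                       .replace("<", "&lt;")
                       .replace(">", "&gt;");
        }

        public static void main(String[] args) {
            System.out.println(escape("rate < 5 & amount > 100")); // rate &lt; 5 &amp; amount &gt; 100
        }
    }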
