Two Transformations

For testing purposes I had the following scenario:
Block 1
1) Receive step (Empno, Empname) Test_MI
2) Transformation (Simple Transformation [Empno concat Empno into Empno, Empname to Empname], giving Result)
END Block 1
Block 2
3) Transformation (Result from the previous step to New Output)
4) Send step
END Block 2
The above test failed. The error was:
CL_SWF_XI_MSG_BROKER method CALL_TRANSFORMATION cannot be executed
It looks like the result from a transformation can't be used as an input in the following transformation. Is that right?
Any suggestions will be greatly appreciated.

Hi Mohini,
It will work for sure; I have done this in one of my scenarios and it works. I guess there is something wrong with the mapping or with the interface that is assigned. I know you have been stuck with this for the past two days; let's solve it for sure. Please check whether the interface mapping is correct.
For testing, go step by step:
First, try to save the output of the first transformation to a file.
Then use that XML to test the second mapping.
Thanks,
Prakash

Similar Messages

  • Two transformations in sequence in BPM

    Hi,
    I am trying to add two transformation steps in sequence, one after the other, in my BPM scenario. In the second transformation step I have to use the first transformation's output, but it gives a mapping error in BPM at runtime.
    Is there any trick to making this kind of scenario work? Do I need to add an assignment step between the two transformation steps to pass the output of the first transformation as input to the second transformation step?
    Instead of creating two transformations (two maps), I could use just one transformation (one map directly from the source structure to the target structure).
    Thanks.
    Siva Rama.

    Hi Siva,
    Have a look at the following links:
    http://help.sap.com/saphelp_nw04/helpdata/en/62/dcef46dae42142911c8f14ca7a7c39/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/27/db283fd0ca8443e10000000a114084/content.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/41/e3d13f7fb44c21e10000000a1550b0/content.htm
    Hope this helps.
    Cheers,
    Chandra

  • Transformation issue in BPM

    I have a message type into which I am trying to send data in two different transformation steps in my BPM. The first transformation step does it fine and I am able to see the data. But after the flow completes the second one, I no longer see the data that was already there.
    The second transformation step is nullifying the fields mapped by the first one, even though I am not mapping anything into those fields in my second step.
    Can someone help me with this? Is there some setting to retain the values in the message?

    Hi Prasad,
    Are you using the same container variable as the target for those two transformation steps? If not, try creating a new one with the same message type and use it in the second transformation step.
    If this doesn't work, create a new abstract interface message and use it in the BPM for the second mapping (I remember having some problems with using the same one too).
    Regards,
    michal

  • Error when transporting Oracle datasource and transformation.

    I am doing transports in BI 7.0 from DEV to PRD. While moving a transformation for an Oracle DataSource to PRD, I am facing a "method of execution" error in the transport. Please note that for the same target we are using two transformations from different Oracle DataSources. One gets transported to PRD, while the other one fails with the error message "source system does not exist".
    Please help me to solve this issue. Points will be assigned.

    Hi Davis,
    Before transporting a transformation, make sure that either the source or the target is already available in the target system.
    Suppose you have a transformation between a DataSource and a data target: first transport the DataSource, then transport the transformation along with the data target.
    Before transporting a transformation, either the source or the target should be available in the QA system; only then will the transport go through smoothly.
    Hope this helps.
    regards
    SK

  • Transformation pointing to Quality Client in Production System Repeatedly

    Hi,
    I need to understand one issue, which is explained in detail below, step by step.
    1. I created one customized DataSource (full load) and replicated it in BW.
    2. I also created the InfoObjects, DSO, InfoPackage, Transformation and DTP.
    3. While transporting this modelling to Production (for the first time), I unknowingly/mistakenly missed capturing the DataSource in the transport request. The sequence of the transport was:
    Data Source
    Info Area
    Info Object Catalogue
    Info Object
    DSO
    Cube & MP
    Data Source
    Transformation
    Info Package & DTP
    Now, the moment the TR for the transformation was transported, it threw an error: "Data Source does not exist in Quality client". I then transported the DataSource TR, re-transported the transformation TR to production, and this error was removed.
    But there were then two transformations in production: 1) pointing to production, 2) pointing to quality.
    4. I requested the Basis team to delete the second transformation (the one pointing to quality) from the BW production system.
    5. After that we captured the DataSource and transformation in the same TR and the transport was successful.
    6. Now, after a few days, we transported a change to production for the same object and the system again threw the error mentioned in point no. 3 above and again created two transformations.
    How can this issue be resolved permanently? I would also like to know the reason for this type of system behavior.

    Mohammed,
    I did that as well - please read my step no. 4.
    We were able to delete the transformation that was referring to the quality client, then captured the DataSource and transformation in a single TR, and the transport was successful.
    Now, after a few days, I made one change in the same transformation and transported it to production.
    System behavior:
    Error: "No source exists" (i.e., the system referred to quality in production).
    Also, the transformation referring to the quality client (which we had already deleted with the help of Basis) was again visible in production.
    Now, since the transformation again needs to be deleted from production, our Basis team is telling us they cannot delete objects in production every time. Thus:
    1. Is there any option to solve this issue permanently?
    2. Also, is it necessary to transport the DataSource along with the transformation that is directly linked to it?
    Please note: we have Dev + Quality as one client, and Production.

  • Regarding assign and transform

    Hi all,
    I am doing a task in which data from one database (Oracle 10g) is moved to another database (Oracle 12i).
    The source table contains around 30 columns and the destination contains some N number of columns.
    I use data polling for the source: when data is inserted on the source side, it is automatically picked up and inserted into the destination.
    Around 22 columns are mapped directly, and the remaining ones are validated based on some conditions and then inserted into the destination.
    I am using a transform activity for the direct mapping, and the remaining columns are mapped with an assign activity (after checking the conditions).
    The problem is that only the transform values are getting inserted into the destination table.
    Can't we use both assign and transform in one BPEL process?
    Can't we use two transform activities in the same BPEL process?
    Regards,
    ramakrishna

    Hi Buddy,
    <?xml version = "1.0" encoding = "UTF-8" ?>
    <!--
    Oracle JDeveloper BPEL Designer
    Created: Thu Oct 16 16:11:42 GMT+05:30 2008
    Author: 703036713
    Purpose: Empty BPEL Process
    -->
    <process name="assing_transform"
    targetNamespace="http://xmlns.oracle.com/assing_transform"
    xmlns="http://schemas.xmlsoap.org/ws/2003/03/business-process/"
    xmlns:client="http://xmlns.oracle.com/assing_transform"
    xmlns:bpws="http://schemas.xmlsoap.org/ws/2003/03/business-process/"
    xmlns:ns5="http://xmlns.oracle.com/pcbpel/adapter/db/source/"
    xmlns:ns6="http://xmlns.oracle.com/pcbpel/adapter/db/top/source"
    xmlns:ns7="http://xmlns.oracle.com/pcbpel/adapter/file/source/"
    xmlns:ns8="http://www.emp.org"
    xmlns:ns2="http://xmlns.oracle.com/pcbpel/adapter/db/dest/"
    xmlns:ns1="http://xmlns.oracle.com/pcbpel/adapter/db/soure/"
    xmlns:ns4="http://xmlns.oracle.com/pcbpel/adapter/db/top/dest"
    xmlns:ns3="http://xmlns.oracle.com/pcbpel/adapter/db/top/soure"
    xmlns:xp20="http://www.oracle.com/XSL/Transform/java/oracle.tip.pc.services.functions.Xpath20"
    xmlns:ldap="http://schemas.oracle.com/xpath/extension/ldap"
    xmlns:ora="http://schemas.oracle.com/xpath/extension"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:bpelx="http://schemas.oracle.com/bpel/extension"
    xmlns:orcl="http://www.oracle.com/XSL/Transform/java/oracle.tip.pc.services.functions.ExtFunc">
    <!--
    PARTNERLINKS
    List of services participating in this BPEL process
    -->
    <partnerLinks>
    <partnerLink myRole="dest_role" name="dest" partnerRole="dest_role"
    partnerLinkType="ns2:dest_plt"/>
    <partnerLink myRole="Read_role" name="source"
    partnerLinkType="ns7:Read_plt"/>
    </partnerLinks>
    <!--
    VARIABLES
    List of messages and XML documents used within this BPEL process
    -->
    <variables>
    <variable name="input" messageType="ns7:emp_msg"/>
    <variable name="output" messageType="ns2:EmpCollection_msg"/>
    </variables>
    <!--
    ORCHESTRATION LOGIC
    Set of activities coordinating the flow of messages across the
    services integrated within this business process
    -->
    <sequence name="main">
    <receive name="Receive_1" partnerLink="source" portType="ns7:Read_ptt"
    operation="Read" variable="input" createInstance="yes"/>
    <assign name="Transform_1">
    <bpelx:annotation>
    <bpelx:pattern>transformation</bpelx:pattern>
    </bpelx:annotation>
    <copy>
    <from expression="ora:processXSLT('Transformation_2.xsl',bpws:getVariableData('input','emp'))"/>
    <to variable="output" part="EmpCollection"/>
    </copy>
    </assign>
    <assign name="Assign_1">
    <copy>
    <from variable="input" part="emp" query="/ns8:emp/ns8:empdet/ns8:name"/>
    <to variable="output" part="EmpCollection"
    query="/ns4:EmpCollection/ns4:Emp/ns4:empname"/>
    </copy>
    </assign>
    <invoke name="Invoke_1" partnerLink="dest" portType="ns2:dest_ptt"
    operation="insert" inputVariable="output"/>
    </sequence>
    </process>
    This is my BPEL file, created for testing purposes.
    In it I have taken three fields from an XML file and am trying to send them to the database.
    For two fields I am using transform,
    and for one field I am using assign.
    The BPEL process deploys successfully, but it gives a runtime error (checked in the BPEL console process flow).
    The error is that the assign activity shows an error referring to BPEL spec 1.1, section 14.3.
    Could you suggest whether there are any other alternatives to solve the issue?
    Thanks and regards,
    Ram

  • How to merge XML-Data of two variables?

    Hi all,
    I have to combine information coming from two different sources (the first delivered by client input, the second delivered by an FTP adapter) and pass it to another service.
    So there are different namespaces for the three describing XSDs (2 in, 1 out). As every structure is composed of several elements, I don't want to copy them one by one. But when copying a whole node with child elements, the target structure yields empty nodes, because the target child elements keep the source namespace!
    Using two XSLTs works well for each input, but in combination the last transformation deletes the information of the first one.
    So the only way to merge the data I have figured out so far is to use an XSLT for each input, transforming into two separate variables belonging to the same namespace, and then copying the complete nodes of the two variables into the target (which of course belongs to this namespace as well).
    Although the indexing of the namespaces is inconsistent (see example), happily all the data can be accessed.
      <CommonData>
        <CommonData xmlns="http://TargetNamespace.com/CommonData">
          <FileInfo xmlns="http://TargetNamespace.com/CommonData">
            <Name>testfile2.txt</Name>
            <Size/>
            <DateReceived>2009-02-10T17:15:46+01:00</DateReceived>
          </FileInfo>
          <FileData xmlns:ns0="http://TargetNamespace.com/CommonData">
            <ns0:KOPF>
              <ns0:Id>1</ns0:Id>
              <ns0:Value>1</ns0:Value>
              <ns0:Text>Hier könnten Ihre Daten stehen -</ns0:Text>
            </ns0:KOPF>
            <ns0:POSI>
              <ns0:Id>1</ns0:Id>
              <ns0:Position>1</ns0:Position>
              <ns0:Value>1</ns0:Value>
              <ns0:Text>eins ----</ns0:Text>
            </ns0:POSI>
            <ns0:POSI>
              <ns0:Id>2</ns0:Id>
              <ns0:Position>2</ns0:Position>
              <ns0:Value>2</ns0:Value>
              <ns0:Text>zwei ----</ns0:Text>
            </ns0:POSI>
          </FileData>
        </CommonData>
      </CommonData>
    Now for the question:
    Is this really the only way to merge variables? It took four operations to achieve this, which seems too complicated, and therefore too inefficient, to me.
    BTW: In real life there would be up to 9999 positions with much more payload to be processed ;-)
    Any comments appreciated,
    cheers, Marco
    Edited by: MarcoSfromGER on 11.02.2009 08:43

    Well, if you only want to change the namespace (no other changes), I would use a single generic XSL that moves any XML document into a given target namespace while keeping everything else. That requires two transformation calls, but only one XSL. I'm afraid I don't have an example available, but if you google you can perhaps find it; I have seen it somewhere.
    Normally when you have different namespaces the contents differ as well, though, and what you really want is a merge function. There is a way to pass two documents into a single XSL, and that is to use parameters. You pass one source document the normal way and assign the other, converted to a string, to a parameter. You must use an assign with processXSLT in BPEL, not a transform activity. processXSLT really takes three arguments, and the third contains the parameters. Now to the difficulty: in the transformation you need to change the string into a node-set (an XML document). In the current XPath version used by the platform there is no nodeSet function, but it is possible to write one in Java. If you have it, you can select from both documents while building the third. I don't have an example handy, and yes it is a bit messy as well, but much of the work only has to be done once and can be reused in other processes.
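    As a rough sketch of the parameter idea outside BPEL (plain JAXP rather than ora:processXSLT; the file names and the parameter name are made up for illustration), the main document goes in as the transformation source while the second document is read into a string and handed over as a stylesheet parameter. Converting that string back into a node-set inside the XSL still needs the custom Java extension function mentioned above, which is not shown here.
    import java.io.File;
    import java.io.StringWriter;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;
    public class MergeSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical file names, for illustration only.
            File mainDoc = new File("FileData.xml");
            File secondDoc = new File("FileInfo.xml");
            File mergeXsl = new File("Merge.xsl");
            // The second document travels as a plain string parameter; the XSL
            // must turn it back into a node-set (custom extension, not shown).
            String secondAsString = new String(
                    Files.readAllBytes(secondDoc.toPath()), StandardCharsets.UTF_8);
            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource(mergeXsl));
            t.setParameter("secondDoc", secondAsString);
            StringWriter merged = new StringWriter();
            t.transform(new StreamSource(mainDoc), new StreamResult(merged));
            System.out.println(merged);
        }
    }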

  • How to merge XML-Data of two variables of different namespace?

    Hi all,
    I am facing a problem manipulating XML data in a BPEL process. As I didn't get any response on my question in the BPEL forum, I decided to post it again in this forum. I am not very familiar with using XML up to this moment, so please don't blame me if this turns out to be a stupid question.
    I have to combine information coming from two different sources (the first delivered by BPEL client input "FileInfo", the second delivered by an FTP adapter "FileData") and pass it to another service.
    So there are 3 XSDs (2 in, 1 out), each with a different namespace. As every structure is composed of several elements, I don't want to copy them one by one. But when copying a whole node including child elements, the target structure yields empty nodes, because the target child elements keep the source namespace!
    Using two XSLTs works well for each input, but in combination the last transformation deletes the information of the first one.
    So the only way to merge the data I have figured out so far is to use an XSLT for each input, transforming it into two separate variables belonging to the same namespace, and then copying the complete nodes of the two variables into the target (which of course belongs to the common namespace as well).
    Although the indexing of the namespaces is inconsistent (see example), happily all the data can be accessed.
      <CommonData>
        <CommonData xmlns="http://TargetNamespace.com/CommonData">
          <FileInfo xmlns="http://TargetNamespace.com/CommonData">
            <Name>testfile2.txt</Name>
            <Size/>
            <DateReceived>2009-02-10T17:15:46+01:00</DateReceived>
          </FileInfo>
          <FileData xmlns:ns0="http://TargetNamespace.com/CommonData">
            <ns0:KOPF>
              <ns0:Id>1</ns0:Id>
              <ns0:Value>1</ns0:Value>
              <ns0:Text>Hier könnten Ihre Daten stehen -</ns0:Text>
            </ns0:KOPF>
            <ns0:POSI>
              <ns0:Id>1</ns0:Id>
              <ns0:Position>1</ns0:Position>
              <ns0:Value>1</ns0:Value>
              <ns0:Text>eins ----</ns0:Text>
            </ns0:POSI>
            <ns0:POSI>
              <ns0:Id>2</ns0:Id>
              <ns0:Position>2</ns0:Position>
              <ns0:Value>2</ns0:Value>
              <ns0:Text>zwei ----</ns0:Text>
            </ns0:POSI>
          </FileData>
        </CommonData>
      </CommonData>
    Now for the question:
    Is this really the only way to merge variables? It took four operations to achieve this, which seems too complicated, and therefore too inefficient, to me.
    BTW: In real life there would be up to 9999 positions with much more payload (in total up to several MBs) to be processed!
    Having full control of the namespaces in this case, I could also think of using one namespace for all the XSDs. But would that be a better solution? Is there a golden rule on using namespaces when it comes to design?
    Any comments appreciated,
    cheers, Marco

    Well, if you only want to change the namespace (no other changes), I would use a single generic XSL that moves any XML document into a given target namespace while keeping everything else. That requires two transformation calls, but only one XSL. I'm afraid I don't have an example available, but if you google you can perhaps find it; I have seen it somewhere.
    Normally when you have different namespaces the contents differ as well, though, and what you really want is a merge function. There is a way to pass two documents into a single XSL, and that is to use parameters. You pass one source document the normal way and assign the other, converted to a string, to a parameter. You must use an assign with processXSLT in BPEL, not a transform activity. processXSLT really takes three arguments, and the third contains the parameters. Now to the difficulty: in the transformation you need to change the string into a node-set (an XML document). In the current XPath version used by the platform there is no nodeSet function, but it is possible to write one in Java. If you have it, you can select from both documents while building the third. I don't have an example handy, and yes it is a bit messy as well, but much of the work only has to be done once and can be reused in other processes.

  • Fork with two receive steps

    Hi,
    I have a BPM fork step with two receive steps; one is an IDoc and the other is an ABAP proxy. Both are independent steps, so I use a dummy correlation.
    I am assuming that any message that arrives should start the process. When the proxy runs, it starts the process and works fine.
    When an IDoc arrives, it gives the error "Permanent error in BPE inbound processing".
    In SXMB_MONI the status is "Message scheduled on outbound side".
    Thanks for the answer,
    Vinay.

    Thanks for your replies.
    Actually, I was using a BPM because both messages are transformed to target messages of the same type. So we have two transformations, one for the IDoc and one for the proxy.
    Then we have a single send step.
    I tried not to use a correlation, but since there are two receive steps it asks for the correlation as a mandatory entry in the receive steps.
    I am now going to split them into two separate processes.
    Thanks,
    Vinay.

  • Join two ODS into one cube!

    Hello Experts -
    I need to join fields of two ODS objects into one cube. I know the common field. We are on BI 7.0.
    Can someone guide me through the steps?
    I know there are two transformation rules, and I have created them. How do I link them so that I get one record when the common field matches?

    Syed!!!
    What you are trying is very complicated; if you want to do it with an InfoSet, that is even worse, not only data-wise but also performance-wise. If you are planning to get the key figures only from one ODS and the characteristics from the other ODS, an InfoSet might work as long as you know the relationship (a one-to-one relationship will work fine in an InfoSet). Performance-wise, a MultiProvider is very good, but again, you will get multiple lines, unlike with an InfoSet.
    So before someone decides to use an InfoSet, a MultiProvider, or some other data model, they need to know the reporting requirements well and dig deep into the relationship of the ODS objects.
    I have faced the same situation in many of my projects; there are places where I used an InfoSet and a MultiProvider, and I have also created a separate InfoObject as master data and turned the attributes into navigational attributes.
    Thanks.
    Wond

  • One infosource two datasources

    Hi Friends,
    I have the following task.
    I have two DataSources, one with actual data and one with planning data.
    Now I want to create an InfoCube with both the actual and the planning data, plus one or two additional fields for the difference between the actual and the planning data.
    I have created the DataSources and one InfoCube with all the fields from the two DataSources.
    Then I created two transformations from the DataSources to the InfoSource and one from the InfoSource to the InfoCube.
    But when I want to create the DTP for the InfoCube, it asks me to choose between the two DataSources.
    How can I proceed with this?
    Is there documentation on how to handle two DataSources, one InfoSource and one InfoCube, or how else can I do it?
    Thanks in advance.
    RG. Jimbob

    Hi Jim,
    If I am getting you correctly, you need to create one cube with an Actual key figure and a Planned key figure so that you can calculate the variance.
    In this case, I would suggest not adding the value type to the cube (just keep it up to the InfoSource). Then write a routine for the Actual key figure and the Planned key figure,
    i.e. if VTYPE = 10, populate the actual key figure; else populate the planned key figure.
    If you have the value type InfoObject in the cube, you will always get two records in the cube based on the value type, even if all the other characteristics are the same.
    In the scenario where two records are generated, if you run a report without dragging in the value type, it will aggregate both records as you expect.
    Hope this solves your problem.
    Thanks,
    Vikram

  • Transfer Rules for two Source Systems in Production

    Hi All,
    I have a question.
    I have source system ECDCLNT230 (ECC Development), which is connected to BIDCLNT200 (BID Development). I have two production source systems, ECPCLNT410 (ECP Pre-Production Client) and ECPCLNT400 (ECP Production Client), which I want to connect to BIPCLNT400. When I transport my content from ECDCLNT230 to ECPCLNT410 and ECPCLNT400, it works fine. I created the two source systems in BIPCLNT400 and replicated the DataSources into BIPCLNT400, so all DataSources have been replicated.
    Now, when I transport the request from BIDCLNT200 to BIPCLNT400, I need two transfer rules and two transformations, one for each source system. What kind of settings are required for that?
    Thanks in advance.
    Regards,
    Komik Shah

    Hi,
    If I understand correctly, in your BW production system you want to connect to / load data from your ECC Pre-Production and Production systems.
    In this case, to automate the transports needed for changes to the transfer rules for both source systems, you will need to have two transfer rules in your BW Dev. You can either create two source systems in Dev pointing to the same ECC Dev system, or one pointing to ECC Dev and the second to ECC QA. Use different names for these two source systems.
    In BW Production, you then maintain the source system conversion for both source systems.
    Thanks.

  • Transformations short dump

    Hi All,
    While saving or activating InfoCube transformations in the BI development system, I get an ABAP short dump.
    Short dump:
    Runtime Errors         ASSERTION_FAILED
    Date and Time          04.08.2009 12:35:44
    Short dump has not been completely stored (too big)
    Short text
         The ASSERT condition was violated.
    What happened?
         In the running application program, the ASSERT statement recognized a
         situation that should not have occurred.
         The runtime error was triggered for one of these reasons:
         - For the checkpoint group specified with the ASSERT statement, the
           activation mode is set to "abort".
         - Via a system variant, the activation mode is globally set to "abort"
           for checkpoint groups in this system.
         - The activation mode is set to "abort" on program level.
         - The ASSERT statement is not assigned to any checkpoint group.
    Error analysis
         The following checkpoint group was used: "No checkpoint group specified"
         If in the ASSERT statement the addition FIELDS was used, you can find
         the content of the first 8 specified fields in the following overview:
         " (not used) "
         " (not used) "
         " (not used) "
         " (not used) "
         " (not used) "
         " (not used) "
    Trigger Location of Runtime Error
        Program                                 CL_RSTRAN_GEN_STEPGRP=========CP
        Include                                 CL_RSTRAN_GEN_STEPGRP=========CM00O
        Row                                     19
        Module type                             (METHOD)
        Module Name                             OPEN_IF
    Source Code Extract
    Line  SourceCde
        1 METHOD open_if.
        2
        3   DATA:
        4     ls_step_gen TYPE ty_s_gen_param_step.
        5
        6   READ TABLE o_t_step_gen INTO ls_step_gen
        7     WITH TABLE KEY r_step_ = i_r_step_.
        8   ASSERT sy-subrc = 0.
        9   IF ls_step_gen-r_step_exe IS BOUND.
       10     EXIT.
       11   ENDIF.
       12
       13   IF o_if = if_rstran_gen_root=>c_s_status-open.
       14     r_true = rs_c_false.
       15   ELSEIF o_if = space.
       16     r_true = rs_c_true.
       17     o_if = if_rstran_gen_root=>c_s_status-open.
       18   ELSE.
    >>>>>     ASSERT 1 = 2. "Invalid call
       20   ENDIF.
       21
       22 ENDMETHOD.
    Can anybody tell me why it throws a short dump in this case?
    Please suggest the necessary actions to resolve this issue.
    Thanks,

    Hi Shiva,
    Let me explain my problem.
    I have one InfoCube and two transformations. In the InfoCube I have added a few master data objects directly into the corresponding dimensions and mapped them to read from master data with the current date.
    Now I am changing 'Current date' to 'End date of Calendar Month or Calendar Day' in the transformations and then activating them. While activating the transformations I get the short dump.
    I am using time-dependent master data here.
    Thanks

  • Can we assign the two infoobject to a single datasource

    Hi Friends,
    I have a requirement where I have to assign two InfoObjects to a single DataSource. Please let me know if there is a procedure for this.
    Regards
    sAN

    Create two transformations (don't use an InfoSource unless you are using the read-master-data functionality in a rule).
    One transformation will load to InfoObject 1,
    and the second will load to InfoObject 2.
    Scenario:
    DS + InfoPackage --> Transformation 1 + DTP 1 --> InfoObject 1
    DS + InfoPackage --> Transformation 2 + DTP 2 --> InfoObject 2
    If you are using read master data, then you will have a data flow like this:
    DS + InfoPackage --> Transformation 1 --> InfoSource --> Transformation 1' + DTP 1 --> InfoObject 1
    DS + InfoPackage --> Transformation 2 --> InfoSource --> Transformation 2' + DTP 2 --> InfoObject 2
    Otherwise, a very silly suggestion: create a DSO with these two InfoObjects :-) and load it from the DS.
    Regards,
    Rakesh

  • Xsl:output ignored when doing transformation by XMLReader and XMLFilter

    I want to do an XML transformation via an XSL file using an XMLReader and a filter, as described in the Java tutorial http://java.sun.com/xml/jaxp/dist/1.1/docs/tutorial/xslt/5_chain.html, but it is not working correctly. The transformation does not use the encoding and method set in the XSL file by <xsl:output method="xml" encoding="ISO-8859-2"/>.
    I wrote a simple test program which has two methods doing the transformation:
    transform_method1 - transforms via XMLReader and filter
    transform_method2 - transforms in the other manner described in the same tutorial
    The second method works, but the first does not. Maybe someone knows why the first method is not working correctly?
    Here is the test class:
    package xslt;
    import javax.xml.parsers.*;
    import org.xml.sax.*;
    import org.xml.sax.helpers.*;
    import javax.xml.transform.*;
    import javax.xml.transform.sax.*;
    import javax.xml.transform.stream.*;
    import java.io.*;
    public class TestTransform {
        public static void main(String[] args) {
            File xml_file = new File(args[0]);
            File xsl_file = new File(args[1]);
            File output_file_method1 = new File(args[2]);
            File output_file_method2 = new File(args[3]);
            transform_method1(xml_file, xsl_file, output_file_method1);
            transform_method2(xml_file, xsl_file, output_file_method2);
        }
        // Method 1: chain the XSLT as an XMLFilter in front of an identity Transformer.
        private static void transform_method1(File xml, File xsl, File output) {
            try {
                BufferedInputStream bis = new BufferedInputStream(new FileInputStream(xml));
                InputSource input = new InputSource(bis);
                SAXParserFactory spf = SAXParserFactory.newInstance();
                SAXParser parser = spf.newSAXParser();
                XMLReader reader = parser.getXMLReader();
                SAXTransformerFactory stf = (SAXTransformerFactory) TransformerFactory.newInstance();
                XMLFilter filter1 = stf.newXMLFilter(new StreamSource(xsl));
                filter1.setParent(reader);
                StreamResult result = new StreamResult(output);
                Transformer transformer = stf.newTransformer();
                SAXSource transformSource = new SAXSource(filter1, input);
                System.out.println("encoding method1=" + transformer.getOutputProperty(OutputKeys.ENCODING));
                transformer.transform(transformSource, result);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
        // Method 2: create the Transformer directly from the stylesheet.
        private static void transform_method2(File xml, File xsl, File output) {
            try {
                TransformerFactory tFactory = TransformerFactory.newInstance();
                Transformer transformer = tFactory.newTransformer(new StreamSource(xsl));
                System.out.println("encoding method2=" + transformer.getOutputProperty(OutputKeys.ENCODING));
                transformer.transform(new StreamSource(xml), new StreamResult(output));
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
    And here is the test XML file:
    <?xml version="1.0" encoding="ISO-8859-2"?>
    <root>
        <line>Some text line 1</line>
        <line>Some text line 2</line>
        <line>Some text line 3</line>
    </root>
    The test stylesheet:
    <?xml version="1.0" encoding="ISO-8859-2"?>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:output method="xml" encoding="ISO-8859-2"/>
    <xsl:template match="root">
         <p><xsl:apply-templates/></p>
    </xsl:template>
    <xsl:template match="line">
         <div><xsl:apply-templates/></div>
    </xsl:template>
    </xsl:stylesheet>
    I get this output file from method1:
    <?xml version="1.0" encoding="UTF-8"?>
        Some text line 1
        Some text line 2
        Some text line 3and method2 output:
    <?xml version="1.0" encoding="ISO-8859-2"?>
    <p>
        <div>Some text line 1</div>
        <div>Some text line 2</div>
        <div>Some text line 3</div>
    </p>
    The output on System.out from the program is:
    encoding method1=UTF-8
    encoding method2=ISO-8859-2
    Can someone tell me how I can make a transformation in the transform_method1 style not ignore the <xsl:output method="xml" encoding="ISO-8859-2"/> tag?

    You have two transformations chained together. The first is the XMLFilter, which uses your XSLT, and the second is the Transformer, which takes its output and does an identity transformation on it. The second transformation is the one that produces your final XML, so it's the one that controls what encoding it gets.
    So your stylesheet should be working. Your only "problem" is that you get a different encoding than you expected. And I put "problem" in quotes because the output you are getting is identical, in XML terms, to the output you want to get.
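    Building on that explanation, one way to make transform_method1 match the stylesheet's intent is to set the desired serialization properties on the final identity Transformer, since that is the stage that actually writes the file. A small sketch (it reuses the imports of the TestTransform class above; the encoding value is simply copied from the stylesheet):
    // Variant of transform_method1: the trailing identity Transformer controls
    // serialization, so give it the output properties the stylesheet asked for.
    private static void transform_method1_fixed(File xml, File xsl, File output) {
        try {
            XMLReader reader = SAXParserFactory.newInstance().newSAXParser().getXMLReader();
            SAXTransformerFactory stf = (SAXTransformerFactory) TransformerFactory.newInstance();
            XMLFilter filter1 = stf.newXMLFilter(new StreamSource(xsl));
            filter1.setParent(reader);
            Transformer identity = stf.newTransformer();
            identity.setOutputProperty(OutputKeys.METHOD, "xml");
            identity.setOutputProperty(OutputKeys.ENCODING, "ISO-8859-2");
            InputSource input = new InputSource(new BufferedInputStream(new FileInputStream(xml)));
            identity.transform(new SAXSource(filter1, input), new StreamResult(output));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }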
