Use IDoc structure as Data Type...

Hi all,
this is my SYNC scenario: SOAP -> PI -> PROXY, with the response going back the same way.
In the proxy I want to use the same structure as the ORDERS05 IDoc, so:
1: I imported the IDoc.
2: I created the external definition.
And then...
First I tried to use the IDoc in the "Message Interface", but when I activate it a message appears saying it "references an IDoc message and a non-IDoc message".
Since that doesn't work, I tried the external definition in the Message Interface. I can activate it, but when I go to ECC and try to regenerate the proxy the error message is "Interface uses external and internal message definitions".
So my last chance was to create a data type with the same structure. For that I imported the XSD, but errors and more errors appear, so I think a data type is probably not the best way either - but I can't find a solution.
Does anyone have any idea?
Thanks.

> this is my SYNC scenario: SOAP -> PI -> PROXY, with the response going back the same way.
>
> In the proxy I want to use the same structure as the ORDERS05 IDoc, so:
> 1: I imported the IDoc.
> 2: I created the external definition.
I hope you are only using the structure of the IDoc and not the IDoc itself, because IDocs do not support synchronous scenarios.
> First I tried to use the IDoc in the "Message Interface", but when I activate it a message appears saying it "references an IDoc message and a non-IDoc message".
As I said, you cannot use an IDoc in a sync scenario, so as per my understanding that error is caused by exactly that.
> Since that doesn't work, I tried the external definition in the Message Interface. I can activate it, but when I go to ECC and try to regenerate the proxy the error message is "Interface uses external and internal message definitions".
As per my experience, you cannot create a proxy from an external definition. I believe this becomes possible in PI 7.1.
> So my last chance was to create a data type with the same structure. For that I imported the XSD, but errors and more errors appear, so I think a data type is probably not the best way either - but I can't find a solution.
>
Yes, you have no other option except creating the data type, so check what is going wrong when you create it.
Regards,
Sarvesh

Similar Messages

  • How to use clob or blob data type in OWB

    How do I use the CLOB or BLOB data type in OWB?
    If OWB does not support these data types, how can I extract large data of these types from the data source?

    The same question was asked just two days ago: No Data Found: ORA-22992
    Nikolai Rochnik

  • Data Mining - Scalar Mining Structure Column Data type error...

    Hoping someone will have a solution for this error
    Errors in the metadata manager. The data type of the '~CaseDetail ~MG-Fact Voic~6' measure must be the same as its source data type. This is because the aggregate function is not set to count or distinct count.
    Is the problem due to the fact that the data type of the column used in the mining structure is Long while the underlying field in the cube has a type of BigInt, or am I barking up the wrong tree?

    You're right, the error does occur when processing the mining model. I've built a simple mining structure to look into the problem - it's just a decision tree based on a cube. The cube itself has a single fact table, with numerous related dimensions. It has two measure groups - one for the use of distinct count; in the other group the aggregation functions used are count and sum.
    The measure which seems to be causing the problem uses the sum aggregation in the cube. The underlying table has this field as a BigInt, which was used due to rounding issues on the cubes when the original Int data type was used.
    In the mining model the same field appears as Continuous with a data type of Long. It has been tried in various guises, in PredictOnly and Ignore modes, both with the same results.
    The mining model used to work without a problem when the underlying type was an Integer, but then the cube would report the wrong figures.
    Interestingly, when I built a mining model directly against the source data there were no problems.
    Let me know if you need any further information.
    Thanks

  • How to use special characters in Data Type?

    Hi Experts,
    I want to create the database structure for the receiver side. In that data type we need to use special characters like ( _ , -- and # ). Is it possible to use special characters when creating the data type?
    Thank you
    G.Praveen Kumar

    Hi,
    read the reply given by Akhila K
    Re: DataType character ñ
    and this
    https://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/9420 [original link is broken]
    Regards,
    Manisha
    Edited by: Manisha Dahatonde on May 15, 2009 2:30 PM

  • Can I use Non-standard XSD Data Types in my XSD; if so how?

    Please help if you can, this is a complex question, so bear with me.
    Also note that I am in Livecycle 8.2 ES (not ES2 or higher).
    I am working on creating XSD schemas to map to form objects.
    I have created one master schema document that is wired into multiple forms, and I have one separate schema for reusable form objects, that I refer to as a "common node".
    All of my individual form schemas are brought together in this one Master Schema via the use of include statements.
    EXAMPLE: This is like my Master Schema
    <?xml version="1.0" encoding="UTF-8"?>
    <!--W3C Schema written by Benjamin P. Lyons - Twin Technologies July 2010-->
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified" attributeFormDefault="unqualified" >
    <xs:include schemaLocation="./commonElementsNode.xsd" />
    <xs:include schemaLocation="./form1111.xsd" />
    <xs:include schemaLocation="./form2222.xsd" />
    <xs:include schemaLocation="./form3333.xsd" />
    <xs:element name="form">
    <xs:complexType>
      <xs:sequence>
       <xs:element ref="commonElementsNode" />
       <xs:element ref="form1111" />
       <xs:element ref="form2222" />
       <xs:element ref="form3333" />
      </xs:sequence>
    </xs:complexType>
    </xs:element>
    </xs:schema>
    This works fine.
    I can load this up in Designer in the Data View and everything appears in the Data View hierarchy correctly, with "form" as the top node.
    And as long as I use standard "xs:" data types - everything works great.  So if I go into LiveCycle Designer and I go to File --> Form Properties --> Preview --> Generate Preview Data and generate dummy XML data - it respects my XSD conventions.
    Now here is where the problem arises:
    In these schemas, I need to define the data types.
    The client I am working for needs me to use their data types.
    These data types are not standard xs: data types, like "xs:string" or "xs:date".
    Rather, the data types are ones that have been defined in other schemas and reserved to a namespace.
    For instance, rather than use xs:date I need to use something like:  "myns:DateType"
    This "myns:DateType" is defined as:
    <xs:complexType name="DateType">
      <xs:simpleContent>
       <xs:extension base="xs:date">
        <xs:attribute name="format" type="xs:string" use="optional">
         <xs:annotation>
          <xs:documentation xml:lang="en">
           <mydoc:Name>Date Format Text</mydoc:Name>
           <mydoc:Definition>The format of the date content</mydoc:Definition>
           <mydoc:PrimitiveType>string</mydoc:PrimitiveType>
          </xs:documentation>
         </xs:annotation>
        </xs:attribute>
       </xs:extension>
      </xs:simpleContent>
    </xs:complexType>
    Note that I have redacted this data type slightly and changed the namespace to protect the anonymity of my client, but we can assume that their data type is valid and currently in use with other systems.
    It conforms to W3 standards.
    Note also how this type is an extension of the base type "xs:date".
    This is defined in a schema called something like "MyCoreTypes.xsd"
    There is a namespace reservation in this file that looks something like this:
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified" attributeFormDefault="unqualified"
    xmlns:myns="http://clinetname.com/schemas/mycoretypes" >
    So there is a namespace reservation there.
    In my aforementioned "Master Schema" file, I have an include statement that looks like this:
    <xs:include namespace="http://clinetname.com/schemas/mycoretypes" schemaLocation="./MyCoreTypes.xsd" />
    (let's assume that the schema is in the same folder, as the Master Schema, so we can use the "./" relative path.)
    Now the problem is that in all my forms where I have a myns:DateType (e.g. in form1111, a "Date of Birth" element that looks like this: <xs:element name="OwnerBirthDt" type="myns:DateType"/>), the XSD is not respected when the XML dummy data is generated in LiveCycle Designer, which implies that the XSD's data types are not being recognized properly.
    Has anyone had this problem before?
    Is there a solution?
    Is it even possible to use these kind of include or import references in LiveCycle to define a data type other that the standard "xs:" data types?
    Please let me know - it would be greatly appreciated.
    I am more than willing to clarify the question further if others are willing to help.
    Thanks -
    Ben Lyons

    prf.kishorekumar wrote:
    i came here with a hope that I would definitely get some help.
    pls.. some one reply
    1) You got some help. Nowhere do I see thanks or acknowledgment for the information given.
    2) Please remember that people on the forum help others voluntarily, it's not their job.
    3) Google can often help you here if the forum can't. Using Google I found this interesting link:
    http://today.java.net/pub/a/today/2004/05/24/html-pt1.html
    It discusses the Swing HTML EditorKit as well as some other free HTML renderers.
    Edited by: petes1234 on Oct 24, 2007 7:29 PM

  • Using DBMS_DATAPUMP with LONG data type

    I've got a procedure below that calls the DBMS_DATAPUMP procedure using a REMOTE_LINK to move a schema from one database to another. However, a couple of the tables within that schema have columns with the LONG data type. And when I run it I get an error saying that you cannot move data with the LONG data type using a REMOTE LINK. So no data in those particular tables gets moved over.
    Has anyone else had this issue? If so, do you have a workaround? I tried adding a CLOB column to my table and setting the new CLOB equal to the LONG, but I couldn't get that to work either... even when I tried using TO_LOB. If I could get that to work, then I could just drop the LONG, move the schema, then recreate the LONG column on the other side.
    Here's my procedure....
    DECLARE
         /* EXPORT/IMPORT VARIABLES */
         v_dp_job_handle                    NUMBER ;          -- Data Pump job handle
         v_count                              NUMBER ;          -- Loop index
         v_percent_done                    NUMBER ;          -- Percentage of job complete
         v_job_state                         VARCHAR2(30) ;     -- To keep track of job state
         v_message                         KU$_LOGENTRY ;     -- For WIP and error messages
         v_job_status                    KU$_JOBSTATUS ;     -- The job status from get_status
         v_status                         KU$_STATUS ;     -- The status object returned by get_status
         v_logfile                         NUMBER ;
         v_date                              VARCHAR2(13) ;
         v_source_server_name          VARCHAR2(50) ;
         v_destination_server_name     VARCHAR2(50) ;
         v_project                     VARCHAR2(30) ;     -- schema to move (referenced below but not declared in the original)
    BEGIN
         v_project := 'TEST' ;
         v_date := TO_CHAR(SYSDATE, 'MMDDYYYY_HHMI') ;
         v_source_server_name := 'TEST_DB' ;
         v_dp_job_handle := DBMS_DATAPUMP.OPEN(
              OPERATION     => 'IMPORT',
              JOB_MODE     => 'SCHEMA',
              REMOTE_LINK => v_source_server_name,
              JOB_NAME     => v_project||'_EXP_'||v_date,
              VERSION          => 'LATEST') ;
         v_logfile := DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE ;
         DBMS_DATAPUMP.ADD_FILE(
              HANDLE          => v_dp_job_handle,
              FILENAME     => v_project||'_EXP_'||v_date||'.LOG',
              DIRECTORY     => 'DATAPUMP',
              FILETYPE     => v_logfile) ;
         DBMS_DATAPUMP.METADATA_FILTER(
              HANDLE          => v_dp_job_handle,
              NAME          => 'SCHEMA_EXPR',
              VALUE          => '= '''||v_project||''' ') ;
         DBMS_DATAPUMP.START_JOB(v_dp_job_handle) ;
         v_percent_done := 0 ;
         v_job_state := 'UNDEFINED' ;
         WHILE (v_job_state != 'COMPLETED') AND (v_job_state != 'STOPPED')
         LOOP
              DBMS_DATAPUMP.GET_STATUS(
                   v_dp_job_handle,
                   DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR + DBMS_DATAPUMP.KU$_STATUS_JOB_STATUS + DBMS_DATAPUMP.KU$_STATUS_WIP,
                   -1,
                   v_job_state,
                   v_status) ;
                   v_job_status := v_status.JOB_STATUS ;
              IF v_job_status.PERCENT_DONE != v_percent_done THEN
                   DBMS_OUTPUT.PUT_LINE('*** Job percent done = '||TO_CHAR(v_job_status.PERCENT_DONE)) ;
                   v_percent_done := v_job_status.PERCENT_DONE ;
              END IF ;
              IF BITAND(v_status.MASK, DBMS_DATAPUMP.KU$_STATUS_WIP) != 0 THEN
                   v_message := v_status.WIP ;
              ELSIF BITAND(v_status.mask, DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR) != 0 THEN
                   v_message := v_status.ERROR ;
              ELSE
                   v_message := NULL ;
              END IF ;
              IF v_message IS NOT NULL THEN
                   v_count := v_message.FIRST ;
                   WHILE v_count IS NOT NULL
                   LOOP
                        DBMS_OUTPUT.PUT_LINE(v_message(v_count).LOGTEXT) ;
                        v_count := v_message.NEXT(v_count) ;
                   END LOOP ;
              END IF ;
         END LOOP ;
         DBMS_OUTPUT.PUT_LINE('Job has completed') ;
         DBMS_OUTPUT.PUT_LINE('Final job state = '||v_job_state) ;
         DBMS_DATAPUMP.DETACH(v_dp_job_handle) ;
    END ;

    > But the application we have that uses the database cannot be changed to read from a CLOB
    Why can't you change the application?
    Well, anyway, you should point out to your superiors that Oracle documented years ago that LONGs should not be used anymore:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14220/datatype.htm#sthref3806
    It clearly states:
    LONG Datatype
    Note:
    Do not create tables with LONG columns. Use LOB columns (CLOB, NCLOB) instead. LONG columns are supported only for backward compatibility.
    Oracle also recommends that you convert existing LONG columns to LOB columns. LOB columns are subject to far fewer restrictions than LONG columns. Further, LOB functionality is enhanced in every release, whereas LONG functionality has been static for several releases.
    > How do I go from CLOB to LONG?
    I'm sorry, I cannot help you with that one; I don't think you can do that at all (Oracle wants us to stop using LONGs, so it's a one-way conversion...):
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1037232794454#15512131314505
    So: no built-in; you'll need to write a program. If the CLOB is ALWAYS LESS THAN 32k in size you can use PL/SQL, but is that the case in your case? Only you know that.
    I believe that question is still unanswered on this forum, but you might try searching for answers on this forum and the 'Database-General' forum: General Database Discussions
    Perhaps you can google a Q&D workaround...
    (And consider convincing your colleagues to just convert your LONGs to LOBs.)
    Edited by: hoek on Apr 8, 2009 5:43 PM
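    As a last resort, the copy can also be done from a client program instead of PL/SQL. The sketch below is a minimal JDBC example (not from the thread above) that copies a LONG column into a CLOB column of the same table; the connection string and the table/column names (my_table, id, long_col, clob_col) are hypothetical placeholders, and it assumes the CLOB column has already been added.
    import java.io.StringReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class LongToClobCopy {
        public static void main(String[] args) throws Exception {
            // Register the Oracle driver and open a connection (hypothetical credentials).
            Class.forName("oracle.jdbc.OracleDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");
            con.setAutoCommit(false);
            PreparedStatement sel = con.prepareStatement(
                    "SELECT id, long_col FROM my_table");
            PreparedStatement upd = con.prepareStatement(
                    "UPDATE my_table SET clob_col = ? WHERE id = ?");
            ResultSet rs = sel.executeQuery();
            while (rs.next()) {
                long id = rs.getLong(1);
                // getString materializes the whole LONG value in memory, which is
                // acceptable for a one-off migration of moderately sized rows.
                String longValue = rs.getString(2);
                if (longValue != null) {
                    // Write the value into the CLOB column via a character stream.
                    upd.setCharacterStream(1, new StringReader(longValue), longValue.length());
                    upd.setLong(2, id);
                    upd.executeUpdate();
                }
            }
            rs.close();
            sel.close();
            upd.close();
            con.commit();
            con.close();
        }
    }
    (A pure-SQL alternative, as hinted at above, is a CREATE TABLE ... AS SELECT that applies TO_LOB to the LONG column, followed by swapping the tables.)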

  • Type incompatibility between two structures in data type

    hi
    I have two structures, x_autwr and y_autwr. The problem is that the data type of x_autwr is CURR, length 15, and that of y_autwr is DEC, length 23.
    Now when I write
    x_autwr = y_autwr
    it gives erroneous data; for example, if the value of y_autwr is 2000.0000, it arrives in x_autwr as 200000.00.
    Please note that I cannot change the data types, as they come from standard structures. I have tried using local variables of type i, c, or decimals with 2 places, and dividing by 100, but no logic works.
    Please help.

    I think this assignment is not correct. The CURR fields are always associated with a CUKY; do you have the currency key with you?
    If yes, then you can try this logic:
    DATA:
      l_corr_factor TYPE i,
      ls_bapi1090_1 TYPE bapi1090_1.
    x_autwr = y_autwr.
    CALL FUNCTION 'BAPI_CURRENCY_GETDECIMALS'
      EXPORTING
        currency          = v_waers  "Your currency key field
      IMPORTING
        currency_decimals = ls_bapi1090_1.
    l_corr_factor = 10 ** ls_bapi1090_1-curdecimals.
    x_autwr = x_autwr / l_corr_factor.
    BR,
    Suhas

  • External SOAP client Accessing Webservice using built in Java Data type

    We have built a webservice and deployed it on WLS. We accessed this from a Swing client and it works fine. The webservice methods use non-built-in Java data types as parameters. These are value objects, and the Java serializer and deserializer classes are generated by the Ant task itself. The client code uses the value objects provided by the client JAR (generated using the <clientgen> Ant task) to pass to the webservice. We don't want to go that way. Is there any way we can pass parameters to a method which expects non-built-in Java data types?
    The reason I am asking is that I want to know how an external client (a .NET client) will be able to execute my webservice, i.e. pass the required object which the method is expecting.

    Hi Anish,
    Well first off, your web service doesn't send or receive "objects". It only sends
    and receives XML, which is in turn converted to/from Java objects at invocation
    time. Second, a .NET (or Perl, or Python, or C++) client will be able to call
    your web service, because the wsdl.exe tool (for .NET) will generate "programming
    language specific" objects from the <types><schema> elements, in the WSDL of your
    web service :-) The wsdl.exe tool will create C# objects from the WSDL, that will
    convert XML to/from C# when your web service is called. That's the beauty of XML
    schema - it's a "universal typing system", so it's not tied to a particular programming
    language. The only issue is whether or not the web services platform vendor's
    XML Schema/WSDL processor, can successfully process the WSDL. Some vendors have
    more complete implementations of the WSDL and XML Schema specs than others, so
    expect varying success here. The one in WLS 7.0 is pretty good, so you shouldn't
    have too many problems consuming WSDL generated by .NET tools (or any other tool
    for that matter).
    Regards,
    Mike Wooten
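    To make the idea concrete, here is a rough illustration (not from the original reply) of what a schema-derived value object looks like on the Java side; the class and field names are hypothetical, and in practice the real class is produced by clientgen (or by wsdl.exe on the .NET side) from the <types><schema> section of the WSDL.
    // Hypothetical value object corresponding to a complexType in the WSDL's <types> section.
    // The generated serializer/deserializer maps its fields to and from XML elements.
    public class CustomerInfo implements java.io.Serializable {
        private String name;
        private int accountNumber;

        public String getName() { return name; }
        public void setName(String name) { this.name = name; }

        public int getAccountNumber() { return accountNumber; }
        public void setAccountNumber(int accountNumber) { this.accountNumber = accountNumber; }
    }
    A .NET client never sees this Java class; wsdl.exe generates an equivalent C# class from the same schema, and only XML travels between the two platforms.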

  • Problem in LSMW using IDOC in convert data step

    Hi,
    I am getting a run-time error in the CONVERT DATA step of LSMW using IDoc. The error displayed is a syntax error. I am not able to figure out the error. If anybody has an idea about this, please help me...
    The data is read successfully.
    Edited by: majualex on Nov 4, 2009 9:23 AM

    Error in the ABAP Application Program
    The current ABAP program "/SAPDMC/SAPLLSMW_OBJ_070" had to be terminated because it has come across a statement that unfortunately cannot be executed.
    The following syntax error occurred in program "/1CADMC/SAP_LSMW_CONV_00000051" in include "/1CADMC/SAP_LSMW_CONV_00000051" in line 25:
    ""." expected after "DATA"."
    I think the program is a standard ABAP program.

  • Create interface or use idoc structure directly?

    Dear all,
    I am creating a scenario from a FILE (as IDOC structure) to RFC message.
    If I create an Interface Mapping, do I use the IDoc structure and RFC message as source and target interface, or do I first create interface definitions based on these structures and then use those?
    Regards.

    > I am creating a scenario from a FILE (as IDOC
    > structure) to RFC message.
    >
    > If I create a Interface Mapping, do I use the IDOC
    > structure and RFC message as source and target
    > Interface or do I first create interface definitions
    > based on these structures and then use these?
    You are talking about IDoc-XML? Then you should be able to use the IDoc Interfaces with your sender service directly.
    CHRIS

  • Possible? Using External Definitions for data type/message type definitions

    Hi all,
    I want to use external definitions in my own data type definitions. Is that possible? And if yes, how?
    I imported many definitions via a mass import. Now I also want to use these definitions for further interfaces (beyond the imported ones). The most convenient option would of course be to just use them like a custom-created type. Can I somehow "convert" an external definition to a data type/message type?
    KR
    Felix

    Hello Felix,
    The external definition which you have imported cannot be used in a data type or message type, but you can use it in a service interface.
    You cannot modify that external definition directly; if you want to change it, first modify the XSD and then re-import it as another external definition.
    Once you create a service interface with this external definition, you can use it in your message mapping.
    I hope this helps to clear your doubt.
    Monica

  • Joining table using CHAR and VARCHAR data type as indicator

    Hi All,
    I would like to join 3 tables together, but I'm unable to get an exact match (using =) between CHAR and VARCHAR data types.
    Is there a command to have all the data treated uniformly, as either CHAR or VARCHAR?
    Thanks in advance for your help !

    You want the database to perform with such a crappy database design? Good luck with that.. Instead, you need to look at why you are NOT using surrogate integer based keys for your data tables..
    Thank you,
    Tony Miller
    Webster, TX
    If vegetable oil is made of vegetables, what is baby oil made of?
    If this question is answered, please mark the thread as closed and assign points where earned..

  • Using oracle.sql.BLOB data type in Java Class to pass in a Blob

    All,
    I'm trying to pass in a BLOB from PL/SQL to a Java Class, but the BLOB isn't passed correctly.
    When I check the length of the BLOB in PL/SQL, it's different from the length of the BLOB in Java.
    I'm using DB 11g and the ojdbc5.jar file in my Java classes.
    The Java function uses the oracle.sql.BLOB type to receive the parameter.
    The Java class is loaded into the DB and called via a PL/SQL function.
    Kind regards,
    Nathalie

    The question is indeed defined a little ambiguously ;o)
    When I pass the BLOB to the Java method, invoke BLOB.getBytes() and then get the length of the BLOB in Java, the length of the BLOB is bigger than in PL/SQL.
    When I use the method getBinaryStream and write it to a buffer, the code works.
    I will log a TAR regarding the getBytes() method to ask for more detailed information about the methods provided by the JDBC drivers.
    Kind regards,
    Nathalie
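    For reference, a minimal sketch of the approach that worked (reading the BLOB through getBinaryStream rather than getBytes()); the class and method names are hypothetical. Since oracle.sql.BLOB implements java.sql.Blob, the parameter can be declared with the standard interface, and the method would be exposed to PL/SQL through a call specification.
    import java.io.ByteArrayOutputStream;
    import java.io.InputStream;
    import java.sql.Blob;

    public class BlobHelper {
        // Returns the number of bytes actually read from the BLOB passed in from PL/SQL.
        public static int getLength(Blob data) throws Exception {
            InputStream in = data.getBinaryStream();
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            try {
                byte[] chunk = new byte[8192];
                int read;
                while ((read = in.read(chunk)) != -1) {
                    buffer.write(chunk, 0, read);  // copy the stream chunk by chunk
                }
            } finally {
                in.close();
            }
            return buffer.size();
        }
    }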

  • Jdev11g: How to use a Ord.image data type with ADF Faces 11g

    Where can I find an example of using the OrdImage data type with ADF Faces 11g Preview 3?

    Hi,
    such a sample doesn't yet exist.
    Frank
    Btw.: The Jdeveloper 11 forum is JDeveloper and OC4J 11g Technology Preview

  • Copy imported IDOC structure to own data type structure in PI

    I am looking to copy the imported IDoc structure to a data type structure; is there any way to do this instead of creating all the elements again? The reason is that I need to change the occurrence of one of the elements, and the standard IDoc structure wouldn't let me do it.
    Thanks,
    Menaga

    Hi Menaga,
    In order to change the occurrence in the IDoc, you will need to import the IDoc as an external definition, which you can then use for message mapping (once you have changed the occurrence in the XSD and imported it).
    Michal's blog will help you get to it precisely,
    /people/michal.krawczyk2/blog/2005/12/04/xi-idoc-bundling--the-trick-with-the-occurance-change
    Good luck,
    Regards,
    Pavan
