Mapping of data.

Hi All,
Please help with the following scenario:
I have a view context node that is mapped directly from the component controller context node. If I clear the elements in the view context node, the elements in the component controller context node are cleared as well; as a result I am getting an out-of-bounds problem in the application.
Regards,
CSP

Hi,
Do not map the context in case you do not want the nodes kept in sync.
Instead, use
wdThis.wdGet<ControllerName>Controller().wdGetContext().node<YourNode>() and copy the contents using WDCopyService, for example:
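A minimal sketch (Web Dynpro Java; the controller and node names are placeholders for your own):
// the mapped source node in the component controller
IWDNode source = wdThis.wdGetMyCompController().wdGetContext().nodeMyNode();
// a structurally identical but unmapped node in the view context
IWDNode target = wdContext.nodeMyNode();
// copy element-wise; afterwards, clearing the view node no longer
// touches the component controller data
WDCopyService.copyElements(source, target);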
Ashu

Similar Messages

  • Problem with context mapping and data flow in a FPM application

    Hi All,
    I am trying to develop an ESS application using FPM. For the same, the requirement is to see the history of an employee in the second view.
    The first view has got just the overview information and the second one has got the detail. So, the records or the fields are the same on both the views.
    As per the FPM guidelines, the Model is residing in the Fc component and the respective Vc components are using the model data accordingly.
    I am executing the model in the Fc component by calling the executable method in the interface controller from the first view, and then displaying the output data of the BAPI in the first view, which provides the overview information. This is working fine.
    But when I try to map the same output node to the Table UI for the second view, the record size comes back as zero and thus no information is available.
    To work around the issue, I am executing the RFC again in the InterfaceController of the second view to populate the records, which is incorrect as it has already been executed and the data is available for the first view.
    I request you to let me know the correct approach to context mapping and data flow when using the FPM roadmap. Is there any standard method or approach available to deal with such requirements? Please let me know.
    Thanks in advance.
    Regards
    DK

    Hi Idhaya,
    The model node is available in Fc, and the Fc interface controller is used in the first Vc and the second Vc.
    The idea is: as the executable method is generated in the Fc, I have created a custom method in Fc to call the executable method, where the input parameter is passed in; this custom method is finally called from the first Vc.
    So now my first Vc is ready to call the custom method in Fc and execute the RFC. Once the RFC is executed, the nodes in the Fc should get populated, which is the ideal case.
    And as the Fc is used as a component in the second Vc, the same node is available to the UI elements.
    But when I check the record size for the output node, it is always zero for the second Vc.
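    One common cause of this (a guess, sketched with a hypothetical usage name "Fc"): each Vc creates its own instance of the Fc component usage, so the RFC fills the node of one Fc instance while the second Vc reads a different, empty instance. Guarding the instantiation, e.g. in wdDoInit of each Vc, keeps a single shared instance:
    // create the used Fc component only if no active instance exists yet
    IWDComponentUsage fcUsage = wdThis.wdGetFcComponentUsage();
    if (!fcUsage.hasActiveComponent()) {
        fcUsage.createComponent();
    }
    If both Vc components reference the same Fc instance, the node populated by the first view is also visible to the second.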
    Regards
    DK

  • Problem in mapping xml data with header details from IPM 11g to BPEL

    Hi,
    I want to map XML data as supporting content from an IPM application to BPEL.
    My xml is
    <?xml version="1.0" encoding="utf-8"?>
    <DocumentFile xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://tempuri.org/Document.xsd">
    <Document DocumentType="Invoice">
    <DocumentImage>
    <Filename>\\10.205.0.209\Img\10883212.TIF</Filename>
    </DocumentImage>
    </Document>
    </DocumentFile>
    If I remove the header details (the namespace declarations) from the root element <DocumentFile>, i.e. the modified xml is
    <DocumentFile>
    <Document DocumentType="Invoice">
    <DocumentImage>
    <Filename>\\10.205.0.209\Img\10883212.TIF</Filename>
    </DocumentImage>
    </Document>
    </DocumentFile>
    it works fine, but I need to pass the header details as well.
    Please suggest.
    Thanks,
    Priya


  • Context Mapping and data binding

    Hi,
    Please explain context mapping and data binding,
    and also the differences between them.
    Thanks & Regards,
    Sri

    Hi Sridevi Sudunaguntla,
    Context mapping means mapping between different contexts.
    For example, suppose we create a node in the component controller and want to use that node in view A and view B; then we have to map this node from the component controller to the required views. This mapping of context nodes is called context mapping. For context mapping, create the nodes in the component controller, then go to the view -> Context and drag and drop the required node from the component controller to the view.
    Data binding means binding the data from a node to the UI elements, or vice versa. After the nodes are created in the individual views, or after context mapping, we have to bind these nodes to the UI elements. Suppose we have an input field and we want to read it; then we have to bind the input field to an attribute in the node. This binding is called data binding.
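    A small sketch of how the two meet in code (Web Dynpro Java; the node and attribute names are placeholders): once a node Customer is mapped into the view and an InputField's value property is bound to its Name attribute, the typed-in value is read through the context rather than through the UI element:
    // the InputField is data-bound to Customer.Name in the view layout;
    // reading the attribute returns whatever the user typed
    String name = wdContext.currentCustomerElement().getName();
    // writing the attribute updates the bound InputField on the next render
    wdContext.currentCustomerElement().setName(name.trim());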
    Regards
    Sarath

  • Error in Mapping of Data for Scrambling Rule due to Inconsistency detected in Mapping Engine PIFD

    Hi all,
    We are performing HCM data scrambling; however, in the Package Settings phase we are encountering an error in the mapping of data for a scrambling rule, saying there is an inconsistency detected in mapping engine PIFD.
    We have already implemented and checked below SAP notes, but still no luck.
    1854557 - Missing PIFD objects or mappings after LT AddOn install
    1665861 - TDMS 4.0 - Collective note for Scrambling
    Please advise how to fix the issue; the project is at a standstill due to this.
    OSS message already opened.
    Thanks and regards,
    Philip

    Hi
    This is bug 5195315, which looks to be fixed in a 10.2 patch. If you can pick up the latest 10.2 patch, there are about three years' worth of fixes to pick up.
    Cheers
    David

  • Mapping util.Date to Oracle timestamp

    Tuesday, March 22, 2005
    I am currently experiencing difficulty in mapping a java.util.Date
    field to an Oracle TIMESTAMP column.
    Here's what I see. By default, Kodo maps the date field to a DATE
    column. I suppose this makes sense since Oracle's date columns
    have time information that resolves to the second. In this case,
    the client has a business case to store subsecond resolution,
    hence the desire to store the date field in an Oracle TIMESTAMP
    column.
    First question: how should this be done?
    Here's what I've tried. I tried setting the jdbc-type extension
    for the date field to "timestamp". This setting makes no
    difference, and I suspect the reason is that OracleDBDictionary
    has made the mapping from TIMESTAMP to DATE.
    I tried setting the jdbc-sql-type extension for the date field to
    "timestamp". This makes a difference only when I drop the table.
    Then the schematool's refresh action creates a table with date's
    field mapped to a TIMESTAMP column. I have also gone ahead and
    manually altered the table to achieve the same effect.
    Once the mapping is created, I see the following behavior. Kodo
    has no problem reading the TIMESTAMP column and putting the info
    into the date field. It also has no problem saving non-null date
    values into the TIMESTAMP column. But it does have a problem
    storing a null in the date field.
    Second question: what is the workaround to this problem?
    The stack dump (obtained by using the JDO Tools Library
    example) follows.
    Thanks in advance,
    David Ezzio
    enter command:
    --> return book
    Select the book to return:
    1. book [com.ysoft.jdo.book.library.Book-354] "Gone to War" checked out:
    Tue Mar 22 10:38:01 EST 2005
    2. book [com.ysoft.jdo.book.library.Book-356] "Gone to Work" checked
    out: Tue Mar 22 10:33:58 EST 2005
    3. book [com.ysoft.jdo.book.library.Book-357] "Gone Fishing" checked
    out: Tue Mar 22 10:33:58 EST 2005
    4. book [com.ysoft.jdo.book.library.Book-360] "Gone Sailing" checked
    out: Tue Mar 22 10:33:58 EST 2005
    5. book [com.ysoft.jdo.book.library.Book-355] "Gone Hunting" checked
    out: Tue Mar 22 10:33:58 EST 2005
    Enter selection:
    --> 2
    okay
    enter command:
    --> commit
    exception caught in command
    kodo.util.FatalDataStoreException: The transaction has been rolled back.
    See the nested exceptions for details on the errors that occurred.
    at kodo.runtime.PersistenceManagerImpl.throwFlushException(PersistenceManagerImpl.java:1262)
    at kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:1122)
    at kodo.runtime.PersistenceManagerImpl.flushSafe(PersistenceManagerImpl.java:1005)
    at kodo.runtime.PersistenceManagerImpl.beforeCompletion(PersistenceManagerImpl.java:932)
    at kodo.runtime.LocalManagedRuntime.commit(LocalManagedRuntime.java:69)
    at kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:592)
    at com.ysoft.jdo.book.library.LibraryHandler.commitTransaction(LibraryHandler.java:175)
    at com.ysoft.jdo.book.library.client.CommitTransaction.execute(Library.java:279)
    at com.ysoft.jdo.book.common.console.UserInterface.execute(UserInterface.java:196)
    at com.ysoft.jdo.book.common.console.UserInterface.pumpCommands(UserInterface.java:186)
    at com.ysoft.jdo.book.library.client.Library.run(Library.java:139)
    at com.ysoft.jdo.book.library.client.Library.main(Library.java:104)
    NestedThrowablesStackTrace:
    kodo.util.DataStoreException: Invalid column type
    at kodo.jdbc.sql.DBDictionary.newDataStoreException(DBDictionary.java:3081)
    at kodo.jdbc.sql.SQLExceptions.getDataStore(SQLExceptions.java:77)
    at kodo.jdbc.sql.SQLExceptions.getDataStore(SQLExceptions.java:63)
    at kodo.jdbc.sql.SQLExceptions.getDataStore(SQLExceptions.java:43)
    at kodo.jdbc.runtime.PreparedStatementManager.flush(PreparedStatementManager.java:89)
    at kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:445)
    at kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:193)
    at kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:95)
    at kodo.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:609)
    at kodo.runtime.DelegatingStoreManager.flush(DelegatingStoreManager.java:153)
    at kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:1122)
    at kodo.runtime.PersistenceManagerImpl.flushSafe(PersistenceManagerImpl.java:1005)
    at kodo.runtime.PersistenceManagerImpl.beforeCompletion(PersistenceManagerImpl.java:932)
    at kodo.runtime.LocalManagedRuntime.commit(LocalManagedRuntime.java:69)
    at kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:592)
    at com.ysoft.jdo.book.library.LibraryHandler.commitTransaction(LibraryHandler.java:175)
    at com.ysoft.jdo.book.library.client.CommitTransaction.execute(Library.java:279)
    at com.ysoft.jdo.book.common.console.UserInterface.execute(UserInterface.java:196)
    at com.ysoft.jdo.book.common.console.UserInterface.pumpCommands(UserInterface.java:186)
    at com.ysoft.jdo.book.library.client.Library.run(Library.java:139)
    at com.ysoft.jdo.book.library.client.Library.main(Library.java:104)
    NestedThrowablesStackTrace:
    java.sql.SQLException: Invalid column type
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:179)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:269)
    at oracle.jdbc.driver.OracleStatement.get_internal_type(OracleStatement.java:6164)
    at oracle.jdbc.driver.OraclePreparedStatement.setNull(OraclePreparedStatement.java:1316)
    at com.solarmetric.jdbc.DelegatingPreparedStatement.setNull(DelegatingPreparedStatement.java:369)
    at com.solarmetric.jdbc.PoolConnection$PoolPreparedStatement.setNull(PoolConnection.java:406)
    at com.solarmetric.jdbc.DelegatingPreparedStatement.setNull(DelegatingPreparedStatement.java:369)
    at com.solarmetric.jdbc.DelegatingPreparedStatement.setNull(DelegatingPreparedStatement.java:369)
    at com.solarmetric.jdbc.DelegatingPreparedStatement.setNull(DelegatingPreparedStatement.java:369)
    at com.solarmetric.jdbc.LoggingConnectionDecorator$LoggingConnection$LoggingPreparedStatement.setNull(LoggingConnectionDecorator.java:792)
    at com.solarmetric.jdbc.DelegatingPreparedStatement.setNull(DelegatingPreparedStatement.java:369)
    at kodo.jdbc.sql.DBDictionary.setNull(DBDictionary.java:950)
    at kodo.jdbc.sql.OracleDictionary.setNull(OracleDictionary.java:450)
    at kodo.jdbc.sql.RowImpl.toSQL(RowImpl.java:828)
    at kodo.jdbc.sql.RowImpl.flush(RowImpl.java:1039)
    at kodo.jdbc.sql.RowImpl.flush(RowImpl.java:975)
    at kodo.jdbc.runtime.PreparedStatementManager.flushInternal(PreparedStatementManager.java:160)
    at kodo.jdbc.runtime.PreparedStatementManager.flush(PreparedStatementManager.java:84)
    at kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:445)
    at kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:193)
    at kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:95)
    at kodo.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:609)
    at kodo.runtime.DelegatingStoreManager.flush(DelegatingStoreManager.java:153)
    at kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:1122)
    at kodo.runtime.PersistenceManagerImpl.flushSafe(PersistenceManagerImpl.java:1005)
    at kodo.runtime.PersistenceManagerImpl.beforeCompletion(PersistenceManagerImpl.java:932)
    at kodo.runtime.LocalManagedRuntime.commit(LocalManagedRuntime.java:69)
    at kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:592)
    at com.ysoft.jdo.book.library.LibraryHandler.commitTransaction(LibraryHandler.java:175)
    at com.ysoft.jdo.book.library.client.CommitTransaction.execute(Library.java:279)
    at com.ysoft.jdo.book.common.console.UserInterface.execute(UserInterface.java:196)
    at com.ysoft.jdo.book.common.console.UserInterface.pumpCommands(UserInterface.java:186)
    at com.ysoft.jdo.book.library.client.Library.run(Library.java:139)
    at com.ysoft.jdo.book.library.client.Library.main(Library.java:104)
    enter command:
    -->

    Hi Stephen,
    There are two related issues that are addressed. One, some Oracle
    drivers return the wrong type (Type.OTHER) for the TIMESTAMP field.
    This is true for the
    9.2.0.1.0 driver that ships with 9iR2. This causes an exception when
    attempting to assign a null to the date field that has been mapped to a
    TIMESTAMP column. Two, all of the 9i drivers (and 10g drivers) return a
    type name of "TIMESTAMP(x)" where x is the precision. This confuses
    Kodo's OracleDictionary which is looking for a string without the
    precision characters.
    Following your suggestion, the following code fixes it just fine. It is
    harmless, in that all it does is do what OracleDictionary intended but
    failed to do. To use it, you must add the following property
    configuration to the kodo.properties file.
    kodo.jdbc.DBDictionary: xxx.jdo.FixedOracleDictionary
    Without the fix, Kodo does not reassign the TIMESTAMP columns to a type
    of DATE. So far as I can tell, as long as the driver returns a
    Types.TIMESTAMP this does not cause a failure.
    This fix will be moot as soon as the bug in OracleDictionary is fixed.
    What I wonder about is why does Kodo reassign type TIMESTAMP to DATE?
    Why don't you treat TIMESTAMP types as TIMESTAMP types? Curious minds
    want to know.
    Best wishes,
    David
    ---- code follows
    package xxx.jdo;
    import java.sql.*;
    import kodo.jdbc.schema.*;
    import kodo.jdbc.sql.*;
    /**
     * Some Oracle drivers do not return the correct type for the TIMESTAMP
     * field. This class fixes this issue for Kodo 3.3. The problem (an exception
     * complaining about an invalid column type) appears when mapping a Java
     * field (Date for example) to an Oracle timestamp column, and only when
     * attempting to set null on the Java field.
     */
    public class FixedOracleDictionary
        extends OracleDictionary
    {
        public Column[] getColumns (DatabaseMetaData meta, String catalog,
            String schemaName, String tableName,
            String columnName, Connection conn)
            throws SQLException
        {
            // Let Kodo's OracleDictionary do its thing.
            Column[] cols = super.getColumns (meta, catalog, schemaName,
                tableName, columnName, conn);
            // Catch the columns with a type name of "TIMESTAMP(n)" and mark them
            // as DATE types. This is what the OracleDictionary intended to do,
            // but was foiled by the name which now has a precision.
            for (int i = 0; cols != null && i < cols.length; i++)
            {
                String tName = cols[i].getTypeName ();
                if (tName != null && tName.startsWith ("TIMESTAMP"))
                    cols[i].setType (Types.DATE);
            }
            return cols;
        }
    }
    ---- code ends
    Stephen Kim wrote:
    This is a bug (1111) with regards to specific combinations of the Oracle 10
    driver and db.
    To work around the issue until the next release, getColumns (...) in
    OracleDictionary needs to be extended/modified so that instead of doing a
    strict equals () comparison to "TIMESTAMP", it does a startsWith
    ("TIMESTAMP").

  • Data Associations are not enabling while I am trying to do data mapping with Data Associations in BPM Studio; what is the reason?

    Data Associations are not enabling while I am trying to do data mapping with Data Associations in BPM Studio; what is the reason?

    Shouldn't @StartDate be an input parameter to the stored procedure? @RunDate also?
    Example for sp with parameters:
    http://www.sqlusa.com/bestpractices2008/stored-procedure-parameters/
    The last error will go away upon a successful sp compile.
    Kalman Toth Database & OLAP Architect
    SQL Server 2014 Design & Programming
    New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012

  • Must a EntityBean map with data table?

    I know not every data table maps to an entity bean,
    but must every EntityBean map to a data table?
    rgds

    Of course not. EntityBean is just a style of bean.
    We declare something as an EntityBean because it behaves like an entity.
    For example, a man with some attributes, or a file with some attributes:
    usually we store these in the database, so an EntityBean usually maps to a data table.
    But you can also store the data in a file format if you like; you control that in the BMP model.
    In my opinion, an entity bean is an object based on its attributes, while a session bean is an object based on its interactions with the user. So we can distinguish them. :-)
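    To make the BMP point concrete, here is a minimal sketch (EJB 2.x style; the bean name is hypothetical and the required create/finder methods are omitted). Nothing in the contract forces ejbStore/ejbLoad to talk to a database table, so this bean persists its state to a plain file instead:
    import java.io.*;
    import javax.ejb.*;
    public class ManBean implements EntityBean {
        public String name;          // the entity's state
        private EntityContext ctx;
        // called by the container to persist state; here a file, not a table
        public void ejbStore() {
            try (FileWriter w = new FileWriter("man-" + ctx.getPrimaryKey() + ".txt")) {
                w.write(name == null ? "" : name);
            } catch (IOException e) {
                throw new EJBException(e);
            }
        }
        // called by the container to (re)load state from the same file
        public void ejbLoad() {
            try (BufferedReader r = new BufferedReader(
                    new FileReader("man-" + ctx.getPrimaryKey() + ".txt"))) {
                name = r.readLine();
            } catch (IOException e) {
                throw new EJBException(e);
            }
        }
        public void setEntityContext(EntityContext c) { ctx = c; }
        public void unsetEntityContext() { ctx = null; }
        public void ejbActivate() { }
        public void ejbPassivate() { }
        public void ejbRemove() { }
    }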

  • Not able to map extension data type in standard namespace.

    Hi guys,
    I have a customer who has this problem.  I am from the Business Partner area and BP supports XI for data exchange.  We provide standard message types in XI which contains core BP fields in ABA namespace.  Corresponding complex structure exists in the ABAP system.
    Now, the customer has created an enhancement data type for one of the data types present in the standard message type.  This enhancement data type is present in another namespace.  The customer has also generated the proxy for this enhancement data type which automatically created an Append structure to the standard complex structure in ABAP.
    Now, from the ABAP side all the data is extracted including the enhancement data into the complex structure.  In one of the outbound function modules, we call XI passing this data in complex structure.  However, we call XI in ABA namespace.  So, in XI, since the ABA namespace does not contain the enhancement data type, the customer is not able to map this to his target.  Even though we send all the data from ABAP to XI, since we call it in ABA namespace, the enhancement data is lost.
    So, is there any way of mapping this enhancement data also in the ABA namespace itself? Or, from ABAP, should I call the namespaces separately? This means, first I will call the ABA namespace to map all standard fields; then I call the customer-specific namespace to map the enhancement data type.
    I would really appreciate a help over here since its a very important customer for SAP.
    Many thanks,
    Vikram

    I think for the customer mapping you have to do a mapping in your own namespace.
    Then you need a copy of the FM BUPA_OUTBOUND_MAIN and also of the FM
    ABA_BUPA_CALL_PROXY.
    In the copy of ABA_BUPA_CALL_PROXY you can call your own proxy
    by changing this line
    DATA: lo_ababusiness_partner_out TYPE REF TO co_ababusiness_partner_out. "XI 3.0
    to
    DATA: lo_ababusiness_partner_out TYPE REF TO yourproxy_mt.
    See also the weblog /people/michal.krawczyk2/blog/2006/11/14/xipi-data-type-enhancements-standard-business-partner

  • How to map bapimtcs-data to bapimatmra?

    Hello friends,
    In order to implement the user exit z_uexit_material_vtl_status mentioned in Note 623026, I need a function which can convert (or map) bapimtcs-data to bapimatmra. This is necessary because the standard code includes lines which contain syntax errors ("LS_BAPIMARA" and "LS_BAPISTRUCT-DATA" are not mutually convertible in a Unicode program). The conversion (mapping) of
    bapimatmvk to bapimtcs is necessary, too. Here is the link leading to the problematic user exit. Any useful help would be appreciated and rewarded.
    Regards;
    Özcan.

    Hi,
    This MIGHT work; I am doing something similar in one of my user exits.
    In your variable declarations put this (note that the field-symbols <unicode_x1> and <unicode_x2> used by the macro must be declared there as well):
    DEFINE move_casting.
      assign &1 to <unicode_x1> casting.
      assign &2 to <unicode_x2> casting.
      move <unicode_x1> to <unicode_x2>.
    END-OF-DEFINITION.
    And then at the point where you need to convert the data:
    move_casting ls_bapimtcs-data ls_bapimatmra.
    I hope it helps!
    Jeroen

  • Date mapping 2 dates to 0calquarter

    Hello BI experts,
    BI 7.0: I mapped Date1 & Date2 to 0CALQUARTER based on a condition at field level.
    Code:
    IF date1 < '20100101'.
      RESULT = date1.
    ELSEIF date2 >= '20100101'.
      RESULT = date2.
    ENDIF.
    But when I give DATE1 = 13.03.2010 and
                    DATE2 = 13.09.2008,
    I get 0CALQUARTER = 0.2010, which is wrong. For all the values I am getting 0CALQUARTER = 0.YYYY (the respective year).
    What might be the solution? Any ideas?

    Hi,
    You can't map a date field directly to the 0CALQUARTER field. You have to write some ABAP code to derive the quarter from the YYYYMMDD date, for example as below (this version covers all twelve months, not only the first month of each quarter):
    DATA: yyyy(4) TYPE c,
          date(8) TYPE c.   " input date in YYYYMMDD format
    yyyy = date+0(4).
    CASE date+4(2).
      WHEN '01' OR '02' OR '03'.
        CONCATENATE yyyy '1' INTO result.
      WHEN '04' OR '05' OR '06'.
        CONCATENATE yyyy '2' INTO result.
      WHEN '07' OR '08' OR '09'.
        CONCATENATE yyyy '3' INTO result.
      WHEN '10' OR '11' OR '12'.
        CONCATENATE yyyy '4' INTO result.
    ENDCASE.
    Do reply with comments.

  • HT201299 What does it mean when you get this message on the iPhone? "online maps consume data traffic"

    What does it mean when you get this message on the iPhone? "online maps consume data traffic"

    That means that if you use online maps, it will consume data. If you're on WiFi, no worries. If you are on cellular data, then it will eat into your cellular data plan.

  • How to map query data to workbook

    Hi Friends,
    Hoping you are having a good day. Please let me know how to create a workbook
    and how to map query data to it.
    Thanks
    Chandan Kumar

    Hello,
    Execute a BEx query, and in the BEx toolbar you can see the SAVE button; click it and select Save as Workbook.
    If you want to insert more queries, first place the cursor where you want to insert the query, go to the BEx toolbar, select the Tools icon, and you can see a menu entry called Insert Query.
    Thanks
    Chandran

  • Map Publish Data Smart link

    Is there a way to configure the smart link right after a Map Published Data activity so that the next activity only runs if the output matches a pattern in the Map Published Data?

    Hi
    If not, you can use the "Compare Values" activity, subscribe there to the Published Data from the "Map Published Data" activity, and trigger the link only if "comparison result equals true".
    Regards,
    Stefan
    www.sc-orchestrator.eu ,
    Blog sc-orchestrator.eu

  • PI 7.11 mapping lookup - data enrichment - appropriate approach?

    Hi guys,
    we just upgraded from PI 7.0 to PI 7.11.
    Now I'm facing a new scenario where an incoming order has to be processed
    (HTTP to RFC).
    Furthermore, each item of the order has to be enriched with data looked up in a SAP ERP 6.0 system.
    The lookup functionality can be accessed via RFC or ABAP proxy.
    With the new PI release we have several possibilities to implement this scenario, which are ...
    (1) graphical RFC Lookup in message mapping
    (2) ccBPM
    (3) using of the lookup API in java mapping
    (4) message mapping RFC Lookup in a UDF
    Because of performance reasons I prefer to make use of the Advanced Adapter Engine, if this is possible.
    Further, there should be only one lookup request for all items of the order instead of one per order item.
    I tried to implement possibility (1), but it seems to be hard to fill the request table structure of the RFC function module. All examples on SDN only use simple (single) input parameters instead of tables. Parsing the result table of the RFC seems to be tricky as well.
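    For illustration, here is roughly what the lookup API variant, options (3)/(4), could look like inside a UDF or Java mapping (a hedged sketch using com.sap.aii.mapping.lookup; the business system, channel and RFC names are made up, and the request XML must match the real function module):
    // obtain the RFC communication channel configured in the Integration Directory
    Channel channel = LookupService.getChannel("ERP_BusinessSystem", "CC_RFC_Lookup");
    RfcAccessor accessor = LookupService.getRfcAccessor(channel);
    // one request carrying ALL order items, so only a single lookup call is made
    String request =
        "<ns0:Z_ENRICH_ITEMS xmlns:ns0=\"urn:sap-com:document:sap:rfc:functions\">"
        + "<IT_ITEMS>" + itemsXml + "</IT_ITEMS></ns0:Z_ENRICH_ITEMS>";
    XmlPayload payload = LookupService.getXmlPayload(
        new java.io.ByteArrayInputStream(request.getBytes("UTF-8")));
    Payload response = accessor.call(payload);
    // response.getContent() is an InputStream with the RFC response;
    // parse it with DOM/SAX to read the returned table, then release the accessor
    accessor.close();
    Building the request table by string concatenation (or DOM) is also the usual way around the limitation that the graphical RFC lookup handles flat parameters best.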
    Afterwards I tried to implement approach (3) using an SOAP adapter as Proxy with the protocol XI 3.0.
    (new functionality in PI 7.11)
    But this ends up in a crazy error message, so it seems that the SOAP adapter cannot be used as a proxy adapter in this case.
    ccBPM also seems to be a good and transparent approach, because there is no need for complex Java code or the lookup API.
    So the choice is not easy.
    What's the best approach for this scenario?
    Are my notes on the approaches correct, or am I using/interpreting something wrong?
    Any help, ideas appreciated
    Kind regards
    Jochen

    Hi,
    the error while trying to use the soap channel for proxy communication is ....
    com.sap.aii.mapping.lookup.LookupException: Exception during processing the payload. Error when calling an adapter by using the communication channel SOAP_RCV_QMD_100_Proxy (Party: , Service: SAP_QMD_MDT100_BS, Object ID: 579b14b4c36c3ca281f634e20b4dcf78) XI AF API call failed. Module exception: 'com.sap.engine.interfaces.messaging.api.exception.MessagingException: java.io.IOException: Unexpected length of element <sap:Error><sap:Code> = XIProxy; HTTP 200 OK'. Cause Exception: 'java.io.IOException: Unexpected length of element <sap:Error><sap:Code> = XIProxy; HTTP 200 OK'.
    so this feature seems not to work for SOAP lookups, doesn't it?
    Kind regards
    Jochen

  • Problem in mapping xml data with imported RFC parameters

    I am currently working on a scenario in which a flat file is generated by an RFID server and placed on an FTP server.
    The flat file is picked up from the FTP server using XI and the contents are mapped to the corresponding imported RFC parameters.
    The content of the file, which is in text format, is successfully converted to XML on the XI side.
    The file contains records with 2 fields, functional location and RFID equipment number. On the R/3 side these fields
    are used as the functional location and equipment number of the PM module.
    The structure of the FTP message is as follows
    <ns:RFID_MSG_TYPE xmlns:ns="urn://sisl:rfiddemo">
    <RecordSet>
      <Row>
       <FL1>"f1</FL1>
       <FL2>01</FL2>
       <FL3>01</FL3>
       <RFID_NUM>I001"</RFID_NUM>
      </Row>
    </RecordSet>
    </ns:RFID_MSG_TYPE>
    After the mapping program which maps the above structure to the imported RFC is executed, the following payload document is generated:
    <ns:ZRFID_EQUIP xmlns:ns="urn:sap-com:document:sap:rfc:functions">
    <RECORDS>
      <item>
       <FLOC>f1-01-01</FLOC>
       <RFID_NO>I001</RFID_NO>
      </item>
    </RECORDS>
    </ns:ZRFID_EQUIP>
    The size of FLOC is 30 of type string, and RFID_NO is also a string, with size 18.
    When the data is brought into R/3, both fields FLOC and RFID_NO get mapped into FLOC, which is of type CHAR30.

    Hi Naveen,
    In sxmb_moni the content transmitted to the adapter (RFC) is as follows:
    <?xml version="1.0" encoding="UTF-8" ?>
    <ns:ZRFID_EQUIP xmlns:ns="urn:sap-com:document:sap:rfc:functions">
    <RECORDS>
    <item>
      <FLOC>f1-01-01</FLOC>
      <RFID_NO>I006</RFID_NO>
    </item>
    <item>
      <FLOC>f1-01-02</FLOC>
      <RFID_NO>I002</RFID_NO>
    </item>
    <item>
      <FLOC>f1-01-03</FLOC>
      <RFID_NO>I003</RFID_NO>
    </item>
    <item>
      <FLOC>f1-01-04</FLOC>
      <RFID_NO>I004</RFID_NO>
    </item>
    <item>
      <FLOC>f1-01-05</FLOC>
      <RFID_NO>I005</RFID_NO>
    </item>
    <item>
      <FLOC>f1-01-06</FLOC>
      <RFID_NO>I001</RFID_NO>
    </item>
    </RECORDS>
    </ns:ZRFID_EQUIP>
    On the R/3 side the fields FLOC and RFID_NO both get mapped into FLOC, which is of type CHAR30,
    e.g. floc=f1-01-01I006
         rfid_no=
