Pre-mapping vs post-mapping tool

Hello everyone.
The problem is that I need to find mapping-rule mistakes by comparing the amounts in the mapped accounts with the amounts in the accounts imported from the previous system. I'm not able to build an automatic tool in BPC to compare the imported data against the data before the mapping was applied.
How can I build it?
Thanks,
Oscar

Hi Nilanjan,
Thanks for the answer, but I think I haven't explained myself correctly.
I import the data into an application with 10 dimensions, using a txt file with the information for all the dimensions. Some of these dimensions have a conversion file with a mapping. The problem we have is that when we upload the data from the file to BPC, it takes a lot of time to find which lines from the txt file have been loaded into each BPC member, so we would like to know if it is possible to build some kind of "log" where we could find the destination members for each line in the txt file.
A good example of this "log" could be something similar to the lines below:
Data from the txt file                                 BPC members where the line has been stored
Dim1      Dim2      Dim3     ... Dim10                 Dim1          Dim2          Dim3          ... Dim10
TxtData1  TxtData2  TxtData3 ... TxtData10             BPCMember1    BPCMember2    BPCMember3    ... BPCMember10
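Outside BPC, a log like this could be produced by replaying the conversion mappings over the txt file. A minimal sketch (the dimension names and mapping entries below are hypothetical placeholders for the real conversion files):

```python
# Sketch: replay the conversion mappings over a txt line to log where it
# lands in BPC. Dimension names and mapping entries are hypothetical
# placeholders; substitute the rules from your actual conversion files.

# Per-dimension conversion maps (txt value -> BPC member). Dimensions
# without a conversion file pass their values through unchanged.
conversions = {
    "Dim1": {"TxtData1": "BPCMember1"},
    "Dim2": {"TxtData2": "BPCMember2"},
}
dimensions = ["Dim1", "Dim2", "Dim3"]  # extend to all 10 dimensions

def map_line(values):
    """Return the destination BPC members for one txt line."""
    mapped = []
    for dim, value in zip(dimensions, values):
        rule = conversions.get(dim, {})
        mapped.append(rule.get(value, value))  # pass through if unmapped
    return mapped

# Log line: source values on the left, destination members on the right.
source = ["TxtData1", "TxtData2", "TxtData3"]
log_line = "\t".join(source) + "  ->  " + "\t".join(map_line(source))
```

Writing one such line per txt record during the load would produce exactly the kind of log sketched above.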
Thanks again for your help,
Oscar.
Edited by: Oscar G.C. on Sep 24, 2010 9:08 AM

Similar Messages

  • Problems with pre-mapping process operator in owb 9i

    Hi,
    I was trying to use the pre-mapping process operator in owb 9i. Problem is that the manual does not specify how the inputs need to be connected to this operator.
    Following is what I went through -
    I created a mapping table operator and a mapping dimension operator and connected these two. Then I created a pre-mapping process operator selecting the LTRIM function. Further, I connected one of the table attributes to this pre-mapping operator as input and connected the output of this pre-mapping operator to the appropriate dimension operator attribute.
    On performing Validate, the following error message was flashed:
    VLD-2451 : Illegal connection to pre-mapping process operator
    I am trying to learn how to use OWB 9i from the manual. So my interpretation of the use of the pre-mapping process operator may be wrong.
    In any case kindly help,
    Thanks,
    Saju

    A pre-mapping process is used to perform operations preceding the mapping operation itself.
    For example, if your mapping is designed to incrementally append data to a table for a definite time interval (which is a parameter of the map operation), you might want to perform a cleanup of the table data for that period first. That allows the data for the period to be reloaded any number of times.
    In this case you have to define the procedure which performs the cleanup and then include a call to that procedure as a pre-mapping process.
    Other examples of pre- and post-mapping processes are disabling referential integrity constraints before loading and re-enabling them after loading.
    In any case, the OWB documentation has a clear definition of pre- and post-mapping processes.
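The period-cleanup idea above can be sketched as follows, using sqlite3 purely for illustration (the table and column names are hypothetical):

```python
# Sketch of a pre-mapping cleanup: delete the target rows for the period
# being loaded so the same period can be reloaded any number of times.
# sqlite3 stands in for the real target database; the table and column
# names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (period TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("2004-01", 10), ("2004-02", 20)])

def premap_cleanup(conn, period):
    """Pre-mapping process: remove rows already loaded for this period."""
    conn.execute("DELETE FROM sales WHERE period = ?", (period,))

def load_period(conn, period, amounts):
    """Main mapping: append the incoming rows for the period."""
    conn.executemany("INSERT INTO sales VALUES (?, ?)",
                     [(period, a) for a in amounts])

# Reloading 2004-01 leaves exactly one copy of the period's data.
premap_cleanup(conn, "2004-01")
load_period(conn, "2004-01", [11, 12])
```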

  • Problems with pre-mapping process in owb 9i

    Hi,
    I was trying to use the pre-mapping process operator in owb 9i. Problem is that the manual does not specify how the inputs need to be connected to this operator.
    Following is what I went through -
    I created a mapping table operator and a mapping dimension operator and connected these two. Then I created a pre-mapping process operator selecting the LTRIM function. Further, I connected one of the table attributes to this pre-mapping operator as input and connected the output of this pre-mapping operator to the appropriate dimension operator attribute.
    On performing Validate, the following error message was flashed:
    VLD-2451 : Illegal connection to pre-mapping process operator
    I am trying to learn how to use OWB 9i from the manual. So my interpretation of the use of the pre-mapping process operator may be wrong.
    In any case kindly help,
    Thanks,
    Saju

    Hi,
    Essentially the pre (and post) mapping processes are executed before and after the mapping logic. These are separate procedures in the generated package and are "stand alone" from the main package procedure holding the actual mapping diagram.
    If you want to use LTRIM, there are 2 supported ways:
    1) use an expression, this means you feed the column you want to do the expression on into the expression operator, open the code editor and select from the transformations (or type ltrim......) and then link the result to the target object
    2) use a transformation operator, choose ltrim from the Oracle library and connect the operator as I stated in use case 1
    We do not encourage you to use a filter to do transformations, it is better to use the operators that are intended for transformations.
    Thanks,
    Jean-Pierre
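For reference, the LTRIM transformation being connected in either option simply strips leading characters (blanks by default) from a string. A rough Python model of the behaviour (illustrative only, not OWB code):

```python
# What the LTRIM transformation applies row by row: strip leading
# characters (blanks by default) from a string. Oracle's LTRIM(str[, set])
# behaves roughly like Python's str.lstrip(set).
def ltrim(value, chars=" "):
    return value.lstrip(chars)
```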

  • IDoc to File[EDI] using seeburger BIC Mapping Tool

    Hi Gurus,
    I'm having a scenario of IDoc to EDI file; I'll be using IDoc DESADV.DELVRY03.ZDELVR0X. Kindly advise the steps needed in the BIC Mapping Tool. I've already referred to this post /people/dijesh.tanna/blog/2008/05/25/sap-pixi-content-conversion-using-generator-mapping-functionality-of-seeburger-part-1 but still can't get it right.
    Thanks.

    Thanks a lot experts :D....
    I'll be using a file adapter for my scenario..
    Any Idea on what Seeburger BIC mapping module should I use?
    Should I just use the following modules?
    Module – Processing Sequence
    Number 1: Name localejbs/CallBicXIRaBean, Type Local Enterprise Bean, Module Key bic
    Number 2: Name localejbs/CallSapAdapter, Type Local Enterprise Bean, Module Key exit
    Module – Module Configuration
    Module Key bic, Parameter name destSourceMsg, Parameter value MainDocument
    Module Key bic, Parameter name destTargetMsg, Parameter value MainDocument
    Please advise.
    Thanks.
    Edited by: Devilbatz on Dec 8, 2010 8:31 AM

  • MAP Toolkit - How to use this MAP tool kit for all SQL Server inventory in new work enviornment

    Hi everyone,
    I just joined a new job and am planning to inventory the whole environment so I can get a list of all SQL Server installations. I downloaded the MAP Toolkit just now, so I'm looking for step-by-step information on using it for a SQL inventory. If anyone has documentation
    or screenshots to share, that would be great.
    Also, is it OK to run this tool at any time, or should it run at night when there is less activity?
    How long does it generally take for a medium-sized environment where the server count is about 30 (Dev/Staging/Prod)?
    Also, any scripts that give detailed information would be great too.
    Thank you

    Hi Logicinisde,
    According to your description, your question concerns the Microsoft Assessment and Planning Solution Accelerator, so I suggest you post it in the Solution Accelerators forums at
    http://social.technet.microsoft.com/Forums/en-US/map/threads/ . That forum is more appropriate, and more experts will be able to assist you.
    The Microsoft Assessment and Planning (MAP) Toolkit is an agentless inventory, assessment, and reporting tool that can securely assess IT environments for various platform migrations. You can use MAP as part of a comprehensive process for planning and migrating
    legacy databases to SQL Server instances.
    For more information about how to use the MAP Toolkit, you can review the following articles:
    http://blogs.technet.com/b/meamcs/archive/2012/09/24/how-to-use-map-tool-microsoft-assessment-and-planning-toolkit.aspx
    Microsoft Assessment and Planning Toolkit - Technical FAQ:
    http://ochoco.blogspot.in/2009/02/microsoft-assessment-and-planning.html
    Regards,
    Sofiya Li
    TechNet Community Support

  • 3.1.4 reverse mapping tool issue

    (Sorry for the duplicate posting...I meant to start a new thread with
    this but accidentally posted it as a reply to a 6-month old thread)
    Hello,
    I was running Kodo 3.0.2 when Abe and I had the exchange reproduced
    below back in January to deal with Oracle tables with "$" in the column
    names (which I subsequently updated to 3.0.3). The original subject of
    this discussion was "3.0.2 reverse mapping tool generates invalid
    .mapping file".
    I was able to get this working by running the following commands to
    implement Abe's suggestion:
    reversemappingtool -p kodo.properties -package db \
    -cp custom.properties -ds false schema.xml
    sed -e 's/\$/__DOLLAR__/' db/package.mapping > db/package.mapping.new
    mv db/package.mapping.new db/package.mapping
    javac db/*.java
    mappingtool -p kodo.properties -a import db/package.mapping
    sed -e 's/__DOLLAR__/\$/' db/package.jdo > db/package.jdo.new
    mv db/package.jdo.new db/package.jdo
    In my custom.properties file, I had lines like these to put useful names
    on my class's fields:
    db.TransactionDetailHistory.y$an8.rename : addressNumber
    As I said, in 3.0.3, this worked perfectly.
    I picked this code back up for the first time since getting it working 6
    months ago, and decided to update it to 3.1.4 (since I'm already using
    that on other projects). Problem is, the reverse mapping tool has
    changed and the code it generates no longer works as it once did. I
    tried running the 3.1.2 and 3.1.0 reverse mapping tool, and it failed
    the same way, so it looks like this change happened in the 3.0.x to
    3.1.x version change.
    What happens is this: In the generated Java source, my fields used to
    end up with names as per my specification (e.g., the Oracle column named
    "y$an8" showed up as "addressNumber" in the java source).
    However, it looks like the "$" became special somehow in 3.1.0 - the
    "y$an8" column now shows up as "yAn8" in the generated Java. I tried
    changing my custom.properties file accordingly, but it still shows up as
    yAn8 even after changing my mapping to look like this:
    db.TransactionDetailHistory.yAn8.rename : addressNumber
    What do you make of this?
    Thanks,
    Bill
    Abe White wrote:
    > Hmmm... this is a problem. '$' is not legal in XML names, and there
    is no standard way to escape it.
    >
    > Your best bet is probably to do the following:
    > 1. In the generated .mapping file, replace all '$' characters with
    another token, such as '--DOLLAR--'.
    > 2. Switch your properties to use the metadata mapping factory:
    > kodo.jdbc.MappingFactory: metadata
    > 3. Import your mappings into the metadata mapping factory:
    > mappingtool -a import package.mapping
    > 4. Delete the mapping file.
    > 5. In your .jdo file, replace '--DOLLAR--' with '$' again.
    >
    > The metadata mapping factory doesn't put column names in its XML
    attribute names, so you should be able to use it safely.

    William-
    However, it looks like the "$" became special somehow in 3.1.0 - the
    "y$an8" column now shows up as "yAn8" in the generated Java. I tried
    changing my custom.properties file accordingly, but it still shows up as
    yAn8 even after changing my mapping to look like this:
    db.TransactionDetailHistory.yAn8.rename : addressNumber
    Well, the reverse mapping tool makes some assumptions based on common
    naming strategies for relational databases and Java naming: columns like
    "FIRST_NAME" will be renamed to "firstName". The Reverse Mapping tool is
    seeing the "$" and treating it as a non-alphanumeric delimiter, so is
    fixing it.
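The renaming behaviour described above can be sketched roughly as follows (an illustrative approximation of a default naming strategy, not Kodo's actual implementation): every non-alphanumeric character is treated as a word delimiter and dropped, and each later word is capitalized.

```python
# Illustrative approximation of a reverse-mapping naming strategy (not
# Kodo's actual code): any non-alphanumeric character in a column name is
# treated as a word delimiter and dropped; later words are capitalized.
def column_to_field(column):
    words, current = [], []
    for ch in column:
        if ch.isalnum():
            current.append(ch)
        elif current:
            words.append("".join(current))  # delimiter hit: close the word
            current = []
    if current:
        words.append("".join(current))
    head = words[0].lower()
    tail = [w.lower().capitalize() for w in words[1:]]
    return head + "".join(tail)
```

Under this strategy "FIRST_NAME" becomes "firstName" and "y$an8" becomes "yAn8", matching the behaviour reported above.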
    Can you try a couple of additional properties:
    db.TransactionDetailHistory.y$An8.rename: addressNumber
    db.TransactionDetailHistory.y$an8.rename: addressNumber
    Also, are other rename properties working for you, or is that the only
    field or class you attempt to rename? It might just be the case that
    you aren't correctly specifying the properties file or something.
    Finally, bear in mind that you can always implement your own
    kodo.jdbc.meta.ReverseCustomizer and just use that; not the easiest
    solution, but it can certainly be used to have very fine-grained control
    over the exact names that are generated.
    Marc Prud'hommeaux
    SolarMetric Inc.

  • Differentiating Between Windows XP and Windows XP Embedded with MAP Tool

    Hi All,
    I have a customer who has multiple Windows XP Embedded thin clients. When he runs the MAP tool they show up as Windows XP devices. I have personally not seen this since he hasn't shared the MAP data yet, but does anyone know if the MAP tool
    can differentiate between Windows XP desktops and Windows XP Embedded thin clients, or are they all lumped together? Does the MAP tool show, for example, versioning that might help differentiate? Any help would be greatly appreciated.

    When queried by MAP, the WMI service in the Windows XPe machines reports back "Microsoft Windows XP Professional", and since MAP can only report what WMI tells it, that is what shows up in the reports. This is an artifact of Windows XPe: it
    starts out as a full version of XP Pro, and then each OEM can choose what parts they want to remove, if any. If the OEMs don't modify the WMI class to show the product as XP Embedded, it will continue to show up with its original description of Windows XP Professional.

  • Pre-mapping process seems to be called after instead of before

    Hi,
    I'm new here and this is my first time posting, so please be gentle! :-)
    Running OWB 10.2.0.1.0 on an oracle 10g target db and a 9i source db.
    I have a mapping which contains a pre-mapping process and it contains a simple mapping from a source view into the target table.
    When I look at the generated code, the pre mapping process is at the bottom, after the main insert part of the mapping.
    Next, when I perform a test, updating a test record and following it through the logic, it seems that the pre-mapping process really is in fact getting called after the main mapping.
    Has anyone experienced this?
    Thanks,
    Sammi

    Resolved. Pre-mapping works fine and as intended. Problem in logic was elsewhere.

  • Problem using pre-mapping process operator in owb 9i

    Hi,
    I was trying to use the pre-mapping process operator in owb 9i. Problem is that the manual does not specify how the inputs need to be connected to this operator. Following is what I went through -
    I created a mapping table operator and a mapping dimension operator and connected these two. Then I created a pre-mapping process operator selecting the LTRIM function. Further, I connected one of the table attributes to this pre-mapping operator as input and connected the output of this pre-mapping operator to the appropriate dimension operator attribute. On performing Validate, the following error message was flashed:
    VLD-2451 : Illegal connection to pre-mapping process operator
    I am trying to learn how to use OWB 9i from the manual. So my interpretation of the use of the pre-mapping process operator may be wrong.
    In any case kindly help,
    Thanks
    Saju

    A pre-mapping process is used to perform operations preceding the mapping operation itself.
    For example, if your mapping is designed to incrementally append data to a table for a definite time interval (which is a parameter of the map operation), you might want to perform a cleanup of the table data for that period first. That allows the data for the period to be reloaded any number of times.
    In this case you have to define the procedure which performs the cleanup and then include a call to that procedure as a pre-mapping process.
    Other examples of pre- and post-mapping processes are disabling referential integrity constraints before loading and re-enabling them after loading.
    In any case, the OWB documentation has a clear definition of pre- and post-mapping processes.

  • Mapping Tool foreign keys

    <field name="employee">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="one-one">
    <extension vendor-name="kodo" key="column.ID" value="EMPLOYEE_ID"/>
    </extension>
    </field>
    Hello all,
    If I have the above field defined for one of my classes in package.jdo,
    should the mapping tools create the foreign keys automatically?
    Thanks.

    Hi Marc,
    Thanks for the reply.
    I tried using null and cascade for the value of the jdbc-delete-action. I
    also added the property kodo.jdbc.ForeignKeyConstraints: true to
    kodo.properties. In all cases the foreign key is showing up in the
    schema, but not getting inserted into the database, MySQL with jdbc
    driver version 3.0.15ga.
    A more worrying issue to me is the fact that the joins are not showing up
    in the schema. From what I understood from section 7.5.1 of the
    documentation the one-one jdbc-field-map extension should add the joins.
    My code still works I am just a little miffed as to why the joins are not
    showing up. They do not show up for other types of jdbc-field-map, for
    collections, as I would expect.
    I'm pasting below the meta-data for the class in question plus the portion
    of the schema.xml for that class. (If you need the full schema.xml and
    meta-data please supply an email address I can send it to, as I do not
    want to post this information on an open forum).
    I'm also posting below the output of the mapping tool. Please tell me if
    there is anything else you need.
    Cheers.
    <class name="TripImpJdo"
    persistence-capable-superclass="trekwatch.core.security.AbstractPolicedObject"
    objectid-class="TripImpJdoId">
    <extension vendor-name="kodo" key="data-cache-timeout"
    value="-1"/>
    <extension vendor-name="kodo" key="jdbc-class-ind"
    value="in-class-name">
    <extension vendor-name="kodo" key="column"
    value="JDOCLASS"/>
    </extension>
    <extension vendor-name="kodo" key="jdbc-class-map"
    value="base">
    <extension vendor-name="kodo" key="table"
    value="tripimpjdo"/>
    </extension>
    <extension vendor-name="kodo" key="jdbc-field-mappings">
    <extension vendor-name="kodo"
    key="trekwatch.core.TWObject.authorId">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="AUTHORID"/>
    </extension>
    </extension>
    <extension vendor-name="kodo"
    key="trekwatch.core.TWObject.id">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="ID"/>
    </extension>
    </extension>
    <extension vendor-name="kodo"
    key="trekwatch.core.TWObject.timeStamp">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="TIMESTAMP0"/>
    </extension>
    </extension>
    <extension vendor-name="kodo"
    key="trekwatch.core.security.AbstractPolicedObject.copyPolicy">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="pc">
    <extension vendor-name="kodo" key="column"
    value="COPYPOLICY"/>
    </extension>
    </extension>
    <extension vendor-name="kodo"
    key="trekwatch.core.security.AbstractSecureTWObject.acl">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="pc">
    <extension vendor-name="kodo" key="column"
    value="ACL"/>
    </extension>
    </extension>
    </extension>
    <extension vendor-name="kodo" key="jdbc-version-ind"
    value="version-number">
    <extension vendor-name="kodo" key="column"
    value="JDOVERSION"/>
    </extension>
    <field name="activities">
    <collection
    element-type="trekwatch.core.activity.ActivityImpJdo"/>
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="many-many">
    <extension vendor-name="kodo" key="element-column.ID"
    value="ACTIVITIES_ID"/>
    <extension vendor-name="kodo" key="order-column"
    value="ACTIVITIES_ORDER"/>
    <extension vendor-name="kodo" key="ref-column.ID"
    value="ID"/>
    <extension vendor-name="kodo" key="table"
    value="tripi_activities"/>
    </extension>
    </field>
    <field name="atomSignature" persistence-modifier="persistent">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="pc">
    <extension vendor-name="kodo" key="column"
    value="ATOMSIGNATURE"/>
    </extension>
    </field>
    <field name="budget">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="BUDGET"/>
    </extension>
    </field>
    <field name="budgetType">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="BUDGETTYPE"/>
    </extension>
    </field>
    <field name="description">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="DESCRIPTION"/>
    </extension>
    </field>
    <field name="endDate">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="ENDDATE"/>
    </extension>
    </field>
    <field name="gearList">
    <extension vendor-name="kodo" key="jdbc-delete-action"
    value="null"/>
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="one-one">
    <extension vendor-name="kodo" key="column.ID"
    value="GEARLIST_ID"/>
    </extension>
    </field>
    <field name="name">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="NAME0"/>
    </extension>
    </field>
    <field name="numDays">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="NUMDAYS"/>
    </extension>
    </field>
    <field name="startDate">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="STARTDATE"/>
    </extension>
    </field>
    <field name="timeAvailable">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="value">
    <extension vendor-name="kodo" key="column"
    value="TIMEAVAILABLE"/>
    </extension>
    </field>
    <field name="todoList">
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="one-one">
    <extension vendor-name="kodo" key="column.ID"
    value="TODOLIST_ID"/>
    </extension>
    </field>
    <field name="tripElements">
    <collection element-type="TripElement"/>
    <extension vendor-name="kodo" key="jdbc-field-map"
    value="pc-collection">
    <extension vendor-name="kodo" key="element-column"
    value="ELEMENT"/>
    <extension vendor-name="kodo" key="order-column"
    value="TRIPELEMENTS_ORDER"/>
    <extension vendor-name="kodo" key="ref-column.ID"
    value="ID"/>
    <extension vendor-name="kodo" key="table"
    value="tripi_tripelements"/>
    </extension>
    </field>
    </class>
    <table name="tripimpjdo">
    <pk column="ID"/>
    <column name="ACL" type="varchar" size="255"/>
    <column name="ATOMSIGNATURE" type="varchar" size="255"/>
    <column name="AUTHORID" type="varchar" size="255"/>
    <column name="BUDGET" type="integer"/>
    <column name="BUDGETTYPE" type="integer"/>
    <column name="COPYPOLICY" type="varchar" size="255"/>
    <column name="DESCRIPTION" type="varchar" size="255"/>
    <column name="ENDDATE" type="timestamp"/>
    <column name="GEARLIST_ID" type="bigint"/>
    <column name="ID" type="bigint" not-null="true"/>
    <column name="JDOCLASS" type="varchar" size="255"/>
    <column name="JDOVERSION" type="integer"/>
    <column name="NAME0" type="varchar" size="255"/>
    <column name="NUMDAYS" type="integer"/>
    <column name="STARTDATE" type="timestamp"/>
    <column name="TIMEAVAILABLE" type="integer"/>
    <column name="TIMESTAMP0" type="timestamp"/>
    <column name="TODOLIST_ID" type="bigint"/>
    <fk delete-action="cascade" to-table="gearitemlist"
    column="GEARLIST_ID"/>
    <fk to-table="todolist" column="TODOLIST_ID"/>
    <index name="I_TRPMPJD_GEARLIST_ID" column="GEARLIST_ID"/>
    <index name="I_TRPMPJD_JDOCLASS" column="JDOCLASS"/>
    <index name="I_TRPMPJD_JDOVERSION" column="JDOVERSION"/>
    <index name="I_TRPMPJD_TODOLIST_ID" column="TODOLIST_ID"/>
    </table>
    create-schema:
    [echo]
    ================================================================
    [echo] Refreshing the schema in the data store
    [echo]
    ================================================================
    [mappingtool] 625 INFO [main] kodo.Tool - Mapping tool running on type
    "class ***" with action "refresh".
    [mappingtool] 1891 INFO [main] kodo.Tool - Mapping tool running on type
    "class ***" with action "refresh".
    [mappingtool] 1906 INFO [main] kodo.Tool - Mapping tool running on type
    "class ***" with action "refresh".
    (many more of the same as above for each persistent class.)
    [mappingtool] 2250 INFO [main] kodo.Tool - Recording mapping and schema
    changes.
    [mappingtool] 5063 WARN [main] kodo.jdbc.Schema - The foreign key
    "F_TRPMPJD_GEARLIST" was not added to table "TRIPIMPJDO".
    build:
    BUILD SUCCESSFUL
    Total time: 20 seconds
    Marc Prud'hommeaux wrote:
    Goerge-
    As well as adding the jdbc-delete-action, you also want to set the
    following in your kodo.properties:
    kodo.jdbc.ForeignKeyConstraints: true
    Also, I notice you are using MySQL: MySQL doesn't support some foreign
    keys (such as deferred constraints). Is your foreign key deferred? Can
    you post the schema.xml file, as well as the complete output from the
    mappingtool so we can take a look at it?
    In article <[email protected]>, Goerge wrote:
    UPDATE again:
    This dialog with myself is actually not bad.
    Anyway, I've made some head way:
    After reading through 16 pages of posts here, I found one with the same
    problem I had, where it stated that foreign key constraints are not added
    to the database unless a jdbc-delete-action extension is used.
    After adding the following extension to the field in question, I still do
    not get the foreign key in the database, but at least I get the following
    message when running the mapping tool:
    [mappingtool] 6157 WARN [main] kodo.jdbc.Schema - The foreign key
    "F_TRPMPJD_GEARLIST" was not added to table "TRIPIMPJDO".
    I am using MySQL with innodb tables, so foreign keys should be supported.
    Is there another reason they are not being added?
    Thanks again.
    Marc Prud'hommeaux
    SolarMetric Inc.

  • Bug in exists() function of XI Graphical Mapping Tool?

    Hi!
    If I connect a source field with the exists() function in XI Graphical Mapping Tool and the tag exists it returns TRUE, otherwise it returns FALSE, so everything works as expected.
    But I have to connect a user-defined function with exists(). The user-defined function will either calculate a value or set Resultset.SUPPRESS.
    If there is a value, the exists() function returns TRUE; however, if Resultset.SUPPRESS is set it does also return TRUE! This looks to me like a bug in the exists() function. Shouldn't it always return FALSE if the input is Resultset.SUPPRESS?
    Regards, Tanja

    Hi Stefan!
    > The exists() function checks, if a queue is empty.
    > An empty queue is <b>not</b> represented by the
    > SUPPRESS value.
    > If inside a queue there is a SUPPRESS value, the
    > queue is <b>not</b> empty.
    Ok, so it's not a bug and the exists() function is working as expected.
    > If you want the exist() function after a UDF, provide
    > an empty queue, or easier: return the values "true"
    > or "false" directly from the UDF.
    Yes, that's how I actually solved the problem. The UDF was used at several places where the ResultList.SUPPRESS output was needed. So I copied the UDF and changed it so that the output was TRUE or FALSE instead.
    Regards, Tanja
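A minimal sketch of the queue semantics Stefan describes (a simplified model with SUPPRESS as a sentinel value, not the actual XI runtime):

```python
# Simplified model (not the actual XI runtime) of the queue semantics:
# exists() checks whether the queue is empty; SUPPRESS is still a value,
# so a queue containing it is NOT empty and exists() returns True.
SUPPRESS = object()  # sentinel standing in for ResultList.SUPPRESS

def exists(queue):
    """Model of exists(): False only for a truly empty queue."""
    return len(queue) > 0
```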

  • Mapping tool creates new columns one at a time

    Hi All,
    I'm using Kodo 3.2.2 with MySQL 4.1. I have a fairly large table (over
    2 million rows) representing one persistent class. I need to add several
    new fields to the object, which will require a refresh mapping on the
    database to add new columns for the fields.
    It seems that when I run refresh mapping on the database, Kodo is adding each
    new field individually in separate ALTER TABLE statements, rather than
    adding all the columns in one ALTER TABLE statement.
    With MySQL and large tables, this can really increase the time required to
    refresh the mapping. I've tried with MSSQL, and that database must handle
    ALTER TABLE commands differently, because it doesn't take a fraction of the
    time that MySQL takes.
    Is this a bug (I doubt it), or rather a candidate for a feature improvement?
    Thoughts?
    Thanks,
    Brendan

    Thanks Abe for the reply - I've been meaning to look at the process of
    creating SQL scripts to perform the refresh mappings on our databases.
    I'll have a look at this process before we talk contracting!
    Thanks,
    Brendan
    Abe White wrote:
    Unfortunately, we don't have any immediate plans to optimize ALTER TABLE since
    it's not a runtime operation, and as you say most DBs don't have a problem with
    it. Also, note that the documentation shows how to get Kodo to create SQL
    scripts rather than directly modify the DB; you could easily edit the scripts
    Kodo generates to make them more efficient before piping the SQL to MySQL using
    its client app. (Note that Kodo 3.3 introduces the ability to generate SQL
    scripts directly from the mapping tool, rather than having to take an extra step
    through the schema tool.)
    If getting Kodo to combine ALTER statements is important to you, you could also
    contact [email protected] about contracting us to expedite this work.
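    Abe's suggestion of editing the generated SQL scripts can also be automated. The sketch below is a hypothetical post-processor (not part of Kodo) that merges single-column `ALTER TABLE ... ADD COLUMN ...;` statements on the same table into one multi-column statement, so MySQL only rebuilds the table once; the exact statement format it parses is an assumption about what the schema tool emits.

```java
import java.util.*;

public class AlterMerger {
    // Merge "ALTER TABLE <t> ADD COLUMN <def>;" statements per table into a
    // single multi-column ALTER TABLE statement.
    static List<String> merge(List<String> statements) {
        // table name -> column definitions, preserving first-seen order
        Map<String, List<String>> adds = new LinkedHashMap<>();
        for (String s : statements) {
            String stmt = s.trim().replaceAll(";$", "");
            String upper = stmt.toUpperCase();
            int addIdx = upper.indexOf(" ADD COLUMN ");
            if (upper.startsWith("ALTER TABLE ") && addIdx > 0) {
                String table = stmt.substring("ALTER TABLE ".length(), addIdx).trim();
                String colDef = stmt.substring(addIdx + " ADD COLUMN ".length()).trim();
                adds.computeIfAbsent(table, k -> new ArrayList<>()).add(colDef);
            }
        }
        List<String> out = new ArrayList<>();
        for (Map.Entry<String, List<String>> e : adds.entrySet()) {
            out.add("ALTER TABLE " + e.getKey() + " ADD COLUMN "
                    + String.join(", ADD COLUMN ", e.getValue()) + ";");
        }
        return out;
    }

    public static void main(String[] args) {
        merge(List.of(
            "ALTER TABLE person ADD COLUMN nickname VARCHAR(255);",
            "ALTER TABLE person ADD COLUMN age INT;"))
            .forEach(System.out::println);
        // ALTER TABLE person ADD COLUMN nickname VARCHAR(255), ADD COLUMN age INT;
    }
}
```

    MySQL accepts several ADD COLUMN clauses in one ALTER TABLE, which is why combining them avoids repeated table copies on a 2-million-row table.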

  • XSLT mapping tool

    Hi All,
    I am an ABAP consultant, so it's difficult for me to write an XSLT mapping program.
    Is there any free XSLT mapping tool available on the net?
    One that can take the source XML and target XML and, upon doing graphical mapping,
    generate the XSLT mapping program between the source and target XML.
    Regards

    Rohan,
    Is there any XSLT mapping tool available on net to use free?
    Yes, Stylus Studio and Altova have free downloadable tools for XSL mapping.
    Which can take source XML & Target XML and upon doing graphical mapping
    it generates the XSLT mapping program between source & target XML.
    I don't think there is any tool that will use a graphical mapping to create an XSL mapping.
    But the point is: if you are creating a graphical mapping, why do you need XSL?
    Regards,
    Neetesh

  • Different mapping tools in the market --  For BPM mapping in XI

    Hi All
         I am interested in learning about BPM mapping tools (third-party or otherwise).
         Can anyone list the different tools and say which are the best?
         Our company is looking forward to implementing XI (BPMs) with mapping tools.
         Can anyone help me with this?
    Regards
    Rakesh

    Hi Rakesh,
    please have a look at <a href="http://help.sap.com/saphelp_nw04/helpdata/en/14/80243b4a66ae0ce10000000a11402f/frameset.htm">SAP Library: SAP Exchange Infrastructure</a> to understand XI, BPM, and mappings.
    BPM has no special mapping tool. You can use Java, ABAP, XSLT, and graphical mapping (message mapping) in XI in any case, BPM or not. "Third party" has nothing to do with BPM or mapping; it simply means a non-SAP system.
    Regards,
    Udo

  • Mapping tools ?

    Hi,
    Most of my flows are <b>"IDoc -> file" or "file -> IDoc"</b>, and I need to manage several transcodification tables for different countries (I prefer a DDIC table to FixValue or value mapping) and a large number of business systems (more than 300).
    Until this week, I used <b>graphical mapping</b>... but with some difficulties managing several levels of IDoc segments correctly. Graphical mapping is easy to use when there are one or two "0...unbound" segments, but no more. And you cannot easily access a DDIC table (you need to use an RFC in a BPM).
    Since then I have been trying <b>ABAP mapping</b>... but with some difficulties correctly creating "0..unbound" messages in order to distribute them to one or several receivers. ABAP mapping is easy to use when there is only one receiver (no handling before a "Multiline to single line" block), and it is interesting for requests on DDIC tables (very easy). But there seem to be some differences between testing with transaction sxi_mapping_test (in the foreground, so easy to debug) and executing the scenario (in the background, so difficult to debug).
    I don't yet know XSLT mapping or Java mapping (a complete one, not a Java function inside a graphical mapping).
    Within XI 3.0, for such flows, is there a <b>mapping tool</b> that works better than the others with IDocs?
    Regards.
    Mickael.

    You can call Java programs from an XSLT mapping.
    So you can code any lookups in Java and call them from the XSLT program.
    Below is a link with more on XSLT mapping with Java enhancement.
    http://help.sap.com/saphelp_nw04/helpdata/en/55/7ef3003fc411d6b1f700508b5d5211/frameset.htm
    There are a couple of blogs on the same topic.
    Hope it helps,
    Regards,
    Satish
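    As a rough illustration of the Java-lookup approach Satish describes: a plain Java class with static methods can be bound to an extension namespace in the stylesheet and called like an XPath function. The class name, mapping table, and method below are made up for the sketch.

```java
import java.util.Map;

// Hypothetical lookup helper. Static methods on a plain Java class can be
// invoked from an XSLT mapping via an extension namespace such as
// xmlns:lookup="java:CountryLookup" (exact syntax is processor-dependent).
public class CountryLookup {
    private static final Map<String, String> CODES = Map.of(
        "DE", "Germany",
        "FR", "France");

    public static String codeToName(String code) {
        // Fall back to the raw code when no mapping entry exists
        return CODES.getOrDefault(code, code);
    }

    public static void main(String[] args) {
        System.out.println(codeToName("DE")); // Germany
        System.out.println(codeToName("XX")); // XX (no entry, falls through)
    }
}
```

    In the stylesheet, the call would then look roughly like `<xsl:value-of select="lookup:codeToName(string(COUNTRY))"/>`; the exact binding depends on the XSLT processor in use.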

  • Any free downloads are there XSLT mapping tool

    Hi all ,
    Are there any free downloads of an XSLT mapping tool? I need to enhance a mapping that was already done in XSLT; it also does lookups using Java coding. I only have the XSLT file in the imported archive, so please help me with it.
    Thank you,
    sridhar

    Hi,
    Refer to this:
    XSLT mapping
    /people/prasadbabu.nemalikanti3/blog/2006/03/30/xpath-functions-in-xslt-mapping
    Mudit
