Best O/R Mapping Tool Available

I would be most grateful if you could help me choose the best FREE Object/Relational Mapping tool/code library.
Any suggestion is welcome.
Thanks.

I found
http://objectbridge.sourceforge.net/
I have also heard of a project named S.O.D.A. or something similar: a kind of Java API for treating an RDBMS like an OODBMS, I think.
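Roughly, what I am after is an API where saving an object looks something like the following. This is only a made-up, JDO-style sketch (not from ObjectBridge, S.O.D.A. or any particular product); the class names and configuration are hypothetical:

import java.util.Properties;
import javax.jdo.JDOHelper;
import javax.jdo.PersistenceManager;
import javax.jdo.PersistenceManagerFactory;

public class PersistExample {
    // A plain Java class that the O/R mapping tool would map to a table.
    public static class Person {
        String name;
        int age;
        Person(String name, int age) { this.name = name; this.age = age; }
    }

    public static void main(String[] args) {
        // Hypothetical configuration: a real setup would name a concrete JDO
        // implementation class and the JDBC connection details.
        Properties props = new Properties();
        props.setProperty("javax.jdo.PersistenceManagerFactoryClass", "com.example.SomeJdoImplPMF");
        PersistenceManagerFactory pmf = JDOHelper.getPersistenceManagerFactory(props);
        PersistenceManager pm = pmf.getPersistenceManager();
        try {
            pm.currentTransaction().begin();
            pm.makePersistent(new Person("Ada", 36)); // no SQL written by hand
            pm.currentTransaction().commit();
        } finally {
            pm.close();
        }
    }
}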

Similar Messages

  • No Mapping tools available

    Hi,
    we have a strange problem in a File2BAPI scenario: in Message Mapping, no tools are available. I mean the small toolbar row where you can, for example, arrange dependencies automatically and show the lines between fields; I have no clue whether these tools have a name.
    Can somebody help?

    If I understand you correctly, you are looking for the mapping links, i.e. which source field is mapped to which target field.
    If that is what you are searching for, then you can find a rectangle box in the mapping tool with a 'Show All' option, which displays the mapping links.
    HTH
    Rajesh

  • XSLT mapping tool

    Hi All,
    I am an ABAP consultant, so it is difficult for me to write an XSLT mapping program.
    Is there any XSLT mapping tool available on the net that is free to use?
    One that can take the source XML and target XML and, after doing graphical mapping,
    generate the XSLT mapping program between the source and target XML.
    Regards

    Rohan,
    > Is there any XSLT mapping tool available on the net that is free to use?
    Yes, Stylus Studio and Altova have free downloadable tools for XSL mapping.
    > One that can take the source XML and target XML and, after doing graphical mapping,
    > generate the XSLT mapping program between the source and target XML.
    I don't think there is any tool that will use a graphical mapping to create an XSL mapping.
    But the point is, if you are creating a graphical mapping anyway, why do you need XSL?
    Regards,
    Neetesh
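    However the XSLT mapping program is produced, applying it to a source document is straightforward with the standard JAXP API. A minimal sketch; the file names mapping.xsl, source.xml and target.xml are made up for illustration:

    import java.io.File;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class ApplyXslt {
        public static void main(String[] args) throws Exception {
            // Load the XSLT mapping program and run it over the source XML.
            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource(new File("mapping.xsl")));
            t.transform(new StreamSource(new File("source.xml")),
                        new StreamResult(new File("target.xml")));
        }
    }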

  • Which object-relational mapping tool is the best for Oracle Coherence?

    My application is read-and-write-intensive. Which tool is most suitable for this application:
    TopLink Essentials, TopLink, Eclipse or Hibernate?
    Thank you

    I would pick Hibernate mainly because of its popularity and wide knowledge base.
    Coherence has provided some documentation for the integration.
    http://download.oracle.com/docs/cd/E14526_01/coh.350/e14537/usehibernateascoh.htm#CEGFEFJH
    If you already have the schema in the database, MyEclipse can generate Hibernate bindings for you by reverse engineering.
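    For orientation, the bindings produced by such reverse engineering are typically .hbm.xml files or annotated classes along the following lines. This is only an illustrative sketch; the table and column names are hypothetical:

    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.persistence.Table;

    // Illustrative JPA/Hibernate-style entity mapped to a hypothetical CUSTOMER table.
    @Entity
    @Table(name = "CUSTOMER")
    public class Customer {

        @Id
        @Column(name = "CUSTOMER_ID")
        private Long id;

        @Column(name = "NAME")
        private String name;

        public Long getId() { return id; }
        public String getName() { return name; }
        // Setters omitted for brevity.
    }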

  • What are the different kinds of tools available to clean up the Siebel file system

    Hi All,
    Please let me know what are the different kinds of tools available to clean up the Siebel file system.
    Thanks in advance
    Tusar

    assuming that you installed Siebel Server under D:\sba81
    cd D:\sba81\siebsrvr\bin
    1) reporting mode
    sfscleanup.exe /u SIEBEL-USER /p SIEBEL-PASSWORD /C SIEBEL-DATA-SOURCE /d SIEBEL-TABLE-OWNER /f SIEBEL-FILE-SYSTEM /m SIEBEL-FILE-SYSTEM-FOR-INCORRECT_FILES /r Y /x "D:\sba81\siebsrvr\log\sfscleanup_report.log"
    2) Real execution
    Replace /r Y by /r N
    Best Regards
    EvtLogLvl

  • Different mapping tools in the market --  For BPM mapping in XI

    Hi All
     I am particularly interested in learning about the BPM mapping tools (3rd party or any).
     Can anyone list the different tools and say which are the best?
     Our company is looking to implement XI (BPMs) with mapping tools.
     Can anyone help me with this?
    Regards
    Rakesh

    Hi Rakesh,
    please have a look at SAP Library: SAP Exchange Infrastructure (http://help.sap.com/saphelp_nw04/helpdata/en/14/80243b4a66ae0ce10000000a11402f/frameset.htm) for an understanding of XI, BPM and mappings.
    BPM has no special mapping tool. In XI you can use Java, ABAP, XSLT and graphical mapping (Message Mapping) in any case, BPM or not. '3rd party' has nothing to do with BPM or mapping; it simply means a non-SAP system.
    Regards,
    Udo
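    To illustrate what a Java mapping amounts to in principle (independent of the XI-specific mapping API), here is a minimal, generic DOM-based sketch. The element names SourceField and TargetField are made up; a real XI Java mapping would implement the interfaces of the XI mapping library rather than expose a plain static method:

    import java.io.InputStream;
    import java.io.OutputStream;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class SimpleFieldMapping {
        // Reads the source payload and copies every SourceField into a TargetField.
        public static void map(InputStream in, OutputStream out) throws Exception {
            Document src = DocumentBuilderFactory.newInstance().newDocumentBuilder().parse(in);
            Document tgt = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();
            Element root = tgt.createElement("TargetMessage");         // hypothetical target root
            tgt.appendChild(root);
            NodeList fields = src.getElementsByTagName("SourceField"); // hypothetical source field
            for (int i = 0; i < fields.getLength(); i++) {
                Element f = tgt.createElement("TargetField");
                f.setTextContent(fields.item(i).getTextContent());
                root.appendChild(f);
            }
            TransformerFactory.newInstance().newTransformer()
                    .transform(new DOMSource(tgt), new StreamResult(out));
        }
    }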

  • Tools available to extract documents from the SAP Document Management system

    Hi ,
    Can anybody provide some valid information on
    tools available to extract documents and their associated metadata from the SAP Document Management system and DIR (Document Information Record) to a non-SAP document management system?
    Thanks..

    Hi,
    On the ALE/EDI interface you can use message type DOCMAS for triggering the outbound IDocs; you can use program RBDSEDOC for this.
    For ALE/EDI configuration, you can refer to the book by Arvind Nagpal, which is a step-by-step guide.
    Hope this helps.
    Best Regards, Murugesh AS

  • Design approach help : BIC Mapping Tool Conversion

    Hi All,
    Design approach:
    We know that the BIC mapping tool can be used for EDI-to-XML conversion; I also know that it is an any-to-any converter.
    But we prefer to use this tool only for EDI to EDI-XML conversion and to do the mapping in SAP PI, as it is easier to make a mapping change (if any) in PI than to redeploy the mapping to the module location every time.
    My question is:
    We know that the sender file adapter supports FCC for a maximum of 3 hierarchy levels.
    Can we make it a practice to accept requirements in SAP PI that have a complex flat file structure (more than 3 levels), use the BIC mapping tool to convert that complex structure into the format PI requires, and then proceed with the mapping and configuration? (Note: the assumption here is that the organisation has already procured the BIC mapping tool for some EDI message scenarios.)
    Or is it best practice to ask the source system to supply flat files strictly in a header-with-one-line-item format?
    Suggestions would be deeply appreciated.
    Warm Regards,
    Senthilprakash

    >
    > if you are thinking of using only the BIC mapping tool for converting flat files to XML and want to proceed with normal configuration without using BIC modules, then it is not feasible.
    >
    Dear Suresh,
    I am very well aware of that. We have the Seeburger SFTP adapter installed in our PI system, and for EDI conversion we use BIC to convert to XML and deploy the module at the adapter level.
    Now my question is: is it the right approach to start accepting requirements from sources with complex flat file structures (non-EDI messages), use the BIC module in the adapter to convert them to XML and proceed with configuration, or is it better to ask the source system to send the flat file in a format accepted by our file adapter's FCC?
    What drawbacks/bottlenecks do we see in using the BIC module to convert non-EDI complex flat files into XML format in PI?
    Regards,
    Senthilprakash.

  • Reverse mapping tool in 3.0.1 ignores "-schemas" option

    I believe I have discovered a bug in the 3.0.1 version of the reverse
    mapping tool.
    Here is a script of the commands that worked fine in 3.0.0:
    Script started on Mon Jan 12 11:02:19 2004
    1$ which schemagen
    /opt/kodo-jdo-3.0.0/bin/schemagen
    2$ echo $PATH
    /opt/kodo-jdo-3.0.0/bin:/sw/db/oracle/oracle817/bin:/sw/gen/sparc-sun-solaris2.9/acroread/5.06/bin:/sw/gen/sparc-sun-solaris2.9/cvs/1.11.5/bin:/sw/gen/sparc-sun-solaris2.9/esound/0.2.29/bin:/sw/gen/sparc-sun-solaris2.9/mpg123/0.59r/bin:/usr/bin:/sw/gen/sparc-sun-solaris2.9/gnupg/1.2.1/bin:/sw/gen/sparc-sun-solaris2.9/mozilla/1.3/bin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/sbin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/bin:/sw/pd/workman-1.3.4/bin:/usr/openwin/bin:/usr/bin:/sbin:/bin:/usr/sbin:/usr/ccs/bin:/usr/ucb:/opt/local/bin:/sw/modules/bin:/sw/com/bin:/sw/pd/bin:/sw/pd/office52/program:/sw/pd/RealPlayer8:/users/n9208/bin:/opt/openssh/bin:/usr/dt/bin:/usr/dt/bin:/usr/openwin/bin:/sw/db/tools/bin:/sw/db/iss/bin:/usr/local/bin:/usr/local/scripts
    3$ echo $CLASSPATH
    :/opt/oracle/oracle9.0.1.4.zip:/opt/kodo-jdo-3.0.0:/opt/kodo-jdo-3.0.0/lib/kodo-jdo-runtime.jar:/opt/kodo-jdo-3.0.0/lib/kodo-jdo.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-collections-2.1.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-lang-1.0.1.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-logging-1.0.3.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-pool-1.0.1.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-regexp-1.1.jar:/opt/kodo-jdo-3.0.0/lib/jca1.0.jar:/opt/kodo-jdo-3.0.0/lib/jdbc-hsql-1_7_0.jar:/opt/kodo-jdo-3.0.0/lib/jdbc2_0-stdext.jar:/opt/kodo-jdo-3.0.0/lib/jdo-1.0.1.jar:/opt/kodo-jdo-3.0.0/lib/jndi.jar:/opt/kodo-jdo-3.0.0/lib/jta-spec1_0_1.jar:/opt/kodo-jdo-3.0.0/lib/log4j-1.2.6.jar:/opt/kodo-jdo-3.0.0/lib/xalan.jar:/opt/kodo-jdo-3.0.0/lib/xercesImpl.jar:/opt/kodo-jdo-3.0.0/lib/xml-apis.jar:/opt/kodo-jdo-3.0.0/lib/jfreechart-0.9.13.jar:/opt/kodo-jdo-3.0.0/lib/jcommon-0.8.8.jar
    4$ schemagen -p kodo.properties -f schema.xml -schemas PRODTRDTA.F0101
    0 INFO [main] kodo.Tool - Schema generator running on schemas
    "PRODTRDTA.F0101". This process may take some time. Enable the
    kodo.jdbc.Schema logging category to see messages about the collection of
    schema data.
    136 INFO [main] jdbc.Schema - Reading table information for schema name
    "PRODTRDTA", table name "F0101".
    672 INFO [main] jdbc.Schema - Reading column information for table
    "PRODTRDTA.F0101".
    727 INFO [main] jdbc.Schema - Reading primary keys for schema name
    "PRODTRDTA", table name "F0101".
    2187 INFO [main] jdbc.Schema - Reading indexes for schema name
    "PRODTRDTA", table name "F0101".
    2432 INFO [main] jdbc.Schema - Reading foreign keys for schema name
    "PRODTRDTA", table name "F0101".
    2632 INFO [main] kodo.Tool - Writing XML schema.
    5$
    script done on Mon Jan 12 11:03:14 2004
    Note the first line of logging output: both the schema name and table name
    are properly recognized.
    Here is the scripted output of the same commands in 3.0.1:
    Script started on Mon Jan 12 10:29:03 2004
    1$ which schemagen
    /opt/kodo-jdo-3.0.1/bin/schemagen
    2$ echo $PATH
    /opt/kodo-jdo-3.0.1/bin:/sw/db/oracle/oracle817/bin:/sw/gen/sparc-sun-solaris2.9/acroread/5.06/bin:/sw/gen/sparc-sun-solaris2.9/cvs/1.11.5/bin:/sw/gen/sparc-sun-solaris2.9/esound/0.2.29/bin:/sw/gen/sparc-sun-solaris2.9/mpg123/0.59r/bin:/usr/bin:/sw/gen/sparc-sun-solaris2.9/gnupg/1.2.1/bin:/sw/gen/sparc-sun-solaris2.9/mozilla/1.3/bin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/sbin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/bin:/sw/pd/workman-1.3.4/bin:/usr/openwin/bin:/usr/bin:/sbin:/bin:/usr/sbin:/usr/ccs/bin:/usr/ucb:/opt/local/bin:/sw/modules/bin:/sw/com/bin:/sw/pd/bin:/sw/pd/office52/program:/sw/pd/RealPlayer8:/users/n9208/bin:/opt/openssh/bin:/usr/dt/bin:/usr/dt/bin:/usr/openwin/bin:/sw/db/tools/bin:/sw/db/iss/bin:/usr/local/bin:/usr/local/scripts
    3$ echo $CLASSPATH
    :/opt/oracle/oracle9.0.1.4.zip:/opt/kodo-jdo-3.0.1:/opt/kodo-jdo-3.0.1/lib/kodo-jdo-runtime.jar:/opt/kodo-jdo-3.0.1/lib/kodo-jdo.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-collections-2.1.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-lang-1.0.1.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-logging-1.0.3.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-pool-1.0.1.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-regexp-1.1.jar:/opt/kodo-jdo-3.0.1/lib/jca1.0.jar:/opt/kodo-jdo-3.0.1/lib/jdbc-hsql-1_7_0.jar:/opt/kodo-jdo-3.0.1/lib/jdbc2_0-stdext.jar:/opt/kodo-jdo-3.0.1/lib/jdo-1.0.1.jar:/opt/kodo-jdo-3.0.1/lib/jndi.jar:/opt/kodo-jdo-3.0.1/lib/jta-spec1_0_1.jar:/opt/kodo-jdo-3.0.1/lib/log4j-1.2.6.jar:/opt/kodo-jdo-3.0.1/lib/xalan.jar:/opt/kodo-jdo-3.0.1/lib/xercesImpl.jar:/opt/kodo-jdo-3.0.1/lib/xml-apis.jar:/opt/kodo-jdo-3.0.1/lib/jfreechart-0.9.13.jar:/opt/kodo-jdo-3.0.1/lib/jcommon-0.8.8.jar:/opt/kodo-jdo-3.0.1/lib/jline.jar:/opt/kodo-jdo-3.0.1/lib/sqlline.jar
    4$ schemagen -p kodo.properties -f schema.xml -schemas PRODTRDTA.F0101
    1 INFO [main] kodo.Tool - Schema generator running on schemas "all".
    This process may take some time. Enable the kodo.jdbc.Schema logging
    category to see messages about the collection of schema data.
    103 INFO [main] jdbc.Schema - Reading table information for schema name
    "null", table name "null".
    Exception in thread "main" java.lang.OutOfMemoryError
    5$
    script done on Mon Jan 12 11:01:45 2004
    Note the first line of logging output here: the schema is listed as "all"
    instead of the limited scope I had specified.
    This run eventually crashes because the account under which I am
    running the mapping tool has access to thousands of tables, so the JVM
    eventually runs out of available heap.
    My workaround is to fall back to 3.0.0.

    Thanks for the report. We noticed this ourselves a short while ago.
    The bug will be fixed in 3.0.2.

  • Is there a ruleset comparison tool available in the market?

    Dear all,
    I wanted to know if there is a SAP GRC ruleset comparison tool available in the market? As a part of our audit requirement, I would need to compare our current rulesets with the ones from last quarter - To identify any changes/enhancements.
    I know Bizrights Approva supports a comparison tool called ExamXML where we can perform a comparison of 2 XML files and figure out the differences/ changes.
    Please let me know if any of you has used such a tool for GRC ruleset comparison.
    Thanks,
    Kunal

    Kenguru wrote:
    > As a part of our audit requirement, I would need to compare our current rulesets with the ones from last quarter, to identify any changes/enhancements.
    If an auditor compares the SAP-delivered rule sets with a company's GRC rule sets (without deep investigation) and reports the differences in his/her audit report (as white spaces), then the auditor is doing it the wrong way.
    The auditor should be aware of the following facts:
    1. SAP delivered rule sets are mere best practices (only starting point)
    2. Most of the customers modify/update the rule sets as per their requirements
    3. Organizational rules are created by customers differently
    4. Some customers don't even choose sap delivered rule sets and completely create their own.
    So differences between the rule sets are to be expected, but such findings may or may not be appropriate for reaching a conclusion for audit purposes.
    Best Regards,
    Amol Bharti
    http://amudee.com
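    On the mechanical side of the question (comparing two exported rule-set XML files, in the spirit of the ExamXML comparison mentioned above), a very rough sketch follows. The file names and the Rule element / id attribute used as the rule identifier are assumptions for illustration only:

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.Set;
    import java.util.TreeSet;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class RulesetDiff {
        // Collects the identifiers of all rules in one exported rule-set file.
        static Set<String> ruleIds(String file) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(Files.newInputStream(Paths.get(file)));
            NodeList rules = doc.getElementsByTagName("Rule");         // assumed element name
            Set<String> ids = new TreeSet<>();
            for (int i = 0; i < rules.getLength(); i++) {
                ids.add(((Element) rules.item(i)).getAttribute("id")); // assumed attribute
            }
            return ids;
        }

        public static void main(String[] args) throws Exception {
            Set<String> current = ruleIds("ruleset_current.xml");      // hypothetical exports
            Set<String> previous = ruleIds("ruleset_last_quarter.xml");
            Set<String> added = new TreeSet<>(current);
            added.removeAll(previous);
            Set<String> removed = new TreeSet<>(previous);
            removed.removeAll(current);
            System.out.println("Added rules:   " + added);
            System.out.println("Removed rules: " + removed);
        }
    }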

  • 3.1.4 reverse mapping tool issue

    (Sorry for the duplicate posting...I meant to start a new thread with
    this but accidentally posted it as a reply to a 6-month old thread)
    Hello,
    I was running Kodo 3.0.2 when Abe and I had the exchange reproduced
    below back in January to deal with Oracle tables with "$" in the column
    names (which I subsequently updated to 3.0.3). The original subject of
    this discussion was "3.0.2 reverse mapping tool generates invalid
    .mapping file".
    I was able to get this working by running the following commands to
    implement Abe's suggestion:
    reversemappingtool -p kodo.properties -package db \
    -cp custom.properties -ds false schema.xml
    sed -e 's/\$/__DOLLAR__/' db/package.mapping > db/package.mapping.new
    mv db/package.mapping.new db/package.mapping
    javac db/*.java
    mappingtool -p kodo.properties -a import db/package.mapping
    sed -e 's/__DOLLAR__/\$/' db/package.jdo > db/package.jdo.new
    mv db/package.jdo.new db/package.jdo
    In my custom.properties file, I had lines like these to put useful names
    on my class's fields:
    db.TransactionDetailHistory.y$an8.rename : addressNumber
    As I said, in 3.0.3, this worked perfectly.
    I picked this code back up for the first time since getting it working 6
    months ago, and decided to update it to 3.1.4 (since I'm already using
    that on other projects). Problem is, the reverse mapping tool has
    changed and the code it generates no longer works as it once did. I
    tried running the 3.1.2 and 3.1.0 reverse mapping tool, and it failed
    the same way, so it looks like this change happened in the 3.0.x to
    3.1.x version change.
    What happens is this: In the generated Java source, my fields used to
    end up with names as per my specification (e.g., the Oracle column named
    "y$an8" showed up as "addressNumber" in the java source).
    However, it looks like the "$" became special somehow in 3.1.0 - the
    "y$an8" column now shows up as "yAn8" in the generated Java. I tried
    changing my custom.properties file accordingly, but it still shows up as
    yAn8 even after changing my mapping to look like this:
    db.TransactionDetailHistory.yAn8.rename : addressNumber
    What do you make of this?
    Thanks,
    Bill
    Abe White wrote:
    > Hmmm... this is a problem. '$' is not legal in XML names, and there
    is no standard way to escape it.
    >
    > Your best bet is probably to do the following:
    > 1. In the generated .mapping file, replace all '$' characters with
    another token, such as '--DOLLAR--'.
    > 2. Switch your properties to use the metadata mapping factory:
    > kodo.jdbc.MappingFactory: metadata
    > 3. Import your mappings into the metadata mapping factory:
    > mappingtool -a import package.mapping
    > 4. Delete the mapping file.
    > 5. In your .jdo file, replace '--DOLLAR--' with '$' again.
    >
    > The metadata mapping factory doesn't put column names in its XML
    attribute names, so you should be able to use it safely.

    William-
    However, it looks like the "$" became special somehow in 3.1.0 - the
    "y$an8" column now shows up as "yAn8" in the generated Java. I tried
    changing my custom.properties file accordingly, but it still shows up as
    yAn8 even after changing my mapping to look like this:
    db.TransactionDetailHistory.yAn8.rename : addressNumber
    Well, the reverse mapping tool makes some assumptions based on common
    naming strategies for relational databases and Java naming: columns like
    "FIRST_NAME" will be renamed to "firstName". The Reverse Mapping tool is
    seeing the "$" and treating it as a non-alphanumeric delimiter, so is
    fixing it.
    Can you try a couple of additional properties:
    db.TransactionDetailHistory.y$An8.rename: addressNumber
    db.TransactionDetailHistory.y$an8.rename: addressNumber
    Also, are other rename properties working for you, or is that the only
    field or class you attempt to rename? It might just be the case that
    you aren't correctly specifying the properties file or something.
    Finally, bear in mind that you can always implement your own
    kodo.jdbc.meta.ReverseCustomizer and just use that; not the easiest
    solution, but it can certainly be used to have very fine-grained control
    over the exact names that are generated.
    Marc Prud'hommeaux
    SolarMetric Inc.
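    To make the renaming behaviour described above concrete, here is a small, self-contained sketch (not Kodo code) of delimiter-based name conversion, which also shows why "y$an8" comes out as "yAn8":

    // Illustration only (not taken from Kodo): convert a column name to a Java-style
    // field name by treating any non-alphanumeric character as a word delimiter.
    public class ColumnNameDemo {
        static String toFieldName(String column) {
            StringBuilder sb = new StringBuilder();
            boolean upperNext = false;
            for (char c : column.toCharArray()) {
                if (!Character.isLetterOrDigit(c)) {       // '_', '$', etc. act as delimiters
                    upperNext = true;
                } else if (upperNext) {
                    sb.append(Character.toUpperCase(c));
                    upperNext = false;
                } else {
                    sb.append(Character.toLowerCase(c));
                }
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            System.out.println(toFieldName("FIRST_NAME")); // prints firstName
            System.out.println(toFieldName("y$an8"));      // prints yAn8
        }
    }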

  • What is the best SharePoint 2010 synchronisation tool for MacBook Air

    What is the best SharePoint 2010 synchronisation tool for MacBook?
    Accessing SharePoint from a mapped drive is really slow, so I want to save my files locally and synchronise with SharePoint 2010.
    Is there an application or client for this?

    Ok, so do you have a recommended brand and capacity in mind?
    You mentioned that one of your primary needs is to store video, which demands large capacity. As a consequence I'd suggest a drive with at least 1 TB capacity, perhaps greater. I don't have a preferred brand or manufacturer (it's a bit like recommending which is better between Ford and Chevy), but I have a couple of Western Digital external USB drives which have proven very reliable, and a Seagate which is used as a daily backup and has worked flawlessly.
    DWB's point about backups is important too, because when you store files on any drive there is a risk of that drive failing; indeed, all drives will fail at some point. The question is, can you live with the total loss of those files if the drive does fail? If you are storing files that you value, then a backup is needed to help protect them. Ideally that would mean two drives, not one: one for storing the files and the second for backing them up.

  • What are the tools available for GPS NMEA data stream reception?

    Since there is no software for a Bluetooth connection from my Holux GPS device to my iPad 2, I plan to write one myself. What are the tools available, and what is the simplest development platform for this simple task? That is, just reading the NMEA ASCII stream from the Bluetooth device.

    It depends on what 'interprocess communication' your "app" program has available.
    I regularly use DDE to control a PLL app which controls our PLL via the LPT port. This is only a write process, but it works very easily. The read is equally easy. You need to know the various "keywords" like service, topic and instruction which the "app" will respond to.
    Generally I found ActiveX to be more extensive, meaning it is probably going to take longer and more steps to achieve similar simple results.
    The file does not seem to be the best way.
    Hope that helps a bit.
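    On the parsing side, once the raw NMEA ASCII stream is available as text, the sentences are simple comma-separated lines with an XOR checksum, so reading them takes very little code. A minimal Java sketch (the GGA sentence below is a standard textbook example, not data from a Holux device):

    public class NmeaDemo {
        // NMEA sentences look like "$GPGGA,field1,field2,...*HH", where HH is the
        // hexadecimal XOR of all characters between '$' and '*'.
        static boolean checksumOk(String sentence) {
            int star = sentence.indexOf('*');
            if (!sentence.startsWith("$") || star < 0) return false;
            int sum = 0;
            for (int i = 1; i < star; i++) {
                sum ^= sentence.charAt(i);
            }
            return sum == Integer.parseInt(sentence.substring(star + 1), 16);
        }

        public static void main(String[] args) {
            String s = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47";
            System.out.println("checksum ok: " + checksumOk(s));
            String[] fields = s.substring(0, s.indexOf('*')).split(",");
            System.out.println("sentence type: " + fields[0] + ", UTC time: " + fields[1]);
        }
    }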

  • Opacity / transparency in gradient map tool not working?

    Hi *,
    when selecting a gradient with transparency, e.g. Neutral Density, in the gradient map tool (in an adjustment layer), the transparency is ignored, i.e. Neutral Density behaves like plain black, which I assume is not intended. And I am almost certain the gradient map applied transparency properly earlier, but I might have involuntarily changed some setting. So far I could not find a good answer. Any ideas?
    Best regards,
    arnie

    You want to use a Gradient Fill adjustment layer instead of Gradient Map for the Neutral Density presets

  • ELM Mapping tool in CRM 5.2

    Hi Friends
    We are on CRM 5.2 and trying to use the ELM mapping tool, in which we define the mapping format. However, transaction CRMD_MKTLIST_MAP gives a message to use the new interface. The other transaction, for maintaining the external list, works fine.
    Please help regarding how to access the 'Define Mapping Format' transaction. Is there a new transaction for that?
    Thanks in Advance.
    -MP

    Hi ,
    In CRM Version 4.0 or 5.0, mapping formats were created for External List Management. After you upgrade to Version 5.1 or higher, these mapping formats are no longer available on the screen for maintaining external lists.
    In CRM Version 5.1, the mapping function for External List Management was converted to new database tables. Mapping formats that were created before CRM 5.1 must be migrated into these new tables.
    The old interface for maintaining mapping formats is based on the old database tables, which means that they can no longer be used.
    As long as the mapping formats are not migrated, they are not available on the screen for maintaining external lists.
    The Process to migrate from old mapping formats to New Mapping Format is like this :
    Start transaction SA38 in SAP GUI. Enter CRM_MKTLIST_MAP_MIGRATE as the program name. You can navigate from the menu to the program documentation. Read this documentation to understand how the program is to be used.
    Then start the program and select the mapping formats to be migrated from the list displayed.
    The program only provides you with the mapping formats in the logon client. You must start the program in each client in your system in which you want to migrate mapping formats.
    Hope This Helps .
    Regards ,
    Nags .
