Mappings - CREMDM etc.

Hi,
I'd like to send a CREMDM IDoc from MDM to XI,
but are there any predefined mappings from MDM objects to IDocs in MDM,
or do I have to build the mapping inside MDM myself?
If there are, can I download them from somewhere?
Thank you for your help,
Regards,
michal

Hi,
Partly,
>>>> (Like vendor master, material master) there are predefined maps... so the Syndicator will use that... and will create the XML file
Could you just specify where to find those predefined ones
(in which folder, or where on service.sap.com)?
Thank you,
Regards,
michal

Similar Messages

  • External table in Oracle 10g and Windows server 2003 problem

    I'm having a problem accessing an external table in an Oracle 10g and Windows 2003 server environment. The disk is remote (mapped) to the server. It's the usual access errors, KUP-04001 or KUP-04063.
    I have the [seemingly] exact same setup in an Oracle 9i and Windows 2000 server environment which works perfectly. Services run as local SYSTEM and SYSTEM has full permissions on the disk. Directories exist and so do the permissions to the directories.
    In the Oracle 10g and Windows 2003 environment, services run as local SYSTEM and SYSTEM has full permissions on the mapped disk. Directories exist and so do the permissions to the directories.
    This obviously affects mappings, deployments, etc.
    Does anyone know if something changed in either the db or the os?
    Thank you,
    Michael

    Thank you for your reply.
    Your proposal is the standard solution. As a matter of fact, that's how it is on my "old" system. Server "A" Oracle services are started by SYSTEM. Server "B" Oracle services are started by SYSTEM, and that is where the flat files reside. SYSTEM has full permissions on a disk local to Server "B". Database directories defined on Server "A" point to the Server "B" remote disk. External tables on Server "A" are defined with directories that point to the Server "B" remote (mapped) disk. I have no problems with this configuration.
    I'm having a problem achieving the same configuration with Oracle 10g and Windows 2003. I guess I'm baffled over the fact that it isn't working the same way as my "old" environment. Why shouldn't it? Oracle (and you) want me to start services as a specific user. It shouldn't have to be that way unless something changed in the way the database makes calls to the OS, or Microsoft slipped something in with one of its numerous security patches.
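    For reference, a minimal sketch of the kind of definition involved, using a UNC path rather than a mapped drive letter (a service generally cannot see per-user drive mappings). The server, share, table, and file names below are placeholders, not values from the original post:
    # hedged sketch - directory and external table over a UNC path;
    # \\serverb\share, ext_demo and demo.csv are hypothetical names
    sqlplus system@orcl <<'SQL'
    CREATE OR REPLACE DIRECTORY ext_dir AS '\\serverb\share\data';
    CREATE TABLE ext_demo (
      id   NUMBER,
      name VARCHAR2(50)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE
                         FIELDS TERMINATED BY ',')
      LOCATION ('demo.csv')
    );
    SELECT COUNT(*) FROM ext_demo;
    SQL
    If the service account cannot reach the path, the same KUP-04001/KUP-04063 access errors appear, which is why running the service as a specific user with rights on the share is the usual fix.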

  • Deployed in Kuwait - VPNclient configuring w/o networkmanager (pptp)

    My laptop connects to a wireless router in the barracks and Kuwait has quite a few filters on what we can access.
    I've purchased an account with an online VPN provider and have been able to connect via windows7 and via networkmanager-pptp in Arch.
    The problem is, I typically use wicd.  This is for many reasons, but here it's almost necessary since there are multiple routers broadcasting on different channels but all with the same essid; so networkmanager cannot differentiate between them.
    Anyway, I know the vpn works but I can't get the configs right without networkmanager.
    I've done it according to the wiki and various other posts I read here and elsewhere, and I can connect, but it doesn't seem to "take". My IP doesn't change (as seen by a what-is-my-IP site) and sites are still blocked. And it gets stranger: once connected to the VPN I cannot load any pages except for Google/iGoogle. Google chat with my wife works fine, but I'm unable to ping google.com.
    And running poff gives me this:
    sudo poff myCon
    /usr/bin/poff: I could not find a pppd process for provider 'myCon'. None stopped.
    here are my connection settings from /peers: http://pastebin.com/pui3rTwv
    here are my settings from options.pptp: http://pastebin.com/Lbm6UyMt
    and my chap-secrets looks like this:
    <my user name> <vpn server ip> <my password> *
    And here's a terminal output of  "pon myCon debug dump logfd 2 nodetach":  http://pastebin.com/M79JJp4C
    As always, thanks for the help.

    Actually, you can still do that:
    1) Add static routes for all your VPN servers' IP addresses, e.g. route add europe.myvpn.com gw <my isp's default gateway> dev <my device>
    For ease of use, add all your routes to a bash script, and then run it from rc.local.
    2) Before connecting to the VPN, disable your default route through eth0, etc. By doing so, however, you will not be able to connect at all, because your myCon config file uses a DNS name in the pty parameter instead of the VPN server's IP address; either change it in the config file, or add all those DNS-to-IP mappings to /etc/hosts.
    3) Create multiple ppp config files with different pty parameters; you can then execute them using a normal bash script containing pon myEuropeVPN, pon myAsiaVPN, etc. (a consolidated sketch follows).
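    Putting the three steps together, a rough sketch; the gateway address, device, and VPN host names are placeholders for your own values:
    #!/bin/bash
    # rough sketch of steps 1-3; gateway, device and vpn hostnames are
    # placeholders - adjust to your setup
    GW=192.168.1.1      # <my isp's default gateway>
    DEV=eth0            # <my device>
    # 1) pin each vpn server to the real gateway (resolve while DNS still works)
    for host in europe.myvpn.com asia.myvpn.com; do
        ip=$(getent hosts "$host" | awk '{print $1}')
        route add -host "$ip" gw "$GW" dev "$DEV"
    done
    # 2) drop the default route so traffic flows through the tunnel instead;
    #    remember the pty parameter must then use an IP or an /etc/hosts entry
    route del default
    # 3) bring up whichever peer you want
    pon myEuropeVPN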
    Last edited by Sin.citadel (2010-06-10 18:14:23)

  • Integration Process of abstract interfaces of diff SWCV's

    Hey all,
    I am trying to create an integration process with sender and receiver abstract interfaces from different SWCVs. If I create the integration process in one of the SWCVs, I am not able to select the other interface in the receive step. Is it necessary for both the sender and receiver abstract interfaces to be in the same SWCV? Can they not be in different SWCVs?
    -AR

    Hi Antonio,
    Clearing the SLD cache is not enough. After your SWCV update in the SLD, you need to import your SWCV again inside the IR.
    None of your objects will be deleted, so don't worry about your Data Types, Mappings, BPM, etc.
    I have already done such an import (after adding a usage dependency), and I have never lost an object.
    If you have any doubts, you can export all your namespaces to a file (cf. Tools > Export). That way you will be able to import all your namespaces again if something goes wrong (but that will not be the case!). Ask your admin to keep this file, in order NOT to send it to your production system...
    Regards
    Mickael
    Message was edited by: Mickael Huchet

  • Configuring apache & tomcat on separate machines

    I need help setting up Tomcat 3.2.1 and Apache 1.3.20 on separate machines (Windows 95 & Windows NT)!! I am able to run them fine on one machine but am baffled by putting them on two separate machines.
    I would be most grateful if anyone can provide me with the configuration steps and show me examples with regard to the Apache httpd.conf & Tomcat mod_jk.conf files.
    Thanks

    Not sure how to solve your two-machine setup, but as for the mapped network drive: mapped drives are profile specific. When you log into the machine, all settings, mappings, desktop, etc. are restored. When you log off, all those things are gone.
    Apache, on the other hand, does not have its own "profile", and is running as a service, so it doesn't "know" of any mapped drives from its point of view. Local drives are fine; the OS can see those at all levels.
    Correct me if I'm wrong anyone, but you could have Apache and Tomcat running on separate machines as their own web services. Whenever you need to make a JSP/servlet call, just reference the Tomcat machine.
    Tomcat is currently set up for port 8080, but I'm sure you can set it up to work off port 80 and have it "look" like a normal web server. The Apache server would just handle your static HTML pages.
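    As a rough sketch of the wiring involved (the host name, port, and paths are placeholders, not tested against those exact versions), workers.properties on the Apache box would point mod_jk at the Tomcat machine:
    # workers.properties - hypothetical host/port for the remote Tomcat box
    worker.list=worker1
    worker.worker1.type=ajp13
    worker.worker1.host=tomcat-host.example.com
    worker.worker1.port=8009
    # and in httpd.conf (module and file paths are placeholders):
    # LoadModule jk_module modules/mod_jk.so
    # JkWorkersFile conf/workers.properties
    # JkMount /*.jsp worker1
    # JkMount /servlet/* worker1
    With that in place, Apache serves the static HTML locally and forwards JSP/servlet requests over AJP to the Tomcat machine.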

  • New ODI User having trouble creating an XML Data Source...

    How's everyone doing? I'm new to the Oracle Data Integrator software. My goal is to take an XML message, do a bunch of mappings/constraints/etc. and integrate the appropriate data into an Oracle database. Seems like ODI is good for the job, but I'm running into some issues setting up the actual XML source file.
    I followed this guide: [ODI XML to Database Transformation|http://www.oracle.com/technology/obe/fusion_middleware/ODI/ODIXML_to_DB_Transform/ODIXML_to_DB_Transform.htm], which was helpful, but since it used example XML files it did not go into how to use your own custom XML file.
    Let me start off by sharing the XML file I'm attempting to import:
    <?xml version="1.0" encoding="UTF-8"?>
    <Money xmlns="http://xxxxx.oracle.com/schema/Money"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://xxxxx.oracle.com/schema/Money file:/C:/OraHome_1/oracledi/demo/xml/Money.xsd">
        <MoneyID>MoneyID0</MoneyID>
        <ContractID>ContractID0</ContractID>
        <EffectiveDate>2006-05-04T18:13:51.0Z</EffectiveDate>
        <MessageDate>2006-05-04T18:13:51.0Z</MessageDate>
        <ReversalIndicator>false</ReversalIndicator>
        <PriorMoneyID>PriorMoneyID0</PriorMoneyID>
        <Amount>0</Amount>
        <MoneyType>0</MoneyType>
        <ExchangeDetails>
            <CostBasis>0</CostBasis>
            <ExchangeType>0</ExchangeType>
            <MEC>false</MEC>
            <LoanAmount>0</LoanAmount>
        </ExchangeDetails>
    </Money>
    Pretty straightforward, I'd say. Note that the XSD file it references DOES exist in that directory, so everything should be fine. So I go into Topology, select the "Physical Architecture" tab, right-click "XML", and click "Insert Data Server". I name my file "XML_MONEY", select the JDBC tab, and fill in the following fields:
    JDBC Driver: com.sunopsis.jdbc.driver.xml.SnpsXmlDriver
    JDBC Url: jdbc:snps:xml?f=../demo/xml/Money1.xml
    When I hit "Test", it gives me this annoying message:
    java.sql.SQLException: Could not generate the DTD because the file could not be created. Verify that you have write permission in the directory.
    I have no idea why. I definitely have permissions to write to that directory.
    I played around some more, and removed the junk at the top of the XML file, leaving the file with JUST XML tags. Now I get this error:
    java.lang.ArrayIndexOutOfBoundsException: 1
    What does this mean? I'm confused why this would be happening, because a couple of days ago, when I was playing around with a dummy XML file (with just tags), it connected just fine.
    If anyone can lend a hand I'd greatly appreciate it. I'm probably missing something stupid, so feel free to yell at me.
    Thanks!!
    Edited by: user9958203 on Nov 19, 2009 12:48 PM

    Hi, I will try to help you out. Let's take a simple example.
    Create your own XML, for example:
    <pre>
    <?xml version="1.0" encoding="UTF-8"?>
    <ROWSET>
    <ROW>
    <ENAME>SCOTT</ENAME>
    <SAL>15000</SAL>
    <LOC>INDIA</LOC>
    </ROW>
    <ROW>
    <ENAME>ROCK</ENAME>
    <SAL>25000</SAL>
    <LOC>USA</LOC>
    </ROW>
    </ROWSET>
    </pre>
    After that, go to Topology Manager and create a data server in the XML technology, for example "EMPLOYEE":
    JDBC Driver: com.sunopsis.jdbc.driver.xml.SnpsXmlDriver
    JDBC URL: jdbc:snps:xml?f=C:/EMP.xml&s=EMP01
    Test it; you should get a "successful connection" message.
    After getting this message, apply the settings. The Physical Schema window will pop up; there, give the Schema as EMP01 and the Work Schema as EMP01. Then, in the Context tab, create a context and give the logical schema a name (it can be any name; here I am naming my logical schema EMP01). Apply the changes.
    Create a model:
    1. In the Definition tab give the model name, and select Technology as XML and Logical Schema as EMP01.
    2. In the Reverse tab select the context which you created in Topology Manager.
    3. In the Selective Reverse tab reverse all the tables.
    In the model you will get 2 datastores: ROWSET and ROW.
    Create an interface:
    Drag the ROW datastore into the source. I have created an EMP_TEST table in my Oracle database, with Ename, Sal and Loc as columns, and reversed it so that I can use EMP_TEST as my target table.
    Drag the EMP_TEST table into the target datastore and map the columns.
    Select the IKM "SQL to SQL Append" and execute the interface; the data in the XML will be loaded into the EMP_TEST table.
    Hope this helps,
    Thanks,
    Praneeth.

  • Use of interface controller

    Hi,
    Can you please explain the use of the interface controller, with a simple example along with the code?
    Thanks
    kishore

    Hi Kishore,
    Here are a few facts about the interface controller:
    It is used to control access from other components.
    It is created automatically when a component is created.
    It exists as long as its component is embedded in another component.
    This controller contains data, methods, and events that can be used by other Web Dynpro components.
    A component's interface controller can be used by the embedding component for mappings, eventing, etc.
    It is the interface for external users of a component: data transfer, eventing.
    Interface Controller
    One Web Dynpro component may declare the use of another Web Dynpro component to form a "Parent-Child" usage relationship. In such cases, the only point of interaction between the parent and child components is the functionality found in the child's interface controller.
    There is a usage declaration arrow pointing from the component controller to a child component. Notice that the arrow joins the child component at the top right-hand corner. This is the location of the child's interface controller.
    At no time does the Web Dynpro Runtime interact with a component's interface controller. It is only ever used to form a Parent-Child relationship between two Web Dynpro components.
    Check the links:-
    http://help.sap.com/saphelp_nw70/helpdata/EN/47/45641b80f81962e10000000a114a6b/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/EN/44/0dcafa2d9d1ca2e10000000a11466f/frameset.htm
    Hope that helps.
    Regards,
    Deepak

  • OWB general questions for effective use.

    Hi all,
    I have been using OWB for a while now, and am getting to the point where I want to make sure I am using it effectively.
    For example, how does one decide what to include in one project, or to split it up into multiple projects? I am loading a warehouse, and so far I am only loading raw data into tables.
    My next step will be to perform ETL on the raw data and start forming more structured warehouse data. Would that step be better contained in a separate project? Would I need to repeat the definitions of the tables in the loading project? Should I just keep the whole thing in one project? The loading project is quite large, as we have raw data from many sources, and it seems getting one file in takes about 5-7 OWB objects (flat file, external table, 2-3 mappings, process flow, 1-2 tables).
    So I have dozens of mappings, tables, etc.
    Even though much of the data comes from different places, it is generally used together by the end users, and the ETL will likely also need to use most of it together.
    Is there any "Best Practices" posted anywhere?
    Another question that has come up is this: it seems the idea is to create the warehouse structures completely in OWB and deploy them to the DB. However, OWB doesn't allow for a full table definition, for things like triggers, or for advanced features from a later DB version.
    So does one just create a "phantom" entry in OWB that is never deployed and then create the actual table manually, or deploy and then modify manually to add the trigger?
    Or are we not supposed to be using DB triggers, and instead control everything through OWB?
    Any insight would be appreciated.
    Thanks

    Hi
    I think the kind of questions you are asking are aimed more at methodology than at OWB itself. There are plenty of sites you can get this kind of info from, one (but not necessarily the best) being <http://www.ittoolbox.com/>
    In any case we use three projects and multiple schemas
    project & schema 1 is used to collect data quickly from multiple sources
    project & schema 2 normalizes the data (acts as the storage repository)
    project & schema 3 is where the datamarts exist (de-normalized data)
    This approach allows you to isolate your integration layer from your reporting layer; most changes only affect one of the layers, not all.
    As far as creating your structures in OWB is concerned, I see no problem, provided you are using a good ER tool and have ironed out any potential problems.
    I have certainly created triggers manually and added them after deployment, but in most cases you can use transformations and post-mapping and pre-mapping processes to do the same thing; after all, the data should only get into the target through a mapping. If it gets in any other way, you have a hole in your bucket.
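    For illustration, a minimal sketch of the manual-trigger route; the schema, table, and column names are hypothetical, not from the poster's warehouse:
    # hedged sketch - adding an audit trigger by hand after OWB deployment
    sqlplus owner@dwh <<'SQL'
    CREATE OR REPLACE TRIGGER trg_orders_audit
    BEFORE INSERT ON orders
    FOR EACH ROW
    BEGIN
      :NEW.load_date := SYSDATE;  -- stamp each row as it arrives
    END;
    /
    SQL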
    Chris

  • Add New Virtual Network Interface to RRAS

    In the past (W2k3), adding an interface to the OS would also cause that interface to be added to RRAS.  I have a server running RRAS with multiple VMs running in Hyper-V.  I want to add an isolated VM with a single port available from the Internet to that VM.  I have added an Internal Virtual Network and configured it with a static IP in the host OS (the same configuration, except IP address, as my other virtual network interfaces).  The new interface does not show up in RRAS, even after a restart.  In W2k3, you could add an interface in the Network Interfaces area.  In W2k8, the only interface you can add is a demand-dial interface.
    In the past, I had to delete the RRAS server and reinstall it to see the new interface.  This is ugly as it removes, for an extended time, all of the mappings, filters, etc. that I have in place.
    Any ideas on how to manually add a new interface to RRAS, either by registry, PowerShell, script, etc.?
    Pete

    I found a new way around this which should work for everyone, a twist on the existing registry solution:
    Open RegEdit.exe.
    Go to "HKLM\System\CurrentControlSet\Services\Tcpip\Parameters\Interfaces" where each of your network adapters (physical or virtual) has a GUID named sub-key. Identify which one is your network adapter (look at the IP or DHCP settings in the child key=values
    or configure a temporary address so you can find it). Now you have discovered the GUID. Click the GUID key then hit F2 to goto rename mode and highlight the whole key, hit CTRL+C to copy to the clipboard.
    Go to "HKLM\System\CurrentControlSet\Services\RemoteAccess\Interfaces". Add a new sub-key which is the next in sequence, for example I had 1...6 so I added "7". This is key add the following...
    A DWORD32 (32bit integer) called "Enabled" set to 1.
    A REG_SZ (string) called "InterfaceName" then paste your interface GUID, e.g. set to "{XXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX}".
    A DWORD32 called "Stamp" set to 0.
    A DWORD32 called "Type" set to 3 (LAN).
    Restart RRAS, the interface will appear!
    Right click each protocol, e.g. IPv4, IPv6 then you can right click "General" to add a "New Interface..." and your interface will then be enabled for that protocol.
    I'm happy I played around a bit more to find that now. But MS, even though we now have a solution which does not require us to lose our configuration, please fix the source of this problem!
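    The same steps condensed into an importable .reg sketch; the GUID and the sub-key number "7" are placeholders for your adapter's GUID and the next free number under ...\Interfaces:
    Windows Registry Editor Version 5.00

    ; hedged sketch - substitute your own GUID and sub-key number
    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\RemoteAccess\Interfaces\7]
    "Enabled"=dword:00000001
    "InterfaceName"="{XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX}"
    "Stamp"=dword:00000000
    "Type"=dword:00000003
    Then restart RRAS and enable the interface per protocol as described above.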

  • OWB 10gR2 - Implementation Strategy

    I am trying to find what is the best strategy to install and configure a OWB environment for the following scenario on one linux server machine:
    There are three Oracle databases on this machine:
    database 1 = OWB repository owned by an owb_owner schema. The repository will contain all of the design metadata (a Source Oracle module with source tables, a Target "Oracle module A" with tables, sequences and mappings, and a Target "Oracle module B" with tables, sequences and mappings). The ETL process is set up to transform data in two stages: 1) Source to Target A; 2) Target A to Target B
    database 2 = It will have target schema Target_A. The contents of "Oracle Module A" from the OWB repository in database 1 needs to be deployed to this "target_A" schema
    database 3 = It will have target schema Target_B. The contents of "Oracle Module B" from the OWB repository in database 1 needs to be deployed to this "target_B" schema
    Do I need to have OWB repository installed in database 2 and database 3 and have control center service running in each of them to facilitate execution of ETL process?
    Deployment of the objects will be launched from a Windows workstation via the client Design Center / Control Center Manager. Do I need to have the Control Center Service running on the client machine to facilitate deployment? If so, would it facilitate deployment of the objects to the above two target schemas?
    The intent is to have a process flow in OWB metadata repository of database 1 streamline the complete ETL process tied to a location owned by OWF_MGR schema in database1.
    Please advise what implementation strategy will work best for this scenario. I read the strategy presented in http://download.oracle.com/docs/cd/B31080_01/doc/install.102/b28224.pdf, but I need more clarity on the available options.

    Hi.
    You will need a unified repository in each database. The unified repository in OWB 10gR2 contains both the objects for the design repository (your mappings, tables, etc.) and the control center repository (runtime and deployment information, such as execution times of mappings). In previous versions of OWB they were separate.
    Database 1: Install the unified repository and just use it as your design repository. You will use this when you log in via Designer.
    Database 2: Install unified repository together with Control Center Service and use as control center repository.
    Database 3: Install unified repository together with Control Center Service and use as control center repository.
    While it is possible to have the Control Center Service just run on a client machine, I ran into all sorts of problems in the past with this configuration; e.g. when you log off your client machine, it shuts down the Control Center Service. From my experience this setup is not very stable.
    In OWB Designer you can then set up locations and configurations to deploy your mappings to database 2 or database 3.
    You will need owf_mgr on both database 2 and database 3, and you need to deploy your process flows to both databases. What you can't do is run a process flow from, say, database 1 or database 2 across databases 2 and 3.
    Can you explain in some more detail why you have a requirement to transform data across two target servers?
    Regards
    Uli

  • DIY 2 routers w/o cascading

    Hi, just wanted to share my experience with adding a Linksys wireless router (WRT54GS v6) to my existing network, which already has a router on it. You might ask why I added another router; the answer is simple: it was cheaper to get than a standard WAP by about $30 US ("they were running a special").
    I currently have this setup
    Wall to cable modem
    Cable modem to Netgear router w/firewall & vpn
    Netgear "last port" to linksys port 4
    computers connected to netgear
    servers, printers & wireless connected to linksys
    all on the same ip scheme.
    My network uses a netgear as the dhcp and everything on the linksys looks to it for dhcp.
    When I first got the Linksys I tried to follow the instructions and cascade the 2 routers, then came to find out that by doing so, everything on the Netgear would not talk to the computers on the Linksys. So it's easy to say that is not what I was looking for. This is what I did:
    Attached the Linksys router to my computer, started it up and went into the configuration "refer to your documentation". SAVE YOUR SETTINGS AFTER EACH CHANGE!
    Turn off dhcp "very important" SAVE
    Assign a static ip address to your router from your current network. Example...Main router is 192.168.0.1 so make your linksys 192.168.0.5 SAVE
    Go to setup and advanced routing and select the drop down box that says Gateway and switch it to router. Was the top selection on my menu. SAVE
    Reattach your computer to your existing network and cycle the power or just pull a new ip address.
    Go into your current network router/gateway, which is currently assigning your IP addresses, and tell it to start giving out addresses at 192.168.0.50. By doing this you have about 49 static IPs to use for servers, WAPs & printers.
    Plug your Linksys home router into the network; there are a few ways you can do this.
    Make a crossover cable or buy one and plug it into your main router "ex port 4" then take the cable and plug it into the linksys router "ex port 4" DO NOT CONNECT TO THE DESIGNATED INTERNET PORT
    If your current router has a uplink port "check your documentation" you can use a standard ethernet cable and plug into that and then plug into your linksys router. DO NOT CONNECT TO THE DESIGNATED INTERNET PORT
    Message Edited by Konoko on 10-29-2006 11:13 AM


  • Oracle Incentive Compensation Question

    Hi,
    I would like to know if there are other users who have implemented "Oracle Incentive Compensation".
    We implemented OIC about a year ago, and are currently running into performance issues in the "Loader, Collection and Calculation" process. We process a huge transaction volume of data every day, and our process performance creeps up on some nights (just before the next purge is due).
    Any help is greatly appreciated.
    Thanks!

    Hi,
    You also may want to analyze the collection and calculation process to see if there are any opportunities to tune. In some collection processes, we add SQL statements (indirect mappings, actions, etc.) that may slow down the process.
    You also may want to consider aggregating transactions before collection to reduce the amount of data fed into OIC.
    Similarly, depending on how your comp plan is defined there may be opportunities to improve performance. Depending on what version you are on, there some options to aggregate data during rollup process so that calculation runs faster.
    If you need more specific details, you may reach me at [email protected]
    Regards
    Srini

  • Financial Analytics - Need Information

    Hello,
    I am trying to find out the details about the prebuilt ETL process that comes with the Financial Analytics module. Can someone please point me to some documentation where I can find basic information, such as which mappings are available, how many mappings there are, etc.? If you are not able to locate any documentation but you know this information, kindly let me know.
    Thanks
    Venkat

    First identify your source system, such as Oracle ERP, PeopleSoft or Siebel, and its version. Then find the suitable container in DAC, such as Oracle R12 or Oracle R12.1.1. Within that you will be able to find the suitable subject areas, and also the suitable execution plan. Before you run the execution plan, read the BI Apps configuration guide and configure all lookup files properly.

  • Import/Export xls Error

    Hi,
    I'm facing a problem with import/export xls files.
    Once I select the file, the system doesn't return any response, even if the file is very small (about 70 KB).
    It happens both from workbench and web.
    Furthermore, I can't export anything to Excel (e.g. the categories).
    Any suggestion about it?
    Thank you

    Import XLS:
    Make sure the ups* range in your XLS workbook is defined to include all rows you want to import, including the two header rows.
    Please also be aware that XLS Import only adds records and does not update/overwrite what's already in there. If you want to change mappings/periods etc., you need to delete these records first and then add them again.
    Export to XLS:
    Your settings for export sound ok. Maybe there's an issue with the MIME Types on the web server - but I'm not too familiar with that, maybe someone else has an idea on this.
    If you export to XLS a file should be created in Outbox\Excel (not during import, just want to make sure I'm clear on this).
    Regards,
    Matt

  • Partner Profile Details

    Hi all
    In our case we need to send an IDoc to SAP R/3. How can we know the details of the sender port, partner number and sender address?

    Hi Rajesh,
    In XI, for any scenario involving IDocs, you will have to define the system details at the namespace level in the Integration Builder, and the port details in transaction IDX1. You will also have to load the metadata of the IDoc in transaction IDX2. In the Integration Directory, you will have to provide the adapter-specific metadata for the business system which you are using as sender. When you complete these steps, at runtime XI will take the sender system information from the adapter-specific metadata and the port from IDX1; it then loads the metadata from IDX2 and thus recognizes the IDoc. If you miss any of these steps, except for the first one, it will throw an error. The first step is used only during design of the mappings, interfaces, etc. in the IR.
    Award if helpful,
    Sarath.
