Mapping to an IDOC using an externally created XSL file

I want to map a message to an IDoc using an ABAP XSLT transform. I am creating the XSLT in an external tool.
I need to use an XSD definition of the IDOC as the target document of the mapping. I can get that in two ways:
1) Get the XSD definition of the IDOC from transaction WE60 and use the menu command 'Documentation -> XML Schema'
2) In the Integration Repository, export the XSD of the imported IDOC.
The problem is, there are significant syntax differences between these versions, and I don't know which is the correct version. For instance, the XSD exported from the Integration Repository has a mandatory attribute 'SEGMENT' for each segment that must be filled to create a valid XML message; this is not the case when the XSD definition is created in WE60.
Can somebody advise which is the correct version to use?

Hi Anthony,
This is really strange...
In both cases the XSD should be the same. Can you tell us which IDoc you are generating the XSD for?
Even if you generate the XML Schema using WE60, the mandatory attribute 'SEGMENT' should be there for each segment; this validation is done while posting the IDoc, when all mandatory fields are checked.
I have somewhat different thoughts from what Wojciech suggested:
"If I were in your shoes I would use the XSD from the Integration Repository, because that's the expected result..."
But the final goal is not just to achieve the mapping; it is to post the IDoc into the R/3 system successfully, with proper data.
Your second question:
"Do I assume then that placing a constant value of 1 in the SEGMENT attribute is just to satisfy the validation of the XML message against the XSD?"
If you assign the constant "1" to SEGMENT, that segment will be generated when the IDoc is created. It is mandatory to assign a constant value to the SEGMENT of every segment you want to generate.
"Can I assume that the IDoc adapter will fill the SEGMENT attribute with the correct value?"
I don't think the IDoc adapter will fill this attribute automatically; you need to assign a constant to the SEGMENT field yourself.
Nilesh
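Nilesh's point about SEGMENT can be checked outside XI. Below is a minimal sketch using the JDK's built-in XSD validator; the element name E1EXAMPLE and the one-attribute schema are illustrative stand-ins, not taken from a real IDoc definition. It shows that a schema declaring SEGMENT with use='required' rejects any segment element that omits the attribute, which is why the mapping must assign a constant to it:

```java
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import java.io.StringReader;
import org.xml.sax.SAXException;

public class SegmentAttributeDemo {
    // Minimal stand-in schema: one segment element with a mandatory
    // SEGMENT attribute, as in the Integration Repository XSD.
    static final String XSD =
        "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>" +
        "  <xs:element name='E1EXAMPLE'>" +
        "    <xs:complexType>" +
        "      <xs:attribute name='SEGMENT' type='xs:string' use='required'/>" +
        "    </xs:complexType>" +
        "  </xs:element>" +
        "</xs:schema>";

    static boolean isValid(String xml) {
        try {
            SchemaFactory sf =
                SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = sf.newSchema(new StreamSource(new StringReader(XSD)));
            Validator v = schema.newValidator();
            v.validate(new StreamSource(new StringReader(xml)));
            return true;
        } catch (SAXException | java.io.IOException e) {
            // Validation failure (e.g. missing required attribute) lands here.
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isValid("<E1EXAMPLE SEGMENT='1'/>")); // true
        System.out.println(isValid("<E1EXAMPLE/>"));             // false: SEGMENT missing
    }
}
```

Against a WE60-generated schema that does not declare SEGMENT as required, both messages would pass, which is exactly the discrepancy this thread is about.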

Similar Messages

  • HT201250 I have two external hard drives. One is my Time Machine backup drive.  The other I use for external storage of files (documents, photos, movies, etc).  Can I set Time Machine to backup BOTH my Mac hard drive and my other external hard drive?


    Yes, you can keep multiple backups on one hard drive. For example, if you have a 1 TB hard drive installed in your PC and two Mac machines with a 500 GB drive each, you can create two backup images of 500 GB each.
    http://www.halfspot.com/use-your-pc-hard-drive-for-time-machine-backup/

  • Error Using OEM to create export files

    Hi
    I get the following error whenever I try to use OEM to create export files (log in to the OS as the oracle user -> http://localhost:1158/em -> log in as SYSTEM -> Maintenance -> Export to Export Files, with the oracle user as host credentials):
    >>>
    Validation Error
    Examine and correct the following errors, then retry the operation:
    Error - ERROR: NMO not setuid-root (Unix-only)
    >>>
    From what I've read in this forum and in others, the problem is often related to the owner/group permissions associated with the bin/nmo and bin/nmb files. However, I've tried the posted solutions, changing the owner/permissions as suggested, and this does not help. I get the same error regardless of whether the files are owned by oracle or by root.
    Here is some more info
    >>>
    [root@rhlinux bin]# id oracle
    uid=501(oracle) gid=502(dba) groups=502(dba)
    [root@rhlinux bin]# ls -ld /app/
    drwxr-xr-x 4 root root 4096 Jul 6 13:36 /app/
    [root@rhlinux bin]# ls -ld /app/oracle/
    drwxrwxr-x 5 oracle dba 4096 Jul 10 10:13 /app/oracle/
    [root@rhlinux bin]# ls -ld /app/oracle/product/
    drwxrwx--- 3 oracle dba 4096 Jul 6 16:04 /app/oracle/product/
    [root@rhlinux bin]# ls -ld /app/oracle/product/v10.2.0/
    drwxr-x--- 58 oracle dba 4096 Jul 11 15:31 /app/oracle/product/v10.2.0/
    [root@rhlinux bin]# ls -ld /app/oracle/product/v10.2.0/bin/
    drwxr-xr-x 2 oracle dba 8192 Jul 20 11:19 /app/oracle/product/v10.2.0/bin/
    [root@rhlinux bin]# ll nm?
    -rwxr-x--- 1 root dba 18462 Jul 6 16:16 nmb
    -rwxr-x--- 1 root dba 19624 Jul 6 16:16 nmo
    >>>
    >>>
    [oracle@rhlinux bin]$ uname -a
    Linux rhlinux 2.6.9-34.0.2.EL #1 Fri Jun 30 10:23:19 EDT 2006 i686 i686 i386 GNU/Linux
    >>>
    >>>
    [oracle@rhlinux bin]$ sqlplus system@dware
    SQL*Plus: Release 10.2.0.1.0 - Production on Thu Jul 20 16:34:16 2006
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    Enter password:
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    >>>
    >>>
    [oracle@rhlinux bin]$ emctl status agent
    TZ set to Canada/Saskatchewan
    Oracle Enterprise Manager 10g Database Control Release 10.2.0.1.0
    Copyright (c) 1996, 2005 Oracle Corporation. All rights reserved.
    Agent Version : 10.1.0.4.1
    OMS Version : 10.1.0.4.0
    Protocol Version : 10.1.0.2.0
    Agent Home : /app/oracle/product/v10.2.0/rhlinux.swlocal_DWARE
    Agent binaries : /app/oracle/product/v10.2.0
    Agent Process ID : 32098
    Parent Process ID : 30792
    Agent URL : http://rhlinux.swlocal:3938/emd/main
    Started at : 2006-07-20 12:00:05
    Started by user : oracle
    Last Reload : 2006-07-20 12:00:05
    Last successful upload : 2006-07-20 16:27:29
    Total Megabytes of XML files uploaded so far : 1.54
    Number of XML files pending upload : 0
    Size of XML files pending upload(MB) : 0.00
    Available disk space on upload filesystem : 47.83%
    Agent is Running and Ready
    >>>
    >>>
    [oracle@rhlinux bin]$ emctl status dbconsole
    TZ set to Canada/Saskatchewan
    Oracle Enterprise Manager 10g Database Control Release 10.2.0.1.0
    Copyright (c) 1996, 2005 Oracle Corporation. All rights reserved.
    http://rhlinux.swlocal:1158/em/console/aboutApplication
    Oracle Enterprise Manager 10g is running.
    Logs are generated in directory /app/oracle/product/v10.2.0/rhlinux.swlocal_DWARE/sysman/log
    >>>
    Any other ideas what might be causing this problem?
    Thanks and take care,
    Shayne

    From what I've read in this forum and in others is that the problem is often related to the owner/group permissions associated with the bin/nmo and bin/nmb files. However I've tried the posted solutions by changing the owner/permissions as suggested, but this does not help. I get the same error regardless if the files are owned by oracle or by root.
    I am not sure which "posted solutions" you had, but this is related to not running root.sh which must be run against the Agent Home.
    1. Stop the Agent (emctl stop agent) while connected to the OS as Oracle user (normally oracle) that installed Grid Control
    2. Connect to the OS as root and run root.sh from Agent ORACLE_HOME (if you do not have root access, your System Admin can do it)
    3. Start the Agent (emctl start agent) while connected to the OS as Oracle user

  • Using MDAB to create Planning file

    Hello,
    I noticed that for some materials, there is no entry in the planning files.
    How can I use MDAB to create planning file entries for specific materials? Can I specify the name of materials?
    Please help me out with this issue.
    Sincerely,
    Ketan

    Hi,
    If you activate planning file entries in the background, the system takes care of creating planning file entries automatically, for those materials whose MRP type is anything other than ND.
    If the MRP type is ND, then there is no entry in the planning file.
    I hope this solves your problem.
    KM

  • Warning when trying to launch application over mapped drive "We can't verify who created this file. Are you sure you want to run this file?"

    We have 2 load-balanced application servers that map drives with programs on them for users from a central server, and they are giving the warning "We can't verify who created this file. Are you sure you want to run this file?". Basically, how it works is: a user logs in to one of the application servers (for example, 10.0.0.1 or 10.0.0.2) and runs a logon script that maps an L:\ drive to the user's folder on the central storage server (10.0.0.100). When the user goes to launch the .exe from the L:\ drive, it presents that warning.
    I've looked around and have seen recommendations that you need to add the site to the local intranet zone in IE, but when I try to add L:\ it tells me I have entered an invalid sequence. It appears that you can only put network shares in that location, which is not how we're mapping the drive.
    Has anyone run in to this before and knows how to resolve it?
    Thanks!

    Hi,
    A mapped drive letter is not supported, so please try entering \\server or \\server\folder instead to see if that helps.

  • How can I use a manually created help file

    Is it possible to use a manually created help file (*.hlp, a Microsoft help file) in an Oracle Forms application?
    I want to open the help file from my help menu. How can I do that?
    Joury

    Hi,
    Have you tried WIN_API_SHELL.winhelp and WIN_API_SHELL.winhelpex?
    Both are available in D2KWUtil.PLL, usually included in Forms demos.
    Pedro

  • Mapping Issue: Target IDoc structure not getting created properly

    Hi Experts,
    I am stuck with a strange problem in my mapping. It is a M:N scenario mapping where we have multiple IDocs in source as well as in target.
    For example, in source Queue-1 I am getting the values [SUPPRESS, true, true, true, true] and in source Queue-2 the values [1,2,3,4,5]. I used the MapWithDefault function to match the number of values in both queues, but in the target IDoc the node (say TMPN) is created only 4 times because of the condition put on the target node. So the target node should have the values [1,2,3,4], but it gets [2,3,4,5], because the first target TMPN node is not created in the first target IDoc. I can provide you with the skeleton of the map; hope this gives you a fair idea. Let me know if you require more details.
    1. Target Structure:
        IDoc - No TMPN Node
        IDoc - 1 TMPN Node
        IDoc - 2 TMPN nodes
        IDoc - 1 TMPN node
    Condition put on target TMPN_Node ---> (created only 4 times due to condition)
    2. Mapping Skeleton:
    Values coming from Q1 (SUPPRESS, true, true, true, true) --\
                                                                ===> FORMATBYEXAMPLE + SPLITBYVALUE ==> TMPN-F0 [2]
    Values coming from Q2 (1,2,3,4,5) ------------------------/                                         TMPN-F0 [3]
                                                                                                        TMPN-F0 [4]
                                                                                                        TMPN-F0 [5]
    Where F0 is the field which is created when the TMPN node is created. So, if TMPN gets created 6 times, F0 gets created 6 times.
    Please let me know if there is an alternative solution to this problem. How can I get the values [1,2,3,4] in field F0 instead of [2,3,4,5]?

    hi,
    i got the email.
    as far as i can see your mapping seems to be right, but there is something i don't get. in a previous post you said:
    I am stuck with a strange problem in my mapping. It is a M:N scenario mapping where we have multiple IDocs in source as well as in target.
    For example, in source Queue-1 I am getting values as SUPPRESS, true, true, true, true and in the in source Queue-2 I am getting the values as 1,2,3,4,5. I used MapWithDefault function to match the number of values in both the queues but in the target IDoc the node (say TMPN) is created only 4 times because of the condition put at the target node. So, the target node should have values as 1,2,3,4 but it gets values as 2,3,4,5 because the first target TMPN node is not created in the first target IDoc. I can provide you with the skeleton of the map. Hope this will give you a fair idea. Let me know if you require more details.
    1. Target Structure:
    IDoc - No TMPN Node
    IDoc - 1 TMPN Node
    IDoc - 2 TMPN nodes
    IDoc - 1 TMPN node
    let's assume you define "1" as the default value.
    now, the result of the formatByExample, taking the example you sent to me, is:
    1,T1,T2,T3,T4 without a context change, and the result of the function splitByValue should then be what you are expecting.
    but something is still not clear:
    could you post the result of the mapWithDefault? i think your problem is there.
    Rgds
    RP-.

  • Using java to create a file?

    hi there
    I just have a question regarding the creation of a file using the PrintWriter class:
    FileOutputStream fileOutputStream = new FileOutputStream(outFileName);
    PrintWriter out = new PrintWriter(new OutputStreamWriter(fileOutputStream, encoding));
    After the file is created, do I need to set out to null? I ask because I was not able to move or modify the file while the program is running.
    thank you for your help.

    "...do I need to set out to null? Because I was not able to move or modify the file while the program is running."
    That is because the program locks the file.
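    The lock the reply mentions is released by closing the writer, not by nulling the reference. A minimal sketch (the temp file names here are throwaway, not from the original post) showing that after close() the file can be moved:

    ```java
    import java.io.File;
    import java.io.IOException;
    import java.io.PrintWriter;

    public class CloseBeforeMove {
        public static void main(String[] args) throws IOException {
            File out = File.createTempFile("demo", ".txt");
            PrintWriter writer = new PrintWriter(out, "UTF-8");
            writer.println("hello");
            // Closing the writer flushes buffered output and releases the
            // OS file handle; until then Windows will refuse to move or
            // delete the file. Setting the reference to null does neither.
            writer.close();
            File dest = new File(out.getParent(), out.getName() + ".moved");
            boolean moved = out.renameTo(dest);
            System.out.println(moved); // true once the handle is released
            dest.delete();
        }
    }
    ```

    A try-with-resources block (Java 7+) around the PrintWriter achieves the same thing without an explicit close() call.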

  • Using Pages to create epub file??

    Why, when I export from Pages to an ePub file, does it change the layout (i.e. moving part of page 2 to page 3, and continuing on like that)?
    I'm trying to publish a book of poetry on iBooks but it keeps messing with the format.

    Howdy trentharris,
    Documents created in Pages and exported as ePub may look different when displayed. The following article explains this and gives some tips for best practices to reduce the differences -
    Pages '09: Creating an ePub Document to Read in iBooks
    Thanks for using Apple Support Communities.
    Best,
    Brett L 

  • Xcelsius XML Connection Issue when using a Crystal created XML file

    HI Xcelsius/Crystal Gurus,
    I am running Crystal Reports 2008 version 12.2.9.698 and Xcelsius 2008 5.3.0.0 Build 12,3,0,670,
    I am attempting to create an XML file from a Crystal report which will be consumed by an Xcelsius Dashboard.  Apparently, there is something with the file that Xcelsius does not like.
    I have formatted the report exactly as Xcelsius requires and exporting via text with an xml extension to create an .xml file.  When I add an XML Connection in Xcelsius and add the appropriate names/range, etc...I am able to see the file via the Preview XML button on the Connection, but when I use a component, I am not seeing the data in the component.
    I can 'recreate' the file in NotePad by manually typing in the entries to look exactly like the file created by Crystal and set up a connection in Xcelsius to consume that file and it works beautifully.
    Does anyone have any experience with using Crystal Reports to create an XML file (not using the XML Export, but by exporting as a Text and adding an extension)?
    Is it possible that Crystal is adding hidden characters or something?  I can pull this Crystal created file into an XML editor or MS Word and it is validated as a good XML File.
    Any help is greatly appreciated!

    This article is all about setting up security on the web service. But, I just want to connect to the web service as a client using the proxy and the web service doesn't have any security set on it at this time.
    I am getting a connection timeout, even when I set the syncMaxWaitTime in the domain.xml file on the app server to a higher number (although this doesn't seem to affect the time it takes for the debugger to return).
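    One concrete "hidden character" suspect when a hand-typed copy of a file works but the exported one does not is a UTF-8 byte order mark, which some export paths prepend even though editors and XML validators show the file as clean. A small sketch (illustrative, not tied to Crystal's actual export code) that checks a file's first three bytes:

    ```java
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;

    public class BomCheck {
        // Returns true if the file starts with the UTF-8 byte order mark
        // (EF BB BF), which strict XML consumers may choke on even though
        // the file displays identically in an editor.
        static boolean hasUtf8Bom(String path) throws IOException {
            try (FileInputStream in = new FileInputStream(path)) {
                byte[] head = new byte[3];
                int n = in.read(head);
                return n == 3
                    && (head[0] & 0xFF) == 0xEF
                    && (head[1] & 0xFF) == 0xBB
                    && (head[2] & 0xFF) == 0xBF;
            }
        }

        public static void main(String[] args) throws IOException {
            // Self-contained demo: write a BOM-prefixed file and detect it.
            File f = File.createTempFile("bom", ".xml");
            try (FileOutputStream os = new FileOutputStream(f)) {
                os.write(new byte[]{(byte) 0xEF, (byte) 0xBB, (byte) 0xBF,
                                    '<', 'a', '/', '>'});
            }
            System.out.println(hasUtf8Bom(f.getPath())); // true
            f.delete();
        }
    }
    ```

    Comparing the raw bytes of the Crystal-exported file and the hand-typed NotePad copy this way (or in any hex editor) would confirm or rule out the theory quickly.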

  • Use installshield to create setup file from mvc web service

    I need help from gurus :)
    I have an MVC web service and I need to create a setup package with InstallShield, but I don't know how to do it.
    I want to create a setup to install it on another machine (under IIS).
    Can someone help me please?
    Thanks in advance.

    Hello,
    Welcome to MSDN forum.
    Your issue is outside the support range of the VS General Question forum, which mainly discusses
    the usage of the Visual Studio IDE, such as the WPF & SL designer, the Visual Studio Guidance Automation Toolkit, Developer Documentation and Help System,
    and the Visual Studio Editor.
    Because you want to use InstallShield, which we don't provide support for, I suggest you ask on the InstallShield forums for better support.
    Thanks,

  • Is it possible to use JTrees to create a File Manager??

    I'm wondering how I can use JTrees to view my entire file system.
    I want to create a file manager app where you can navigate your file system like you would in Windows Explorer.
    Thanks

    You can; in fact, I once did it.
    All you have to do is define a root node class which has all the directory roots as its children,
    and another node class that represents a file or a folder on disk.
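    That design can be sketched with Swing's stock tree classes; a minimal, illustrative example (class and label names are made up here) that builds a shallow JTree over the filesystem roots. A real file manager would populate children lazily on node expansion instead of scanning up front:

    ```java
    import java.io.File;
    import javax.swing.JFrame;
    import javax.swing.JScrollPane;
    import javax.swing.JTree;
    import javax.swing.tree.DefaultMutableTreeNode;

    public class FileTreeDemo {
        // One tree node per file or folder. Drive roots like "C:\" have an
        // empty getName(), so fall back to the full path for the label.
        static DefaultMutableTreeNode buildNode(File f, int depth) {
            String label = f.getName().isEmpty() ? f.getPath() : f.getName();
            DefaultMutableTreeNode node = new DefaultMutableTreeNode(label);
            File[] children = f.listFiles();
            if (depth > 0 && children != null) {
                for (File child : children) {
                    node.add(buildNode(child, depth - 1));
                }
            }
            return node;
        }

        public static void main(String[] args) {
            DefaultMutableTreeNode root = new DefaultMutableTreeNode("My Computer");
            for (File r : File.listRoots()) {
                root.add(buildNode(r, 1)); // shallow scan; expand lazily in a real app
            }
            JFrame frame = new JFrame("File Manager");
            frame.add(new JScrollPane(new JTree(root)));
            frame.setSize(400, 500);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        }
    }
    ```

    The depth limit matters: recursing over the whole disk eagerly would take minutes and exhaust memory, which is why real file managers hang a TreeWillExpandListener on the tree and build children on demand.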

  • Error while trying to use Export to create dump-file....................

    I am trying to export my portal page group plus other objects under it (pages etc.), and followed the instructions listed in:
    http://download-uk.oracle.com/docs/cd/B14099_11/portal.1012/b14037/cg_imex.htm#i1030999
    but when I run the .cmd script generated through portal after export I get the following error:
    Verifying the environment variables...
    Verifying the Oracle Client version...
    export Mode Selected
    Verifying the portal schema passed...
    ERROR:
    ORA-01017: invalid username/password; logon denied
    SP2-0751: Unable to connect to Oracle.  Exiting SQL*Plus
    Error: Could not connect. Invalid Portal Schema, Password or Connect String
    Export/Import aborted.
    Even though I successfully logged on as a portal user through SQL*Plus, JDeveloper and the EM console, and I give the same username/password when I run the script, it keeps giving me the above error. Here is the export command I am using:
    C:\Export>exportscript.cmd -mode export -s PORTAL -p <portal_password> -pu <portal_username>
    -pp <portal_usernamepassword> -company <companyname> -c ORCL -d <dumpdfilename>
    here ORCL is the connect_string.
    Any help is appreciated.
    Thanks

    I got it working, so please disregard; the PORTAL password wasn't correct. I retrieved the password using the following:
    C:\Export>ldapsearch -h localhost -p 389 -D "cn=orcladmin" -w <password> -b "cn=IAS Infrastructure Databases,cn=IAS,cn=Products,cn=OracleContext" -s sub "orclResourcename=portal"
    and used the value of orclpasswordattribute as the portal password, and it worked fine.
    Thanks

  • Disconnected Use(with external drive)= Corrupted file structure

    Help. Really like the way LR handles RAW. Big problem for me is that my Mac Book Pro HD is too full. I'm using an EHD from OWC(FW800). If I launch LR with the drive unmounted LR starts verifying files and pulling all sorts of garbage into the LR Folder panel. If I reconnect the drive there is no repair and only a fraction of the files show up with their terminal folder- the folder structure on the EHD is flattened to useless. Fatal flaw? or my error?
    Thanks,
    Steve

    Hi Dan, happy to try and help. It has happened under 2 scenarios; the lrdb files are at MacHD/users/srp/pictures/LightRoom/...irdb in both cases.
    Scenario 1: I had a folder with some duplicates in MAC HD/users/...pictures/My Photos/duplicate files. When I would dismount the EHD and launch LR it would pull the dups on the Mac HD into the DB and reference what looked to be temps for missing files as well with fairly random preview distribution. The EHD file structure is OWC/users/srp/pictures/My Photos/files by date.
    Scenario 2: I ditched the duplicate folders on the MAC HD (they were not referenced in LR before disconnect anyway) and launched LR. It initiates checking for files and photos again and pulled random preview references to odd files replacing the original folder structure. In this case there were no valid references to pull so it was mostly garbage, it did preserve a few "?" files.
    Folders Panels: http://aktlr.smugmug.com/gallery/2668872#141260988.
    I've included grabs of scenarios 1 & 2 and an example of a garbage path.
    My work around is that I delete the lrdb and replace it with a clean copy, better than re-importing everything.

  • Using LaCie external for chosen files

    I've been running TM for about a year and, to be honest, I'm not a big fan, only because I have an older iMac from 2005, and TM eats up a lot of memory and power, making my Mac virtually useless every hour when it backs up. I tried using one of those TM editors, but there was a bug that made it run out of control early one morning, frying my hard drive, so no more of that!
    I just want to back up my most important stuff: iTunes library, iPhoto, Mail, and maybe the folder that houses my most recent resume, and I know I don't need to run TM for that. I'm just not sure how I can use my LaCie 500GB without using TM; how can I bypass TM? And how would I set up my LaCie?
    Or is it just time for a newer more powerful Mac?
    Any help would be appreciated.

    ibroker wrote:
    Nothing else on my LaCie except for TM. Using Firewire for my LaCie. From what it sounds like, is that TM is just doing it's job & the 1 minute for normal & several minutes for larger file backups is normal. I tend to forget that anytime you download anything it's going to take longer to backup & I tend to download a lot of music & app's from iTunes.
    True, but it shouldn't take very long or a whole lot of CPU, especially via FireWire. F/W is somewhat faster than USB, and takes less of your Mac's CPU, as the F/W chipset does part of the work.
    So if the backups are slowing things down appreciably, do try the things in #D2.
    If you want to think about alternatives, see Kappy's post on Basic Backup, complete with links to the web sites of each product.
    Like many folks here, I use Time Machine as my primary backups, plus a bootable "clone." I use CarbonCopyCloner (listed in Kappy's post), SuperDuper is similar.
    The clones have advantages and disadvantages vs. Time Machine. They take much longer and more CPU to do an incremental backup, so most folks run them daily, or even weekly. I have my Mac scheduled to wake up early each morning, and a CCC "update" runs shortly thereafter, while I'm still snoozing. So I don't notice it at all!
    The downside is, of course, it doesn't have the hourly incremental backups, so I can't recover a previous version of something I've changed or deleted in error, or that has somehow been corrupted.
    If you want to stop TM altogether, you can turn it off and erase the LaCie with Disk Utility, then start with another app.
    Or you could turn TM off, partition the LaCie, if there's enough room, and put a clone in the new partition (it should be the same size as your internal HD, or 20% larger than the data on your internal HD if that's smaller). See #6 in the Frequently Asked Questions User Tip, also at the top of this forum.
    Then you could also run the occasional TM backup via Back Up Now from the TM icon in your menubar, or just keep the TM backups for a while until you're comfy with the clone.
    As Kappy mentions, CCC is donationware, so you can try it for a while before sending them some $$ so they can keep it up to date. SD has a free version, but it can't be scheduled to run automatically, and will only do a full replacement. You'd need the paid version (about $30) for that.
