Scripts to update Mapping files & Lookup files for OBIA 7.9.6.2/7.9.6.3

Hi, I would like to know which mapping files and lookup files are mandatory for OBIA Financial Analytics when implementing OBIA against an Oracle EBS R12.1.3 source system.
Is there any document that lists these files and explains why each one is needed during configuration?
Thanks & Regards,
VJ

Hi gurus,
Does anybody have an idea how to reconcile the OBIA 7.9.6.2 mapping files and lookup files, i.e. verify that their data is in line with the source Oracle EBS R12.1.3 system?
Regards,
VJ
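
One low-tech way to start the reconciliation is to pull the distinct values straight out of EBS and diff them against the CSV. The sketch below is only an illustration: it assumes SQL*Plus access to the EBS database, that SEGMENT3 of GL_CODE_COMBINATIONS is your natural account segment, and that the file being checked is the group account code file (e.g. file_group_acct_codes_ora.csv, with FROM ACCT assumed to be the second column); adjust the connection string, segment column and file layout to your own configuration.

    #!/bin/bash
    # Sketch only: compare natural account values in EBS with the accounts
    # referenced in an OBIA source file. All names below are assumptions.
    sqlplus -s apps/password@EBS <<'EOF' | sort -u > ebs_accounts.txt
    set heading off feedback off pagesize 0
    select distinct segment3 from gl_code_combinations;
    EOF
    # accounts referenced in the configuration file (assumed to be field 2)
    cut -d',' -f2 file_group_acct_codes_ora.csv | sort -u > csv_accounts.txt
    # accounts present in EBS but missing from the configuration file
    comm -23 ebs_accounts.txt csv_accounts.txt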

Similar Messages

  • One to one mapping question -- can I just map a lookup field for queries?

    I have a table with a state code. I'd like to have a "virtual lookup" on the Java class to a region table. I.e., this Java class "studies" has a state code. I can map a one-to-one to the descriptor class that has the ref table, but I'd like to have a property in the Java class pre-mapped to the "region" field in this lookup so that for querying I can just use:
    ReadAllQuery q = new ReadAllQuery();
    q.setReferenceClass(study.class);
    q.setSelectionCriteria(new ExpressionBuilder().get("Region").equal(Region));
    to get all the studies for a particular region.
    Am I going about this wrong? Or do I have to get the reference table descriptor from the one-to-one map and do something different?

    Well, on the way home last night, I realized I'd had a complete brain fart. I've done this before. I just set up a 1-1 mapping between descriptors and then built the query like this:
    q.setSelectionCriteria(new ExpressionBuilder().get("refFPO").get("FPO_NO").equal(FPO));
    You can get the pointer to "refFPO", which is the descriptor mapped 1-1, then append the column you wish to get.
    I still went ahead and amended my class to include a read-only, non-TopLink-mapped attribute "FPO" which just returns refFPO.getFPO_NO().
    I answered my own question, just in case anyone wondered...

  • Is it possible to run one ud script to update parms for multiple servers...

    Is it possible to run one ud script to update certain parameters in the MIB for multiple servers by giving multiple occurrences of the parameter and server id? I tried a ud script as follows, and it seems to update the parameter for only the first server.
    SRVCNM .MIB
    TA_CLASS T_SERVER
    TA_OPERATION SET
    TA_SRVID 101
    TA_SRVID 102
    TA_SRVID 103
    TA_CLOPT -A -r -e srv1.err --
    TA_CLOPT -A -r -e srv2.err --
    TA_CLOPT -A -r -e srv3.err --

    From the ud's output, it looks like it used only one occurrence of the fields that
    I provided.
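    Since the SET appears to use only the first occurrence of each field, one workaround is to feed ud one buffer per server instead of stacking all three servers in a single buffer. The loop below is only a sketch: it assumes ud accepts the fielded buffer on standard input (keep whatever separator between field name and value your original script uses), and that only TA_SRVID and the error file name differ per server.
    #!/bin/sh
    # Sketch: run one SET per server id rather than multiple
    # TA_SRVID/TA_CLOPT occurrences in one buffer.
    for id in 101 102 103; do
        ud <<EOF
    SRVCNM .MIB
    TA_CLASS T_SERVER
    TA_OPERATION SET
    TA_SRVID $id
    TA_CLOPT -A -r -e srv$id.err --
    EOF
    done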
    "james mathew" <[email protected]> wrote:
    >
    Is it possible to run one ud script to update certain parameters in mib
    for multiple
    servers by giving multiple occurrences of the parameter and server id.
    I tried
    a ud script as follows and it seem to update the parameter for only the
    first
    server.
    SRVCNM .MIB
    TA_CLASS T_SERVER
    TA_OPERATION SET
    TA_SRVID 101
    TA_SRVID 102
    TA_SRVID 103
    TA_CLOPT -A -r -e srv1.err --
    TA_CLOPT -A -r -e srv2.err --
    TA_CLOPT -A -r -e srv3.err --

  • Update mapping of a node

    I have a few nodes in my component which correspond to some of the function modules. These functions are meant for reading data from a table and saving data into it.
    My mappings are working perfectly. But now I need to add another field to the concerned database table. Thus, these nodes need to be updated. Is it possible to update the mappings for these nodes without creating the service calls all over again?
    Also note that the node for the function module that fetches data is bound to an ALV grid.
    The 'create attribute' and 'update mapping' options are disabled for this node in the component controller.
    Is it possible to update the mappings without doing too much rework?
    regards,
    Priyank

    Hi Priyank.
    The nodes of my service calls are enabled for editing in the component controller, and I can add attributes. Make sure you are in edit mode; then you can add the field to the corresponding node.
    After that you can update the mapping in corresponding views if necessary.
    Cheers,
    Sascha

  • Create key mapping using import manager for lookup table FROM EXCEL file

    hello,
    I would like to create key mapping while importing the values via an Excel file.
    The source file contains the key, but how do I map it to the lookup table?
    The properties of the table have key mapping enabled, but during the mapping in Import Manager I can't find any way to map the key.
    e.g.
    the lookup table contains:
    Material Group
    Code
    the Excel file contains:
    MatGroup1  Code   System
    Thanks!
    Shanti

    Hi Shanti,
    Assuming you have already taken care of the points listed below:
    1) Key Mapping set to "Yes" for your lookup table in MDM Console
    2) Created a new remote system in MDM Console
    3) Proper rights for your account to update the remote key values in Data Manager through Import Manager.
    Your sample file can have Material Group and Code alone, which can be exported from Data Manager by File -> Export To -> Excel if you already have data in Data Manager.
    Open your sample file in Import Manager, selecting the remote system for which you want to import the key mapping
    (do not select MDM as the remote system, which does not allow you to maintain key mapping values) and the file type Excel.
    Now select your source and destination tables; under the destination fields you will see a new field called [Remote Key].
    Map your source and destination fields correspondingly, then clone your source field Code by right-clicking on Code in the source hierarchy, and map it to [Remote Key] if you want the code to be in the remote key values.
    In the matching criteria, select the destination field Code as a matching field and change the default import action to Update NULL Fields or Update Mapped Fields as required.
    After a successful import you can check the remote key values in Data Manager.
    Hope this helps
    Thanks
    Sowseel

  • Lil' script to update adblockfilter, adblocking via /etc/hosts file

    Hi, I've recently changed to ad blocking via the hosts file (which works great, btw), but I was missing filter-set updating like in Firefox, so with my limited scripting skills I've created this one...
    # lil' script to update /etc/hosts adblock-filter
    #hosts adblock filter taken from this site...
    wget --directory-prefix=/tmp http://www.mvps.org/winhelp2002/hosts.txt
    #Backup /etc/hosts to /tmp
    cp /etc/hosts /tmp
    #standard static hosts file
    echo '# /etc/hosts: static lookup table for host names' > /etc/hosts
    echo '#' >> /etc/hosts
    echo '#<ip-address> <hostname.domain.org> <hostname>' >> /etc/hosts
    echo '127.0.0.1 localhost.localdomain localhost' >> /etc/hosts
    #add custom static host configuration here
    echo ' ' >> /etc/hosts
    echo '###Ad-Blocking###' >> /etc/hosts
    cat /tmp/hosts.txt >> /etc/hosts
    echo '# End of file' >> /etc/hosts
    rm /tmp/hosts.txt
    enjoy!
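    The script rewrites /etc/hosts, so it has to run as root. If you want the filter refreshed automatically, a root crontab entry along these lines would do (path and schedule are just examples):
    # refresh the ad-blocking hosts file every Sunday at 04:00
    0 4 * * 0 /usr/local/bin/update-adblock-hosts.sh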

    hosts_update
    #!/bin/bash
    # 2012 Ontobelli for this script
    # make persistent hosts cache directory
    HOSTSDIR=~/.hostsupdate
    mkdir -p "${HOSTSDIR}"
    # make temporary directory
    TMPDIR=/tmp/hostsupdate
    mkdir -p "${TMPDIR}"
    # set output file
    OUTPUTFILE="${TMPDIR}/hosts"
    # set temporary file
    TMPFILE="${TMPDIR}/tmpfile"
    if [ ! -f "${HOSTSDIR}/hosts.local" ]; then
    echo "You need to create "${HOSTSDIR}"/hosts.local containing the hosts you wish to keep!"
    exit 0
    fi
    # download the mvps.org hosts file.
    wget -c -O "${HOSTSDIR}/hosts.mvps" "http://winhelp2002.mvps.org/hosts.txt"
    # download hpHOSTS
    wget -c -O "${HOSTSDIR}/hosts.hphosts" "http://support.it-mate.co.uk/downloads/HOSTS.txt"
    # download hpHOSTS Partial
    wget -c -O "${HOSTSDIR}/hosts.partial" "http://hosts-file.net/hphosts-partial.asp"
    # download hpHOSTS ad/tracking servers
    wget -c -O "${HOSTSDIR}/hosts.adservers" "http://hosts-file.net/ad_servers.asp"
    # download the pgl.yoyo.org hosts Peter Lowe - AdServers
    wget -c -O "${HOSTSDIR}/hosts.yoyo" "http://pgl.yoyo.org/as/serverlist.php?hostformat=hosts&showintro=0&mimetype=plaintext"
    # download SysCtl Cameleon hosts
    wget -c -O "${HOSTSDIR}/hosts.sysctl" "http://sysctl.org/cameleon/hosts"
    # cat entries in a single file
    cat "${HOSTSDIR}/hosts.mvps" > "${TMPFILE}0"
    cat "${HOSTSDIR}/hosts.hphosts" >> "${TMPFILE}0"
    cat "${HOSTSDIR}/hosts.partial" >> "${TMPFILE}0"
    cat "${HOSTSDIR}/hosts.adservers" >> "${TMPFILE}0"
    cat "${HOSTSDIR}/hosts.yoyo" >> "${TMPFILE}0"
    cat "${HOSTSDIR}/hosts.sysctl" >> "${TMPFILE}0"
    # tabs to spaces
    sed -e 's/\t/ /g' "${TMPFILE}0" > "${TMPFILE}1"
    # find relevant lines without comments
    grep ^127.0.0.1 "${TMPFILE}1" > "${TMPFILE}2"
    # remove duplicate spaces
    cat "${TMPFILE}2" | tr -s [:space:] > "${TMPFILE}3"
    # remove carriage returns
    cat "${TMPFILE}3" | tr -d "\r" > "${TMPFILE}4"
    # 0.0.0.0 is nicer than constantly knocking on localhosts' door.
    sed -e 's/127.0.0.1 /0.0.0.0 /g' "${TMPFILE}4" > "${TMPFILE}5"
    # remove inline comments
    cut -d ' ' -f -2 "${TMPFILE}5" > "${TMPFILE}6"
    # sort blocklist entries and remove duplicates
    sort "${TMPFILE}6" | uniq > "${TMPFILE}7"
    # remove unneeded blocked sites
    grep -Ev ' dl.dropbox.com| host_you_want_to_whitelist' "${TMPFILE}7" > "${TMPFILE}9"
    # write the user's hosts.local to head, then the blacklists
    cat "${HOSTSDIR}"/hosts.local > "${OUTPUTFILE}"
    cat "${TMPFILE}9" >> "${OUTPUTFILE}"
    echo -e "# end of file" >> "${OUTPUTFILE}"
    # move to /etc/hosts
    mv "${OUTPUTFILE}" /etc/hosts
    # delete temporary directory
    rm -r -f "${TMPDIR}"
    hosts.local
    # /etc/hosts: static lookup table for host names
    #<ip> <hostname.domain.org> <hostname>
    127.0.0.1 localhost.localdomain localhost YOURHOSTSNAMEHERE
    ::1 localhost.localdomain localhost YOURHOSTSNAMEHERE
    # YOUR PERSONAL list
    # blocked list
    Create an alias in your ~/.bashrc
    alias hu='sudo /root/.hostsupdate/hosts_update'
    Run
    # hu <enter>
    The script and cache must be located in /root/.hostsupdate, or modify the script accordingly.
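    A couple of quick sanity checks after a run (the hostname below is just an example of an entry from the block lists):
    # count how many hosts ended up blocked
    grep -c '^0\.0\.0\.0' /etc/hosts
    # a host that appears in the block lists should now resolve to 0.0.0.0
    getent hosts ads.example.com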
    Cheers.

  • Lookup issue for File to Idoc scenario-- Urgent Pls.

    Hi All,
    I am doing a File to IDoc scenario with one file --> any of 3 R/3 systems.
    Routing has to be done dynamically using a lookup file, based on the incoming GLN code and last character of the Order Reference number.
    The last character of the Order ref Num will have A,B or C.
    Where,  ' A'  for  R/3 152 Client
                 ' B' for  R/3 142 Client
                 'C'  for  R/3  132 Client.
    1. PlantGLN_Routing lookup file :
    DestinationSystem, Plant GLN Code, Partner Number
    A, 5000243000473, GDKDVRC152
    B, 5000243000473, GDKDVRC142
    C, 5000243000473, GDKDVRC132
    A, 500034000487, GDKDVRC152
    B, 500034000487, GDKDVRC142
    C, 500034000487, GDKDVRC132
    Using DestinationSystem and Plant GLN Code as the lookup key, I need to get the value of the partner system.
    2.  IdocCtrlLookup
    Purpose: To read the IdocCtrlLookup file to populate the Idoc control segment
    Now based on the partner system from previous table --> I need to get the details of Sender Port , Sender Partner Type , Sender Partner Function , Sender Partner Number , Rx Port , Rx Partner Type , Rx Partner Function , Rx Partner Number.
    Please help me understand how I can achieve this scenario with the help of lookups.
    It's very urgent.
    Regards
    Krupakar.

    Hi,
    Here you would have to use one mapping for dynamic routing; based on this, create an interface mapping and use it in the receiver determination.
    The other mapping is the general file-to-IDoc mapping.
    See the dynamic routing link here:
    Dynamic Configuration of Some Communication Channel Parameters using Message Mapping -
    /people/william.li/blog/2006/04/18/dynamic-configuration-of-some-communication-channel-parameters-using-message-mapping
    Also, if you know the field name, a context object can be defined for the partner number, which can then be used in a condition in the receiver determination to route to the particular partner number.
    Regards
    Chilla

  • Mapping in Transformation file for loading infoprovider

    Mapping in transformation file for load from infoprovider:
    The requirement is: if the BW account starts with 70XXXXXXX then use char1; if the BW account starts with 12XXXXXXX then use char2 in BPC dimension 2.
    So, in the transformation file for a load from an infoprovider we want for a dimension to use the data from a certain BW characteristic based on the characteristic Account.
    For example, if the account starts with 70, then for a certain BPC dimension "detail" the characteristic 0COUNTRY should be used; if the account starts with 12, characteristic X should be used, etc.
    The following works in the transformation file, but the issue is that we have to specify all the accounts individually (100+ accounts in the statement, which is not feasible):
    BPC_detail = *IF (BWACCOUNT = str(70000010) then 0COUNTRY;str(NO_DETAIL))
    Where BPC_detail is the dimension in BPC and BWACCOUNT is the characteristic in BW.
    The following statement does not work, and there is also no documentation available on how to do this:
    BPC_detail = *IF (BWACCOUNT(1:2) = str(70) then 0COUNTRY;str(NO_DETAIL))
    Is there a solution/statement that fulfills this requirement for the load of an infoprovider?
    ( so similar to what you can do with the load of a flat file like for example:  Entity=IF(col(1,1:1)=U then SEntity;*col(1,1:1)=Z then *col(1,3:6); *STR(ERR)) )
    Rgds

    Hi,
    Install process chain /CPMB/LOAD_INFOPROV_UI from BI Content as follows:
    1.Enter Tcode RSA1
    2. In the left navigation bar, click 'BI content'
    3. Select process chain and double click "Select Objects".
    4. Select the process chain /CPMB/LOAD_INFOPROV_UI.
    5. Click 'Transfer Selections' button.
    6. On the right pane, install objects from BI Content.
    7. Enter Tcode SE38.
    8. Input program name ujs_activate_content and click to run.
    9. Only select option 'Update DM Default Instructions'.
    10. Execute program.
    Hope it helps..
    Regards,
    Raju

  • Simple BASH script to update subversion files

    This is just a simple BASH script that will update all .svn files in a specified directory.  If an update fails, it will attempt to update all the subdirectories in the failed one, so as much will be updated as possible.  Theoretically, you should be able to supply this script with only your root directory ( / ), and all the .svn files on your computer will be updated.
    #! /bin/bash
    # Contributor: Dylon Edwards <[email protected]>
    # ================================
    # svnup: Updates subversion files.
    # ================================
    #  If the user supplies no arguments
    #+ then, update the current directory
    #+ else, update each of those specified
    [[ $# == 0 ]] \
        && dirs=($PWD) \
        || dirs=($@)
    # Update the target directories
    for target in ${dirs[@]}; do
        # If the target file contains a .svn file
        if [[ -d $target/.svn ]]; then
            # Update the target
            svn up $target || {
                # If the update fails, update each of its subdirectories
                for subdir in $( ls $target ); do
                    [[ -d $target/$subdir ]] &&
                        ( svnup $target/$subdir )
                done
            }
        # If the target file doesn't contain a .svn file
        else
            # Update each of its subdirectories
            for subdir in $( ls $target ); do
                [[ -d $target/$subdir ]] &&
                    ( svnup $target/$subdir )
            done;
        fi
    done
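    For the recursive calls (svnup $target/$subdir) to work, the script needs to be on your PATH under the name svnup. A quick way to try it out (paths are just examples, and ~/bin is assumed to be on your PATH):
    # install the script (assuming you saved it as svnup.sh) and run it
    install -m 755 svnup.sh ~/bin/svnup
    ~/bin/svnup ~/projects    # updates every Subversion working copy under ~/projects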

    Cerebral wrote:
    To filter out blank lines, you could just modify the awk command:
    ${exec awk '!/^$/ { print "-", $_ }' stuffigottado.txt}
    very nice; awk and grep: two commands that never cease to amaze me.

  • Cannot publish Flash updates: "Verification of file signature failed for file" (SCUP 2011, SCCM 2012 R2 and WSUS all on the same Windows Server 2012 machine)

    I am attempting to distribute Adobe Flash updates using SCUP 2011, SCCM 2012 R2, WSUS ver4 and Windows Server 2012.  Everything installs without error.  I have acquired a certificate for SCUP signing from the internal Enterprise CA.  I have
    verified the signing certificate has a 1024 bit key.  I have imported the certificate into the server's Trusted Publishers and Trusted Root CA stores for the computer.  When I attempt to publish a Flash update with Full content I receive the following
    error:
    2015-02-13 23:00:48.724 UTC Error Scup2011.21 Publisher.PublishPackage PublishPackage(): Operation Failed with Error: Verification of file signature failed for file:
    \\SCCM\UpdateServicesPackages\a2aa8ca4-3b96-4ad2-a508-67a6acbd78a4\3f82680a-9028-4048-ba53-85a4b4acfa12_1.cab
    I have redone the certificates three times with no luck.  I can import metadata, but any attempt to download content results in the verification error.
    TIA

    Hi Joyce,
    This is embarrassing, I used that very post as my guide when deploying my certificate templates, but failed to change the bit length to 2048.  Thank you for being my second set of eyes.
    I changed my certificate key bit length to 2048, deleted the old cert from all certificate stores, acquired a new signing cert, verified the key length was 2048, exported the new cert to pfx and cer files, imported it into my Trusted Publishers
    and Trusted Root Authorities stores, reconfigured SCUP to use the new pfx file, rebooted the server and attempted to re-publish the updates with the following results:
    2015-02-16 13:35:44.006 UTC Error Scup2011.4 Publisher.PublishPackage PublishPackage(): Operation Failed with Error: Verification of file signature failed for file:
    \\SCCM\UpdateServicesPackages\a2aa8ca4-3b96-4ad2-a508-67a6acbd78a4\3f82680a-9028-4048-ba53-85a4b4acfa12_1.cab.
    Is there a chance this content was already created and signed with the old cert, so installing the new cert has no effect? In ConfigMgr software updates I see 4 Flash updates, all marked Metadata Only (because they were originally published as "Automatic").
    No Flash updates in the ConfigMgr console are marked as downloaded.  I can't find any documentation on how the process of using SCUP for downloading content for an update marked Metadata Only actually works. 
    Comments and suggestions welcome.

  • Updated to Adobe Muse 2014 this morning; now publishing the updated site to Business Catalyst crashes every time

    Updated to Adobe Muse 2014 this morning and have worked in it for the last 8 hours. Now when I try to publish the updated site to Business Catalyst for my client to preview, it crashes every time. I have tried publishing altered files only, then tried the whole site again, and then tried publishing as a new site altogether. I thought I would then try to export as HTML in the hope of uploading the files via an FTP client, and Muse crashes and locks up again. I am extremely stressed about this as I am in the last few days of a website I have been working on since December with no issues. We are due to go live and my client needs to see it. I am desperate for an answer. It is not looking good. I am on an Apple Mac and have not had any issues publishing it for the last 6 months. Not very happy to say the least. Need desperate help.

    Hi Zak, I got onto Adobe Customer Care Live Chat this morning and gave them the error message. After some troubleshooting with them it appears the older archived file of the site still publishes. I have now reverted back to the old file, copied and pasted out of the new file, and with some ideas given to me by support I am now able to publish to Business Catalyst. It seems there was something corrupt within the new pages added yesterday. I have no idea if this would have still happened if I hadn't updated, but I am glad it wasn't a Muse-specific problem. I am loving using Muse and the support from Adobe has been excellent. Thanks everyone. By the way, I do love the new version, and apart from this hiccup that lost me a few hours, aged me some more and gave me grey hair, I really love Muse. Thanks again.

  • How do I import MapInfo TAB files into Spatial for a map of Europe?

    How do I import MapInfo TAB files into Spatial for a map of Europe via FME and have Oracle Spatial draw the map without problems?
    So far I've got to the stage where I can import the data, spatially index it (in Oracle 9i) and get my SVG (scalable vector graphics) application to view the map.
    The problem is that countries that have more than one polygon (more than one row in the database) do not draw properly.
    When I view the MapInfo TAB file in the FME viewer I can see that the data is fine, but I can also see that some of the polygons used to draw a country are donuts and some aren't.
    This seems to cause a problem when I import the data into oracle spatial as I don't know if a row in the table needs to be inserted as an independent SDO_GEOMETRY or if it should form part of a larger SDO_GEOMETRY (as in 2 or more rows make up the polygon shape for a country).
    I have a feeling that I'm not using FME correctly, because at the moment I have to import the tab file into Oracle then re-insert the data into a spatially formatted table (one with a spatial index) - I get the impression that FME should do that for me but as I'm new to this I don't really know.
    Any Help welcome :|
    Tim

    Tim,
    MapInfo has a free utility called EasyLoader that allows you to upload a table directly to Oracle. EasyLoader creates the geometries and spatial index. You can download it free from http://www.mapinfo.com/products/download.cfm?ProductID=1044
    Andy Greis
    CompuTech Inc.
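    If, after loading, a country still spans several rows and needs to be drawn as one shape, the rows can be merged into a single multipolygon with Oracle's SDO_AGGR_UNION aggregate. A rough sketch, assuming a staging table EUROPE_RAW with columns COUNTRY and GEOM and a 0.005 tolerance (table, column and connection names are placeholders):
    # merge the per-country polygon rows into one geometry per country (sketch only)
    sqlplus -s scott/tiger@orcl <<'EOF'
    CREATE TABLE europe_countries AS
    SELECT country,
           SDO_AGGR_UNION(SDOAGGRTYPE(geom, 0.005)) AS geom
    FROM   europe_raw
    GROUP BY country;
    EOF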

  • Error: 1:n multi-mapping using BPM for file to file scenario

    Hi. I'm trying to do 1:n multi-mapping using a BPM scenario, file to file. The input file consists of many records, and I want the records to be transformed into many files at the target system. I followed the steps in http://www.riyaz.net/blog/xipi-1n-multi-mapping-using-bpm/. However, I have a problem when the file is retrieved into XI: it doesn't create any output file.
    When I look at SXMB_MONI, it shows "No object type found for the message. Check that the corresponding process is activated." Besides, it shows the error message "Unable to perform action for selected message" when I click on PE in SXMB_MONI.
    I have checked many previous posts with the same error message but still couldn't solve it. I have already activated my BPM and checked that the status in SXI_CACHE is 0. There is nothing left in the change lists of my IR and ID.
    Here is my design and configuration.
    IR
    Data Type: DT_file_split -> for both input and output file
    Message Type: MT_file_split_sender, MT_file_split_receiver
    Message Interface: SI_file_split_in, SI_file_split_out, SI_file_abs_source, SI_file_abs_target
    Message mapping: MM_file_split for mapping MT_file_split_sender with MT_file_split_receiver
    Interface mapping: OM_file_split
    •     Source = SI_file_split_out
    •     Target = SI_file_split_in
    •     Mapping Program = MM_file_split
    BPM following this link http://www.riyaz.net/blog/xipi-1n-multi-mapping-using-bpm/.
    ID
    Import my Integration process
    2 Communication Channel for getting input file (CC_File_split_sender) and creating output file (CC_File_split)
    2 Receiver determination:
    •     Source system to BPM using interface SI_File_Abs_source
    •     BPM to target system using interface SI_File_Abs_target
    1 Interface determination:
    •     from source system to BPM
    •     Sender interface: SI_File_Split_Out
    •     Receiver interface: SI_file_abs_source
    1 Sender Agreement
    •     Communication Component: Source System
    •     Using interface: SI_File_Abs_source
    •     Sender Communication Channel: CC_File_Split_Sender
    1 Receiver Agreement
    •     Sender Communication Component: BPM
    •     Receiver Communication Component: Target System
    •     Receiver Interface: SI_File_Split_In
    •     Receiver Communication Channel: CC_File_Split
    Anyone know how to fix this?
    Thanks,
    Pavin

    Hi,
    Yes, that's the problem.
    You are creating the file from the test tab of the 1..N mapping.
    In the case of a 1..N mapping, extra message tags are added to the data, as shown here:
    Messages
          Message1
               MessageType
    When you use this mapping to generate the XML message, it will add the additional <Messages> and <Message1> tags, which is not correct; the file should only have the structure of your MT.
    So remove the start and end tags of <Messages> and <Message1> from your data file, as shown below.
    <xml......>
    <Messages>
    <Message1>
    <MT_...>
    </MT_...>
    </Message1>
    </Messages>
    This should solve your problem.
    Regards,
    Sami.

  • Request Message mapping in SXMB_MONI for File - RFC - File without BPM

    Hi ,
    In my File-RFC-File scenario, the messages are processed successfully.
    But when I look into SXMB_MONI for the File to RFC step, the records are present only up to the 'Message Split According to Receiver List' step, and I cannot see the records from the 'Request Message Mapping' step onwards. It contains
    <?xml version="1.0" encoding="UTF-8" ?>
      <ns1:Z_PI_LOTUSNOTES_UNIFORM xmlns:ns1="urn:sap-com:document:sap:rfc:functions" />
    In my mapping for the receiver RFC I have not mapped all the fields; a few fields I have disabled. Could this create a problem in the message mapping? Please let me know what needs to be done.

    Is it only for this scenario that the above display issue is occurring, or is it for all scenarios?
    If only particular pipeline steps are displayed, check the TRACE level in SXMB_ADM --> Integration Engine Configuration and verify that the TRACE parameter is set to at least 2 (the maximum is 3, which will ensure that your DB fills up quickly).
    For more information refer: /people/michal.krawczyk2/blog/2005/05/10/xi-i-cannot-see-some-of-my-messages-in-the-sxmbmoni
    Regards,
    Abhishek.

  • File content conversion for multi-mapping occurrence 1:N

    Hi ALL,
    In my scenario I have used multi-mapping, i.e. 1:N.
    The sender is a file adapter and the receiver is a JMS adapter.
    My requirement is that I have to pick the file from the respective directory path through file content conversion.
    For example, below is the structure after the sender file adapter picks up the flat file through file content conversion:
    <?xml version="1.0" encoding="utf-8"?>
    <ns1:mt_BookingConfirmation xmlns:ns1="urn:agrp:ml">
         <Recordset>
             <BOOKING_REQCON_ROU>
               <RECORD_TYPE>ROU</RECORD_TYPE>
               <SENDER_ID>AS.MAN</SENDER_ID>
               <RECEIVER_ID>abcd.99018293</RECEIVER_ID>
            </BOOKING_REQCON_ROU>
         </Recordset>
    </ns1:mt_BookingConfirmation>
    But the structure of the source which I have in the mapping is as below; you can see the difference, and this difference appears automatically when we change the occurrence of the target structure.
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:Messages xmlns:ns0="http://sap.com/xi/XI/SplitAndMerge">
       <ns0:Message1>
          <ns1:mt_BookingConfirmation xmlns:ns1="urn:agrp:ml">
            <Recordset>
             <BOOKING_REQCON_ROU>
               <RECORD_TYPE>ROU</RECORD_TYPE>
               <SENDER_ID>AS.MAN</SENDER_ID>
               <RECEIVER_ID>abcd.99018293</RECEIVER_ID>
            </BOOKING_REQCON_ROU>
         </Recordset>
          </ns1:mt_BookingConfirmation>
       </ns0:Message1>
    </ns0:Messages>
    Because of this difference in the structure, the picked file cannot pass through the message mapping, and because of this an error is thrown in SXMB_MONI.
    If someone knows how to do the file content conversion for this type of different structure, please let me know ASAP.
    Thanks in advance...
    Best Regards,
    Aravind.Pujari

    Hi,
    Check Below links.
    File content conversion sites
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
    /people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
    /people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion
    /people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    /people/venkat.donela/blog/2005/06/08/how-to-send-a-flat-file-with-various-field-lengths-and-variable-substructures-to-xi-30
    /people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/michal.krawczyk2/blog/2004/12/15/how-to-send-a-flat-file-with-fixed-lengths-to-xi-30-using-a-central-file-adapter
    http://help.sap.com/saphelp_nw04/helpdata/en/d2/bab440c97f3716e10000000a155106/content.htm
    Regards,
    Phani
    Reward Points if Helpful
