OCI Catalog MM PO and PR

Hi,
Have integrated an OCI catalog to be called in ME21N and ME51N.
1. Only one catalog can be called, unlike the PM OCI integration where we can call multiple catalogs. How do we call multiple catalogs in ME21N and ME51N?
2. How can we map the catalog to an R/3 vendor number so that I don't need to add the vendor number manually after the data is transferred from the catalog to the PO or PR?
Regards

Hi
I just wanted to know first how you implemented an OCI catalog function in ME51N or ME21N.
I found two SAP notes which tell me that this "function is missing in the standard system."
Check Notes
1092922 - Using several catalogs in purchasing
1092923 - Catalog connection in purchasing
Really appreciate your answer
BR, Markus

Similar Messages

  • UI Patterns and OCI Catalogs

    Hi,
    I am trying to write a CAF application that uses the design patterns (Object Selector and Object Editor) to maintain my service.
    I would like to put a link onto the page that punches out to an OCI catalog, takes the data posted back, and puts it into the fields of my UI.
    Any suggestions would be appreciated.
    Paul

    Hi,
    regarding 1: please try to add the WD model of the CAF project to the Public Part. You can do this via the context menu when the model node is selected in the project hierarchy.
    After that, please rebuild the project and build the DC once again. Then you should be able to add the CAF's WD model to your WD component.
    However: if your IDE is at CAF 7.0 SPS8 patch 0, you will face another issue: the entities of the CAF project are not visible in the component context, so data binding will not be possible. In that case please use CAF 7.0 SPS8 patch 1 (available on SMP), or extend the public part of the WD model with an entity of type "Java folder tree" pointing to the folder containing the source code of the WD model.
    Regards,
      Jan

  • OCI Catalog for Vendor problem : Logical system for catalog not maintained

    We have set up an OCI catalog with one of our vendors. This works fine until we transfer shopping basket data from the vendor system into our SRM system (SRM 5.0). Then the data is not transferred and the vendor webshop screen remains open.
    In SLG1 I found the error "Logical system for catalog not maintained. Inform system administrator".
    Does anyone have an idea what the cause of this error can be or what we need to check?
    Thanks in advance

    Hi,
    I have not seen that error before. However, I think it may have to do with the LOGICAL_SYSTEM OCI field. Can you check whether your punch-out catalog vendor is populating that field? Also check that you have the correct BP number populated in the catalog configuration and that the LOGICAL SYSTEM field in "Define External Web Services" is blank.
    SG

  • Access a OCI Catalog from ABAP code

    Hi Experts,
    We may need to access an OCI catalog from ABAP code. We will need to punch out to the catalog in a browser window and somehow get the results from the browser back to the ABAP code.
    Any idea of how this can be accomplished?
    Thanks and regards,
    José Omar

    Accessing the catalog is nothing more than building a hook URL and directing the browser to it.
    For example, read the parameters from the SPRO settings in SRM and build the URL on the fly.
    For now you can also feed in hard-coded URLs.
    That is, you click on a link and fire the exit plug, as in Web Dynpro suspend mode.
    The main requirement is that the catalog is OCI 4.0 compliant (see the mandatory fields etc. in the OCI 4.0 documentation).
    Once you have shopped for some items in the catalog, the OCI-compliant catalog sends the data back in the required format.
    At the receiving end (PO etc.) you should have an ABAP method (at resume) that interprets the name/value pairs sent back by the catalog and then populates this data wherever you want.
    You can have a look at the SRM methods that do the same in ABAP.
    This is the 20,000-foot process overview of how it works. Hope this helps.
    thanks
    -Adrivit
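    For illustration only, here is a rough ABAP sketch of the two halves described above: assembling the punch-out URL from configured name/value parameters, and a possible target structure for the NEW_ITEM-* fields that come back. The parameter list, URLs and structures are made-up placeholders, not the standard SRM implementation; in a real system the parameters come from the web-service customizing and the catalog posts its data to the HOOK_URL as HTTP form fields.

    * Hedged sketch (ABAP 7.40+ syntax); all names, URLs and values are placeholders.
    TYPES: BEGIN OF ty_param,
             name  TYPE string,
             value TYPE string,
           END OF ty_param,
           tt_param TYPE STANDARD TABLE OF ty_param WITH EMPTY KEY.

    " Parameters that would normally be read from the web-service customizing
    DATA(lt_params) = VALUE tt_param(
      ( name = 'USERNAME' value = 'catalog_user' )
      ( name = 'PASSWORD' value = 'secret' )
      ( name = 'HOOK_URL' value = 'https://erp.example.com/oci_return' ) ).

    " Build the call URL on the fly (many catalogs expect a POST form rather than a GET)
    DATA(lv_url) = `https://catalog.example.com/punchout?`.
    LOOP AT lt_params INTO DATA(ls_param).
      lv_url = |{ lv_url }{ ls_param-name }={ cl_http_utility=>escape_url( ls_param-value ) }&|.
    ENDLOOP.

    " On resume, the catalog has posted NEW_ITEM-* name/value pairs to the HOOK_URL.
    " A simplified target structure for one returned item:
    TYPES: BEGIN OF ty_oci_item,
             description TYPE string,   " from NEW_ITEM-DESCRIPTION[n]
             quantity    TYPE string,   " from NEW_ITEM-QUANTITY[n]
             unit        TYPE string,   " from NEW_ITEM-UNIT[n]
             price       TYPE string,   " from NEW_ITEM-PRICE[n]
             vendor      TYPE string,   " from NEW_ITEM-VENDOR[n]
           END OF ty_oci_item.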

  • OCI Catalogs in MM

    Hi Guys,
    as we all know, since ECC 6.0 we can use OCI catalogs in MM.
    I now have a vendor who sent me his OCI interface data. I customized the data. Everything is fine.
    But I now have additional wishes:
    1. Integration of multiple catalogs - only the default catalog is shown, but there must be a way to do this because it is mentioned in SAP Note 1092922 - has anybody done this before?
    2. Is there any chance to influence the data from the OCI catalog via a BAdI or enhancement? The only BAdI I found is ME_CATALOG_INTERFACE, which is not accessible because it is declared SAP-internal.
    3. How can I use the fields "NEW_ITEM-CUST_FIELD1" to "NEW_ITEM-CUST_FIELD5"? Can I define a mapping anywhere?
    4. Has anybody here ever written a customer-specific transaction using the MM catalog API? Please let me know how it was done.
    Thanks in advance!
    Regards,
    Christian

    about Q2:
    The BAdI will be available to customers only from EHP4 on. The reason it was not delivered to customers was that the interface was not finalized and could possibly still change.
    If you want to implement it, you may do so at your own responsibility. To use this BAdI, execute transaction SE18 and select BAdI ME_CATALOG_INTERFACE. Uncheck the 'within SAP' checkbox and save. Then you should be able to create your implementation via SE19. Alternatively, change the internal flag directly in database table SXS_ATTR.
    This is of course considered a modification of the standard.
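    For orientation only, here is a skeleton of what such an implementation class could look like once the BAdI is usable in your release. The method name and parameter below are invented placeholders; the real interface IF_EX_ME_CATALOG_INTERFACE has to be looked up in SE18 before writing anything productive.

    * Hypothetical skeleton: the method name and signature are placeholders, not the real BAdI interface.
    CLASS zcl_im_catalog_interface DEFINITION PUBLIC FINAL CREATE PUBLIC.
      PUBLIC SECTION.
        " A real implementation class would instead declare:
        " INTERFACES if_ex_me_catalog_interface.
        METHODS map_inbound_item
          CHANGING ct_oci_fields TYPE string_table.  " placeholder for the inbound OCI name/value pairs
    ENDCLASS.

    CLASS zcl_im_catalog_interface IMPLEMENTATION.
      METHOD map_inbound_item.
        " Placeholder logic: read NEW_ITEM-CUST_FIELD1..5 from the inbound data and
        " move the values into fields that ME21N/ME51N understand (e.g. derive the vendor).
      ENDMETHOD.
    ENDCLASS.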

  • Troubleshooting javascript error in OCI catalog

    Hi experts.
    I have created an OCI catalog for a customer.
    The customer is using SAP ERP ECC 6.0 EHP4 and IE11.
    When the customer launches the OCI catalog, the shopping site appears and everything is great,
    but when the customer presses a certain button in the catalog, nothing happens.
    I suspect that this might be due to a JavaScript error in the code.
    However, the browser in this system has no developer toolbar and every menu option
    is overridden; the F12 key maps to Escape, etc.
    So basically what I want to know...
    Is there a way to debug a JavaScript error in this environment?
    By the way
    When I run the code in a regular browser it works.
    I have tested IE7 - IE11
    FF 36.0.1
    Chrome 40.0.2214.115 m
    with regards
    /Jocke

  • PM to SRM integration through OCI catalog

    Hi,
    We have successfully set up the necessary settings in ECC 6.0 and SRM for OCI catalog integration in transaction IW31 (Create Order).
    I can see the PR number in table EPRTRANS, and when running program BBP_EXTREQ_TRANSFER the PR is transferred to SRM. However, when looking in transaction SLG1 I am getting an error message saying that tax code 00 does not exist ("Check entries").
    Regards,
    Gary

    Hi Yousif,
    Why don't you save the search as a named search in MDM and then pass it on to the SRM-MDM catalog?
    Save the records as a named search in MDM Data Manager.
    For example:
    Named search name: search1
    Now, while passing the parameters to the SRM-MDM catalog, specify the named search in its URL:
    http://J2EESERVER:PORTNO/SRM-MDM/SRM_MDM?sap-locale=EN&HOOK_URL=&mask=&namedSearch=search1&username=Admin&password=PASSWORD&catalog=REPOSITORYNAME&server=MDMSERVERIPADDRESS&datalanguage=EN
    After passing the named search in the hook URL, you will only see the records that were saved in that search in Data Manager.
    Hope this helps.
    Regards, Tejas
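    If the catalog call URL is assembled in code rather than maintained directly in the web-service customizing, the named search is simply one more query parameter. A minimal ABAP sketch using the same placeholder values as the URL above (host, port, repository and credentials are not real):

    * Sketch only: all host, repository and credential values below are placeholders.
    DATA(lv_catalog_url) =
      `http://J2EESERVER:PORTNO/SRM-MDM/SRM_MDM` &&
      `?sap-locale=EN` &&
      `&HOOK_URL=` &&                      " return URL, filled in by the calling application
      `&mask=` &&
      `&namedSearch=search1` &&            " the named search saved in MDM Data Manager
      `&username=Admin&password=PASSWORD` &&
      `&catalog=REPOSITORYNAME` &&
      `&server=MDMSERVERIPADDRESS` &&
      `&datalanguage=EN`.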

  • Configuring an OCI catalog for PM

    We are in the midst of implementing the use of various vendor catalogs with SAP PM. One feature in version 4.7 PM is the ability to link to an external OCI catalog.
    Have any of you created such a link, and do you have any configuration suggestions or guides? I am stuck on what is actually needed and haven't found much information.

    We hired a consultant that helped with the basic config.

  • Multiple OCI catalogs in MM

    Dear Community,
    in the standard, MM only supports connecting one OCI catalog for use in transactions ME21N and ME51N.
    I have heard rumors that there is a "tweak" or "modification" that enables the use of multiple catalogs in MM too.
    Does anybody know more details or have suggestions on how to carry out such a change in the system?
    Thanks and kind regards
    Julian Bradler
    PS: Happy new year 2009 to all of you!

    Hi,
    Could you tell me the SPRO path for the catalog settings used in ME21N and ME51N?
    Regards

  • Steps for Mapping OCI Catalog with R3 ISA (ERP E-Commerce)

    Hi Gurus,
    I just needed some help on how to go about integrating R3 ISA with OCI Catalog. Please provide reference to some relevant documentation if available.
    Thanks.
    Jai

    Technically, other than some [XCM settings and configuration|http://help.sap.com/saphelp_crm50/helpdata/en/2f/86653fac7ab21ae10000000a114084/content.htm] that define the external catalog URL, there isn't much needed to integrate an OCI-compliant catalog - provided you already have an OCI-compliant catalog.
    Start here on some technical information on OCI catalogs - [/people/masayuki.sekihara/blog/2007/12/07/oci-open-catalog-interface-setting-and-trouble-shooting|/people/masayuki.sekihara/blog/2007/12/07/oci-open-catalog-interface-setting-and-trouble-shooting]

  • Create OCI catalog for SRM using Web Dynpro

    Hi all,
    Have you ever implemented an OCI catalog for SRM using web-dynpro?
    Please provide details.
    Regards,
    PooYee

    Hi PooYee,
    the BSP application would run the WD catalog and communicate with SRM.
    In a certain way it would "wrap" the WD catalog application and enable access to the HTTP header.
    This of course is not a trivial implementation.
    Send me a mail and maybe I can give you some more ideas.
    regards, Ulli

  • OCI Catalog Limitations

    Dear SAP expert,
    Could you please share your experience with external/OCI catalogs regarding their limitations in the SAP CRM E-Commerce area?
    Should we expect difficulties with integration and pricing? Will it restrict the available functionality and views in the B2C scenario?
    Thanks in advance,
    Regards, Stephanie

    OCI catalog integration with CRM Web Channel is fairly simple and works fine. The Web Channel side of the OCI integration can be enhanced to include custom fields to be transferred from the OCI catalog to the Web Channel basket.
    The catalog item details and pricing are features of the external (OCI-compliant) catalog you are using, so you must check your external catalog for this.
    The user interface and navigation - how the user interacts with the catalog and orders items - is what might put you off. Remember, the external catalog is technically another web application that is NOT the Web Channel application; inter-application integration happens through the OCI (HTTP/XML) protocol.
    1. Matching the styles of the external catalog and your Web Channel application might sound easy, but it depends on the external catalog's ability to adapt to style changes.
    2. The navigation from one application (catalog) to the other (Web Channel) is NOT what you would expect from today's slick UIs. Two applications are open while the user browses the catalog, "collects" the items there, and then transfers them to the Web Channel basket.
    I suggest you do a proof of concept and show the UI and navigation to the business users before you jump in.
    Also, explore whether you can use the external catalog via web services.
    Easwar Ram
    http://www.parxlns.com

  • OCI Catalog integration with R3 ISA (ERP E-Commerce)

    Hi Gurus,
    I just needed some help on how to go about integrating R3 ISA with OCI Catalog. Please provide reference to some relevant documentation if available.
    Thanks.
    Jai

    Materials ManagementPurchasingEnvironment Data
    Web Services: ID and Description
    Sequence
    no     Description/Name of the paramener.     Vale of the patameter for web service     Type
    (Select)
    10               0 URL.
    20     User name          2 fixed value
    30     password          2 fixed value
    40     SERVICE          2 fixed value
    50     HOOK_URL          4 return url
    60     Submit          2 fixed value

  • LR 4.4 (and 5.0?) catalog: a problem and some questions

    Introductory Remark
    After several years of reluctance, this March I changed to LR because of its retouching capabilities. Unfortunately – beyond enjoying some really nice features of LR – I keep struggling with several problems, many of which have been covered in this forum. In this thread I describe a problem with a particular LR 4.4 catalog and ask some general questions.
    A few days ago I upgraded to 5.0. Unfortunately it turned out to be even slower than 4.4 (discussed – among other places – here: http://forums.adobe.com/message/5454410#5454410), so I fell back to the latter instead of testing the behavior of the 5.0 catalog. Anyway, as far as I understand this upgrade does not include significant new catalog functions, so my problem and questions below may be valid for 5.0, too. Nevertheless, the incompatibility of the new and previous catalogs suggests a rewrite of the catalog-related parts of the code. I do not know the resulting potential improvements and/or new bugs in 5.0.
    For your information, my PC (running under Windows 7) has a 64-bit Intel Core i7-3770K processor, 16GB RAM, 240 GB SSD, as well as fast and large-capacity HDDs. My monitor has a resolution of 1920x1200.
    1. Problem with the catalog
    To tell you the truth, I do not understand the potential necessity for using the “File / Optimize Catalog” function. In my view LR should keep the catalog optimized without manual intervention.
    Nevertheless, when being faced with the ill-famed slowness of LR, I run this module. In addition, I always switch on the “Catalog Settings / General / Back up catalog” function. The actually set frequency of backing up depends on the circumstances – e.g. the number of RAW (in my case: NEF) files, the size of the catalog file (*.lrcat), and the space available on my SSD. In case of need I delete the oldest backup file to make space for the new one.
    Recently I processed 1500 photos, occupying 21 GB. The "Catalog Settings / Metadata / Automatically write changes into XMP" function was switched on. Unfortunately I had to fiddle with the images quite a lot, so after processing roughly half of them the catalog file reached the size of 24 GB. Until this stage there had been no sign of any failure – catalog optimizations had run smoothly and backups had been created regularly, as scheduled.
    Once, however, towards the end of generating the next backup, LR sent an error message saying that it had not been able to create the backup file due to a lack of space on the SSD. I myself still found 40 GB of empty space, so I re-launched the backup process. The result was the same, but this time I saw a mysterious new (journal?) file with a size of 40 GB… When my third attempt also failed, I had to decide what to do.
    Since I needed at least the XMP files with the results of my retouching operations, I simply wanted to save these side-cars into the directory of my original input NEF files on a HDD. Before making this step, I intended to check whether all modifications and adjustments had been stored in the XMP files.
    Unfortunately I was not aware of the realistic size of side-cars, associated with a certain volume of usage of the Spot Removal, Grad Filter, and Adjustment Brush functions. But as the time of the last modification of the XMP files (belonging to the recently retouched pictures) seemed perfect, I believed that all my actions had been saved. Although the "Automatically write changes into XMP" seemed to be working, in order to be on the safe side I selected all photos and ran the “Metadata / Save Metadata to File” function of the Library module. After this I copied the XMP files, deleted the corrupted catalog, created a new catalog, and imported the same NEF files together with the side-cars.
    When checking the photos, I was shocked: Only the first few hundred XMP files retained all my modifications. Roughly 3 weeks of work was completely lost… From that time on I regularly check the XMP files.
    Question 1: Have you collected any similar experience?
    2. The catalog-related part of my workflow
    Unless I am missing an important piece of knowledge, LR catalogs store a lot of data that I do not need in the long run. Having the history of recent retouching activities is useful for me only for a short while, so archiving every little step for a long time, with a huge amount of accumulated data, would be impossible (and useless) on my SSD. In terms of processing, what counts for me are the resulting XMP files, so in the long run I keep only them and get rid of the catalog.
    Out of the 240 GB of my SSD 110 GB is available for LR. Whenever I have new photos to retouch, I make the following steps:
    create a ‘temporary’ catalog on my SSD
    import the new pictures from my HDD into this temporary catalog
    select all imported pictures in the temporary catalog
    use the “File / Export as Catalog” function in order to copy the original NEF files onto the SSD and make them used by the ‘real’ (not temporary) new catalog
    use the “File / Open Catalog” function to re-launch LR with the new catalog
    switch on the "Automatically write changes into XMP" function of the new catalog
    delete the ‘temporary’ catalog to save space on the SSD
    retouch the pictures (while keeping an eye on the due creation and development of the XMP files)
    generate the required output (TIF or JPG) files
    copy the XMP and the output files into the original directory of the input NEF files on the HDD
    copy the whole catalog for interim archiving onto the HDD
    delete the catalog from the SSD
    upon making sure that the XMP files are all fine, delete the archived catalog from the HDD, too
    Question 2: If we put aside the issue of keeping the catalog for purposes other than saving each and every retouching step (which I address below), is there any simpler workflow to produce only the XMP files and save space on the SSD? For example, is it possible to create a new catalog on the SSD, copy the input NEF files into its directory, and re-launch LR 'automatically', all in one step?
    Question 3: If this is not the case, is there any third-party application that would ease the execution of the relevant parts of this workflow before and/or after the actual retouching of the pictures?
    Question 4: Is it possible to set general parameters for new catalogs? In my experience most settings of a new catalog (at least the ones that are important for me) are copied from the recently used catalog, except the "Catalog Settings / Metadata / Automatically write changes into XMP" function. This means that I always have to go there to switch it on… LR does not even ask whether I want to change anything compared with the settings of the recently used catalog…
    3. Catalog functions missing from my workflow
    Unfortunately the above described abandoning of catalogs has at least two serious drawbacks:
    I miss the classification features (rating, keywords, collections, etc.). Anyway, these functions would be really meaningful for me only if they covered all my existing photos, which would require going back to 41k images to classify them. In addition, keeping all the pictures in one catalog would result in an extremely large catalog file, almost surely guaranteeing regular failures. Moreover, due to the speed problem, tolerable performance could be achieved only by keeping the original NEF files on the SSD, which is out of the question. Generating several 'partial' catalogs could somewhat circumvent this trap, but it would require presorting the photos (e.g. by capture time or subject), and by doing so I would lose the essence of having a single catalog covering all my photos.
    Question 5: Is it the right assumption that storing only some parts (e.g. the classification-related data) of catalog files is impossible? My understanding is that either I keep the whole catalog file (with the outdated historical data of all my ‘ancient’ actions) or abandon it.
    Question 6: If such 'cherry-picking' is facilitated after all: Can you suggest any pragmatic description of the potential (competing) ways of categorizing images efficiently, comparing their pros and cons?
    I also lose the virtual copies. Anyway, I am confused regarding the actual storage of the retouching-related data of virtual copies. On some websites one can find relatively old posts stating that the XMP file contains all information about modifying/adjusting both the original photo and its virtual copy/copies. However, when fiddling with a virtual copy I cannot see any change in the size of the associated XMP file. In addition, when I copy the original NEF file and its XMP file, rename them, and import these derivative files, only the retouched original image comes up – I cannot see any virtual copy. This suggests that the XMP file does not contain information on the virtual copy/copies…
    For this reason, whenever multiple versions seem reasonable, I create renamed version(s) of the same NEF+XMP files, import them, and make some changes in their settings. I know, this is far from a sophisticated solution…
    Question 7: Where and how are the settings of virtual copies stored?
    Question 8: Is it possible to generate separate XMP files for both the originally retouched image and its virtual copy/copies and to make them recognized by LR when importing them into a new catalog?

    A part of my problems may be caused by selecting LR for a challenging private project, where image retouching activities result in bigger than average volume of adjustment data. Consequently, the catalog file becomes huge and vulnerable.
    While I understand that something has gone wrong for you, causing Lightroom to be slow and unstable, I think you are combining many unrelated ideas into a single concept, and winding up with a mistaken idea. Just because your project is challenging does not mean Lightroom is unsuitable. A bigger than average volume of adjustment data will make the catalog larger (I don't know about "huge"), but I doubt that bigger by itself will make the catalog "vulnerable".
    The causes of instability and crashes may have NOTHING to do with catalog size. Of course, the cause MAY have everything to do with catalog size. I just don't think you are coming to the right conclusion, as in my experience size of catalog and stability issues are unrelated.
    2. I may be wrong, but in my experience the size of the RAW file may significantly blow up the amount of retouching-related data.
    Your experience is your experience, and my experience is different. I want to state clearly that you can have pretty big RAW files that have different content and not require significant amounts of retouching. It's not the size of the RAW that determines the amount of touchup, it is the content and the eye of the user. Furthermore, item 2 was related to image size, and now you have changed the meaning of number 2 from image size to the amount of retouching required. So, what is your point? Lots of retouching blows up the amount of retouching data that needs to be stored? Yeah, I agree.
    When creating the catalog for the 1500 NEF files (21 GB), the starting size of the catalog file was around 1 GB. This must have included all classification-related information (the meaningful part of which was practically nothing, since I had not used rating, classification, or collections). By the time of the crash half of the files had been processed, so the actual retouching-related data (that should have been converted properly into the XMP files) might have been only around 500 MB. Consequently, probably 22.5 GB out of the 24 GB of the catalog file contained historical information.
    I don't know exactly what you do to touch up your photos, and I can't imagine how you came up with the estimate that the size should be around 500 MB. But again, to you this problem is entirely caused by the size of the catalog, and I don't think it is. Now, having said that, some of your problem with slowness may indeed be related to the amount of touch-up that you are doing. Lightroom is known to slow down if you do lots of spot removal and lots of brushing, and then you may be better off doing this type of touch-up in Photoshop. Again, just to be 100% clear, the problem is not "size of catalog", the problem is that you are doing so many adjustments on a single photo. You could have a catalog that is just as large (i.e. that has lots more photos with few adjustments), and I would expect it to run a lot faster than what you are experiencing.
    So to sum up, you seem to be implying that slowness and catalog instability are the same issue, and I don't buy it. You seem to be implying that slowness and instability are both caused by the size of the catalog, and I don't buy that either.
    Re-reading your original post, you are putting the backups on the SSD, the same disk as the working catalog? This is a very poor practice; you need to put your backups on a different physical disk. That alone might help your space issues on the SSD.

  • LR 5.6 on Mac desktop all of a sudden will not read any card from any reader but will work on my laptop. I can work on previous images but not import new ones. Even if I create a catalog on laptop and import to my desktop on a thumb drive, the images are

    LR 5.6 on Mac desktop all of a sudden will not read any card from any reader but will work on my laptop. I can work on previous images but not import new ones. Even if I create a catalog on laptop and import to my desktop on a thumb drive, the images are only accessible as long as the thumb drive is inserted.

    Sounds like you may need to repair the Disk Permissions on your drive where your images are stored.
