Request for suggestions on implementing XI scenarios

Hi Experts,
We are new to SAP and need to implement the following scenarios in our landscape. Please suggest how we should proceed.
Scenario 1:
There is no PI in the customer's landscape; only an SAP R/3 system is present. In our landscape, PI is available. How can we connect their SAP R/3 to our PI?
Scenario 2:
Suppose there is no PI or SAP R/3 and the customer is only willing to provide a .csv file or some other flat file. How can our PI pick up that flat file from the customer's side?

>>Scenario 1:
>>There is no PI in the customer's landscape; only an SAP R/3 system is present. In our landscape, PI is available. How can we connect their SAP R/3 to our PI?
No problem.
1) Set up ALE configuration between the customer's R/3 system and your PI, then ask the customer to send IDocs to PI.
Check this link:
http://wiki.sdn.sap.com/wiki/display/ABAP/7StepsForALEConfiguration
or
2) You can also establish communication through proxy configuration between the customer's system and your PI. There are plenty of ways to connect R/3 to PI.
>>Scenario 2:
>>Suppose there is no PI or SAP R/3 and the customer is only willing to provide a .csv file or some other flat file. How can our PI pick up that flat file from the customer's side?
Create an NFS share at the UNIX level and ask the customer to drop the file there. Configure a sender File adapter in PI to pick up the file; it can be a flat file or XML.
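The File adapter's content conversion is what turns such a CSV into XML before the message reaches the Integration Engine. Purely as an illustration of that step (this is not PI configuration), here is a rough Python sketch; the field handling and the MT_Orders message type are invented for the example.

import csv
import xml.etree.ElementTree as ET

def csv_to_xml(csv_path, xml_path):
    # Hypothetical root element standing in for a PI message type.
    root = ET.Element("MT_Orders")
    with open(csv_path, newline="") as f:
        # Assumes the first CSV line carries the field names.
        for row in csv.DictReader(f):
            rec = ET.SubElement(root, "Record")
            for field, value in row.items():
                ET.SubElement(rec, field).text = value
    ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)

# Example usage (assumes an orders.csv dropped on the NFS share):
# csv_to_xml("orders.csv", "orders.xml")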

Similar Messages

  • Suggestions for implementing a scenario

    Hi All,
    I have a requirement wherein I need to add a custom tab to the Opportunity TI screen. That custom tab has certain search parameters, such as Product ID, and the search needs to trigger an external web service that takes the search parameters as input and returns a table of IDs called KIT IDs.
    This table needs to be displayed in the custom tab. It should have additional functionality such as:
    1. Provision to choose multiple KIT IDs and compare their details
    2. Provision to display something called Components when one KIT ID is chosen (a KIT may have multiple components)
    Though I have a few ideas for implementing this, it would be helpful if I could get more suggestions.
    Thanks & Regards
    Jagathshree.

    Great! Thanks. I wasn't sure if this demo was going to be re-factored to JSF, so I'm very excited to see that it has been - there's additional functionality in this application beyond SRDemo that will be very informative to see implemented in JSF.

  • Need a Suggestion For implementing the Digital Signature For the Documents

    Hi,
    Currently I am working on a Document Management System. I need a good suggestion on how to implement digital signatures for documents.
    Thanks in Advance
    Sabarish V

    Hmm, if you are not using Oracle Payroll, what are you using for payroll? I am wondering why you could not use your payroll system, whatever it is, to handle this reimbursement program.
    You may want to talk to Oracle support about how to handle this in Oracle iExpense. You can certainly handle advances for expense reports: you would apply the advance to the expense report items. The catch is that I don't think you can stop expense item entry after the advance is satisfied. You would have to set up a workflow process of some kind to have the expense reports reviewed and only approve the expenses that are applied to the advance. Not your ideal solution, but something to think about, and the Oracle folks might know of a clever way to handle this. What you are trying to do is unusual: employee advances are common, but not being able to exceed the advance amount is the unusual part. Normally you would accept any expenses over the advance amount and reimburse the employee for the extra amounts not advanced.
    Good luck.
    John Dickey

  • Suggestion required for Fiscal Year scenario

    Hi Gurus,
    I need your suggestions on the scenario below.
    One of our clients is going to change the fiscal period from (Jan-Dec) to (Sep-Aug). We have nearly 5 years of records to date based on the (Jan-Dec) fiscal period.
    Requirement: once the new fiscal period is active, they want to view all the data (including historical records) based on either fiscal period, depending on the selection in the query.
    Technical requirement: dynamic switching of the Fiscal Year Variant at query level.
    Suggestions required on the following points:
    1. Is it required to maintain two fiscal year variants?
    2. How do we maintain the new fiscal year period for historical records?
    3. Is there any possibility of doing this without major disturbance to the existing queries?
    4. Is it possible to do this at query level with only a minor impact on performance?
    5. Is it possible to do this on the back end without redundancy of records?
    6. One option I have is to use a MultiProvider on top of two InfoProviders that maintain the fiscal period based on the two fiscal year variants. If I realize this, it would lead to major back-end work.
    7. Another option is to add two fields to the existing structure to hold the new variant and fiscal period. If I do this, how can I dynamically switch the structure of the query to view the different fiscal period information?
    Eagerly anticipating your reply.
    With Regards
    Siva

    Hi,
    1. Is it required to maintain two fiscal year variants?
    Yes, you will need another fiscal year variant for the Sep-Aug year.
    2. How do we maintain the new fiscal year period for historical records?
    To do this at the BI level, you can write a routine to convert the fiscal year/period according to the new variant.
    You will have to get the help of a functional consultant to understand the logic of opening and closing balances. According to the new variant, the current fiscal period 009.2009 (Sep 2009) becomes 001.2010, and so on. Apart from converting the period, you will also have to take care of carrying over the balances.
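    Purely as an illustration of the period shift such a routine would implement (this is not BW transformation code), here is a small Python sketch, assuming the old variant equals the calendar year and the new variant starts in September:

    def to_new_variant(old_year, old_period):
        # Old variant = calendar year, so old_period is the calendar month (1..12).
        # New variant: the fiscal year runs Sep..Aug, so Sep 2009 -> period 001 of FY 2010.
        if not 1 <= old_period <= 12:
            raise ValueError("period must be 1..12")
        if old_period >= 9:
            return old_year + 1, old_period - 8   # Sep..Dec open the next fiscal year
        return old_year, old_period + 4           # Jan..Aug fall later in the same fiscal year

    print(to_new_variant(2009, 9))   # (2010, 1), i.e. 009.2009 becomes 001.2010
    print(to_new_variant(2010, 1))   # (2010, 5), Jan 2010 is the 5th period of FY 2010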
    3. Is there any possibility of doing this without major disturbance to the existing queries?
    Once you implement the above logic, you can simply include the new fiscal year variant in the query.
    4. Is it possible to do this at query level with only a minor impact on performance?
    Yes, as mentioned above.
    5. Is it possible to do this on the back end without redundancy of records?
    Not sure about this.
    6. One option is to use a MultiProvider on top of two InfoProviders that maintain the fiscal period based on the two fiscal year variants.
    This seems to be the best way. You will not have to touch the existing cube; only your reports will have to be copied to the new MultiProvider. The logic from question 2 above applies to the new cube.
    Regards,
    Gaurav

  • Scenarios for implementing workflow

    Hello,
    Can anybody suggest some scenarios where I can implement SAP Workflow?
    Some links will be very useful.
    Regards,
    Sujith

    Hello,
      Implementing the model for workflow execution
    This section discusses the modeling option we implemented with the required task iteration functions for our client. The process model we created to represent the end-to-end process, and its mapping to the client's business scenarios, are shown. A snapshot of the process model, with its process modules and activities that each of the process modules orchestrate, are also discussed.
    Rationale for the choice
    To implement task rework we selected Option 2, the process-subprocess approach (stop, rewind, and start). This approach allows for better process simulation, which helps optimize the business process. It also keeps the model unchanged even after process improvements result in less rework, although it is recommended that the model eventually be cleaned up to remove the unwanted rework steps.
    Rework needs will likely be reduced because rework is not typically driven by process or business conditions. It is most often caused by operational constraints and lack of full understanding of the process. The rework occurrences would diminish as users gain better understanding by using a workflow solution and by applying monitoring results to improve the business process.
    The business process model developed using WBI Modeler consists of process modules identified in the process scope. The end-to-end process is created by using the process modules and business rules to help choreograph these modules in the correct sequence.
    The process model approach supports execution of business process for a "happy path." The process modules are used to execute the end-to-end process for happy path and the task rework requirement. The approach also supports reusability. The model information includes organization data, process data elements, and metrics data (KPIs).
    End-to-end business process model
    The contract management process involves various business activities that result in creation of proposals and contracts. The activities are logically grouped into process modules, which in turn make up the end-to-end process model. A business event in the CRM system triggers the launching of the process and executes the modules and activities based on business rules. The process execution creates and assigns tasks to sales support team members, who will work and complete the tasks. The process modules are mapped to the business scenarios as shown here.
    Scenario 1: Execute Service Request 1 - Customer Research
    Scenario 2: Execute Service Request 2 - Solution Configuration
    Scenario 3: Execute Service Request 3 - Solution Pricing
    Scenario 4: Execute Service Request 4 - Perform Customer Credit Check
    Scenario 5: Execute Service Request 5 - Prepare Proposal and Contract

  • Procedures for implementing a snapshot scenario with custom DataSources

    Hi Gurus,
    I have checked the How To paper ([How to Handle Inventory Management Scenarios in BW (NW2004)|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328]). However, only SAP standard BW objects are mentioned in the paper, e.g. the InfoCube (0IC_C03), the Material Stock InfoSource (2LIS_03_BX), the Material Movements InfoSource (2LIS_03_BF), and the Revaluations InfoSource (2LIS_03_UM).
    In my case, however, I need to handle custom DataSources for the snapshot scenario. Are there any differences in the implementation methodology? Which additional aspects should I take into consideration, for example the load sequence, delta type, etc.?
    Could you please list the step-by-step procedure for such an implementation?
    Thanks in advance!
    Regards,
    Meng

    Hi Meng,
    You can approach this in two ways.
    1) If the volume of data is not large, you can derive the balance at query level: the user enters a date, and based on this you restrict your key figure to display all values up to that date.
    2) If the volume of data is high, you will have performance problems if you calculate the balance in the front end. In this case, you can model it with a non-cumulative key figure. Again, there are two ways of approaching this back-end solution, depending on the volume of data (say, in one case you have 2 years of history in your DSO, and in the other you have 5 years).
    A) If there are only 2 years of history:
    Create a non-cumulative key figure ZBALANCE with inflow and outflow in a cube.
    Map these to your credit and debit as + and - respectively, and map the calendar day to the posting date.
    Initialise the data load with data transfer and start loading the deltas as normal.
    You will then be able to see the balance for each and every calendar day in your reporting.
    This approach is straightforward and simple.
    Compress the cube to get better performance.
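    Purely as an illustration of what the non-cumulative key figure gives you (this is not BW code), here is a Python sketch of the running balance per calendar day derived from credit/debit movements; the sample amounts are made up:

    from collections import defaultdict
    from datetime import date

    # (posting date, amount): positive = credit/inflow, negative = debit/outflow
    movements = [
        (date(2024, 1, 2), 100.0),
        (date(2024, 1, 2), -30.0),
        (date(2024, 1, 5), 50.0),
    ]

    def daily_balances(movements):
        per_day = defaultdict(float)
        for day, amount in movements:
            per_day[day] += amount
        balance, result = 0.0, {}
        for day in sorted(per_day):
            balance += per_day[day]
            result[day] = balance   # balance as of this calendar day
        return result

    print(daily_balances(movements))
    # e.g. {datetime.date(2024, 1, 2): 70.0, datetime.date(2024, 1, 5): 120.0}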
    B) If there are 5 years of history and you are not interested in loading all 5 years of data to get the balance:
    Here you want to have an initial balance, continue with deltas, and load only 2 years of history.
    The cube and the non-cumulative key figure are created as described above.
    To generate the initial balance, create another DSO without the calendar day, with ZBALANCE mapped to credits and debits in additive mode. Load your DSO data into this new DSO to generate the initial balance. This balance is then loaded into your cube as the initial balance (like 2LIS_03_BX).
    You have to compress this request with marker update (a must).
    Load your 2 years of historical data from the original DSO and compress without marker update (a must).
    Initialise without data transfer from the DSO to the cube and load the deltas normally.
    Compress the delta requests normally for performance reasons.
    Please read the 'Inventory document' in detail.
    Please let me know, if any of the information is still not clear.
    Thanks,
    Krishnan

  • Reg: RFC / IDOC required for doing CIDX Scenario

    Hi! Gurus,
    This is Amar Srinivas Eli. My task here is to implement a CIDX scenario successfully. There are several business cases in the CIDX standards, and from these I selected one business case, i.e. FORECASTING.
    In order to implement that scenario, I need the relevant RFCs or IDoc lists from the SAP side, whether on the source or the target side; the other side will be in CIDX format.
    I need your help with the tasks below. Kindly help me with a detailed step-by-step guide.
    1. How do I find the right RFC/IDoc name in the R/3 system? I know I can look up RFCs in SE37 and IDocs in WE05, but I need to check whether all the required fields are covered by those RFCs.
    2. My first preference is for standard SAP RFCs or IDocs only.
    3. Please let me know how to see the fields present inside those RFCs.
    4. Also, please provide any configuration guide with screenshots, i.e. step by step including testing, if you have already worked on a CIDX scenario.
    5. Is there any tool other than the STK kit for testing CIDX messages?
    6. Is there any freeware STK toolkit available?
    Also, please share your experience and the errors you have faced while doing a CIDX scenario.
    Regards:
    Amar Srinivas Eli

    Hi All,
    Thanks for your fast response.
    Hi Raja,
    I went through your points. I have a few doubts about them; please clarify.
    1. You wrote: "Coming to testing I didn't use any testing tool, but you can compare your output using XML Spy or Stylus Studio. CIDX documents you will find on SDN; please search there."
    So, without using the STK tool or some other tool, how can I get the CIDX response to my request in a test environment? I am not doing this in a real-time project; I am trying to work out the solution for the business cases, so I need a testing tool, right? Will Altova XMLSpy give me that?
    Note: You said that you have done the business case IDoc to CIDX order change/request. Could you please share the documentation, i.e. the step-by-step end-to-end procedure, for your scenario?
    2. I know the R/3 and PI settings, i.e. configurations such as ports and RFC destinations. The only thing I don't know is how you found that IDoc, i.e. on what basis and how did you search for it?
    3. I am working on the FORECASTING topic, which includes many sub-tasks such as demand plan request and response, supply plan request and response, replenishment order, etc. Based on that, how can I find the related RFCs or IDocs in order to communicate from the R/3 side?
    Suppose I look at one RFC/IDoc and, for example, out of 10 fields a few are in one IDoc and a few others are in another IDoc. In that case, do I need to go for IDoc multi-mapping, i.e. two senders to one receiver, and if so, is that possible?
    My first requirement: based on the business cases mentioned above, how can I find the corresponding RFCs or IDocs?
    Regards:
    Amar Srinivas Eli
    Edited by: Amar Srinivas Eli on Jan 5, 2009 12:19 PM

  • I continue to receive message that "We could not complete your iTunes Store request. An unknown error occurred (4002). Please try again later." This has been happening every time iTunes Match runs in background. Any suggestions for a cure?

    I continue to receive message that "We could not complete your iTunes Store request. An unknown error occurred (4002). Please try again later." This has been happening every time iTunes Match runs in background. Any suggestions for a cure?

    Found a potential solution here:
    https://discussions.apple.com/thread/4332757
    Gsleeroy
    Re: error 4002 in itunes match do you have a solution? 
    Sep 23, 2012 10:08 AM (in response to matracaelcan)
    Hi All,
    I had this problem today myself, and was frustrated repeatedly by the '4002' error.
    I have literally just fixed the issue by doing the following steps:
    1: Go to the 'Store' tab and select 'Turn Off iTunes Match'
    2: Return to the 'Store' tab and select 'Update Genius'
    3: Wait for this to complete successfully, then return to the 'Store' tab once more and select 'Turn On iTunes Match'.
    4: iTunes Match will now go through the motions and should succeed!
    I hope this helps

  • iTunes error message: "We could not complete your iTunes Store request. An unknown error occurred (4002)." Any suggestions for curing this?

    After migrating my old iMac to my new one, every time I start iTunes I get the following message: "We could not complete your iTunes Store request. An unknown error occurred (4002)." Any suggestions for curing this?

    Perfect - thanks so much!
    For anyone else wondering, the clue was in what was displayed on the screen after I turned iTunes Match on again: it asked whether I wanted to add this computer. Having copied my iTunes library over from the old machine, I'd completely forgotten that iTunes Match identifies each computer uniquely (not each iTunes Library), so was never going to work with this one until I specifically added it. Sorted!

  • Which scenario would you suggest for the iron & steel industry, and why?

    Which scenario would you suggest for the iron & steel industry, and why?
    Product: Galvanised Plain / Galvanised Corrugated Sheets
    Process: Pickling - Cold Rolling - Galvanising - Cutting/Corrugation
    Kindly suggest a manufacturing scenario: discrete or repetitive?
    Madan

    Hi
    Please clarify the following doubts:
    1. Is it working on a make-to-order or a make-to-stock scenario?
    2. Will the sheets that come out be of different sizes and shapes?
    If so, the final operation will be different for different sizes.
    Chandra

  • Different scenarios for implementing screen exits

    Hi all,
    Are there different scenarios for implementing screen exits for different applications such as MM, SD, etc.?

  • Request Response Bean for SOAP Sender Adapter

    Hi Friends,
    Is it possible to use the RequestResponseBean module described here (File to RFC to File) for a SOAP (sender) and JDBC (receiver) adapter?
    I want to configure SOAP to JDBC to JDBC scenario.
    http://wiki.sdn.sap.com/wiki/display/HOME/UsingRequestResponseBeanModuleinFILE+Adapter
    Scenario:
    I will make the SOAP adapter call asynchronously, and the JDBC receiver adapter will select data from a database. The response will go back to the SOAP adapter, which will then divert it to another JDBC adapter; that JDBC adapter will insert the data into another database.
    I do not want to use ccBPM. Is there any design approach to implement this scenario?
    Thanks,
    Sandeep Maurya

    Hi Sandeep,
    SAP says the SOAP sender adapter does not support modules. You can search help.sap.com (for the SOAP sender channel) to confirm this.
    I would suggest using the Axis adapter, which provides all the functionality of SOAP and supports modules as well.
    I have used the module beans you mentioned in the past, but their processing is not consistent; sometimes they get stuck with a Message ID issue. I have seen other people face the same issue as well; search SDN and you will find reports of problems with these modules.
    Using BPM would be another option.
    Thanks,
    Sunil Singh

  • A suggestion for handling optdepends.

    [UPDATE]
    I've rewritten this post to present the idea more clearly.
    [/UPDATE]
    I've submitted a feature request: http://bugs.archlinux.org/task/12708
    If you like this idea, please express your support there too.
    The Current Situation
    The pacman database contains a file named "depends" in each package's directory which specifies the package's depends in the following format:
    %DEPENDS%
    foo
    bar
    this
    that
    Pacman reads this file and creates an internal representation of this list for the package which it uses during the sync operation to handle dependencies. Each package may also list optional dependencies which provide further functionality for the package without being required to use the package. Let's take gimp as an example:
    pacman -Si gimp
    Depends On : gtk2>=2.14.4 lcms>=1.17 libxpm>=3.5.7 libwmf>=0.2.8.4 libxmu>=1.0.4 librsvg>=2.22.3 libmng>=1.0.10 dbus-glib>=0.76 libexif>=0.6.16 pygtk>=2.13.0 desktop-file-utils gegl>=0.0.22 curl
    Optional Deps : gutenprint: for sophisticated printing only as gimp has built-in cups print support
    libwebkit: for the help browser
    poppler-glib: for pdf support
    hal: for Linux input event controller module
    alsa-lib: for MIDI event controller module
    If you want to install libwebkit to use gimp's help browser, you have 2 choices:
    pacman -S libwebkit
    pacman -S --asdeps libwebkit
    With the first choice, libwebkit will clutter the list of explicitly installed packages ("pacman -Qet"). With the second choice, libwebkit will be considered an orphan and will be listed in "pacman -Qdt", which not only means it clutters that list but it also means that you can no longer purge orphans with "pacman -Rs $(pacman -Qqdt)".
    In both cases, when you uninstall gimp, you must remember to uninstall libwebkit too, because pacman doesn't know that you installed it as a dependency for gimp.
    This may not be a problem for one package, but it will be once the number of optdepends you have installed increases.
    My Suggestion
    Create an optdepends database in /var/lib/pacman/optdepends/ that follows the same format as the current depends files:
    %OPTDEPENDS%
    foo
    bar
    this
    that
    Add a function to pacman to check if a package has an entry in the optdepends database.
    During a sync operation, treat any optdepends specified in the optdepends database as if they had been specified in the depends file.
    Add a "--getoptdeps" flag to pacman to enable interactive installation of optdepends for a given package that follows the same pattern as the current group installation dialogue.
    Store the results of this dialogue in the optdepends database.
    Let's take gimp as an example again. You know that gimp has an optdepend that you want, so you do this:
    pacman -S --getoptdeps gimp
    gimp package found, checking optdepends
    :: gimp has the following optdepends:
    gutenprint: for sophisticated printing only as gimp has built-in cups print support
    libwebkit: for the help browser
    poppler-glib: for pdf support
    hal: for Linux input event controller module
    alsa-lib: for MIDI event controller module
    :: Install whole content? [y/N] n
    :: Install gutenprint as optdepend for gimp? [y/N] n
    :: Install libwebkit as optdepend for gimp? [y/N] y
    :: Install poppler-glib as optdepend for gimp? [y/N] n
    :: Install hal as optdepend for gimp? [y/N] n
    :: Install alsa-lib as optdepend for gimp? [y/N] n
    Retrieving libwebkit...
    Libwebkit will now be handled exactly as if it were a true dependency of gimp. It is neither explicitly installed nor an orphan. It will get removed with gimp unless it's a depend or optdepend for another package.
    /var/lib/pacman/optdepends/gimp/optdepends now looks like this:
    %OPTDEPENDS%
    libwebkit
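    As a rough sketch of how such a file could be read (illustrative Python, not pacman's actual code), assuming the path layout proposed above:

    import os

    def read_optdepends(pkg, root="/var/lib/pacman/optdepends"):
        # Return the optdepends recorded for pkg, or an empty list if none.
        path = os.path.join(root, pkg, "optdepends")
        if not os.path.isfile(path):
            return []
        deps, in_section = [], False
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line == "%OPTDEPENDS%":
                    in_section = True
                elif line.startswith("%"):
                    in_section = False   # another section header ends the list
                elif in_section and line:
                    deps.append(line)
        return deps

    # During a sync, pacman would then treat read_optdepends("gimp")
    # (here ["libwebkit"]) as if those entries were in gimp's depends list.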
    The Benefits of This Method
    Default pacman behavior remains unchanged.
    Most of the code is already in place (depends file parser, package selection dialogue, dependency handling during sync operation)
    The existing databases (local, sync) would not require any changes.
    The only extra overhead would be checking if a package has an entry in the optdepends database.
    Users can define their own optional dependencies by adding them to the optdepends database (manually or with provided tools)
    This opens the doors for metapackages to replace groups.
    About Metapackages
    A metapackage is a package that contains nothing itself but organizes other packages. For an example of how these work on Arch, take a look at metapax.
    Every package group could be converted to a metapackage if this suggestion were implemented. To understand the benefits of using metapackages instead of groups, we need to consider how groups currently work.
    When you install gnome, this is what happens:
    pacman -S gnome
    gnome package not found, searching for group...
    :: group gnome (including ignored packages):
    epiphany gnome-applets gnome-backgrounds gnome-control-center gnome-desktop gnome-icon-theme gnome-media gnome-mime-data gnome-mount gnome-panel gnome-python gnome-screensaver gnome-session gnome-settings-daemon
    gnome-themes gnome2-user-docs libgail-gnome metacity nautilus notification-daemon yelp
    :: Install whole content? [Y/n] n
    :: Install epiphany from group gnome? [Y/n] y
    :: Install gnome-applets from group gnome? [Y/n] y
    :: Install gnome-backgrounds from group gnome? [Y/n] y
    :: Install gnome-control-center from group gnome? [Y/n] y
    :: Install gnome-desktop from group gnome? [Y/n] y
    Most users will install all of the packages, others won't. In either case, once the packages are on your system, pacman has no concept of the gnome "group". Each package is effectively independent of the gnome group. If a new package is added to the gnome group, for example "gnome-somenewpackage", pacman will not install it during your next update. It won't even ask you about it or tell you that there is a new package. There have been questions on this forum from users wondering why new gnome packages weren't installed automatically. This applies to all groups... kde, xorg, xfce, etc.
    If we instead replaced groups with metapackages, each package in the group would become an optdepend of the metapackage. With my suggestion, this would lead to exactly the same dialogue as above. Each package in a metapackage would remain optional just as packages in groups currently are. The advantage would be that if "gnome-somenewpackage" is added to the gnome metapackage, it would be possible to inform the user during an update and prompt for installation.
    Here's the discussion on flyspray about groups vs metapackages: http://bugs.archlinux.org/task/8242
    Notes on Metapackages
    The only complicated parts of handling metapackages are the following:
    If a package is a metapackage, it should be detected during installation and automatically jump to the optdepends dialogue in order for it to behave exactly as groups do.
    During a metapackage update, there should be a way to inform the user of new optdepends, but this might be as simple as including an upgrade message in the package install file.
    Last edited by Xyne (2009-01-13 16:20:52)

    No, this wouldn't affect a package's "true" dependencies in any way.
    Packages now have 2 types of dependencies, "depends" and "optdepends". "depends" are installed with the package and are required for the package to run. "optdepends" just display a message during installation to the effect of "optional dependencies for this package: foo - for foo support, bar - for bar support, baz - for web access and printing". "gimp" is an example of a package with optional dependencies.
    As it is right now, optional dependencies are nothing more than installation messages. If you decide to install optional dependencies for a given package, they are completely independent of the target package. Let me give a concrete example:
    pacman -Si gimp
    Depends On : gtk2>=2.14.4 lcms>=1.17 libxpm>=3.5.7 libwmf>=0.2.8.4 libxmu>=1.0.4 librsvg>=2.22.3 libmng>=1.0.10 dbus-glib>=0.76 libexif>=0.6.16 pygtk>=2.13.0 desktop-file-utils gegl>=0.0.22 curl
    Optional Deps : gutenprint: for sophisticated printing only as gimp has built-in cups print support
    libwebkit: for the help browser
    poppler-glib: for pdf support
    hal: for Linux input event controller module
    Ok, I want to install gimp and I want libwebkit to be able to use gimp's help browser. I have 2 options right now:
    Option 1:
    pacman -S gimp libwebkit
    libwebkit is now installed as an explicit package.
    Option 2:
    pacman -S gimp
    pacman -S --asdeps libwebkit
    libwebkit is now installed as a dependency.
    With option 1, libwebkit clutters my list of explicitly installed packages (pacman -Qet). With option 2, it is considered an orphan by pacman and would be removed with an orphan purge ("pacman -Rsn $(pacman -Qqdt)"). In both cases, if I remove gimp, libwebkit stays on my system even though I only want it for gimp. It will not be removed with "pacman -Rs gimp" because pacman has no idea that it has anything to do with gimp.
    My suggestion therefore is to create a way for pacman to treat selected optdepends as depends. Given the gimp example, what this would mean for the user is that when the user runs "pacman -S gimp", it would present a dialogue as follows:
    gimp has the following optional dependencies:
    gutenprint: for sophisticated printing only as gimp has built-in cups print support
    libwebkit: for the help browser
    poppler-glib: for pdf support
    hal: for Linux input event controller module
    Would you like to install these optional dependencies? [y/N] y
    Install all optional dependencies? [y/N] n
    Install gutenprint? [y/N] n
    Install libwebkit? [y/N] y
    Install poppler-glib? [y/N] n
    Install hal? [y/N] n
    retrieving libwebkit...
    libwebkit would now be treated as if it had been specified in gimp's depends array. When you uninstall gimp, it would be removed with gimp just as gimp's other dependencies.
    There would also be tools to add optional dependencies to a package later (either with pacman or something else... I'll gladly contribute something to do this), so if you want to add gutenprint to gimp later, you could and then let pacman grab it as a dependency of gimp.
    Again, this has nothing to do with "true" dependencies of packages. This is just a fix for the kludge now known as "optdepends".
    First, let's look at what happens when you install the gnome
    pacman -S gnome
    gnome package not found, searching for group...
    :: group gnome (including ignored packages):
    epiphany gnome-applets gnome-backgrounds gnome-control-center gnome-desktop gnome-icon-theme gnome-media gnome-mime-data gnome-mount gnome-panel gnome-python gnome-screensaver gnome-session gnome-settings-daemon
    gnome-themes gnome2-user-docs libgail-gnome metacity nautilus notification-daemon yelp
    :: Install whole content? [Y/n] n
    :: Install epiphany from group gnome? [Y/n] n
    :: Install gnome-applets from group gnome? [Y/n] y
    :: Install gnome-backgrounds from group gnome? [Y/n] y
    :: Install gnome-control-center from group gnome? [Y/n] y
    :: Install gnome-desktop from group gnome? [Y/n] y
    :: Install gnome-icon-theme from group gnome? [Y/n] y
    :: Install gnome-media from group gnome? [Y/n] n
    :: Install gnome-mime-data from group gnome? [Y/n]
    After the installation, each of those packages is treated as an independently installed package. The "group" gnome only exists when you select packages for the initial installation. There have been threads on this forum posted by users who didn't understand why "pacman -Syu" failed to retrieve packages that had been added to the gnome "group". That's because pacman simply updates the existing packages and doesn't know about groups once they're on the system. If they add "gnome-some-new-package", you have to either run "pacman -S gnome" again and re-install all the packages (or run through the dialogue until you get to the new package), or you have to explicitly install any new packages directly. You also need to find out when a new package has been included in gnome, because there is no way for pacman to know this (I posted a script somewhere to check if you have all packages in a group, forgot where though).
    The idea of a metapackage is that it is an empty package that simply specifies other packages as dependencies (i.e it contains no files, just package information). That's what metapax creates (http://bbs.archlinux.org/viewtopic.php?id=53788). If a gnome metapackage is created with metapax, the user can install it and get all of the packages in gnome. If a new package is added to the gnome metapackage, this package will be retrieved on the next sync update. The user doesn't need to regularly check that he has everything in gnome because the metapackage handles all the packages in gnome.
    The problem with this approach is that everything it specifies is a "depends", so you have to include everything. With "optdepends" though, you would get a similar dialogue to the one when installing a group (as my example above for gimp), but the installed metapackage would have all of the advantages of a package when syncing and uninstalling.
    Users could also create their own metapackages. Let's say that you would like to create a custom DE from existing packages so that you can quickly install a similar desktop on different systems. You could create a metapackage with your window manager, text editor, image viewer, video player, etc. You could then simply install that package on different machines and be presented with the choice of which components you'd like to install. You could distribute this over your network with a local user repository. If you later want to add another package to it, that package could be optionally included on the different machines during the next update.
    Last edited by Xyne (2009-01-11 18:03:10)

  • Alerting for complete async scenario

    I'm new to XI and I need advice from XI experts.
    Could anyone explain how to set up alerting for a completely asynchronous scenario in case of errors in XI, such as mapping errors? I have gone through some alert configuration documents, but what I'm looking for are suggestions and recommendations on how to implement alerting for async processes. How is it implemented for async and sync processes in real projects? What is a good approach for alerting and error handling?
    Thanks,
    Sudha Madhuri

    Hi,
    You can use alerts to send the message or error message to a user, with or without interrupting the process, or while the message is being processed.
    In an async process you can raise alerts from a UDF or through BPM.
    In a sync process, you can raise alerts through BPM.
    You can also reconcile the process with BPM.
    Please see the links below.
    Alerts with variables from the messages payload (XI) - UPDATED - /people/michal.krawczyk2/blog/2005/03/13/alerts-with-variables-from-the-messages-payload-xi--updated
    /people/michal.krawczyk2/blog/2005/09/09/xi-alerts--step-by-step - Alert Configuration
    /people/michal.krawczyk2/blog/2005/09/09/xi-alerts--troubleshooting-guide - Trouble shoot alert config
    XI ALerts with container elements - /people/sukumar.natarajan/blog/2007/01/07/how-to-raise-alerts-from-abap-proxy
    Reconciliation of Messages in BPM - /people/krishna.moorthyp/blog/2006/04/08/reconciliation-of-messages-in-bpm
    /people/sap.india5/blog/2005/12/06/xi-ccms-alert-monitoring-overview-and-features - CCMS Alert Monitoring
    Triggering XI Alerts from a User Defined Function - /people/bhavesh.kantilal/blog/2006/07/25/triggering-xi-alerts-from-a-user-defined-function
    blogs for alerts
    http://help.sap.com/saphelp_nw2004s/helpdata/en/2b/d925bf4b8a11d1894c0000e8323c4f/content.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/9c/34193cb4f5131de10000000a11405a/content.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/8a/3e2d4105f8d92be10000000a1550b0/content.htm
    Regards
    Chilla..
    Points rewarded if it is useful.

  • PI mapping content for MM-SUS scenario?

    Dear Guru's,
    We are configuring the MM-SUS scenario with SRM 7.01, ECC 6 EHP 5, and PI 7.02. While configuring the PI scenario for the above integration, we cannot find the required mapping for the purchase order output from the ECC side in the PI scenario.
    We have imported the XI content for SRM SERVER 7.01, SAP APPL 6.05, and SRM SERVER IC 7.01. Apart from this, does any other XI content need to be imported into the PI SLD?
    For the interface PurchaseOrderERPRequest_Out_V1 from MM to SUS, do we have to do the PI mapping manually, or is standard XI content available? If so, please help me with the content details.
    Thank you.
    Ranjan

    Dear Nikhil,
    As you suggested earlier, we have imported the content for the integration scenario 'SE_SERVICE_PROCUREMENT' and found all the required content. The only thing missing is the mapping for the ERP PO request (outbound) in the PI content.
    I am wondering whether SAP does not deliver standard content and we need to develop our own mapping content, or whether there is a direct call in place.
    For anybody who has used this scenario, what was your approach? Or is there anything we are missing?
    Thanks and regards,
    Ranjan
    Ranjan Sutradhar
