Best Practices for SSO between NWBC and BOBJ CMC

What are the best practices in this scenario:
- NWBC client (using SAP ECC logon credentials)
- BOBJ client (configured using Windows AD credentials)
I would like my users to log into NWBC - but be automatically logged into the CMC for running Crystal Reports inside the NWBC GUI.
Thanks
Shane Kelly

Yes. We're not using the Portal, only SAP GUI up till now.
But we've recently configured our DEV server to run NWBC.
Normally my users log into CMC/InfoView in a browser, but with NWBC I can bring InfoView directly into the UI.
However, it asks for a sign-on every time.
I'd like to configure SSO from NWBC to BOBJ InfoView somehow.

Similar Messages

  • Best practice for migrating between environments and versions?

    Hi to all,
    We've got a full suite of solutions custom developed in SAP BPC 7.0, SP 7. We'd like to understand whether:
    - there are best practices for copying these applications from one environment to another (another client);
    - there are best practices in case the client has a newer version of SAP BPC (they would install 7.5, while we're still stuck on 7.0).
    Thank you very much
    Daniele

    Hi Daniele
    I am not entirely sure what you are asking. Could you please provide additional information?
    Are you looking for best practice recommendations for governance, for example change transports between DEV, QA and PRD in BPC 7.0?
    What is the best method: Server Manager backup and restore, etc.?
    And best practice recommendations on how to upgrade to a different version of BPC, for example upgrading from BPC 7.0 to 7.5 or 10.0?
    Kind Regards
    Daniel

  • Best Practices for NCS/PI Server and Application Monitoring question

    Hello,
    I am deploying a virtual instance of Cisco Prime Infrastructure 1.2 (1.2.1.012) on an ESX infrastructure, in an enterprise environment. I have questions around the best practices for monitoring this appliance. I am looking to monitor application failures (services down, DB issues) and "hardware" (I understand this is a virtual machine, but statistics on the filesystem and CPU/memory are still useful).
    Firstly, I have enabled the snmp-server via the CLI and set the SNMP trap host destination. I have created a notification receiver for the SNMP traps inside the NCS GUI and enabled the "System" alarm type. This type includes alarms like NCS_DOWN and "PI database is down". I am trying to understand the difference between enabling SNMP-SERVER HOST via the CLI and setting the notification destination in the GUI. Also, how can I generate an NCS_DOWN alarm in my lab? Running "ncs stop" does not generate any alarms, and I have not been able to find much information on how to generate this as a test.
    Secondly, how and which processes should I be monitoring from the management station? I cannot easily identify the main NCS processes from the output of ps -ef when logged into the shell as root.
    Thanks guys!

    Amihan_Zerrudo wrote:
    1.) What is the cost of having the scope in a <jsp:useBean> tag set to 'session'? I am aware that there is a list of scopes like page, application, etc., and that if I use 'session' my variable will live for as long as that session is alive. (Did I get this right?)
    You should look to the functional requirements rather than to costs. If the bean needs to be session scoped (e.g. to maintain the logged-in user), then make it so. If it only needs to be request scoped (e.g. single-page form data), then keep it request scoped.
    2.) If the JSP page where I use that <jsp:useBean> is to be accessed hundreds of times a day, will my server resources cope? Right now I am using the Sun GlassFish Server.
    It will certainly consume resources. Just supply enough CPU speed and memory to the server. You cannot expect a webserver running on a Pentium 500 MHz with 256 MB of memory to flawlessly serve 100 simultaneous users in the same second, but you may expect it to serve 100 users per 24 hours.
    3.) Can you suggest best practices for memory management given the architecture I described above?
    Just write code so that it doesn't unnecessarily consume memory, and only allocate memory when your application needs to. Let the hardware follow the application requirements rather than letting the application depend on the hardware specs.
    4.) Also, I have implemented connection pooling in my architecture, but my application is to be used by thousands of clients every day. Can the Sun GlassFish Server take care of that, or will I have to purchase a powerful server?
    GlassFish is just application server software; it is not server hardware. Your concerns here are hardware related.
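    To make the scope difference concrete, here is a minimal sketch of what the two scopes translate to in servlet terms, assuming the Servlet API is on the classpath (a <jsp:useBean> tag compiles down to attribute lookups like these; the bean classes are made up for illustration):

        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;
        import javax.servlet.http.HttpSession;

        public class ScopeDemo extends HttpServlet {
            static class FormBean { }  // single-page form data
            static class UserBean { }  // logged-in user

            @Override
            protected void doGet(HttpServletRequest request, HttpServletResponse response) {
                // scope="request": the bean lives only for this request and is
                // discarded once the page has been rendered
                request.setAttribute("form", new FormBean());

                // scope="session": the bean is created once and then kept in
                // memory until the session expires -- this is the "cost"
                HttpSession session = request.getSession();
                if (session.getAttribute("user") == null) {
                    session.setAttribute("user", new UserBean());
                }
            }
        }

    The cost of session scope is therefore roughly the bean's footprint multiplied by the number of concurrent sessions, which is why small, genuinely per-user state belongs there and per-page form data does not.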

  • What is the best practice for creating master pages and styles with translated text?

    I format translated text all the time for my company. I want to create a set of master pages and styles for each language and then import those styles into future translated documents. That way, the formatting can be done quickly and easily.
    What are the best practices for doing this? As a company this has been tried in the past, but without success. I'd like to know what other people are doing in this regard.
    Thank you!

    I create a master template that is usually void of content, except that I define as many of the paragraph styles as I believe can or will be used, with examples of their use in the body of the document--a style guide for that client. When beginning a new document for that client, I import those styles from the Paragraph Styles panel.
    The exception is when, in a rush, I begin the documentation first. Then, in the new work, I still pull the defined paragraph and/or object styles into it via their panels.
    There are times I need new styles. If they have broader applicability than a one-off instance or publication, I open the style template for that client, import the new style(s) from the publication containing them, and create example paragraphs and usage instructions.
    Take care, Mike

  • Tips n Tricks/Best Practices for integrating iPhone, iPad and MacBook Pro

    My wife just purchased an iPhone, iPad and MacBook Pro for her non-profit consulting business, and I was wondering whether a tips-and-tricks or best-practices guide exists for efficiently and productively integrating these devices?

    http://www.apple.com/icloud/

  • SAP Best Practices for SSO Configuration

    Hello There,
    Are there any SAP Best Practices available for SSO configuration? If so, kindly help me with those.
    Also, are there any third-party tools available on the market for SSO configuration?
    Appreciate your help on this. Thanks in advance.
    Regards,
    Pranay S

    Hello,
    The types of SSO are classified by the systems involved in the configuration, i.e. SSO between the ABAP stack and the Java stack, LDAP, or the OS.
    Refer to the link for more details: [Document Deleted]
    Regards,
    Anand

  • Best Practice for Database Parameter ARCHIVE_LAG_TARGET and DBWR Checkpoints

    Hi,
    As a best practice, I need to know the recommendation or guideline concerning these two database parameters.
    I found that for ARCHIVE_LAG_TARGET, Oracle recommends setting it to 1800 sec (30 min).
    Maybe someone can guide me on these two parameters...
    Cheers

    Dear unsolaris,
    First of all, if you want to track full and incremental checkpoints, set the LOG_CHECKPOINTS_TO_ALERT parameter to TRUE. You will then see the checkpoint SCNs and the completion times in the alert log.
    A full checkpoint is triggered when a log switch happens, and the checkpoint position is written to the controlfile and the datafile headers. For just a tiny amount of time, the database is consistent even though it is open and in read/write mode.
    The ARCHIVE_LAG_TARGET parameter is disabled and set to 0 by default. Here is the definition of that parameter:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/initparams009.htm
    If you want to set this parameter, Oracle recommends 1800, as you said. This can vary from database to database, and it is better to verify it by experimenting.
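    For reference, a minimal sketch of setting both parameters from JDBC (the connection details are placeholders, the Oracle JDBC driver must be on the classpath, and the user needs the ALTER SYSTEM privilege; the same two statements can of course be run from SQL*Plus instead):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        public class CheckpointParams {
            public static void main(String[] args) throws Exception {
                try (Connection conn = DriverManager.getConnection(
                         "jdbc:oracle:thin:@//dbhost:1521/orcl", "system", "password");
                     Statement stmt = conn.createStatement()) {
                    // Write each checkpoint's SCN and completion time to the alert log
                    stmt.execute("ALTER SYSTEM SET log_checkpoints_to_alert = TRUE SCOPE=BOTH");
                    // Force a log switch at least every 1800 seconds (30 minutes)
                    stmt.execute("ALTER SYSTEM SET archive_lag_target = 1800 SCOPE=BOTH");
                }
            }
        }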
    Regards.
    Ogan

  • Best Practice - Outer Join between Fact and Dim table

    Hi Gurus,
    I need some advice on the scenario below.
    I have an OOTB subject area with around 50-60 reports based on it. The subject area's Fact and Dim1 tables are joined with an inner join.
    Now I have a scenario for one report where an outer join has to be implemented between Fact and Dim1. I am against changing the OOTB subject area join, as the outer join would impact the performance of the other 50-60 reports.
    Can anyone provide input on the best way to handle this scenario?
    Thanks

    OK, I tried this:
    Driving table: Fact, left outer join -- didn't work.
    Driving table: Dimension D, left outer join -- didn't work either.
    In either case, I see the physical query as D left outer join to Fact F, and it omits the rows.
    Then I tried this:
    Driving table: Fact, right outer join.
    Now this is giving me an error:
    [Sybase][ODBC Driver]Internal Error. [nQSError: 16001] ODBC error state: 00000 code: 30128 message: [Sybase][ODBC Driver]Data overflow. Increase specified column size or buffer size. [nQSError: 16011] ODBC error occurred while executing SQLExtendedFetch to retrieve the results of a SQL statement. (HY000)
    I checked all the columns; everything matched the database table types and sizes.
    I am pulling Fact.Account Number, Dimension.Account Name and Fact.Measures, and I see this error each time I pull Fact.Account Number.
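    For what it's worth, the physical query the report needs should be driven by the fact table, so that fact rows with no matching dimension row come back with NULLs rather than being omitted. A hedged sketch of that shape (all table, column and connection names here are hypothetical), including a CAST that is one common workaround when the driver reports a data overflow on a column:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class OuterJoinShape {
            public static void main(String[] args) throws Exception {
                // LEFT OUTER JOIN from Fact to Dim keeps every fact row; the CAST
                // widens the column that triggers "Data overflow. Increase
                // specified column size or buffer size."
                String sql =
                    "SELECT CAST(f.account_number AS VARCHAR(40)) AS account_number, "
                  + "       d.account_name, f.revenue "
                  + "FROM fact_sales f "
                  + "LEFT OUTER JOIN dim_account d ON d.account_id = f.account_id";
                try (Connection conn = DriverManager.getConnection(
                         "jdbc:sybase:Tds:dbhost:5000/reporting", "user", "password");
                     Statement stmt = conn.createStatement();
                     ResultSet rs = stmt.executeQuery(sql)) {
                    while (rs.next()) {
                        System.out.println(rs.getString("account_number") + " | "
                            + rs.getString("account_name") + " | " + rs.getDouble("revenue"));
                    }
                }
            }
        }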

  • Best Practice for Buy in Set and Dismantle for Sales

    Hi All SAP Masters,
    We have a scenario where we purchase an item as a "set"; the set contains a few components (something like a material BOM). For example, a machine which comes with several parts. However, when the user receives this set from the supplier, the user further dismantles certain parts from the set/"machine" and sells them separately to customers as single items.
    What is the best practice to adopt for this process in SAP?
    Please help. Thank you.
    Warmest Regards,
    Edwin

    If your client has the PP module, then follow these steps. Consider A to be the purchased material, which is to be dismantled into B and C:
    1) Create a BOM for material B, assign the header material A as a consumption material with a positive quantity, and maintain component C as a by-product with a negative quantity in the BOM.
    2) Maintain the backflush indicator for A and C in the material master MRP2 view.
    3) Create a routing for B and maintain automatic GR for the final operation.
    4) Create a production order for B.
    5) Confirm the order in CO11n: A is consumed with movement type 261, C is received with movement type 531, and B is received with movement type 101.
    Once the stock is posted to unrestricted use, you can sell B and C.

  • Best Practice for saving all fields and searches in capital letters

    I want to save all fields on all my pages in CAPS and also to search in CAPS, e.g. if the user enters search criteria in lowercase letters, it should automatically be converted to caps. What is the best practice to do that?

    Hi,
    There are already many discussions on this in this forum; some of the links are:
    Uppercase
    How to convert user input in the page to upper case?
    Sireesha
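    Whatever the UI framework, the usual pattern is to normalize both the stored value and the search criteria to upper case so the comparison always matches; a minimal sketch (the class and values are illustrative):

        import java.util.Locale;

        public class UppercaseNormalizer {
            // Convert any user input to upper case before saving or searching
            static String toCaps(String input) {
                return input == null ? null : input.trim().toUpperCase(Locale.ROOT);
            }

            public static void main(String[] args) {
                String saved  = toCaps("John Smith");    // stored as "JOHN SMITH"
                String search = toCaps("john smith");    // criteria becomes "JOHN SMITH"
                System.out.println(saved.equals(search)); // true: the search matches
            }
        }

    On the database side, the same effect is usually achieved by applying UPPER() to both the column and the bind variable in the WHERE clause.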

  • Best Practice for update to iPhone and iTouch

    OK, when 3.0 comes down the pike, what is the best way to get 3.0 as a "clean" install? Currently 2.2.1 is on both. If I do a restore, will the system only pick up 3.0, or will it see the 2.2.1 which is currently on the hard drive? With that in mind, how can I delete the 2.2.1 version of the iPhone and iPod touch software? Sorry for two questions in one post.
    Steve H

    When firmware update 2.0 was released, the entire iPhone was erased first, including the existing firmware - just as when restoring an iPhone with iTunes - followed by 2.0 being installed, and then the iPhone's backup being transferred back to the iPhone.
    The same may apply with firmware update 3.0, with your iPhone's backup being updated immediately beforehand. If not, firmware version 2.2.1 will be updated to 3.0.
    If 2.2.1 is updated in place and you want a "clean" install of 3.0, you can follow the initial upgrade by restoring your iPhone with iTunes.

  • Best Practices for Batch Updates, Inserts and Complex Queries

    The approach we have taken for our ALDSP architecture is to model our DASi as business data objects, each DS joining several (sometimes many) tables and lookups. This works fine when accessing individual records for read and update, but when we need to update multiple tables and rows within the same commit, trying to do this with a single logical DS built on tables or other DASi proves both cumbersome and slow. This is also the case for queries, when we have complex WHERE clauses within a DS built upon two or more multi-table-joined logical DASi.
    We tried a DS built on SQL, but that does not allow DML operations. We may have to just use JDBC. Any thoughts on how best to leverage DAS in this respect?

    I tried doing this by creating a UO class and using it on a DS built on a SQL statement. What we wanted to do here is first read the DS to get the list of ID values that met the conditions of the query, then call submit() and have the UO update all the necessary tables associated with those IDs.
    However, we found that the UO never gets called unless you actually update something, rather than just calling submit() after a read. Did I misunderstand the way this should work?

  • Best practices for performance on I/O and storage

    I'm building/buying a new server and was planning on going virtual.
    Dual Xeon 2620 v3 with 64 GB RAM; we have about 15 local users and 14 remote users.
    Main server: 2008/2012, SQL
    2nd server: 2008/2012, file storage
    3rd server: Terminal Services/Citrix (may not be needed, still evaluating)
    Here is my concern.
    The Hyper-V server is installed on a mirrored 120 GB SAS RAID volume. I've been informed that this is unnecessary, as Hyper-V doesn't require much space, and that having it on an SSD would only improve the boot-up speed of the actual Hyper-V server (the hypervisor); therefore, even if I put this on a slow 5400 RPM drive, it would only affect the initial boot of Hyper-V (which I don't plan on rebooting often). Is this true? Would the page file be an issue?
    I was then planning on four 600 GB 15K SAS drives in RAID 10, which I would use for the datastores of the three servers.
    I've been informed that the I/O on these drives will affect performance and that each server should be on its own separate physical drives (RAID volume).
    Is this common? Should I be using separate HDs for each virtual machine?
    nambi

    Do not create "silos" or "islands" of storage, as that is both a) a hell of a management burden and b) an effective way to steal IOPS from your config. OBR10 (One Big RAID 10) is the way to go. See:
    http://community.spiceworks.com/topic/262196-one-big-raid-10-the-new-standard-in-server-storage
    Good luck :)
    Hyper-V Shared Nothing Cluster: only two Hyper-V hosts needed.

  • Best practice for iCloud across Lion and Snow Leopard

    Hi all,
    I'm not going to bleat about SL not supporting iCloud. I understand the business model.
    However, I'm wondering if any of you are in the same shoes as me and, if so, how you handle content management across iCloud-compatible and -incompatible devices.
    I have Lion on our 6-month-old iMac and Snow Leopard on my 6-year-old MacBook. This MacBook is still incredibly fast and reliable. I program websites across both devices using software like Coda 2, Transmit, etc.
    I'm a huge fan of the cloud and, while I can use Dropbox on all my devices, I kind of like the built-in functionality of iCloud more. (Plus I have more space there.) Just wondering what workflows people are using to try to sync and work with content across compatible and incompatible devices.
    Cheers,
    Danno

    There are some hacks for using iCloud with Snow Leopard. One of the keys is that you must use iCloud as your primary source for Contacts and Calendars. That is, when you want to make additions or changes to either your calendar or your contacts, you really MUST do it from the iCloud web interface. Otherwise you may wind up with duplicates.
    See the article here: http://bit.ly/LvVCR2

  • Best Practice for very large iTunes and photo libraries using OS X Server

    OK, the setup:
    One iMac, one new MacBook Pro, one MacBook, all on Leopard. Wired and wireless, all AirPort Extremes and Expresses.
    I have purchased a Mac mini plus a FireWire 800 2 TB RAID drive.
    I have a 190 GB, ever-increasing music library (I rip one-to-one, no compression) and a 300 GB photo library.
    So, the question: will it be easier to set up OS X Server on the mini and access my iTunes library via that?
    Is it easy to do so?
    I only rip via the iMac, so the library is connected to that and shared to the laptops. How does one go about making the iMac automatically connect to the music if I transfer all the music to the server?
    The photo part can wait, depending on the answer about the music.
    Many thanks
    Adrian

    I have a much larger iTunes collection (500 GB, 300k songs), a lot more photos, and several terabytes of movies. I share them out via a Linux server. We use Apple TV for music and video, and the bottleneck appears to be the Mac running iTunes in the middle. I have all of the laptops (MacBook Pros) set up with their own "instance" of iTunes that just references the files on the server. You can enable sharing in iTunes itself, but with a library this size, performance on things like loading cover art and browsing the library is not great. Note also that I haven't tried 8.x, so there may be performance enhancements that have improved things.
    There is a lag of a second or so when accessing music or video on the server. I suspect this is due to the speed at which the Mac accesses the network shares, but it's not bad, and you never notice it once the music or video starts. Some of this on the video front may be down to the codec settings I used to encode the video.
    I suspect that as long as you are doing just music, this isn't going to be an issue for you with a mini. I also suspect that you don't need OS X Server at all. You can just set up a file share in OS X, give each machine a local iTunes instance pointing back at the files on the server, and have a good setup.
