External System Authentication Credentials Best Practice

We are in the process of a 5.0 upgrade.
We are using NTLM as our authentication source to get the users and groups and to authenticate against that source, so currently we only have the NT user ID and group info (the NT domain password is not stored).
We need to get user credentials for other systems/applications so that we can pass them on to the specific applications when we search, crawl, or integrate with those apps/systems.
We were thinking of gathering the credentials (application user ID and password) for the other applications by developing a custom Profile Web Service (PWS) to collect the information specific to these users. However, we don't know whether the external application password is kept secure when it is retrieved from the external repository via a PWS and stored in the Portal database.
Is this the best approach for gathering the above information? If not, please recommend the best practice to follow.
Alternatively, we could have the users enter the external system credentials themselves by editing their user profiles; however, this approach is not preferred.
If we can't store the users' credentials for the external apps, we won't be able to enhance the user experience when doing a search or click-through to the other applications.
Any insight would be appreciated.
Thanks.
Vanita

Hi Vanita,
Your solution sounds fine - however, it might be easier to use an SSO token or the Plumtree user ID in your external applications as the definitive authentication token.
For example, if you have an external application that requires a username and password and the user is working in a portlet view of that application, the application should be able to accept the user ID Plumtree sends it as proof that it is the correct user. You should limit this sort of password bypass to traffic being gatewayed by the portal (i.e. coming from the portal server only).
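To illustrate that gateway-only restriction, here is a minimal sketch of a check the external application could make (the header name, portal address, and filter class are hypothetical placeholders, not anything Plumtree mandates):

    // Hedged sketch: honor the portal-asserted user ID only when the request
    // actually comes from the portal/gateway server. Header name and address
    // are placeholders.
    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class PortalOnlyIdentityFilter implements Filter {

        // Address of the portal server allowed to assert a user identity (placeholder).
        private static final String PORTAL_ADDRESS = "10.0.0.15";

        public void init(FilterConfig config) throws ServletException { }

        public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                throws IOException, ServletException {
            HttpServletRequest request = (HttpServletRequest) req;
            String assertedUser = request.getHeader("X-Portal-UserID"); // hypothetical header

            // Reject identity assertions that do not arrive via the portal gateway.
            if (assertedUser != null && !PORTAL_ADDRESS.equals(request.getRemoteAddr())) {
                ((HttpServletResponse) res).sendError(HttpServletResponse.SC_FORBIDDEN,
                        "User ID assertions are only accepted from the portal gateway");
                return;
            }
            chain.doFilter(req, res);
        }

        public void destroy() { }
    }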
If you want to write a Profile Web Service, the data that gets stored in the Plumtree database is exactly what the Profile Web Service sends as the value for a particular attribute. For example, if your PWS tells Plumtree that the APP1UserName and APP1Password for user My Domain\Akash are Akash and password, then that is what we save. If your PWS encrypts the password using some two-way encryption beforehand, then that encrypted value is what we save. These properties are simply attached to the user and can be sent to different portlets.
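To make the encryption point concrete, here is a hedged sketch of the attribute-building step a PWS could perform before handing values to the portal (the attribute names follow the example above; the class, key handling, and cipher mode are simplified placeholders and not part of any Plumtree API):

    // Hedged sketch: encrypt the external password with a symmetric (two-way)
    // cipher before it is returned to the portal, so the Plumtree database never
    // stores it in clear text. Portlets holding the same key can decrypt it.
    import java.util.Base64;
    import java.util.HashMap;
    import java.util.Map;
    import javax.crypto.Cipher;
    import javax.crypto.spec.SecretKeySpec;

    public class App1CredentialAttributes {

        private final SecretKeySpec key;

        public App1CredentialAttributes(byte[] sharedSecret16Bytes) {
            // AES-128 key shared between the PWS and the portlets that must decrypt the value.
            this.key = new SecretKeySpec(sharedSecret16Bytes, "AES");
        }

        /** Builds the attribute map the PWS would return for one user. */
        public Map<String, String> attributesFor(String app1User, String app1Password) throws Exception {
            Map<String, String> attrs = new HashMap<String, String>();
            attrs.put("APP1UserName", app1User);
            attrs.put("APP1Password", encrypt(app1Password)); // stored encrypted in the portal database
            return attrs;
        }

        private String encrypt(String clearText) throws Exception {
            Cipher cipher = Cipher.getInstance("AES"); // ECB by default; prefer an IV-based mode in real code
            cipher.init(Cipher.ENCRYPT_MODE, key);
            return Base64.getEncoder().encodeToString(cipher.doFinal(clearText.getBytes("UTF-8")));
        }
    }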
Hope this helps,
-aki-

Similar Messages

  • Oracle SLA Metrics and System Level Metrics Best Practices

    I hope this is the right forum...
    Hey everyone,
    This is what I am looking for. We have several SLAs set up and we have defined many business metrics, and we are trying to map them to system-level metrics. One key area for us is Oracle. I was wondering if there is a best practice guide out there for SLAs when dealing with Oracle or, even better, system-level metric best practices.
    Any help would be appreciated.


  • Needed: system requirements, recommendations & best practices

    I realize that this is just a beta release but I think it'd be nice if the system requirements for this VS.NET add-in were clearly defined in a single post/page.
    For example, Christian Shay's reply to "Oracle Release Requirements" by [170516] indicates that "For the beta release you should be able to use Oracle Database version 8.1.7 or later." However, Christian's reply in the "Oracle DB 8.1.7 connection problems" thread started by [151631] indicates that "you'd need to upgrade to at least 8.1.7.4.1 for it to work."
    Which is it??? I'm using Oracle8i Enterprise Edition Release 8.1.7.2.0 and I don't want to install this add-in if it's not going to work with my current environment. I realize I can download 10g but my employer is using 8.1.7.2 and that's my target DB for now.
    In browsing this forum, I've read a number of posts with suggestions (such as installing the version 10 client in a new Oracle Home, etc.) and I think it'd be nice to see a basic set of requirements/recommendations/best practices in a single place for all to read.

    Hi
    Can you also include the following in the FAQ?
    1) ODP.NET, if installed prior to this beta version - what is the best practice? De-install it prior to installing this, etc.?
    2) As multiple Oracle homes have become the norm these days, this being a client-only install should probably be non-intrusive and non-invasive. Hope that is getting addressed.
    3) Is this a precursor to future developments, like some of the app servers evolving to support .NET natively, and so on?
    4) Where is BPEL in this scheme of things? Is that getting added as well, so that Eclipse and .NET VS 2003 developers can use some common web service framework?
    Regards
    Sundar
    It was interesting to see the options for changing the spelling of "Webservice" [the first one was WEBSTER].

  • Authentication & authorization best practices

    Hello!
    From what I understand of various documentation and videos, the Azure API app is just a Web API with some extra metadata. However, in the previous product (Mobile Services) there were extra facilities to handle authentication - both for custom providers and for other social providers.
    Has this all gone?  Are we going back to this method here:
    http://www.asp.net/web-api/overview/security/individual-accounts-in-web-api
    ... and then doing all our social authentication manually / using other libraries like before?
    If the authentication plans are unfinished (as this is preview after all) it would be good to know so I can prioritise my work. :-)
    Thank you!

    Hi Guang, thank you for your help (on this post and so many others!)
    So from a code perspective, I can just grab those NuGet packages and load them straight into this API App project and code for it too, is that right?
    I've started by making the project in VS rather than provisioning one in the portal so I haven't seen those settings yet.  I'll just get myself a bit of a relevant demo going, then I will provision one and see the rest.
    Thanks again, let me know if those mobile service libraries are the right ones to grab.

  • SAP CRM V1.2007 Best Practice

    Hello,
    We are preparing the installation of a new CRM 2007 system and we want to have a good demo system.
    We are considering two options:
    - SAP CRM IDES
    - SAP CRM Best Practices
    knowing that we have an ERP 6.0 IDES system we want to connect to.
    The Best Practices package seems to have a lot of preconfigured scenarios that will not be available in the IDES system (known as "SAP All-in-One").
    How can we start the automatic installation of the scenarios (with solution builder) connecting to the ERP IDES system?
    Reading the BP quick guide, it is mentioned that in order to have the full BP installation we need an ERP system with another Best Practices package.
    Will the pre customized IDES data in ERP be recognized in CRM?
    In other words, is the IDES master data, transactional data and organizational structure the same as the Best Practice package one?
    Thanks a lot in advance for your help
    Benoit

    Thanks a lot for your answer, Padma Guda.
    The difficult bit in this evaluation is that we don't know exactly the difference between the IDES and the Best Practices package. That is to say, what is the advantage of having a CRM Best Practices system connected to an ERP IDES system, as opposed to a CRM IDES system connected to an ERP IDES system?
    As I mentioned, we already have an ERP IDES installed as the back-end system.
    I believe that if we decide to use the ERP IDES as the ERP back end, we will lose some of the advantages of having an ERP Best Practices system connected to a CRM Best Practices system, e.g. the sales area already mapped and known by the CRM system, ERP master data already available in CRM, transactional data already mapped, pricing data already mapped, etc.
    Is that right? Or do we have to do an initial load of ERP in all cases?

  • Best Practice for migration to Exadata2

    Hi Guru,
    I'm thinking of migrating an Oracle RAC 11g (11.2.0.2) cluster on HP-UX Itanium to a new Exadata 2 system.
    Are there best practices for this? Where can I find documentation about the migration?
    Thanks very much
    Regards
    Gio
    Edited by: ggiulian on 18-Aug-2011 7:39

    There are several docs available on MOS
    HP Oracle Exadata Migration Best Practices [ID 760390.1]
    Oracle Exadata Best Practices [ID 757552.1]
    Oracle Sun Database Machine X2-2/X2-8 Migration Best Practices [ID 1312308.1]
    If you already have Exadata, I recommend opening an SR with Oracle and engaging with ACS.
    - Wilson
    www.michaelwilsondba.info

  • Best Practice Question - Activate Company Code - Open client or transport?

    Hello.
    When activating Company Codes in a newly productive system, is it best practice to do it directly in an open client, or to change the setting via transport?
    Thanks and Regards,
    D Flores

    What do you mean by activating the company code?
    Is it the "productive" check box in OBY6 you are talking about?
    If so, open the client for manual changes, make the setting, and then put the client settings back.

  • Best practices for exposing AMs (OAF 11.5.10) as web services to external systems

    IHAC who is developing extensions to their E-Business install base using OAF 11.5.10, and they have approached me with questions on how they could expose some of the business services they developed (mainly AMs and VOs) as web services to be used in a BPEL/web service framework. The BPEL service is SeeBeyond (not sure how it is spelled), not Oracle's.
    I have outlined two ways, but since I have not developed anything on OAF I have no idea whether either is possible.
    The first was to migrate the ADF BC (or BC4J) projects from OAF to JDeveloper 10.1.3 and, more or less, create a simple facade layer with a session bean, then right-click and deploy it as a web service.
    The second was to use a web service library such as Axis directly in JServ to expose them on the "target" server.
    Does anyone have any best practices on this topic?

    For recognition and stable functionality of USB devices, your B&W G3 should be running OS 8.6 minimally. The downloadable OS 8.6 Update can be run on systems running OS 8.5/8.5.1. If (after updating to 8.6), your flash drive still isn't recognized, I'd recommend downloading the OS 9.1 Update for the purpose of extracting its newer USB support drivers, using the downloadable utility "TomeViewer." These OS 9.1 USB support files can be extracted directly to your OS 8.6 Extensions folder and are fully compatible with the slightly older OS software. It worked for me, when OS 8.6's USB support files lacked a broad enough database to support my first USB flash drive.

  • Wireless authentication network design questions... best practices... etc...

    Working on a wireless deployment for a client... wanted to get updated on what the latest best practices are for enterprise wireless.
    Right now, I've got the corporate SSID integrated with AD authentication on the back end via RADIUS.
    We would like to implement certificates in addition to the user-based authentication so we have some level of two-factor authentication.
    If a machine is lost, I don't want a certificate alone to allow an unauthorized user access to the wireless network. I also don't want poorly managed AD credentials (written on a sticky note, for example) opening up the network to an unauthorized user either... is it possible to do an AND condition, so that both are required to get access to the wireless network?

    There really isn't true two-factor authentication you can do with just RADIUS unless it's ISE and you're doing EAP chaining. One workaround that works with ACS or ISE is to use "Was machine authenticated". Again, this only works for domain computers. The way Microsoft works :) is that you have a setting for user or computer... this does not mean user AND computer. So when a Windows machine boots up, it will send its system name first and then the user credentials. System name (machine) authentication only happens once, at boot-up; user authentication happens every time there is a full authentication.
    Check out these threads; they explain it pretty well.
    https://supportforums.cisco.com/message/3525085#3525085
    https://supportforums.cisco.com/thread/2166573
    Thanks,
    Scott
    Help out others by using the rating system and marking answered questions as "Answered".

  • Best practices for BO/BW SSO SAP authentication transports

    Hi Friends,
    We are going to integrate a BW system with BO (SAP authentication). All the queries are built through BICS connections, and we have various reporting tools that need SSO SAP authentication (Webi, Crystal, Dashboards, Design Studio, etc.).
    As per the process, there are certain activities which have to be performed at the BW level,
    e.g. BW role creation (PFCG - Crystal role enablement) and assignment to BO users.
    Once that is created in BW, we have to do the integration at the BO level (in the CMC application) by selecting the authentication and importing the roles, followed by groups, users, folders and access levels.
    My questions are:
    1. Transport of BW objects for BO SSO (SAP) authentication (such as roles created for users, the keystore certificate, uploads): will these objects be transported by the BW team, or will they separately download/upload the certificate in the different systems (QAS, PROD, etc.)?
    2. At the BO level, once I integrate BO SSO, do I need to do the manual integration in the QAS and production systems as well, or can it be transported with BO's Promotion Management tool?
    3. Can this SSO (SAP) authentication be applied to all tools in BI Launch Pad (Design Studio, Webi, Web applications, Crystal, etc.), as all users are required to have SSO to all BO tools?
    4. Regarding the Lumira tool, can we do SSO authentication?
    Please share your thoughts and experience.
    It would be great if I could get a BO administration best practices document for BW-BO SSO and for user and group management in CMC.
    Thanks in advance

    Hi,
    Please find my answers below:
    1. The roles are created in BW and should automatically appear under SAP roles in BO CMC Authentication if connectivity is set up between BO and BW, irrespective of SSO. The roles are transported by the BW security team.
    2. Every environment will have a unique connection to the corresponding SAP BW environment. For example, SAP BW DEV will be mapped to BO DEV and SAP BW PROD will be mapped to BO PROD, so these settings cannot be migrated through Promotion Management.
    3. This authentication can be applied to all tools; SSO does not depend on the tool, it depends on the integration between the two systems, which in this case are BO and SAP BW. As mentioned earlier, after the integration all tools can have SSO.
    You can refer to a lot of help documents on this site which will help you set up the integration between SAP BW and SAP BO.
    Kind Regards,
    Priyanka

  • Best practice for extracting data to feed external DW

    We are having a healthy debate with our EDW team about extracting data from SAP.  They want to go directly against ECC tables using Informatica and my SAP team is saying this is not a best practice and could potentially be a performance drain.  We are recommending going against BW at the ODS level.  Does anyone have any recommendations or thoughts on this?

    Hi,
    As you asked for the best practice, here it is in an SAP landscape:
    1. Full-load or delta-load data from SAP ECC to SAP BI (BW): SAP BI understands the data element structure of SAP ECC, and the delta mechanism is the continuous process of loading data from SAP ECC (the transactional system) to BI (the analytic system).
    2. You can store transaction data in DSOs (at a granular level) and in InfoCubes (at a summarized level) within SAP BI. You can have master data from SAP ECC coming into SAP BI separately.
    3. Within SAP BI, you SHOULD use the Open Hub service to provide SAP BI data to other external systems. You must not connect an external extractor to fetch data from DSOs and InfoCubes into the target system; the Open Hub service is the tool that facilitates feeding data to external systems. You can have Informatica take data from the Open Hub destinations of SAP BI.
    Hope I have explained this to your satisfaction.
    Thanks,
    S

  • Best practice for integrating Oracle ATG with an external web service

    Hi All
    What is the best practice for integrating Oracle ATG with an external web service? Is it using the integration repository, or calling the web service directly from a Java class using a WS client?
    With Thanks & Regards
    Abhishek

    Using the Integration Repository might cause performance overhead depending on the operation you are doing. I have never used the Integration Repository for third-party integration, so I am not able to comment on that.
    Calling the service directly as a Java client is an easy approach, and you can use the ATG component framework to support it by making the endpoint, security credentials, etc. configurable properties, as in the sketch below.
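    As a rough illustration of that second approach (a hedged sketch only: the class, property names, and basic-auth call are placeholders; in ATG this would normally be registered as a Nucleus component and configured through a .properties file rather than instantiated directly):

        // Hedged sketch: a client bean whose endpoint and credentials are injectable
        // configuration properties rather than hard-coded values.
        import java.io.IOException;
        import java.io.InputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.nio.charset.StandardCharsets;
        import java.util.Base64;

        public class ExternalServiceClient {

            // Configurable properties (in ATG these would come from ExternalServiceClient.properties).
            private String endpointUrl;
            private String username;
            private String password;

            public void setEndpointUrl(String endpointUrl) { this.endpointUrl = endpointUrl; }
            public void setUsername(String username) { this.username = username; }
            public void setPassword(String password) { this.password = password; }

            /** Calls the external service with basic auth and returns the raw response body. */
            public String call(String requestBody) throws IOException {
                HttpURLConnection conn = (HttpURLConnection) new URL(endpointUrl).openConnection();
                conn.setRequestMethod("POST");
                conn.setDoOutput(true);
                String token = Base64.getEncoder()
                        .encodeToString((username + ":" + password).getBytes(StandardCharsets.UTF_8));
                conn.setRequestProperty("Authorization", "Basic " + token);
                conn.getOutputStream().write(requestBody.getBytes(StandardCharsets.UTF_8));
                try (InputStream in = conn.getInputStream()) {
                    return new String(in.readAllBytes(), StandardCharsets.UTF_8);
                }
            }
        }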
    Cheers
    R
    Edited by: Rajeev_R on Apr 29, 2013 3:49 AM

  • Best practice for integrating external (ERP, database, etc.) eCommerce data into CQ

    Hi Guys,
    I am referring to the Geometrixx Outdoors project for building eCommerce functionality in our project.
    Currently we are integrating with an ERP system to fetch the product details.
    Now I need to store all the product data from the ERP system in our CRX repository under the /etc/commerce/products/<myproject> folder structure.
    Do I need to create a CSV file structured as explained in the geometrixx-outdoors project and place it exactly the way they describe in the documentation, so that the CSV importer will import the data into CRX and create the sling:Folder and nt:unstructured nodes?
    Please guide me: what is the best practice for integrating external eCommerce data into a CQ system when building eCommerce projects?
    Are there any other best practices?
    Your help in this regard is really appreciated.
    Thanks

    Hi Kresten,
    Thanks for your reply.
    I went through the eCommerce framework link that you sent.
    Can you give me a few of the steps to use the eCommerce framework to pull all the product information into our CRX repository, and also how to synchronize the ERP system and the CRX data? Is there a scheduling mechanism to pull the data from our ERP system and sync it with the CRX repository?
    Thanks
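    Not from any official documentation, but as a hedged sketch of the kind of scheduling being asked about here: in CQ a periodic sync is often done by registering a Runnable with the Sling Commons Scheduler via the OSGi whiteboard. The cron expression, subservice name, and target path below are placeholders.

        // Hedged sketch: a scheduled job that would pull products from the ERP system
        // and write them under /etc/commerce/products/<myproject>.
        import java.util.Collections;
        import java.util.Map;
        import org.apache.sling.api.resource.ResourceResolver;
        import org.apache.sling.api.resource.ResourceResolverFactory;
        import org.osgi.service.component.annotations.Component;
        import org.osgi.service.component.annotations.Reference;

        @Component(service = Runnable.class,
                   property = { "scheduler.expression=0 0 2 * * ?" }) // run nightly at 02:00 (placeholder)
        public class ErpProductSyncJob implements Runnable {

            @Reference
            private ResourceResolverFactory resolverFactory;

            public void run() {
                Map<String, Object> auth = Collections.singletonMap(
                        ResourceResolverFactory.SUBSERVICE, (Object) "erp-sync"); // placeholder service user
                ResourceResolver resolver = null;
                try {
                    resolver = resolverFactory.getServiceResourceResolver(auth);
                    // 1. Call the ERP system (web service, file drop, etc.) and fetch new/changed products.
                    // 2. Create or update product nodes under /etc/commerce/products/<myproject>.
                    // 3. resolver.commit() after each batch so partial runs are persisted.
                } catch (Exception e) {
                    // Log and let the next scheduled run retry.
                } finally {
                    if (resolver != null) {
                        resolver.close();
                    }
                }
            }
        }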

  • Best practice for external but secure access to internal data?

    We need external customers/vendors/partners to access some of our company data (view/add/edit). It's not so easy to segment those databases/tables/records out from the other existing ones (and put separate database(s) in the DMZ where our server is). Our current solution is to have a 1433 hole from the web server into our database server. The user credentials are not in any sort of web.config but rather compiled into our DLLs, and that SQL login has read/write access to a very limited number of databases.
    Our security group says this is still not secure, but how else are we to do it? Even if we use a web service, there still has to be a hole somewhere. Is there any standard best practice for this?
    Thanks.

    Security is mainly about mitigation rather than being 100% secure - "we have unknown unknowns". The component needs to talk to SQL Server. You could continue to use HTTP to talk to SQL Server, perhaps even get SOAP transactions working, but personally I'd have more worries about using such a "less trodden" path, since that is exactly the area where more security problems are discovered. I don't know your specific design issues, so there might be even more ways to mitigate the risk, but in general you're using a DMZ as a decent way to mitigate risk. I would recommend asking your security team what they'd deem acceptable.
    http://pauliom.wordpress.com
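    To make the "the hole just moves" point concrete, here is a hedged sketch of the alternative the question alludes to: instead of opening 1433 from the DMZ web server to the database, expose one narrow lookup endpoint on an internal service and let the DMZ box call that instead. Everything below (class, port, table, query, connection details) is a hypothetical placeholder, and the JDK's built-in HttpServer is used only to keep the sketch self-contained.

        // Hedged sketch: a single, narrowly scoped lookup endpoint backed by a
        // parameterized query and a least-privilege database login.
        import com.sun.net.httpserver.HttpServer;
        import java.net.InetSocketAddress;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;

        public class OrderLookupService {

            public static void main(String[] args) throws Exception {
                HttpServer server = HttpServer.create(new InetSocketAddress(8081), 0);
                server.createContext("/orders", exchange -> {
                    String query = exchange.getRequestURI().getQuery();           // e.g. "customerId=42"
                    String customerId = query == null ? "" : query.replace("customerId=", "");
                    StringBuilder body = new StringBuilder();
                    try (Connection con = DriverManager.getConnection(
                                 "jdbc:sqlserver://dbserver;databaseName=Partners", // placeholder connection
                                 System.getenv("DB_USER"), System.getenv("DB_PASS"));
                         PreparedStatement ps = con.prepareStatement(
                                 "SELECT order_id, status FROM orders WHERE customer_id = ?")) {
                        ps.setString(1, customerId);                               // parameterized, never concatenated
                        try (ResultSet rs = ps.executeQuery()) {
                            while (rs.next()) {
                                body.append(rs.getString(1)).append(' ').append(rs.getString(2)).append('\n');
                            }
                        }
                    } catch (SQLException e) {
                        body.setLength(0);
                        body.append("lookup failed");
                    }
                    byte[] bytes = body.toString().getBytes("UTF-8");
                    exchange.sendResponseHeaders(200, bytes.length);
                    exchange.getResponseBody().write(bytes);
                    exchange.close();
                });
                server.start();
            }
        }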

  • What are the best practices recommended by Microsoft for giving external (internet) access to an intranet portal?

    Hi,
    What are the best practices recommended by Microsoft?
    I have an intranet portal in my organization used by employees, and I want to give employees access to it externally from the internet as well.
    Can I use the same URL for employees accessing the intranet portal internally and externally, or different URLs,
    like (https://extranet.xyz.com.in) and (http://intranet.xyz.com.in)?
    The internal URL accessed by employees is (http://intranet.xyz.com.in),
    and this portal is configured with claims-based authentication.
    We have an F5 for load balancing. A request from external users to the F5 is an HTTPS request, and the F5-to-SharePoint request is HTTP; the SharePoint server replies to the F5 over HTTP, but the F5 response to external users is HTTPS.
    When I change the settings below in alternate access mappings, all links change to HTTPS, but the authentication link still shows HTTP and the authentication page does not open.
    adil

    Hi,
    One of my clients has an environment similar to yours, with an internal pair of F5s and a pair used for access from the internet.
    I am only going to focus on the method using an F5 load balancer and SSL offloading. The setup of the F5 will not be covered in detail, but a reference to the F5 documentation for SharePoint and SSL offloading is provided below.
    Since you are going to be using SSL offloading, you do not need to extend your web applications into separate IIS websites with unique IP addresses.
    - Configure the F5 with SSL offloading.
    - Configure an internal AAM for SSL (HTTPS) for each web application that maps to the public HTTP FQDN AAM setting for that web application.
    Our environment has an additional component: we require RSA authentication for all internet-facing sites, so we have the extra step of extending each web application to a separate IIS website and configuring RSA for each extended website.
    Reference SharePoint F5 configuration:
    http://www.f5.com/featured/video/ssl-offloading/
    -Ivan
