JmxSecurityException when using Identity Management

A search or 'create new user' in the Identity Management iView results in the following error:
"com.sap.engine.services.jmx.exception.JmxSecurityException: Caller
<anonymoususer> not authorized, only role administrators is allowed to
access JMX".
The message appears while I am logged in to the portal as Administrator; it looks like a misconfiguration of the anonymous access/user.
The same error appears when navigating to Identity Management through NWA, and when trying to use other admin tasks in NWA.
How could the anonymous user configuration affect the IM iView within the super_admin_role?
Only when I assign <anonymoususer> to the administrators group does the IM iView work properly while logged in as Administrator!
Any help is much appreciated.

Hi,
To access the Visual Administrator or NWA, the user must be assigned the administrators role.
Can you please describe, step by step, exactly what you are trying to do?
Regards,
Koti Reddy
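For background (generic JMX, not the SAP NetWeaver connector specifically): the identity a JMX server authorizes is whichever principal the connection's credentials map to, which is why a call that arrives without a properly mapped user can show up as <anonymoususer>. Below is a minimal sketch using the standard javax.management.remote API; the service URL and credentials are placeholders, not values from this thread.

import java.util.HashMap;
import java.util.Map;
import javax.management.MBeanServerConnection;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

// Generic JMX illustration: the server authorizes the principal derived from
// the credentials passed in the environment map; if none are supplied, many
// servers treat the caller as anonymous.
public class JmxCredentialsExample {
    public static void main(String[] args) throws Exception {
        // Placeholder connector address; replace with the real one.
        JMXServiceURL url = new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:9999/jmxrmi");
        Map<String, Object> env = new HashMap<String, Object>();
        env.put(JMXConnector.CREDENTIALS, new String[] { "Administrator", "password" });
        JMXConnector connector = JMXConnectorFactory.connect(url, env);
        try {
            MBeanServerConnection mbsc = connector.getMBeanServerConnection();
            System.out.println("MBean count: " + mbsc.getMBeanCount());
        } finally {
            connector.close();
        }
    }
}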

Similar Messages

  • Using Identity Management for Securing Web Services

    My goal is to associate my services with an Oracle Internet Directory. I made some attempts to set up SAML authentication for the web services, but they did not produce the right outcome.
    (My identity management server and OID is up and running and I have successfully made authentication modules for other web applications)
    Here is what I did:
    1. I wrote a simple java file, used jdeveloper tools to create and deploy it as a web service to OC4J. I associated an identity management server with this service through OC4J web tools as security provider.
    2. I made a data control for the web service and put it in an ADF application (the client).
    3. I deployed the client project (from step 2) to OC4J.
    I could use the web service through the page.
    Then
    I secured the webservice to expect SAML for authentication.
    Surprisingly, the client could still communicate with the web service. Why? Shouldn't it have rejected the request because of the missing SAML token? (The proxy and the data control were not secured and did not provide any SAML tokens.)
    4. I added a login page to my client project (through the ADF Security wizard). It used Identity Management for authentication successfully: the login process completes and the web service data control is displayed.
    5. I want the authentication information to be propagated through the page so that the web service receives the data and uses Identity Management.
    I know I should add <property name="oracle.security.wss.propagate.identity" value="true"/> to one of the configuration files, but I don't know where exactly.
    Best Regards,
    Farbod

    It doesn't matter whether the service is invoked as part of your larger process or not; if it performs any business-critical operation, it should be secured.
    The idea of SOA / designing services is to have the services available so that they can be orchestrated as part of any other business process.
    Today you may have secured your parent services, and tomorrow you could come up with a new service which uses one of the existing lower-level services.
    If all the services are in one application server, you can make the configuration/development environment a lot easier by securing them using the Gateway.
    A typical problem with any gateway architecture is that the service is available without any security enforcement when accessed directly.
    You can enforce rules at your network layer to allow access to the app server only from the Gateway.
    When you have the liberty to use OWSM or any other WS-Security product, I would stay away from any extensions. Two things to consider:
    The next BPEL developer in your project may not be aware of the security extensions.
    Centralizing security enforcement keeps your development and security operations loosely coupled and addresses scalability.
    Thanks
    Ram

  • Passing arguments to Managed Server ServerStart when using Node Manager

    Below is the procedure to pass arguments to the Managed Server's Server Start tab when using Node Manager to start and stop the Managed Server instance.
    To pass JVM arguments, use "-server" (the HotSpot server VM) followed by the arguments,
    e.g.: "-server -Xms2048m -Xmx2048m -verbosegc"
    To add jars to the classpath, you will have to add weblogic.jar and weblogic_sp.jar from the WL_HOME/server/lib directory along with the custom jar (in some cases weblogic_sp.jar can be ignored; just adding weblogic.jar will do),
    e.g.: "/home/user/debug.jar:/home/user/bea/wlserver_10.3/server/lib/weblogic.jar:/home/user/bea/wlserver_10.3/server/lib/weblogic_sp.jar"
    - - Tarun

    Hi,
    To me, if you don't want to use the Node Manager, you won't be able to start your managed servers with WLST.
    I guess the only ways are:
    * through the console / WLST using the Node Manager
    * using startManagedWebLogic.[cmd/sh]
    Regards

  • Problem with Database Initialization when using Configuration Manager (ubuntu-jboss-mysql)

    Hello,
    When I try to initialize the database using Configuration Manager, the following error occurs:
    ALC-TTN-002-001: JDBC datasource lookup failed for resource reference [java:comp/env/jdbc/IdpDs]. The most likely cause is that a datasource having a JNDI name of [IDP_DS] does not exist or is misconfigured. Check the application server's configuration.
    I DO have an IDP_DS datasource configured in JBoss. I carefully followed the JBoss configuration instructions, so I don't really understand what the issue is here.
    Anyone encountered a similar problem? Any help?
    Many thanks in advance,
    Artur

    Data source files have to end with -ds.xml and should be copied to the /deploy folder of your JBoss configuration (all) for them to be picked up by JBoss.
    If the file was edited on Windows, make sure it does not contain DOS line endings. Ubuntu Linux is currently not a supported LiveCycle OS platform; only Red Hat Enterprise and SUSE Enterprise are.
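    Not from the thread, but as a quick sanity check: a JNDI lookup from a component deployed on the same JBoss instance can confirm whether the datasource is actually bound. The sketch below assumes the default java:/IDP_DS binding that JBoss creates for a local datasource named IDP_DS in a *-ds.xml file.

    import javax.naming.InitialContext;
    import javax.naming.NamingException;
    import javax.sql.DataSource;

    // Diagnostic sketch: call from a servlet or other in-container component.
    // "java:/IDP_DS" assumes JBoss's default local binding for the datasource.
    public class DataSourceCheck {
        public static void checkIdpDs() {
            try {
                InitialContext ctx = new InitialContext();
                DataSource ds = (DataSource) ctx.lookup("java:/IDP_DS");
                ds.getConnection().close(); // borrow and release a connection to prove the pool works
                System.out.println("IDP_DS is bound and reachable");
            } catch (NamingException e) {
                System.out.println("IDP_DS is not bound: " + e.getMessage());
            } catch (java.sql.SQLException e) {
                System.out.println("IDP_DS is bound but no connection could be opened: " + e.getMessage());
            }
        }
    }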

  • Looking for recommendations for SSO using Identity Management Suite

    Apparently there is more than one way to implement single sign-on, and I'm wondering if anyone has any recommendations. We want to use the user's CAC to authenticate, and we are using both a thick and a thin client with business intelligence software (not Oracle) for reporting; the thick app needs to log in, transfer data to, and pull data from the database. We don't want the user to have to enter their login information regardless of which piece of the application they are accessing. I was considering Enterprise Users, but I'm not sure this is the best solution.
    We plan on using OID to store the user's data and using the Identity Management suite with Access Manager. We need to use Label Security and row-level security combined with roles in the database. We plan on using Oracle's Advanced Security option for encryption, and we are using the 11g database.

    Great questions - swifer was made by me for me to use with arch.  But its dependencies are light and very common (you almost certainly have them already), so it should run on other distros well.  It does use something comparable to profiles for secure networks but much simpler - they are just wpa_supplicant.conf segments, but swifer writes them itself for simple cases (and most cases are simple cases; wep is a current exception) so generally you would have to pay no attention to them.
    As for permissions on the files, I've been meaning to get around to that - they've been 0644, but I just changed it to 0600 in the last revision which is now on github.
    EDIT: also keep in mind this is a development version.  I use it myself as my only networking tool on a netbook that I bring everywhere - and I have no issues.  But I also know how to connect manually in a bind: and I advocate for every user knowing the manual steps (ip, iwlist (if needed), iw / wpa_supplicant, dhcpcd/dhclient).

  • Path problem when use node manager to start Managed Server

    Hi, I have met a problem:
    My managed server's root directory is /usr/local/bea/cluster1/usr_projects/mydomain
    My Node Manager 's default directory is /usr/local/bea/cluster1/weblogic700/common/nodemanager
    I have some config files located under the managed server's root directory, and the log file should also be generated in the managed server's root directory.
    But when I try to use Node Manager to start up or shut down the Managed Server, I find that WebLogic cannot find the config files unless I move them from the managed server's root directory to the Node Manager's default directory, and the log file is also created under the Node Manager's default directory.
    It seems that the managed server's default directory has been changed to the Node Manager's default directory.
    What shall I do if I want Managed Server's default directory to be kept?

    Hi Lumin,
    If the weblogic version is 8.1, there is a RootDirectory field in Remote
    Start tab which should resolve your problem. Just enter the absolute
    directory pathname on the machine where you are starting your managed server
    and the current working directory of the managed server will be this
    RootDirectory. Before 8.1, the RootDirectory was used for finding the config
    file for weblogic but it was never used to change the working directory of
    the managed server created by the node manager.
    cheers,
    gaurav.
    "lumin" <[email protected]> wrote in message
    news:3eb7a4ae$[email protected]..
    >
    Hi, I have met a problem:
    My managed server's root directory is/usr/local/bea/cluster1/usr_projects/mydomain
    My Node Manager 's default directory is/usr/local/bea/cluster1/weblogic700/common/nodemanager
    >
    I have some config file located under managed server's root directory andlog
    file is also should be generated in managed server's root directory.
    But when I try to use node manager to startup/shutdown the Managed Server,I found
    weblogic can not find the config file if I don't move the config file fromManaged
    server's root directory to node manager server's default directory. andalso log
    file was created under node manager's default directory.
    It seems that managed server's default directory has been changed to nodemanager's
    default directory.
    What shall I do if I want Managed Server's default directory to be kept?

  • 'duplicate column name'-Exception when using identical objects

    Hi all,
    we're currently experiencing problems when using one single object instance for two different members of a mapped class. Here is an excerpt from our ToplinkMapping.java which shows the relevant parts:
    public ClassDescriptor buildQuotationDefDescriptor() {
        RelationalDescriptor descriptor = new RelationalDescriptor();
        descriptor.descriptorIsAggregate();
        descriptor.setJavaClass(de.hvb.ha.data.QuotationDef.class);

        // Descriptor Properties.
        descriptor.setAlias("QuotationDef");

        AggregateObjectMapping commonQtyMapping = new AggregateObjectMapping();
        commonQtyMapping.setAttributeName("commonQty");
        commonQtyMapping.setReferenceClass(de.hvb.ha.data.type.VolumeDT.class);
        commonQtyMapping.setIsNullAllowed(false);
        commonQtyMapping.addFieldNameTranslation("commonQty_value->DIRECT", "value->DIRECT");
        descriptor.addMapping(commonQtyMapping);

        AggregateObjectMapping wideningQtyMapping = new AggregateObjectMapping();
        wideningQtyMapping.setAttributeName("wideningQty");
        wideningQtyMapping.setReferenceClass(de.hvb.ha.data.type.VolumeDT.class);
        wideningQtyMapping.setIsNullAllowed(false);
        wideningQtyMapping.addFieldNameTranslation("wideningQty_value->DIRECT", "value->DIRECT");
        descriptor.addMapping(wideningQtyMapping);
    Well, if we now assign one object instance of VolumeDT to both members, e.g. like this:
    VolumeDT vol = new VolumeDT();
    quotationDef.commonQty = vol;
    quotationDef.wideningQty = vol;
    we end up with the mentioned SQL exception, because TopLink produces the following SQL statement:
    UPDATE T_QUOTATIONTARGET SET WIDENING_QUANTITY = 0.0, WIDENING_QUANTITY = 0.0 WHERE …
    but we've expected something like:
    UPDATE T_QUOTATIONTARGET SET COMMON_QUANTITY = 0.0, WIDENING_QUANTITY = 0.0 WHERE …
    Any idea how we can prevent this behavior and still use one object reference for both members?
    Appreciate any help!

    Thanks for the reply, but unfortunately the mentioned change didn't fix the problem. The SQL statement produced by TopLink still looks like the one mentioned above. Moreover, the mentioned change couldn't be managed by the Workbench (which we use in the project to create the mappings), could it?
    We currently use TopLink version 10.1.3.3.
    Since you asked, I'm posting the relevant parts of the parent mapping (and its parent in turn), so sorry for the verbose post:
    public ClassDescriptor buildQuotationORMWrapperDescriptor() {
        RelationalDescriptor descriptor = new RelationalDescriptor();
        descriptor.setJavaClass(de.hvb.ha.server.businessobjects.techapi.instrument.toplink.quotation.QuotationORMWrapper.class);
        descriptor.addTableName("T_QUOTATION");
        descriptor.addPrimaryKeyFieldName("T_QUOTATION.SEQ_KEY");

        // Descriptor Properties.
        descriptor.useSoftCacheWeakIdentityMap();
        descriptor.setIdentityMapSize(300);
        descriptor.useRemoteSoftCacheWeakIdentityMap();
        descriptor.setRemoteIdentityMapSize(300);
        descriptor.setSequenceNumberFieldName("T_QUOTATION.SEQ_KEY");
        descriptor.setSequenceNumberName("Quotation");
        descriptor.setAlias("QuotationORMWrapper");

        // Query Manager.
        descriptor.getQueryManager().checkCacheForDoesExist();

        OneToManyMapping quotationDefsMapping = new OneToManyMapping();
        quotationDefsMapping.setAttributeName("quotationDefs");
        quotationDefsMapping.setReferenceClass(de.hvb.ha.server.businessobjects.techapi.instrument.toplink.quotation.QuotationDefORMWrapper.class);
        quotationDefsMapping.dontUseIndirection();
        quotationDefsMapping.privateOwnedRelationship();
        quotationDefsMapping.useCollectionClass(java.util.ArrayList.class);
        quotationDefsMapping.addAscendingOrdering("orderBy");
        quotationDefsMapping.addTargetForeignKeyFieldName("T_QUOTATIONTARGET.SEQ_KEY", "T_QUOTATION.SEQ_KEY");
        descriptor.addMapping(quotationDefsMapping);
    and now the parent of QuotationORMWrapper:
    public ClassDescriptor buildPersistentInstrumentDescriptor() {
        RelationalDescriptor descriptor = new RelationalDescriptor();
        descriptor.setJavaClass(de.hvb.ha.server.businessobjects.techapi.instrument.toplink.PersistentInstrument.class);
        descriptor.addTableName("T_INSTRUMENT");
        descriptor.addPrimaryKeyFieldName("T_INSTRUMENT.INSTR_ID");
        descriptor.addPrimaryKeyFieldName("T_INSTRUMENT.INSTR_KEYTYPE");

        // Descriptor Properties.
        descriptor.useSoftCacheWeakIdentityMap();
        descriptor.setIdentityMapSize(300);
        descriptor.useRemoteSoftCacheWeakIdentityMap();
        descriptor.setRemoteIdentityMapSize(300);
        descriptor.setAlias("PersistentInstrument");

        OneToOneMapping quotationMapping = new OneToOneMapping();
        quotationMapping.setAttributeName("quotation");
        quotationMapping.setReferenceClass(de.hvb.ha.server.businessobjects.techapi.instrument.toplink.quotation.QuotationORMWrapper.class);
        quotationMapping.useBasicIndirection();
        quotationMapping.privateOwnedRelationship();
        quotationMapping.addTargetForeignKeyFieldName("T_QUOTATION.INSTR_ID", "T_INSTRUMENT.INSTR_ID");
        quotationMapping.addTargetForeignKeyFieldName("T_QUOTATION.INSTR_KEYTYPE", "T_INSTRUMENT.INSTR_KEYTYPE");
        descriptor.addMapping(quotationMapping);
    I hope I didn't miss any relevant parts. By the way, where could I log a bug for this issue, and can it be considered a bug?
    Thanks in advance!
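    Not a confirmed fix from this thread, but for reference: TopLink aggregate mappings generally assume each source attribute owns its own aggregate target instance, so one commonly suggested workaround (at the cost of the single shared reference asked about above) is to give each member its own VolumeDT and copy the state across. A minimal sketch; VolumeDT's accessors are not shown in the excerpts, so the copy step is left as a placeholder.

    // Workaround sketch (assumption): give each aggregate member its own
    // VolumeDT instance so the two AggregateObjectMappings write their own
    // columns independently.
    VolumeDT vol = new VolumeDT();
    quotationDef.commonQty = vol;

    VolumeDT wideningCopy = new VolumeDT();
    // copy whatever state VolumeDT carries from 'vol' here (accessors are omitted above)
    quotationDef.wideningQty = wideningCopy;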

  • 500 "Server error" when use User Management Console

    Hello all, I'm starting with Hyperion products this week.
    I installed Shared Services and Essbase – System 9.
    I deployed Shared Services on OAS 10.1.3.1.
    And now when I use the User Management Console and try to open the 'Projects' node, I get a 500 "Server error".
    Why? What's wrong? Maybe I have to install additional components to use Projects?
    Please help me resolve this issue.

    I found the following error in the OC4J instance where the "User Management Console" is deployed.
    The log file is located in $OAS_ORACLE_HOME/j2ee/Hyperion/application-deployments/SharedServices9/Hyperion_default_group_1/application.log
    07/11/16 10:57:16.633 interop: Error preloading servlet
    javax.servlet.ServletException: Error instantiating servlet 'integration'. Servlet class com.hyperion.interop.webservices.integration.IntegrationWebservice not found in web-application interop
         at com.evermind.server.http.HttpApplication.servletClassNotFound(HttpApplication.java:4866)
         at com.evermind.server.http.HttpApplication.findServlet(HttpApplication.java:4832)
         at com.evermind.server.http.HttpApplication.findServlet(HttpApplication.java:4734)
         at com.evermind.server.http.HttpApplication.initPreloadServlets(HttpApplication.java:4922)
         at com.evermind.server.http.HttpApplication.initDynamic(HttpApplication.java:1134)
         at com.evermind.server.http.HttpApplication.<init>(HttpApplication.java:738)
         at com.evermind.server.ApplicationStateRunning.getHttpApplication(ApplicationStateRunning.java:414)
         at com.evermind.server.Application.getHttpApplication(Application.java:545)
         at com.evermind.server.http.HttpSite$HttpApplicationRunTimeReference.createHttpApplicationFromReference(HttpSite.java:1990)
         at com.evermind.server.http.HttpSite$HttpApplicationRunTimeReference.<init>(HttpSite.java:1909)
         at com.evermind.server.http.HttpSite.initApplications(HttpSite.java:645)
         at com.evermind.server.http.HttpSite.setConfig(HttpSite.java:290)
         at com.evermind.server.http.HttpServer.setSites(HttpServer.java:270)
         at com.evermind.server.http.HttpServer.setConfig(HttpServer.java:177)
         at com.evermind.server.ApplicationServer.initializeHttp(ApplicationServer.java:2450)
         at com.evermind.server.ApplicationServer.setConfig(ApplicationServer.java:998)
         at com.evermind.server.ApplicationServerLauncher.run(ApplicationServerLauncher.java:131)
         at java.lang.Thread.run(Thread.java:595)
    07/11/16 10:57:17.334 interop: Error preloading servlet
    javax.servlet.ServletException: Cannot initialize Hyperion DSF. Error:
         Missing class: org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
         Dependent class: com.hyperion.workflow.engine.server.services.modelrepository.WfProductSchemaParser
         Loader: SharedServices9.web.interop:0.0.0
         Code-Source: /u0/SOA/SOA_Suite/j2ee/Hyperion/applications/SharedServices9/interop/WEB-INF/lib/wf_eng_server.jar
         Configuration: WEB-INF/lib/ directory in /u0/SOA/SOA_Suite/j2ee/Hyperion/applications/SharedServices9/interop/WEB-INF/lib
    The missing class is not available from any code-source or loader in the system.
         at com.hyperion.dsf.server.framework.DsfServer.init(DsfServer.java:52)
         at com.evermind.server.http.HttpApplication.loadServlet(HttpApplication.java:2361)
         at com.evermind.server.http.HttpApplication.findServlet(HttpApplication.java:4810)
         at com.evermind.server.http.HttpApplication.findServlet(HttpApplication.java:4734)
         at com.evermind.server.http.HttpApplication.initPreloadServlets(HttpApplication.java:4922)
         at com.evermind.server.http.HttpApplication.initDynamic(HttpApplication.java:1134)
         at com.evermind.server.http.HttpApplication.<init>(HttpApplication.java:738)
         at com.evermind.server.ApplicationStateRunning.getHttpApplication(ApplicationStateRunning.java:414)
         at com.evermind.server.Application.getHttpApplication(Application.java:545)
         at com.evermind.server.http.HttpSite$HttpApplicationRunTimeReference.createHttpApplicationFromReference(HttpSite.java:1990)
         at com.evermind.server.http.HttpSite$HttpApplicationRunTimeReference.<init>(HttpSite.java:1909)
         at com.evermind.server.http.HttpSite.initApplications(HttpSite.java:645)
         at com.evermind.server.http.HttpSite.setConfig(HttpSite.java:290)
         at com.evermind.server.http.HttpServer.setSites(HttpServer.java:270)
         at com.evermind.server.http.HttpServer.setConfig(HttpServer.java:177)
         at com.evermind.server.ApplicationServer.initializeHttp(ApplicationServer.java:2450)
         at com.evermind.server.ApplicationServer.setConfig(ApplicationServer.java:998)
         at com.evermind.server.ApplicationServerLauncher.run(ApplicationServerLauncher.java:131)
         at java.lang.Thread.run(Thread.java:595)
    07/11/16 10:57:17.482 interop: 10.1.3.1.0 Started
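    Not from the thread, but a quick way to narrow down the "Missing class: org.apache.xerces.jaxp.DocumentBuilderFactoryImpl" error is a Class.forName probe run inside the same interop web application (for example from a scriptlet or servlet), which shows whether that application's class loader can see the parser at all:

    // Diagnostic sketch (assumption): deploy and run inside the SharedServices9
    // interop web application to check class-loader visibility of the parser.
    public class XercesProbe {
        public static String probe() {
            try {
                Class.forName("org.apache.xerces.jaxp.DocumentBuilderFactoryImpl");
                return "Xerces parser class is visible on the classpath";
            } catch (ClassNotFoundException e) {
                return "Xerces parser class is NOT visible; it would need to be added to WEB-INF/lib or a shared library";
            }
        }
    }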

  • Import Fails When Using Transport Manager ID

    Hi Experts,
    When carrying out an import request in the NWDI Change Management Service under 'Consolidation' with an ID (ztesttpt) which has been assigned only the 'NWDI.Operator' role, we get the errors below and the import fails:
    'TCSDeployException_Communication: Server cdbaxd08 did not accept login request as apiadmin on port 50018'
    'Caller ztesttpt not authorized, only role administrators is allowed to access JMX'
    Note that, as per the SAP NWDI document 'How To Setup NWDI Permissions and Roles', we have assigned the following actions to the 'NWDI.Operator' role:
    CBS.Administrator
    CMS.CriticalFunctions
    CMS.Display
    CMS.Transport
    However, when using the ID 'nwdi_cmsadm' (which has FULL authorizations) to do the import, there is no issue.
    Are there any additional actions/roles that need to be assigned to the ztesttpt ID?
    Any advice or comments would be greatly appreciated.

    Abhishek,
    check this link
    http://help.sap.com/saphelp_nw70/helpdata/en/46/5b8c954bb04cae84a21793ad9b4c92/frameset.htm
    Thanks
    Bala Duvvuri

  • Native IO Disabled when using node manager to start managed server

    Hi,
    I am able to start my node manager and the managed server from the Admin Console. The OS is AIX, with WebLogic 9.2 MP3. The managed server starts OK, but Native IO is disabled even though it is enabled in the Admin Console for that managed server:
    ####<Jul 28, 2009 8:25:00 AM CDT> <Info> <Socket> <pstps09.statefarm.com> <PIA2> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1248787500274> <BEA-000447> <Native IO Disabled. Using Java IO.>
    I set NativeVersionEnabled to False in nodemanager.properties as specified in the BEA documentation, but cannot get Native IO to work. The log file clearly indicates that the JDK version is 32-bit:
    java.fullversion = J2RE 1.5.0 IBM J9 2.3 AIX ppc-32 j9vmap3223-20071007 (JIT enabled)
    Any ideas how to fix this or is this working as designed for AIX platforms?
    Thanks!
    -Mary Manchukian

    The following document asks to set NativeVersionEnabled to False:
    http://edocs.bea.com/wls/docs61/adminguide/remotestart.html#1043956:
    The Node Manager is available for use only on Windows and UNIX platforms. Native libraries are available for running the Node Manager on Windows, Solaris, HP-UX, AIX and Red Hat Linux operating systems. For UNIX operating systems other than Solaris and HP UX, you will need to use the following argument on the java command line when starting the Node Manager:
    -Dweblogic.nodemanager.nativeVersionEnabled=false
    If we do not set NativeVersionEnabled to False, the managed server doesn't start at all and fails with the error:
    java.io.IOException: Server failed to start up. See server output log for more details.
    at weblogic.nodemanager.server.ServerManager.start(ServerManager.java:296)
    at weblogic.nodemanager.server.Handler.handleStart(Handler.java:511)
    at weblogic.nodemanager.server.Handler.handleCommand(Handler.java:115)
    at weblogic.nodemanager.server.Handler.run(Handler.java:66)
    at java.lang.Thread.run(Thread.java:810)
    All I want is to be able to start the managed server from the Admin Console > Remote Start tab.
    Thanks,
    -Mary

  • When using Power Management i get 'no data'

    I'm using all sorts of reports in SCCM and all are working except the power management reports. I get the following message: 'no data'
    The following options are turned on:

    The CM client is installed on every workstation and the resource explorer is populated. Is it possible to force the report instead of waiting +/- 30 days?
    It is still unclear to me whether your environment is healthy.
    You say the query above only shows you 1 computer but every computer has the client installed. How many computers do you see listed in “Count operating system versions”?
    For the computer shown in the resource explorer screenshot, do you get any results for “Power Management - Computer activity by computer” for Feb 7 2015?
    Until the data is collected from all of your computers and until you have several days’ worth of data, some reports will not fully work. There is no way to hurry it up.
    Garth Jones | My blogs: Enhansoft and Old Blog site | Twitter: @GarthMJ
    1. Count Operating System Versions:
    Microsoft Windows 7 Enterprise: 932
    Microsoft Windows 8.1 Enterprise: 1
    Microsoft Windows Server 2008 R2 Enterprise: 1
    Microsoft Windows XP Professional: 13
    2. For the computer in the screenshot (and other workstations) I get "no data". When I click through for detailed information, I do get some data, like the Power Settings Plan.

  • Flash version problems when using Extension Manager

    I am deaf so I wanted to try the extension at
    http://www.adobe.com/cfusion/exchange/index.cfm?event=extensionDetail&loc=en_us&extid=1182518
    in case any web sites I visit happen to have provided captioning.
    When I try to install the extension with Extension Manager CS5, it halts with a message saying that I need Flash version 8 or higher.  Downgrading from Flash version 10.1 to 9.x did not help.  Further downgrading to version 8.42 did not help either.  Yes, I did uninstall Flash each time before installing the next version.
    I spent some time trying to find help in "Adobe Community Help", but that program is useless because the font is so tiny and it does not provide a way for the user to select a usable font size.   Reducing my display resolution enough to make "Adobe Community Help" readable renders everything else useless because menus, buttons, fonts, dialog boxes, etc are now so freaking huge.

    I do not have the "Creative Suite" in any version.   I thought I was trying to install an extension for viewing content - I have no interest in creating anything. 
    I went back to the link I posted and it still makes me think it is an extension for displaying captions and that there is a different extension for creating them.
    If it really is not an extension that allows end users to read captions, then I've been wasting both of our time. I'll just go back to nagging web sites - news sites in particular - to stop using Flash and switch to proper video formats that allow captioning for deaf users.

  • Integrating third party applications using Identity Management

    Hi,
    I have a 3rd-party application (PeopleSoft) which my customers have been using for a long time, and I have two ADF applications (say Apps1 and Apps2) running on a single domain in my WebLogic 10.3 (11gR1) application server. I have configured the security realm of my WLS 10.3 to authenticate users against an LDAP store (installed using OIM 11g).
    When I log into Apps1 and then access Apps2, it automatically logs me into Apps2 as the current user.
    But when I try to access a 3rd-party application like PeopleSoft from Apps1, it takes me to the PeopleSoft login page again, even though all the PeopleSoft users are synchronized with the LDAP store.
    I believe the authentication I have configured in my WLS is scoped to Apps1 and Apps2 only. As PeopleSoft is (or can be) hosted on another application server, I think I need to send extra information when calling the PeopleSoft application to validate and authenticate the current user, so that I can continue with the current session without having to log in again.
    Please let me know if you have ever encountered this kind of scenario. (I believe this is a general scenario for an integration solution in Fusion.)
    Also please suggest if I need to use any other aspects of OIM.
    Thanks,
    Harikiran.

    Hi Hari,
    It seems like a normal SSO scenario and you would need OAM for this.
    I am not an expert in ADF apps, but are they deployed on WebLogic Server? If so, you would need the SSPI integration of WebLogic with OAM. This integration will help you achieve SSO for WLS and non-WLS resources using OAM. In addition, you need to integrate PeopleSoft with OAM if you want PeopleSoft to participate in SSO as well.
    You can use the same LDAP that you are using for the apps as the user store for OAM.
    Hope this helps.
    Mahendra.

  • Unable to mail-enable a document library when using Directory Management Service in SharePoint 2013

    I'm not able to mail-enable a document library while the Directory Management Service is enabled. This behavior only happens when the Directory Management Service is turned on.
    Correlation ID: a8c7b29c-d193-90b5-ae14-64cd1143445f
    Note that I have the OU created and permissions setup properly according to MS official documentation.

    Hi,
    According to your post, my understanding is that you failed to mail-enable a document library while the Directory Management Service is enabled.
    Please make sure you configure the incoming email correctly.
    For more information, you can refer to:
    https://hosting.intermedia.net/support/kb/default.asp?id=2439
    http://davecoleman146.com/2010/10/20/how-to-setup-mail-enabled-document-libraries-in-sharepoint-2010-part-1/
    If so and the error message persists, please check the SharePoint ULS log to find more information about this error, the ULS log file is in the location: C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\LOGS
    You can check the ULS log by the methods here:
    http://blog.credera.com/technology-insights/microsoft-solutions/troubleshooting-sharepoint-errors/
    Best Regards,
    Linda Li
    Linda Li
    TechNet Community Support

  • ORA-14551 when using rules manager inside a SELECT * FROM TABLE(myFunction)

    Hello
    I want to return rows from a pipelined function, map this to a view or a BC4J object, and query it in my application.
    I want to populate the rows based on many evaluations; to achieve this, I want to include the Rules Manager functionality.
    I have defined my event, rule class and result view successfully, but... when I run my query, I have the following error:
    ORA-14551: cannot perform a DML operation inside a query
    We believe this is caused when the engine tries to populate the results view with the matching rules.
    Is there a workaround for this?
    Please help
    Thanks in advance.
    Alex.

    Alex,
    I cannot think of any workaround that would allow you to evaluate the rules and return the results with a single query. A Rules Manager application with composite events modifies the state of the database (for maintaining incremental state as well as the results view) when some events are processed. So, you will not be able to pipeline the results to a SQL query. If you can at least separate the ADD_EVENTS call from the rest of the logic this may be possible.
    Hope this helps,
    -Aravind.
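    Building on that suggestion, here is a minimal JDBC sketch of the separated flow (the procedure, view, and connection details are placeholders, not the Rules Manager API): run the event-adding DML in its own call first, commit, and only then query the results view, so the SELECT itself no longer performs DML and ORA-14551 does not apply.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Sketch only: "add_my_events" stands in for whatever procedure wraps the
    // ADD_EVENTS logic mentioned above, and "my_results_view" for the
    // configured results view; connection details are placeholders too.
    public class RulesManagerFlow {
        public static void main(String[] args) throws Exception {
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//localhost:1521/ORCL", "user", "password");
            try {
                conn.setAutoCommit(false);
                CallableStatement cs = conn.prepareCall("{ call add_my_events(?) }");
                cs.setString(1, "event payload");
                cs.execute();            // DML happens here, outside any query
                cs.close();
                conn.commit();
                Statement st = conn.createStatement();
                ResultSet rs = st.executeQuery("SELECT * FROM my_results_view");
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
                rs.close();
                st.close();
            } finally {
                conn.close();
            }
        }
    }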
