Object not exported for Remote object in WSAD

Hi,
I have a few classes that implement the java.rmi.Remote interface. The JAR containing them is placed under WEB-INF/lib of a web application.
I am using WSAD 5.1.1. When I start the server, it gives me the following error.
Can somebody make anything of this? Please help.
Error Stack:
[5/3/05 12:07:04:188 IST] 3d410866 Helpers W NMSV0610I: A NamingException is being thrown from a javax.naming.Context implementation. Details follow:
Context implementation: com.ibm.ws.naming.jndicos.CNContextImpl
Context method: rebind
Context name: localhost/nodes/localhost/servers/server1
Root exception is java.rmi.NoSuchObjectException: object not exported
I tried generating the stub and skeleton classes manually and adding them to the JAR file, but I still get the same error. The JAR is on the server's classpath.
I also tried creating a simple remote object and binding it:
Hashtable propertiesMap = new Hashtable();
propertiesMap.put(Context.INITIAL_CONTEXT_FACTORY,"com.sun.jndi.cosnaming.CNCtxFactory");
propertiesMap.put(Context.PROVIDER_URL, "iiop://localhost:2809" );
Context context= new InitialContext(propertiesMap);
myRemote instance = new myRemote();
context.bind("TestString",instance);
Even this gives:
javax.naming.ConfigurationException: Problem with PortableRemoteObject.toStub(); object not exported or stub not found. Root exception is java.rmi.NoSuchObjectException: object not exported
at sun.rmi.transport.ObjectTable.getStub(ObjectTable.java:115)
at java.rmi.server.RemoteObject.toStub(RemoteObject.java:88)
at com.ibm.rmi.util.JDKBridge.getJRMPStub(JDKBridge.java:129)
at com.ibm.rmi.javax.rmi.PortableRemoteObject.toStub(PortableRemoteObject.java:138)
at javax.rmi.PortableRemoteObject.toStub(PortableRemoteObject.java:105)
at java.lang.reflect.Method.invoke(Native Method)
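For reference, my understanding from the RMI-IIOP documentation is that the servant has to be exported before PortableRemoteObject.toStub() (and therefore the naming bind) can locate it, either by extending javax.rmi.PortableRemoteObject or by calling PortableRemoteObject.exportObject() explicitly. Below is a rough sketch of what I believe is expected; MyRemote, MyRemoteImpl and the JNDI name are just placeholders, and the IIOP stubs would still have to be generated with rmic -iiop.

import java.rmi.Remote;
import java.rmi.RemoteException;
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.rmi.PortableRemoteObject;

// Hypothetical remote interface, just to illustrate the export step.
interface MyRemote extends Remote {
    String ping() throws RemoteException;
}

// Extending PortableRemoteObject exports the servant in its constructor.
// Without that (or an explicit PortableRemoteObject.exportObject(this)),
// toStub() fails with java.rmi.NoSuchObjectException: object not exported.
class MyRemoteImpl extends PortableRemoteObject implements MyRemote {
    public MyRemoteImpl() throws RemoteException {
        super(); // exports this servant
    }
    public String ping() throws RemoteException {
        return "pong";
    }
}

public class BindTest {
    public static void main(String[] args) throws Exception {
        Hashtable env = new Hashtable();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.cosnaming.CNCtxFactory");
        env.put(Context.PROVIDER_URL, "iiop://localhost:2809");
        Context context = new InitialContext(env);
        // rebind calls PortableRemoteObject.toStub() under the covers,
        // which only works if the servant has been exported.
        context.rebind("TestString", new MyRemoteImpl());
    }
}

If the servant cannot extend PortableRemoteObject (for example because it already has another superclass), I believe calling PortableRemoteObject.exportObject(this) from its constructor is the usual alternative.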
-sam

Hi,
Not sure whether my question was unclear ... Can somebody give me some hints, please?
-sam

Similar Messages

  • The FM "DD_DOMA_GET" not released for 'remote' calls.

    Hi,
    When I try to import RFCs from my CRM system in XI, I am getting the following exception:
    com.sap.aii.ibrep.sbeans.upload.RemoteUploadException: The function module "DD_DOMA_GET" not released for 'remote' calls.
    Can anyone suggest a solution?
    Regards,
    Mahesh.

    Hi,
      Making the function module remote-enabled will solve this.
    See my reply (Anirban) and Jacob's reply in this thread (although that case was with an IDoc):
    Unable to import SAP objects
    Regarding "We tried to make it Remote Enabled but it asks for 'Access Key'. How to go further?":
    You need a developer access key to change the FM in SE37. You can get one in the marketplace; follow this thread:
    Whats Development Key
    Regards,
    Anirban.

  • The connection was denied because the user account is not authorized for remote login

    Using Terminal Server 2008, I am not able to get non-administrator users to log in to the remote desktop. I have tried from Windows Server 2008 and from Windows Server 2003. From Windows Server 2008 I get the error "The connection was denied because the user account is not authorized for remote login", and from Windows Server 2000 I get the error "The requested session access is denied".

    Is that seriously the only way to do this? Doesn't this render the "Allow log on through Terminal Services" GP Setting useless?
    I would like to know this answer as well. I have created a new AD group for my assistant admins called "Domain Admins (limited)". I have added this group to the GP setting "Allow log on through Terminal Services", but the assistant admins cannot log in through RDP. It 'feels like' this is all I would need to do.
    Craig
    Found some good info here. There are really two things required for a user to connect to a server via RDP. You can configure one of them via Group Policy but not the other.
    1) Allow log on through Terminal Services can be configured through Group Policy, no problem.
    2) Permissions on the RDP listener must also be granted. If your user is a member of the local Administrators group or the local Remote Desktop Users group, then this is handled. If you are trying to utilize a new, custom group (as I am), then there isn't a way to do this via Group Policy (that I have found).
    EDIT: Found the answer. I am creating a blog post to outline the steps. They aren't hard, but they're not self-explanatory. It deals with the Restricted Groups mentioned above, but it's still automatable using Group Policy, so you don't have to touch each computer. I think the above poster (Andrey Ganev) got it right, but I had trouble deciphering his instructions.
    Here is my blog post that walks through this entire process, step-by-step.

  • Keychain not updated for Remote Login

    Since installing Lion on both machines: when I connect to my G5 Power Mac from my MacBook Air, I use the Keychain to remember my password. This feature worked in previous OS versions by selecting the 'Remember' option in the dialog (meaning you would only see the following dialog when your password changed on the destination machine).
    With Lion, the password in the Keychain is not updated when the flag is set. As a result, when I select the destination machine in the Finder, I always have to wait for the 'Not Connected' message (while the process tries to log in with my old password). Then I have to choose 'Connect As...' and enter my current password (every rassafrassin' time).
    Can someone please patch this thing.
    Thanks,
    g

    I'm having some trouble with an RD server (Windows 2008) on a domain. I have a group called domain\authorizedpeople that I would like to enable remote access for. I added this group to the GPO setting Computer Configuration\Policies\Windows Settings\Security Settings\Local Policies\User Rights Assignment\Allow log on through Terminal Services. I also added this group under Server Manager > Configure Remote Desktop on the server itself, and I added it to the Remote Desktop Users group on the server for good measure.
    When I try to log on using an account in that group, I get "The connection was denied because the user account is not authorized for remote login". However when I go to server manager > configure remote desktop and add that specific user, it works fine.
    Is there a reasonable explanation for this? I really don't want to have to add...

  • AD SSO not happening for Remote Users

    Dear Members
    I am having an issue with a NAC deployment for remote users (users behind a WAN router).
    Windows AD SSO (2008) works successfully for LAN users; however, remote users are not able to do AD SSO.
    I have ensured that remote users, even in the unauthenticated state, can reach Active Directory; there is no filtering on any device across the path for this communication.
    When I use Kerbtray on the remote PC, I find no tickets at all (I am logged in through the domain).
    What could be going wrong? Is it the delay (as they are WAN users) that might be contributing to this issue, and if so, which parameters can be tuned for AD SSO to happen?
    Any help will be highly appreciated.
    thanks
    Ahad

    Hi Ahad,
    As long as ALL the policies in Table 8-1 are configured for the Unauthenticated Role
    http://www.cisco.com/en/US/docs/security/nac/appliance/configuration_guide/48/cas/s_adsso.html#wp1174219
    the CAS should be out of the picture as far as the communication between the PC and Kerberos is concerned.
    If the Kerbtray.exe output for a failing user is empty, it means that the unsuccessful users do not have any Service Ticket (ST) at all.
    This points to an issue with AD (considering the fact that the CAS is already allowing all the traffic to/from AD).
    The failing users are either unable to send the Ticket-Granting Ticket (TGT) to AD, or they are unable to obtain the Service Ticket (ST) from AD.
    The CAS during this phase is neither performing any actions nor blocking any traffic, since all the communications to/from AD are already fully open in the unauthenticated role.
    Regards,
    Fede
    If  this helps you and/or answers your question please mark the question as  "answered" and/or rate it, so other users can easily find it.

  • Backupeventlog not working for remote system in PowerShell

    I have the following code snippet that works if I specify localhost or the name of the local computer, but does not work if set to a remote server. I am running the script as Administrator using my Domain Admin account. Both local and remote systems
    are Windows Server 2012 and in the same domain. PowerShell version = 3.0.
    Param(
        [parameter(Mandatory=$true,ValueFromPipeline=$true)]
        [ValidateScript({Test-Connection $_})]
        [String]
        $Server,
        [parameter(Mandatory=$true,ValueFromPipeline=$true)]
        [String]
        $LogFileName,
        [parameter(Mandatory=$true,ValueFromPipeline=$true)]
        [ValidateScript({Test-Path $_ -PathType 'Container'})]
        [String]
        $OutputFolder
    )
    # -ErrorVariable takes the variable name without the $ prefix
    $EventLog = Get-WmiObject win32_nteventlogfile -ComputerName $Server -Filter "LogFileName = '$LogFileName'" -ErrorAction SilentlyContinue -ErrorVariable ResultError
    If ($ResultError) {
        Write-Host "ERROR: unable to read event logs from $Server."
    } Else {
        Write-Host "Backing up $Server."
        $EventLog.BackupEventLog("$OutputFolder\$Server.evtx")
    }

    After spending some more time troubleshooting this and doing some more research, I found the link below. I found it strange how many hoops the author went through to handle event logs on remote systems, until I realized that the path passed to BackupEventLog is relative to the remote system, not the system running the script. So I checked the other systems and found that the event log backups had been created there. This would explain the return value of 80. What's worse is that the path for BackupEventLog can't be a UNC path; I tried, and it didn't work.
    Back Up Your Event Logs with a Windows PowerShell Script
    https://technet.microsoft.com/en-ca/magazine/2009.07.heyscriptingguy.aspx
    I decided to re-read Microsoft's documentation and found the magical text that I had overlooked:
    BackupEventlog method of the Win32_NTEventlogFile class
    https://msdn.microsoft.com/en-us/library/aa384808(v=vs.85).aspx
    In addition, you must make backups to the local computer; you cannot save a backup of the event logs on Computer A to Computer B. Backups are implemented by using the LocalSystem account, which does not have the network credentials necessary to access remote
    computers. If you want to save backups to a central repository, modify the script to first perform the backup, and then move the backup file to the central repository.
    jrv:
    I would mark your post of Wednesday, March 04, 2015 9:45 PM as helpful if this option were available. Your comment got me to dig into this deeper and discover
    the root cause.
    Thanks.

  • Audio will not export for clips longer than one minute

    I am running Adobe Premiere Pro CS3 on my HP Pavilion, with Windows XP Service Pack 2.
    Until recently I had no problems, but the other day I tried exporting from Adobe Media Encoder with these settings:
    Format: MPEG-2
    Widescreen 16:9
    NTSC
    The bit rate varies; I've tried larger and smaller bit rates, and it doesn't help.
    Suddenly it would render my full video with an .mpg extension, but there was no audio. I ran it again with a shortened clip, and it came out perfectly fine, audio intact. I lengthened the sequence again and it won't export audio. It's not that the audio box isn't checked, and it's not rendering the audio as a separate file like it used to. I do a LOT of editing and run my own online show, so it is very important for me to be able to render these videos like this. I could always render as AVI, but that takes up so much space, and most of the other formats are too low quality for me to import again and edit, or even just to upload for people to see. Can anyone help me?

    Yes, HDD = Hard Disk Drive. One often sees it listed as HD, but that can also stand for High Def, so I use the HDD abbreviation myself.
    Now, how you have things set up, that 15GB of free space is probably not nearly enough. This is especially if one has their Windows Page File dynamically managed by the OS (default method), as it is very, very likely that the Page File is filling up the remainder of the HDD.
    If everything is on the external, media, Project, Scratch Disk, Export folders, etc., one might be able to squeak by with that overloaded HDD, but I need to point out - at about 75% of capacity, the HDD's performance will start to slow, and this will intensify, as more is used. This overfilling can result in failure. I would look into cleaning up the C:\.
    Also, when video editing, a 2x physical (not partitions) HDD setup is the bare minimum, and a 3x HDD setup is even better.
    Some get around this, a bit, by using eSATA externals, which will be almost as good as the internals. Not sure that I would recommend any other connection type though. I use FW-800 externals, but have a minimum of a 3x internal on my laptop and an 8x on my workstation, so there's always plenty of internal storage. I just use the externals, to migrate the Projects between my computers.
    Good luck, and pay really, really close attention to that filled C:\.
    Hunt

  • WHENEVER SQLERROR does not work for remote db?

    Hi,
    I've encountered a problem where an error on a remote DB (accessed via a database link) does not force SQL*Plus to exit and report the error.
    Local machine: Sun Solaris
    Local db: 8.1.7.2
    Remote machine: Windows NT, 2000 or XP (I don't know).
    Remote db: 9.2.0.5
    Call to SQL*PLUS: sqlplus /NOLOG @$BASE_DIR/tools/bin/xxx.sql
    Relevant piece of code in xxx.sql:
    connect cemis/cemis@loc9280
    whenever sqlerror exit SQL.SQLCODE
    set heading off
    set trimspool on
    SET PAGES 0
    SET LINESI 250
    SET ECHO OFF
    SET VERIFY OFF
    set termout off
    set feedb off
    set recsep off
    rem *** Do the work ***
    whenever sqlerror exit SQL.SQLCODE
    rem Get rid of old data
    TRUNCATE TABLE local_table;
    rem fill table via db-link
    INSERT INTO local_table (
    col1,
    col2
    )
    SELECT
    rem_col1,
    rem_col2
    FROM rem_user.rem_table@dbl_name;
    COMMIT;
    spool file.csv
    SELECT 'Spalte1' || chr(9) || 'Spalte2'
    FROM dual;
    SELECT col1, col2 FROM local_table;
    spool off
    exit 0
    In our environment the script runs through without any error, but after execution file.csv is not there, and there are no errors in the log file.
    I tried to select data from the remote db manually and ran into this:
    'ORA-01017: invalid username/password; logon denied' followed by
    'ORA-02063: preceding line from dbl_name'.
    Do you have any idea why this error does not cause SQL*Plus to exit?
    Does WHENEVER SQLERROR only work properly with local errors?
    I would be grateful for any information regarding this.
    Regards,
    Guido

    I believe this is a limitation in SP3, but tell me, are you running with a simple producer or a complex producer?

  • "DD_DOMA_GET" not released for 'remote' calls: Advice required

    Hi,
    This is one of the common messages that I (and many SDNers) have received when trying to deploy an application on WAS SP12.
    I read the messages, and it was quite obvious that DD_DOMA_GET needs to be RFC-enabled and then this would work fine.
    However, I also noticed that a colleague of mine could deploy applications without receiving this error. He has a model in his program and executes the function module in his Web Dynpro program as well, with absolutely no issues.
    When I tried, I received the above error constantly. I compared his program and mine and found a subtle difference: the view controller's context attributes were structurally bound in my case, whereas in his case he used standard types such as string, boolean, etc. Therefore, I concluded that every time my application executes and sees a com.xx.app.type.ModelField for a view context attribute, it uses the JCo destination WD_RFC_METADATA_DEST and calls the function module DD_DOMA_GET to fetch the type and length of the field. In this case, DD_DOMA_GET is not RFC-enabled and therefore returns with an exception.
    My question is: is this workaround, i.e. not using structure binding for the context attributes of a view and using simple types such as string, boolean, etc., beneficial? What are the drawbacks of this workaround?
    Best Regards,
    Subramanian V.

    Good point, Valery, and what you assume is also right. We used a particular SAP system where DD_DOMA_GET was RFC-enabled and then changed the JCo destinations.
    I am not clear about your second point, particularly in relation to my question.
    The two points that I made in my first and second posts (I should have made them in the same post, sorry about that) are required to avoid that error:
    a) The type of a View->Context->Value Attribute should not refer to the model (e.g. com.aq.model.types.xubname), especially when there is a UI element binding.
    b) The custom controller should be bound to the model, not the component controller.
    I am not even fully sure whether there are any other preconditions for avoiding the DD_DOMA_GET error, but I tested with the above two points and my program executes properly.
    From your point:
    "Next, using Adaptive RFC model in described manner is almost the same as using SAP Enterprise Connector instead while any dictionary-related functionality is not used (see Re: Poll : SAP Enterprise Connector, second item in list)"
    I understand that, if I follow points (a) and (b) as mentioned above, I am "almost" using the concept of the Enterprise Connector, except for dictionary usage.
    What intrigues me is why there is a distinction between the component and custom controllers (for the DD_DOMA_GET failure) and, more importantly, whether the method that we are currently using is safe.
    Best Regards,
    Subramanian V.

  • File transfer option not available for remote devices under MAX

    I am currently using LabVIEW 2012 and have an application running on a Windows 7 PC that communicates (TCP/IP over Ethernet) with an RT PC. I have built an installer that installs the LabVIEW application as well as MAX on the Windows 7 PC. After installation I can open MAX and browse under Remote Systems to my RT device, but I am unable to FTP files to the RT device, as the File Transfer feature on the drop-down menu is not shown. (This PC does not have the LabVIEW development environment installed.) Does anyone know if there are Additional Installers that are required to be included to activate this feature, or any reason why this feature is not available?
    Regards,
    Rob  

    Did you include the MAX FTP Client from the RT module in your installer? See this knowledgebase to see what I mean:
    http://digital.ni.com/public.nsf/allkb/571113c2ce05998e862574e8006005a2?OpenDocument
    Also, you can most likely just do FTP using Windows Explorer, if you prefer that method. Just open Windows Explorer and in the address bar type "ftp://x.x.x.x/" (where x.x.x.x is the IP address of the RT PC you are accessing).
    Also, are you using the LabVIEW real-time module on your RT PC?
    Colden

  • HT201066 QuickTime 10.1 on OS X 10.7.5: export for Web doesn't include video, only audio

    Bizarre thing: QuickTime will not export a video for the web with picture, just audio?

    Here are several examples:
    Posted on March 12: http://vista.adventureadv.com/jackall/Aragon-proof1/index.html
    Posted yesterday on my server:http://vista.adventureadv.com/jackall/cherry-proof1/index.html
    Posted yesterday on my remote server (hosted by 1&1): http://www.adventureadv.com/cherry-proof1/index.html
    In case you think I screwed up the original exported HTML file: http://vista.adventureadv.com/jackall/cherry-proof1/readme-cherry.html
    http://www.adventureadv.com/cherry-proof1/readme-cherry.html
    Direct link to the QT file: http://vista.adventureadv.com/jackall/cherry-proof1/cherry-720.mov
    Also, the server named vista is a Mac server running OSX Server 10.4-something - despite its name, it is not a Windows Vista machine.

  • Using Hint option to optimise queries for remote tables

    1. How do I provide a hint to access indexes for remote tables in my query?
    2. The explain plan in TOAD does not expand for remote tables, i.e. there is no way I can find out whether my query is using the remote table's index or not.
    Any help on the above two questions would be highly appreciated.
    Thanks
    Varsha


  • UAG/RDS RemoteApps not authorised for?

    Hi all,
    We have set up UAG as a gateway in front of RDS 2012. We published Remote Desktops and RemoteApps using a tspub file.
    I can successfully log on to UAG and also open the published Remote Desktops.
    The problem is with RemoteApps. When I try to open one, I get the message "The connection was denied because the user account is not authorized for remote login".
    That's strange, because when I log on to the RDS Web server (internally) I can access the RemoteApps.
    I am a member of Remote Desktop Users on the broker and on the Session Host servers hosting the RemoteApp.
    In RDS 2012 I am also granted access to the RemoteApp, and even at the collection level I have granted myself all permissions.
    I cannot think of a blocked-port issue, as RemoteApps use the same port as Remote Desktops (3389).
    On UAG itself the RemoteApp has All Users Authorized.
    Seems to be an issue with UAG. Does anyone have an idea?
    Greetz and thanks.

    Hi Rob,
    Thank you for posting in Windows Server Forum.
    For running RemoteApps from outside the environment, there are certain basic things which need to be taken care of:
    - As you have deployed an RD Gateway server, please check whether you have configured the RD RAP and RD CAP properties correctly.
    - After configuring those properties, we also need to create an RD managed computer group and add all the servers to that group.
    - In IIS Manager, please update the FQDN of the farm under the default RD Gateway.
    - Port 443 needs to be open for RDWeb.
    - We need a certificate signed by a trusted authority (one can be purchased from a public CA), and the certificate must be placed under the Local Computer/Personal store.
    - As a test, uncheck the option "Bypass RD Gateway for local address" in your RDS deployment properties.
    You can check the settings for configuring the RD Gateway at the following link:
    Deploying Remote Desktop Gateway RDS 2012
    For another reference guide you can go through this article.
    Hope it helps!
    Thanks.
    Dharmesh Solanki
    TechNet Community Support
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • 1405975 - Minimum Authorization Profile for Remote Service Delivery

    In the document referred to in the header, SAP customers are asked to provide a logon user for SAP Remote Service Delivery.
    Due to security concerns, customers wish to grant restricted authorizations only.
    The minimal authorizations required have been described by SAP in a Z_BASIC_SERVICE_V1.zip file.
    It is there that I noticed the requirement for the SE93 transaction code.
    With SE93 you can assign transaction codes to your account and create new ones.
    I really can't reconcile this requirement with 'restricted authorizations only'.
    Am I missing something?

    Hi,
    If you think the SE93 authorization should be restricted, then you can remove it from the role. As far as I know, SE93 authorization is not necessary for remote service delivery; it is one of the 'good to have' authorizations, not a compulsory one.
    Regards,
    Vivek

  • Multi object not exported while clicking Edit Object in "Acrobat X Pro to Photoshop CS5"

    I am unable to get all of the selected objects (images) from Acrobat X Pro into Photoshop CS5; I am getting only one image.
    It was working fine when I used Acrobat 6 and Photoshop CS.
    I have also made sure the preference settings are correct (screenshot not included here).
    Kindly let me know the solution; it would be very helpful.

    Dear Mylenium,
    Thanks for your response. However, the same scenario works perfectly from Acrobat 6 to Photoshop CS. Could you please clarify?
    Pasumalai
