About the stub generated by Jdev10.1.2
hi,
I generated a stub class with JDeveloper 9i from the WSDL of a web service that uses basic authentication, and it could invoke the web service. But when I updated soap.jar from OC4J 9.0.3X to OC4J 10.1.2, the stub could no longer invoke the web service.
So I did some tests.
1. I created a web service with no authentication.
1.1 I generated the stub class with JDeveloper 9.0.5.2. The stub class can invoke the web service.
1.2 I generated the stub class with JDeveloper 10.1.2. The stub class can also invoke the web service.
2. I created a web service with basic authentication.
2.1 I generated the stub class with JDeveloper 9.0.5.2 and set the userName and password. The stub class can invoke the web service.
2.2 I generated the stub class with JDeveloper 10.1.2 and set the userName and password. But the stub class cannot invoke the web service. It throws the exception below:
[SOAPException: faultCode=SOAP-ENV:Protocol; msg=Unsupported response content type "text/html", must be: "text/xml". Response was:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<HTML><HEAD><TITLE>You are not authorized to view this page</TITLE>
<META HTTP-EQUIV="Content-Type" Content="text/html; charset=Windows-1252">
<STYLE type="text/css"> BODY { font: 8pt/12pt verdana } H1 { font: 13pt/15pt verdana } H2 { font: 8pt/12pt verdana } A:link { color: red } A:visited { color: maroon } </STYLE></HEAD>
<BODY><TABLE width=500 border=0 cellspacing=10><TR><TD><h1>You are not authorized to view this page</h1>
You do not have permission to view this directory or page using the credentials that you supplied because your Web browser is sending a WWW-Authenticate header field that the Web server is not configured to accept.
<hr><p>Please try the following:</p><ul><li>Contact the Web site administrator if you believe you should be able to view this directory or page.</li>
<li>Click the <a href="javascript:location.reload()">Refresh</a> button to try again with different credentials.</li></ul>
<h2>HTTP Error 401.2 - Unauthorized: Access is denied due to server configuration.<br>Internet Information Services (IIS)</h2>
<hr><p>Technical Information (for support personnel)</p><ul><li>Go to <a href="http://go.microsoft.com/fwlink/?linkid=8180">Microsoft Product Support Services</a> and perform a title search for the words <b>HTTP</b> and <b>401</b>.</li>
<li>Open <b>IIS Help</b>, which is accessible in IIS Manager (inetmgr), and search for topics titled <b>About Security</b>, <b>Authentication</b>, and <b>About Custom Error Messages</b>.</li></ul></TD></TR></TABLE></BODY></HTML>]
at org.apache.soap.rpc.Call.getEnvelopeString(Call.java:209)
at org.apache.soap.rpc.Call.invoke(Call.java:268)
3. Then I compared the stub classes and found that the stub class generated by JDev 9.0.5.2 has one line of code in the stub constructor that the stub class generated by JDev 10.1.2 does not:
System.setProperty("oracle.soap.transport.noHTTPClient", "true");
3.1 Then I commented out this line in the 9.0.5.2 stub class and ran it; it threw the same exception.
3.2 When I added this line to the 10.1.2 stub class and ran it, it threw a different exception:
java.lang.NullPointerException
at oracle.net.www.protocol.http.HttpURLConnection.<init>(HttpURLConnection.java:101)
at oracle.soap.transport.http.OracleSOAPHTTPConnection.getConnection(OracleSOAPHTTPConnection.java:1022)
at oracle.soap.transport.http.OracleSOAPHTTPConnection.post(OracleSOAPHTTPConnection.java:833)
at oracle.soap.transport.http.OracleSOAPHTTPConnection.send(OracleSOAPHTTPConnection.java:713)
at org.apache.soap.rpc.Call.invoke(Call.java:261)
However, when I only changed soap.jar from the 10.1.2 version back to the 9.0.5.2 version and ran the same 10.1.2 stub class (with that line added), it could invoke the web service.
4. Note: the 9.0.5.2 stub class runs in JDeveloper 9.0.5.2 and the 10.1.2 stub class runs in JDeveloper 10.1.2. Neither uses a proxy server.
We now use the OC4J 10.1.2 platform, and I cannot create a stub class that invokes a web service protected by basic authentication. Could you please help me solve this problem?
Thanks & Best Regards!
Update: I set the REALM information, which I got from the response in the TCP Monitor, and the stub can now invoke the web service.
As for the exception in 2.2, maybe it is because that web service runs on a Windows platform, so I can't get the REALM information from the TCP Monitor!
Thanks
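For anyone hitting the same 401, one way to check whether the endpoint really accepts HTTP Basic credentials, independently of the Oracle stub classes, is to set the Authorization header by hand on a plain java.net.HttpURLConnection. This is only a sketch: the endpoint URL and credentials are placeholders, and it uses java.util.Base64 from Java 8+ (on the JDK 1.4 of that era you would need a different Base64 encoder):

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthProbe {
    // Builds the value of an HTTP Basic "Authorization" header.
    static String basicAuth(String user, String pass) {
        String token = Base64.getEncoder()
                .encodeToString((user + ":" + pass).getBytes(StandardCharsets.UTF_8));
        return "Basic " + token;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder endpoint and credentials -- substitute your own.
        URL url = new URL("http://example.com/MyWebService");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Authorization", basicAuth("wsuser", "secret"));
        // If the server still answers 401 to a well-formed Basic header, it
        // is likely expecting a different scheme (the 401.2 page above hints
        // that IIS is not configured to accept the header being sent).
    }
}
```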
Similar Messages
-
Why is the stub generated from the implementation and not the interface?
Why is the stub generated from the implementation and not the interface?
Because if a remote server object implements multiple remote interfaces, its stub must implement all the same remote interfaces. The only way to know which interfaces to implement is to examine the server object class.
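To make that concrete, here is a small sketch (the interface and class names are invented for illustration): one server object implementing two remote interfaces. Looking at either interface alone, a generator could not know the stub must expose both; only the implementation class carries that information.

```java
import java.rmi.Remote;
import java.rmi.RemoteException;

// Two independent remote interfaces.
interface Clock extends Remote {
    long now() throws RemoteException;
}

interface Greeter extends Remote {
    String greet(String name) throws RemoteException;
}

// One server object implementing both. Its stub must implement Clock AND
// Greeter, which the stub generator can only discover by examining this class.
class ClockGreeterImpl implements Clock, Greeter {
    public long now() { return System.currentTimeMillis(); }
    public String greet(String name) { return "Hello, " + name; }
}

public class MultiInterfaceDemo {
    public static void main(String[] args) {
        // Mimic what the generator does: collect every Remote interface
        // from the implementation class, not from any single interface.
        for (Class<?> iface : ClockGreeterImpl.class.getInterfaces()) {
            System.out.println(iface.getSimpleName() + " isRemote="
                    + Remote.class.isAssignableFrom(iface));
        }
    }
}
```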
-
Stub generated in Jdev9i for webservice with 'Vector' return type
Hi,
In the OAF page that I am developing, I am trying to consume a web service generated in SAP PI, using JDeveloper. My JDeveloper version is 9.0.3.5 (I need to use this version since I need to deploy the OAF page in EBS 11i). The stub generated from the WSDL is given below.
import oracle.soap.transport.http.OracleSOAPHTTPConnection;
import org.apache.soap.encoding.soapenc.BeanSerializer;
import org.apache.soap.encoding.SOAPMappingRegistry;
import org.apache.soap.util.xml.QName;
import java.util.Vector;
import org.w3c.dom.Element;
import java.net.URL;
import org.apache.soap.Body;
import org.apache.soap.Envelope;
import org.apache.soap.messaging.Message;
/**
 * Generated by the Oracle9i JDeveloper Web Services Stub/Skeleton Generator.
 * Date Created: Tue Jan 25 16:12:55 IST 2011
 * WSDL URL: file:/C://Working/XXXXXXX/RegConsComplaint_OB.wsdl
 */
public class RegConsComplaint_OBServiceStub {
    public RegConsComplaint_OBServiceStub() {
        m_httpConnection = new OracleSOAPHTTPConnection();
    }

    public static void main(String[] args) {
        try {
            RegConsComplaint_OBServiceStub stub = new RegConsComplaint_OBServiceStub();
            // Add your own code here.
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    public String endpoint = "http://XXXXXX:8000/sap/xi/...../RegConsComplaint_OB";
    private OracleSOAPHTTPConnection m_httpConnection = null;
    private SOAPMappingRegistry m_smr = null;

    public Vector RegConsComplaint_OB(Element requestElem) throws Exception {
        URL endpointURL = new URL(endpoint);
        Envelope requestEnv = new Envelope();
        Body requestBody = new Body();
        Vector requestBodyEntries = new Vector();
        requestBodyEntries.addElement(requestElem);
        requestBody.setBodyEntries(requestBodyEntries);
        requestEnv.setBody(requestBody);
        Message msg = new Message();
        msg.setSOAPTransport(m_httpConnection);
        msg.send(endpointURL, "http://sap.com/xi/WebService/soap1.1", requestEnv);
        Envelope responseEnv = msg.receiveEnvelope();
        Body responseBody = responseEnv.getBody();
        return responseBody.getBodyEntries();
    }
}
I am wondering whether I will be able to use this stub generated by JDeveloper, since the input type is 'Element' and the return type is 'Vector', while the supported "primitive XML Schema types and arrays of primitive XML Schema types as parameters and return values for web services" in the JDeveloper documentation do not include either of the two.
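As a side note, the requestElem parameter is just a DOM element, so it can be built with the standard JAXP APIs regardless of what the stub generator supports. A minimal sketch follows; the element and namespace names are placeholders, the real ones come from the SAP PI WSDL:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class BuildRequest {
    // Builds a payload element to pass to the generated stub. The names
    // used here are invented; take the actual ones from the WSDL.
    static Element buildRequestElem() throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element root = doc.createElementNS("urn:example:complaints",
                "RegConsComplaintRequest");
        Element text = doc.createElement("ComplaintText");
        text.setTextContent("sample complaint");
        root.appendChild(text);
        doc.appendChild(root);
        return root;
    }

    public static void main(String[] args) throws Exception {
        Element elem = buildRequestElem();
        System.out.println(elem.getTagName());
        // This element would then be passed to stub.RegConsComplaint_OB(elem);
        // the returned Vector holds the response body entries, each of which
        // is itself a DOM Element to be walked the same way.
    }
}
```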
Regards,
Sujoy

Hi Sujoy,
I have been having big problems consuming Microsoft SharePoint web services using JDeveloper 9i: problems with JDK version compatibility between JDev and NTLM authentication (SharePoint), so I am switching to a db connection using utl_http.
Can you please send me the code set for reference at [email protected]?
Thanks.
Regards
Sachin -
I ran the Stub Generator in the J2ME Wireless Toolkit 2.2, but I have the following problem:
C:\ImplementationCode\MSWS\src\addservice\AddIntegerIF_Stub.java:14: addservice.AddIntegerIF_Stub is not abstract and does not override abstract method _getPropertyNames() in javax.xml.rpc.Stub
public class AddIntegerIF_Stub implements addservice.AddIntegerIF, javax.xml.rpc.Stub {
^
1 error
error: compilation failed, errors should have been reported
I don't know what it is. I need your help. Please answer me. :-)
I'm having the same problem. I know for sure that the WSDL file contains correct information; I really cannot understand why the "Stub Generator" in the Wireless Toolkit 2.2 is not able to generate the stub.
In my case it is failing with a very simple web service which has only one method with a String as input and another String as output.
Please follow up on this thread in case you find a solution.
Thanks. -
Question about the generated package of an RMI stub class.
Does anyone know why rmic tool generated stubs are in the same package as the class that implements the remote interface? For example, say I have an interface, MyRemoteInterface, in the foo.bar package, and an implementing class, MyRemoteImpl, in the baz package. When baz.MyRemoteImpl is compiled by rmic, a stub, MyRemoteImpl_Stub, is generated in the baz package.
Now I would have thought that MyRemoteImpl_Stub would have been generated in the foo.bar package, as the stub is to be used on the client to handle calls at the interface level. In other words, the baz package really only makes sense on the server. This is in fact how stubs are generated for CORBA using the -iiop flag to rmic.
Is there at least a way to specify the generated package?
If you are running rmic on a class called com.yourmi.MyClass, the stub will be generated with exactly the same package.
The only thing that you can change is the location of the stub (for example, another directory), but even in that other directory you will find com\yourmi\MyClass_Stub.
regards
marco -
Where are the client stubs generated?
I am using WebLogic 8, so I understand that the client stub is generated by the container when my EJBs are deployed.
But where is this jar file created?
Is there a tool which I can use to generate the .jar file once my EJB is deployed?
Thank you
Rahul
(I found something on this for WebLogic 6; maybe it holds true for WebLogic 8. I still can't figure out exactly where it creates the stub; I might have to try it out and search BEA_HOME.)
WebLogic Server supports the use of ejb-client.jar files.
The ejb-client.jar contains the home and remote interfaces, the primary key class (as applicable), and the files they reference. WebLogic Server does not add files referenced in your classpath to ejb-client.jar. This enables WebLogic Server to add necessary custom classes to the ejb-client.jar without adding generic classes such as java.lang.String.
For example, the ShoppingCart remote interface might have a method that returns an Item class. Because this remote interface references this class, and it is located in the ejb-jar file, it will be included in the client jar.
You configure the creation of an ejb-client.jar file in the bean's ejb-jar.xml deployment descriptor file. When you compile the bean with ejbc, WebLogic Server creates the ejb-client.jar.
To specify an ejb-client.jar:
1. Compile the bean's Java classes into a directory, using the javac compiler from the command line.
2. Add the EJB XML deployment descriptor files to the compiled unit using the guidelines in WebLogic Server EJB Deployment Files.
3. Edit the ejb-client-jar deployment descriptor in the bean's ejb-jar.xml file, as follows, to specify support for ejb-client.jar:
<ejb-client-jar>ShoppingCartClient.jar</ejb-client-jar>
4. Generate the container classes that are used to access the bean using weblogic.ejbc and create the ejb-client.jar using the following command:
$ java weblogic.ejbc <ShoppingCart.jar>
Container classes include both the internal representation of the EJB that WebLogic Server uses, as well as implementation of the external interfaces (home, local, and/or remote) that clients use.
External clients can include the ejb-client.jar in their classpath. Web applications would include the ejb-client.jar in their /lib directory.
Note: WebLogic Server classloading behavior varies, depending on whether or not the client is stand-alone. Stand-alone clients with access to the ejb-client.jar can load the necessary classes over the network. However, for security reasons, programmatic clients running in a server instance cannot load classes over the network. -
Namburi,
When you said you used the Reg Exp tool, did you use it only as
preconfigured by the iMT migrate application wizard?
Because the default configuration of the regular expression tool will only
target the files in your ND project directories. If you wish to target
classes outside of the normal directory scope, you have to either modify the
"Source Directory" property OR create another instance of the regular
expression tool. See the "Tool" menu in the iMT to create additional tool
instances which can each be configured to target different sets of files
using different sets of rules.
Usually, I utilize 3 different sets of rules files on a given migration:
spider2jato.xml
these are the generic conversion rules (but they include the optimized rules for ViewBean and Model based code, i.e. these rules do not utilize the RequestManager since it is not needed for code running inside the ViewBean or Model classes)
I run these rules against all files.
See the file download section of this forum for periodic updates to these
rules.
nonProjectFileRules.xml
these include rules that add the necessary
RequestManager.getRequestContext(). etc prefixes to many of the common
calls.
I run these rules against user module classes and any other classes that are not ModuleServlet, ContainerView, or Model classes.
appXRules.xml
these rules include application specific changes that I discover while
working on the project. A common thing here is changing import statements
(since the migration tool moves ND project code into a different jato packaging structure, you sometimes need to adjust imports in non-project classes that previously imported ND project-specific packages)
So you see, you are not limited to one set of rules at all. Just be careful
to keep track of your backups (the regexp tool provides several options in
its Expert Properties related to back up strategies).
----- Original Message -----
From: <vnamboori@y...>
Sent: Wednesday, August 08, 2001 6:08 AM
Subject: [iPlanet-JATO] Re: Use Of models in utility classes - Please don't forget about the regular expression potential
Thanks Matt, Mike, Todd
This is a great input for our migration. Though we used the existing
Regular Expression Mapping tool, we did not change this to meet our
own needs as mentioned by Mike.
We would certainly incorporate this to ease our migration.
Namburi
--- In iPlanet-JATO@y..., "Todd Fast" <toddwork@c...> wrote:
All--
Great response. By the way, the Regular Expression Tool uses the Perl5 RE syntax as implemented by Apache OROMatcher. If you're doing lots of these sorts of migration changes manually, you should definitely buy the O'Reilly book "Mastering Regular Expressions" and generate some rules to automate the conversion. Although they are definitely confusing at first, regular expressions are fairly easy to understand with some documentation, and are superbly effective at tackling this kind of migration task.
Todd
----- Original Message -----
From: "Mike Frisino" <Michael.Frisino@S...>
Sent: Tuesday, August 07, 2001 5:20 PM
Subject: Re: [iPlanet-JATO] Use Of models in utility classes - Please don't forget about the regular expression potential
Also, (and Matt's document may mention this)
Please bear in mind that this statement is not totally correct:
"Since the migration tool does not do much of conversion for these utilities we have to do manually."
Remember, the iMT is a SUITE of tools. There is the extraction tool, and the translation tool, and the regular expression tool, and several other smaller tools (like the jar and compilation tools). It is correct to state that the extraction and translation tools only significantly convert the primary ND project objects (the pages, the data objects, and the project classes). The extraction and translation tools do minimum translation of the User Module objects (i.e. they repackage the user module classes in the new jato module packages). It is correct that for all other utility classes which are not formally part of the ND project, the extraction and translation tools do not perform any migration.
However, the regular expression tool can "migrate" any arbitrary file (utility classes etc.) to the degree that the regular expression rules correlate to the code present in the arbitrary file. So first and foremost, if you have a lot of spider code in your non-project classes you should consider using the regular expression tool and, if warranted, adding additional rules to reduce the amount of manual adjustments that need to be made. I can't stress this enough. We can even help you write the regular expression rules if you simply identify the code pattern you wish to convert. Just because there is not already a regular expression rule to match your need does not mean it can't be written. We have not nearly exhausted the possibilities.
For example if you say, we need to convert
CSpider.getDataObject("X");
To
RequestManager.getRequestContext().getModelManager().getModel(XModel.class);
Maybe we or somebody else on the list can help write that regular expression if it has not already been written. For instance, in the last updated spider2jato.xml file there is already a CSpider.getCommonPage("X") rule:
<!--getPage to getViewBean-->
<mapping-rule>
<mapping-rule-primarymatch>
<![CDATA[CSpider[.\s]*getPage[\s]*\(\"([^"]*)\"]]>
</mapping-rule-primarymatch>
<mapping-rule-replacement>
<mapping-rule-match>
<![CDATA[CSpider[.\s]*getPage[\s]*\(\"([^"]*)\"]]>
</mapping-rule-match>
<mapping-rule-substitute>
<![CDATA[getViewBean($1ViewBean.class]]>
</mapping-rule-substitute>
</mapping-rule-replacement>
</mapping-rule>
Following this example a getDataObject to getModel would look
like this:
<mapping-rule>
<mapping-rule-primarymatch>
<![CDATA[CSpider[.\s]*getDataObject[\s]*\(\"([^"]*)\"]]>
</mapping-rule-primarymatch>
<mapping-rule-replacement>
<mapping-rule-match>
<![CDATA[CSpider[.\s]*getDataObject[\s]*\(\"([^"]*)\"]]>
</mapping-rule-match>
<mapping-rule-substitute>
<![CDATA[getModel($1Model.class]]>
</mapping-rule-substitute>
</mapping-rule-replacement>
</mapping-rule>
In fact, one migration developer already wrote that rule and submitted it for inclusion in the basic set. I will post another upgrade to the basic regular expression rule set; look for a "file uploaded" posting. Also, please consider contributing any additional generic rules that you have written for inclusion in the basic set.
Please note that in some cases (Utility classes in particular) the rule application may be more effective as TWO sequential rules rather than one monolithic rule. Again using the example above, it will convert
CSpider.getDataObject("Foo");
To
getModel(FooModel.class);
Now that is the most effective conversion for that code if that code is in a page or data object class file. But if that code is in a Utility class you really want:
RequestManager.getRequestContext().getModelManager().getModel(FooModel.class);
So to go from
getModel(FooModel.class);
to
RequestManager.getRequestContext().getModelManager().getModel(FooModel.class);
you would apply a second rule, AND you would ONLY run this rule against your utility classes so that you would not otherwise affect your ViewBean and Model classes, which are completely fine with the simple getModel call.
<mapping-rule>
<mapping-rule-primarymatch>
<![CDATA[getModel\(]]>
</mapping-rule-primarymatch>
<mapping-rule-replacement>
<mapping-rule-match>
<![CDATA[getModel\(]]>
</mapping-rule-match>
<mapping-rule-substitute>
<![CDATA[RequestManager.getRequestContext().getModelManager().getModel(]]>
</mapping-rule-substitute>
</mapping-rule-replacement>
</mapping-rule>
A similar rule can be applied to getSession and other CSpider API calls. For instance, here is the rule for converting getSession calls to leverage the RequestManager.
<mapping-rule>
<mapping-rule-primarymatch>
<![CDATA[getSession\(\)\.]]>
</mapping-rule-primarymatch>
<mapping-rule-replacement>
<mapping-rule-match>
<![CDATA[getSession\(\)\.]]>
</mapping-rule-match>
<mapping-rule-substitute>
<![CDATA[RequestManager.getSession().]]>
</mapping-rule-substitute>
</mapping-rule-replacement>
</mapping-rule>
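For anyone curious what these rules do mechanically, the same two-step substitution can be reproduced with any Perl5-compatible regex engine. Here is a sketch using java.util.regex instead of OROMatcher (same patterns as the rules above, modern API); as the message notes, the second step should only be run over utility classes:

```java
import java.util.regex.Pattern;

public class RuleDemo {
    // Step 1: CSpider.getDataObject("X") -> getModel(XModel.class
    static final Pattern STEP1 =
            Pattern.compile("CSpider[.\\s]*getDataObject[\\s]*\\(\\\"([^\"]*)\\\"");
    // Step 2 (utility classes only): getModel( -> RequestManager...getModel(
    static final Pattern STEP2 = Pattern.compile("getModel\\(");

    static String migrate(String source) {
        // $1 re-inserts the captured data-object name into the Model class name.
        String out = STEP1.matcher(source).replaceAll("getModel($1Model.class");
        return STEP2.matcher(out)
                .replaceAll("RequestManager.getRequestContext().getModelManager().getModel(");
    }

    public static void main(String[] args) {
        // Prints: RequestManager.getRequestContext().getModelManager().getModel(FooModel.class);
        System.out.println(migrate("CSpider.getDataObject(\"Foo\");"));
    }
}
```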
----- Original Message -----
From: "Matthew Stevens" <matthew.stevens@e...>
Sent: Tuesday, August 07, 2001 12:56 PM
Subject: RE: [iPlanet-JATO] Use Of models in utility classes
Namburi,
I will post a document to the group site this evening which has the details on various tactics of migrating these types of utilities. Essentially, you either need to convert these utilities to Models themselves, or keep the utilities as is and simply use
RequestManager.getRequestContext().getModelManager().getModel()
to statically access Models.
For CSpSelect.executeImmediate() I have an example of a custom helper method as a replacement which uses JDBC results instead of CSpDBResult.
matt
-----Original Message-----
From: vnamboori@y...
Sent: Tuesday, August 07, 2001 3:24 PM
Subject: [iPlanet-JATO] Use Of models in utility classes
Hi All,
In the present ND project we have lots of utility classes. These classes are in a different directory, not part of the ND pages.
In these classes we access the data objects and do the manipulations. So we access data objects directly, like
CSpider.getDataObject("do....");
and then execute it.
Since the migration tool does not do much of the conversion for these utilities, we have to do it manually.
My question is: can we access the models post-migration the same way, or do we need a requestContext?
We have lots of utility classes which are DataObject intensive. Can someone suggest a better way to migrate this kind of code?
Thanks
Namburi
[email protected]
[email protected]
[Non-text portions of this message have been removed]
[email protected]
[email protected]Namburi,
When you said you used the Reg Exp tool, did you use it only as
preconfigured by the iMT migrate application wizard?
Because the default configuration of the regular expression tool will only
target the files in your ND project directories. If you wish to target
classes outside of the normal directory scope, you have to either modify the
"Source Directory" property OR create another instance of the regular
expression tool. See the "Tool" menu in the iMT to create additional tool
instances which can each be configured to target different sets of files
using different sets of rules.
Usually, I utilize 3 different sets of rules files on a given migration:
spider2jato.xml
these are the generic conversion rules (but includes the optimized rules for
ViewBean and Model based code, i.e. these rules do not utilize the
RequestManager since it is not needed for code running inside the ViewBean
or Model classes)
I run these rules against all files.
See the file download section of this forum for periodic updates to these
rules.
nonProjectFileRules.xml
these include rules that add the necessary
RequestManager.getRequestContext(). etc prefixes to many of the common
calls.
I run these rules against user module and any other classes that do not are
not ModuleServlet, ContainerView, or Model classes.
appXRules.xml
these rules include application specific changes that I discover while
working on the project. A common thing here is changing import statements
(since the migration tool moves ND project code into different jato
packaging structure, you sometime need to adjust imports in non-project
classes that previously imported ND project specific packages)
So you see, you are not limited to one set of rules at all. Just be careful
to keep track of your backups (the regexp tool provides several options in
its Expert Properties related to back up strategies).
----- Original Message -----
From: <vnamboori@y...>
Sent: Wednesday, August 08, 2001 6:08 AM
Subject: [iPlanet-JATO] Re: Use Of models in utility classes - Pease don't
forget about the regular expression potential
Thanks Matt, Mike, Todd
This is a great input for our migration. Though we used the existing
Regular Expression Mapping tool, we did not change this to meet our
own needs as mentioned by Mike.
We would certainly incorporate this to ease our migration.
Namburi
--- In iPlanet-JATO@y..., "Todd Fast" <toddwork@c...> wrote:
All--
Great response. By the way, the Regular Expression Tool uses thePerl5 RE
syntax as implemented by Apache OROMatcher. If you're doing lotsof these
sorts of migration changes manually, you should definitely buy theO'Reilly
book "Mastering Regular Expressions" and generate some rules toautomate the
conversion. Although they are definitely confusing at first,regular
expressions are fairly easy to understand with some documentation,and are
superbly effective at tackling this kind of migration task.
Todd
----- Original Message -----
From: "Mike Frisino" <Michael.Frisino@S...>
Sent: Tuesday, August 07, 2001 5:20 PM
Subject: Re: [iPlanet-JATO] Use Of models in utility classes -Pease don't
forget about the regular expression potential
Also, (and Matt's document may mention this)
Please bear in mind that this statement is not totally correct:
Since the migration tool does not do much of conversion for
these
utilities we have to do manually.Remember, the iMT is a SUITE of tools. There is the extractiontool, and
the translation tool, and the regular expression tool, and severalother
smaller tools (like the jar and compilation tools). It is correctto state
that the extraction and translation tools only significantlyconvert the
primary ND project objects (the pages, the data objects, and theproject
classes). The extraction and translation tools do minimumtranslation of the
User Module objects (i.e. they repackage the user module classes inthe new
jato module packages). It is correct that for all other utilityclasses
which are not formally part of the ND project, the extraction and
translation tools do not perform any migration.
However, the regular expression tool can "migrate" any arbitrary
file
(utility classes etc) to the degree that the regular expressionrules
correlate to the code present in the arbitrary file. So first andforemost,
if you have alot of spider code in your non-project classes youshould
consider using the regular expression tool and if warranted adding
additional rules to reduce the amount of manual adjustments thatneed to be
made. I can stress this enough. We can even help you write theregular
expression rules if you simply identify the code pattern you wish to
convert. Just because there is not already a regular expressionrule to
match your need does not mean it can't be written. We have notnearly
exhausted the possibilities.
For example if you say, we need to convert
CSpider.getDataObject("X");
To
RequestManager.getRequestContext().getModelManager().getModel(XModel.class);
Maybe we or somebody else in the list can help write that regularexpression if it has not already been written. For instance in thelast
updated spider2jato.xml file there is already aCSpider.getCommonPage("X")
rule:
<!--getPage to getViewBean-->
<mapping-rule>
<mapping-rule-primarymatch>
<![CDATA[CSpider[.\s]*getPage[\s]*\(\"([^"]*)\"]]>
</mapping-rule-primarymatch>
<mapping-rule-replacement>
<mapping-rule-match>
<![CDATA[CSpider[.\s]*getPage[\s]*\(\"([^"]*)\"]]>
</mapping-rule-match>
<mapping-rule-substitute>
<![CDATA[getViewBean($1ViewBean.class]]>
</mapping-rule-substitute>
</mapping-rule-replacement>
</mapping-rule>
Following this example a getDataObject to getModel would look
like this:
<mapping-rule>
<mapping-rule-primarymatch>
<![CDATA[CSpider[.\s]*getDataObject[\s]*\(\"([^"]*)\"]]>
</mapping-rule-primarymatch>
<mapping-rule-replacement>
<mapping-rule-match>
<![CDATA[CSpider[.\s]*getDataObject[\s]*\(\"([^"]*)\"]]>
</mapping-rule-match>
<mapping-rule-substitute>
<![CDATA[getModel($1Model.class]]>
</mapping-rule-substitute>
</mapping-rule-replacement>
</mapping-rule>
In fact, one migration developer already wrote that rule andsubmitted it
for inclusion in the basic set. I will post another upgrade to thebasic
regular expression rule set, look for a "file uploaded" posting.Also,
please consider contributing any additional generic rules that youhave
written for inclusion in the basic set.
Please not, that in some cases (Utility classes in particular)
the rule
application may be more effective as TWO sequention rules ratherthan one
monolithic rule. Again using the example above, it will convert
CSpider.getDataObject("Foo");
To
getModel(FooModel.class);
Now that is the most effective conversion for that code if that
code is in
a page or data object class file. But if that code is in a Utilityclass you
really want:
>
RequestManager.getRequestContext().getModelManager().getModel(FooModel.class
So to go from
getModel(FooModel.class);
To
RequestManager.getRequestContext().getModelManager().getModel(FooModel.class
You would apply a second rule AND you would ONLY run this rule
against
your utility classes so that you would not otherwise affect yourViewBean
and Model classes which are completely fine with the simplegetModel call.
<mapping-rule>
<mapping-rule-primarymatch>
<![CDATA[getModel\(]]>
</mapping-rule-primarymatch>
<mapping-rule-replacement>
<mapping-rule-match>
<![CDATA[getModel\(]]>
</mapping-rule-match>
<mapping-rule-substitute>
<![CDATA[RequestManager.getRequestContext().getModelManager().getModel(]]>
</mapping-rule-substitute>
</mapping-rule-replacement>
</mapping-rule>
A similer rule can be applied to getSession and other CSpider APIcalls.
For instance here is the rule for converting getSession calls toleverage
the RequestManager.
<mapping-rule>
<mapping-rule-primarymatch>
<![CDATA[getSession\(\)\.]]>
</mapping-rule-primarymatch>
<mapping-rule-replacement>
<mapping-rule-match>
<![CDATA[getSession\(\)\.]]>
</mapping-rule-match>
<mapping-rule-substitute>
<![CDATA[RequestManager.getSession().]]>
</mapping-rule-substitute>
</mapping-rule-replacement>
</mapping-rule>
----- Original Message -----
From: "Matthew Stevens" <matthew.stevens@e...>
Sent: Tuesday, August 07, 2001 12:56 PM
Subject: RE: [iPlanet-JATO] Use Of models in utility classes
Namburi,
I will post a document to the group site this evening which has
the
details
on various tactics of migrating these type of utilities.
Essentially,
you
either need to convert these utilities to Models themselves or
keep the
utilities as is and simply use the
RequestManager.getRequestContext.getModelManager().getModel()
to statically access Models.
For CSpSelect.executeImmediate() I have an example of customhelper
method
as a replacement whicch uses JDBC results instead of
CSpDBResult.
matt
-----Original Message-----
From: vnamboori@y... [mailto:<a href="/group/SunONE-JATO/post?protectID=081071113213093190112061186248100208071048">vnamboori@y...</a>]
Sent: Tuesday, August 07, 2001 3:24 PM
Subject: [iPlanet-JATO] Use Of models in utility classes
Hi All,
In the present ND project we have lots of utility classes. These
classes are in different directories, not part of ND pages.
In these classes we access the dataobjects and do the manipulations.
So we access dataobjects directly like
CSpider.getDataObject("do....");
and then execute it.
Since the migration tool does not do much of the conversion for these
utilities, we have to do it manually.
My question is: can we access the models post-migration the same
way, or do we need a requestContext?
We have lots of utility classes which are DataObject intensive. Can
someone suggest a better way to migrate this kind of code?
Thanks
Namburi
About the template FSCM9.1 FP2 Peopletools 8.52.03 (v4 - July 2012)
Hello,
I just quickly tested this new template, delivered two months ago (July 2012).
As far as I understand, it is just a recut of the one delivered in April 2012. At least it solves the main issue I reported in the other thread, About the template FSCM9.1 FP2 Peopletools 8.52.03 (v3), about the missing network prompt.
But I still have remarks/issues on the template FSCM9.1 FP2 Peopletools 8.52.03 (v4) released earlier in July 2012.
_1. First of all, there are a lot of errors reported in /var/log/messages_
Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526918.883:3): avc: denied { read } for pid=92 comm="restorecon" name="libc.so.6" dev=xvda2 ino=21 scontext=system_u:sys
tem_r:restorecon_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=lnk_file
Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526918.910:4): avc: denied { execute } for pid=92 comm="restorecon" path="/lib64/libc-2.5.so" dev=xvda2 ino=20 scontext=
system_u:system_r:restorecon_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=file
Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526921.489:5): avc: denied { read } for pid=296 comm="pam_console_app" name="ld.so.cache" dev=xvda2 ino=94143 scontext=s
ystem_u:system_r:pam_console_t:s0-s0:c0.c1023 tcontext=system_u:object_r:file_t:s0 tclass=file
Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526921.489:6): avc: denied { getattr } for pid=290 comm="pam_console_app" path="/etc/ld.so.cache" dev=xvda2 ino=94143 sc
ontext=system_u:system_r:pam_console_t:s0-s0:c0.c1023 tcontext=system_u:object_r:file_t:s0 tclass=file
Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526921.530:7): avc: denied { read } for pid=293 comm="pam_console_app" name="libc.so.6" dev=xvda2 ino=21 scontext=system
_u:system_r:pam_console_t:s0-s0:c0.c1023 tcontext=system_u:object_r:file_t:s0 tclass=lnk_file
Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526921.530:8): avc: denied { execute } for pid=293 comm="pam_console_app" path="/lib64/libc-2.5.so" dev=xvda2 ino=20 sco
ntext=system_u:system_r:pam_console_t:s0-s0:c0.c1023 tcontext=system_u:object_r:file_t:s0 tclass=file
Sep 13 05:02:55 localhost kernel: input: PC Speaker as /class/input/input3
Sep 13 05:02:55 localhost kernel: Initialising Xen virtual ethernet driver.
Sep 13 05:02:55 localhost kernel: Error: Driver 'pcspkr' is already registered, aborting...
Sep 13 05:02:55 localhost kernel: Floppy drive(s): fd0 is unknown type 15 (usb?), fd1 is unknown type 15 (usb?)
Sep 13 05:02:55 localhost kernel: floppy0: Unable to grab IRQ6 for the floppy driver
Sep 13 05:02:55 localhost kernel: lp: driver loaded but no devices found
Sep 13 05:02:55 localhost kernel: md: Autodetecting RAID arrays.
Sep 13 05:02:55 localhost kernel: md: Scanned 0 and added 0 devices.
Sep 13 05:02:55 localhost kernel: md: autorun ...
Sep 13 05:02:55 localhost kernel: md: ... autorun DONE.
Sep 13 05:02:55 localhost kernel: EXT3 FS on xvda2, internal journal
Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526929.896:9): avc: denied { execute } for pid=965 comm="restorecon" path="/lib64/libc-2.5.so" dev=xvda2 ino=20 scontext
=system_u:system_r:restorecon_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=file
Sep 13 05:02:55 localhost kernel: kjournald starting. Commit interval 5 seconds
Sep 13 05:02:55 localhost kernel: EXT3 FS on xvda1, internal journal
Sep 13 05:02:55 localhost kernel: EXT3-fs: mounted filesystem with ordered data mode.
Sep 13 05:02:55 localhost kernel: kjournald starting. Commit interval 5 seconds
Sep 13 05:02:55 localhost kernel: EXT3 FS on xvdb1, internal journal
Sep 13 05:02:55 localhost kernel: EXT3-fs: mounted filesystem with ordered data mode.
Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526930.647:10): avc: denied { execute } for pid=989 comm="setfiles" path="/lib64/libc-2.5.so" dev=xvda2 ino=20 scontext=
system_u:system_r:setfiles_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=file
Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526942.398:11): avc: denied { net_admin } for pid=990 comm="setfiles" capability=12 scontext=system_u:system_r:setfiles
_t:s0 tcontext=system_u:system_r:setfiles_t:s0 tclass=capability
Sep 13 05:02:55 localhost kernel: hrtimer: interrupt took 35229469 ns
Sep 13 05:02:55 localhost kernel: Adding 2104504k swap on /dev/xvda3. Priority:-1 extents:1 across:2104504k SS
Sep 13 05:02:55 localhost kernel: warning: process `kudzu' used the deprecated sysctl system call with 1.23.
Sep 13 05:02:55 localhost kernel: Loading iSCSI transport class v2.0-870.
Sep 13 05:02:55 localhost kernel: libcxgbi:libcxgbi_init_module: tag itt 0x1fff, 13 bits, age 0xf, 4 bits.
Sep 13 05:02:55 localhost kernel: libcxgbi:ddp_setup_host_page_size: system PAGE 4096, ddp idx 0.
Sep 13 05:02:55 localhost kernel: Chelsio T3 iSCSI Driver cxgb3i v2.0.0 (Jun. 2010)
Sep 13 05:02:55 localhost kernel: iscsi: registered transport (cxgb3i)
Sep 13 05:02:55 localhost kernel: NET: Registered protocol family 10
Sep 13 05:02:55 localhost kernel: cnic: Broadcom NetXtreme II CNIC Driver cnic v2.2.14 (Mar 30, 2011)
Sep 13 05:02:55 localhost kernel: Broadcom NetXtreme II iSCSI Driver bnx2i v2.6.2.3 (Jan 06, 2010)
Sep 13 05:02:55 localhost kernel: iscsi: registered transport (bnx2i)
Sep 13 05:02:55 localhost kernel: iscsi: registered transport (tcp)
Sep 13 05:02:55 localhost kernel: iscsi: registered transport (iser)
Sep 13 05:02:55 localhost kernel: iscsi: registered transport (be2iscsi)
Sep 13 05:02:55 localhost kernel: ip6_tables: (C) 2000-2006 Netfilter Core Team
Sep 13 05:02:55 localhost kernel: warning: `mcstransd' uses 32-bit capabilities (legacy support in use)
Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526970.336:12): avc: denied { sys_tty_config } for pid=1374 comm="consoletype" capability=26 scontext=system_u:system_r
:consoletype_t:s0 tcontext=system_u:system_r:consoletype_t:s0 tclass=capability
Sep 13 05:02:55 localhost kernel: RPC: Registered udp transport module.
Sep 13 05:02:55 localhost kernel: RPC: Registered tcp transport module.
Sep 13 05:02:55 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Sep 13 05:03:00 localhost automount[1769]: lookup_read_master: lookup(nisplus): couldn't locate nis+ table auto.master
Sep 13 05:03:59 localhost kernel: type=1400 audit(1347527039.771:13): avc: denied { sys_tty_config } for pid=2029 comm="consoletype" capability=26 scontext=system_u:system_r
:consoletype_t:s0 tcontext=system_u:system_r:consoletype_t:s0 tclass=capability
Sep 13 05:04:00 localhost NET[2061]: /sbin/dhclient-script : updated /etc/resolv.conf
Sep 13 05:04:01 localhost kernel: IPv6 over IPv4 tunneling driver
Sep 13 05:04:01 localhost NET[2219]: /opt/oracle/psft/vm/oraclevm-template.sh : updated /etc/resolv.conf
Sep 13 05:04:08 localhost NET[2472]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Sep 13 05:06:08 localhost restorecond: Reset file context /etc/resolv.conf: system_u:object_r:etc_runtime_t:s0->system_u:object_r:net_conf_t:s0
Sep 13 05:08:19 localhost kernel: Slow work thread pool: Starting up
Sep 13 05:08:19 localhost kernel: Slow work thread pool: Ready
Sep 13 05:08:19 localhost kernel: FS-Cache: Loaded
Sep 13 05:08:19 localhost kernel: FS-Cache: Netfs 'nfs' registered for caching
Sep 13 05:08:19 localhost kernel: svc: failed to register lockdv1 RPC service (errno 97).
...Well, I don't know if it triggers other problems yet, but the last line could reveal an error within the /etc/hosts file, which has not been properly modified during deployment (especially the IPv6 line, which probably should be removed):
[root@psovmfscmfp2 /]# more /etc/hosts
127.0.0.1 localhost.localdomain localhost
::1 localhost6.localdomain6 localhost6
192.168.1.150 psovmfscmfp2.phoenix.nga psovmfscmfp2
[root@psovmfscmfp2 /]#
_2. Now about the COBOL_
Although I chose to install Micro Focus, COBOL does not work. Sample COBOL processes such as PTPDBTST and PTPDTTST finish in error.
The log is empty; below is the output from the file $PS_CFG_HOME/psft/pt/8.52/appserv/prcs/PRCSDOM/LOGS/stdout (psadm2):
=================================Error===============================
Message: Process 10899 is marked 'Initiated' or 'Processing' but can not detect status of PID
Process Name: PTPDBTST
Process Type: COBOL SQL
Session Id: 9313
=====================================================================
OprId = VP1
Note that I successfully tested AEs and SQRs.
Here is the command line fired, as seen from Process Monitor > Parameters (nga being my run control id):
PSRUN PTPDBTST ORACLE/F91TMPLT/VP1/OPRPSWD/nga/10899//0
I used the following trace setting on PTPDBTST's process parameters (override) to see what happens:
%%DBTYPE%%/%%DBNAME%%/%%OPRID%%/%%OPRPSWD%%/%%RUNCNTLID%%/%%INSTANCE%%//%%DBFLAG%%
But it does not generate more logs...
I also use "RCCBL Redirect =1" in psappsrv.cfg (reconfigure and restart appdom), then start the COBOL through menu PeopleTools > Utilities > Debug > PeopleTools Test Utilities, and run a "Remote Call Test".
I received "COBOL Program PTPNTEST aborted (2,-1) FUNCLIB_UTIL.RC_TEST_PB.FieldChange PCPC:2143 Statement:26", but it generated two empty files (PTPNTEST_VP1_0913064910.out and PTPNTEST_VP1_0913064910.err).
Next step, checking the folder $PS_HOME/cblbin, it is... er... empty... does this mean the COBOLs have not been compiled? Hmmm, I'm pretty sure I replied 'yes' when prompted though (I still have the screenshots)...
And we can see several folders dated today, and the license seems OK in the Micro Focus directories:
[psadm1@psovmfscmfp2 tools]$ cd /opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
[psadm1@psovmfscmfp2 svrexp-51_wp4-64bit]$ ls -lrt
total 264
-r--r--r-- 1 root root 10455 Nov 19 2009 ADISCTRL
dr-xr-xr-x 10 root root 4096 Nov 19 2009 terminfo
dr-xr-xr-x 2 root root 4096 Nov 19 2009 xdb
-r--r--r-- 1 root root 11949 Nov 19 2009 eslmf-mess
dr-xr-xr-x 2 root root 4096 Nov 19 2009 include
dr-xr-xr-x 17 root root 4096 Nov 19 2009 lang
dr-xr-xr-x 4 root root 4096 Nov 19 2009 es
dr-xr-xr-x 2 root root 4096 Nov 19 2009 dynload
drwxrwxrwx 2 root root 4096 Nov 19 2009 deploy
dr-xr-xr-x 2 root root 4096 Nov 19 2009 dynload64
dr-xr-xr-x 2 root root 4096 Nov 19 2009 dialog
dr-xr-xr-x 2 root root 4096 Nov 19 2009 cpylib
dr-xr-xr-x 8 root root 28672 Nov 19 2009 lib
dr-xr-xr-x 3 root root 4096 Nov 19 2009 snmp
dr-xr-xr-x 8 root root 4096 Nov 19 2009 src
dr-xr-xr-x 28 root root 4096 Nov 19 2009 demo
dr-xr-xr-x 6 root root 4096 Nov 19 2009 docs
-rw-r--r-- 1 root root 49 Sep 13 05:13 license.txt
-r-xr-xr-x 1 root root 12719 Sep 13 05:13 install.orig
-r-xr-xr-x 1 root root 13006 Sep 13 05:13 install
dr-xr-xr-x 6 root root 4096 Sep 13 05:13 lmf
dr-xr-xr-x 2 root root 4096 Sep 13 05:13 aslmf
dr-xr-xr-x 6 root root 4096 Sep 13 05:15 etc
dr-xr-xr-x 4 root root 12288 Sep 13 05:15 bin
[psadm1@psovmfscmfp2 svrexp-51_wp4-64bit]$ more license.txt
I
ORACLE-30DAYDEV64
01030 A0780 014A6 7980B A17C
So let's assume it has been properly installed, and let's compile the COBOLs. Here we go:
[psadm1@psovmfscmfp2 svrexp-51_wp4-64bit]$ cd $PS_HOME/setup
[psadm1@psovmfscmfp2 setup]$ ./pscbl.mak
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Convert all files for Unicode ....
Conversion Summary for Source Codes in :
Source: /opt/oracle/psft/pt/tools/src/cbl/
Target: /opt/oracle/psft/pt/tools/src/cblunicode/
Number of Copy Libraries Read: 71
Modified: 71
Not Modified: 0
Number of Programs Read: 44
Modified: 44
Not Modified: 0
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : All COBOL files were converted for Unicode successfully
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPCBLAE.cbl ...
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak: line 249: cob: command not found
cp: cannot stat `PTPCBLAE.gnt': No such file or directory
cp: cannot stat `PTPCBLAE.int': No such file or directory
cp: cannot stat `PTPCBLAE.lst': No such file or directory
...What about env. variables? COBDIR, COBPATH and COBOL do not appear anywhere in the environment...
[psadm1@psovmfscmfp2 setup]$ env|grep -i cobol
[psadm1@psovmfscmfp2 setup]$
Let's set the env variables as we would expect them to be (page 27, step 17 of the given doc), and retry compiling the COBOL:
[psadm1@psovmfscmfp2 setup]$ export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
[psadm1@psovmfscmfp2 setup]$ export LD_LIBRARY_PATH=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit/lib:$LD_LIBRARY_PATH
[psadm1@psovmfscmfp2 setup]$ export PATH=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit/bin:$PATH
[psadm1@psovmfscmfp2 setup]$ ./pscbl.mak
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Convert all files for Unicode ....
Conversion Summary for Source Codes in :
Source: /opt/oracle/psft/pt/tools/src/cbl/
Target: /opt/oracle/psft/pt/tools/src/cblunicode/
Number of Copy Libraries Read: 71
Modified: 71
Not Modified: 0
Number of Programs Read: 44
Modified: 44
Not Modified: 0
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : All COBOL files were converted for Unicode successfully
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPCBLAE.cbl ...
Micro Focus LMF - 010: Unable to contact license manager. This product has been unable to contact the License Manager. Execution of this product has been terminated. This product cannot execute without the License Manager. Contact your license administrator or refer to the 'Information Messages' chapter of the License Management Facility Administrator's Guide.
cob64: error(s) in compilation: PTPCBLAE.cbl
cp: cannot stat `PTPCBLAE.gnt': No such file or directory
cp: cannot stat `PTPCBLAE.int': No such file or directory
cp: cannot stat `PTPCBLAE.lst': No such file or directory
...Ok, maybe a bit better; at least it is now trying to contact the LMF. Probably the LMF is not started. Let's try to start it:
[root@psovmfscmfp2 microfocus]# ./mflmman
MF-LMF:Thu Sep 13 07:19:37 2012: LMF Starting
[root@psovmfscmfp2 microfocus]#
Good, it is starting now, which means it wasn't before (sic). Now retry the compile:
[psadm1@psovmfscmfp2 setup]$ export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
[psadm1@psovmfscmfp2 setup]$ export LD_LIBRARY_PATH=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit/lib:$LD_LIBRARY_PATH
[psadm1@psovmfscmfp2 setup]$ export PATH=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit/bin:$PATH
[psadm1@psovmfscmfp2 setup]$ ./pscbl.mak
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Convert all files for Unicode ....
Conversion Summary for Source Codes in :
Source: /opt/oracle/psft/pt/tools/src/cbl/
Target: /opt/oracle/psft/pt/tools/src/cblunicode/
Number of Copy Libraries Read: 71
Modified: 71
Not Modified: 0
Number of Programs Read: 44
Modified: 44
Not Modified: 0
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : All COBOL files were converted for Unicode successfully
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPCBLAE.cbl ...
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPCURND.cbl ...
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPDBTST.cbl ...
<snipped>
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPWLGEN.cbl ...
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : All COBOL programs have been successfully compiled.
/opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : The COBOL executables were copied to /opt/oracle/psft/pt/tools/cblbin
rm: cannot remove `/opt/oracle/psft/pt/apptools/src/cblunicode/CECCRLP1.cbl': Permission denied
rm: cannot remove `/opt/oracle/psft/pt/apptools/src/cblunicode/CECCRLUP.cbl': Permission denied
It looks better; I think the last lines marked "Permission denied" can safely be ignored.
Those files are owned by psadm3 and are read-only for other users (sic). Of more concern, I'm wondering why it looks into apptools (???) whereas I'm using psadm1 (tools only, COBPATH=/opt/oracle/psft/pt/tools/cblbin).
Anyway, it seems the *.gnt files required to run the COBOL programs are now in cblbin:
[psadm1@psovmfscmfp2 setup]$ ls /opt/oracle/psft/pt/tools/cblbin
PTPCBLAE.gnt PTPDTTST.gnt PTPECOBL.gnt PTPLOGMS.gnt PTPRATES.gnt PTPSQLGS.gnt PTPTESTU.gnt PTPTSCNT.gnt PTPTSLOG.gnt PTPTSTBL.gnt PTPTSWHR.gnt
PTPCURND.gnt PTPDTWRK.gnt PTPEFCNV.gnt PTPMETAS.gnt PTPRUNID.gnt PTPSQLRT.gnt PTPTESTV.gnt PTPTSEDS.gnt PTPTSREQ.gnt PTPTSUPD.gnt PTPUPPER.gnt
PTPDBTST.gnt PTPDYSQL.gnt PTPERCUR.gnt PTPNETRT.gnt PTPSETAD.gnt PTPSTRFN.gnt PTPTFLDW.gnt PTPTSEDT.gnt PTPTSSET.gnt PTPTSUSE.gnt PTPUSTAT.gnt
PTPDEC31.gnt PTPECACH.gnt PTPESLCT.gnt PTPNTEST.gnt PTPSHARE.gnt PTPTEDIT.gnt PTPTLREC.gnt PTPTSFLD.gnt PTPTSTAE.gnt PTPTSWHE.gnt PTPWLGEN.gnt
[psadm1@psovmfscmfp2 setup]$
Now let's try to link the COBOLs:
[psadm1@psovmfscmfp2 setup]$ ./psrun.mak
./psrun.mak - linking PSRUN for oel-5-x86_64, Version 2.6.32-200.13.1.el5uek ...
./psrun.mak - Successfully created PSRUN in directory: /opt/oracle/psft/pt/tools/bin
./psrun.mak - linking PSRUNRMT for oel-5-x86_64, Version 2.6.32-200.13.1.el5uek ...
./psrun.mak - Successfully created PSRUNRMT in directory: /opt/oracle/psft/pt/tools/bin
[psadm1@psovmfscmfp2 setup]$
The .err files are empty:
-rw-r--r-- 1 psadm1 oracle 0 Sep 13 07:26 psrun.err
-rw-r--r-- 1 psadm1 oracle 0 Sep 13 07:26 psrunrmt.err
So far, so good. We are able to test the sample COBOL again... until the next failure.
Yes, unfortunately, it fails again. But the good thing is, the log is not empty now:
PSRUN: error while loading shared libraries: libcobrts64.so: cannot open shared object file: No such file or directory
That probably comes from some libraries missing during the psprcs.cfg configuration. Let's use the same env. variable settings as for psadm1 when compiling the COBOLs.
[psadm2@psovmfscmfp2 appserv]$ export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
[psadm2@psovmfscmfp2 appserv]$ export LD_LIBRARY_PATH=$COBDIR/lib:$LD_LIBRARY_PATH
[psadm2@psovmfscmfp2 appserv]$ export PATH=$COBDIR/bin:$PATH
[psadm2@psovmfscmfp2 appserv]$ ./psadmin
Reconfigure, restart the prcs domain and re-test... SUCCESSFULLY!!!
Log from PTPDBTST process shows :
SUCCESSFUL DATABASE CONNECTION
SUCCESSFUL DATABASE DISCONNECT
What a pain!
I did not go further, but we could expect the same issue within the Application COBOLs, since the cblbin directory is also empty out there.
According to psprcsrv.env, there are two directories in COBPATH, and the one for the application COBOLs is empty:
[psadm2@psovmfscmfp2 PRCSDOM]$ more psprcsrv.env
INFORMIXSERVER=192.168.1.149
COBPATH=/opt/oracle/psft/pt/apptools/cblbin:/opt/oracle/psft/pt/tools/cblbin
PATH=/opt/oracle/psft/pt/apptools/bin:/opt/oracle/psft/pt/apptools/bin/interfacedrivers::/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit/bin:/opt/oracle/psft/pt/tools/appserv:/opt
/oracle/psft/pt/tools/setup:/opt/oracle/psft/pt/tools/jre/bin:/opt/oracle/psft/pt/bea/tuxedo/bin:.:/opt/oracle/psft/pt/oracle-client/11.2.0.x/bin:/opt/oracle/psft/pt/oracle-clie
nt/11.2.0.x/perl/bin:/usr/local/bin:/bin:/usr/bin:/opt/oracle/psft/pt/tools/bin:/opt/oracle/psft/pt/tools/bin/sqr/ORA/bin:/opt/oracle/psft/pt/tools/verity/linux/_ilnx21/bin:/hom
e/psadm2/bin:.
[psadm2@psovmfscmfp2 PRCSDOM]$ ls /opt/oracle/psft/pt/apptools/cblbin
[psadm2@psovmfscmfp2 PRCSDOM]$ ls /opt/oracle/psft/pt/tools/cblbin
PTPCBLAE.gnt PTPDTTST.gnt PTPECOBL.gnt PTPLOGMS.gnt PTPRATES.gnt PTPSQLGS.gnt PTPTESTU.gnt PTPTSCNT.gnt PTPTSLOG.gnt PTPTSTBL.gnt PTPTSWHR.gnt
PTPCURND.gnt PTPDTWRK.gnt PTPEFCNV.gnt PTPMETAS.gnt PTPRUNID.gnt PTPSQLRT.gnt PTPTESTV.gnt PTPTSEDS.gnt PTPTSREQ.gnt PTPTSUPD.gnt PTPUPPER.gnt
PTPDBTST.gnt PTPDYSQL.gnt PTPERCUR.gnt PTPNETRT.gnt PTPSETAD.gnt PTPSTRFN.gnt PTPTFLDW.gnt PTPTSEDT.gnt PTPTSSET.gnt PTPTSUSE.gnt PTPUSTAT.gnt
PTPDEC31.gnt PTPECACH.gnt PTPESLCT.gnt PTPNTEST.gnt PTPSHARE.gnt PTPTEDIT.gnt PTPTLREC.gnt PTPTSFLD.gnt PTPTSTAE.gnt PTPTSWHE.gnt PTPWLGEN.gnt
[psadm2@psovmfscmfp2 PRCSDOM]$
The directory "/opt/oracle/psft/pt/apptools/cblbin" is owned by psadm3 and hosted on the database server (NFS mounted), so I assume we also need to set the proper env variables and compile those COBOLs before being able to use them.
To summarize what I did to make the COBOLs work on this PSOVM:
1. As root, start LMF (this has to be done only once)
cd /opt/oracle/psft/pt/cobol/microfocus
./mflmman
2. As psadm1, set the proper env. variables and compile (setting the env variables has to be done each time you want to compile COBOLs)
export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
export LD_LIBRARY_PATH=$COBDIR/lib:$LD_LIBRARY_PATH
export PATH=$COBDIR/bin:$PATH
cd $PS_HOME/setup
./pscbl.mak
./psrun.mak
3. As psadm2, set the proper env. variables, reconfigure psprcs.cfg and restart (setting the env variables has to be done each time you want to start the process scheduler, so it is probably better to append them to .bash_profile)
export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
export LD_LIBRARY_PATH=$COBDIR/lib:$LD_LIBRARY_PATH
export PATH=$COBDIR/bin:$PATH
cd $PS_HOME/appserv
./psadmin
4. Same as step 2, but with user psadm3.
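Since the exports in steps 2 and 3 must be redone on every login, a small profile addition saves the trouble. A sketch, assuming ~/.bash_profile is read at login for the psadm users (paths as above):

```shell
# Micro Focus COBOL environment (append to ~/.bash_profile of psadm1,
# psadm2 and psadm3 so the variables are set on every login)
export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
export LD_LIBRARY_PATH=$COBDIR/lib:$LD_LIBRARY_PATH
export PATH=$COBDIR/bin:$PATH
```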
HTH,
Nicolas.
PS: will it be the same issue on the HCM template delivered at the same time ? To be tested as well.
PS2: and yes, I tested it twice before posting, result is same.
Edited by: N Gasparotto on Sep 13, 2012 5:17 PM
Fortunately, the COBOL issue does not exist on the PSOVM HCM9.1 FP2 PT8.52.06 template delivered in July 2012 (v3). The COBOLs are properly compiled (tools and app COBOLs), cblbin is not empty, and they run successfully on the first shot.
Nicolas. -
Getting "Incorrect client ID. The stub is not connected yet" during deployment
Hi all,
I am unable to deploy a J2EE application from the Deploy Tool. The following trace appears in defaultTrace.trc.2:
#1.5#000D9D9FAF62003C0000000900000F080003FC0472DB309D#1121520967258#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4#Guest#2##sap_J2E_1009250#Guest#097b3220f5fd11d9b557000d9d9faf62#Finalizer##0#0#Error#1#/System/Server#Plain#ID:Can't send inform message.: Incorrect client ID. The stub is not connected yet.#com.sap.engine.services.monitor.mbeans.MonitorResourceBundle##
#1.5#000D9D9FAF62003C0000000A00000F080003FC0472DB33DE#1121520967258#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4#Guest#2##sap_J2E_1009250#Guest#097b3220f5fd11d9b557000d9d9faf62#Finalizer##0#0#Error#1#/System/Server#Plain###Incorrect client ID. The stub is not connected yet.#
#1.5#000D9D9FAF62003C0000000B00000F080003FC0472DB3A15#1121520967258#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4.com.sap.engine.services.rmi_p4.StubImpl p4_finalize#Guest#2##sap_J2E_1009250#Guest#097b3220f5fd11d9b557000d9d9faf62#Finalizer##0#0#Error##Plain###
com.sap.engine.services.rmi_p4.exception.P4BaseIOException: Incorrect client ID. The stub is not connected yet.
at com.sap.engine.services.rmi_p4.server.P4ObjectBrokerServerImpl.getException(P4ObjectBrokerServerImpl.java:860)
at com.sap.engine.services.rmi_p4.server.P4ObjectBrokerServerImpl.getException(P4ObjectBrokerServerImpl.java:853)
at com.sap.engine.services.rmi_p4.server.P4SessionProcessor.reply(P4SessionProcessor.java:169)
at com.sap.engine.services.rmi_p4.StubImpl.p4_finalize(StubImpl.java:221)
at com.sap.engine.services.rmi_p4.StubBase.finalize(StubBase.java:121)
at java.lang.ref.Finalizer.invokeFinalizeMethod(Native Method)
at java.lang.ref.Finalizer.runFinalizer(Finalizer.java:83)
at java.lang.ref.Finalizer.access$100(Finalizer.java:14)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:160)
#1.5#000D9D9FAF62003C0000000C00000F080003FC0472DB3F36#1121520967258#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4#Guest#2##sap_J2E_1009250#Guest#097b3220f5fd11d9b557000d9d9faf62#Finalizer##0#0#Error#1#/System/Server#Plain#ID:Can't send inform message.: Incorrect client ID. The stub is not connected yet.#com.sap.engine.services.monitor.mbeans.MonitorResourceBundle##
#1.5#000D9D9FAF62003C0000000D00000F080003FC0472DB423D#1121520967258#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4#Guest#2##sap_J2E_1009250#Guest#097b3220f5fd11d9b557000d9d9faf62#Finalizer##0#0#Error#1#/System/Server#Plain###Incorrect client ID. The stub is not connected yet.#
#1.5#000D9D9FAF62003C0000000E00000F080003FC0472DB47AA#1121520967258#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4.com.sap.engine.services.rmi_p4.StubImpl p4_finalize#Guest#2##sap_J2E_1009250#Guest#097b3220f5fd11d9b557000d9d9faf62#Finalizer##0#0#Error##Plain###
com.sap.engine.services.rmi_p4.exception.P4BaseIOException: Incorrect client ID. The stub is not connected yet.
at com.sap.engine.services.rmi_p4.server.P4ObjectBrokerServerImpl.getException(P4ObjectBrokerServerImpl.java:860)
at com.sap.engine.services.rmi_p4.server.P4ObjectBrokerServerImpl.getException(P4ObjectBrokerServerImpl.java:853)
at com.sap.engine.services.rmi_p4.server.P4SessionProcessor.reply(P4SessionProcessor.java:169)
at com.sap.engine.services.rmi_p4.StubImpl.p4_finalize(StubImpl.java:221)
at com.sap.engine.services.rmi_p4.StubBase.finalize(StubBase.java:121)
at java.lang.ref.Finalizer.invokeFinalizeMethod(Native Method)
at java.lang.ref.Finalizer.runFinalizer(Finalizer.java:83)
at java.lang.ref.Finalizer.access$100(Finalizer.java:14)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:160)
#1.5#000D9D9FAF62003B0000000100000F080003FC047395DF47#1121520979492#com.sap.engine.compilation##com.sap.engine.compilation.ExternalCompiler.compile()#Administrator#32###Administrator#8c8d9e90f5fe11d99d3d000d9d9faf62#SAPEngine_Application_Thread[impl:3]_32##0#0#Error##Plain###Error while compiling :
java.io.IOException: CreateProcess: javac -encoding Cp1252 -d C:/usr/sap/J2E/JC00/j2ee/cluster/server0/apps/sap.com/onlyear/EJBContainer/temp/temp1121520967445 -classpath .;./bin/system/frame.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/ext/jms/jms.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/ext/tcsecssl/iaik_jsse.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/ext/webservices_lib/saaj-api.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/ext/tcsecssl/iaik_smime.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/ext/add_ejb/add_ejb.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/ext/servlet/servlet.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/services/naming/naming.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/ext/webservices_lib/webservices_lib.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/interfaces/resourcecontext_api/resourcecontext_api.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/ext/tcsecssl/w3c_http.jar;C:/usr/sap/J2E/JC00/j2ee/cluster/server0/bin/interfaces/webservices/webservices_api.jar;C:/us?
at java.lang.Win32Process.create(Native Method)
at java.lang.Win32Process.<init>(Win32Process.java:66)
at java.lang.Runtime.execInternal(Native Method)
at java.lang.Runtime.exec(Runtime.java:566)
at java.lang.Runtime.exec(Runtime.java:491)
at java.lang.Runtime.exec(Runtime.java:457)
at com.sap.engine.compilation.ExternalCompiler.compile(ExternalCompiler.java:65)
at com.sap.engine.services.ejb.util.AdminUtils.compile(AdminUtils.java:449)
at com.sap.engine.services.ejb.deploy.DeployAdmin.deploySingleJar(DeployAdmin.java:625)
at com.sap.engine.services.ejb.deploy.DeployAdmin.generate(DeployAdmin.java:266)
at com.sap.engine.services.ejb.EJBAdmin.deploy(EJBAdmin.java:2093)
at com.sap.engine.services.deploy.server.application.DeploymentTransaction.makeComponents(DeploymentTransaction.java:1015)
at com.sap.engine.services.deploy.server.application.DeploymentTransaction.begin(DeploymentTransaction.java:594)
at com.sap.engine.services.deploy.server.application.ApplicationTransaction.makeAllPhasesOnOneServer(ApplicationTransaction.java:300)
at com.sap.engine.services.deploy.server.application.ApplicationTransaction.makeAllPhases(ApplicationTransaction.java:331)
at com.sap.engine.services.deploy.server.DeployServiceImpl.makeGlobalTransaction(DeployServiceImpl.java:2910)
at com.sap.engine.services.deploy.server.DeployServiceImpl.deploy(DeployServiceImpl.java:451)
at com.sap.engine.services.deploy.server.DeployServiceImplp4_Skel.dispatch(DeployServiceImplp4_Skel.java:1511)
at com.sap.engine.services.rmi_p4.DispatchImpl._runInternal(DispatchImpl.java:286)
at com.sap.engine.services.rmi_p4.DispatchImpl._run(DispatchImpl.java:172)
at com.sap.engine.services.rmi_p4.server.P4SessionProcessor.request(P4SessionProcessor.java:104)
at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:37)
at com.sap.engine.core.cluster.impl6.session.UnorderedChannel$MessageRunner.run(UnorderedChannel.java:71)
at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
at java.security.AccessController.doPrivileged(Native Method)
at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:94)
at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:140)
#1.5#000D9D9FAF62003B0000000900000F080003FC04739BCB45#1121520979883#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4#Administrator#32###Administrator#8c8d9e90f5fe11d99d3d000d9d9faf62#SAPEngine_Application_Thread[impl:3]_32##0#0#Error##Plain###com.sap.engine.services.rmi_p4.DispatchImpl _runInternal Cannot deploy application sap.com/onlyear..
Reason: Exception during generation of components of application sap.com/onlyear in container EJBContainer.; nested exception is:
com.sap.engine.services.deploy.exceptions.ServerDeploymentException: Exception during generation of components of application sap.com/onlyear in container EJBContainer.#
#1.5#000D9D9FAF62003B0000000A00000F080003FC04739BD41D#1121520979883#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4.com.sap.engine.services.rmi_p4.DispatchImpl runInternal#Administrator#32###Administrator#8c8d9e90f5fe11d99d3d000d9d9faf62#SAPEngineApplication_Thread[impl:3]_32##0#0#Error##Plain###
java.rmi.RemoteException: Cannot deploy application sap.com/onlyear..
Reason: Exception during generation of components of application sap.com/onlyear in container EJBContainer.; nested exception is:
com.sap.engine.services.deploy.exceptions.ServerDeploymentException: Exception during generation of components of application sap.com/onlyear in container EJBContainer.
at com.sap.engine.services.deploy.server.DeployServiceImpl.deploy(DeployServiceImpl.java:466)
at com.sap.engine.services.deploy.server.DeployServiceImplp4_Skel.dispatch(DeployServiceImplp4_Skel.java:1511)
at com.sap.engine.services.rmi_p4.DispatchImpl._runInternal(DispatchImpl.java:286)
at com.sap.engine.services.rmi_p4.DispatchImpl._run(DispatchImpl.java:172)
at com.sap.engine.services.rmi_p4.server.P4SessionProcessor.request(P4SessionProcessor.java:104)
at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:37)
at com.sap.engine.core.cluster.impl6.session.UnorderedChannel$MessageRunner.run(UnorderedChannel.java:71)
at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
at java.security.AccessController.doPrivileged(Native Method)
at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:94)
at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:140)
Caused by: com.sap.engine.services.deploy.exceptions.ServerDeploymentException: Exception during generation of components of application sap.com/onlyear in container EJBContainer.
at com.sap.engine.services.deploy.server.application.DeploymentTransaction.makeComponents(DeploymentTransaction.java:1021)
at com.sap.engine.services.deploy.server.application.DeploymentTransaction.begin(DeploymentTransaction.java:594)
at com.sap.engine.services.deploy.server.application.ApplicationTransaction.makeAllPhasesOnOneServer(ApplicationTransaction.java:300)
at com.sap.engine.services.deploy.server.application.ApplicationTransaction.makeAllPhases(ApplicationTransaction.java:331)
at com.sap.engine.services.deploy.server.DeployServiceImpl.makeGlobalTransaction(DeployServiceImpl.java:2910)
at com.sap.engine.services.deploy.server.DeployServiceImpl.deploy(DeployServiceImpl.java:451)
... 10 more
Caused by: java.lang.NullPointerException
at com.sap.engine.compilation.ExternalCompiler.getErrorMessage(ExternalCompiler.java:123)
at com.sap.engine.services.ejb.util.AdminUtils.compile(AdminUtils.java:451)
at com.sap.engine.services.ejb.deploy.DeployAdmin.deploySingleJar(DeployAdmin.java:625)
at com.sap.engine.services.ejb.deploy.DeployAdmin.generate(DeployAdmin.java:266)
at com.sap.engine.services.ejb.EJBAdmin.deploy(EJBAdmin.java:2093)
at com.sap.engine.services.deploy.server.application.DeploymentTransaction.makeComponents(DeploymentTransaction.java:1015)
... 15 more
#1.5#000D9D9FAF62003B0000000D00000F080003FC04739BDA8D#1121520979883#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4#Administrator#32###Administrator#8c8d9e90f5fe11d99d3d000d9d9faf62#SAPEngine_Application_Thread[impl:3]_32##0#0#Error#1#/System/Server#Plain#ID:011403: Cannot deploy application sap.com/onlyear..
Reason: Exception during generation of components of application sap.com/onlyear in container EJBContainer.; nested exception is:
com.sap.engine.services.deploy.exceptions.ServerDeploymentException: Exception during generation of components of application sap.com/onlyear in container EJBContainer.#com.sap.engine.services.monitor.mbeans.MonitorResourceBundle##
#1.5#000D9D9FAF62003B0000000E00000F080003FC04739BDCAE#1121520979883#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4#Administrator#32###Administrator#8c8d9e90f5fe11d99d3d000d9d9faf62#SAPEngine_Application_Thread[impl:3]_32##0#0#Error#1#/System/Server#Plain###Cannot deploy application sap.com/onlyear..
Reason: Exception during generation of components of application sap.com/onlyear in container EJBContainer.; nested exception is:
com.sap.engine.services.deploy.exceptions.ServerDeploymentException: Exception during generation of components of application sap.com/onlyear in container EJBContainer.#
#1.5#000D9D9FAF62003B0000000F00000F080003FC04739BDEA3#1121520979883#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4#Administrator#32###Administrator#8c8d9e90f5fe11d99d3d000d9d9faf62#SAPEngine_Application_Thread[impl:3]_32##0#0#Error##Plain###TraceException in DispatchImpl...#
#1.5#000D9D9FAF62003B0000001000000F080003FC04739BDFA7#1121520979883#com.sap.engine.services.rmi_p4##com.sap.engine.services.rmi_p4#Administrator#32###Administrator#8c8d9e90f5fe11d99d3d000d9d9faf62#SAPEngine_Application_Thread[impl:3]_32##0#0#Error##Plain###P4:SAPEngine_Application_Thread[impl:3]_32: TraceException in DispatchImpl...#
The above trace repeats every time I try to deploy.
Can anybody tell me whether this is a configuration problem or an application problem? Another J2EE application deploys successfully on the same server.
Thanks a lot,
Gangadhar.
Hi,
Thanks Brandelik; the problem was actually caused by a java.lang.OutOfMemoryError at compilation time.
As of now I am not able to fix that problem. I am changing all the available options for memory settings.
Can anybody please tell me how to set the memory parameters for deploying an application? My application needs a minimum of -Xmx512m for deployment on the Oracle 10g server.
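One quick way to confirm whether a -Xmx setting actually reached the server VM is to print the heap ceiling the running JVM reports. This is a generic sketch (plain Java, not tied to any SAP or Oracle configuration tool, and HeapCheck is a hypothetical class name):

```java
// HeapCheck is a hypothetical helper. Run it with the flags in question,
// e.g. `java -Xmx512m HeapCheck`, and compare the printed value.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap (MB): " + (maxBytes / (1024 * 1024)));
    }
}
```

If the printed value is far below 512 MB, the flag was probably set on the wrong VM (for example on the deploy tool instead of the server process).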
Thanx,
Gangadhar. -
Oracle Security: what do you think about the following policy violations?
If you install OEM10, you will be able to see whether you violate some security guidelines:
Interesting is revoking UTL_FILE from public, which is critical. Also revoke UTL_TCP and UTL_SMTP. This is going to upset an expert I know...
Take care with the failed login attempts check. If you set the limit to 10 in the default profile, and your DBSNMP password is NOT the default password, then Oracle will lock the account after node discovery!
In Solaris, you can disable execution of the user stack with the system parameters set noexec_user_stack=1 and set noexec_user_stack_log=1. I did not find out how to do it on AIX. Note, however, that those settings may have side effects.
About the ports: it complains about open ports even when the port in question is the one the Oracle listener is using! Simply ignore most of the violations there.
About JAccelerator (NCOMP), it is located on the "companion" CD.
Ok, Waiting for your feedback
Regards
Laurent
[High] Critical Patch Advisories for Oracle Homes Configuration Host Checks Oracle Homes for missing critical patches
[High] Insufficient Number of Control Files Configuration Database Checks for use of a single control file
[High] Open ports Security Host Check for open ports
[High] Remote OS role Security Database Check for insecure authentication of remote users (remote OS role)
[High] EXECUTE UTL_FILE privileges to PUBLIC Security Database Test for PUBLIC having EXECUTE privilege on the UTL_FILE package
[High] Listener direct administration Security Listener Ensure that listeners cannot be administered directly
[High] Remote OS authentication Security Database Check for insecure authentication of remote users (remote OS authentication)
[High] Listener password Security Listener Test for password-protected listeners
[High] HTTP Server Access Logging Security HTTP Server Check that HTTP Server access logging is enabled
[High] Web Cache Access Logging Security Web Cache Check that Web Cache access logging is enabled
[High] Web Cache Dummy wallet Security Web Cache Check that dummy wallet is not used for production SSL load.
[High] HTTP Server Dummy wallet Security HTTP Server Check that dummy wallet is not used for production SSL load.
[High] Web Cache owner and setuid bit' Security Web Cache Check that webcached binary is not owned by root and setuid is not set
[High] HTTP Server Owner and setuid bit Security HTTP Server Check the httpd binary is not owned by root and setuid bit is not set.
[High] HTTP Server Directory Indexing Security HTTP Server Check that Directory Indexing is disabled on this HTTP Server
[High] Insufficient Redo Log Size Storage Database Checks for redo log files less than 1 Mb
[Medium] Insufficient Number of Redo Logs Configuration Database Checks for use of less than three redo logs
[Medium] Invalid Objects Objects Database Checks for invalid objects
[Medium] Insecure services Security Host Check for insecure services
[Medium] DBSNMP privileges Security Database Check that DBSNMP account has sufficient privileges to conduct all security tests
[Medium] Remote password file Security Database Check for insecure authentication of remote users (remote password file)
[Medium] Default passwords Security Database Test for known accounts having default passwords
[Medium] Unlimited login attempts Security Database Check for limits on the number of failed logging attempts
[Medium] Web Cache Writable files Security Web Cache Check that there are no group or world writable files in the Document Root directory.
[Medium] HTTP Server Writable files Security HTTP Server Check that there are no group or world writable files in the Document Root directory
[Medium] Excessive PUBLIC EXECUTE privileges Security Database Check for PUBLIC having EXECUTE privileges on powerful packages
[Medium] SYSTEM privileges to PUBLIC Security Database Check for SYSTEM privileges granted to PUBLIC
[Medium] Well-known accounts Security Database Test for accessibility of well-known accounts
[Medium] Execute Stack Security Host Check for OS config parameter which enables execution of code on the user stack
[Medium] Use of Unlimited Autoextension Storage Database Checks for tablespaces with at least one datafile whose size is unlimited
[Informational] Force Logging Disabled Configuration Database When Data Guard Broker is being used, checks primary database for disabled force logging
[Informational] Not Using Spfile Configuration Database Checks for spfile not being used
[Informational] Use of Non-Standard Initialization Parameters Configuration Database Checks for use of non-standard initialization parameters
[Informational] Flash Recovery Area Location Not Set Configuration Database Checks for flash recovery area not set
[Informational] Installation of JAccelerator (NCOMP) Installation Database Checks for installation of JAccelerator (NCOMP) that improves Java Virtual Machine performance by running natively compiled (NCOMP) classes
[Informational] Listener logging status Security Listener Test for logging status of listener instances
[Informational] Non-uniform Default Extent Size Storage Database Checks for tablespaces with non-uniform default extent size
[Informational] Not Using Undo Space Management Storage Database Checks for undo space management not being used
[Informational] Users with Permanent Tablespace as Temporary Tablespace Storage Database Checks for users using a permanent tablespace as the temporary tablespace
[Informational] Rollback in SYSTEM Tablespace Storage Database Checks for rollback segments in SYSTEM tablespace
[Informational] Non-System Data Segments in System Tablespaces Storage Database Checks for data segments owned by non-system users located in tablespaces SYSTEM and SYSAUX
[Informational] Users with System Tablespace as Default Tablespace Storage Database Checks for non-system users using SYSTEM or SYSAUX as the default tablespace
[Informational] Dictionary Managed Tablespaces Storage Database Checks for dictionary managed tablespaces (other than SYSTEM and SYSAUX)
[Informational] Tablespaces Containing Rollback and Data Segments Storage Database Checks for tablespaces containing both rollback (other than SYSTEM) and data segments
[Informational] Segments with Extent Growth Policy Violation Storage Database Checks for segments in dictionary managed tablespaces (other than SYSTEM and SYSAUX) having irregular extent sizes and/or non-zero Percent Increase settings
Interesting is revoking UTL_FILE from public, which is critical. Also revoke UTL_TCP and UTL_SMTP. This is going to upset an expert I know...
Okay, as this is (I think) aimed at me, I'll fall for it ;)
What is the point of revoking UTL_FILE from PUBLIC? Yes, I know what you think the point is, but without rights on an Oracle DIRECTORY, being able to execute UTL_FILE is useless. Unless of course you're still using the init.ora parameter UTL_FILE_DIR=*, which I sincerely hope you're not.
As for UTL_SMTP and UTL_TCP, I think whether a program is allowed to send e-mail to a given SMTP server is really in the remit of the e-mail administrator rather than the DBA.
Look, DBAs are kings of their realm and can set their own rules. The rest of us have to live with them. A couple of years ago I worked on a project where I was not allowed access to the USER_DUMP_DEST directory. So every time I generated a TRC file I had to phone up the DBA, and a couple of hours later I got an e-mail with an attachment. Secure, yes, but not very productive when I was trying to debug a Row Level Security implementation.
I have worked on both sides of the DBA/Developer fence and I understand both sides of the argument. I think it is important for developers to document all the privileges necessary to make their app run. Maybe you don't have a better way of doing that than revoking privileges from PUBLIC. Or maybe you just want to generate additional communication with developers. That's fine. I know sometimes even DBAs get lonely.
Cheers, APC -
What do people think about the different Generic Java approaches?
I have seen a lot of different approaches to Generic Java, and when people find problems with one approach the normal response has been: "the other approach is worse, with such-and-such a problem; do you have a better way?"
The different approaches I have seen are: (in no particular order)
Please correct me if I am wrong and add other approaches if they are worthy of mention.
1) PolyJ - by MIT
This is a completely different approach from the others: it introduces a new where clause for bounding the types, and involves changing Java byte codes in order to meet its goals.
Main comments were that it is not a Java way of doing things, and that such big changes carry far too great a risk.
2) Pizza - by Odersky & Wadler
This aims at extending Java in more ways than just adding generics. The generic part of Pizza was replaced by GJ, but with Pizza's ability to use primitives as generic types removed, and with much bigger changes allowing GJ to interface with Java.
Main comments were that Pizza doesn't work well with Java, and that many things in Pizza were done in parallel with Java and hence were no longer applicable.
3) GJ - by Bracha, Odersky, Stoutamire & Wadler
This creates classes with erased types and bridging methods, and inserts casts when required when going back to normal java code.
Main comments are that type-dependent operations such as new, instanceof, casts etc. can't be done with parametric types; also it is not a very intuitive approach, and it is difficult to work out what the code should do.
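The limitation being criticized is easy to demonstrate with the erasure-based generics that eventually shipped in Java 5 (a descendant of GJ/JSR-14); this sketch is illustrative only and not code from GJ itself:

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<String>();
        List<Integer> ints = new ArrayList<Integer>();
        // After erasure both instantiations share one runtime class, which is
        // why type-dependent operations (new TypeA(), instanceof on a
        // parameterized type, runtime-checked casts) cannot be expressed.
        System.out.println(strings.getClass() == ints.getClass()); // prints true
    }
}
```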
4) Runtime Generic Information - by Natali & Viroli
Each instance holds information about its Runtime Type.
Main comments from people were that this consumes way too much memory, as each instance holds extra information about its type, and that performance would suffer from checking type information at runtime that would have been known at compile time.
5) NextGen - by Cartwright & Steele
For each parameterized class an abstract base class with types erased is made and then for each new type a lightweight wrapper class and interface are created re-using code from the base class to keep the code small.
Main comments from people were that this approach isn't as backwards compatible as GJ due to replacing the legacy classes with abstract base classes which can't be instantiated.
6) .NET common runtime - by Kennedy & Syme
This was written for adding Generics to C#, however the spec is also targeted at other languages such as VB.
Main comments from people were that this approach isn't Java, and hence is not subject to the restrictions against changing the JVM that Java faces.
7) Fully Generated Generic Classes - by Agesen, Freund & Mitchell
For each new type a new class is generated by a custom class loader, with all the code duplicated for each different type.
Main comments from people were that the generated code size gets too big, and that it is lacking a base class for integration with legacy code.
8) JSR-14 - by Sun
This is meant to come up with a generics solution to be used in Java. Currently it is heavily based on GJ and suffers from all the same problems as GJ, along with the fact that it is constantly undergoing change, so no one knows what to expect.
See this forum for comments about it.
As if we didn't have enough approaches already, here is yet another one that hopefully has all of the benefits, and none of the problems of the other approaches. It uses information learnt while experimenting with the other approaches. Now when people ask me if I think I have a better approach, I will have somewhere to point them to.
(I will be happy to answer questions concerning this approach).
9) Approach #x - by Phillips
At compile time 1 type is made per generic type with the same name.
e.g. class HashSet<TypeA> extends AbstractSet<TypeA> implements Cloneable, Serializable will be translated to the type class HashSet extends AbstractSet implements Cloneable, Serializable.
An instance of the class using Object as TypeA can now be created in 2 different ways:
Set a = new HashSet();
Set<Object> b = new HashSet<Object>();
// a.getClass().equals(b.getClass()) is true
This means that legacy class files don't even need to be re-compiled in order to work with the new classes. This approach is completely backwards compatible.
Inside each type that was created from a generic type there is also some synthetic information.
Information about each of the bounding types is stored in a synthetic field.
Note that each bounding type may be bounded by a class and any number of interfaces, hence a ';' is used to separate bounding types. If there is no class Object is implied.
e.g. class MyClass<TypeA extends Button implements Comparable, Runnable; TypeB> will be translated to a type:
class MyClass {
    public static final Class[][] $GENERIC_DESCRIPTOR = {{Button.class, Comparable.class, Runnable.class}, {Object.class}};
}
This information is used by a Custom Class Loader before generating a new class in order to ensure that the generic types are bounded correctly. It also gets used to establish whether this class can be returned instead of a generated class (this occurs when the generic types are the same as the bounding types, as for new HashSet<Object> above).
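A sketch of how a class loader could use such a descriptor to reject an ill-bounded instantiation; checkBounds is a hypothetical helper, not code from the proposal:

```java
public class BoundsCheck {
    // Each row of the descriptor lists the class/interfaces bounding one type
    // parameter; an actual type argument must satisfy every entry in its row.
    static boolean checkBounds(Class[][] descriptor, Class[] actuals) {
        if (descriptor.length != actuals.length) return false;
        for (int i = 0; i < descriptor.length; i++) {
            for (int j = 0; j < descriptor[i].length; j++) {
                if (!descriptor[i][j].isAssignableFrom(actuals[i])) return false;
            }
        }
        return true;
    }
    public static void main(String[] args) {
        // e.g. a descriptor for: class Sample<TypeA extends Number implements Comparable>
        Class[][] descriptor = {{Number.class, Comparable.class}};
        System.out.println(checkBounds(descriptor, new Class[] {Integer.class})); // true
        System.out.println(checkBounds(descriptor, new Class[] {String.class}));  // false
    }
}
```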
There is another synthetic field of type byte[] that stores bytes in order for the Custom Class Loader to generate the new Type.
There are also static methods corresponding to each method that contain the implementation for each method. These methods take parameters as required to gain access to fields, constructors, other methods, the calling object, the calling object's class, etc. Fields are passed to get and set values in the calling object. Constructors are passed to create new instances of the calling object. Other methods are passed when super methods are called from within the class. The calling object is almost always passed for non-static methods, in order to do things with it. The class is passed when things like instanceof the generated type need to be done.
Also in this class are any non private methods that were there before, using the Base Bounded Types, in order that the class can be used exactly as it was before Generics.
Notes: the time-consuming reflection stuff is only done once per class (not per instance) and stored in static fields. The other reflection stuff getting done is very quick in JDK 1.4.1 (the same cannot be said of some earlier JDKs).
Also these static methods can call each other in many circumstances (for example when the method getting called is private, final or static).
As well as the ClassLoader and other classes required by it there is a Reflection class. This class is used to do things that are known to be safe (assuming the compiler generated the classes correctly) without throwing any exceptions.
Here is a cut down version of the Reflection class:

public final class Reflection {
    public static final Field getDeclaredField(Class aClass, String aName) {
        try {
            Field field = aClass.getDeclaredField(aName);
            field.setAccessible(true);
            return field;
        } catch (Exception ex) {
            throw new Error(ex);
        }
    }
    public static final Object get(Field aField, Object anObject) {
        try {
            return aField.get(anObject);
        } catch (Exception ex) {
            throw new Error(ex);
        }
    }
    public static final void set(Field aField, Object anObject, Object aValue) {
        try {
            aField.set(anObject, aValue);
        } catch (Exception ex) {
            throw new Error(ex);
        }
    }
    public static final int getInt(Field aField, Object anObject) {
        try {
            return aField.getInt(anObject);
        } catch (Exception ex) {
            throw new Error(ex);
        }
    }
    public static final void setInt(Field aField, Object anObject, int aValue) {
        try {
            aField.setInt(anObject, aValue);
        } catch (Exception ex) {
            throw new Error(ex);
        }
    }
}

Last but not least, at Runtime one very lightweight wrapper class per type is created as required by the custom class loader. Basically the class loader uses the Generic Bytes as the template, replacing the erased types with the new types. This can be even faster than loading a normal class file from disk and creating it.
Each of these classes has any non-private methods that were there before, making calls to the generating class to perform their work. The reason they don't have any real code themselves is that that would lead to code bloat; however, very small methods can keep their code inside their wrapper without affecting functionality.
My final example assumes the following class name mangling convention:
* A<component type> - Array
* b - byte
* c - char
* C<class name length><class name> - Class
* d - double
* f - float
* i - int
* l - long
* z - boolean
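For the Class case, the convention above can be sketched as a small helper (mangle is a hypothetical name, and arrays and primitives are ignored here):

```java
public class Mangle {
    // 'C' + length of the fully qualified name + the name with '.' -> '_'
    static String mangle(Class aClass) {
        String name = aClass.getName();
        return "C" + name.length() + name.replace('.', '_');
    }
    public static void main(String[] args) {
        System.out.println(mangle(String.class)); // C16java_lang_String
    }
}
```

So new Vector<String> maps to the generated type name Vector$$C16java_lang_String used in the examples below.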
Final Example: (very cut down version of Vector)

public class Vector<TypeA> extends AbstractList<TypeA> implements RandomAccess, Cloneable, Serializable {
    protected Object[] elementData;
    protected int elementCount;
    protected int capacityIncrement;
    public Vector<TypeA>(int anInitialCapacity, int aCapacityIncrement) {
        if (anInitialCapacity < 0) {
            throw new IllegalArgumentException("Illegal Capacity: " + anInitialCapacity);
        }
        elementData = new Object[anInitialCapacity];
        capacityIncrement = aCapacityIncrement;
    }
    public synchronized void setElementAt(TypeA anObject, int anIndex) {
        if (anIndex >= elementCount) {
            throw new ArrayIndexOutOfBoundsException(anIndex + " >= " + elementCount);
        }
        elementData[anIndex] = anObject;
    }
}

would get translated as:

public class Vector extends AbstractList implements RandomAccess, Cloneable, Serializable {
    public static final Class[][] $GENERIC_DESCRIPTOR = {{Object.class}};
    public static final byte[] $GENERIC_BYTES = {/*Generic Bytes Go Here*/};
    protected Object[] elementData;
    protected int elementCount;
    protected int capacityIncrement;
    private static final Field $0 = Reflection.getDeclaredField(Vector.class, "elementData"),
        $1 = Reflection.getDeclaredField(Vector.class, "elementCount"),
        $2 = Reflection.getDeclaredField(Vector.class, "capacityIncrement");
    static void $3(int _0, Field _1, Object _2, Field _3, int _4) {
        if (_0 < 0) {
            throw new IllegalArgumentException("Illegal Capacity: " + _0);
        }
        Reflection.set(_1, _2, new Object[_0]);
        Reflection.setInt(_3, _2, _4);
    }
    static void $4(int _0, Field _1, Object _2, Field _3, Object _4) {
        if (_0 >= Reflection.getInt(_1, _2)) {
            throw new ArrayIndexOutOfBoundsException(_0 + " >= " + Reflection.getInt(_1, _2));
        }
        ((Object[])Reflection.get(_3, _2))[_0] = _4;
    }
    public Vector(int anInitialCapacity, int aCapacityIncrement) {
        $3(anInitialCapacity, $0, this, $2, aCapacityIncrement);
    }
    public synchronized void setElementAt(Object anObject, int anIndex) {
        $4(anIndex, $1, this, $0, anObject);
    }
}

and new Vector<String> would get generated as:

public class Vector$$C16java_lang_String extends AbstractList$$C16java_lang_String implements RandomAccess, Cloneable, Serializable {
    protected Object[] elementData;
    protected int elementCount;
    protected int capacityIncrement;
    private static final Field $0 = Reflection.getDeclaredField(Vector$$C16java_lang_String.class, "elementData"),
        $1 = Reflection.getDeclaredField(Vector$$C16java_lang_String.class, "elementCount"),
        $2 = Reflection.getDeclaredField(Vector$$C16java_lang_String.class, "capacityIncrement");
    public Vector$$C16java_lang_String(int anInitialCapacity, int aCapacityIncrement) {
        Vector.$3(anInitialCapacity, $0, this, $2, aCapacityIncrement);
    }
    public synchronized void setElementAt(String anObject, int anIndex) {
        Vector.$4(anIndex, $1, this, $0, anObject);
    }
}

Comparisons with other approaches:
Compared with PolyJ this is a very Java way of doing things, and furthermore it requires no changes to the JVM or the byte codes.
Compared with Pizza this works very well with java and has been designed using the latest java technologies.
Compared with GJ all type dependent operations can be done, and it is very intuitive, code does exactly the same thing it would have done if it was written by hand.
Compared with Runtime Generic Information no extra information is stored in each instance and hence no extra runtime checks need to get done.
Compared with NextGen this approach is completely backwards compatible. NextGen looks like it was trying to achieve the same goals, but aside from the lack of backwards compatibility it also suffered from the fact that Vector<String> didn't extend AbstractList<String>, causing other minor problems. Also, this approach doesn't create 2 types per new type as NextGen does (although this wasn't a big deal anyway). All that said, NextGen was in my opinion a much better approach than GJ and most of the others.
Compared to .NET common runtime this is java and doesn't require changes to the JVM.
Compared to Fully Generated Generic Classes the classes generated by this approach are very lightweight wrappers, not full blown classes and also it does have a base class making integration with legacy code simple. It should be noted that the functionality of the Fully Generated Generic Classes is the same as this approach, that can't be said for the other approaches.
Compared with JSR-14, this approach doesn't suffer from GJ's problems; also it should be clear what to expect from this approach. Hopefully JSR-14 can be changed before it is too late.
(a) How you intend generic methods to be translated.
Given that Vector and Vector<Object> are unrelated types, what would that type be represented as in the byte code of the method?
In my approach Vector and Vector<Object> are related types. In fact the byte code signature of the existing method is exactly the same as it was in the legacy code using Vector.
To re-emphasize what I had said when explaining my approach:
System.out.println(Vector.class == Vector<Object>.class); // displays true
System.out.println(Vector.class == Vector<String>.class); // displays false
Vector vector1 = new Vector<Object>(); // legal
Vector<Object> vector2 = new Vector(); // legal
// Vector vector3 = new Vector<String>(); // illegal
// Vector<String> vector4 = new Vector(); // illegal
Vector<String> vector5 = new Vector<String>(); // legal
You must also handle the case where the type parameter is itself a parameterized type in which the type parameter is not statically bound to a ground instantiation.
This is also very straightforward (let me know if I have misunderstood you):
(translation of Vector given in my initial description)
public class SampleClass<TypeA> {
    public static void main(String[] args) {
        System.out.println(new Vector<Vector<TypeA>>(10, 10));
    }
}

would get translated as:

public class SampleClass {
    public static final Class[][] $GENERIC_DESCRIPTOR = {{Object.class}};
    public static final byte[] $GENERIC_BYTES = {/*Generic Bytes Go Here*/};
    private static final Constructor $0 = Reflection.getDeclaredConstructor(Vector$$C16java_util_Vector.class, new Class[] {int.class, int.class});
    static void $1(Constructor _0, int _1, int _2) {
        try {
            System.out.println(Reflection.newInstance(_0, new Object[] {new Integer(_1), new Integer(_2)}));
        } catch (Exception ex) {
            throw (RuntimeException)ex;
        }
    }
    public static void main(String[] args) {
        $1($0, 10, 10);
    }
}

and SampleClass<String> would get generated as:

public class SampleClass$$C16java_lang_String {
    private static final Constructor $0 = Reflection.getConstructor(Vector$$C37java_util_Vector$$C16java_lang_String.class, new Class[] {int.class, int.class});
    public static void main(String[] args) {
        SampleClass.$1($0, 10, 10);
    }
}
Also describe the implementation strategy for when these methods are public or protected (i.e. virtual).
As I said in my initial description, for non-final, non-static, non-private method invocations a Method may be passed into the implementing synthetic method as a parameter.
Note: the following main method will display 'in B'.
class A {
    public void foo() {
        System.out.println("in A");
    }
}
class B extends A {
    public void foo() {
        System.out.println("in B");
    }
}
public class QuickTest {
    public static void main(String[] args) {
        try {
            A.class.getMethod("foo", null).invoke(new B(), null);
        } catch (Exception ex) {}
    }
}

This is very important as foo() may be overridden by a subclass, as it is here. By passing a Method to the synthetic implementation, this guarantees that covariance, invariance and contravariance all work exactly the same way as in Java. This is a fundamental problem with many other approaches.
(b) The runtime overhead associated with your translation
As we don't have a working solution to compare this to, performance comments are hard to state, but I hope this helps anyway.
The Class Load time is affected in 4 ways:
i) All the Generic Bytes exist in the Base Class, hence they don't need to be read from storage.
ii) The custom class loader: time to parse the name, plus failed finds before it finally gets to define the class.
iii) The conversion of the generic bytes to parametric bytes (basically involves changing bytes in the Constant Pool, worked out from the new parametric type; Utf8, Class and the new Parametric Constant types may all be affected).
iv) Time to do the static Reflection stuff (this is the main source of the overhead).
Basically this once-per-class overhead is nothing to be concerned with, and Sun could always optimize this part further.
The normal Runtime overhead (once Classes have been loaded) is affected mainly by reflection: On older JDKs the reflection was a lot slower, and so might have made a noticeable impact. On newer JDKs (since 1.4 I think), the reflection performance has been significantly improved. All the time consuming reflection is done once per class (stored in static fields). The normal reflection is very quick (almost identical to what is getting done without reflection). As the wrappers simply include a single method call to another method, these can be in-lined and hence made irrelevant. Furthermore it is not too difficult to make a parameter that would include small methods in the wrapper classes, as this does not affect functionality in the slightest, however in my testing I have found this to be unnecessary.
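The once-per-class caching being described amounts to doing the expensive lookup in a static initializer and reusing the Field on every call; a minimal self-contained sketch (class and field names are illustrative):

```java
import java.lang.reflect.Field;

public class CachedFieldDemo {
    static class Holder {
        int count;
    }
    // The slow lookup happens exactly once, when this class is initialized.
    private static final Field COUNT;
    static {
        try {
            COUNT = Holder.class.getDeclaredField("count");
            COUNT.setAccessible(true);
        } catch (Exception ex) {
            throw new Error(ex);
        }
    }
    public static void main(String[] args) throws Exception {
        Holder h = new Holder();
        COUNT.setInt(h, 42);                  // cheap per-call reflective access
        System.out.println(COUNT.getInt(h));  // prints 42
    }
}
```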
(c) The space overhead (per instantiation)
There are very small wrapper classes (one per new Type) that simply contain all non-private methods, with single method calls to the implementing synthetic method. They also include any fields that were in the original class along with other synthetic fields used to store reflected information, so that the slow reflection only gets done once per new Type.
(d) The per-instance space overhead
None.
(e) Evidence that the proposed translation is sound and well-defined for all relevant cases (see below)
Hope this is enough; if not, let me know what extra proof you need.
(f) Evidence for backward compatibility
(For example, how does an old class file that passes a Vector
to some method handle the case when the method receives a Vector<T>
where T is a type parameter? In your translation these types are unrelated.)
As explained above, in my approach these are only unrelated for T != Object; in the legacy case T == Object, hence legacy code passing in a Vector is exactly the same as passing in a Vector<Object>.
(g) Evidence for forward compatibility
(How, exactly, do class files that are compiled with a generics compiler run on an old VM?)
They run exactly the same way: the byte codes from this approach are all legal Java, and all legal Java is also legal in this approach. In order to take advantage of the Generics, the Custom Class Loader would need to be used, or else one would get ClassNotFoundExceptions, the same way one would when trying to use Collections on an old VM without the Collections classes present. The Custom Class Loader even works on older VMs (note it may run somewhat slower there).
(h) A viable implementation strategy
Type-specific instantiations happen at Class Load time: when the Custom Class Loader gets asked for a new Class, it generates it.
The type specific instantiations are never shipped as they never get persisted. If you really wanted to save them all you need to do is save them with the same name (with the $$ and _'s etc), then the class loader would find them instead of generating them. There is little to be gained by doing this and the only reason I can think of for doing such a thing would be if there was some reason why the target VM couldn't use the Custom Class Loader (the Reflection class would still need to be sent as well, but that is nothing special). Basically they are always generated at Runtime unless a Class with the same name already exists in which case it would be used.
The $GENERIC_DESCRIPTOR and $GENERIC_BYTES from the base class, along with the new Type name, are all that is required to generate the classes at runtime. However many other approaches can achieve the same thing for the generation, and approaches such as NextGen's template approach may be better. As this generation is only done once per class I didn't put much research into this area. The way it currently works is that the $GENERIC_DESCRIPTOR is basically used to verify that a malicious class file is not trying to create a non-type-safe Type, i.e. new Sample<Object>() when the class definition said class Sample<TypeA extends Button>. The $GENERIC_BYTES basically correspond to the normal bytes of a wrapper class file, except that the constant pool has some constants of a new Parametric Constant type that get replaced at class load time. These parametric constants (along with possibly Utf8 and Class constants) are replaced by the Classes at the end of the new type name; it is a little more complex than that, but you probably get the general idea.
These fine implementation details don't affect the approach much anyway, as they basically come down to class-load-time performance. Much of the information in the $GENERIC_BYTES could have been worked out by reflection on the base type, but at least for now simply storing the bytes is a lot easier.
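The load-time generation step can be sketched as a ClassLoader whose findClass intercepts mangled names and defines the class from freshly built bytes. This is a hypothetical illustration only: the real generator would patch the parametric constants in the base class's $GENERIC_BYTES, whereas here, to keep the example self-contained and runnable, emptyClassBytes just hand-assembles the smallest legal class file (an empty subclass of Object).

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class GenLoader extends ClassLoader {
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        if (name.indexOf("$$C") < 0) {         // not a generic instantiation
            throw new ClassNotFoundException(name);
        }
        byte[] bytes = emptyClassBytes(name);  // real version: patch $GENERIC_BYTES
        return defineClass(name, bytes, 0, bytes.length);
    }

    // Emits a minimal class file equivalent to "public class <name> {}".
    static byte[] emptyClassBytes(String name) {
        try {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(buf);
            out.writeInt(0xCAFEBABE);                               // magic
            out.writeShort(0);                                      // minor version
            out.writeShort(49);                                     // major version (Java 5)
            out.writeShort(5);                                      // constant pool count (4 entries + 1)
            out.writeByte(1); out.writeUTF(name.replace('.', '/')); // #1 Utf8: this class
            out.writeByte(7); out.writeShort(1);                    // #2 Class -> #1
            out.writeByte(1); out.writeUTF("java/lang/Object");     // #3 Utf8: superclass
            out.writeByte(7); out.writeShort(3);                    // #4 Class -> #3
            out.writeShort(0x0021);                                 // ACC_PUBLIC | ACC_SUPER
            out.writeShort(2);                                      // this_class  -> #2
            out.writeShort(4);                                      // super_class -> #4
            out.writeShort(0);                                      // no interfaces
            out.writeShort(0);                                      // no fields
            out.writeShort(0);                                      // no methods
            out.writeShort(0);                                      // no attributes
            return buf.toByteArray();
        } catch (IOException ex) {
            throw new RuntimeException(ex);
        }
    }

    public static void main(String[] args) throws Exception {
        Class<?> c = new GenLoader().loadClass("T$$C16java_lang_String");
        System.out.println(c.getName());       // T$$C16java_lang_String
    }
}
```

Because loadClass delegates to the parent loader first, ordinary classes still load normally; only the mangled instantiation names fall through to the generator.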
Note: I have made a small syntax change to the requested class:
public T(X datum) --> public T<X>(X datum)
class T<X> {
    private X datum;
    public T<X>(X datum) {
        this.datum = datum;
    }
    public T<T<X>> box() {
        return new T<T<X>>(this);
    }
    public String toString() {
        return datum.toString();
    }
    public static void main(String[] args) {
        T<String> t = new T<String>("boo!");
        System.out.println(t.box().box());
    }
}

would get translated as:
class T {
    public static final Class[][] $GENERIC_DESCRIPTOR = {{Object.class}};
    public static final byte[] $GENERIC_BYTES = {/*Generic Bytes Go Here*/};
    private Object datum;
    private static final Field $0 = Reflection.getDeclaredField(T.class, "datum");
    private static final Constructor $1 = Reflection.getDeclaredConstructor(T$$C1T.class, new Class[] {T.class});
    static void $2(Field _0, Object _1, Object _2) {
        Reflection.set(_0, _1, _2);
    }
    static Object $3(Constructor _0, Object _1) {
        try {
            return Reflection.newInstance(_0, new Object[] {_1});
        } catch (Exception ex) {
            throw (RuntimeException)ex;
        }
    }
    static String $4(Field _0, Object _1) {
        return Reflection.get(_0, _1).toString();
    }
    static void $5() {
        T$$C16java_lang_String t = new T$$C16java_lang_String("boo!");
        System.out.println(t.box().box());
    }
    public T(Object datum) {
        $2($0, this, datum);
    }
    public T$$C1T box() {
        return (T$$C1T)$3($1, this);
    }
    public String toString() {
        return $4($0, this);
    }
    public static void main(String[] args) {
        $5();
    }
}

As the generic bytes aren't very meaningful and are by no means a requirement of this approach (NextGen's template method for generation may work just as well), here are the generated classes with some unused code commented out instead:
class T$$C28T$$C22T$$C16java_lang_String {
    private T$$C22T$$C16java_lang_String datum;
    private static final Field $0 = Reflection.getDeclaredField(T$$C28T$$C22T$$C16java_lang_String.class, "datum");
    // private static final Constructor $1 = Reflection.getDeclaredConstructor(T$$C34T$$C28T$$C22T$$C16java_lang_String.class, new Class[] {T$$C28T$$C22T$$C16java_lang_String.class});
    public T$$C28T$$C22T$$C16java_lang_String(T$$C22T$$C16java_lang_String datum) {
        T.$2($0, this, datum);
    }
    // public T$$C34T$$C28T$$C22T$$C16java_lang_String box() {
    //     return (T$$C34T$$C28T$$C22T$$C16java_lang_String)T.$3($1, this);
    // }
    public String toString() {
        return T.$4($0, this);
    }
    public static void main(String[] args) {
        T.$5();
    }
}

class T$$C22T$$C16java_lang_String {
    private T$$C16java_lang_String datum;
    private static final Field $0 = Reflection.getDeclaredField(T$$C22T$$C16java_lang_String.class, "datum");
    private static final Constructor $1 = Reflection.getDeclaredConstructor(T$$C28T$$C22T$$C16java_lang_String.class, new Class[] {T$$C22T$$C16java_lang_String.class});
    public T$$C22T$$C16java_lang_String(T$$C16java_lang_String datum) {
        T.$2($0, this, datum);
    }
    public T$$C28T$$C22T$$C16java_lang_String box() {
        return (T$$C28T$$C22T$$C16java_lang_String)T.$3($1, this);
    }
    public String toString() {
        return T.$4($0, this);
    }
    public static void main(String[] args) {
        T.$5();
    }
}

class T$$C1T {
    private T datum;
    private static final Field $0 = Reflection.getDeclaredField(T$$C1T.class, "datum");
    // private static final Constructor $1 = Reflection.getDeclaredConstructor(T$$C6T$$C1T.class, new Class[] {T$$C1T.class});
    public T$$C1T(T datum) {
        T.$2($0, this, datum);
    }
    // public T$$C6T$$C1T box() {
    //     return (T$$C6T$$C1T)T.$3($1, this);
    // }
    public String toString() {
        return T.$4($0, this);
    }
    public static void main(String[] args) {
        T.$5();
    }
}

class T$$C16java_lang_String {
    private String datum;
    private static final Field $0 = Reflection.getDeclaredField(T$$C16java_lang_String.class, "datum");
    private static final Constructor $1 = Reflection.getDeclaredConstructor(T$$C22T$$C16java_lang_String.class, new Class[] {T$$C16java_lang_String.class});
    public T$$C16java_lang_String(String datum) {
        T.$2($0, this, datum);
    }
    public T$$C22T$$C16java_lang_String box() {
        return (T$$C22T$$C16java_lang_String)T.$3($1, this);
    }
    public String toString() {
        return T.$4($0, this);
    }
    public static void main(String[] args) {
        T.$5();
    }
}

The methods from the Reflection class used in these answers that were not given in my initial description are:
public static final Object newInstance(Constructor aConstructor, Object[] anArgsArray) throws Exception {
    try {
        return aConstructor.newInstance(anArgsArray);
    } catch (InvocationTargetException ex) {
        Throwable cause = ex.getCause();
        if (cause instanceof Exception) {
            throw (Exception)cause;
        }
        throw new Error(cause);
    } catch (Exception ex) {
        throw new Error(ex);
    }
}

public static final Constructor getDeclaredConstructor(Class aClass, Class[] aParameterTypesArray) {
    try {
        Constructor constructor = aClass.getDeclaredConstructor(aParameterTypesArray);
        constructor.setAccessible(true);
        return constructor;
    } catch (Exception ex) {
        throw new Error(ex);
    }
}
-
A problem with signals under JNI
Hello,
Our system, written in C, runs on Solaris and starts a Java VM through JNI. The JIT compiler of the Java VM (libsunwjit.so) has crashed and produced a core file.
This has now happened twice. From the contents of the first core file, the cause was that a signal raised by one of the other threads collided with a signal that the JavaVM uses internally.
However, when we investigated the core file from the second occurrence, there was no trace of another thread having raised a signal.
Is there any known case where a signal raised inside the JIT compiler collides with a signal that the JavaVM is using? Has any problem been reported in the past that causes the JIT compiler itself to raise such a signal?
Please reply to the above questions so that we can determine the cause.
As supporting data, the relevant contents of the core file are shown below.
(gdb) thread 1
[Switching to thread 1 (LWP 163 )]
#0 0xfe359d88 in __sigprocmask () from /usr/lib/libthread.so.1
#1 0xfe34eb34 in __sigredirect () from /usr/lib/libthread.so.1
#2 0xfe351a10 in thrpkill_unlocked () from /usr/lib/libthread.so.1
#3 0xfe239470 in abort () from /usr/lib/libc.so.1
#4 0xfb49353c in panicHandler ()
from /opt/fujitsu/jasmine/JasEMedia/jre/lib/sparc/classic/libjvm.so
#5 0xfb798084 in intrDispatchMD ()
from /opt/fujitsu/jasmine/JasEMedia/jre/lib/sparc/native_threads/libhpi.so
#6 0xfe359348 in __libthread_segvhdlr () from /usr/lib/libthread.so.1
#7 0xfe35bdf0 in __sighndlr () from /usr/lib/libthread.so.1
#8 0xfe3586f8 in ?? () from /usr/lib/libthread.so.1
#9 0xfb3bd8bc in JITPerformDynamicPatch ()
from /opt/fujitsu/jasmine/JasEMedia/jre/lib/sparc/libsunwjit.so
#10 0xfb3de848 in JITResolveConstPoolEntry ()
from /opt/fujitsu/jasmine/JasEMedia/jre/lib/sparc/libsunwjit.so
#11 0xb945908 in ?? ()
#12 0xc5054e4 in ?? ()
#13 0xe2974c in ?? ()
#14 0x851b75c in ?? ()
#15 0x7c3841c in ?? ()
#16 0xb917560 in ?? ()
#17 0x8a9b6d4 in ?? ()
#18 0xb8e18e0 in ?? ()
#19 0x71377d8 in ?? ()
#20 0x7137784 in ?? ()
#21 0x7d7d7f4 in ?? ()
#22 0xc67c4b0 in ?? ()
#23 0x913998c in ?? ()
#24 0xb2bf58 in ?? ()
#25 0xc51a854 in ?? ()
#26 0x8794c7c in ?? ()
#27 0xfb3de980 in JIT_INVOKER_MARKER ()
from /opt/fujitsu/jasmine/JasEMedia/jre/lib/sparc/libsunwjit.so
#28 0xfb494cec in notJavaInvocation ()
from /opt/fujitsu/jasmine/JasEMedia/jre/lib/sparc/classic/libjvm.so
#29 0xfb457674 in jni_Invoke ()
from /opt/fujitsu/jasmine/JasEMedia/jre/lib/sparc/classic/libjvm.so
#30 0xfb45930c in jni_CallVoidMethod ()
from /opt/fujitsu/jasmine/JasEMedia/jre/lib/sparc/classic/libjvm.so
#31 0xfbee0d7c in ChartJem_Generate ()
from /opt/fujitsu/jasmine/EMImageChart/lib/GKM0JEMC.so
#32 0xfb7630a4 in ?? ()
from /opt/fujitsu/jasmine/data/default/methods/JasEMedia/lib043C.so.2.0
#33 0xff2b106c in odb_OmsPcExec () from /opt/fujitsu/jasmine/lib/libjas.so
Please contact me if the core file is required; I will send it separately. In that case, please also let me know how you would like it transferred.
The system configuration is as follows:
Operating System: Solaris 7 (R00081)
Java version: 1.2.2 Classic VM (build JDK-1.2.2-W, green threads, sunwjit)
I'm hoping someone can help me out.
Thanks!

The problem has still not been solved since then. If anyone can suggest an approach, please let me know.
-
PDA: Calling library functions - seems to link the stubbed .cpp file instead of the DLL
I'm having trouble developing a LabVIEW PDA VI that calls a DLL built using Visual C++. The DLL functions correctly when called from a non-PDA VI. My issues seem to be with porting to the PDA.
My configuration:
- LabVIEW 8.5 with the PDA 8.5 module
- Visual Studio 8.0 with the Windows Mobile 6.0 SDK
- ASUS 626 PDA with an Intel PXA70 processor running Windows Mobile 6 Classic
Following the PPCBatt example code provided with the PDA module, I have:
- used extern "C" to prevent name mangling
- placed the DLL built with the Windows Mobile SDK in the \Windows directory on the PDA
- created a stub Win32 DLL and lib
- created a stubbed cpp file whose functions only return zero
- included the stubbed cpp and lib files in the build spec / source files / additional files
- placed Call Library Function nodes on my PDA VI, selected the function names, set the parameter types
- built and deployed the executable, both with and without debug
When I set the library path property of the Call Library Function node, the functions appeared in the function-name pulldown, but the parameters did not populate; I had to add them manually and set their types. The help page says they should populate automatically when the function is selected.
I've debugged the VI, and the Call Library Function nodes are being called. It seems the build is linking the code from the stub .cpp file supplied in the additional files portion of the source files property page, instead of adding hooks to call the DLL on the PDA. As a test, I changed an output parameter in one of the functions in the stubbed .cpp file, and the changed value showed up in the front-panel indicator.
What am I doing wrong?
Dan

Hi Dan,
I'm not sure I understood your problem fully. When calling external code with LabVIEW PDA, the DLL you supply acts as a stub with the correct function prototypes for the C code that you want to call. Here's a Knowledge Base article that might help explain calling external code in LabVIEW PDA.
Regards,
Stanley Hu
National Instruments
Applications Engineering
http://www.ni.com/support -
Why don't I see the option Generate under menu item File in Photoshop CC?
Hi,
Why don't I see the option >Generate under menu item >File in Photoshop CC?
I want to be able to generate an Edge Reflow project.
/Thors

Please confirm that you are updated to the latest version of Photoshop CC. On Mac, go to Photoshop > About Photoshop; on Windows, go to Help > About Photoshop.
We recently released a new update. Here is the link for more info: http://forums.adobe.com/message/5674936#5674936
Regards
Pragya