Error while running legal consolidation package RUNCONSOL
Hi All
I get the below error when I run the consolidation package. Can anyone throw light on the same?
Start time --->5:17:46 PM - Date:1/6/2009 (build code:254)
User:BPCDEV\bpcadmin
Appset:FTIL
App:LEGALAPP
Logic mode:1
Logic by:
Scope by:
Data File:
Debug File:D:\BPC\Data\WebFolders\FTIL\LEGALAPP
..\AdminApp\LEGALAPP\LegalConsolidation.txt
Logic File:D:\BPC\Data\WebFolders\FTIL\LEGALAPP
..\AdminApp\LEGALAPP\LegalConsolidation.lgx
Selection:D:\BPC\Data\WebFolders\FTIL\LEGALAPP\PrivatePublications\bpcadmin\TempFiles\FROM_81_.TMP
Run mode:1
Query size:0
Delim:,
Query type:0
Simulation:0
Calc diff.:0
Formula script:
Max Members:
Test mode:0
Is Modelling:1
Number of logic calls:1
Call no. 1, logic:D:\BPC\Data\WebFolders\FTIL\LEGALAPP
..\AdminApp\LEGALAPP\LegalConsolidation.lgx
Building sub-query 1
Query Type:0
Max members:
Executing SPRUNCONSO [LEGALAPP], [ACTUAL], [C_FTILG], [SPSCOPE_200127],[SPLOG_282986]
SPRunConso Version 2.06
ERROR CSD-015 No instruction retreived from the ELIM Table with FLOW Dimension
Time to run stored procedure:1.6 sec.
call 1 completed and data posted in 1.6 sec.
Run completed in 1.8 sec.
End time --->5:17:48 PM - Date:1/6/2009
SPRunConso Version 2.06
ERROR CSD-015 No instruction retreived from the ELIM Table with FLOW Dimension
Thanks and Regards
Harish B K
Hi, this was due to not maintaining a proper design. It was solved later, once we set up all the parameters correctly and created time dimension members for one year prior (though that seems strange).
Similar Messages
-
Error in running Legal Consolidation Package
Hi All,
On running the legal consolidation package I am constantly getting the same error, which is:
'RUNLOGIC:Dimension Set: "ENTITYDATAVALUE" not assigned in Data manager'
I have checked all the dimensions and they all seem to be proper. I tried altering the data manager input to include the ENTITY dimension, but then the package did not run at all.
Please help me out. If anyone has encountered the same error and resolved it, please share how.
Regards
Navin
Hi,
I think you are using %ENTITY_SET% in your script, but you are not prompting for the entity member in the DM package. You need to add the entity dimension to the PROMPT statement in the advanced script of the DM package.
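For illustration only, a prompt along these lines (syntax recalled from BPC 5.x Data Manager advanced scripts; treat the exact parameters as an assumption and check them against your own package) makes the package ask for entity members so that %ENTITY_SET% gets filled:

```
PROMPT(SELECTINPUT,,,"Select the Entity members to run logic for","%ENTITY_DIM%")
```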
Hope this helps. -
Error while running data manager package
Hi All,
When I run the data manager package for currency conversion, I get the following error:
"An exception with the type CX_SY_CREATE_DATA_ERROR occurred, but was neither handled locally, nor declared in a RAISING clause
The data object could not be created: The type /B28/MHED7W9U does not exist."
Please suggest where I am making a mistake.
Thanks
Below are our BPC versions:
BPC on Server Manager: 5.0.486
Data Manager from eData: 5.0.484
BPC from eTool: 5.0.486
Below is the full error message, as per your suggestion to run the Export package. (Even with the service account which we used to install the software, we get the same error message.)
TOTAL STEPS 2
1. Dump Data: Failed in 0 sec.
[Selection]
FILE=\ApShell_SK\FINANCE\DataManager\DataFiles\SKTEST.TXT
TRANSFORMATION=\ApShell_SK\FINANCE\DataManager\TransformationFiles\System Files\Export.xls
MEASURENAME=PERIODIC
(Member Selection)
Category: ACTUAL
Time: 2006.JAN
Entity:
Account:
DataSrc:
IntCo:
RptCurrency:
[Messages]
An error occurred while executing a package.
Package Error Events:
ErrorCode = -1073668060
Source = Dump Data
SubComponent=
Description = The task "Dump Data" cannot run on this edition of Integration Services. It requires a higher level edition.
IDOfInterfaceWithError= {8BDFE889-E9D8-4D23-9739-DA807BCDC2AC} -
Error while running Cur Trans Package
Hi Experts,
I am getting an error message when running the FX Restatement package.
SPRunConversion Version 2.08
Warning : No Rate found in the opening period
ERROR FX-280 Timeid=20080100 - Nothing Extract from Fact Tables "
I looked in the fact tables, and the rate exists.
Do you have any idea why the error is coming ?
thanks in advance
Samvir
Samvir,
The TblFactRate table should have all the rate types for that time period.
For example, AVG is the average rate and END is the end rate.
So if some rate type is missing, that error message will be returned.
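As a sketch of that check outside BPC itself — the idea only, not the real schema (the row layout below is an illustrative assumption, not the actual tblFactRate columns): given the rate rows for a period, verify that every required rate type is present before running the conversion.

```python
# Each row is a (timeid, rate_type) pair, e.g. pulled from tblFactRate.
REQUIRED_RATE_TYPES = {"AVG", "END"}  # average rate and end-of-period rate

def missing_rate_types(rows, timeid):
    """Return the required rate types that are absent for the given period."""
    present = {rtype for tid, rtype in rows if tid == timeid}
    return REQUIRED_RATE_TYPES - present

rates = [("20080100", "AVG")]  # END is missing for this period
print(missing_rate_types(rates, "20080100"))  # -> {'END'}
```

If the result is non-empty, the conversion fails for that period much the way FX-280 reports above.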
I hope it will help you.
James Lim -
Error while running the SSIS package from SQL DB to excel file - export option
hi all,
I have 4.6 million records in my SQL DB and I want to copy them into an Excel file. I went to the DB, right-clicked, chose the Export command, and started the SSIS package. After a few minutes it throws the error "error in transferring data into excel file."
Can anyone tell me why this happened, and the resolution?
Help is appreciated!
Copying to `excel1_Wbook` (Error)
Messages
Error 0xc0202009: Data Flow Task 1: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x8007000E.
(SQL Server Import and Export Wizard)
Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Destination - excel1_Wbook" (217) failed with error code 0xC0202009 while processing input "Destination Input" (228).
The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information
about the failure.
(SQL Server Import and Export Wizard)
Error 0xc02020c4: Data Flow Task 1: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
(SQL Server Import and Export Wizard)
Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Source - excel1_Wbook returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The
meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
You need to split your data and create (at least 5) worksheet targets.
For example if you have a ROW_NUMBER column you can use for instance a
Conditional Split for something like:
ROW_NUMBER % 5 == 0 for Case 1 (excel 1)
ROW_NUMBER % 5 == 1 for Case 2 (excel 2)
ROW_NUMBER % 5 == 2 for Case 3 (excel 3)
ROW_NUMBER % 5 == 3 for Case 4 (excel 4)
ROW_NUMBER % 5 == 4 for Case 5 (excel 5) -
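The Conditional Split suggestion above is plain row partitioning; a minimal Python sketch of the same ROW_NUMBER % 5 idea, outside SSIS (an .xlsx worksheet holds at most 1,048,576 rows, so 4.6 million rows need at least five targets):

```python
def split_rows(rows, n_targets=5):
    """Partition rows into n_targets buckets by row number,
    like an SSIS Conditional Split on ROW_NUMBER % n_targets."""
    buckets = [[] for _ in range(n_targets)]
    for row_number, row in enumerate(rows):
        buckets[row_number % n_targets].append(row)
    return buckets

# Ten rows spread across five "worksheets": two rows each.
sheets = split_rows(list(range(10)))
print([len(s) for s in sheets])  # -> [2, 2, 2, 2, 2]
```

Each bucket then feeds one worksheet destination, keeping every sheet under the row limit.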
Error while running: SYS.DBMS_STATS Package.
When I try to execute the DBMS_STATS package inside a procedure, I get the following error:
ORA-12805: parallel query server died unexpectedly
ORA-06512: at "SYS.DBMS_STATS", line 9643
ORA-06512: at "SYS.DBMS_STATS", line 10137
ORA-06512: at "SYS.DBMS_STATS", line 10324
ORA-06512: at "SYS.DBMS_STATS", line 10378
ORA-06512: at "SYS.DBMS_STATS", line 10355
ORA-06512: at line 2
Can anyone help me out? I don't know much of the DBA stuff.
Hi,
Try setting the following parameters in init.ora:
dbhandles_cached = 0
hash_join_enabled=false
timed_statistics = false
If the problem still reproduces, try increasing the SORT_AREA_SIZE of your session to a larger value.
When an ORA-12805 occurs, in most cases it is accompanied by an ORA-600 or an ORA-7445, so please check your alert.log to see whether such errors are reported there.
Also verify whether the file system holding the files of the temporary tablespace (of type TEMPORARY) is full (no free space available).
Nicolas. -
Error while running OWB mapping package from sql prompt
hi all,
I have deployed my OWB mapping in Oracle. Now I want to execute it from the SQL prompt. The following error is raised when I try to run it:
SQL> DECLARE
2 RetVal NUMBER;
3 P_ENV WB_RT_MAPAUDIT.WB_RT_NAME_VALUES;
4 BEGIN
5 RetVal := TEST1.TEST_MAP.MAIN ( P_ENV );
6 END;
7 /
RetVal := TEST1.TEST_MAP.MAIN ( P_ENV );
ERROR at line 5:
ORA-06550: line 5, column 19:
PLS-00302: component 'TEST_MAP' must be declared
ORA-06550: line 5, column 3:
PL/SQL: Statement ignored
Please help me find the solution.
-Regards
Raj Kumar -
Error while running SSIS package from Integration service or via Job
Hi All,
I encounter the below error while running the SSIS package from a job or the Integration Services service; however, execution completes successfully when running from Data Tools. The issue occurred after migration of the 2005 dtsx packages to 2012. PFB the error.
SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by
the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
A buffer failed while allocating 10485760 bytes.
The system reports 26 percent memory load. There are 206110765056 bytes of physical memory with 150503776256 bytes free. There are 2147352576 bytes of virtual memory with 185106432 bytes free. The paging file has 208256339968 bytes with 145642921984 bytes free.
The package also runs successfully from other servers. This happens only on one of our servers; it seems like an issue with some SQL configuration.
Hi,
Are you running using a SQL Agent job and Data Tools on the same server or on different ones?
If it is executing fine using Data Tools and failing with the job, it might be a user credentials issue. Try to run the job with your credentials by using a proxy.
Regards,
Prathy
Prathy K -
Error while running a Java Program
Can anyone help me?
I am getting the following error while running a Java program. Below is the exception thrown; please help.
java.nio.BufferOverflowException
at java.nio.Buffer.nextPutIndex(Buffer.java:425)
at java.nio.DirectByteBuffer.putChar(DirectByteBuffer.java:463)
at org.jetel.data.StringDataField.serialize(StringDataField.java:295)
at org.jetel.data.DataRecord.serialize(DataRecord.java:283)
at org.jetel.graph.DirectEdge.writeRecord(DirectEdge.java:216)
at org.jetel.graph.Edge.writeRecord(Edge.java:288)
at com.tcs.re.component.RESummer1.run(RESummer1.java:505)
java.nio.BufferOverflowException
at java.nio.Buffer.nextPutIndex(Buffer.java:425)
at java.nio.DirectByteBuffer.putChar(DirectByteBuffer.java:463)
at org.jetel.data.StringDataField.serialize(StringDataField.java:295)
at org.jetel.data.DataRecord.serialize(DataRecord.java:283)
at org.jetel.graph.DirectEdge.writeRecord(DirectEdge.java:216)
at org.jetel.graph.Edge.writeRecord(Edge.java:288)
at com.tcs.re.component.RECollectCont.run(RECollectCont.java:304)
Ok, let's see. Write the following class:
public class Grunt {
public static void main(String[] args) {
System.out.println("Hello Mars");
}
}
Save it as "C:\Grunt.java", compile by typing:
javac c:\Grunt.java
Run by typing:
java -classpath "C:\" Grunt
Does it say "Hello Mars"? If yes, go back to your program and compare for differences (maybe you used the "package" statement?).
Regards -
Error while running the rdf using concurrent program
Hi All,
I am facing the following error when I try to run the concurrent program:
REP-0736: There exist uncompiled program unit(s).
REP-1247: Report contains uncompiled PL/SQL.
I am using a package in the program units. When I call it in the database it works fine, but when I use it in the RDF it throws the above error.
Also, the RDF compiles successfully.
Please let me know what should be done now.
Thanks
Jana
close report builder,
re-open
re-connect
open pgm unit procedure
compile each pgm unit individually
then ctrl+shift+k
ctrl+t
try now
Check this link... maybe while running, the user doesn't have access to the tables used in the packages:
REP-1247 ERROR WHILE RUNNING REPORT
Edited by: Dora on Dec 15, 2009 10:00 AM -
Error while running ejbc - Deployment time
Whilst trying to deploy my WSE through the J2EE deploy tool I get the following error:
Deployment failed on target Server localhost:4848 : Deployment Error -- Error while running ejbc -- Fatal Error from EJB Compiler -- jaxrpc compilation exception
I am not using EJB as far as I'm aware. I have successfully built my WSE using asant with no compilation errors.
Here is copy of the code
package rcews;
import java.rmi.RemoteException;
import java.util.Vector;
import java.sql.*;
import javax.sql.DataSource;
//import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.xml.rpc.server.ServletEndpointContext;
import javax.xml.rpc.server.ServiceLifecycle;
import javax.xml.rpc.ServiceException;
public class RceWSImplementation implements RceWSInterface, ServiceLifecycle {
private DataSource ds;
ServletEndpointContext endPtCntxt;
public void init(Object context) throws ServiceException {
try {
endPtCntxt = (ServletEndpointContext) context;
InitialContext ic = new InitialContext();
// Specify the logical name of the database
// Obtain the DataSource object associated with the logical name
ds = (DataSource) ic.lookup("java:comp/env/jdbc/rcedstar");
ic.close();
} catch(NamingException ne) {
ne.printStackTrace();
throw new ServiceException("\nCannot initialise JNDI ENC\n", ne);
public String [] getSignals() throws RemoteException {
Statement sqlStmnt;
ResultSet rs;
String countQuery = "SELECT COUNT(DISTINCT signal_ref) FROM tblSignals";
String query = "SELECT DISTINCT signal_ref FROM tblSignals ORDER BY signal_ref";
String[] signals = null;
int count = 0;
try {
// Establish connection with the database and return a Connection object
Connection con = ds.getConnection();
System.out.println("Connection made Successfully");
// Create and execute SQL statement
sqlStmnt = con.createStatement();
rs = sqlStmnt.executeQuery(countQuery);
if(rs.next()) {
int size = rs.getInt(1); // JDBC result set columns are 1-based
signals = new String[size];
rs = sqlStmnt.executeQuery(query);
// Move through ResultSet and pull singal reference information
while(rs.next()) {
signals[count] = rs.getString("signal_ref");
count++;
con.close();
} catch (Exception e) {
e.printStackTrace();
return signals;
public Vector getLampDurations(String[] signal_refs, String[] lamps, int hours) throws RemoteException {
// Varaible Declarations
Vector details = null;
String signals, lamp_colours;
PreparedStatement psqlStmt;
ResultSet rs;
String query = "SELECT signal_ref, light, SUM(duration), SUM(on_count) FROM tblsignals, tblsignal_events " +
"WHERE tblSignals.id = tblSignal_Events.id AND tblSignals.signal_ref IN (?) AND tblSignals.light IN (?) " +
"GROUP BY signal_ref, light " +
"HAVING SUM(duration) > ?";
StringBuffer sbSignals = new StringBuffer(50);
StringBuffer sbLamps = new StringBuffer(31);
// Build Strings for use in Prepared SQL statements
for(int i=0;i<signal_refs.length-1;i++) {
sbSignals.append("'" + signal_refs[i] + "'");
sbLamps.append("'" + lamps[i] + "'");
if (i <= signal_refs.length - 2) {
sbSignals.append(",");
sbLamps.append(",");
signals = new String(sbSignals);
lamp_colours = new String(sbLamps);
try {
// Establish connection with the database and return a Connection object
Connection con = ds.getConnection();
System.out.println("Connection made Successfully");
// Apply strings to statement
psqlStmt = con.prepareStatement(query);
psqlStmt.setString(1, signals);
psqlStmt.setString(2, lamp_colours);
psqlStmt.setInt(3, hours);
// Obtain data from DB
details = new Vector(4);
String[] results = new String[4];
rs = psqlStmt.executeQuery();
while(rs.next()) {
// Copy contents of the ResultSet into a String Array for each record
int i = 0;
results[i++] = rs.getString(1); //Signal Reference
results[i++] = rs.getString(2); //Lamp (i.e colour)
double seconds = rs.getDouble(3); //Duration
seconds = seconds / (60*60); //Convert Seconds to Hours - would be nice to convert to Hours:mins:secs
results[i++] = String.valueOf(seconds);
results[i++] = String.valueOf(rs.getInt(4)); //Number of Times lamp has been switched on
// Append the string Array to a Vector
details.addElement(results);
if (psqlStmt != null) { psqlStmt.close(); }
if (con != null) { con.close(); }
} catch (Exception e) {
e.printStackTrace();
return details;
public void destroy() {
ds = null;
}
Have since found this from the server log, but I have no idea how to solve it!
[#|2004-03-10T15:08:04.574+0000|INFO|j2ee-appserver1.4|javax.enterprise.system.tools.deployment|_ThreadID=13;|DPL5109: EJBC - START of EJBC for [RCE-WebService]|#]
Remote message: Processing beans ....
C:\Sun\AppServer\domains\domain1\applications\j2ee-modules\RCE-WebService\WEB-INF\classes\rcews\RceWSInterface_Tie.java:32: '(' or '[' expected
super(new rcews.RCE-WebService_SerializerRegistry().getRegistry());
^
C:\Sun\AppServer\domains\domain1\applications\j2ee-modules\RCE-WebService\WEB-INF\classes\rcews\RceWSInterface_Tie.java:32: ')' expected
super(new rcews.RCE-WebService_SerializerRegistry().getRegistry());
^
C:\Sun\AppServer\domains\domain1\applications\j2ee-modules\RCE-WebService\WEB-INF\classes\rcews\RCE-WebService_SerializerRegistry.java:18: '{' expected
public class RCE-WebService_SerializerRegistry implements SerializerConstants {
^
C:\Sun\AppServer\domains\domain1\applications\j2ee-modules\RCE-WebService\WEB-INF\classes\rcews\RCE-WebService_SerializerRegistry.java:33: <identifier> expected
registerSerializer(mapping,rcews.RceWSInterface_getLampDurations_RequestStruct.class, type, serializer);
^
C:\Sun\AppServer\domains\domain1\applications\j2ee-modules\RCE-WebService\WEB-INF\classes\rcews\RCE-WebService_SerializerRegistry.java:33: '{' expected
registerSerializer(mapping,rcews.RceWSInterface_getLampDurations_RequestStruct.class, type, serializer);
^
C:\Sun\AppServer\domains\domain1\applications\j2ee-modules\RCE-WebService\WEB-INF\classes\rcews\RCE-WebService_SerializerRegistry.java:65: illegal start of type
return registry;
^
C:\Sun\AppServer\domains\domain1\applications\j2ee-modules\RCE-WebService\WEB-INF\classes\rcews\RCE-WebService_SerializerRegistry.java:65: <identifier> expected
return registry;
^
C:\Sun\AppServer\domains\domain1\applications\j2ee-modules\RCE-WebService\WEB-INF\classes\rcews\RCE-WebService_SerializerRegistry.java:68: 'class' or 'interface' expected
private static void registerSerializer(TypeMapping mapping, Class javaType, QName xmlType,
^
C:\Sun\AppServer\domains\domain1\applications\j2ee-modules\RCE-WebService\WEB-INF\classes\rcews\RCE-WebService_SerializerRegistry.java:74: 'class' or 'interface' expected
^
C:\Sun\AppServer\domains\domain1\applications\j2ee-modules\RCE-WebService\WEB-INF\classes\rcews\RCE-WebService_SerializerRegistry.java:75: 'class' or 'interface' expected
^
10 errors -
Error while running adlnkoh.sh
I am performing cloning of Oracle Apps R12 version 12.0.4
Using cold backup single node to single node.
I got an error, and when I checked the logfile I found this:
Error while running adlnkoh.sh.
return code = .1.7.12.14.15.16.17.18.19.21.22.23.24.25.26.27.30.32.33.34.42.43.44.45.46.47.48.49.51.52.53.54.55.56.57.58.59.60.62
Please check logfile located at /oracle/cloning/db/tech_st/10.2.0/appsutil/log/TEST1_zishan/make_12091249.log
Then I looked at the above logfile and found this error:
running genclntsh...
/oracle/cloning/db/tech_st/10.2.0/lib/libcore10.a: could not read symbols: File format not recognized
collect2: ld returned 1 exit status
genclntsh: Failed to link libclntsh.so.10.1
Failed to generate client shared library on Thu Dec 9 12:49:07 IST 2010...
rm -f oracle dbv tstshm maxmem orapwd dbfsize cursize genoci extproc extproc32 hsalloci hsots hsdepxa dgmgrl dumpsga mapsga osh sbttest expdp impdp imp exp sqlldr rman nid extjob extjobo genezi ikfod grdcscan /oracle/cloning/db/tech_st/10.2.0/rdbms/lib/ksms.s /oracle/cloning/db/tech_st/10.2.0/rdbms/lib/ksms.o
Completed removing old executables on Thu Dec 9 12:49:07 IST 2010...
/usr/bin/gcc -O2 -I/oracle/cloning/db/tech_st/10.2.0/rdbms/demo -I/oracle/cloning/db/tech_st/10.2.0/rdbms/public -I/oracle/cloning/db/tech_st/10.2.0/plsql/public -I/oracle/cloning/db/tech_st/10.2.0/network/public -DLINUX -D_GNU_SOURCE -D_LARGEFILE64_SOURCE=1 -D_LARGEFILE_SOURCE=1 -DSLTS_ENABLE -DSLMXMX_ENABLE -D_REENTRANT -DNS_THREADS -c -o config.o config.c
Completed relinking target config.o ...
Completed linking target links on Thu Dec 9 12:49:09 IST 2010...
/oracle/cloning/db/tech_st/10.2.0/precomp/lib
Linking /oracle/cloning/db/tech_st/10.2.0/precomp/lib/proc
/usr/bin/ld: cannot find -lclntsh
collect2: ld returned 1 exit status
/bin/chmod: cannot access `/oracle/cloning/db/tech_st/10.2.0/precomp/lib/proc': No such file or directory
make: *** [oracle/cloning/db/tech_st/10.2.0/precomp/lib/proc] Error 1
Failed linking target relink on Thu Dec 9 12:49:10 IST 2010...
/oracle/cloning/db/tech_st/10.2.0/rdbms/lib
/usr/bin/ar cr /oracle/cloning/db/tech_st/10.2.0/rdbms/lib/libknlopt.a /oracle/cloning/db/tech_st/10.2.0/rdbms/lib/kkpoban.o
/oracle/cloning/db/tech_st/10.2.0/lib//libcore10.a: could not read symbols: No more archived files
collect2: ld returned 1 exit status
make: *** [oracle/cloning/db/tech_st/10.2.0/rdbms/lib/oracle] Error 1
Failed linking target oracle on Thu Dec 9 12:49:12 IST 2010...
/oracle/cloning/db/tech_st/10.2.0/network/lib
genclntsh: Failed to link libclntsh.so.10.1
make: *** [client_sharedlib] Error 1
Failed linking targets nnfgt.o mkldflags client_sharedlib on Thu Dec 9 12:49:12 IST 2010...
/oracle/cloning/db/tech_st/10.2.0/sqlplus/lib
make: *** [sqlplus] Error 1
Failed linking target sqlplus on Thu Dec 9 12:49:12 IST 2010...
/oracle/cloning/db/tech_st/10.2.0/rdbms/lib
/usr/bin/ld: cannot find -lclntsh
collect2: ld returned 1 exit status
make: *** [trcroute] Error 1
Failed linking target install ((ins_net_client.mk) on Thu Dec 9 12:49:12 IST 2010...
/oracle/cloning/db/tech_st/10.2.0/plsql/lib
chmod 755 /oracle/cloning/db/tech_st/10.2.0/bin
make: *** [ctxload] Error 1
Failed linking target install on Thu Dec 9 12:49:13 IST 2010...
/oracle/cloning/db/tech_st/10.2.0/sysman/lib
make -f /oracle/cloning/db/tech_st/10.2.0/sysman/lib/ins_sysman.mk relink_sharedobj SHAREDOBJ=libnmemso
Error found while relinking
return code = .1.7.12.14.15.16.17.18.19.21.22.23.24.25.26.27.30.32.33.34.42.43.44.45.46.47.48.49.51.52.53.54.55.56.57.58.59.60.62
No, it is not a packages problem; I have checked all the packages. It is showing some relinking error.
Please do help me out. I am sending you the log record of the make file:
/oracle/cloning/db/tech_st/10.2.0/appsutil/log/TEST1_zishan/make_12091509.log
running genclntsh...
/oracle/cloning/db/tech_st/10.2.0/lib/libcore10.a: could not read symbols: File format not recognized
collect2: ld returned 1 exit status
genclntsh: Failed to link libclntsh.so.10.1
Failed to generate client shared library on Thu Dec 9 15:09:21 IST 2010...
rm -f oracle dbv tstshm maxmem orapwd dbfsize cursize genoci extproc extproc32 hsalloci hsots hsdepxa dgmgrl dumpsga mapsga osh sbttest expdp impdp imp exp sqlldr rman nid extjob extjobo genezi ikfod grdcscan /oracle/cloning/db/tech_st/10.2.0/rdbms/lib/ksms.s /oracle/cloning/db/tech_st/10.2.0/rdbms/lib/ksms.o
Completed removing old executables on Thu Dec 9 15:09:21 IST 2010...
/usr/bin/gcc -O2 -I/oracle/cloning/db/tech_st/10.2.0/rdbms/demo -I/oracle/cloning/db/tech_st/10.2.0/rdbms/public -I/oracle/cloning/db/tech_st/10.2.0/plsql/public -I/oracle/cloning/db/tech_st/10.2.0/network/public -DLINUX -D_GNU_SOURCE -D_LARGEFILE64_SOURCE=1 -D_LARGEFILE_SOURCE=1 -DSLTS_ENABLE -DSLMXMX_ENABLE -D_REENTRANT -DNS_THREADS -c -o config.o config.c
Completed relinking target config.o ...
make[1]: *** [oracle/cloning/db/tech_st/10.2.0/sysman/lib/nmccollector] Error 1
make[1]: Leaving directory `/oracle/cloning/db/tech_st/10.2.0/sysman/lib'
make: *** [nmccollector] Error 2
Failed linking target collector on Thu Dec 9 15:09:34 IST 2010...
The value of IS_RAC:false
Error found while relinking
return code = .1.7.12.14.15.16.17.18.19.21.22.23.24.25.26.27.30.32.33.34.42.43.44.45.46.47.48.49.51.52.53.54.55.56.57.58.59.60.62 -
Getting error while running dtp
Hi
I am getting this error while running a DTP. I am using a BI 7.0 write-optimized DSO:
Duplicate data record detected (DS GMND0002 , data package: 000001 , data record: 4 )
Please let me know your solutions for this problem.
Hi,
You are bound to get this error, as write-optimized DSOs behave in this pattern.
Open the write-optimized DSO; in the settings you can see a check box for 'Do not check uniqueness of data'.
Check it if you want to load the duplicate records. There will then be two (duplicate) records, uniquely identified by the technical key.
Now, if you want to use the delta concept with this, then in the DTP use delta by request.
If you want to go further with delta, refer to this thread:
[write optimized dso (URGENT)]
Regards,
Priti -
Getting error while running jar file
Hi,
I am getting the below error while running the jar file:
"Exception in thread "main" java.lang.NoClassDefFoundError: oracle/jdeveloper/layout/XYLayout:"
I have created a simple ADF desktop app in which I am displaying a frame. I created the jar from JDeveloper.
Please help.
How do you run the jar file? Take a look at this thread for a possible solution: Re: deploying a jdeveloper adf application to a desktop
Is the class XYLayout packaged or referenced? Take a look at this thread: XYlayout classes - can I distribute with my code? -
Security Violation Error while running schedule task from OIM.
Hi All,
I am getting this error while running a custom Java scheduled task from OIM:
*Thor.API.Exceptions.tcAPIException [EJB:010160] Security Violation: User '<anonymous>' has insufficient permission to access EJB:*
type=<ejb>,application=Xellerate,module=xlDataObjectBeans.jar,ejb=tcReconciliationoperations,method=createDeleteReconciliationEvent
at Thor.API.Operations.tcReconciliationOperationsClient.createDeleteReconciliationEvent(UnKnown Source).
I get this error as soon as my code starts creating the Delete Reconciliation Event.
Note: I have already protected the JNDI Namespace.
Please provide some pointers.
Regards,
Sunny
Hi Rajiv,
Check this:
package com.centrica.iam.scheduletask;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileFilter;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.Date;
import java.util.HashMap;
import java.util.Hashtable;
import java.util.Iterator;
import java.util.Set;
import oracle.iam.connectors.common.ConnectorLogger;
import com.thortech.xl.dataaccess.tcDataSet;
import com.thortech.xl.dataaccess.tcDataSetException;
import com.thortech.xl.dataobj.PreparedStatementUtil;
import com.thortech.xl.orb.dataaccess.tcDataAccessException;
import com.thortech.xl.scheduler.tasks.SchedulerBaseTask;
import Thor.API.tcResultSet;
import Thor.API.Exceptions.tcAPIException;
import Thor.API.Exceptions.tcInvalidValueException;
import Thor.API.Operations.tcLookupOperationsIntf;
import Thor.API.Operations.tcReconciliationOperationsIntf;
import Thor.API.Operations.tcSchedulerOperationsIntf;
public class CustomFlatFile extends SchedulerBaseTask {
private static tcSchedulerOperationsIntf schedulerIntf;
private static tcLookupOperationsIntf lookupIntf;
private static tcReconciliationOperationsIntf reconIntf;
String sObjectName;
String LookupName;
String LookupName2;
String FileDirectory;
String FileName;
String File;
String delimeter;
String isDeleteTrue;
HashMap<String, String> attrMap = new HashMap();
HashMap<String, String> delMap = new HashMap();
HashMap<String, String> finalMap = new HashMap();
ArrayList list = new ArrayList();
public boolean isReconStopped;
public CustomFlatFile()
isReconStopped = false;
public void init()
LookupName = getAttribute("Attribute Lookup Name");
FileDirectory = getAttribute("Directory Path");
FileName = getAttribute("File Name");
delimeter = getAttribute("Delimeter");
sObjectName = getAttribute("Resource Object Name");
isDeleteTrue = getAttribute("Is Delete Allowed");
public void execute(){
try {
System.out.println("Start Exceute");
//Initiate lookupIntf
lookupIntf = (tcLookupOperationsIntf)getUtility("Thor.API.Operations.tcLookupOperationsIntf");
reconIntf=(tcReconciliationOperationsIntf)getUtility("Thor.API.Operations.tcReconciliationOperationsIntf");
catch (tcAPIException tcapiexception){
tcapiexception.printStackTrace();
//logger.error(classname, s, tcapiexception.toString());
//logger.setStackTrace(tcapiexception, classname, s, tcapiexception.getMessage());
catch (Exception excep){
excep.printStackTrace();
//logger.error(classname, s, excep.toString());
//logger.setStackTrace(excep, classname, s, excep.getMessage());
attrMap = readLookup(LookupName);
System.out.println(attrMap.toString());
readFile();
if (isDeleteTrue.equalsIgnoreCase("true"))
performDelete();
System.out.println("Finish Execute");
public void performDelete()
System.out.println("Start Perform delete");
int k = list.size();
System.out.println("list size " + list.size());
try
Thread.sleep(15000);
/* Hashtable ahashtable[] = new Hashtable[k];
Hashtable hashtable = new Hashtable();
for (int i=0;i<k;i++)
hashtable.put("User Id", list.get(i));
ahashtable[i] = hashtable;
System.out.println(list.get(i));
Set set = reconIntf.provideDeletionDetectionData(sObjectName, ahashtable);
System.out.println("Set--" + set.toString());
tcResultSet tcresultset = reconIntf.getMissingAccounts(sObjectName, set);
System.out.println("tcresultset - " + tcresultset.getRowCount());
if (!(tcresultset.isEmpty()))
long l[] = reconIntf.deleteDetectedAccounts(tcresultset);
for (int i1=0;i1<l.length;i1++)
System.out.println("delete recon key " + l[i1]);
//Get the existing list of Managed users
tcDataSet tcdataset = new tcDataSet();
tcDataSet tcdataset1 = new tcDataSet();
String query = "select orf.orf_fieldname,prf.prf_columnname, sdk.sdk_name from orf, sdk, pkg, tos, prf, obj " +
"where pkg.obj_key = obj.obj_key and pkg.pkg_key = tos.pkg_key and tos.sdk_key is not null " +
"and tos.sdk_key=sdk.sdk_key and tos.tos_key=prf.tos_key and prf.prf_iskey='1' and prf.orf_key=orf.orf_key " +
"and orf.orf_parent_orf_key is null and obj.obj_name='" + sObjectName + "'";
tcdataset.setQuery(getDataBase(), query);
tcdataset.executeQuery();
String FFName = tcdataset.getString("prf_columnname");
String FName = tcdataset.getString("sdk_name");
String ROFName = tcdataset.getString("orf_fieldname");
System.out.println("form- " + FName + " Field- " + FFName);
query = "select " + FFName + " from " + FName + " udtable, oiu a, ost b " +
"where udtable.orc_key=a.orc_key and a.ost_key=b.ost_key and b.ost_status!='Revoked'";
System.out.println(query);
tcdataset1.setQuery(getDataBase(), query);
tcdataset1.executeQuery();
int i = tcdataset1.getRowCount();
ArrayList list1 = new ArrayList();
String s1 = null;
System.out.println("N. of rows--" + i);
for (int j=0;j<i;j++)
tcdataset1.goToRow(j);
s1 = tcdataset1.getString(0);
System.out.println("s1---" + s1);
if (!(list.contains(s1)))
list1.add(s1);
System.out.println("under if--" + s1);
//Getting the existing list of unmanaged users
query = "select distinct (b.rcd_value) from rce a, rcd b, orf c, obj d where a.rce_key=b.rce_key and " +
"b.orf_key=c.orf_key and c.orf_fieldname='" + ROFName + "' and a.rce_status!='Event Linked' " +
"and a.obj_key = d.obj_key and d.obj_name='" + sObjectName + "'";
tcdataset1.setQuery(getDataBase(), query);
tcdataset1.executeQuery();
i = tcdataset1.getRowCount();
System.out.println("No. Of Unmanaged Users " + i);
for (int j=0;j<i;j++)
tcdataset1.goToRow(j);
s1 = tcdataset1.getString(0);
System.out.println("s1---" + s1);
if (!(list.contains(s1)))
list1.add(s1);
System.out.println("under if--" + s1);
int k1 = list1.size();
System.out.println("list1 size--" + k1);
for (int j1 = 0; j1 < k1; j1++) {
    delMap.clear();
    delMap.put(ROFName, (String) list1.get(j1));
    System.out.println(delMap.toString());
    long l = reconIntf.createDeleteReconciliationEvent(sObjectName, delMap);
    System.out.println("delete recon key--- " + l);
}
} catch (Exception exception) {
    exception.printStackTrace();
}
}
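The deletion pass above diffs the users found in the target tables (plus existing unmanaged recon values) against the users already seen in the feed, and raises a delete reconciliation event for each leftover. A minimal standalone sketch of that set-difference step, using plain collections instead of `tcDataSet` (the class and method names here are hypothetical, not part of the OIM API):

```java
import java.util.*;

public class DeleteCandidates {
    // Users present in the target system but absent from the feed become
    // delete-reconciliation candidates; duplicates are added only once,
    // mirroring the list/list1 logic in the scheduled task above.
    static List<String> deleteCandidates(Collection<String> usersInTarget,
                                         Collection<String> usersInFeed) {
        List<String> candidates = new ArrayList<>();
        for (String user : usersInTarget) {
            if (!usersInFeed.contains(user) && !candidates.contains(user)) {
                candidates.add(user);
            }
        }
        return candidates;
    }

    public static void main(String[] args) {
        List<String> target = Arrays.asList("u1", "u2", "u3", "u2");
        List<String> feed = Arrays.asList("u1", "u3");
        System.out.println(deleteCandidates(target, feed)); // prints [u2]
    }
}
```

Each candidate then gets one `createDeleteReconciliationEvent` call keyed by the recon field name, as in the loop above.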
public void readFile() {
    String s = "readFile()";
    //logger.setMethodStartLog(classname, s);
    HashMap map = new HashMap();
    try {
        File = getFile();
        BufferedReader reader = new BufferedReader(new FileReader(new File(File)));
        String line = "";
        int k = attrMap.size();
        String value[] = new String[k];
        String Header[] = new String[k];
        if (delimeter.equalsIgnoreCase("|"))
            delimeter = "\\" + delimeter;
        line = reader.readLine();
        Header = line.split(delimeter);
        while ((line = reader.readLine()) != null) {
            value = line.split(delimeter);
            k = value.length;
            for (int i = 0; i < k; i++) {
                // Map each header column to its value; the original called
                // attrMap.get(Header) without the [i] index, which always
                // looked up the whole array instead of the column name.
                finalMap.put(attrMap.get(Header[i]), value[i]);
            }
            System.out.println(finalMap.toString());
            System.out.println("Start Ignoring Event");
            if (!(reconIntf.ignoreEvent(sObjectName, finalMap))) {
                System.out.println("Not Ignored");
                long l1 = reconIntf.createReconciliationEvent(sObjectName, finalMap, true);
                System.out.println("Recon Key--" + l1);
            } else {
                System.out.println("ignore event ---" + finalMap.toString());
            }
            list.add(finalMap.get("User Id"));
            System.out.println(list.size() + "add--" + finalMap.get("User Id"));
            finalMap.clear();
        }
        reader.close();
    } catch (Exception exception) {
        exception.printStackTrace();
    }
}
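`readFile()` escapes only the `|` delimiter by hand before calling `String.split`, which takes a regular expression. `Pattern.quote` generalizes this to any delimiter. A self-contained sketch of the header-to-value mapping step (the class name and the sample columns are illustrative only):

```java
import java.util.*;
import java.util.regex.Pattern;

public class DelimitedLine {
    // String.split takes a regex, so literal delimiters such as "|" or "."
    // must be quoted; Pattern.quote covers every metacharacter, not just "|".
    static Map<String, String> parse(String header, String line, String delim) {
        String[] names = header.split(Pattern.quote(delim));
        String[] values = line.split(Pattern.quote(delim));
        Map<String, String> row = new LinkedHashMap<>();
        for (int i = 0; i < Math.min(names.length, values.length); i++) {
            row.put(names[i], values[i]);
        }
        return row;
    }

    public static void main(String[] args) {
        System.out.println(parse("User Id|First|Last", "hbk|Harish|B K", "|"));
        // prints {User Id=hbk, First=Harish, Last=B K}
    }
}
```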
public boolean stop() {
    String s = "stop()";
    //logger.setMethodStartLog(classname, s);
    //logger.info(classname, s, "Stopping Reconciliation........");
    isReconStopped = true;
    //logger.setMethodFinishLog(classname, s);
    return true;
}
FileFilter fileFilter = new FileFilter() {
    public boolean accept(File file) {
        String sFilePath = file.getName();
        return sFilePath.startsWith(FileName);
    }
};
public String getFile() throws FileNotFoundException, Exception {
    String s = "getFile()";
    //logger.setMethodStartLog(classname, s);
    String s1;
    File dir = new File(FileDirectory);
    File[] files = dir.listFiles(fileFilter);
    if (files.length == 0)
        throw new FileNotFoundException();
    if (files.length > 1)
        throw new Exception("Multiple Matches found for this file name");
    s1 = files[0].toString();
    //logger.setMethodFinishLog(classname, s);
    return s1;
}
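`getFile()` enforces an exactly-one-match contract on the feed directory: no match and multiple matches both abort the run. The same contract in a standalone, testable form (hypothetical class name; `java.nio` used only to stage a temporary directory for the demo):

```java
import java.io.*;
import java.nio.file.*;

public class PrefixMatch {
    // Exactly one file whose name starts with the prefix must exist in the
    // directory; otherwise fail loudly, as getFile() does above.
    static File match(File dir, String prefix) throws IOException {
        File[] files = dir.listFiles(f -> f.getName().startsWith(prefix));
        if (files == null || files.length == 0)
            throw new FileNotFoundException("No file matching " + prefix);
        if (files.length > 1)
            throw new IOException("Multiple matches found for " + prefix);
        return files[0];
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("recon-demo");
        Files.createFile(dir.resolve("feed_20090106.csv"));
        System.out.println(match(dir.toFile(), "feed_").getName());
        // prints feed_20090106.csv
    }
}
```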
public HashMap readLookup(String s1) {
    String s = "readLookup()";
    //logger.setMethodStartLog(classname, s);
    HashMap map = new HashMap();
    try {
        tcResultSet tc1 = lookupIntf.getLookupValues(s1);
        int i = tc1.getRowCount();
        for (int j = 0; j < i; j++) {
            tc1.goToRow(j);
            map.put(tc1.getStringValue("Lookup Definition.Lookup Code Information.Code Key"),
                    tc1.getStringValue("Lookup Definition.Lookup Code Information.Decode"));
        }
    } catch (tcAPIException tcapiexception) {
        tcapiexception.printStackTrace();
        //logger.error(classname, s, tcapiexception.toString());
        //logger.setStackTrace(tcapiexception, classname, s, tcapiexception.getMessage());
    } catch (Exception excep) {
        excep.printStackTrace();
        //logger.error(classname, s, excep.toString());
        //logger.setStackTrace(excep, classname, s, excep.getMessage());
    }
    return map;
}
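`readLookup()` returns a Code Key to Decode map; during reconciliation the usual pattern is to translate each raw target-system value through that map, falling back to the raw value when no lookup entry exists. A minimal sketch of that translation step (hypothetical class name; plain `HashMap` in place of the lookup API):

```java
import java.util.*;

public class LookupDecode {
    // Translate a raw code through the Code Key -> Decode map built by
    // readLookup(), keeping the raw value when no entry is defined.
    static String decode(Map<String, String> lookup, String raw) {
        return lookup.getOrDefault(raw, raw);
    }

    public static void main(String[] args) {
        Map<String, String> lk = new HashMap<>();
        lk.put("ACT", "Active");
        System.out.println(decode(lk, "ACT") + " / " + decode(lk, "XX"));
        // prints Active / XX
    }
}
```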
Maybe you are looking for
-
Is there a way to print a pdf, which is secured with password?
I want to know a way to print pdf which is secured with password to print without throwing a error ? Instead it has to ask for a password and print..
-
File-to-Mail attachment without manipulating the payload
Hello, I have a requirement to pick a file from a file server and send it to receiver as an email attachment as-is. The key is that the system should not read the contents of the file (payload). I'm unsure how to go about this! Any help in this regar
-
Shipping and A/R report comparision
Hi All, I have a scenario where I need to compare 2 customized reports ( shipping and A/R report) each day . I need to make sure that both the billing quantity and amount are compared on both these reports and if there is a discrepency on
-
Getting error in payroll data extraction
hi, i am extracting payroll data for south africa and used following includes. but i am getting errror " staement RP-IMP-C2-RW . is not defined ". plz help me in this , am i misng some includes. INCLUDE ole2incl. INCLUDE rpc2cd09. "Cluster CD data
-
Best personal finance app for mac?
I'm looking for the best finance app for my iMac. I'm most interested in tracking my accounts -- banking and investments. I'd like to have the software on my two iMacs and on 2 MacBooks without purchasing the product 4 times. Budgeting function is no