Allowing parallel processing of cube partitions using OWB mapping
Hi All,
I am using an OWB mapping to load a MOLAP cube partitioned on the TIME dimension. I configured the OWB mapping by checking the 'Allow parallel processing' option with the number of parallel jobs set to 2, and then deployed the mapping. The data loaded through the mapping is spread across multiple partitions.
The server has 4 CPU's and 6 GB RAM.
But when I kick off the mapping, I can see only one partition being processed at a time in the XML_LOAD_LOG.
If I process the same cube in AWM using parallel processing, I can see that multiple partitions are processed.
Could you please suggest whether I missed any setting on the OWB side?
Thanks
Chakri
Hi,
I have assigned the OLAP_DBA role to the user under which the OWB map is running, and the job started off.
But it soon failed with the error below:
***Error Occured in __XML_MAIN_LOADER: Failed to Build(Refresh) XPRO_OLAP_NON_AGG.OLAP_NON_AGG Analytic Workspace. In __XML_VAL_MEASMAPS: In __XML_VAL_MEASMAPS_VAR: Error Validating Measure Mappings. In __XML_FND_PRT_TO_LOAD: In __XML_SET_LOAD_STATUS: In ___XML_LOAD_TEMPPRG:
Here is the log :
Load ID Record ID AW Date Actual Time Message Time Message
3973 13 SYS.AWXML 12/1/2008 8:26 8:12:51 8:26:51 ***Error Occured in __XML_MAIN_LOADER: Failed to Build(Refresh) XPRO_OLAP_NON_AGG.OLAP_NON_AGG Analytic Workspace. In __XML_VAL_MEASMAPS: In __XML_VAL_MEASMAPS_VAR: Error Validating Measure Mappings. In __XML_FND_PRT_TO_LOAD: In __XML_SET_LOAD_STATUS: In ___XML_LOAD_TEMPPRG:
3973 12 XPRO_OLAP_NON_AGG.OLAP_NON_AGG 12/1/2008 8:19 8:12:57 8:19:57 Attached AW XPRO_OLAP_NON_AGG.OLAP_NON_AGG in RW Mode.
3973 11 SYS.AWXML 12/1/2008 8:19 8:12:56 8:19:56 Started Build(Refresh) of XPRO_OLAP_NON_AGG.OLAP_NON_AGG Analytic Workspace.
3973 1 XPRO_OLAP_NON_AGG.OLAP_NON_AGG 12/1/2008 8:19 8:12:55 8:19:55 Job# AWXML$_3973 to Build(Refresh) Analytic Workspace XPRO_OLAP_NON_AGG.OLAP_NON_AGG Submitted to the Queue.
I am using AWM (10.2.0.3 A with OLAP Patch A) and OWB (10.2.0.3).
Can anyone suggest why the job failed this time?
Regards
Chakri
Similar Messages
-
Query on processing a PDF file using Java mapping
Hi All,
I am trying to process XML and PDF files using Java mapping. It is successful for XML, but I am unable to get it to work for PDF.
Below is the code I am using... can anyone guide me on how to process PDFs?
// Note: casting read() to byte before the -1 check corrupts binary data
// (a 0xFF byte compares equal to -1), which is why XML works but PDF fails.
// Read into an int and close the stream only after the loop:
int b;
java.io.ByteArrayOutputStream bos = (java.io.ByteArrayOutputStream) outputstream;
while ((b = inputstream.read()) != -1) {
    bos.write(b);
}
bos.close();
Thank You,
Madhav
Hi Madhav,
I think instead of going with JAVA mapping you can write a custom adapter module for it.
Ref: /people/sap.user72/blog/2005/07/31/xi-read-data-from-pdf-file-in-sender-adapter
Also check : Re: PI 7.1 : Taking a input PDF file and mapping it to a hexBinary attribute
/people/shabarish.vijayakumar/blog/2009/05/17/trouble-writing-out-a-pdf-in-xipi
Thanks,
Edited by: Hareenkumar on Dec 21, 2010 11:12 AM -
How to achieve parallel processing in a single request?
Hi all,
I have a method in a Session EJB that performs some business logic before returning an answer to the client. The logic collects data from the application's database and from two external systems, then sends all of the data to a third external system to get a response, which is sent back to the client. Each external system is quite slow, so I would like to do all the data collection concurrently, with parallel processing. How should I handle this? I'm not allowed to create my own threads in EJBs. Can I use MDBs in some way? To the calling client this should be a synchronous call...
Grateful for any suggestions
Cheers
Anders =)
Usually, the request is received by a component located in the web container, for example via an HTTP request (including Web Services). Such a component is allowed to start threads for parallel processing. If, for some reason, the request arrives directly at the EJB layer and you cannot move its receiver to a web component, I don't think JMS is a viable solution: you would switch to asynchronous processing, and you have no way to make your EJB wait for the responses while preserving the client request (waiting implies programmatic life-cycle management, which is forbidden in the EJB container). Maybe a resource adapter (JCA) can bring a solution. A resource adapter acts like a datasource (a datasource is a specialization of a resource adapter), so it is the logical way to implement an adapter to an external, possibly non-J2EE, resource, as the name implies :) But I don't have enough knowledge of JCA to be sure of this.
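The web-tier thread-pool approach can be sketched in plain Java. This is a hypothetical illustration (the class, the fetch helper, and the timings are invented; real container code would typically use a container-managed pool such as a CommonJ WorkManager rather than Executors):

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelCollector {

    // Stand-in for one slow external system.
    static String fetch(String system, long delayMillis) throws InterruptedException {
        Thread.sleep(delayMillis);
        return "data-from-" + system;
    }

    // Collect from all sources in parallel; the call itself stays synchronous,
    // because invokeAll blocks until every submitted task has completed.
    public static List<String> collectAll() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(3);
        try {
            List<Callable<String>> tasks = List.of(
                () -> fetch("app-db", 50),
                () -> fetch("external-1", 50),
                () -> fetch("external-2", 50));
            List<Future<String>> results = pool.invokeAll(tasks);
            return List.of(results.get(0).get(),
                           results.get(1).get(),
                           results.get(2).get());
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        // Total elapsed time is roughly one fetch, not three, since they overlap.
        System.out.println(collectAll());
    }
}
```

Because invokeAll waits for every Future, the caller still experiences a single synchronous call even though the three fetches overlap in time.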
Hope it helps.
Bruno Collet
http://www.practicalsoftwarearchitect.com -
Tabular parallel processing?
hi all,
does anyone know if there have been any new enhancements to allow parallel processing of large fact tables? As of RTM and SP1, and even SSAS 2014, that was not something that could be technically entertained, AFAIK...
are there any plans for parallel processing in the near future, any guidance from Microsoft regarding this issue?
thx,
Cos
Hi Cos,
I haven't found any document/KB article/news regarding enhanced parallel processing for the Tabular model. As we know, multiple tables can be processed in parallel, whereas the processing of several partitions within the same table cannot be parallelized.
The list of new features in SQL Server 2014 Analysis Services doesn't mention this capability; please see:
What's New in Analysis Services and Business Intelligence:
http://technet.microsoft.com/en-us/library/bb522628.aspx
For this topic, I would recommend submitting feedback to Microsoft Connect at this link:
https://connect.microsoft.com/SQLServer/Feedback
I also will update this post if I get any information about this.
If you have any feedback on our support, please click
here.
Regards,
Elvis Long
TechNet Community Support -
Parallel processing for increasing the performance
What are the various ways of parallel processing in Oracle, especially using hints?
Please let me know if there is any online documentation for understanding the concept.
First of all: as a rule of thumb, don't use hints. Hints make programs too inflexible. A hint may be good today, but might make things worse in the future.
There are lots of documents available concerning parallel processing:
Just go to http://www.oracle.com/pls/db102/homepage?remark=tahiti and search for parallel (processing)
In my experience with 10g, enabling parallel processing might slow down processing dramatically for regular tables. The reason is the large number of waits incurred in coordinating the parallel processes.
If, however, you are using parallel processing for partitioned tables, it works very well. In this case, take care to choose the partitioning criterion properly so that the processing can be distributed.
If, for example, your queries/DML work on data corresponding to a certain time range, don't use the date field as the partitioning criterion, since in that case parallel processing might work on just a single partition, which again would result in massive waits for process coordination.
Choose another criterion so as to distribute the data to be accessed across at least <number of CPUs - 1> partitions (one CPU is needed for the coordination process). Additionally, consider using parallel processing only where large tables are involved. Compare the situation with writing a book: if you plan to have several people write a (technical) book of just 10 pages, it wouldn't make any sense in terms of time reduction. If, however, the book is planned to have 10 chapters, each chapter could be written by a different author, reducing the elapsed time to about 1/10 of that for a single author writing all the chapters.
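As a purely illustrative aside (plain Java, nothing Oracle-specific; the class and numbers are invented), the chapter-splitting idea of dividing one large job across a fixed pool of workers and merging the partial results in a coordinator can be sketched like this:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ChunkedWork {

    // Sum 1..n by splitting the range into `workers` contiguous chunks,
    // each handled by its own pool thread ("chapter"/"author").
    public static long parallelSum(long n, int workers) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        try {
            long chunk = n / workers;
            List<Future<Long>> parts = new ArrayList<>();
            for (int w = 0; w < workers; w++) {
                long from = w * chunk + 1;
                long to = (w == workers - 1) ? n : (w + 1) * chunk;
                parts.add(pool.submit(() -> {
                    long s = 0;
                    for (long i = from; i <= to; i++) s += i;
                    return s;
                }));
            }
            long total = 0; // the "coordinator" merges the partial results
            for (Future<Long> p : parts) total += p.get();
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(parallelSum(1_000_000, 3)); // 500000500000
    }
}
```

The split only pays off when each chunk is big enough to outweigh the coordination cost, which mirrors the advice above about reserving parallelism for large tables.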
To enable parallel processing for a table use the following statement:
alter table <table name> parallel [<integer>];
If you don't use the <integer> argument, the DB will choose the degree of parallelism; otherwise it is controlled by your <integer> value. Remember that you always need a coordinator process, so don't choose <integer> to be larger than <number of CPUs minus 1>.
You can check the degree of parallelism by the degree column of user_/all_/dba_tables.
To do some timing tests, you also can force parallel dml/ddl/query for your current session.
ALTER SESSION FORCE PARALLEL DML/DDL/QUERY [<PARALLEL DEGREE>]; -
Parallel processing of mass data : sy-subrc value is not changed
Hi,
I have used parallel processing of mass data via "STARTING NEW TASK". In my function module I handle the exceptions and finally raise the application-specific old exception, to be handled in my main report program. Somehow sy-subrc is not getting changed and always returns 0, even if the exception is raised.
Can anyone help me about the same.
Thanks & Regards,
Nitin
Hi Silky,
I've built a block of code to explain this.
DATA: ls_edgar TYPE zedgar,
l_task(40).
DELETE FROM zedgar.
COMMIT WORK.
l_task = 'task1'.
ls_edgar-matnr = '123'.
ls_edgar-text = 'qwe'.
CALL FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' STARTING NEW TASK l_task PERFORMING f_go ON END OF TASK
EXPORTING
line = ls_edgar.
l_task = 'task2'.
ls_edgar-matnr = 'abc'.
ls_edgar-text = 'def'.
CALL FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' STARTING NEW TASK l_task PERFORMING f_go ON END OF TASK
EXPORTING
line = ls_edgar.
l_task = 'task3'.
ls_edgar-matnr = '456'.
ls_edgar-text = 'xyz'.
CALL FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' STARTING NEW TASK l_task PERFORMING f_go ON END OF TASK
EXPORTING
line = ls_edgar.
*& Form f_go
FORM f_go USING p_c TYPE ctype.
RECEIVE RESULTS FROM FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' EXCEPTIONS err = 2.
IF sy-subrc = 2.
*this won't affect the LUW of the received function
ROLLBACK WORK.
ELSE.
*this won't affect the LUW of the received function
COMMIT WORK.
ENDIF.
ENDFORM. "f_go
and the function is:
FUNCTION z_edgar_commit_rollback.
*"*"Interface local:
*" IMPORTING
*" VALUE(LINE) TYPE ZEDGAR
*" EXCEPTIONS
*" ERR
MODIFY zedgar FROM line.
IF line-matnr CP 'a*'.
*comment raise or rollback/commit to test
* RAISE err.
ROLLBACK WORK.
ELSE.
COMMIT WORK.
ENDIF.
ENDFUNCTION.
ok.
In your main program you have a Logical Unit of Work (LUW), which consists of an application transaction and is associated with a database transaction. Once you start a new task, you're creating an independent LUW with its own database transaction.
So if you do a commit or rollback in your function, the effect is only on the records you're processing in that function.
There is a way to capture, in the main LUW, the event when this LUW concludes: the PERFORMING ... ON END OF TASK addition. There you can get the result of the function, but you cannot commit or roll back the function's LUW, since that has already implicitly happened at the conclusion of the function. You can test this by commenting the code I've supplied accordingly.
So, if you want to roll back the LUW of the function, you'd better do it inside it.
I don't think this matches your question exactly, but maybe it leads you onto the right track. Give me more details if it doesn't.
Hope it helps,
Edgar -
Need help with parallel process in background; not able to call FM in bgnd
Hello,
I have been trying for 2 days to solve the issue of parallel processing in the background without using FPP.
I want to call a function module or class method in a new task, but have it processed by a background process, not a dialog process.
I searched many websites, but everyone suggests 'CALL FUNCTION ... IN BACKGROUND TASK'. The fact is that the processing of the function happens in a dialog process even in this case.
I want to loop over a table and call the FM or class method inside each loop iteration.
Kindly suggest how I can call a function or class method in a new task on every call and have it processed in the background.
thanks
Balaji,
Is the name of the button between single or double quotes?
Regards,
Dan
Blog: http://DanielMcGhan.us/
Work: http://SkillBuilders.com/ -
How to enable and monitor parallel processing in Oracle
Hi All,
I have 2 short questions:
1. When we want parallel processing, we can either use a parallel hint in the query or alter a table to be parallel. My question is: what is the difference between the following 2 statements:
a. ALTER TABLE myTable PARALLEL (DEGREE 3);
b. ALTER TABLE myTable PARALLEL 3;
Does the "DEGREE" keywor make any difference? or they both are same statements?
2. When we enable parallel processing, how can we monitor oracle processes, to confirm that a certain table is being actually processed by multiple threads of a singe user process?
An early response would be highly appreciated. Thanks.
1) The PARALLEL clause lets you change the default degree of parallelism for queries and DML on the table.
2) PARALLEL DEGREE specifies the number of query server processes that can scan the table in parallel. Either specify a positive integer, or DEFAULT, which means the corresponding initialization parameter is used.
check further http://mywebsys.com/oracle/syntax/view_syntax.php?id=23
Thanks -
Are fact tables and cubes the same in OWB?
Dear all
A simple question. How can I create a star schema (that is, with a fact table and dimensions) using OWB?
OWB has options to create cubes, but as per my understanding a cube is not a fact table.
A cube contains pre-computed data, whereas a fact table contains normal data with references to dimensions.
Please correct me if I am wrong.
thanks in advance
These are just different levels of abstraction.
"Cube" is the highest level of abstraction referring to the overall package of data.
"Star schema" is how cubes are modelled showing the relationships from a fact entity to the dimension entities.
Relational and OLAP are different methods of physical implementation.
In OWB, to promote sharing of dimensions across cubes and avoid inconsistency, the idea is that you define and build the dimensions independently. Then you define the "cubes" as measures plus references to the dimensions. When you build the "cube", you pass in business identifiers from the source data, which OWB uses to link the measures to the applicable dimension data. Due to the wonders of inner joins, anyone reading the "cube" will only see dimension data related to the data in that cube!
Using OWB you do not need to be concerned with the physical implementation when you use the dimension and cube operators, as those operators know what to do. -
How to monitor parallel processing
Hi All,
I have 2 short questions:
1. When we want parallel processing, we can either use a parallel hint in the query or alter a table to be parallel. My question is: what is the difference between the following 2 statements:
a. ALTER TABLE myTable PARALLEL (DEGREE 3);
b. ALTER TABLE myTable PARALLEL 3;
Does the "DEGREE" keywor make any difference? or they both are same statements?
2. When we enable parallel processing, how can we monitor oracle processes, to confirm that a certain table is being actually processed by multiple threads of a singe user process?
An early response would be highly appreciated. Thanks.user566817 wrote:
2. When we enable parallel processing, how can we monitor oracle processes, to confirm that a certain table is being actually processed by multiple threads of a singe user process?There are a number of virtual performance views that can be used. Please refer to the Oracle® Database Reference guide for details on these.
Had a look though my scripts and I have this one.. cannot recall if I "borrowed" it from somewhere and customised it and how old it is.. but it should (hopefully) still be mostly correct. It uses the virtual view v$px_process to determine the list of current PQ slaves in the pool and if they are used, map them to the Oracle session using them.
select distinct
x.server_name as "PQ",
x.status as "Status",
x.sid as "OraPID",
w2.sid as "Parent OraPID",
v.osuser as "O/S User",
v.schemaname as "User",
w1.event as "Child Wait",
w2.event as "Parent Wait"
from v$px_process x,
v$lock l,
v$session v,
v$session_wait w1,
v$session_wait w2
where x.sid != l.sid(+)
and to_number (substr(x.server_name,3)) = l.id2(+)
and x.sid = w1.sid(+)
and l.sid = w2.sid(+)
and x.sid = v.sid(+)
and nvl(l.type,'PS') = 'PS'
Use at your own risk - best would be to verify that this is still valid using the Reference Guide, or create similar queries using the available V$ views for the details you want to see (e.g. the SQL statement executed per PQ slave, etc.). -
Hi
I need to get an idea about parallel processing a program to get better performance by submitting the program in the background. If anyone has any simple code that gives a better understanding, please post it.
Hi Vighnesh
You can search a lot about
parallel processing
in SCN.
Use the advanced search option, e.g.:
https://www.sdn.sap.com/irj/scn/advancedsearch?cat=sdn_all&query=parallelprocessing&adv=false&sortby=cm_rnd_rankvalue
Let me know in case of any issues.
s@chin -
Parallel processing of condition records in SAP
Hi,
I have a particular scenario wherein XI sends 30000 IDocs for pricing condition records of message type COND_A to SAP, and SAP has to process all the IDocs within 15 minutes. Is this possible, and what kind of parallel processing techniques can be used to achieve it?
Regards,
Vijay
Edited by: Vijay Iyengar on Feb 21, 2008 2:05 PM
Hi
We had a similar performance issue loading conditions for sales deals.
We did not use IDocs.
Initially we did it with BDC, which loaded 19 records per second; later we developed a direct input program, which loaded close to 900 records per second.
What we did was write a direct input program and call the function module
CALL FUNCTION 'RV_KONDITION_SICHERN_V13A' IN UPDATE TASK
But please note - we took approval from SAP before using it.
Regards
Madhan
Edited by: Madhan Doraikannan on Oct 20, 2008 11:40 AM -
Parallel processing issue within the same server
hi,
I need to perform parallel processing within the same server, using the work processes available on that server.
Please suggest whether this can be accomplished, and explain the design if possible.
Hello Venkata,
You can achieve parallel processing by using CALL FUNCTION ... STARTING NEW TASK <task name>.
In this case the function module runs in asynchronous mode without stopping the calling program.
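For readers more at home in Java than ABAP, a rough analogy of this asynchronous pattern (hypothetical class and names; not an ABAP equivalent, just the same shape) is starting work on another thread and attaching a completion callback, while the caller continues immediately:

```java
import java.util.concurrent.CompletableFuture;

public class AsyncTask {

    // Start the "task" asynchronously; the caller gets a handle right away,
    // much like CALL FUNCTION ... STARTING NEW TASK returns immediately.
    public static CompletableFuture<String> startTask(String name) {
        return CompletableFuture.supplyAsync(() -> "result-of-" + name);
    }

    public static void main(String[] args) {
        CompletableFuture<String> handle = startTask("task1")
            // Analogous to PERFORMING ... ON END OF TASK: runs on completion.
            .thenApply(result -> "received: " + result);

        System.out.println("caller continues immediately");
        // Block only at the very end so the demo can print the callback result.
        System.out.println(handle.join());
    }
}
```

The caller keeps running while the task executes; only the final join() waits, and only because a short demo has nothing else to do.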
For more details you can refer to the following link:
https://wiki.sdn.sap.com/wiki/display/Snippets/Easilyimplementparallelprocessinginonlineandbatchprocessing
Thanks,
Augustin. -
Error occurred while processing the "sales" partition of the "sales" measure group in the cube
Hi
When I ran the job to process the cube, it showed the error "error occurred while processing the sales partition of the sales measure group in the cube", but there was no error message in the log files. After getting that error, we ran the cube manually, and at that time it executed successfully.
My aim is for the cube to be processed automatically when the job runs, but that is not happening.
Can you suggest a solution?
thank you
satyak248
Hi Satyak248,
According to your description, you get the error when using a Windows Task to process a cube on a schedule, but can process the cube successfully from SSMS manually, right?
Since you can process the cube manually, the issue may be that the Windows Task was not set up correctly. Instead, you can try processing the cube using an SSIS package. The Analysis Services Processing Task in SQL Server Integration Services (SSIS) allows for the processing of one, many, or all Analysis Services objects in an SSIS package. Once the SSIS package is created, a job can be created within SQL Server Management Studio to allow for scheduling.
http://www.mssqltips.com/sqlservertip/2994/configuring-the-analysis-services-processing-task-in-sql-server-2012-integration-services/
Regards,
Charlie Liao
TechNet Community Support -
Can I create a BI Beans compliant cube using OWB?
Can I create a cube that I can browse using BI Beans through OWB 9.0.4, or are there additional steps I need to take using other tools such as Enterprise Manager?
Are there any known incompatibilities between OWB 9.0.4 and BI beans 9.03.1.?
I will also pose this question in the BI Beans forum.
Thanks for any replies.
Cor
Hi,
I am trying to build an analytic workspace using the OWB (9.0.4.8) Transfer Bridge and I got a similar error.
None of the view/MV SQL statements were generated, and the analytic workspace was not created either.
FYI.
**! Transfer logging started at Wed May 14 18:07:41 EDT 2003 !**
OWB Bridge processed arguments
Default local= en_US
Exporting project:OM_SAMPLE
initializing project:OM_SAMPLE
Initializing module :WH
Exporting cube:SALES
Exporting dimension:CHANNELS
Exporting dimension:COUNTRIES
Exporting dimension:CUSTOMERS
Exporting dimension:PRODUCTS
Exporting mappings
Exporting table:CHANNELS
Exporting table:COUNTRIES
Exporting table:CUSTOMERS
Exporting table:PRODUCTS
Exporting table:SALES
Exporting datatypes
Exporting project OM_SAMPLE complete.
setting parameter: olapimp.deploytoaw = Y
setting parameter: olapimp.awname = OWBTARDEMO
setting parameter: olapimp.awobjprefix = OWBTAR_
setting parameter: olapimp.awuser =
setting parameter: olapimp.createviews = Y
setting parameter: olapimp.viewprefix = OWBTAR_
setting parameter: olapimp.viewaccesstype = OLAP
setting parameter: olapimp.creatematviews = Y
setting parameter: olapimp.viewscriptdir = /opt/oracle
setting parameter: olapimp.deploy = N
setting parameter: olapimp.username = OLAPSYS
setting parameter: olapimp.password = manager
setting parameter: olapimp.host = 10.215.79.139
setting parameter: olapimp.port = 1521
setting parameter: olapimp.sid = INDEXDB
setting parameter: olapimp.inputfilename = C:\TEMP\bridges\null-nullMy_Metadata_Transfer1052950061353.XMI
setting parameter: olapimp.outputfilename = C:\Panneer\owbtardemo.sql
Loading Metadata
Loading XMI input file
processing dim: CHANNELS
processing level: CHANNELin dimension CHANNELS
processing level attribute use: CHL_ID in level CHANNEL for level attribute ID
processing level attribute : ID in level CHANNEL
processing level attribute use: CHL_LLABEL in level CHANNEL for level attribute LLABEL
processing level attribute : LLABEL in level CHANNEL
processing level attribute use: CHL_SLABEL in level CHANNEL for level attribute SLABEL
processing level attribute : SLABEL in level CHANNEL
processing level: CLASSin dimension CHANNELS
processing level attribute use: CLS_ID in level CLASS for level attribute ID
processing level attribute : ID in level CLASS
processing level attribute use: CLS_LLABEL in level CLASS for level attribute LLABEL
processing level attribute : LLABEL in level CLASS
processing level attribute use: CLS_SLABEL in level CLASS for level attribute SLABEL
processing level attribute : SLABEL in level CLASS
processing hierarchy: CHANNEL_HIERARCHY in dimension CHANNELS
processing dim: COUNTRIES
processing level: REGIONin dimension COUNTRIES
processing level attribute use: RGN_ID in level REGION for level attribute ID
processing level attribute : ID in level REGION
processing level attribute use: RGN_LLABEL in level REGION for level attribute LLABEL
processing level attribute : LLABEL in level REGION
processing level attribute use: RGN_SLABEL in level REGION for level attribute SLABEL
processing level attribute : SLABEL in level REGION
processing level: COUNTRYin dimension COUNTRIES
processing level attribute use: CTY_ID in level COUNTRY for level attribute ID
processing level attribute : ID in level COUNTRY
processing level attribute use: CTY_LLABEL in level COUNTRY for level attribute LLABEL
processing level attribute : LLABEL in level COUNTRY
processing level attribute use: CTY_SLABEL in level COUNTRY for level attribute SLABEL
processing level attribute : SLABEL in level COUNTRY
processing hierarchy: COUNTRY_HIERARCHY in dimension COUNTRIES
processing dim: CUSTOMERS
processing level: CUSTOMERin dimension CUSTOMERS
processing level attribute use: CTR_CREDIT_LIMIT in level CUSTOMER for level attribute CREDIT_LIMIT
processing level attribute : CREDIT_LIMIT in level CUSTOMER
processing level attribute use: CTR_EMAIL in level CUSTOMER for level attribute EMAIL
processing level attribute : EMAIL in level CUSTOMER
processing level attribute use: CTR_ID in level CUSTOMER for level attribute ID
processing level attribute : ID in level CUSTOMER
processing level attribute use: CTR_NAME in level CUSTOMER for level attribute NAME
processing level attribute : NAME in level CUSTOMER
processing dim: PRODUCTS
processing level: PRODUCTin dimension PRODUCTS
processing level attribute use: PDT_DESCRIPTION in level PRODUCT for level attribute DESCRIPTION
processing level attribute : DESCRIPTION in level PRODUCT
processing level attribute use: PDT_ID in level PRODUCT for level attribute ID
processing level attribute : ID in level PRODUCT
processing level attribute use: PDT_LIST_PRICE in level PRODUCT for level attribute LIST_PRICE
processing level attribute : LIST_PRICE in level PRODUCT
processing level attribute use: PDT_MIN_PRICE in level PRODUCT for level attribute MIN_PRICE
processing level attribute : MIN_PRICE in level PRODUCT
processing level attribute use: PDT_NAME in level PRODUCT for level attribute NAME
processing level attribute : NAME in level PRODUCT
processing level: CATEGORYin dimension PRODUCTS
processing level attribute use: CTY_ID in level CATEGORY for level attribute ID
processing level attribute : ID in level CATEGORY
processing level attribute use: CTY_LLABEL in level CATEGORY for level attribute LLABEL
processing level attribute : LLABEL in level CATEGORY
processing level attribute use: CTY_SLABEL in level CATEGORY for level attribute SLABEL
processing level attribute : SLABEL in level CATEGORY
processing hierarchy: PRODUCT_HIERARCHY in dimension PRODUCTS
processing cube: SALES
processing classification type is := Warehouse Builder Business Area
processing catalog name := SALESCOLLECTION ,and description is := null
processing catalog entry element name := SALES
processing Cube
processing catalog entity cube := SALES
processing measure := COSTS , in a cube := SALES
processing measure := SALES , in a cube := SALES
processing catalog entry element name := CHANNELS
processing catalog entry element name := COUNTRIES
processing catalog entry element name := CUSTOMERS
processing catalog entry element name := PRODUCTS
processing catalog entry element name := CHANNELS
Class Name CHANNELS is TableImpl@405ffd not supported
processing catalog entry element name := COUNTRIES
Class Name COUNTRIES is TableImpl@5e1b8a not supported
processing catalog entry element name := CUSTOMERS
Class Name CUSTOMERS is TableImpl@6232b5 not supported
processing catalog entry element name := PRODUCTS
Class Name PRODUCTS is TableImpl@6f144c not supported
processing catalog entry element name := SALES
Class Name SALES is TableImpl@14013 not supported
processing classification type is := Dimensional Attribute Descriptor
Classification type Dimensional Attribute Descriptor is not supported
closing output file
closing log stream
**! Transfer process 2 of 2 completed with status = 0 !**
**! Transfer logging stopped at Wed May 14 18:07:47 EDT 2003 !**
But when I ran the "select * from dba_registry" everything seems to be valid.
CATALOG Oracle9i Catalog Views 9.2.0.2.0 VALID 24-APR-2003 09:39:24 SYS SYS DBMS_REGISTRY_SYS.VALIDATE_CATALOG
CATPROC Oracle9i Packages and Types 9.2.0.2.0 VALID 24-APR-2003 09:39:24 SYS SYS DBMS_REGISTRY_SYS.VALIDATE_CATPROC
OWM Oracle Workspace Manager 9.2.0.1.0 VALID 24-APR-2003 09:39:27 SYS WMSYS OWM_VALIDATE
JAVAVM JServer JAVA Virtual Machine 9.2.0.2.0 VALID 23-APR-2003 22:19:09 SYS SYS [NULL]
XML Oracle XDK for Java 9.2.0.2.0 VALID 24-APR-2003 09:39:32 SYS SYS XMLVALIDATE
CATJAVA Oracle9i Java Packages 9.2.0.2.0 VALID 24-APR-2003 09:39:32 SYS SYS DBMS_REGISTRY_SYS.VALIDATE_CATJAVA
ORDIM Oracle interMedia 9.2.0.2.0 LOADED 23-APR-2003 23:16:42 SYS SYS [NULL]
SDO Spatial 9.2.0.2.0 LOADED 23-APR-2003 23:17:06 SYS MDSYS [NULL]
CONTEXT Oracle Text 9.2.0.2.0 VALID 23-APR-2003 23:17:26 SYS SYS [NULL]
XDB Oracle XML Database 9.2.0.2.0 VALID 24-APR-2003 09:39:39 SYS XDB DBMS_REGXDB.VALIDATEXDB
WK Oracle Ultra Search 9.2.0.2.0 VALID 24-APR-2003 09:39:42 SYS WKSYS WK_UTIL.VALID
OLS Oracle Label Security 9.2.0.2.0 VALID 24-APR-2003 09:39:43 SYS LBACSYS LBAC_UTL.VALIDATE
ODM Oracle Data Mining 9.2.0.1.0 LOADED 12-MAY-2002 17:59:03 SYS ODM [NULL]
APS OLAP Analytic Workspace 9.2.0.2.0 LOADED 23-APR-2003 22:49:51 SYS SYS [NULL]
XOQ Oracle OLAP API 9.2.0.2.0 LOADED 23-APR-2003 22:51:49 SYS SYS [NULL]
AMD OLAP Catalog 9.2.0.2.0 VALID 02-MAY-2003 15:00:13 SYS OLAPSYS CWM2_OLAP_INSTALLER.VALIDATE_CWM2_INSTALL
Your help is appreciated!
Thanks
Panneer