Quality collector job fails despite the extractor fetching the data successfully
Hi,
The Quality collector job fails on 'DVS' (managed system) despite the extractor fetching the data successfully.
The Solution Manager currently in use is 'SOP', release 7.1 SP12.
All the RTCCTOOL recommendations are applied on SOP & DVS. ST-PI is on 2008_1_700 SP11 on both systems, and the ST-BCO component is on SP11.
As part of troubleshooting, the extractor runs for ATC and ATC exemptions were monitored.
The extractor run is successful, and the master run for ATC also completed successfully.
ATC monitoring for Custom Code (result) extractor result
There are no "ATC monitoring for Custom Code (exemption)" records, but the extractor run is successful
Successful ATC master run on the DVS managed system
The following notes are implemented:
2127901 - CCLM: Quality fails to get Results and exemptions due to Incorrect format of date
2067543 - Quality Collector : Missing Function name for Remote Solution
Is anyone facing a similar issue? Can anyone advise?
Hi Sylke,
Note 2098187 addresses ATC exemptions; the note is currently not implemented in the Solution Manager system SOP (nor in SOD, the development system of the Solution Manager landscape).
However, I have connected SOD (the Solution Manager development system) to PDS (an ECC sandbox), and there the Quality collector job was successful (without note 2098187 implemented).
As SOD and SOP are at the same SP level, I would expect the Quality collector job on SOP for DVS to finish successfully as well, just like the Quality collector job on SOD for PDS, which has already finished successfully and fetched the data. Can you please advise here?
Based on note 2077995, I executed the report for DVS in the SOP system and the status is green for '0SM_ATC'.
When I check for data in 0SM_ATC, I find the data for the DVS system.
I am able to run 0SM_ATC_CCL_QUAL successfully, but not 0SM_ATC_RUNDATE_LOOKUP.
When I try to execute the query 0SM_ATC_RUNDATE_LOOKUP in RSRT on SOP, I get the notification below; it looks like I need to install/activate the query in transaction RSA1.
0SM_ATC_RUNDATE_LOOKUP in RSA1 of SOP system
0SM_ATC_RUNDATE_LOOKUP in RSA1 of SOD system (development solution manager where Quality collector job for PDS is successful)
Sample output of 0SM_ATC_CCL_QUAL:
Can you advise here as well? It looks like installing/activating 0SM_ATC_RUNDATE_LOOKUP in SOP should solve the problem.
All the suggested notes are already implemented in SOD and SOP.
Thanks,
Sai
Similar Messages
-
Hierarchies job failing: "The job process could not communicate with the data flow process"
Hi Experts,
We have a group of hierarchies that run as a separate job on the DS schedules. The problem is this: when we schedule the job to run during the production loads it fails, but when we rerun it immediately afterwards it completes fine. So if I run it manually it works, but when it is scheduled alongside the production job it fails. Interestingly, if I schedule the job to run any time before or after the production jobs are done, it works fine.
The error I get is:
The job process could not communicate with the data flow <XXXXXX> process. For details, see previously logged
error <50406>.
Now, this XXXXX data flow has only horizontal flattening, and it does not run as a separate process, because when I had it run as a separate process it failed with an EOF error. So I removed "run as separate process" and changed the DF cache to in-memory.
Any suggestions on this problem? Thanks, Mike. I was hoping it is a memory issue, but what I don't understand is that the job fails when scheduled with the production job, yet runs when I start it manually during the production job; this rather baffles me.
DS 3.2 (Version 12.2.0.0)
OS: GNU/LINUX
DF Cache Setting :- In Memory
processor : 0
vendor_id : GenuineIntel
cpu family : 6
model : 26
model name : Intel(R) Xeon(R) CPU X5670 @ 2.93GHz
stepping : 4
cpu MHz : 2933.437
cache size : 12288 KB
fpu : yes
fpu_exception : yes
cpuid level : 11
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss syscall nx rdtscp lm constant_tsc ida nonstop_tsc arat pni ssse3 cx16 sse4_1 sse4_2 popcnt lahf_lm
bogomips : 5866.87
clflush size : 64
cache_alignment : 64
address sizes : 40 bits physical, 48 bits virtual
power management: [8]
processor : 1 (identical specifications to processor 0)
Thanks for your help -
Extractor class to fetch the data from PA0105
Hi All,
Can anyone suggest the extractor class used to fetch the data from table PA0105?
Any information regarding this will be highly helpful. If you give PA0105, all data will be extracted.
Or, if you need specific data, you need to know the field names first; check the table in SE16 to identify the field names and proceed accordingly.
Please reward points
Regards
Venu -
Getting an error while fetching the data and binding it in the tree table
Hi All,
I am getting the error "A navigation paths parameter object has to be defined" while fetching the data and binding it in the tree table.
Please find the code and screenshot below
var oModel = new sap.ui.model.odata.ODataModel("../../../XXXX.xsodata/", true);
var oTable = sap.ui.getCore().byId("table");
oTable.setModel(oModel);
oTable.bindRows({
    path: "/Parent",
    parameters: {expand: "Children"}
});
Can anyone please give me a suggestion to rectify this?
Thanks in Advance,
Aravindh
Hi All,
Please see the below code. It works fine for me.
var oController = sap.ui.controller("member_assignment");
var oModel = new sap.ui.model.odata.ODataModel("../../../services/XXXX.xsodata/", true);
var Context = "/PARENT?$expand=ASSIGNEDCHILD&$select=NAME,ID,ASSIGNEDCHILD/NAME,ASSIGNEDCHILD/ID,ASSIGNEDCHILD/PARENT_ID";
var oTable = sap.ui.getCore().byId("tblProviders");
oModel.read(Context, null, null, true, onSuccess, onError);
function onSuccess(oEventdata) {
    var outputJson = {};
    var p = 0;
    var r = {};
    try {
        if (oEventdata.results) {
            r = oEventdata.results;
        }
    } catch (e) {
        //alert('oEventdata.results failed');
    }
    $.each(r, function(i, j) {
        outputJson[p] = {};
        outputJson[p]["NAME"] = j.NAME;
        outputJson[p]["ID"] = j.ID;
        outputJson[p]["PARENT_ID"] = j.ID;
        outputJson[p]["DELETE"] = 0;
        var m = 0;
        if (j.ASSIGNEDCHILD.results.length > 0) {
            $.each(j.ASSIGNEDCHILD.results, function(a, b) {
                outputJson[p][m] = { NAME: b.NAME,
                                     ID: b.ID,
                                     PARENT_ID: b.PARENT_ID,
                                     DELETE: 1 };
                m++;
            });
        }
        p++;
    });
    var oPM = new sap.ui.model.json.JSONModel();
    oPM.setData(outputJson);
    oTable.setModel(oPM);
}
function onError(oEvent) {
    console.log("Error on Provider Members");
    oTable.bindRows({
        path: "/"
    });
}
Regards
Aravindh -
Not able to fetch the data via a Virtual Cube
Hi Experts,
My requirement is to fetch data from the source system (from a database table) using a virtual cube.
What I have done: I created the virtual cube and the corresponding function module in the source system.
That function module works fine if the database table in the source system is small. But if the table contains a huge amount of data (millions of records), I am not able to fetch the data.
Below is the code I have incorporated in my function module.
DATA:
  l_th_mapping TYPE cl_rsdrv_external_iprov_srv=>tn_th_iobj_fld_mapping,
  l_s_map      TYPE cl_rsdrv_external_iprov_srv=>tn_s_iobj_fld_mapping,
  l_r_srv      TYPE REF TO cl_rsdrv_external_iprov_srv.
l_s_map-iobjnm = '0PARTNER'.
l_s_map-fldnm  = 'PARTNER'.
INSERT l_s_map INTO TABLE l_th_mapping.
CREATE OBJECT l_r_srv
  EXPORTING
    i_tablnm              = '/SAPSLL/V_BLBP'
    i_th_iobj_fld_mapping = l_th_mapping.
l_r_srv->open_cursor(
i_t_characteristics = characteristics[]
i_t_keyfigures = keyfigures[]
i_t_selection = selection[] ).
l_r_srv->fetch_pack_data(
importing
e_t_data = data[] ).
return-type = 'S'.
In the above function module, internal table L_TH_MAPPING contains the InfoObjects of the virtual cube and the corresponding fields of the underlying database table.
The problem I am facing is that in the method FETCH_PACK_DATA the program initially tries to fetch all the records from the database table into an internal table. If the database table is very large, this logic does not work.
So could you please advise how to handle this kind of issue? -
Reg: fetching the data using an item_id which is returned by an inline view query
Hi all,
create table xxc_transactions(type_id number,trx_line_id number ,item_id number,org_id number);
insert into xxc_transactions values(null,null,null,null);
create table xxc_items1(item_id number,org_id number,item_no varchar2(10));
insert into xxc_items1 values(123,12,'book');
create table xxc_headers(header_id number,order_id number);
insert into xxc_headers values(null,null);
create table xxc_lines(header_id number,item_id number,line_id number);
insert into xxc_lines values(null,null,null);
create table xxc_types_tl(transaction_id number,NAME varchar2(10));
insert into xxc_types_tl values(106,'abc');
create table xxc_quantity(item_id number);
insert into xxc_quantity values (123);
create table xxc_quantity_1(item_id number);
insert into xxc_quantity_1 values (123);
SELECT union_id.item_id,
b.org_id,
e.name,
fun1(union_id.item_id) item_no
FROM xxc_transactions a,
xxc_items1 b,
xxc_headers c,
xxc_lines d,
xxc_types_tl e,
(SELECT item_id
FROM xxc_quantity
WHERE item_id = 123
UNION
SELECT item_id
FROM xxc_quantity_1
WHERE item_id = 123
UNION
SELECT item_id
FROM xxc_transactions
WHERE item_id = 123) union_id
WHERE a.type_id = 6
AND a.item_id = b.item_id
AND union_id.item_id = b.item_id
AND a.org_id = b.org_id
AND c.header_id = d.header_id
AND d.line_id = a.trx_line_id
AND d.item_id = b.item_id
AND c.order_id = e.transaction_id
AND b.org_id = 12
GROUP BY union_id.item_id,
b.org_id,
e.name
ORDER BY union_id.item_id;
create or replace function fun1(v_item in number)
return varchar2
is
v_item_no xxc_items1.item_no%type;
Begin
select item_no into v_item_no from xxc_items1
where item_id = v_item;
return v_item_no;
Exception
When Others Then
v_item_no := null;
return v_item_no;
END fun1;
I need to fetch the data using the item_id which is returned by the inline view query (UNION):
item_id org_id name item_no
123 12 abc book
Version: 11.1.0.7.0 and 11.2.0.1.0
Message was edited by: Rajesh123 Added test cases script
Message was edited by: Rajesh123 Changed the question to "fetch the data by using item_id which is returned by inline view query (UNION)"
Hi Master, sorry for the late reply. Can you please help on this?
create table xxc_transactions(type_id number,trx_line_id number ,item_id number,org_id number);
insert into xxc_transactions values(null,null,null,null);
create table xxc_items(item_id number,org_id number,item_no varchar2(10));
insert into xxc_items values(123,12,'book');
create table xxc_headers(header_id number,order_id number);
insert into xxc_headers values(null,null);
create table xxc_lines(header_id number,item_id number,line_id number);
insert into xxc_lines values(null,null,null);
create table xxc_types_tl(transaction_id number,NAME varchar2(10));
insert into xxc_types_tl values(106,'abc');
create table xxc_uinon_table(item_id number);
insert into xxc_uinon_table values(123);
SELECT union_id.item_id,
b.org_id ,
e.name ,
fun1(union_id.item_id) item_no --> to get item_no
FROM xxc_transactions a,
xxc_items b,
xxc_headers c,
xxc_lines d,
xxc_types_tl e,
( SELECT item_id
FROM xxc_uinon_table ) union_id
WHERE a.type_id= 6
AND a.item_id = b.item_id
AND union_id.item_id = b.item_id
AND a.org_id = b.org_id
AND c.header_id = d.header_id
AND d.line_id= a.trx_line_id
AND d.item_id= b.item_id
AND c.order_id= e.transaction_id ---106
AND b.org_id = 12
GROUP BY union_id.item_id,
b.org_id ,
e.name
ORDER BY union_id.item_id;
Note: xxc_uinon_table is a combination of UNIONs, along the lines of:
select 1 from dual
union
select 1 from dual
union
select 1 from dual where 1 = 0; -- this branch returns no rows
I will get a single row with value 1 from the above query.
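As an aside, the reason the three-branch UNION yields a single row is that UNION de-duplicates and an empty branch contributes nothing. The same set semantics can be shown in plain Java (used here purely as an illustration):

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class UnionDemo {
    // Mimic SQL UNION: combine all branches, then drop duplicates.
    static Set<Integer> union(List<Integer> a, List<Integer> b, List<Integer> c) {
        Set<Integer> result = new LinkedHashSet<>();
        result.addAll(a);
        result.addAll(b);
        result.addAll(c); // an empty branch adds nothing
        return result;
    }

    public static void main(String[] args) {
        // select 1 ... union select 1 ... union (branch returning no rows)
        Set<Integer> rows = union(List.of(1), List.of(1), List.of());
        System.out.println(rows); // prints [1]
    }
}
```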
Thank you in advanced -
How to fetch the data from pl/sql table dynamically
Hi All, I have a requirement to compare the data of two DB views in PL/SQL, so I bulk collected each view into a PL/SQL table. The issue is that the comparison expects a column name, but in my case the column names are dynamic, so I cannot hard-code a column name for the comparison.
For eg: In my view t1_VW, i have 4 columns. stid, c1,c2,c3,c4 and similar structure for t2_vw
my code
TYPE v1_type IS TABLE OF t1_vw%ROWTYPE;
l_data v1_type;
TYPE v1_type1 IS TABLE OF t2_vw%ROWTYPE;
l_data1 v1_type1;
test varchar2(1000);
test1 varchar2(1000);
temp1 number;
begin
SELECT * Bulk collect into l_data
FROM T1_VW;
SELECT * Bulk collect into l_data1
FROM T2_VW;
select l_data(1).stid into temp1 from dual; -- It is working fine and gives me the value properly
-- But in my case we read the column names from an array, construct the query dynamically, and execute it.
test :='select l_data(1).stid into temp1 from dual';
execute immediate test into temp1;
-- I am getting error as follows:
Error report:
ORA-00904: "L_DATA": invalid identifier
ORA-06512: at "SYSTEM.BULKCOMPARISON", line 93
ORA-06512: at line 2
00904. 00000 - "%s: invalid identifier"
*Cause:
*Action
end;
- Please help me to get rid of this issue. Is it possible to construct the query dynamically and fetch the data? If not, is there a better approach to compare the data between two views? The output should display which columns changed, along with their old and new values.
For example, the output should be:
COLUMNNAME OLD_VALUE NEW_VALUE STID
C1 20 10 1
C2 50 40 2
C3 60 70 2
C2 80 90 3
Why not do this via simple SQL?
create table a (STID number, C1 number, C2 number, C3 number);
insert into a values (1, 20, 30, 40);
insert into a values (2, 40, 50, 60);
insert into a values (3, 90, 80, 100);
create table b as select *
from a where 1 = 0;
insert into b values (1, 10, 30, 40);
insert into b values (2, 40, 40, 70);
insert into b values (3, 90, 90, 100);
commit;
And now you can issue this kind of select:
SELECT stid, c1, c2, c3
FROM
  ( SELECT a.*,
           1 src1,
           to_number(null) src2
    FROM a
    UNION ALL
    SELECT b.*,
           to_number(null) src1,
           2 src2
    FROM b )
GROUP BY stid, c1, c2, c3
HAVING count(src1) <> count(src2)
ORDER BY stid;
I would then create a new table a_b_difference, having the same structure as a or b, and insert into it like this:
create table a_b_diff as select * from a where 1 = 0;
insert into a_b_diff
SELECT stid, c1, c2, c3
FROM
  ( SELECT a.*,
           1 src1,
           to_number(null) src2
    FROM a
    UNION ALL
    SELECT b.*,
           to_number(null) src1,
           2 src2
    FROM b )
GROUP BY stid, c1, c2, c3
HAVING count(src1) <> count(src2)
ORDER BY stid;
Then each time there is a difference between a column in a and its equivalent in b (per unique stid), a record will be inserted into this table.
You can do more by adding the name of the table in front of each record in this table to see exactly where the data comes from
Best Regards
Mohamed Houri -
How to fetch the data from a database table and get the required output
Hi,
I have made a project that connects CEP to a database table, but I am getting a problem fetching the data from the database.
From the following code:
If the WHERE condition is removed then the application runs fine, but I am still not able to fetch the data from the table because it does not show any output.
Can anyone please suggest how to write the WHERE statement correctly so that I will be able to see the output?
Following is the config.xml for processor:
======================================
<?xml version="1.0" encoding="UTF-8"?>
<wlevs:config xmlns:wlevs="http://www.bea.com/ns/wlevs/config/application"
xmlns:jdbc="http://www.oracle.com/ns/ocep/config/jdbc">
<processor>
<name>JDBC_Processor</name>
<rules>
<query id="q1"><![CDATA[
SELECT STOCK.SYMBOL as symbol, STOCK.EXCHANGE as exchange
FROM ExchangeStream [Now] as datastream, STOCK
WHERE datastream.SYMBOL = datastream.SYMBOL ]]></query>
</rules>
</processor>
<jms-adapter>
<name>JMS_IN_Adapter</name>
<jndi-provider-url>t3://CHDSEZ135400D:7001</jndi-provider-url>
<destination-jndi-name>jms.TestKanikaQueue</destination-jndi-name>
<user>weblogic</user>
<password>welcome1</password>
</jms-adapter>
</wlevs:config>
Following is the assembly file:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:osgi="http://www.springframework.org/schema/osgi"
xmlns:wlevs="http://www.bea.com/ns/wlevs/spring" xmlns:jdbc="http://www.oracle.com/ns/ocep/jdbc"
xmlns:spatial="http://www.oracle.com/ns/ocep/spatial"
xsi:schemaLocation="
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/osgi
http://www.springframework.org/schema/osgi/spring-osgi.xsd
http://www.bea.com/ns/wlevs/spring
http://www.bea.com/ns/wlevs/spring/spring-wlevs-v11_1_1_3.xsd
http://www.oracle.com/ns/ocep/jdbc
http://www.oracle.com/ns/ocep/jdbc/ocep-jdbc.xsd
http://www.oracle.com/ns/ocep/spatial
http://www.oracle.com/ns/ocep/spatial/ocep-spatial.xsd">
<wlevs:event-type-repository>
<wlevs:event-type type-name="StockEvent">
<wlevs:properties>
<wlevs:property name="SYMBOL" type="byte[]" length="16" />
<wlevs:property name="EXCHANGE" type="byte[]" length="16" />
</wlevs:properties>
</wlevs:event-type>
<wlevs:event-type type-name="ExchangeEvent">
<wlevs:class>com.bea.wlevs.event.example.JDBC_CEP.ExchangeEvent</wlevs:class>
</wlevs:event-type>
<wlevs:event-type type-name="StockExchangeEvent">
<wlevs:properties>
<wlevs:property name="symbol" type="byte[]" length="16" />
<wlevs:property name="price" type="byte[]" length="16" />
<wlevs:property name="exchange" type="byte[]" length="16" />
</wlevs:properties>
</wlevs:event-type>
</wlevs:event-type-repository>
<bean id="readConverter" class="com.bea.wlevs.adapter.example.JDBC_CEP.Adapter_JDBC" />
<bean id="outputJDBCBean" class="com.bea.wlevs.bean.example.JDBC_CEP.OutputBean_JDBC">
</bean>
<wlevs:adapter id="JMS_IN_Adapter" provider="jms-inbound">
<wlevs:listener ref="ExchangeStream" />
<wlevs:instance-property name="converterBean"
ref="readConverter" />
</wlevs:adapter>
<wlevs:processor id="JDBC_Processor" advertise="true">
<wlevs:listener ref="OutputChannel" />
<wlevs:table-source ref="STOCK" />
</wlevs:processor>
<wlevs:channel id="ExchangeStream" event-type="ExchangeEvent" advertise="true">
<wlevs:listener ref="JDBC_Processor" />
</wlevs:channel>
<wlevs:channel id="OutputChannel" event-type="StockExchangeEvent"
advertise="true">
<wlevs:listener ref="outputJDBCBean" />
</wlevs:channel>
<wlevs:table id="STOCK" event-type="StockEvent"
data-source="StockDs" table-name="STOCK" />
<wlevs:table id="STOCK_EXCHANGE" event-type="StockExchangeEvent"
data-source="StockDs" table-name="STOCK_EXCHANGE" />
</beans>
ExchangeEvent.java:
package com.bea.wlevs.event.example.JDBC_CEP;
public class ExchangeEvent {
    public String SYMBOL;
    public String symbol;
    public String exchange;
    public ExchangeEvent() {
    }
    public String getSYMBOL() {
        return SYMBOL;
    }
    public void setSYMBOL(String sYMBOL) {
        SYMBOL = sYMBOL;
    }
    public String getSymbol() {
        return symbol;
    }
    public void setSymbol(String symbol) {
        this.symbol = symbol;
    }
    public String getExchange() {
        return exchange;
    }
    public void setExchange(String price) {
        this.exchange = price;
    }
}
Adapter Class:
package com.bea.wlevs.adapter.example.JDBC_CEP;
import com.bea.wlevs.adapter.example.JDBC_CEP.MyLogger;
import com.bea.wlevs.adapters.jms.api.InboundMessageConverter;
import java.text.DateFormat;
import java.util.Date;
import com.bea.wlevs.adapters.jms.api.MessageConverterException;
import com.bea.wlevs.event.example.JDBC_CEP.ExchangeEvent;
import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.TextMessage;
import java.util.ArrayList;
import java.util.List;
import java.util.Random;
public class Adapter_JDBC implements InboundMessageConverter {
    @SuppressWarnings("unchecked")
    public List convert(Message message) throws MessageConverterException, JMSException {
        Random rand = new Random();
        int unique_id = rand.nextInt();
        DateFormat dateFormat;
        dateFormat = DateFormat.getTimeInstance();
        dateFormat.format(new Date());
        MyLogger.info(unique_id + " CEP Start Time is: " + dateFormat.format(new Date()));
        System.out.println("Message from the Queue is :" + message);
        TextMessage textMessage = (TextMessage) message;
        String stringMessage = textMessage.getText().toString();
        System.out.println("Message after getting converted into String is :" + stringMessage);
        String[] results = stringMessage.split(",\\s*"); // split on commas
        ExchangeEvent event1 = new ExchangeEvent();
        event1.setSYMBOL(results[0]);
        List events = new ArrayList(2);
        events.add(event1);
        return events;
    }
}
Output Bean Class :
package com.bea.wlevs.bean.example.JDBC_CEP;
import com.bea.wlevs.ede.api.StreamSink;
import com.bea.wlevs.event.example.JDBC_CEP.ExchangeEvent;
import com.bea.core.datasource.DataSourceService;
public class OutputBean_JDBC implements StreamSink {
    public void onInsertEvent(Object event) {
        if (event instanceof ExchangeEvent) {
            ExchangeEvent cacheEvent = (ExchangeEvent) event;
            System.out.println("Symbol is: " + cacheEvent.getSymbol());
            System.out.println("Exchange is: " + cacheEvent.getExchange());
            System.out.println(DataSourceService.class.getClass());
        }
    }
}
Kindly let me know if you need further info.
Do you have StockDs configured in your server config.xml?
I think the query should look more like this:
SELECT stocks.SYMBOL, stocks.EXCHANGE
FROM STOCK as stocks, ExchangeStream [Now] as datastream WHERE stocks.SYMBOL = datastream.SYMBOL
Thanks
andy -
How to Fetch the Data from a Cube
Dear All,
We created a cube containing dimensions
Customer, Product, Branch, Activity, Time dimensions
using Oracle Analytical WorkSpace Manager.
Once Cube is created,
Once the cube is created, how can I see the data existing in the cube using normal SQL queries? Through the Analytical Workspace Manager tool we are able to see the data, but our requirement is to see the data from the cube using SQL queries.
Regards,
S.Vamsi Krishna
Hey, I got the solution. It goes like this:
A cube is nothing but data storage; based on the mapping we give, it holds the data.
To fetch the data from the cube, we have to write a SQL query like this:
SELECT dealer_name,model_name,sales
FROM TABLE(OLAP_TABLE('MDB.FINAL_AW DURATION SESSION',
'DIMENSION dealer_name AS varchar2(30) FROM FINALDEAL
DIMENSION model_name AS varchar2(30) FROM FINALMODEL'));
We can create a view for the above statement, or we can apply GROUP BY, ROLLUP, and similar clauses, and we can even write WHERE clauses for the above SELECT statement.
But now my doubt is:
Can we apply any calculations while mapping a level to a dimension?
Generally we map a level to a dimension as DBUSER.TABLENAME.COLUMN NAME.
Can we apply a calculation like:
MIS.PROPKEY020MB.MATURITY_DATE+2
Please help with the above.
If anything here is wrong, please let me know.
Regards,
S.Vamsi Krishna
-
Report is not fetching the data from Aggregate..
Hi All,
I am facing a problem with aggregates.
For example, when I run the report using transaction RSRT2, the BW report does not fetch the data from the aggregates; instead of going to the aggregate, it scans the whole cube's data.
FYI, I checked that the characteristics exactly match the aggregates.
It also gives the following message:
<b>Characteristic 0G_CWWPTY is compressed but is not in the aggregate/query</b>
Can somebody explain this error message? Please let me know the solution as soon as possible.
Thankyou in advance.
With regards,
Hari
Hi
Deactivate the aggregates, rebuild the indexes, and then activate the aggregates again.
GTR -
How to fetch the data from a pl/sql table and varray, with some example
I want to fetch the data using a cursor from a PL/SQL table and a varray, and I want to update the data.
Please provide some examples.
PL/SQL Table - please note that the right term is Associative Array.
Presumably you are referring to the often-heated back-and-forth that sometimes goes on in the forums when people refer to ANY PL/SQL collection type using a term with the word 'table' in it?
Curious that you then show an example of a nested table!
type emp_tab is table of employees%rowtype;
The 'right term' for that is 'nested table'. The following would be an 'associative array' or 'index-by table'
type emp_tab is table of employees%rowtype INDEX BY PLS_INTEGER;
Those used to be called 'PL/SQL tables' or 'index-by tables' but 'associative array' is the current term used.
Associative Arrays
An associative array (formerly called PL/SQL table or index-by table) is a set of key-value pairs. Each key is a unique index, used to locate the associated value with the syntax variable_name(index).
The data type of index can be either a string type or PLS_INTEGER.
Since the Oracle docs often use 'PL/SQL table' or 'index-by table' it isn't unusual for someone asking a question to use those terms also. Technically the types may not be 'tables' but it's clear what they mean when they use the term.
In PL/SQL the term 'nested table' is still used even though the PL/SQL collection is not really a table. SQL does have nested tables where the data is actually stored in a table. The PL/SQL 'nested table' type can be used as the source/destination of the SQL data from a nested table so that may be why Oracle uses that term for the PL/SQL type.
The doc that SKP referenced refers to this use:
Nested Tables
In the database, a nested table is a column type that stores an unspecified number of rows in no particular order. When you retrieve a nested table value from the database into a PL/SQL nested table variable, PL/SQL gives the rows consecutive indexes, starting at 1. -
Hi all.
I think that the problem I want to discuss is well-known, but still I got no answer whatever I tried ...
I installed OBIEE on Linux (32-bit OEL 5, to be more precise); the complete installation was not a big deal. After that I installed the Administration Tool on my laptop and created the repository. So my tnsnames.ora on the laptop looks like this:
TESTDB =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.5)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = testdb)
And the tnsnames.ora on server, in its turn, looks like this:
TESTDB =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = localhost.localdomain)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = testdb.localdomain)
The database worked normally and I created and transferred the repository to the server and started it up.
It started without any errors, but when I tried to fetch the data via the Presentation Services I got the error:
Odbc driver returned an error (SQLExecDirectW).
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred.
[nQSError: 16023] The ODBC function has returned an error. The database may not be available, or the network may be down. (HY000)
I discovered that the ODBC DSN on my laptop was not named correctly (it should have been identical to the tnsnames entry), so I corrected it, saved and replaced the repository on the server, and restarted it... and still got the same error.
Apparently something is wrong with the data source, so let me provide some more information.
My user.sh looks like this:
ORACLE_HOME=/u01/app/ora/product/11.2.0/dbhome_1
export ORACLE_HOME
TNS_ADMIN=$ORACLE_HOME/network/admin
export TNS_ADMIN
PATH=$ORACLE_HOME/bin:/opt/bin:$PATH
export PATH
LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH
and my odbc.ini looks like this:
[ODBC]
Trace=0
TraceFile=odbctrace.out
TraceDll=/u01/OracleBI/odbc/lib/odbctrac.so
InstallDir=/u01/OracleBI/odbc
UseCursorLib=0
IANAAppCodePage=4
[ODBC Data Sources]
AnalyticsWeb=Oracle BI Server
Cluster=Oracle BI Server
SSL_Sample=Oracle BI Server
TESTDB=Oracle BI Server
[TESTDB]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=local
Repository=SH
Catalog=
UID=
PWD=
Port=9703
[AnalyticsWeb]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=local
Repository=
Catalog=
UID=
PWD=
Port=9703
[Cluster]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=local
Repository=
FinalTimeOutForContactingCCS=60
InitialTimeOutForContactingPrimaryCCS=5
IsClusteredDSN=Yes
Catalog=SnowFlakeSales
UID=Administrator
PWD=
Port=9703
PrimaryCCS=
PrimaryCCSPort=9706
SecondaryCCS=
SecondaryCCSPort=9706
Regional=No
[SSL_Sample]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=localhost
Repository=
Catalog=SnowflakeSales
UID=
PWD=
Port=9703
SSL=Yes
SSLCertificateFile=/path/to/ssl/certificate.pem
SSLPrivateKeyFile=/path/to/ssl/privatekey.pem
SSLPassphraseFile=/path/to/ssl/passphrase.txt
SSLCipherList=
SSLVerifyPeer=No
SSLCACertificateDir=/path/to/ca/certificate/dir
SSLCACertificateFile=/path/to/ca/certificate/file.pem
SSLTrustedPeerDNs=
SSLCertVerificationDepth=9
Can anybody point a finger at where the error is? According to the documentation it should work fine.
Maybe the driver name is wrong? What driver do I need then? Because I can't find it.
I'm really sorry to bother you, guys :) Let me know if you get any ideas about it (Metalink didn't help).
OK, several things are wrong here. First, odbc.ini is not meant to be used for Oracle databases; that's not supported on Linux. On Linux you should use OCI (the Oracle native drivers), and nothing should be added to odbc.ini. Second, your user.sh seems to point to your DB installation path. That is not correct: it should point to your Oracle client installation, so you need to install the full Oracle client somewhere. Typically this is done under the same OS account as the one used for OBIEE, whereas the DB normally runs under the oracle account. Once the client is installed, test it under the OBIEE account with tnsping and sqlplus against your DB. Also, LD_LIBRARY_PATH should point to $ORACLE_HOME/lib32, not lib, as the lib directory holds the 64-bit libraries and OBIEE uses the 32-bit libraries even on 64-bit OSes. Finally, change your RPD connection to use OCI. Make all those changes and you should be good.
-
Error in fetching the data from a text field and inserting it into the database
I'm using Java Swing as the front end and MySQL as the back end in the NetBeans IDE. I am trying to fetch the data from the text field in the form and insert it into the database table (the generated GUI code is skipped below). In the following code I get the error "cannot find symbol: stmt" in the actionPerformed method.
import java.awt.event.*;
import java.sql.*;
import javax.swing.*;

public class BarcodeReader extends JFrame implements ActionListener {

    public BarcodeReader() {
        initComponents();
        nb.addActionListener(this);
    }

    public void jdbcConnect() {
        Connection con = null;
        String url = "jdbc:mysql://localhost:3306/";
        String db = "mynewdatabase";
        String driver = "com.mysql.jdbc.Driver";
        String user = "usrname";
        String pass = "pwd";
        try {
            String s = newtxt.getText();
            con = DriverManager.getConnection(url + db, user, pass);
            Statement stmt = con.createStatement();
            Class.forName(driver);
        } catch (Exception ex) {
            System.out.println(ex);
        }
    }

    public void actionPerformed(ActionEvent e) {
        try {
            jdbcConnect();
            stmt.executeUpdate("INSERT into machine(mname) values '" + jTextField1.getText() + "'");
        } catch (Exception ex) {
            System.out.println(ex);
        }
    }

    public static void main(String args[]) {
        java.awt.EventQueue.invokeLater(new Runnable() {
            public void run() {
                new BarcodeReader().setVisible(true);
            }
        });
    }
}

There are far too many errors to try and clear.
For one, the exception references the actionPerformed method (according to your text), so why is that not shown here?
For another, you are performing possibly time-consuming actions, and even worse I/O actions, on the event thread, which is a huge no-no.
You are not closing your resources properly, if at all, which is another huge no-no.
You are completely mixing your "view" (the GUI) and your "model" (the data-related classes), which is another huge no-no.
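As an editorial sketch (not code from this thread), the "close your resources" point can be illustrated with try-with-resources. Stand-in classes replace the JDBC objects so the snippet runs without a database; the real java.sql.Connection, Statement and PreparedStatement are all AutoCloseable and take exactly the same shape.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: try-with-resources guarantees close() runs even if an exception is
// thrown, and resources close in reverse order of declaration. FakeConnection
// and FakeStatement are stand-ins for java.sql.Connection / Statement.
public class CloseDemo {
    static List<String> log = new ArrayList<>();

    static class FakeConnection implements AutoCloseable {
        public void close() { log.add("connection closed"); }
    }

    static class FakeStatement implements AutoCloseable {
        public void executeUpdate(String sql) { log.add("executed: " + sql); }
        public void close() { log.add("statement closed"); }
    }

    public static void main(String[] args) {
        try (FakeConnection con = new FakeConnection();
             FakeStatement stmt = new FakeStatement()) {
            stmt.executeUpdate("INSERT INTO machine(mname) VALUES (?)");
        }
        // Statement closes before the connection, automatically:
        System.out.println(String.join(", ", log));
    }
}
```

In real JDBC code the same pattern would wrap DriverManager.getConnection and con.prepareStatement("INSERT INTO machine(mname) VALUES (?)"), using setString(1, ...) with the text-field value instead of string concatenation, and the whole database call would be moved off the event thread (for example into a SwingWorker).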
Etc., etc., etc.
-
How to fetch the data and display it in ALV if fields have the same name
Hi friends, I need your help.
How do I fetch the data and display it in ALV grid format if the fields have the same name?
Thanks in advance,
Regards,
Mahesh
9321043028

Refer to the URL:
http://abapexpert.blogspot.com/2007/07/sap-list-viewer-alv.html
Go through the guide for OO-based ALV.
Use SET_TABLE_FOR_FIRST_DISPLAY to display the table:
CALL METHOD grid->set_table_for_first_display
  EXPORTING
    I_STRUCTURE_NAME = 'SFLIGHT'   " structure name
  CHANGING
    IT_OUTTAB        = gt_sflight. " output table
You can also implement full-screen ALV; it's quite easy. Just pass the output table to the function module REUSE_ALV_GRID_DISPLAY.
To control and implement the full-screen ALV, concentrate on a few steps:
1. Select the data.
2. Prepare the layout of the display list.
3. Handle events.
4. Pass all the prepared data to REUSE_ALV_GRID_DISPLAY.
Regards,
Vishal
-
How can I join the header and item tables to fetch the data
Hi experts,
I have a doubt about using an inner join versus FOR ALL ENTRIES for fetching data from the item table MSEG, taking the document numbers from the header table MKPF. Please explain the difference, and what happens if I use each of the two statements for fetching data.

Hi,
Both have the same functionality, but if you are using FOR ALL ENTRIES (FAE) you have to check for:
~ the initial (empty) condition of the source table,
~ duplicate entries, if any.
An inner join fetches the data from all the joined tables at once. FAE fetches the data from one table first, then uses that data to fetch data from the subsequent table.
In your case, using FAE:
1. SELECT from MKPF.
2. SELECT from MSEG FOR ALL ENTRIES IN i_mkpf.
First try using a JOIN; if it is taking a lot of time, then try FAE.
Regards,
Madhu
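As an editorial aside, the data-flow difference described above can be sketched outside ABAP. This Java toy (table and field names MKPF/MSEG/MBLNR/MATNR borrowed from the question, the data invented) mimics a single-pass join versus the two-step FOR ALL ENTRIES pattern, including the empty-source-table guard the answer warns about.

```java
import java.util.*;
import java.util.stream.*;

public class JoinVsFae {
    // Toy stand-ins for the header table MKPF (document number) and
    // the item table MSEG (document number + material).
    record Header(String mblnr) {}
    record Item(String mblnr, String matnr) {}

    // "Inner join" style: match both tables in a single combined pass.
    static List<String> joinStyle(List<Header> mkpf, List<Item> mseg) {
        return mseg.stream()
                   .filter(i -> mkpf.stream().anyMatch(h -> h.mblnr().equals(i.mblnr())))
                   .map(Item::matnr)
                   .collect(Collectors.toList());
    }

    // "FOR ALL ENTRIES" style: first collect the keys from MKPF, then
    // select from MSEG using those keys. Note the guard: in ABAP, FAE
    // with an empty source table would select ALL rows.
    static List<String> faeStyle(List<Header> mkpf, List<Item> mseg) {
        Set<String> keys = mkpf.stream().map(Header::mblnr).collect(Collectors.toSet());
        if (keys.isEmpty()) {
            return List.of(); // never run the second select with no keys
        }
        return mseg.stream()
                   .filter(i -> keys.contains(i.mblnr()))
                   .map(Item::matnr)
                   .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Header> mkpf = List.of(new Header("5000000001"), new Header("5000000002"));
        List<Item> mseg = List.of(
            new Item("5000000001", "MAT-A"),
            new Item("5000000002", "MAT-B"),
            new Item("5000000003", "MAT-C")); // no matching header
        System.out.println(joinStyle(mkpf, mseg)); // prints [MAT-A, MAT-B]
        System.out.println(faeStyle(mkpf, mseg));  // prints [MAT-A, MAT-B]
    }
}
```

Both styles produce the same result set; in a real database the difference is in how many round trips are made and how the optimizer can plan the access, which is why the answer suggests trying the JOIN first.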