Only 1st character returned while fetching data through get_char_property
Hi
I am reading an Excel file through an OLE2 object. While I am fetching data through this utility, it fetches only the first character of the string.
procedure read_chr_cell( p_row number, p_col number, p_value in out varchar2 ) is
  args ole2.list_type;
  cell ole2.obj_type;
begin
  args := ole2.create_arglist;
  ole2.add_arg(args, p_row);
  ole2.add_arg(args, p_col);
  cell := ole2.get_obj_property(worksheet, 'Cells', args);
  ole2.destroy_arglist(args);
  p_value := ole2.get_char_property(cell, 'Value');  -- [[[ PROBLEM AREA ]]]
  ole2.release_obj(cell);
end;
Any advice is appreciated.
Thanks
Vishal
Hi Vishal
I think you need to loop through the records :)
Hope this helps...
Regards,
Amatu Allah
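For what it's worth, a workaround often reported for this exact symptom is to read the cell's 'Text' property instead of 'Value'. A minimal sketch, assuming the same Forms OLE2 setup as the original post ('worksheet' and the parameter names are taken from it):

```sql
procedure read_chr_cell( p_row number, p_col number, p_value in out varchar2 ) is
  args ole2.list_type;
  cell ole2.obj_type;
begin
  args := ole2.create_arglist;
  ole2.add_arg(args, p_row);
  ole2.add_arg(args, p_col);
  cell := ole2.get_obj_property(worksheet, 'Cells', args);
  ole2.destroy_arglist(args);
  -- 'Text' returns the formatted cell contents as a string; with 'Value',
  -- some Forms/Excel version combinations reportedly truncate to one character.
  p_value := ole2.get_char_property(cell, 'Text');
  ole2.release_obj(cell);
end;
```

Whether 'Text' behaves differently depends on the Forms and Excel versions involved, so treat this as something to try rather than a guaranteed fix.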
Similar Messages
-
Problem getting the selecteditem in tree while fetching data through httpserivces
I have an MXML file in which I want to display data in a tree structure. The data supplied to the tree is fetched from a database through HTTPService.
The MXML file content is:
<?xml version="1.0" encoding="utf-8"?>
<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute" creationComplete="getMenu.send();">
<mx:HTTPService id="getMenu" method="GET"
    url="http://172.17.26.55:7001/frmwk/MenuData"
    resultFormat="e4x" result="resultHandler(event)"
    useProxy="false"
/>
<mx:Script>
<![CDATA[
import mx.controls.Alert;
import mx.rpc.events.FaultEvent;
import mx.rpc.events.ResultEvent;
import mx.collections.XMLListCollection;
import mx.events.CloseEvent;
private var companyData:XML = new XML();
private var Menu:XMLListCollection;
private function resultHandler(event:ResultEvent):void {
    companyData = event.result as XML;
    Menu = new XMLListCollection(companyData.MENU);
    tree.dataProvider = Menu;
}
private function getSelected():void {
    Alert.show("selected--->" + tree.selectedItem);
}
]]>
</mx:Script>
<mx:Label text="Tree with XML data"/>
<mx:Tree id="tree" top="100" left="400" labelField="@title" height="224" width="179"
    dragEnabled="false" dropEnabled="false" allowMultipleSelection="true"
/>
<mx:Button label="get Selected item" click="getSelected()" x="400" y="450"/>
When I try to get the selected item, it gives me null.
Please help!
Thanks in advance.
How are you selecting the item? How many items are you selecting?
-
Append a string while fetching data Through Sql Statement
Hi, I have a table with one field that contains information about some deleted files.
When I write a SQL query to fetch the information, it gives output like:
SELECT DISTINCT filedname FROM TB_JOBCOMP_DBS40
Assinid
complte_date
pr_date
but I need output like:
Assinid -------------------->deleted
complte_date -------------------->deleted
pr_date -------------------->deleted
Is there any way to write SQL to fetch this information?
Are you looking for concatenation?
SQL> select concat(ename,'----- Names Of employee') from emp
2 WHERE rownum<3
3 ;
CONCAT(ENAME,'-----NAMESOFEMPLOYEE')
SMITH----- Names Of employee
ALLEN----- Names Of employee
SQL> select ename||'----- Names Of employee' from emp
2 WHERE rownum<3;
ENAME||'-----NAMESOFEMPLOYEE'
SMITH----- Names Of employee
ALLEN----- Names Of employee -
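To produce the exact layout the poster asked for, the same concatenation works with the arrow string from the question as the literal (a sketch against the poster's table, keeping the 'filedname' column name as spelled there):

```sql
SELECT DISTINCT filedname || ' -------------------->deleted' AS deleted_info
FROM   tb_jobcomp_dbs40;
```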
Hi,
While fetching data through the IR web client from the workspace for a year (all 12 months), I get the error "Out of Memory. Advice: Close other applications or windows and try again."
If I try the same through IR Studio, it does not give any output and shows me the same reporting front page.
If I select periods up to 8 months, it gives the required data in both the IR web client and IR Studio.
Could you please suggest how we can resolve this issue?
Thanks,
D.N.Rana
Issue Cause:
Sometimes this is due to excessive data, which brings the size of the BQY file up to around one gigabyte uncompressed (processing may take twice the actual RAM, plus the memory space for the plugin, and the typical memory limit on a 32-bit system is 2 gigabytes).
Solution :
To avoid excessive BQY size exceeding memory availability:
Ensure that your computer has at least 2 GB of free RAM before running IR Studio.
Limit the number of rows that can be pulled down: right-click the Request label of the Query section and put a value in "Return First xxx Rows" (and check the check box).
Do not pull down more than 750 MB of data (remember it may be duplicated while processing).
Place limits or aggregations in Query section (as opposed to Result section) to limit data entering the BQY. -
How to use the FOR ALL ENTRIES clause while fetching data from archived tables
How to use the FOR ALL ENTRIES clause while fetching data from archived tables using the FM
/PBS/SELECT_INTO_TABLE' .
I need to fetch data from an Archived table for all the entries in an internal table.
Kindly provide some inputs for the same.
thanks n Regards
Ramesh
Hi Ramesh,
I have a query regarding accessing archived data through PBS.
I have archived SAP FI data ( Object FI_DOCUMNT) using SAP standard process through TCODE : SARA.
Now please tell me can I acees this archived data through the PBS add on FM : '/PBS/SELECT_INTO_TABLE'.
Do I need to do something else to access data archived through the SAP standard process or not? If yes, please tell me, as I am not able to get the data using the above FM.
The call to the above FM is as follows :
CALL FUNCTION '/PBS/SELECT_INTO_TABLE'
EXPORTING
archiv = 'CFI'
OPTION = ''
tabname = 'BKPF'
SCHL1_NAME = 'BELNR'
SCHL1_VON = belnr-low
SCHL1_BIS = belnr-low
SCHL2_NAME = 'GJAHR'
SCHL2_VON = GJAHR-LOW
SCHL2_BIS = GJAHR-LOW
SCHL3_NAME = 'BUKRS'
SCHL3_VON = bukrs-low
SCHL3_BIS = bukrs-low
SCHL4_NAME =
SCHL4_VON =
SCHL4_BIS =
CLR_ITAB = 'X'
MAX_ZAHL =
tables
i_tabelle = t_bkpf
SCHL1_IN =
SCHL2_IN =
SCHL3_IN =
SCHL4_IN =
EXCEPTIONS
EOF = 1
OTHERS = 2.
It gives me the following error :
Index for table not supported ! BKPF BELNR.
Please help ASAP.
Thanks and Regards
Gurpreet Singh -
How can we improve the performance while fetching data from RESB table.
Hi All,
Can anybody suggest the right way to improve performance while fetching data from the RESB table? Below is the select statement.
SELECT aufnr posnr roms1 roanz
INTO (itab-aufnr, itab-pposnr, itab-roms1, itab-roanz)
FROM resb
WHERE kdauf = p_vbeln
AND ablad = itab-sposnr+2.
Here I am using 'KDAUF' & 'ABLAD' in condition. Can we use secondary index for improving the performance in this case.
Regards,
Himanshu
Hi,
Declare an internal table with only those four fields
and try the below code:
SELECT aufnr posnr roms1 roanz
INTO table itab
FROM resb
WHERE kdauf = p_vbeln
AND ablad = itab-sposnr+2.
yes, you can also use secondary index for improving the performance in this case.
Regards,
Anand .
Reward if it is useful.... -
Eliminate duplicate while fetching data from source
Hi All,
CUSTOMER TRANSACTION
CUST_LOC CUT_ID TRANSACTION_DATE TRANSACTION_TYPE
100 12345 01-jan-2009 CREDIT
100 23456 15-jan-2000 CREDIT
100 12345 01-jan-2010 DEBIT
100 12345 01-jan-2000 DEBIT
Now, as per my requirement, I need to fetch data from the CUSTOMER_TRANSACTION table for those customers which have transactions in the last 10 years. In my data above, customer 12345 has transactions in the last 10 years, whereas customer 23456 does not, so we will eliminate it.
Now, the CUSTOMER_TRANSACTION table has approximately 100 million records, so we are fetching data in batches. Batching is divided into months, 120 months in total. Below is my query.
select *
FROM CUSTOMER_TRANSACTION CT left outer join
     (select distinct CUST_LOC, CUT_ID FROM CUSTOMER_TRANSACTION
      WHERE TRANSACTION_DATE >= ADD_MONTHS(SYSDATE, -120)
        and TRANSACTION_DATE < ADD_MONTHS(SYSDATE, -119)) CUST
on CT.CUST_LOC = CUST.CUST_LOC and CT.CUT_ID = CUST.CUT_ID
Through the shell script, the month numbers will change: -120:-119, -119:-118, ..., -1:0.
Now the problem is duplication of records.
While fetching data for jan-2009, it will get cust_id 12345 and will fetch all 3 records and load them into the target.
While fetching data for jan-2010, it will again get cust_id 12345 and will fetch all 3 records and load them into the target.
So instead of having only 3 records for customer 12345, the target will have 6. Can someone help me with how I can stop duplicate records from getting in?
As of now i have 2 ways in mind.
1. Fetch all records at once. This is impossible, as it will give space issues.
2. After each batch, run a procedure that deletes duplicate records based on cust_loc, cut_id and transaction_date. But again, that has a performance problem.
I want to eliminate it while fetching data from source.
Edited by: ace_friends22 on Apr 6, 2011 10:16 AM
You can do it this way....
SELECT DISTINCT cust_loc,
cut_id
FROM customer_transaction
WHERE transaction_date >= ADD_MONTHS(SYSDATE, -120)
AND transaction_date < ADD_MONTHS(SYSDATE, -119)
However, please note that if you want to get the transactions in a month, like jan-2009 and jan-2010 as you said earlier, you might need to use TRUNC...
Your date comparison could be like this... In this example I am checking if the transaction date is in the month of jan-2009
AND transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27))
Your modified SQL...
SELECT *
FROM customer_transaction
WHERE transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27))Testing..
--Sample Data
CREATE TABLE customer_transaction (
cust_loc number,
cut_id number,
transaction_date date,
transaction_type varchar2(20)
);
INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2009','dd-MON-yyyy'),'CREDIT');
INSERT INTO customer_transaction VALUES (100,23456,TO_DATE('15-JAN-2000','dd-MON-yyyy'),'CREDIT');
INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2010','dd-MON-yyyy'),'DEBIT');
INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2000','dd-MON-yyyy'),'DEBIT');
--To have three records in the month of jan-2009
UPDATE customer_transaction
SET transaction_date = TO_DATE('02-JAN-2009','dd-MON-yyyy')
WHERE cut_id = 12345
AND transaction_date = TO_DATE('01-JAN-2010','dd-MON-yyyy');
UPDATE customer_transaction
SET transaction_date = TO_DATE('03-JAN-2009','dd-MON-yyyy')
WHERE cut_id = 12345
AND transaction_date = TO_DATE('01-JAN-2000','dd-MON-yyyy');
commit;
--End of sample data
SELECT *
FROM customer_transaction
WHERE transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27));
Results....
CUST_LOC CUT_ID TRANSACTI TRANSACTION_TYPE
100 12345 01-JAN-09 CREDIT
100 12345 02-JAN-09 DEBIT
100 12345 03-JAN-09 DEBIT
As you can see, there are only 3 records for 12345.
Regards,
Rakesh
Edited by: Rakesh on Apr 6, 2011 11:48 AM -
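One more sketch for the batching problem above, assuming the same CUSTOMER_TRANSACTION columns: instead of joining each month's distinct customers back to the whole table, assign every qualifying customer to exactly one batch, the month of their earliest transaction inside the 120-month window. A customer then falls into a single batch, so no duplicates can reach the target:

```sql
SELECT ct.*
FROM   customer_transaction ct
JOIN  (SELECT cust_loc, cut_id
       FROM   customer_transaction
       WHERE  transaction_date >= ADD_MONTHS(SYSDATE, -120)
       GROUP  BY cust_loc, cut_id
       -- keep only customers whose EARLIEST in-window transaction
       -- falls in this batch's month (here: months -120 to -119)
       HAVING MIN(transaction_date) >= ADD_MONTHS(SYSDATE, -120)
          AND MIN(transaction_date) <  ADD_MONTHS(SYSDATE, -119)) cust
  ON   ct.cust_loc = cust.cust_loc
 AND   ct.cut_id   = cust.cut_id;
```

The shell script would still slide the month pair per batch, but only the HAVING boundaries move; the WHERE clause always covers the full 120-month window so MIN is computed consistently.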
Fatal error while fetching data from BI
hi,
I am getting the following error while fetching data from BI using a select statement.
I have written the code this way:
SELECT [Measures].[D2GFTNHIOMI7KWV99SD7GPLTU] ON COLUMNS, NON EMPTY { [DEM_STATE].MEMBERS} ON ROWS FROM DEM_CUBE/TEST_F_8
Error description when I click on test:
Fatal Error
com.lighthammer.webservice.SoapException: The XML for Analysis provider encountered an error
Thanks for answering. But when I tried writing the statement in transaction MDXTEST and clicked on check, I got the following error:
Error occurred when starting the parser: timeout during allocate / CPIC-CALL: 'ThSAPCMRCV'
Message no. BRAINOLAPAPI011
Diagnosis
Failed to start the MDX parser.
System Response
timeout during allocate / CPIC-CALL: 'ThSAPCMRCV'
Procedure
Check the Sys Log in Transaction SM21 and test the TCP-IP connection MDX_PARSER in Transaction SM59.
So I went into SM59 to check the connection.
Can you tell me what configuration I need to do to make SELECT statements work? -
Facing problem in Dashboard 4.1 while fetching data from a Bex query
Hi Experts,
I am facing an error message "Failed to (de-)serialise data. (Xsl 000004)" while fetching data in Dashboard from a Bex query.
The query connects, but after dragging and dropping some dimensions and measures and then going for Refresh or Run Query, I get this error.
The same query is working fine with other components like WebI and Crystal.
Anybody having a solution for this, please let me know. I am stuck.
Thank You
Hi,
Check the data in the InfoProvider. Reduce the Bex query characteristics and key figure fields. Try to identify which characteristic added to the Bex query causes the issue, then check that characteristic's data in the InfoProvider.
Regards,
Venkat -
While importing data through FF_5 I am getting an error
Hello,
While importing data through FF_5 I am getting the error below:
message errors FV150 and FV151
'Termination in statement no. 00009 of acct
1101200570116; closing record 62F missing'
So please give a solution.
Thanks in advance,
SIRI
Dear Siri,
I guess you are importing an MT940 format. This format should have the closing balance at the end.
This closing balance starts with :62F:
A sample is
:61:0801180118DR3835,97NTRF000000//000000//
:86:1022 LTD CHENNAI18012008
:61:0801180118DR69885,09NCHK850819//850819//
:86:6101 LTD COCHIN18012008
:62F:C080118INR7210048,86
I guess that is missing in the import file.
Maintain that and the import will happen.
Assign points if useful
regards
Venkatesh -
Error while fetching data into collection type.
Hi all,
I'm facing a problem while fetching data into a table type.
open c1;
open c2;
loop
  fetch c1 into partition_name_1;
  fetch c2 into partition_name_2;
  exit when c1%notfound or c2%notfound;
  open c1_refcursor for
    'select a.col1, b.col2 from table1 partition(' || partition_name_1 || ') a, table2 partition(' || partition_name_2 || ') b where a.col2 = b.col2';
  loop
    fetch c1_refcursor bulk collect into v1, v2  -- <-- this is the line where I get "ORA-01858: a non-numeric character was found where a numeric was expected"
      limit 100000;
    exit when v1.count = 0;
    forall i in 1..v1.count
  end loop;
end loop;
I also checked the data types of the table variables; they are the same as the columns selected in the ref cursor.
Please help me out.
Message was edited by:
Sumit Narayan
OK, I see that, but I don't think you can use associative arrays in this manner, because you cannot select data directly into an associative array.
As far as I'm aware, they must be assigned an index, and the error suggests to me that it's missing an index.
You can select directly into records, and maybe this is where you need to go. -
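A small sketch of the shape that usually avoids ORA-01858 in this situation (the table and column names here mirror the post and are otherwise hypothetical): anchor the collection element types to the selected columns with %TYPE, so a DATE column can never be fetched into a VARCHAR2 slot, or vice versa:

```sql
DECLARE
  TYPE t_col1 IS TABLE OF table1.col1%TYPE;  -- element type follows the column
  TYPE t_col2 IS TABLE OF table2.col2%TYPE;
  v1 t_col1;
  v2 t_col2;
  c1_refcursor SYS_REFCURSOR;
BEGIN
  OPEN c1_refcursor FOR
    'select a.col1, b.col2 from table1 a, table2 b where a.col2 = b.col2';
  LOOP
    FETCH c1_refcursor BULK COLLECT INTO v1, v2 LIMIT 100000;
    EXIT WHEN v1.COUNT = 0;
    NULL;  -- process v1/v2 here
  END LOOP;
  CLOSE c1_refcursor;
END;
/
```

If the types already match, the other usual culprit is implicit date conversion inside the dynamic SQL string itself, which is worth ruling out by running the generated statement standalone.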
Error while Loading data through .csv file
Hi,
I am getting the date error below when loading data into OLAP tables through a .csv file.
The data stored in the .csv is 20071113121100.
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
... nl:ERROR(u:'transformation error')).
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
Any help is greatly appreciated.
Thanks,
Poojak
1) Wrong format; you won't get much support loading OLAP cubes in here, I think.
2) Has your CSV file been anywhere near Excel, by chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask / precision specified for a cell.
*** We are using EBS. The file was already generated by consultants through a SQL query. We are getting the error while loading through Informatica. The target column is set up as a date datatype and the source is String(19).
The expression in Informatica is set up as below:
IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
Thanks,
Poojak -
Error while reading data through an external table
CREATE TABLE "COGNOS"."EXT_COGNOS_TBS9_TEST"
( "ITEM_DESC" VARCHAR2(200 BYTE),
"EXT_CODE" VARCHAR2(20 BYTE),
"RC_DATE" DATE,
"RES_KD_AMNT" NUMBER(18,3),
"RES_FC_AMNT" NUMBER(18,3),
"NRES_KD_AMNT" NUMBER(18,3),
"NRES_FC_AMNT" NUMBER(18,3),
"TOTAL" NUMBER(18,3),
"OF_WHICH_OVR1" NUMBER(18,3)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY "EXTDATADIR"
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    LOAD WHEN (EXT_CODE LIKE 'TBS9%')   -- <-- problem area
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL )
  LOCATION ( 'TBS9_TEST.CSV' )
);
The external table creation went through successfully, but I am getting an error while reading data. I am quite sure the error is because of the LOAD WHEN line marked above. Could you please help me in transforming the logic?
Thanks in Advance,
AP
Let's start with the basics...
1) You state that you are getting an error. What error do you get? Is this an Oracle error (i.e. ORA-xxxxx)? If so, please include the error number and the error message as well as the triggering statement. Or is the problem that rows are getting written to the reject file and errors are being written to the log file? If so, what record(s) are being rejected and what are the reasons given in the log file? Or perhaps the problem is something else?
2) You state that you are quite sure that the problem relates to the hilighted code. What makes you quite sure of this?
Justin -
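For reference, ORACLE_LOADER's LOAD WHEN clause supports only = and != comparisons (on a named field or a (start:end) byte range), not LIKE, so the highlighted condition itself is a likely culprit. A sketch of the positional form — the byte positions are an assumption about where 'TBS9' sits in each record and must be adjusted to the actual file layout:

```sql
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
  -- keep only records whose bytes 1-4 are the literal TBS9;
  -- LIKE 'TBS9%' is not valid inside LOAD WHEN
  LOAD WHEN (1:4) = 'TBS9'
  FIELDS TERMINATED BY ','
  MISSING FIELD VALUES ARE NULL )
```

If the prefix position varies from record to record, the simpler route is to drop the LOAD WHEN clause and filter in SQL when querying the external table (WHERE ext_code LIKE 'TBS9%').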
Reg: Proxy error while fetching data from RFCs
Hi All,
I am fetching data from RFCs. When the data is bulky, I get an error like:
Proxy Error
The proxy server received an invalid response from an upstream server.
The proxy server could not handle the request POST /webdynpro/dispatcher/sap.com/pb/PageBuilder.
Reason: Error reading from remote server
When the data is small, I do not get any error and am able to fetch it easily. Please suggest something in this regard.
Thanks in advance,
Gaurav
Hi Detlef / Gowtham,
Now I am able to fetch data from flat files from OWB Server as well as OWB Client.
One way I achieved it, as suggested by you:
1) Created the location on the OWB Client
2) Sampled the files at the client
3) Created & configured the external table
4) Copied all flat files onto the OWB Server
5) Updated the location which I created at the client.
The other way:
1) Created the location on the OWB Client
2) Sampled the files at the client
3) Created & configured the external table
4) Copied the flat files onto the server in the same drive & directory, i.e. if all my flat files are in C:\data at the OWB Client, then I copied them to C:\data on the OWB Server. But this is feasible for non-Windows.
Hence my problem solved.
Thanks a lot.
Regards,
Manmohan -
Slow speed while fetching data from SQL 2008 using DoQuery
Hello,
I am working on an add-on and tried to use DoQuery for fetching data from SQL 2008 in C#.
There are around 148 records which fulfill this query condition, but it takes much time to fetch the data.
I want to know whether there is any problem in this code that makes my application slow.
I used breakpoints and checked it; I found that it is taking time while connecting to the server.
Code:
// Get an initialized SBObob object
oSBObob = (SAPbobsCOM.SBObob)oCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.BoBridge);
//// Get an initialized Recordset object
oRecordset = (SAPbobsCOM.Recordset)oCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.BoRecordset);
string sqlstring = "select DocEntry,ItemCode From OWOR where OWOR.Status='R' and DocEntry not in ( Select distinct(BaseRef) from IGE1 where IGE1.BaseRef = OWOR.DocEntry)";
oRecordset.DoQuery(sqlstring);
var ProductList = new BindingList<KeyValuePair<string, string>>();
ProductList.Add(new KeyValuePair<string, string>("", "---Please Select---"));
while (!oRecordset.EoF)
{
    ProductList.Add(new KeyValuePair<string, string>(oRecordset.Fields.Item(0).Value.ToString(), oRecordset.Fields.Item(0).Value.ToString() + " ( " + oRecordset.Fields.Item(1).Value.ToString() + " ) "));
    oRecordset.MoveNext();
}
cmbProductionOrder.ValueMember = "Key";
cmbProductionOrder.DisplayMember = "Value";
Thanks and Regards,
Ravi Sharma
Hi Ravi,
Your code and query look correct, but can you elaborate a little bit?
It seems to be a DI API program (no UI API)?
When you say "I found that it is taking time while connecting to the server", do you mean the recordset query or the DI API connection to SBO? The latter would be "normal", since the connection can take up to 30 seconds.
To get data it is usually better to use direct SQL connections.
regards,
Maik