Dump in BAPI while fetching data
Hi all,
I am getting a short dump ('No more storage space available for extending an internal table.') in my vendor trial balance report. The dump occurs in the BAPI I am using to fetch the opening balance, i.e. BAPI_AP_ACC_GETKEYDATEBALANCE.
Please find below the screenshot of the dump.
Kindly suggest possible solutions.
Short text
No more storage space available for extending an internal table.
What happened?
You attempted to extend an internal table, but the required space was
not available.
Error analysis
The internal table "\FUNCTION-POOL=3008\DATA=XLINEITEMS[]" could not be further
extended. To enable
error handling, the table had to be deleted before this log was written.
As a result, the table is displayed further down or, if you branch to
the ABAP Debugger, with 0 rows.
At the time of the termination, the following data was determined for
the relevant internal table:
Memory location: "Session memory"
Row width: 2122
Number of rows: 2757476
Allocated rows: 2757476
Newly requested rows: 4 (in 1 blocks)
How to correct the error
The amount of storage space (in bytes) filled at termination time was:
Roll area...................... 6216720
Extended memory (EM)........... 4005681840
Assigned memory (HEAP)......... 1998959040
Short area..................... " "
Paging area.................... 24576
Maximum address space.......... 18446743890435787967
If the error occurs in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"TSV_TNEW_PAGE_ALLOC_FAILED" " "
"SAPL3008" or "L3008U05"
"BAPI_AP_ACC_GETKEYDATEBALANCE"
Information on where terminated
Termination occurred in the ABAP program "SAPL3008" - in
"BAPI_AP_ACC_GETKEYDATEBALANCE".
The main program was "ZIBFI_VEND_TB_H1 ".
In the source code you have the termination point in line 48
of the (Include) program "L3008U05".
The program "SAPL3008" was started as a background job.
Job Name....... "ZIBFI_VEND_TB_H1"
Job Initiator.. 108735
Job Number..... 11490500
Line number:
44 PERFORM READ_KONTO USING VENDOR
45 COMPANYCODE RETURN.
46 CHECK RETURN IS INITIAL.
47
>>>>> SELECT * FROM BSIK APPENDING CORRESPONDING FIELDS OF TABLE XLINEITEMS
49 WHERE BUDAT <= KEYDATE
50 AND BUKRS EQ COMPANYCODE
51 AND LIFNR EQ VENDOR
52 AND BSTAT IN BSTAT.
There are only two ways to deal with this:
a) reduce the number of records selected
b) ask your Basis team to increase the available session memory
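A third option is to process the line items in fixed-size packages instead of materializing everything in one internal table at once (in ABAP that would be SELECT ... INTO TABLE ... PACKAGE SIZE n with the processing inside the SELECT/ENDSELECT loop). A minimal language-neutral sketch of the idea in Python, with an invented row source and package size for illustration:

```python
def fetch_packages(rows, package_size):
    """Yield rows in fixed-size packages so that at most
    package_size rows are held in memory at any one time."""
    package = []
    for row in rows:
        package.append(row)
        if len(package) == package_size:
            yield package
            package = []
    if package:  # last, possibly short, package
        yield package

# Aggregate a balance package by package instead of loading all rows at once.
rows = ({"lifnr": "V1", "amount": i} for i in range(10))  # pretend DB cursor
balance = 0
for package in fetch_packages(rows, package_size=4):
    balance += sum(r["amount"] for r in package)

print(balance)  # 0+1+...+9 = 45
```

The peak memory use is bounded by the package size rather than by the total number of rows, which is exactly what the dump above ran out of.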
Regards,
Nikhil
Similar Messages
-
Hi,
While fetching data through the IR web client from the workspace for a year (all 12 months), I am getting the error "Out of Memory. Advice: Close other applications or windows and try again."
If I try the same through IR Studio, it does not give any output and shows me the same reporting front page.
If I select periods up to 8 months, it gives the required data in both the IR web client and IR Studio.
Could you please suggest how we can resolve this issue?
Thanks,
D.N.Rana
Issue Cause:
Sometimes this is due to excessive data, which brings the size of the BQY file up to around one gigabyte uncompressed (processing may take twice that in actual RAM, plus the memory space for the plugin, and the typical memory limit on a 32-bit system is 2 gigabytes).
Solution:
To avoid the BQY size exceeding memory availability:
Ensure that your computer has at least 2 GB of free RAM before running IR Studio.
Put a limit on the number of rows that can be pulled down: right-click on the Request label of the Query section and put a value in "Return First xxx Rows" (and tick the checkbox).
Do not pull down more than 750 MB of data (remember it may be duplicated while processing).
Place limits or aggregations in Query section (as opposed to Result section) to limit data entering the BQY. -
How to use for all entires clause while fetching data from archived tables
How to use for all entires clause while fetching data from archived tables using the FM
/PBS/SELECT_INTO_TABLE' .
I need to fetch data from an Archived table for all the entries in an internal table.
Kindly provide some inputs for the same.
thanks n Regards
Ramesh
Hi Ramesh,
I have a query regarding accessing archived data through PBS.
I have archived SAP FI data (object FI_DOCUMNT) using the SAP standard process through transaction SARA.
Now please tell me, can I access this archived data through the PBS add-on FM '/PBS/SELECT_INTO_TABLE'?
Do I need to do something else to access data archived through the SAP standard process or not? If yes, please tell me, as I am not able to get the data using the above FM.
The call to the above FM is as follows :
CALL FUNCTION '/PBS/SELECT_INTO_TABLE'
  EXPORTING
    archiv     = 'CFI'
    option     = ''
    tabname    = 'BKPF'
    schl1_name = 'BELNR'
    schl1_von  = belnr-low
    schl1_bis  = belnr-low
    schl2_name = 'GJAHR'
    schl2_von  = gjahr-low
    schl2_bis  = gjahr-low
    schl3_name = 'BUKRS'
    schl3_von  = bukrs-low
    schl3_bis  = bukrs-low
*   schl4_name =
*   schl4_von  =
*   schl4_bis  =
    clr_itab   = 'X'
*   max_zahl   =
  TABLES
    i_tabelle  = t_bkpf
*   schl1_in   =
*   schl2_in   =
*   schl3_in   =
*   schl4_in   =
  EXCEPTIONS
    eof        = 1
    OTHERS     = 2.
It gives me the following error :
Index for table not supported ! BKPF BELNR.
Please help ASAP.
Thanks and Regards
Gurpreet Singh -
How can we improve the performance while fetching data from RESB table.
Hi All,
Can anybody suggest the right way to improve the performance while fetching data from the RESB table? Below is the select statement.
SELECT aufnr posnr roms1 roanz
INTO (itab-aufnr, itab-pposnr, itab-roms1, itab-roanz)
FROM resb
WHERE kdauf = p_vbeln
AND ablad = itab-sposnr+2.
Here I am using 'KDAUF' & 'ABLAD' in condition. Can we use secondary index for improving the performance in this case.
Regards,
Himanshu
Hi,
Declare the internal table with only those four fields,
and try the below code:
SELECT aufnr posnr roms1 roanz
  INTO TABLE itab
  FROM resb
  WHERE kdauf = p_vbeln
    AND ablad = itab-sposnr+2. " assumes itab has a header line with sposnr filled beforehand
Yes, you can also use a secondary index on KDAUF and ABLAD to improve the performance in this case.
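To illustrate why a secondary index on KDAUF/ABLAD helps: without it the database scans every RESB row; with it, it jumps straight to the matching ones. A rough analogy in Python, with invented sample data and a dict playing the role of the index:

```python
# Pretend RESB table: a list of row dicts.
resb = [
    {"kdauf": "4711", "ablad": "10", "aufnr": "A1"},
    {"kdauf": "4711", "ablad": "20", "aufnr": "A2"},
    {"kdauf": "4712", "ablad": "10", "aufnr": "A3"},
]

# Full-table scan: touches every row (what happens without an index).
scan_hits = [r for r in resb if r["kdauf"] == "4711" and r["ablad"] == "10"]

# Secondary "index": built once, then each query is a direct keyed lookup.
index = {}
for r in resb:
    index.setdefault((r["kdauf"], r["ablad"]), []).append(r)
index_hits = index.get(("4711", "10"), [])

assert scan_hits == index_hits  # same result, far fewer rows touched per query
```

The database does the same trade-off: the index costs space and maintenance on writes, but each SELECT on those columns no longer pays for a scan of the whole table.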
Regards,
Anand .
Reward if it is useful.... -
Eliminate duplicate while fetching data from source
Hi All,
CUSTOMER TRANSACTION
CUST_LOC CUT_ID TRANSACTION_DATE TRANSACTION_TYPE
100 12345 01-jan-2009 CREDIT
100 23456 15-jan-2000 CREDIT
100 12345 01-jan-2010 DEBIT
100 12345 01-jan-2000 DEBIT
Now, as per my requirement, I need to fetch data from the CUSTOMER_TRANSACTION table for those customers which have transactions in the last 10 years. In my data above, customer 12345 has transactions in the last 10 years, whereas customer 23456 does not, so we will eliminate it.
Now, the CUSTOMER_TRANSACTION table has approximately 100 million records, so we are fetching data in batches. The batches are divided into months, 120 months in total. Below is my query.
select *
FROM CUSTOMER_TRANSACTION CT left outer join
(select distinct CUST_LOC, CUT_ID FROM CUSTOMER_TRANSACTION WHERE TRANSACTION_DATE >= ADD_MONTHS(SYSDATE, -120) and TRANSACTION_DATE < ADD_MONTHS(SYSDATE, -119)) CUST
on CT.CUST_LOC = CUST.CUST_LOC and CT.CUT_ID = CUST.CUT_ID
Through the shell script, the month offsets will change: -120:-119, -119:-118, ..., -1:0.
Now the problem is duplication of records.
While fetching data for Jan 2009, it will get cust_id 12345 and will fetch all 3 records and load them into the target.
While fetching data for Jan 2010, it will again get cust_id 12345 and will fetch the same 3 records and load them into the target.
So instead of having only 3 records for customer 12345, the target will have 6. Can someone help me with how I can eliminate the duplicate records?
As of now I have 2 ways in mind:
1. Fetch all records at once. This is impossible, as it will cause space issues.
2. After each batch, run a procedure which deletes duplicate records based on cust_loc, cut_id and transaction_date. But again, this has a performance problem.
I want to eliminate the duplicates while fetching data from the source.
Edited by: ace_friends22 on Apr 6, 2011 10:16 AM
You can do it this way:
SELECT DISTINCT cust_doc,
cut_id
FROM customer_transaction
WHERE transaction_date >= ADD_MONTHS(SYSDATE, -120)
AND transaction_date < ADD_MONTHS(SYSDATE, -119)
However, please note that if you want to get the transactions in a specific month, like Jan 2009 or Jan 2010 as you said earlier, you might need to use TRUNC.
Your date comparison could be like this... In this example I am checking if the transaction date is in the month of jan-2009
AND transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27))
Your modified SQL:
SELECT *
FROM customer_transaction
WHERE transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27))
Testing...
--Sample Data
CREATE TABLE customer_transaction (
cust_loc number,
cut_id number,
transaction_date date,
transaction_type varchar2(20)
);
INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2009','dd-MON-yyyy'),'CREDIT');
INSERT INTO customer_transaction VALUES (100,23456,TO_DATE('15-JAN-2000','dd-MON-yyyy'),'CREDIT');
INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2010','dd-MON-yyyy'),'DEBIT');
INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2000','dd-MON-yyyy'),'DEBIT');
--To have three records in the month of jan-2009
UPDATE customer_transaction
SET transaction_date = TO_DATE('02-JAN-2009','dd-MON-yyyy')
WHERE cut_id = 12345
AND transaction_date = TO_DATE('01-JAN-2010','dd-MON-yyyy');
UPDATE customer_transaction
SET transaction_date = TO_DATE('03-JAN-2009','dd-MON-yyyy')
WHERE cut_id = 12345
AND transaction_date = TO_DATE('01-JAN-2000','dd-MON-yyyy');
commit;
--End of sample data
SELECT *
FROM customer_transaction
WHERE transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27));
Results:
CUST_LOC CUT_ID TRANSACTI TRANSACTION_TYPE
100 12345 01-JAN-09 CREDIT
100 12345 02-JAN-09 DEBIT
100 12345 03-JAN-09 DEBIT
As you can see, there are only 3 records for 12345.
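The reason the BETWEEN filter fixes the duplication is that it partitions rows by the month of the transaction itself, so every row lands in exactly one batch and no batch can re-fetch rows another batch already loaded. A small self-contained sketch of the same idea using Python's sqlite3, with ISO date strings standing in for Oracle dates and the sample data from the thread:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE customer_transaction (
    cust_loc INTEGER, cut_id INTEGER,
    transaction_date TEXT, transaction_type TEXT)""")
rows = [
    (100, 12345, "2009-01-01", "CREDIT"),
    (100, 12345, "2009-01-02", "DEBIT"),
    (100, 12345, "2009-01-03", "DEBIT"),
    (100, 23456, "2000-01-15", "CREDIT"),
]
cur.executemany("INSERT INTO customer_transaction VALUES (?,?,?,?)", rows)

# Batch by the calendar month of the transaction itself: each row belongs to
# exactly one batch, so the batches never overlap.
cur.execute("""SELECT * FROM customer_transaction
               WHERE transaction_date >= '2009-01-01'
                 AND transaction_date <  '2009-02-01'""")
jan_2009 = cur.fetchall()
print(len(jan_2009))  # 3 rows, each fetched exactly once
```

Contrast this with the original query, which batched by the month of *any* transaction of the customer and then pulled all of that customer's rows, so the same rows reappeared in every month the customer was active.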
Regards,
Rakesh
Edited by: Rakesh on Apr 6, 2011 11:48 AM -
Fatal error while fetching data from bi
hi,
I am getting the following error while fetching data from BI using a SELECT statement.
I have written the code this way:
SELECT [Measures].[D2GFTNHIOMI7KWV99SD7GPLTU] ON COLUMNS, NON EMPTY { [DEM_STATE].MEMBERS} ON ROWS FROM DEM_CUBE/TEST_F_8
Error description when I click on Test:
Fatal Error
com.lighthammer.webservice.SoapException: The XML for Analysis provider encountered an error
Thanks for answering. But when I tried writing the statement in transaction MDXTEST and clicked on Check, I got the following error:
Error occurred when starting the parser: timeout during allocate / CPIC-CALL: 'ThSAPCMRCV'
Message no. BRAINOLAPAPI011
Diagnosis
Failed to start the MDX parser.
System Response
timeout during allocate / CPIC-CALL: 'ThSAPCMRCV'
Procedure
Check the Sys Log in Transaction SM21 and test the TCP-IP connection MDX_PARSER in Transaction SM59.
So I went to SM59 to check the connection.
Can you tell me what configuration I need to do to make SELECT statements work?
Facing prolem in Dashboard 4.1, while fetching data from Bex Query
Hi Experts,
I am facing the error message "Failed to (de-)serialise data. (Xsl 000004)" while fetching data in Dashboard from a BEx query.
The query connects fine, but when I drag and drop some dimensions and measures and then go for Refresh or Run Query, I get this error.
The same query works fine with other components like WebI and Crystal.
If anybody has a solution, please let me know. I am stuck.
Thank you
Hi,
Check the data in the InfoProvider. Reduce the BEx query characteristics and key figure fields. Try to identify which characteristic added to the BEx query causes the issue, and check that characteristic's data in the InfoProvider.
Regards,
Venkat -
Error while fetching data into collection type.
Hi all,
I'm facing a problem while fetching data into a table type.
Open c1;
open c2;
loop
Fetch c1 into partition_name_1;
fetch c2 into partition_name_2;
exit when c1%notfound or c2%notfound;
open c1_refcursor for 'select a.col1, b.col2 from table1 partition(' || partition_name_1 || ') a, table2 partition(' || partition_name_2 || ') b
where a.col2 = b.col2';
loop
fetch c1_refcursor BULK COLLECT into v1,v2 <-----This is the line where i'm getting the error as "ORA-01858: a non-numeric character was found where a numeric was expected"
limit 100000;
exit when v1.count=0;
forall i in 1..v1.count
end loop;
I also checked the data type of the table variable; it is the same as the column selected in the ref cursor.
Please help me out.
Message was edited by: Sumit Narayan
OK, I see that, but I don't think you can use associative arrays in this manner then, because you cannot select data directly into an associative array.
As far as I'm aware, they must be assigned an index, and the error suggests to me that it is missing an index.
You can select directly into records, and maybe this is where you need to go.
Slow Speed While fetching data from SQL 2008 using DoQuery.
Hello,
I am working for an AddOn and tried to use DoQuery for fetching data from SQL 2008 in C#.
There are around 148 records which fulfil this query condition, but it takes a long time to fetch the data.
I want to know whether there is any problem in this code that is making my application slower.
I used breakpoints and checked it; I found that it is the connection to the server that takes the time.
Code:
// Get an initialized SBObob object
oSBObob = (SAPbobsCOM.SBObob)oCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.BoBridge);
//// Get an initialized Recordset object
oRecordset = (SAPbobsCOM.Recordset)oCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.BoRecordset);
string sqlstring = "select DocEntry,ItemCode From OWOR where OWOR.Status='R' and DocEntry not in ( Select distinct(BaseRef) from IGE1 where IGE1.BaseRef = OWOR.DocEntry)";
oRecordset.DoQuery(sqlstring);
var ProductList = new BindingList<KeyValuePair<string, string>>();
ProductList.Add(new KeyValuePair<string, string>("", "---Please Select---"));
while (!oRecordset.EoF)
{
    ProductList.Add(new KeyValuePair<string, string>(
        oRecordset.Fields.Item(0).Value.ToString(),
        oRecordset.Fields.Item(0).Value.ToString() + " ( " + oRecordset.Fields.Item(1).Value.ToString() + " ) "));
    oRecordset.MoveNext();
}
cmbProductionOrder.ValueMember = "Key";
cmbProductionOrder.DisplayMember = "Value";
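The query itself can be sanity-checked independently of the DI API. A minimal sketch of the same NOT IN logic using Python's sqlite3 as a stand-in database; the table and column names come from the thread, while the sample rows are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE OWOR (DocEntry INTEGER, ItemCode TEXT, Status TEXT)")
cur.execute("CREATE TABLE IGE1 (BaseRef INTEGER)")
cur.executemany("INSERT INTO OWOR VALUES (?,?,?)",
                [(1, "ITM1", "R"), (2, "ITM2", "R"), (3, "ITM3", "L")])
cur.execute("INSERT INTO IGE1 VALUES (1)")  # order 1 already has an issue row

# Released production orders that have no issue-for-production rows yet.
cur.execute("""SELECT DocEntry, ItemCode FROM OWOR
               WHERE OWOR.Status = 'R'
                 AND DocEntry NOT IN (SELECT DISTINCT BaseRef FROM IGE1
                                      WHERE IGE1.BaseRef = OWOR.DocEntry)""")
result = cur.fetchall()
print(result)  # [(2, 'ITM2')]
```

If the query is fast when run directly against the database like this, the slowness almost certainly lies in establishing the connection, not in the statement or the loop.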
Thanks and Regards,
Ravi Sharma
Hi Ravi,
your code and query look correct. But can you elaborate a little bit?
It seems to be a DI API program (no UI API)?
When you say "I found that connecting to the server is taking time", do you mean the recordset query or the DI API connection to SBO? The latter would be "normal", since the connection can take up to 30 seconds.
To get data it is usually better to use direct SQL connections.
regards,
Maik -
Error while fetching data from OWB Client using External Table.
Dear All,
I am using Oracle Warehouse Builder 11g & Oracle 10gR2 as repository database on Windows 2000 Server.
I am facing an issue in fetching data from a flat file using an external table from the OWB Client.
I have performed all the steps without any error, but when I try to view the data, I get the following error.
======================================
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file expense_categories.csv in SOURCE_LOCATION not found
ORA-06512: at "SYS.ORACLE_LOADER", line 19
java.sql.SQLException: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file expense_categories.csv in SOURCE_LOCATION not found
ORA-06512: at "SYS.ORACLE_LOADER", line 19
at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:70)
at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:110)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:171)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:455)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:413)
at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:1030)
at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:183)
at oracle.jdbc.driver.T4CStatement.executeForDescribe(T4CStatement.java:774)
at oracle.jdbc.driver.T4CStatement.executeMaybeDescribe(T4CStatement.java:849)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1186)
at oracle.jdbc.driver.OracleStatement.executeQuery(OracleStatement.java:1377)
at oracle.jdbc.driver.OracleStatementWrapper.executeQuery(OracleStatementWrapper.java:386)
at oracle.wh.ui.owbcommon.QueryResult.<init>(QueryResult.java:18)
at oracle.wh.ui.owbcommon.dataviewer.relational.OracleQueryResult.<init>(OracleDVTableModel.java:48)
at oracle.wh.ui.owbcommon.dataviewer.relational.OracleDVTableModel.doFetch(OracleDVTableModel.java:20)
at oracle.wh.ui.owbcommon.dataviewer.RDVTableModel.fetch(RDVTableModel.java:46)
at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerPanel$1.actionPerformed(BaseDataViewerPanel.java:218)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1849)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2169)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:420)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:258)
at javax.swing.AbstractButton.doClick(AbstractButton.java:302)
at javax.swing.AbstractButton.doClick(AbstractButton.java:282)
at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerPanel.executeQuery(BaseDataViewerPanel.java:493)
at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerEditor.init(BaseDataViewerEditor.java:116)
at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerEditor.<init>(BaseDataViewerEditor.java:58)
at oracle.wh.ui.owbcommon.dataviewer.relational.DataViewerEditor.<init>(DataViewerEditor.java:16)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
at oracle.wh.ui.owbcommon.IdeUtils._tryLaunchEditorByClass(IdeUtils.java:1412)
at oracle.wh.ui.owbcommon.IdeUtils._doLaunchEditor(IdeUtils.java:1349)
at oracle.wh.ui.owbcommon.IdeUtils._doLaunchEditor(IdeUtils.java:1367)
at oracle.wh.ui.owbcommon.IdeUtils.showDataViewer(IdeUtils.java:869)
at oracle.wh.ui.owbcommon.IdeUtils.showDataViewer(IdeUtils.java:856)
at oracle.wh.ui.console.commands.DataViewerCmd.performAction(DataViewerCmd.java:19)
at oracle.wh.ui.console.commands.TreeMenuHandler$1.run(TreeMenuHandler.java:188)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:209)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:461)
at java.awt.EventDispatchThread.pumpOneEventForHierarchy(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:163)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:157)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:149)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:110)
===========================
The error says that the file expense_categories.csv was not found in SOURCE_LOCATION, but I am 100% sure the file is there.
Has anybody faced the same issue?
Do we need to configure something before loading data from a flat file from the OWB Client?
Any help would be highly appreciated.
Regards,
Manmohan Sharma
Hi Detlef / Gowtham,
Now I am able to fetch data from flat files from the OWB Server as well as the OWB Client.
One way, I achieved it as suggested by you:
1) Created the location on the OWB Client
2) Sampled the files at the client
3) Created & configured the external table
4) Copied all flat files onto the OWB Server
5) Updated the location which I created at the client
The other way:
1) Created the location on the OWB Client
2) Sampled the files at the client
3) Created & configured the external table
4) Copied the flat files onto the server in the same drive & directory, i.e. if my flat files are in C:\data at the OWB Client, then I copied them to C:\data on the OWB Server. But this is not feasible for non-Windows systems.
Hence my problem is solved.
Thanks a lot.
Regards,
Manmohan -
BAM reports taking more time while fetching data from EDS
I have created an External Data Source (EDS) in BAM, and when I create an object with that EDS, it takes very long to fetch data from the EDS (more than 20 minutes). But the database is installed on my local system, and when I fire a query there directly, it doesn't take much time. Please help me with this: what could be the possible reason?
Your messaging gateway question would be better addressed in the SQL PL/SQL forum
All Places > Database > Oracle Database + Options > SQL and PL/SQL > Discussions -
Reg: Proxy error while fetching data from RFC's
Hi All,
I am fetching data from RFC's. When the data is in bulk I am getting an error like:
Proxy Error
The proxy server received an invalid response from an upstream server.
The proxy server could not handle the request POST /webdynpro/dispatcher/sap.com/pb/PageBuilder.
Reason: Error reading from remote server
When the data is small, I am not getting any error and am able to fetch it easily. Please suggest something in this regard.
Thanks in advance,
Gaurav -
Sessions opening\closing in oracle while fetching data from XI
Hi Friends,
I used the JDBC adapter to fetch data from Oracle. I set the poll interval to 86400 seconds because we need to run it on a daily basis.
Now when XI fetches data from Oracle, it opens a session in Oracle, but the session is not closed automatically. I have nearly 100 channels, all activated, so one session is opened in Oracle per channel. Because of this the Oracle server's performance goes down, and it does not work properly at that time.
How can I close these sessions in Oracle?
Regards,
Narendra
Did you try this parameter in the JDBC sender adapter?
Disconnect from Database After Processing Each Message
Set this indicator if the database connection is to be released and reestablished before every poll interval.
-Siva Maranani -
LabVIEW Memory full while fetching data from database
Hi,
In my program I need to sync some data from client PC as per the selected time frame.
But while fetching the data from the client database, my application hangs, and when I run it with code I get the 'LabVIEW memory full' error message.
Kindly suggest how to overcome this problem.
Fetching the entire database is probably not a good idea. You should narrow down how much you read at a time.
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines -
Only 1st character while fetching data through get_char_property
Hi
I am reading an Excel file through an OLE2 object. While I am fetching data through this utility, it fetches only the first character of the string.
procedure read_chr_cell( p_row number, p_col number, p_value in out varchar2 ) is
args ole2.list_type;
cell ole2.obj_type;
begin
args := ole2.create_arglist;
ole2.add_arg(args, p_row );
ole2.add_arg(args, p_col );
cell := ole2.get_obj_property(worksheet,'Cells',args);
ole2.destroy_arglist(args);
p_value := ole2.get_char_property(cell,'Value'); -- [[[ PROBLEM AREA ]]]
ole2.release_obj(cell);
end;
Any advice is appreciated.
Thanks
Vishal
Hi Vishal,
I think you need to loop through the records :)
Hope this helps...
Regards,
Amatu Allah