Limit for EXPORT TO MEMORY ID statement
Hi All,
I would like to know whether there is any limit to the size of the data (the size of <int_tab> below) that we can export to ABAP memory using the statement -
EXPORT <int_tab> TO MEMORY ID 'XXXX'.
I have read the F1 help for EXPORT; a size of 30,000 bytes is mentioned for export to shared memory, but for export to ABAP memory nothing like that is mentioned.
Can someone help me find this out, or are there no limitations for exporting to ABAP memory?
Regards
Munish Garg
I suppose the ABAP code itself has no limits, but the administrator has certain profile parameters for limiting user memory. Overloading system memory makes the machines slower than usual.
Similar Messages
-
Memory Limit for "IMPORT/EXPORT from MEMORY ID" statement
Hi All,
Can anyone tell me whether any memory limit exists for the "IMPORT/EXPORT FROM MEMORY ID" statement.
For example, maybe we can only transfer xx MB of data via this... or is it open, so that we can transfer any amount of data to ABAP memory?
Regards
Munish Garg
1. Each user session has external sessions, and each external session has internal sessions.
2. The programs being executed in the internal sessions can access ABAP memory.
3. ABAP memory is a storage area for internal program variables like fields, structures, and internal tables. They can be passed between internal sessions of an external session.
4. You can transfer data through ABAP memory using the IMPORT and EXPORT statements.
5. After IMPORT FROM MEMORY ID <id>, you can use sy-subrc to check the existence of the cluster ID; note that sy-subrc is not used to check whether the import itself was successful. -
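Points 4 and 5 can be sketched as follows; this is a minimal illustration, and the table type and memory ID are made up, not from the thread:

```abap
" Export an internal table to ABAP memory, re-import it, and
" check sy-subrc for the existence of the cluster ID.
DATA: lt_data TYPE STANDARD TABLE OF i,
      lt_copy TYPE STANDARD TABLE OF i.

APPEND 42 TO lt_data.

" Write the data cluster to ABAP memory under the ID 'ZDEMO'
EXPORT tab = lt_data TO MEMORY ID 'ZDEMO'.

" Read it back: sy-subrc = 0 means the cluster ID exists,
" sy-subrc = 4 means no cluster was found under that ID
IMPORT tab = lt_copy FROM MEMORY ID 'ZDEMO'.
IF sy-subrc <> 0.
  " Cluster ID not found in ABAP memory - handle the error here
ENDIF.

" Release the memory when it is no longer needed
FREE MEMORY ID 'ZDEMO'.
```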
Is there any limit for VISA allocate DMA memory ?
Hi,
I got an error, -1073807300, when I allocated DMA memory larger than 4 MBytes.
Is there any limit for allocating DMA memory?
I have run the attached VI on Win7 32-bit and 64-bit; it seems there is a limit at about 4 MBytes.
However, I can allocate more than 20 MBytes on a WinXP system.
Is there any solution for Win7 to allocate more than 4 MBytes?
I altered the VI to loop from 1M in increments of 1M until it errored (as fast as it could) and then output the farthest it got without an error. Here is what the plot of those highest allocations looks like, and attached is the VI.
-
Regarding Distribution Monitor for export/import
Hi,
We are planning to migrate a 1.2 TB database from Oracle 10.2g to MaxDB 7.7, and we are currently testing the migration on a test system. First we tried a simple export/import, i.e. without Distribution Monitor: we were able to export the database in 16 hrs, but the import was running for more than 88 hrs, so we aborted the import process. Later we found that we can use Distribution Monitor to distribute the export/import load across multiple systems so that the import completes within a reasonable time. We used 2 application servers for the export/import; the export completed within 14 hrs, but here again the import was running more than 80 hrs, so we aborted the import process. We also did table splitting for the big tables, but no luck. 8 parallel processes were running on each server, i.e. one CI and 2 app servers. We followed the DistributionMonitorUserGuide document from SAP. I observed that on the central system CPU and memory utilization was above 94%, but on the 2 application servers we added, CPU and memory utilization was very low, i.e. 10%. Please find the system configuration below:
Central Instance - 8CPU (550Mhz) 32GB RAM
App Server1 - 8CPU (550Mhz) 16GB RAM
App Server2 - 8CPU (550Mhz) 16GB RAM
And also, when I used the top Unix command on the app servers, I was able to see only one R3load process in the run state; all other 7 R3load processes were in the sleep state. But on the central instance, all 8 R3load processes were in the run state. I think that since all 8 R3load processes were not running at a time on the app servers, that could be the reason for the very slow import.
Please can someone let me know how to improve the import time? Also, if someone has done a database migration from Oracle 10.2g to MaxDB, it would be helpful if they can tell how they did it. Any specific document available for migration from Oracle to MaxDB would also be helpful.
Thanks,
Narendra
> And also, when I used the top Unix command on the app servers, I was able to see only one R3load process in the run state; all other 7 R3load processes were in the sleep state. But on the central instance, all 8 R3load processes were in the run state. I think that since all 8 R3load processes were not running at a time on the app servers, that could be the reason for the very slow import.
> Please can someone let me know how to improve the import time.
R3load connects directly to the database and loads the data. The question here is: how is your database configured (in the sense of caches and memory)?
> And also if someone has done the database migration from Oracle 10.2g to MaxDB if they can tell how they had done the database migration will be helpful. And also if any specific document availble for database migration from Oracle to MaxDB will be helpful.
There are no such documents available, since the process of migrating to another database is called a "heterogeneous system copy". This process requires a certified migration consultant to be on-site to do/assist the migration. Those consultants are trained specially for certain databases and know tips and tricks for improving the migration time.
See
http://service.sap.com/osdbmigration
--> FAQ
For MaxDB there's a special service available, see
Note 715701 - Migration to SAP DB/MaxDB
Markus -
ECC6 IMPORT/EXPORT to MEMORY ID 'OPENFI00002213E'
Hi experts,
I have a problem with IMPORT/EXPORT in ECC6 within the same program. The short dump happened at IMPORT time. Here is part of my code.
DATA: MEMID13(15) VALUE 'OPENFI00002213E',
T_VBKPF LIKE VBKPF OCCURS 0 WITH HEADER LINE.
EXPORT t_vbkpf TO MEMORY ID memid13.
CALL TRANSACTION 'FV60'
USING bdcdata
OPTIONS FROM s_ctu_params
MESSAGES INTO bdcmsgcoll.
IMPORT t_vbkpf FROM MEMORY ID memid13.
In 4.6C, the internal table t_vbkpf was updated with the created document from the transaction 'FV60' call but it gave a short dump in ECC6.
Runtime Errors CONNE_IMPORT_WRONG_STRUCTURE
Except. CX_SY_IMPORT_MISMATCH_ERROR
Date and Time 2007.12.06 15:46:46
Short text
Error when importing object "T_VBKPF".
Please help, thanks!
Chuong
Hi,
In ECC 6, for import from memory, the ID you use must be a flat, character-type data object.
The following is from documentation.
+
If MEMORY is specified, the data cluster that was written to the ABAP Memory under the identification specified in id with the statement EXPORT is imported. For id, a flat, character-type data object is expected. This object contains the identification of the data cluster.
+
Otherwise, use it this way:
TYPES:
  BEGIN OF tab,
    col1 TYPE i,
    col2 TYPE i,
  END OF tab.
DATA:
  wa_indx TYPE indx,
  wa_itab TYPE tab,
  cl      TYPE mandt VALUE '100',
  itab    TYPE STANDARD TABLE OF tab.

IMPORT tab = itab
       FROM DATABASE indx(xy)
       TO wa_indx
       CLIENT cl
       ID 'TABLE'.

WRITE: wa_indx-aedat, wa_indx-usera, wa_indx-pgmid.
ULINE.
LOOP AT itab INTO wa_itab.
  WRITE: / wa_itab-col1, wa_itab-col2.
ENDLOOP.
a® -
I want to get rid of the EXPORT and IMPORT statements. What are the alternative ways to import and export?
EXPORT TO memory ID &
IMPORT from Memory ID
BRegards
Pravin
You can use INDX cluster tables for this purpose; they are quite useful for passing values.
IMPORT p1 = w_btrtl FROM DATABASE indx(HR) ID W_CHAVE.
EXPORT p1 = w_btrtl TO DATABASE indx(HR) ID W_CHAVE.
DELETE FROM DATABASE indx(HR) ID W_CHAVE. -
Joins And For All Entries in Select Statement
Could you please tell me: when there is a high amount of data being handled in the table, does the use of INNER JOINs and FOR ALL ENTRIES in a SELECT statement decrease the system performance?
Can you also let me know where I can get some tips regarding dos and don'ts for ABAP programming? I want to increase my system performance.
Currently the programs being used are taking a lot of time to execute...
It's very urgent!
Hai Jyotsna
Go through the following tips for improving performance:
For all entries
FOR ALL ENTRIES creates a WHERE clause in which all the entries in the driver table are combined with OR. If the number of entries in the driver table is larger than rsdb/max_blocking_factor, several similar SQL statements are executed to limit the length of the WHERE clause.
The plus
Large amount of data
Mixing processing and reading of data
Fast internal reprocessing of data
Fast
The Minus
Difficult to program/understand
Memory could be critical (use FREE or PACKAGE size)
Some steps that might make FOR ALL ENTRIES more efficient:
Removing duplicates from the driver table
Sorting the driver table
If possible, convert the data in the driver table to ranges so that a BETWEEN condition is used instead of an OR condition:
FOR ALL ENTRIES IN i_tab
WHERE mykey >= i_tab-low AND
      mykey <= i_tab-high.
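Putting the preparation steps together, here is a hedged sketch; the VBAK/VBAP tables and the vbeln field are used purely for illustration:

```abap
" Prepare the driver table before FOR ALL ENTRIES
DATA: i_driver TYPE STANDARD TABLE OF vbak,
      i_items  TYPE STANDARD TABLE OF vbap.

" 1. Sort the driver table
SORT i_driver BY vbeln.
" 2. Remove duplicate keys so the same OR condition is not repeated
DELETE ADJACENT DUPLICATES FROM i_driver COMPARING vbeln.

" 3. Guard against an empty driver table: with FOR ALL ENTRIES,
"    an empty driver table would select ALL rows of the database table
IF i_driver IS NOT INITIAL.
  SELECT * FROM vbap
    INTO TABLE i_items
    FOR ALL ENTRIES IN i_driver
    WHERE vbeln = i_driver-vbeln.
ENDIF.
```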
Nested selects
The plus:
Small amount of data
Mixing processing and reading of data
Easy to code - and understand
The minus:
Large amount of data
when mixed processing isn't needed
Performance killer no. 1
Select using JOINS
The plus
Very large amount of data
Similar to Nested selects - when the accesses are planned by the programmer
In some cases the fastest
Not so memory critical
The minus
Very difficult to program/understand
Mixing processing and reading of data not possible
Use the selection criteria
SELECT * FROM SBOOK.
CHECK: SBOOK-CARRID = 'LH' AND
SBOOK-CONNID = '0400'.
ENDSELECT.
SELECT * FROM SBOOK
WHERE CARRID = 'LH' AND
CONNID = '0400'.
ENDSELECT.
Use the aggregated functions
C4A = '000'.
SELECT * FROM T100
WHERE SPRSL = 'D' AND
ARBGB = '00'.
CHECK: T100-MSGNR > C4A.
C4A = T100-MSGNR.
ENDSELECT.
SELECT MAX( MSGNR ) FROM T100 INTO C4A
WHERE SPRSL = 'D' AND
ARBGB = '00'.
Select with view
SELECT * FROM DD01L
WHERE DOMNAME LIKE 'CHAR%'
AND AS4LOCAL = 'A'.
SELECT SINGLE * FROM DD01T
WHERE DOMNAME = DD01L-DOMNAME
AND AS4LOCAL = 'A'
AND AS4VERS = DD01L-AS4VERS
AND DDLANGUAGE = SY-LANGU.
ENDSELECT.
SELECT * FROM DD01V
WHERE DOMNAME LIKE 'CHAR%'
AND DDLANGUAGE = SY-LANGU.
ENDSELECT.
Select with index support
SELECT * FROM T100
WHERE ARBGB = '00'
AND MSGNR = '999'.
ENDSELECT.
SELECT * FROM T002.
SELECT * FROM T100
WHERE SPRSL = T002-SPRAS
AND ARBGB = '00'
AND MSGNR = '999'.
ENDSELECT.
ENDSELECT.
Select Into table
REFRESH X006.
SELECT * FROM T006 INTO X006.
APPEND X006.
ENDSELECT
SELECT * FROM T006 INTO TABLE X006.
Select with selection list
SELECT * FROM DD01L
WHERE DOMNAME LIKE 'CHAR%'
AND AS4LOCAL = 'A'.
ENDSELECT
SELECT DOMNAME FROM DD01L
INTO DD01L-DOMNAME
WHERE DOMNAME LIKE 'CHAR%'
AND AS4LOCAL = 'A'.
ENDSELECT
Key access to multiple lines
LOOP AT TAB.
CHECK TAB-K = KVAL.
ENDLOOP.
LOOP AT TAB WHERE K = KVAL.
ENDLOOP.
Copying internal tables
REFRESH TAB_DEST.
LOOP AT TAB_SRC INTO TAB_DEST.
APPEND TAB_DEST.
ENDLOOP.
TAB_DEST[] = TAB_SRC[].
Modifying a set of lines
LOOP AT TAB.
IF TAB-FLAG IS INITIAL.
TAB-FLAG = 'X'.
ENDIF.
MODIFY TAB.
ENDLOOP.
TAB-FLAG = 'X'.
MODIFY TAB TRANSPORTING FLAG
WHERE FLAG IS INITIAL.
Deleting a sequence of lines
DO 101 TIMES.
DELETE TAB_DEST INDEX 450.
ENDDO.
DELETE TAB_DEST FROM 450 TO 550.
Linear search vs. binary
READ TABLE TAB WITH KEY K = 'X'.
READ TABLE TAB WITH KEY K = 'X' BINARY SEARCH.
Comparison of internal tables
DESCRIBE TABLE: TAB1 LINES L1,
TAB2 LINES L2.
IF L1 <> L2.
TAB_DIFFERENT = 'X'.
ELSE.
TAB_DIFFERENT = SPACE.
LOOP
AT TAB1.
READ TABLE TAB2 INDEX SY-TABIX.
IF TAB1 <> TAB2.
TAB_DIFFERENT = 'X'. EXIT.
ENDIF.
ENDLOOP.
ENDIF.
IF TAB_DIFFERENT = SPACE.
ENDIF.
IF TAB1[] = TAB2[].
ENDIF.
Modify selected components
LOOP AT TAB.
TAB-DATE = SY-DATUM.
MODIFY TAB.
ENDLOOP.
WA-DATE = SY-DATUM.
LOOP AT TAB.
MODIFY TAB FROM WA TRANSPORTING DATE.
ENDLOOP.
Appending two internal tables
LOOP AT TAB_SRC.
APPEND TAB_SRC TO TAB_DEST.
ENDLOOP
APPEND LINES OF TAB_SRC TO TAB_DEST.
Deleting a set of lines
LOOP AT TAB_DEST WHERE K = KVAL.
DELETE TAB_DEST.
ENDLOOP
DELETE TAB_DEST WHERE K = KVAL.
Tools available in SAP to pin-point a performance problem
· The runtime analysis (SE30)
· SQL Trace (ST05)
· Tips and Tricks tool
· The performance database
Optimizing the load of the database
Using table buffering
Using buffered tables improves performance considerably. Note that in some cases a statement cannot be used with a buffered table; when using these statements, the buffer will be bypassed. These statements are:
Select DISTINCT
ORDER BY / GROUP BY / HAVING clause
Any WHERE clause that contains a sub query or IS NULL expression
JOINs
A SELECT... FOR UPDATE
If you want to explicitly bypass the buffer, use the BYPASSING BUFFER addition to the SELECT statement.
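A minimal sketch of that addition (the table and WHERE values are illustrative):

```abap
DATA lt_t100 TYPE STANDARD TABLE OF t100.

" Read T100 directly from the database, skipping the table buffer
" on the application server
SELECT * FROM t100 BYPASSING BUFFER
  INTO TABLE lt_t100
  WHERE sprsl = 'E'
    AND arbgb = '00'.
```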
Use the ABAP SORT Clause Instead of ORDER BY
The ORDER BY clause is executed on the database server while the ABAP SORT statement is executed on the application server. The database server will usually be the bottleneck, so sometimes it is better to move the sort from the database server to the application server.
If you are not sorting by the primary key ( E.g. using the ORDER BY PRIMARY key statement) but are sorting by another key, it could be better to use the ABAP SORT statement to sort the data in an internal table. Note however that for very large result sets it might not be a feasible solution and you would want to let the database server sort it.
Avoid the SELECT DISTINCT Statement
As with the ORDER BY clause, it can be better to avoid SELECT DISTINCT if some of the fields are not part of an index. Instead, use ABAP SORT + DELETE ADJACENT DUPLICATES on an internal table to delete duplicate rows.
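A sketch of that pattern; the SFLIGHT fields are chosen only for illustration:

```abap
" Instead of SELECT DISTINCT carrid connid FROM sflight:
TYPES: BEGIN OF ty_flight,
         carrid TYPE sflight-carrid,
         connid TYPE sflight-connid,
       END OF ty_flight.
DATA lt_flights TYPE STANDARD TABLE OF ty_flight.

" Fetch all rows, duplicates included
SELECT carrid connid
  FROM sflight
  INTO TABLE lt_flights.

" De-duplicate on the application server instead of the database
SORT lt_flights BY carrid connid.
DELETE ADJACENT DUPLICATES FROM lt_flights COMPARING carrid connid.
```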
Thanks & regards
Sreenivasulu P -
What's the size limit for Bytes?
I inserted a 90K image into a LONG RAW column with Bytes and found only about 25K was there; the other 65K was lost. Also, I got no error in this process. Is there any limit on the size of Bytes?
Hi,
You can activate the SIZECHECK compiler option for interactive compilations only. Once enabled, the compiler option remains active until disabled. This option is automatically disabled for batch compilations.
Enable the SIZECHECK option prior to compiling a program unit to raise an alert if the size of a program unit source is approaching an operating system limit for memory allocation. If the size of the source is close to an operating system limit, the compiled state of that source will probably be larger and may exceed the operating system limit.
The SIZECHECK option also raises an alert if the compiled state of the program unit is approaching or exceeds an operating system specific limit for memory allocation.
If the source of the program unit, or the compiled program unit exceeds an operating system-specific memory allocation limit, you may wish to break the program unit into smaller program units.
Check your operating system documentation for the memory allocation limit on your platform.
You can temporarily remove a compiler option using the DISABLE command.
enjoy
Tehzeeb Ahmed -
Exporting slices from multiple states doesn't name the 1st state
I've tried various setups in the File > HTML Setup > Document Specific options for setting state names for exported slices, and i've tried the defaults.
Whenever I have a page with multiple states, and i want to export a slice from each state, when it generates the files it creates:
slicename.png
slicename_s2.png
slicename_s3.png etc
how can i force fireworks to name the 1st slice with _s1?
Is there a page size limit when exporting a pdf from InDesign?
My indd file is 201x40 inches with a 1.5 inch slug. When I export to pdf I get this message, "The document page size exceeds the page size range supported in PDF files. Exported pages will be resized to fit within PDF limits."
Although the PDF spec mentions that it's feasible to create a PDF as large as 15,000,000 inches in either direction, 200 x 200 inches is Distiller’s limit for PostScript. It’s based on the Windows printer driver limitation.
If I read the quote below correctly, the maximum page size is 15 million inches (381 km).
"This is from the PDF Reference 1.6:
177. In PDF versions earlier than PDF 1.6, the size of the default user space unit is fixed at 1/72 inch. In Acrobat viewers earlier than version 4.0, the minimum allowed page size is 72 by 72 units in default user space (1 by 1 inch); the maximum is 3240 by 3240 units (45 by 45 inches). In Acrobat versions 5.0 and later, the minimum allowed page size is 3 by 3 units (approximately 0.04 by 0.04 inch); the maximum is 14,400 by 14,400 units (200 by 200 inches).
Beginning with PDF 1.6, the size of the default user space unit may be set with the UserUnit entry of the page dictionary. Acrobat 7.0 supports a maximum UserUnit value of 75,000, which gives a maximum page dimension of 15,000,000 inches (14,400 * 75,000 * 1/72). The minimum UserUnit value is 1.0 (the default)." -
Is there a size limit to exporting PDF to Word?
Is there a size limit when exporting PDF to Word format? I keep getting time out errors for files exceeding 100 MB
Hi riversideredhead63,
100 MB is indeed the file size limit for uploading files to cloud.acrobat.com. Have you tried optimizing the PDF files in Acrobat to reduce their file size?
Best,
Sara -
EXPORT to MEMORY ID Vs. EXPORT to DATA BASE IDX table??
Hi,
Sometimes we use EXPORT TO MEMORY ID and sometimes we EXPORT TO DATABASE IDX table; please let me know:
1) When we use EXPORT to MEMORY ID?
2) When we use EXPORT to DATA BASE IDX table?
3) I know that in case (1) the data would be stored in shared memory for the session's lifetime, but what about case (2)? Where is it stored; is it really stored in the database? Like other data (say, sales order data), does this exported data persist forever in the Oracle database, or is this table (idx) a logical database?
Thank you
First of all, make it a point to read the SAP documentation if you have any doubts: [http://help.sap.com/abapdocu_70/en/ABAPEXPORT_DATA_CLUSTER_MEDIUM.htm#!ABAP_ALTERNATIVE_4@4@|http://help.sap.com/abapdocu_70/en/ABAPEXPORT_DATA_CLUSTER_MEDIUM.htm#!ABAP_ALTERNATIVE_4@4@]
SAP has defined the DB table INDX as a reference (you can create your own custom table similar to INDX if you want). When you EXPORT data to this table, a new record is created in the table, and the exported data is stored in RAW format.
This data remains in the INDX table even after the session is terminated.
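A hedged sketch of that round trip; the area 'XY' and the ID value are made up for illustration:

```abap
" Export to the INDX cluster table and re-import later.
" Unlike ABAP memory, the record persists across sessions
" until it is explicitly deleted.
DATA: lt_out TYPE STANDARD TABLE OF i,
      lt_in  TYPE STANDARD TABLE OF i.

APPEND 1 TO lt_out.

" Creates a record in INDX, area 'XY', key 'ZDEMO_KEY'
EXPORT tab = lt_out TO DATABASE indx(xy) ID 'ZDEMO_KEY'.

" Later - possibly in a different session or program:
IMPORT tab = lt_in FROM DATABASE indx(xy) ID 'ZDEMO_KEY'.

" Clean up the cluster record when it is no longer needed
DELETE FROM DATABASE indx(xy) ID 'ZDEMO_KEY'.
```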
BR,
Suhas -
Import/export to memory and global variables
Hi,
I'm working on some functionality using import/export to memory. One of the statements is:
IMPORT gf_memid_exit = g_exit_flag gf_memid_result = t352r FROM MEMORY ID 'ZREV_EXT'.
I'm using global variables to contain the cluster names, but this is not working. When I change the global variables into local variables, it works. Is this normal? I would like to have global variables (even better, global constants) declared in order to use them within the program.
Any ideas?
With regards,
Mike
Hi Mike,
I think with IMPORT you can use global or local variables. In debug mode, please check whether the global variables you are using are getting cleared or overwritten by some other values. Or try global constants; it should work.
Regards,
Joy. -
Problems with EXPORT TO MEMORY / IMPORT FROM MEMORY
Hello.
we've just made a SAP OS/DB migration of our 46C system from UNIX/Oracle to Windows/SQL.
Our FI consultant has noticed that a process failed.
The ABAP program that fails (it is a Z program) is using EXPORT TO MEMORY and IMPORT FROM MEMORY.
It seems the import is failing because the export to memory fails also.
Of course, we have changed memory parameters. Do you know which memory parameters are involved?
Thanks very much for your help.
> It seems the import is failing because the export to memory fails also.
Can you explain?
Did you check the extended debugger facilities like memory areas?
What is the structure/amount of data for ex- and import, same name, same data object?
Just reveal what you are really doing, post a few code lines.
Or not.
Regards,
Clemens -
Export to memory/Import from memory
When you're exporting to memory and there are multiple users in an online transaction, does "memory" mean the user's own memory? If multiple users are in AS02 and you have two custom subscreens, and you want to export a field to memory on the 1st custom screen and import it on the other screen, is it possible to import another user's memory if they're in that transaction doing the same thing?
Hi,
There are two types of Memories
ABAP memory is a temporary memory which can store data for a session. In this case each user will have his own memory.
For more information on ABAP Memory click the below link
http://help.sap.com/saphelp_47x200/helpdata/en/fc/eb3bc4358411d1829f0000e829fbfe/content.htm
SAP Memory is used to store and access data from different sessions. Users accessing the memory will have the same data.
Regards,
Vara
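As a sketch, SAP memory is typically accessed through SET/GET parameters; 'MAT' is the standard parameter ID for material numbers, and the value below is illustrative:

```abap
" SAP memory is shared across all sessions of one user logon,
" but not between different users.
DATA lv_matnr TYPE matnr.

" Write a value into SAP memory under parameter ID 'MAT'
SET PARAMETER ID 'MAT' FIELD '000000000000000023'.

" In another internal or external session of the same logon:
GET PARAMETER ID 'MAT' FIELD lv_matnr.
```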