Manual data transfer program - RHMOVE30
Why does the plan version change from active to "do not use" when we do manual data transfers (RHMOVE30) for OM objects?
Hi,
The plan version doesn't change; the objects are still in the active plan version. Plan version '.:' is only used by the report to export and import the objects, so you don't have to worry about it.
Regards
Gabriel
Similar Messages
-
The standard SAP data transfer program for vendors
Hi Experts,
Could you please tell me which program is "the standard SAP data transfer program for vendors"?
What exactly is it, and what is its transaction code?
Is it just XK01, or something else?
Thanks
Message was edited by: sey ni
Hi,
Please use RFBIKR00 for the vendor master upload. The program documentation is very useful too.
Regards,
Suresh Datti -
Error 7 occurred when generating the data transfer program
Hello All ,
In a master data load process chain, we get an error like:
1. System Response
Caller 09 contains an error message.
Diagnosis
Error 7 occurred when generating the data transfer program for the requested InfoSource.
System Response
The data transfer is terminated.
Procedure
Check the SAP Support Portal for the appropriate Notes and create a customer message if necessary.
Note: We have been facing this issue for two days now; simply repeating the load makes it succeed.
If anyone has faced and fixed this issue, please let me know.
Thanks in advance.
Hi,
1. First, go to transaction SE38 and run the program RSDS_DATASOURCE_ACTIVATE_ALL. Enter your DataSource name and source system, and select the checkbox "Only Inactive Objects".
This will activate the given DataSource.
2. Replicate the datasource in RSA1.
3. Try to schedule the infopackage for the datasource which you have activated now.
4. If the InfoPackage runs through, repeat the process for all DataSources, i.e. uncheck the checkbox, which means it will activate all DataSources for the source system.
Also make sure that there are enough background work processes available.
Reduce the number of parallel processes.
Thanks
BVR -
Junk characters in data transfer Program
Hello ABAP gurus,
I am facing a problem in a custom-made ABAP interface program. The program has an option to download the files to the local system (Windows) or to the Unix system (AL11). When I download the files locally and feed a file to a CRM system, the CRM processes the file correctly; but if we schedule a job, the CRM reads the file from the Unix system and generates an error showing junk characters in place of the special European characters. Please help me in this regard.
Hi,
When you download the file locally, are you able to see the junk characters in the file?
Log on to the Unix system and check the file content there. If any junk characters are present (usually, when we upload the file, they get populated at the end of the record), open the file in Unix, remove those characters, and reprocess the file; it will then go through.
Thanks,
Mahesh. -
Basic thing to know about data transfer programs
Hi Friends
I have one basic question that needs to be clarified:
I have a file containing 100 records, and I have to upload this file to SAP.
My question is :
Suppose the 80th record is an error record. What will happen if I am doing this using the session method?
Suppose the 80th record is an error record. What will happen if I am doing this using the call transaction method?
Suppose the 80th record is an error record. What will happen if I am doing this using the session method, but in background mode?
Please clarify all possible scenarios for me.
Points are assured for useful answers.
Regards,
Sree
Hi:
1) No data will be saved.
2) The error record won't be saved. In your case, all the records except the 80th will be saved using call transaction (foreground).
3) No data will be saved. -
Please get me the detailed procedure (including code) of a data transfer program using BAPIs.
Check the sample code below; it is used to transfer data to the MM01 transaction.
*TO CREATE MATERIAL USING BAPI.
* STRUCTURE DECLARATIONS *
TABLES: BAPIMATHEAD, "Headerdata
BAPI_MARA, "Clientdata
BAPI_MARAX, "Clientdatax
BAPI_MARC, "Plantdata
BAPI_MARCX, "Plantdatax
BAPI_MAKT, "Material description
BAPI_MBEW, "VALUATION DATA
BAPI_MBEWX,
BAPI_MARM,
BAPI_MARMX,
bapi_mean,
BAPIRET2. "Return messages
DATA:V_FILE TYPE STRING. "input data file
DATA:
BEGIN OF LSMW_MATERIAL_MASTER,
MATNR(018) TYPE C, "Material number
MTART(004) TYPE C, "Material type
MBRSH(001) TYPE C, "Industry sector
WERKS(004) TYPE C, "Plant
MAKTX(040) TYPE C, "Material description
DISMM(002) TYPE C, "Extra field added in the program as it's required
MEINS(003) TYPE C, "Base unit of measure
MATKL(009) TYPE C, "Material group
SPART(002) TYPE C, "Division
LABOR(003) TYPE C, "Lab/office
PRDHA(018) TYPE C, "Product hierarchy
MSTAE(002) TYPE C, "X-plant matl status
MTPOS_MARA(004) TYPE C, "Gen item cat group
BRGEW(017) TYPE C, "Gross weight
GEWEI(003) TYPE C, "Weight unit
NTGEW(017) TYPE C, "Net weight
GROES(032) TYPE C, "Size/Dimensions
MAGRV(004) TYPE C, "Matl grp pack matls
BISMT(018) TYPE C, "Old material number
WRKST(048) TYPE C, "Basic material
PROFL(003) TYPE C, "DG indicator profile
KZUMW(001) TYPE C, "Environmentally rlvt
BSTME(003) TYPE C, "Order unit
VABME(001) TYPE C,
EKGRP(003) TYPE C, "Purchasing group
XCHPF(001) TYPE C, "Batch management
EKWSL(004) TYPE C, "Purchasing key value
WEBAZ(003) TYPE C, "GR processing time
MFRPN(040) TYPE C, "Manufacturer part number
MFRNR(010) TYPE C, "Manufacturer number
VPRSV(001) TYPE C, "Price control indicator
STPRS(015) TYPE C, "Standard price
BWPRH(014) TYPE C, "Commercial price1
BKLAS(004) TYPE C, "Valuation class
bwkey(004) type c,
END OF LSMW_MATERIAL_MASTER.
* INTERNAL TABLE DECLARATIONS *
*to store the input data
DATA:
BEGIN OF it_matmaster OCCURS 0.
INCLUDE STRUCTURE LSMW_MATERIAL_MASTER.
DATA:
END OF it_matmaster.
*for material description
DATA:BEGIN OF IT_MATERIALDESC OCCURS 0.
INCLUDE STRUCTURE BAPI_MAKT .
DATA:END OF IT_MATERIALDESC.
*FOR gross wt
data: begin of it_uom occurs 0.
include structure BAPI_MARM.
data:end of it_uom.
DATA: BEGIN OF IT_UOMX OCCURS 0.
INCLUDE STRUCTURE BAPI_MARMX.
DATA:END OF IT_UOMX.
data:begin of it_mean occurs 0.
include structure bapi_mean.
data:end of it_mean.
DATA:BEGIN OF IT_MLTX OCCURS 0.
INCLUDE STRUCTURE BAPI_MLTX.
DATA:END OF IT_MLTX.
*to return messages
DATA:BEGIN OF IT_RETURN OCCURS 0.
INCLUDE STRUCTURE BAPIRET2.
DATA:END OF IT_RETURN.
* SELECTION SCREEN *
SELECTION-SCREEN BEGIN OF BLOCK B1 WITH FRAME TITLE TEXT-002.
PARAMETERS:P_FILE LIKE RLGRAP-FILENAME OBLIGATORY.
SELECTION-SCREEN END OF BLOCK B1 .
* AT SELECTION SCREEN *
AT SELECTION-SCREEN ON VALUE-REQUEST FOR P_FILE.
CALL FUNCTION 'F4_FILENAME'
EXPORTING
PROGRAM_NAME = SYST-CPROG
DYNPRO_NUMBER = SYST-DYNNR
FIELD_NAME = 'P_FILE'
IMPORTING
FILE_NAME = P_FILE.
* TO UPLOAD THE DATA *
START-OF-SELECTION.
V_FILE = P_FILE.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = V_FILE
FILETYPE = 'ASC'
HAS_FIELD_SEPARATOR = 'X'
HEADER_LENGTH = 0
READ_BY_LINE = 'X'
DAT_MODE = ' '
* IMPORTING
*   FILELENGTH =
*   HEADER =
tables
data_tab = IT_MATMASTER
EXCEPTIONS
FILE_OPEN_ERROR = 1
FILE_READ_ERROR = 2
NO_BATCH = 3
GUI_REFUSE_FILETRANSFER = 4
INVALID_TYPE = 5
NO_AUTHORITY = 6
UNKNOWN_ERROR = 7
BAD_DATA_FORMAT = 8
HEADER_NOT_ALLOWED = 9
SEPARATOR_NOT_ALLOWED = 10
HEADER_TOO_LONG = 11
UNKNOWN_DP_ERROR = 12
ACCESS_DENIED = 13
DP_OUT_OF_MEMORY = 14
DISK_FULL = 15
DP_TIMEOUT = 16
OTHERS = 17.
IF sy-subrc <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
*ELSE.
*DELETE IT_MATMASTER INDEX 1.
ENDIF.
* DATA POPULATIONS *
LOOP AT IT_MATMASTER.
*HEADER DATA
BAPIMATHEAD-MATERIAL = IT_MATMASTER-MATNR.
BAPIMATHEAD-IND_SECTOR = IT_MATMASTER-Mbrsh.
BAPIMATHEAD-MATL_TYPE = IT_MATMASTER-Mtart.
BAPIMATHEAD-BASIC_VIEW = 'X'.
BAPIMATHEAD-PURCHASE_VIEW = 'X'.
BAPIMATHEAD-ACCOUNT_VIEW = 'X'.
*CLIENTDATA
BAPI_MARA-MATL_GROUP = IT_MATMASTER-MATKL.
BAPI_MARA-DIVISION = IT_MATMASTER-SPART.
BAPI_MARA-DSN_OFFICE = IT_MATMASTER-LABOR.
BAPI_MARA-PROD_HIER = IT_MATMASTER-PRDHA.
BAPI_MARA-PUR_STATUS = IT_MATMASTER-MSTAE.
BAPI_MARA-ITEM_CAT = IT_MATMASTER-MTPOS_MARA.
BAPI_MARA-NET_WEIGHT = IT_MATMASTER-NTGEW.
BAPI_MARA-PO_UNIT = 'KG'.
BAPI_MARA-UNIT_OF_WT_ISO = 'KG'.
BAPI_MARA-UNIT_OF_WT = 'KG'.
BAPI_MARA-PACK_VO_UN = 'KG'.
BAPI_MARA-BASE_UOM_ISO = 'KG'.
bapi_mara-size_dim = it_matmaster-groes.
BAPI_MARA-MAT_GRP_SM = IT_MATMASTER-MAGRV.
BAPI_MARA-OLD_MAT_NO = IT_MATMASTER-BISMT.
BAPI_MARA-BASE_UOM = IT_MATMASTER-MEINS.
BAPI_MARA-BASIC_MATL = IT_MATMASTER-WRKST.
BAPI_MARA-HAZMATPROF = IT_MATMASTER-PROFL.
BAPI_MARA-ENVT_RLVT = IT_MATMASTER-KZUMW.
BAPI_MARA-PO_UNIT = IT_MATMASTER-BSTME.
BAPI_MARA-VAR_ORD_UN = IT_MATMASTER-VABME.
BAPI_MARA-PUR_VALKEY = IT_MATMASTER-EKWSL.
BAPI_MARA-MANU_MAT = IT_MATMASTER-MFRPN.
BAPI_MARA-MFR_NO = IT_MATMASTER-MFRNR.
BAPI_MARAX-MATL_GROUP = 'X'.
BAPI_MARAX-DIVISION = 'X'.
BAPI_MARAX-DSN_OFFICE = 'X'.
BAPI_MARAX-PROD_HIER = 'X'.
BAPI_MARAX-PUR_STATUS = 'X'.
BAPI_MARAX-ITEM_CAT = 'X'.
BAPI_MARAX-NET_WEIGHT = 'X'.
BAPI_MARAX-UNIT_OF_WT = 'X'.
BAPI_MARAX-UNIT_OF_WT_ISO = 'X'.
bapi_maraX-size_dim = 'X'.
BAPI_MARAX-MAT_GRP_SM = 'X'.
BAPI_MARAX-OLD_MAT_NO = 'X'.
BAPI_MARAX-BASE_UOM = 'X'.
BAPI_MARAX-BASE_UOM_ISO = 'X'.
BAPI_MARAX-BASIC_MATL = 'X'.
BAPI_MARAX-MFR_NO = 'X'.
BAPI_MARAX-HAZMATPROF = 'X'.
BAPI_MARAX-ENVT_RLVT = 'X'.
BAPI_MARAX-PO_UNIT = 'X'.
BAPI_MARAX-PACK_VO_UN = 'X'.
BAPI_MARAX-VAR_ORD_UN = 'X'.
BAPI_MARAX-PUR_VALKEY = 'X'.
BAPI_MARAX-MANU_MAT = 'X'.
BAPI_MARAX-MFR_NO = 'X'.
*PLANT DATA
BAPI_MARC-PLANT = IT_MATMASTER-WERKS.
BAPI_MARC-PUR_GROUP = IT_MATMASTER-EKGRP.
BAPI_MARC-BATCH_MGMT = IT_MATMASTER-XCHPF.
BAPI_MARC-GR_PR_TIME = IT_MATMASTER-WEBAZ.
BAPI_MARCX-PLANT = IT_MATMASTER-WERKS.
BAPI_MARCX-PUR_GROUP = 'X'.
BAPI_MARCX-BATCH_MGMT = 'X'.
BAPI_MARCX-GR_PR_TIME = 'X'.
*VALUATION DATA
BAPI_MBEW-PRICE_CTRL = IT_MATMASTER-VPRSV.
BAPI_MBEW-STD_PRICE = IT_MATMASTER-STPRS.
BAPI_MBEW-COMMPRICE1 = IT_MATMASTER-BWPRH.
BAPI_MBEW-VAL_AREA = IT_MATMASTER-BWKEY.
BAPI_MBEW-VAL_CLASS = IT_MATMASTER-BKLAS.
BAPI_MBEWX-PRICE_CTRL = 'X'.
BAPI_MBEWX-STD_PRICE = 'X'.
BAPI_MBEWX-COMMPRICE1 = 'X'.
BAPI_MBEWX-VAL_AREA = IT_MATMASTER-BWKEY.
BAPI_MBEWX-VAL_CLASS = 'X'.
IT_MATERIALDESC-LANGU = 'EN'.
IT_MATERIALDESC-MATL_DESC = IT_MATMASTER-MAKTX.
append IT_materialdesc.
IT_UOM-GROSS_WT = IT_MATMASTER-BRGEW.
IT_UOM-ALT_UNIT = 'KG'.
IT_UOM-ALT_UNIT_ISO = 'KG'.
IT_UOM-UNIT_OF_WT = IT_MATMASTER-GEWEI.
APPEND IT_UOM.
IT_UOMX-GROSS_WT = 'X'.
IT_UOMX-ALT_UNIT = 'KG'.
IT_UOMX-ALT_UNIT_ISO = 'KG'.
IT_UOMX-UNIT_OF_WT = 'X'.
APPEND IT_UOMX.
it_mean-unit = 'KD3'.
append it_mean.
it_mltx-langu = 'E'.
it_mltx-text_name = it_matmaster-matnr.
APPEND IT_MLTX.
CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
EXPORTING
headdata = BAPIMATHEAD
CLIENTDATA = BAPI_MARA
CLIENTDATAX = BAPI_MARAX
PLANTDATA = BAPI_MARC
PLANTDATAX = BAPI_MARCX
VALUATIONDATA = BAPI_MBEW
VALUATIONDATAX = BAPI_MBEWX
FLAG_ONLINE = ' '
FLAG_CAD_CALL = ' '
IMPORTING
RETURN = IT_RETURN
TABLES
MATERIALDESCRIPTION = IT_MATERIALDESC
UNITSOFMEASURE = IT_UOM
UNITSOFMEASUREX = IT_UOMX
INTERNATIONALARTNOS = it_mean
MATERIALLONGTEXT = IT_MLTX.
append it_return. "RETURN comes back in the header line; add it to the body
read table it_return with key TYPE = 'S'.
if sy-subrc = 0.
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
EXPORTING
WAIT = 'X'. "wait for the update before processing the next record
else.
CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
endif.
WRITE:/ IT_RETURN-TYPE,
2 IT_RETURN-ID,
22 IT_RETURN-NUMBER,
25 IT_RETURN-MESSAGE,
/ IT_RETURN-LOG_NO,
IT_RETURN-LOG_MSG_NO,
IT_RETURN-MESSAGE_V1,
IT_RETURN-MESSAGE_V2,
IT_RETURN-MESSAGE_V3,
IT_RETURN-MESSAGE_V4,
IT_RETURN-PARAMETER,
IT_RETURN-ROW,
IT_RETURN-FIELD,
IT_RETURN-SYSTEM.
ENDLOOP.
Reward if useful.
Dara. -
Purchasing a new iMac (27" i5), what is the best method to transfer programs, files, and documents from a 24" iMac?
What is the best method to remove personal data from the 24" iMac?
Use Setup Assistant, which is offered when you set up your new Mac. It will transfer information from a Time Machine backup, a clone, or another Mac.
It's best to do this during setup to avoid issues with duplicate IDs.
Regards -
Net Book Value calculation Issue Manual Legacy Data Transfer
Dear All
I want to upload the balances of old fixed assets into a newly configured system; the assets were managed manually in previous years. I want to post their current written-down value and current accumulated depreciation. I go via IMG, Create Legacy Data Transfer, and enter the takeover values of the assets, for example:
Acquisition value: 100,000
Accumulated ordinary depreciation (depreciation in previous years): Rs. 10,000
Net book value: Rs. 110,000
The system calculates the depreciation on 110,000 at 10% WDV, which is 11,000.
My requirement is that it should calculate it as 100,000 - 10,000.
Depreciation should be calculated on 90,000 at 10% WDV, which is 9,000.
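A quick check of the arithmetic being requested here, as a minimal sketch using the figures from the post and assuming the 10% written-down-value method. Note that the 110,000 the system shows equals 100,000 + 10,000, which is consistent with the accumulated depreciation having been captured with the wrong sign in the takeover values.

```python
# Legacy asset takeover figures from the post.
acquisition = 100_000
accumulated_dep = 10_000

# Expected behaviour: depreciate the net book value (the WDV base).
net_book_value = acquisition - accumulated_dep  # 90,000
annual_dep = net_book_value // 10               # 10% WDV

print(net_book_value, annual_dep)  # → 90000 9000
```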
Please suggest how I can do this, or another path or transaction code where I can.
I am new, so please explain in detail.
Regards
Khalid Mahmood
Dear Atif and AP
The issue is not with the depreciation key; it is working according to my requirement. The issue is that the base value the system uses is wrong.
I want the system to calculate depreciation on the cumulative acquisition value (purchase/capitalized cost price) less accumulated depreciation, which equals the written-down value.
I want the system to use the WDV as the base in each year, after deducting that year's depreciation, i.e. (acquisition cost - accumulated depreciation).
When I post a new asset and assign the depreciation key, it works as required.
But I am posting an old asset, for example Rs. 100,000 acquired in 2006, whose accumulated depreciation is, say, Rs. 30,000 at year-end 2011.
I want to post that asset in 2012 as 100,000 - 30,000, so the WDV is Rs. 70,000, and I want the system to use 70,000 as the base for calculating depreciation in the coming year.
What I get instead is 100,000 + 30,000, with 130,000 used as the base for calculating ordinary depreciation for the next year.
Please read the first thread again and guide me according to my scenario, or tell me how to post the old asset value and its depreciation.
Regards
Khalid Mahmood -
HR Master Data Transfer to MRS
Hi Experts,
we have recently implemented MRS scheduling for PM orders, and we ran three jobs to transfer work center data, capacities, and personnel assignments from ECC6 HR (release 604/0020) into MRSS (release 700/0004). The reports are /MRSS/SGE_PN_MNT, /MRSS/CAG_CG_PLANNING_NODE_MNT and /MRSS/HCM_RPTWFMIF. The data transfer took several days to run manually.
Now our customer wants to run nightly updates to MRS on any work centers/ Personnel records which have changed. Issues are:
• /MRSS/CAG_CG_PLANNING_NODE_MNT short-dumps if you execute it in the background
• if PM users have work orders open when we execute the jobs, the user's transaction can short-dump (this can happen even if they are from a plant which is not related to the data being transferred)
How have other MRS users overcome these issues of regular HR master data updates?
Thanks in advance
Sarah
The PFAL transaction is used when we want to transfer an object without a change pointer. For sending deltas you should be using RBDMIDOC (transaction BD21) and provide the message type; this program will send the delta.
But neither PFAL nor BD21 will help in filtering out future-dated records. That filtering should happen in a user exit for the outbound IDoc. The user exit needs to be coded to exclude a change pointer from processing if the key in the change pointer contains a future-dated record that might otherwise get processed. This can get complex, because you may want to send one part of the changed record and not another part. For example, if you create a new IT0002 record with a future date, you would still want the current record, which was delimited one day prior to the future-dated record.
It would be nice to have the target system handle this instead of handling it within SAP.
Regards
Ravikumar -
Failed to execute custom transfer program
I have already looked at the custom transfer transport thread.
I checked all the parameters, but it is still failing.
If anybody has done this before, please let me know what kind of transfer program it uses and the configuration, especially the custom transfer arguments.
This is the trace file:
880 1996 FIL-052031 3/2/2010 3:33:21 PM |Data flow DF_Info_SAP
880 1996 FIL-052031 3/2/2010 3:33:21 PM Failed to execute the custom transfer program <C:\temp\psftp.exe harmp001 ****** tlsrv003.com
880 1996 FIL-052031 3/2/2010 3:33:21 PM $AW_SAP_WORKING_DIR prm_Info.dat
tls\LANDING\>. Its return code and error message are <rc =
880 1996 FIL-052031 3/2/2010 3:33:21 PM 1. Error message: PuTTY Secure File Transfer (SFTP) client Release 0.60 Usage: psftp [options] [user@]host Options: -V
880 1996 FIL-052031 3/2/2010 3:33:21 PM print version information and exit -pgpfp print PGP key fingerprints and exit -b file use specified batchfile -bc
880 1996 FIL-052031 3/2/2010 3:33:21 PM output batchfile commands -be don't stop batchfile processing if errors -v show verbose messages -load
880 1996 FIL-052031 3/2/2010 3:33:21 PM sessname Load settings from saved session -l user connect with specified username -P port connect to specified port
880 1996 FIL-052031 3/2/2010 3:33:21 PM -pw passw login with specified password -1 -2 force use of particular SSH protocol version -4 -6 force use of
880 1996 FIL-052031 3/2/2010 3:33:21 PM IPv4 or IPv6 -C enable compression -i key private key file for authentication -noagent disable use of
880 1996 FIL-052031 3/2/2010 3:33:21 PM Pageant -agent enable use of Pageant -batch disable all interactive prompts>.
5404 2444 FIL-052031 3/2/2010 3:33:21 PM |Data flow DF_Info_SAP
Did you try using psftp manually from a DOS prompt?
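The usage dump in the trace means psftp rejected its command line: psftp accepts only one positional `[user@]host` argument, so the password cannot be passed as a bare second argument; it has to go through `-pw`, with the transfer commands in a `-b` batch file. A rough sketch of a corrected invocation, using the host, user, and file names from the trace (the exact remote path and batch-file approach are assumptions):

```shell
# Batch file with the transfer commands (psftp -b runs these non-interactively).
cat > /tmp/psftp_batch.txt <<'EOF'
put prm_Info.dat tls/LANDING/prm_Info.dat
quit
EOF

# psftp takes exactly one positional [user@]host argument; the password goes
# in -pw, and -batch disables interactive prompts so a failure returns a
# nonzero rc instead of hanging the scheduled job.
CMD='psftp.exe -batch -b /tmp/psftp_batch.txt -pw SECRET harmp001@tlsrv003.com'
echo "$CMD" > /tmp/psftp_cmd.txt
cat /tmp/psftp_cmd.txt
```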
-
Hi,
I have some confusion.
Can I consider eCATT a standard data transfer tool? What would its greatest advantage be?
Thank you
SAP CATT - Computer Aided Test Tool
Just sharing my experiences with CATT (although I have not used this feature in the last two years or so!).
Simply put, anyone in SAP can use this tool as long as you have access to the transaction code. I used CATT in at least three different SAP modules. Typically, in SAP projects, CATT execution is a favorite task for "technical" associates; when the technical resources are not available immediately, or their plates are full with other important tasks, functional associates can simply jump into this activity, since there is no coding or programming involved. This tool has been a great gift from SAP, since I am a functional person and cannot code! I can remember at least 30 or 40 occasions where I used CATT to get this and that done.
Below find few examples:
1. Created multiple sales orders (in excess of 200) in a matter of minutes for the purpose of end user training
2. Created multiple purchase orders (in excess of 200) in a matter of minutes for the purpose of end user training
3. Created deliveries for the sales orders in a matter of minutes for the purpose of end user training
4. Created config entries when the volume of records was large. I remember once entering 900 records.
5. Extensively used in preparing transactional data for the purpose of archiving. It worked impeccably.
6. Loading of master data (example: material groups)
Note: Upon execution of CATT, it is very possible that some records will fail, and these have to be addressed manually. SAP really needs to further enhance this area of CATT, because there is no easy way of identifying the failed records; it has to be done manually. One workaround is to simply download the result into Excel and, using Excel's sort feature, identify the failed ones and deal with them manually.
With Compliment by: Ranga Rachapudi
CATT stands for Computer Aided Test Tool.
Although CATT is meant to be a testing tool, many SAP users now use it frequently to upload vendor master data and make changes to other master records.
SAP consultants and ABAPers tend to use it for creating test data.
With CATT, you don't have to create any ABAP upload programs, which saves development time. However, you still have to spend time on mapping the data into the spreadsheet format.
The transactions run without user interaction. You can check system messages and test database changes. All tests are logged.
What CATT does is record you performing the actual transaction once.
You then identify the fields that you wish to change in that view.
Then export this data to a spreadsheet to populate with the data required.
This is uploaded and executed saving you keying in the data manually.
To perform CATT, it has to be enabled in your production environment (your systems administrator should be able to do this - SCC4).
You will also need access to your development system to create the CATT script.
User Guide for Data Upload
The use of CATT is for bulk uploading of data. Although CATT is primarily a testing tool, it can be used for the mass upload of data. CATT works like a real user actually typing on the SAP screen: you prepare a set of data to be input into the system and execute what is called a test case, and CATT does the boring task of keying it in for you.
Over-all procedure
The over-all procedure to upload data using CATT is as follows:
· Creation of the CATT test case & recording the sample data input.
· Download of the source file template.
· Modification of the source file.
· Upload of the data from the source file.
Details of each step are provided in the following paragraphs.
Detailed Procedure
Creation of the CATT test case:
Creation of the test case is completed as follows:
· Execute Transaction SCAT
· Name the test case. Test case name must start with Z. It is also good practise to include the transaction code in
the test case name (e.g. Z_IE01_UPLOAD for the upload of equipment)
· Click the Record button.
· Enter the transaction code (e.g. IE01)
· Continue recording the transaction. Ensure data is entered into every field that is to be used during the upload.
· Save the test case.
Download the source file template
Download of source file template is conducted in two stages as follows:
· Creation of import parameters:
· Within transaction SCAT, Double Click on the TCD line in the Maintain Functions screen.
· Click the Field List button (Field list is displayed).
· For every field that you wish to upload data, double click in the Column New field contents (This creates an
import parameter).
· In the Maintain Import Parameter Pop-Up:
· Delete the default value if not required.
· Press Enter
· The New field contents column now contains the character & followed by the field name (e.g. &EQART). This
is the name of the import parameter.
· Repeat this for every field (in every screen) to be uploaded.
· Back out and save the CATT test case
· Download of source file template:
· Use the path GOTO -> Variants -> Export Default
· Select path and file name (e.g. C:\TEMP\Z_IE01_UPLOAD.TXT)
· Click Transfer
Modify the source file
The downloaded source file template is now populated with the data that is to be uploaded. This is completed as follows:
· Using Excel, open the tab-delimited text file.
· Do not change any of the entries that already exist.
1st row contains the field names.
2nd row contains the field descriptions.
3rd row displays the default values which are set in the test case.
4th row contains a warning that changing the default values in the spreadsheet has no effect on the actual default values.
· The data to be uploaded can be entered in the spreadsheet from row 4 onwards (delete the 4th row warning &
replace with data for upload).
· Save the file as a Text file (Tab delimited).
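The source-file layout described above (row 1 field names, row 2 descriptions, row 3 defaults, data from row 4) can also be generated programmatically instead of edited in Excel. A minimal sketch; the field names EQART/EQKTX and the sample values are hypothetical, and the real names come from the template exported via Goto -> Variants -> Export Default:

```python
import csv

# Hypothetical import-parameter names; the real ones come from the
# exported CATT template for your test case.
fields = ["EQART", "EQKTX"]
descriptions = ["Equipment type", "Description"]
defaults = ["M", "Pump"]

rows = [
    fields,        # row 1: field names
    descriptions,  # row 2: field descriptions
    defaults,      # row 3: default values recorded in the test case
    # data starts at row 4 (the warning row is replaced with real data)
    ["M", "Feed pump 01"],
    ["M", "Feed pump 02"],
]

# CATT expects a tab-delimited text file.
with open("catt_upload.txt", "w", newline="") as f:
    csv.writer(f, delimiter="\t").writerows(rows)
```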
Upload data from the source file
Uploading the data is completed as follows:
· Execute the CATT test case
· In the Execute screen:
· Set processing mode to Errors or Background (your choice).
· Set variants to External from file.
· Click the Choose button and select the file to be uploaded.
· If uploading to another client, click the Remote execution button and select the RFC connection to the required client.
· If uploading to the current client, click the execute button.
Also, refer to the links below:
http://www.sap-img.com/sap-catt.htm
http://sap.ittoolbox.com/documents/popular-q-and-a/catt-procedure-1795
http://www.thespot4sap.com/Articles/CATT.asp
http://help.sap.com/printdocu/core/Print46c/en/data/pdf/BCCATTOL/BCCATTOL.pdf
http://help.sap.com/printdocu/core/Print46c/en/data/pdf/BCCATTOL/CACATTOL.pdf
Reward Points if this helps, -
Hi!
I have some questions regarding FTP data transfer and programming message fields in SAP:
1) How can I program the message fields in SAP?
How much effort should I invest in this?
2) We are about to change our FTP transfer in the SAP system from manual to automatic (from location/city A to location/city B).
The current approach is/was to start the batch input manually.
How can we do this? Is some ABAP knowledge necessary, or is this a typical SAP Basis (administrative) task?
Thank you very much
regards
Axel
Hi!
Are you saying that you want to upload text file using ftp, and run batch input automatically from executing ABAP program?
I would like to find an automatic way to do the FTP data transfer from location 1 to location 2. My current solution is based on manual execution of a batch input that transfers/pushes the data from location 1 to location 2.
Thank you for your recommendation/information.
regards -
NI8451 SPI data transfer speed and SCLK setup time adjustment
I'm using the NI USB-8451 SPI bus for communication, but I cannot reach a communication speed of 4 MHz (the NI USB-8451 module advertises speeds up to 12 MHz). Actually, the data transfer speed is much slower than 4/8 Mb/s: 16k x 16-bit words of data take around 800 ms, while they should take about 128 ms if the transfer ran at 4 MHz. In the manual there is an SPI timing clock figure like this:
In the 4 MHz case, t2 should be 0.25 µs. I wonder whether the low data transfer speed is due to t1, t3 and t4, since they occupy too much "dead time". If my guess is right, is there any method to reduce t1, t3 or t4, especially t3? I know that in the advanced API there is a way to add delay, but I did not figure out how to reduce the delay (t3). If my guess is wrong, what is the exact maximum data transfer speed the NI 8451 can support (not the clock rate)?
Thanks for the help.
Hi everyone. I'm using SPI communication with the 8451 and I'm having the same situation. Since the serial flash memory I need to program is quite big, t3 (SCLK setup time) and other "dead time" (which I think is the time when the buffer on the 8451 needs to be refilled) are killing my expectations for the final results. I can't see a way to decrease t3 (~10 µs), and I'm also seeing that after buffering about 100-110 bytes a ~1.5 ms delay appears on the signal waveforms. Has somebody had good results trying to avoid this?
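A rough back-of-envelope check of the numbers in this thread, assuming "16k" means 16384 words of 16 bits at a 4 MHz SCLK: the raw clocking time is only about 65 ms, so nearly all of the observed 800 ms is per-word dead time, which is consistent with a ~10 µs t3 plus buffer-refill gaps.

```python
# Assumptions: "16k" = 16384 words, 16 bits per word, 4 MHz SCLK.
words = 16 * 1024
bits = words * 16                     # 262,144 bits on the wire
pure_clock_ms = bits / 4e6 * 1000     # time if SCLK ran back-to-back

observed_ms = 800                     # figure reported in the post
dead_us_per_word = (observed_ms - pure_clock_ms) * 1000 / words

print(round(pure_clock_ms, 1), round(dead_us_per_word, 1))  # ~65.5 ms, ~45 µs/word
```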
Thank you.
Javier
Attachments:
t3 - SCLK setup time.jpg 156 KB
Data byte Transfer_109.jpg 137 KB -
Data Transfer using DataSink from DataSource to MediaLocator not working
I wrote this pretty straightforward program to transfer from a DataSource to a MediaLocator, and when I run it nothing happens. I waited for around 10 minutes and still nothing happened. I manually closed the program to see whether any data had been transferred to the destination file, but the destination file timpu.mp3 is empty. Can someone tell me why it isn't working?
import javax.swing.*;
import javax.media.*;
import java.net.*;
import java.io.*;
import javax.media.datasink.*;
import javax.media.protocol.*;
class Abc implements DataSinkListener {
    DataSource ds;
    DataSink dsk;
    public void transfer() throws Exception {
        ds = Manager.createDataSource(new MediaLocator(new File("G:/java files1/jmf/aa.mp3").toURL()));
        MediaLocator mc = new MediaLocator(new File("G:/java files1/jmf/timpu.mp3").toURL());
        dsk = Manager.createDataSink(ds, mc);
        System.out.println(ds.getContentType() + "\n" + dsk.getOutputLocator().toString());
        dsk.open();
        dsk.start();
        dsk.addDataSinkListener(this);
    }
    public void dataSinkUpdate(DataSinkEvent event) {
        if (event instanceof EndOfStreamEvent) {
            try {
                System.out.println("EndOfStreamEvent");
                dsk.stop();
                dsk.close();
                System.exit(1);
            } catch (Exception e) {
            }
        }
    }
}
public class JMFCapture5 {
    public static void main(String args[]) throws Exception {
        Abc a = new Abc();
        a.transfer();
    }
}
Message was edited by:
qUesT_foR_knOwLeDge
Have thrown this together - so it's not pretty, but it should give you an idea:
import java.awt.*;
import java.awt.event.*;
import java.io.*;
import java.util.*;
import javax.swing.*;
import javax.media.*;
import javax.media.control.*;
import javax.media.datasink.*;
import javax.media.format.*;
import javax.media.protocol.*;

public class ABC extends JFrame implements DataSinkListener, ControllerListener, ActionListener {
    private Container cont;
    private JButton jBRecord;
    private boolean bRecording = false;
    private Processor recordingProcessor;
    private DataSource recordingDataSource;
    private DataSink recordingDataSink;
    private Processor mainProcessor;
    private DataSource mainDataSource;

    public ABC() {
        super("Basic WebCam Handler");
        cont = getContentPane();
        cont.setLayout(new BorderLayout());
        // control panel buttons
        jBRecord = new JButton("R+");
        jBRecord.setToolTipText("Record On / Off");
        cont.add(jBRecord, BorderLayout.NORTH);
        // & Listeners
        jBRecord.addActionListener(this);
        addMyCamera();
        pack();
        setVisible(true);
    }

    private void addMyCamera() {
        Vector vCDs = CaptureDeviceManager.getDeviceList(null); // get Devices supported
        Iterator iTCams = vCDs.iterator();
        while (iTCams.hasNext()) {
            CaptureDeviceInfo cDI = (CaptureDeviceInfo) iTCams.next();
            if (cDI.getName().startsWith("vfw:")) {
                try {
                    MediaLocator mL = cDI.getLocator();
                    mainDataSource = Manager.createCloneableDataSource(Manager.createDataSource(mL));
                    mainProcessor = Manager.createProcessor(mainDataSource);
                    mainProcessor.addControllerListener(this);
                    mainProcessor.configure();
                    break;
                } catch (Exception eX) {
                    eX.printStackTrace();
                }
            }
        }
    }

    private void startRecording() {
        if (!bRecording) {
            try {
                System.out.println("startRecording");
                recordingDataSource = ((SourceCloneable) mainDataSource).createClone();
                recordingProcessor = Manager.createProcessor(recordingDataSource);
                recordingProcessor.addControllerListener(this);
                recordingProcessor.configure();
                bRecording = true;
            } catch (Exception eX) {
                eX.printStackTrace();
            }
        }
    }

    private void stopRecording() {
        if (bRecording) {
            System.out.println("stopRecording");
            bRecording = false;
            try {
                recordingProcessor.close();
                recordingDataSink.stop();
                recordingDataSink.close();
            } catch (Exception eX) {
                eX.printStackTrace();
            }
        }
    }

    public void actionPerformed(ActionEvent e) {
        Object obj = e.getSource();
        if (obj == jBRecord) {
            if (jBRecord.getText().equals("R+")) {
                jBRecord.setText("R-");
                startRecording();
            } else {
                jBRecord.setText("R+");
                stopRecording();
            }
        }
    }

    // ControllerListener
    public void controllerUpdate(ControllerEvent e) {
        Processor p = (Processor) e.getSourceController();
        if (e instanceof ConfigureCompleteEvent) {
            System.out.println("ConfigureCompleteEvent-" + System.currentTimeMillis());
            if (p == recordingProcessor) {
                try {
                    VideoFormat vfmt = new VideoFormat(VideoFormat.CINEPAK);
                    TrackControl[] tC = p.getTrackControls();
                    tC[0].setFormat(vfmt);
                    tC[0].setEnabled(true);
                    p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.QUICKTIME));
                    Control control = p.getControl("javax.media.control.FrameRateControl");
                    if (control != null && control instanceof FrameRateControl) {
                        FrameRateControl fRC = (FrameRateControl) control;
                        fRC.setFrameRate(30.0f);
                    }
                } catch (Exception eX) {
                    eX.printStackTrace();
                }
            } else {
                p.setContentDescriptor(null);
            }
            p.realize();
        } else if (e instanceof RealizeCompleteEvent) {
            System.out.println("RealizeCompleteEvent-" + System.currentTimeMillis());
            try {
                if (p == mainProcessor) {
                    Component c = p.getVisualComponent();
                    if (c != null) {
                        cont.add(c);
                        p.start();
                        validate();
                    }
                } else if (p == recordingProcessor) {
                    GregorianCalendar gC = new GregorianCalendar();
                    File f = new File("C:/Workspace/" + gC.get(Calendar.YEAR) + gC.get(Calendar.MONTH) +
                            gC.get(Calendar.DAY_OF_MONTH) + gC.get(Calendar.HOUR_OF_DAY) +
                            gC.get(Calendar.MINUTE) + gC.get(Calendar.SECOND) + ".mov");
                    MediaLocator mL = new MediaLocator(f.toURL());
                    recordingDataSink = Manager.createDataSink(p.getDataOutput(), mL);
                    p.start();
                    recordingDataSink.open();
                    recordingDataSink.start();
                }
            } catch (Exception eX) {
                eX.printStackTrace();
            }
        } else if (e instanceof EndOfMediaEvent) {
            System.out.println("EndOfMediaEvent-" + System.currentTimeMillis());
            p.stop();
        } else if (e instanceof StopEvent) {
            System.out.println("StopEvent-" + System.currentTimeMillis());
            p.close();
        }
    }

    public void dataSinkUpdate(DataSinkEvent event) {
        if (event instanceof EndOfStreamEvent) {
            try {
                System.out.println("EndOfStreamEvent-" + System.currentTimeMillis());
                recordingDataSink.stop();
                recordingDataSink.close();
            } catch (Exception e) {
            }
        }
    }

    public static void main(String args[]) throws Exception {
        ABC a = new ABC();
        // a.transfer();
    }
}
We run a small network in our office consisting mainly of Intel iMacs we purchased several months ago. The server computer is a brand new Intel Mac Pro server with four 500 GB drives in a RAID, two 2.66 GHz dual-core Intel processors, and all the other bells and whistles we could think of. We added in two fiber-optic switches: one in the server room, one in the office. The switches connect to each other and the server with fiber-optic cables, but the iMacs connect with Ethernet cables.
We had the iMacs for a while, but we just recently got the new server and upgraded our old 100-base switches. Afterwards we wanted to test out the data transfer speeds, as we plan to back up to the server frequently. We were dismayed to find that transfer rates capped out at 60 megabytes/sec according to the Activity Monitor's Network Activity tab. In fact, it would range mostly from 40-45 MB/sec. None of us here have much experience with networking, but that seemed a tad too slow. My basic math tells me that a byte is 8 bits, and from that a Gigabit network should transfer data at 125 megabytes/sec, which is nearly three times the speed we were actually seeing.
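The back-of-the-envelope math above can be written out as a tiny Java sketch (the 45 MB/s figure is the observed rate from the post; 125 MB/s is the raw 1000 Mbit/s ceiling before any protocol overhead):

```java
public class ThroughputMath {
    public static void main(String[] args) {
        // Gigabit Ethernet signals at 1000 megabits per second.
        double linkMbps = 1000.0;
        // 8 bits per byte, so the raw ceiling in megabytes per second:
        double ceilingMBps = linkMbps / 8.0;   // 125 MB/s
        // Observed rate from Activity Monitor:
        double observedMBps = 45.0;
        System.out.printf("Theoretical ceiling: %.0f MB/s%n", ceilingMBps);
        System.out.printf("Observed: %.0f MB/s (%.0f%% of ceiling)%n",
                observedMBps, 100.0 * observedMBps / ceilingMBps);
        // prints: Theoretical ceiling: 125 MB/s
        //         Observed: 45 MB/s (36% of ceiling)
    }
}
```

TCP/IP and Ethernet framing overhead shave perhaps 5-10% off the ceiling, so something other than the wire protocol is eating the remaining gap.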
We sent data both to and from the server in order to test this. Thinking it was perhaps a problem with the ethernet itself, we grabbed an external FireWire hard drive and transferred data from one of the iMacs directly to it and noticed exactly the same transfer rates. We plugged two iMacs directly into each other and transferred at the same rate, ~45MB/sec.
Well this was highly frustrating. All Macs supposedly ship with Gigabit ethernet since, what, 2002? Earlier? Why are our speeds so slow? We thought the hard drives might be slow, so we got info on the drives and googled them for their tech specs. The iMacs' Western Digital drives are capable of much much faster speeds according to everything we've read.
We started reading anything we could online that addressed this issue. Some posts suggested the problem might be that the optical DVD-R/CD-R drive is only capable of slower transfer speeds. Since the optical drive and the hard drive are on the same bus, it would drag down the hard drive's maximum transfer rate, much like having a Gigabit hub with a 10-base computer plugged into it would slow the entire network down (which is of course why we use switches instead of hubs). Is there any truth to this? If this is, in fact, the case, can we bypass this bottleneck somehow? I'm not talking about opening the computer and manually disabling the optical drive, as that's a waste of a perfectly good DVD-R/CD-R drive.
Also, if this is true, why in blue blazes is Apple flaunting Gigabit ethernet if the computer can only take advantage of 1/3 the speeds Gigabit ethernet has to offer?! I'll happily provide any more information that's relevant to the problem at hand.
Thank you
Mac OS X (10.4.8)

Here are the specs for the default hard drive that Apple put in here:
http://www.westerndigital.com/en/products/Products.asp?DriveID=137
From that page:
Buffer to host (Serial ATA): 300 MB/s (max)
Buffer to disk: 748 Mbit/s (max)
I have no idea what signaling overhead for my data transfer protocol means.
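One way to read those two numbers: the 300 MB/s figure is the SATA interface (buffer-to-host) ceiling, while the 748 Mbit/s buffer-to-disk figure is roughly what the platters themselves can move. Converting the latter to bytes (a rough sketch; real sustained rates are lower still):

```java
public class DiskCeiling {
    public static void main(String[] args) {
        double bufferToDiskMbit = 748.0;           // from the WD spec page
        double mediaCeilingMBps = bufferToDiskMbit / 8.0;
        // ~93.5 MB/s: the disk itself, not the network, may already be
        // within a factor of two of the observed 45-60 MB/s transfers.
        System.out.printf("Media ceiling: %.1f MB/s%n", mediaCeilingMBps);
        // prints: Media ceiling: 93.5 MB/s
    }
}
```

So even before Ethernet enters the picture, a single drive of that era sustains well under the 125 MB/s Gigabit ceiling.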
The Ethernet cables are no longer than 20 feet, are all Cat5e, and are well shielded. There is little to no electromagnetic interference anywhere they run. The longest cable is the fiber-optic one, which runs about... oh... 40 yards? These were all installed by professionals.
I feel it's important to stress the fact that we tested an isolated direct computer to computer transfer with a single 6 foot Cat5E Ethernet cable and still experienced the same speeds that we experienced over the network. We tested multiple computers, multiple cables, multiple files.
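A direct machine-to-machine test like the one described can also be scripted. Here is a minimal, hypothetical Java throughput probe (the class name and buffer sizes are my own, not from the thread): it pushes bytes through a TCP socket and reports MB/s. As written it uses loopback on one machine; to exercise the actual Gigabit link, run the receiving half on the far host and point the `Socket` at it instead of `localhost`.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class ThroughputProbe {

    // Streams totalMB megabytes through a loopback TCP connection
    // and returns the measured rate in MB/s.
    static double measureLoopbackMBps(int totalMB) throws Exception {
        final int CHUNK = 64 * 1024;
        ServerSocket server = new ServerSocket(0);   // any free port
        Thread sink = new Thread(() -> {
            try (Socket s = server.accept();
                 InputStream in = s.getInputStream()) {
                byte[] buf = new byte[CHUNK];
                while (in.read(buf) != -1) { /* discard */ }
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        sink.start();

        long start = System.nanoTime();
        try (Socket s = new Socket("localhost", server.getLocalPort());
             OutputStream out = s.getOutputStream()) {
            byte[] buf = new byte[CHUNK];
            long total = (long) totalMB * 1024 * 1024;
            for (long sent = 0; sent < total; sent += CHUNK) {
                out.write(buf);
            }
        }
        sink.join();
        server.close();
        double secs = (System.nanoTime() - start) / 1e9;
        return totalMB / secs;
    }

    public static void main(String[] args) throws Exception {
        System.out.printf("%.1f MB/s%n", measureLoopbackMBps(64));
    }
}
```

Note that loopback numbers measure the TCP stack and memory bandwidth, not the NIC; only a run between two hosts tells you what the Gigabit link and the drives together can actually sustain.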