Measure Formulae for Uploadable and Data collection report
Hi,
I have a query related to the application of measure formulas for Uploadable and Data Collection reports.
Consider a scenario where I use an MDX query to create a data collection report and map the columns from these reports to a rowsource using a loader file. Can I use a measure formula feature like onchange with Uploadable and Data Collection reports, so that changes to any one column take effect in another column?
Regards,
Wesley
Wesley,
IOP uploadable reports are used for sending data to the server in batch. They are coupled to the IOP model via a rowsource. You have a lot of flexibility with the spreadsheet you build for staging the upload; it's really just a function of how crafty you are with VB. Cascading changes from one column to another can easily be accomplished this way.
Onchange formulas are for something else. They are part of the model definition and are used for describing how to cascade changes between IOP data blocks.
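For illustration only (a Python sketch, not IOP or VB code): the cascading idea described above, where an edit to one staged column triggers recomputation of a dependent column, could look roughly like this. The column names and the price factor are purely hypothetical.

```python
# Sketch: cascading a change from one staged column to another.
# "qty" drives "total"; re-running cascade() after an edit refreshes the
# dependent column, mirroring what a change-event handler in the staging
# spreadsheet would do.

def cascade(rows, unit_price=2.0):
    """Recompute the dependent 'total' column from the driving 'qty' column."""
    for row in rows:
        row["total"] = row["qty"] * unit_price
    return rows

staged = [{"qty": 3, "total": 0.0}, {"qty": 5, "total": 0.0}]
staged[0]["qty"] = 10   # user edits the driving column...
cascade(staged)         # ...and the dependent column is refreshed
```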
Similar Messages
-
Hello. I've been subscribed to iCloud for $20 per year and I found it useless for many reasons: I cannot disconnect my mobile during the uploading process, and it takes a long time to upload my data. It's not a reliable system; that's why I need to deactivate the storage service and get my money back. Thanks
The "issues" you've raised have nothing to do with the iCloud service.
No service that uploads data allows you to disconnect the device you are uploading from while uploading data. Doing so would prevent the upload from completing. It is a basic requirement for any uploading service that you remain connected to it for uploading to be possible.
The time it takes to upload data to iCloud is entirely dependent on how fast your Internet connection is, and how much data you are uploading. Both of these things are completely out of Apple's control. Whichever upload service you use will be affected by the speed of your Internet connection. -
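As a back-of-the-envelope illustration of that last point (illustrative numbers only, nothing Apple-specific): upload time is simply the amount of data divided by the connection's upstream rate.

```python
# Rough estimate: time to upload = data size / upstream rate.

def upload_hours(size_gb: float, upstream_mbps: float) -> float:
    bits = size_gb * 8 * 1000**3            # GB -> bits (decimal units)
    seconds = bits / (upstream_mbps * 1000**2)
    return seconds / 3600

# e.g. 50 GB over a 2 Mbit/s upstream link takes more than two days:
hours = upload_hours(50, 2)
```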
What is the method used for upload of data
Hi All,
What is the method used for uploading data from Excel spreadsheets to SAP HR?
It is bulk data from many countries. LSMW or the BDC session method?
What are the country-specific infotypes used for PA and OM?
Can you please give the list of country-specific infotypes used for the PA and OM modules?
Thanks
Archana
Hi Archana,
To upload bulk data, I think BDC is the best and most efficient way.
Regarding infotypes, we don't have any country-specific infotypes in OM & PA. In Payroll we do have them, country-wise.
I hope that clears up the point.
Regards
Pavani
Reminder: points to be given on answers -
Please send detail steps for uploading legacy data
Hi friends,
please send detail steps for uploading legacy data
Thanking u in advance,
Diwa.
Hi, you can use LSMW to upload legacy data.
LSMW is used for migrating data from a legacy system to SAP system, or from one SAP system to another.
Apart from standard batch/direct input and recordings, BAPI and IDocs are available as additional import methods for processing the legacy data.
The LSMW comprises the following main steps:
Read data (legacy data in spreadsheet tables and/or sequential files).
Convert data (from the source into the target format).
Import data (to the database used by the R/3 application).
But before these steps, you need to perform the following steps:
Define source structure : structure of data in the source file.
Define target structure : structure of SAP that receives data.
Field mapping: Mapping between the source and target structure with conversions, if any.
Specify file: location of the source file
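The read/convert/import flow and the field mapping described above can be sketched outside LSMW roughly as follows (a Python illustration; the two-field customer mapping is purely hypothetical, not an actual LSMW structure):

```python
# Sketch of the LSMW-style pipeline: read legacy data, convert it via a
# source-to-target field mapping, then import it into the target store.

FIELD_MAP = {"cust_no": "KUNNR", "cust_name": "NAME1"}   # hypothetical mapping

def read_data(lines):
    """Read step: parse legacy records from a comma-separated flat file."""
    return [dict(zip(["cust_no", "cust_name"], ln.split(","))) for ln in lines]

def convert(records):
    """Convert step: rename fields per the mapping, apply simple conversions."""
    return [{FIELD_MAP[k]: v.strip().upper() for k, v in r.items()} for r in records]

def import_data(records, db):
    """Import step: write converted records to the target store."""
    db.extend(records)

db = []
import_data(convert(read_data(["1001,acme ltd"])), db)
```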
Of all the methods used for data migration, like BDC, LSMW, and call transaction, which one is used most of the time?
How is the decision made as to which method should be followed? What is the procedure for this analysis?
All 3 methods are used to migrate data. The choice between them depends on the scenario and the amount of data to be transferred. LSMW is a ready-made tool provided by SAP, and you follow some 17 steps to migrate master data. Within BDC, the session method is the better choice because of some advantages over call transaction, but call transaction is also very useful for immediate updating of small amounts of data (with call transaction, the developer has to handle errors).
So the bottom line is: choose between these methods based on the real-time requirements.
These methods are chosen entirely based on the situation you are in. The direct input method is not available for every scenario; where it is, it is the simplest one. With the batch input method, you need to do a recording for the transaction concerned. Similarly, IDoc and BAPI are available, and their use needs to be decided based on the requirement.
Try to go through some material on these four methods and implement them. You will then have a fair idea of when to use which.
LSMW Steps For Data Migration
How do you develop an LSMW for data migration for the VA01 or XK01 transaction?
You can create an LSMW for data migration as follows (using the session method):
Example for XK01 (create vendor).
Initially there will be 20 steps, but after processing step 1 it is reduced to 14 for the session method.
1. TCode : LSMW.
2. Enter Project name, sub project name and object name.
Execute.
3. Maintain object attributes.
Execute
select Batch Input recording
goto->Recording overview
create
recording name.
enter transaction code.
start recording
do the recording as per your choice.
save + back.
enter recording name in lsmw screen.
save + back
Now there will be 14 steps.
2. MAINTAIN SOURCE STRUCTURES.
Here you have to enter the name of the internal table.
display change
create
save + back
3. MAINTAIN SOURCE FIELDS.
display change
select structure
source_fields->copy fields.
a dialog window will appear.
select -> from data file
apply source fields
enter No. of fields
length of fields
attach file
save + back
4. MAINTAIN STRUCTURE RELATIONS
display change
save + back
5. MAINTAIN FIELD MAPPING & CONVERSION RULE
display change
click on the source field, select the exact field from the structure and enter
repeat these steps for all fields.
save+back
6. MAINTAIN FIXED VALUES, TRANSACTION, USER DEFINED
execute
save + back
7. SPECIFY FILES.
display change
click on legacy data
attach flat file
give description
select tabulator
enter
save + back
8. ASSIGN FILE
execute
display change
save + back
9. IMPORT DATA.
execute
display change
save + back
10. DISPLAY IMPORTED DATA
enter ok; it will show records only.
back
11. CONVERT DATA
execute
display change
save + back
12. DISPLAY CONVERTED DATA
execute
display change
save + back
13. CREATE BATCH INPUT SESSION
tick keep batch input folder
F8
back
14. RUN BATCH INPUT SESSION.
SM35 will open.
The object name will be shown here.
Select the object & process. -
For uploading master data (e.g. customer data) into SAP
hi
For uploading master data (e.g. customer data) into SAP, which method do you prefer: call transaction, session, LSMW, or BAPI? Why?
Thanks
Rama
Hello,
Check this:
Code:
REPORT zprataptable2
NO STANDARD PAGE HEADING LINE-SIZE 255.
DATA : BEGIN OF itab OCCURS 0,
i1 TYPE i,
lifnr LIKE rf02k-lifnr,
bukrs LIKE rf02k-bukrs,
ekorg LIKE rf02k-ekorg,
ktokk LIKE rf02k-ktokk,
anred LIKE lfa1-anred,
name1 LIKE lfa1-name1,
sortl LIKE lfa1-sortl,
land1 LIKE lfa1-land1,
akont LIKE lfb1-akont,
fdgrv LIKE lfb1-fdgrv,
waers LIKE lfm1-waers,
END OF itab.
DATA : BEGIN OF jtab OCCURS 0,
j1 TYPE i,
banks LIKE lfbk-banks,
bankl LIKE lfbk-bankl,
bankn LIKE lfbk-bankn,
END OF jtab.
DATA : cnt(2) TYPE n. "two digits, to match screen field indices like (01)
DATA : fdt(20) TYPE c.
DATA : c TYPE i.
INCLUDE bdcrecx1.
START-OF-SELECTION.
CALL FUNCTION 'WS_UPLOAD'
EXPORTING
filename = 'C:\first1.txt'
filetype = 'DAT'
TABLES
data_tab = itab.
CALL FUNCTION 'WS_UPLOAD'
EXPORTING
filename = 'C:\second.txt'
filetype = 'DAT'
TABLES
data_tab = jtab.
LOOP AT itab.
PERFORM bdc_dynpro USING 'SAPMF02K' '0100'.
PERFORM bdc_field USING 'BDC_CURSOR'
'RF02K-KTOKK'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'RF02K-LIFNR'
itab-lifnr.
PERFORM bdc_field USING 'RF02K-BUKRS'
itab-bukrs.
PERFORM bdc_field USING 'RF02K-EKORG'
itab-ekorg.
PERFORM bdc_field USING 'RF02K-KTOKK'
itab-ktokk.
PERFORM bdc_dynpro USING 'SAPMF02K' '0110'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFA1-LAND1'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'LFA1-ANRED'
itab-anred.
PERFORM bdc_field USING 'LFA1-NAME1'
itab-name1.
PERFORM bdc_field USING 'LFA1-SORTL'
itab-sortl.
PERFORM bdc_field USING 'LFA1-LAND1'
itab-land1.
PERFORM bdc_dynpro USING 'SAPMF02K' '0120'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFA1-KUNNR'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFBK-BANKN(01)'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTR'.
cnt = 0.
LOOP AT jtab WHERE j1 = itab-i1.
cnt = cnt + 1.
CONCATENATE 'LFBK-BANKS(' cnt ')' INTO fdt.
PERFORM bdc_field USING fdt jtab-banks.
CONCATENATE 'LFBK-BANKL(' cnt ')' INTO fdt.
PERFORM bdc_field USING fdt jtab-bankl.
CONCATENATE 'LFBK-BANKN(' cnt ')' INTO fdt.
PERFORM bdc_field USING fdt jtab-bankn.
IF cnt = 5.
cnt = 0.
PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFBK-BANKS(01)'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=P+'.
PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFBK-BANKN(02)'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTR'.
ENDIF.
ENDLOOP.
PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFBK-BANKS(01)'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTR'.
PERFORM bdc_dynpro USING 'SAPMF02K' '0210'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFB1-FDGRV'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'LFB1-AKONT'
itab-akont.
PERFORM bdc_field USING 'LFB1-FDGRV'
itab-fdgrv.
PERFORM bdc_dynpro USING 'SAPMF02K' '0215'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFB1-ZTERM'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_dynpro USING 'SAPMF02K' '0220'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFB5-MAHNA'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_dynpro USING 'SAPMF02K' '0310'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFM1-WAERS'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'LFM1-WAERS'
itab-waers.
PERFORM bdc_dynpro USING 'SAPMF02K' '0320'.
PERFORM bdc_field USING 'BDC_CURSOR'
'RF02K-LIFNR'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTR'.
PERFORM bdc_dynpro USING 'SAPLSPO1' '0300'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=YES'.
PERFORM bdc_transaction USING 'XK01'.
ENDLOOP.
PERFORM close_group.
Header file:
1 63190 0001 0001 0001 mr bal188 b in 31000 a1 inr
2 63191 0001 0001 0001 mr bal189 b in 31000 a1 inr
TC file:
1 in sb 11000
1 in sb 12000
1 in sb 13000
1 in sb 14000
1 in sb 15000
1 in sb 16000
1 in sb 17000
1 in sb 18000
1 in sb 19000
1 in sb 20000
1 in sb 21000
1 in sb 22000
2 in sb 21000
2 in sb 22000
Regards,
Vasanth -
FM for uploading the data in infotype
Dear Friends,
I need an example of how to use the FMs (HR_MAINTAIN_MASTERDATA, HR_INFOTYPE_OPERATION) for uploading data into multiple infotypes at a time.
I have the data in an internal table; now, how should I pass the infotype number to the FM? I am not aware of that, so kindly help me.
Also, my personnel number is internal; how do I pass it to the FM?
How do I pass the remaining values?
my code ......
Loop at it_employee into wa_employee.
CALL FUNCTION 'HR_MAINTAIN_MASTERDATA'
EXPORTING
PERNR = '00000000'
MASSN = wa_employee-MASSN
ACTIO = 'INS'
TCLAS = 'A'
BEGDA = SY-DATUM
ENDDA = '99991231'
*   OBJPS =
*   SEQNR =
*   SPRPS =
*   SUBTY =
WERKS = wa_employee-WERKS
PERSG = wa_employee-PERSG
PERSK = wa_employee-PERSK
PLANS = wa_employee-PLANS
DIALOG_MODE = '1'
LUW_MODE = '1'
NO_EXISTENCE_CHECK = 'X'
NO_ENQUEUE = 'X'
IMPORTING
RETURN = t_return
*   RETURN1 =
*   HR_RETURN =
TABLES
proposed_values = it_employee.
*   MODIFIED_KEYS =
write : / t_return.
thanks
Sandeep
Hi
You have to open one more GL account for the initial upload.
E.g. if you have a debit balance of GL a/c 50000 in the legacy system, then the entry should be:
Dr 50000 a/c $100
Cr Initial Upload a/c $100.
If you want further explanation, let me know; I will explain in detail.
Thanks & Regards,
Reva naik. -
Interface using BAPI for uploading shipment data
Can anyone send me example code for an inbound interface using BAPI for uploading shipment data? Please kindly send me the programs which you use with BAPI.
Hi
Except for hiring (or a new joinee), for all other actions you can use the function module below:
HR_INFOTYPE_OPERATION
~~~Ganesh Kumar K. -
hi
For uploading master data (e.g. customer data) into SAP, which method do we prefer (call transaction / session / LSMW)? Why?
It depends on how much data you want to upload.
LSMW itself will give you multiple options to load data.
If there is a huge number of records that you want to upload, do not go for the call transaction method; rather, use the session method if you want to use BDCs. Explore which options are available in LSMW for uploading customer data. -
How to set a report culture for number and date
Hi,
Is there a way to change the report locale for numbers and dates?
I've tried using both CrystalDecisions.Shared.SharedUtils.RequestLCID and CrystalReportViewer.SetProductLocale, and neither works.
The only one that works is using Thread.Culture, but that also changes the application culture, not just the report's.
Anyone have a solution?
Thanks
Hi Michel,
I don't believe you can do this within just the report itself, but I haven't played with this very much. Because the app hosts the viewer, the report is based on the app's culture.
You should be able to mix locales if you create a separate thread for each report job; that way they are each in their own space.
Search the Object Browser for CeLocale and you'll find more info. The SDK help file should also have examples.
No API's to use the Engine, had to use RAS.
Need to know what version you are using also and WEB or Windows app?
Here are some samples using RAS:
rpt1.ReportClientDocument.LocaleID = CrystalDecisions.ReportAppServer.DataDefModel.CeLocale.ceLocaleGerman;
CrystalDecisions.ReportAppServer.DataDefModel.CrFieldDisplayNameTypeEnum.crFieldDisplayNameFormula, CrystalDecisions.ReportAppServer.DataDefModel.CeLocale.ceLocaleUserDefault);
Used this way, you may have to alter every object in the report.
Thanks
Don -
What are the Tcodes for uploading data using BDC & CATT
PP members:
I was going through the <b>cutover activities</b> , and what I understood is we transfer all the legacy system data into SAP before going live
The data upload follows certain steps (depends on the organizational design load strategies)
First we upload all the master data ( material master, BOM, W/C's & Routings)
Then the transaction data ( Ideally speaking, there should no open orders i.e. WIP as on the day of cutoff )
If the WIP (Work in Process) is unavoidable then the materials consumed shall be treated as <b>materials of the previous stage</b> and necessary adjustments shall be made after cutover day
At this point, I am not able to understand what the author means by <b>materials of the previous stage</b>
Now comming to the uploading of data into SAP from legacy system, we use tools like LSMW, CATT & BDC
Is it a must to use <b>only the LSMW tool</b> to upload master data, or are other upload tools fine?
Lastly, I am not sure about the Tcodes for CATT & BDC.
Summary of the questions:
1. What does the author mean by <b>material of previous stage</b> for WIP materials during cutover activities?
2. Is it mandatory to use only LSMW tool for uploading for master data
3. What are the Tcodes for upload tools CATT & BDC ?
Thanks for your time
Suren R
Dear,
1. What does the author mean by material of previous stage, for WIP materials during cutover activities? As I understood it, it refers to the stage the material is at: e.g. it has gone through 2 work centers and the other 2 are left, i.e. you need to create a production order with only 2 operations, as the other 2 are already over. Usually it is done in such a way that we create a production order and confirm up to 2 operations, and WIP is calculated so that FI can tally the books in SAP and legacy.
2. Is it mandatory to use only the LSMW tool for uploading master data? No, you can use any tool that suits your requirement.
3. What are the Tcodes for the upload tools CATT & BDC? BDC runs through a program in SE38; CATT through SCAT. -
Use of FOR Cursor and BULK COLLECT INTO
Dear all,
In which cases do we prefer a FOR cursor, and in which a cursor with BULK COLLECT INTO? The following contains two blocks that run identical queries, where one uses a FOR cursor and the other uses BULK COLLECT INTO. Which one performs better for the given task? How do we measure the performance difference between the two?
I'm using sample HR schema:
declare
l_start number;
BEGIN
l_start:= DBMS_UTILITY.get_time;
dbms_lock.sleep(1);
FOR employee IN (SELECT e.last_name, j.job_title FROM employees e,jobs j
where e.job_id=j.job_id and e.job_id LIKE '%CLERK%' AND e.manager_id > 120 ORDER BY e.last_name)
LOOP
DBMS_OUTPUT.PUT_LINE ('Name = ' || employee.last_name || ', Job = ' || employee.job_title);
END LOOP;
DBMS_OUTPUT.put_line('total time: ' || to_char(DBMS_UTILITY.get_time - l_start) || ' hsecs');
END;
declare
l_start number;
type rec_type is table of varchar2(20);
name_rec rec_type;
job_rec rec_type;
begin
l_start:= DBMS_UTILITY.get_time;
dbms_lock.sleep(1);
SELECT e.last_name, j.job_title bulk collect into name_rec,job_rec FROM employees e,jobs j
where e.job_id=j.job_id and e.job_id LIKE '%CLERK%' AND e.manager_id > 120 ORDER BY e.last_name;
for j in name_rec.first..name_rec.last loop
DBMS_OUTPUT.PUT_LINE ('Name = ' || name_rec(j) || ', Job = ' || job_rec(j));
END LOOP;
DBMS_OUTPUT.put_line('total time: ' || to_char(DBMS_UTILITY.get_time - l_start) || ' hsecs');
end;
/
In this code, I put a timestamp in each block, but they are useless since both blocks run virtually instantaneously...
Best regards,
Val
If you want to get the full benefit of bulk collect, then it must be implemented as below:
declare
Cursor cur_emp
is
SELECT e.last_name, j.job_title
FROM employees e,jobs j
where e.job_id=j.job_id
and e.job_id LIKE '%CLERK%'
AND e.manager_id > 120
ORDER BY e.last_name;
l_start number;
type rec_type is table of varchar2(20);
name_rec rec_type;
job_rec rec_type;
begin
l_start:= DBMS_UTILITY.get_time;
dbms_lock.sleep(1);
/* SELECT e.last_name, j.job_title bulk collect into name_rec, job_rec FROM employees e, jobs j
where e.job_id = j.job_id and e.job_id LIKE '%CLERK%' AND e.manager_id > 120 ORDER BY e.last_name; */
OPEN cur_emp;
LOOP
FETCH cur_emp BULK COLLECT INTO name_rec, job_rec LIMIT 100;
EXIT WHEN name_rec.COUNT=0;
FOR j in 1..name_rec.COUNT
LOOP
DBMS_OUTPUT.PUT_LINE ('Name = ' || name_rec(j) || ', Job = ' || job_rec(j));
END LOOP;
EXIT WHEN cur_emp%NOTFOUND;
END LOOP;
CLOSE cur_emp;
DBMS_OUTPUT.put_line('total time: ' || to_char(DBMS_UTILITY.get_time - l_start) || ' hsecs');
end;
/ -
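The FETCH ... BULK COLLECT ... LIMIT pattern in the answer above generalizes beyond PL/SQL: process rows in fixed-size chunks rather than all at once, which bounds memory use. A rough Python sketch of the same batching idea (the helper name and batch source are made up for illustration):

```python
# Sketch of LIMIT-style batching: pull rows from a source in fixed-size
# chunks instead of one giant fetch, bounding the working set.

def fetch_in_batches(rows, limit=100):
    """Yield successive chunks of at most `limit` rows (like FETCH ... LIMIT)."""
    for start in range(0, len(rows), limit):
        yield rows[start:start + limit]

processed = []
for batch in fetch_in_batches(list(range(250)), limit=100):
    processed.extend(batch)        # per-batch work goes here
```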
Measuring Point for Tools and Die Maintenance
Dear All,
I am trying to map the following process:-
My client has 10 press machines and around 250 Dies for punching. Dies are assembled to press machines and they produce jobs by punching.
The client has a preventive maintenance plan for each die in terms of the number of strokes punched by the die, e.g. PM is done after every 300000, 600000, 900000, 1200000 strokes. The estimated life is 1200000.
What they want is: when the production person confirms the production order, say for 10000 jobs, the counter of the die should increase by 10000, and so on. When the counter reaches 300000, a preventive maintenance call should be scheduled.
What I have done is create the die as an equipment with PRT as the equipment category, created a measuring point for that die, assigned a usage formula, etc.
I have created a maintenance plan for that die in which I have assigned the strategy (IP11), which contains the stroke cycles 300000, 600000, etc.
I assigned the die as a PRT, in the routing, to the material which is to be produced.
When I confirm the production order, the counter of the PRT increases, which I can check in IK13.
What I am expecting is that in IP10 the system should update the counter automatically, so that when the counter reaches 300000 a maintenance call will be scheduled and an order will be generated.
I would request you to please give me some input on this.
Thanks and Regards,
Rashmi
Hello Rashmi,
I have seen the screen shots & there are problems with the following:
1) Counter overflow reading
You have entered 130; keep it as 999999 (it is the reading after which the system starts counting from 0 again).
2) Formula that you have used;
You have used the formula SAPF02; do not use this. Create your own formula with the following details: SAPF99 (new formula) = SAP_20 * SAP_09.
3) You have manually entered the reading for the measuring point; do not do this. The system will calculate the measurement reading from the PRT usage value.
CREATE a single measuring document with reading as '0'.
4) Also, in CA01 you need to assign this PRT in the routing with "EQUIPMENT" & not material. Also make sure that on the same screen (assignment of PRT, basic data) you maintain a usage value of 1 (if this is not maintained, the measuring doc. will NOT be updated).
5) Now when you confirm the production order, based on number of units created; system will automatically create the measuring doc.
If there is any problem with CA01, CO01 or CO11n; ask your PP consultant to assist.
Do let me know if there is any problem in this.
Regards
Hemant -
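Hemant's counter-driven scheduling can be illustrated outside SAP as well. A sketch (Python, not SAP code; the 300000-stroke cycle mirrors the thread) of a die's stroke counter driven by production confirmations, with a maintenance call falling due each time the running total crosses the next multiple of the cycle:

```python
# Sketch: stroke counter updated by production confirmations; a PM call is
# due whenever the counter crosses the next multiple of the cycle.

CYCLE = 300_000

def confirm(counter, strokes):
    """Add confirmed strokes; return the new counter and calls now due."""
    new = counter + strokes
    calls_due = new // CYCLE - counter // CYCLE
    return new, calls_due

counter, due = 0, 0
for qty in [150_000, 150_000, 10_000]:   # three confirmations
    counter, calls = confirm(counter, qty)
    due += calls                          # second confirmation crosses 300000
```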
ECATT Use for upload master data
Hi
Please guide me on how to use eCATT when we upload the master data in SAP Best Practices, not in SAP R/3 (note: for the SAP Best Practices system only), through eCATT.
Does anybody have any idea about this?
Guide me
Regards
Roobal
eCATT can be used via transaction code SECATT.
This tool is used for data uploading and regression testing.
You need to prepare a scenario for regression testing.
You should create
1. Test Script
2. Test Data
3. Test Configuration.
If you want to upload master data, LSMW is the best tool.
http://www.scmexpertonline.com/downloads/SCM_LSMW_StepsOnWeb.doc
Regards,
Ravi
Edited by: Ravi Sankar Venna on Apr 22, 2009 12:54 PM -
Best strategies for reducing effective data collection rate?
Hi,
I'm writing a VI to collect data from cDAQ modules that have a minimum sampling rate of 1612 Hz due to the master timebase (specifically 9237 and 9215 modules). I would like to log data (1D waveform with timestamps) at 100 Hz for 12 hours and save it to file. The data also needs to be displayed during this time, and data logging should be robust so there is minimal risk of data loss.
I've made one attempt at this previously: I used the Align and Resample Express VI, but this was fairly messy, and the problems I ran into were errors about the maximum array size being reached, as well as constant buffer overwrite errors.
I would like to basically start this VI again from scratch, and I'm wondering if there are any suggestions for an overall strategy for this? I'm not asking for any code to be written for me, just concrete shoves in the right direction regarding things like data file type, how to resample or downsample data, how to clear arrays, etc.
Thanks,
Claire.
Claire,
It appears that you are appending the resampled waveform to an empty array of waveforms and then writing the appended array to the file. What does appending to an empty array accomplish? Just write the decimated array directly to the file.
If you want to have an array of all the waveforms, then you need to use shift registers to pass the appended array to the next iteration of the consumer loop.
You do not need the sequence structure. Dataflow takes care of making things happen in order.
I do not see how the feedback node will ever change the file name as it only executes once. I am not sure what you intended, but you may want to move the file path creation inside the loop and put it into a case structure which executes when it is time to create a new file.
The attached image is an example of the way I was thinking of decimating your data.
In both my image and your program, the use of Build Array in a loop is not good practice if the array can get large, due to memory allocation issues. Better is to initialize an array of the maximum size and use Replace Array Element to put the data into the array.
Lynn
Attachments:
Decimate waveform.png 83 KB -
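Lynn's two suggestions, decimating by an integer factor (1612 Hz / 16 is roughly 100 Hz) and preallocating the output instead of growing it, can be sketched outside LabVIEW as follows. This is plain decimation without anti-alias filtering, for illustration only; the factor and sizes are assumptions.

```python
# Sketch: (1) decimate the acquired signal by an integer factor, and
# (2) write into a preallocated output by index (the analog of
# Replace Array Element) rather than appending to a growing array.

FACTOR = 16                      # ~1612 Hz -> ~100 Hz (approximate)

def decimate(samples, factor=FACTOR):
    """Keep every `factor`-th sample (no anti-alias filtering)."""
    return samples[::factor]

n_total = 1612                          # one second of acquisition
out = [0.0] * (n_total // FACTOR + 1)   # preallocated output
for i, s in enumerate(decimate(list(range(n_total)))):
    out[i] = s                          # replace-by-index, no reallocation
```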
How to define the DATE fields in itab for uploading the data?
Hi Experts,
I am uploading data from an Excel sheet to a Z table via my_itab.
There are 3 date fields in the Z table, which are defined as DATS(8).
So, please let me know how I have to define itab-date_one, itab-date_second, and itab-date_three: as CHAR(10), as sy-datum, or via concatenation?
Thanks
Hi Sri, follow these steps.
First move the Excel sheet data to an internal table using the function module ALSM_EXCEL_TO_INTERNAL_TABLE.
Then you can insert that internal table's data into the database table using the INSERT command.
First upload the data from the Excel sheet to ITAB with the required structure. Once the data is there,
declare another internal table (ITAB1) of the database table's type,
and use INSERT <database table> FROM ITAB1.
Check this Example.
TABLES:MARA.
DATA:BEGIN OF ITAB OCCURS 0,
MATNR LIKE MARA-MATNR,
MBRSH LIKE MARA-MBRSH,
MTART LIKE MARA-MTART,
MEINS LIKE MARA-MEINS,
END OF ITAB.
DATA:ITAB1 LIKE MARA OCCURS 0 WITH HEADER LINE.
START-OF-SELECTION.
ITAB-MATNR = '123ABCDA'.
ITAB-MBRSH = 'C'.
ITAB-MTART = 'FERT' .
ITAB-MEINS = 'KG' .
APPEND ITAB.
ITAB-MATNR = '123ABCDB'.
ITAB-MBRSH = 'C'.
ITAB-MTART = 'FERT' .
ITAB-MEINS = 'KG' .
APPEND ITAB.
LOOP AT ITAB.
MOVE-CORRESPONDING ITAB TO ITAB1.
APPEND ITAB1.
ENDLOOP.
LOOP AT ITAB1.
MODIFY MARA FROM ITAB1. "insert, or update if the record already exists
ENDLOOP.
or,
By using the type pool TRUXS, our problem may be solved.
Take one internal table like your standard database table and
one internal table for TRUXS, like:
DATA: it_raw TYPE truxs_t_text_data.
DATA: itab like mara occurs 0 with header line.
Use FM
CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
EXPORTING
* I_FIELD_SEPERATOR =
i_line_header = 'X'
i_tab_raw_data = it_raw " WORK TABLE
i_filename = p_file
TABLES
i_tab_converted_data = itab[] "ACTUAL DATA
EXCEPTIONS
conversion_failed = 1
OTHERS = 2.
Now loop over that and update the database table.
Hope this helps.
kindly reward if found helpful.
cheers,
Hema.
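On the original question about the DATS(8) fields, which the replies do not address directly: a common approach is to read the Excel dates into CHAR(10) itab fields (dd.mm.yyyy) and convert them to the internal yyyymmdd form before the insert. The conversion logic, sketched outside ABAP (the field names and format are as described in the question):

```python
# Sketch: convert an external dd.mm.yyyy date string (CHAR(10)) to the
# internal DATS format (yyyymmdd) before writing to the Z table.

def to_dats(external: str) -> str:
    """'31.12.2024' -> '20241231'; raises ValueError on a malformed date."""
    day, month, year = external.split(".")
    if len(day) != 2 or len(month) != 2 or len(year) != 4:
        raise ValueError(f"not a dd.mm.yyyy date: {external!r}")
    return year + month + day

dats = to_dats("31.12.2024")
```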