Urgent - Uploading spreadsheet data
Hi,
Can you point me to a sample application which explains how to upload spreadsheet data into multiple tables?
For example:
Table 1 - City (id, name)
Table 2 - City_Details (city_id, details)
city_id is a foreign key to 'id' in the City table.
Sample spreadsheet rows:
Row 1: 'Los Angeles' 'A major city in California'
Row 2: 'Sacramento' 'Capital of California'
Thanks,
Vishal
Take a look at the thread Re: Loading a CSV file into a table (Vikas Plz).
Put your own logic into the HTMLDB_TOOLS package so that it inserts into two tables.
Mike
Similar Messages
-
Uploading spreadsheet data into the database
Hi
I want to upload spreadsheet data into the database through the front end. I don't have any idea how to do the upload without using the 'utilities' option. Can anyone please help me with this?
Thanks in advance
Fazila
Hi,
I referred to the example sent by Vikas, but I could not understand it. I don't need to specify the table name at runtime. My requirement is that I will have a constant table (say, an MD lookup table), and I will have some data under the column headings (say, repid and split name).
Now I want to import the spreadsheet data under the headings repid and split name through my front-end application, with the option to either 'overwrite' the records or 'append' the new records. After clicking the chosen option, I want to import my spreadsheet data into the already defined table. My other requirement is that I want to check for duplicate data between the spreadsheet and the table. If I find duplicates, I have to omit them and store the remaining details.
Please give me some guidelines to solve this problem.
Thanks in advance
Fazila -
Has there been any progress on this front?
I've read the previous posts on this topic, and there still doesn't seem to be an elegant solution short of exposing end users to the Data Workshop or parsing pasted data from a Text Area (suggested, but not documented how to do this).
In my business unit (Finance/Pricing), all serious number crunching is done in complex spreadsheet applications, and important calculated values (like price points offered to customers) are transferred to an MS Access database for monthly and ad hoc reporting.
At present it is a trivial exercise for me to automate the transfer of calculated values from the spreadsheet onto an MS Access form using VBA. This saves time and reduces transcription errors for end users, and they love this feature.
Our business unit is considering HTML DB as an alternative to MS Access, but it would be very difficult to persuade management if this also meant more data entry and a greater possibility of transcription errors by the end users.
Do you know of any JavaScript that could copy data out of an open Excel file, using named ranges, onto an HTML DB form for saving into tables? Or any other straightforward way to implement end-user transfer of values calculated in a spreadsheet into HTML DB?
Asim,
HTML DB currently requires developer privileges to use its spreadsheet loading capabilities. There are examples available in the HTML DB Studio (http://htmldb.oracle.com/studio), however, that may help you build your own data loading screens.
See for example:
http://htmldb.oracle.com/pls/otn/f?p=18326:54:::::P54_ID:1342
Sergio -
Dear All,
I've tried to implement this how to
http://www.oracle.com/technology/products/database/application_express/howtos/howto_create_upload_spreadsheet_form.html
It works great, but when the data to cut/paste into the field gets too large, I end up with ORA-06502: PL/SQL: numeric or value error: character string buffer too small. I think this could be a limitation of the VARCHAR2 datatype. How can I overcome this limit?
Any ideas would be appreciated
Thank you so much !
Dario
I guess the next logical step would be to modify the example and use a LONG datatype. It would be interesting to see if this worked.
-
Problem in Uploading Excel Data ! - Urgent
Dear Experts,
I am uploading Excel data using FM 'TEXT_CONVERT_XLS_TO_SAP'; it converts the data directly and stores it in an internal table, row- and column-wise, just as in the Excel sheet.
But the problem is that one of the columns has a description of more than 500 characters, so while uploading from Excel to the internal table it truncates the text and uploads only 255 characters.
Is there any other way to upload text longer than 500 characters? I have also tried the FM 'ALSM_EXCEL_TO_INTERNAL_TABLE', and it did not work either.
Please help me, it's urgent.
Points will be rewarded.
Thanks & Regards
Hi,
Please use FM 'GUI_UPLOAD'.
types: begin of ttab,
         rec(1000) type c,
       end of ttab.

types: begin of tdat,
         fld1(10) type c,
         fld2(10) type c,
         fld3(10) type c,
       end of tdat.

data: itab type table of ttab with header line.
data: idat type table of tdat with header line.
data: file_str type string.

parameters: p_file type localfile.

at selection-screen on value-request for p_file.
  call function 'KD_GET_FILENAME_ON_F4'
    exporting
      static    = 'X'
    changing
      file_name = p_file.

start-of-selection.

  file_str = p_file.

* The file is read as plain text, so save the Excel sheet as a
* tab- or comma-delimited text file before uploading.
  call function 'GUI_UPLOAD'
    exporting
      filename                = file_str
      filetype                = 'ASC'
    tables
      data_tab                = itab
    exceptions
      file_open_error         = 1
      file_read_error         = 2
      no_batch                = 3
      gui_refuse_filetransfer = 4
      invalid_type            = 5
      no_authority            = 6
      unknown_error           = 7
      bad_data_format         = 8
      header_not_allowed      = 9
      separator_not_allowed   = 10
      header_too_long         = 11
      unknown_dp_error        = 12
      access_denied           = 13
      dp_out_of_memory        = 14
      disk_full               = 15
      dp_timeout              = 16
      others                  = 17.

* Split each raw line into the target fields
* (adjust the delimiter to match your file).
  loop at itab.
    clear idat.
    split itab-rec at ',' into idat-fld1 idat-fld2 idat-fld3.
    append idat.
  endloop.
Best regards,
Prashant -
Measure Formulae for Uploadable and Data collection report
Hi,
I have a query about applying measure formulas to uploadable and data collection reports.
Consider a scenario where I use an MDX query to create a data collection report, and I map the columns from these reports to a rowsource using a loader file. Can I use a measure formula feature like onchange with uploadable and data collection reports, such that changes to any one column take effect in another column?
Regards,
Wesley
Wesley,
IOP uploadable reports are used for sending data to the server in batch. They are coupled to the IOP model via a rowsource. You have a lot of flexibility with the spreadsheet you build for staging the upload; it's really just a function of how crafty you are with VB. Cascading changes from one column to another can easily be accomplished this way.
Onchange formulas are for something else. They are part of the model definition and are used for describing how to cascade changes between IOP data blocks. -
Hi experts,
I have to upload legacy data into SAP using eCATT by recording a transaction. I have completed the test script in tcode SECATT. What do I have to do next? How can I attach my legacy file? Can anyone help me? It's really urgent.
Thanks in advance
Hi,
There are a series of weblogs in this area. Check them out, I think there are 8 in a series:
/people/sapna.modi/blog/2006/04/10/ecatt--an-introduction-part-i
cheers
Aveek -
Is it possible to upload a spreadsheet for price conditions?
Hi!!
My company would like to work with price lists, but we have many different SKUs and prices by customer. For example, for one customer we have more than 500 different SKUs and prices.
If we use the standard transaction VK11, we will have a lot of work to update the prices.
I would like to know if it is possible to upload a spreadsheet for price conditions, via BAPI, ABAP program, etc.
Our price condition has the following key fields:
Sales organization
Customer
Material
Incoterms (CIF/FOB)
We will use the date to validate the price too.
Best regards,
Julio César
Hi
You can use program RV14BTCI - Batch Input for Uploading Condition Pricing.
Check below links
Is there a way we can upload pricing condition ... | SCN
Standard Upload Program -
Please send detail steps for uploading legacy data
Hi friends,
please send detail steps for uploading legacy data
Thanking u in advance,
Diwa.
Hi, you can use LSMW to upload legacy data.
LSMW is used for migrating data from a legacy system to an SAP system, or from one SAP system to another.
Apart from standard batch/direct input and recordings, BAPIs and IDocs are available as additional import methods for processing the legacy data.
The LSMW comprises the following main steps:
Read data (legacy data in spreadsheet tables and/or sequential files).
Convert data (from the source into the target format).
Import data (to the database used by the R/3 application).
But, before these steps, you need to perform following steps :
Define source structure : structure of data in the source file.
Define target structure : structure of SAP that receives data.
Field mapping: Mapping between the source and target structure with conversions, if any.
Specify file: location of the source file
Of all the methods used for data migration, like BDC, LSMW, and Call Transaction, which one is used most of the time?
How is the decision made as to which method should be followed? What is the procedure for this analysis?
All three methods are used to migrate data. The choice depends on the scenario and the amount of data to be transferred. LSMW is a ready-made tool provided by SAP, and you have to follow some 17 steps to migrate master data. With BDCs, the session method is the better choice because of some advantages over call transaction, but call transaction is also very useful for immediately updating small amounts of data (with call transaction, the developer has to handle errors).
So, the bottom line is: choose among these methods based on the real-time requirements.
These methods are chosen entirely based on the situation you are in. The direct input method is not available for every scenario; where it is available, it is the simplest one. With the batch input method, you need to do a recording of the transaction concerned. Similarly, IDoc and BAPI are available, and their use needs to be decided based on the requirement.
Try to go through some material on these four methods and implement them. You will then have a fair idea of when to use which.
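As a rough illustration of the call transaction method discussed above: a BDC program fills a BDCDATA table with screen and field values and then calls the transaction. This is only a sketch; the module pool, screen number, and field name below are hypothetical placeholders, not taken from this thread.

```abap
* Minimal BDC call-transaction sketch.
* SAPMF02K / 0100 / RF02K-LIFNR are placeholder names for illustration.
DATA: bdcdata TYPE TABLE OF bdcdata    WITH HEADER LINE,
      messtab TYPE TABLE OF bdcmsgcoll WITH HEADER LINE.

* First screen of the transaction.
CLEAR bdcdata.
bdcdata-program  = 'SAPMF02K'.
bdcdata-dynpro   = '0100'.
bdcdata-dynbegin = 'X'.
APPEND bdcdata.

* A field value on that screen.
CLEAR bdcdata.
bdcdata-fnam = 'RF02K-LIFNR'.
bdcdata-fval = 'VENDOR01'.
APPEND bdcdata.

* Call the transaction in no-display mode; errors land in messtab.
CALL TRANSACTION 'XK01' USING bdcdata
     MODE   'N'
     UPDATE 'S'
     MESSAGES INTO messtab.
```

The session method would instead collect the same BDCDATA via BDC_OPEN_GROUP / BDC_INSERT / BDC_CLOSE_GROUP and process the session later in SM35.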
LSMW Steps For Data Migration
How to develop a lsmw for data migration for va01 or xk01 transaction?
You can create lsmw for data migration as follows (using session method):
Example for xk01 (create vendor)
Initially there will be 20 steps, but after processing the first step it will be reduced to 14 for the session method.
1. TCode : LSMW.
2. Enter Project name, sub project name and object name.
Execute.
3. Maintain object attributes.
Execute
select Batch Input recording
goto->Recording overview
create
recording name.
enter transaction code.
start recording
do the recording as per your choice.
save + back.
enter the recording name in the LSMW screen.
save + back
Now there will be 14 steps.
2. MAINTAIN SOURCE STRUCTURES.
Here you have to enter the name of internal table.
display change
create
save + back
3. MAINTAIN SOURCE FIELDS.
display change
select structure
source_fields->copy fields.
a dialogue window will appear.
select -> from data file
apply source fields
enter No. of fields
length of fields
attach file
save + back
4. MAINTAIN STRUCTURE RELATIONS
display change
save + back
5. MAINTAIN FIELD MAPPING & CONVERSION RULE
display change
click on the source field, select the exact field from the structure, and enter
repeat these steps for all fields.
save+back
6. MAINTAIN FIXED VALUES, TRANSACTION, USER DEFINED
execute
save + back
7. SPECIFY FILES.
display change
click on legacy data
attach the flat file
give a description
select tabulator
enter
save + back
8. ASSIGN FILE
execute
display change
save + back
9. IMPORT DATA.
execute
display change
save + back
10. DISPLAY IMPORTED DATA
enter OK; it will show the records only.
back
11. CONVERT DATA
execute
display change
save + back
12. DISPLAY CONVERTED DATA
execute
display change
save + back
13. CREATE BATCH INPUT SESSION
tick keep batch input folder
F8
back
14. RUN BATCH INPUT SESSION.
SM35 will open
the object name will be shown here
select the object & process -
Hi experts,
I have to upload legacy data into SAP using eCATT by recording a transaction. I have completed the test script in tcode SECATT. What do I have to do next? How can I attach my legacy file? Can anyone help me? It's really urgent.
Hi Silviya,
check these threads...
/people/sapna.modi/blog/2006/04/10/ecatt-scripts-creation--sapgui-mode-part-iii
Steps in ECATT
uploading data from external files with multiple entries in ecatt
ecatt upload
Error in executing eCATT GUI Script
hope these help,
do reward if it helps,
priya. -
" Can not interpret the data in file " error while uploading the data in DB
Dear All ,
After running the below report I am getting the " Can not interpret the data in file " error.
Need to upload the data in DB through excel or .txt file.
Kindly advise to resolve the issue.
REPORT ztest_4.

data: it TYPE zprint_loc OCCURS 0 WITH HEADER LINE,
      filetable TYPE TABLE OF file_table,
      wa_filetable LIKE LINE OF filetable,
      wa_filename TYPE string,
      rc TYPE i.

CALL METHOD cl_gui_frontend_services=>file_open_dialog
  CHANGING
    file_table = filetable
    rc         = rc.

IF sy-subrc = 0.
  READ TABLE filetable INTO wa_filetable INDEX 1.
  MOVE wa_filetable-filename TO wa_filename.
ELSE.
  WRITE: / 'HI'.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

START-OF-SELECTION.

  CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
      filename            = wa_filename
      filetype            = 'ASC'
      has_field_separator = 'X'
    TABLES
      data_tab            = it.

  IF sy-subrc = 0.
    WRITE: / 'HI'.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.

  INSERT zprint_loc FROM TABLE it.
  IF sy-subrc = 0.
    COMMIT WORK.
  ELSE.
    ROLLBACK WORK.
  ENDIF.
Regards
Machindra Patade
Edited by: Machindra Patade on Apr 9, 2010 1:34 PM
Dear Dedeepya Reddy,
I was not able to upload the Excel file, but I did succeed in uploading the .csv file to the DB with the code below. Thanks for your advice.
REPORT ztest_3.

* internal table declaration
DATA: itab TYPE STANDARD TABLE OF zprint_loc,
      wa LIKE LINE OF itab,
      wa1 LIKE LINE OF itab.

* variable declaration
DATA: v_excel_string(2000) TYPE c,
      v_file LIKE v_excel_string VALUE 'C:\Documents and Settings\devadm\Desktop\test.csv', " name of the file
      delimiter TYPE c VALUE ' '. " delimiter with default value space

* read the file from the application server
OPEN DATASET v_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc NE 0.
  WRITE: / 'error opening file'.
ELSE.
  WHILE ( sy-subrc EQ 0 ).
    READ DATASET v_file INTO wa.
    IF NOT wa IS INITIAL.
      APPEND wa TO itab.
    ENDIF.
    CLEAR wa.
  ENDWHILE.
ENDIF.
CLOSE DATASET v_file.

EXEC SQL.
  TRUNCATE TABLE "ZPRINT_LOC"
ENDEXEC.

* display the data from the internal table
LOOP AT itab INTO wa1.
  WRITE:/ wa1-mandt, wa1-zloc_code, wa1-zloc_desc, wa1-zloc, wa1-zstate.
ENDLOOP.

INSERT zprint_loc FROM TABLE itab.
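Reading each .csv line straight into the structure only works when the record layout happens to line up with the raw text; more typically, each line has to be split at the delimiter first. A minimal sketch of that variation, reusing the ZPRINT_LOC columns from the post above and assuming a comma delimiter:

```abap
* Sketch: split each comma-delimited line into the structure's fields.
* ZPRINT_LOC and its columns come from the post above; the comma
* delimiter and the file path are assumptions.
DATA: v_file  TYPE string VALUE 'C:\Documents and Settings\devadm\Desktop\test.csv',
      lv_line TYPE string,
      wa      TYPE zprint_loc,
      itab    TYPE STANDARD TABLE OF zprint_loc.

OPEN DATASET v_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  READ DATASET v_file INTO lv_line.
  IF sy-subrc NE 0.
    EXIT.                       " end of file
  ENDIF.
  SPLIT lv_line AT ','
        INTO wa-mandt wa-zloc_code wa-zloc_desc wa-zloc wa-zstate.
  APPEND wa TO itab.
ENDDO.
CLOSE DATASET v_file.
```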
Issue when uploading sales data from DSO to cube
Dear All,
I have an issue when uploading sales data from a DSO to a cube. I am using BI 7.0, and I have uploaded all sales document level data into my DSO. Then I use a transformation rule to calculate the sales value when doing the DTP to the cube. The cube has customer-wise aggregated data.
In the DSO I have NetPrice (KF) and Delivered_QTY (KF). I do a simple multiplication routine in the transformation from DSO to cube:
RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY.
At the moment I use the active table (without archive) of the DSO to get the data, since this is my first load.
The issue is that the figure (sales value) in the cube is incorrect. I am getting very large values, which is impossible.
Can someone please help me?
Shanka
Hi,
Are you sure that the cube has customer-wise aggregated data? It will always aggregate the values of the key figures for the same set of characteristics.
Did you check the other key figures as well? Are they also inflated, or is the problem with this key figure only?
During the data load, the records may be aggregated first for the same characteristic values, and then the multiplication happens. If that is the case, you may have to multiply the values before they are aggregated in the data package; this can be achieved through a start routine.
But first verify whether other key figures have the same issue.
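A rough sketch of the start-routine idea above: compute the product per record in SOURCE_PACKAGE before any aggregation can occur, writing it into a target field that the transformation then maps 1:1. NET_PRICE and DLV_QTY come from this thread; SALES_VAL is a hypothetical result field.

```abap
* Start routine sketch (BI 7.0 transformation): multiply per record
* before aggregation. SALES_VAL is an assumed field name in the
* source package structure.
FIELD-SYMBOLS: <source_fields> LIKE LINE OF SOURCE_PACKAGE.

LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.
  <source_fields>-sales_val = <source_fields>-net_price
                            * <source_fields>-dlv_qty.
ENDLOOP.
```

The transformation rule for the cube key figure would then simply move SALES_VAL instead of multiplying, so aggregation in the data package can no longer distort the product.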
Thanks
Ajeet -
What are the Tcodes for uploading data using BDC & CATT?
PP members:
I was going through the <b>cutover activities</b>, and what I understood is that we transfer all the legacy system data into SAP before going live.
The data upload follows certain steps (depending on the organizational design load strategies):
First we upload all the master data (material master, BOMs, work centers & routings).
Then the transaction data (ideally speaking, there should be no open orders, i.e. WIP, as of the day of cutoff).
If the WIP (work in process) is unavoidable, then the materials consumed shall be treated as <b>materials of the previous stage</b>, and necessary adjustments shall be made after the cutover day.
At this point, I could not understand what the author means by <b>materials of the previous stage</b>.
Now, coming to the uploading of data into SAP from the legacy system, we use tools like LSMW, CATT & BDC.
Is it a must to use <b>only the LSMW tool</b> to upload master data, or are other upload tools fine?
Lastly, I am not sure about the Tcodes for CATT & BDC.
Summary of the questions:
1. What does the author mean by <b>material of previous stage</b> for WIP materials during cutover activities?
2. Is it mandatory to use only the LSMW tool for uploading master data?
3. What are the Tcodes for the upload tools CATT & BDC?
Thanks for your time
Suren R
Dear,
1. What does the author mean by material of previous stage, for WIP materials during cutover activities? As I understood it, this refers to the stage the material is at: for example, it may already have gone through 2 work centers with the other 2 still left, i.e. you need to create a production order with only the 2 remaining operations, as the other 2 are already over. Usually it is done in such a way that we create the production order and confirm up to those operations, and WIP is calculated so that FI can tally the books in SAP and the legacy system.
2. Is it mandatory to use only the LSMW tool for uploading master data? No, you can use any tool that suits your requirement.
3. What are the Tcodes for the upload tools CATT & BDC? BDC is done through a program in SE38; CATT through SCAT. -
Infocube was not uploaded with data in BI 7
Dear Ones,
I am trying to upload data from a flat file to an infocube. It was successful, even into the infocube, but the contents of the infocube do not show the PRICE key figure values; only the MAT_KEY values were uploaded successfully. What could be the reason?
Is it necessary to create the flat file with the 0CURRENCY unit characteristic to support PRICE?
Also, while the DTP is scheduled, under the processing tab it shows only PSA and the rest are all disabled. How then can it upload into the data target?
Kindly provide me with a solution for the above situation.
Thanks,
Raj
Dear Ones,
Thanks for your replies, but let me explain what I am facing:
I have to upload data for only 2 infoobjects, i.e. MAT_KEY [char.] and PRICE [key fig.], into the infocube. Now tell me, what fields should be entered into the flat file?
And how can I enter USD and INR price values without 0CURRENCY?
While running the DTP between the DataSource and the data target, I am getting only up to the PSA under the processing tab, if I am correct.
Kindly do the needful
Thanks,
Raj -
How to upload Excel data into Z-tables quickly & easily
Dear All,
I want to compare material codes in 2 different Excel sheets.
I have downloaded output from 2 different SQVI queries into 2 different Excel sheets.
Now I want to compare these 2 sheets to get the matching and non-matching codes.
I could use VLOOKUP in Excel, but how can I make use of ABAP by uploading the data from these 2 Excel sheets into 2 different Z-tables and comparing the 2 tables with some user-defined transaction code?
For this activity, how can I upload the Excel data (2 sheets) into 2 Z-tables quickly & easily?
Is there any method other than SCAT, BDC and LSMW?
Or which of the above is the best method?
Please reply.
Thanks.
Have a look at the program:
*: Description :
*: This is a simple example program to get data from an excel :
*: file and store it in an internal table. :
*: Author : www.sapdev.co.uk, based on code from Jayanta :
*: SAP Version : 4.7 :
REPORT zupload_excel_to_itab.

TYPE-POOLS: truxs.

PARAMETERS: p_file TYPE rlgrap-filename.

TYPES: BEGIN OF t_datatab,
         col1(30) TYPE c,
         col2(30) TYPE c,
         col3(30) TYPE c,
       END OF t_datatab.
DATA: it_datatab TYPE STANDARD TABLE OF t_datatab,
      wa_datatab TYPE t_datatab.
DATA: it_raw TYPE truxs_t_text_data.

* At selection screen
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
  CALL FUNCTION 'F4_FILENAME'
    EXPORTING
      field_name = 'P_FILE'
    IMPORTING
      file_name  = p_file.

*START-OF-SELECTION.
START-OF-SELECTION.
  CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
    EXPORTING
*     i_field_seperator    =
      i_line_header        = 'X'
      i_tab_raw_data       = it_raw       " WORK TABLE
      i_filename           = p_file
    TABLES
      i_tab_converted_data = it_datatab[] " ACTUAL DATA
    EXCEPTIONS
      conversion_failed    = 1
      OTHERS               = 2.
  IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.

*END-OF-SELECTION.
END-OF-SELECTION.
  LOOP AT it_datatab INTO wa_datatab.
    WRITE:/ wa_datatab-col1,
            wa_datatab-col2,
            wa_datatab-col3.
  ENDLOOP.
Message was edited by:
K N REDDY
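On the comparison itself (the original question): once both sheets are uploaded into Z-tables, matching and non-matching codes can be found with a simple loop and READ TABLE. A minimal sketch, assuming two hypothetical tables ZMAT1 and ZMAT2 that each hold a material code field MATNR (none of these names come from the thread):

```abap
* Sketch: compare material codes from two uploaded Z-tables.
* ZMAT1, ZMAT2 and the MATNR field are hypothetical names.
DATA: it_mat1 TYPE STANDARD TABLE OF zmat1,
      it_mat2 TYPE STANDARD TABLE OF zmat2,
      wa_mat1 TYPE zmat1,
      wa_mat2 TYPE zmat2.

SELECT * FROM zmat1 INTO TABLE it_mat1.
SELECT * FROM zmat2 INTO TABLE it_mat2.

LOOP AT it_mat1 INTO wa_mat1.
  READ TABLE it_mat2 INTO wa_mat2
       WITH KEY matnr = wa_mat1-matnr.
  IF sy-subrc = 0.
    WRITE: / wa_mat1-matnr, 'match'.
  ELSE.
    WRITE: / wa_mat1-matnr, 'only in first table'.
  ENDIF.
ENDLOOP.
```

For large tables, sorting it_mat2 and using READ TABLE ... BINARY SEARCH (or sorted/hashed table types) would avoid the linear lookup per row.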