Extraction from R/3 extractor to flat file
Hi,
I want to be able to extract data using R/3 extractors (LIS/Business Content extractors) within R/3 itself, loading the data either into a transparent table or into a flat file on the application server.
As an example, I want to extract data from 0MATERIAL_ATTR into a flat file.
I don't want to use BW or any other third-party tool for this job, only R/3 6.4.
Is it possible? Do I need to code an ABAP program for that?
Thanks for your answer.
Ludovic
0MATERIAL_ATTR was just an example.
What I need is to extract data from the LO-Cockpit 2LIS_* extractors. In this case there is no table behind them; the data is calculated in memory, so I really need to load the extracted data into a flat file or a transparent table.
Another point about RSA3: some LO-Cockpit extractors return about 10,000,000 lines, which makes it impossible to run RSA3 properly.
Do you have any other suggestion ?
Similar Messages
-
How to convert from SQL Server table to Flat file (txt file)
I need to ask how to convert a SQL Server table to a flat file (txt).
Hi
1. Import/Export Wizard
2. Bcp utility
3. SSIS
1. Import/Export Wizard
The first and very manual technique is the Import Wizard. This is great for ad-hoc, just-get-it-in tasks.
In SSMS right click the database you want to import into. Scroll to Tasks and select Import Data…
For the data source we want our zips.txt file. Browse for it and select it. You should notice the wizard tries to fill in the blanks for you. One key thing with this file I picked is that the fields have " qualifiers. So we need to make sure we add " into the text qualifier field. The wizard will not do this for you.
Go through the remaining pages to view everything. No further changes should be needed, though.
Hit next after checking the pages out and select your destination. This in our case will be DBA.dbo.zips.
Following the destination step, go into the edit mappings section to ensure we look good on the types and counts.
Hit next and then finish. Once completed you will see the count of rows transferred and the success or failure rate
Import wizard completed and you have the data!
bcp utility
Method two is bcp with a format file http://msdn.microsoft.com/en-us/library/ms162802.aspx
This is probably going to win for speed on most occasions but is limited to the formatting of the file being imported. For this file it actually works well with a small format file to show the contents and mappings to SQL Server.
To create a format file, all we really need is the type and the count of columns for the most basic files. In our case the qualifier makes it a bit difficult, but there is a trick to ignoring it: throw an extra field into the
format file that references the qualifier but is ignored in the import process.
Given that, our format file in this case would appear like this:
9.0
9
1 SQLCHAR 0 0 "\"" 0 dummy1 ""
2 SQLCHAR 0 50 "\",\"" 1 Field1 ""
3 SQLCHAR 0 50 "\",\"" 2 Field2 ""
4 SQLCHAR 0 50 "\",\"" 3 Field3 ""
5 SQLCHAR 0 50 "\"," 4 Field4 ""
6 SQLCHAR 0 50 "," 5 Field5 ""
7 SQLCHAR 0 50 "," 6 Field6 ""
8 SQLCHAR 0 50 "," 7 Field7 ""
9 SQLCHAR 0 50 "\n" 8 Field8 ""
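As a sanity check, the quote-qualifier handling that both the wizard's text qualifier field and the format file's dummy-field trick deal with can be reproduced with Python's csv module. A minimal sketch; the sample line below is made up to match the layout described (first few fields quoted, the rest not):

```python
import csv
import io

# An invented line shaped like the zips.txt rows described above.
line = '"00501","HOLTSVILLE","NY","40.81",-73.04,631,SUFFOLK,B\n'

# csv.reader handles the double-quote qualifier that the Import/Export
# wizard must be told about explicitly (and that bcp handles via the
# dummy field in the format file).
fields = next(csv.reader(io.StringIO(line), quotechar='"'))
print(fields[0])  # -> 00501 (quotes already stripped)
```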
The bcp call would be as follows
C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp DBA..zips in "C:\zips.txt" -f "c:\zip_format_file.txt" -S LKFW0133 -T
Given a successful run you should see this in command prompt after executing the statement
Starting copy...
1000 rows sent to SQL Server. Total sent: 1000
...
1000 rows sent to SQL Server. Total sent: 29000
bcp import completed!
BULK INSERT
Next, we have BULK INSERT, using the same format file from bcp:
CREATE TABLE zips (
Col1 nvarchar(50),
Col2 nvarchar(50),
Col3 nvarchar(50),
Col4 nvarchar(50),
Col5 nvarchar(50),
Col6 nvarchar(50),
Col7 nvarchar(50),
Col8 nvarchar(50)
)
GO
INSERT INTO zips
SELECT *
FROM OPENROWSET(BULK 'C:\Documents and Settings\tkrueger\My Documents\blog\cenzuszipcodes\zips.txt',
FORMATFILE='C:\Documents and Settings\tkrueger\My Documents\blog\zip_format_file.txt'
) as t1 ;
GO
That was simple enough given the work on the format file that we already did. Bulk insert isn’t as fast as bcp but gives you some freedom from within TSQL and SSMS to add functionality to the import.
SSIS
Next is my favorite playground in SSIS
We can do many methods in SSIS to get data from point A to point B. I'll show you the Data Flow task and the SSIS version of BULK INSERT.
First, create a new Integration Services project.
Create a new flat file connection by right clicking the connection managers area. This will be used in both methods
Bulk insert
You can use a format file here as well, which is beneficial when moving methods around. This is essentially calling the same processes with format file usage. Drag over a Bulk Insert task and double click it to go into the editor.
Fill in the information starting with connection. This will populate much as the wizard did.
Example of format file usage
Or specify your own details
Execute this and again, we have some data
Data Flow method
Bring over a data flow task and double click it to go into the data flow tab.
Bring over a flat file source and SQL Server destination. Edit the flat file source to use the connection manager “The file” we already created. Connect the two once they are there
Double click the SQL Server Destination task to open the editor. Enter in the connection manager information and select the table to import into.
Go into the mappings and connect the dots, so to speak.
A typical type-conversion issue is Unicode to non-Unicode.
We fix this with a Data Conversion task or an explicit conversion in the editor. Data Conversion tasks are usually the route I take. Drag over a Data Conversion task and place it on the connection between the flat file source and the SQL Server destination.
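At bottom, the DT_WSTR (Unicode) to DT_STR (non-Unicode) conversion the Data Conversion task performs is an encode into the destination column's codepage. A sketch of the idea; cp1252 is an assumed codepage, use whatever the destination column actually declares:

```python
text = "Zürich"                      # Unicode value from the flat file source
as_bytes = text.encode("cp1252")     # non-Unicode representation (assumed codepage)
round_trip = as_bytes.decode("cp1252")  # lossless for characters cp1252 can hold
print(round_trip)                    # -> Zürich
```

Characters outside the destination codepage are where this conversion fails in SSIS, too, which is why the Data Conversion task exposes error-row handling.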
New look in the mappings
And after execution…
SqlBulkCopy Method
Since we're in the SSIS package, we can use that awesome Script Task to show SqlBulkCopy. Not only fast, but also handy for those really “unique” file formats we receive so often.
Bring over a script task into the control flow
Double click the task and go to the Script page. Click Design Script to open up the code-behind.
Ref.
Ahsan Kabir http://www.aktechforum.blogspot.com/ -
Extracting data directly from a datasource to a flat file
My requirement is to extract data from the SAP standard profit center accounting DataSource 0EC_PCA_1 directly to a flat file. Can this be done in a standard way, or is custom ABAP development required?
I have tried the RSA3 based extractor, but as the result is displayed in packets, I would have to export the content of each extracted packet to separate flat file which is not very feasible.
Br,
HR
Hi,
1st option: create a dummy ODS as well as a dummy InfoSource and dummy update rules. Assign the DataSource to the dummy InfoSource. In the start routine of the transfer rules, take each data package and download it to a file, then delete the data package. That way you need almost no coding, and you can also use a possible delta mechanism.
2nd option: create your own ABAP program and call the extraction module. Take care about the way it works (init mode, building packages, ...). Possibly you need to find a way to remodel the delta capability.
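The package-wise shape of that second option can be sketched in any language. A minimal Python sketch, where fetch_packages is an invented stand-in for repeated calls to the extractor function module (init call, then one call per data package, the same package-wise behaviour RSA3 shows):

```python
import os
import tempfile

def fetch_packages():
    """Stand-in for repeated extractor calls; yields one data package per call.
    Rows are made up for illustration."""
    yield [("MAT1", 10), ("MAT2", 20)]   # package 1
    yield [("MAT3", 30)]                 # package 2

path = os.path.join(tempfile.gettempdir(), "extract.csv")
with open(path, "w") as f:
    for package in fetch_packages():     # handle one package at a time,
        for row in package:              # so memory stays bounded
            f.write(";".join(str(v) for v in row) + "\n")

with open(path) as f:
    lines = f.read().splitlines()
print(lines)
```

The point is that each package is written out and discarded before the next is fetched, which is what makes 10,000,000-row extracts feasible.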
kind regards
Siggi -
Extract work order data from R/3 system into flat file (CSV) and export to BI
Hi,
I am new to interfaces.
I need to extract data regarding actual cost and quantities of work assigned to Service Providers from SAP system and send it to BI for reporting purposes.
This interface will extract Master data as well as transactional data. Extraction of Master data will be a full extract and that of transactional data will be Delta Extraction.
Custom development will extract the data from different fields and will export it to flat files (CSV format). This program will amalgamate all the flat files created into one big file and export it to BI system.
Export of data to BI system will be done by schedule program Control M, which will export the data from SAP system to BI system in batches. Control M is expected to run daily at night.
Please provide the step-by-step process to do this.
Thanks
Suchi
Moderator message: anything else? please work yourself first on your requirement.
Edited by: Thomas Zloch on Mar 25, 2011 1:21 PM
Hi Ravi,
you've got to set up the message type MDMRECEIPT for the IDoc distribution from R/3 to XI. Check chapter 5.2 in the IT configuration guide available on <a href="http://service.sap.com/installmdm">MDM Documentation Center</a>. It describes the necessary steps.
BR Michael -
How to extract tables from Oracle into delimited flat files
Hi, I have the following requirement. I tried a dump procedure, but I could only extract tables one by one. I need to do the extract on a regular basis using a PL/SQL procedure.
Data will be extracted from production tables in Oracle into pipe delimited flat files that will be sent by SFTP. The list below represents the tables that will be used for extract along with a notation whether the entire table is extracted or only incremental transactional data.
Table name      Extraction type        No. of records
EXPIRE          All Records            157
ONE TIME        All Records            17
ACE             All Records            7,970
DATA            All Records            5,868
MEMBER          All Records            24,794,879
MEMBER          Incremental & Update   13,893,587 (Initial Load)
MEMBERRED       All Records            25,108,606
MEMBERPOINT     All Records            42,487,640
MEMBERCOM       Incremental & Update   14,337,561 (Initial Load)
MEMBERCODE      Incremental Only       14,985,568 (Initial Load)
MEMBERDETAIL    Incremental Only       14,341,890 (Initial Load)
MEMBERHISTORY   Incremental Only       70,021,067 (Initial Load)
Suggest how I can extract these tables using a PL/SQL procedure. For some of the tables above, only a select list of columns has to be extracted.
Saubhik wrote:
This may help you.
Re: Dynamic Fetch on dynamic Sql
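The core loop the question asks for (fetch rows, write pipe-delimited lines) looks the same in any language. A minimal Python sketch, with sqlite3 standing in for an Oracle connection (with cx_Oracle the DB-API cursor usage has the same shape; table and column names are made up):

```python
import sqlite3

# In-memory stand-in for the production Oracle schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE member (id INTEGER, name TEXT, city TEXT)")
conn.executemany("INSERT INTO member VALUES (?, ?, ?)",
                 [(1, "Ann", "Oslo"), (2, "Bob", "Turku")])

def dump_pipe_delimited(cur, table):
    """Yield one pipe-delimited line per row of the given table."""
    cur.execute("SELECT * FROM " + table)
    for row in cur:
        yield "|".join("" if v is None else str(v) for v in row)

lines = list(dump_pipe_delimited(conn.cursor(), "member"))
print(lines[0])   # -> 1|Ann|Oslo
```

In PL/SQL proper, the same shape is a cursor FOR loop writing each concatenated line with UTL_FILE.PUT_LINE.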
Well I was going to post my standard response, but I see I don't have to. ;) -
How to extract required data from a column to a flat file
My SSIS package is working OK. However, I want to refine one of the column extractions.
When data is extracted to the flat file, I just want the initials, first name, last name, e.g.
FZ = Ben Smith, Add1, add1, etc
The only bit that I want is "Ben Smith".
How can I state in the package to just give me the name and exclude the rest?
sukai
Add a Derived Column task to extract the name part alone and give an expression as below:
LEFT([ColumnName],FINDSTRING([ColumnName],",",1)-1)
Before SSIS 2012 the LEFT function is not available, so use SUBSTRING:
SUBSTRING([ColumnName],1,FINDSTRING([ColumnName],",",1)-1)
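Both expressions simply keep everything before the first comma. The same logic in Python, for quick verification against the sample value from the question:

```python
# Equivalent of LEFT([ColumnName], FINDSTRING([ColumnName], ",", 1) - 1)
value = "FZ = Ben Smith, Add1, add1"
before_comma = value[:value.index(",")]
print(before_comma)    # -> FZ = Ben Smith

# If only "Ben Smith" is wanted, also drop the prefix; the "= " delimiter
# is an assumption inferred from the single sample row.
name_only = before_comma.split("= ", 1)[-1]
print(name_only)       # -> Ben Smith
```

Note that the SSIS expression alone yields "FZ = Ben Smith"; stripping the "FZ = " prefix needs a second step, as above.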
Visakh http://visakhm.blogspot.com/ -
Error Extracting Essbase Product Dimension to a Flat File
I am new to ODI and do not know how to troubleshoot this issue.
I am trying to extract an Essbase Dimension (Product) to a Flat File.
The source is Essbase Product and the Flat File is the Target. The target file is .csv.
I have the fields set to execute on the staging area.
When I click apply, I get an error stating: Target/Staging Area: You must set a IKM for the target.
In the IKM folder I have the following:
IKM SQL to File Append
IKM SQL Hyperion Essbase (DATA)
IKM SQL Hyperion Essbase (METADATA)
I appreciate any help you can offer.
Follow this and you should be able to get an interface to extract metadata from Essbase: http://john-goodwin.blogspot.com/2008/10/odi-series-essbase-outline-extractor.html
Cheers
John
http://john-goodwin.blogspot.com/ -
How to get purchasing data from SAP R/3 to Flat files
Hi,
Please can anyone help me get the data from the purchasing extractors (2LIS_02_ITM, 2LIS_02_SCL, 2LIS_02_S012) to flat files? Here we don't want to load data from R/3 to BW for reporting; we want to load data from flat files into OWB (Oracle Warehouse Builder) for reporting. So is there any way to generate data into flat files from the purchasing extractors (full and deltas)?
Thanks,
Pavan.
Hello,
here is a short report which converts S012 entries to strings with separator semicolon. Perhaps this will help you?
Regards
Walter Habich
REPORT habitest2 LINE-SIZE 255.

TYPES:
  strtab_t TYPE TABLE OF string.

CONSTANTS:
  separator VALUE ';'.

DATA:
  it_s012 LIKE s012 OCCURS 0,
  wa_s012 LIKE s012,
  strtab  TYPE strtab_t,
  strele  TYPE string.

SELECT * FROM s012 INTO TABLE it_s012 UP TO 100 ROWS.

PERFORM data_to_string
  TABLES
    strtab
  USING
    'S012'. "requires it_s012 and wa_s012

LOOP AT strtab INTO strele.
  WRITE: / strele.
ENDLOOP.

*& Form data_to_string
FORM data_to_string TABLES strtab TYPE strtab_t
                    USING  ittab  TYPE any.

  DATA:
    h_zaehler                 TYPE i,  " component counter
    line_str                  TYPE string,
    l_tabellenname(10)        TYPE c,  " internal table name, e.g. IT_S012
    l_arbeitsbereichsname(10) TYPE c,  " work area name, e.g. WA_S012
    h_string                  TYPE string,
    h_char(255)               TYPE c.

  FIELD-SYMBOLS: <l_tabelle>  TYPE ANY TABLE,
                 <l_arbeits>  TYPE ANY,
                 <feldzeiger> TYPE ANY.

  CLEAR strtab.
  CONCATENATE 'IT_' ittab INTO l_tabellenname.
  ASSIGN (l_tabellenname) TO <l_tabelle>.
  CONCATENATE 'WA_' ittab INTO l_arbeitsbereichsname.
  ASSIGN (l_arbeitsbereichsname) TO <l_arbeits>.

  LOOP AT <l_tabelle> INTO <l_arbeits>.
    CLEAR: h_zaehler, line_str.
    line_str = ittab.
    DO.
      ADD 1 TO h_zaehler.
      ASSIGN COMPONENT h_zaehler OF
             STRUCTURE <l_arbeits> TO <feldzeiger>.
      IF sy-subrc <> 0. EXIT. ENDIF.
      WRITE <feldzeiger> TO h_char LEFT-JUSTIFIED. "#EC *
      h_string = h_char.
      CONCATENATE line_str separator h_string INTO line_str.
    ENDDO.
    APPEND line_str TO strtab.
  ENDLOOP.
ENDFORM.                    "data_to_string -
Extract BW infocube data to a flat file
Hi Folks,
I have a requirement where I need to extract data from Infocubes to a flat file.
Please suggest me the ABAP programming methodology where i can use function module to extract data from cube and dump it to internal table.
Your immediate help is required.
Thanks,
Rahul
You can simply try a workaround for your problem:
Right-click on the cube -> Display data -> menu bar -> Edit -> Download -> Save as spreadsheet (.xls) and extract the data.
And if you want .txt only, try saving the Excel file as .csv; then you can open it via Notepad and save it as .txt.
You will be saved from writing ABAP code.
-laksh -
Publishing data from a Cube to a flat file
Hi,
I need to publish data from an InfoCube to a flat file. The requirement here is that a variable selection will be specified (fiscal periods, product group) and, based on the selection chosen, the data should be generated in the flat file. What options do I have to do this?
I know we can generate data into a flat file using an APD, but how can we have a variable selection for that data generation? Also, is there a limit on the number of data records we can generate using an APD?
Apart from the APD, what other options do we have?
Thanks
Rashmi.
Hi Rashmi,
Please follow the steps below, using an Open Hub Destination:
1. Go to RSA1 -> Select Open Hub Destination from Left pane.
2. Right Click on the info area under which you want to create Open Hub and Select Create Open Hub Destination.
3. Give Name, Description for the Open Hub.
4. In template, Select Object type as "Infocube" and give the name in Name field. Press Enter
5. In Destination tab, Select Destination type as File. Give all other details like Filename, Directory etc.
In the Destination tab select the type as FILE and give the SEPARATOR as comma (,) and not as semicolon (;),
and observe that the values in the file will then be displayed in different cells.
6. In Field Def tab, you will see all the Fields that can be transferred from Source infocube to File.
7. Save the open hub destination.
8. Right click on created open hub, Select Create transformation. Give Source of transformation as the infocube. Make transformation as needed.
9. Again right click on the open hub, Select Create DTP. Then Save and Activate the DTP.
10. Schedule the data extract as required.
Option2:
You can download in LISTCUBE tcode as well.
Hope this helps.
Regards,
Reddy -
Dynamic file name from input payload (RFC 2 flat file)
Hi,
I have an RFC to flat file scenario. The output flat file does not have an XML structure; it's just a plain text file generated with ABAP mapping.
In my source interface (RFC), I have a field called <FILENAME>, and I want to use the value of that field to create the target file using a dynamic file name. But if in variable substitution I use payload:ZRFC_NAME,1,FILENAME,1 it doesn't work, because dynamic variable substitution tries to access the output payload, not the source one...
What can I do?
Hi Marshal,
You can add an extra node to your target structure like:
FileName - Node
--FileName - Element
Do the mapping from the FILENAME field of the RFC to the FileName field in your target structure, and use this field path as the reference in variable substitution.
In the content conversion, add the names & values as below:
FileName.fieldNames -- FileName
FileName.fieldFixedLengths -- 0
FileName.fixedLengthTooShortHandling -- Cut
This way the extra field in your target structure will not be populated into your target text file.
Cheers
Veera -
Loading data to a cube from a DSO and a Flat file
Hi All,
I have an Infocube with fields
product
plant
customer
quantity
country
I am trying to load data into it, a few fields from a DSO and the others from a flat file,
e.g. plant and country from a DSO,
and product, customer and quantity from a flat file.
I have created 2 transformations:
one from the DSO -> cube (which works);
the other transformation does not get activated -- it gives an error message saying (no source fields are assigned to the rule).
Is it possible to make a load of this sort? If so, then why is the transformation giving that error?
Please do help!
Thanks and Regards,
Radhika
Dear Friend,
Not sure what you have done; it should be like this:
DataSource --- DSO1 --- Cube1
Flat file --- DSO (optional) --- Cube2
Then you should build a MultiProvider on top of these cubes (Cube1 and Cube2) and then create the query.
Please check if this is something you have done.
Hope this helps.
Thanks
sukhi -
Exporting the user IDs from R/3 to a flat file
I need to generate a flat file with all the user IDs from an ABAP system. How can I do that? Is there something available out of the box, or do I need to develop something?
Also, is there a quick way to bring all the user IDs from R/3 into the Portal?
Hi,
Go to SE16, click the Table Contents button on the screen and execute the table (the logon data is in table USR02). It will list the user details -> Edit -> Download -> Spreadsheet -> give the name and location for the file.
Reward with points if it is useful.
Regards,
Sangeetha.A -
Hello,
I have to load data from a CSV file to a SQL database. The file is placed into a directory by another program; the file name stays the same, but it has a different extension based on the time of day that it is placed in the directory. Since I know the file name ahead of time, I want to strip off the date/time extension from the file name so that I can load the file using the Flat File Connection Manager. I am trying to use a variable and the expression editor so that I can specify the file name dynamically, but I don't know how to write it correctly. I have a variable 'FileLocation' that holds the folder location where the file will be placed. The file, for example: MyFileName201410231230 (MyFileName is always the same, but the date/time will be different).
Thanks,
jkrish
I don't want to use a ForEach Loop because the files are placed by an FTP process 3 times a day at specific times, for example at 10 AM, 12 PM and 3 PM. These times are pretty much fixed. The file name is the same, but the extension will have this day's time stamp. I have to load each file only once for a particular reason, meaning I don't want to load the ones I already loaded. I am planning on setting up the SSIS process to load at 10:05, 12:05 and 3:05 daily. The files will be piling up in the folder; as each one comes, I load it. At some point I can remove the old ones so that they won't take up space on the server. In fact, I don't have to keep the old ones at all, since they are saved in a different folder anyway; I can ask the FTP process to remove the previous one when the new one arrives. So, at any point in time, there will be one file, but that file will have a different extension every time.
I am thinking of removing the extension before I load every time. If the file name is 'MyFileNamexxxxxxx.csv', then I want to change it to 'MyFileName.csv' and load it.
Thanks,
jkrish
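The rename jkrish describes can be sketched like this (the 12-digit yyyymmddhhmm stamp format is an assumption inferred from the sample name):

```python
import re

# Hypothetical incoming name: fixed stem plus a 12-digit time stamp.
incoming = "MyFileName201410231230.csv"

# Strip the trailing timestamp that sits just before the .csv extension.
renamed = re.sub(r"\d{12}(?=\.csv$)", "", incoming)
print(renamed)   # -> MyFileName.csv
```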
You WILL need to use it eventually because you need to iterate over each file.
Renaming is unnecessary as one way or another you will need to put a processed file away.
And having the file with the original extension intact will also help you troubleshoot.
Arthur -
SAP CRM Extracts BP, IBASE and Contracts to flat files
Hi,
We want to download SAP CRM (5.0) data to flat files (every month). The datasets are rather large (1 million per object) and we have identified different approaches to do this:
- Using an ABAP report with BAPI's
- Using an ABAP report with direct selections
Now, performance might become an issue, so direct selections seem the way to go.
Are there any other means of downloading the data from SAP CRM to flat files, maybe using a BW adapter?
Any ideas are welcome!
Kind regards,
Arjan Boer
Hi,
To my knowledge, in CRM you can use the BW Adapter:
BWA1 (BW Adapter: Maintain DataSource)
BWA5 to activate the DataSource
BWA7: enter the name of the DataSource and press the candle icon for 'Activate Delta'.
Hope this helps.
raj