Extracting data directly from a datasource to a flat file

My requirement is to extract data from the SAP standard Profit Center Accounting DataSource 0EC_PCA_1 directly to a flat file. Can this be done in a standard way, or is custom ABAP development required?
I have tried RSA3 (the extractor checker), but as the result is displayed in packets, I would have to export the content of each extracted packet to a separate flat file, which is not very practical.
Br,
HR

Hi,
1st option: create a dummy ODS object as well as a dummy InfoSource and dummy update rules, and assign the DataSource to the dummy InfoSource. In the start routine of the transfer rules, take each data package, download it to a file, and then delete the data package. This way you need almost no coding and you can also use any delta mechanism the DataSource offers (a sketch of such a start routine follows after the 2nd option).
2nd option: create your own ABAP program and call the extraction function module yourself. Take care to handle the way it works: init mode, building packages, and so on. You may also need to find a way to rebuild the delta capability.
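For illustration, a minimal sketch of such a start routine (BW 3.x transfer rules), assuming the file is written to the application server; the path is a placeholder, the generic field-by-field formatting is just one way to build a CSV line, and the package table is typically called DATAPAK in transfer rules (DATA_PACKAGE in update rules):
* Transfer-rule start routine (BW 3.x): write each data package to a file
* on the application server, then clear the package so nothing is posted
* to the dummy ODS. File path is a placeholder.
  DATA: l_file TYPE string,
        l_line TYPE string,
        l_val  TYPE string.
  FIELD-SYMBOLS: <ls_rec>  TYPE ANY,
                 <l_field> TYPE ANY.

  CONCATENATE '/usr/sap/trans/data/0EC_PCA_1_' sy-datum '_' sy-uzeit '.csv'
         INTO l_file.

  OPEN DATASET l_file FOR APPENDING IN TEXT MODE ENCODING DEFAULT.
  IF sy-subrc <> 0.
    abort = 1.                      "stop the request if the file cannot be opened
    EXIT.
  ENDIF.

  LOOP AT datapak ASSIGNING <ls_rec>.
    CLEAR l_line.
    DO.                             "generic field-by-field CSV formatting
      ASSIGN COMPONENT sy-index OF STRUCTURE <ls_rec> TO <l_field>.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      l_val = <l_field>.
      IF sy-index = 1.
        l_line = l_val.
      ELSE.
        CONCATENATE l_line l_val INTO l_line SEPARATED BY ';'.
      ENDIF.
    ENDDO.
    TRANSFER l_line TO l_file.
  ENDLOOP.

  CLOSE DATASET l_file.

  REFRESH datapak.                  "delete the package - the dummy ODS stays empty
Since this remains an ordinary InfoPackage load, the delta mechanism of 0EC_PCA_1 keeps working; you only schedule the InfoPackage and collect the files.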
kind regards
Siggi

Similar Messages

  • How to remove the date extensions from a filename in SSIS Flat File Connection Manager dynamically at run time

    Hello,
    I have to load data from a csv file to a SQL Server database. The file is placed into a directory by another program; the file name stays the same, but the extension differs based on the time of day it is placed in the directory. Since I know the file name
    ahead of time, I want to strip off the date/time extension from the file name so that I can load the file using the Flat File Connection Manager. I am trying to use a variable and the expression editor so that I can specify the file name dynamically, but I
    don't know how to write it correctly. I have a variable 'FileLocation' that holds the folder location where the file will be placed. The file, for example: MyFileName201410231230 (MyFileName is always the same, but the date/time will be different).
    Thanks,
    jkrish

    I don't want to use a ForEach Loop because the files are placed by an FTP process 3 times a day at specific times, for example at 10 AM, 12 PM and 3 PM. These times are pretty much fixed. The file name is the same but the extension will carry that day's time stamp. I
    have to load each file only once for a particular reason, meaning I don't want to load the ones I already loaded. I am planning on setting up the SSIS process to load at 10:05, 12:05 and 3:05 daily. The files will be piling up in the folder; as they come,
    I load them. At some point I can remove the old ones so that they won't take up space on the server. In fact, I don't have to keep the old ones at all since they are saved in a different folder anyway. I can ask the FTP process to
    remove the previous one when the new one arrives. So, at any point in time, there will be one file, but that file will have a different extension every time.
    I am thinking of removing the extensions before I load every time. If the file name is 'MyFileNamexxxxxxx.csv', then I want to change it to 'MyFileName.csv' and load it.
    Thanks,
    jkrish
    You WILL need to use it eventually because you need to iterate over each file.
    Renaming is unnecessary as one way or another you will need to put a processed file away.
    And having the file with the original extension intact will also help you troubleshoot.
    Arthur

  • What cable do I need to transfer data directly from my iMac to my MacBook Pro

    What cable do I need to transfer data directly from my iMac to my MacBook Pro?

    Generally, Ethernet or FireWire.

  • How to Parse XML data directly from context variables in webdynpro

    Hello,
       I have two requirements:
    1) I have a context variable which holds a string value.
       I want to write this value into a flat file.
       How do I do this in Web Dynpro? Any sample code for this?
    2) In Web Dynpro, I want to parse and process XML data directly from a string context variable which
       holds the value in XML format.
       How do I achieve this? Any pointers or sample code for this?
    Thanks and Regards,
    Anupama.

    Anupama,
    Here is a link which talks about unpacking XML and converting it to HTML:
    http://help.sap.com/saphelp_nw04/helpdata/en/eb/3dfb402eb5f76fe10000000a1550b0/content.htm
    I have done something like this in portal development and not in Web Dynpro, but in principle it should work everywhere.
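    If this is Web Dynpro ABAP rather than Java, a minimal sketch of both points could look like the code below; the context attribute name XML_STRING and the file name are hypothetical, cl_wd_runtime_services=>attach_file_to_response hands the content to the user as a download (for a server-side file you would use OPEN DATASET instead), and the iXML DOM parser handles arbitrary XML held in a string:
    * 1) Write the string value of a context attribute to a flat file -
    *    here offered to the user as a download via the Web Dynpro runtime.
      DATA: lv_value   TYPE string,
            lv_content TYPE xstring.

      wd_context->get_attribute( EXPORTING name  = 'XML_STRING'   "hypothetical attribute
                                 IMPORTING value = lv_value ).

      CALL FUNCTION 'SCMS_STRING_TO_XSTRING'
        EXPORTING
          text   = lv_value
        IMPORTING
          buffer = lv_content.

      cl_wd_runtime_services=>attach_file_to_response(
        i_filename  = 'data.txt'
        i_content   = lv_content
        i_mime_type = 'text/plain' ).

    * 2) Parse the XML held in the same string with the iXML DOM parser.
      DATA: lo_ixml    TYPE REF TO if_ixml,
            lo_factory TYPE REF TO if_ixml_stream_factory,
            lo_stream  TYPE REF TO if_ixml_istream,
            lo_doc     TYPE REF TO if_ixml_document,
            lo_parser  TYPE REF TO if_ixml_parser.

      lo_ixml    = cl_ixml=>create( ).
      lo_factory = lo_ixml->create_stream_factory( ).
      lo_doc     = lo_ixml->create_document( ).
      lo_stream  = lo_factory->create_istream_string( string = lv_value ).
      lo_parser  = lo_ixml->create_parser( stream_factory = lo_factory
                                           istream        = lo_stream
                                           document       = lo_doc ).
      IF lo_parser->parse( ) = 0.
        " walk the DOM from lo_doc, e.g. read elements by name and process their values
      ENDIF.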

  • How to read data directly from clusters

    Hi all,
    How can I read data directly from HR clusters?
    Thanks in advance,
    Amruta.

    Using the macros:
    RP-IMP-C2-B2.
    RP-IMP-C2-B1.
    RP-IMP-C2-ZL.
    ... etc.
    For the TM cluster, you can also use function modules like HR_TIME_RESULTS_GET.
    For more details, see SAP HR course HR350 (HR Programming).
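    For illustration, a minimal report built around the RP-IMP-C2-B2 macro could look like the sketch below. It follows the usual HR350 pattern; the include names, the B2 key fields and the tables filled by the import (ZES, ZL, SALDO, ...) may differ by release, so verify them in your own system:
    REPORT zhr_read_b2_cluster.
    * Read time evaluation results (cluster B2 on PCL2) with the standard macro.

    TABLES: pernr, pcl1, pcl2.

    INCLUDE rpc2cd00.   "cluster directory definitions
    INCLUDE rpc2b200.   "data definitions for cluster B2 (ZES, ZL, SALDO, ...)
    INCLUDE rpppxd00.   "PCL1/PCL2 buffer: data definitions
    INCLUDE rpppxd10.   "PCL1/PCL2 buffer: common part
    INCLUDE rpppxm00.   "PCL1/PCL2 buffer: macros

    PARAMETERS: p_pernr TYPE persno,
                p_pabrj(4) TYPE n,
                p_pabrp(2) TYPE n.

    START-OF-SELECTION.
      b2-key-pernr = p_pernr.
      b2-key-pabrj = p_pabrj.
      b2-key-pabrp = p_pabrp.
      b2-key-cltyp = '1'.            "1 = time evaluation results

      rp-imp-c2-b2.                  "import cluster B2 into ZES, ZL, SALDO, ...

      IF rp-imp-b2-subrc = 0.
        DATA: l_lines TYPE i.
        DESCRIBE TABLE zl LINES l_lines.
        WRITE: / 'Entries in ZL (time wage types):', l_lines.
      ELSE.
        WRITE: / 'No B2 results found for this key.'.
      ENDIF.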

  • Picking data directly from ALV List

    HI experts !
    I have a scenario in which the client is executing some transactions (some hourly, some daily, some weekly, some monthly) and all the data gets displayed in an ALV list. The requirement is: is it possible for XI to pick that data directly from the ALV output and update the database?
    OR
    If the above is not possible, the client is thinking of putting the data into a spool and then fetching it by running some program.
    Please guide me on this: how can such a scenario be implemented?
    Regards
    saras jain

    Create an outbound interface with the needed data types and message types, and create a client proxy for it.
    Then create a report which gets all this data into internal tables and calls the client proxy to send it to XI. You can have whatever adapter you need on the receiver side. A sketch of the proxy call is shown below.
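    For illustration, the proxy call in that report could look like the sketch below; ZCO_MI_ALV_DATA_OUT and ZMT_ALV_DATA are hypothetical names for the generated proxy class and message type, and EXECUTE_ASYNCHRONOUS is the classic method name for asynchronous XI 3.0 client proxies (check the generated class for the actual name):
    * Report side: fill the message structure and send it to XI via the client proxy.
    DATA: lo_proxy  TYPE REF TO zco_mi_alv_data_out,   "generated proxy class (hypothetical)
          ls_output TYPE zmt_alv_data,                 "generated message type (hypothetical)
          lo_fault  TYPE REF TO cx_ai_system_fault,
          lv_msg    TYPE string.

    * ... move the data collected in the internal tables into ls_output ...

    TRY.
        CREATE OBJECT lo_proxy.
        CALL METHOD lo_proxy->execute_asynchronous
          EXPORTING
            output = ls_output.
        COMMIT WORK.                 "asynchronous proxy messages are sent on commit
      CATCH cx_ai_system_fault INTO lo_fault.
        lv_msg = lo_fault->get_text( ).
        MESSAGE lv_msg TYPE 'E'.
    ENDTRY.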
    VJ

  • Is it possible to select an area on a graph and to delete data directly from it?

    Hi, as written in the message subject, I am interested in the possibility of deleting data directly from a graph, using two cursors on the x-axis or by selecting an area directly.
    The part I don't know how to do is linking the cursor position to the data position in the array. If I can do that, I can use the cursors to select an interval on the x-axis and then, with the help of a control on the front panel, delete the data from the array and obtain a new graph without the selected area.
    Thanks for your kindly attention
    Best regards
    Michele Maria Marotta
    PhD student
    University of Salerno-Italy

    I'm assuming from your question that you have an XY graph built from an array, and I'm also assuming the array is sorted by the X axis.
    Create a property node for the graph (Right-click on the terminal and select Create>>Property node). Now you can either select the Cursor>>Cursor position and Cursor>>Active cursor properties to get the coordinates of the cursor(s), or you can select the Cursor List property to get an array of clusters that holds the data for all the cursors. Assuming you only have two cursors, you can use a for loop and place an Unbundle by name VI in it to get the cursor position. Now that you have the positions for both cursors, go back to the original array and remove the data that's between these values.
    Try to take over the world!

  • Can we send data into different data targets from a single DataSource, and how?

    Hi,
    Can we send data into different data targets from a single DataSource? How?

    Hi,
    If you are on BI 7.0, create a transformation for each target and connect them through DTPs.
    If it is BW 3.5, create transfer rules, load the data to the InfoSource, and then create different update rules for the different targets and load them using InfoPackages.
    If you are talking about loading data from one R/3 DataSource to multiple data targets in BI,
    then follow the steps below:
    1) Create init InfoPackages and run them with different selections to the different DSOs (with the same selection it is not possible).
    2) You will then have different delta queues for the same DataSource in RSA7.
    3) Your delta loads will run fine to the different data targets from the same DataSource in R/3.
    Hope this helps
    Regards,
    Venkatesh

  • Is it possible to record data directly from PXI-5112 scope card through PXI bus to SCSI RAID array (connected to PXI-8210)?

    Colleagues,
    Is it possible to record data directly from PXI-5112 scope card through PXI bus to SCSI RAID array (connected to PXI-8210)?
    Which will be the maximum transfer rate for continuous data recording?
    Thank you,
    Sergey

    Sergey,
    The PXI-8210 can connect to any SCSI 2 compliant device. If the RAID controller is SCSI 2 and appears just like a hard drive in the operating system, then you can send data directly to the RAID array. The problem is that the driver for the PXI-5112 does not yet support continuous acquisition. The on-board 16MB or 32MB buffer stores the data until the entire acquisition is completed. Once the acquisition is complete, all of the data is transferred from the on-board buffer to the hard drive. After that happens, the NI 5112 is ready for another acquisition.
    Best Regards,
    Jace Curtis
    NI Applications Engineering

  • Data Load into two data targets from one DataSource

    Hi,
    I want to load data into two data targets from one DataSource. I did a full load, then the initialization and delta settings for that DataSource. But I want to load data into one data target using a delta InfoPackage and into the 2nd data target using a full-load InfoPackage. Can I schedule these two InfoPackages, one delta and one full load, for the same DataSource simultaneously?
    Regards,
    Pradip

    Hi,
    In BI, you can achieve this. Through the InfoPackage you load data only up to the PSA; from there it is loaded to the other data targets.
    Create two DTPs, one with a full load and another one with a delta load. It will work.
    On the R/3 side, as Sandeep has mentioned, there is a possibility that if you run a full load after the delta, your delta may get corrupted. I am not sure about that, but there is a possibility.
    - Jaimin

  • How do I extract multiple tracks from a CD into a single file in Audition CS6?

    I use Audition to edit my Pastor's messages. The message CD is made with five minute tracks. I want to extract the tracks from the CD into a single file for editing and altering the track lengths. I have done this before in Audition CS6, but I haven't been able to find instructions for it. I only found a note stating this is available in AuditionCC.

    I have discovered an answer to my problem. The way to bring multiple tracks into Audition CS6 from a CD is to use the FILE / OPEN APPEND / TO NEW command. This brings the separate tracks into a single file in Audition CS6. Thanks to everyone who commented.

  • How to convert from SQL Server table to Flat file (txt file)

    I need to ask how to convert a SQL Server table to a flat file (txt file).

    Hi
    1. Import/Export wizard
    2. bcp utility
    3. SSIS
    1. Import/Export Wizard
    The first and most manual technique is the Import/Export wizard. This is great for ad-hoc, just-get-it-in tasks.
    In SSMS right click the database you want to import into.  Scroll to Tasks and select Import Data…
    For the data source we want our zips.txt file. Browse for it and select it. You should notice the wizard tries to fill in the blanks for you. One key thing here with the file I picked is that it uses " as a text qualifier, so we need to make
    sure we add " into the text qualifier field. The wizard will not do this for you.
    Go through the remaining pages to view everything.  No further changes should be needed though
    Hit next after checking the pages out and select your destination.  This in our case will be DBA.dbo.zips.
    Following the destination step, go into the edit mappings section to ensure we look good on the types and counts.
    Hit next and then finish.  Once completed you will see the count of rows transferred and the success or failure rate
    Import wizard completed and you have the data!
    bcp utility
    Method two is bcp with a format file http://msdn.microsoft.com/en-us/library/ms162802.aspx
    This is probably going to win for speed on most occasions but is limited to the formatting of the file being imported.  For this file it actually works well with a small format file to show the contents and mappings to SQL Server.
    To create a format file, all we really need is the type and the count of columns for the most basic files. In our case the qualifier makes it a bit harder, but there is a trick to ignoring it: throw an extra field into the
    format file that references the qualifier but is ignored in the import process.
    Given that, our format file in this case would look like this:
    9.0
    9
    1 SQLCHAR 0 0 "\"" 0 dummy1 ""
    2 SQLCHAR 0 50 "\",\"" 1 Field1 ""
    3 SQLCHAR 0 50 "\",\"" 2 Field2 ""
    4 SQLCHAR 0 50 "\",\"" 3 Field3 ""
    5 SQLCHAR 0 50 "\"," 4 Field4 ""
    6 SQLCHAR 0 50 "," 5 Field5 ""
    7 SQLCHAR 0 50 "," 6 Field6 ""
    8 SQLCHAR 0 50 "," 7 Field7 ""
    9 SQLCHAR 0 50 "\n" 8 Field8 ""
    The bcp call would be as follows
    C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp DBA..zips in "C:\zips.txt" -f "c:\zip_format_file.txt" -S LKFW0133 -T
    Given a successful run you should see this in command prompt after executing the statement
    Starting copy...
    1000 rows sent to SQL Server. Total sent: 1000
    1000 rows sent to SQL Server. Total sent: 2000
    1000 rows sent to SQL Server. Total sent: 3000
    1000 rows sent to SQL Server. Total sent: 4000
    1000 rows sent to SQL Server. Total sent: 5000
    1000 rows sent to SQL Server. Total sent: 6000
    1000 rows sent to SQL Server. Total sent: 7000
    1000 rows sent to SQL Server. Total sent: 8000
    1000 rows sent to SQL Server. Total sent: 9000
    1000 rows sent to SQL Server. Total sent: 10000
    1000 rows sent to SQL Server. Total sent: 11000
    1000 rows sent to SQL Server. Total sent: 12000
    1000 rows sent to SQL Server. Total sent: 13000
    1000 rows sent to SQL Server. Total sent: 14000
    1000 rows sent to SQL Server. Total sent: 15000
    1000 rows sent to SQL Server. Total sent: 16000
    1000 rows sent to SQL Server. Total sent: 17000
    1000 rows sent to SQL Server. Total sent: 18000
    1000 rows sent to SQL Server. Total sent: 19000
    1000 rows sent to SQL Server. Total sent: 20000
    1000 rows sent to SQL Server. Total sent: 21000
    1000 rows sent to SQL Server. Total sent: 22000
    1000 rows sent to SQL Server. Total sent: 23000
    1000 rows sent to SQL Server. Total sent: 24000
    1000 rows sent to SQL Server. Total sent: 25000
    1000 rows sent to SQL Server. Total sent: 26000
    1000 rows sent to SQL Server. Total sent: 27000
    1000 rows sent to SQL Server. Total sent: 28000
    1000 rows sent to SQL Server. Total sent: 29000
    bcp import completed!
    BULK INSERT
    Next, we have BULK INSERT given the same format file from bcp
    CREATE TABLE zips (
    Col1 nvarchar(50),
    Col2 nvarchar(50),
    Col3 nvarchar(50),
    Col4 nvarchar(50),
    Col5 nvarchar(50),
    Col6 nvarchar(50),
    Col7 nvarchar(50),
    Col8 nvarchar(50)
    );
    GO
    INSERT INTO zips
    SELECT *
    FROM OPENROWSET(BULK 'C:\Documents and Settings\tkrueger\My Documents\blog\cenzuszipcodes\zips.txt',
    FORMATFILE='C:\Documents and Settings\tkrueger\My Documents\blog\zip_format_file.txt'
    ) as t1 ;
    GO
    That was simple enough given the work on the format file that we already did.  Bulk insert isn’t as fast as bcp but gives you some freedom from within TSQL and SSMS to add functionality to the import.
    SSIS
    Next is my favorite playground in SSIS
    We can use many methods in SSIS to get data from point A to point B. I'll show you the Data Flow task and the SSIS version of BULK INSERT.
    First create a new Integration Services project.
    Create a new flat file connection by right clicking the connection managers area.  This will be used in both methods
    Bulk insert
    You can use a format file here as well, which is beneficial when moving between methods. This essentially calls the same process, with format file usage. Drag over a Bulk Insert task and double-click it to go into the editor.
    Fill in the information starting with connection.  This will populate much as the wizard did.
    Example of format file usage
    Or specify your own details
    Execute this and again, we have some data
    Data Flow method
    Bring over a data flow task and double click it to go into the data flow tab.
    Bring over a Flat File Source and a SQL Server Destination. Edit the flat file source to use the connection manager ("The file") we already created. Connect the two once they are on the design surface.
    Double click the SQL Server Destination task to open the editor.  Enter in the connection manager information and select the table to import into.
    Go into the mappings and connect the dots, so to speak.
    A typical type-conversion issue is Unicode to non-Unicode.
    We fix this with a data conversion or an explicit conversion in the editor. Data Conversion tasks are usually the route I take. Drag over a Data Conversion task and place it between the connection from the flat file source to the SQL Server destination.
    New look in the mappings
    And after execution…
    SqlBulkCopy Method
    Since we're in the SSIS package, we can use that awesome Script task to show SqlBulkCopy. Not only is it fast, it is also handy for those really "unique" file formats we receive so often.
    Bring over a Script task into the control flow.
    Double-click the task and go to the script page. Click Design Script to open up the code behind.
    Ref.
    Ahsan Kabir Please remember to click Mark as Answer and Vote as Helpful on posts that help you. This can be beneficial to other community members reading the thread. http://www.aktechforum.blogspot.com/

  • Extract work order data from the R/3 system into a flat file (CSV) and export to BI

    Hi,
    I am new to interfaces.
    I need to extract data regarding actual costs and quantities of work assigned to Service Providers from the SAP system and send it to BI for reporting purposes.
    This interface will extract master data as well as transactional data. Extraction of master data will be a full extract, and that of transactional data will be a delta extraction.
    A custom development will extract the data from the different fields and export it to flat files (CSV format). This program will amalgamate all the flat files created into one big file and export it to the BI system.
    Export of the data to the BI system will be done by the scheduling tool Control-M, which will export the data from the SAP system to the BI system in batches. Control-M is expected to run daily at night.
    Please provide the step-by-step process to do this. A minimal sketch of the flat-file export step is shown after this post.
    Thanks
    Suchi
    Moderator message: anything else? please work yourself first on your requirement.
    Edited by: Thomas Zloch on Mar 25, 2011 1:21 PM
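    As an illustration of the flat-file step only, here is a hedged ABAP sketch that writes extracted records as a CSV file to the application server, where a scheduler such as Control-M can pick it up; the structure, the field names and the path are placeholders, not the actual work-order extraction logic:
    * Write extracted work-order records to a CSV file on the application server.
    TYPES: BEGIN OF ty_rec,
             aufnr(12) TYPE c,              "order number (placeholder fields)
             lifnr(10) TYPE c,              "service provider
             menge     TYPE p DECIMALS 3,   "quantity
             wkgbtr    TYPE p DECIMALS 2,   "actual cost
           END OF ty_rec.

    DATA: lt_rec  TYPE STANDARD TABLE OF ty_rec,
          ls_rec  TYPE ty_rec,
          lv_file TYPE string,
          lv_line TYPE string,
          lv_qty  TYPE string,
          lv_amt  TYPE string.

    * ... fill lt_rec with the extracted master and transactional data ...

    CONCATENATE '/interface/out/workorders_' sy-datum '.csv' INTO lv_file.

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc <> 0.
      MESSAGE 'Could not open the output file' TYPE 'E'.
    ENDIF.

    LOOP AT lt_rec INTO ls_rec.
      lv_qty = ls_rec-menge.
      lv_amt = ls_rec-wkgbtr.
      CONCATENATE ls_rec-aufnr ls_rec-lifnr lv_qty lv_amt
             INTO lv_line SEPARATED BY ','.
      TRANSFER lv_line TO lv_file.
    ENDLOOP.

    CLOSE DATASET lv_file.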

    Hi Ravi,
    you've got to set up the message type MDMRECEIPT for the IDoc distribution from R/3 to XI. Check chapter 5.2 in the IT configuration guide available in the MDM Documentation Center (http://service.sap.com/installmdm). It describes the necessary steps.
    BR Michael

  • Data load from two DataSources to InfoCube via common InfoSource in BI 7

    Hi all,
    I need to load data from two different DataSources to one InfoCube. Between this InfoCube and the DataSources there is one InfoSource, common to both DataSources.
    Between the InfoCube and the InfoSource there is one transformation, and between each DataSource and that single InfoSource there are individual transformations.
    I need to move data from each DataSource to the InfoCube via the InfoSource and through all three transformations.
    I tried to create a DTP to load data from DS1 to the InfoCube. It worked fine. But while creating the DTP from the other DataSource to the InfoCube, I am not able to use the transformation which I have created from the InfoSource to the InfoCube. It gets bypassed and the system creates a new transformation from DS2 to the InfoCube.
    Can anybody help me with this?

    Hello all,
    Thanks for the instant replies.
    We need to have the InfoSource because we need currency translation and a multi-stage transformation.
    My question is: when I am creating a DTP for the second DataSource, it is bypassing the transformation which I have created between the InfoSource and the InfoCube.
    It follows that transformation only for the first DataSource and not for the second one, whereas we need that transformation (the transformation between InfoSource and InfoCube) to be common for all DataSources which load data through the InfoSource.

  • Data Upload from Multiple DataSource

    Hi All,
    How can I upload data from multiple DataSources to a single cube in BI 7?
    Kindly provide a step-by-step procedure.
    Regards.

    Hi,
    In BI 7 you still have both data flows present: you can either use the old 3.5 data flow or the new BI 7 data flow.
    SAP still allows organizations to use the old 3.5 data flow until they convert over to the newer flows, so that production isn't disturbed. It takes some effort to convert the 3.5 data flows to the 7.0 flows.
    So, the way it is usually done is: a technical upgrade is done first. Then existing modules are converted over to the 7.0 data flows in a phased manner. Any newer projects that come along are carried out with the 7.0 functionality.
    Even if you install Business Content in a 7.0 system, you'll get 3.5 objects. You need to convert them over to the 7.0 versions.
    Cheers,
    Kedar
