Extracting transaction data using flat files in NW2004s

Hi BW Gurus,
I am trying to load an InfoCube from a flat file. The flat file is in CSV format with a comma delimiter and consists of 20 columns with column headings; 5 of the columns contain only headings and no data.
When I load the data into the DataSource, it reads only the 15 columns that have values. I tried entering values in the empty columns and extracting to the DataSource again, but those values are still not uploaded.
Please help me.
Thanking you,
Ravi

Hi
Please check that your transfer structure matches your flat file.
The order of the fields in your flat file must be the same as in the transfer structure.
Hope it helps you...
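For example, with a hypothetical six-column structure in which columns 4 and 5 are empty, every record in the CSV must still contain all six comma-separated fields so that the remaining values stay aligned with the transfer structure:

  MATERIAL,PLANT,CALMONTH,COL4,COL5,QUANTITY
  M-01,1000,06.2007,,,150
  M-02,1000,06.2007,,,200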
Teja

Similar Messages

  • Deleting master data after loading transactional data using flat file

    Dear All,
    I have loaded transaction data into an InfoCube using a flat file. In the DTP I checked the option "load transactional data without master data exists", so the transaction data was loaded even though no master data existed in BW.
    While loading the flat file I made a mistake for the DIVISION characteristic: the original master data value is '04000', but I loaded the transaction data with the value '4000'. I realized this later when looking at the data in the InfoCube, deleted the request, and reloaded the data with the value '04000'. Up to this point everything is fine.
    But when I look at the master data for DIVISION, I can see a new entry with the value '4000'.
    My question is how to delete the entry '4000' from DIVISION. I tried deleting this entry manually from 'maintain master data', but it does not allow me to do so.
    I have also checked whether any transaction data exists for the value '4000'; as I said, I have already deleted the transaction data with that value. I also tried to delete the entry from the master data table directly, but I do not see an option to delete entries there.
    Please suggest me on this.
    Regards,
    Veera

    Hi,
    Go to RSA1, right-click the InfoObject and select Delete Master Data. This deletes the unused master data entries in the table.
    If this master data is not used anywhere else, delete the master data completely with the SID option.
    If even this does not work, you can delete the entire table in SE14. But this wipes out the whole table, so be sure before you do it.
    Hope this helps
    Akhan.

  • Can I load transaction data through flat files without loading masters?

    Hi all,
    Is it possible to load history data into the sales cube using flat files, without having loaded any master data?
    thanks

    Hi
    You can load the transaction data without loading the master data. You have to check the option in the update rules (update also if no master data exists).
    In reporting you will not be able to see the attribute and text data. It is always advisable to load the master data first, run the apply hierarchy/attribute change, and then load the transaction data.
    AHP: Nice to see you after a long time
    REGards
    Rak

  • How to create the Export Data and Import Data using flat file interface

    Hi,
    Based on the requirement below, please let me know how to export and import data using the flat file interface, and the steps involved.
    BW/BI - Recovery Process for SNP data. 
    For each SNP InfoProvider,
    create:
    1) Export Data:
    1.a)  Create an export data source, InfoPackage, comm structure, etc. necessary to create an ASCII fixed length flat file on the XI ctnhsappdata\iface\SCPI063\Out folder for each SNP InfoProvider.
    1.b)  All fields in each InfoProvider should be exported and included in the flat file. 
    1.c)  A process chain should be created for each InfoProvider with a start event. 
    1.d)  If the file exists on the target drive it should be overwritten. 
    1.e)  The exported data file name should include the InfoProvider technical name.
    1.f)  Include APO Planning Version, Date of Planning Run, APO Location, Calendar Year/Month, Material and BW Plant as selection criteria.
    2) Import Data:
    2.a) Create a flat file source system, InfoPackage, comm structure, etc. necessary to import ASCII fixed length flat files from the XI ctnhsappdata\iface\SCPI063\Out folder for each SNP InfoProvider.
    2.b)  All fields for each InfoProvider should be mapped and imported from the flat file.
    2.c)  A process chain should be created for each InfoProvider with a start event. 
    2.d)  The file should be archived in the ctnhsappdata\iface\SCPI063\Archive directory. Each file name should have the date appended in YYYYMMDD format. Each file should be deleted from the \Out directory after it is archived.
    Thanks in advance.
    Tyson
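    For orientation on step 1.a, here is a minimal ABAP sketch of writing a fixed-length ASCII file to an application-server path. The path, structure and field lengths below are purely hypothetical, and in a real setup the export DataSource / process chain would drive this rather than a standalone report:

        TYPES: BEGIN OF ty_rec,
                 matnr TYPE c LENGTH 18,
                 plant TYPE c LENGTH 4,
                 menge TYPE c LENGTH 17,
               END OF ty_rec.

        DATA: lt_data TYPE STANDARD TABLE OF ty_rec,
              ls_data TYPE ty_rec,
              lv_file TYPE string,
              lv_line TYPE c LENGTH 39.

        lv_file = '/iface/SCPI063/Out/ZSNP_DEMO.txt'.   " hypothetical target path
        " lt_data would be filled from the SNP InfoProvider beforehand.

        OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
        LOOP AT lt_data INTO ls_data.
          " Fixed-length record: each field sits at a fixed offset, no delimiter.
          lv_line+0(18)  = ls_data-matnr.
          lv_line+18(4)  = ls_data-plant.
          lv_line+22(17) = ls_data-menge.
          TRANSFER lv_line TO lv_file.
        ENDLOOP.
        CLOSE DATASET lv_file.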

    Here's some info on working with plists:
    http://developer.apple.com/documentation/Cocoa/Conceptual/PropertyLists/Introduction/chapter1_section1.html
    They can be edited with any text editor. Xcode provides a graphical editor for them - make sure to use the .plist extension so Xcode will recognize it.

  • Problem in data sources for transaction data through flat file

    Hello Friends,
    While creating the DataSource for transaction data from a flat file, I am getting the following error: "Error 'The argument '1519,05' cannot be interpreted as a number' while assigning character to application structure", Message no. RSDS016.
    If anyone has come across this issue, please provide me the solution.
    Thanks in Advance.
    Regards
    Ravi

    Hello,
    just for information:
    I had the same problem.
    I changed the field type from CURR to DEC and set 'external' instead of 'internal'.
    Then the import with the flat file worked fine.
    Thank you.

  • Loading transaction data from flat file to SNP order series objects

    Hi,
    I am a BW developer and I need to provide data to my SNP team.
    Can you please tell me more about loading transaction data (sales orders, purchase orders, etc.) from external systems into order-based SNP objects/structures? There is a 3rd party tool called webconnect that gets data from external systems and can deliver it as flat files or in database tables, in whatever format we require.
    I know we can use BAPIs, but I don't know how. Can you please send sample ABAP program code that calls a BAPI to read a flat file and write to SNP order series objects?
    Please let me know as soon as possible how to get data from a flat file into SNP order-based objects, with the possible options, and I will be very grateful.
    thanks in advance
    Rahul

    Hi,
    Please go through the following links:
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4180456
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4040057
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=3832922
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4067999
    Hope this helps...
    Regards,
    Habeeb
    Assign points if helpful..:)
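    For the flat-file half of the question, here is a minimal ABAP sketch that reads a comma-separated file from the application server into an internal table. The file path and record layout are hypothetical, and the subsequent call to an SNP order series BAPI is deliberately left as a placeholder (the links above cover the BAPI side):

        TYPES: BEGIN OF ty_order,
                 ordno TYPE c LENGTH 10,
                 matnr TYPE c LENGTH 18,
                 menge TYPE c LENGTH 15,
               END OF ty_order.

        DATA: lt_orders TYPE STANDARD TABLE OF ty_order,
              ls_order  TYPE ty_order,
              lv_file   TYPE string,
              lv_line   TYPE string.

        lv_file = '/interfaces/webconnect/orders.csv'.   " hypothetical path

        OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
        DO.
          READ DATASET lv_file INTO lv_line.
          IF sy-subrc <> 0.
            EXIT.
          ENDIF.
          SPLIT lv_line AT ',' INTO ls_order-ordno ls_order-matnr ls_order-menge.
          APPEND ls_order TO lt_orders.
        ENDDO.
        CLOSE DATASET lv_file.

        " lt_orders would now be handed to the appropriate SNP order series BAPI;
        " the exact BAPI and its interface are not shown here.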

  • Error in loading transaction data from flat file

    hi everybody
    while I try to load data from a flat file, at the time of scheduling it shows the error "check load from InfoSource", and while monitoring the data it shows the error "the corresponding data packets were not updated using the PSA". What can I do to rectify the problem?
    thanks in advance

    Hi,
    Please check whether the data is in the correct format using the "Preview" option in the InfoPackage.
    Check whether all the related AWB objects are active.
    Regards,
    K.Manikandan.

  • Extract Master Data to Flat File

    Hello
    Do you know an easy way to extract master data into a flat file? I've tried using 'maintain master data', but my master data table is too large to display this way.

    Dear Satya,
    1) You can simply write an ABAP program to access the P table ( /BIC/P<InfoObject> ).
    2) Use InfoSpokes:
    RSBO -> ZDOWNLOAD -> Create -> choose the data source as InfoObject with attributes,
    then select your fields and selections, specify the flat file destination, and that's it.
    regards,
    Hari
    Message was edited by: Hari Kiran Y
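    As an illustration of option 1, here is a minimal ABAP sketch that reads the active records of a hypothetical InfoObject's P table /BIC/PZCUST and downloads them as a CSV file with GUI_DOWNLOAD. Table and field names are placeholders; substitute your own /BIC/P<InfoObject> table and attributes:

        DATA: lt_p    TYPE STANDARD TABLE OF /bic/pzcust,   " hypothetical P table
              ls_p    TYPE /bic/pzcust,
              lt_file TYPE STANDARD TABLE OF string,
              lv_line TYPE string.

        SELECT * FROM /bic/pzcust INTO TABLE lt_p
          WHERE objvers = 'A'.                              " active records only

        LOOP AT lt_p INTO ls_p.
          CONCATENATE ls_p-/bic/zcust ls_p-/bic/zregio      " hypothetical fields
            INTO lv_line SEPARATED BY ','.
          APPEND lv_line TO lt_file.
        ENDLOOP.

        CALL FUNCTION 'GUI_DOWNLOAD'
          EXPORTING
            filename = 'C:\temp\zcust_master.csv'
            filetype = 'ASC'
          TABLES
            data_tab = lt_file.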

  • How to use 'flat file extraction' option in Infospoke

    Hello Gurus,
    We are trying to extract BW data using an InfoSpoke to a 3rd party ETL tool (DataStage). In the current BW 3.5 release the customer is on SP11, and we are not able to extract any data using the 3rd party system option in the InfoSpoke. We found out from SAP that, since we are getting the error "3rd party system is not set" in the InfoSpoke, we have to upgrade the SP level from 11 to either 16 or 17. The customer we are dealing with is not going to do that until July 2007.
    So now we have a second option: load the flat file on the BW application server and FTP it to DataStage. Is this a workable option?
    When using that option within the InfoSpoke destination tab, we can see two options. One is to use the File option and load the data into the standard directory DIR_HOME. I tried that, but I can't see the file and the entire data within the standard directory. Can anyone explain why, and what procedure to follow?
    The second option is to use a logical file. I have never done that before. Can anyone explain the step-by-step process or provide a detailed document and/or link?
    Since we are at a critical point in the timeframe, I would appreciate a fast response from the gurus.
    I will make sure to give 'reward points' to each and every helpful answer(s).
    thanks in advance,
    Maulesh

    Hi shashi,
    first, let me clarify one thing here! I am not saying '3rd party tool' is an option in the InfoSpoke. I know the InfoSpoke provides only two options (DB table and file). But when you use the DB table option, you check the 3rd party destination checkbox, and that is where you define an RFC destination for the 3rd party tool. So when you execute the InfoSpoke, it starts extracting data and loads it into the defined destination. Since we are on SP11 of the BW 3.5 release, we are getting an error while executing the InfoSpoke with the DB table option. The error is '3rd party destination is not set'. When I searched the Marketplace I found an OSS note saying that we need an upgrade of the SP level (up to 17).
       As I mentioned earlier, the client doesn't want to upgrade until next year, and we are at a critical point in the timeframe where we have to decide whether to hold this project until the SP upgrade happens or to find an alternate approach where we can extract the data from BW (somehow) and load it into the Teradata database via the DataStage BW Pack (3rd party ETL).
       So we might be able to use the second option in the InfoSpoke destination, which is the 'Flat file' load.
       Now when I use the File option with 'Application server' checked, the data loads into the default server and directory on BW, which is DIR_HOME. I can see the extracted files in that directory in transaction AL11, but I can't see any data. Do you think I have to request an increase in the size limit? Once I can see all the data, how can I FTP it from the BW server? Do I have to work with SAP Basis or create a UNIX script?
       When I use a logical file name, I guess I have to follow what you described in your response, correct? Can I use an existing logical path and file name? Do you have a 'How to' document on creating the YEAR and PER FM? Is it possible that Basis can do this work for us? What happens after we assign a logical path and logical file name in the destination tab of the InfoSpoke? What is the use of a logical file versus a file? Overall, I am looking for the process of extracting the data from the InfoSpoke using either a file or a logical file, all the way to loading it into the 3rd party database.
      Looking forward to your response!
    thanks,
    Maulesh
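    On the logical file question: logical paths and logical file names are maintained in transaction FILE, and at runtime the physical path is resolved with the standard function module FILE_GET_NAME. A minimal sketch, where the logical file name Z_DATASTAGE_EXPORT is a hypothetical example:

        DATA: lv_physical TYPE filename-fileextern.

        CALL FUNCTION 'FILE_GET_NAME'
          EXPORTING
            logical_filename = 'Z_DATASTAGE_EXPORT'   " hypothetical logical file name
          IMPORTING
            file_name        = lv_physical
          EXCEPTIONS
            file_not_found   = 1
            OTHERS           = 2.

        IF sy-subrc = 0.
          " lv_physical now holds the resolved application-server path
          " (for example somewhere under DIR_HOME), which the InfoSpoke
          " writes to and which an FTP script can pick up.
          WRITE: / lv_physical.
        ENDIF.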

  • Use LINQ to extract the data from a file...

    Hi,
    I have created a subprocedure CreateEventList which populates an EventsComboBox with the current day's events (if any).
    I need to store the events in a generic List communityEvents, which is a collection of CommunityEvent objects. This List needs to be created and assigned to the instance variable communityEvents.
    This method should call a helper method ExtractData, which will use LINQ to extract the data from my file. The specified day is the date selected on the calendar control. This method will be called from CreateEventList.
    This method should clear all data from List communityEvents.
    A LINQ query that creates CommunityEvent objects should select the events scheduled for the selected day from the file. The selected events should be added to List communityEvents.
    See code below.
    Thanks,
    public class CommunityEvent
    {
        private int day;
        public int Day
        {
            get { return day; }
            set { day = value; }
        }

        private string time;
        public string Time
        {
            get { return time; }
            set { time = value; }
        }

        private decimal price;
        public decimal Price
        {
            get { return price; }
            set { price = value; }
        }

        private string name;
        public string Name
        {
            get { return name; }
            set { name = value; }
        }

        private string description;
        public string Description
        {
            get { return description; }
            set { description = value; }
        }
    }

    // Event handler inside the form class.
    private void eventComboBox_SelectedIndexChanged(object sender, EventArgs e)
    {
        if (eventComboBox.SelectedIndex == 0)
            descriptionTextBox.Text = "2.30PM. Price 12.50. Take part in creating various types of Arts & Crafts at this fair.";
        if (eventComboBox.SelectedIndex == 1)
            descriptionTextBox.Text = "4.30PM. Price 00.00. Take part in cleaning the local Park.";
        if (eventComboBox.SelectedIndex == 2)
            descriptionTextBox.Text = "1.30PM. Price 10.00. Take part in selling goods.";
        if (eventComboBox.SelectedIndex == 3)
            descriptionTextBox.Text = "12.30PM. Price 10.00. Take part in a game of rounders in the local Park.";
        if (eventComboBox.SelectedIndex == 4)
            descriptionTextBox.Text = "11.30PM. Price 15.00. Take part in an Egg & Spoon Race in the local Park";
        if (eventComboBox.SelectedIndex == 5)
            descriptionTextBox.Text = "No Events today.";
    }

    Any help here would be great.
    Look, you have to make the file an XML file type -- Somefilename.xml.
    http://www.xmlfiles.com/xml/xml_intro.asp
    You can use NotePad XML to make the XML and save the text file.
    http://support.microsoft.com/kb/296560
    Or you can just use Notepad (standard), if you know the basics of how to create XML, which is just text data that can created and saved in a text file, which, represents data.
    http://www.codeproject.com/Tips/522456/Reading-XML-using-LINQ
    You can do a (select new CommunityEvent) just like the example is doing a (select new FileToWatch) and load the XML data into the CommunityEvent properties.
    So you need to learn how to make a manual XML text file with XML data in it, and you need to learn how to use LINQ to read the XML. LINQ is not going to work against some flat text file you created. There are plenty of examples out on Bing and Google on how to use LINQ-to-XML.
    http://en.wikipedia.org/wiki/Language_Integrated_Query
    <copied>
    LINQ extends the language by the addition of query expressions, which are akin to SQL statements, and can be used to conveniently extract and process data from arrays, enumerable classes, XML documents, relational databases, and third-party data sources. Other uses, which utilize query expressions as a general framework for readably composing arbitrary computations, include the construction of event handlers or monadic parsers.
    <end>

  • Data Extraction or IDoc to flat file

    hi,
    I have a project to create a flat file from SAP for an external legacy system.
    What approach should I take: simple data extraction or IDoc to flat file?
    There are three requirements:
    1. The first time, extract all data.
    2. On subsequent runs, extract only changed and new records.
    3. If a record is deleted in the SAP table, mark it as deleted in the flat file.
    What approach should I take if I use data extraction?
    Thanks.

    I read your question; my first thought would be to look at where the data is going and what the data requirements of the legacy system are. IDocs can speed up the development related to pushing the data out from SAP: using ALE and change pointers you can automatically pass out the delta with a limited amount of development.
    However, the receiving system then needs to parse the IDoc data. Depending on the IDoc you are working with, this can be a challenge, especially if the legacy developer doesn't understand IDocs.
    Sometimes it is easier to collect and write the data from SAP using "simple data extraction"; the data is more readily organized into a format the receiving system expects.
    You can also pass the IDoc to a middleware mapping application, if one is available, and do the SAP-to-legacy mapping there.
    Cheers

  • BPC 7.5 will extract data from flat file or directly from cube

    Hi,
    I have a doubt, please give me a solution.
    1) Previously our BPC 7.0 team extracted data from BI 7.0 through flat files, but now they have upgraded to BPC 7.5. I want to know whether BPC 7.5 will extract directly from our BW InfoCube or not.
    2) Please let me know how data is extracted with BPC 7.5.
    Thanks and Regards
            Satish

    Hi,
    If your import file has duplicate records in it, the status of your Import DM package ends with a warning: all the records in your file may be accepted, but only the last data row among the duplicate rows is submitted to the system.
    The duplicate records will be rejected, and you can see the list of duplicate records in the DM package status report as well.
    Like Nilanjan said, you can use the Append DM package to submit all duplicate records into the system as well. But here the problem is that it appends the data in the flat file to the data already available in the system.
    Say you have 100 for account1 in the system and you run the Append DM package with duplicate values for account1, say 100 and 200. After the Append package execution you will have 100 + 100 + 200 = 400 for account1.
    So for the Append DM package to work in the same way as Import, run the Clear package before executing the Append package.
    Hope this helps,
    Regards,
    G.Vijaya Kumar

  • Not able to extract performance data from .ETL file using xperf commands. getting error "Events were lost in this trace. Data may be unreliable ..."

    Not able to extract  performance data from .ETL file using xperf commands.
    Xperf Commands:
    xperf -i C:\TempFolder\Test.etl -o C:\TempFolder\BootData.csv -a process
    Getting following error after executing above command:
    "33288636 Events were lost
    in this trace. 
    Data may be unreliable
    This is usually caused
    by insufficient disk bandwidth for ETW lo
    gging.
    Please try increasing the minimum
    and maximum number of buffers
    and/or
                    the buffer size. 
    Doubling these values would be a good first at
    tempt.
    Please note, though, that
    this action increases the amount of me
    mory
                    reserved
    for ETW buffers, increasing memory pressure on your sce
    nario.
    See "xperf -help start"
    for the associated command line options."
    I changed the page file size but it does not work for me.
    Does anyone have an idea how to solve this problem and extract the ETL file data?

    I want to mention one point here: I have 4 machines in total, and on 3 of them the above commands work properly. Only one machine has this problem.
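    For reference, the buffer settings mentioned in the error are applied when the trace is started, not when it is decoded. A hypothetical example with increased buffer values (confirm the exact option names with "xperf -help start"):

        xperf -on Base -BufferSize 1024 -MinBuffers 128 -MaxBuffers 256 -f C:\TempFolder\Test.etl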
    Hi,
    You can try to use xperf to collect the trace ETL file again and see whether it can be extracted on this computer.
    Refer to following articles:
    start
    http://msdn.microsoft.com/en-us/library/windows/hardware/hh162977.aspx
    Using Xperf to take a Trace (updated)
    http://blogs.msdn.com/b/pigscanfly/archive/2008/02/16/using-xperf-to-take-a-trace.aspx
    Kate Li
    TechNet Community Support

  • How often in real-time projects do you extract data from flat files and process them?

    I am going through the BODS Data Integrator and trying to understand the demand for ETL services that extract data from a flat file. Is that really important in real-time jobs?
    Thank you very much for the helpful info.

    Hi,
    As per the inputs given by you guys I started loading data from flat files.
    I tried to load 28 files, of which I was able to load 24 successfully. For the other 4 I got these error messages:
    1) Error 'Enter period in the format __.YYYY...' at conversion exit CONVERSION_EXIT_PERI6_INPUT (field CALMONTH record 1, value DUMYTRA)
    Message no. RSDS012
    2)  a) Error 'The argument '1,008.00' cannot be interpreted as a number' on assignment field QUANT_B record 11714 value 1,008.00
    Message no. RSDS013
       b) Error 'The argument '1,110.00' cannot be interpreted as a number' on assignment field QUANT_B record 15374 value 1,110.00
    Message no. RSDS013
    3) a) Error 'The argument '1,140.00' cannot be interpreted as a number' on assignment field QUANT_B record 1647 value 1,140.00
    Message no. RSDS013
       b) Error 'The argument '2,028.00' cannot be interpreted as a number' on assignment field QUANT_B record 4625 value 2,028.00
    Message no. RSDS013
    4) Error 'The argument '1,151.00' cannot be interpreted as a number' on assignment field QUANT_B record 7808 value 1,151.00
    Message no. RSDS013
    I am unable to trace out what exactly the error is.
    I checked these values in the files and they look perfect.
    can anybody please guide me on this issue.
    With Regards,
    Pradeep.B
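    One possible cause, stated here only as an assumption (the thread does not confirm it): the RSDS013 messages suggest the amount fields contain a thousands separator that the DataSource does not expect, and the RSDS012 message suggests a non-date value in the CALMONTH column. Hypothetical examples of values that would convert cleanly:

        CALMONTH: 06.2007        (not DUMYTRA)
        QUANT_B : 1008.00        (not 1,008.00, unless the thousands separator is configured)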

  • Steps to load the data by using flat file for hierarchies in BI 7.0

    Hi Gurus,
    please give me the steps to load hierarchy data using a flat file in BI 7.0.

    Hi,
    you will find the steps in the following blog by Prakash Bagali:
    Hierarchy Upload from Flat files
    regards,
    Rathy
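    For orientation, a hierarchy flat file typically carries one row per node with columns such as NODEID, INFOOBJECT, NODENAME, LINK and PARENTID; the exact layout depends on the hierarchy settings, and the blog above shows the full procedure. A purely hypothetical two-level example:

        NODEID,INFOOBJECT,NODENAME,LINK,PARENTID
        1,0HIER_NODE,TOTAL,,0
        2,ZREGION,NORTH,,1
        3,ZREGION,SOUTH,,1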

Maybe you are looking for

  • How to enable javascript in windows server 2008 R2 enterprise ?

    Hi all, Hope doing well, sir i am using windows server 2008 R2 enterprise operating system. i am running my web application in localhost. it's working fine. but i have used timer using javascript and some javascript function that is not working in th

  • Getting: Unexpected Error when trying to upload files with my post!

    Hello, i tried today to post a reply to someone`s message t the forums but when i tried to attach 2 .php files i get unexpected error message and prevented me from posting the reply. but when i delete or reloaded the page, i shows me an error message

  • Fm modulator causing screen discoloration

    I purchased a 20 gig Ipod Photo less than 2 weeks ago. I have tried 2 types of fm modulators in my car, a Belkin that plugs into the headphone jack, and a DLO Transpod docking type. The problem I have is that when I used either transmitter, it would

  • Reversal biling/accounting document

    I have a billing document, I have cancelled it Since it was earlier released to accounting, I want to send this cancelled document for reversal from accounting. So I went to OBA7, where I dont find document S1...how to create that?? Is there any othe

  • HT5493 Java update and reverting to 1.6

    Good afternoon. After installing the java update this week I've realised it makes my Macbook useless for work. I regularly rely on a Corporate Information System at my University to access student records. This system is incompatible with Java 7, upo