Problem in Flat file transaction data loading to BI7

Hi Experts,
I am new to BI7 and have a problem activating the DataSource for transaction data and creating a transfer routine to calculate sales revenue, which is not in the flat file; the flat file only has price per unit and quantity sold.
Flat file fields are:
Cust ID   SRep ID   Mat ID   Price per unit   Unit measure   Quantity   Tr. Date
cu100     dr01      mat01    50               CS             5          19991001
The InfoObjects created are:
IO_CUID
IO_SRID
IO_MATID
IO_PRC
IO_QTY
IO_REV
0CALDAY
When I created IO_QTY, the unit/currency was given as 0UNIT. In the DataSource creation, under the Fields tab, an extra field CS was shown and I was confused about which InfoObject to map it to. At the same time, when I enter IO_QTY the system prompts a dialog box with 0UNIT, so I confirmed it. Now I can't map CS to a unit, because the system automatically created 0UNIT when I entered IO_QTY. Could you please give me a solution for this, and the procedure to enter a formula for calculating sales revenue at this level instead of at query level? Your solutions will be more appreciated with points.
Awaiting your solutions.
Thanks
Shrinu

Hi Sunil,
Thanks for your quick response. I tried to assign the InfoObjects under the Fields tab while creating the DataSource. I have not reached the data target yet.
Could you please answer my whole question?
Would you be able to send some screenshots to my email ID [email protected]?
Thanks
Shri

Similar Messages

  • Flat file transactional data load error

    I am trying to load a flat file into a BPC model. I am using the DM package "Import transactional data load from flat file" to load the data. The transformation file and data file are validated correctly.
    However, when I run the DM package, I get the message below.
    Task Name : Load
    Cannot perform Read.
    Model : Package Status : Error.
    We have two secured dimensions in the model. I have tried different combinations, even with the primary admin role, and I am still getting the same error message.
    Is this a security-related error? The model has been transported from DEV. In DEV, I am not facing any errors.
    Any advice/help?

    Hi King,
    I think in the back end you need to check the real-time load behavior option: is the cube in planning mode or loading mode? If possible, share a screenshot of your error message.
    Go to RSA1 ---> select your cube ---> right-click ---> Change Real-Time Load Behavior ---> change it to planning mode.
    Regards,
    Saida Reddy Gogireddy

  • Problem in loading of flat file transaction data into BI7

    Hi Experts,
    I am new to BI7 and have run into a problem creating the DataSource.
    The fields in the flat file are:
    CUSTID,SREPID,MATID,PRICE PER UNIT,UNIT,QUANTITY,TRANS.DATE
    One of the rows under these fields is:
    C100,SR100,MAT01, 50,CS,5,19991005
    Created infoobjects are:
    IO_CUSID
    IO_SRID
    IO_MATID
    IO_PRC
    IO_QTY
    IO_REV
    0CALDAY
    IO_QTY was created with unit/currency '0UNIT'.
    In the DataSource creation, under the Fields tab, the system showed an extra field CS, and I was confused about which InfoObject should be mapped to this field. At the same time, since I created IO_QTY with 0UNIT, when I entered IO_QTY under the InfoObjects column the system automatically added the 0UNIT InfoObject. I didn't completely understand why the unit (CS) sits under the fields column. Could you clarify this for me, please?
    Could you also please tell me the procedure for creating a transfer routine for sales revenue, as we don't have this object in the flat file? In the earlier version of BW (Addison Wesley Step by Step book) I did this at InfoSource level. Here I am loading the data without an InfoSource, so I would like to know whether it is possible to create a transfer routine at DataSource level, or whether I need to create an InfoSource in this case. If so, could you please also walk me through the procedure to create a transfer routine for sales revenue?
    Your solutions will be more appreciated with points.
    Thank you,
    Shri

    Hi,
    1. Open the Administrator Workbench: Modeling, from the menu or using transaction RSA1.
    2. Go to Source Systems (File) and create a new source system.
    3. Double-click on DataSources and create a DataSource.
    4. Name the DataSource and choose Transaction Data as the DataSource data type.
    5. On the next screen, go to Extraction and activate the DataSource.
    6. Go to Preview and press Read Preview Data.
    7. The data will be loaded: save and activate the DataSource.
    8. Go to InfoProvider, go to your InfoArea and create an InfoCube.
    9. Name the InfoCube and press Create.
    10. Display all InfoObjects, choose yours, move the characteristics and key figures into the InfoCube by drag and drop, and activate.
    11. Go to the InfoCube and create a transformation.
    12. Click on the 'Show Navigator' button and match the fields in the navigator; this is also where a routine for sales revenue can be added (see the sketch below). Save and activate.
    13. Go to InfoProvider, your InfoCube, and create a Data Transfer Process.
    14. Choose Extraction Mode: Full; Update: Valid records update, reporting possible (request green). Activate and execute.
    15. Double-click on the DataSource and press Read Preview Data.
    16. Create an InfoPackage by right-clicking on the DataSource and execute the load into the PSA.
    17. Verify that data was loaded into the PSA before proceeding with the next step.
    18. Execute the Data Transfer Process.
    19. Monitor the results (menu: Goto -> DTP Monitor).
    20. Try to display the data in the BW front end.
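    To derive the sales revenue in the transformation itself (rather than at query level), add a routine to the rule for IO_REV in step 12. Below is a minimal sketch of the routine body, assuming the flat-file DataSource fields end up named PRICE and QUANTITY (check the component names of SOURCE_FIELDS generated in your system and adjust); the SOURCE_FIELDS structure and the RESULT variable are supplied by the generated routine frame:
    " Routine body for the rule of key figure IO_REV (BI 7.x transformation).
    " PRICE and QUANTITY are assumed component names of SOURCE_FIELDS;
    " adjust them to the field names of your DataSource.
    RESULT = source_fields-price * source_fields-quantity.
    If IO_REV carries a currency or unit, supply it in the same rule as well (for example as a constant or from a source field) so the key figure is stored consistently.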
    Thanks,
    Sankar M

  • Output in flat file master data loading

    Hi gurus,
    I am loading master data into an InfoObject from a flat file. In the output, for the headers (InfoObjects) I see only the long descriptions, not the technical names I gave. How can I see the technical names of the InfoObjects in the output?

    Hi Ram,
    I am not getting your question clearly.
    What I understood is:
    You are able to see the description in the preview on the External Data tab, but you are not able to get the corresponding values after loading. (I went through the same problem.)
    Solution:
    Check the language field. Set it to English (E) and try to load the package again.
    Note: please check the preview and the simulated preview as well.
    Let me know if the problem persists.
    Regards
    Vinay

  • Flat File: no data load into Info Cube

    Hi there,
    I am trying to load a flat file. When I simulate the upload it works well, but no data is loaded into my InfoCube. When I try to define a query, none is available.
    Can someone provide me with a solution for this problem?
    With rgds
    Oktay Demir

    Hi Oktay,
    in addition to A.H.P.'s remarks, check whether
    - data is posted not only into the PSA but also into the data target,
    - the update rules are active,
    - the monitor status in the cube administration is green,
    - the request is available for reporting within the cube administration.
    Cheers
    Sven

  • Extraction, Flat File, Generic Data Source, Delta Initialization

    I have a couple of questions regarding data extraction.
    1. If the DataSource is a flat file, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates: by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables so it becomes an SAP source?
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calendar Day and Numeric Pointer? Which one is most commonly selected?
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking?
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe you can do it two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? What is the most common method and why?
    5. If you want to add a field to a DataSource after using it for 6 months, you want to do it without re-initializing the delta queue. You add the field in RSA6, then provide the info for ABAP to populate the new field (the name of the DataSource, the extract structure, the field added, the name of the application table which contains the field). How does it work now that there is no setup table, as it has been deleted after initialization? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    THANKSSSSSSSSSs

    Hi,
    1. If the DataSource is a flat file, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates: by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables so it becomes an SAP source?
    Once you create a DataSource for flat file extraction it is specific to the file source system, so you cannot change it to an application-table-based DataSource.
    In the InfoPackage you can change the source to the application server instead of the desktop; there is no need to change the DataSource.
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calendar Day and Numeric Pointer? Which one is most commonly selected?
    When we don't find any standard extractor, we go for a generic one (for example, if I want sales information together with finance information in one DataSource, there is generally no standard extractor, so we go for a generic DataSource).
    Check the link below for more about generic DataSources:
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    For delta capturing you can use:
    Timestamp - if the table has a timestamp field that marks the last-changed records, it is easy to get the delta based on the timestamp (see the sketch below).
    Calday - if the table doesn't have a timestamp field, look for a calendar-day field so the delta can be determined from the date on which documents were changed.
    Numeric pointer - if the table has neither of the above, use this option, where the delta is determined from an ascending numeric value.
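    To make the timestamp option concrete, here is a rough conceptual sketch of the selection a timestamp-based generic delta performs on the source table. The table ZSALES_DOC and the field CHANGED_AT are made up for illustration only; the real selection is generated from the delta-relevant field you specify when you create the generic DataSource in RSO2.
    " Illustrative only: conceptual selection of a timestamp-based delta.
    " ZSALES_DOC and CHANGED_AT are hypothetical names.
    DATA lv_last_run TYPE timestamp.
    DATA lt_delta    TYPE STANDARD TABLE OF zsales_doc WITH DEFAULT KEY.
    lv_last_run = '20100101120000'.  " in a real extraction: the saved upper limit of the previous delta run
    SELECT * FROM zsales_doc INTO TABLE lt_delta
      WHERE changed_at > lv_last_run.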
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking?
    A generic DataSource simply extracts data directly from the database table, without any interface between the application and the extraction, so no setup table is involved.
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe you can do it two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? What is the most common method and why?
    Correct.
    5. If you want to add a field to a DataSource after using it for 6 months, you want to do it without re-initializing the delta queue. You add the field in RSA6, then provide the info for ABAP to populate the new field (the name of the DataSource, the extract structure, the field added, the name of the application table which contains the field). How does it work now that there is no setup table, as it has been deleted after initialization? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Once you add the new field to the structure (DataSource) you will get data for it only from that date onwards, not historical data, so the setup table is not the point here (delta records come from the delta queue, not from the setup table).
    If you want historical data for the new field, then you need setup-table deletion and refill, etc.
    Hope that is clear.
    Regards,
    Satya

  • Transactional data loads PIR, IM stock, Open PO's documentation

    I have to document the process of transactional data loads.
    Transactional data loads: purchase info records, IM stock, open POs.
    The transactional data is live in both SAP and the legacy system, so they have to be in sync in both systems at all stages of this process.
    How do I maintain that? That is the question.
    Please send me any details regarding this.
    thank you
    sridhar

    Check these three things:
    /n/sapapo/CCR
    /n/sapapo/CQ
    Check what type of stock has active IM, and what type of stock went in after you created the GR.
    If you still have any problem, let us know.
    My

  • Flat File - Delete data

    Hi Expert,
    I have a flat file as data source. I want to know how to handle deletes with flat files.
    For example,
    If this is my flat file data
    Company...Costcenter....Stock
    A......B......10
    M.....Z.......20
    K......W......25
    Company and Costcenter are keys.
    If the above three records were loaded into an ODS last month and this month I realize that the record A..B...10 is wrong and is not needed, how do I delete it from the ODS? Do I need to send some indicator in the flat file for BW to understand that this is a delete entry?
    Another case: if the record A..B changes its value from 10 to 35, then I can send the same file, or the same record in a different file, again and it will overwrite the data, which is OK.
    Can experts please explain these data load strategies?
    Jason

    Hello,
    For case 1: post a reversal of the record you wish to delete from the ODS, i.e. A..B...-10.
    For case 2: if you are loading data only up to the ODS, then your ODS has an active data table and a change log table, so the new record A..B..35 will overwrite the existing record A..B..10. If you are doing a further update to a cube, the change log will first create a reverse image of the existing record, i.e. A..B..-10, and then write the new record A..B..35.
    So: A..B..10
        A..B..-10
        A..B..35  and what remains is A..B..35.
    This works provided you maintain the delta method for your flat file.
    -- EnjoySAP :-)
    **Award points if you find this the solution. Have a great day.

  • Getting Problem in Flat File to IDoc Scenario.

    Hi,
    I am facing a problem, which is as follows.
    The test flat file contains data as follows:
    Field1,Field2,Field3
    Field4,Field5,Field6
    Field1,Field2,Field3
    Field4,Field5,Field6
    Field1,Field2,Field3
    Field4,Field5,Field6
    The source structure has only one segment containing all six fields, and it repeats.
    What should the file content conversion parameters be so that it picks the first two lines and converts them into one segment in this way? (I can't create two segments.)
    Thanks in advance.

    Hi,
    I agree with Vijay, so try to create the data type like this:
    Recordset 0..1
    -----Line1 0..unbounded
    --------Field1 0..1
    --------Field2 0..1
    --------Field3 0..1
    -----Line2 0..unbounded
    --------Field4 0..1
    --------Field5 0..1
    --------Field6 0..1
    Now do your FCC
    Recordset Structure ---- Line1,*,Line2,*
    Recordset per Message ---- 1
    Line1.fieldNames ----- Field1,Field2,Field3
    Line1.fieldSeparator ---- ,
    Line1.endSeparator ----- 'nl'
    Line2.fieldNames ----- Field4,Field5,Field6
    Line2.fieldSeparator ---- ,
    Line2.endSeparator ----- 'nl'
    Now map the fields as per your requirement.
    Regards,
    Sarvesh

  • A problem about flat file,help me plz!

    Hi all,
    I have run into a question while working with OWB.
    In my case I have two flat files as the data source.
    At first I used one of the files as the data source, and everything went through smoothly, including deploy and execute.
    But when I use the other file as the data source, an error comes up when the connectors are being deployed; it reports "Could not find location XXXX".
    I wonder if there is any limitation when using a flat file as data source. If not, how can I deal with such a problem?
    Best regards,
    Thanks a lot!

    When I deploy with flat files as the source I use sqlldr, which OWB itself uses, and there is no problem with multiple sources in sqlldr. Check the script after generating the mapping: verify that the INFILE option in the sqlldr script mentions every source path properly.
    Regards,
    Roopa

  • How to rectify the errors in master data loads & transactional data loads?

    Hi,
    Could anyone please tell me
    how to rectify the errors in master data loads and transactional data loads?
    Thank you,
    Ravi

    Hi,
    Please post specific questions in the forum.
    Please explain the error you are getting.
    -Vikram

  • Track flat files that failed loading in sql loader

    Hi,
    Can anyone please suggest a way to track the flat files that failed while loading with SQL*Loader, and to pass the failed flat file's name to a column of a database table?
    Thanks in advance.
    Edited by: 806821 on Nov 2, 2010 10:22 AM

    Hi Morgan, thanks for your reply.
    Define failed. 1 row not loaded ... no rows not loaded ... what operating system ... what version of the Oracle database ... track in a table, send an email?
    Your inquiry is long on generalities and short on specifics: Fill in all the blanks ... not just the ones I've listed above.
    Even if one row is not loaded, it should be considered a failed load, and the name of that particular flat file should be fetched.
    The operating system is Unix.
    The Oracle database we are using is R12.
    Track in a table: yes, and we want to send an email notification whenever a flat file fails to load.
    Thanks once again!

  • Master Data/transactional Data Loading Sequence

    I am having trouble understanding the need to load master data prior to transactional data. If you load transactional data and there is no supporting master data, when you subsequently load the master data, are the SIDs established at that time, or will they not sync up?
    I feel that in order to do a complete reload of new master data, I need to delete the data from the cubes, reload the master data, and then reload the transactional data. However, I can't explain why I think this.
    Thanks,  Keith

    A different approach is required for different data target scenarios. Below are just two scenarios out of many possibilities.
    Scenario A:
    The data target is a DataStore Object with the indicator 'SID Generation upon Activation' set in the DSO maintenance, and a DTP is used for data loading.
    The following applies, depending on the indicator 'No Update without Master Data' in the DTP:
    - If the indicator is set, the system terminates activation if master data is missing and produces an error message.
    - If the indicator is not set, the system generates any missing SID values during activation.
    Scenario B:
    The data target has a characteristic that is determined in the transformation rules/update rules by reading master data attributes.
    If the attribute is not available during the data load to the data target, the system writes an initial value to the characteristic.
    When you reload the master data with attributes later, you need to delete the previous transaction data load and reload it, so that the transformation can re-determine the attribute values written to the characteristics in the data target.
    Hope this helps you understand.

  • How to convert Flat file(.txt) data to an Idoc format(ORDERS05)

    Hi,
    How can I convert flat file (.txt) data to IDoc format (ORDERS05)? If any function module does this, please let me know.
    thanks in advance,
    Chand
    Moderator message : Duplicate post locked. Read forum rules before posting.
    Edited by: Vinod Kumar on Jul 26, 2011 11:11 AM

    Hi,
            For more information, please check this link.
    http://sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/46759682-0401-0010-1791-bd1972bc0b8a
    Have a look at the FM IDOC_XML_FROM_FILE. Maybe it helps...
    Regards

  • What are the advantages of idoc compare to flat file. how data is secure

    What are the advantages of IDocs compared to flat files? How is data more secure in IDocs compared to flat files?

    Hi Ramana,
    In simple words, the main advantage of an IDoc over a flat file is security.
    Let me explain with a scenario. Suppose you have a flat file with all the data. If you have the flat file, you can modify the data in it, or anyone who can access the file can modify it. That is the situation when you keep the file on the presentation server.
    One level of higher security is keeping the flat file on the application server; there, even though most people do not have access to the file, a superuser who does have access can still modify or delete the data in it.
    So at both of those levels you don't have 100% security.
    That is where IDocs come in. IDocs are simply data carriers; they are generated by a program, not manually, and the data is divided into a number of segments according to your program. So it is not easy to modify the data in these IDocs manually. If any change has to be made to the data, you have to change the data in the application and then update the IDoc, or generate a new IDoc with the corrected data. So in this case not even a superuser can manipulate the data directly in the IDoc.
    I think you get my point.
    If you find this useful, mark the points.
    ~~Guduri
