Flat file load changes in key figures

Hello,
I have a weird issue: when I try to load data from a flat file into a cube, out of 1,900 records only 1,668 actually get loaded into the cube, and when I try to validate the load the numbers don't match. Can you suggest what I might be missing or what I should double-check? Please let me know. Any help is greatly appreciated.
Thanks,
Vinay

There is nothing wrong with your cube model.
Some of your records have the same characteristic values, so they are aggregated when they reach the cube. You can validate this: calculate the sum of one key figure in the cube and the sum of the same key figure column in the flat file, then compare the two values.
Both totals should match.
Assume the flat file data is:
customer   amount
1000           10
1001           20
1002           30
1000           10
The cube will then contain:
customer   amount
1000           20
1001           20
1002           30
hope this helps.
Nagesh.
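To make the comparison concrete, here is a minimal sketch (my own example, not from the original reply) of how the file-side total of one key figure column could be computed in ABAP; the server-side file path, the comma separator and the column position are assumptions for illustration only.

DATA: lv_line   TYPE string,
      lt_fields TYPE TABLE OF string,
      lv_amount TYPE p DECIMALS 2,
      lv_total  TYPE p DECIMALS 2.

" assumed server-side copy of the upload file; adjust path and column index
OPEN DATASET '/tmp/upload.csv' FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  READ DATASET '/tmp/upload.csv' INTO lv_line.
  IF sy-subrc <> 0.
    EXIT.                                     " end of file
  ENDIF.
  IF sy-index = 1.
    CONTINUE.                                 " skip the header row, if there is one
  ENDIF.
  SPLIT lv_line AT ',' INTO TABLE lt_fields.
  READ TABLE lt_fields INDEX 2 INTO lv_line.  " 2nd column = key figure (assumed)
  lv_amount = lv_line.
  lv_total  = lv_total + lv_amount.
ENDDO.
CLOSE DATASET '/tmp/upload.csv'.
WRITE: / 'Flat file total:', lv_total.

The cube-side total of the same key figure can then be checked with transaction LISTCUBE or a simple query, and the two figures should be equal.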

Similar Messages

  • Upload Flat file to Infocube with key figures of type FLOAT

    Dear Experts,
    I need your help with the following issue: I want to upload a flat file (CSV) to an InfoCube. Two of the key figures are of type FLOAT. In the flat file the amounts are numbers with 4 decimals, e.g. 0.0445. When I upload the data, although I get no errors, the key figures show the value 0 (the BW system is 3.5).
    How should I declare these numbers in the Excel file?
    Thanks in advance for your help
    Regards
    Dina

    Problem solved. I just uploaded the file first to an ODS and then to the InfoCube and everything worked OK.
    The mystery of BW!!!
    Dina

  • What are the settings for datasource and infopackage for flat file loading

    Hi,
    I'm trying to load data from a flat file to a DSO. Can anyone tell me what the settings for the DataSource and InfoPackage are for flat file loading?
    Please let me know.
    regards
    kumar

    Loading of transaction data in BI 7.0: a step-by-step guide on how to load data from a flat file into the BI 7 system.
    Uploading of transaction data
    Log on to your SAP system.
    Transaction code RSA1 takes you to the Modelling view.
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( Transaction data )
    • In general tab give short, medium, and long description.
    • In the Extraction tab specify the file path, the header rows to be ignored, the data format (CSV) and the data separator (,); a sample file layout is shown after this guide.
    • In proposal tab load example data and verify it.
    • In the Fields tab you can enter the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, because the system maps them automatically. If you do not maintain them in the Fields tab, you have to map the fields manually in the transformation of the InfoProvider.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource and, in the Schedule tab, click Start to load the data to the PSA (make sure the flat file is closed during loading).
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to create ODS( Data store object ) or Cube.
    • Specify a name for the ODS or cube and click Create.
    • From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
    • Click Activate.
    • Right click on ODS or Cube and select create transformation.
    • In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: for a flat file, the source system is the file source system under which the DataSource was created.
    • Activate created transformation
    • Create a data transfer process (DTP) by right-clicking the data target.
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    4. Monitor
    Right-click the data target, select Manage and, in the Contents tab, choose Contents to view the loaded data. An ODS has two tables, the new table and the active table; to move data from the new table to the active table you have to activate the request after the load. Alternatively, the monitor icon can be used.
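    As a reference for the Extraction tab settings above, a transaction data file for this guide might look like the following (the field names and values are purely illustrative and are not from the original post); the first row is the header that the "header rows to be ignored = 1" setting skips, and the column order must match the field order maintained in the Fields tab of the DataSource.
    CUSTOMER,MATERIAL,CALDAY,AMOUNT,QUANTITY
    1000,P-100,20080105,250.00,5
    1001,P-200,20080106,99.90,2
    1002,P-100,20080106,500.00,10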
    Loading of master data in BI 7.0:
    For Uploading of master data in BI 7.0
    Log on to your SAP system.
    Transaction code RSA1 takes you to the Modelling view.
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
    • In general tab give short, medium, and long description.
    • In the Extraction tab specify the file path, the header rows to be ignored, the data format (CSV) and the data separator (,); sample attribute and text file layouts are shown after this guide.
    • In proposal tab load example data and verify it.
    • In the Fields tab you can enter the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, because the system maps them automatically. If you do not maintain them in the Fields tab, you have to map the fields manually in the transformation of the InfoProvider.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource and, in the Schedule tab, click Start to load the data to the PSA (make sure the flat file is closed during loading).
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to select Insert Characteristics as info provider
    • Select required info object ( Ex : Employee ID)
    • Under that info object select attributes
    • Right click on attributes and select create transformation.
    • In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: for a flat file, the source system is the file source system under which the DataSource was created.
    • Activate created transformation
    • Create Data transfer process (DTP) by right clicking the master data attributes
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
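    For the master data case, the flat files could look like this (again only an illustration; the InfoObject, attribute and text field names are assumptions based on the Employee ID example above).
    Attribute file:
    EMPLOYEE_ID,COSTCENTER,ENTRY_DATE
    10001,CC100,20080101
    10002,CC200,20080215
    Text file:
    EMPLOYEE_ID,LANGU,TXTMD
    10001,EN,John Smith
    10002,EN,Maria Garcia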

  • Flat File loading Initialize with out Data transfer is disabled in BI 7.0

    Hi experts,
    When loading from a flat file in BI 7.0, at InfoPackage level 'Initialization Delta Process with Data Transfer' is offered by default, but the option 'Initialization Delta Process without Data Transfer' is disabled. (In the creation of the DataSource (flat file), in the Extraction tab, the delta process is set to FIL1 Delta Data (Delta Images).)
    Please provide me with a solution.
    regards
    Subba reddy.

    Hi Shubha,
    For the flat file load, please go through the following link:
    http://help.sap.com/saphelp_nw70/helpdata/EN/43/03450525ee517be10000000a1553f6/frameset.htm
    This will help.
    Regards,
    Mahesh

  • Flat file: loading Master data questions

    Hi Experts,
    1. If I have a flat file with columns Columns1, Columns2, Columns3, and we load this manually to an ODS monthly:
    a. Where in BW can I see the exact match to the columns Columns1, Columns2, Columns3?
    b. What will be the effect of receiving the flat file from the user this month if the columns are not in the usual order, e.g. Columns3, Columns2, Columns1?
    2. I have an InfoObject CustCode_10, checked with Text (medium description), and it also has an attribute CustDispl_11.
    a. What should be the format of the flat file in order to load the master data?
    b. After loading a flat file, I can go to the InfoObject CustCode_10, right-click and select Maintain Master Data to see the master data. How do I see the data in the attribute?
    Thanks

    Hi,
    For 1 a) you can see the exact order in the Maintain Transfer Rules section, where you can check the transfer structure.
    b) If you supply the columns in the wrong order, wrong data will be loaded if all three columns have the same data type and length; otherwise you will get an error while updating.
    2 a) The flat file should contain the key, the language and the description.
    Regards
    Srini
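    To illustrate 2 a) (this example is mine, not from the original reply): the text file for CustCode_10 carries the key, the language and the description, while the attribute file carries the key plus the attribute CustDispl_11, for example:
    Text file:       CUSTCODE_10,LANGU,TXTMD
                     C001,EN,Northern Region Customer
    Attribute file:  CUSTCODE_10,CUSTDISPL_11
                     C001,N01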

  • Best practice around handling time dependency for flat file loads

    Hi folks,
    This is a fairly common situation: handling time dependency for flat file loads. Can anyone please share their experience with handling this? One common approach is to handle the time-validity changes within the flat file, where they are easily changeable by the user, but that is prone to user input errors. Another would be to handle this via a DSO. Possibly, the data could also be entered directly in BI using IP planning layouts. There is an IP planning function that allows loading flat file data, but it only works without the time dependency factor.
    It would be great to hear your thoughts, or if anyone can point to a best practice document for such a scenario.
    Thanks.

    Bump!

  • Unit Code, Commercial Code - Flat File loading vs BPS Loading

    Hi SDN Community,
    Recently, we have moved our flat file loading to be performed by BPS interfaces.
    During flat file loads, the commercial code of each unit of measure is verified, e.g. DAY.
    But when loading via BPS, the unit (internal) code is verified, e.g. TAG.
    In the display of the BPS upload, it shows DAY, which is the commercial code.
    The only issue is that the customer is forced to use TAG in the BPS upload files.
    They wish to create another record in transaction CUNI with
    Unit code = DAY
    Commercial code = DAY.
    However, I have found that we cannot allocate the same commercial code to another unit code.
    Is this a design constraint, or an error in my process?
    Thank you.
    Simon

    04.01.2010 - 02:22:53 CET - Reply by SAP     
    Dear customer,
    I do not fully understand how this works: the base table T006A has two entries, one for English and one for German. Should it not be the English (EN) entry that applies rather than the German (DE) one, so that DAY should be used? Can you please confirm my understanding of this?
    >>> If you check my attachment "SE16.xls", you can see that it's for
    language >>English<<, and the internal format is TAG while the external
    format is DAY.
    - Are there any plans to modify the BPS functionality in newer SAP BW versions to allow the customer to indicate the same UOM as in the flat file loads? Or is there an upgraded Support Package that allows this?
    - If not, would it be possible to make any customer enhancements to allow this, depending on customer requirements?
    >>> There is no plan at this moment to change this BPS functionality
    in newer SAP BW versions or support packages.
    We really recommend that you use the internal format TAG in the upload file in BPS, which I think should be acceptable and feasible for you. All other ways of trying to use the external format are risky and we cannot assure you that they will work well, as they are not SAP standard functionality. (I think it is not worth the risk, as the standard function, which requires the internal format, should not be too unacceptable.)
    Thanks for your understanding.
    I also don't think my BC-SRV-ASF-UOM colleague would be able to help
    you a lot regarding this.
    Best Regards,
    Patricia Yang
    Support Consultant - Netweaver BW
    Global Support Center China
    SAP Active Global Support

  • Flat-File Loading problem

    Hi Friends,
    I am struggling with a flat-file loading problem. I am trying to load a .csv file into a data target. I took all precautions while loading the data: I looked at the preview and simulated the data and everything is OK, but when I schedule the load I find 0 records in the monitor. The following is the status message for the problem:
       No data available
    Diagnosis
    The data request was a full update.
    In this case, the corresponding table in the source system does not
    contain any data.
    System response
    Info IDoc received with status 8.
    Procedure
    Check the data basis in the source system.
    Can anybody help me understand what the problem is and the procedure to resolve it?
    Regards,
    Mahesh

    Hi Eugene,
    Thanks for the quick reply. The following is what the Details tab shows:
    OVER ALL STATUS MISSING WITH MESSAGES OR WARNINGS
    REQUEST: MISSING MESSAGES
    EXTRACTION
    EVERYTHING IS OK
    DATA REQUEST RECEIVED
    NO DATA AVAILABLE, DATA SELECTION ENDED.
    PROCESSING
    NO DATA
    The above messages were shown in the Details tab. Please guide me in locating the problem.
    Regards,
    Mahesh

  • Transformation not generating-Flat file loading

    Hello guys, I hope you can help me with a little confusion I have about BI 7 flat file loading.
    I have a file (CSV) on my workstation. I am trying to load master data. Here is an example of my file and my issues:
    Let's say I have a CSV file named "CarModel.CSV" on my PC.
    This file has 10 records and no attributes for this field.
    So the records should appear like this, and they display correctly at the PSA and DataSource level:
    A
    B
    C
    D
    E
    F
    My goal is to load the flat file data to an InfoObject (inserted in an InfoProvider).
    I created the source system, the DataSource, all that stuff.
    I am now on the Display DataSource screen under the Proposal tab. I set 10 records to show and hit 'Load example data'; it works fine and shows all 10 records. However, in the bottom part of this screen, what should appear as a field? In my case it first showed the first record ("A"); I didn't think that was correct but I proceeded anyway. The transformation could not be generated.
    I also tried deleting the "1" from the field "No. of header rows to be ignored" and got the same result: no transformation. I know it is very simple to load this data, but I am not sure what I am doing wrong. My questions are:
    1) What should show under the Fields/Proposal tab in my case? Am I supposed to create this field?
    2) Should the system propose the field from my flat file header? In my case I don't have any header. Should I include a header in my CSV file like "car model"? I am confused, please give me some info. Thanks
    dk

    Hi,
    In the Fields tab, you have to enter your InfoObject names in order and press Enter; the system will automatically fill in the description and the other properties.
    I guess you should have some header, for example "customer, cust ID"; those names are what you enter as fields in the Proposal tab. Try that.
    Regards,
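    For example (illustrative only, not from the original reply), the CSV could carry one header row naming the field and one value per line:
    CARMODEL
    A
    B
    C
    Then set "No. of header rows to be ignored" to 1 in the Extraction tab and enter the corresponding InfoObject name in the Fields tab; the proposal should then pick up CARMODEL as the field instead of the first data record.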

  • Input Enabled Query - Web Template - Not able to change the key figure.

    Hello IP Gurus,
    This is a simple question.
    I have an input-enabled query, and for all the key figures I selected the radio button 'Data can be changed using user entries or planning functions', Equal Distribution and No Disaggregation.
    I attached the query to a web template.
    When I execute the template I am not able to enter or change the key figures.
    Do I have to use a planning function to do that in the web template, or am I missing some configuration? Please let me know how to change or enter the key figures.
    Thanks
    Senthil

    Hi Senthil,
    Please ensure that you have enabled the property 'Start Query in Change Mode'. You can find it in the query properties under the Advanced tab.
    For input readiness this alone is not enough: you have to use all the characteristics that you have in the MultiProvider or the aggregation level over which the query is built.
    regards.
    Shafi.

  • Regarding flat file load to DSO...

    Hi Experts,
    I am trying to load transaction data from a flat file to a DSO in BI 7.0; it comprises two characteristics and one key figure. The characteristics are of type CHAR and the key figure is of type INT. Up to the PSA the data is fine. In the transformation I have done a one-to-one mapping of source and target fields, but while executing the DTP the following error popped up:
    Data package 1:Errors during processing
    Extraction completed, processing termination
    Please help. I tried various combinations, but I am not able to get the load to run. Valuable answers will be rewarded.
    Thanks,

    Hi there,
    Here are some ideas for your problem.
    Assuming that you defined the InfoObjects correctly (lengths and types), mapped the transformation correctly,
    and the data in the flat file is absolutely correct:
    Very common, but worth checking: compare the data lengths in the flat file with the defined lengths of the InfoObjects. If that is fine,
    go to the DataSource and check the flat file settings (separator, escape sign and rows to be ignored) and check the data preview.
    Compare the data in the PSA and in the flat file; please do this very carefully. If that is fine,
    go to the DataSource, open the Fields tab and check whether you assigned the InfoObjects correctly. If that is fine,
    check the conversion routines on the same Fields tab; if you see PER16 for the key figure, remove it, load to the PSA again and then load the cube.
    Hope this gives you an idea.
    Regards, KP

  • Error in flat file loading : Important

    Hi !
    I am using BW 3.5. I am loading data from a flat file which has a key figure VOLUME with unit of measure USG, but it gives a conversion exit error (CONVERSION_EXIT_CUNIT_INPUT).
    Please tell me what I should do.

    Dear Rajib Goon
    It won't be much of a problem.
    Open your InfoSource; in the key figure transfer view you will see a column called "Fixed Unit of Measurement". Enter "USG" there for the key figure VOLUME and reschedule your InfoPackage.
    Hope this helps you a lot.
    Regards
    Saravanan.ar
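    If you want to see where the conversion exit trips up, a quick way to test it outside the load is a small report like the sketch below (my own example, not part of the original reply); it simply calls the conversion exit named in the error for the unit that comes from the file.
    DATA: lv_unit TYPE msehi.
    " check whether the external unit 'USG' can be converted to an internal unit
    CALL FUNCTION 'CONVERSION_EXIT_CUNIT_INPUT'
      EXPORTING
        input          = 'USG'
        language       = sy-langu
      IMPORTING
        output         = lv_unit
      EXCEPTIONS
        unit_not_found = 1
        OTHERS         = 2.
    IF sy-subrc <> 0.
      WRITE: / 'USG is not known as an external unit in this language/client'.
    ELSE.
      WRITE: / 'Internal unit:', lv_unit.
    ENDIF.
    If the call fails, maintain the unit in transaction CUNI (or use the "Fixed Unit of Measurement" approach described above) so that the external code in the file can be converted.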

  • Flat File Load in 3.5 version

    Hi BW Gurus,
    I am facing a problem with a transfer routine when loading from a flat file.
    For example, I am using the key figures Price, Quantity and Revenue of the product.
    While loading the data (flat file), I wrote a routine on Revenue: Revenue = tran_structure-/bic/io_price * tran_structure-/bic/io_quntity.
    In my flat file like this
    customerID ProductID Price   quantity  Revenue
    C001       P001       10       2
    C002       P002       15       3
    Because I wrote a routine for Revenue, I did not provide any values in the Revenue column.
    I am using version 3.5. In version 3.0B this worked, but here I am not getting the data for Revenue.
    Any one can help me
    Thanks
    Bhima

    Hi,
    If I understood you correctly, you wrote the routine as a transfer rule routine for the key figure Revenue (not a start routine, yes?).
    Your code should look like this:
    result = tran_structure-/bic/io_price * tran_structure-/bic/io_quntity.
    abort = 0.
    Check whether you mapped the Revenue field in the update rules as well.
    Regards,
    Andrzej
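    For completeness, a minimal sketch of a full 3.x transfer rule routine body for this case (the field names are taken from Bhima's post; the RETURNCODE line is my addition and an assumption about what the routine frame expects):
    " transfer rule routine for the key figure Revenue (BW 3.x style)
      result     = tran_structure-/bic/io_price *
                   tran_structure-/bic/io_quntity.
      returncode = 0.   " 0 = record is fine, keep it
      abort      = 0.   " 0 = do not cancel the whole data package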

  • Data loading mechanism for flat file loads for hierrarchy

    Hi all,
    We have a custom hierarchy which gets its data from a flat file stored on a central server; that file in turn gets its data from MDM through XI. Now, if we delete a few records in MDM, the data picked up in BI will no longer contain the deleted records. Does this mean that the hierarchy load deletes the data it already contains and does a full load, or does it mean that every time we load the data into BI we have to delete the records from the tables in BI and reload?
    We also have some web service (loaded from XI) text DataSources.
    Is the logic for updating the hierarchy records different from the existing web service interfaces?
    Can anyone please explain the mechanism behind these data loads and differentiate it for the loads mentioned above?

    Create the ODS with the correct keys and run full loads from the flat files. You can have a cube pulling data from the ODS:
    Load data into the ODS.
    Create the cube.
    Generate the export DataSource (RSA1 > right-click the ODS > Generate Export DataSource).
    Replicate the export DataSource (RSA1 > Source Systems > DataSource Overview > search for the DataSource starting with 8 + the ODS name).
    Press the '+' button and activate the transfer rules and communication structure.
    Create the update rules for the cube with the above InfoSource (same as the '8ODSNAME' DataSource).
    Create an InfoPackage with initial load (in the Update tab).
    Now load data to the cube.
    Now load new full loads to the ODS.
    Create a new InfoPackage for delta (in the Update tab).
    Run the InfoPackage (any changed or new records will be loaded to the cube).
    Regards,
    BWer
    Assign points if helpful.

  • Issue with flat file loading timing out

    Hello
    I have a scenario where I am loading a flat file with 65k records into a cube. The problem is that, in the loading process, I have to look up the 0Material table, which has a million records.
    I do have an internal table in the program where I select a subset of the material table, i.e. around 20k to 30k records. But my extraction process takes more than 1.5 hours and is failing (timing out).
    How can I address this issue? I tried building indexes on the material table and it is not helping.
    Thanks,
    Srinivas.

    Unfortunately, this is BW 3.5, so there is no END routine option here. And I tried both .csv and notepad file methods, and both are creating problems for us.
    This is the complete code; do you guys see any potential issues?
    Start routine (main code):
    REFRESH i_oldmats.
    REFRESH zi_matl.
    DATA: wa_datapak TYPE transfer_structure.
    " collect all the old material numbers from the flat file
    " into the internal table i_oldmats
    LOOP AT datapak INTO wa_datapak.
      i_oldmats-/bic/zzoldmatl = wa_datapak-/bic/zzoldmatl.
      COLLECT i_oldmats.
    ENDLOOP.
    SORT i_oldmats.
    " zi_matl only has records where old materials exist;
    " this fetches about 300k records out of 1M
    SELECT /bic/zzoldmatl material
           FROM /bi0/pmaterial
           INTO zi_matl
           FOR ALL ENTRIES IN i_oldmats
           WHERE /bic/zzoldmatl = i_oldmats-/bic/zzoldmatl.
      COLLECT zi_matl.
    ENDSELECT.
    SORT zi_matl.
    Transfer rule routine (main code):
    IF tran_structure-material = 'NA'.
      READ TABLE zi_matl INTO zw_matl
        WITH KEY /bic/zzoldmatl = tran_structure-/bic/zzoldmatl
        BINARY SEARCH.
      IF sy-subrc = 0.
        result = zw_matl-material.
      ENDIF.
    ELSE.
      result = tran_structure-material.
    ENDIF.
    Regards,
    Srinivas.
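    Not part of Srinivas' post, but one way this kind of selection is often tightened, assuming the same internal tables I_OLDMATS and ZI_MATL: do the fetch as a single array SELECT with a guard against an empty driver table, instead of the row-by-row SELECT ... ENDSELECT with COLLECT, and sort once for the BINARY SEARCH.
      SORT i_oldmats.
      DELETE ADJACENT DUPLICATES FROM i_oldmats.
      IF NOT i_oldmats[] IS INITIAL.   " FOR ALL ENTRIES on an empty table would read all of /BI0/PMATERIAL
        SELECT /bic/zzoldmatl material
               FROM /bi0/pmaterial
               INTO CORRESPONDING FIELDS OF TABLE zi_matl
               FOR ALL ENTRIES IN i_oldmats
               WHERE /bic/zzoldmatl = i_oldmats-/bic/zzoldmatl
                 AND objvers        = 'A'.      " active master data only (assumption)
        SORT zi_matl BY /bic/zzoldmatl.         " required for the BINARY SEARCH in the transfer rule
      ENDIF.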
