Wrong data flowing to PSA

Hi,
For a particular condition type, the amount flowing into BI is automatically multiplied by 1,000.
For example, I have 2 rows in VA23 (R/3) like this:
Sales Document No.    Condition Type    Amount
20000140              ZBSR              84,007,635.00
20000140              ZFLS              8,606.25000
When I checked the same document in RSA3, the data looks like this:
Sales Document No.    Condition Type    Amount
20000140              ZBSR              84,007,635.00
20000140              ZFLS              8,606,250.00
So you can see that for condition type ZFLS the amount is multiplied by 1,000.
Please tell me how to rectify this.
Regards

Hi,
Check the RSA3 record and the PSA record for the same document. You have given only the R/3 table and the RSA3 output above; please post the PSA data for the same records as well.
The problem is with decimal places.
Go to SU01 -> enter the user name -> check the decimal settings on the Defaults tab,
or ask your Basis person.
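Note the arithmetic behind the factor: 8,606.25000 carries five decimal places, 8,606,250.00 carries two, and 10^(5-2) = 1,000. Below is a minimal Python sketch of how such a decimal-places mismatch scales an amount (illustrative only, not SAP code; the raw value and decimal settings are assumptions):

# Condition amounts are held internally as scaled integers.
# 8,606.25000 stored with 5 decimals -> raw integer 860625000 (assumed here).
raw = 860625000
source_decimals = 5   # decimals the value was stored with
reader_decimals = 2   # decimals the reader assumes
as_stored = raw / 10**source_decimals   # 8606.25 (correct)
as_read = raw / 10**reader_decimals     # 8606250.0 (1,000 times too large)
print(as_stored, as_read, as_read / as_stored)   # 8606.25 8606250.0 1000.0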
Thanks,
Phani.

Similar Messages

  • Reverse Data Flow

    Hello Experts,
    I have an InfoCube, and I am backtracking its data all the way to R/3. I checked the data flow to this cube and it showed that it is loaded from an InfoSource (there is no ODS between the InfoSource and the InfoCube). When I click on the InfoSource, I can see the Communication Structure (which I suppose is the structure in which the data is stored in an InfoSource; correct me if I am wrong anywhere) and the Transfer Structure (which I suppose is the structure in which the data is stored in the DataSource), and where I can also see the DataSource assigned to this InfoSource; this DataSource would tell me whether this is a delta load or an init load.
    Coming to the questions:
    How is data loaded into a DataSource? I know a DataSource is something which is present at both the R/3 and the BW end. So how is data loaded into a DataSource at the R/3 end, and how is it loaded into BW? If someone can explain how to do this in BW, I mean the transaction codes or the procedure, that would help me a lot in understanding and getting the bigger picture.
    Where does the PSA come into the picture in this whole process?
    All answers would be really appreciated and duly rewarded with points.
    Thanks,
    Nandita

    Hi Nandita,
    Data is permanently stored only in tables.
    A DataSource is a structure; it provides data to other objects at runtime.
    A DataSource can be created from views, tables, structures with a feeder program, SAP Query, etc. The following transactions are useful for managing DataSources:
    RS02, RSA6, RSA5, RSA3, RSA7, BWA1, BWA7 and RSA2
    Here is the data flow:
    DataSource (R/3) -> Transfer structure / PSA (BW) -> Transfer rules -> InfoSource (communication structure, BW) -> Update rules -> Data target (BW)
    On the BW side, the transfer structure is just a replication of the R/3-side DataSource, used to move data from the DataSource through the InfoSource to the data target.
    Update rules connect the data target to the InfoSource, and transfer rules connect the InfoSource to the DataSource.
    The PSA is the Persistent Staging Area: whatever data is in R/3 arrives unchanged in BW in the PSA; it is the entry point of data into BW.
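    For illustration, here is a toy Python sketch of that staging chain (illustrative only, not SAP code; the record layout and rule logic are made-up assumptions):

    def extract_from_r3():
        # DataSource: flat extract structure, field-for-field from the source
        return [{"doc": "20000140", "ctype": "ZBSR", "amount": 84007635.00}]

    def transfer_rule(rec):
        # Transfer rules: map/cleanse fields into the communication structure
        return {"document": rec["doc"], "cond_type": rec["ctype"], "amount": rec["amount"]}

    def update_rule(rec):
        # Update rules: target-specific logic (aggregation, derivations, ...)
        return rec

    psa = extract_from_r3()                              # PSA: unmodified copy of the extract
    infosource = [transfer_rule(r) for r in psa]         # transfer rules feed the InfoSource
    data_target = [update_rule(r) for r in infosource]   # update rules fill the data target
    print(data_target)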
    I hope this gives you some idea.
    Thanks
    Thota

  • SRM 7.0 data flow from SC to PO

    Hi,
    As I am new to SRM, I wish to know how the data flows from SC to PO.
    Here is my scenario: we are in the extended classic scenario, and we have 10 custom fields in the SC and the same 10 fields in the PO. The moment an SC is ordered and approved, our PO is created. I want to know how the data for the custom fields and the standard fields flows from SC to PO.
    I found some BAdIs, and I learned that all those BAdIs are related to the backend PO:
    BBP_CREATE_BE_PO_NEW - For classic scenario in newer versions of SRM
    BBP_ECS_PO_OUT_BADI     - For extended classic scenario
    BBP_EXTLOCALPO_BADI     - For controlling the Extended classic scenario on Local PO
    Please correct me if I am wrong, and let me know the exact data flow.
    Information on this is highly appreciated and will be highly rewarded. Thanks in advance.
    Krishna Chaitanya

    Solved

  • R/3 data flow is timing out in Data Services

    I have created an R/3 data flow to pull some AP data from SAP into Data Services. This data flow outputs to a query object to select columns and then outputs to a table in the repository. However, the connection to SAP is not working correctly. When I try to run the data flow, it just idles for an hour until the SAP timeout throws an error. Here is the error:
    R/3 CallReceive error <Function Z_AW_RFC_ABAP_INSTALL_AND_RUN: connection closed without message (CM_NO_DATA_RECEIVED)
    I have tested authorizations by adding SAP_ALL to the service account I'm using and the problem persists.
    Also, the transports have all been loaded correctly.
    My thought is that it is related to the setting that controls the method of generating and executing the ABAP code for the data flow, but I can't find any good documentation that describes this, and my trial-and-error approach so far has not produced results.
    Any help is greatly appreciated.
    Thanks,
    Matt

    You can't find any good documentation??? I am working my butt off just.......just kiddin'
    I'd suggest we divide the question into two parts:
    My data flow takes a very long time; how can I prevent the timeout after an hour? Answer:
    Edit the datastore; there is a flag called "execute in background" to be enabled. With that, the ABAP is submitted as a background spool job and hence does not have the dialog-mode timeout. Another advantage is that you can watch it running by browsing the spool jobs from the SAP GUI.
    The other question seems to be: why does it take that long at all? Answer:
    Either the ABAP takes that long because of the data volume,
    or the ABAP is not performing well, e.g. a join via ABAP loops with the wrong table as the inner one.
    Another typical reason is using direct_download as the transfer method. This is fine for testing, but it takes a very long time to download data via the GUI_DOWNLOAD ABAP function, and the download time is part of the ABAP execution.
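    To illustrate the inner-table point in plain Python (a toy model, not the generated ABAP; table sizes are made up): a nested loop rescans the inner table once per outer row, while a hash index on the big table is built once and probed cheaply.

    import time, random

    big = [{"key": i, "text": "row %d" % i} for i in range(100_000)]
    small = [{"key": random.randrange(100_000)} for _ in range(100)]

    t = time.time()
    # Nested-loop join with the big table as inner: O(len(small) * len(big))
    slow = [(s, b) for s in small for b in big if b["key"] == s["key"]]
    print("nested loop:", round(time.time() - t, 3), "s")

    t = time.time()
    # Hash join: index the big table once, then probe it per small row
    index = {b["key"]: b for b in big}
    fast = [(s, index[s["key"]]) for s in small if s["key"] in index]
    print("hash lookup:", round(time.time() - t, 3), "s")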
    So my first set of questions would be
    a) How complex is the dataflow, is it just source - query - data_transfer or are there joins, lookups etc?
    b) What is the volume of the table(s)?
    c) What is your transfer method?
    d) Have you had a look at the generated abap? (in the R/3 dataflow open the menu Validation -> Generate ABAP)
    btw, some docs: https://wiki.sdn.sap.com:443/wiki/display/BOBJ/ConnectingtoSAP

  • Foreach Loop Container with a Data Flow Task looking for file from Connection Manager

    So I have a Data Flow Task within a Foreach Loop Container. The Foreach Loop Container has a variable mapping of User::FileName to pass to the Data Flow Task.
    The Data Flow Task has a Flat File Source, since we're looking to process .csv files, and the Flat File Source has a Flat File Connection Manager where I specified the file name when I created it. I thought you needed to do this even though it won't really be used, since the task should get its file name from the Foreach Loop Container. But when attempting to execute, it blows up because it seems to be looking for the test file name that I indicated in the Flat File Connection Manager rather than the file it should be processing from User::FileName in the Foreach Loop Container.
    What am I doing wrong here? I thought you needed to indicate a file name within the Flat File Connection Manager even though it really won't be using it.
    Thanks for your review...I hope I've been clear...and am hopeful for a reply.
    PSULionRP

    The Flat File Connection Manager's ConnectionString property needs an expression that references the variable used in the Foreach Loop, e.g. @[User::FileName].
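    To make the fix concrete, here is a toy Python sketch of the mechanism (not SSIS itself; the class and names are made up): without an expression, the connection manager keeps its design-time file name, which is exactly the behavior described above.

    # Toy model of a connection manager whose string is bound by an expression
    class FlatFileConnectionManager:
        def __init__(self, design_time_path):
            self.design_time_path = design_time_path
            self.expression = None   # e.g. set to reference @[User::FileName]

        def connection_string(self, variables):
            if self.expression:      # an expression overrides the design-time value
                return self.expression(variables)
            return self.design_time_path

    cm = FlatFileConnectionManager("C:/test/sample.csv")
    cm.expression = lambda v: v["User::FileName"]
    for f in ["a.csv", "b.csv"]:     # Foreach Loop iterations set the variable
        print(cm.connection_string({"User::FileName": f}))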
    Arthur My Blog

  • Help me in my data flow ... new to BI 7.0

    Dear friends,
    I am new to the data flow in BI 7.0. I am loading data into an InfoCube from a flat file which has 7 records, and I loaded the data into the InfoCube with 7 records. Then I added 2 records to my flat file; when I load it, I get 9 records in my PSA but 16 records in the InfoCube. I do not know how to fix this. What settings should I make in the DataSource maintenance? Please help me understand the data flow for BI 7.0; I am also studying the help files.
    Regards,
    pavan

    Hello Pavan,
    1. The InfoPackages are the same.
    2. The processing type for the DTP must be DELTA.
    --> The source for the DTP is in this case the DataSource (PSA).
    --> So you need 3 processes: a delta InfoPackage from the flat file to the PSA, a delta DTP from the PSA to the DSO, and a delta DTP from the DSO to the InfoCube.
    3. "Only get delta once" means the source request is extracted via delta DTP from the source to the target only once. If you delete the target request from the InfoCube or DSO, the related source request will NOT be transferred to the target again with the next delta upload. Usually it would be.
    4. "Get data by request" means the delta DTP uses the same number of requests as the source. Usually the DTP collects the data from the source and creates a new request that can include several source requests. With this flag, the DTP request uses the source requests to transfer the data into the target, so the number of requests is the same in the source and in the target.
    I hope this helps,
    Michael

  • OWB 3i data flow connectors problem in the Mapping Editor

    Hi,
    To define mappings between source and target, the manual says:
    steps 1-3 and then,
    "Repeat steps one through three until you have created all the data flow
    connection appropriate for your situation."
    The manual also says:
    "To connect Mapping Operators, you draw lines from output attributes or output
    attribute groups to input attributes or groups between the operators."
    The question:
    When I draw lines for individual connections, it is fine. But I have a source/target with 201 columns, so I dragged my mouse between the groups. This created 201 additional attributes in the target. Am I doing something wrong, or is it a 'feature'? How else can I make all 201 connections less painfully?
    TIA.
    --Rao.

    sunil kumar wrote:
    Hi,
    >
    > As said above convert transfer rules to transformations (right click on transfer rules -> additional prop -> create transformation)
    >
    > Then right click on datasource -> Migrate
    >
    > After doing this, your datasource will be linked to the Infocube directly with one transformation automatically.
    >
    > Regards,
    > Sunil
    Hi Sunil & Sravan,
    Thanks for your quick reply. As you said, I right-clicked on the transfer rules and created a transformation. As I can see in my system, I have a transformation from the DataSource to the InfoSource (i.e. starting with RSDS_) and another transformation from the InfoSource to the InfoCube (i.e. starting with TRCS_). What I did was create a transformation by right-clicking on the InfoCube and giving the DataSource name 2LIS_03_BF; this is a manual transformation and I am mapping manually. Is this the correct way? Please tell me how to do it,
    as I have to do this immediately.
    Thanks in advance

  • Data flow not visible while creating remote cube.

    Hi SDN,
    I am working on a remote cube. I need to link the InfoSource to the remote cube. I have selected the source system and assigned it.
    I went to Remote cube --> Context menu --> Show data flow, and I wanted to see the source system, the InfoSource and the remote cube, but I could not find them.
    Please guide me on what I have missed or where I went wrong.
    Thanks in Advance
    Ankit

    Hi,
    A remote cube is a technology where you report on data which is stored in the source system. While creating the InfoCube you have to select the Remote InfoCube radio button, and on the next screen it will ask you for the InfoSource.
    In that InfoSource you can assign the DataSource via the InfoSource menu.
    Now you will see the data flow.
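    For illustration, a toy Python sketch of the difference (not SAP code; the data is made up): a basic cube copies data at load time, while a remote cube reads the source system at query time.

    source_system = {"sales": [10, 20, 30]}

    class BasicCube:
        def __init__(self):
            # load: copy the data out of the source system once
            self.data = list(source_system["sales"])
        def query(self):
            return sum(self.data)

    class RemoteCube:
        def query(self):
            # no stored data: read the source system at query time
            return sum(source_system["sales"])

    basic = BasicCube()                    # data loaded now: 10 + 20 + 30
    source_system["sales"].append(40)      # a new record arrives in the source
    print(basic.query())                   # 60: the load-time snapshot
    print(RemoteCube().query())            # 100: always the current source data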
    Hope it works,
    Regards,
    Sasi

  • Code: 0xC0208452 Source: Data Flow Task ADO NET Destination [86] Description: ADO NET Destination has failed to acquire the connection {}. The connection may have been corrupted.

    Hi There!
    I have created one package to (1) import data from a flat file (csv), (2) clean it, then (3) send the clean rows to a SQL database.
    This package was working fine before. Since I decided to deploy it to automate the process, I have no clue what went wrong, but it doesn't run anymore. The flat file and the database are on the same Windows box. We are running SQL 2008. I have attached some screenshots to make this conversation more concise.
    Your time and efforts will be appreciated!
    Thanks,
    DAP

    Hi Niraj!
    I recreated the connection and was able to remove that RED DOT next to those connections.
    Still, the package doesn't run well :(
    I have only one server and use the same server throughout the process. I ran the process as a job through SSMS; attached is the output file (if this explains more)...
    Microsoft (R) SQL Server Execute Package Utility
    Version 10.0.4000.0 for 64-bit
    Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
    Started:  11:34:38 AM
    Error: 2014-07-18 11:34:39.33
       Code: 0xC0208452
       Source: Data Flow Task ADO NET Destination [86]
       Description: ADO NET Destination has failed to acquire the connection {2430******}. The connection may have been corrupted.
    End Error
    Error: 2014-07-18 11:34:39.33
       Code: 0xC0047017
       Source: Data Flow Task SSIS.Pipeline
       Description: component "ADO NET Destination" (86) failed validation and returned error code 0xC0208452.
    End Error
    Error: 2014-07-18 11:34:39.33
       Code: 0xC004700C
       Source: Data Flow Task SSIS.Pipeline
       Description: One or more component failed validation.
    End Error
    Error: 2014-07-18 11:34:39.33
       Code: 0xC0024107
       Source: Data Flow Task 
       Description: There were errors during task validation.
    End Error
    DTExec: The package execution returned DTSER_SUCCESS (0).
    Started:  11:34:38 AM
    Finished: 11:34:39 AM
    Elapsed:  0.531 seconds
    Thanks for your time and efforts!
    DAP

  • Data Flow from SAP Source (ECC) system to SAP BI system

    Hi All,
    I want to know how data flows from an SAP source system to an SAP BI system. The explanation should include:
    1) Is the data transferred using IDocs?
    2) Which interfaces are involved while the data is being transferred?
    3) What exactly happens when you execute the PSA?
    If you have any info on this, could you please post it here?
    Regards,
    K.Krishna Chaitanya.

    Hi Krishna,
    Please go through  this article :
    "http://www.trinay.com/C6747810-561C-4ED6-B85C-8F32CF901602/FinalDownload/DownloadId-C2EB7035A229BFC0BB16C09174241DC8/C6747810-561C-4ED6-B85C-8F32CF901602/SAP%20BW%20Extraction.pdf".
    Hope this answers all the mentioned questions.
    Regards,
    Sarika

  • Only one Extractor source is allowed in a data flow ?

    Hi,
    We have a common scenario (I think this is common for SAP ECC in general) where we have 2 tables, *_ATTR (Attributes) and *_TEXT (descriptions).  We extract these using extractors.
    Initially, when we tried to place the two in one data flow, we got an error saying 'Only one Extractor source is allowed in a data flow'.
    To work around this, we created 2 embedded data flows and everything seems to work fine. But when I do a Validate All, I get the error:
    [Data Flow:DF_ODS_MD_WBS_ELEMENT]  Only one Extractor source is allowed in a data flow. (BODI-1116145)
    Should I be worried? My gut feeling is that this is a bug in the Validate All function, but I could be wrong. Everything seems to run fine, so is it safe to ignore this?
    This currently affects about 5 data flows, but we have about 40 more to implement like this, so I don't want to get it wrong.
    Thanks,
    Leigh.

    Like it says, that is invalid XML. It looks like it is a bug in the example doc.
    Add a root node to the XML and you should be OK.
    Tracy

  • Debugging data flow

    Hi,
    I know that in the BI 3.x version we have update rules and transfer rules.
    In BI 7 we have transformations and DTPs.
    In 3.x, data comes to the PSA, then goes through the InfoSource and reaches the data target via the InfoPackage.
    In BI 7, data comes to the PSA via an InfoPackage and from there goes to the data target by a DTP, which executes the transformations.
    There should be an ABAP program that executes the above data flow.
    I want to know the name of that program, and how the 3.x and 7.0 data flows can be debugged, because debugging gives a better opportunity to understand the process.
    Regards,
    Lakshminarasimhan.N

    Hi Lakshminarasimhan.N:
    Take a look at the documentation below.
    "Steps to Debug Routines in BI Transformations" article by Rudra Pradeep Reddy Neelapu.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/205e46d2-0b30-2d10-cba0-ea9dba0cb1aa?quicklink=index&overridelayout=true
    "SAP NetWeaver 7.0 BI: Extended Capabilities to Debug a Data Transfer Process (DTP) Request" Weblog by Michael Hoerisch.
    /people/community.user/blog/2007/03/29/sap-netweaver-70-bi-extended-capabilities-to-debug-a-data-transfer-process-dtp-request
    "Step by Step Guide to Debug a Start, End or an Expert Routine in BI 7.0 Transformations" article by Nageswara Reddy M.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b0038ad7-a0c7-2c10-cdbc-dd674682c8e7?quicklink=index&overridelayout=true
    Regards,
    Francisco Milán.

  • Log on data load through a BW data flow

    Dears,
    I am asking those of you who have already implemented this type of functionality. I am trying to find the easiest way, with the least complexity, to implement a log through an existing BW data flow.
    I mean, a data load via an InfoPackage gives some log of right and wrong records within the monitor; how can I utilize this information? Is there a specific table which stores each record and its message? Or does a program have to be implemented which will publish the loading status in a specific table?
    Thanks for your quick feedback,
    LL

    Hi Ludovic,
    The monitor messages are only written if there is some problem in the record processing. You can only find information for those records which have a problem, or where the processing in the routines encountered some problem.
    What you can do to capture messages is write a transfer routine and amend the monitor messages table RSMONMESS accordingly.
    Also, please check the tables starting with RSMO*.
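    If you need those messages outside the GUI, one possible sketch (assuming the pyrfc library is installed and an RFC user is set up; the host details and request number are placeholders, and the RSMONMESS field names should be verified in SE11):

    from pyrfc import Connection

    # Read monitor messages for one request via the generic RFC_READ_TABLE module
    conn = Connection(ashost="bw-host", sysnr="00", client="100",
                      user="RFC_USER", passwd="secret")
    result = conn.call("RFC_READ_TABLE",
                       QUERY_TABLE="RSMONMESS",
                       DELIMITER="|",
                       OPTIONS=[{"TEXT": "RNR = 'REQU_XXXXXX'"}],
                       ROWCOUNT=100)
    for row in result["DATA"]:
        print(row["WA"])   # one delimited monitor-message row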
    regards
    Vishal

  • DATA FLOW

    Can you please tell me the data flow from a standard DataSource to the data targets (Cube, DSO) in BI 7.0?
    amit shetye.

    DS -> transformation -> DSO or Cube
    Load process:
    1. Start variant
    2. InfoPackage load to PSA
    3. PSA to DSO or Cube (DTP)
    4. DSO to Cube (DTP)
    Nagesh Ganisetti

  • Data Arrives to PSA but does NOT add any records to ODS.... WHY?

    Hi Gurus:
    I have a unique issue. My data for an InfoObject arrives successfully in the PSA, but this data does not get added to the ODS. I know that the ODS does not contain these records, and the PSA does show them. There are no errors, but zero records are added to the ODS. Has anyone come across this before? Am I missing something?
    Joe-Smith

    Are you loading in parallel to the PSA and the data target, or to the PSA and then into the data target?
    Which flow are you following?
    Otherwise, delete that request in the DSO, since it got 0 records from the PSA anyhow,
    then go to the PSA (DataSource) manage screen, select the request you need to update, click on the UPDATE WITH SCHEDULER option in that window, and start.
    Let us know the status after that.
    rgds,
    nkr
