Joining Procurement team: Hints? (datasources, reports, data flow, processes)

Hi Experts.
I am getting ready to join our Procurement team as the BI developer, and I would appreciate any hints to help on this project:
1. What are the procurement-related datasources?
2. What type of reports are desired here?
3. What is the typical data flow in this environment?
3b. What are the processes involved in procurement that the SAP BI developer needs to know?
4. Any hints or ideas for succeeding in this SAP BI procurement environment?
5. Is procurement considered Logistics (LO), and does it therefore involve setup tables, etc.?
6. In general, is there a link or document which discusses the processes in the various functional areas in SAP?
Thanks

Hi.
I find the site very informative, but I still cannot find the datasources used to extract the data from ECC.
For example, for the DSO 0PUR_O02 it only mentions that
"The ODS supports a delta update of the InfoSource 2LIS_02_HDR and of the ODS 0PUR_O01."
without pointing out the datasource feeding the ODS 0PUR_O02.
I see the various reports and InfoProviders, such as 0PUR_O02, but can you show me the datasource from ECC feeding this DSO, and how do you make that link from the site?
Thanks

Similar Messages

  • Datasource and Data flow for SAP table AFVC and field AUFPL

    Hi
    During mapping of SAP fields to datasource fields we have come across table AFVC and field AUFPL. I am not able to find a datasource for the above-mentioned table or field. Can anyone please help me out?
    Thanks
    Datta

    Hi DLK,
    You can search using SE11.
    Go to SE11 in the ECC system, enter the table name, and run the where-used list.
    On the next screen, select Structure and press Enter.
    The result screen will show the list of structures that use the field, from which you can identify the extract structure and hence the datasource.
    Regards,
    Venkatesh

  • Debugging data flow

    Hi,
    I know that in BI 3.x we have update rules and transfer rules.
    In BI 7 we have transformations and DTPs.
    In 3.x, data comes to the PSA and then goes through the InfoSource to reach the data target, via the InfoPackage.
    In BI 7, data comes to the PSA via the InfoPackage and from there goes to the data target by a DTP, which executes the transformations.
    There should be an ABAP program that executes the above data flow.
    All I want to know is the name of that program, and how the 3.x and 7.0 data flows can be debugged, because debugging gives more opportunity to understand the process better.
    Regards,
    Lakshminarasimhan.N

    Hi Lakshminarasimhan.N:
    Take a look at the documentation below.
    "Steps to Debug Routines in BI Transformations" article by Rudra Pradeep Reddy Neelapu.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/205e46d2-0b30-2d10-cba0-ea9dba0cb1aa?quicklink=index&overridelayout=true
    "SAP NetWeaver 7.0 BI: Extended Capabilities to Debug a Data Transfer Process (DTP) Request" Weblog by Michael Hoerisch.
    /people/community.user/blog/2007/03/29/sap-netweaver-70-bi-extended-capabilities-to-debug-a-data-transfer-process-dtp-request
    "Step by Step Guide to Debug a Start, End or an Expert Routine in BI 7.0 Transformations" article by Nageswara Reddy M.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b0038ad7-a0c7-2c10-cdbc-dd674682c8e7?quicklink=index&overridelayout=true
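    One more low-tech option, not from the articles above but widely used: place a breakpoint directly in the routine code and then execute the DTP in simulation (debug) mode, so the ABAP debugger stops inside the transformation. A minimal sketch (the user name is an illustrative placeholder):

    * Inside a start, end, or expert routine of the transformation:
    BREAK-POINT.   " hard breakpoint - stops for every user in dialog mode
    * Safer on a shared system - stops only for one named user:
    BREAK myuser.  " expands to: IF sy-uname = 'MYUSER'. BREAK-POINT. ENDIF.

    Remember to remove the breakpoint before transporting the transformation.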
    Regards,
    Francisco Milán.

  • Design of data-flow

    Hello experts!
    We want to design our data flow in a new way, and I want to know from some experts if this is a good way or if there is maybe a better one.
    At the moment we extract via the following InfoSources:
    0CO_OM_WBS_1
    0CO_OM_WBS_6
    0CO_OM_OPA_1
    0CO_OM_OPA_6
    We do a full load from every data source into up to 8 different InfoCubes. Each time we update different attributes and selections with a lot of self-written ABAP rules and so on. All of this has grown historically.
    Now we don't want to read the data from the DataSource into BW every time. So we say we need an entry layer and will update all the data from inside BW.
    We now choose an InfoCube as the entry layer, because we may want to report on this new data before it is updated to the data targets. We also want to be able to delete requests, which could be difficult with an ODS object. An ODS object also has just 16 key fields, which could be a problem.
    So now we want to design an InfoCube for every InfoSource and do a data mart update to the other InfoCubes (the data targets).
    What we also want is an additional layer for saving some requests from time to time. At the moment I would simply make a copy of the InfoCubes I design.
    So that's it. For the design of the InfoCubes I have an additional thread:
    InfoCube Design
    I have just around one year of BW experience and have never done this before. Maybe someone can give me some tips or hints on whether this is a good way or whether there is a better one.
    Thanks in advance and best regards,
    Peter

    Hi
    You want to update your entry layer of cubes from four InfoSources.
    From this data mart, you are going to update a second layer of cubes which will be used for reporting.
    Instead of building that many cubes, you can create one historical cube which stores all the data of the past, and one current cube which holds only the newly arrived data.
    In the historical cube you can keep all your requests and other details, as you pass data only from the current cube to the historical one.
    From the current cube you can do current reporting only.
    You can build a MultiCube over the historical cube and the current cube to have full reporting.
    This will make the design simple and efficient instead of creating so many cubes.
    Regards
    Ganesh N

  • Oracle to SAP BI with BCS: Best Data Flow Design

    Hi,
      We are an SAP implementation team, currently at the blueprint stage. My client is a retail business giant. The client has 50% of its transaction data and master data in an Oracle database. Now we are moving to BI 7.0 and also plan to use SAP BCS.
      We would like to map all the existing Oracle tables to BI. Please provide any clues regarding the best data flow (from Oracle 10g to BI 7.0).
    Your quick and valuable suggestions/links are highly appreciated.
    Warm Regards,
    Bab

    Hi Ashok,
    You have mentioned that you have an Oracle 10g system as the data source, which perfectly sets the platform to extract the data from the Oracle system into the SAP BW system.
    This can be done through DB Connect, where you select the necessary tables and turn them into datasources in the BI system, then create the usual BI objects on top of the DS.
    Once the data is in BI, we can pull from the BI cube or build a replica of the cube in the BCS application format to use it in the BCS environment.
    Hope this helps,
    Regards,
    Rajesh.

  • Data flow not visible while creating remote cube.

    Hi SDN,
    I am working on a remote cube. I need to link the InfoSource to the remote cube. I have selected the source system and assigned it.
    I then went to Remote cube --> Context Menu --> Show Data Flow, expecting to see the source system, the InfoSource, and the remote cube, but I could not find them.
    Please guide me on what I've missed or where I went wrong.
    Thanks in Advance
    Ankit

    Hi,
    A remote cube is a technology where you report on data that is stored in the source system. While creating the InfoCube you have to select the Remote InfoCube radio button, and on the next screen it will ask you for the InfoSource.
    In that InfoSource you can assign the datasource via the InfoSource menu.
    Now you will see the data flow.
    Hope it works,
    Regards,
    Sasi

  • Reverse Data Flow

    Hello Experts,
    I have an InfoCube, and I am backtracking its data all the way to R/3. I checked the data flow to this cube and it showed that it is loaded from an InfoSource (there is no ODS between the InfoSource and the InfoCube). When I click on the InfoSource, I can see the Communication Structure (which I suppose is the structure in which the data is stored in the InfoSource - correct me if I am wrong anywhere) and the Transfer Structure (which I suppose is the structure in which the data is stored in the datasource), and I can also see the datasource assigned to this InfoSource; this datasource tells me whether this is a delta load or an init load.
    Coming to the questions:
    How is data loaded into a datasource? I know a datasource is something which is present both at the R/3 and the BW end. So how is data loaded into a datasource at the R/3 end, and how is it loaded into BW? If someone can describe how to do this in BW - I mean the transaction codes or the procedure - that would help me a lot in understanding and getting the bigger picture.
    Where does the PSA come into the picture in this whole process?
    All answers would be really appreciated and duly rewarded with points.
    Thanks,
    Nandita

    Hi Nandita,
    Data is permanently stored in tables only.
    A datasource is a structure, and it provides the data to other objects at runtime.
    A datasource can be created from views, tables, structures with a feeder program, SAP Query, etc. The transactions below are useful for managing datasources:
    RS02, RSA6, RSA5, RSA3, RSA7, BWA1, BWA7 and RSA2
    Here is the data flow:
    DataSource (R/3) --> Transfer structure / PSA (BW) --> Transfer rules --> InfoSource (Communication Structure) (BW) --> Update rules --> Data target (BW)
    The BW-side transfer structure is just a replication of the R/3-side datasource, used to move data from the datasource through the InfoSource to the data target.
    Transfer rules connect the datasource to the InfoSource, and update rules connect the InfoSource to the data target (InfoCube).
    The PSA is the persistent staging area: whatever data is in R/3 becomes available in BW in the PSA first; it is the entry point of data into BW.
    I hope it will give you some idea.
    Thanks
    Thota

  • Data flow model for Master data extraction

    Hello friends,
    I have the following scenario, where the report layout looks like this. What would be the data flow model for this example if I'm loading master data from R/3?
    Table-Field: Meaning
    KNVV-VKORG: Sales Org
    KNA1-KATR1: BCG
    KNVP-KUNN2: Payer No
    KNA1-NAME1: Payer Name
    KNA1-KTOKD: Acc Grp
    KNVP-KUNNR: Sold-to No
    KNA1-NAME1: Sold-to Name
    I would appreciate any help.
    Thanks

    Balaji,
    are you trying to develop a master data InfoObject for the given field combination and want to know how to proceed?
    Did you check whether the current 0CUSTOMER master data already has the required fields?
    Otherwise you can:
    1. create a view on the master data tables, or
    2. create a function module (FM) for the same and use that as a generic datasource (see the sketch below).
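    If you take the FM route, the skeleton usually follows the SAP template RSAX_BIW_GET_DATA_SIMPLE. Here is a minimal sketch; the function name, the extract structure ZBW_CUST_S, and the selected fields are illustrative assumptions, not something from this thread:

    FUNCTION z_bw_get_customer.
    *"  Interface modeled on the template RSAX_BIW_GET_DATA_SIMPLE:
    *"  IMPORTING  VALUE(I_REQUNR)   TYPE SRSC_S_IF_SIMPLE-REQUNR
    *"             VALUE(I_DSOURCE)  TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"             VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"             VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"  TABLES     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"             I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"             E_T_DATA   STRUCTURE ZBW_CUST_S OPTIONAL
    *"  EXCEPTIONS NO_MORE_DATA ERROR_PASSED_TO_MESS_HANDLER

      STATICS s_cursor TYPE cursor.

      IF i_initflag = 'X'.
        " Initialization call: validate the datasource, buffer selections.
        EXIT.
      ENDIF.

      IF s_cursor IS INITIAL.
        " Open a database cursor; join KNA1/KNVV/KNVP here as needed.
        OPEN CURSOR WITH HOLD s_cursor FOR
          SELECT kunnr name1 ktokd FROM kna1.
      ENDIF.

      " Deliver the next data package; signal the end with NO_MORE_DATA.
      FETCH NEXT CURSOR s_cursor
        INTO CORRESPONDING FIELDS OF TABLE e_t_data
        PACKAGE SIZE i_maxsize.
      IF sy-subrc <> 0.
        CLOSE CURSOR s_cursor.
        RAISE no_more_data.
      ENDIF.

    ENDFUNCTION.

    The FM is then registered as a generic datasource (transaction RSO2), and the extract structure determines which fields are offered to BW.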
    I would suggest you look at the customer master that comes as part of business content first.
    Arun
    Assign points if useful

  • TRM Data Flow(0CFM_MC2)

    Hi gurus,
    I am constructing data flow for TRM transactional reporting.
    From 0CFM_MC2 down to the bottom (the datasources), could you please tell me the flow from the InfoCubes to the datasources?
    0CFM_MC2 (MultiProvider)
    +-- 0CFM_C11 (InfoCube)
    +-- 0CFM_C10 (InfoCube)
    Kind Regards.
    Eddy

    You can search for this on help.sap.com; it provides the flow.
    Positions According to Position Value Date - TRM Treasury and Risk Management - SAP Library

  • Getting 401 error while creating a Report Data Source with MOSS 2010 Foundation

    I have setup SQL Server 2008 R2 Reporting Services with SharePoint 2010 Foundation in SharePoint integrated mode. SharePoint Foundation is in machine 1 whereas SQL Server 2008 R2 and SSRS Report Server are in machine 2. While configuring Reporting
    Services - Sharepoint integration, I have used Authentication Mode as "Windows Authentication" (I need to use Kerberos).
    My objective is to setup a Data Connection Library, a Report Model Library, and a Reports Library so that I can upload a Report Data Source, some SMDLs, and a few Reports onto their respective libraries.
    While creating the top level site, "Business Intelligence Center" was not available for template selection since SharePoint Foundation is being used. I therefore selected "Blank Site" as the template.
    While creating a SharePoint Site under the top level site, for template selection I again had to select "Blank Site".
    I then proceeded to create a library for the data connection. Towards this, I created a new document library and selected "Basic page" as the document template. I then went to Library Settings for this newly created library and clicked on
    Advanced Settings. In the Advanced Settings page, for "Allow management of content types?" I selected "Yes". Then I clicked on "Add from existing content types" and selected "Report Data Source". I deleted the existing
    "Document" content type for this library.
    Now I wanted to create a Data Connection in the above Data Connection library. For this, when I clicked on "New Document" under "Documents" of "Library Tools" and selected "Report Data Source", I got the error "The request failed with HTTP status 401: Unauthorized.".
    Can anybody tell me why I am getting this error?
    Note: I have created the site and the library using SharePoint Admin account.

    Hi,
    Thank you for your detailed description. According to the description, I noticed that the report server is not part of the SharePoint farm. Add the report server to the SharePoint farm and see how it works.
    To join a report server to a SharePoint farm, the report server must be installed on a computer that has an instance of a SharePoint product or technology. You can install the report server before or after installing the SharePoint product or technology instance.
    For more information, see
    http://msdn.microsoft.com/en-us/library/bb283190.aspx
    Thanks.
    Tracy Cai
    TechNet Community Support

  • InfoSet join condition on key fields and data fields

    Hi Guys,
    I have a requirement to build an InfoSet with a join condition on two DSOs. The InfoObjects which I am using in the join condition are defined as data fields in one DSO and as key fields in the other DSO. Is it possible to define a join condition between key fields and data fields?
    The two InfoObjects are:
    0AC_DOC_NO
    0ITEM_NUM
    These two InfoObjects are defined as data fields in DSO 0LIV_DS1 (Invoice Verification) and as key fields in DSO 0FIAP_O03 (FI AP Line Item).
    Please advise whether it is possible to define the join condition on data fields and key fields.
    Thanks
    Best regards
    SG

    Hi
    yes, you can create the join; you will not get any issue at the reporting level.
    Example: say I want to create an InfoSet on 0MATERIAL and a sales DSO.
    In the 0MATERIAL InfoObject it is a key field, but in my DSO 0MATERIAL is a data field. We can still create the join.
    Creating a join depends on the fields that are common to your source objects.
    check out the below document
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2f5aa43f-0c01-0010-a990-9641d3d4eef7?QuickLink=index&overridelayout=true
    Regards,
    Venkatesh

  • R/3 data flow is timing out in Data Services

    I have created an R/3 data flow to pull some AP data in from SAP into Data Services.  This data flow outputs to a query object to select columns and then outputs to a table in the repository.  However the connection to SAP is not working correctly.  When I try to process the data flow it just idles for an hour until the SAP timeout throws an error.  Here is the error:
    R/3 CallReceive error <Function Z_AW_RFC_ABAP_INSTALL_AND_RUN: connection closed without message (CM_NO_DATA_RECEIVED)
    I have tested authorizations by adding SAP_ALL to the service account I'm using and the problem persists.
    Also, the transports have all been loaded correctly.
    My thought is that it is related to the setting that controls the method of generating and executing the ABAP code for the data flow, but I can't find any good documentation that describes this, and my trial and error method so far has not produced results.
    Any help is greatly appreciated.
    Thanks,
    Matt

    You can't find any good documentation??? I am working my butt off just.......just kiddin'
    I'd suggest we divide the question into two parts:
    My data flow takes a very long time; how can I prevent the timeout after an hour? Answer:
    Edit the datastore; there is a flag called "execute in background" to be enabled. With that, the ABAP is submitted as a background spool job and hence does not hit the dialog-mode timeout. Another advantage is that you can watch it running by browsing the spool jobs from the SAP GUI.
    The other question seems to be: why does it take that long at all? Answer:
    Either the ABAP takes that long because of the data volume,
    or the ABAP is not performing well, e.g. a join via ABAP loops with the wrong table as the inner one.
    Another typical reason is using direct_download as the transfer method. This is fine for testing, but it takes a very long time to download data via the GUI_DOWNLOAD ABAP function, and the download time is part of the ABAP execution.
    So my first set of questions would be
    a) How complex is the dataflow, is it just source - query - data_transfer or are there joins, lookups etc?
    b) What is the volume of the table(s)?
    c) What is your transfer method?
    d) Have you had a look at the generated abap? (in the R/3 dataflow open the menu Validation -> Generate ABAP)
    btw, some docs: https://wiki.sdn.sap.com:443/wiki/display/BOBJ/ConnectingtoSAP

  • SQL Query using a Variable in Data Flow Task

    I have a Data Flow task that I created. The source query is in the file "LPSreason.sql", which is stored on a shared drive such as
    \\servername\scripts\LPSreason.sql
    How can I use this .sql file as a SOURCE in my Data Flow task? I guess I can use SQL Command as the Access Mode, but I am not sure how to do that.

    Hi Desigal59,
    You can use a Flat File Source adapter to get the query statement from the .sql file. When creating the Flat File Connection Manager, set the Row delimiter to a character that won’t be in the SQL statement, such as “Vertical Bar {|}”. In this way, the Flat File Source outputs only one row with one column. If necessary, you can set the data type of the column from DT_STR to DT_TEXT so that the Flat File Source can handle a SQL statement which has more than 8000 characters.
    After that, connect the Flat File Source to a Recordset Destination, so that we store the column to a SSIS object variable (supposing the variable name is varQuery).
    In the Control Flow, we can use one of the following two methods to pass the value of the Object type variable varQuery to a String type variable QueryStr which can be used in an OLE DB Source directly.
    Method 1: via Script Task
    1. Add a Script Task under the Data Flow Task and connect them.
    2. Add User::varQuery as ReadOnlyVariables and User::QueryStr as ReadWriteVariables.
    3. Edit the script as follows:
    public void Main()
    {
        // Load the ADO recordset stored in varQuery into a DataTable.
        System.Data.OleDb.OleDbDataAdapter da = new System.Data.OleDb.OleDbDataAdapter();
        DataTable dt = new DataTable();
        da.Fill(dt, Dts.Variables["User::varQuery"].Value);

        // The single row/column holds the SQL text read from the .sql file.
        Dts.Variables["User::QueryStr"].Value = dt.Rows[0].ItemArray[0].ToString();

        Dts.TaskResult = (int)ScriptResults.Success;
    }
    4. Add another Data Flow Task under the Script Task, and join them. In the Data Flow Task, add an OLE DB Source, set its Data access mode to “SQL command from variable”, and select the variable User::QueryStr.
    Method 2: via Foreach Loop Container
    Add a Foreach Loop Container under the Data Flow Task, and join them.
    Set the enumerator of the Foreach Loop Container to Foreach ADO Enumerator, and select the ADO object source variable as User::varQuery.
    In the Variable Mappings tab, map the variable User::QueryStr to Index 0, so the first column of the current row is written into it.
    Inside the Foreach Loop Container, add a Data Flow Task like step 4 in method 1.
    Regards,
    Mike Yin
    TechNet Community Support

  • Automatic creation of BW data flow documentation

    Dear Gurus,
    I need to write documentation for the data flow of a huge project which I did not implement myself.
    The documentation should contain a mapping of the objects in each DataProvider to objects in the source system(s).
    Possibly also with the information in which DataProviders the objects are included, e.g. between the MultiProvider and the source system.
    Details of transformations can be ignored; at most mentioning that there is a routine involved.
    With the Metadata Repository I can get the content of cubes in a graphical overview, but it doesn't really provide me useful information.
    You can imagine I would prefer an automatic way to create this documentation.
    Does anybody know a solution, even if it only covers part of the purpose?
    Any solution via query, standard SAP or customized program, ...
    Recommendations would be very highly appreciated!
    Thx & Rgds, sam

    Documentation is written on SAP BW projects worldwide, but there has been no reply about automatic documentation.
    A lot of time must be lost by manually creating documentation that maps objects to source system fields.
    ==> SAP, please work out a solution.
    I didn't find a satisfying solution, but I've done it the following way:
    List all objects of a MultiProvider via the Metadata Repository, and paste them into an Excel document.
    Then list all objects of the underlying DataProviders, and paste them into separate sheets of this Excel file.
    Compare the objects of the MP with the objects on the other sheets using Excel functions, and mark where a DataProvider contains a certain object.
    For the datasources, I checked whether an object is present and, if yes, noted the original source field.
    In summary, not an optimal and not a complete solution, but it prevents mistakes.
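    At least the DataProvider-to-InfoObject listing could be scripted instead of pasted by hand. Below is a small ABAP sketch, under the assumption that the metadata table RSDCUBEIOBJ (InfoCube to InfoObject assignment, active version 'A') is available on your release; verify the table and field names before using it:

    REPORT zbw_list_cube_iobj.
    * Sketch: list the InfoObjects assigned to one InfoCube, to be
    * pasted into the Excel comparison sheets described above.
    * RSDCUBEIOBJ and its fields are assumptions - check your release.
    PARAMETERS p_cube TYPE rsinfocube.

    DATA lt_iobj TYPE STANDARD TABLE OF rsdcubeiobj.
    FIELD-SYMBOLS <ls_iobj> TYPE rsdcubeiobj.

    SELECT * FROM rsdcubeiobj INTO TABLE lt_iobj
      WHERE infocube = p_cube
        AND objvers  = 'A'.

    LOOP AT lt_iobj ASSIGNING <ls_iobj>.
      WRITE: / <ls_iobj>-iobjnm.
    ENDLOOP.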
    Rgds. sam

  • Help me in my data flow ... new to BI 7.0

    Dear friends,
    I am new to the data flow in BI 7.0. I am loading data into an InfoCube from a flat file which has 7 records, and I loaded the data into the InfoCube with 7 records. Then I added 2 records to my flat file; when I load it, I get 9 records in my PSA but 16 records in the InfoCube. I do not know how to handle this. What setting should I make in the DataSource maintenance? Please help me understand the data flow for BI 7.0; I am studying the help files as well.
    Regards,
    pavan

    Hello Pavan,
    1. The InfoPackages are the same.
    2. The processing type for the DTP must be DELTA.
    --> The source for the DTP is in this case the DataSource (PSA).
    --> So you need 3 processes: a delta InfoPackage from the flat file to the PSA, a delta DTP from the PSA to the DSO, and a delta DTP from the DSO to the InfoCube.
    3. "Only get delta once" means the source request is extracted via delta DTP from the source to the target exactly once. If you delete the target request from the InfoCube or DSO, the related source request will NOT be transferred to the target again with the next delta upload. Usually it would be.
    4. "Get data by request" means the delta DTP uses the same number of requests as in the source. Usually the DTP collects the data from the source and creates a new request that can include several source requests. With this flag, the DTP request uses the source requests to transfer the data into the target, so the number of requests is the same in the source and in the target.
    I hope this helps,
    Michael
