FIGL extraction

I am working on BI version 7.0.
Can anyone please tell me the steps to do FIGL extraction in BI? I need more help with the steps involved on the R/3 side. I am not very clear about this, as it is not LO extraction. So please help.
Thanks in advance
Naveen
Edited by: Naveen_Purdue on Nov 4, 2009 2:00 PM

Hi,
Check whether the DataSource, DSO, InfoCube and corresponding InfoObjects are installed.
-> If not, go to BI Content -> select the InfoProvider by InfoArea -> select the proper InfoCube -> select the source system -> install the BI Content with the "In Data Flow Before" grouping method.
-> Once the BI Content is installed, go to your DataSource -> replicate the DataSource -> and then activate it.
-> After that, create the transformation between your DataSource and the target (InfoProvider: InfoCube/DSO) -> activate it.
-> Then create the DTP (Data Transfer Process) between the DataSource and the target -> activate it.
-> Finally, go to the DataSource, create an InfoPackage and execute it, and then execute the DTP; your data will be loaded to the particular target (a small ABAP sketch for triggering the InfoPackage follows below).
Note: in BI 7.0 there is no need to create an InfoSource; we can load the data directly.
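For reference, once the InfoPackage exists it can also be triggered from ABAP instead of the Scheduler GUI, for instance from a custom program or job. A minimal sketch using the delivered BAPI; the InfoPackage name ZPAK_FIGL_EXAMPLE is a placeholder for the technical name shown in RSA1:

    * Minimal sketch: start an InfoPackage from ABAP.
    * 'ZPAK_FIGL_EXAMPLE' is a placeholder -- use your InfoPackage's
    * technical name as shown in RSA1.
    DATA: lv_requestid TYPE c LENGTH 30,
          lt_return    TYPE STANDARD TABLE OF bapiret2,
          ls_return    TYPE bapiret2.

    CALL FUNCTION 'BAPI_IPAK_START'
      EXPORTING
        infopackage = 'ZPAK_FIGL_EXAMPLE'
      IMPORTING
        requestid   = lv_requestid
      TABLES
        return      = lt_return.

    READ TABLE lt_return INTO ls_return WITH KEY type = 'E'.
    IF sy-subrc = 0.
      WRITE: / 'Error:', ls_return-message.
    ELSE.
      WRITE: / 'Load request started:', lv_requestid.
    ENDIF.

The DTP itself is typically executed afterwards via a process chain.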
Thanks
Regards
Areef.sd

Similar Messages

  • Special Extraction for FIGL

    Hi,
    I have a query in R/3 based on a FIGL logical database.
This query uses the GSEG structure and is able to spread the items of an accounting document across two columns. I need to model this in BW due to poor performance in R/3. GSEG returns, for one item in BSEG, the other items of the same document.
With the FIGL line item extractor I am not able to do this.
    Did someone face the same problem?
    Thanks for your help.
    Oliver

Hi,
You have not mentioned what exact error is displayed; it might be something like inactive transfer/update rules or an inactive transformation, special characters, etc. As you said in your post, the plant maintenance DataSource has been replicated, due to which the transformation may be inactive.
"This is a plant maintenance cube and I replicated the DataSource again in BI Production and it is active."
    Regards
    sivaraju
    Edited by: kolli sivaraju on Jul 6, 2011 8:20 AM

  • Finance GL/ AR /AP extraction

    Hello Experts,
I am a newbie with respect to BI/BW extraction. Please read through and let me know whether all the configuration is proper and whether any change is required.
I do not have access to SAP Notes. I have referred to the best practices, but for the GL accounts the document is somewhat off-track, or I believe it is incomplete.
Here it goes.
    For GL extraction.
General Ledger (New): Transaction Figures
ODS object 0FIGL_O10 is connected to InfoSource 0FI_GL_6, which is in turn connected to DataSource 0FI_GL_6.
FIGL: Transaction Figures - Cost of Sales Ledger
ODS object 0FIGL_O07 is connected to InfoSource 0FI_GL_7, which is in turn connected to DataSource 0FI_GL_7.
General Ledger: Line Items
ODS object 0FIGL_O02 is connected to InfoSource 0FI_GL_4, which is in turn connected to DataSource 0FI_GL_4.
For AR extraction
FIAR: Transaction Figures
ODS object 0FIAR_O06 is connected to InfoSource 0FI_AR_6, which is in turn connected to DataSource 0FI_AR_6.
FIAR: Line Items
ODS object 0FIAR_O03 is connected to InfoSource 0FI_AR_4, which is in turn connected to DataSource 0FI_AR_4.
For AP extraction
FIAP: Transaction Figures
ODS object 0FIAP_O06 is connected to InfoSource 0FI_AP_6, which is in turn connected to DataSource 0FI_AP_6.
FIAP: Line Items
ODS object 0FIAP_O03 is connected to InfoSource 0FI_AP_4, which is in turn connected to DataSource 0FI_AP_4.
I need to know whether the above configuration is good. My next question is with regard to the InfoPackages. All of the above sources are delta-capable, so I need to know how to begin loading the data.
    This is what I see in the infopackage.
    Under the update tab in the infopackage
    Update mode
    Full Update
    Initialize Delta Process
    > Initialization with Data transfer
    > Initialization without Data Transfer
    > Early Delta Initialization.
I am thinking that the following extraction steps would be appropriate:
1) Do a full update of the GL line items, transaction figures and cost of sales, and then do a full update for the AP and AR line items and transaction figures.
2) Initialization with data transfer should be done once the full upload is done.
Please throw some more light on this. Your help would be highly appreciated and awarded accordingly.
    Thank you.

Let's take just one flow into consideration, for instance the GL loading process.
The simplest staging scenario needs a first InfoPackage (IP1) for initializing the delta loading process, and a second InfoPackage (IP2) for delta loading.
No InfoPackage for a full loading would be needed in this basic scenario.
You'll run IP1 once. Without any restriction in the selection tab, the delta process will be initialized for the entire set of data provided by the DataSource.
You'll run IP2 every day. This package will transfer all the new/changed/deleted (delta) records. As the delta process has been initialized for the entire dataset, it will be delta-sensitive to the entire dataset.
For efficiency reasons (a delta process is less time-consuming when it runs against a smaller dataset), you could define one or more InfoPackages for loading chunks of data in full mode.
Let's say you have to load 2 years of data. You might create another InfoPackage (IP3) which will load the past year's data in full mode.
When editing IP3, you can set this restriction in the Data Selection tab by setting the fiscal year to the previous year. Moreover, in the Update Parameters tab the update mode needs to be Full.
On the other hand, you have to edit IP1 and set the fiscal year to the current year: in the Data Selection tab, fill the From Value field with the current year and the To Value field with the year 9999 -- this way the delta process will also work next year and so on.
This change will allow you to initialize the delta process for the current year's data -- and any data that will be posted in the future.
You will run IP3 once, and you'll get the past year's data. Next you'll run IP1 once to initialize the delta process for this year's data. Finally you'll schedule the run of IP2 for every day.
The number of full data loadings you'll need depends on the amount of data you have to fetch from the source system.
Cheers, Davide (https://wiki.sdn.sap.com/wiki/display/profile/Davide+Cavallari)
    Message was edited by:
            Davide Cavallari

  • Line Item FI GL (new) extraction with out EHP3

    Hi,
In ECC 6.0, we have used the New GL functionality for the FI-GL configuration.
However, we don't have EHP3 on ECC 6.0.
So I (BI extraction) can't use 0FI_GL_14 --> 0FIGL_O14 (for line items) without EHP3; this DataSource reads from table FAGLFLEXA.
And I can't use the older version DataSource 0FI_GL_4, General Ledger Accounting: Line Items (tables BSEG, BKPF get filled).
I don't want the totals:
0FI_GL_10 --> 0FIGL_C10 (for totals) - with New G/L in R/3, this reads the totals table FAGLFLEXT.
I need to know whether there is any other DataSource which will provide the data for FI-GL (new) line items.
How do I do the extraction for FI-GL (new) line items?
Is there any replacement for DataSource 0FI_GL_4?
Any suggestions regarding the installation of EHP3?
    Regards,
    Vijay Malyavantham

Hi,
Thanks for the reply.
As per your suggestion, we need to bring the GL view parameters from the FAGL* tables. I need clarification here:
1. In our case, a single line item in BSIK may correspond to multiple profit centers and hence multiple line items in the FAGL* tables.
2. Users run the report for multiple vendors (a big range) with their profit centers as selection criteria.
3. In the above case, we need to select all the line items for the vendors, go to FAGLFLEXA, select all line items, and delete all line items not pertaining to the profit center entered in the selection.
4. The above process would be time-consuming, since all the open items are selected first and then all the items which are not required are deleted.
5. Alternatively, we can select the items from FAGLFLEXA directly for the profit centers entered and bring the additional information from the BSIK table (a rough sketch of this follows after this post). Since vendor accounts are not in FAGLFLEXA, this could also be cumbersome and time-consuming.
6. Which of the above approaches would you recommend?
7. Is it possible to modify the ACO report so that we can add additional information such as posting date, due date, etc.?
    Regards
    S.Radhakrishnan
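To make approach 5 concrete, here is a rough, illustrative report sketch of that selection logic, assuming the leading ledger 0L and a document-level join between FAGLFLEXA and BSIK. The table and field names are standard; the report itself and its selection screen are placeholders:

    REPORT zfi_prctr_vendor_sketch.
    * Illustrative sketch of approach 5: select new-GL line items by
    * profit center from FAGLFLEXA first, then read the matching vendor
    * open items from BSIK at document level. Assumes the leading
    * ledger is 0L; adjust to your ledger setup.
    TABLES: faglflexa, bsik.

    SELECT-OPTIONS: s_prctr FOR faglflexa-prctr,
                    s_lifnr FOR bsik-lifnr.

    TYPES: BEGIN OF ty_gl,
             rbukrs TYPE faglflexa-rbukrs,
             docnr  TYPE faglflexa-docnr,
             ryear  TYPE faglflexa-ryear,
             prctr  TYPE faglflexa-prctr,
           END OF ty_gl.

    DATA: lt_gl TYPE STANDARD TABLE OF ty_gl,
          lt_ap TYPE STANDARD TABLE OF bsik,
          ls_ap TYPE bsik.

    START-OF-SELECTION.
      SELECT rbukrs docnr ryear prctr
        FROM faglflexa INTO TABLE lt_gl
        WHERE rldnr = '0L'
          AND prctr IN s_prctr.

      IF lt_gl IS NOT INITIAL.
    *   document-level join back to the vendor open items
        SELECT * FROM bsik INTO TABLE lt_ap
          FOR ALL ENTRIES IN lt_gl
          WHERE bukrs = lt_gl-rbukrs
            AND lifnr IN s_lifnr
            AND belnr = lt_gl-docnr
            AND gjahr = lt_gl-ryear.
      ENDIF.

      LOOP AT lt_ap INTO ls_ap.
        WRITE: / ls_ap-bukrs, ls_ap-lifnr, ls_ap-belnr, ls_ap-budat.
      ENDLOOP.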

  • FI/GL extraction

    Hi Experts,
Currently I'm working with the FIGL module, extracting data from the 0FI_GL_6 and 0FI_GL_4 DataSources. I'm not getting data into BW, and the request is in yellow status. I'm getting the error mentioned below while extracting data with both full and init loads. Could anyone please help; it's an urgent issue.
    Diagnosis
    No errors found. The current process has probably not finished yet.
    System Response
    The ALE inbox of BI is identical to the ALE outbox of the source system
    or
    the maximum wait time for this request has not yet been exceeded
    or
    the background job has not yet finished in the source system.
    Current status
    Data request was not successfully sent off
    Thanks & Regards,
    Amar

    Hi,
I suggest you check a few places where you can see the status:
1) SM37 job log (in the source system if the load is from R/3, or in BW if it's a datamart load): give the request name and it should show you the details of the request. If it's active, make sure that the job log is being updated at frequent intervals.
Also see if there is any 'sysfail' for any data packet in SM37.
2) SM66: get the job details (server name, PID etc. from SM37) and check in SM66 whether the job is running or not (in the source system if the load is from R/3, or in BW if it's a datamart load). See whether it's accessing/updating some tables or not doing anything at all.
3) RSMO: see what is shown in the Details tab. It may be stuck in the update rules.
4) ST22: check whether any short dump has occurred (in the source system if the load is from R/3, or in BW if it's a datamart load).
5) SM58 and BD87: check for pending tRFCs.
Once you identify the cause you can rectify the error.
If all the records are in the PSA you can pull them from the PSA to the target. Otherwise you may have to pull them again from the source InfoProvider.
If it's running and you are able to see it active in SM66, you can wait for some time to let it finish. You can also try SM50 / SM51 to see what is happening at the system level, like reading from or inserting into tables.
If you feel it's active and running, you can verify this by checking whether the number of records has increased in the data tables (a tiny sketch follows below).
    Thanks,
    JituK
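For that last check, a tiny sketch for watching the record count grow. The table name here is a placeholder (the active table of a DSO follows the pattern /BI0/A<DSO>00 or /BIC/A<DSO>00 and is shown in the object's maintenance screen) and is read dynamically so the snippet stays generic:

    * Quick record count on a target table; substitute the active table
    * of your DSO (or the fact table of your cube) for the placeholder.
    DATA: lv_tabname TYPE tabname VALUE '/BI0/AFIGL_O0200',
          lv_count   TYPE i.

    SELECT COUNT(*) INTO lv_count FROM (lv_tabname).
    WRITE: / lv_tabname, lv_count.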

  • FIGL Init Load Failed

Hi BW Gurus,
I tried to load the FIGL line item extractor FI_GL_14 and it failed after 2 days,
due to memory and system resources.
I plan to do the following:
1. Create an InfoPackage for each fiscal period and do a full load.
2. After all the loads, do an init without data transfer.
3. Then schedule the delta.
Will this work?
Also, please advise whether I can schedule all the full loads for the fiscal periods in parallel.
Also, in ECC the users will be posting to FI-GL during the extract. I don't think this will have any impact on the extract; is that correct?
Please let me know whether the above solution will work.
    Thanks
    Senthil

Hi Senthil,
Please do not do the full loads with an init w/o data transfer afterwards. You have two options:
1) Perform the init w/o data transfer first and then do the full loads. This way, whatever is changed after the init w/o data transfer will be captured as delta, so you won't lose any data. If you perform the full loads first, you will lose the deltas posted up to the time you perform the init.
(This assumes everything is in overwrite mode on the first-level DSO, which is generally the case for 0FI_GL.)
          OR
2) Perform inits with fiscal period ranges. You can do any number of inits with period ranges. Subsequent deltas will automatically cover the correct fiscal period ranges.
To answer your other question: of course you can run the full loads for the fiscal periods in parallel; there is no problem with that.
If you follow one of the above two methods you won't miss even a single record.
Let me know if you have any doubts. Please give marks accordingly.
    Regards,
    Viren
    Edited by: Viren Devi on Mar 15, 2009 12:16 AM

  • How to extract Cleared and Open Items from SAP R/3?

    Hi experts,
I have a requirement from a user to extract cleared and open invoice items from SAP R/3.
Can someone tell me how to do that?
    Thanks!

    Hi,
Use the DataSource 0FI_AR_4 to know the status (0FI_DOCSTAT) of the payments to be made by customers.
OR
Enhance 2LIS_13_VDITM with VBUP fields, which will give the status of billing (a sketch follows below).
    With rgds,
    Anil Kumar Sharma .P
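For the enhancement route, the classic pattern is an append field on the extract structure, filled in customer exit RSAP0001 (function exit EXIT_SAPLRSAP_001, include ZXRSAU01, transaction CMOD). A minimal sketch, assuming a hypothetical append field ZZGBSTA on extract structure MC13VD0ITM and assuming the sales order reference fields AUBEL/AUPOS (which come from VBRP) are present in the structure; older plug-in releases use I_ISOURCE instead of I_DATASOURCE:

    * Include ZXRSAU01 (customer exit EXIT_SAPLRSAP_001 of enhancement
    * RSAP0001, transaction CMOD). Sketch only: assumes the extract
    * structure MC13VD0ITM has been appended with field ZZGBSTA.
    DATA: l_s_vditm TYPE mc13vd0itm,
          l_tabix   TYPE sy-tabix.

    CASE i_datasource.
      WHEN '2LIS_13_VDITM'.
        LOOP AT c_t_data INTO l_s_vditm.
          l_tabix = sy-tabix.
    *     overall processing status of the underlying sales order item
          SELECT SINGLE gbsta FROM vbup INTO l_s_vditm-zzgbsta
            WHERE vbeln = l_s_vditm-aubel
              AND posnr = l_s_vditm-aupos.
          MODIFY c_t_data FROM l_s_vditm INDEX l_tabix.
        ENDLOOP.
    ENDCASE.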

  • Extraction and loading of Training and Event Management data (0HR_PE_1)

Hello,
I've got the following doubt:
before the BI 7.0 release, extractor 0HR_PE_1 extracts event data (event ID, attendee ID, calendar day, ...), but if you load it straight into cube 0PE_C01, then, since Calendar Year/Month needs a reference date (for example, the event start date), you get the total duration of the event in hours or days assigned to the event start date.
So in a query filtered by month, you get the total duration of an event that starts in the filtered month but may not end until a few months or even a year later, so you don't get appropriate information.
Example:
Event 10004377, calday 20081120, 500 hours -- but from the event attributes the event end date is 20090410.
In a query filtered by 200811 you get the total duration (500 hours, when in 200811 the event hours were only 20), and if you filter by any month of 2009 you get no duration information for that event.
I had to create a copy of the standard cube (without calday; only Calendar Year/Month and Calendar Year in the time dimension) and disaggregate the data, creating as many entries for an event as the months it lasts and adjusting the ratio calculation (duration of the event in each month / total duration of the event); a sketch of this split follows below.
    The question is: Is there any improvement or change on business content in BI 7.0 to treat Training and Event Management data? Has anybody had to deal with that?
    Thank you very much.
    IRB
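The work-around described above is essentially a pro-rata split of the event duration over the months the event spans. A standalone sketch of that calculation, assuming a simple split proportional to calendar days (the poster's real ratios came from the actual event schedule); the figures are taken from the example above, and the structure is illustrative, not the cube's:

    REPORT zbi_tem_split_sketch.
    * Sketch: spread the total duration of an event over the calendar
    * months it spans, proportionally to the days in each month.
    TYPES: BEGIN OF ty_split,
             calmonth TYPE n LENGTH 6,                " YYYYMM
             hours    TYPE p LENGTH 8 DECIMALS 2,
           END OF ty_split.

    DATA: lt_split     TYPE STANDARD TABLE OF ty_split,
          ls_split     TYPE ty_split,
          lv_begda     TYPE d VALUE '20081120',       " event start
          lv_endda     TYPE d VALUE '20090410',       " event end
          lv_hours     TYPE p LENGTH 8 DECIMALS 2 VALUE '500',
          lv_total     TYPE i,
          lv_day       TYPE d,
          lv_month_end TYPE d,
          lv_days      TYPE i.

    lv_total = lv_endda - lv_begda + 1.      " event duration in days
    lv_day = lv_begda.
    WHILE lv_day <= lv_endda.
    * last day of the current month: go to the 1st, add 31 days,
    * go back to the 1st of the following month, subtract one day
      lv_month_end = lv_day.
      lv_month_end+6(2) = '01'.
      lv_month_end = lv_month_end + 31.
      lv_month_end+6(2) = '01'.
      lv_month_end = lv_month_end - 1.
      IF lv_month_end > lv_endda.
        lv_month_end = lv_endda.             " clamp to the event end
      ENDIF.
      lv_days = lv_month_end - lv_day + 1.   " event days in this month
      ls_split-calmonth = lv_day(6).
      ls_split-hours = lv_hours * lv_days / lv_total.
      APPEND ls_split TO lt_split.
      lv_day = lv_month_end + 1.
    ENDWHILE.

    LOOP AT lt_split INTO ls_split.
      WRITE: / ls_split-calmonth, ls_split-hours.  " one line per month
    ENDLOOP.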

Hi,
TEM data is stored in the HRP tables.
You can load the catalog by creating LSMWs for the objects Business Event Group (L), Business Event Type (D), Location (F) and Organizer (U), as required.
An LSMW for transaction PP01 can be used to create these objects.
To create Business Events (E) you can create an LSMW for PV10/PV11.
To book an attendee, create an LSMW for transaction PV08, as there you can specify the actual business event ID, which reduces ambiguity.
Use transaction PV12 to firmly book events,
and transaction PV15 to follow up.
    Hope this helps.
    Regards,
    Shreyasi.

  • How to Extract Data for a Maintenance View, Structure and Cluster Table

I want to develop 3 reports.
1) The first report consists of only two fields:
Table: V_001_B, field BUKRS
Table: V_001_B, field BUTXT
V_001_B is a maintenance view,
and I don't find any DataSource for it.
How do I extract the data for this maintenance view? (A sketch follows at the end of this thread.)
2) The second report also consists of two fields:
Table: CSKSZ, field KOSTL (cost center)
Table: CSKSZ, field KLTXT (description)
CSKSZ is a structure,
and I don't find any DataSource for it.
How do I extract the data for this structure?
3) In the third report, all fields belong to the table BSEG.
BSEG is a cluster table.
For this one also I can't find any DataSource;
I find very few objects in the DataSources.
How do I extract this data?
Please provide me a step-by-step procedure.
    Thanks
    Priya

Hi Sachin,
I don't get your point; can you explain it briefly?
I have two fields for the 1st report:
BUKRS
BUTXT
In the 2nd report:
KOSTL
KLTXT
If I use the 0COSTCENTER_TEXT DataSource
I will get the KOSTL field only.
What about KLTXT?
    Thanks
    Priya
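For the first report (BUKRS/BUTXT), one common option is a generic DataSource (SBIW/RSO2) with a function-module extractor over T001, the base table of maintenance view V_001_B. A condensed sketch modeled on the delivered template RSAX_BIW_GET_DATA_SIMPLE; the function name Z_BIW_GET_T001 and the extract structure ZBI_S_T001 (fields BUKRS, BUTXT) are assumptions, and the selection criteria in I_T_SELECT are ignored for brevity:

    FUNCTION z_biw_get_t001.
    *"  IMPORTING
    *"     VALUE(I_REQUNR)   TYPE SRSC_S_IF_SIMPLE-REQUNR
    *"     VALUE(I_DSOURCE)  TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"  TABLES
    *"     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"     I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"     E_T_DATA   STRUCTURE ZBI_S_T001 OPTIONAL
    *"  EXCEPTIONS
    *"     NO_MORE_DATA
    *"     ERROR_PASSED_TO_MESS_HANDLER
    * Condensed sketch of a generic extractor on T001 (base table of
    * maintenance view V_001_B), modeled on the delivered template
    * RSAX_BIW_GET_DATA_SIMPLE.
      STATICS: s_cursor TYPE cursor,
               s_opened TYPE c LENGTH 1.

      IF i_initflag = 'X'.        " init call: parameters only, no data
        CLEAR s_opened.
        EXIT.
      ENDIF.

      IF s_opened IS INITIAL.     " first data call: open the cursor
        OPEN CURSOR WITH HOLD s_cursor FOR
          SELECT bukrs butxt FROM t001.
        s_opened = 'X'.
      ENDIF.

      FETCH NEXT CURSOR s_cursor
        INTO CORRESPONDING FIELDS OF TABLE e_t_data
        PACKAGE SIZE i_maxsize.
      IF sy-subrc <> 0.           " no more data packages
        CLOSE CURSOR s_cursor.
        CLEAR s_opened.
        RAISE no_more_data.
      ENDIF.
    ENDFUNCTION.

The same function-module technique also covers the structure (CSKSZ) and cluster table (BSEG) cases, since the extractor can fill the extract structure from any ABAP source, not just a transparent table or view.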

  • Basic extraction  doubts sap SAP R/3 to SAP BI

In SBIW, which radio button should I select if I am extracting a table with 77 fields from R/3 into SAP BI?
1) Transaction data, or
2) Master data attributes, or
3) Texts?
Please tell me...
After that, if I want to use the table for reporting in SAP BI, which is the best data target to create in SAP BI?
a) InfoCube
b) DSO
And please tell me the steps to follow in the SAP BI system.
I am asking all this because I am new to SAP BI.
Please do write to me... thank you!

Hi,
I suppose you need transaction data; we could be sure if you posted the table which you want to extract.
The InfoProvider you need depends on the data you need for reporting. Please search SDN or help.sap.com for modelling help.
    Regards
    Edited by: Zoltan Both on Aug 20, 2008 1:14 PM
    Edited by: Zoltan Both on Aug 20, 2008 1:16 PM

  • BODS 3.1 : SAP R/3 data extraction -What is the difference in 2 dataflows?

    Hi.
Can anyone advise what the difference is between the two data flows for extracting data from SAP R/3?
1) DF1 >> SAP R/3 (R/3 table - query transformation - .dat file) >> query transformation >> target
This ABAP data flow generates an ABAP program and a .dat file.
We can also upload this program and run jobs with the 'Execute preloaded' option on the datastore.
This works fine.
2) We can also pull the SAP R/3 table directly:
DF2 >> SAP R/3 table (this has a red arrow, like an OHD) >> query transformation >> target
This also works fine, and we are able to see the data directly in Oracle;
it can also be scheduled as a job.
But I am unable to understand the purpose of the different types of data extraction flows:
when to use which type of flow for data extraction,
and the advantages/disadvantages of the two data flows.
What we do not understand is this:
if we can directly pull data from the R/3 table through a query transformation into the target table,
why use the flow that creates an R/3 data flow,
then does a query transformation again,
and then populates the target database?
There might be some practical reasons for using these two different types of flows, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM

Hi Jeff,
Greetings, and many thanks for your response.
Generally we pull the entire SAP R/3 table through a query transformation into Oracle,
for which we use an R/3 data flow and the ABAP program, which we upload to the R/3 system
so as to be able to use the 'Execute preloaded' option and run the jobs.
Since we do not have any control over our R/3 servers, nor do we have anyone for ABAP programming,
we do not do anything at the SAP R/3 level.
I was doing trial-and-error testing on our workflows for our new requirement:
WF1, which has some 15 R/3 tables.
For each table we have created a separate data flow.
And finally, in some data flows where the SAP tables had a lot of rows, I decided to pull the data directly,
bypassing the ABAP flow.
And still the entire workflow and data extraction happen fine.
In fact I tried creating a new sample data flow and tested it,
using direct download and also 'Execute preloaded'.
I did not see any major difference in the time taken for the data extraction,
because in any case we pull the entire table, then choose whatever we want to bring into Oracle through a view for our BO reporting, or aggregate it and then bring the data in as a table for Universe consumption.
Actually, I was looking at other options to avoid this ABAP generation and the R/3 data flow, because we are having problems in our dev and QA environments: delimiter errors. In production it works fine. The production environment is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and they are the ones having this delimiter error.
I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
Trying to resolve this problem, I ended up with the option of pulling the R/3 table directly, without the ABAP workflow, just by trial and error of each and every drag-and-drop option, because we urgently had to do a POC and deliver the data for the entire e-recruiting module of SAP.
I don't know whether I can do this direct pulling of data for the new job which I have created,
which has 2 workflows with 15 data flows in each workflow,
and push this job into production.
I also don't know whether I can bypass the ABAP flow and pull R/3 data directly in all the data flows for any of our future SAP R/3 data extraction requirements. This technical understanding of the difference between the 2 flows is not clear to us, and being new to the whole of ETL, I just wanted to know the pros and cons of this particular kind of data extraction.
As advised, I shall check the schedules for a week, and then we shall probably move it into production.
    Thanks again.
    Kind Regards
    Indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM

  • Open Hub: How-to doc "How to Extract data with Open Hub to a Logical File"

    Hi all,
We are using open hub to download transaction files from InfoCubes to the application server, and we would like to have a filename which is dynamic based on period and year, i.e. the period and year of the transaction data to be downloaded.
I understand we could use a logical file for this purpose. However, we are not sure how to derive the period and year dynamically in the filename.
I have read a number of posted messages on a similar topic here on SDN, and many have suggested a how-to paper titled "How to Extract Data with Open Hub to a Logical Filename". However, I cannot seem to get the document from the link given.
I just wonder whether anyone has the correct or latest link to the document, or I would appreciate it if you could share the document with everyone on SDN if you have a copy.
    Many thanks and best regards,
    Victoria

    Hi,
After creating the open hub destination, press F1 in the 'Application server file name' text box; in the help window, click 'Maintain client-independent file names and file paths'. You will be taken to the Implementation Guide screen:
1. Click 'Cross-client maintenance of file names' and create a logical file path via New Entries.
2. Then go to 'Logical file name definition': enter your logical file name, the physical file (your file name followed by month or year, whatever is applicable; press F1 for more info), data format (ASC), application area (BW), and the logical path (choose via F4 the path you created in the first step).
3. Finally, go to 'Assignment of physical paths to logical path': enter the syntax group; the physical path is the path you gave in the logical file name definition.
We created a logical file name that identifies the file by system date, but your requirement seems to be the dynamic date of the transaction data; you may be able to achieve this by creating a variable (a small sketch follows below). The F1 help is of much use here. All the above steps will help you create a dynamic logical file.
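As a sketch of the runtime side: a program can resolve the logical file name with the standard function module FILE_GET_NAME, passing the dynamic part -- such as year/period -- through PARAMETER_1, provided the physical file name maintained in transaction FILE contains the <PARAM_1> placeholder. The logical name ZBW_OHUB_FILE is an assumption:

    * Minimal sketch: resolve a logical file name at runtime.
    * 'ZBW_OHUB_FILE' is an assumed logical file name whose physical
    * file name contains the <PARAM_1> placeholder (transaction FILE).
    DATA: lv_fname TYPE filename-fileextern.

    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZBW_OHUB_FILE'
        parameter_1      = '2011001'   " e.g. year/period of the data
      IMPORTING
        file_name        = lv_fname
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    IF sy-subrc = 0.
      WRITE: / lv_fname.
    ENDIF.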
Hope this helps you to some extent.
    Regards

  • F.01 - extract to consolidation on company code level

    Hi everyone
We're trying to do an extract to consolidation at company code level (1). When the popup dialog "Filename for extraction file" appears, we can choose between the presentation server and the application server.
When choosing the presentation server and the logical filename ECCS_FILE, we get error GC027 (Error opening file c:\temp\EXTRACT_CS_CO.txt).
Doing the same via the application server works (although the path isn't the same).
We have already checked and double-checked the settings of the logical file name and logical path in transaction FILE.
Any suggestions what to do, or isn't it supposed to work via the presentation server?
    Thanks in advance and best regards to all
    Renaud

I have not seen anything in what you have mentioned about where to enter the file path.
Has this been customised for your company?
How did you download this before?

  • Extract Data with OPEN HUB to a Logical Filename

    Hi Experts,
Can anybody help me by sending the link to the how-to guide "Extract Data with OPEN HUB to a Logical Filename"?
    Thanks in advance.
    BWUser

    Hi,
Check these links:
    http://searchcrm.techtarget.com/generic/0,295582,sid21_gci1224995,00.html
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e698aa90-0201-0010-7982-b498e02af76b
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1570a990-0201-0010-1280-bcc9c10c99ee
Hope this may help you.
    Regards,
    shikha

  • InfoSpoke Flat File Extract to Logical Filename

I'm trying to extract data from an ODS to a flat file. So far, I've found that the InfoSpoke must write to the application server for large data volumes. Also, in order for the InfoSpoke to transport properly, I must use logical filenames. I've attempted to implement the custom class and append structure as defined in the SAP document "How To... Extract Data with OPEN HUB to a Customer Defined Logical Filename". I'm getting an error when attempting to import the included transports (custom class code). It appears to be a syntax error. Has anyone encountered this and, if so, how did you fix it?

Hello,
I'm getting a syntax error also. I did not import the transport, but applied the notes through the appendix. When I modified the method GET_OBJECT_REF_INT in class CL_RSB_DEST as below, I got a syntax error on the CREATE OBJECT statement.
        when rsbo_c_desttype_int-file_applsrv.
    *{   REPLACE        &$&$&$&$                                          1
    *\      data: l_r_file_applsrv type ref to cl_rsb_file_applsrv.
          data: l_r_file_applsrv type ref to zcl_rsb_file_logical.
    *}   REPLACE
          create object l_r_file_applsrv
            exporting i_dest    = n_dest
                      i_objvers = i_objvers
The syntax error reads: Class CL_RSB_DEST, method GET_OBJECT_REF_INT: The obligatory parameter "I_S_VDEST" had no value assigned to it.
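For what it's worth, the CREATE OBJECT statement as quoted is also incomplete (no closing period), and the error text itself points at the missing parameter. A hedged guess at the corrected call -- assuming an I_S_VDEST value is available in GET_OBJECT_REF_INT to pass through; verify the constructor signature of ZCL_RSB_FILE_LOGICAL in SE24 before applying anything:

    * Hedged sketch only: the syntax error says the constructor's
    * obligatory parameter I_S_VDEST was not supplied. Assuming a
    * suitable i_s_vdest exists in this method, the call would become:
          create object l_r_file_applsrv
            exporting i_dest    = n_dest
                      i_objvers = i_objvers
                      i_s_vdest = i_s_vdest.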
