GL Hierarchy Extraction

Hi All,
The function module FAGL_FSV_POS_READ is not giving me all the information from a Financial Statement Version (maintained with tcode FSE3). It returns only the first parent node's information, but I need to retrieve all the node information.
Can anyone please suggest a function module, BAPI, or extraction logic for this?
Thanks
Rahul

Hi Rahul,
The FM 'FAGL_FSV_POS_READ' does give all the information. The table parameter ET_011Z holds the item (account assignment) details and ET_011P holds the node details. You need to loop through these tables in order to display them as a tree structure (a rough sketch follows below).
Thanks,
Mahesh
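
To make that concrete, here is a minimal, unverified ABAP sketch of reading one FSV and walking the node and item tables. The importing parameter name (I_VERSN), the line types (FAGL_011PC / FAGL_011ZC) and the field names (ERGSL, PARENT, VONKT, BISKT) are assumptions based on the underlying FSV tables; check the actual interface of FAGL_FSV_POS_READ in SE37 before using it.

  " Rough sketch only: read a Financial Statement Version and list every
  " node together with the account intervals assigned to it.
  " Parameter and field names are assumptions; verify them in SE37/SE11.
  DATA: lt_nodes TYPE STANDARD TABLE OF fagl_011pc,  " FSV node records
        lt_items TYPE STANDARD TABLE OF fagl_011zc,  " account assignments
        ls_node  TYPE fagl_011pc,
        ls_item  TYPE fagl_011zc.

  CALL FUNCTION 'FAGL_FSV_POS_READ'
    EXPORTING
      i_versn = 'INT'        " FSV version as maintained in FSE3 (assumed name)
    TABLES
      et_011p = lt_nodes     " node details (the FM may expose these as
      et_011z = lt_items.    " EXPORTING table parameters instead of TABLES)

  " Every node record carries a reference to its parent node, so the
  " complete hierarchy (not only the first level) can be rebuilt here.
  LOOP AT lt_nodes INTO ls_node.
    WRITE: / 'Node', ls_node-ergsl, 'parent', ls_node-parent.
    LOOP AT lt_items INTO ls_item WHERE ergsl = ls_node-ergsl.
      WRITE: / '   accounts', ls_item-vonkt, 'to', ls_item-biskt.
    ENDLOOP.
  ENDLOOP.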

Similar Messages

  • Reg : Hierarchy extraction from Source

    Hello Friends,
    I have an issue with hierarchy extraction.
    I am trying to load 0WBS_ELEMT (hierarchy) data (full update) for the first time, and it displays the following error (it is extracted from SAP R/3):
    Diagnosis
        The hierarchy level of the parent node must be one larger than the
        hierarchy level of the actual node. This is not the case for the node
        with ID 00075558 and its parent node with ID 00000001.
    System response
    Procedure
        Correct the hierarchy level of the node or of the parent node in your
        source system.
    Procedure for System Administration
    Can anyone who has encountered a similar issue share their experience with me, please?
    Kind Regards,
    Sai Gautam Pusala

    It's resolved.
    Thanks,
    Sai Gautam
    Edited by: Sai Gautam Pusala on Oct 21, 2011 2:14 PM

  • Virtual Time Hierarchy parameters

    Hello
    I have a few Virtual Time Hierarchies defined on our BW system.
    These are automatically shown in BusinessObjects Web Intelligence (Webi) when accessing a BEx query, but they always show only one level.
    If I define one of the virtual hierarchies on my object in BEx and set the number of levels manually, this is correctly represented in Webi.
    Apparently the default level setting for Virtual Time Hierarchies is level = 1.
    Is there anywhere this can be changed?
    I would like to set the correct level for each Virtual Time Hierarchy so they are all usable in Webi, instead of only the one that was defined in the BEx query.
    Regards
    Filip

    Hello Jamie
    Did you find a solution for this?
    I found that it is possible if you activate the hierarchy display in the BEx query.
    Then you select the virtual time hierarchy that you want to be shown.
    The important thing is to set the option "Expand to level".
    Once this is done, the hierarchy is reflected in Webi,
    BUT the measures are not shown correctly if you don't also have the lowest level of the hierarchy in your Webi query...
    Sample:
    YeaQuaMon
    - set "Expand to level" = 3 in BEx
    - in the Webi report you should also put the Month in the query (the lowest level of YeaQuaMon), otherwise the aggregation does not come out properly...
    I've been looking for a solution for the default level of the Virtual Time hierarchies, but I was not able to find one...
    Regards
    f

  • Extract sub-Hierarchy

    Hello,
    I actually have a problem with hierarchy extraction: I need to publish a sub-hierarchy.
    My idea:
    Create a transformation between the full hierarchy and a sub-hierarchy. Within the transformation, I would filter out the nodes I do not want to publish in the sub-hierarchy. However, it is not possible to use a transformation between hierarchies.
    I then tried to build a generic extractor on the /BIC/H table and wanted to transform this generic DataSource into the sub-hierarchy --> not possible, because I cannot create a generic hierarchy DataSource (apart from flat files).
    Any idea how to solve this problem with NW 7.0?
    Thank You!
    Markus

    Hi,
    You can download the hierarchy to a flat file as described here:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/0403a990-0201-0010-38b3-e1fc442848cb?QuickLink=index&overridelayout=true
    http://wiki.sdn.sap.com/wiki/display/BI/HierarchyDownloadto+Flatfile
    http://forums.sdn.sap.com/thread.jspa?threadID=69187
    Thanks and regards
    Kiran

  • How to extract Cleared and Open Items from SAP R/3?

    Hi experts,
    I have a requirement from a user to extract cleared and open invoice items from SAP R/3.
    Can someone tell me how to do that?
    Thanks!

    Hi,
    Use DataSource 0FI_AR_4 to know the status (0FI_DOCSTAT) of the payment to be made by the customer,
    OR
    enhance 2LIS_13_VDITM with VBUP fields, which will give the billing status (a rough enhancement sketch follows below).
    With rgds,
    Anil Kumar Sharma .P
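
    The 2LIS_13_VDITM enhancement is typically done by appending the wanted VBUP status field(s) to the extract structure and filling them in the LO extraction user exit. Below is a rough, unverified ABAP sketch (include ZXRSAU01 of enhancement RSAP0001, user exit EXIT_SAPLRSAP_001). The extract structure name MC13VD0ITM, the reference fields AUBEL/AUPOS and the appended field ZZGBSTA are assumptions; verify them in RSA6/SE11 for your release.

      " Rough sketch: fill an appended VBUP status field for 2LIS_13_VDITM.
      " Assumed names: extract structure MC13VD0ITM, sales document
      " reference in AUBEL/AUPOS, appended custom field ZZGBSTA.
      DATA: ls_vditm TYPE mc13vd0itm,
            lv_gbsta TYPE vbup-gbsta.

      CASE i_datasource.
        WHEN '2LIS_13_VDITM'.
          LOOP AT c_t_data INTO ls_vditm.
            CLEAR lv_gbsta.
            " Item status from VBUP for the referencing sales document item
            SELECT SINGLE gbsta FROM vbup INTO lv_gbsta
                   WHERE vbeln = ls_vditm-aubel
                     AND posnr = ls_vditm-aupos.
            ls_vditm-zzgbsta = lv_gbsta.
            MODIFY c_t_data FROM ls_vditm.
          ENDLOOP.
      ENDCASE.

    On newer releases the same lookup can also be implemented in the BAdI RSU5_SAPI_BADI instead of the CMOD exit; the logic stays the same.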

  • Extraction and loading of Training and Event Management data (0HR_PE_1)

    hello,
    I have the following doubt:
    Before the BI 7.0 release, extractor 0HR_PE_1 extracts event data (event ID, attendee ID, calday, ...), but if you load straight into cube 0PE_C01, then because Calendar year/month needs a reference date (for example, the event start date), you get the total duration of the event in hours or days referred to the event start date.
    So in a query filtered by month, you get the total duration of an event that starts in the filtered month but may not end until a few months or even a year later, so you don't get appropriate information.
    Example: event 10004377, calday 20081120, 500 hours; but from the event attributes the event end date is 20090410.
    In a query filtered by 200811 you get the total duration (500 hours, when the event hours in 200811 were only 20), and if you filter by any month of 2009 you get no duration information for that event at all.
    I had to create a copy of the standard cube (without calday; only Calendar year/month and Calendar year in the time dimension), disaggregate the data by creating as many entries per event as the months it lasts, and adjust the calculation of the duration ratios (duration of the event in each month / total duration of the event).
    The question is: is there any improvement or change in Business Content in BI 7.0 for handling Training and Event Management data? Has anybody had to deal with this?
    Thank you very much.
    IRB

    Hi,
    TEM data is stored in HRP tables.
    You can load the catalog by creating LSMWs for the objects Business event group (L), Business event type (D), Location (F), and Organizer (U), as per your requirement.
    An LSMW for tcode PP01 can be used to create these objects.
    To create Business Events (E) you can create an LSMW for PV10/PV11.
    To book attendees, create an LSMW for tcode PV08, as there you can specify the actual business event ID, which reduces ambiguity.
    Use tcode PV12 to firmly book events and tcode PV15 to follow up.
    Hope this helps.
    Regards,
    Shreyasi.

  • How to Extract Data for a Maintenance View, Structure and Cluster Table

    I want to develop 3 reports.
    1) The first report consists of only two fields:
    Table name: V_001_B, field name 1: BUKRS
    Table name: V_001_B, field name 2: BUTXT
    V_001_B is a maintenance view, and I cannot find any DataSource for it. How do I extract the data for this maintenance view?
    2) The second report also consists of two fields:
    Table name: CSKSZ, field name 1: KOSTL (cost center)
    Table name: CSKSZ, field name 2: KLTXT (description)
    CSKSZ is a structure, and I cannot find any DataSource for it. How do I extract the data for this structure?
    3) In the third report all fields belong to table BSEG, which is a cluster table. For this one also I cannot find any DataSource; I find only very few objects in the DataSource. How do I extract the data for it?
    Please provide me a step-by-step procedure.
    Thanks
    Priya

    Hi Sachin,
    I don't get your point; can you explain it to me briefly?
    I have two fields for the 1st report:
    BUKRS
    BUTXT
    In the 2nd report:
    KOSTL
    KLTXT
    If I use the 0COSTCENTER_TEXT DataSource, I will get the KOSTL field only; what about KLTXT?
    Thanks
    Priya

  • Basic extraction doubts: SAP R/3 to SAP BI

    In SBIW, which radio button should I select if I am extracting a table in R/3 with 77 fields into SAP BI?
    1) Transaction data, or
    2) Master data attributes, or
    3) Texts?
    Please tell me...
    And after that, if I want to use the table for reporting in SAP BI, which is the best data target to create in SAP BI?
    a) InfoCube
    b) DSO
    Please also tell me the steps to do in the SAP BI system.
    I am asking all this because I am new to SAP BI.
    Please do write to me... thank you!

    Hi,
    I suppose you need transaction data; we could be sure if you posted the table you want to extract.
    The InfoProvider you need depends on the data you need for reporting. Please search SDN or help.sap.com for modelling help.
    Regards
    Edited by: Zoltan Both on Aug 20, 2008 1:14 PM
    Edited by: Zoltan Both on Aug 20, 2008 1:16 PM

  • BODS 3.1 : SAP R/3 data extraction -What is the difference in 2 dataflows?

    Hi.
    Can anyone advise what the difference is between the two data extraction flows for extracting data from SAP R/3?
    1) DF1 >> SAP R/3 (R/3 table - query transformation - .dat file) >> query transformation >> target
    This ABAP data flow generates an ABAP program and a .dat file.
    We can also upload this program and run jobs with the 'execute preloaded' option on the datastore.
    This works fine.
    2) We can also pull the SAP R/3 table directly:
    DF2 >> SAP R/3 table (this has a red arrow, as in an OHD) >> query transformation >> target
    This also works fine, and we are able to see the data directly in Oracle. It can also be scheduled as a job.
    But I am unable to understand the purpose of the different types of data extraction flows:
    when to use which type of flow for data extraction,
    and the advantages/disadvantages of the two data flows.
    What we do not understand is this:
    if we can directly pull data from the R/3 table through a query transformation into the target table,
    why use the flow of creating an R/3 data flow,
    then do a query transformation again,
    and then populate the target database?
    There might be some practical reasons for using these two different types of flows for data extraction, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM

    Hi Jeff.
    Greetings. And many thanks for your response.
    Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
    For this we use an R/3 data flow and the ABAP program, which we upload to the R/3 system
    so as to be able to use the 'execute preloaded' option and run the jobs.
    Since we do not have any control over our R/3 servers, nor do we have anyone for ABAP programming,
    we do not do anything at the SAP R/3 level.
    I was doing trial-and-error testing on our workflows for our new requirement:
    WF 1, which has some 15 R/3 tables.
    For each table we have created a separate data flow.
    In some of the data flows, where the SAP tables had a lot of rows, I decided to pull the data directly,
    bypassing the ABAP flow.
    The entire workflow and data extraction still run fine.
    In fact I tried creating a new sample data flow and tested it,
    using direct download and also 'execute preloaded'.
    I did not see any major difference in the time taken for data extraction,
    because we pull the entire table anyway, then choose whatever we want to bring into Oracle through a view for our BO reporting, or aggregate it and then bring the data in as a table for universe consumption.
    Actually, I was looking at other options to avoid the ABAP generation and the R/3 data flow because we are having problems in our dev and QA environments (delimiter errors), whereas in production it works fine. Production is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and those are the ones with the delimiter error.
    I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
    Trying to resolve this problem, I ended up with the option of pulling the R/3 table directly, without using the ABAP workflow, just by trial and error with each drag-and-drop option, because we had to urgently do a POC and deliver the data for the entire e-recruiting module of SAP.
    I don't know whether I can use this direct pulling of data for the new job I have created,
    which has 2 workflows with 15 data flows in each workflow,
    and push this job into production.
    Also, can I bypass the ABAP flow and pull R/3 data directly in all the data flows for any of our future SAP R/3 data extraction requirements? The technical difference between the 2 flows is not clear to us, and being new to the whole of ETL I just wanted to understand the pros and cons of this particular data extraction approach.
    As advised, I shall check the schedules for a week, and then we shall probably move it into production.
    Thanks again.
    Kind Regards
    Indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM

  • Open Hub: How-to doc "How to Extract data with Open Hub to a Logical File"

    Hi all,
    We are using Open Hub to download transaction files from InfoCubes to the application server, and would like to have a filename that is dynamic based on period and year, i.e. the period and year of the transaction data to be downloaded.
    I understand we could use a logical file for this purpose. However, we are not sure how to have the period and year derived dynamically in the filename.
    I have read a number of messages posted in SDN on a similar topic, and many have suggested a 'How-to' paper titled "How to Extract Data with Open Hub to a Logical Filename". However, I could not seem to get the document from the link given.
    Just wondering if anyone has the correct or latest link to the document; I would also appreciate it if you could share the document with everyone in SDN if you have a copy.
    Many thanks and best regards,
    Victoria

    Hi,
    After creating the Open Hub destination, press F1 in the 'Application server file name' text box. From the help window, click 'Maintain client-independent file names and file paths'; you will be taken to the Implementation Guide screen.
    - Click 'Cross-client maintenance of file names' and create a logical file path via 'New Entries'.
    - After creating the logical file path, go to 'Logical file name definition'. There, enter your logical file name, the physical file name (your file name followed by month or year, whatever is applicable; press F1 for more info), data format (ASC), application area (BW), and the logical path (choose via F4 the path you created first).
    - Now go to 'Assignment of physical paths to logical path': enter the syntax group; the physical path is the path you gave in the logical file name definition.
    We created a logical file name that identifies the file by system date, but your requirement seems to be the dynamic date of the transaction data; you may be able to achieve this by creating a variable (a small example of resolving such a logical file name at runtime follows below). The F1 help is very useful here. All the above steps will help you create a dynamic logical file.
    Hope this helps you to some extent.
    Regards
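
    To illustrate how a logical file name with placeholders resolves at runtime, here is a minimal, unverified ABAP sketch. The logical file name ZBW_OPENHUB_FILE and the <PARAM_1>/<PARAM_2> placeholders in its physical name are hypothetical examples defined in transaction FILE; FILE_GET_NAME is the standard function module that performs the substitution. This only illustrates the mechanism; the Open Hub how-to paper itself wires the logical file name in through a customer class.

      " Sketch only: resolve a logical file name, substituting period and
      " year at runtime. ZBW_OPENHUB_FILE is hypothetical and would be
      " defined in transaction FILE with a physical name such as
      " EXTRACT_<PARAM_1>_<PARAM_2>.CSV.
      DATA: lv_file   TYPE filename-fileextern,
            lv_period TYPE char2 VALUE '11',
            lv_year   TYPE char4 VALUE '2011'.

      CALL FUNCTION 'FILE_GET_NAME'
        EXPORTING
          logical_filename = 'ZBW_OPENHUB_FILE'
          parameter_1      = lv_period   " replaces <PARAM_1> in the physical name
          parameter_2      = lv_year     " replaces <PARAM_2> in the physical name
        IMPORTING
          file_name        = lv_file
        EXCEPTIONS
          file_not_found   = 1
          OTHERS           = 2.

      IF sy-subrc = 0.
        WRITE: / 'Resolved file:', lv_file.
      ENDIF.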

  • F.01 - extract to consolidation on company code level

    Hi everyone
    We're trying to do an extract to consolidation at company code level (1). When the popup dialog "Filename for extraction file" appears, we can choose between the presentation server and the application server.
    When choosing the presentation server and logical filename ECCS_FILE, we get error GC027 (Error opening file c:\temp\EXTRACT_CS_CO.txt).
    Doing the same via the application server works (although the path isn't the same).
    We have already checked and double-checked the settings of the logical file name and logical path in transaction FILE.
    Any suggestions on what to do, or isn't it supposed to work via the presentation server?
    Thanks in advance and best regards to all
    Renaud

    I have not seen anywhere in what you mention where a file path is entered.
    Is this customised for your company?
    How did you download this before?

  • Extract Data with OPEN HUB to a Logical Filename

    Hi Experts,
    Can anybody send me the link for the how-to guide "Extract Data with Open Hub to a Logical Filename"?
    Thanks in advance.
    BWUser

    Hi,
    check these links:
    http://searchcrm.techtarget.com/generic/0,295582,sid21_gci1224995,00.html
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e698aa90-0201-0010-7982-b498e02af76b
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1570a990-0201-0010-1280-bcc9c10c99ee
    hope this helps you.
    Regards,
    shikha

  • InfoSpoke Flat File Extract to Logical Filename

    I'm trying to extract data from an ODS to a flat file. So far, I've found that the InfoSpoke must write to the application server for large data volumes. Also, in order for the InfoSpoke to transport properly, I must use logical filenames. I've attempted to implement the custom class and append structure as defined in the SAP document "How To... Extract Data with OPEN HUB to a Customer Defined Logical Filename". I'm getting an error when attempting to import the included transports (custom class code); it appears to be a syntax error. Has anyone encountered this and, if so, how did you fix it?

    Hello.
    I'm getting a syntax error also. I did not import the transport, but applied the notes through the appendix. When I modified the method GET_OBJECT_REF_INT in class CL_RSB_DEST as below, I get a syntax error on the CREATE OBJECT statement:
        when rsbo_c_desttype_int-file_applsrv.
    *{   REPLACE        &$&$&$&$                                          1
    *\      data: l_r_file_applsrv type ref to cl_rsb_file_applsrv.
          data: l_r_file_applsrv type ref to zcl_rsb_file_logical.
    *}   REPLACE
          create object l_r_file_applsrv
            exporting i_dest    = n_dest
                      i_objvers = i_objvers
    The error reads: Class CL_RSB_DEST, method GET_OBJECT_REF_INT: The obligatory parameter "I_S_VDEST" had no value assigned to it.
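
    The error text itself points at the cause: the CREATE OBJECT call does not supply the obligatory constructor parameter I_S_VDEST of the custom class. A hedged sketch of the corrected call is below; whether a suitable I_S_VDEST variable is already available inside GET_OBJECT_REF_INT, or has to be read first, depends on the release, so check the constructor signature of ZCL_RSB_FILE_LOGICAL in SE24 before applying it.

          " Sketch only: supply the missing obligatory parameter. The value
          " passed for i_s_vdest is an assumption; use whatever destination
          " structure is available in GET_OBJECT_REF_INT on your release.
          create object l_r_file_applsrv
            exporting i_dest    = n_dest
                      i_objvers = i_objvers
                      i_s_vdest = i_s_vdest.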

  • Index failure for a PDF with Extraction Not Allowed

    Hi,
    We have a few PDF files protected against page extraction.
    Is it possible to pass the password to the TREX engine somehow, in order to be able to index these files? We want to keep this protection, because users should not be able to extract any information when opening the documents.
    Thanks & Best Regards,
    John

    The Template .dwt file is only used locally by DW itself; there's no reason to upload it to the server unless you intend to share it with someone. You have to upload each individual page that the template updates, every time you update the template and propagate that change across the child pages.
    It would probably be better to use a server-side include. That way, you could update one file, then upload it, and all of your pages would automatically carry that update...
    SSI for Apache: http://www.javascriptkit.com/howto/ssi.shtml
    PHP's version: http://www.tizag.com/phpT/include.php
    It's especially handy if your site gets to the point where you have more than about 20 pages.

  • Extract an image which is only on opening screen

    A while back I chose an image of my cat to use on the opening screen of my iPhone.
    The cat is now dying and I would love to use that image in a multimedia program.
    But I no longer have the original image.
    Is there somewhere that the image is stored, and can I get to it to extract it?

    It is possible to extract this file without jailbreaking. There is an application called mobilesyncbrowser that lets you extract files from your backup; it costs $20, as the extract-photos option is only available in the registered "Plus" version.
    To extract the lock screen picture using this program, look in Pictures & Other Files > Library; the file name is Lockbackground.jpg
