Building a custom hierarchy data source extractor

Hi SDN,
The ECC side has given us a Z table that contains a hierarchy. I will write a function module to extract the hierarchy and pass it on to BI.
Can someone assist with the steps needed to accomplish this? Obviously the FM, the Z table, and the view over it are all custom. What should the function module generate and fill for loading into BI? How do you generate a DataSource for such a custom situation?
Please advise.
Thanks.

Hi Shahid,
The only way you can extract it is by dumping the hierarchy into a flat file and loading that into BI: download the data to a flat file and then upload it. You cannot create a generic DataSource for a hierarchy.
If your data is in a flat structure in the table, the following code may help.
report zbiw_glacc_hier line-size 250.
data :
     begin of itab occurs 0,
      seg(5),
      segname(30),
      spu(5),
      spuname(30),
      bu(5),
      buname(30),
      pu(15),
      puname(30),
     cust(8),
     custname(60),
      end of itab.
data : itab_all like itab occurs 0 with header line,
         itab_root like itab occurs 0 with header line.
data : begin of i_hier occurs 0.    " BW structure to hold the hierarchy
*       include structure e1rshnd.  " fields needed from E1RSHND are declared below
data :  nodeid     like e1rshnd-nodeid,      " node ID
        infoobject like e1rshnd-infoobject,  " InfoObject name
        nodename   like e1rshnd-nodename,    " node name
        link(5),   " like e1rshnd-link,      " link name
        parentid   like e1rshnd-parentid.    " parent ID
data :  leafto(32),                          " interval - to
        leaffrom(32).                        " interval - from
*       include structure e1rsrlt.           " would duplicate leafto/leaffrom
        include structure rstxtsml.          " langu, txtsh, txtmd, txtlg
*data : parent like e1rshnd-nodename.
data : end of i_hier.
data : sr(8) type n value '00000000'.
data : begin of itab1 occurs 0, " Node Level1 of Hierarchy
       z_ah_level1 like itab-seg,
       z_ah_level2 like itab-spu,
       z_ah_desc_level2 like itab-spuname,
       end of itab1.
data : begin of itab2 occurs 0, " Node Level2 of Hierarchy
       z_ah_level2 like itab-spu,
       z_ah_level3 like itab-bu,
       z_ah_desc_level3 like itab-buname,
       end of itab2.
data : begin of itab3 occurs 0,   " Node Level3 of Hierarchy
       z_ah_level3 like itab-bu,
       z_ah_level4 like itab-pu,
       z_ah_desc_level4 like itab-puname,
       end of itab3.
data wa like itab.
data : begin of itab_first occurs 0, " Node Level1 of Hierarchy
       z_ah_level1 like ITAB-SEG,
       end of itab_first.
data : begin of itab_sec occurs 0, " Node Level2 of Hierarchy
       z_ah_level2 like ITAB-SPU,
       end of itab_sec.
data : begin of itab_third occurs 0, " Node Level3 of Hierarchy
       z_ah_level3 like ITAB-BU,
       end of itab_third.
data : begin of i_download occurs 0,
       record(250),
       end of i_download.
data : v_pid like e1rshnd-nodeid,
       v_line(250).
start-of-selection.
call function 'UPLOAD'
  exporting
*   codepage = ' '
    filename = 'C:/'
    filetype = 'DAT'
  tables
    data_tab = itab.
if sy-subrc <> 0.
  message id sy-msgid type sy-msgty number sy-msgno
          with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
endif.
itab_all[] = itab[].
  itab_root[] = itab_all[].
  sort itab_all by seg spu bu pu .
  sort itab_root by seg.
  delete adjacent duplicates from itab_root comparing seg.
* Appending root level of node
sr = 00000001.
move : sr to i_hier-nodeid,
       '0HIER_NODE' to i_hier-infoobject,
       'SHA HIERARCHY' to i_hier-nodename,
       '00000000' to i_hier-parentid,
       'EN' to i_hier-langu,
       'SHA HIERARCHY' to i_hier-txtmd.
append i_hier.
clear i_hier.
v_pid  = '00000000'.
loop at itab_root.
if itab_root-seg = itab_root-spu.
  delete itab_root.
endif.
endloop.
  loop at itab_root.
* Appending first level of the node
    sr = sr + 1.
    move : sr to i_hier-nodeid,
          '0HIER_NODE' to i_hier-infoobject,
           itab_root-seg to i_hier-nodename,
           v_pid to i_hier-parentid,
          'EN' to i_hier-langu,
          itab_root-segname to i_hier-txtsh,
          itab_root-segname to i_hier-txtmd,
          itab_root-segname to i_hier-TXTLG.
    append i_hier.
    clear i_hier.
  endloop.
  loop at itab_all.
    move itab_all to wa.
* Separation of level 2 of the node
    at new spu.
      if not wa-spu is initial.
        move : wa-seg to itab1-z_ah_level1,
               wa-spu to itab1-z_ah_level2,
               wa-spuname to itab1-z_ah_desc_level2.
        append itab1.
        clear itab1.
      endif.
    endat.
* Separation of level 3 of the node
    at new bu.
      if not wa-bu is initial.
        move : wa-spu to itab2-z_ah_level2,
               wa-bu to itab2-z_ah_level3,
               wa-buname to itab2-z_ah_desc_level3.
        append itab2.
        clear itab2.
      endif.
    endat.
* Separation of level 4 of the node
    at new pu.
      if not wa-pu  is initial.
        move : wa-bu to itab3-z_ah_level3,
               wa-pu to itab3-z_ah_level4,
               wa-puname to itab3-z_ah_desc_level4.
        append itab3.
        clear itab3.
      endif.
    endat.
    clear wa.
  endloop.
  sort itab1 by z_ah_level1 z_ah_level2.
  delete adjacent duplicates from itab1 comparing z_ah_level1
                                              z_ah_level2.
  sort itab2 by z_ah_level2 z_ah_level3.
  delete adjacent duplicates from itab2 comparing z_ah_level2
                                                  z_ah_level3.
  sort itab3 by z_ah_level3 z_ah_level4.
  delete adjacent duplicates from itab3 comparing z_ah_level3
                                                  z_ah_level4.
loop at itab_root.
move itab_root-SEG to itab_first-z_ah_level1.
append itab_first.
clear itab_first.
endloop.
itab_sec[] = itab2[].
itab_third[] = itab3[].
sort itab_first by z_ah_level1.
delete adjacent duplicates from itab_first.
sort itab_sec by z_ah_level2.
delete adjacent duplicates from itab_sec.
sort itab_third by z_ah_level3.
delete adjacent duplicates from itab_third.
data : indx like sy-tabix.
clear indx.
loop at itab_first.
indx = sy-tabix.
read table itab_sec with key z_ah_level2 = itab_first-z_ah_level1
                        binary search.
if sy-subrc eq 0.
delete itab_first index indx.
endif.
endloop.
clear indx.
loop at itab_sec.
indx = sy-tabix.
read table itab_third with  key z_ah_level3 = itab_sec-z_ah_level2
                        binary search.
if sy-subrc eq 0.
delete itab_sec index indx.
endif.
endloop.
clear indx.
loop at itab1.
If itab1-z_ah_level1 = itab1-z_ah_level2.
  delete itab1.
endif.
endloop.
loop at itab2.
If itab2-z_ah_level2 = itab2-z_ah_level3.
  delete itab2.
endif.
endloop.
loop at itab3.
If itab3-z_ah_level3 = itab3-z_ah_level4.
  delete itab3.
endif.
endloop.
* Appending first level node to hierarchy table
  loop at itab_root.
    loop at itab1 where z_ah_level1 = itab_root-SEG.
      read table i_hier with key nodename = itab1-z_ah_level2.
       if sy-subrc ne 0.
      read table i_hier with key nodename = itab1-z_ah_level1.
      if sy-subrc eq 0.
      move i_hier-nodeid to i_hier-parentid.
* Appending first level node to hierarchy table
      sr = sr + 1.
      move  : sr to i_hier-nodeid,
              '0HIER_NODE' to i_hier-infoobject,
              itab1-z_ah_level2 to i_hier-nodename,
              'EN' to i_hier-langu,
              itab1-z_ah_desc_level2 to i_hier-txtsh,
              itab1-z_ah_desc_level2 to i_hier-txtmd,
              itab1-z_ah_desc_level2 to i_hier-txtlg.
      append i_hier.
      clear i_hier.
      endif.
      endif.
    endloop.
  endloop.
* Appending second level node to hierarchy table
  loop at itab_sec.
    loop at itab2 where z_ah_level2 = itab_sec-z_ah_level2.
      read table i_hier with key nodename = itab2-z_ah_level3.
       if sy-subrc ne 0.
      read table i_hier with key nodename = itab2-z_ah_level2.
       if sy-subrc eq 0.
      move i_hier-nodeid to i_hier-parentid.
* Appending second level node to hierarchy table
      sr = sr + 1.
      move : sr to i_hier-nodeid,
            '0HIER_NODE' to i_hier-infoobject,
            itab2-z_ah_level3 to i_hier-nodename,
            'EN' to i_hier-langu,
            itab2-z_ah_desc_level3  to i_hier-txtsh,
            itab2-z_ah_desc_level3 to i_hier-txtmd,
            itab2-z_ah_desc_level3 to i_hier-txtlg.
      append i_hier.
      clear i_hier.
     endif.
     endif.
    endloop.
  endloop.
* Appending third level node to hierarchy table
  loop at itab_third.
    loop at itab3 where z_ah_level3 = itab_third-z_ah_level3.
      read table i_hier with key nodename = itab3-z_ah_level4.
    if sy-subrc ne 0.
      read table i_hier with key nodename = itab3-z_ah_level3.
       if sy-subrc eq 0.
      move i_hier-nodeid to i_hier-parentid.
* Appending third level node to hierarchy table
      sr = sr + 1.
      move : sr to i_hier-nodeid,
*           'ZGL_AC_CD' to i_hier-infoobject,
            '0GL_ACCOUNT' to i_hier-infoobject,
            itab3-z_ah_level4 to i_hier-nodename,
            'EN' to i_hier-langu,
            itab3-z_ah_desc_level4 to i_hier-txtsh,
            itab3-z_ah_desc_level4 to i_hier-txtmd,
            itab3-z_ah_desc_level4 to i_hier-txtlg.
      append i_hier.
      clear i_hier.
      endif.
        endif.
    endloop.
  endloop.
  loop at i_hier.
    concatenate i_hier-nodeid i_hier-infoobject i_hier-nodename
                i_hier-link i_hier-parentid i_hier-leafto
                i_hier-leaffrom i_hier-langu i_hier-txtsh
                i_hier-txtmd i_hier-txtlg into v_line
                separated by ','.
    move  v_line to i_download-record.
    append i_download.
    write : / i_hier.
  endloop.
  perform f_download.
*&---------------------------------------------------------------------*
*&      Form  f_download
*&---------------------------------------------------------------------*
form f_download.
  call function 'DOWNLOAD'
    exporting
*     bin_filesize            = ' '
*     codepage                = ' '
*     filename                = ' '
      filetype                = 'DAT'
*     item                    = ' '
*     mode                    = ' '
*     wk1_n_format            = ' '
*     wk1_n_size              = ' '
*     wk1_t_format            = ' '
*     wk1_t_size              = ' '
*     filemask_mask           = ' '
*     filemask_text           = ' '
*     filetype_no_change      = ' '
*     filemask_all            = ' '
*     filetype_no_show        = ' '
*     silent                  = 'S'
*     col_select              = ' '
*     col_selectmask          = ' '
*     no_auth_check           = ' '
*   importing
*     act_filename            =
*     act_filetype            =
*     filesize                =
*     cancel                  =
    tables
      data_tab                = i_download
*     fieldnames              =
    exceptions
      invalid_filesize        = 1
      invalid_table_width     = 2
      invalid_type            = 3
      no_batch                = 4
      unknown_error           = 5
      gui_refuse_filetransfer = 6
      customer_error          = 7
      others                  = 8.
  if sy-subrc <> 0.
    message id sy-msgid type sy-msgty number sy-msgno
            with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  endif.
endform.                    " f_download
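Note: the classic UPLOAD/DOWNLOAD function modules are obsolete on newer releases. On those systems GUI_DOWNLOAD can be used instead; a minimal sketch (the file path is an assumption, not from this thread):

```abap
* Sketch only: GUI_DOWNLOAD as a replacement for the obsolete DOWNLOAD FM.
* The target path is an assumed example.
  call function 'GUI_DOWNLOAD'
    exporting
      filename = 'C:\hier_flatfile.csv'
      filetype = 'ASC'
    tables
      data_tab = i_download
    exceptions
      others   = 1.
  if sy-subrc <> 0.
    message id sy-msgid type sy-msgty number sy-msgno
            with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  endif.
```

The same swap applies on the upload side (GUI_UPLOAD in place of UPLOAD).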
This code may need modification to fit your requirements, and it can surely be optimized further.
First you have to read the data from your Z table into an internal table, then separate out each node level.
Hope this helps.
There is a PDF article on this. Good luck.
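For reference, the flat file built by the final loop is comma-separated in the CONCATENATE order (nodeid, infoobject, nodename, link, parentid, leafto, leaffrom, langu, txtsh, txtmd, txtlg). The root row and a first-level row would look roughly like this (node names illustrative):

```text
00000001,0HIER_NODE,SHA HIERARCHY,,00000000,,,EN,,SHA HIERARCHY,
00000002,0HIER_NODE,SEG01,,00000001,,,EN,Segment 01,Segment 01,Segment 01
```

Each child row points at its parent via parentid; leaf rows carry the InfoObject of the characteristic itself (0GL_ACCOUNT in the code above) instead of 0HIER_NODE.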
Regards
Satish Arra
Edited by: Satish Arra on Feb 11, 2009 1:45 AM

Similar Messages

  • How to do sorting/filtering on custom java data source implementation


    Hi,
    I have an entity-driven view object whose data should be retrieved and populated using a custom java data source implementation. I've done it as described in the documentation. I understand that the query mode should be the default one (i.e., database tables) and that createRowFromResultSet will be called as many times as needed, based on the number of rows retrieved from the service, provided we write the logic for hasNextForCollection(). Implementation sample code is given below.
    protected void executeQueryForCollection(Object qc, Object[] params, int noUserParams) {
      Iterator datumItr = retrieveDataFromService(qc, params);
      setUserDataForCollection(qc, datumItr);
      hasNextForCollection(qc);
      super.executeQueryForCollection(qc, params, noUserParams);
    }

    protected boolean hasNextForCollection(Object qc) {
      Iterator datumItr = (Iterator) getUserDataForCollection(qc);
      if (datumItr != null && datumItr.hasNext()) {
        return true;
      }
      setFetchCompleteForCollection(qc, true);
      return false;
    }

    protected ViewRowImpl createRowFromResultSet(Object qc, ResultSet resultSet) {
      Iterator datumItr = (Iterator) getUserDataForCollection(qc);
      Object datumObj = datumItr.next();
      ViewRowImpl r = createNewRowForCollection(qc);
      return r;
    }
    Everything was working fine, including table sorting/filtering. Then I noticed that every time the user performs sorting or filtering, executeQueryForCollection is called, which in turn calls my service. That is a performance problem I want to avoid, so I modified the implementation so that the service call happens only if the callService flag is set to true, followed by executeQuery(). Code is given below.
    private boolean callService = false;

    public void setCallService(boolean callService) {
      this.callService = callService;
    }

    public boolean isCallService() {
      return callService;
    }

    protected void executeQueryForCollection(Object qc, Object[] params, int noUserParams) {
      if (callService) {
        Iterator datumItr = retrieveDataFromService(qc, params);
        setUserDataForCollection(qc, datumItr);
      }
      hasNextForCollection(qc);
      super.executeQueryForCollection(qc, params, noUserParams);
    }
    Issue I have:
    When the user attempts to sort or filter the table, since I skipped the service call and the storing of the user data collection, createRowFromResultSet is not called and the data I had retrieved and populated into the view object vanishes completely. I had already retrieved the data and created rows from the result set, so why does it vanish?
    Tried solution:
    I came to know that the query mode should be set to Scan_Entity_Rows for filtering and Scan_View_Rows for sorting. I didn't disturb the implementation (i.e., skipping the service call) but overrode executeQuery as follows.
    @Override
    public void executeQuery() {
      setQueryMode(callService ? ViewObject.QUERY_MODE_SCAN_DATABASE_TABLES : ViewObject.QUERY_MODE_SCAN_ENTITY_ROWS);
      super.executeQuery();
    }
    By doing this I was able to get table filtering working, but when I try table sorting or programmatic sorting, the sort is not applied at all.
    I changed the code to the one below (i.e., ViewObject.QUERY_MODE_SCAN_VIEW_ROWS instead of ViewObject.QUERY_MODE_SCAN_ENTITY_ROWS):
    @Override
    public void executeQuery() {
      setQueryMode(callService ? ViewObject.QUERY_MODE_SCAN_DATABASE_TABLES : ViewObject.QUERY_MODE_SCAN_VIEW_ROWS);
      super.executeQuery();
    }
    Now sorting works fine but filtering does not behave as expected, because Scan_View_Rows does a further filtering of the already-filtered view rows.
    If I OR the query modes as given below, I get duplicate rows: (ViewObject.QUERY_MODE_SCAN_ENTITY_ROWS | ViewObject.QUERY_MODE_SCAN_VIEW_ROWS)
    Question:
    I don't know how to achieve both filtering and sorting while also skipping the service call.
    Can anyone help with this? Thanks in advance.
    Raguraman
    Edited by: Raguraman on Apr 12, 2011 6:53 AM

  • Bill Presentment Architecture, how to Overide Default Rule and ensure the AR Invoice/Transaction Chooses "Customer Transaction Data Source"

    Hi Intelligentsia,
    We are on 12.2.4 on Linux. I have set up an external template with the supplementary data source "Customer Transaction Data Source"; when I test it, it fails and I am not able to debug it. However, when I run the BPA Transaction Print Program (Multiple Languages) it always gives me the default layout.
    My query:
    How do I ensure the default rule does not apply to my invoice, so that I am able to override it?
    Is there a method to explicitly ensure the supplementary data source is "Customer Transaction Data Source" when I am creating the AR invoice/transaction? Am I missing some setup in the AR Invoice Transaction Flexfield where I need to set this "Customer Transaction Data Source" as the DFF context?
    Please let me know if you need any more information.
    Abdulrahman

    Hello,
    Thanks for the answer. When you say rule date, is that the rule creation date or the "Bill Creation From Date" that we set up while creating the rule? I created a new invoice after the rule was created, but it did not pick up the new custom template.
    I have another issue; it would be great if you could help. I have split my logo area into two vertical parts, to display the logo in one and the legal entity and address in the other. In the online preview I can see the logo and legal address, but in the print preview I am not able to see them; it just shows a blank space. Any idea?
    Thanks in advance

  • Method for uploading Customer Hierarchy Data

    hi experts,
    I have a requirement in which I need to upload customer hierarchy data into the ECC system from the CRM system.
    I need to know: is there any method in LSMW by which I can do it?
    thanks,
    Ags.

    resolved..

  • How to execute Custom java data source LOV view object from a common mthd?

    Hi,
    My application contains LOVs implemented with a custom java data source. I want to have a util method that gets the view accessor name, finds the view accessor, and executes it. But I couldn't find any API that returns a view accessor given its name.
    Can anyone help with how view accessors can best be accessed in a common way, rather than every developer creating a ViewRowImpl class and accessing the RowSet getters?
    Thanks in advance.

    I am sorry, let me state my requirement clearly.
    My application is not database driven; data transactions happen through a Tuxedo server.
    We have entity-driven VOs as well as programmatic VOs, both implemented with a custom java data source. Entity-driven VOs participate in transactions, whereas programmatic VOs are used as list view objects to show lists of values.
    The custom java data source implementation in the BaseService ViewObjectImpl class looks like:
        private boolean callService = false;
        private List serviceCallInputParams = null;

        public BaseServiceViewObjectImpl() {
            super();
        }

        /* Overridden for custom java data source support. */
        protected void executeQueryForCollection(Object qc, Object[] params, int noUserParams) {
            List dataFromService = null;
            if (callService) {
                dataFromService = retrieveDataFromService(serviceCallInputParams);
            }
            setUserDataForCollection(qc, dataFromService != null ? dataFromService.iterator() : null);
            super.executeQueryForCollection(qc, params, noUserParams);
        }

        /* Overridden for custom java data source support. */
        protected boolean hasNextForCollection(Object qc) {
            Iterator<BaseDatum> datumItr = (Iterator<BaseDatum>) getUserDataForCollection(qc);
            if (datumItr != null && datumItr.hasNext()) {
                return true;
            }
            callService = false;
            serviceCallInputParams = null;
            setFetchCompleteForCollection(qc, true);
            return false;
        }
    Individual screen developers who want to load data into a VO do something like the below in their VO impl class:
        public void fetch() {
            BaseServiceViewObjectImpl vo = this;
            vo.setCallService(true);
            vo.setServiceCallInputParams(new ArrayList());
            vo.executeQuery();
        }
    As these custom-java-data-source LOV VOs appear across screens, I want a util method in the base VOImpl class that takes the view accessor name, finds the LOV VO instance, and retrieves data for it. I want to do something like:
        /* Wrapper method available in the base service ViewObjectImpl class */
        public void fetchLOVData(String viewAccessorName, List serviceInputParams) {
            // find the LOV view object instance
            BaseServiceViewObjectImpl lovViewObject = (BaseServiceViewObjectImpl) findViewAccessor(viewAccessorName);
            // get data for the LOV view object from the service
            lovViewObject.setCallService(true);
            lovViewObject.setServiceCallInputParams(serviceInputParams);
            lovViewObject.executeQuery();
        }
    Questions:
    1. Is it achievable?
    2. Is there any API available at the ViewObjectImpl level that takes the view accessor name and returns the exact LOV view object instance? If not, how can I achieve it?

  • Customized delta data source for deleting data record in the source system.

    Hello Gurus,
    There is a customized delta data source; how can it implement delta handling for records deleted in the source system?
    I mean, if a record is deleted in the source system, how do we notify the SAP BW system of this deletion through this customized delta data source?
    Many thanks.

    Hi,
    Whenever a record is deleted, we need code that inserts that record into a Z table; load these records into BW into a cube with a similar structure, and while loading into this cube multiply the key figure by -1.
    Add this cube to the MultiProvider. The union of the records in the original cube and the cube holding the deleted records then sums to zero and will not be displayed in the report.
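As a sketch of the reversal described above (the key figure name /BIC/ZAMOUNT is hypothetical, not from this thread), the update routine for the "deleted records" cube would contain something like:

```abap
* Update routine sketch for the key figure in the reversal cube.
* COMM_STRUCTURE-/BIC/ZAMOUNT is a hypothetical field name.
RESULT = COMM_STRUCTURE-/bic/zamount * -1.
```

With identical characteristics in both cubes, the MultiProvider union of +amount and -amount nets to zero for deleted records.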
    Regards,

  • Hierarchy Data Source for CO_PA

    Hi
    Please list the steps to create a hierarchy DataSource for CO-PA.
    Is it: Create Hierarchy Data Source -> Profitability Analysis -> KEB0?
    Thanks

    Hi Roberto,
    As you said:
    "IMG activity Create Hierarchy DataSource (transaction KEB1) in the IMG under Data Transfer into the SAP Business Information Warehouse -> Settings for Application-Specific DataSources (PI) -> Profitability Analysis"
    I created the DataSource for the hierarchy.
    When I checked the DataSource in RSA3, the following error message was displayed:
    Error 6 in function module RSS_PROGRAM_GENERATE
    Could you assist me on how to proceed, and what is the 'VDH1N' transaction used for?
    How can I check the hierarchy of 0CUSTOMER in R/3?
    Thanks

  • BI Publisher with Siebel 8.1 using custom SQL data source

    Hello ,
    We have Siebel 8.1 implemented with embedded BI Publisher for reporting .
    For some custom requirements, we want to connect to another Oracle database table and display the results in the Siebel reporting environment.
    I know this is possible in a normal BI Publisher environment, but since I am new to Siebel, I am not sure it will work with SQL as the data source.
    Could you please guide me on how to do that (if feasible)?
    Thanks and regards
    Amit

    Hi,
    I am trying to call a BIP report in a workflow. I do several steps prior, then do an insert into the Report Output BC to get the Run Id, and then a step to generate the report output, calling the XMLP Driver Service with method GenerateBIPReport. I am passing in the arguments but I am unsure of all the details, as there isn't a lot of documentation on using it in a workflow. Can you please assist me or point me to some documentation? I followed the information in Doc ID 823360.1, but I may be missing something. I am not sure how it knows what to include; I thought it was the bookmark input, but I'm not sure. I want to pass it an activity id and return the data associated with that activity (e.g., orders). Thanks in advance... Tracy

  • How to load 3.x hierarchy data source from a certain level ?

    Hi all,
    I have an old 3.x hierarchy data source which has a structure like below:
       Level 0 :  root node
                  Level 1:  Product Category
                      Level 2: Product Line
                          Level 3: Product
                              Level 4 : Product Version
    each level has independent infoobject for it.
    The hierarchy was initially constructed for the infoobject on the level 4 'Product Version'.
    Now here comes another requirement: we also want to have a hierarchy for 'Product' and would like to reuse the data source.
    So the question is: is it possible to extract from the same data source, while just updating the hierarchy for 'Product' starting from level 3 (up to level 0)?
    PS: I can't migrate the data source to 7.x, because the source system doesn't support it.
    I hope I made my question clear.
    I'd really appreciate any hint you can give me.
    Thanks,
    Amon

    Finally I solved this problem myself.
    There are many ways to realize the requirement if you change your way of thinking; don't get stuck on the thought that you have to use the original data source coming from the source system.
    I'd like to introduce 2 ways I thought of:
    1. Use the SAP standard program 'Z_SAP_HIERARCHY_DOWNLOAD' to save the hierarchy as a .csv file, then refer to http://scn.sap.com/community/data-warehousing/bw/blog/2012/01/08/bw-730-hierarchy-loading-becomes-easier-with-the-help-of-new-framework
    However, this didn't work for me: the program might be outdated in my working environment (BW 7.4), and though I did some bugfixing on it, the output file still had errors.
    2. Since the hierarchy for the 4th-level infoobject had been successfully updated in my case, we can use it as the data provider directly. The general data flow is: 4th-level infoobject hierarchy -> DSO -> target infoobject hierarchy (3rd-level infoobject).
    For the DSO structure, refer to the link above; the structure is just like the Excel file provided there. But be sure to add external characteristics to the data fields of your DSO if there are external characteristics in your hierarchy.
    In the transformation from the source infoobject hierarchy, select 'hierarchy (one segment)' as the subtype of hierarchy.
    In the transformation 'DSO -> target infoobject', map source and target carefully, especially for the segment 'Structure'; make sure the field 0H_HIERNODE and the external characteristics are filled with proper values.

  • PSA Deletion for Hierarchy Data sources

    Hi All,
    We have a 0CUSTOMER_HIER data source whose PSA needs to be included in the process chain for deletion of PSA on a regular basis. I could find one table in the F4 search functionality while adding the PSA table in the process chain variant.
    But if I look in RSRV I can see 4 PSA tables. How do I assign all 4 tables in the deletion variant? Any pointers? Thanks!
    Regards,
    Vidya.

    Hi Vidya,
    In EHP1 , the deletion of PSA Tables can be done by specifying the selection patterns.
    You can check the new features in EHP1 for the same.
    -Vikram

  • Is there a Function module to get customer hierarchy data?

    Howdy,
    I'm writing a report where the user can, on the selection screen, enter a customer number or a hierarchy node, and then the program has to get all the higher-level and lower-level nodes for the selected sales area.
    eg. for the following hierarchy:
    A
    ->B
    --|-> D
    --|-> E
    -> C
    ---|-> F
    ---|-> G
    If node B was selected, it would pull back:
    KUNNR     HKUNNR
    A         blank
    B         A
    C         A
    D         B
    E         B
    F         C
    G         C
    Anyone know if there is a Function module that can do this?
    Thanks

    Hi Victoria,
    I was using the same FM, 'RSNDI_SHIE_STRUCTURE_GET3'.
    My code goes like this: I am passing the hierarchy name and the node name; this is for a cost element hierarchy.
    Please check the code and tell me if I am missing anything.
    The output I am getting is all 0's.
    REPORT ztest5.

    TABLES: zrepcodmap.

    DATA: itab TYPE STANDARD TABLE OF zrepcodmap WITH HEADER LINE.

    TYPES: BEGIN OF struc,
             result   TYPE rssh_s_nodebynamewol,
             repdselm TYPE zrepcodmap-repdselm,
           END OF struc.

    DATA: itab1    TYPE STANDARD TABLE OF struc WITH HEADER LINE.
    DATA: w_hiesel LIKE rsndi_s_hiesel.
    DATA: tab      TYPE STANDARD TABLE OF rsndi_s_htab WITH HEADER LINE.
    DATA: lsubrc   TYPE sy-subrc.

    w_hiesel = '1000KES'.   " hierarchy selection: controlling area + hierarchy

    SELECT * FROM zrepcodmap INTO TABLE itab WHERE profit_ctr <> ' '.

    LOOP AT itab.
      CONCATENATE itab-co_area itab-repdselm INTO itab1-result.
      BREAK-POINT.
      CALL FUNCTION 'RSNDI_SHIE_STRUCTURE_GET3'
        EXPORTING
*         i_s_hiekey        =
          i_s_hiesel        = w_hiesel
          i_s_subtreesel    = itab1-result
*         i_t_nodename      =
*         i_no_nodenm_table =
        IMPORTING
*         e_s_hiedir        =
          e_subrc           = lsubrc
        TABLES
*         e_t_hiedirt       =
          e_t_hierstruc     = tab.
*         e_t_mhiernode     =
*         e_t_thiernode     =
*         e_t_hierintvl     =
*         e_t_nodenames     =
*         e_t_nodeattr      =
*         e_t_level         =
*         e_t_message       =
      WRITE: tab.
*     WRITE: lsubrc.
    ENDLOOP.
    Thanks & Regards,
    Vijaya

  • Connecting Oracle Warehouse Builder to an Informix data source

    I am having trouble connecting Oracle Warehouse Builder to my Informix database; I am getting error ORA-28545. Here is what I did: under Project Explorer I went to Non-Oracle under Databases, then right-clicked on Informix and selected New. In the Create Module window I edited the Location: gave it a name, set Type to HOST:PORT:SERVICE, entered my username and password, entered my host IP, left the port as the default (1521), entered my service name, and entered the schema. I clicked on Test Connection and got the above-mentioned error.

    Hi,
    First you should configure access to your Informix database on the target Oracle DB (OWB works with non-Oracle databases over Heterogeneous Services).
    So you should perform all the steps described in the Oracle Database Heterogeneous Connectivity Administrator's Guide (4.1 Setting Up Access to Non-Oracle Systems):
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14232/admin.htm#sthref152
    Oleg
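    For orientation, the HS setup described in that guide usually boils down to three config pieces plus a database link. The sketch below uses placeholder names throughout (the gateway SID ifmx, the ODBC DSN, host, paths, and credentials are all assumptions; the gateway program is hsodbc in Oracle 10g and dg4odbc from 11g):

    ```text
    # $ORACLE_HOME/hs/admin/initifmx.ora  (one init file per gateway SID)
    HS_FDS_CONNECT_INFO = informix_dsn    # ODBC DSN that points at Informix
    HS_FDS_TRACE_LEVEL  = off

    # listener.ora -- add a SID_DESC that launches the gateway
    SID_LIST_LISTENER =
      (SID_LIST =
        (SID_DESC = (SID_NAME = ifmx)
                    (ORACLE_HOME = /u01/app/oracle/product/10.2.0)
                    (PROGRAM = hsodbc)))

    # tnsnames.ora -- note the (HS = OK) clause
    ifmx = (DESCRIPTION =
             (ADDRESS = (PROTOCOL = TCP)(HOST = oraclehost)(PORT = 1521))
             (CONNECT_DATA = (SID = ifmx))
             (HS = OK))

    -- finally, in SQL*Plus on the Oracle side:
    CREATE DATABASE LINK ifmx CONNECT TO "ifxuser" IDENTIFIED BY "ifxpwd" USING 'ifmx';
    SELECT * FROM sometable@ifmx;
    ```

    OWB then reaches Informix through the Oracle database via this link rather than directly. ORA-28545 typically means the listener could not start or reach the gateway process, so verify the database link works in SQL*Plus before retesting the OWB connection.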

  • Data Federator - Building Web Services Data Source - Request Guidance

    Hello -
    I am trying to build a web services data source in Data Federator.
    I have:
    1. In ECC, I have created a web service, proxy, port, endpoint, etc., and am testing it with WS Navigator.
    2. I have created a project in Data Federator of the type web service:
        a. Here I have assigned the WSDL URL (generated in ECC) as the URL.
        b. I have set up web service authentication using the same userid and pwd as in the web service definition (note: I have set the
            web services authentication to 'NONE' to avoid authorization issues).
        c. I am now trying to "Generate Operations".
    I keep getting the error:
    The File Access Parameters are not valid (directory path = <HERE IS THE PATH TO THE WSDL WITH THE HOST/PORT stripped OUT>; file pattern = document)
    I believe that it is finding the WSDL, because when I change the WSDL from 'Document' to 'RPC', the error changes to file pattern = rpc.
    What needs to be done with the file access parameter piece?
    PLEASE provide guidance. Thanks.
    - abhi

    Ananya,
    We are using XI to load data to BI from the source system. XI basically uses the same concept as Web Services: real-time data via RFC embedded in proxies.
    I was using web services to test the data load; eventually we will use XI.
    KJ's blog is pretty good. I was able to get data loaded, after several iterations and several OSS notes. It is still not perfect; there are several manual steps. I am going to pick KJ's brain on that.
    We are on SP10, and it looks like there are several bugs in RSRDA. Some of these are addressed in SP11 and SP12.
    Some notes to consider are
    0001003963, 0001008276, 0001009260 and 001003265.
    Let me know if you have any questions.

  • Data Source  combine - Customer Invoice and Outbound Delivery Item Details

    Hello,
    I need to create a report getting the outbound delivery item details (like ship date, quantities, product, identified stock)
    and also the invoice number (customer invoice number), which is available in the Customer Invoice Header or Detail source.
    But when I try to create a combined data source, I am not able to join Customer Invoice and Outbound Delivery Item Details:
    after adding either one of them as the first data source, the other does not show up when I try to add a second data source.
    Does anybody have an idea how to get the invoice information (invoice number) together with the shipping data?
    Thanks,
    Viral.

    Hi Viral,
    you can only combine data sources of the same access context. The customer invoice data sources have access context "company" and the outbound delivery sources have "site" or "sales". So unfortunately, you can't combine these data sources.
    Best regards,
    Andreas

  • Text Data source for Classification Data

    Hi BW Gurus, Ajay Das and Kishore helped me very well on this; just one last doubt. In the R/3 system I have added the CHAR type and KYF type characteristics to the classification data. This has generated the classification customer attribute data source 1CL_0CUSBMW01, and I see the text data sources generated for the CHAR type characteristics in /NCTBW, such as 1CL_AABMW01, 1CL_AABMW03, etc. Back in BW
    I see the customer attribute data source 1CL_0CUSBMW01 and also the generated text data sources in the data source overview. On the infosource tab in RSA1 I was instructed to create an infopackage on 0CUSTOMER for the customer attribute data source 1CL_0CUSBMW01 and load the data. Now I have a doubt: do I also have to create 4 more infopackages for the text data sources generated for the characteristics added in CTBW (e.g. 1CL_AABMW01, 1CL_AABMW03) to load the texts for those characteristics? Please clarify whether it is required to create all these text infopackages to load the text data sources into the respective infoobjects, or whether the customer attribute data source 1CL_0CUSBMW01 infopackage will take care of everything. My manager did not tell me to create the text infopackages, so how will the texts get loaded for the infoobjects?
    soniya

    1CL_0CUSBMW01 is the DS containing 0CUSTOMER and the two (or more) characteristics I assume. This has to be used to update 0CUSTOMER. You can either use the existing infosource, or, create a new one to do this. You will then need a new infopackage to run this.
    Let us say one of the characteristics also has text, and you want to bring that to BW using generated DS 1CL_AABMW01. You will replicate it, and use it to update texts to infoobject XXXX (whatever this characteristic is, in BW). This will not be used to update 0CUSTOMER.
    Let us say your class char was COLOR. Its values were 1,2,3,4,... and the associated text was Red, Green, Blue...
    Now, based on the above, 0CUSTOMER will have an attribute COLOR which will have values in Attr table as 1,2,3,4...
    The text table for characteristics COLOR will have values as
    1--Red
    2--Green
    3--Blue.....
    The text will NOT go to 0CUSTOMER if that is what you are asking.
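    In other words, the attribute load and the text load end up in two separate tables that are joined at query time. A toy illustration (not SAP code; the customer numbers and data source assignments in the comments are made-up examples):

    ```python
    # 0CUSTOMER attributes carry only the COLOR *key*; the COLOR
    # infoobject's own text table carries key -> description.

    customer_attr = {          # loaded via the attribute DS (e.g. 1CL_0CUSBMW01)
        "C100": {"COLOR": "1"},
        "C200": {"COLOR": "3"},
    }
    color_texts = {            # loaded via the generated text DS (e.g. 1CL_AABMW01)
        "1": "Red", "2": "Green", "3": "Blue",
    }

    def customer_color_text(cust):
        # query-time join: read the attribute key, look up its text
        return color_texts[customer_attr[cust]["COLOR"]]

    print(customer_color_text("C200"))  # Blue
    ```

    So without running the text infopackages, queries would still show the keys (1, 2, 3...) from the attribute load, but not the descriptions.
    
    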
