How to do sorting/filtering on custom java data source implementation

Hi,
I have an entity-driven view object whose data should be retrieved and populated using a custom Java data source implementation. I have done this as described in the documentation. My understanding is that the query mode should be the default one (i.e. scan database tables) and that createRowFromResultSet will be called as many times as needed, based on the number of records retrieved from the service, provided we write the logic for hasNextForCollection(). Sample implementation code is given below.
protected void executeQueryForCollection(Object qc, Object[] params, int noUserParams) {
  // Fetch the data from the remote service and stash the iterator on the collection.
  Iterator datumItr = retrieveDataFromService(qc, params);
  setUserDataForCollection(qc, datumItr);
  hasNextForCollection(qc);
  super.executeQueryForCollection(qc, params, noUserParams);
}
protected boolean hasNextForCollection(Object qc) {
  Iterator datumItr = (Iterator) getUserDataForCollection(qc);
  if (datumItr != null && datumItr.hasNext()) {
    return true;
  }
  // No more service data: tell the framework fetching is complete.
  setFetchCompleteForCollection(qc, true);
  return false;
}
protected ViewRowImpl createRowFromResultSet(Object qc, ResultSet resultSet) {
  Iterator datumItr = (Iterator) getUserDataForCollection(qc);
  Object datumObj = datumItr.next();
  ViewRowImpl r = createNewRowForCollection(qc);
  return r;
}
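For illustration, here is a possible fleshed-out version of createRowFromResultSet that also populates the row; the datum type and the attribute names are assumptions (not from the original post), and resultSet is ignored because the data comes from the service iterator rather than a JDBC query.
protected ViewRowImpl createRowFromResultSet(Object qc, ResultSet resultSet) {
  Iterator datumItr = (Iterator) getUserDataForCollection(qc);
  MyServiceDatum datum = (MyServiceDatum) datumItr.next();  // hypothetical datum type
  ViewRowImpl r = createNewRowForCollection(qc);
  // Hypothetical attribute names: copy each service field into the new row.
  r.setAttribute("Id", datum.getId());
  r.setAttribute("Name", datum.getName());
  return r;
}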
Everything works fine, including table sorting and filtering. Then I noticed that every time the user performs sorting or filtering, executeQueryForCollection is called, which in turn calls my service. That is a performance issue I want to avoid, so I modified the implementation so that the service call happens only when the callService flag is set to true, followed by executeQuery(). The code is given below.
private boolean callService = false;
public void setCallService(boolean callService) {
  this.callService = callService;
}
public boolean isCallService() {
  return callService;
}
protected void executeQueryForCollection(Object qc, Object[] params, int noUserParams) {
  if (callService) {
    // Only hit the remote service when explicitly requested.
    Iterator datumItr = retrieveDataFromService(qc, params);
    setUserDataForCollection(qc, datumItr);
  }
  hasNextForCollection(qc);
  super.executeQueryForCollection(qc, params, noUserParams);
}
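For context, a minimal caller-side sketch of how the flag is intended to be driven; the view object class/instance names and the am variable are assumptions, not from the original post.
// e.g. in an application module method or backing code
MyServiceViewObjectImpl vo = (MyServiceViewObjectImpl) am.findViewObject("MyServiceView1");
vo.setCallService(true);  // the next executeQuery() should hit the remote service
vo.executeQuery();        // runs executeQueryForCollection() and creates the rows
// later UI-driven re-queries (sort/filter) leave the flag at false,
// so retrieveDataFromService() is skipped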
Issue I have:
When the user attempts to sort or filter the table, since I skipped the service call and the storing of the user data collection, createRowFromResultSet is not called and the data I had already retrieved and populated into the view object vanishes completely. I had already retrieved the data and created the rows from the result set, so why should it vanish?
Tried solution:
I came to know that the query mode should be set to Scan_Entity_Cache for filtering and Scan_View_Rows for sorting. I did not disturb the implementation (i.e. the skipping of the service call) but overrode executeQuery as in the following code.
@Override
public void executeQuery() {
  setQueryMode(callService ? ViewObject.QUERY_MODE_SCAN_DATABASE_TABLES : ViewObject.QUERY_MODE_SCAN_ENTITY_ROWS);
  super.executeQuery();
}
By doing this I was able to filter the table, but when I try table sorting or programmatic sorting, the sort is not applied at all.
I then changed the code as below (i.e. to ViewObject.QUERY_MODE_SCAN_VIEW_ROWS instead of ViewObject.QUERY_MODE_SCAN_ENTITY_ROWS).
@Override
public void executeQuery() {
  setQueryMode(callService ? ViewObject.QUERY_MODE_SCAN_DATABASE_TABLES : ViewObject.QUERY_MODE_SCAN_VIEW_ROWS);
  super.executeQuery();
}
Now sorting works fine, but filtering does not behave as expected, because Scan_View_Rows does further filtering of the already filtered view rows.
If I OR the two query modes together (ViewObject.QUERY_MODE_SCAN_ENTITY_ROWS | ViewObject.QUERY_MODE_SCAN_VIEW_ROWS) as in the sketch below, I get duplicate rows.
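A sketch of that OR'ed attempt, reconstructed from the description above (not a working fix, since it is the variant that produces the duplicates):
@Override
public void executeQuery() {
  setQueryMode(callService
               ? ViewObject.QUERY_MODE_SCAN_DATABASE_TABLES
               : (ViewObject.QUERY_MODE_SCAN_ENTITY_ROWS | ViewObject.QUERY_MODE_SCAN_VIEW_ROWS));
  super.executeQuery();
}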
Question:
I don't know how to achieve both filtering and sorting while also skipping the service call.
Can anyone help with this? Thanks in advance.
Raguraman
Edited by: Raguraman on Apr 12, 2011 6:53 AM

Similar Messages

  • How to execute a custom Java data source LOV view object from a common method?

    Hi,
    My application contains LOVs implemented with a custom Java data source. I want to have a util method that takes the view accessor name, finds the view accessor, and executes it. But I couldn't find any API to get a view accessor by passing its name.
    Can anyone suggest how view accessors can best be accessed in a common way, rather than by every developer creating a ViewRowImpl class and accessing the RowSet getters?
    Thanks in advance.

    I am sorry, let me state my requirement clearly.
    My application is not database driven; data transactions happen through a Tuxedo server.
    We have entity-driven VOs as well as programmatic VOs, both implemented with a custom Java data source. Entity-driven VOs participate in transactions, whereas programmatic VOs are used as list view objects to show lists of values.
    The custom Java data source implementation in the BaseService ViewObjectImpl class looks like:
        private boolean callService = false;
        private List serviceCallInputParams = null;

        public BaseServiceViewObjectImpl() {
            super();
        }

        /**
         * Overridden for custom java data source support.
         */
        protected void executeQueryForCollection(Object qc, Object[] params, int noUserParams) {
            List dataFromService = null;
            if (callService) {
                dataFromService = retrieveDataFromService(serviceCallInputParams);
            }
            setUserDataForCollection(qc, dataFromService != null ? dataFromService.iterator() : null);
            super.executeQueryForCollection(qc, params, noUserParams);
        }

        /**
         * Overridden for custom java data source support.
         */
        protected boolean hasNextForCollection(Object qc) {
            Iterator<BaseDatum> datumItr = (Iterator<BaseDatum>) getUserDataForCollection(qc);
            if (datumItr != null && datumItr.hasNext()) {
                return true;
            }
            // Reset the flag so later re-queries (sort/filter) do not hit the service again.
            callService = false;
            serviceCallInputParams = null;
            setFetchCompleteForCollection(qc, true);
            return false;
        }
    Individual screen developers who want to load data into a VO will do something like the code below in their VO impl class:
        public void fetch() {
            BaseServiceViewObjectImpl vo = this;
            vo.setCallService(true);
            vo.setServiceCallInputParams(new ArrayList());
            vo.executeQuery();
        }
    As these custom-Java-data-source LOV VOs are used across screens, I want to have a util method in the base VOImpl class that takes the view accessor name, finds the LOV VO instance, and retrieves the data for it. I want to do something like:
        /**
         * Wrapper method available at the base service ViewObject impl class.
         */
        public void fetchLOVData(String viewAccessorName, List serviceInputParams) {
            // Find the LOV view object instance
            BaseServiceViewObjectImpl lovViewObject = (BaseServiceViewObjectImpl) findViewAccessor(viewAccessorName);
            // Get data for the LOV view object from the service
            lovViewObject.setCallService(true);
            lovViewObject.setServiceCallInputParams(serviceInputParams);
            lovViewObject.executeQuery();
        }
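    A possible caller-side usage of this wrapper, for illustration only; the accessor name "DeptLovVA" and the method name loadDepartmentLov are made-up examples, not from the original post.
        // inside a screen-specific ViewObjectImpl that extends BaseServiceViewObjectImpl
        public void loadDepartmentLov() {
            List serviceInputParams = new ArrayList();
            // serviceInputParams.add(...);  // whatever the Tuxedo service expects
            fetchLOVData("DeptLovVA", serviceInputParams);
        }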
    Question:
    1. Is it achievable?
    2. Is there any API available at the ViewObjectImpl level that takes the view accessor name and returns the exact LOV view object instance? If not, how can I achieve this?

  • Bill Presentment Architecture, how to Override Default Rule and ensure the AR Invoice/Transaction Chooses "Customer Transaction Data Source"

    Hi Intelligentsia,
    We are on 12.2.4 on Linux. I have set up an external template with the supplementary data source "Customer Transaction Data Source". When I test it, it fails and I am not able to debug it; however, when I run the BPA Transaction Print Program (Multiple Languages) it always gives me the default layout.
    My query is:
    How do I ensure the default rule does not apply to my invoice so that I am able to override it?
    Is there a way to explicitly ensure the supplementary data source is "Customer Transaction Data Source" when I am creating the AR invoice/transaction? Am I missing some setup in the AR Invoice Transaction Flexfield where I need to set up "Customer Transaction Data Source" as the DFF context?
    Please let me know if you need any more information.
    Abdulrahman

    Hello,
    Thanks for the answer. When you say rule date, is that the rule creation date or the "Bill Creation From Date" that we set up while creating the rule? I created a new invoice after the rule was created, but it did not pick up the new custom template.
    I have another issue; it would be great if you could help. I split my logo area into two vertically, to display the logo in one part and the legal entity and address in the other. In the online preview I can see the logo and the legal address, but in the print preview I am not able to see them; it just shows a blank space. Any idea?
    Thanks in advance

  • Customized delta data source for deleting data record in the source system.

    Hello Gurus,
    There is a customized delta data source. How do I implement the delta function for deleted data records in the source system?
    I mean, if a record is deleted in the source system, how do I notify the SAP BW system of this deletion through this customized delta data source?
    Many thanks.

    Hi,
    Whenever a record is deleted, write code to insert that record into a Z table and load these records into BW into a cube with a similar structure. While loading into this cube, multiply the key figure by -1.
    Add this cube to the MultiProvider. The union of the records in the original cube and the cube holding the deleted records will result in a zero value, which will not be displayed in the report.
    Regards,

  • XmlDataProvider .... is gone completely in my Xaml file. Why? How many different ways to deal with xml data source through WPF

    I followed a procedure described in a book.
    1. insert "Inventory.xml" file to a project "WpfXmlDataBinding" .
    2. add the XML data source through the data panel of "blend for 2013", named it "InventoryXmlDataStore" and store it in the current document.
    3. dragged and droppped the nodes from the Data panel onto the artboard.
    Then I checked my Xaml file against the one provided by the book
    Xaml file by the book:
    <Window.Resources>
    <!-- This part is missing in my xaml file --><XmlDataProvider x:Key="InventoryDataSource"
    Source="\Inventory.xml"
    d:IsDataSource="True"/>
    <!-- This part is missing in my xaml file -->
    <DataTemplate x:Key="ProductTemplate">
    <StackPanel>
    <TextBlock Text="{Binding XPath=@ProductID}"/>
    <TextBlock Text="{Binding XPath=Cost}"/>
    <TextBlock Text="{Binding XPath=Description}"/>
    <CheckBox IsChecked="{Binding XPath=HotItem}"/>
    <TextBlock Text="{Binding XPath=Name}"/>
    </StackPanel>
    </DataTemplate>
    </Window.Resources>
    <Grid>
    <ListBox HorizontalAlignment="Left"
    ItemTemplate="{DynamicResource ProductTemplate}"
    ItemsSource="{Binding XPath=/Inventory/Product}"
    Margin="89,65,0,77" Width="200"/>
    </Grid>
    my Xaml file:
    <Window x:Class="WpfXmlDataBinding.MainWindow"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    Title="MainWindow" Height="922" Width="874">
    <Window.Resources>
    <DataTemplate x:Key="ProductTemplate">
    <StackPanel>
    <TextBlock Text="{Binding XPath=@ProductID}"/>
    <TextBlock Text="{Binding XPath=Cost}"/>
    <TextBlock Text="{Binding XPath=Description}"/>
    <CheckBox IsChecked="{Binding XPath=HotItem}"/>
    <TextBlock Text="{Binding XPath=Name}"/>
    </StackPanel>
    </DataTemplate>
    </Window.Resources>
    <Grid DataContext="{Binding Source={StaticResource InventoryXmlDataStore}}">
    <ListBox HorizontalAlignment="Left" Height="370"
    ItemTemplate="{DynamicResource ProductTemplate}"
    ItemsSource="{Binding XPath=/Inventory/Product}"
    Margin="65,55,0,0" VerticalAlignment="Top" Width="270"/>     
        </Grid>
    </Window>
    All looks quite the same except the <XmlDataProvider ....> part under <Window.Resources>, which is gone completely in my Xaml file.
    1, Why?
    2, How many different ways to deal with xml data source through WPF?
    Thanks, guys.
    (ps My "WpfXmlDataBinding" runs without problem through.)

    Never do yourself down Richard.
    Leave that to other people.
    It's quite common for smart developers to think they're not as good as they are.
    I coach a fair bit and it's a surprisingly common feeling.
    And to repeat.
    Never use anything whose name ends in ...Provider. They're for trivial demo apps. Transform XML into objects and use those. Write it back as XML. Preferably, use a database.
    You want to read a little mvvm theory first.
    http://en.wikipedia.org/wiki/Model_View_ViewModel
    Whatever you do, don't read Josh Smith's explanation. I used to recommend it, but it confuses the heck out of newbies. Leave that until later.
    Laurent Bugnion did a great presentation at mix10.  Unfortunately that doesn't seem to be working on the MS site, but I have a copy.  Download and watch:
    http://1drv.ms/1IYxl3z
    I'm writing an article at the moment which is aimed at beginners.
    http://social.technet.microsoft.com/wiki/contents/articles/30564.wpf-uneventful-mvvm.aspx
    The sample is just a collection of techniques really.
    I have a sample which involves no real data but is intended to illustrate some aspects of how viewmodels "do stuff" and how you use datatemplates to generate UI.
    I can't remember if I recommended it previously to you:
    https://gallery.technet.microsoft.com/WPF-Dialler-simulator-d782db17
    And I have working samples which are aimed at illustrating line of business architecture.  This is an incomplete step by step series but I  think more than enough to chew on once you've done the previous stuff.
    http://social.technet.microsoft.com/wiki/contents/articles/28209.wpf-entity-framework-mvvm-walk-through-1.aspx
    The write up for step2 is work in progress.
    https://gallery.technet.microsoft.com/WPF-Entity-Framework-MVVM-78cdc204
    Hope that helps.
    Recent Technet articles: Property List Editing;
    Dynamic XAML

  • [Best practice] How to call a service from custom Java code

    Hi all,
    I'm wondering what the best method is to call a standard service from custom Java code?
    In a specific situation, iDoc script is extended with custom functions via a custom component. There is Java code mapped to these functions that executes them. The iDoc script functions are called from a workflow entry script.
    In the Java code that runs when the custom iDoc functions are called, I want to call a standard Content Server service. I don't think that the m_service variable is available, so filling the binder and using m_service.executeService() probably isn't possible.
    Also, if it were possible (that is, if I want to call a standard service from my own custom service Java code), what would then be the best method to do so?
    Regards, Stijn

    Hi Sapan,
    Let me explain a bit further.
    I'm a UCM consultant trying to solve a problem that occurred at a client when they installed the CS10gR35CoreUpdateBundle.
    Content items enter a workflow when they are checked in. Part of one of the entry scripts of a workflow step is that content related to the content item in the workflow is (re)submitted for conversion.
    To achieve this, a custom component provides an iDoc script extension. This iDoc function (resubmitForConversion) is implemented in Java (the class extends ScriptExtensionsAdaptor).
    In this Java method, the related content items are fetched first. Then the service RESUBMIT_FOR_CONVERSION should be called for all dIDs of the related content.
    Thus, at a certain point in the custom Java code, a native Content Server service must be called. Of course the class of this Java code does not extend the Service class, so the m_service object isn't available.
    The thing is: before the 10gR35CoreUpdateBundle was installed, everything worked OK. This code was used to execute the service:
            Workspace workspace = CommonUtils.getSystemWorkspace();
            String cmd = binder.getLocal("IdcService");
            if (cmd == null) throw new DataException("!csIdcServiceMissing");
            ServiceData serviceData = ServiceManager.getFullService(cmd);
            if (serviceData == null) throw new DataException(LocaleUtils.encodeMessage("!csNoServiceDefined", null, cmd));
            Service service = ServiceManager.createService(serviceData.m_classID, workspace, null, binder, serviceData);
            UserData fullUserData = CommonUtils.getFullUserData(userName, service);
            service.setUserData(fullUserData);
            binder.m_environment.put("REMOTE_USER", userName);
            ServiceException error = null;
            try {
                service.setSendFlags(true, true);
                service.initDelegatedObjects();
                service.globalSecurityCheck();
                service.preActions();
                service.doActions();
                service.postActions();
                service.updateSubjectInformation(true);
                service.updateTopicInformation(binder);
            } catch (ServiceException e) {
                error = e;
            } finally {
                service.cleanUp(true);
                if (!CommonUtils.isWorkspaceConnectionInTransaction(workspace)) {
                    workspace.releaseConnection();
                }
            }
    The first problem was that the CS began to complain that a transaction was started within another transaction, so I suspect that the 10gR35 update wrapped a transaction around the workflow script entry.
    With some decompiling I figured out how a service is called from iDoc with the <$executeService()$> command. So I replaced the code above with:
                  String cmd = binder.getLocal("IdcService");
                ServiceData serviceData = ServiceManager.getFullService(cmd);
                if (serviceData == null) throw new DataException(LocaleUtils.encodeMessage("!csNoServiceDefined", null, cmd));
                Workspace workspace = CommonUtils.getSystemWorkspace();
                Service service = ServiceManager.createService(serviceData.m_classID, workspace, null, binder, serviceData);
                UserData fullUserData = CommonUtils.getFullUserData(userName, service);
                service.setUserData(fullUserData);
                binder.m_environment.put("REMOTE_USER", userName);
                service.initDelegatedObjects();
                service.executeSafeServiceInNewContext(cmd, true);
    This solved the transaction problem but introduced another problem: !csUnableToResubmitItem,(null)!csIllegalScriptAccess,RESUBMIT_FOR_CONVERSION
    The Service Reference Guide says that the access level for RESUBMIT_FOR_CONVERSION is 33 (Read, Scriptable). However, in shared/config/resources/std_services.htm the access level is specified as 2 (Write).
    Thus, my question still is:
    What is the best method to call a standard Content Server service from any Java code (so without extending the Service class, or having the m_service object available)?

  • How to upload different views of customer master data using LSMW-IDOC

    I need to upload customer master data using the LSMW IDoc method for my client. A customer will have different views, such as the main view, sales data, company code data, partner function data, etc., and except for the main data, all other data can occur multiple times per customer. We are going to upload the data from tab-delimited .TXT files. Should I propose separate LSMW objects to upload the different customer views from different .TXT files, or can we upload all the customer-related data (such as main view data and partner function data) from a single .TXT file? Kindly suggest which approach is convenient and how we can prepare the data file in both cases.

    Convenient is whichever method you can handle.
    But as the person who loads the data, you have to load the data as it is made available, i.e. as it is convenient for others to prepare it.
    In general there is no problem using an IDoc method to load a customer master with multiple company codes and several sales orgs in one shot.
    The data can be in one source file, but it then needs to be maintained in a certain way.
    Example1: all data in 1 structure
    GD1 - CC1 - SO1
    GD1 - CC2 - SO2
    GD1 - CC2 - SO3
    In this case the GD (general data) is redundant in each line that has different company code data or different sales org data.
    Example2: all data in 3 structures but one file
    GD1
    .CC1
    .CC2
    ..SO1
    ..SO2
    ..SO3
    Example3: data delivered in 3 files - you join the files in LSMW; they must have a common identifier, like the old customer number, at the beginning of each file.
    FILE General data:
    GD1
    GD2
    GD3
    file Compamy code data:
    CC1
    CC2
    CC3
    file Sales org data:
    SO1
    SO2
    SO3
    LSMW is flexible and can handle each of these scenarios; are you flexible too?

  • Customized java date function in heart of JDK

    hi
    How can I force the JDK to return a Persian or Arabic date instead of the Christian era?
    I have an application on the Java platform. I couldn't change the program's code, because the source is closed and not reachable. I want to know how I can use a wrapper for the date function in Java. If it is possible, how can I replace Java's date functionality with my desired date function?
    thanks for any help

    Thanks for your help.
    However, I am looking for one thing: something similar to a function wrapper that can substitute for the original method.
    I don't know where the application uses the date function or how it calls it, so I think it is possible to change the behavior of Java itself.
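    As an aside on the calendar half of the question (it does not solve the "wrap the JDK without touching the application" part): since Java 8, java.time ships a Hijrah (Islamic) chronology, so code you do control can convert a Gregorian date explicitly; note that the standard JDK has no Persian (Jalali) chronology. A minimal sketch:
    import java.time.LocalDate;
    import java.time.chrono.HijrahDate;

    public class HijrahDateExample {
        public static void main(String[] args) {
            LocalDate today = LocalDate.now();         // ISO (Gregorian) date
            HijrahDate hijri = HijrahDate.from(today); // the same day in the Islamic calendar
            System.out.println(today + " -> " + hijri);
        }
    }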

  • Building a custom hierarchy data source extractor

    Hi SDN,
    The ECC side has given us a Z table which contains a hierarchy. I will write a function module to extract the hierarchy and pass it on to BI.
    Can someone assist with the steps needed to accomplish this? Obviously the FM, the Z table, the view over it, etc. are custom. What should the function module generate and fill, and what should be loaded into BI? How do you generate a DataSource for such a custom situation?
    Please advise.
    Thanks.

    Hi Shahid,
    The only way you can extract it is by dumping the data or hierarchy into a flat file and loading it into BI. Download the data into a flat file and then upload the same. You cannot create a generic DataSource for a hierarchy.
    If you have your data in a flat structure in the table, then this code may help you.
    report zbiw_glacc_hier line-size 250.
    data :
         begin of itab occurs 0,
          seg(5),
          segname(30),
          spu(5),
          spuname(30),
          bu(5),
          buname(30),
          pu(15),
          puname(30),
         cust(8),
         custname(60),
          end of itab.
    data : itab_all like itab occurs 0 with header line,
             itab_root like itab occurs 0 with header line.
    data : begin of i_hier occurs 0.    " BIW Structure to hold hirearchy
           INCLUDE STRUCTURE E1RSHND.
    data:   nodeid like e1rshnd-nodeid,           "NODE ID
            infoobject like e1rshnd-infoobject,   "infoobjectNAME
            nodename like e1rshnd-nodename,       "nodename year
            link(5), "   LIKE E1RSHND-LINK,              "linkname
            parentid like e1rshnd-parentid.       "parent id
    data:   leafto(32),                           "Interval - to
            leaffrom(32).                         "Interval - from
            include structure e1rsrlt.
            include structure rstxtsml.
    *DATA : PARENT LIKE E1RSHND-NODENAME.
    data : end of i_hier.
    data : sr(8) type n value '00000000'.
    data : begin of itab1 occurs 0, " Node Level1 of Hierarchy
           z_ah_level1 like itab-seg,
           z_ah_level2 like itab-spu,
           z_ah_desc_level2 like itab-spuname,
           end of itab1.
    data : begin of itab2 occurs 0, " Node Level1 of Hierarchy
           z_ah_level2 like itab-spu,
           z_ah_level3 like itab-bu,
           z_ah_desc_level3 like itab-buname,
           end of itab2.
    data : begin of itab3 occurs 0,   " Node Level3 of Hierarchy
           z_ah_level3 like itab-bu,
           z_ah_level4 like itab-pu,
           z_ah_desc_level4 like itab-puname,
           end of itab3.
    data wa like itab.
    data : begin of itab_first occurs 0, " Node Level2 of Hierarchy
           z_ah_level1 like ITAB-SEG,
           end of itab_first.
    data : begin of itab_sec occurs 0, " Node Level2 of Hierarchy
           z_ah_level2 like ITAB-SPU,
           end of itab_sec.
    data : begin of itab_third occurs 0, " Node Level2 of Hierarchy
           z_ah_level3 like ITAB-BU,
           end of itab_third.
    data : begin of i_download occurs 0,
           record(250),
           end of i_download.
    data : v_pid like e1rshnd-nodeid,
           v_line(250).
    start-of-selection.
    CALL FUNCTION 'UPLOAD'
    EXPORTING
      CODEPAGE                      = ' '
       FILENAME                      = 'C:/'
       FILETYPE                      = 'DAT'
      TABLES
        DATA_TAB                      = itab .
    IF SY-SUBRC <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    itab_all[] = itab[].
      itab_root[] = itab_all[].
      sort itab_all by seg spu bu pu .
      sort itab_root by seg.
      delete adjacent duplicates from itab_root comparing seg.
    Appending Root level of node
    sr = 00000001.
    move : sr to i_hier-nodeid,
           '0HIER_NODE' to i_hier-infoobject,
           'SHA HIERARCHY' to i_hier-nodename,
           '00000000' to i_hier-parentid,
           'EN' to i_hier-langu,
           'SHA HIERARCHY' to i_hier-txtmd.
    append i_hier.
    clear i_hier.
    v_pid  = '00000000'.
    loop at itab_root.
    if itab_root-seg = itab_root-spu.
      delete itab_root.
    endif.
    endloop.
      loop at itab_root.
    Appending first level of the node
        sr = sr + 1.
        move : sr to i_hier-nodeid,
              '0HIER_NODE' to i_hier-infoobject,
               itab_root-seg to i_hier-nodename,
               v_pid to i_hier-parentid,
              'EN' to i_hier-langu,
              itab_root-segname to i_hier-txtsh,
              itab_root-segname to i_hier-txtmd,
              itab_root-segname to i_hier-TXTLG.
        append i_hier.
        clear i_hier.
      endloop.
      loop at itab_all.
        move itab_all to wa.
    Separation of level 3 of the node
        at new spu.
          if not wa-spu is initial.
            move : wa-seg to itab1-z_ah_level1,
                   wa-spu to itab1-z_ah_level2,
                   wa-spuname to itab1-z_ah_desc_level2.
            append itab1.
            clear itab1.
          endif.
        endat.
    Separation of level 3 of the node
        at new bu.
          if not wa-bu is initial.
            move : wa-spu to itab2-z_ah_level2,
                   wa-bu to itab2-z_ah_level3,
                   wa-buname to itab2-z_ah_desc_level3.
            append itab2.
            clear itab2.
          endif.
        endat.
    Separation of level 2 of the node
        at new pu.
          if not wa-pu  is initial.
            move : wa-bu to itab3-z_ah_level3,
                   wa-pu to itab3-z_ah_level4,
                   wa-puname to itab3-z_ah_desc_level4.
            append itab3.
            clear itab3.
          endif.
        endat.
        clear wa.
      endloop.
      sort itab1 by z_ah_level1 z_ah_level2.
      delete adjacent duplicates from itab1 comparing z_ah_level1
                                                  z_ah_level2.
      sort itab2 by z_ah_level2 z_ah_level3.
      delete adjacent duplicates from itab2 comparing z_ah_level2
                                                      z_ah_level3.
      sort itab3 by z_ah_level3 z_ah_level4.
      delete adjacent duplicates from itab3 comparing z_ah_level3
                                                      z_ah_level4.
    loop at itab_root.
    move itab_root-SEG to itab_first-z_ah_level1.
    append itab_first.
    clear itab_first.
    endloop.
    itab_sec[] = itab2[].
    itab_third[] = itab3[].
    sort itab_first by z_ah_level1.
    delete adjacent duplicates from itab_first.
    sort itab_sec by z_ah_level2.
    delete adjacent duplicates from itab_sec.
    sort itab_third by z_ah_level3.
    delete adjacent duplicates from itab_third.
    data : indx like sy-tabix.
    clear indx.
    loop at itab_first.
    indx = sy-tabix.
    read table itab_sec with key z_ah_level2 = itab_first-z_ah_level1
                            binary search.
    if sy-subrc eq 0.
    delete itab_first index indx.
    endif.
    endloop.
    clear indx.
    loop at itab_sec.
    indx = sy-tabix.
    read table itab_third with  key z_ah_level3 = itab_sec-z_ah_level2
                            binary search.
    if sy-subrc eq 0.
    delete itab_sec index indx.
    endif.
    endloop.
    clear indx.
    clear indx.
    loop at itab1.
    If itab1-z_ah_level1 = itab1-z_ah_level2.
      delete itab1.
    endif.
    endloop.
    loop at itab2.
    If itab2-z_ah_level2 = itab2-z_ah_level3.
      delete itab2.
    endif.
    endloop.
    loop at itab3.
    If itab3-z_ah_level3 = itab3-z_ah_level4.
      delete itab3.
    endif.
    endloop.
    Appending First level node to Hierarchy table
      loop at itab_root.
        loop at itab1 where z_ah_level1 = itab_root-SEG.
          read table i_hier with key nodename = itab1-z_ah_level2.
           if sy-subrc ne 0.
          read table i_hier with key nodename = itab1-z_ah_level1.
          if sy-subrc eq 0.
          move i_hier-nodeid to i_hier-parentid.
    Appending First level node to Hierarchy table
          sr = sr + 1.
          move  : sr to i_hier-nodeid,
                  '0HIER_NODE' to i_hier-infoobject,
                  itab1-z_ah_level2 to i_hier-nodename,
                  'EN' to i_hier-langu,
                  itab1-z_ah_desc_level2 to i_hier-txtsh,
                  itab1-z_ah_desc_level2 to i_hier-txtmd,
                  itab1-z_ah_desc_level2 to i_hier-txtlg.
          append i_hier.
          clear i_hier.
          endif.
          endif.
        endloop.
      endloop.
    Appending second level node to Hierarchy table
      loop at itab_sec.
        loop at itab2 where z_ah_level2 = itab_sec-z_ah_level2.
          read table i_hier with key nodename = itab2-z_ah_level3.
           if sy-subrc ne 0.
          read table i_hier with key nodename = itab2-z_ah_level2.
           if sy-subrc eq 0.
          move i_hier-nodeid to i_hier-parentid.
    Appending second level node to Hierarchy table
          sr = sr + 1.
          move : sr to i_hier-nodeid,
                '0HIER_NODE' to i_hier-infoobject,
                itab2-z_ah_level3 to i_hier-nodename,
                'EN' to i_hier-langu,
                itab2-z_ah_desc_level3  to i_hier-txtsh,
                itab2-z_ah_desc_level3 to i_hier-txtmd,
                itab2-z_ah_desc_level3 to i_hier-txtlg.
          append i_hier.
          clear i_hier.
         endif.
         endif.
        endloop.
      endloop.
    Appending third level node to Hierarchy table
      loop at itab_third.
        loop at itab3 where z_ah_level3 = itab_third-z_ah_level3.
          read table i_hier with key nodename = itab3-z_ah_level4.
        if sy-subrc ne 0.
          read table i_hier with key nodename = itab3-z_ah_level3.
           if sy-subrc eq 0.
          move i_hier-nodeid to i_hier-parentid.
    Appending third level node to Hierarchy table
          sr = sr + 1.
          move : sr to i_hier-nodeid,
               'ZGL_AC_CD' to i_hier-infoobject,
                '0GL_ACCOUNT' to i_hier-infoobject,
                itab3-z_ah_level4 to i_hier-nodename,
                'EN' to i_hier-langu,
                itab3-z_ah_desc_level4 to i_hier-txtsh,
                itab3-z_ah_desc_level4 to i_hier-txtmd,
                itab3-z_ah_desc_level4 to i_hier-txtlg.
          append i_hier.
          clear i_hier.
          endif.
            endif.
        endloop.
      endloop.
      loop at i_hier.
        concatenate i_hier-nodeid i_hier-infoobject i_hier-nodename
                    i_hier-link i_hier-parentid i_hier-leafto
                    i_hier-leaffrom i_hier-langu i_hier-txtsh
                    i_hier-txtmd i_hier-txtlg into v_line
                    separated by ','.
        move  v_line to i_download-record.
        append i_download.
        write : / i_hier.
      endloop.
      perform f_download.
    *&      Form  f_download
    *       text
    *  -->  p1        text
    *  <--  p2        text
    form f_download.
      call function 'DOWNLOAD'
       exporting
      BIN_FILESIZE                  = ' '
      CODEPAGE                      = ' '
      FILENAME                      = ' '
          filetype                      = 'DAT'
      ITEM                          = ' '
      MODE                          = ' '
      WK1_N_FORMAT                  = ' '
      WK1_N_SIZE                    = ' '
      WK1_T_FORMAT                  = ' '
      WK1_T_SIZE                    = ' '
      FILEMASK_MASK                 = ' '
      FILEMASK_TEXT                 = ' '
      FILETYPE_NO_CHANGE            = ' '
      FILEMASK_ALL                  = ' '
      FILETYPE_NO_SHOW              = ' '
      SILENT                        = 'S'
      COL_SELECT                    = ' '
      COL_SELECTMASK                = ' '
      NO_AUTH_CHECK                 = ' '
    IMPORTING
      ACT_FILENAME                  =
      ACT_FILETYPE                  =
      FILESIZE                      =
      CANCEL                        =
        tables
          data_tab                      = i_download
      FIELDNAMES                    =
    EXCEPTIONS
      INVALID_FILESIZE              = 1
      INVALID_TABLE_WIDTH           = 2
      INVALID_TYPE                  = 3
      NO_BATCH                      = 4
      UNKNOWN_ERROR                 = 5
      GUI_REFUSE_FILETRANSFER       = 6
      CUSTOMER_ERROR                = 7
      OTHERS                        = 8
      if sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      endif.
    endform.                    " f_download
    This code may need modification per your requirement, and it may need to be optimized.
    First you have to read the data from your Z table into an internal table, then separate each node level.
    Hope this helps.
    There is a PDF article on this. Good luck.
    Regards
    Satish Arra
    Edited by: Satish Arra on Feb 11, 2009 1:45 AM

  • BI Publisher with Siebel 8.1 using custom SQL data source

    Hello ,
    We have Siebel 8.1 implemented with embedded BI Publisher for reporting.
    For some custom requirements, we want to connect to another Oracle database table and display the results in the Siebel reporting environment.
    I know this is possible in a normal BI Publisher environment, but since I am new to Siebel, I am not sure it will work with SQL as the data source.
    Could you please guide me on how to do that (if feasible)?
    Thanks and regards
    Amit

    Hi,
    I am trying to call a BIP report in a workflow. I do several steps beforehand, then insert into the Report Output BC to get the Run Id, and then have a step to generate the report output by calling the XMLP Driver Service with the GenerateBIPReport method. I am passing in the arguments, but I am unsure of all the details, as there isn't a lot of documentation on using it in a workflow. Can you please assist me or point me to some documentation? I followed the information in Doc ID 823360.1, but I may be missing something. I am not sure how it knows what to include; I thought it was the bookmark input, but I am not sure. I want to pass it an activity id and return the data associated with that activity (i.e. orders). Thanks in advance. Tracy

  • How to configure crystal report xml file as data source in BOE in Solaris?

    Hi,
    How do I configure Crystal Reports with an XML file as the data source on Solaris? I didn't find any suitable driver for XML/Excel files on Sun Solaris.
    Which driver do I have to use to connect an XML file to a Crystal Report so that I can view the report in BOE on Solaris?
    And the same question for an Excel file as a data source for a Crystal Report.
    Thanks

    Hi Don thanks for the reply,
    In the Windows environment I do not have any problem creating Crystal Reports from XML and Excel files. After creating the reports, when I publish them to the BOE server on Solaris, I get a connection failed error.
    My Solaris BOE server doesn't have any network connection to Windows machines, so I have to place the files on the Solaris server.
    Below are the steps I tried:
    1. Created Crystal Reports in CR Designer on Windows using the ADO.NET (XML) driver, and in another attempt the XML web services driver. The reports work as they should.
    2. Saved them to the BOE repository on the Solaris server from Crystal Reports and changed the database configuration settings as follows:
        - Used custom database logon information and specified cr_xml as the custom driver.
        - Changed the database path to the file directory path according to the Solaris server file path </app/../../>
        - Tried the table prefix as well.
        - Selected the radio button "Use same database logon as when report is run" and saved.
    My environment :
    SAP BOXI3.1 sp3
    crystal reports 2008 sp3
    SunOS
    CR development on Windows 7.
    For Excel I tried ODBC on Windows, but I can't find any ODBC or JDBC drivers for Excel on Solaris.
    Any help to solve these issues would be appreciated.
    Thanks
    Nagalla

  • How to modify the Generic Delta in Standard Data Source...?

    Hi BW Guru's,
    We have an issue where all values come from the customer master data DataSource 0CUST_COMPC_ATTR, which is delta capable. Recently the functional team made a modification on the customer by remapping the field to the sales rep; this impacts the collector field (a generic ZZ field). Now there will be frequent changes to the collector field, and the delta is not able to pick up the changed values into the BW system.
    I have tried different approaches without resolving it; finally I got a clue, as follows:
    DataSource 0CUST_COMPC_ATTR shows the delta process as 'E', unspecific delta using extractor (not ODS-capable), in RSA2.
    Now I would like to change the delta process option to 'D', unspecified delta using the delta queue (not ODS-capable).
    Can anyone advise me on this? Your help will be appreciated.
    Thanks in Advance,
    Venkat

    Hi Olivier,
    Thanks for quick response.
    I am sorry for not mentioning the module I am working on.
    I am working on FI-CO extraction. I heard that in FI-CO the delta queue is not supported because it won't support the V3 delta mechanism.
    But anyhow, I will try with ROOSOURCE and let you know the status.
    If you have any further advice, please revert with suitable suggestions.
    Thanks and Regards,
    Venkat

  • Crystal Reports Developer Edition XML or Java data source

    Is there a way to use an XML or custom Java objects (POJO) data source in Crystal Reports Developer Edition? Currently Crystal Reports is being invoked via its API from existing code, but that code can be modified if need be.

    Hello Jayashree,
    Please see a sample for CR 08 with XML as datasource here: https://smpdl.sap-ag.de/~sapidp/012002523100006016532008E/XML_datasource.zip
    Best regards
    Falk

  • How do I reference multiple tables in SSAS Data Source View Named Calculation functionality?

    Hi SSASers - 
    On the Data Source View node of the SSAS Visual Studio interface, I want to create a named calculation that references multiple tables, something like: CASE WHEN tableA.Column1 = 'Y' THEN tableB.Column1 ELSE tableB.Column2 END, but the compiler throws the error "Deferred prepare could not be completed".
    What is the syntax for referencing multiple tables on this node, or how else can multiple tables be used to create a calculated value in SSAS? I'm new to SSAS and so far have been building big views and building calculations that way. Another option is the Calculations tab off the cube node, but this calculation will need to be based on dimensions AND measures, so please provide a syntax example for that as well.
    Thanks in advance!
    Carl

    Thanks Jiri! The named query functionality off the Data Source View node is exactly what I was looking for - it's just a view on the SSAS side of things instead of in the relational database.
    Sorry for the delayed answer verification - got pulled into something else last week. Carl 
    Carl

  • How to handle parameters when BI Answers is the data source

    Hi All,
    I created a report that has OBI Answers as its data source. How do I create parameters for this report, similar to prompts in a BI Answers report?
    Can anyone help me with this?
    Thanks,
    KSS

    The native XML driver is incompatible with ADO.NET DataSet XML.  The specific issue is that the driver cannot handle the recursive definition on "Schema" that the .NET DataSet XML uses.
    The workaround is to create a .NET class that web-references the service, invokes the web service method for the DataSet, and then returns it.
    Use the ADO.NET (XML) driver in Crystal Reports to consume the .NET data source.
    Note - when you deploy your report, ensure you deploy the DLL for the .NET class you've created.
    Sincerely,
    Ted Ueda
