Data filtering for table and chart

Hi,
I'm new to crystal reports XI.
I have one SQL query that I am using as the data source for both a chart and a table (cross-tab). The query returns monthly balance data. In the chart I need to display all the data returned by the query, but in the table I only need to display the last 12 months. What is the best way to filter the table down to those 12 months of data without using a second query?
Any ideas would be greatly appreciated.
Thanks.

Please use the Selection Expert; you can set your filter criteria at the report level itself.
If the standard options do not let you filter for 12 months of data, you can use a formula there.
Otherwise, use a subreport (which is effectively a subquery).
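
If the report-level options prove awkward, another approach that still keeps a single query is to add a flag column to that query and let the cross-tab use it. A minimal sketch, assuming SQL Server date syntax (DATEADD/GETDATE) and a hypothetical monthly_balances table; adapt the names and date functions to your own query and database:

    SELECT b.account_id,
           b.balance_month,                     -- one row per account per month
           b.balance_amount,
           CASE WHEN b.balance_month >= DATEADD(MONTH, -12, GETDATE())
                THEN 1 ELSE 0 END AS in_last_12_months
    FROM   monthly_balances AS b;

The chart can keep using every row, while whatever filter or suppression formula you apply to the cross-tab only has to test the simple in_last_12_months flag instead of recomputing a date window.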

Similar Messages

  • Delayed Data Display for Tables and Views

    Is it possible to delay the display of data for tables and views?
    I work on remote databases over relatively slow ADSL/Dialup connections when compared to a LAN/WAN.
    If the tables are large, retrieving the data can be relatively slow, so setting a filter before display would reduce the amount of data retrieved and allow the attributes, statistics, etc. to be viewed in a timely manner.

    AL_INDEX and AL_COLMAP store metadata for the tables that you import into a datastore.
    AL_COLMAP and AL_COLMAP_TEXT store the column-mapping information used in a dataflow. This data is usually populated when you save the dataflow; check whether the "automatically calculate column mapping" option is checked in the Tools -> Options window in the Designer.
    If you are not seeing any data in these two tables, go to the Designer, select the Datastore tab in the object library, right-click in the datastore workspace, choose the Repository menu option, and select Calculate Usage Dependency. Once that completes, select Calculate Column Mapping from the same menu, and then check whether you see data in these two tables.

  • Protect a document while allowing insertion of data in a table and points on a chart.

    I have created a Word document that includes a table and a chart.  Can I protect the document but allow entry of data into the table and allow points to be placed on the chart using Adobe?

    Hi brianh89327665,
    While you can apply a permissions password to a PDF to prevent things such as printing and changing the document, you can't apply it selectively to parts of the PDF.
    Best,
    Sara

  • Pivot tables and charts error: Please load datasource first

    hi all,
    I am getting an error, "Please load data source first", in the RTF template, but only for pivot tables and charts (normal tables and fields I can insert).
    Please help me with this.
    Thanks and Regards
    Prakhar

    Well... if you think and try long enough, you may find the solution.
    The core problem is referencing queried values that are not shown in the table/chart. And there IS a solution.
    Just use...
    NoFilter(RelativeValue([measure prior year_1];([year/month]);-12))
    ... instead of
    RelativeValue([measure prior year_1];([year/month]);-12)
    NoFilter enables RelativeValue to reference values that are not present in a table or chart but are available in the report.
    From there on, you can, for example, define the starting point of a chart's category axis with a local filter without losing the values being referenced.

  • Grouping drilldown table and chart

    Hi,
    I want to group a drill-down table and chart on the same Webi report.
    For example, the table has many records, and the chart must be shown below the table no matter how many records the table has.
    I can't estimate the number of records on each run, so when the table has more than about 40 records the chart ends up under the table. I urgently need to correct this display issue.
    Is there any function for this?
    Any help will highly be appreciated.
    Thanks

    Hi Nil,
    Try these steps.
    1. Click on Preferences in InfoView.
    2. Click on Web Intelligence
    3. Scroll to Drill options
    4. Check "Synchronize drill on report blocks"
    As you drill to the next hierarchy on the table, the chart will also drill down along with the table.
    Hope this helps.

  • Fetch data from one table and insert into two tables in desired format

    I have similar to the following data in a table and it is not normalized. The groupID is being used to group two records of similar nature.
    DECLARE @OldDoc TABLE (oldDocID INT, groupID INT, deptID INT)
    INSERT INTO @OldDoc (oldDocID, groupID, deptID) VALUES (1, NULL, 111),(2,NULL,111),(3,1,111),(4,NULL,333),(5,1,222),(6,NULL,333),(7,2,222),(8,2,333),(9,NULL,111),(10,3,222),(11,NULL,333),(12,3,444)
    I need to process the data from the above table (@OldDoc) and write into two new tables (@NewDoc and @NewDocGroup) as follows.
    oldDocID should be stored as newDocID when inserting to @NewDoc table. Only records with groupID NULL and one record (first one) per group should be considered (For example, oldDocID 5 is not considered as 3 and 5 belong to the same groupID 1) for insertion. 
    DECLARE @NewDoc TABLE (newDocID INT)
    INSERT INTO @NewDoc (newDocID) VALUES (1),(2),(3),(4),(6),(7),(9),(10),(11)
    All records from @OldDoc should be considered for insertion into @NewDocGroup table. OldDocID is inserted as NewDocID and deptID is as-is. Instead of groupID, the ID of the first record in the 
    group should be considered as parentNewDocID (For example, 3 is considered as parentNewDocID for newDocID 5 as 3 and 5 belong to the same groupID in @OldDoc table) for the newDocID.
    DECLARE @NewDocGroup TABLE (newDocID INT, parentNewDocID INT, deptID INT)
    INSERT INTO @NewDocGroup (newDocID, parentNewDocID, deptID) VALUES (1,1,111),(2,2,111),(3,3,111),(4,4,333),(5,3,222),(6,6,333),(7,7,222),(8,7,333),(9,9,111),(10,10,222),(11,11,333),(12,10,444)
    How do I accomplish the above using SQL ? Thanks for the help.

    >> I have similar to the following data in a table and it is not normalized. The group_id is being used to group two records [sic] of similar nature. <<
    Rows are not records. Tables have to have a key by definition. You do not do math with identifiers, so they should not be numeric. Let's ignore that error for now. In short, you are posting garbage. If you had followed Forum Netiquette, would you have posted
    this? 
    CREATE TABLE Old_Documents
    (old_doc_id INTEGER NOT NULL PRIMARY KEY, 
     group_id INTEGER, 
     dept_nbr INTEGER NOT NULL
       REFERENCES Departments (dept_nbr));
    INSERT INTO Old_Documents(old_doc_id, group_id, dept_nbr) 
    VALUES  (1, NULL, 111), 
    (2, NULL, 111), 
    (3, 1, 111), 
    (4, NULL, 333), 
    (5, 1, 222), 
    (6, NULL, 333), 
    (7, 2, 222), 
    (8, 2, 333), 
    (9, NULL, 111), 
    (10, 3, 222), 
    (11, NULL, 333), 
    (12, 3, 444);
    >> I need to process the data from the above table (Old_Documents) and write into two new tables (New_Documents and New_Documents_Groups) as follows. <<
    Just like punch cards and mag tape data processing! Being old and being new are a status, not another kind of entity. But that is how mag tapes work. And you even use the verb "fetch" from tape files. This design flaw is called  attribute splitting.
    Do you have a Male_Personnel and Female_Personnel table? NO! It is just Personnel! 
    >> old_doc_id should be stored as new_doc_id when inserting to New_Documents table. Only records [sic] with group_id NULL and one record [sic] (first [sic; no ordering in a table] one) per group should be considered (For example, old_doc_id 5 is not considered
    as 3 and 5 belong to the same group_id =1) for insertion. <<
    Think about your punch card mindset. Why did you physically materialize that redundant New_Documents table? Let me answer that: this is how you work with punch cards! In SQL we use a VIEW:
    CREATE VIEW New_Documents (new_doc_id)
    AS 
    SELECT old_doc_id 
      FROM Old_Documents;
    >> All records [sic] from Old_Documents should be considered for insertion into New_Documents_Groups table. The old_doc_id is inserted as new_doc_id and dept_nbr is as-is. Instead of group_id, the ID [sic: which identifier??] of the first [sic: tables
    have no ordering like a deck of punch cards] record [sic] in the group should be considered as parent_new_doc_id (For example, 3 is considered as parent_new_doc_id for new_doc_id 5 as 3 and 5 belong to the same group_id in Old_Documents table) for the new_doc_id.
    <<
    Why not use 5 as the parent? My guess is that you are trying to form equivalence classes. See:
    https://www.simple-talk.com/content/print.aspx?article=2020
    --CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
    in Sets / Trees and Hierarchies in SQL
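    For reference, a minimal T-SQL sketch of the transformation the original poster described (an illustration under assumptions, not the poster's code: it assumes SQL Server 2008+ and treats MIN(oldDocID) as the "first" row of each group, since rows in a table have no inherent order):
    DECLARE @OldDoc TABLE (oldDocID INT, groupID INT, deptID INT);
    INSERT INTO @OldDoc (oldDocID, groupID, deptID)
    VALUES (1,NULL,111),(2,NULL,111),(3,1,111),(4,NULL,333),(5,1,222),(6,NULL,333),
           (7,2,222),(8,2,333),(9,NULL,111),(10,3,222),(11,NULL,333),(12,3,444);
    DECLARE @NewDoc TABLE (newDocID INT);
    DECLARE @NewDocGroup TABLE (newDocID INT, parentNewDocID INT, deptID INT);
    -- @NewDoc: all ungrouped rows plus the lowest-ID row of each group.
    INSERT INTO @NewDoc (newDocID)
    SELECT oldDocID FROM @OldDoc WHERE groupID IS NULL
    UNION ALL
    SELECT MIN(oldDocID) FROM @OldDoc WHERE groupID IS NOT NULL GROUP BY groupID;
    -- @NewDocGroup: every row, with the group's lowest ID as parentNewDocID
    -- (rows without a group are their own parent).
    INSERT INTO @NewDocGroup (newDocID, parentNewDocID, deptID)
    SELECT o.oldDocID,
           COALESCE(g.firstDocID, o.oldDocID),
           o.deptID
    FROM   @OldDoc AS o
    LEFT   JOIN (SELECT groupID, MIN(oldDocID) AS firstDocID
                 FROM   @OldDoc
                 WHERE  groupID IS NOT NULL
                 GROUP  BY groupID) AS g
           ON g.groupID = o.groupID;
    SELECT * FROM @NewDoc ORDER BY newDocID;
    SELECT * FROM @NewDocGroup ORDER BY newDocID;
    This reproduces the two result sets shown in the question; whether it belongs in tables or a view is a separate design question.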

  • Pull data from SQL Table and display it in mail

    I have a requirement to pull data from a SQL table and send it in an email. Currently I am sending hard-coded text in the email, but is it possible to pull some data from a SQL table, format it, and send it in the same email?
    Can you guide me through the steps for this?
    Neil

    There are several ways to do this. The first is to populate a file in a data flow and then send that file as an attachment in the Send Mail task.
    Including the results in the email body is a bit trickier. To use a variable you would need an SSIS variable of type Object, which is similar to a collection in .NET. The problem is that, once populated, the Object variable isn't a readable result set; it behaves more like an array or a collection. There is no native way to call .ToString() on the object variable or cast its results to text, so you would need to iterate through each row and append it to another variable of type String, which can be done with a Script task or a Foreach Loop container.
    You also mentioned formatting the results. What type of formatting were you looking for? A limitation of the SMTP Send Mail task is that the message body doesn't support HTML, so if you want a table within the mail body you would have to use a Script task or a custom component.
    David Dye My Blog
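    If the source is SQL Server, one alternative worth mentioning (outside SSIS) is Database Mail, which can run the query and embed or attach the result set itself. A minimal sketch, assuming a Database Mail profile is already configured; the profile name, recipient, and query are placeholders:
    -- Let Database Mail execute the query and attach the results as a file.
    EXEC msdb.dbo.sp_send_dbmail
         @profile_name                = N'DefaultProfile',      -- placeholder profile
         @recipients                  = N'someone@example.com', -- placeholder recipient
         @subject                     = N'Query results',
         @body                        = N'The requested data is attached.',
         @query                       = N'SELECT TOP (100) * FROM dbo.SomeTable;',
         @attach_query_result_as_file = 1,
         @query_attachment_filename   = N'results.csv';
    Dropping @attach_query_result_as_file puts the query output into the message body as plain text instead.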

  • How to export an entire model content (tables and charts) to pdf

    Hello,
    I would like to export a complete model, which includes tables and charts, to pdf.
    I am already familiar with [this|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/47fe4fef-0d01-0010-6f87-ed8ecb479123] tutorial, but it explains how to do it for a single table and doesn't deal with charts. What I would like is to take something like a screenshot of the entire iView/model content and export it to PDF.

    Hi, rbish-
    Not for the entire stage.  The only thing you could do is to highlight everything and convert it to a symbol, then ensure that you move your triggers and labels to the new symbol.  Then you could export an individual symbol, which will also package all of the dependent symbols.
    Hope that helps,
    -Elaine

  • Import dumpfile with seperate tablespaces for table and index

    Hi,
    We have a schema whose tables are stored in one tablespace and whose indexes are stored in a different tablespace. We have taken a full schema export and now want to import it into another schema. I know that if the tablespace names differ we use the REMAP_TABLESPACE clause of the impdp command, but what about the separate tablespaces for tables and indexes? How would Oracle handle this?
    Regards,
    Abbasi

    Hi,
    I hope you created the same tablespace structure on the target side; if not, you have to use the remap_tablespace option to specify the different tablespaces. Oracle will take care of placing the data and indexes. If an index is moved from one tablespace to another you have to rebuild it, and only once it is rebuilt are statistics gathered; otherwise you might face performance issues.
    The better option is to keep the same tablespace structure in the source and target environments.
    Best regards,
    Rafi.
    http://rafioracledba.blogspot.com
    Edited by: Rafi (Oracle DBA) on May 9, 2011 7:07 AM

  • Data service for table in Oracle 8.0.6

    Hi,
    Using WebLogic 8.1.4 and LiquidData 8.5 I am trying to create physical data services for tables in a DB in Oracle 8.0.6. I am aware that that Oracle version is not supported by Oracle anymore, but I need to work with that version anyway (you know how it is sometimes).
    I managed to create a connection pool for this through the WebLogic Server Console by providing the JDBC driver for 8.0.6., but when I want to create a data source using the new connection pool and WebLogic tries to get the metadata, I get pop up windows with messages like:
    "Bigger type length than maximum"
    and
    "OALL8 in an inconsistent state"
    and
    "Protocol violation"
    One more thing to mention: I also added the Oracle 8.0.6 JDBC driver to the WebLogic Server classpath (Tools -> WebLogic Server -> Server Properties ... -> WebLogic Server: added classes12.zip to Server classpath additions) and restarted WebLogic Workshop and Server. Still I get those error messages.
    Is there a special procedure how to provide/configure a specific driver for a DBMS that is not natively supported by WebLogic?
    Any help is appreciated.
    Thanks,
    Wilko

    Hi Mike,
    Thanks for the quick reply. Below are the contents of the console window from starting the Workshop and Server. I'll try your next hint and let you know the outcome. As far as I can see, no errors were issued by the Server while I tried to connect to Oracle 8.0.6 to upload metadata (I am not sure whether anything was printed out while I started the server). My address is w.eschebach at vsnlinternational dot com.
    Thanks,
    Wilko
    This is what my workshop.cfg looks like:
    C:\bea\weblogic81\workshop
    C:\bea\jdk142_05\jre\bin\java.exe
    -XX:-UseThreadPriorities -Xmx256m -Xms64m -Xss256k -client -Dsun.io.useCanonCaches=false -Dsun.java2d.noddraw=true -Dsun.java2d.d3d=false -Djava.system.class.loader="workshop.core.AppClassLoader" -cp "C:\bea\weblogic81\workshop\wlw-ide.jar" workshop.core.Workshop
    Console output:
    DEBUG: extensions=C:\bea\weblogic81\workshop\\extensions
    INFO: Registering extension com.bea.portal.ide.CommonServices
    INFO: Service com.bea.portal.ide.findrefs.FindRefsSvc registered
    INFO: Handler for urn:com-bea-portal-ide:ref-finders registered
    INFO: Registering extension workshop.control.ControlServices
    INFO: Service com.bea.ide.control.ControlSvc registered
    INFO: Registering extension com.crystaldecisions.integration.weblogic.workshop.report.Bootstrap
    INFO: Registering extension workshop.debugger.DebuggerServices
    INFO: Exit Handler found
    INFO: Service com.bea.ide.debug.DebugSvc registered
    INFO: Handler for urn:com-bea-ide:debugExpressionViews registered
    INFO: Registering extension workshop.jspdesigner.JspDesignerServices
    INFO: Service com.bea.ide.ui.browser.BrowserSvc registered
    INFO: Service com.bea.ide.jspdesigner.PaletteActionSvc registered
    INFO: Handler for urn:com-bea-ide-jspdesigner:tags registered
    INFO: Registering extension workshop.liquiddata.LiquidDataExtension
    INFO: Registering extension workshop.pageflow.services.PageFlowServices
    INFO: Exit Handler found
    INFO: Service workshop.pageflow.services.PageFlowSvc registered
    INFO: Service com.bea.ide.ui.palette.DataPaletteSvc registered
    INFO: Handler for urn:workshop-pageflow-wizard:extension registered
    INFO: Registering extension com.bea.portal.ide.portalbuilder.PortalBuilderServices
    INFO: Service com.bea.portal.ide.portalbuilder.laf.LookAndFeelSvc registered
    INFO: Service com.bea.portal.ide.portalbuilder.laf.css.CssSvc registered
    INFO: Service com.bea.portal.codegen.CodeGenSvc registered
    INFO: Registering extension com.bea.portal.ide.PortalServices
    INFO: Service com.bea.portal.ide.cache.CacheInfoSvc registered
    INFO: Registering extension workshop.process.ProcessExtension
    INFO: Service workshop.process.ProcessSvc registered
    INFO: Service workshop.process.broker.channel.ChannelManagerSvc registered
    INFO: Handler for urn:com-bea-ide-process:process registered
    INFO: Registering extension workshop.shell.ShellServices
    INFO: Exit Handler found
    INFO: Service com.bea.ide.ui.frame.FrameSvc registered
    INFO: Service com.bea.ide.core.datatransfer.DataTransferSvc registered
    INFO: Service com.bea.ide.actions.ActionSvc registered
    INFO: Service com.bea.ide.document.DocumentSvc registered
    INFO: Service com.bea.ide.core.HttpSvc registered
    INFO: Service com.bea.ide.ui.help.HelpSvc registered
    INFO: Service com.bea.ide.ui.output.OutputSvc registered
    INFO: Service com.bea.ide.core.navigation.NavigationSvc registered
    INFO: Service com.bea.ide.filesystem.FileSvc registered
    INFO: Service com.bea.ide.filesystem.FileSystemSvc registered
    INFO: Service com.bea.ide.refactor.RefactorSvc registered
    INFO: Service com.bea.ide.security.SecuritySvc registered
    INFO: Handler for urn:com-bea-ide:actions registered
    INFO: Handler for urn:com-bea-ide:document registered
    INFO: Handler for urn:com-bea-ide:frame registered
    INFO: Handler for urn:com-bea-ide:encoding registered
    INFO: Handler for urn:com-bea-ide:help registered
    INFO: Registering extension workshop.sourcecontrol.SCMServices
    INFO: Service com.bea.ide.sourcecontrol.SourceControlSvc registered
    INFO: Handler for urn:com-bea-ide:sourcecontrol registered
    INFO: Registering extension workshop.sourceeditor.EditorServices
    INFO: Service com.bea.ide.sourceeditor.EditorSvc registered
    INFO: Service com.bea.ide.sourceeditor.compiler.CompilerSvc registered
    INFO: Handler for urn:com-bea-ide:sourceeditor:sourceinfo registered
    INFO: Registering extension com.bea.wls.J2EEServices
    INFO: Service com.bea.wls.ejb.EJBSvc registered
    INFO: Service com.bea.wls.DBSvc registered
    INFO: Registering extension workshop.workspace.WorkspaceServices
    INFO: Exit Handler found
    INFO: Service com.bea.ide.workspace.WorkspaceSvc registered
    INFO: Service com.bea.ide.workspace.ServerSvc registered
    INFO: Service com.bea.ide.workspace.SettingsSvc registered
    INFO: Service com.bea.ide.build.AntSvc registered
    INFO: Service com.bea.ide.workspace.RunSvc registered
    INFO: Handler for urn:com-bea-ide:settings registered
    INFO: Handler for urn:com-bea-ide:project registered
    INFO: Registering extension workshop.xml.XMLServices
    INFO: Service com.bea.ide.xml.types.TypeManagerSvc registered
    INFO: Service com.bea.ide.xml.types.TypeResolverSvc registered
    INFO: Service com.bea.ide.xmlmap.XMLMapSvc registered
    DEBUG: Workshop temp dir: C:\DOCUME~1\TR003137\LOCALS~1\Temp\wlw-temp-18920
    DEBUG: ExtensionsLoaded: 8329ms
    DEBUG: UI Displayed: 11563ms
    DEBUG: Time to load XQuery Functions (in seconds) - 0
    DEBUG: Time to load repository (in seconds) - 0
    DEBUG: LdBuildDriver loaded
    DEBUG: project ProvisioningDataServices activated
    DEBUG: Setting active project to: ProvisioningDataServices
    DEBUG: Workspace Activated: 17126ms
    DEBUG: Document Panel initialized: 17501ms
    DEBUG: *** CompilerProject constructor 1
    DEBUG: WorkspaceLoaded: 17594ms
    DEBUG: getClasspathMapping initiated with 29 item list.
    DEBUG: getClasspathMapping returning 29 item map.
    INFO: Startup Complete
    DEBUG: Time to load repository (in seconds) - 1
    DEBUG: Loading template file wsrp-producer-project.zip
    DEBUG: Loading template file wli-tutorial.zip
    DEBUG: Loading template file wli-schemas.zip
    DEBUG: Loading template file wli-newprocess.zip
    DEBUG: Loading template file wli-helloworld.zip
    DEBUG: Loading template file webflow-project.zip
    DEBUG: Loading template file tutorial-webservice.zip
    DEBUG: Loading template file tutorial-pageflow.zip
    DEBUG: Loading template file tutorial-jbc.zip
    DEBUG: Loading template file tutorial-ejb.zip
    DEBUG: Loading template file portal-project.zip
    DEBUG: Loading template file portal-application.zip
    DEBUG: Loading template file pipeline-application.zip
    DEBUG: Loading template file oag-schemas.zip
    DEBUG: Loading template file netui-webapp.zip
    DEBUG: Loading template file liquiddata-project.zip
    DEBUG: Loading template file liquiddata-application.zip
    DEBUG: Loading template file ejb-template.zip
    DEBUG: Loading template file default-workshop.zip
    DEBUG: Loading template file datasync-template.zip
    DEBUG: Loading template file crystalreports.zip
    DEBUG: Loading template file commerce-project.zip
    DEBUG: Loading template file commerce-application.zip
    DEBUG: URI is null. Delete Version will not show up in the menu
    DEBUG: URI is null. Delete Version will not show up in the menu
    DEBUG: GCThread: performing gc while idle

  • Extract data from database tables and download in pdf and csv

    I want to extract data from database tables and download it in PDF and CSV format.
    How can I rewrite my old Forms procedure in ADF/Java? The procedure extracts data from different tables and downloads the data as PDF and CSV. I am not downloading an image; I want to extract data from different tables in my database and download that data as PDF and CSV, and I would like to write this in ADF/Java. I just want direction, not for anyone to do my work; this is my learning curve.
    The Forms code is:
    function merge_header3 return varchar2 is
    begin
         return '~FACILITY DESCRIPTION~ACCOUNT NO~BRANCH CODE~BANK REF NO.~P/P/ AMOUNT~Postal Address 1~Postal Address 2~Box Postal Code~Dep. Date~Month~BANK NAME~BRANCH NAME~ACCOUNT TYPE~DESCRIPTION~OBJECTIVE DESCRIPTION';
    end;
    procedure download_file (i_pbat integer) is
      dir varchar2(80);
      file_name1 varchar2(80);
      file_name2 varchar2(80);
      appl_code varchar2(80);
      fil1 client_text_io.file_type;
      fil2 client_text_io.file_type;
      dat varchar2(1000);
      DATA VARCHAR2(1000);
      bvspro varchar2(100);
      ssch   varchar2(100);
      bvspro_total number(20,2);
      ssch_total   number(20,2);
      grand_total  number(20,2);
      cnt    integer;
      cursor pbat is
           select *
           from sms_payment_batches
            where id = i_pbat;
      cursor pay  (pb_id integer) is
           select *
           from sms_payment_vw
           where pbat_id = pb_id
            order by subsidy ASC, programme, beneficiary_name;
      cursor cgref (low varchar2) is
           select *
           from cg_ref_codes
           where rv_domain ='SMS'
            and rv_low_value = low;
      success boolean;     
      begin  
           set_application_property(cursor_style,'busy');
           appl_code := sms_global.ref_code('SMS','APP_CODE','SMS',0);
        dir       := sms_global.ref_code('SMS','PAY_DIR','c:\sms\batch_payments',0);
             success := webutil_file.create_directory(dir);
         if webutil_file.file_is_directory(dir) then
             null;
    --         message ('directory exists');
        else
    --                  message ('create directory ');
             success := webutil_file.create_directory(dir);
    --         if success then        message ('directory exists');    end if;
        end if;     
        for c_pbat in pbat loop
             file_name1 := dir ||'\' || appl_code||c_pbat.batch_number||'-'||to_char(c_pbat.batch_dt,'yyyymmdd')||'pay.txt';
             file_name2 := dir ||'\' || appl_code||c_pbat.batch_number||'-'||to_char(c_pbat.batch_dt,'yyyymmdd')||'merge.txt';
    --message('create files ');
    --         fil1  := client_text_io.fopen (file_name1,'W');
    --         fil2  := client_text_io.fopen (file_name2,'W');
        fil1  := client_text_io.fopen (file_name1,'W','');
        fil2  := client_text_io.fopen (file_name2,'W','');
                   dat :=                       'FROM ACCOUNT NUMBER'
                                                                ||'~'||'FROM ACCOUNT DESCRIPTION'
                                                                ||'~'||'MY STATEMENT DESCRIPTION'
                                                                ||'~'||'BENEFICIARY ACCOUNT NUMBER'
                                                                ||'~'||'BENEFICIARY SUB ACCOUNT NUMBER'        
                                                                ||'~'||'BENEFICIARY BRANCH CODE'
                                                                ||'~'||'BENEFICIARY NAME'
                                                                ||'~'||'BENEFICIARY STATEMENT DESCRIPTION'
                                                                ||'~'||'AMOUNT';
             --     client_text_io.put_line(fil1,dat);
             bvspro:= null;
             ssch  := null;
             cnt := 0;     
             dat := '~'||lpad('~',16,'~');
             for c_pay in pay(c_pbat.id) loop
    --message('cpay loop ' || cnt);              
               if bvspro is null then
                     dat := lpad('~',16,'~');
                     dat := utility.put_field(1,c_pay.programme,dat,'~');     
               client_text_io.put_line(fil2,dat);
               dat := utility.put_field(1,c_pay.subsidy,dat,'~');
               client_text_io.put_line(fil2,dat);
               dat := merge_header3;
                     client_text_io.put_line(fil2,dat);
                     bvspro := c_pay.programme;
                     ssch := c_pay.subsidy;
                     grand_total := 0;
                     bvspro_total := 0;
                     ssch_total := 0;
               end if;
               if bvspro <> c_pay.programme then
                     dat := lpad('~',16,'~');
                     dat := utility.put_field(5,ssch_total,dat,'~');
                     dat := lpad('~',16,'~');
                     dat := utility.put_field(5,bvspro_total,dat,'~');
               dat := utility.put_field(1,'Total:' || bvspro,dat,'~');
                     client_text_io.put_line(fil2,dat);
                     dat := lpad('~',16,'~');
               client_text_io.put_line(fil2,dat);
                     dat := utility.put_field(1,c_pay.programme,dat,'~');     
               client_text_io.put_line(fil2,dat);
                     bvspro := c_pay.programme;
               dat := utility.put_field(1,c_pay.subsidy,dat,'~');
               client_text_io.put_line(fil2,dat);
               dat := merge_header3;
                     client_text_io.put_line(fil2,dat);
                     bvspro := c_pay.programme;
                     ssch := c_pay.subsidy;
                     bvspro_total := 0;
                     ssch_total := 0;
                     cnt :=0;
             end if;                           
               if ssch <> c_pay.subsidy then
                     dat := lpad('~',16,'~');
                     dat := utility.put_field(5,ssch_total,dat,'~');
                     dat := lpad('~',16,'~');
               client_text_io.put_line(fil2,dat);
               dat := utility.put_field(1,c_pay.subsidy,dat,'~');
               client_text_io.put_line(fil2,dat);
               dat := merge_header3;
                     client_text_io.put_line(fil2,dat);
                     ssch := c_pay.subsidy;
                     ssch_total := 0;
                     cnt :=0;
             end if;                           
            bvspro_total := bvspro_total + c_pay.amount;
            ssch_total   := ssch_total   + c_pay.amount;              
                  grand_total  := grand_total  + c_pay.amount;              
            cnt := cnt +1;
    --message('bfore write file 2 ' );              
            client_text_io.put_line(fil2
                                   ,cnt
                            ||'~'|| c_pay.beneficiary_name
                                                                ||'~'||c_pay.BENEFICIARY_ACCOUNT_NUMBER ||''            
                                                                ||'~'||c_pay.BRANCH_CODE             ||''           
                                                                ||'~'|| c_pay.BENEFICIARY_STATEMENT_DESC            
                                                                ||'~'|| c_pay.AMOUNT                                
                            ||'~'|| c_pay.address_line1
                            ||'~'|| c_pay.address_line2
                                                    ||'~'|| c_pay.postal_code
                                                    ||'~'|| TO_CHAR(c_pay.deposit_date,'DD-Mon-YYYY')
                                                    ||'~'|| c_pay.month
                                                    ||'~'|| c_pay.bank
                                                    ||'~'|| c_pay.bank_branch
                                                    ||'~'|| c_pay.account_type
                                                    ||'~'|| c_pay.subsidy
                                                ||'~'|| c_pay.programme);
                  DATA :=                                  c_pay.FROM_ACCOUNT_NUMBER                   
                                                                ||'~'||c_pay.FROM_ACCOUNT_DESCR                    
                                                                ||'~'||c_pay.MY_STATEMENT_DESCR                    
                                                                ||'~'||c_pay.BENEFICIARY_ACCOUNT_NUMBER
                                                                ||'~'
                                                                ||'~'||c_pay.BRANCH_CODE            
                                                                ||'~'||c_pay.BENEFICIARY_NAME                      
                                                                ||'~'||c_pay.BENEFICIARY_STATEMENT_DESC            
                                                                ||'~'||c_pay.AMOUNT;                                
            DATA := REPLACE(DATA, ',' , ' ' );
            DATA := REPLACE(DATA, '~' , ',' );
    --message (cnt ||' ' || data);       
    --message('bfore write file 1 ' );              
                  client_text_io.put_line(fil1, data);
             end loop;
    --message ('end of write');         
                 dat := lpad('~',16,'~');
                 dat := utility.put_field(6,ssch_total,dat,'~');
                 dat := lpad('~',16,'~');
           dat := utility.put_field(1,'Total:' || bvspro,dat,'~');
                 dat := utility.put_field(5,bvspro_total,dat,'~');
              client_text_io.put_line(fil2,dat);
              dat := lpad('~',16,'~');
           client_text_io.put_line(fil2,dat);
           dat := utility.put_field(1,'Grand Total:' ,dat,'~');
                 dat := utility.put_field(5,grand_total,dat,'~');
              client_text_io.put_line(fil2,dat);
             -- close file
    for i in 1..50 loop  
           if substr(i,-1) = 0 then
                 message ('flush ' || i);
           end if;                 
                  client_text_io.put_line(fil1, lpad(' ',2000));
                  client_text_io.put_line(fil2, lpad(' ',2000));
                  client_text_io.put_line(fil1, lpad(' ',2000));
                  client_text_io.put_line(fil2, lpad(' ',2000));
    end loop;
             client_text_io.fclose(fil1);
             client_text_io.fclose(fil2);
        end loop;
       set_application_property(cursor_style,'default');
        exception
             when others then
                  message(sqlcode ||' ' ||sqlerrm);
       end download_file;
    I tried the following, but this code only downloads an image, not data from the database tables:
    public void downloadImage(FacesContext facesContext, OutputStream outputStream)
    {
        BindingContainer bindings = BindingContext.getCurrent().getCurrentBindingsEntry();
        // get an ADF attribute value from the ADF page definition
        AttributeBinding attr = (AttributeBinding) bindings.getControlBinding("DocumentImage");
        if (attr == null)
        {
            return;
        }
        // the value is a BlobDomain data type
        BlobDomain blob = (BlobDomain) attr.getInputValue();
        try
        {   // copy the data from the BlobDomain to the output stream
            IOUtils.copy(blob.getInputStream(), outputStream);
            // close the blob to release the resources
            blob.closeInputStream();
            // flush the output stream
            outputStream.flush();
        }
        catch (IOException e)
        {
            // handle errors
            e.printStackTrace();
            FacesMessage msg = new FacesMessage(FacesMessage.SEVERITY_ERROR, e.getMessage(), "");
            FacesContext.getCurrentInstance().addMessage(null, msg);
        }
    }

    You should ask this question in the ADF forum.

  • Master data loads for Attributes and texts failing

    Hello
    Master data loads for attributes and texts are failing. The errors are:
    1. Lock not set for loading master data attributes
    2. Table /BI0/YAccount does not exist (this error is for the 0Account master data attribute load)
    3. Error 1 in the update
    We faced this error a few days ago and rebooting the server resolved it, but it has recurred after 4 days.
    RS12 and SM12 do not show any locks. Activating the InfoObject has also not resolved the error.
    Any insight is appreciated.
    Thanks
    Inder

  • Selection groups for tables and header tables

    A couple of questions, or rather observations, about selection groups for tables and conversion objects…
    1) For tables that do not really have a date field, for instance table PDSNR, it appears that you cannot specify a selection group. Consequently, you have to transfer the table in its entirety. Is this something that can be modified via customizing?
    2) And what about the header tables, for example X_ANLC? I have tried but have not been able to change the selection groups on these objects.
    Thanks!
    Harmeet

    Hi Harmeet.
    Since you mention you can only filter by date, I am assuming you are using TDTIM.
    How would you want to filter PDSNR, for example? Would you want to filter all entries >= a given sequence number?
    The only way I know to do that is to work directly with the configuration tables. In your case you will need to use the CNVMBTSEL* tables. Check how the standard selection groups are configured; it should not be too difficult to follow the same logic and create your own selection groups.
    CNVMBTSELGRP                   MBT PCL Selection groups              (main table to create the selection group)
    CNVMBTSELMEMTYPE               MBT PCL Selection group member types  (main table to define members => fields for the selection group)
    CNVMBTSELGRPDEF                MBT PCL Selection group definition    (definition of the fields in that selection group)
    CNVMBTSELGRPVAL                MBT PCL Selection group values        (values for each field; you can use GE for >=)
    CNVMBTSELCLAUSE                MBT PCL Selection group clause        (should also be straightforward if you are using only one field)
    CNVMBTSELVAR                   MBT PCL Selection group variants     (configure the same variant for your selection group as for the others)
    CNVMBTSELREF                   MBT PCL Selection group reference     (assignment of selection group to conversion object)
    Hope this helps,
    Rui Dantas

  • Creating filters for tRFC and qRFC (SM58).

    Hi All,
    I am trying to create filters for qRFC and tRFC as per OSS note 441269 (Setting up tRFC/qRFC monitoring in the alert monitor). When I try to create/copy an owner, I get the error "not in name space" in the task bar when I try to save the changes. I have tried all kinds of naming conventions but am still unable to proceed. Please tell me how I can resolve this issue.
    System:-
    Kernel version 700
    Support pack 13
    Regards,
    kurma

    Hi Jordan,
    We used to 'reorganize' the tRFC/qRFC LUWs via SM58.
    The note you found is fine; you can also refer to some other notes as well:
    Note 375566 - Large number of entries in tRFC and qRFC tables
    Note 760113 - Delete unprocessed LUWs in the qRFC
    The following similar thread might also help:
    how to delete error entities in TRFC queue to clear stuck
    Thanks

  • Create withholding tax data subsequently for reconciled and open items in c

    Hi All,
    Mine is a US company code, and I have a vendor for which invoice and payment documents are posted. After posting the invoice I found that the vendor is subject to classic withholding tax, which I did not calculate when posting the invoice or making payments.
    I have now changed the vendor master data with the relevant withholding tax details.
    I know the program RFWT0020 allows us to create withholding tax data subsequently for reconciled and open items in cases where vendors or customers become liable for withholding tax with a tax rate of 0%.
    It is not working out for me.
    Can anybody help me with this?
    Thanks in advance.

    Dear all,
    Via SE38 --> report documentation, you can carefully read the following:
    The auxiliary program makes it possible to convert both classic and extended withholding tax data. However, with classic withholding tax, ONLY VENDOR data can be converted.
    In addition, the program enables the withholding tax code to be changed for reconciled and open items with existing withholding tax data. For this, the tax rate of the existing withholding tax data as well as the tax rate of the new withholding tax code must be 0%. The new withholding tax codes are taken from the customer or vendor master data. In the case of extended withholding tax, the withholding tax code is only changed if the related withholding tax categories are the same.  The withholding tax base amount is not changed by this procedure.
    Recreating or changing the withholding tax data requires that the program first be executed for INVOICES and then executed for PAYMENTS in a subsequent step.
    I hope this helps You.
    Mauri
