Chart not showing large amount of data properly

Hi,
I have three charts on a page: two bar charts and a
column chart, one below another.
The items on the y-axis of the first two charts are not fixed;
they are created at run time. Once the number of items grows
beyond about 15, the first two charts no longer display them
properly formatted.
That is, the data shown is correct, but the bars are very
tiny. Of course this is because of the space the chart has been
allocated.
But can somebody suggest a good solution to this? How do I get
the charts to display a large number of items nicely, even
though the space allocated to them is small? I have not been
able to find any examples on this.
Thanks

There is no auto-resize feature.
I have never tried this, but what if you put your chart in the report more than once and conditionally suppress according to how much data there is?
Debi
Edited by: Debi Herbert on Mar 29, 2011 6:33 AM

Similar Messages

  • Best way to pass large amounts of data to subroutines?

    I'm writing a program with a large amount of data, around 900 variables.  What is the best way for me to pass parts of this data to different subroutines?  I have a main loop on a PXI RT Controller that is controlling hydraulic cylinders and the loop needs to be 1ms or better.  How on earth should I pass 900 variables through a loop and keep it at 1ms?  One large cluster??  Several smaller clusters??  Local Variables?  Global Variables??  Help me please!!!

    My suggestion, similar to Altenbach and Matt above, is to use a Functional Global Variable (FGV) and use a 1D array of 900 values to store the data in the FGV. You can retrieve individual data items from the FGV by passing in the index of the desired variable and the FGV returns the value from the array. Instead of passing in an index you could also use a TypeDef'd Enum with all of your variables as elements of the Enum, which will allow you to place the Enum constant on the diagram and make selecting variables, as well as reading the diagram, simpler. (A rough sketch of this pattern appears below this post.)
    My group is developing a LabVIEW component/example code with this functionality that we plan to publish on DevZone in a month or two.
    The attached RTF file shows the core piece of this implementation. This VI of course is non-reentrant. The Init case could be changed to allocate the internal 1D array as part of this VI rather than passing it in from another VI.
    Message Edited by Christian L on 01-31-2007 12:00 PM
    Christian Loew, CLA
    Principal Systems Engineer, National Instruments
    Please tip your answer providers with kudos.
    Any attached Code is provided As Is. It has not been tested or validated as a product, for use in a deployed application or system,
    or for use in hazardous environments. You assume all risks for use of the Code and use of the Code is subject
    to the Sample Code License Terms which can be found at: http://ni.com/samplecodelicense
    Attachments:
    CVT_Double_MemBlock.rtf 309 KB
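    Not a LabVIEW diagram, but the same current-value-table idea translates roughly into the Java-flavored sketch below (all names here are hypothetical, for illustration only): one shared, synchronized store holds every value, and a TypeDef'd-Enum-style index replaces 900 separate wires.
    public final class CurrentValueTable {
         // Analogue of the TypeDef'd Enum: one entry per variable.
         public enum Var { PUMP_PRESSURE, CYLINDER_POSITION, OIL_TEMP /* ... 900 entries */ }

         // Analogue of the FGV's internal 1D array.
         private static final double[] values = new double[Var.values().length];

         // "Write" case of the FGV.
         public static synchronized void write(Var v, double value) {
              values[v.ordinal()] = value;
         }

         // "Read" case of the FGV.
         public static synchronized double read(Var v) {
              return values[v.ordinal()];
         }
    }
    A producer loop would call CurrentValueTable.write(CurrentValueTable.Var.OIL_TEMP, 85.2); and any consumer reads the same slot back, which is roughly the structure the attached CVT example captures as a VI.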

  • Sending large amounts of data spontaneously

    In my normal experience with the internet connection, the bits of data sent amount to about 50 to 80% of those received, but occasionally Firefox starts transmitting large amounts of data spontaneously; what it is, I don't know, and where it's going, I don't know. For example, today the AT&T status screen showed about 19 MB received and about 10 MB sent after about an hour of on-line time. A few minutes later, I looked down at the status screen and it showed 19.5 MB received and 133.9 MB sent, and the number was steadily increasing. Just before I went on line today, I ran a complete scan of the computer with McAfee and it reported nothing needing attention. I ran the scan because a similar effusion of sending data spontaneously had happened yesterday. When I noticed the data pouring out today, I closed Firefox and it stopped. When I opened Firefox right afterward, the transmission of data did not recommence. My first thought was that my computer had been captured by the bad guys and now I was a robot, but McAfee says not to worry. But should I worry anyway? What's going on, or, lacking a good answer to that, how can I find out what's going on? And how can I make it stop, unless I'm seeing some kind of maintenance operation that Mozilla or Microsoft is subjecting me to?

    Instead of using URLConnection, open a Socket to the server port (probably 80), send a POST HTTP request followed by the data; you may then (optionally) receive data from the server to check that the servlet is OK. This is the same protocol as URLConnection uses, but you have control over when the data is actually sent...
    Socket sock = new Socket(getHost(), 80);
    DataOutputStream dos = new DataOutputStream(sock.getOutputStream());
    dos.writeBytes("POST /servletname HTTP/1.0\r\n");
    dos.writeBytes("Content-Type: text/plain\r\n");  // optional, but good if you know it
    dos.writeBytes("Content-Length: " + lengthOfData + "\r\n");  // again optional, but good if you can know it without caching the data first
    dos.writeBytes("\r\n");   // gotta have a blank line before the data
      // send data now
    DataInputStream dis = new DataInputStream(sock.getInputStream());  // optional, if you want to receive
      // receive any feedback from the servlet if you want
    dis.close();
    dos.close();
    sock.close();
    I'm guessing that URLConnection caches the data so it can fill in "Content-Length".
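    On that last guess: on Java 7 and later (the streaming-mode calls exist since Java 5), HttpURLConnection can be told the length up front so it streams the body instead of caching it. A minimal sketch, with a placeholder URL and payload:
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class StreamingPost {
         public static void main(String[] args) throws Exception {
              byte[] data = new byte[1024 * 1024]; // stand-in for the real payload
              HttpURLConnection conn = (HttpURLConnection)
                        new URL("http://example.com/servletname").openConnection();
              conn.setDoOutput(true);
              conn.setRequestMethod("POST");
              conn.setRequestProperty("Content-Type", "text/plain");
              // Content-Length is known up front, so the body is streamed
              // directly rather than buffered to compute the header.
              conn.setFixedLengthStreamingMode(data.length);
              try (OutputStream out = conn.getOutputStream()) {
                   out.write(data);
              }
              System.out.println("HTTP " + conn.getResponseCode());
         }
    }
    (setChunkedStreamingMode(0) does the same job when the length is not known in advance.)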

  • Bex Report Designer - Large amount of data issue

    Hi Experts,
    I am trying to execute (on the Portal) a report made in BEx Report Designer, with about 30,000 pages, and the only thing I am getting is a blank page. Everything works fine at about 3,000 pages. Do I need to set something to allow processing of such a large amount of data?
    Regards
    Vladimir

    Hi Sauro,
    I have not seen this behavior, but it has been a while since I tried to send an input schedule that large. I think the last time was on a BPC NW 7.0 SP06 system and it worked OK. If you are on a recent support package, then you should search for relevant notes (none come to mind for me, but searching yourself is always a good idea) and if you don't find one then you should open a support message with SAP, with very specific instructions for recreating the problem from a clean input-schedule.
    Good luck,
    Ethan

  • Advice needed on how to keep large amounts of data

    Hi guys,
    I'm not sure what the best way is to make large amounts of data available to my Android app on the local device.
    For example, records of food ingredients, in the hundreds?
    I have read and successfully created .db's using this tutorial:
    http://help.adobe.com/en_US/AIR/1.5/devappsflex/WS5b3ccc516d4fbf351e63e3d118666ade46-7d49.html
    However, to populate the database I use Flash, which kind of defeats the purpose: there is no point in me shifting a massive array of data from Flash to a SQL database when I could access the data directly from the AS3 array.
    So maybe I could create the .db with an external program? But then how would I include that .db in the APK file and deploy it to users' Android devices?
    Or maybe I create an AS3 class with an XML object in it and use that as a means of data storage?
    Any advice would be appreciated

    You can use any means you like to populate your SQLite database, including using external programs (one possible sketch follows below), (temporarily) embedding a text file with SQL statements, executing some SQL from AS3 code etc etc.
    Once you have populated your db, deploy it with your project:
    http://chrisgriffith.wordpress.com/2011/01/11/understanding-bundled-sqlite-databases-in-air-for-mobile/
    Cheers, - Jon -
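    For the "external program" route, here is a minimal sketch of pre-populating the .db before bundling it. Java is used purely for illustration; it assumes the Xerial sqlite-jdbc driver is on the classpath, and the table layout is hypothetical:
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Statement;

    public class PopulateFoodDb {
         public static void main(String[] args) throws Exception {
              Class.forName("org.sqlite.JDBC"); // older driver versions need explicit registration
              // Creates food.db in the working directory; bundle the finished
              // file with the AIR project as described in the link above.
              try (Connection conn = DriverManager.getConnection("jdbc:sqlite:food.db")) {
                   try (Statement st = conn.createStatement()) {
                        st.executeUpdate("CREATE TABLE IF NOT EXISTS ingredients ("
                                  + "id INTEGER PRIMARY KEY, name TEXT, kcal REAL)");
                   }
                   try (PreparedStatement ps = conn.prepareStatement(
                             "INSERT INTO ingredients(name, kcal) VALUES (?, ?)")) {
                        ps.setString(1, "butter");
                        ps.setDouble(2, 717.0);
                        ps.executeUpdate();
                   }
              }
         }
    }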

  • With journaling, I have found that my computer is saving a large amount of data, logs of all the changes I make to files; how can I clean up these logs?

    With journaling, I have found that my computer is saving a large amount of data, logs of all the changes I make to files; how can I clean up these logs?
    For example, in Notes, I have written three notes; however, if I click on 'All On My Mac' in the sidebar, I see about 10 different versions of each note I make; it saves a version every time I add or delete a sentence.
    I also noticed that when I write an email, Mail saves about 10 or more draft versions before the final is sent.
    I understand that all this journaling provides a level of security and prevents data loss; but I was wondering, is there a function to clean up journal logs once in a while?
    Thanks
    Roz

    Are you using Microsoft Word?  Microsoft thinks the users are idiots. They put up a lot of pointless messages that annoy & worry users.  I have seen this message from Microsoft Word.  It's annoying.
    As BDaqua points out...
    When you copy information via edit > copy, command + c, edit > cut, or command + x, you place the information on the clipboard. When you paste information, edit > paste or command + v, you copy information from the clipboard to your data file.
    If you edit > cut or command + x and you do not paste the information and you quit Word, you could be losing information.  Microsoft is very worried about this. When you quit Word, Microsoft checks if there is information on the clipboard & if so, Microsoft puts out this message.
    You should be saving your work more than once a day. I'd save every 5 minutes.  command + s does a save.
    Robert

  • Looking for ideas for transferring large amounts of data between systems

    Hello,
    I am looking for ideas based on best practices for transferring Large Amounts of Data in and out of a Netweaver based application.
    We have a new system we are developing in Netweaver that will utilize both the Java and ABAP stack, and will require integration with other SAP and 3rd Party Systems. It is a standalone product that doesn't share any form of data store with other systems.
    We need to be able to support 10s of millions of records of tabular data coming in and out of our system.
    Since we need to integrate with so many different systems, we are planning to use RFC as our primary interface in and out of the system. As it turns out, RFC is not good at dealing with this large an amount of data being pushed through a single call.
    We have considered a number of possible ideas, however we are not very happy with any of them. I would like to see what the community has done in the past to solve problems like this as well as how SAP currently solves this problem in other applications like XI, BI, ERP, etc.

    Primoz wrote: Do you use KDE (Dolphin) 4.6 RC or 4.5?
    Also, I've noticed that if I move / copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
    Also, run Dolphin from a terminal to try and see what the problem is.
    Hope that helps at least a bit.
    Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking.
    Because I thought that Dolphin is just a "little" wrapper around the cp/mv/cd/ls applications/commands.

  • Azure Cloud service fails when sent large amount of data

    This is the error;
    Exception in AZURE Call: An error occurred while receiving the HTTP response to http://xxxx.cloudapp.net/Service1.svc. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being
    aborted by the server (possibly due to the service shutting down). See server logs for more details.
    Calls with smaller amounts of data work fine. Large amounts of data cause this error.
    How can I fix this??

    Go to the web.config file, look for the <binding> that is being used for your service, and adjust the various parameters that limit the maximum length of the messages, such as
    maxReceivedMessageSize.
    http://msdn.microsoft.com/en-us/library/system.servicemodel.basichttpbinding.maxreceivedmessagesize(v=vs.100).aspx
    Make sure that you specify a size that is large enough to accommodate the amount of data that you are sending (the default is 64 KB).
    Note that even if you set a very large value here, you won't be able to go beyond the maximum request length that is configured in IIS. If I recall correctly, the default limit in IIS is 8 megabytes.
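    For reference, the relevant web.config section might look roughly like the sketch below; the binding name and sizes are placeholders, so match them to the binding your endpoint actually references:
    <system.serviceModel>
      <bindings>
        <basicHttpBinding>
          <!-- Hypothetical binding name; 10485760 = 10 MB -->
          <binding name="LargeMessageBinding"
                   maxReceivedMessageSize="10485760"
                   maxBufferSize="10485760" />
        </basicHttpBinding>
      </bindings>
    </system.serviceModel>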

  • DSS problems when publishing large amount of data fast

    Has anyone experienced problems when sending large amounts of data using the DSS? I have approximately 130 to 150 items that I send through the DSS to communicate between different parts of my application.
    There are several loops publishing data. One publishes approximately 50 items at a rate of 50 ms, another about 40 items with a 100 ms publishing rate.
    I send a command to a subprogram (125 ms) that reads and publishes the answer on a DSS URL (approx. 125 ms). So that is one item on the DSS for about 250 ms. But this data is not seen in my main GUI window that reads the DSS URL.
    My questions are
    1. Is there any limit in speed (frequency) for data publishing in DSS?
    2. Can DSS become unstable if loaded too much?
    3. Can I lose/miss data in any situation?
    4. In the DSS Manager I have doubled the MaxItems and MaxConnections. How will this affect my system?
    5. When I run my full application I have experienced the following error: Fatal Internal Error: "memory.ccp", line 638. Can this be a result of my large application and the heavy load on DSS? (See attached picture.)
    Regards
    Idriz Zogaj
    Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
    Memory Professional
    direct: +46 (0) - 734 32 00 10
    http://www.zogaj.se

    LuI wrote:
    >
    > Hi all,
    >
    > I am frustrated with VISA serial comm. It looks so neat and it's
    > fantastic what it's supposed to do for a developer, but sometimes one
    > runs into trouble very deep.
    > I have an app where I have to read large amounts of data streamed by
    > 13 µCs at 230 kBaud. (They do not necessarily need to stream all at the
    > same time.)
    > I use either a Moxa multiport adapter C320 with 16 serial ports or -
    > for test purposes - a Keyspan serial-2-USB adapter with 4 serial
    > ports.
    Does it work better if you use the serial port(s) on your motherboard?
    If so, then get a better serial adapter. If not, look more closely at
    VISA.
    Some programs have some issues on serial adapters but run fine on a
    regular serial port. We've had that problem recently.
    Best, Mark

  • How do I pause an iCloud restore for app with large amounts of data?

    I am using an iPhone app which is holding 10 GB of data (media files).
    Unfortunately, although all data was backed up, my iPhone 4 was faulty and needed to be replaced with a new handset. On restore, the 10 GB of data takes a very long time to restore over wi-fi. If interrupted (I reached the halfway point during the night) to go to work or take the dog for a walk, I end up, of course, on 3G for a short period of time.
    Next time I am in a wi-fi zone, the app starts restoring again right from the beginning.
    How does anyone restore an app with large amounts of data, or pause a restore?

    You can use classifications but there is no auto feature to archive like that on web apps.
    In terms of the blog, as I have said to everyone that has posted about blog preview images:
    http://www.prettypollution.com.au/business-catalyst-blog
    Just one example of an image at the start of the blog post rendering out; not hard at all.

  • How can I edit large amount of data using Acrobat X Pro

    Hello all,
    I need to edit a catalog that contains a large amount of data - mainly the product prices. Currently I can only export the document into an Excel file and then paste the new prices onto the catalog using Acrobat X Pro one by one, which is extremely time-consuming. I am sure there's a better way to make this faster while keeping the accuracy of the data. Thanks a lot in advance if anyone's able to help!

    Hi Chauhan,
    Yes, I am able to edit text/images via the tool box, but the thing is the catalog contains more than 20,000 price entries, and all I can do is delete the original price info from the catalog and replace it with the revised data from Excel. Repeating this process over 20,000 times would be a waste of time and manpower... Not sure if I have made my situation clear enough? Please just ask away, I really hope to sort it out. Thanks!

  • Couldn't copy large amount of data from enterprise DB to Oracle 10g

    Hi,
    I am using iBATIS to copy data from enterprise DB to Oracle and vice versa.
    The datatype of a field on EDB is 'text' and the datatype on Oracle is 'SYS.XMLTYPE'.
    I am binding these to a Java String property in a POJO to bind values.
    I could successfully copy a limited amount of data from EDB to Oracle, but if there is more data, I am getting the following exceptions with different Oracle drivers (though I could read large amounts of data from EDB):
    --- Cause: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.update(SqlMapExecutorDelegate.java:457)
    at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.update(SqlMapSessionImpl.java:90)
    at com.ibatis.sqlmap.engine.impl.SqlMapClientImpl.update(SqlMapClientImpl.java:66)
    at com.aqa.pojos.OstBtlData.updateOracleFromEdbBtlWebservice(OstBtlData.java:282)
    at com.aqa.pojos.OstBtlData.searchEdbAndUpdateOracleBtlWebservice(OstBtlData.java:258)
    com.ibatis.common.jdbc.exception.NestedSQLException:
    --- The error occurred in com/aqa/sqlmaps/SQLMaps_OSTBTL_Oracle.xml.
    --- The error occurred while applying a parameter map.
    --- Check the updateOracleFromEDB-InlineParameterMap.
    --- Check the parameter mapping for the 'btlxml' property.
    --- Cause: java.sql.SQLException: setString can only process strings of less than 32766 chararacters
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.iba
    I have the latest Oracle 10g JDBC drivers.
    Remember, I could copy any amount of data from Oracle to EDB, but not the other way around.
    Please let me know if you have come across this issue; any recommendation is very much appreciated.
    Thanks,
    CK.

    Hi,
    I finally remembered how I solved this issue previously.
    The jdbc driver isn't able to directly call the insert with a column xml_type. The solution I was using was to build a wrapper function in plSQL.
    Here it is (for insert, but I suppose that update will be the same):
    create or replace procedure insertXML(file_no_in in number, program_no_in in varchar2, ost_XML_in in clob, btl_XML_in in clob) is
    begin
    insert into AQAOST_FILES (file_no,program_no,ost_xml,btl_xml) values(file_no_in, program_no_in, xmltype(ost_XML_in), xmltype(btl_XML_in));
    end insertXML;
    here is the sqlmap file I used
    <?xml version="1.0" encoding="UTF-8" ?>
    <!DOCTYPE sqlMap
    PUBLIC "-//ibatis.apache.org//DTD SQL Map 2.0//EN"
    "http://ibatis.apache.org/dtd/sql-map-2.dtd">
    <sqlMap>
         <typeAlias alias="AqAost" type="com.sg2net.jdbc.AqAost" />
         <insert id="insert" parameterClass="AqAost">
              begin
                   insertxml(#fileNo#,#programNo#,#ostXML:CLOB#,#bltXML:CLOB#);
              end;
         </insert>
    </sqlMap>
    and here is a simple program:
    package com.sg2net.jdbc;
    import java.io.IOException;
    import java.io.Reader;
    import java.io.StringWriter;
    import java.sql.Connection;
    import oracle.jdbc.pool.OracleDataSource;
    import com.ibatis.common.resources.Resources;
    import com.ibatis.sqlmap.client.SqlMapClient;
    import com.ibatis.sqlmap.client.SqlMapClientBuilder;
    public class TestInsertXMLType {
         /**
          * @param args
          */
         public static void main(String[] args) throws Exception {
              String resource = "sql-map-config-xmlt.xml";
              Reader reader = Resources.getResourceAsReader(resource);
              SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);
              OracleDataSource dataSource = new OracleDataSource();
              dataSource.setUser("test");
              dataSource.setPassword("test");
              dataSource.setURL("jdbc:oracle:thin:@localhost:1521:orcl");
              Connection connection = dataSource.getConnection();
              sqlMap.setUserConnection(connection);
              AqAost aqAost = new AqAost();
              aqAost.setFileNo(3);
              aqAost.setProgramNo("prg");
              Reader ostXMLReader = Resources.getResourceAsReader("ostXML.xml");
              Reader bltXMLReader = Resources.getResourceAsReader("bstXML.xml");
              aqAost.setOstXML(readerToString(ostXMLReader));
              aqAost.setBltXML(readerToString(bltXMLReader));
              // the insert goes through the plSQL wrapper defined above
              sqlMap.insert("insert", aqAost);
              connection.commit();
         }

         public static String readerToString(Reader reader) {
              StringWriter writer = new StringWriter();
              char[] buffer = new char[2048];
              int charsRead = 0;
              try {
                   while ((charsRead = reader.read(buffer)) > 0) {
                        writer.write(buffer, 0, charsRead);
                   }
              } catch (IOException ioe) {
                   throw new RuntimeException("error while converting reader to String", ioe);
              }
              return writer.toString();
         }
    }
    package com.sg2net.jdbc;

    public class AqAost {
         private long fileNo;
         private String programNo;
         private String ostXML;
         private String bltXML;

         public long getFileNo() {
              return fileNo;
         }
         public void setFileNo(long fileNo) {
              this.fileNo = fileNo;
         }
         public String getProgramNo() {
              return programNo;
         }
         public void setProgramNo(String programNo) {
              this.programNo = programNo;
         }
         public String getOstXML() {
              return ostXML;
         }
         public void setOstXML(String ostXML) {
              this.ostXML = ostXML;
         }
         public String getBltXML() {
              return bltXML;
         }
         public void setBltXML(String bltXML) {
              this.bltXML = bltXML;
         }
    }
    I tested the insert and it works correctly
    ciao,
    Giovanni

  • Transporting large amounts of data from one database schema to another

    Hi,
    We need to move a large amount of data from one 10.2.0.4 database schema to another 11.2.0.3 database.
    Am currently using datapump but it is quite slow still - having to do it in chunks.
    Also the datapump files are quite large, so having to compress them and move them across the network.
    Is there a better/quicker way?
    Have heard about transportable tablespaces but never used them and don't know about speed - whether quicker than datapump.
    Tablespace names are different in both databases.
    Also, the source database is on the Solaris operating system on a Sun box and the
    target database is on AIX on an IBM Power Series box.
    Any ideas would be great.
    Thanks
    Edited by: user5716448 on 08-Sep-2012 03:30
    Edited by: user5716448 on 08-Sep-2012 03:31

    user5716448 wrote:
    Hi,
    We need to move a large amount of data from one 10.2.0.4 database schema to another 11.2.0.3 database.
    Pl quantify "large".
    Am currently using datapump but it is quite slow still - having to do it in chunks.
    Pl quantify "quite slow".
    Also the datapump files are quite large, so having to compress them and move them across the network.
    Again, pl quantify "quite large".
    Is there a better/quicker way?
    Have heard about transportable tablespaces but never used them and don't know about speed - whether quicker than datapump.
    Tablespace names are different in both databases.
    Also, the source database is on the Solaris operating system on a Sun box and the
    target database is on AIX on an IBM Power Series box.
    It may be possible, assuming you do not violate any of these conditions:
    http://docs.oracle.com/cd/E11882_01/server.112/e25494/tspaces013.htm#ADMIN11396
    Any ideas would be great.
    Thanks
    Edited by: user5716448 on 08-Sep-2012 03:30
    Edited by: user5716448 on 08-Sep-2012 03:31
    Master Note for Transportable Tablespaces (TTS) -- Common Questions and Issues [ID 1166564.1]
    HTH
    Srini

  • Uploading of large amount of data

    Hi all,
    I really hope you can help me. I have to upload quite a large amount of data from flat files to an ODS (via the PSA, of course). But the process takes a very long time. I used the method of loading to the PSA and then packet by packet into the ODS. Loading circa 1,300,000 lines from a flat file takes about 6 or more hours. It seems strange to me. Is it normal or not?? Or should I use another uploading method or set up the ODS some way?? thanks

    hi jj,
    welcome to the SDN!
    in my limited experience, 6hrs for 1.3M records is a bit too long. here are some things you could try and look into:
    - load from the application server, not from the client computer (meaning, move your file to the server where BW is running, to minimize network traffic).
    - check your transfer rules and any customer exits related to loading, as the smallest performance-inefficient bits of code can cause a lot of problems.
    - check the size of data packets you're transmitting, as it could also cause problems, via tcode RSCUSTA2 (i think, but i'm not 100% sure).
    hope this helps you out - please remember to give out points as a way of saying thanks to those that help you out, okay? =)
    ryan.

  • Scrolling a large amount of data with an unscrolled header. How to do it???

    I am in much need of scrolling a large amount of data without scrolling the header, i.e. while scrolling, only the data will be scrolled, so that I can see the header as well. I am using JSP in the Struts framework. Waiting for your help, friends. Thanks in advance.

    1) Install Google on your machine.
    2) Open it in your favourite web browser.
    3) Note the input field and the buttons.
    4) Enter smart keywords like "scrollable", "table", "fixed" and "header" in that input field.
    5) Press the first button. You'll get a list of links.
    6) Follow the relevant links and read them thoroughly.
