Capacity table showing more than 100 % load

Hi,
We are using the SNP optimizer.
In one of the runs we observed that a resource is loaded to more than 100% in the capacity table, but when we check the optimizer log, the log shows a load of only 100%.
Can you please help me understand why the optimizer log and the capacity table are inconsistent?
Regards
Abhi

Abhi,
You already have the answer; you just need to find the data. The optimizer is loading the resource to what it thinks is one hundred percent.
It is either not seeing all the orders loaded on that resource, or it has a different available capacity than is shown in the book.
A classic problem here is reviewing the results in a different planning book than the one you ran the optimizer in. For example, if you ran the optimizer in a monthly-bucket book and you are viewing the results in a weekly book, one out of four weeks can look badly overloaded: 400 h of load against 400 h of monthly capacity is exactly 100%, but if disaggregation drops all of it into week 1, that week shows 400 h against roughly 100 h of weekly capacity, i.e. 400%.
Carefully compare your optimizer log to the details of your planning book; you should find the difference.
George

Similar Messages

  • General Scenario- Adding columns into a table with more than 100 million rows

    I was given a scenario: what issues do you encounter when you try to add new columns to a table with more than 200 million rows, and how do you overcome them?
    Thanks in advance.
    svk

    For such a large table, it is better to add the new column at the end of the table to avoid any performance impact, as RSingh suggested.
    Also avoid using a default on the newly added column, or SQL Server will have to fill 200 million fields with this default value. If you need a default, first add the column without one and update the column in small batches (otherwise you lock up the whole table). Add the default only after all the rows have a value for the new column, as sketched below.
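    A minimal T-SQL sketch of that batching idea, assuming a hypothetical table dbo.BigTable, a new integer column NewCol, and 0 as the backfill value:
    -- 1. Add the column as nullable with no default: a fast, metadata-only change.
    ALTER TABLE dbo.BigTable ADD NewCol INT NULL;
    -- 2. Backfill in small batches so no single statement locks the whole table.
    DECLARE @rows INT = 1;
    WHILE @rows > 0
    BEGIN
        UPDATE TOP (10000) dbo.BigTable
        SET    NewCol = 0
        WHERE  NewCol IS NULL;
        SET @rows = @@ROWCOUNT;  -- stop once no rows remain to update
    END;
    -- 3. Only now attach the default, so it applies to future inserts only.
    ALTER TABLE dbo.BigTable
        ADD CONSTRAINT DF_BigTable_NewCol DEFAULT (0) FOR NewCol;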

  • Download Header using GUI_DOWNLOAD for more than 100 fields

    Hi,
    I want to download data with a header row from a database table using the FM 'GUI_DOWNLOAD'.
    I know that this can be achieved by:
    1. downloading the headers first to an internal table and calling FM 'GUI_DOWNLOAD',
    2. then downloading the data to the same file by calling FM 'GUI_DOWNLOAD' with APPEND = 'X'.
    But my database table has more than 100 fields. Please tell me how to download all the fields as header information.
    Thanks in advance.

    Hi
    This can be done with a single call to the FM GUI_DOWNLOAD.
    Store all the field names in an internal table and pass that internal table to the FIELDNAMES parameter of the FM. This will download the data with the headings in the first row; see the sketch below.
    Note - To download it so that it opens in Excel, pass FILETYPE as 'DAT'.
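    A minimal ABAP sketch of this, assuming a hypothetical database table ZMY_TABLE; the heading names are collected via RTTI so you never have to type 100+ field names by hand:
    DATA: lt_data       TYPE STANDARD TABLE OF zmy_table,  " hypothetical table
          lt_fieldnames TYPE STANDARD TABLE OF char30,
          lo_descr      TYPE REF TO cl_abap_structdescr.
    FIELD-SYMBOLS: <ls_comp> TYPE abap_compdescr.
    SELECT * FROM zmy_table INTO TABLE lt_data UP TO 100 ROWS.
    " Derive the column headings from the structure definition (RTTI).
    lo_descr ?= cl_abap_typedescr=>describe_by_name( 'ZMY_TABLE' ).
    LOOP AT lo_descr->components ASSIGNING <ls_comp>.
      APPEND <ls_comp>-name TO lt_fieldnames.
    ENDLOOP.
    " One call writes the heading row plus the data; FILETYPE 'DAT' gives a
    " tab-separated file that opens cleanly in Excel.
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename              = 'C:\temp\download.xls'
        filetype              = 'DAT'
        write_field_separator = 'X'
      TABLES
        data_tab              = lt_data
        fieldnames            = lt_fieldnames.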
    Regards,
    Vinod

  • More than 100 items in Form or Page

    Can Apex allow more than 100 items on a page? I am getting a message like
    'Item count exceeds maximum of 100'.
    Even the data entered in the form for the extra items over 100 is not posted to the database.
    Please help or suggest a workaround.
    I am using HTMLDB 2.0.

    Example:
    1. Create the test data
    CREATE TABLE "TEST_TABLE1"
    ("KEY1" NUMBER(10) NOT NULL ,
    "DESC" VARCHAR2(24) ,
    "OPTION1" VARCHAR2(10) ,
    "OPTION2" VARCHAR2(10) ,
    "OPTION3" VARCHAR2(10) ,
    "OPTION4" VARCHAR2(10) ,
    "OPTION5" VARCHAR2(10) ,
    "OPTION6" VARCHAR2(10) ,
    "OPTION7" VARCHAR2(10) ,
    "OPTION8" VARCHAR2(10) ,
    "OPTION9" VARCHAR2(10) ,
    "OPTION10" VARCHAR2(10) ,
    "OPTION11" VARCHAR2(10) ) ;
    INSERT INTO"TEST_TABLE1"
    VALUES (1,'Row 1','Red','Blue','Green','Yellow','Orange','Black','Ivory','Purple','White','Lime','Gray');
    INSERT INTO"TEST_TABLE1"
    VALUES (2,'Row 2','Blue','Green','Yellow','Orange','Black','Ivory','Purple','White','Lime','Gray','Red');
    INSERT INTO"TEST_TABLE1"
    VALUES (3,'Row 3','Green','Yellow','Orange','Black','Ivory','Purple','White','Lime','Gray','Red','Blue');
    INSERT INTO"TEST_TABLE1"
    VALUES (4,'Row 4','Yellow','Orange','Black','Ivory','Purple','White','Lime','Gray','Red','Blue','Green');
    INSERT INTO"TEST_TABLE1"
    VALUES (5,'Row 5','Orange','Black','Ivory','Purple','White','Lime','Gray','Red','Blue','Green','Yellow');
    INSERT INTO"TEST_TABLE1"
    VALUES (6,'Row 6','Black','Ivory','Purple','White','Lime','Gray','Red','Blue','Green','Yellow','Orange');
    INSERT INTO"TEST_TABLE1"
    VALUES (7,'Row 7','Ivory','Purple','White','Lime','Gray','Red','Blue','Green','Yellow','Orange','Black');
    INSERT INTO"TEST_TABLE1"
    VALUES (8,'Row 8','Purple','White','Lime','Gray','Red','Blue','Green','Yellow','Orange','Black','Ivory');
    INSERT INTO"TEST_TABLE1"
    VALUES (9,'Row 9','White','Lime','Gray','Red','Blue','Green','Yellow','Orange','Black','Ivory','Purple');
    INSERT INTO"TEST_TABLE1"
    VALUES (10,'Row 10','Lime','Gray','Red','Blue','Green','Yellow','Orange','Black','Ivory','Purple','White');
    2. Create a blank page.
    3. Create a report region with the following query.
    select
    "DESC",
    htmldb_item.hidden(1,KEY1) Key1,
    htmldb_item.text(2,OPTION1,10,10) OPTION1,
    htmldb_item.text(3,OPTION2,10,10) OPTION2,
    htmldb_item.text(4,OPTION3,10,10) OPTION3,
    htmldb_item.text(5,OPTION4,10,10) OPTION4,
    htmldb_item.text(6,OPTION5,10,10) OPTION5,
    htmldb_item.text(7,OPTION6,10,10) OPTION6,
    htmldb_item.text(8,OPTION7,10,10) OPTION7,
    htmldb_item.text(9,OPTION8,10,10) OPTION8,
    htmldb_item.text(10,OPTION9,10,10) OPTION9,
    htmldb_item.text(11,OPTION10,10,10) OPTION10,
    htmldb_item.text(12,OPTION11,10,10) OPTION11
    from test_table1
    4. Create a Save button which branches back to the same page.
    5. Create a page process that executes the following SQL when the button is pressed:
    -- update Record
    for i in 1..htmldb_application.g_f01.count
    loop
    update test_table1
    set option1 = replace(htmldb_application.g_f02(i),'%'||'null%',NULL),
    option2 = replace(htmldb_application.g_f03(i),'%'||'null%',NULL),
    option3 = replace(htmldb_application.g_f04(i),'%'||'null%',NULL),
    option4 = replace(htmldb_application.g_f05(i),'%'||'null%',NULL),
    option5 = replace(htmldb_application.g_f06(i),'%'||'null%',NULL),
    option6 = replace(htmldb_application.g_f07(i),'%'||'null%',NULL),
    option7 = replace(htmldb_application.g_f08(i),'%'||'null%',NULL),
    option8 = replace(htmldb_application.g_f09(i),'%'||'null%',NULL),
    option9 = replace(htmldb_application.g_f10(i),'%'||'null%',NULL),
    option10 = replace(htmldb_application.g_f11(i),'%'||'null%',NULL),
    option11 = replace(htmldb_application.g_f12(i),'%'||'null%',NULL)
    where key1 = htmldb_application.g_f01(i);
    end loop;
    6. Run the page (This page has 110 items)
    If you have a table with more than 100 fields, you can split each record across several report rows so that the same item indexes are reused (wide_table stands for your 100+ column table), something like this:
    select 1 seq,
           apex_item.text(1,field1,10,10) c1,
           apex_item.text(2,field2,10,10) c2
      from wide_table
    union all
    select 2 seq,
           apex_item.text(1,field11,10,10),
           apex_item.text(2,field12,10,10)
      from wide_table
    Reusing the indexes this way also matters because the apex_item/htmldb_item arrays only go up to index 50 (g_f01 through g_f50).

  • Liquify shows jagged image when liquifying at more than 100%

    Hello
    Well, that's the problem. I recently changed my video card and Liquify started working differently. The video card I have now is an ATI RADEON HD 5850 DDR5, 1 GB. And the computer I have is:
    QuadCore Intel Core i7 920, 2800 MHz (21 x 133)
    Motherboard: DX58SO (1 PCI, 1 PCI-E x4, 2 PCI-E x16, 4 DDR3 DIMM, Audio, Gigabit LAN, IEEE-1394)
    RAM: 6 GB DDR3-1333 DDR3 SDRAM
    Video Card: ATI Radeon HD 5800 Series (1024 MB)
    Hard Drive: 1TB western digital, 4 partitions, 73gb free space in total and 13gb free on C Drive.
    Windows 7 ultimate 64 bit.
    Photoshop CS6.
    These are the things that happened after installing the new video card:
    - I used to work with the PUSH LEFT tool at Brush Pressure 1. Now, to achieve the same effect (the same kind of pressure), I have to use a higher number, like 6 or 7.
    - When liquifying the image at more than 100%, the image gets jagged. I can't do any precise work like this, because I'm not seeing the image correctly; the pixels are shifted, the lines jagged. If I go back to 100% the image looks normal again.
    I already tried changing the video card options in the Preferences panel, but the problem persists. The way I have it set up now is this:
    I need to do some precise work on small areas of the image, so this is quite a problem for me. Any suggestions, ideas?
    THANKS A LOT IN ADVANCE!
    Margaret.

    Could you please post a screenshot to illustrate the issue?
    What is the GPU driver?

  • More than 100 pictures in a dia show

    Hello,
    is it possible to change the maximum of 100 pictures in the import settings for a diashow?
    regards
    dominik

    I think you mean 'slideshow'.
    And no - you cannot change that setting. A slideshow is a special case of a track, and as such it has a limit of 99 chapter markers. When you create a slideshow, each picture automatically gets a marker added, so you cannot have more than 100 images.

  • Row chaining in table with more than 255 columns

    Hi,
    I have a table with 1000 columns.
    I saw the following citation: "Any table with more than 255 columns will have chained rows (we break really wide tables up)."
    If I insert a row populated with only the first 3 columns (the others are null), does row chaining occur?
    I tried to insert a row as described above and no row chaining occurred.
    As I understand it, row chaining occurs in a table with 1000 columns only when the populated data exceeds the block size OR when more than 255 columns are populated. Am I right?
    Thanks
    dyahav

    user10952094 wrote:
    Hi,
    I have a table with 1000 columns.
    I saw the following citation: "Any table with more than 255 columns will have chained rows (we break really wide tables up)."
    If I insert a row populated with only the first 3 columns (the others are null), does row chaining occur?
    I tried to insert a row as described above and no row chaining occurred.
    As I understand it, row chaining occurs in a table with 1000 columns only when the populated data exceeds the block size OR when more than 255 columns are populated. Am I right?
    Thanks
    dyahav
    Yesterday, I stated this on the forum: "Tables with more than 255 columns will always have chained rows." My statement needs clarification. It was based on the following:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28318/schema.htm#i4383
    "Oracle Database can only store 255 columns in a row piece. Thus, if you insert a row into a table that has 1000 columns, then the database creates 4 row pieces, typically chained over multiple blocks."
    And this paraphrase from "Practical Oracle 8i":
    V$SYSSTAT will show increasing values for CONTINUED ROW FETCH as table rows are read for tables containing more than 255 columns.
    Related information may also be found here:
    http://download.oracle.com/docs/cd/B10501_01/server.920/a96524/c11schem.htm
    "When a table has more than 255 columns, rows that have data after the 255th column are likely to be chained within the same block. This is called intra-block chaining. A chained row's pieces are chained together using the rowids of the pieces. With intra-block chaining, users receive all the data in the same block. If the row fits in the block, users do not see an effect in I/O performance, because no extra I/O operation is required to retrieve the rest of the row."
    http://download.oracle.com/docs/html/B14340_01/data.htm
    "For a table with several columns, the key question to consider is the (average) row length, not the number of columns. Having more than 255 columns in a table built with a smaller block size typically results in intrablock chaining.
    Oracle stores multiple row pieces in the same block, but the overhead to maintain the column information is minimal as long as all row pieces fit in a single data block. If the rows don't fit in a single data block, you may consider using a larger database block size (or use multiple block sizes in the same database). "
    Why not a test case?
    Create a test table named T4 with 1000 columns.
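    The post does not show the DDL for T4. A minimal sketch for generating it, assuming VARCHAR2(10) columns (the exact datatype is an assumption; any short VARCHAR2 works for this test):
    -- Build and execute CREATE TABLE T4 (COL1 VARCHAR2(10), ..., COL1000 VARCHAR2(10))
    DECLARE
      v_sql VARCHAR2(32767) := 'CREATE TABLE T4 (COL1 VARCHAR2(10)';
    BEGIN
      FOR i IN 2 .. 1000 LOOP
        v_sql := v_sql || ', COL' || i || ' VARCHAR2(10)';
      END LOOP;
      v_sql := v_sql || ')';
      EXECUTE IMMEDIATE v_sql;
    END;
    /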
    With the table created, insert 1,000 rows into the table, populating the first 257 columns each with a random 3 byte string which should result in an average row length of about 771 bytes.
    SPOOL C:\TESTME.TXT
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    INSERT INTO T4 (
      COL1,
      COL2,
      COL3,
      -- ... COL4 through COL254 elided in the original post ...
      COL255,
      COL256,
      COL257)
    SELECT
      DBMS_RANDOM.STRING('A',3),
      DBMS_RANDOM.STRING('A',3),
      DBMS_RANDOM.STRING('A',3),
      -- ... one DBMS_RANDOM.STRING('A',3) per elided column ...
      DBMS_RANDOM.STRING('A',3),
      DBMS_RANDOM.STRING('A',3),
      DBMS_RANDOM.STRING('A',3)
    FROM
      DUAL
    CONNECT BY
      LEVEL<=1000;
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SET AUTOTRACE TRACEONLY STATISTICS
    SELECT
      *
    FROM
      T4;
    SET AUTOTRACE OFF
    SELECT
      SN.NAME,
      SN.STATISTIC#,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SPOOL OFF
    What are the results of the above?
    Before the insert:
    NAME                            VALUE
    table fetch continued row         166
    After the insert:
    NAME                            VALUE
    table fetch continued row         166
    After the select:
    NAME                       STATISTIC#      VALUE
    table fetch continued row         252        332
    Another test, this time with an average row length of about 12 bytes:
    DELETE FROM T4;
    COMMIT;
    SPOOL C:\TESTME2.TXT
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    INSERT INTO T4 (
      COL1,
      COL256,
      COL257,
      COL999)
    SELECT
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3)
    FROM
      DUAL
    CONNECT BY
      LEVEL<=100000;
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SET AUTOTRACE TRACEONLY STATISTICS
    SELECT
      *
    FROM
      T4;
    SET AUTOTRACE OFF
    SELECT
      SN.NAME,
      SN.STATISTIC#,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SPOOL OFF
    With 100,000 rows each containing about 12 bytes, what should the 'table fetch continued row' statistic show?
    Before the insert:
    NAME                            VALUE
    table fetch continued row         332
    After the insert:
    NAME                            VALUE
    table fetch continued row         332
    After the select:
    NAME                       STATISTIC#      VALUE
    table fetch continued row         252      33695
    The final test only inserts data into the first 4 columns:
    DELETE FROM T4;
    COMMIT;
    SPOOL C:\TESTME3.TXT
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    INSERT INTO T4 (
      COL1,
      COL2,
      COL3,
      COL4)
    SELECT
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3)
    FROM
      DUAL
    CONNECT BY
      LEVEL<=100000;
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SET AUTOTRACE TRACEONLY STATISTICS
    SELECT
      *
    FROM
      T4;
    SET AUTOTRACE OFF
    SELECT
      SN.NAME,
      SN.STATISTIC#,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SPOOL OFF
    What should the 'table fetch continued row' statistic show?
    Before the insert:
    NAME                            VALUE
    table fetch continued row       33695
    After the insert:
    NAME                            VALUE
    table fetch continued row       33695
    After the select:
    NAME                       STATISTIC#      VALUE
    table fetch continued row         252      33695
    My statement "Tables with more than 255 columns will always have chained rows." needs to be clarified:
    "Tables with more than 255 columns will always have chained rows +(row pieces)+ if a column beyond column 255 is used, but the 'table fetch continued row' statistic +may+ only increase in value if the remaining row pieces are found in a different block."
    Charles Hooper
    IT Manager/Oracle DBA
    K&M Machine-Fabricating, Inc.
    Edited by: Charles Hooper on Aug 5, 2009 9:52 AM
    Paraphrase misspelled the view name "V$SYSSTAT", corrected a couple minor typos, and changed "will" to "may" in the closing paragraph as this appears to be the behavior based on the test case.

  • Reports with more than 100 columns

    I am using Apex 3.2 with a classic report that has more than 100 columns. To use custom headings in the report, I exported the page, modified the exported SQL file, and imported it back.
    But when I pass the parameters from an input screen for the first time, it either shows all the data available in the database or only the header. If I refresh the input screen and try again, it works as intended. I am updating the page at the location below. Let me know if I have to update anything more in the SQL file.
    declare
    s varchar2(32767) := null;
    begin
    s := null;
    wwv_flow_api.create_report_columns (
    p_id=> 200100535534034253 + wwv_flow_api.g_id_offset,
    p_region_id=> 2003453533453 + wwv_flow_api.g_id_offset,
    p_flow_id=> wwv_flow.g_flow_id,
    p_query_column_id=> 157,
    p_form_element_id=> null,
    p_column_alias=> 'XYZ',
    p_column_display_sequence=> 157,
    p_column_heading=> 'XYZ 123',
    p_column_alignment=>'LEFT',
    p_disable_sort_column=>'Y',
    p_sum_column=> 'N',
    p_hidden_column=> 'N',
    p_display_as=>'WITHOUT_MODIFICATION',
    p_pk_col_source=> s,
    p_column_comment=>'');
    end;
    /

    Hi,
    It's just a general thought, and I realize you know your application and user community better than I do, but do you think your users are going to be very happy when they are presented with a report with more than a hundred columns? Have you considered presenting the data in some sort of rolled-up form from which the user can then drill down to the data they are particularly interested in? Also, I'm sure your LAN administrator would be happy not to see 100+ column by x-number-of-rows reports being regularly shipped across the network.
    Also, 100+ column reports suggest tables with 100+ columns, which are probably not designed in a very relationally compliant way. I find that good DB design usually results in applications that have to make fewer compromises, such as hacking export files in order to fool the API into making ridiculous, unsupported and unsupportable compromises.
    Just a thought...
    Regards
    Andre

  • How can I see more than 100 users when adding users to a node?

    When adding new users through the User Management interface, only the
    first 100 entries are listed. How do I increase the number of entries
    so I can see the next hundred (or more) users? Is this done via the server
    configuration parameter maxsearchresult?
    The GUI Add User to Node option was meant for infrequent adds in smaller
    environments. In large deployments (more than 100 user adds at a time), you
    should be using the command line script "unidsattach". See the Administrators
    Guide for more details on the command.
    The parameter, maxsearchresult, is meant to limit the number of users that can
    be returned on a search request by the calendar client. For example, if you have
    1000's of calendar users and someone tries to search for all of them, this
    parameter will make sure that the user is not left waiting a long time for
    results.

    In Settings > Notification Center, try turning "on" all six buttons under "Today View". Also turn "on" the top two buttons under "Access on Lock Screen". Doing that should show you "Tomorrow" at the bottom of Notification Center.

  • Smart form: Spool to PDF Conversion more than 100 pages.

    Hi folks,
    I have an issue with function module (FM) CONVERT_ABAPSPOOLJOB_2_PDF when the report to be converted has more than 100 pages. I have explained the scenario below.
    We have been using the FM CONVERT_ABAPSPOOLJOB_2_PDF to convert an ABAP report to a PDF file and then send it via mail using FM SO_DOCUMENT_SEND_API1.
    PDF conversion using FM CONVERT_ABAPSPOOLJOB_2_PDF works well if the number of report pages is less than 100, i.e. the PDF table of the FM contains the output as expected.
    If the report has more than 100 pages, the FM CONVERT_ABAPSPOOLJOB_2_PDF schedules the conversion as a background job and returns with SY-SUBRC = 0, and the table parameter PDF will be empty.
    Could you suggest a workaround for this situation, i.e. how do I then get the output of the background job?
    And is there any process to convert a spool to PDF when it has more than 100 pages?
    Thanks in advance.

    Pb,
    If the report has more than 100 pages, the FM CONVERT_ABAPSPOOLJOB_2_PDF will schedule the conversion as a background job, return with SY-SUBRC = 0, and leave the table parameter PDF empty.
    I would suggest that if there are more than 100 pages, you send the report output to the spool (SP01) with a spool request number by using FM JOB_OPEN / JOB_CLOSE, as sketched below.
    Then, from SP01, you can convert the spool to PDF by running report RSTXPDFT4 in SE38.
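    A minimal sketch of the JOB_OPEN / SUBMIT / JOB_CLOSE pattern (the report name ZMY_REPORT is a placeholder):
    DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'ZSPOOL_TO_PDF',
          lv_jobcount TYPE tbtcjob-jobcount,
          ls_params   TYPE pri_params,
          lv_valid    TYPE c LENGTH 1.
    " Build valid print parameters without showing a dialog
    CALL FUNCTION 'GET_PRINT_PARAMETERS'
      EXPORTING
        no_dialog      = 'X'
      IMPORTING
        out_parameters = ls_params
        valid          = lv_valid.
    " Open a background job
    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname  = lv_jobname
      IMPORTING
        jobcount = lv_jobcount.
    " Run the report inside the job and route its output to the spool (SP01)
    SUBMIT zmy_report
      TO SAP-SPOOL SPOOL PARAMETERS ls_params
      WITHOUT SPOOL DYNPRO
      VIA JOB lv_jobname NUMBER lv_jobcount
      AND RETURN.
    " Release the job for immediate execution
    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobname   = lv_jobname
        jobcount  = lv_jobcount
        strtimmed = 'X'.
    The resulting spool request can then be converted with RSTXPDFT4, which has no 100-page limit.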
    Hope this solves your problem.
    Amit.

  • How to add rows formatted as table headings, so table headings are repeated when a table spans more than one page

    Hi all,
    I am facing a problem while generating the Test Result word document after successful execution of TestStand.
    The problem is:
    I want to add rows formatted as table headings, so that the table headings are repeated when a table spans more than one page (marked as red).
    Example:
    Page  No. 1
    |     Test case Number  |  Test Step number      |
    |      100                         |            100                   |
    Page  No. 2
    |     Test case Number  |  Test Step number      |
    |      200                         |            300                   |
    The Test Result word document should be generated with the table headings (marked as red) on every page of the document, but I am not getting that, as the example above shows.
    Please throw some light on this.
    Regards,
    Susa.

    Hi Santiago,
    Thank you very much for your valuable reply.
    I want to generate an MS Word report for TestStand after successful testing, using MS Word 2000.
    The test report contains actual values, expected values and pass/fail status.
    In my program I have customized all the fields, and I am able to generate a test report which contains the verification engineer name, test mode, test date, start time, end time, actual values, expected values, pass/fail status, etc.
    To put the values of test case number, test step number, actual values, expected values and pass/fail status into the table, I insert a row into the table every time values arrive. Once the table exceeds the page size it moves to the next page; the next page should start with the table row header, but instead it starts with the values of the parameters above.
    So I am not able to repeat the table row header on each page.
    Please find the attached file for your reference.
    The attached file expected.doc shows what I want the generated MS Word report to look like: the table row header "Test Case Number and Test Step Number" is repeated on the second page.
    The attached file Actual output from source code.doc was generated from the source code: there, the table row header "Test Case Number and Test Step Number" is not repeated on the second page.
    Do you know of any property to set "repeat as header row at the top of each page" using MS Word ActiveX in CVI/LabWindows?
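    For reference, the Word object model exposes this as the HeadingFormat property of a table row; a minimal sketch in VBA terms (the same property can be set through the Word ActiveX automation wrappers in CVI):
    ' Mark the first row of the first table as a heading row, so Word
    ' repeats it automatically at the top of every page the table spans.
    Dim doc As Word.Document
    Set doc = ActiveDocument
    doc.Tables(1).Rows(1).HeadingFormat = True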
    I think this information is sufficient for you; still, if you need more information, please ask me.
    Thanks
    Susa.
    Attachments:
    Actual output from source code.doc (25 KB)
    expected.doc (26 KB)

  • Can SharePoint calendar show more than 3 items on a Day's schedule in Month view? How?

    Hi there,
    Can SharePoint calendar show more than 3 items on a Day's schedule in Month view? How?
    Thank you so much.

    Hi,
    From your description, my understanding is that you want to expand all items when a day's schedule has more than 3 items.
    You can accomplish this with jQuery by auto-expanding all items when the month view loads; please refer to this article:
    Auto Expand Month View on SharePoint 2013 Calendar
    http://blog.eastridge.net/auto-expand-month-view-on-sharepoint-2013-calendar
    If I misunderstand your requirement, please provide more detailed information.
    Best Regards,
    Vincent Han
    TechNet Community Support
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact
    [email protected]

  • How to print a graph in which internal table has more than 32 entries?

    Hi experts,
    I am trying to make a line graph using the 'GFW_PRES_SHOW' function module,
    but the report's internal table has more than 32 entries.
    How can I print a graph with more than 32 entries?

    Hi ricky_lv,
    According to your description, there is a main report with a subreport in it, and when the subreport spans multiple pages, you want to show column headers on each page. If that is the case, we can set the column headers to stay visible while scrolling in the main report.
    For detail information, please refer to the following steps:
    In design mode, click the small drop-down arrow next to Column Groups and select Advanced Mode.
    Go to your Row Groups pane and click on the first static member.
    In the properties grid, set FixedData to True.
    Set RepeatOnNextPage to True.
    Here is a relevant thread you can reference:
    https://social.technet.microsoft.com/Forums/en-US/e1f67cec-8fa3-4c5d-86ba-28b57fc4a211/keep-header-rows-visible-while-scrolling?forum=sqlreportingservi
    The following screenshots are for your reference:
    If you have any more questions, please feel free to ask.
    Thanks,
    Wendy Fu
    If you have any feedback on our support, please click
    here.
    Wendy Fu
    TechNet Community Support

  • Flex file upload more than 100 MB

    Hi,
    Is there possibly any method by which I can upload files to the server that are more than 100 MB?
    I've gone through the nice piece of code demonstrated here: http://www.zehnet.de/2009/02/23/flex-fileupload-component/ , which shows how we can upload files in chunks to the server as of FP10, but it is also limited by the FP maximum of 100 MB. I've also gone through the discussion at http://forums.adobe.com/thread/494366 , where a process is discussed in which FP first reads a defined number of bytes from a file, loads them and uploads them, and then goes for the rest of the bytes, using the openAsync() method with FileMode.READ, but that is only available in AIR. So I wonder if there could possibly be any way to upload files of more than 100 MB in chunks to the server in the browser mode of the Flash Player.
    Thanks for any coming suggestions.
    Rgds,
    Santanu.K

    Long time no see, vet. Your problem sucks. I would go to the Sun site and review their articles on JVM tuning. To solve this problem you will need to change the command line arguments to the JVM to increase the heap space:
    java -Xms<initial heap size> -Xmx<maximum heap size>
    Defaults are usually: java -Xms32m -Xmx128m
    -Xms - initial Java heap size
    -Xmx - maximum Java heap size
    -Xmn - the size of the heap for the young generation
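    For example, to run an application with a 512 MB initial and a 2 GB maximum heap (the jar name is a placeholder):
    java -Xms512m -Xmx2048m -jar myapp.jar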
    This is a good article: http://java.sun.com/developer/technicalArticles/Programming/turbo/
    Also, if this solves your problem please check it off so I can get some points. I've been out of the game for awhile.
    Ike Wiggins
    http://bipublisher.blogspot.com

  • More than 100 members in Dynamic Calc

    Hi, I just realised I have more than 100 children under a dynamic calc member in a sparse dimension. I can save the outline but it sends me warnings. What else could happen if I leave it as it is? Thanks in advance

    The only thing that COULD happen is that performance when querying the dynamic calc member could be adversely affected. If the dimension is sparse, Essbase has to load all of the blocks for those child members to calculate the total, which, depending on a number of factors, can take some time.
    Regards,
    Jade
    -----------------------------------
    Jade Cole
    Senior Business Intelligence Consultant
    Clarity
    [email protected]
