Too many table columns

Hi,
I have to create a PDF with a table that has many columns, but there are too many columns to fit on an A4 page.
I do not want to change the paper type; I have to use A4 only.
Is there a way to show each table record across two consecutive rows, so that I can split the columns into two rows?
There would be two header rows and two data rows (only the data rows repeat): data row 1 would hold, say, fields 1 to 10 and data row 2 would hold fields 11 to 20.
Regards
Reema.

As far as I know there's probably no way ID is going to do what you want.
You can place a table across a multi-page spread, but the odds of being able to do that and still keep the file printable are marginal at best. You can't, for example, leave blank space at the gutters unless you are able to add a blank column that spans the gutter, and the limit for multi-page spreads is 10 pages wide, which doesn't sound like enough to hold almost 600 columns.
Perhaps placing the table using named ranges of appropriate widths would work...

Similar Messages

  • Bex Query: Too many table names in the query The maximum allowable is 256

    Hi Experts,
    I need your help. I'm working on a query using a MultiProvider over 2 DataStores. I need to work with cells to assign specific account values to specific rows and columns, so I was creating a structure with elements from a hierarchy, but I get this error when I'm halfway through the structure:
    "Too many table names in the query. The maximum allowable is 256. Incorrect syntax near ')'. Incorrect syntax near 'O1'."
    Any idea what is happening? Is it possible to fix it? Do I need to ask for a modification of my InfoProviders? Someone told me it is possible to combine 2 queries, is that true?
    Thanks a lot for your time and patience.

    Hi,
    The maximum allowable limit of 256 holds true. It is the maximum number of characteristics and key figures that can be used on the column side. While creating a structure, you create key figures (restricted or calculated), formulas, etc. The objects that you use to create these should not number more than 256.
    http://help.sap.com/saphelp_nw70/helpdata/EN/4d/e2bebb41da1d42917100471b364efa/frameset.htm
    Not sure if a combination of 2 queries is possible. You can use RRI, or have a workbook with 2 queries.
    Hope it helps.

  • Select from (too many) tables

    Hi all,
    I'm a proud Oracle APEX developer. We have developed an Interactive Report that is generated from many joined tables in a remote system. I've read that to improve performance we can do the following:
    1) Create a temporary table on our system that stores the app_user id and the columns returned by the query
    2) Create a procedure along these lines:
    declare
      param1 varchar2(4000) := :PXX_item;
      param2 varchar2(4000) := :PXY_item;
      param3 varchar2(4000) := v('APP_USER');
    begin
      insert into <our_table>
        select param3, <query from remote system>;
      commit;
    end;
    3) Redirect to a page where the IR reads from this temp table
    On the "Exit" button there's a procedure that purges that user's data (delete from temp where user = v('APP_USER')), so the temp table only holds the necessary data.
    Do you see any inconvenience? The application will be used by about 500 users, with about 50 concurrent users at a time.
    Thank you!

    1) "We don't have control on the source system, we can only perform queries on it" - I was referring to a materialized view on the system where APEX is installed, not on the source database.
    2) "There are many tables involved" - I don't understand why this is a problem. Too much data I can see, but too many tables... not so much.
    3) "Data has to be in real time, with no delay" - This would be a problem for an MV or collections. The collections would store the data as of the initial query. Any IRs using the collection after the fact would be using stale data. If you absolutely have to have the data as of right now every time, then the full query must run on the remote system every time. Tuning that query is the only option to make it faster.
    4) "There are many transactions on the source tables (they are the core of the source system) and so the MV could not be refreshed fast enough" - It probably could be, with fast refresh enabled, though that is not necessarily practical. As I indicated in 3, you have painted yourself into a corner here. You have indicated a need for a real-time query, and that eliminates a number of possibilities for query-once, use-many performance solutions.
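    For what it's worth, here is a minimal sketch of the fast-refresh idea mentioned above, in Oracle SQL with made-up names (remote_orders, a db link called src_link, mv_orders); fast refresh needs a materialized view log on the master table, so it only works if objects can be created on the source side:
    -- on the source database: MV log required for fast refresh (assumes this is permitted there)
    CREATE MATERIALIZED VIEW LOG ON remote_orders WITH PRIMARY KEY;
    -- on the APEX database: MV over the db link, refreshed incrementally on demand
    CREATE MATERIALIZED VIEW mv_orders
      REFRESH FAST ON DEMAND
      AS SELECT order_id, customer_id, amount
         FROM   remote_orders@src_link;
    -- run from a scheduler job at whatever interval the business can tolerate
    BEGIN
      DBMS_MVIEW.REFRESH('MV_ORDERS', 'F');
    END;
    /
    Even a frequent refresh is still a snapshot, so this only helps if "real time" can be relaxed to "a few minutes old".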

  • Too many tables with EXECUTE permission

    Hi Gurus,
    I found that my live database has too many tables with EXECUTE permission, but I don't know how this happened. My query is as follows:
    select table_name
    from dba_tab_privs
    where owner='SYS' AND
    privilege = 'EXECUTE' AND
    grantee = 'PUBLIC'
    result
    TABLE_NAME
    /598cc2d9_AWExceptionMessageRe
    /24bd47b0_AWExceptionMessageRe
    /b99e8561_AWExceptionMessageRe
    /968869b8_AWExceptionMessageRe
    /f8bf68b3_AWExceptionMessageRe
    /9abd5a42_AWExceptionMessageRe
    /5e83964b_AWExceptionMessageRe
    /f01cb9e5_AWExceptionMessageRe
    /380f765f_AWExpressCommandExce
    /adef78c4_AWMemberExistsExcept
    /5166f5c2_AWObjectExistsExcept
    TABLE_NAME
    oracle/AWXML/SparseDefinition
    oracle/AWXML/ModelDimRef
    /9d17934e_AWFunctionNotSupport
    /d18d9de8_AWHandlerBaseTest
    DBMS_AW_XML
    INTERACTIONEXECUTE
    CWM2_OLAP_INSTALLER
    DBMS_XSOQ_ODBO
    OLAPI_MDX_ROWSET_IMPL_T
    OLAPI_MDX_ROWSET_TABLE
    16444 rows selected.
    ==========================================================
    Then I executed the above query in another database. The result was:
    TABLE_NAME
    STANDARD
    UTL_HTTP
    DBMS_PICKLER
    DBMS_JAVA_TEST
    UTL_FILE
    UTL_RAW
    UTL_TCP
    UTL_INADDR
    UTL_SMTP
    DBMS_TRANSACTION
    DBMS_SESSION
    DBMS_DDL
    DBMS_UTILITY
    DBMS_SPACE
    DBMS_ROWID
    DBMS_PCLXUTIL
    DBMS_APPLICATION_INFO
    DBMS_OUTPUT
    DBMS_DESCRIBE
    DBMS_SQL
    DBMS_EXPORT_EXTENSION
    DBMS_JOB
    DBMS_STATS
    DBMS_ZHELP_IR
    DBMS_PSP
    DBMS_RULE
    AQ$_AGENT
    AQ$_DEQUEUE_HISTORY
    AQ$_SUBSCRIBERS
    AQ$_RECIPIENTS
    AQ$_HISTORY
    AQ$_NOTIFY_MSG
    AQ$_DUMMY_T
    DBMS_AQ_EXP_QUEUE_TABLES
    DBMS_AQ_EXP_INDEX_TABLES
    DBMS_AQ_EXP_TIMEMGR_TABLES
    DBMS_AQ_EXP_HISTORY_TABLES
    DBMS_AQ_EXP_SUBSCRIBER_TABLES
    DBMS_AQ_EXP_QUEUES
    DBMS_AQ_IMP_INTERNAL
    DBMS_RMIN
    DBMS_RESOURCE_MANAGER
    DBMS_RESOURCE_MANAGER_PRIVS
    DBMS_RMGR_PLAN_EXPORT
    DBMS_RMGR_GROUP_EXPORT
    DBMS_RMGR_PACT_EXPORT
    LOW_GROUP
    DEFAULT_CONSUMER_GROUP
    DBMS_DEBUG_VC2COLL
    DBMS_DEBUG
    PBSDE
    DBMS_SUMMARY
    DBMS_SNAPSHOT
    DBMS_REFRESH
    DBMS_SNAPSHOT_UTL
    DBMS_REFRESH_EXP_SITES
    DBMS_REFRESH_EXP_LWM
    DBMS_TRACE
    DBMS_LOB
    UTL_REF
    UTL_COLL
    ODCIPREDINFO
    ODCIRIDLIST
    ODCIINDEXCTX
    ODCIARGDESCLIST
    ODCIFUNCINFO
    ODCISTATSOPTIONS
    ODCICOLINFOLIST
    ODCIOBJECT
    ODCIOBJECTLIST
    ODCIQUERYINFO
    ODCICONST
    SYSEVENT
    DICTIONARY_OBJ_TYPE
    DICTIONARY_OBJ_OWNER
    DICTIONARY_OBJ_NAME
    DATABASE_NAME
    INSTANCE_NUM
    LOGIN_USER
    IS_SERVERERROR
    SERVER_ERROR
    DES_ENCRYPTED_PASSWORD
    IS_ALTER_COLUMN
    IS_DROP_COLUMN
    GRANTEE
    REVOKEE
    PRIVILEGE_LIST
    WITH_GRANT_OPTION
    DICTIONARY_OBJ_OWNER_LIST
    DICTIONARY_OBJ_NAME_LIST
    IS_CREATING_NESTED_TABLE
    CLIENT_IP_ADDRESS
    DBMS_REPUTIL
    DBMS_REPUTIL2
    DBMS_OFFLINE_RGT
    DBMS_REPCAT_RGT_EXP
    DBMS_REPCAT_INSTANTIATE
    DBMS_CRYPTO_TOOLKIT
    DBMS_RANDOM
    How does this happen? I need help from you all!

    You asked why you have such grants in dba_tab_privs. Let's try to find DBMS_RANDOM (listed in your output) in $ORACLE_HOME/rdbms/admin:
    cd $ORACLE_HOME/rdbms/admin
    grep dbms_random *
    dbmsrand.sql:CREATE OR REPLACE PACKAGE dbms_random AS
    dbmsrand.sql: -- execute dbms_random.seed(12345678);
    dbmsrand.sql: -- execute dbms_random.seed(TO_CHAR(SYSDATE,'MM-DD-YYYY HH24:MI:SS'));
    dbmsrand.sql: -- my_random_number := dbms_random.random;
    dbmsrand.sql: -- my_random_real := dbms_random.value;
    dbmsrand.sql: -- select dbms_random.value from dual;
    dbmsrand.sql: -- insert into a values (dbms_random.value);
    dbmsrand.sql: -- execute :x := dbms_random.value;
    dbmsrand.sql:END dbms_random;
    dbmsrand.sql:CREATE OR REPLACE PACKAGE BODY dbms_random AS
    dbmsrand.sql:END dbms_random;
    dbmsrand.sql:CREATE OR REPLACE PUBLIC SYNONYM dbms_random FOR sys.dbms_random;
    dbmsrand.sql:GRANT EXECUTE ON dbms_random TO public;
    This is run by the catproc.sql script, which you run while creating the database.
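    A quick way to see that most of those "tables" are actually packages, types and Java classes rather than real tables is to join the grants to dba_objects; this is just an illustrative query, not something from the original thread:
    SELECT o.object_type, COUNT(*) AS grants
    FROM   dba_tab_privs p
    JOIN   dba_objects   o
           ON  o.owner = p.owner
           AND o.object_name = p.table_name
    WHERE  p.owner = 'SYS'
    AND    p.privilege = 'EXECUTE'
    AND    p.grantee = 'PUBLIC'
    AND    o.object_type NOT IN ('PACKAGE BODY', 'TYPE BODY')  -- grants are on the spec, not the body
    GROUP  BY o.object_type
    ORDER  BY grants DESC;
    DBA_TAB_PRIVS lists privileges on every object type, not only tables, which is why EXECUTE grants show up in a view with "TAB" in its name.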

  • Oracle Instanc has too many tables, way to subset?

    Hello,
    When I try to bring up an Oracle instance and do a query, there are so many tables that it takes 10 minutes for H. to bring up the list of tables.
    Is there a way to subset by Owner like TOAD does, so that I'm not trying to bite off the whole thing at once?
    Thanks very much!
    BobK


  • Too Many Table logs in DBTABLOG, RSTBPDEL is taking too much time

    Hi Experts,
    In one of our CRM systems, the DBTABLOG table is logging one table which currently has 1 billion entries. The business doesn't want to switch off the logging at this moment, but the table is growing rapidly at 42 GB per month. The RSTBPDEL program has been running for weeks to delete entries, with no control over the growth.
    Can you please suggest a way to delete them quickly at first, so that my housekeeping job can then run daily and finish soon.
    Regards,
    Mohan.

    Hello Mohan,
    The DBTABLOG table does get large; the best option is to switch off logging. If that's not possible, increase the frequency of your delete job. Also explore one more alternative: have a look at the archiving object BC_DBLOGS. You could archive old records (in accordance with your customer's data retention policies) to reduce the size of the table.
    Also, have a look at the following notes; they will advise you on how to improve the performance of your delete job:
    Note 531923 - Audit Trail: Indexes on table DBTABLOG
    Note 579980 - Table logs: Performance during access to DBTABLOG
    Regards,
    Siddhesh
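    If it helps to confirm which logged table is driving the growth before choosing between deletion and archiving, a rough count per logged table can be taken directly on the database. This is only a sketch: the schema owner SAPSR3 is an assumption, and the count will run for a long time on a table of this size:
    SELECT tabname, COUNT(*) AS log_rows
    FROM   sapsr3.dbtablog
    GROUP  BY tabname
    ORDER  BY log_rows DESC;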

  • 401 Unauthorized after too many tables in AXL SQL Query

    Hi...
    I have an app that sends several AXL calls. All work fine with the exception of one accessing MGCP data via the AXL SQL QUERY command. I have found that if I only query a couple of tables it works fine, but if I add in more than 3 I get a 401 Unauthorized return. Now, I know the commands are built correctly because it works for every other command in the set and, like I said, if I only use 3 or fewer tables the query works. Also, if I SSH into the CM locally and do "run sql" with the full command, it returns all tables fine, which leads me to believe this is a restriction in the return of the AXL SOAP call...
    Help?
    I am using CM 6.0. The full query being passed in that returns 401 unauthorized is:
    SELECT MGCP.pkid, MGCP.DomainName, TypeProduct.Name AS GatewayProduct, CallManagerGroup.Name AS CallManagerGroup, MGCPSlotConfig.Slot, TypeMGCPSlotModule.Name AS UnitModule, MGCPSlotConfig.Subunit AS SubUnitIndex, TypeMGCPVic.Name AS SubUnitProduct, TypeMGCPVic.MaxNumPorts, MGCP.VersionStamp, MGCP.SpecialLoadInformation FROM TypeMGCPVic RIGHT OUTER JOIN MGCPSlotConfig ON TypeMGCPVic.Enum = MGCPSlotConfig.tkMGCPVic LEFT OUTER JOIN CallManagerGroup RIGHT OUTER JOIN MGCP ON CallManagerGroup.pkid = MGCP.fkCallManagerGroup LEFT OUTER JOIN TypeProduct ON MGCP.tkProduct = TypeProduct.Enum ON MGCPSlotConfig.fkMGCP = MGCP.pkid LEFT OUTER JOIN TypeMGCPSlotModule ON MGCPSlotConfig.tkMGCPSlotModule = TypeMGCPSlotModule.Enum ORDER BY MGCP.DomainName, MGCPSlotConfig.Slot, UnitModule DESC, MGCPSlotConfig.Subunit"
    Again, this works locally on the CM machine, and it works if I pull back only some of the tables... The AXL trace logs do not have errors; they just stop (with a return SOAP call that isn't passed back because a 401 is received).
    Thanks!

    Here is the trace log. The return is there! It just never gets received.
    2008-05-22 09:50:23,458 INFO [http-8443-Processor23] axl.AXLRouter - <?xml version="1.0" encoding="UTF-8"?> [SOAP response envelope follows; the XML tags were stripped when the log was pasted, leaving only the concatenated row data for the MGCP gateways (2811-Test-VG1, Router.atrion.internal, test-3845) and their slot/VIC modules]
    2008-05-22 09:50:23,461 INFO [http-8443-Processor23] axl.AXLRouter - Request 1211393338927 was process in 30ms

  • Too many tables

    I am developing an HR web application that uses EJB technology. The database for my application has about 73 tables, and about 65% of these are lookup tables. Do I have to map all the tables to entity beans?
    I think this will kill the performance. I don't know exactly, but I thought I could map all the lookup tables to one entity bean. My question is: can I do this?
    Please help.
    Thanks in advance

    "I think this will kill the performance" - Maybe not; entity beans can cache data quite well.
    If you use WebLogic, then map the lookup tables to read-only entity beans.
    "I do not know exactly but I thought that I can map all the lookup tables to one entity bean, my question is can I do this?" - No, it's not possible with CMP beans.

  • Too Many Tables?  Can I Display A Subset?

    When I use SQL Developer, open a connection, and click the + next to tables, I get a bunch of tables that are empty and unused by the application. I really can't do anything about the user associated with the tables/schema since the whole thing is "owned" by the application. But what I thought might be possible was to somehow keep a list of the tables that actually are important, and have that come back when I expand the tables node, rather than have all of the tables keep showing up. Is there a way to do this?
    -- Dale --

    Thanks for the idea. I use the filters all of the time to find tables of interest. Like if I'm looking for tables with product info, I'll put "%prod%", or something like that. But as you suggest, I guess there's no reason why I couldn't key in every one of my 100+ tables of interest so I don't see 300+ tables that are empty and unused.
    One thing that worries me is that I would make this huge investment in time (keying into the filter) and I'd hit "clear filter" by accident one day.
    I think I'll write a macro in AutoIt and enter the interesting tables in the filter using the (reusable) macro. Thanks again.
    -- Dale --
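    As a complementary approach (not from the thread), the list of interesting tables could be generated with a query instead of keyed in by hand, for example by skipping tables whose optimizer statistics say they are empty. OWNER_NAME is a placeholder, and NUM_ROWS is only as current as the last statistics gathering:
    SELECT table_name
    FROM   all_tables
    WHERE  owner = 'OWNER_NAME'
    AND    NVL(num_rows, 0) > 0
    ORDER  BY table_name;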

  • Too Many Parameters Errors. Toplink Generated Class error 300 columns table

    Hi Folks,
    I have a table which has around 300 columns. Once I generate a JPA entity using the Entities from Tables wizard in JDeveloper, it gives me an error that there are too many parameters in the method; that method is actually the generated constructor.
    If I delete half of the arguments in the constructor, it doesn't give me the error, as it no longer exceeds the limit. Would that be fine? Or does TopLink internally need to call that constructor for instrumentation purposes?
    The other option is to go with the default constructor, which is what I would personally recommend, but I want to make sure it will work.
    Any comments, or anyone came into this problem before
    A.B
    Edited by: AAB on Sep 8, 2009 7:32 AM

    Hello,
    TopLink will only use the no-argument constructor when building the object, then populate attributes using either attribute or property access. So the generated constructor taking arguments is not used at all by TopLink.
    Best Regards,
    Chris

  • Too many columns to be shown in the Enterprise Manager 11g?

    Hello,
    we are having some problems with Enterprise Manager 11g. When we want to VIEW DATA of a specific table, we get the exception below. We think that our table has too many columns to be displayed. If we delete some of the columns, the data is shown in Enterprise Manager, but this cannot be a solution for us. Can you help us with this?
    2009-08-03 10:07:04,210 [EMUI_10_07_04_/console/database/schema/displayContents] ERROR svlt.PageHandler handleRequest.639 - java.lang.ArrayIndexOutOfBoundsException: -128
    java.lang.ArrayIndexOutOfBoundsException: -128
         at oracle.sysman.emo.adm.DBObjectsMCWInfo.getSqlTimestampIndexes(DBObjectsMCWInfo.java:194)
         at oracle.sysman.emo.adm.schema.TableViewDataBrowsingDataSource.executeQuery(TableViewDataBrowsingDataSource.java:167)
         at oracle.sysman.emo.adm.DatabaseObjectsDataSource.populate(DatabaseObjectsDataSource.java:201)
         at oracle.sysman.emo.adm.DatabaseObjectsDataSource.populate(DatabaseObjectsDataSource.java:151)
         at oracle.sysman.emo.adm.schema.DisplayContentsObject.populate(DisplayContentsObject.java:369)
         at oracle.sysman.db.adm.schm.DisplayContentsController.onDisplayAllRows(DisplayContentsController.java:303)
         at oracle.sysman.db.adm.schm.DisplayContentsController.onDisplayContents(DisplayContentsController.java:290)
         at oracle.sysman.db.adm.schm.DisplayContentsController.onEvent(DisplayContentsController.java:136)
         at oracle.sysman.db.adm.DBController.handleEvent(DBController.java:3431)
         at oracle.sysman.emSDK.svlt.PageHandler.handleRequest(PageHandler.java:577)
         at oracle.sysman.db.adm.RootController.handleRequest(RootController.java:205)
         at oracle.sysman.db.adm.DBControllerResolver.handleRequest(DBControllerResolver.java:121)
         at oracle.sysman.emSDK.svlt.EMServlet.myDoGet(EMServlet.java:781)
         at oracle.sysman.emSDK.svlt.EMServlet.doGet(EMServlet.java:337)
         at oracle.sysman.eml.app.Console.doGet(Console.java:318)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:743)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
         at com.evermind.server.http.ResourceFilterChain.doFilter(ResourceFilterChain.java:64)
         at oracle.sysman.eml.app.EMRepLoginFilter.doFilter(EMRepLoginFilter.java:109)
         at com.evermind.server.http.EvermindFilterChain.doFilter(EvermindFilterChain.java:15)
         at oracle.sysman.db.adm.inst.HandleRepDownFilter.doFilter(HandleRepDownFilter.java:153)
         at com.evermind.server.http.EvermindFilterChain.doFilter(EvermindFilterChain.java:17)
         at oracle.sysman.eml.app.BrowserVersionFilter.doFilter(BrowserVersionFilter.java:122)
         at com.evermind.server.http.EvermindFilterChain.doFilter(EvermindFilterChain.java:17)
         at oracle.sysman.emSDK.svlt.EMRedirectFilter.doFilter(EMRedirectFilter.java:102)
         at com.evermind.server.http.EvermindFilterChain.doFilter(EvermindFilterChain.java:17)
         at oracle.sysman.eml.app.ContextInitFilter.doFilter(ContextInitFilter.java:336)
         at com.evermind.server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:627)
         at com.evermind.server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:376)
         at com.evermind.server.http.HttpRequestHandler.doProcessRequest(HttpRequestHandler.java:870)
         at com.evermind.server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:451)
         at com.evermind.server.http.HttpRequestHandler.serveOneRequest(HttpRequestHandler.java:218)
         at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:119)
         at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:112)
         at oracle.oc4j.network.ServerSocketReadHandler$SafeRunnable.run(ServerSocketReadHandler.java:260)
         at com.evermind.util.ReleasableResourcePooledExecutor$MyWorker.run(ReleasableResourcePooledExecutor.java:303)
     at java.lang.Thread.run(Thread.java:595)
    When we select the table via SQL, everything works fine.

    Hi,
    I'm Galit from the QE team of VIN.
    All the things that you've described are correct.
    It is actually an edge case where the only VM, that the manual App can be managed from its Map view, was removed from the App.
    The Manual App management is as designed, and may be changed in the future.
    There are 2 ways to overcome this situation:
    1.You can, as you stated, create another Manual App with similar name and remain with the "Zombie App".
    2. To run a specific command that will remove the Zombie App from the DB.
    Please note that option no. 2 involves using an API that we do not publish.
    If you would like to use option no. 2 contact me in private and we will see about supplying the relevant commands to run in order to delete the "zombie" application.
    Thanks,
    Galit Gutman

  • Too Many Index in a Table

    Dear Gurus,
    I've got some performance problems with a specific table called CC_FICHA_FINANCEIRA; the structure of the table is described below:
    NUMBER OF RECORDS: ABOUT 1,600,000
    NAME NULL TYPE
    CD_FUNDACAO NOT NULL VARCHAR2(2)
    NUM_INSCRICAO NOT NULL VARCHAR2(9)
    CD_PLANO NOT NULL VARCHAR2(4)
    CD_TIPO_CONTRIBUICAO NOT NULL VARCHAR2(2)
    ANO_REF NOT NULL VARCHAR2(4)
    MES_REF NOT NULL VARCHAR2(2)
    SEQ_CONTRIBUICAO NOT NULL NUMBER(5)
    CD_OPERACAO NOT NULL VARCHAR2(1)
    SRC NUMBER(15,2)
    REMUNERACAO NUMBER(15,2)
    CONTRIB_PARTICIPANTE NUMBER(15,2)
    CONTRIB_EMPRESA NUMBER(15,2)
    DIF_CONTRIB_PARTICIPANTE NUMBER(15,2)
    DIF_CONTRIB_EMPRESA NUMBER(15,2)
    TAXA_ADM_PARTICIPANTE NUMBER(15,2)
    TAXA_ADM_EMPRESA NUMBER(15,2)
    QTD_COTA_RP_PARTICIPANTE NUMBER(15,6)
    QTD_COTA_FD_PARTICIPANTE NUMBER(15,6)
    QTD_COTA_RP_EMPRESA NUMBER(15,6)
    QTD_COTA_FD_EMPRESA NUMBER(15,6)
    ANO_COMP NOT NULL VARCHAR2(4)
    MES_COMP NOT NULL VARCHAR2(2)
    CD_ORIGEM VARCHAR2(2)
    EXPORTADO VARCHAR2(1)
    SEQ_PP_PR_PAR NUMBER(10)
    ANO_PP_PR_PAR NUMBER(5)
    SEQ_PP_PR_EMP NUMBER(10)
    ANO_PP_PR_EMP NUMBER(5)
    SEQ_PP_PR_TX_PAR NUMBER(10)
    ANO_PP_PR_TX_PAR NUMBER(5)
    SEQ_PP_PR_TX_EMP NUMBER(10)
    ANO_PP_PR_TX_EMP NUMBER(5)
    I think that the indexes on this table may be the problem; there are too many. I will describe them below:
    INDEX COLUMNS
    CC_FICHA_FINANCEIRA_PK CD_FUNDACAO
    NUM_INSCRICAO
    CD_PLANO
    CD_TIPO_CONTRIBUICAO
    ANO_REF
    MES_REF
    SEQ_CONTRIBUICAO
    CD_OPERACAO
    ANO_COMP
    MES_COMP
    CC_FICHA_FINANCEIRA_IDX_002 CD_FUNDACAO
    NUM_INSCRICAO
    CD_PLANO
    CD_TIPO_CONTRIBUICAO
    ANO_COMP
    ANO_REF
    MES_COMP
    MES_REF
    SRC
    CC_FICHA_FINANCEIRA_IDX_006 CD_ORIGEM
    CC_FICHA_FINANCEIRA_IDX_007 CD_TIPO_CONTRIBUICAO
    CC_FICHA_FINANCEIRA_IDX2 CD_FUNDACAO
    ANO_REF
    MES_REF
    NUM_INSCRICAO
    CD_PLANO
    CD_TIPO_CONTRIBUICAO
    CONTRIB_EMPRESA
    CC_FICHA_FINANCEIRA_IDX3 CD_FUNDACAO
    ANO_REF
    MES_REF
    CD_PLANO
    CD_TIPO_CONTRIBUICAO
    SEQ_CONTRIBUICAO
    There are columns that appear in 4 indexes. Is that right? What is the best way to analyze those indexes?
    Regards...

    Hi,
    You can monitor index usage to find out whether an index is used by the application.
    See Metalink note 136642.1, Identifying Unused Indexes with the ALTER INDEX MONITORING USAGE Command.
    Nicolas.
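    For reference, a minimal sketch of that monitoring approach, using one of the index names listed above (run it for each candidate index, leave monitoring on through a representative workload, then check the flag; V$OBJECT_USAGE only shows indexes owned by the connected user):
    ALTER INDEX cc_ficha_financeira_idx_006 MONITORING USAGE;
    -- after a representative period, see whether the optimizer ever used it
    SELECT index_name, monitoring, used
    FROM   v$object_usage
    WHERE  index_name = 'CC_FICHA_FINANCEIRA_IDX_006';
    -- stop collecting when done
    ALTER INDEX cc_ficha_financeira_idx_006 NOMONITORING USAGE;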

  • Too much information in one table line - alternative solution?

    Hi,
    I just joined today since I am new to Adobe LiveCycle.
    I am using Adobe LiveCycle 8.1.2.
    As the title explains, I am currently designing an interactive Adobe form (orientation: Portrait) which has a table with "add/remove" buttons. The "add/remove" works perfectly (I can add a line to or remove a line from the table).
    The problem is that I have too much information (read: columns) in one table line. As a result:
    1. The table line is not wide enough to display all the information, OR
    2. I can squeeze all columns into one line and play around with a smaller font size, but then the form looks ugly.
    Has anyone encountered this kind of situation before?
    Can you share your design (if possible, attach the form) for a similar situation?
    Any bright input/ideas are most welcome.
    Thanks In Advance !
    Note:
    1. Landscape is NOT an option.
    2. I still want to have "add/remove" functionality.

    Hi,
    Suppose you have set the number of columns to 6 for a row: the 1st row has 6 columns and the 2nd row initially has 6 columns too. But you can merge some of those columns into one. Select the columns you want to merge, then Right Click -> Merge Cells. This may lead to another problem with the captions of the columns, but if it's a text field or the same type of field you can set the caption to appear on top.
    Thanks,
    Bibhu.

  • Can you have too many rows in a table?

    How many rows would you consider to be too many for a single table, and how would you rearrange the data if asked?
    Any answers?
    sukai

    I have some tables with over 100 million rows that still perform well, and I'm sure much larger tables are possible.  The exact number of rows would vary significantly depending on a number of factors including:
    Power of the underlying hardware
    Use of the table – frequency of queries and updates
    Number of columns and data types of the columns
    Number of indexes
    Ultimately the answer probably comes down to performance – if queries, updates, inserts, index rebuilds, backups, etc. all perform well, then you do not yet have too many rows.
    The best way to rearrange the data would be horizontal partitioning. It distributes the rows across multiple files, which provides a number of advantages, including the potential to perform well with a larger number of rows.
    http://msdn.microsoft.com/en-us/library/ms190787.aspx
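    As a rough illustration of the horizontal partitioning described in that article (SQL Server syntax; the table, column and boundary values are made up for the example):
    -- map an integer partitioning column, e.g. a year, onto partitions
    CREATE PARTITION FUNCTION pf_order_year (int)
        AS RANGE RIGHT FOR VALUES (2006, 2007, 2008);
    -- place all partitions on the PRIMARY filegroup for simplicity
    CREATE PARTITION SCHEME ps_order_year
        AS PARTITION pf_order_year ALL TO ([PRIMARY]);
    -- the table is created on the partition scheme, keyed by order_year
    CREATE TABLE dbo.orders (
        order_id   bigint NOT NULL,
        order_year int    NOT NULL,
        amount     money  NULL,
        CONSTRAINT pk_orders PRIMARY KEY (order_id, order_year)
    ) ON ps_order_year (order_year);
    Each partition can then be placed on its own filegroup and maintained (loaded, switched, rebuilt) independently.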

  • Unable to load the following resource: Error: too many columns- form won't

    Hi ..
    Can you please advise on the following error, which I am facing while opening a data form in Hyperion Planning 11.1.1.4:
    Unable to load the following resource: Error: too many columns- form won't
    Regards,
    Vasu

    The limit is something around 256 if I am not mistaken; I dread to think what a form with 700 columns would be like to work with.
    Cheers
    John
    http://john-goodwin.blogspot.com/
