More than 1000 columns

In which database can I create a table with more than 1000 columns?

Hi.
I can't imagine a real-life problem where you would need that number of columns.
I can't imagine how you would administer such an object.
I think you are not familiar with the concept of database normalization.
There is a lot of documentation out there about this concept; search the web for "database normalization" and take a minute to read about it.
Regards.
Daniel
[If you think my writing is bad, you must listen to me speaking! ;-) ]
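To make the normalization point concrete, here is a minimal sketch (table and column names are invented for illustration, not taken from the original question): instead of adding one column per repeated value, the repeating group moves into a child table, so the column count stops growing with the data.

-- Unnormalized: one column per reading; every new reading needs a new column
CREATE TABLE machine_readings_wide (
  machine_id NUMBER PRIMARY KEY,
  reading1   NUMBER,
  reading2   NUMBER,
  reading3   NUMBER
  -- ... and eventually you hit the 1000-column limit
);

-- Normalized: the repeating group becomes rows in a child table
CREATE TABLE machines (
  machine_id NUMBER PRIMARY KEY
);

CREATE TABLE machine_readings (
  machine_id NUMBER REFERENCES machines,
  reading_no NUMBER,
  reading    NUMBER,
  CONSTRAINT machine_readings_pk PRIMARY KEY (machine_id, reading_no)
);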

Similar Messages

  • ORA-24335 - cannot support more than 1000 columns - How to solve this?

    Hi,
    I get the error message 'ORA-24335 - cannot support more than 1000 columns' when I try to insert a number of rows into a table with the following code:
    INSERT ALL
    INTO tableA Values ('A', 'B', 'C', 1, 2, 3)
    INTO tableA Values ('D', 'E', 'F', 4, 5, 6)
    INTO tableA Values ('G', 'H', 'I', 7, 8, 9)
    SELECT *
    FROM DUAL
    How do I solve the above?
    What does it really mean? Surely it's not simply that if my table has 10 columns then I can't insert more than 100 rows per statement (1000 columns divided by 10 columns = maximum 100 rows to insert), or is it?
    Is there a better/more appropriate way to insert many rows?
    Should I do my inserts with an OracleTransaction (my application is developed with C# and ASP.NET), as follows:
    BEGIN
      INSERT INTO tableA VALUES ('A', 'B', 'C', 1, 2, 3);
      INSERT INTO tableA VALUES ('D', 'E', 'F', 4, 5, 6);
      INSERT INTO tableA VALUES ('G', 'H', 'I', 7, 8, 9);
      COMMIT;
    END;
    Thx in advance!
    Edited by: user8819407 on 2010-mar-07 15:40

    Hi,
    So how did you solve the problem? Can you please give an example?
    I have the same problem inserting more than 1000 values into my Oracle DB. The table only has 51 columns, but to speed up the insert I use bulk inserts via the INSERT ALL command. I need to insert 100 rows at a time, for a total of 5100 values.
    Example:
    SQLCmd = INSERT ALL INTO MYTABLE (VAL_ID,VAL_NAME,VAL_NUM,VAL_TIME,VAL_CMM,VAL_TIME1,VAL_TRIP,VAL_REMAINING,VAL_AGE,VAL_COUNT,VAL_RSTS,VAL_VOLT,VAL_RF,VAL_DC,VAL_HOPS,VAL_STATUS,VAL_ELAPSED,MODIFY_TIME) VALUES (?,?,?,TO_DATE(?,?),?,TO_DATE(?,?),?,?,?,?,?,?,?,?,?,?,?,SYSTIMESTAMP)
    INTO MYTABLE (VAL_ID,VAL_NAME,VAL_NUM,VAL_TIME,VAL_CMM,VAL_TIME1,VAL_TRIP,VAL_REMAINING,VAL_AGE,VAL_COUNT,VAL_RSTS,VAL_VOLT,VAL_RF,VAL_DC,VAL_HOPS,VAL_STATUS,VAL_ELAPSED,MODIFY_TIME) VALUES (?,?,?,TO_DATE(?,?),?,TO_DATE(?,?),?,?,?,?,?,?,?,?,?,?,?,SYSTIMESTAMP)
    INTO MYTABLE (VAL_ID,VAL_NAME,VAL_NUM,VAL_TIME,VAL_CMM,VAL_TIME1,VAL_TRIP,VAL_REMAINING,VAL_AGE,VAL_COUNT,VAL_RSTS,VAL_VOLT,VAL_RF,VAL_DC,VAL_HOPS,VAL_STATUS,VAL_ELAPSED,MODIFY_TIME) VALUES (?,?,?,TO_DATE(?,?),?,TO_DATE(?,?),?,?,?,?,?,?,?,?,?,?,?,SYSTIMESTAMP)
    INTO MYTABLE (VAL_ID,VAL_NAME,VAL_NUM,VAL_TIME,VAL_CMM,VAL_TIME1,VAL_TRIP,VAL_REMAINING,VAL_AGE,VAL_COUNT,VAL_RSTS,VAL_VOLT,VAL_RF,VAL_DC,VAL_HOPS,VAL_STATUS,VAL_ELAPSED,MODIFY_TIME) VALUES (?,?,?,TO_DATE(?,?),?,TO_DATE(?,?),?,?,?,?,?,?,?,?,?,?,?,SYSTIMESTAMP)
    SELECT 1 FROM DUAL
    SQLVals = 1 22C38299 80700334 04-19-2012 13:55:33 mm-dd-yyyy hh24:mi:ss MCC93000 04/19/2012 13:55:42 mm-dd-yyyy hh24:mi:ss 12 4 0.792191 23 1 0.00 -113.50 13.48 9 1 9
    1 36PR7038 8070EDC2 04-19-2012 12:24:35 mm-dd-yyyy hh24:mi:ss MCC60360 04/19/2012 12:24:41 mm-dd-yyyy hh24:mi:ss 7 4 0.757501 3 2 13.88 -114.80 14.06 5 1 6
    1 42C63512 8050166F 04-19-2012 16:02:50 mm-dd-yyyy hh24:mi:ss MCC52420 04/19/2012 16:02:57 mm-dd-yyyy hh24:mi:ss 10 4 0.778471 8 1 0.00 -122.30 13.05 8 1 7
    1 33MR3076 80803E75 04-19-2012 13:13:16 mm-dd-yyyy hh24:mi:ss MCC60330 04/19/2012 13:13:22 mm-dd-yyyy hh24:mi:ss 13 5 0.636721 28 3 0.00 -122.19 0.70 8 1 6
    Then I call: &DBInsert($sqlCmd, @sqlVals);
    This is the error I get: ORA-24335: cannot support more than 1000 columns.
    I need to insert about 879,500 rows (51 columns per row). I read the values from a text file in sets of about 48,000 and put them into a hash. I tried inserting one row at a time, but it takes too long, about 28 hours! Then I tried the bulk insert with only 6 rows per statement and the total time was about 25 minutes, which is acceptable to us.
    Thanks!
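    A note on what ORA-24335 appears to mean here: the 1000-column limit is, in effect, applied to the whole statement, so with INSERT ALL every INTO clause's columns and bind variables count toward it. 100 rows of 51 columns is 5100 values, which would explain why the statement is rejected, while 6 rows per statement stays comfortably under 1000 and works. Two ways to keep the speed of batching: keep rows-per-statement small enough that rows × columns stays under 1000, or bind whole arrays and execute once. From C#, ODP.NET's array binding (setting ArrayBindCount on the command) does the latter. On the database side, a PL/SQL FORALL does the same thing; here is a minimal sketch, assuming a simple hypothetical table demo_tab(id NUMBER, name VARCHAR2(10)) rather than the poster's actual schema:

    DECLARE
      TYPE t_ids   IS TABLE OF demo_tab.id%TYPE;
      TYPE t_names IS TABLE OF demo_tab.name%TYPE;
      l_ids   t_ids   := t_ids(1, 2, 3);
      l_names t_names := t_names('A', 'B', 'C');
    BEGIN
      -- one parse, one bulk execution, as many rows as the collections hold;
      -- the statement itself only ever carries one row's worth of binds
      FORALL i IN 1 .. l_ids.COUNT
        INSERT INTO demo_tab (id, name) VALUES (l_ids(i), l_names(i));
      COMMIT;
    END;
    /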

  • Handeling more than 1000 columns in where clause

    Hi,
    I am using oracle 9i. We are creating a select query and adding where clause at runtime.
    The number of values in the WHERE clause depends on the results from another enterprise app. There may be any number of values from 0 to N. Up to 1000 values everything works fine, but when it goes beyond 1000 it gives the following error:
    ERROR:
    java.sql.SQLException: ORA-01795: maximum number of expressions in a list is 1000
        at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
        at oracle.jdbc.oci8.OCIDBAccess.check_error(OCIDBAccess.java:2321)
        at oracle.jdbc.oci8.OCIDBAccess.parseExecuteDescribe(OCIDBAccess.java:1255)
        at oracle.jdbc.driver.OracleStatement.doExecuteQuery(OracleStatement.java:2391)
        at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:2672)
        at oracle.jdbc.driver.OracleStatement.executeQuery(OracleStatement.java:572)
    Please help.
    Thanks in advance.

    Thanks Archana,
    You are right.
    I think I have not framed my question right.
    Actually the problem is with "IN", which I am using in the WHERE clause. It works properly up to 1000 expressions, like "select * from foo where fooid in (n0,n1,n2,...,n1000)".
    But it throws the exception once the count goes beyond 1000, like
    "select * from foo where fooid in (n0,n1,n2,...,n1000,n1001,n1002)".
    Thanks,
    Mukesh
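    For anyone hitting the same ORA-01795: the 1000-expression limit applies to a list of literal expressions in a single IN (...); it does not apply to subqueries. The usual workarounds are either to split the list into several IN chunks of at most 1000 joined with OR, or to load the ids into something the query can join against. A sketch of both, reusing the n0, n1, ... placeholders from the post (the temporary table name is invented for illustration):

    -- Workaround 1: chunk the literal list, at most 1000 values per IN
    select * from foo
    where  fooid in (n0, n1, /* ... */ n999)
       or  fooid in (n1000, n1001, n1002);

    -- Workaround 2: stage the ids and use a subquery, which has no 1000 limit
    CREATE GLOBAL TEMPORARY TABLE foo_filter (fooid NUMBER) ON COMMIT DELETE ROWS;

    -- at runtime: insert the ids into foo_filter, then
    select f.*
    from   foo f
    where  f.fooid in (select fooid from foo_filter);

    From JDBC, the second form also avoids building a huge literal SQL string for every call, which is kinder to the shared pool.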

  • Table with more than 1000 columns

    I need to store data from an equipment that logs 1500 parameters every minute. The natural approach would be to create a table where the first column stores the timestamp and the remaining columns the values sampled:
    CREATE TABLE parameters (
    sample_date date,
    param1 float,
    param2 float,
    ...
    param1500 float
    );
    However, since there is a limitation of "just" 1000 columns in Oracle, the table was designed as follows:
    CREATE TABLE parameters (
    sample_date date,
    param_id number(4),
    value float
    );
    An auxiliary table stores the valid parameters and their ids.
    This works fine for storing information; however, it is very difficult to select data in a natural way.
    There are situations where we just need to make a report using a few columns out of the 1500 available. Is it possible to create a view that would make the way we designed the table transparent? Let's say we just need to make a report using params 1,2 and 4. How can we create a view that would return all parameters of a sample in a single row:
    sample_date param1 param2 param4
    just as if we had the parameters stored in individual columns?
    Marco

    Not a very efficient or space-friendly design, doing name-value pairs like that.
    Other methods to consider are splitting those 1500 parameters up into groups of similar parameters, and then having a table per group.
    Another option would be to use "vertical table partitioning" (as opposed to the more standard horizontal partitioning provided by the Oracle partition option) - this can be achieved (kind of) in Oracle using clusters.
    Sooner or later this name-value design is going to bite you hard. It has 1500 rows where there should be only 1 row. It is not scalable, and as you're discovering, it is unnatural to use. I would rather change that table and design sooner than later.
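    On Marco's specific question about a view over a handful of parameters: with the name-value design from the post, a conditional-aggregation pivot is the usual way to get one row per sample. A sketch using the table and columns from the question (the view name is invented; only params 1, 2 and 4 are shown, and each additional parameter is one more CASE line):

    CREATE OR REPLACE VIEW parameters_report AS
    SELECT sample_date,
           MAX(CASE WHEN param_id = 1 THEN value END) AS param1,
           MAX(CASE WHEN param_id = 2 THEN value END) AS param2,
           MAX(CASE WHEN param_id = 4 THEN value END) AS param4
    FROM   parameters
    GROUP  BY sample_date;

    On 11g and later the PIVOT clause can express the same thing more compactly, but note that a view is itself capped at 1000 columns, so this only helps for reports over a subset of the 1500 parameters.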

  • ALV grid display with more than 1000 columns

    Hi Friends,
    I have to prepare a report output which has 1015 columns.
    The user will give 100 weeks of data to retrieve. I have to display the output day-wise.
    100*7 + 315 = 1015 columns.
    I am using ALV grid display for this in 4.6C.
    My question is whether I have to declare the output table type with 1015 fields.
    Is there any other way to do this, without declaring 1015 columns?
    Please guide me to solve this.
    Regards,
    Viji.

    I'm wondering what will happen when your end user presses Ctrl + P and feeds A4 paper to the printer.
    Thomas:
    Maybe the functional consultant is pulling your leg?
    Maybe the OP is pulling our legs, or something further?
    Cheers

  • Can there be more than 1000 columns in a select query (Oracle 11g)?

    I am getting this error: "ORA-01792: maximum number of columns in a table or view is 1000"
    I have a dynamic query where the number of columns can increase according to the user input.
    They can expect more than 1000 columns. Is it possible to fetch more than 1000 columns in a query?
    I appreciate all your help.
    Edited by: user10232912 on Apr 26, 2012 2:07 AM

    >
    They can expect more than 1000 columns.
    Then they are idiots. IMO.
    Open challenge: show me an entity with 1000 attributes and I will show you a flawed data model and a total lack of grasp of the fundamentals of implementing it in a relational database product like Oracle.
    I second that - as someone who once had to ETL a system which had a table with 35,000 fields - that's 35K.
    It was a system which made extensive use of arrays - and arrays of arrays of arrays...
    Paul...

  • Table with more than 35 columns

    Hello All.
    How can one work with a table with more than 35 columns
    on JDev 9.0.3.3?
    My other question is related to this.
    Setting entity bean properties from a session bean brought up the error, but when setting them from inside the EJB, the bug does not appear.
    Is this right?
    Thank you

    Thank you all for reply.
    Here's my problem:
    I have an AS/400 DB2 database, a huge and old one.
    There are many COBOL programs used to communicate with this DB.
    My project is to transfer the database, with the same structure and the same contents, to a Linux/Oracle system.
    I will not remake the COBOL programs. I will use the existing ones on the Linux system.
    So the tables of the new DB should be the same as the old ones.
    That's why I cannot redesign it as a proper relational model. I have to make an exact migration.
    Unfortunately I have some tables with more than 5000 COLUMNS.
    Now my question is:
    Can I modify the parameters of the Oracle DB to make it accept tables and views with more than 1000 columns? If not, is it possible to write a PL/SQL function that simulates a table, one that inserts/updates/selects data from many other small tables (<1000 columns)? In other words, a method that makes the Oracle DB act as if it had a table with a huge number of columns.
    I know it's crazy but any idea please.
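    For what it's worth, the 1000-column limit on tables and views cannot be raised by any parameter, so a one-for-one 5000-column object is not possible in Oracle. What can be done, roughly along the lines described above, is to split the record vertically into several physical tables that share the key and recombine them, either in views (each view is again limited to 1000 columns, so the COBOL record layouts would still have to be split at the interface) or in PL/SQL that reads and writes all the slices. A minimal sketch with invented names and only a couple of columns per slice:

    -- physical storage: vertical slices, each well under 1000 columns
    CREATE TABLE wide_rec_part1 (
      rec_key NUMBER PRIMARY KEY,
      col0001 VARCHAR2(30),
      col0002 VARCHAR2(30)
      -- ... up to roughly 999 columns per slice
    );

    CREATE TABLE wide_rec_part2 (
      rec_key NUMBER PRIMARY KEY,
      col1000 VARCHAR2(30),
      col1001 VARCHAR2(30),
      -- ... and so on for the remaining slices
      CONSTRAINT wide_rec_part2_fk FOREIGN KEY (rec_key) REFERENCES wide_rec_part1
    );

    -- a joined view over (at most 1000 of) the columns; INSTEAD OF triggers
    -- can make such a view updatable if the programs need to write through it
    CREATE OR REPLACE VIEW wide_rec_part1_2 AS
    SELECT p1.rec_key, p1.col0001, p1.col0002, p2.col1000, p2.col1001
    FROM   wide_rec_part1 p1
    JOIN   wide_rec_part2 p2 ON p2.rec_key = p1.rec_key;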

  • Create table with more than 1000 columns

    Hello
    I have 1500+ columns in my select query. How can I create a table from it? My query is:
    create table xyz nologging as
    select a1,a2,a3,a4,a5,b1,b2,b3,b4,b5,................a1500
    from abcd a,bcde b
    where a.a1=b.b1
    In Oracle 10g I can create only 1000 columns in a table.
    Please suggest.

    799301 wrote:
    All my columns are computed from various different tables and I want to create one intermediate table.
    The query below is just an example:
    select count(unique((case when sysdate >45 then video=Y and audio=N then 1 else 0 end))) vd_45,
    count(unique((case when sysdate >105 then video=Y and audio=N then 1 else 0 end))) vd_105,
    count(unique((case when sysdate >195 then video=Y and audio=N then 1 else 0 end))) vd_195,
    count(unique((case when sysdate >380 then video=Y and audio=N then 1 else 0 end))) vd_380,
    count(unique((case when sysdate >45 then video=N and audio=Y then 1 else 0 end))) ad_45,
    count(unique((case when sysdate >105 then video=N and audio=Y then 1 else 0 end))) ad_105,
    count(unique((case when sysdate >195 then video=N and audio=Y then 1 else 0 end))) ad_195,
    count(unique((case when sysdate >380 then video=N and audio=Y then 1 else 0 end))) ad_380
    from abcd a,bcde b
    where a.a1=b.b1;
    Please suggest how to create more than 1000 table columns.
    By creating a proper relation between the "formulated function" and the stored value.
    Simplistically put:
    Do you model invoices as entities INVOICES_2010, INVOICES_2011 and so on?
    Or do you model it as INVOICES that has a year (date) attribute?
    It does not make sense to put attribute data into the name of an entity. Nor does it make sense to put relationship data into the name of an attribute and create 1000+ attributes as a result.
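    The invoices analogy from the reply, written out as DDL (names are illustrative): the year lives in a column, and the per-year or per-threshold figures come from a query rather than from extra tables or extra columns.

    -- anti-pattern: attribute data in the entity name, one table per year
    CREATE TABLE invoices_2010 (invoice_no NUMBER, amount NUMBER);
    CREATE TABLE invoices_2011 (invoice_no NUMBER, amount NUMBER);

    -- normalized: the year is just data
    CREATE TABLE invoices (
      invoice_no   NUMBER PRIMARY KEY,
      invoice_date DATE,
      amount       NUMBER
    );

    -- per-year figures are a query, not a schema change
    SELECT EXTRACT(YEAR FROM invoice_date) AS invoice_year,
           SUM(amount)                     AS total_amount
    FROM   invoices
    GROUP  BY EXTRACT(YEAR FROM invoice_date);

    The same reasoning applies to the vd_45/vd_105/... columns above: the threshold (45, 105, 195, 380 days) is data, so it belongs in a row or a bind variable, not in a column name.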

  • INDIRECT for more than 1000 Rows

    Here's a puzzler!
    Why does this not work for values in G6579 of more than 1000, i.e., for cells below C1000?
    =IF(G6579<=0,CHAR(32),INDIRECT("C"&G6579))
    It works fine for all the rows above 6579, which reference cells above C1000. The cells in column C have values, but at row 6579 and below this formula generates "This formula contains an invalid reference".
    Have I hit a limit on allowable INDIRECT argument values?
    Dick

    Dick Huitema wrote:
    YVAN:
    May you explain why you use CHAR(32) to define a string of one space ?
    DICK:
    I want the cell left blank; if there's a better way, I'm all ears. I might then be able to use the test you recommended (using LEN)... although I haven't needed one so far.
    Further thoughts?
    BARRY:
    Previous thoughts, actually. Yvan, as he mentioned, had suggested using the empty string ( "" ) instead of a space.
    =IF(G6579<=0,"",INDIRECT("C"&G6579))
    Must have had your earplugs in at the time. ;-)
    Yup! And the sleep mask must have been on, too!
    Sorry Yvan... I didn't quite understand what "empty string" meant and thought you were trying to suggest using LEN... which made no sense, of course.
    An empty string is a string which contains nothing, the one defined by ""
    =CHAR(32) defines a string of one space character, the one which is often created using =" "
    In many tables, we need to take care of empty cells.
    The true definition of an empty cell is a cell which contains nothing.
    This status may be checked thanks to the function ISBLANK(theCell).
    Alas, as there is no function allowing us to set the value of a cell to nothing (I'm desperately waiting for a BLANK() function), we have to use the "Canada Dry" blank cell.
    My own choice is to set the content of the cell to the empty string (""). You used the space (" ").
    I described the way to check if a cell is "blank in the extended fashion".
    LEN(of a true empty cell) is zero, because the length of nothing is nil.
    LEN(of a cell containing "") is zero, because it's the definition of the empty string.
    So, the single instruction LEN(theCell) gives us the wanted information.
    If you use the space string, the test is more complex. The one which I described was wrong because it didn't treat the case of a table using both empty strings and space strings.
    It would be
    =IF(OR(LEN(theCell)=0,theCell=" "),theCellIisEmpty,theCellIsNotEmpty)
    Yvan KOENIG (VALLAURIS, France) lundi 12 juillet 2010 19:15:08

  • Row chaining in table with more than 255 columns

    Hi,
    I have a table with 1000 columns.
    I saw the following citation: "Any table with more then 255 columns will have chained
    rows (we break really wide tables up)."
    If I insert a row populated with only the first 3 columns (the others are null), does row chaining occur?
    I tried to insert a row as described above and no row chaining occurred.
    As I understand it, row chaining occurs in a table with 1000 columns only when the populated data exceeds
    the block size OR when more than 255 columns are populated. Am I right?
    Thanks
    dyahav

    user10952094 wrote:
    Hi,
    I have a table with 1000 columns.
    I saw the following citation: "Any table with more then 255 columns will have chained
    rows (we break really wide tables up)."
    If I insert a row populated with only the first 3 columns (the others are null), is a row chaining occurred?
    I tried to insert a row described above and no row chaining occurred.
    As I understand, a row chaining occurs in a table with 1000 columns only when the populated data increases
    the block size OR when more than 255 columns are populated. Am I right?
    Thanks
    dyahav
    Yesterday, I stated this on the forum: "Tables with more than 255 columns will always have chained rows." My statement needs clarification. It was based on the following:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28318/schema.htm#i4383
    "Oracle Database can only store 255 columns in a row piece. Thus, if you insert a row into a table that has 1000 columns, then the database creates 4 row pieces, typically chained over multiple blocks."
    And this paraphrase from "Practical Oracle 8i":
    V$SYSSTAT will show increasing values for CONTINUED ROW FETCH as table rows are read for tables containing more than 255 columns.
    Related information may also be found here:
    http://download.oracle.com/docs/cd/B10501_01/server.920/a96524/c11schem.htm
    "When a table has more than 255 columns, rows that have data after the 255th column are likely to be chained within the same block. This is called intra-block chaining. A chained row's pieces are chained together using the rowids of the pieces. With intra-block chaining, users receive all the data in the same block. If the row fits in the block, users do not see an effect in I/O performance, because no extra I/O operation is required to retrieve the rest of the row."
    http://download.oracle.com/docs/html/B14340_01/data.htm
    "For a table with several columns, the key question to consider is the (average) row length, not the number of columns. Having more than 255 columns in a table built with a smaller block size typically results in intrablock chaining.
    Oracle stores multiple row pieces in the same block, but the overhead to maintain the column information is minimal as long as all row pieces fit in a single data block. If the rows don't fit in a single data block, you may consider using a larger database block size (or use multiple block sizes in the same database). "
    Why not a test case?
    Create a test table named T4 with 1000 columns.
    With the table created, insert 1,000 rows into the table, populating the first 257 columns each with a random 3 byte string which should result in an average row length of about 771 bytes.
    SPOOL C:\TESTME.TXT
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    INSERT INTO T4 (
    COL1,
    COL2,
    COL3,
    ...
    COL255,
    COL256,
    COL257)
    SELECT
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3),
    ...
    DBMS_RANDOM.STRING('A',3)
    FROM
      DUAL
    CONNECT BY
      LEVEL<=1000;
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SET AUTOTRACE TRACEONLY STATISTICS
    SELECT
      *
    FROM
      T4;
    SET AUTOTRACE OFF
    SELECT
      SN.NAME,
      SN.STATISTIC#,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SPOOL OFF
    What are the results of the above?
    Before the insert:
    NAME                      VALUE                                                
    table fetch continue        166
    After the insert:
    NAME                      VALUE                                                
    table fetch continue        166                                                
    After the select:
    NAME                 STATISTIC#      VALUE                                     
    table fetch continue        252        332
    Another test, this time with an average row length of about 12 bytes:
    DELETE FROM T4;
    COMMIT;
    SPOOL C:\TESTME2.TXT
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    INSERT INTO T4 (
      COL1,
      COL256,
      COL257,
      COL999)
    SELECT
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3)
    FROM
      DUAL
    CONNECT BY
      LEVEL<=100000;
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SET AUTOTRACE TRACEONLY STATISTICS
    SELECT
      *
    FROM
      T4;
    SET AUTOTRACE OFF
    SELECT
      SN.NAME,
      SN.STATISTIC#,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SPOOL OFF
    With 100,000 rows each containing about 12 bytes, what should the 'table fetch continued row' statistic show?
    Before the insert:
    NAME                      VALUE                                                
    table fetch continue        332 
    After the insert:
    NAME                      VALUE                                                
    table fetch continue        332
    After the select:
    NAME                 STATISTIC#      VALUE                                     
    table fetch continue        252      33695
    The final test only inserts data into the first 4 columns:
    DELETE FROM T4;
    COMMIT;
    SPOOL C:\TESTME3.TXT
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    INSERT INTO T4 (
      COL1,
      COL2,
      COL3,
      COL4)
    SELECT
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3),
    DBMS_RANDOM.STRING('A',3)
    FROM
      DUAL
    CONNECT BY
      LEVEL<=100000;
    SELECT
      SN.NAME,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SET AUTOTRACE TRACEONLY STATISTICS
    SELECT
      *
    FROM
      T4;
    SET AUTOTRACE OFF
    SELECT
      SN.NAME,
      SN.STATISTIC#,
      MS.VALUE
    FROM
      V$MYSTAT MS,
      V$STATNAME SN
    WHERE
      SN.NAME = 'table fetch continued row'
      AND SN.STATISTIC#=MS.STATISTIC#;
    SPOOL OFF
    What should the 'table fetch continued row' show?
    Before the insert:
    NAME                      VALUE                                                
    table fetch continue      33695
    After the insert:
    NAME                      VALUE                                                
    table fetch continue      33695
    After the select:
    NAME                 STATISTIC#      VALUE                                     
    table fetch continue        252      33695
    My statement "Tables with more than 255 columns will always have chained rows." needs to be clarified:
    "Tables with more than 255 columns will always have chained rows (row pieces) if a column beyond column 255 is used, but the 'table fetch continued row' statistic may only increase in value if the remaining row pieces are found in a different block."
    Charles Hooper
    IT Manager/Oracle DBA
    K&M Machine-Fabricating, Inc.
    Edited by: Charles Hooper on Aug 5, 2009 9:52 AM
    Paraphrase misspelled the view name "V$SYSSTAT", corrected a couple minor typos, and changed "will" to "may" in the closing paragraph as this appears to be the behavior based on the test case.

  • To display text field with more than 1000 length in ALV

    Hi Friends,
           I need to display a material long text field which is more than 1000 characters long in ALV or object-oriented ALV.
    Please help me to resolve the issue.
    Regards,
    Jaya.

    Hi,
    You mean in a single cell of the ALV?
    It is not possible in ALV, as per OSS Note 857823.
    Below is the reason for this; I think you will have to split it across a number of columns.
    Summary
    Symptom
    Entries in cells of the type CHAR or string are truncated after 128 characters in the SAP GUI.
    Other terms
    ALV Grid Control (cl_gui_alv_grid), function module (Full-screen)
    Grid (Reuse_alv_grid_display, SAPLSLVC_FULLSCREEN), SAPGUI, back end, front end
    Reason and Prerequisites
    The data table that is sent to the front end only allows character values with the length 128.
    Solution
    This is the standard system behavior and cannot be changed.

  • Colorization of more than 500 columns in CFBuilder

    Hi,
    For those who faced issues and reported bugs related to colorization not working for more than 500 columns, please refer to the blog post here:
    http://blogs.adobe.com/cfbuilder/2009/07/colorization_of_more_than_500.html
    Thanks,
    Dipanwita
    http://blogs.adobe.com/cfbuilder

    @John Is your response for Open Directory users? Because I can say with certainty that your response is not true for AD searches.  The filter box only whittles down the initial 500 displayed results (or appears to, from my testing).  I can view all 8000 of my groups (yes, I tweaked the 1000 cap for LDAP queries on the AD server) and up to 8000 users in Directory Utility, and even do individual searches for undisplayed users (there are actually 50000 or so) and still get them back.  It seems Directory Utility DOES form LDAP queries correctly and efficiently, whereas Server app does not.  It seems Server app, at least against an AD server, tries to LDAP query for everything, and only filters down the max results it gets back.  Am I wrong?  If I were, I should be able to use the search box and still get the same 8000 groups I do in Directory Utility, which I don't.  Know how to fix?
    To clarify: this only happens when I need to give access permissions to "Services" in the Users or Groups section under Accounts.  Interestingly, looking up the users and groups to set permissions for the actual share point works just fine.
    @Antonio While you're right it doesn't seem Server App can come anywhere close to Active Directory in terms of managing the entire infrastructure, I think you're wrong when you sort of insinuate that it shouldn't be able to work in tandem with it.  Forming the LDAP queries to AD better seems like such a simple thing to fix - nobody is asking it do the same workload as AD.  I work at an educational institution that is AD infrastructure on the whole, but I just happen to manage an Arts college that it's part of and I prefer the Server App + Deploy Studio + Remote Desktop to manage our 90% rate of macs, as opposed to using the SCCM and AD tools I easily could use.  And I doubt I'm the only mac admin in a similar situation.  This is something Apple could and SHOULD easily fix.

  • How to create a template for more than four column conditions?

    Hi
    I want to create a report template for a tabular form that has more than 4 column template conditions (I need more than 12 conditions) to get row coloring based on the conditions.
    So, please tell me how to build a template that meets my requirements.
    APEX VERSION: 4.2
    Database :11gR2.
    Cheers
    Tulasi.

    Tulasi 1243 wrote:
    Hi
    I want to create a report template for a tabular form should have more than 4 column template conditions (more than 12 conditions i need) to get the coloring for row based on the conditions.
    so, please tel me how to do a template to meet my requirements.
    VC wrote: I don't think it is possible using the template column conditions... try using some jQuery code that runs on page load.
    I am sure this has been discussed here before. Search the forum... good luck.
    It is possible using a template, and it has been discussed many times...
    The trick is to combine Tyler's conditional class technique with a custom named column report template.
    Determine a class for the row in the query:
    select
              empno
            , ename
            , sal
            , case
                when sal < 1000 then 'low'
                when sal between 1000 and 2000 then 'medium'
                when sal > 2000 then 'high'
              end sal_class
    from
              emp
    Create a custom named column report template, referencing the class in the row template:
    <tr class="#SAL_CLASS#"><td>#EMPNO#</td><td>#ENAME#</td><td>#SAL#</td></tr>Then create a style sheet to apply the required backgrounds:
    tr.low td { background-color: yellow; }
    tr.medium td { background-color: white; }
    tr.high td { background-color: red; }

  • How to display more than 60 columns in a report

    I have a table defined as follows
    id
    column_name
    column_value
    data sample
    id col_name column_value
    1 col1 val1
    1 col2 val2
    1 col3 val3
    2 col1 val1
    2 col2 val2
    2 col3 val3
    now I want to display the data in a report as follows
    id col1 col2 col3
    1 val1 val2 val3
    2 val1 val2 val3
    I was able to generate output using pivots [http://technology.amis.nl/blog/1197/pivot-dynamic-data]
    The problem is that I can have more than 60 columns retrieved in each row. The pivot solution will retrieve them correctly, but I can't display more than 60 columns in APEX reports. Why this restriction in APEX, and is there any solution?

    Hello:
    In the Source section of the report definition choose 'Use Generic Column Names (parse query at runtime only)' and then specify a suitable value for 'Maximum number of generic report columns:'
    Varad

  • How to create combobox to display more than one columns

    I need some kind help with the following question. A combobox includes two pieces: the textbox and the combobox list. How can I create a combo box bean, based for example on table EMP (empno number(6), ename varchar2(40)) records, that achieves these features:
    1) allow more than one columns to be displayed in its records list--e.g., I need to show these records:
    empno (value) ename (label)
    103 David M Baker
    104 David M Baker
    105 Kelly J Volpe
    106 Krista F Carpenter
    107 Michelle P Silverman
    The two 'David M Baker's are different employees, but unfortunately, with the same name.
    2) allow combo box list to return the column value 'empno' even though it shows both columns as above. i.e., if user picks the second record above, then the combobox list returns 104 to the textbox in the background, but the 'David M Baker' is displayed on the textbox. Of course the combobox list may return 'David M Baker' if needed when there is only one column in the list as the current standard feature.
    3)allow partial match search by typing in some letters. i.e., if user types in the textbox of the combobox letter 'K' or 'k' then the partially matched records
    105 Kelly J Volpe
    106 Krista F Carpenter
    should be automatically displayed in the combobox list, not the whole list as above; then user may double click to choose one of the two or if user continues to type in 'R' or 'r', then the uniquely matched record 'Krista F Carpenter' is displayed in the textbox and the 106 is returned to the textbox.
    4) as a bonus if it's doable, allow combobox to return values to different textboxes when its records list has more than one columns.
    The reason I need these features is that I am working on a project migrating Microsoft Access applications to a centralized Java web application. At the beginning we promised the user community that Java Swing would provide all the user-friendly GUI features Microsoft Access has, but now we are stuck; we ate our words and got tons of complaints from our user community. This is just the most-needed component, posted here. I really hope Java would add all the default user-friendly GUI features to compete with MS, since its Win95 GUI has been accepted as an industry standard and most users are used to it. They claim that they don't know and don't care what tool we use; the newly created application should be more user friendly, not the opposite.
    I would be very much appreciated if any one would help me with this item.

    Thanks for your comments. I think nobody expects Sun to write everything, including special features, for its components. But I do think Sun should provide at least the standard user-friendly features for the GUI components, because most users are used to the user-friendly GUI features provided by Win95 and Access/Excel applications. That would help us productively create applications that beat MS applications.
    Otherwise, like me, to get the GUI features that existed in the old MS Access application into our migrated Java application, I must first re-create the GUI component library, which is a big burden to me at least; for others it might be fun to code it from scratch, but I have to focus on the project timeline.
    If you really can pass the request to Sun and push them to move a bit, please pass along these words: before Sun starts to revise these components, please play around with Windows GUIs, e.g. Access/Excel applications, then plan what to do; the bottom line is to match or beat them in FUNCTIONALITY (look and feel is not my focus here). Don't ignore the influence of Windows, whether you hate it or love it; the reality is that most users are familiar with Windows GUI features, which are accepted as the industry standard. So the choice is to match them or beat them. Don't build your car behind closed doors, and don't assume users will like what you come up with in a closed room.
