Decision Factor: Benefits of Computed Columns

Any computed column can also be computed in a query, so how do you decide when to design a computed column?
In the following BOL example, the hard-wired expression makes it a good candidate for a (server-side) computed column: you don't want developers to get confused and reinvent their own formula for the business rule that calculates RetailValue in client apps:
ALTER TABLE dbo.Products ADD RetailValue AS (QtyAvailable * UnitPrice * 1.5);
Please provide an example with your post. Thanks.
Kalman Toth Database & OLAP Architect

If it is a frequently required computation, you can keep it as a computed column and simply include it in the select list rather than repeating the calculation everywhere. Any future change to the calculation then means changing the computed column definition in a single place instead of every procedure or piece of code that needs it.
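A minimal sketch of the point above, reusing the dbo.Products table from the BOL example (the ProductID column is added here only to make the table self-contained):

```sql
-- Define the business rule once, in the table itself.
CREATE TABLE dbo.Products (
    ProductID    int IDENTITY(1,1) PRIMARY KEY,
    QtyAvailable smallint,
    UnitPrice    money,
    RetailValue  AS (QtyAvailable * UnitPrice * 1.5)
);

-- Clients just select the column; nobody re-implements the formula.
SELECT ProductID, RetailValue FROM dbo.Products;

-- A future rule change is one ALTER, not an edit to every query:
ALTER TABLE dbo.Products DROP COLUMN RetailValue;
ALTER TABLE dbo.Products ADD RetailValue AS (QtyAvailable * UnitPrice * 1.6);
```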
Visakh

Similar Messages

  • Adding complex computed column at run-time fails in PB 12.6

    I support a large 'vintage' application that was originally written in the late 90's and has been migrated from version to version over the years. Currently I'm working to get it running in PB 12.6 Classic, build 3506, migrating it from PB 11.5.
    The first issue I've come across is that PFC treeviews are not working properly.  After some debugging, I determined that the cause of the failure is the PFC code's creation of 'key' column in the datawindows being used to populate each level.
    In the pfc_u_tv.of_createkey function, it looks at the linkages between the levels of a treeviews data, and then crafts an expression, and then uses that to create a computed column, which is subsequently used to uniquely identify the data.
    The expression is a series of string concatenations, such as this:
    expression='String(ctg_cd) + "#$%" + String(app_cd) + "#$%" + String(app_cd) + "#$%" + String(win_id) + "#$%" + String(ctg_cd) + "#$%"'
    This expression is then used in a modify statement to create the new compute key column:
    ls_rc = ads_obj.Modify("create compute(band=detail x='0' y='0' " + &
      "height='0' width='0' name=pfc_tvi_key " + ls_Exp + ")")
    In earlier versions, this works, and when you do a getitemstring afterward you get a long concatenated value that PFC uses to uniquely identify treeview items.
    Such as 'a/r#$%plcy#$%plcy#$%w_m_plcy_fncl_tran#$%a/r#$%'
    However, in PB 12.6, only the first portion of the expression is processed. There is no error returned from the Modify function, and the column is created, but only one part of the expression is evaluated. In this particular case, it results in
    'plcy'
    I just noticed that this isn't the *first* item in the set of concatenated columns and string literals, which is interesting.  In any case, when the computed key values are not correct, the whole method PFC is using to populate and run the treeview falls apart, and you get all children under each parent, and selections on the treeview object do not work properly because the key column values are not unique.
    I can copy the expression value from a debugging session, and use that to create a computed column at design time, pasting the expression in, and naming it the same as the created column (pfv_tvi_key), and this then allows the treeview to work properly.  However, this is a very cumbersome and problematic workaround.  The application has a lot of treeviews, with a very large number of datawindows operating as all the different types of items on different levels of different trees.  I don't think it's a practical workaround, and future maintenance would be very difficult.
    But the workaround still demonstrates that it is not an issue with the syntax of the compute expression, just an issue with the way PowerBuilder handles a column created at run time vs. at design time.
    Has anyone else encountered this issue?  I would think there are a fair number of older apps still around that are using PFC treeviews.
    My next step will be to install PB 12.6 build 4011 and cross my fingers.  Other than that, perhaps try 12.5?

    Updating the PFC layers has started to lead down a rabbit hole of hundreds and hundreds of errors.
    Granted, many of these are probably rooted in a few ancestor issues.
    But I'm not sure this will lead to a solution.
    The problem lies in using a modify statement to create a computed column at run time containing an expression with multiple concatenated values.
    If I look at the pfc_u_tv.of_createkey function from the newer PFC version, I see the same code there:
    ls_rc = ads_obj.Modify("create compute(band=detail x='0' y='0' " + &
      "height='0' width='0' name=pfc_tvi_key " + ls_Exp + ")")
    And the same code above it which creates the expression used in the modify.
    So after doing all the patching for the new PFC version, I somehow suspect I'll still be facing the same datastore.modify problem.
    Not to say that isn't a good thing to do, just not sure it addresses the root of the problem.

  • How to add text labels to computed column totals in a banded report?

    I'm sure I'm missing something obvious here. I have a simple report with a table that consists of 3 DB field columns and 3 computed columns. I have the 'show column total' property enabled and it inserts the totals at the bottom of the computed columns. What I haven't been able to do is insert a label to the left of the computed totals. I tried inserting a blank field and adding my label, but the location is not dynamic - if one grouped item has 3 rows and the next has 4 the inserted field does not flow to the bottom of the table for each group, it displays in the same position for each group.
    What property/option am I not seeing?
    Thanks!
    Edited by: user11293154 on Jun 22, 2009 1:03 PM

    Thanks for the reply. I'm using Interactive Studio version 9.3, but can't find the 'Set Spring' feature. The help file doesn't have a reference to it. There is a Spring method with this example:
    ActiveDocument.Sections["Report"].Body.Tables["Table"].Spring("Chart")
    ActiveDocument.Sections["Report"].Body.Tables["Table"].Spring("Pivot")

  • Interactive Report - Compute columns

    Hi,
    The interactive report query is stored in an APEX view, apex_application_page_ir.
    For example, when computed columns are added at runtime, APEX builds a new query that includes them. I want to know which APEX view holds this query.
    I created an interactive report (APEX 3.2.0) for the following query:
    select
    "EMPNO",
    "ENAME",
    "JOB",
    "MGR",
    "HIREDATE",
    "SAL",
    "COMM",
    "DEPTNO"
    from "EMP"
    At run time, I added a computed column, and now the query is as follows:
    select
    ROWID as apxws_row_pk,
    "EMPNO",
    "ENAME",
    "JOB",
    "MGR",
    "HIREDATE",
    "SAL",
    "COMM",
    "DEPTNO",
    "SAL" *10/100 "APXWS_CC_001",
    count(*) over () as apxws_row_cnt
    from (
    select * from (
    select
    "EMPNO",
    "ENAME",
    "JOB",
    "MGR",
    "HIREDATE",
    "SAL",
    "COMM",
    "DEPTNO"
    from "EMP"
    ) r
    ) r where rownum <= to_number(:APXWS_MAX_ROW_CNT)
    order by ROWID
    Regards
    Mohan

    It is not in the SQL query; it is all in the report_columns column of apex_application_page_ir, in case anyone else lands on this page.

  • REP-1510 Group manager unable to compute column

    Hi all,
    I am using Oracle 6i Report Builder. I have created a matrix report with different groupings and tried to add a serial number (slno) to it. I used a summary column to show the serial number, but it only appears when the user selects a single item (fromitem=01 and toitem=01). When the user selects more than one item (fromitem=01, toitem=10), it shows the error
    'REP-1510 Group manager unable to compute column'.
    How can I solve this issue and create a serial number in my matrix report?

    try this one
       SELECT ROW_NUMBER() OVER (ORDER BY COLUMN_NAME) SLNO FROM TABLE_NAME;
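    For a grouped report, a per-group serial number usually also needs a PARTITION BY clause. A sketch only — the table and column names here are assumed for illustration:

    ```sql
    -- ROW_NUMBER() restarts at 1 for each item group, which is usually
    -- what a per-item serial number in a grouped/matrix report needs.
    SELECT item_cd,
           line_txt,
           ROW_NUMBER() OVER (PARTITION BY item_cd
                              ORDER BY line_txt) AS slno
    FROM   report_lines;
    ```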

  • Errors processing after adding Computed Column to Time.

    Errors in the high-level relational engine. The 'Time' table is based on a named query, and contains one or more computed columns. A table based on a named query cannot contain computed columns. If the computed columns are necessary, add them to the query expression.
    Help me, pls! Thanks!

    Hi CUONGNV0207,
    According to your description, you get this error when processing cube. Right?
    As the error message says, a table based on a named query cannot contain computed columns; if you create a computed column on such a table, it will throw this error. In this scenario, please go to the DSV and check whether that table is set up with a named query.
    You can replace it with a DimTable from your database.
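    Alternatively, the error message itself points to the fix: fold the expression into the named query instead of defining a DSV computed column. A hypothetical sketch (DimTime and its columns are assumed names for illustration):

    ```sql
    -- Sketch: the former DSV computed column becomes an ordinary
    -- expression column inside the named query for the Time table.
    SELECT DateKey,
           FullDate,
           DATEPART(quarter, FullDate) AS CalendarQuarter
    FROM   dbo.DimTime;
    ```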
    Reference:
    cryptic error when processing cube
    Errors processing after adding Computed Column to Time
    If you have any question, please feel free to ask.
    Simon Hou
    TechNet Community Support

  • Passing a query-limit-value to a Computed-Column on the Results Page

    Hello All,
    Is it possible to use a user-selected variable-limit (filter), and pass it to an IF statement in a Computed-Column in a Results Screen.
    Example:
    I have a Computed Column in a Results screen which uses the following logic:
    if ( Accounting_Period == "2" && Ledger == "ACTUALS" ) {Year_to_Date_Actuals} else (0)
    Instead of [Accounting_Period == "2"], I would like to set Accounting_Period == <variable_query_limit>
    How would one do this?
    Thanks!
    smg

    If you are not using a dashboard, you may want to look at the OnPostProcess event and put script there to collect the selected value of the limit and modify the computed column of the Results section.
    I strongly recommend you review the documentation.

  • Creating computes columns

    Hi all,
    Can we create a table with a computed column? For example:
    CREATE TABLE TABCOMM (A NUMBER, B NUMBER, C NUMBER DEFAULT A+B);
    Please help me create a trigger for this.

    Even with a valid DEFAULT value embedded in the table definition, it only applies to INSERT statements, not UPDATE:
    SQL> create table toto (a number, b number, c number default 0);
    Table created.
    SQL> insert into toto(a,b) values (1,2);
    1 row created.
    SQL> insert into toto(a,b,c) values (2,3,null);
    1 row created.
    SQL> select * from toto;
             A          B          C
             1          2          0
             2          3
    2 rows selected.
    SQL> update toto set a=4 where a = 2;
    1 row updated.
    SQL> select * from toto;
             A          B          C
             1          2          0
             4          3
    2 rows selected.
    The C with a NULL value above was not updated with the default value, so why should a trigger take the update into account?
    Nicolas.
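    For completeness, a minimal sketch of the trigger the original poster asked for, keeping C in sync on both insert and update (the table name comes from the thread; the trigger name is an assumed one). On Oracle 11g and later, a virtual column is the closer analogue of a computed column and avoids the trigger entirely:

    ```sql
    -- Sketch: keep C = A + B on INSERT and UPDATE
    -- (trigger name trg_tabcomm_bir is assumed for illustration).
    CREATE OR REPLACE TRIGGER trg_tabcomm_bir
    BEFORE INSERT OR UPDATE ON tabcomm
    FOR EACH ROW
    BEGIN
      :NEW.c := :NEW.a + :NEW.b;
    END;
    /

    -- On Oracle 11g+, a virtual column needs no trigger:
    -- CREATE TABLE tabcomm (a NUMBER, b NUMBER,
    --                       c NUMBER GENERATED ALWAYS AS (a + b) VIRTUAL);
    ```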

  • Computation Columns in IR get lost

    Hi,
    I have an IR with many computed columns in saved reports.
    Whenever the source table structure changes (a column is added, deleted, etc.), all the computed columns are moved to the non-visible side of the reports.
    Is this a bug or a feature? ;)
    Regards
    Raffaello

    Raffaello wrote:
    Thanks for the fast reply.
    But...
    I tried another way using a second server.
    Same application, same id.
    Adding some columns, export the application and import on server one, replacing the application.
    The computed columns don't get lost from the visible side.
    Why does this way work, but the direct way doesn't?

    Just a guess: when you export/import, either the report gets regenerated or something is copied without being invalidated.

  • Difference in computed columns

    I've been trying to resolve issues with persisted computed columns. In the schema compare between the DACPAC and the project you get the "shows difference but no differences highlighted" behavior. So I finally decided to compare the model.xml files, and I see the following. How can I change the DB or DB project to get these to match?
    DB Project:
                        <Element Type="SqlComputedColumn" Name="[dbo].[User].[UsernameLowered]">
                            <Property Name="ExpressionScript">
                                <Value QuotedIdentifiers="True" AnsiNulls="True"><![CDATA[([dbo].[User_UsernameLowered]([Username]))]]></Value>
                            </Property>
                            <Property Name="IsPersisted" Value="True" />
                            <Relationship Name="ExpressionDependencies">
                                <Entry>
                                    <References Name="[dbo].[User_UsernameLowered]" />
                                </Entry>
                                <Entry>
                                    <References Name="[dbo].[User].[Username]" />
                                </Entry>
                            </Relationship>
                        </Element>
    DACPAC extracted from destination DB:
                        <Element Type="SqlComputedColumn" Name="[dbo].[User].[UsernameLowered]">
                            <Property Name="ExpressionScript">
                                <Value><![CDATA[([dbo].[User_UsernameLowered]([Username]))]]></Value>
                            </Property>
                            <Property Name="IsPersisted" Value="True" />
                            <Property Name="IsPersistedNullable" Value="True" />
                            <Relationship Name="ExpressionDependencies">
                                <Entry>
                                    <References Name="[dbo].[User_UsernameLowered]" />
                                </Entry>
                                <Entry>
                                    <References Name="[dbo].[User].[Username]" />
                                </Entry>
                            </Relationship>
                        </Element>
                    </Entry>

    I would also like to know when this bug will be fixed, as I also ran into it today, and strangely enough, right after applying the last SSDT update (v11.1.40706.0).
    It is a very small project, which I just opened to update some stored procedures, and all of a sudden it showed these "IsPersistedNullable" differences between the database and the project.
    Can the reason be that the "Allow Nulls" checkboxes are ticked in SSMS and unticked in the SSDT designer for the computed columns?
    (Neither can be changed, by the way, so it clearly is a bug.)
    This bug was also reported on MSConnect, but expertly bounced off as a non-issue:
    https://connect.microsoft.com/SQLServer/feedback/details/797222/ssdt-schema-compare-of-tables-with-computed-columns
    /Charles

  • Issue with Replication Latency with computed columns

    We have transactional replication enabled and there are 15 tables that are replicated. Out of 15 there are 6 tables where we have computed columns that are persisted. When ever there are several writes, if the actual transaction size is 100 MB on the publisher,
    the log reader agent is queuing up almost 30-40 GB of data and the latency is significantly increasing and the transaction log is getting held up by REPLICATION in log_reuse_wait. 
    An example schema for a table is
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    SET ANSI_PADDING ON
    GO
    CREATE TABLE [dbo].[address](
    [address_id] [int] IDENTITY(1,1) NOT NULL,
    [crm_entity_id] [int] NOT NULL,
    [address_title] [varchar](600) NOT NULL,
    [address1] [varchar](300) NULL,
    [address2] [varchar](300) NULL,
    [address3] [varchar](300) NULL,
    [city] [varchar](300) NULL,
    [state_name] [varchar](15) NULL,
    [state_non_roman] [varchar](300) NULL,
    [postal_code] [varchar](60) NULL,
    [district] [varchar](15) NULL,
    [country] [varchar](15) NULL,
    [country_non_roman] [varchar](150) NULL,
    [non_roman] [char](1) NOT NULL,
    [is_primary] [char](1) NOT NULL,
    [parent_address_id] [int] NULL,
    [vat_supply_to] [char](1) NOT NULL,
    [created_by] [char](8) NOT NULL,
    [created_time] [datetime] NOT NULL,
    [modified_by] [char](8) NOT NULL,
    [modified_time] [datetime] NOT NULL,
    [address_title_uni] AS (case when [address_title] IS NULL then NULL else CONVERT([nvarchar](200),[dbo].[udfVarBinaryToUTF16](CONVERT([varbinary](600),[address_title],0)),0) end) PERSISTED,
    [address1_uni] AS (case when [address1] IS NULL then NULL else CONVERT([nvarchar](100),[dbo].[udfVarBinaryToUTF16](CONVERT([varbinary](300),[address1],0)),0) end) PERSISTED,
    [address2_uni] AS (case when [address2] IS NULL then NULL else CONVERT([nvarchar](100),[dbo].[udfVarBinaryToUTF16](CONVERT([varbinary](300),[address2],0)),0) end) PERSISTED,
    [address3_uni] AS (case when [address3] IS NULL then NULL else CONVERT([nvarchar](100),[dbo].[udfVarBinaryToUTF16](CONVERT([varbinary](300),[address3],0)),0) end) PERSISTED,
    [city_uni] AS (case when [city] IS NULL then NULL else CONVERT([nvarchar](100),[dbo].[udfVarBinaryToUTF16](CONVERT([varbinary](300),[city],0)),0) end) PERSISTED,
    [state_non_roman_uni] AS (case when [state_non_roman] IS NULL then NULL else CONVERT([nvarchar](100),[dbo].[udfVarBinaryToUTF16](CONVERT([varbinary](300),[state_non_roman],0)),0) end) PERSISTED,
    [postal_code_uni] AS (case when [postal_code] IS NULL then NULL else CONVERT([nvarchar](20),[dbo].[udfVarBinaryToUTF16](CONVERT([varbinary](60),[postal_code],0)),0) end) PERSISTED,
    [country_non_roman_uni] AS (case when [country_non_roman] IS NULL then NULL else CONVERT([nvarchar](50),[dbo].[udfVarBinaryToUTF16](CONVERT([varbinary](150),[country_non_roman],0)),0) end) PERSISTED,
    CONSTRAINT [pk_address] PRIMARY KEY CLUSTERED (
    [address_id] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
    SET ANSI_PADDING OFF
    GO
    ALTER TABLE [dbo].[address] WITH CHECK ADD CONSTRAINT [fk_address] FOREIGN KEY([crm_entity_id])
    REFERENCES [dbo].[crm_entity] ([crm_entity_id])
    GO
    ALTER TABLE [dbo].[address] CHECK CONSTRAINT [fk_address]
    GO
    ALTER TABLE [dbo].[address] WITH CHECK ADD CONSTRAINT [fk_address2] FOREIGN KEY([parent_address_id])
    REFERENCES [dbo].[address] ([address_id])
    GO
    ALTER TABLE [dbo].[address] CHECK CONSTRAINT [fk_address2]
    GO
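    As a first diagnostic step, an inventory of which tables carry persisted computed columns can be pulled from the standard catalog views, to confirm the six affected tables against the list of replicated articles:

    ```sql
    -- List persisted computed columns and their definitions.
    SELECT OBJECT_SCHEMA_NAME(cc.object_id) AS schema_name,
           OBJECT_NAME(cc.object_id)        AS table_name,
           cc.name                          AS column_name,
           cc.definition
    FROM   sys.computed_columns AS cc
    WHERE  cc.is_persisted = 1
    ORDER  BY table_name, column_name;
    ```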


  • Computed Column specification issue

    I have an expression that produces "Error validating formula" for a computed column specification in table design. I am new to such statements. Can you help?
    CAST(TRANS_ENTRY_DATE AS DATE)
    Thanks

    Hi Purval,
    One of the values in column Trans_entry_date cannot be cast to a date; that is why you are getting the error.
    Try something like this:
    CASE WHEN ISDATE(TRANS_ENTRY_DATE) = 1 THEN CAST(TRANS_ENTRY_DATE AS DATE) ELSE NULL END
    If you are using SQL Server 2012 or later, you can take advantage of the TRY_CONVERT function:
    TRY_CONVERT(date, TRANS_ENTRY_DATE)
    Here is msdn link for same => http://technet.microsoft.com/en-us/library/hh230993.aspx
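    A minimal sketch of applying that suggestion as an actual computed column (the table name dbo.Trans is assumed for illustration):

    ```sql
    -- TRY_CONVERT returns NULL instead of raising an error when a
    -- value cannot be converted, so the column definition validates.
    ALTER TABLE dbo.Trans
        ADD Trans_Entry_Dt AS TRY_CONVERT(date, TRANS_ENTRY_DATE);
    ```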
    Regards Harsh

  • Pivot table total on a computed column not showing when using a filter

    I have this strange issue. I will try to explain.
    I have three columns in my report - Delivered Qty, Opened Quantity, and the third one is computed that is (Delivered Qty - Opened Qty). The data is displayed in a table by quarter. So basically I have Quarter, Delivered Qty, Open Qty, Not Opened Qty. I also have the total row. A pie chart shows the opened vs. un-opened quantities. It all works fine.
    The issue is when I add a prompt and the corresponding filter, the total for the computed field is blank. This happens whether it is pivot table or a regular table. That causes the pie chart to show 100% for the Opened qty.
    When I remove the filter, it works again. I can see the total for the computed field and the pie chart looks great.
    Can anyone please help me? I have spent a number of hours trying to play around with this and also looked in the forum to see if anyone might have already discussed a similar issue in the past. I could not find any.
    Thanks.

    I found the issue with my table total not showing up. Basically the new materialized view had its own data table alias. However, in the filter I was using date columns from different date table alias (copy and paste problem). As soon as I changed to the right date columns from the associated alias table, the row total showed up.
    Thanks anyways for your support.

  • How a report will dynamically take a decision whether to display a column?

    Hi,
    I have a query in which there is a selection called "Uncategorized Expense".
    I want that column to be displayed only if it contains non zero entries i.e. dynamic decision should be made for display.
    How can I achieve this in query.
    I am working in BW3.5.
    Thanks,
    Rashmi.

    Hi,
    You could also try creating an exception specifying that Uncategorized Expense is not equal to zero. However, if more than one key figure exists in the query, this may suppress those as well.
    Kind regards,
    Larry

  • Computed column and design mode

    I have a computed field in my ViewObject (the town name returned by a stored procedure from the town code), and I want to create a RowSetInfo object with the abovementioned view as QueryInfo.
    In the QueryInfo window, if I select this view as the source, I cannot navigate to the Selection Order tab. The tab is active, but the definition is visible: http://195.228.152.68/~tib/query.gif
    The expression for the townname is the following:
    town_sp.to_name(Partner.towncode)
    If I cannot use computed field like this, what can I do?
    Thx:
    Tib

    Hi!
    I have read about SWC and Library Projects and have tried it out, but I need some schema how to extend DataGrid. Which methods to override.
    For example: Where should I add columns, how to add inspectable properties, like for example visible column1 or visible column2, and then how to refresh preview in Design Mode.
    I'm searching for book about writing custom component with DataGrid example. Found one or two books on amazon, but nothing about DataGrid.
    What I'm trying to do is to add Remote Object to my DataGrid that will pull some data from server and then it will populate it.
    It is not so hard to do in standard application, but I want to try to do it inside a custom component.
    Any samples will be cool
    Thanks for help to all!
