Need help with complex column creation command

Hello, all
I need help with a complex column creation command, and the SQL Anywhere help is not sufficient for it.
Here is the situation:
I need to write a generic DDL "alter table" command, which can add/modify columns without knowing in advance if they already exist in the destination table.
Is there a command, which looks like:
alter table "table1" add (on existing modify) column1 <datatype> <default> ?
Thank you,
Arcady

Hi.
I don't think this is supported by the ALTER TABLE command, but you can code it inside an IF statement that queries the system catalog tables (sysobjects and syscolumns). Your code should look something like this:
if (select count(*) from sysobjects, syscolumns where sysobjects.id = syscolumns.id and sysobjects.name = 'some_table' and syscolumns.name = 'some_column') < 1
begin
    alter table some_table add some_column numeric(12) not null
end
This is just an example.
Andreas.
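Andreas's check-the-catalog-first idea generalizes to any engine that exposes its schema. A minimal sketch of the same idempotent add-column pattern, using SQLite from Python purely for illustration (table and column names are made up; SQL Anywhere itself would query its own catalog tables as described above):

```python
import sqlite3

def add_column_if_missing(conn, table, column, decl):
    # Query the catalog first; only issue the ALTER when the column is absent.
    existing = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    if column not in existing:
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {decl}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE some_table (id INTEGER)")
add_column_if_missing(conn, "some_table", "some_column", "NUMERIC")
add_column_if_missing(conn, "some_table", "some_column", "NUMERIC")  # second call is a no-op
```

Because the guard runs before the DDL, the script can be re-run safely against a destination table whether or not the column already exists.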

Similar Messages

  • Need help with the restore-mailbox command

    I just started playing around with the restore-mailbox command now that the Recovery Storage Group and exmerge are not options in 2010. I am hoping someone can help me out here. I am working in a test environment. What I did was backup a database with a
    single mailbox on it called testex2010. I used Windows backup. I then restored the database and logs, created the recovery database using the shell and was able to mount the database after running eseutil /R etc...
I now want to recover the data in the mailbox testex2010 to another mailbox called Recovery Mailbox in the production database. I was able to successfully restore testex2010 to itself. I proved this by deleting all the messages in the live database,
running the restore command, and seeing the messages back in the live mailbox.
    The command I used was restore-mailbox -identity testex2010 -RecoveryDatabase RDB1
    I am prompted with the "are you sure" text and after hitting yes, all goes well. I now want to restore to another mailbox in the production database called Recovery Mailbox. This is where it breaks down. No matter how I format the command, I cannot get it
    to work. Two questions:
    1) should I be able to see mailboxes in the Recovery Database using the GUI or the Shell or do you just assume they are there?
    2) what is the command to restore messages from a mailbox in the Recovery Database to another mailbox in the production environment?
Is what I am trying even possible? Just so you know, I did try this command:
    Restore-mailbox -identity testex2010 -RecoveryDatabase RDB1 -RecoveryMailbox "Recovery Mailbox" -TargetFolder testex2010
    It fails telling me that the mailbox Recovery mailbox is not in the Recovery Database.

    Restore-mailbox -identity testex2010 -RecoveryDatabase RDB1 -RecoveryMailbox "Recovery Mailbox" -TargetFolder testex2010
The wording of the Restore-Mailbox command can be tricky. If I look at your example above, this is what it would attempt to do...
    Restore the content of the mailbox with display name "Recovery Mailbox" in the RDB into the mailbox testex2010 under a folder named testex2010.
    Think of the -Identity parameter as the mailbox you want to restore INTO.
Think of the -RecoveryMailbox parameter as "Recovering the Mailbox of..."
    Also get into the habit of using the DisplayName with -RecoveryMailbox.
    Microsoft Premier Field Engineer, Exchange
    MCSA 2000/2003, CCNA
    MCITP: Enterprise Messaging Administrator 2010
    Former Microsoft MVP, Exchange Server
    My posts are provided “AS IS” with no guarantees, no warranties, and they confer no rights.

  • Need Help with complex query and computing values

I have an issue trying to combine data from several tables. I need help computing the "Total Hours", "Max Pressure", and "Average Pressure" while displaying the "Manufacturer",
"Part Type", "Serial Number", "District", "Status", "Truck Type", and "Truck Number" for a certain part on all trucks. I need to be able to check whether the serial number was on
a particular job, and calculate the hours for that serial number if it was on that job and the job date falls between the install date and removal date. I've tried, but I keep getting either
repeating rows or the total hours of the truck instead of the serial number. I've considered doing a pivot to display it but have been having trouble putting it together.
    table 1
    (*records of parts*)
     Contains  serial number,truck number, part type, part number, install date, removal date, status
    table 2
    (*records of Jobs*)
    contains Jobnumber, JobStartdate, Max pressure, average pressure, and Totalhrs
    table 3
    (records related to jobs and trucks)
    contains jobnumber, district , and truck numbers
    Table 4
    (records of manufacturers and part numbers)
    contains partnumber, manufacturer name, and truck type
    I would like to get it to show like below
SerialNo  PrtType     MFG  TruckNo  TrkType  TtlHrs  MaxPr   AvgPr  Status  Dst
AB345     1200series  NGK  2G34     Boss X   400     10,000  9500   NonOp   NE
    Thanks in advance

    Hope this helps
    Note: (Date of Mar 1 2014 was just extended to a further date so the system would know that its still operating when calculating current hours)
    Table 1
    SerialNo    TruckNo  InstallDate RemovalDate      Status       PartNo      PartType
    BJ1002       F917   2013-09-17   2014-03-01   Oper         L871235       BJ 3000 GL
    CWS1002   F104   2012-11-21   2013-03-29   NonOper   L76088-450   CWS 2000
    CWS1003   F104   2013-04-24   2013-08-01   NonOper   L76088-450   CWS 2000
    CWS1005   F187   2012-11-21   2013-04-04   NonOper   L76088-450   CWS 2000
    CWS1006   F187   2013-04-24   2013-06-30   NonOper   L76088-450   CWS 2000
    CWS1007   F187   2013-06-30   2013-03-01   Oper         L76088-450   CWS 2000
    CWS1009   2F60   2013-05-05   2013-03-01   Oper         L76088-450   CWS 2000
    CWS1011   F809   2013-05-28   2013-08-28   NonOper   L76088-400   CWS 2000
    CWS1013   F990   2013-06-11   2013-10-29   NonOper   L76088-450   CWS 2000
    CWS1015   F783   2013-06-28   2013-03-01   Oper         L76088-450   CWS 2000
    Table 2
    JobNumber    Date                 District         PrAvTreat PrMaxTr TotalHrs
    553811287    2012-01-19    Fairmount    7337        8319     1.53
    652110088    2012-08-20    San Antonio  6340       7075      0.47
    652110090    2012-08-21    San Antonio  6134       7131      0.62
    652110091    2012-08-22    San Antonio  6180       2950      0.58
    652110092    2012-08-23    San Antonio  5959       6789      0.64
    652110093    2012-08-23    San Antonio  6165       7466      0.62
    Table 3
    TruckNo District    JobNumber
    1F01      Odessa   10011012329
    1F01      Odessa   10011012333
    1F01      Odessa   10011014831
    1F01      Odessa   10011014834
    1F01      Odessa   10011012332
    1F01      Odessa   10011012328
    1F01      Odessa   10011014829
    Table 4
    PartNumber    Manufacturer  TruckType
    L322020-2     Jimmy Dean   Ford T
    L322738-2     Lucas             Ford T
    L47869-1       Jimmy Dean   Ford T
    L76070-650   NGK               Ford T   
    Sam Howard

  • Need help with af:column and skinning

    Hallo,
    ADF Faces and skinning is in general a great opportunity to get projects done - thx for this.
    - I use a af:table with af:column with sortable="true".
- This works great, but the customer wants to have the sorting icon on the left side of the label.
- I have created my own skin for this and can change the sorting icon, but I cannot place it on the left side of the label:
af|column::sort-ascend-icon {
    content: url(/skins/espa/sort_ab_h.gif);
}
    Has anyone done this? Is this possible?
    Thx, Willi

    Looking at the source gives a clear picture - it can't be done.
Now I am switching over to less complex JSF components and additional implementations.
    Thx, Willi

  • Need help with custom column in BI Publisher

    Hi Guru's
I have started working with BI Publisher recently and need help with the issue below.
Can you please let me know how I can create a custom column, like a percentage based on two existing measures in the report?
I tried creating it in an OBIEE report and used that SQL to create the BI Publisher report, but the result column from OBIEE is not working as expected in BI Publisher.
Can someone please help me with this?
Thanks a lot in advance.

This column can be calculated in the BIP RTF template. But if it is a column inside a FOR loop,
then it may need to be calculated slightly differently.
    Like I said, get the xml data and rtf then send it to me : [email protected]
    and will get it fixed for you.
    thanks
    Jorge

  • Need Help with complex query for productio database

Hello again,
I need your help again, this time for a query that shows me how long every production step takes per order.
See the sample data below and what I expect.
Thank you all for your help.
    We use Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    Here the sample data tables:
CREATE TABLE TABLE_2
(    "ORDER_NR"      VARCHAR2(12)
,    "PRIORITY"      VARCHAR2(2)
,    "WO_STEP"       VARCHAR2(1)
,    "STEP_DATE"     DATE
);
CREATE TABLE TABLE_1
(    "ORDER_NR"      VARCHAR2(12) PRIMARY KEY
,    "PRIORITY"      VARCHAR2(2)
,    "CREATE_DATE"   DATE
,    "ACT_STEP"      VARCHAR2(2)
,    "STEP_DATE"     DATE
,    "EMPLOYEE"      VARCHAR2(5)
,    "DESCRIPTION"   VARCHAR2(20)
);
    INSERT      INTO      TABLE_1      (ORDER_NR,               PRIORITY,      CREATE_DATE,                                                        ACT_STEP,     STEP_DATE,                                                            EMPLOYEE,     DESCRIPTION)
                        VALUES           ('1KKA1T205634',     '12',          TO_DATE('10-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS'),     'U',          TO_DATE('28-FEB-13 12:00:00','DD-MON-RR HH24:MI:SS'),     'W0010',     'CLEAN HOUSE');
    INSERT      INTO      TABLE_1      (ORDER_NR,               PRIORITY,     CREATE_DATE,                                                        ACT_STEP,     STEP_DATE,                                                            EMPLOYEE,     DESCRIPTION)
                        VALUES           ('1KKA1Z300612',     '12',          TO_DATE('08-FEB-13 14:00:00','DD-MON-RR HH24:MI:SS'),     'F',          TO_DATE('20-FEB-13 16:00:00','DD-MON-RR HH24:MI:SS'),     'K0052',     'REPAIR CAR');
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'A',          TO_DATE('12-FEB-13 13:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          '5',          TO_DATE('13-FEB-13 09:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'K',          TO_DATE('13-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          '5',          TO_DATE('13-FEB-13 11:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'K',          TO_DATE('13-FEB-13 12:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          '5',          TO_DATE('13-FEB-13 16:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'C',          TO_DATE('14-FEB-13 08:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'B',          TO_DATE('14-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'E',          TO_DATE('18-FEB-13 13:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'F',          TO_DATE('20-FEB-13 16:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'S',          TO_DATE('21-FEB-13 08:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'R',          TO_DATE('21-FEB-13 09:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'U',          TO_DATE('28-FEB-13 12:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'A',          TO_DATE('12-FEB-13 13:52:42','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          '5',          TO_DATE('13-FEB-13 09:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'K',          TO_DATE('13-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          '5',          TO_DATE('13-FEB-13 11:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'K',          TO_DATE('13-FEB-13 12:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          '5',          TO_DATE('13-FEB-13 16:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'C',          TO_DATE('14-FEB-13 08:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'B',          TO_DATE('14-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'E',          TO_DATE('18-FEB-13 13:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'F',          TO_DATE('20-FEB-13 16:00:00','DD-MON-RR HH24:MI:SS'));
COMMIT;
And here is what I expect from my query:
SYSDATE      28.Feb.13 14:00

ORDER_NR      PRIORITY  CREATE_DATE      STATUS  STATUS_DATE      DESCRIPTION  AGE_1   AGE_2   WAITING  STEP_A  STEP_B  STEP_C  STEP_5  STEP_K  STEP_E  STEP_F  STEP_S  STEP_R
1KKA1T205634  12        10.Feb.13 10:00  U       28.Feb.13 12:00  CLEAN HOUSE  18,083  8,833   2,125    0,833   4,125   0,083   0,750   0,208   2,125   0,666   0,042   7,125
1KKA1Z300612  12        08.Feb.13 14:00  F       20.Feb.13 16:00  REPAIR CAR   20,000  16,042  2,125    0,833   4,125   0,083   0,750   0,208   2,125   0,666
And now the explanation of the query result:
AGE_1 is the difference in days between CREATE_DATE and, if step 'U' exists, its STEP_DATE; if step 'U' is not found in TABLE_2, it should show the difference in days between CREATE_DATE and SYSDATE.
AGE_2 is the difference in days between the STEP_DATE of step 'A' and, if step 'R' exists, its STEP_DATE; if step 'R' is not found in TABLE_2, it should show the difference in days between CREATE_DATE and SYSDATE.
WAITING is the difference in days between CREATE_DATE and the STEP_DATE of step 'A'.
The following columns show how many days the ORDER_NR stays in each STEP; if an ORDER_NR enters the same STEP more than once, the times should be added together.
If the ORDER_NR skips a step, it should show a zero in the specific field.
I hope my explanation is good enough; my English skills are far from good.
Thanks all for your help.
    Greets Reinhard W.

    Hi,
I changed this query:
with t2 as (
            select  t.*,
                    lead(step_date) over(partition by order_nr order by step_date) next_step_date
              from  table_2 t
           )
select  t1.*,
        nvl(
            max(
                case t2.wo_step
                  when 'U' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_1,
        nvl(
            max(
                case t2.wo_step
                  when 'R' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_2,
        sum(
            case t2.wo_step
              when 'B' then t2.next_step_date - t2.step_date
            end
           ) step_b
  from  table_1 t1,
        t2
  where t2.order_nr = t1.order_nr
  group by t1.order_nr,
           t1.priority,
           t1.create_date,
           t1.act_step,
           t1.step_date,
           t1.employee,
           t1.description
To this:
with t2 as (
            select  t.*,
                    lead(step_date) over(partition by order_nr order by step_date) next_step_date
              from  table_2 t
           )
select  t1.order_nr,
        nvl(
            max(
                case t2.wo_step
                  when 'U' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_1,
        nvl(
            max(
                case t2.wo_step
                  when 'R' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_2,
        sum(
            case t2.wo_step
              when 'B' then t2.next_step_date - t2.step_date
            end
           ) step_b
  from  table_1 t1,
        t2
  where t2.order_nr = t1.order_nr
  group by t1.order_nr
then I get the ORA-00979 error.
What's wrong?
I have another question:
How can I handle it if I want to group two or more STEPs into one column?
In this case I want the column STEP_B to contain all the days for STEP 'B' and STEP '5'.
I already tried a + operation, like this:
with t2 as (
            select  t.*,
                    lead(step_date) over(partition by order_nr order by step_date) next_step_date
              from  table_2 t
           )
select  t1.*,
        nvl(
            max(
                case t2.wo_step
                  when 'U' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_1,
        nvl(
            max(
                case t2.wo_step
                  when 'R' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_2,
        round( sum(
            case t2.wo_step
              when 'B' then t2.next_step_date - t2.step_date
            end
           ) +
        sum(
            case t2.wo_step
              when '5' then t2.next_step_date - t2.step_date
            end
           ), 3 ) step_b
  from  table_1 t1,
        t2
  where t2.order_nr = t1.order_nr
  group by t1.order_nr,
           t1.priority,
           t1.create_date,
           t1.act_step,
           t1.step_date,
           t1.employee,
           t1.description
But this returns NULL every time.
    Thank You.
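For what it's worth, the always-NULL result looks like the usual aggregate trap: SUM(CASE ... END) over a group with no matching rows returns NULL, and NULL + anything is NULL, so the total is lost whenever an order has no 'B' rows or no '5' rows. Folding both steps into a single CASE (or wrapping each SUM in NVL) avoids it. A minimal illustration in SQLite via Python, not the poster's schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE steps (step TEXT, days REAL)")
conn.executemany("INSERT INTO steps VALUES (?, ?)",
                 [("5", 1.0), ("5", 2.0)])   # no 'B' rows at all

# Two separate conditional sums added together: the 'B' sum is NULL,
# and NULL + 3.0 is NULL, so the whole expression collapses to NULL.
broken = conn.execute("""
    SELECT SUM(CASE step WHEN 'B' THEN days END)
         + SUM(CASE step WHEN '5' THEN days END) FROM steps
""").fetchone()[0]

# One CASE covering both steps: rows are summed together, no NULL arithmetic.
fixed = conn.execute("""
    SELECT SUM(CASE WHEN step IN ('B', '5') THEN days END) FROM steps
""").fetchone()[0]
```

In the Oracle query above, the equivalent fix would be a single `case when t2.wo_step in ('B', '5') then ...` inside one SUM.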

  • Task Scheduling Script - Need help with passing the scheduled command (variables are not being evaluated)

    Hi Everyone,
    I'm trying to get a simple task scheduler script to work for me and can't get the command I need passed to the scheduler to evaluate properly.
    Here's the script:
    ###Create a new task running $Command and execute it Daily at 6am.
    $TaskName = Read-Host 'What would you like this job to be named?'
    $Proto = Read-Host 'What is the protocol? (FTP/FTPS/SFTP)'
    $User = Read-Host 'What is the user name?'
    $Pwd = Read-Host 'What is the password?'
    $Server = Read-Host 'What is the server address?'
    $NetworkDir = Read-Host 'Please input the network location of the file(s) you wish to send. Refer to documentation for more details.'
    $RemoteDir = Read-Host 'Please input the REMOTE directory to which you will upload your files. If there is none please input a slash'
    $Command = 'winscp.com /command "option batch abort" "option confirm off" "open $Proto://$User:$Pwd@$Server" "put $NetworkDir $RemoteDir" "exit"'
    $TaskAction = New-ScheduledTaskAction -Execute "$Command"
    $TaskTrigger = New-ScheduledTaskTrigger -Daily -At 6am
    Register-ScheduledTask -Action $TaskAction -Trigger $Tasktrigger -TaskName "$TaskName" -User "Administrator" -RunLevel Highest
    Write-Host "$TaskName created to run Daily at $TaskStartTime"
What's messing up is the $Command creation: the command needs to have the quotes around "option blah blah", but if I wrap the whole line in single quotes, the variables inside the "open blah blah" string (which also needs
to be inside quotes) and the "put blah blah" string are not evaluated properly.
I've dorked about with different bracketing and quoting but can't nail down the syntax; could someone point me in the right direction? My Google-fu seems to be lacking when it comes to nailing down this issue.
    Thanks

    Hmmn, closer. I'm getting this error now:
    + $Command = $tmpl -f  $User, $Pwd, $Server, $NetworkDir, $RemoteDir
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : InvalidOperation: (winscp.com /com...t {4} {5}" exit:String) [], RuntimeException
        + FullyQualifiedErrorId : FormatError
    And the command being added to the new task looks like this:
    winscp.com /command "option batch abort" "option confirm off" "open ($Proto)://($User):($Pwd)@($Server)" "put $NetworkDir $RemoteDir" "exit"
    Here's the current state of the script. I get what you're doing to try to bypass the quotes issue, using an array. I'm just not awesome at this yet sooooooo...
    $TaskName = Read-Host 'What would you like this job to be named?'
    $Proto = Read-Host 'What is the protocol? (FTP/FTPS/SFTP)'
    $User = Read-Host 'What is the user name?'
    $Pwd = Read-Host 'What is the password?'
    $Server = Read-Host 'What is the server address?'
    $NetworkDir = Read-Host 'Please input the network location of the file(s) you wish to send. Refer to documentation for more details.'
    $RemoteDir = Read-Host 'Please input the REMOTE directory to which you will upload your files. If there is none please input a slash'
    $tmpl = 'winscp.com /command "option batch abort" "option confirm off" "open {0}://{1}:{2}@{3}" "put {4} {5}" exit'
    $Command = $tmpl -f $User, $Pwd, $Server, $NetworkDir, $RemoteDir
    $TaskAction = New-ScheduledTaskAction -Execute $Command
    $TaskTrigger = New-ScheduledTaskTrigger -Daily -At 6am
    Register-ScheduledTask -Action $TaskAction -Trigger $Tasktrigger -TaskName "$TaskName" -User "Administrator" -RunLevel Highest
    Write-Host "$TaskName created to run Daily at $TaskStartTime"
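One thing that stands out in the listing above: the template defines six placeholders, {0} through {5}, but the -f argument list supplies only five values ($Proto never made it into the list), which would explain the FormatError. A hypothetical Python analogue of the same count mismatch, since str.format uses the same positional-placeholder idea as PowerShell's -f:

```python
# Six placeholders {0}..{5}, mirroring the $tmpl string from the script.
tmpl = 'open {0}://{1}:{2}@{3} put {4} {5}'

# Five arguments: {5} has nothing to resolve to, so formatting fails,
# just as -f reports a FormatError when an index is out of range.
try:
    tmpl.format('user', 'pwd', 'server', 'src', 'dst')
    failed = False
except IndexError:
    failed = True

# Six arguments (protocol restored at the front): formatting succeeds.
ok = tmpl.format('sftp', 'user', 'pwd', 'server', 'src', 'dst')
```

So the likely one-line fix in the script is `$Command = $tmpl -f $Proto, $User, $Pwd, $Server, $NetworkDir, $RemoteDir`.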

  • Help with Datagrid Column Creation

    I'm trying to get a better understanding of how the Spark DataGrid component works so that I can create and extended version for one of my projects. One of the things that really has me stumped is how the datagrid sets up the columns based on the dataprovider when no explicit columns are defined. I've been able to debug through the datagrid initialization up to the point where the setter for dataprovider gets called. At that point my debug trace goes 'down the rabbit hole' so to speak, and when it comes back the columns are initialized but I can't figure out how. Could someone please point me to the Class/function that is actually responsible for creating the columns from the dataprovider if they are not defined explicitly?

    I finally managed to track down the code I needed, it's in the commitProperties method of Grid.as (line 3555). That's the good news. The bad news is I still seem to be missing something. My understanding was, if the datagrid autogenerated the columns, those columns would be replaced if another collection of columns are defined for the datagrid at some later point. For some reason that doesn't seem to be happening for me.
    The datagrid I'm working with is part of a larger component with a skin. The list of columns for the datagrid is bound to an ArrayList defined in the hostComponent of the skin. I can understand why the creation of the columns could happen before the binding from the host triggers, what I don't get is why they aren't overridden later. Any suggestions?

  • Need help with discount column

    Hi,
I have managed to create an invoice after discovering that I had LiveCycle Designer included with Creative Suite 3.
I only have very basic skills in this area, so please bear with me.
    I'm hoping what I am after is easy enough for someone out there to help me with.
    As you can see, I have created the below columns.
The Amount column is...     Show: calculate*    Quantity * Unit Price (with UnitPrice being the binding name under the Object tab).
So what I would like to do is add the discount variable in, and I need to know the script to enable this.
The binding names are Quantity, Description, UnitPrice, ItemDiscount and Amount.
I got somewhere with the Amount column as...       Show: calculate*    Quantity * UnitPrice * ItemDiscount / 100
But this left me with the actual discount in the Amount column rather than the amount minus the discount.
I tried a few other ideas, but couldn't seem to get it right, so I thought it best to ask those who know to give me the right answer.
    The other question is after I open the template with Adobe Reader, and fill it out , is it possible to save the resulting document as a 'standard' PDF?
    So far I have had to export as jpg and then convert to PDF. I'm sure I must be missing something obvious....
    I'm using windows 7 and LiveCycle Designer 8
    Thanks
    Martin
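On the arithmetic itself: Quantity * UnitPrice * ItemDiscount / 100 computes the discount amount, whereas the discounted total is Quantity * UnitPrice * (1 - ItemDiscount / 100). A quick sketch in plain Python (LiveCycle itself would use FormCalc or JavaScript; the names here only mirror the bindings for illustration):

```python
def line_amount(quantity, unit_price, item_discount):
    # item_discount is a percentage, e.g. 25 means 25% off.
    # Multiplying by (1 - pct/100) subtracts the discount from the line total.
    return quantity * unit_price * (1 - item_discount / 100.0)

amount = line_amount(2, 10.0, 25)  # 2 units at $10.00 with 25% off
```

The same expression, written with the binding names, would be Quantity * UnitPrice * (1 - ItemDiscount / 100) in the field's calculate script.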

    Hi,
    Thanks for your help.
    Wasn't sure how to apply the maths to the script and get it to work:)
    They both worked perfectly
Is it possible for the Amount field to be left blank unless data is entered in the other columns?
Right now the Preview PDF has a column of $0.00 under Amount.
As for saving the PDF:
When I go to Save As in Adobe Reader, I get the following message.
I've had a look through both the LiveCycle and PDF documents but have been unable to find any adjustable settings that might change this.
I presume the field display pattern Szzz,zz9.99 is the correct way to remove the $ signs?
    Thanks again
    Martin

  • Need help with complex NAT setup with DHCP forwarding

    This is going to get a little crazy, so try to stick with me... Attached is a very crude network map to clarify as best I can...
I need my Cisco 831 to perform 1-to-1 NAT, but the Inside Global ('Black' LAN hosts as seen by the WAN) addresses must be assigned via DHCP from a DHCP server on the WAN. (This is the complex NAT part.) The internal IP addresses would be assigned by the 831's DHCP.
    In addition, the WAN's DHCP server must see the MAC address of the request as the one that belongs to the host on my LAN - Effectively, the WAN DHCP server should not be able to tell there is NAT going on. (This is the DHCP forward problem)
    Any ideas on how to combine these tasks?
    TIA,
    -Chris Bowles, CCNA


  • Need help with complex query

    Hi All
    I have the following requirement, please see the following table
    BIN_DATA_ID     TRAFFIC_SAMPLE_ID     END_INTV_TIME     BIN_1_DATA     BIN_2_DATA     BIN_3_DATA
    1     539     1100     0     34     19
    2     539     1200     0     65     18
    3     539     1300     0     51     17
    4     539     1400     0     65     27
    5     539     1500     0     99     48
    6     539     1600     0     426     138
    7     539     1700     0     151     62
    8     539     1800     0     80     32
    9     539     1900     0     31     11
    10     539     2000     0     37     11
    11     539     2100     0     24     9
    12     539     2200     0     16     5
    13     539     2300     0     27     12
    14     539     2400     0     55     20
    15     539     100     0     18     9
    16     539     200     0     134     52
    17     539     300     0     230     69
    18     539     400     0     15     7
    19     539     500     0     6     5
    20     539     600     0     47     23
    21     539     700     0     100     41
    22     539     800     0     196     43
    23     539     900     0     81     20
    24     539     1000     0     58     28
    25     539     1100     0     58     24
    26     539     1200     0     60     22
    27     539     1300     0     42     18
    28     539     1400     0     53     15
    29     539     1500     0     107     43
    30     539     1600     0     441     146
    31     539     1700     0     128     34
    32     539     1800     0     67     27
    33     539     1900     0     45     22
    34     539     2000     0     24     13
    35     539     2100     0     16     11
    36     539     2200     0     28     5
    37     539     2300     0     23     8
    38     539     2400     0     73     11
    39     539     100     0     16     3
    40     539     200     0     243     82
    41     539     300     0     121     55
    42     539     400     0     17     4
    43     539     500     0     5     5
    44     539     600     0     41     15
    45     539     700     0     101     34
    46     539     800     0     184     43
    47     539     900     0     69     15
    48     539     1000     0     51     18
    49     539     1100     0     61     25

    The data shown above is for a single TRAFFIC_SAMPLE_ID = 539. What I want is to ignore the first record and, starting from the second record ("END_INTV_TIME" 1200 in this example), find its pair, which in this case is the other 1200, and then compute AVG{(BIN_1_DATA + BIN_2_DATA + BIN_3_DATA) of day 1 + (BIN_1_DATA + BIN_2_DATA + BIN_3_DATA) of day 2}. The same applies to all the remaining records: 1300, 1400, 1500, ...
    Thanks

    Is this what you are looking for?
    Use a self-join with the "<" condition on the ID columns to make sure you don't join a row to itself or get two rows for each join.
    WITH
      csvs as
        ( select  '1,539,1100,0,34,19'    x  from dual union all   
          select  '2,539,1200,0,65,18'    x  from dual union all   
          select  '3,539,1300,0,51,17'    x  from dual union all   
          select  '4,539,1400,0,65,27'    x  from dual union all   
          select  '5,539,1500,0,99,48'    x  from dual union all   
          select  '6,539,1600,0,426,138'  x  from dual union all   
          select  '7,539,1700,0,151,62'   x  from dual union all   
          select  '8,539,1800,0,80,32'    x  from dual union all   
          select  '9,539,1900,0,31,11'    x  from dual union all   
          select  '10,539,2000,0,37,11'   x  from dual union all   
          select  '11,539,2100,0,24,9'    x  from dual union all   
          select  '12,539,2200,0,16,5'    x  from dual union all   
          select  '13,539,2300,0,27,12'   x  from dual union all   
          select  '14,539,2400,0,55,20'   x  from dual union all   
          select  '15,539,100,0,18,9'     x  from dual union all   
          select  '16,539,200,0,134,52'   x  from dual union all   
          select  '17,539,300,0,230,69'   x  from dual union all   
          select  '18,539,400,0,15,7'     x  from dual union all   
          select  '19,539,500,0,6,5'      x  from dual union all   
          select  '20,539,600,0,47,23'    x  from dual union all   
          select  '21,539,700,0,100,41'   x  from dual union all   
          select  '22,539,800,0,196,43'   x  from dual union all   
          select  '23,539,900,0,81,20'    x  from dual union all   
          select  '24,539,1000,0,58,28'   x  from dual union all   
          select  '25,539,1100,0,58,24'   x  from dual union all   
          select  '26,539,1200,0,60,22'   x  from dual union all   
          select  '27,539,1300,0,42,18'   x  from dual union all   
          select  '28,539,1400,0,53,15'   x  from dual union all   
          select  '29,539,1500,0,107,43'  x  from dual union all   
          select  '30,539,1600,0,441,146' x  from dual union all   
          select  '31,539,1700,0,128,34'  x  from dual union all   
          select  '32,539,1800,0,67,27'   x  from dual union all   
          select  '33,539,1900,0,45,22'   x  from dual union all   
          select  '34,539,2000,0,24,13'   x  from dual union all   
          select  '35,539,2100,0,16,11'   x  from dual union all   
          select  '36,539,2200,0,28,5'    x  from dual union all   
          select  '37,539,2300,0,23,8'    x  from dual union all   
          select  '38,539,2400,0,73,11'   x  from dual union all   
          select  '39,539,100,0,16,3'     x  from dual union all   
          select  '40,539,200,0,243,82'   x  from dual union all   
          select  '41,539,300,0,121,55'   x  from dual union all   
          select  '42,539,400,0,17,4'     x  from dual union all   
          select  '43,539,500,0,5,5'      x  from dual union all   
          select  '44,539,600,0,41,15'    x  from dual union all   
          select  '45,539,700,0,101,34'   x  from dual union all   
          select  '46,539,800,0,184,43'   x  from dual union all   
          select  '47,539,900,0,69,15'    x  from dual union all   
          select  '48,539,1000,0,51,18'   x  from dual union all   
          select  '49,539,1100,0,61,25'   x  from dual
        )
    , wdata as
        ( select  to_number(regexp_replace( x , '^(.*),(.*),(.*),(.*),(.*),(.*)$' , '\1' ))  as bin_data_id
               ,  to_number(regexp_replace( x , '^(.*),(.*),(.*),(.*),(.*),(.*)$' , '\2' ))  as traffic_sample_id
               ,  to_number(regexp_replace( x , '^(.*),(.*),(.*),(.*),(.*),(.*)$' , '\3' ))  as end_intv_time
               ,  to_number(regexp_replace( x , '^(.*),(.*),(.*),(.*),(.*),(.*)$' , '\4' ))  as bin_1_data
               ,  to_number(regexp_replace( x , '^(.*),(.*),(.*),(.*),(.*),(.*)$' , '\5' ))  as bin_2_data
               ,  to_number(regexp_replace( x , '^(.*),(.*),(.*),(.*),(.*),(.*)$' , '\6' ))  as bin_3_data
            from  csvs
        )
    SELECT  a.traffic_sample_id
         ,  a.end_intv_time
         ,  ((a.bin_1_data + a.bin_2_data + a.bin_3_data) + (b.bin_1_data + b.bin_2_data + b.bin_3_data)) / 2  as av
      from  wdata   a
      join  wdata   b  on  ( a.traffic_sample_id = b.traffic_sample_id  and
                             a.end_intv_time     = b.end_intv_time      and
                             a.bin_data_id       < b.bin_data_id          )
    order  by 1, 2 ;
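    The self-join pairing logic in the query above can also be sketched in plain Python (illustrative only; the four rows are taken from the sample data):

    ```python
    # Rows: (bin_data_id, traffic_sample_id, end_intv_time, bin1, bin2, bin3)
    rows = [
        (2, 539, 1200, 0, 65, 18),
        (3, 539, 1300, 0, 51, 17),
        (26, 539, 1200, 0, 60, 22),
        (27, 539, 1300, 0, 42, 18),
    ]

    averages = {}
    for a in rows:
        for b in rows:
            # Self-join with "<" on the id so each pair is counted exactly once
            # and a row is never matched to itself.
            if a[1] == b[1] and a[2] == b[2] and a[0] < b[0]:
                total_a = a[3] + a[4] + a[5]
                total_b = b[3] + b[4] + b[5]
                averages[(a[1], a[2])] = (total_a + total_b) / 2

    print(averages)  # → {(539, 1200): 82.5, (539, 1300): 64.0}
    ```

    For END_INTV_TIME 1200, the two daily totals are 83 and 82, so the average is 82.5, matching what the SQL self-join produces.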

  • Need help with File system creation for Oracle DB installation

    Hello,
    I am new to Solaris/Unix system landscape. I have a Sun enterprise 450 with 18GB hard drive. It has Solaris 9 on it and no other software at this time. I am planning on adding 2 more hard drives 18gb and 36gb to accommodate Oracle DB.
    Recently I went through the Solaris Intermediate Sys admin training, knows the basic stuff but not fully confident to carry out the task on my own.
    I would appreciate it if someone could help me with the sequence of steps that I need to perform to
    1. recognize the new hard drives in the system,
    2. format,
    3. partition. What is the normal strategy for partitioning? My current thinking is to use the 36GB and 18GB drives as data drives. This is where I am a little bit lost. Can I make the entire 36GB drive one slice for data? I am not quite sure how this is done in real life, so I need your help.
    4. creating the file system to store the database files.
    Any help would be appreciated.

    Hello,
    Here is a rough idea for HA from my experience.
    The important thing is that the binaries required to run SAP
    must be accessible both before and after a switchover.
    In that respect, the file system itself doesn't matter much,
    but SAP may recommend certain filesystems on Linux;
    please refer to the SAP installation guide.
    I always use reiserfs or ext3.
    For soft links, I also recommend referring to the SAP installation guide.
    In your configuration, the files related to the SCS and the DB are the key.
    Again, those files must be accessible both from hostA and from hostB.
    The easiest way is to share these files via NFS or another shared file system
    so that both nodes can access them,
    and let the clustering software mount and unmount those directories.
    DB binaries, data and logs should be placed in the shared storage subsystem
    (ex. /oracle/*).
    SAP binaries, profiles and so on should be placed in shared storage as well
    (ex. /sapmnt/*).
    You may want to place the binaries on local disk to make sure they
    are always accessible at the OS level, even if the connection to the
    storage subsystem is lost.
    In that case you have to sync the binaries on both nodes manually;
    the easiest way is to just put them on shared storage and mount them!
    Furthermore, you can use the sapcpe function to sync the necessary binaries
    from /sapmnt to /usr/sap/<SID>.
    As for your last question, /sapmnt should be located in the storage
    subsystem, so don't let the storage go down!

  • Need Help with Complex Background Images

    Hey, Macromedia, I mean, Adobe Dreamweaver users. I have a
    dilemma, before you read further, look at this website:
    https://statons.rtotogo.com/rtotogostore/rto_store/sign_in.asp.
    Notice the background images. The light tan stripe at the top, the
    orange/yellow stripe beneath it, and the darker tan color that
    encompasses the rest of the background. When you go to a longer
    page on the site, the top two colors do not repeat, and as the page
    shrinks on smaller pages, so does the darker tan background. How
    can I do this? I've seen it done before, and I read the source
    code, but I am confused as to how I can accomplish this..... Please
    advise! I am in desperate need of your expertise!
    Thanks in advance for the help.
    Chuck

    It's controlled using CSS. Check out the CSS file: the
    backgrounds are set to repeat along the x-axis only.
    cheers
    Ian
    [email protected]
    http://www.edwards-micros.co.uk
    "ChuckRWD" <[email protected]> wrote in
    message
    news:e93ha7$lmi$[email protected]..
    > Hey, Macromedia, I mean, Adobe Dreamweaver users. I have
    a dilemma,
    > before you
    > read further, look at this website:
    >
    https://statons.rtotogo.com/rtotogostore/rto_store/sign_in.asp.
    Notice
    > the
    > background images. The light tan stripe at the top, the
    orange/yellow
    > stripe
    > beneath it, and the darker tan color that encompasses
    the rest of the
    > background. When you go to a longer page on the site,
    the top two colors
    > do
    > not repeat, and as the page shrinks on smaller pages, so
    does the darker
    > tan
    > background. How can I do this? I've seen it done before,
    and I read the
    > source code, but I am confused as to how I can
    accomplish this.....
    > Please
    > advise! I am in desperate need of your expertise!
    >
    > Thanks in advance for the help.
    > Chuck
    >

  • Need help with an index creation

    Hi,
    Oracle 8.0.6,
    Win 2000 server
    SELECT PT_CODE,BUS_UNIT,TOP_BRASS_YN,ROOM_CLASS_CODE,WARD_CODE,
    ADMIT_STATUS_FLAG,PT_ADMIT_DATE,PT_ADMIT_TIME,MD_CODE,REQUEST_NO,REG_NO,
    INP_REG_NO,MOTHER_PT_CODE,CRE_NOTES,PT_DIAGNOSIS_NOTE,
    TMP_CLINIC_DISCHARGE_DATE,ROOM_CODE,BED_CODE,ROWID
    FROM
    A_INP_PATIENTS_ADMT WHERE bus_unit = :1 and admit_status_flag = 'C' and
    nvl(clinical_discharge_yn,'N') not in ('Y','C') and (ward_code = :2 or :3 =
    '007') order by room_code
    call     count       cpu    elapsed       disk      query    current        rows
    ------- ------  --------  ---------  ---------  ---------  ---------  ----------
    Parse        1      0.00       0.03          0          0          0           0
    Execute      1      0.00       0.00          0          0          0           0
    Fetch        4      0.14      26.61       2306      47604          0          33
    total        6      0.14      26.64       2306      47604          0          33
    I want to create a function based index on column nvl(clinical_discharge_yn,'N').
    I issued the following command,
    create index new_index on A_INP_PATIENTS_ADMT (bus_unit, admit_status_flag, nvl(clinical_discharge_yn,'N'), ward_code)
    But I am getting the error ORA-00907: missing right parenthesis.
    Could you please tell me where I am missing the parenthesis?
    Best Regards,

    If I remember correctly, function-based indexes weren't introduced until the 8.1.x releases of Oracle, so they are not available in your 8.0.6 database.

  • Need Help With The 'this.closeDoc' Command

    Is there a way to save (or is there a workaround available) that can be used to save data in a PDF when using the 'this.closeDoc()' command? The 'disclosed' property of the called PDF (in this case: PFSL-18, 06.96.pdf) is set to 'true'.
    What I want to do with the code below is to prompt the user to (YES) save and close the PDF, or (NO) reset (clear) the form before saving and closing the PDF. Currently, on YES the PDF closes but does not save data, and on NO it does not reset and save the cleared PDF before closing. I'm assuming the Doc.saveAs command should be inserted just above the 'this.closeDoc...' lines, but I am not sure how it should be written.
    var cMsg = "Do you want to save changes to ‘PFSL-18, 06.96.pdf’ before closing?";
    var nRtn = app.alert(cMsg,1,3,"PFSL-18");
    //Yes
    if(nRtn==4)
    this.closeDoc("../Mutual Fund Forms/PFSL-18, 06.96.pdf", this);
    else
    //No
    this.resetForm();
    this.closeDoc("../Mutual Fund Forms/PFSL-18, 06.96.pdf", this);

    From the Acrobat JS API Reference:

    bNoSave (optional) — a Boolean value indicating whether to close the document without saving:
    • If false (the default), the user is prompted to save the document if it has been modified.
    • If true, the document is closed without prompting the user and without saving, even if the document has been modified. Be careful in using this feature because it can cause data loss without user approval.

    There is only one parameter for the 'closeDoc' method.
    The Acrobat JS API Reference is a free download from the Acrobat Developer area of their web site.
