Opinions Needed

Hi, I would like everyone to give feedback on what I have so
far. What do you think of the colors used? The site isn't done yet
because I'm waiting on the updated info.
Here is the link to the new site I redid so far:
http://www.geocities.com/mrnitemare09
If you want to see what they have now, here's that link:
http://www.adaptivesportsman.org
thanks
joe

Joe
Since you asked...
Really rough. Picture backgrounds came and went in style very
quickly because they are distracting and provide no service, plus
they slow things down.
That green background with reversed text color is hard on the
eyes.
Everything is so big it rolls off my screen (I have a 24"
monitor).
On another note, I belong to a group that is also a 501(c)(3)
in PA. We have taken people hunting, given many youths their
first taste of the outdoors, and donated quite a bit of venison
to local charities. Here in PA we have an overpopulation of
whitetail, and some of the homeowners have gotten together to
ask us to help them.
I have a site that I put up some time back. It's small, just
for a presence, but feel free to look at it. It is pretty old
and on the list of things I should get around to updating.
http://www.delchester.org
Gary
"Joseph Kunz" <[email protected]> wrote in
message
news:gl2fm7$lk3$[email protected]..
> hi i would like everyone to give feedback on what i have
so far. what
> do you
> think of the colors used? the site isnt done yet because
waiting on the
> updated info.
> here is the link to the new site i redid so far. -
>
http://www.geocities.com/mrnitemare09
>
>
>
> if you wanna see what they have now heres that link. -
>
http://www.adaptivesportsman.org
>
> thanks
>
> joe
>

Similar Messages

  • Reconfigure/Expand/Replace HD's in G5 Opinions Needed

    Hi,
    For some time I have needed to update the hard drive configuration on my dual 2.0 G5. I need more space and adequate backup in an organized and efficient setup. I'm a photographer and graphic designer, so I work with large files.
    Currently Running:
    (2) internal 160GB drives (Operating System Drive/Work Files Drive)
    (1) External 600GB RAID for permanent and continuous backup (two 300GB drives working as one 600GB drive; I would rather not run this configuration, because failure of one drive would take out the entire volume. I did not realize that when I bought it. I was not too informed at the time, but I am now.)
    What I am considering doing:
    Replace internal work drive with 1.5TB to 2TB and mirror to external drive.
    The other option for the work drive would be an external RAID setup: 2TB (two 1TB drives acting as one), mirrored to an identical setup. I'm hoping this would still give me speed when mirrored. Maybe not? The main question is speed in the mirrored backup setup. I work with and need to store large image files (RAW and Photoshop). I'm worried a mirrored RAID would slow the workflow. Has anyone using a mirrored RAID experienced slower speeds?
    I would also like to reinstall Leopard on a clean drive and start over with the operating system drive, as there seem to be some quirks with it, though it still runs okay. Is this a good idea? How do you migrate all of your applications over? Can you just drag and drop? Someone told me you should have your applications on a separate disk anyway. And while we're on it, what drive should you set up as a scratch disk for Photoshop? Yeah, I'm getting off topic a bit, but if anyone can help, I would appreciate it.
    The external I figure I would set up as a 300GB mirror for my wife, who is an artist and needs a drive to hold all the files of her work that I keep on my drive for reproduction.
    Feel free to ask for further details. Hope to get some opinions. And by the way, what kind of drive would you recommend? I'm seeing a lot of recommendations for Western Digital.
    Thanks in advance,
    CK

    I would not use mirror arrays for anything.
    A couple backups, yes.
    2 x 300GB drives in a RAID 1 mirror = 300GB anyway, so that sounds like you ran a RAID 0 stripe.
    Mirror internal to external? No, and you would need a SATA PCI-X controller.
    What I would configure:
    5 x WD internal drives with 3 running off 4-channel controller, drives mounted in Sonnet G5 Jive. Or 1TB drives.
    External - 4 channel SATA controller, FirmTek 5PM case, 4 x 1TB drives.
    WD Caviar 640GB/1TB are popular and fast.
    Hitachi Enterprise 1TB and 2TB (2TB is new and darn expensive).
    http://www.barefeats.com/hard125.html
    You want a faster boot drive, then you might want bootable SATA controller and 2 x WD 10K VelociRaptors. Fast scratch volume, 2-3 WD Caviar 640s.
    The OS boot drive should only hold the system and applications and doesn't need to be larger than 150-300GB, so a WD 10K works, or format just half of a 640GB.
    Disable Spotlight on the boot drive and the scratch volume, and don't use the boot drive as the primary (or any) scratch volume; it will be doing enough as is.
    Scratch: the same applies. Striping the outer half of two or more drives is also sometimes used for the best performance.
    Just replacing those old, small, slow 160s would make a good improvement, and they are old enough to be on their last legs.
    Where you used "mirror" I would just use SuperDuper or Carbon Copy Cloner to make backup images.
    CS3: http://kb2.adobe.com/cps/401/kb401089.html
    CS4: http://kb2.adobe.com/cps/404/kb404440.html
    http://www.macgurus.com/guides/photoshopguide.php
    http://homepage.mac.com/boots911/.Public/PhotoshopAccelerationBasics2.4W.pdf
    http://macperformanceguide.com/OptimizingPhotoshop-Configuration.html
    Quad G5 vs 2008 Mac Pro (dated, but the real gap keeps growing):
    http://www.barefeats.com/harper.html

  • RMAN Copy to DEV Procedure - Opinions Needed

    Hi All,
    Wanted some opinions/advice on a problem I'm having. My goal is to have a nice easy script (that I can give to an inexperienced DBA) that copies a prod database to DEV. I've been googling around and found similar problems, but none seem to apply to me, and the procedures given seem to be missing things... The details are:
    10.2.0.4 on both sides. Prod is in archivelog mode. Backups run to tape normally, but since we have no NetBackup license for DEV, I'm doing a special backup to disk. The backup script is:
    export ORACLE_HOME=/u01/app/oracle/product/10.2
    export PATH=$ORACLE_HOME/bin:$PATH
    export ORACLE_SID=PLTPRD01
    rman nocatalog target=sys/prodpw@PLTPRD01 log=/home/oracle/logs/full_rman_to_disk.log <<-EOF
       configure backup optimization on;
       configure default device type to disk;
       configure device type disk parallelism 1 backup type to compressed backupset;
       configure datafile backup copies for device type disk to 1;
       configure maxsetsize to unlimited;
       run {
         allocate channel ch1 type Disk maxpiecesize = 1900M;
         allocate channel ch2 type Disk maxpiecesize = 1900M;
         allocate channel ch3 type Disk maxpiecesize = 1900M;
         allocate channel ch4 type Disk maxpiecesize = 1900M;
         allocate channel ch5 type Disk maxpiecesize = 1900M;
         allocate channel ch6 type Disk maxpiecesize = 1900M;
         backup full database noexclude
         include current controlfile
         format '/u10/PLTPRD01/backup/datafile_%s_%p.bak'
         tag 'datafile_daily';
       }
       run {
         allocate channel ch1 type Disk maxpiecesize = 1900M;
         allocate channel ch2 type Disk maxpiecesize = 1900M;
         backup archivelog all
         format '/u10/PLTPRD01/backup/archivelog_%s_%p.bak'
         tag 'archivelog_daily';
       }
       run {
          allocate channel ch1 type Disk maxpiecesize = 1900M;
          backup format '/u10/PLTPRD01/backup/controlfile_%s.bak' current controlfile;
       }
       quit
    EOF
    That gives me the following files:
    -rwxrwSrwt  1 oracle dba  494244352 Dec  6 01:43 archivelog_6007_1.bak
    -rwxrwSrwt  1 oracle dba  373885952 Dec  6 01:42 archivelog_6008_1.bak
    -rwxrwSrwt  1 oracle dba    2048000 Dec  6 01:43 controlfile_6010.bak
    -rwxrwSrwt  1 oracle dba 1990311936 Dec  6 00:54 datafile_5999_1.bak
    -rwxrwSrwt  1 oracle dba 1990213632 Dec  6 01:15 datafile_5999_2.bak
    -rwxrwSrwt  1 oracle dba 1990303744 Dec  6 01:35 datafile_5999_3.bak
    -rwxrwSrwt  1 oracle dba  388759552 Dec  6 01:39 datafile_5999_4.bak
    -rwxrwSrwt  1 oracle dba 1990254592 Dec  6 00:51 datafile_6000_1.bak
    -rwxrwSrwt  1 oracle dba 1990287360 Dec  6 01:12 datafile_6000_2.bak
    -rwxrwSrwt  1 oracle dba 1990328320 Dec  6 01:33 datafile_6000_3.bak
    -rwxrwSrwt  1 oracle dba  567746560 Dec  6 01:38 datafile_6000_4.bak
    -rwxrwSrwt  1 oracle dba 1990213632 Dec  6 00:52 datafile_6001_1.bak
    -rwxrwSrwt  1 oracle dba 1990221824 Dec  6 01:13 datafile_6001_2.bak
    -rwxrwSrwt  1 oracle dba 1990336512 Dec  6 01:36 datafile_6001_3.bak
    -rwxrwSrwt  1 oracle dba   60645376 Dec  6 01:37 datafile_6001_4.bak
    -rwxrwSrwt  1 oracle dba 1990230016 Dec  6 00:52 datafile_6002_1.bak
    -rwxrwSrwt  1 oracle dba 1990344704 Dec  6 01:12 datafile_6002_2.bak
    -rwxrwSrwt  1 oracle dba 1990311936 Dec  6 01:32 datafile_6002_3.bak
    -rwxrwSrwt  1 oracle dba  233996288 Dec  6 01:34 datafile_6002_4.bak
    -rwxrwSrwt  1 oracle dba 1371709440 Dec  6 00:45 datafile_6003_1.bak
    -rwxrwSrwt  1 oracle dba 1990361088 Dec  6 00:50 datafile_6004_1.bak
    -rwxrwSrwt  1 oracle dba 1990230016 Dec  6 01:11 datafile_6004_2.bak
    -rwxrwSrwt  1 oracle dba 1990230016 Dec  6 01:31 datafile_6004_3.bak
    -rwxrwSrwt  1 oracle dba  713326592 Dec  6 01:38 datafile_6004_4.bak
    -rwxrwSrwt  1 oracle dba    2048000 Dec  6 00:45 datafile_6005_1.bak
    So, I move over to the target server and database (UAT in this case) and run a restore using those files. But it won't restore the archivelog files, and gives the old "RMAN-06102: no channel to restore a backup or copy of log thread" error. I can recover by copying over the relevant archivelogs and doing a "recover using backup controlfile until cancel", but I wondered why the restore didn't recover automatically for me. The restore is as follows:
    rman nocatalog target sys/prodpw@PLTPRD01 AUXILIARY sys/UATpw@UAT cmdfile='/home/oracle/scripts/oracle/duplicate_database.rman' log=/home/oracle/logs/UAT_restore_201112080930.log
    (The script is:)
    RUN {
      ALLOCATE AUXILIARY CHANNEL aux1 TYPE DISK FORMAT '/u10/PLTPRD01/backup/datafile_%s_%p.bak';
      ALLOCATE AUXILIARY CHANNEL aux2 TYPE DISK FORMAT '/u10/PLTPRD01/backup/datafile_%s_%p.bak';
      ALLOCATE AUXILIARY CHANNEL aux3 TYPE DISK FORMAT '/u10/PLTPRD01/backup/datafile_%s_%p.bak';
      ALLOCATE AUXILIARY CHANNEL aux4 TYPE DISK FORMAT '/u10/PLTPRD01/backup/datafile_%s_%p.bak';
      ALLOCATE AUXILIARY CHANNEL aux5 TYPE DISK FORMAT '/u10/PLTPRD01/backup/datafile_%s_%p.bak';
      ALLOCATE AUXILIARY CHANNEL aux6 TYPE DISK FORMAT '/u10/PLTPRD01/backup/datafile_%s_%p.bak';
      ALLOCATE AUXILIARY CHANNEL aux7 TYPE DISK FORMAT '/u10/PLTPRD01/backup/archivelog_%s_%p.bak';
      SET NEWNAME FOR DATAFILE  1 TO '/u07/oradata/UAT/system01.dbf';
      SET NEWNAME FOR DATAFILE  2 TO '/u07/oradata/UAT/undotbs01.dbf';
      SET NEWNAME FOR DATAFILE  3 TO '/u07/oradata/UAT/sysaux01.dbf';
      SET NEWNAME FOR DATAFILE  4 TO '/u07/oradata/UAT/users01.dbf';
      SET NEWNAME FOR DATAFILE  5 TO '/u07/oradata/UAT/bidata_index.dbf';
      SET NEWNAME FOR DATAFILE  6 TO '/u07/oradata/UAT/bistage_constraint.dbf';
      SET NEWNAME FOR DATAFILE  7 TO '/u07/oradata/UAT/bistage_data.dbf';
      SET NEWNAME FOR DATAFILE  8 TO '/u07/oradata/UAT/bistage_index.dbf';
      SET NEWNAME FOR DATAFILE  9 TO '/u07/oradata/UAT/bss_constraint.dbf';
      SET NEWNAME FOR DATAFILE 10 TO '/u07/oradata/UAT/bss_data.dbf';
      SET NEWNAME FOR DATAFILE 11 TO '/u07/oradata/UAT/bss_index.dbf';
      SET NEWNAME FOR DATAFILE 12 TO '/u07/oradata/UAT/bussrules_constraint.dbf';
      SET NEWNAME FOR DATAFILE 13 TO '/u07/oradata/UAT/bussrules_data.dbf';
      SET NEWNAME FOR DATAFILE 14 TO '/u07/oradata/UAT/bussrules_index.dbf';
    (plus lots more lines of datafile mappings)
      SET NEWNAME FOR TEMPFILE 1 TO '/u07/oradata/UAT/temp01.dbf';
      SET NEWNAME FOR TEMPFILE 2 TO '/u07/oradata/UAT/temp02.dbf';
      DUPLICATE TARGET DATABASE TO UAT
      PFILE='/u01/app/oracle/product/10.2/dbs/initUAT.ora'
      NOFILENAMECHECK
      LOGFILE GROUP 1 ('/u03/redo/oradata/UAT/redo1a.dbf',
                       '/u04/redo/oradata/UAT/redo1b.dbf') SIZE 500M REUSE,
              GROUP 2 ('/u03/redo/oradata/UAT/redo2a.dbf',
                       '/u04/redo/oradata/UAT/redo2b.dbf') SIZE 500M REUSE,
              GROUP 3 ('/u03/redo/oradata/UAT/redo3a.dbf',
                       '/u04/redo/oradata/UAT/redo3b.dbf') SIZE 500M REUSE,
              GROUP 4 ('/u03/redo/oradata/UAT/redo4a.dbf',
                       '/u04/redo/oradata/UAT/redo4b.dbf') SIZE 500M REUSE,
              GROUP 5 ('/u03/redo/oradata/UAT/redo5a.dbf',
                       '/u04/redo/oradata/UAT/redo5b.dbf') SIZE 500M REUSE,
              GROUP 6 ('/u03/redo/oradata/UAT/redo6a.dbf',
                       '/u04/redo/oradata/UAT/redo6b.dbf') SIZE 500M REUSE,
              GROUP 7 ('/u03/redo/oradata/UAT/redo7a.dbf',
                       '/u04/redo/oradata/UAT/redo7b.dbf') SIZE 500M REUSE,
              GROUP 8 ('/u03/redo/oradata/UAT/redo8a.dbf',
                       '/u04/redo/oradata/UAT/redo8b.dbf') SIZE 500M REUSE;
      RELEASE CHANNEL aux1;
      RELEASE CHANNEL aux2;
      RELEASE CHANNEL aux3;
      RELEASE CHANNEL aux4;
      RELEASE CHANNEL aux5;
      RELEASE CHANNEL aux6;
      RELEASE CHANNEL aux7;
    }
    I tried this with and without aux7 for the archivelogs, along with adding a "set until time", but I always get the same errors. The logfile is as follows, but truncated (I can put it all in if anyone really wants it):
    allocated channel: aux1
    channel aux1: sid=155 devtype=DISK
    allocated channel: aux2
    channel aux2: sid=154 devtype=DISK
    allocated channel: aux3
    channel aux3: sid=153 devtype=DISK
    allocated channel: aux4
    channel aux4: sid=152 devtype=DISK
    allocated channel: aux5
    channel aux5: sid=151 devtype=DISK
    allocated channel: aux6
    channel aux6: sid=150 devtype=DISK
    allocated channel: aux7
    channel aux7: sid=149 devtype=DISK
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    executing command: SET NEWNAME
    (more set newname commands)
    Starting Duplicate Db at 08-DEC-11
    contents of Memory Script:
       set until scn  2076483089;
       set newname for datafile  1 to
    "/u07/oradata/UAT/system01.dbf";
       set newname for datafile  2 to
    "/u07/oradata/UAT/undotbs01.dbf";
       set newname for datafile  3 to
    "/u07/oradata/UAT/sysaux01.dbf";
       set newname for datafile  4 to
    "/u07/oradata/UAT/users01.dbf";
       set newname for datafile  5 to
    "/u07/oradata/UAT/bidata_index.dbf";
       set newname for datafile  6 to
    "/u07/oradata/UAT/bistage_constraint.dbf";
    (more set newname output)
       set newname for datafile  60 to
    "/u07/oradata/UAT/ibmslob_data4.dbf";
       restore
       check readonly
       clone database
    executing Memory Script
    channel aux1: starting datafile backupset restore
    channel aux1: specifying datafile(s) to restore from backup set
    restoring datafile 00001 to /u07/oradata/UAT/system01.dbf
    restoring datafile 00002 to /u07/oradata/UAT/undotbs01.dbf
    restoring datafile 00005 to /u07/oradata/UAT/bidata_index.dbf
    restoring datafile 00008 to /u07/oradata/UAT/bistage_index.dbf
    restoring datafile 00016 to /u07/oradata/UAT/customiz_data.dbf
    restoring datafile 00020 to /u07/oradata/UAT/galaxybe_data2.dbf
    restoring datafile 00023 to /u07/oradata/UAT/galaxydal_data.dbf
    restoring datafile 00026 to /u07/oradata/UAT/galaxy_data.dbf
    restoring datafile 00029 to /u07/oradata/UAT/ibmsarch_data.dbf
    restoring datafile 00037 to /u07/oradata/UAT/ibmstask_constraint.dbf
    channel aux1: reading from backup piece /u10/PLTPRD01/backup/datafile_6003_1.bak
    channel aux2: starting datafile backupset restore
    channel aux2: specifying datafile(s) to restore from backup set
    restoring datafile 00004 to /u07/oradata/UAT/users01.dbf
    restoring datafile 00007 to /u07/oradata/UAT/bistage_data.dbf
    restoring datafile 00015 to /u07/oradata/UAT/customiz_constraint.dbf
    restoring datafile 00022 to /u07/oradata/UAT/galaxydal_constraint.dbf
    restoring datafile 00028 to /u07/oradata/UAT/ibmsarch_constraint.dbf
    restoring datafile 00032 to /u07/oradata/UAT/ibmsaudit_index.dbf
    restoring datafile 00035 to /u07/oradata/UAT/ibmsdynm_index.dbf
    restoring datafile 00040 to /u07/oradata/UAT/mag_ctl.dbf
    restoring datafile 00043 to /u07/oradata/UAT/odyssey_constraint.dbf
    restoring datafile 00057 to /u07/oradata/UAT/ibmslob_data1.dbf
    channel aux2: reading from backup piece /u10/PLTPRD01/backup/datafile_6002_1.bak
    channel aux3: starting datafile backupset restore
    channel aux3: specifying datafile(s) to restore from backup set
    restoring datafile 00003 to /u07/oradata/UAT/sysaux01.dbf
    restoring datafile 00010 to /u07/oradata/UAT/bss_data.dbf
    restoring datafile 00011 to /u07/oradata/UAT/bss_index.dbf
    restoring datafile 00012 to /u07/oradata/UAT/bussrules_constraint.dbf
    restoring datafile 00017 to /u07/oradata/UAT/customiz_index.dbf
    restoring datafile 00024 to /u07/oradata/UAT/galaxydal_index.dbf
    restoring datafile 00030 to /u07/oradata/UAT/ibmsarch_index.dbf
    restoring datafile 00038 to /u07/oradata/UAT/ibmstask_data.dbf
    restoring datafile 00041 to /u07/oradata/UAT/mag_data.dbf
    restoring datafile 00046 to /u07/oradata/UAT/pilatdba_constraint.dbf
    channel aux3: reading from backup piece /u10/PLTPRD01/backup/datafile_6001_1.bak
    channel aux4: starting datafile backupset restore
    channel aux4: specifying datafile(s) to restore from backup set
    restoring datafile 00006 to /u07/oradata/UAT/bistage_constraint.dbf
    restoring datafile 00014 to /u07/oradata/UAT/bussrules_index.dbf
    restoring datafile 00019 to /u07/oradata/UAT/galaxybe_constraint.dbf
    restoring datafile 00021 to /u07/oradata/UAT/galaxybe_index.dbf
    restoring datafile 00027 to /u07/oradata/UAT/galaxy_index.dbf
    restoring datafile 00034 to /u07/oradata/UAT/ibmsdynm_data.dbf
    restoring datafile 00036 to /u07/oradata/UAT/ibmslob_data.dbf
    restoring datafile 00044 to /u07/oradata/UAT/odyssey_data.dbf
    restoring datafile 00045 to /u07/oradata/UAT/odyssey_index.dbf
    restoring datafile 00060 to /u07/oradata/UAT/ibmslob_data4.dbf
    channel aux4: reading from backup piece /u10/PLTPRD01/backup/datafile_6004_1.bak
    channel aux5: starting datafile backupset restore
    channel aux5: specifying datafile(s) to restore from backup set
    restoring datafile 00009 to /u07/oradata/UAT/bss_constraint.dbf
    restoring datafile 00013 to /u07/oradata/UAT/bussrules_data.dbf
    restoring datafile 00018 to /u07/oradata/UAT/dbcc_repos.dbf
    restoring datafile 00025 to /u07/oradata/UAT/galaxy_constraint.dbf
    restoring datafile 00033 to /u07/oradata/UAT/ibmsdynm_constraint.dbf
    restoring datafile 00039 to /u07/oradata/UAT/ibmstask_index.dbf
    restoring datafile 00042 to /u07/oradata/UAT/mag_index.dbf
    restoring datafile 00048 to /u07/oradata/UAT/pilatdba_index.dbf
    restoring datafile 00050 to /u07/oradata/UAT/ratings_data.dbf
    restoring datafile 00058 to /u07/oradata/UAT/ibmslob_data2.dbf
    channel aux5: reading from backup piece /u10/PLTPRD01/backup/datafile_6000_1.bak
    channel aux6: starting datafile backupset restore
    channel aux6: specifying datafile(s) to restore from backup set
    restoring datafile 00031 to /u07/oradata/UAT/ibmsaudit_data.dbf
    restoring datafile 00047 to /u07/oradata/UAT/pilatdba_data.dbf
    restoring datafile 00049 to /u07/oradata/UAT/ratings_constraint.dbf
    restoring datafile 00051 to /u07/oradata/UAT/ratings_index.dbf
    restoring datafile 00052 to /u07/oradata/UAT/tools.dbf
    restoring datafile 00053 to /u07/oradata/UAT/undotbs02.dbf
    restoring datafile 00054 to /u07/oradata/UAT/workflow_constraint.dbf
    restoring datafile 00055 to /u07/oradata/UAT/workflow_data.dbf
    restoring datafile 00056 to /u07/oradata/UAT/workflow_index.dbf
    restoring datafile 00059 to /u07/oradata/UAT/ibmslob_data3.dbf
    channel aux6: reading from backup piece /u10/PLTPRD01/backup/datafile_5999_1.bak
    channel aux4: restored backup piece 1
    piece handle=/u10/PLTPRD01/backup/datafile_6004_1.bak tag=DATAFILE_DAILY
    channel aux4: reading from backup piece /u10/PLTPRD01/backup/datafile_6004_2.bak
    channel aux2: restored backup piece 1
    piece handle=/u10/PLTPRD01/backup/datafile_6002_1.bak tag=DATAFILE_DAILY
    channel aux2: reading from backup piece /u10/PLTPRD01/backup/datafile_6002_2.bak
    channel aux3: restored backup piece 1
    piece handle=/u10/PLTPRD01/backup/datafile_6001_1.bak tag=DATAFILE_DAILY
    channel aux3: reading from backup piece /u10/PLTPRD01/backup/datafile_6001_2.bak
    channel aux5: restored backup piece 1
    piece handle=/u10/PLTPRD01/backup/datafile_6000_1.bak tag=DATAFILE_DAILY
    channel aux5: reading from backup piece /u10/PLTPRD01/backup/datafile_6000_2.bak
    channel aux6: restored backup piece 1
    piece handle=/u10/PLTPRD01/backup/datafile_5999_1.bak tag=DATAFILE_DAILY
    channel aux6: reading from backup piece /u10/PLTPRD01/backup/datafile_5999_2.bak
    channel aux1: restored backup piece 1
    piece handle=/u10/PLTPRD01/backup/datafile_6003_1.bak tag=DATAFILE_DAILY
    channel aux1: restore complete, elapsed time: 01:48:13
    channel aux4: restored backup piece 2
    piece handle=/u10/PLTPRD01/backup/datafile_6004_2.bak tag=DATAFILE_DAILY
    channel aux4: reading from backup piece /u10/PLTPRD01/backup/datafile_6004_3.bak
    channel aux3: restored backup piece 2
    piece handle=/u10/PLTPRD01/backup/datafile_6001_2.bak tag=DATAFILE_DAILY
    channel aux3: reading from backup piece /u10/PLTPRD01/backup/datafile_6001_3.bak
    channel aux5: restored backup piece 2
    piece handle=/u10/PLTPRD01/backup/datafile_6000_2.bak tag=DATAFILE_DAILY
    channel aux5: reading from backup piece /u10/PLTPRD01/backup/datafile_6000_3.bak
    channel aux6: restored backup piece 2
    piece handle=/u10/PLTPRD01/backup/datafile_5999_2.bak tag=DATAFILE_DAILY
    channel aux6: reading from backup piece /u10/PLTPRD01/backup/datafile_5999_3.bak
    channel aux2: restored backup piece 2
    piece handle=/u10/PLTPRD01/backup/datafile_6002_2.bak tag=DATAFILE_DAILY
    channel aux2: reading from backup piece /u10/PLTPRD01/backup/datafile_6002_3.bak
    channel aux4: restored backup piece 3
    piece handle=/u10/PLTPRD01/backup/datafile_6004_3.bak tag=DATAFILE_DAILY
    channel aux4: reading from backup piece /u10/PLTPRD01/backup/datafile_6004_4.bak
    channel aux5: restored backup piece 3
    piece handle=/u10/PLTPRD01/backup/datafile_6000_3.bak tag=DATAFILE_DAILY
    channel aux5: reading from backup piece /u10/PLTPRD01/backup/datafile_6000_4.bak
    channel aux4: restored backup piece 4
    piece handle=/u10/PLTPRD01/backup/datafile_6004_4.bak tag=DATAFILE_DAILY
    channel aux4: restore complete, elapsed time: 02:49:24
    channel aux2: restored backup piece 3
    piece handle=/u10/PLTPRD01/backup/datafile_6002_3.bak tag=DATAFILE_DAILY
    channel aux2: reading from backup piece /u10/PLTPRD01/backup/datafile_6002_4.bak
    channel aux3: restored backup piece 3
    piece handle=/u10/PLTPRD01/backup/datafile_6001_3.bak tag=DATAFILE_DAILY
    channel aux3: reading from backup piece /u10/PLTPRD01/backup/datafile_6001_4.bak
    channel aux6: restored backup piece 3
    piece handle=/u10/PLTPRD01/backup/datafile_5999_3.bak tag=DATAFILE_DAILY
    channel aux6: reading from backup piece /u10/PLTPRD01/backup/datafile_5999_4.bak
    channel aux3: restored backup piece 4
    piece handle=/u10/PLTPRD01/backup/datafile_6001_4.bak tag=DATAFILE_DAILY
    channel aux3: restore complete, elapsed time: 02:57:18
    channel aux2: restored backup piece 4
    piece handle=/u10/PLTPRD01/backup/datafile_6002_4.bak tag=DATAFILE_DAILY
    channel aux2: restore complete, elapsed time: 02:57:20
    channel aux5: restored backup piece 4
    piece handle=/u10/PLTPRD01/backup/datafile_6000_4.bak tag=DATAFILE_DAILY
    channel aux5: restore complete, elapsed time: 02:57:45
    channel aux6: restored backup piece 4
    piece handle=/u10/PLTPRD01/backup/datafile_5999_4.bak tag=DATAFILE_DAILY
    channel aux6: restore complete, elapsed time: 02:58:50
    Finished restore at 08-DEC-11
    sql statement: CREATE CONTROLFILE REUSE SET DATABASE "UAT" RESETLOGS ARCHIVELOG
      MAXLOGFILES     48
      MAXLOGMEMBERS      3
      MAXDATAFILES      400
      MAXINSTANCES     8
      MAXLOGHISTORY     4674
    LOGFILE
      GROUP  1 ( '/u03/redo/oradata/UAT/redo1a.dbf', '/u04/redo/oradata/UAT/redo1b.dbf' ) SIZE 500 M  REUSE,
      GROUP  2 ( '/u03/redo/oradata/UAT/redo2a.dbf', '/u04/redo/oradata/UAT/redo2b.dbf' ) SIZE 500 M  REUSE,
      GROUP  3 ( '/u03/redo/oradata/UAT/redo3a.dbf', '/u04/redo/oradata/UAT/redo3b.dbf' ) SIZE 500 M  REUSE,
      GROUP  4 ( '/u03/redo/oradata/UAT/redo4a.dbf', '/u04/redo/oradata/UAT/redo4b.dbf' ) SIZE 500 M  REUSE,
      GROUP  5 ( '/u03/redo/oradata/UAT/redo5a.dbf', '/u04/redo/oradata/UAT/redo5b.dbf' ) SIZE 500 M  REUSE,
      GROUP  6 ( '/u03/redo/oradata/UAT/redo6a.dbf', '/u04/redo/oradata/UAT/redo6b.dbf' ) SIZE 500 M  REUSE,
      GROUP  7 ( '/u03/redo/oradata/UAT/redo7a.dbf', '/u04/redo/oradata/UAT/redo7b.dbf' ) SIZE 500 M  REUSE,
      GROUP  8 ( '/u03/redo/oradata/UAT/redo8a.dbf', '/u04/redo/oradata/UAT/redo8b.dbf' ) SIZE 500 M  REUSE
    DATAFILE
      '/u07/oradata/UAT/system01.dbf'
    CHARACTER SET UTF8
    contents of Memory Script:
       switch clone datafile all;
    executing Memory Script
    datafile 2 switched to datafile copy
    input datafile copy recid=1 stamp=769350658 filename=/u07/oradata/UAT/undotbs01.dbf
    datafile 3 switched to datafile copy
    input datafile copy recid=2 stamp=769350658 filename=/u07/oradata/UAT/sysaux01.dbf
    datafile 4 switched to datafile copy
    (etc)
    datafile 60 switched to datafile copy
    input datafile copy recid=59 stamp=769350674 filename=/u07/oradata/UAT/ibmslob_data4.dbf
    contents of Memory Script:
       set until scn  2076483089;
       recover
       clone database
        delete archivelog
    executing Memory Script
    executing command: SET until clause
    Starting recover at 08-DEC-11
    starting media recovery
    Oracle Error:
    ORA-01547: warning: RECOVER succeeded but OPEN RESETLOGS would get error below
    ORA-01194: file 1 needs more recovery to be consistent
    ORA-01110: data file 1: '/u07/oradata/UAT/system01.dbf'
    released channel: aux1
    released channel: aux2
    released channel: aux3
    released channel: aux4
    released channel: aux5
    released channel: aux6
    released channel: aux7
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of Duplicate Db command at 12/08/2011 12:31:54
    RMAN-03015: error occurred in stored script Memory Script
    RMAN-06053: unable to perform media recovery because of missing log
    RMAN-06102: no channel to restore a backup or copy of log thread 1 seq 101913 lowscn 2076390043
    RMAN-06102: no channel to restore a backup or copy of log thread 1 seq 101912 lowscn 2076281132
    RMAN-06102: no channel to restore a backup or copy of log thread 1 seq 101911 lowscn 2076093182
    RMAN-06102: no channel to restore a backup or copy of log thread 1 seq 101910 lowscn 2076023375
    RMAN-06102: no channel to restore a backup or copy of log thread 1 seq 101909 lowscn 2075932322
    RMAN-06102: no channel to restore a backup or copy of log thread 1 seq 101908 lowscn 2075932310
    RMAN-06102: no channel to restore a backup or copy of log thread 1 seq 101907 lowscn 2075896928
    (more no channel errors)
    Recovery Manager complete.Thoughts? Ideas? I'm thinking the backup script is wrong somehow.

    This is also an interesting way to open your DB (or should I say test DB):
    http://practicalappsdba.wordpress.com/2008/04/01/how-to-recover-and-open-the-database-if-the-archive-log-required-for-recovery-is-missing/
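One thing worth checking (a sketch, not a verified fix): the duplicate ran on Dec 8 against backups taken Dec 6, so RMAN's implicit "set until scn" pointed at log sequences archived after the backup, which were never written to the disk backups at all. Forcing a log switch before backing up the archivelogs, and pinning the duplicate to a point you know is covered, avoids that. The sequence number below is a placeholder you would replace with one past the last sequence actually present in your archivelog backup:

```
# In the backup script, just before "backup archivelog all",
# so the most recent redo is archived and captured:
sql 'alter system archive log current';

# In duplicate_database.rman, before DUPLICATE TARGET DATABASE
# (RMAN recovers up to but NOT including the sequence you give,
# so pass last-backed-up-sequence + 1):
SET UNTIL SEQUENCE <seq> THREAD 1;
```

With the until point inside the backed-up range, the recover step of the memory script should find every log it asks for in the archivelog backup pieces.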

  • Opinions Needed: Wattage for Power Supply Upgrade

    Hey there BBY Forum users!
    I'm looking to chunk some change out for a PSU upgrade and wanted some outside opinions as to how high a wattage I should really look for (I have my own ideas).
    Here's all the hardware currently in the system:
    Antec 1200 case: 6x120mm LED case fans, 1x200mm LED exhaust fan
    Hard Drives : 2x320GB 7200RPM, 2x1TB 7200RPM(All SATA)
    CPU : Intel Core i7-930 2.8GHz
    Graphics : PNY GeForce GTX 470
    Cooling : Corsair H50 Watercooler
    Opticals : LG BluRay ROM, Sony DVD RW(both SATA)
    RAM : 6GB(3x2GB) DDR3 1333
    Extras : A couple of external HDDs, cam, printer, etc., and a cold cathode tube.
    I currently have a 700 watt RocketFish, which, nothing against them, just isn't high-end enough for my taste.
    I've been leaning towards a 900W or 1kW PSU, but need some outside advice on brand/type.
    Thanks!
    If you like my post, or solution to your issue/question, go ahead and click on the little star by my name and/or accept the post as the Solution. It makes me happy.
    I'm NOT an employee of Best Buy, or Geek Squad, though I did work as an Agent for a year 5 years ago. None of my posts are to be taken as the official stance that Best Buy will take on your situation. My advice is just that, advice.
    Unfortunately, that's the bad luck of any electronic, there's going to be bad Apples... wait that's a horrible pun.

    I see one problem with 'future proofing' as you call it. By the time you need 1000 watts, the power supply could be two or three years old. By then, the components used to build a power supply will have improved twofold: they will be more efficient and most likely cheaper. Any benefit realized now will be lost by then. And who knows, in two years system boards may be running on 3 volt and 5 volt buses. 12 volts is only needed by drives, and soon they may all be running on 5 volts.
    Gary | It's true; I am a Best Buy Fan Boy!
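For a rough sanity check on the wattage question above, here is a back-of-envelope sum. The figures are typical published draws, not measurements (i7-930 TDP, GTX 470 board power), and the MISC line is a guess for board, RAM, fans, pump, and cathode:

```shell
#!/bin/sh
# Back-of-envelope PSU sizing. All wattages are typical published
# figures or guesses, not measurements from this machine.
CPU=130              # Core i7-930 TDP
GPU=215              # GeForce GTX 470 board power
DRIVES=$((4 * 10))   # four hard drives at ~10W each
OPTICAL=$((2 * 25))  # two optical drives at ~25W peak each
MISC=75              # board, RAM, fans, pump, cathode (guess)

TOTAL=$((CPU + GPU + DRIVES + OPTICAL + MISC))
WITH_HEADROOM=$((TOTAL * 130 / 100))   # ~30% headroom

echo "estimated peak draw: ${TOTAL}W"
echo "suggested PSU size:  ${WITH_HEADROOM}W"
```

Even with everything at peak and 30% headroom, this lands well under 700W, which supports the point above that a 900W-1kW unit is buying capacity you may never use.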

  • Research paper, opinions needed

    Hello, I am writing a small research paper (500 words) for school, and I am doing it on the differences between C++ and Java. I am currently going to college, about to complete my associate's degree in programming, and this is part of the last assignment. I just need a few opinions and maybe some advantages or disadvantages between the two. Any advice would be appreciated. I have my own opinions, but I need to cite others as well. Thank you.

    Sounds to me like you want an answer... tell us what you think first, and then maybe some of us will add on to what you said.

  • Portal UME data store and various options (Opinions needed!)

    We are currently exploring our options for connecting the portal (UME) to various data sources for user authentication. Per EP 101, we all know that we can authenticate against (1) the portal DB, (2) the portal DB + an SAP system, and (3) the portal DB + an LDAP directory. Of course, in most cases #3 is the standard option. But now we want to explore another option... what if we set up synchronization with the LDAP directory (ie. http://help.sap.com/saphelp_nw04/helpdata/en/95/49cb3a663bfc70e10000000a114084/frameset.htm)? For example, our process is such that now, within SAP R/3, a "new hire" is created, and this triggers the creation of their userid/password in the external LDAP directory as well. Is it possible to then have synchronization set up so that the LDAP directory will synchronize with the portal DB and create the user in the portal DB itself? (The example given in the help file seems to suggest this but does not provide any detail.) Then the portal could authenticate users against its own DB? (ie. no need to make a "trip" to the LDAP directory.) Soooooo, first off, is this possible, and if so, how? Second, what are the pros/cons of this approach versus the standard option of simply using the LDAP directory for authentication and storing only portal-specific attributes in the portal's own DB? Lastly, any "gotchas" to be aware of (ie. such as "yes this works fine for NDS but no way will it work for MS-AD" haha)?
    oh...and one more...take the LDAP directory out of the picture for a moment...is it possible to "synchronize" directly from an SAP system (such as 4.6d or ECC5.0) directly with the portal db (as well as other SAP componenet systems)? (*this one is more out of curiousity than anything...past experience with CUA. haha)
    thanks BIG TIME in advance!
    Chris

    Chris, I can answer the second part of your question only, sorry!
    It is possible to automatically sync users directly from an SAP system; I currently do this for release 4.7, so it should work for ECC5 onwards (you would think). As for 4.6c/d? I just posted a new thread asking that very question, hopefully someone helps!
    With the NW04 portal at about SP13 or later you get a new UME connection option - dataSourceConfiguration_abap.xml; picking this automates the link between ABAP and portal users & roles.
    Any user created in 4.7 automatically appears in the portal, plus (this is the good bit) dataSourceConfiguration_abap.xml makes all ABAP security roles appear as portal groups. You then simply assign one of these replicated groups to your portal roles, so a user assigned to a role in ABAP seamlessly becomes assigned to a portal role, giving you portal user management without having to go near the portal system.
    So it's not really like CUA at all, just a mechanism that automatically replicates all ABAP users & roles into the portal in a usable form.
    Hope that helps a little
    danny

  • Opinions Needed On Right Choice of card

    I've been running the following thread
    Ti4600-TDX8X
    because I thought I was going to get a Ti4600-TD8X second hand, but the deal has fallen through.
    Now I need to start all over looking for a replacement card, and need some help.
    As you will see I'm currently running a Voodoo 3 3000 16MB that I bought s/h in June 2000. It's been a good card and still is.
    However I thought it might be time to upgrade for the following reasons:
    1) I'm hoping to get a 17" Flat screen (TFT) Monitor later this year.
    2) I might want to transfer some old 8mm film from my analogue camcorder and some VHS tapes to DVD/SVCD.
    3) Wanted to make a general improvement in the performance of my system.
    I'm not a gamer, apart from Flight Simulator 2000, so I don't need one of the latest super fast cards. My main use is word processing/spreadsheets/home accounting, some digital image editing from my digital camera and CD writing.
    When I first started to think of upgrading I set myself to get 128MB on the card, as I thought this would be better for the film transfer side of things (I'll get a dedicated capture card with hardware locking of video/audio if I go ahead), but since reading various topics on the subject it seems that extra RAM is only needed when rendering complex 3D games or images at high res. Notwithstanding this, I'm still inclined toward 128MB.
    I decided to look on eBay(UK) for a good s/h card or at least one not quite at today's cutting edge. I hadn't realised just how many variations of card there are! My initial budget of 30GBP (50USD) was based on seeing a Radeon 9200SE for that price. Since then I've realised I probably need to allocate a bit more if I'm to give myself some choice.
    As you may gather, I'm a complete novice as far as video cards are concerned and have no idea how to decide if a 4200 is better for me than an MX-440, or a 5200 better than an ATI 9200 or 9100, etc. When I spotted the Ti4600 it seemed perfect: a card with good reviews and at a good price, 70GBP (120USD).
    Based on the Ti4600 I've set my budget now to a max of 70GBP (120USD). I need a good general-purpose card that will cover all my needs. I'm inclined to nVidia rather than ATI as I've read there are fewer compatibility issues with the nVidia drivers. My only other requirement is that the card must have an onboard fan. At the moment I'm using a PCI slot fan, and I want to remove that and free up the slot.
    Any help on picking the "right" card would be much appreciated.

    chris18,
    The best cards are the ATI 9600XT, the ATI All in Wonder 9600Pro and the GeForce FX 5700 Ultra. (If you can find an MSI GeForce FX5900XT-VTD128, get it. The card is priced around the GeForce FX 5700 Ultra but is a faster card.)
    From [H]ard|OCP:
    Quote
    NVIDIA has put themselves in an awkward situation with the 5900XT. With the performance it has shown and the potential it has with overclocking at the price point it is being sold at it could actually hinder sales of their 5700Ultra. Why purchase a 5700Ultra at the same price of a 5900XT when the 5900XT is faster?
    We definitely see a pattern with the games and cards tested in this review. The FX5900XT is much better competition to the Radeon 9600XT than the 5700Ultra is to the 9600XT. The Radeon 9600XT beats the 5700Ultra in almost every game, but the tables turn when you factor in the 5900XT. The 5900XT is able to match or beat the 9600XT in more than a few games by allowing you to run at higher quality settings than the 9600XT and still perform the same or better, though there are some games where the 9600XT is the winner.
    Take Care,
    Richard

  • Opinions Needed for Brand New Java Source Editor....

    Help:
    I need you to tell me what features your ideal source code editor (with some project management tools built in) would have. Tell me anything, however big, however infeasible, or however trivial.
    Background:
    I'm starting work this summer on a major project, set to last until the summer of 2004 (2 years), during the course of which I plan to develop an advanced text editor (with project management tools), written in Java, that will be built from the ground up differently from any other. It will be freeware on release.
    Features:
    The software will be able to handle HTML/XML/Java/Prolog/LISP/C/C#/Perl languages and plain text, and will support plug-ins for other languages.
    In the course of developing it, I want to experiment with novel features, built-in project management tools, support for numerous programming languages, different ways of viewing, editing and presenting the project/code, and the application of the latest AI research.
    I also want to build a 'concept' GUI (as in 'concept car'), that will be rather unlike any other application around at the moment and will be designed from the start to follow all the latest theory on user psychology, behaviour and preferences.
    I have dozens of ideas for this, which sadly I need to keep mostly under wraps for the time being. However, this project is intended to create from scratch an entirely new text editor using user-centred design, and I need to know what all you novice and experienced users want.
    Timeline:-----------------------------------------------
    Prototypes expected by Xmas 2002.
    First beta likely to appear late Summer 2003.
    Successive betas to be followed by release
    candidates around about Xmas 2003.
    Final release to appear early Spring 2004.
    Update Summer 2004.
    (All these releases will be freeware)
    As you can tell from the schedule, this is a serious project, and I hope I can produce a piece of (initially) freeware that will benefit everyone...
    Thanks for the help!
    Martin Robbins
    UW Aberystwyth (UK).

    A good idea is to implement your software using the Model-View-Controller paradigm...
    It lets you provide many different GUIs, including a web-based GUI... imagine a distributed editor GUI, where the user can access his project through the web, maintaining the CVS control, security and cross-platform benefits....
    It also gives you the chance to create new GUIs (evolution) over the next years... without having to change the editing classes....
    Nowadays I'm developing an AI planner as part of my MSc thesis...
    My project has the following design:
    - a model: the set of data structures which are filled at the start of the planning process...
    - a kernel controller: the code which does the real work (parses the input files creating the model, dispatches the planner thread and saves the results...)
    - a GUI: I have two GUIs now, a JFrame application and a JApplet application. These two GUIs share the same model and control classes... this provides the same efficiency through the web or through a standalone application....
    That's the point: a good GUI must be decoupled from the classes which do the "underground work".
    It may be helpful if you create a code model, some code checkers (they may be threads) and the GUI separately...
    You may take a look at my efforts:
    http://www.lia.ufc.br/~gaucho/planner.html
    (the link above shows only the GUI, but it is really not finished)
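    To make the model/view split concrete, here is a minimal sketch in plain Java (class names are hypothetical, just for illustration): the model holds the document text and knows nothing about its views; any number of views - console, JFrame, JApplet - can subscribe to it the same way.

    ```java
    import java.util.ArrayList;
    import java.util.List;

    // The model: owns the text, notifies listeners, has no GUI knowledge.
    class DocumentModel {
        private final StringBuilder text = new StringBuilder();
        private final List<Runnable> listeners = new ArrayList<>();

        void addListener(Runnable listener) {
            listeners.add(listener);
        }

        void append(String s) {
            text.append(s);
            listeners.forEach(Runnable::run); // tell every view to refresh
        }

        String getText() {
            return text.toString();
        }
    }

    public class MvcSketch {
        public static void main(String[] args) {
            DocumentModel model = new DocumentModel();
            // A console "view"; a Swing view would subscribe identically.
            model.addListener(() -> System.out.println("view sees: " + model.getText()));
            model.append("hello");
        }
    }
    ```

    Because the editing logic lives entirely in `DocumentModel`, new GUIs can be added over the years without touching the editing classes, which is exactly the evolution path described above.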

  • Catching Errors: opinions needed

    Under what conditions should a program try to catch Errors (not Exceptions) ? The docs for the Error class read:
    "An Error is a subclass of Throwable that indicates serious problems that a reasonable application should not try to catch."
    However, nothing enforces the fact that I should not try to catch an Error. I can catch a StackOverflowError, ExceptionInInitializerError, etc. just as I would catch any Exception class.
    Everything I have read talks plenty about Exception handling, but basically nothing about Error handling.
    My questions are: 1) Why doesn't the JVM prevent me from catching Errors? 2) Under what conditions should I try to catch Errors? 3) What are good practices for handling Errors?

    My questions are: 1) Why doesn't the JVM prevent me from catching Errors?
    In case you need to catch them.
    2) Under what conditions should I try to catch Errors?
    When you want to perform an action when a specific error occurs.
    An action I perform is to shut down my server when an OutOfMemoryError or StackOverflowError occurs. This is required because an Error only results in the end of the thread which threw it. The other threads can keep running, leaving a running but useless application, like a zombie. Under this condition, if the application is not working it should shut down, e.g. System.exit().
    3) What are good practices for handling Errors?
    When you catch an Error you should rethrow it. (I don't always, but this is considered good practice)
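    A minimal, self-contained illustration of the point above: Errors are catchable like any Throwable, here a StackOverflowError triggered deliberately by unbounded recursion. The cleanup comment stands in for whatever shutdown work a real server would do; this is a sketch, not a recommendation to swallow Errors routinely.

    ```java
    public class ErrorDemo {
        // Unbounded recursion, used only to trigger a StackOverflowError.
        static int overflow(int n) {
            return overflow(n + 1);
        }

        public static void main(String[] args) {
            try {
                overflow(0);
            } catch (StackOverflowError e) {
                // Do any last-chance cleanup here, then either rethrow the
                // Error (the usual advice) or shut the process down cleanly
                // with System.exit().
                System.out.println("caught " + e.getClass().getSimpleName());
            }
        }
    }
    ```

    Note that by the time the catch block runs the stack has unwound, so there is usually enough room to log and exit; an OutOfMemoryError is riskier to handle, since even logging may fail.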

  • Opinions needed on best way to add a mobile site

    We have an existing somewhat older website that does not nicely support mobile devices. While it looks and works fine on tablets, it becomes somewhat unusable on smartphones. We do not have the time/dollars at this point to start from scratch and rebuild with all of the latest bells and whistles. So the question becomes what is the cleanest/fastest/easiest way to add a mobile website. In trying to research this answer, it appears there is no one right answer. I actually have found many possibilities. Right now I have Dreamweaver CS5 and am not going to upgrade any time soon or change tools.
    Here are some possibilities I'm considering:
    Adding a new "home" page and letting the user decide to either "go lean and mobile" or go wide screen and use the full website. This doesn't seem too bad to me since I'm clearly going to have to maintain 2 websites in the near term.
    From there I see choices where the direction is chosen based on the OS or it is chosen based on the screen size. I have seen some based on Javascript and some CSS based. It just gets pretty confusing to know which is best.
    We are adding online ordering (pizza place) and definitely want to make it usable for the mobile world. Any thoughts would be appreciated.
    As a follow-on question, how big of a problem would we run into (how many phone users might we anger) if we used a JS-based menu on our mobile site? I'm not sure how many devices/OSes don't support JavaScript.
    Thanks so much!
    Donna

    If you go down this slippery slope, trying to identify mobile user-agents is the single biggest obstacle you will face.  The list is understandably huge and constantly changing.   To make matters worse, many browsers pretend to be other browsers. So sniffing for mobile devices and redirecting them to your mobile only site is not only a major chore, it's not even 100% reliable.  And then there's the matter of maintaining 2 separate sites...  Oh, don't even go there.
    Re-code your current site to make it Responsive.  Below are several Frameworks you can use to jump start your new layouts.
    Foundation Zurb
    http://foundation.zurb.com/templates.php
    Skeleton Boilerplate
    http://www.getskeleton.com/
    Initializr (HTML5 Boilerplate, Responsive or Bootstrap)
    http://www.initializr.com/
    DMX Zone's Bootstrap *free Extension for DW*
    http://www.dmxzone.com/go/21759/dmxzone-bootstrap/
    Project Seven's Page Packs *Commercial CSS Templates for DW*
    http://www.projectseven.com/products/templates/index.htm
    Whichever menu system you choose, it needs to be user friendly for desktops as well as touch screen devices.  CSS styled lists are the foundation to build on.  Then you can add jQuery MeanMenu to make it mobile friendly.  I have a working example of it below (resize your browser viewport):
    http://alt-web.com/FluidGrid/Fluid2.html
    Nancy O.

  • Going from 15" rMBP to 13" rMBP Opinions Needed.

    On a whim I picked up one of the new 13" rMBP 512GB models yesterday while in the Apple store to see if I could downsize from my 15" as I am doing lots of travel now for work. The smaller footprint and weight would make a big difference.
    I have been throwing some Adobe Premiere editing (I edit about 2-3 hours of video a week in it) at it and it runs super smooth. Exporting takes about 1.8x as long as on my late 2013 15" rMBP, but I am totally able to do other work while it is exporting in the background. And it still stays pretty cool throughout the export, above and below, and the fans aren't very loud.
    I am torn on sticking with my 15" late-2015 rMBP or moving to the 2015 13" rMBP. I love the size of the 13" especially knowing I am going to be doing lots of travel across the country, Japan, and Germany in the next few months...
    From what I have thrown at it, the 8GB of RAM seems to be good, and I can always buy a newer model in a year or 18 months if I need it. I am still able to do plenty of stuff while I have video encoding in the background with Premiere's media encoder queue.
    What would you do if it was even money?

    It is a rather personal choice. I imagine the biggest difference in performance is the drop down to a dual core processor. That will have a big impact on the kind of work you're doing.  Size and Weight may very well be more important to you, though, and if it's not causing any problems for you, the trade off in performance may very well be worth it.
    I don't need to do that kind of heavy lifting when I'm on the go, and find an Air does just fine for me. If I have any really heavy work to do, I'm probably sitting at my iMac. Of course, I also have no real need for a retina display either, which it sounds like you will definitely benefit from.
    Of course, there's no law that says you have to get rid of the 15". When you're not on the road, having more than one machine can be a big productivity booster.

  • Hard drive configuration opinions NEEDED

    I just purchased the following and am waiting for it to arrive. I was wanting to know what would be the best config for performance/stability. If you were me, how would you unleash this beast? Please point me in the right direction, because the wife is a little mad at me right now.
    300GB VelociRaptor
    150GB VelociRaptor
    1TB WD Black
    12GB DDR2 RAM
    I use PS CS3 Ext heavily and soon Logic Studio. The 500GB stock drive is running PS now. I have an idea of how I want to run it, but I want to make sure I get the performance/stability expected from this upgrade.
    Cheers,
    Ben

    I'd use each Velo for booting a system because of its speed: one for Leo and one for SL when it arrives. The Black for software, music and video archiving. Add another for scratch. The 12GB is always beneficial, so just toss that in. That's me and my wants, although you have other needs. I always run at least two systems on my MP.

  • Opinions Needed: iPhone App...

    I am looking for a good budget software and it would be great if it had an iphone app that it could sync with. That would be HUGE. If anyone has any suggestions, I would really appreciate it. I have looked at a few and haven't found exactly what I am needing.
    Thanks!

    Well, Budget is too $$$. I looked at Money and Checkbook Pro but wasn't really that impressed with the interfaces. I'm being super picky, I know, but yadda yadda.
    Thanks!

  • Opinions needed on VMware Fusion or Windows programs on Mac

    I am new to the Mac family. My son is going to college this fall for Computer Animation, so I thought I would treat him and the family to a new computer system. After much research, the Mac Pro was the computer to get. I have a lot of software and downloads that are still PC. I was told that VMware Fusion is a good support system for Windows on Macs. Has anyone used this product, and has anyone used it with Windows Vista?
    P.S. I got my Mac Pro with a 3.00 GHz Dual-Core, 4 GB memory, and a 500 GB Serial ATA 3Gb/s hard drive.
    P.P.S. If I go with a virtualization system (VMware Fusion), do I need a good anti-virus program (Norton/McAfee, etc.), or does that conflict with OS X Leopard?

    Windows on Intel Macs
    There are presently several alternatives for running Windows on Intel Macs.
    1. Install the Apple Boot Camp software. Purchase Windows XP w/Service Pack 2 or Vista. Follow the instructions in the Boot Camp documentation on installing Boot Camp, creating the driver CD, and installing Windows. Boot Camp enables you to boot the computer into OS X or Windows.
    2. Parallels Desktop for Mac and Windows XP, Vista Business, or Vista Ultimate. Parallels is software virtualization that enables running Windows concurrently with OS X.
    3. VMware Fusion and Windows XP, Vista Business, or Vista Ultimate. VMware Fusion is software virtualization that enables running Windows concurrently with OS X.
    4. CrossOver which enables running many Windows applications without having to install Windows. The Windows applications can run concurrently with OS X.
    Note that Parallels and VMware Fusion can also run other operating systems such as Linux, Unix, OS/2, Solaris, etc. There are performance differences between dual-boot systems and virtualization. The latter tends to be a little slower (not much) and does not provide the video performance of a dual-boot system.
    If you install Vista you must use Vista Business or Ultimate with the virtual machine software. You will need Windows anti-virus software. It isn't required for OS X.

  • Declare Internal Table in Code or in Data Dictionary...Opinions needed?

    Is it better to define your internal table and all of its fields in a BEGIN OF/END OF declaration statement in your code's data definitions? Or is it better to create a structure in the Data Dictionary, which means one line of code in your program? Why does one choose the method they prefer?
    I am thinking it is best to declare it in the Data Dictionary, especially since ABAP is now leaning towards objects. However, I am seeing a lot of field-by-field definitions in programs. I am very interested in how most programmers accomplish this and why.
    Thank you

    I would say that if the internal table is to be used in more than one program, then I would create a structure and/or table type in the ABAP dictionary.  If it is to be used in just one program, then I would define it locally.
    Regards,
    Rich Heilman

  • Using an Excel file as a dynamic data source? Opinions needed...

    I have posted this topic before, but as always; in order to get the relevant correct answer you have to ask the correct question.
    I'm trying to create a number of Pricelists linked to an Excel/CSV file. I have a Excel file that contains Pricelist information which is Product specific.
    I have had a number of suggestion that follow:
    A direct link to the Excel file. PROs: The Excel file can be uploaded via FTP and overwritten if (and when) amended. Linking this is easy peasy in Dreamweaver. A person browsing can download the info straight away on request - no hassle. CONs: Simply, not everyone has Excel, and those who don't cannot access the information.
    Import the Excel file as tabular data. PROs: Fairly easy to do in Dreamweaver. A person browsing can see the info straight away. CONs: Can be time consuming for larger Excel files. NOT amendable (so a number of price changes becomes a big job). Can't simply overwrite the Excel file on FTP. (Larger Excel files can take a lot of page space and thus require tonnes of scrolling.)
    Use the Excel file as a dynamic data source. --Not entirely sure how I would go about doing this (any suggestions/links/tutorials etc)-- PROs: ? CONs: Contributor added this suggestion is not a good idea as it performs poorly on the web.
    Create a dynamic page using a database and import the Excel file into that... or maintain the price list in the database rather than an Excel file. --Again, not entirely sure how I would go about doing this (any suggestions/links/tutorials etc.)-- My understanding of this option is that it will require XML data and Spry work. Is this correct? Can this be done by someone who is not an advanced user?
    If, once again, I'm off the mark and better suggestions can be thought of - please do say so.
    As you can see, I'm at a bit of a crossroads, so any suggestions, comments, help, links, tutorials or applause would be greatly received.
    Thanks in advance and looking forward to seeing the comments this throws up!

    Hi
    Although not everyone has Excel, just about everyone can open CSV files in some way; if not, offer the option to download OpenOffice, which is free for the PC. As for the Mac, it has a program installed by default that can open CSV files.
    The import-tabular-data option is not really an option, for the reasons stated.
    Using Excel as a data source is not a good idea: it requires ASP.NET to work correctly, otherwise it runs slowly, and it is not recommended if you are expecting more than a very small number of users.
    As for the dynamic page with a database, this can be done with XML and Spry, but if you have a large amount of data this is almost as bad as the option above. You are probably better off creating a database and importing your Excel spreadsheet into it; for tutorials on creating a PHP/MySQL database and setting up a testing server see - http://www.adobe.com/devnet/dreamweaver/application_development.html.
    No matter which way you go, the last option will require a fair amount of knowledge and experience to do correctly, efficiently and securely.
    PZ
    www.pziecina.com
