BRTOOLS with tape continuation for large database

Hello,
I have an R/3 database of 2 TB which needs to be copied onto a new staging server for an upgrade.
The problem I am facing is that I don't have any space in the SAN (storage) for taking the backup on disk.
So the only option I have is to take backups on tape.
I even tried a backup in compress mode on tape, but ended up with a CPIO error for files larger than 2 GB (note 20577).
And since the database size is 2 TB and the tapes I have hold 800 GB, I would have to use multiple tapes.
Since some of the files in the database, like *DATA_1, range between 6 GB and 10 GB, CPIO cannot handle them (it cannot handle files larger than 2 GB, as per note 20577).
So I had to change the parameter tape_copy_cmd = dd in init<sid>.sap.
But dd ends with an error message once the end of the tape is reached, thereby failing my backup.
Please help me get out of this situation.
Regards,
Guru

Hi,
Please check the 'Sequential backup' section in the backup guide. If it's not possible to use a tape with a larger capacity, you could use this method instead.
You would need to add/modify the following parameters in init<sid>.sap:
1. volume_backup
2. tape_address
3. tape_address_rew
4. exec_parallel
You'll find more info about these parameters at www.help.sap.com and in the backup guide itself; a minimal profile sketch follows below.
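For example, a minimal profile sketch for a sequential backup over several tapes with a single drive; the volume names and device paths here are just placeholders, so adjust them to your environment and check them against the backup guide:
backup_dev_type = tape
tape_copy_cmd = dd
volume_backup = (SIDB01, SIDB02, SIDB03)   # one volume name per tape to be used
tape_address = /dev/rmt/0mn                # non-rewinding device (placeholder path)
tape_address_rew = /dev/rmt/0m             # rewinding device (placeholder path)
exec_parallel = 0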
Br,
Javier

Similar Messages

  • RMAN Tips for Large Databases

    Hi Friends,
I'm just starting to administer a large 10.2.0.1.0 database on Windows Server.
Do you have tips or docs on best practices for large databases? I mean as large as 2 TB of data.
I'm good at administering small and medium DBs, but some of them just keep getting bigger and bigger!
    Tks a lot

I would like to mention the links below:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28318/partconc.htm
    http://download.oracle.com/docs/cd/B28359_01/server.111/b32024/vldb_backup.htm
For some good advice and considerations for RMAN with a VLDB:
    http://sosdba.wordpress.com/2011/02/10/oracle-backup-and-recovery-for-a-vldb-very-large-database/
    Google "vldb AND RMAN in oracle"
    Regards
    Girish Sharma

  • SAP EHP Update for Large Database

    Dear Experts,
    We are planning for the SAP EHP7 update for our system. Please find the system details below
    Source system: SAP ERP6.0
    OS: AIX
    DB: Oracle 11.2.0.3
    Target System: SAP ERP6.0 EHP7
    OS: AIX
    DB: 11.2.0.3
    RAM: 32 GB
The main concern here is the DB size, which is approximately 3 TB. I have already gone through forums and notes; they state that DB size does not have any impact on the SAP EHP update using SUM. However, I still think it will have an impact in the downtime phase.
    Please advise on this.
    Regards,
    Raja. G

    Hi Raja,
Although a 3 TB DB size may not have a direct impact on the upgrade process itself, the downtime of the system may vary with a larger database.
Points to consider:
1) DB backup before entering the downtime phase
2) The number of programs and tables stored in the database: ICNV table conversions and XPRA execution will depend on these.
    Hope this helps.
    Regards,
    Deepak Kori

DR strategy for large database

    Hi All,
We have a 30 TB database for which we need to design a backup strategy (Oracle 11gR1 SE, 2-node RAC with ASM).
The client needs a DR site for the database, and from the DR site we will be running tape backups.
The main constraints we are facing are the size of the DB, which will grow to 50 TB in the future, and the fact that we are running Oracle Standard Edition.
Taking a full RMAN backup to a SAN box takes around one week for a DB size of 30 TB.
Options for us:
1. Create a manual standby and apply archive logs (we can't use Data Guard as we are on SE)
2. Storage-level replication (using HP Continuous Access)
3. Use third-party tools such as SharePlex, GoldenGate, DBVisit etc.
Which one will be the best option here with respect to cost and time, or is there any other option better than these?
We can't upgrade to Oracle EE as of now since we need to meet the project deadline for the client. We are migrating legacy data to production now and this would be interrupted if we went for an upgrade.
    Thanks in advance.
    Arun
    Edited by: user12107367 on Feb 26, 2011 7:47 AM
    Modified the heading from Backup to DR

    Arun,
Yes, this limitation about BCT (block change tracking) is problematic in SE, but after all, if everything were included in SE, who would pay for the EE licence? :)
The only good thing when BCT is not in use is that RMAN checks the whole database for corruption even if the backup is an incremental one. There is no miraculous "full Oracle" solution if your backups are so slow, but as you mentioned, a manual standby with delayed, periodic application of the archives is possible. It's up to you to evaluate whether it works in your case though: how many archive log files will you generate daily, and how long will it take to apply them in your environment?
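A minimal sketch of the recurring apply step on such a manual standby, assuming the archived logs are shipped to the standby's archive destination by your own job (cron, rsync, etc.):
-- on the standby instance (built from a standby controlfile), mounted but not open
STARTUP MOUNT;
-- apply all shipped archive logs; AUTOMATIC makes SQL*Plus look them up itself
RECOVER AUTOMATIC STANDBY DATABASE;
-- leave the database mounted until the next apply cycle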
(A note about GoldenGate: it is no longer a third-party tool; it is now an Oracle product and is clearly positioned as the recommended replacement for Streams.)
    Best regards
    Phil

SQL Server Migration Assistant (SSMA) for Oracle okay for large database migrations?

    All:
    We don't have much experience with the SSMA (Oracle) tool and need some advice from those of you familiar with it.  We must migrate an Oracle 11.2.0.3.0 database to SQL Server 2014.  The Oracle database consists of approximately 25,000 tables and 30,000
    views and related indices.  The database is approximately 2.3 TB in size.
    Is this do-able using the latest version of SSMA-Oracle?  If so, how much horsepower would you throw at this to get it done?
    Any other gotchas and advice appreciated.
    Kindest Regards,
    Bill
    Bill Davidson

Hi Bill,
SSMA supports migrating large Oracle databases. To migrate an Oracle database to SQL Server 2014, you could use the latest version:
Microsoft SQL Server Migration Assistant v6.0 for Oracle. Before the migration, you should pay attention to the points below.
1. The account that is used to connect to the Oracle database must have at least CONNECT permissions. This enables SSMA to obtain metadata from schemas owned by the connecting user. To obtain metadata for objects in other schemas and then convert objects in those schemas, the account must have the following permissions: CREATE ANY PROCEDURE, EXECUTE ANY PROCEDURE, SELECT ANY TABLE, SELECT ANY SEQUENCE, CREATE ANY TYPE, CREATE ANY TRIGGER, SELECT ANY DICTIONARY (a grant sketch is shown after this list).
    2.Metadata about the Oracle database is not automatically refreshed. The metadata in Oracle Metadata Explorer is a snapshot of the metadata when you first connected, or the last time that you manually refreshed metadata. You can manually update metadata
for all schemas, a single schema, or individual database objects. For more information about the process, please refer to this article:
    https://msdn.microsoft.com/en-us/library/hh313203(v=sql.110).
    3.The account that is used to connect to SQL Server requires different permissions depending on the actions that the account performs as the following:
     • To convert Oracle objects to Transact-SQL syntax, to update metadata from SQL Server, or to save converted syntax to scripts, the account must have permission to log on to the instance of SQL Server.
     • To load database objects into SQL Server, the account must be a member of the sysadmin server role. This is required to install CLR assemblies.
     • To migrate data to SQL Server, the account must be a member of the sysadmin server role. This is required to run the SQL Server Agent data migration packages.
     • To run the code that is generated by SSMA, the account must have Execute permissions for all user-defined functions in the ssma_oracle schema of the target database. These functions provide equivalent functionality of Oracle system functions, and
    are used by converted objects.
     • If the account that is used to connect to SQL Server is to perform all migration tasks, the account must be a member of the sysadmin server role.
For more information about the process, please refer to this article:
    https://msdn.microsoft.com/en-us/library/hh313158(v=sql.110)
    4.Metadata about SQL Server databases is not automatically updated. The metadata in SQL Server Metadata Explorer is a snapshot of the metadata when you first connected to SQL Server, or the last time that you manually updated metadata. You can manually update
    metadata for all databases, or for any single database or database object.
    5.If the engine being used is Server Side Data Migration Engine, then, before you can migrate data, you must install the SSMA for Oracle Extension Pack and the Oracle providers on the computer that is running SSMA. The SQL Server Agent service must also
    be running. For more information about how to install the extension pack, see Installing Server Components (OracleToSQL). And when SQL Express edition is used as the target database, only client side data migration is allowed and server side data migration
is not supported. For more information about the process, please refer to this article:
    https://msdn.microsoft.com/en-us/library/hh313202(v=sql.110)
For how to migrate Oracle databases to SQL Server, please refer to this article:
    https://msdn.microsoft.com/en-us/library/hh313159(v=sql.110).aspx
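As a minimal sketch of the Oracle-side permissions from point 1, assuming a dedicated migration account (SSMA_USER and its password are hypothetical names):
CREATE USER ssma_user IDENTIFIED BY ChangeMe_123;   -- placeholder password
GRANT CONNECT TO ssma_user;
-- needed to read, convert and load objects in other schemas
GRANT CREATE ANY PROCEDURE, EXECUTE ANY PROCEDURE TO ssma_user;
GRANT SELECT ANY TABLE, SELECT ANY SEQUENCE TO ssma_user;
GRANT CREATE ANY TYPE, CREATE ANY TRIGGER TO ssma_user;
GRANT SELECT ANY DICTIONARY TO ssma_user;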
    Regards,
    Michelle Li

  • Beta Testing Continues for "Oracle Database 11g: Performance Tuning" Exam

The beta period continues for the Oracle Database 11g: Performance Tuning certification exam (1Z1-054), which is a single-exam requirement for Oracle 11g DBA OCPs to earn the Oracle Database 11g Performance Tuning Certified Expert (OCE) certification. Read more on the Oracle Certification Blog: http://bit.ly/UGJh4

    Hello Hussein
Yes, true. I remember that for the OCE and Linux exams they rescheduled the end date several times. As far as I know it is related to the number of participants and the feedback given.
I've also participated in several other beta exams, and I must admit that it is a long and hard process to get through. When I got the feedback 10 weeks after the beta period closed, I had to review nearly all the topics to get the exam passed the second time. But it is cheap and good exam preparation.
    What about you Hussein? Do you think that's trivial?
    Cheers,
    Hub

  • Oracle for large database + configuration

    Hi
    I have some historical data for Stocks and Options that I want to save into an Oracle database. Currently I have about 190G of data and expect to grow about 5G per month. I have not completely thought about how to organize the tables. It is possible that there might just be one table which might be larger than the hard disk I have.
    I am planning to put this on a DELL box, running Windows 2000. Here is the configuration.
    Intel Xeon 2.4GHz, 2G SDRAM, with 3 146G SCSI harddrive with PERC3 SCSI controller. This machine roughly costs 7000$.
Is there any reason this won't work? Will Oracle be able to organize one database across multiple disks? How about tables? Can tables span multiple disks?
    All this data is going to be read only.
    My other cheaper choice is
An Intel box running a P3, 2 GB RAM, two 200 GB IDE drives. My question for this is: will this configuration work?
    Also, for this kind of a database what kind of total disk space should I budget for?
    thanks
    Venkat

Server Manager was deprecated in 9i. Instead of using it you have to use SQL*Plus. Do you have another JRE installed?
    How to create a database manually in 9i
    Administrator's Guide for Windows Contents / Search / Index / PDF
    http://download-east.oracle.com/docs/cd/B10501_01/win.920/a95491.pdf
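Regarding the earlier question about spanning multiple disks: yes, a tablespace (and therefore the tables in it) can have datafiles on several drives. A minimal sketch, with hypothetical paths on the D: and E: drives and an example table:
CREATE TABLESPACE stock_data
  DATAFILE 'D:\oradata\stk\stock_data01.dbf' SIZE 30000M,
           'E:\oradata\stk\stock_data02.dbf' SIZE 30000M;
-- a table in this tablespace can grow across both files, i.e. both disks
CREATE TABLE quotes
( symbol     VARCHAR2(10)
, trade_date DATE
, price      NUMBER
) TABLESPACE stock_data;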
    Joel Pérez

  • Remove fragmentation for large database

I have two databases, each with a page file size close to 80 GB and an index of 4 GB. The average clustering ratio on
them is close to 0.50. I am in a dilemma over how to defragment these databases. I have two options:
1> Export level 0 data, clear all data (using reset), re-import level 0 data and fire a calc all.
2> Export all data, clear all data (using reset), re-import all data.
Here is the situation:
-> The export of all data runs for 19 hours, hence I could not continue with option 2.
-> Option 1 works fine, but when I fire the calc after loading level 0 data, the average clustering ratio
goes back to 0.50. So the database is fragmented again and I am back to the point where I started.
How do you guys suggest handling this situation?
This is Essbase 7 (yeah, it is old).

The old thread below seems to be worth reading:
    [Thread: Essbase Defragmentation|http://forums.oracle.com/forums/thread.jspa?threadID=713717&tstart=0]
    Cheers,
    -Natesh

Need help with complex query for production database

Hello again,
I need your help again, for a query that shows me how long every production step takes per order.
See the sample data and what I expect below.
    Thank you all for your help.
    We use Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
Here are the sample data tables:
CREATE TABLE TABLE_2
(    "ORDER_NR"    VARCHAR2(12)
,    "PRIORITY"    VARCHAR2(2)
,    "WO_STEP"     VARCHAR2(1)
,    "STEP_DATE"   DATE
);
CREATE TABLE TABLE_1
(    "ORDER_NR"      VARCHAR2(12) PRIMARY KEY
,    "PRIORITY"      VARCHAR2(2)
,    "CREATE_DATE"   DATE
,    "ACT_STEP"      VARCHAR2(2)
,    "STEP_DATE"     DATE
,    "EMPLOYEE"      VARCHAR2(5)
,    "DESCRIPTION"   VARCHAR2(20)
);
    INSERT      INTO      TABLE_1      (ORDER_NR,               PRIORITY,      CREATE_DATE,                                                        ACT_STEP,     STEP_DATE,                                                            EMPLOYEE,     DESCRIPTION)
                        VALUES           ('1KKA1T205634',     '12',          TO_DATE('10-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS'),     'U',          TO_DATE('28-FEB-13 12:00:00','DD-MON-RR HH24:MI:SS'),     'W0010',     'CLEAN HOUSE');
    INSERT      INTO      TABLE_1      (ORDER_NR,               PRIORITY,     CREATE_DATE,                                                        ACT_STEP,     STEP_DATE,                                                            EMPLOYEE,     DESCRIPTION)
                        VALUES           ('1KKA1Z300612',     '12',          TO_DATE('08-FEB-13 14:00:00','DD-MON-RR HH24:MI:SS'),     'F',          TO_DATE('20-FEB-13 16:00:00','DD-MON-RR HH24:MI:SS'),     'K0052',     'REPAIR CAR');
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'A',          TO_DATE('12-FEB-13 13:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          '5',          TO_DATE('13-FEB-13 09:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'K',          TO_DATE('13-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          '5',          TO_DATE('13-FEB-13 11:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'K',          TO_DATE('13-FEB-13 12:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          '5',          TO_DATE('13-FEB-13 16:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'C',          TO_DATE('14-FEB-13 08:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'B',          TO_DATE('14-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'E',          TO_DATE('18-FEB-13 13:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'F',          TO_DATE('20-FEB-13 16:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'S',          TO_DATE('21-FEB-13 08:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'R',          TO_DATE('21-FEB-13 09:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1T205634',     '12',          'U',          TO_DATE('28-FEB-13 12:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'A',          TO_DATE('12-FEB-13 13:52:42','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          '5',          TO_DATE('13-FEB-13 09:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'K',          TO_DATE('13-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          '5',          TO_DATE('13-FEB-13 11:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'K',          TO_DATE('13-FEB-13 12:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          '5',          TO_DATE('13-FEB-13 16:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'C',          TO_DATE('14-FEB-13 08:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'B',          TO_DATE('14-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'E',          TO_DATE('18-FEB-13 13:00:00','DD-MON-RR HH24:MI:SS'));
    INSERT     INTO      TABLE_2      (ORDER_NR,               PRIORITY,     WO_STEP,     STEP_DATE)
                        VALUES           ('1KKA1Z300612',     '12',          'F',          TO_DATE('20-FEB-13 16:00:00','DD-MON-RR HH24:MI:SS'));
COMMIT;
And here is what I expect from my query:
SYSDATE       28.Feb.13 14:00
ORDER_NR      PRIORITY  CREATE_DATE      STATUS  STATUS_DATE      DESCRIPTION  AGE_1   AGE_2   WAITING  STEP_A  STEP_B  STEP_C  STEP_5  STEP_K  STEP_E  STEP_F  STEP_S  STEP_R
1KKA1T205634  12        10.Feb.13 10:00  U       28.Feb.13 12:00  CLEAN HOUSE  18,083  8,833   2,125    0,833   4,125   0,083   0,750   0,208   2,125   0,666   0,042   7,125
1KKA1Z300612  12        08.Feb.13 14:00  F       20.Feb.13 16:00  REPAIR CAR   20,000  16,042  2,125    0,833   4,125   0,083   0,750   0,208   2,125   0,666
And now the explanation of the query result:
AGE_1 is the difference in days between CREATE_DATE and, if it exists, the STEP_DATE of step 'U'; if step 'U' is not found in TABLE_2, it should show the difference in days between CREATE_DATE and SYSDATE.
AGE_2 is the difference in days between the STEP_DATE of step 'A' and, if it exists, the STEP_DATE of step 'R'; if step 'R' is not found in TABLE_2, it should show the difference in days between CREATE_DATE and SYSDATE.
WAITING is the difference in days between CREATE_DATE and the STEP_DATE of step 'A'.
The following columns show how many days the ORDER_NR stays in each STEP; if an ORDER_NR enters the same STEP more than once, the durations should be added together.
If the ORDER_NR skips a step, it should show a zero in the specific field.
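For example, AGE_1 of order 1KKA1T205634 in the expected output (18,083) is just the date arithmetic between its step 'U' STEP_DATE and its CREATE_DATE, which can be checked with the sample dates above:
SELECT TO_DATE('28-FEB-13 12:00:00','DD-MON-RR HH24:MI:SS')
     - TO_DATE('10-FEB-13 10:00:00','DD-MON-RR HH24:MI:SS') AS age_1
  FROM dual;
-- 18 days and 2 hours = 18 + 2/24 = 18.083 days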
    I hope my explanation is good enough, my english skills are far away from good.
    Thanks all for your help.
    Greets Reinhard W.

    Hi,
I changed this query:
with t2 as (
            select  t.*,
                    lead(step_date) over(partition by order_nr order by step_date) next_step_date
              from  table_2 t
           )
select  t1.*,
        nvl(
            max(
                case t2.wo_step
                  when 'U' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_1,
        nvl(
            max(
                case t2.wo_step
                  when 'R' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_2,
        sum(
            case t2.wo_step
              when 'B' then t2.next_step_date - t2.step_date
            end
           ) step_b
  from  table_1 t1,
        t2
  where t2.order_nr = t1.order_nr
  group by t1.order_nr,
           t1.priority,
           t1.create_date,
           t1.act_step,
           t1.step_date,
           t1.employee,
           t1.description
To this:
with t2 as (
            select  t.*,
                    lead(step_date) over(partition by order_nr order by step_date) next_step_date
              from  table_2 t
           )
select  t1.order_nr,
        nvl(
            max(
                case t2.wo_step
                  when 'U' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_1,
        nvl(
            max(
                case t2.wo_step
                  when 'R' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_2,
        sum(
            case t2.wo_step
              when 'B' then t2.next_step_date - t2.step_date
            end
           ) step_b
  from  table_1 t1,
        t2
  where t2.order_nr = t1.order_nr
  group by t1.order_nr
Then I get the ORA-00979 error.
What's wrong?
I have another question.
How can I handle it if I want to group two or more STEPs into one column?
In this case I want the column STEP_B to contain all days for STEP 'B' and STEP '5'.
I already tried a + operation like this:
with t2 as (
            select  t.*,
                    lead(step_date) over(partition by order_nr order by step_date) next_step_date
              from  table_2 t
           )
select  t1.*,
        nvl(
            max(
                case t2.wo_step
                  when 'U' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_1,
        nvl(
            max(
                case t2.wo_step
                  when 'R' then t2.step_date
                end
               ),
            sysdate
           ) - t1.create_date age_2,
        round( sum(
            case t2.wo_step
              when 'B' then t2.next_step_date - t2.step_date
            end
           ) +
        sum(
            case t2.wo_step
              when '5' then t2.next_step_date - t2.step_date
            end
           ), 3 ) step_b
  from  table_1 t1,
        t2
  where t2.order_nr = t1.order_nr
  group by t1.order_nr,
           t1.priority,
           t1.create_date,
           t1.act_step,
           t1.step_date,
           t1.employee,
           t1.description
But this returns NULL every time.
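A minimal sketch of an alternative for the combined column; it assumes the NULLs come from one of the two SUMs having no matching rows (adding NULL to a number yields NULL), which a single CASE over both steps, or an NVL around each SUM, would avoid:
        round(
            sum(
                case
                  when t2.wo_step in ('B', '5') then t2.next_step_date - t2.step_date
                end
               ), 3 ) step_b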
    Thank You.

  • Selection Interface for large database

I am looking for a working example of a CF selection field that fills or builds a name list as you type. The database has about 600,000 names, with 400 new people being added each day. I am looking for a smart tool that watches you type and brings down a name list as you go. In the list the user needs to see the name and other identifying information like DOB and phone number. The user clicks the row and the person's record is located. I think I have a good understanding of the CFC side of this.
    If you type fast the tool waits for a second. "Sounds like" support would also be nice.
Thanks for any ideas.

    You mean AutoSuggest? See this link:
    http://forta.com/blog/index.cfm/2007/5/31/ColdFusion-Ajax-Tutorial-1-AutoSuggest
You might want to adjust the code to work on the official version, because the example was built for a beta version.

  • Brbackup for large database

    Dear All,
On our PRD server the DB size is more than 800 GB. We are now using BRTOOLS for the backup. The tape size is 400/800 GB Ultrium 3. In the future the DB size may increase. How should we take the backup? Is there any way to split up the backup?
Kindly give a solution.
    Regards
    guna

    Hello,
if your backup is too big for one tape, you may do one of the following:
Use more than one tape drive. You may specify them in your init<SID>.sap file.
    Use one drive, and manually enter another tape as soon as the first is full. You probably don't want to, though...
    Use one drive and a loader that will automatically change tapes.
    So most probably you will have to pay for additional hardware.
    regards

  • Backups for large databases

I am wondering how people do restores of very large DBs. Ours is not that large yet, but it will grow to the point where exports and imports are not feasible. The data only changes periodically, and as it is a web application, cold backups are not really an option. We don't run in archive log mode because of the static nature of the data. Any suggestions?

Put the read-only tables in a read-only tablespace, the slowly changing tables in another tablespace, and the most frequently changing ones in a third.
Take a transportable tablespace export of the frequently changing tablespace daily, and of the slowly changing ones 2-3 times a week (depending on your site specifics). This involves nothing but a metadata export of the data dictionary info for the exported tablespaces and an OS-level copy of the datafiles of those tablespaces.
This is the best way for you to backup/recover. Check out the Oracle documentation or this website for transportable tablespaces; a minimal sketch follows below.
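A minimal sketch of that daily flow for a tablespace named FREQ_DATA (a hypothetical name), using the classic exp utility; Data Pump (expdp with TRANSPORT_TABLESPACES) works along the same lines on newer releases:
-- make the tablespace read-only so its datafiles are consistent for the copy
ALTER TABLESPACE freq_data READ ONLY;
-- from the OS shell, export only the metadata for the tablespace:
--   exp system/... transport_tablespace=y tablespaces=freq_data file=freq_data.dmp
-- then copy the tablespace's datafiles at OS level to the backup destination
-- and put the tablespace back into read/write mode
ALTER TABLESPACE freq_data READ WRITE;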
I guess it comes to a point where you have to make a tradeoff between performance and recoverability. In my opinion, always take recoverability over performance.
If the periodic change of data is nothing but a bulk data load, then take a backup of the database after the data load. Having multiple recovery scenarios is the best approach for recovery.

  • How to use Source Code Control for Large Application?

    Hi, All!
I would like to collect knowledge about "best practice" examples for using source code control and project organization for a relatively large application (let's say approx. 1000 SubVIs).
    Tools used:
    LabVIEW 8.0
    CVS Server
    PushOK CVS Proxy Client
    WinCVS
With LabVIEW 8 we can organize a large project pretty well. This is described in the article Managing Large Applications with the LabVIEW Project.
I have also read this article: Using Source Control Software with LabVIEW. In that article SourceSafe is used, but with PushOK everything looks nearly the same and works (some tricks for the compare function are required).
Example: two developers working together on the same project. Internally the project is modular, so one developer will work on module "Analysis" and another one on "Configuration" without interference. These modules are placed into subfolders as shown in the example above.
    Scenario 1:
Developer A starts with a modification of module "Analysis". Some files are checked out. He would like to add some SubVIs, so he must also check out the project file (*.lvproj), otherwise he cannot add anything to the project structure.
Developer B at the same time would like to add some new functions to module "Configuration". He also needs to check out the project file, but this file is already checked out by Developer A (and locked). So he must wait until the lvproj file is checked in. Another way is to mark *.lvproj files as text files in PushOK, but then one of the developers will get a conflict message on check-in and merging will be necessary. This situation will come up very often, because in most cases the *.lvproj file will be checked out all the time.
    Question: Which practice is better for such situation? Is Libraries better than folder for large project?
    Scenario 2:
Developer C joins the team. First, he must get the complete project code to start (or maybe at least the code of one library assigned to him).
Question: How can this be done within the LabVIEW IDE? Or should WinCVS (or another SCC UI) be used for the initial checkout?
    Scenario 3:
Developer D is responsible for the build. Developers A, B, and C have added a lot of files to the modules "Analysis", "Configuration" and "FileIO". For building, he needs to get the complete code. If our project is split into folders, he should get the latest *.lvproj first, then the newly added SubVIs will appear in the Project Explorer, then he should expand the tree, select all SubVIs and get the latest versions of all of them. If the project is organized in libraries, he must do the same for each library, mustn't he?
Question: Is this the "normal way", or should WinCVS be used for this? In WinCVS it's possible with two mouse clicks, but I would prefer to get all code from CVS within the LabVIEW IDE recursively...
That was a long post... So, if you are already working with LabVIEW 8 and SCC on a large project, please post your knowledge here about project structure (folders or libraries) and best practices; it may be helpful and useful for all of us. Any examples/use cases/links etc. are appreciated.
    Thank you,
    Andrey

    Regarding your scenarios:
    1. Using your example, let's say both developers checked out version 3
    of the project file. Assuming that there are only files under the
    directories in the example project, when Developer A checks in his
    version of the project, there will be new files in one section of the
    project separate from where Developer B is working. Developer B,
    notices that there is now a version 4 of the project. He needs to
    resolve the changes so will need to merge his changes to the latest
    version of project file. Since the project file is a text file, that is
easy to do. Where an issue arises is that after Developer B checks in
    his merged changes, there is a revision 5. When Developer A and B go to
    make another change, they get the latest version which will have the
    merged changes to the project file but not the referenced files from
    both Developer A and B. So when A opens version 5, he sees that he is
missing the files that B checked in and vice versa. Here is where the
developers will need to manually use the source control client and,
    external to LabVIEW, get those new files.
    Where libraries help with the above scenario is that the library is a
    separate file from the project so changes made to it outside of the
    project do not require the project to be modified. So this time, the
developers are using a single project again which this time references
    two libraries. The developers check out the libraries, make changes to
    the libraries, and then check those changes in. So when each developer
opens the project file, since it references the libraries, the
    changes to the library will be reflected. There is still the issue of
    the new files not automatically coming down when the latest version of
the library is obtained. Again, the developers will need to manually
    use the source control client and, external to LabVIEW, get those new
files. In general, you should take advantage of the modularity that
    libraries provide.
    2. As noted in the above scenario, there is no intrinsic mechanism to
    get all files referenced by a LabVIEW project. Files that are missing
    will be noted. The developer will then have to use the source control
    provider's IDE to get the initial contents of the project  (or library).
    3. See above scenarios.
    George M
    National Instruments

  • I want JDBC driver for Sybase database...

    Hi All:
Can anyone help me with a JDBC driver for a Sybase database? I have surfed the net, but have not been able to find a Sybase driver that works correctly.
Also, can you please let me know if anything special needs to be done in the case of a Sybase database connection, apart from what we normally do to connect to a database like "Oracle" or "MS SQL Server".
    Thanks and Regards,

    And why do you think someone using Oracle might know the Competitor's drivers?
    Having said this: you should give jTDS a try:
    http://jtds.sourceforge.net
Also, can you please let me know if anything special
needs to be done in case of Sybase database
connection, apart from what we normally do to connect
to a database like "Oracle" or "MS SQL Server".
It's all in the docs of the driver.

  • Can I use tape drive for backup with win 2012 OS & Sybase database?

I have installed enhancement package 6 for SAP ERP 6.0 with OS Windows 2012 and database Sybase ASE (15.7.050). I am facing a problem while taking a database backup to a tape drive. I want to know how to initialize the tape drive in the case of a Sybase DB, and whether it is possible to take a backup on it. I failed to schedule the backup from DB13, so now I am doing that activity on the command line. Please help me solve these issues.
    Here I am attaching some files related with backup commands that presently I am using. My SAP system ID is DQS.

    That output actually looks pretty good, it was dumping to the tape drive, it just ran out of tape because the drive apparently doesn't automatically detect when it is reaching the end.
    For the next step, try adding a CAPACITY parameter to your dump command as discussed at these places in the documentation.
    Specifying tape capacity for dump commands
    Set the Maximum Capacity for a Tape Drive
    Your output above shows it was able to dump 87565594 KB before it ran out of tape,
so let's be a little conservative and try
dump database DQS to TAPE1 with capacity = 87000000, init
go
    When the dump reaches that point, ASE should prompt you to pop in another new tape
    and then issue the sp_volchanged procedure from a second session (it will tell you what parameters to use for sp_volchanged).
    Note: once we get the dump working successfully, we can then create a logical dump device that would store the capacity value, and create a full dump configuration so you gain all the new functionality like dump history, etc.
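A minimal sketch of that later step, assuming /dev/nst0 is the tape device path on the ASE host (a placeholder; adjust to your environment); the capacity can still be given on the dump command itself:
/* register the tape drive under a logical name */
sp_addumpdevice 'tape', 'TAPE1', '/dev/nst0'
go
/* dump to the logical name, keeping the conservative capacity from above */
dump database DQS to TAPE1 with capacity = 87000000, init
go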
    Enhancements to dump and load
    -bret
