Using Archiver to replicate tables

Hi,
I'm trying to replicate a single table between two Content Server instances using Archiver.
CREATE TABLE TEST
(
  ID NVARCHAR2(20) NOT NULL,
  SCHCREATETIMESTAMP TIMESTAMP(6),
  SCHMODIFYTIMESTAMP TIMESTAMP(6),
  SCHSOURCEID NVARCHAR2(50)
);
When I insert or update records, "Export Archiver" in manual mode works fine! But in this mode the Archiver does not pick up deleted records.
I tried "Export Archiver" in automatic mode, but it doesn't seem to work: it does not pick up the changes to the records (insert, update, or delete).
Is anybody using this feature?
Content Server 7.5.1
Thanks.

Be careful before engaging a DML apply handler in such unusual things... Take time to experiment with the basic functionality first.
I have not had the time to do such experiments myself yet... but I'd rather be careful with new features until I have seen them work satisfactorily.
I personally would investigate using materialized views for what you describe.
Regards,
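For what it's worth, a minimal sketch of that materialized-view idea, assuming a database link from the target instance back to the source (SRC_LINK is an invented name) and a primary key on TEST so that fast refresh works:

-- On the source: track changes for fast refresh (assumes TEST has a PK).
CREATE MATERIALIZED VIEW LOG ON TEST WITH PRIMARY KEY;

-- On the target: pull changes every 5 minutes (the interval is illustrative).
CREATE MATERIALIZED VIEW TEST_MV
  REFRESH FAST
  START WITH SYSDATE NEXT SYSDATE + 5/1440
AS
  SELECT ID, SCHCREATETIMESTAMP, SCHMODIFYTIMESTAMP, SCHSOURCEID
    FROM TEST@SRC_LINK;

Unlike the manual Archiver export, a refresh of this kind also propagates deletes.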

Similar Messages

  • Customised archiving object for archiving a Y table

    Hi Guys,
    I am trying to create a custom archiving object for archiving a Y table.
    BC_SBOOK is the archiving object I am using as the template. It has no archiving class defined in the AOBJ transaction.
    Could you please let me know whether it is necessary to create the archiving class?
    Also, could I get any reference document for the same?
    Thanks & regards,
    Jack

    Hi Jack,
    Please refer to the link below for more information on archiving classes:
    [https://cw.sdn.sap.com/cw/docs/DOC-14096]
    Regards
    Abhii

  • Replicate Tables as Staging Tables in a DWH

    Hi
    We are in the process of building a Data Mart; for the staging area we are considering GG as the Change Data Capture tool.
    The idea is that in the source (OLTP database) we have a table called T1. This table has no PK, is partitioned by day, generates around 20 million rows per day, and sees a few hundred thousand changes (updates and deletes). In the staging area we will set up a table called STG_T1 with the same structure as T1 plus a few columns such as:
    @GETENV("TRANSACTION" , "CSN")
    @GETENV("GGHEADER", "COMMITTIMESTAMP")
    @GETENV("GGHEADER", "LOGRBA")
    @GETENV("GGHEADER", "LOGPOSITION")
    @GETENV("GGHEADER", "OPTYPE")
    @GETENV("GGHEADER", "BEFOREAFTERINDICATOR")
    All the changes will be converted to INSERTs using INSERTALLRECORDS in the Replicat. This has a problem: since we don't have a PK in the source, we don't know how to identify a row's change history in STG_T1.
    Has anyone got experience replicating OLTP to a staging area using OGG, and with the ETL basics for propagating the changes from the staging area to the fact tables?
    Thanks

    If there is no primary key on the source, then when you do the ADD TRANDATA, all columns will be supplementally logged. This is probably what you want, so that you have all columns available when you apply the operation as an insert on the target.
    Even if you don't have a primary key on the target table, you can give Replicat a KEYCOLS on the MAP using one of the target columns - it won't really make any difference which column you pick, since you are only going to be applying inserts, so Replicat never has to format a WHERE clause. However, with no primary key on the target side, you do want to make sure you have enough information on the record to make each row unique.
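    For illustration, a minimal Replicat sketch of that advice (the group, schema, and column names here are invented, not from your environment):
    -- Hypothetical Replicat parameter file: every change is applied as an
    -- insert into the staging table; the KEYCOLS choice is arbitrary since
    -- no WHERE clause is ever built.
    REPLICAT rstg1
    USERID ogg_user, PASSWORD ogg_pw
    INSERTALLRECORDS
    MAP src.t1, TARGET stg.stg_t1,
    KEYCOLS (csn),
    COLMAP (USEDEFAULTS,
      csn = @GETENV ("TRANSACTION", "CSN"),
      op_type = @GETENV ("GGHEADER", "OPTYPE"),
      before_after = @GETENV ("GGHEADER", "BEFOREAFTERINDICATOR"));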
    I would suggest you take a look at the following MOS articles to help guide you:
    What Tokens need to included in the transaction to make it unique for Insertallrecords to be used in the replicat [ID 1340823.1]
    Oracle GoldenGate - Best Practice: Creating History Tables [ID 1314698.1]
    Oracle GoldenGate Best Practice - Oracle GoldenGate for ETL Tools [ID 1371706.1]
    Let us know if you still have further questions.
    Best regards,
    Marie

  • Sync database using archive log files

    Hi,
    I have two databases, A and B. A is the live database. I need to replicate database A, with its name as B. I did the replication using a hot backup:
    1. Create the B database.
    2. Take the hot backup of A and copy the files over to B.
    3. Delete B's control files, log files, and data files, then paste in the files from database A.
    4. Recreate the control files on the B database.
    5. Now the B database is up and running.
    But database A is still live and generating archive log files. I would like to know how to keep database B in sync with A using those archive log files.
    Kindly share an example.
    Rgds..

    As you have renamed and opened the clone database, it is no longer the same DBID / Database Incarnation as the source. As you point out, it is not a standby database.
    You cannot apply archivelogs from the source database to this clone.
    You have to consider [Oracle Streams|http://download.oracle.com/docs/cd/B19306_01/server.102/b14229/toc.htm] to implement data replication.
    Hemant K Chitale
    Edited by: Hemant K Chitale on Jan 5, 2010 3:34 PM
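    For contrast, if the copy had been kept as a physical standby (same DBID, never renamed or opened read-write), the archived logs from A could have been applied with managed recovery. A sketch, assuming the standby is mounted and can see the logs:
    -- Valid only on a true physical standby, never on a renamed clone:
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
    Once the database has been renamed and opened, that path is closed, and a replication product such as Streams is the way to go.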

  • How to archive the NAST table entries

    Hi All,
    Table NAST has 33 million records and there are lots of jobs which use this table, so it is a performance issue. I want to know whether there is any way to improve the performance of accessing the NAST table,
    OR
    whether there is any way to archive the entries from the NAST table.
    Thanks in advance.

    Hi Sushant,
    Welcome to SDN. Archiving should be a well-coordinated and well-thought-out effort in any enterprise. I would not suggest archiving only a single table: all the related tables should be archived together up to a point in time. The table NAST is part of the following archiving objects:
    LE_HU     Handling units
    MM_EKKO     Purchasing documents
    MM_MATBEL     Materials management: Material documents
    MM_REBEL     Materials Management: invoice documents
    RV_LIKP     Deliveries
    SD_LIKP     No text available
    SD_VBAK     Sales documents
    SD_VBKA     Sales activities
    SD_VBRK     Billing documents
    SD_VTTK     SD Transport
    W_PROMO     Retail: Promotion
    You have to analyze which business processes (like billing or delivery) have generated the most NAST records. Talk to your functional colleagues and bring in a project to archive them up to a cutoff date (like 12/31/2002). It will clear up not only NAST but other common tables (like CDHDR and CDPOS) as well.

  • Data Archival for Custom Tables

    Hi Experts,
    We have a requirement to archive related custom tables when we archive standard SAP table data. We are going ahead with creating custom archiving objects to handle this. However, I would like to know about the "Exit routine assignment in generation" feature in the standard archiving objects (AOBJ). Can this be used to handle custom table data? If yes, can you please give me an example of how this can be achieved? If not, what is its purpose?
    Please help!!
    Thanks and Best Regards,
    Surya

    Hello Lavaughn,
    I cannot speak for SAP, but from my personal experience, timestamps are the state of the art for storing date/time info in DB tables. Even with date/time fields, a conversion to the local time zone is necessary. Of course you can reuse your DB definition on dynpros, but since UI and business logic will be separated anyway, this is not really a valid point either.
    Best Regards
    Klaus

  • Archiving / Purging Custom table

    Hello!
    I would like to ask if there is a program in Oracle EBS that I could use to set up archiving for custom tables. If not, what approach would you recommend?

    Oracle doesn't provide seeded concurrent programs to purge/archive custom tables, since this varies from one customer to another and the business requirements are not the same -- you should create custom concurrent programs (or scripts) to purge/archive your custom tables.
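    As a rough illustration, a custom purge can be as simple as the following procedure registered as a PL/SQL concurrent program. The table names and the 18-month retention are invented examples; errbuf/retcode is the standard concurrent-program signature:
    -- Hypothetical sketch: archive rows past the retention period, then delete them.
    CREATE OR REPLACE PROCEDURE xx_purge_custom_tab (errbuf  OUT VARCHAR2,
                                                     retcode OUT VARCHAR2)
    IS
    BEGIN
      INSERT INTO xx_custom_tab_arch        -- illustrative archive table
        SELECT *
          FROM xx_custom_tab
         WHERE creation_date < ADD_MONTHS (SYSDATE, -18);
      DELETE FROM xx_custom_tab
       WHERE creation_date < ADD_MONTHS (SYSDATE, -18);
      COMMIT;
      retcode := '0';                       -- report success to the manager
    END xx_purge_custom_tab;
    /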
    If you are looking for 3rd party archiving/purging solutions, please see:
    https://forums.oracle.com/message/4060032
    https://forums.oracle.com/message/9155043
    Thanks,
    Hussein

  • To Use  Cursor or  TYPE table Index by PLS_integer

    Hi All,
    Let's say I have a table with 19,26,20,000 (roughly 192.6 million) records.
    If I want to loop through all the records, which will be the more optimized way: a cursor, or a TYPE ... TABLE ... INDEX BY PLS_INTEGER collection?
    Please guide.
    Thanks.

    What is it you want to do to/with the rows you're looping through?
    Ideally you want to avoid looping, as that's row by row (aka slow by slow) processing and it's expensive time-wise.
    If you're doing DML (insert/update/delete) then you're best off doing it in one SQL statement, rather than looping.
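    To make that concrete, a sketch with invented table and column names, first row by row, then as one statement:
    -- Slow-by-slow: one UPDATE per row fetched by the cursor loop.
    BEGIN
      FOR r IN (SELECT id FROM big_tab WHERE status = 'OLD') LOOP
        UPDATE big_tab SET status = 'ARCHIVED' WHERE id = r.id;
      END LOOP;
    END;
    /
    -- Set-based: the same work in a single statement, with far less overhead.
    UPDATE big_tab SET status = 'ARCHIVED' WHERE status = 'OLD';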

  • How to use non-default Alias Table in Analyzer report

    Hi, I defined many alias tables in Essbase. I would like to use an alias table other than "default" in an Analyzer 6.5 report. The online help says I can modify the "database connection properties" when first defining a new report to use a specific alias table. It tells me to do the following: click the "New" toolbar button, select a Display Type or Layout, and click OK, then right-click a database connection name in the "Select Database Connections" dialog box and select Modify from the right-click menu.
    So I did this, but on that last right-click there is no "Modify" option available, only "Add New..." and "Database Connection Properties...". However, if I define a new personal database connection using the login user, I can select another alias table, but that only applies to the personal database connection properties.
    Is it possible to tell Analyzer specifically to use another alias table at the global level in the database connection? What I want is to have certain reports use the default alias table and others use another alias table, and these reports should be shareable across all users.
    Sam

    Indeed, it is fixed in the GA release. Another way to set the alias table is to do it in the Admin client: if you add a connection for a user, there is a new 6.5 button, "set alias", that allows you to set the default alias table for that specific user. However, it does not exist at the user-group level.

  • Use of client dependent tables

    Hi Gurus,
    I have read many threads on difference between client dependent and independent data/objects etc.
    But could someone please tell me what exactly is the 'use or advantage' of client-dependent tables/data?
    Thanks in advance.
    V

    Hi,
    The use of client-dependent tables is that if data in a table is updated in one particular client, that data won't be seen in any other client. This provides security for the data: we restrict users to different clients through their authorizations.
    When you log on to an SAP System, you log on to a particular client of this system. Any activities you carry out in the system are always carried out in one client. When you plan your SAP system landscape, you must consider which clients you need for which activities.
    By assigning activities to be performed in a client, you give each client a particular role. This section describes the most important client roles.
    Since you need to adapt the SAP software for your own business needs, each SAP system landscape requires a client where Customizing settings, and possibly ABAP Workbench developments, can be made. This client is known as the Customizing and development client, or Customizing client for short. The abbreviation CUST is used for this client.
    Before you can use the Customizing settings and Workbench developments productively, you need to test them extensively for errors. Any faulty settings can seriously disrupt productive operations, and at worst, lead to the loss of productive data. The integrated nature of the various SAP applications means that there are many dependencies between the different Customizing settings. Even an experienced Customizing developer may not discover these dependencies immediately. The correctness of the settings can only be guaranteed with extensive testing. The client where these tests are made is the Quality Assurance Client, QTST for short.
    A separate client is required for productive use of the SAP System. So that this client can be used without disruption, it is essential that no Customizing settings or Workbench developments are made here, and also that no tests are carried out. This client is known as the Production Client, PROD for short.
    These three clients, CUST, QTST and PROD, are the central clients that exist in every system landscape. Standard system landscapes have precisely one client for each of these client roles.
    We recommend that you make all your Customizing settings in a single Customizing client, and then use the CTS to transport them to the other clients.
    We also recommend that you do not make any Customizing settings or Workbench developments in the quality assurance or production clients. You can make sure of this by making appropriate client settings.
    In addition to the central clients, you can also set up other clients for other tasks. However, you must remember that each extra client takes up additional system resources (main memory and database space). They also need to be administrated. For example, you need to set up and administrate access authorization for the users, and also distribute any changes to other clients with the CTS. You must weigh up the advantages and disadvantages of setting up other clients.
    Examples of other client roles are:
    Development test client (TEST): Developers can use this client to test their Customizing settings and Workbench developments, before they release their change requests. In this client the developers can create test application data for realistic tests. If they discover errors, they can remove them in the Customizing client. A development test client is always set up in the same SAP System as the Customizing client. This means that any changes that are made to cross-client data in the Customizing client are also immediately visible in the development test client. Changes to client-specific data are copied from the Customizing client to the development test client using a special  client copy function. The client copy function uses the unreleased change requests from the Customizing client to do this. The development test client is set so that you cannot make changes to Customizing data and Repository objects.
    Prototype or sandbox client (SAND): You can use this client to test any client-specific Customizing settings if you are not sure whether you want to use them in this form. Any settings that you want to keep are then entered in the Customizing client. To prevent conflicts between the prototype client settings and real settings in the Customizing client, you cannot make changes to cross-client Customizing data and Repository objects in the prototype client. The CTS does not record changes made to client-specific Customizing data, and does not transport them from the prototype client. You can make sure of this by making appropriate client settings.
    Training client (TRNG): To prepare end users for new functions that are to be transported into the production client, you can set up a training client. The users can use the new functions in this client with specially created application data. This client is set so that you cannot make changes to Customizing data and Repository objects.
    Please reward points if this information is useful to you.

  • Project Stock report using QBEW and QBEWH tables

    Dear All,
    First of all Happy New Year to all of you.
    Here is my question: I want to build an ABAP report on the stock situation for project stock. The report demands the stock by closing month: suppose I run the report for month 11/2010, it should show the closing stock at the end of month 11/2010.
    Now I am using the values from table QBEW for this purpose, but QBEW shows only current entries. If a material's stock or value changes, the old entry goes into the history table QBEWH.
    The problem is that if I try to sum QBEW and QBEWH, the stock value is excessive. I am not sure which entries from QBEWH I should consider for this purpose.
    Has anybody worked on similar requirement?
    Thanks in advance!
    Regards,
    Mahendra Dighe

    Hi,
    Please try the report S_P00_07000140 - Inventory and Raw Material for Special Stock.
    Thanks.
    abdul

  • Using one user account table across multiple databases but account used as a foreign key

    I want to use one user account table from one database and use it across a couple of other databases. The problem is that I want some tables to use the primary key from the user account table as a foreign key to access the data when the user logs in. Is this the right way of going about it? Do I have to create a user account table in all my databases? What is the best practice for handling this problem? Thanks in advance.

    You can use triggers, or use replication.
    more info: 
    Add Foreign Key relationship between two Databases
    SQL Server Replication
    Saeid Hasani [sqldevelop]
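    For example, a trigger-based check might look like this sketch (AuthDB and the table/column names are placeholders; note it is weaker than a real foreign key, since it does not block deletes on the parent side):
    -- Hypothetical sketch, created inside the application database:
    -- reject inserts/updates whose UserId has no match in AuthDB.
    CREATE TRIGGER trg_Posts_CheckUser
    ON dbo.Posts
    AFTER INSERT, UPDATE
    AS
    BEGIN
      IF EXISTS (SELECT 1
                 FROM inserted AS i
                 WHERE NOT EXISTS (SELECT 1
                                   FROM AuthDB.dbo.UserAccount AS u
                                   WHERE u.UserId = i.UserId))
      BEGIN
        RAISERROR ('UserId not found in AuthDB.dbo.UserAccount.', 16, 1);
        ROLLBACK TRANSACTION;
      END;
    END;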

  • How to use substr in external table defnition.

    Hi All,
    I'm using Oracle 11g. I have an external table which is reading data from a file. For one of the columns, I need only the first 250 characters. My external table definition looks like this:
    create table tbl_substr
    (
      col1 varchar2(20),
      col2 varchar2(250)
    )
    organization external
    (
      type oracle_loader
      default directory XXXX
      access parameters
      (
        records delimited by newline
        fields terminated by '|'
        missing field values are null
        (
          col1,
          col2 "substr(:col2,1,250)"
        )
      )
      location ('file.txt')
    )
    reject limit unlimited;
    But this definition gives an error when I do SELECT * FROM tbl_substr.
    I want to use SUBSTR in the external table definition itself, not in the SELECT. Also, I don't want to create a view to solve this. If anyone has done this, please help.

    You need to play with COLUMN_TRANSFORMS
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/et_params.htm#sthref1792
    BTW, I too got it from Google. I was not aware of this :)
    Amardeep Sidhu

  • USING SUM IN INTERNAL TABLE

    Please give me a simple example of using SUM with an internal table, doing some calculations in the same internal table.

    Hi,
    check this:
    Syntax
    SUM.
    Effect
    The statement SUM can only be specified within a loop starting with LOOP, and is only considered within an AT ... ENDAT control structure. Prerequisites for using the statement SUM include using the addition INTO in the LOOP statement, and that the specified work area wa is compatible with the row type of the internal table. In addition, SUM cannot be used when the row type of the internal table itab contains components that are themselves tables.
    The statement SUM totals the components with numeric data types (i, p, f) over all rows of the current control level and assigns the results to the corresponding components of the work area wa. In the control levels FIRST and LAST, and outside of an AT ... ENDAT control structure, the system calculates the sum of the numeric components over all rows of the internal table.
    Example
    Control level processing while creating a list: at the end of each line group, the total of occupied seats is calculated and output.
    DATA: sflight_tab TYPE SORTED TABLE OF sflight
                      WITH UNIQUE KEY carrid connid fldate,
          sflight_wa  LIKE LINE OF sflight_tab.
    SELECT *
           FROM sflight
           INTO TABLE sflight_tab.
    LOOP AT sflight_tab INTO sflight_wa.
      AT NEW connid.
        WRITE: / sflight_wa-carrid,
                 sflight_wa-connid.
        ULINE.
      ENDAT.
      WRITE: / sflight_wa-fldate,
               sflight_wa-seatsocc.
      AT END OF connid.
        SUM.
        ULINE.
        WRITE: / 'Sum',
                  sflight_wa-seatsocc UNDER sflight_wa-seatsocc.
        SKIP.
      ENDAT.
      AT END OF carrid.
        SUM.
        ULINE.
        WRITE: / 'Carrier Sum',
                  sflight_wa-seatsocc UNDER sflight_wa-seatsocc.
        NEW-PAGE.
      ENDAT.
      AT LAST.
        SUM.
        WRITE: / 'Overall Sum',
                  sflight_wa-seatsocc UNDER sflight_wa-seatsocc.
      ENDAT.
    ENDLOOP.
    Hope this solves your problem....
    Do reward if useful....
    regards
    dinesh

  • Proper use of a Lookup table and adaptations for NET

    Hello,
    I need to create a few lookup tables and I often see the following:
    create table Languages
    (
      Id int identity not null primary key,
      Code nvarchar (4) not null,
      Description nvarchar (120) not null
    );
    create table Posts
    (
      Id int identity not null primary key,
      LanguageId int not null,
      Title nvarchar (400) not null
    );
    -- identity_insert is needed to supply an explicit Id value
    set identity_insert Languages on;
    insert into Languages (Id, Code, Description)
    values (1, 'en', 'English');
    set identity_insert Languages off;
    This way I am localizing Posts with language id ...
    IMHO, this is not the best scheme for Languages table because in a Lookup table the PK should be meaningful, right?
    So instead I would use the following:
    create table Languages
    (
      Code nvarchar (4) not null primary key,
      Description nvarchar (120) not null
    );
    create table Posts
    (
      Id int identity not null primary key,
      LanguageCode nvarchar (4) not null,
      Title nvarchar (400) not null
    );
    insert into Languages (Code, Description)
    values ('en', 'English');
    The .NET applications usually use the language code, so this way I can get a Post in English without using a join.
    And with this approach I am also maintaining the database's data integrity ...
    This could be applied to a Genders table with codes "M" and "F", a Countries table, a transaction types table (should I?), ...
    However, I think it is common to use int as the PK in lookup tables because it is easier to map to enums.
    And now it is even possible to map to Flags enums, and so have a many-to-many relationship in an enum.
    That helps in .NET code but in fact has limitations: a Languages table could never be mapped to a Flags enum ...
    ... a Flags enum can't have more than 64 items (Int64) because the values must be powers of two.
    A SOLUTION
    I decided to find an approach that enforces database data integrity and still makes it possible to use enums, so I tried:
    create table Languages
    (
      Code nvarchar (4) not null primary key,
      [Key] int not null,   -- bracketed because KEY is a reserved word
      Description nvarchar (120) not null
    );
    create table Posts
    (
      Id int identity not null primary key,
      LanguageCode nvarchar (4) not null,
      Title nvarchar (400) not null
    );
    insert into Languages (Code, [Key], Description)
    values ('en', 1, 'English');
    With this approach I have a meaningful language code, I avoid joins, and I can create an enum by parsing the Key:
    public enum LanguageEnum {
        [Code("en")]
        English = 1
    }
    I can even preserve the code in an attribute, or I can switch the code and description ...
    What about Flags enums? Well, I will not have Flags enums, but I can have List<LanguageEnum> ...
    And when using List<LanguageEnum> I do not have the limitation of 64 items ...
    To me all this makes sense, but would I apply it to a Roles table, or a ProductsCategory table?
    In my opinion I would apply it only to tables that will rarely change over time ... So:
        Languages, Countries, Genders, ... Any other example?
    About the following I am not sure (They are intrinsic to the application):
       PaymentsTypes, UserRoles
    And to these I wouldn't apply (They can be managed by a CMS):
       ProductsCategories, ProductsColors
    What do you think about my approach for Lookup tables?
    Thank You,
    Miguel

    >>IMHO, this is not the best scheme for Languages table because in a Lookup table the PK should be meaningful, right?<<
    Not necessarily. The choice to use, or not to use, a surrogate key in a table is a preference, not a rule. There are pros and cons to either method, but I tend to agree with you: when the values are set as programming terms, I usually use a textual value for the key. But this is nothing to get hung up over.
    Bear in mind, however, that this:
    create table Languages
    (
      Id int identity not null primary key,
      Code nvarchar (4) not null,
      Description nvarchar (120) not null
    );
    is not equivalent to:
    create table Languages
    (
      Code nvarchar (4) not null primary key,
      Description nvarchar (120) not null
    );
    The first table needs a UNIQUE constraint on Code to make these solutions semantically the same. The first table could have the value 'Klingon' in it 20 times, while the second only once.
    >>However I think it is common to use int as PK in lookup tables because it is easier to map to ENUMS.<<
    This was going to be my next point. For that case, I would only change the first table to not have an identity-assigned key value, as that makes it easier to manage at the same time, and in the same manner, as the enum.
    >>. A Languages table could never be mapped to a FLags Enum ...<<
    You could, but I would strongly suggest avoiding values encoded in a bitwise pattern in SQL as much as possible. Rule #1 (First Normal Form) is partially about having one value per column. It is how the optimizer thinks, and how it works best.
    My rule of thumb for lookup tables (I prefer the term "domain" tables, as really all tables are there to look up values :)) is that all data should be self-explanatory in the database, through data if at all possible. So if you have a color column, and it contains the color "Vermillion", and all you will ever need is the name, and you feel it is good enough to manage in the UI, then great. But bear in mind the beauty of a table that is there for domain purposes: you can then store the R, G, and B attributes of the vermillion color (254, 73, and 2 respectively, based on http://www.colorcombos.com/colors/FE4902) and use them in coding. Alternate names for the color could be introduced, etc. And if the UserRoles are 1, 2, 3, and 42 (I have seen worse), then definitely add columns. I think you are basically on the right track.
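    To make that concrete, a sketch of such a domain table using the vermillion example above (the layout itself is just an illustration):
    -- The color is self-describing in the data, not only in the UI:
    CREATE TABLE Colors
    (
      ColorName nvarchar (60) NOT NULL PRIMARY KEY,
      R tinyint NOT NULL,
      G tinyint NOT NULL,
      B tinyint NOT NULL
    );
    INSERT INTO Colors (ColorName, R, G, B)
    VALUES ('Vermillion', 254, 73, 2);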
    Louis
    Without good requirements, my advice is only guesses. Please don't hold it against me if my answer answers my interpretation of your questions.
