Free Connections to a table before truncate

Hi All,
I need to kill all connections to a table before truncating it in a production environment. Is there a script for this?
Regards
Rahul

You don't really have a connection to a table; rather, processes take out locks on rows in the table as they access them, or on the table itself if they scan many rows.
It would be possible to query sys.dm_tran_locks to get the processes that hold locks, but you cannot really prevent new processes from coming in.
A lazier method is to issue your TRUNCATE TABLE in one query window and then run sp_who in another. Your process will be reported as blocked by another process, so you kill that process. Then you run sp_who again, kill the next one, and so on.
Then again, killing processes in a production environment? It's not entirely appetising.
Erland Sommarskog, SQL Server MVP, [email protected]
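A minimal sketch of that lock-and-kill approach, assuming a hypothetical table dbo.MyTable in the current database (session id 53 below is only an example):

-- Sessions holding object-level locks on dbo.MyTable; any reader or writer
-- takes at least an IS/IX lock at object granularity, so this catches them.
SELECT DISTINCT l.request_session_id
FROM sys.dm_tran_locks AS l
WHERE l.resource_type = 'OBJECT'
  AND l.resource_database_id = DB_ID()
  AND l.resource_associated_entity_id = OBJECT_ID('dbo.MyTable');

-- For each session id returned, kill it, then truncate before new sessions
-- can take out fresh locks:
-- KILL 53;
-- TRUNCATE TABLE dbo.MyTable;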

Similar Messages

  • Determining the free space for each table before archiving an object.

    Hi Everyone,
    I want to know how I can find out how much space will be freed from each table related to an archiving object before I perform
    archiving on that particular object.
    Are there any transactions for this, or transactions related to it?
    E.g. FI_DOCUMNT is related to many tables; before I archive this object, I want to know the space that will be freed from each of the affected tables individually.
    Regards,
    Nipun Sharma

    Hi Nipun,
    as far as I know, there is no easy tool to get these numbers. On the other hand, you don't need exact numbers; estimates will do.
    It's a good idea to start with the biggest objects: take DB02 and make a detailed analysis where you select the biggest tables -> the corresponding archiving objects should be your main focus (to begin with).
    For the biggest tables in each object, count the entries per year (or month, whatever period you are interested in). Most tables have a creation date you can use for this; otherwise go by number range. For some number ranges you could look up the creation date; the rest is estimation again.
    Then you will have an idea of which volume was created in which time frame.
    You still need some test archive runs (in PRD or an (old) copy, at least for a sample amount of data): you need to know what percentage of the documents can technically be archived and how many will stay open because of missing closing. That is critical information (maybe 90% will stay in the system) and can only be analyzed with SARA test runs; if you identify the missing object status, you can go on to select for it directly, but in the beginning you need the archive run.
    With the volume per time frame and the percentage that can be deleted, you should be able to give estimates based on the current total object size. Make it clear that you are talking about estimates: every single object will be checked for having the correct status, and until that is done (in a test run), no one can give exact numbers.
    Hope this helps,
    regards,
    Christian
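    As a rough illustration of the per-year count described above (table and column names are placeholders; SAP date fields are usually stored as YYYYMMDD character strings, so taking the first four characters gives the year):

    -- Replace MY_TABLE / CREATED_ON with the table and creation-date column
    -- of the archiving object you are sizing.
    SELECT SUBSTR(created_on, 1, 4) AS entry_year,
           COUNT(*)                 AS rows_per_year
    FROM   my_table
    GROUP  BY SUBSTR(created_on, 1, 4)
    ORDER  BY entry_year;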

  • Truncate Table before Insert--Performance

    HI All,
    This post concerns a special requirement where a table is truncated before records are inserted into it.
    Now, when a table is truncated, the high-water mark (HWM) is reset to the lowest storage allocated for the table in the tablespace. After this, would an insert with APPEND boost the performance of the insert query?
    In a simple insert, the Oracle engine consults the freelist to look for free space.
    But in an insert with APPEND, the engine writes above the HWM. The question is: once TRUNCATE has been executed on the table, would the freelist still be used by a simple insert?
    I just need to know whether there is any benefit to using an APPEND insert on a truncated table, or whether a simple insert would perform the same as an insert with APPEND.
    Regards
    Nits

    Hi,
    if you don't need the data, truncate the table. There is no negative impact whether you use a conventional-path or a direct-path insert.
    If you use APPEND, less redo is written for the table if the table is in NOLOGGING mode, but redo is still written for all indexes. I would recommend creating a full backup after that (if needed), because your table will not be recoverable otherwise (no redo information).
    Dim
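    A minimal sketch of the pattern being discussed (MY_TABLE and MY_SOURCE_TABLE are only placeholders):

    -- TRUNCATE resets the high-water mark; a direct-path (APPEND) insert then
    -- writes above the HWM instead of searching the freelist for free blocks.
    TRUNCATE TABLE my_table;

    INSERT /*+ APPEND */ INTO my_table
    SELECT * FROM my_source_table;

    COMMIT;  -- required before the table can be queried again in this session

    -- Optionally, with the table in NOLOGGING mode the direct-path insert
    -- generates less redo for the table (indexes still produce redo), so take
    -- a backup afterwards if recoverability matters.
    -- ALTER TABLE my_table NOLOGGING;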

  • Should I drop the schema or truncate tables before importing?

    I have exported a dump of the whole schema from an Oracle database. I need to import this into another database with a different schema name but the same tables. I am doing this only to bring the data into line with the other database.
    Now, do I need to drop the schema or do anything like that to avoid redundancy, or can I just go ahead and do the import? Please let me know how to do it. I would prefer not to delete any tables, but to truncate all the data if needed from all of the roughly 200 tables.
    Thanks

    For the places in your application where a sequence is used to insert or set values in some tables, note that the sequence's current value may have been incremented if it has been used since the last import. If you truncate and re-import the tables, the tables may have the older (lower) values while the current sequence value may be higher.
    You would then have to "reset" the sequence, most easily by dropping and recreating it.
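    A small sketch of that reset, assuming a hypothetical sequence MY_SEQ that feeds MY_TABLE.ID:

    -- Find the highest key actually present after the re-import ...
    SELECT MAX(id) FROM my_table;

    -- ... then drop and recreate the sequence to start above that value
    -- (100001 here is only an example).
    DROP SEQUENCE my_seq;
    CREATE SEQUENCE my_seq START WITH 100001 INCREMENT BY 1;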

  • OBIEE execute stored procedure to load tables before running report

    Hi..
    I want to execute a stored procedure to load database tables before running a report in OBIEE.
    I need to pass 2 parameters to the stored procedure which loads the tables.
    In the Connection Pool --> Connection Script Tab --> Execute before query, I wrote the query below, using the repository variables VAR1 & VAR2, to execute the procedures:
    DECLARE VAR1 number; VAR2 number;
    BEGIN
    schema_name.package_name1.package_body('VALUE OF(VAR1)', 'VALUE OF(VAR2)'); COMMIT;
    schema_name.package_name2.package_body('VALUE OF(VAR1)', 'VALUE OF(VAR2)'); COMMIT;
    END;
    I am receiving the following error about declaring schema_name.package_name:
    +++Administrator:2a0000:2a0004:----2010/06/21 14:29:00
    -------------------- Sending query to database named ACBS-OCC (id: <<49419>>):
    BEGIN schema_name.package_name1.package_body1('VALUE OF(VAR1', 'VALUE OF(VAR2'); COMMIT; schema_name.package_name2.package_body2('VALUE OF(VAR1)', 'VALUE OF(VAR2)'); COMMIT;END;
    +++Administrator:2a0000:2a0004:----2010/06/21 14:29:00
    -------------------- Query Status: Query Failed: [nQSError: 16001] ODBC error state: S1000 code: 6550 message: [Oracle][ODBC][Ora]ORA-06550: line 1, column 7:
    PLS-00201: identifier 'SCHEMA_NAME.PACKAGE_NAME1' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    ORA-06550: line 1, column 93:
    PLS-00201: identifier 'SCHEMA_NAME.PACKAGE_NAME2' must be declared
    ORA-06550: line 1, column 93:
    PL/SQL: Statement ignored.
    [nQSError: 16015] SQL statement execution failed.
    Please suggest how to declare and execute the stored procedure.
    Thanks in advance.

    Hi,
    As far as I know, any function / procedure needs to be called using an EVALUATE function in OBIEE.
    Thanks,
    Vijay
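    For what it's worth, PLS-00201 at this point often just means that the database user configured in the OBIEE connection pool cannot see or execute the package; a hedged sketch of the grant that is typically missing (all names below are placeholders):

    -- Run as the package owner (or a DBA); the grantee is the database user
    -- used by the OBIEE connection pool.
    GRANT EXECUTE ON schema_name.package_name1 TO obiee_conn_user;
    GRANT EXECUTE ON schema_name.package_name2 TO obiee_conn_user;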

  • Error connecting to a table in a database

    I am trying to insert a user login script and am getting an error.
    Can anyone tell me why Dreamweaver MX 2004 cannot connect to
    the tables in my databases? I am using MS Access 2002 and there are
    tables in the databases.
    I can supply a JPEG of the error if needed.

    Hello,
    Are you using Windows XP SP2?
    If so, it's a known issue.
    Here's the fix:
    http://kb.adobe.com/selfservice/viewContent.do?externalId=tn_19515&sliceId=2
    Click on "connectivity fails with local ASP.NET and ASP
    servers..."
    There's a link to download the extension fix in "Workaround"
    ASP: When using a local ASP test server running on XP SP2 with Dreamweaver (i.e. the Testing Server URL prefix is set to "http://localhost"), and you specify "Using driver on testing server" or "Using DSN on testing server," database connectivity fails. If you click the Test button in the Custom Connection String or Data Source Name (DSN) dialog box, it says the connection was made successfully. However, if you then try to browse the tables in the Databases panel or create a recordset, the database tables do not display or you get the following error message: "Unable to retrieve tables from this connection, click on the 'Define...' button to test this connection."
    Workaround: Download and install the Dreamweaver extension fix by Macromedia that resolves this issue (Ref. 179021). To use this extension, follow the steps below. The steps apply to both ASP.NET and ASP sites:
    1. In Dreamweaver, go to the Files panel and choose one of your ASP.NET site definitions.
    2. Select Site > Advanced > Remove Connection Scripts.
    Note: If this option is grayed out, open an ASP.NET page in Dreamweaver. Removing the connection scripts will delete the contents of the _mmServerScripts folder, which is located in the testing server folder. The _mmServerScripts folder is hidden in Dreamweaver's Files panel, but will be visible if viewed with a different file browser. The _mmServerScripts folder consists of these three files: adojavas.inc, MMHTTPDB.asp and MMHTTPDB.js. The connection scripts are used by Dreamweaver to perform remote database connectivity when developing pages within Dreamweaver. These script files have no effect on your web pages at run-time (i.e. when a visitor views your ASP.NET pages via a web browser).
    3. Repeat steps 1-2 for each of your ASP.NET site definitions.
    4. Quit Dreamweaver.
    5. Install the extension.
    6. Restart Dreamweaver.
    7. Go into the Files panel, select one of your ASP.NET sites, and open an .aspx file.
    8. Go into the Databases panel and delete your existing database connections. Before deleting them, make sure to open up each connection and write down the settings. You can open a connection by either double-clicking the connection name, or right-clicking the connection name and selecting Edit.
    9. Create a new database connection. Use the same connection names you had before, so any existing pages with DataSets and Server Behaviors still work correctly.
    10. Click the Test button in the database connection dialog box. This will upload the new connection scripts to the testing server.
    11. Repeat steps 7-10 for each of your Dreamweaver ASP.NET site definitions.
    Note: This extension is compatible with Dreamweaver MX 2004 and Dreamweaver MX. The current version of the extension is 1.0.2. Version 1.0.1 resolved a database connectivity problem with the ASP VBScript server model. Version 1.0.2 includes the fixes in version 1.0.1 and also resolves compatibility issues with Dreamweaver MX (version 6). You can see which version of the extension you have by opening the Extension Manager.
    "ducati1" <[email protected]> wrote in
    message
    news:[email protected]...
    >I am trying to insert user login script and have an
    error.
    > Can anyone tell me why Dreamweaver MX 2004 cannot
    connect to the tables in
    > my
    > databases? I am using MS Access 2002 and there are
    tables in the databses.
    > I can supply a jpeg of the error if needed
    >
    >

  • How to calculate the percentage of free space for a table in Oracle

    Okay, I am a little confused here. I have been searching the web and looking at a lot of documents. What I basically want to find out is this: how do I calculate the difference between the number of bytes a table is using and the total bytes allocated to its tablespace (going that way to get the percent free for a particular table)? So I need a byte count for the table, the total tablespace size, and the percentage difference.
    I have been looking at the DBA_TABLES, DBA_TABLESPACES and DBA_SEGMENTS views. I have tried to calculate the space used as num_rows * avg_row_len (if I am wrong, let me know), and I have been comparing that calculation to the numbers in DBA_TABLESPACES and DBA_SEGMENTS. I am just looking for the total space allocated to the tablespace the table sits in (seems logical right now) to make the percentage value work. In short, I want to track the table as it grows compared to the tablespace it sits in, and see the value change over time (days, weeks, etc.) each time I run the script I am working on.
    Can someone set me straight and help me find out whether I am looking in the right places? Any advice would help.
    Edward

    You can use a slightly modified version of Tom Kyte's show_space procedure (built on dbms_space). Have a look:
    SQL> create table test222 as select * from all_objects;
    Table created.
    SQL> delete from test22 where rownum<=100;
    delete from test22 where rownum<=100
    ERROR at line 1:
    ORA-00942: table or view does not exist
    SQL> delete from test222 where rownum<=100;
    100 rows deleted.
    SQL> analyze table test222 compute statistics;
    Table analyzed.
    SQL>  create or replace procedure show_space
      2  ( p_segname in varchar2,
      3    p_owner   in varchar2 default user,
      4    p_type    in varchar2 default 'TABLE',
      5    p_partition in varchar2 default NULL )
      6  -- this procedure uses authid current user so it can query DBA_*
      7  -- views using privileges from a ROLE and so it can be installed
      8  -- once per database, instead of once per user that wanted to use it
      9  authid current_user
    10  as
    11      l_free_blks                 number;
    12      l_total_blocks              number;
    13      l_total_bytes               number;
    14      l_unused_blocks             number;
    15      l_unused_bytes              number;
    16      l_LastUsedExtFileId         number;
    17      l_LastUsedExtBlockId        number;
    18      l_LAST_USED_BLOCK           number;
    19      l_segment_space_mgmt        varchar2(255);
    20      l_unformatted_blocks number;
    21      l_unformatted_bytes number;
    22      l_fs1_blocks number; l_fs1_bytes number;
    23      l_fs2_blocks number; l_fs2_bytes number;
    24      l_fs3_blocks number; l_fs3_bytes number;
    25      l_fs4_blocks number; l_fs4_bytes number;
    26      l_full_blocks number; l_full_bytes number;
    27
    28      -- inline procedure to print out numbers nicely formatted
    29      -- with a simple label
    30      procedure p( p_label in varchar2, p_num in number )
    31      is
    32      begin
    33          dbms_output.put_line( rpad(p_label,40,'.') ||
    34                                to_char(p_num,'999,999,999,999') );
    35      end;
    36  begin
    37     -- this query is executed dynamically in order to allow this procedure
    38     -- to be created by a user who has access to DBA_SEGMENTS/TABLESPACES
    39     -- via a role as is customary.
    40     -- NOTE: at runtime, the invoker MUST have access to these two
    41     -- views!
    42     -- this query determines if the object is a ASSM object or not
    43     begin
    44        execute immediate
    45            'select ts.segment_space_management
    46               from dba_segments seg, dba_tablespaces ts
    47              where seg.segment_name      = :p_segname
    48                and (:p_partition is null or
    49                    seg.partition_name = :p_partition)
    50                and seg.owner = :p_owner
    51                and seg.tablespace_name = ts.tablespace_name'
    52               into l_segment_space_mgmt
    53              using p_segname, p_partition, p_partition, p_owner;
    54     exception
    55         when too_many_rows then
    56            dbms_output.put_line
    57            ( 'This must be a partitioned table, use p_partition => ');
    58            return;
    59     end;
    60
    61
    62     -- if the object is in an ASSM tablespace, we must use this API
    63     -- call to get space information, else we use the FREE_BLOCKS
    64     -- API for the user managed segments
    65     if l_segment_space_mgmt = 'AUTO'
    66     then
    67       dbms_space.space_usage
    68       ( p_owner, p_segname, p_type, l_unformatted_blocks,
    69         l_unformatted_bytes, l_fs1_blocks, l_fs1_bytes,
    70         l_fs2_blocks, l_fs2_bytes, l_fs3_blocks, l_fs3_bytes,
    71         l_fs4_blocks, l_fs4_bytes, l_full_blocks, l_full_bytes, p_partition);
    72
    73       p( 'Unformatted Blocks ', l_unformatted_blocks );
    74       p( 'FS1 Blocks (0-25)  ', l_fs1_blocks );
    75       p( 'FS2 Blocks (25-50) ', l_fs2_blocks );
    76       p( 'FS3 Blocks (50-75) ', l_fs3_blocks );
    77       p( 'FS4 Blocks (75-100)', l_fs4_blocks );
    78       p( 'Full Blocks        ', l_full_blocks );
    79    else
    80       dbms_space.free_blocks(
    81         segment_owner     => p_owner,
    82         segment_name      => p_segname,
    83         segment_type      => p_type,
    84         freelist_group_id => 0,
    85         free_blks         => l_free_blks);
    86
    87       p( 'Free Blocks', l_free_blks );
    88    end if;
    89
    90    -- and then the unused space API call to get the rest of the
    91    -- information
    92    dbms_space.unused_space
    93    ( segment_owner     => p_owner,
    94      segment_name      => p_segname,
    95      segment_type      => p_type,
    96      partition_name    => p_partition,
    97      total_blocks      => l_total_blocks,
    98      total_bytes       => l_total_bytes,
    99      unused_blocks     => l_unused_blocks,
    100      unused_bytes      => l_unused_bytes,
    101      LAST_USED_EXTENT_FILE_ID => l_LastUsedExtFileId,
    102      LAST_USED_EXTENT_BLOCK_ID => l_LastUsedExtBlockId,
    103      LAST_USED_BLOCK => l_LAST_USED_BLOCK );
    104
    105      p( 'Total Blocks', l_total_blocks );
    106      p( 'Total Bytes', l_total_bytes );
    107      p( 'Total MBytes', trunc(l_total_bytes/1024/1024) );
    108      p( 'Unused Blocks', l_unused_blocks );
    109      p( 'Unused Bytes', l_unused_bytes );
    110      p( 'Last Used Ext FileId', l_LastUsedExtFileId );
    111      p( 'Last Used Ext BlockId', l_LastUsedExtBlockId );
    112      p( 'Last Used Block', l_LAST_USED_BLOCK );
    113  end;
    114
    115  /
    Procedure created.
    SQL> desc show_space
    PROCEDURE show_space
    Argument Name                  Type                    In/Out Default?
    P_SEGNAME                      VARCHAR2                IN
    P_OWNER                        VARCHAR2                IN     DEFAULT
    P_TYPE                         VARCHAR2                IN     DEFAULT
    P_PARTITION                    VARCHAR2                IN     DEFAULT
    SQL> set serveroutput on
    SQL> exec show_space('TEST222','SCOTT');
    BEGIN show_space('TEST222','SCOTT'); END;
    ERROR at line 1:
    ORA-00942: table or view does not exist
    ORA-06512: at "SCOTT.SHOW_SPACE", line 44
    ORA-06512: at line 1
    SQL> conn / as sysdba
    Connected.
    SQL> grant sysdba to scott;
    Grant succeeded.
    SQL> conn scott/tiger as sysdba
    Connected.
    SQL> exec show_space('TEST222','SCOTT');
    BEGIN show_space('TEST222','SCOTT'); END;
    ERROR at line 1:
    ORA-06550: line 1, column 7:
    PLS-00201: identifier 'SHOW_SPACE' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    SQL> exec scott.show_space('TEST222','SCOTT');
    PL/SQL procedure successfully completed.
    SQL> set serveroutput on
    SQL> exec scott.show_space('TEST222','SCOTT');
    Unformatted Blocks .....................               0
    FS1 Blocks (0-25)  .....................               0
    FS2 Blocks (25-50) .....................               1
    FS3 Blocks (50-75) .....................               0
    FS4 Blocks (75-100).....................               1
    Full Blocks        .....................             807
    Total Blocks............................             896
    Total Bytes.............................       7,340,032
    Total MBytes............................               7
    Unused Blocks...........................              65
    Unused Bytes............................         532,480
    Last Used Ext FileId....................               4
    Last Used Ext BlockId...................           1,289
    Last Used Block.........................              63
    PL/SQL procedure successfully completed.
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:5350053031470
    I use this to find the space allocations.
    Just read your post again: this is not going to show you the percentage of free/used space, only the number of blocks which are free/used. For the growth trend, you can look at Oracle EM (in 10g). It now includes a Segment Growth Trend report, which can show you, for each object, how much of the space allocated to it is actually being used.
    HTH
    Aman....
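    As a minimal sketch of the percentage Edward is after, the allocated size of the table's segment can be compared with the total size of its tablespace straight from the DBA_ views (TEST222 is just the example table from above):

    SELECT s.segment_name,
           s.bytes AS table_bytes,
           t.tablespace_bytes,
           ROUND(100 * s.bytes / t.tablespace_bytes, 2) AS pct_of_tablespace
    FROM   dba_segments s
           JOIN (SELECT tablespace_name, SUM(bytes) AS tablespace_bytes
                 FROM   dba_data_files
                 GROUP  BY tablespace_name) t
             ON t.tablespace_name = s.tablespace_name
    WHERE  s.segment_name = 'TEST222'
    AND    s.segment_type = 'TABLE';

    -- s.bytes is the space allocated to the segment; num_rows * avg_row_len
    -- from DBA_TABLES (after gathering statistics) approximates the space
    -- actually used by row data, if that is the numerator you prefer.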

  • How to retrieve the number of "free" rows in a table?

    Hi,
    if in a client-only environment (no sync to the mobile server) rows are inserted into and deleted from a table,
    is there a way to retrieve the number of "free" rows in that table? The number of "free" rows means the
    number of rows that can be inserted again before the table grows in size.
    Is there a way in OLite 10.3.0.2.0 to retrieve the size of tables and indexes? ALL_TABLES is not
    a place that really works.
    Best regards and many thanks,
    Gerd

    Hi Gary,
    many thanks. The partner uses a Lite client db without sync. The db runs inside a laboratory device and collects measurements. There must be a way to estimate the number of "measurement" rows that can still be stored in the db.
    Then we need to make the deleted space available for new rows. The partner tested defrag.exe and found that it
    takes a very long time to run, especially if the db is bigger than 2 GB, and that the run sometimes fails.
    Is there any recommendation the partner can follow?
    Thanks,
    Gerd

  • How to check whether my table is a parent table or a child table before truncating

    Hi:
    Say I want to truncate a table. The table might be a parent table or a child table. If it is a child table, there wouldn't be any problem truncating it.
    If it is a parent table, I believe I would have to disable all the foreign key constraints on the child tables referring to the parent table. My questions are:
    1. I have a table, say ABC. Before truncating, how do I find out whether table ABC is a parent table or not?
    2. Let's assume I tried to truncate table ABC and got the error saying to disable the foreign keys on the child tables referring to parent table ABC.
    In my schema there are, say, 50 tables, of which perhaps 10-odd child tables refer to my parent table ABC.
    So my question here is: how do I specifically find those 10 child tables referring to parent table ABC, so that I can disable the foreign keys on them?
    Thanks for your time in reading this, and I hope you can give me a solution!

    Hi,
    Try this SQL below:
    select r.owner, r.table_name
    from user_constraints r, user_constraints o
    where r.r_owner = o.owner and r.r_constraint_name = o.constraint_name
    and o.constraint_type in ('P','U') and r.constraint_type = 'R'
    and o.table_name = 'ABC';
    Cheers
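    Going one step further, the same dictionary views can generate the DISABLE statements for those foreign keys (a sketch; ABC is the parent table as above, and the generated statements are meant to be reviewed and then run):
    select 'ALTER TABLE ' || r.owner || '.' || r.table_name ||
           ' DISABLE CONSTRAINT ' || r.constraint_name || ';' as stmt
    from user_constraints r, user_constraints o
    where r.r_owner = o.owner and r.r_constraint_name = o.constraint_name
    and o.constraint_type in ('P','U') and r.constraint_type = 'R'
    and o.table_name = 'ABC';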

  • Unable to connect to "SAP Table, Cluster, or Function" in Crystal reports.

    I cannot connect to "SAP Table, Cluster, or Function" in Crystal Reports. The system gives this error: Database Connection Error: Functional module "/CRYSTAL/GET_OSQL_OBJECT_LIST" not found.
    Did anyone get this error? What could be wrong with my system?

    Hi Sergey,
    you must import some transports in your SAP R/3 system before you can access its tables. Have a look at the documentation of the int.KIT for SAP.
    The transports are normally found in the install directory of your server installation of the BOBJ int.KIT for SAP. In the documentation it is easy to identify which ones must be imported, depending on the system's type and version.
    Regards,
    Stratos

  • Errors occurred during extraction of UD Connect object field - table not found

    Hi,
    In the BI 7.0 system I can't select any UD Connect source object on the "Extraction" tab page of the DataSource maintenance screen. If I type the table name (which is "UDITEST") into it and then hit the "Proposal" tab page, I get the following error:
    Errors occurred during extraction of UD Connect object field-list: Errors occurred during extraction of UD Connect object field-list: UDCADAPTERROR::RSSDK|200|Table: uditest not found|
    Message no. RSDS_ACCESS036
    Analysis:
    1) We have tested the BI JDBC Connector using the URL:
    http://xxxabcdev03:50000/TestJDBC_Web/TestJDBCPage.jsp
    We got the list of tables displayed (UDITEST table is also displayed)=> connector is configured properly.
    2) When we are trying to configure the Source system in rsa1->modelling->source systems-> UD connect -> create,
    a) RFC Destination: We are using an RFC connection of type 'T' that is already in place between the J2EE engine and the BI ABAP engine. We tested this connection from SM59 -> TCP/IP connections and it is working fine.
    How do we test whether the RFC between the J2EE engine and the BI ABAP engine is a two-way RFC connection?
    b) Logical System Name: We have manually typed in the free text 'UDC_local'. Could you please clarify whether we need to type it in manually or select the logical system name from the F4 help?
    Which logical system name exactly should we enter here? Please clarify.
    Do we have a separate logical system name for the J2EE server?
    c) Type of Connector : JDBC
    d) name of connector: SDK_JDBC
    e) Source system name: SDK_JDBC
    f) Type and Release : blank
    Thanks to any answers in advance!
    Best regards,
    Syam

    Hi,
    I tried giving the Logical System Name in uppercase, i.e. UDC_LOCAL.
    Now, when I click on the F4 help of 'UD Connect Source Object' on the Extraction tab, I get the message:
    "Extraction of existing UD Connect data source objects".
    But no list of tables is displayed. When I enter the table name (UDITEST) manually, it gives the same error:
    " Errors occurred during extraction of UD Connect object field- table not found"...
    Could you please specify where the UD Connect source object is fetched from?
    Rgds
    Syam

  • Source tables getting truncated

    While using OWB 10gR2, sometimes after executing mappings I discover that the source tables just got truncated, which is not something I intended. I do want the target tables to be truncated before they are populated. So while creating the mappings, in the Table Operator Properties, I set the loading type to 'none' for all the source tables and 'truncate/insert' for all the target tables. Is this correct, or is something else likely causing the problem?
    Thanks a lot

    You need not change the table load properties for the source tables. Leave them as they are (the default is Insert, I think). Change the target table load properties to Truncate/Insert. This should be okay. I think the source tables were set to Truncate unknowingly, which caused the problem.
    Sometimes OWB does not deploy the current version of the package and we see the old version of the code in the database. I faced this problem quite often; the best way out is to drop the package from the back end and re-create the package from OWB.
    Regards
    -AP

  • Export data from database table before database migration

    Hello,
    We are planning to migrate our SAP ERP 6 EhP4/NW 7.01 system from Oracle 11.2 to an IBM DB2 v9.7 database. During test migrations I have established that a lot of time is spent on one particular table (COEP). Because we don't have the possibility to archive this table before the migration, my idea is to export data from previous years from this table to the file system (using an ABAP report), delete that data from the table before the migration, and then, after the migration, import it back into the database from the file system.
    Does anybody have any concerns or suggestions about this idea?
    Thank you for your answers
    Andrej

    Hello Andrej,
    I strongly recommend against doing so.
    I am not sure whether this could technically work at all.
    Even if it would work: in order to really save time, the export and the import would have to be "dirty" ones (meaning the system is operational and in production). With this there is a high risk of producing inconsistencies in this table, and you will most likely receive no support if something unforeseen happens and you end up with problems.
    Also, your approach (if it works at all) would have to be tested thoroughly by you, while also protecting the table from any changes.
    I do not believe this can save any effort compared to implementing advanced migration techniques like table splitting.
    On top of that, by not using the official migration procedures you run a high risk of ending up with an unsupported system.
    Hans-Juergen

  • Error while connecting to ECC tables from Crystal

    Hello,
    I am trying to create a Crystal report on the SAP ECC tables directly. When I try to create a new connection from the "SAP Table, Cluster, or Function" option in Crystal Reports 2008 I get this error: "Logon Failed. Details: you don't have necessary rights to design reports against the SAP system. please contact your administrator".
    Now my question is: where are my rights missing, on the BO side or the SAP side? I do seem to have access to SAP tables: when I log on through the SAP logon pad it works and I can see the tables.
    Please let me know your thoughts.
    Thanks

    Hi,
    the ABAP transports from the SAP Integration Kit also include specific authorization objects. Take a look at the appendix of the installation guide for the SAP Integration Kit.
    Ingo

  • How to connect two different tables from different server?

    Dear all expert,
    I need to develop a report where I need to connect 2 different tables from 2 different servers. For example:
    BUT000 from CRM and KNKK from R/3.
    Is it possible to do so? I tried to link these 2 tables together, and when I preview the report it runs very slowly and sometimes hangs. I did not get any result.
    Thank You.

    Moved to SAP Integration kit forum
    Appears you are using SAP data sources.
    And yes, you can, but the linked field types must match and must be convertible into the WHERE portion of the SQL. Click on Database > Show SQL... and see if there is one. If not, CR can't figure out how to push the join down, so it brings all the data client side and filters on the second pass.
    Thank you
    Don
