Test Data Management Server - Databases/Tables

Hello,
We are currently preparing to do a pilot with the Test Data Management Server and are putting some initial documents together describing the technical changes it will bring to our landscape. We have attended the TZTDM3 course and have read the Master Guide, but we haven't located detailed information about which databases/tables are created when TDMS is installed. If anyone could point me at this documentation or provide the information, it would be greatly appreciated.
Thanks,
J

Thanks for the response, but I don't want to move the data warehouse databases. My idea is to create another server and move the data warehouse management server role to this new server while preserving the databases on the original server. Is it possible to do this?

Similar Messages

  • Updating data in the database table

    Can anyone help me with the code for updating data in a database table?
    Regards,
    Rahul

    Hi Rahul,
    A slightly longer procedure that I'm adding here:
    1.) Create the component (I'm sure you have this covered).
    2.) Next, on the button click that updates the database, add an action.
    3.) Double-click the action so that you are taken to the methods section of the view.
    4.) Next, add the code that is required to update the database - this will be in the form of the above two posts.
    5.) Compile and test the application.
    Let me know in case you need further information on how to do this with a function module or something.
    Thanks.

  • When / why use XML to store data instead of a database table?

    Hi All,
    I still don't use XML much in applications and don't know much about how it is used.
    I read here and there about storing data as XML instead of in database tables.
    - Could anybody please tell me when/why to use XML to store data instead of a database table?
    For example: store inventory per warehouse in XML format?
    - What are the other cases or reasons for extracting database records into XML, or vice versa?
    - Is there any good PDF on this?
    Thank you for your help,
    xtanto

    It depends entirely what you want to accomplish with the 'XML in the database'. There are basically 3 independent methods: as CLOB, as XMLType views, or as native XMLType 'columns'.
    Each method has advantages and disadvantages, especially in the performance vs purpose tradeoff.
    The Oracle Press book "Oracle Database 10g XML & SQL: Design, Build, & Manage XML Applications in Java, C, C++, & PL/SQL" is highly recommended for anyone interested in Oracle and XML. http://books.mcgraw-hill.com/getbook.php?isbn=0072229527&template=oraclepress
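    To make the third option concrete, here is a minimal sketch of a native XMLType column (standard Oracle 10g+ syntax; the table and column names are only illustrations based on the warehouse example above):
    -- Native XMLType 'column' approach; names are illustrative
    CREATE TABLE warehouse_inventory (
      warehouse_id  NUMBER PRIMARY KEY,
      inventory_doc XMLTYPE
    );
    INSERT INTO warehouse_inventory (warehouse_id, inventory_doc)
    VALUES (1, XMLTYPE('<inventory><item sku="A100" qty="25"/></inventory>'));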

  • How to input data in a database table without knowing in advance table and column configurations

    Hi,
    I have a problem using LabVIEW to enter data (manually) into a SQL database. I have about 40 tables in the database, each of them related to a specific engine component. I need to create a user interface (maybe visualizing the table with a table control) where the users can insert data into the database table fields. Could someone give me some suggestions on how to do it?
    Using the DB Tools Insert Data.vi I need to know the column configuration of the table in advance, but in my database each table has its own structure! So do I have to create 40 different masks, one for every different table?
    Thanks in advance.
    Michela

    I have not actually used the LV SQL Toolkit, but I will try and offer high level ideas :-)
    When you have retrieved the construction data for a table, you should be able to use array and cluster indexing to acquire the names of the fields, enough to create a labelled table on the LV front panel.
    After completing a new entry you can INSERT the entry into the database using the same data.
    Is the SQL toolkit an additional purchase, or included in newer versions as standard? If you post a sample of the cluster/array that you retrieve, I could give you a sample VI to give you some pointers in creating the user interface table.
    - Cheers, Ed
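    As a starting point, one hedged way to retrieve a table's column layout at run time is the standard INFORMATION_SCHEMA views (SQL Server syntax assumed; the table name 'EngineComponent01' is a placeholder you would pass in from LabVIEW):
    -- Discover column names, types, and sizes for one table at run time
    SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'EngineComponent01'
    ORDER BY ORDINAL_POSITION;
    The result set can drive the front-panel table headers, so one generic mask can serve all 40 tables.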

  • How to generate test data for all the tables in Oracle

    I am planning to use PL/SQL to generate test data in all the tables in a schema. The schema name is given as an input parameter, along with the minimum records per master table and the minimum records per child table. The data should be consistent in the columns which are used for constraints, i.e. using the same column values.
    planning to implement something like
    execute sp_schema_data_gen (schemaname, minrecinmstrtbl, minrecsforchildtable);
    schemaname = owner,
    minrecinmstrtbl= minimum records to insert into each parent table,
    minrecsforchildtable = minimum records to enter into each child table of a each master table;
    querying all_tables where owner = schemaname,
    and all_tab_columns and all_constraints where owner = schemaname,
    using the dbms_random package.
    Does anyone have a better idea for doing this? Is this functionality already available in the Oracle database?

    Ah, damorgan, data, test data, metadata and table-driven processes. Love the stuff!
    There are two approaches you can take with this. I'll mention both and then ask which
    one you think you would find most useful for your requirements.
    One approach I would call the generic bottom-up approach which is the one I think you
    are referring to.
    This system is a generic test data generator. It isn't designed to generate data for any
    particular existing table or application but is the general case solution.
    Building on damorgan's advice, define the basic hierarchy (table collection, tables, data) and start at the data level.
    1. Identify/document the data types that you need to support. Start small (NUMBER, VARCHAR2, DATE) and add as you go along
    2. For each data type identify the functionality and attributes that you need. For instance for VARCHAR2
    a. min length - the minimum length to generate
    b. max length - the maximum length
    c. prefix - a prefix for the generated data; e.g. for an address field you might want a 'add1' prefix
    d. suffix - a suffix for the generated data; see prefix
    e. whether to generate NULLs
    3. For NUMBER you will probably want at least precision and scale but might want minimum and maximum values or even min/max precision,
    min/max scale.
    4. Store the attribute combinations in Oracle tables (a minimal sketch follows this list).
    5. build functionality for each data type that can create the range and type of data that you need. These functions should take parameters that can be used to control the attributes and the amount of data generated.
    6. At the table level you will need business rules that control how the different columns of the table relate to each other. For example, for ADDRESS information your business rule might be that ADDRESS1, CITY, STATE, ZIP are required and ADDRESS2 is optional.
    7. Add table-level processes, driven by the saved metadata, that can generate data at the record level by leveraging the data type functionality you have built previously.
    8. Then add the metadata, business rules and functionality to control the TABLE-TO-TABLE relationships; that is, the data model. You need the same DEPTNO values in the SCOTT.EMP table that exist in the SCOTT.DEPT table.
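    For steps 4 and 5, a hedged sketch of what the metadata table and one generator function might look like; the object names (data_gen_attributes, gen_varchar2) are hypothetical, with only DBMS_RANDOM being a real Oracle package:
    -- Step 4: metadata table holding the attribute combinations (name is hypothetical)
    CREATE TABLE data_gen_attributes (
      attr_id     NUMBER PRIMARY KEY,
      data_type   VARCHAR2(30),  -- e.g. 'VARCHAR2', 'NUMBER', 'DATE'
      min_length  NUMBER,
      max_length  NUMBER,
      prefix      VARCHAR2(30),
      suffix      VARCHAR2(30),
      allow_nulls CHAR(1)        -- 'Y'/'N'
    );
    -- Step 5: a VARCHAR2 generator driven by those attributes, built on DBMS_RANDOM
    CREATE OR REPLACE FUNCTION gen_varchar2 (
      p_min_len NUMBER,
      p_max_len NUMBER,
      p_prefix  VARCHAR2 DEFAULT NULL
    ) RETURN VARCHAR2 IS
      l_len NUMBER := TRUNC(DBMS_RANDOM.VALUE(p_min_len, p_max_len + 1));
    BEGIN
      -- Random uppercase string, sized so prefix plus body stays within the requested length
      RETURN p_prefix || DBMS_RANDOM.STRING('U', GREATEST(l_len - NVL(LENGTH(p_prefix), 0), 0));
    END gen_varchar2;
    /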
    The second approach I have used more often. I would call it the top-down approach, and I use
    it when test data is needed for an existing system. The main use case here is to avoid
    having to copy production data to QA, TEST or DEV environments.
    QA people want to test with data that they are familiar with: names, companies, code values.
    I've found they aren't often fond of random character strings for names of things.
    The second approach I use for mature systems where there is already plenty of data to choose from.
    It involves selecting subsets of data from each of the existing tables and saving that data in a
    set of test tables. This data can then be used for regression testing and for automated unit testing of
    existing functionality and functionality that is being developed.
    QA can use data they are already familiar with and can test the application (GUI?) interface on that
    data to see if they get the expected changes.
    For each table to be tested (e.g. DEPT) I create two test system tables: a BEFORE table and an EXPECTED table.
    1. DEPT_TEST_BEFORE
         This table has all DEPT table columns plus a TESTCASE column.
         It holds DEPT-image rows for each test case that show the row as it should look BEFORE the
         test for that test case is performed.
         CREATE TABLE DEPT_TEST_BEFORE (
              TESTCASE NUMBER,
              DEPTNO NUMBER(2),
              DNAME VARCHAR2(14 BYTE),
              LOC VARCHAR2(13 BYTE)
         );
    2. DEPT_TEST_EXPECTED
         This table also has all DEPT table columns plus a TESTCASE column.
         It holds DEPT-image rows for each test case that show the row as it should look AFTER the
         test for that test case is performed.
    Each of these tables is a mirror image of the actual application table with one new column
    added that contains the TESTCASE number.
    To create test case #3, identify or create the DEPT records you want to use for it.
    Insert these records into DEPT_TEST_BEFORE:
         INSERT INTO DEPT_TEST_BEFORE
         SELECT 3, D.* FROM DEPT D WHERE DEPTNO = 20;
    Insert records for test case #3 into DEPT_TEST_EXPECTED that show the rows as they should
    look after test #3 is run. For example, if test #3 creates one new record, add all the
    records from the BEFORE data set plus one new row for the record the test should create.
    When you want to run test case #3 the process is basically (ignore for this illustration that
    there is a foreign key between DEPT and EMP):
    1. Delete the records from SCOTT.DEPT that correspond to the test case #3 DEPT records.
              DELETE FROM DEPT
              WHERE DEPTNO IN (SELECT DEPTNO FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3);
    2. Insert the test data set records into SCOTT.DEPT for test case #3.
              INSERT INTO DEPT
              SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3;
    3. Perform the test.
    4. Compare the actual results with the expected results.
         This is done by a function that compares the records in DEPT with the records
         in DEPT_TEST_EXPECTED for test #3 (a minimal SQL sketch follows these steps).
         I usually store these results in yet another table or just report them out.
    5. Report out the differences.
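    One hedged way to implement the compare in step 4 is a symmetric difference in plain SQL (Oracle MINUS syntax; adapt the column list to the table under test):
         -- Rows present in DEPT but not expected, plus rows expected but missing,
         -- for test case #3. An empty result means the test passed.
         (SELECT DEPTNO, DNAME, LOC FROM DEPT
          MINUS
          SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3)
         UNION ALL
         (SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3
          MINUS
          SELECT DEPTNO, DNAME, LOC FROM DEPT);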
    This second approach uses data the users (QA) are already familiar with, is scalable, and
    makes it easy to add new data that meets business requirements.
    It is also easy to automatically generate the necessary tables and test setup/breakdown
    using a table-driven metadata approach. Adding a new test table is as easy as calling
    a stored procedure; the procedure can generate the DDL or create the actual tables needed
    for the BEFORE and AFTER snapshots.
    The main disadvantage is that existing data will almost never cover the corner cases.
    But you can add data for these. By corner cases I mean data that defines the limits
    for a data type: a VARCHAR2(30) name field should have at least one test record that
    has a name that is 30 characters long.
    Which of these approaches makes the most sense for you?

  • How to fetch data from single database table using 2 internal tables.

    Hi friends,
    I am a new user of ABAP and also of SDN.
    I need some help.
    I want to fetch data from one database table based on the primary keys of 2 internal tables. How do I put this in the WHERE clause?
    Thanks in advance.

    Hi,
    Refer to the following code. I hope it will help you.
    SELECT matnr                         " Material Number
        FROM mara
        INTO TABLE i_mara
       WHERE matnr IN s_matnr.
      IF i_mara[] IS NOT INITIAL.
        SELECT matnr                       " Material Number
               werks                       " Plants
               prctr                       " Profit Center
          FROM marc
          INTO TABLE i_marc
           FOR ALL ENTRIES IN i_mara
         WHERE matnr = i_mara-matnr
           AND werks IN s_werks.
      ENDIF.                               " IF i_mara[] IS NOT INITIAL
      i_output = i_marc.
      IF i_marc[] IS NOT INITIAL.
        SELECT matnr                       " Material Number
               werks                       " Plants
               lgort                       " Storage Location
          FROM mard
          INTO TABLE i_mard
           FOR ALL ENTRIES IN i_marc
         WHERE matnr EQ i_marc-matnr
           AND werks EQ i_marc-werks
           AND lgort IN s_lgort.
      ENDIF.                               " IF i_marc[] IS NOT INITIAL
    regards
    twinkal

  • Create a data source and database tables using WSAD

    Hi, guys:
    the following is from a tutorial:
    http://www-106.ibm.com/developerworks/websphere/techjournal/0306_wosnick/wosnick.html
    "To create the data source and Cloudscape database tables automatically, right click on the HelloWorldServer in the Servers view, and select the Create tables and data sources menu item. A dialog will then display showing that the data source and database tables were created successfully (Figure 5)."
    I am using the WSAD 5.0 trial version. I cannot find the Create tables and data sources menu item when I right-click on the HelloWorldServer in the Servers view. I am wondering if this is because the trial version does not have this feature.
    regards

    This question is a little off topic but you may get a reply. Please note this forum is about Sun's J2EE SDK and its related technologies. You may have better luck posting your question to an IBM specific resource.

  • Essbase Analytics Link cannot create data synchronization server database

    When I try to create the data synchronization server database using Essbase Analytics Link, the error below occurs. Can anyone help? Thanks.
    dss.log:
    [19 Oct 2011 17:28:55] [dbmgr] ERROR: last message repeated 2 more times
    [19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\Comma.hdf"
    [19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\PERIOD.hrd"
    [19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\VIEW.hrd"
    [19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\YEAR.hrd"
    [19 Oct 2011 17:28:58] [dbmgr] Create metadata: "C:/oracle/product/EssbaseAnalyticsLink/oem/hfm/Comma/Default/Comma.hdf"
    [19 Oct 2011 17:28:59] [dbmgr] WARN : HR#03826: Directory "C:\oracle\product\EssbaseAnalyticsLink/Work/XOD/backUp_2" not found. Trying to create
    [19 Oct 2011 17:29:15] [dbmgr] ERROR: ODBC: HR#01465: error in calling SQLDriverConnect ([Microsoft][ODBC SQL Server Driver][Shared Memory]Invalid connection. [state=08001 code=14]).
    [19 Oct 2011 17:29:15] [dbmgr] ERROR: HR#00364: Cannot open source reader for "ACCOUNT"
    [19 Oct 2011 17:29:15] [dbmgr] ERROR: HR#00627: Cannot create dimension: "ACCOUNT".
    [19 Oct 2011 17:29:16] [dbmgr] ERROR: HR#07722: Cube 'main_cube' of application 'Comma' is not registered.
    eal.log:
    [2011-Oct-19 17:28:56] http://localhost/livelink/Default.aspx?command=readYear&server=TestEss64&application=Comma&domain=
    [2011-Oct-19 17:28:56] http://localhost/livelink/Default.aspx?command=readPeriod&server=TestEss64&application=Comma&domain=
    [2011-Oct-19 17:28:57] http://localhost/livelink/Default.aspx?command=readView&server=TestEss64&application=Comma&domain=
    [2011-Oct-19 17:28:57] http://localhost/livelink/Default.aspx?command=getVersion&server=TestEss64&application=Comma&domain=
    [2011-Oct-19 17:28:58] DSS Application created
    [2011-Oct-19 17:28:58] http://localhost/livelink/Default.aspx?command=getICPWeight&server=TestEss64&application=Comma&domain=
    [2011-Oct-19 17:29:15] (-6981) HR#07772: cannot register HDF
    [2011-Oct-19 17:29:15] com.hyperroll.jhrapi.JhrapiException: (-6981) HR#07772: cannot register HDF
    [2011-Oct-19 17:29:15]      at com.hyperroll.jhrapi.JhrapiImpl.updateMetadata(Native Method)
    [2011-Oct-19 17:29:15]      at com.hyperroll.jhrapi.Application.updateMetadata(Unknown Source)
    [2011-Oct-19 17:29:15]      at com.hyperroll.hfm2ess.bridge.HyperRollProcess.updateMetadata(Unknown Source)
    [2011-Oct-19 17:29:15]      at com.hyperroll.hfm2ess.bridge.ws.BridgeOperationManagerImpl.createAggServerApp(Unknown Source)
    [2011-Oct-19 17:29:15]      at com.hyperroll.hfm2ess.bridge.ws.BridgeOperationManager.createAggServerApp(Unknown Source)
    [2011-Oct-19 17:29:15]      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [2011-Oct-19 17:29:15]      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [2011-Oct-19 17:29:15]      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [2011-Oct-19 17:29:15]      at java.lang.reflect.Method.invoke(Method.java:597)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.WLSInstanceResolver$WLSInvoker.invoke(WLSInstanceResolver.java:92)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.WLSInstanceResolver$WLSInvoker.invoke(WLSInstanceResolver.java:74)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.server.InvokerTube$2.invoke(InvokerTube.java:151)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.server.sei.EndpointMethodHandlerImpl.invoke(EndpointMethodHandlerImpl.java:268)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.server.sei.SEIInvokerTube.processRequest(SEIInvokerTube.java:100)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.api.pipe.Fiber.__doRun(Fiber.java:866)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.api.pipe.Fiber._doRun(Fiber.java:815)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.api.pipe.Fiber.doRun(Fiber.java:778)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.api.pipe.Fiber.runSync(Fiber.java:680)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.server.WSEndpointImpl$2.process(WSEndpointImpl.java:403)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.transport.http.HttpAdapter$HttpToolkit.handle(HttpAdapter.java:532)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.transport.http.HttpAdapter.handle(HttpAdapter.java:253)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.transport.http.servlet.ServletAdapter.handle(ServletAdapter.java:140)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.WLSServletAdapter.handle(WLSServletAdapter.java:171)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.HttpServletAdapter$AuthorizedInvoke.run(HttpServletAdapter.java:708)
    [2011-Oct-19 17:29:15]      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:363)
    [2011-Oct-19 17:29:15]      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:146)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.util.ServerSecurityHelper.authenticatedInvoke(ServerSecurityHelper.java:103)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.HttpServletAdapter$3.run(HttpServletAdapter.java:311)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.HttpServletAdapter.post(HttpServletAdapter.java:336)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.JAXWSServlet.doRequest(JAXWSServlet.java:98)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.http.AbstractAsyncServlet.service(AbstractAsyncServlet.java:99)
    [2011-Oct-19 17:29:15]      at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:183)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3717)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
    [2011-Oct-19 17:29:15]      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    [2011-Oct-19 17:29:15]      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
    [2011-Oct-19 17:29:15]      at weblogic.work.ExecuteThread.execute(ExecuteThread.java:207)
    [2011-Oct-19 17:29:15]      at weblogic.work.ExecuteThread.run(ExecuteThread.java:176)
    [2011-Oct-19 17:29:15] LiveLinkException [HR#09746]: Data Synchronization Server database cannot be created

    What version of EAL have you installed, and on what OS (32-bit or 64-bit) are you installing it?
    What version of the OUI did you use?
    Have you gone through all the configuration steps successfully?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Creating a BAM report using data from two database tables, say "EmployeeTable" and "DepartmentTable"

    I want to populate data from two database tables into a BAM report. I have created two data objects for the tables and want to create a 3rd data object that takes the data from the other two data objects automatically. Is there any such facility within BAM?

    Hi
    1. I am not sure if you have an option to combine multiple Data Objects like that.
    2. For your scenario, on the database side itself, I would recommend creating a VIEW that joins these 2 tables, pulls the required columns, and adds the required conditions. Verify the VIEW has all the rows expected (a sketch follows below).
    3. Then create a Data Object for this View. Once you have the Data Object, create the Report.
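    A minimal sketch of the view in step 2, assuming the two tables share a department key (all column names here are assumptions):
    -- Join the two tables once on the database side; BAM then reads the view
    -- through a single Data Object
    CREATE VIEW EmployeeDepartmentView AS
    SELECT e.EmployeeId,
           e.EmployeeName,
           d.DepartmentId,
           d.DepartmentName
    FROM   EmployeeTable e
           JOIN DepartmentTable d ON e.DepartmentId = d.DepartmentId;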
    Thanks
    Ravi Jegga

  • Workflows table is not showing up in Data Manager drop down table list

    Hi,
    I am able to see the Workflows table in the Console, but I am not able to see it in the Data Manager drop-down table list in record mode for selection.
    I am also not allowed to create another workflow table.
    Thanks for any tips/clue
    -reo

    Reo,
    You will not be able to create another workflow table. There is only a single workflow table that will hold all the workflows you create through the Data Manager.
    As Vito mentioned, please make sure to load the corresponding MDMWorkflow component on the client machines running the Data Manager that you wish to create and view workflows from.
    Once the workflow component is installed you should see it as a new tab in the Data Manager. You will need Visio to create workflows.
    Thanks,
    Tim

  • Can I restore the deleted statistical data from the database tables?

    Hi all,
    I have deleted the statistical data from the database tables (e.g. RSDDSTAT, RSDDSTATWHM, ...) by mistake through RSA1 -> Tools -> BW Statistics for InfoProviders -> Delete.
    Is there any way to restore the deleted data? Thanks in advance.

    Now I'm really confused.
    Your first post said:
    "I have deleted the statistical data from the database tables (e.g. RSDDSTAT, RSDDSTATWHM, ...) by mistake through RSA1 -> Tools -> BW Statistics for InfoProviders -> Delete."
    but your last response said:
    "I have deleted the BW Statistics data, not the actual data in the RSDDSTAT tables, through RSA1 -> Tools -> BW Statistics for InfoProviders -> clicked the 'Delete' button to delete data."
    If you used RSA1 -> Tools -> BW Statistics for InfoProviders -> Delete, then you deleted the data from the RSDDSTAT tables. This assumes you accepted the default date range that popped up after clicking the Delete button, which specifies deleting through the current date. If this is what you did, the data is gone. Your only hope would be to recover from a DB backup.
    The data in the RSDDSTAT tables is what is used to feed the BW Statistics cubes, generally on a daily basis.

  • Deletion of data within the database tables

    The user is trying to clean up the 2014 data within the database tables. He is running a delete function which keeps causing the log files
    to exceed their limit. The tables are large and he is unable to delete the data in one command due to the available space and logging. What is the best way to approach this?
    Thanks,

    Hi venkatesh1985,
    According to your description, the user fails to delete data in the tables due to the limited space of the log file. Based on my research, this issue can occur when you run a delete statement (DELETE FROM ExampleTable) in a single transaction and consume all the
    available space on your transaction log disk.
    To avoid this issue, you could use the two methods below to delete the data.
    1. Use a loop combined with TOP and delete rows in smaller transactions, as in the following example. This method requires you to process each table one by one.
    SELECT 1  -- primes @@ROWCOUNT so the loop body runs at least once
    WHILE @@ROWCOUNT > 0
    BEGIN
        -- Small batches keep each transaction short, so the log can truncate between batches;
        -- add a WHERE clause here to restrict the delete to the 2014 rows
        DELETE TOP (1000)
        FROM LargeTable
    END
    For more information about the process, please refer to the article:
    http://dbadiaries.com/how-to-delete-millions-of-rows-using-t-sql-with-reduced-impact
    2. If you want to delete all the data from all tables in the specific database, you could script the entire database and all database objects. Then drop the database and recreate it using the script as the steps below.
    a. In Object Explorer, expand the node for the instance containing the database to be scripted.
    b. Point to Tasks, and then click Generate Scripts and click Next.
    c. Select the option 'Script the entire database and all database objects'.
    d. Specify how scripts should be saved. You can save the script to a file or a new query window. Click Next, then click OK.
    e. Drop the database, and run the script in the query window to recreate it. For more information, please refer to the article:
    http://msdn.microsoft.com/en-us/library/bb895179.aspx#Introduction
    In addition, if possible, please increase the size of the log file or move the log file to a different disk with more disk space.
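    If growing the log is the chosen route, a hedged T-SQL sketch (the database name, logical log file name, and size below are placeholders; check sys.database_files for the real logical name):
    -- Grow the transaction log file; names and size are illustrative
    ALTER DATABASE ExampleDatabase
    MODIFY FILE (NAME = ExampleDatabase_log, SIZE = 20GB);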
    Regards,
    Michelle Li

  • Fetch data from different database tables

    Hi...
    How can I fetch data from different database tables, put it into an internal table, and then display it? Can you provide simple short code samples, as I'm new to ABAP? Thanks.

    Hi,
    Check this sample code..
    TYPE-POOLS: slis.
    DATA: BEGIN OF itab OCCURS 0,
            vbeln TYPE vbeln,
            expand,
          END OF itab.
    DATA: BEGIN OF itab1 OCCURS 0,
            vbeln TYPE vbeln,
            posnr TYPE posnr,
            matnr TYPE matnr,
            netpr TYPE netpr,
          END OF itab1.
    DATA: t_fieldcatalog TYPE slis_t_fieldcat_alv.
    DATA: s_fieldcatalog TYPE slis_fieldcat_alv.
    s_fieldcatalog-col_pos = '1'.
    s_fieldcatalog-fieldname = 'VBELN'.
    s_fieldcatalog-tabname   = 'ITAB'.
    s_fieldcatalog-rollname  = 'VBELN'.
    s_fieldcatalog-outputlen = '12'.
    APPEND s_fieldcatalog TO t_fieldcatalog.
    CLEAR: s_fieldcatalog.
    s_fieldcatalog-col_pos = '1'.
    s_fieldcatalog-fieldname = 'VBELN'.
    s_fieldcatalog-tabname   = 'ITAB1'.
    s_fieldcatalog-rollname  = 'VBELN'.
    s_fieldcatalog-outputlen = '12'.
    APPEND s_fieldcatalog TO t_fieldcatalog.
    CLEAR: s_fieldcatalog.
    s_fieldcatalog-col_pos = '2'.
    s_fieldcatalog-fieldname = 'POSNR'.
    s_fieldcatalog-tabname   = 'ITAB1'.
    s_fieldcatalog-rollname  = 'POSNR'.
    APPEND s_fieldcatalog TO t_fieldcatalog.
    CLEAR: s_fieldcatalog.
    s_fieldcatalog-col_pos = '3'.
    s_fieldcatalog-fieldname = 'MATNR'.
    s_fieldcatalog-tabname   = 'ITAB1'.
    s_fieldcatalog-rollname  = 'MATNR'.
    APPEND s_fieldcatalog TO t_fieldcatalog.
    CLEAR: s_fieldcatalog.
    s_fieldcatalog-col_pos = '4'.
    s_fieldcatalog-fieldname = 'NETPR'.
    s_fieldcatalog-tabname   = 'ITAB1'.
    s_fieldcatalog-rollname  = 'NETPR'.
    s_fieldcatalog-do_sum    = 'X'.
    APPEND s_fieldcatalog TO t_fieldcatalog.
    CLEAR: s_fieldcatalog.
    DATA: s_layout TYPE slis_layout_alv.
    s_layout-subtotals_text            = 'SUBTOTAL TEXT'.
    s_layout-key_hotspot = 'X'.
    s_layout-expand_fieldname = 'EXPAND'.
    SELECT vbeln UP TO 100 ROWS
           FROM
           vbak
           INTO TABLE itab.
    IF NOT itab[] IS INITIAL.
      SELECT vbeln posnr matnr netpr
             FROM vbap
             INTO TABLE itab1
             FOR ALL ENTRIES IN itab
             WHERE vbeln = itab-vbeln.
    ENDIF.
    DATA: v_repid TYPE syrepid.
    v_repid = sy-repid.
    DATA: s_keyinfo TYPE slis_keyinfo_alv.
    s_keyinfo-header01 = 'VBELN'.
    s_keyinfo-item01   = 'VBELN'.
    CALL FUNCTION 'REUSE_ALV_HIERSEQ_LIST_DISPLAY'
         EXPORTING
              i_callback_program = v_repid
              is_layout          = s_layout
              it_fieldcat        = t_fieldcatalog
              i_tabname_header   = 'ITAB'
              i_tabname_item     = 'ITAB1'
              is_keyinfo         = s_keyinfo
         TABLES
              t_outtab_header    = itab
              t_outtab_item      = itab1
         EXCEPTIONS
              program_error      = 1
              OTHERS             = 2.
    IF sy-subrc <> 0.
    * MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *         WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    Thanks
    Naren

  • Details of Test Data Migration Server (TDMS)

    Hi Friends,
    I am very much interested in the new tool Test Data Migration Server (TDMS). Is anyone aware of this tool? I did a detailed search on Google and the SAP sites, but I could not find much information. Please help me find out more about this tool.
    Thanks
    Rajeev

    Hi Rajeev,
    There are documents available on these pages:
    Service Marketplace
    http://service.sap.com/customdev-tdms
    SAP Homepage
    http://www.sap.com/services/customdev/tdms
    If you have specific questions, just let me know.
    Regards,
    Manfred

  • How to load text file data to Oracle Database table?

    Using Oracle Forms, how can I load text file data into an Oracle database table?

    Metalink note 33247.1 explains how to use TEXT_IO, as suggested by Robin, to read the file into a multi-row block. However, that article was written for Forms 4.5 and uses CREATE_RECORD in a loop. There was another article, 91513.1, describing a more elegant method of 'querying' the file into the block via transactional triggers. Unfortunately this more recent article has disappeared without trace and Oracle denies its existence. I know it existed, as I have a printed copy in front of me, and very useful it is too.
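    For reference, a minimal Forms PL/SQL sketch of the TEXT_IO-plus-CREATE_RECORD approach the note describes (the file path, block, and item names are illustrative, and a real loader would parse each line into its items):
    -- e.g. in a WHEN-BUTTON-PRESSED trigger; all names are placeholders
    DECLARE
      in_file TEXT_IO.FILE_TYPE;
      linebuf VARCHAR2(1000);
    BEGIN
      in_file := TEXT_IO.FOPEN('c:\data\load.txt', 'r');
      GO_BLOCK('EMP_BLOCK');
      LOOP
        TEXT_IO.GET_LINE(in_file, linebuf);  -- raises NO_DATA_FOUND at end of file
        CREATE_RECORD;
        :EMP_BLOCK.ENAME := linebuf;         -- split the line into items as needed
      END LOOP;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        TEXT_IO.FCLOSE(in_file);
    END;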

Maybe you are looking for

  • Mac won't recognize iPod Touch after library move

    Ever since my music library outgrew my internal hard drive and I moved it to my Time Capsule, my computer doesn't recognize my iPod Touch when I plug it in, though the iPod is charging from the USB port. I've deleted iTunes, rebooted and reinstalled

  • Keynote loses file

    While editing a presentation, Keynote crashed. The edited file has been completely lost and could not be found after booting Keynote and the system. However, the recent items list of both Keynote and MACOS displayed this item. But selecting it did no

  • Query on view - IS the querry executed each time view is referred?

    I want to know whether the query inside a view is executed each time the view is referred to. Also, which one of the below will be faster? select      a1.x,      a1.y,      b1.z from      TableA a1,      TableB b1 where      a1.keyName = 'F' || b1.some

  • Load sharing w/ multiple CEs in a WCCPv2 environment

    Hi, can someone help me understand how to configure load balancing based on source ip addresses in a WCCPv2 environment? I'm using several Cisco Content engines (565, 510, 507, ACNS 5.2.1.7) attached via a L2 connection to a pair of 6509 Sup2/MSFC2 d

  • Dynamic Delivery Content issue

    Hi everyone!! I'm trying to reference information in the XML data to be put into the delivery content but I can't get it to work :S I'm using this format ${ELEMENT} but it looks like it's a null element :S This is the java: import oracle.apps.xdo.batch.Doc