Tcodes for all Setup tables.

Hi all,
Can you give me the list of transaction codes for filling the setup tables of the corresponding DataSources?
It would be a great help.
Thanks in advance!
Regards,
Vineeta.

Hi,
Transactions to fill setup tables:
OLI1BW  INVCO Stat. Setup: Material Movemts
OLI2BW  INVCO Stat. Setup: Stor. Loc. Stocks
OLI3BW  Reorg. PURCHIS BW Extract Structures
OLI4BW  Reorg. PPIS Extract Structures
OLI7BW  Reorg. of VIS Extr. Struct.: Order
OLI8BW  Reorg. VIS Extr. Str.: Delivery
OLI9BW  Reorg. VIS Extr. Str.: Invoices
OLIABW  Setup: BW Agency Business
OLIFBW  Reorg. Rep. Manuf. Extr. Structs
OLIIBW  Reorg. of PM Info System for BW
OLIQBW  QM Infosystem Reorganization for BW
OLISBW  Reorg. of CS Info System for BW
OLIZBW  INVCO Setup: Invoice Verification
OLI7BW is the tcode for Sales Orders.
Hope this helps.
Thanks,
JituK

Similar Messages

  • Tcode for filling setup tables

    Hi experts,
    What is the tcode for filling setup tables of 2lis_06_inv ?
    Full points will be assigned.
    Regards,
    Bhadri M.

    Hi,
Filling the setup tables depends on the DataSource; there are different transaction codes for the respective extract structures:
OLI1BW  INVCO Stat. Setup: Material Movemts
OLI2BW  INVCO Stat. Setup: Stor. Loc. Stocks
OLI3BW  Reorg. PURCHIS BW Extract Structures
OLI4BW  Reorg. PPIS Extract Structures
OLI7BW  Reorg. of VIS Extr. Struct.: Order
OLI8BW  Reorg. VIS Extr. Str.: Delivery
OLI9BW  Reorg. VIS Extr. Str.: Invoices
OLIABW  Setup: BW Agency Business
OLIFBW  Reorg. Rep. Manuf. Extr. Structs
OLIIBW  Reorg. of PM Info System for BW
OLIQBW  QM Infosystem Reorganization for BW
OLISBW  Reorg. of CS Info System for BW
OLIZBW  INVCO Setup: Invoice Verification
OLI7BW is the tcode for Sales Orders.
Roberto's blogs will also be useful:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    LOGISTIC COCKPIT DELTA MECHANISM - Episode two: V3 Update, when some problems can occur...
    LOGISTIC COCKPIT DELTA MECHANISM - Episode three: the new update methods
    LOGISTIC COCKPIT - WHEN YOU NEED MORE - First option: enhance it !
    LOGISTIC COCKPIT: a new deal overshadowed by the old-fashioned LIS ?
Display setup table data:
Go to SE11 or SE16, enter MC*SETUP and press F4; you will get all the setup tables. Choose one and display the data.
To check completion of data loading, use transaction NPRT, where you can see the entry with your user and the run of your setup.
    Regards
    Tg

  • Tcode to fill setup tables for application 0CS( Customer Service)

    Dear Experts,
I have to fill the setup tables for the DataSource 2LIS_18_I0TASK.
The DataSource comes under the application 0CS (Customer Service).
When I try to fill the setup tables through the tcode SBIW, I am not able to find the respective application.
Could you please let me know the transaction to do this?
    Regards,
    Sunil...

    Hi SunilKumar Bolle,
For filling the setup tables for 0CS DataSources, use the T-Code OLISBW, or use the path SBIW -> Settings for Application-Specific DataSources (PI) -> Logistics -> Managing Extract Structures -> Initialization -> Filling in the Setup Table -> Application-Specific Setup of Statistical Data -> Service Management - Perform Setup.
    The only way to say thanks in SDN is to assign Points.

  • Max no of records in for all entries table

    Hello all,
I have used FOR ALL ENTRIES in a SELECT statement in a BW extractor. The extractor works fine for the test data, but when I moved this code to pre-production for testing, it had to deal with thousands of records, and there the SELECT statement is not picking up all the records available in the DB. Can anyone give me an idea of the behavior of FOR ALL ENTRIES for large numbers of records, and is there any maximum limit for the FOR ALL ENTRIES table?
Thank you; the correct answer will be rewarded.
    Regards
    Sravan

    Moderator message - Please search before asking and do not offer rewards (particularly since as far as I can see, you've awarded a total of two points in the last two years - post locked
    Rob

  • Recordcount for all the tables in my user

How will I get the record count for all the tables in my user with a single query?
Please help.
Thanks in advance.

    Not possible. As there can be any number of tables with any names, this requires dynamic SQL.
SQL given to the Oracle SQL engine cannot be dynamic in terms of scope and references - it must be static. For example, one cannot do this:
    SELECT count(*) FROM :table
    For the SQL Engine to parse the SQL, determine if it is valid, determine the scope and security, determine an execution plan, it needs to know the actual object names. Objects like tables and columns and functions cannot be variable.
    You will therefore need to write a user function (in PL/SQL) that dynamically creates a [SELECT COUNT] SQL for a table, execute that SQL and return the row count - and then use SQL to iterate through USER_TABLES for example and sum the results of this function.
    Note that object tables are not listed in USER_TABLES - thus a more comprehensive list of all table objects in your schema can be found in USER_OBJECTS.
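A minimal sketch of such a function (the name row_count is my own choice, not a built-in; it assumes you have SELECT access to your own tables):

```sql
-- Count the rows of one table via dynamic SQL.
CREATE OR REPLACE FUNCTION row_count (p_table IN VARCHAR2)
  RETURN NUMBER
AS
  l_count NUMBER;
BEGIN
  -- DBMS_ASSERT validates the identifier, guarding against SQL injection
  EXECUTE IMMEDIATE
    'SELECT COUNT(*) FROM ' || DBMS_ASSERT.simple_sql_name(p_table)
    INTO l_count;
  RETURN l_count;
END row_count;
/

-- Then a single query over the data dictionary reports every table:
SELECT table_name, row_count(table_name) AS num_rows
FROM   user_tables
ORDER  BY table_name;
```

Note the counts are as of execution time; on a busy system they are only a snapshot.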

  • Is Oracle Partitioning req'd in the Source R/3 system for SBIW Setup Table?

    We have ECC6.0 as our Source System.  My BI team is trying to load 3 years of SALES history into the BI7.0 system, and then the APO system will pull 1 of those 3 years into it.  To begin this process, they are using SBIW to Manage the Extract Structures and create the Setup Tables.
    I got an alert that these Setup Tables were created with Partitions.  We are not licensed for partitioning in our ECC(R/3) system, only in BI and APO.
    1.)  Is Partitioning really necessary for these Setup Tables?
    2.)  It appears that the Setup jobs create these tables by default as partitioned?  How do I disable or change that? 
    Thanks,
    Richard (Basis)

    Hi Neeraj,
    thank you for the reply.
Yes, I have checked the tRFCs and logs and everything looks OK. We are getting the expected records in BW, but the job is not ending in the backend system and the BW load always shows yellow.
It looks like our developer is using a custom FM to extract the data, and I suspect a code issue causing an endless loop; they need to write code to terminate the process after the last record in the extract. I will update you once I get the root cause of the issue.
Thanks again,
Venkata

  • What is the Tcode for Aggregate Statstics Table and PSA table

    What is the Tcode for Aggregate Statstics Table and PSA table

Hi Raina,
There is no particular tcode to check PSA data. Go to RSA1, open your DataSource, right-click and choose Manage; you will get the PSA data, provided there is data in the PSA.
I hope this is useful to you.

  • Down time for 2LIS_03_BX setup table

    Hello SDN,
I have a data reconciliation issue in my BW server for inventory management, for the DataSources 2LIS_03_BX and 2LIS_03_BF. For this I have to refill the setup table of 2LIS_03_BX in the R/3 server. For filling the setup table in the R/3 system I need to ask the client for downtime.
I need to know how to calculate the required downtime for filling the setup table for DataSource 2LIS_03_BX.

    Dear Pravender,
I understand your statement ("For the complete load, check how much time the initialization takes - that much downtime is all you need; later you can fill the setup tables for historical data and load.") to mean the following steps:
1. During the downtime, start the init-without-data-transfer InfoPackage in BW for 2LIS_03_BF and 2LIS_03_UM.
2. Then during uptime (after releasing the downtime, with transactional data posting allowed in R/3), fill the setup table for 2LIS_03_BX.
3. Run the full-upload InfoPackage for 2LIS_03_BX.
4. Start the delta InfoPackages for DataSources 2LIS_03_BF and 2LIS_03_UM.
Let me know if I am right about the above procedure. These steps would allow us to keep the downtime very short.
    Thanks for the reply
    Regards,
    Jaydeep
    Edited by: Jaydeepsinh Rathore on Sep 4, 2009 8:23 AM

  • How to generate test data for all the tables in oracle

I am planning to use PL/SQL to generate test data in all the tables in a schema. The schema name is given as an input parameter, along with the minimum records in master tables and the minimum records in child tables. Data should be consistent in the columns which are used for constraints, i.e. using the same column values.
I am planning to implement something like:
execute sp_schema_data_gen (schemaname, minrecinmstrtbl, minrecsforchildtable);
schemaname = owner,
minrecinmstrtbl = minimum records to insert into each parent table,
minrecsforchildtable = minimum records to enter into each child table of each master table;
using all_tables where owner = schemaname, all_tab_columns and all_constraints where owner = schemaname, and the dbms_random package.
Does anyone have a better idea how to do this? Is this functionality already there in the Oracle DB?

    Ah, damorgan, data, test data, metadata and table-driven processes. Love the stuff!
    There are two approaches you can take with this. I'll mention both and then ask which
    one you think you would find most useful for your requirements.
    One approach I would call the generic bottom-up approach which is the one I think you
    are referring to.
    This system is a generic test data generator. It isn't designed to generate data for any
    particular existing table or application but is the general case solution.
    Building on damorgan's advice define the basic hierarchy: table collection, tables, data; so start at the data level.
    1. Identify/document the data types that you need to support. Start small (NUMBER, VARCHAR2, DATE) and add as you go along
    2. For each data type identify the functionality and attributes that you need. For instance for VARCHAR2
    a. min length - the minimum length to generate
    b. max length - the maximum length
    c. prefix - a prefix for the generated data; e.g. for an address field you might want a 'add1' prefix
    d. suffix - a suffix for the generated data; see prefix
    e. whether to generate NULLs
    3. For NUMBER you will probably want at least precision and scale but might want minimum and maximum values or even min/max precision,
    min/max scale.
    4. store the attribute combinations in Oracle tables
    5. build functionality for each data type that can create the range and type of data that you need. These functions should take parameters that can be used to control the attributes and the amount of data generated.
    6. At the table level you will need business rules that control how the different columns of the table relate to each other. For example, for ADDRESS information your business rule might be that ADDRESS1, CITY, STATE, ZIP are required and ADDRESS2 is optional.
    7. Add table-level processes, driven by the saved metadata, that can generate data at the record level by leveraging the data type functionality you have built previously.
8. Then add the metadata, business rules and functionality to control the TABLE-TO-TABLE relationships; that is, the data model. You need the same DEPTNO values in the SCOTT.EMP table that exist in the SCOTT.DEPT table.
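Step 5 for the VARCHAR2 attributes listed in step 2 might look like this - a sketch only, with a hypothetical function name, built on the DBMS_RANDOM package the question already mentions:

```sql
CREATE OR REPLACE FUNCTION gen_varchar2 (
  p_min_len  IN PLS_INTEGER,
  p_max_len  IN PLS_INTEGER,
  p_prefix   IN VARCHAR2 DEFAULT NULL,
  p_null_pct IN NUMBER   DEFAULT 0   -- percentage of values to leave NULL
) RETURN VARCHAR2
AS
  l_len PLS_INTEGER;
BEGIN
  -- honour the "whether to generate NULLs" attribute (2.e)
  IF DBMS_RANDOM.value(0, 100) < p_null_pct THEN
    RETURN NULL;
  END IF;
  -- random length between min and max (2.a / 2.b)
  l_len := TRUNC(DBMS_RANDOM.value(p_min_len, p_max_len + 1));
  -- prefix (2.c) plus random mixed-case letters for the remainder
  RETURN p_prefix
      || DBMS_RANDOM.string('a', GREATEST(l_len - NVL(LENGTH(p_prefix), 0), 0));
END gen_varchar2;
/
```

A NUMBER or DATE generator follows the same pattern with its own attribute parameters; the attribute combinations themselves are what you store in the metadata tables of step 4.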
The second approach, which I have used more often, I would call the top-down approach; I use
it when test data is needed for an existing system. The main use case here is to avoid
having to copy production data to QA, TEST or DEV environments.
    QA people want to test with data that they are familiar with: names, companies, code values.
    I've found they aren't often fond of random character strings for names of things.
    The second approach I use for mature systems where there is already plenty of data to choose from.
    It involves selecting subsets of data from each of the existing tables and saving that data in a
    set of test tables. This data can then be used for regression testing and for automated unit testing of
    existing functionality and functionality that is being developed.
    QA can use data they are already familiar with and can test the application (GUI?) interface on that
    data to see if they get the expected changes.
    For each table to be tested (e.g. DEPT) I create two test system tables. A BEFORE table and an EXPECTED table.
1. DEPT_TEST_BEFORE
     This table has all DEPT table columns plus a TESTCASE column.
     It holds DEPT-image rows for each test case that show the row as it should look BEFORE the
     test for that test case is performed.
     CREATE TABLE DEPT_TEST_BEFORE (
       TESTCASE NUMBER,
       DEPTNO   NUMBER(2),
       DNAME    VARCHAR2(14 BYTE),
       LOC      VARCHAR2(13 BYTE)
     );
2. DEPT_TEST_EXPECTED
     This table also has all DEPT table columns plus a TESTCASE column.
     It holds DEPT-image rows for each test case that show the row as it should look AFTER the
     test for that test case is performed.
Each of these tables is a mirror image of the actual application table, with one new column
added that contains the TESTCASE number.
To create test case #3, identify or create the DEPT records you want to use for test case #3 and
insert them into DEPT_TEST_BEFORE:
     INSERT INTO DEPT_TEST_BEFORE
     SELECT 3, D.* FROM DEPT D WHERE DEPTNO = 20;
Then insert records for test case #3 into DEPT_TEST_EXPECTED that show the rows as they should
look after test #3 is run. For example, if test #3 creates one new record, add all the
records from the BEFORE data set plus a new row for the new record.
When you want to run test case #3, the process is basically (ignoring for this illustration that
there is a foreign key between DEPT and EMP):
    1. delete the records from SCOTT.DEPT that correspond to test case #3 DEPT records.
              DELETE FROM DEPT
              WHERE DEPTNO IN (SELECT DEPTNO FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3);
    2. insert the test data set records for SCOTT.DEPT for test case #3.
              INSERT INTO DEPT
              SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3;
3. Perform the test.
4. Compare the actual results with the expected results.
     This is done by a function that compares the records in DEPT with the records
     in DEPT_TEST_EXPECTED for test #3.
     I usually store these results in yet another table or just report them out.
5. Report out the differences.
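The comparison in step 4 can be sketched in plain SQL with set operators, using the DEPT example above - a symmetric difference, so zero rows returned means actual matches expected:

```sql
-- rows in DEPT missing from the expected image, plus rows expected but absent
(SELECT DEPTNO, DNAME, LOC FROM DEPT
 MINUS
 SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3)
UNION ALL
(SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3
 MINUS
 SELECT DEPTNO, DNAME, LOC FROM DEPT);
```

Wrapping this in a function that takes the test case number and logs any returned rows gives the report in step 5.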
This second approach uses data the users (QA) are already familiar with, is scalable, and
makes it easy to add new data that meets business requirements.
    It is also easy to automatically generate the necessary tables and test setup/breakdown
    using a table-driven metadata approach. Adding a new test table is as easy as calling
    a stored procedure; the procedure can generate the DDL or create the actual tables needed
    for the BEFORE and AFTER snapshots.
    The main disadvantage is that existing data will almost never cover the corner cases.
    But you can add data for these. By corner cases I mean data that defines the limits
    for a data type: a VARCHAR2(30) name field should have at least one test record that
    has a name that is 30 characters long.
    Which of these approaches makes the most sense for you?

  • For all entries table handled by Tables parameter of Subroutine!

Hi,
please see my code below.
Don't be put off because it looks big - it's actually quite small.
In the source code of the function module:
DATA: itab1 TYPE STANDARD TABLE OF <local structure of top include> WITH HEADER LINE,
      itab2 TYPE STANDARD TABLE OF knvp WITH HEADER LINE.

PERFORM routine_data TABLES itab1
                     USING  p_var.

The subroutine code, saved in the F include of that function group:

FORM routine_data TABLES itab1 TYPE STANDARD TABLE
                  USING  p_var.
  SELECT * FROM <db table>
    INTO CORRESPONDING FIELDS OF TABLE itab1
    WHERE <keyfield> = p_var.
ENDFORM.
Back in the source code of the function module:

IF sy-subrc = 0.
  SELECT parvw kunn2 kunnr FROM knvp
    INTO CORRESPONDING FIELDS OF TABLE itab2
    FOR ALL ENTRIES IN itab1
    WHERE kunnr = itab1-kunnr
      AND ( parvw = 'WE' OR parvw = 'RE' OR parvw = 'RG' ).
ENDIF.

This code is working fine.
    ==================Now coming to my problem==========================
In the source code of the function module:

DATA: itab1 TYPE STANDARD TABLE OF <local structure of top include> WITH HEADER LINE,
      itab2 TYPE STANDARD TABLE OF knvp WITH HEADER LINE.

PERFORM routine_data TABLES itab1
                     USING  p_var.

The subroutine code, saved in the F include of that function group:

FORM routine_data TABLES itab1 TYPE STANDARD TABLE
                  USING  p_var.
  SELECT * FROM <db table>
    INTO CORRESPONDING FIELDS OF TABLE itab1
    WHERE <keyfield> = p_var.
ENDFORM.

Function module source code:

PERFORM routine2_data TABLES itab1
                             itab2.

F include coding part for the above subroutine:

FORM routine2_data TABLES itab1 TYPE STANDARD TABLE
                          itab2 TYPE STANDARD TABLE.
  SELECT parvw kunn2 kunnr FROM knvp
    INTO CORRESPONDING FIELDS OF TABLE itab2
    FOR ALL ENTRIES IN itab1
    WHERE kunnr = itab1-kunnr                     "<-- causing the error
      AND ( parvw = 'WE' OR parvw = 'RE' OR parvw = 'RG' ).
ENDFORM.
This gives the error message:
>>> The specified type has no structure and therefore no component called 'KUNNR'.
So the problem is an incorrect way of declaring the parameters.
Please note that these subroutines of function modules are saved in include programs, which makes them behave somewhat differently from normal external subroutines.
Also:
FOR ALL ENTRIES is mandatory here!
And the two subroutines are mandatory!
Thanks for your attention,
Naveen Inuganti

Hi,
Use the syntax below to pass the tables as parameters.

* The below PERFORM is in the source code of the FM:
PERFORM goods_movement_post TABLES itab1
                                   itab2
                                   itab3
                            USING  ls_goodsmvt_header
                                   g_mov_code.

Suppose you are using the itab1 and itab2 table data to build the itab3 data.
The below code is in the forms include:

FORM goods_movement_post
     TABLES pt_itab1 STRUCTURE vbak
            pt_itab2 STRUCTURE vbap
            pt_itab3 STRUCTURE bapiret2
     USING  p_ls_goodsmvt_header STRUCTURE bapi2017_gm_head_01
            p_g_mov_code.
ENDFORM.

Typing the TABLES parameters with STRUCTURE gives them a known line type, so components such as KUNNR can be addressed inside the subroutine.
Thanks & Regards
Mallikharjuna Reddy

  • Posting Block for SD setup tables.

    Hi Gurus,
    Do we need to have the posting block in R/3 while filling up the setup tables for SD billing and deliveries?
    thanks in advance.
    Regards,
    Venkat

Hi,
The posting block is set so that billing documents are automatically posted to FI.
If you remove the posting block, after billing you have to release the documents to accounting manually.
So whether you need it depends on the R/3 functionality. It should not have any effect on the filling of the setup tables - it will not change any data.
Regards
N Ganesh

  • Table for billing setup table

    Hi all,
In R/3, is there a table (that can be accessed via SE16) that holds the billing setup data (2LIS_13_VDITM)?
       Thanks.
    Dave

MC13VD0ITMSETUP.
    Ravi

  • Tcode for all reports

Hi,
I am new to BW. Please help me: what is the tcode to check all activated reports in my project?

    Hi,
All reports created under an InfoProvider are available for reporting; you don't need to check whether a query is activated, only whether data is available for reporting.
If you need a list of queries, there are two ways that come to mind:
1. Go to the BEx Query Designer, open a query and go to your InfoProvider. Under it you can get the list of queries.
2. Go to RSA1 -> Transport Connection. In Collection Mode, select Data Flow Afterwards. Drag in your InfoProvider; under it you can get the list of queries.
    Regards,
    Vivek V

  • How to find the tcode for relation between tables

    Hi all
I just wanted to know the tcode that will show a graphical or non-graphical relation between tables, if there is one.
Very urgent, please.

    Transaction code to find relation between tables(Very urgent.)
    Please do NOT open duplicate posts.
    Regards,
    Ravi
    Note - Please mark the helpful answers

  • Is it possible to set the database with nulls first for all its tables ?

    Hello,
I have a schema of 117 tables and I would like Oracle to behave as NULLS FIRST when doing any ORDER BY on any of the tables.
Is there a way to tell that respectable old database server to do such a thing?
As it is today, I have to write some custom DAO code specific to Oracle just to have NULLS FIRST in the SQL statements.
    Kind Regards,

user573224 wrote:
"I have a schema of 117 tables and I would like Oracle to behave nulls first when doing any order by on any tables."
Basically this is an application (SQL) issue. If you do not code it explicitly, ascending ordering is the default, which also means that NULLS LAST is the default.
http://docs.oracle.com/cd/E11882_01/server.112/e26088/statements_10002.htm#i2171079
As NULLS FIRST belongs to the ORDER BY clause of the SELECT statement, this seems like a question for the SQL forum. {forum:id=75} That you happen to run SQL on XE does not matter.
"As it is today, I have to write some custom DAO specific to Oracle just so as to have a nulls first in the sql statements."
Changing how a database engine behaves seems like the opposite of a DBMS-agnostic application, if that is the purpose. Also, in general, database-agnostic/ignorant applications are often considered a sure way to unscalable and unmanageable systems.
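For reference, the per-statement form the reply is pointing at, using the SCOTT.EMP table purely as an example (by default, ascending order would put the NULL commissions last):

```sql
SELECT ename, comm
FROM   emp
ORDER  BY comm NULLS FIRST;
```

There is no database-wide setting for this; NULLS FIRST/NULLS LAST is decided per ORDER BY expression.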
    One could also consider that customers having Oracle as part of their infrastructure, would like to see that applications make good use of Oracle's strong and highly capable features and options -- not treat it as some sort data dumping bin.
    (Ok, granted, with XE this might not be true. Instead there are issues of limitations and lack of support (patches), that you might not want or can take on as an applications vendor.)
    Btw, good to see that you've found a fix in sql for your problem, and thank you for reverting back with that to the forum!
    Edited by: orafad on Dec 7, 2012 10:29 PM
    Edited by: orafad on Dec 7, 2012 10:31 PM
