Customizing SBIW GRDWT 0HR_PT_02

Hi!
I'm trying to do some customizing in SBIW for the Time Types, and after reading the threads and forums I'm still not able to maintain the time types correctly. I'm quite new to BW and it's the first time I've tried this kind of customizing. I've tried to follow the help given to the left of "Execute" in SBIW, but I'm already stuck at defining the GRDWT.
I'm working on a project in which MOLGA would be 08. By default I have MOLGA and then "otherwise 01". Should I start by changing that to 08?
And after that I suppose I should go to the important part, which would be to maintain the Time Types?
I really need your help guys!!
PS.
Could you explain also how to assign points for your answers?
Thanks

Hi Yuvaraaj,
thanks for taking the time to answer my question.
For time selection I use the extractors default datafields: BEGDA and ENDDA.
Pseudo-delta might be an option IF the two fields you mention are processed in the extractor's selection logic. I actually don't know and would need to take a closer look at the extractor's program logic.
Anyway, my goal is to stay as close to SAP's standard as possible.
Unfortunately the extractor does not support filtering by time in delta mode. (Found this on SAP's help pages and also tried to set start and end date ... no luck, as assumed.)
I had another look at [SAPs documentation for 0HR_PY_1_CE|http://help.sap.com/saphelp_nw70ehp2/helpdata/en/6a/cccc6015484464b25605375a5db965/content.htm] and have to retract the statement above.
As no restriction regarding delta and time selection is mentioned there, there must be a way to pass a time frame when initializing the delta.
I tried the following:
1. I started an InfoPackage in full mode with the following filter criteria: BEGDA = 01.10.2009; ENDDA = 31.10.2009.
The extractor delivers ~60,000 records; their INPER is always 10.2009. -> Selection works fine.
2. I started an InfoPackage in delta init mode with exactly the same selection criteria as the full-mode InfoPackage, and I receive millions of records whose INPER is spread over several years.
So the time selection is definitely not working in delta mode.
Are there any customizing settings I missed in the OLTP system?
Regards
Chris

Similar Messages

  • SBIW for Project Systems datasources for 0PS_C02

    I am looking for an explanation how to configure the objects under the following path
    SBIW
    Settings for Application-Specific DataSources (PI)
    Project System
    Project System Dates, Durations, and Floats in the BW
    Installing business content for this cube has been fairly easy up until this step and I have been unable to find clear instructions for what, when, how, or why this is done. Any assistance or identification of documentation would be greatly appreciated.  I have the D12 Project System - Dates documentation guides for all releases already.
    Thanks

    The documentation I found on BWEXPERTS online and best practices site states:
    For these extractors [0PS_DAT_*], you need to go to BW Customizing (SBIW) ...... to describe the settings, duration and buffer. This has two steps:
    1) Create new combinations of the InfoObjects Value type etc
    2) Choose which date field of an R/3 master table is built through a given characteristic vector in BW.
    I have been unable to find an explanation for these two steps, which comprise 8 configuration steps in SBIW. I don't understand whether I need to perform some or all of the steps, and I would like to find something that explains in detail the result of performing and using this configuration.
    Let me add that we are on SAP 4.7 (620) with BW 3.1, currently with support package stacks 19 - 23 installed in our dev box, where I have been testing these objects.
    Thanks so much for your comments

  • Maximum package size for data packages was exceeded and Process terminated

    Hello Guru,
    When I execute the process chain I get the message "Maximum package size for data packages was exceeded and Process terminated". Can anybody help me with how to proceed?
    Thanks & Regards,
    Suresh.

    Hi,
    When the load is not getting processed due to a huge volume of data, or too many records per data packet, please try the options below.
    1) Reduce the IDoc size to 8000 and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
    2) Run the load only to PSA.
    3) Once the load is successful, push the data to the targets.
    In this way you can overcome this issue.
    You can also try RSCUSTV* where * is an integer to change data load settings.
    To change the data package size for extraction, use Transaction RSCUSTV6.
    To change the data package size when uploading from an R/3 system, set the value in R/3 Customizing (SBIW -> General settings -> Control parameters for data transfer).
    In R/3: T-Code SBIW --> General settings --> Maintain Control Parameters for Data Transfer (source system specific).
    Hope this helps.
    Thanks,
    JituK

  • WHAT IS DATA PACKET SIZING IN BW?

    Hi Gurus,
    What is data packet sizing in BW? Is it a modeling, extraction or reporting related topic?
    Regards,RAMU.

    Hi,
    To have control over the data packet size (number of records in a data package), you can use settings in the InfoPackage.
    In the InfoPackage: Scheduler --> DataS. Default Data Transfer. This is a local setting for loads run through that InfoPackage; the changes you make here take priority over the default settings (SBIW) and apply only to loads from this InfoPackage. The default is 20000 kB; try half of it. You should specify it for the correct update method, i.e. if it is a delta InfoPackage, you should maintain the size against the delta. You can also define the number of data packets per IDoc as 10.
    You can also try RSCUSTV* where * is an integer to change data load settings.
    To change the data package size for flat file extraction, use Transaction RSCUSTV6.
    To change the data package size when uploading from an R/3 system, set the value in R/3 Customizing (SBIW -> General settings -> Control parameters for data transfer).
    In R/3: T-Code SBIW --> General settings --> Maintain Control Parameters for Data Transfer (source system specific).
    Try
    http://help.sap.com/saphelp_nw04/helpdata/en/51/85d6cf842825469a51b9a666442339/frameset.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
    This is related to loading performance, take a look at this doc 'bw load performance and analysis'
    Hope this helps.
    Thanks,
    JituK

  • Data Element Documentation - urgent

    Hi,
    Has anyone modified the data element documentation for any data element in CMOD? If so, for which data element did you do it and what was the necessity? Please help me, it's urgent; I would reward points for all answers.

    Hi ,
    To have control over the data packet size (number of records in a data package), you can use settings in the InfoPackage.
    In the InfoPackage: Scheduler --> DataS. Default Data Transfer. This is a local setting for loads run through that InfoPackage; the changes you make here take priority over the default settings (SBIW) and apply only to loads from this InfoPackage. The default is 20000 kB. You should specify it for the correct update method, i.e. if it is a delta InfoPackage, you should maintain the size against the delta. You can also define the number of data packets per IDoc as 10.
    You can also try RSCUSTV* where * is an integer to change data load settings.
    To change the data package size for extraction, use Transaction RSCUSTV6.
    To change the data package size when uploading from an R/3 system, set the value in R/3 Customizing (SBIW -> General settings -> Control parameters for data transfer).
    In R/3: T-Code SBIW --> General settings --> Maintain Control Parameters for Data Transfer (source system specific).
    Try
    http://help.sap.com/saphelp_nw04/helpdata/en/51/85d6cf842825469a51b9a666442339/frameset.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
    This is related to loading performance, take a look at this doc 'bw load performance and analysis'
    SAP Note 138794 Extraction parameter on BW/OLTP
    SAP Note 130253 General tips on uploading transaction data to BW
    SAP Note 365762 Loading data Pre-analysis of (performance)problems
    Note 409641 - Examples of packet size dependency on ROIDOCPRMS
    Note 417307 - Extractor package size: Collective note for applications
    I hope this will help you,
    Assign points if this helps.
    Thanks...
    Vasu...

  • Data packet

    Hi,
    How can I restrict a packet size to a certain number of records?
    Can you please tell me how.
    Regards
    Akki

    Hi,
    To have control over the data packet size (number of records in a data package), you can use settings in the InfoPackage.
    In the InfoPackage: Scheduler --> DataS. Default Data Transfer. This is a local setting for loads run through that InfoPackage; the changes you make here take priority over the default settings (SBIW) and apply only to loads from this InfoPackage. The default is 20000 kB; try changing it. You should specify it for the correct update method, i.e. if it is a delta InfoPackage, you should maintain the size against the delta. You can also define the number of data packets per IDoc as 10.
    You can also try RSCUSTV* where * is an integer to change data load settings.
    To change the data package size for extraction, use Transaction RSCUSTV6.
    To change the data package size when uploading from an R/3 system, set the value in R/3 Customizing (SBIW -> General settings -> Control parameters for data transfer).
    In R/3: T-Code SBIW --> General settings --> Maintain Control Parameters for Data Transfer (source system specific).
    Try
    http://help.sap.com/saphelp_nw04/helpdata/en/51/85d6cf842825469a51b9a666442339/frameset.htm
    SAP Note 138794 Extraction parameter on BW/OLTP
    Note 409641 - Examples of packet size dependency on ROIDOCPRMS
    Note 417307 - Extractor package size: Collective note for applications
    To find bytes transfered during load
    use tables RSMONICDP or RSMONICTAB to see the number of data packets and records per data packet
    You can see the data packets and number of records from the RSMON details. You can also look at the TRFC stats in SM58 SMQ1 and SMQ2 in R3 and you can also see the IDOCS via BD87 and determine information from there.
    The number of records should also appear in the Infopackage --> manage details menu
    Hope this helps.
    Thanks,
    JituK

  • OSS Notes Please

    Dear Gurus,
    Can anyone send me the following OSS notes if you have them: 567747, 130253 and 417307. Your kind help will definitely be awarded.
    My mail ID is [email protected]
    Best Regards
    Mohan Kumar
    Message was edited by: mohan kumar

    Hi Mohan,
    I think you will need OSS access yourself; note 567747 is a composite note that contains several notes.
    Sent to your mail ...
    130253
    Symptom
    Uploading transaction data to BW takes too long
    Other terms
    Business Warehouse, data upload, batch upload, transaction data upload,
    performance, runtime, data load, SPLIT_PARTITION_FAILED ORA00054
    Reason and Prerequisites
    Loading data from a mySAP system (for example, R/3) or from a file takes a very long time.
    Solution
    The following tips are a general check list to make the mass data upload to the Business Warehouse (BW) System as efficient as possible.
    Tip 1:
    Check the parameter settings of the database as described in composite Note 567745.
    Check the basis parameter settings of the system.
    Note 192658 Setting basis parameters for BW Systems
    See the following composite notes:
    Note 567747 Composite note BW 3.x performance: Extraction & loading
    Note 567746 Composite note BW 3.x performance: Query & Web applications
    Tip 2:
    Import the latest BW Support Package and the latest kernel patch into your system.
    Tip 3:
    Before you upload the transaction data, you should make sure that ALL related master data has been loaded into your system. If no master data has been loaded yet, the upload may take up to 100 percent longer because, in this case, the system must retrieve master data IDs for the characteristic attributes and add new records to the master data tables.
    Tip 4:
    If possible, always use TRFC (PSA) as the transfer method instead of
    IDocs. If you (have to) use IDocs, keep the number of data IDocs as low
    as possible. We recommend an IDoc size of between 10000 (Informix) and 50000 (Oracle, MS SQL Server).
    To upload from a file, set this value in Transaction RSCUSTV6.
    To upload from an R/3 system, set this value in R/3 Customizing (SBIW -> General settings -> Control parameters for data transfer).
    Tip 5:
    If possible, load the data from a file on the application server and not from the client workstation as this reduces the network load. This also allows you to load in batch.
    Tip 6:
    If possible, use a fixed record length when you load data from a file (ASCII file). For a CSV file, the system only carries out the conversion to a fixed record length during the loading process.
    Tip 7:
    When you load large data quantities from a file, we recommend that you split the file into several parts. We recommend using as many files of the same size as there are CPUs. You can then load these files simultaneously to the BW system in several requests. To do this, you require a fast RAID.
    Tip 8:
    When you load large quantities of data into InfoCubes, you should delete the secondary indexes before the loading process and recreate them afterwards if the following applies: the number of records loaded is large compared to the number of records that already exist in the (uncompressed) F fact table. For non-transactional InfoCubes, you must delete the indexes to be able to carry out parallel loading.
    Tip 9:
    When you load large quantities of data in an InfoCube, the number range buffer should be increased for the dimensions that are likely to have a high number of data sets.
    To do this, proceed as follows. Use function module RSD_CUBE_GET to find the object name of the dimension that is likely to have a high number of data sets.
    Function module settings:
    I_INFOCUBE = 'Infocube name'
    I_OBJVERS = 'A'
    I_BYPASS_BUFFER = 'X'
    The numbers for the dimensions are then contained in table 'E_T_DIME', column 'NUMBRANR'. If you enter 'BID' before this number, you get the relevant number range (for example BID0000053).
    You can use Transaction SNRO (-> ABAP/4 Workbench -> Development --> Other tools --> Number ranges) to display all number ranges for the dimensions used in BW if you enter BID*. You can use the object name that was determined beforehand to find the required number range.
    By double-clicking this line, you get to the number range maintenance. Choose Edit -> Set-up buffering -> Main memory, to define the 'No. of numbers in buffer'.
    Set this value to 500, for example. The size depends on the expected data quantity in the initial and in future (delta) uploads.
    !! Never buffer the number range for the package dimension !!
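    As a small illustration of the naming scheme in Tips 9 and 10 (a hedged sketch; the helper function is hypothetical, and the seven-digit zero-padding is inferred from the example BID0000053):

```python
def number_range_name(numbranr, prefix="BID"):
    """Build the number range object name from NUMBRANR.

    'BID' is used for dimensions (Tip 9), 'BIM' for InfoObjects (Tip 10).
    Zero-padding to seven digits is inferred from the example BID0000053.
    """
    return f"{prefix}{int(numbranr):07d}"

print(number_range_name(53))          # BID0000053
print(number_range_name(53, "BIM"))   # BIM0000053
```

    You would then look up the resulting name via BID* or BIM* in Transaction SNRO, as described above.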
    Tip 10:
    When you load large quantities of data, you should increase the number
    range buffer for the info objects that are likely to have a high number of data sets. To do this, proceed as follows:
    Use function module RSD_IOBJ_GET to find the number range name of the info object that is likely to have a high number of data sets.
    Function module settings:
    I_IOBJNM = 'Info object name'
    I_OBJVERS = 'A'
    I_BYPASS_BUFFER = 'X'
    The number for the info object is in table 'E_S_VIOBJ', column 'NUMBRANR'. Enter 'BIM' in front of this number to get the required number range (for example BIM0000053).
    Use Transaction SNRO (-> ABAP/4 Workbench -> Development --> Other tools --> Number ranges) to display all number ranges used for the info objects in BW by entering BIM*. By entering the object name determined beforehand you can find the desired number range.
    By double-clicking this line you get to the number range object maintenance. Choose Edit -> Set-up buffering -> Main memory, to define the 'No. of numbers in buffer'.
    Set this value to 500, for example. The size depends on the expected data quantity in the initial and in future (delta) uploads.
    !! Never buffer the number range object for the characteristic 0REQUEST!!
    417307
    Symptom
    Performance load is too high/not justifiable during data load.
    Customizing settings via extractors IMG path (Transaction SBIW in the OLTP system; can be called directly in the OLTP or using Customizing, from the BW system) do not yield any considerable improvement or are not clear.
    The settings in Table ROIDOCPRMS or in the Scheduler in the BW system are not taken into account by some extractors. How is the data package size defined for the data transfer to the BW? Are there application-specific features to determine the package size? If so, what are they?
    Other terms
    SBIW, general settings extractors, MAXSIZE, data volume, OLTP, service API, data package size, package size, performance
    Reason and Prerequisites
    The general formula is:
          Package size = MAXSIZE * 1000 / size of the transfer structure,
                        but not more than MAXLINES.
    You can look up the transfer structure (extract structure) in Table ROOSOURCE in the active version of the DataSource and determine its size via SE11 (DDIC) -> Utilities -> Runtime object -> Table length.
    The system default values of 10,000 or 100,000 are valid for the MAXSIZE and MAXLINES parameters (see the F1 help for the corresponding fields in ROIDOCPRMS). You can use the IMG Transaction SBIW (in the OLTP system) "Maintain General settings for extractors" to overwrite these parameters in Table ROIDOCPRMS on a system-specific basis. You also have the option of overriding these values in the scheduler (in the target BW system). However, in the Scheduler (InfoPackage) you can only reduce the MAXSIZE. The advantage of using the Scheduler to carry out maintenance is that the values are InfoSource-specific.
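    To make the formula concrete, here is a minimal sketch (the function itself is hypothetical; it only mirrors the arithmetic above, using the MAXSIZE/MAXLINES system defaults from ROIDOCPRMS):

```python
def package_size(maxsize_kb=10_000, maxlines=100_000, transfer_structure_bytes=500):
    """Records per data package: MAXSIZE * 1000 / transfer structure size,
    but never more than MAXLINES."""
    return min(maxsize_kb * 1000 // transfer_structure_bytes, maxlines)

# With the system defaults and a 500-byte transfer structure:
print(package_size())                             # 20000
# A very narrow transfer structure is capped by MAXLINES instead:
print(package_size(transfer_structure_bytes=50))  # 100000
```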
    However, some extractors have their own flow logic and do not take MAXSIZE 1:1 from ROIDOCPRMS.
    This Note does not cover all SAP applications.
    Solution
    Application/DataSource               Standard settings or note
    Generic extractor                     Standard (Example: Note 409641)
    Delta Extraction via DeltaQueue       Standard as of PlugIn 2000.2, Patch 3
    LO-LIS                                 Standard
    Logistic Cockpit SD                       Notes 419465 and 423118
    Logistic Cockpit MM-IM:
    Extraction 2LIS_03_UM                     Notes 537235 and 585750
    Extraction 2LIS_03_BF                     Note 454267
               In general, the following applies to Logistic Cockpit Extraction: The package size set only serves as guideline value.
                Depending on the application, contents and structure of the documents, and the selection in the reconstruction program, the actual size of the transfer packages may differ considerably. That is not an error.
                For 'Queued Delta' update mode the package size of 9999 LUWs (=Logical Unit of Work, in this particular case documents or posting units per transaction are concerned) is set for LO Cockpit. However, if a transaction updated more than 10000 documents at once, the number 9999 would be invalid. This is because an update process cannot be split.
                In the case of a 'Direct Delta' update mode there is no package size for LUW bundling. In the DeltaQueue (RSA1) every LUW is updated individually.
               Also note the following:
                         If you want to transfer large data sets into the BW System, it is a good idea to carry out the statistical data setup and the subsequent data transfer in several sub-steps. In doing so, the selections for the statistical data setup and in the BW InfoPackage must correspond to each other. For performance reasons, we recommend using as few selection criteria as possible. You should avoid complex selections. After loading init delta with selection 1, the setup table has to be deleted and rebuilt with selection 2.
                         Bear the following in mind: The delta is loaded for the sum of all selections from the init deltas. Remember that the selections are made so that they do not impede the delta load, for example, if you have initialized the delta for the periods January 1st, 1999 to December 1st, 2000 and December 2nd, 2000 to August 1st, 2001 then you get a time interval from January 1st, 1999 to August 1st, 2001 in the BW. Documents from August 2nd, 2001 are no longer loaded.
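    The interval behaviour described above can be sketched as follows (illustrative only; the function simply takes the earliest start and the latest end over all init selections, which is what the note's example amounts to):

```python
from datetime import date

def merged_delta_interval(init_selections):
    """The delta is loaded for the sum of all init selections,
    i.e. from the earliest start date to the latest end date."""
    return (min(s for s, _ in init_selections),
            max(e for _, e in init_selections))

# The note's example: two init selections ...
inits = [(date(1999, 1, 1), date(2000, 12, 1)),
         (date(2000, 12, 2), date(2001, 8, 1))]
# ... yield one delta interval from 01.01.1999 to 01.08.2001.
print(merged_delta_interval(inits))
```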
    CO-OM                                     Note 413992
    PS                                        Standard, or according to Note 413992
    CO-PC                                     Standard
    FI (0FI-AR/AP-3)                      First, the tables with open items (BSID for customers, BSIK for vendors) are read. The package size from the MAXSIZE field in Table ROIDOCPRMS is used in the extraction from these tables. If the packages are grouped together according to ROIDOCPRMS and a few data records still remain, these remaining records are transferred in one additional package.
    After the open items are extracted, the system extracts from the tables of cleared items (BSAD for customers, BSAK for vendors). In this extraction, the package size from the MAXSIZE field is adhered to.
    FI (0FI-AR/AP-4, new as of BW 30A)    Like FI-AR/AP-3, but with one difference: if there are remaining records after the system reads the open items using the setting in the MAXSIZE field, they are not transferred in one extra package, but added to the first package of items read from the table of cleared items. For example, if 10 data packages with 15000 data records are extracted from Table BSID in accordance with ROIDOCPRMS, and 400 data records remain, the package size of the first data package from Table BSAD is 15400.
    Both new and changed records are formatted in the following sequence in the delta transfer: 1) new BSIK/BSID records; 2) new BSAK/BSAD records; 3) changed BSIK/BSID records; 4) changed BSAK/BSAD records.
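    The difference between 0FI-AR/AP-3 and 0FI-AR/AP-4 described above can be sketched like this (a hypothetical helper; plain record counts stand in for the BSID/BSIK open items and BSAD/BSAK cleared items):

```python
def fi_packages(n_open, n_cleared, maxsize, carry_remainder=True):
    """Package sizes for FI extraction.

    carry_remainder=False: 0FI-AR/AP-3 -- the open-item remainder is
    sent as one extra package of its own.
    carry_remainder=True: 0FI-AR/AP-4 -- the remainder is added to the
    first package of cleared items instead.
    """
    packages = [maxsize] * (n_open // maxsize)
    remainder = n_open % maxsize
    if remainder and not carry_remainder:
        packages.append(remainder)   # AR/AP-3: extra package
        remainder = 0
    left = n_cleared
    while left > 0:
        take = min(maxsize, left)
        packages.append(take + remainder)  # remainder only on 1st package
        remainder = 0
        left -= take
    if remainder:  # no cleared items at all: ship the remainder on its own
        packages.append(remainder)
    return packages

# The note's example: 10 packages of 15000 from BSID plus 400 remaining;
# in AR/AP-4 mode the first BSAD package then has 15400 records.
print(fi_packages(150400, 45000, 15000))
```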
    Package size 0FI_GL_4:
    Prior to Note 534608 and the related notes, package size could vary considerably since the MAXLINES were applied to the document headers only. Then all documents lines for the document headers were read and transferred. As a result, the packages were 2 to 999 times as large as MAXLINES depending on the number of line items per document.
    Note 534608 and the related notes changed the logic so that MAXLINES is now also applied to the document lines. For each package, MAXLINES can be exceeded by up to 998 lines since a document is always transferred completely in one package. Smaller 'remaining packages' may also occur; for example, if MAXLINES = 10000 and 10000 document headers with 21000 lines are selected, 2 x 10000 lines and the remainder of 1000 lines were transferred in a separate package. Selection logic in 0FI_GL_4: the new FI documents are selected via CPUDT -> the changed documents are then selected via Table BWFI_AEDAT. When changing the selection from new to changed documents, a package may occur which consists of the 'remainder' of the CPUDT selection and the first package of the BWFI_AEDAT selection. This package can then have a maximum size of 2 x MAXLINES.
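    The post-Note-534608 packaging logic for 0FI_GL_4 can be sketched as follows (illustrative only; documents are represented by their line counts, and a document is never split across packages):

```python
def gl4_packages(doc_line_counts, maxlines):
    """Package sizes when MAXLINES applies to document lines but a
    document is always transferred whole: a package may exceed MAXLINES
    because the last document joins it completely, and a smaller
    'remaining package' may occur at the end."""
    packages, current = [], 0
    for lines in doc_line_counts:
        current += lines           # the whole document joins the package
        if current >= maxlines:    # package closes once MAXLINES is reached
            packages.append(current)
            current = 0
    if current:
        packages.append(current)   # the smaller remaining package
    return packages

# E.g. 10000 two-line documents (20000 lines) plus 500 more (1000 lines),
# MAXLINES = 10000: two packages of 10000 lines and a remainder of 1000.
print(gl4_packages([2] * 10500, 10000))  # [10000, 10000, 1000]
```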
    FI-FM                                  Note 416669
    EC-PCA                                 For the most part, the system adheres to the standard settings, but for technical reasons packages smaller than the MAXSIZE set in Table ROIDOCPRMS, or up to 7200 records larger than it, may occur.
    FI-SL                                  as in EC-PCA
    PT                                     Standard (refer to Note 397209)
    PY                                     Standard
    PA                                     Standard
    RE                                     Standard
    ISR-CAM (Category Management,
             new as of PlugIn 2001.1)     Standard
    CO-PA                                  During the initialization and full update in Profitability Analysis, a join is always read from two tables (for details see Note 392635). To avoid terminations caused by Select statements that run for too long, access occurs with intervals for the object numbers (fixed size 10,000). New intervals are read until the package size requested by BW is reached. Therefore the size of the data package is always equal to or larger than the specification, but it can vary considerably.
    Master data:
    Business partner                       Standard
    Product                                Standard
    Customer                               Standard
    Vendor                                 Standard
    Plant                                  Standard
    Material                               Standard
    567747
    Symptom
    You want to improve the performance of the extraction and loading of your data into SAP BW 3.x.
    Solution
    This is a composite note that deals with performance-relevant topics in the area of extraction and loading.
    If you encounter performance problems, ensure that the current Support Package has been imported.
    This note is continually updated. You should therefore download a new version on a regular basis.
    You will find further documents in the SAP Service Marketplace, alias bw under the folder "Performance".
    Contents:
    I.    Extraction from the OLTP
    II.   Loading generally
    III.  Master data
    IV.   Roll-Up/aggregate structure
    V.    Compression
    VI.   Hierarchies/attribute realignment run
    VII.  DataMart interface
    VIII. ODS objects
    IX.   Miscellaneous
    I.    Extraction from the OLTP
    Note 417307: extractor packet size: Collective note for applications
    Note 505700: LBWE: New update methods from PI 2002.1
    Note 398041: INFO: CO-OM/IM IM content (BW)
    Note 190038: Composite note performance for InfoSource 0CO_PC_01 and
    Note 436393: Performance improvement for filling the setup tables
    Note 387964: CO delta extractors: poor performance for Deltainit
    II.   Loading generally
    Note 130253: Notes on uploading transaction data into BW
    Note 555030: Deactivating BW-initiated DB statistics
    Note 620361: Performance data loading/Admin. data target, many requests
    III.  Master data
    Note 536223: Activating master data with navigation attributes
    Note 421419: Parallel loading of master data (several requests)
    IV.   Roll-Up/aggregate structure
    Note 484536: Filling aggregates of large InfoCubes
    Note 582529: Rollup of aggregates & indexes (again as of BW 3.0B Support Package 9)
    V.    Compression
    Note 375132: Performance optimization for InfoCube condensation
    Note 583202: Change run and condensing
    VI.   Hierarchies/attribute realignment run
    Note 388069: Monitor for the change run
    Note 176606: Apply Hierarchy/Attribute change ... long runtime
    Note 534630: Parallel processing of the change run
    Note 583202: Change run and condensing
    VII.  DataMart interface
    Note 514907: Processing complex queries (DataMart, and so on)
    Note 561961: Switching the use of the fact table view on/off
    VIII. ODS objects
    Note 565725: Optimizing the performance of ODS objects in BW 3.0B
    IX.   Miscellaneous
    Note 493980: Reorganizing master data attributes and master data texts
    Note 729402: Performance when compiling and searching in the AWB

  • LO Settings (in SBIW) for SD and Inventory before Customizing LO (LBWE)

    Hi,
    Please let me know the LO settings (in SBIW) to be maintained in R/3 for SD, Inventory and Purchasing before customizing LO (LBWE).
    i.e, for :
    <u>SD</u>
    1. Change statistics currency for each sales organization
    2. Assign Update Group
    <u>Inventory :</u>
    1.Determine Industry Sector
    2.Transaction Key Maintenance for SAP BW
    3.Stock initialization
    4.Plant view
    Warehouse View
    <u>Purchasing :</u>
    1.Determine Industry Sector
    2.Transaction Key Maintenance for SAP BW
    Note : I am NOT looking for LO steps
    Thanks
    Kishore

    Hi,
    For sales & distribution steps are
    1. T-code = SBIW
    2. Expand "Settings for Application-Specific DataSources (PI)"
    3. Expand "Logistics"
    4. Expand "Settings: Sales & Distribution"
    5. Click on the "IMG Activity" of "Change statistics currency for each sales organization"
    Now you can change it according to your requirements.
    Hope this helps.
    Cheers.
    Sukanya

  • InfoPackage does not show customizing from SBIW in OLTP?

    Hi, I changed values in OLTP transaction SBIW, but when I create an InfoPackage in SAP BI 7 it does not show the new parameters, only the old limits for MAXLINES and package size.
    We replicated and activated the DataSources again.
    Table ROOSPRMS is empty in both OLTP and BI, and table ROIDOCPRMS in OLTP has the correct values.
    Any idea ?

    The correct customizing is set per OLTP name and the correct display is in the InfoPackage; maybe you need to edit table ROOSPRMS in field MAXPROCS.
    In some cases the values set in SBIW for the maximum number of processes are not taken over.
    Bye.

  • BW customizing (RSO2, SBIW, RS*, LBWE) not available in 4.6C

    Dear all,
    We need assistance to enable the BW customizing in our legacy 4.6C SP 20 system. We are currently running ECC5 in production with BW 3.5, and our users have requested that the historical data in 4.6C be loaded to BW as well. Somehow the required T-codes do not exist in 4.6C. What steps do we need to take?
    Thanks...
    ismail

    Found SAP Note 318876 - Including BW extraction IMG in the SAP Ref. IMG, which may be relevant, but it refers to SP 08 whereas our legacy 4.6C is already at SP 20. Will ask our Basis team to have a look. Thanks

  • BW Early watch report ( SBIW )

    Hi All,
    We have received the early watch report for BW and in that It has mentioned to do SBIW ( IMG ) activity as follows:
    Upload customizing BW (SBIW)
    One part of table ROIDOCPRMS in every SAP source system controls the data transfer from the source systems
    to this system. Table ROIDOCPRMS contains the following information:
    maxsize - Maximum size of an IDoc packet in KB
    statfrqu - The number of packets that are transferred before statistical information is sent
    maxlines - Maximum number of records sent in one IDoc packet
    maxprocs - Maximum number of dialog work processes for each upload request used to send the data to the system
    The correct setup of this table is important for performance and stability of the upload process. We checked the
    setup of this table for all source systems connected to this system.
    SAP asks to change the entries and it has given recommendations. I hope this will increase the rate of Data Transfer from ERP to BW.
    My Questions :
    01. They have mentioned to set this in the respective source system, i.e. should I set it in the production system?
    02. One more thing: a transport request is also required for this operation. In that case, should I create the transport request in the production system? Is that generally OK? How is it usually handled?
    03. What authorization is required to execute the SBIW transaction, and how can I check which authorization is required?
    Please throw me some light on this.
    Thanking you in Anticipation.
    Thanks & Regards
    L Raghunahth
    PS : If you know answer to any particular one also, please feel free to answer.

    Hi Raghunahth,
    Please see my answers below.
    01. They mention setting this in the respective source system, i.e. should I set it in the production system?
    These settings relate to the respective source system. Yes, they are to be made in the production system.
    02. A transport request is apparently required for this operation. Should I create a transport request in the production system? Is that generally OK? How is it usually handled?
    No transport request is required. Check with your Basis administrator to carry out these settings.
    03. What authorization is required to execute transaction SBIW, and how do I check which authorization is required?
    Ask the Basis administrator to grant access rights to transaction SBIW in the source system.
    Hope this helps.
    Regards,
    Sreenivas.

  • Is Oracle Partitioning req'd in the Source R/3 system for SBIW Setup Table?

    We have ECC 6.0 as our source system. My BI team is trying to load 3 years of sales history into the BI 7.0 system, and the APO system will then pull 1 of those 3 years from it. To begin this process, they are using SBIW to manage the extract structures and create the setup tables.
    I got an alert that these setup tables were created with partitions. We are not licensed for partitioning in our ECC (R/3) system, only in BI and APO.
    1.) Is partitioning really necessary for these setup tables?
    2.) The setup jobs appear to create these tables as partitioned by default. How do I disable or change that?
    Thanks,
    Richard (Basis)

    Hi Neeraj,
    thank you for the reply.
    Yes, I have checked the tRFCs and logs; everything looks OK. We are getting the expected records in BW, but the job never ends in the backend system and the BW load always stays yellow.
    It looks like our developer is using a custom function module to extract the data, and I suspect a code issue causing an endless loop. The code needs to terminate the process after the last record of the extraction. I will update you once I find the root cause of the issue.
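    For what it's worth, the usual fix for this symptom in a custom extraction function module is to signal "no more data" once the internal cursor is exhausted (in ABAP, RAISE no_more_data). Here is a minimal sketch of that termination pattern as a Python generator (illustrative only; the function and data are made up, not the poster's actual FM):

```python
# Illustrative sketch (not the poster's actual FM): an extractor is
# called repeatedly for packets and must signal "no more data" after
# the last one -- in ABAP, "RAISE no_more_data." Modeled here as a
# generator, where exhaustion raises StopIteration to the caller.

def extract(all_rows, package_size):
    """Yield packets of at most package_size rows, then stop cleanly."""
    cursor = 0
    while cursor < len(all_rows):
        yield all_rows[cursor:cursor + package_size]
        cursor += package_size
    # Falling off the end here is the termination signal; without it
    # (e.g. an unconditional loop) the load would run forever.

packets = list(extract(list(range(10)), 4))
print(len(packets))      # 3 packets: 4 + 4 + 2 rows
print(packets[-1])       # [8, 9]
```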
    thanks again..
    thanks
    Venkata

  • Source System BW - what is "Customizing for extractors" etc.

    Hi,
    Can somebody explain IN DETAIL the purpose of
    1)Customizing for the extractors
    2)Transfer Global settings
    3)Transfer exchange rates
    I could not find any documentation on help.sap.com. Is there any documentation?
    thanks a lot

    Jay,
    1)Customizing for the extractors
    This modifies/controls the data transfer settings of the particular source system. It takes you directly to the SBIW screen (Implementation Guide) in the source system. Here you can control the data packet size, the number of data packets per IDoc, and so on. From this screen you can also maintain the DataSources; all activities relating to DataSource maintenance can be accessed from here.
    2)Transfer Global settings
    Global tables such as the fiscal period tables and fiscal year variant values are transferred to BW from the R/3 system where they are maintained. Do a lookup on tables T00* in SE11 and you will see a bunch of tables.
    3)Transfer exchange rates
    Similar to the tables above, the different exchange rates are maintained in the R/3 system and get transferred to BW: tables such as TCURR, TCURC, TCURV, etc. These are called global settings.
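    As a rough illustration of what these exchange-rate records look like, here is a simplified sketch in Python (not SAP code; the real TCURR key includes the rate type KURST and stores the valid-from date GDATU in inverted format, which is ignored here for readability): pick the latest rate whose valid-from date is on or before the requested day.

```python
from datetime import date

# Simplified model (not SAP code) of TCURR-style exchange-rate rows.
# The real key is rate type (KURST), from/to currency, and a validity
# date (GDATU, stored inverted); plain dates and sample values here.
rates = [
    # (rate_type, from_cur, to_cur, valid_from, rate)
    ("M", "USD", "EUR", date(2009, 1, 1), 0.75),
    ("M", "USD", "EUR", date(2009, 10, 1), 0.68),
]

def rate_on(rate_type, from_cur, to_cur, day):
    """Latest maintained rate whose valid-from is on or before `day`."""
    candidates = [r for r in rates
                  if r[:3] == (rate_type, from_cur, to_cur) and r[3] <= day]
    if not candidates:
        raise LookupError("no exchange rate maintained for this key")
    return max(candidates, key=lambda r: r[3])[4]

print(rate_on("M", "USD", "EUR", date(2009, 10, 15)))  # 0.68
```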
    Hope it helps
    Gova

  • SAP help for Logistics Customizing Cockpit

    Hi Friends
    Greetings...
    I want to read the complete scenario for BW extraction. I want to understand the LIS and LO concepts, central delta management, setup tables, why we need MC structures if we have LIS structures, etc.
    I've tried searching help.sap.com but could only find "customizing extractors" in the BW help, not the complete LO and LIS scenario. I have the 350 extraction material, but I would still like to read it on help.sap.com as well. Kindly help me find this.
    bye
    lasya

    Hi,
    Step by Step for LIS Extraction
    LIS EXTRACTION
    T.code - MC18 – create field catalog
    Application: 01 Sales and Distribution, 02 Purchasing, 03 Inventory Controlling, etc.
    Catalog category: 1. Characteristic catalog, 2. Key figures catalog, 3. Date catalog
    Select the characteristic catalog and press Enter. Click on Characteristics and select the source table; the relevant source fields are displayed. Select the source fields, then Copy + Close, Copy.
    Save; similarly create the key figures catalog.
    T.code : MC21 – create infostructure
    Infostructure : S789
    Application – 01
    Choose Characteristics, select the catalog, select the fields, Copy + Close.
    Choose the key figures catalog, select the key figures, Copy + Close; save and generate.
    T.code – MC24 – create updating
    Infostructure : S789
    Update group: 01 – Sales document, delivery, billing document; press Enter.
    Select the key figures, click on Rules for key figures, choose Suggest rules, copy, then save and generate.
    Click on Updating (activate updating).
    Select the infostructure and set the periodic split: 1. Day, 2. Week, 3. Month, 4. Posting period
    Updating: 1) No updating, 2) Synchronous updating (V1), 3) Asynchronous updating (V2), 4) Asynchronous updating (V3)
    T.code – LBW0 - Connection of LIS Information structures to SAPBW
    Information structure : S789
    Select the radio button-Setup LIS environment and Execute.
    Select the radio button-Generate data source and Execute.
    For Delta update:
    Select the radio button-Generate updating and Execute
    Select the radio button -Activate / deactivate and Execute.
    T.code – SBIW – Display IMG (implementation guide)
    Settings for application-specific DataSources – Logistics – Managing transfer information structures – Setup of statistical data – Application-specific setup of statistical data – Perform statistical setup – Sales.
    Choose activity
    Setup – Orders, deliveries, billing
    Choose the activities, enter the infostructure (S789), give the name of the run, date of termination, time of termination, and the number of tolerated faulty documents, then execute.
    T.code – RSA3 – Extractor checker
    Give the DataSource name, e.g. 2LIS_01_S789, and execute; the result should return some records.
    On the BW side: replicate the DataSource – assign the InfoSource – create the InfoCube – create update rules – create an InfoPackage and schedule it with "initialize delta process".
    For delta update :
    In R/3 side
    T.code – MC25: set updating to (V1), (V2), or (V3)
    T.code – LBW0: choose "Generate updating" and execute, then choose "Activate / deactivate" and execute
    BW side: create an InfoPackage and schedule it with delta update.
    Note: the first time you schedule the InfoPackage, set Updating in MC25 on the R/3 side to "No update" instead of selecting V1, V2, or V3.
    When doing the delta update: on the R/3 side set Updating in MC25 to V1, V2, or V3, then in LBW0 select the radio button Activate/Deactivate and execute,
    and schedule the InfoPackage with delta update.
    3. Modules: SD, MM, PP, QM, ...
    4. Deltas for LIS:
    After setting up the LIS environment, two transparent tables and one extract structure are generated for this particular info structure. In transaction SE11 you can view the tables SnnnBIW1 and SnnnBIW2, the structure SnnnBIWS, and the info structure Snnn itself.
    The tables SnnnBIW1 and SnnnBIW2 are used to assist the delta update process within BW.
    The extract structure SnnnBIWS is used as an interface structure between the OLTP info structure and BW.
    The OLTP system automatically creates an entry in the control table TMCBIW. In transaction SE16 you will see that, for your particular info structure, the field "BIW active" has the value "X" and the field "BIW status" is filled with the value "1" (referring to table SnnnBIW1).
    The original LIS update program RMCX#### is enhanced within the form routines "form SnnnBIW1_update_..." and "form SnnnBIW2_update_...".
    With transaction SE38 you will see at the end of the program [starting at line 870 / 1006] that it is enhanced with "BIW delta update" coding.
    The flag Activate/Deactivate switches the update process into the delta tables (SnnnBIW1/SnnnBIW2) on or off. Table TMCBIW defines which table is currently active for the delta update.
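    The toggle mechanism described above can be sketched roughly like this (a simplified Python model, not SAP code; the names and behavior are approximations of the SnnnBIW1/SnnnBIW2 switch controlled by TMCBIW):

```python
# Simplified model (not SAP code) of the LIS delta toggle: TMCBIW's
# status flag says which of SnnnBIW1 / SnnnBIW2 currently collects
# new postings; a BW delta request flips the flag and drains the
# previously active table.

class LisDelta:
    def __init__(self):
        self.tables = {1: [], 2: []}   # stands in for SnnnBIW1 / SnnnBIW2
        self.active = 1                # stands in for the TMCBIW "BIW status"

    def post_document(self, rec):
        """The update routine appends each posting to the active delta table."""
        self.tables[self.active].append(rec)

    def delta_request(self):
        """BW delta load: switch tables, then read and clear the old one."""
        old, self.active = self.active, 2 if self.active == 1 else 1
        delta, self.tables[old] = self.tables[old], []
        return delta

q = LisDelta()
q.post_document("order 1")
q.post_document("order 2")
print(q.delta_request())   # ['order 1', 'order 2']  (drained from table 1)
q.post_document("order 3") # now collected in table 2
print(q.delta_request())   # ['order 3']
```

    The point of the two-table design is that new postings are never written into the table BW is currently reading from.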
    Check LO extraction as well.
    rgds,
    raj

  • Error while generating custom ref sets Hierarchies

    I am trying to generate a DataSource for custom reference set hierarchies through SBIW and getting the following error:
    "Changes to Repository or cross-client Customizing are not permitted."
    How do I fix this problem? Please advise.
    Thanks,
    Ravi

    hi Ravi,
    It seems you are trying to generate the DataSource in a QA or production system, where changes are not allowed.
    Normally the changes are made in the development system and then transported to QA/production.
    The setting to allow changes is done with transaction SCC4, but that is a Basis task.
    hope this helps.
