Load from Setup tables breaking for high volume

Hello Friends,
There are around 50,000 records in the setup table for the 03 application area. The load breaks in BW when all 50,000 records are pulled, but it works if I restrict the selection to a single day's data in the BW scheduler.
Why is this happening?
Immediate help is appreciated.
Regards,
Simmi

Hello gurus,
So I tried reducing the packet size to 20,000 and the number of IDocs to 10 per InfoPackage, but this only works for about 40K records, which is just one week's data for the 2LIS_03_BF extractor.
If I load 30 days' worth of data from the setup tables for the same extractor, the load breaks.
The same thing happens for a custom-built Z extractor; for that one I tried 10,000 records and 5 IDocs per package.
Does this mean for sure that there is a memory issue?
Also, I see that all the background work processes (BGD WPs) are occupied by other jobs. Could that be the problem?
We do have deltas running every night from R/3, but those are lower volumes (<10,000 records) and we have not seen this issue there.
Within BW there are also heavy loads between data targets, and they don't break.
Is this an issue only for full loads from R/3?
I'd appreciate your immediate response.
Thanks
Simmi
Edited by: simmi on Jun 20, 2008 1:39 AM

Similar Messages

  • Selective Deletion and Data Load from Setup Tables for LIS Extractor

    Hi All,
    We came across a situation where one of the deltas in the PSA was never loaded into the DSO. This DSO updates another cube in the flow. Since many days passed before we noticed the miss, we need to selectively delete the data for those documents from the DSO and the cube, and then take a full load for those documents after filling the setup table.
    What would be the right approach to load this data from the setup table > DSO > cube? A change log exists for those documents, and a few KPIs in the DSO are in summation mode.
    Regards
    Jitendra

    Thanks, Ajeet!
    This is the sales order extractor. The data loaded into the ODS just fine, but the data reaches the ODS from a different extractor. Everything is fine in the ODS, but not in the cube. Would a full repair request versus a full load make any difference when the data goes to the cube? I thought it would only matter if I were loading to the ODS.
    What do you mean by "even if you do a full load without any selections you should do a full repair"?
    thanks.
    W

  • Is it possible to automate the load of setup tables for LIS datasources?

    Hello,
    For LIS DataSources, e.g. 04 (shop floor control), once the production confirmation is done we have to manually run transaction OLI4BW to fill the setup tables before extracting data to BW. Is there a way to automate the filling of the setup tables?
    Regards,
    Suraj

    Hi,
    Yes, we can!
    Create variants for small ranges and schedule all of them at the same time, so that the filling completes sooner.
    Thanks,
    Saru

  • Architecture/design for high volume web service calls from on Demand

    Hi,
    We have hundreds of end users (fewer than 600) who will be initiating web service calls from On Demand that deal with both querying (stateful transactions) and inserting/updating data (stateless). Currently we see a limitation on the number of stateless sessions (33) that can initiate web service calls, and we are afraid this will lead to poor performance given the number of sessions allocated for stateless use.
    Can someone provide any design or framework information that can be implemented for high-volume web service calls from On Demand? We anticipate 35K+ web service calls initiated from On Demand per hour.
    Thanks.


  • 2LIS_03_UM - FULL LOAD VIA SETUP TABLE NOT EQUAL TO DELTA

    Hello,
    I am trying to load stock data with 2LIS_03_UM.
    When I extract data via the setup table with a full load, I don't get the same records as I get with the delta load. Although the key figure values are the same, the document numbers are different: most documents from the delta load are ML (material ledger) documents, while the documents from the full load (via the setup table) are mainly FI documents.
    What can be the reason? How can I fix it?
    Thank you,
    Hadar

    Hi,
    The delta load extracts data from the delta queue, based on the update mode you selected in LBWE.
    A full load or init load extracts data from the setup tables. Since you are extracting from the setup tables: did you delete the existing contents of the setup tables and refill them before this extraction?
    Also, are you getting the same values when you compare the R/3 and BW data for a few sample document numbers?

  • My volume button for higher volume(F12) is not working whereas mute and lower volume(F10 and F11) buttons are working, what should I do?

    My volume button for higher volume is not working

    I think there is a Windows Boot Camp application that will recognize those keys. The Boot Camp material is here: http://www.apple.com/support/bootcamp/
    I think it is called Boot Camp Assistant. Look here: http://www.mac-guides-and-solutions.com/boot-camp-assistant.html

  • Is it too much to ask from Apple: an option for different volume levels for different apps, including alarm, text, email, basically every possible app?

    Is it too much to ask from Apple: an option for different volume levels for different apps, including alarm, text, email, basically every possible app?

    My daughter has had her Razr for about 9 months now. About two weeks ago she picked up her phone in the morning on her way to school and noticed two cracks, both starting at the camera lens. One goes all the way to the bottom and the other goes sharply to the side. She has never dropped it, and my husband and I went over it with a fine-tooth comb. We looked under a magnifying glass and could not find any reason for the glass to crack: not one ding, scratch or bang. Our daughter takes really good care of her things, but we still wanted to make sure before we sent it in for repairs.
    Well, we did send it, and we got a reply from Motorola with a picture of the cracks saying this was customer abuse and is not covered under warranty, even though they found no physical damage to back that up. I e-mailed them back and told them I had done a little research and found pages of people having the same problem. I did not hear from them again until I received a notice from FedEx that they were sending the phone back. NOT FIXED!!! I went to look up why, and guess what: there is no longer any open case for the phone. It has been wiped clean. I put in the RMA number and it comes back not found; I put in the ID number and the SN number and they all come back not found. Yet a day earlier all the info was there.
    I know there are a lot more people like me and all of you, but they just don't want to be bothered, so they pay to have it fixed, just to have it happen again. Unless they have found the problem and are only fixing it on a customer-pays-only setup. I am furious and will not be recommending this phone to anyone. And to think I was considering this phone for my next upgrade! NOT!!!!

  • I have just purchased my iPhone 4s, and the sound volume is very low; I can hardly hear the other person. I have tried the button for higher volume, but there is no change

    I have just purchased my iPhone 4s, and the volume is very low; I can hardly hear the other person talk. I have tried the button for higher volume, but it doesn't work. Can anybody tell me what is wrong?

    Did you take off the screen protector?

  • Load the setup table.

    hi experts,
    I have a doubt about loading the setup tables. What are the transaction codes for filling the setup tables (LO), e.g. for sales, quality, and purchasing? OLI7BW is for sales, but it is not working in my system. Otherwise, could anyone provide the navigation steps from SBIW? That would be great.
    I am using BI 7.0.
    Thanks in advance
    regards
    amith

    Hi,
    First, remember to do this in your source system (R/3 or ECC)...
    The customizing path is as follows:
    SBIW
    Settings for Application-Specific DataSources
    Logistics
    Managing Extract Structures
    Initialization
    Filling in the Setup Table
    Application-Specific Setup of Statistical Data
    hope this helps...
    Olivier.

  • Oracle database integration with SAP PI for high volume & Complex Structure

    Hi
    We have a requirement to integrate an Oracle database with SAP PI 7.0 to send data that is eventually transferred to multiple receivers. The data structure involved is hugely complex (around 18 child tables) with a high-volume processing requirement (100K+ objects need to be processed in 6-7 hours). We need to implement logic for prioritizing the objects, i.e. high-priority objects must be processed first, followed by objects with normal priority.
    We thought of implementing this kind of logic in database stored procedures (which would at least give us the flexibility to implement the data selection logic and to mark the processed data as successful in the same SP), but since the PI sender adapter doesn't currently support calling Oracle stored procedures, this option is ruled out. We could try implementing the complex data selection with an Oracle table function, but a table function doesn't allow any SQL that changes data (UPDATE, INSERT, DELETE, etc.), so it is impossible to mark the selected objects from the "Update Query" option of the PI sender communication channel.
    Also, we need to make sure that we are not processing all the objects at once, as the message size for 20 objects can vary from 100 KB to 15 MB, which could lead to serious performance issues for the bigger messages.
    Please share any implementation experience for handling issues:
    1 - Database Integration involving Oracle at sender side
    2 - Complex Data structures
    3 - High Volume Processing
    4 - Controlled data selection from the database to control the message size in PI
    Thanks,
    Panchdev

    Hi,
    We can call the stored procedure through a receiver adapter using ccBPM; there are different approaches for reading the data in this case.
    a) A ccBPM instance is triggered by a dummy message. After receiving it, the ccBPM makes a synchronous call to the Oracle stored procedure (using the corresponding receiver data type structure), and on getting the response message it proceeds with the further steps. The stored procedure needs to be optimized for performance, as the mapping complexity is largely determined by the structure in which the stored procedure returns the message. Prioritization of the objects can be handled inside the stored procedure.
    b) A ccBPM instance first reads data from the header-level table, then makes subsequent synchronous calls to read data from the child tables. This approach is less suitable for this interface, as the number of child tables is large.
    Pravesh.
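    For illustration only, here is a minimal PL/SQL sketch of the kind of selection-and-marking procedure discussed above; all names (object_queue, object_id, priority, status, batch_id, batch_seq) are hypothetical and would need to be adapted to the real schema. As noted earlier in the thread, the PI sender adapter cannot call such a procedure directly, so it would have to be invoked via a receiver channel, e.g. through the ccBPM synchronous call Pravesh describes.
    CREATE OR REPLACE PROCEDURE get_next_batch (
        p_batch_size IN  NUMBER,
        p_result     OUT SYS_REFCURSOR
    ) AS
        l_batch_id NUMBER;
    BEGIN
        -- tag this batch with a new id from a (hypothetical) sequence
        SELECT batch_seq.NEXTVAL INTO l_batch_id FROM dual;
        -- pick the next objects, highest priority first, and mark them in one step
        UPDATE object_queue
           SET status = 'IN_PROCESS', batch_id = l_batch_id
         WHERE object_id IN (SELECT object_id
                               FROM (SELECT object_id
                                       FROM object_queue
                                      WHERE status = 'NEW'
                                      ORDER BY priority DESC)
                              WHERE ROWNUM <= p_batch_size);
        COMMIT;
        -- hand the marked batch back to the caller as a ref cursor
        OPEN p_result FOR
            SELECT * FROM object_queue WHERE batch_id = l_batch_id;
    END get_next_batch;
    /
    Keeping p_batch_size small also addresses point 4, since each call returns only a bounded number of objects and therefore a bounded message size.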

  • Data loading from one table to another

    Hi,
    I want to load some data from a temp table into a master table. The master table has 40 million records and the temp table has 23 million. The master table has around 50 columns, to which we are adding 4 new columns; the temp table has 5 columns. The data for these 4 new columns is available in the temp table, and the employee column is common to the two tables.
    I used a stored procedure with a cursor to load the data, but it is taking more than 6 hours.
    Can anyone suggest a better technique to load the data faster?
    Thanks,
    Santhosh.

    Hi, consider this scenario, which matches yours.
    First of all, you have to update, not insert, in the master table.
    Master table = emp with columns (emp_id, emp_name, emp_designation).
    To this original master table you added two more columns, emp_salary and emp_department,
    so now your master table looks like (emp_id, emp_name, emp_designation, emp_salary, emp_department),
    but when you do select * from the master table, the last two columns, salary and department, are blank.
    Now you have another temp table with the following columns: (emp_id, emp_salary, emp_department).
    emp_id is common to the master and temp tables, and you want to put the values from the temp table into the master table. I think this is what you're trying to do.
    So for the above case, the query I would write is:
    update master_table m
       set (m.emp_salary, m.emp_department) = (select t.emp_salary, t.emp_department
                                                 from temp_table t where t.emp_id = m.emp_id);
    commit;
    Regds.
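    One caveat with the correlated UPDATE above: it touches every row of the master table and will set the new columns to NULL for master rows that have no match in the temp table (here the master has ~40 million rows but the temp table only ~23 million). A set-based MERGE avoids that and also avoids the row-by-row context switches of a cursor loop; a minimal sketch using the same hypothetical names as the example above:
    merge into master_table m
    using temp_table t
    on (m.emp_id = t.emp_id)
    when matched then
      update set m.emp_salary     = t.emp_salary,
                 m.emp_department = t.emp_department;
    commit;
    A single set-based statement like this is usually far faster than a cursor that updates one employee at a time.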

  • Problem during  Data Warehouse Loading (from staging table to Cube)

    Hi All,
    I have created a staging module in OWB to load my flat files into my staging tables, and a warehouse module to load the staging tables into the dimensions and cube I have created.
    My scenario:
    I have a table temp_table_transaction into which the flat files are loaded. This table was loaded with 168,271,269 records from the flat file.
    I have created a mapping in OWB that takes temp_table_transaction, joins it with some other tables, applies some expression and conversion functions, and fills a new table called stg_tbl_transaction in my staging module. Running this mapping takes 3 hours and 45 minutes with this mapping configuration:
    Default operating mode in the mapping configuration's runtime parameters = Set based
    My dimensions filled correctly, but I have two problems when I want to transfer my staging table to my cube:
    Problem #1:
    I have created a cube called transaction_cube with OWB, and it generated and deployed correctly.
    I created a map to fill the cube from the 168,271,268 records in the staging table stg_tbl_transaction and deployed it to the server (the cube map's operating mode is set based).
    But the map had not completed after 9 hours, and I was forced to cancel it by killing its sessions. I want to know whether this much time for loading this volume of data is acceptable, or whether we should simply expect to spend more time for this data volume. Please let me know if anybody has seen a similar issue.
    Problem #2:
    To test, I created a map configured as set based in the operating modes, selected stg_tbl_transaction (168,271,268 records) as the source, and created another table to transfer and load the data into. I wanted to measure how long this simple map should take, but after 5 hours the data had not loaded into the new table. I want to know where my problem is. Should I have set something in the map configuration, or something else? Please guide me on these problems.
    My server configuration:
    I run OWB on a two-socket Xeon 5500-series machine with 192 GB of RAM and disks in a RAID 10 array.
    Regards,
    Sahar
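    For a rough point of comparison only: moving ~168 million rows with a plain set-based, direct-path INSERT ... SELECT (broadly what an OWB set-based mapping generates, minus the joins and expressions) might look like the sketch below; the target table and column names here are hypothetical. If even a statement like this takes many hours on the same hardware, the bottleneck is more likely I/O, indexes, or constraints on the target than OWB itself.
    alter session enable parallel dml;
    insert /*+ append parallel(tgt, 8) */ into transaction_fact tgt
           (trans_id, customer_key, amount, trans_date)
    select /*+ parallel(src, 8) */
           trans_id, customer_key, amount, trans_date
      from stg_tbl_transaction src;
    commit;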

    For all of you: it is possible to load from an InfoSet to a cube; we did it, and it was OK.
    Data really is loaded from the InfoSet (cube + master data) into the cube.
    When you create a transformation under a cube, the InfoSet is proposed as a source, and it works fine.
    Now the process is no longer operational and I don't understand why.
    Loading from an InfoSet to a cube is possible; I can send you screenshots if you want.
    Christophe

  • CCM 2.0 - Files Storage for high volume of files

    Hi all,
    I have seen in one message the following information:
    "1. File storage.
    My point is that I think you've created a virtual folder on the SRM server in SICF, which means that all your files are stored internally in the database, and not physically on the server.
    The simplest way is to create a physical folder in the server OS and then create an alias in SICF pointing to this folder. You will then be able to mass-load the pictures onto the server using FTP or a network share."
    I want to upload images into the catalog (CCM 2.0), and in my case I have a high volume of files. Where is the best place to store this data: in the database or on the server?
    And if it is on the server, how can I create an alias?
    Many thanks!!
    Regards

    Hi ,
    What we did for image upload in CCM 2.0 was this:
    1. In SE80, go to the MIME Repository and drill down to services -> bc/sap/bsp, then create a personal folder.
    2. Import your .jpeg image into this folder.
    3. Derive a URL with the structure: server name/domain name/services/file name.
    4. Test this URL in the IE browser; it should open the picture for you.
    5. Then paste this URL into the 'image' characteristic of an item in the master catalog in CAT.
    With this, in EBP we could see the photos of the items.
    BR
    Dinesh
    reward if helps

  • Mainframe data loaded into Oracle tables - Test for low values using PL/SQL

    Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle. Some columns in the mainframe data contained 'low values'; these columns were defined in the Oracle tables as VARCHAR2. Looking at the data, some of these columns appear to contain little square boxes. I'm not sure, but maybe that is how Oracle renders the 'low values' from the original data in a VARCHAR2 column. When I run a select to find all rows where this column is not null, it returns these rows; in the results the columns appear to be blank, but looking at the data in SQL Developer I can see the odd 'square boxes'. My guess is that the select statement is detecting that something exists in the column. Long story short, I somehow have to test this legacy data in the Oracle tables for 'low values' using PL/SQL. Does anyone have any suggestions on how I could do this? Help! The mainframe data we are loading into these tables arrives with columns containing low values.
    I am using Oracle 11i.
    Thanks
    Edited by: ncsthbell on Nov 2, 2009 8:38 AM

    ncsthbell wrote:
    "Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle."
    Not a wise thing to do. Mainframe operating systems typically use EBCDIC, while Unix and Windows servers use ASCII. The endianness also differs (big endian vs. little endian).
    "Does anyone have any suggestions on how I could do this?"
    As suggested, use the SQL function DUMP() to see the actual contents (in hex) of these columns.
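    As a concrete starting point, here is a minimal sketch of how DUMP() and CHR(0) can be used for this check; the table and column names (legacy_table, legacy_col) are hypothetical. Mainframe 'low values' are binary zero bytes, which show up as byte 0 in the DUMP() output and as the square boxes seen in SQL Developer.
    -- show the raw byte values of the suspicious column
    SELECT DUMP(legacy_col) AS raw_bytes
      FROM legacy_table
     WHERE INSTR(legacy_col, CHR(0)) > 0;   -- rows that contain low values
    -- count rows whose column holds nothing but low values and/or blanks
    SELECT COUNT(*)
      FROM legacy_table
     WHERE TRIM(TRANSLATE(legacy_col, CHR(0), ' ')) IS NULL;
    The same INSTR(col, CHR(0)) > 0 test can be reused inside a PL/SQL block, or in an UPDATE that cleans the columns, e.g. SET legacy_col = REPLACE(legacy_col, CHR(0)).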

  • Selective load into setup table

    Hello,
    Is there any way to load multiple selections into the setup table in OLI7BW? Currently I am only able to load a range of values, but I have a list of almost 2,000 documents that are not in sequence. Loading them one by one requires too much effort, and loading ranges would load millions of records.
    Please advice.
    Regards,
    Tints

    Hi Tinu,
    Pick all 2,000 orders and check the CREATED ON field for them in the VBAK table.
    Download all 2,000 orders into Excel, filter on the created-on value, and note down all the created-on dates.
    Now you can load the orders based on the created-on date. Make sure you load the data into a DSO, as that avoids duplication.
    Regards
    SujanR
