Why set up tables in LO Extraction?
Hi All,
In LO extraction we fill setup tables for the init load; deltas then fall into the update tables, and the V3 collective run pushes them to the delta queue, from where we extract into BW. Why is this the methodology for LO extraction alone, and not for other extractions like CO-PA, HR, or FI-SL? What is the reason for these unique steps in LO extraction?
Kindly let me know the answer.
Best Regards,
Fanie Hudson.
This question has already been posted several times and a lot of documents are available.
Have a look at these discussions:
set up tables
Set up tables
Set up tables
view of set up tables data in se11??????????
Set Up tables..
lo: delete set up tables: DOUBT
Blogs of Roberto will be useful as well:
/people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
LOGISTIC COCKPIT DELTA MECHANISM - Episode two: V3 Update, when some problems can occur...
LOGISTIC COCKPIT DELTA MECHANISM - Episode three: the new update methods
LOGISTIC COCKPIT - WHEN YOU NEED MORE - First option: enhance it !
LOGISTIC COCKPIT: a new deal overshadowed by the old-fashioned LIS ?
award points if useful
Similar Messages
-
Why are setup tables not used in applications like FI, HR etc.
Hi,
We all know that we use set up tables for LO Cockpit applications.
But we don't have them for non-LO applications like HR, FI... why is such a difference made by SAP?
A fellow member suggested that for FI, HR etc. there is a huge volume of postings, so we don't want to store such duplicate data.
Is this the case?
Or is there some other reason?
Please suggest.
Thanking You,
Tarun Brijwani.
Hi,
this question has already been asked a few times. Please search the forums for the answer.
regards
Siggi -
Why do we have setup tables in LO extraction and why not in CO-PA or Generic?
Hi Friends
Why do we have setup tables in LO extraction and not in CO-PA or generic extraction?
Please give me a reply.
****************Points are assured********************
Thanks & Regards
Revathi
Hi Revathi,
Please check the following Threads :
why set up tables?
Why LO only having Setup tables not Others.......?
Regards
Hemant Khemani -
Problem about fill set up table
Hi all,
I got a problem when I run the fill set up table for sales billing data source. what I did is:
first delete the setup table, then OLI9BW -> type in a sales document no. (as I only want this order's data), then give a run name, and execute. But I got the following message:
Data source 2LIS_13_VDITM contains data still to be transferred
Could you pls explain what this means, and what should I do to solve the problem
Thanks
Set up table definition
The setup table stores the historical data, whereas the delta records are updated in the delta queue, not in the setup table.
So once the historical data is loaded, you can delete the contents of the setup table.
Its name is the extract structure of your DataSource + SETUP.
Also, data is read from the setup table when an init / full / full repair load is done.
So in order to pick up your records, delete the setup table (LBWG) and fill it with your records.
You can view the contents in SE16.
Filling setup tables:
Transaction LBWG (to delete the setup tables first), then
SBIW -> Logistics -> Managing Transfer Information Structures -> Setup of Statistical Data -> Application-Specific Setup of Statistical Data
LBWQ is the extraction queue and RSA7 is the delta queue. Data is sent to the delta queue from the extraction queue through the collective job scheduled in transaction LBWE.
When we extract data using the LO Cockpit, the data comes to the extraction queue first and from there it is processed into the delta queue. So LBWQ works as an outbound queue.
If the update mode is Unserialized V3, then as soon as a document is posted it goes to the update tables, which you can see in transaction SM13. After the job is scheduled, the records come to RSA7, i.e. the delta queue, from which BW pulls the data.
If you use the "Queued Delta" update method, the data moves to the extraction queue (LBWQ). Then run the collective update to move the data from LBWQ into the delta queue (RSA7). Then schedule the data using the InfoPackage by selecting Delta Load in the Update tab.
So if you are going to fill the setup tables, delete the data from LBWQ first.
The extraction queue (LBWQ) comes into the picture in delta loading, but the system actually starts collecting data (whenever a document is created in R/3) as soon as the extract structure is activated.
Steps to Be Performed
We know that there are 4 delta update methods available in LO. If you use the "Queued Delta" update method, the data moves to the extraction queue (LBWQ). Then run the collective update to move the data from LBWQ into the delta queue (RSA7). Then schedule the data using the InfoPackage by selecting Delta Load in the Update tab.
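The Queued Delta flow described above (document posting -> LBWQ -> collective run -> RSA7 -> BW delta load) can be sketched as a toy two-queue model. This is purely illustrative Python with hypothetical names, not SAP code; the real mechanism lives in the LO Cockpit transactions:

```python
from collections import deque

class QueuedDelta:
    """Toy model of the Queued Delta update method (hypothetical names)."""

    def __init__(self):
        self.extraction_queue = deque()  # plays the role of LBWQ
        self.delta_queue = deque()       # plays the role of RSA7

    def post_document(self, doc):
        # Every document posting lands in the extraction queue first
        self.extraction_queue.append(doc)

    def collective_run(self):
        # The scheduled collective job moves all records from LBWQ to RSA7
        while self.extraction_queue:
            self.delta_queue.append(self.extraction_queue.popleft())

    def bw_delta_load(self):
        # The BW InfoPackage (delta update mode) empties the delta queue
        loaded = list(self.delta_queue)
        self.delta_queue.clear()
        return loaded

q = QueuedDelta()
q.post_document("SO-1001")
q.post_document("SO-1002")
# Nothing reaches BW until the collective run has moved the records
assert q.bw_delta_load() == []
q.collective_run()
assert q.bw_delta_load() == ["SO-1001", "SO-1002"]
```

The point of the two queues: postings are captured cheaply at document-save time, and the heavier move into the BW-facing delta queue is deferred to the scheduled collective job.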
Here is Roberto's weblog:
/people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
/people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
/people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
check this out...
LBWQ - T code is for what?
Transaction LBWQ
What is LBWQ?
Assign points if it helps
Hope it helps
regards
Bala -
What is the concept of setup tables in LO extraction?
Hi,
what is the concept of set up tables in LO EXTRACTION?
GIVE ME COMPLETE DETAILS and how to run deltas? I want step by step procedure.
Thanks in advance.
ravi.
Ravi,
I think the blog series by Roberto Negro about LO is the best available information.
SDN thread:
/community [original link is broken]
It would be nice if there were any help.sap.com documentation available.
Raj. -
Set up table data extracted from R/3 not visible in data target of BW
Hi friends,
I am currently working on extracting data from R/3 to BW. I read several docs given in the forum and did the following:
1) In the LBWE transaction, my extract structure is already active.
2) In SBIW, I went to the filling of setup tables for QM
3) I executed the setup table extraction
4) Then I checked in RSA3. The extraction was successful.
5) In BW, I replicated the DataSource, and in the InfoPackage I selected, in the
Processing tab, "PSA and then into Data Targets (Package by Package)"
6) In the Update tab, I selected FULL UPDATE
7) And then I did an immediate load.
8) In RSMO, it showed successful. (It showed the same number of records as in the
RSA3 of R/3)
But when I went into the data target (ODS) and checked its contents, nothing is visible. Why is it so? Have I skipped any step? Please help.
Regards,
Neha Solanki
Hi,
You are right. It is an NW2004 system. This is what is displayed in the Status tab in RSMO:
Data successfully updated
Diagnosis
The request has been updated successfully.
InfoSource : 2LIS_05_QE2
Data type : Transaction Data
Source system: QAS678
And I can find no such button as you said.
Regards,
neha Solanki -
Why do we have setup tables in LO extraction and why not in CO-PA, Generic, FI-SL?
HI Friends
Why do we have setup tables in LO extraction and not in CO-PA, Generic, or FI-SL? Please clarify.
Thanks & Regards
Revathi
<removed by moderator>
Edited by: Siegfried Szameitat on Nov 13, 2008 12:07 PM
Hi Revathi,
The R/3 database structure for accounting is much simpler than the logistics structure.
Once you post to a ledger, that is done. You can correct it, but that just gives another posting.
BI can get information directly out of this (relatively) simple database structure.
In LO, you can have an order with multiple deliveries to more than one delivery address, and the payer can also be different.
When one item (order line) changes, this can be reflected in the order, supply, delivery, invoice, etc.
Therefore a special record structure is built for logistics reports, and this structure is now used for BI.
In order to have this special structure filled with your starting position, you must run a setup. From that moment on, R/3 will keep filling this LO database.
If you didn't run the setup, BI would start with data only from the moment you started filling LO (with the Logistics Cockpit).
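This point can be sketched with a toy model (illustrative Python with hypothetical names, not SAP code): deltas are only collected from activation onwards, so without a setup run the older documents never reach BI.

```python
# Documents posted before the extract structure was activated
history = ["ORD-1", "ORD-2", "ORD-3"]

# The setup run snapshots that history into the setup table
setup_table = list(history)

# After activation, new postings flow through the delta mechanism only
delta_queue = ["ORD-4"]

# Init from the setup table, then deltas: BI sees everything
with_setup = setup_table + delta_queue

# No setup run: BI starts empty and only ever sees post-activation deltas
without_setup = [] + delta_queue

assert with_setup == ["ORD-1", "ORD-2", "ORD-3", "ORD-4"]
assert "ORD-1" not in without_setup
```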
I hope I have been clear.
Udo -
Extracting similar data from a dynamic set of tables
I want to select common fields from an arbitrary set of tables, adding an extra field that is the table name, and return all of these rows in a single table. I know the steps but I am getting bogged down in the details (being new to Oracle):
--Create a temporary table.
--Generate a dynamic SQL statement to grab the data for each table of interest.
--Execute that query and store the results in the temporary table.
--Output the entire temporary table and delete it.
Here is my latest attempt:
================
Create global temporary table myTempTable (
Title varchar2(30),
IdType number(8),
Count number(8)
) on commit delete rows;
declare
sql_stmt VARCHAR2(200);
tname VARCHAR2(30);
begin
for tname in (
select distinct table_name from sys.ALL_TAB_COLS
where owner='me' and table_name like '%ASSIGNS')
loop
sql_stmt :=
'insert into myTempTable(Title, IdType, Count) ' ||
'select '':1'', IdType, count(*) from me.:1 group by IdType';
EXECUTE IMMEDIATE sql_stmt USING tname;
end loop;
select * from myTempTable;
commit; -- to delete the temp table
end;
================
The error messages are:
--expressions have to be of SQL types
--an INTO clause is expected in this SELECT statement
What am I missing here?
(1) First of all, I am executing the whole script in SQL*Plus, that's why I use SQL*Plus commands like the slash operator.
(2)
SQL> VAR cur refcursor
SQL> CREATE TABLE mytemptable AS SELECT ename table_name, deptno idtype, sal cnt FROM emp WHERE 1=0
Table created.
SQL> DECLARE
sql_stmt VARCHAR2 (200);
tname VARCHAR2 (30);
BEGIN
FOR tname IN (SELECT DISTINCT table_name
FROM SYS.all_tab_cols
WHERE owner = USER AND table_name LIKE '%EMP')
LOOP
sql_stmt :=
'insert into myTempTable select '''
|| tname.table_name
|| ''' table_name, deptno IdType, count(*) cnt from '
|| tname.table_name
|| ' group by deptno';
EXECUTE IMMEDIATE sql_stmt;
END LOOP;
OPEN :cur FOR
SELECT *
FROM mytemptable;
EXECUTE IMMEDIATE 'drop table myTempTable';
END;
PL/SQL procedure successfully completed.
SQL> PRINT :cur
TABLE_NAME IDTYPE CNT
EMP 10 3
EMP 20 5
EMP 30 6
3 rows selected.
Since we DROP the table at the end we don't need an explicit COMMIT, since all DDL statements implicitly commit anyway. -
How to extract data into the set-up table for 2LIS_06_INV LIS structure
We are using ECC 6.0 and SAP BI NW 2004s. I activated the 2LIS_06_INV (Invoice Verification) structure. Interestingly, I don't see any events under this structure (MC06M_0ITM); my understanding is that the events usually determine what type of data is generated for a given structure.
I see Invoice Verification when I use the Inventory Management - Perform Setup option when filling the setup tables. However, when I use this option, I get a message saying "No extraction structure active or no BW connected".
Can someone list the prerequisites and the steps to load the setup table for the 2LIS_06_INV structure?
Thanks,
Sanjay
1: RSA5 Activate Data Source
2: LBWE Activate datasource again
3: SBIW Fill setup table 'Settings for Application-Specific DataSources (PI)'-> 'Initialization'->'Filling in the Setup Table'->'Application-Specific Setup of Statistical Data'->'Invoice Verification - Execute Reconstruction' -
Why give a new run name every time when filling up the setup tables?
Hi,
While filling up the setup tables we have to give a run name.
Every time we fill the setup tables we have to give a new run name. Is there any special reason
why we have to give a NEW run name each time we fill the setup tables?
Thanks and Regards,
Asim
Edited by: asim khan on Apr 19, 2011 4:09 PM
Edited by: asim khan on Apr 19, 2011 4:11 PM
Hi,
That's true, but my question is why we have to give a NEW run name every time while filling up the setup tables.
Why won't it take the old run name? What's the logic behind this?
Thanks for the Reply.
Asim -
What is the use of deleting set up tables in LO Cockpit, and exactly what happens?
What is the use of deleting the setup tables in LO Cockpit? Exactly what happens if we do this? Will all the tables lose the data which is available via the function module?
I'm asking this because we assign a function module in LBWG.
plz .......
thanx
vidhu
Hi Vidhu,
Initially we don't delete the setup tables, but when we change the extract structure we go for it. We are changing the extract structure, which means there are some newly added fields that were not there before. So to get the required data (i.e. only the data which is required, avoiding redundancy) we delete and then refill the setup tables.
This also refreshes the statistical data. The extraction setup reads the dataset that you want to process (such as customer orders, with tables like VBAK and VBAP) and fills the relevant communication structure with the data. The data is stored in cluster tables, from where it is read when the initialization is run. It is important that during the initialization phase no one creates or modifies application data, at least until the setup tables have been filled. -
Error while filling up set up tables in SAP R/3
Experts,
I am trying to fill the setup tables for application component 11 (SD) for my LO extraction. It gives me the following error:
<b>Error determining rate: foreign curr. RMB local curr. AUD date 06/19/2007 (doc. 100025787)</b>
When I check in RSA3, I have data before 06/19/2007 but nothing is filled up after that.
Does anyone know how to fix this?
Thanks
Ashwin
Hi Ashwin,
Did you get a solution for this issue (filling the setup tables for application component 11)?
"Error determining rate: foreign curr. RMB local curr. AUD date 06/19/2007 (doc. 100025787)"
thanks in advance.
Bhaskar. -
How to copy a set of tables from a database to another periodically?
We have a 4-node RAC primary database (10.2.0.2) with a physical standby (10.2.0.2) on our production site. Of late we noticed that one of the applications (APP2) is causing heavy load on the primary database servers due to large data downloads. Our primary database has 2 schemas:
1) one being the main schema with all objects, (USER1)
2) and the other has views that query some set of tables from the main schema. (USER2)
The application APP2 uses USER2 views to query and download huge data periodically. We need to be able to give accurate data results to APP2, but in the same time take off the load from the database, as APP2 is not our main application.
We would like to know if there are any cost-effective options in Oracle to do this, and if so, what is the best option? Does anyone have any experience setting up something like this before?
We have thought of creating another 10.2.0.2 database on a different server and giving it regular updates (like data feeds) from the current database. The current database data changes quite often, so the data feeds would have to be done often to keep the data current on the new database. So we are not exactly sure how to go about it. Would a COPY command help?
Please advise.
user623066 wrote:
Our 4 node RAC is already busy with our main application, which has its connections spread across all 4 nodes.
Our main applications services are the same on all nodes and use all 4 nodes in the same way.
There are some other utilities that we run from one of the app servers that connect to only 1 of the nodes.
APP2 uses all 4 servers, which is again controlled by connection pooling and distributes the load.
Wouldn't separate services be more beneficial here? If APP2 is locked down to one node during normal operation, that ensures that other connections aren't competing for hardware with APP2 on 3 of the 4 nodes. If APP2 is generating less than 25% of the total load, you can let the other applications use whatever hardware resources are left idle on the node APP2 is locked down to.
By large data downloads, I meant both the increase in network traffic and the CPU load on the database nodes.
We are already using Resource Manager to limit the resources allocated to USER2, which APP2 uses.
And we have also limited the large downloads to the early hours of the day, when traffic from our main application is low.
But this has still not been optimal for APP2's usage requirements. APP2 is also running queries all through the day, but has a limit on the number of rows downloaded during peak hours.
Can you explain a bit more about why using Resource Manager hasn't been sufficient? That's normally a pretty good way to prevent one hungry user from drastically affecting everyone else. Perhaps you just need to tweak the configuration here.
Logical standby seems a good option, but we need to keep our physical standby in place. Is it possible to have a logical standby and a physical standby? (Of course on separate servers.)
Sure. You can have as many standby servers of whatever type you'd like.
Could we use a COPY command to copy data for the set of tables to a new database? Or is that also a complex option?
You could, yes. COPY is a SQL*Plus command that has been deprecated for copying data between Oracle databases for quite a while. It only works from SQL*Plus and is only designed for one-time operations (i.e. there is no incremental COPY command). I can just about guarantee that's not what you want here.
How do materialized views work? Wouldn't they still reside on the main database? Or is it possible to have remote materialized views?
You probably don't want materialized views, but if you decide to go down that path:
- You'd create materialized view logs on the base tables to track changes
- You'd create materialized views on the destination database that select data over a database link back to the source database
- You'd put those materialized views into one or more refresh groups that are scheduled to refresh periodically
- During a refresh, assuming incremental (fast) refreshes, the materialized view logs are read and applied on the destination system to bring the materialized views up to date.
Justin -
hello,
I am customizing the LO Cockpit for purchasing in the following DataSources:
2lis_02_hdr
2lis_02_itm
2lis_02_scl
The problem arises when I fill the setup table using OLI3BW. I get the following message:
"DataSource 2LIS_02_HDR contains data still to be transferred"
This activity was performed 3 days back, and at that point I had given the next day as the termination date.
Can anyone help me out on this issue?
Any help would be greatly appreciated.
"DataSource 2LIS_02_HDR contains data still to be transferred"
This means that there is data in the update queue or delta queue for this extractor. If you're in production, follow the exact guidelines that you can find on this forum. You have to do this at the weekend: stop the V3 update job (run it one last time manually), extract your delta twice, and transport the modifications.
If you don't care too much about your data because you have to re-init anyway, just delete the delta queue. -
How to check whether set up tables are filled or not
Hi Gurus,
How to check whether the setup tables are filled or not?
Cheers,
Reddy.
Hi Reddy,
Another point I want to make clear: LBWQ is not the delta queue. The delta queue is RSA7, and it comes into play in delta loading. LBWQ is the extraction queue. During delta loading, if we use the update mode Queued Delta, then changes will not get updated directly in the delta queue. For the delta setup:
1) First you have to run an init with data transfer (after filling the setup table).
2) After that, any changes will be recorded in the extraction queue (LBWQ).
3) After a certain number of records accumulate in LBWQ, we have to run V3 jobs to bring the data into the delta queue (RSA7).
Check this.....
Re: question on LBWQ and RSA7
Regards,
Debjani.....