Mapping of tables
Dear Experts,
I am doing a mapping between the LIPS, VBAP, VBAK and VBKD tables.
In debugging, two records come back from LIPS, but I am getting no records from VBAP, VBAK and VBKD. My code:
SELECT vgbel
vbeln
posnr FROM lips INTO TABLE it_lips
FOR ALL ENTRIES IN it_vbrp
WHERE vbeln = it_vbrp-vgbel.
IF sy-subrc EQ 0.
READ TABLE it_lips INTO wa_lips INDEX 1 .
DELETE ADJACENT DUPLICATES FROM it_lips COMPARING vgbel vbeln.
LOOP AT it_lips INTO wa_lips.
ENDLOOP.
IF sy-subrc EQ 0.
SELECT single vbeln posnr FROM vbap INTO wa_vbap
FOR ALL ENTRIES IN it_lips
WHERE vbeln = it_lips-vbeln
AND posnr = it_lips-posnr.
WHERE posnr = wa_lips-vgpos
and vbeln = wa_lips-vgbel.
IF sy-subrc EQ 0.
SORT it_vbap BY posnr.
DELETE ADJACENT DUPLICATES FROM it_vbrp COMPARING posnr.
IF sy-subrc EQ 0.
LOOP AT it_vbap INTO wa_vbap.
ENDLOOP.
ENDIF.
endif.
SELECT SINGLE vbeln
netwr
vkorg
knumv FROM vbak
INTO wa_vbak
WHERE vbeln = wa_vbap-vbeln.
IF sy-subrc EQ 0.
MOVE: wa_vbak-netwr TO wa_header-total_inv.
LOOP AT it_vbak INTO wa_vbak.
ENDLOOP.
ENDIF.
SELECT SINGLE vbeln
bstkd
bstdk
posnr FROM vbkd INTO wa_vbkd WHERE posnr = wa_vbap-posnr
AND vbeln = wa_vbap-vbeln.
Please help me.
Regards
chitty.
Change your VBAP select as below. Note that SELECT SINGLE cannot be combined with FOR ALL ENTRIES, so select into an internal table instead, and join on the order reference fields VGBEL/VGPOS rather than the delivery's own VBELN/POSNR:
SELECT vbeln posnr FROM vbap INTO TABLE it_vbap
FOR ALL ENTRIES IN it_lips
WHERE vbeln = it_lips-vgbel
AND posnr = it_lips-vgpos.
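The reason the original select returned nothing: in LIPS, VBELN/POSNR identify the delivery item itself, while VGBEL/VGPOS carry the reference back to the sales order item stored in VBAP. A minimal relational sketch of the two joins, in plain SQL via Python's sqlite3 (the document numbers are made up for illustration):

```python
import sqlite3

# Sketch of why the original VBAP select found no rows. Table and column
# names mirror the SAP tables discussed above; the document numbers are
# invented sample data.
con = sqlite3.connect(":memory:")
con.executescript("""
    -- LIPS: delivery items. vbeln/posnr identify the DELIVERY item;
    -- vgbel/vgpos point back to the preceding sales order item.
    CREATE TABLE lips (vbeln TEXT, posnr TEXT, vgbel TEXT, vgpos TEXT);
    -- VBAP: sales order items, keyed by the ORDER number.
    CREATE TABLE vbap (vbeln TEXT, posnr TEXT);

    INSERT INTO lips VALUES ('0080000001', '000010', '0000004711', '000010');
    INSERT INTO vbap VALUES ('0000004711', '000010');
""")

# Wrong join (as in the original code): delivery number against order number.
wrong = con.execute("""
    SELECT v.vbeln FROM vbap v
    JOIN lips l ON v.vbeln = l.vbeln AND v.posnr = l.posnr
""").fetchall()

# Corrected join (as in the reply): via the order reference vgbel/vgpos.
right = con.execute("""
    SELECT v.vbeln FROM vbap v
    JOIN lips l ON v.vbeln = l.vgbel AND v.posnr = l.vgpos
""").fetchall()

print(wrong)  # []
print(right)  # [('0000004711',)]
```

The wrong join returns an empty set because delivery and order numbers come from different number ranges; the corrected join finds the order item.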
Similar Messages
-
Value Mapping through Tables in R/3 from XI
Hi All,
I'm new to XI and got a scenario: "Value Mapping through Tables in R/3 from XI". Kindly tell me the steps I should follow to complete this scenario. I'm from ABAP, so table creation is known to me; what should I do next after table creation?
Thanks in Advance to all of you,
Sreedharsree,
Value mapping is done using message mapping. In the message mapping you have to make an RFC call to query the DB and get the response; for that you have to write a Java function in the mapping program.
Check this blog, it covers value mapping:
/people/jayakrishnan.nair/blog/2005/06/20/dynamic-file-name-using-xi-30-sp12-part--i -
Is Mapping Lookup table possible with IDOC to FIle scenario
Hi all,
Need suggestion, I am using SP16
My scenario is IDOC to File, and I have to use mapping lookup tables for some of the fields within the mapping...
'Crossref: PlantLoc_to_WhseComDiv. Value mapping lookup to take two fields from SAP and convert to WMS 3-digit value'
How do I go about this, since I have seen in the SAP Library that it is only for RFC, JDBC and SOAP adapters...
Need your valuable inputs,
Regards,
sridhar

You can use an RFC, SOAP or JDBC lookup in your mapping. Why not? It does not mean that we use the lookups only in RFC scenarios; you can use them in any scenario.
-
How to map 2 tables together into a new one, like an Union?
Hi.
I am new to Oracle Warehouse Builder, and what I want to do should be simple.
My question is: how do I map 2 tables into a new one? That is, what I need is a union.
See the following tables:
What I have is:

Table 1:          Table 2:
+----+----+       +----+----+
| A  | B  |       | A  | B  |
+----+----+       +----+----+
| 10 | 34 |       | 13 | 39 |
+----+----+       +----+----+

What I wish is:

+----+----+----------------+
| A  | B  | Original table |
+----+----+----------------+
| 10 | 34 | 1              |
| 13 | 39 | 2              |
+----+----+----------------+
Then I need to put all the data together into a new table, and this new table should have a new column showing the original source of each record.
Can I do such thing?
I'm using the Oracle Warehouse Builder and trying to do a map in this way.
Any hint will be very helpful. I also will appreciate an indication of some document which can show how to do it.
Thanks .
Rodrigo Pimenta.
Brazil.

Use the SET OPERATOR. That should help, I guess.
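In plain SQL, the set operation the reply points at is a UNION ALL with a constant column identifying the source table. A minimal sketch via Python's sqlite3 (illustrative table names, not OWB metadata):

```python
import sqlite3

# Union of two tables into a new one, with a column recording the source
# table of each row. Table names are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE table1 (a INTEGER, b INTEGER);
    CREATE TABLE table2 (a INTEGER, b INTEGER);
    INSERT INTO table1 VALUES (10, 34);
    INSERT INTO table2 VALUES (13, 39);

    -- UNION ALL keeps duplicates and is cheaper than UNION; the constant
    -- 1 / 2 marks which table each row came from.
    CREATE TABLE combined AS
        SELECT a, b, 1 AS original_table FROM table1
        UNION ALL
        SELECT a, b, 2 AS original_table FROM table2;
""")

rows = con.execute(
    "SELECT a, b, original_table FROM combined ORDER BY a"
).fetchall()
print(rows)  # [(10, 34, 1), (13, 39, 2)]
```

In OWB, the same shape would be a Set Operation operator (UNION ALL) fed by the two source tables, each with a constant/expression operator supplying the 1 or 2.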
-
How can I see in what maps some table is used?
Hi,
Can anyone tell me how I can see in which mappings a table is used, without opening all of them?
Thanks,
Gustavo.

Good morning Gustavo,
OWB has some public views defined which can provide you with this information. Check Appendix D (Warehouse Builder Public Views) of the OWB User Guide for all info.
Log in as the owner of the design time repository; if you run the following query you'll have a basic overview of which table is used in which mapping:

SELECT MAP_NAME MAPPING_NAME
, DATA_ENTITY_NAME TABLE_NAME
, BUSINESS_NAME TABLE_OPERATOR_ALIAS
, DESCRIPTION TABLE_OPERATOR_DESCRIPTION
FROM ALL_IV_XFORM_MAP_COMPONENTS
WHERE DATA_ENTITY_TYPE = 'TABLE'

Good luck, Patrick -
How to copy FDM setting (import format, dimension mapping, control table)
Dear All,
How to copy FDM setting (import format, dimension mapping, control table) from application to another application.
I found that only dimension mapping can be imported. Is there any way to copy FDM application quickly?
Thanks for your help.

If you get a chance, try the following script; it's simple and makes it easy to extract all the map data to XML, which will help to import it back through an import script.
Sub MapExtractor()
'UpStream WebLink Custom Script:
'Created By: SatyaK
'Date Created: 12/08/09
'Purpose: This Script will produce a Account Map File for
' the current location.
'Execute the export
strFilePath = API.IntBlockMgr.InterfaceMgr.fExportMapToXML("File Path", PPOVPeriod)
End Sub
------------- -
How to do hibernate mapping for table or view without a primary key
Hi,
Anyone knows how to do hibernate mapping for table or view without a primary key?
thanks.
ccor

You can make all columns primary keys.

Anybody seriously considering that as a good idea should understand the implications:
1. this requires that each row be unique - that's a minor issue normally, but can be a significant problem in some cases
2. in practically all databases, a primary key constraint creates a unique index - this has many implications:
a) in most databases, the index stores a copy of the column data that is indexed - this suggestion therefore more than doubles the data storage required for the data
b) whenever an indexed column is changed, the index has to be maintained - this suggestion then more than doubles the work that each UPDATE statement has to do, since both the table and the index have to be maintained for any UPDATE at all
This might work OK for toy projects, but it doesn't scale well at all... -
Where to set Mapped Primary Table
I am using sunOne studio 4 update 1, EE.
I have created a schema by doing New->Database->Database Schema, all OK.
I have then created related ejbs by doing New->J2EE->Related CMP Entity EJBs, all OK.
Now I want to view the mapping information so
I look at the CMP bean entry properties under the EJB module but I see no mapping information in Sun ONE AS Mappings tab. !!!
When I try to edit the protected field I get warning indicating that bean must have a valid Mapped primary table. It tells me to set a Mapped Primary Table in the main properties sheet for this bean.
My problem is I can not find this sheet
Please help before I go crazy
Thanks in advance
Tex...

OK, I know where to set these fields now.
It is set in the Sun ONE AS tab pane.
However, here comes the next problem.
The Wizard generated schema does not recognise the primary key fields so I can not map CMP to any table :-)
I think this is a bug because I have generated schemas from 3 different databases (IBM, MySQL, SAP) and the tables have primary key fields.
When I get a connection in Runtime Pane under Database node, I can see the primary fields are highlighted with red color.
However, when I obtain the schema using, New->Database->Database Schema, there are no key fields generated.
Anyone have any ideas, work around?
Thanks
Tex... -
Collections Map, HashMap & Hashtable
Hi
I want to become familiar with the Collections Map API, HashMap & Hashtable.
Please guide me...
Tell me a resource link where I can get more information about it.
Bunty

Thanks Jos,
it is more theoretical.

You just skimmed through it for at most eleven minutes. How can you
tell it's 'more theoretical'? It is a tutorial, you know ...
kind regards,
Jos -
Mapping dimension table to fact table in admin tool 10g
Hi
I have a criteria which, when run, uses fact1 as the fact table and gives a result; when I add a new column to this criteria, an extra dimension (dim1) gets added to the backend query and fact1 changes to fact2.
The problem I am facing here is that both should return the same amounts for the records, but due to the new column from dim1 and the new fact table fact2, the amount varies and inaccurate results come back.
When I checked the sources for the fact in the logical layer, I found these facts (fact1, fact2), where fact1 is not mapped to dim1 and fact2 is mapped to dim1.
Will mapping dim1 to fact1 solve my problem? And what would be the steps to add/map dim1 to fact1?
Please suggest.

I just checked back the settings and here are the changed details again.
I have a criteria where, when it is run, the backend query is formed with the f1 fact table.
For the same criteria, when I add a column c1, the backend query is formed with the f2 fact table, f1 is no longer there, and one new dimension d1 containing c1 gets added.
In both cases the results are different; the expected outcome is the same results.
Note:
d1 is connected to f2 checked in physical diagram
d1 is not connected to f1 checked in physical diagram.
When I checked the connection between the three tables in the physical diagram, only d1 and f2 are connected, and f1 is not connected to either table. How do I go about this issue?
Please suggest. -
Issue in Mapping Qualified Table in Import Manager
Hi Experts
I have extracted Customer data from R/3 ECC, I could successfully loaded reference data, but when trying to load Main table data (Using DEBMDM06 MAP) I am getting mapping issues with Qualified table (Sales Data and Bank Details).
Error Messages
Map 'Sales Data' field to import qualifier(s)
Map 'Bank Detals' field to import qualifier(s)
I have checked and found that all the field and value mapping is in place. If I remove the mapping for the qualified table fields, then the data is ready to import.
What is the issue, and how do I import the qualified table data?
Appreciate your inputs
Gaurav.

Hi vickey,
Kindly go through the links below; they will help you better understand the mapping, working and importing of qualified tables:
https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/00a15239-684e-2b10-b8ae-b936b7d1c1fe
https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c03240fa-cc3e-2b10-aa9a-a5798e319a6e (Qualified importing with main table)
Hope It Helped
Thanks & Regards
Simona Pinto -
Error while mapping fact table to LOWEST LEVEL
Hi
While I am mapping my fact table using cwm2_olap_table_map.map_facttbl_levelkey and specifying it as the LOWEST LEVEL of the dimension table, it throws an error. Will it only work for ET and not for LOWEST LEVEL? In fact, I am not storing the embedded total anywhere in my fact tables.
Can anyone help me in mapping the lowest level of a dimension to a fact table?

Hi
Are you sure you are using the correct mapping tool? It sounds as if you have a relational schema, which requires a CWM mapping procedure. To map a relational schema it is best to use OEM, which has wizards to guide you through the mapping process.
CWM2 metadata is used to map 9iR2 Analytical Workspace objects. These typically involve fully summarised variables within the AW.
Hope this helps
Keith Laker
Product Manager
Oracle Business Intelligence Beans -
Opinion needed on best way to map multiple table joins (of the same table)
Hi all
I have a query of the format:
select A.col1, B.col1,C.col1
FROM
MASTER_TABLE A, ATTRIBUTE_TABLE B, ATTRIBUTE_TABLE C
WHERE
A.key1 = B.key1 (+)
AND
A.key1 = C.key1(+)
AND
B.key2(+) = 100001
AND
C.key2(+) = 100002
As you can see, I am joining the master table to the attribute table many times over (more than 30 attributes in my actual query), and I am struggling to find the best way to map this efficiently, as the comparison of script vs. mapping execution time is 1:10.
I would appreciate the opinion of experienced OWB users as to how they would tackle this in a mapping and to see if they use the same approach as I have done.
Many thanks
Adi

SELECT external_reference, b.attribute_value AS req_date,
c.attribute_value AS network, d.attribute_value AS spid,
e.attribute_value AS username, f.attribute_value AS ctype,
g.attribute_value AS airtimecredit, h.attribute_value AS simnum,
i.attribute_value AS lrcredit, j.attribute_value AS airlimitbar,
k.attribute_value AS simtype, l.attribute_value AS vt,
m.attribute_value AS gt, n.attribute_value AS dt,
o.attribute_value AS datanum, p.attribute_value AS srtype,
q.attribute_value AS faxnum,
R.ATTRIBUTE_VALUE AS FAXSRTYPE,
s.attribute_value AS extno,
t.attribute_value AS tb, u.attribute_value AS gb,
v.attribute_value AS mb, w.attribute_value AS stolenbar,
x.attribute_value AS hcredit, y.attribute_value AS adminbar,
z.attribute_value AS portdate
FROM csi_item_instances a,
csi_iea_values b,
csi_iea_values c,
csi_iea_values d,
csi_iea_values e,
csi_iea_values f,
csi_iea_values g,
csi_iea_values h,
csi_iea_values i,
csi_iea_values j,
csi_iea_values k,
csi_iea_values l,
csi_iea_values m,
csi_iea_values n,
csi_iea_values o,
csi_iea_values p,
csi_iea_values q,
CSI_IEA_VALUES R,
csi_iea_values s,
csi_iea_values t,
csi_iea_values u,
csi_iea_values v,
csi_iea_values w,
csi_iea_values x,
csi_iea_values y,
csi_iea_values z
WHERE a.instance_id = b.instance_id(+)
AND a.instance_id = c.instance_id(+)
AND a.instance_id = d.instance_id(+)
AND a.instance_id = e.instance_id(+)
AND a.instance_id = f.instance_id(+)
AND A.INSTANCE_ID = G.INSTANCE_ID(+)
AND a.instance_id = h.instance_id(+)
AND a.instance_id = i.instance_id(+)
AND a.instance_id = j.instance_id(+)
AND a.instance_id = k.instance_id(+)
AND a.instance_id = l.instance_id(+)
AND a.instance_id = m.instance_id(+)
AND a.instance_id = n.instance_id(+)
AND a.instance_id = o.instance_id(+)
AND a.instance_id = p.instance_id(+)
AND a.instance_id = q.instance_id(+)
AND A.INSTANCE_ID = R.INSTANCE_ID(+)
AND a.instance_id = s.instance_id(+)
AND a.instance_id = t.instance_id(+)
AND a.instance_id = u.instance_id(+)
AND a.instance_id = v.instance_id(+)
AND a.instance_id = w.instance_id(+)
AND a.instance_id = x.instance_id(+)
AND a.instance_id = y.instance_id(+)
AND a.instance_id = z.instance_id(+)
AND b.attribute_id(+) = 10000
AND c.attribute_id(+) = 10214
AND d.attribute_id(+) = 10132
AND e.attribute_id(+) = 10148
AND f.attribute_id(+) = 10019
AND g.attribute_id(+) = 10010
AND h.attribute_id(+) = 10129
AND i.attribute_id(+) = 10198
AND j.attribute_id(+) = 10009
AND k.attribute_id(+) = 10267
AND l.attribute_id(+) = 10171
AND m.attribute_id(+) = 10184
AND n.attribute_id(+) = 10060
AND o.attribute_id(+) = 10027
AND p.attribute_id(+) = 10049
AND q.attribute_id(+) = 10066
AND R.ATTRIBUTE_ID(+) = 10068
AND s.attribute_id(+) = 10065
AND t.attribute_id(+) = 10141
AND u.attribute_id(+) = 10072
AND v.attribute_id(+) = 10207
AND w.attribute_id(+) = 10135
AND x.attribute_id(+) = 10107
AND y.attribute_id(+) = 10008
AND z.attribute_id(+) = 10103
AND external_reference ='07920490103'
If I run this it takes less than a second in TOAD, when mapped in OWB it takes ages. 10:1 is a conservative estimate. In reality it takes 15-20 minutes. CSI_IEA_VALUES has 30 million rows CSI_ITEM_INSTANCES has 500,000 rows.
Hope that helps. I would love to know how others would tackle this query. -
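A common alternative to chaining 25 outer self-joins of an attribute-value table is a single outer join plus conditional aggregation, which scans the attribute table once. A minimal sketch in plain SQL via Python's sqlite3, with made-up rows and only two of the attribute ids from the query above:

```python
import sqlite3

# Attribute-value pivot: one LEFT JOIN + GROUP BY instead of one outer
# join per attribute. Sample rows are invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE csi_item_instances (instance_id INTEGER,
                                     external_reference TEXT);
    CREATE TABLE csi_iea_values (instance_id INTEGER, attribute_id INTEGER,
                                 attribute_value TEXT);
    INSERT INTO csi_item_instances VALUES (1, '07920490103');
    INSERT INTO csi_iea_values VALUES (1, 10000, '2024-01-01');  -- req_date
    INSERT INTO csi_iea_values VALUES (1, 10214, 'NET-A');       -- network
""")

# Each attribute becomes a MAX(CASE ...) column; MAX ignores the NULLs
# produced by non-matching attribute_ids.
rows = con.execute("""
    SELECT a.external_reference,
           MAX(CASE WHEN v.attribute_id = 10000
                    THEN v.attribute_value END) AS req_date,
           MAX(CASE WHEN v.attribute_id = 10214
                    THEN v.attribute_value END) AS network
    FROM csi_item_instances a
    LEFT JOIN csi_iea_values v ON v.instance_id = a.instance_id
    WHERE a.external_reference = '07920490103'
    GROUP BY a.external_reference
""").fetchall()
print(rows)  # [('07920490103', '2024-01-01', 'NET-A')]
```

Whether this beats the 25-join plan depends on the optimizer and the indexes on csi_iea_values, but it gives the mapping tool a much simpler structure to deploy: one joiner plus an expression per attribute rather than 25 joiner operators.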
Hi All,
I have implemented the out-of-the-box solution of the Order Management Analytics module with version 11.5.10 of Oracle EBS and the respective adapters. Under the same, I am not able to find any mapping to load the data into the W_CUSTOMER_STATUS_HIST_F_TMP table, which is affecting a few reports under Order Management.
Could you please tell me the mapping name for this target table so that I can verify from my side whether it has been missed?
I find this W_CUSTOMER_STATUS_HIST_F_TMP table being used as a source table under the PLP folder for the mappings
PLP_CustomerStatusHistoryFact_Extract
PLP_CustomerStatusHistoryFact_Old_Customers_Load
and PLP_CustomerStatusHistoryFact_Extract
Thanks,
-Vinay

Not sure whether this would help, but as per the data lineage guide, the PLP_CustomerStatusHistoryFact_Extract mapping populates this table from a series of different tables. Later, from the temp table the data goes to the W_CUSTOMER_STATUS_HIST_F table via the mapping PLP_CustomerStatusHistoryFact_New_Customers_Load.
~r -
XML mapping error table or view does not exist
Hi Friends,
I am using ODI 11g.
I am facing a error while working on XML to ORACLE mapping,
My source table name is Employee and the target table name is Copy Of Employee. I don't have any join condition.
ORA-00942: table or view does not exist
and
select SYS_GUID(),
2883011,
rowid,
'F',
'ODI-15065: Join error (TABLE__EMPLOYEE) between the table Copy of EMPLOYEE and the table TABLE_.',
sysdate,
'(237011)ODI_Test.INT_XML_TEST',
'TABLE__EMPLOYEE', 'FK',
EMPLOYEEORDER,
EMPLOYEE_KEY,
EMPLOYEE_KEYORDER,
HIRE_DT,
HIRE_DTORDER,
MGR_ID,
MGR_IDORDER,
MGR_NAME,
MGR_NAMEORDER,
NAME,
NAMEORDER,
POSTN,
POSTNORDER,
SVISOR_ID,
SVISOR_IDORDER,
SVISOR_NAME,
SVISOR_NAMEORDER,
TABLE_FK,
TYPE_,
TYPE_ORDER
from EDGE_DEV_OWNER."I$_Copy of EMPLOYEE" EMPLOYEE
where (
EMPLOYEE.TABLE_FK
) not in (
select TABLE_PK
from EDGE_DEV_OWNER.TABLE_
and (
EMPLOYEE.TABLE_FK is not null
Please suggest how to resolve this.
Thanks,
Lony

Hi Ksbabu,
I am getting this error:
oracle.odi.runtime.agent.invocation.InvocationException: object name already exists: EDGE_DEV_OWNER_RESULTS in statement [create table EDGE_DEV_OWNER_RESULTS(RESULTSPK NUMERIC(10) NOT NULL, SNPSFILENAME VARCHAR(255) NULL, SNPSFILEPATH VARCHAR(255) NULL, SNPSLOADDATE VARCHAR(255) NULL)]
at oracle.odi.runtime.agent.invocation.RemoteRuntimeAgentInvoker.invoke(RemoteRuntimeAgentInvoker.java:265)
Thanks,
Lony
8886446166 -
Hi friends,
I did this simple report in OBIEE 10g, i.e.
"NATIONALITY COUNT IN DEPARTMENT WISE"
For that i used the following tables:
per_all_assignments_f----->fact table
hr_all_organization_units----->dim table(containing departments)
per_all_people_f---------------->dim table(containing nationality)
I made all the mappings in the physical diagram, and also viewed my report in BI Answers.
It shows results like the following:

NATIONALITY    COUNT(NATIONALITY)
AUS            24
AFR            25
PHQ_VB         40
SH_VT          4
The problem is that it is showing the above results, but the nationality column contains the various country codes.
Since I don't want the nationality code displayed in the results, I need the meaning of each and every nationality,
like,
AUS        Australian
AFR        African
PHQ_VB     Germanian (assigned)
I know that the meaning of each nationality is available in "FND_LOOKUP_VALUES"... okay...
I can import the "FND_LOOKUP_VALUES" table into the physical layer, but how can I map it to the fact table in my physical diagram?
In my report the fact table is "per_all_assignments_f".
As my fact table does not contain any matching column corresponding to the dimension table "FND_LOOKUP_VALUES",
how can I give mappings to the fact column to view the full meaning of the nationality in my report?
Help me friends...
All izz Well
GTA...

Hi Kranthi,
Thanks for your reply....
To make the meaning appear for each nationality, I imported the HR_LOOKUPS table and joined it to per_all_people_f, which is a dimension table.
This is the query that I executed in TOAD to get the meaning of each nationality:

select distinct h15.meaning, h15.lookup_code
from hr_lookups h15, per_all_people_f papf, per_all_assignments_f paaf
where h15.lookup_type(+) = 'NATIONALITY'
and h15.lookup_code(+) = papf.nationality
and h15.meaning is not null
and papf.person_id = paaf.person_id

The same thing I implemented in OBIEE: in the physical diagram I gave the join between hr_lookups and per_all_people_f,
i.e. the lookup_code column in the hr_lookups table and the nationality column in the per_all_people_f table.
I obtained results in BI Answers, but I didn't get an accurate result.
To get an accurate result I need to give one more join condition between the hr_lookups table and the per_all_people_f table.
This is the join I need in order to obtain an accurate result, already mentioned in the above query:
> h15.lookup_type(+) = 'NATIONALITY'
Since OBIEE does not allow me to give a second join from the lookup table to the people table, how can I obtain an accurate result in BI Answers?
Is there any other way to give a second join between the same two tables, in my case between hr_lookups and per_all_people_f?
Please help me with this......
Thanks for your support.....
All izz Well
GTA...
Edited by: GTA on Feb 17, 2011 1:15 AM
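The blocker above is attaching the constant condition lookup_type(+) = 'NATIONALITY' to the join. In plain SQL the constant restriction lives in the outer-join condition itself, so only the NATIONALITY rows of the lookup table participate. A small sketch via Python's sqlite3 (made-up rows; the real HR tables have many more columns):

```python
import sqlite3

# Lookup join with a constant predicate in the ON clause. The second
# hr_lookups row shows why the filter is needed: the same lookup_code
# can exist under a different lookup_type.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE hr_lookups (lookup_type TEXT, lookup_code TEXT,
                             meaning TEXT);
    CREATE TABLE per_all_people_f (person_id INTEGER, nationality TEXT);
    INSERT INTO hr_lookups VALUES ('NATIONALITY', 'AUS', 'Australian');
    INSERT INTO hr_lookups VALUES ('MAR_STATUS', 'AUS', 'not a nationality');
    INSERT INTO per_all_people_f VALUES (1, 'AUS');
""")

# The constant restriction goes into the ON clause, so rows with other
# lookup_types never join, yet people without a match are still kept.
rows = con.execute("""
    SELECT p.person_id, l.meaning
    FROM per_all_people_f p
    LEFT JOIN hr_lookups l
      ON l.lookup_code = p.nationality
     AND l.lookup_type = 'NATIONALITY'
""").fetchall()
print(rows)  # [(1, 'Australian')]
```

In OBIEE 10g terms, two workarounds are commonly suggested for this shape: a complex physical join whose expression includes the constant filter, or a select-based physical table (opaque view) that pre-filters hr_lookups to lookup_type = 'NATIONALITY' before the simple equi-join.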