Mapping constant values to the target data source columns
Hi
A target data source has only two columns (RPT_NAME, RPT_LEVEL), both of which are constants, mapped as follows:
WITH TEMP1 AS
(select 'AS' as A from dual
union
select 'IS' as A from dual
union
select 'ED' as A from dual)
select t.A as RPT_NAME, 'INV' as RPT_LEVEL from TEMP1 t
Here there is no source table involved for the mapping or for generating the FROM clause of the query.
Kindly let me know how to map this table.
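To illustrate what such a constant-only mapping produces, here is a quick sketch using Python's sqlite3 (SQLite allows FROM-less selects, so `dual` is omitted; this is only an illustration of the query's result set, not ODI code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
rows = conn.execute("""
    WITH TEMP1(A) AS (
        SELECT 'AS' UNION SELECT 'IS' UNION SELECT 'ED'
    )
    SELECT t.A AS RPT_NAME, 'INV' AS RPT_LEVEL
    FROM TEMP1 t
    ORDER BY RPT_NAME
""").fetchall()
print(rows)  # [('AS', 'INV'), ('ED', 'INV'), ('IS', 'INV')]
```

Every row gets the literal 'INV' for RPT_LEVEL, so no physical source table is needed at all.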
Thanks
Hi Amit
Is there a way we can create an intermediary staging table, model, or view in ODI itself? The business requirements are such that the source and target databases cannot change or include any other view or table.
Thanks
Similar Messages
-
Constant values to the target field
Hi friends,
How do I assign three constant values to the target field in XI?
Like:
constant ---> Target field.
There is one target field, but 3 constant values have to be assigned.
Is it through the FixValues function?
Regards
Sam
Hi,
Go to the Message Mapping editor.
Conversion --> FixValues.
Double-click it.
Fill in the "Key"/"Value" pairs as per your requirement:
Key Value
A R1
B R2
C R3
So when A comes from the source structure, it gets R1 in the target.
Hope this is helpful.
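The FixValues conversion described above behaves like a simple key-to-value lookup; a minimal Python sketch of that behavior (names hypothetical, not SAP code):

```python
# FixValues acts as a lookup table from source values to target values
FIX_VALUES = {"A": "R1", "B": "R2", "C": "R3"}

def map_fix_value(source_value, default=""):
    # unmatched keys fall back to a default, similar to the function's default-value setting
    return FIX_VALUES.get(source_value, default)

print(map_fix_value("A"))  # R1
```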
Srini -
About Query Data Source Columns property
Hello everyone,
I'm new to Oracle Forms version 10.1.0.2.
When I create a data block based on a table using Data Block Wizard, the block's Query Data Source Columns property is automatically populated with column definition entries corresponding to the columns of the base table.
I tried making changes to these entries, for example by changing the data types to wrong data types or even deleting them, and I found that those changes had no effect on the block at all. The form was still working as I wanted.
Please explain what is exactly the role of the block's Query Data Source Columns property.
Thank you very much.
p.s: The F1 key help says "The Query Data Source Columns property is valid only when the Query Data Source Type property is set to Table, Sub-query, or Procedure". So, please explain it in each context of Query Data Source Type.
IMHO those properties are self-explanatory: it is the data source of the block, or in other terms, how it is populated.
Table means the data block is based on a table and subsequently will be populated by
select col1, col2, col3 from your_table
With sub-query the block will be populated with your subquery; forms will issue
select col1, col2, col3 from (
-- this is your subquery
select col1, col2, col3 from tab1, tab2 where [....]
)
With Procedure in short you'd have a stored procedure which returns a ref cursor and the block will be populated by the ref cursor.
As for your question about the name: this actually should matter. The default is NULL, which means there needs to be a column with exactly the same name as the item; so in the table example above, the item associated with your_table.col1 should be named col1. If it isn't, the property should be set to the column name. If this property also doesn't reflect the column name, it shouldn't work, IMO.
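The matching rule described above can be sketched as follows (a simplification under the stated assumptions, not Forms internals):

```python
def resolve_column(item_name, qds_column=None):
    """An item maps to the base-table column with the same name as the item,
    unless an explicit Query Data Source Column name overrides it."""
    return qds_column if qds_column is not None else item_name

print(resolve_column("col1"))             # col1  (default NULL: item name is used)
print(resolve_column("my_item", "col1"))  # col1  (explicit column name overrides)
```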
cheers -
What are all the Standard Data targets&Data source for Warehouse management
Hi,
What are all the Standard Data targets & Data source for Warehouse management.
Thanks,
Ahmed.
Hi,
The master data:
1. Equipment data - IE01
2. All functional locations - IL01
3. Maintenance work centers - IR01
4. Task list data - IA05
5. Scheduled maintenance plans
6. Strategies
7. Measuring points
8. Characteristics for calibration
9. Equipment BOM
10. Material BOM
Regards,
Anddra -
What are all the Standard Data targets&Data source for Transportation Manag
Hi,
What are all the Standard Data targets & Data source for Transportation Management.
Thanks,
Ahmed.
Hi,
The master data:
1. Equipment data - IE01
2. All functional locations - IL01
3. Maintenance work centers - IR01
4. Task list data - IA05
5. Scheduled maintenance plans
6. Strategies
7. Measuring points
8. Characteristics for calibration
9. Equipment BOM
10. Material BOM
Regards,
Anddra -
Target data source does not support the AGO operation
Hi,
In the BI Admin Tool, I join an Essbase cube and a relational source, then apply the Ago function to Essbase measures. In BI Answers, when I try to run a query that includes Essbase Ago measures and relational columns (non-measures), the error message shows the following detail:
Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 22001] Target data source does not support the AGO operation. (HY000)
When I remove the relational columns or run Essbase current-date measures, the result is fine.
So, what is the exact meaning of this error message? And do relational (non-measure) columns support the Essbase measures' Ago function?
To clarify:
fail case:
criteria
YEAR | YTD,gen03 | MONTH_NAME | SALES(YEAR_AGO)
cube dimension: year, ytd,gen03
relational source: month_name
cube measure using AGO(): sales(year_ago)
result: error message
Success case:
criteria
YEAR | YTD,gen03 | SALES(YEAR_AGO)
cube dimension: year, ytd,gen03
cube measure using AGO(): sales(year_ago)
result: success! How can I solve it? Thanks.
Defining a 1:1 mapping back from the target to source
Is it always required to define a 1:1 mapping back from the target to the source
object in a 1:M mapping?
Can I not just define the one-to-many mapping in the source mapping?
Thanks.
You can map a 1:M without the returning 1:1. The issue is that you have now made it your application model's responsibility to manage the FK relationship instead of leveraging TopLink.
The FAQ addresses this here.
Doug -
Only constant values populate in target payload
Hi all,
I have a File to RFC scenario that attempts to post invoices via BAPI_ACC_GL_POSTING_POST.
The source values are populating the payload (see below) but the only values in the target payload are constants supplied in the mapping.
I have tested the mapping in the designer. Both the message mapping tester and the interface mappings tester populate the target field values with the source values. I can supply the mapping text if needed.
Any thoughts?
Thanks,
Troy
Source payload in sxmb_moni:
<?xml version="1.0" encoding="utf-8" ?>
- <ns:MT_Sales_Tran_Invoice xmlns:ns="urn:http://freemanco.com/xi/sales_tran1.0">
- <Invoice>
- <invoice_header>
<Event>000000000142319</Event>
<Branch>0112</Branch>
<Transaction_Date>20070509</Transaction_Date>
<Record_Type>IN</Record_Type>
<Company>01</Company>
<Customer>0000100002</Customer>
<Invoice_No>336</Invoice_No>
<Amount>13500</Amount>
<Tax_Amount>0</Tax_Amount>
</invoice_header>
- <invoice_detail>
<Record_Type>ID</Record_Type>
<Invoice_Number>336</Invoice_Number>
<Quantity>100</Quantity>
<Net_Amount>13500</Net_Amount>
<Discount_Pct>0</Discount_Pct>
<Order_key>446</Order_key>
<Order_Item_Key>1</Order_Item_Key>
<GL_account>410010</GL_account>
<Sequence_No>0000000001</Sequence_No>
</invoice_detail>
</Invoice>
</ns:MT_Sales_Tran_Invoice>
Target payload in sxmb_moni:
<?xml version="1.0" encoding="UTF-8" ?>
- <ns1:BAPI_ACC_GL_POSTING_POST xmlns:ns1="urn:sap-com:document:sap:rfc:functions">
- <DOCUMENTHEADER>
<USERNAME>george</USERNAME>
<COMP_CODE>01</COMP_CODE>
<DOC_TYPE>DR</DOC_TYPE>
</DOCUMENTHEADER>
<ACCOUNTGL />
<CURRENCYAMOUNT />
<RETURN />
</ns1:BAPI_ACC_GL_POSTING_POST>
Beena,
You may be on to something...
When I run the sxmb_moni payload in the message mapping tester I get this error:
17:36:32 Start of test
Fatal Error: com.sap.engine.lib.xml.parser.ParserException: XML Declaration not allowed here.(:main:, row:1, col:7) com.sap.aii.utilxi.misc.api.BaseRuntimeException: Fatal Error: com.sap.engine.lib.xml.parser.ParserException: XML Declaration not allowed here.(:main:, row:1, col:7)
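A common workaround when pasting an sxmb_moni payload into the mapping tester is to remove the leading XML declaration first, since the tester rejects it. A small Python sketch of that string transformation (the tester itself is not scripted this way; this only shows what to strip):

```python
import re

def strip_xml_declaration(payload: str) -> str:
    """Remove a leading <?xml ... ?> declaration from a payload string."""
    return re.sub(r'^\s*<\?xml[^>]*\?>\s*', '', payload, count=1)

sample = '<?xml version="1.0" encoding="utf-8" ?><Invoice/>'
print(strip_xml_declaration(sample))  # <Invoice/>
```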
Here is the exact xml from sxmb_moni:
<?xml version="1.0" encoding="utf-8" ?>
<ns:MT_Sales_Tran_Invoice xmlns:ns="urn:http://freemanco.com/xi/sales_tran1.0">
<Invoice>
<invoice_header>
<Event>000000000142319</Event>
<Branch>0112</Branch>
<Transaction_Date>20070509</Transaction_Date>
<Record_Type>IN</Record_Type>
<Company>01</Company>
<Customer>0000100002</Customer>
<Invoice_No>336</Invoice_No>
<Amount>13500</Amount>
<Tax_Amount>0</Tax_Amount>
<Username>thompt</Username>
</invoice_header>
<invoice_detail>
<Record_Type>ID</Record_Type>
<Invoice_Number>336</Invoice_Number>
<Quantity>100</Quantity>
<Net_Amount>13500</Net_Amount>
<Discount_Pct>0</Discount_Pct>
<Order_key>446</Order_key>
<Order_Item_Key>1</Order_Item_Key>
<GL_account>410010</GL_account>
<Sequence_No>0000000001</Sequence_No>
</invoice_detail>
</Invoice>
</ns:MT_Sales_Tran_Invoice> -
JDeveloper v11.1.1.2
Is it possible to set multiple Target Data Source iterators in the Edit Tree Binding dialog?
Hi Ananda
Thank You very much for your reply!
B) Yes, the same data is required in both applications, but not in completely the same format or structure. E.g., Siebel is the Customer Hub for the client. All the customer data is required to be migrated to multiple applications (say Oracle EBS and Oracle BRM) after a verification process, which happens at the end of the day. What I require is that the ODI interface should pull data from Siebel once and upload it to both Oracle EBS and Oracle BRM simultaneously, as per the mapping rules defined for them. Basically, I wanted to avoid hitting the Siebel application twice for pulling customer data by creating two separate interfaces. Is this possible using ODI?
C) Please take the same customer scenario as in B. The customer is inserted into Oracle EBS and Oracle BRM using two different interfaces executed sequentially in a package. I want to maintain atomicity, i.e., the customer is created either in both applications or in neither. If that particular customer fails in the 1st interface, it should still be tried in the 2nd interface; and if it fails in the 2nd interface, it should roll back in the 1st interface. Can this be achieved in ODI?
Hope the above clarifies my query.
Thank You Again
Priyadarshi -
ADF Tree Target Data Source works badly (doesn't work under certain circumstances)
Hi OTN,
On my ADF page I have an af:TreeTable. It exposes 3 levels of hierarchy.
The DataModel is:
ComponentVO
- ParameterVO
- - IndexVO
All 3 are based on ComponentEO entity.
There are also 3 dependant tables on the page, which shows those 3 VOs.
I want the selection in tree to change current row in all three tables.
I'm trying to use Target Data Source feature in a tree binding.
For each level of the tree I specified a target data source like "${bindings.ComponentView1Iterator}".
Also added a partialTrigger attributes to tables to be refreshed upon tree selection.
At runtime I see this feature working almost correctly.
I click the tree nodes and see all three tables update their current row, but there is a situation where they don't.
Say, we have a following node structure:
c1
- p1
- - i11
- - i12
- p2
- - i21
When I select i21 and try to switch to i11 directly (without clicking p1), nothing happens.
It seems the Target Data Source feature doesn't work on parent nodes having only one child.
I've tried removing PartialTriggers, changing ChangeEventPolicy... no luck.
Could someone help me cope with this problem?
Thanks.
ADF 11.1.1.4, Firefox 4
Here's tree binding:
<tree IterBinding="ComponentView1Iterator" id="TemplateTreeView1">
<nodeDefinition DefName="model.view.ComponentView"
Name="TemplateTreeView10"
TargetIterator="${bindings.ComponentView1Iterator}">
<AttrNames>
<Item Value="Name"/>
</AttrNames>
<Accessors>
<Item Value="ParameterView"/>
</Accessors>
</nodeDefinition>
<nodeDefinition DefName="model.view.ParameterView"
Name="TemplateTreeView11"
TargetIterator="${bindings.ParameterView1Iterator}">
<AttrNames>
<Item Value="Name"/>
</AttrNames>
<Accessors>
<Item Value="IndexView"/>
</Accessors>
</nodeDefinition>
<nodeDefinition DefName="model.view.IndexView" Name="TemplateTreeView12"
TargetIterator="${bindings.IndexView1Iterator}">
<AttrNames>
<Item Value="Name"/>
</AttrNames>
</nodeDefinition>
</tree>
Edited by: ILya Cyclone on Apr 25, 2011 7:16 PM
Hi,
works fine for me. I created a tree and used the ChangeEvent policy on the referenced iterators.
Frank -
How can I use the same data source twice?
In our system there is already a complete transfer of calculations to BI (0CO_PC_PCP_01).
However I want to create a new data flow from ERP starting from the same data source.
Is this possible?
Thanks for the help!
Hi!
welcome to SDN forums.
Any data source can be mapped only to a single InfoSource, but that will not restrict you from having different data targets fed from a single InfoSource. So you have to use filters in the InfoPackages to select the data target and filter the data for this data source. The only thing you have to do with the old data flow is restrict the data relevant for the 1st data target in the old InfoPackage and re-initialize.
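Conceptually, the per-InfoPackage filters route rows from the shared data source to different data targets; an abstract Python sketch of that idea (names hypothetical, not BW code):

```python
def route_rows(rows, filters):
    # each data target keeps only the rows its filter predicate accepts
    return {target: [r for r in rows if keep(r)] for target, keep in filters.items()}

rows = [{"region": "EU"}, {"region": "US"}]
filters = {
    "old_target": lambda r: r["region"] == "EU",
    "new_target": lambda r: r["region"] == "US",
}
print(route_rows(rows, filters))
# {'old_target': [{'region': 'EU'}], 'new_target': [{'region': 'US'}]}
```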
with regards
ashwin -
Hello there,
I have an Excel report I created which works perfectly fine on my dev environment, but fails on my test environment when I try to do a data refresh.
The key difference between both dev and test environments is that in dev, everything is installed in one server:
SharePoint 2013
SQL 2012: Database Instance, SSAS Instance, SSRS for SharePoint, SSAS POWERPIVOT instance (Powerpivot for SharePoint).
In my test and production environments, the architecture is different:
SQL DB Servers in High Availability (irrelevant for this report since it is connecting to the tabular model, just FYI)
SQL SSAS Tabular server (contains a tabular model that processes data from the SQL DBs).
2x SharePoint Application Servers (we installed both SSRS and PowerPivot for SharePoint on these servers)
2x SharePoint FrontEnd Servers (contain the SSRS and PowerPivot add-ins).
Now in dev, test and production, I can run PowerPivot reports that have been created in SharePoint without any issues. Those reports can access the SSAS Tabular model without any issues, and perform data refresh and OLAP functions (slicing, dicing, etc).
The problem is with Excel reports (i.e. .xlsx files) uploaded to SharePoint. While I can open them, I am having a hard time performing a data refresh. The error I get is:
"An error occurred during an attempt to establish a connection to the external data source [...]"
I ran SQL Profiler on my SSAS server where the tabular instance is, and I noticed that every time I try to perform a data refresh, I get two entries under the user name ANONYMOUS LOGON.
Since things work without any issues on my single-server dev environment, I tried running SQL Server Profiler there as well to see what I get.
As you can see from the above, in the dev environment the query runs without any issues and the user name logged is in fact my username from the dev environment domain. I also have a separated user for the test domain, and another for the production domain.
Now upon some preliminary investigation I believe this has something to do with the data connection settings in Excel and the usage (or no usage) of secure store. This is what I can vouch for so far:
Library containing reports is configured as trusted in SharePoint Central Admin.
Library containing data connections is configured as trusted in SharePoint Central Admin.
The Data Provider referenced in the Excel report (MSOLAP.5) is configured as trusted in SharePoint Central Admin.
In the Excel report, the Excel Services authentication setting is set to "use authenticated user's account". This works fine in the DEV environment.
Concerning Secure Store, the PowerPivot Configurator has configured the PowerPivotUnattendedAccount application ID in all the environments. There is NO configuration of an application ID for Excel Services in any of the environments (dev, test, or production). Although I reckon this is where the solution lies, I am not 100% sure why it fails in test and prod. But as I read what I am writing, I reckon this is because of the authentication "hops" through servers. Am I right in my assumption?
Could someone please advise what I am doing wrong in this case? If it is the fact that I am missing a Secure Store entry for Excel Services, I am wondering if someone could advise me on how to set it up? My confusion is around the "Target Application Type" setting.
Thank you for your time.
Regards,
P.
Hi Rameshwar,
PowerPivot workbooks contain embedded data connections. To support workbook interaction through slicers and filters, Excel Services must be configured to allow external data access through embedded connection information. External data access is required for retrieving PowerPivot data that is loaded on PowerPivot servers in the farm. Please refer to the steps below to solve this issue:
In Central Administration, in Application Management, click Manage service applications.
Click Excel Services Application.
Click Trusted File Location.
Click http:// or the location you want to configure.
In External Data, in Allow External Data, click Trusted data connection libraries and embedded.
Click OK.
For more information, please see:
Create a trusted location for PowerPivot sites in Central Administration:
http://msdn.microsoft.com/en-us/library/ee637428.aspx
Another reason is that Excel Services returns this error when you query PowerPivot data in an Excel workbook that is published to SharePoint and the SharePoint environment does not have a PowerPivot for SharePoint server, or the SQL Server Analysis Services (PowerPivot) service is stopped. Please check this document:
http://technet.microsoft.com/en-us/library/ff487858(v=sql.110).aspx
Finally, here is a good article regarding how to troubleshoot PowerPivot data refresh for your reference. Please see:
Troubleshooting PowerPivot Data Refresh:
http://social.technet.microsoft.com/wiki/contents/articles/3870.troubleshooting-powerpivot-data-refresh.aspx
Hope this helps.
Elvis Long
TechNet Community Support -
Reading Values from Listbox and data source into MS Office Toolkit
Hi,
I've been trying to get this to work but am making no progress, and my lack of experience with LabVIEW is becoming a hindrance.
Does anyone know how I can read the values from the listbox example attached into the MS Office Toolkit for Excel?
The values from the listbox need to be compared to multiple values from a strain data source.
Cheers,
Mike.
Attachments:
Capture.PNG 62 KB
Hi,
OK, in the attached VI I want the values from the listbox ("0kg" through "10kg") to be put into the Excel table in the Report Generation Toolkit, alongside data from the convert strain gauge reading.
Cheers,
Mick.
Attachments:
Strain Gauge Edit2.vi 112 KB -
This is the message I am getting when I try to configure synchronization through Intellisync in all functions. Any ideas?
Folder is no longer part of the system data source or the folder not found
JT
jtwilcox wrote:
Mail account is MAPI
XP
Outlook 2007
thanks for your help so far!
JT
Hi...
Based upon the folder error, my guess is the Desktop Manager is having difficulty determining your default mail client. I am not sure why but it is easy enough to redo a profile in Outlook just to test this. (Control Panel / Mail)
Of course, I would back up my PST file prior to doing this, then create a new profile and import the PST file into it. Once the new profile is running and tested OK, I would again go to Control Panel / Mail and set the default profile to point to the newly created profile. Then run the Desktop Manager and see what happens....
H.
+++++++++++++++++++++++++++++++++++++++++++++++++
If successful in helping you, please give me kudos in my post.
Please mark the post that solved it for you!
Thank you!
Best Regards
hmeister -
Can we set the dynamic data source when using getReportParameters() ?
Hello!
I have a report where one of its parameters refers to a list of values (LOVs). This list of values is an SQL Query type. When the data source used in the report is defined in the BI Publisher server, I'm able to get the report parameters using the getReportParameters() function in my application. However, if the data source is not defined the function throws an exception, which is understandable.
I decided to dynamically set the data source so that even if the data source used by the report is not defined in the BI Publisher server, it still will be able to get the LOVs for the parameter. I tried setting the JDBCDataSource of the dynamicDataSource for the ReportRequest object that I passed to the getReportParameters() function. Please see the sample code below:
reportRequest.dynamicDataSource = new BIP10.BIPDataSource();
reportRequest.dynamicDataSource.JDBCDataSource = new BIP10.JDBCDataSource();
setReportDataSource(reportRequest.dynamicDataSource.JDBCDataSource, connectstr, jdbc, dc); //function to set the values for JDBCDataSource object
reportParams = webrs.getReportParameters(reportRequest, uid, pwd); //call the getReportParameters
I was expecting this to work, as this is what I did to dynamically set the data source before calling the runReport function. So, my question is: can we set a dynamic data source when using getReportParameters()? I tried this in both versions 10g and 11g; it does not seem to work in either.
Regards,
Stephanie
The report_id column of apex_application_page_ir_rpt can help us uniquely identify each saved report.
We can assign this report_id value to a page item and this page item can be put in the Report ID Item text box of the Advanced section of the Report Attributes page of the IR.
This should load the saved report identified by report_id and you can get rid of javascript
Regards,
Vishal
http://obiee-oracledb.blogspot.com
http://www.packtpub.com/oracle-apex-4-2-reporting/book
Kindly mark the reply as helpful/correct if it solves your problem