Journal data for multiple interfaces
Hi,
I have a source table.
I want to capture the journal data on this table and use the data stored in the JV$ tables in multiple interfaces.
The multiple interfaces share the same source table and are linked in a package.
Is it possible to achieve this?
Yes.
Just have one Extend Window / Lock Subscriber step at the start of the package, run all the interfaces you want on that change data, then finish the package with an Unlock Subscriber / Purge Journal step.
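To make the flow concrete, here is a rough sketch of that lock/consume/unlock sequence in Python. The class and method names are invented for this illustration and are not ODI tool calls; the point is that every interface reads the same locked window of changes.

```python
# Illustrative sketch only; JournalSubscriber and its methods are hypothetical
# stand-ins for the ODI package steps described above, not ODI APIs.
class JournalSubscriber:
    def __init__(self, changes):
        self.changes = list(changes)   # rows waiting in the journal (J$) table
        self.window = []               # rows locked for the current run

    def extend_window_and_lock(self):
        # Freeze the current set of changes so every interface sees the same rows.
        self.window = list(self.changes)

    def consume(self):
        # Each interface reads the SAME locked window, not the live journal.
        return list(self.window)

    def unlock_and_purge(self):
        # After all interfaces have run, release the lock and purge consumed rows.
        consumed = set(self.window)
        self.changes = [c for c in self.changes if c not in consumed]
        self.window = []

sub = JournalSubscriber(["row1", "row2"])
sub.extend_window_and_lock()
seen_by_interface_1 = sub.consume()
seen_by_interface_2 = sub.consume()   # same snapshot for every interface
sub.unlock_and_purge()
```

New changes arriving between the lock and the purge would simply stay in the journal for the next run of the package.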
Similar Messages
-
Because I want to acquire similar data multiple times and then take an average to increase the SNR, I added a while loop to the VI "niScope EX Multi-Device Configured Acquisition (TClk)". It works, but it runs very slowly (about 1 s per iteration). I think I put the while loop in the wrong position, which makes the VI run from the very beginning on each iteration. Where should I put the while loop to improve the speed? I have attached all the VIs and subVIs.
Thanks very much.
Attachments:
Multi-Device External Clocking (TClk).vi 1166 KB
avgWfm.vi 15 KB
Dear Zainykhas,
Thank you for posting this to the discussion forums and for uploading some sample code. I took a look at the issue you have been having, and it is unclear to me why you have placed two for loops around the original while loop. My understanding is that you want to use the original Sample.vi and execute it N times, where N is Max Freq divided by Interval, so that you can scan a range of frequencies.
Why not just put Sample.vi inside one for loop, use the loop counter scaled by the interval to count up towards Max Freq, and insert the desired frequency into the cluster using Bundle By Name?
Kind Regards,
Robert Ward
Applications Engineer, NI
Attachments:
Modified - RW.vi 48 KB -
Using journalized data in an interface with an aggregate function
Hi
I am trying to use the journalized data of a source table in one of my interfaces in ODI. The trouble is that one of the mappings on the target columns involves an aggregate function (SUM). When I run the interface I get an error saying "not a group by expression". I checked the code and found that the JRN_SUBSCRIBER, JRN_FLAG and JRN_DATE columns are included in the SELECT statement but not in the GROUP BY statement (the GROUP BY only contains the remaining two columns of the target table).
Is there a way around this? Do I have to manually modify the KM? If so, how would I go about doing it?
Also, I am using the Oracle GoldenGate JKM (Oracle to Oracle OGG).
Thanks, and I really appreciate the help.
Ajay
'ORA-00979' When Using The ODI CDC (Journalization) Feature With Knowledge Modules Including SQL Aggregate Functions [ID 424344.1]
Modified 11-MAR-2009 Type PROBLEM Status MODERATED
In this Document
Symptoms
Cause
Solution
Alternatives :
This document is being delivered to you via Oracle Support's Rapid Visibility (RaV) process, and therefore has not been subject to an independent technical review.
Applies to:
Oracle Data Integrator - Version: 3.2.03.01
This problem can occur on any platform.
Symptoms
After successfully testing an ODI Integration Interface that uses an aggregate function such as MIN, MAX or SUM, it becomes necessary to set up Changed Data Capture operations using journalized tables.
However, during execution of the Integration Interface to retrieve only the Journalized records, problems arise at the Load Data step of the Loading Knowledge Module and the following message is displayed in ODI Log:
ORA-00979: not a GROUP BY expression
Cause
Using both CDC - Journalization and aggregate functions gives rise to complex issues.
Solution
Technically there is a work around for this problem (see below).
WARNING: Oracle engineers issue a severe warning that this type of setup may give unexpected results. This is related to the way ODI Journalization is implemented with specific journalization tables: the aggregate function will only operate on the subset of rows stored (referenced) in the journalization table, NOT over the entire source table.
We recommend avoiding this type of Integration Interface setup.
Alternatives :
1. The problem is due to the missing JRN_* columns in the generated SQL GROUP BY clause.
The workaround is to duplicate the Loading Knowledge Module (LKM) and, in the clone, alter the "Load Data" step by editing the "Command on Source" tab and replacing the following instruction:
<%=odiRef.getGrpBy()%>
with
<%=odiRef.getGrpBy()%>
<%if ((odiRef.getGrpBy().length() > 0) && (odiRef.getPop("HAS_JRN").equals("1"))) {%>
,JRN_FLAG,JRN_SUBSCRIBER,JRN_DATE
<%}%>
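The effect of the modified step can be sketched outside ODI. The helper below is a hypothetical stand-in for the template logic above, not the ODI substitution API; it just shows when the JRN_* columns get appended to the generated GROUP BY clause.

```python
def build_group_by(grp_by, has_jrn):
    """Mimic the modified template step: append the JRN_* columns only when
    a GROUP BY clause exists AND the interface is journalized (HAS_JRN = 1).
    Hypothetical helper for illustration, not an odiRef call."""
    if grp_by and has_jrn:
        return grp_by + ", JRN_FLAG, JRN_SUBSCRIBER, JRN_DATE"
    return grp_by

build_group_by("GROUP BY CUST_ID", True)
# "GROUP BY CUST_ID, JRN_FLAG, JRN_SUBSCRIBER, JRN_DATE"
```

With `has_jrn` false, or with no GROUP BY at all, the clause is left untouched, which matches the two conditions in the template.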
2. It is possible to develop two alternative solutions:
(a) Develop two separate and distinct Integration Interfaces:
* The first Integration Interface loads data into a temporary table and specifies the aggregate functions to be used.
* The second Integration Interface uses the temporary table as a source. Note that if you create the table in the first Interface, it is necessary to drag and drop that Integration Interface into the Source panel.
(b) Define two connections to the database so that the Integration Interface references two distinct and separate data server sources (one for the journal, one for the other tables). In this case, the aggregate function will be executed on the source schema.
Show Related Information Related
Products
* Middleware > Business Intelligence > Oracle Data Integrator (ODI) > Oracle Data Integrator
Keywords
ODI; AGGREGATE; ORACLE DATA INTEGRATOR; KNOWLEDGE MODULES; CDC; SUNOPSIS
Errors
ORA-979
Please find above the content from OTN.
You should see the same document if you search for this ID in the Search Knowledge Base.
Cheers
Sachin -
Single SOAP receiver adapter for multiple interfaces
Hi,
I have to send multiple interfaces like Vendor, Customer, Material to one receiver.
I want to configure only one communication channel (receiver SOAP adapter) to send all these interfaces. Is this possible?
Currently I am provided with different URLs from the receiver system as below.
http://host:port/Services/Vendor.wsdl
http://host:port/Services/customer.wsdl
http://host:port/Services/Material.wsdl
I will have 3 sender agreements, 3 receiver determinations, 3 interface determinations and 3 receiver agreements.
I want only one SOAP receiver adapter that goes into all 3 receiver agreements.
But when I give the target URL as http://host:port/Services, the messages fail.
When I specify the full target URL in the adapter, such as http://host:port/Services/Vendor.wsdl, it works.
This means I would have to create as many communication channels as there are interfaces.
Is there a workaround for this?
Hi Kantheri,
For this, we have to fill the TargetURL and the SOAPAction in the receiver communication channel dynamically.
So we need to write a UDF in the message mapping that uses DynamicConfiguration to fill the TargetURL and the SOAPAction dynamically:
// Keys for the SOAP adapter's dynamic header attributes
DynamicConfigurationKey keySoapAction = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/SOAP", "THeaderSOAPACTION");
DynamicConfigurationKey keyTargetUrl = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/SOAP", "TServerLocation");
// access the dynamic configuration of the message
DynamicConfiguration conf = (DynamicConfiguration) container
.getTransformationParameters()
.get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
conf.put(keySoapAction, "Soap action"); // the operation-specific SOAP action
conf.put(keyTargetUrl, "target url");   // the interface-specific target URL
return "";
In this UDF, we are filling the TargetURL in the "TServerLocation" message attribute and the SOAPAction in the "THeaderSOAPACTION" message attribute.
So, whenever the corresponding operation is executed, these values are filled into the receiver communication channel at runtime.
TargetURL- Give some dummy URL or http://
SOAPAction - *
regards,
ganesh. -
Generic HTTP URL For Multiple Interfaces
Hi All,
Please suggest on the following requirement. I am using PI 7.31 Single stack (Java only version).
There are multiple HTTP to Proxy interfaces and each interface has different source structure.
The requirement is: how can we use a single HTTP URL for all the interfaces?
The major challenge is the source structure for all the interfaces is different.
Kindly Suggest....
Regards,
Nitin...
Hi Nitin,
One solution is to create one sender interface (either with multiple operations, one per interface, or with all structures as different optional nodes), then use operation-specific mapping.
Refer to the example below for the multiple-operations scenario:
Setup Multiple Operations Scenario in ESR and ID
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/90dcc6f4-0829-2d10-b0b2-c892473f1571?overridelayout=t…
regards,
Harish -
Reporting / inputting data for multiple GAAPs
In the past we would set up BPC to include reporting for multiple GAAPs (like IFRS, US GAAP and a local GAAP) by adding an extra datasource in which you could remap some of your entries from one GAAP to another.
So, for instance, if you had pensions on your IFRS pension account, you would then, in the US GAAP datasource that sits above TOTAL IFRS, remap that amount to the US GAAP account, leaving everything else the same but reflecting the US GAAP changes in a total datasource that includes all of your IFRS input plus the remaps, giving you US GAAP in an alternative hierarchy, for example.
That is how it was done, but I was wondering if there are any other ideas, especially if you have more than 2 or 3 GAAPs.
Let me know if you have other ideas,
best regards,
Edwin
Hello - Using a datasource would be a solution based on dimensionality; this makes sense to me. Other options with dimensionality may include duplicate accounts per GAAP. I think there are many possible ways to use dimensionality to achieve this: creating a new user dimension, categories, and surely others.
Another option would be to look at modeling this in separate applications.
Maybe using one application as the base and then writing to the others via destination-app/lookup-type functionality. You might consider handling different apps per GAAP in the ETL phase. In modeling I would be hesitant to jump towards separate applications unless the complexity and/or data volumes warranted this; shoot for the simplest elegant design when possible.
So depending on the complexity of the business case (the types of ASSETS alone could lead to a longer requirements discussion) and the data volumes, decide on the dimensionality vs. application approach. I would start with a dimensional design until my prototype forced me to consider something else.
Hope this helps.
Cheers -
I have a PCI-6120 card and I want to acquire data from more than one channel. I'm using Traditional DAQ, but it does not work for more than one channel. If someone has a data acquisition VI for the PCI-6120, or a suggestion on how to acquire the data, please let me know.
Thanks
Hello DSPGUY1,
You can definitely acquire from several channels. For your convenience, I have appended below the content from the help that tells you how to configure it:
"channels specifies the set of analog input channels. The order of the channels in the scan list defines the order in which the channels are scanned during an acquisition. channels is an array of strings. You can use one channel entry per element, specify the entire scan list in a single element, or use any combination of these two methods. If x, y, and z refer to channels, you can specify a list of channels in a single element by separating the individual channels by commas, for example, x,y,z. If x refers to the first channel in a consecutive channel range and y refers to the last channel, you can specify the range by separating the first and last channels by a colon, for example, x:y."
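As a rough illustration of that comma/colon syntax, a small parser (written for this post, not an NI-provided function) could expand such a channel string like this:

```python
def expand_channels(spec):
    """Expand a Traditional DAQ channel string such as "0,2,4:6" into the
    individual channel numbers, following the comma/colon syntax quoted
    above. Hypothetical helper for illustration only."""
    channels = []
    for part in spec.split(","):
        if ":" in part:
            # "first:last" means a consecutive range, inclusive of both ends
            first, last = (int(p) for p in part.split(":"))
            channels.extend(range(first, last + 1))
        else:
            channels.append(int(part))
    return channels

expand_channels("0,2,4:6")   # [0, 2, 4, 5, 6]
```

The scan order follows the order in the string, which matches the help text's note that the scan list order defines the acquisition order.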
Hope this helps.
Serges Lemo
Applications Engineer
National Instruments -
Unable to filter the data for multiple time selections by dimensions
Hi to all,
I am new to MDX, and I have a problem with my MDX query.
Calculated Member Logic:
SUM((OPENINGPERIOD([Date].[YQMD].[Year],[Date].[YQMD].[Month].&[2010-12-01T00:00:00]):[Date].[YQMD].Currentmember),[Measures].[Paid Amt])
MDX example:
With Member [MEASURES].[Received_Amount]
AS
SUM((OPENINGPERIOD([Date].[YQMD].[Year],[Date].[YQMD].[Month].&[2010-12-01T00:00:00]):[Date].[YQMD].Currentmember)
,[Measures].[Paid Amt])
SELECT {[MEASURES].[Received_Amount]} On Columns
,[Date].[YQMD].[Year].members On Rows
From [Financial]
If I select multiple time periods on Rows, the query works fine,
but if I select multiple periods in the WHERE clause it does not respond.
With Member [MEASURES].[Received_Amount]
AS
SUM((OPENINGPERIOD([Date].[YQMD].[Year],[Date].[YQMD].[Month].&[2010-12-01T00:00:00]):[Date].[YQMD].Currentmember)
,[Measures].[Paid Amt])
SELECT {[MEASURES].[Received_Amount]} On Columns
,[Speciality].[Specialty Name].[Specialty Name].members On Rows
From [Financial]
Where {[Date].[YQMD].[Year].&[2012-01-01T00:00:00],[Date].[YQMD].[Year].&[2013-01-01T00:00:00]}
Note:
Each result is computed from the minimum date in the database up to the selected time period.
The data also has to be filtered with respect to each drill-down dimension.
If multiple time periods are selected, the same formula has to be applied with respect to the dimensions.
Kindly help me solve this problem.
Best Regards,
Nagendra
Hi David,
Thanks for your response.
I have a measure; I need to get received_amount from the starting period in the database up to my selected period. Then I have to check it by dimensions using the same measure. If I select any single period by dimension it comes out fine, but if I select multiple periods at the filter level by dimensions it shows no records.
For Ex:
I have four years of data in my database (2010-2013).
In 2010:
Bill_Amt | Bill_Date  | Specialty | received_amount
1000     | 10/01/2010 | 4         | 600
2000     | 04/08/2010 | 2         | 1000
In 2013:
Bill_Amt | Bill_Date  | Specialty | received_amount
1500     | 22/02/2013 | 2         | 1200
2000     | 14/03/2013 | 1         | 800
In the above scenario,
By Period:
if I go by period I should get:
Jan'13 -> 1600
Feb'13 -> 2800
Mar'13 -> 3600
Specialty By Single Period:
If I select Jan'13 by specialty:
Specialty 2 -> 1000
Specialty 4 -> 600
If I select Feb'13 by specialty:
Specialty 2 -> 2200
Specialty 4 -> 600
If I select Mar'13 by specialty:
Specialty 1 -> 800
Specialty 2 -> 2200
Specialty 4 -> 600
Specialty By Multiple Selection Periods:
The result should be the sum of the individual selection periods by specialty, as follows:
If I select Jan'13 & Feb'13 by specialty:
Specialty 2 -> 3200
Specialty 4 -> 1200
If I select Jan'13, Feb'13 & Mar'13 by specialty:
Specialty 1 -> 800
Specialty 2 -> 5400
Specialty 4 -> 1800
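To cross-check those expected numbers, the snippet below recomputes them from the sample rows above in plain Python (the row encoding and helper names are mine, for illustration only; this is not the MDX fix itself):

```python
from collections import defaultdict

# Sample rows from the tables above: (month as YYYYMM, specialty, received_amount)
rows = [(201010, 4, 600), (201004, 2, 1000),
        (201302, 2, 1200), (201303, 1, 800)]

def cumulative_by_specialty(up_to_month):
    """received_amount per specialty from the earliest date up to the period,
    mirroring the OPENINGPERIOD(...):CurrentMember running sum."""
    totals = defaultdict(int)
    for month, specialty, amount in rows:
        if month <= up_to_month:
            totals[specialty] += amount
    return dict(totals)

def multi_period(selected_months):
    """Multiple selection = sum of the individual cumulative results per period,
    as the poster expects."""
    totals = defaultdict(int)
    for month in selected_months:
        for specialty, amount in cumulative_by_specialty(month).items():
            totals[specialty] += amount
    return dict(totals)

multi_period([201301, 201302])   # specialty 2 -> 3200, specialty 4 -> 1200
```

Selecting Jan'13, Feb'13 and Mar'13 the same way yields 800 / 5400 / 1800 for specialties 1 / 2 / 4, matching the expected table above.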
Regards,
Nagendra -
Fetching data for multiple POs one by one using BAPI..
Hi,
We have created a custom BAPI which gives all the details regading a
given PO in multiple tables.
Now in a report I have to give a range of PO numbers, which should be passed to the BAPI one by one, and the data fetched into the BAPI tables should then be transferred to a flat file, one by one for each PO.
How can I do this ?
Thanks.
Regards,
Rajesh.
Hi,
1. Get all the PO numbers into an internal table.
2. Loop at the internal table.
3. Within the loop, call the BAPI.
4. Pass the values returned by the BAPI into an internal table.
5. Call the function module GUI_DOWNLOAD and pass that internal table to it.
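A rough sketch of those steps in Python, with a hypothetical stub standing in for the custom BAPI and a plain file write standing in for GUI_DOWNLOAD (names and data are invented for illustration):

```python
import os
import tempfile

def call_po_bapi(po_number):
    """Stand-in for the custom BAPI: returns the detail rows for one PO."""
    return [f"{po_number},item{i}" for i in (1, 2)]

def download_pos(po_numbers, path):
    lines = []
    for po in po_numbers:               # steps 2-3: loop and call the BAPI per PO
        lines.extend(call_po_bapi(po))  # step 4: collect the returned table
    with open(path, "w") as f:          # step 5: stand-in for GUI_DOWNLOAD
        f.write("\n".join(lines))
    return lines

path = os.path.join(tempfile.gettempdir(), "pos.txt")
rows_written = download_pos(["4500000001", "4500000002"], path)
```

In the real report the loop body would be an ABAP CALL FUNCTION on the custom BAPI, appending its table export to the internal table before the single GUI_DOWNLOAD call.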
Reward Points if it is helpful.
Regards,
Sangeetha.A -
Getting the Data for Outbound Interface
Hello,
I am doing an outbound interface in which I need to take the Economic Order Quantity, minimum order quantity, present cost of the item and previous cost of the item (average costing may be used). The process of defining the required fields in Oracle has not yet started, as the implementation is new. If anybody has an approximate idea of where we can get the fields (Oracle tables), it would be helpful. As this is just a prototype, you are welcome to provide rough data, which may change in future once the process is clearly defined.
Expecting a reply.
Hello Madhav Dhurjaty,
Thanks for the Reply..
Economic Order Quantity is not available in MTL_SYSTEM_ITEMS_B or _TL;
we need to derive it by formula: EOQ = SQRT {[2 * (annual demand) * (order cost)] / (annual carrying cost)}
Annual demand is calculated from the forecast we provide when we perform reorder point planning. Order Cost and Carrying Cost can be obtained when defining items (General Planning).
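As a quick worked example of that formula (the input numbers are made up for illustration):

```python
from math import sqrt

def eoq(annual_demand, order_cost, annual_carrying_cost):
    """EOQ = sqrt(2 * D * S / H), as in the formula above:
    D = annual demand, S = cost per order, H = annual carrying cost per unit."""
    return sqrt(2 * annual_demand * order_cost / annual_carrying_cost)

eoq(1200, 50, 6)   # about 141.42 units per order
```

With a demand of 1200 units/year, a $50 order cost and a $6/unit carrying cost, the optimal order size works out to roughly 141 units, i.e. about 8-9 orders per year.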
But my concern is how to get the annual demand...
Awaiting your reply.. -
Using inbound service to produce data for outbound interface
Hi,
with SAP Enterprise Services, we have lots of functionality provided by SAP which we can use to easily get information
without having too much custom development done in the backend.
In our case, for example, we want to use an E-Recruiting enterprise service to extract data and send it to an external provider in an XML file.
here are the constraints
1) Reduce development as much as possible
2) Use PI to interface with external systems.
3) The external system cannot initiate the process.
It would be nice to reuse SAP content (Enterprise services) in order to extract the data and replicate it to the external system.
One way to do this would be to have a program which calls the method of the proxy that implements the enterprise service.
Then the program can just call PI to create a file with the contents returned by the method.
This is not an elegant solution; I would have liked PI to initiate the call at a specific time and, with the response of the call, send the XML file to the provider. This seems to be the reverse of what PI does...
I'm sure there are other solutions... any ideas ?
This is more of an extract and transfer scenario with PI...
Thanks.
Hi Thierry,
we want to use an erecruiting enterprise service to extract data and send it to an external
provider in an xml file
Please consider that the auto-generated proxies for enterprise services are mostly inbound (to SAP)
and have request and response structures.
When you bring PI into the picture, you will need to create outbound structures, which you need
to copy. You will face issues with the workaround suggested above: the proxies are not editable,
and to make them editable you will end up creating a copy, which adds more work than it reuses.
Also consider whether the customer wants to enhance a few fields in the request/response structures
(we need to consider this for sure). Enhancing the structure is possible using dependencies,
but it requires custom changes. If you are expecting enhancements, then it is better not to use enterprise services.
As the SAP standard is designed to accept a request and send a response, it is always correct to
ask the third party (consumer) to trigger the request for data, rather than trying to make PI the triggering agent.
Conclusion:
Regards,
$rinivas
Edited by: Srinivas on Aug 12, 2010 7:27 AM
Edited by: Srinivas on Aug 12, 2010 7:37 AM -
Smb server for multiple interfaces
I have iFS 1.2 installed on a Linux box running RedHat 7.1. There are two Ethernet cards in the box. By default, when iFS is started, it grabs both interfaces and uses their port 139. Is there a way to make it use only one of them? I configured the listener to listen on one specific IP instead of the hostname, and I was able to make Oracle 9i listen to just that IP, but I could not make iFS do so. Any suggestions?
If there is no way around it, has anyone tried to mount iFS via NFS to a directory and then use the Samba server (from www.samba.org) to share that directory? Do the iFS utilities (check in/out, versioning, etc.) work in this kind of setting?
I had an NT box with two cards, and WCP was not working because the ServerIp parameter in the server configuration was set to the second IP address, which I did not use. So I had to manually change the IP address for WCP to use. I stopped the WCP server, changed the parameter
WCP.IpAddress as documented in the Setup and Administration Guide, unloaded and reloaded the server, and it seemed to work.
Also check the ipAddress parameter which SMB is using in OEM and see if it is using the right ipAddress.
I have not tried on Linux though -
How to save the data for multiple sittings
Hello All
I have one sign-up form. A user enters only 4 or 5 fields, may not fill in the mandatory fields, and leaves, and the database should not give him an error at that point. The next day he comes back and wants to complete his sign-up, i.e. only when he commits all the details should all the validation happen.
For that I use one table without constraints to store the temporary details, and another table (with constraints) for the final submission.
1) I created the view and entity for the temporary table.
2) I also created the view and entity for the final table.
Then I created one method in the AM Impl file, exposed it to the client, and dragged and dropped this method as an ADF button. In that method I wrote the following logic:
1) take the data from one view (the temporary table) and set the values on the other view (the final table).
Using this, I copy the values from the temporary table to the final table...
But when the user leaves a mandatory field empty, it allows the data to be submitted to the temporary table; when I copy that data to the final table (with constraints) and the user submits, the validation happens, i.e. "this field is mandatory". When I click this error message it goes back to the first screen.
I tried to implement this functionality, but it's not working.
Can you please suggest another way to implement this functionality in ADF using JDeveloper 11g?
Thanks in advance
User,
I don't get it :-(
How do you find out if a user is coming back or is completely new? You have to have some information about the user to remember him. At least that information has to be mandatory (even for the temporary table).
Let's assume you have figured out that the user has been on your page before:
load his details from the temporary table into the form, let him fill in the rest, and if he hits Save, save them in the final table.
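A minimal sketch of that two-table idea, assuming some user key (here an email address) is itself mandatory so returning users can be recognized; the field names, sample values and in-memory "tables" are invented for illustration:

```python
# Sketch of the staged sign-up: the temporary store accepts partial data,
# and mandatory-field validation only runs when the user finalizes.
MANDATORY = {"email", "name"}          # assumed mandatory fields
temp_signups = {}                      # stands in for the unconstrained temp table
final_signups = {}                     # stands in for the constrained final table

def save_draft(user_key, fields):
    # Merge with any earlier draft so the user can come back later.
    temp_signups.setdefault(user_key, {}).update(fields)

def finalize(user_key):
    draft = temp_signups.get(user_key, {})
    missing = MANDATORY - draft.keys()
    if missing:
        # Validation happens only here, not while drafting.
        raise ValueError(f"missing mandatory fields: {sorted(missing)}")
    final_signups[user_key] = draft
    del temp_signups[user_key]

save_draft("u1", {"email": "ann@example.com"})
# finalize("u1") would raise here, because "name" is still missing
save_draft("u1", {"name": "Ann"})
finalize("u1")
```

In ADF terms, the draft save maps to a commit on the unconstrained entity and the finalize step to the copy-and-commit on the constrained one, with the validation error surfaced on the final form rather than bouncing back to the first screen.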
Timo -
How to generate a single Journal Entry for multiple Payments
Hi All
When creating a payment batch with 2 notes, for example, would it be possible to summarize only the Cash line in the GL accounting batch?
Thanks
Hi,
I made the settings below:
1º Journal Line Type
Merge Matching Lines = No
Transfer to GL = Summary
for all lines type of Event Class: Payments
2º Accounting Setup Manager
Update Accounting Options for Applications Payables
General Ledger Journal Entry Summarization = Summarize by GL Period
Redid the payment batch with two notes from two different suppliers and verified in XLA_AE_LINES that the lines with accounting class code CASH were not summarized.
Is it not possible to summarize the CASH lines? -
Configuration Scenario for multiple Interfaces
Hi,
I have a Party X who will receive invoices and credit memos from SAP.
Credit memos and invoices both use the same invoice IDoc type.
In XI we have different target formats for credit memos and invoices.
I have a condition in the IDoc which determines whether it is a credit memo or an invoice.
My question is: how do I route to a different mapping at runtime based on whether it is a credit memo or an invoice?
Both need to go to the same receiver, but with different mappings at runtime.
Regards
Krish
Hi Krish,
You can create a condition (routing rule) for this in the Interface Determination object of the interface scenario.
There it will be routed to the correct interface mapping and inbound message.
See the help link below (see InterfaceDeterminationRule):
Description for Interface Determination
http://help.sap.com/saphelp_nwpi711/helpdata/en/48/d2618c0d7d035be10000000a42189b/frameset.htm
I hope it helps you.
Bruno