Newbie trying to capture SCD Type 2 data
Hi all,
OWB Client 10.2.0.1.31
OWB Repos 10.2.0.1.0
Target DW DB Oracle 10.2.0.3.0
I've been working with OWB for 2+ years now, but what we currently do is very simple mappings and process flows only. Now I need to start capturing history data. I've determined that I want my data to look something like this:
SURROGATE_ID  BUS_ID  PROD_NAME  REVENUE_TYPE  EFFECTIVE_DATE  EXPIRATION_DATE
1             1-ASDF  WIDGET     ONGOING       1/1/2009        11/19/2009
2             1-ASDF  WIDGET     ONETIME       11/20/2009      NULL
When the REVENUE_TYPE value changes, "expire" the old record and create a new record. (I understand the basic concepts of surrogate key, business key, effective date, end date, and trigger column.)
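For reference, the expire-and-insert rule I'm describing can be sketched like this (column names mirror the table above; this is just a language-neutral illustration, not anything OWB generates):

```python
from datetime import date, timedelta

def apply_scd2_change(rows, bus_id, new_revenue_type, change_date, next_id):
    """Expire the current version for bus_id and append a new version
    when the trigger column (REVENUE_TYPE) changes."""
    current = next(r for r in rows
                   if r["bus_id"] == bus_id and r["expiration_date"] is None)
    if current["revenue_type"] == new_revenue_type:
        return rows  # trigger column unchanged: nothing to do
    # Expire the old version the day before the change takes effect.
    current["expiration_date"] = change_date - timedelta(days=1)
    rows.append({
        "surrogate_id": next_id,
        "bus_id": bus_id,
        "prod_name": current["prod_name"],
        "revenue_type": new_revenue_type,
        "effective_date": change_date,
        "expiration_date": None,  # NULL marks the current version
    })
    return rows

rows = [{"surrogate_id": 1, "bus_id": "1-ASDF", "prod_name": "WIDGET",
         "revenue_type": "ONGOING", "effective_date": date(2009, 1, 1),
         "expiration_date": None}]
apply_scd2_change(rows, "1-ASDF", "ONETIME", date(2009, 11, 20), next_id=2)
```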
I've been combing through the Oracle docs and forums as best as I can to find the best practice/recommended approach. I've also completed some basic OWB Implementation lessons, and I still have some basic questions:
1. When I create a "dimension" in OWB and deploy it (which is giving me lots of trouble!) what object does it create in the db? I see in the case of a simple time dimension it creates a sequence and a table. But for SCD2, what db objects should I expect to see in the db?
2. I have found in OTN under "Sample Code" for OWB an article "Slowly Changing Dimensions with Oracle 9i Warehouse Builder (9.0.4)" that launches http://www.oracle.com/technology/sample_code/products/warehouse/files/SCDWhitePaper.zip. Is this still the current best practice for creating SCD2 even in OWB 10.2.0.1.0?
3. I'm a little confused about levels and hierarchies (when I go to validate and deploy my dimension, OWB yells at me about these things). Based on my table design above, I think I do not need any levels or hierarchies. Can someone give me guidance on how to set up my levels and hierarchies properly?
I think that should help confuse me for a while. Thanks!
Sammi
All,
thanks so much for taking the time to try to help me out! Here are some followup questions I have.
In the "dimension canvas" I get the following tabs by default:
relational, dimensional, business definition.
After I "auto bind" my dimension, I get a new tab called "dim_tbl_test_dim" (which is the name I gave to the new test dimension). But when I close out of the dimension editor and then open it up again, both the auto-binding and the "dim_tbl_test_dim" tab seem to be gone. Is this expected behavior?
In searching on the subject I came across this thread, which provides a quick summary on the full life cycle of building and deploying an SCD2. Is this the correct high level process to follow (and I have inserted some questions about several steps in the process)?
Re: How to view data from cubes & Dimensions in OWB 10g
Taking a simple example:
Source Table = A
Destination Table = B
Dimension = C
(how do I tell OWB to bind to my destination table?)
Mapping = D (to load A into B)
(Can someone point me to details or docs on how to properly do this mapping? For instance, what load type do I use, what matching columns do I specify, and what are the load properties of the different columns?)
Validate Table A.
Validate Table B.
Validate Dimension C.
Validate Mapping D.
Deploy table B.
Deploy Sequence
Deploy mapping D.
Deploy dimension C (configured => Deploy All).
Execute mapping to load table B.
View data in Dimension C.
Thanks again for all the help!
Sammi
Similar Messages
-
Error in MERGE statement when trying to implement SCD Type 2 using MERGE...
Hi ,
I'm trying to implement SCD Type 2 using MERGE, with the blog below as a reference, but somehow it is erroring out with the following errors:
http://www.made2mentor.com/2013/08/how-to-load-slowly-changing-dimensions-using-t-sql-merge/
Msg 207, Level 16, State 1, Line 40
Invalid column name 'Current'.
Msg 207, Level 16, State 1, Line 38
Invalid column name 'Current'.
Msg 207, Level 16, State 1, Line 47
Invalid column name 'Current'.
Here is the code below...
--Create temporary table to hold dimension records
IF OBJECT_ID('tempdb..#DimVirtualQueue') IS NOT NULL
DROP TABLE #DimVirtualQueue;
CREATE TABLE #DimVirtualQueue
( [VQ_name] [varchar](50) NULL,
[contact_type] [varchar](50) NULL,
[center_node_id] [int] NULL,
[sed_id] [datetime] NULL,
[eed_id] [datetime] NULL,
[insert_date] [datetime] NULL,
[Current] [char](1) NOT NULL
);
INSERT INTO #DimVirtualQueue(VQ_name, contact_type, center_node_id, sed_id, eed_id, insert_date,[Current] )
SELECT VQ_name, contact_type, center_node_id, sed_id , eed_id,GETDATE(),'Y'
FROM
( --Declare Source and Target tables.
MERGE dbo.tblSwDM_dim_VQ_test AS TARGET
--Source
USING (SELECT
RTRIM(LTRIM(Stage.RESOURCE_NAME)) AS VQ_name,
'Unknown' AS contact_type,
0 AS center_node_id,
CONVERT(INT,CONVERT(VARCHAR(8),GMT_START_TIME,112)) AS sed_id,
CONVERT(INT,CONVERT(VARCHAR(8),ISNULL(GMT_END_TIME,'2070-01-01'),112)) AS eed_id,
GETDATE() AS insert_date
FROM dbo.tblGenesys_stg_RESOURCE_ Stage
WHERE resource_type = 'queue'
AND resource_subtype = 'VirtualQueue'
AND NOT EXISTS (SELECT 1 FROM dbo.tblSwDM_dim_VQ AS dim
WHERE RTRIM(LTRIM(stage.RESOURCE_NAME)) = RTRIM(LTRIM(dim.vq_name))) ) SOURCE
ON TARGET.VQ_name = SOURCE.VQ_name
WHEN NOT MATCHED BY TARGET
THEN
INSERT ( VQ_name, contact_type, center_node_id, sed_id, eed_id, insert_date,[Current] )
VALUES (SOURCE.VQ_name,SOURCE.contact_type,SOURCE.center_node_id,SOURCE.sed_id,SOURCE.eed_id,SOURCE.insert_date,'Y')
WHEN MATCHED AND TARGET.[Current] = 'Y'
AND EXISTS (
SELECT SOURCE.VQ_name
EXCEPT
SELECT TARGET.VQ_name
)
--Expire the records in target if exist in source.
THEN UPDATE SET TARGET.[Current] = 'N',
TARGET.[eed_id] = SOURCE.eed_id
OUTPUT $Action ActionOut, SOURCE.VQ_name,SOURCE.contact_type,SOURCE.center_node_id,SOURCE.sed_id,SOURCE.eed_id) AS MergeOut
WHERE MergeOut.ActionOut = 'UPDATE';
--Insert data into dimension
INSERT tblSwDM_dim_VQ_test
SELECT VQ_name,contact_type,center_node_id,sed_id,eed_id,insert_date,[Current] FROM #DimVirtualQueue
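For context, the overall pattern the blog uses (expire matched current rows via MERGE, capture them via the OUTPUT clause into a temp table, then re-insert the new versions) behaves like this sketch. Note the sketch assumes contact_type is the changing attribute being compared, which my pasted code does not actually do:

```python
def scd2_merge(target, source, today):
    """Sketch of the MERGE + OUTPUT pattern: (1) expire current target
    rows whose source version differs, collecting the new versions;
    (2) insert brand-new keys and the new versions of changed keys."""
    expired = []
    src_by_key = {s["VQ_name"]: s for s in source}
    tgt_keys = {t["VQ_name"] for t in target}
    for t in target:
        s = src_by_key.get(t["VQ_name"])
        if (s and t["Current"] == "Y"
                and s["contact_type"] != t["contact_type"]):
            t["Current"], t["eed_id"] = "N", today  # expire old version
            expired.append(s)                       # captured by OUTPUT
    for s in source:
        if s["VQ_name"] not in tgt_keys:            # WHEN NOT MATCHED
            target.append({**s, "Current": "Y"})
    for s in expired:                               # final INSERT from temp table
        target.append({**s, "Current": "Y"})
    return target

target = [{"VQ_name": "Q1", "contact_type": "Unknown",
           "Current": "Y", "eed_id": 20700101}]
source = [{"VQ_name": "Q1", "contact_type": "Voice", "eed_id": 20140423},
          {"VQ_name": "Q2", "contact_type": "Unknown", "eed_id": 20700101}]
scd2_merge(target, source, today=20140423)
```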
Any help to resolve issue is appreciated...
Thanks,
Vishal.

You need to show the DDL of your target table: dbo.tblSwDM_dim_VQ_test.
Do you have a column named [Current] in this table? -
Has anyone ever implemented SCD Type-4 using SSIS??
Hello Experts!!
I have been trying to implement SCD Type 4 using SSIS and really got stuck; I searched online for help and, to my surprise, there isn't anything up on this topic.
I know the theory behind SCD Type 4 is to maintain history in a separate table for rapidly changing dimensions.
Please help if any of you has ever implemented SCD Type 4 using SSIS.

Hi,
The stock Slowly Changing Dimension Transformation of SSIS only supports SCD Type 1 and Type 2. SCD Type 4 maintains two tables: one to keep the current data, and the other to keep the historical data. As a workaround, you can implement SCD Type 1 via the SCD Transformation and implement Change Data Capture at the same time. SSIS also provides the CDC Control Task and related Data Flow components.
References:
http://www.bidn.com/blogs/TomLannen/bidn-blog/2606/slowly-changing-dimension-type-1-changes-using-ssis-scd-component
http://www.mattmasson.com/2011/12/cdc-in-ssis-for-sql-server-2012-2/
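To make the two-table idea concrete, here is a minimal generic sketch (hypothetical column names; this is not SSIS component code):

```python
def scd4_update(current, history, key, new_attrs, change_ts):
    """SCD Type 4: overwrite the current-table row in place, pushing
    the outgoing values into a separate history table."""
    old = current.get(key)
    if old is not None and old != new_attrs:
        # Archive the outgoing version with the time it was superseded.
        history.append({"key": key, **old, "superseded_at": change_ts})
    current[key] = new_attrs
    return current, history

current, history = {}, []
scd4_update(current, history, "C1", {"segment": "SMB"}, "2014-01-01")
scd4_update(current, history, "C1", {"segment": "ENT"}, "2014-06-01")
```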
Regards,
Mike Yin
TechNet Community Support -
Hello,
I am trying out the SCD type 2 in OWB 10g R2. I have no hierarchies and hence created a dimension with one level. Apart from three business attributes I have 2 mandatory attributes as "EFFECTIVE_DATE" and "EXPIRY_DATE". I have set one of the business attributes to "Trigger History".
I have 3000 rows in the source table. There is no transformation; the data is loaded from the source into this dimension directly. One business attribute is loaded using a constant.
When I executed the mapping, all 3000 rows were populated in the target with the expiry date column holding null values and the current date in the effective date, which is absolutely fine. When I execute the mapping again without changing anything in the source, rows are inserted into the target with the effective date set to the second run's date and the expiry date set to the previous effective date. As I understand it, new rows should be inserted only if there is a change in the data.
Please correct me if I am wrong. Please clarify and if my understanding is right where am I doing wrong with OWB?
Your help is greatly appreciated
Thanks a lot on advance!
Regards,
Maruthi

Hi Roelant,
I think it is important to be aware that although Paris - 10gR2 - is not actually buggy (in this respect!), it is really quite idiosyncratic in exactly how it processes SCDs.
I followed up on your and Mark's comments, and did an in depth analysis of this topic. It is at http://www.donnapkelly.pwp.blueyonder.co.uk/documents/OWB_10gR2_SCD.pdf
My conclusions are perhaps of interest to anyone considering doing SCD processing with Paris.
I'll be doing a follow-up this weekend, and publishing a sort of 'how-to-do-it' guide.
Cheers,
Donna
Message was edited to add the words: "in this respect" -
Hi ,
I'm trying to implement SCD Type 2 using MERGE, but unlike a typical MERGE where you have one target and one source table, my inserts come from one table and updates/changes are determined from another table. I have an issue with the updates.
below is structure of three tables :
Dimension Table :
VQ_id, VQ_name,
contact_type, center_node_id,
sed_id, eed_id,
IsCurrent, insert_date
VQ_Id is the dimension ID based on which inserts and updates are determined.
VQ_Name: type 1 change
Contact_type, Center_node_ID: type 2 changes
IsCurrent: flag
sed_id, eed_id: start and end effective date IDs
Insert table :
VQ_id,VQ_Name ,Contact_Type , Center_node_ID , Sed_id , eed_id , Insert_date
From the above table, new records are determined based on VQ_ID.
Updates/history records :
Type 2 changes are tracked based on below table..
VQ_ID, contact_type,
center_node_id, Start_Effective_Date,
CT_ID, Submit_Date
Based on VQ_ID , contact_type, center_node_id,
Start_Effective_Date , end effective date are determined.
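In other words, each version's end date comes from the next change row for the same key. A rough sketch of that derivation (simplified: it uses the next start date as-is, whereas my real data closes a version the day before the next start; names are hypothetical):

```python
def derive_end_dates(changes, open_end=20700101):
    """Given change rows (vq_id, sed_id), return (vq_id, sed_id, eed_id)
    where eed_id is the next change's start date for the same vq_id,
    or an open-ended sentinel for the current version."""
    by_key = {}
    for vq_id, sed in changes:
        by_key.setdefault(vq_id, []).append(sed)
    out = []
    for vq_id, seds in sorted(by_key.items()):
        seds.sort()
        for i, sed in enumerate(seds):
            eed = seds[i + 1] if i + 1 < len(seds) else open_end
            out.append((vq_id, sed, eed))
    return out

result = derive_end_dates([(376946, 20140424), (376946, 20131018)])
```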
Any help in this regard is appreciated...
Thanks ,
Vishal.

-- This is the dimension table
CREATE TABLE [dbo].[tblSwDM_dim_VQ](
[VQ_dim_id] [int] IDENTITY(1,1) NOT NULL,
[VQ_id] [int] NOT NULL,
[VQ_name] [varchar](50) NOT NULL,
[contact_type] [varchar](50) NULL,
[center_node_id] [int] NULL,
[sed_id] [int] NULL,
[eed_id] [int] NULL,
[IsCurrent] [bit] NOT NULL,
[insert_date] [datetime] NULL,
[Start_Effective_Date] AS (CONVERT([datetime],CONVERT([varchar](8),[sed_id],(0)),(0))),
[End_Effective_Date] AS (CONVERT([datetime],CONVERT([varchar](8),[eed_id],(0)),(0))),
CONSTRAINT [Pk_tblswDM_dim_VQ] PRIMARY KEY CLUSTERED
(
[VQ_dim_id] ASC
) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
-- This is from where updates/type 2 changes would be loaded
CREATE TABLE [dbo].[tblSwDM_stg_change_control_next_gen](
[row_id] [int] IDENTITY(1,1) NOT NULL,
[VQ_id] [int] NOT NULL,
[contact_type] [varchar](50) NOT NULL,
[center_node_id] [int] NOT NULL,
[Start_Effective_Date] [datetime] NOT NULL,
[CT_ID] [int] NULL,
[Submit_Date] [datetime] NOT NULL,
[isValid] [bit] NULL,
[Remarks] [varchar](100) NULL
) ON [PRIMARY]
Example...
for a Record in dimention table... [dbo].[tblSwDM_dim_VQ]
Before Updates :
VQ_dim_id VQ_id VQ_name contact_type center_node_id sed_id eed_id IsCurrent insert_date Start_Effective_Date End_Effective_Date
2203 376946 Fraud_Span_Det_VQ RFD USCC Detection 4536 20131018 20700101 1 2014-03-21 12:02:42.750 2013-10-18 00:00:00.000 2070-01-01 00:00:00.000
Final Result :
VQ_dim_id VQ_id VQ_name contact_type center_node_id sed_id eed_id IsCurrent insert_date Start_Effective_Date End_Effective_Date
2203 376946 Fraud_Span_Det_VQ RFD USCC Detection 4536 20131018 20140423 0 2014-03-21 12:02:42.750 2013-10-18 00:00:00.000 2014-04-23 00:00:00.000
2605 376946 Fraud_Span_Det_VQ RFS USCC Spanish 4537 20140424 20700101 1 2014-05-07 11:51:00.543 2014-04-24 00:00:00.000 2070-01-01 00:00:00.000 -
How to send different type of data
Hello, everyone,
I am trying to send different types of data between client and server.
For example, from server to client, I need to send some String data, using PrintWriter.println(String n); and send some other stuff using OutputStream.write(byte[] b, int x, int y). What I did is setting up a socket, then send them one by one. But it didn't work.
I am just wondering if there is anything I need to do right after I finish sending the String data and before I start sending the other data using the OutputStream. (If I send only one of them it works, but when I send them together the error always happens to the second one, no matter whether I use the PrintWriter or the OutputStream first.)

To send mixed types of data by hand, always using the same output/input stream, you could do:
on server side
ServerSocket serverSocket = null;
Socket clientSocket = null;
OutputStream out = null;
try {
    /* set up a socket */
    serverSocket = new ServerSocket(4444);
    clientSocket = serverSocket.accept();
    out = new BufferedOutputStream(clientSocket.getOutputStream());
    /* send a byte */
    int send1 = 3;
    out.write(send1);
    /* send a string with a line termination */
    String send2 = "a string sample";
    out.write(send2.getBytes("ISO-8859-1"));
    out.write('\n');
    out.flush();
} finally {
    try { out.close(); } catch (Exception e) {}
    try { clientSocket.close(); } catch (Exception e) {}
    try { serverSocket.close(); } catch (Exception e) {}
}
on client side
Socket clientSocket = null;
InputStream in = null;
try {
    clientSocket = new Socket("localhost", 4444);
    in = new BufferedInputStream(clientSocket.getInputStream());
    /* receive the byte */
    int receive1 = in.read();
    System.out.println("The received message #1 is: " + receive1);
    /* receive the string up to the line termination */
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int c;
    while (((c = in.read()) != '\n') && (c != -1))
        baos.write(c);
    String receive2 = baos.toString("ISO-8859-1");
    System.out.println("the received message #2 is: " + receive2);
} finally {
    try { in.close(); } catch (Exception e) {}
    try { clientSocket.close(); } catch (Exception e) {}
} -
Hi Experts,
I have a requirement to implement SCD Type 2.
I have Active_Customer column in customer Dimension table.
The values of Active_Customer are either '0' or '1'.
If Active_Customer is '0', it means that the customer is at present not doing business; his status is deactivated.
If Active_Customer is '1', it means that the customer is at present doing business; his status is activated.
In the SCD2 behavior, the Active_Customer column property has been set to 'add row on change'.
My specific requirement is this:
I need to add another column in Customer_Dim only when Active_Customer changes to '1',
but I don't want to add a column in Customer_Dim when Active_Customer changes to '0'.
Regards
Zakeer
cell No: <removed by moderator>
<email address removed by moderator>
personal information stays visible in a forum... forever.
bad people harvest that information and spam you ... forever

Hi,
The requirement mentioned in your query does not qualify for SCD Type 2.
Rather, you are trying to maintain history at the column level .. is this correct?
If yes, then you are trying to achieve SCD Type 3.
http://odiexperts.com/scd-type-3/
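The difference in a nutshell: Type 3 keeps only the previous value in a companion column instead of adding rows. A minimal sketch (generic names, not ODI-generated code):

```python
def scd3_update(row, new_value):
    """SCD Type 3: overwrite in place, keeping only the immediately
    previous value in a companion column."""
    if row["active_customer"] != new_value:
        row["prev_active_customer"] = row["active_customer"]
        row["active_customer"] = new_value
    return row

row = {"customer_id": 42, "active_customer": "0",
       "prev_active_customer": None}
scd3_update(row, "1")
```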
Thanks,
Sutirtha -
"no data from device" when trying to capture HDV
I need to capture additional footage for an HDV project I am already editing. Capturing the initial footage (about a month ago) went smoothly. Now however, when trying to capture footage from a new tape shot just last week, I simply cannot. If I try to Capture Now, I get a "No Data from Device" message and the capture attempt aborts. If I log some clips and Batch Capture, the material cues and begins to play but the capture window just continues to say CUEING SOURCE MATERIAL and captures nothing. I am using the Sony HVR-M10U deck as a capture device.
The only thing that has changed between the original capture and this latest attempt is that I have also been working on a DV project at the same time, and I used the M10U to capture some DV footage for that project. In order to successfully capture DV footage I had to force the deck to play only DV (by changing the VCR settings in the menu).
To try and capture this latest HDV footage, I went back into the menu and forced the deck to play HDV only. In searching for answers I came across this Apple article (http://docs.info.apple.com/article.html?artnum=302407) and followed those exact steps.
However, nothing seems to work. I do have device control and the timecode is being read off the deck in Log & Capture (so the firewire connection is fine), but there's no video in the preview window and no capturing is possible. I am able to open my DV project, reset the menu on the deck, and capture DV footage - but I can't for the life of me switch to capturing HDV footage.
Any bright ideas would be much appreciated - I'm really at wits end.
thanks,
Tom

I am having a similar problem with my Sony HVRZ1U: I can capture DV through FireWire no problem, but when I choose the Easy Setup for HDV it does not recognize the device.
Funny thing is, I tried to download clips on a PC in Premiere with all the similar settings on my camera and it works fine; I have deck control and can capture HDV footage, so that rules out a problem with the camera. Back to FCP 5 with the same settings, and nothing.
Went through uninstalling quicktime 7.03 back to 7.01 - nothing
Deleted the preferences in FCP5 - nothing
Checked the firewire connections and cables - they work fine
Using all the right settings on the camera and in FCP5
The tape format was shot correctly
Any light on what could be going wrong would be very very much appreciated
Al -
Hi,
I'm relatively new to OBIEE and trying to implement Slowly Changing Dimension Type 2, i.e. to look up the correct record in the customer dimension (A_FICC_ACCOUNT) based on the transaction date in the fact table (A_FICC_PROFITABILITY). The customer dimension has two timestamps (START_EFFECTIVE_DATE, END_EFFECTIVE_DATE), where END_EFFECTIVE_DATE will be set to NULL for the most recent record.
So far I've set this up as a complex join in the physical layer, as follows:
A_FICC_ACCOUNT.ACCOUNT_ID = A_FICC_PROFITABILITY.ACCOUNT_ID
AND
A_FICC_PROFITABILITY.DAYID >= A_FICC_ACCOUNT.START_EFFECTIVE_DATE AND A_FICC_PROFITABILITY.DAYID < A_FICC_ACCOUNT.END_EFFECTIVE_DATE
OR
A_FICC_PROFITABILITY.DAYID >= A_FICC_ACCOUNT.START_EFFECTIVE_DATE AND A_FICC_ACCOUNT.END_EFFECTIVE_DATE IS NULL
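The intent of that condition is a point-in-time lookup: a fact row should match the dimension version whose [start, end) range contains the transaction date, with a NULL end date meaning the open-ended current version. As a sketch (names follow my tables):

```python
def lookup_version(versions, account_id, day):
    """Return the dimension row valid on `day`:
    start_effective_date <= day < end_effective_date,
    where a NULL (None) end date marks the current version."""
    for v in versions:
        if v["account_id"] != account_id:
            continue
        if v["start_effective_date"] <= day and (
                v["end_effective_date"] is None
                or day < v["end_effective_date"]):
            return v
    return None

versions = [
    {"account_id": 1, "start_effective_date": 20130101,
     "end_effective_date": 20140101, "short_name": "OLD"},
    {"account_id": 1, "start_effective_date": 20140101,
     "end_effective_date": None, "short_name": "NEW"},
]
```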
When i run a report in Answers, the generated SQL is as follows:
select distinct T2327.YEARCAPTION as c1,
T13994.SHORT_NAME as c2,
T13994.RANK_W12 as c3,
T13994.RANK_W52 as c4,
T2327.YEARID as c5
from
DIM_FICC_ACCOUNT T13994 /* A_FICC_ACCOUNT */ ,
DIM_TIME T2327 /* A_FICC_TIME */ ,
FACT_FICC_PROFITABILITY T13406 /* A_FICC_PROFITABILITY */
where ( T2327.SKEY = T13406.TIME_ID and T13406.ACCOUNT_ID = T13994.ACCOUNT_ID and T13994.SHORT_NAME = 'PETEROLA' and (T13406.DAYID < T13994.END_EFFECTIVE_DATE or T13994.END_EFFECTIVE_DATE is null) and T13406.DAYID >= T13994.START_EFFECTIVE_DATE )
order by c5, c2, c3, c4
Run directly against the (Oracle) database using a SQL client, it gives meaningful results according to the recorded historical changes in the dimension (see attached screenshot). In Answers, however, the results are not correct (see attached screenshot).
http://dl.dropbox.com/u/3345113/output.jpg
Any tips as to what might be wrong would be greatly appreciated.
best regards
Magnus

Magnus,
As mentioned, the customer dimension (A_FICC_ACCOUNT) is an SCD Type 2; what is the primary key on this table? I don't think it is ACCOUNT_ID alone.
Usually there exists a surrogate key to keep track of any changes; thus it is on this surrogate key that a join should exist with the fact table, and applying a filter of end_effective_date IS NULL should produce the correct result.
If the above scenario is not true in your case, do let me know the structure of the customer dimension and fact table (only the columns associated with the customer dimension), with relationship information please.
J
-bifacts
http://www.obinotes.com -
Hi all,
I'm getting the following error while validating a scd type 2 dimension:
"VLD-0363: For Slowly Changing Strategies Type-II Dimension DIM_A, if any of the trigger, effective date or expiration date is set for a level then all of them must be set"
This is because I'm trying to create a dimension similar to this:
DIM_A
LEVEL_A
DT_BEGIN --effective date
DT_END --expiration date
COD_A
ID_A
DESC_A
LEVEL_B
DT_BEGIN --effective date
DT_END --expiration date
COD_B
ID_B
DESC_B
LEVEL_A_COD_A -- trigger history
I don't want to keep history on the descriptions because I can assume that, if the description changes, then it must be a typo or some kind of correction. I just want to keep history of the changes in the hierarchy.
OWB doesn't let me do this...
Does this make any sense to you? What is your opinion? Have you ever try to do something like this? Are there any workarounds?
Thank you for your comments.
Best Regards.

Hi,
my advice is to map the DIM1_NAT_KEY inside the Fact Table of the Business Model, so you have a new Logical Table Source inside the Logical Fact Table that maps the DIM1_NAT_KEY as a measure. Define the Level for this Logical Table Source and set the COUNT DISTINCT aggregation. In this way OBIEE knows that the measure is inside a fact and treats it like that.
I hope it helps.
Regards,
Gianluca -
How to implement SCD type 2 in OWB 11g
Hi all,
I would like to know that how to implement SCD type 2 in OWB 11g.
Actually, I have tried to implement it, but the effective_date and expiration_date columns in the target table are null after running the mapping.
I have set the effective date and expiration date settings in the SCD tab of dimension object.
Kindly help me the same if anyone knows.
Kind regards,
shikha

You were able to get OWB 11g to move data?
Are you running on a 64-bit Windows server? I could not get it to do a basic data move on that type of server install.
Has anybody got SCD Type 2s to perform quickly using the dimension operator?
Hi there,
We are hitting major performance problems running mappings to populate SCD Type 2 dimensions when they have large amounts of pre-existing data.
Has anybody got this performing acceptably? I tried indexing, but to no avail.
Many Thanks

Hi there,
Thanks for getting back to me. I found the patch, and this patch had already been applied.
An example of the SQL being generated in a really simple mapping with the dimension operator for small tables is as follows:
MERGE
/*+ APPEND PARALLEL("NS_0") */
INTO
"RETAILER_PUBLISHER_NS"
USING
(SELECT
"MERGE_DELTA_ROW_0"."NS_OUTLET_SRC_ID$1" "NS_OUTLET_SRC_ID",
"MERGE_DELTA_ROW_0"."NS_PUBLISHER_CODE$1" "NS_PUBLISHER_CODE",
"MERGE_DELTA_ROW_0"."NS_TITLE_CLASSIFICATION_CODE$1" "NS_TITLE_CLASSIFICATION_CODE",
"MERGE_DELTA_ROW_0"."NS_SUPPLY_FLAG$1" "NS_SUPPLY_FLAG",
"MERGE_DELTA_ROW_0"."NS_EFF_DATE$1" "NS_EFF_DATE",
"MERGE_DELTA_ROW_0"."NS_EXP_DATE$1" "NS_EXP_DATE",
"MERGE_DELTA_ROW_0"."NS_ID$1" "NS_ID"
FROM
(SELECT
"NS_ID" "NS_ID$1",
"NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID$1",
"NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE$1",
"NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CODE$1",
"NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG$1",
"NS_EFF_DATE" "NS_EFF_DATE$1",
"NS_EXP_DATE" "NS_EXP_DATE$1"
FROM
(SELECT
(Case When (("SPLITTER_INPUT_SUBQUERY"."NS_ID_0_0" IS NULL) OR ((("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS')) OR ("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NOT NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS') AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0", 'J.HH24.MI.SS') >= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS'))) AND (("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" != "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2") OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" != "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0")))) then ("SPLITTER_INPUT_SUBQUERY"."NS_ID_1") else ("SPLITTER_INPUT_SUBQUERY"."NS_ID_0_0") end)/* MERGE_DELTA_ROW.OUTGRP1.NS_ID */ "NS_ID",
"SPLITTER_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID_1"/* MERGE_DELTA_ROW.OUTGRP1.NS_OUTLET_SRC_ID */ "NS_OUTLET_SRC_ID",
"SPLITTER_INPUT_SUBQUERY"."NS_PUBLISHER_CODE_1"/* MERGE_DELTA_ROW.OUTGRP1.NS_PUBLISHER_CODE */ "NS_PUBLISHER_CODE",
"SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1"/* MERGE_DELTA_ROW.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */ "NS_TITLE_CLASSIFICATION_CODE",
"SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1"/* MERGE_DELTA_ROW.OUTGRP1.NS_SUPPLY_FLAG */ "NS_SUPPLY_FLAG",
(Case When (("SPLITTER_INPUT_SUBQUERY"."NS_ID_0_0" IS NULL)) then ((case when ("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1" < SYSDATE ) then ("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1") else ( SYSDATE ) end)) when ((("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS')) OR ("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NOT NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS') AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0", 'J.HH24.MI.SS') >= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS'))) AND (("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" != "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2") OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" != "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0"))) then ("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1") else ("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0") end)/* MERGE_DELTA_ROW.OUTGRP1.NS_EFF_DATE */ "NS_EFF_DATE",
(Case When ((ROW_NUMBER() OVER (PARTITION BY "SPLITTER_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID_1","SPLITTER_INPUT_SUBQUERY"."NS_PUBLISHER_CODE_1" ORDER BY "SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1" DESC)) = 1) then (Case When (("SPLITTER_INPUT_SUBQUERY"."NS_ID_0_0" IS NULL) OR ((("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS')) OR ("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NOT NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS') AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0", 'J.HH24.MI.SS') >= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS'))) AND (("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" != "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2") OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" != "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0")))) then ( TO_DATE('31-DEC-4000') ) else ("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0") end) else (("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1" - INTERVAL '1' SECOND)) end)/* MERGE_DELTA_ROW.OUTGRP1.NS_EXP_DATE */ "NS_EXP_DATE"
FROM
(SELECT
"INGRP1"."NS_ID" "NS_ID_1",
"INGRP1"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID_1",
"INGRP1"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE_1",
"INGRP1"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CO_1",
"INGRP1"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG_1",
"INGRP1"."NS_EFF_DATE" "NS_EFF_DATE_1",
"INGRP1"."NS_EXP_DATE" "NS_EXP_DATE_1",
"INGRP2"."NS_ID" "NS_ID_0_0",
"INGRP2"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID_0_0",
"INGRP2"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE_0_0",
"INGRP2"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CO_2",
"INGRP2"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG_0_0",
"INGRP2"."NS_EFF_DATE" "NS_EFF_DATE_0_0",
"INGRP2"."NS_EXP_DATE" "NS_EXP_DATE_0_0",
"INGRP2"."DIMENSION_KEY" "DIMENSION_KEY_0"
FROM
( SELECT
"RETAILER_PUBLISHER_NS"."NS_ID" "NS_ID",
"RETAILER_PUBLISHER_NS"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID",
"RETAILER_PUBLISHER_NS"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE",
"RETAILER_PUBLISHER_NS"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CODE",
"RETAILER_PUBLISHER_NS"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG",
"RETAILER_PUBLISHER_NS"."NS_EFF_DATE" "NS_EFF_DATE",
"RETAILER_PUBLISHER_NS"."NS_EXP_DATE" "NS_EXP_DATE",
"RETAILER_PUBLISHER_NS"."DIMENSION_KEY" "DIMENSION_KEY"
FROM
"RETAILER_PUBLISHER_NS" "RETAILER_PUBLISHER_NS"
WHERE
( "RETAILER_PUBLISHER_NS"."DIMENSION_KEY" = "RETAILER_PUBLISHER_NS"."NS_ID" ) AND
( "RETAILER_PUBLISHER_NS"."NS_ID" IS NOT NULL ) ) "INGRP2"
RIGHT OUTER JOIN ( SELECT
NULL "NS_ID",
"LOOKUP_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID$2" "NS_OUTLET_SRC_ID",
"LOOKUP_INPUT_SUBQUERY"."NS_PUBLISHER_CODE$2" "NS_PUBLISHER_CODE",
"LOOKUP_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CODE$2" "NS_TITLE_CLASSIFICATION_CODE",
"LOOKUP_INPUT_SUBQUERY"."NS_SUPPLY_FLAG$2" "NS_SUPPLY_FLAG",
"LOOKUP_INPUT_SUBQUERY"."NS_EFF_DATE$2" "NS_EFF_DATE",
"LOOKUP_INPUT_SUBQUERY"."NS_EXP_DATE$2" "NS_EXP_DATE"
FROM
(SELECT
"DEDUP_SRC"."NS_ID$3" "NS_ID$2",
"DEDUP_SRC"."NS_OUTLET_SRC_ID$3" "NS_OUTLET_SRC_ID$2",
"DEDUP_SRC"."NS_PUBLISHER_CODE$3" "NS_PUBLISHER_CODE$2",
"DEDUP_SRC"."NS_TITLE_CLASSIFICATION_CODE$3" "NS_TITLE_CLASSIFICATION_CODE$2",
"DEDUP_SRC"."NS_SUPPLY_FLAG$3" "NS_SUPPLY_FLAG$2",
"DEDUP_SRC"."NS_EFF_DATE$3" "NS_EFF_DATE$2",
"DEDUP_SRC"."NS_EXP_DATE$3" "NS_EXP_DATE$2"
FROM
(SELECT
NULL/* DEDUP_SRC.OUTGRP1.NS_ID */ "NS_ID$3",
("PUB_AGENT_MATRIX_CC"."PAM_CUSTOMER_ID"/* EXPR_SRC.OUTGRP1.NS_OUTLET_SRC_ID */)/* DEDUP_SRC.OUTGRP1.NS_OUTLET_SRC_ID */ "NS_OUTLET_SRC_ID$3",
((to_char("PUB_AGENT_MATRIX_CC"."PAM_PUBLISHER_CODE")/* EXP.OUTGRP1.PAM_PUBLISHER_CODE */)/* EXPR_SRC.OUTGRP1.NS_PUBLISHER_CODE */)/* DEDUP_SRC.OUTGRP1.NS_PUBLISHER_CODE */ "NS_PUBLISHER_CODE$3",
("PUB_AGENT_MATRIX_CC"."PAM_TITLCLAS_CODE"/* EXPR_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */)/* DEDUP_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */ "NS_TITLE_CLASSIFICATION_CODE$3",
("PUB_AGENT_MATRIX_CC"."PAM_SUPPLY_FLAG"/* EXPR_SRC.OUTGRP1.NS_SUPPLY_FLAG */)/* DEDUP_SRC.OUTGRP1.NS_SUPPLY_FLAG */ "NS_SUPPLY_FLAG$3",
MIN(("PUB_AGENT_MATRIX_CC"."PAM_EFFECTIVE_DATE"/* EXPR_SRC.OUTGRP1.NS_EFF_DATE */)) KEEP (DENSE_RANK FIRST ORDER BY NULL/* EXPR_SRC.OUTGRP1.NS_ID */)/* DEDUP_SRC.OUTGRP1.NS_EFF_DATE */ "NS_EFF_DATE$3",
NULL/* DEDUP_SRC.OUTGRP1.NS_EXP_DATE */ "NS_EXP_DATE$3"
FROM
"REFSTG"."PUB_AGENT_MATRIX_CC" "PUB_AGENT_MATRIX_CC"
WHERE
( "PUB_AGENT_MATRIX_CC"."PAM_ADD_REMOVE_FLAG" = 'A' )
GROUP BY
("PUB_AGENT_MATRIX_CC"."PAM_CUSTOMER_ID"/* EXPR_SRC.OUTGRP1.NS_OUTLET_SRC_ID */), ((to_char("PUB_AGENT_MATRIX_CC"."PAM_PUBLISHER_CODE")/* EXP.OUTGRP1.PAM_PUBLISHER_CODE */)/* EXPR_SRC.OUTGRP1.NS_PUBLISHER_CODE */), ("PUB_AGENT_MATRIX_CC"."PAM_TITLCLAS_CODE"/* EXPR_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */), ("PUB_AGENT_MATRIX_CC"."PAM_SUPPLY_FLAG"/* EXPR_SRC.OUTGRP1.NS_SUPPLY_FLAG */),NULL,NULL/* RETAILER_PUBLISHER_NS.DEDUP_SRC */) "DEDUP_SRC") "LOOKUP_INPUT_SUBQUERY"
WHERE
( NOT ( "LOOKUP_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID$2" IS NULL AND "LOOKUP_INPUT_SUBQUERY"."NS_PUBLISHER_CODE$2" IS NULL ) ) ) "INGRP1" ON ( ( ( "INGRP2"."NS_EFF_DATE" IS NULL OR ( ( "INGRP2"."NS_EXP_DATE" IS NULL AND TO_CHAR ( "INGRP2"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) ) OR ( "INGRP2"."NS_EXP_DATE" IS NOT NULL AND TO_CHAR ( "INGRP2"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) AND TO_CHAR ( "INGRP2"."NS_EXP_DATE" , 'J.HH24.MI.SS' ) >= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) ) ) ) ) AND ( ( "INGRP2"."NS_PUBLISHER_CODE" = "INGRP1"."NS_PUBLISHER_CODE" ) ) AND ( ( "INGRP2"."NS_OUTLET_SRC_ID" = "INGRP1"."NS_OUTLET_SRC_ID" ) ) )) "SPLITTER_INPUT_SUBQUERY"
WHERE
( ( ( "SPLITTER_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID_1" = "SPLITTER_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID_0_0" AND "SPLITTER_INPUT_SUBQUERY"."NS_PUBLISHER_CODE_1" = "SPLITTER_INPUT_SUBQUERY"."NS_PUBLISHER_CODE_0_0" ) ) OR ( "SPLITTER_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID_0_0" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_PUBLISHER_CODE_0_0" IS NULL ) )
UNION
SELECT
"DEDUP_SCD_SRC"."NS_ID$4" "NS_ID",
"DEDUP_SCD_SRC"."NS_OUTLET_SRC_ID$4" "NS_OUTLET_SRC_ID",
"DEDUP_SCD_SRC"."NS_PUBLISHER_CODE$4" "NS_PUBLISHER_CODE",
"DEDUP_SCD_SRC"."NS_TITLE_CLASSIFICATION_CODE$4" "NS_TITLE_CLASSIFICATION_CODE",
"DEDUP_SCD_SRC"."NS_SUPPLY_FLAG$4" "NS_SUPPLY_FLAG",
"DEDUP_SCD_SRC"."NS_EFF_DATE$4" "NS_EFF_DATE",
"DEDUP_SCD_SRC"."NS_EXP_DATE$4" "NS_EXP_DATE"
FROM
(SELECT
"AGG_INPUT"."NS_ID$5"/* DEDUP_SCD_SRC.OUTGRP1.NS_ID */ "NS_ID$4",
"AGG_INPUT"."NS_OUTLET_SRC_ID$5"/* DEDUP_SCD_SRC.OUTGRP1.NS_OUTLET_SRC_ID */ "NS_OUTLET_SRC_ID$4",
"AGG_INPUT"."NS_PUBLISHER_CODE$5"/* DEDUP_SCD_SRC.OUTGRP1.NS_PUBLISHER_CODE */ "NS_PUBLISHER_CODE$4",
MIN("AGG_INPUT"."NS_TITLE_CLASSIFICATION_CODE$5") KEEP (DENSE_RANK FIRST ORDER BY "AGG_INPUT"."NS_TITLE_CLASSIFICATION_CODE$5" NULLS LAST)/* DEDUP_SCD_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */ "NS_TITLE_CLASSIFICATION_CODE$4",
MIN("AGG_INPUT"."NS_SUPPLY_FLAG$5") KEEP (DENSE_RANK FIRST ORDER BY "AGG_INPUT"."NS_SUPPLY_FLAG$5" NULLS LAST)/* DEDUP_SCD_SRC.OUTGRP1.NS_SUPPLY_FLAG */ "NS_SUPPLY_FLAG$4",
MIN("AGG_INPUT"."NS_EFF_DATE$5") KEEP (DENSE_RANK FIRST ORDER BY "AGG_INPUT"."NS_EFF_DATE$5" NULLS LAST)/* DEDUP_SCD_SRC.OUTGRP1.NS_EFF_DATE */ "NS_EFF_DATE$4",
MIN("AGG_INPUT"."NS_EXP_DATE$5") KEEP (DENSE_RANK FIRST ORDER BY "AGG_INPUT"."NS_EXP_DATE$5" NULLS LAST)/* DEDUP_SCD_SRC.OUTGRP1.NS_EXP_DATE */ "NS_EXP_DATE$4"
FROM
(SELECT
"SPLITTER_INPUT_SUBQUERY$1"."NS_ID_0_0$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_ID */ "NS_ID$5",
"SPLITTER_INPUT_SUBQUERY$1"."NS_OUTLET_SRC_ID_1$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_OUTLET_SRC_ID */ "NS_OUTLET_SRC_ID$5",
"SPLITTER_INPUT_SUBQUERY$1"."NS_PUBLISHER_CODE_1$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_PUBLISHER_CODE */ "NS_PUBLISHER_CODE$5",
"SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_2$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */ "NS_TITLE_CLASSIFICATION_CODE$5",
"SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_0_0$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_SUPPLY_FLAG */ "NS_SUPPLY_FLAG$5",
"SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_0_0$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_EFF_DATE */ "NS_EFF_DATE$5",
("SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_1$1" - INTERVAL '1' SECOND)/* UPDATE_DELTA_ROW.OUTGRP1.NS_EXP_DATE */ "NS_EXP_DATE$5"
FROM
(SELECT
"INGRP1"."NS_ID" "NS_ID_1$1",
"INGRP1"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID_1$1",
"INGRP1"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE_1$1",
"INGRP1"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CO_1$1",
"INGRP1"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG_1$1",
"INGRP1"."NS_EFF_DATE" "NS_EFF_DATE_1$1",
"INGRP1"."NS_EXP_DATE" "NS_EXP_DATE_1$1",
"INGRP2"."NS_ID" "NS_ID_0_0$1",
"INGRP2"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID_0_0$1",
"INGRP2"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE_0_0$1",
"INGRP2"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CO_2$1",
"INGRP2"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG_0_0$1",
"INGRP2"."NS_EFF_DATE" "NS_EFF_DATE_0_0$1",
"INGRP2"."NS_EXP_DATE" "NS_EXP_DATE_0_0$1",
"INGRP2"."DIMENSION_KEY" "DIMENSION_KEY_0$1"
FROM
( SELECT
"RETAILER_PUBLISHER_NS"."NS_ID" "NS_ID",
"RETAILER_PUBLISHER_NS"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID",
"RETAILER_PUBLISHER_NS"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE",
"RETAILER_PUBLISHER_NS"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CODE",
"RETAILER_PUBLISHER_NS"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG",
"RETAILER_PUBLISHER_NS"."NS_EFF_DATE" "NS_EFF_DATE",
"RETAILER_PUBLISHER_NS"."NS_EXP_DATE" "NS_EXP_DATE",
"RETAILER_PUBLISHER_NS"."DIMENSION_KEY" "DIMENSION_KEY"
FROM
"RETAILER_PUBLISHER_NS" "RETAILER_PUBLISHER_NS"
WHERE
( "RETAILER_PUBLISHER_NS"."DIMENSION_KEY" = "RETAILER_PUBLISHER_NS"."NS_ID" ) AND
( "RETAILER_PUBLISHER_NS"."NS_ID" IS NOT NULL ) ) "INGRP2"
RIGHT OUTER JOIN ( SELECT
NULL "NS_ID",
"LOOKUP_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID$2" "NS_OUTLET_SRC_ID",
"LOOKUP_INPUT_SUBQUERY"."NS_PUBLISHER_CODE$2" "NS_PUBLISHER_CODE",
"LOOKUP_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CODE$2" "NS_TITLE_CLASSIFICATION_CODE",
"LOOKUP_INPUT_SUBQUERY"."NS_SUPPLY_FLAG$2" "NS_SUPPLY_FLAG",
"LOOKUP_INPUT_SUBQUERY"."NS_EFF_DATE$2" "NS_EFF_DATE",
"LOOKUP_INPUT_SUBQUERY"."NS_EXP_DATE$2" "NS_EXP_DATE"
FROM
(SELECT
"DEDUP_SRC"."NS_ID$3" "NS_ID$2",
"DEDUP_SRC"."NS_OUTLET_SRC_ID$3" "NS_OUTLET_SRC_ID$2",
"DEDUP_SRC"."NS_PUBLISHER_CODE$3" "NS_PUBLISHER_CODE$2",
"DEDUP_SRC"."NS_TITLE_CLASSIFICATION_CODE$3" "NS_TITLE_CLASSIFICATION_CODE$2",
"DEDUP_SRC"."NS_SUPPLY_FLAG$3" "NS_SUPPLY_FLAG$2",
"DEDUP_SRC"."NS_EFF_DATE$3" "NS_EFF_DATE$2",
"DEDUP_SRC"."NS_EXP_DATE$3" "NS_EXP_DATE$2"
FROM
(SELECT
NULL/* DEDUP_SRC.OUTGRP1.NS_ID */ "NS_ID$3",
("PUB_AGENT_MATRIX_CC"."PAM_CUSTOMER_ID"/* EXPR_SRC.OUTGRP1.NS_OUTLET_SRC_ID */)/* DEDUP_SRC.OUTGRP1.NS_OUTLET_SRC_ID */ "NS_OUTLET_SRC_ID$3",
((to_char("PUB_AGENT_MATRIX_CC"."PAM_PUBLISHER_CODE")/* EXP.OUTGRP1.PAM_PUBLISHER_CODE */)/* EXPR_SRC.OUTGRP1.NS_PUBLISHER_CODE */)/* DEDUP_SRC.OUTGRP1.NS_PUBLISHER_CODE */ "NS_PUBLISHER_CODE$3",
("PUB_AGENT_MATRIX_CC"."PAM_TITLCLAS_CODE"/* EXPR_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */)/* DEDUP_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */ "NS_TITLE_CLASSIFICATION_CODE$3",
("PUB_AGENT_MATRIX_CC"."PAM_SUPPLY_FLAG"/* EXPR_SRC.OUTGRP1.NS_SUPPLY_FLAG */)/* DEDUP_SRC.OUTGRP1.NS_SUPPLY_FLAG */ "NS_SUPPLY_FLAG$3",
MIN(("PUB_AGENT_MATRIX_CC"."PAM_EFFECTIVE_DATE"/* EXPR_SRC.OUTGRP1.NS_EFF_DATE */)) KEEP (DENSE_RANK FIRST ORDER BY NULL/* EXPR_SRC.OUTGRP1.NS_ID */)/* DEDUP_SRC.OUTGRP1.NS_EFF_DATE */ "NS_EFF_DATE$3",
NULL/* DEDUP_SRC.OUTGRP1.NS_EXP_DATE */ "NS_EXP_DATE$3"
FROM
"REFSTG"."PUB_AGENT_MATRIX_CC" "PUB_AGENT_MATRIX_CC"
WHERE
( "PUB_AGENT_MATRIX_CC"."PAM_ADD_REMOVE_FLAG" = 'A' )
GROUP BY
("PUB_AGENT_MATRIX_CC"."PAM_CUSTOMER_ID"/* EXPR_SRC.OUTGRP1.NS_OUTLET_SRC_ID */), ((to_char("PUB_AGENT_MATRIX_CC"."PAM_PUBLISHER_CODE")/* EXP.OUTGRP1.PAM_PUBLISHER_CODE */)/* EXPR_SRC.OUTGRP1.NS_PUBLISHER_CODE */), ("PUB_AGENT_MATRIX_CC"."PAM_TITLCLAS_CODE"/* EXPR_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */), ("PUB_AGENT_MATRIX_CC"."PAM_SUPPLY_FLAG"/* EXPR_SRC.OUTGRP1.NS_SUPPLY_FLAG */),NULL,NULL/* RETAILER_PUBLISHER_NS.DEDUP_SRC */) "DEDUP_SRC") "LOOKUP_INPUT_SUBQUERY"
WHERE
( NOT ( "LOOKUP_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID$2" IS NULL AND "LOOKUP_INPUT_SUBQUERY"."NS_PUBLISHER_CODE$2" IS NULL ) ) ) "INGRP1" ON ( ( ( "INGRP2"."NS_EFF_DATE" IS NULL OR ( ( "INGRP2"."NS_EXP_DATE" IS NULL AND TO_CHAR ( "INGRP2"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) ) OR ( "INGRP2"."NS_EXP_DATE" IS NOT NULL AND TO_CHAR ( "INGRP2"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) AND TO_CHAR ( "INGRP2"."NS_EXP_DATE" , 'J.HH24.MI.SS' ) >= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) ) ) ) ) AND ( ( "INGRP2"."NS_PUBLISHER_CODE" = "INGRP1"."NS_PUBLISHER_CODE" ) ) AND ( ( "INGRP2"."NS_OUTLET_SRC_ID" = "INGRP1"."NS_OUTLET_SRC_ID" ) ) )) "SPLITTER_INPUT_SUBQUERY$1"
WHERE
( "SPLITTER_INPUT_SUBQUERY$1"."NS_OUTLET_SRC_ID_1$1" = "SPLITTER_INPUT_SUBQUERY$1"."NS_OUTLET_SRC_ID_0_0$1" AND "SPLITTER_INPUT_SUBQUERY$1"."NS_PUBLISHER_CODE_1$1" = "SPLITTER_INPUT_SUBQUERY$1"."NS_PUBLISHER_CODE_0_0$1" ) AND
( ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EXP_DATE_0_0$1" IS NULL AND TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_0_0$1" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_1$1" , 'J.HH24.MI.SS' ) ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EXP_DATE_0_0$1" IS NOT NULL AND TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_0_0$1" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_1$1" , 'J.HH24.MI.SS' ) AND TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EXP_DATE_0_0$1" , 'J.HH24.MI.SS' ) >= TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_1$1" , 'J.HH24.MI.SS' ) ) ) AND
( ( "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_1$1" IS NULL AND "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_2$1" IS NOT NULL ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_1$1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_2$1" IS NULL ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_1$1" != "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_2$1" ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_1$1" IS NULL AND "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_0_0$1" IS NOT NULL ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_1$1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_0_0$1" IS NULL ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_1$1" != "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_0_0$1" ) )) "AGG_INPUT"
GROUP BY
"AGG_INPUT"."NS_ID$5", "AGG_INPUT"."NS_OUTLET_SRC_ID$5", "AGG_INPUT"."NS_PUBLISHER_CODE$5"/* RETAILER_PUBLISHER_NS.DEDUP_SCD_SRC */) "DEDUP_SCD_SRC") ) "MERGE_DELTA_ROW_0"
MERGE_SUBQUERY
ON (
"RETAILER_PUBLISHER_NS"."NS_OUTLET_SRC_ID" = "MERGE_SUBQUERY"."NS_OUTLET_SRC_ID" AND
"RETAILER_PUBLISHER_NS"."NS_PUBLISHER_CODE" = "MERGE_SUBQUERY"."NS_PUBLISHER_CODE" AND
"RETAILER_PUBLISHER_NS"."NS_EFF_DATE" = "MERGE_SUBQUERY"."NS_EFF_DATE" AND
"RETAILER_PUBLISHER_NS"."NS_ID" = "MERGE_SUBQUERY"."NS_ID"
WHEN MATCHED THEN
UPDATE
SET
"NS_TITLE_CLASSIFICATION_CODE" = "MERGE_SUBQUERY"."NS_TITLE_CLASSIFICATION_CODE",
"NS_SUPPLY_FLAG" = "MERGE_SUBQUERY"."NS_SUPPLY_FLAG",
"NS_EXP_DATE" = "MERGE_SUBQUERY"."NS_EXP_DATE"
WHEN NOT MATCHED THEN
INSERT
("RETAILER_PUBLISHER_NS"."NS_ID",
"RETAILER_PUBLISHER_NS"."NS_OUTLET_SRC_ID",
"RETAILER_PUBLISHER_NS"."NS_PUBLISHER_CODE",
"RETAILER_PUBLISHER_NS"."NS_TITLE_CLASSIFICATION_CODE",
"RETAILER_PUBLISHER_NS"."NS_SUPPLY_FLAG",
"RETAILER_PUBLISHER_NS"."NS_EFF_DATE",
"RETAILER_PUBLISHER_NS"."NS_EXP_DATE",
"RETAILER_PUBLISHER_NS"."DIMENSION_KEY")
VALUES
("RETAILER_PUBLISHER_NS_SEQ".NEXTVAL,
"MERGE_SUBQUERY"."NS_OUTLET_SRC_ID",
"MERGE_SUBQUERY"."NS_PUBLISHER_CODE",
"MERGE_SUBQUERY"."NS_TITLE_CLASSIFICATION_CODE",
"MERGE_SUBQUERY"."NS_SUPPLY_FLAG",
"MERGE_SUBQUERY"."NS_EFF_DATE",
"MERGE_SUBQUERY"."NS_EXP_DATE",
"RETAILER_PUBLISHER_NS_SEQ".CURRVAL)
Explain plan:
MERGE STATEMENT, GOAL = ALL_ROWS 1412 2 286
MERGE DW RETAILER_PUBLISHER_NS
VIEW DW
SEQUENCE DW RETAILER_PUBLISHER_NS_SEQ
HASH JOIN OUTER 1412 2 256
VIEW DW 940 2 170
SORT UNIQUE 940 2 218
UNION-ALL
WINDOW SORT 470 1 133
FILTER
NESTED LOOPS OUTER 468 1 133
VIEW DW 4 1 65
SORT GROUP BY 4 1 25
TABLE ACCESS FULL REFSTG PUB_AGENT_MATRIX_CC 3 1 25
VIEW SYS 464 1 68
VIEW DW 464 1 68
TABLE ACCESS FULL DW RETAILER_PUBLISHER_NS 464 1 43
VIEW DW 469 1 85
SORT GROUP BY 469 1 90
NESTED LOOPS 468 1 90
VIEW DW 4 1 37
SORT GROUP BY 4 1 25
TABLE ACCESS FULL REFSTG PUB_AGENT_MATRIX_CC 3 1 25
VIEW SYS 464 1 53
VIEW DW 464 1 68
TABLE ACCESS FULL DW RETAILER_PUBLISHER_NS 464 1 43
TABLE ACCESS FULL DW RETAILER_PUBLISHER_NS 467 337417 14508931
Is this similar to the SQL generated at your end? Do you use any special loading hints, or anything special with indexing? We have tried standard indexing.
Does this look untoward? Have you any other suggestions?
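For concreteness, the sort of "standard indexing" we tried looks like this (the index name is ours; the column choice just mirrors the equality predicates in the generated ON clause):

```sql
-- Composite index on the natural-key columns the generated join
-- compares with equality (NS_OUTLET_SRC_ID, NS_PUBLISHER_CODE).
CREATE INDEX retailer_publisher_ns_bk_ix
  ON retailer_publisher_ns (ns_outlet_src_id, ns_publisher_code);
```

Note that the generated predicates wrap NS_EFF_DATE and NS_EXP_DATE in TO_CHAR, so an index on the raw date columns alone would not be usable for those conditions.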
Thanks for your interest. -
Hello
I am writing a PL/SQL routine that will perform an SCD Type 1 load.
The Type 1 methodology overwrites old data with new data, and therefore does not track historical data at all. This is most appropriate when correcting certain types of data errors, such as the spelling of a name. (Assuming you won't ever need to know how it used to be misspelled in the past.)
Another example would be of a database table that keeps supplier information.
Supplier_key Supplier_Name Supplier_State
001 Phlogistical Supply Company CA
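In SQL terms, a Type 1 change is just an in-place overwrite of the attribute (table and column names mirror the example above; the new state value is made up):

```sql
-- Type 1: overwrite the old value; no history row is created.
UPDATE supplier
   SET supplier_state = 'IL'
 WHERE supplier_key = '001';
```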
Hence I created two source tables:
create table ssn_load1
( ssn number(10,0),
credit_score number(6,0));
create table ssn_load2
( ssn number(10,0),
credit_score number(6,0));
and the target table
create table ssn_target
( sq_id number(8,0) primary key,
ssn number(10,0),
credit_score number(6,0));
Since I want sq_id to be auto-incremented, I created the following sequence and trigger:
CREATE SEQUENCE test_sequence
START WITH 1
INCREMENT BY 1;
CREATE OR REPLACE TRIGGER test_trigger
BEFORE INSERT
ON ssn_target
REFERENCING NEW AS NEW
FOR EACH ROW
BEGIN
SELECT test_sequence.nextval INTO :NEW.SQ_ID FROM dual;
END;
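With the sequence and trigger in place, an insert need not mention sq_id at all (the sample values here are made up):

```sql
-- sq_id is filled by test_trigger from test_sequence.
INSERT INTO ssn_target (ssn, credit_score)
VALUES (123456789, 720);
```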
Now, in order to perform Type 1, I followed this approach.
You have a source table
tbl1(col1, col2, col3) - where col1 is the key and (col2, col3) are the attributes
and a target table
tbl2(col21, col22, col23) - where col21 is the key and (col22, col23) are the attributes
Do this join for change data capture:
SELECT <col list>,
       CASE
         WHEN tbl2.col21 IS NULL THEN 'NEW'
         WHEN col2 = col22 AND col3 = col23 THEN 'NO CHANGE'
         ELSE 'MODIFIED'
       END AS change_type
FROM tbl1
LEFT OUTER JOIN tbl2
  ON tbl1.col1 = tbl2.col21
Following is my PL/SQL, but I don't know how to write the INSERT and UPDATE statements inside the case branches.
DECLARE
  CURSOR ssn_cur IS
    SELECT s2.ssn,
           s2.credit_score,
           CASE
             WHEN t.ssn IS NULL THEN 'NEW'
             WHEN s2.credit_score = t.credit_score THEN 'NO CHANGE'
             ELSE 'MODIFIED'
           END AS change_type
    FROM ssn_load2 s2
    LEFT OUTER JOIN ssn_target t
      ON s2.ssn = t.ssn;
BEGIN
  FOR r IN ssn_cur LOOP
    CASE r.change_type
      WHEN 'NEW' THEN
        INSERT INTO ssn_target (ssn, credit_score)
        VALUES (r.ssn, r.credit_score);
      WHEN 'MODIFIED' THEN
        UPDATE ssn_target
           SET credit_score = r.credit_score
         WHERE ssn = r.ssn;
      ELSE
        NULL; -- 'NO CHANGE': nothing to do
    END CASE;
  END LOOP;
END;
Please let me know whether this is the right way to proceed.
Will be waiting for a reply.
Thank you!!
Isn't this a MERGE?
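A single MERGE can do the whole Type 1 load in one statement (table and column names come from your DDL above; treating ssn_load2 as the incoming batch is my assumption):

```sql
-- SCD Type 1 in one pass: update matched rows, insert new ones.
-- The trigger on ssn_target still fills sq_id on insert.
MERGE INTO ssn_target t
USING ssn_load2 s
   ON (t.ssn = s.ssn)
 WHEN MATCHED THEN
   UPDATE SET t.credit_score = s.credit_score
 WHEN NOT MATCHED THEN
   INSERT (t.ssn, t.credit_score)
   VALUES (s.ssn, s.credit_score);
```

If you want to skip the "no change" rows, add WHERE DECODE(t.credit_score, s.credit_score, 0, 1) = 1 after the UPDATE; DECODE treats two NULLs as equal, so NULL scores are handled as well.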
In the outer join approach, it might be simpler to do the join in the cursor and keep the PL/SQL simple. I didn't really understand your three-table scenario though. -
Revision: 12685
Author: [email protected]
Date: 2009-12-08 19:23:32 -0800 (Tue, 08 Dec 2009)
Log Message:
Fix for RTE in VideoPlayer when trying to capture bitmaps. Put try-catch block around bitmapData.draw() and use a Rectangle if it throws an error. Also, added some documentation in BitmapUtil for getSnapshot since it may throw the same type of error.
QE notes: No
Doc notes: No
Bugs: SDK-24574
Reviewer: Ryan
Tests run: checkintests
Is noteworthy for integration: No
Ticket Links:
http://bugs.adobe.com/jira/browse/SDK-24574
Modified Paths:
flex/sdk/trunk/frameworks/projects/spark/src/spark/skins/spark/HighlightBitmapCaptureSkin.as
flex/sdk/trunk/frameworks/projects/spark/src/spark/utils/BitmapUtil.as -
Unicode - "DMBTR" must be a character-type field (data type C,N,D or T)
Greetings Experts!
I am trying to convert legacy code to Unicode for a current ERP 6.0 reinstallation and have encountered the syntax error: "DMBTR" must be a character-type field (data type C, N, D or T).
The field is part of a structure and the fields attributes are as follows:
COMPONENT = DMBTR
COMPONENT TYPE = DMBTR
DATA TYPE = CURR
LENGTH = 13
DECIMALS = 2
DESCRIPTION = Amount in Local Currency
The code in question is as follows:
* Macro Move_Zoned.
* Converts a numeric variable to zoned format and moves it to a
* target variable.
DEFINE move_zoned.
* &1 - source variable
* &2 - number of decimal places
* &3 - 'TO'
* &4 - target variable.
write &1 to w_zoned no-grouping decimals &2.
condense w_zoned.
* Remove the decimal points.
search w_zoned for '...'.
while sy-subrc = 0.
move sy-fdpos to w_to_point.
if w_to_point = 0.
w_to_point = 1.
endif.
compute w_from_point = sy-fdpos + 1.
concatenate w_zoned+0(w_to_point)
w_zoned+w_from_point
into w_zoned.
search w_zoned for '...'.
endwhile.
shift w_zoned right deleting trailing space.
translate w_zoned using ' 0'.
call function 'Z_TRANSLATE_ZONED_DECIMALS'
exporting
i_input = w_zoned
importing
i_output = w_zoned
exceptions
x_invalid_zoned_char = c_invalid_zoned_char
x_numeric_info_lost = c_numeric_info_lost
others = c_other_zoned_error.
* Get the length of the recipient field so we don't truncate the
* numbers.
describe field &4 length w_flength in character mode.
describe field &4 type w_type.
describe field w_zoned length w_zoned_len in character mode.
if w_zoned_len <= w_flength.
move w_zoned to &4.
shift &4 right deleting trailing space.
translate &4 using ' 0'.
else.
* Get the start position.
* If it's a packed field, allow for values up to 6 figures.
compute w_zoned_len = w_zoned_len - w_flength.
if w_type = 'P'.
subtract 2 from w_zoned_len.
clear w_type.
endif.
move w_zoned+w_zoned_len &3 &4.
endif.
END-OF-DEFINITION. "Move_zoned
LOOP AT t_single_kunnr.
move_zoned t_single_kunnr-postamt 2
to t_single_kunnr-dmbtr.
DIVIDE t_single_kunnr-dmbtr BY 100.
MODIFY t_single_kunnr.
ENDLOOP.
Is there a solution to get past this syntax error as I would rather not change the datatype of the field in the structure.
Much Obliged
Elphick.
TYPE x is not allowed in Unicode. When a field is declared as TYPE x with VALUE '09' or any other value, it can be resolved by using classes.
Before Unicode
CONSTANTS: c_hex TYPE x VALUE '09'.
Resolution:
It works for any value of x.
First, declare a temporary field of type c. The following class method converts a TYPE x variable into type c.
Example:
CONSTANTS: c_hex TYPE x VALUE '09'.
DATA: LV_TEMP TYPE STRING.
DATA: LV_TMP TYPE C.
TRY.
CALL METHOD CL_ABAP_CONV_IN_CE=>UCCP
EXPORTING
UCCP = c_hex
RECEIVING
CHAR = LV_TMP .
CATCH CX_SY_CONVERSION_CODEPAGE.
CATCH CX_PARAMETER_INVALID_TYPE.
CATCH CX_SY_CODEPAGE_CONVERTER_INIT.
ENDTRY.
CONCATENATE I_OUTPUT-BKTXT I_OUTPUT-BVORG
I_OUTPUT-BUDAT I_OUTPUT-MESSAGE
INTO LV_TEMP SEPARATED BY LV_TMP.
I_BUFFER = LV_TEMP.
CLEAR LV_TEMP.
CLEAR LV_TMP.
OR
Note: the first approach above handles any value; this shortcut covers only the tab character (type x value '09').
CLASS cl_abap_char_utilities DEFINITION LOAD.
CONSTANTS: c_hex TYPE c VALUE
cl_abap_char_utilities=>horizontal_tab.