Best way to merge customer records
Hi community,
What is the best way to merge customer records for the same person who may have used different email ids to correspond with the BC website?
Not in BC, no. You would need to export a custom report, sort it in Excel, shape that into a customer import and import it back in, or create an API that goes through and cleans the data up.
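For what it's worth, the cleanup pass over an exported report can be sketched in a few lines. This is a generic illustration, not BC-specific; the field names (`name`, `email`) are assumptions about what the export contains:

```python
from collections import defaultdict

def group_by_person(records):
    """Group exported customer rows by a person key (here: lowercased name).

    Each group collects every email seen for that person, so duplicates
    created under different email ids end up in one bucket ready to merge.
    """
    groups = defaultdict(lambda: {"emails": set(), "rows": []})
    for row in records:
        key = row["name"].strip().lower()          # naive person key; real data
        groups[key]["emails"].add(row["email"].lower())  # may need fuzzier matching
        groups[key]["rows"].append(row)
    return dict(groups)

# Hypothetical export rows -- same person, two different email ids.
records = [
    {"name": "Jane Doe", "email": "jane@work.example"},
    {"name": "jane doe", "email": "jd@home.example"},
    {"name": "Bob Roe", "email": "bob@example.com"},
]
merged = group_by_person(records)
```

Matching on name alone is of course fragile; in practice you would key on whatever stable identifier the export offers before re-importing the merged rows.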
Similar Messages
-
Hi,
Please suggest the best way to fetch records from the table described below. It is Oracle 10gR2 on Linux.
Whenever a client visits the office, a record is created for him. Company policy is to maintain 10 years of data in the transaction table, but the table accumulates about 3 million records per year.
The table has the following key columns for the select (sample table):
Client_Visit
ID Number(12,0) --sequence generated number
EFF_DTE DATE --effective date of the customer (sometimes the client becomes invalid and he will be valid again)
Create_TS Timestamp(6)
Client_ID Number(9,0)
Cascade_Flg Varchar2(1)
On most of the reports the records are fetched by MAX(eff_dte), MAX(create_ts) and cascade flag = 'Y'.
I have the following two queries, but neither is cost-effective and each takes 8 minutes to display the records.
Code 1:
SELECT au_subtyp1.au_id_k,
       au_subtyp1.pgm_struct_id_k
  FROM au_subtyp au_subtyp1
 WHERE au_subtyp1.create_ts =
         (SELECT MAX (au_subtyp2.create_ts)
            FROM au_subtyp au_subtyp2
           WHERE au_subtyp2.au_id_k = au_subtyp1.au_id_k
             AND au_subtyp2.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
             AND au_subtyp2.eff_dte =
                   (SELECT MAX (au_subtyp3.eff_dte)
                      FROM au_subtyp au_subtyp3
                     WHERE au_subtyp3.au_id_k = au_subtyp2.au_id_k
                       AND au_subtyp3.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
                       AND au_subtyp3.eff_dte <= TO_DATE ('2012-12-31', 'YYYY-MM-DD')))
   AND au_subtyp1.exists_flg = 'Y'
Explain Plan
Plan hash value: 2534321861
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1 | 91 | | 33265 (2)| 00:06:40 |
|* 1 | FILTER | | | | | | |
| 2 | HASH GROUP BY | | 1 | 91 | | 33265 (2)| 00:06:40 |
|* 3 | HASH JOIN | | 1404K| 121M| 19M| 33178 (1)| 00:06:39 |
|* 4 | HASH JOIN | | 307K| 16M| 8712K| 23708 (1)| 00:04:45 |
| 5 | VIEW | VW_SQ_1 | 307K| 5104K| | 13493 (1)| 00:02:42 |
| 6 | HASH GROUP BY | | 307K| 13M| 191M| 13493 (1)| 00:02:42 |
|* 7 | INDEX FULL SCAN | AUSU_PK | 2809K| 125M| | 13493 (1)| 00:02:42 |
|* 8 | INDEX FAST FULL SCAN| AUSU_PK | 2809K| 104M| | 2977 (2)| 00:00:36 |
|* 9 | TABLE ACCESS FULL | AU_SUBTYP | 1404K| 46M| | 5336 (2)| 00:01:05 |
Predicate Information (identified by operation id):
1 - filter("AU_SUBTYP1"."CREATE_TS"=MAX("AU_SUBTYP2"."CREATE_TS"))
3 - access("AU_SUBTYP2"."AU_ID_K"="AU_SUBTYP1"."AU_ID_K")
4 - access("AU_SUBTYP2"."EFF_DTE"="VW_COL_1" AND "AU_ID_K"="AU_SUBTYP2"."AU_ID_K")
7 - access("AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd
hh24:mi:ss') AND "AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
filter("AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND
"AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
8 - filter("AU_SUBTYP2"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
9 - filter("AU_SUBTYP1"."EXISTS_FLG"='Y')
Code 2:
I already raised a thread a week back, and Dom suggested the following query. It is cost-effective, but performance is the same and it uses the same amount of temp tablespace.
select au_id_k,pgm_struct_id_k from (
SELECT au_id_k
, pgm_struct_id_k
, ROW_NUMBER() OVER (PARTITION BY au_id_k ORDER BY eff_dte DESC, create_ts DESC) rn,
create_ts, eff_dte,exists_flg
FROM au_subtyp
WHERE create_ts < TO_DATE('2013-01-01','YYYY-MM-DD')
AND eff_dte <= TO_DATE('2012-12-31','YYYY-MM-DD')
) d where rn =1 and exists_flg = 'Y'
--Explain Plan
Plan hash value: 4039566059
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 2809K| 168M| | 40034 (1)| 00:08:01 |
|* 1 | VIEW | | 2809K| 168M| | 40034 (1)| 00:08:01 |
|* 2 | WINDOW SORT PUSHED RANK| | 2809K| 133M| 365M| 40034 (1)| 00:08:01 |
|* 3 | TABLE ACCESS FULL | AU_SUBTYP | 2809K| 133M| | 5345 (2)| 00:01:05 |
Predicate Information (identified by operation id):
1 - filter("RN"=1 AND "EXISTS_FLG"='Y')
2 - filter(ROW_NUMBER() OVER ( PARTITION BY "AU_ID_K" ORDER BY
INTERNAL_FUNCTION("EFF_DTE") DESC ,INTERNAL_FUNCTION("CREATE_TS") DESC )<=1)
3 - filter("CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND "EFF_DTE"<=TO_DATE('
2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
Thanks,
Vijay
Hi Justin,
Thanks for your reply. I am running this on our test environment, as I don't want to run it on the production environment right now. The test environment holds 2,809,605 records (about 2.8 million).
The query output count is 281,699 records (about 282 thousand), so the selectivity is 0.099. There are 2,808,905 distinct values of create_ts, eff_dte, and exists_flg. I am sure the index scan is not going to help much, as you said.
The core problem is that both queries use a lot of temp tablespace. When we use this query to join to other tables - the other table has the same design as below - the temp tablespace grows even bigger.
Both the production and test environment are 3 Node RAC.
First Query...
CPU used by this session 4740
CPU used when call started 4740
Cached Commit SCN referenced 21393
DB time 4745
OS Involuntary context switches 467
OS Page reclaims 64253
OS System time used 26
OS User time used 4562
OS Voluntary context switches 16
SQL*Net roundtrips to/from client 9
bytes received via SQL*Net from client 2487
bytes sent via SQL*Net to client 15830
calls to get snapshot scn: kcmgss 37
consistent gets 52162
consistent gets - examination 2
consistent gets from cache 52162
enqueue releases 19
enqueue requests 19
enqueue waits 1
execute count 2
ges messages sent 1
global enqueue gets sync 19
global enqueue releases 19
index fast full scans (full) 1
index scans kdiixs1 1
no work - consistent read gets 52125
opened cursors cumulative 2
parse count (hard) 1
parse count (total) 2
parse time cpu 1
parse time elapsed 1
physical write IO requests 69
physical write bytes 17522688
physical write total IO requests 69
physical write total bytes 17522688
physical write total multi block requests 69
physical writes 2139
physical writes direct 2139
physical writes direct temporary tablespace 2139
physical writes non checkpoint 2139
recursive calls 19
recursive cpu usage 1
session cursor cache hits 1
session logical reads 52162
sorts (memory) 2
sorts (rows) 760
table scan blocks gotten 23856
table scan rows gotten 2809607
table scans (short tables) 1
user I/O wait time 1
user calls 11
workarea executions - onepass 1
workarea executions - optimal 9
Second Query
CPU used by this session 1197
CPU used when call started 1197
Cached Commit SCN referenced 21393
DB time 1201
OS Involuntary context switches 8684
OS Page reclaims 21769
OS System time used 14
OS User time used 1183
OS Voluntary context switches 50
SQL*Net roundtrips to/from client 9
bytes received via SQL*Net from client 767
bytes sent via SQL*Net to client 15745
calls to get snapshot scn: kcmgss 17
consistent gets 23871
consistent gets from cache 23871
db block gets 16
db block gets from cache 16
enqueue releases 25
enqueue requests 25
enqueue waits 1
execute count 2
free buffer requested 1
ges messages sent 1
global enqueue get time 1
global enqueue gets sync 25
global enqueue releases 25
no work - consistent read gets 23856
opened cursors cumulative 2
parse count (hard) 1
parse count (total) 2
parse time elapsed 1
physical read IO requests 27
physical read bytes 6635520
physical read total IO requests 27
physical read total bytes 6635520
physical read total multi block requests 27
physical reads 810
physical reads direct 810
physical reads direct temporary tablespace 810
physical write IO requests 117
physical write bytes 24584192
physical write total IO requests 117
physical write total bytes 24584192
physical write total multi block requests 117
physical writes 3001
physical writes direct 3001
physical writes direct temporary tablespace 3001
physical writes non checkpoint 3001
recursive calls 25
session cursor cache hits 1
session logical reads 23887
sorts (disk) 1
sorts (memory) 2
sorts (rows) 2810365
table scan blocks gotten 23856
table scan rows gotten 2809607
table scans (short tables) 1
user I/O wait time 2
user calls 11
workarea executions - onepass 1
workarea executions - optimal 5
Thanks,
Vijay
Edited by: Vijayaraghavan Krishnan on Nov 28, 2012 11:17 AM
Edited by: Vijayaraghavan Krishnan on Nov 28, 2012 11:19 AM -
What is the best way to do voice recording in a Macbook Pro?
What is the best way to do voice recording on a MacBook Pro? I want to record my voice and send it as an MP3 file.
Thanks
Deleting the application from your /Applications folder is sufficient. There are sample projects in /Library/Application/Aperture you may want to get rid of as well, as they take up a fair bit of space.
-
What is the best way to merge a file content into log file
What is the best way to merge a file content into log file.
In the worst case, I will read the file line by line as strings, then use
logger.info(lineString) to output to the log file.
However, is there better way to do this?
The eventual log file will be something like:
log message 1
log message 2
content from file line 1
content from file line 2
content from file line 3
log message 3
log message 4
Thanks
John618 wrote:
Thank you and let me explain:
1. What do you mean by better?
I would like to see better performance; reading line by line and logging each line as a string can be slow.
Did you measure this and determine that it is actually a problem for your application? Or are you guessing?
Regardless of what you do you are still going to need to read the file.
>
2. The only better way I can think of is not having to do it, but I assume you have a very good reason to want to do this.
Yes, I have to do it because the requirement is to have that file content be part of the logging.
Any idea?
How is it supposed to be part of it? For example, which of the following is better?
File AAA - contents
First Line
Second Line XXX
Log 1
2009-03-27 DEBUG: Random preceding line
2009-03-27 DEBUG: First Line
2009-03-27 DEBUG: Second Line XXX
2009-03-27 DEBUG: Random following line
Log 2
2009-03-27 DEBUG: Random preceding line
2009-03-27 DEBUG: ----- File: AAA -------------
First Line
Second Line XXX
2009-03-27 DEBUG: Random following line
Both of the above have some advantages and disadvantages.
With the first, a multi-threaded app can end up with intermittent log entries in between the lines, so having log lines carry thread ids becomes important.
The first can be created by reading one line at a time and posting one at a time.
The second can be created by reading the entire file as a single string and then posting using a single log statement. -
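The second option above - read the whole file once, post it with a single log statement - can be sketched with the standard `logging` module. A minimal illustration, not tied to any particular logging framework; the logger name and file content here are made up:

```python
import io
import logging

# Route log output to an in-memory buffer so the result is easy to inspect.
buf = io.StringIO()
handler = logging.StreamHandler(buf)
handler.setFormatter(logging.Formatter("%(levelname)s: %(message)s"))
log = logging.getLogger("filemerge")
log.setLevel(logging.DEBUG)
log.addHandler(handler)

# Stands in for reading the entire file as a single string.
file_content = "First Line\nSecond Line XXX"

# One read, one log statement: the file block is a single log record,
# so other threads' log lines cannot be interleaved inside it.
log.debug("----- File: AAA -------------\n%s", file_content)

output = buf.getvalue()
```

Because the whole block is one record, no per-line thread-id bookkeeping is needed; the trade-off is that the file's lines carry no individual timestamps.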
Best way to merge photos from different macs
What is the best way to merge photo libraries from different macs onto one mac?
The only way to merge Libraries - versions, metadata etc - is with the paid ($20) version of iPhoto Library Manager
Other than that you will need to export from one Library to the Finder and then import to the other. This won’t get all the Metadata (for instance, Faces) and won't get the Versions either.
Regards
TD -
Best way to merge two iPhoto libraries
I have ended up with two versions of iPhoto on the same laptop - just stored in different places. One has 30K photos and the other has 80K photos. What is the best way to merge the two photo libraries? Whenever I try, it ends up crashing.
Not sure how you are doing it - the only way is to use the merge capability of iPhoto Library Manager - http://www.fatcatsoftware.com/iplm/
LN -
Best way to update custom table
Hello experts,
I'm writing a report program; after pulling data from a custom table I'm modifying certain fields within an internal table and then eventually updating the custom table. The way I'm updating the custom table is working fine. However, I'm concerned about performance because I'm doing the update on the custom table within a loop.
Here is my code for reference.
*&---------------------------------------------------------------------*
*&      Form  update_contracts
*&---------------------------------------------------------------------*
*       text
*----------------------------------------------------------------------*
*      -->  p1        text
*      <--  p2        text
*----------------------------------------------------------------------*
FORM update_contracts .
* Update each record in the internal table first
  LOOP AT izsc_compliance INTO wa_zsc_compliance.
    wa_zsc_compliance-zapproval  = c_accepted.
    wa_zsc_compliance-changed_dt = sy-datum.
    wa_zsc_compliance-changed_tm = sy-uzeit.
    wa_zsc_compliance-changed_by = sy-uname.
    MODIFY izsc_compliance FROM wa_zsc_compliance INDEX sy-tabix.
    WRITE:/ sy-tabix, wa_zsc_compliance-vbeln_new, wa_zsc_compliance-zapproval.
*   Write the record to the database (skipped in test mode)
    IF p_test IS INITIAL.
      MOVE wa_zsc_compliance TO zsc_compliance.
      UPDATE zsc_compliance.
    ENDIF.
  ENDLOOP.
  IF p_test = 'X'.
    SKIP.
    WRITE:/ 'Test mode'.
  ENDIF.
ENDFORM.                    " update_contracts
I'm not certain whether there is a better way that avoids doing the update within the loop and updates the custom table outside it instead.
Many thanks in advance.
Hi,
Yes, there is a better way to update the custom table that is more performance-oriented and a much cleaner approach. As I am not aware of the structure of the custom table in your program, the best I can suggest is to remove the update statement from inside that check - I guess you are checking against the mode, test or production. Once you have done the check, put the selected entries into an internal table whose line type matches the database table. Then, in a single command - a single array operation, as it is commonly called - you can update the custom table.
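The pattern - collect the changed rows first, then apply them in one array operation instead of one UPDATE per loop pass - is language-independent. A rough illustration in Python with sqlite3 (not ABAP; the table and column names here are invented for the sketch):

```python
import sqlite3

# Hypothetical table standing in for the custom table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE compliance (id INTEGER PRIMARY KEY, approval TEXT)")
conn.executemany("INSERT INTO compliance VALUES (?, ?)",
                 [(1, "PENDING"), (2, "PENDING"), (3, "PENDING")])

# Step 1: build the set of changed rows in memory (the "internal table").
changed = [("ACCEPTED", row_id) for row_id in (1, 3)]

# Step 2: one array operation for all rows, not an UPDATE inside a loop.
conn.executemany("UPDATE compliance SET approval = ? WHERE id = ?", changed)
conn.commit()

rows = conn.execute("SELECT id, approval FROM compliance ORDER BY id").fetchall()
```

The win is the same in ABAP: one database round trip per batch rather than one per record, plus a single clear point at which to take locks and commit.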
Have a look at the following link
[Overwriting Several Lines Using an Internal Table|http://help.sap.com/saphelp_bw33/helpdata/en/fc/eb3a94358411d1829f0000e829fbfe/content.htm]
[Inserting or Changing Lines|http://help.sap.com/saphelp_bw33/helpdata/en/fc/eb3ac8358411d1829f0000e829fbfe/content.htm]
You can also scan the forum for multiple links and references and sample examples.
Hope this will help. The above approach is much more performance-oriented and will help to optimize. Also, check where exactly you provide locking, if applicable in your business functionality.
Regards,
Samantak. -
Problem: My wife and I both have MacBook Pros (MBPs). We take a lot of pictures and import them into iPhoto. When the storage capacity of our MBPs gets full, I have been moving our iPhoto libraries onto external hard drives, which now number 3 or 4. The mistake I now realize we have been making is that once the iPhoto libraries were copied onto the external hard drives, we were only deleting about half of the photos in each iPhoto library remaining on our MBPs (because we wanted to keep some of the important ones in our hard drives). Once the storage capacities of our MBPs got full again, I would repeat the whole process again and again. In essence I now have several large iPhoto libraries (each about 80 GB), each with multiple duplicate photos, divided among several external hard drives. And I am running out of space on my hard drive again. So what is the best way to:
a) merge all of these iPhoto libraries into just one, while simultaneously being able to delete all the duplicates? (or would you not recommend this?)
b) prevent this from happening again?
Thanks. BTW I am using OS X Mountain Lion 10.8.5 and iPhoto 8.1.2 (2009).
If you have Aperture 3.3 or later and iPhoto 9.3 or later you can merge libraries with Aperture.
Otherwise the only way to merge Libraries is with the paid ($30) version of iPhoto Library Manager
The Library Manager app can be set avoid duplicates. -
Hey there, Apple Support Communities.
To start, I'm working on a MBP Retina 15" with a 2.3GHz i7 processor and 16 GB of RAM. 10GB free on a 256GB SS HD. Attached are two external HDs - one 1TB Western Digital portable drive from 2011, one 2TB Porsche LaCie non-portable drive from 2013; both connected via USB. All photo libraries in question are on the external drives.
I have Aperture 3.5.1 and iPhoto 9.5.1. I prefer to work in Aperture.
The Issue(s)
Over the years, I have accumulated a number of iPhoto libraries and Aperture libraries. At one point, I thought my WD drive was dying so I purchased the LaCie and copied all libraries over the the LaCie drive. (Turns out, there's probably an issue with my USB port reading drives, because I can once again see the WD drive and occasionally I can't see the LaCie drive.)
So now I have old version of some libraries on the WD drive, and new versions on the LaCie drive.
When I was moving things, I ran the software Gemini to de-dupe my iPhoto libraries. Not sure what effect that may have had on my issues.
In my main Aperture library and in some iPhoto libraries, I get the image-not-found badge or exclamation point. I've dug through the hidden Masters folders in various libraries to find the original image. In some cases, I have been able to find the original image, sometimes in a different version of the iPhoto library.
My Question(s)
1. For Aperture libraries that have missing originals, is there some magical way to find them, or have they just disappeared into the oblivion?
2. For iPhoto libraries that have missing originals and I have found the original in another iPhoto library, what is the best way to proceed?
3. Are there quirks to merging iPhoto and Aperture libraries (by using the Import->Library) feature that I should be aware of?
TL;DR: Too many iPhoto and Aperture libraries, and not all the original pictures can be found by the libraries anymore, though some originals still do exist in other libraries. Steps/process to fix?
Thank you! Let me know if you need add'l info to offer advice.
With appreciation,
Christie
That will not be an easy task, Christie.
I am afraid, your cleaning session with Gemini may have actually removed originals. I have never used this duplicate finder tool, but other posters here reported problems. Gemini seems to replace duplicate original files in photo libraries by links, and this way, deleting images can cause the references for other images to break. And Aperture does not follow symbolic links - at least, I could never get it to find original files this way, when I experimented with this.
1. For Aperture libraries that have missing originals, is there some magical way to find them, or have they just disappeared into the oblivion?
You have to find the originals yourself. If you can find them or restore them from a backup, Aperture can reconnect them. The reconnect panel can show you where the originals are supposed to be, so you can see the filename and make a Spotlight search.
For iPhoto libraries that have missing originals and I have found the original in another iPhoto library, what is the best way to proceed?
Make a copy of the missing original you found in a folder outside the iPhoto library. You can either open the iPhoto library in Aperture and use "File > Locate Referenced file" to reconnect the originals, or simply reimport them. Then Lift&Stamp all adjustments and metadata to the reimported version.
See this manual page on how to reconnect originals: Aperture 3 User Manual: Working with Referenced Images (the paragraph: Reconnecting Missing or Offline Referenced Images)
Are there quirks to merging iPhoto and Aperture libraries (by using the Import->Library) feature that I should be aware of?
Referenced images will stay referenced, managed will remain managed. You need to unhide all hidden photos in iPhoto first - this cannot be done in Aperture.
and not all the original pictures can be found by the libraries anymore, though some originals still do exist in other libraries. Steps/process to fix?
That is probably due to Gemini replacing duplicate originals by links, and your best course of action is to fix this before merging the libraries. Reconnecting can be done for your iPhoto libraries in Aperture.
Best way to export custom postcard as .jpeg
Hi! I am very new to photoshop, and, I am sure these are very elementary questions... please bear with me. I really appreciate your help.
I created a custom postcard with layers - a collage of family photos - to use as a Christmas postcard. After reading advice online, I sized it at 72 resolution, color RGB Color, and tried to export it as .JPEG using Save for web... But, the preview looks like some of the faces on my postcard are blurry... Could you please tell me the best way to format and export such a file? Also, if it needs to be at a higher resolution, do I need to start the whole process over again?
Thanks so much for your patience with my ignorance. I am trying to do this before Christmas, and trying to keep my spirits in a Christmas mood in the process, without beating my computer. =-0
Are these to be printed or emailed?
MTSTUNER -
Best way to generate one record per day from a table with eff/exp dates
Hi,
I have a table with various attributes and an eff and exp date, e.g. attributea, 01/05/2012, 16/05/2012.
We wish to create another table from this table that has one record per day, e.g. 16 records.
What is the best way to achieve this in OWB?
Thanks
Hi,
For example, I have a table with the following contents:
conversion_rate number(6,4)
EFFEcTIVE_DATE DATE
expiration_date date
example record 1.43, 01/05/2012,16/05/2012
I want to have another table which instead has 16 records, one for each day, e.g.
1.43, 01/05/2012
1.43, 02/05/2012
...
1.43, 16/05/2012
Thoughts on best way to do this.
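Whatever OWB operator ends up doing it, the expansion itself is simple: emit one row per day between the effective and expiration dates. A sketch of the logic in Python (in Oracle SQL the usual equivalent is a join to a row generator such as `CONNECT BY LEVEL`; the function name here is made up):

```python
from datetime import date, timedelta

def explode_daily(rate, eff_date, exp_date):
    """Yield one (rate, day) record per day from eff_date to exp_date inclusive."""
    day = eff_date
    while day <= exp_date:
        yield (rate, day)
        day += timedelta(days=1)

# The sample record from the thread: 1.43 effective 01/05/2012 to 16/05/2012.
rows = list(explode_daily(1.43, date(2012, 5, 1), date(2012, 5, 16)))
```

An inclusive 01/05-16/05 range produces the 16 daily records described above.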
Thanks -
Best way to call custom classloader
I have created a custom classloader to perform hot deployment for an application server. How do I get the JVM to use my class loader instead of the system class loader?
a. Using the command line argument -Djava.system.class.loader
b. Using -javaagent and setting System.setProperty("java.system.class.loader", myclass) in premain method.
c. Is there any other alternative?
If I am using the command line approach, I will need to have my jar file in the classpath. What is the best way to do this?
Thanks for your help
Or you could write a small program which sets up the classloader and then loads the target program with it.
-
Best way to merge iPhoto Libraries?
The hard drive on my late-2009 27" iMac died with no TimeMachine backup (ouch). I replaced it and started with a fresh new OS install. I have now recovered the data off the damaged hard drive. I have two iPhoto libraries with photos that I do not wish to lose.
What is the easiest/best way to import all of the photos from my "old" iPhoto library into my "new" iPhoto library? I have the latest Mavericks OS and iPhoto is up to date.
Merge:
If you have Aperture 3.3 or later and iPhoto 9.3 or later you can merge libraries with Aperture.
Otherwise the only way to merge Libraries is with the paid ($20) version of iPhoto Library Manager
Just the photos?
Export from the Library to folders on the Finder. Then import to the new Library. Note that this will not get your organisation or your edit history, and you'll need to choose between originals and edited versions. You may lose some metadata too, depending on your export settings.
This User Tip
https://discussions.apple.com/docs/DOC-4921
has details of the options in the Export dialogue. -
Best way to create customized counter for slideshow.
Hello everyone. I am stuck on this one. What is the best way to create a customizable counter for the slideshow widgets? I'm looking to create circles or boxes that change color in relation to the picture in the slideshow. I am sure that I have seen it on a Muse site before, but just can't figure out how to get it done. Thanks!
Have you tried using:
Effect > Text > Numbers ?
You could also have found that answer by searching for "numbers" at the Community Help site. It's really a great resource in addition to the online help files. Try it out!
(It would have been the first result returned from the search...)
http://www.adobe.com/support/aftereffects/ -
Best way to bulk insert records to sqlserver ??
Hi, actually I encountered an issue regarding inserting lakhs of records (10 lakhs, i.e. 1 million) into a SQL Server table using C#. The appropriate ways I know of are SqlBulkCopy or parallel programming (LINQ).
I want to know which one is best performance-wise. Is there any other approach, like SSIS?
Please suggest.
a.kesavamallikarjuna
Either use SqlBulkCopy or SSIS.
Using a parallel approach with 10 lakh rows will create a huge load on your SQL Server, which may lead to issues on the server side. This is not a general "don't do it" - with the appropriate table, index and disk structure, it may work.
SSIS is often the best solution, cause it is built for this kind of operation.
But concrete advice depends on your concrete problem (format, recurrence, scheduling, etc.).
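Whichever tool does the load, the core idea behind SqlBulkCopy and SSIS is the same: send rows in large batches rather than one INSERT per row. A rough, engine-agnostic sketch of that batching in Python with sqlite3 (SqlBulkCopy itself is a .NET/SQL Server API; the table here is invented):

```python
import sqlite3

def bulk_insert(conn, rows, batch_size=10_000):
    """Insert rows in batches: one prepared statement executed per batch
    of parameter tuples, instead of one round trip per row."""
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        cur.executemany("INSERT INTO t (n) VALUES (?)",
                        rows[start:start + batch_size])
    conn.commit()  # a single commit, not one per row

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
bulk_insert(conn, [(i,) for i in range(100_000)])
count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```

Batching amortizes statement and transaction overhead across thousands of rows, which is where the bulk tools get most of their speed.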