Linking Surveys with Leads and Opportunities
Hi,
We have a survey associated with each lead and opportunity, i.e. when creating either a lead or an opportunity there is a tab in the GUI where we can fill out the survey, and then save the transaction. The survey goes to a different InfoProvider, and leads and opportunities go to their respective InfoProviders.
We now want to link the survey answers to leads and opportunities. I tried InfoSets, and there doesn't seem to be any linkage between the two. Any ideas, or has anyone linked surveys to leads and surveys to opportunities? FYI, we are on CRM 4.0 and BW 3.5.
Thanks.
The CRM surveys are fed into the cube 0WS_C01. If you have made the correct settings in transaction CRM_SURVEY_SUITE, the survey results are extracted to the DataSource 0SVY_DATA_1. Through the appropriate GUIDs, the answers to the survey from the table CRM_SVY_ANSW (I do not remember the exact name) can be linked to the header GUID in CRMD_ORDERADM_H (maybe through the link table CRMD_LINK).
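To make the linkage concrete, here is a deliberately simplified sketch. The record layouts (`guid`, `order_guid`, and the other field names) are illustrative assumptions, not the actual CRM 4.0 table structures: the point is only that, once both sides are extracted, linking survey answers to leads/opportunities is a join on the order header GUID.

```python
# Illustrative sketch only: the record layouts below are invented to
# show the join-on-GUID idea, not the real CRM 4.0 tables.

# Extracted lead/opportunity headers (in the spirit of CRMD_ORDERADM_H)
orders = [
    {"guid": "A1", "object_type": "LEAD", "description": "Lead 1000"},
    {"guid": "B2", "object_type": "OPPT", "description": "Oppt 2000"},
]

# Extracted survey answers, carrying the same order header GUID
answers = [
    {"order_guid": "A1", "question": "Q1", "answer": "Yes"},
    {"order_guid": "B2", "question": "Q1", "answer": "No"},
]

# Index the headers by GUID, then join each answer to its header
by_guid = {o["guid"]: o for o in orders}
linked = [
    {**by_guid[a["order_guid"]], **a}
    for a in answers
    if a["order_guid"] in by_guid
]

for row in linked:
    print(row["object_type"], row["question"], row["answer"])
```

The same idea applies whether the join happens in an InfoSet, in ABAP, or at query time: the survey result records must carry the order header GUID (directly or via CRMD_LINK) for the link to work.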
Doniv
Similar Messages
-
CRM report combining leads and opportunities
Hi,
I have LEADS and OPPORTUNITIES data in different cubes, which get data from the standard DataSources. Now how do I create a report which shows the list of leads converted to opportunities?
Hi Madhu,
If you are using DSO 0CRM_OPPH / cube 0CRM_C04, you will find the InfoObject 0CRMPLEAGUI (GUID of Preceding Lead), which captures the lead GUID; I assume it is filled only when a lead is converted to an opportunity. You can check your data to confirm such cases. If so, restrict 0CRMPLEAGUI to exclude # (not assigned) in your query to report such cases.
--Priya -
How to: Linking Material with Assortment and Plant
What are the ways to link 0MATERIAL with assortments and 0PLANT?
Right now I have: 0RT_WRSZ, which links the Plant with the assortment
How do I link that with 0MATERIAL using the WLK1 table?
Thanks,
Chris
Try 0RT_WLK1
-
Assignment of leads and opportunities by product
Hello everyone,
I need to assign leads and opportunities by product to selected personnel. Is there any way to detect more than one product in a lead or opportunity? A lead or opportunity may have several products bundled together, and it would be very repetitive for the salesperson to create one lead/opportunity for each product they sell to the same customer.
Any ideas on how to accomplish this? I was initially thinking of using Books, but it allows only field comparisons of single values, not multi-picklists. So in the end the problem still comes back to having more than one product in a lead/opportunity.
Hope you guys here have some suggestions. Thanks again!
Hi Venky,
Yes, the initial thought here is to use a multi-picklist; but unfortunately I don't think the assignment manager supports multi-picklists.
And yes, I would like to make it accessible to multiple people, not only one.
Basically, the process goes something like the following:
1) Marketing obtains lead and enters into the system
2) Lead is channeled to a product manager
3) Product manager reassigns the lead to a salesperson
4) Salesperson converts the lead into an opportunity - the product manager should be able to view this opportunity
My question here is: would the product manager still have access to the lead once it is reassigned (the salesperson does not report to the product manager in any way)? And how can the product manager view all opportunities which include the product he/she is in charge of?
My initial thought here is that we have to fall back on using Books?
Thanks. -
Hi all,
I am trying to link Webcenter Content with P6 and PCM. Would you please share the steps to link?
Emmran
Hi Emmran,
There is no pre-built integration between UCM and Primavera P6/PCM. You will need to integrate either using the RIDC APIs that UCM provides or via web services.
RIDC - Oracle® Fusion Middleware
Web Service - Configuring WebCenter Content Web Services for Integration - 11g Release 1 (11.1.1)
All possible integration - http://docs.oracle.com/cd/E23943_01/doc.1111/e10807/part5_integrate_apps.htm#CIHJHEHA
HTH
- Anand -
Linking Contacts with iCal and Mail
Hi there, just wondering if you can link interactions via iCal and Mail to contacts on your system, so you can see the appointments or correspondence you've had with a contact by viewing their contact card. Cheers
There is only 1 user on the computer so they only see one login chain.
I hope that "they" means *he (she)* and not more than one, and that "login chain" means login keychain. If that's the case, why isn't the user doing the checking? It's his (her) account and he (she) should be doing this; otherwise, he (she) is compromising his (her) security and privacy. -
Number range skipping for leads and opportunities
Hello,
I have assigned an internal number range to leads and opportunities, which are custom transaction types I created by copying the standard ones, but it is surprising to see that the numbers are being skipped at random intervals.
Has anyone encountered this problem, and what solution did you apply?
Regards
Hi,
Go to transaction SNRO and change the number range object so that numbers are not kept in the buffer, or set the buffer to only one number. (Buffering is the usual cause of skipped numbers: numbers held in a server's buffer are lost when the server restarts.)
Best regards,
Caíque Escaler -
I am editing in Premiere Pro 2014 on a Windows 8.1 system with 12 GB of RAM. I imported a DVD into Premiere from a .f4v that was burned using Freemake Video Converter.
I imported the DVD into Premiere, and the audio that plays is from a previous project I edited with the program, as opposed to the audio that is linked with the imported video.
Do you know why this is? I cleaned my media cache and I rebooted my machine.
Please help
Without knowing the specs of your videos it is hard to say. I'm also running a Mac and am having no such issues.
My first thought is that you have something from a 3rd party installed that is causing conflicts. If all else fails try transcoding your mp4 files to a frame based codec using the Adobe Media Encoder. Pick one of the Apple production codecs or Jpeg compressed QuickTime.
Are you running a ram preview? Did you render your Premiere timeline? Are you new to AE? -
What will I see when doing site survey with WRT and WRE?
I have just successfully set up my WRE54G with my WRT54GS, with security enabled. It wasn't easy (lots of resets and disabling of security), but I didn't have to set a static IP (which I am happy about).
However, now that I am searching for a connection on my laptop (wirelessly), I see two available networks, but both have the same name (the name of my original wireless network).
Is this right? If so, this gets confusing, as I am not sure which one I am connecting to.
Then if I change from one to another, I have to enter the passphrase, which gets annoying.
Is there any way to have a different name for my wireless network vs. the wireless expander?
The RE, as I understand it, just extends the router signal that it is connected to. That is why it is imperative that the setup be EXACTLY the same. In my case, I stupidly put some capital letters in my SSID but didn't put them in when I set up the RE. Took me the longest time to figure that out... That you see two "networks" on site survey is OK. That you can connect to them differently is weird, and that you have to enter the security passphrase suggests that you really haven't been successful in setup. So, Dibbler's advice is correct. Make sure the router and RE settings are the same and the firmware is upgraded.
I can't tell what device I am connected to and I really don't care. I just know that the place I was connecting from prior to the RE was far from the router, and I would get the "limited connection" warning and no internet. With the RE in place, and it being closer to my laptop, I always get an "excellent" signal and an internet connection. I assume I'm connecting to the RE, but it is seamless.
Also, make sure you supply the MAC address of the router if the RE is v1. Got that from the full user guide on the Linksys WRE54G product page.
If you really want different SSIDs I think you need to get a bridged access point.
-
How do you link styles with bullets and lists?
It seems that amongst all the other hapless changes in Pages 5 there is no way to link bullets & lists to a style. I hope I'm missing something.
If I create a style with a numbered list, then click on the next paragraph I'd like to have that style, the style name changes along with font characteristics and alignment, but no bullets or numbers. Moreover, Pages crashes often when I try to update bullets and lists.
I have a fantasy that somewhere in Pittsburgh, where I believe the Pages development team lives, there's a building full of coders passionately fixing all these things. If just the people on this thread put in $20, the former cost of Pages, we could probably hire a team to do the work for them.
1. Yes, it is possible (and relatively easy)
2. You should learn about applets, image handling, servlets and jdbc
Good luck -
I received an email with a hyperlink that runs a .swf file. I also recently downloaded PlayerX, so I decided to use it to open the .swf file. It was unable to do so. When trying to play this file I receive the options "Open in ..." or "Open in PlayerX". I try to select "Open in ...", but nothing is available to select. I need to remove this association. I have restarted the phone and resent the email from its original location, but the association seems to have stuck. Can anyone help out with this? I would prefer not to have to delete the app and re-download it.
Jeff, many thanks for the reply (more than Adobe Chat did for me). The problem was sorted this morning with telephone support: a complete mix-up between two accounts I have and a trial copy. Appreciate your help, thanks. I have now got a fully operative CC with my LR integrated; I was concerned, as I had purchased it some years ago and didn't want to miss out on updates by having a separate copy.
Regards
Phil -
I'm having a problem linking iTunes with Click and Buy
I'm trying to get a connection between Click and Buy and iTunes, but it doesn't work. Every time it makes a connection with iTunes via the web, it loads the first time and then doesn't do anything, but I can still buy something.
iPad Charging
"The fastest way to charge your iPad is with the included 10W USB Power Adapter."
If you believe there is an issue with the supplied charger, take it and your device to an Apple Store or AASP and get them to check it.
Meanwhile, have you tried a reset of your iPad? You will not lose any data.
Reset
Press and hold the Sleep/Wake button and the Home button at the same time for at least ten seconds, until the Apple logo appears. Then release the buttons.
http://support.apple.com/kb/ht1430 -
Do This
1) Drag a clip into AE
2) Crop the beginning of the clip in the comp by, e.g., 200 frames
3) Adjust the work area in point
4) Hit Adjust Comp to Work Area
5) Save the project and open AME
6) Drag the composition over to AME
7) Render out the file
AME will NOT regard the NEW composition adjustments and thus renders the 200 frames you cut off at the beginning of the movie...
Either I am doing something wrong (I hope so, but I don't think so) or this is a BUUUUG...
You can submit bug reports here:
http://www.adobe.com/go/wish
More on how to give feedback here. -
How do I Identify Lead and Lag in consecutive dates with multiple values?
I am using:
Oracle SQL Developer (3.0.04)
Build MAin-04.34
Oracle Database 11g
Enterprise Edition 11.2.0.1.0 - 64bit Production
I would like to identify the Lead and Lags based on two groups.
The grouping is that multiple C_IDs can be found in a single W_ID,
and multiple D_VALs can be found in a C_ID.
So I would like to identify and mark rows with "Lead" and "Lag" relative to the "consecutivedaysc" column (this already matches the D_VAL and C_ID very well). For example, W_ID 2004 has C_ID 2059 with a D_VAL of 44 for April 2 & 3; the consecutive days are 2, and I would like to correctly mark April 2 as the Lead and April 3 as the Lag.
Then I would like to mark the "Lead" and "Lag" independent of whether there are multiple D_VALs on the same W_ID.
Example that I am having trouble with:
W_ID 2285 on April 10 for C_ID 7847: I don't understand why I can't get a "Lag" instead of a "Lead" related to the consecutivedaysw column.
I would like to eventually have the data summarized based on W_ID and a single (non-repeating) dt with Leads and Lags.
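The marking rule described above (within each C_ID/D_VAL group, the first date of a consecutive run is the "Lead" and each following consecutive date a "Lag") can be sketched outside SQL. A minimal Python illustration, an assumption-laden sketch using day numbers instead of real dates and a few made-up rows modeled on the sample data:

```python
# Sketch of the Lead/Lag marking rule, not the Oracle query itself.
# Within each (c_id, d_val) group, the first day of a consecutive run
# is marked "Lead"; every immediately following day is marked "Lag".
from itertools import groupby

rows = [  # (c_id, d_val, day) -- day numbers stand in for dates
    (2059, 44, 2), (2059, 44, 3),
    (2059, 389, 4), (2059, 389, 5), (2059, 389, 6),
]

def mark(rows):
    out = []
    # groupby needs sorted input; sorting tuples groups (c_id, d_val)
    for (c_id, d_val), grp in groupby(sorted(rows), key=lambda r: r[:2]):
        prev_day = None
        for _, _, day in grp:
            label = "Lag" if prev_day == day - 1 else "Lead"
            out.append((c_id, d_val, day, label))
            prev_day = day
    return out

print(mark(rows))
```

On these sample rows, day 2 and day 4 come out as Leads and the rest as Lags, mirroring the April 2 / April 3 example above.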
table
with t as (
select 4592 U_KEY,0 D_VAL_PRESENT,2004 W_ID,to_date('4/1/2013','mm-dd-yyyy') dt,2059 C_ID, (null) D_VAL,0 GWID,13 GCID,1 CONSECUTIVEDAYSC,1 CONSECUTIVEDAYSW from dual union all
select 4591,1,2004,to_date('4/2/2013','mm-dd-yyyy'),2059,44,1,11,2,13 from dual union all
select 4590,1,2004,to_date('4/3/2013','mm-dd-yyyy'),2059,44,1,11,2,13 from dual union all
select 4589,1,2004,to_date('4/4/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4588,1,2004,to_date('4/5/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4587,1,2004,to_date('4/6/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4586,1,2004,to_date('4/7/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4585,1,2004,to_date('4/8/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4584,1,2004,to_date('4/9/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4583,1,2004,to_date('4/10/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4582,1,2004,to_date('4/11/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4581,1,2004,to_date('4/12/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4580,1,2004,to_date('4/13/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4579,1,2004,to_date('4/14/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 1092,0,2686,to_date('4/1/2013','mm-dd-yyyy'),7210,(null),0,11,3,3 from dual union all
select 3416,0,2686,to_date('4/1/2013','mm-dd-yyyy'),7211,(null),0,11,3,3 from dual union all
select 18118,0,2686,to_date('4/1/2013','mm-dd-yyyy'),17391,(null),0,11,3,3 from dual union all
select 1091,0,2686,to_date('4/2/2013','mm-dd-yyyy'),7210,(null),0,11,3,3 from dual union all
select 3415,0,2686,to_date('4/2/2013','mm-dd-yyyy'),7211,(null),0,11,3,3 from dual union all
select 18117,0,2686,to_date('4/2/2013','mm-dd-yyyy'),17391,(null),0,11,3,3 from dual union all
select 1090,0,2686,to_date('4/3/2013','mm-dd-yyyy'),7210,(null),0,11,3,3 from dual union all
select 3414,0,2686,to_date('4/3/2013','mm-dd-yyyy'),7211,(null),0,11,3,3 from dual union all
select 18116,0,2686,to_date('4/3/2013','mm-dd-yyyy'),17391,(null),0,11,3,3 from dual union all
select 1089,1,2686,to_date('4/4/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3413,1,2686,to_date('4/4/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18115,1,2686,to_date('4/4/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1088,1,2686,to_date('4/5/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3412,1,2686,to_date('4/5/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18114,1,2686,to_date('4/5/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1087,1,2686,to_date('4/6/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3411,1,2686,to_date('4/6/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18113,1,2686,to_date('4/6/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1086,1,2686,to_date('4/7/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3410,1,2686,to_date('4/7/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18112,1,2686,to_date('4/7/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1085,1,2686,to_date('4/8/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3409,1,2686,to_date('4/8/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18111,1,2686,to_date('4/8/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1084,1,2686,to_date('4/9/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3408,1,2686,to_date('4/9/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18110,1,2686,to_date('4/9/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1083,1,2686,to_date('4/10/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3407,1,2686,to_date('4/10/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18109,1,2686,to_date('4/10/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1082,1,2686,to_date('4/11/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3406,1,2686,to_date('4/11/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18108,1,2686,to_date('4/11/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1081,1,2686,to_date('4/12/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3405,1,2686,to_date('4/12/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18107,1,2686,to_date('4/12/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1080,1,2686,to_date('4/13/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3404,1,2686,to_date('4/13/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18106,1,2686,to_date('4/13/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1079,1,2686,to_date('4/14/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3403,1,2686,to_date('4/14/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18105,1,2686,to_date('4/14/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 17390,1,3034,to_date('4/1/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17389,1,3034,to_date('4/2/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17388,1,3034,to_date('4/3/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17387,1,3034,to_date('4/4/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17386,1,3034,to_date('4/5/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 7305,1,3034,to_date('4/6/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17385,1,3034,to_date('4/6/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14123,1,3034,to_date('4/6/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 17384,1,3034,to_date('4/7/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17383,1,3034,to_date('4/8/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 7302,1,3034,to_date('4/9/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17382,1,3034,to_date('4/9/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14120,1,3034,to_date('4/9/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7301,1,3034,to_date('4/10/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17381,1,3034,to_date('4/10/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14119,1,3034,to_date('4/10/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7300,1,3034,to_date('4/11/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17380,1,3034,to_date('4/11/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14118,1,3034,to_date('4/11/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7299,1,3034,to_date('4/12/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17379,1,3034,to_date('4/12/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14117,1,3034,to_date('4/12/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7298,1,3034,to_date('4/13/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17378,1,3034,to_date('4/13/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14116,1,3034,to_date('4/13/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7297,1,3034,to_date('4/14/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17377,1,3034,to_date('4/14/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14115,1,3034,to_date('4/14/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual
)
The script that I am using:
select
t.*,
case
when lag(dt) over(partition by c_id, d_val order by dt, u_key)+1 = dt
then 'Lag'
when lead(dt) over(partition by c_id, d_val order by dt, u_key)-1 = dt
then 'Lead_1'
when consecutivedaysc = 1
then 'Lead_3'
else 'wrong'
end LeadLagD_VAL,
case
when lag(dt) over(partition by w_id, c_id, d_val_present, gwid order by dt)+1 = dt
then 'Lag'
when lead(dt) over(partition by w_id, c_id, d_val_present, gwid order by dt)-1 = dt
then 'Lead_A'
when consecutivedaysw = 1
then 'Lead_B'
else 'wrong'
end Lead_Lag2
from t
order by
W_ID,
dt asc,
C_ID asc
;
The results should look like this (but I am having issues):
u_key D_VAL_PRESENT W_ID C_ID DT D_VAL GWID GCID CONSECUTIVEDAYSC CONSECUTIVEDAYSW LEADLAGD_VAL LEAD_LAG2
4592 0 2004 2059 01-APR-13 0 13 1 1 Lead_1 Lead_A
4591 1 2004 2059 02-APR-13 44 1 11 2 13 Lead_1 Lead_A
4590 1 2004 2059 03-APR-13 44 1 11 2 13 Lag Lag
4589 1 2004 2059 04-APR-13 389 1 0 11 13 Lead_1 Lag
4588 1 2004 2059 05-APR-13 389 1 0 11 13 Lag Lag
4587 1 2004 2059 06-APR-13 389 1 0 11 13 Lag Lag
4586 1 2004 2059 07-APR-13 389 1 0 11 13 Lag Lag
4585 1 2004 2059 08-APR-13 389 1 0 11 13 Lag Lag
4584 1 2004 2059 09-APR-13 389 1 0 11 13 Lag Lag
4583 1 2004 2059 10-APR-13 389 1 0 11 13 Lag Lag
4582 1 2004 2059 11-APR-13 389 1 0 11 13 Lag Lag
4581 1 2004 2059 12-APR-13 389 1 0 11 13 Lag Lag
4580 1 2004 2059 13-APR-13 389 1 0 11 13 Lag Lag
4579 1 2004 2059 14-APR-13 389 1 0 11 13 Lag Lag
1092 0 2686 7210 01-APR-13 0 11 3 3 Lead_1 Lead_A
3416 0 2686 7211 01-APR-13 0 11 3 3 Lead_1 Lead_A
18118 0 2686 17391 01-APR-13 0 11 3 3 Lead_1 Lead_A
1091 0 2686 7210 02-APR-13 0 11 3 3 Lag Lag
3415 0 2686 7211 02-APR-13 0 11 3 3 Lag Lag
18117 0 2686 17391 02-APR-13 0 11 3 3 Lag Lag
1090 0 2686 7210 03-APR-13 0 11 3 3 Lag Lag
3414 0 2686 7211 03-APR-13 0 11 3 3 Lag Lag
18116 0 2686 17391 03-APR-13 0 11 3 3 Lag Lag
1089 1 2686 7210 04-APR-13 51 9 0 11 11 Lead_1 Lead_A
3413 1 2686 7211 04-APR-13 51 9 0 11 11 Lead_1 Lead_A
18115 1 2686 17391 04-APR-13 51 9 0 11 11 Lead_1 Lead_A
1088 1 2686 7210 05-APR-13 51 9 0 11 11 Lag Lag
3412 1 2686 7211 05-APR-13 51 9 0 11 11 Lag Lag
18114 1 2686 17391 05-APR-13 51 9 0 11 11 Lag Lag
1087 1 2686 7210 06-APR-13 51 9 0 11 11 Lag Lag
3411 1 2686 7211 06-APR-13 51 9 0 11 11 Lag Lag
18113 1 2686 17391 06-APR-13 51 9 0 11 11 Lag Lag
1086 1 2686 7210 07-APR-13 51 9 0 11 11 Lag Lag
3410 1 2686 7211 07-APR-13 51 9 0 11 11 Lag Lag
18112 1 2686 17391 07-APR-13 51 9 0 11 11 Lag Lag
1085 1 2686 7210 08-APR-13 51 9 0 11 11 Lag Lag
3409 1 2686 7211 08-APR-13 51 9 0 11 11 Lag Lag
18111 1 2686 17391 08-APR-13 51 9 0 11 11 Lag Lag
1084 1 2686 7210 09-APR-13 51 9 0 11 11 Lag Lag
3408 1 2686 7211 09-APR-13 51 9 0 11 11 Lag Lag
18110 1 2686 17391 09-APR-13 51 9 0 11 11 Lag Lag
1083 1 2686 7210 10-APR-13 51 9 0 11 11 Lag Lag
3407 1 2686 7211 10-APR-13 51 9 0 11 11 Lag Lag
18109 1 2686 17391 10-APR-13 51 9 0 11 11 Lag Lag
1082 1 2686 7210 11-APR-13 51 9 0 11 11 Lag Lag
3406 1 2686 7211 11-APR-13 51 9 0 11 11 Lag Lag
18108 1 2686 17391 11-APR-13 51 9 0 11 11 Lag Lag
1081 1 2686 7210 12-APR-13 51 9 0 11 11 Lag Lag
3405 1 2686 7211 12-APR-13 51 9 0 11 11 Lag Lag
18107 1 2686 17391 12-APR-13 51 9 0 11 11 Lag Lag
1080 1 2686 7210 13-APR-13 51 9 0 11 11 Lag Lag
3404 1 2686 7211 13-APR-13 51 9 0 11 11 Lag Lag
18106 1 2686 17391 13-APR-13 51 9 0 11 11 Lag Lag
1079 1 2686 7210 14-APR-13 51 9 0 11 11 Lag Lag
3403 1 2686 7211 14-APR-13 51 9 0 11 11 Lag Lag
18105 1 2686 17391 14-APR-13 51 9 0 11 11 Lag Lag
17390 1 3034 5395 01-APR-13 298 0 0 14 14 Lead_1 Lead_A
17389 1 3034 5395 02-APR-13 298 0 0 14 14 Lag Lag
17388 1 3034 5395 03-APR-13 298 0 0 14 14 Lag Lag
17387 1 3034 5395 04-APR-13 298 0 0 14 14 Lag Lag
17386 1 3034 5395 05-APR-13 298 0 0 14 14 Lag Lag
7305 1 3034 5394 06-APR-13 44 0 0 7 14 Lead_1 Lag
17385 1 3034 5395 06-APR-13 298 0 0 14 14 Lag Lag
14123 1 3034 22421 06-APR-13 44 0 0 7 14 Lead_1 Lag
17384 1 3034 5395 07-APR-13 298 0 0 14 14 Lag Lag
17383 1 3034 5395 08-APR-13 298 0 0 14 14 Lag Lag
7302 1 3034 5394 09-APR-13 44 0 0 7 14 Lead_1 Lag
17382 1 3034 5395 09-APR-13 298 0 0 14 14 Lag Lag
14120 1 3034 22421 09-APR-13 44 0 0 7 14 Lead_1 Lag
7301 1 3034 5394 10-APR-13 44 0 0 7 14 Lag Lag
17381 1 3034 5395 10-APR-13 298 0 0 14 14 Lag Lag
14119 1 3034 22421 10-APR-13 44 0 0 7 14 Lag Lag
7300 1 3034 5394 11-APR-13 44 0 0 7 14 Lag Lag
17380 1 3034 5395 11-APR-13 298 0 0 14 14 Lag Lag
14118 1 3034 22421 11-APR-13 44 0 0 7 14 Lag Lag
7299 1 3034 5394 12-APR-13 44 0 0 7 14 Lag Lag
17379 1 3034 5395 12-APR-13 298 0 0 14 14 Lag Lag
14117 1 3034 22421 12-APR-13 44 0 0 7 14 Lag Lag
7298 1 3034 5394 13-APR-13 44 0 0 7 14 Lag Lag
17378 1 3034 5395 13-APR-13 298 0 0 14 14 Lag Lag
14116 1 3034 22421 13-APR-13 44 0 0 7 14 Lag Lag
7297 1 3034 5394 14-APR-13 44 0 0 7 14 Lag Lag
17377 1 3034 5395 14-APR-13 298 0 0 14 14 Lag Lag
14115 1 3034 22421 14-APR-13 44 0 0 7 14 Lag Lag
I placed the "wrong" label to show the rows where neither of the WHEN conditions worked.
Any suggestions on a better direction for me to solve this?
Then I am trying to get this, not including C_ID:
u_key D_VAL_PRESENT W_ID DT CONSECUTIVEDAYSW LEAD_LAG2
4592 0 2004 01-APR-13 1 Lead_A
4591 1 2004 02-APR-13 13 Lead_A
4590 1 2004 03-APR-13 13 Lag
4589 1 2004 04-APR-13 13 Lag
4588 1 2004 05-APR-13 13 Lag
4587 1 2004 06-APR-13 13 Lag
4586 1 2004 07-APR-13 13 Lag
4585 1 2004 08-APR-13 13 Lag
4584 1 2004 09-APR-13 13 Lag
4583 1 2004 10-APR-13 13 Lag
4582 1 2004 11-APR-13 13 Lag
4581 1 2004 12-APR-13 13 Lag
4580 1 2004 13-APR-13 13 Lag
4579 1 2004 14-APR-13 13 Lag
1092 0 2686 01-APR-13 3 Lead_A
1091 0 2686 02-APR-13 3 Lag
1090 0 2686 03-APR-13 3 Lag
1089 1 2686 04-APR-13 11 Lead_A
1088 1 2686 05-APR-13 11 Lag
1087 1 2686 06-APR-13 11 Lag
1086 1 2686 07-APR-13 11 Lag
1085 1 2686 08-APR-13 11 Lag
1084 1 2686 09-APR-13 11 Lag
1083 1 2686 10-APR-13 11 Lag
1082 1 2686 11-APR-13 11 Lag
1081 1 2686 12-APR-13 11 Lag
1080 1 2686 13-APR-13 11 Lag
1079 1 2686 14-APR-13 11 Lag
17390 1 3034 01-APR-13 14 Lead_A
17389 1 3034 02-APR-13 14 Lag
17388 1 3034 03-APR-13 14 Lag
17387 1 3034 04-APR-13 14 Lag
17386 1 3034 05-APR-13 14 Lag
7305 1 3034 06-APR-13 14 Lag
17384 1 3034 07-APR-13 14 Lag
17383 1 3034 08-APR-13 14 Lag
7302 1 3034 09-APR-13 14 Lag
7301 1 3034 10-APR-13 14 Lag
7300 1 3034 11-APR-13 14 Lag
7299 1 3034 12-APR-13 14 Lag
7298 1 3034 13-APR-13 14 Lag
7297 1 3034 14-APR-13 14 Lag
Then into this (which I would use where Lead_Lag2 = "Lead_A"):
u_key D_VAL_PRESENT W_ID DT CONSECUTIVEDAYSW LEAD_LAG2
4592 0 2004 01-APR-13 1 Lead_A
4591 1 2004 02-APR-13 13 Lead_A
1092 0 2686 01-APR-13 3 Lead_A
11089 1 2686 04-APR-13 11 Lead_A
17390 1 3034 01-APR-13 14 Lead_A
But one thing at a time.
Thanks for pointing out the errors, Frank; it is always helpful to know what others see.
Is this the first set you expect?
with flagged as
(
select w_id, d_val, dt, u_key,
case when lag(dt) over(partition by w_id, d_val order by dt, u_key)
in (dt, dt-1)
then 0
else 1
end flg
from t
),
summed as
(
select w_id, d_val, dt, u_key,
sum(flg) over(order by w_id, d_val nulls first, dt, u_key) sm
from flagged
),
day_count as
(
select w_id, d_val, dt, u_key, count(distinct dt) over(partition by sm) cnt
from summed
)
select w_id, d_val, dt, u_key, cnt
from day_count
order by w_id, d_val nulls first, dt, u_key;
W_ID D_VAL DT U_KEY CNT
2004 01-APR-13 4592 1
2004 44 02-APR-13 4591 2
2004 44 03-APR-13 4590 2
2004 389 04-APR-13 4589 11
2004 389 05-APR-13 4588 11
2004 389 06-APR-13 4587 11
2004 389 07-APR-13 4586 11
2004 389 08-APR-13 4585 11
2004 389 09-APR-13 4584 11
2004 389 10-APR-13 4583 11
2004 389 11-APR-13 4582 11
2004 389 12-APR-13 4581 11
2004 389 13-APR-13 4580 11
2004 389 14-APR-13 4579 11
2686 01-APR-13 1092 3
2686 01-APR-13 3416 3
2686 01-APR-13 18118 3
2686 02-APR-13 1091 3
2686 02-APR-13 3415 3
2686 02-APR-13 18117 3
2686 03-APR-13 1090 3
2686 03-APR-13 3414 3
2686 03-APR-13 18116 3
..... -
Analytic Question with lag and lead
Hello,
I'm working on tracking a package and the number of times it was recorded in an office. I want to see the start and end dates along with the number of occurrences (or records) between those dates. I'm pretty confident I can get the start and end dates correct, but it is the number of occurrences that is the issue.
Essentially, I want to build a time line start and end_dates and the number of times the package was recorded in the office.
I am fumbling around with using the LAG and LEAD analytics to build the start/end dates along with the count of occurrences during that period.
I've been using the LAG and LEAD features and can pretty much get the start and end dates set up, but I am having difficulty determining the count (count(*)) within the analytic. (I think I can do it outside of the analytic with a self join, but performance will suffer; I have millions of records in this table.)
I've been playing with windowing using RANGE and INTERVAL days, but to no avail. When I try this with count(*) over (partition by package_id, location_office_id order by event_date range ...), I can calculate the interval correctly by subtracting the lead date minus the current date; however,
the count is off, because when I partition the values by package_id, location_office_id, I get the third group of package 12 partitioned with the first group of package 12 (in the same window) because they are at the same office. However, I want to treat these separately, because the package has gone to a different office in between.
I've attached the DDL/DML to create my test case. Any help would be appreciated.
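Before reaching for SQL, the target grouping can be stated procedurally. A hedged Python sketch (illustrative only, over a subset of the sample data below, dates kept as plain strings): walk the events per package in date order and start a new group whenever the office changes, so a return visit to an earlier office becomes its own row.

```python
# Sketch of the desired grouping, not the analytic SQL answer.
# A new (start, end, count) row is opened each time the
# (package_id, office_id) pair changes along the timeline.

events = [  # (package_id, location_office_id, event_date)
    (12, 1, "20010101"), (12, 1, "20010102"), (12, 1, "20010103"),
    (12, 2, "20010108"), (12, 2, "20010110"),
    (12, 1, "20010111"), (12, 1, "20010112"),
]

def stays(events):
    out = []        # rows of [package_id, office_id, start, end, count]
    current = None  # (package_id, office_id) of the open group
    for pkg, office, dt in sorted(events, key=lambda e: (e[0], e[2])):
        if current != (pkg, office):
            out.append([pkg, office, dt, dt, 1])   # open a new group
            current = (pkg, office)
        else:
            out[-1][3] = dt                        # extend the end date
            out[-1][4] += 1                        # bump the count
    return [tuple(r) for r in out]

for row in stays(events):
    print(row)
```

Note that package 12 at office 1 produces two separate rows, exactly the behavior the plain PARTITION BY merges away.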
--Current
package_id, location_office_id, event_date
12 1 20010101
12 1 20010102
12 1 20010103
13 5 20010102
13 5 20010104
13 5 20010105
13 6 20010106
13 6 20010111
12 2 20010108
12 2 20010110
12 1 20010111
12 1 20010112
12 1 20010113
12 1 20010114
--Needs to look like
package_id location_office_id start_date end_date count
12 1 20010101 20010103 3
12 2 20010108 20010110 2
12 1 20010111 20010114 4
13 5 20010102 20010105 3
13 6 20010106 20010111 2
create table test (package_id number, location_office_id number,event_date date);
insert into test values (12,1,to_date('20010101','YYYYMMDD'));
insert into test values (12,1,to_date('20010102','YYYYMMDD'));
insert into test values (12,1,to_date('20010103','YYYYMMDD'));
insert into test values (13,5,to_date('20010102','YYYYMMDD'));
insert into test values (13,5,to_date('20010104','YYYYMMDD'));
insert into test values (13,5,to_date('20010105','YYYYMMDD'));
insert into test values (13,6,to_date('20010106','YYYYMMDD'));
insert into test values (13,6,to_date('20010111','YYYYMMDD'));
insert into test values (12,2,to_date('20010108','YYYYMMDD'));
insert into test values (12,2,to_date('20010110','YYYYMMDD'));
insert into test values (12,1,to_date('20010111','YYYYMMDD'));
insert into test values (12,1,to_date('20010112','YYYYMMDD'));
insert into test values (12,1,to_date('20010113','YYYYMMDD'));
insert into test values (12,1,to_date('20010114','YYYYMMDD'));
commit;
--I'm trying something like
select package_id, location_office_id, event_date,
lead(event_date) over (partition by package_id, location_office_id order by event_date) lead_event,
count(*) over (partition by package_id, location_office_id order by event_date) rcount -- When I do this it merges the windows together for package 12 and location 1, so I get the total; however, I want to keep them separate because the package moved to another office in between.
Appreciate your help.
Hi,
Thanks for posting the CREATE TABLE and INSERT statements; that's very helpful!
You can do what you want with LEAD and/or LAG, but here's a more elegant way, using the analytic ROW_NUMBER function:
WITH got_grp_num AS
(
    SELECT package_id, location_office_id, event_date
    ,      ROW_NUMBER () OVER ( PARTITION BY package_id
                                ORDER BY event_date
                              )
         - ROW_NUMBER () OVER ( PARTITION BY package_id
                              ,              location_office_id
                                ORDER BY event_date
                              ) AS grp_num
    FROM test
--  WHERE ... -- If you need any filtering, put it here
)
SELECT   package_id
,        location_office_id
,        MIN (event_date) AS start_date
,        MAX (event_date) AS end_date
,        COUNT (*)        AS cnt
FROM     got_grp_num
GROUP BY package_id
,        location_office_id
,        grp_num
ORDER BY package_id
,        start_date
;
This approach treats the problem as a GROUP BY problem. Getting the start_date, end_date and cnt is trivial using aggregate functions. The tricky part is what to GROUP BY. We can't just GROUP BY package_id and location_office_id, because when a package (like package_id=12) leaves an office, goes to another office, then comes back, the two periods spent in the same office have to be treated as separate groups. We need something else to GROUP BY. The query above uses the Fixed Difference method to provide that something else. To see how this works, let's run the sub-query (slightly modified) by itself:
WITH got_grp_num AS
(
    SELECT package_id, location_office_id, event_date
    ,      ROW_NUMBER () OVER ( PARTITION BY package_id
                                ORDER BY event_date
                              ) AS p_num
    ,      ROW_NUMBER () OVER ( PARTITION BY package_id
                              ,              location_office_id
                                ORDER BY event_date
                              ) AS p_l_num
    FROM test
)
SELECT g.*
,      p_num - p_l_num AS grp_num
FROM   got_grp_num g
ORDER BY package_id
,        event_date
;
Output:
PACKAGE_ID LOCATION_OFFICE_ID EVENT_DATE P_NUM P_L_NUM GRP_NUM
12 1 2001-01-01 1 1 0
12 1 2001-01-02 2 2 0
12 1 2001-01-03 3 3 0
12 2 2001-01-08 4 1 3
12 2 2001-01-10 5 2 3
12 1 2001-01-11 6 4 2
12 1 2001-01-12 7 5 2
12 1 2001-01-13 8 6 2
12 1 2001-01-14 9 7 2
13 5 2001-01-02 1 1 0
13 5 2001-01-04 2 2 0
13 5 2001-01-05 3 3 0
13 6 2001-01-06 4 1 3
13 6 2001-01-11 5 2 3
As you can see, p_num numbers the rows for each package with consecutive integers. P_l_num likewise numbers the rows with consecutive integers, but instead of having a separate series of numbers for each package, it has a separate series for each package and location. As long as a package remains at the same location, both numbers increase by 1, and therefore the difference between those two numbers stays fixed. (This assumes that the combination (package_id, event_date) is unique.) But whenever a package changes from one location to another, and then comes back, p_num will have increased, but p_l_num will resume where it left off, and so the difference will not be the same as it was previously. The amount of the difference doesn't mean anything by itself; it's just a number (more or less arbitrary) that, together with package_id and location_office_id, uniquely identifies the groups.
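For readers who find the table easier to follow in code, the fixed-difference trick can also be illustrated procedurally. A small Python sketch (assuming rows arrive already in event_date order, locations only) that reproduces the grp_num column for package 12:

```python
# Sketch of the Fixed Difference idea: p_num counts rows per package,
# p_l_num counts rows per (package, location); their difference is
# constant while the package stays put and changes when it moves,
# so it serves as a grouping key.

rows = [  # (package_id, location_office_id), in event_date order
    (12, 1), (12, 1), (12, 1),
    (12, 2), (12, 2),
    (12, 1), (12, 1),
]

def grp_nums(rows):
    p_count = {}   # running row count per package (p_num)
    pl_count = {}  # running row count per (package, location) (p_l_num)
    out = []
    for pkg, loc in rows:
        p_count[pkg] = p_count.get(pkg, 0) + 1
        pl_count[(pkg, loc)] = pl_count.get((pkg, loc), 0) + 1
        out.append(p_count[pkg] - pl_count[(pkg, loc)])
    return out

print(grp_nums(rows))  # -> [0, 0, 0, 3, 3, 2, 2]
```

The two stays at office 1 get different group numbers (0 and 2), matching the GRP_NUM column in the output above.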