Need help getting MySQL data

I have two pages: page X has all the records, and page Y shows a record selected by a URL parameter sent from page X, meaning that in the browser I end up with something like this:
the_site/item_detail.php?item_name=KEDS, CHUKKA, BLUE SLATE
The thing is, I want to get another record on page Y, and I want that record to use some of the information from the first record, the one selected by the URL parameter from page X.
To put it more clearly:
On page Y, I have one record whose data is selected by a URL parameter sent from page X.
On the same page, I want to create another record, and I would like this new record to use a column from the first one to get its data.
Thanks.

To clarify how I am retrieving the data:
On page Y there is already a recordset that retrieves data based on the URL parameter sent from page X (recordset 1).
On the same page, I want to create another recordset (to retrieve more data), and to retrieve the data for recordset 2 I want to be able to use some information from recordset 1.
I hope you understand what I mean.
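A minimal sketch of the idea in plain MySQL, using made-up table and column names (items, item_name, category) and assuming item_name identifies a single row: the first query is roughly what recordset 1 runs with the URL parameter, and the second reuses a column from that row (here, category) to fetch the related records for recordset 2. The two steps can also be collapsed into one query with a subquery:
-- what recordset 1 effectively does with the URL parameter
SELECT category FROM items WHERE item_name = 'KEDS, CHUKKA, BLUE SLATE';
-- recordset 2: reuse a column of that first record to select more rows
SELECT *
FROM   items
WHERE  category = (SELECT category
                   FROM   items
                   WHERE  item_name = 'KEDS, CHUKKA, BLUE SLATE');
In the dynamic page, the second query's filter value would come from recordset 1 (or from the same URL parameter) rather than being hard-coded.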

Similar Messages

  • Need help getting the data from my old address book

    Can someone tell me what file I need to copy, and where, to restore all my addresses from my old address book? I elected not to migrate everything to avoid problems, but that is one important and long list.
    Any help will be greatly appreciated
    Mireille

    Look in the folder ~/Library/Application Support/AddressBook/ for the files containing your data, where ~ is the item in the Finder's sidebar with the house icon.

  • Need help getting my data from backed up WinXP folder to my Palm desktop

    smkranz
    I am a volunteer, and not an HP employee.
    Palm OS ∙ webOS ∙ Android


  • I have a macbook air bought in 2010 with liquid damage that might have affected the SSD card. I need to get my data but Apple won't help. Authorized dealers never seem to have the same model to try whether the SSD card works. How can I get my data quickly?

    Hi
    I have a macbook air bought in 2010. It has liquid damage that might or might not have affected the SSD card. I need the data on my SSD card for work and I need it urgently. Apparently Apple won't get the data out, even though I have AppleCare... Licensed Apple dealers need to have the same MacBook Air model so they can insert the SSD card and check whether the data is there, and they never seem to have one...
    I'm really disappointed with Apple. It's been two weeks and I haven't been able to find out whether the SSD card is holding my data. It can't take that long to do it in store... How can they sell a product and then wash their hands of the matter?
    Anyway, enough with the Apple rant. I need to get this data urgently, so I need to find out how I can get it out of the card. The SSD card and data might be fine, so I don't want to take it to a data recovery centre and get charged loads.
    Thanks for your help.
    Olatz

    See this link for a 4870 card, the best instructions for flashing for a noob like me (the last lowest price was $150 for a 1GB vRAM 4870 from TigerDirect).
    http://web.me.com/jacobcroft/4870Flash/4870Flash.html
    However, not every 4870 card will work with the ROM currently floating around out there. BEWARE!
    Link to recovering a bricked card:
    http://forums.techpowerup.com/showthread.php?t=64328

  • A mate has an iphone 4 as a work phone and needs to get all data incl messages onto an old iphone 3 as he is leaving. Is this possible, any help appreciated

    A mate has an iPhone 4 as a work phone; now he is leaving and needs to get all data, incl. messages etc., onto an old iPhone 3. Is this possible? Any help appreciated.
    Robbie

    You won't be able to restore a device running iOS 4.2 from a backup of a device running iOS 5 (as you found out), and the iPhone 3G can only be updated to iOS 4.2.1, so there is no way to migrate from his iPhone 4 back to his iPhone 3G. He would have to have at least an iPhone 3GS, which will update to iOS 5.1 and would be able to restore from the iPhone 4 backup. Unfortunately, there isn't any way around it.

  • I've recently messed up my Mac OS X Lion by deleting Aperture, but I need to get my data out of the computer. And by the way, I've got an important project in Final Cut Pro X. Is there any way I can back it up (including fxs and positions of the clip) ?

    Hello everyone
    I've recently messed up my Mac OS X Lion by deleting Aperture, but I need to get my data out of the computer. I've tried to repair it using DU, but it said I needed to backup all my data and reinstall the OS. How do I backup my data to an external drive?
    And by the way, I've got an important project in Final Cut Pro X. Is there any way I can back it up (including fxs and positions of the clip) ?
    Hope you guys can help me!

    You need to back up 2 folders, Final Cut Events and Final Cut Projects, which by default are located in your Movies folder. You can always check by clicking on a clip inside Final Cut Pro X and selecting Show in Finder in the contextual menu for the selected clip.

  • I need to get SSIS Data profiler output in Excel.

    Hi,
    I need to get the SSIS Data Profiler output in Excel. I tried importing the XML generated by the SSIS Profiler into Excel, but data is getting lost. Can anybody please help with this?
    Thanks,
    Manoj

    Hi Manoj,
    The Data Profiling task outputs the selected profiles to an XML file. We can give the output file any extension; the generated file is valid for the built-in Profile Viewer but invalid for other file viewer applications. So even if we specify the output with an Excel extension, the file can only be analyzed through the Profile Viewer. When we open it with Microsoft Excel, it says 'The file format and extension of ... don't match' and then asks us to select an option for opening the XML file. So we cannot directly export Data Profiling task output to an Excel file.
    If we still want to export the Data Profiling task output to Excel, we can use an XML Source to extract data from the generated XML file, and then load it into an Excel destination in a Data Flow Task.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Need to get the dates.

    Hi,
    I need to get the dates in interval of 7.
    My Scenario:
    When my user gives the date as 04/01/2013. My output should be like
    04/01/2013
    04/02/2013
    04/03/2013
    04/04/2013
    A total of 4 dates. How can I achieve this in a SQL query? Could anyone please help me?
    Regards,
    Prasad k T,

    wilhelm wundt wrote:
    "I need to get the dates in interval of 7."
    Your expected output is not showing an interval of 7 days.
    "When my user gives the date as 04/01/2013, my output should be
    04/01/2013
    04/02/2013
    04/03/2013
    04/04/2013"
    Is this date dd/mm or mm/dd?
    If you need 4 consecutive dates, use
    select input_date + level - 1
    from dual
    connect by level <= 4;
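    As a runnable sketch of that same CONNECT BY LEVEL idea, assuming the user's date arrives as a bind variable :input_dt in DD/MM/YYYY format (both names here are made up):
    SELECT TO_DATE(:input_dt, 'DD/MM/YYYY') + LEVEL - 1 AS day_out
    FROM   dual
    CONNECT BY LEVEL <= 4;
    For 04/01/2013 this returns 04/01/2013 through 04/04/2013, one row per day.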

  • TS1398 I need help getting connected to our home wifi.

    I need help getting connected to our home wifi. I've reset network settings, the iPhone, and the router, but the error "unable to join network" keeps coming up. I also double-checked the password. Please help.

    Other possibilities:
    Your router firmware is out of date. Check the manufacturer's website for an update.
    You have MAC (Media Access Control) filtering enabled on the router, which restricts access to "known" devices, and the new iPhone is not in the table of known devices.
    Your router has a setting limiting the number of simultaneously allowed devices.
    Your router's SSID ("name") is still the default (e.g., LINKSYS, DLINK, NETGEAR, etc.). Your phone has encountered other routers with the same name as you have traveled, and the phone is confused as to which router is the real one. If your router's name is still the default, you should change it.

  • Need help getting contacts from outlook 2010

    I need help getting my contacts from Outlook 2010 to iCloud and/or my iPhone. I have iCloud set up and logged in on the PC. When I go to import a contact, it says there was a problem reading the vCard. Any help is welcomed.

    I do have the latest version of DTM and am working with Outlook 2000. I know it is old, but it still does what I need it to. I did try to uninstall and reinstall. No luck. The only options listed to sync are Yahoo, Outlook Express, and ASCII import/export. Obviously it is not syncing my other data either, but it is my contacts that I am most concerned with at this point. The calendar would be nice too! Any other suggestions, or is my Outlook too old to sync?

  • Need help getting from 10.5.8 to 10.6.

    Need help getting from 10.5.8 to 10.6. I know I'm late to the game!

    If you are not sure of what you are doing, or if you are worried about losing anything when doing a major system upgrade/update, it would be wise and very prudent to back up your current system before embarking on the upgrade. Use an externally connected, Mac-formatted flash drive, or an externally connected USB, Thunderbolt, or FireWire 800 Mac-formatted hard drive. Then either use the OS X Time Machine app to back up your entire system to the external drive, or purchase, install, and use a data cloning app, like Carbon Copy Cloner or SuperDuper, to make an exact, bootable copy (clone) of your entire Mac's internal hard drive. This step is really needed in case something goes wrong with the install of the new OS, or you simply do not like the new OS; it gives you a very easy way to return your Mac to its former working state.
    Older version of CarbonCopyCloner found here.
    http://mac.filehorse.com/download-carbon-copy-cloner/2734/
    Older versions of SuperDuper can be found on the SuperDuper Website and Homepage.
    http://www.shirt-pocket.com/SuperDuper/SuperDuperDescription.html
    Then, determine if your Mac meets ALL minimum system install requirements.
    Mac OS X v10.6 Snow Leopard system requirements
    Purchase the installer disc here.
    http://store.apple.com/us/product/MC573Z/A/mac-os-x-106-snow-leopard
    To install Snow Leopard for the first time, you must have a Mac with:
    An Intel processor
    An internal or external DVD drive, or DVD or CD Sharing
    At least 1 GB of RAM (additional RAM is recommended)
    A built-in display or a display connected to an Apple-supplied video card supported by your computer
    At least 5 GB of disk space available, or 7 GB of disk space if you install the developer tools.
    In addition, see this discussion for more relevant info that you may need if you have any other questions or issues.
    Snow Leopard upgrade - keeps Photoshop?

  • Need help getting YTD total

    I've got a period-to-date report with the following columns:
    week1 tots, week2 tots, week3 tots, week4 tots, week5 tots, period-to-date tots, year-to-date tots
    I have a SELECT statement which totals data for the entire year and separates current-period totals
    by grouping on week_nbr. Any date between the beginning of the year and the end of the previous period will be week 0.
    The SELECT statement returns 6 rows: one for each week in the period, and one with week_nbr = 0 which represents the totals from the beginning of the year
    to the end of the previous period.
    The SELECT statement returns the data correctly. I need help getting the YTD total, i.e. (weeks 1 - 5) + (totals for week 0), for each column.
    This means that I will have a 7th record containing the YTD totals. (I am not concerned with the PTD totals.)
    I tried SUM with PARTITION BY, but the complex DECODE statement gave me problems.
      CREATE TABLE PERIOD_DATA
       (     "HOT_DOG_STAND_ID" NUMBER NOT NULL ENABLE,
         "WEEK_DATE" DATE,
         "NET_SALES2" NUMBER,
         "BUNS24434" NUMBER,
         "PICKELS_AW38" NUMBER,
         "MUSTARD_TB56" NUMBER,
         "CHICKENHEADS33" NUMBER,
         "PIECES_SOLD34" NUMBER,
         "SCRAPS35" NUMBER,
         "PIECES_UNACCOUNTED" NUMBER
       );
    REM INSERTING into PERIOD_DATA
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('29-DEC-08','DD-MON-RR HH.MI.SSXFF AM'),14301.39,13951.26,3431.13,0,3680,2484,378,818);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('05-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),14651.37,14651.37,3249.55,0,3200,2419,505,276);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('12-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),14169.89,14169.89,2463.53,0,3136,2080,474,582);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('19-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),15864.46,15864.46,3245.49,0,3472,2764,475,233);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('26-JAN-09','DD-MON-RR HH.MI.SSXFF AM'),15961.2,15916.23,3395.51,0,3648,2838,392,418);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('02-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),19066.4,19066.4,4165.07,0,4336,3682,333,321);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('09-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),18415.74,18415.74,4024.74,0,4032,3365,482,185);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('16-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),18014,17849,3486.33,0,3840,3238,374,228);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('23-FEB-09','DD-MON-RR HH.MI.SSXFF AM'),18671.09,18626.12,3729.42,0,3888,2970,353,565);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('02-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),17636,17636,3815,0,3424,2840,490,94);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('09-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),17235.52,17145.58,3897.42,0,3504,2928,421,155);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('16-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),15989.27,15989.27,3372.95,0,3728,3051,369,308);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('23-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),19067.69,18960.41,4152.6,0,4048,3293,442,313);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('30-MAR-09','DD-MON-RR HH.MI.SSXFF AM'),18717.99,18717.99,3923.69,0,4408,3219,593,596);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('06-APR-09','DD-MON-RR HH.MI.SSXFF AM'),17335.16,17335.16,3769.08,0,3928,2997,514,417);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('13-APR-09','DD-MON-RR HH.MI.SSXFF AM'),18967.39,18967.39,4157.76,0,4144,2991,527,626);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('20-APR-09','DD-MON-RR HH.MI.SSXFF AM'),23090.88,23090.88,4427.96,0,5544,4493,560,491);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('27-APR-09','DD-MON-RR HH.MI.SSXFF AM'),24197.98,24132.99,4248.66,0,6680,5190,606,884);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('04-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),20202.21,20137.22,3714.68,0,7052,6170,422,460);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('11-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),18514.48,18514.48,3266.06,0,5508,4178,571,759);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('18-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),18678.68,18678.68,3814.07,0,5824,4345,633,846);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('25-MAY-09','DD-MON-RR HH.MI.SSXFF AM'),17937.18,17937.18,3051.52,0,4844,4986,529,-671);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('01-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17445.75,17445.75,3079.91,0,5028,4810,656,-438);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('08-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17327.88,17327.88,3263.29,0,6112,4674,672,766);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('15-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17241.72,16937.33,3328.27,0,5792,4490,567,735);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('22-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),16625.83,16625.83,3485.18,0,5408,4319,761,328);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('29-JUN-09','DD-MON-RR HH.MI.SSXFF AM'),17002.84,17002.84,3091.09,0,5664,4369,544,751);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('06-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),16339.19,16274.2,3075.3,0,4784,3440,697,647);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('13-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),17165.12,16885.14,3458.03,0,4320,3296,640,384);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('20-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),17029.77,16899.79,3198.91,0,4448,3449,645,354);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('27-JUL-09','DD-MON-RR HH.MI.SSXFF AM'),16596.89,16596.89,3015.54,0,4624,3288,665,671);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('03-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),16468.58,16468.58,2981.35,0,2224,3495,564,-1835);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('10-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),18625.48,18550.5,3524.44,0,4856,3482,578,796);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('17-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),24538.54,24323.55,5580.71,0,5260,3771,608,881);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('24-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),18081.37,18081.37,3533.45,0,5980,3080,553,2347);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('31-AUG-09','DD-MON-RR HH.MI.SSXFF AM'),17183.25,17183.25,3487.12,0,2544,3262,615,-1333);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('07-SEP-09','DD-MON-RR HH.MI.SSXFF AM'),17688.41,17575.29,3424.17,0,4800,3480,591,729);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('14-SEP-09','DD-MON-RR HH.MI.SSXFF AM'),18211.29,18211.29,3806.32,0,3968,3104,527,337);
    Insert into PERIOD_DATA (HOT_DOG_STAND_ID,WEEK_DATE,NET_SALES2,BUNS24434,PICKELS_AW38,MUSTARD_TB56,CHICKENHEADS33,PIECES_SOLD34,SCRAPS35,PIECES_UNACCOUNTED) values (141,to_timestamp('21-SEP-09','DD-MON-RR HH.MI.SSXFF AM'),16809.21,16744.22,3014.61,0,4128,3124,710,294);
    SELECT HOT_DOG_STAND_ID
    , DECODE(TRUNC(week_date , 'iw') ,
             to_date('24-AUG-09' , 'dd-mon-rr') , 1 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 7 , 2 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 14 , 3 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 21 , 4 ,
             to_date('24-AUG-09' , 'dd-mon-rr') + 28 , 5 , 0) AS week_nbr
    , SUM(NET_SALES2) AS net_sales2
    , SUM(BUNS24434 ) BUNS24434
    , SUM(PICKELS_AW38) PICKELS_AW38
    , SUM(MUSTARD_TB56) MUSTARD_TB56
    , SUM(CHICKENHEADS33) CHICKENHEADS33
    , SUM(PIECES_SOLD34) PIECES_SOLD34
    , SUM(SCRAPS35) SCRAPS35
    , SUM(PIECES_UNACCOUNTED) * - 1 PIECES_UNACCOUNTED
       /*--== Head average  net_sales / chickenusage*/
    , CASE
          WHEN NVL( SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND(SUM(net_sales2) / ( SUM(ChickenHeads33) / 8 ) , 2)
       END AS Head_average
       /*--=== Efficiency =   (ChickenUsage  - scrappedDiv8 - unaccountedDiv8) / ChickenUsage)  * 100*/
    , CASE
          WHEN NVL(SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND((((SUM(ChickenHeads33) / 8 ) - ( SUM(scraps35) / 8 ) - (SUM(pieces_unaccounted) / 8 )) / (SUM(ChickenHeads33) / 8 )) * 100 , 2)
       END AS efficiency
    FROM period_data per
    WHERE week_DATE BETWEEN TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IY') AND TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IW') + 6 + 7 * 4
    GROUP BY hot_dog_stand_id
    , DECODE(TRUNC(week_date , 'iw') ,
          to_date('24-AUG-09' , 'dd-mon-rr') , 1 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 7 , 2 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 14 , 3 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 21 , 4 ,
          to_date('24-AUG-09' , 'dd-mon-rr') + 28 , 5 ,
          0)
    ORDER BY DECODE(TRUNC(week_date , 'iw') , to_date('24-AUG-09' , 'dd-mon-rr') , 1 , to_date('24-AUG-09' , 'dd-mon-rr') + 7 , 2 , to_date('24-AUG-09' , 'dd-mon-rr') + 14 , 3 , to_date('24-AUG-09' , 'dd-mon-rr') + 21 , 4 , to_date('24-AUG-09' , 'dd-mon-rr') + 28 , 5 , 0);
    The expected results will be:
    HOT_DOG_STAND_ID  WEEK_NBR  NET_SALES2  BUNS24434  PICKELS_AW38  MUSTARD_TB56  CHICKENHEADS33  PIECES_SOLD34  SCRAPS35  PIECES_UNACCOUNTED  HEAD_AVERAGE  EFFICIENCY
    141               7         697067.09   694887.4   139149.91    0             175808          139454         21036     15318               31.72         79.32
    You can get these same results by running an endpoint-to-endpoint query:
    SELECT  HOT_DOG_STAND_ID
         , max(7) as week_nbr
         ,sum(NET_SALES2)      net_sales2
          ,sum(BUNS24434 )        BUNS24434
          ,sum(PICKELS_AW38)      PICKELS_AW38
          ,sum(MUSTARD_TB56)     MUSTARD_TB56
          ,sum(CHICKENHEADS33)   CHICKENHEADS33
          ,sum(PIECES_SOLD34)    PIECES_SOLD34
          ,sum(SCRAPS35)         SCRAPS35
          ,sum(PIECES_UNACCOUNTED)   PIECES_UNACCOUNTED
        ---===== Copied code from outer query
              --==  net_sales / chickenusage 
                      ,   CASE
                             WHEN NVL( sum(ChickenHeads33) / 8    ,0)  = 0 then 0
                             ELSE ROUND(sum(net_sales2)/   ( sum(ChickenHeads33) / 8    ) , 2)
                          END as Head_average
                        --=== Efficiency =   (ChickenUsage  - scrappedDiv8 - unaccountedDiv8) / ChickenUsage)  * 100
                        ,   CASE
                                  WHEN NVL(sum(ChickenHeads33) / 8    ,0)  = 0 then 0
                                  ELSE   ROUND((((sum(ChickenHeads33) / 8 )  - ( sum(scraps35) / 8 ) - (sum(pieces_unaccounted) / 8 )) / (sum(ChickenHeads33) / 8 )) * 100, 2)
                          END as efficiency  
    from period_data
    WHERE week_DATE BETWEEN TRUNC(TO_DATE( '24-AUG-09' ,'DD-MON-YY'), 'IY') AND TO_DATE( '27-sep-09' ,'DD-MON-YY')
    group by hot_dog_stand_id;
    Thanks in advance.

    Hi,
    Welcome to the forum!
    Thanks for posting the CREATE TABLE and INSERT statements; that's very helpful. You could teach something to some people who have been using this forum for years (except that nobody can teach them).
    user12335325 wrote:
    The expected results will be:
    HOT_DOG_STAND_ID  WEEK_NBR  NET_SALES2  BUNS24434  PICKELS_AW38  MUSTARD_TB56  CHICKENHEADS33  PIECES_SOLD34  SCRAPS35  PIECES_UNACCOUNTED  HEAD_AVERAGE  EFFICIENCY
    141               7         697067.09   694887.4   139149.91    0             175808          139454         21036     15318               31.72         79.32
    Do you mean the expected results will include the row above, and that the results will be this row along with the 6 rows you're already getting? (If you wanted just that one row, I suppose you would just run your second query.)
    That sounds like a job for ROLLUP.
    VARIABLE   start_date     VARCHAR2 (11);
    EXEC         :start_date  := '24-AUG-2009';
         SELECT  HOT_DOG_STAND_ID
         ,      NVL (CASE
                   WHEN  week_date >= TO_DATE( :start_date, 'DD-MON-YYYY')
                   AND   week_date <  TO_DATE( :start_date, 'DD-MON-YYYY') + 35
                   THEN  1 + FLOOR ( (week_date - TO_DATE( :start_date, 'DD-MON-YYYY'))
                                              / 7 )
                   ELSE  0
                      END
                 , 7
                 )          AS week_nbr
    , SUM(NET_SALES2) AS net_sales2
    , SUM(BUNS24434 ) BUNS24434
    , SUM(PICKELS_AW38) PICKELS_AW38
    , SUM(MUSTARD_TB56) MUSTARD_TB56
    , SUM(CHICKENHEADS33) CHICKENHEADS33
    , SUM(PIECES_SOLD34) PIECES_SOLD34
    , SUM(SCRAPS35) SCRAPS35
    , SUM(PIECES_UNACCOUNTED) * - 1 PIECES_UNACCOUNTED
       /*--== Head average  net_sales / chickenusage*/
    , CASE
          WHEN NVL( SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND(SUM(net_sales2) / ( SUM(ChickenHeads33) / 8 ) , 2)
       END AS Head_average
       /*--=== Efficiency =   (ChickenUsage  - scrappedDiv8 - unaccountedDiv8) / ChickenUsage)  * 100*/
    , CASE
          WHEN NVL(SUM(ChickenHeads33) / 8 , 0) = 0 THEN 0
          ELSE ROUND((((SUM(ChickenHeads33) / 8 ) - ( SUM(scraps35) / 8 ) - (SUM(pieces_unaccounted) / 8 )) / (SUM(ChickenHeads33) / 8 )) * 100 , 2)
       END AS efficiency
    FROM period_data per
    WHERE week_DATE BETWEEN TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IY') AND TRUNC(TO_DATE( '24-AUG-09' , 'DD-MON-YY') , 'IW') + 6 + 7 * 4
    GROUP BY  hot_dog_stand_id
    ,           ROLLUP (
                 CASE
                   WHEN  week_date >= TO_DATE( :start_date, 'DD-MON-YYYY')
                   AND   week_date <  TO_DATE( :start_date, 'DD-MON-YYYY') + 35
                   THEN  1 + FLOOR ( (week_date - TO_DATE( :start_date, 'DD-MON-YYYY'))
                                              / 7 )
                   ELSE  0
                  END          -- week_nbr
                )
    ORDER BY  week_nbr
    ;
    Notice I simplified the computation of week_nbr.
    Some other people have asked questions about hot dog stands recently.
    I'm curious; is this from a course? If so, where? What is the textbook (if any)?
    Thanks.
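    As a side note, here is a minimal sketch (with made-up inline data) of what the ROLLUP is doing: it adds a superaggregate row whose grouping expression is NULL, which the NVL(..., 7) above then relabels as week 7, the year-to-date total:
    SELECT NVL(week_nbr, 7) AS week_nbr
    ,      SUM(sales)       AS sales
    FROM  (SELECT 1 AS week_nbr, 100 AS sales FROM dual UNION ALL
           SELECT 2,             200          FROM dual)
    GROUP BY ROLLUP (week_nbr)
    ORDER BY week_nbr;
    This returns the rows for week 1 and week 2, plus a final row labeled 7 containing the grand total.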

  • This may be a duplicate but, I have a new ipad mini that I need to activate but I need to get my data from my ipad 1 which has never been backed up.  How do I get the data from the ipad 1 to the new mini?  I have an iTunes acct and also an iCloud acct.  Thanks

    I have a new ipad mini that I need to activate, but I need to get the data from my old ipad 1 and install it on the new mini. I have an iTunes acct and also an iCloud acct. How do I back up the ipad 1 data and then activate the mini? I assume I will need to transfer the SIM from the ipad 1 to the mini.
    Thanks for any info
    wino454

    How to Transfer Everything from an Old iPad to New iPad
    http://osxdaily.com/2012/03/16/transfer-old-ipad-to-new-ipad/
     Cheers, Tom

  • TS1538 I need help getting my ipod touch to connect to Windows Vista

    I need help getting my ipod touch to connect to Windows because it keeps saying "USB device not recognized" and I tried resetting the settings!!!!!

    Have you tried here:
    iOS: Device not recognized in iTunes for Windows

  • Need help in transferring data to my new iphone4 from my old iphone3

    Hello, I just received my new iphone 4 and need help in transferring data from my old iphone 3: photos, emails, apps, etc. Thank you.

    http://support.apple.com/kb/ht2109
