A way to Unmark a recording "watched"?

I wish I could unmark a recording "watched" when I played it momentarily but decided not to watch it.
I like that the DVR now colors recordings yellow when you've watched them; it makes it much easier to sort/delete things.  But sometimes when I record two back-to-back 30-minute shows on the same channel, they aren't the best at ending each program at the right time, only making sure the pair ends correctly after the full hour.  In other words, the end of the first program runs into the start of the second recording, so when I want to see the last few seconds/minutes of the first show I have to play the beginning of the second recording.  This causes the second program to be marked as watched when I really haven't watched it yet.
That's why I would like a way to unmark it yellow, so when I'm looking at the recordings later I know I haven't seen it yet...
Thanks

I like that idea.
EvanVanVan wrote:
I wish I could unmark a recording "watched" when I played it momentarily but decided not to watch it. [...]

Similar Messages

  • What is the best way to do voice recording in a MacBook Pro?

    What is the best way to do voice recording in a MacBook Pro? I want to voice record and send it as an MP3 file.
    Thanks

    Deleting the application from your /Applications folder is sufficient. There are sample projects in /Library/Application/Aperture you may want to get rid of as well, as they take up a fair bit of space.

  • Best way to Fetch the record

    Hi,
    Please suggest the best way to fetch records from the table described below. It is Oracle 10gR2 on Linux.
    Whenever a client visits the office, a record is created for him. Company policy is to keep 10 years of data in the transaction table, and the table accumulates about 3 million records per year.
    The table has the following key columns for the select (sample table):
    Client_Visit
      ID          NUMBER(12,0)   -- sequence-generated number
      EFF_DTE     DATE           -- effective date of the customer (sometimes the client becomes invalid and later becomes valid again)
      CREATE_TS   TIMESTAMP(6)
      CLIENT_ID   NUMBER(9,0)
      CASCADE_FLG VARCHAR2(1)
    On most of the reports the records are fetched by MAX(eff_dte) and MAX(create_ts) with cascade flag = 'Y'.
    I have the following two queries, but neither is cost-effective and each takes 8 minutes to display the records.
    Code 1:
    SELECT au_subtyp1.au_id_k,
           au_subtyp1.pgm_struct_id_k
      FROM au_subtyp au_subtyp1
     WHERE au_subtyp1.create_ts =
              (SELECT MAX (au_subtyp2.create_ts)
                 FROM au_subtyp au_subtyp2
                WHERE au_subtyp2.au_id_k = au_subtyp1.au_id_k
                  AND au_subtyp2.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
                  AND au_subtyp2.eff_dte =
                         (SELECT MAX (au_subtyp3.eff_dte)
                            FROM au_subtyp au_subtyp3
                           WHERE au_subtyp3.au_id_k = au_subtyp2.au_id_k
                             AND au_subtyp3.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
                             AND au_subtyp3.eff_dte <= TO_DATE ('2012-12-31', 'YYYY-MM-DD')))
       AND au_subtyp1.exists_flg = 'Y'
    Explain Plan
    Plan hash value: 2534321861
    | Id  | Operation                | Name      | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT         |           |     1 |    91 |       | 33265   (2)| 00:06:40 |
    |*  1 |  FILTER                  |           |       |       |       |            |          |
    |   2 |   HASH GROUP BY          |           |     1 |    91 |       | 33265   (2)| 00:06:40 |
    |*  3 |    HASH JOIN             |           |  1404K|   121M|    19M| 33178   (1)| 00:06:39 |
    |*  4 |     HASH JOIN            |           |   307K|    16M|  8712K| 23708   (1)| 00:04:45 |
    |   5 |      VIEW                | VW_SQ_1   |   307K|  5104K|       | 13493   (1)| 00:02:42 |
    |   6 |       HASH GROUP BY      |           |   307K|    13M|   191M| 13493   (1)| 00:02:42 |
    |*  7 |        INDEX FULL SCAN   | AUSU_PK   |  2809K|   125M|       | 13493   (1)| 00:02:42 |
    |*  8 |      INDEX FAST FULL SCAN| AUSU_PK   |  2809K|   104M|       |  2977   (2)| 00:00:36 |
    |*  9 |     TABLE ACCESS FULL    | AU_SUBTYP |  1404K|    46M|       |  5336   (2)| 00:01:05 |
    Predicate Information (identified by operation id):
       1 - filter("AU_SUBTYP1"."CREATE_TS"=MAX("AU_SUBTYP2"."CREATE_TS"))
       3 - access("AU_SUBTYP2"."AU_ID_K"="AU_SUBTYP1"."AU_ID_K")
       4 - access("AU_SUBTYP2"."EFF_DTE"="VW_COL_1" AND "AU_ID_K"="AU_SUBTYP2"."AU_ID_K")
       7 - access("AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd
                  hh24:mi:ss') AND "AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
           filter("AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND
                  "AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
       8 - filter("AU_SUBTYP2"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
       9 - filter("AU_SUBTYP1"."EXISTS_FLG"='Y')
    Code 2:
    I already raised a thread a week back, and Dom suggested the following query. It is cost-effective, but the performance is the same and it uses the same amount of temp tablespace.
    SELECT au_id_k, pgm_struct_id_k
      FROM (SELECT au_id_k,
                   pgm_struct_id_k,
                   ROW_NUMBER() OVER (PARTITION BY au_id_k
                                      ORDER BY eff_dte DESC, create_ts DESC) rn,
                   create_ts, eff_dte, exists_flg
              FROM au_subtyp
             WHERE create_ts < TO_DATE('2013-01-01','YYYY-MM-DD')
               AND eff_dte  <= TO_DATE('2012-12-31','YYYY-MM-DD')) d
     WHERE rn = 1
       AND exists_flg = 'Y'
    --Explain Plan
    Plan hash value: 4039566059
    | Id  | Operation                | Name      | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT         |           |  2809K|   168M|       | 40034   (1)| 00:08:01 |
    |*  1 |  VIEW                    |           |  2809K|   168M|       | 40034   (1)| 00:08:01 |
    |*  2 |   WINDOW SORT PUSHED RANK|           |  2809K|   133M|   365M| 40034   (1)| 00:08:01 |
    |*  3 |    TABLE ACCESS FULL     | AU_SUBTYP |  2809K|   133M|       |  5345   (2)| 00:01:05 |
    Predicate Information (identified by operation id):
       1 - filter("RN"=1 AND "EXISTS_FLG"='Y')
       2 - filter(ROW_NUMBER() OVER ( PARTITION BY "AU_ID_K" ORDER BY
                  INTERNAL_FUNCTION("EFF_DTE") DESC ,INTERNAL_FUNCTION("CREATE_TS") DESC )<=1)
       3 - filter("CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND "EFF_DTE"<=TO_DATE('
                  2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
    Thanks,
    Vijay

    Hi Justin,
    Thanks for your reply. I am running this in our test environment, as I don't want to run it in the production environment right now. The test environment holds 2,809,605 records (about 2.8 million).
    The query returns 281,699 rows (about 280 thousand), so the selectivity is 0.099. There are 2,808,905 distinct combinations of create_ts, eff_dte, and exists_flg. I am sure the index scan is not going to help much, as you said.
    The core problem is that both queries use a lot of temp tablespace. When we use this query to join to other tables (which have the same design as above), the temp tablespace grows even bigger.
    Both the production and test environments are 3-node RAC clusters.
    First Query...
    CPU used by this session     4740
    CPU used when call started     4740
    Cached Commit SCN referenced     21393
    DB time     4745
    OS Involuntary context switches     467
    OS Page reclaims     64253
    OS System time used     26
    OS User time used     4562
    OS Voluntary context switches     16
    SQL*Net roundtrips to/from client     9
    bytes received via SQL*Net from client     2487
    bytes sent via SQL*Net to client     15830
    calls to get snapshot scn: kcmgss     37
    consistent gets     52162
    consistent gets - examination     2
    consistent gets from cache     52162
    enqueue releases     19
    enqueue requests     19
    enqueue waits     1
    execute count     2
    ges messages sent     1
    global enqueue gets sync     19
    global enqueue releases     19
    index fast full scans (full)     1
    index scans kdiixs1     1
    no work - consistent read gets     52125
    opened cursors cumulative     2
    parse count (hard)     1
    parse count (total)     2
    parse time cpu     1
    parse time elapsed     1
    physical write IO requests     69
    physical write bytes     17522688
    physical write total IO requests     69
    physical write total bytes     17522688
    physical write total multi block requests     69
    physical writes     2139
    physical writes direct     2139
    physical writes direct temporary tablespace     2139
    physical writes non checkpoint     2139
    recursive calls     19
    recursive cpu usage     1
    session cursor cache hits     1
    session logical reads     52162
    sorts (memory)     2
    sorts (rows)     760
    table scan blocks gotten     23856
    table scan rows gotten     2809607
    table scans (short tables)     1
    user I/O wait time     1
    user calls     11
    workarea executions - onepass     1
    workarea executions - optimal     9
    Second Query
    CPU used by this session     1197
    CPU used when call started     1197
    Cached Commit SCN referenced     21393
    DB time     1201
    OS Involuntary context switches     8684
    OS Page reclaims     21769
    OS System time used     14
    OS User time used     1183
    OS Voluntary context switches     50
    SQL*Net roundtrips to/from client     9
    bytes received via SQL*Net from client     767
    bytes sent via SQL*Net to client     15745
    calls to get snapshot scn: kcmgss     17
    consistent gets     23871
    consistent gets from cache     23871
    db block gets     16
    db block gets from cache     16
    enqueue releases     25
    enqueue requests     25
    enqueue waits     1
    execute count     2
    free buffer requested     1
    ges messages sent     1
    global enqueue get time     1
    global enqueue gets sync     25
    global enqueue releases     25
    no work - consistent read gets     23856
    opened cursors cumulative     2
    parse count (hard)     1
    parse count (total)     2
    parse time elapsed     1
    physical read IO requests     27
    physical read bytes     6635520
    physical read total IO requests     27
    physical read total bytes     6635520
    physical read total multi block requests     27
    physical reads     810
    physical reads direct     810
    physical reads direct temporary tablespace     810
    physical write IO requests     117
    physical write bytes     24584192
    physical write total IO requests     117
    physical write total bytes     24584192
    physical write total multi block requests     117
    physical writes     3001
    physical writes direct     3001
    physical writes direct temporary tablespace     3001
    physical writes non checkpoint     3001
    recursive calls     25
    session cursor cache hits     1
    session logical reads     23887
    sorts (disk)     1
    sorts (memory)     2
    sorts (rows)     2810365
    table scan blocks gotten     23856
    table scan rows gotten     2809607
    table scans (short tables)     1
    user I/O wait time     2
    user calls     11
    workarea executions - onepass     1
    workarea executions - optimal     5
    Thanks,
    Vijay
    Edited by: Vijayaraghavan Krishnan on Nov 28, 2012 11:17 AM
    Edited by: Vijayaraghavan Krishnan on Nov 28, 2012 11:19 AM
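    For this "latest row per key" requirement, Oracle's FIRST/LAST aggregates are a third formulation worth benchmarking against the two plans above: they pick the row with the greatest (eff_dte, create_ts) per au_id_k in a single table scan plus one group-by. This is only a sketch against the same au_subtyp columns, not a tested solution, and note that it resolves exact (eff_dte, create_ts) ties differently than ROW_NUMBER, which keeps one arbitrary row:
    SELECT au_id_k, pgm_struct_id_k
      FROM (SELECT au_id_k,
                   -- pgm_struct_id_k taken from the row with the highest (eff_dte, create_ts)
                   MAX(pgm_struct_id_k) KEEP (DENSE_RANK LAST ORDER BY eff_dte, create_ts) pgm_struct_id_k,
                   -- exists_flg from that same "latest" row, so it can be filtered outside
                   MAX(exists_flg)      KEEP (DENSE_RANK LAST ORDER BY eff_dte, create_ts) exists_flg
              FROM au_subtyp
             WHERE create_ts < TO_DATE('2013-01-01','YYYY-MM-DD')
               AND eff_dte  <= TO_DATE('2012-12-31','YYYY-MM-DD')
             GROUP BY au_id_k)
     WHERE exists_flg = 'Y'
    Like Code 2 this still reads the whole table, so it will not eliminate the temp usage, but a HASH GROUP BY workarea is often smaller than the WINDOW SORT one.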

  • Is there a way to bulk delete records

    It seems that I have a lot of duplicated records in my "Central" area, so I want to either filter by Area and then delete the duplicates, if there is a way to do that, or bulk delete every record that has "Central" in the Area column.
    Is that possible?

    Are you able to select more than 100 through the Content and Structure manager?
    OR
    I found a TechNet article that uses PowerShell to perform a bulk delete; it might be your best bet to start here:
    http://social.technet.microsoft.com/wiki/contents/articles/19036.sharepoint-using-powershell-to-perform-a-bulk-delete-operation.aspx
    Edit: is this you?
    http://sharepoint.stackexchange.com/questions/136778/is-there-a-way-to-bulk-delete-records ;)

  • Any way to restore deleted records from the VBAP table?

    Hi Guru,
    Is there any way to restore deleted records from VBAP?
    A backup of the whole quality server has been taken. Is there any way to restore only the
    deleted VBAP data from that backup?
    Regards
    Durgesh

    Hi Sahu ji,
    You will not be able to get those records back.
    Consider this: if the issue is in development, there is no need to worry.
    If it is in quality or production, then usually a copy of the system exists, and that can help you.
    Also, check whether there is any report that exports this data in some other form as a backup.
    Hope it helps you.

  • Best Way To merge Customer Records

    Hi community,
    What is the best way to merge customer records for the same person who may have used different email IDs to correspond with a BC website?

    Not in BC, no. You would need to export a custom report, sort it in Excel, shape it into a customer import and bring it back in, or create some API that goes through and cleans it up.

  • Automated way to update PIR records in SAP via an uploadable CSV file

    HI,
      Can we put in place an automated way to update PIR records in SAP via an uploadable CSV file? Right now we have to manually update PIRs by going into ME12 and changing the pricing and PIR timing by hand. If anyone has suggestions on how to go about it, please pool them here. Do we have a BAPI related to this?
    Thanks,
    Sindhu.

    I would suggest you check the ORDERS05 IDoc type (ORDCHG message type).
    FM - idoc_input_ordchg

  • I accidentally reverted a track to saved, is there any way to recover the recording?

    I accidentally reverted a track to saved. Is there any way to recover the recording?

    Do you mean you reverted your project to saved? If you are lucky, the recording might still be in the project package (if you copied a region of your recording to another track).
    Open the project package in the Finder by ctrl-clicking the project and look at the recordings in the "Media" subfolder.
    If there is a recent Time Machine backup of your project, enter Time Machine, locate the version that may contain your recording, copy the Media folder to the Desktop or any place you like, and then reimport the missing track into your project.

  • HT1390 I downloaded two movies 4 days ago before a school trip. I watched the first on the bus on the way. I am now on the way home and want to watch the other movie and it disappeared. This has happened to me in the past and I don't understand the problem

    I downloaded two movies 4 days ago before a school trip. I watched the first on the bus on the way there. I am now on the way home and want to watch the other movie, and it has disappeared. This has happened to me in the past and I don't understand the problem!!

    Drrhythm2 wrote:
    What's the best solution for this?
    Copy the entire /Music/iTunes/ folder from her old computer to /Music/ in her account on this new computer.

  • Can we record "Watch" history?

    In the debugger there are some watch windows. How can I record the parameters shown in those windows while stepping through the code, so that I can review them later?
    Thanks

    You can select all of the content in the watch window, then Ctrl-C to copy it and Ctrl-V to paste it into a text editor.
    It is a bit of trouble, but it works in practice.
    Thanks
    Hello,
    I am afraid that logging the watch history is not supported in current versions of Visual Studio; the way you shared could be a workaround for now.
    If possible, you could submit this feature request:
    http://visualstudio.uservoice.com/forums/121579-visual-studio. The Visual Studio product team is listening to user voice there.
    You can send your idea there and people can vote on it. If you submit this suggestion, you might post the link here and I will help you vote for it.
    Thanks for your understanding.
    Regards,
    Carl

  • A better way to delete IT2001 records in batch

    Hi Gurus,
    A little problem here: due to a data source error, I need to delete IT2001 records in batch.
    Currently I am using the standard report RPUREOPN to do this. But RPUREOPN does not update the corresponding deductions in the IT2006 quotas, so I have to delete and re-upload the corresponding IT2006 quota data, which is inconvenient.
    Is there a better way to do this job? I mean deleting IT2001 records in batch while also updating the deductions in IT2006.
    Thanks in advance!
    Br, Kee

    After running the report you mentioned above to delete the absences/attendances, you can try running the report RPTUPD00 to revaluate the attendances and absences.
    Regards,
    Divya

  • Is there a way to speed up recording times?

    I've been reading here for a couple of days and the amount of information has been very helpful. I apologize if my question is ignorant or has been posted here before, but I am planning on starting a local VHS-to-DVD conversion store. At the moment I am converting my relatives' videos to get a feel for the editing and the amount of time involved.
    -I am currently running a Mac
    -Have a basic VHS player
    -Use Encore for editing and DVD Menus
    -Have an ADVC 110 for the conversion
    I have been able to do everything, but my biggest concern is time. Currently I am recording around 8 hours of VHS, and it takes around 10 hours to record all of this to my Mac. After that I have to send the footage to Encore for the DVD work (DVD menus, etc.), which alas takes more time.
    My main question is: What would be a good plan to service more than one client without it taking a couple of days? I want to be able to do this professionally and to do this for many customers. Right now it looks like I can only do a couple a week. I also want to do the best quality of work for clients.
    My only idea right now is to purchase DVD recorders and VCRs. I've thought about pulling a James Bond and trying to find a local place to see what they do.
    As with everything else, it seems time is the biggest factor. I appreciate any insight/feedback or any reading materials that can help educate me on my options. Once again I apologize if this question is too vague or ignorant.

    I have your setup, and there is no way to speed up what you are doing: it's real-time conversion. I also brought my older material to a service bureau. Their setup filled an entire room, with many thousands of dollars' worth of equipment! Yet I believe it was still real-time conversion, i.e. minute for minute. You might want to check out a professional setup. You will also not want to use iMovie for professional DVD burning of DV material.

  • Is there a way to vertically scroll records in an interactive report

    Hi, I have an interactive report which returns over 100,000 records. Currently, to move through these records one must press the pagination icon (x - y of z records).
    Is there a way to have a vertical scroll bar?
    This used to be accomplished in SQL reports with <div style="overflow: auto; height: 500px; width: 1000px;"> set in the header and pagination turned off.
    thanks, Karen
    Edited by: KarenH on Apr 14, 2011 1:23 PM

    Just to clarify, I am able to add a region header of <div style="overflow: auto; height: 400px;"> to a regular SQL report to have it scroll... but I am unable to do so for an interactive report. Thanks again.
    Edited by: KarenH on Apr 25, 2011 1:42 PM
    Edited by: KarenH on Apr 25, 2011 1:43 PM

  • Is there any way to name the records from a data merge?

    I have a spreadsheet that has all the names of the images I'm using for a data merge. What I want to do is name the resulting records after the filenames in one of the columns in the spreadsheet. Is there any way to do this? Or is there some other way to individually name the records from a data merge? I don't have any scripting prowess, so I can't really mess with that. Any help is greatly appreciated! Thanks!

    You merge it along with the others, just like any field. ID cannot do the naming for you during the merge, but most spreadsheets have lots of logic capabilities that should allow you to extract a name for the record from the filename in the column of your choice. The only reason ID knows to find the image instead of writing the filename is that you've used a special name for the field.

  • Best way to insert millions of records in SQL Azure on a daily basis?

    I am maintaining millions of records in SQL Server 2008 R2, and now I intend to migrate them to SQL Azure.
    In the existing SQL Server 2008 R2 system, a few SSIS packages and stored procedures first truncate the existing records and then insert into the table, which holds
    approx. 26 million records, in 30 minutes on a daily basis (as the system demands).
    After migrating to SQL Azure, I am unable to perform these operations as
    quickly as I did on SQL Server 2008; sometimes I get a request timeout error.
    While searching for a faster way, many people suggested a batch process or BCP, but batch processing is not suitable in my case because it takes too long to insert those records. I need some faster, more efficient way on SQL Azure.
    Hoping for some good suggestions.
    Thanks in advance :)
    Ashish Narnoli

    +1 to Frank's advice.
    Also, please upgrade your Azure SQL Database server to V12, as you will receive higher performance on the premium tiers. As you scale up your database for your bulk insert, remember that SQL Database charges by the hour. To minimize costs, scale back down when the inserts have completed.
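    In addition to scaling up, a pattern that often avoids the request timeout is to keep each transaction small by moving the rows in chunks rather than in one 26-million-row statement. Below is a minimal T-SQL sketch of the idea, with hypothetical names: dbo.Staging is assumed to be loaded by the SSIS package and dbo.Target is the destination table; the batch size and column list would need tuning for the real schema.
    DECLARE @BatchSize int = 100000;  -- rows per transaction; tune per service tier
    DECLARE @Rows int = 1;
    WHILE @Rows > 0
    BEGIN
        BEGIN TRANSACTION;
        -- Move one chunk: delete it from staging and insert the deleted rows into the target.
        DELETE TOP (@BatchSize) s
        OUTPUT deleted.Id, deleted.Payload
          INTO dbo.Target (Id, Payload)
          FROM dbo.Staging AS s;
        SET @Rows = @@ROWCOUNT;
        COMMIT TRANSACTION;
    END
    Each iteration commits on its own, so a timeout or throttling event only costs the current chunk instead of the whole load.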
