Some thoughts based on the OVP sample AkamaiSampleRandomSeek

Thanks a lot Brian.
Thousands of thanks to everyone!
I have looked at AkamaiSampleRandomSeek, which imports AkamaiEnhancedNetStream to implement the jump-point (random seek) service.
When doing a seek, we should run a check first.
If the seek time is in range, which means the content is already loaded, we just call the Flash seek method;
if not, we call the overridden seek function, build a new request based on the seek time, and
send it to the server to get the new content.
My plan now goes as follows:
1. A plugin which can catch the random seek action and return the seek time.
2. Imitating AkamaiEnhancedNetStream's approach, build a new request URL such as
http://aaa.com/a.flv?starttime=xxxx based on the seek time from step 1, where starttime indicates the seek time.
3. Send the new request.
Questions:
1) Does my approach fit the OSMF framework? Is it feasible?
2) If yes, could you offer me some advice?
Thanks.

1. Yes, this should be feasible.
2. I suspect the implementation would involve the creation of a custom seek trait (perhaps a subclass of NetStreamSeekableTrait) which implements the logic you describe (check whether the seek can work with the downloaded bytes, or whether the client needs to make another request). Once this custom trait is integrated into a MediaElement (probably a subclass of VideoElement), it would just be a matter of including it in a plugin.
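To make the check-then-request branch concrete, here is a minimal sketch in TypeScript (illustrative only; the real implementation would be ActionScript against the OSMF API, and the names RandomSeekStream, LoadedRange, nativeSeek and openUrl are hypothetical, standing in for NetStream.seek() and a fresh URL request):

interface LoadedRange {
  start: number; // seconds: beginning of the downloaded range
  end: number;   // seconds: end of the downloaded range
}

class RandomSeekStream {
  constructor(
    private baseUrl: string,                 // e.g. "http://aaa.com/a.flv"
    private loaded: LoadedRange,             // range already downloaded
    private nativeSeek: (t: number) => void, // stands in for NetStream.seek()
    private openUrl: (url: string) => void   // stands in for a new server request
  ) {}

  seek(seekTime: number): void {
    if (seekTime >= this.loaded.start && seekTime <= this.loaded.end) {
      // In range: the content is already loaded, so a plain Flash seek suffices.
      this.nativeSeek(seekTime);
    } else {
      // Out of range: build a new request keyed on the seek time, mirroring the
      // http://aaa.com/a.flv?starttime=xxxx scheme described in the question.
      this.openUrl(this.baseUrl + "?starttime=" + Math.floor(seekTime));
    }
  }
}

In the custom seek trait suggested above, this branch would live inside the seek() override, so the rest of the framework never needs to know whether a network round trip took place.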

Similar Messages

  • Colour Management - who does what - Some thoughts now the smoke is clearing

    First up, thanks very much to everyone who contributed their ideas and expertise to my recent query here, when I was seeking help for a problem with colour management issues when printing a magazine I edit. I have a ton of suggestions  to work through and study but the smoke is slowly clearing and it raises some interesting points which I think are worth recounting.
    First of all, I have been editing short run magazines now for 25 years, at first part time and later on a professional contract basis.  I am not a trained graphic designer nor a trained printer. I did start out training as a graphic designer, many years ago but gave it up for a career in IT (as a networking specialist). That was full time until 10 years ago, although I did some freelance writing and editing in my spare time.
    And yes, I did start originally with scissors and cut and paste, and moved on through black and white with spot colour and Pagemaker software  to full colour and InDesign today. One thing which may be different about my experience to most of yours is that I am a PC user and always have been. All my editing and graphics work has always been done on a PC - Pagemaker was our DTP package of choice for a long time and we supplemented this with Corel-Draw (which has a range of graphics handling options). All my software is legal and I always register it and keep it up to date. I have used the same graphic designer for quite a few years now and whenever we upgrade our software he goes and gets trained on the latest release.
    Around 10 years ago I was offered the chance to edit a specialist short run magazine (not the current one). This was a chance I took and gave up the day job and became a full time freelance. Editing is not my main or only source of income. I am also a freelance writer and photographer and heritage consultant and I have a specialist image library. I sell my own work - articles and pictures - to the national and local press. I also write books (non fiction) on commission. The magazine editing is really an extension of my interest in historic landscapes. I have never had any complaints, or problems, with the freelance work, photos and archived images I sell. Clients include national newspapers here in the UK, national magazine groups and my books are available in national bookstore chains. I supply my work digitally, naturally, and it includes photos I have taken myself and items which I have scanned into my library of historical images and store on line. No reported colour management issues there.
    I have always enjoyed a good relationship with my publishers and printers because I seek to be as professional as possible, which means delivering my stuff on time, to the required standard so that minimum intervention is required from them. This does assume that I have a clear brief from them on what they need from me.
    Recently this approach has not been enough to avoid colour management issues with the short run magazine I currently edit. I have been wondering when  and where things went astray and date it back to the upgrade to InDesign two years ago. However it may have its roots in my earlier decision to use PCs not Macs for my work.
    Until 4 years ago I had used the same printers for magazine editing for many years. They were a well respected firm specialising in short run magazines. They were not far from where I live and work and if there was a problem I would go over and discuss it with them. They were happy, and competent, to handle Pagemaker files generated on a PC and convert my rgb images to cmyk if there was any concern about the colour balance. On a few occasions I paid them to scan a photo for me. However 4 years ago the owner decided to retire and shut up shop. I needed to find a new printers and it had to be someone who specialised in short run magazines and could meet the budget of the charity I edit for. Also someone who could handle copy generated using Pagemaker running on a PC. I chose a printers I had used briefly in the past  where I knew some of the staff and was promised PC based Pagemaker would not be a problem. I even got this in writing. I started to send them proofs generated using Pagemaker v7 on my PC.
    I soon found that although they had agreed they could handle Pagemaker on a PC, in fact they had only a few PC based clients and were using a single ageing PC running Pagemaker to proof their work. In fact nearly all their jobs were Quark based. I was also told we had to supply CMYK images, although not given any further requirement, so I now did the conversions from rgb to CMYK using my PhotoPaint software. (There are quite a few settings in Corel for the conversion but there was no guidance from the printer on which to use, so to be honest it did not occur to me that it might be a problem.)
    Now of course I understand that the drive to get customers to supply CMYK images was a Quark driven requirement back in the late 1990s. I did not and do not use Quark so knew nothing of this. I did have some early colour problems and font incompatibilities with the new printers and was pressured by their senior Graphic Designer (who designed for their own contract clients) to upgrade to InDesign and provide them with a .pdf, which I was assured would solve all my problems. The .pdf would be the same as the final printed magazine because "it would not require any further intervention by the printers".
    I expect you are collectively throwing up your hands in horror at this point, but I think he was speaking genuinely. The creation of a .pdf using InDesign is widely promoted as the ultimate answer to all printing issues. I have encountered this recently with a lot of printers' salesmen, and my friend, who edits a learned journal, has just been told the same thing by her printers, to get her to upgrade to ID. Incidentally she also uses a PC.
    So we upgraded our design process in house to InDesign and our graphic designer went on a course, two courses in fact. When we came to produce our first .pdf using ID, the printers' Senior Graphic Designer came on the phone and talked our designer through the ID Export function. I think he may at that time have told him to create a preset profile with MPC and the defaults, but to be honest I don't recall. We were never sent anything in writing about what settings we needed to match theirs. I continued to have intermittent colour management problems but put this down to my photos. Things came to a head with the most recent issue, where the colours were badly out on the cover, supplied by a press agency and taken by a professional photographer. The printers seemed to have little or no idea about possible causes.
    Initially I thought that part of the underlying cause must lie in some mismatch between what I was sending the printers and what they expected to receive so I asked them to specify what I should send. All they said was use Profile preset as MPC setting and accept  the defaults which accompany it.
    So I came on here looking for a solution. A lot of people were keen to offer their own experience which I really appreciate. However the messages could be conflicting. Some of you suggested it was the underlying cover photo which was at fault, some that it was my monitor which needed better calibration.
    Many of you here said that part of the problem, if not the whole problem, was the way I was generating my CMYKs for the printer and I should use Photoshop to do this. You also mentioned a number of possible colour management settings which I should try.
    At times the advice seemed to change tack. There were suggestions that the colour management issues I had  were nothing to do with the printers, that it was up to me not them. Quite a lot of you said I needed to be better informed about Colour Management issues. I agree, but I had never had any previously (maybe good luck, maybe good support from my previous printer) so I was not even aware that I needed to be better informed.  Some of you mildly chastised me for not finding out more and doing more to manage my own colour management with the switch to ID. To which I can only say if I had needed to train up, I would have done. I did not realise I needed to.  Nor was my designer aware of the issues as colour management was not really covered on his ID courses which were about typesetting and design.
    Some of you even seemed to hint that unless I was prepared to use an expensive high end printer or effectively retrain as a print specialist or get my graphic designer to do so, then I probably shouldn't be in the magazine editing game at all. OK maybe that is a bit harsh but you get the drift.
    The fact is that printing is much more accessible these days to all sorts of people and in particular to people with PCs. My brother lives in a large village in an isolated area and produces a village magazine which has been a great success. It is in black and white with spot colour but he would like to move to an all colour issue. He is a bit nervous of the colour management issues as he has no experience of graphic design and is his own designer using a low end entry level design package. He too uses a PC. The printers reps all tell him the same thing they tell me, that all he needs to supply is a .pdf using InDesign.
    Somewhere I feel a black hole has developed, maybe back in the 1990s with Quark 4.11. A lot of printers standardised on that, and set up a workflow and prepress dependent on CMYK images as provided by the clients. They assumed that the clients would be doing their own colour management. This approach also assumes everyone is using Quark on a Mac with the full range of Adobe software. When it became possible to generate .pdfs using InDesign, this was held out to users as the Holy Grail of magazine printing, even though their workflows and prepress were still based on Quark 4.11 principles. Any underlying colour management issues the clients now have to tackle themselves.
    So now we have the situation in which I find myself, having to learn from scratch a good deal about colour management issues so that I can tell the printers what is needed for my magazine. Meanwhile all the printing salesmen, the ones I encounter anyway, are still busy pushing the InDesign to .pdf as the "be all and end all" solution. Some re-education is needed for all parties I think.

    I am glad to see that the sun is peeping through the clouds.
    I apologise for my Aussie-style straight talk earlier, but as I said before it was not directed personally at you but in the direction of others whom you epitomize, repeating a conversation I have had many times over the last 10 years or so where respectable, well-meaning photographers, designers and other contributors refuse to accept that colour management is being thrust upon them.
    It is a simple fact of life, there is this 'new' thing that has butted into the very root of our trades and changed the most basic principles of printing and photography.  We expect that this kind of thing does not happen but the industry we now work in is not the same one we trained in twenty years ago.
    Many printers are still struggling with the same conflict, so many tradespeople cannot accept this change.
    This is exacerbated by the fact that colour management is so complicated to learn and implement and confounded by the fact that the default settings and a clumsy workflow often yield acceptable results with incorrect, generic settings, hence the old 'use InDesign and make a PDF and it will be ok' route.
    When the chain of colour management includes the photographer, the photographer's client, the designer, the other designer maybe, the prepress person, and the platemaker, and a single incorrect click by any one of those can kill the CM it is not surprising that in the end when someone is looking back to see where it fell over they usually never find out.....   They will meet someone who says ' I never touched it, I simply opened the file and scaled it and closed it'.  And that person will be a reputable photographer or designer (and CLIENT) who has no idea they just broke it.  So what do we do?  We go with the generic setting that seems to yield adequate results therefore avoiding the confrontation. 
    You need to understand the situation of the printer who took his business through the 'early' days of colour management, we had all kinds of very reputable sources supplying incorrect files, we did not have the expertise yet to be able to address the entire workflow, it would have meant training photographers and designers all through the best design houses and national institutions, because they blamed the printer.  Only in the last few years have I seen these people coming around to the fact that they bear responsibility for implementing their own cm and maintaining it through their own work.
    Sadly, many high end sources are still not there, and I mean HIGH end!  Probably the ones that don't even visit this forum because they want to keep blaming the printer... They tend to live with the poor quality reproductions and just pull up the worst ones and fiddle with those and try to avoid the 'elephant in the room'.
    I am sorry to say that it was not practical for a printer to reject mismanaged files for fear of losing clients who would happily accept less than perfect results in order to avoid the painful truth that was being told to them.  The best thing we could do was to gently make those clients aware that their workflow was imperfect and hope to show them how we could help...  Many print shops do not have someone knowledgeable enough or patient enough to do this, or the boss does not understand the issue either and tries to work around it to keep his jobs flowing in the expectation that all those experts in the chain will eventually tame the thing.
    The many experts on this holy forum are waaaaayyyy ahead of the printing industry in general and photographers and designers in general in their understanding of colour management workflow.  I have seen first hand how reputable local industry people and trainers alike are spreading misinformation and bad techniques, when I discovered these forums back in about 2002 I found that they opened up a whole new galaxy of knowledge and facts that actually worked and made sense, unlike what I had been told locally....  This forum taught me what the Adobe text books did not, the Tech' teachers did not, local 'experts' did not! 
    I tell all interested people to join these forums and learn to discriminate between the good and bad information.

  • How do I set up a 2 column fixed, left sidebar... in CC Dreamweaver based on one of the 16 samples in Dreamweaver CS5?

    I am in Graphic Design school and we are studying Dreamweaver CS5.  One of the projects is to create a new page based on one of the 16 samples available which is the 2 column, fixed, left sidebar, header and footer option.
    However, in CC Dreamweaver, I only see the 2 column, fixed, right sidebar option.

    Change line 62 of the layout's CSS to
    float: left;

  • Is there a way to hide some reports based on the selected values in prompt.

    Hi Experts,
    Is there a way to hide some reports based on the selected values in a prompt?
    For example, if a year is selected in the prompt, then a year-wise report should be displayed.
    If both a year and a half year are selected in the prompt's drop-down, then 2 reports should appear: one year-wise and another half-year-wise. Kindly look into this.
    Regards
    Ashish

    Hi,
    Use presentation variables in the prompts for year, half year, quarter and month. For example, y is the presentation variable for year; in the same way, h for half year, q for quarter and m for month.
    Create four intermediate reports. For example, report r1 with only the year column, r2 with only the half year column, r3 with the quarter column and r4 with the month column.
    In each report (r1, r2, r3, r4), filter the column on its presentation variable (y, h, q, m).
    Use four sections.
    Section 1 - place the report that should appear when only a year is selected.
    Section 2 - place the report that should appear for year and half year.
    Section 3 - place the report that should appear for year, half year and quarter.
    Section 4 - place the report that should appear for year, half year, quarter and month.
    Apply guided navigation to each section. For Section 1:
    Properties -> Guided Navigation -> check Reference Source Request (Yes) -> select report r1 (year) -> check Show Section (if request returns rows).
    In the same way, do the remaining sections: Section 2 (select r2), Section 3 (select r3) and Section 4 (select r4).
    Thanks,
    Srikanth
    http://bintelligencegroup.wordpress.com/

  • In 8.1.2 I have unchecked the Ken Burns effect in some individual slides. Sometimes it turns itself back on. At first I thought I'd made the mistake myself, but it has happened many times.

    In 8.1.2 I have unchecked the Ken Burns effect in some individual slides. Sometimes it turns itself back on. At first I thought I'd made the mistake myself, but it has happened many times. Can I avoid this? Is the software flawed?

    Try trashing the com.apple.iPhoto.plist file from the HD/Users/Your Name/Library/Preferences folder. (Remember you'll need to reset your User options afterwards. These include minor settings like the window colour and so on. Note: if you've moved your library you'll need to point iPhoto at it again.)
    What's the plist file?
    For new users: every application on your Mac has an accompanying plist file. It records certain User choices. For instance, in your favourite Word Processor it remembers your choice of Default Font, and in your Web Browser it remembers things like your choice of Home Page. It even recalls which windows you had open last, if your app allows you to pick up from where you left off. The iPhoto plist file remembers things like the location of the Library, your choice of background colour, whether you are running a Referenced or Managed Library, what preferences you have for autosplitting Events and so on. Trashing the plist file forces the app to generate a new one on the next launch, and this restores things to the Factory Defaults. Hence, if you've changed any of these things you'll need to reset them. If you haven't, then no bother. Trashing the plist file is Mac troubleshooting 101.

  • HT1725 Some songs that I paid for fully downloaded, but iTunes is only playing the small sample piece from the iTunes Store. How should I fix this?

    I had a Wish List with about 7 or 8 songs on it and I purchased them today. They all downloaded fine, except for 2. These two songs completed the download according to the computer, but iTunes will only play the short sample available from the iTunes Store. How should I fix this?

    Other people have been having similar problems with songs over the last few days; I assume that there has been a problem with Apple's servers.
    Depending upon what country you are in (music can't be re-downloaded in all countries), try deleting them from your iTunes library and re-downloading them via the Purchased link under Quick Links on the right-hand side of the iTunes Store home page on your computer's iTunes.
    If you aren't in a country where you can re-download music, or if they re-download in the same state, then try the 'Report a Problem' link from your purchase history: log into your account on your computer's iTunes via Store > View My Account and you should then see a Purchase History section with a 'see all' link to the right of it; click on that and you should see a list of your purchases; find those songs and use the 'Report a Problem' link.

  • How to resize the spark datagrid collumns based on the headertext?

    Hi friends,
    I am using the Spark DataGrid to display tabular data in my application. When I set the dataProvider property of the DataGrid, it displays the content exactly as I expected,
    but the width of the columns is based on the content of the dataProvider, and I am not able to see the full column name in the DataGrid's header. I want to display the full column name to the users without setting the column width explicitly, because the data is dynamically returned from the server. Could you please give me some ideas to achieve this?
    Thanks in advance.

    Hi Karthikeyan Subramain,
    You can make use of the typicalItem property to set the column width.
    Here is the link for sample code which uses typicalItem:
    http://butterfliesandbugs.wordpress.com/2011/03/08/its-a-best-practice-to-size-a-spark-datagrids-columns-with-a-typicalitem/
    Hope this will help you
    Thanks and Best regards,
    Pooja Kuber | [email protected] | www.infocepts.com

  • F4 help based on the Value in other field

    Hello
    I have a requirement in which there are two fields, say field1 and field2, in an ALV grid (in which new data can be entered). The F4 help of field2 should be based on the value the user enters in field1. I have checked the BC_ALV* programs but there is no clear help there.
    I have tried setting the parameter ID from the first field's value and then reading that parameter ID in the ON_F4 event of the second field. But where can I set the parameter ID of the first field? ON_F4 of the first field does not have its value, and there is no AFTER_F4 event. There is a parameter E_AFTERF4 in the data-changed event, but to trigger that there should be some event, right?
    Moreover, if at all I get the first field's value, I can use FM F4_INT_TABLE_VALUE_REQUEST to show the refined F4 help in field2. But I am passing a field-symbol table in my grid's SET_TABLE_FOR_FIRST_DISPLAY. What can I pass as parameters here to FM F4_INT_TABLE_VALUE_REQUEST?

    Hello Kallu,
      gt_f4_wa-getbefore  = 'X'.   "refresh the layout before F4
      gt_f4_wa-chngeafter = 'X'.   "refresh the layout after F4
    See the sample F4 code below; there is no need to pass dynpro details.
    *---locals.
      data:  lt_return type table of ddshretval,
             ls_return type ddshretval,
             begin of lt_kostl occurs 0,
              kokrs type kokrs,
              datbi type datbi,
              bukrs type bukrs,
              prctr type prctr,
             end of lt_kostl,
             ls_f4           type  lvc_s_modi.
      field-symbols: <ls_wa> type any,
                     <t_f4>  type lvc_t_modi.
    *---get default values.
      refresh lt_kostl.
      select kokrs
             kostl as prctr
             datbi
             bukrs
        from csks
        into  corresponding fields of table lt_kostl
               where kokrs eq g_kokrs
                 and datbi ge sy-datum
                 and bukrs eq yfit_00049-bukrs.
    *---call FM to display the internal table values.
      call function 'F4IF_INT_TABLE_VALUE_REQUEST'
        exporting
          retfield        = 'YFIT_00050-PRCTR'
          window_title    = 'Profit Center list'
          value_org       = 'S'
          display         = space
        tables
          value_tab       = lt_kostl
          return_tab      = lt_return
        exceptions
          parameter_error = 1
          no_values_found = 2
          others          = 3.
      if sy-subrc <> 0.
        message id sy-msgid type sy-msgty number sy-msgno
                with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      else.
        read table lt_return into ls_return
                   with key fieldname = 'F0004'.
        if sy-subrc eq 0.
          assign er_event_data->m_data->* to <t_f4>.
          ls_f4-fieldname = e_fieldname.
          ls_f4-row_id    = es_row_no-row_id.
          ls_f4-value     = ls_return-fieldval.
          ls_f4-error     = space.
          ls_f4-tabix     = space.
          ls_f4-style     = space.
          ls_f4-style2    = space.
          ls_f4-style3    = space.
          ls_f4-style4    = space.
          append ls_f4 to <t_f4>.
        endif.
      endif.
      er_event_data->m_event_handled = 'X'.
    regards
    Prabhu

  • Some Thoughts On An OWB Performance/Testing Framework

    Hi all,
    I've been giving some thought recently to how we could build a performance tuning and testing framework around Oracle Warehouse Builder. Specifically, I'm looking at ways in which we can use some of the performance tuning techniques described in Cary Millsap/Jeff Holt's book "Optimizing Oracle Performance" to profile and performance tune mappings and process flows, and to use some of the ideas put forward in Kent Graziano's Agile Methods in Data Warehousing paper http://www.rmoug.org/td2005pres/graziano.zip and Steven Feuerstein's utPLSQL project http://utplsql.sourceforge.net/ to provide an agile/test-driven way of developing mappings, process flows and modules. The aim of this is to ensure that the mappings we put together are as efficient as possible, work individually and together as expected, and are quick to develop and test.
    At the moment, most people's experience of performance tuning OWB mappings is firstly to see if it runs set-based rather than row-based, then perhaps to extract the main SQL statement and run an explain plan on it, then check to make sure indexes etc are being used ok. This involves a lot of manual work, doesn't factor in the data available from the wait interface, doesn't store the execution plans anywhere, and doesn't really scale out to encompass entire batches of mapping (process flows).
    For some background reading on Cary Millsap/Jeff Holt's approach to profiling and performance tuning, take a look at http://www.rittman.net/archives/000961.html and http://www.rittman.net/work_stuff/extended_sql_trace_and_tkprof.htm. Basically, this approach traces the SQL that is generated by a batch file (read: mapping) and generates a file that can be later used to replay the SQL commands used, the explain plans that relate to the SQL, details on what wait events occurred during execution, and provides at the end a profile listing that tells you where the majority of your time went during the batch. It's currently the "preferred" way of tuning applications as it focuses all the tuning effort on precisely the issues that are slowing your mappings down, rather than database-wide issues that might not be relevant to your mapping.
    For some background information on agile methods, take a look at Kent Graziano's paper, this one on test-driven development http://c2.com/cgi/wiki?TestDrivenDevelopment , this one http://martinfowler.com/articles/evodb.html on agile database development, and the sourceforge project for utPLSQL http://utplsql.sourceforge.net/. What this is all about is having a development methodology that builds in quality but is flexible and responsive to changes in customer requirements. The benefit of using utPLSQL (or any unit testing framework) is that you can automatically check your altered mappings to see that they still return logically correct data, meaning that you can make changes to your data model and mappings whilst still being sure that it'll still compile and run.
    Observations On The Current State of OWB Performance Tuning & Testing
    At present, when you build OWB mappings, there is no way (within the OWB GUI) to determine how "efficient" the mapping is. Often, when building the mapping against development data, the mapping executes quickly and yet when run against the full dataset, problems then occur. The mapping is built "in isolation" from its effect on the database and there is no handy tool for determining how efficient the SQL is.
    OWB doesn't come with any methodology or testing framework, and so apart from checking that the mapping has run, and that the number of rows inserted/updated/deleted looks correct, there is nothing really to tell you whether there are any "logical" errors. Also, there is no OWB methodology for integration testing, unit testing, or any other sort of testing, and we need to put one in place. Note - OWB does come with auditing, error reporting and so on, but there's no framework for guiding the user through a regime of unit testing, integration testing, system testing and so on, which I would imagine more complete developer GUIs come with. Certainly there's no built-in ability to use testing frameworks such as utPLSQL, or a part of the application that lets you record whether a mapping has been tested, and changes the test status of mappings when you make changes to ones that they are dependent on.
    OWB is effectively a code generator, and this code runs against the Oracle database just like any other SQL or PL/SQL code. There is a whole world of information and techniques out there for tuning SQL and PL/SQL, and one particular methodology that we quite like is the Cary Millsap/Jeff Holt "Extended SQL Trace" approach that uses Oracle diagnostic events to find out exactly what went on during the running of a batch of SQL commands. We've been pretty successful using this approach to tune customer applications and batch jobs, and we'd like to use this, together with the "Method R" performance profiling methodology detailed in the book "Optimising Oracle Performance", as a way of tuning our generated mapping code.
    Whilst we want to build performance and quality into our code, we also don't want to overburden developers with an unwieldy development approach, because we know what will happen: after a short amount of time, it won't get used. Given that we want this framework to be used for all mappings, it's got to be easy to use, cause minimal overhead, and have results that are easy to interpret. If at all possible, we'd like to use some of the ideas from agile methodologies such as eXtreme Programming, SCRUM and so on to build in quality but minimise paperwork.
    We also recognise that there are quite a few settings that can be changed at a session and instance level, that can have an effect on the performance of a mapping. Some of these include initialisation parameters that can change the amount of memory assigned to the instance and the amount of memory subsequently assigned to caches, sort areas and the like, preferences that can be set so that indexes are preferred over table scans, and other such "tweaks" to the Oracle instance we're working with. For reference, the version of Oracle we're going to use to both run our code and store our data is Oracle 10g 10.1.0.3 Enterprise Edition, running on Sun Solaris 64-bit.
    Some initial thoughts on how this could be accomplished
    - Put in place some method for automatically / easily generating explain plans for OWB mappings (issue - this is only relevant for mappings that are set based, and what about pre- and post- mapping triggers)
    - Put in place a method for starting and stopping an event 10046 extended SQL trace for a mapping
    - Put in place a way of detecting whether the explain plan / cost / timing for a mapping changes significantly
    - Put in place a way of tracing a collection of mappings, i.e. a process flow
    - The way of enabling tracing should either be built in by default, or easily added by the OWB developer. Ideally it should be simple to switch it on or off (perhaps levels of event 10046 tracing?)
    - Perhaps store trace results in a repository? reporting? exception reporting?
    - At an instance level, come up with some stock recommendations for instance settings
    - identify the set of instance and session settings that are relevant for ETL jobs, and determine what effect changing them has on the ETL job
    - put in place a regime that records key instance indicators (STATSPACK / ASH) and allows reports to be run / exceptions to be reported
    - Incorporate any existing "performance best practices" for OWB development
    - define a lightweight regime for unit testing (as per agile methodologies) and a way of automating it (utPLSQL?) and of recording the results so we can check the status of dependent mappings easily
    - Other ideas around testing?
    Suggested Approach
    - For mapping tracing and generation of explain plans, a pre- and post-mapping trigger that turns extended SQL trace on and off, places the trace file in a predetermined spot, formats the trace file and dumps the output to repository tables (see the sketch after this list).
    - For process flows, something that does the same at the start and end of the process. Issue - how might this conflict with mapping level tracing controls?
    - Within the mapping/process flow tracing repository, store the values of historic executions, have an exception report that tells you when a mapping execution time varies by a certain amount
    - get the standard set of preferred initialisation parameters for a DW, use these as the start point for the stock recommendations. Identify which ones have an effect on an ETL job.
    - identify the standard steps Oracle recommends for getting the best performance out of OWB (workstation RAM etc) - see OWB Performance Tips http://www.rittman.net/archives/001031.html and Optimizing Oracle Warehouse Builder Performance http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - Investigate what additional tuning options and advisers are available with 10g
    - Investigate the effect of system statistics & come up with recommendations.
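    To make the pre-/post-mapping trigger idea above concrete, here is a minimal sketch of the two ALTER SESSION statements involved; the TypeScript wrapper is purely illustrative (how a trigger actually issues them depends on the mapping environment), but the statements themselves are the standard way to toggle event 10046:
    // Builds the statements a pre-/post-mapping trigger would execute to toggle
    // event 10046 extended SQL trace. Level 12 records bind values and wait events.
    function traceStatements(on: boolean, tag: string): string[] {
      if (on) {
        return [
          // Tag the trace file so it is easy to locate in user_dump_dest.
          "ALTER SESSION SET tracefile_identifier = '" + tag + "'",
          "ALTER SESSION SET EVENTS '10046 trace name context forever, level 12'",
        ];
      }
      return ["ALTER SESSION SET EVENTS '10046 trace name context off'"];
    }
    // Example: statements bracketing a mapping called LOAD_SALES (a hypothetical name).
    console.log(traceStatements(true, "LOAD_SALES").join(";\n"));
    console.log(traceStatements(false, "LOAD_SALES").join(";\n"));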
    Further reading / resources:
    - "Diagnosing Performance Problems Using Extended Trace", Cary Millsap
    http://otn.oracle.com/oramag/oracle/04-jan/o14tech_perf.html
    - "Performance Tuning With STATSPACK" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-sep/index.html?o50tun.html
    - "Performance Tuning with Statspack, Part II" Connie Dialeris and Graham Wood
    http://otn.oracle.com/deploy/performance/pdf/statspack_tuning_otn_new.pdf
    - "Analyzing a Statspack Report: A Guide to the Detail Pages" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-nov/index.html?o60tun_ol.html
    - "Why Isn't Oracle Using My Index?!" Jonathan Lewis
    http://www.dbazine.com/jlewis12.shtml
    - "Performance Tuning Enhancements in Oracle Database 10g" Oracle-Base.com
    http://www.oracle-base.com/articles/10g/PerformanceTuningEnhancements10g.php
    - Introduction to Method R and Hotsos Profiler (Cary Millsap, free reg. required)
    http://www.hotsos.com/downloads/registered/00000029.pdf
    - Exploring the Oracle Database 10g Wait Interface (Robin Schumacher)
    http://otn.oracle.com/pub/articles/schumacher_10gwait.html
    - Article referencing an OWB forum posting
    http://www.rittman.net/archives/001031.html
    - How do I inspect error logs in Warehouse Builder? - OWB Exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case10.pdf
    - What is the fastest way to load data from files? - OWB exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case1.pdf
    - Optimizing Oracle Warehouse Builder Performance - Oracle White Paper
    http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - OWB Advanced ETL topics - including sections on operating modes, partition exchange loading
    http://www.oracle.com/technology/products/warehouse/selfserv_edu/advanced_ETL.html
    - Niall Litchfield's Simple Profiler (a creative commons-licensed trace file profiler, based on Oracle Trace Analyzer, that displays the response time profile through HTMLDB. Perhaps could be used as the basis for the repository/reporting part of the project)
    http://www.niall.litchfield.dial.pipex.com/SimpleProfiler/SimpleProfiler.html
    - Welcome to the utPLSQL Project - a PL/SQL unit testing framework by Steven Feuerstein. Could be useful for automating the process of unit testing mappings.
    http://utplsql.sourceforge.net/
    Relevant postings from the OTN OWB Forum
    - Bulk Insert - Configuration Settings in OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=291269&tstart=30&trange=15
    - Default Performance Parameters
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=213265&message=588419&q=706572666f726d616e6365#588419
    - Performance Improvements
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270350&message=820365&q=706572666f726d616e6365#820365
    - Map Operator performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=238184&message=681817&q=706572666f726d616e6365#681817
    - Performance of mapping with FILTER
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=273221&message=830732&q=706572666f726d616e6365#830732
    - Poor mapping performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=275059&message=838812&q=706572666f726d616e6365#838812
    - Optimizing Mapping Performance With OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=269552&message=815295&q=706572666f726d616e6365#815295
    - Performance of mapping with FILTER
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=273221&message=830732&q=706572666f726d616e6365#830732
    - Performance of the OWB-Repository
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=66271&message=66271&q=706572666f726d616e6365#66271
    - One large JOIN or many small ones?
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=202784&message=553503&q=706572666f726d616e6365#553503
    - NATIVE PL SQL with OWB9i
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270273&message=818390&q=706572666f726d616e6365#818390
    Next Steps
    Although this is something that I'll be progressing with anyway, I'd appreciate any comment from existing OWB users as to how they currently perform performance tuning and testing. Whilst these are perhaps two distinct subject areas, they can be thought of as the core of an "OWB Best Practices" framework and I'd be prepared to write the results up as a freely downloadable whitepaper. With this in mind, does anyone have existing best practices for tuning or testing, have you tried using SQL trace and TKPROF to profile mappings and process flows, or have you used a unit testing framework such as utPLSQL to automatically test the set of mappings that make up your project?
    Any feedback, add it to this forum posting or send directly through to me at [email protected]. I'll report back on a proposed approach in due course.

    Hi Mark,
    interesting post, but I think you may be focusing on the trees, and losing sight of the forest.
    Coincidentally, I've been giving quite a lot of thought lately to some aspects of your post. They relate to some new stuff I'm doing. Maybe I'll be able to answer in more detail later, but I do have a few preliminary thoughts.
    1. 'How efficient is the generated code' is a perennial topic. There are still some people who believe that a code generator like OWB cannot be in the same league as hand-crafted SQL. I answered that question quite definitely: "We carefully timed execution of full-size runs of both the original code and the OWB versions. Take it from me, the code that OWB generates is every bit as fast as the very best hand-crafted and fully tuned code that an expert programmer can produce."
    The link is http://www.donnapkelly.pwp.blueyonder.co.uk/generated_code.htm
    That said, it still behooves the developer to have a solid understanding of what the generated code will actually do, such as how it will take advantage of indexes, and so on. If not, the developer can create such monstrosities as lookups into an un-indexed field (I've seen that).
    2. The real issue is not how fast any particular generated mapping runs, but whether or not the system as a whole is fit for purpose. Most often, that means: does it fit within its batch update window? My technique is to dump the process flow into Microsoft Project, and then to add the timings for each process. That creates a Critical Path, and then I can visually inspect it for any bottleneck processes. I usually find that there are not more than one or two dogs. I'll concentrate on those, fix them, and re-do the flow timings. I would add this: the dogs I have seen, I have invariably replaced. They were just garbage; they did not need tuning at all - just scrapping.
    Gee, but this whole thing is minimum effort and real fast! I generally figure that it takes maybe a day or two (max) to soup up system performance to the point where it whizzes.
    Fact is, I don't really care whether there are a lot of sub-optimal processes. All I really care about is performance of the system as a whole. This technique seems to work for me. 'Course, it depends on architecting the thing properly in the first place. Otherwise, no amount of tuning is going to help worth a darn.
    Conversely (re. my note about replacing dogs) I do not think I have ever tuned a piece of OWB-generated code. Never found a need to. Not once. Not ever.
    That's not to say I do not recognise the value of playing with deployment configuration parameters. Obviously, I set auditing=none, and operating mode=set based, and sometimes, I play with a couple of different target environments to fool around with partitioning, for example. Nonetheless, if it is not a switch or a knob inside OWB, I do not touch it. This is in line with my diktat that you shall use no other tool than OWB to develop data warehouses. (And that includes all documentation!). (OK, I'll accept MS Project)
    Finally, you raise the concept of a 'testing framework'. This is a major part of what I am working on at the moment. This is a tough one. Clearly, the developer must unit test each mapping in a design-model-deploy-execute cycle, paying attention to both functionality and performance. When the developer is satisfied, that mapping will be marked as 'done' in the project workbook. Mappings will form part of a stream, executed as a process flow. Each process flow will usually terminate in a dimension, a fact, or an aggregate. Each process flow will be tested as an integrated whole. There will be test strategies devised, and test cases constructed. There will finally be system tests, to verify the validity of the system as a production-grade whole. (stuff like recovery/restart, late-arriving data, and so on)
    For me, I use EDM (TM). That's the methodology I created (and trademarked) twenty years ago: Evolutionary Development Methodology (TM). This is a spiral methodology based around prototyping cycles within Stage cycles within Release cycles. For OWB, a Stage would consist (say) of a Dimensional update. What I am trying to do now is to graft this onto a traditional waterfall methodology, and I am having the same difficulties I had when I tried to do it back then.
    All suggestions on how to do that grafting gratefully received!
    To sum up, I'm kinda at a loss as to why you want to go deep into OWB-generated code performance stuff. Jeepers, architect the thing right, and the code runs fast enough for anyone. I've worked on ultra-large OWB systems, including validating the largest data warehouse in the UK. I've never found any value in 'tuning' the code. What I'd like you to comment on is this: what will it buy you?
    Cheers,
    Donna
    http://www.donnapkelly.pwp.blueyonder.co.uk

  • Hiding Image field based on the page number in Adobe Form- Script

    Hi Folks,
    I have a problem with the print form that I am working on. I need to insert an image of lines (OMR) based on the page numbers. For the OMR part, the first page will always have 4 lines as an image, the middle pages will have 3 lines as an image and the last page will have two lines as an image.
    I have uploaded these 3 images as BMPs in SE78 and I am using the Xstring of these images in the form. I have two master pages. The first master page is for the first page and the second master page is for the remaining pages. The first master page will always have the 4-line image. I created a positioned subform in master page 2; in that subform I have 2 hidden numeric fields, filled at run time: one for the current page number, the other for the total number of pages. I placed the two images (one for the three-line image and the other for the two-line image) at exactly the same location, positioned dimensions-wise.
    I have written the JavaScript for these two image fields to show or hide based on the page numbering. But somehow it is not working. Can anybody please let me know what I am doing wrong? I am posting my JavaScript here.
    for the three line image:
    var cp = data.PageSet.MasterPage2.OMR.cpage.rawValue;
    var np = data.PageSet.MasterPage2.OMR.npages.rawValue;
    //if(data.#pageSet[0].MasterPage2.OMR.cpage.rawvalue == data.#pageSet[0].MasterPage2.OMR.npages.rawvalue){
    for (var j = 0; j < xfa.layout.pageCount(); j++) {
      if (cp == np) {
        xfa.resolveNode("data.PageSet.MasterPage2[" + j + "]").OMR.OMR2.presence = "hidden";
      } else {
        xfa.resolveNode("data.PageSet.MasterPage2[" + j + "]").OMR.OMR2.presence = "visible";
      }
    }
    For the two line image:
    var cp = data.PageSet.MasterPage2.OMR.cpage.rawValue;
    var np = data.PageSet.MasterPage2.OMR.npages.rawValue;
    for (var j = 0; j < xfa.layout.pageCount(); j++) {
      if (cp == np) {
        xfa.resolveNode("data.PageSet.MasterPage2[" + j + "]").OMR.OMR3.presence = "hidden";
      } else {
        xfa.resolveNode("data.PageSet.MasterPage2[" + j + "]").OMR.OMR3.presence = "visible";
      }
    }
    Please give me a direction, as this is kind of urgent.
    Thanks,
    Srinivas.
    Edited by: srinivas kari on Jun 9, 2010 2:03 AM

    Hi Otto,
    Thanks for the response. You are right, I am stuck with this image. My problem is to keep the OMR marking on each page based on the page number. It is a set of lines within a 10mm x 10mm area at the top right corner, for the sorter machine to know the number of pages to put in the envelope. The logic is: the first page has 4 lines, each 3mm apart; the middle pages have 3 lines; the last page has 2 lines. When the sorter machine picks these up, it sees the first page with 4 lines and counts it as the first page, continues through the 3-line pages as the middle pages, and recognises the 2-line page as the last page. This all happens in the master pages. I have two master pages, one for the first page and the second one for the remaining pages.
    At first I did not know how to achieve this, so I created 3 images: one with 4 lines, and others with 3 lines and 2 lines. The 4-line image was on the first master page. The 3-line and 2-line images were on the second master page, positioned at the same place as the image fields. That's where I was trying this scripting: I was trying to capture the current page and the number of pages and, based on these, place the images.
    Is there any other way to achieve this instead of using images? I thought of sy-uline, but some in the forum said it's not going to work. Even if I use sy-uline, I would have to do some scripting to achieve this, I believe.
    Any inputs on this? Please give me a direction.
    Thanks,
    Srinivas.

  • How can I specify different settings in CustomSettings.ini based on the OS that is being deployed?

    I originally thought this could be achieved using "OSVersion" or "OSCurrentVersion" but (unless I'm mistaken) those variables are in fact tied to the OS from which you actually launch the MDT Wizard, right? i.e. say I run MDT from within
    Windows PE to deploy Windows 8: the "OSVersion" and "OSCurrentVersion" variables will contain values for the Windows PE instance that is currently running MDT, and not the Windows 8 OS I am actually deploying, right? Assuming that is the case,
    is there some variable in MDT that stores version information about the OS you are currently deploying? If so, can I use that in CustomSettings.ini to specify different settings for the specific OSes I deploy?

    Rens,
    First, thanks for responding. Next, please don't take my response the wrong way, but this is a bit of a pet peeve of mine. Maybe I'm just "doing it wrong", but I have used the TaskSequenceID trick in the past and that's actually where my problem lies.
    Too many people recommend this course of action without including the caveats. Your response implies that using TaskSequenceID is as simple as adding "TaskSequenceID" to Priority and then creating entries based on the TSIDs themselves (see the sketch at the end of this post).
    In reality, getting this to work is far more challenging. One need only check the following threads (and several others) for the issues other people have had when trying to use TaskSequenceID in CustomSettings.ini:
    http://social.technet.microsoft.com/Forums/en-US/e17a1952-d1f7-41ef-8231-0d6fcc41882e/mdt-2012-settings-per-task-sequence?forum=mdt#f0810428-e6fd-468e-8c76-ca18b210c191
    http://social.technet.microsoft.com/Forums/en-US/320aafee-07d2-4b96-9138-a902fec7edf5/mdt-2012-custom-rules-by-tasksequenceid-not-working-now
    In fact, following the steps outlined here was the *only* way I was able to get TaskSequenceID to work for me:
    http://www.the-d-spot.org/wordpress/2012/07/20/how-to-use-different-settings-per-task-sequence-with-mdt-2012/
    Even then, it's causing some erratic issues in some areas, like button lag in the Wizard because ZTIGather has to run twice, and pre-checked Applications in the wizard still installing even when they are unchecked. Since my needs are more OS-specific (as opposed
    to Task Sequence specific) I was hoping for a better solution. I suppose I can check BDD.log, but I was hoping someone here had already encountered this same issue and hit upon a solution they could share.
    Anyone?
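    For reference, the TaskSequenceID pattern being debated here looks roughly like the following CustomSettings.ini fragment (a sketch only: WIN8-X64 and WIN7-X86 are hypothetical task sequence IDs, and SkipBitLocker is just a stand-in property):
    [Settings]
    Priority=TaskSequenceID, Default
    [WIN8-X64]
    SkipBitLocker=YES
    [WIN7-X86]
    SkipBitLocker=NO
    [Default]
    OSInstall=Y
    The catch, as the linked threads describe, is that TaskSequenceID is only known after a task sequence has been chosen in the wizard, which is why the workaround re-runs ZTIGather and produces the side effects mentioned above.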

  • I want an example application in C# based on the duplicate detection algorithm

    Duplicate detection algorithm:
    Step 1: Consider the stemmed keywords of the web page.
    Step 2: Based on the starting character, i.e. A-Z, we assume the hash values start with 1-26.
    Step 3: Scan every word from the sample and compare it with the DB (database). Initially the DB contains no key values; once a new keyword is found, generate the respective hash value and store that key value in a temporary DB.
    Step 4: Repeat step 3 until all the keywords are completed.
    Step 5: Store all hash values for a given sample in a local DB (here we used an array list).
    Step 6: Repeat steps 1 to 5 for N samples.
    Step 7: Once the selected samples are done, calculate a similarity measure on the sample hash values stored in the local DB with respect to the web pages in the repository.
    Step 8: From the similarity measure, we can generate a report on the samples as a percentage score. Pages that are 80% similar are considered to be near duplicates.
    This is my duplicate detection algorithm. I want an example Windows application in C# based on this algorithm (a sketch of the similarity step follows below).
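    To make step 7 concrete, here is a minimal sketch of one way to realize the similarity measure (TypeScript is used purely for illustration even though the request is for C#; Jaccard similarity over the keyword sets is an assumption, since the algorithm does not pin the metric down):
    // Sketch: score two samples' stemmed-keyword sets with Jaccard similarity,
    // expressed as a percentage; step 8 treats pages scoring >= 80% as near duplicates.
    function similarityPercent(a: string[], b: string[]): number {
      const setA = new Set(a);
      const setB = new Set(b);
      let common = 0;
      for (const w of setA) {
        if (setB.has(w)) common++;
      }
      const union = setA.size + setB.size - common;
      return union === 0 ? 0 : (100 * common) / union;
    }
    // Example with hypothetical keyword samples.
    const score = similarityPercent(
      ["web", "page", "sample", "detect"],
      ["web", "page", "sample", "report"]);
    const nearDuplicate = score >= 80; // step 8 threshold
    In C#, a HashSet<string> or the Dictionary in the reply below would play the role of the Set above.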

    You may want to use a Dictionary, which has a built-in hash table for the key. See the code below for a starting point:
    using System;
    using System.Collections.Generic;
    using System.Linq;
    List<string> input = new List<string>() { "abc", "abd", "def", "ghi" };
    // Group the words by their first character and key the dictionary on it.
    Dictionary<string, List<string>> dict = input
        .GroupBy(x => x.Substring(0, 1))
        .ToDictionary(g => g.Key, g => g.ToList());
    jdweng

  • Categories based on the level of expertise / experience

    UPDATE
    *From now on I will not follow the posts in this topic. My last message can be found at Re: categories based on the level of expertise / experience*
    Franky
    Hello,
I think the best thing would be to make categories in the forum based on the level of expertise / experience. As far as I can see, the overall quality of the forum has decreased ... a lot.
    I don't know why OPs think that volunteers are here :
- to answer incorrectly formulated questions,
    - to give instant answers,
    - to read the docs for them,
    - to be mentalists,
    - to be psychologists,
    - etc...
    My suggestion is to create 3 categories:
    - novice (no certification & small amount of reward points)
    - intermediate (OCA || no certification & medium amount of reward points)
    - guru     (OCM || OCP || OCA & medium amount of reward points || no certification & high amount of reward points)
    (The OCM of course could be in the 4th category, but without having that many OCMs around I doubt that it would be useful.)
    Lower level accounts could only read the higher level threads, higher level accounts could write to lower level threads. If a higher level OP doesn't want to physically see lower level threads, then he / she could setup this in his / her profile.
    I hope that my sentences will not provoke anyone here. I've always said that "if something bothers me then I will try to change it, if I cannot then I let it be".
    Franky
    P.S. : I would be glad to see some tips from others as well.
    P.S.2 : No flame please!
P.S.3: SPOILER(!) In case everything fails, I will be looking for another community. So, if there are Oracle DBA forums with community members having an average of 5-10 or more years of experience, then please don't hesitate to write the URLs of these in your reply... Thanks.
    Edited by: Franky on Aug 13, 2009 1:51 AM - extended

    Hello Nicolas,
I think you're missing the point here: I didn't say that OCPs are "that good" or that people having no certification are "that bad". No, that would be silly.
+Your definition of novice is wrong; there are very knowledgeable people without certification (who told you a certified guy is a good DBA?) and with very few "reward points" over here (who told you a "zero-point" guy is a bad DBA?).+
    Novice on the forum would mean - no certification, no reward points = NO FEEDBACK !!!!!
    Intermediate, guru on the forum would mean - certification or reward points = FEEDBACK !!!!
A community cannot rely on personal impressions like "I know that he is a good DBA!" In this system everyone has to earn the status, regardless of his/her job title. The community must have background information before setting the right level. If you have any tips on how to set up categories, then feel free to share your thoughts.
    +Same for intermediate or guru, this is a non-sense definition. I know many OCP DBA who are really novice, and again, point system here is just ridiculous.+
What do you think would happen to those who have OCA or OCP and cannot answer anything on their level? I will tell you - nothing. They will continue to reply to lower level threads. That's all. I don't think they would open questions on their level either, because it would be too embarrassing...
+What is the value of 10 points taken when given a doc link?+
Why are you asking me about this? The OP should decide whether the link is "worth" 10 points or not. By the way, I don't think that OCPs or OCMs would ask such questions....
+Well, I'm sure your thread is an open door to discuss 1) the point system on the forum (again and again) and 2) the value of certification (which has never meant you're a guru).+
Ok, I see that the word "guru" is a splinter in your eye. Feel free to rename it! :-) The value of the certification depends on how much effort has been put into it. If one does it by clicking here and there and everywhere and somehow that's enough to pass the exam - then I agree. If one learns, not only by reading the docs, but by working full time with Oracle itself, then I disagree.
    Franky
P.S.: Once again, I hope that this is going to be a proactive thread with many good thoughts. Words like "non-sense" or "ridiculous" lead me to believe that I'm just wasting my time here....

• How can I extend the battery life of my iPhone 4s? When I left it in standby mode, I saw it drain 5% in 5 hours; I thought that when I leave the phone alone, the percentage stays the same. What am I going to do?

How can I extend the battery life of my iPhone 4s? I left it at 50% in standby mode, and when I checked 5 hours later it had drained 5% - I saw 45%. I thought that when I leave the phone alone, the percentage stays the same. What am I going to do?

I was not aware of the fact that one could use ringtones as alarms; this will suffice! I foresee some very awkward times, as the sound does not stop once I respond to the reminder (it keeps going for a good five to ten seconds after I "stop" the reminder), but it's better than missing it altogether!
    Thank you very much!
    Regarding the Alarmed app, doesn't it have to be running to actually remind you?
-Modular747: I don't know anyone who actually reads the terms of service agreement; those things are usually up to thirty small-print pages full of legal jargon. Rules I do read - just not the legal documents (also, they often contradict the laws in my country, but that's a whole different story).
I agree with you - the average customer support consultant mostly cares about keeping his job and earning his salary (don't we all?) and thus racking up completed tasks (i.e. getting rid of customers as fast as possible - in a good mood). The "threat" nonetheless is real: if a cell phone causes me more trouble than it is worth, I will indeed throw it away (I have done so before) - and recommend a different brand to anyone willing to listen.
Don't get me wrong, I love Apple, and I've been faithfully using and recommending their products for twenty years now - not because of fanboyism, but because they earned it. This calendar issue, however, is like a toaster that doesn't automatically eject the toast at the set time. I see it as a major flaw, and after googling around a bit I see that it's been around since at least 2007 (THAT'S SEVEN YEARS!!), so I strongly believe they are aware of this problem by now, but are choosing to ignore it for some mysterious reason, and it really ****** me off.
    I will send the feedback but based on my observations I suspect the Send button is merely a cleverly disguised Delete button.

Maybe you are looking for

  • Show your tasks in the calendar view - Outlook Web App

Hi, Is it possible to show your tasks in the calendar view in Outlook Web App 2013? Some explanations of what I want to do :-) We use Project Server so users can enter their work; these tasks are synced with an Exchange Server 2013 so tasks can be view

  • Real racing 3 iPad mini

Hi, I just got my iPad. I had an iPod touch before, but I couldn't get all my Real Racing information that I previously had on my fourth-generation iPod touch onto my iPad. Now I have to start from scratch. Any ideas on how to fix this?

  • Some classes are not available

Hi, I am using JDeveloper 10.1.3 and I have a question. Why are some packages not available for plain Java programs? For example oracle.webservices.provider.. or javax.servlet.httprequest.... And if you create, for example, a web service proxy, these package

  • How to give permission to user to run and install application like data card or modem

Hi, when a standard user tries to use a data card (Airtel 3G dongle), Windows 7 prompts for an administrator password to run the application. I have to go and type the admin password each time that person uses the dongle. Is there any way that I can

How to sync data when operating on 100,000 records in the 9i DB??

I use Java to develop the program with JDBC. The 9i DB has 100,000+ records of data in TABLE_A. TABLE_A is operated on with DML (select and update/insert) in two threads, that is Thread_select and Thread_update, at the same time. Th