Article Limit

My folio keeps crashing. I'm thinking that I am overloading it, but it seems many people have produced some very large folios that work just fine. So I want to gauge whether I need to rethink my layout, or isolate another issue that may be causing the Content Viewer to crash on my iPad 1 and 2. Also, when I first created them, all the buttons set up identically on masters triggered the MSOs just fine; after a couple of article updates, a few stop working or act strangely.
My scenario:
The article is set up as V and H, 74 pages. Each V and H layout is identical (same content, just reformatted). Each page has two MSOs with two states each: one is just a mask, the other is info text that comes and goes with each tap. On a background layer I have one pan-and-zoom window that contains a 2000x2000 px image. Again, all these elements are the same on every page. The article size works out to about 120 MB or so (not terrible). If I upload horizontal-only, the MSOs or buttons seem to break or move. If I upload with pages stacked up and down, the app crashes after flicking through about a third to half of the pages, or after flicking through many pages and then trying to rotate.
Bottom line: am I asking too much of one article, or is there possibly something else going on?
Thanks.

Any article over 100 MB is really pushing it. That might be okay on an iPad 3, but the original iPad is likely to run into memory issues.
Bob

Similar Messages

  • Folio and Articles Limit

    I have an .indd file (13 MB), but I can't add it to the folio because the generated article weighs more than 1 GB.
    How could I have known this before making the preview?
    This article consists of 12 pages:
         - first page with a simple sliding frame,
         - second page with a small gallery,
         - the other 8 pages each have a still image with 3 or 4 hotspots.

    The Links folder also contains temporary files (TIFF images) that have not yet been used in the folio, but are kept there for convenience.
    It's probably best to keep the Links folder clean, with only the images that are actually placed and used in the InDesign document.
    I thought InDesign would take only the photos really used in the folio.
    Now I'm waiting to hear from the customer, who has since deleted the unused material from that folder.

  • TS4088 Why is there a 3-year limit? All mid-2010 MacBook Pros with symptoms should get their logic boards changed for free. My Mac is 3 months too late, and I have had this problem for over a year, but I first saw this article today :(


    Hey Clintonfrombirmingham
    I called Apple technical support in Denmark, but with no positive reply.
    She couldn't do anything, and said that they had sent a recall email about the problem, with their offer to repair the MacBook Pro, but I never received any email about the problem. She wasn't empowered to make an exception. It can't be right that I paid a lot of money for a product that can barely stand on its own feet. Apple didn't tell me that the product I was about to buy would restart every 5 minutes, and now that they know about the problem, they won't repair it? It just doesn't make sense to me. If a car dealer discovered that the brakes in all the cars he had sold would fail after some years, he would recall all the cars for repair, no matter what. I just don't understand how Apple can claim to give good service to their customers by extending the warranty from 2 to 3 years, but won't take the computers that are a little bit too old; 4 months makes the difference. I can't believe it.
    What can I do now?
    Best regards, Oskar

  • "Content generation error. The article exceeds maximum file size limit"

    We are building a single-edition app for our university's literary and arts magazine and have no problem previewing it through the Adobe viewer app, but when we try to create a folio so we can begin the App Store process, we keep getting this pesky error. It refuses to export a folio.
    The Links folder associated with the InDesign file totals 474 MB.
    The folder containing all the InDesign files associated with the app is 537 MB.
    So how are we exceeding the 1gb file size?
    Details:
    115 pages within a single article (we converted from the print edition, rather than placing each piece into an article of its own. Is that a problem?).
    All images (about 50) are PNG files under 1 MB, and the videos have been exported to MP4 (370 MB of video in total -- do we need to switch to streaming?)
    Every page has a button linking back to the home screen, and at least one, sometimes two, MSOs (to full-screen an image or to pop up the author's bio).
    3 mp3 audio pieces are insignificant in size.
    We removed extraneous states from the MSOs.
    Our PNG files are sized to a maximum width of 2048px.
    The .indd file itself is 58mb
    No HTML overlays, but we do have 5 or 6 hyperlinks that launch the browser.
    Any guesses why we are unable to generate a folio with this article?
    Is there a way to audit our project to see what the problem is?
    We've scoured these forums and applied the advice that seemed pertinent, so your patience and recommendations would be a great help.
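To the audit question: before fighting the 1 GB ceiling blind, it can help to total up where the megabytes actually live. A minimal sketch (plain Python, nothing DPS-specific; the "Links" folder name is just an example path):

```python
import os
from collections import defaultdict

def size_by_extension(root):
    """Walk `root` and total file sizes (in bytes) per file extension."""
    totals = defaultdict(int)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            ext = os.path.splitext(name)[1].lower() or "(no ext)"
            totals[ext] += os.path.getsize(os.path.join(dirpath, name))
    return dict(totals)

if __name__ == "__main__":
    # Largest contributors first, in MB.
    for ext, size in sorted(size_by_extension("Links").items(),
                            key=lambda kv: -kv[1]):
        print(f"{ext:10s} {size / 1e6:8.1f} MB")
```

Running this over the Links folder would show at a glance whether the video, the PNGs, or something unexpected dominates the total.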

    Thanks to both of you for your replies.
    We watched Adobe's tutorial videos and read the guides on Adobe's site before starting our project (a first app for our lit-mag), and never found the kind of overview that would have made the built-in functionality of the Folio system evident. Are there beginner videos you recommend we screen for the editorial board next year? It has been a great learning process for us this summer, but one involving many errors along the way.
    It seems like it won't be difficult to break up the journal into "articles," but in order to make the TOC work correctly it really is a one piece to a page thing (none of our pieces extends beyond one screen because we used scrollable overlays). Do you think we'll run into trouble with 100+ articles in our folio?
    Thanks again for taking the time to respond!
    --Clamor

  • Tolerance Limit at Service Activity level, similar to Article.

    Dear All,
    I am on the edge of implementing the external services process in our workplace.
    The process will be as follows:
    Service activity will be created.
    Pricing conditions.
    PO creation with account assignment "F" and item category "D".
    And then the normal service entry process.
    Can we have a tolerance limit on the service activity created through AC01? E.g. if I maintain the price of the service in ML45 as Rs 100, can I put a 10% upper tolerance limit so that the user cannot update the price in the PO beyond Rs 110?
    Please share your valuable feedback and suggestions.
    Thanks in advance
    Sonal D.

    Dear Praveen,
    Thanks for your reply.
    Actually, after sending you all that mail, while searching the config and other settings in External Services Management, I came across the lower limit and upper limit settings. But unfortunately, after testing many scenarios, I observed that they were not working. E.g.:
    If the service activity's net price maintained in ML45 is 100/- and the upper limit is 120/-,
    then when a PO is later created with the same service activity, the price populated from the condition records is 100/-.
    Next, I manually entered 150/-, which is beyond the upper limit, but the system still didn't restrict it in the PO.
    Can you help me understand whether I am missing any other settings.
    Please confirm.
    Regards,
    Sonal D.

  • Is there a limit of how many resources can be shared via a single Resource Pool in Project 2010 Professional standalone?

    We have a single master project resource pool of over 150 resources that are shared across let's say 10-15 sub projects or sharer files.
    We are using Microsoft Project Professional (standalone no project server)
    Are you aware of any limitations outside of Project Server with resource pools and sharing, or should we be using Project Server? (I have read a lot of articles about resource pool sharing working better in Project Server.)
    I'd like to get clarification on:
    1. I read that the calendars for resources (of which there could be many shift patterns etc.) should always be created in the master project only, not in the sharer files. In my testing I see that a shared resource also brings the additional calendars from
    the Master Project file - and can be chosen in the calendar column from Resource sheet.
    2. Each sharer file can have its own specific project calendar, but the resource calendar (used by the assigned resources from the pool) will drive the scheduling (as per normal), unless;
    3. A specific task has a 'Task Calendar' applied to it AND "scheduling ignores resource calendars" is ticked.
    I am trying to find an article that addresses, in detail, the limitations or 'look out fors' in this area.  I have found plenty about how to create and share resources, but this is more troubleshooting.
    Any advice or direction that can be given would be appreciated. Thanks.

    No limit on the number of resources, but linked masters and resource pools are at high risk of file corruption if you rename, overwrite, or move a linked file without removing it from the master and pool first. Files will corrupt; it's when, not if.
    Master files should never have any resources or tasks in them, only inserted projects.
    I much prefer creating master files with the Link option deselected. This copies all data and consolidates resource info (so no pool needed). I record a macro to create new masters each week. This method has no corruption risk.
    Your alternative now is to experiment with Project Server. By far the best way to prototype a system is to get Project Online through Microsoft, ProjectHosts or Bemo. It will give you access to a vanilla project server very quickly and cost for a few months
    testing is trivial for a few users.
    If you go ahead with a full implementation I strongly recommend getting a qualified Project Server partner to help you implement the full system.
    Rod Gill
    Author of the one and only Project VBA Book
    www.project-systems.co.nz

  • Failure of a single article prevents the distribution of all articles in a publication to a subscriber

    We have a SQL Server 2012 R2 SP2 environment and Transactional Replication configured with 1 instance publishing all tables in a database via a single publication (circa 150 articles in the one publication). We have a separate distribution instance,
    and a third subscription instance. All articles in the publication are configured to copy all data, with no advanced filtering configured. It is also noted that Alerts were not configured in Replication Monitor.
    We had an issue whereby, inexplicably, a table on the subscriber vanished (I use this term because I could not find any record in the log file of a DELETE statement being issued against that table - regrettably I wasn't in the position to query the distribution
    database for any such commands either).
    In Replication monitor we saw that the article for the table in question was failing, but no others.
    The issue was that no other articles were being replicated to the subscriber, and this was evident via stalled rowcounts on the subscriber database, but increasing rowcounts on the publishing database.
    When we removed the offending article from the publication, all other articles started getting applied to the subscriber database - and we could observe both publication and subscription rowcounts increasing in lockstep.
    My questions are:
    1) In a single publication with many articles, are the articles applied to the subscription in a single batch, meaning that if a discrete transaction for one article fails, all other articles will be rolled back with that transaction until it is cleared?
    2) Why would Replication Monitor not start showing widespread warnings for every other table (article) that was not successfully being replicated to the subscriber?
    3) Aside from configuring alerts in Replication Monitor (which has happened already) and a bespoke rowcount-checking 'monitor', how could transactional replication be configured so that the failure of publication of one obscure article doesn't fail all articles? I note that creating multiple publications with smaller numbers of articles would limit the exposure, but from what I've observed, only to those articles in the same publication.
    Many thanks

    1) They could be; it all depends on the batch size.
    2) I suspect it is because the distribution agent applies the changes serially, i.e. one article after another in the order the changes occurred on the publisher. So if it fails on one table, it will not continue until the error is cleared.
    3) Use the "Continue on data consistency errors" agent profile, or put your problem article in its own publication.
    Looking for a book on SQL Server 2008 administration?
    http://www.amazon.com/Microsoft-Server-2008-Management-Administration/dp/067233044X
    Looking for a book on SQL Server 2008 full-text search?
    http://www.amazon.com/Pro-Full-Text-Search-Server-2008/dp/1430215941

  • Msg 8631 Internal error: Server stack limit has been reached on SQL Server 2012 from T-SQL script that runs on SQL Server 2008 R2

    I have a script, mostly generated by SSMS, which works without issue on SQL Server 2008, but when I attempt to run it on a fresh install of SQL Server 2012 I get: Msg 8631, Internal error: Server stack limit has been reached. Please look for potentially deep nesting in your query, and try to simplify it.
    The script itself doesn't seem to be all that deeply nested. It is large (2,600 lines), and when I remove the bulk of those lines, it does run on SQL Server 2012. I'm just really baffled why something that SQL Server generated, with very few additions or changes, AND that works without issue on SQL Server 2008 R2, would suddenly be invalid on SQL Server 2012.
    I need to know why my script, which works great on our current SQL Server 2008 R2 servers, suddenly fails and won't run on a new SQL Server 2012 server. This script is used to create 'bulk' replications on a large number of DBs, saving us a tremendous amount of time over doing it the manual way.
    Below is a condensed version of the script that fails. I have removed around 2,550 lines of specific sp_addarticle statements, which are mostly copied and pasted from what SQL Server Management Studio scripted for me when I went through the Replication Wizard and told it to save to script.
    declare @dbname varchar(MAX), @SQL nvarchar(MAX)
    declare c_dblist cursor for
    select name from sys.databases WHERE name like 'dbone[_]%' order by name;
    open c_dblist
    fetch next from c_dblist into @dbname
    while @@fetch_status = 0
    begin
    print @dbname
    SET @SQL = 'DECLARE @dbname NVARCHAR(MAX); SET @dbname = ''' + @dbname + ''';
    use ['+@dbname+']
    exec sp_replicationdboption @dbname = N'''+@dbname+''', @optname = N''publish'', @value = N''true''
    use ['+@dbname+']
    exec ['+@dbname+'].sys.sp_addlogreader_agent @job_login = N''DOMAIN\DBServiceAccount'', @job_password = N''secret'', @publisher_security_mode = 1, @job_name = null
    -- Adding the transactional publication
    use ['+@dbname+']
    exec sp_addpublication @publication = N'''+@dbname+' Replication'', @description = N''Transactional publication of database
    '''''+@dbname+''''' from Publisher ''''MSSQLSRV\INSTANCE''''.'', @sync_method = N''concurrent'', @retention = 0, @allow_push = N''true'', @allow_pull = N''true'', @allow_anonymous = N''false'', @enabled_for_internet
    = N''false'', @snapshot_in_defaultfolder = N''true'', @compress_snapshot = N''false'', @ftp_port = 21, @allow_subscription_copy = N''false'', @add_to_active_directory = N''false'', @repl_freq = N''continuous'', @status = N''active'', @independent_agent = N''true'',
    @immediate_sync = N''true'', @allow_sync_tran = N''false'', @allow_queued_tran = N''false'', @allow_dts = N''false'', @replicate_ddl = 1, @allow_initialize_from_backup = N''true'', @enabled_for_p2p = N''false'', @enabled_for_het_sub = N''false''
    exec sp_addpublication_snapshot @publication = N'''+@dbname+' Replication'', @frequency_type = 1, @frequency_interval = 1, @frequency_relative_interval = 1, @frequency_recurrence_factor = 0, @frequency_subday = 8,
    @frequency_subday_interval = 1, @active_start_time_of_day = 0, @active_end_time_of_day = 235959, @active_start_date = 0, @active_end_date = 0, @job_login = N''DOMAIN\DBServiceAccount'', @job_password = N''secret'', @publisher_security_mode = 1
    -- There are around 2400 lines roughly the same as this only difference is the tablename repeated below this one
    use ['+@dbname+']
    exec sp_addarticle @publication = N'''+@dbname+' Replication'', @article = N''TABLE_ONE'', @source_owner = N''dbo'', @source_object = N''TABLE_ONE'', @type = N''logbased'', @description = null, @creation_script =
    null, @pre_creation_cmd = N''drop'', @schema_option = 0x000000000803509F, @identityrangemanagementoption = N''manual'', @destination_table = N''TABLE_ONE'', @destination_owner = N''dbo'', @vertical_partition = N''false'', @ins_cmd = N''CALL sp_MSins_dboTABLE_ONE'',
    @del_cmd = N''CALL sp_MSdel_dboTABLE_ONE'', @upd_cmd = N''SCALL sp_MSupd_dboTABLE_ONE''
    EXEC sp_executesql @SQL
    SET @dbname = REPLACE(@dbname, 'dbone_', 'dbtwo_');
    print @dbname
    SET @SQL = 'DECLARE @dbname NVARCHAR(MAX); SET @dbname = ''' + @dbname + ''';
    use ['+@dbname+']
    exec sp_replicationdboption @dbname = N'''+@dbname+''', @optname = N''publish'', @value = N''true''
    use ['+@dbname+']
    exec ['+@dbname+'].sys.sp_addlogreader_agent @job_login = N''DOMAIN\DBServiceAccount'', @job_password = N''secret'', @publisher_security_mode = 1, @job_name = null
    -- Adding the transactional publication
    use ['+@dbname+']
    exec sp_addpublication @publication = N'''+@dbname+' Replication'', @description = N''Transactional publication of database
    '''''+@dbname+''''' from Publisher ''''MSSQLSRV\INSTANCE''''.'', @sync_method = N''concurrent'', @retention = 0, @allow_push = N''true'', @allow_pull = N''true'', @allow_anonymous = N''false'', @enabled_for_internet
    = N''false'', @snapshot_in_defaultfolder = N''true'', @compress_snapshot = N''false'', @ftp_port = 21, @allow_subscription_copy = N''false'', @add_to_active_directory = N''false'', @repl_freq = N''continuous'', @status = N''active'', @independent_agent = N''true'',
    @immediate_sync = N''true'', @allow_sync_tran = N''false'', @allow_queued_tran = N''false'', @allow_dts = N''false'', @replicate_ddl = 1, @allow_initialize_from_backup = N''true'', @enabled_for_p2p = N''false'', @enabled_for_het_sub = N''false''
    exec sp_addpublication_snapshot @publication = N'''+@dbname+' Replication'', @frequency_type = 1, @frequency_interval = 1, @frequency_relative_interval = 1, @frequency_recurrence_factor = 0, @frequency_subday = 8,
    @frequency_subday_interval = 1, @active_start_time_of_day = 0, @active_end_time_of_day = 235959, @active_start_date = 0, @active_end_date = 0, @job_login = N''DOMAIN\DBServiceAccount'', @job_password = N''secret'', @publisher_security_mode = 1
    -- There are around 140 lines roughly the same as this only difference is the tablename repeated below this one
    use ['+@dbname+']
    exec sp_addarticle @publication = N'''+@dbname+' Replication'', @article = N''DB_TWO_TABLE_ONE'', @source_owner = N''dbo'', @source_object = N''DB_TWO_TABLE_ONE'', @type = N''logbased'', @description = null, @creation_script
    = null, @pre_creation_cmd = N''drop'', @schema_option = 0x000000000803509D, @identityrangemanagementoption = N''manual'', @destination_table = N''DB_TWO_TABLE_ONE'', @destination_owner = N''dbo'', @vertical_partition = N''false''
    EXEC sp_executesql @SQL
    fetch next from c_dblist into @dbname
    end
    close c_dblist
    deallocate c_dblist
    George P Botuwell, Programmer

    Hi George,
    Thank you for your question. 
    I am trying to involve someone more familiar with this topic for a further look at this issue. Sometime delay might be expected from the job transferring. Your patience is greatly appreciated. 
    Thank you for your understanding and support.
    If you have any feedback on our support, please click
    here.
    Allen Li
    TechNet Community Support

  • Can I add a limit to the number of letters using a tag?

    For example, {tag_caption, 125}. The 125 would be the limit you could type in the caption text field.

    Hi,
    Great suggestion, but unfortunately it's not possible to create parameters. Only tags that have a character-limitation parameter will work.
    For the list of our current tags and available parameters, please view the article below.
    - http://kb.worldsecuresystems.com/134/bc_1342.html
    Kind regards,
    -Sidney

  • How can I "profile" folio articles load time and memory use?

    Hi folks,
    I am beating my head against the wall since last week:
    How can I figure out a problematic article (one or more), that causes both Content Viewer and custom viewer (.IPA) to crash on load?
    The crash report states that "application took too long to start", in some cases it is "jettisoned".
    The whole layout has 150+ articles, each about 30 pages long on average, with a total size of 705 MB. We have about 30...40 3D images there, lots of navto links, and embedded video/audio clips -- I think we used every MSO type available for the iOS viewer.
    All articles are of PDF type.
    No custom JavaScript code, no HTML tweaking -- just a transition from a relatively complex InDesign layout.
    At some point, while filling the folio with content, the Content Viewer started crashing on load.
    We rolled back and started re-assembling the whole package, checking each time another 30...60 articles were added.
    Our DTP guy was backing up intermediate results while adding articles, so I have several .IPA of the same layout, one with less than 30 articles, and several subsequent versions (60, 90 articles, etc). The smallest one works great even on 1st iPad - a bit slow (about 5 seconds to render a page when jumping between articles), but no crashes, no jerky pages, zooming and flipping is perfect.
    All bigger versions (30+ articles) crash on both iPad1 and iPad3.
    Adobe states there is no limit on the number of articles or pages per article, but there seems to be a limit we don't know about. Or there is something in our design that causes the crashes.
    WHAT TO DO???
    The idea of starting from scratch again, making a backup before adding each article, then uploading and testing, is not good. It would take weeks, and we don't have that much time.
    Can there be a "profiling tool" for DPS, similar to what programmers use when the code is unacceptably slow?
    Some switches to run the installed custom viewer app from command line, with verbose logging, debugging, whatever???
    Is there any way to know, what exactly on our side causes the problems?
    Need help ASAP.
    Regards
    Serge
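If the crashes really are driven by how many articles are in the folio (i.e. the failure is monotonic: once a build crashes, any larger build does too), there is no need to re-test after every single article; a binary search over prefix builds finds the breaking point in about log2(150), roughly 8, test builds. A sketch of the search, where `crashes(subset)` is a hypothetical stand-in for "build a folio from these articles and see whether the viewer crashes on load":

```python
def first_crashing_count(articles, crashes):
    """Smallest prefix length of `articles` for which crashes(prefix) is True.

    Assumes monotonicity: once a prefix crashes, every longer prefix crashes
    too. Returns len(articles) + 1 if no prefix crashes.
    """
    lo, hi = 1, len(articles) + 1
    while lo < hi:
        mid = (lo + hi) // 2
        if crashes(articles[:mid]):
            hi = mid        # crash already happens at mid articles
        else:
            lo = mid + 1    # still stable; culprit lies further on
    return lo
```

The article at index `first_crashing_count(...) - 1` is the first one whose inclusion tips the build over; if it turns out several cut points crash (non-monotonic behavior), the culprit is cumulative memory use rather than one article.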

    Mike, I basically have to do the same. The idea of using DPS for something other than a glossy magazine or corporate catalog is my brainchild, and losing hope feels like euthanizing a pet, to say the least.
    With one exception: I don't see DPS as an inferior product. Rather, it is a raw product, brought to light before it was really ready, in order (as often, if not always, happens in the hi-tech industry) to corner the territory before competitors catch up.
    As you can see in other threads, there are no books or training courses available yet, and I believe the Adobe team is not quite sure what can and cannot be done with DPS. Part of the minefield is Apple iOS, which enforces certain limitations to ensure system stability. And I am sure there are a lot of other factors in play that we are not aware of, and probably never will be.
    The point of saying all this is: there is a price to be paid for being on the bleeding edge of technology. This price, beyond spending money on expensive equipment and investing time in learning and experimenting, is taking the risk of failure -- the risk that it will not work out for reasons beyond your control. So I don't think Adobe is much to blame here; they are playing the same game. Accept the rules or leave...
    Just my 5 cents...

  • Same old "how to limit history" question with FF 27.0.1

    Windows 7 Pro., FF 27.0.1, Add-Ons Adobe and Flash (and it's auxiliaries)...just the basics. My box is a server as well as used for more ordinary tasks...email, browsers, document editors, etc, so it is always "on" and never in "idle." Therefore the add-ons that do these things don't do anything on my box.
    This, I know, is an old question, but unresolved, in my opinion. I revive it from this post:
    https://support.mozilla.org/en-US/questions/799503?fpa=1
    First let me say I consider Firefox the best browser period. I like Mozilla's and Firefox's essential philosophy of user customization and control, while respecting their need to insure security and stability.
    OK, the question is obvious from the title, How can I limit the history with Firefox? I have observed more recently where the enabling and disabling of JavaScript from the preferences options was removed. I read the Bugzilla and Firefox rationale and it was not difficult for me to open a about:config tab next to a tab containing a page I was working on so I could see how the page rendered without JavaScript. Once you knew how, it was only slightly more difficult than using the preferences options...so that does not really bother me.
    The limiting of history is implied in the browser itself where, if you open up the history window, you can see your history and (for me) see an "older than six months" option which if you mouse over and right click will get a "delete" option. I think everyone who tries this crashes their browser and if anything is removed when you finally get it back up, it will be your cache.
    I know that the amount of history is set in about:config in the setting "places.history.expiration.transient_current_max_pages" as an integer. From reading many articles here in "support", and from personal trial, I find that though that number can be increased, it can't be decreased. If one does, it reverts back to the old setting as soon as Firefox is re-opened.
    I have read the blog referenced in the first link above: http://blog.bonardo.net/2010/01/20/places-got-async-expiration and it even seems outdated, containing references to places.history.expiration.max_pages, which is not a preference present in about:config in 27.0.1.
    In https://bugzilla.mozilla.org/show_bug.cgi?id=643254 I read the debate/discussion about these changes where Marco Bonardo steadfastly holds to his position while I find the comment by al_9x discussing the issue with him; "You have removed functionality that people use and like, that's been in Fx from its inception. It is you who need to justify its removal. And your justification of "nobody really wants to expire history" is a lie, people do and I explained why", to represent my feeling completely. I have no need or desire for history longer than 3 months. It is high-handed to limit Firefox users ability to limit their history. And no...I did not create yet another Mozilla account and "vote."
    There must be some way I and other users can limit (or increase) our history, so I am asking: how? If it is currently impossible, by design, to do so, then I find that a very disturbing trend away from the whole philosophy of user control and customization.
    If I am missing something, please inform me. Please *don't* send me a bunch of the standard "help" articles, for I have read them all. So that's my question.
    Regards,
    Axis

    Thank you cor-el!
    I added the new integer places.history.expiration.max_pages, and when I did, places.history.expiration.transient_current_max_pages changed its value to equal what I had put in places.history.expiration.max_pages.
    This is all I needed, I believe. I did not know I had to ''create'' the new integer; now I know how to do so, and to your credit I would like to say to all Firefox users that this is the best solution I have read to the oft-asked question, "How can I limit my history in Firefox?"
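For anyone landing here later, the fix described above boils down to one new about:config entry; it does not exist by default, so you create it as a new integer preference (the value here is only an example cap, pick your own):

```
places.history.expiration.max_pages = 100000
```

Per the report above, places.history.expiration.transient_current_max_pages then takes on the same value.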

  • When I air print from my iPad, I can't limit the number of pages. It just says print and the number of copies and I end up with way more than I want. How do I set it to limit the pages?

    When I AirPrint using FingerPrint from my iPad, it doesn't let me limit the number of pages, and I end up with more pages than I want. It asks if I want multiple copies, but it doesn't ask for the number of pages, or which pages. Is there a way to do this?

    When you use an AirPrint compatible printer, you get this option for page range. You are using your computer as the server in order to print, so you probably have to set the range on the computer. While you can print from your printer with this app, I assume that it doesn't give you the same control that you get directly from the iPad when using an AirPrint printer.
    Read this from the FingerPrint support site. Selecting the page range would be part of the formatting of a print job.
    http://fingerprint-support.collobos.com/knowledgebase/articles/66972-i-can-t-format-the-print-job

  • How do I limit the number of rows retrieved at a time using RefCursor?

    I have a PL/SQL package in use, that returns a REF CURSOR. This is currently being used in a Forms 6i application. Now I want to develop an ASP.NET web application that displays exactly the same information as my Forms 6i module. In fact those two applications will be used concurrently for a while.
    I looked at the sample code provided on otn.oracle.com and decided to use the OracleDataAdapter.Fill()-method to fill a dataset and bind that dataset to a pageable GridView. Now I wonder, whether this method retrieves ALL records of the query at once and how I can limit the number of rows fetched? The Select statement retrieves up to 10000 rows at a time. Most of the time, a user is only interested in the first 20-30 rows. Forms 6i fetches more rows as the user scrolls down, how can I implement the same behavior in ODP.NET?
    - Markus

    Excuse me, but the reply does not quite answer my question. Maybe I did not explain my concerns clear enough:
    I understand the use of the two properties (RowSize and FetchSize) to reduce the amount of round trips needed to transfer the data - not the number of rows fetched in total. This would still lead to a situation where all rows are transferred, when I am using the OracleDataAdapter.Fill()-Method. Is this correct or did I misunderstand the function of this method?
    I quote from the otherwise really helpful article you sent me:
    Of course, there is a cost if the fetch size is arbitrarily large. More client-side memory and processor cycles will be needed to store and manage a larger amount of data. The goal is to find a high-performing balance between the number of round trips and the amount of data retrieved per trip.
    My RowSize is for sure a bit larger than the one in the given example. The query will probably be used by up to 100 users at a time, so I would like to limit the resource costs not only on the network (the number of round trips) but also on the web server, which is storing all these records in its memory per user request.
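The distinction being drawn (round-trip size vs. total rows materialized) can be sketched with Python's DB-API, using sqlite3 purely for illustration -- in ODP.NET terms, FetchSize tunes the round trips while OracleDataAdapter.Fill() still pulls every row, so paging means fetching incrementally from the cursor instead:

```python
import sqlite3

def fetch_page(cursor, page_size):
    """Return at most page_size rows; call again for the next batch."""
    return cursor.fetchmany(page_size)

# Demo table standing in for the 10,000-row query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(10000)])

cur = conn.execute("SELECT n FROM t ORDER BY n")
first_page = fetch_page(cur, 30)    # only these 30 rows materialized
next_page = fetch_page(cur, 30)     # rows 30..59, fetched on demand
```

To mimic Forms 6i's scroll-to-fetch behavior in a web app, one would fetch a batch per page/scroll event like this (or use a paged query) rather than binding the grid to a fully filled DataSet.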

  • I used the iTunes match workaround and now want to start over, but I am being told I have exceeded the limit even though I completely cleared the match cloud, mindful of the 1000-delete-at-a-time limit....what should I do?

    I have over 25k songs in iTunes. I originally tried the "create a second library" solution. I reduced to less than 25k songs and successfully ran iTunes match. Then I discovered the limitations of the second library solution regarding adding more music, managing playlists, and turning back on my main library and being greeted with all sorts of problems.
    So, I created a blank library, ran iTunes Match, and it showed me all of my files in the cloud that had been matched or uploaded. I deleted all of the files, mindful of the 1000-file limit per delete. I have confirmed I have now deleted them all. An iTunes Match update says I have no files in the cloud. Perfect.
    I reduced my library again, this time using the superior method of just changing the files I do not want to match to voice memos. That is done. I have fewer than 25k songs.
    I waited 2 days and ran Match again. Yikes. It did not let me match or upload anything, and every music file's cloud status just says "exceeded limit," despite the fact that my cloud is devoid of music and I have no access to any music via Match.
    I am in a limbo where I have nothing in the cloud from match, but I can't add anything because match says I exceeded my limit.
    I emailed customer support, and got back a useless email linking me to a very general article about iTunes match. I called customer service and they said this was something the iTunes store people would need to address, and they can only be contacted by email. Sigh . . . .
    Any thoughts?
    Thanks

    Have you confirmed that you successfully purged iTunes Match by also looking on an iOS device?  If so, keep in mind that Apple's servers may be experiencing a heavy load right now.  They just added about 19 countries to the service, and I've read a few accounts this morning that suggest all's not running perfectly right now.

  • Is there a limit on Channel that can be added to selector

    I have read in an old document on JavaWorld that there is a limit of 63 channels that can be registered with a Selector. I assume I can create multiple Selectors if I have to manage more than 63 channels, and try to select on each one after the other.
    However, I still want to know whether this limit exists, as I did NOT see it in any NIO documentation.
    I quote from the below article -
    "Fourth, a selector can have only 63 channels registered, which is probably not a big deal. "
    http://www.javaworld.com/javaworld/jw-09-2001/jw-0907-merlin_p.html
    Raj

    This was a Windows-only limit in JDK 1.4.0. If you are not on Windows, or are on 1.4.1 or later, there is no limitation.
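    As the reply notes, the cap only ever applied to JDK 1.4.0 on Windows, but if you did need to stay under a per-selector limit, one sketch is to pool selectors and open a new one whenever the current ones are full. The cap constant and the channel count in main() are illustrative:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;
import java.util.ArrayList;
import java.util.List;

// Sketch: spread channel registrations across several selectors so no
// single selector exceeds a fixed cap (63 on the affected JDK/platform).
public class SelectorPool {
    private static final int MAX_PER_SELECTOR = 63;
    private final List<Selector> selectors = new ArrayList<>();

    public SelectionKey register(ServerSocketChannel ch) throws IOException {
        // Reuse the first selector that still has room.
        for (Selector sel : selectors) {
            if (sel.keys().size() < MAX_PER_SELECTOR) {
                return ch.register(sel, SelectionKey.OP_ACCEPT);
            }
        }
        // All selectors are full (or none exist yet): open another one.
        Selector fresh = Selector.open();
        selectors.add(fresh);
        return ch.register(fresh, SelectionKey.OP_ACCEPT);
    }

    public int selectorCount() { return selectors.size(); }

    public static void main(String[] args) throws IOException {
        SelectorPool pool = new SelectorPool();
        for (int i = 0; i < 70; i++) {        // 70 channels -> 2 selectors
            ServerSocketChannel ch = ServerSocketChannel.open();
            ch.bind(new InetSocketAddress("127.0.0.1", 0)); // ephemeral port
            ch.configureBlocking(false);       // required before register()
            pool.register(ch);
        }
        System.out.println(pool.selectorCount()); // prints 2
    }
}
```

    Each selector in the pool would then need its own select() loop (or a round-robin over the selectors on one thread); that part is omitted here.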

Maybe you are looking for

  • Fatal Error while Installing DQS.

    Fatal error while trying to install DQS. Only the DQS staging database is available; when I open SSMS, the other databases are not made available by the installation. Please help out with the error or provide leads. Error log: [11/18/201

  • Missing table begin XSL context for:

    Hi, I'm working on a table listing in an RTF template. This has several <?choose?> commands, a <?for-each?> command, and some variables. All of a sudden I'm getting the above message for all my <?choose?> and <?end for-each?> statements. Any ideas? I'v

  • Question about the  "Suggestions" feature in the search box

    How can you tell if the "Suggestions" in the search box are generic or part of my own browsing history? Can the suggestion history be cleared?

  • Multiple numeric limit test

    Hi, I have a multiple numeric limit test. I want to modify the measurement set in the test based on my needs. How do I access the measurement set property of the test? When I do the export of the property using the property loader utility, the data is i

  • BB as modem for laptop

    I have a BlackBerry Bold and want to use my BlackBerry as a modem for my laptop computer.