UIX - method data providers and the include element

I've just read in the documentation (Chapter 16 UIX Developer's Guide - Integration with Java ...)
'Instead of recreating the entire page and all of its UINodes every time you render, you'll be able to keep most of the page cached and only recreate the part that varies.'
The example given uses a method data provider that returns a UINode, with data binding to it via <include data:node="node@source">.
Is it true, then, that method data providers are always invoked when a page is rendered, even if the page has been cached by the framework? If so, does that apply to all method data providers, not just those which provide data for <include> elements? When hitting the back button I have a requirement for part of my page to show the latest view of some data. At the moment I'm overriding 'isCacheable' in my implementation of UIXPageBroker and always returning false to get the right effect, but I'm unhappy about doing it this way.
Hope this makes sense
Cheers
Ian

Data providers are always re-queried on every render; not just
method data providers, but all. So if the UINode served
up to data:node is different every time, you'll get different
output.
However, this is an entirely orthogonal question to whether
the browser will bother asking for the page when the user hits
"Back". Server-side caching and client-side caching are
very different beasts!
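
For what it's worth, here is a rough Java sketch of the pattern the documentation is describing: keep the expensive, unchanging subtree cached inside the provider and rebuild only the part that varies each time the method data provider is invoked. UINode and the data:node binding come from the thread above; the package path, FlowLayoutBean and the builder helpers are assumptions for illustration only, so check the UIX Developer's Guide for the exact classes and method-data-provider signature your release expects.

// Sketch only: everything except UINode and the data:node idea is a placeholder.
import oracle.cabo.ui.UINode;                       // assumed package for UINode
import oracle.cabo.ui.beans.layout.FlowLayoutBean;  // assumed layout bean

public class PartialPageProvider
{
  private UINode _cachedStaticPart;   // built once, reused on every render

  // Exposed through a method data provider and referenced from the page as
  // <include data:node="node@source">. Because data providers are re-queried
  // on every render, this method runs each time, even though the cached page
  // description itself is not rebuilt.
  public UINode getNode()
  {
    FlowLayoutBean root = new FlowLayoutBean();
    root.addIndexedChild(getStaticPart());      // cached, unchanging subtree
    root.addIndexedChild(createLatestView());   // the part that varies
    return root;
  }

  private synchronized UINode getStaticPart()
  {
    if (_cachedStaticPart == null)
      _cachedStaticPart = buildExpensiveStaticSubtree();
    return _cachedStaticPart;
  }

  private UINode createLatestView()
  {
    // Re-query the latest data and build a small subtree for it on every render,
    // which is what gives a fresh view even when the page description is cached.
    return buildNodeForLatestData();
  }

  // Hypothetical builders standing in for real page-construction code.
  private UINode buildExpensiveStaticSubtree() { return new FlowLayoutBean(); }
  private UINode buildNodeForLatestData()      { return new FlowLayoutBean(); }
}

The back-button behaviour remains the separate issue noted above: whether the browser re-requests the page at all is governed by the HTTP caching headers sent to the client, not by what the server-side data providers do.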

Similar Messages

  • [svn:fx-trunk] 15795: Initial check-in for removing unused rsls and the include-inheritance-dependencies-only features .

    Revision: 15795
    Author:   [email protected]
    Date:     2010-04-28 14:01:58 -0700 (Wed, 28 Apr 2010)
    Log Message:
    Initial check-in for removing unused rsls and the include-inheritance-dependencies-only features.
    Changes to implement the -remove-unused-rsls feature. When the configuration option is true, RSLs associated with RSLs that are not used by the application are not loaded at runtime. The compiler logs the primary RSLs that are required with the number of failovers, if any.
    compiler/src/java/flex2/compiler/common/Configuration.java
                add -remove-unused-rsls and -include-inheritance-dependencies-only.
    compiler/src/java/flex2/compiler/CompilerSwcContext.java
                pass rslGroup to getSwcGroup.
    compiler/src/java/flex2/compiler/swc/SwcCache.java
                Add a parameter to getSwcGroup so a group of RSLs can be passed in.
    compiler/src/java/flex2/compiler/swc/SwcGroup.java
                Add a SwcGroup constructor that accepts a SwcGroup of Rsls. Use the RSLs to give preference to scripts that come from RSLs when duplicate scripts are merged.
    compiler/src/java/flex2/tools/CompcPreLink.java
    compiler/src/java/flex2/tools/PreLink.java
                Find the set of SWCs that are contributing code to the application. Use this set to restrict which RSLs are written to the generated root class. Log the RSLs that are required to give the user some feedback.
    compiler/src/java/flex2/tools_en.properties

    Ah, ok. I was under the impression that the PCI connect had something to do with the video connection. But it seems like Apple wanted to reinvent PCIe...
    I'm out of my territory here so feel free to ignore the following.
    These are the things I notice in your Xorg.0.log:
    [    19.474] (==) modesetting(G0): Depth 24, (==) framebuffer bpp 32
    [    19.474] (==) modesetting(G0): RGB weight 888
    [    19.474] (==) modesetting(G0): Default visual is TrueColor
    [    19.474] (II) modesetting(G0): ShadowFB: preferred YES, enabled YES
    [    19.500] (II) modesetting(G0): Output VGA-1-0 has no monitor section
    [    19.526] (II) modesetting(G0): EDID for output VGA-1-0
    [    19.526] (II) modesetting(G0): Using default gamma of (1.0, 1.0, 1.0) unless otherwise stated.
    [    19.526] (==) modesetting(G0): DPI set to (96, 96)
    What the hell is modesetting doing there?
    And have you tried the nvidia blob? It could be that this is not well supported by nouveau. Maybe check with their IRC / mailing list.

  • How can I force Time Machine to make a complete backup of my Hard Drive.  I just installed a new external drive for Backup since my previous one failed.  Now when I back up, Time Machine only backs up my data folder and the Users folder.

    How can I force Time Machine to make a complete backup of my hard drive? I just installed a new external drive for backup since my previous one failed.  Now when I back up, Time Machine only backs up my data folder and the Users folder.
    When I start a backup, Time Machine says "Oldest Backup: None; Latest Backup: None", so it seems like it should do a complete backup, but it only does a partial one.

    Hi I'd like to jump in here. Your app showed me this:
    Time Machine:
              Skip System Files: NO
              Mobile backups: OFF
              Auto backup: YES
              Volumes being backed up:
                        Macintosh HD: Disk size: 749.3 GB Disk used: 453.81 GB
              Destinations:
                        Plastic Wrapper [Local] (Last used)
                        Total size: 999.86 GB
                        Total number of backups: 64
                        Oldest backup: 2013-07-24 23:25:11 +0000
                        Last backup: 2013-11-17 01:40:47 +0000
                        Size of backup disk: Too small
                                  Backup size 999.86 GB < (Disk used 453.81 GB X 3)
              Time Machine details may not be accurate.
              All volumes being backed up may not be listed.
              /sbin excluded from backup!
              /usr excluded from backup!
              /System excluded from backup!
              /bin excluded from backup!
              /private excluded from backup!
              /Library excluded from backup!
              /Applications excluded from backup!
    Aside from the size of my backup drive, which I will increase at some point, I'd really like to have Time Machine backing up all system folders, especially Applications. How do I reset these hidden exclusions?
    Thanks,
    Darcy

  • Differences between the Portal Data Collector and the Activity Data Collect

    Hello,
    I want to know what are the differences between the Portal Data Collector and the Activity Data Collector?
    Best Regards.
    Pablo Mortera.

    All of my SQL Server instances are SQL Server 2008 R2 Standard Edition (10.50.2500). MDW is an existing database; I am trying to set up collection sets for multiple instances and store the data in one central MDW database.
    I created the MDW in one instance, then ran Configure Management Data Warehouse in the target instance. The collection sets were created successfully, but the job failed with the following error:
    Executed as user: COCAD\INTDEPT01SQLAgentC10. The Management Data Warehouse version "00.00.0000.00" is not supported by the current data collector. Please upgrade the Management Data Warehouse by running the Management Data Warehouse Configuration Wizard.  Process Exit Code 5.  The step failed.
    Thanks
    PAULqaz

  • Interleaved data saving and the TDMS Viewer VI.

    Hello,
    I'm using LabVIEW 2009 Service Pack 1 ( ver 9.0.1 32-bit ).
    I was trying to take a piece of data and save it in interleaved mode.
    The save operation seems to be completed correctly, but when the TDMS Viewer VI is launched it crashes, and I'm unable to browse the data.
    I'm attaching the example I used.
    What I want to know is:
    1) Am I using the write properly? What I'm trying to accomplish is to push an array of I16 down to a TDMS write block and receive 30 channels of data this way.
    2) I have used this VI and it crashes on two PCs: one is Win7 32-bit, the other was XP. But let me know if it runs properly for you.
    Maciej
    Attachments:
    InterleavedDataErr.zip 16 KB

    I have a buffer that can accumulate up to 65536 I16 values.
    In that buffer I get for example 6x 10000 values.
    Every "chunk" of the 10000 values is my interleaved data. One point is one channel.
    My data is shaped like in the opening post:
    http://forums.ni.com/t5/LabVIEW/TDMS-Streaming-in-interleaved-mode/m-p/1189201#M513943
    Since I'm running a windows machine, sometimes I'll get 5x 10000 values, sometimes it will be 3x , sometimes 6x during one buffer swipe.
    The data rate and the channel length is variable.
    But let's talk about one case:
    10k channels and 10 kHz rate of I16 vals.
    That gives me roughly 200 MB/s of data that I need to get out of the buffer.
    What I want to do on my PC is take that data, and save it interleaved to a TDMS file, while being able to safely stay above the 200 MB/s.
    I would be interested in aiming for 300 MB/s so that if we push our card further down the line, I'm not stuck on the streaming bandwidth.
    I have noticed that, unfortunately, if you use a channel count as high as 10000, the writing performance drops dramatically.
    What I figured out is that I shouldn't do one write on 10000 channels with a small amount of data, but rather write a large accumulated amount of data (like 8000 x 10000 I16 values).
    And I split the save of 10000 channels into 10 saves of 1000 channels each.
    First channels 0-999, then 1000-1999, etc.
    All is done in interleaved mode.
    This is Very Very fast indeed. I got rates of 360 MB/s. That's smoking fast!
    Now the trick is to handle the appropriate data shaping in RAM, from my buffer to the large shaped RAM chunk that will be cut into pieces for streaming. LV is designed for data flow and copies data a lot, hence I easily run into out-of-memory problems. I'm going to have to use some tricks like storing the data structure via a VI Server call, or perhaps queues, I don't know yet. But I'm going to do some reading on that and it seems this should be achievable.
    Thank you both for your input,
    I will post if I get stuck.
    Maciej
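
    Just to make the arithmetic and the chunked-write layout above explicit, here is a small back-of-the-envelope sketch (in Java, purely illustrative; the real writes happen through LabVIEW's TDMS API, and the block and chunk sizes are the ones quoted in the post):

    public class TdmsStreamingEstimate
    {
        public static void main(String[] args)
        {
            int channels      = 10_000;   // 10k channels
            int sampleRateHz  = 10_000;   // 10 kHz per channel
            int bytesPerValue = 2;        // I16

            double mbPerSecond = (double) channels * sampleRateHz * bytesPerValue / 1.0e6;
            System.out.printf("Required throughput: %.0f MB/s%n", mbPerSecond); // 200 MB/s, the figure in the post

            // Accumulate a large block first (8000 samples x 10000 channels of I16),
            // then write it as 10 interleaved chunks of 1000 channels each: 0-999, 1000-1999, ...
            int samplesPerBlock  = 8_000;
            int channelsPerChunk = 1_000;
            for (int first = 0; first < channels; first += channelsPerChunk)
            {
                int last = first + channelsPerChunk - 1;
                // In LabVIEW this would be one interleaved TDMS write covering channels [first..last].
                System.out.printf("write channels %d-%d (%d I16 values)%n",
                                  first, last, samplesPerBlock * channelsPerChunk);
            }
        }
    }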

  • SQL ENTERPRISE: The edition of Reporting Services that you are using requires that you use local SQL Server relational databases for report data sources and the report server database

    The error below makes absolutely no sense! I'm using Enterprise Core...yet I'm being told I can't use remote data sources:
    w3wp!library!8!03/05/2015-19:08:48:: i INFO: Catalog SQL Server Edition = EnterpriseCore
    w3wp!library!8!03/05/2015-19:08:48:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.OperationNotSupportedException: , Microsoft.ReportingServices.Diagnostics.Utilities.OperationNotSupportedException: The feature: "The edition of Reporting
    Services that you are using requires that you use local SQL Server relational databases for report data sources and the report server database." is not supported in this edition of Reporting Services.;
    Really? This totally contradicts the documentation found here:
    https://msdn.microsoft.com/en-us/library/ms157285(v=sql.110).aspx
    That article says remote connections are completely supported.
    ARGH! Why does this have to be so difficult to set up?!?

    Hi jeffoliver1000,
    According to your description, you are using Enterprise Core edition and you are prompted that you can’t use remote data sources.
    In your scenario, we are not ignoring your point or doubting what you say. But we have actually seen cases before where, even though the SQL Server engine is Enterprise, Reporting Services is still Standard. So I would recommend that you find the actual edition of Reporting Services you are using. You can find the Reporting Services starting SKU in the Reporting Services logs (default location: C:\Program Files\Microsoft SQL Server\<instance name>\Reporting Services\LogFiles). For more information, please refer to the similar thread below:
    https://social.technet.microsoft.com/Forums/en-US/f98c2f3e-1a30-4993-ab41-acbc5014f92e/data-driven-subscription-button-not-displayed?forum=sqlreportingservices
    By the way, have you installed the other SQL Server edition before?
    Best regards,
    Qiuyun Yu
    TechNet Community Support

  • Registering with the WebEx Data Center and the Cisco WebEx Node Management System

    Dear guys, ...
    Please help,
    I want to implement the WebEx node on an ASR1000. I have read in "Configuring the Cisco Webex Node for ASR 100.pdf" that there is a prerequisite for implementing it, namely "Registering with the WebEx Data Center and the Cisco WebEx Node Management System".
    Can someone tell me how to do "Registering with the WebEx Data Center and the Cisco WebEx Node Management System"?
    Is there any step-by-step documentation for "Registering with the WebEx Data Center and the Cisco WebEx Node Management System"?
    Thank you
    BR

    You should have received a PAK Key with your order.  Go to Cisco licensing and enter the PAK Key, as this will start the process.  Once the PAK Key is validated, a screen will be displayed to enter your request for ASR 1000 integration.  It normally takes a few days to a couple of weeks to get the information back from WebEx needed to configure your ASR.
    If you did not get a PAK Key contact your WebEx rep to get the process started to integrate your ASR to your WebEx site.
    Hope this helps
    John

  • A message will pop up (Exc in ev handl: Error: Bad NPObject as private data!) and the tab I was on will close out and reopen in a new window. Why is it happening and how do I stop it?

    Okay, I will have a window open with 4 tabs open. At first, everything is fine, but after a day or so, a message will pop up on screen saying "'''Exc in ev handl: Error: Bad NPObject as private data!'''" and after you close it out, the tab you were on will close and reopen in its own window. I then have to shut down both the old and new windows and open a new window with my original 4 tabs again. It will work fine until a few days pass and it starts over again.

    This issue can be caused by the McAfee Site Advisor extension
    *https://support.mozilla.com/kb/Troubleshooting+extensions+and+themes
    Start Firefox in <u>[[Safe Mode]]</u> to check if one of the extensions or if hardware acceleration is causing the problem (switch to the DEFAULT theme: Firefox (Tools) > Add-ons > Appearance/Themes).
    *Don't make any changes on the Safe mode start window.
    *https://support.mozilla.com/kb/Safe+Mode

  • Excel data source and the use of add command

    Hi, looking for suggestions on how to work with multiple inputs that cannot be joined directly.  Here's the background.
    The report currently reads in two different Excel files and uses 3 SQL commands to query an Oracle database.  I need to join the 5 data sources and am having issues with the 2 Excel files.  In one file I need to be able to derive a field based on another column in the file in order to create the join condition to the SQL commands.  I'd equate this to a case statement in SQL, but how does one do that using the 'add command' feature?  What is the syntax?
    Next I would need to join (left outer) the two excel files using two fields from file A (a1, a2) and 3 fields from file B (b1,b2,b3), where a1=b1 and b2 <= a2  <= b3 when rows from A exist in B.  If row A does not exist in B then we still want it in the report and available to left outer join to the 3 oractab data sources.
    runtime is also a concern.
    Any Suggestions?

    hi Elena,
    in this case the use of subreports is not recommended. that's because you're exporting to excel and you need data in columns across the report. subreports will not, unfortunately, give you what you need.
    this would bring you back to joining the datasources. what i would recommend is looking into using 'oracle database link' to link your oracle db to excel files. here's one article as an example but you may be able to find a better one. if you have questions on this please ask them on an oracle forum as the syntax that you need will be database specific.
    a lot of databases have this type of technology which allows you to create a view to other data. sql server has 'linked servers', sap hana has 'smart data access'. essentially you are creating a non-materialized view to the external data. then this view is available on the main oracle server where you established this connection. this should be a lot easier than trying to bring a bunch of command objects together off independent datasources inside of crystal.
    -jamie

  • Xerces Sockets and The root element is required

    Hi,
    I have a problem with Xerces 1.4.3 saying "The root element is required in a well-formed document.". The story is as follows:
    When I read the XML from a file, it works OK.
    But when I send the file over a socket stream and try to parse it on the client side, it gives this silly message.
    I am sure that there is no space before <?xml ... ?> and also that the xml file is well formed.
    I guess someone will say Xerces 2 solves this problem, but I am using JBuilder 7 and I could not install the new Xerces over the old one. If someone knows how to do this please let me know. (Copying the jars into jbuilder7/lib does not seem to work, as the file names are different and it won't overwrite the old Xerces parser.)
    Any solution?
    Thanks,

    Saving and parsing is working fine. hehe :)
    It seems that the parser is somewhat going crazy.
    Btw, I had written a class extending InputStream so that it reads more than one file from a socket stream. But I used it in this save-and-parse test and it worked, so I don't think I have problems with that.
    In one forum here, I saw someone saying that changing the version solved his problems with this silly thing. I think I will try this.
    Tankut
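
    One way to avoid feeding Xerces a partial or empty stream is to frame each document on the socket, read exactly that many bytes, and hand the parser a bounded in-memory stream. Below is a rough Java sketch of the receiving side; the length-prefix protocol (an int byte count before each document, -1 for end-of-stream) is an assumption for illustration, not something from this thread, and it uses the standard JAXP DocumentBuilder rather than the raw Xerces classes:

    import java.io.ByteArrayInputStream;
    import java.io.DataInputStream;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.xml.sax.InputSource;

    public class FramedXmlReader
    {
        // Reads one length-prefixed XML document from the socket stream and parses it.
        // Returns null when the sender signals end-of-stream with a length of -1.
        public static Document readDocument(DataInputStream in, DocumentBuilder builder)
            throws Exception
        {
            int length = in.readInt();      // the sender writes the byte length first (assumed protocol)
            if (length < 0)
                return null;

            byte[] xmlBytes = new byte[length];
            in.readFully(xmlBytes);         // block until the whole document has arrived

            // The parser now sees a complete, bounded document with a real end-of-stream,
            // so a short read can no longer look like a document with no root element.
            return builder.parse(new InputSource(new ByteArrayInputStream(xmlBytes)));
        }

        public static void main(String[] args) throws Exception
        {
            DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
            // Usage sketch:
            // java.net.Socket socket = new java.net.Socket(host, port);
            // DataInputStream in = new DataInputStream(socket.getInputStream());
            // Document doc;
            // while ((doc = readDocument(in, builder)) != null) { /* handle doc */ }
        }
    }

    The same idea applies if you stay with the Xerces DOMParser: the key point is that the stream handed to parse() must end where the document ends.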

  • Need JavaScript for Sharepoint Designer 2010, to show a Pop-up to all the users between date range and the redirect the site

    Dear all,
    I am very new to SharePoint Designer 2010. It's better to say I am just stepping in.
    I have a SharePoint site which is accessible to all users so they can update a few pieces of information. But I want this site not to be accessible during a date range, say from the 15th to the 20th of every month.
    I am planning to have something like this: if any user tries to access this SharePoint site between the 15th and 20th of every month, then the browser must show a pop-up alert saying "This site is not accessible during this date range" and redirect this SharePoint site to some other site (say http:// somesitename.xx.com) immediately.
    I know very little about how to add JavaScript to a SharePoint web part.
    Please provide the JavaScript or the best alternative solution for my concern.
    Thanks in advance.

    Hi,
    From your description, my understanding is that you want to redirect the SharePoint site to another site between the 15th and 20th of every month.
    I agree with what Sudip says. If you still want to use JS code, you could try the steps below:
    Open your site with SharePoint Designer.
    Click Master Pages in left navigation.
    Find the file v4.master and back it up (this is very important).
    Right-click your v4.master and choose "Edit File in Advanced Mode".
    Add the code below inside the <head> tag.
    <script src="https://code.jquery.com/jquery-1.11.1.min.js" type="text/javascript"></script>
    <script type="text/javascript">
    $(document).ready(function () {
        var today = new Date();
        var day = today.getDate();              // current day of the month
        var id = _spPageContextInfo.userId;     // id of the currently logged-on user
        var inAllowedGroup = false;             // true if the user may access the site from the 15th to the 20th
        $.ajax({
            // get the users of the group that is still allowed to access the site from the 15th to the 20th of every month
            url: _spPageContextInfo.webAbsoluteUrl + "/_api/web/SiteGroups/GetByName('sharepoint Owners')/Users",
            type: "GET",
            headers: { "accept": "application/json;odata=verbose" },
            success: function (data) {
                if (data.d.results) {
                    var src = data.d.results;
                    for (var i = 0; i < src.length; i++) {
                        if (src[i].Id == id) {  // check whether the logged-on user is in the specified group
                            inAllowedGroup = true;
                            break;
                        }
                    }
                }
                if (!inAllowedGroup && day >= 15 && day <= 20) {
                    alert("This site is not accessible during this date range"); // alert the message
                    location.href = "http:// somesitename.xx.com";               // redirect to another site
                }
            },
            error: function (xhr) {
                alert(xhr.status + ': ' + xhr.statusText);
            }
        });
    });
    </script>
    Best Regards
    Vincent Han
    TechNet Community Support

  • How to use the same variable value for data entry and the planning sequence

    Hi,
    the scenario is the following:
    Using the WAD template a user enters cost center plan data. The cost center is selected by the chosen value for the variable "V1".
    Afterwards he shall push a button which starts a planning sequence (including saving the data and further functions). This planning sequence uses a filter that also contains the variable "V1".
    What has to be defined, and where, so that the planning sequence automatically uses the same value for the variable "V1" as was selected for the data entry?

    You have to define it in the planning function. The planning sequence is only a sequence; it reads the planning functions underneath it.
    Ravi Thothadri

  • Can't omit both the rowset and the row element?

    consider the simple document below. Notice that row-element and
    rowset-element are both empty, so neither type of element will
    be generated. Also notice that the xsql:query tag is embedded in
    other tags.
    I get "oracle.xml.sql.OracleXMLSQLException: The row enclosing
    tag or the row-set enclosing tag is ommitted; consequently to
    get a well formed XML document, the result can only consist of a
    single row with multiple columns or multiple rows with exactly
    one column each." However, in reality the result will be a valid
    XML document because I enclosed the query in a single top-level
    <table> tag. So the error checking code in the XSQL servlet is
    generating a false positive (that is, seeing an
    error that isn't there).
    My question is, how can I work around this problem? I know I can
    let the servlet generate a rowset element and use an XSLT
    stylesheet to remove it, but is there another way?
    Thanks,
    Brian
    By the way, the "numbers" table contains the numbers 1-1,000,000
    and so it can be used to generate multiple copies of the output
    of any query. In this example, I expect to get 55 <th> tags in
    the output (yes, my HTML table is really that wide), all nested
    in a single <tr> tag which is nested inside a single <table> tag.
    <table connection="XXX"
    xmlns:xsql="urn:oracle-xsql"
    xmlns='http://www.w3.org/1999/xhtml'>
    <tr><th rowspan='2'>ID</th>
    <xsql:query rowset-element=''
    row-element=''>
    <![CDATA[
    SELECT 'fvc'        AS "th"
         , 'fev1'       AS "th"
         , 'fef25_75'   AS "th"
         , 'fev1/fvc'   AS "th"
         , 'vc'         AS "th"
         , 'tlc'        AS "th"
         , 'rv'         AS "th"
         , 'frc_n2'     AS "th"
         , 'frc_pl'     AS "th"
         , 'erv'        AS "th"
         , 'dlco'       AS "th"
      FROM numbers
     WHERE n < 5
    ]]>
    </xsql:query>
    </tr>
    </table>

    In XML, single quotes are equivalent to double quotes. This
    enables you to have attribute values like "That's Neat" and
    'About 2" Long'.
    The problem is that an xsql:query can only return a fragment that is a single tree (i.e. there must be one root element). I think this is too restrictive.

  • Data Blocks and the Elapsed Time

    Hi,
    I have created 3 tables with one column only. As an example Table 1 below:
    SQL> create table T8k( x char(2000));
    So 3 tables are created in this way, i.e. T8K, T16K and T4K:
    T8K = created in the default database tablespace with an 8k block size (11g v11.1.0.6.0 - Production) (O.S. = Windows).
    T16K = created in a tablespace with a 16k block size.
    T4K = created in a tablespace with a 4k block size, in the same instance.
    Each table has 290,000 rows and all the 3 tables have equal size of 555MB (2006(rowsize with overhead) * 290,000/1024/1024 = 555MB) to test Elapsed Time (set timing on).
    As these 3 tables are created under different block sizes so the allocated no. of data blocks are different as below:
    T8K  =  97177 blocks = 00:41:20.21 (elapsed time)
    T16K =  41639 blocks = 00:44:11.59
    T4K  = 293656 blocks = 00:37:29.06
    Please note the difference. The first table, with the 8k block size, has 97177 allocated blocks and takes around 41 mins. The third table, T4K (in a 4k block size tablespace), has almost 3 times as many allocated blocks (293656), yet takes around 37 mins to execute the query. I mean the difference is hardly 4 mins while the block counts differ by a factor of 3.
    In case of any doubt: I've created these tables (555 MB) bigger than the memory used for my db (memory_max_size 408M), because if the blocks were already in the cache then reading the blocks and counting the rows would be nearly the same regardless of the block sizes or the number of blocks.
    I need solid suggestions and, if possible, links to some serious discussions regarding my issue.
    Bundle of thanks.
    Best Regards,

    Because I was not completely satisfied with the answers last time. It doesn't seem as simple to me as people said.
    I thought maybe some guru could have a look today, not just point out that I have posted it again.
    And I have compacted my question to avoid confusion.

  • The feature: "The edition of Reporting Services that you are using requires that you use local SQL Server relational databases for report data sources and the report server database." is not supported in this edition of Reporting Services.

    Hello all,
    I have SQL Express 2014 Advanced edition installed,
    and I am connecting to a SQL Server 2008 R2 instance which is on the network,
    while creating a data source in Report Server (which has Express installed).
    I got this error.
    Please help me with how to connect to a remote server.
    Dilip Patil..

    The error message says it all.
    With SQL Express, the data source should be a local SQL DB.
    With SQL Enterprise, Standard, or BI edition, you can create data sources which are hosted on other servers.
    Please refer to the similar thread:
    https://social.msdn.microsoft.com/Forums/en-US/c0468e3f-bad7-47a7-a695-75c13762280a/the-feature-the-edition-of-reporting-services-that-you-are-using-requires-that-you-use-local-sql?forum=sqlreportingservices
    Cheers,
    Vaibhav Chaudhari
    [MCTS],
    [MCP]

Maybe you are looking for

  • Questions on Calculation Scripts

    I have two questions on Calculation Scripts: 1. When executing a Calculation script via the Administration Services console (right-click execute) besides the Messages lower window pane, is there a detailed log/trace of the script activity? 2. We had

  • Sales Order Stock - Problem with Schedule line

    Hi guru, I have a problem with my schedules line. We use the Sales order stock ( using MB1B with good movement  412/E) for some specific customers. Even if the sales order reservation works well, there is an issue with the schedule line in the order.

  • Can I run Exchange 2013 Schema more than once

    I have started in installing Exchange 2013. I ran schema update using the Exchange 2013 RTM. Should I go ahead and install Exch2013 RTm and just run the updates afterwards or should I run schema update again using the Exch2013 SP1 setup.exe?

  • Time conversion ZVAR_UNIT with FLTP InfoObject gives an error

    Hello, Situation: ZVGW03X is "a copy" of AFVV.VGW03 but with FLTP instead of QUAN data type: It has to be FLTP, because we do some divisions and multiplications with the InfoObject. When doing the same calculations with Data Type QUAN we get results

  • Listening to iTunes with the screen Flipped down?

    This really bugs me. I don't want my downloads to stop if I flip down the screen. I want to listen to iTunes with the screen flipped down; how do I do it?