N8 has a higher SAR value compared with previous models

N8 = 1.02 W/kg
N97 = 0.66 W/kg
N96 = 0.82 W/kg
N95 = 0.50 W/kg

The SAR limit stated in the ICNIRP guidelines is 2.0 watts/kilogram (W/kg) averaged over ten grams of tissue.

Similar Messages

  • High SAR value for Nokia N79

    Hi
    i am planning to check out the nokia n79, but on reading the tech specs i found it has quite a high SAR value, 1.40... is it advisable??? 1.40 is really a high value...
    Help needed....Thanks in advance...

    In truth, the SAR values of all phones currently in production have very minimal known effects on people. This was more of an issue 15 years ago, when SAR values were something like 10x what they are now. If you really have an issue with it, use a headset for calls.

  • I've found that middlemouse.contentLoadURL feature has changed its behavior in FF5.0 as compared with previous versions

    I've found that the middlemouse.contentLoadURL feature has changed its behavior in FF5.0 compared with previous versions: I can now paste & load only full URLs that start with the "http://" prefix, whereas previously it was possible to paste & load any URL even without a protocol prefix. That is very useful when someone gives you a URL that is just "support.mozilla.com", but with FF5.0 one _has_ to paste it into the URL bar instead of just middle-clicking into the browser window (a long-standing and usual *nix behavior). How can I make middlemouse.contentLoadURL load any clipboard content not prefixed with a protocol specification?

    Oops, already found a solution: https://support.mozilla.com/ru/questions/824759
    But how could it be done via something less freaky (a la about:config)?

  • Any Idea How Temperatures Of New Imacs Compare With Previous Models ?

    Has anyone had a chance to test or compare the operating temperatures of the new and old models?
    In particular I would like to know how the new entry 2.4GHz compares with the old 2.0GHz.

    Found my own answer...
    Mid to late Feb, G73 series. 
    Gonna be hot!  Looks like they will be able to run all the most demanding games on highest settings.  Sweet deal. 

  • Alert on Today's Data Compared with Previous Day Data

    Hi All,
    I've a report with multiple tabs having data like,
    Tab 1: Franchise: ABC | Start Date: | Operator: XYZ | Incoming Node: N01 | Incoming Path: P01 | Incoming Count: 2 | Incoming Mins: 10.63
    Tab 2: Franchise | Start Date | Operator | Outgoing Node | Outgoing Path | Outgoing Count | Outgoing Mins
    Tab 3: Franchise | Month | Operator | Incoming Node | Incoming Path | Incoming Count | Incoming Mins
    Tab 4: same fields as Tab 3
    My requirement here is: I want an alert on the percentage of Mins, comparing the data with the previous day's (month's for Tabs 3 & 4) Mins. For now I've done this by creating a variable (with the RelativeValue function) that stores the previous day's or month's Mins, and I stored the difference between the two in another variable called Variation. I put a percentage on the Variation variable and set an alerter on it. That worked up to this point, but my requirement also includes a graphical alert on the percentage. I tried adding the Variation variable, the percentage and all available measures, but an error showed up with every try saying "Error in dataset values :#COMPUTATION".
    Is there any other way to do this report?
    Note: I'm using WebI Rich Client v4.1 SP1

    I tried using Franchise (Dim), and with that I used Incoming Count, Incoming Mins and Variation separately, but nothing worked.
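    As a plain-Java sketch (not WebI syntax, and with made-up values) of the arithmetic the Variation variable is meant to produce: take the previous day's Mins via the relative-value lookup, subtract, and express the difference as the percentage the alerter checks.
     public class VariationSketch
     {
       public static void main( String[] args )
       {
         double[] incomingMins = { 10.63, 12.40, 9.80 };   // day 1, day 2, day 3
         for ( int day = 1; day < incomingMins.length; day++ )
         {
           double previous  = incomingMins[day - 1];          // previous day's Mins
           double variation = incomingMins[day] - previous;   // the "Variation" variable
           double pct       = variation / previous * 100.0;   // percentage used for the alerter
           System.out.printf( "Day %d: variation = %.2f (%.1f%%)%n", day + 1, variation, pct );
         }
       }
     }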

  • Taking model baseline and comparing with previous versions

    Hi:
    We are using Subversion to version the model generated out of SQL Developer Data Modeler.  Like in Erwin, is it possible to take a baseline of each changed version of the model as a single file?  If not, how do we compare the latest version of the model with the previous one to find all the changes?  I would greatly appreciate it if someone who has done this before could share the details.
    Thanks,
    Senthil

    Hi Senthil,
    you can use "Versioning (or Team in v4)>Design Version History" in order to see all changes committed into repository. You can see changes (at object level) between revisions or compare (merge) to local version.
    Otherwise you need to checkout versions of design at revisions you want to compare and can use "Tools>Compare/merge models" or "File>import>Data Modeler Design" in order to see all changes.
    Philip

  • Nokia N8 reduce lower SAR value

    Hi, I have noticed that the Nokia N8 has a high SAR value of 1.02 W/kg, while the Samsung Galaxy 2 has a SAR of 0.35!
    I have not found any posts about bad reception quality on the Samsung, so why did Nokia choose such a high SAR value??? My old N6680 has a SAR of 0.64, and that's a phone I can use anywhere - elevators, bad reception areas, you name it.
    So my question is this:
    How can I reduce or lower the SAR value of the Nokia N8, with a service application or plugin or whatever?
    The service manual for the N8 uses software to test the transmitter with various power settings, so there should be a way to set lower power. That would also increase battery life, and you probably know WHO announced that microwaves are carcinogenic, don't you?
    Attachment: Nokia N8 reduce lower SAR.gif (53 KB)

    It is not quite as simple as that.
    Phones always have to be below a particular SAR value, and depending on which part of the world you are in it is typically 2. That is already a very low level, so when you are comparing figures like 0.35 and 1.02 you are comparing two very small numbers that both sit well below another small number, a threshold that is itself many times below the level at which any harm would be caused.
    You should also know that the figure quoted is the maximum across a range of different tests and conditions, including against the head, body and hand, different channels, voice and data modes, and combinations with WiFi etc.
    All this means that in a typical situation, when you are just making a phone call, the SAR value of most phones is below the figure quoted - although again you are still only comparing two very small numbers....
    And the reason why Nokia doesn't allow you to change the power levels is that those are very carefully calibrated to meet the GSM and WCDMA standards; otherwise the device would not be certified and allowed for sale, and would most likely not work....

  • Lower SAR values -- More sold BB phones!

    Dear community,
    Dear BB smartphone antenna developers,
    I want to point out that the lowest SAR (specific absorption rate) values for mobile phones have been achieved by Samsung over approximately the last 6 years. Samsung Galaxy models have extremely low values of about < 0.4 W/kg. A detailed list of phone models and SAR values can be found on the German website http://handywerte.de/index.php.
    Samsung Galaxy phones show very good voice quality despite the fact that they "irradiate less". Hence, I wonder why BlackBerry does not aim for low SAR values too. There is great selling potential for BlackBerry, as there are many customers who choose a phone with regard to the SAR value - and nowadays they choose Samsung.
    Despite the fact that there is no scientific study that indicates a harmful effect of cell phone radiation on human health, BlackBerry could sell many more phones with lower SAR values - there are still many people out there paying attention to the SAR value.
    With kind regards
    BBsupporter

    Hi BillJ,
    you are right, of course, that low SAR values are not the only reason to buy a phone and thus do not automatically translate into higher sales.
    Whether you find people who are interested in SAR values or not depends on the target audience you are talking to: for instance, if you support business customers who have to ensure security and stability in the communication systems of their company, then the fantastic characteristics of BlackBerry phones and services (for which I love BlackBerry, by the way) are the main focus.
    However, I am an electrical engineer and have worked in the field of electromagnetic compatibility (EMC). When working with EMC, after some time I had to face a citizen's initiative regarding cell phone radiation. I was surprised and realized that there actually are people who think about a possibly harmful effect of cell phone radiation on human health.
    As I said, I am an electrical engineer, and to this day there is NO scientific proof of any harmful effect of cell phone radiation. But let us look at it from a BB economic selling point of view: there are different reasons for people to buy a cell phone: a good OS, a brilliant display, a good camera, a long-lasting battery, a comfortable keyboard, a stylish body, big memory, brilliant loudspeakers, etc. ... If there are people out there who wish for a low SAR, then let me tell you that there is NOTHING that can be achieved more easily than a lower SAR!
    For instance, to improve the camera significantly, you will have to use better lenses and a better sensor, and maybe the software will have to be adjusted for better image signal processing (noise suppression). This means significant effort for BlackBerry. As an EMC engineer I can tell you that the effort for changing the SAR is negligible - it requires only slight changes in the antenna design. So, if the effort is practically zero, why not do it?
    What is my motivation? I love BlackBerry devices and just want BB to be successful again. It is sad to see that BlackBerry has had so many problems in the past. E.g., I myself think that opening the door for Android in OS 10 is absolutely necessary and will help to win back customers.
    Hence, my post about the SAR was just to suggest how BlackBerry can even better fulfill some specific customer wishes. If the effort in changing the antenna design is practically zero, if there are practically no costs for the changes, and the voice quality can still be excellent (as shown by Samsung's Galaxy devices), then why not reduce the SAR?
    Kind regards
    BBsupporter

  • Unusually High MAPE Values

    Posting this again in case anyone can give some inputs.
    We ran the forecast on a selection ID that has all product-customer combinations, using Auto Model 56 in the background at the prod-cust level. The system has proposed strategies like constant, Croston, seasonal and so on, which is fine.
    We have set up a limit of 80 for the error metric MAPE in the diagnosis group. If I am not wrong, MAPE is a percentage value, and 80 should mean 80%. But after the forecast run using 56, the system generated abnormally high MAPE values; for example, values like 2,956,009.830 and even 10,000,000.000. These are nowhere close to our limit of 80. MAPE is the average of absolute percentage differences between actuals and ex-post forecast, so how could the system-generated MAPE be so high for all the combinations? Are we supposed to read it differently? I thought that auto model, even if it may not propose the best-fit model parameters, would at least not give us a forecast with such high MAPE values. I was expecting MAPE within 100, or worst case 200 or 300, but not thousands and millions.
    Am I missing something? Please help - Susan

    Hello Susan
    You could use the MAPE-A. You have to implement it on your own in a BAdI.
    The MAPE-A is a value between 0 and 2.
    Hope that helps you.
    Sven
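    As a minimal illustration (not SAP code) of the standard MAPE formula Susan describes - the average of |actual - ex-post forecast| / |actual| in percent - the sketch below shows why the figure explodes whenever an actual value is close to zero, which is a common reason for MAPE values in the thousands or millions. The numbers used are made up.
     public class MapeSketch
     {
       // MAPE = average over all periods of |actual - forecast| / |actual| * 100
       static double mape( double[] actuals, double[] expostForecast )
       {
         double sum = 0.0;
         for ( int i = 0; i < actuals.length; i++ )
         {
           sum += Math.abs( actuals[i] - expostForecast[i] ) / Math.abs( actuals[i] ) * 100.0;
         }
         return sum / actuals.length;
       }
       public static void main( String[] args )
       {
         // well-behaved series: MAPE stays far below a limit of 80
         System.out.println( mape( new double[] { 100, 120, 90 }, new double[] { 110, 115, 95 } ) );
         // one near-zero actual dominates the average and pushes MAPE into the hundreds of thousands
         System.out.println( mape( new double[] { 0.01, 120, 90 }, new double[] { 50, 115, 95 } ) );
       }
     }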

  • BO WebI report hierarchy with measure values showing more (almost double the value) compared to BW BEx report

    Hi,
    In our BO WebI report, a hierarchy with measure values is showing higher values (almost double) compared to the BW BEx report. Can anyone please help with this?
    Is it a BW problem or a BO problem?
    I checked some other threads, but they don't give a solution.
    Thanks,
    Manjunatha

    Hi,
    Is it a BW problem or a BO problem?: BO.
    Is it causing a problem with hierarchy data only? Without hierarchies, does the data match or not?
    Post the same question in SAP BusinessObjects Web Intelligence.
    Thanks.

  • Use a single variable value to compare with 2 characteristics

    Hi guys
        I need some advice on how to use a single variable value to compare with 2 characteristics in an InfoCube.
    e.g.: I have 2 characteristics in the InfoCube:
           Launch Date & Closing Date
       Now I want to display a report where the variable date (entered by the user) is equal to both Launch Date and Closing Date.
        with regards

    Bobby,
    if I understood your situation correctly, you have an input variable ZINPUT (related to a date 'A') and 2 other dates (your Launch and Closing dates, 'B' and 'C').
    You want to display only the rows where A (user input) = B = C.
    Now you have to create 2 new variables (call them ZB and ZC, related to the B and C dates), NOT marked as 'ready for input' and set to 'mandatory variable entry'.
    Call Transaction CMOD for the definition of the customer exit (if not already existing!).
    Create a new project, maintain the short text, and assign a development class.
    Goto Enhancements Assignments and assign RSR00001. Press the button components to continue.
    Double-click on EXIT_SAPLRRS0_001. For documentation place the cursor on RSR00001 and use the menu Goto -> Display documentation. 
    Then double-click on ZXRSRU01. If the include doesn’t exist you have to create it; assign a development class and a transport request.
    Enter the coding:
     DATA: L_S_RANGE TYPE RSR_S_RANGESID.
     DATA: LOC_VAR_RANGE LIKE RRRANGEEXIT.
     CASE I_VNAM.
       " The same logic applies to both new variables ZB and ZC:
       " copy the value the user entered for ZINPUT.
       WHEN 'ZB' OR 'ZC'.
         IF I_STEP = 2.
           READ TABLE I_T_VAR_RANGE INTO LOC_VAR_RANGE
                WITH KEY VNAM = 'ZINPUT'.
           IF SY-SUBRC = 0.
             L_S_RANGE-LOW = LOC_VAR_RANGE-LOW.
           ENDIF.
           L_S_RANGE-SIGN = 'I'.
           L_S_RANGE-OPT  = 'EQ'.
           APPEND L_S_RANGE TO E_T_RANGE.
         ENDIF.
     ENDCASE.
    Save and activate the coding and the project.
    Now go to your query and use these two new variables to restrict B and C....et voilà !!!
    Let me know if you need more help (and please assign points !!! be generous !!!)
    Bye,
    Roberto

  • Has anyone else had a problem with data? After updating my phone to the new update, I have all this data usage and keep getting notifications that my data usage is getting high. I never had this problem before.

    Has anyone else had a problem with data? After updating my phone to the new update, I have all this data usage and keep getting notifications that my data usage is getting high. I never had this problem before.

    Fair enough. No need for any unnecessary posts either. Your issue has been addressed ad nauseam in this forum already; you would have found it had you bothered to search the forum (as forum etiquette would dictate) before posting.
    In any case, I hope that your problem was solved.

  • High volume of batches with Split valuation - impact on system performance

    Hi!
    I have a client that is intending to load a new material type from their legacy system which will be automatically batch-managed with split valuation.  So, the valuation category will be 'X' and the valuation type will also be the batch number, as automatically created on GR.
    The concern of the client is the impact on system performance: having up to 80,000 batches per material master record (so 80,000 valuation types will be maintained, each with a unique price in the Accounting 1 tab of the MMR) and overall around 1 million batches a year.  I'm not aware of any system performance issues around this myself, but there seems to be anecdotal evidence that SAP has advised against using this functionality with high volumes of batches.
    Could you please let me know of any potential problems that having 1 million batches with split valuation may cause?  Logically, this would increase to tens of millions of batches over time until archived off via SARA.
    Many thanks!
    Anthony

    I currently have about 1.5 million batches with split valuation in my system (but it is not the X split), and we archive yearly.
    Having many batches for one material (let's say 1,000) causes dramatic performance issues during automatic batch determination.
    It took about 5 minutes until a batch was returned into a delivery. If the user then wants a different batch and has to carry out batch determination again, he easily spends 10 to 15 minutes on one delivery.
    This is mainly caused by the storage location segment of the batches: if one batch gets moved within a plant through 3 different storage locations, then the batch has 3 records in table MCHB. But SAP has a report to reorganize the MCHB records that have zero stock.
    The X split has a bigger effect; it is not only the batch table that causes issues in this case. With the X split, SAP adds an MBEW record (material master valuation view) for each new batch.
    However, if the design is made to get a certain functionality (here, valuation at batch level), then you have to put proper hardware in place that can give you the performance that is needed.

  • Custom comparator with LimitFilter

    When I use a custom comparator with a LimitFilter, the results within the limit are returned first and the comparator is then applied to them, instead of the comparator being applied to the complete result set and the limit being taken afterwards. Is there a way to achieve the scenario I am expecting?
    Thanks in advance

    I am also on coherence version 3.5.2
    1. custom-pof-config.xml:
    <?xml version="1.0"?>
    <!DOCTYPE pof-config SYSTEM "pof-config.dtd">
    <pof-config>
         <user-type-list>
              <include>coherence-pof-config.xml</include>
              <user-type>
                   <type-id>2001</type-id>
                   <class-name>com.biperf.cache.example.CacheItem</class-name>
              </user-type>
              <user-type>
                   <type-id>2002</type-id>
                   <class-name>com.biperf.cache.example.CacheSubItem1</class-name>
              </user-type>
              <user-type>
                   <type-id>2003</type-id>
                   <class-name>com.biperf.cache.example.CacheSubItem2</class-name>
              </user-type>
              <user-type>
                   <type-id>2004</type-id>
                   <class-name>com.biperf.cache.example.CustomFilter1</class-name>
              </user-type>
              <user-type>
                   <type-id>2005</type-id>
                   <class-name>com.biperf.cache.example.CustomComparator1</class-name>
              </user-type>
              <user-type>
                   <type-id>2006</type-id>
                   <class-name>com.biperf.cache.example.CustomProcessor1</class-name>
              </user-type>
         </user-type-list>
         <allow-interfaces>true</allow-interfaces>
         <allow-subclasses>true</allow-subclasses>
     </pof-config>
     2. Domain objects:
    public class CacheItem extends AbstractEvolvable implements EvolvablePortableObject, java.io.Serializable, com.tangosol.io.pof.PortableObject
      private static final int VERSION = 1;
      private static final long serialVersionUID = -1L;
      private long cacheItemId;
      private Set<CacheSubItem1> item1 = new HashSet<CacheSubItem1>();
      private Set<String> item2 = new HashSet<String>();
      private Set<Long> item3 = new HashSet<Long>();
      public long getCacheItemId()
        return cacheItemId;
      public void setCacheItemId( long cacheItemId )
        this.cacheItemId = cacheItemId;
      public Set<CacheSubItem1> getItem1()
        return item1;
      public void setItem1( Set<CacheSubItem1> item1 )
        this.item1 = item1;
      public Set<String> getItem2()
        return item2;
      public void setItem2( Set<String> item2 )
        this.item2 = item2;
      public Set<Long> getItem3()
        return item3;
      public void setItem3( Set<Long> item3 )
        this.item3 = item3;
      @SuppressWarnings( "unchecked" )
      @Override
      public void readExternal( PofReader reader ) throws IOException
        cacheItemId = reader.readLong( 0 );
        item1 = (Set<CacheSubItem1>)reader.readCollection( 1, item1 );
        item2 = (Set<String>)reader.readCollection( 2, item2 );
        item3 = (Set<Long>)reader.readCollection( 3, item3 );
      @Override
      public void writeExternal( PofWriter writer ) throws IOException
        writer.writeLong( 0, cacheItemId );
        writer.writeCollection( 1, item1 );
        writer.writeCollection( 2, item2 );
        writer.writeCollection( 3, item3 );
      @Override
      public int getImplVersion()
        return VERSION ;
    public class CacheSubItem1 extends AbstractEvolvable implements EvolvablePortableObject, java.io.Serializable, com.tangosol.io.pof.PortableObject
      private static final int VERSION = 1;
      private static final long serialVersionUID = -1L;
      private Map<Integer, CacheSubItem2> item1 = new HashMap<Integer, CacheSubItem2>();
      public Map<Integer, CacheSubItem2> getItem1()
        return item1;
      public void setItem1( Map<Integer, CacheSubItem2> item1 )
        this.item1 = item1;
      @SuppressWarnings( "unchecked" )
      @Override
      public void readExternal( PofReader reader ) throws IOException
        item1 = (Map<Integer, CacheSubItem2>)reader.readMap( 0, item1 );
      @Override
      public void writeExternal( PofWriter writer ) throws IOException
        writer.writeMap( 0, item1 );
      @Override
      public int getImplVersion()
        return VERSION ;
    public class CacheSubItem2 extends AbstractEvolvable implements EvolvablePortableObject, java.io.Serializable, com.tangosol.io.pof.PortableObject
      private static final int VERSION = 1;
      private static final long serialVersionUID = -1L;
      private int value;
      private boolean flag;
      public int getValue()
        return value;
      public void setValue( int value )
        this.value = value;
      public boolean isFlag()
        return flag;
      public void setFlag( boolean flag )
        this.flag = flag;
      @Override
      public void readExternal( PofReader reader ) throws IOException
        value = reader.readInt( 0 );
        flag = reader.readBoolean( 1 );
      @Override
      public void writeExternal( PofWriter writer ) throws IOException
        writer.writeInt( 0, value );
        writer.writeBoolean( 1, flag );
      @Override
      public int getImplVersion()
        return VERSION ;
    public class CustomComparator1 implements java.io.Serializable, Comparator<CacheItem>, com.tangosol.io.pof.PortableObject
      private static final long serialVersionUID = -1L;
      private int sortOrder = 1 ;
      private Integer key ;
      public CustomComparator1(){}
      public CustomComparator1( Integer key )
        this.key = key ;
      @Override
      public int compare( CacheItem item1, CacheItem item2 )
        return sortOrder * ( getValue( item1 ).compareTo( getValue( item2 ) )  ) ;
      private Integer getValue( CacheItem item )
        int value = item.getItem1().iterator().next().getItem1().get( key ).getValue() ;
        return new Integer( value ) ;
      @Override
      public void readExternal( PofReader reader ) throws IOException
        sortOrder = reader.readInt( 0 ) ;
        key = reader.readInt( 1 ) ;
      @Override
      public void writeExternal( PofWriter writer ) throws IOException
        writer.writeInt( 0, sortOrder ) ;
        writer.writeInt( 1, key ) ;
      public void setAscendingOrder()
        sortOrder = 1 ;
      public void setDescendingOrder()
        sortOrder = -1 ;
    public class CustomFilter1 implements Filter, java.io.Serializable, com.tangosol.io.pof.PortableObject
      private static final long serialVersionUID = -1L;
      private Integer key = null ;
      public CustomFilter1(){}
      public CustomFilter1( Integer key )
        super() ;
        this.key = key ;
      @Override
      public boolean evaluate( Object item )
        for( CacheSubItem1 subItem1: ((CacheItem)item).getItem1() )
          CacheSubItem2 subItem2 = subItem1.getItem1().get( key );
          if(null!=subItem2){
            return true ;
        return false ;
      @Override
      public void readExternal( PofReader reader ) throws IOException
        key = reader.readInt( 0 ) ;
      @Override
      public void writeExternal( PofWriter writer ) throws IOException
        writer.writeInt( 0, key ) ;
    public class CustomProcessor1 extends AbstractProcessor implements PortableObject
      private static final long serialVersionUID = -1L;
      private Integer key ;
      private Set<String> item2s = new HashSet<String>();
      public CustomProcessor1(){}
      public CustomProcessor1( Integer key, Set<String> item2s )
        this.key = key ;
        this.item2s = item2s ;
      @Override
      public Object process( com.tangosol.util.InvocableMap.Entry entry )
        if ( !entry.isPresent() )
          return null ;
        CacheItem item = (CacheItem)entry.getValue() ;
        return extract( item ) ;
      public CacheItem extract( CacheItem item )
        CacheItem extract = new CacheItem() ;
        extract.setCacheItemId( item.getCacheItemId() );
        Set<CacheSubItem1> item1s = item.getItem1() ;
        for( CacheSubItem1 item1: item1s )
          extract.getItem1().add( getExtractedItem1( item1 ) ) ;
        for( String item2: item2s )
          if( item.getItem2().contains( item2 ) )
            extract.getItem2().add( item2 );
        return extract ;
      private CacheSubItem1 getExtractedItem1( CacheSubItem1 hydrated )
        CacheSubItem1 extracted = new CacheSubItem1() ;
        extracted.getItem1().put( key, hydrated.getItem1().get( key ) ) ;
        return extracted ;
      public Integer getKey()
        return key;
      public void setKey( Integer key )
        this.key = key;
      public Set< String > getItem2s()
        return item2s;
      public void setItem2s( Set< String > item2s )
        this.item2s = item2s;
      @SuppressWarnings( "unchecked" )
      @Override
      public void readExternal( PofReader reader ) throws IOException
        key = reader.readInt( 0 ) ;
        item2s = (Set<String>)reader.readCollection( 1, item2s );
      @Override
      public void writeExternal( PofWriter writer ) throws IOException
        writer.writeInt( 0, key ) ;
        writer.writeCollection( 1, item2s );
     }
     3. Cache data loader:
    public class CacheDataLoader
      public static final String BASE_ITEM2_KEY = "12345678901234567890";
      public static final char[] VALID_CHARS =
      {'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'J',
      'K', 'L', 'M', 'N', 'P', 'Q', 'R', 'T', 'U',
      'V', 'W', 'X', 'Y', 'Z', '1', '2', '3', '4',
      '5', '6', '7', '8', '9', '0'} ;
      public static final int NUMBER_OF_ITEM2 = 1000 ;
      public static final int NUMBER_OF_ITEM3 = 1000 ;
      public static final int NUMBER_OF_ITEM = 20000;
      public static final int BATCH_LOAD_SIZE = 10000 ;
      public static final int NUMBER_OF_KEYS_PER_SUBITEM1 = 50;
      public static final int MULITPLE_ITEM1_FREQUENCY = 1000 ;
      public static final int NUMBER_OF_MULTIPLE_ITEM1 = 1;
      public static final int MULITPLE_ITEM2_FREQUENCY = 2;
      public static final int NUMBER_OF_MULTIPLE_ITEM2 = 3;
      public static final int MULITPLE_ITEM3_FREQUENCY = 2;
      public static final int NUMBER_OF_MULTIPLE_ITEM3 = 5;
      public static final long BASE_CACHE_ITEM_KEY = 10000000 ;
      public static final long BASE_ITEM3_KEY = 10000000;
      private long item3Count = 0;
      private long item2Count = 0;
      private long cacheItemCount = 0;
      private NamedCache cache = CacheFactory.getCache( "cache-data" ) ;
      public static void main( String[] args )
        CacheDataLoader loader = new CacheDataLoader() ;
        loader.load() ;
        loader.createIndices() ;
      private void createIndices()
        cache.addIndex( new KeyExtractor( IdentityExtractor.INSTANCE ), false, null );
        cache.addIndex( new ReflectionExtractor( "getItem2" ), false, null );
        cache.addIndex( new ReflectionExtractor( "getItem3" ), false, null );
      private void load()
        long start = System.currentTimeMillis();
        cache.clear();
        int iterations = NUMBER_OF_ITEM/BATCH_LOAD_SIZE ;
        for ( int i=0;i<iterations; i++ )
          cache.putAll( getCacheItems( BATCH_LOAD_SIZE ) );
        System.out.println( "CACHE LOAD: Instances: " + cache.size() + " keysize: " + NUMBER_OF_KEYS_PER_SUBITEM1 + " Time: " + ( System.currentTimeMillis() - start ) );
      private Map<Long, CacheItem> getCacheItems( int loadSize )
        Map<Long, CacheItem> cacheItems = new HashMap<Long, CacheItem>() ;
        for( int i=0; i<loadSize; i++)
          CacheItem cacheItem = getCacheItem();
          cacheItems.put( cacheItem.getCacheItemId(), cacheItem ) ;
        return cacheItems ;
      private CacheItem getCacheItem()
        CacheItem cacheItem = new CacheItem();
        cacheItem.setCacheItemId( getNextCacheItemId() );
        cacheItem.setItem1( getItem1() ) ;
        cacheItem.setItem2( getItem2() );
        cacheItem.setItem3( getItem3() );
        return cacheItem;
      private Set<Long> getItem3()
        Set<Long> item3s = new HashSet<Long>() ;
        //First item3
        item3s.add( getNextItem3Id() ) ;
        //Additional item3s
        if( isAdditionalItem3Required() )
          for(int i=0; i<NUMBER_OF_MULTIPLE_ITEM3; ++i){
            item3s.add( getNextItem3Id() ) ;
        return item3s;
      private Set<String> getItem2()
        Set<String> item2s = new HashSet<String>() ;
        //First item2
        item2s.add( getNextItem2Id() ) ;
        //Additional item2s
        if( isAdditionalItem2Required() )
          for(int i=0; i<NUMBER_OF_MULTIPLE_ITEM2; ++i){
            item2s.add( getNextItem2Id() ) ;
        return item2s;
      private Set<CacheSubItem1> getItem1()
        Set<CacheSubItem1> item1s = new HashSet<CacheSubItem1>() ;
        //First item1
        item1s.add( getSubItem1() ) ;
        //Additional item1s
        if( isAdditionalItem1Required() )
          for(int i=0; i<NUMBER_OF_MULTIPLE_ITEM1; ++i){
            item1s.add( getSubItem1() ) ;
        return item1s;
      private CacheSubItem1 getSubItem1()
        CacheSubItem1 item = new CacheSubItem1() ;
        item.setItem1( getSubItemMap( NUMBER_OF_KEYS_PER_SUBITEM1 ) ) ;
        return item;
      private Map<Integer, CacheSubItem2> getSubItemMap( int numberPriceKeys )
        Map<Integer, CacheSubItem2> items = new HashMap<Integer, CacheSubItem2>();
        for ( int x = 0; x < numberPriceKeys; x++ )
          Integer key = x;
          items.put( key, getSubItem2() );
        return items;
      private CacheSubItem2 getSubItem2()
        CacheSubItem2 item = new CacheSubItem2() ;
        item.setFlag( RandomUtils.nextBoolean() ) ;
        item.setValue( getRandomValue() ) ;
        return item;
      private boolean isAdditionalItem1Required()
        return cacheItemCount%MULITPLE_ITEM1_FREQUENCY == 0;
      private boolean isAdditionalItem2Required()
        return cacheItemCount%MULITPLE_ITEM2_FREQUENCY == 0;
      private boolean isAdditionalItem3Required()
        return cacheItemCount%MULITPLE_ITEM3_FREQUENCY == 0;
      private long getNextCacheItemId()
        return BASE_CACHE_ITEM_KEY + (++cacheItemCount);
      private long getNextItem3Id()
        return BASE_ITEM3_KEY + (++item3Count%NUMBER_OF_ITEM3);
      private String getNextItem2Id()
        return BASE_ITEM2_KEY + (++item2Count%NUMBER_OF_ITEM2);
      private int getRandomValue()
        return RandomUtils.nextInt( 10000 ) ;
     }
     4. Test Case:
    public class TestExampleCache extends TestCase
      public void testLimitFilter1()
        final Integer key = getKey();
        Set<String> item2 = getItem2();
        Set<Long> item3 = getItem3();
        CustomComparator1 comparator = new CustomComparator1(key);
        comparator.setAscendingOrder();
        Filter[] filterArray = { new ContainsAnyFilter( "getItem2", item2 ),
                                 new ContainsAnyFilter( "getItem3", item3 ),
                                 new CustomFilter1( key ) };
        Filter allFilter = new AllFilter( filterArray ) ;
        CustomProcessor1 processor = new CustomProcessor1(key,item2);
        Set<Map.Entry<Long, CacheItem>> result1 = CacheFactory.getCache( "cache-data" ).entrySet( allFilter,comparator );
        for(Map.Entry<Long, CacheItem> entry : result1 ){
          CacheItem item = processor.extract( entry.getValue() );
          System.out.println(item.getCacheItemId()+"-"+item.getItem1().iterator().next().getItem1().values().iterator().next().getValue());
        System.out.println();
      public void testLimitFilter2()
        final Integer key = getKey();
        final int numberOfProducts = 10;
        Set<String> item2 = getItem2();
        Set<Long> item3 = getItem3();
        CustomComparator1 comparator = new CustomComparator1(key);
        comparator.setAscendingOrder();
        Filter[] filterArray = { new ContainsAnyFilter( "getItem2", item2 ),
                                 new ContainsAnyFilter( "getItem3", item3 ),
                                 new CustomFilter1( key ) };
        Filter allFilter = new AllFilter( filterArray ) ;
        LimitFilter limitFilter = new LimitFilter(allFilter, numberOfProducts);
        CustomProcessor1 processor = new CustomProcessor1(key,item2);
        Set<Map.Entry<Long, CacheItem>> result1 = CacheFactory.getCache( "cache-data" ).entrySet( limitFilter,comparator );
        for(Map.Entry<Long, CacheItem> entry : result1 ){
          CacheItem item = processor.extract( entry.getValue() );
          System.out.println(item.getCacheItemId()+"-"+item.getItem1().iterator().next().getItem1().values().iterator().next().getValue());
        System.out.println();
        limitFilter.nextPage();
        Set<Map.Entry<Long, CacheItem>> result2 = CacheFactory.getCache( "cache-data" ).entrySet( limitFilter,comparator );
        for(Map.Entry<Long, CacheItem> entry : result2 ){
          CacheItem item = processor.extract( entry.getValue() );
          System.out.println(item.getCacheItemId()+"-"+item.getItem1().iterator().next().getItem1().values().iterator().next().getValue());
      private Integer getKey()
        return new Integer(10);
      private Set<String> getItem2()
        Set<String> items = new HashSet<String>();
        items.add( "12345678901234567890" + 1 );
        items.add( "12345678901234567890" + 2 );
        items.add( "12345678901234567890" + 3 );
        items.add( "12345678901234567890" + 4 );
        items.add( "12345678901234567890" + 5 );
        items.add( "12345678901234567890" + 6 );
        items.add( "12345678901234567890" + 7 );
        items.add( "12345678901234567890" + 8 );
        items.add( "12345678901234567890" + 9 );
        items.add( "12345678901234567890" + 10 );
        items.add( "12345678901234567890" + 11 );
        items.add( "12345678901234567890" + 12 );
        return items;
      private Set<Long> getItem3()
        Set<Long> items = new HashSet<Long>();
        items.add( new Long(10000001) );
        items.add( new Long(10000002) );
        items.add( new Long(10000003) );
        items.add( new Long(10000004) );
        items.add( new Long(10000005) );
        items.add( new Long(10000006) );
        items.add( new Long(10000007) );
        items.add( new Long(10000008) );
        items.add( new Long(10000009) );
        items.add( new Long(10000010) );
        return items;
     }
     5. Results:
    a. testLimitFilter1()
    10010001-109
    10002002-121
    10002004-487
    10006003-726
    10008004-762
    10000004-845
    10010003-922
    10014003-1157
    10012002-1426
    10008002-1585
    10002003-1709
    10004004-2004
    10004001-2179
    10018002-2452
    10016004-3073
    10012004-3145
    10008001-3249
    10018001-3270
    10008003-3319
    10016002-3778
    10012001-4256
    10012003-4391
    10002001-4921
    10006002-5072
    10000002-5162
    10016003-5777
    10014004-6068
    10000001-6260
    10000003-6373
    10004002-6615
    10014001-7679
    10006001-7729
    10006004-7794
    10010002-8188
    10010004-8215
    10018004-8258
    10016001-8383
    10018003-8760
    10004003-9652
    10014002-9876
    b. testLimitFilter2()
    Page-1
    10002004-487
    10000004-845
    10012002-1426
    10008002-1585
    10004004-2004
    10018001-3270
    10016003-5777
    10006004-7794
    10016001-8383
    10018003-8760
    Page-2
    10018002-2452
    10008001-3249
    10008003-3319
    10016002-3778
    10012001-4256
    10012003-4391
    10014004-6068
    10000003-6373
    10010002-8188
    10010004-8215
    c. Expected results:
    Page-1
    10010001-109
    10002002-121
    10002004-487
    10006003-726
    10008004-762
    10000004-845
    10010003-922
    10014003-1157
    10012002-1426
    10008002-1585
    Page-2
    10002003-1709
    10004004-2004
    10004001-2179
    10018002-2452
    10016004-3073
    10012004-3145
    10008001-3249
    10018001-3270
    10008003-3319
    10016002-3778
    PS: It looks like the following thread addresses the problem I have mentioned and has links to download the source code, but the links do not work. Is there a way I can access the Java source referenced in the post linked below?
    Re: The question about the locking of the cache
    SortByMethodNameAggregator.java
    SortByMethodNameCaller.java
    Can you please email me the code to [email protected]
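    One possible client-side workaround, sketched only (it is not from this thread, and the SortedPager class and page() method names are invented here): since QueryMap.entrySet(Filter, Comparator) returns the complete matching set already ordered by the comparator, you can take the page on the client after the global sort instead of letting LimitFilter page on the server. It reuses the cache-data cache, CacheItem and CustomComparator1 posted above, and it pulls the whole filtered result to the client, so it is only reasonable while that result fits comfortably in memory.
     import java.util.ArrayList;
     import java.util.Comparator;
     import java.util.List;
     import java.util.Map;
     import java.util.Set;
     import com.tangosol.net.CacheFactory;
     import com.tangosol.net.NamedCache;
     import com.tangosol.util.Filter;
     // Hypothetical helper: sort the complete filtered result set first, then slice out one page locally.
     public class SortedPager
     {
       @SuppressWarnings( "unchecked" )
       public static List<Map.Entry<Long, CacheItem>> page( String cacheName, Filter filter,
                                                            Comparator<CacheItem> comparator,
                                                            int pageSize, int pageIndex )
       {
         NamedCache cache = CacheFactory.getCache( cacheName );
         // entrySet(Filter, Comparator) applies the comparator to the *complete* matching set.
         Set<Map.Entry<Long, CacheItem>> sorted = cache.entrySet( filter, comparator );
         List<Map.Entry<Long, CacheItem>> all = new ArrayList<Map.Entry<Long, CacheItem>>( sorted );
         int from = Math.min( pageIndex * pageSize, all.size() );
         int to   = Math.min( from + pageSize, all.size() );
         return all.subList( from, to );   // the limit is applied only after the global sort
       }
     }
    For example, page("cache-data", allFilter, comparator, 10, 0) and page("cache-data", allFilter, comparator, 10, 1) should return the first two pages in the order shown by testLimitFilter1(). If fetching everything is too heavy, LimitFilter also exposes setComparator(...) and extractPage(...) helpers that can be used to page a pre-sorted array on the client, but check how they behave on your 3.5.2 build.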

  • Temperature values at high fixed values

    I have two 1102/1302s with a 6363 that have worked fine in the past.  After some down time I ran the VI and did not get correct values.  I investigated the virtual channels in MAX, and the table shows high fixed values for all channels (218-219 °C) at room temperature; they are static values.  It would seem these values are not being read from any connected TCs.  Any idea as to what could cause this behavior?

    teslac,
    How long has the system been sitting?
    Make sure that the hardware is physically set up correctly. Also, make sure that the correct thermocouple type is selected on your Virtual channel.
    Do you get the same values if you run the MAX Test panels?
    Do you get the same values if you run the example LabVIEW program “Thermocouple - Continuous Input”?
    Rob W.

Maybe you are looking for

  • Problems with a style sheet in InDesign CS3

    Hello everyone. I was wondering if anyone had any clue as to what the holy heck is happening to me. I am attempting to standardize our documents and use style sheets--hey, what a concept, right? So I'm going through a recently completed document and

  • Migrating text formatting from InDesign CC to Flash CC

    Since TLF is no longer supported in Flash CC, does that mean that maintaining the integrity of the type from an InDesign layout to Flash is no longer possible? Is there a possible work around? I'm designing content for interactive labels for a museum

  • The latest upgrade made Adobe Digital Editions unusable. Has this been reported?

    When I try to move my ebook to my reader I get an error message: Adobe Digital Editions 3.0 has stopped working. This happened with version 2.0 as well. There were some suggestions to change some settings related to Firefox but that didn't work eithe

  • How to make a NON-ITUNES STORE video into a DVD movie

    Hey all, just curious as to how I can burn a video I have imported from my iPod (I got a new computer and lost the original video file), which is in the MP4 format or whatever iTunes uses, to a Video DVD

  • HP Envy is asking me to save when I try to print.

    Every time I try to print a document from my computer, the print screen asks me to save and will not print, but the printer works from all other devices in the house.  Did I not set something up right on my computer?