Best Practice for disparately sized data

2 questions in about 20 minutes!
We have a cache which holds approximately 80K objects that expire after 24 hours. It's a rolling population, so the number of objects is fairly static. We're on a 64-node cluster with high-units set, giving ample space. But the data has a wide size range, from a few bytes to 30 MB, and everything in between. This causes some very hot nodes.
Is there a best practice for handling a wide range of object sizes in a single cache, or can we do anything on input to spread the load more evenly?
Or does none of this make any sense at all?
Cheers
A

Hi A,
It depends... If there is a relationship between keys and sizes, e.g. if a certain part of the key means that the value will be big, then you can implement a key partitioning strategy, possibly together with key association on the key, in a way that spreads the large entries evenly across the partitions (and make sure you have enough partitions).
Unfortunately, you would likely not get a totally even distribution across nodes, because you have a fairly small number of entries compared to the square of the number of nodes (by the way, which version of Coherence are you using?).
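
To make the idea concrete, here is a minimal, untested sketch of a size-aware partitioning strategy. Everything key-specific in it is an assumption: the SizeHintedKey contract stands in for whatever part of your real key tells you the value will be large, and the sequence number is just one deterministic way to deal large entries across partitions. You would typically register such a class through the key-partitioning element of your distributed scheme; check the Coherence documentation for your version.

import com.tangosol.net.PartitionedService;
import com.tangosol.net.partition.DefaultKeyPartitioningStrategy;

// Sketch only: spread entries that are known to be large across all partitions
// instead of letting them clump on a few members.
public class SizeAwarePartitioningStrategy extends DefaultKeyPartitioningStrategy {

    // Hypothetical contract; substitute whatever your real key class exposes.
    public interface SizeHintedKey {
        boolean isLargeObject();
        long getSequence(); // assumed non-negative
    }

    private PartitionedService service;

    public void init(PartitionedService service) {
        super.init(service);
        this.service = service;
    }

    public int getKeyPartition(Object oKey) {
        int cPartitions = service.getPartitionCount();
        if (oKey instanceof SizeHintedKey && ((SizeHintedKey) oKey).isLargeObject()) {
            // Deal the known-large entries out by sequence number so they land
            // on different partitions (and therefore different storage nodes).
            return (int) (((SizeHintedKey) oKey).getSequence() % cPartitions);
        }
        // Everything else keeps a plain hash-based placement.
        return (oKey.hashCode() & 0x7FFFFFFF) % cPartitions;
    }
}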
Best regards,
Robert

Similar Messages

  • Best Practice for Initial Load Data

    Dear Experts,
    I would like to know the best practices or factors to consider when performing an initial load.
    For example,
    1) requirements from business stakeholders for data analysis
    2) age of data needed to meet tactical reporting
    3) data dependencies across SAP modules
    4) Is there any best practice for loading master data?

    Hi,
    Check these links on master data loading:
    http://searchsap.techtarget.com/guide/allInOne/category/0,296296,sid21_tax305408,00.html
    http://datasolutions.searchdatamanagement.com/document;102048/datamgmt-abstract.htm
    Regards,
    Shikha

  • Best practice for putting binary data on the NMR

    Hi,
    We're creating a component that will consume messages off the NMR, encode them, and subsequently put them back on the NMR. What's the best practice for sending binary data over the NMR?
    1. setContent()?
    2. addAttachment()?
    3. setProperty()?
    If NormalizedMessage.setContent() is the desired approach, then how can you accomplish that?
    Thanks,
    Bruce

    setContent() is used only for XML messages. The recommended way to accommodate binary data is to use addAttachment().
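
    As a rough, untested sketch of that approach (the attachment id, MIME type, and helper class are arbitrary choices, and ByteArrayDataSource here comes from the JavaMail jar):

    import javax.activation.DataHandler;
    import javax.jbi.messaging.MessageExchange;
    import javax.jbi.messaging.MessagingException;
    import javax.jbi.messaging.NormalizedMessage;
    import javax.mail.util.ByteArrayDataSource;

    // Sketch: put an encoded binary payload back on the NMR as an attachment.
    public class BinaryAttachmentHelper {

        public void attachBinary(MessageExchange exchange, byte[] encodedBytes)
                throws MessagingException {
            NormalizedMessage out = exchange.createMessage();
            // setContent() expects XML (a javax.xml.transform.Source), so the
            // binary data goes on as an attachment instead.
            DataHandler handler = new DataHandler(
                    new ByteArrayDataSource(encodedBytes, "application/octet-stream"));
            out.addAttachment("payload", handler);
            exchange.setMessage(out, "out");
        }
    }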

  • Best Practice for Using Static Data in PDPs or Project Plan

    Hi There,
    I want to make custom reports using PDPs & Project Plan data.
    What is the best practice for using "static/random data" (which is not available in the MS Project 2013 columns) in PDPs and MS Project 2013?
    Should I add that data as a custom field (in MS Project 2013) or build PDPs for it?
    Thanks,
    EPM Consultant
    Noman Sohail

    Hi Dale,
    I have a Project Level custom field "Supervisor Name" that is used in Project Information.
    For the purpose of viewing that Project Level custom field's data in project views, I have made a Task Level custom field "SupName" and used the formula:
    [SupName] = [Supervisor Name]
    That shows the Supervisor Name in Schedule.aspx.
    ============
    Question: I want that Project Level custom field "Supervisor Name" in My Work views (Tasks.aspx).
    The field is enabled in Tasks.aspx, but the data is not present (the column is blank).
    How can I get the data into the My Work views?
    Noman Sohail

  • Best practice for retraction of data from BPC 10.0 to General Ledger

    Hi All,
    I have the requirement to retract data from BPC 10.0 to General Ledger.
    What is the best practice for doing this?
    I have read the "How To... Retract data from BPC 7.5 NetWeaver to SAP ERP Cost Center Accounting, Part I", but this is prepared to transfer data to Cost Center accounting. Will General Ledger be populated automatically?
    What is your opinion on this?
    Best regards,
    JA

    Hi Rich,
    In BPC 10 NW, the data entered in Input Forms has to be retracted to ECC.
    For this retraction we are following the link below.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c029accf-6d95-2d10-79aa-cc291180bf73?QuickLink=index&overridelayout=true&59180354379512
    In this document, I am unable to get the details of the class ZCL_BPC_RTRCT_SUPERCLASS.
    As it is a Z class, I am unable to create it in my system.
    Could you please help me?

  • Best practices for submitting CF data to an AJAX page?

    Hi everyone,
    I've got a project I'm working on for work and have hit a little problem.
    I am extracting data from my database and then, after each piece of data (just numbers, usually 10 chunks of numbers), I tack a "|" onto the end of each number. Then I output the data to the page. Back on my AJAX-enabled page, I get the "responseText" from that page and split it up using JavaScript and the pre-inserted "|".
    This seems to work fine, but it is quite messy. Also, it would really be nice to be able to do sorting and various other operations on the data with JavaScript instead of having to rely on CF's icky code logic.
    Can someone please enlighten me as to best practices for this type of thing? I suspect I'll probably be using XML somehow, but I'd like your opinion.
    Thanks!

    Check out the Samples and Documentation portions of Adobe's Spry website for client-side use of JSON with Spry:
    http://labs.adobe.com/technologies/spry/home.html
    Here is a link to Adobe's Spry forums:
    http://www.adobe.com/cfusion/webforums/forum/categories.cfm?forumid=72&catid=602
    If you are using CF8 you can use the SerializeJSON function to convert a variable to JSON. You might also be interested in the cfsprydataset tag. CF 8 documentation:
    http://livedocs.adobe.com/coldfusion/8/htmldocs/
    If you are using a previous version of CF there is third-party JSON support. You can find links at http://json.org.

  • Best Practice for Flat File Data Uploaded by Users

    Hi,
    I have the following scenario:
    1.     Users would like to upload data from a flat file and subsequently view their reports.
    2.     The SAP BW support team would not be involved in the data upload process.
    3.     Users would not go to RSA1 and use InfoPackages & DTPs. Hence, another mechanism for data upload is required.
    4.     Users consist of two groups, external and internal users. External users would not have access to the SAP system. However, access via a portal is acceptable.
    What is the best practice we should adopt for this scenario?
    Thanks!

    Hi,
    I can share what we do in our project.
    We receive the files from the web onto the application server, in a path dedicated to this process. The file placed on the server follows a naming convention based on your project; you can name it. Every day a file with the same name is placed on the server with different data. The path in the InfoPackage is fixed to that location on the server. After this, the process chain triggers and loads the data from that particular path on the application server. After the load completes, a copy of the file is taken as a backup and the file is deleted from that path.
    So this happens everyday.
    Rgds
    SVU123

  • Best practice for exposing internal data to external world?

    Currently we have our Internet server sitting in our corporate DMZ, taking website and web service requests from the outside world. Class libraries with compiled connection strings exist on that server. That server then has a connection through the firewall to the database server. I'm told that this is no longer the secure/recommended best practice.
    I'm told to consider having that Internet server make requests not of the database server, but rather of a layer in between (application server, intranet server, whatever) that has those same Web UI methods exposed, and then THAT server (being inside the firewall) connects to the database server.
    Is this the current recommended best practice for having external users interact with internal data? It seems like a lot of hoops: the outside person's app queries Web UI methods on the Internet server, which in turn queries the same method (duplicated) on the intranet server, which then talks to the database.
    I'm just trying to determine the simplest practice, but also what is appropriately secure for our ASP.NET applications and services.
    Thanks.

    IMO this has little to do with SOA and everything to do with DMZs. What you're trying to stop is the same comm protocol accessing the database as accessed the web site. As long as you fulfil that, then great. WCF can help here because it helps with configuring the transport of calls. Another mechanism is to use identities, but IMO it's easier to use firewalls and transports.
    http://pauliom.wordpress.com

  • Best Practices for Loading Master Data via a Process Chain

    Currently, we load attributes, text, and hierarchies before loading the transactional data. We have one meta chain. Loading the master data takes more than 2 hours. Most of the master data is full loads. We've noticed that a lot of the master data, especially text, has not changed or has changed very little since we implemented 18 months ago. Is there a precedent or best practice to follow, such as removing these processes from the chain? If so, how often should it be run? We would really like to reduce the master data loading time. Is there any documentation that I can refer to? What are other organizations doing to reduce the time it takes to load master data?
    Thanks!
    Debby

    Hi Debby,
    I assume you're loading master data from a BI system? The forums here are related to SAP NetWeaver MDM, so maybe you should ask this question in a BI forum?
    Nevertheless, if your data doesn't change that much, maybe you could use a delta mechanism for extraction? This would send only the changed records and not all the unchanged records every time. But this depends on your master data and of course on your extractors.
    Cheers
    Michael

  • Best practice for migrating eLearning data from Sol Mgr 3 to 4?

    Greetings,
    What is the recommended method for moving eLearning data when migrating from Solution Manager 3 to version 4?
    Thanks in advance,
         Ken Henderson

    948115 wrote:
    Dear All,
    This is Priya.
    We are using ODI 11.1.1.6 version.
    In my ODI project, we have separate installations for Dev, Test and Prod. i.e. Master repositories are not common between all the three. Now my code is ready in dev. Test environment is just installed with ODI and Master and Work repositories are created. Thats it
    Now, I need to know and understand what is the simple & best way to import the code from Dev and migrate it to test environment. Can some one brief the same as a step by step procedure in 5-6 lines? If this is the 1st time you are moving to QA, better export/import complete work repositories. If it is not the 1st time then create scenario of specific packages and export/import them to QA. In case of scenario you need not to bother about model/datastores. keep in mind that the logical schema name should be same in QA as used in your DEV.
    Some questions on current state.
    1. Do the IDs of the master and work repositories in Dev and Test need to be the same?
    They should be different.
    2. I usually see in the export file a repository ID of 999 and fail to understand what it is exactly. None of my master or work repositories is named with that ID.
    It is required to ensure object uniqueness across several work repositories. For more understanding you can refer to
    http://docs.oracle.com/cd/E14571_01/integrate.1111/e12643/export_import.htm
    http://odiexperts.com/odi-internal-id/
    3. Logical architecture objects and contexts do not have an export option. What is a suitable alternative for this?
    If you are exporting the topology then you will get the logical connection and context details. If you are not exporting the topology then you need to manually create the context and the other physical and logical connections.
    Thanks,
    Priya

  • Best Practice for attaching Meta Data, Organizing & Archiving Photos

    I am about to start the daunting task of organizing and archiving ALL of my photos so that I can access them from an external hard drive (or two) when I need them. I want to know/understand the best way to do this before I start, so that thousands of hours later I don't realize that I have not considered something critical. I have already spent hours trying to find this information online without satisfactory results. I have both iPhoto ('09 v8.1.2) and Aperture 3, which I am just learning to use. I want to back up to my WD My Book. I have already transferred some pictures and realized that, though I had them nicely organized in Albums in iPhoto, that information doesn't transfer over to the My Book, so the file names are unrecognizable as any type of descriptor. I believe that I need to assign some Meta Data (?) to each photo so that I can do searches to find particular people or subjects when needed.
    Here are all of the things I want/need to know:
    1. Which Mac program would be best to accomplish attaching MetaData or adjusting file names: iPhoto or Aperture 3?
    2. What is the best way to attach MetaData to pictures so that the information is permanent? Tagging (allowing multiple search options)? Batch Name Change (allowing limited search capabilities)? Other?
    a. If I TAG all of my photos and then try to use that information outside of iPhoto or Aperture 3 (from the external HD) will it still be accessible for searches?
    3. After attaching all of this information, what do I need to know about transferring the photos to an external HD? Are there multiple copies of the same photo (original, edited) and if so, are both transferred? Are they both deleted when I move the pictures to the trash?
    4. Is there anything else that I need to know??
    Your expertise would be greatly appreciated.
    Thank you.

    You are trying to defeat the photo management of iPhoto - my suggestion is to learn iPhoto and use it, or choose another program - iPhoto does not work well unless it manages the entire library and you access it using the tools built into the OS - click here for a discussion of the many correct and safe ways to access your iPhoto library photos.
    Aperture may or may not be a better choice - try the Aperture forum for detailed info
    Backing up iPhoto is trivial - simply drag the iPhoto library intact as a single entity to another hard drive - or use an incremental backup program to do it as changes are made - I use Time Machine to one backup drive and SuperDuper to another
    I believe that I need to assign some Meta Data (?) to each photo so that I can do searches to find particular people or subjects when needed.
    This is an integral part of iPhoto and one of its strengths
    1. Which Mac program would be best to accomplish attaching MetaData or adjusting file names: iPhoto or Aperture 3?
    iPhoto does a great job of this - file names are only adjusted when you export - and there is no need for them to be adjusted within the iPhoto library since you never should be going into it directly and never need to see the files directly
    2. What is the best way to attach MetaData to pictures so that the information is permanent?
    Exporting from iPhoto
    Tagging (allowing multiple search options)?
    Using iPhoto and the OS tools
    Batch Name Change (allowing limited search capabilities)?
    Done during export if desired
    Other?
    Depends
    a. If I TAG all of my photos and then try to use that information outside of iPhoto or Aperture 3 (from the external HD) will it still be accessible for searches?
    In iPhoto yes so long as you are not going directly into the iPhoto library but are using the provided tools
    3. After attaching all of this information, what do I need to know about transferring the photos to an external HD?
    Simply drag the iPhoto library intact as a single entity to the EHD - the database is a single entity and it will be the same there (so long as the EHD is formatted Mac OS Extended (Journaled)).
    Are there multiple copies of the same photo (original, edited) and if so, are both transferred?
    Yes - Yes
    Are they both deleted when I move the pictures to the trash?
    You do not move photos to the trash - you select them in iPhoto and press the delete key then empty the iPhoto trash (iPhoto menu ==> empty iPhoto trash)
    4. Is there anything else that I need to know??
    Lots - if you are going to use iPhoto, stop thinking of managing files and start thinking of using a SQL database, which is what iPhoto is - otherwise you need to use a different program
    LN

  • Best Practice for Storing Spatial Data in Multiple Projections?

    From what I've been able to determine, MS SQL can't do reprojections; is that correct? Assuming it is: the state of Washington requires that our GIS data be managed in Washington State Plane South, but our web mapping sites need the data in Web Mercator. Is it OK then to have two geometry columns in a spatially enabled table, say a county table, one in WSPS and the other in WM?
    I'm coming at this from a 30-year background in GIS using ESRI software. Usually we would store the shape/geometry in one projection and project it to the other on the fly when needed. That way there is only one definitive source. I don't see a way to do this in MS SQL.

    Hi Scott.
    Storing two columns of spatial data is fine in SQL Server. And you are correct that there is no built-in reprojection in SQL Server; most folks do that as part of data loading. There was talk of porting parts of ogr2ogr to SQLCLR, but I don't think anyone did that.
    Cheers, Bob

  • Oracle's Best practice for avoiding concurrency data problems

    What is Oracle's recommendation for avoiding concurrency problems in the database? Is the timestamp data type good enough to avoid having a record updated twice?
    Any feedback is welcome.

    It means you need to lock the records yourself.
    Three months ago I tried to simulate Oracle Developer 2000 Forms 6i, and I found that it uses "select .. for update nowait where .." to lock the record in the Text_Change event. I did the same, and it seems to be working OK now in my VB.NET program.
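
    For reference, a minimal, untested JDBC sketch of the same idea (the ORDERS table, its columns, and the helper class are made-up names; ORA-00054, error code 54, is what Oracle raises when the row is already locked and NOWAIT was requested):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class PessimisticLockExample {

        public boolean lockOrderRow(Connection conn, long orderId) throws SQLException {
            // Auto-commit must be off, otherwise the lock is released immediately.
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT status FROM orders WHERE order_id = ? FOR UPDATE NOWAIT")) {
                ps.setLong(1, orderId);
                try (ResultSet rs = ps.executeQuery()) {
                    // If we get here, the row is locked by this session until
                    // commit or rollback.
                    return rs.next();
                }
            } catch (SQLException e) {
                if (e.getErrorCode() == 54) {
                    // Someone else holds the lock; report "record in use"
                    // instead of waiting.
                    return false;
                }
                throw e;
            }
        }
    }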
    Jimy Ho
    [email protected]

  • Best practice for dealing with Recordsets

    Hi all,
    I'm wondering what the best practice is for dealing with data retrieved via JDBC as ResultSets, without involving third-party products such as Hibernate. I've been told NOT to use ResultSets throughout my applications since they hold resources and are expensive. I'm wondering which collection type is best to convert ResultSets into. The apps I'm building are web-based, using JSPs as the presentation layer, plus beans and servlets.
    Many thanks
    Erik

    There is no requirement that DAOs have a direct mapping to database tables. One of the advantages of the DAO pattern is that the business layer isn't directly aware of the persistence layer. If the joined data is used in the business code as if it were an unnormalized table, then you might want to provide a DAO for the joined data. If the joined data provides a subsidiary object within some particular object, you might add the access method to the DAO for the outer object.
    eg:
    In a user permissioning system where:
    1 user has many userRoles
    1 role has many userRoles
    1 role has many rolePermissions
    1 permission has many rolePermissions
    ie. there is a many to many relationship between users and roles, and between roles and permissions.
    The administrator needs to be able to add and delete permissions for roles and roles for users, so the CRUD for the rolePermissions table is probably most useful in the RoleDAO, and the CRUD for the userRoles table in the UserDAO. DAOs can also call each other.
    During operation the system needs to be able to get all permissions for a user at login, so the UserDAO should provide a readPermissions method that does a rather complex join across the user, userRole, rolePermission and permission tables.
    Note that if the system I just described were done with LDAP, a hierarchical database or an object database, the userRoles and rolePermissions tables wouldn't even exist; these are RDBMS artifacts, since relational databases don't understand many-to-many relationships. This is a good reason to avoid providing DAOs that give access to those tables.
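
    As a rough sketch of what such a readPermissions method could look like in plain JDBC (the table, column, and class names are hypothetical), the main point being that the rows are copied into an ordinary List so the ResultSet can be closed before the data reaches the JSP layer:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;

    public class UserDAO {

        private final Connection conn;

        public UserDAO(Connection conn) {
            this.conn = conn;
        }

        public List<String> readPermissions(long userId) throws SQLException {
            String sql =
                "SELECT DISTINCT p.name " +
                "FROM users u " +
                "JOIN user_roles ur ON ur.user_id = u.id " +
                "JOIN role_permissions rp ON rp.role_id = ur.role_id " +
                "JOIN permissions p ON p.id = rp.permission_id " +
                "WHERE u.id = ?";
            List<String> permissions = new ArrayList<String>();
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setLong(1, userId);
                try (ResultSet rs = ps.executeQuery()) {
                    // Detach the data from the ResultSet so the statement and
                    // cursor can be closed before the list is handed onward.
                    while (rs.next()) {
                        permissions.add(rs.getString("name"));
                    }
                }
            }
            return permissions;
        }
    }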

  • Best practice for external but secure access to internal data?

    We need external customers/vendors/partners to access some of our company data (view/add/edit). It's not so easy to segment out those databases/tables/records from the other existing ones (and put separate database(s) in the DMZ where our server is). Our current solution is to have a port 1433 hole from the web server into our database server. The user credentials are not in any sort of web.config but rather compiled into our DLLs, and that SQL login has read/write access to a very limited number of databases.
    Our security group says this is still not secure, but how else are we to do it? Even with a web service, there still has to be a hole somewhere. Is there any standard best practice for this?
    Thanks.

    Security is mainly about mitigation rather than being 100% secure; "we have unknown unknowns". The component needs to talk to SQL Server. You could continue to use HTTP to talk to SQL Server, perhaps even get SOAP transactions working, but personally I'd have more worries about using such a 'less trodden' path, since that is exactly the area where more security problems are discovered. I don't know about your specific design issues, so there might be even more ways to mitigate the risk, but in general you're using a DMZ as a decent way to mitigate risk. I would recommend asking your security team what they'd deem acceptable.
    http://pauliom.wordpress.com
