iOS Timer Problem - Best practice?

Hey there,
I'm using a global "idle timer", and for some reason, after the app is started, the first time the timer event fires it fires dozens of times.
After one or two regular events everything returns to normal. Any ideas or tips?
Using AIR SDK 2.7
Best,
Cedric

Hello,
I wrote this workaround for Timer and setTimeout:
https://github.com/jonasmonnier/Mobilo/tree/master/src/com/mobilo/time
A Timer:
var id:int = Interval.create(method, delay, repeatCount);
A Timeout:
var id:int = Timeout.create(method, delay, params);
Examples here:
https://github.com/jonasmonnier/Mobilo/tree/master/src/test
They are all based on my Tick class, which drives everything from a single ENTER_FRAME.
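For readers who just want the gist, here is a minimal TypeScript sketch of the single-heartbeat idea (names and details are illustrative, not Mobilo's actual code): one shared clock drives every registered callback, and each firing is rescheduled relative to the current time, so a long stall produces one late callback instead of the burst of catch-up events that Timer can deliver after startup.

type Entry = {
  callback: () => void;
  delay: number;       // ms between firings
  repeatCount: number; // 0 = repeat forever
  next: number;        // timestamp of the next firing
  fired: number;       // firings so far
};

class Tick {
  private static entries = new Map<number, Entry>();
  private static nextId = 1;
  private static started = false;

  static create(callback: () => void, delay: number, repeatCount = 0): number {
    const id = Tick.nextId++;
    Tick.entries.set(id, { callback, delay, repeatCount, next: Date.now() + delay, fired: 0 });
    if (!Tick.started) {
      Tick.started = true;
      setInterval(Tick.step, 16); // the single ENTER_FRAME-style heartbeat
    }
    return id;
  }

  static remove(id: number): void {
    Tick.entries.delete(id);
  }

  private static step = (): void => {
    const now = Date.now();
    for (const [id, e] of Tick.entries) {
      if (now < e.next) continue;
      e.callback();
      e.fired++;
      // Reschedule relative to *now*: after a stall this fires once, late,
      // rather than firing dozens of times to catch up.
      e.next = now + e.delay;
      if (e.repeatCount > 0 && e.fired >= e.repeatCount) {
        Tick.entries.delete(id);
      }
    }
  };
}

// Usage, mirroring the Interval example above:
const id = Tick.create(() => console.log("tick"), 1000, 5);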

Similar Messages

  • Subclass design problems/best practices

    Hello gurus -
    I have a problem regarding the domain objects I'm sticking in my cache. I have a Product object and would like to create a few subclasses, say BookProduct and MovieProduct (along with the standard Product objects). These really need to be contained in the same cache. The issue/concern here is that both subclasses have attributes that I'd like to index AND query on.
    When I try to create an index on a subclass attribute while there are plain "standard" Products in the cache, I get the following error (the method only exists on one of the subclasses):
    2010-10-20 11:08:43.280/227.055 Oracle Coherence GE 3.5.2/463 <Error> (thread=DistributedCache:MyCache, member=2): Exception occured during index rebuild: java.lang.RuntimeException: Missing or inaccessible method: com.test.domain.product.Product.getAuthors()
    So I'm not sure whether the indexing keeps working or stops once it hits this exception.
    Furthermore, I get a similar error when attempting to filter based on that attribute. So if I add the following filter:
    Filter filter = new ContainsAnyFilter( "getAuthors", authors );
    I will receive the following exception:
    Caused by: Portable(java.lang.RuntimeException): Missing or inaccessible method: com.test.domain.product.Product.getAuthors()
    What is considered best practice for this, assuming these really should be part of the same named cache? Should I subclass the extractors to inspect the object for its class type during indexing or when applying filters? Or should I just add all the attributes of BookProduct and MovieProduct to the parent object and forget about subclassing? That has a pretty high "yuck" factor to me. I'm assuming people have run into this issue before, and I'm looking for some best practices or perhaps something that deals with this that I'm missing. We're currently using Coherence 3.5.2. Not sure if it matters, but we are using the POF format for serialization.
    Thanks!
    Chris

    Hi Chris,
    I had a similar problem. The way I solved it was to use a subclass of ChainedExtractor that catches all RuntimeExceptions occurring during extraction, like the following:

    import java.util.Map;

    import com.tangosol.util.extractor.ChainedExtractor;

    /**
     * {@link ChainedExtractor} that catches any exceptions during extraction and returns null instead.
     * Use this for cases where you're not certain that an object contains the necessary methods to be extracted.
     * E.g. an object in the cache does not contain the getSomeProperty() method, but other objects do.
     * When these are put together in the same cache, we might want to use a {@link ChainedExtractor} like
     * new ChainedExtractor("getSomeProperty.getSomeNestedProperty"). However, this will result in a
     * RuntimeException for those entries that don't contain someProperty. Using this class instead
     * won't result in the exception.
     */
    public class SafeChainedExtractor extends ChainedExtractor {

        public SafeChainedExtractor() {
            super();
        }

        public SafeChainedExtractor(String sMethod) {
            super(sMethod);
        }

        // Return null instead of propagating "Missing or inaccessible method" errors.
        @Override
        public Object extract(Object target) {
            try {
                return super.extract(target);
            } catch (RuntimeException e) {
                return null;
            }
        }

        @Override
        public Object extractFromEntry(Map.Entry entry) {
            try {
                return super.extractFromEntry(entry);
            } catch (RuntimeException e) {
                return null;
            }
        }
    }

    For all indexes and filters we then use extractors that subclass SafeChainedExtractor, like the following:

    public class NestedPropertyExtractor extends SafeChainedExtractor {

        private static final long serialVersionUID = 1L;

        public NestedPropertyExtractor() {
            super("getSomeProperty.getSomeNestedProperty");
        }
    }

    // adding an index:
    myCache.addIndex(new NestedPropertyExtractor(), false, null);

    // using a filter:
    myCache.keySet(new EqualsFilter(new NestedPropertyExtractor(), "myNestedProperty"));

    This way, the extractor simply returns null when a property doesn't exist on the target class.
    Regards
    Jan

  • Real time logging: best practices and questions?

    I have 4 pairs of DS 5.2p6 servers in MMR mode on Windows 2003.
    Each server is configured with the default setting of "nsslapd-accesslog-logbuffering" enabled, and the log files are stored on a local file system, then later centrally archived thanks to a log sender daemon.
    I now have a requirement from a monitoring tool (used to establish correlations/links/events between applications) to provide the directory server access logs in real time.
    At first glance, each directory server generates about 1.1 MB of access log per second.
    1)
    I'd like to know if there are any known best practices or experiences in such a case.
    2)
    Also, should I upgrade my DS servers to benefit from any log-management-related features? Should I think about using an external disk subsystem (SAN, NAS, ...)?
    3)
    In DS 5.2, what's the default access log buffering policy: is there a maximum buffer size and/or a time limit before flushing to disk? Is it configurable?

    Usually log buffering should be enabled. I don't know of any customers who turn it off; even if you do, it should be after careful evaluation in your environment. AFAIK, there is no configurable limit on buffer size or time limit before it is committed to disk.
    Regarding faster disks, I had the bright idea that you could create a ramdisk and set the logs to go there instead of disk. Let's say the ramdisk is 2 GB max in size and you receive about 1 MB/sec in writes, and say max-log-size is 30 MB. You can schedule a job to run every minute that copies the newly rotated file(s) from the ramdisk to your filesystem and then sends them over to logs HQ. If the server crashes, you'll lose up to a minute of logs. Of course, the data disappears after reboot, so you'll need to manage that as well. Sounds like fun to try but may not be practical.
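    To make the sweep concrete, here is a small Node.js/TypeScript sketch of that per-minute job (the paths and the rotated-file naming are assumptions for illustration, not from the DS docs):

    import { readdirSync, copyFileSync, unlinkSync } from "node:fs";
    import { join } from "node:path";

    const RAMDISK_DIR = "/ramdisk/ds-logs";    // assumed: where the access logs are written
    const ARCHIVE_DIR = "/var/log/ds-archive"; // assumed: destination on a real disk

    // Copy-then-delete rather than rename: rename(2) fails across filesystems,
    // and the ramdisk and the archive are by definition on different ones.
    for (const name of readdirSync(RAMDISK_DIR)) {
      // Only sweep rotated files (illustrative naming, e.g. "access.20090723-110843");
      // leave the live "access" file alone.
      if (!/^access\.\d{8}-\d{6}$/.test(name)) continue;
      copyFileSync(join(RAMDISK_DIR, name), join(ARCHIVE_DIR, name));
      unlinkSync(join(RAMDISK_DIR, name));
    }

    Run it from cron every minute; at most about a minute of logs is in flight if the host dies.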
    Ramdisk on windows
    [http://msdn.microsoft.com/en-us/library/dd163312.aspx]
    Ramdisk on solaris
    [http://wikis.sun.com/display/BigAdmin/Talking+about+RAM+disks+in+the+Solaris+OS]
    [http://docs.sun.com/app/docs/doc/816-5166/ramdiskadm-1m?a=view]
    I should ask, how realtime should this log correlation be?
    Edited by: etst123 on Jul 23, 2009 1:04 PM

  • Time Machine best practices after Lion to Mountain Lion upgrade

    I've made the upgrade from Lion to Mountain Lion and everything seems to be OK.
    I have been using Time Machine for backups since I deployed my first and, so far, only Mac (Mac Mini running Lion) in 2011.  I run my TM backups manually.  Since upgrading to Mountain Lion, I have not yet kicked off a TM backup, so my questions involve best practices with TM after an upgrade from Lion to Mountain Lion:
    Can I simply use the same drive I use currently, do what I've always done, start the backup manually, and will TM gracefully handle the new backup from the new OS?
    At this point, since I have only backups of the Lion system, what I see when I double-click the Time Machine drive is a folder called “Backups.backupdb”, then a subfolder called “My Mac mini”, and then all the backup events. Nothing else. What will I see once I do a backup now, after the Mountain Lion upgrade?
    If I for some reason needed to boot to my old Lion system (I cloned the startup disk prior to upgrading to ML) and access my old Lion backups with TM, would I be successful?  In other words does the system know that I'm booted to Lion, so give me access to the TM backups created under Lion?   Conversely when booted to the new Mountain Lion system, will I have access only to the backups created since the upgrade to Mountain Lion?
    Any other best practices steps I should take prior to my first ML backup?
    Time Machine is a great, straightforward system to use (although I have to say I've not (yet) needed to depend on it for recovery... I trust that will go well when needed), but I don't want to make any assumptions about how it works after a major OS upgrade.
    Thank you for reading.

    1. Correct. If you want to downgrade to OS X Lion, your Mac will still keep the backups created with OS X Lion; just start into Internet Recovery and select one of the backups made with OS X Lion. If you don't want Time Machine to back up automatically, you may need to use TimeMachineEditor.
    2. After making a backup with Mountain Lion, it will be the same, but with a new folder that belongs to the new backup you have created.
    3. See my first answer.
    4. One piece of advice: when your Time Machine drive gets full, Time Machine deletes old backups, so it may remove all your OS X Lion backups. However, I don't think you will need to go back to OS X Lion.
    If you have any questions apart from those, see Pondini's website > http://pondini.org

  • iOS device playlists - best practices

    So here's my conundrum. I have many playlists in iTunes, for listening at home. I want to sync the "best of" each playlist into my iPhone, but still have the songs organized in the same playlist names. For example, if I'm out and about and want to listen to a shuffled playlist of say, rock or mellow or whatever. I have yet to find a satisfactory solution.
    I read about the checkbox field, which will only sync checked items. That sounded like just what I needed; however, it has the unfortunate side effect of excluding all unchecked items even from at-home listening, unless I check them all again. But if I check them all again, that essentially ruins my iPhone library the next time I sync.
    I could just do a smart playlist to sync anything I rate say, 4 or 5 stars...but then the songs don't show up in the right playlists, they all show up in the one playlist.
    My current inelegant workaround is basically to have two sets of playlists in iTunes: the original playlists for at home listening which has everything, and pared down versions of all those same playlists for the iPhone (so for example I have Classical, and iPhone Classical, Rock and iPhone Rock, etc). This clutters up my playlist pane quite a bit, and it just seems suboptimal.
    Am I missing a feature that might already exist in iTunes that would serve my purpose better? I know about iTunes Match, but isn't quite what I had in mind (plus that's more money). Any other ideas?
    Thanks!

    nabziF wrote:
    My current inelegant workaround is basically to have two sets of playlists in iTunes: the original playlists for at home listening which has everything, and pared down versions of all those same playlists for the iPhone (so for example I have Classical, and iPhone Classical, Rock and iPhone Rock, etc). This clutters up my playlist pane quite a bit, and it just seems suboptimal.
    You can create Playlist Folders for playlists.
    Create one for iPhone and one for Home.
    To declutter, click the little gray triangle to hide the playlists in iTunes.

  • Server backups, Time Machine, best practices

    Hey guys, I've got a Mac Mini running Lion Server that is providing VPN, Profile Manager, and some file sharing to Macs and i-devices in my household. I also am using portable home folders for the kids' accounts (and soon I may be doing the same for my account and the wife's account). Connected to my Mini is a Drobo.
    Currently I'm using Crashplan to backup data on all of our client machines.  (an iMac, and a couple of macbooks)  However, I would like to add Time Machine to the mix.  Here are my thoughts:
    client machines:
    - use TM to backup to the mac mini server.  (backups stored on drobo)
    - EXCLUDE the 'local' (sync'd) home folders for network users, since their home folders are actually stored on the server. 
    server:
    - use TM locally on the server to backup itself to a separate external disk.
    - this backup should INCLUDE the users home folders since they're not getting backed-up on the client side. 
    - optionally install crashplan on the server and use it to backup users' folders as well??
    The whole goal here is convenience. I trust Crashplan to back up my important stuff offsite (pictures, videos, etc.), but in the event of a disk failure I want an easy, no-hassle way to fully restore a machine - either client or server - back to its original state before the disk failure. The only thing that has me scratching my head a bit is the user folder stuff. If everyone had local accounts it would be easy - just use TM to back up everything. But since the home folders (for the network users) are actually stored on the server, with a synchronized version on the client, it complicates things a bit.
    Love to hear anyone's feedback on how to proceed.  Thanks!

    It certainly sounds like a solid plan. You're correct that it doesn't make sense to back up the home folders for network users, since those are on the server. In fact, you might be able to get by without TM by considering the local Macs themselves the backups of the home folders; the odds of losing both are highly unlikely, except through some sort of proximity event like a burglary or house fire (knock on wood). In that way, I would actually say cloud storage would be better, but again, I would be satisfied with just the "backups" being the local machines themselves, and you're going the extra mile by backing up said home folders via TM (I'm assuming via a Time Capsule or external hard drive?).
    Since the goal is convenience, you may also consider adding a second drive to your Mac Mini, assuming it doesn't already have one.  Currently, I'm using RAID1 (mirroring) with two hard drives in my Mac Mini, so should a hard drive fail, it will automatically boot from the second hard drive and let you know the first drive failed.  It's certainly cheaper than a Time Capsule, especially if you have the tech prowess to take apart your Mac Mini and install a second hard drive.  I purchased iFixit's kit for adding a second hard drive, and while the $50 cost wasn't ideal, it beat several hundred dollars for a Time Capsule, especially since I don't need Terabytes of storage and I happened to have a spare 500GB HDD laying around... That could be you, too!
    As for the Home folders, again I would go the simple route and just use a RAID1 array in the Mac Mini, that way there's no need to backup the Home folders, they're already automatically backed up to the second hard drive with RAID1.
    As you can tell, I like RAID, but also, if you go the RAID route, they say HDD read speeds increase (there's some conflict on the Internet it seems, but I believe it... I see around a 20MB/s speed boost, not a huge deal, but it's something).

  • Flash Alternative Content ~ Best Practice ~ Safari on iOS Devices ~ iPad iPhone iPod Touch

    Hi Folks,
    Is there a documented best practice for providing alternate content for Flash in Safari on iOS devices?
    I am getting white space where my Flash animation would normally appear, and management is displeased. I need to display alternative content in this space. I'm hopeful that Adobe has published a best practice for how to accomplish this.
    tyvm
    Keith

    I'm not sure about Adobe's official stance, but they have started using swfobject Flash detection to place Flash <object>s on the web page. The problem I see with the Adobe implementation is that about the only alternate content they recommend is "Click here to download the latest version of Flash player".
    That is about the lamest alternate content you could possibly imagine! Just because you recommend that your viewers download the new version... doesn't mean they will... so they still go without REAL alternate content.
    A MUCH BETTER use of swfobject is to actually provide alternate content! For your review:
    If you think that Flash is somehow bad for SEO, it's time to dispel that MYTH!
    In fact, in some circumstances I'll use Flash INSTEAD of just HTML, because then I'll have better SEO than with HTML alone.
    http://www.worldbarefootcenter.com/
    The link to World Barefoot Center in the above post is just one example. View the source code and you'll see a couple of paragraphs of text along with regular HTML links... but what displays is the Flash version of the image and Flash links.
    The client provided the artwork for the page... and that's what they wanted to use: a .jpg image. Yes, that could be done in HTML, but it would be virtually invisible to Google. So instead I converted the image into a Flash .swf and used swfobject to display it. swfobject allows you to create alternate content inside the <div> which also holds the Flash .swf; when the page loads, it detects whether the browser has the Flash plugin. If it does, it displays just the Flash content; if not, it displays the alternate content. Since almost everyone has the Flash plugin, most people see the Flash version of the <div>.
    The alternate content for that <div> can be any regular HTML text, images, media player, links, etc. - anything you would use if you were not using Flash. Now the best part: the alternate content can be "over the top" as far as SEO optimization goes, since it will not be seen by most viewers.
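    For concreteness, here is roughly what that pattern looks like in code. The embedSWF call is swfobject's real 2.x API; the file name, element id, and sizes are made up for illustration, and the ambient declaration just keeps the sketch self-contained TypeScript:

    // Markup: <div id="banner"> ...search-engine-visible HTML fallback... </div>
    // If Flash 9+ is present, embedSWF swaps the div's contents for the movie;
    // otherwise the alternate HTML content stays in place (and stays indexable).

    declare const swfobject: {
      embedSWF(
        swfUrl: string, replaceElemId: string,
        width: string, height: string, swfVersion: string,
        expressInstallSwfUrl?: string,
        flashvars?: Record<string, string>,
        params?: Record<string, string>,
        attributes?: Record<string, string>
      ): void;
    };

    swfobject.embedSWF("banner.swf", "banner", "800", "160", "9.0.0");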
    Here's another example of SEO with Flash.. again, the page is just a single image provided by the client:
    http://www.ksowetsuits.com/
    View the source code. The alt content is paragraph after paragraph of information about the site, including lists and links. If it were just the HTML, it might be kind of a boring home page. But for SEO I can go "over the top" in promoting the site, since most viewers will never see that part... yet it's all indexed by search engines. The end result is BETTER SEO using Flash than just HTML.
    On another web site, a Flash video is displayed, and the alt content is the complete text narration of the video. How many people would take the time to read that if they could just watch the video instead? Again, better SEO with Flash than without. In fact, in one case we had a first-page search result from that video narration within 4 hours of posting the page.
    On still another site with a Flash video, the alt content is another video, but a .mov version, which will in effect play "Flash video" on the iPhone (not possible, you say?). Since the iPhone does not have the Flash plugin, it simply displays the .mov version of the video, while everyone else sees the Flash version.
    So anyway, if Flash is a part of your Web development, you should look into using swfobject and alternate content.
    http://code.google.com/p/swfobject/
    Best wishes,
    Eye for Video
    www.cidigitalmedia.com
    So it is, and has been for a number of years now, very easy to provide alternate content for non-Flash devices... and that includes text, images, and video.
    Best wishes,
    Adninjastrator

  • Best Practices For Household iOS Devices/Apple IDs

    Greetings:
    I've been searching support for best practices for sharing (primarily) apps, music, and video among multiple iOS devices/Apple IDs. If there is a specific article, please point me to it.
    Here is my situation: 
    We currently have 3 iPads (2 kids, 1 dad) in the household and one iTunes account on a Windows computer. I previously had all iPads on a single Apple ID/credit card and controlled the kids' downloads through the Apple ID password, which I kept secret. As the kids have grown older, I found myself constantly entering my password as their interest in music/apps/video increased. I like this approach because all content was shared... I dislike it because I was constantly asked to input the password for all downloads.
    So, I recently set up individual accounts for them with the iTunes allowance feature, which lets them download content on their own (I set restrictions on their iPads). Now I have 3 Apple IDs under one household.
    My questions:
    With the 3 Apple IDs, what is the best way to share apps, music, and videos among myself and the kids? Is it multiple accounts on the computer and some sort of sharing?
    Thanks in advance...


  • IOS Software Upg(s) - Best Practices?

    Hello, I have several 2960 switches that are all running 12.2(35)SE5.
    I know that 12.2(50)SE1 is out there, and I am curious when it is best to upgrade the software version.
    If my switch is still PRE-production, am I better off always upgrading to the current software on a switch that has not yet been deployed?
    I saw in the release notes for (50) that there were some VLAN authentication updates, and we are in the midst of designing a VLAN deployment using some L3 switches.
    Without KNOWING whether I need a specific feature update, are there any best practices that may help guide me?
    As mentioned, because this switch is not yet in production, I'm thinking the time is right if I'm going to upgrade the IOS.

    Best practice is NOT to jump onto new IOS code unless you need a feature provided by the new IOS, or your current IOS has a critical bug that is corrected in the new IOS.
    That said, it seems the new feature is worth having in your environment, so 50SE1 may fit the bill. It's your call if you want to be exposed to new bugs.
    HTH,
    Edison.

  • iOS Update Best Practices for Business Devices

    We're trying to figure out some best practices for doing iOS software updates to business devices. Our devices are scattered across 24 hospitals in two states, and going forward there might be hundreds of iOS devices at each facility. Apple has tools for doing this in a smaller setting with a limited network, but to my knowledge, nothing (yet) for a larger implementation. I know Configurator can be used to do iOS updates. I found this online:
    https://www.youtube.com/watch?v=6QPbZG3e-Uc
    I'm thinking the approach to take for the time being would be to have a mobile sync station set up with Configurator at each facility. The station would be moved throughout the facility to perform updates on the various devices. Thought I'd see if anyone has tried this approach, or has any other ideas for dealing with device software updates. Thanks in advance.

    Hi Bonesaw1962,
    We've had our staff and students run iOS updates OTA via Settings -> Software Update. In the past, we put a DNS block on Apple's update servers to prevent users from updating iOS (like last fall when iOS 7 was first released). By blocking mesu.apple.com, the iPads weren't able to check for or install any iOS software updates. We waited until iOS 7.0.3 was released before we removed the block to mesu.apple.com, at which point we told users they could update to iOS 7 OTA if they wanted. We used our MDM to run reports periodically to see how many people updated to iOS 7 and how many stayed on iOS 6. As time went on, just about everyone updated on their own.
    If you go this route (depending on the number of devices you have), you may want to take a look at Caching Server 2 to help with the network load https://www.apple.com/osx/server/features/#caching-server . From Apple's website, "When a user on your network downloads new software from Apple, a copy is automatically stored on your server. So the next time other users on your network update or download that same software, they actually access it from inside the network."
    I wish there was a way for MDMs to manage iOS updates, but unfortunately Apple hasn't made this feature available to MDM providers. I've given this feedback to our Apple SE, but haven't heard if it is being considered or not. Keeping fingers crossed.
    Hope this helps. Let us know what you decide on and keep us posted on the progress. Good luck!!
    ~Joe

  • Best Practice on Knowledge Management, IS01 Problems and Solutions

    Been playing with KM and looking for insight from other users who use it for the ICWC (will give points).
    We have multiple product lines, with Q&A documents for each line. As I look at moving that into CRM via IS01, I am looking for any best practices or recommendations.
    1. Create a single problem and solution for every question.
    2. Create a single problem (listing all questions) for every product line, and create multiple solutions linked to that problem (a solution for each question).
    3. Is LSMW a good tool for loading data in mass?
    The ICWC search brings back the 1st line of the problem & solution on the screen, so I try to limit the characters used so it fits on the ICWC screen; a long 1st line on the problem doesn't allow the agent to see enough of the solution without opening the link.
    Thanks,
    Edited by: Glenn Michaels on Jun 14, 2008 9:52 PM

    Hello Glenn,
    If it helps, here's a scenario about KB on my company system.
    Our call center supervisors are the people who create problems and solutions in our KB. They maintain it but don't use the IS01 transaction. They use instead the People-Centric BSPs for problems and solutions (integrated into the IC WebClient with the help of the transaction launcher).
    Normally, they prefer creating multiple solutions for one problem instead of the single problem - single solution method, because some questions may have multiple solutions. They could put all the solution text in one solution object, but for maintenance purposes we think it's better to create a separate solution object for every solution text: if one solution becomes obsolete, all we have to do is unlink it instead of changing the text.
    We don't use LSMW. I don't have much experience with LSMW, but if you use it, be careful to respect the KB interval numbers for problems and solutions. We implemented an initial set of problems and solutions in our Development system and passed it to the Quality and Productive systems, with the precious help of SAP note '728295 - Transport the SDB customization between two CRM systems' and '728668 - Transport the content of the SDB between two CRM systems'.
    One cool idea for using the KB is the auto-suggest feature. The idea is to integrate the links between problems and solutions with, for example, service ticket categorization, using the BSP category modeler; when an agent classifies a ticket, the suggested solutions/problems for the chosen classification appear at the top of the screen.
    I think that's all.
    Sorry for my poor English. Today I'm not feeling very inspired.
    Kind regards.
    Edited by: Bruno Garcia on Jun 17, 2008 9:51 PM (added note 728668)

  • Best practice for real-time requirement

    All,
    I am trying to find out the best practice for reporting against real-time data. Is it:
    1 - WebI against a universe/BEx query on top of hybrid cubes?
    2 - Crystal Reports directly against the ECC data?
    3 - another solution such as Data Federator, or something different?
    I am looking to know if anyone has had such a requirement, and I'd also like them to share their experience.
    Did they face any huge challenges with hybrid cubes?
    Thanks in advance for your help
    Philippe

    Well, their first requirement was to get real-time data: if I am in Xcelsius and click refresh, then I want it to load my latest data.
    With Live Office, I can either schedule a Crystal report and get the data delayed, or use the option from Live Office to make it refresh right now. Is that a correct assumption?
    I was talking about BW, just in case they are willing to change the requirement from real time to every 5 minutes.
    Just so you know, we are also thinking of the following option:
    1 - modify the virtual provider on the CRM machine to get all the custom fields needed for the Xcelsius dashboard
    2 - build some interactive reports on top of these virtual providers within CRM
    3 - get the link to this report (it is one of the report features within CRM)
    4 - design and build your dashboard on top of it
    5 - export your SWF file to the CRM Web UI
    We are trying to see which one is the best.
    Philippe

  • Best practice for using Time Capsule to back up 3 different products (MBP 15 OS X Lion, MBP 13 OS X Lion and MBA 13 OS X Mountain Lion)? Only the MBP 15 is backed up regularly.

    When I want to save data of the MBA 13 on Mountain Lion (wirelessly) with Time Capsule, is there any best practice to follow?
    After that, assuming the data are backed up, can we easily differentiate the data in the Time Capsule belonging to the MBP 15/13 and the MBA 13?

    Unfortunately, Apple left the Ethernet port....the most important port in networking....off the MBA, so your first backup of the entire Mac will need to be done over wireless.
    That may take a day or two unless your MBA has a Thunderbolt port on it in which case you could add a Thunderbolt to Ethernet adapter and connect the MBA to the Time Capsule for the first backup using an Ethernet cable.  It will probably only take 3-4 hours or less doing it this way.
    Once you have the first complete backup done, subsequent backups can be done over wireless, since they will only take a few minutes on average.
    Both Macs will backup to the Time Capsule using Time Machine automatically. Backups will be kept completely separate, so one Mac will normally only be able to "see" its own backups.

  • Current best practice for Time service settings for Hyper-V 2012 R2 Host and guest OS's

    I am trying to find out the current best practice for Time service settings in a Hyper-V 2012 environment. I find conflicting information; can anyone point me in the right direction? I have found some different sources (links below), but again, they are not consistent. Thanks
    http://blogs.msdn.com/b/virtual_pc_guy/archive/2010/11/19/time-synchronization-in-hyper-v.aspx
    http://technet.microsoft.com/en-us/library/virtual_active_directory_domain_controller_virtualization_hyperv(v=ws.10).aspx
    http://social.technet.microsoft.com/wiki/contents/articles/12709.time-services-for-a-domain-controller-on-hyper-v.aspx

    From the first link provided by Brian, it does state that the time service should be off, but then the update changes that statement. Still, it's best to rely on the first link in the OP - it was written by the guy who has been responsible for much of what gets coded into Hyper-V, starting from before there ever was a Hyper-V. I'd say that's a pretty reliable source.
    Time service
    For virtual machines that are configured as domain controllers, it is recommended that you disable time synchronization between the host system and guest operating system acting as a domain controller. This enables your guest domain controller to synchronize time from the domain hierarchy.
    To disable the Hyper-V time synchronization provider, shut down the VM and clear the Time synchronization check box under Integration Services.
    Note
    This guidance has been recently updated to reflect the current recommendation to synchronize time for the guest domain controller from only the domain hierarchy, rather than the previous recommendation to partially disable time synchronization between the host system and guest domain controller.
    . : | : . : | : . tim

  • SAP Best Practice: Problems with Loading of Transaction Data

    Hi!
    I am about to implement the SAP Best Practices scenario "B34: Accounts Receivable Analysis".
    To do so, I load the data from an SAP ERP IDES system into an SAP NetWeaver 2004s system.
    My problems are:
    When I try to load the transaction data for InfoSources 0FI_AR_4 and 0FI_AR_6, I get the following errors/warnings:
    When I start the "schedule process", the status stays yellow: "14:27:31 (194 from 0 records)".
    On the right side I see some actions that are also yellow, e.g. "DataStore Activation (Change Log): not yet activated".
    As a result I cannot see any data in tcode RSRT when executing the "0FIAR_C03/..." queries.
    The problems there:
    1) The input help of the query's web template doesn't contain any values.
    2) No data is shown.
    Can someone help me solve this problem?
    Thank you very much!
    Jürgen

    Go to the monitor window where you got the issue below:
    "When I start the 'schedule process' the status stays yellow: '14:27:31 (194 from 0 records)'"
    and go to Environment in the menu options -> Transact. RFC -> In the source system...
    Give the logon details, press Enter, and from there set the correct target destination as your BI server and execute.
    If you find some IDocs pending there, push them manually using F6.
    Then come back to your load and refresh.
    If it still doesn't turn green, you can manually change the status to red in the STATUS tab, then go to the Processing tab, expand your processing details, right-click on the data packet that was not yet updated, and select manual update.
    It will show a busy status; when it comes out of that, refresh once again.
    rgds,
