Optimize your data storage intelligently

In today's competitive world, one must work diligently to protect the data that is the lifeblood of any enterprise. The main challenge with storage these days is archiving data that is not useful at the moment but may be needed again at any time; keeping all of it accessible is what makes storage expensive.


Similar Messages

  • HT204264 "If you exceed your iCloud storage limit, photos and videos won't upload to iCloud, and your library will no longer stay up to date across your devices."

    "If you exceed your iCloud storage limit, photos and videos won't upload to iCloud, and your library will no longer stay up to date across your devices."  Really?  Can't we support the existing "last 1k photos will be available in the cloud" model?  If we don't do this, won't this kill auto-sync of new photos from iOS devices to our libraries on our Macs (if the Mac's Photos app has auto syncing with the cloud turned on) once we go over our 5gb limit?  I have 350GB of photos in my Mac Photos library - I am not going to pay $20/month for 1tb iCloud storage.  Please advise, Apple.  Thank you.

    I agree and am baffled by this also.  At first I was really excited about this new feature, mostly because I would be able to seamlessly sync videos along with my PhotoStream.  I just assumed there would be some systematic way to handle large libraries, like removing older files as the current PhotoStream does.  However, it appears that this new feature will simply 'shut down' when you reach your storage limit.  This just makes no sense to me.  I would have to double check, but I have all my photos and videos in iPhoto stored on a 3 TB external hard drive, and I think the full library is over 1 TB.  So even if I were willing to pay $20 a month for photo syncing (which I am not), I couldn't do it because my library is too large.  Again, I am just baffled why Apple cannot blend this new service with the existing PhotoStream model.  My favorite aspect of PhotoStream is that it uploads all my iPad and iPhone photos to my Mac for permanent storage, and it seems this is lost in the new version.

  • HT4847 How do you move data from your phone storage to iCloud?

    You can purchase extra storage from iCloud, so shouldn't you be able to store stuff there as opposed to using your phone's storage? How would I set this up?

    In theory, you should, but Apple doesn't allow it at the moment. You can send this idea to Apple > http://www.apple.com/feedback
    The iCloud storage idea is a bit confusing for users because it's not designed as general-purpose file storage. Instead, it holds files created with apps that are compatible with iCloud, for example the iWork apps. In that case you can store data in iCloud, but it doesn't let you move other files, like movies or photos, there.

  • Rejected...We found that your app does not follow the iOS Data Storage Guidelines,

    Where and how do I fix this?
    We found that your app does not follow the iOS Data Storage Guidelines, which is required per the App Store Review Guidelines.
    In particular, we found that on launch and/or content download, your app stores 3.54MB. To check how much data your app is storing:
    - Install and launch your app
    - Go to Settings > iCloud > Storage & Backup > Manage Storage
    - If necessary, tap "Show all apps"
    - Check your app's storage
    The iOS Data Storage Guidelines indicate that only content that the user creates using your app, e.g., documents, new files, edits, etc., should be backed up by iCloud.
    Temporary files used by your app should only be stored in the /tmp directory; please remember to delete the files stored in this location when the user exits the app.
    Data that can be recreated but must persist for proper functioning of your app - or because customers expect it to be available for offline use - should be marked with the "do not back up" attribute. For NSURL objects, add the NSURLIsExcludedFromBackupKey attribute to prevent the corresponding file from being backed up. For CFURLRef objects, use the corresponding kCFURLIsExcludedFromBackupKey attribute.
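    For CFURLRef-based code, setting that attribute is a single call. Here is a minimal sketch in C against the Core Foundation API the rejection names (the path and helper name are illustrative; assumes iOS 5.1 or later, and see Apple's QA1719 for the NSURL equivalent):

        #include <CoreFoundation/CoreFoundation.h>

        /* Mark a re-creatable file so iCloud/iTunes backup skips it (iOS 5.1+). */
        static Boolean ExcludeFromBackup(const char *path) {
            CFStringRef cfPath = CFStringCreateWithCString(kCFAllocatorDefault, path,
                                                           kCFStringEncodingUTF8);
            CFURLRef url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault, cfPath,
                                                         kCFURLPOSIXPathStyle,
                                                         false /* not a directory */);
            CFErrorRef err = NULL;
            Boolean ok = CFURLSetResourcePropertyForKey(url,
                                                        kCFURLIsExcludedFromBackupKey,
                                                        kCFBooleanTrue, &err);
            if (!ok && err) CFRelease(err);
            CFRelease(url);
            CFRelease(cfPath);
            return ok;
        }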

    You're submitting a Newsstand app, right? This error is still under investigation, but the current workaround is to create SD (1024x768/768x1024) cover images for your HD (2048x1536) folio renditions and try again.
    More info here:
    http://forums.adobe.com/message/4730637#4730637

  • Lock Up Your Data for Up to 90% Less Cost than On-Premises Solutions with NetApp AltaVault

    June 2015
    Data-Protection Services from NetApp and Services-Certified Partners
    Whether delivered by NetApp or by our professional and support services certified partners, these services help you achieve optimal data protection on-premises and in the hybrid cloud. We can help you address your IT challenges for protecting data with services to plan, build, and run NetApp solutions.
    Plan Services—We help you create a roadmap for success by establishing a comprehensive data protection strategy for:
    - Modernizing backup by migrating data from tape to cloud storage
    - Recovering data quickly and easily in the cloud
    - Optimizing archive and retention for cold data storage
    - Meeting internal and external compliance regulations
    Build Services—We work with you to help you quickly derive business value from your solutions:
    - Designing a solution that meets your specific needs
    - Implementing the solution using proven best practices
    - Integrating the solution into your environment
    Run Services—We help you optimize performance and reduce risk in your environment by:
    - Maximizing availability
    - Minimizing recovery time
    - Supplying additional expertise to focus on data protection
    Rachel Dines
    Product Marketing, NetApp
    The question is no longer if, but when you'll move your backup-and-recovery storage to the cloud.
    As a genius IT pro, you know you can't afford to ignore cloud as a solution for your backup-and-recovery woes: exponential data growth, runaway costs, legacy systems that can't keep pace. Public or private clouds offer near-infinite scalability, deliver dramatic cost reductions and promise the unparalleled efficiency you need to compete in today's 24/7/365 marketplace.
    Moreover, an ESG study found that backup and archive rank first among workloads enterprises are moving to the cloud.
    Okay, fine. But as a prudent IT strategist, you demand airtight security and complete control over your data as well. Good thinking.
    Hybrid Cloud Strategies Are the Future
    Enterprises, large and small, are searching for the right blend of availability, security, and efficiency. The answer lies in achieving the perfect balance of on-premises, private cloud, and public services to match IT and business requirements.
    To realize the full benefits of a hybrid cloud strategy for backup and recovery operations, you need to manage the dynamic nature of the environment— seamlessly connecting public and private clouds—so you can move your data where and when you want with complete freedom.
    This raises the question of how to integrate these cloud resources into your existing environment. It's a daunting task. And it's been a roadblock for companies seeking a simple, seamless, and secure entry point to cloud—until now.
    Enter the Game Changer: NetApp AltaVault
    NetApp® AltaVault® (formerly SteelStore) cloud-integrated storage is a genuine game changer. It's an enterprise-class appliance that lets you leverage public and private clouds with security and efficiency as part of your backup and recovery strategy.
    AltaVault integrates seamlessly with your existing backup software. It compresses, deduplicates, encrypts, and streams data to the cloud provider you choose. AltaVault intelligently caches recent backups locally while vaulting older versions to the cloud, allowing for rapid restores with off-site protection. This results in a cloud-economics–driven backup-and-recovery strategy with faster recovery, reduced data loss, ironclad security, and minimal management overhead.
    AltaVault delivers both enterprise-class data protection and up to 90% less cost than on-premises solutions. The solution is part of a rich NetApp data-protection portfolio that also includes SnapProtect®, SnapMirror®, SnapVault®, NetApp Private Storage, Cloud ONTAP®, StorageGRID® Webscale, and MetroCluster®. Unmatched in the industry, this portfolio reinforces the data fabric and delivers value no one else can provide.
    Figure 1) The NetApp AltaVault cloud-integrated storage appliance. (Source: NetApp, 2015)
    Four Ways Your Peers Are Putting AltaVault to Work
    How is AltaVault helping companies revolutionize their backup operations? Here are four ways your peers are improving their backups with AltaVault:
    Killing Complexity. In a world of increasingly complicated backup and recovery solutions, financial services firm Spot Trading was pleased to find its AltaVault implementation extremely straightforward—after pointing its backup software at the appliance, "it just worked."
    Boosting Efficiency. Australian homebuilder Metricon struggled with its tape backup infrastructure and rapid data growth before it deployed AltaVault. Now the company has reclaimed 80% of the time employees formerly spent on backups—and saved significant funds in the process.
    Staying Flexible. Insurance broker Riggs, Counselman, Michaels & Downes feels good about using AltaVault as its first foray into public cloud because it isn't locked in to any one approach to cloud—public or private. The company knows any time it wants to make a change, it can.
    Ensuring Security. Engineering firm Wright Pierce understands that if you do your homework right, it can mean better security in the cloud. After doing its homework, the firm selected AltaVault to securely store backup data in the cloud.
    Three Flavors of AltaVault
    AltaVault lets you tap into cloud economics while preserving your investments in existing backup infrastructure, and meeting your backup and recovery service-level agreements. It's available in three form factors: physical, virtual, and cloud-based.
    1. AltaVault Physical Appliances
    AltaVault physical appliances are the industry's most scalable cloud-integrated storage appliances, with capacities ranging from 32TB up to 384TB of usable local cache. Companies deploy AltaVault physical appliances in the data center to protect large volumes of data. These datasets typically require the highest available levels of performance and scalability.
    AltaVault physical appliances are built on a scalable, efficient hardware platform that's optimized to reduce data footprints and rapidly stream data to the cloud.
    2. AltaVault Virtual Appliances for Microsoft Hyper-V and VMware vSphere
    AltaVault virtual appliances are an ideal solution for medium-sized businesses that want to get started with cloud backup. They're also perfect for enterprises that want to safeguard branch offices and remote offices with the same level of protection they employ in the data center.
    AltaVault virtual appliances deliver the flexibility of deploying on heterogeneous hardware while providing all of the features and functionality of hardware-based appliances. AltaVault virtual appliances can be deployed onto VMware vSphere or Microsoft Hyper-V hypervisors—so you can choose the hardware that works best for you.
    3. AltaVault Cloud-based Appliances for AWS and Microsoft Azure
    For organizations without a secondary disaster recovery location, or for companies looking for extra protection with a low-cost tertiary site, cloud-based AltaVault appliances on Amazon Web Services (AWS) and Microsoft Azure are key to enabling cloud-based recovery.
    On-premises AltaVault physical or virtual appliances seamlessly and securely back up your data to the cloud. If the primary site is unavailable, you can quickly spin up a cloud-based AltaVault appliance in AWS or Azure and recover data in the cloud. Usage-based, pay-as-you-go pricing means you pay only for what you use, when you use it.
    AltaVault solutions are a key element of the NetApp vision for a Data Fabric; they provide the confidence that—no matter where your data lives—you can control, integrate, move, secure, and consistently manage it.
    Figure 2) AltaVault integrates with existing storage and software to securely send data to any cloud. (Source: NetApp, 2015)
    Putting AltaVault to Work for You
    Four common use cases illustrate the different ways that AltaVault physical and virtual appliances are helping companies augment and improve their backup and archive strategies:
    Backup modernization and refresh. Many organizations still rely on tape, which increases their risk exposure because of the potential for lost media in transport, increased downtime and data loss, and limited testing ability. AltaVault serves as a tape replacement or as an update of old disk-based backup appliances and virtual tape libraries (VTLs).
    Adding cloud-integrated backup. AltaVault makes a lot of sense if you already have a robust disk-to-disk backup strategy, but want to incorporate a cloud option for long-term storage of backups or to send certain backup workloads to the cloud. AltaVault can augment your existing purpose-built backup appliance (PBBA) for a long-term cloud tier.
    Cold storage target. Companies want an inexpensive place to store large volumes of infrequently accessed file data for long periods of time. AltaVault works with CIFS and NFS protocols, and can send data to low-cost public or private storage for durable long-term retention.
    Archive storage target. AltaVault can provide an archive solution for database logs or a target for Symantec Enterprise Vault. The simple-to-use AltaVault management platform can allow database administrators to manage the protection of their own systems.
    We see two primary use cases for AltaVault cloud-based appliances, available in AWS and Azure clouds:
    Recover on-premises workloads in the cloud. For organizations without a secondary disaster recovery location, or for companies looking for extra protection with a low-cost tertiary site, AltaVault cloud-based appliances are key to enabling cloud-based disaster recovery. Via on-premises AltaVault physical or virtual appliances, data is seamlessly and securely protected in the cloud.
    Protect cloud-based workloads.  AltaVault cloud-based appliances offer an efficient and secure approach to backing up production workloads already running in the public cloud. Using your existing backup software, AltaVault deduplicates, encrypts, and rapidly migrates data to low-cost cloud storage for long-term retention.
    The benefits of cloud—infinite, flexible, and inexpensive storage and compute—are becoming too great to ignore. AltaVault delivers an efficient, secure alternative or addition to your current storage backup solution. Learn more about the benefits of AltaVault and how it can give your company the competitive edge you need in today's hyper-paced marketplace.
    Rachel Dines is a product marketing manager for NetApp where she leads the marketing efforts for AltaVault, the company's cloud-integrated storage solution. Previously, Rachel was an industry analyst for Forrester Research, covering resiliency, backup, and cloud. Her research has paved the way for cloud-based resiliency and next-generation backup strategies.

    You didn't say what phone you have - but you can set it to update and backup and sync over wifi only - I'm betting that those things are happening "automatically" using your cellular connection rather than wifi.
    I sync my email automatically when I have a wifi connection, but I can sync manually if I need to.  Downloads happen for me only on wifi, photo and video backup are only over wifi, app updates are only over wifi....check your settings.  Another recent gotcha is Facebook and videos.  LOTS of people are posting videos on Facebook and they automatically download and play UNLESS you turn them off.  That can eat up your data in a hurry if you are on FB regularly.

  • Curious as to your table storage params

    Hi everyone, I'm a long time Oracle guy, first time poster to these forums! I'm just curious as to your table storage parameters. For instance, do you use settings such as initial 128k next 128k pctincrease 0 for production tables after analysis? Or do you crank those settings way up, even "past" your projected usage? Anyone using megabytes for their table extents, such as initial 100M next 50M? Or is it considered bad practice to project 10-15 years into the future?
    Basically I'm re-doing the storage structure of the database that our 3rd party supplier set up; they set everything up with 32k initial extents, 8k subsequent extents, 121 max extents. (Sounds like Oracle 7!) Since that gives us only about a megabyte of storage in a 2 GB tablespace, I'm currently analyzing our sample data to see what our requirements will be.
    Actually does anyone mess with kilobyte extents anymore? Or do you go right to megabyte extents?
    Thanks!
    -Thomas H

    Thomas:
    It has always been best practice to size your objects appropriately, and to take all possible steps to ensure that tablespaces do not become fragmented. With LMTs, it is just that much easier, because users cannot violate the storage parameters you set up at the tablespace level.
    With DMTs, a user can specify storage parameters that differ from the tablespace defaults and screw up your careful analysis. With LMTs, any user-supplied storage parameters are effectively ignored (well, they impact the number of extents initially allocated to the object, but not the size of those extents).
    That said, I would do at least some minimal analysis of the space required for each of the tables and indexes. If, like most OLTP databases, you have a wide variety of object sizes, you can somewhat optimize the disk usage by keeping like-sized objects together in one tablespace.
    For example, in one (payroll/HR) application I support we have 3 LMT tablespaces, small, medium, and large.
    Small holds the hundreds of small (2 - ~1,000 rows) lookup tables and their indexes, and has uniform extents of 64K
    Medium holds the dozens of larger tables like employee demographics, job history etc. (~1,000 - ~1,000,000 rows) and indexes on those tables, and some of the large tables and has uniform extents of 5M.
    Large holds the 5 or 6 extremely large tables like the detailed daily pay history (> 45,000,000 rows) and some of their larger indexes. This has uniform extents of 100M.
    HTH
    John

  • V4.2 error: Short term data storage is full

    Hope someone here can help.  We are still using 4.2 for now.
    Upon logging into Admin, we are receiving an error stating "PLANNING short term data storage is full, do you want to optimize the application?".  When checking the appset status the application only has about 400 records in short term storage.  We have tried saving the application immediately followed by optimization, yet this error continues.
    Any thoughts on how to rectify this?
    Thank you.

    So you have scheduled a lite optimize or a full optimize where you specify the number of records above which an optimize is necessary.
    That number is probably very small (somewhere between 0 and 400), so every time you connect with the Admin module you receive this message.
    Change that number to 40000, not more; you will then stop receiving the message at 400 records and will only see it again once you have over 40000.
    You can also find the parameter in the tbldefault table of your appset.
    I don't remember the name of the parameter, but you will manage to find it there.
    Regards,
    Sorin Radulescu

  • SR830 data storage for rs232

    Hi,
    I found LabVIEW code here which can store data from an input signal (e.g. A/I) on the SR830 and transfer it over GPIB.
    I used a function generator to produce a sine wave (frequency 30 Hz, Vp-p 100 mV) connected to the SR830.
    I got a correct result when I used the code over the GPIB communication interface (left graph).
    Then I tried to change the communication interface to RS232 to acquire the same signal, but it gives wrong results (right graph).
    Here is the original code, "data storage for GPIB".
    Here is the code I rewrote for RS232 myself - what did I do wrong?
    Attachments:
    SR830 DATA STORAGE EXAMPLE.VI ‏54 KB
    scan test1.vi ‏43 KB

    The SR830 expects three integer arguments (separated by commas) for the TRCA? query:
    * i = 1 or 2 specifies which buffer to download from
    * j >= 0 specifies the starting point
    * k>=1 specifies how many points to download
    If you know k, you can calculate the exact number of bytes in the response. For your code, which downloads 4000 points at a time, that will be something like 60 kB (if memory serves, the response in ASCII transfer mode is 15 bytes/value). Make sure that you're not hitting any timeout or serial buffer size limits with a transfer of that size.  
    Edit: You have your baud rate set to 9600 (roughly 960 bytes/second once start and stop bits are counted) and a 10 second timeout. That reads on the order of 10 kB before timing out - only about a sixth of your transfer. The 830 supports baud rates up to 19,200, which will help, but you'll also need either a longer timeout or to transfer your data in smaller chunks.
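    If it helps to see the chunking spelled out, here is a rough sketch in C against a POSIX serial port (the device path, the chunk size of 500 points, and 8N1 framing are all assumptions; a LabVIEW VISA Read loop would follow the same arithmetic):

        #include <fcntl.h>
        #include <stdio.h>
        #include <termios.h>
        #include <unistd.h>

        #define CHUNK 500            /* points per TRCA? query (assumption) */
        #define BYTES_PER_POINT 15   /* approx. ASCII bytes per value, as above */

        int main(void) {
            int fd = open("/dev/ttyS0", O_RDWR | O_NOCTTY);
            if (fd < 0) return 1;

            struct termios tio;
            tcgetattr(fd, &tio);
            cfmakeraw(&tio);
            cfsetispeed(&tio, B19200);   /* the 830's maximum rate */
            cfsetospeed(&tio, B19200);
            tio.c_cc[VMIN] = 0;
            tio.c_cc[VTIME] = 20;        /* 2 s inter-byte timeout */
            tcsetattr(fd, TCSANOW, &tio);

            char buf[CHUNK * BYTES_PER_POINT];
            for (int start = 0; start < 4000; start += CHUNK) {
                char cmd[32];
                int n = snprintf(cmd, sizeof cmd, "TRCA? 1,%d,%d\n", start, CHUNK);
                write(fd, cmd, (size_t)n);

                /* Accumulate until the response stops arriving. */
                size_t got = 0;
                while (got < sizeof buf) {
                    ssize_t r = read(fd, buf + got, sizeof buf - got);
                    if (r <= 0) break;   /* 2 s of silence: chunk done */
                    got += (size_t)r;
                }
                fwrite(buf, 1, got, stdout);   /* parse/store the chunk here */
            }
            close(fd);
            return 0;
        }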

  • OWA 2007 Issue : Microsoft.Exchange.Data.Storage.VirusMessageDeletedException Could not get properties.

    Hi, I am facing an issue with Outlook Web Access on my production server, Exchange Server 2007 with SP1. When I try to reply to or forward a message from OWA, it displays "The message has been deleted due to a virus threat". Kindly refer to the error details below. I have a box with all 3 roles installed and I do not have an Edge server. I have installed Forefront on the same box as my Exchange server. Below are the details of the error. Kindly help.
    A virus was found in this message and it has been deleted. For further
    information, please contact technical support for your organization.
    Click here to continue working.
     Copy error details to clipboard
     Show details
    Request
    Url: https://192.168.7.12:443/owa/forms/basic/BasicEditMessage.aspx?ae=Item&t=IPM.Note&a=Reply&id=RgAAAACFMIZq8d7LTqcPs%2bRZA5g%2bBwBDDZWeCojSQZ1bZZ7Ga%2fkWAAAAeCc6AABDDZWeCojSQZ1bZZ7Ga%2fkWAA5iTbjpAAAJ
    User host address: 192.168.7.11
    User: Munendra Pal Gangwar
    EX Address: /o=First Organization/ou=Exchange Administrative Group
    (FYDIBOHF23SPDLT)/cn=Recipients/cn=mp_gangwar
    SMTP Address: [email protected]
    OWA version: 8.1.336.0
    Mailbox server: PFCDELEXCH01.PFCDOMAIN
    Exception
    Exception type: Microsoft.Exchange.Data.Storage.VirusMessageDeletedException
    Exception message: Could not get properties.
    Call stack
    Microsoft.Exchange.Data.Storage.MapiPropertyBag.GetProperties(IList`1
    propertyDefinitions)
    Microsoft.Exchange.Data.Storage.StoreObjectPropertyBag.InternalLoad(PropertyDefinition[]
    properties, Boolean forceReload)
    Microsoft.Exchange.Data.Storage.StoreObjectPropertyBag..ctor(StoreSession
    session, MapiProp mapiProp, Origin origin, PropertyDefinition[]
    autoloadProperties, Boolean canSaveOrDisposeMapiProp)
    Microsoft.Exchange.Data.Storage.StoreObjectPropertyBag..ctor(StoreSession
    session, MapiProp mapiProp, Origin origin, PropertyDefinition[]
    autoloadProperties)
    Microsoft.Exchange.Data.Storage.Item.InternalBindItem(StoreSession
    session, StoreObjectId itemId, Byte[] changeKey, ItemBindOption
    itemBindOption, PropertyDefinition[] allPropsToLoad)
    Microsoft.Exchange.Data.Storage.Item.InternalBind[T](StoreSession
    session, StoreId id, ItemBindOption itemBindOption,
    PropertyDefinition[] allPropsToLoad)
    Microsoft.Exchange.Data.Storage.Item.InternalBind[T](StoreSession
    session, StoreId id, PropertyDefinition[] allPropsToLoad)
    Microsoft.Exchange.Clients.Owa.Core.Utilities.GetItem[T](StoreSession
    storeSession, StoreId storeId, Boolean forceAsMessageItem,
    PropertyDefinition[] prefetchProperties)
    Microsoft.Exchange.Clients.Owa.Core.Utilities.GetItem[T](UserContext
    userContext, StoreId storeId, Boolean forceAsMessageItem,
    PropertyDefinition[] prefetchProperties)
    Microsoft.Exchange.Clients.Owa.Core.Utilities.GetItem[T](UserContext
    userContext, StoreId storeId, PropertyDefinition[] prefetchProperties)
    Microsoft.Exchange.Clients.Owa.Basic.EditMessage.LoadMessage()
    Microsoft.Exchange.Clients.Owa.Basic.EditMessage.OnLoad(EventArgs e)
    System.Web.UI.Control.LoadRecursive()
    System.Web.UI.Page.ProcessRequestMain(Boolean
    includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
    Inner Exception
    Exception type: Microsoft.Mapi.MapiExceptionVirusMessageDeleted
    Exception message: MapiExceptionVirusMessageDeleted: Unable to get
    properties on object. (hr=0x80004005, ec=1294) Diagnostic context:
    Lid: 18969 EcDoRpcExt2 called [length=479] Lid: 27161 EcDoRpcExt2
    returned [ec=0x0][length=422][latency=15] Lid: 23226 --- ROP Parse
    Start --- Lid: 27962 ROP: ropOpenMessage [3] Lid: 17082 ROP Error:
    0x50E Lid: 26977 Lid: 21921 StoreEc: 0x50E Lid: 27962 ROP:
    ropExtendedError [250] Lid: 1494 ---- Remote Context Beg ---- Lid:
    1238 Remote Context Overflow Lid: 14164 StoreEc: 0xFFFFFA1D PropTag:
    0x672D0003 Lid: 8660 StoreEc: 0x8004010F PropTag: 0x672D0003 Lid:
    21970 StoreEc: 0x8004010F PropTag: 0x672D0003 Lid: 23921 StoreEc:
    0x3EC Lid: 21970 StoreEc: 0x8004010F PropTag: 0x672F0014 Lid: 23921
    StoreEc: 0x3EC Lid: 21970 StoreEc: 0x8004010F PropTag: 0x6708000B Lid:
    21970 StoreEc: 0x8004010F PropTag: 0xE960102 Lid: 21970 StoreEc:
    0x8004010F PropTag: 0x6708000B Lid: 21970 StoreEc: 0x8004010F PropTag:
    0xE960102 Lid: 21970 StoreEc: 0x8004010F PropTag: 0x67760102 Lid:
    25394 Lid: 19506 Lid: 27698 Lid: 11285 StoreEc: 0x50E Lid: 21970
    StoreEc: 0xFFFFFC07 PropTag: 0x30080040 Lid: 25818 Lid: 6153 StoreEc:
    0x50E Lid: 25906 Lid: 5249 StoreEc: 0x50E Lid: 1750 ---- Remote
    Context End ---- Lid: 27962 ROP: ropGetPropsSpecific [7] Lid: 17082
    ROP Error: 0x4B9 Lid: 26465 Lid: 21921 StoreEc: 0x4B9 Lid: 27962 ROP:
    ropExtendedError [250] Lid: 1494 ---- Remote Context Beg ---- Lid:
    26426 ROP: ropGetPropsSpecific [7] Lid: 1750 ---- Remote Context End
    ---- Lid: 26849 Lid: 21817 ROP Failure: 0x4B9 Lid: 20385 Lid: 28577
    StoreEc: 0x50E Lid: 32001 Lid: 29953 StoreEc: 0x50E
    Call stack
    Microsoft.Mapi.MapiExceptionHelper.ThrowIfError(String message, Int32
    hresult, Object objLastErrorInfo)
    Microsoft.Mapi.MapiProp.GetProps(PropTag[] propTagsRequested)
    Microsoft.Exchange.Data.Storage.MapiPropertyBag.GetProperties(IList`1
    propertyDefinitions)

    I had this same issue, except I have Symantec mail security, not Forefront. It wound up being a line in the adult content filter. For whatever reason, they included "If you received this email" as one of the literal strings. This is probably part of almost every corporate confidentiality clause I have ever seen!! Go figure......
    Thanks for the event log hint - since I have the adult content filter set to delete, I had no good tracking mechanism in Exchange or SMSME. I set email notifications for the future on these.
    Brent
    I had this same problem, but I have ESET Mail Security for Exchange Server. In some cases the problem was solved by adding the "user@domain" to the whitelist, but in many cases the problem continues. I have Exchange 2007 on MS Windows Server 2003.
    Thanks for your help.

  • Data Storage of Analog Output

    Hi all,
    I am attaching my LabVIEW code with this post. My problem is that the code runs correctly with the "Write to LVM File" applet. Without this applet the waveform shows a continuous graph, but when I add the applet to store the data, the waveform gets broken. It updates only every 100 - 200 sec. Even though I used an X-scrollbar, I am not able to access the data once it gets updated.
    Is there any other method to store the data from the DAQ?
    I tried to store it to a spreadsheet, and also used Write to TDM File, but in vain. I have been trying this for 4 days and I don't know where I am going wrong. Please help me out.
    Attachments:
    Data Acquisition from 6 Sensors.vi ‏2426 KB

    Storing data as text takes a long time.  LVM, TDM, and ASCII are all text.  The overhead of creating and storing the text is taking longer than the DAQ card takes for each set of data.  As a result, you are getting data overruns on the DAQ card - look at the error out output of the DAQ Assistant while the problem is occurring.  Fortunately, there are two things you can do to make things better.  You should probably do both.
    Do not use a text format.  You can either use raw binary using the VIs from the File I/O palette or NI-Hierarchical Waveform Storage (NI-HWS).  NI-HWS is the easier of the two.  If you do not have NI-HWS, you can get it with almost any of the measurement device drivers (e.g. NI-SCOPE).  Make sure you open the file before you run the loop and close it after the loop exits.
    Do your file storage in a parallel loop to your data acquisition.  This is called a producer/consumer architecture.  You can find examples in the LabVIEW examples.  Use a queue to pass data from the acquisition loop to the storage loop.  This will allow your data acquistion to run at full speed, relatively independent of the file storage (they still use the same processor).
    Using these two methods will allow you to write data to disk at hardware limited speeds.
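    For what it's worth, here is the shape of that producer/consumer pattern sketched in C with POSIX threads, since LabVIEW diagrams don't paste into text (the queue depth, block size, and file name are assumptions; in LabVIEW the queue VIs and your two loops play these roles):

        #include <pthread.h>
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        #define QDEPTH 64
        #define BLOCK  1024   /* samples per block (assumption) */

        static double q[QDEPTH][BLOCK];
        static int head, tail, count, done;
        static pthread_mutex_t m = PTHREAD_MUTEX_INITIALIZER;
        static pthread_cond_t c = PTHREAD_COND_INITIALIZER;

        /* Producer: stands in for the DAQ read loop. */
        static void *acquire(void *arg) {
            (void)arg;
            for (int b = 0; b < 100; b++) {
                double block[BLOCK];
                for (int i = 0; i < BLOCK; i++)
                    block[i] = rand() / (double)RAND_MAX;   /* fake samples */
                pthread_mutex_lock(&m);
                while (count == QDEPTH) pthread_cond_wait(&c, &m);
                memcpy(q[tail], block, sizeof block);
                tail = (tail + 1) % QDEPTH;
                count++;
                pthread_cond_broadcast(&c);
                pthread_mutex_unlock(&m);
            }
            pthread_mutex_lock(&m);
            done = 1;
            pthread_cond_broadcast(&c);
            pthread_mutex_unlock(&m);
            return NULL;
        }

        /* Consumer: writes raw binary, so disk speed never stalls acquisition. */
        static void *store(void *arg) {
            FILE *f = (FILE *)arg;
            for (;;) {
                pthread_mutex_lock(&m);
                while (count == 0 && !done) pthread_cond_wait(&c, &m);
                if (count == 0 && done) { pthread_mutex_unlock(&m); break; }
                double block[BLOCK];
                memcpy(block, q[head], sizeof block);
                head = (head + 1) % QDEPTH;
                count--;
                pthread_cond_broadcast(&c);
                pthread_mutex_unlock(&m);
                fwrite(block, sizeof(double), BLOCK, f);   /* binary, not text */
            }
            return NULL;
        }

        int main(void) {
            FILE *f = fopen("daq.bin", "wb");
            if (!f) return 1;
            pthread_t p, s;
            pthread_create(&p, NULL, acquire, NULL);
            pthread_create(&s, NULL, store, f);
            pthread_join(p, NULL);
            pthread_join(s, NULL);
            fclose(f);
            return 0;
        }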
    Note that there are a few other things you could do to make your code simpler.
    Since your processing is the same for all channels, there is no need to break the data up into individual channels before processing.  The processing blocks can handle multiple input waveforms.
    There is no need to query for the current time six times outside the loop and six times inside (you can wire an output to multiple inputs).  This also will reduce the number of calculations you need to set the timestamps.
    You can convert your dynamic data into an array of waveforms and subtract the starting timestamp from the original in a FOR loop - no need to query for the current time inside the loop; the DAQ Assistant already gives you this information.  This will get rid of the incorrect dt value, duplication of Y values, and dubious timestamps.
    Good luck.  Let us know if you need more help.
    This account is no longer active. Contact ShadesOfGray for current posts and information.

  • Trouble with Data Storage VI's

    I am new to LabVIEW data storage. I wrote a very basic program to write data to a database and to read the data back, but for some reason I am unable to read the data. I am attaching the VIs. Please can anyone tell me what I am doing wrong in the VIs?
    Thank you,
    Mudda.
    Attachments:
    Read DataBase.vi ‏156 KB

    Mudda,
    I modified your code and I'll attach it here for you to look at. First of all, you need to tell the Data Storage Open what function to perform (e.g. open, create, or replace). Then, you need to make sure the file you're writing to is a .tdm file. Finally, you need to remove the "Signals" terminal from your read and write VIs. To do this, double click on the VI and uncheck the box for "Show terminals for data channel". If this is checked and the "Signals" terminal is visible, then the refnum will not pass any info on the file unless a signal is actually connected. So take a look at the code and see if you have any questions.
    Tyler S.
    Attachments:
    write.vi ‏108 KB

  • 2.23 Apps must follow the iOS Data Storage Guidelines or they will be rejected

    My Multi Issue v14 App (24124) was just rejected by Apple. Apparently because storage of the data (folios?) was not iCloud compatible.
    Is this related to v14? Would building a v15 app resolve the issue?
    or is there some other problem?
    Please advise...
    Full text of Apple rejection below...
    Nov 4, 2011 08:17 PM. From Apple.
    2.23
    We found that your app does not follow the iOS Data Storage Guidelines, which is not in compliance with the App Store Review Guidelines.
    In particular, we found magazine downloads are not cached appropriately.
    The iOS Data Storage Guidelines specify:
    "1. Only documents and other data that is user-generated, or that cannot otherwise be recreated by your application, should be stored in the /Documents directory and will be automatically backed up by iCloud.
    2. Data that can be downloaded again or regenerated should be stored in the /Library/Caches directory. Examples of files you should put in the Caches directory include database cache files and downloadable content, such as that used by magazine, newspaper, and map applications.
    3. Data that is used only temporarily should be stored in the /tmp directory. Although these files are not backed up to iCloud, remember to delete those files when you are done with them so that they do not continue to consume space on the user’s device."
    For example, only content that the user creates using your app, e.g., documents, new files, edits, etc., may be stored in the /Documents directory - and backed up by iCloud. Other content that the user may use within the app cannot be stored in this directory; such content, e.g., preference files, database files, plists, etc., must be stored in the /Library/Caches directory.
    Temporary files used by your app should only be stored in the /tmp directory; please remember to delete the files stored in this location when the user exits the app.
    It would be appropriate to revise your app so that you store data as specified in the iOS Data Storage Guidelines.
    For discrete code-level questions, you may wish to consult with Apple Developer Technical Support. Please be sure to include any symbolicated crash logs, screenshots, or steps to reproduce the issues when you submit your request. For information on how to symbolicate and read a crash log, please see Tech Note TN2151 Understanding and Analyzing iPhone OS Application Crash Reports.
    To appeal this review, please submit a request to the App Review Board.
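    For a native app, the usual fix for this particular rejection is to write re-downloadable content (such as folios) under Library/Caches instead of Documents. A minimal sketch in C (the folder and file names are hypothetical; on iOS the sandbox container path comes back from getenv("HOME")):

        #include <stdio.h>
        #include <stdlib.h>
        #include <sys/stat.h>
        #include <sys/types.h>

        int main(void) {
            const char *home = getenv("HOME");   /* the app's sandbox container */
            if (!home) return 1;

            char dir[512], path[1024];
            snprintf(dir, sizeof dir, "%s/Library/Caches/folios", home);
            mkdir(dir, 0755);                    /* fine if it already exists */

            /* Files here are not backed up by iCloud and may be purged by iOS,
               which is what the guidelines want for re-downloadable data. */
            snprintf(path, sizeof path, "%s/issue-2011-11.folio", dir);
            FILE *f = fopen(path, "wb");
            if (f) {
                /* ...write the downloaded folio bytes... */
                fclose(f);
            }
            return 0;
        }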

    You might want to check out our ANE (Adobe Native Extension) solution that enables your FB projects to abide by Apple's Data Storage Guidelines.
    https://developer.apple.com/library/ios/#qa/qa1719/_index.html
    Do Not Backup project:
    http://www.jampot.ie/ane/ane-ios-data-storage-set-donotbackup-attribute-for-ios5-native-extension/
    David
    JamPot.ie

  • Non-server data storage

    A friend of mine is developing a database for a specific environment:
    A small number of peer-to-peer networked Windows computers in a small office, with no dedicated servers - each computer acting as an individual's workstation. They want a "central" database to store client information etc, with an interface they can all use at once to access and change the data.
    Given the nature of the environment, my first thought was that some sort of file-based data storage (not requiring a server process) would be most appropriate - Access, CSV, XML... but I'm not intricately familiar with JDBC support for these mechanisms, so I wasn't sure what to recommend specifically.
    They are not willing/able to spend any money on this solution, so it must use the current environment. Can someone recommend a data storage method and point me to an appropriate JDBC driver?
    Oh, and while I have your attention - anybody know of a good CSV parser? I'm currently splitting a line of data by commas, but it's also splitting strings with commas in them...
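    On the CSV aside: splitting on every comma breaks exactly as you describe; the usual fix is to walk the line once and track whether you are inside quotes. The logic ports to any language; here is a minimal sketch in C (RFC 4180-style quoting, with doubled quotes as escapes):

        #include <stdio.h>

        /* Split one CSV record in place; returns the number of fields.
           Handles quoted fields, commas inside quotes, and doubled quotes. */
        static int split_csv(char *line, char *fields[], int max) {
            int n = 0;
            char *out = line, *p = line;
            while (n < max) {
                fields[n++] = out;
                int quoted = 0;
                if (*p == '"') { quoted = 1; p++; }
                while (*p) {
                    if (quoted && *p == '"') {
                        if (p[1] == '"') { *out++ = '"'; p += 2; }  /* escaped "" */
                        else { p++; quoted = 0; }
                    } else if (!quoted && *p == ',') {
                        break;                                      /* field ends */
                    } else {
                        *out++ = *p++;
                    }
                }
                if (*p == ',') { *out++ = '\0'; p++; }
                else { *out = '\0'; break; }                        /* record ends */
            }
            return n;
        }

        int main(void) {
            char line[] = "42,\"Smith, John\",\"said \"\"hi\"\"\",plain";
            char *f[8];
            int n = split_csv(line, f, 8);
            for (int i = 0; i < n; i++) printf("[%s]\n", f[i]);
            return 0;
        }

    Running it on the sample line in main prints four fields, including the comma kept inside "Smith, John".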

    A friend of mine is developing a database for a specific environment: a small number of peer-to-peer networked Windows computers in a small office, with no dedicated servers - each computer acting as an individual's workstation. They want a "central" database to store client information etc, with an interface they can all use at once to access and change the data.
    Based on this I have a non-Java solution to suggest: use MS Access and IIS to develop and deploy an intranet.
    There are several pros to this solution as I see it:
    - You can develop and deploy an intranet using web browsers as clients very quickly and relatively cheaply
    - An intranet (series of web pages) vs. a full-blown application may well be easier to make changes to
    - Having the data in an RDBMS (Access), which will be cost effective in this case, also makes it relatively simple to upgrade or port the system later
    Now, I like programming in Java as much as the next person, but from your requirements it sounds like writing an application might be overkill. In my experience an intranet like this is a pretty good solution: you don't have to install anything on the clients, and you already have the software for the server (if you don't have IIS, most versions of Windows now have PWS, or Personal Web Server, which will work for this). The important thing is to have a good database design so that you can make changes or port the client easily later if you need to.

  • iTunes shows that there are no songs on my iPhone but there are 274 songs there

    iTunes shows that there are no songs on my iPhone but there are 274 songs there. The data storage shows up as "other" in iTunes. I just want to clear the music on my phone but I can't seem to do so. Is there any way to do this? You used to be able to swipe and delete a song straight from your phone, but I can't seem to do that anymore.

    That is stealing.

  • I need a memory management / data storage recommendation for my iMac 2 GHz Intel Core 2 Duo

    Since giving my mid 2007 iMac a 2 GB memory boost to accommodate Lion all has been well; however, my memory is full. I have a sizable iTunes library and 6000 photos in iPhoto. Methinks I should store this all more effectively for the safety of the music and photos and for the well-being of the computer... but there seem to be choices. Is this where iCloud comes into play, or Time Capsule, or should I just get an external mini hard drive? Can anyone help me with some pearls of wisdom on data storage?

    Greetings John,
    There are two types of memory which you mention here.
    The 2 GB of memory you refer to is RAM.  It is not used for storing information; rather, it gives the computer a place to do much of its work.
    File storage is handled by the hard drive of the computer.
    If your available hard drive space is getting low you can move larger files to a Mac Formatted external drive:
    Faster connection (FW 800) drives:
    http://store.apple.com/us/product/H2068VC/A
    http://store.apple.com/us/product/TW315VC/A
    http://store.apple.com/us/product/H0815VC/A
    Normal speed (USB) drives:
    http://store.apple.com/us/product/H6581ZM/A
    Larger files can include entire databases like iTunes, iMovie, or iPhoto.
    Keep in mind that if you move these items to an external drive you will have to have the drive plugged in and powered on to access the data on them.  In addition, if you move important information off your internal drive to an external, you should be sure that your backup solution is backing up that external drive to keep your information safe.
    iCloud is not a file storage solution, and Time Capsule is not suited for storing databases like those mentioned above (it's meant primarily as a backup solution).  I would stick with external drives (one to hold your big files and another big enough to back up both your computer and the first drive).
    Here are some other general computer clean up suggestions: http://thexlab.com/faqs/freeingspace.html.
    Hope that helps.
