Where Is Local Storage In Windows 8?

So I have the Revel app on my Windows 8 PC.
It stopped syncing because I've apparently run out of local storage space,
but I don't know where this folder is; it's not in My Pictures.
Also, is there a way in Windows 8 to stop it syncing locally and have it sync only to the cloud?

Are you out of space on the computer or out of uploads in Revel? Revel does not save full-sized copies of the photos on each device, but rather saves them in the cloud to avoid filling up your device. For this reason I am skeptical that Revel is the issue if your PC hard drive is full. Is it possible you have many programs, music, documents, photos or videos that have filled up the disk?
Are you getting an error message from Revel? Are you a premium user? If not, you may have reached your upload limit of 50 uploads per month on a free account.
Pattie

Similar Messages

  • Local storage where?

    I've finally gotten a desktop (Windows XP) and I want to be
    able to play my flash website games with the new computer (instead
    of the laptop with the cracked screen). The flash game saves to the
    local storage and is saved within the flash game, not from an
    existing account on the website outside of the flash game. Anyway,
    I was wondering where the storage is so that I could transfer it
    from the laptop (Windows 2000) to the desktop (Windows XP).

    Does no one know?

  • Errors when accessing Encrypted Local Storage

    Hi there,
    I develop an AIR application, and some of the users of the application are running into an issue when accessing Encrypted Local Storage. Sometimes, when the application tries to put or retrieve an item from Encrypted Local Storage, these errors are reported:
    Error: general internal error
    EncryptedLocalStore database access error
    I've searched around a little and it appears that these errors occur for a variety of reasons. It sounds like corruption of the user's keychain can result in this error being thrown (repairing the keychain was mentioned as a possible solution in one blog post). This same error happens to some Linux users (http://kb2.adobe.com/cps/492/cpsid_49267.html) when switching between Gnome and KDE desktops too. http://blogs.adobe.com/koestler/2009/07/unreadable_encrypted_local_sto.html seems to suggest that changing of usernames on the machine can have an impact on how the ELS is accessed.
    In terms of a solution to this issue, I've seen multiple posts online asking the user to delete the ELS directory on their computer. While this 'solution' works for some users of the application I develop, it doesn't work for other users. http://forums.adobe.com/thread/239605 seems to suggest that a full factory reset of a user's OS gets everything back to square one and the ELS is now usable again - that's not really something that I can suggest to a user of an application. It seems a bit drastic and most users will (understandably) baulk at the idea of a full factory reset.
    So, I'm in a position with AIR development where I have to provide a fallback to ELS. Thankfully, I don't use ELS in the application too much, but I imagine anyone that has to make meaningful use of it would be pretty handicapped by the issues above. I guess I have a few questions:
    1. Has anyone been able to reproduce the errors listed above? Are you able to get your system into a state where 'Error: general internal error' appears consistently when you try to access ELS? I've tried the suggestions listed in the articles I outlined above, but have so far failed in reproducing these errors.
    2. If you have managed to get your machine into this state, what is the most advisable remedial action? As I mentioned above, deleting the ELS directory to start anew seems to work for some users but not for others.
    3. Is there anything that someone has tried where they've found a programmatic solution to avoid this issue altogether?
    4. Can someone in AIR engineering comment on concrete efforts to avoid these sort of scenarios in future versions of the runtime? Is there any debug information that I can provide from users of this application that could help diagnose this issue further and possibly feed back to the AIR development team?
    FWIW, my application descriptor file is pointing at the 1.5.3 version of the runtime. We've seen this happen to users that are running this application with both the 1.5.3 version of AIR and the 2.0.3 version of AIR. We've seen this happen on both Mac and Windows. I've seen this happen with initial installs of the application and with upgrades to new versions too.
    Sean
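    For question 3, one programmatic mitigation is to wrap every ELS call so that a failure degrades to a secondary store instead of crashing. A minimal sketch, using hypothetical stand-in stores (in a real AIR app the primary would be EncryptedLocalStore and the fallback something less sensitive, such as an unencrypted file or in-memory cache):

```javascript
// Wrapper that tries a primary store and falls back to a secondary one
// when the primary throws (as ELS does with "general internal error").
// SafeStore and MemoryStore are hypothetical stand-ins, not AIR APIs.
function SafeStore(primary, fallback) {
  this.primary = primary;
  this.fallback = fallback;
}

SafeStore.prototype.setItem = function (key, value) {
  try {
    this.primary.setItem(key, value); // may throw on a corrupted keyring
  } catch (e) {
    this.fallback.setItem(key, value); // degrade gracefully
  }
};

SafeStore.prototype.getItem = function (key) {
  try {
    return this.primary.getItem(key);
  } catch (e) {
    return this.fallback.getItem(key);
  }
};

// In-memory store used to stand in for both back ends; pass failing=true
// to simulate the corrupted-ELS behaviour described above.
function MemoryStore(failing) {
  var data = {};
  this.setItem = function (key, value) {
    if (failing) throw new Error("general internal error");
    data[key] = value;
  };
  this.getItem = function (key) {
    if (failing) throw new Error("general internal error");
    return data.hasOwnProperty(key) ? data[key] : null;
  };
}
```

    With a broken primary, new SafeStore(new MemoryStore(true), new MemoryStore(false)) still round-trips values through the fallback instead of surfacing the error to the user.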

    Hi Sean,
    Thanks for reporting the issue. As you pointed out (via the blogs and weblinks), we are aware of this issue. And the following blog post talks about the problem in detail:
         http://kb2.adobe.com/cps/492/cpsid_49267.html
    As mentioned there, this issue arises because of corruption of the keyring database, which in turn could happen because of user migration, switching desktops, ELS data migration to a different machine, etc. In such a scenario, even native applications are not able to access the ELS store (gnome-keyring or kde-kwallet), so there is little that we can do here. We have never seen a scenario where an AIR application itself caused the corruption of the database.
    Having said that, if you can provide us a consistently reproducible case (a list of steps that always triggers the issue), then we will definitely do our best to address it.
    Thanks,
    -romil
    (AIR Engineering)

  • Adobe Flash FAIL:  Adobe Flash Player local storage settings incorrect.  Module 'Resume' feature may not work on this computer.

    Using a Windows 2012 RDS environment, we have users connecting to a CPD website, and as part of the CPD they need to run a systems checker. When they run the systems checker they get the following error message: "Adobe Flash FAIL:  Adobe Flash Player local storage settings incorrect.  Module 'Resume' feature may not work on this computer". All users connect to this environment with Windows CE clients. I have checked the Adobe Flash settings and they seem correct, but as each user has their own profile in the RDS session, is there something I should be setting for each user? I have added the website to the trusted sites and it has made no difference. Any ideas?

    It sounds like what's happening is that Flash Player can't write or read from the local shared objects in the user's redirected home directory because we disallow traversing junctions in the broker process.  This behavior was disabled to address a vulnerability identified in some of John Forshaw's research into the IE broker last year.
    You can enable this behavior by adding the following setting to mms.cfg:
    EnableInsecureJunctionBehavior=1
    That said, you can probably gather from the name of the flag that we don't really recommend this approach, and disable this attack surface by default.  There's some risk that a network attacker could craft content that abuses fundamental issues with how Windows handles Junctions to write to arbitrary locations.
    Unfortunately, there's not a simple or easy workaround that I'm aware of (but it's been ages since I've administered a Windows domain) for this kind of NAS/SAN-backed terminal server environment where Flash is not able to access \Users\<user>\AppData\Roaming\Macromedia\Flash Player\ without traversing a junction.

  • Adobe Flash Player 10 - Enable Local Storage Issue

    Hi all,
    Having an issue after upgrading to Flash Player 10 from 9.
    Some chat room applications are indicating that they cannot run
    since local storage is not enabled. If I log into the PC as the
    domain admin everything works fine. If a regular user logs in, that
    message appears. What has adobe changed in Flash Player 10 that I
    need to update?
    Thanks.

    Me too!
    On both IE8 and Firefox. Win 7 32 bit on IBM T42 - 1GB ram.
    Old laptop I know but the 10.1.x.x Flash Players worked just fine. Problem started with 10.2.152.26 flash player.
    Now BBC TV live and iPlayer work just fine, however YouTube does not - audio but no video. Just a black rectangle where the video should be.
    Right click to get the menu and "settings" is greyed out. However select the pop-out option and the video plays. Right click on the pop-out and "settings" is available. So deselect "enable hardware acceleration", close the pop-out and refresh the window (F5) and now YouTube videos play. Switch hardware acceleration back on and now they don't.
    This video works fine with hardware acceleration enabled. http://www.adobe.com/products/flashplayer/features/video/h264/
    These also work. http://www.adobe.com/devnet/flashplayer/stagevideo.html
    Something broken in 10.2, I think.

  • Local storage resource need and use. How it difference from actual VM instance drives?

    Hi,
    I am not quite able to understand the use of the local storage resource that we configure from the service definition file. Local storage is not durable and provides file access the same as local file storage we would typically use, like C:\my.txt.
    As this local storage is not durable, and neither is information stored on (for example) the C drive of the role instance VM, what advantage do we get by using a local storage resource? Instead we could save to the C drive of the role instance VM. Why is local storage
    recommended instead of using the VM's drive?
    Please let me know your views.

    Hi,
    >>So what is the advantage we get by using local storage resource?
    Because local storage is not durable, on a cloud service we can create a small local store where you can save temporary files. This is a powerful model for some scenarios, especially highly scalable applications. Please have
    a look at the articles below for more details.
    #http://vkreynin.wordpress.com/2010/01/10/learning-azure-local-storage-with-me/
    #http://www.intertech.com/Blog/windows-azure-local-file-storage-how-to-guide-and-warnings/
    Best Regards,
    Jambor
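    For reference, the local storage resource discussed above is declared in the cloud service's ServiceDefinition.csdef file. A minimal sketch, in which the role name, resource name, and size are hypothetical:

```xml
<!-- Fragment of ServiceDefinition.csdef declaring a local storage resource.
     Role name, resource name, and size are made up for illustration. -->
<WebRole name="MyWebRole" vmsize="Small">
  <LocalResources>
    <!-- cleanOnRoleRecycle="false" asks the fabric to try to keep the data
         across a role restart, but the storage is still not durable. -->
    <LocalStorage name="TempFiles" sizeInMB="1024" cleanOnRoleRecycle="false" />
  </LocalResources>
</WebRole>
```

    The role then resolves the resource's path at runtime (in .NET, via RoleEnvironment.GetLocalResource) rather than hard-coding a drive letter, which is the main practical advantage over writing directly to C:.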

  • How does one clear the Local Storage in Safari 6.0?

    I used to be able to delete the cookies in the Local Storage folder.  But the old path ~/Library/Safari/LocalStorage doesn't exist anymore since the system was updated to OS X 10.7.4 and Safari 6.0.  And as ever, emptying the cache via the Develop drop-down menu doesn't clear the cookies stored in Local Storage.
    I am totally at a loss as to where these cookies are hiding in my system, but I really want to get rid of them.  Firefox is an alternative, but it is tiresome to use as it keeps blocking things I want to do or asking me a little too frequently for permission, etc.  Can anyone explain the Local Storage issue to me, please?

    hpr3, thanks for your response, but you didn't read my post correctly.  I am not referring to 'normal' cookies.  The ones stored under the Local Storage section are special and survive the general 'clear all website data' action via Safari's preferences.  I know several people have pointed this out before me.  I was able to clear these cookies in the Local Storage folder before, but it always took quite a lot of clicking through, going via the Finder to user/library/safari/local storage etc.  I recall there was a folder named Macromedia involved as well.   But with the upgrade of the operating system and Safari this path doesn't exist anymore.
    If you clear the cookies via preferences, then click out of Privacy and back in, you might notice that even though you haven't used your web browser at all, there are still cookies which magically re-appear. And they are tagged as Local Storage.
    Now, can anyone else come up with a helpful answer?

  • Live migration to HA failed leaving VHD on local storage and VM in cluster = Unsupported Cluster Configuration

    Hi all
    Fun one here, I've been moving non-HA VMs to a HA and everything has been working perfectly until now.  All this is being performed on Hyper-V 2012R2, Windows Server 2012R2 and VMM 2012R2.
    For some reason one of the VMs failed the migration with error 10608 "Cannot create or update a highly available virtual machine because Virtual Machine Manager could not locate or access Drive:\Folder".  The odd thing is that the drive\folder is
    a local storage one, and I selected a CSV in the migration wizard.
    The net result is that the VM is half configured into the cluster but the VHD is still on local storage.  Hence the "unsupported cluster configuration" error.
    The question is how do I roll back? I either need to get the VM out of the cluster and back into a non-HA state, or move the VHD onto the CSV.  Not sure if the latter is really an option.
    I've foolishly clicked "Ignore" on the repair so now I can't use the "undo" option (brain fade moment on my part).
    Any help gratefully received as I'm a bit stuck with this.
    Thanks
    Rob

    Hi Simar
    Thanks for the advice, I've now got the VM back in a stable state and running HA.
    Just to finish off the thread for future I did the following
    - Shutdown the VM
    - Remove the VM from the Failover Cluster Manager (as you say this did leave the VM configuration intact)
    - I was unable to import the VM as per your instructions so I copied the VHD to another folder on the local storage and took a note of the VM configuration.
    - Deleted the VM from VMM so this removed all the configuration details/old VHD.
    - Built a new VM using the details I saved from the point above
    - Copied the VHD into the new VMs folder and attached it to the VM.
    - Started it up and reconfigured networking
    - Use VMM to make the VM HA.
    I believe I have found the reason for the initial error: it appears there was an empty folder in the Snapshot folder, probably from an old checkpoint that hadn't been cleaned up properly when it was deleted.
    The system is up and running now so thanks again for the advice.
    Rob

  • Retrieve variable value from local Storage and display on canvas

    Hi
    I'm working on a project that has multiple HTML files (the project is split into 12, so there are 12 different Edge projects, and I'm linking them via window.open()). I have a variable that keeps track of correct answers, stored in the localStorage HTML object. I have managed to get the localStorage variable to increment by one each time the answer is correct; my last step is to retrieve the variable and display the result on the canvas. I have tried the
    var outPut = localStorage.getItem(' ') method to retrieve the variable, then used the set and get methods to display the result, but it doesn't work. I'm not sure if I need a for loop to go through localStorage and get the elements.
    Code:
    // insert code to be run when the composition is fully loaded here
    yepnope({nope: ['jquery-ui-1.10.0.custom.min.js', 'jquery.ui.touch-punch.min.js'], complete: init}); // load the jQuery files
    sym.setVariable("myScore", 0);
    var c = localStorage["myCount"] || 0; // loading from localStorage
    function init(){
      sym.getSymbol("barLimit").$('scrubber').draggable({
        axis: "x",
        containment: "parent",
        start: function(e){
          // Find original position of dragged image
        },
        drag: function(e, ui){
          var leftLimitScrubber = sym.getSymbol('barLimit').$('scrubber').position().left; // position of the scrubber
          var rightLimitScrubber = sym.getSymbol('barLimit').$('leftLimit').position().left;
          var LimitTwoLeft = sym.getSymbol('barLimit').$('rightLimit').position().left;
          if(leftLimitScrubber == rightLimitScrubber){ // correct answer
            sym.getSymbol('correctBar1').play('in');
            sym.getSymbol('nextButton').play('in');
            sym.getSymbol('incorrectBar1').play('out');
            sym.getSymbol('thumbsDown1').play('out');
            sym.getSymbol('thumbsUp1').play('in');
            sym.getSymbol('congrats').play('in');
            c = parseInt(c, 10) + 1; // convert the stored string to a number and bump the count
            localStorage["myCount"] = c; // save the new count
            console.log("numberOfCorrectAnswers", localStorage["myCount"]);
            sym.setVariable("myScore", c); // use the updated count, not the stale value
            sym.$("Score").html(c);
          } else if(leftLimitScrubber == LimitTwoLeft){ // incorrect answer
            sym.getSymbol('incorrectBar1').play('in');
            sym.getSymbol('correctBar1').play('out');
            sym.getSymbol('thumbsUp1').play('out');
            sym.getSymbol('thumbsDown1').play('in');
          }
        }
      });
    }
    The above is the code for the 12th project; it needs to read the variable stored in the localStorage object and display it on the canvas.
    Any help will mean a lot. Thank You in advance
    P.S. Edge Animate has a lot of bugs; it's hard to code in.

    What you need to do is create a text box and set a default value of zero. Once that is done, you need code on the stage which grabs the value from the localStorage object. I used the .text() jQuery method to display the value on the canvas, so the zero will be replaced with whatever the value of localStorage is.
    You also need an if statement to check whether the localStorage value is undefined; if it's not, grab the value and display it on the canvas.
    e.g.
    var number = localStorage['finalValue']; // for the sake of completeness I had to put this line of code
    if (number !== undefined) { // if not undefined, the value exists, so ...
         sym.$(' (text identifier) ').text(number); // note: text identifier is the name of the text box you created in Edge
    } // Done
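    The increment-and-display logic in this thread can be isolated into a couple of helper functions. A minimal sketch, assuming the key name "myCount" from the code above, and using a plain object in place of the browser's localStorage so it runs anywhere (localStorage stores only strings, which is why the parseInt matters):

```javascript
// Read the stored count, defaulting to 0 when the key has never been set
// (parseInt of undefined is NaN, and NaN || 0 yields 0).
function getCount(store) {
  return parseInt(store["myCount"], 10) || 0;
}

// Increment the count and write it back as a string, the way the real
// localStorage would store it; returns the new numeric count.
function recordCorrectAnswer(store) {
  store["myCount"] = String(getCount(store) + 1);
  return getCount(store);
}
```

    In the Edge project, the same helpers would be called with the real localStorage, e.g. sym.$("Score").html(getCount(localStorage)).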

  • Moving Photos from iCloud Photo Library to Local Storage

    Scenario - I've a fully migrated library of photos/videos using iCloud Photo Library on iPhone and Mac.  It's near the limit of the iCloud storage plan I purchased and want to retain.  I'd like to move older and less frequently used content from iCloud Photo Library to more permanent archival storage.
    [This is for two reasons.  First, I prefer to use the Full Resolution setting on mobile devices, and that will be impossible as the entire library grows beyond the storage capacity of even the largest mobile devices.  Second, I don't feel the need to pay for super-sized iCloud storage for content rarely needed and only needed on a Mac.]
    The only option I've identified in Photos to do this is to Export (and delete from iCloud), which exports the original photos, but does not preserve useful Photos metadata and organization, such as Albums.
    What one might like to see is a way to designate selected portions of the Library for local storage only (including backup within the Photos app library package), so those photos can be manipulated within Photos alongside iCloud content but don't consume iCloud or mobile device space.  Or, in the alternative, one would like to see a way to Export content to merge into a separate Photos library package, preserving the metadata/organization.  In that way, one could maintain one Photos library as current iCloud-synced content, and one or more local-only Photos library packages with archival content (with, importantly, the Export function to move content between the two preserving metadata).
    Does anyone know if there's a way to do this?  If not, Apple, would you consider coming up with a way to address this need?

    Nissin101 wrote @ 3:36pm EMT:
    > Well, I was able to move photos from the camera roll to the photo library by sending the pictures via email to my dad's BlackBerry, then saving them to my computer from his phone, then putting them back into the photo library.
    This is what I said originally.
    Nissin101 wrote @ 4:08pm EMT:
    > Alright, I guess that answers my question then. However, just as I said, I was able to transfer photos from my camera roll to my photo library, so at least that is possible.
    I never said that I did it directly, nor did I mean to imply that I was looking for a direct solution. This, I guess, is where our misunderstanding comes from. I just did not feel like repeating the whole process I went through. Regardless, I would rather this thread not derail into who said what and who misunderstood whom. I now know that it is not possible to get pictures from the photo library to the camera roll in any way, so my question is answered, for now at least.

  • VSphere 6, can't use FT with local storage

    Hi guys,
    Now that I have figured out that turning on FT without shared storage results in no datastores being shown in the FT dialog box in the web client, I've run into a different problem (thus the new discussion topic).
    When I turn on FT for a VM that is on local storage, I am now able to select the datastores, but when I select local storage for the secondary VM, I get a bunch of errors that tell me the secondary host can't reach the .vmx file (of course - that's why I needed the shared storage - for these little files). Shouldn't FT be copying the .vmx file and whatever else it needs to be shared to the right spot, since I've told it where that is? Do I need to move my VM to shared storage, then move my VMDKs and stuff to local storage manually?
    Here's a screenshot of the dialog:
    Here's the current hardware setup:
    ESX1:
    - IP: 192.168.220.51
    - Datastores: ESX1-LocalStorage and NFS
    ESX2:
    - IP: 192.168.220.52
    - Datastores: ESX2-LocalStorage and NFS
    My primary VM for FT is on ESX2, and trying to get my secondary on ESX1.
    Thanks!

    gs_khalsa wrote:
    The availability guide isn't accurate (I'll work on getting it corrected). The requirements for FT storage are:
    - Shared storage: FT config file, tie-breaker file, primary VM vmx file
    - Local storage: all other files (VMDKs, etc).
    The limitation with this however is that FT can't move or replicate your VM files to a new location, so if the host where the primary VMs VMDKs are stored is unavailable FT won't be able to spin up a new secondary.
    Example:
    3 hosts (A, B, C)
    Primary VM running on A - VMDKs stored on local storage
    Secondary VM running on B - VMDKs stored on local storage
    If host A fails, the secondary on B will take over and become primary. However, until host A comes back on-line FT will not be able to create a new secondary.
    Does this make sense?
    Hi, gs_khalsa,
    That does make sense. It isn't all that I think people were hoping it'd be. I think a lot of people were envisioning setting up 2 or 3 hosts in a small environment with no shared storage, and using FT as a share-nothing setup that would ensure a VM never goes down as long as at least one host is alive and the VM isn't corrupted. It's 99% there, but the need for shared storage (which is presumably HA storage, since we don't want the storage to be our single point of failure) for 3 little files kills the vision. FT not being able to spin up a new secondary on a 3rd host seems like it could have been done too, since FT obviously knows how; it just doesn't automatically assume it should use that 3rd host. Maybe it's more complex than it sounds to actually achieve. At least we have multiple-vCPU support now, which is a big step forward. Maybe these smaller milestones will come about in sub-version upgrades *crosses fingers*.
    By the way, I appreciate your responses; they've been very helpful in understanding what I thought the new FT was and where that idea is off target. Thank you.

  • Configuring NAS as local storage

    This is not a specific question related directly to Oracle, but I'm hoping somebody can answer or point me in the proper direction. I'm working on setting up a Disaster Recovery system for my Oracle environment. Right now my DR system is such:
    HP Proliant DL 385 (G5): 64-bit running Oracle Enterprise Linux 5 Update 2 and 10.2.0.4.0
    IoMega StorPro Center NAS: Mounted as NFS, holds all database related files (control, redo and .dbf files)
    I have everything working, but the NAS is hooked up to the network, and thus my environment requires network connectivity, which I obviously can't count on during a disaster. Is there any way to configure the NAS as local storage so that when I do not have network connectivity I can still access my files on it?
    The vendor (IoMega) was of very little help. They tell me that I can plug the NAS directly into one of the NIC cards and "discover" the NAS that way. The problem is that the discovery agent does not run on Linux and they could not tell me how to get around this.
    Anybody have some experience hooking up a NAS unit as local storage instead of NFS? I'm trying to put on my SA/Network/Storage hats as best as possible, but I have very little experience trying things like this.

    I'm thinking out loud, so bear with me.
    An NFS mount point provides an important feature in a clustered environment: file system access serialization. Frequently the underlying NAS file system has been formatted with EXT3 or some other non-cluster-aware file system; NFS performs the important locking and serialization that keeps it from being corrupted in a cluster. Please keep this in mind when designing a disaster recovery solution.
    What do you mean by "hooked to the network"? Do you mean you are using the public Internet or a corporate network?
    Are they suggesting that you establish a private, direct connection to the NAS?
    Find out how the NAS gets its network address. If it's using DHCP you will need to set up a local DHCP server and have it listen only to the NIC/NICs where the NAS is plugged in. Be sure the client NICs have addresses on the same network as the NAS unit.
    Bring up networking on the NAS NIC devices.
    The bit about "discovering" the NAS file systems has me puzzled.
    Once you figure that out, mount the NAS file systems somewhere on your system, but NOT IN THEIR PRODUCTION locations.
    Now, set up your local machine as an NFS server. Publish the mount points as NFS exports, and then have your applications use these NFS mountpoints.
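    The re-export step described above might look like the following on the local machine; the mount points and options here are hypothetical and would need to match the actual environment:

```
# /etc/exports on the local machine, re-publishing the NAS file system
# (mounted at a staging point, NOT its production location)
/mnt/nas_stage/oradata  localhost(rw,sync,no_root_squash)

# Reload the export table and mount it at the production location:
#   exportfs -ra
#   mount -t nfs localhost:/mnt/nas_stage/oradata /u02/oradata
```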

  • How to use the local storage space in OVM 3.2.1

    Hi
    We have installed OVM 3.2.1 on a server, which has a 1TB HDD and 350GB RAM, and we have installed the OVM Manager on a separate machine.
    I am able to see the server in the manager console, but how can I use the local storage space?
    Kindly help us to solve this issue.
    Regards
    Niranjan

    Hi,
    if you want to import an ISO image, why do you mount it?
    After you complete your installation:
    1- Discover the server. (If your installation finished correctly, you will find your server under Servers and VMs / Unassigned Servers.)
    2- Go to Perspective / Physical Disks to verify that the second partition is detected.
    3- If it is, create a pool with no clustered server pool.
    4- Create a repository on your physical disk.
    5- Go to ISOs in your repository and click Import ISO image.
    Now, verify whether your apache/httpd server is working:
    1- Install apache/httpd.
    2- Verify whether it is running (pgrep httpd); if not, start it (e.g. service httpd start).
    3- Copy the ISO file under /var/www/html.
    4- Open your browser and verify.
    At this point it should be good. Then copy the URL to import the ISO file with Oracle VM Manager.
    The URL looks like http://IP-Address/image-iso.iso
    You should verify that the IP address is reachable from the Oracle VM Server, and that's it.
    I tested it, and in my case I copied a Windows ISO image:
    cp ../windowsXP.iso /var/www/html/
    and now my URL is http://IP-Address/windowsXP.iso
    I hope this can help you
    Best Regards

  • Need information on "Flash local storage" please

    Does anyone know about "Flash local storage"? Where is it? What sort of data can be stored there? How do you see it and delete it?
    Friend wanted me to check out Pandora and reading through their FAQs I found this interesting tidbit.
    Note that Flash Local Storage, which holds your Pandora account and log-in information, is shared globally across all user accounts and browsers on a computer. This means that even if someone visits Pandora.com in another browser... the last account left signed in to Pandora will appear the next time anyone visits Pandora.com on that computer.
    It got me wondering what other data is being stashed into this "storage" without my knowledge.


  • How to display MICR font in XMLP where APPS is running on WINDOWS Server?

    Hi,
    I'm working on check printing reports with XMLP. In the output I need to print the check number on the last line of the page in MICR font. To do this I have followed the steps below...
    1. Installed the font (MICRe13b5.ttf) in local machine.
    2. Also placed this font in "\apps\ued02\applmgr\common\util\java\1.4\j2sdk1.4.2_04\lib\fonts" folder on the WINDOWS 2003 SERVER.
    3. We placed "xdo.cfg" in "\apps\ued02\applmgr\common\util\java\1.4\j2sdk1.4.2_04\lib\fonts" and "\apps\ued02\applmgr\common\util\java\1.4\j2sdk1.4.2_04\lib" folders.
    We made an entry for the MICR font in the xdo.cfg file.
    When we run the report it's showing the MICR Line in NORMAL FONT (ARIAL).
    Can someone help me display the MICR font when Oracle Apps is running on a Windows server?

    The xdo.cfg contains below info.
    <config version="1.0.0" xmlns="http://xmlns.oracle.com/oxp/config/">
    <!-- Font setting -->
    <fonts>
    <font family="MICRe13b" style="normal" weight="normal">
    <truetype path="\apps\ued02\applmgr\common\util\java\1.4\j2sdk1.4.2_04\jre\lib\fonts\MICRe13b5.ttf"/>
    </font>
    </fonts>
    </config>
