Bat file problems with 5.2(3)

I'm having BAT problems on CCM 4.2(3) with BAT version 5.2(3). When adding phones/users I get error number -2146828279 with the message "Description - Subscript out of range".

Hi Denis,
Just wanted to know: does the description contain a comma? There was a bug in a previous BAT version that would affect phone/user inserts if the description contained a comma.
CSCsb61425 Bug Details
Headline BAT insert fails if phone description contains comma
This was in BAT 5.1.4
Hope this helps!
Rob
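As an aside, BAT imports are driven by CSV files, and if the bug Rob mentions is the cause, an unquoted comma in the description shifts every later column, which is consistent with a "Subscript out of range" error. A quick Python illustration of why this happens (the record layout here is made up for the example):

```python
import csv
import io

# Hypothetical three-column phone record; the middle field contains a comma.
row = ["SEP0011223344", "Floor 2, Room 5", "1001"]

buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_MINIMAL).writerow(row)
print(buf.getvalue().strip())   # SEP0011223344,"Floor 2, Room 5",1001

# A parser that naively splits on commas sees four fields instead of three,
# so every column after the description is off by one:
naive = buf.getvalue().strip().replace('"', "").split(",")
print(len(naive))               # 4
```

The `csv` writer quotes the field containing the delimiter; a tool that splits on raw commas without honoring the quoting (as the buggy BAT version apparently did) miscounts the columns.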

Similar Messages

  • Anyone know if the long standing duplicate files problem with File History has been fixed yet?

    There are loads of public threads about the duplicate files problem
    with Windows 8/8.1 File History backup system.
    From all the threads I've looked at, there seems to be no solution,
    and no acknowledgement from MS that they can even repro the problem
    (which surprises me considerably as there are so many people who
    notice the problem).
    Is anyone aware of whether MS (may) have fixed this for Win10?
    Are MS aware of the problem and if they're able to fix it?
    Dave - with many GB of duplicated files in File History :)

    Hmm, is that the attitude MS would recommend? :)
    Why would I care what Microsoft would recommend?
    Clearly you don't, and you appear to have missed my smiley. Calm down
    Noel, many of us are as annoyed by aspects of modern Windows as you
    are. :)
I'm all about making Windows actually WORK.
    Aren't we all? Windows is software I use too many hours every day, I
    along with many millions of others need it to work really well. You
    are not alone.
When they implement something that doesn't work (and even if it did work, doesn't do what's needed), and beyond that they remove features people DO need (such as the GUI for Windows Backup), I see no wrong in advising people of the way things really are.
    File History essentially does work - it's saved me a couple of times
    in the past couple of weeks. It just has a highly annoying habit of
    creating 100% duplicates of some files for no apparent reason. If MS
    have fixed that I won't have any known complaints about it.
    If you don't like it, you don't have to use it. I generally like it, I
just want to see that it's fixed.
    Dave

  • With conversion to Leopard, file problems with networked Windows computer

    Last night I did an Archive & Install from Tiger to Leopard on my Intel MacBook Pro. Today, I had trouble finding the other computers at my office. Once I finally got them to show up, I opened a Word file found on another computer, made some changes, and when I tried to save it, I got this message: "This is not a valid file name. Try one or more of the following: *Check the path to make sure it was typed correctly. *Select a file from the list of files and folders." Since this file already existed and I wasn't changing the name, I thought this was odd, but I changed the name from "Seating Chart 3-8-08" to "SeatingChart3-8-08" in case Leopard didn't like spaces when talking to Windows, but I got the same error message. Finally I gave up, not knowing what to do, then discovered that it had in fact saved my file. Still, every time I try to save ANY Word document from the shared folder of the Windows computer, I get the same error message endlessly until I choose "Don't Save."
    When I try to open an Excel file from that computer, it won't even open; it says " 'File Name.xls' cannot be accessed. This file may be Read-Only, or you may be trying to access a Read-Only location. Or, the server the document is stored on may not be responding." As with the Word file problem above, I did not have any problem accessing the files until I converted to Leopard.
    The Windows machine is Windows XP using Microsoft Office 2003; I have Microsoft Office 2004 on my machine.

See if this link, by Pondini, offers any insight into your issue...
Transfer from Old to New
http://pondini.org/OSX/Setup.html
Also, see here if you haven't already...
http://www.apple.com/support/switch101/ (Switching from PC)

  • Airport Extreme, Airdisk, and Vista - large file problem with a twist

    Hi all,
    I'm having a problem moving large-ish files to an external drive attached to my Airport Extreme SOMETIMES. Let me explain.
My system: MacBook Pro, 4 GB RAM, 10 GB free HD space on the MacBook, running the latest updates for Mac and Vista. The external hard drive on the AE is an internal WD in an enclosure with 25 GB free (it's formatted for PC, but I've used it directly connected to multiple computers, Mac and PC, without fail). The AE is using firmware 7.3.2, and I'm only allowing 802.11n. The problem occurs on the Vista side; I haven't checked the Mac side yet.
The Good - I have BitTorrent set up, using uTorrent, to automatically copy files over to my AirDisk once they've completed. This works flawlessly. If I connect the hard drive directly to my laptop (MacBook running Boot Camp with Vista, all updates applied), large files copy over without a hitch as well.
The Bad - For the past couple of weeks (could be longer, but I've only just noticed it being a problem; is that a firmware clue?), if I download the files to my Vista desktop and copy them over manually, the copy just sits for a while and eventually fails with a "try again" error. If I try to copy any file over 300 MB, the same error occurs.
What I've tried - not a lot, honestly. The first thing I did was make sure my hard drive was error-free and worked when physically connected; it is, and it did. I've read a few posts about formatting the drive for Mac, but that really isn't a good option for me since all of my music is on this drive and iTunes pulls from it (which also works without a problem). I do, however, get the hang in iTunes if I try to import large files (movies). I've also read about going back to an earlier AE firmware, but those posts were outdated. I can try the Mac side and see if large files move over, but again, I prefer to do this in Windows Vista.
This is my first post, so I'm sure I'm leaving out vital info. If anyone wants to take a stab at helping, I'd love to discuss. Thanks in advance.

    Hello,
Just noticed the other day that I am having the same problem. I have two Vista machines attached to a TC with a Western Digital 500 GB drive connected via USB. I can write to the TC (any file size) with no problem; however, I cannot write larger files to the attached WD drive. I can write smaller folders of music and such, but I cannot back up larger video files. Yet if I directly attach the drive to my laptop, I can copy over the files, no problem. I could not find any setting in the AirPort Utility regarding file size limits or anything of the like. Any help on this would be much appreciated.

  • File problem with accent / CS4 Mac

    Hi
I'm using CS4 on Mac OS X 10.5.7.
I have a problem with filenames containing accents like 'é'.
For example: I browse for a file called 'testé.xml' using kOpenFileDialogBoss, and then get the resulting IDFile. From this IDFile I get the path in a PMString with the function FileUtils::SysFileToPMString.
My problem is that the character 'é' has a strange representation: instead of the correct path I get 'teste<0301>.xml', and if I try to use this path with fopen, or even show it in a TextEditBoxWidget, it fails.
I've looked into the SnpChooseFile sample and the problem is the same. In the log I also get that strange representation.
    Have you encountered this problem?
    Thanks for your help

Have a look at FileUtils::DecomposeUnicode and NormalizeUnComposedUnicodeChars. Use the former to convert before calling e.g. FileUtils::PMStringToSysFile(), and the latter after e.g. FileUtils::IDFileToPMString().
    HTH
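Background on why this happens (my addition, not part of the reply above): HFS+ stores filenames in decomposed Unicode (NFD), so 'é' comes back as a plain 'e' followed by the combining acute accent U+0301, which is exactly the <0301> the log shows. A small Python sketch of the two forms:

```python
import unicodedata

name = "testé.xml"                        # 'é' as one precomposed code point (NFC)
decomposed = unicodedata.normalize("NFD", name)

print(len(name), len(decomposed))         # 9 10
print([hex(ord(c)) for c in decomposed if ord(c) > 127])
# ['0x301']  -- the combining acute accent that shows up as <0301> in the log

# Converting back to the precomposed (NFC) form fixes display and comparison:
assert unicodedata.normalize("NFC", decomposed) == name
```

The InDesign SDK helpers in the reply do the same round-trip between the composed and decomposed forms.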

  • RAW Files problems with 5DMarkIII

I recently started shooting RAW on my new 5D Mark III, but every time I try to transfer those files to Aperture 3 it crashes, and no file can be transferred to my computer. The app is up to date and the camera is on the latest firmware. I have no problems with the JPG files, but the RAW files crash Aperture and I can't get them onto my computer!

Thanks! And I apologize if my English is not good enough.
No problem, your English is probably better than mine!
Update: in the last two hours Aperture could transfer a couple of files, but I had to do it one by one. Then I attempted to transfer a whole group of files, and it crashed again.
It looks like just one of the images on your card is defective. Try to import without rendering previews: turn off the preference "Previews > New projects automatically generate previews" before importing.
Then create the previews one by one after import. Aperture will probably crash when you try to render a preview for the defective image, but then you will know which image to remove.
    Regards
    Léonie

  • H.264 .mov files problem with 2-pass vbr in DVD SP4

    I have converted some VHS tapes by connecting the VCR to my camcorder (as a AC/DC bridge) and the camcorder to my macbook pro via firewire, and record using FCP into .mov files.
However, these files are huge (13 GB for an hour), so I shrank them down by exporting them (using QT Pro) to .mov files via the H.264 codec.
However, when I try to convert these already-shrunk .mov files into MPEG-2 files via Compressor 2 (so that I can make a DVD in DVD SP), the files become corrupt when I use 2-pass VBR (frames freeze on playback in DVD SP and QT Pro). It's OK with 1-pass VBR, though.
If I convert those huge .mov files via MPEG-4 in QT Pro rather than H.264, there is no problem with 2-pass VBR.
So my dilemma is: shall I store the files using MPEG-4 or H.264?
I am under the impression that H.264 gives better quality than MPEG-4 for the same file size, but 2-pass VBR will give better quality than 1-pass VBR if I want to burn to DVD.
    Thanks in advance.

In general, recompressing from VHS to your computer to H.264 (or MPEG-4) and then to m2v is going to hurt quality, so if you can get extra hard drives and store the originals that way, or even an inexpensive camera with FireWire in/out, you are probably better off (though with VHS it may be hard to tell regardless).
Also, if you go straight to m2v the first time, at least check to make sure it looks okay to you before committing to that workflow.
That said, one-pass H.264 will in all likelihood (I have not tried recently; I did it a while ago just to test things) be better than 2-pass MPEG-4, depending on the settings. Also, one-pass H.264 seems to work with Compressor.

  • Locked file problem with WebDAV

    I am using DreamWeaver CS3 on my Macintosh at home and on a
    PC at work. I am connecting to my website via WebDAV, and for some
    reason Dreamweaver cannot edit the files on the site. When I try to
    check out any file I get an error stating: "file.html is locked and
    can be viewed but not edited" I then have the option to check it
    out or view or cancel. Regardless of what I do, if I continue and
    open the file and make changes the file will not get updated on the
    remote server even though Dreamweaver reports a successful "put" or
    "check in" command in the log file. This happens on both the
    Macintosh and the Windows PC. Both are also updated to the latest
    versions of Dreamweaver.
    The strange thing is, if I mount the WebDAV server as a
    filesystem on my macintosh and authenticate with the same
    login/password that entered in Dreamweaver's WebDAV settings, I can
    edit the html files using a text editor and they save perfectly
    fine. Additionally, while the web server is mounted on the desktop,
    if I change Dreamweaver's site settings to "local/network" and
    enter the locally mounted WebDAV share path instead of using
    Dreamweaver's built-in WebDAV client, then everything works
    properly in Dreamweaver. Reverting back to using WebDAV in
    dreamweaver (as opposed to through the OS) brings about the same
    "locked file" problem as before.
    There appears to be a problem with Dreamweaver's WebDAV
    access to the site, and I'm wondering if there's an easy fix or if
    this has been experienced by others?
    Thanks,
    -Topher


  • Olympus E-500 raw file problem with long exposures

    Hi,
This topic has come up before but wasn't resolved. Maybe it's somewhere else, but I can't find it.
    http://discussions.apple.com/thread.jspa?messageID=4994383&#4994383
Raw files from my Olympus E-500 with exposures over 3.2 s come up with overexposed-looking previews in Aperture. I think there is some automatic noise reduction (dark frame) info in these files, since they are 16.68 MB instead of the normal 13.47 MB. Apparently Aperture doesn't know what to do with these larger files.
    Is there a known way around this within Aperture? The files open fine within Olympus Master 2 and Raw Photo Processor.
    I've pasted a screen shot here with some links to raw files:
    http://www.mso.anu.edu.au/~grant/wp/2007/10/raw-problems-with-aperture/
    cheers
    Grant

Yea, I've had that issue too and figured out the DNG workaround. I filed a bug report with Apple on Nov 2nd, 2006. It sat with what looked like no attention right up until I got an e-mail on Oct 26th, 2007 (notice the date) saying that the issue was fixed in 10.5. I had already preordered Leopard, so I was hoping this was right. It wasn't. It's still borked. I logged notes back into the issue saying so.
    Perhaps if more E-500 users report this it'll get more attention.
    http://bugreport.apple.com

  • Config file problem with new Coherence 3.5

    Hi all,
I'm trying to upgrade us from Coherence 3.3.1 to 3.5, but my existing tests are failing during Coherence initialization.
    In our unit tests we're not explicitly specifying tangosol.coherence.override. That's a surprise to me but until now it didn't seem to be a problem. Under 3.3.1, we get a message in the logs that says Coherence is loading tangosol-coherence.xml from the coherence jar immediately followed by "Failed to parse the element override: ; java.io.IOException: Exception occurred during parsing: Invalid root element." Despite this, our tests pass.
    Under 3.5, we get the same message about reading tangosol-coherence.xml from the coherence jar, but then an exception from CacheFactory.getCache():
    Caused by: (Wrapped: Failed to parse the element override: ) java.io.IOException: Exception occurred during parsing: Invalid root element
         at com.tangosol.coherence.component.application.console.Coherence.loadOverrides(Coherence.CDB:129)
         at com.tangosol.coherence.component.application.console.Coherence.loadConfiguration(Coherence.CDB:65)
         at com.tangosol.coherence.component.application.console.Coherence.getServiceConfig(Coherence.CDB:14)
         at com.tangosol.coherence.component.application.console.Coherence.ensureRunningLogger(Coherence.CDB:10)
         at com.tangosol.coherence.component.application.console.Coherence.getSafeCluster(Coherence.CDB:14)
         at com.tangosol.coherence.component.application.console.Coherence.getServiceConfig(Coherence.CDB:8)
         at com.tangosol.net.CacheFactory.getServiceConfig(CacheFactory.java:1258)
         at com.tangosol.net.CacheFactory.getConfigurableCacheFactoryConfig(CacheFactory.java:1169)
         at com.tangosol.net.CacheFactory.getConfigurableCacheFactory(CacheFactory.java:594)
         at com.tangosol.net.CacheFactory.getCache(CacheFactory.java:686)
         at com.tangosol.net.CacheFactory.getCache(CacheFactory.java:664)
         at com.wellsfargo.pcs.bus.cache.BusCacheFactory.getCache(BusCacheFactory.java:28)
         at com.wellsfargo.pcs.bus.service.profile.dao.TeamMemberCacheDAOImplTest.setUp(TeamMemberCacheDAOImplTest.java:46)
    Caused by: java.io.IOException: Exception occurred during parsing: Invalid root element
         at com.tangosol.run.xml.SimpleParser.parseXml(SimpleParser.java:134)
         at com.tangosol.run.xml.SimpleParser.parseXml(SimpleParser.java:71)
         at com.tangosol.run.xml.SimpleParser.parseXml(SimpleParser.java:99)
         at com.tangosol.coherence.component.application.console.Coherence.loadOverrides(Coherence.CDB:82)
         ... 26 more
    I didn't see anything in the release notes about any changes I needed to make to my config files. Please advise.
    thanks
    john

    Hi John,
    AFAIK, no IOException is raised if the override file doesn't exist, in any Coherence version I've seen so far. You should simply see a log message saying that the override file was not found.
    Also, you only need to specify tangosol.coherence.override system property if you want to specify a custom name for the file. Otherwise, Coherence will look for tangosol-coherence-override-dev/prod.xml and tangosol-coherence-override.xml in the classpath, and try to load and apply the settings specified in it.
My guess is that you have one of these files in your classpath, but its format does not comply with the requirements. It also seems that such files used to be ignored in versions prior to 3.5, but this was recently changed to prevent the node from starting. IMHO, the change is for the better, as it forces you to either fix your override file if you need it, or delete it if you don't (alternatively, you can rename it to something else and choose whether to use it by specifying the tangosol.coherence.override system property).
The problem with a file that fails to parse but still allows the node to start is that your custom configuration settings were never applied, which means your tests were not using them.
    Regards,
    Aleks
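For reference (my addition, from memory of the Coherence docs rather than Aleks's reply): the "Invalid root element" error typically means the override file on the classpath does not start with the expected <coherence> root, e.g. because a cache configuration file was misnamed as an override. A minimal skeleton of a valid override file would look roughly like:

```xml
<?xml version="1.0"?>
<!-- tangosol-coherence-override.xml: the root element must be <coherence> -->
<coherence>
  <logging-config>
    <severity-level>5</severity-level>
  </logging-config>
</coherence>
```

The element names inside the root are examples; the point is that the root element itself must be <coherence>, not <cache-config> or anything else.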

  • Macbook Air 11" i7 corrupt files problem with Adobe Illustrator CS4

    Hi
    this problem may not be entirely related but maybe someone could help anyways or knows how I can proceed...
I'm running Illustrator CS4 on a MacBook Air 11" (latest 2011 model, with i7, 4 GB, 128 GB SSD), and lately some files are getting corrupted. Specifically, Adobe Illustrator CS4 files with weeks of work in them. Interestingly, the Time Machine backups are also getting corrupted.
It's like this: I work on a file for a few days, backing up with Time Machine as well,
close the file, and return to it after a few days… (normal behavior)
Then when I try to get back to the file, it sometimes says "Can't open the illustration, could not complete the operation" and opens a blank canvas.
Restoring the file via Time Machine gives the same error (it's like all versions get corrupted simultaneously), even 4-5 versions back, even though they worked before when the file worked.
It has happened quite a few times, ruining weeks of work.
    I tried the following:
reinstalling CS4 - didn't work
opening the files in CS5 - they still don't open
doing the file-recovery thing and then opening the file in a text editor - didn't work at all
using Time Machine - the files are still corrupt
It only happens with AI files, not Photoshop files, for example.
I thought maybe it has something to do with the SSD, since my previous MacBook Pro (15", 2007) never had these errors…
    please help
    Eido

If anything, this would be an I/O performance and bandwidth issue with Time Machine, but I don't think that's the problem. More to the point, I think it's a case of Preview opening the files to generate thumbnails and icons and something going wrong there. I'd look up some guides on how to edit the relevant plist and conf files to exclude your AIs, or turn off the behavior of generating these previews globally. This may help hugely...
    Mylenium

XML File - Problem with naming tags with XI standard functions

    Hello,
    simple (?) problem:
The receiver expects an XML file via the file adapter.
(The source data comes from an RFC connection to XI.)
In the XML, tags like these are expected (following W3C definitions):
A)   </gdt:ValueGroup>
How do I define a data type with a ":" in its name, or convert it to the required tag?
    B)   <gdt:ExtendedAttribute gdt:guid="4c102d6b077de7c1f0e27391e40bb80f" gdt:code="X01" >GR2 LI3</gdt:ExtendedAttribute>
This one is a real nice one. The part with "gdt:guid=" within the tag is variable. How do I add such values to a tag?
Any ideas whether this is possible with the standard functions, and how?
If this is the limit of XI, I think we need an XSLT or Java mapping program to generate the required file.
    Thank you for any help!
    Best regards
    Dirk

    Hi Dirk,
    just one thing:
>>>>The part with "gdt:guid=" within the tag is variable.
This is OK, as it's just an attribute
of the ExtendedAttribute tag,
so it can be variable without any problems
(you can fill it in the mapping, for example).
    Regards,
    michal
    <a href="/people/michal.krawczyk2/blog/2005/06/28/xipi-faq-frequently-asked-questions"><b>XI / PI FAQ - Frequently Asked Questions</b></a>
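Outside XI's graphical tools, the general XML point is worth making explicit: "gdt" is a namespace prefix, so the colon never needs to be part of a data-type or tag name; you declare the namespace once and the serializer emits the prefix for both tags and attributes. A sketch with Python's ElementTree (the namespace URI here is made up; the real one comes from the GDT schema):

```python
import xml.etree.ElementTree as ET

# Hypothetical namespace URI standing in for the real gdt schema namespace.
GDT = "http://example.com/gdt"
ET.register_namespace("gdt", GDT)

# Tag and attribute names use {uri}local form; the prefix appears on output.
root = ET.Element(f"{{{GDT}}}ValueGroup")
attr = ET.SubElement(root, f"{{{GDT}}}ExtendedAttribute", {
    f"{{{GDT}}}guid": "4c102d6b077de7c1f0e27391e40bb80f",  # variable value, per the thread
    f"{{{GDT}}}code": "X01",
})
attr.text = "GR2 LI3"

print(ET.tostring(root, encoding="unicode"))
```

This prints the namespaced tags with gdt: prefixes, matching the shape the receiver expects; in XI itself the equivalent would be an XSLT or Java mapping as Dirk suspects.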

  • Can't assemble  my .fla with other files / problem with flashExport.jsfl

Hello, I'm trying to start a new Flash project. I'm following a tutorial on the web, but Flash Builder can't assemble my .fla file in the correct default folder (user/documents/flash builder 4/...).
So I get compile problems…
Thanks in advance for your answer.
    MAnu

    Hello Sunil,
    Yes I'm working with Flash pro CS5.
    Tks for your interest
    Manu

  • Splitting 1GB Files // Problem with FileStream class

Hi, in my AIR (2 beta) app I'm splitting large files to upload them in smaller chunks.
Everything works fine until I choose files larger than 1 GB.
This might be the problem:
var newFile:File = File.desktopDirectory.resolvePath(filename);
trace(newFile.size);
// 8632723886 (about 8 GB, the correct file size)
BUT if I use the FileStream class instead:
var stream:FileStream = new FileStream();
stream.open(new File(filename), FileMode.READ);
trace(stream.bytesAvailable);
// 42789294 ("wrong" file size?)
If I run the same code with files smaller than 1 GB, stream.bytesAvailable returns the same result as newFile.size.
Is there a limitation in the FileStream class, or is my code wrong?
Thanks!

Use the asynchronous file-handling method, i.e. (FileStream object).openAsync(file, FileMode.READ). Here is the implementation (with the missing closing braces restored):
private var fileCounter:int = 0;
private var bytesLoaded:int = 0;
private var filePath:String = "D:\\folder\\";
private var fileName:String = "huge_file";
private var fileExtension:String = ".mkv";
private var file:File = new File(filePath+fileName+fileExtension);
// split size = 1 GB
private var splitSize:int = 1024*1024*1024;
private var fs:FileStream = new FileStream();
private var newfs:FileStream = new FileStream();
private var byteArray:ByteArray = new ByteArray();
private function init():void{
     fs.addEventListener(Event.COMPLETE,onFsComplete);
     fs.addEventListener(ProgressEvent.PROGRESS,onFsProgress);
     newfs.open(new File(filePath+fileName+fileCounter+fileExtension),FileMode.WRITE);
     fs.openAsync(new File(filePath+fileName+fileExtension),FileMode.READ);
}
private function onFsComplete(e:Event=null):void{
     fs.readBytes(byteArray,0,fs.bytesAvailable);
     newfs.writeBytes(byteArray,0,Math.min(splitSize-bytesLoaded,fs.bytesAvailable));
     for(var i:int = 0; i < byteArray.length; i+=splitSize){
          newfs.close();
          newfs.open(new File(filePath+fileName+fileCounter+fileExtension),FileMode.WRITE);
          newfs.writeBytes(byteArray,i,Math.min(splitSize,byteArray.length-i));
          fileCounter++;
          trace("Part " + fileCounter + " Complete");
     }
}
private function onFsProgress(e:ProgressEvent):void{
     if((bytesLoaded+fs.bytesAvailable)==file.size){
          onFsComplete();
     }
     else if((bytesLoaded + fs.bytesAvailable)>=splitSize){
          fs.readBytes(byteArray,0,splitSize-bytesLoaded);
          newfs.writeBytes(byteArray,0,byteArray.length);
          newfs.close();
          bytesLoaded = fs.bytesAvailable;
          fs.readBytes(byteArray,0,bytesLoaded);
          fileCounter++;
          newfs.open(new File(filePath+fileName+fileCounter+fileExtension),FileMode.WRITE);
          newfs.writeBytes(byteArray,0,byteArray.length);
          byteArray.clear();
          trace("Part " + fileCounter + " Complete");
     }
     else{
          bytesLoaded+=fs.bytesAvailable;
          fs.readBytes(byteArray,0,fs.bytesAvailable);
          newfs.writeBytes(byteArray,0,byteArray.length);
          byteArray.clear();
     }
}
cheers!

  • Filter to modify html files problem with SC_NOT_MODIFIED

    Hello !
I am trying to create a filter that modifies HTML files before they are sent to the client. I created a ResponseWrapper using CharArrayWriter, so I can modify the cached output and later send it to the client.
This code works fine, BUT the browser caches the HTML files and sends a conditional request for the changed document. Then there is a BIG problem, because chain.doFilter returns response status SC_NOT_MODIFIED and no output is sent to the client, so there is nothing to modify.
My question is how to make chain.doFilter return the requested HTML document each time and disable the SC_NOT_MODIFIED response.
    Regards,
    Kamil

Hi,
Not only "prn" but also terms like "AUX" and "NUL": you can't create any directories or files with these names in Windows, because they are reserved names meant for certain device drivers...
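Returning to Kamil's original question: a common approach is to strip the conditional request headers (If-Modified-Since, If-None-Match) before invoking the chain, so the downstream resource always produces a full 200 response for the filter to rewrite; in servlet terms that means wrapping the request in an HttpServletRequestWrapper that hides those headers. Here is the same idea as a minimal Python WSGI sketch (an illustration of the technique, not servlet code):

```python
def strip_conditional_headers(app):
    """WSGI middleware: remove conditional headers from the request so the
    wrapped app never answers 304 Not Modified and always emits a full body."""
    def middleware(environ, start_response):
        environ.pop("HTTP_IF_MODIFIED_SINCE", None)
        environ.pop("HTTP_IF_NONE_MATCH", None)
        return app(environ, start_response)
    return middleware
```

With the conditional headers gone, the inner handler cannot take the 304 shortcut, so the modifying filter always has a body to work on; the cost is that clients re-download the full document on every request.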
