Director 12 update

Is there a link available for the Director 12 update mentioned here?
http://www.prweb.com/releases/2013/8/prweb10987731.htm
No updates are appearing via the Adobe Application Manager.
Thank you.

Would love to see this update. Have you had any luck? Any idea where I can get the update?

Similar Messages

  • Facebook Contacts Profile Pics not updating on Incredible

    Just like the title says: after syncing everything with Facebook to my current contacts (People) list, the current Facebook display picture appears. However, if one of my contacts changes their Facebook display picture, it won't change in the Contacts (People) list to the new, current display pic. Even if you go into the online directories and update the Facebook list, you get the current display picture in that list, but it doesn't transfer over to the linked phone contact display.
    Any ideas what I am doing wrong or how to correct this?

    I am experiencing the same thing, and it's bugging the heck out of me as well.   I have tried everything and it just doesn't add up.  I know that People is syncing with FB as I am seeing status updates, but I have at least 5 friends/contacts whose photos are not updated.  WEIRD if you ask me.
    I have synced, unsynced, downloaded the latest FB widget, signed out of Facebook, signed out of FB for HTC Sense, shut off, turned on... YOU NAME IT!
    Will sit tight until someone has a fix.

  • Code Signing a Director 12 App for the AppStore

    I have seen a few discussions on this topic; signing and submitting to the App Store, while full of challenges, was possible with Director 11.
    With Director 12, we have been unable to code sign the projector.
    We use the Terminal to do it:
    codesign -f -v -s "3rd Party Mac Developer Application: Developer's Name" [path to .app]
    We get the following error:
    /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/codesign_allocate: the __LINKEDIT segment does not cover the end of the file (can't be processed) in: /Users/OurApp.app/Contents/MacOS/OurApp
    /Users/OurApp.app: code failed to satisfy specified code requirement(s)
    Any ideas would be welcome!
    Additionally:
    I have just discovered that unless Director is updated, any submissions to the Apple App Store will be refused because of the use of the QuickTime APIs:
    Deprecated API Usage - Apple no longer accepts submissions of apps that use QuickTime APIs.

    There isn't a way to save as an earlier version, but my article from 13 years ago still holds true:
    http://www.director-online.com/buildArticle.php?id=1034
    The article tells you how to find two copies of a particular pair of numbers. For D7 those numbers would be 057E, for D8 0640, and for D8.5 073A. Changing the two places where those numbers appear will make the DIR open in older versions of Director.
    The numbers for D11.5 are 0782, and for D12 they are 079F. So, track down the two places where 079F appears (which are 18 positions apart) and change them to 0782, and you'll be able to open the file in D11.5.
    One change since I wrote that article is that the identification sequence is now the Windows one on Mac too, most likely because it's an Intel app now. So the byte sequence to look for to get close to those two numbers is 46 43 52 44. In the file I checked there were three places where those bytes appeared, and it's the last of those places that is followed by the two version numbers, about 8 and 26 bytes later.
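    The search-and-replace described above can be sketched as a small script. The marker bytes and version words come from the post; the 40-byte search window, the offsets, and the function name are assumptions, so verify against your own file in a hex editor before writing changes back.

```python
MARKER = bytes.fromhex("46435244")  # identification sequence from the post
D12 = bytes.fromhex("079F")         # version word Director 12 writes
D115 = bytes.fromhex("0782")        # version word for Director 11.5

def downgrade_dir(data: bytes) -> bytes:
    """Replace the two D12 version words following the last marker."""
    pos = data.rfind(MARKER)        # the post says the *last* occurrence matters
    if pos == -1:
        raise ValueError("marker 46 43 52 44 not found")
    buf = bytearray(data)
    # The two version words sit roughly 8 and 26 bytes past the marker,
    # 18 bytes apart; scan a small window rather than hard-coding offsets.
    hits = []
    i = buf.find(D12, pos, pos + 40)
    while i != -1:
        hits.append(i)
        i = buf.find(D12, i + 2, pos + 40)
    if len(hits) != 2:
        raise ValueError("expected exactly two version words after the marker")
    for h in hits:
        buf[h:h + 2] = D115
    return bytes(buf)
```

    Work on a copy of the DIR file, and keep the original around in case the heuristic matches the wrong bytes.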

  • Can you Manipulate 3D object in Director Dynamically?

    Greetings,
    I am looking for some way to dynamically manipulate a 3d
    object and am wondering if Director 3d has the ability to do this
    and if so, if any of you want to take this on as a freelance
    project? (this is a serious request on a cool project, not some
    hokey offer)
    I had first hoped that Flash could do it since I am more
    familiar with flash, but I was wondering if Director can update a
    3d object on the fly as it gathers new values for a number of
    points in the 3d object. Can Director do this?
    If you know of any person, any business that knows how to do
    this PLEASE contact me to discuss more. It has been difficult for
    me to understand the capabilities of Director 3D without speaking to
    someone who is talented and competent as I am sure most of you are.
    Cheers,
    -i

    Model position, model rotation, model location, model scale,
    model shader, etc. are all easily modified. The actual mesh that
    makes up the model's shape can be changed with the meshdeform
    modifier. However, these changes won't be permanent: the entire 3D
    scene will be in its original state every time the movie starts. If
    you want to save state, you'll have to store the data defining the
    attributes externally in some way and apply them at runtime.
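    The save-state idea in that reply is language-agnostic. Here is a minimal sketch in Python (Director itself would use Lingo), treating each model as a dictionary of the transform attributes named above; the attribute names and the JSON file format are illustrative assumptions, not Director APIs.

```python
import json

def save_state(models: dict, path: str) -> None:
    """Write the mutable model attributes out to a JSON file."""
    state = {name: {attr: m[attr] for attr in ("position", "rotation", "scale")}
             for name, m in models.items()}
    with open(path, "w") as f:
        json.dump(state, f)

def restore_state(models: dict, path: str) -> None:
    """Reapply stored attributes at startup, after the scene resets."""
    with open(path) as f:
        state = json.load(f)
    for name, attrs in state.items():
        if name in models:              # ignore models that no longer exist
            models[name].update(attrs)
```

    Call save_state whenever the user changes something, and restore_state from your startup handler once the 3D cast member has reset to its authored state.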

  • Updating coldfusionmx7 to mx8

    Hi,
    Can I overwrite the directories while updating from
    ColdFusion MX 7 to ColdFusion MX 8, or do I need to uninstall
    ColdFusion MX 7 before installation?
    Thank you

    You cannot overwrite the existing installation, because that
    will lead to confusion. You need not uninstall the older version
    either: you can allow the new CF8 installation to coexist with CF7. The CF8
    installation will import the CF7 settings when you log in to the
    Administrator for the first time.
    - Vinu

  • ITMS Pinged but no Updates on Podcast Episodes

    I've produced 7 episodes for this podcast. So far only 6 show up on my ITMS page.
    http://pdxcabbie.podbean.com/
    Other podcast directories are updating:
    http://www.vicasting.com/podcast/10332/pdxcabbie.aspx
    http://www.podcastalley.com/podcastdetails.php?podid=52385
    http://www.podcastingnews.com/details/pdxcabbie.podbean.com/feed/view.htm
    I've pinged the ITMS several times over the last 18 or so hours, and yet the situation hasn't changed.
    https://phobos.apple.com/WebObjects/MZFinance.woa/wa/pingPodcast?id=265594029
    The feed is reportedly good as per Feedvalidator
    http://pdxcabbie.podbean.com/feed

    I see now that this may be a problem with the ITMS and not an issue of corrupted metadata or broken feeds.
    It seems that the "This American Life" podcast is at least three days behind in showing the current episode on the podcast page. As per the page, the current episode is titled "Mapping" and is dated 10/21/07. My latest download of that podcast is from 10/28/07, titled "In Dog We Trust."
    http://phobos.apple.com/WebObjects/MZStore.woa/wa/viewPodcast?id=201671138
    My point is that if the ITMS isn't showing the most current episode for their most popular podcast, there is definitely something wrong with the store.
    My question now would be: is this addressed somewhere on the Apple site, or is it simply ignored and left to be corrected without any declaration of cause?

  • How to set up local server to use a remote server for login authentication?

    Thank you in advance for any help you can offer.
    We are trying to set up a "sub-network" (don't know if this is the right terminology) using a 10.4 Server OS to manage a set of clients... the trick is that the client login/home directory information is on a different remote server, and shall remain there, for the most part.
    To make it easy to understand here's the environment:
    *Local Server:* 10.4 G4 Server Quicksilver 1G dual--we have total control of this one
    *Main/remote server:* 10.5 Xserve, don't know which vintage. We have very little input on this machine; we are effectively at the mercy of this system's sysadmin, who is very conservative about changing anything (hence the need for a separate server to install applications and client machine-specific profiles, etc., since the Xserve admin refuses to do it). This serves MacBooks/MacBook Pros and a few iMacs. (No Windows PCs, as that group of computers has its own server.)
    *Clients:* ~20 eMacs/iBooks, all running 10.4.
    *Use environment:* elementary school, so very low network demand (no e-mail; just running local apps linking to the server(s) for licensing and login, saving some small files on the remote server, user preferences, etc.).
    The remote server (the Xserve) has all the login authentication, as well as the home directories. Every school year the directories get updated as new students enroll and old students graduate. Currently all the clients are linked directly to the Xserve via LDAP while we bring the local server online.
    The local server (our G4 Quicksilver) will have a few network applications that will support the client machines. We will also be setting up computer accounts and groups for our clients so that we can properly set their environments (the Xserve admin will not do this on the Xserve, so currently all the clients connect to the server as a "guest computer", from what little I understand watching what was done).
    Now, what is the best way to approach this type of setup with minimal "inconvenience" to the Xserve admin?
    I am pretty experienced with standalone UNIX and Mac OS X administration, but a novice to this whole server and network setup thing. Any suggestions, instructions, or pointers to URLs with how-tos are much appreciated. I am not afraid to use Terminal (I grew up on UNIX before GUIs), and I'm willing to try safe but unconventional setups if that is what's needed...
    thanks for any help!

    Oh, never mind... I figured it out myself; it helps to read up on the manuals. D'oh. Sorry for the bandwidth waste...

  • Rebranding

    Hi All,
    We are planning to rebrand our company, i.e. change the company name. Initially we are identifying the customer-facing documents which would need a name change. I would need your help and suggestions in identifying the customer-facing documents in the OTC cycle and the P2P cycle.
    Regards,
    Ajai

    To add a new domain as the primary one, use the Email Address Policies and make the new one the default.
    Auto-reply: you could use transport rules for this. However, as users will receive mail on both addresses and reply only using the new one, the old domain will fade gradually.
    For external access you will just need to update the external URL on the Exchange 2013 virtual directories. Update the public DNS to point the new URL/domain to your public IP, and get a new certificate. The internal domain name does not matter.
    Regards, Vik Singh. "If this thread answered your question, please click on 'Mark as Answer'"

  • Fatal Error: The schema version of the database is from a newer version of WSUS

    Hello,
    CM2012 R2 RTM on Server2012 R2 RTM with SQL2012 SP1
    I installed SQL, WSUS, and then CM.
    When I finished adding WSUS there was a post-installation message, but I skipped it and continued to the CM installation, as suggested on some blogs. WSUS in CM seems to be a real mystery... and there are multiple workarounds and suggestions.
    Now my CM is ready to go. I configured discovery and boundaries, and all components appear green/OK. My hand shook when I clicked on the post-deployment message related to WSUS in Server Manager. So I clicked it, and received: Post Deployment Failed, with a path to a log file. The error:
    Fatal Error: The schema version of the database is from a newer version of WSUS
    than currently installed.  You must either patch your WSUS server to at least
    that version or drop the database.
    (WSUS was installed with the SQL DB.)
    What do you suggest?
    Thanks.
    Please see full log:
    2013-12-01 06:58:24  Postinstall started
    2013-12-01 06:58:24  Detected role services: Api, Database, UI, Services
    2013-12-01 06:58:24  Start: LoadSettingsFromXml
    2013-12-01 06:58:24  Start: GetConfigValue with filename=UpdateServices-Services.xml item=ContentLocal
    2013-12-01 06:58:24  Value is true
    2013-12-01 06:58:24  End: GetConfigValue
    2013-12-01 06:58:24  Start: GetConfigValue with filename=UpdateServices-Services.xml item=ContentDirectory
    2013-12-01 06:58:24  Value is D:\sources\wsus
    2013-12-01 06:58:24  End: GetConfigValue
    2013-12-01 06:58:24  Content directory is D:\sources\wsus
    2013-12-01 06:58:24  Start: GetConfigValue with filename=UpdateServices-DB.xml item=InstanceName
    2013-12-01 06:58:24  Value is confman
    2013-12-01 06:58:24  End: GetConfigValue
    2013-12-01 06:58:24  SQL instance name is confman
    2013-12-01 06:58:24  End: LoadSettingsFromXml
    Post install is starting
    2013-12-01 06:58:24  Start: Run
    2013-12-01 06:58:24  Fetching WsusAdministratorsSid from registry store
    2013-12-01 06:58:24  Value is S-1-5-21-1033354796-2088831985-1429053453-1003
    2013-12-01 06:58:24  Fetching WsusReportersSid from registry store
    2013-12-01 06:58:24  Value is S-1-5-21-1033354796-2088831985-1429053453-1004
    2013-12-01 06:58:25  Configuring content directory...
    2013-12-01 06:58:25  Configuring groups...
    2013-12-01 06:58:26  Starting group configuration for WSUS Administrators...
    2013-12-01 06:58:26  Found group in regsitry, attempting to use it...
    2013-12-01 06:58:28  Writing group to registry...
    2013-12-01 06:58:28  Finished group creation
    2013-12-01 06:58:28  Starting group configuration for WSUS Reporters...
    2013-12-01 06:58:28  Found group in regsitry, attempting to use it...
    2013-12-01 06:58:28  Writing group to registry...
    2013-12-01 06:58:28  Finished group creation
    2013-12-01 06:58:28  Configuring permissions...
    2013-12-01 06:58:28  Fetching content directory...
    2013-12-01 06:58:28  Fetching ContentDir from registry store
    2013-12-01 06:58:28  Value is D:\sources\wsus
    2013-12-01 06:58:28  Fetching group SIDs...
    2013-12-01 06:58:28  Fetching WsusAdministratorsSid from registry store
    2013-12-01 06:58:28  Value is S-1-5-21-1033354796-2088831985-1429053453-1003
    2013-12-01 06:58:28  Fetching WsusReportersSid from registry store
    2013-12-01 06:58:28  Value is S-1-5-21-1033354796-2088831985-1429053453-1004
    2013-12-01 06:58:28  Creating group principals...
    2013-12-01 06:58:28  Granting directory permissions...
    2013-12-01 06:58:28  Granting permissions on content directory...
    2013-12-01 06:58:29  Granting registry permissions...
    2013-12-01 06:58:29  Granting registry permissions...
    2013-12-01 06:58:29  Granting registry permissions...
    2013-12-01 06:58:29  Configuring shares...
    2013-12-01 06:58:29  Configuring network shares...
    2013-12-01 06:58:29  Fetching content directory...
    2013-12-01 06:58:29  Fetching ContentDir from registry store
    2013-12-01 06:58:29  Value is D:\sources\wsus
    2013-12-01 06:58:29  Fetching WSUS admin SID...
    2013-12-01 06:58:29  Fetching WsusAdministratorsSid from registry store
    2013-12-01 06:58:29  Value is S-1-5-21-1033354796-2088831985-1429053453-1003
    2013-12-01 06:58:29  Content directory is local, creating content shares...
    2013-12-01 06:58:29  Creating share "UpdateServicesPackages" with path "D:\sources\wsus\UpdateServicesPackages" and description "A network share to be used by client systems for collecting all software packages (usually applications)
    published on this WSUS system."
    2013-12-01 06:58:29  Deleting existing share...
    2013-12-01 06:58:29  Creating share...
    2013-12-01 06:58:29  Share successfully created
    2013-12-01 06:58:29  Creating share "WsusContent" with path "D:\sources\wsus\WsusContent" and description "A network share to be used by Local Publishing to place published content on this WSUS system."
    2013-12-01 06:58:29  Deleting existing share...
    2013-12-01 06:58:29  Creating share...
    2013-12-01 06:58:29  Share successfully created
    2013-12-01 06:58:29  Creating share "WSUSTemp" with path "C:\Program Files\Update Services\LogFiles\WSUSTemp" and description "A network share used by Local Publishing from a Remote WSUS Console Instance."
    2013-12-01 06:58:29  Deleting existing share...
    2013-12-01 06:58:29  Creating share...
    2013-12-01 06:58:29  Share successfully created
    2013-12-01 06:58:29  Finished creating content shares
    2013-12-01 06:58:29  Stopping service WSUSService
    2013-12-01 06:58:29  Stopping service W3SVC
    2013-12-01 06:58:32  Configuring database...
    2013-12-01 06:58:32  Configuring the database...
    2013-12-01 06:58:32  Establishing DB connection...
    2013-12-01 06:58:32  Checking to see if database exists...
    2013-12-01 06:58:32  Database exists
    2013-12-01 06:58:32  Switching database to single user mode...
    2013-12-01 06:58:32  Loading install type query...
    2013-12-01 06:58:32  DECLARE @currentDBVersion       int
    DECLARE @scriptMajorVersion     int = (9600)
    DECLARE @scriptMinorVersion     int = (16384)
    DECLARE @databaseMajorVersion   int
    DECLARE @databaseMinorVersion   int
    DECLARE @databaseBuildNumber    nvarchar(10)
    IF NOT EXISTS(SELECT * FROM sys.databases WHERE name='SUSDB')
    BEGIN
        SELECT 1
    END
    ELSE
    BEGIN
        SET @currentDBVersion = (SELECT SchemaVersion FROM SUSDB.dbo.tbSchemaVersion WHERE ComponentName = 'CoreDB')
        SET @databaseBuildNumber = (SELECT BuildNumber FROM SUSDB.dbo.tbSchemaVersion WHERE ComponentName = 'CoreDB')
        DECLARE @delimiterPosition INT = CHARINDEX('.', @databaseBuildNumber)
        IF (@delimiterPosition = 0)
        BEGIN
            RAISERROR('Invalid schema version number', 16, 1) with nowait
            return
        END
        SET @databaseMajorVersion = SUBSTRING(@databaseBuildNumber, 1, @delimiterPosition - 1)
        SET @databaseMinorVersion = SUBSTRING(@databaseBuildNumber, (@delimiterPosition + 1), (10 - @delimiterPosition))
        IF @currentDBVersion < 926
        BEGIN
            SELECT 3
        END
        ELSE
        BEGIN
            IF (@scriptMajorVersion > @databaseMajorVersion OR
               (@scriptMajorVersion = @databaseMajorVersion AND @scriptMinorVersion > @databaseMinorVersion))
            BEGIN
                SELECT 2
            END
            ELSE IF (@scriptMajorVersion = @databaseMajorVersion AND
                     @scriptMinorVersion = @databaseMinorVersion)
            BEGIN
                SELECT 0
            END
            ELSE
            BEGIN
                SELECT 4
            END
        END
    END
    2013-12-01 06:58:33  Install type is: UnsupportedFuture
    2013-12-01 06:58:33  DB is a higher version than the config scripts
    2013-12-01 06:58:33  Swtching DB to multi-user mode......
    2013-12-01 06:58:33  Finished setting multi-user mode
    2013-12-01 06:58:33  Microsoft.UpdateServices.Administration.CommandException: The schema version of the database is from a newer version of WSUS
    than currently installed.  You must either patch your WSUS server to at least
    that version or drop the database.
       at Microsoft.UpdateServices.Administration.ConfigureDB.CheckForUnsupportedVersion(DBInstallType installType, Boolean dbExists)
       at Microsoft.UpdateServices.Administration.ConfigureDB.ConnectToDB()
       at Microsoft.UpdateServices.Administration.ConfigureDB.Configure()
       at Microsoft.UpdateServices.Administration.PostInstall.Run()
       at Microsoft.UpdateServices.Administration.PostInstall.Execute(String[] arguments)
    Fatal Error: The schema version of the database is from a newer version of WSUS
    than currently installed.  You must either patch your WSUS server to at least
    that version or drop the database.
    "When you hit a wrong note it's the next note that makes it good or bad". Miles Davis
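    For readability, the decision the logged SQL script makes can be restated in Python. The constants mirror the values in the log, and the return codes match the script's SELECT values; the log's "Install type is: UnsupportedFuture" corresponds to code 4, the database build being newer than the script's 9600.16384.

```python
SCRIPT_MAJOR, SCRIPT_MINOR = 9600, 16384   # versions from the install script

def install_type(db_schema_version, db_build_number):
    """Return the install-type code the logged SQL script SELECTs."""
    if db_schema_version is None:
        return 1                            # SUSDB does not exist
    major_s, _, minor_s = db_build_number.partition(".")
    if not minor_s:
        raise ValueError("Invalid schema version number")
    major, minor = int(major_s), int(minor_s)
    if db_schema_version < 926:
        return 3                            # schema too old to upgrade
    if (SCRIPT_MAJOR, SCRIPT_MINOR) > (major, minor):
        return 2                            # DB older: run the upgrade
    if (SCRIPT_MAJOR, SCRIPT_MINOR) == (major, minor):
        return 0                            # versions match, nothing to do
    return 4                                # "UnsupportedFuture": DB is newer
```

    In other words, the error here is the WSUS binaries being older than the SUSDB schema, which is why the message says to patch WSUS or drop the database.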

    Gerry,
    1. I uninstalled WSUS, removed the DB, restarted, and reinstalled WSUS again. This time the WSUS MMC console could not be started. I did this after WSUS was added, following your blog pics.
    I checked C:\Program Files\Update Services; a couple of directories were missing compared with my earlier install of WSUS after SQL but before CM: Tools, Schema, and another one starting with "A..". I saw this issue on NOOB, and people copied these directories from a previous installation.
    How can we explain this behavior on the RTM version?
    Then I uninstalled it again and gave it another try. Here I found another interesting thing. Please compare slide number 5 from your link with the attached one. You can see that items are missing on the Add Roles and Features Wizard page. Some of them appeared on the next screen of my installation (I am using the RTM server), but the API and PowerShell cmdlets were not listed on the final screen before starting the install. Maybe those are not important...
    I reinstalled again... and the MMC console fails again, exactly the same thing I saw during two weeks of trying to make it work. I don't see what I could be screwing up here.
    What about the missing directories after reinstalling WSUS; is that normal? And in IIS there is no WSUS.
    I don't see anything difficult here, just a simple wizard... but it doesn't work. This is my second CM server.
    What am I doing wrong here to make the MMC fail? Is there any blog with examples of RTM Server 2012 and CM RTM? All the material covers 2012, so it's all guesses and nothing precise to follow...
    Sorry for my cry :). The only solution is to call MS and pay $250 for SUCH a trivial thing that doesn't work.
    Any help? Please see the pics. Can you confirm that you have the same or more directories in Update Services?
    Thanks.
    The API and PowerShell cmdlets feature is missing compared with your slides.
    "When you hit a wrong note it's the next note that makes it good or bad". Miles Davis

  • Exchange 2013 w/Outlook 2013 "The name of the security certificate is invalid or does not match the name of the site"

    I've completed an upgrade from Exchange 2003 to Exchange 2013, and I have one last SSL message that I can't get rid of. I've installed a 3rd-party cert that is working great for webmail and cell phone access, but for some reason the Outlook 2010/2013 clients get prompted with a security warning. I just implemented the SSL cert yesterday, and I've noticed that new installs of Outlook seem to work just fine. My Outlook 2013 client doesn't prompt me with the message, but I have other users who are still getting the "The name of the security certificate is invalid or does not match the name of the site" error. The domain in the cert error shows up as server.mydomain.local. I've gone through all the virtual directories and pointed all of my internal and external URLs to https://mail.mydomain.com. This made one of the two warnings go away, but not the second. I've dug around on Google and gone through everything I could find here, and as far as I can tell my internal and external URLs are configured properly; I can't figure out where this error is originating. Any ideas on where I should look outside of the virtual directories?
    I'm including a couple of good links I found that cover all of the virtual directories I updated. I've checked them through both CLI and GUI and everything looks good.
    http://www.mustbegeek.com/configure-external-and-internal-url-in-exchange-2013/
    http://jaworskiblog.com/2013/04/13/setting-internal-and-external-urls-in-exchange-2013/

    Hi,
    When Outlook connects to Exchange 2013/Exchange 2010, the client contacts the Autodiscover service to retrieve the Exchange service settings automatically from the server side. This feature is not available in an Exchange 2003 Outlook profile.
    Generally, when a mailbox is moved to Exchange 2013, Outlook connects to the server to update this information automatically; it takes time to detect and apply the changes on the server side. I suggest the following setting for the Autodiscover service:
    Get-ClientAccessServer | Set-ClientAccessServer -AutodiscoverServiceInternalUri https://mail.mydomain.com/autodiscover/autodiscover.xml
    Please restart IIS by running IISReset in a Command Prompt window after all configurations.
    Regards,
    Winnie Liang
    TechNet Community Support

  • Clean install OSX+S 10.9 then migrate data from 10.6 Server?

    Howdy All,
    I've had OS X Server from the first release but stopped at Snow Leopard Server, waiting for the "new Server" to mature a bit. I think it is now time to move to OS X Server 10.9, and also for a clean install to clear out the crud that has collected over all these years.
    I have two main partitions: one for the OS with the Mail store and wikis, and another for user home directories and other files. Again, this is going from Snow Leopard Server 10.6.last to 10.9 Mavericks, and then Mavericks Server.
    My question is: if I clean-install Mavericks and then Mavericks Server on another partition, will I be able to migrate data and services over manually?
    Of course, I will need to setup users and directory services again from scratch (but that is part of the plan to clean out the unused stuff that has collected over the years).  I don't mind setting up fileshares, Web sites, DNS, and similar again.
    However, when this is done will I be able to copy the mail store (i.e. all users email) over and will it be detected, updated if needed, and work?  Similarly, when I copy over the wikis, will they be detected, updated if needed, and work?
    Further, when I setup the symbolic links for home directories again (onto the other partition), will the home directories be updated (if needed) and work?  I'm not sure what could have changed with regards to home directories but you never know.
    In summary, I guess my question is about what migration only happens when you update as opposed to what happens when an older version of a resource or service is detected after an update.  Hopefully there is more of the latter than the former. 
    Thanks in advance for any thoughts on this.
    Cheers,
    Ashley.

    Out of your services (AFP/SMB, Mail, OD, SWU, and web), the one with the most potential for disaster and headache is clearly mail.  If you are using the same host device, cutting services over in pieces will not be possible.  However, here are some suggestions and potential points of concern.
    AFP/SMB file services are cake. The only thing you need to consider is the potential time to copy the data if you are moving it to new disks. The other issue will be users' GUID values and the associated ACLs. Let's take the following scenario based on what you've detailed.
    • You have data on /HDD/Shares/ and you are planning on moving them to the SSD.  Is the SSD drive large enough to accept this data? 
    • If you had a share /HDD/Shares/Data and this contained an ACL allowing the design group to have access, the design group from the 10.6.8 OD may have a different GUID than the one you create on the 10.9.1 system. If this is the case, you can purge all ACLs with a sudo chmod -R -N /path/to/data. (Server.app should remove and then add, but older versions resulted in merged messes, so I go nuclear on the old settings.) Then you can apply your new ACLs and allow the permissions to propagate.
    • If you are leaving the data where it is, you will simply need to reset permissions. However, note that if you are exporting and then importing users (via an OD backup or via standard record format), then you are maintaining GUIDs and should not need to touch any permissions.
    Regarding SWU, I would suggest looking into Caching server.  If you are moving the entire environment to 10.9 and iOS 7, SWU is no longer needed.  Caching server is easy as pie, requires no client configuration, and is more economical on your internet connection and server storage requirements.
    Web is pretty easy also. But this is dependent on what you are doing with web. If html/php/perl, then you pretty much just move your site folders and you are up and running. If you were using MySQL, note that Apple replaced it with Postgres. You can either perform a conversion from MySQL to Postgres or just install MySQL again manually. The choice is yours. If you are not doing database-backed sites, the migration should be cake.
    OD is one of those technologies that I always prefer to start clean. In really large environments this can be very tough due to passwords. You can export an OD backup from 10.6 and attempt a restore in 10.9.1. If you have a lot of MCX in 10.6.8 you may run into some trouble, as Apple deprecated MCX in 10.8 and above. However, this ensures that you have everything, from passwords to GUIDs. Test, test, and test some more if you go this route. An alternate option, especially if you are embracing the move away from MCX and to Profiles, is to do a user and group export from 10.6's Workgroup Manager. This will not provide passwords, but it will provide editable text files of your account data. You can strip out the MCX and other legacy values and then use the resulting file to import users into a clean 10.9.1 OD master. Once again, you will not get passwords unless you add them to the import file. You need to figure out how many accounts there are and how sensitive users are to password resets.
    The final piece is mail. This is the one area where I have very little experience. I've been burned by Apple's mail solutions from way back in the AppleShare IP days, and now make it policy to use anything but Apple's mail solution. In a perfect world, moving the mail data store to the new OS and triggering Server.app should be enough. But Apple + mail never seem to enter the realm of a perfect world.
    And finally, make sure DNS is correct before you do anything.  Since you are dealing with mail, you should also shut firewall port forwards to prevent new mail from coming into the server while you work on the migration.  Nothing worse than stitching mail together after a blown migration attempt.
    R-
    Apple Consultants Network
    Apple Professional Services
    Author "Mavericks Server – Foundation Services" :: Exclusively available in the Apple iBooks Store

  • Adding image BUT not overwriting field info? Or Not saving name to DB?

    I've got a database that calls the image by a link using an image path:
    /images/jewelry1.jpg
    The image field contains the full path in its thumbnail field or its full
    size directory.
    I've created an insert record behavior which allows the client to create the
    path, but I want her to be able to upload the images.
    Right now, the image upload and resize behavior demands a table field, but I
    can't use the thumbnail field or it will overwrite the
    /thumbnail/jewelry1.jpg with the image name jewelry1.jpg and I'll lose the
    path.
    How can I add images to the directories without updating the wrong
    fields? The same problem exists for the multiple file/image uploads. It not
    only asks for the images, but then wants to save the names to a database.
    Thanks

    Sorry, I'll take care that this won't happen again.
    I have two more queries. How do I store this image in a database? I'm using an Access database and defined a field as OLE Object, which allows me to store either a linked or an embedded object. Should I store the captured image directly or the buffered image?
    1) // Bind the name and image with a PreparedStatement instead of
    // concatenating them into the SQL string (which also mangles the image).
    Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
    Connection db = DriverManager.getConnection("jdbc:odbc:ds1", "", "");
    PreparedStatement st = db.prepareStatement(
            "insert into visitor (name, photo) values (?, ?)");
    st.setString(1, t1.getText());
    st.setBytes(2, imgBytes); // the image serialized to a byte[] first, e.g. via ImageIO
    int rs = st.executeUpdate();
    if (rs == 1)
        System.out.println(t1.getText());
    2) Whatever JFields I've defined allow me to enter more than the length I've defined. How do I restrict those fields to the number of characters I want?

  • Code Signing Cert for AIR and MSI

    If a Code Signing Certificate for AIR is purchased, can that same certificate be used when distributing the package using MSI?
    Or does it not matter as long as the AIR app is signed?

    No, this was a different problem that created similar symptoms.
    I just found out that, since Director 11.5, we can put the Xtras folder inside a projector. I was relying on outdated documentation, both online and in my mind, which said the xtras had to be next to the projector.
    Weirdly, putting the Xtras folder inside the Contents folder (inside the bare stub projector) solved the problem I was having: my sound was not functioning after I code signed the xtra that enables sound. Now it works fine.
    I also ran into an error when my projector's INI file set Movie01 to a Director movie in the same folder as the projector. Now I have it point to a movie in the Resources folder of the projector instead. So maybe I will just throw all my movies and supporting files into the Resources folder.
    I too am thinking of documenting the process, once I know customers are buying my app and using it successfully. Maybe I'll use screen recording to create a set of YouTube tutorials. That can spare others from this confusion and aggravation, and encourage people to buy the latest version of Director and update their old products. The more money that Adobe earns from Director, the more they will be encouraged to invest in developing Director further.
    If Apple will accept apps without receipt validation, that will certainly simplify things. I saw an Apple web page that stated it was mandatory, but that page has since changed. Maybe validation is now optional rather than required.
    For details, check this:
    https://developer.apple.com/library/mac/releasenotes/General/ValidateAppStoreReceipt/Introduction.html
    Luckily there is source code out there that can be used to handle those technical details.
    I'm wondering how you applied your set of icons to your bare stub projector. Did you simply replace the projector.icns file? I got an error when I tried that.
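    For reference, the INI change mentioned above could look something like this. The section name and movie file name here are assumptions for illustration; only the Movie01 key appears in the post:

    ```
    ; projector settings INI – hypothetical layout
    ; Movie01 now points inside the app bundle instead of next to the .app
    [Movies]
    Movie01=Contents/Resources/main.dir
    ```

    Keeping the movies and the Xtras folder inside the bundle means codesign covers them along with the projector, so nothing unsigned travels beside the .app.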

  • Default Installation on LVM is not working

    New Thread (Default installation of LVM not working):
    First I did everything exactly as the LVM guide tells me. Then I continued with the Installation Guide.  No extra options, no change from defaults. When I try to boot it, it fails.
    I tried downgrading to 2.02.103-1. No result.
    Old Thread (when I did not know that LVM was at fault):
    Current state:
    EDIT5: OK, I have boiled it down to LVM. When I copied all files from etcvol to /etc in root, it continued booting until it hit mounting /home... I have no idea what the §$%&/ is wrong.
    Everything that is not root and should get mounted by fstab from LVM dies o.O
    Even stranger: after I removed homevol and swapvol from fstab I get stuck at
    A start job is running for dev-sdb4.device
    I did a "proof of concept" installation just now and it worked out of the box.
    Parameters of the "proof of concept" installation:
    /dev/sda5 ext4 /
    /dev/sda4 EFI /boot
    Rest is default installation guide
    Gummiboot:
    title Arch Linux
    linux /vmlinuz-linux
    initrd /initramfs-linux.img
    options root=/dev/sda5 rw
    Changed Parameters of the broken installation:
    /dev/sda3 LVM on LUKS
    Partitions
    /dev/mapper/MyStorage-rootvol ext4 /
    /dev/sda4 EFI /boot
    /dev/mapper/MyStorage-swapvol swap
    /dev/mapper/MyStorage-homevol ext4 /home
    title Arch Linux (Encrypted)
    linux /vmlinuz-linux
    options initrd=/initramfs-linux.img cryptdevice=/dev/sda3:MyStorage root=UUID=$UUIDofRoot rw
    I added the HOOKS
    HOOKS="... encrypt lvm2 keymap ... filesystems ... shutdown ..."
    Fstab
    # /etc/fstab: static file system information
    # <file system> <dir> <type> <options> <dump> <pass>
    # UUID=767d41f7-afb5-4a9f-a39b-604c830654e7
    /dev/mapper/MyStorage-rootvol / ext4 rw,relatime,space_cache 0 0
    # UUID=6e00082a-4fc4-47a5-9e80-e2330428d2fe
    /dev/mapper/MyStorage-etcvol /etc ext4 rw,relatime,space_cache 0 0
    # UUID=ce066cf1-230a-48b4-b7a8-2635ef39a881
    /dev/mapper/MyStorage-homevol /home ext4 rw,relatime,space_cache 0 0
    # UUID=0C88-FA2A
    /dev/sdb4 /boot vfat rw,relatime,fmask=0022,dmask=0022,codepage=437,iocharset=iso8859-1,shortname=mixed,errors=remount-ro 0 2
    # UUID=9369aa55-d055-4fab-98ef-ccc97b3f3e05
    /dev/mapper/MyStorage-swapvol none swap defaults 0 0
    So what goes wrong? I can't provide the exact logs as the journal persistency module dies in this case. If you know a way to get the actual files please tell me.
    Everything works fine down to "Reached target Local File Systems"
    It runs some isEmpty checks on tmpfiles.d directories, some of which succeed while /etc/tmpfiles.d and /usr/local/lib/tmpfiles.d fail
    systemd-tmpfiles --create --remove --exclude-prefix=/dev gets forked
    journald gets killed by systemctl
    systemd-tmpfile SIGCHILDs and dies
    systemd-udevd gets notified and changes from start to running
    udevd-control.socket gets changed to running
    system-tmpfiles-setup exits with failure
    Failed to start Recreate Volatile Files and Directories
    systemd-update-utmp gets forked
    accepts connection on private bus
    10 times dbus request: org.freedesktop.dbus.local.disconnect() on /org/freedesktop/dbus/local
    SIGCHILD from systemctl belonged to systemd-journal-flush.service
    Failed to start Trigger Flushing of Journal to Persistent Storage
    more dbus request: org.freedesktop.dbus.local.disconnect() on /org/freedesktop/dbus/local
    systemd-update- from systemd-update-utmp SIGCHILDs and dies
    Failed to start Update UTMP about System Reboot/Shutdown
    starts system initialization sysinit.target -> active
    starts dbus.socket to listening
    starts sockets.target to active
    systemd-tmpfiles-clean.timer gets started by monotonic timer to cleanup
    dbus-daemon gets forked
    dbus.socket from listening to running
    systemd-logind gets forked
    Failed to start D-Bus System Message Bus
    Failed to start Login Service
    I had a look at higher debug levels, but they all just showed systemd starting something which either failed or in the case of D-Bus died soon after.
    When I broke into init, I could see the volumes which contained what they were supposed to contain.
    Other than that, nothing really interesting. I noticed a complaint that the AHCI driver should be replaced with a specialized one, but I don't think that's the problem.
    Thank you for your help!
    Last edited by GNA (2014-01-15 21:44:01)

    Removed the fsck filehook.
    posted the fstab.
    Boot logs will follow shortly. I just have to figure out how to get them in file form when the journal does not contain anything.
    EDIT: OK, I have no idea how to get my own boot logs, sad but true. I'll just try to add as much context as I can see, but I can't possibly write all of this down by hand.
    EDIT2: Ah no wonder when flushing of journal to persistent storage fails
    EDIT3: I'll just start decreasing the complexity one step at a time, and as it is so easy I'll start with btrfs.
    EDIT4: Changing btrfs to ext4 did not change anything.
    EDIT5: OK, I have boiled it down to LVM. When I copied all files from etcvol to /etc in root, it continued booting until it hit mounting /home... I have no idea what the §$%&/ is wrong.
    Everything that is not root and should get mounted by fstab from LVM dies o.O
    Even stranger: after I removed homevol and swapvol from fstab I get stuck at
    A start job is running for dev-sdb4.device
    EDIT6: Today I tried to make a "failsafe" system with lvm and luks. Turns out it is not failsafe.
    First I did everything exactly as the LVM on LUKS guide tells me. Then I continued with the Installation Guide.  No extra options, no change from defaults. The result is the very same I currently have.
    EDIT7: Now that I think about it; There was something strange during pvcreate /dev/mapper/lvm. It complained: "/dev/sdc: open failed: No medium found" which is strange as there was never a /dev/sdc to begin with.
    I'll try to install on a pure LVM partition without crypto.
    EDIT8: OK pure install only LVM is also broken.
    Last edited by GNA (2014-01-15 20:46:49)
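    One observation on the fstab posted above (my diagnosis, not something stated in the thread): space_cache is a btrfs mount option, and ext4 refuses to mount when given options it does not recognize. Root survives because the initramfs mounts it first, but every other LVM volume is mounted straight from fstab and would fail, which matches the symptoms, including EDIT4, where switching btrfs to ext4 while keeping the old fstab changed nothing. A cleaned-up sketch of those entries:

    ```
    # hypothetical corrected /etc/fstab entries – drop the btrfs-only
    # space_cache option from the ext4 lines
    /dev/mapper/MyStorage-rootvol  /      ext4  rw,relatime  0  1
    /dev/mapper/MyStorage-homevol  /home  ext4  rw,relatime  0  2
    /dev/mapper/MyStorage-swapvol  none   swap  defaults     0  0
    ```

    If the volumes were first formatted as btrfs and the fstab was generated then and reused after reformatting to ext4, the stale option would linger exactly like this.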

  • Source monitor and program monitor stopped displaying anything

    Hey there,
    I'm a CS6 Production Premium user and in severe trouble here: I'm working on a big 1080p AVC-Intra50 project for a client that is due next monday. I'm close to completion, but a couple of days ago, the source monitor stopped displaying video. It would show a still as I loaded in a clip, but as soon as I played back, the source monitor simply went black. The clip was playing though and audio can be heard. Not that big a problem, I thought, and simply used the program monitor for playback. But yesterday, the same thing happened to the program monitor as well. But not just that: Even the stills are gone now. Both monitors remain black at all times, even though playback and sound still work. So I got curious and tried different materials and different projects: AVCHD, DNxHD, even JPEGS, nothing shows on either monitors, regardless of the resolution. Still, everything works just fine on my laptop.
    So I completely uninstalled the entire Creative Suite, used the Clean Script multiple times, including reboots, deleted all remaining files from the user and Windows directories, and updated all my hardware to the latest drivers. Nothing. Both monitors remain black and useless. I didn't change anything in my Adobe setup, or my hardware or software configuration, when this error occurred. I searched the web for hours and I'm completely at my wits' end. Any suggestions?
    My system:
    Windows 7 x64
    Intel Core i7 2600k
    8 GB RAM
    ATi 6870

    I did purge the entire cache and database. Didn't help.
    EDIT: I'm sorry, I only purged the database. Now that I have deleted all cached media files manually, the program monitor is working again, but the source monitor is still out.
    EDIT 2: Okay, now stuff is getting really weird: the program monitor was working again. I continued editing for a bit, then put the program monitor into full screen and it went black again. Even when I went back to the normal size, it remained black, BUT now the source monitor was working again. So I put the source monitor into full screen and it went black again, and both monitors stayed black.
    So after deleting the media cache, I had the program monitor working again. After full-screen use, it went black, but the source monitor was working. After full-screen use on the source monitor, both monitors are black again. What is going on?!
