Import and directory synchronization problem

I'm having a problem importing new pictures into an existing directory that already appears in my Lightroom catalog. The photos import normally but show up as a new directory entry off the root of the Lightroom catalog. When I attempt to move/drag them to the existing Lightroom directory I want, I'm told they cannot be moved because they are already there, which I can confirm in the Windows file system. Strangely, synchronizing the directory in Lightroom does not find the files, even though they are physically there and not showing up in the catalog. This is a new problem, but I can't associate it with anything I've changed in Lightroom lately. Can anyone suggest what might be going on and, more importantly, how to fix it? It is very irritating.

I have another problem and I don't know if it is related or not. Lightroom will NOT remember changes I make to the catalog location through the Edit > Catalog Settings menu, or to my watched directory for auto import through File > Auto Import > Auto Import Settings. It just ignores them and uses what is already there. Again, any suggestions would be greatly appreciated.


Similar Messages

  • How to fix Import and folder-synchronize?

    Installed Lightroom 4.2 RC. Now Import and Folder Synchronize hang. Uninstalled 4.2 RC with the same result. Removed and then reinstalled 4.1 with the same problem.
    Please advise.
    [email protected]
    Message was edited by: SD Rowdies

    They might not be compatible with the iPad's video capability. Are they Flash videos?

  • DSEE and Directory Synchronizer Question

    I'm investigating using Directory Synchronizer to sync up our DSEE 6.x directory with AD.
    We do not have our directory set up to where users can change their password directly. They must go to the authoritative source (our web portal) and change their password there, then have it synced out to DSEE. We are using SSHA passwords for this, but also have a SHA-1 hash of the password.
    So, how will Directory Synchronizer be able to get passwords into AD? I don't think AD uses or recognizes this password format, does it?

    http://docs.sun.com/app/docs/doc/819-0993/gbfza?l=Ja&a=view should help you.
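    For background on why the stored hash cannot simply be handed to AD: an {SSHA} userPassword value is a salted SHA-1 digest, so it is one-way and in a format AD does not consume. A rough sketch of how such a value is typically built (class name is hypothetical; Java 8 Base64 assumed):
    <pre>
    import java.security.MessageDigest;
    import java.security.SecureRandom;
    import java.util.Base64;

    // Illustration only: {SSHA} = "{SSHA}" + base64( SHA1(password + salt) + salt )
    public class SshaExample {
        public static String ssha(String password) throws Exception {
            byte[] salt = new byte[8];
            new SecureRandom().nextBytes(salt);            // random per-entry salt

            MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
            sha1.update(password.getBytes("UTF-8"));
            sha1.update(salt);
            byte[] digest = sha1.digest();

            byte[] digestPlusSalt = new byte[digest.length + salt.length];
            System.arraycopy(digest, 0, digestPlusSalt, 0, digest.length);
            System.arraycopy(salt, 0, digestPlusSalt, digest.length, salt.length);

            return "{SSHA}" + Base64.getEncoder().encodeToString(digestPlusSalt);
        }

        public static void main(String[] args) throws Exception {
            System.out.println(ssha("secret12"));          // prints a {SSHA}... value
        }
    }
    </pre>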

  • Solaris Name Service Cache and Directory Proxy Problem

    We have some Solaris 10 clients configured with ldapclient to use a Directory Proxy Server instance. After 15 minutes, the Solaris name service cache fails to communicate with the proxy instance, and the proxy instance's readconnectionsrefused attribute starts incrementing.
    At first it seemed we would need to increase worker-threads and num-bind-limit, but those did not fix the problem.
    At the same time the name service cache starts failing, I am still able to query and search the proxy directly. I have set up a JMeter test that runs continuously, and its queries never fail.
    The problem with the name service cache occurs very consistently every 15 minutes, and I am able to reproduce it both at the client's site and in my lab. Restarting either the proxy or the name service cache clears the problem.
    Has anyone else seen this problem?
    Edited by: 957466 on Sep 6, 2012 9:11 AM

    The idle-timeout on DSEE was set to none, which I believe is the default. I tried setting it to 1200 and 2400 seconds without success.
    h3. get-ldap-data-source-pool-prop
    <pre>
    client-affinity-bind-dn-filters : any
    client-affinity-criteria : connection
    client-affinity-ip-address-filters : any
    client-affinity-policy : write-affinity-after-write
    client-affinity-timeout : 20s
    description : -
    enable-client-affinity : false
    load-balancing-algorithm : proportional
    minimum-total-weight : 100
    proportion : 100
    sample-size : 100
    </pre>
    h3. get-ldap-data-source-prop
    <pre>
    bind-dn : none
    bind-pwd : none
    client-cred-mode : use-client-identity
    connect-timeout : 10s
    description : -
    down-monitoring-interval : inherited
    is-enabled : true
    is-read-only : false
    ldap-address : localhost
    ldap-port : ldap
    ldaps-port : ldaps
    monitoring-bind-dn : none
    monitoring-bind-pwd : none
    monitoring-bind-timeout : 5s
    monitoring-entry-dn : ""
    monitoring-entry-timeout : 5s
    monitoring-inactivity-timeout : 2m
    monitoring-interval : 30s
    monitoring-mode : proactive
    monitoring-retry-count : 3
    monitoring-search-filter : (objectClass=*)
    monitoring-search-scope : base
    num-bind-incr : 10
    num-bind-init : 2
    num-bind-limit : 1024
    num-read-incr : 10
    num-read-init : 2
    num-read-limit : 1024
    num-write-incr : 10
    num-write-init : 2
    num-write-limit : 1024
    proxied-auth-use-v1 : false
    ssl-policy : never
    use-read-connections-for-writes : false
    use-tcp-keep-alive : true
    use-tcp-no-delay : true
    </pre>
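    For comparison with the JMeter test mentioned above, a standalone search loop can help show whether the 15-minute failure hits any long-lived client or only the name service cache. A minimal JNDI sketch (host, port, and base DN are placeholders; anonymous bind assumed):
    <pre>
    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.directory.InitialDirContext;
    import javax.naming.directory.SearchControls;

    // Opens a connection, runs a base-scope search, closes it, and repeats,
    // which is roughly what the JMeter test does against the proxy instance.
    public class ProxyProbe {
        public static void main(String[] args) throws Exception {
            while (true) {
                Hashtable<String, String> env = new Hashtable<String, String>();
                env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
                env.put(Context.PROVIDER_URL, "ldap://proxy-host:389/");   // placeholder host/port

                InitialDirContext ctx = new InitialDirContext(env);
                SearchControls sc = new SearchControls();
                sc.setSearchScope(SearchControls.OBJECT_SCOPE);
                ctx.search("dc=example,dc=com", "(objectClass=*)", sc);    // placeholder base DN
                ctx.close();

                System.out.println("search ok at " + new java.util.Date());
                Thread.sleep(60000);   // one probe per minute; let it run past the 15-minute mark
            }
        }
    }
    </pre>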

  • Numbers Import and Load Performance Problems

    Some initial results of converting a single 1.9MB Excel spreadsheet to Numbers:
    _Results using Numbers v1.0_
    Import 1.9MB Excel spreadsheet into Numbers: 7 minutes 3.5 seconds
    Load (saved) Numbers spreadsheet (2.4MB): 5 minutes 11.7 seconds
    _Results using Numbers v1.0.1_
    Import 1.9MB Excel spreadsheet into Numbers: 6 minutes 36.1 seconds
    Load (saved) Numbers spreadsheet (2.4MB): 5 minutes 5.8 seconds
    _Comparison to Excel_
    Excel loads the original 1.9MB spreadsheet in 4.2 seconds.
    Summary
    Numbers v1.0 and v1.0.1 exhibit severe performance problems when loading (its own files) and when importing Excel v.X files.

    Hello
    It seems that you missed a detail.
    When a Numbers document is 1.9MB on disk, it may be a 7 or 8 MB file to load.
    A Numbers document is not a file but a package, which is a disguised folder.
    The document itself is described in an extremely verbose XML file stored in a gzip archive.
    Opening such a document starts with an unpack sequence, which is a fast one (except maybe if the available disk space is short).
    The unpacked file may easily be 10 times larger than the packed one.
    Just an example: the xml.gz file containing the report of my bank operations for 2007 is a 300 KB one, but the expanded one, the one which Numbers must read, is a 4 MB one, yes, 13.3 times the original.
    And loading it is not sufficient; this huge file must be "interpreted" to build the display.
    As this file is very long and Apple treats it as the TRUE description of the document, each time the app must display something it works like the BASIC interpreters that old users like me knew on Apple // machines.
    Adding a supplementary stage (building an internal representation once) would have added time to the opening sequence but would have sped up the use of the document.
    Of course, it would also have added a supplementary stage during the save process.
    I hope that they will adopt this scheme, but of course I don't know if they will do that.
    Of course, the problem is much the same when we import a document from Excel or from AppleWorks.
    The app reads the original, which is stored in a compact shape, then deciphers it to create the XML code. Optimisation would perhaps reduce these tasks a bit, but they will remain time-consuming.
    Yvan KOENIG (from FRANCE dimanche 27 janvier 2008 16:46:12)
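    To put numbers on what Yvan describes, you can measure how large the gzipped XML inside a Numbers package becomes once expanded; that expanded size is closer to what the application actually has to parse. A rough sketch (the path is a placeholder; the gzipped XML is typically named index.xml.gz inside the package):
    <pre>
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.util.zip.GZIPInputStream;

    // Compares the on-disk size of the gzipped XML with its expanded size.
    public class NumbersXmlSize {
        public static void main(String[] args) throws Exception {
            String path = "/path/to/Document.numbers/index.xml.gz";   // placeholder path
            long packed = new java.io.File(path).length();

            long unpacked = 0;
            InputStream in = new GZIPInputStream(new FileInputStream(path));
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) > 0) {
                unpacked += n;
            }
            in.close();

            System.out.printf("packed: %d bytes, unpacked: %d bytes (x%.1f)%n",
                    packed, unpacked, (double) unpacked / packed);
        }
    }
    </pre>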

  • CS3 - Import and Export Quality Problems with Canon S90 .mov files

    Hi Folks,
    I have spent several hours searching with no results in the forums, and I am a beginner with Premiere, so I apologize for any annoying questions I may post.
    Here is the problem I am having.  The very first step!  Go figure.
    I have videos taken with a Canon S90 point-and-shoot camera, in .mov format, 640x480, 30 frames per second.
    When I play these videos in Quick Time or Windows Media Player they look great.  Very sharp and smooth video.
    When I import these videos into Premiere Pro CS3 and don't do anything to them besides play them in either the Source window or the Program window, the quality is much lower. They are not sharp, and they have an unusual texture, almost like there is a tiny bit of water on the lens distorting the view as it moves.
    I imported the video under the DV-NTSC Standard 48kHZ preset.  I have the most updated Quick Time.
    I also attempted to improve the view settings to their maximum with no results.  I also exported the video to see if it was just the viewing, but the exported video is even worse.  Very low quality and not sharp.
    Thanks for any suggestions on the correct settings to import these videos without losing quality.

    Well, I have read about and now understand the codec aspect a bit. I have discovered that my files use the avc1 (H.264) codec. Doing more research, it appears that CS3 Premiere Pro has some trouble editing H.264 files.
    So from what I gather so far, it looks like my best bet would be to convert the files into another format that is easier for Premiere to edit.
    Is that correct?
    If so, I am still unsure what to convert it to.
    What is the program that is best to use to convert with?
    Thanks again for your help.

  • Toplink session and UnitOfWork synchronization problem

    Dear forum readers,
    I am not sure i fully understand the way how toplink deals with caching. To me it seems, that i got some pretty scary results, which i am not sure how to interpret and to work around them.
    The following code snippet is part of a unit test:
    >>>>>>>>>>>> snip >>>>>>>>>>>>>>>
    1 public void test2() {
    2
    3 UnitOfWork uow = (UnitOfWork) SessionManager.getSessionManager().getSession().getUnitOfWork();
    4 Justitiabele justitiabele = findJustitiabele("findById", Justitiabele.class, new Long(551));
    5 ((JustitiabeleIdentiteit) justitiabele.getJustitiabeleIdentiteiten().iterator().next()).setMeisjesnaam("Kettner10");
    6 Justitiabele tmp = (Justitiabele) uow.registerObject(justitiabele);
    7 ((JustitiabeleIdentiteit) tmp.getJustitiabeleIdentiteiten().iterator().next()).setMeisjesnaam("Kettner10");
    8 uow.commitAndResume();
    9 }
    10
    11 public Justitiabele findJustitiabele(String queryName, Class objectClass, Object param) {
    12      SessionWrapper toplinkSessionWrapper = getSession();
    13      return (Justitiabele) toplinkSessionWrapper.getClientSession().executeQuery(queryName, objectClass, param);
    14 }
    >>>>>>>>>>>>>>>> snip <<<<<<<<<<<<<<<<
    I am querying a particular object (line 4). Then I make some changes to that object (line 5). Because the object is not registered in the UnitOfWork, these changes shouldn't be persisted. So far so good. To achieve persistence I now register the object, and I make the same modifications to the TopLink clone, expecting them to be persisted in the database after the commit.
    Contrary to my expectations, the changes were not persisted!!!
    Deleting line 5 (the modifications made before registering the object) leads to the desired result.
    Somehow the queried object seems to be a direct reference to the (client-)session cache. So when the object is registered in the UnitOfWork, the (already modified) backup clone is copied from the session cache to the UnitOfWork. If the same changes are then made to the working clone, there are no differences between the backup and working clones, and no changes are written to the database.
    It gets even better: I tried to query the object again (before line 6), even with a different UnitOfWork, before modifying it, in order to retrieve the original state of the object, but again I was only able to find the modified object.
    If the queried object indeed is a reference to some cache, I cannot understand why that cache is not read-only!!!
    Am I doing something wrong?
    Is there a way to work around this problem?
    What are the consequences for transaction handling? What about isolation, when clients can see each other's changes in a kind of writable shared session???
    I try to work around the problem by registering every object that is queried from the database in the UnitOfWork right after it is queried. This seems to me to be the only solution, though it is contrary to what the TopLink developer's guide says, namely that only objects which will be modified should be registered, for performance reasons.
    I would be grateful for any help in understanding and working around this problem.
    Martin
    PS: Here's the log i got by running the test.:
    STDOUT >>>>>>>>>>>>>>>>>>>>>>>>>>>>
    C:\devtools\jdev\905\jdk\bin\javaw.exe -ojvm -classpath C:\ToplinkDemo\ToplinkDomein\classes;C:\ToplinkDemo\ToplinkDomein\classes\META-INF\ToplinkDomein;C:\devtools\jdev\905\toplink\jlib\source.jar;C:\devtools\jdev\905\lib\xmlparserv2.jar;C:\devtools\jdev\905\lib\xmlcomp.jar;C:\devtools\jdev\905\jdbc\lib\classes12.jar;C:\devtools\jdev\905\jdbc\lib\nls_charset12.jar;C:\devtools\jdev\905\toplink\jlib\toplink.jar org.dji.br.bl.domein.TestMain
    ServerSession(91)--Connection(92)--TopLink, version: OracleAS TopLink - 10g (9.0.4) (Build 031126)
    ServerSession(91)--Connection(92)--connecting session: djisession
    ServerSession(91)--Connection(92)--connecting(DatabaseLogin(
         platform=>Oracle9Platform
         user name=> "dji"
         datasource URL=> "jdbc:oracle:thin:@S-ORACLE01:1521:djipoc"
    ServerSession(91)--Connection(92)--Connected: jdbc:oracle:thin:@S-ORACLE01:1521:djipoc
         User: DJI
         Database: Oracle Version: Oracle9i Enterprise Edition Release 9.2.0.4.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.4.0 - Production
         Driver: Oracle JDBC driver Version: 9.0.1.5.0
    ServerSession(91)--Connection(101)--TopLink, version: OracleAS TopLink - 10g (9.0.4) (Build 031126)
    ServerSession(91)--Connection(101)--connecting session: djisession
    ServerSession(91)--Connection(101)--connecting(DatabaseLogin(
         platform=>Oracle9Platform
         user name=> "dji"
         datasource URL=> "jdbc:oracle:thin:@S-ORACLE01:1521:djipoc"
    ServerSession(91)--Connection(101)--Connected: jdbc:oracle:thin:@S-ORACLE01:1521:djipoc
         User: DJI
         Database: Oracle Version: Oracle9i Enterprise Edition Release 9.2.0.4.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.4.0 - Production
         Driver: Oracle JDBC driver Version: 9.0.1.5.0
    ServerSession(91)--Connection(103)--TopLink, version: OracleAS TopLink - 10g (9.0.4) (Build 031126)
    ServerSession(91)--Connection(103)--connecting session: djisession
    ServerSession(91)--Connection(103)--connecting(DatabaseLogin(
         platform=>Oracle9Platform
         user name=> "dji"
         datasource URL=> "jdbc:oracle:thin:@S-ORACLE01:1521:djipoc"
    ServerSession(91)--Connection(103)--Connected: jdbc:oracle:thin:@S-ORACLE01:1521:djipoc
         User: DJI
         Database: Oracle Version: Oracle9i Enterprise Edition Release 9.2.0.4.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.4.0 - Production
         Driver: Oracle JDBC driver Version: 9.0.1.5.0
    ServerSession(91)--Connection(105)--TopLink, version: OracleAS TopLink - 10g (9.0.4) (Build 031126)
    ServerSession(91)--Connection(105)--connecting session: djisession
    ServerSession(91)--Connection(105)--connecting(DatabaseLogin(
         platform=>Oracle9Platform
         user name=> "dji"
         datasource URL=> "jdbc:oracle:thin:@S-ORACLE01:1521:djipoc"
    ServerSession(91)--Connection(105)--Connected: jdbc:oracle:thin:@S-ORACLE01:1521:djipoc
         User: DJI
         Database: Oracle Version: Oracle9i Enterprise Edition Release 9.2.0.4.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.4.0 - Production
         Driver: Oracle JDBC driver Version: 9.0.1.5.0
    ServerSession(91)--Connection(107)--TopLink, version: OracleAS TopLink - 10g (9.0.4) (Build 031126)
    ServerSession(91)--Connection(107)--connecting session: djisession
    ServerSession(91)--Connection(107)--connecting(DatabaseLogin(
         platform=>Oracle9Platform
         user name=> "dji"
         datasource URL=> "jdbc:oracle:thin:@S-ORACLE01:1521:djipoc"
    ServerSession(91)--Connection(107)--Connected: jdbc:oracle:thin:@S-ORACLE01:1521:djipoc
         User: DJI
         Database: Oracle Version: Oracle9i Enterprise Edition Release 9.2.0.4.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.4.0 - Production
         Driver: Oracle JDBC driver Version: 9.0.1.5.0
    ServerSession(91)--Connection(109)--TopLink, version: OracleAS TopLink - 10g (9.0.4) (Build 031126)
    ServerSession(91)--Connection(109)--connecting session: djisession
    ServerSession(91)--Connection(109)--connecting(DatabaseLogin(
         platform=>Oracle9Platform
         user name=> "dji"
         datasource URL=> "jdbc:oracle:thin:@S-ORACLE01:1521:djipoc"
    ServerSession(91)--Connection(109)--Connected: jdbc:oracle:thin:@S-ORACLE01:1521:djipoc
         User: DJI
         Database: Oracle Version: Oracle9i Enterprise Edition Release 9.2.0.4.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.4.0 - Production
         Driver: Oracle JDBC driver Version: 9.0.1.5.0
    ServerSession(91)--Connection(111)--TopLink, version: OracleAS TopLink - 10g (9.0.4) (Build 031126)
    ServerSession(91)--Connection(111)--connecting session: djisession
    ServerSession(91)--Connection(111)--connecting(DatabaseLogin(
         platform=>Oracle9Platform
         user name=> "dji"
         datasource URL=> "jdbc:oracle:thin:@S-ORACLE01:1521:djipoc"
    ServerSession(91)--Connection(111)--Connected: jdbc:oracle:thin:@S-ORACLE01:1521:djipoc
         User: DJI
         Database: Oracle Version: Oracle9i Enterprise Edition Release 9.2.0.4.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.4.0 - Production
         Driver: Oracle JDBC driver Version: 9.0.1.5.0
    ServerSession(91)--sequencing connected, state is ForcedToUseWriteAccessor_State
    ServerSession(91)--client acquired
    ClientSession(114)--acquire unit of work: 113
    ClientSession(114)--Execute query ReadObjectQuery(org.dji.br.bl.domein.justitiabele.Justitiabele)
    ServerSession(91)--Connection(101)--SELECT DJI_NUMMER FROM DJI.JUSTITIABELEN WHERE (DJI_NUMMER = 551)
    ServerSession(91)--Execute query ReadAllQuery(org.dji.br.bl.domein.justitiabele.JustitiabeleIdentiteit)
    ServerSession(91)--Connection(92)--SELECT INDICATIE_NONAMER, ACHTERNAAM, BRN_CODE, MEISJESNAAM, ID, ROEPNAAM, GEBOORTEPLAATS_BUITENLAND, TITEL_BUITENLAND, VOORNAAM, VOORLETTERS, JBE_DJI_NUMMER, DATUM_INGANG, DATUM_EINDE FROM DJI.JUSTITIABELEIDENTITEITEN WHERE (JBE_DJI_NUMMER = 551)
    ServerSession(91)--Execute query ReadObjectQuery(org.dji.br.bl.domein.justitiabele.Justitiabele)
    UnitOfWork(113)--Register the object org.dji.br.bl.domein.justitiabele.Justitiabele@82
    UnitOfWork(113)--Register the existing object org.dji.br.bl.domein.justitiabele.JustitiabeleIdentiteit@84
    UnitOfWork(113)--Register the existing object org.dji.br.bl.domein.justitiabele.Justitiabele@82
    UnitOfWork(113)--begin unit of work commit
    ClientSession(114)--Connection(103)--begin transaction
    UnitOfWork(113)--Execute query WriteObjectQuery(org.dji.br.bl.domein.justitiabele.Justitiabele@83)
    UnitOfWork(113)--Execute query WriteObjectQuery(org.dji.br.bl.domein.justitiabele.JustitiabeleIdentiteit@85)
    ClientSession(114)--Connection(103)--commit transaction
    UnitOfWork(113)--end unit of work commit
    UnitOfWork(113)--resume unit of work
    Process exited with exit code 0.

    Martin,
    The object returned from any query on the session is the object from the shared cache. Any changes made to it will change the shared cache.
    You must acquire a UnitOfWork and register the cached object into the UnitOfWork in order to get an isolated copy that can be modified within a transactional context (the UnitOfWork) without other threads seeing those transient changes. The typical approach is to read through the session and register the objects involved in a change prior to any modifications.
    There is a UnitOfWork paper available on the TopLink technical information page that may be useful to you:
    http://www.oracle.com/technology/products/ias/toplink/technical/index.html
    Doug
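    Doug's point applied to the test above: a minimal sketch (reusing the test's helper, session wrapper, and names, so all of those are assumed from the original code) in which the object is registered in the UnitOfWork before any modification and only the working copy returned by registerObject() is changed.
    <pre>
    // Sketch only: read through the session, register the cached object FIRST,
    // then modify the working copy that registerObject() returns. At commit,
    // TopLink diffs the working copy against its backup clone and writes the change.
    public void test2Fixed() {
        UnitOfWork uow = (UnitOfWork) SessionManager.getSessionManager()
                .getSession().getUnitOfWork();

        // Shared-cache copy: do not modify it directly.
        Justitiabele justitiabele =
                findJustitiabele("findById", Justitiabele.class, new Long(551));

        // Register before any change; only the returned clone may be modified.
        Justitiabele working = (Justitiabele) uow.registerObject(justitiabele);
        ((JustitiabeleIdentiteit) working.getJustitiabeleIdentiteiten()
                .iterator().next()).setMeisjesnaam("Kettner10");

        uow.commitAndResume();
    }
    </pre>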

  • Problem with importing and creating self signed SSL certificate

    Mac Pro, 10.7.2 Server.  Attempting to import or create a self-signed certificate for use as ichat.domain.com to encrypt the iChat service.  The server is actually called server.domain.com but has an alias of ichat.domain.com.  I understand that this is probably not best practice, but I would like to keep things this way since we have one server, run multiple services on it, but want to continue to connect to each service at SERVICE.domain.com.  We have been using this type of mismatched certificate with success since 10.4 or so.
    I am working through setup of 10.7 Server to replace our 10.6 server. 
    Tried upgrade of 10.6 to 10.7 installation.  The installation made a mess of some services and our Open Directory, but did move the certificate over and allowed iChat service to function properly.
    Clean install and setup of 10.7 Server.  Exported self signed certificate, private key, and encryption password from 10.6 Server and functioning 10.7 upgraded Server.
    On import or manual creation of certificate get the following error:
    Error
    Check your server's logs for more information.  The error (code 5001) was: Expected SecKeychainItemImport to return a SecIdentityRef, but it did not
    Log shows:
    Dec 29 17:56:55 server servermgrd[498]: -[CertsRequestHandler(HelperAdditions) importP12Data:passphrase:error:]: importedItems = (
                  "<SecCertificate 0x7fcf6ed43c00 [0x7fff78d96f40]>"
    I have tried importing and manually creating other certificates with a variety of names with success.  I assume that there is something buried somewhere that is causing this particular one to be a problem.  Other than manually removing any remnants of the certificate from /etc/certificates I do not have any ideas what to try.  I am essentially ready to move this server to 10.7 except for this problem and would like to avoid a reinstall.
    Suggestions?
    -Erich

    Take a look here.
    https://bbs.archlinux.org/viewtopic.php?id=146649
    Maybe it's a problem with your network.

  • Problem with Import and Export Data Wizard

    Downloaded and installed SQL Server Express 2008 R2 today because I want to explore how Access interacts with SQL Server (using my home computer). I'm using Access 2010 (under Windows 7), so the 2008 version of SQL Server Express seemed to be the version
    to use.
    After a couple of false starts, installation appeared to go okay. After the installation. My Start menu listed Microsoft SQL Server 2008 and Microsoft SQL Server 2008 R2. The latter listed Import and Export Data (64-bit). When I clicked that, the first Import
    and Export Data Wizard page was displayed. I wasn't ready at that time to explore the wizard, so I closed it. An hour or so later I again attempted to open the Import and Export Data wizard. This time, the wizard didn't open. Instead this error message was
    displayed: "The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered."
    I found DTS.dll on my computer at C:\Program Files\Microsoft SQL Server\100\DTS\Binn, so the file is available, but don't know whether it is registered.
    How can I correct this problem?

    First, can you please post all the log file errors?
    >> I can't really give you a solution or specific recommendation since I have not seen this error myself, but at your own risk you can try:
    1. You may try to just register 'dts.dll' using regsvr32.exe, but this error may indicate a bigger problem with the setup.
    If you are running 64-bit SQL Server, try running this at the command prompt (adjust the version folder to your install, e.g. 100 for 2008 R2): %windir%\syswow64\regsvr32 "%ProgramFiles(x86)%\Microsoft SQL Server\90\dts\binn\dts.dll"
    2. You can try reinstalling from scratch (in this case make sure that you uninstall everything first).

  • Font problem in import and link

    I'm using RH8 and have a problem with a Word doc I linked. In a numbered list, NumberList2, the font reverts to Times New Roman from Arial 10pt. I attach a pic. I've tried reapplying the style and linking again and again, and it's very stubborn. Is there a fix?

    Hi
    If this is a linked Word document, try going to File > Project Settings > Next and click Edit for the Word Document conversion settings.
    Select Edit and it will prompt you with all available tags along with their appearance, with examples in Word and RoboHelp.
    Find your tag in the list and select "Source" in the RoboHelp Style drop-down.
    Apply that and see whether the appearance or style improves.
    Alternatively, you can import the Word document; its associated CSS file will be imported too, and you can make changes directly to the CSS and reapply them.
    Thanks,
    Anjaneai Srivastava

  • I have problems with my films that become slightly jerky after I have imported and edited them in iMovie and then burned them onto a DVD using iDVD. I can see the weak jerkiness when panning in both laterally and vertically. I shoot with a camcorder Canon

    I have problems with my films becoming slightly jerky after I have imported and edited them in iMovie and then burned them onto a DVD using iDVD.
    I can see the slight jerkiness when panning both laterally and vertically.
    I shoot with a Canon Vixia HF10 camcorder. The camera has been set to deliver HD quality (1920x1080), but I may have accidentally filmed with a frame rate of 60.
    When I imported the films into iMovie I was asked if I wanted to change to a frame rate of 30 (instead of 25), and I chose 30.
    Could it be the frame rate 60 setting in my camcorder that causes the jerkiness when panning, or could it be something else?


  • Data importing and exporting problem between BADI's

    Hi Experts,
    I am facing a problem importing and exporting data between BAdIs.
    I am able to import and export data between the DOC CHANGE and DOC CHECK BAdIs, etc.; but now I have a requirement to map a field from SRM to R/3.
    The Supplier Order Key field, which was added to the basic data section of the shopping cart, should be mapped to the corresponding field in R/3. I have added the field, and when I try to map it to R/3 I run into the problem.
    This Supplier Order Key field comes from the ZBBP_CATALOG_TRANSFER BAdI, and I am trying to read it in the Z_BBP_CREATE_PO_BACK BAdI; I tried the statement below to export a single parameter from the CATALOG TRANSFER BAdI to the PO BACK BAdI:
      EXPORT zsupp_ord_key FROM zsupp_ord_key TO MEMORY ID 'Z_SUPP_ORD_KEY'.
    which is not working.
      IMPORT zsupp_ord_key TO zsupp_ord_key FROM MEMORY ID 'Z_SUPP_ORD_KEY'. in the PO BACK BAdI.
    Can anyone please let me know why this is happening?
    And I have a strange problem with this:
    when a user copies a catalog item that has already been created, we cannot map the field to the R/3 PO;
    and again, if he tries to create a cart by selecting "Old carts which are already created", we cannot map the field. For the above two cases to work, I think we need some custom coding.
    Thank you
    Lokesh.

    Hi Lokesh,
    BAdI implementations are classes for which an instance is created at runtime, and hence the EXPORT/IMPORT memory logic will not work.
    Please try creating function modules for these BAdIs and calling them inside the BAdIs, then go ahead with the import and export logic. This could help.
    I would rather advise finding an option that avoids this memory manipulation.
    Regards
    Kathirvel

  • Various bugs – Library view issue / import and copy problem / unsupported paths message

    In addition to the issues reported in a separate thread regarding importing video, there are several other bugs that are preventing me from progressing on a project and/or indicate instability in Edge Animate.  These issues don’t occur on new projects (yet), but are present in my current composition.
    I'm using the latest version of Edge Animate CC 2014 (2014.1.1) on a Windows 8 Pro (fully updated) system.
    Issue 1 – Library View
    For all the assets that have been imported, when I twirl down to view the thumbnail of each asset in the Library, nothing appears in the thumbnail area.  In earlier iterations of this project, all the items appeared correctly, but now nothing is there for any of the assets.  See sample below of what should appear (and what was there before), and what’s there now (nothing).
    Issue 2 – Asset Import Problem
    When I try to add an already imported image asset to the stage (or try to import a new image asset in the project), the item doesn’t appear properly – it just comes in as a dot with no height or width.  See samples below of what the imported asset looks like on the stage and its size information.
    Assets that were imported and placed before this bug occurred are still visible on the stage. But, if I copy one of these assets, the copy retains the size information, but does not appear at all. See sample.
    Issue 3 – Unsupported Paths message
    When I close the project, I get an unsupported paths message.  See sample.
    Despite this message, the assets appear in the project from within Edge and are present in the project if I open/preview the HTML file in my browser.
    Since these three issues relate to project assets, perhaps they are related.  I can provide a link to download this project if you like. Any help would be greatly appreciated!

    Hi Avinash,
    I'm glad you received the (large) project.
    In the meantime, I installed Edge Animate on another system (OS - Windows 8.1).  The Edge project exhibits the same problems on the other system.
    Thanks for looking into this!
    Regards,
    - Joel

  • Strange pop-up message after opening LR, import and synchronize disabled, must force quit to close

    I recently started getting this message when running Lightroom and not only does it seem to disable both "Import" and "Synchronize" but it won't close. Nothing happens when I click stop and in order to close my catalog I have to force quit. Any help is much appreciated. Btw, I did have a plug-in which I disabled but that didn't seem to help.
    thanks,
    Adam

    Contact Adobe support thru chat. 
    Serial number and activation chat support (non-CC)
    http://helpx.adobe.com/x-productkb/global/service1.html ( http://adobe.ly/1aYjbSC )

  • Problem importing and including the subfolders without getting question marks

    I am trying to import a folder of vacation pictures. It has 5000 pictures and there are many subfolders. Lightroom imports them where I want, but it puts question marks on all the subfolders, and the subfolders appear empty even though it tells me how many pictures are in each folder. I right-clicked the question-marked folders and pointed to where they are, and it just says they are already there and asks whether I want to merge the folders. During import I told it to include the subfolders. I have tried this twice, and each time I get the same result. It is very frustrating to have to scroll through thousands of pictures. What am I doing wrong?

    Try synchronizing your folders and make sure you are not moving anything around outside of Lightroom:
    Synchronize folders
    When you synchronize folders, you have the option of adding files that have been added to the folder but not imported into the catalog, removing files that have been deleted, and scanning for metadata updates. The photo files in the folder and all subfolders can be synchronized. You can determine which folders, subfolders, and files are imported.
    In the Folders panel, select the folder you want to synchronize.
    Choose Library > Synchronize Folder.
    In the Synchronize Folder dialog box, do any of the following:
    To import photos that appear in the folders but have not been imported in the catalog, select Import New Photos. If you select Show Import Dialog Before Importing, you can specify which folders and photos are imported.
    To remove photos that have been deleted from the folder but not from the catalog, select Remove Missing Photos From Catalog. If this option is dimmed, no files are missing. (You can choose Show Missing Photos to display the photos in Grid view.)
    To scan for any metadata changes made to the files in another application, choose Scan For Metadata Updates.
    Click Synchronize.
    If the Import Photos dialog box opens, specify the folders and files you want to import, and then click Import.
    also:
    Make sure that the option "Library" -> "Show Photos in Subfolders" is enabled. If it is not enabled, an activated folder will only show the photos that are directly in that folder.
    -janelle
