"knowledge source failed"

"knowledge source failed" gather form data(1) when i try and use the form wizard.
have done four others with no problem, each of very similar content etc.
I noticed others have asked but no solution...
could someone help me figure this one out?
windows 7 os, acrobat 9 from cs5
Thanks so much,
nancy

I'm getting this on my Mac (10.8.4), trying to print a Word 2011 (14.3.6) document and convert it to a form. As you know, printing a PDF on a Mac means saving it as a PDF, because you cannot install a PDF print driver. There are two options, the built-in Mac PDF printer and Adobe PDF, and I have tried both. I have also converted the .docx directly in Acrobat (XI, 11.0.03) and still get the "Knowledge source failed" error when I auto-generate the form fields. I'm not sure it's even affecting anything; the form seems OK otherwise.

Similar Messages

  • Knowledge source failed?  Here's a solution . . .

    I was searching these forums for a solution because every time I tried to use the form wizard on a particular document, I would get the "Knowledge source failed. <Gather Form Data> [1]" error message.  It drove me nuts for about half an hour.  Here's a quick workaround: simply print the troublesome document to the built-in Adobe PDF printer, and run the form wizard on the "printed" PDF.  Easy.
    However, there were no active forums in which to share this nifty solution, so I started this thread.  Hope this helps!

  • Acrobat 9.1 "Knowledge source Failed"

    Upon clicking Add or Edit Fields, the knowledge source fails.
    Is there a way to repair or fix this, or a reason why it might occur?

    I'm getting the same error, except that I'm using Acrobat for Mac. I created a form in InDesign and exported it as a PDF. It opens just fine in Acrobat, but when I click "Add or Edit Text Fields" and then click OK so Acrobat detects the fields, I get this error:
    "Knowledge source failed.  <Gather Form Data> [1]"
    Does anybody have a clue?
    Is it a font issue? An issue with how the PDF is exported?
    Anyone?
    Please?

  • Excel 2007 = Refresh All = Error Message "Initialization Data Source Failed"

    Excel 2007 => Data tab => Refresh All => error message "Initialization Data Source Failed".
    Second error message:
    "The following data range failed to refresh: ExternalData1" - Continue to refresh all?
    How do I resolve this? I am trying to connect to an Oracle 12c database environment.

    What happens if you rebuild the PivotTable/table?
    Make sure the external database is available and that the Oracle Provider for OLE DB is installed correctly.
    Wind Zhang
    TechNet Community Support
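    As a quick sanity check that the Oracle database itself is reachable, something like the following can be run from SQL*Plus or SQL Developer with the same host, service name and account that the Excel connection uses (this is only a sketch; it assumes you can open such a session at all):
    -- Confirms the instance answers and the account can log in
    SELECT sysdate FROM dual;
    -- Confirms the account can see its own objects
    SELECT COUNT(*) FROM user_tables;
    If both queries work but Excel still fails, the problem is more likely the OLE DB provider installation than the database itself; the provider's bitness also has to match Excel's.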

  • Creation of the export data source failed

    I am trying to activate an ODS but getting a message that the creation of the export data source failed.
    It also gives the message that the RFC connection to the source system is damaged ===> no metadata upload.
    Please can someone help me with this?
    Thanks in advance.
    Michelle

    Hi,
    Check your source system connection by going to RSA1 -> Source System -> right-click and then select Check.
    You will get information about the source system connection. If the RFC check has failed, it is better to get in touch with your Basis team.
    Or try restoring your source system connection via
    Source System -> Restore.
    This sets up all the default connections between your systems.
    Then check the connection again. If it says OK, try the export data source again, then assign the data source to an InfoSource and maintain the update rules.
    Hope this helps.
    Regards,
    MM

  • Connection for Source failed because the environment is not trusted

    Good day all,
    This is the second time I have come across this error, on two different forms.
    It happens whenever I have a form connected to my SQL database and my ODBC connection configured.
    My form is Reader Extended; when I load it outside LiveCycle Designer I get this error: "Connection for Source failed because the environment is not trusted".
    I have searched for a solution; one suggestion I saw was to change the binding from None to Normal. I've done that and it didn't solve my problem.
    Can someone please assist me?
    Regards,
    Ace

    Check the two links below and see if they help resolve your issue.
    In this thread Paul suggests changing the DSN setting:
    "If you set up your DSN as a system DSN instead of a user DSN then that message should disappear."
    http://forums.adobe.com/message/2873482
    In the blog post below, Steve mentions cloning the connection to get rid of the issue.
    You should be able to get around this by changing this line:
    var oDB = xfa.sourceSet.nodes.item(nIndex);
    to this:
    var oDB = xfa.sourceSet.nodes.item(nIndex).clone(1);
    http://forms.stefcameron.com/2006/10/12/displaying-all-records-from-an-odbc-data-connection/
    Thanks
    Srini

  • Can't Create a Data Source - Failed to test connection. [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied

    Hi there,
    I am having a serious issue with the Power BI Data Management Gateway which I am hoping someone can help me with.
    Basically, I am setting up a connection between a Power BI demo site and a SQL Server 2012 database hosted on Azure. The Data Management Gateway is up and running, and Power BI has managed to connect to it successfully.
    By following the tutorials I was able to successfully create my Data Connection Gateway with a self-signed certificate.
    However, when trying to create the data source I run into problems. The Data Source Manager manages to resolve the hostname successfully (screenshot omitted).
    Bear in mind that I exposed the required ports in Azure as endpoints and modified the hosts file on my local machine so I could access the SQL Server hosted in Azure using its internal name -- otherwise I would not be able to get this far.
    However, the creation of the data source also fails when I try to create it while logged on to the SQL Server in question.
    The Data Source Manager returns this error when using the Microsoft OLE DB Provider for SQL Server:
    Failed to test connection. [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied
    I tried using the SQL Server Native Client 11.0 instead, but I also get an error. This time the error is:
    Failed to test connection. Login timeout expired. A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online. Named Pipes Provider: Could not open a connection to SQL Server [53].
    Some considerations:
    If I provide an invalid username/password, the Data Source Manager does say that the username and password are incorrect.
    The firewall is turned off on the SQL Server (and either way, this error also happens if I try to use the Data Source Manager while logged on to the SQL Server itself).
    SQL Profiler does not show any connection attempt.
    The SQL Server instance in question is the default one.
    The error happens regardless of whether I select the option to encrypt the connection.
    In SQL Server Configuration Manager I can see that all protocols are enabled (TCP/IP, Named Pipes and Shared Memory).
    The Event Viewer does not provide any errors beyond the one I have copied into this post.
    I'm at a loss here. Could someone please advise what I might be doing wrong?
    Regards,
    P.

    Here is what I had to do to solve this issue:
    Basically I had to add the MSSQL TCP/IP port as an endpoint in Azure. After I did that, I was able to create the data source. However, I was only able to authenticate with a SQL account, as any domain account would return an error saying that the domain isn't trusted.
    What still puzzles me is how the Data Source Manager could tell me that a username/password was invalid, yet fail or time out when I provided valid credentials.
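    For anyone hitting the same wall, two quick T-SQL checks run on the SQL Server itself can confirm the two things the gateway needs: the TCP port the instance is actually listening on (which must match the Azure endpoint) and whether SQL authentication is allowed. This is only a sketch; the DMV and property below are standard SQL Server, but the endpoint itself still has to be added in the Azure portal:
    -- Port the current connection arrived on (NULL if connected over shared memory);
    -- this is the port to expose as the Azure endpoint
    SELECT local_tcp_port
    FROM sys.dm_exec_connections
    WHERE session_id = @@SPID;
    -- 1 = Windows authentication only, 0 = mixed mode; SQL logins need mixed mode
    SELECT SERVERPROPERTY('IsIntegratedSecurityOnly') AS windows_auth_only;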

  • Weblogic Server 10.3.6.0.8: one data source failing while the other six are not

    Hello,
    I have 7 data sources connecting to the same database schema. After I upgraded the environment (production) two evenings ago, I noticed one of the data sources started failing with the exception below:
    "Connection test failed with the following exception: weblogic.common.resourcepool.ResourceUnavailableException: No resources currently available in pool ePayBatchDS to allocate to applications.
    Either specify a time period to wait for resources to become available, or increase the size of the pool and retry."
    The data source capacity was Min: 1, Max: 30. I increased it to 1-50; after that change it worked for 15 minutes and then went back to giving the same error message above. Please note that I have two servers in production as a high-availability approach; however, so far I have upgraded only one of them, and the same data source on the old server is not failing.
    Please advise.
    Old Environment Specifications:
    Weblogic Server version: 10.3.4.0
    OS: Sun OS10 SPARC 64-bit
    JDK: 1.6.0 update 32 64-bit
    New Environment Specifications:
    Weblogic Server version: 10.3.6.0.8
    OS: Solaris 11.1 (64-Bit Mode) SRU 20 (18 June 2014 patchset).
    JDK: 1.7.0 update 55 64-bit
    Thanks,
    Mohd.

    Hi,
    "No resources currently available in pool ePayBatchDS to allocate to applications."
    This exception clearly shows that no connections are available to allocate to the ePayBatchDS data source.
    As you said, there are altogether 7 data sources, of which six are working and one is not.
    In general, when creating a data source and setting a connection pool size, you should also consider the number of connections available or allowed on the database side.
    Please cross-check on the DB side how many connections are available and, based on that, try to tune the connection pool size in your data sources.
    Regards,
    Abdul
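    Building on that, a quick way to compare the pool sizing against what the database actually allows is to look at the Oracle session and process limits and their current usage. This is only a sketch and assumes the data sources point at an Oracle database, which matches the ePayBatchDS setup described above:
    -- Configured limits on the database side
    SELECT name, value FROM v$parameter WHERE name IN ('processes', 'sessions');
    -- Current and high-water usage, to see how close the 7 pools (max 30-50 connections each) get to those limits
    SELECT resource_name, current_utilization, max_utilization, limit_value
    FROM v$resource_limit
    WHERE resource_name IN ('processes', 'sessions');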

  • CR 2010 Report using "File System Data" as source fails in production

    Hello, I've recently upgraded my website from VS 2008 with full CR 2008 to VS 2010 using the CR 2010 SP1 runtime (x32).  Everything went fine except for one report that uses "file system data" as a source.  It points to a share on a Samba drive and it worked great before the upgrade.  After upgrading my development box the report runs fine, but when I deployed it, it fails with the message:
    Failed to load database information. Error in File {report name} {D5D84E7F-54DB-4BCA-B3C3-0A169CDC36D7}.rpt: Failed to load database information.
    I made sure the website's Application Pool identity was set to use the credentials of a user who has access to the share.
    The data source has the following properties:
    Database DLL:   crdb_filesystem.dll
    File Mask: .
    Include Empty Directories: False
    Max Number of Subdirectory Levels: -1
    Starting Directory:
    \\{FreeBSD server name}\{share name}
    This report uses a sub-report which has the same source.  The report works fine from (full) CR 2008, and it also works from the designer inside VS 2010.  I can run it on my development box (in release and debug modes), which is Windows 7 (x64) using IIS 7.
    In production, we don't include the runtime in our application deployment, since we run multiple sites and do updates frequently, so we pre-install the runtime separately on the server (one time).  We installed the CR 2010 SP1 x32 runtime on our test and production boxes, which are both Windows 2003 with IIS 6, made sure they are set up to use .NET 4, and set the Application Pool identity up to have access to the share.  All my other reports, which use ODBC, work fine.  Just this one report doesn't, and I'm at a loss on how to even begin troubleshooting it.
    Is CR 2010 still capable of using "file system data" as a source?  I couldn't find anything on "file system data" with CR 2010, though I have seen that it still exists under Other Sources when I create a new report, just to try it out.

    Looks like you've found a bug with the CRVS2010 runtime. I looked for crdb_filesystem.dll in the CRVS2010 install directory (C:\Program Files (x86)\SAP BusinessObjects\Crystal Reports for .NET Framework 4.0\Common\SAP BusinessObjects Enterprise XI 4.0\win32_x86) and it is there. I then opened up the 32-bit MSI and MSM with the MS Orca utility to see if the file is included, and it is not.
    Two things we can do:
    1) Add the crdb_filesystem.dll to your install and make sure it installs to the above directory on the runtime computer.
    2) I will create a fix request for the issue; however, the next fix will be Service Pack 2 (ETA sometime after October) and I doubt this will make it there. So we'd be looking at SP 3 - if there is one (ETA would be in 2012).
    Ludek
    Follow us on Twitter http://twitter.com/SAPCRNetSup
    Got Enhancement ideas? Try the [SAP Idea Place|https://ideas.sap.com/community/products_and_solutions/crystalreports]

  • Import of Unit Of Measures XML into Sourcing fails

    Hi All,
    I am checking imports of master data from ERP into Sourcing 7.0 for the integration scenario with single records, but the import fails for the Units of Measure XML file. It throws an error saying "This field is required and must have a valid value. [DISPLAY_NAME]."
    The standard XML which was generated didn't have the <DISPLAY_NAME> field, so I included it as a check and uploaded again, but the error persists.
    Note: this XML was generated with the BBP_ES_CUSTOMIZINGDATA_EXTRACT report from ERP.
    <?xml version="1.0" encoding="utf-8"?><sapesourcing defaultlanguage="" xsi:noNamespaceSchemaLocation="UnitsOfMeasure.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><objects><object classname="masterdata.UnitsOfMeasure"><fields><NAME><values><value language="EN">Percentage_test</value></values></NAME><INACTIVE>FALSE</INACTIVE><CATEGORY>Power</CATEGORY><DISPLAY_NAME>MPH</DISPLAY_NAME><DESCRIPTION><values><value language="EN">Percentage_test</value></values></DESCRIPTION><ISO_CODE>MPH</ISO_CODE><DEFAULT_FOR_ISO>FALSE</DEFAULT_FOR_ISO><CONVERSION>0.0100 </CONVERSION><CONVERSION_SCALE>0</CONVERSION_SCALE><UNIT_PLUGIN></UNIT_PLUGIN><SYNONYMS>0 </SYNONYMS><PRECISION>0 </PRECISION><SCALE>0 </SCALE><PRIMARY_UNIT></PRIMARY_UNIT></fields><collections/></object></objects></sapesourcing>                             
    Any pointers on why this XML is failing would be great.
    Regards,
    Uday

    Hi Uday,
    There shouldn't be a need to make any updates to the output XML created by BBP_ES_CUSTOMIZINGDATA_EXTRACT. I would raise a message with SAP.
    Regards,
    Vikram

  • Replicate in Source fails with ORA-01403

    Hi,
    I've set up a bidirectional replication between two Oracle 11g R2 databases (WIN11SRC and WIN11TRG). Initially, the data was consistent between the two databases.
    Now, when I update a record in the source, it is successfully replicated to the target.
    But when I update a record in the target, it doesn't get through to the source: it fails there and is rejected by its Replicat. The discard file shows the following error when I tried to update the DEPARTMENT_NAME column:
    OCI Error ORA-01403: no data found, SQL
    <UPDATE "HR"."DEPARTMENTS" SET "DEPARTMENT_NAME" = :a1,"DML_TIMESTAMP" = :a2
    WHERE "DEPARTMENT_ID" = :b0 AND "DEPARTMENT_NAME" = :b1 AND "DML_TIMESTAMP" = :b2>
    Operation failed at seqno 3 rba 1740
    Discarding record on action DISCARD on error 1403
    Problem replicating HR.DEPARTMENTS to HR.DEPARTMENTS
    Record not found
    Mapping problem with compressed update record (target format)...
    DEPARTMENT_ID =
    DEPARTMENT_NAME = Administration7
    DML_TIMESTAMP = 2012-08-16 09:30:04.068000000
    In the source, I've got the following parameter files set up:
    Extract LHRDEV1
    DISCARDFILE discard.dsc, PURGE, MEGABYTES 200
    DiscardRollover at 02:00 ON SUNDAY
    TRANLOGOPTIONS EXCLUDEUSER ggs_owner COMPLETEARCHIVEDLOGTIMEOUT 600
    USERID GGS_OWNER@ORA11SRC, PASSWORD g
    ExtTrail dirdat/L1
    TABLE HR.DEPARTMENTS, GETBEFORECOLS (ON UPDATE KEYINCLUDING (DML_TIMESTAMP), ON DELETE KEYINCLUDING (DML_TIMESTAMP));
    Replicat RHRDEV1
    DISCARDFILE ./dirrpt/rhrdev1_discards.dsc, APPEND, MEGABYTES 100
    DISCARDROLLOVER at 02:00 ON SUNDAY
    USERID GGS_OWNER@ORA11SRC, PASSWORD g
    ASSUMETARGETDEFS
    REPERROR (DEFAULT, DISCARD)
    MAP HR.DEPARTMENTS, Target HR.DEPARTMENTS, COMPARECOLS (ON UPDATE KEYINCLUDING (DML_TIMESTAMP), ON DELETE KEYINCLUDING (DML_TIMESTAMP)), &
    RESOLVECONFLICT (UPDATEROWEXISTS, (DEFAULT, USEMAX (DML_TIMESTAMP))), &
    RESOLVECONFLICT (INSERTROWEXISTS, (DEFAULT, USEMAX (DML_TIMESTAMP))), &
    RESOLVECONFLICT (DELETEROWEXISTS, (DEFAULT, OVERWRITE)), &
    RESOLVECONFLICT (UPDATEROWMISSING, (DEFAULT, OVERWRITE)), &
    RESOLVECONFLICT (DELETEROWMISSING, (DEFAULT, DISCARD));
    In the target, I've got the following parameter files set up:
    Note: exactly the same as the source, except for the database connection in USERID.
    Extract LHRDEV1
    DISCARDFILE discard.dsc, PURGE, MEGABYTES 200
    DiscardRollover at 02:00 ON SUNDAY
    TRANLOGOPTIONS EXCLUDEUSER ggs_owner COMPLETEARCHIVEDLOGTIMEOUT 600
    USERID GGS_OWNER@ORA11TRG, PASSWORD g
    ExtTrail dirdat/L1
    TABLE HR.DEPARTMENTS, GETBEFORECOLS (ON UPDATE KEYINCLUDING (DML_TIMESTAMP), ON DELETE KEYINCLUDING (DML_TIMESTAMP));
    Replicat RHRDEV1
    DISCARDFILE ./dirrpt/rhrdev1_discards.dsc, APPEND, MEGABYTES 100
    DISCARDROLLOVER at 02:00 ON SUNDAY
    USERID GGS_OWNER@ORA11TRG, PASSWORD g
    ASSUMETARGETDEFS
    REPERROR (DEFAULT, DISCARD)
    MAP HR.DEPARTMENTS, Target HR.DEPARTMENTS, COMPARECOLS (ON UPDATE KEYINCLUDING (DML_TIMESTAMP), ON DELETE KEYINCLUDING (DML_TIMESTAMP)), &
    RESOLVECONFLICT (UPDATEROWEXISTS, (DEFAULT, USEMAX (DML_TIMESTAMP))), &
    RESOLVECONFLICT (INSERTROWEXISTS, (DEFAULT, USEMAX (DML_TIMESTAMP))), &
    RESOLVECONFLICT (DELETEROWEXISTS, (DEFAULT, OVERWRITE)), &
    RESOLVECONFLICT (UPDATEROWMISSING, (DEFAULT, OVERWRITE)), &
    RESOLVECONFLICT (DELETEROWMISSING, (DEFAULT, DISCARD));
    There is a trigger enabled on the table in both databases, as follows:
    CREATE OR REPLACE TRIGGER HR.set_DEPART_dmltstmp
    BEFORE UPDATE
    ON HR.DEPARTMENTS
    REFERENCING NEW AS New OLD AS Old
    FOR EACH ROW
    BEGIN
    IF SYS_CONTEXT ('USERENV', 'SESSION_USER') != 'GGS_OWNER' THEN
       :NEW.DML_TIMESTAMP := SYSTIMESTAMP;
    END IF;
    END;
    /
    I'm working on GoldenGate version 11g R2.
    Thanks in advance.

    You forgot to "ADD TRANDATA" (add PK logging) on the target. You did it on the source but just forgot to do it on the target. Here's how I know:
    1. It's a 1403 - no data found. Usually a supplemental logging issue.
    2. The discard file shows:
    DEPARTMENT_ID =
    I assume that this is all or part of a PK. No data here means no supplemental logging.
    Good luck,
    -joe
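    For reference, ADD TRANDATA in GGSCI is what creates the supplemental log group on the table. The SQL below is only a sketch of how to verify and, if needed, do the equivalent directly on the target database; the group name gg_dept is a placeholder, and including DML_TIMESTAMP mirrors the KEYINCLUDING clauses above:
    -- Check whether HR.DEPARTMENTS already has a supplemental log group on the target
    SELECT log_group_name, log_group_type
    FROM dba_log_groups
    WHERE owner = 'HR' AND table_name = 'DEPARTMENTS';
    -- If nothing comes back, run ADD TRANDATA HR.DEPARTMENTS in GGSCI, which is roughly equivalent to:
    ALTER TABLE hr.departments
      ADD SUPPLEMENTAL LOG GROUP gg_dept (department_id, dml_timestamp) ALWAYS;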

  • Using SSIS XML Source failing / renaming a node element

    Hello,
    I have tried to use the SQL Server SSIS XML Source reader to load a file from one of our data providers into SQL Server; however, the reader fails with the following error:
    The XML Muni File was unable to process the XML data. The element "default_status" cannot contain a child element. Content model is text only.
    It seems the reader fails when a node and its child element, here both named "default_status", share the same name.
    I am reaching out to anyone with a solution for this - or a way to write a SQL XML script that renames the children of that node to "default_status_id", for example; then we can use the out-of-the-box reader to load the file.
    Any ideas or suggestions would be greatly appreciated.
    Thanks
    eddy
    eddy.a

    You don't need to use SSIS here.
    You can simply use this:
    declare @x xml = '<instrument_details>
      <default_status>
        <default_status>13</default_status>
        <default_status_date>2013-12-24</default_status_date>
      </default_status>
      <default_status>
        <default_status>2</default_status>
        <default_status_date>2014-04-08</default_status_date>
      </default_status>
      <maturity_details>
        <maturity_amount>10650000.0000000</maturity_amount>
      </maturity_details>
      <denomination_amounts>
        <min_denom_amount>5000.000000</min_denom_amount>
      </denomination_amounts>
    </instrument_details>'
    select
        t.u.value('(maturity_details/maturity_amount)[1]', 'decimal(10,2)')      as maturity_amount,
        t.u.value('(denomination_amounts/min_denom_amount)[1]', 'decimal(10,2)') as min_denom_amount,
        m.n.value('default_status[1]', 'int')                                    as default_status,
        m.n.value('default_status_date[1]', 'datetime')                          as default_status_date
    from @x.nodes('/instrument_details') t(u)
    cross apply t.u.nodes('default_status') m(n)   -- one row per repeating default_status node
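    If the data arrives as a file rather than an inline literal, the same query works after bulk-loading the document into the xml variable. A sketch, where the path C:\data\muni.xml is just a placeholder:
    declare @x xml;
    select @x = cast(BulkColumn as xml)
    from openrowset(bulk 'C:\data\muni.xml', single_blob) as src;
    -- then run the same select ... from @x.nodes('/instrument_details') query as above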
    Please Mark This As Answer if it solved your issue
    Please Vote This As Helpful if it helps to solve your issue
    Visakh

  • Multi Data Source, Fail Over, Read-Only detected as unavailable

    We are using WLS 10.3.6.0 with WebCenter Content (WCC) as the application.
    For the production back end, RAC will be the production DB and we need to support failover to Data Guard.
    We set up a Multi Data Source (MDS) with the first simple Data Source pointing to RAC and the second Data Source pointing to Data Guard.
    Upon testing, this scenario basically works correctly: when RAC is shut down and Data Guard enabled, the switch occurs and the app works against Data Guard. Conversely, when Data Guard is shut down and RAC brought up, the MDS reverts to pointing to RAC and the app works correctly.
    However, the DBA insists that the alternate Data Source should not need to be completely down: if it is in a read-only state, the MDS should detect that, count it as unavailable, and try the other Data Source. This does not work with the default setup; I see in the logs that a connection can be made to the read-only DB instance, which will not work at all with the app and is not what the customer wants.
    Is there a way to configure the MDS so that if a Data Source is only available read-only, the alternate Data Source will be used?

    I may be able to describe a trick... Try setting the "test table" property of the Data Guard DataSource
    to some update statement, ideally one against an otherwise unused table, e.g.:
    SQL update myJunkTable set foo = 1 where 1 = 0
    This won't actually update the table (which could even have zero rows), but I would hope the
    read-only DBMS throws an exception anyway. That will give WLS the impression that
    it can't get healthy connections, and that DataSource won't be used until the read-only
    status is removed.
    However, from a standards point of view, a read-only DBMS may be perfectly useful for some applications,
    so that is not in itself a reason for WLS not to use it.
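    To make the trick concrete, here is a sketch of the one-time setup; myJunkTable and foo are the placeholder names from the reply above. Against a read-only standby the UPDATE raises an error, so WebLogic treats the pool's connections as unhealthy and the Multi Data Source skips that member:
    -- Create once on the primary (it replicates to the standby like any other table)
    CREATE TABLE myJunkTable (foo NUMBER);
    -- Value to put in the Data Guard data source's "Test Table Name" property; the leading
    -- SQL keyword tells WebLogic to run the rest verbatim as the connection test:
    --   SQL update myJunkTable set foo = 1 where 1 = 0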

  • Optimization and tuning of PL/SQL queries: the best knowledge source

    Hello,
    I'm looking for the best information about good practice in Oracle, everything about optimization and PL/SQL tuning. Can you help me?
    Thanks
    Edited by: adamkutera on 2010-04-21 12:11

    Dear adamkutera,
    The BEST documentation (including optimization, dos and don'ts, tuning tips, etc.) is the Oracle documentation.
    No other documentation is as up-to-date and current as Oracle's documentation.
    They can be found at
    http://www.oracle.com/technology/documentation/index.html
    and
    http://tahiti.oracle.com/
    Specifically for performance tuning:
    http://download.oracle.com/docs/cd/B14117_01/server.101/b10752/toc.htm
    vr
    Sudhakar B.
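    As a small taste of what those tuning guides cover, the usual first step with a slow statement is to look at its execution plan. This is only an illustrative sketch; the employees query is hypothetical:
    EXPLAIN PLAN FOR
      SELECT * FROM employees WHERE department_id = 50;
    -- Display the plan that was just explained
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);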

  • Installing from source failing

    I ran the $ ./configure script for libsigc++.
    When I attempted to optimize it for my CPU, I got the following error. I didn't get the error when I re-tried without optimization; however, the error returned when I ran $ make:
    make[3]: *** [libsigc-2.0.la] Error 1
    make[2]: *** [all] Error 2
    make[1]: *** [all-recursive] Error 1
    make: *** [all] Error 2
    There's some more info I'll post below, but the above is the ultimate 'reason' for the failure. I can't figure it out or find anything on Google.
    Here's the rest of the process which seems to result in the above message... anyone have any idea what's happening?:
    make all-recursive
    Making all in sigc++
    make all-am
    /bin/sh ../libtool --tag=CXX --mode=link g++ -g -O2 -o libsigc-2.0.la -rpath /usr/local/lib signal.lo signal_base.lo trackable.lo connection.lo slot.lo slot_base.lo lambda.lo
    g++ -dynamiclib -single_module ${wl}-flat_namespace ${wl}-undefined ${wl}suppress -o .libs/libsigc-2.0.0.0.0.dylib .libs/signal.o .libs/signal_base.o .libs/trackable.o .libs/connection.o .libs/slot.o .libs/slot_base.o .libs/lambda.o -install_name /usr/local/lib/libsigc-2.0.0.dylib -Wl,-compatibility_version -Wl,1 -Wl,-current_version -Wl,1.0
    ld: multiple definitions of symbol __Unwind_GetRegionStart
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_GetRegionStart in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_GetRegionStart
    ld: multiple definitions of symbol __Unwind_DeleteException
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_DeleteException in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_DeleteException
    ld: multiple definitions of symbol __Unwind_FindEnclosingFunction
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_FindEnclosingFunction in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_FindEnclosingFunction
    ld: multiple definitions of symbol __Unwind_ForcedUnwind
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_ForcedUnwind in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_ForcedUnwind
    ld: multiple definitions of symbol __Unwind_GetDataRelBase
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_GetDataRelBase in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_GetDataRelBase
    ld: multiple definitions of symbol __Unwind_GetGR
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_GetGR in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_GetGR
    ld: multiple definitions of symbol __Unwind_GetIP
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_GetIP in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_GetIP
    ld: multiple definitions of symbol __Unwind_GetLanguageSpecificData
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_GetLanguageSpecificData in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_GetLanguageSpecificData
    ld: multiple definitions of symbol __Unwind_GetTextRelBase
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_GetTextRelBase in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_GetTextRelBase
    ld: multiple definitions of symbol __Unwind_RaiseException
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_RaiseException in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_RaiseException
    ld: multiple definitions of symbol __Unwind_Resume
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_Resume in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_Resume
    ld: multiple definitions of symbol __Unwind_SetGR
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_SetGR in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_SetGR
    ld: multiple definitions of symbol __Unwind_SetIP
    /usr/lib/gcc/darwin/3.3/libgcc.a(unwind-dw2.o) private external definition of __Unwind_SetIP in section (__TEXT,__text)
    /usr/local/lib/libgcc_s.1.dylib(unwind-dw2_s.o) definition of __Unwind_SetIP
    /usr/bin/libtool: internal link edit command failed

    HalosGhost wrote:
    I doubt this because of your comments, but are you using systemd/user to manage your X login? If you are, then your PATH is hard-coded, and you'll need to add /usr/local/bin to the window manager service file.
    All the best,
    -HG
    No, this is wrong.  I mean, yes, it is hard-coded.  And it is wrong for the filesystem that we have here on Arch.  But it still includes /usr/local/sbin:/usr/local/bin first, and then /usr/sbin:/usr/bin.  This is because Fedora apparently didn't go all the way with their /usr merge, so they now have /sbin -> /usr/sbin and /bin -> /usr/bin.
    Edit: See this thread for more info (it was the last one I opened): https://bbs.archlinux.org/viewtopic.php?id=164333 - there is a link to a bug report, which was promptly closed, that includes the information about Fedora's half-finished /usr merge.
    Last edited by WonderWoofy (2013-07-04 22:37:09)
