[solved] google-talkplugin PKGBUILD issues...

Installing the google-talkplugin (for Firefox, and wot wot)
https://aur.archlinux.org/packages/google-talkplugin/
I read Matthew_Moore's 11-22-2014 comment and updated the URL and SHA. The URL seemed to work, but the SHA failed verification.
Where in the world do you find the SHA string? It's not in any of the sub-directories of the URL.
[xtian@MasterShake google-talkplugin]$ sudo makepkg --asroot -s
==> Making package: google-talkplugin 5.38.6.0-1 (Tue Nov 25 14:19:19 EST 2014)
==> Checking runtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for inter-conflicts...
Packages (1): lsb-release-1.4-14
Total Download Size: 0.01 MiB
Total Installed Size: 0.04 MiB
:: Proceed with installation? [Y/n] y
:: Retrieving packages ...
lsb-release-1.4-14-any 6.7 KiB 3.27M/s 00:00 [#########################] 100%
(1/1) checking keys in keyring [#########################] 100%
(1/1) checking package integrity [#########################] 100%
(1/1) loading package files [#########################] 100%
(1/1) checking for file conflicts [#########################] 100%
(1/1) checking available disk space [#########################] 100%
(1/1) installing lsb-release [#########################] 100%
==> Checking buildtime dependencies...
==> Retrieving sources...
-> Downloading license.html...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 19289 0 19289 0 0 178k 0 --:--:-- --:--:-- --:--:-- 179k
-> Downloading google-talkplugin_current_x86_64.rpm...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 7140k 100 7140k 0 0 2125k 0 0:00:03 0:00:03 --:--:-- 2125k
==> Validating source files with sha1sums...
license.html ... Skipped
google-talkplugin_current_x86_64.rpm ... FAILED
==> ERROR: One or more files did not pass the validity check!
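There is no published SHA string to look up anywhere: the value in the PKGBUILD is just the SHA-1 of the source file itself, so when Google silently replaces the .rpm the sum has to be regenerated. A sketch of the usual options, hashing a stand-in file since the real .rpm changes over time (`updpkgsums` comes from the pacman-contrib package):

```shell
# updpkgsums   # rewrites the sha1sums=() array in ./PKGBUILD in place
# makepkg -g   # prints fresh sums to paste into the PKGBUILD manually
# Or hash the downloaded file yourself -- shown here against a stand-in payload:
printf 'stand-in for the downloaded .rpm' > demo-source.bin
sha1sum demo-source.bin        # paste this hex digest into sha1sums=()
```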
Last edited by xtian (2014-11-26 19:26:50)

Here's the wiki page for installing packages. Suppose we could edit it to reflect that information?
Installing packages from the AUR is a relatively simple process. Essentially:
    Acquire the tarball which contains the PKGBUILD and possibly other required files, like systemd-units and patches (but often not the actual code).
    Extract the tarball (preferably in a folder set aside just for builds from the AUR) with tar -xvf foo.tar.gz.
    [MAYBE HERE? Build your package as non-root user...]
    Run makepkg in the directory where the files are saved (makepkg -s will automatically resolve dependencies with pacman). This will download the code, compile it and pack it. [or MAYBE HERE?]
    Look for a README file in src/, as it might contain information needed later on.
    Install the resulting package with pacman:
    # pacman -U /path/to/pkg.tar.xz
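The list above can be dry-run in a terminal. This sketch fakes the AUR download with a locally built tarball ("foo" is a placeholder; real tarballs come from aur.archlinux.org), and the Arch-specific build/install steps are left commented since they need makepkg and pacman:

```shell
mkdir -p ~/aur-builds && cd ~/aur-builds            # a folder set aside for AUR builds
mkdir -p foo && echo 'pkgname=foo' > foo/PKGBUILD   # stand-in for the downloaded tarball
tar -czf foo.tar.gz foo && rm -r foo
tar -xvf foo.tar.gz                                 # extract the tarball
cd foo                                              # the PKGBUILD lives here
# makepkg -s                        # build as a normal user; -s resolves deps via pacman
# sudo pacman -U foo-*.pkg.tar.xz   # install the resulting package
```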
AUR helpers add seamless access to the AUR. They vary in their features but can ease in searching, fetching, building, and installing from PKGBUILDs found in the AUR. All of these scripts can be found in the AUR.
Warning: There is not and will never be an official mechanism for installing build material from the AUR. All AUR users should be familiar with the build process.
[HERE AS ASIDE.. FOR YOUR OWN SAKE, DON'T JUST INSTALL --AS-ROOT...
Last edited by xtian (2014-11-25 19:25:51)

Similar Messages

  • [SOLVED] makepkg -s PKGBUILD issues

    Hey all, this is my second install of Arch.  I love it, but I am having some issues this time around that I did not have the first install.
    Here is the issue:
    Arch does not seem to recognize that I have installed dependencies when using makepkg.  For example, I want to install compiz-bzr.  I download the tarball and extract it to a folder.  I run:
    makepkg -s PKGBUILD
    and it tells me it can't resolve the dependency gmock. When I install gmock via the AUR the same way, it installs fine, but when I go to install compiz-bzr again, the same message says it cannot resolve the dependency gmock.
    if i run:
    pacman -Qs gmock
    nothing shows as being installed. This does not just happen with compiz-bzr; it happens with anything I install via the AUR. I am sure I am missing something, probably simple, but for the life of me I cannot figure it out...
    please help!
    Last edited by Obrez (2014-03-14 04:03:51)

    man makepkg wrote:       -i, --install
               Install or upgrade the package after a successful build using
               pacman(8).
    edit: too slow   again
    Last edited by skunktrader (2014-03-14 03:59:18)

  • Has anyone solved the "garbled characters" issue i...

    When I use Nokia Maps on my N95, all streets and towns which contain special characters such as: é, è, ü, ä, ö, etc… do appear garbled. For example,
    Zürich appears as: Zã¼rich
    München appears as: Mã¼nchen
    Liège appears as: Liã"ge
    I saw a couple posts on this forum where people suggested to format the memory card, some other suggested to reload the firmware (my N95 has 12.0.013) but nothing has helped. I went as far as using someone's PC whose WinXP is installed in French - I thought that maybe the operating system or regional settings could have some influence, but I was wrong.
    Has anyone solved this extremely annoying issue?
    Thx

    I solved it. It took me one evening. I wonder why Nokia haven't published any comments on this old issue.
    The solution can be found below, but first I want to share some technical details. Maybe the guys at Nokia will read them.
    As someone on this forum said before the problem is that map names are in UTF-8 format, but phone tries to show them using another encoding. After some googling I found that this issue is well-known because Nokia phones use ISO 8859-1 (which is similar to Windows-1252, Western European) by default but some application assume UTF-8 to be default.
    At first, I discovered that when I start Nokia Maps with an empty SD card and allow it to download something from the internet, all names are shown correctly. I supposed that the problem was in that awful Map Loader, but I was wrong: it does absolutely nothing special with data files, just downloading and unzipping. And this super-mega-feature-rich Map Loader requires about a 40 MB download for .NET 3.0 plus 10 MB for itself. Never seen anything worse.
    So I started comparing the contents of one SD card with incorrect street and city names against another with correct ones.
    After some time I came across Cachedatheader.cdt file in diskcache folder. And what do you think?
    The file from the correct SD card contains a literal string: "UTF-8", while the incorrect one contains a string saying "ISO 8859-1". I nearly fell out of my chair after seeing that.
    I replaced the wrong file with the correct one and the names became readable in all 180 MB of my previously downloaded maps.
    Finally, after playing with it some more I found a simple solution which should work for everyone:
    ***** SOLUTION *****
    1. Enable internet access in Nokia Maps application (set it to When Needed).
    2. Access memory card from your PC and go to "\PRIVATE\20001f66\diskcache" folder.
    3. Delete the "CachedatHeader.cdt" file. Perhaps you should back it up somewhere before deleting it, but it looks like it can be removed safely without affecting existing maps.
    4. Put the card back in the phone if you extracted it and start Nokia Maps.
    5. Allow Nokia Maps to access internet and try downloading some new locations (you don't need to download much, just let it access server) or searching for something.
    6. Now codepage should be correct and all names become readable.
    This effect seems to be permanent and will stay even if you disable internet access later. Also note that you don't have to format card, remove or reinstall any map files. The problem is just in one small file.
    Nokia Maps recreates missing CachedatHeader.cdt, but it will recreate it correctly only if internet access is enabled when it starts. Probably the best way to avoid wrong codepage problem is to enable internet access at first start of Nokia Maps and use Map Loader only after downloading some bits of map directly from the phone.
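Steps 2-3 amount to a single copy-and-delete once the card is mounted on a PC. A simulation on a scratch directory (on a real card the path is \PRIVATE\20001f66\diskcache under the card's mount point; /tmp/nokia-card-demo here is only a stand-in):

```shell
card=/tmp/nokia-card-demo                                  # stand-in for the mounted memory card
mkdir -p "$card/PRIVATE/20001f66/diskcache"
printf 'ISO 8859-1' > "$card/PRIVATE/20001f66/diskcache/CachedatHeader.cdt"   # the bad header
cp "$card/PRIVATE/20001f66/diskcache/CachedatHeader.cdt" "$card/CachedatHeader.cdt.bak"  # backup first
rm "$card/PRIVATE/20001f66/diskcache/CachedatHeader.cdt"   # Nokia Maps recreates it on the next online start
```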
    Message Edited by ankor85 on 09-Jan-2008 09:02 PM

  • Can you help me solve my aspect ratio issue?

    Hey guys,
    I'm posting this in hopes that someone can help me solve an aspect ratio issue with a project that I have to finish by tomorrow.
    I'm trying to avoid having to re-edit the entire thing.
    I shot the project with my Nikon D7000 DSLR. in 720p 30p (29.97)
    I am running Adobe Premiere Pro CS5 (the trial version).  I have the full version of Adobe Premiere Pro CS3.
    The reason I was working with the trial is that CS3 can't handle the H.264 MOVs that the D7000 records.
    You can work with a timeline, but if you try to export anything, it never works.  I always have to convert my footage to ProRes MOVs with MPEG Streamclip before I can work with it in CS3.  That takes a lot of time and a lot of hard drive space.  After much research, I found out that CS5 is the way to go for DSLR footage.  I just need to save up my pennies for the upgrade.
    So I thought I'd edit this short (1 minute) project with the trial to see how CS5 works with the D7000 footage.
    Upon installing the trial program, I found that the project presets were limited.  I knew this ahead of time because it's clearly stated on the Adobe's website.
    But I wasn't aware how limited they are.
    Since "DSLR 1280x720p 30p" is not an option with the trial, I was going to use HDV 720p 30p.  But that was not an option either!
    So basically, my only option for 16x9 30p was "DV NTSC Widescreen".  It's my understanding that this is 864x480 (in square pixel aspect ratio) or 720x480 (in widescreen pixel aspect ratio).  I needed the output file to be 864x486.  I downsized the footage to 69% in the "Video Effects: Motion" setting so it looked correct in the project.  I didn't think about those extra 6 pixels until I output the file and saw thin black lines on the top and bottom.  My guess is that Premiere is adding black pixels because my project is technically 720x480 (1.2121).
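The arithmetic behind those thin lines checks out: at a 1.2121 pixel aspect ratio, the 720x480 frame is about 873x480 in square pixels, 6 lines short of the 864x486 target vertically. A quick check:

```shell
awk 'BEGIN {
    printf "DV frame in square pixels: %.0f x %d\n", 720 * 1.2121, 480   # ~873 x 480
    printf "delivery target:           864 x 486\n"
    printf "vertical shortfall:        %d lines\n", 486 - 480            # the black bars
}'
```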
    Any thoughts on how I can get a clean 864x486 export?  I'd rather not re-edit the whole thing...which I would have to do in CS3 after I spent a few hours converting the original files in MPEG Streamclip.  I don't know if there is a way to export something out of CS5 and then open a new project in CS3 to make this work.
    Thanks in advance!
    - Jordan

    On export, just crop a few pixels off of each side; that'll let the image scale correctly to the output frame size without black bars.
    I'm not running the trial, but you should still be able to create a custom sequence preset using the Desktop editing mode. Just switch over to the General tab when you create a new sequence, and choose "Desktop" from the editing modes. Set the rest of the parameters as you need them.
    Even easier: once you've imported your footage, just drag a clip to the New Item icon at the bottom of the project panel; a sequence will be created matching your footage parameters. You can edit at full-resolution, and then export to your desired frame size when complete--you'll probably still need to crop a few pixels (in the Export Settings window) to eliminate the black bars.

  • [solved] Google chat: no sound

    I don't have a webcam, and I'm trying to set up Gmail chat sound. I have installed google-talkplugin [1], but neither the mic nor playback works. The same devices do work with (sorry) Skype. I have tried Firefox and Chromium with the same result.
    As additional info, the Conversation - Media menu items are disabled in Pidgin (while setting the voice level does work). Probably some common libs are missing.
    Where to dig in?
    [1] http://aur.archlinux.org/packages.php?ID=40056
    Last edited by student975 (2011-08-13 16:24:08)

    I have one built-in mother board nVidia card:
    [trusktr@rocketship ~ #3 : 22:47:48]$ lspci | grep Audio
    00:05.0 Audio device: nVidia Corporation MCP61 High Definition Audio (rev a2)
    02:00.1 Audio device: nVidia Corporation GF108 High Definition Audio Controller (rev a1)
    [trusktr@rocketship ~ #1 : 22:29:37]$ arecord -l
    **** List of CAPTURE Hardware Devices ****
    card 0: NVidia [HDA NVidia], device 0: ALC662 rev1 Analog [ALC662 rev1 Analog]
    Subdevices: 1/1
    Subdevice #0: subdevice #0
    card 0: NVidia [HDA NVidia], device 1: ALC662 rev1 Digital [ALC662 rev1 Digital]
    Subdevices: 1/1
    Subdevice #0: subdevice #0
    When I test the configuration in gmail->settings->chat->"voice and video chat" I can hear the ringing sound, video works, but there is no microphone input. Also, the drop down list for the mic input shows only output devices.
    The drop down list for Microphone sound shows these:
    Default device
    HDA NVidia, ALC662 rev1 Analog - Front speakers
    HDA NVidia, ALC662 rev1 Analog - 4.0 Surround output to Front and Rear speakers
    HDA NVidia, ALC662 rev1 Analog - 4.1 Surround output to Front, Rear and Subwoofer speakers
    HDA NVidia, ALC662 rev1 Analog - 5.0 Surround output to Front, Center and Rear speakers
    HDA NVidia, ALC662 rev1 Analog - 5.1 Surround output to Front, Center, Rear and Subwoofer speakers
    HDA NVidia, ALC662 rev1 Analog - 7.1 Surround output to Front, Center, Side, Rear and Woofer speakers
    HDA NVidia, ALC662 rev1 Digital - IEC958 (S/PDIF) Digital Audio Output
    And the drop down list for the Speakers shows these:
    Default device
    HDA NVidia, ALC662 rev1 Analog - Front speakers
    HDA NVidia, ALC662 rev1 Analog - 4.0 Surround output to Front and Rear speakers
    HDA NVidia, ALC662 rev1 Analog - 4.1 Surround output to Front, Rear and Subwoofer speakers
    HDA NVidia, ALC662 rev1 Analog - 5.0 Surround output to Front, Center and Rear speakers
    HDA NVidia, ALC662 rev1 Analog - 5.1 Surround output to Front, Center, Rear and Subwoofer speakers
    HDA NVidia, ALC662 rev1 Analog - 7.1 Surround output to Front, Center, Side, Rear and Woofer speakers
    HDA NVidia, ALC662 rev1 Digital - IEC958 (S/PDIF) Digital Audio Output
    HDA NVidia, HDMI 0 - HDMI Audio Output
    HDA NVidia, HDMI 0 - HDMI Audio Output
    HDA NVidia, HDMI 0 - HDMI Audio Output
    HDA NVidia, HDMI 0 - HDMI Audio Output
    I have no /etc/asound.conf or ~/.asoundrc files.
    Last edited by trusktr (2011-08-27 05:48:44)

  • [SOLVED] google-chrome update fails

    Hi,
    Apologies if this is a dumb-basic problem, but for some reason chrome is not updating via pacaur.
    I noticed chrome was complaining about being an unsupported version in gmail, and I realised my google-chrome-stable was v38, so I went to update it in pacaur...
    pacaur -S google-chrome
    :: Package(s) google-chrome not found in repositories, trying AUR...
    :: resolving dependencies...
    :: looking for inter-conflicts...
    AUR Packages (1): google-chrome-43.0.2357.81-1
    :: Proceed with installation? [Y/n] y
    :: Retrieving package(s)...
    :: View google-chrome PKGBUILD? [Y/n] n
    [sudo] password for timbeauhk:
    /usr/bin/pacaur: line 720: cd: /tmp/pacaurtmp-timbeauhk/google-chrome: No such file or directory
    :: Building google-chrome package(s)...
    ==> ERROR: PKGBUILD does not exist.
    :: Could not clean google-chrome
    I have just performed a pacman -Syu and all updated.
    I have rebooted, even though this should not be necessary.
    chrome was not running at the time.
    It is likely some other fundamental error/mistake on my part, as not being able to change directory is a pretty basic step to fail on, and hints at a silent mkdir failure?
    The dir and privs, from a root perspective, are as follows:
    [root@machine timbeauhk]# ls -al /tmp/pacaurtmp-timbeauhk/
    total 4
    drwxr-xr-x 2 timbeauhk users 60 Jun 6 13:04 .
    drwxrwxrwt 8 root root 180 Jun 6 13:40 ..
    -rw-r--r-- 1 timbeauhk users 61 Jun 6 13:05 rpc.json
    Ideas would be most appreciated.
    Thanks in advance.
    Last edited by timbeauhk (2015-06-06 07:39:23)

    jasonwryan wrote:Install it manually, see if that works...
    That did it, thanks.
    Shame it did not install via pacaur, but hey.

  • [Solved] Google video plugin and libpng dependency

    Hi,
    I am trying to install the plugin necessary to enable hangouts on Google+. I am using the AUR package:
    When I run makepkg I get the following error:
    ==> Making package: google-talkplugin 2.8.5.0-1 (Thu Jun  7 11:30:05 CDT 2012)
    ==> Checking runtime dependencies...
    ==> Missing Dependencies:
      -> libpng12>=1.2.13
    ==> Checking buildtime dependencies...
    ==> ERROR: Could not resolve all dependencies.
    I checked for the libpng version and its as below.
    [santosh@flyingdutchman google-talkplugin]$ pacman -Ss libpng
    extra/libpng 1.5.10-1 [installed]
        A collection of routines used to create PNG format graphics files
    From what I understand the version I already have installed (1.5.*) is greater than 1.2.13 as the dependency stated for the package. Could anyone explain what the problem is and how can I fix it?
    Thanks in advance,
    Cloud
    Last edited by cloudstrife (2012-06-07 18:00:47)

    Hi,
    Thanks for the solution. What is confusing is whether the missing package is libpng12 or libpng. If it's libpng, then I have 1.5.* installed as I mentioned previously and I don't see why I need to install anything.
    But I get a feeling I read it wrong: it needs a package libpng12 whose version must be greater than 1.2.13, and that is available in the AUR as mentioned by Pres.
    Could anyone confirm this?
    Appreciate the help.
    - Cloud

  • [SOLVED] Fresh Install Error - Issue with PGP Key 4096R/753E0F1F

    Hello Guys,
    This is my first topic. I have done all the research in this forum and in Google for the issue, with no luck in finding a solution. I have done several Arch Linux installs before on virtual machines and PCs, all of them working properly, except for this one, which I cannot get past the install phase.
    Description of Hardware:
    HP Laptop Pavilion Entertainment PC dv2-1030us
    Athlon AMD Processor
    4 GB RAM
    300GB HD
    ATI Video Card
    Broadcom wireless Card (Default drivers Not Working)
    Wired Network Card (Working)
    I'm trying to install Arch from USB drive created from another arch machine via:
    sudo dd if=/name_of_file.iso of=/dev/sdX
    Using Arch Version:
    Current Release: 2014.08.01
    Included Kernel: 3.15.7
    ISO Size: 559.0 MB
    The above bootable USB drive has been used to install Arch on another machine currently running it.
    After creating the partitions on the drive, namely:
    BOOT
    SWAP
    ROOT
    HOME
    And formatting them accordingly to the install guide, and having the system ready to download all the base and base-devel packages via:
    pacstrap -i /mnt base base-devel
    Right after it finishes downloading I get the following error:
    downloading required keys in keyring...
    :: Import PGP key 4096R/753E0F1F, "Anatol Pomozov <[email protected]>", created: 2014-02-04? [Y/n] y
    error: key "Anatol Pomozov <[email protected]>" could not be imported
    error: required key missing from keyring
    error: failed to commit transaction (unexpected error)
    Errors occurred, no packages were installed.
    ==> ERROR: Failed to install packages to new root
    I first ignored the error and then tried to proceed with the installation process and executed the following sequences of commands:
    genfstab -U -p /mnt >> /mnt/etc/fstab
    and then tried to chroot with:
    arch-chroot /mnt
    Then i got the following error:
    mount: mount point /mnt/etc/resolv.conf does not exist
    chroot: failed to run command "/bin/sh": No such file or directory
    Here is list of the Topics i Found in this forum that throw some light into the issue, some of the solutions I have tried with no luck at all:
    https://bbs.archlinux.org/viewtopic.php?id=185089
    https://bbs.archlinux.org/viewtopic.php?id=181057
    https://bbs.archlinux.org/viewtopic.php?id=178185
    I hope you can help me find a direction to start fixing this issue, since I feel pretty lost right now. By the way, I'm a newbie in the Arch Linux world, so there are a lot of things that are not that obvious to me...
    Thank you in advance.
    Last edited by alejandroccs (2014-08-20 21:12:13)

    I've had the same issue with an (outdated) Arch Linux iso. As far as I am aware, it is caused by pacman's keys becoming out of date.
    I've managed to get around this before, but the exact method eludes me.
    You could try just updating the archlinux-keyring package, by refreshing pacman (pacman -Sy), then reinstalling only that package (try pacman -S archlinux-keyring).
    Then you should be able to continue as normal.
    Note that this only affects the CD filesystem in RAM - it will not persist across reboots using the CD.
    The reason why this might work is that the package archlinux-keyring might not be signed by the key you were having issues with. If it still fails, you could temporarily set the SigLevel for the core repo to 'Never', however this does disable package signing, at least until you reset the SigLevel.
    Unfortunately, I could not replicate the problem in a Qemu VM, so I'm not sure how good my method is. Hopefully, it fixes the problem.
    Here are some links that might help shed some light:
    http://calvinx.com/category/unix-linux/archlinux/
    https://wiki.archlinux.org/index.php/Pacman-key
    Hopefully this is helpful!
    pypi

  • How do i solve the activate app issue in DPS App Builder?

    I have no apps showing up in the download section of DPS App Builder.
    And on the App Status section, I am being prompted to activate app?
    Can anyone help please?

    I have a creative cloud subscription. 
    Sent from Samsung Mobile
    Prabhjyot Kaur <[email protected]> wrote: Re: How do i solve the activate app issue in DPS App Builder?
    If you are DPS single edition user, then you need to enter serial number to activate app.
    But it would be more helpful if you let us know which DPS edition you have?

  • When is Apple going to release an update to our software that will eliminate a thiefs ability once the phone is stolen to turn off the find my phone option without first entering the security code? I think that this would solve a ton of issues.


    It would be a violation of privacy laws in many countries.
    Every attempt to fight "evil" has a downside, which is likely the reason they have not done so yet.
    In the country I live in, if a phone is stolen I report its IMEI number to the police, and they have all carriers (and the carriers they work with in other countries) block the phone from ever being used as a phone.

  • [SOLVED] google test framework: linking issue

    Hello,
    I've got a problem with the Google test framework; a few days ago I tried this on another machine and did not get any problems.
    I got this from svn and built libgtest.a. Then I tried to compile the first sample as:
    g++ -I../include -L../ -lgtest -lpthread ../src/gtest_main.cc sample1.cc sample1_unittest.cc
    But got a lot of linking errors like:
    /tmp/ccVHpTQc.o: In function `main':
    gtest_main.cc:(.text+0x28): undefined reference to `testing::InitGoogleTest(int*, char**)'
    gtest_main.cc:(.text+0x2d): undefined reference to `testing::UnitTest::GetInstance()'
    gtest_main.cc:(.text+0x35): undefined reference to `testing::UnitTest::Run()'
    /tmp/ccQuonuE.o: In function `FactorialTest_Negative_Test::TestBody()':
    sample1_unittest.cc:(.text+0x99): undefined reference to `testing::internal::AssertHelper::AssertHelper(testing::TestPartResult::Type, char const*, int, char const*)'
    sample1_unittest.cc:(.text+0xac): undefined reference to `testing::internal::AssertHelper::operator=(testing::Message const&) const'
    sample1_unittest.cc:(.text+0xb8): undefined reference to `testing::internal::AssertHelper::~AssertHelper()'
    sample1_unittest.cc:(.text+0x15d): undefined reference to `testing::internal::AssertHelper::AssertHelper(testing::TestPartResult::Type, char const*, int, char const*)'
    sample1_unittest.cc:(.text+0x170): undefined reference to `testing::internal::AssertHelper::operator=(testing::Message const&) const'
    sample1_unittest.cc:(.text+0x17c): undefined reference to `testing::internal::AssertHelper::~AssertHelper()'
    sample1_unittest.cc:(.text+0x221): undefined reference to `testing::internal::AssertHelper::AssertHelper(testing::TestPartResult::Type, char const*, int, char const*)'
    sample1_unittest.cc:(.text+0x234): undefined reference to `testing::internal::AssertHelper::operator=(testing::Message const&) const'
    sample1_unittest.cc:(.text+0x240): undefined reference to `testing::internal::AssertHelper::~AssertHelper()'
    sample1_unittest.cc:(.text+0x271): undefined reference to `testing::internal::AssertHelper::~AssertHelper()'
    sample1_unittest.cc:(.text+0x2b4): undefined reference to `testing::internal::AssertHelper::~AssertHelper()'
    sample1_unittest.cc:(.text+0x2f9): undefined reference to `testing::internal::AssertHelper::~AssertHelper()'
    As a workaround I unpacked the library and got gtest-all.cc.o:
    ar x ../libgtest.a
    So when I link samples with this object file - no errors:
    g++ -I../include -L../ gtest-all.cc.o -lpthread ../src/gtest_main.cc sample1.cc sample1_unittest.cc
    My g++ version is
    [ds|samples]$ g++ --version
    g++ (GCC) 4.7.2
    Copyright (C) 2012 Free Software Foundation, Inc.
    This is free software; see the source for copying conditions. There is NO
    warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
    What can be wrong with my gcc?
    Last edited by ds80 (2013-01-27 20:26:44)

    The problem was fixed by changing the ordering a little - I just moved -lgtest to the end:
    g++ -I../include -L../ ../src/gtest_main.cc sample1.cc sample1_unittest.cc -lgtest -lpthread

  • PKGBUILD issue - ${CARCH} doesn't work

    [ "${CARCH}" = "i686" ] && KARCH=i386
    [ "${CARCH}" = "x86_64" ] && KARCH=x86_64
    Just doesn't work. That's why KARCH ends up empty and there is no vmlinuz26 in the pkg at the end. I can rewrite the PKGBUILD or solve this issue another way, but I'm interested in where ${CARCH} is set.
    PS
    Tried to google but it finds Connecticut Association of Residential Care Homes etc. )))
    PPS
    I used to build my kernel without ABS, but there were some problems with the nvidia drivers (wrong paths for the modules directory).. and now I've decided to use ABS to avoid any other problems

    So it was set to "pentium4"
    That'll work for now, but when makepkg is made port-friendly your CARCH will need to be set to one of the "supported" arches, otherwise makepkg may not build your packages. This is the way it is in Frugalware, and while the Arch devs like to do things themselves, it would be silly to rewrite such basic code to support multiple arches.
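For anyone finding this later: CARCH is set in /etc/makepkg.conf (or ~/.makepkg.conf) and exported to the PKGBUILD by makepkg, which is why a value like "pentium4" makes both chained tests fail and leaves KARCH empty. A slightly more defensive mapping, with a uname fallback for running outside makepkg:

```shell
CARCH="${CARCH:-$(uname -m)}"    # makepkg normally provides this from makepkg.conf
case "$CARCH" in
    i686)   KARCH=i386   ;;
    x86_64) KARCH=x86_64 ;;
    *)      KARCH=$CARCH ;;      # unexpected arches pass through instead of leaving KARCH empty
esac
echo "KARCH=$KARCH"
```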

  • [Solved] Black screen - Installation issues on Dell Inspiron 1720

    Hi,
    I'm new to Arch Linux, and I want to post my experience/issues from my first installation:
    I performed the core installation (base + devel packages, and wireless tools as an extra package)
    relevant hardware/drivers used:
    b44 for broadcom nic (wired)
    iwl3945 for my wireless nic
    intel g965 graphics card (intel GM965 express chipset - xf86-video-intel driver)
    Installation went ok till i had to upgrade the system:
    pacman -Suy (after package synchronization)
    glibc: /usr/bin/tzselect exists in filesystem
    glibc: /usr/sbin/zdump exists in filesystem
    glibc: /usr/sbin/zic exists in filesystem
    After some research on the wiki i found that i had to query the packages to know if those files are owned by a program or not (if no package owns those i could remove them, but it was not the case)
    pacman -Qo tzselect
    pacman -Qo zdump
    pacman -Qo zic
    All were owned by the package tzdata:
    so i reinstalled the package tzdata:
    pacman -S tzdata
    Should this be filed as a bug, or is this a stupid newbie error? (I did the installation on a virtual machine and on a laptop (twice) and always got this error)
    After that i was able to upgrade the system.
    Following that i installed the video card drivers:
    pacman -S xf86-video-intel and i rebooted the system.
    After reboot i got the grub screen, booted the system and got a black screen (the last thing i saw was something like loading udevevents..)
    Googling again and looking the forums here, i read that i had to chroot my environment and update the system again (pacman -Suy -> which did not help).
    So after some more searching i found 2 things:
    - https://wiki.archlinux.org/index.php/KM … odesetting  : i added this in menu.lst: nomodeset
    - i blacklisted the intel driver
    Rebooted and my graphics were working again.
    So later on i'll test if the parameter solved my problem, or blacklisting the driver.
    - i'll unblacklist the driver and reboot -> if it works then i know it's that parameter
    - if not ok, i'll blacklist the driver and remove the parameter from the menu.lst
    I hope to find what's causing the issue.
    Anyone having issues with the intel driver?
    Thanks for sharing your experience/info.
    Last edited by Manuuz (2012-05-12 13:12:54)

    Getting a little bit further, it requires a lot of configuration to get what you want..
    - I just installed fluxbox
    - added my user to the sudo group (allowing to reboot)
    - installed Slim
    What I've noticed also is that when you select an interface in /etc/rc.conf, for example eth0, and it's not connected at bootup, you lose about 10 seconds before the boot process proceeds.
    If you comment out that line, you get a red line (error) saying that no interface is defined.
    I guess i'll just have to disable the network daemon at startup and install some kind of network connection manager for my wireless at logon.

  • Think disabling WMM on G580 solved wifi Limited Access issue

    I'm posting this in the hope that it may help someone else.  I have a G580, model 20150, with Windows 8.1 and the Broadcom wifi card.  Like many of you, I had the Limited Access dropout issue.  Here are the things I can remember trying previously:
    Switching from the Microsoft Broadcom driver to the Broadcom driver.
    Turning off power management on the Broadcom driver
    Setting IBSS mode to just b/g
    Disable Bluetooth collaboration.
    Disabling all the power management things I could find.
    etc...
    The thing I found that seemed to fix it once and for all is going to the Advanced properties of the Broadcom and disabling "WMM".  As I understand it, WMM is a QoS feature for prioritizing audio/video port traffic over data.  Supposedly, you can't take advantage of wireless N speeds with WMM off, but I'll trade performance for not having to reconnect every 15 minutes.
    I'm not going to mess with telling you how to disable WMM since there are tons of instructions and discussions on the interwebs.  I also can't guarantee that it wasn't a combination of WMM and other fixes that solved my problem because I'm not willing to experiment on my finally working pc.  My system has been constantly connected and going strong for 12 hours.  
    Good luck and I hope this helps.

    Don't these usually work by intercepting the DNS queries and then redirecting you to the gateway for authentication?  Check your DNS settings and try to use theirs for the nameserver.  Make sure something else isn't getting specified, such as Google Public DNS.
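    As a quick check of that theory, you could compare what the gateway's resolver returns against your configured nameservers. The gateway address 192.168.1.1 here is only an example:

    ```shell
    # See which nameservers the machine is actually using
    cat /etc/resolv.conf

    # Ask the gateway's DNS directly; a captive portal usually answers
    # every name with the gateway's own address until you authenticate
    nslookup example.com 192.168.1.1

    # If a public resolver like 8.8.8.8 is hard-coded, the portal's
    # DNS redirect never fires and the login page never appears
    ```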

  • Firefox 3.6.13 won't log onto gmail or google accounts, plus issues with hotmail

    Running 3.6.13 on Windows 7 Pro 64-bit. Since 3.6.12, I've had hit-or-miss issues being able to log onto my gmail (no, not a password issue). When I tried loading gmail (either through the insecure or the https version), the connection would often time out, with Firefox telling me the connection was interrupted and/or had been reset. Hotmail would load, but if I selected multiple messages to move or delete, it wouldn't work, and a notice would pop onto my hotmail session saying that I wasn't connected to the internet.
    So I tried Chrome, and the gmail issue wasn't a problem in Chrome. But, sure enough, the Hotmail thing was still an issue in Chrome.
    After running through all the suggestions I could find here and elsewhere--including deleting cookies, cache, etc., and creating a new profile, and running in safe mode, and moving to Firefox 4 beta (same issue as 3.6.x), and a total uninstall & reinstall of Firefox--nothing worked. In fact, after upgrading to 3.6.13, gmail wouldn't load at all. I couldn't even log on to any aspect of my google accounts--so, no docs, no calendar, etc.
    So I started over, with a system wipe & reinstall of the OS. After initial install of Firefox, it *seemed* to have solved the problem. But soon after--and this was before adding any customizations other than Adblock Plus (which I have disabled on any google domain)--the problem returned. I ran through the troubleshooting again (e.g., deleting cache, cookies, creating new profile, etc.). No luck. Then I realized the culprit might be... JAVA!
    Uninstalled Java, and lo and behold, gmail loaded. But hotmail was still problematic. And gmail was still hit-and-miss, loading slowly when it did work.
    So I went to my work computer, which is also running Win 7 Pro 64-bit, and copied all my Firefox files (install directory, profile, etc.) over to my home computer. Y'see, my work computer--despite having all the same settings and software--has no problem with gmail or hotmail. Well, those same settings applied to my home computer still were problematic. And no, it's nothing to do with my home network setup--which hasn't changed since before I started having these issues, anyway.
    So I did a complete uninstall of Java on my home system, and voila! Gmail works, doesn't take a whole lot of time to load (though it's still sluggish compared to my work computer). Hotmail is still buggy, but hey--it's hotmail.
    Unfortunately, when I then installed OpenOffice, Java was re-installed (and I'm not sure how essential having a standalone java install is for OpenOffice, so I'm holding off on any uninstall) and now I'm back to hit-and-miss gmail.
    Long story short, there is something in Java (and I've tried versions 21, 22, and 23) that is messing up Firefox when it comes to gmail, google accounts, and hotmail. Chrome is not affected in gmail or google accounts, but has issues in hotmail. IE8 isn't affected in google or hotmail.
    I hate IE8, and I'm not sold on Chrome. Don't drive me away, Firefox.
    Prior to my OS wipe & reinstall, other sites that use java--primarily for file uploading, like flickr & photo finishing businesses--were not working with Firefox 3.6.13 or Firefox 4 beta.
    Again, clearing cache, clearing cookies, creating a new profile, running in safe mode, uninstalling any individual and/or all add-ons and extensions... none of these was the answer to this problem. Even a fresh OS install didn't quite fix it. Java seems to be implicated in the problem, and I'll be posting this issue over at Oracle's forums, too.
    Other computers I have on my home network are affected by this issue, as well, but not all of them, and not to the same degree.

    I also have this problem, and it just started in the last week or so. It seems to be dependent on my home network, and the problem only exists with Firefox. I have used Chrome and IE8 with no issues. I can verify tomorrow that it only exists on my network, but one thing I was able to test is that the problem exists even on my Linux boot. I am totally dumbfounded by this problem and I can't find anything that will allow the gmail page to load. All other pages I have tried load fine, albeit a little slower than normal, but they load. If anyone knows of a difference between Firefox and all other browsers in how they go through the router, I would appreciate the info, because I don't know of any differences.
