How to manage "forks" of official packages?

I'm trying to figure out a way to use a custom repository to track packages I copy from ABS and modify. Creating the packages, and the repo DB, is easy. I just can't figure out how to sanely USE such a repository.
What I'd like to emulate is a "highest version wins regardless of source repo" behavior, and I know pacman doesn't do this. I want name collisions of packages to resolve to my repo when myversion >= officialversion, and to the official package otherwise.
The value of this behavior would be that pacman alerts me (by nominating for upgrade) when the official package is updated and my modified version is thus out of date. Then, I cancel the pacman upgrade and go update my personal repo instead.
If I put my repo first in pacman.conf, I can't track the official versions of the modified packages. If I put it last, my repo adds no value over not having a personal repo at all and just doing "pacman -U" on the individual packages (which is what I'm doing now).
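For illustration, the ordering I'm talking about is just this (paths are made up); pacman picks the first repo that carries a given package name, regardless of version:

    # /etc/pacman.conf -- repo order decides which same-named package wins
    [custom]                  # listed first: my packages always shadow the official ones
    Server = file:///home/me/customrepo

    [core]
    Include = /etc/pacman.d/mirrorlist

    [extra]
    Include = /etc/pacman.d/mirrorlist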
I want to make it clear that I'm NOT complaining about pacman's behavior, or asking for a change. I think rather that I'm going about this the wrong way, and so I'd like to know how others have solved this problem.
(This thread is also related to my "which official packages do you modify?" thread, in which I try to determine how important this problem is to other users.)

ataraxia wrote:
So, it's taken me a while to write a reply to this thread. I had the problem that when I saw the suggestions, I immediately didn't like them. It's taken me 3 days to figure out why not, and I felt that I'd better have a good reason before I reject the advice of both a pacman developer and a repo maintainer on this subject. I do appreciate the replies, and I don't want to look like a jerk.
My feeling is basically that these answers don't make any improvement over what I've been doing -  just not having my custom packages in a repo at all, but installing them with pacman -U. Adding shell scripts to my system is not very KISS. I'm a sysadmin by trade, and if there's one thing I've learned, it's to use as few tools as possible. This is why I avoid quality software like rebase, reflector, powerpill, and yaourt - great stuff, but I can get by without them. (Naturally, Xyne, being the author of several of those examples, probably doesn't subscribe to this philosophy.)
What a jer...
ataraxia wrote: I wonder if I'm suffering from the XY problem here. I don't really want to use a personal repo to solve this problem (in fact I'd prefer to avoid it for the complexity it represents) - I just want to be able to fork official PKGBUILDs in a way that's lightweight, flexible, and scalable, and I saw a personal repo as a good shot at success. Perhaps readers of this thread could tell me how they handle this problem? I'm basically looking for methodology, not helper-tools.
My understanding is that you want to maintain custom versions of repo packages in the same release cycle. You also want to achieve this without using simple tools (or, by extension of your KISS argument, complex tools) and in the simplest way possible. I see that as a contradiction.
Even considering the XY problem, your original post suggested a willingness to accept a fair amount of complexity so it seems strange that you would reject a very simple script to achieve your goal. If doing it all manually is undesirable but using a script is unacceptable then I don't see a solution.
There's an unavoidable (if small) level of complexity involved in what you want to do. Unless you have figured out a way to automate building the custom packages with each new release with whatever changes you've made (which would certainly require complex tools), you will still need to do that manually. If you only need to install these packages on your own system then a simple "makepkg -irs" will work. If you need to share the packages then you have to "makepkg" and then <create a repo and host it|write scripts|use pkgd|do it manually|etc>.
The former is as simple as it gets. The latter will require tools or time. I opt for using tools in that case. In either case, the remaining issue is notification of the update cycle. To achieve that, you either have to check for newer versions manually or use a script. Your original post involved essentially hijacking pacman's update function using repo trickery in order to manually abort a download which you would later have to resume just to find out about new releases. That involves repo maintenance (creation, updating, cleaning) and at least 2 calls to pacman. A script to print out a list of available updates seems very much like an improvement to me. You've already stated that you don't see that as an improvement.
I suspect that I still don't fully understand what you actually want to achieve.
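For concreteness, here is a minimal sketch of the kind of update-check script I mean, assuming you keep the names of your customized packages in a plain file (~/.customized-packages is a made-up name) and compare the installed version against the sync repos with pacman's own vercmp:

    #!/bin/sh
    # List locally modified packages whose official repo version is newer than
    # the installed (customized) one. Paths and file format are assumptions.
    list="$HOME/.customized-packages"
    while read -r pkg; do
        local_ver=$(pacman -Q "$pkg" 2>/dev/null | awk '{print $2}')
        repo_ver=$(pacman -Si "$pkg" 2>/dev/null | awk '/^Version/ {print $3; exit}')
        if [ -z "$local_ver" ] || [ -z "$repo_ver" ]; then
            continue
        fi
        # vercmp prints a number below zero when the first version is older
        if [ "$(vercmp "$local_ver" "$repo_ver")" -lt 0 ]; then
            echo "$pkg: installed $local_ver is older than repo $repo_ver -- time to rebuild the fork"
        fi
    done < "$list"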
Last edited by Xyne (2009-09-21 02:29:37)

Similar Messages

  • How to resolve a "The Extension Package is invalid." error message in Adobe Extension Manager CS5/CS6

    I'm using Windows XP Professional SP3. I've tried to install (unfortunately, to no effect, till now) Layers Control by Vadim Chtcherbakov, Floating Adj by David Barranca, Photoshop Blending Modes by Rufus Deuchler, and The Fader by Capture Monkey. The Adobe Extension Manager versions I have installed on my PC are 5.0.1 for CS5 and 6.0.8 for CS6... Thanks in advance for your consultation!

  • How to manage one WSP and DLL for multiple clients in a farm environment

    1. There is a product developed using C#, jQuery, CSS and the SharePoint object model, packaged into a .wsp file. Whenever we introduce new functionality we branch the previous code as a version, say version 1.0, and the new functionality goes into another solution. This is how we manage the code in TFS as versions. Each newer version has new functionality, and we do not give the latest functionality to all clients; each client has its own version of the functionality. Technically, in order to access the functionality the WSP solution must be present in the solution repository of the SharePoint Central Administration site, and from there it is deployed on the client's site. We have followed this process with SharePoint standalone installations: we purchase a dedicated server per client, install SQL Server and SharePoint Foundation 2010 as a standalone installation, add the client's version of the code to the solution repository, and then host it on the site created for that client. The process is the same for every client, with an individual server purchased for each.
    Now we want to host our product in a SharePoint Foundation 2010 farm environment using a three-tier architecture:
    • SQL Server - on this server we will install SQL Server 2008 R2 Standard Edition, which will provide the database service for all the web applications/site collections we create on the web front-end server.
    • Application server - on this server we will install SharePoint as a farm and install Search Server Express to provide search functionality for our product.
    • Web front-end server - this server will be added to the SharePoint farm created on the application server. Here we will create the web applications and site collections for all the clients.
    In this scenario, how do we manage multiple versions of the same WSP solution?
    Another major issue concerns the architecture of the product and the new approach to client deployment. We have CSS and jQuery files that serve the functionality, and these files are mapped to the 14 hive folder. If we change one of those jQuery or CSS files for the latest version but not for an older version, how do we manage that change in the 14 hive folder, given that there is only one 14 hive folder? What is the best practice here? And how do we manage the DLL files for individual clients?

    It sounds like you have a farm-scoped solution at work. In that case you can only have a single instance of it per farm; you'd have to branch each version so they appear to be entirely separate solutions (thus ruining your clients' upgrade process).
    Bluntly, I don't think a single farm can manage all your user environments.

  • "Distribution Manager failed to process package" triggers an update of the packages

    Hi all,
    we have randomly following issue:
    SCCM 2012 SP1 CAS takes a snapshot of a Windows update package in the morning (I'm not sure how the frequency of this check is determined). If for any reason it fails, SCCM automatically redistributes the package to all sites. This happened again this morning for 5 Windows update packages. You understand that this means gigabytes sent to all 66 secondary sites, a useless amount of data sent out.
    From the Status messages I see
    Information Milestone RC0 06.11.2014 07:12:11 SCHVSGGSC600.rccad.net SMS_DISTRIBUTION_MANAGER 2300 Distribution Manager is beginning to process package "SUP-2014.09" (package ID = RC00017B).
    Then come a lot of update-list entries with the comment "taking a snapshot", and finally:
    Error Milestone RC0 06.11.2014 07:12:29 SCHVSGGSC600.rccad.net SMS_DISTRIBUTION_MANAGER 2302 Distribution Manager failed to process package "SUP-2014.09" (package ID = RC00017B).    Possible cause: Distribution manager does not have access to either the package source directory or the distribution point.  Solution: Verify that distribution manager can access the package source directory/distribution point.    Possible cause: The package source directory contains files with long file names and the total length of the path exceeds the maximum length supported by the operating system.  Solution: Reduce the number of folders defined for the package, shorten the filename, or consider bundling the files using a compression utility.    Possible cause: There is not enough disk space available on the site server computer or the distribution point.  Solution: Verify that there is enough free disk space available on the site server computer and on the distribution point.    Possible cause: The package source directory contains files that might be in use by an active process.  Solution: Close any processes that maybe using files in the source directory.  If this failure persists, create an alternate copy of the source directory and update the package source to point to it.
    This immediately triggers an update of all DPs:
    Information Milestone RC0 06.11.2014 07:43:52 SCHVSGGSC600.rccad.net SMS_DISTRIBUTION_MANAGER 2304 Distribution Manager is retrying to distribute package "RC00017B".    Wait to see if the package is successfully distributed on the retry.
    Any idea how this can be avoided? Nobody changed the packages, and we suppose it was a temporary connection issue between the CAS and the package repository server.
    Also, can this check be set to run once a week, for instance, or even less often?
    Thanks,
    Marco

    Hi Daniel,
    thanks for the prompt answer. Actually I saw it yesterday, at least for 1 package (the last one). The error is generated by SQL Server killing a task:
    Adding these contents to the package RC00010D version 347.  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:23  7796 (0x1E74)
    Sleep 30 minutes...  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:23  3652 (0x0E44)
    *** [40001][1205][Microsoft][SQL Server Native Client 11.0][SQL Server]Transaction (Process ID 152) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:25  5460 (0x1554)
    STATMSG: ID=2302 SEV=E LEV=M SOURCE="SMS Server" COMP="SMS_DISTRIBUTION_MANAGER" SYS=SCHVSGGSC600.rccad.net SITE=RC0 PID=2144 TID=5460 GMTDATE=jeu. nov. 06 06:12:25.422 2014 ISTR0="SUP-2012.Q4" ISTR1="RC000068" ISTR2="" ISTR3="" ISTR4="" ISTR5="" ISTR6="" ISTR7="" ISTR8="" ISTR9="" NUMATTRS=1 AID0=400 AVAL0="RC000068"  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:25  5460 (0x1554)
    Failed to process package RC000068 after 0 retries, will retry 100 more times  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:25  5460 (0x1554)
    Exiting package processing thread.  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:25  5460 (0x1554)
    Used 4 out of 5 allowed processing threads.  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:26  3300 (0x0CE4)
    Starting package processing thread, thread ID = 0x894 (2196)  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:27  3300 (0x0CE4)
    Used all 5 allowed processing threads, won't process any more packages.  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:27  3300 (0x0CE4)
    Sleep 1828 seconds...  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:27  3300 (0x0CE4)
    STATMSG: ID=2300 SEV=I LEV=M SOURCE="SMS Server" COMP="SMS_DISTRIBUTION_MANAGER" SYS=SCHVSGGSC600.rccad.net SITE=RC0 PID=2144 TID=2196 GMTDATE=jeu. nov. 06 06:12:27.716 2014 ISTR0="SUP-2014.M05" ISTR1="RC00011D" ISTR2="" ISTR3="" ISTR4="" ISTR5="" ISTR6="" ISTR7="" ISTR8="" ISTR9="" NUMATTRS=1 AID0=400 AVAL0="RC00011D"  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:27  2196 (0x0894)
    Start updating the package RC00011D...  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:27  2196 (0x0894)
    CDistributionSrcSQL::UpdateAvailableVersion PackageID=RC00011D, Version=14, Status=2300  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:27  2196 (0x0894)
    *** [40001][1205][Microsoft][SQL Server Native Client 11.0][SQL Server]Transaction (Process ID 154) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:27  1424 (0x0590)
    Taking package snapshot for package RC00011D  SMS_DISTRIBUTION_MANAGER  06.11.2014 07:12:27  2196 (0x0894)
    Now, as I mentioned, it was probably a SQL (more likely) or network issue; however, the question was more about how to avoid this. In the end the packages were not changed, and if possible I would like to avoid any automatic SCCM action on packages unless an operator manually triggers it.
    I didn't see any error today, so it should not be a configuration issue.
    Is this possible?

  • How to know whether a business package like 'PPS' is installed in an SRM system

    Dear experts,
    Could you please let me know how to check whether a business package like 'PPS' is installed in an SRM system?
    Is the following the correct way to check:
    SPRO -> SRM Server -> Activate Business Function?
    Thanks and regards,
    Ranjan

    See the following links:
    http://wiki.sdn.sap.com/wiki/display/SRM/PPS-ProcurementforPublicSector-+enhancements#PPS-ProcurementforPublicSector-enhancements-PPSprocessisbasedinExtendedClassicscenario
    http://help.sap.com/saphelp_srm70/helpdata/EN/c9/8d47ed4a5548c08521615d84a590e6/content.htm
    Procurement for Public Sector (PPS)
    Procurement for Public Sector (PPS) is tailored to meet the procurement needs of public sector organizations. PPS is based on SAP Supplier Relationship Management (SAP SRM), in many cases extending and augmenting standard SAP SRM functionality to meet the demands of public sector organizations. PPS offers cost savings and improved efficiency through seamless integration between the contract management and financial processes, while complying with international procurement policies and public regulations.
    Prerequisites
    In SAP ERP, you have activated the Enterprise Extension Public Services (EA-PS). Depending on the required functions in PPS, you have activated a selection of the following business functions:
    http://help.sap.com/saphelp_srm70/helpdata/EN/c9/8d47ed4a5548c08521615d84a590e6/content.htm

  • Understanding how BB manages HTML in emails and attachments.

    Hi all,
    I'm trying to understand how BB manages HTML in emails and in attachments.
    I created a simple HTML table with some color and no particular complications.
    If I send this table as an HTML email, on my BB it is shown correctly, with all colors and attributes.
    If I package this HTML code into a file named try.html and then send it as an email attachment, on my BB many attributes are lost, including the colors.
    Can you explain why, and possibly how to solve this problem?
    Thanks.

    I too am somewhat annoyed by the manner in which messages are displayed using the standard e-mail inboxes. I recently started using the Gmail application for my personal e-mail and am very happy with how cleanly messages are displayed. I hope I can find a similar application to handle my other addresses, because it is very difficult to scroll through a large volume of mail that contains HTML and URLs every day.
    If anyone knows of an application that more cleanly displays messages, please do tell. Otherwise, I'm going to be doing a bit of research this evening to see what I can find. I'll post back with my findings.

  • Upgrading certain AUR packages when specific official packages change

    I'm having some trouble with updating. Like many Archers, I use some packages from the AUR. In my case, I'm using bauerbill specifically to update packages from AUR just like I would with pacman. Quite convenient (thx, xyne!)
    But I have some packages that need to be rebuilt from AUR when certain official packages are updated, e.g. compiling thinkhdaps against the new kernel version, or updating Lightning and Enigmail from AUR when I update my Thunderbird. However, so far I have not found a way to do this: calling specific commands when certain packages are changed.
    Is there any way to do this in pacman or one of the other arch package managers? Or do you have a clue about how to write a script that could do this? Any help would be greatly appreciated.
    Last edited by Natanji (2010-07-22 06:49:54)

    dmz wrote:You could watch (with inotify) the pacman log for specific applications and events. Turn this into a daemon, and execute relevant commands when application X is updated.
    But I think the problem there is that you wouldn't know which applications to keep watching, since some might need to be rebuilt and others might not.
    I guess watching every package installed outside of pacman (pacman -Qm) would be the only way to see which AUR packages need to be rebuilt. Depending on how many packages you have installed from AUR or external sources, this could be anywhere from very fast to extremely slow.
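    For what it's worth, a rough sketch of that inotify approach might look like this (the map file name and its contents are made up; it only prints what needs rebuilding rather than building anything):

    #!/bin/sh
    # Hypothetical map file: "trigger-package  aur-package-to-rebuild", one pair per line, e.g.
    #   kernel26      thinkhdaps
    #   thunderbird   thunderbird-lightning
    map="$HOME/.rebuild-map"
    log=/var/log/pacman.log

    # inotifywait (from inotify-tools) blocks until pacman appends to its log
    inotifywait -m -e modify "$log" | while read -r _event; do
        # look at the most recently appended lines for upgrade records
        tail -n 5 "$log" | grep ' upgraded ' | while read -r line; do
            pkg=$(printf '%s\n' "$line" | sed 's/.* upgraded \([^ ]*\) .*/\1/')
            rebuild=$(awk -v p="$pkg" '$1 == p {print $2}' "$map")
            [ -n "$rebuild" ] && echo "rebuild needed from AUR: $rebuild (trigger: $pkg)"
        done
    done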

  • How to manage or monitor my RTP stock at the customer

    dear all,
    I have RTP (Returnable Transport Packaging) material to deliver along with the material ordered by the customer.
    How do I manage or monitor my RTP stock at the customer?
    And what movement type is used when I send it to / receive it back from the customer?
    thanks
    imron

    Hi,
    You can see the RTP at the customer's site in MMBE using special stock indicator 'V'.
    For more information check the following link:
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/dd/560051545a11d1a7020000e829fd11/frameset.htm
    Regards,
    Ramakrishna

  • Download all official Arch Linux packages

    Hello all!
    I've used Arch for 6 months, and it rocks!
    It's hard to get an internet connection in my country,
    so I'm just wondering if (and how) I can download all the official Arch Linux packages,
    so that other people and I can taste the beauty of Arch Linux
    without needing an internet connection.
    Thanks for everything

    I use nightly downloads of core/extra/community using lftp and the mirror command (can't use rsync because of employer restrictions). After the initial download - it's a breeze and it has the advantage of rsync in that only changes are downloaded. Being a rolling release though - the less often you download the more updates there will be ...
    [edit]
    here's the script I use for the nightly downloads ...
    #!/bin/sh
    # function defs
    say() { echo -e "$*"; }
    tod() { date '+%H:%M:%S'; }
    doit() { # args: 1) server, 2) source, 3) target, 4) filemask
        test -d $3 || ( mkdir -p $3; chmod a+rwx $3; )
        # write the lftp command file
        ( say "open $1"
          say "mirror -L -v -r -e -i \"$4\" $2 $3"
          say "quit"
        ) >$chatf
        ( lftp -f $chatf || {
            say "-- 'lftp' error!!"; cat $chatf
            rm -f $chatf; return 1
        } ) | tr -d "['\`]" | sed \
            -e 's/Transferring file/++/' -e 's/Removing old file/--/'
        rm -f $chatf
    }
    # definitions
    chatf=/tmp/chatf.$$
    home=/usb/linux
    base=archlinux
    test -d $home/$base || {
        say "-- $home/$base not found!!"
        exit 1
    }
    server=archlinux.unixheads.org
    versions="core extra community"
    arch="i686 x86_64"
    files=".tar.gz\$"
    # do the needful
    say "\n## date: `date '+%a (%V/%j) %Y-%m-%d %H:%M:%S'`"
    for v in $versions; do
        for a in $arch; do
            path="$base/$v/os/$a"
            say "#+ /$path @ `tod`"
            doit $server /$path $home/$path $files
            say "#- done @ `tod`"
        done
    done
    exit 0
    Obviously, you can put this in crontab and have it all automated - redirect stdout to a logfile and you can see what's been updated ;-)
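    For example, a crontab entry along these lines would run it every night at 02:30 (the script path and log file below are just placeholders):

    # min hour dom mon dow  command
    30 2 * * * /usr/local/bin/arch-mirror.sh >>/var/log/arch-mirror.log 2>&1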
    Last edited by perbh (2009-08-12 15:18:51)

  • How to manage AUD$.

    Hi All,
    I have enabled audit_trail on my DB server. So can anyone guide me on how to manage or monitor its size?

    sb92075 wrote:
        I have enabled audit_trail on my DB server. So can anyone guide me on how to manage or monitor its size?
        AUDIT data is not different than other data within the DB. Treat AUDIT data the same as other application data.
    This is utterly wrong. RTFM, please, before posting such nonsense.
    To the OP: the supported way of managing audit data is the DBMS_AUDIT_MGMT package. You can find more details about it in the documentation & MOS.
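    Not from the original thread, but as a rough sketch of the DBMS_AUDIT_MGMT route (11g or later; verify the procedure names and parameters in the docs for your release before running anything like this), a purge could be driven from the shell roughly as follows. INIT_CLEANUP only needs to run once and will raise an error if the cleanup infrastructure is already initialised.

    #!/bin/sh
    # Sketch only: purge AUD$ rows already covered by the last archive timestamp.
    sqlplus -s / as sysdba <<'EOF'
    BEGIN
      -- one-time setup of the cleanup infrastructure
      DBMS_AUDIT_MGMT.INIT_CLEANUP(
        audit_trail_type         => DBMS_AUDIT_MGMT.AUDIT_TRAIL_AUD_STD,
        default_cleanup_interval => 24);
      -- delete standard audit trail records up to the last archive timestamp
      DBMS_AUDIT_MGMT.CLEAN_AUDIT_TRAIL(
        audit_trail_type        => DBMS_AUDIT_MGMT.AUDIT_TRAIL_AUD_STD,
        use_last_arch_timestamp => TRUE);
    END;
    /
    EOF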

  • How to Manage the Resource Master in Primavera Enterprise Version

    We are using the P6 enterprise version in our company, with a central database. We have created our own unique resources, standardised across the enterprise and admin-protected. However, we often have to import schedules sent by our consultants/clients, and those schedules carry their own resources.
    So when we import these schedules, the foreign resources are imported along with them and pollute our resource master.
    Can anyone let me know how to manage such a situation?
    Regards

    We have met this issue. You have a few ways to deal with it, but each has its own pros and cons.
    1- You can create a separate "working" DB into which you import those files and keep them there only. Your master DB is then used for masters and your own programs only.
    2- You can do the same just for checking the resources: vet them before importing into the final DB.
    3- P3 and MSP import lets you select the node under which the resources will be placed. We created a resource node called "Imported Resources" and dumped all those resources into it. Our users don't have access to that node, so those resources can only be used on the imported program and can't be assigned to any other program / activity.
    4- Enforce the same resource list across your supply chain. We managed to do that for our main contractors and subcontractors; it formed part of their contracts.

  • How I Managed to Install Windows 8.1 Pro on an iMac 27-inch, Late 2013

    As is typical with many of my Boot Camp installs, this one did *not* go as smoothly as planned. This installation was particularly troublesome, so I thought I would share how I managed to set up Boot Camp on a brand new iMac (27-inch, Late 2013) using a Windows 8.1 Pro DVD and an external Apple SuperDrive. These are the steps I took:
    Download Boot Camp Support Software 5.1.5640 (http://support.apple.com/kb/DL1721)
    Copy Boot Camp Support Software to empty MS-DOS (FAT)-formatted USB key
    Run Boot Camp Assistant and select third option only (Install Windows 7 or later version)
    Computer would reboot after Boot Camp Assistant finished but would boot back into Mac environment and not continue Windows installation
    Boot computer holding down option key and select Windows (with DVD icon) and go through initial setup process (could get as far as trying to format the Boot Camp partition as NTFS, then got stuck)
    Quit Windows installer and reboot computer holding down option key and select EFI Boot
    Run Windows installer
    These same steps worked for another iMac (same model) that I set up later the same day.

    Note: These types of discs or activities are not supported by DVD or CD sharing:
    DVD movies.
    Audio CDs.
    Copy protected discs such as game discs.
    Install discs for an operating system such as Microsoft Windows (for use with Boot Camp), or Mac OS X.
    Burn a CD or DVD

  • How to manage photos from multiple accounts

    Hi,
    I am trying to solve one problem. I have one iMac with two user profiles connected to different Apple IDs. Next to it there are two iPhones and two iPads. I am trying to find out how to manage photos from all these devices using one account on the iMac. My second question is whether there is any function in iPhoto that can publish images to a NAS server. An alternative for me is to buy iCloud drive space and use Family Sharing for publishing photos. A text schema is attached below; it is only my idea and I don't know if it could work like this. Has anyone solved this configuration? Thanks

    I can see the screenshot if I double click on it. However, here it is again. Being Halloween, maybe it'll show.

  • My family put multiple devices on the icloud, and I need to know how to manage duplicate entries.  Specifically contacts.  If I fix the contact list on my pc will it push the info out to the other devices and maintain it correctly?

    All devices signed into the same iCloud account will finish up with the same contacts. Of course if prior to joining iCloud two family members each had an entry for Uncle Fred, then you will finish up with two contact cards for Uncle Fred, and so on. If you tidy this up on your computer then the changes will propagate to everyone else.

  • How to manage my daughter's new phone account

    We just got my 11 year old daughter her first phone (a Samsung S3 Mini). The salesman told us we would be able to use the "Family Base" to control/ turn-on-off her phone, text, web access and email and we would have full control there. Our main goal (as we explained) was to let her have the phone, texting, and gmail, but to turn off her web access (unless she was being supervised - at home). He told us we would definitely be able to keep GMail on, and separately control her web access.  Now that I am looking through the Family Base control center, I'm not really seeing all that.  I see you can manage phone and texting (which we've decided to let her have on), but I don't see management of web access, and I don't see exactly how to manage that separately from her ability to email.
    My second issue is... I would like to activate a "Find my Mobile" or "Find My Phone" type service (because I'm worried about her leaving her phone somewhere) but I'm wondering if this service requires that her phone have web access (and maybe location services on?) so that it can function.
    Surely there are other adults out there trying to manage their kids the same way I am. Can someone please give me some tips/ best practices or suggestions on how to manage all this?
    In addition to my concerns about managing her web usage (while maintaining email), what's the best "find my phone" that I should implement?
    Thank you in advance.

    A few things you can do here.
    1) The Verizon Rep Lied to you
    2) You can download parental control apps from the Play Store that are password protected and work much better than Family Base. Look at those and dump Family Base.
    3) Download Where's My Droid. It will do all of the find-my-phone options and much more. Check it out.
    4) You might also want to download something like Phone Tracker which will allow you to track the phone and its whereabouts near real-time so you can see where she is at all times. (Check local laws in your area first though to make sure you are legal doing this)
