Refactor vcs prefetching routines out of PKGBUILDs

Hi,
I would like to discuss a suggestion for the PKGBUILD format.
Let's take a snippet from the "kdevelop-git" PKGBUILD file [1]:
_gitroot="git://anongit.kde.org/kdevelop"
_gitname="kdevelop"

build() {
  cd "$srcdir"
  msg "Connecting to Git server...."
  if [ -d $_gitname ] ; then
    cd $_gitname && git pull origin
    msg "The local files are updated."
  else
    git clone $_gitroot $_gitname
  fi
  msg "Git checkout done."
  msg "Starting make..."
If you have looked at some PKGBUILDs where the package is built from a git source, you might have seen these lines quite often. The problem with this example is that it does not use git's "--depth 1" option. This option can save you a LOT of bandwidth when you *only* need to build a large repository and are not interested in the last three years of its history.
This strategy has already been discussed here on the board, but I think it is very important for people with low bandwidth, and for overall bandwidth savings, to enable this option in PKGBUILDs that clone from git repositories.
I would like to suggest that 'makepkg' should take care of prefetching the (git) repository if it can find the "$_gitroot" and "$_gitname" variables. This should happen in a step before 'build()', so that when the 'build()' function begins the repository is already preloaded. Importantly, makepkg should do this with the "--depth 1" option. The same approach is adaptable for Mercurial, SVN and CVS repositories.
Gentoo, for example, is already doing this with their EGIT_REPO_URI variable.
I have not looked at the code of makepkg yet, but it looks like you already have similar logic for extracting source tarballs. Another benefit of this method is that it would not break existing PKGBUILDs: they would run into the first branch of the if statement and simply try to pull from the repository again. In the long term, package maintainers could get rid of these redundant statements.
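To make the idea concrete, here is a rough sketch of what such a prefetch step could look like; the function name and layout are only illustrative, not an existing makepkg interface:

# Rough sketch of a prefetch step makepkg could run before build();
# the function name and error handling are illustrative only.
prefetch_git() {
  cd "$srcdir"
  if [ -d "$_gitname" ]; then
    # Repository already present: just update it.
    ( cd "$_gitname" && git pull origin )
  else
    # First fetch: shallow clone to save bandwidth.
    git clone --depth 1 "$_gitroot" "$_gitname"
  fi
}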
If you want, I could write a patch for makepkg that implements this routine.
I hope you like the idea; please tell me your opinions.
Best,
Daniel Nagy
[1] https://aur.archlinux.org/packages/kd/k … t/PKGBUILD

Great, this looks almost exactly like what I am trying to achieve. A second problem is that there are many PKGBUILDs which are no longer maintained. I mean the package maintainer did not flag it as unmaintained and does not respond to comments. Is there any process for flagging these packages as unmaintained so that other people would be able to update the code to use the new vcs-url technique?

Similar Messages

  • First PKGBUILD, no success

    Could anyone kindly help me out? I'm trying to make a PKGBUILD with little success.
    https://aur.archlinux.org/packages/tbs-linux-drivers/
    It should build some drivers + install them and some firmware to /usr/lib/firmware/

    Sorry for not being more specific. I was expecting someone to try out the PKGBUILD, which was a silly assumption.
    OK, I cleaned away all the "|| return 1" things. During the build() function it stopped with a permission error:
    make -C /tmp/makepkg/src/linux-tbs-drivers/v4l
    make: *** /tmp/makepkg/src/linux-tbs-drivers/v4l: Permission denied. Stop.
    make: *** [all] Error 2
    ==> ERROR: error occurred in function build().
        Reversing...
    ==> ERROR: Makepkg was unable to build tbs-linux-drivers.
    The output above was translated; my locale is Finnish.

  • VCS ova import fails VMware workstation 8

    Hello all,
    I was recently surprised and excited by the release of the virtual VCS and set out installing!!
    Using VMware Workstation 8 I imported the VCS ova file. VMware waits for a while, then displays a massive agreement screen which cannot be moved or resized to reach the Agree button. The only way to remove this massive browser screen is Alt+F4, which then ends the import, and the VCS installation fails.
    Can anyone please advise if it is possible to run the VCS on workstation 8?
    (ESXI is not an option as I'm attempting to install on a desktop and esxi doesn't recognise IDE and sata as storage)
    I'm really wanting to get this working ASAP for my personal development and testing so any assistance would be greatly appreciated.
    Thank you

    For production it's the only way to go anyway.
    For labbing, if you google a bit you can find how to run ESXi under VMware Workstation or Fusion.
    I would also not rule out that with some modding you could run the ova on virtualization
    platforms other than ESXi, but that is not supported anyway.
    Please remember to rate helpful responses and identify helpful or correct answers.

  • Refactoring problem

    I have an iPhone project that needs some refactoring. The problem is that the refactoring option is greyed out and, no matter what I do, I am unable to refactor anything in this project. If I start a new iPhone project, the refactor menu is selectable and I am able to refactor easily. What am I doing wrong? Is there some option I am missing?
    Thanks.

    To those that are having the same problem, I did not find a real solution, but I moved everything to a new project and the "magic" refactor button works properly. Hope that helps.
    orangekay, your regressive response is obviously that of an old dog that can't learn new tricks. With your line of thinking, we would still be driving around in horse-drawn carriages.

  • Ocenaudio PKGBUILD request

    I recently heard about a new audio editor (Ocenaudio), and wanted to try it out.  However, there's no PKGBUILD in the AUR, and, while I plan on trying to write a PKGBUILD for it, I figured someone might be able to do it much faster and smoother than I can.
    Link: http://www.ocenaudio.com.br/download (available in 32 and 64 bit .deb and .rpm packages)

    Your install script tries to install directly to /usr/local; what you want is for it to install to $pkgdir/usr/local or something like that. Check out other PKGBUILDs for just about any package to see how that's done.
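    As a rough illustration, a package() function for a binary package like this might look something like the following; the archive name and layout are assumptions, so adjust them to the actual download:

    package() {
      cd "$srcdir"
      # Extract the Debian data archive straight into $pkgdir so the files
      # end up under the packaging root instead of the live filesystem.
      # (Newer .deb files may contain data.tar.xz instead of data.tar.gz.)
      ar x ocenaudio_amd64.deb data.tar.gz
      tar -xzf data.tar.gz -C "$pkgdir"
    }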

  • Progress on Unity under Arch Linux!

    See here for information about the new GNOME 3.12-compatible packages: https://bbs.archlinux.org/viewtopic.php … 3#p1404683
    I'm now on IRC! Come join us at #unityforarch on Freenode
    To install Unity from my repos:
    See the wiki: https://wiki.archlinux.org/index.php/un … mmended.29
    To install Unity from source:
    See the wiki: https://wiki.archlinux.org/index.php/unity#From_source
    -- You probably don't want to read anything below --
    The story
    So...rather than wasting internet bandwidth to download a new Ubuntu ISO to test out the new Unity features, I decided to try to make it work under Arch Linux. It took a whole lot longer than I expected to get it even partially working. So, here's my story:
    Knowing that Unity isn't in the main repositories, I went to the AUR website and looked for a user-created Unity package. That didn't go too well. The Unity package hasn't been updated for 6 months. D'oh! I decided to download the existing PKGBUILD and modify it to work with the Unity 4.xx series. After changing the version number, I tried to "makepkg" it, and was greeted with a message about installing Compiz 0.9.x. I thought it would be an easy install. It was quite the opposite. Compiz's install prefix was set to /opt/unity, but the FindCompiz cmake build file expected Compiz to be in /usr, so none of the Compiz packages, except for compiz-core, would compile. Then I tried reinstalling compiz-core, but this time changing the prefix to /usr. The compiled package ended up being only a few kilobytes big. I guess the mouse wheel was invented for a reason. I looked at the PKGBUILD again, only to find that there was a line at the very bottom that ran "rm -rf ${pkgdir}/usr". That explains a lot! I ended up adopting all the compiz*-git packages and fixing them so they would compile and install.
    So, now that Compiz is working (restarted and tested just to make sure I didn't waste my time with something that didn't work), I went on to install the rest of the dependencies listed in the Unity PKGBUILD file. That went relatively well. I was so happy after seeing the progress counter go up after running "makepkg", but at about 8%, gcc spat out an error about an undeclared function (sorry, I forgot what the function was). Naturally, I went to Google and searched the name of the function. 0 results! Exactly what I was looking for! I ended up downloading the Ubuntu 11.10 Alpha 3 ISO and running "find /usr/lib -type f | xargs objdump -T | grep the_function". The problem lay in the libindicator package. There was a newer version available which contained that function. I have no idea why a package that's only 0.02 versions ahead of the AUR package would contain new functions...
    Next! Utouch...ugh...great memories! Not! I was so glad that I had fixed the utouch packages earlier (for touchegg to work). I was too frustrated from compiz and libindicator to try to compile more stuff.
    CMake. Whoever created the CMakeLists.txt file didn't list all the dependencies required. So after running "makepkg" 10 billion times, waiting for "somebodydidntputthisincmake.h not found" errors to appear, I finally got all the dependencies I needed installed...or so I thought. After installing and compiling all these dependencies, the build only continued 3% further before encountering another cryptic gcc error. This time, there was no error about a file not being found. So, not knowing what dependency was missing, I headed over to http://packages.ubuntu.com and downloaded the Unity DEB source to find the dependencies in the debian/control file. After installing those few dependencies that I missed, I ran "makepkg" again, hoping that it would finally compile successfully. CMake went a little further (5% further, to be exact) before running into another error. It complained about DndSourceDragBegin() having two return types. Sure enough, "./plugins/unityshell/src/ResultViewGrid.h" had the return type as boolean and "/usr/include/Nux-1.0/Nux/InputArea.h" had the return type as void. WTF? How the heck does this even compile under 11.10???
    After changing void to bool in "/usr/include/Nux-1.0/Nux/InputArea.h", I ran "makepkg" once again, anxiously waiting to see the line "Finished making: unity 4.10.2". CMake compiled about 35% before running into an error about an undeclared gtk function. Nooooooooooooo!!! I wasn't brave enough to install the git version of gtk3, so I created a chroot, installed the base packages, and installed all of those dependencies fairly quickly (it gets a lot easier after doing it so many times).
    Moving on to gtk3. After cloning the ~200MB git repository, autotools spits out an error about cairo-gl missing. So, I proceeded to install the cairo-gl-git package, which failed to compile (it compiled successfully outside of the chroot...). GREAT. So, Unity fails to compile because GTK version is too old, and GTK failed to compile because cairo-gl is missing, and cairo-gl fails to compile because I'm in a chroot. GAHHH!!! While thinking about throwing the computer out of the window, I searched the AUR for other GTK3 packages. I just happened to find a package named "GTK3-UBUNTU"! That package was still at version 3.0, but it was pretty easy to get the patches and source code for 3.1 from the Ubuntu GTK source package.
    So, FINALLY, Unity compiles. I was so darn happy, I didn't even care if it ran or not. I logged out and logged back into the GNOME 3 fallback mode, and entered the chroot. After running "xhost +SI:localuser:chenxiaolong" to run X11 apps in the chroot, I crossed my fingers and ran "DISPLAY=:0.0 unity --replace". It failed with python 3 complaining about missing modules. That's okay, since the Unity launch script is written in python 2. I changed the shebang line in "/usr/bin/unity" to point to python 2 and ran "DISPLAY=:0.0 unity --replace". It didn't necessarily fail, but it didn't succeed either. It didn't print out any error messages. Weird... I thought I'd try enabling Unity from the compiz settings manager then. I ran "DISPLAY=:0.0 compiz --replace" and "DISPLAY=:0.0 ccsm" and enabled the Unity plugin. Unity runs! Although nothing shows on the screen, it runs! It shows up in the process list! Woohoo!
    And that's about how far I got. There were quite a few Vala errors during the compiling process (I forgot which package it was), which is probably why Unity won't appear. I'll try again later with the vala-devel or vala-git package and hopefully Unity will work then. Here are screenshots of what I've gotten working so far:
    http://i.imgur.com/7F1fm.jpg
    http://i.imgur.com/zGNJc.jpg
    http://i.imgur.com/3mCgd.jpg
    By the way, I love the simplicity of pacman and the AUR. I can't imagine how long this would have taken with other package managers.
    Moderator edit:  Do not place large images in line.  If you want, you may embed links to thumbnails inside url tags.
    Last edited by chenxiaolong (2014-04-15 17:11:04)

    City-busz: I'm getting a ton of Vala errors when I compile libunity (AUR version) with vala or vala-devel. libunity fails to compile with vala-git. I'll try your packages in a virtual machine and see how they work on 64 bit.
    In the meantime, Unity still fails to show up: http://i.imgur.com/btPwo.png I'll try out your PKGBUILDS and see how that works. I'm glad there are people who want to port Unity to Arch Linux
    EDIT: City-busz: Just to let you know, Unity will fail to compile at around 45% with GTK 3.0. Here's my source package for Ubuntu's GTK 3.1: http://ubuntuone.com/p/1EzX/ It contains all of the patches in the Ubuntu source package. I'm not sure if all the patches are needed, but GTK compiles fine with all of them.
    EDIT2: Right now, I'm trying to compile Vala 0.10.4, the version used in Ubuntu 11.10. Hopefully that will eliminate some of the Vala errors.
    EDIT3: Vala 0.10 is too old. 0.12 and 0.14 are also in the Ubuntu repository. Trying those...
    EDIT4: 0.14 is actually 0.13.1. Gah... Vala takes longer to compile under VirtualBox than GTK3...
    EDIT5: Okay...so VirtualBox "helpfully" became slow enough that I could read the error messages. The Vala error messages aren't actually error messages, but rather warnings about unused methods. I wonder what prevents Unity from running then...
    Last edited by chenxiaolong (2011-08-30 02:30:29)

  • Windows XP mode/ Windows virtual PC

    I have to support some LV7.1 programs (along with newer versions).  I just bought a new laptop with Windows 7 Home 64-bit.  LV7.1 loads and runs, but the drivers disk pukes on the 64-bit OS.  (This was not unexpected...)
    The new PC does have an Intel processor that will support Windows Virtual PC and Windows XP mode.  
    I will need to upgrade to W7 Pro, but before I do, I'd like to get some feeling that it will all work.  Should older NIDaq, MAX, etc. install OK in XP mode?  Has anybody done this?  Is there another solution that is better?

    Seth,
    Thanks for the speedy reply.
    I try to use NI-DAQmx, but I won't swear that I don't have traditional DAQ routines out there.  I would like to run real hardware (USB DAQ) on my new laptop, de-bug and burn executables.
    What are my options?  I could keep LV7.1 on my old desktop, newer versions on my new laptop.  Would I be running afoul of LV's EULA?  Does NI care about antique versions of LV?
    I see that NI-DAQmx 9.0 supports versions LV8.2.1, 8.5.1 and 8.6.1.  If I put these on my 64-bit, will executables I make run on 32-bit target systems?
    I don't want to flush Windows 7 64-bit to install the 32-bit.  I want it all!
    My company has let my subscription lapse.  I don't have LV 2009.  Not much chance of an upgrade this year...
    Regards,
    John Spencer

  • Symbian S^3/Anna Calendar Meeting Request (Invitee...

    First... I want to explain how this came about. I've never sent out a Meeting Invite from my Nokia N8 (PR1.2) to anyone before. However, in the days of collaboration, people send & receive Calendar Appointments to and from different Email/PIM Clients. 
    Since Symbian Anna announced that it will have "e-mail invitations with full meeting request support"... I ran a test with a buddy of mine that had Symbian Anna. 
    This is also because iPhone users can 'send invitations' during Calendar Event creation and this meeting request magically appears in my Google Calendar.
    Why aren't Nokia-generated meeting requests appearing in e-mail and Google Calendar???
    Observations:
    a) When I send out a Meeting Request (via E-mail or SMS), my Nokia N8 PR 1.2 (Pre-Anna) - sends the attachment as Calendar.vcs, which has a totally different format from the meeting.ics that is read by e-mail clients. Is the meeting.ics a change in Symbian Anna? I am assuming that Calendar.vcs is an internal Nokia to Nokia format. This does not work in this day and age of people using many different brands of OS's.
    **Side Note: I am in Canada. When is my Anna Update for my Vanilla Nokia USA N8 going to be available???
    b) When I receive this Calendar.vcs as an attachment in e-mail, it is not translated as a Calendar Invitation item at all in my Gmail or my PC E-mail Client. Why?
    Calendar.vcs is sent out as "Application/Octet-Stream" in Base64 Encoding by the Symbian Email Client or the Nokia Email Servers??? Why???
    What does Application/Octet-Stream tell the receiving email client? This is an application attachment. Treat it as such.
    c) When I asked my friend who has a Nokia N8 (PR 2.0) Symbian Anna to send me a meeting request, I get an email that has an attachment "meeting.ics" -- let's call this Specimen A -- when it arrived in my Google Inbox, it is not seen nor translated as a Calendar Item.
    By this time, I got confused...
    So I sent an Appointment/Calendar Request from my Novell Groupwise and sent it to my Google. I also receive a "meeting.ics" -- let's call this Specimen B. However, Google was able to immediately translate that this meeting.ics is a Calendar Item and neatly displays it as such, asking me to say YES, NO, MAYBE to the Calendar item. It even magically inserts into my Google Calendar. 
    So I was even MORE confused!!!
    Here is what I have gathered from reading the Message Sources.
    Nokia meeting.ics (Specimen A)
    - attachment meeting.ics is sent out as MIME TYPE Application/MSPowerpoint (this tells the receiving client that this is a powerpoint application and is not translated as a Calendar Item)
    - attachment meeting.ics is sent out encoded in Base64
    - meeting.ics file is missing PARTSTAT=NEEDS-ACTION in each Attendee (this tells the PIM Calendar that the Calendar Request needs to be acted upon with an Accept/Deny)
    Novell Groupwise meeting.ics (Specimen B)
    - attachment meeting.ics is sent out as MIME TYPE Text/Calendar
    - attachment meeting.ics is encoded in 8-Bit
    - meeting.ics file has PARTSTAT=NEEDS-ACTION for Attendees.
    So what is the difference? Why is the meeting.ics (Specimen B) from Groupwise readable by Google Mail/Calendar while meeting.ics (Specimen A) from Nokia Symbian Email Client not readable at all?
    So I did a small test. 
    I edited the Specimen A meeting.ics from the Nokia Email Client and added the PARTSTAT=NEEDS-ACTION to the Attendees.
    I sent these in 2 Test Emails to my Google Mail.
    Test 1: Original meeting.ics from Nokia Email without PARTSTAT=NEEDS-ACTION
    Test 2: Altered meeting.ics from Nokia Email WITH PARTSTAT=NEEDS-ACTION
    I send these "meeting.ics" tests via my Groupwise Client as an attachment file.
    When I received both in my Google Mail, Google is able to properly render this as a Calendar Item and display it as such, asking me to say YES, NO, MAYBE to both instances of the Test Emails. 
    So, maybe PARTSTAT=NEEDS-ACTION has nothing to do with this then...
    I forwarded the test messages back to myself to see what MIME-TYPE Groupwise sent this as this time.
    It sent it as TEXT/PLAIN and encoded in Base64. So now we know Base64 encoding and 8-bit encoding are not the culprit.
    So I ponder more. WHY WOULD NOKIA EMAIL CLIENT SEND OUT meeting.ics as APPLICATION/MSPOWERPOINT???? *.ics file extension is not even related to anything Powerpoint.
    So if meeting.ics is sent out by the sending email client as text/calendar or text/plain, the receiving email client is able to properly translate this and render this as a Calendar Item (Request).
    However, because Nokia Email Client send this out as "Application/MSPowerpoint", the receiving email client will think this is an e-mail attachment for PowerPoint and will not display the Calendar Item properly. 
    MIME-TYPES. Something so basic, something that would cause a HUGE ERROR.
    Is this a BUG? I THINK IT IS...  PLEASE FIX THIS ERROR. PLEASE! PLEASE! PLEASE!
    So... THIS IS A FAIL. Nokia!
    Collaboration is also part of "Connecting People". If Person X using Outlook, Person B using Crackberry, Person C using iPhone, Person D using Android, Person E using a Nokia Symbian cannot collaborate and send/receive/accept/deny Calendar Items they send each other... isn't there a HUGE DISCONNECT?
    I am hoping, actually, I am praying really hard to the many gods and deities that this will be forwarded immediately to the Nokia Email Client Team to be looked at and an update immediately made available.
    Thanks.
    Attachments:
    SymCalendar-PR1.2VCS.jpg 50 KB
    SymCalendar-ContentTypeCompare.jpg 217 KB
    symCal-ICSContentComparison.jpg 170 KB

    Hi Joe,
    1.      Are the sender and recipient users on the same AD site?
    2.      If the user uses OWA to access the meeting request message, is it still displayed as a normal email instead of a meeting request?
    I would like to explain that a meeting request should have the MAPI property “PR_MESSAGE_CLASS” with the value IPM.Schedule.Meeting.Request. A normal message will have the MAPI property “PR_MESSAGE_CLASS” with the value IPM.Note.
    The issue may occur because:
    1.      Some third-party software, such as spam filter software and disclaimer software, changes the mail format
    2.      Some add-ins in Outlook change the mail format
    To troubleshoot the possible cause 1, I suggest that you enable Pipeline Tracing:
    Using Pipeline Tracing to Diagnose Transport Agent Problems
    http://technet.microsoft.com/en-us/library/bb125198(EXCHG.80).aspx
    Note: Because Pipeline tracing could place a large amount of strain on a system, I suggest that you disable it after gathering the information.
    To troubleshoot the possible cause 2, I suggest that you disable the Add-ins temporarily:
    Disable third-party add-ins of Outlook
    ==================================
    1.      Backup and then remove the following two registry keys:
    HKEY_CURRENT_USER\Software\Microsoft\Office\Outlook\Addins
    HKEY_LOCAL_MACHINE\Software\Microsoft\Office\Outlook\Addins
    2.      Restart the Outlook by using “Outlook /Safe”
    Mike
     

  • MappedSuperClass

    Hi *,
    I'm having some trouble with a MappedSuperClass I defined. Let me build up...
    Node, which keeps a reference to a Reading object:
    @Entity
    public class Node implements Serializable {
        private static final long serialVersionUID = 1L;
        private Long id;
        @EmbeddedId
        private NodePK nodePK;
        @ManyToOne(cascade={CascadeType.PERSIST, CascadeType.REMOVE})
        private Network network;
        @OneToOne
        private Reading latestReading;
        ...//(constructors and methods)
    }
    Unit, which is the superclass of Reading:
    @MappedSuperclass
    public abstract class Unit implements Serializable {
        private static final long serialVersionUID = 1L;
        @Id
        @GeneratedValue(strategy = GenerationType.AUTO)
        private Long id;
        @OneToOne
        private Node node;
        public Unit() {}
        ...//(constructors and methods)
    }
    Reading, which inherits from Unit:
    @Entity
    public class Reading extends Unit implements Serializable {
        //private Node node;                         <- PROBLEM
        public Reading() {}
        ...//(constructor and methods)
    }
    So, I create a Node object which stores its latest Reading object, which is a subclass of Unit. To make Unit a mapped superclass, I annotated the class accordingly and have Reading extend it.
    I further have a stateless session bean which does the following:
    Node n1 = new Node(1, net1);
    em.persist(n1);
    Reading reading1 = new Reading(n1, null, 1, 2, 3);
    em.persist(reading1);
    n1.setLatestReading(reading1);
    removeNode(1);
    private void removeNode(int address) {
            Node node = (Node) em.createQuery("SELECT x " +
                    "FROM Node x " +
                    "WHERE x.nodePK.nodeAddress = :addr").setParameter("addr", address).getSingleResult();
            em.remove(node);
    }
    So basically I just create a node, create a reading, set this reading as the node's latest reading, and finally remove it all from persistence. And there it goes wrong! When running my app I get:
    Caused by: org.apache.derby.client.am.SqlException: DELETE on table 'NODE' caused a violation of foreign key constraint 'READINGNODEADDRESS' for key (1,321).  The statement has been rolled back.
    ...I tried a bunch of stuff, including adding AssociationOverride statements and moving the node property from the superclass to its subclasses. The latter works, but is kind of foolish because then the superclass has no other significant properties and I could just as well not use a superclass (FYI, Reading will not be the only class extending Unit).
    Something that also seems to work, is adding the node attribute both to the superclass and the subclass (so uncommenting the problem line I indicated above).
    Can someone help me with this? I've been struggling with this for days and am about to give up, so...
    Thanks!
    Vaak

    I am currently using TopLink Essential build 17. I was able to take my standard Employee demo model and refactor a BaseEntity class out of the Employee class.
    BaseEntity
    @MappedSuperclass
    @TableGenerator(name = "emp-seq-table", table = "SEQUENCE",
                    pkColumnName = "SEQ_NAME", valueColumnName = "SEQ_COUNT",
                    pkColumnValue = "EMP_SEQ", allocationSize=50)
    public abstract class BaseEntity {
        @Id
        @GeneratedValue(strategy = GenerationType.TABLE, generator = "emp-seq-table")
        private int id;
        @Version
        private long version;
    public BaseEntity() {
    }
    public void setId(int id) {
        this.id = id;
    }
    public int getId() {
        return id;
    }
    public void setVersion(long version) {
        this.version = version;
    }
    public long getVersion() {
        return version;
    }
    }
    The Employee class now looks basically like:
    @Entity
    @AttributeOverride(name="id", column=@Column(name="EMP_ID"))
    public class Employee extends BaseEntity implements Serializable {
    ...
    }
    Doug

  • Accessing a java class method from the jsp page.

    Hi, I'm a beginner with JSP and I'm trying to find a way to access a method of my Java class file in a JSP page. After searching through the forums I tried to use the useBean tag. I'm using Apache to host the JSP file. Below is an excerpt of my code and the error message I got. What am I doing wrong? Anyone know?
    <%@ page language="java" %>
    <jsp:useBean id="movies" class="movie.Movie" />
    <jsp:setProperty name="movies" property="*"/>
    <%
    movies.getStart("file:///C:/Video/Applications2/sun.mpg");
    response.setContentType("text/xml");
    %>
    exception
    org.apache.jasper.JasperException: Exception in JSP: /View.jsp:7
    4: <jsp:setProperty name="movies" property="*"/>
    5: <%
    6:
    7: movies.getStart("file:///C:/Video/Applications2/sun.mpg");
    8: response.setContentType("text/xml");
    9: %>
    Stacktrace:
         org.apache.jasper.servlet.JspServletWrapper.handleJspException(JspServletWrapper.java:504)
         org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:375)
         org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:314)
         org.apache.jasper.servlet.JspServlet.service(JspServlet.java:264)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    root cause
    javax.servlet.ServletException: javax/media/ControllerListener

    Hi, thanks for responding. OK, I did look through it and it was opening some GUI. I still need the program to do server-side processing so I can't use an applet, but I don't need the GUI, so I revised it and removed the GUI. Also, I'm using a servlet to call the class now, yet I still have the same error. Any ideas?
    Below is the vid2jpg code minus the gui.
    import java.io.*;
    import java.awt.*;
    import javax.media.*;
    import javax.media.control.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import java.awt.image.*;
    import javax.imageio.*;
    public class vid2jpg implements ControllerListener
         Processor p;
         Object waitObj = new Object();
         boolean stateOK = true;
         DataSourceHandler handler;
    int imgWidth;int imgHeight;
         Image outputImage;
         String sep = System.getProperty("file.separator");
         int[] outvid;
         int startFr = 1;int endFr = 1000;int countFr = 0;
         boolean sunjava=true;
         * Static main method
         public static void main(String[] args)
              if(args.length == 0)
                   System.out.println("No media address.");
                   new vid2jpg("file:///C:/Video/applications2/sun.mpg");     // or alternative "vfw://0" if webcam
              else
                   String path = args[0].trim();
                   System.out.println(path);
                   new vid2jpg(path);
         * Constructor
         public vid2jpg(String path)
              MediaLocator ml;String args = path;
              if((ml = new MediaLocator(args)) == null)
                   System.out.println("Cannot build media locator from: " + args);
              if(!open(ml))
                   System.out.println("Failed to open media source");
         * Given a MediaLocator, create a processor and start
         private boolean open(MediaLocator ml)
              System.out.println("Create processor for: " + ml);
              try
                   p = Manager.createProcessor(ml);
              catch (Exception e)
                   System.out.println("Failed to create a processor from the given media source: " + e);
                   return false;
              p.addControllerListener(this);
              // Put the Processor into configured state.
              p.configure();
              if(!waitForState(p.Configured))
                   System.out.println("Failed to configure the processor.");
                   return false;
              // Get the raw output from the Processor.
              p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW));
              TrackControl tc[] = p.getTrackControls();
              if(tc == null)
                   System.out.println("Failed to obtain track controls from the processor.");
                   return false;
              TrackControl videoTrack = null;
              for(int i = 0; i < tc.length; i++)
               if(tc[i].getFormat() instanceof VideoFormat)
                        tc[i].setFormat(new RGBFormat(null, -1, Format.byteArray, -1.0F, 24, 3, 2, 1));
                        videoTrack = tc[i];
                   else
                   tc[i].setEnabled(false);
              if(videoTrack == null)
                   System.out.println("The input media does not contain a video track.");
                   return false;
              System.out.println("Video format: " + videoTrack.getFormat());
              p.realize();
              if(!waitForState(p.Realized))
                   System.out.println("Failed to realize the processor.");
                   return false;
              // Get the output DataSource from the processor and set it to the DataSourceHandler.
              DataSource ods = p.getDataOutput();
              handler = new DataSourceHandler();
              try
                   handler.setSource(ods);     // also determines image size
              catch(IncompatibleSourceException e)
                   System.out.println("Cannot handle the output DataSource from the processor: " + ods);
                   return false;
         //     setLayout(new FlowLayout(FlowLayout.LEFT));
    //          currPanel = new imgPanel(new Dimension(imgWidth,imgHeight));
         //     add(currPanel);
         //     pack();
              //setLocation(100,100);
         //     setVisible(true);
              handler.start();
              // Prefetch the processor.
              p.prefetch();
              if(!waitForState(p.Prefetched))
                   System.out.println("Failed to prefetch the processor.");
                   return false;
              // Start the processor
              //p.setStopTime(new Time(20.00));
              p.start();
              return true;
         * Sets image size
         private void imageProfile(VideoFormat vidFormat)
              System.out.println("Push Format "+vidFormat);
              Dimension d = (vidFormat).getSize();
              System.out.println("Video frame size: "+ d.width+"x"+d.height);
              imgWidth=d.width;
              imgHeight=d.height;
         * Called on each new frame buffer
         int nextframetime = 0;
    private void useFrameData(Buffer inBuffer)
    try
    if(inBuffer.getData()!=null) // vfw://0 can deliver nulls
    if(sunjava) // and with import javax.imageio.*;
    int frametimesecs = (int)(inBuffer.getTimeStamp()/1000000000);
    if(frametimesecs%10 == 0 && frametimesecs==nextframetime)
    nextframetime+=10;
    BufferedImage bi = new BufferedImage(outputImage.getWidth(null), outputImage.getHeight(null), BufferedImage.TYPE_INT_RGB);
    Graphics g = bi.getGraphics();
    ImageIO.write(bi, "png", new File("images"+sep+"image_"+(inBuffer.getTimeStamp()/1000000000)+".png"));
    catch(Exception e){}
         * Tidy on finish
         public void tidyClose()
              handler.close();
              p.close();
         * Block until the processor has transitioned to the given state
         private boolean waitForState(int state)
              synchronized(waitObj)
                   try
                        while(p.getState() < state && stateOK)
                        waitObj.wait();
                   catch (Exception e)
              return stateOK;
         * Controller Listener.
         public void controllerUpdate(ControllerEvent evt)
              if(evt instanceof ConfigureCompleteEvent ||     evt instanceof RealizeCompleteEvent || evt instanceof PrefetchCompleteEvent)
                   synchronized(waitObj)
                        stateOK = true;
                        waitObj.notifyAll();
              else
              if(evt instanceof ResourceUnavailableEvent)
                   synchronized(waitObj)
                        stateOK = false;
                        waitObj.notifyAll();
              else
              if(evt instanceof EndOfMediaEvent || evt instanceof StopAtTimeEvent)
                   tidyClose();
         * Inner classes
         * A DataSourceHandler class to read from a DataSource and displays
         * information of each frame of data received.
         class DataSourceHandler implements BufferTransferHandler
              DataSource source;
              PullBufferStream pullStrms[] = null;
              PushBufferStream pushStrms[] = null;
              Buffer readBuffer;
              * Sets the media source this MediaHandler should use to obtain content.
              private void setSource(DataSource source) throws IncompatibleSourceException
                   // Different types of DataSources need to handled differently.
                   if(source instanceof PushBufferDataSource)
                        pushStrms = ((PushBufferDataSource) source).getStreams();
                        // Set the transfer handler to receive pushed data from the push DataSource.
                        pushStrms[0].setTransferHandler(this);
                        // Set image size
                        imageProfile((VideoFormat)pushStrms[0].getFormat());
                   else
                   if(source instanceof PullBufferDataSource)
                        System.out.println("PullBufferDataSource!");
                        // This handler only handles push buffer datasource.
                        throw new IncompatibleSourceException();
                   this.source = source;
                   readBuffer = new Buffer();
              * This will get called when there's data pushed from the PushBufferDataSource.
              public void transferData(PushBufferStream stream)
                   try
                        stream.read(readBuffer);
                   catch(Exception e)
                        System.out.println(e);
                        return;
                   // Just in case contents of data object changed by some other thread
                   Buffer inBuffer = (Buffer)(readBuffer.clone());
                   // Check for end of stream
                   if(readBuffer.isEOM())
                        System.out.println("End of stream");
                        return;
                   // Do useful stuff or wait
                   useFrameData(inBuffer);
              public void start()
                   try{source.start();}catch(Exception e){System.out.println(e);}
              public void stop()
                   try{source.stop();}catch(Exception e){System.out.println(e);}
              public void close(){stop();}
              public Object[] getControls()
                   return new Object[0];
              public Object getControl(String name)
                   return null;
    Below is the servlet code.
    import javax.servlet.*;
    import javax.servlet.http.*;
    import java.io.*;
    public class ShowMovie extends HttpServlet {
        String rootURL = "http://127.0.0.1:8080/Video/";
        public void processRequest(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            //String movie=request.getParameter("movie");
            String movie = "son";
            getStart(movie);
            response.sendRedirect(rootURL + "View.jsp");
        }
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            processRequest(request, response);
        }
        protected void doPost(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            processRequest(request, response);
        }
        public void getStart(String url) {
            new vid2jpg(url);
        }
    }
    This is the error from the server. I'm using Tomcat 5.
    exception
    javax.servlet.ServletException: Servlet execution threw an exception
    root cause
    java.lang.NoClassDefFoundError: javax/media/ControllerListener
         java.lang.ClassLoader.defineClass1(Native Method)
         java.lang.ClassLoader.defineClass(Unknown Source)
         java.security.SecureClassLoader.defineClass(Unknown Source)
         org.apache.catalina.loader.WebappClassLoader.findClassInternal(WebappClassLoader.java:1812)
         org.apache.catalina.loader.WebappClassLoader.findClass(WebappClassLoader.java:866)
         org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1319)
         org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1198)
         java.lang.ClassLoader.loadClassInternal(Unknown Source)
         ShowMovie.getStart(ShowMovie.java:31)
         ShowMovie.processRequest(ShowMovie.java:14)
         ShowMovie.doGet(ShowMovie.java:22)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:689)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    note The full stack trace of the root cause is available in the Apache Tomcat/5.5.17 logs.

  • AUR has a bug in the way it versions packages. [WORKED AROUND]

    If you check out my PKGBUILD in the AUR (http://aur.archlinux.org/packages.php?ID=29410), you'll notice its version on the page is "smooth-tasks-pkgver.txt", while the ${pkgver} variable in its PKGBUILD is "wip_2009_09_13".  `smooth-tasks-pkgver.txt' is a file the PKGBUILD uses to temporarily store the correct ${pkgver} of the package (not its mercurial revision), which is then used to update the arch package and PKGBUILD.
    It was an experiment to see how I could manipulate makepkg's standard versioning scheme to reflect the actual author's package version rather than the mercurial revision, and it works!  AUR just seems to have an issue with the way it parses the PKGBUILD.
    I figured it would be best to discuss this on the forums before filing an actual bug report.
    EDIT: changed the title from "... names packages" to "... versions packages", and edited the post body to correct my original misconception that the bug lies in how the AUR names packages rather than how it versions them.
    Last edited by deltaecho (2009-09-13 21:48:06)

    I don't understand your logic; what is `[ 1 ]' for?  It reminds me of the pieces of code I occasionally come across that look something like:
    if (true) {
    /* Do something here */
    }
    I reckon it makes sense to someone, I've just never understood the logic behind such blocks.
    The above snippet of code is one way of creating a block comment within Bash scripts, since whatever is located between the `EOF' tags is piped to nowhere (and thus isn't displayed); by placing a block comment at the end of the PKGBUILD containing an explicit ${pkgver} declaration, I am able to dictate to the AUR what the package version should be, and still have the segment of code ignored by `makepkg' when the script is executed.
    EDIT: Unless you mean to use something like `(( 0 ))', which Bash will evaluate to false and thus ignore the proposition.  That would indeed make sense, but, in my opinion, isn't really any clearer than my implementation.
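    For reference, the block-comment trick described above presumably looks something like the following (a reconstruction, not the exact code from the PKGBUILD; the pkgver value is purely illustrative):

    # The heredoc body is piped to nowhere, so makepkg ignores it, but a
    # parser scanning the file for pkgver= still sees the line below.
    # (In the real PKGBUILD these lines sit at the left margin so the
    # closing EOF terminates the heredoc.)
    cat > /dev/null <<'EOF'
    pkgver=1.0
    EOF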
    Last edited by deltaecho (2009-09-13 23:22:10)

  • How can I get Total Number of messages in an EDI Interchange

    Hi All,
     I am migrating a solution from Covast EDI to native BizTalk EDI and ran into an issue. In Covast, when an incoming interchange is debatched, some of the properties, like the total number of messages in the interchange, are saved into the XML of the debatched message. I am trying to find this value in native BizTalk EDI.
     How can I determine this value? I only see the InterchangeSequenceNumber property, which is only the sequence number of a message within the interchange.
    Thanks!

    My first advice in situations like this is to double and triple check that this requirement still exists and is driven by the business.  This is a great opportunity to refactor some unnecessary complexity out of the solution if you identify it.
    Assuming it is a valid and verified requirement, how you approach it depends on the composition of the interchange.  Meaning, if you receive single ISA...IEA blocks with a single GS...GE, you can use the max of BTS.InterchangeSequenceNumber.
    If you receive multiple ISAs or GSs, it's a little different.

  • How can I get total number of pages in the folio?

    Hello guys, I need some help.
    I'm trying to get the total number of pages in the folio, but currently I only manage to get the total number of pages from the current article. Is there some way to get it?
    My code: gist:de1de89b493024815c3e
    Thanks so much.


  • Private function / procedure to be used with in a package

    Hi All
    I wanted to find out how to define a function so that it can be used/called by the other functions/procedures within a particular package, but is not visible/accessible to any routine outside the package in which it is defined.
    Could you please advise me on this? Thanks!
    Sarat

    It is possible: you can nest one function/procedure in another:
    SQL>CREATE OR REPLACE FUNCTION sum_squares(
      2     p_a   IN   NUMBER,
      3     p_b   IN   NUMBER)
      4     RETURN NUMBER
      5  IS
      6     FUNCTION square(
      7        p_n   IN   NUMBER)
      8        RETURN NUMBER
      9     IS
    10     BEGIN
    11        RETURN p_n * p_n;
    12     END square;
    13  BEGIN
    14     RETURN square(p_a) + square(p_b);
    15  END sum_squares;
    16  /
    Function created.
    SQL>SELECT sum_squares(3, 4) FROM dual;
    SUM_SQUARES(3,4)
                  25
    Urs

  • Introducing Lowarch - Arch for low-end systems

    Some of you might remember that I mentioned my plans of making an Arch install cd for low end systems. I've come quite far with this project now.
    First, I must say that it was a lot harder than I first expected. So I can say I have learned a lot in the process.
    Second, before I start rambling about where I am in the process, I'll just outline what the plan is:
    The best Linux distribution is Arch Linux, because of its speed, simplicity and elegance in design. I often get hold of some old laptops which are too old for Arch (i586 normally, but also i486). I have tried a lot of distros made for low-end systems, but I always end up thinking, ah, how wonderful it would have been to have Arch on this. It would have been the perfect distro for this machine.
    And the more I thought about it, I realised that even though Arch is perfect for super modern systems, it would be even more perfect for old systems, because of its speed and simplicity, and not least the choice of packages.
    Some have started projects to port Arch to i586. I don't think that is the best solution. A lot of Arch isn't really useful on these machines. Therefore I decided to make a new distro based on Arch, where the main difference is the choice of packages. You don't want to run a video editor on an i586. For now I've called the distro Lowarch.
    Here's what I have done so far:
    1st try:
    Oh, I just change the compiler flags to i486 in /etc/makepkg.conf and start compiling all the packages. That worked great. Almost everything in base compiled fine. Until I started to notice the strange i686's coming up in all the ./configure outputs, and when I tried to install some of it on my i586, a lot of Illegal instruction messages came up.
    After doing some research, I understood that some programs make static links to libraries, all of which, of course, are i686 coded on my system. I realised I needed to do this differently. Of course I didn't want to do all the compiling on a 486 machine.
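    (For reference, the kind of makepkg.conf change described above would look roughly like this; the exact flag values for an i486 target are an assumption:)
    # Hypothetical /etc/makepkg.conf excerpt for an i486 target; the
    # values actually used for Lowarch may differ.
    CARCH="i486"
    CHOST="i486-pc-linux-gnu"
    CFLAGS="-march=i486 -O2 -pipe"
    CXXFLAGS="${CFLAGS}"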
    2nd try:
    I made a Cross Linux From Scratch system, compiled for i486. Even that I had to make twice, for a lot of programs check what processor they are running on and compile for that. So the second time I used a so-called uname-hack. A small kernel module which makes the kernel report that it's running on an i486 machine.
    When I had a basic build system, I compiled pacman, and started to build packages. This time I had to take care of the order for dependencies to work out. First kernel-headers, glibc, etc. I installed one package at a time into an empty directory with the pacman -r option.
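    (In effect something like the following, with the target directory and package file name made up for illustration:)
    # Install a freshly built package into an alternate root with pacman's
    # -r/--root option; the root needs a pacman db directory first.
    mkdir -p /mnt/lowarch-root/var/lib/pacman
    pacman -r /mnt/lowarch-root -U kernel-headers-*.pkg.tar.gz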
    After having the base system compiled, I booted it, and (using the uname-hack still) continued to compile the other packages in current, and the ones that I wanted from extra.
    It was really hard, since a lot of packages just don't compile with recent glibc and gcc. I have learned a lot about writing small patches just to make gcc happy or take away symbols not used anymore, etc. The dependencies in the PKGBUILDs are not always correct when it comes to building packages. Sometimes, and this is the most frustrating part, packages seem to build correctly, but because some library wasn't installed or whatever, a few files are missing from the package, and I have no way to know until something goes wrong. Often that something is a package not compiling because include files are missing.
    Another problem was to make an installer. I decided I just wanted a simple installer, like Arch has. So I compiled busybox and made an almost exact copy of the 0.7.2 install cd, just compiled for i486, and with my package selection on it.
    What's left to do:
    1. Look more closely at the package selection. If there are people who are interested in this we can have some nice discussions about this. There are a lot of packages I just don't know what is for, and therefore I don't dare take them away.
    2. Clean up the PKGBUILDs, e.g. take away references to the i686 and 64-bit arches.
    3. Compile the whole set of packages again, just in case.
    4. Releasing it.
    I've set up a wordpress page at lowarch.linuxophic.org, which is empty now, but where I'll put out a link to the iso when it's ready.

    Yes, I think it would be smart to check out the PKGBUILDs once in a while and see if they are still able to build the packages. It seems this is only done when new releases arrive.
    The thing about old programs not compiling with newer glibc and gcc or other libraries is unavoidable, and there's an unending stream of patches to address this issue. If a package hasn't had a release for a year, it's quite common that a change in a library it depends on will break the compilation. When or if new releases come, they should usually fix this.
    Of course, PKGBUILDs of current packages should work with a current system, not with a system from last year, if they're supposed to be of any use. But it's a huge job to keep this up to date at all times, and you need some programming skills to do it, and I would rather have the devs focus on more important things, like updating packages and making everything work even better.
