Maven Directory Layout

HI!
What if I want to use the Maven directory layout with Sun Java Studio Creator (JSC)? I have a web application where I use Maven, and I want to write the GUI with Sun JSC. I tried to make it recognize the existing web.xml and JSPs, but it doesn't work. I imagine JSC enforces certain directory and file locations. Is there any way to change that?
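One direction that may be worth trying from the Maven side: rather than changing JSC, Maven's WAR plugin can be pointed at a non-standard web root. A minimal sketch for the pom.xml (the "web" directory name is an assumption about where JSC keeps its files, adjust to the actual layout):

```xml
<!-- Hypothetical: tell the WAR plugin where the existing web root and web.xml live -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <warSourceDirectory>web</warSourceDirectory>
    <webXml>web/WEB-INF/web.xml</webXml>
  </configuration>
</plugin>
```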

I'd like to use Maven as well, but I have put it off because I anticipated difficulty integrating it with JSC. Has anyone done this successfully?

Similar Messages

  • Directory layout for a new project

    Hi everyone. I'm trying to divide (and hopefully conquer) a project in the company. We're starting a new project, and the word from the manager is: take this older project and make the new one from it (somehow the old project is a subset of the new one, but anyway...).
    The problem is that this older project is built on a base project. So the devs created an src dir for their project and another dir, let's say core, for the base project (and we're supposed to turn this into a new project!).
    I'm relatively new to J2EE, but from what I've read so far, isn't this directory layout wrong? From my point of view we should divide the project like this:
    project_dir
    |
    |---->core-ejb
    |---->old-ejb
    |---->project-ejb
    |---->war
    |---->client
    with each sub-directory being a sub-project. When creating an EAR it can contain multiple EJB JARs, right? Like having core-ejb.jar, old-ejb.jar, project-ejb.jar, etc. Is my view correct, or am I missing something?
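A layout like that maps naturally onto a Maven multi-module build; a minimal parent POM sketch, with module names taken from the proposed tree (groupId, artifactId, and version are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical parent POM in project_dir/pom.xml -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>project-parent</artifactId>
  <version>1.0</version>
  <packaging>pom</packaging>
  <modules>
    <module>core-ejb</module>
    <module>old-ejb</module>
    <module>project-ejb</module>
    <module>war</module>
    <module>client</module>
  </modules>
</project>
```

Each module then gets its own pom.xml with packaging ejb or war, and an ear module can aggregate the EJB JARs.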

    We are introducing a new feature in Flex 2.0.1 that's due out early next year. The feature is called "Modules" and it was discussed at this year's MAX 2006. You can read more about it on Roger Gonzalez's blog: http://blogs.adobe.com/rgonzalez/ (check the "Modular Applications" articles). I think this will go a long way toward making the reusable parts you are talking about.
    Flex does not build things from instructions unless they are written in ActionScript. We have customers that create dynamic interfaces based on data loaded from a database, so it is possible. But if you have pre-built templates and all you need to do is change certain aspects at runtime, it should be pretty easy with Flex. Take a look at the Flex documentation, especially the part about the Flex framework and how Flex sources are compiled into SWFs.
    You style Flex components using style sheets (CSS files). This also includes specifying skins for components if you decide to give something a whole new look.
    I'm a bit biased here, but I think using ColdFusion (version 7.0.2) with Flex 2 is very easy. It depends on your needs, budget, deployment, etc. With CF 7.0.2 and Flex Builder 2 you get wizards to build both CFCs (ColdFusion components) and matching ActionScript objects, so that you exchange objects, not just data, between Flex and CF.
    Web services can also be used (with CF, too). This gives you more choices on the back end. If you have a large amount of data or it is complex, consider Flex Data Services, as it has data management capabilities.
    Flex 2 has localization capabilities. You create a 'resource bundle' in various languages and build SWFs for each one. When the end user chooses their preference, take them to the page that loads the appropriate SWF.
    HTH

  • SceneBuilder not seeing controller in Maven directory structure

    I'm using Maven for my JavaFX project, with the JavaFX Maven plugin at https://github.com/zonski/javafx-maven-plugin/wiki. Everything was created correctly with a typical Maven directory structure, and the FXML file was created in the resources\fxml folder. When I edit the FXML file, SceneBuilder will not see my controller. I have set the <?scenebuilder-classpath-element ?> processing instruction to the path of the controller source, and the controller still cannot be seen by SceneBuilder. I have performed a build and then changed the same element to point at the generated JAR, and it still does not work.
    So as a workaround I keep the FXML file in the same folder as the controller source. I've read that scenebuilder-classpath-element is used to resolve imports for user-written controls; as a SceneBuilder enhancement, it would be nice to have the classpath element also used to locate the declared controller.
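For reference, the setup described amounts to an FXML file along these lines (the relative path and controller class name are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?import javafx.scene.layout.AnchorPane?>
<!-- Hypothetical: point SceneBuilder at the compiled controller classes -->
<?scenebuilder-classpath-element ../../../target/classes?>
<AnchorPane xmlns:fx="http://javafx.com/fxml"
            fx:controller="com.example.MainController"/>
```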

    Makes sense.
    I created http://javafx-jira.kenai.com/browse/DTL-5573 to track that proposal.
    Thanks for the suggestion.

  • Changing directory layout on phone 7945 in CME

    Hi guys,
    I'm having trouble changing the search layout in the personal directory on CME.
    Unfortunately, my client does not accept the default search procedure, where they have to press the Directory key, scroll down to option 5, and choose how to search.
    So I need to change the search scheme within directories, so that when they press the Directory key it already shows the search by first or last name.
    Some documents show how to change the softkeys, but that does not help me; I need to change the directory search layout.
    I'm using CME 9.0
    Regards

    cd /Applications/BlackBerry\ Desktop\ Software.app/Contents/MacOS/
    That is the command right there, and here is the web page:
    http://btsc.webapps.blackberry.com/btsc/microsites/search.do?cmd=displayKC&docType=kc&externalId=KB17215&sliceId=1&docTypeID=DT_SUPPORTISSUE_1_1&dialogID=489798244&stateId=0%200%201484449614

  • Maven Support for JE, testers needed!

    Hello Berkeley DB Java Edition Fans and Developers!
    In an effort to better support our developers we've set up what we hope is a functional Maven repository, but we'd like you to test it and let us know whether we've been successful. I've tried to test it, but to be perfectly honest I'm not a Maven expert. If we've done anything in a non-standard or imperfect manner, please speak up and let me know so that I can fix things now.
    Here is the structure, the ever important POM file is found at:
    http://download.oracle.com/maven/com/sleepycat/je/<release>/je-<release>.pom
    e.g.
    http://download.oracle.com/maven/com/sleepycat/je/3.2.76/je-3.2.76.pom
    In that directory you will find:
    je-<release>.pom
    je-<release>.pom.md5
    je-<release>.pom.sha1
    je-<release>.jar
    je-<release>.jar.md5
    je-<release>.jar.sha1
    sources.jar
    sources.jar.md5
    sources.jar.sha1
    Also, I've uploaded a copy of each release's documentation tree to:
    http://download.oracle.com/berkeley-db/docs/je/<release>/
    e.g.
    http://download.oracle.com/berkeley-db/docs/je/3.2.76/
    Finally, the Sleepycat Public (open source) license file for JE is located at:
    http://download.oracle.com/maven/com/sleepycat/je/license.txt
    I believe that if you put the following into your Ant build.xml file, it will pick up the JE .jar file using Maven; let me know if this is wrong:
      <!-- Use Maven to fetch Oracle Berkeley DB Java Edition -->
      <path id="maven-ant-tasks.classpath" path="lib/maven-ant-tasks-2.0.9.jar" />
      <typedef resource="org/apache/maven/artifact/ant/antlib.xml" uri="urn:maven-artifact-ant" classpathref="maven-ant-tasks.classpath" />
      <artifact:remoteRepository id="berkeleydb-je.repository" url="http://download.oracle.com/maven" />
      <artifact:dependencies pathId="dependency.classpath">
        <remoteRepository refid="berkeleydb-je.repository" />
        <dependency groupId="com.sleepycat" artifactId="je" version="3.2.76"/>
      </artifact:dependencies>
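For what it's worth, the resolved path can then be fed to a compile step; a sketch of how that might look (target name and directory layout are assumptions):

```xml
<!-- Hypothetical target consuming the pathId declared above -->
<target name="compile">
  <mkdir dir="build/classes"/>
  <javac srcdir="src" destdir="build/classes"
         classpathref="dependency.classpath"/>
</target>
```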
    --------------------------------
    And here is an example project POM file for those who use Maven to build their applications. Let me know if this is correct and usable, or if I've made mistakes or there are ways to improve it.
    <?xml version="1.0" encoding="ISO-8859-1"?>
    <project xmlns="http://maven.apache.org/POM/4.0.0"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
      <modelVersion>4.0.0</modelVersion>
      <!-- placeholder coordinates for the consuming project -->
      <groupId>com.example</groupId>
      <artifactId>my-app</artifactId>
      <version>1.0</version>
      <dependencies>
        <dependency>
          <groupId>com.sleepycat</groupId>
          <artifactId>je</artifactId>
          <version>3.2.76</version>
        </dependency>
      </dependencies>
      <repositories>
        <repository>
          <releases>
            <enabled>true</enabled>
            <updatePolicy>always</updatePolicy>
            <checksumPolicy>warn</checksumPolicy>
          </releases>
          <snapshots>
            <enabled>false</enabled>
            <updatePolicy>never</updatePolicy>
            <checksumPolicy>fail</checksumPolicy>
          </snapshots>
          <id>oracleReleases</id>
          <name>Oracle Released Java Packages</name>
          <url>http://download.oracle.com/maven</url>
          <layout>default</layout>
        </repository>
      </repositories>
    </project>
    --------------------------------
    If one or more Maven users could give this a try, it would greatly help me out.
    Questions I have are:
    1. Did I put the source code up in the proper place/package/name/etc?
    2. Did I leave anything out of the POM?
    3. Did I put too much into the POM?
    4. Is there some other file I need to put somewhere that tells people what the latest version is of JE (and is that something that indicates the latest version for major releases, as in the latest version of the 2.x series and then also the latest version of the 3.x series, etc.)?
    5. Does this work? :)
    Thanks in advance for your help,
    -greg
    Greg Burd | Senior Product Manager
    Oracle Berkeley DB | ORACLE United States

    Update: I changed our POM to refer to the new version, 3.3.62, and again the download, build, and test were successful. So, upgrading JE versions required just a few keystrokes.
    FYI, you've still got a license typo, "Sleepcyat", in both the 3.2.76 and 3.3.62 POMs.
    Also, though it seems harmless, and only happens when using the m2eclipse inside-Eclipse maven build, I'm still getting the following warning on builds:
    [WARN] POM for 'com.sleepycat:je:pom:3.3.62:compile' is invalid. It will be ignored for artifact resolution. Reason: Parse error reading POM. Reason: Unrecognised tag: 'license' (position: START_TAG seen ...</organization>\n\n <license>... @15:12)
    I don't know enough about Maven to know if there's a 'higher-level' way to indicate the latest version overall, or the latest within a version prefix. I suspect people should make that determination from prose elsewhere. They might also try to discover other releases by browsing the containing directory (i.e. http://download.oracle.com/maven/com/sleepycat/je/), though I see that doesn't work; only the direct URLs to the POMs and other resources answer.
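For reference, the conventional higher-level mechanism in Maven repositories is an artifact-level maven-metadata.xml published alongside the version directories; a sketch of what one for JE might look like (the version list and timestamp are illustrative):

```xml
<!-- Hypothetical maven-metadata.xml at .../maven/com/sleepycat/je/ -->
<metadata>
  <groupId>com.sleepycat</groupId>
  <artifactId>je</artifactId>
  <versioning>
    <latest>3.3.62</latest>
    <release>3.3.62</release>
    <versions>
      <version>3.2.76</version>
      <version>3.3.62</version>
    </versions>
    <lastUpdated>20081201000000</lastUpdated>
  </versioning>
</metadata>
```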
    Your 'sources' URLs in both POMs trigger a JAR download as I would expect, but the Javadoc URLs both generate 404s. (I'm not sure if any automated tools depend on these delivering content, as opposed to the values just being there for human reference.)
    HTH -- though the main message is "it's working the way I'd expect!"
    - Gordon @ IA

  • Issue in deploying Service Bus projects using Maven

    Hi,
    I am following this documentation for deploying Service Bus projects to my OSB server:
    http://docs.oracle.com/middleware/1213/core/MAVEN/osb_maven_project.htm#MAVEN8973
    I was able to run mvn package as described in the documentation, and I can see the sbar files in the .data/Maven directory of each project.
    But I see the following error when I run mvn pre-integration-test. Am I missing any steps?
    [INFO] --- oracle-servicebus-plugin:12.1.3-0-0:deploy (default-deploy) @ System ---
    [INFO] Service Bus Archive deployed using session Service_Bus_Maven-System-1424718369010.
    java.io.IOException: Unable to resolve 'weblogic.management.mbeanservers.domainruntime'. Resolved 'weblogic.management.mbeanservers'
            at weblogic.management.remote.common.ClientProviderBase.makeConnection(ClientProviderBase.java:237)
            at weblogic.management.remote.common.ClientProviderBase.newJMXConnector(ClientProviderBase.java:120)
            at javax.management.remote.JMXConnectorFactory.newJMXConnector(JMXConnectorFactory.java:369)
            at javax.management.remote.JMXConnectorFactory.connect(JMXConnectorFactory.java:267)
            at oracle.sb.maven.plugin.deploy.MBeanHelper$MBeanInvocationHandler.getJMXConnector(MBeanHelper.java:228)
            at oracle.sb.maven.plugin.deploy.MBeanHelper$MBeanInvocationHandler.invoke(MBeanHelper.java:131)
            at com.sun.proxy.$Proxy16.createSession(Unknown Source)
            at oracle.sb.maven.plugin.DeployMojo.createSession(DeployMojo.java:141)
            at oracle.sb.maven.plugin.DeployMojo.execute(DeployMojo.java:89)
            at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
            at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
            at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
            at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
            at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
            at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
            at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
            at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
            at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
            at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
            at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
            at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
            at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
            at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
            at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
            at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
    Caused by: javax.naming.NameNotFoundException: Unable to resolve 'weblogic.management.mbeanservers.domainruntime'. Resolved 'weblogic.management.mbeanservers' [Root exception is javax.naming.NameNotFoundException: Unable to resolve 'weblogic.management.mbeanservers.domainruntime'. Resolved 'weblogic.management.mbeanservers']; remaining name 'domainruntime'
            at weblogic.jndi.internal.BasicNamingNode.newNameNotFoundException(BasicNamingNode.java:1180)
            at weblogic.jndi.internal.BasicNamingNode.lookupHere(BasicNamingNode.java:270)
            at weblogic.jndi.internal.ServerNamingNode.lookupHere(ServerNamingNode.java:187)
            at weblogic.jndi.internal.BasicNamingNode.lookup(BasicNamingNode.java:210)
            at weblogic.jndi.internal.BasicNamingNode.lookup(BasicNamingNode.java:224)
            at weblogic.jndi.internal.BasicNamingNode.lookup(BasicNamingNode.java:224)
            at weblogic.jndi.internal.BasicNamingNode.lookup(BasicNamingNode.java:224)
            at weblogic.jndi.internal.RootNamingNode_WLSkel.invoke(Unknown Source)
            at weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:701)
            at weblogic.rmi.cluster.ClusterableServerRef.invoke(ClusterableServerRef.java:231)
            at weblogic.rmi.internal.BasicServerRef$1.run(BasicServerRef.java:527)
            at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:363)
            at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:146)
            at weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:523)

    Hi Siva,
    I am facing the same issue: I am able to create the package, but the deployment goal fails. I am following the same document as you.
    My mvn pre-integration-test run shows success in the result:
    [DEBUG]   Excluded: org.apache.maven:maven-artifact:jar:3.0.3
    [DEBUG] Configuring mojo com.oracle.servicebus.plugin:oracle-servicebus-plugin:12.1.3-0-0:package from plugin realm ClassRealm[plugin>com.oracle.servicebus.plugin:oracle-servicebus-plugin:12.1.3-0-0, parent: sun.misc.Launcher$AppClassLoader@65450f1f]
    [DEBUG] Configuring mojo 'com.oracle.servicebus.plugin:oracle-servicebus-plugin:12.1.3-0-0:package' with basic configurator -->
    [DEBUG]   (f) oracleHome = C:\Oracle\Middleware\Oracle_Home
    [DEBUG]   (f) project = MavenProject: LinuxMachine:TestOSBLinux:1.0-SNAPSHOT @ C:\JDeveloper\mywork\OSB-DEPLOY\LinuxMachine\TestOSBLinux\pom.xml
    [DEBUG]   (f) system = false
    [DEBUG] -- end configuration --
    [INFO]
    [INFO] --- oracle-servicebus-plugin:12.1.3-0-0:deploy (default-deploy) @ TestOSBLinux ---
    [DEBUG] Configuring mojo com.oracle.servicebus.plugin:oracle-servicebus-plugin:12.1.3-0-0:deploy from plugin realm ClassRealm[plugin>com.oracle.servicebus.plugin:oracle-servicebus-plugin:12.1.3-0-0, parent: sun.misc.Launcher$AppClassLoader@65450f1f]
    [DEBUG] Configuring mojo 'com.oracle.servicebus.plugin:oracle-servicebus-plugin:12.1.3-0-0:deploy' with basic configurator -->
    [DEBUG]   (f) oraclePassword = Oracle123
    [DEBUG]   (f) oracleServerUrl = http://192.168.137.150:7001
    [DEBUG]   (f) oracleUsername = weblogic
    [DEBUG]   (f) project = MavenProject: LinuxMachine:TestOSBLinux:1.0-SNAPSHOT @ C:\JDeveloper\mywork\OSB-DEPLOY\LinuxMachine\TestOSBLinux\pom.xml
    [DEBUG] -- end configuration --
    [INFO] Service Bus Archive deployed using session Service_Bus_Maven-TestOSBLinux-1429888616161.
    [INFO]
    [INFO] --- oracle-servicebus-plugin:12.1.3-0-0:deploy (deploy) @ TestOSBLinux ---
    [DEBUG] Configuring mojo com.oracle.servicebus.plugin:oracle-servicebus-plugin:12.1.3-0-0:deploy from plugin realm ClassRealm[plugin>com.oracle.servicebus.plugin:oracle-servicebus-plugin:12.1.3-0-0, parent: sun.misc.Launcher$AppClassLoader@65450f1f]
    [DEBUG] Configuring mojo 'com.oracle.servicebus.plugin:oracle-servicebus-plugin:12.1.3-0-0:deploy' with basic configurator -->
    [DEBUG]   (f) oraclePassword = Oracle123
    [DEBUG]   (f) oracleServerUrl = http://192.168.137.150:7001
    [DEBUG]   (f) oracleUsername = weblogic
    [DEBUG]   (f) project = MavenProject: LinuxMachine:TestOSBLinux:1.0-SNAPSHOT @ C:\JDeveloper\mywork\OSB-DEPLOY\LinuxMachine\TestOSBLinux\pom.xml
    [DEBUG] -- end configuration --
    [INFO] Service Bus Archive deployed using session Service_Bus_Maven-TestOSBLinux-1429888623126.
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building LinuxMachine 1.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
    [DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
    [DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
    [DEBUG] === PROJECT BUILD PLAN ================================================
    [DEBUG] Project:       LinuxMachine:LinuxMachine:1.0-SNAPSHOT
    [DEBUG] Dependencies (collect): []
    [DEBUG] Dependencies (resolve): []
    [DEBUG] Repositories (dependencies): [central (https://repo.maven.apache.org/maven2, default, releases)]
    [DEBUG] Repositories (plugins)     : [central (https://repo.maven.apache.org/maven2, default, releases)]
    [DEBUG] =======================================================================
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] TestOSBLinux ....................................... SUCCESS [ 20.752 s]
    [INFO] LinuxMachine ....................................... SUCCESS [  0.009 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    But it does not get deployed.
    When I run mvn com.oracle.servicebus.plugin:oracle-servicebus-plugin:deploy to invoke the deployment goal directly, it fails with the logs below:
    org.apache.maven.plugin.MojoNotFoundException: Could not find goal 'deploy*.*' in plugin com.oracle.servicebus.plugin:oracle-servicebus-plugin:12.1.3-0-0 among goals deploy, package
            at org.apache.maven.plugin.internal.DefaultMavenPluginManager.getMojoDescriptor(DefaultMavenPluginManager.java:275)
            at org.apache.maven.plugin.DefaultBuildPluginManager.getMojoDescriptor(DefaultBuildPluginManager.java:239)
            at org.apache.maven.lifecycle.internal.MojoDescriptorCreator.getMojoDescriptor(MojoDescriptorCreator.java:233)
            at org.apache.maven.lifecycle.internal.DefaultLifecycleTaskSegmentCalculator.calculateTaskSegments(DefaultLifecycleTaskSegmentCalculator.java:103)
            at org.apache.maven.lifecycle.internal.DefaultLifecycleTaskSegmentCalculator.calculateTaskSegments(DefaultLifecycleTaskSegmentCalculator.java:83)
            at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:85)
            at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:355)
            at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
            at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
            at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:216)
            at org.apache.maven.cli.MavenCli.main(MavenCli.java:160)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
            at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
            at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
            at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
    [ERROR]
    [ERROR]
    Thanks & Regards
    Prabhat

  • Building an ADF application using Maven

    I have a requirement to build an ADF application using Maven.
    My JDeveloper version is 11.1.1.3 and I have imported the Maven plugins.
    When I create a new Fusion application, a pom.xml is created in the model project but not in the viewcontroller project. Any reason for that?
    Please point me to the right article and example for building an ADF application using Maven.

    User,
    While you can build an ADF app using Maven, it's nowhere near easy or automated in terms of getting started. There are a few people doing it, but they have invested a lot of effort, including:
    * loading the ADF libraries into their corporate repositories
    * re-structuring their ADF projects to match the Maven directory standard (or customizing their POMs)
    * hand-building POMs for their ADF apps
    Most or all of the people I know who are doing this are doing all of these things without the benefit of the Maven integration provided in JDeveloper. If you search this forum and the ADF EMG (https://groups.google.com/forum/?fromgroups#!forum/adf-methodology), you can find plenty of discussions about this. There's no step-by-step guide that I know of, and it will definitely take you quite a bit of effort to get started. The Maven integration in JDeveloper is still in developer preview mode and, in my opinion, isn't ready for prime time, at least not with respect to ADF projects.
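The "customizing their POMs" option mentioned above would look roughly like this: overriding Maven's default source directory so a JDeveloper-style project layout can stay as-is (directory names are assumptions, not a documented recipe):

```xml
<!-- Hypothetical build section keeping JDeveloper's src/ layout -->
<build>
  <sourceDirectory>src</sourceDirectory>
  <resources>
    <resource>
      <directory>src</directory>
      <excludes>
        <exclude>**/*.java</exclude>
      </excludes>
    </resource>
  </resources>
</build>
```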
    John

  • What do you think about GoboLinux's FS layout?

    GoboLinux is a modular Linux distribution: it organizes the programs in your system in a new, logical way. Instead of having parts of a program thrown at /usr/bin, other parts at /etc and yet more parts thrown at /usr/share/something/or/another, each program gets its own directory tree, keeping them all neatly separated and allowing you to see everything that's installed in the system and which files belong to which programs in a simple and obvious way.
    http://www.gobolinux.org/
    They also seem to have a clear direction, or philosophy, like Arch does:
    http://gobo.kundor.org/wiki/The_GoboLinux_way
    GoboLinux's FS layout has some advantages:
    * Easy package management: no need to handle and track multiple files scattered across the system
    * Easy coexistence of multiple versions of the same package
    * Easy rollback: just change symlinks
    * Easy third-party application distribution: just make a bundle with everything it needs
    In fact, this layout is much closer to what OS X does. There's an obvious con: some packages might duplicate files, increasing disk usage. On the other hand, it would be a much friendlier Linux system for shipping applications, as it's easier to ship a bundle that installs itself correctly with everything it needs. I think the current historical UNIX FS layout is holding back a lot of innovation Linux could have on the desktop, by making it harder for developers to target and ship binaries for the platform.
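The rollback-by-symlink idea can be sketched in a few shell commands (the paths are illustrative, not GoboLinux's actual tooling):

```shell
set -e
root=$(mktemp -d)                       # stand-in for the real filesystem
mkdir -p "$root/Programs/Foo/1.0" "$root/Programs/Foo/2.0"

# "Current" marks the active version; upgrading is one symlink flip...
ln -sfn "$root/Programs/Foo/2.0" "$root/Programs/Foo/Current"
# ...and rolling back is another.
ln -sfn "$root/Programs/Foo/1.0" "$root/Programs/Foo/Current"

readlink "$root/Programs/Foo/Current"   # now points at version 1.0
```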
    No more distro teams duplicating effort to put source tarballs together. We have a lot of distros, but they are actually doing small variations of the same thing: building and packaging stuff from source, and taking care that one thing does not break another. The result is that if a package doesn't come from your distro, you're at high risk of breaking your system. This holds back third parties from targeting the platform, as they can't control package building and distribution themselves.
    As I said, OS X already uses a similar package concept, and look how many good applications it's attracting from both big companies and independent developers. And despite the core being an open kernel and GNU userland, all the relevant stack (core libraries, UI) is closed source! I ask myself: why is Linux, free and with a ton of open-source core libraries and without any vendor lock-in, NOT kicking ass in this respect? The only fault I see in Linux is the lack of cohesive package distribution. GoboLinux's approach seems like a step toward fixing this.
    So... I want your opinions, not so much about GoboLinux itself, but about its approach. Might we ever see an Arch spin-off using this approach, or another approach with similar results: effectively, package compartmentalization at the FS-layout level? Using this layout could alleviate a bit of the forced-upgrade burden that a rolling release brings.
    And for those who didn't know GoboLinux, give it a try.
    I hope this post can give a glimpse of new ideas and solutions to some of the incredibly talented people here. I think the Linux landscape is too immersed in inertia and old traditions that don't cope with today's needs; Arch was a refreshing oasis of innovation I found in that landscape.
    Last edited by freakcode (2008-09-30 22:39:47)

    jcasper wrote:
    freakcode wrote: But this scheme is less applicable to executables, and more to dynamic libraries, as binaries link to names like "lib-1.0.so.0.9.1", so those can be distinctly symlinked.
    Wouldn't you then need to link all your executables to specifically numbered library .so's?  Most executables, by default, are linked to /usr/lib/libfoo.so.1 where libfoo.so.1 is a symlink to libfoo.so.1.5.2 or whatever (where most libfoo.so.1.* are more or less backward compatible, so most executables linked to libfoo.so.1 will be just fine with the upgrade).   I don't see how this would work if package A needs libfoo 1.4.3 and package B needs libfoo 1.5.2 but both are linked to libfoo.so.1.   So you would need executable X linked to libfoo.so.1.4.3 and executable Y linked to libfoo.so.1.5.2.  And then when you upgrade the library, you need to re-link every package that links to it to take advantage of the new lib (throwing away the whole point of linking to libfoo.so.1 by default).
    Yes, my example wasn't clear. Executables X and Y are linked against lib-A.so, which in turn is a symlink to a minor version like lib-A.so.aa. When this lib gets updated and breaks compatibility, the maintainers bump the major version number so it becomes lib-B.so. Executable X can readily be updated to work with the new lib-B.so, while executable Y still works because lib-A.so and lib-B.so can coexist in the system. In this case, I don't need to wait for both X and Y to be updated to the new library before installing one of them. Minor versions (lib-A.so.aa, lib-A.so.ab, ...) don't affect most builds, as they normally link to a major release (lib-A.so, a symlink to the most recent minor version) that is guaranteed to maintain compatibility.
    For instance, this happened with libstdc++ too, though that was an exception because both branches (5 & 6) coexisted for some time.
    But don't take my word for it; see how GoboLinux actually manages this: http://www.gobolinux.org/index.php?page=k5 (What is it all about?)
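The coexistence scheme described above can be reproduced with plain symlinks (file names are illustrative):

```shell
set -e
dir=$(mktemp -d) && cd "$dir"

# Real files: two minor versions of the old major, one of the new major.
touch lib-A.so.aa lib-A.so.ab lib-B.so.ba

# Each major soname is a symlink to its newest minor version,
# so both majors coexist side by side.
ln -s lib-A.so.ab lib-A.so   # executable Y keeps resolving lib-A.so
ln -s lib-B.so.ba lib-B.so   # executable X resolves lib-B.so after its update

readlink lib-A.so   # -> lib-A.so.ab
readlink lib-B.so   # -> lib-B.so.ba
```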
    jcasper wrote: So I must be missing something, as I don't see how this new directory layout would improve the modularity of the distro packages. The cleanliness of the filesystem is arguably improved, but I agree with earlier posts that a good package manager like we have in Arch is just as effective. In Arch I can instantly see what package a file belongs to, see every file installed by a package, and easily add and remove packages. So in terms of stuff I can do easily, what do separate directories for each package get me?
    (Just to make it clear, I'm not directly comparing with what exists in Arch, nor saying that what we have today doesn't work, which also happens to be the case with all the other 300 distros. The fact that one thing works doesn't mean it is the only, or the best, possible solution. Hence the interest in discussing GoboLinux's approach.)
    The point is, if the layout is clean and sandboxed, you don't need a package manager. A package manager is an abstraction over a database relating which files belong to which packages; it ultimately tries to solve a fundamental flaw of the traditional Unix hierarchy, where files from one package are scattered across the filesystem and you then have no simple means to track them. That's hardly KISS. In GoboLinux's approach, you can track files to their packages by simply inspecting symlinks.
    /System/Links/Libraries] ls -l | cut -b 49-
    libgtk-1.2.so.0 -> /Programs/GTK+/1.2.10/lib/libgtk-1.2.so.0.9.1
    libgtk-1.2.so.0.9.1 -> /Programs/GTK+/1.2.10/lib/libgtk-1.2.so.0.9.1
    libgtk.a -> /Programs/GTK+/1.2.10/lib/libgtk.a
    libgtk.la -> /Programs/GTK+/1.2.10/lib/libgtk.la
    libgtk.so -> /Programs/GTK+/1.2.10/lib/libgtk-1.2.so.0.9.1
    libgtk-x11-2.0.la -> /Programs/GTK+/2.6.7/lib/libgtk-x11-2.0.la
    libgtk-x11-2.0.so -> /Programs/GTK+/2.6.7/lib/libgtk-x11-2.0.so.0.600.7
    libgtk-x11-2.0.so.0 -> /Programs/GTK+/2.6.7/lib/libgtk-x11-2.0.so.0.600.7
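    To illustrate the idea, here is a minimal sketch (a hypothetical `LinkOwner` helper, not Gobolinux code) of how a symlink target under /Programs/<Name>/<Version>/... can be mapped straight back to its owning package and version:

    ```java
    import java.nio.file.Path;
    import java.nio.file.Paths;

    // Hypothetical resolver: given a link target like the ones listed above,
    // recover the owning package name and version from the path components alone.
    public class LinkOwner {
        static String owner(String linkTarget) {
            Path p = Paths.get(linkTarget);
            // /Programs/GTK+/2.6.7/lib/... -> name components: Programs, GTK+, 2.6.7, ...
            if (p.getNameCount() >= 3 && p.getName(0).toString().equals("Programs")) {
                return p.getName(1) + " " + p.getName(2);
            }
            return "unknown";
        }

        public static void main(String[] args) {
            // Target taken from the listing above
            System.out.println(owner("/Programs/GTK+/2.6.7/lib/libgtk-x11-2.0.so.0.600.7"));
        }
    }
    ```

    No database needed: the path itself carries the package metadata.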
    On a side note now, having /bin and /usr/bin, /usr and /usr/local,... all that made sense once, with different hardware, for a different purpose. Does it make sense today, on a modern desktop system? There's enough room and flexibility to rethink and improve those ideas further *without* actually breaking compatibility, as Gobolinux and OS X have already shown. The only reason most Linux systems still stick to the old standards today is that Linux systems were originally intended as cheap Unix replacements. Funnily enough, OS X, which doesn't follow the Unix hierarchy standard so closely, is regarded as fully Unix compatible and marketed using the Unix trademark - whereas Linux isn't.
    Last edited by freakcode (2008-10-01 20:16:09)

  • Flash 8: Change "Test Movie" Working Directory

    I just posted this with Google Groups but it doesn't appear
    to be showing up on the adobe forums, I apologize for those that
    see this twice.
    How do I change the working directory that flash uses when I
    do Test Movie (ctrl-enter)?
    Directory Layout:
    project/
    ..index.html
    ..source/
    ....some.fla
    ..images/
    ....some.png
    ..flash/
    ....some.swf
    project/index.html embeds project/flash/some.swf.
    project/flash/ some.swf loads project/images/some.png. I want to be
    able to use Flash 8's Test Movie feature but when I do it looks for
    the image at project/flash/images/some.png which doesn't exist. How
    can I tell Flash to pretend it's running in project/ instead of
    project/flash/?

    The project is too large and has too many people working on
    it to move files around between author-time and run-time. We are
    just starting the project now so there are only a few SWFs and
    images but the final project will likely have hundreds of SWF files
    and even more images. Because of this, it is unrealistic for us to
    just drop all the SWF files in the top level of the project (where
    index.html will reside) as that directory would be too disorganized
    to work with.
    Note: In this project SWFs dynamically load other SWFs at
    runtime based on a relative path to the project root directory
    (where index.html is).

  • Pointing to the application directory

    I've been using this (LeaveApp) folder for development. Now I want to point to this folder, which is in c:\LeaveApp. How do I do it?
    Please show the guideline or url if you know it. Thanks.

    You know you can deploy more than just the ROOT webapp, right (assuming you're talking about the ROOT webapp that comes with Tomcat)?
    The proper way to create a redistributable webapp is to create the complete folder structure as it would appear in ROOT, with just the files from your app, and then zip them up into a war file (you can use WinZip or the Java jar tool to create the war file; just name the file .war instead of .zip or .jar).
    So imagine you have a directory layout like this:
    ../myapp/*.jsp
    ../myapp/WEB-INF/web.xml
    ../myapp/WEB-INF/lib/*.jar
    ../myapp/WEB-INF/classes/...
    You should then zip up the contents of the myapp folder (the myapp folder itself should NOT be in the zip) and call the file myapp.war. Then when you deploy it (for example by placing it in the webapps directory of a Tomcat installation), it will be available on the server as (for example) http://localhost:8080/myapp/ (the name is derived from the name of the war file by default).
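    If you'd rather script the packaging than use WinZip, here is a rough sketch using the JDK's zip classes (the `WarPacker` class name and paths are hypothetical); it mirrors what `jar cvf myapp.war -C myapp .` does:

    ```java
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.List;
    import java.util.stream.Collectors;
    import java.util.stream.Stream;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    // Sketch: zip the *contents* of a webapp folder (not the folder itself)
    // into a .war file.
    public class WarPacker {
        static void pack(Path appDir, Path warFile) throws IOException {
            List<Path> files;
            try (Stream<Path> walk = Files.walk(appDir)) {
                files = walk.filter(Files::isRegularFile).collect(Collectors.toList());
            }
            try (ZipOutputStream zip = new ZipOutputStream(Files.newOutputStream(warFile))) {
                for (Path p : files) {
                    // Entry names are relative to appDir, so appDir itself never appears in the archive
                    zip.putNextEntry(new ZipEntry(appDir.relativize(p).toString().replace('\\', '/')));
                    Files.copy(p, zip);
                    zip.closeEntry();
                }
            }
        }
    }
    ```

    The key detail is the relativize() call: entries are named WEB-INF/web.xml and so on, not myapp/WEB-INF/web.xml, which is exactly the "folder itself should NOT be in the zip" rule above.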

  • Best Practices in Building/Stage-In

    What are the best practices for the building/stage-in process? I tried to use a CVS repository to have employees stage in their projects, after which a pair of dedicated builders check them out and compile them, completing the elevation to production.
    I'm experiencing challenges, however - namely:
    1) Our organization's process is that should development activities continue, the developer needs to formally check-out the production copy, compiled by the builders.
    2) Eclipse build paths are relative to the computer of the developer, so references to JAR's are invalid and need to be resolved manually.
    - I attempted to use a shared folder, mapped to a drive (Drive T:\) but is there any better way of making sure the correct JARs are referenced and that the references will still work after commit.
    3) CVS conflicts arise when we try to check in a project (from production) that already has a newer copy committed by a developer (this is not allowed - but for the purposes of SIT, we needed to test this case)
    Also we're using Eclipse Build. Is there any better process of building?
    Thanks a lot for your help.

    801661 wrote:
    1) Our organization's process is that should development activities continue, the developer needs to formally check-out the production copy, compiled by the builders.No idea what you're trying to say here.
    2) Eclipse build paths are relative to the computer of the developer, so references to JAR's are invalid and need to be resolved manually.
    - I attempted to use a shared folder, mapped to a drive (Drive T:\) but is there any better way of making sure the correct JARs are referenced and that the references will still work after commit.Not sure what you mean about "references still working after commit."
    3) CVS conflicts arise when we try to check in a project (from production) that already has a newer copy committed by a developer (this is not allowed - but for the purposes of SIT, we needed to test this case)No idea what you're saying here. The whole notion of "checking in from production" does not compute.
    In general, however, any time two developers work on the same file, there's a chance for conflicts. Most VCSs come with a conflict resolution/merge tool.
    Also we're using Eclipse Build. Is there any better process of building?An IDE's build can be fine for individual developers' intra-day builds, but for nightly builds that are to be promoted to QA or Production, you'll usually use a build tool, such as ant or maven, possibly driven by another process-managing tool such as cruisecontrol.
    Other than that, I'm not following exactly what your processes or problems are, so I'll just try to offer some general tips that have worked for me.
    1. When it's time to cut an official build from the latest checked-in code, briefly disallow checkins, whether by shouting over cube-tops or by administratively enforcing a lock on your VCS. (I don't know if CVS supports that or not, but most VCSs should.) The buildmaster labels the current state of "main" or "trunk", and may even preemptively create a branch that's rooted there. Once the label has been applied, checkins can be re-enabled. This is all often done automatically late at night.
    2. The buildmaster does a fresh checkout against the new label, and builds from there.
    3a. For 3rd party jars that your application uses, create a spot in the repository, e.g. /thirdparty, and stick the jars in there, using whatever directory layout and versioning is appropriate for you. When you label for a build in step 1, make sure you label the thirdparty tree as well, so that you can always get back to the proper version of the entire repository for a given build.
    3b. Alternatively, there's a tool called maven that can automate and simplify (after an initial learning curve) the management of those dependencies.
    4. For paths that are needed by the developers' environments, pick a standard location. Developers can either go with that, and all its attendant simplicities, or they can arrange things how they want, but they are still individually responsible for a) getting their work done in a timely fashion, b) managing their own environment, without being able to rely on the common knowledge of the rest of the group, and c) not doing things that rely on their particular arrangement and hence end up breaking for everyone else.
    5. CVS, while serviceable for simpler projects, lacks some advanced features that other VCSs have. Subversion is a pretty good free tool. It was based on CVS, I think, or at the very least, its commands are almost identical to CVS's in a lot of cases. Perforce is somewhat more feature rich, I think, but quite a bit more complex, and not free. Git is supposed to be gaining popularity, and is free I think, but I've never used it. Clearcase is very powerful, but it's also expensive, and pretty much requires a full-time admin.

  • Working with local xml file

    Hi, I would like to work with a local xml file, but I don't want to point to a strict location as I would like my application to run on a few different machines.
    Is there a special place within Netbeans project structure that I can place such files?
    Say if I try and load a file "somefile.xml" - where is it going to look first?
    I have a <default package> with my settings.xml in there. How can I reference this within my app?
    Edited by: 993541 on Mar 16, 2013 9:25 AM

    993541 wrote:
    Is there a special place within Netbeans project structure that I can place such files?dunno the Netbeans project structure at all, but I'd suggest the Maven project structure, which is widely used: http://maven.apache.org/guides/introduction/introduction-to-the-standard-directory-layout.html
    There you would place such a file in the <tt>main/resources</tt> folder
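    For reference, the relevant part of that standard layout looks roughly like this:

    ```
    my-app/
        pom.xml
        src/
            main/
                java/           <- application sources
                resources/      <- files copied onto the classpath (e.g. somefile.xml)
            test/
                java/
                resources/
    ```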
    Say if I try and load a file "somefile.xml" - where is it going to look first?It's going to look for it where you tell your program to.
    When created as <tt>new File("somefile.xml")</tt> it will be searched for in the <i>current working directory</i>. The problem with that is that it is unreliable what this will be at runtime (once your program leaves your IDE...).
    You'd be better off getting it via <tt>getClass().getResource("somefile.xml")</tt>. But in this case the file must be present on the classpath in the same package as the Class acquiring it. Adding a <tt>'/'</tt> in front of the file name expects it in the root directory of a classpath entry.
    I have a <default package> with my settings.xml in there. How can I reference this within my app?<tt>getClass().getResource("/settings.xml")</tt>
    But in case you include it in the delivery jar file it will not be writable. Also you cannot expect the installation folder of your app to be writable. On startup of your program you should copy your (default) settings to a writable place like <tt>new File(System.getProperty("user.home"), ".myApp/settings.xml")</tt> and modify it there.
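    That copy-on-first-run step could be sketched like this (the <tt>Settings</tt> class, its method name, and the <tt>.myApp</tt> location are made up for illustration):

    ```java
    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;

    // Sketch: copy read-only default settings to a writable per-user file,
    // unless a (possibly user-modified) copy is already there.
    public class Settings {
        static Path ensureWritableCopy(InputStream defaults, Path target) throws IOException {
            if (Files.notExists(target)) {
                Files.createDirectories(target.getParent());
                Files.copy(defaults, target);
            }
            return target;
        }
    }
    ```

    At startup you'd pass <tt>getClass().getResourceAsStream("/settings.xml")</tt> as the defaults stream and <tt>Paths.get(System.getProperty("user.home"), ".myApp", "settings.xml")</tt> as the target; the existence check keeps later runs from clobbering the user's edits.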
    bye
    TPD

  • Session Property Set not available in Portal Administration Console

    We have a Portal Application (WLP 10.0 MP1) that needs to have visitor entitlements applied to allow some customization based upon a Session Property Set. The property set has been defined in Workshop and the .ses file is now included in our Portal EAR file in the /META-INF/data directory. When the EAR is deployed on a local (winXP) domain the property set is then available in the portal admin console and can be used to build entitlements. When the EAR is deployed to our test server (Solaris 10) the property set does not become available in the portal admin console. Both servers are in development mode and already have a previous version of the EAR deployed in streaming mode.
    Is there anything obvious that needs to be done to ensure that a property set is enabled when deploying a new EAR?

    We are actually performing the portal EAR build using maven but are creating the same directory layout as is created by Workshop when a data sync project is used and an EAR exported, i.e. the contents of the data sync project ends up in the EAR /META-INF/data directory.
    The mystery here is that the property set correctly deploys in one environment but not in another. I'd really like to know whether there are any particular tricks to the property set deployment and what the deployent process actually does. Could issues with the target environment prevent correct deployment of the property set?

  • Can I use multiple coherence instances on one JVM?

    Or can I explicitly designate a tangosol-coherence-override.xml for Coherence? If I can, how?
    Thanks.
    Edited by: user8028833 on May 31, 2010 1:35 AM

    OK apologies as I was going to just post the bits of code but I didn't get much time last night.
    This code was originally written by a couple of us who wrote some security and single sign on code for Coherence. The project is on Google Code here: http://code.google.com/p/coherence-security/
    The source is a Maven project so it follows the Maven directory structure. If you look in the coherence-security-core test sources there is a package called com.oracle.coherence.patterns.security.test.starter which contains all the ClassLoader code and classes to start a Coherence "cluster" in a single JVM.
    The com.oracle.coherence.patterns.security.ApacheDirectoryTest test class uses this code to start a cluster to run its unit tests. To start the cluster you call CoherenceClusterStarter.getInstance() for example the com.oracle.coherence.patterns.security.ApacheDirectoryTest testSecureClientWithUnauthorisedCredentials() method.
        @Test
        public void testSecureClientWithUnauthorisedCredentials() throws Throwable {
            // Set the main class that runs to be the Secure servers
            System.setProperty("coherence.incubator.mainclass", SecureSandboxStarter.class.getName());
            // Make sure the cluster members run with the [email protected] Principal
            System.setProperty("sun.security.krb5.principal", "[email protected]");
            CoherenceClusterStarter.getInstance();
            // Run the client as the Thomas the Tank identity ([email protected]) which is
            // not authorised to connect as an Extend client in the extend-permissions.xml file
            System.setProperty("sun.security.krb5.principal", "[email protected]");
            LoginContext lc = new LoginContext("Coherence");
            lc.login();
            Subject subject = lc.getSubject();
            try {
                Subject.doAs(subject, new PrivilegedExceptionAction<Object>(){
                    public Object run() throws Exception {
                        SystemPropertyLoader.loadEnvironment("client");
                        System.setProperty("tangosol.coherence.security", "true");
                        CacheFactory.getCache("test");
                        return null;
                    }
                });
                fail("Expected to have a PortableException wrapping a java.lang.SecurityException thrown");
            } catch (PortableException e) {
                assertEquals("Portable(java.lang.SecurityException)", e.getName());
            }
        }
    In the code above the first line sets the coherence.incubator.mainclass System property to the name of the class that will be run (in this case our secured DefaultCacheServer wrapper). Then the next line starts the pseudo-cluster which the rest of the code uses.
    Note
    The Google Code project above was originally written due to a requirement to secure Coherence using Active Directory (Kerberos). The code was started on 3.4, eventually moved to 3.5.3, and was then used to help Oracle change the Security API for 3.6. There are a few things not quite right and a few "gotchas" that are not in the code online.
    Anyone is welcome to take it and use it but I and the others who worked on it offer no warranty or guarantee it will work for you.
    As of Coherence 3.6 there are much simpler ways to do security - although if you want to use Active Directory or Kerberos some of the code on Google will still be useful.
    Any problems or questions give me a shout either here or mail [email protected].
    Regards
    JK

  • Multiple instances of Apache

    We have the default setup. I would like to know if it is possible to run another instance of Apache (on another port on the same system). Can this be monitored with Server Admin?

    You want a whole different instance of Apache? or you want apache to listen on multiple ports?
    Both can be achieved, but there's usually little value in running a second instance unless you want to run two different versions for some reason (e.g. apache 1.3 and 2.1).
    On the other hand, a single apache server can listen on multiple ports and serve different content depending on the port number the connection comes in on. This is a common way of implementing virtual hosts and has the advantage that all the common code can be shared, reducing the memory and CPU overhead.
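    That single-server, multi-port setup could look roughly like this in httpd.conf (ports and paths here are hypothetical examples):

    ```apache
    # Listen on the default port plus a second one
    Listen 80
    Listen 8080

    # Serve different content depending on which port the connection arrives on
    <VirtualHost *:80>
        DocumentRoot "/Library/WebServer/Documents"
    </VirtualHost>

    <VirtualHost *:8080>
        DocumentRoot "/Library/WebServer/Other"
    </VirtualHost>
    ```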
    If you do want to run two copies, you can do so - you'll have to download the apache source and build a custom version that uses a different configuration file and directory layout. You'll also need to write your own startup script, and Server Admin won't have any visibility of or knowledge about the second instance. In other words, you're on your own.
