Best Practices or Project Template for Rep/Version

I have installed the Repository 6i (3) and created the users successfully, even though it has taken a lot of effort to make sure each step is correct.
However, on setting up the workareas and importing the project files, I have been going back and forth trying to figure out where things go and who has what access.
Is there something like a best practice or a project template for setting up a basic repository/version control system, that provides
1. the repository structure,
2. corresponding file system structure (for different developers, build manager, etc)
3. access grants, and
4. work scenarios, etc.
The Technet demos and white papers are either too high-level (basic) or too narrowly focused on individual functions. I can't get a clear picture of the whole thing, since there are so many concepts and elements that don't easily fit together.
Considering that I am a decent DBA and developer, it has taken me 2 weeks, and I am still not ready to sign up other developers to use this thing. How do you expect any small development team to ever use it? It's one thing to design it to be scalable and all-encompassing; it's another to make it easily usable. It has been suggested that I use MS VSS. The only reason I am still trying Ora-Rep is its promise to directly support Designer and Oracle objects.

Andy,
I have worked extensively with the Repository over the last year and a half. I have collected some of my experiences, and the guidelines derived from them, on using the Repository in real life in a number of papers that I will be presenting at ODTUG 2001, next week in San Diego. If you happen to be there (see www.odtug.com), come and see me and we can talk through your specific situation. If you are not, but are interested in those papers, drop me an email and I will send them to you (they will probably also become available on OTN right after the ODTUG conference).
best regards,
Lucas

Similar Messages

  • Best practice on Oracle VM for Sparc System

    Dear All,
    I want to test Oracle VM for SPARC, but I don't have a new-model server to test it on. What is the best practice for Oracle VM for SPARC?
    I have a Dell laptop with the following specs:
    - Intel® Core™ i7-2640M (2.8 GHz, 4 MB cache)
    - RAM: 8 GB DDR3
    - HDD: 750 GB
    - 1 GB AMD Radeon graphics
    I want to install Oracle VM VirtualBox on my laptop and then install Oracle VM for SPARC inside VirtualBox. Is that possible?
    Please kindly give advice.
    Thanks and regards,
    Heng

    Heng Horn wrote:
    How about computer desktop or computer workstation with the latest version has CPU supports Oracle VM or SPARC?Nope. The only place you find SPARC T4 processors is in Sun Servers (and some Fujitsu servers, I think).

  • Help with trying to locate Illustrator templates for CS version 1

    I'm trying to locate a copy of the templates that were issued as part of Illustrator CS version 1. Specifically, I'm looking for a file by the name of "flyer3.ait". I would, ideally, like to access all of the templates for CS version 1, because I am reading the Adobe Illustrator CS Idea Kit book by Jerome Holder & Barbara Mulligan, and they use these templates as starter files for their book. Thank you in advance.

    I really doubt that you are going to find them unless someone has a version of CS1 installed somewhere on an old machine. In the newer versions the template files are in a folder called Cool Extras which may or may not have been installed. Installation of this folder was an option in the newer versions. In AICS3, there is a template called flyer.ait in the above folder.

  • Best practice to define length for varchar field of table in sql server

    What is the best practice for defining the length of a varchar field in a table?
    For example, should a field such as "Remarks By Person" be varchar(max) or varchar(4000)?
    Could the choice affect optimization in the future?
    Experts, please reply.
    Dilip Patil

    Hi Dilip,
    varchar(n | max) stores variable-length, non-Unicode character data. n defines the string length and can be a value from 1 through 8,000; max indicates that the maximum storage size is 2^31-1 bytes (2 GB). The storage size is the actual length of the data entered plus 2 bytes. We use varchar when the sizes of the column data entries vary considerably, and varchar(max) when an entry might exceed 8,000 bytes.
    So the conclusion is, as Uri said: whether to use varchar(max) or varchar(4000) depends on how many characters you are going to store.
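    For illustration, here is a minimal T-SQL sketch (table and column names are hypothetical): a remarks column that will always fit within the row limit can be declared varchar(4000), while one that may grow past 8,000 bytes needs varchar(max).
        -- Hypothetical table: remarks capped at 4,000 characters
        CREATE TABLE PersonRemarks (
            PersonId int           NOT NULL,
            Remarks  varchar(4000) NULL    -- stored in-row, within the 8,000-byte limit
        );
        -- If an entry might exceed 8,000 bytes, widen the column instead:
        ALTER TABLE PersonRemarks ALTER COLUMN Remarks varchar(max) NULL;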
    The following document about varchar in SQL Server is for your reference:
    http://technet.microsoft.com/en-us/library/ms176089.aspx
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Best practices to reduce downtime for Database releases (rolling changes)

    Hi,
    What are the best practices for reducing downtime during database releases on 10.2.0.3? Which DB changes can be applied in a rolling fashion and which can't?
    Thanks in advance.
    Regards,
    RJiv.

    I would be very dubious about any sort of universal "best practices" here. Realistically, your practices need to be tailored to the application and the environment.
    You can invest a lot of time, energy, and resources into minimizing downtime if that is the only goal. But you'll generally pay for that goal in terms of developer and admin time and effort, environmental complexity, etc. And you generally need to architect your application with rolling upgrades in mind, which necessitates potentially large amounts of redesign to existing applications. It may be perfectly acceptable to go full-bore into minimizing downtime if you are running Amazon.com and any downtime is unacceptable. Most organizations, however, need to balance downtime against other needs.
    For example, you could radically minimize downtime by having a second active database, configuring Streams to replicate changes between the two master databases, and configuring the middle tier environment so that you can point different middle tier servers at one database or the other. When you want to upgrade, you point all the middle tier servers at database A, other than one that lives on a special URL. You upgrade database B (making sure to deal with the Streams replication environment properly, depending on requirements) and do the smoke test against the special URL. When you determine that everything works, you point all the app servers at B (with the Streams replication process configured to propagate changes from the old data model to the new data model), upgrade A, repeat the smoke test, and then return the middle tier environment to its normal state of balancing between the databases.
    This lets you upgrade with zero downtime. But you've got to license another primary database. And configure Streams. And write the replication code to propagate the changes made on one database while you're smoke testing the other. And you need the middle tier infrastructure in place. And you're obviously going to involve more admins than you would for a simpler deploy where you take things down, reboot, and bring things up. The test plan becomes more complicated as well, since you need to practice this sort of thing in lower environments.
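    As a concrete illustration of the repointing step (host and service names below are hypothetical, and this assumes the middle tier resolves the database through a TNS alias), the cutover between A and B can be a one-entry change in tnsnames.ora:
        # Point this middle tier at database A; change HOST to dbhost-b
        # (and back) when cutting over between the two masters.
        APPDB =
          (DESCRIPTION =
            (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost-a)(PORT = 1521))
            (CONNECT_DATA = (SERVICE_NAME = proddb))
          )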
    Justin

  • Install sharepoint 2013 project templates for VS 2012 without connecting to internet - offline install

    Hi,
    Is there any way I can install the SharePoint developer tools on my dev machine for working with SP 2013 and VS 2012?
    I am not getting the SharePoint project templates when I go to VS 2012.
    The link says that I need to connect to the internet and install Web Platform Installer 6.0, but my machine can't connect to the net.
    If anyone has done an offline installation of the SP 2013 project templates for VS 2012, please provide the links/steps.
    Help is highly appreciated!

    Check this blog:
    You need to install the Microsoft Office Developer Tools for Visual Studio 2012 to get the project templates.
    http://blogs.msdn.com/b/timquin/archive/2013/01/22/setting-up-visual-studio-2012-for-sharepoint-2013-development-offline.aspx

  • What is the best Practice JCO Connection Settings for DC  Project

    When multiple users are using the system, data is missing from the Web Dynpro screens. This seems to be due to running out of connections to pull data.
    I have a Web Dynpro project based on component development using DCs. I have one main DC which uses other DCs as lookup windows. All DCs have their own apps. Also, inside the main DC screen, the data is populated from multiple function modules.
    There are about 7 lookup DC apps accessed by the user.
    I have created JCO destinations with the following settings:
    Max Pool Size 20
    Max Number of Connections 200
    Before I moved to a DC project, it was a regular Web Dynpro project with one application, and all lookup windows were inside the same project. I never had this issue with the same settings.
    Now, maybe because of the DC usage and the increase in applications, I am running out of connections.
    Has anyone faced this problem? Can anyone suggest the best practice for sizing JCO connections?
    It does not make any sense that I am seeing this issue with just 15-20 concurrent users.
    All lookup components are destroyed after use and created manually as needed. What else can I do to manage connections?
    Any advice is greatly appreciated.
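    As a rough sanity check on the numbers above (assuming each open lookup application holds one pooled connection until its component is destroyed): 20 concurrent users x 8 applications (1 main + 7 lookups) could demand up to 160 simultaneous connections. That stays under the 200-connection ceiling, but it is far above the pool size of 20, so most requests must open and close connections outside the pool, and any spike past the ceiling will fail outright.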
    Thanks

    Hi Ravi,
    Try going through this blog; it's very helpful:
    Web Dynpro Best Practices: How to Configure the JCo Destination Settings (http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/1216)
    Hope it will help.
    Regards
    Jeetendra

  • Is the DAO pattern the best practice in projects

    Let me know whether the DAO pattern is the one most commonly followed in almost all projects, or whether alternatives to it are being found. Please clarify this for me; I would also like to know the industry's best practices for using design patterns.

    There is no 'best' pattern; it is all about how and where to apply them. This is very true, but these are common design patterns used in industry for standard problems. Most of the time patterns are used not for some special reason but for manageability and ease of change. So if you have a small application, then it's OK; but if you are working on a big application which needs to be maintained over time, and changes are frequent, then it's better to start learning about patterns, because there will be problems which you can't see right now but will eventually have to take care of.
    That is either incorrect or phrased poorly.
    Patterns come about because someone analyzes different existing code bases and notes that there are similarities in the way they are built.
    It isn't that they are easier to maintain, but rather that because the pattern has similarities, it is easier to comprehend, understand the limitations, understand the possible related patterns, etc. That might lead to easier maintenance, but it isn't the reason. A pattern is used properly only if the requirements and architecture lead to a situation where it can be applied.
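    For readers new to the pattern, here is a minimal, self-contained Java sketch of a DAO (all names are hypothetical); the point is that callers depend only on the interface, so the persistence mechanism can change without touching them:
        import java.util.HashMap;
        import java.util.Map;

        interface UserDao {
            String findNameById(long id);     // read
            void save(long id, String name);  // create / update
        }

        // In-memory implementation; a JDBC- or JPA-backed one could be
        // swapped in later without changing any calling code.
        class InMemoryUserDao implements UserDao {
            private final Map<Long, String> store = new HashMap<>();
            public String findNameById(long id) { return store.get(id); }
            public void save(long id, String name) { store.put(id, name); }
        }

        public class DaoDemo {
            public static void main(String[] args) {
                UserDao dao = new InMemoryUserDao();
                dao.save(1L, "Alice");
                System.out.println(dao.findNameById(1L)); // prints "Alice"
            }
        }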

  • Best Practice Advice - Using ARD for Inventorying System Resources Info

    Hello All,
    I hope this is the right place to post a question like this. If not, please direct me to a better location for a topic of this nature.
    We are in the process of utilizing ARD reporting for all the Macs in our district (3500 +/- a few here and there). I am looking for advice and would like some best practices ideas for a project like this. ANY and ALL advice is welcome. Scheduling reports, utilizing a task server as opposed to the Admin workstation, etc. I figured I could always learn from those with experience rather than trying to reinvent the wheel. Thanks for your time.

    Hey, I am also interested in any tips. We are gearing up to use ARD for all of our Macs, current and future.
    I am having a hard time entering the user/pass for each machine; is there an easier way to do so? We don't have nearly as many Macs running as you do, but it's still a pain to do each one over and over. Any hints? Or am I doing it wrong?
    thanks
    -wilt

  • Best practices when performing carry forward for audit adjustments

    Dear experts,
    I would like to know if someone can share his best practices when performing carry forward for audit adjustments.
    We are actually doing legal consolidation for one customer and we are facing one issue.
    The accounting team needs to pass audit adjustments around April-May for last year.
    So from January to April / May, the opening balance must be based on December closing of prior year.
    Then from May / June to December, the opening balance must be based on Audit closing of prior year.
    We originally planned to create two members for December period, XXXX.DEC and XXXX.AUD
    Once the accountants would know their audit closing balance, they would have to input it on the XXXX.AUD period and a business rule could compute the difference between the closing of AUD and DEC periods and store the result on an opening flow.
    The opening flow hierarchy would be as follow:
    F_OPETOT (Opening balance Total)
        F_OPE (Opening balance from December)
        F_OPEAUD (Opening balance from the difference between closing balance of Audit and December periods)
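    In other words, the business rule would compute, per account: F_OPEAUD = Closing(XXXX.AUD) - Closing(XXXX.DEC), so that F_OPETOT = F_OPE + F_OPEAUD equals the audited opening balance.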
    Now, assume that we are in October but, for some reason, an accountant runs a carry forward for February: he is going to impact the opening balance, because at this time (October) we already have the audit adjustments.
    How to avoid such a thing? What are the best practices in this case?
    I guess it is something that you may have encounter if you did a consolidation project.
    Any help will be greatly appreciated.
    Thanks
    Antoine Epinette

    Cookman and I have been arguing about this since the Paleozoic era. Here's my logic for capturing everything.
    Less wear and tear on the tape and the deck.
    You've got everything on the system. Can't tell you how many times a client has said "I know that there was a better take." The only way to disabuse them of this notion is to look at every take. If it's not on the system, you've got to spend more time finding the tape, adding "wear and tear on the tape and the deck." And then there's the moment where you need to replace the audio for one word from another take. You can quickly check all the other takes (particularly if you've done a thorough job logging the material; see below).
    Once it's on the system, you still need to log and learn the material. You can scan through material much faster once it's captured. Jumping around the material is much easier.
    There's no question that logging the material before you capture makes you learn the material in a more thorough way, but with enough self-discipline, you can learn the material just as thoroughly once it's been captured.

  • Best practices with LDIF Development for RBAC?

    I'm currently working on enforcing RBAC (Role Based Access controls) in OID that may be subject to change every few months. What I've currently been doing is writing LDIF files to make changes to the existing RBAC once the changes have been finalized.
    Unfortunately, now we have ended up with a growing list of LDIF files that must be run in sequential order if we were to build a new environment. Any defects or development errors that slip through developer unit testing must be handled in the same manner.
    What is the best practice for performing this type of development? Would it make more sense to have one LDIF file that removes all of the RBAC enforcement (via ldapmodify -c), and then a separate file that installs the latest and most up-to-date version? I've also considered using just one LDIF file, appending any updates to the end of it, and running the ldapmodify command with the -c parameter.
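    For concreteness, a minimal sketch of the teardown-plus-rebuild approach described above (all DNs are hypothetical):
        # teardown.ldif -- drop the existing role entries; run with
        # "ldapmodify -c" so deletes of entries that do not exist yet
        # are skipped instead of aborting the run
        dn: cn=app_admin,cn=Groups,dc=example,dc=com
        changetype: delete

        # install.ldif -- recreate the current RBAC definition from scratch
        dn: cn=app_admin,cn=Groups,dc=example,dc=com
        changetype: add
        objectClass: groupOfUniqueNames
        cn: app_admin
        uniqueMember: cn=jdoe,cn=Users,dc=example,dc=com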

    With regard to the 29.97/30 thing, you'll find that video people are idiosyncratically imprecise about that. We say 60 when we mean 59.94, we say 30 when we mean 29.97 and we say 24 when we mean 23.976.
    We're quirky.
    Whenever somebody says one of those nice, round numbers, you can assume they're really talking about the corresponding ugly fraction.
    Unless they're film people, in which case 24 means 24, dangit.

  • Best practice when upstream does not provide version number

    I'm currently preparing a package for the game Dreamfall Chapters; unfortunately, the developer does not (yet) provide a version number.
    What is the best practice for this? Just count the releases and set epoch when they finally provide one themselves?
    thanks in advance!

    If you are missing a version number, you could use the date. Prefix it with "r" in case you want to avoid epoch when upstream decides to start versioning.
    pkgver=r20140122
    If you have another release on the same day, increment another subversion
    pkgver=r20140122.1
    Counting the releases by hand is poor practice (you could miss one). Do it only if upstream provides the count.
    Edit: To tie the version more strongly to the release, you can also add part of the archive hash, e.g. +m###### with the first 6 characters of the md5sum.
    pkgver=r20140122+m23df1e
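    For example, the suffix could be taken from the downloaded archive like this (filename hypothetical):
    md5sum DreamfallChapters.tar.gz | cut -c 1-6    # prints the first 6 hex chars, e.g. 23df1e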
    Edit: r as prefix is better, thanks rumpelsepp.
    Last edited by progandy (2014-10-22 09:32:56)

  • Best practices on using OLM for sign-off on compliance policies?

    We are working on a project to build a parent/child learning object that will include four compliance policies and a "test" to sign off acceptance of these policies as a condition of employment. Has anyone implemented something like this? If so - can you please share some best practices around this type of implementation. Our plan is to build a learning certification.
    We are also interested in looking into how to automate the assignment of the learning certification to all new hires. Any info here would also be appreciated.
    Thanks - Juli

    We use 11.5.10j... One thing I would recommend is creating an introductory module as a child object of the certification, and a concluding module as a child object as well.
    In the learning objects I have done where a parent/child relationship exists, the introductory module is important for several reasons, including instructing learners on how to expand the OLM Player outline on the left to move to the next module. (In 11.5.10j, at least, it doesn't expand automatically, so the general population is not going to understand that they need to expand it and move to the next module. You might try to tell people to do that independently of the course, but we've found it's best to put it as part of the course.) The introductory module also includes things like objectives, expected duration, completion requirements, audience, etc.
    As for the concluding module, I include a concluding module to instruct people that they have reached the end and provide a screenshot of the OLM Player "Home" button to instruct them to return to the OLM home page. Using the OLM "Home" button circumvents issues with popup blockers suppressing the passing of completion information back to OLM when exiting using the IE exit button and is generally a cleaner way to exit the course. Again, instructions might be provided independent of the learning object but most will not pay attention to those. There is a patch to test for the presence of popup blockers and alert the user so that there might be less need for a concluding module to do this but we have not been able to implement this yet due to problems caused after it was applied.

  • SAP Best Practices on assigning roles for Auditors

    Dear Gurus,
    We need to set up SAP roles for auditors in our system for SRM, ECC, and BI.
    Could you please suggest which roles should be granted to the auditors as best practice?
    I will really appreciate your help.
    Best Regards,
    Valentino

    Hi Martin,
    Thanks for your interest. I would be very happy to work with folks like you to slowly improve such roles as we find improvement possibilities for them, so that we all benefit from the joint knowledge and cool features which go into them. I have been filing away at a set of them for years now; they are not evil but still useful, and I give them to an auditor without being concerned, as long as they can tell me approximately what they have been tasked to look into.
    I then also show them the corresponding user menu of my role for these tasks and then leave them alone for a while... 
    Anyway... SAP told me that if we host the content on SDN for the collaboration and the documentation of the changes in the files, then version management of the files can be hosted externally for downloading them (actually, SAP does not have an option, because their software does not support it...).
    I will rather host them on my own site and add the link in the SDN wiki and a sticky forum post link to it than use a generic download service, at least to start with. Via change management to the wiki, we can easily map this to version management of the files on a monthly periodic update cycle once there are enough changes to the wiki.
    How about "Update Tuesday" as a maintenance cycle --> config updates each second Tuesday of the month... to remove authorizations to access backdoors which are more than "just display"...
    Cheers,
    Julius

  • Best practices or design framework for designing processes in OSB(11g)

    Hi all,
    We have been working in Oracle 10g; now, in a new project, we are going to use SOA Suite 11g. For 10g we designed our services very similarly to the AIA framework, but in 11g, since OSB is introduced, we are not able to fit the AIA framework exactly, because OSB has a different structure than ESB.
    Can anybody suggest best practices or some design framework for designing processes in OSB or 11g SOA Suite ?

    http://download.oracle.com/docs/cd/E12839_01/integration.1111/e10223/04_osb.htm
    http://www.oracle.com/technology/products/integration/service-bus/index.html
    Regards,
    Anuj
