RE: List Archive and Release Management

Hi back.
>> what procedures do people use to manage the development, testing
>> and release of software?
It's useful to have a separate repository for Development, QA, Production,
etc. It's true that you have to 'import' between them each time you
propagate changes. It might seem like a pain to constantly export/import,
but actually it's better that way.
Firstly, you should only export to QA when you're absolutely sure it's ready
for quality assurance, and this doesn't happen every day (or shouldn't).
Second, the advantage of this is that you get a 'clean' import and compile
each time, with no other baggage. This is especially critical when you
migrate from QA to a production environment.
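To take the tedium out of it, the export/import cycle can be scripted. Here's a minimal dry-run sketch that just generates the command lines; the 'fscript' command, its flags, and the repos/plan names are all invented for illustration, not Forte's actual syntax:

```python
# Dry-run sketch: generate the commands that would promote a set of plans
# from the Dev repos to the QA repos.  The "fscript" command, its flags,
# and the repos/plan names are placeholders, not Forte's real CLI.

def promotion_commands(plans, dev_repos="central_dev", qa_repos="central_qa"):
    """Return the shell command lines for one Dev -> QA promotion, in order."""
    cmds = []
    for plan in plans:
        # Export each plan from Dev into a portable .pex file...
        cmds.append(f"fscript -fr {dev_repos} exportplan {plan} {plan}.pex")
        # ...then import it into QA, so QA gets a clean import and compile.
        cmds.append(f"fscript -fr {qa_repos} importplan {plan}.pex")
    return cmds

if __name__ == "__main__":
    for cmd in promotion_commands(["CustomerProject", "OrderProject"]):
        print(cmd)
```

Generating the command list first also lets you review exactly what will move before anything touches the QA repos.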
There's no need to have a testing_verXX repository; just make a different
workspace for each version under test.
Just check out all the components, import, and integrate. The 'old'
workspace will have a snapshot of the old release.
* Don't Update that old workspace or you'll lose it!
* Put an administrator password on the repos, to protect against
inadvertent usage.
* It's expensive to have many repositories running on your environment
manager, so I recommend only one repos for each of Development, QA and
Deployment.
* You might want to consider only bringing up the production repos as
required, they hog RAM like crazy.
It's critical to have a complete image of each version of every application
you have in production. That way, you can recreate any problem.
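One low-tech way to keep those images findable is a manifest that maps each released version to its repository snapshot. A small sketch, with all app names, paths, and the JSON format invented for illustration:

```python
# Sketch: record which repos snapshot corresponds to each production
# release, so a customer problem can be reproduced against the exact
# image.  File names, app names, and paths are all illustrative.

import json

def record_release(manifest_path, app, version, snapshot, released):
    """Append one release -> snapshot mapping to a JSON manifest."""
    try:
        with open(manifest_path) as f:
            manifest = json.load(f)
    except FileNotFoundError:
        manifest = []
    manifest.append({"app": app, "version": version,
                     "snapshot": snapshot, "released": released})
    with open(manifest_path, "w") as f:
        json.dump(manifest, f, indent=2)

def find_snapshot(manifest_path, app, version):
    """Look up the repos image for the version a customer reports."""
    with open(manifest_path) as f:
        for entry in json.load(f):
            if entry["app"] == app and entry["version"] == version:
                return entry["snapshot"]
    return None

if __name__ == "__main__":
    record_release("releases.json", "MeatWorks", "2.3",
                   "/archive/meatworks_2.3_repos.tar", "1999-07-21")
    print(find_snapshot("releases.json", "MeatWorks", "2.3"))
```

When a customer reports a problem against version 2.3, the manifest tells you exactly which repos image to restore.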
It's even useful to migrate your development repos every now and then. As
you know, these repos beasties can grow quite large, and it's good to start
from scratch on a regular basis.
For example, every month or so, our core architecture team releases a new
version of the core components. We all integrate, and we create a new repos
from the core architecture stuff, and import all our stuff over the top of
it. That way, we keep a loose coupling between sub-systems, and their stuff
never depends on ours. Also, with a split development repos, you can locate
the different teams on different intranets, or even on different sides of
the world, all with relatively little fuss.
>> Forté SCM hooks and SCCS
We use a Unix version control system called CMVC. Whenever we integrate,
it checks out the pex files, exports them, and checks them back in.
John Pianezze
S1 Technologies (Asia Pacific)
Melbourne, Australia
-----Original Message-----
From: Duncan Kinnear [SMTP:[email protected]]
Sent: Wednesday, July 21, 1999 1:05 PM
To: [email protected]
Subject: List Archive and Release Management
Hi folks!
First of all, does anyone know what's going on with the list archive on
SageIT? There doesn't seem to be a search facility anymore. Seems a
bit weird when each message is appended with a little signature
'advertising' a searchable archive.
Second (and this is the biggy!), what procedures do people use to
manage the development, testing and release of software? I'm talking
about how you keep them separate, how you do hot-fixes, how you
identify installed versions at customer sites, etc.
I've thought that we could do it with separate repositories for "current
development", "testing_verXX", "release_verXY", etc., where we export
from "development" and then import into "testing", and similarly for
"release".
Also, we are looking at using the Forté SCM hooks and SCCS on our
Unix host to store historical versions of the projects/classes. But we
somehow need to identify which version of a particular component is
installed at the customer site. I had thought of defining a constant in
each project/class called "SCCS_VER" which contains the SCCS
keywords that get mapped to SCCS ID and date when put into SCCS.
Then the constant could be used to display these values in the
application's "About" window.
Any thoughts/opinions welcome.
Cheers,
Duncan Kinnear,
McCarthy and Associates, Email: [email protected]
PO Box 764, McLean Towers, Phone: +64 6 834 3360
Shakespeare Road, Napier, New Zealand. Fax: +64 6 834 3369
Providing Integrated Software to the Meat Processing Industry for over 10 years
To unsubscribe, email '[email protected]' with
'unsubscribe forte-users' as the body of the message.
Searchable thread archive
<URL:http://pinehurst.sageit.com/listarchive/forte>
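Duncan's SCCS_VER idea above maps nicely onto the standard what(1) marker. Here's a sketch of unpacking such a constant for the "About" window, assuming the constant was defined with the keywords %Z%%M% %I% %E%, which SCCS expands to "@(#)module SID yy/mm/dd" (the app name is a made-up example):

```python
# Sketch: unpack an SCCS-expanded "SCCS_VER" constant so an About window
# can show the release ID and date.  Assumes the constant was written as
# "%Z%%M% %I% %E%", which SCCS expands to "@(#)module SID yy/mm/dd".

def parse_sccs_ver(sccs_ver):
    """Return (module, sid, date) from an expanded SCCS what-string."""
    marker = "@(#)"
    if sccs_ver.startswith(marker):
        sccs_ver = sccs_ver[len(marker):]      # strip the what(1) marker
    module, sid, date = sccs_ver.split()[:3]
    return module, sid, date

def about_text(sccs_ver, app_name="MeatWorks"):
    """Format the line an About window would display."""
    module, sid, date = parse_sccs_ver(sccs_ver)
    return f"{app_name} - {module} version {sid} ({date})"

if __name__ == "__main__":
    # Example of the constant after SCCS has expanded the keywords:
    print(about_text("@(#)orderproj.pex 1.5 99/07/21"))
```

Keeping the @(#) prefix in the constant also means the stock what(1) command can report the version straight from a deployed binary.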

Under Forte 2 we used to export all projects, then import them into a make
repository, then produce an app. Due to the time involved in importing and
exporting, and the errors resulting from importing projects out of order, we
switched to making copies of the dev repository and producing apps from
that. For us, it used to take 2+ hours to export/import, as opposed to 5
minutes for copying repository files. Unfortunately, we have found that
unless we force-compile prior to deploying, we will get errors when
attempting to compile partitions.
Our current procedure:
Shut down dev repos
Copy to make env
make a workspace that includes all projects
force-compile
update
integrate
make a new workspace for each app to make
make each app
export code
remove unused workspaces
clean repos
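Steps like those are easy to drive from a small script so none get skipped or run out of order. A sketch; every command string below is a placeholder for whatever your shop actually runs, not a real utility:

```python
# Sketch: run the copy-repos build procedure as an ordered checklist.
# All command strings are placeholders; with dry_run=True the script only
# prints what it would do.

import subprocess

STEPS = [
    ("shut down dev repos",         "repos_shutdown central_dev"),
    ("copy to make env",            "cp -r /forte/dev_repos /forte/make_repos"),
    ("workspace with all projects", "make_workspace all_projects"),
    ("force-compile",               "forcecompile all_projects"),
    ("update",                      "update all_projects"),
    ("integrate",                   "integrate all_projects"),
    ("workspace per app",           "make_workspace app_workspaces"),
    ("make each app",               "make_apps"),
    ("export code",                 "export_code"),
    ("remove unused workspaces",    "remove_workspaces"),
    ("clean repos",                 "rpclean /forte/make_repos"),
]

def run_procedure(steps, dry_run=True):
    """Execute each step in order, stopping at the first failure."""
    completed = []
    for name, cmd in steps:
        if dry_run:
            print(f"[dry-run] {name}: {cmd}")
        else:
            subprocess.run(cmd, shell=True, check=True)
        completed.append(name)
    return completed

if __name__ == "__main__":
    run_procedure(STEPS)
```

Running with dry_run=True first gives a reviewable checklist; flipping it to False executes the steps and stops at the first failure.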
-----Original Message-----
From: Peter Sham (HTHK - Assistant Manager - Software Development, IITB)
[mailto:[email protected]]
Sent: Wednesday, July 21, 1999 3:31 AM
To: John Pianezze; Duncan Kinnear
Cc: [email protected]
Subject: RE: List Archive and Release Management
Hi,
Like you've explained, we keep different releases of our application in
different workspaces. However, as the release gets larger, this
"check-out-every-component-then-overwrite-by-import" procedure gets tougher.
Sometimes it even creates garbage in the repository, which corrupts my
workspace.
So I have an alternative proposal (though I haven't really tested the
idea): create a new repository for each new release.
What do you think?
Regards,
Peter Sham.
-----Original Message-----
From: John Pianezze [SMTP:[email protected]]
Sent: Wednesday, July 21, 1999 3:44 PM
To: Duncan Kinnear
Cc: [email protected]
Subject: RE: List Archive and Release Management

Similar Messages

  • List Archive and Release Management

    Hi folks!
    First of all, does anyone know what's going on with the list archive on
    SageIT? There doesn't seem to be a search facility anymore. Seems a
    bit weird when each message is appended with a little signature
    'advertising' a searchable archive.
    Second (and this is the biggy!), what procedures do people use to
    manage the development, testing and release of software? I'm talking
    about how you keep them separate, how you do hot-fixes, how you
    identify installed versions at customer sites, etc.
    I've thought that we could do it with separate repositories for "current
    development", "testing_verXX", "release_verXY", etc., where we export
    from "development" and then import into "testing", and similarily for
    "release".
    Also, we are looking at using the Fort&eacute; SCM hooks and SCCS on our
    Unix host to store historical versions of the projects/classes. But we
    somehow need to identify which version of a particular component is
    installed at the customer site. I had thought of defining a constant in
    each project/class called "SCCS_VER" which contains the SCCS
    keywords that get mapped to SCCS ID and date when put into SCCS.
    Then the constant could be used to display these values in a Window's
    "About" window.
    Any thoughts/opinions welcome.
    Cheers,
    Duncan Kinnear,
    McCarthy and Associates, Email: [email protected]
    PO Box 764, McLean Towers, Phone: +64 6 834 3360
    Shakespeare Road, Napier, New Zealand. Fax: +64 6 834 3369
    Providing Integrated Software to the Meat Processing Industry for over 10 years
    To unsubscribe, email '[email protected]' with
    'unsubscribe forte-users' as the body of the message.
    Searchable thread archive <URL:http://pinehurst.sageit.com/listarchive/forte>

    Hi folks!
    First of all, does anyone know what's going on with the list archive on
    SageIT? There doesn't seem to be a search facility anymore. Seems a
    bit weird when each message is appended with a little signature
    'advertising' a searchable archive.
    Second (and this is the biggy!), what procedures do people use to
    manage the development, testing and release of software? I'm talking
    about how you keep them separate, how you do hot-fixes, how you
    identify installed versions at customer sites, etc.
    I've thought that we could do it with separate repositories for "current
    development", "testing_verXX", "release_verXY", etc., where we export
    from "development" and then import into "testing", and similarily for
    "release".
    Also, we are looking at using the Fort&eacute; SCM hooks and SCCS on our
    Unix host to store historical versions of the projects/classes. But we
    somehow need to identify which version of a particular component is
    installed at the customer site. I had thought of defining a constant in
    each project/class called "SCCS_VER" which contains the SCCS
    keywords that get mapped to SCCS ID and date when put into SCCS.
    Then the constant could be used to display these values in a Window's
    "About" window.
    Any thoughts/opinions welcome.
    Cheers,
    Duncan Kinnear,
    McCarthy and Associates, Email: [email protected]
    PO Box 764, McLean Towers, Phone: +64 6 834 3360
    Shakespeare Road, Napier, New Zealand. Fax: +64 6 834 3369
    Providing Integrated Software to the Meat Processing Industry for over 10 years
    To unsubscribe, email '[email protected]' with
    'unsubscribe forte-users' as the body of the message.
    Searchable thread archive <URL:http://pinehurst.sageit.com/listarchive/forte>

  • Using Sharepoint as an archival and document management

    Hello, 
    I was asked by my manager to look for possible solutions to convert our archive paper file cabinets that are full of student records into digital format. We do not utilize sharepoint yet at the school but I was researching and found that sharepoint does a good
    job in archiving and content management. The ultimate goal for us is:
    1-  to be able to scan the documents in the file cabinets and save them into folders on sharepoint. We want to be able to create a folder under a specific students' name and then scan all documents related to that student into that specific folder. 
    2- Give permission to specific staff members to access and search for docs by student name. 
    I wanted to ask if  and how this is possible to achieve by Sharepoint knowing that we have an estimate of 120,0000 paper docs to scan. 
    I would much appreciate your help. 
    Thanks! 
    Nancy

    Hi
    There are many options to scan your documents and directly save and place in Sharepoint with very effective manner, you can Save directly from Scanner Deive, versioning, searching,
    ordering, automatic workflow and you can set the RCM policy in that if the documents is saved since very long time you can automatic arrive them from one location to another or can remove them apart from this there is many nice feature for content management
    you will get out of the box using SharePoint content management.
    You can use following application along with SharePoint to scan the contents.
    here are some of them.
    KnowledgeLake provides an affordable, platform based solution for managing
    your company’s unstructured content right
    within SharePoint. KnowledgeLake is an enterprise scalable and comprehensive SharePoint solution for searching, viewing,
    securing, routing and annotating your company’s mission critical electronic content. The KnowledgeLake solution is fully
    integrated with SharePoint, so it is easy to implement, easy to administer, presents low risk, and is cost effective.
    2. ScanSnap and Microsoft SharePoint enable you
    to expand the benefits of your Enterprise Content Management (ECM) system, right from your desk.  Fujitsu bundles the KnowledgeLake ‘Scan to Microsoft SharePoint’ application with the scanning software, making it easy to onramp scanned documents directly
    to SharePoint.
    Scan to Microsoft SharePoint is a desktop application enabling users to easily index, store and retrieve any document to SharePoint 2003/2007/2010 or Office Live. It eliminates
    the conventional multi-step, manual process that standard SharePoint constructs require, replacing it with a fully integrated and automated solution.
    Here is nice article you can find some good details about content management automate.
    http://www.technologyfirst.org/magazine-articles/120-march-2013/823-bizdocshow-to-make-your-small-business-paperless-using-sharepoint-with-integrated-imaging-and-scanning.html
    Krishana Kumar http://www.mosstechnet-kk.com

  • LCM Change and Release Management - How to maintain job integrity

    Hi
    We are using BO in a three scale landscape. DEV, QA and PRD are all are in a same network. I installed Business Objects Life Cycle Manager 3.1 SP3 on my dev system.
    I created a job for DEV -> QA. (Multiple iterations - for changes in DEV while testing)
    Once the final version is tested in QA, those objects suppose to be moved from QA -> PRD not directly from DEV -> PRD. Because when i make a copy of the existing job, that's not allow me to change the source system.
    Now, my question is, I want to maintain integrity of job promotion, means I want to move the same objects which tested in QA to PRD,
    1. DEV -> QA -> PRD. Is it possible in LCM or any other tool?
    2. Is it possible to change the source system of LCM jobs?
    3. To maintain the interity, is it required to create two jobs
    DEV -> QA - Job1
    QA -> PRD - Job2 (With same objects as in Job1)
    4. What is the right approch for BO Change and Release Management?
    Thanks
    Deep

    Same query as in How to maintain LCM job integrity while promoting?
    Please do not cross post. See the [Rules of Engagement|http://www.sdn.sap.com/irj/sdn/wiki?path=/display/home/rulesofEngagement].
    Closing and locking this thread as the one mentioned above is in the correct forum.
    - Ludek

  • TFS and Release Managment Setup in Azure VM- Release managment Extension not avaliable

    Hi
    We are planning to setup TFS and Release Management on a VM in Azure. In the portal there is an option to create a VM with TFS which is great but I do not see any option to install Release Managment as an extension.
    based on this link
    Install Release Management as Extension (at the end of the article). I do not see the this option under extension when i am setting up a VM.
    If there is another way of installing this from the portal please let me know
    I know I can install all this as a stand alone installer but since azure provides these options thought i would use them.
    Thanks

    Hi kumarforum,
    Since this thread is more related to Azure management portal, I will move it to the right forum for a better response. Thanks for your understanding.
    Best regards,

  • File naming, archiving and time management

    I've posted on this subject before, but I have a new twist that I'd like to get some feedback on.
    I usually import my photos, keeping the master (now called original) file name until the end of the calendar year.  At the end of the year, I like to change the original name for classification and archiving purposes.  By then, I've usually made all of the deletions for the year, so I feel comfortable renaming the photos with some sort of counter or index.  My preferred classification system is: "Custom Name"/"Image Date_"/"Counter" (0000).
    The problem that I'm experiencing is that it is impossible to rename my originals using this format without some inaccuracies if I try to name them all at once without readjusting the computer's internal time zone settings.  I live on the east coast, so if I have a photo shot at 10:30 pm PDT on 2011-03-14, it gets named with a date of 2011-03-15, which obviously isn't accurate for when that photo was shot.  Well, it is accurate based on East Coast Time, but I want the file to be renamed with the date that it was shot, where it was shot, not where my computer currently resides.  Of course, I could rename the batch of 2011 photos in segments, but that would mean multiple quits/reopens from Aperture in order to change the time zone appropriately.
    It seems that my only choices are to either rename my photos at the time of import using the correct time zone settings on my computer, or to not use this renaming format.  Neither of these options are very appealing, since this renaming format is my preferred method.
    I guess my question is: does anyone have any insights or advice on either how to better work around this problem, or if not, other renaming methods that they like to use for archival and organizational purposes?  I know there are many to choose from, but I'm looking for something simple, which also provides direct information about the image, should I want to reference my Originals (which I do outside of Aperture from time to time).
    Thanks for adding to this discussion...
    mac

    Allen,
    SierraDragon wrote:
    mac-
    Personally I create a folder for each Project and copy pix from CF card into those folders. Then I import from the backup hard drive into Aperture using the folder name as the Project name.
    Usually each Project includes only one day or less, and I may have YYMMDD_JonesWed_A, YYMMDD_JonesWed_B, etc. for a large or multiday shoot. I do not let any Project contain more than ~400 Nikon D2x RAW+JPEG files.
    Projects are just that and never put into folders other than by month and/or year, just a forever chronological list. All organizing is done via Albums and Keywords. JonesWed_2011 is a keyword that can be an Album instantly when needed; bride is a keyword; wed is a keyword; flower is a keyword; etc.
    I use wedding just as an example. The process applies to all kinds of shoots.
    I use the 1-9999 Nikon auto-numbering of image files, and never rename image files except  sometimes during export. That way original image names can always be found across mass storage devices in the future independent of any application.
    -Allen
    SierraDragon wrote:
    Usually each Project includes only one day or less, and I may have YYMMDD_JonesWed_A, YYMMDD_JonesWed_B, etc. for a large or multiday shoot. I do not let any Project contain more than ~400 Nikon D2x RAW+JPEG files.
    Why do you keep the photo count in a project to around 400 files or so?  Is it detrimental to speed, or are there other considerations that have led you to work this way?
    SierraDragon wrote:
    Projects are just that and never put into folders other than by month and/or year, just a forever chronological list. All organizing is done via Albums and Keywords. JonesWed_2011 is a keyword that can be an Album instantly when needed; bride is a keyword; wed is a keyword; flower is a keyword; etc.
    So, you are saying that you sometimes put projects into folders by month and/or year?  Or, do you just keep all projects at the top level of the hierarchy?  The only folders I use are at the top of my hierarchy, and they are by year, 2002, 2003, 2004...2012.  I then keep all of my projects in the appropriate year.  I used to keep folders that were named things like, "Travel", "Occasions"..., but this became problematic when I had overlap, and images could fit in more than one designated folder.
    SierraDragon wrote:
    I use the 1-9999 Nikon auto-numbering of image files, and never rename image files except  sometimes during export. That way original image names can always be found across mass storage devices in the future independent of any application.
    It sounds as though you don't actually rename your images at all, but rather just keep the original names.  I don't like to do this because after deletions, it creates gaps in my sequence, and I also end up with multiple images with the same name.  I like for each image to have its own unique identifier by name.
    I'm considering importing the images using a version name, where the version is named by the image date.  I'll keep the original file name intact until the end of the year, and then, should I decide to rename my files, I could base my renaming system off of the version name.  This will automatically capture the date of the image without being reliant on my computer's time zone settings.

  • Version control, build, deploy and release management of eBS applications?

    Hi All
    I am setting up a standard configuration management, build & deployment automation and change & release management process for all types of applications including Oracle eBS, Informatica, Siebel, Cognos, Java etc. As you know, SCM for Java and .Net types of applications are very matured, but not for other types of applications such as Oracle eBS.
    Can you give me some hint or point me to some documents on how to manage Oracle eBS applications from SCM's perspective? For example,
    1. How to version control eBS artifacts? e.g. what kind of eBS artifacts need to be version controlled?
    2. How to build eBS artifacts and then deploy to a new environment? How to move an eBS application from one environment (DEV) to test environment?
    3. How to manage changes?
    We have a standard SCM tool called RTC from IBM, which has version control, build and change control functions.
    Thanks
    Jirong

    hujirong wrote:
    Hi All
    I am setting up a standard configuration management, build & deployment automation and change & release management process for all types of applications including Oracle eBS, Informatica, Siebel, Cognos, Java etc. As you know, SCM for Java and .Net types of applications are very matured, but not for other types of applications such as Oracle eBS.
    Can you give me some hint or point me to some documents on how to manage Oracle eBS applications from SCM's perspective? For example,
    1. How to version control eBS artifacts? e.g. what kind of eBS artifacts need to be version controlled?
    2. How to build eBS artifacts and then deploy to a new environment? How to move an eBS application from one environment (DEV) to test environment?
    3. How to manage changes?
    We have a standard SCM tool called RTC from IBM, which has version control, build and change control functions.
    Thanks
    JirongPlease do not create duplicate threads -- Software Configuration Management of Oracle eBS

  • Need to shift to HANA , from Change and release management

    Hi All,
    Please advice me, I am working as Change and release coordinator , which is technology independent,  i have hands on SQL and Mainframe,  now i would like to shift my career to SAP HANA, please advice if this is a suitable option

    Hi Chandru,
    You have a very good job as a change and release coordinator.  Why don't you try SAP Release Management job in project because it is a niche and good.  If you learn SAP Solution Manager ChARM then it will take you to SAP Release Management job.
    If you still looking for SAP HANA, then it can also suits but you can check SAP HANA Administrator.  However, check out the option as SAP Release Management job.
    Best of luck.
    Regards
    GGOPII

  • Archiving and releasing CVI projects

    Hi
    At my company, we have to release the software code into a version control system.
    This is so anybody else can retrieve, modify and rebuild the project, especially when or if I, the creator, is gone.
    My question is about NI software versions, both the CVI version and all of the required NI driver versions used to create the project.
    The project references many include *.h files in their installed location, usually ..\Program Files\National Instruments\.. etc.
    Updating a project relies on the fact that the computer which initially released the build stays in-place, or no NI driver updates occurred.
    Building the retrieved project on another computer, therefore, does not work.
    So, should my archived/released project copy all driver files [like Nidaqex.h, etc] into a "local" \Include folder, which locks-in that drive version,
    or continue to rely on the NI installed folder location?
    Accompanying my software release is a companion document called a VD [version document] which fully describes the created environment
    required for the initial release. This greatly helps anyone in the future to re-build the project with step-by-step instructions.
    If there is any other "white papers" within the NI web-site that would enhance this subject, I would be appreciated to be directed to it.
    Scott Youngren

    >> "Building the retrieved project on another
    computer, therefore, does not work."
    Are you sure of your assertion? I continuously switch projects from one computer to another without compiling problems; I also switched during time between PCs with different localization (english vs. italian OS release: it impacts among other things on "program files" directory name) and different OS versions (win2k vs. WinXP) without facing any problem.
    IMHO the main hint is to maintain standard SW places when installing NI software on different machines, a fact that permits to switch between them without problems.
    A different matter can be the use of non-NI software, for which you will need to investigate on file location for DLLs, include and so on on different machines.
    Proud to use LW/CVI from 3.1 on.
    My contributions to the Developer Zone Community
    If I have helped you, why not giving me a kudos?

  • Release Management and Dsc - How to queue a build when a release is in progress?

    I have VSO and Release Management 2013.4 working and deploying to Azure vm's via DSC
    The build profile is set to trigger on commit to the Git repo and the release template is set to be triggered on successful build from TFS
    However if a developer commits in quick succession the resultant builds cause releases to overlap in RM? - this causes some of the releases to fail with a DSC error ("the consistency check or pull cmdlet is in progress....")
    Is there a way to force RM to prevent concurrent releases?  (based on the same release template and build profile)

    Thats correct - in this instance its VSO as source control and using the hosted release management to attempt to deploy to azure vms
    And yes build takes about 5 minutes but the release template can take up to 15 minutes to run - this means that a second build can cause the same release template to run again (its set to trigger on build)
    Error log from RM
    xception Message: New deployment is not allowed as an another deployment is in progress. Retry the deployment after sometime. (type OperationFailedException)
    Exception Stack Trace: at Microsoft.TeamFoundation.Release.EnvironmentProvider.Azure.Implementation.AzureDeploymentProvider.ReadDeploymentResponse(DeploymentResponse response)
    at Microsoft.TeamFoundation.Release.EnvironmentProvider.Azure.Implementation.AzureDeploymentProvider.DownloadBuilds(DeploymentMachineSpecification deploymentMachineSpecification, AzureStorageSpecification azureStorageSpecification)
    at Microsoft.TeamFoundation.Release.EnvironmentProvider.Azure.Implementation.AzureDeploymentProvider.RunScript(String scriptPath, String configurationPath, MachineSpecification machine, StorageSpecification storage, Dictionary`2 configurationVariables)
    at Microsoft.TeamFoundation.Release.Tasks.DeployDsc.Execute(DscComponentParametersV2 dscComponentParameters, AzureStorage azureStorage, String[] dnsNameAndUpdatedWinRmPort, String userName, String password, String dscScriptPath, String dscConfigurationPath, Boolean skipCACheck)
    at Microsoft.TeamFoundation.Release.Automation.Tasks.DeployDscTask.DscExecute(DscComponentParametersV2 dscComponentParameters, AzureStorage azureStorage, String[] dnsNameAndUpdatedWinRmPort, String userName, String password, String dscScriptPath, String dscConfigurationPath, Boolean skipCACheck)
    at Microsoft.TeamFoundation.Release.Automation.Tasks.DeployDscTask.DscDeploy(AzureStorage azureStorage, DscComponentParametersV2 dscComponentParameters, String userName, String password, String dscScriptPath, String dscConfigurationPath, String skipCACheck)
    at Microsoft.TeamFoundation.Release.Automation.Tasks.DeployDscTask.Execute(IAutomationContext context)
    at Microsoft.TeamFoundation.Release.DistributedTask.TaskProcessor.TaskExecutor.Execute(TaskExecutionContext context)
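
    RM 2013 has no built-in queueing for releases that hit the same DSC target, so one workaround is to gate the deployment on the target's LCM state before pushing a new configuration. The sketch below assumes WMF 5 on the target (whose LCM exposes an LCMState property), PowerShell remoting, and placeholder names (BDWEB10, the timeout, the .mof path) - it is an illustration, not a supported RM feature:

    ```powershell
    # Wait until the target's Local Configuration Manager is idle before
    # applying a new configuration; give up after a timeout.
    # 'BDWEB10', the 60-minute limit, and .\MofOutput are placeholders.
    $target   = 'BDWEB10'
    $deadline = (Get-Date).AddMinutes(60)
    $lcmState = ''

    while ((Get-Date) -lt $deadline) {
        $lcmState = Invoke-Command -ComputerName $target -ScriptBlock {
            (Get-DscLocalConfigurationManager).LCMState
        }
        if ($lcmState -eq 'Idle') { break }   # safe to deploy now
        Write-Host "LCM is '$lcmState' on $target; waiting..."
        Start-Sleep -Seconds 30
    }

    if ($lcmState -ne 'Idle') {
        throw "Timed out waiting for the LCM on $target to become idle."
    }

    # Now apply the new configuration.
    Start-DscConfiguration -Path .\MofOutput -ComputerName $target -Wait -Verbose
    ```

    A loop like this could be wrapped into the release template's deployment script so a second release blocks instead of colliding with the in-progress consistency check.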

  • Is it possible to do continuous deployment for Azure cloud services (websites, web and worker roles) using VSO Release Management?

    Hi,
    I am trying to do continuous deployment for Azure cloud services using Visual Studio Online and Release Management, but I am not able to find a way to do it (the various blogs I've found only describe deploying to VMs).
    I also tried the Release Management Visual Studio extension, but no luck.
    Please help me if it is possible to do continuous deployment using Release Management.
    Thanks,
    Phani

    Hi,
     Please refer to the following forum thread, where a similar question has been answered. Let us know if this helps.
     https://social.msdn.microsoft.com/Forums/en-US/9d8322f6-36e5-4cca-a982-d420d34d2072/realease-management-deployment-to-azure-websites-webworker-roles?forum=tfsbuild
    Regards,
    Nithin Rathnakar

  • Problems setting up release management

    Hi, I am running into problems setting up Microsoft Release Management. I have two domains. My TFS (a separate server), Release Management server and Release Management client (both on a single physical server) are in domain1, and the deployer is in domain2 on an Azure
    VM. I am using a VPN tunnel to connect the Release Management server, Release Management client and deployer, and they connect to each other nicely. The problem comes when I add a reference to TFS in the Release Management client: TFS fails to verify.
    The user I am using to connect has 'Make requests on behalf of
    others' permission, so that is not the issue. I get a TF400324 error. I even used Wireshark to troubleshoot, but it looks as if the Release Management
    client is not even trying to connect. I can access the TFS URL via web browser, with the VPN connected, from the machine the Release Management client is running on. Now, if I disconnect the VPN and try to verify TFS from the Release
    Management client, it works. Does the Release Management client support connecting via an IPv6 tunnel?

    Hi abhijitdamle,
    I'd like to know how you are connecting via the IPv6 tunnel, and when you get the error (without the VPN connection?). If your machine can be accessed via HTTP/HTTPS, then the RM client should also be able to connect. For your situation, it seems you got the
    issue resolved after using the VPN.
    If you have other concerns about the error, you can also check the methods below to see if any of them works for you:
    1. Check the permissions of the user account you use; make sure it has the "make requests on behalf of others" permission.
    2. Clear the Team Foundation cache on your RM client machine.
    3. Check the team project collection URL to ensure it is entered correctly, or use the solution on this
    page.
    Best regards,

  • Release Management DACPAC deploy stored procedures

    Hi,
    I have TFS Build and Release Management, and I have the DACPAC Database Deployer component, but it does not deploy stored procedures or views - the stored procedure changes do not show up in the database. Can I deploy stored procedures and views?
    Thanks for your help,
    Steve

    Hi Steve,  
    Thanks for your post.
    Please try publishing your DACPAC file manually: are the stored procedures and views published to your SQL Server database successfully? First make sure the manual publish works fine, then run it in your release template and check the result
    again.
    For more information about deploying a database using the DACPAC Database Deployer tool in a release template, please refer to this article:
    http://www.incyclesoftware.com/2014/03/deploying-databases-release-management/.
    Or you can use the Database Deployer - Execute Script tool to run your stored procedure file; please refer to the information in this post:
    https://social.msdn.microsoft.com/Forums/en-US/24119b72-0b05-48c5-bdea-188f95067aeb/pass-parameters-to-a-sql-script-using-database-deployer-execute-script-from-release-management?forum=tfsbuild.
    We are trying to better understand customer views on the social support experience, so your participation in this interview project would be greatly appreciated if you have time. Thanks for helping make community forums a great place.
    Click HERE to participate in the survey.
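
    As a concrete way to run the manual publish suggested above, the DACPAC can be pushed with SqlPackage.exe; stored procedures and views defined in the database project are included in a publish. The path, server and database names below are placeholders for your environment (the SqlPackage path shown is the typical SQL Server 2014 install location):

    ```powershell
    # Publish a DACPAC directly with SqlPackage.exe to confirm that
    # stored procedures and views deploy correctly outside of
    # Release Management. All paths and names are placeholders.
    & "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe" `
        /Action:Publish `
        /SourceFile:"C:\Drops\MyDatabase.dacpac" `
        /TargetServerName:"MYSQLSERVER" `
        /TargetDatabaseName:"MyDatabase"
    ```

    If the manual publish succeeds but the release still misses the procedures, the DACPAC in the build drop is likely stale and the build output is the place to look.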

  • Release Management vNext deployment succeeds without deploying the DSC to the target.

    All of a sudden, a couple of weeks ago, our vNext releases stopped working.
    The log in the Release Management client says that the release has been deployed successfully, but nothing has actually happened on the target server. The log looks like this (which is bad)...
    Copying recursively from \\xxxxx.co.nz\Public\Share\Development\Builds\DispatchService_MAIN\DispatchService_MAIN_1.1.7.0\DispatchService\_PublishedWebsites\DispatchService to C:\Windows\DtlDownloads\DispatchService Component succeeded.
    That's all it says. No error, no nothing.
    It used to look like this (which was good)...
    Copying recursively from \\xxx.co.nz\Public\Share\Development\Builds\SosService_MAIN\SosService_MAIN_1.1.32.0\SOSWindowsService to C:\Windows\DtlDownloads\SosServices succeeded.
    Perform operation 'Invoke CimMethod' with following parameters, ''methodName' = SendConfigurationApply,'className' = MSFT_DSCLocalConfigurationManager,'namespaceName' = root/Microsoft/Windows/DesiredStateConfiguration'.
    An LCM method call arrived from computer BDWEB10 with user sid S-1-5-21-3667390900-813461161-666584507-16981.
    [BDWEB10]: LCM: [ Start Set ]
    [BDWEB10]: LCM: [ Start Resource ] [[bStopService]StopTheService]
    [BDWEB10]: LCM: [ Start Test ] [[bStopService]StopTheService]
    [BDWEB10]: LCM: [ End Test ] [[bStopService]StopTheService] in 0.0000 seconds.
    [BDWEB10]: LCM: [ Start Set ] [[bStopService]StopTheService]
    [BDWEB10]: [[bStopService]StopTheService] Stopping service SosServiceDevReviewPci
    [BDWEB10]: LCM: [ End Set ] [[bStopService]StopTheService] in 0.1400 seconds.
    [BDWEB10]: LCM: [ End Resource ] [[bStopService]StopTheService]
    [BDWEB10]: LCM: [ Start Resource ] [[File]CopyDeploymentBits]
    [BDWEB10]: LCM: [ Start Test ] [[File]CopyDeploymentBits]
    etc etc
    I have tried many things, and have discovered some clues, but have failed to find a solution.
    It seems that it successfully creates the .mof file on the "Deploy using PS/DSC" server, but then it deletes the .mof file and never attempts a Start-DscConfiguration with it.
    This is very frustrating, as we have sunk an awful lot of time and money into Microsoft Release Management.
    Does anyone have any ideas?
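
    For reference, the "good" log above corresponds roughly to a DSC configuration like the sketch below. All names and paths are guesses inferred from the log, and the original release evidently used a custom bStopService resource, where this sketch substitutes the built-in Service resource:

    ```powershell
    # A minimal sketch of the kind of configuration the log implies:
    # stop a Windows service, then copy the deployment bits into place.
    # Node name, service name and paths are placeholders from the log;
    # the DestinationPath is entirely hypothetical.
    Configuration DeploySosService
    {
        Node 'BDWEB10'
        {
            Service StopTheService
            {
                Name  = 'SosServiceDevReviewPci'
                State = 'Stopped'
            }

            File CopyDeploymentBits
            {
                Ensure          = 'Present'
                Type            = 'Directory'
                Recurse         = $true
                SourcePath      = 'C:\Windows\DtlDownloads\SosServices'
                DestinationPath = 'C:\Services\SosService'
                DependsOn       = '[Service]StopTheService'
            }
        }
    }
    ```

    When a configuration like this runs, the LCM "Start Set / Start Resource" lines in the good log are exactly what you should see on the target; their absence matches the symptom that the .mof is never applied.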

    Hi Leighton,  
    Thanks for your post.
    Are you using TFS 2013 Update 4 and Release Management 2013 Update 4?
    Do you mean that the files are copied to the C:\Windows\DtlDownloads\DispatchServices folder on the target machine as expected, but nothing is executed to perform the deploy?
    You said the release template worked fine until a couple of weeks ago; do you know what changes were made to the release template since then?
    Does only this one release template fail? Or if you create a new release template to run the DSC deploy, does it give the same result?

  • To Do List Archiving with Reminders and more Features.

    I wish To Do List were a separate program with archiving, categories, auto-replies and follow-up reminders. It needs more features.
    It would also be good if it worked as a project management system - great for work or around-the-house duties.
    Does anyone know of any good software like this for mac?

    This is probably a better place to search:
    http://www.macupdate.com/search.php?os=macosx&keywords=todo
    John M
