Best practice for cleaning up conversations after a receive timeout

We have a pattern where SQL Server sends a message through Service Broker and waits for a response (say, 10 seconds); if it doesn't get a response, it assumes a default response and carries on. The response may still come later, and the conversation is also ended on the target side. However, there is no one left to receive the response (which is no longer needed). We have a cleanup script on a SQL Agent job, but we were wondering if there is a cleaner way to handle these situations.
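For reference, here is a rough sketch of the kind of cleanup our agent job runs. The names are illustrative, and it assumes the abandoned initiator endpoints show up in sys.conversation_endpoints as DISCONNECTED_INBOUND once the target has ended its side:

    -- Rough sketch of an agent-job cleanup for orphaned initiator dialogs
    -- (assumes they sit in state 'DI' = DISCONNECTED_INBOUND).
    DECLARE @handle uniqueidentifier;

    DECLARE orphans CURSOR LOCAL STATIC FORWARD_ONLY FOR
        SELECT conversation_handle
        FROM sys.conversation_endpoints
        WHERE is_initiator = 1
          AND state = 'DI';   -- target already ended; nobody will RECEIVE the response

    OPEN orphans;
    FETCH NEXT FROM orphans INTO @handle;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Ending the dialog also removes its remaining messages from the queue.
        END CONVERSATION @handle;
        FETCH NEXT FROM orphans INTO @handle;
    END;
    CLOSE orphans;
    DEALLOCATE orphans;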

Can't you set up an activation procedure that receives the response message and ends the conversation? I would not expect the activation procedure to be fired if there already is someone waiting on the queue.
But it is certainly nothing I have tested.
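Something along these lines is what I have in mind, as a sketch (the queue name dbo.InitiatorQueue is a placeholder for your own):

    CREATE PROCEDURE dbo.InitiatorQueue_Activation
    AS
    BEGIN
        SET NOCOUNT ON;
        DECLARE @handle  uniqueidentifier,
                @msgtype sysname;

        WHILE 1 = 1
        BEGIN
            WAITFOR (
                RECEIVE TOP (1)
                    @handle  = conversation_handle,
                    @msgtype = message_type_name
                FROM dbo.InitiatorQueue
            ), TIMEOUT 1000;

            IF @@ROWCOUNT = 0
                BREAK;   -- queue drained; let the activated procedure exit

            -- Whether this is the late response or the EndDialog/Error message,
            -- the initiator's only remaining job is to end its side of the dialog.
            END CONVERSATION @handle;
        END;
    END;
    GO

    ALTER QUEUE dbo.InitiatorQueue
        WITH ACTIVATION (
            STATUS = ON,
            PROCEDURE_NAME = dbo.InitiatorQueue_Activation,
            MAX_QUEUE_READERS = 1,
            EXECUTE AS OWNER
        );

Since activation only kicks in when nothing is already sitting in a RECEIVE on the queue, the normal ten-second wait in your request code should be undisturbed.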
Erland Sommarskog, SQL Server MVP, [email protected]

Similar Messages

  • Best practice for customizing EJB property after deployment

    Hi Gurus,
    What is the best practice for customizing an EJB property after deployment in NW 7.1? I have a stateless session bean that needs to get some environment information before acting, but that information is only known at runtime. What should I do to achieve this? I thought I could bind the property to a JNDI context, but I did not find out where to declare and change the context value. Please advise. Thanks.
    B.R.

    Hi.
    I have a similar problem, but I still cannot edit the properties in ejb-jar.xml.
    I tried stopping the web service, but the properties still remain unmodifiable.
    Could you advise me how to change them?
    We have installed SAP Server 7.0.2

  • Best practices for cleaning up an old CSS?

    We have a legacy help system that we've put into RH 9.  Our CSS has a lot of old styles in it.  Is there a best practices document somewhere that we can use to learn how to clean up the CSS and remove old styles but still preserve the ones we want?
    thanks...

    I don't know of any such document but perhaps this will help.
    First archive a copy of the CSS as it is now so that you can go back and retrieve anything that you later find you shouldn't have deleted.
    Next, you should be safe deleting any styles with kadov in them. You will see they are duplicates of other styles; they were used for the way RoboHelp used to work.
    The rest are a bit more difficult. You need to use the multifile find and replace tool to see if they are used in any topic. Then either change them to what you want now or leave them in the CSS.
    See www.grainge.org for RoboHelp and Authoring tips
    @petergrainge

  • Best practices for cleaning up after a Bulk REST API v2 export

    I want to make sure that I am cleaning up after my exports (not leaving anything in staging, etc.). So far I am:
    - DELETEing $ENTITY/exports/$ID and $ENTITY/exports/$ID/data, as described in the Bulk REST API documentation
    - Using a dataRetentionDuration when I create an export (as a safety net in case my code crashes before deleting)
    Is there anything else I should do? Should I/can I DELETE the syncs I create (syncs are not listed in the "Delete an Entity" section of the documentation)? Or are those automatically deleted when I DELETE an export?
    Thanks!
    1086203

    Hi Chris,
    I met the same problem as pod did.
    It happened when I tried to load all historical activities; in one sample, the same activityId was given to two different activity types (one EmailOpen, the other FormSubmit), both generated in 2013.
    Before the full load, I tested my job by extracting the activity records from Nov 2014, and there was no unique-ID issue.
    It seems Eloqua fixed this problem before Nov 2014, right?
    So if I only load activities generated since 2015, there will not be a PK problem; otherwise, I will have to use ActivityId + ActivityType as a compound PK for the historical data.
    Please confirm and advise
    Waiting for your feedback
    Thanks~

  • Best practices for cleaning up and upgrading

    Apologies in advance, as I know there is a wealth of info and ideas on this topic, but I'm looking for the best possible combination of simplicity and effectiveness and the community has done well by me in the past.
    Situation:  I have this old MacBook
    Model Name: MacBook
      Model Identifier: MacBook6,1
      Processor Name: Intel Core 2 Duo
      Processor Speed: 2.26 GHz
      Number of Processors: 1
      Total Number of Cores: 2
      L2 Cache: 3 MB
      Memory: 4 GB
      Bus Speed: 1.07 GHz
      Boot ROM Version: MB61.00C8.B00
      System Version: OS X 10.9.5 (13F34)
      Kernel Version: Darwin 13.4.0
    HD- Available: 6.22 GB
      Capacity: 249.2 GB
    Literally the only paid software that I have on here and need to keep is Microsoft Office.  I do not have the original install disk for this software.
    This machine is running very, very slowly, and there is not much hard disk space left. I'm not doing very well at cleaning it up.
    I have a brand new Time Machine backup and some pretty old Time Machine backups from before Mavericks.
    I have a new 2TB external drive.
    I'm looking for the most effective way to speed this one up and make it as lean as possible and, if advisable, upgrade to Yosemite.
    Any recommendations?
    I'm willing to spend a little money if it can make a substantial difference.
    Thanks and apologies for length of post.
    RW

    Hi,
    First, thanks for this. This application has been really helpful. I've deleted a lot of the easy stuff (music, pics, videos) and am now getting down to some of the stuff I'm less comfortable whacking without some guidance. Here's an example: I don't use the Mail application at all (I use Gmail). Here is what I see in OmniDiskSweeper:
    https://www.dropbox.com/s/6iedkj46u4jaxbf/Screenshot%202014-11-08%2010.26.33.png?dl=0
    I think the bulk of this is just messages from my Gmail account, from a failed attempt to use Mail a bit years ago. So can you tell me at which point I can delete? Hope the question makes sense. You've helped a lot so far, and I've freed up about 45 GB.
    Rob

  • Best practice for reinstalling antivirus when reinstalling Windows

    Best practice for reinstalling antivirus software after formatting the drive and reinstalling Windows. No antivirus disc.
    Hasty

    Hello,
    I'd ask in the Windows forum on Microsoft Community.
    Karl
    When you see answers and helpful posts, please click Vote As Helpful, Propose As Answer, and/or Mark As Answer.
    My Blog: http://unlockpowershell.wordpress.com
    My Book: Windows PowerShell 2.0 Bible
    My E-mail: -join ('6F6C646B61726C40686F746D61696C2E636F6D'-split'(?<=\G.{2})'|%{if($_){[char][int]"0x$_"}})

  • Best Practice for Conversion Workflow

    Hello,
    I'm converting video files from our "home grown" virtual media reserve to iTunes U. Some of the files are in RM format, some are already compressed .mov's (not H.264) and some I have the original DV files for.
    Anyone out there have a best practice for converting these file types for posting to iTunes U? I have Final Cut Studio (Compressor), QT Pro and Squeeze available to me.
    Any experience you have with this would be helpful.
    Thanks,
    Jeana

    For converting old files to a podcast-compatible video, and based on the machine you have, consider the Elgato Turbo.264. It is a fairly priced "co-processor" for video conversion, comprising an application and a small USB device with an encoder chip in it. In my experience, it is the fastest way to create podcast video files. The amount of time that you will save will pay for the device quickly (about $100). Plus, it does batch conversion of any video that your system currently plays through QuickTime. It has all the necessary presets, and you can create your own. It has a few minor limitations, such as not supporting (at this moment) enhanced-podcasting features like chapter markers and closed captions, but since you have old files for conversion, that won't matter.
    For creating new content, the workflow varies a lot. Since you mention MP3s, I guess you are also interested in audio files. I would stick with GarageBand, especially if you are a beginner; plus, it supports enhanced podcasts.
    In any case, the most important goal is to have the simplest and fastest way to go from recording to publication. The less editing, the better. To attain that, the best methods will require the largest investments. For example, for video production the best way is to produce the content live, so that when you finish recording it is only a matter of encoding and publishing. That will require the use of a video switcher that can ingest at least one video camera and a computer output to properly capture presentation material. That's the minimum; there are several devices that can do this for you. Some are disguised PCs and some connect to a PC for tapeless recording. You can check out the TriCaster, which I like but wish was a Mac and not a Windows XP PC. Other routes may include video mixers from manufacturers like Edirol, Panasonic, or Sony connected to a VTR, or directly to a Mac for direct-to-disc capture. If you look at some of the content available in iTunes U, you will see what I describe here. This workflow requires preparation and sufficient live support, but you will have your material ready for delivery almost immediately after the recorded event. No editing required. Finally, the most intensive workflow is to record everything separately and edit it later, which is extremely time consuming.

  • Best practice for importing non-"Premiere-ready" video files

    Hello!
    I work with internal clients that provide me with a variety of different video types (it could be almost ANYTHING: WMV, MP4, FLV). I of course ask for AVIs when possible, but unfortunately, I have no control over the type of file I'm given.
    And, naturally, Premiere (just upgraded to CS5) has a hard time dealing with these files. It's unpredictable, ranging from working fine to not working at all, and everything in between. Naturally, it's become a huge issue for turnaround time.
    Is there a best practice for preparing files for editing in Premiere?
    I've tried almost everything I can think of: converting the file(s) to AVIs using a variety of programs/methods. Most recently, I tried creating a Watch Folder in Adobe Media Encoder and setting it for AVI with the proper aspect ratio. It makes sense to me that that should work: using an Adobe product to render the file into something Premiere can work with.
    However, when I imported the resulting AVI into Premiere, it gave me the Red Line of Un-renderness (that is the technical term, right?), and had the same sync issue I experienced when I brought it in as a WMV.
    Given our environment, I'm completely fine with adding render time to the front-end of projects, but it has to work.  I want files that Premiere likes.
    THANK YOU in advance for any advice you can give!
    -- Dave

    I use an older conversion program (my PrPro has a much older internal AME, unlike yours), DigitalMedia Converter 2.7. It is shareware and has been replaced by newer versions from Deskshare, but my old one works fine. I have not tried the newer versions yet. One thing that I like about this converter is that it ONLY uses System CODECs and does not install its own, like a few others do. This DOES mean that if I get footage with an oddball CODEC, I need to go get it and install it on the System.
    I can batch process AV files of most types/CODECs and convert to DV-AVI Type II w/ 48 kHz 16-bit PCM/WAV audio at 29.97 FPS (I am in NTSC land). So far, 99% of the resultant converted files have been perfect, whether from DivX, WMV, MPEG-2, or almost any other format/CODEC. If there is any OOS, my experience has been that it will be static, so I just have to adjust the sync offset by a few frames, and that takes care of things.
    In a few instances, the PAR flag has been missed (Standard 4:3 vs Widescreen 16:9), but Interpret Footage has solved those few issues.
    The only oddity that I have observed (mostly with DivX or WMVs) is that occasionally PrPro cannot get the file's Duration correct. I found that if I import those problem files into PrElements and then just do an export to the same exact specs, the resulting file (seemingly 100% identical, but something has to be different - maybe in the header info?) imports perfectly into PrPro. This happens rarely, and I have the workaround, though it is one more step for those files. I have yet to figure out why one very similar file will convert with the Duration info perfect, and then a companion file will not. Nor have I figured out exactly what is different after running through PrE. Every theory that I have developed has been shot down by my experiences. A mystery still.
    AME works well for most as a converter, though there are just some CODECs that Adobe programs do not like, such as DivX and Xvid. I doubt that any Adobe program will handle those suckers easily, if at all.
    Good luck,
    Hunt

  • Best Practices for Implementing Cryptographic VPN

    With Marcin Latosiewicz
    Welcome to the Cisco Support Community Ask the Expert conversation. This is an opportunity to learn and ask questions about implementing cryptographic VPN and how to prepare it for the future with expert Marcin Latosiewicz.
    Marcin will share his best practices for implementing cryptographic VPN and advise customers who are looking to build new setups or update existing ones on how to maximize their potential. Additionally, Marcin will provide insight into which technologies could be applicable for new deployments, as well as exciting new technologies that will be available in the next few months.
    Marcin Latosiewicz is a customer support engineer at the Cisco® Technical Assistance Center in Belgium, with more than 6 years of experience with Cisco Security products and technologies, including IPsec, VPN, internetworking appliances, network and system security, Internet services, and Cisco networking equipment. Prior to joining Cisco, he operated, administered, and ran UNIX and Microsoft networks for 14 years. Latosiewicz holds bachelor's and master's degrees in engineering from Warsaw University of Technology. He also holds CCIE® certification in Security (No. 25784) and CCDP® certification.
    Remember to use the rating system to let Marcin know if you've received an adequate response. 
    Because of the volume expected during this event, Marcin might not be able to answer every question. Remember that you can continue the conversation in the Security community's VPN subcommunity shortly after the event. This event lasts through September 20, 2013. Visit this forum often to view responses to your questions and those of other Cisco Support Community members.

    Jouni,
    Good question. And the answer is complex: there is "in depth" and there is "in depth".
    Most people would be satisfied by reading a summary of all the different components - encryption, hashing, signing, PKI, how IPsec and SSL/TLS work. This group also includes most security CCIEs.
    To this extent, the CCIE Security Study Guide (by Henry Benjamin) was a good read, if a bit outdated today.
    Most people who want real depth will look first into the specifications:
    RFC 4301 (IPsec architecture)
    RFC 2246 (TLS 1.0)
    These are a good start and contain references to other documents worth reading.
    This is where the good folks will base their knowledge.
    The really in-depth people will look into the math behind it and will conquer topics like
    Elliptic Curve Cryptography ( http://en.wikipedia.org/wiki/Elliptic_curve_cryptography ) and the difference between CCM, GCM, and CBC, on which there are really good materials published by universities.
    There are relatively few who know this.
    To start with, I can suggest:
    - http://www.cl.cam.ac.uk/~rja14/book.html (Ross Anderson's book is free, informative, and surprisingly entertaining; it is definitely a must-read for security/VPN).
    - Have a look at the books recommended by Richard Bejtlich or Bruce Schneier - while they might not be VPN-specific, they are a good security read most of the time.
    I'll have a look at the books at home, see which ones could be interesting to read, and edit this post.
    M.

  • Best practices for ARM - please help!!!

    Hi all,
    Can you please help with any pointers / links to documents describing best practices for who should be creating the GRC request in the below ARM workflow in GRC 10.0?
    Create GRC request -> role approver -> risk manager -> security team
    The options are: end user / manager / functional super users / security team.
    End user and manager are not possible - we cannot train so many people. The functional team is refusing since it is a lot of work. Please help me with pointers to any best-practices documents.
    Thanks!!!!

    In this case, I recommend proposing that the department managers create GRC Access Requests.  In order for the managers to comprehend the new process, you should create a separate "Role Catalog" that describes what abilities each role enables.  This Role Catalog needs to be taught to the department Managers, and they need to fully understand what tcodes and abilities are inside of each role.  From your workflow design, it looks like Role Owners should be brought into these workshops.
    You might consider a Role Catalog that the manager could filter on and make selections from.  For example, an AP manager could select "Accounts Payable" roles, and then choose from a smaller list of AP-related roles.  You could map business functions or tasks to specific technical roles.  The design flaw here, of course, is the way your technical roles have been designed.
    The point being, GRC AC 10 is not business-user friendly, so using an intuitive "Role Catalog" really helps the managers understand which technical roles they should be selecting in GRC ARs.  They can use this catalog to spit out a list of technical role names that they can then search for within the GRC Access Request.
    At all costs, avoid having end-users create ARs.  They usually select the wrong access, and the process then becomes very long and drawn out because the role owners or security stages need to mix and match the access after the fact.  You should choose a Requestor who has the highest chance of requesting the correct access.  This is usually the user's Manager, but you need to propose this solution in a way that won't scare off the manager - at the end of the day, they do NOT want to take on more work.
    If you are using SAP HR, then you can attempt HR Triggers for New User Access Requests, which automatically fill out and submit the GRC AR upon a specific HR action (New Hire, or Termination).  I do not recommend going down this path, however.  It is very confusing, time consuming, and difficult to integrate properly.
    Good luck!
    -Ken

  • Best Practices for Using Photoshop (and Computing in General)

    I've been seeing some threads that lead me to realize that not everyone knows the best practices for doing Photoshop work on a computer, and for conscientious computing in general. I thought it might be a good idea for those of us with some experience to contribute and discuss best practices for making the Photoshop and computing experience more reliable and enjoyable.
    It'd be great if everyone would contribute their ideas, and especially their personal experience.
    Here are some of my thoughts on data integrity (this shouldn't be the only subject of this thread):
    Consider paying more for good hardware. Computers have almost become commodities, and price shopping abounds, but there are some areas where spending a few dollars more can be beneficial.  For example, the difference in price between a top-of-the-line high performance enterprise class hard drive and the cheapest model around with, say, a 1 TB capacity is less than a hundred bucks!  Disk drives do fail!  They're not all created equal.  What would it cost you in aggravation and time to lose your data?  Imagine it happening at the worst possible time, because that's exactly when failures occur.
    Use an Uninterruptible Power Supply (UPS). Unexpected power outages are TERRIBLE for both computer software and hardware. Lost files and burned-out hardware are a possibility. A UPS that will power the computer and monitor can be found at the local high-tech store and doesn't cost much. The modern ones will even communicate with the computer via USB to perform an orderly shutdown if the power failure goes on too long for the batteries to keep going. Again, how much is it worth to you to avoid a computer outage and loss of data?
    Work locally, copy files elsewhere.  Photoshop likes to be run on files on the local hard drive(s).  If you are working in an environment where you have networking, rather than opening a file right off the network, then saving it back there, consider copying the file to your local hard drive then working on it there.  This way an unexpected network outage or error won't cause you to lose work.
    Never save over your original files.  You may have a library of original images you have captured with your camera or created.  Sometimes these are in formats that can be re-saved.  If you're going to work on one of those files (e.g., to prepare it for some use, such as printing), and it's a file type that can be overwritten (e.g., JPEG), as soon as you open the file save the document in another location, e.g., in Photoshop .psd format.
    Save your master files in several places.  While you are working in Photoshop, especially if you've done a lot of work on one document, remember to save your work regularly, and you may want to save it in several different places (or copy the file after you have saved it to a backup folder, or save it in a version management system).  Things can go wrong and it's nice to be able to go back to a prior saved version without losing too much work.
    Make backups. Back up your computer files, including your Photoshop work, ideally to external media. Windows now ships with a quite good backup system, and external USB drives with surprisingly high capacity (e.g., Western Digital MyBook) are very inexpensive. The external drives aren't that fast, but a backup you've set up to run late at night can finish by morning and be there if/when you have a failure or loss of data. And if you're really concerned with backup integrity, you can unplug an external drive and take it to another location.
    This stuff is kind of "motherhood and apple pie" but it's worth getting the word out I think.
    Your ideas?
    -Noel

    APC Back-UPS XS 1300.  $169.99 at Best Buy.
    Our power outages here are usually only a few seconds; this should give my server about 20 or 25 minutes run-time.
    I'm setting up the PowerChute software now to shut down the computer when 5 minutes of power is left.  The load with the monitor sleeping is 171 watts.
    This has surge protection and other nice features as well.
    -Noel

  • Best Practices for new iMac

    I posted a few days ago re the failing HDD on my mid-2007 iMac. Long story short, I took it into the Apple store, where a Genius worked on it for 45 mins before decreeing it in need of a new HDD. After considering the expenses of adding memory, a new drive, hardware and installation costs, I got a brand new entry-level iMac (21.5" screen, 2.7 GHz Intel Core i5, 8 GB 1600 MHz DDR3 memory, 1 TB HDD, running Mavericks). I also got a SuperDrive. I do not need to migrate anything from the old iMac.
    I was surprised that a physical disc for the OS was not included. So I am looking for any Best Practices for setting up this iMac, specifically in the area of backup and recovery. Do I need to make a boot DVD? Would that be in addition to making a Time Machine full backup (using external G-drive)? I have searched this community and the Help topics on Apple Support and have not found any "checklist" of recommended actions. I realize the value of everyone's time, so any feedback is very appreciated.

    OS X has not been officially issued on physical media since OS X 10.6 (arguably 10.7 was issued on some USB drives, but this was a non-standard approach for purchasing and installing it).
    To reinstall the OS, your system comes with a recovery partition that can be booted to by holding the Command-R keys immediately after hearing the boot chimes sound. This partition boots to the OS X tools window, where you can select options to restore from backup or reinstall the OS. If you choose the option to reinstall, then the OS installation files will be downloaded from Apple's servers.
    If for some reason your entire hard drive is damaged and even the recovery partition is not accessible, then your system supports the ability to use Internet Recovery, which is the same thing except instead of accessing the recovery boot drive from your hard drive, the system will download it as a disk image (again from Apple's servers) and then boot from that image.
    Both of these options will require you have broadband internet access, as you will ultimately need to download several gigabytes of installation data to proceed with the reinstallation.
    There are some options available for creating your own boot and installation DVD or external hard drive, but for most intents and purposes this is not necessary.
    The only "checklist" option I would recommend for anyone with a new Mac system, is to get a 1TB external drive (or a drive that is at least as big as your internal boot drive) and set it up as a Time Machine backup. This will ensure you have a fully restorable backup of your entire system, which you can access via the recovery partition for restoring if needed, or for migrating data to a fresh OS installation.

  • (Request for:) Best practices for setting up a new Windows Server 2012 r2 Hyper-V Virtualized AD DC

    Could you please share your best practices for setting up a new Windows Server 2012 R2 Hyper-V virtualized AD DC that will be running on a new WinSrv 2012 R2 host server? (This will be for a brand new network setup: new forest, domain, etc.)
    Specifically, your best practices regarding:
    - the sizing of non-virtual and virtual volumes/partitions/drives,
    - the use of sysvol, logs, & data volumes/drives on hosts & guests,
    - RAID levels for the host and the guest(s),
    - IDE vs SCSI and drivers, both non-virtual and virtual, and the booting thereof,
    - disk caching settings on both host and guests.
    Thanks so much for any information you can share.

    A bit of non-essential additional info:
    We are a small-to-midrange school district who, after close to 20 years on Novell networks, have decided to design and create a new Microsoft network and migrate all of our data and services over to the new infrastructure. We are planning on rolling out 2012 R2 servers with as much Hyper-V virtualization as possible.
    During the last few weeks we have been able to find most of the information we need to undertake this project, and most of it was pretty solid with little ambiguity, except for the information regarding virtualizing the DCs, which has been a bit inconsistent.
    Yes, we have read all the documents that most of these posts tend to point to, but found that some, if not most, still refer to performing this under Srvr 2008 R2; we haven't really seen all that much on Srvr 2012 R2.
    We have read these and others:
    Introduction to Active Directory Domain Services (AD DS) Virtualization (Level 100), 
    Virtualized Domain Controller Technical Reference (Level 300),
    Virtualized Domain Controller Cloning Test Guidance for Application Vendors,
    Support for using Hyper-V Replica for virtualized domain controllers.
    Again, thanks for any information, best practices, cookie cutter or otherwise that you can share.
    Chas.

  • Workflow not completed, is this best practice for PR?

    Hi SAP Workflow experts,
    I am new to workflow and am now responsible for supporting an existing PR release workflow.
    The workflow is quite simple and straightforward, but the issue is that the workflow seems like it will never be completed.
    When the user releases the PR, the next activity is "Requisition released", which uses task TS20000162.
    This sends a work item to the user's (PR creator's) SAP inbox, which completes the workflow when they double-click it.
    The thing is, in our organization users do not access the SAP inbox, hence there are thousands of work items that have not been completed (our procurement system has been running since 2009).
    Our PR creators receive notification of the PR approval in their Outlook mail, handled by a program that is scheduled every 5 minutes.
    Since the documentation is not clear enough, I can't work out why the implementer used this approach.
    May I know whether this is the best practice for a PR workflow or not?
    My idea now is to modify the send-email program to complete the work item after the email is sent to the user's Outlook mail.
    I am not sure whether that is common in the workflow world, though.
    Any help is deeply appreciated.
    Thank you.

    Hello,
    "This will send work item to user (pr creator) sap inbox which when they double click it will complete the workflow."
    It sounds like they are sending a work item where an email would be enough. By completing the work item, they are simply acknowledging that they have received notification of the completion of the PR.
    "Our PR creator will receive notification of the PR approval to theirs outlook mail handled by a program that is scheduled every 5 minutes."
    I hope (and assume) that they only receive the email once.
    I would change the workflow to send an email (SendMail step) to the initiator instead of the work item. That is normally what is done. Either that, or there is no email at all - some businesses only send an email if something goes wrong. Of course, the business has to agree to this change.
    Having that final work item adds nothing to the process. Replace it with an email.
    regards
    Rick Bakker
    hanabi technology

  • Best practice for upgrading task definition without deleting task instances

    What is the best practice for upgrading a task definition in a production system without deleting or terminating task instances?
    If I try to update a task definition with task instances running, I get the following error:
    Task definition 'My Task - Add User' may not be modified while there are active task instances
    Is there a best practice to handle this? I tried to force an update through the console, but that didn't work. I tried editing the task from the debug page and got the same error.

    1) Rename the original task definition.
    2) Upload the new task definition with the original name.
    3) Later, after all the running tasks have timed out, delete the old definition.
    E.g., if your task definition is "myWorkflow":
    1) Rename "myWorkflow" to "myWorkflow-old-2009-07-28"
    2) Upload the new task definition as "myWorkflow".
    Existing tasks will stay linked to the original (renamed) workflow definition.
    New tasks will use the new definition.
    As the previous poster notes, depending on the changes you are making, letting the old task definitions stay active could have bad side-effects and might be better avoided.
