Best way to handle duplicate headings stemming from linked TOC book?

What's the best way to handle duplicate topic titles stemming from TOC books that contain links to a topic that you want to have appear in the body? The problem I've had for years now is that the TOC generates one heading and the topic itself generates another, which results in duplicate headings in the printed output.
I have a large, ~2500 topic project I have to print every release, and to date we've been handling it with post-build Word macros, but that's not 100% effective, so we're looking to fix this issue on the source side of the fence. On some of our smaller projects, we've actually marked the heading in the topic itself with the Online CBT, and that seems to work. We're thinking of doing the same in our huge project unless there's a better way of handling this. Any suggestions?

See the tip immediately above the point this link takes you to: http://www.grainge.org/pages/authoring/printing/rh9_printing.htm#wizard_page3
The alternative is to remove the topic from the print layout so that it only generates by virtue of its link to the book.
See www.grainge.org for RoboHelp and Authoring tips
@petergrainge

Similar Messages

  • What is the best way to handle duplicates in an XML document?

    I have an XML document that may contain duplicated nodes. I want to insert it to the DBXML database, so that the duplicated nodes are eliminated.
    What would be the best way (in terms of performance) to do it?
    I thought of enforcing a uniqueness constraint and then inserting the nodes one by one, so that I will get an exception from the database if a node is duplicated, but I may have more than 50000 nodes in the worst case, so I'm not sure this is still a good way to do it.
    Can someone give me some suggestion on this?
    Thanks!

    Hi,
    I would suggest reconsidering how you build your document so that it doesn't contain duplicates if you don't need them; that part doesn't have much to do with DB XML.
    Alternatively, you could insert the document with the duplicates and use the XQuery Update facilities to delete the unwanted nodes.
    Vyacheslav
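    If you do pre-process the document in code before loading it into DB XML, a minimal sketch in plain Java DOM could look like the following. The dedup key used here (element name plus text content of the top-level children) is only an assumption; replace it with whatever actually defines a duplicate node in your data.
    import java.io.File;
    import java.util.HashSet;
    import java.util.Set;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.*;
    public class DedupeXml {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(new File(args[0]));
            Element root = doc.getDocumentElement();
            NodeList children = root.getChildNodes();
            Set<String> seen = new HashSet<>();
            // Walk backwards so removals don't disturb the indexes still to be visited.
            for (int i = children.getLength() - 1; i >= 0; i--) {
                Node n = children.item(i);
                if (n.getNodeType() != Node.ELEMENT_NODE) continue;
                String key = n.getNodeName() + "|" + n.getTextContent();
                if (!seen.add(key)) {               // key already seen -> duplicate node
                    root.removeChild(n);
                }
            }
            // Write the cleaned document back out; load this file into the container.
            TransformerFactory.newInstance().newTransformer()
                    .transform(new DOMSource(doc), new StreamResult(new File(args[1])));
        }
    }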

  • What is the best way to handle executing multiple packages from the Agent?

    I have several packages that have to be executed in sequence. I thought the best way to do that was to create a job for each package and then have a master job that executes the others. In my master job, I'm using sp_start_job to call the other jobs. The problem is that the master job moves from step to step without waiting for the child jobs to finish; basically they all execute together.
    That is the way I've seen it done in other places, so I feel like I'm doing something wrong. Alternatively, I know it's possible to set up the individual steps so they execute the packages directly without calling an external job. I prefer the first way, though.
    Which way should I jump on this?

    So basically what I'm hearing is to just call the packages in a multi-step job. Creating a master package and calling child packages sounds a little crazy and unscalable, especially considering that the packages have master-child relationships within themselves. It's SSIS Package Inception.
    Sorry, what's the issue with that?
    Provided you set the package sequence correctly based on your dependencies, it will work fine, since the loop iterates based on how you set the package list.
    What we have is an audit and control mechanism that also holds the dependency details, so based on the dependency set, the packages get ordered and listed for the loop to iterate through and execute. If a new package comes along tomorrow, all it takes is tweaking the audit table to add an entry for the package and set its dependency, and everything continues to work fine, including the new package, without touching the existing job at all.
    Another advantage of this table is that we also capture audit details in it, such as the date the package was last executed, its execution status, and the rows processed (inserted, modified, deleted), which makes it easy to monitor the data processing tasks as well.
    Visakh

  • Best way to handle RAISERROR after migration from SQL 2K?

    Hi,
    Here is some sample code:
    raise_application_error(-20999, 'message id: 50048');
    /*[SPCONV-ERR(7)]:('Raiserror ( <message id>....) Manual conversion required*/
    /* message id gives id in sqlserver syscomments table*/
    RAISE_APPLICATION_ERROR(-20500,'Trigger forces a rollback');
    /* ROLLBACK; */
    I am wondering what the best way is to find the equivalent message id, or what is best to do in Oracle to replace those events?
    Thanks in advance
    Pete

    No matter, I found something!

  • Best way to handle an event from WaitForSingleObject

    Hello LV experts,
    I would like to know the best way to handle an event which I catch using the Win32 API (WaitForSingleObject) in LabVIEW (7.1 or 8.20).
    I know there are many possible ways to do this, but I want to do it the best way; that is why I would like ideas and tips from you.
    Any and all tips appreciated

    You can download the library from the link below. Inside is an example that shows how to handle Windows messages/events.
    http://zone.ni.com/devzone/cda/epd/p/id/4394

  • (workflow question) - What is the best way to handle audio in a large Premiere project?

    Hey all,
    This probably applies to any version of Premiere, but just in case: I use CS4 (Master Collection).
    I am wrestling with the best way to handle audio in my projects to cut down on the time I spend working on them.
    The project I just finished was a 10 minute video for a customer, shot on miniDV (HVX-200) and cut down from 3 hours of tape.
    I edited my whole project down to what looked good, and then I decided I needed to clean up all the audio using Soundbooth, so I had to go in clip by clip, using the Edit in Soundbooth --> Render and Replace method on every clip. I couldn't find a way to batch-edit audio in Soundbooth.
    For every clip, I performed similar actions---
    1) both tracks of audio were recorded with 2 different microphones (2 mono tracks), so I needed only audio from 1 track - I used SB to cut and paste the good track over the other track.
    2) amplified the audio
    3) cleaned up the background noise with the noise filter
    I am sure there has to be a better workflow option than what I just did (going clip by clip). Can someone give me some advice on how best to handle audio in a situation like this?
    Should I have just rendered out new audio for the whole tape I was using, and then edit from that?
    Should I have rendered out the audio after I edited the clips into one long track and performed the actions I needed on it? or something entirely different? It was a very slow, tedious process.
    Thanks,
    Aza

    Hi, Aza.
    Given that my background is audio and I'm just coming into the brave new world of visual bits and bytes, I would second Hunt's recommendation regarding exporting the entire video's audio as one wav file, working on it, and then reimporting. I do this as one of the last stages, when I know I have the editing done, with an ear towards consistency from beginning to end.
    One of the benefits of this approach is that you can manage all audio in the same context. For example, if you want to normalize, compress or limit your audio, doing it a clip at a time will make it difficult for you to match levels consistently or find a compression setting that works smoothly across the board. It's likely that there will instead be subtle or obvious differences between each clip you worked on.
    When all your audio is in one file you can, for instance, look at the entire waveform, see that limiting to -6 dB would trim off most of the unnecessary peaks, trim it down, and then normalize it all. You may still have to do some tweaking here and there, but it gets you much farther down the road, much more easily. The same goes for reverb, EQ or other effects where you want the same feel throughout the entire video.
    Hope this helps,
    Chris

  • Best way to handle multiple currencies

    I have a requirement that users should be able to report against an OLAP cube in a currency of their choice (from a list of about 20) and was wondering what the best way to handle this might be.
    One option would be to have a currency dimension containing the list of valid currencies and then to pre-calculate the measures in each of the currencies and store them in the cube. However, the downside of this is that the resultant cube would be 20 times larger than a cube in a single currency, would take longer to maintain, etc. I could of course partition the cube by currency to improve reporting performance, since users would only report in one currency at a time.
    Another alternative would be to dynamically calculate the measures based on exchange rates - I guess this could either be done in the cube itself or as part of the reporting code. However, since exchange rates are daily, this would obviously prevent me from aggregating data up the time dimension (all measures are at the day level).
    Is there any standard way of doing this, and what are the pros and cons?
    Thanks,
    Chris

    Sorry - messed up - I should have posted this in the OLAP forum.....
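    For the second alternative (converting in the reporting code), the day-level conversion itself is straightforward; the harder design question is where in the time hierarchy conversion remains meaningful, since daily rates cannot simply be applied to pre-aggregated totals. A minimal sketch in plain Java, assuming a hypothetical rate table keyed by date, with every rate quoted against a single base currency:
    import java.math.BigDecimal;
    import java.math.RoundingMode;
    import java.time.LocalDate;
    import java.util.HashMap;
    import java.util.Map;
    public class CurrencyConverter {
        // rate = units of the base currency per one unit of the quoted currency, per day
        private final Map<LocalDate, Map<String, BigDecimal>> ratesToBase = new HashMap<>();
        public void addRate(LocalDate day, String currency, BigDecimal rateToBase) {
            ratesToBase.computeIfAbsent(day, d -> new HashMap<>()).put(currency, rateToBase);
        }
        public BigDecimal convert(BigDecimal amount, String from, String to, LocalDate day) {
            Map<String, BigDecimal> rates = ratesToBase.get(day);
            if (rates == null || !rates.containsKey(from) || !rates.containsKey(to)) {
                throw new IllegalArgumentException("No rate for " + from + "/" + to + " on " + day);
            }
            // Convert through the base currency: amount -> base -> target.
            BigDecimal inBase = amount.multiply(rates.get(from));
            return inBase.divide(rates.get(to), 6, RoundingMode.HALF_UP);
        }
    }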

  • Best way to find duplicate pages / find duplicate background images

    What is the best way to detect duplicate pages?
    The pages I am dealing with are searchable images (a scanned image background with selectable text on top). In this case, any two pages that have the exact same background image are duplicates.
    I only know how to get page text though, so I've been getting the text and hashing it, then checking for duplicate hashes. This works for the most part, but I fear running into two different pages with the exact same text.
    What about looking at the background image? If a PDF has multiple pages with the same background image, I assume it would store the image once and then just reference it from the pages? Is it possible to check duplicate pages this way?
    Or does Acrobat have a built-in duplicate-checking feature I haven't discovered? As always, any help is appreciated.

    OK, well, for the most part doing it by text works, but it sometimes flags things that aren't duplicates: for example, two copies of the same worksheet that were not filled out will have the exact same text despite being completely different pages.
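    For what it's worth, the text-hashing approach can be kept simple and made easier to audit by normalising whitespace and grouping page numbers by hash, so collisions like the blank-worksheet case can be reviewed by eye (a text-only check can never tell those apart). A minimal sketch in Java; the getPageText call is a stand-in for whatever extraction method you are already using:
    import java.math.BigInteger;
    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    public class DuplicatePageFinder {
        // Hypothetical: returns the selectable text of one page (plug in your extractor).
        static String getPageText(int pageIndex) {
            throw new UnsupportedOperationException("replace with your text extraction");
        }
        static Map<String, List<Integer>> groupPagesByTextHash(int pageCount) throws Exception {
            MessageDigest sha = MessageDigest.getInstance("SHA-256");
            Map<String, List<Integer>> byHash = new HashMap<>();
            for (int page = 0; page < pageCount; page++) {
                // Normalise whitespace so layout-only differences don't split groups.
                String text = getPageText(page).replaceAll("\\s+", " ").trim();
                byte[] digest = sha.digest(text.getBytes(StandardCharsets.UTF_8));
                String hex = new BigInteger(1, digest).toString(16);
                byHash.computeIfAbsent(hex, h -> new ArrayList<>()).add(page);
            }
            // Any hash mapped to more than one page number is a *candidate* duplicate set.
            return byHash;
        }
    }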

  • Best way to handle all errors and keep performance (OCI)?

    Hi there,
    I'm using the Oracle Call Interface to execute a batch file process, but I have got a problem.
    I set ExecuteBatch to the same value as the commit interval, i.e. 100, 1000 or just 10; that part is OK.
    ((OraclePreparedStatement)globalStmt).setExecuteBatch(commit);
    And I use executeUpdate and catch all SQLExceptions. I made some test files with deliberate errors, but when I call executeUpdate it doesn't report the correct error line; it reports another line that is actually correct.
    ((OraclePreparedStatement)globalStmt).executeUpdate();
    Checking the code, I concluded that it always reports the error at the commit-interval boundary rather than at the actual error line. For example, if there is an error between lines 1 and 50, say at line 31, and I set the commit interval to 50, it tells me the error line is 50 instead of 31. If I call setExecuteBatch with '1' it reports the correct error lines, but system performance drops badly. What is the best way to handle all errors and keep the performance?
    Sorry for my English, I am not a native speaker. Thanks, all.
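    One common compromise, sketched below with standard JDBC batching rather than the Oracle-specific setExecuteBatch extension, is to keep the large batch for the normal case and, only when a chunk fails, roll that chunk back and replay it row by row so each error can be tied to the exact input line. The RowBinder interface is hypothetical and stands in for whatever parses one line of the file and binds its values to the statement:
    import java.sql.BatchUpdateException;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;
    public class BatchLoader {
        // Hypothetical helper: bind the columns parsed from one file line to the statement.
        interface RowBinder { void bind(PreparedStatement ps, String line) throws SQLException; }
        static void load(Connection con, String sql, List<String> lines,
                         int batchSize, RowBinder binder) throws SQLException {
            con.setAutoCommit(false);                  // so a failed chunk can be rolled back
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                for (int start = 0; start < lines.size(); start += batchSize) {
                    int end = Math.min(start + batchSize, lines.size());
                    try {
                        for (int i = start; i < end; i++) {
                            binder.bind(ps, lines.get(i));
                            ps.addBatch();
                        }
                        ps.executeBatch();             // fast path: one round trip per chunk
                        con.commit();
                    } catch (BatchUpdateException bue) {
                        con.rollback();                // undo any partial work in this chunk
                        ps.clearBatch();
                        // Slow path, only for the failed chunk: replay row by row so the
                        // error message can name the exact offending line.
                        for (int i = start; i < end; i++) {
                            try {
                                binder.bind(ps, lines.get(i));
                                ps.executeUpdate();
                            } catch (SQLException e) {
                                System.err.println("Line " + (i + 1) + " rejected: " + e.getMessage());
                            }
                        }
                        con.commit();                  // keep the rows of the chunk that did load
                    }
                }
            }
        }
    }
    This way only the failed chunks pay the row-by-row cost, so overall throughput stays close to the fully batched version.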

    So by doing this, everything will transfer and look exactly the way I have it on the old machine?
    That is correct; if your old machine is Intel-based, then after using MA the new machine will look just like the old machine. Here is information from Apple on MA; I'd recommend looking it over.
    My recommendation is to answer NO when setting up the new machine and it asks "Are you moving from another Mac?" The reason is to let your new machine get set up and run for a couple of hours to ensure it's fine. Then launch MA and follow the prompts; it's very easy, and if you use a fast connection like FW it should go smoothly.
    Regards,
    Roger

  • Best way to handle when Production XI is in outrage/down

    Hi All,
    My production box is running smoothly now. If we go for any outrage, we will still be receiving many messages over the internet from many partners.
    So we are wondering how to receive the messages while production XI is in outrage. My client is thinking of putting up one more instance as a replica of the production box, so that when the main production box is in outrage they can immediately switch over to this replica and process the messages until the main one is ready again.
    Now I am very much concerned about what the implications will be if we go this way.
    1. Which is the best way to handle the XI box to maintain it when the system is outrage/down?
    2. Is it a good idea to create a replica of the production instance and maintain it for when the main one is in outrage?
    Kindly suggest what you guys are doing in your implementations.
    Thanks
    Seema

    >
    > 1. Which is the best way to handle the XI box to maintain it when the system is outrage/down? (An act of extreme violence or viciousness)
    >
    XI Box - Outrage - Should be a robotic XI Box. Now I know you are thinking about what to do when the cops come and handcuff it. -- Just kidding.
    OK, here's the logic: you want to replace a piece of water pipe that gets water to your faucet. What do you do without leaking water or disrupting the supply? You either turn off the valve or add a temporary pipe to keep the water flowing.
    So check your system landscape for failsafe or high availability options; maybe one of those can save you money and time.
    AB

  • Best way to handle medical collections?

    Hello all, My husband and I are new to the "rebuilding your credit" world.  We're coming back from a foreclosure and actively trying to rebuild.  I have 4 active collections for 3 creditors listed on my EQ/TU report, totaling $399.  I can pay them in full right now, but I don't want to call the creditors to pay them if it's not going to be removed from my reports properly.  Some of these are old - 2011. I also have 3 collections "removed" from my TU/EQ report since March - will paying these affect what is listed on my reports?
    What is the best way to handle these accounts to ensure they are removed from my reports completely?    

    Contact each creditor and politely request a Pay for Deletion: an agreement that they will delete the entry from your reports in exchange for full payment. Get this in writing. If that doesn't work (try a few times), pay it, then try the goodwill route: write letters requesting early deletion of the entries from your reports. Even though negative items will fall off your report in 7 years (so the 2011 collection will fall off in 2018), you will still have to report that you have unpaid debt on future mortgage applications, etc. And on a manual review, paid collections look better than unpaid ones. I would not advise leaving them unpaid.

  • Best way to handle source files

    Hi there,
    After some pretty general advice, please.
    The company I work for looks after a lot of websites, and one of the headaches we have is the best way to handle source files. By source files I'm referring to Photoshop files, Flash .fla files and also other non-Adobe files that relate to a site, not the .html, .asp, .aspx, .css, .js etc. type files.
    Now I'm NOT after a version control system, just a simple way to store the source files in a location that is separate from the website but still be able to have a smooth workflow between the Dreamweaver site and its source files.
    At the moment, and I know this is unwise, we have a subdirectory within the site where we store the source files, and we use WebDAV to transfer both site and source files to and from the server. But I really want to separate the site from the source while still maintaining a link between site and source... if you see what I mean. I think the upshot is I would like to be able to open a site within Dreamweaver and instantly be able to access that site's source files if needed. This method needs to be shared across a small team spread around the UK.
    I looked at Subversion repository version control, but like I said I'm not after a source control system, plus it appeared to conflict with WebDAV and Contribute, which some of our clients use to maintain content on their sites. I also looked at Version Cue, which looked promising, but I can't see a clear workflow between Dreamweaver and Version Cue which separates site from source. I might be missing something... part of my brain perhaps. :)
    Would be grateful for any advice please.
    Cheers,
    @ndyB

    Take a deep breath. Relax. All is fine.
    iDVD does not look at the size of your video file, it looks at the length. iDVD can accommodate up to 2 hours of movie.
    iDVD gives you different options depending on the length of your movie. Although I won't agree with your friend about reducing the length of your movie to 15 minutes, if you could trim out a few minutes to get it under an hour, that setting in iDVD (Best Performance, though the new version may have renamed it) gives you the best quality. Still, any iDVD setting will give you good quality, even at 64 minutes.
    In FCE, export as QuickTime Movie, NOT any flavour of QuickTime Conversion. Select chapter markers if you have them. If everything is on one system, uncheck the Make Movie Self-Contained button. Drop the QT file into iDVD.

  • Best Way to Handle Dynamic Initialization of x number of Objects?

    I want to be able to take an x value (integer) that I get from another part of my program and initialize x number of objects. What's the best way to handle that?

    myObject[] myObjArray = new myObject[x];       // allocate an array of length x
    for (int i = 0; i < x; i++) {
        myObjArray[i] = new myObject("obj#" + i);  // construct and store each object
    }
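    If the number of objects can change after creation, a java.util.List is often more convenient than a fixed-size array. A minimal sketch, assuming the same myObject class with a String constructor as above:
    import java.util.ArrayList;
    import java.util.List;
    List<myObject> objects = new ArrayList<>(x);   // initial capacity x, grows if more are added
    for (int i = 0; i < x; i++) {
        objects.add(new myObject("obj#" + i));
    }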

  • Best way to handle session timeout

    Hello All,
    Oracle 11g, APEX version 3.1.2
    I am a bit confused about the session handling mechanism for the users.
    Which is the best way to handle sessions for the users: programmatically or at the DBA/admin level?
    What are the pros and cons of going the DBA level versus the programmatic level?
    Beforehand, I need to have some information on hand for justification.
    thanks/kumar

    Hi,
    I've done a great deal of work with mobile accounts in Snow Leopard and I'm now having a "play" with Lion. To be honest you have to sit down and think about why you need mobile accounts.
    If your user only uses one computer, then you're safer having a local account backed up by a network Time Machine; this avoids the many, many woes that the server's FileSyncAgent brings to the table.
    If your users are going to be accessing multiple computers on the network and leaving the network, then a mobile account is good for providing a uniform user experience and access to files etc. However, your users will have to make a choice as to whether they want their iPhoto libraries on one local machine (backed up by Time Machine) or whether they want their library to be hosted on the server and not be part of the Mobile Home Sync schedule (adding ~/Pictures to the excluded items in the home sync settings).
    With the latter, users will be able to access their iPhoto libraries on any computer when they are within the network (as the library is accessed from the user's server home folder).
    With the first option the user would have their iPhoto library on one computer (say the laptop they used the most) but would not be able to access it from other computers they log on to.
    iPhoto libraries are a pain, and I'm working hard to come up with a workaround. If your users moved over to using Aperture then you could include the Aperture library as part of the home sync, thanks to Deepport (http://deepport.net/archives/os-x-portable-home-directories-and-syncing-flaw-with-bundles/)
    He does suggest that the same would work with iPhoto libraries, but it doesn't, for a number of mysterious reasons regarding how the OS recognizes the iPhoto bundle (it does so differently compared to Aperture).
    Hope this helps...

  • Best way to handle tcMultipleMatchFoundException

    Can anyone tell me what the best way is to handle tcMultipleMatchFoundException during reconciliation?
    One way I know of is to manually correct the data. Apart from that, is there any other way?
    Thanks,
    Venkatesh.
