Best practice using "Master Page"?

I have a photography website that I am building. Aside from the splash page, all of the other pages have the same header and footer. I figure I can use a master page to save some time. Since the splash page is completely different, should I create a new .png document to work on it? It seems that once you set a master page, ALL pages in the .png have to incorporate the same elements.
Also, with a master page do I only need to slice and export the master elements once? And just slice the new items from the different pages?
Thanks,
Gordon

You don't need a separate PNG file. Simply create the splash page as a separate page that is not linked to the master. You can remove the master page layer from any page via the options menu of the Layers panel.
Jim Babbage

Similar Messages

  • How to remove infopath ribbon using Master Page?

    Hi All,
    I would like to remove the InfoPath ribbon using a Master Page. Using F12 I removed it with <div id="s4-ribbonrow" class="s4-pr s4-ribbonrowhidetitle" style="display:none">. But how do I do this in the original master page? Thanks in advance!

    Hi,
    Based on your description, my understanding is that you want to remove the InfoPath ribbon using a Master Page.
    Use F12 to find the <div id="RibbonWrapper"> section. Then navigate to C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\TEMPLATE\LAYOUTS\FormServer.aspx and add the following CSS into the <head runat="server"></head> section:
    <style type="text/css">
    #RibbonWrapper {
        display: none !important;
    }
    </style>
    Best Regards,
    Lisa Chen
    TechNet Community Support

  • Favorite / Best Practice / Useful MTE's ??!?!

    Hi Everyone,
    I'm setting up monitoring on a multi-system architecture; I've got all of the agents working and reporting to my CEN. I've even got the auto-response email all set up. Great!
    I've been looking around for any best practices on which MTEs (out of the hundreds there!) I should be setting up virtual and rule-based nodes for.
    So come on, everyone: what are your favorite MTEs? Of course I'm looking at Dialog Response Times and Filesystem %'s, but any other tips? Hints? Tricks? Any neat MTEs out there that you love to check?
    A best practice / useful guide would be brilliant!
    Thanks in advance.
    Nic Doodson


  • What is the BEST practice - use BO or Java Object in process as webservice

    Hi All,
    I have my BP published as a web service. I have defined my process input & output as BOs. My BP talks to the DB through a DAO layer (written in Java) which has Java objects. So I have BOs as well as Java objects. Since I am collecting user input in a BO, I have to assign the individual values contained in the BO to the Java object's fields.
    I want to reduce this extra headache and use either the BO or the Java object. I want to know what the best practice is: use a BO or a Java object as process input? If it is the BO, how can I reuse BOs in Java?
    Thanks in advance.
    Thanks,
    Sujata P. Galinde

    Hi Mark,
    Thanks for your response. I also wanted to use a Java object only. When I use a Java object as the process input argument it is fine, but when I try to create the process web service, I get a compilation error: "data type not supported".
    To get rid of this error, I tried inheritance (a BO inheriting from the Java class), but while invoking the process as a web service, it does not ask for the fields inherited from the Java class.
    Then I created a Business Object with a field of type Java class. This also is not working; while sending the request, it gives an error that the field types for the fields from the Java class were not found.
    Conclusion: I am not able to use a Java object as the input argument of a process exposed as a web service.
    What is the best and feasible way to accomplish the task: a process using a DAO in Java, exposed as a web service?
    Thanks & Regards,
    Sujata
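    A common way around the "data type not supported" error (a minimal sketch only; the class names, fields, and the JAX-WS-style marshalling assumption below are hypothetical, not taken from Sujata's project) is to keep the process input as a plain serializable bean with a no-arg constructor and getters/setters, and to confine the BO-to-Java mapping to a single method instead of copying fields all over the process:

    import java.io.Serializable;

    // Hypothetical web-service-friendly input type: most WS marshallers
    // require a plain bean with a no-arg constructor and getters/setters.
    public class ShipmentInput implements Serializable {
        private String materialNumber;
        private int quantity;

        public ShipmentInput() { }  // no-arg constructor required by most marshallers

        public String getMaterialNumber() { return materialNumber; }
        public void setMaterialNumber(String m) { this.materialNumber = m; }
        public int getQuantity() { return quantity; }
        public void setQuantity(int q) { this.quantity = q; }

        // Single mapping point to the DAO layer's object, so the
        // BO/Java duplication is confined to one method.
        public ShipmentRecord toRecord() {
            ShipmentRecord r = new ShipmentRecord();
            r.setMaterialNumber(materialNumber);
            r.setQuantity(quantity);
            return r;
        }
    }

    // Hypothetical object used by the Java DAO layer.
    class ShipmentRecord {
        private String materialNumber;
        private int quantity;

        public void setMaterialNumber(String m) { this.materialNumber = m; }
        public void setQuantity(int q) { this.quantity = q; }
        public String getMaterialNumber() { return materialNumber; }
        public int getQuantity() { return quantity; }
    }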

  • Looking for best practices using Linux

    I use the Linux platform for all the Hyperion tools. We have been having problems with Analyzer V7.0.1; the server hangs up randomly.
    I'm looking for Linux best practices for using Analyzer, Essbase, EAS, etc. I'll appreciate any good or bad comments related to Hyperion on Linux OS.
    Thanks in advance.
    Mario Guerrero
    Mexico

    Hi,
    did you search for patches? It can be a known problem. I use all Hyperion tools on Windows without any big problems.
    Hope this helps,
    Grofaty

  • How to set unequal columns using master pages in InDesign CS3?


    I don't have CS3 anymore but I don't think this has substantially changed in the last few versions of InDesign.
    Choose View > Grids & Guides > uncheck Lock Column Guides. Then drag the column guides to the position you want.

  • Best Practices Used in CMS

    Hi,
    Can anyone share the best practices used in CMS transports?
    Basically, the reason I need this is that we want to keep track of all the transports that are done in QA/Prod.
    Regards,
    Sreenivas

    Hi
    You can try checking this document to learn about CMS
    (How To… Transport XI Content Using CMS)
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f85ff411-0d01-0010-0096-ba14e5db6306
    Also...
    How to configure the CMS for XI?
    Business System Groups - CMS

  • Best Practice for Master Data Reporting

    Dear SAP-Experts,
    We face a challenge at the moment and we are still trying to find the right approach to it:
    Business requirement is to analyze SAP Material-related Master Data with the BEx Analyzer (Master Data Reporting)
    Questions they want to answer here are for example:
    - How many active Materials/SKUs do we have?
    - Which country/Sales Org has adopted certain Materials?
    - How many Series do we have?
    - How many SKUs belong to a specific season?
    - How many SKUs are at a certain stage of the product lifecycle?
    - etc.
    The challenge is that the master data is stored in tables with different keys in R/3.
    The keys in these tables are on various levels (a selection below):
    - Material
    - Material / Sales Org / Distribution Channel
    - Material / Grid Value
    - Material / Grid Value / Sales Org / Distribution Channel
    - Material / Grid Value / Sales Org / Distribution Channel / Season
    - Material / Plant
    - Material / Plant / Category
    - Material / Sales Org / Category
    etc.
    So even though the information is available at different levels of detail, the business requirement is to have one query/report that combines all the information. We are currently struggling a bit to decide what the best approach to this requirement would be. Has anyone faced such a requirement before, and what would be best practice? We already tried to find information online, but it seems master data reporting is not very well documented. Thanks a lot for your valuable contribution to this discussion.
    Best regards
    Lukas


  • Best practice: Material Master Description

    Warm Greetings,
    Change Material Master Description
    Kindly suggest the best practice.
    In my company, material masters are created like:
    - Electricity & water bill for Main building
    - Electricity & water bill for Warehouse
    - Electricity & water bill for Workshop
    My finance user is asking me to create one material with the name 'Electricity & water bill'; whenever the requirement comes, we would change the description in the PR and in the PO.
    Kindly suggest the best practice.
    Regards
    shamulheq

    Hi,
    You can also maintain the description in the Material Master PO Text, which you can print in the PO, or you can use Additional Data --> Basic data text.
    Regards,
    Vikas

  • 2nd Mac - best practices using iPhoto on both?

    Hi -
    I just got a new MacBook and have an iMac that is still the "hub" of my photo library. It is, in fact, about a 180 GB iPhoto library. I know that I can't sync libraries between Macs (a shame; someone should come up with a way to do that, assuming they haven't already!) so I'm just looking for any best practices.
    I got the MacBook to be able to work on some photos while on the road - I can at least work on post processing in Photoshop, etc. I'm thinking now that my best strategy is to possibly work with the images on my MacBook, importing them into the iPhoto library if desired. Then use my Photo sharing service - Phanfare - to "sync" them? It requires me to download them on the other side and pull them again into the iPhoto Library on the iMac?
    I don't use the Mobile Me Gallery but I suppose that would be another way to have access to them on the alternate computer?
    Any other best practices or suggestions?
    Thx!

    > So, if there are times when I'm not home to access my external drive, then going with the two libraries is the best solution, yes?
    Perhaps, but you can get very small and portable external HDs these days.
    > I'm not sure though if I should really make both a 180 GB iPhoto library, do you? It is a backup, true, but seems like a chunk to move.
    But you only do it once. The first time. Thereafter you're simply updating the other with the changes.
    > At least maybe I could split into pictures from 2009 - 2010 and have that library for both my iMac and the MacBook. I very rarely access before then (only if I need something specific) so then I could access that via the iMac exclusively?
    That would be viable. I would maintain a full Library on the Desktop, the mobile versions a smaller subset.
    > I'm sort of ruling out the one library on the external solution because it eliminates the possibility of being remote - unless there is some swanky Login to My Computer or something that works with a Mac that can go remotely to my computer and then to my external drive.
    As I said above, you can get tiny portable drives... This might help.
    Regards
    TD

  • Best practice using regular properties

    What is considered best practice when it comes to using properties, for example hostname and port number when connecting to an external resource?
    Should property files be used? Is this considered bad practice? Should deployment descriptors be used, and if so, how does one update these properties when they change?
    Are there any utility classes that make this kind of property easy to access?
    ---- Trond

    Depends on what properties. Many properties like hostname etc. can be retrieved using different API calls, such as the request object or other portal-specific objects.
    Properties that your applications need, and that might change, can be stored in a properties file. I use a singleton to retrieve them, and have a reload method on the singleton that I can call if I need to reload the properties once the server has started.
    Kunal
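    A minimal sketch of the singleton Kunal describes (the class name, file location, and property key below are hypothetical):

    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Properties;

    // Hypothetical singleton that loads app.properties once at startup
    // and can reload it on demand without restarting the server.
    public final class AppConfig {
        private static final Path FILE = Path.of("app.properties"); // assumed location
        private static final AppConfig INSTANCE = new AppConfig();
        private volatile Properties props;

        private AppConfig() {
            reload();
        }

        public static AppConfig getInstance() {
            return INSTANCE;
        }

        // Re-read the file; callers see the fresh values on their next get().
        public synchronized void reload() {
            Properties fresh = new Properties();
            try (InputStream in = Files.newInputStream(FILE)) {
                fresh.load(in);
            } catch (IOException e) {
                throw new IllegalStateException("Cannot load " + FILE, e);
            }
            props = fresh;
        }

        public String get(String key) {
            return props.getProperty(key);
        }
    }

    Usage would then look like String host = AppConfig.getInstance().get("service.hostname"); and a call to AppConfig.getInstance().reload() picks up edits at runtime.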

  • Idoc processing best practices - use of RBDAPP01 and RBDMANI2

    We are having performance problems in the processing of inbound idocs.  The message type is SHPCON, and transaction volume is very high.  I am a functional consultant, not an ABAP developer, but will try my best to explain our current setup.
    1) We have a number of message variants for the inbound SHPCON message, almost all of which are set to trigger immediately upon receipt under the Processing by Function Module setting.
    2) For messages that fail to process on the first try, we have a batch job running frequently using RBDMANI2.
    Almost every day we have instances in which the RBDMANI2 job gets stuck running for a very long period of time.  We frequently have multiple SHPCON idocs coming in containing the same material number, and frequently have idocs fail because the material in the idoc has become locked.  Once the stuck batch job is cancelled and the job starts running again normally, the materials unlock and the failed idocs begin processing.  The variant for the RBDMANI2 batch job is currently set with a packet size of 1 and without parallel processing enabled.
    I am trying to determine the best practice for processing inbound idocs such as this for maximum performance in a very high volume system.  I know that RBDAPP01 processes idocs in status 64 and 66, and RBDMANI2 is used to reprocess idocs in all statuses.  I have been told that setting the messages to trigger immediately in WE20 can result in poor performance.  So I am wondering if the best practice is to:
    1) Set messages in WE20 to Trigger by background program
    2) Have a batch job running RBDAPP01 to process inbound idocs waiting in status 64
    3) Have a periodic batch job running RBDMANI2 to try and clean up any failed messages that can be processed
    I would be grateful if somebody more knowledgeable than myself on this can confirm the best practice for this process and comment on the correct packet size in the program variant and whether or not parallel processing is desirable.  Because of the material locking issue, I felt that parallel processing was not desirable and may actually increase the material locking problem.  I would welcome any comments.
    This appeared to be the correct area for this discussion based upon other discussions.  If this is not the correct area for this discussion, then I would be grateful if the moderator could re-assign this discussion to the correct area (if possible) or let me know the best place to post it.  Thank you for your help.

    Hi Bob,
    Not sure if there is an official best practice, but note 1333417 (Performance problems when processing IDocs immediately) does state that immediate processing is not a good option for high volumes.
    I'm hoping that for SHPCON there is no dependency in the IDoc processing (i.e. it's not important if they're processed in the same sequence or not), otherwise it'd add another complexity level.
    In the past for the high volume IDoc processing we scheduled a background job with RBDAPP01 (with parallel processing) and RBDMANIN as a second step in the same job to re-process the IDocs with errors due to locking issues. RBDMANI2 has a parallel processing option, but it was not needed in our case (actually we specifically wouldn't want to parallel-process the errors to avoid running into a lock issue again). In short, your steps 1-3 are correct but 2 and 3 should rather be in the same job.
    Also I believe we had a designated server for the background jobs, which helped with the resource availability.
    As a side note, you might want to confirm that the performance issues are caused only by the high volume. An ABAPer or a Basis admin should be able to run a performance trace. There might be an inefficiency in the process that could be adding to the performance issue as well.
    Hope this helps.

  • RH 10: not use Master Page in Printed Docs

    I have a Master Page set up with a company logo and horizontal rules, which works great with the help topics. The problem occurs when I generate printed docs from the source files.
    Everything else works well enough to get by (I have some tweaking to do), BUT every single topic shows up with the rules and the company logo. I don't see anywhere to NOT apply the Master Page to the printed output. Is there a secret to not using the online help layout in the printed docs?
    thanks,
    Alia

    Master page headers and footers 100% guaranteed do not survive the trip to printed documentation. My guess is that in the master page your "headers" and "footers" have been created in the body section of the master page.
    Create a new topic attaching that master page in the process. Then unlink the master page from that topic. Do you still see the header and footer? If you do, my theory is proved. If you do not, post back.
    See www.grainge.org for RoboHelp and Authoring tips
    @petergrainge

  • Best practice using keyword in subject for encryption

    I have set up an encryption content filter on my appliance to encrypt messages that are outbound and have a subject header that begins with the keyword Secure inside brackets: [Secure]. My intent was to eliminate some false positives by including the brackets. What ended up happening was that, for some reason, ALL outbound messages were being encrypted.
    After some more testing it seems like the content filter is ignoring the brackets as a requirement, but I can't seem to find anything in the online help to back this up.
    Can someone assist with verification of the requirements around using a keyword in the subject to allow users to encrypt outbound messages voluntarily?
    Best practices around achieving this, if anyone has them, would also be greatly appreciated.
    Thanks,
    Chris

    Hi Chris,
    While I would have to see the syntax of the filter in question to be 100% sure, it sounds as if your conditional value is being treated as a regular expression rather than as literal text. The content filters accept regular expressions, so something like [Secure] could be seen as: look for anything that contains an S, an e, a c, a u, or an r. Since you said "begins with", we would be looking for anything in the subject that begins with one of those letters. This is because the brackets [ ] are special characters in regular expressions, so they need to be escaped. That being said, [Secure] would become \\[Secure\\]; we use the \\ to escape the brackets. In content filters, however, you can get by with using just one escape, as the filter is smart enough to enter the additional escape for you. So in the filter it would be something like: begins with \[Secure\]
    Once you enter this in the condition for your filter, it will display like the following:
    subject == "^\\[Secure\\]"
    I hope that helps.
    Christopher C Smith
    CSE
    Cisco IronPort Customer Support  
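    For illustration, the escaping difference is easy to reproduce outside the appliance (a minimal sketch using Java's standard regex engine; the appliance's filter syntax differs in the details Christopher describes above):

    import java.util.regex.Pattern;

    // Unescaped brackets form a character class, so "^[Secure]" matches any
    // subject beginning with S, e, c, u, or r -- which is why everything
    // outbound was being encrypted.
    public class SubjectFilterDemo {
        public static void main(String[] args) {
            Pattern unescaped = Pattern.compile("^[Secure]");
            Pattern escaped = Pattern.compile("^\\[Secure\\]");

            String tagged = "[Secure] Quarterly report";
            String plain = "Status update";  // begins with 'S'

            System.out.println(unescaped.matcher(tagged).find());  // false: '[' is not in the class
            System.out.println(unescaped.matcher(plain).find());   // true: the false positive
            System.out.println(escaped.matcher(tagged).find());    // true: literal [Secure]
            System.out.println(escaped.matcher(plain).find());     // false
        }
    }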

  • Best practice using clusters to create queue/notifier/bundles ?

    I have in a block diagram a queue, notifier and several instances of bundle cluster
    that all use the same data structure.   There is a cluster typedef for the data structure.
    Of course, each of these objects (define queue, define notifier, bundle)
    need to know how the cluster is defined.
    What is considered best practice?
    1) create a dummy instance of the cluster everywhere the data structure
        definition is needed (and hide them all on the FP)
    2) create only one instance and wire it to all the places it is needed
       But there is no data flow on this wire -- it's only the cluster *definition*
       that is being used, so this would seem to clutter up the BD unnecessarily.
    3) Create only one control instance of the cluster and use local variables
        everywhere else the cluster definition is needed.  It's _value_ is never
        assigned or read so there's no issue with race conditions.
    4) Some other way?
    If you had to clean up someone else's code, how would you hope to
    see this handled?
    It occurred to me in the course of writing this that where I have
    "unbundle... code ... bundle" I could wire the original bundle to 
    both "unbundle" and "bundle" -- but would this be too confusing
    and clutter the BD with unnecessary wire ?
    Thanks and Best Regards,
    -- J.

    Discovered a rather unpleasant side-effect of using the typedef constant cluster this way. Since I'm using the typedef cluster constant only to communicate the definition of the cluster--not for dataflow--I resize it to the same size and shape as a control terminal.  This is shown in the first image, clean_cluster.jpg.   I was also careful to set "Autosizing->None" on all instances of the constant.   However, when I updated the typedef, all instances of the constant resized!  That's dozens of instances across many different VIs.   Why was "Autosizing->None" not respected?  Is there a better way to do this so that my BD doesn't get messed up if I have to update a typedef such as this one?
    Thanks and Best Regards,
    J. 
    Attachments:
    clean_cluster.png 2 KB
    Cluster_Updated.png 3 KB
