Check-in all documents via superadmin?

Hello Forum,
we have a problem with EP 7. We are looking for a way for an administrator to undo the check-out of checked-out documents.
We logged in to the portal with the Super Admin role and tried the option "Show all checked-out documents".
Unfortunately, only the documents checked out by the logged-in user are displayed.
Do you know what could cause this behavior?
Greets
Stefan

Hi Stefan,
I'm not sure which "Show all checked-out documents" option you are referring to; in any case, the task you describe can and should be done via the Resource Locks report - http://help.sap.com/saphelp_nw2004s/helpdata/en/14/a4d14030490b06e10000000a155106/frameset.htm
Hope it helps
Detlev

Similar Messages

  • How to export all documents of a Solution Manager Project?

    Hello,
I am currently trying to export all documents that exist inside a Solution Manager project (which documents these are, I can see via SI80 -> select project). That means I need to export between 300 and 7,000 documents from SAP to an MS Windows directory.
    Is there any way to do this automatically?
    My approaches so far:
    1) Use the "Knowledge Workbench" tool to connect to the Knowledge Warehouse of the Solution Manager and "check out" all documents to the specific project.
Problem: The resulting local folder structure is horrifying, and the documents are not named by their titles as seen in SAP but after the template that was used to create them, so flattening the folder structure would result in many, many name collisions.
    2) Using SOLAR02 to get an overview of all documents via "Environment -> Project Analysis -> Assignments -> Documentation", and selecting everything.
    Problem: I found no download option here, only an overview of all documents.
    The forum threads:
    [Re: download all documents in SOLAR01]
    [Mass download of documents from SM to PC]
    didn't help either.
    I would appreciate any help.
    Thanks!
    Edited by: Andreas Seemann on Mar 2, 2011 2:15 PM

    Hello Andreas,
    You might take a peek at this link:
    Export of SOLMAN-project
    SOLAR_PROJECT_ADMIN can also copy documentation, and there is more information on the SAP Knowledge Workbench.
    Using it for the first time is a bit cryptic, so reading SAP Notes 336968, 196078 and 817529 is recommended.
    Perhaps worth a look?
    Regards,
    Paul

  • Unable to view all documents that have no checked in version due to exceeding the library threshold limit

    Due to exceeding the library threshold limit, we are unable to view all documents that have no checked-in version.

    Hi niles, here is a page that details how to retrieve all the files and take ownership of them via PowerShell so they can be checked in:
    http://www.demantprasad.com/2013/05/manage-files-which-have-no-checked-in.html
    Mathieu Desmarais http://www.matdesmarais.com/

  • Credit Limit Check - Exclude Specific Documents / Credits from Exposure

    Hi,
    I would like to know if it's possible to exclude specific document types (credits) from the credit exposure calculation used during credit limit checks.
    The idea is to exclude "DG" credits in the customer GL that are related to Trade Deals/Agreements, to avoid over-exposure.
    Let me explain:
    Currently, our credit management is configured to "Static" + "Open Items" for all credit groups in the CCA. Therefore, SAP calculates the credit exposure as:
              CE = OPEN ORDERS (S066) + OPEN DELIVERIES (S067) + OPEN BILLING (S067) + OPEN ITEMS (BSID)
    Regarding "Open Items", I understand that SAP calculates the net value between all debits (receivables) and credits (returns, overpayments, etc...) already posted and not cleared in the Customer GL account. This rationale is correct since the credits are due to the customers and should be deducted from their exposure (i.e. "you may not pay them if you don't receive").
    However, in our case, we also pay trade deal contracts (i.e. trade incentives and trade marketing investments) directly to customers, separately from sales orders or credit memos. The flow is the following: (1) contracts are managed in a satellite system and interfaced to SAP; (2) contracts are created in the customer GL as a credit (doc. type "DG"); (3) credits are sent to accounts payable for payment via F110.
    The problem is that the volume of contract credits is significant and is impacting the customers' credit exposure and limit check, resulting in greater exposure than allowed. That is, orders are released while contract credits are open in the customer GL, up to the maximum exposure, but when the contracts are paid the exposure goes above the limit. For example:
    Day 1:
              Customer Credit Limit: 100
              Sales Balance: 30
              Open Items: 50 (receivables: 70 / contracts: -20)
              Credit Exposure: 80/100
    Day 2:
              Customer Credit Limit: 100
              Sales Balance: 50
              Open Items: 50 (receivables: 70 / contracts: -20)
              Credit Exposure: 100/100 (no additional orders are released)
    Day 3:
              Customer Credit Limit: 100
              Sales Balance: 50
              Open Items: 70 (receivables: 70 / contracts: 0 - contracts are paid on day 3)
              Credit Exposure: 120/100 (exposure is above the customer limit since the contracts that were being deducted have been paid)
    Any ideas or thoughts? Is something wrong in the explanation above? Should contract credits be treated through a special G/L?
    Many thanks!
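
    (For illustration only: the sketch below reproduces the exposure arithmetic described above in plain Java, so the jump from day 2 to day 3 is easy to follow. The class and method names are invented for this sketch and are not an SAP API.)

    public class CreditExposureSketch {
        // CE = open orders (S066) + open deliveries (S067) + open billing (S067) + open items (BSID)
        static double exposure(double salesBalance, double receivables, double contractCredits) {
            // salesBalance stands in for open orders + deliveries + billing;
            // open items are the net of receivables and the DG contract credits
            return salesBalance + (receivables + contractCredits);
        }

        public static void main(String[] args) {
            double limit = 100;
            // Day 2: sales balance 50, receivables 70, open DG contract credits -20
            System.out.println("Day 2: " + exposure(50, 70, -20) + "/" + limit); // 100.0 -> at the limit
            // Day 3: contracts are paid via F110, so the -20 credit no longer reduces the open items
            System.out.println("Day 3: " + exposure(50, 70, 0) + "/" + limit);   // 120.0 -> over the limit
        }
    }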

    Hello Julie,
    regarding the second point, system behavior: how do you interact with the credit limit check (the yes/no question) via DI?
    Workaround:
    Before you issue the document, you may do a manual check of the BP balance against the credit limit and make a decision.
    Regards
    János

  • How to use the XML Publisher API (e.g. Delivering Documents via e-Mail)?

    Dear All,
    How do I use the XML Publisher API?
    The user's guide only ever shows the API code, for example "Delivering Documents via e-Mail":
    // create delivery manager instance
    DeliveryManager dm = new DeliveryManager();
    // create a delivery request
    DeliveryRequest req = dm.createRequest(DeliveryManager.TYPE_SMTP_EMAIL);
    // set email subject
    req.addProperty(DeliveryPropertyDefinitions.SMTP_SUBJECT, "Invoice");
    // set SMTP server host
    req.addProperty(DeliveryPropertyDefinitions.SMTP_HOST, "mysmtphost");
    // set the sender email address
    req.addProperty(DeliveryPropertyDefinitions.SMTP_FROM, "[email protected]");
    // set the destination email address
    req.addProperty(DeliveryPropertyDefinitions.SMTP_TO_RECIPIENTS, "[email protected], [email protected]");
    // set the content type of the email body
    req.addProperty(DeliveryPropertyDefinitions.SMTP_CONTENT_TYPE, "text/html");
    // set the document file name that appears in the email
    req.addProperty(DeliveryPropertyDefinitions.SMTP_CONTENT_FILENAME, "body.html");
    // set the document to deliver
    req.setDocument("/document/invoice.html");
    // submit the request
    req.submit();
    // close the request
    req.close();
    It does not say how to actually put this code to use in the application.
    Has anybody used this API before?
    Please tell me how to use it, thanks!
    BY Emily_ye
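
    (For a first test outside a concurrent program, the guide snippet above can be wrapped in a small standalone class like the sketch below and run with the XML Publisher delivery libraries on the classpath. The class name, SMTP host, addresses and file path are placeholders to replace with your own values; only the delivery calls themselves come from the guide excerpt.)

    import oracle.apps.xdo.delivery.DeliveryManager;
    import oracle.apps.xdo.delivery.DeliveryPropertyDefinitions;
    import oracle.apps.xdo.delivery.DeliveryRequest;

    public class SendInvoiceByMail {
        public static void main(String[] args) {
            try {
                // create delivery manager instance
                DeliveryManager dm = new DeliveryManager();
                // create an SMTP delivery request
                DeliveryRequest req = dm.createRequest(DeliveryManager.TYPE_SMTP_EMAIL);
                // mail header: subject, SMTP server, sender and recipient (placeholders)
                req.addProperty(DeliveryPropertyDefinitions.SMTP_SUBJECT, "Invoice");
                req.addProperty(DeliveryPropertyDefinitions.SMTP_HOST, "mysmtphost");
                req.addProperty(DeliveryPropertyDefinitions.SMTP_FROM, "sender@example.com");
                req.addProperty(DeliveryPropertyDefinitions.SMTP_TO_RECIPIENTS, "recipient@example.com");
                // content type of the mail body and the file name shown in the mail
                req.addProperty(DeliveryPropertyDefinitions.SMTP_CONTENT_TYPE, "text/html");
                req.addProperty(DeliveryPropertyDefinitions.SMTP_CONTENT_FILENAME, "body.html");
                // the document to deliver; must exist on the machine running this class
                req.setDocument("/document/invoice.html");
                // send the mail and release the request
                req.submit();
                req.close();
            } catch (Exception e) {
                // DeliveryException (and possibly others) can be thrown by the calls above
                e.printStackTrace();
            }
        }
    }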

    Hi Emily
    I had the same question. After much research and a lot of deduction I produced the following:
    import oracle.apps.fnd.cp.request.*;
    import java.io.*;
    import java.sql.*;
    import java.util.Vector;
    import oracle.apps.fnd.util.*;
    import oracle.apps.xdo.XDOException;
    import oracle.apps.xdo.common.pdf.util.PDFDocMerger;
    import oracle.apps.xdo.delivery.DeliveryException;
    import oracle.apps.xdo.delivery.DeliveryManager;
    import oracle.apps.xdo.delivery.DeliveryPropertyDefinitions;
    import oracle.apps.xdo.delivery.DeliveryRequest;
    import oracle.jdbc.driver.OracleCallableStatement;
    public class RunTravProgram implements JavaConcurrentProgram {
    CpContext mCtx; // global reference to concurrent program context
    LogFile logFile; // global reference to context logfile
    OutFile outFile; // global reference to context outfile
    Connection mConn = null;
    ReqCompletion lRC;
    //File Separator
    private String mFileSeparator;
    // globals for template
    String XDOAppShortName = "";
    String XDOtemplateCode = "";
    // hard-wired constants for template addition
    final String XDOLanguage = "en";
    final String XDOTerritory = "US";
    final String XDOFinal_format = "PDF";
    final String XDOtemplateType = "TEMPLATE_SOURCE";
    String PDFFile = "";
    String outFilePath = "";
    String progShortName = "";
    String progDesc = "";
    Integer iRequestID = 0;
    String sWatermark = ""; // watermark text
    String emailAddress = ""; // Not Implemented
    String emailServer = "";
    public static final String M_SUCCESS = "SUCCESS";
    public static final String M_ERROR = "ERROR";
    public static final String M_WARNING = "WARNING";
    /**
    * Create a Java FND ConcurrentRequest object to call fnd_request.submit_request.
    * The first three parameters are:
    * Application Short Name -- application short name (e.g. WAHC)
    * Current Program Short Name -- concurrent program being called
    * Current Program Description -- description for the above
    * These should be the first three parameters passed by the concurrent
    * program, in this order. The next two are constants set to null.
    * These are followed by the parameters passed by the first concurrent
    * program that are being passed to the next concurrent program.
    * I am limiting the parameter list to ten for now.
    */
    // Dynamic PLSQL statement used to get a concurrent request completion status
    // This is necessary because the java class does not provide this method :-(
    String mGetCompleteStatus =
    "DECLARE R_VAL BOOLEAN; " + "b_phase VARCHAR2 (80) := NULL; " +
    "b_status VARCHAR2 (80) := NULL; " +
    "b_dev_phase VARCHAR2 (80) := NULL; " +
    "b_dev_status VARCHAR2 (80) := NULL; " +
    "b_message VARCHAR2 (240) := NULL; " + "BEGIN " +
    "r_val := fnd_concurrent.wait_for_request (:1,5,1000," +
    "b_phase,b_status,b_dev_phase,b_dev_status,b_message); " +
    ":2 := b_phase; " + ":3 := b_status; " + ":4 := b_message; " + "end;";
    public RunTravProgram() {
    // no constructor logic necessary for now
    }
    /**
    * Concurrent Processing provides an interface 'JavaConcurrentProgram' with the abstract method
    * runProgram(), which passes the concurrent processing context 'CpContext'. The concurrent
    * program developer implements all of the business logic for a concurrent program in
    * runProgram(). The main() method, implemented by AOL, calls runProgram() after
    * performing all of the required initialization for the concurrent program, including
    * establishing a database connection, initializing the required contexts, and setting up
    * the log and output files. CpContext has the request-specific log and output
    * file methods.
    */
    public void runProgram(CpContext pCpContext) {
    mCtx = pCpContext;
    OracleCallableStatement lStmt = null;
    boolean bCompletion = true;
    String sPhase = "";
    String sStatus = "";
    String sMessage = "";
    //get handle on request completion object for reporting status
    lRC = pCpContext.getReqCompletion();
    // assign logfile
    logFile = pCpContext.getLogFile();
    // assign outfile
    outFile = pCpContext.getOutFile();
    // assign fileseparator
    mFileSeparator = getFileSeparator();
    // get the JDBC connection object
    mConn = pCpContext.getJDBCConnection();
    outFilePath =
    ((new File(outFile.getFileName())).getParent() == null ? "" :
    (new File(outFile.getFileName())).getParent() +
    mFileSeparator);
    logFile.writeln("OutFile File Path: -> " + outFilePath, 0);
    // get parameter list object from CpContext
    // these come from the concurrent program
    ParameterList lPara = pCpContext.getParameterList();
    // create a temporary array and retrieve the parameters created by
    // the program. Currently limiting the number of parameters to 10 for now
    String pvals[] = new String[10];
    int pcount = 0;
    while (lPara.hasMoreElements()) {
    NameValueType aNVT = lPara.nextParameter();
    pvals[pcount] = aNVT.getValue();
    pcount++;
    if (pcount > 9)
    break;
    }
    // send parameter values to the log file
    logFile.writeln("Arg 1: APPL_SHORT_NAME -> " + pvals[0], 0);
    logFile.writeln("Arg 2: CURR_PROG_SHORT_NAME -> " + pvals[1], 0);
    logFile.writeln("Arg 3: CURR_PROG_DESCRIPTION -> " + pvals[2], 0);
    logFile.writeln("Arg 4: TEMPLATE_CODE -> " + pvals[3], 0);
    logFile.writeln("Arg 5: P_PLANT_CODE -> " + pvals[4], 0);
    logFile.writeln("Arg 6: P_BATCH_NO -> " + pvals[5], 0);
    logFile.writeln("Arg 7: P_SHOW_PROMISE -> " + pvals[6], 0);
    logFile.writeln("Arg 8: P_WATERMARK -> " + pvals[7], 0);
    XDOtemplateCode = pvals[3]; // store the template name globally
    progShortName = pvals[1]; // store the program short name globally
    XDOAppShortName = pvals[0]; // store the application short name
    sWatermark = pvals[7]; // store the watermark globally
    progDesc = pvals[2];
    try {
    // create a concurrent request object
    ConcurrentRequest cr = new ConcurrentRequest(mConn);
    // use the parameters to call fnd_request.submit_request
    cr.addLayout(XDOAppShortName, XDOtemplateCode, XDOLanguage,
    XDOTerritory, XDOFinal_format);
    Vector param = new Vector();
    param.add(pvals[4]); // plant code
    param.add(pvals[5]); // batch ID
    param.add(pvals[6]); // Show SO info flag
    iRequestID =
    cr.submitRequest(XDOAppShortName, progShortName, progDesc,
    null, false, param);
    mConn.commit();
    // send the request ID to the log file
    logFile.writeln("-- Request ID: ->" + Integer.toString(iRequestID),
    0);
    // call fnd_concurrent.wait_for_request to wait until the request
    // has ended - use this to check the request status before proceeding
    lStmt =
    (OracleCallableStatement)mConn.prepareCall(mGetCompleteStatus);
    lStmt.setInt(1, iRequestID);
    lStmt.registerOutParameter(2, java.sql.Types.VARCHAR, 0, 255);
    lStmt.registerOutParameter(3, java.sql.Types.VARCHAR, 0, 255);
    lStmt.registerOutParameter(4, java.sql.Types.VARCHAR, 0, 255);
    lStmt.execute();
    // get the results of the completion
    sPhase = lStmt.getString(2);
    sStatus = lStmt.getString(3);
    sMessage = lStmt.getString(4);
    lStmt.close();
    // send the results of the request processing to the log file
    logFile.writeln("-- Phase: -> " + sPhase, 0);
    logFile.writeln("-- Status: -> " + sStatus, 0);
    logFile.writeln("-- Message: -> " + sMessage, 0);
    // test here to make sure it completed correctly
    if (sPhase.equals("Completed") && sStatus.equals("Normal")) {
    // construct the PDF file name generated by the called request
    PDFFile = progShortName + "_" + iRequestID + "_1.pdf";
    // add a watermark to the generated PDF
    // create an output stream for the existing PDF
    // and set ouput to append
    OutputStream pdfout =
    new FileOutputStream(outFilePath + PDFFile, true);
    // create an inputstream array (required by calling method)
    InputStream pdfin[] = new InputStream[1];
    pdfin[0] = new FileInputStream(outFilePath + PDFFile);
    // add the watermark passed as a parameter
    bCompletion = addWatermark(pdfin, pdfout);
    // assign the modified file to the context out
    // this will print using this request
    if (bCompletion)
    outFile.setOutFile(outFilePath + PDFFile);
    // release the connection object
    // and set the completion status for the request
    if (bCompletion) {
    pCpContext.getReqCompletion().setCompletion(ReqCompletion.NORMAL, M_SUCCESS);
    } else {
    lRC.setCompletion(ReqCompletion.WARNING, M_WARNING);
    }
    pCpContext.releaseJDBCConnection();
    }
    } catch (SQLException s) {
    logFile.writeln("SQL Error: Exception thrown w/ error message: " +
    s.getMessage(), 0);
    lRC.setCompletion(ReqCompletion.WARNING, M_WARNING);
    pCpContext.releaseJDBCConnection();
    } catch (IOException ioe) {
    logFile.writeln("IO Error: Exception thrown w/ error message: " +
    ioe.getMessage(), 0);
    lRC.setCompletion(ReqCompletion.WARNING, M_WARNING);
    pCpContext.releaseJDBCConnection();
    } catch (Exception e) {
    logFile.writeln("General Exception: " + e.getMessage(), 0);
    lRC.setCompletion(ReqCompletion.WARNING, M_WARNING);
    pCpContext.releaseJDBCConnection();
    } finally {
    try {
    if (lStmt != null)
    lStmt.close();
    pCpContext.releaseJDBCConnection();
    } catch (SQLException e) {
    logFile.writeln(e.getMessage(), 0);
    lRC.setCompletion(ReqCompletion.WARNING, M_WARNING);
    }
    }
    }
    /**
    * addWatermark()
    * @param pdfin
    * @param pdfout
    * @return boolean
    * This method will work for an existing document or a newly generated
    * one. Set the outputstream append flag to false for a new document
    * and true for an existing one.
    * NOTE: PDFDocMerger requires an inputstream array even if it only
    * contains one document.
    */
    private boolean addWatermark(InputStream[] pdfin, OutputStream pdfout) {
    if (!sWatermark.equals("")) {
    try {
    PDFDocMerger docMerger = new PDFDocMerger(pdfin, pdfout);
    //docMerger.setTextDefaultWatermark(sWatermark);
    docMerger.setTextWatermark(sWatermark, 80f, 50f);
    docMerger.setTextWatermarkAngle(25);
    docMerger.setTextWatermarkColor(1.0f, .50f, .50f);
    docMerger.setTextWatermarkFont("Garamond", 100);
    docMerger.process();
    docMerger = null;
    return true;
    } catch (XDOException e) {
    logFile.writeln("Watermark process Failed: " + e.getMessage(), 0);
    return false;
    }
    }
    return true;
    }
    /**
    * Returns the file separator
    */
    private String getFileSeparator() {
    return (System.getProperty("file.separator"));
    }
    /**
    * EBSEmailDelivery
    * @return
    * Just for testing right now.
    */
    private boolean EBSEmailDelivery() {
    if (!emailAddress.equals("")) {
    try {
    // create delivery manager instance
    DeliveryManager delMgr = new DeliveryManager();
    // create a delivery request
    DeliveryRequest delReq =
    delMgr.createRequest(DeliveryManager.TYPE_SMTP_EMAIL);
    // set email subject
    delReq.addProperty(DeliveryPropertyDefinitions.SMTP_SUBJECT,
    "EBS Report:" + progDesc +
    " for request: " + iRequestID);
    // set SMTP server host
    delReq.addProperty(DeliveryPropertyDefinitions.SMTP_HOST,
    emailServer); // need to supply the email smtp server
    // set the sender email address
    delReq.addProperty(DeliveryPropertyDefinitions.SMTP_FROM,
    emailAddress);
    // set the destination email address
    delReq.addProperty(DeliveryPropertyDefinitions.SMTP_TO_RECIPIENTS,
    emailAddress);
    // set the content type of the email body
    delReq.addProperty(DeliveryPropertyDefinitions.SMTP_CONTENT_TYPE,
    "application/pdf");
    // set the document file name appeared in the email
    delReq.addProperty(DeliveryPropertyDefinitions.SMTP_CONTENT_FILENAME,
    PDFFile);
    // set the document to deliver
    delReq.setDocument(outFilePath + PDFFile);
    // submit the request
    delReq.submit();
    // close the request
    delReq.close();
    return true;
    } catch (DeliveryException de) {
    logFile.writeln("email process Failed: " + de.getMessage(), 0);
    return false;
    }
    }
    return true;
    }
    }
    This is the class for a JCP I created to perform the following:
    1) Launch an existing Concurrent Program that produces PDF output
    2) Grab the PDF and apply a watermark based on user input or conditions
    3) Associate the modified PDF with the concurrent program output for PASTA printing
    It isn't elegant, but it is fairly simple. I added the email capability and tested it, but I am not implementing it at the present time.
    There is a fair amount of information out there that explains how to create a Java concurrent program, but very little that demonstrates the class needed.
    I hope this helps

  • All documents are getting modified automatically in a Doc-Lib if one doc is modified

    Hi,
    I have an issue with my document library: if I modify one document, the modified date changes for all documents in that library. I am not sure what the issue is. Did anyone face this type of problem? Please let me know.

    Hi cnivass,
    Please go to the library and click Library ribbon -> Workflow Settings, and check whether any workflows are attached. Also check whether there are any event receivers on this library.
    In addition, to narrow the issue down, please create a new library, test again in the new library, and compare the results.
    Then create a new library in another site, test again, and compare the results.
    I hope this helps.
    Thanks,
    Wendy
    Wendy Li
    TechNet Community Support

  • Unable to clear a GR document via f.13

    Hi,
    My user is unable to clear a GR document via F.13, since the IR document has already been cleared with another document.
    It was noted that the GR document does not have a matching document.
    The user tried to clear the GR document manually via F-03 by manually entering an offsetting clearing item, but this leaves the clearing document open.
    Please advise.

    Hi,
    The credit entry in the GR/IR account is posted at the time of the GRN, and the second, debit entry in the GR/IR account is posted at the time of the invoice; see the entry below:
    Debit GR/IR 690
         Cr. A/P (vendor) 690
    The GR/IR account is an open-item-managed account, so you can see the debit 690 and the credit 690 in it, and these are what you have to clear in F.13. Please check and confirm.
    You can also view the PO history in ME23N to verify whether the invoice has been posted. If it has, click on the invoice document and then on the follow-on FI document, which will show you which account has been debited (this should be GR/IR with 690).
    Your vendor invoice will be cleared once you pay it through F110/F-53/F-58.
    Regards
    Javed

  • Open all documents in the same window ?

    How can I configure Reader 9 to open all documents in the same window instead of as individual windows?
    Thanks

    Bug report sent to Adobe:
    ******BUG******
    Concise problem statement:
    MDI removed from Reader 9
    Steps to reproduce bug:
    1. Open more than one document.
    2. Look at taskbar overflowing.
    3.
    Results: Customer is pissed off.
    Expected results: Happy customer.
    Steps to make customer happy: Get your heads out of your ***, and make MDI not just an option, but the DEFAULT. Who the hell thought going back to Windows 3.0 SDI interface technology was a good idea? Please post his/her name, and the name of whoever approved that decision, publicly, because a) they will (or should) be looking for a new job since they just got fired, and b) I don't want to make the mistake of hiring them. I mean seriously. I thought what I was reading on the web was a hoax. I had to check snopes.com to be sure. Do you guys not have a random drug testing policy? For that matter, do you not realize that when a person you're trusting to make important decisions that affect millions of people is sitting at their desk drooling on themselves and can't get out more than a single-syllable word, that maybe, just maybe, they're not the person you should have in that position? Is this a product-line-wide decision? Should I look at scrapping all of my other Adobe / Macromedia products and upgrading to those that support 15-year-old technology? Are you going to turn into another Symantec? Hey, let's take over great companies with great products, turn the product lines into crap, and then discontinue and abandon them. Seriously, what should we expect for Reader 10.0? Are you going to be upgrading to a console application? Please, please, please, for 11.0 can we have it be phosphorous green too? I think I'm going to go back to my bottle of wine now.

  • Can I have "All spots to process" checked at all times, even for new spot colors?

    When I check “All spots to process” in the PDF export settings and save my settings, the settings remember that I've checked this option. But if new spot color objects using new spot color swatches are added to the document (or another document) and I go into the PDF export settings, the check mark has been changed into a dash (with the actual checkbox highlighted) – signifying that only some of the spot colors will be changed to process colors during export. I absolutely fail to see how this could possibly be seen as a feature and not a bug … if the user has checked “ALL spots to process”, wouldn't the user expect ALL spots to be converted to process colors, rather than just any spot colors that happened to be in the document that happened to be open when the user first checked that checkbox and saved that setting?
    Am I missing something here? What's the point of even having that checkbox as part of your saved export settings if it doesn't include any other spot colors than those used when saving the settings?
    What's the point of having settings if you can't trust them, and still need to manually "override" them every time?
    I see that some users have taken to writing scripts that instead turn all spot colors in the swatch panel to process colors, and while I commend them for creating that workaround, I'm still pissed at Adobe for not getting the function right.
    If this is a feature, who is it for? People who want to add just certain spot colors and turn those into process colors rather than turning all spot colors into process colors are surely better off doing that in the swatches panel, where they're in total control of what's what. And if they don't want to "permanently" change their spot colors to process colors, and prefer to (temporarily) convert them during exporting/printing only, they can do that in the ink manager. But when someone checks convert "All spots to process" couldn't we safely assume they really want ALL spot colors to be converted and not just some of them? I mean, the way that checkbox behaves now, it's like it's a button and not a checkbox. As in: hit the button "All spots to process" to switch all currently viewed spot colors to process colors in the ink manager, OR check the "All spots to process" checkbox to always convert ALL spot colors to process colors during exporting/printing.
    Anyone got any light to shed on this?
    And is there a way to actually get the advertised behavior, because if you have to run a script every time you export/print you might as well just manually select the checkbox every time instead, but either way it's just really unnecessary as far as I'm concerned … Adobe should get the feature right instead.
    If you save a setting and recall it, it shouldn't be possible for that setting to change into something else (in this case changing a checkmark to a dash).
    Clearly CMYK printing is the norm, so for most users it would make a lot of sense to have the "All spots to process" checked most of the time, and then you just go into the swatches panel or the ink manager and set things correctly for those print jobs that really do need spot colors.
    I myself am not one of those who add spot colors to my swatches unless I'm really using them as spot colors, but I often work with magazines and folders featuring adverts made by whoever, and typically there's always at least one advert that features spot colors, and therefore it would be very nice if the "All spots to process" feature actually worked as advertised without any required actions from me.
    We stopped sending ads back to the advertisers for adjustments a long time ago, unless we absolutely had to, because there were so many things wrong with so many ads that it was simply too much work to write back and explain everything to people who most of the time didn't even understand what we were talking about. We found that it was usually a LOT faster and easier to just adapt the ads ourselves, as long as it was something that could be worked out really quickly from within InDesign itself, which pretty much included most typical errors.
    But with this feature I find Adobe is trying to make my job harder rather than easier, and it's pissing me off. Arrrghh… ;-)

    But it's not a preference, it's a shortcut
    It's a bad joke, is what it is. ;-)
    So, why in your opinion should it be presented the way it is? I keep saying that in its current functionality it shouldn't be presented the way it is (and that, if it is, it shouldn't work the way it does). If it's not a preference or even a proper checkbox, why present it that way?
    If you put it right next to the table at the top of the window (so that it's directly associated with that information, rather than with the information right above it), just called the checkbox “Spot(s) to process”, and had it only visually reflect the content of the selected spot colors in the table, then I'd see your point in likening it to the “Hyphenate” checkbox.
    If a story has two selected paragraphs that use two different hyphenation settings, then the checkbox should present the way it does now; but if you hit the checkbox so that both paragraphs now use hyphenation and then create a third paragraph in between the two previous ones, it had better inherit that setting and not turn off hyphenation for the new paragraph (unless of course there's a defined next paragraph style that switches to a style with hyphenation turned off). And if that checkbox said “Hyphenate all paragraphs” instead, then I would expect it to do just that: not just the selected paragraphs, and not just the current paragraphs but quite literally all paragraphs, even newly created ones – otherwise it doesn't do what it says it does, and simply shouldn't be labeled that way.
    And seriously bad interface design aside, you'd have to rename “All spots to process” to “Switch all currently displayed spot swatches listed in the table above to process” to actually describe what that checkbox does. So even if you're a fan of the current functionality, as opposed to one that actually lets the user set and forget a setting like that, and think it's better that users manually check it repeatedly (which I'm not saying that you are, but you're not giving me any feedback suggesting you even see my point of view with any of this, so what do I know?), then why wouldn't you still support an interface that visually matches/signals that functionality better? If it's a “Select all” checkbox supplementing a table containing a column of checkboxes, then present it that way. Don't put it at the bottom of the window next to another checkbox that works just like a regular checkbox and label it “All spots to process” – because that way you are signalling a different behavior.
    Seriously, if I were to do design using the same mentality that Adobe uses when designing their user interfaces, it wouldn't be long before I lost all my clients. There's a lot to be said for de facto monopolies, I suppose. Oh no, there's nothing wrong with the design, just as long as you accept it on its own terms and don't compare it to anything relevant, and just as long as you give people enough time to understand and accept it … and surrender to it.
    For real … I wouldn't win one single pitch that way.
    Today's threads have in many ways been a thorough reminder of the following quote from the second link I provided:
    Is there an Internet rule yet stating that even the most obviously indefensible mistake will eventually be defended by someone somewhere? Awful marketing efforts get explained as genius viral campaigns, broken features become solutions.
    And whether or not you're able to see my point of view is really beside the point too.
    The real point was, and remains to be:
    That for those who receive lots of ads or other external files that may or may not contain spot colors, it would be far more useful to be able to set a checkbox to always convert all spots to process when exporting than the current functionality is (and I'm not suggesting eliminating the current functionality, just changing it so it's presented as what it really is, and then letting that separate checkbox do what it says) … causing unnecessary manual action on the user's behalf shouldn't be the business of Adobe – preventing it should be.
    And here's further reading on the subject of bad Adobe interface design for those who might feel so inclined. ;-)
    Cheers!

  • Getting an error when trying to check in any document in SharePoint 2007. Error - "Unable to validate data. System.Web.Configuration.MachineKey.GetDecodedData(Byte[] buf, Byte[] modifier, Int32 start, Int32 length, Int32& dataLength) ...

    Hi Team,
    I am getting an error while trying to check in any document in SharePoint 2007. Error - "Unable to validate data. System.Web.Configuration.MachineKey.GetDecodedData(Byte[] buf, Byte[] modifier, Int32 start, Int32 length, Int32& dataLength).
    Could anybody please let me know how to resolve this issue? It has affected our 9 site collections with almost 2000+ sites. Please email me your suggestions, or share your ideas here as well.
    I tried resetting IIS, checked the Internet connection speed (8 MBps), and cleared the cache, but no luck.
    Thanks,
    Pronob

    Hello,
    I have found these support articles for you:
    http://support.microsoft.com/kb/556031
    http://technet.microsoft.com/en-in/library/cc298582%28v=office.12%29.aspx
    This may help you
    Hemendra

  • Error when trying to Cancel an invoice document via MR8M Transaction

    Greetings Everyone,
    I am having an issue when trying to cancel an invoice document via the MR8M transaction. It gives me the following error message:
    SYSTEM ERROR: error in routine MRM_DRSEG_CR_CREATE
    Message no.M8008
    Procedure:
    Contact your system administrator.
    This invoice was related to a check payment. The check has already been voided and there is no clearing document for the accounting document, so the item remains open. However, when I try to reverse it, it displays the error message above.
    Thanks in advance

    Read SAP Notes 449277 and 750455; your solution is probably there.

  • Error received while cancelling an invoice document via MR8M

    Dear Expert,
    A user posted an invoice incorrectly in the system, with posting date 01.10.2011 and invoice date 15.03.2011 (last fiscal year).
    Now, when the user tries to cancel the document via MR8M, he receives the error "Item 010 order 205460 budget exceeded" (in document item 010, order 205460, the budget for fiscal year 2010 was exceeded by 238,360.00 INR).
    The budget for the year 2010 has already been carried forward to the new year 2011. I am unable to find the reason why the user receives this error.
    Please advise on the same.
    Thanks
    Rohan

    Hi,
    I assume you transferred all the commitments from fiscal year 2010 to fiscal year 2011; in that case you will surely get the budget exceeded error message in the year 2011. You will have to transfer the budget of the year 2010 as well, or you may update the current budget. If you do not want the error message, you will have to first deactivate the project budget using transaction CJBW and then, after reversing, activate the project budget again using transaction CJBV.
    Hope it may help you.
    Kuber

  • When I attempt to Extend a PDF document via my LiveCycle server I get the error "Application Alert": The file could not be saved. Are you sure you want to lose your results?

    When I attempt to Extend a PDF document via my LiveCycle server I get the error "Application Alert": The file could not be saved. Are you sure you want to lose your results?
    I can login fine via this address.
    http://[Server Address]:[Port Number]/ReaderExtensions
    And I can apply the extension successfully, but when I attempt to download I receive the error above.
    Application Alert" The file could not be saved. Are you sure you want to lose your results?

    Please check the supported Flash Player version for LiveCycle on your client machine: http://bit.ly/1wGD9t4
    Upgrade or downgrade it accordingly.
    Thanks,
    Wasil

  • Check in new documents in DMS with specific access permissions

    Hi,
    we have an RFC that creates new documents in DMS.
    It calls the following function modules one after another:
    - CVAPI_DOC_CREATE
    - CVAPI_DOC_CHECKIN
    - BAPI_DOCUMENT_CHANGE2 for an additional classification of the new document
    Now we have a new request from our customer: to give the document specific access permissions.
    We tried the following:
    - Manually check in a document template with the necessary permissions.
    - The permissions are given in a classification ("O,MW-T-D*,IB,02/03/52/53").
    - This is called an "authority characteristic" and is checked somewhere else; I do not really know how this works in detail (but it works).
    - Check in a new document with a reference to the template, expecting that the new document gets the same classification and therefore the same access permissions.
    - If I do this manually in CV03N, it does work.
    - We do this with CVAPI_CHECK_IN_WITH_TEMPLATE, but this function module does not copy the classification (only the description and the attached original files), and the document number of the new document is a mandatory parameter, which is not allowed in our case since we use internal assignment of document numbers.
    My question is: is this a feasible way to create new documents with specific permissions?
    Is there a possibility to assign the permissions to the document type instead of assigning them to every single document of this document type?
    Thanks
    Kerstin
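
    (Side note, not a direct answer to the template question: if the classification ends up having to be set explicitly after check-in, the same BAPI_DOCUMENT_CHANGE2 call the RFC already uses can also be driven from outside the system via SAP JCo, as in the rough sketch below. The destination name, document key and class name are placeholders, and the import/table parameter and field names are assumptions here; take the exact BAPI signature from SE37 before using anything like this.)

    import com.sap.conn.jco.JCoDestination;
    import com.sap.conn.jco.JCoDestinationManager;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunction;
    import com.sap.conn.jco.JCoTable;

    public class SetDocumentClassification {
        public static void main(String[] args) throws JCoException {
            // "DMS_SYSTEM" is a placeholder destination configured via a .jcoDestination file
            JCoDestination dest = JCoDestinationManager.getDestination("DMS_SYSTEM");
            JCoFunction fn = dest.getRepository().getFunction("BAPI_DOCUMENT_CHANGE2");
            // document key of the freshly created DIR (parameter names assumed, check SE37)
            fn.getImportParameterList().setValue("DOCUMENTTYPE", "DRW");
            fn.getImportParameterList().setValue("DOCUMENTNUMBER", "10000000123");
            fn.getImportParameterList().setValue("DOCUMENTPART", "000");
            fn.getImportParameterList().setValue("DOCUMENTVERSION", "00");
            // assign the class that carries the authority characteristic; the characteristic
            // values themselves would go into the CHARACTERISTICVALUES table -- all table
            // and field names here are assumptions, verify them in SE37
            JCoTable classes = fn.getTableParameterList().getTable("CLASSALLOCATIONS");
            classes.appendRow();
            classes.setValue("CLASSNAME", "DOC_AUTH"); // placeholder class name
            classes.setValue("CLASSTYPE", "017");      // document class type
            fn.execute(dest);
            // commit the change in the backend
            JCoFunction commit = dest.getRepository().getFunction("BAPI_TRANSACTION_COMMIT");
            commit.execute(dest);
        }
    }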

    My guess is that at some point you propagated the ACL entry for "everyone deny delete" to all your folders and sub-folders and their contents by selecting Apply to All in a GetInfo window. Try doing a search in the Leopard forums for
    ACL chmod
    and you'll find a whole raft of discussions about the problem and suggestions for fixes.
    Francine Schwieder

  • SAP DMS - Unable to check in the document

    Hi All,
    I am using SAP ECC 6.0 and working with SAP DMS.
    When I create a DIR, I am not able to check the document in to the content repository.
    The following error appears as soon as I try to save the DIR:
    "Error while checking in and storing: D:\Documents and Settings\......\......"
    The error message number is 26 253.
    I have searched for the related notes on service.sap.com. However, all the related notes apply to releases from SAP 4.6B to ECC 5.00.
    Please help me fix this issue.
    Edited by: Rahul Pathak on Jan 28, 2008 11:35 AM

    Hi Rahul,
    How are you trying to check in the document?
    Suggestions:
    1) Which version of ACAD are you using, and which version of the ACAD integration are you using?
    If you are using ACAD integration version 4.0.0.0, please upgrade to ACAD integration version 4.5.0.0, which is available on the SAP Service Marketplace.
    2) In your configuration, are you trying to create a JPG conversion? If so, please remove the trigger CreateAutoNeutralFile from the configuration.
    Create a new drawing and test the check-in functionality and let me know the result.
    Regards,
    MRK
