MAXL Filter Import

I have a need to import security filters where a user may have multiple reads and/or writes. I have been unsuccessful thus far in creating an import of anything other than a "no_access on" filter. I would greatly appreciate an example of a no_access, read, write filter. I would also like any advice on multiple reads/writes within the same filter.

Thanks in advance,
Steve Wyatt

Try something like this:

create or replace filter Sample.Basic.test1
    read on '@IDESCENDANTS("Market")',
    write on '@IDESCENDANTS("East"), "Budget"',
    no_access on '@IDESCENDANTS("Bottle")';

-R
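On the multiple-reads/writes part of the question: filter rows are simply comma-separated inside a single create statement, one "access on 'member-spec'" row each, and the filter is then granted to a user. A minimal sketch along those lines (the filter name, member specs, and user name are invented for illustration):

create or replace filter Sample.Basic.multi_rw
    read on '@IDESCENDANTS("Market"), "Actual"',
    read on '"Sales", @CHILDREN("Qtr1")',
    write on '@IDESCENDANTS("East"), "Budget"',
    no_access on '@IDESCENDANTS("Product")';
grant filter Sample.Basic.multi_rw to SWyatt;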

Similar Messages

  • MAXL to IMPORT DATA

    Hi All,
    Can you please help me write a MaxL script for the following scenario?
    I have an ASO cube. I need to export level 0 data from it and then load it back into the cube; in between, I will have to update the dimensions. I know the MaxL for loading data and dimensions, and for exporting and importing, but
    when I export level 0 data to a location, the export may write to multiple files [2 GB each]. How can I write MaxL to import data from these multiple files? (I don't know how many files the export will produce.)
    Please help!
    Thanks.

    You could simply restructure 'in place' - you don't necessarily have to export / clear / restructure / reload.
    But if you do go the reload route look at this thread for some tips from Robert and Glenn on handling an unknown number of export files - Importing spanned export files in MaxL
    In BSO the 'import data' statement will now handle wildcards, e.g. 'export*.txt' but this functionality hasn't been added to the ASO version of the command (at least per the documentation; I haven't actually tested it).
    Glenn, yeah - lack of column export can be a major PITA. I've wasted more than a few hours of my life working through renamed members one-by-one in dozens of export files...
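    For what it's worth, the BSO wildcard form mentioned above would look roughly like this - a sketch only, with an invented file pattern, and per the docs it may not work on ASO:
    import database Sample.Basic data
        from data_file 'export*.txt'
        on error abort;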

  • Maxl to import datafolders

    I want to make a MaxL script to import all files in a specific folder. The files vary, but all have the same layout.
    The script I have created:
    set varPath = 'P:\\WM\\Finance & Control\\Controlling\\Business Control\\Latest Estimate\\LE 01\\2011\\upload\\';
    set varFileA = 'upload file1.xls';
    set varFileB = 'file2 upload.xls';
    set varFileC = 'file3 upl.xls';
    import database 'WM_T2'.'WM_T' data from data_file "$varPath$varFileA" using server rules_file 'testle1' on error append to "$varPath$varErrorfile";
    import database 'WM_T2'.'WM_T' data from data_file "$varPath$varFileB" using server rules_file 'testle1' on error append to "$varPath$varErrorfile";
    import database 'WM_T2'.'WM_T' data from data_file "$varPath$varFileC" using server rules_file 'testle1' on error append to "$varPath$varErrorfile";
    Is it possible to load all files in the folder without naming them in MaxL? For example:
    import database 'WM_T2'.'WM_T' data from data_file "$varPath*.xls" using server rules_file 'testle1' on error append to "$varPath$varErrorfile";
    Thanks in advance

    This is the script I made. The cmd file largely speaks for itself (comments translated from Dutch). Maybe other Essbase users can use it:
    set Username=
    set Password=
    set Essserver=
    set Application='WM_M'.'WM_MFIN'
    set Errorfile=errorlogLE1.txt
    set Logfile=WMEssbaselogLE1.txt
    REM ------------------ SPECIFY THE MAXL FILE TO GENERATE, THE INPUT FOLDER AND THE FILE TYPE (e.g. *.txt) --------------------
    set MaxlFile=
    set Inputdirectory=
    set Inputfiletype=
    REM ------------------ SPECIFY THE CALC SCRIPT TO RUN AFTER THE LOAD --------------------
    set Calculationscript='CalcLE'
    REM ------------------ SPECIFY THE DESTINATION FOLDER (2x) WHERE THE PROCESSED FILES AND THE LOG ARE MOVED --------------------
    md "W:\2011\verwerkt\verwerkt-%date%-%time%\"
    set outputdirectory=W:\2011\verwerkt\verwerkt-%date%-%time%\
    REM ---------------- THE MAXL SCRIPT IS GENERATED BELOW -----------------------
    ECHO /* MaxL script for uploading a folder of files */ > %MaxlFile%
    ECHO set timestamp on; >> %MaxlFile%
    ECHO spool on to '%outputdirectory%%Logfile%';>> %MaxlFile%
    ECHO echo '************* LOGGING IN TO ESSBASE *****************';>> %MaxlFile%
    ECHO login %Username% %Password% on %Essserver%; >> %MaxlFile%
    ECHO echo '************* UPLOADING *****************'; >> %MaxlFile%
    REM ---------------- LOOP OVER THE INPUT FOLDER; ONE IMPORT LINE IS GENERATED PER FILE ("delims=" keeps file names with spaces intact) --------------------------
    for /F "delims=" %%a in ('dir /b %inputdirectory%%inputfiletype%') do (
    ECHO import database %Application% data from data_file '%Inputdirectory%%%a' on error append to "%outputdirectory%%Errorfile%"; >> %MaxlFile%
    )
    ECHO /* Calculate database */ >> %MaxlFile%
    ECHO execute calculation %Application%.%Calculationscript%; >> %MaxlFile%
    ECHO echo '************* LOGGING OUT *****************';>> %MaxlFile%
    ECHO logout;>> %MaxlFile%
    ECHO set timestamp off; >> %MaxlFile%
    ECHO spool off; >> %MaxlFile%
    ECHO exit; >> %MaxlFile%
    REM ---------------- RUN THE GENERATED SCRIPT --------------------------
    essmsh -l %Username% %Password% -s %Essserver% %MaxlFile%
    REM ---------------- MOVE THE UPLOADED FILES AND OPEN THE LOG FILE WITH THE UPLOAD RESULT --------------------------
    MOVE /Y "%inputdirectory%%inputfiletype%" "%outputdirectory%"
    start notepad "%outputdirectory%%Logfile%"
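    For reference, the %MaxlFile% generated by a run over a folder containing two files would look something like this (file names and paths invented for illustration):
    set timestamp on;
    spool on to 'W:\2011\verwerkt\...\WMEssbaselogLE1.txt';
    echo '************* LOGGING IN TO ESSBASE *****************';
    login myuser mypassword on myserver;
    echo '************* UPLOADING *****************';
    import database 'WM_M'.'WM_MFIN' data from data_file 'P:\...\file1.txt' on error append to "W:\2011\verwerkt\...\errorlogLE1.txt";
    import database 'WM_M'.'WM_MFIN' data from data_file 'P:\...\file2.txt' on error append to "W:\2011\verwerkt\...\errorlogLE1.txt";
    /* Calculate database */
    execute calculation 'WM_M'.'WM_MFIN'.'CalcLE';
    echo '************* LOGGING OUT *****************';
    logout;
    set timestamp off;
    spool off;
    exit;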

  • MaxL for Import of Data & Dimension info using an ODBC source (Oracle based)

    See below for syntax issues using the SQL Interface with MaxL to load first a dimension, then data. (Anyone have experience with getting the exact syntax for this?) Thanks

    MAXL> import database ESSBASE_APPNAME.ESSBASE_DBNAME dimensions connect as ORACLE_USER identified by ODBC_DEFINITION using rules_file LdJob;
      6 - (1) Syntax error near end of statement.
     50 - MaxL compilation failed.

    MAXL> import database ESSBASE_APPNAME.ESSBASE_DBNAME data connect as ORACLE_USER identified by ODBC_DEFINITION using rules_file LdTurn;
      6 - (1) Syntax error near end of statement.
     50 - MaxL compilation failed.

    I think I got my error. I have to say 'import database ... dimensions from data_file ...' instead of 'import database ... data from data_file ...'. I don't know how to delete this post, so I am not deleting it.
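    For anyone landing here later, a hedged sketch of what the working SQL-interface statements might look like (application, user, password, and rules file names are placeholders; as far as I can tell the grammar wants the 'server' keyword before rules_file and an on-error clause at the end):
    import database AppName.DbName dimensions
        connect as ORACLE_USER identified by 'odbc_password'
        using server rules_file 'LdJob'
        on error append to 'LdJob.err';
    import database AppName.DbName data
        connect as ORACLE_USER identified by 'odbc_password'
        using server rules_file 'LdTurn'
        on error append to 'LdTurn.err';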

  • Camera RAW plugin, filter, importer for Premiere Pro CC

    I've been using it in Photoshop CC for H.264 (5D) video clips and the results were amazing. But the process of exporting to Photoshop as a video file, doing a color adjustment, rendering it, and importing back into Premiere is very long. And compared to SpeedGrade, the results look better in Camera RAW, and the plugin's interface is much simpler and friendlier.
    So why can't it just be inside Premiere (like the Three-Way Color Corrector), in the Effects section for example?
    Also, the ability to import DNGs from Blackmagic cameras or Magic Lantern RAW (5D3) using such a plugin would be amazing! I've seen a "preview4" demo of the Adobe DNG importer and it's really cool - it has Camera RAW source settings - but it crashes very often in Premiere CC.
    Will there be future development of this technology?

    That may be partially true, but not entirely. I know for a fact that some of the things I have requested have been implemented. Not because I asked for them, but because a lot of people did. I have also received emails from Adobe employees saying "Great Request! I want that too. I will make sure it gets on the list."
    I have also received emails telling me I am not the only one asking and I should feel free to read between the lines.
    I'm just saying that if you have a great request that they also want - remember they can't just automatically add stuff they want for themselves just because they are users also - you might get fast tracked. Or if you are one of many, you have a good chance at seeing it happen.
    In this case, keep in mind that Camera RAW just became a filter in Photoshop recently. It is an amazing and extremely welcome development, but it is new. If we mounted a campaign to get it into Premiere Pro, and got enough users behind it, perhaps we could get it done. In order to do that, we would probably have to make some amazing videos using Photoshop to do the job, post them, and then get people excited by the possibilities. As it stands, most of the Premiere Pro users have no idea what I am talking about. Why would they unless they are also photographers? I would not have known if it had appeared this time last year - before I bought my GH3.
    Everything you can do in Camera RAW you can do other ways as far as I can tell, but Camera RAW makes it so easy that it is obvious to me you are correct in asking for it in Premiere Pro. I haven't used it much, so I have not experienced the speed issues. I will see what I can do to play around with it over the next few days and see if I can determine a faster workflow. My guess is that you start with Camera RAW, open it in Ae and use Dynamic Link to put it into Premiere Pro, but that is just a guess. It might require the use of a digital intermediate. Put the entire clip into Photoshop, change the settings, and go to bed. Wake up in the AM and export. Let's see if we can experiment a little - everyone can help.

  • Filter Import by Number of Rating Stars and other such

    My work flow is to look at the card with Nikon ViewNX2 and assign rating stars to the keepers, then import the keepers
    into Lightroom. I can't see the stars in the import dialog, though I can see them once I do the import. LR clearly
    understands the rating stars because it can import them. I'd like to be able to see the rating stars in the import dialog and
    filter on them at import time. I suspect there are others out there who would like to filter on color labels and flags too.
    Why not use Lightroom for the review? I sometimes don't have the Lightroom computer with me, or any computer at all.
    Nikon ViewNX2 is free, and I can install it on whatever computer is handy. It's also fast, and understands dual monitors.
    Similar logic applies to BreezeBrowser and even the free Microsoft software.
    There was some discussion of this in http://forums.adobe.com/thread/724396?tstart=0.
    Chuck

  • Essbase - MaxL import dimensions

    Hi everybody,
    I have to update a dimension in my Essbase ASO application (metadata only). It is an incremental dimension load from a txt file containing all the elements, including those already in the dimension. I'm using the MaxL statement "import dimensions" with a server rules file. However, I'm wondering what the clause "preserve all data" does. What happens if I do not include this clause when importing dimensions? After the import, do I have to save the outline or launch a restructure?
    Thank you in advance guys,
    Maurizio

    Hi Celvin,
    thanks for your answer. So, if I launch the following MaxL script without the clause "preserve all data", could I lose all the data stored in the db?
    import database sample.basic dimensions
    from data_file 'test.txt'
    using rules_file 'test.rul'
    on error append to 'test.log'
    I tried it on ASOsamp and the data weren't cleared. Could there be some options at the application or server level that affect how this works? Could it be a safety clause in case of an abnormal shutdown during the import process?
    I know it's obviously better to always include the "preserve all data" clause, but at the moment I'm evaluating the functionality of some existing scripts and I want to understand how they work.
    Thank you very much
    Maurizio
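    For comparison, the same script with the clause included would read as follows (same file names as above; the preserve clause sits between the using clause and the on-error clause, matching the multi-build example later on this page):
    import database sample.basic dimensions
        from data_file 'test.txt'
        using rules_file 'test.rul'
        preserve all data
        on error append to 'test.log';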

  • Essbase JAPI maxl session gives NPE with import statement

    Anybody know why I get an NPE from the Java API when trying to use IEssMaxlSession to do an import? Here's my code snippet. I've verified the session works using the simpler MaxL command commented out below, and I know the import syntax is OK at the MaxL prompt. "Essbase Error(0)" isn't the most helpful diagnostic I've come across.
    Thanks
    IEssMaxlSession maxlSess = null;
    try {
        maxlSess = olapSvr.openMaxlSession("Maxl Test");
        try {
            String maxl;
            // maxl = "display database \"184_r\".rep";
            maxl = "import database \"184_a\".agg dimensions connect as \"admin\" identified by \"password\" using server rules_file '/TmpltRFs/RFs/Plan.rul' on error write to \"errlog.log\"";
            logger.debug(maxl);
            maxlSess.execute(maxl);
            printMessages(maxlSess.getMessages());
        } catch (EssException e) {
            printMessages(maxlSess.getMessages());
            logger.debug(e.getMessage());
            e.printStackTrace();
        }
    } catch (EssException e) {
        logger.debug(e.getMessage());
    }
    14:47:40 DEBUG (essbase.RunMaxl 109): Cannot execute maxl statement. Essbase Error(0): java.lang.NullPointerException
    com.essbase.api.base.EssException: Cannot execute maxl statement. Essbase Error(0): java.lang.NullPointerException
         at com.essbase.server.framework.EssOrbPluginDirect.ex_olap(Unknown Source)
         at com.essbase.server.framework.EssOrbPluginDirect.executeMaxlStatement(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMaxlMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)

    This is due to the unpublished Bug 12661416: MAXL STATEMENT IMPORT DB DIMENSIONS FROM RELATIONAL DATABASE FAILS WITH JAPI. The error is not fixed in the next release either.
    However, there is one workaround:
    "As a workaround we installed Essbase client 9.3.1 on the epm11 test box, copied the native libraries from the 9.3.1 admin server, and connected to the MaxL shell from the 9.3.1 maxljni interface. This seems to work fine."

  • Strange behavior when using servlet filter with simple index.htm

    I am new to J2EE development, so please tolerate my ignorance. I have a web application that starts with a simple index.htm file. I am using a servlet filter throughout the website to check for session timeout, redirecting the user to a session-expiration page if the session has timed out. When I do something as simple as loading the index.htm page in the browser, the .css file and the one image file referenced in the page are somehow corrupted and not rendered. How do I get the filter to ignore CSS and image files? Thank you!
    The servlet filter:
    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    public class SessionTimeoutFilter implements Filter {
         String[] excludedPages = {"SessionExpired.jsp", "index.htm", "index.jsp"};
         String timeoutPage = "SessionExpired.jsp";

         public void destroy() {
         }

         public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException {
              if ((request instanceof HttpServletRequest) && (response instanceof HttpServletResponse)) {
                   HttpServletRequest httpServletRequest = (HttpServletRequest) request;
                   HttpServletResponse httpServletResponse = (HttpServletResponse) response;
                   //httpServletResponse.setHeader("Cache-Control","no-cache");
                   //httpServletResponse.setHeader("Pragma","no-cache");
                   //httpServletResponse.setDateHeader("Expires", 0);
                   String requestPath = httpServletRequest.getRequestURI();
                   // note: != compares object references here, not String contents
                   boolean sessionInvalid = httpServletRequest.getSession().getAttribute("loginFlag") != "loggedIn";
                   System.out.println(sessionInvalid);
                   boolean requestExcluded = false;
                   System.out.println(requestExcluded);
                   for (int i = 0; i < excludedPages.length; i++) {
                        // the [i] index was eaten by the forum's italics markup
                        if (requestPath.contains(excludedPages[i])) {
                             requestExcluded = true;
                        }
                   }
                   if (sessionInvalid && !requestExcluded) {
                        System.out.println("redirecting");
                        httpServletResponse.sendRedirect(timeoutPage);
                   }
              }
              // pass the request along the filter chain
              chain.doFilter(request, response);
         }

         public void init(FilterConfig arg0) throws ServletException {
              //System.out.println(arg0.getInitParameter("test-param"));
         }
    }
    The index.htm file (or the relevant portion):
    <HTML>
    <Head>
    <META http-equiv="Content-Style-Type" content="text/css">
    <LINK href="RTEStyleSheet.css" rel="stylesheet" type="text/css">
    <TITLE>Login</TITLE>
    </HEAD>
    <BODY>
    <FORM NAME="Login" METHOD="POST" ACTION="rte.ServletLDAP"><!-- Branding information -->
    <table width="100%" border="0" cellpadding="0" cellspacing="0">
         <tr>
              <td width="30%" align="left"><img src="images/top_logo_new2.gif">
              </td>
              <td width="37%" align="center"></td>
              <td width="33%" align="right"></td>
         </tr>
    </table>
    My web.xml entry for the filter:
    <filter>
              <description>
              Checks for a session timeout on each user request, redirects to logout if the session has expired.</description>
              <display-name>
              SessionTimeoutFilter</display-name>
              <filter-name>SessionTimeoutFilter</filter-name>
              <filter-class>SessionTimeoutFilter</filter-class>
              <init-param>
                   <param-name>test-param</param-name>
                   <param-value>this is a test parameter</param-value>
              </init-param>
         </filter>
         <filter-mapping>
              <filter-name>SessionTimeoutFilter</filter-name>
              <url-pattern>/*</url-pattern>
              <dispatcher>REQUEST</dispatcher>
              <dispatcher>FORWARD</dispatcher>
         </filter-mapping>

    Hi,
    Try adding the CSS files and images to the excluded pages.

  • Filter does not work with *.jsp URL pattern???

    Hi All,
    I am, by no means, very good at JSF or Java. I have looked at various forum posts on here for ways to implement a security filter that intercepts requests to pages that first require one to be logged in and, if not, redirects them to the login page. Yes, I know a lot of you have heard this many times before, and I'm sorry to bring it up again.
    BUT, following the guidance of other posts, I have a filter that works fine when the URL pattern is set to "/faces/*" or "/<anything>/*"; however, it won't work for "*.jsp" or "*.<anything>".
    My filter is as follows:
    package test.security;
    import javax.faces.context.FacesContext;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.http.HttpSession;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    public class SecurityFilter implements Filter {
        /** Creates a new instance of SecurityFilter */
        private final static String FILTER_APPLIED = "_security_filter_applied";

        public SecurityFilter() {
        }

        public void init(FilterConfig filterConfig) {
        }

        public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws java.io.IOException, ServletException {
            HttpServletRequest req = (HttpServletRequest) request;
            HttpServletResponse res = (HttpServletResponse) response;
            HttpSession session = req.getSession();
            String requestedPage = req.getPathTranslated();
            String user = null;
            if (request.getAttribute(FILTER_APPLIED) == null) {
                // check if the page requested is the login page or register page
                if (!requestedPage.endsWith("Page1.jsp")) { /* This is the login page */
                    // set the FILTER_APPLIED attribute to true
                    request.setAttribute(FILTER_APPLIED, Boolean.TRUE);
                    // Check that the session bean is not null and get the session bean property username.
                    if (((test.SessionBean1) session.getAttribute("SessionBean1")) != null) {
                        user = ((test.SessionBean1) session.getAttribute("SessionBean1")).getUsername();
                    }
                    if ((user == null) || (user.equals(""))) {
                        // try {
                        //     FacesContext.getCurrentInstance().getExternalContext().redirect("Page1.jsp");
                        // } catch (ServletException ex) {
                        //     log("Error Description", ex);
                        // }
                        res.sendRedirect("../Page1.jsp");
                        return;
                    }
                }
            }
            // deliver request to next filter
            chain.doFilter(request, response);
        }

        public void destroy() {
        }
    }

    My web.xml declaration for the filter is:
    <filter>
      <description>Filter to check whether user is logged in.</description>
      <filter-name>SecurityFilter</filter-name>
      <filter-class>test.security</filter-class>
    </filter>
    <filter-mapping>
      <filter-name>SecurityFilter</filter-name>
      <servlet-name>Faces Servlet</servlet-name>
    </filter-mapping>
    Note: I have also tried this with <url-pattern>*.jsp</url-pattern> for the filter mapping in place of the Faces Servlet
    My web.xml declaration for the url pattern is:
    <servlet-mapping>
      <servlet-name>Faces Servlet</servlet-name>
      <url-pattern>*.jsp</url-pattern>
    </servlet-mapping>
    Which JSC/NetbeansVWP automatically creates a "JSCreator_index.jsp" which has:
    <?xml version="1.0" encoding="UTF-8"?>
    <jsp:root  version="1.2" xmlns:jsp="http://java.sun.com/JSP/Page">
      <jsp:forward page="Page1.jsp"/>
    </jsp:root>
    When run, this causes an Error 500 in the browser and a NullPointerException in SecurityFilter.java on the line:
    if (!requestedPage.endsWith("Page1.jsp")) { /* This is the login page */
    I think I'm missing something that would be obvious to anyone who knows better than me. Any ideas?

    Dear Ginger and Boris,
    thanks for the information - the problem seems to occur in EP7 as well; Boris told me it is fixed in SP15. We are on SP14 now, so there is hope!
    Actually, the information in the OSS note stated above is also true, as we have an Oracle DB. On a similar demo system (the only difference being a SQL DB) the hyphen search works!
    best regards, thank you !
    Johannes

  • Multiple DimBuilds w/Only One Restructure in MaxL?

    I am doing a series of dimension builds via load rules (v7.1.5). In this case I am building the entire dimension from scratch every time, but I want to preserve data, because there is forecast and plan data in this cube, not just actuals. My problem is this: in order to build completely from scratch, my first dimbuild load rule has "Remove Unspecified" turned ON. But that dimbuild does not include all the level 0 members I will end up adding to this dimension by the end of the process. And I cannot find a way to delay Essbase from performing the restructure until the last dimbuild. I have tried using the "suppress verification" option in MaxL's import dimension command, but it doesn't accomplish this. I cannot find anything in the MaxL docs that refers to this, and no one I work with has an answer. There has to be a way to do this, doesn't there? Otherwise I will have to abandon this "build from scratch" methodology and just leave old, dead members lying around in this dimension until they are removed manually.

    Thanks,

    James

    James:

    What's important here is that ALL of the dimension builds happen in the same IMPORT statement, as follows:

    import database sample.basic dimensions
        from server text data_file 'genref' using server rules_file 'genref' suppress verification,
        from server text data_file 'level' using server rules_file 'level' suppress verification,
        from server text data_file 'time' using server rules_file 'time' suppress verification
        preserve all data on error append to 'C:\Hyperion\EAS\eas\client\dataload.err';

    This is the only way that the suppression works.

  • Can we have a filter to handle all the apps in a container?

    Hi,
    I have a simple (stupid) question.
    Can I configure a servlet filter to intercept requests to all the applications (a.k.a. all the ServletContexts)? All I know is that we can configure a filter for each ServletContext (application), but I am wondering if there is any other way.
    The reason I am asking is that, if there is a way to do so, I can configure my filter to give me debug output (such as CGI, session, request, and ServletContext variables) on all the applications on the development servers, and I can get rid of the debugging on the production servers (by not deploying the filter).
    Thanx in advance,
    Thirupati Panyala

    Thanks for the reply Sudhir.
    I tried it in Tomcat 4.x. It works only for the root resources, even though I configured the filter for /*, which I expected to include all resources, including the root (for example /examples/...).
    So, as I understand it, the web.xml under /conf is overridden by that of the respective apps as you go further down the context paths (apps). That's why the container was not executing my filter when I went to the next level of the URI (http://localhost:8080/index.html WORKED, but NOT http://localhost:8080/examples/jsp/num/numguess.jsp).
    If you want to try it for yourself, here is the piece of code I tried with (This is taken from Apache Tomcat source):
    package filters;
    import java.io.IOException;
    import java.io.PrintWriter;
    import java.io.StringWriter;
    import java.sql.Timestamp;
    import java.util.Enumeration;
    import java.util.Locale;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletContext;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.Cookie;
    import javax.servlet.http.HttpServletRequest;
    /**
     * Example filter that dumps interesting state information about a request
     * to the associated servlet context log file, before allowing the servlet
     * to process the request in the usual way. This can be installed as needed
     * to assist in debugging problems.
     *
     * @author Craig McClanahan
     * @version $Revision: 1.5 $ $Date: 2001/05/23 22:26:17 $
     */
    public final class RequestDumperFilter implements Filter {

        // ----------------------------------------------------- Instance Variables

        /**
         * The filter configuration object we are associated with. If this value
         * is null, this filter instance is not currently configured.
         */
        private FilterConfig filterConfig = null;

        // --------------------------------------------------------- Public Methods

        /**
         * Take this filter out of service.
         */
        public void destroy() {
            this.filterConfig = null;
        }

        /**
         * Time the processing that is performed by all subsequent filters in the
         * current filter stack, including the ultimately invoked servlet.
         *
         * @param request The servlet request we are processing
         * @param response The servlet response we are creating
         * @param chain The filter chain we are processing
         *
         * @exception IOException if an input/output error occurs
         * @exception ServletException if a servlet error occurs
         */
        public void doFilter(ServletRequest request, ServletResponse response,
                             FilterChain chain)
                throws IOException, ServletException {

            if (filterConfig == null)
                return;

            // Render the generic servlet request properties
            StringWriter sw = new StringWriter();
            PrintWriter writer = new PrintWriter(sw);
            writer.println("Request Received at " +
                    (new Timestamp(System.currentTimeMillis())));
            writer.println(" characterEncoding=" + request.getCharacterEncoding());
            writer.println(" contentLength=" + request.getContentLength());
            writer.println(" contentType=" + request.getContentType());
            writer.println(" locale=" + request.getLocale());
            writer.print(" locales=");
            Enumeration locales = request.getLocales();
            boolean first = true;
            while (locales.hasMoreElements()) {
                Locale locale = (Locale) locales.nextElement();
                if (first)
                    first = false;
                else
                    writer.print(", ");
                writer.print(locale.toString());
            }
            writer.println();
            Enumeration names = request.getParameterNames();
            while (names.hasMoreElements()) {
                String name = (String) names.nextElement();
                writer.print(" parameter=" + name + "=");
                String values[] = request.getParameterValues(name);
                for (int i = 0; i < values.length; i++) {
                    if (i > 0)
                        writer.print(", ");
                    writer.print(values[i]); // the [i] index was eaten by the forum's italics markup
                }
                writer.println();
            }
            writer.println(" protocol=" + request.getProtocol());
            writer.println(" remoteAddr=" + request.getRemoteAddr());
            writer.println(" remoteHost=" + request.getRemoteHost());
            writer.println(" scheme=" + request.getScheme());
            writer.println(" serverName=" + request.getServerName());
            writer.println(" serverPort=" + request.getServerPort());
            writer.println(" isSecure=" + request.isSecure());

            // Render the HTTP servlet request properties
            if (request instanceof HttpServletRequest) {
                writer.println("---------------------------------------------");
                HttpServletRequest hrequest = (HttpServletRequest) request;
                writer.println(" contextPath=" + hrequest.getContextPath());
                Cookie cookies[] = hrequest.getCookies();
                if (cookies == null)
                    cookies = new Cookie[0];
                for (int i = 0; i < cookies.length; i++) {
                    writer.println(" cookie=" + cookies[i].getName() +
                            "=" + cookies[i].getValue());
                }
                names = hrequest.getHeaderNames();
                while (names.hasMoreElements()) {
                    String name = (String) names.nextElement();
                    String value = hrequest.getHeader(name);
                    writer.println(" header=" + name + "=" + value);
                }
                writer.println(" method=" + hrequest.getMethod());
                writer.println(" pathInfo=" + hrequest.getPathInfo());
                writer.println(" queryString=" + hrequest.getQueryString());
                writer.println(" remoteUser=" + hrequest.getRemoteUser());
                writer.println("requestedSessionId=" +
                        hrequest.getRequestedSessionId());
                writer.println(" requestURI=" + hrequest.getRequestURI());
                writer.println(" servletPath=" + hrequest.getServletPath());
            }
            writer.println("=============================================");

            // Log the resulting string
            writer.flush();
            response.getWriter().print(sw.getBuffer().toString());

            // Pass control on to the next filter
            chain.doFilter(request, response);
        }

        /**
         * Place this filter into service.
         *
         * @param filterConfig The filter configuration object
         */
        public void init(FilterConfig filterConfig) throws ServletException {
            this.filterConfig = filterConfig;
        }

        /**
         * Return a String representation of this object.
         */
        public String toString() {
            if (filterConfig == null)
                return ("RequestDumperFilter()");
            StringBuffer sb = new StringBuffer("RequestDumperFilter(");
            sb.append(filterConfig);
            sb.append(")");
            return (sb.toString());
        }
    }
    And the config in web.xml is:
    <filter>
    <filter-name>Request Dumper Filter</filter-name>
    <filter-class>filters.RequestDumperFilter</filter-class>
    </filter>
    <filter-mapping>
    <filter-name>Request Dumper Filter</filter-name>
    <url-pattern>/*</url-pattern>
    </filter-mapping>
    Thanx
    Thirupati Panyala

  • I keep getting the error:  "lightroom could not import this catalog because of an unknown error" when trying to import from a catalog?

    I have around 400,000 photos spanning 15 years, so given the high number I organized all my photos into 6 catalogs to avoid potential problems. All the photos and the catalogs are on a 4TB Seagate external hard drive. I use Adobe Lightroom 5 on a PC with the latest Windows and Office 2013.
    I wanted to have a NEW catalog of all my 1-star-and-up photos of all the years in a single catalog. So I created what I called the Star+ catalog, and I was told the best option is to import one catalog at a time; given that there is no means to filter the import to only starred photos, I would import all the photos and then delete all the unstarred ones. I did that for catalog year 2014, but at the end it gave me the message "Lightroom could not import this catalog because of an unknown error". It had actually imported around 40k photos out of about 50k total. I tried again and again, and every time I ended up with the same thing. I created another new catalog and started from scratch, and the same thing happened: exactly the same ~40k photos were imported and the rest were not! When I imported catalog year 2013, of around 45k photos, it worked perfectly. But when I imported catalog year 2012, of 35k photos, the same thing happened!
    I then tried exporting the needed photos to a new catalog, and that worked. But then when I try to import them again from my Master *+ catalog, I get the same error again!
    I also tried creating a new Master *+ catalog and importing the catalogs of each year into it, and the same error happened again!
    Any advice: 1) on how to solve this? and 2) is there a better and easier way to create this master catalog of all starred photos from all my catalogs?

    This is a duplicate thread - see here

  • Huge performance differences between a map listener for a key and filter

    Hi all,
    I wanted to test the different kinds of map listener available in Coherence 3.3.1, as I would like to use one as an event bus. The result was that I found huge performance differences between them. In my use case the data are time-stamped, so the full key of a datum is the id which identifies its type plus the time stamp. Unfortunately, when I add my map listener to the cache I only know the type id, not the time stamp, so I cannot add a listener for a key, only for a filter which tests the value of the type id. When I launch my test I get terrible performance results; I then tried a listener for a key, which gave much better results, but in my case I cannot use it.
    Here are my results with a Dual Core of 2.13 GHz
    1) Map Listener for a Filter
    a) No Index
    Create (data always added; the key is composed of the type id and the time stamp)
    Cache.put
    Test 1: Total 42094 millis, Avg 1052, Total Tries 40, Cache Size 80000
    Cache.putAll
    Test 2: Total 43860 millis, Avg 1096, Total Tries 40, Cache Size 80000
    Update (data added then updated; the key is composed of the type id only)
    Cache.put
    Test 3: Total 56390 millis, Avg 1409, Total Tries 40, Cache Size 2000
    Cache.putAll
    Test 4: Total 51734 millis, Avg 1293, Total Tries 40, Cache Size 2000
    b) With Index
    Cache.put
    Test 5: Total 39594 millis, Avg 989, Total Tries 40, Cache Size 80000
    Cache.putAll
    Test 6: Total 43313 millis, Avg 1082, Total Tries 40, Cache Size 80000
    Update
    Cache.put
    Test 7: Total 55390 millis, Avg 1384, Total Tries 40, Cache Size 2000
    Cache.putAll
    Test 8: Total 51328 millis, Avg 1283, Total Tries 40, Cache Size 2000
    2) Map Listener for a Key
    Update
    Cache.put
    Test 9: Total 3937 millis, Avg 98, Total Tries 40, Cache Size 2000
    Cache.putAll
    Test 10: Total 1078 millis, Avg 26, Total Tries 40, Cache Size 2000
    Please help me find what is wrong with my code, because for now it is unusable.
    Best Regards,
    Nicolas
    Here is my code
    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;
    import com.tangosol.io.ExternalizableLite;
    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.Filter;
    import com.tangosol.util.MapEvent;
    import com.tangosol.util.MapListener;
    import com.tangosol.util.extractor.ReflectionExtractor;
    import com.tangosol.util.filter.EqualsFilter;
    import com.tangosol.util.filter.MapEventFilter;
    public class TestFilter {

         /**
          * To run a specific test, just launch the program with one parameter which
          * is the test index.
          */
         public static void main(String[] args) {
              if (args.length != 1) {
                   System.out.println("Usage : java TestFilter 1-10|all");
                   System.exit(1);
              }
              final String arg = args[0];
              if (arg.endsWith("all")) {
                   for (int i = 1; i <= 10; i++) {
                        test(i);
                   }
              } else {
                   final int testIndex = Integer.parseInt(args[0]);
                   if (testIndex < 1 || testIndex > 10) {
                        System.out.println("Usage : java TestFilter 1-10|all");
                        System.exit(1);
                   }
                   test(testIndex);
              }
         }

         @SuppressWarnings("unchecked")
         private static void test(int testIndex) {
              final NamedCache cache = CacheFactory.getCache("test-cache");
              final int totalObjects = 2000;
              final int totalTries = 40;
              if (testIndex >= 5 && testIndex <= 8) {
                   // Add index
                   cache.addIndex(new ReflectionExtractor("getKey"), false, null);
              }
              // Add listeners
              for (int i = 0; i < totalObjects; i++) {
                   final MapListener listener = new SimpleMapListener();
                   if (testIndex < 9) {
                        // Listen to data with a given filter
                        final Filter filter = new EqualsFilter("getKey", i);
                        cache.addMapListener(listener, new MapEventFilter(filter), false);
                   } else {
                        // Listen to data with a given key
                        cache.addMapListener(listener, new TestObjectSimple(i), false);
                   }
              }
              // Load data
              long time = System.currentTimeMillis();
              for (int iTry = 0; iTry < totalTries; iTry++) {
                   final long currentTime = System.currentTimeMillis();
                   final Map<Object, Object> buffer = new HashMap<Object, Object>(totalObjects);
                   for (int i = 0; i < totalObjects; i++) {
                        final Object obj;
                        if (testIndex == 1 || testIndex == 2 || testIndex == 5 || testIndex == 6) {
                             // Create data with key with time stamp
                             obj = new TestObjectComplete(i, currentTime);
                        } else {
                             // Create data with key without time stamp
                             obj = new TestObjectSimple(i);
                        }
                        if ((testIndex & 1) == 1) {
                             // Load data directly into the cache
                             cache.put(obj, obj);
                        } else {
                             // Load data into a buffer first
                             buffer.put(obj, obj);
                        }
                   }
                   if (!buffer.isEmpty()) {
                        cache.putAll(buffer);
                   }
              }
              time = System.currentTimeMillis() - time;
              System.out.println("Test " + testIndex + ": Total " + time + " millis, Avg " + (time / totalTries) + ", Total Tries " + totalTries + ", Cache Size " + cache.size());
              cache.destroy();
         }

         public static class SimpleMapListener implements MapListener {
              public void entryDeleted(MapEvent evt) {}
              public void entryInserted(MapEvent evt) {}
              public void entryUpdated(MapEvent evt) {}
         }

         public static class TestObjectComplete implements ExternalizableLite {
              private static final long serialVersionUID = -400722070328560360L;
              private int key;
              private long time;

              public TestObjectComplete() {}

              public TestObjectComplete(int key, long time) {
                   this.key = key;
                   this.time = time;
              }

              public int getKey() {
                   return key;
              }

              public void readExternal(DataInput in) throws IOException {
                   this.key = in.readInt();
                   this.time = in.readLong();
              }

              public void writeExternal(DataOutput out) throws IOException {
                   out.writeInt(key);
                   out.writeLong(time);
              }
         }

         public static class TestObjectSimple implements ExternalizableLite {
              private static final long serialVersionUID = 6154040491849669837L;
              private int key;

              public TestObjectSimple() {}

              public TestObjectSimple(int key) {
                   this.key = key;
              }

              public int getKey() {
                   return key;
              }

              public void readExternal(DataInput in) throws IOException {
                   this.key = in.readInt();
              }

              public void writeExternal(DataOutput out) throws IOException {
                   out.writeInt(key);
              }

              public int hashCode() {
                   return key;
              }

              public boolean equals(Object o) {
                   return o instanceof TestObjectSimple && key == ((TestObjectSimple) o).key;
              }
         }
    }

    Here is my coherence config file:
    <?xml version="1.0"?>
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">
    <cache-config>
         <caching-scheme-mapping>
              <cache-mapping>
                   <cache-name>test-cache</cache-name>
                   <scheme-name>default-distributed</scheme-name>
              </cache-mapping>
         </caching-scheme-mapping>
         <caching-schemes>          
              <distributed-scheme>
                   <scheme-name>default-distributed</scheme-name>
                   <backing-map-scheme>
                        <class-scheme>
                             <scheme-ref>default-backing-map</scheme-ref>
                        </class-scheme>
                   </backing-map-scheme>
              </distributed-scheme>
              <class-scheme>
                   <scheme-name>default-backing-map</scheme-name>
                   <class-name>com.tangosol.util.SafeHashMap</class-name>
              </class-scheme>
         </caching-schemes>
    </cache-config>

    Hi Robert,

    > Indeed, only the Filter.evaluate(Object obj) method is invoked, but the object passed to it is a MapEvent. <<
    > In fact, I do not need to implement EntryFilter to get a MapEvent; I could get the same result (in my last message) by writing cache.addMapListener(listener, filter, true) instead of cache.addMapListener(listener, new MapEventFilter(filter), true).

    I believe that when the MapEventFilter delegates to your filter it always passes a value object to your filter (old or new), meaning a value will be deserialized.
    If you instead used your own filter, you could avoid deserializing the value, which usually is much larger, and go for only the key object. This would of course only be noticeable if you indeed used a much heavier cached value class.

    > The hashCode() and equals() do not matter on the filter class <<
    > I'm not so sure, since I noticed that these methods are implemented in the EqualsFilter class, that they are called at runtime, and that the performance results are better when you add them.

    That interests me... In what circumstances did you see them invoked? On the storage node before sending an event, or upon registering a filtered listener?
    If the second, then I guess the listeners are stored in a hash-based map of collections keyed by a filter, and indeed that might be relevant, as in that case it will cause fewer passes on the filter for multiple listeners with an equalling filter.

    > DataOutput.writeInt(int) writes 4 bytes. ExternalizableHelper.writeInt(DataOutput, int) writes 1-5 bytes (or 1-6?), with numbers with small absolute values consuming fewer bytes. Similar differences exist for the long type as well, but your stamp attribute will probably be a large number... <<
    > I tried it, but in my use case I got the same results. I guess it would become interesting if I serialized/deserialized many more objects.

    > Also, if Coherence serializes an ExternalizableLite object, it writes out its class name (except if it is a Coherence XmlBean). If you define your key as an XmlBean, and add your class to the classname cache configuration in ExternalizableHelper.xml, then instead of the classname only an int will be written. This way you can spare a large percentage of the bandwidth consumed by transferring your key instance, as it has only a small number of attributes. For the value object it might or might not be so relevant, considering that it will probably contain many more attributes. However, in case of a lite event the value is not transferred at all. <<
    > I tried it too, and in my use case I noticed that we get objects nearly twice as light as an ExternalizableLite object, but it's slower to get them. It is very interesting to keep in mind if we would like to reduce the network traffic.

    Yes, these are minor differences at the moment.
    As for the performance of XmlBean, it is a hack, but you might try overriding the readExternal/writeExternal methods with your own usual ExternalizableLite implementation stuff. That way you get the advantages of the XmlBean classname cache and avoid its reflection-based operation, at the cost of having to extend XmlBean.
    Also, sooner or later the TCMP protocol and the distributed cache storages will also support using PortableObject as a transmission format, which enables using your own classname resolution and allows you to omit the classname from your objects. Unfortunately, I don't know when it will be implemented.

    > But finally, I guess that I found the best solution for my specific use case, which is to use a map listener for a key which has no time stamp; since the time stamp is never null, I just had to check the time stamp properly in the equals method.

    I would still recommend using a separate key class, a custom filter which accesses only the key and not the value, and if possible registering a lite listener instead of a heavy one. Try it with a much heavier cached value class where the differences are more pronounced.

    Best regards,
    Robert

  • Why this filter's code is not working right?

    Hey, I have written a filter to record every request and write it to an XML file. Everything works fine, but the code in the filter's destroy() is called twice. I don't know why. Is there something wrong with my file-writing code? Here is the code:
    Filter Code:
    package xx;

    import javax.servlet.Filter;
    import javax.servlet.FilterConfig;
    import javax.servlet.FilterChain;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpSession;
    import java.io.IOException;
    import ids.util.RequestRecorder; // added: RequestRecorder lives in ids.util (see below)
    import UserInfo; // as posted; presumably resolves in the original build

    public class RequestInspector implements Filter {

        protected RequestRecorder recorder = null;
        protected static String ipAddress;
        protected static int hrId = -1;

        public void init(FilterConfig config) {
            String fileLoc = config.getInitParameter("FileLocation");
            String fileName = config.getInitParameter("FileName");
            ipAddress = config.getInitParameter("IPAddress");
            try {
                hrId = Integer.parseInt(config.getInitParameter("HRID"));
            } catch (NumberFormatException nfe) {
            } catch (NullPointerException npe) {
            }
            if (fileLoc != null && fileName != null && ipAddress != null && hrId != -1) {
                recorder = RequestRecorder.getInstance(fileLoc, fileName);
            }
        }

        public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
                throws IOException, ServletException {
            HttpServletRequest httpRequest = (HttpServletRequest) request;
            HttpSession session = httpRequest.getSession(false);
            UserInfo info = ((session == null) ? null : (UserInfo) session.getAttribute("USER_INFO"));
            if (request.getRemoteAddr().equals(ipAddress) && info != null && info.getHrId() == hrId && recorder != null) {
                recorder.record(httpRequest);
            } // if...
            chain.doFilter(request, response);
        }

        public void destroy() {
            if (recorder != null) {
                recorder.stop();
            }
        }
    }
    RequestRecorder Code
    package ids.util;

    import java.io.File;
    import java.io.FileWriter;
    import java.io.IOException;
    import java.util.Enumeration;
    import java.util.Vector;
    import java.util.Hashtable;
    import javax.servlet.http.HttpServletRequest;

    public class RequestRecorder {

        private static final Hashtable holder = new Hashtable(2);
        private FileWriter writer = null;
        private int counter = 0;

        private RequestRecorder(String fileLocation, String fileName) throws IOException {
            writer = new FileWriter(new File(fileLocation, fileName));
            initializeFile();
        }

        public static RequestRecorder getInstance(String fileLocation, String fileName) {
            RequestRecorder recorder = null;
            try {
                String path = new File(fileLocation, fileName).getAbsolutePath();
                recorder = (RequestRecorder) holder.get(path);
                if (recorder == null) {
                    recorder = new RequestRecorder(fileLocation, fileName);
                    holder.put(path, recorder);
                }
            } catch (IOException ioe) {
            }
            return recorder;
        }

        public synchronized void record(HttpServletRequest request) throws IOException {
            if (writer == null) {
                return;
            }
            writer.write("\n\t\t<request path=\"" + request.getRequestURI() + "\" label=\"request" + (++counter) + "\">");
            Enumeration params = request.getParameterNames();
            while (params.hasMoreElements()) {
                String param = (String) params.nextElement();
                String[] paramValues = request.getParameterValues(param);
                if (paramValues != null) {
                    for (int i = 0; i < paramValues.length; i++) {
                        writer.write("\n\t\t\t<param>");
                        writer.write("\n\t\t\t\t<paramName>");
                        writer.write(param);
                        writer.write("</paramName>");
                        writer.write("\n\t\t\t\t<paramValue>");
                        writer.write(paramValues[i]); // the [i] index was eaten by the forum's italics markup
                        writer.write("</paramValue>");
                        writer.write("\n\t\t\t</param>");
                    } // for...
                } // if...
            } // while...
            writer.write("\n\t\t\t<validate>");
            writer.write("\n\t\t\t\t<byteLength min=\"20\" label=\"validation" + counter + "\"/>");
            writer.write("\n\t\t\t</validate>");
            writer.write("\n\t\t</request>");
            writer.flush();
        }

        public void stop() {
            if (writer != null) {
                try {
                    finalizeFile();
                    writer.close();
                    writer = null;
                } catch (IOException ioe) {
                    ioe.printStackTrace();
                }
            }
        }

        private synchronized void initializeFile() throws IOException {
            writer.write("<?xml version=\"1.0\" standalone=\"no\"?>");
            writer.write("\n<!DOCTYPE suite SYSTEM \"../conf/suite.dtd\">");
            writer.write("\n<suite defaultHost=\"tiserver.com\" defaultPort=\"8899\" label=\"test\">");
            writer.flush();
        }

        private synchronized void finalizeFile() throws IOException {
            writer.write("\n\t</session>");
            writer.write("\n</suite>");
            writer.flush();
        }

        protected void finalize() {
            stop();
        }
    }

    The XML output file looks like this:
    <?xml version="1.0" standalone="no"?>
    <!DOCTYPE suite SYSTEM "../conf/suite.dtd">
    <suite defaultHost="tiserver.com" defaultPort="8899" label="test">
    <request path="/acfrs/loginP.jsp" label="Login">
    <param>
    <paramName>userId</paramName>
    <paramValue>redfsdf</paramValue>
    </param>
    <param>
    <paramName>passWord</paramName>
    <paramValue>h1dffd3dfdf</paramValue>
    </param>
    <validate>
    <regexp pattern="Invalid Login" cond="false" ignoreCase="true" label="Login Check"/>
    </validate>
    </request>
    </session>
    </suite>
    <request path="/acfrs/left_frame.jsp" label="request1">
    <validate>
    <byteLength min="20" label="validation1"/>
    </validate>
    </request>
    <request path="/acfrs/welcome.jsp" label="request2">
    <validate>
    <byteLength min="20" label="validation2"/>
    </validate>
    </session>
    </suite>
    If you closely observe the XML file, "</session> </suite>" is written in the middle of the file too. Why does this happen? By the way, this code works perfectly on my Windows desktop, but on the Unix server the above problem turns up. Any clues?
    Thanks

    Oops! The finalize() method in RequestRecorder is redundant. I mistakenly uncommented it (the finalize() method code) while posting it here :-)
