How to capture the number of records loaded into an InfoProvider

Hi,
I have a requirement to capture the number of records loaded daily into an InfoProvider.
I want to know which table captures the record counts of the daily loads into an InfoProvider.
Based on that data, I have to write an ABAP program for analysis.
Can you please let me know which table holds this data?
Thanks

Hi,
We usually measure the number of records in a cube by its fact table, but if the cube is compressed, you have to check the E table too. For an ODS, you can check the Active Data table.
Cube - /BIC/E<cube name> or /BIC/F<cube name>
ODS - /BIC/A<ODS name>00
You can also use the program SAP_INFOCUBE_DESIGNS.
Hope it helps,
Thanks,
Amit
Edited by: Amit Kr on Aug 28, 2009 3:25 PM
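
As a starting point, here is a minimal ABAP sketch that counts the rows directly using the table-name patterns above. The InfoProvider names ZSALES and ZSD_O01 are placeholders, not objects from this thread; replace them with your own:

REPORT z_count_infoprovider_rows.

* Placeholder InfoProvider names - replace with your own objects.
DATA: lv_ftab TYPE tabname VALUE '/BIC/FZSALES',    "uncompressed fact table
      lv_etab TYPE tabname VALUE '/BIC/EZSALES',    "compressed fact table
      lv_atab TYPE tabname VALUE '/BIC/AZSD_O0100'. "ODS active data table

DATA: lv_f TYPE i,
      lv_e TYPE i,
      lv_a TYPE i.

* Dynamic FROM clause, so the same report works for any InfoProvider.
SELECT COUNT( * ) FROM (lv_ftab) INTO lv_f.
SELECT COUNT( * ) FROM (lv_etab) INTO lv_e.
SELECT COUNT( * ) FROM (lv_atab) INTO lv_a.

WRITE: / 'F table rows:', lv_f,
       / 'E table rows:', lv_e,
       / 'ODS active table rows:', lv_a.

Note that these counts show what is currently in the tables, not what each daily load added; for per-request numbers, see the process chain thread further down.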

Similar Messages

  • Total number of records loaded into ODS and InfoCube

    Hi,
    I loaded some data records from an Oracle source system into an ODS and an InfoCube.
    My source system colleague gave me some data records based on his selection on the Oracle source system side.
    How can I see how many data records were loaded into the ODS and the InfoCube?
    I can check in the monitor, but that is not correct (because I loaded a second and a third time with 'ignore duplicate records' set), so I don't think the monitor will give me the correct number of data records loaded for the ODS and the InfoCube.
    Is there a transaction code or something similar to find the number of records loaded for an ODS and an InfoCube?
    Please tell me; I will assign the points.
    Bye,
    Rizwan

    Hi,
    I went into the ODS Manage screen and looked at the 'transferred' and 'added' data records; both are the same.
    When I total the added data records, it comes to 147,737.
    But when I check the active table (/BIC/A<ODS name>00), the total number of entries is 137,738.
    Why is there a difference like that?
    And in the case of an InfoCube, how can I find the total number of records loaded into the InfoCube (not just the number currently in it)? Is there a table for this, like the fact and dimension tables?
    Please tell me.
    Thanks,
    Rizwan

  • Re: Issue with multiple records loading into cube

    Hello Gurus,
    A report is run on a monthly basis for billing commissions. The InfoCube has all the data related to the billing commissions.
    It has a custom field for total commissions, which is populated from the DSO based on the billing document conditions. A lookup is done for the total commissions field. Most documents show the right values, but a few billing documents show wrong values.
    I tried reloading the affected billing documents into the InfoCube; then the code works and the total commissions show the right values. I tried selective deletion and loaded the data for the whole month, and the billing documents again show the wrong values. The same billing documents show the right values when they are loaded individually. I checked whether the code was wrong, but it works when the affected documents are loaded individually. Please let me know your suggestions on this. Thanks in advance.

    Nanda,
    Please find the start routine below
    * Z9SD_O21 = Billing document header
    DATA: L_ZUARC LIKE /BIC/AZ9SD_O2300-INV_QTY,
          L_ZUART LIKE /BIC/AZ9SD_O2300-KPRICE,
          L_ZCGU  LIKE /BIC/AZ9SD_O2300-KNVAL,
          L_ZCTU  LIKE /BIC/AZ9SD_O2300-KNVAL,
          L_ZCNU  LIKE /BIC/AZ9SD_O2300-KNVAL,
          L_ZCQU  LIKE /BIC/AZ9SD_O2300-KNVAL.
    FORM SELECT_POST_ST
        USING COMM_BILL_NUM LIKE /BIC/AZ9SD_O2100-DOC_NUMBER.
    *         COMM_BILL_DATE LIKE /BIC/AZ9SD_O2100-BILL_DATE.
       IF COMM_BILL_NUM <> /BIC/AZ9SD_O2100-DOC_NUMBER.
    *  OR COMM_BILL_DATE <> /BIC/AZ9SD_O2100-BILL_DATE.
         CLEAR /BIC/AZ9SD_O2100.
         SELECT SINGLE DOC_NUMBER /BIC/ZPOST_ST
         INTO (/BIC/AZ9SD_O2100-DOC_NUMBER,
    *          /BIC/AZ9SD_O2100-BILL_DATE,
               /BIC/AZ9SD_O2100-/BIC/ZPOST_ST )
         FROM /BIC/AZ9SD_O2100 WHERE
          DOC_NUMBER = COMM_BILL_NUM.
       ENDIF.
    ENDFORM. "SELECT_POST_ST
    *$*$ end of global - insert your declaration only before this line   *-*
    * The follow definition is new in the BW3.x
    TYPES:
       BEGIN OF DATA_PACKAGE_STRUCTURE.
          INCLUDE STRUCTURE /BIC/CS8Z9SD_O24.
    TYPES:
          RECNO   LIKE sy-tabix,
       END OF DATA_PACKAGE_STRUCTURE.
    DATA:
       DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
            WITH HEADER LINE
            WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
       TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
                MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record n
                DATA_PACKAGE STRUCTURE DATA_PACKAGE
       USING    RECORD_ALL LIKE SY-TABIX
                SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
       CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line        *-*
    * fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
    * to make monitor entries
    * if abort is not equal zero, the update process will be canceled
    * DELETE DATA_PACKAGE WHERE STORNO = 'X'.
    * DELETE DATA_PACKAGE WHERE /BIC/ZPSTAS = 'A'.
    * CG: 07/02/07 Assign Summarization group for each partner function
    * based on Master Data.
       DATA: LV_LINES TYPE I.
       DATA: LV_RETURN(1).
       DESCRIBE TABLE DATA_PACKAGE LINES LV_LINES.
       IF LV_LINES > 0.
         CLEAR: LV_RETURN, SY-SUBRC.
         CALL FUNCTION 'Z_FIND_SUM_GROUP_2'
           IMPORTING
             MY_RETURN                      = LV_RETURN
           TABLES
             IT_DATAPACKAGE                 = DATA_PACKAGE
           EXCEPTIONS
             GENERAL_ERROR                  = 1.
       ENDIF.
       ABORT = 0.
    *$*$ end of routine - insert your code only before this line         *-*
    ENDFORM.

  • ERROR (No text found) for a request loaded into InfoProvider

    Dear Experts,
    I'm facing a puzzling problem. The regular data loads used to be fine. For the past 2 days, when the loads run through the regular process chains, in some DSOs the DataSource Name column shows the DataSource name and '(No text found)'. The Source System column shows ERROR, and the source system description shows 'ERROR (No text found)'. Although the status of the loaded request is green and it is available for reporting, the next load throws an error saying the previous load is incomplete or improper.
    Any help would be highly appreciated.
    Thanks and regards,
    Kaushik

    Thanks, Srinivas, for the reply.
    But there have been no changes whatsoever, neither in the source nor in the BI models, and it has been happening continuously for the third day now. When I delete and reload, it just works fine.
    Regds,
    Kaushik

  • Updated CCB records are extracted but not loaded into OBIU.

    This is OBIU version 2.3.2 and CCB version 2.3.1
    Records updated in CC&B are successfully added to the Change Log (CI_CHG_LOG). They are also extracted successfully by the base extract program EXTACCT. However, when the data is loaded into OBIU (using OWB), the record update is not reflected in OBIU. This was tested for the base object CD_ACCT.
    The OWB mapping object only creates CD_ACCT records for those accounts in the Change Log with a Change Type of 'I' (Insert). It does not process those with a Change Type of 'U' (Update). How does one get this Change Type processed?
    A new record (with a new effective date) should have been created with up-to-date values for its UDFs.
    The update procedure was able to update the End Date and Job Number of the old record correctly. However, without the creation of a new effective record, new fact records related to the Account record no longer have an account related to them.
    This makes the data inaccurate.
    This does not just apply to CD_ACCT but to other base tables/entities as well. A custom user exit that populates some of the UDFs has been introduced to the extract program of CD_ACCT; the DB trigger and OWB metadata of CD_ACCT have not been changed.
    EXPECTED BEHAVIOR
    The OWB mapping object is expected to process records with a Change Type of 'U' (Update). A new record (with a new effective date) should be created with up-to-date values for its UDFs.

    Hi,
    The problem is because of a missing industry sector setting in R/3. This setting has to be done before filling the setup tables. For more information, search the forums for BF11.
    Re: 0PUR_C01
    Re: Not getting data added to IC for 2LIS_12_VCITM
    With regards,
    Anil Kumar Sharma .P
    Message was edited by:
            Anil Kumar Sharma

  • Remove all duplicate records and load into temp table

    Hi,
    I have a table that contains data like this:
    Emp No  Designation  Location
    1111    SE           CA
    1111    DE           CT
    3456    WE           NJ
    4523    TY           GH
    We found that there are two duplicate records for emp no 1111. I want to delete all duplicate records (in this case, the two records for emp no 1111) and load the rest into the temp table.
    Please advise me how to do it.

    Oh look, you can search the forums...
    http://forums.oracle.com/forums/search.jspa?threadID=&q=delete+duplicates&objID=f75&dateRange=all&userID=&numResults=30

  • When capturing images from a Nikon D600, they show in Bridge, but when clicked to load into CS6, an error message says that it is the wrong type of document, even JPEG files. This is a NEW frustration.

    When capturing images from a Nikon D600, they show in Bridge, but when clicked to load into CS6, an error message says that it is the wrong type of document, even JPEG files. This is a NEW frustration.

    Nikon raw files should open in Adobe Camera Raw, and so should JPEGs.
    If you select one in Bridge and give the command Ctrl+R (Windows), what happens?
    Also, what is the version of ACR shown in the title bar?
    Gene
    Note: unmark your question as "answered"; the green balloon next to the subject shows it as "solved".

  • I'm using an ezcap.tv 116 ezgamer capture card to record my PS3 on a Windows laptop. I'm then moving the recorded videos in MPG format over to my MacBook Pro to edit them in iMovie and upload them to YouTube, but it won't let me import them.

    I'm using an ezcap.tv 116 ezgamer capture card to record my PS3 on a Windows laptop. I then move the recorded videos in MPG format over to my MacBook Pro to edit them in iMovie so I can upload them to YouTube, but it won't let me import them, even after changing them to MP4 format. It just comes up with the message: "No Importable Files. None of the selected files or folders can be imported. Change the selection and try again."
    I've even tried opening them in iTunes and copying from there, but they won't open in iTunes; no message or anything, they simply don't open. It's strange, because they do open in QuickTime Player.
    Please, if anyone has any ideas that can help, let me know. Thanks.

    First problem: You're fine. The hotter it gets, the more the fans spin up. The computer is designed so that at max load, at max fan speed, it won't overheat (unless it's obstructed by something, e.g. sitting on your bed swallowed by a comforter). It's not the best thing to keep it that toasty for days at a time, but a couple hours at a time shouldn't be a problem.
    Second problem: If something in the trash won't delete, just use Secure Empty Trash and it should be fine. Since .torrent files are quite small, it should only take a couple seconds.

  • How can I get my (vinyl) record music into iTunes Match?

    I have a rather large collection of music on CDs, tapes, and records.
    Now that iTunes Match exists, I would like to include all of my music in my iTunes library.
    There is no easy way (that I know of) to load anything except CDs into iTunes. So for the rest of the library, one must either buy all the music again or buy equipment to play the music one time and convert it to a newer format.
    My CD collection is mostly loaded, and since 99.999% (virtually all) of that music is already in iTunes Match, it is not uploaded to iTunes again but quickly becomes available to play on all my iTunes-authorized devices.
    With the exception of the 1 in ten thousand songs I may have that iTunes does not have in the Match library, there is no reason to import the music at all, other than to prove I have the actual CDs. (In any case, I suppose I could have borrowed them for that anyway.)
    However, CDs at least can be imported; for tapes and records I know of no direct way.
    With something state-of-the-art like iTunes Match and my PC(s), iPhone(s), and iPad, why use stone-age-like means to prove ownership of outside-of-iTunes music?
    Why not provide some other information to identify albums owned and have them available in iTunes Match without the 99.999%-useless step of loading them?
    In fact, most tape music is no longer playable (so it can't be converted), as the rollers in the tape players deteriorate and they no longer play.
    Vinyl records play just fine, just not easily into a PC without the purchase of expensive equipment that would be used only for the one-time load (which is really just for proof of ownership).
    What a ridiculous waste of time, energy, and funds for the 1 in 10,000 songs that may not yet exist in the iTunes Match library. Okay, as the music gets older it may even be 2 in 10,000, but I doubt it.
    I remember that each album (and 45 rpm single) has identifying information scratched into it (around the inside of the innermost track); can that (or something else) suffice as proof of ownership to get the music into iTunes Match?
    Please respond with ideas to quickly and easily get already-owned vinyl record music into an iTunes Match library.
    I personally do not care if 1 (or 2) out of 10,000 of my songs gets lost because it is not in the iTunes Match library; it couldn't be that good in that case anyway.
    Thanks!

    I too have not figured out how to convert my vinyl collection to iTunes Match. Here is what I have tried:
    I record my vinyl records onto CD using a Sony RCD-W500C, which makes a very high-quality recording. I make sure the song lengths are the same as those from the same album at the iTunes Store.
    Then I import the CD into my iTunes music library. I have to enter the names of the tracks manually because Gracenote cannot identify the names of the songs. I do this while the music is in my iTunes music library and the CD has been ejected from my PC.
    Then I try to get the iTunes Match, but it never works. I do, however, get the artwork from iTunes. iTunes then uploads my songs into iCloud. When I download them back onto my PC and iPhone 4S, I get the identical recording that I imported into iTunes. I know this because I can hear the static and crackling of the vinyl. I don't necessarily mind that noise, but I figured if I paid for iTunes Match I should get the AAC 256 kbps music. After all, it doesn't cost iTunes anything to give it to me. And there's no way I'm going to purchase all my vinyl records again; I have about 500 albums.
    I do buy all my new music from iTunes because I think it is the best system available. I just think it could be so much better if they would let me download the highest-quality recording of the music I own from them.
    I have also tried Sony Sound Forge Audio Studio, which records my vinyl records into a music file on my PC. But that doesn't get the match either.
    There has to be somebody out there who can tell us what iTunes Match looks at in our music uploads to determine whether they give us the match!
    When you import a CD, there is some information embedded in the CD that iTunes matches. I don't believe there is any information embedded in vinyl albums, but when I burn them onto a CD, maybe I could add the info onto the CD before trying to match it, if I just knew what they looked for.
    I would appreciate any help anyone can give me on this.

  • Adding leading zeros before data is loaded into DSO

    Hi
    In the PROD_ID field below, some IDs are missing leading zeros before the data is loaded into BI from SRM. The data type is character. If leading zeros are missing, the DSO data activation fails because of the missing zeros, and I have to add them manually in the PSA table. I want to add leading zeros, if they are missing, before the data is loaded into the DSO. The total character length is 40, so e.g. if the value is 1502 there should be 36 zeros before it, and if the value is 265721 there should be 34 zeros. Only two value lengths occur, 4 or 6 characters, so 36 or 34 leading zeros are always needed if the zeros are missing.
    Can we use the CONVERSION_EXIT_ALPHA_INPUT function module? As this is a character field, I'm not sure how to use it in that case. Do I need to convert it to an integer first?
    Can someone please give me sample code? We're using the BW 3.5 data flow to load data into the DSO. Please give sample code and say where the code needs to go, in the rule type or in the start routine.

    Hi,
    Can you check at the InfoObject level which conversion routine it uses?
    Use transaction RSD1, enter your InfoObject, and display it.
    At the DataSource level you can also see the external/internal format that is maintained.
    If your InfoObject uses the ALPHA conversion routine, it will get the leading zeros automatically.
    Check how the data comes from the source in RSA3.
    If you are receiving this issue for some records only, then you need to check those records.
    Thanks
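
    If the InfoObject does not have the ALPHA routine assigned, a transfer rule routine can pad the value itself. Below is a minimal sketch for a BW 3.x field-level routine; the source field name TRAN_STRUCTURE-/BIC/ZPROD_ID is an assumption (use whatever your transfer structure calls the field), and RESULT is the routine's standard target, here a CHAR40 value:

    DATA: lv_prod(40) TYPE c.

    " CONVERSION_EXIT_ALPHA_INPUT right-aligns a purely numeric value and
    " pads it with leading zeros to the full length of the output field,
    " so '1502' becomes 36 zeros followed by 1502 in a CHAR40 field.
    lv_prod = TRAN_STRUCTURE-/BIC/ZPROD_ID.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = lv_prod
      IMPORTING
        output = lv_prod.
    RESULT = lv_prod.

    Values that are not purely numeric are left unchanged by the ALPHA routine, which matches the requirement of padding only the 4- and 6-digit IDs.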

  • DownloadHelper just stopped working and I can no longer get the icon to load into the toolbar.

    I downloaded the DownloadHelper add-on and it worked for many video captures, then stopped. I reinstalled it, but the icon will not load into the toolbar, nor into the tools under View -> Toolbars -> Customize. I have gone through the troubleshooting guides and everything seems to be enabled properly. At the same time, Firefox stopped allowing me to open more than one Firefox window.

    Contact the carrier and ask them what is going on.

  • Different formats in Excel column into 1 format to load into table

    I have a situation at hand. I have to do a SQL load into a temporary table from an Excel file, but I noticed that the column I am most interested in has about 4 different formats. My desired format always has to be 7 characters long and may only contain digits. That means no hyphens, special characters, or letters.
    I have found that the column in Excel has pretty much 4 or 5 different formats.
    Here is what they are supposed to be transformed into:
    42       = 0000042
    000-0257 = 0000257
    0090647  = 0090647 (correct format)
    709782   = 0709782
    135566F1 = 0135566
    The last case only seems to occur on 2 or 3 lines, but anything after the letter needs to be discarded. I could probably just remove them (the F1) from the file manually.
    Any ideas? Thanks in advance.

    Thanks, Frank, for showing a far better way to translate the chars 'away'.
    However, the last record contains the 1 that comes after the F?
    I've pasted your example into my 'beta-clunged' version (I know it can become far more compact, but I wanted to show the steps taken):
    with t as (
      select '42' col from dual union all
      select '000-0257' from dual union all
      select '0090647' from dual union all
      select '709782' from dual union all
      select '135566F1' from dual
    )
    select col
    ,      to_char( replace( col2
                           , '-'
                           )
                  , 'fm0000000'  -- 3. apply the format mask
                  ) my_clunging
    ,      lpad ( translate ( col
                            , '0123456789' || translate ( col
                                                        , 'x0123456789'
                                                        , 'x'
                                                        )
                            , '0123456789'
                            )
                , 7
                , '0'
                ) franks_solution
    from  ( select col
            ,      case when instr( translate( lower(col)
                                             , 'abcdefghijklmnopqrstuvwxyz'
                                             , 'xxxxxxxxxxxxxxxxxxxxxxxxxx'
                                             )
                                  , 'x'
                                  ) > 0
                        then -- 1. eliminate string after finding a character
                             substr( col
                                   , 1
                                   , instr( translate( lower(col)
                                                     , 'abcdefghijklmnopqrstuvwxyz'
                                                     , 'xxxxxxxxxxxxxxxxxxxxxxxxxx'
                                                     )
                                          , 'x'
                                          ) - 1
                                   )
                        else col
                   end col2
            from t
          );

    COL      MY_CLUNG FRANKS_
    -------- -------- -------
    42       0000042  0000042
    000-0257 0000257  0000257
    0090647  0090647  0090647
    709782   0709782  0709782
    135566F1 0135566  1355661

    Didn't you notice that in your tests, Raj, or have your requirements changed?

  • Process Chains - number of records loaded in various targets

    Hello,
    I have been trying to figure out how to access the metadata in the process chain log, to extract information on the number of records loaded into the various targets (cubes/ODS/InfoObjects).
    I have seen a few tables like RSMONICTAB, RSPCPROCESSLOG, and RSREQICODS mentioned in various threads; however, I would like to know whether there is a data model (a relationship structure between these and other standard tables) so that I could seamlessly traverse them to get the information I need.
    In a traditional ETL tool I would approach the problem like this:
    > Load a particular sequence (in our case a BW process chain) name into the program.
    > Extract the start time and end time information.
    > Traverse all the objects of the sequence (BW process chain).
    > Check whether the object is a data target.
    > If yes, scan the logs to extract the number of records loaded.
    Could I have a list of similar tables that I could traverse in ABAP code to extract such information?
    Thanks in advance

    Hi Richa,
    Please check these tables, which may be very useful for you:
      rsseldone,
      rsdodso,
      rsstatmanpsa,
      rsds,
      rsis,
      rsdcube
    I have ABAP code with which you can get all the information for a particular request.
    If you need more information, go to ST13, select BI TOOLS, and execute.
    Hope this helps.
    Regards,
    Ravi Kanth
    Edited by: Ravi kanth on May 15, 2009 11:08 AM
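
    As a starting point for such an ABAP program, here is a hedged sketch of a per-request record count. It assumes the monitor table RSMONICDP keeps the number of added records per request and data packet in a RECORDS field, keyed by request number (RNR) and InfoCube (CUBE), which is how it is commonly described in these forums; verify the table and field names in SE11 on your release before relying on it.

    REPORT z_request_record_count.

    PARAMETERS: p_rnr(30)  TYPE c,  "request ID, e.g. REQU_...
                p_cube(30) TYPE c.  "technical name of the target InfoCube

    DATA lv_records TYPE i.

    " Sum the added records over all data packets of the request.
    SELECT SUM( records ) FROM rsmonicdp
      INTO lv_records
      WHERE rnr  = p_rnr
        AND cube = p_cube.

    WRITE: / 'Records loaded by request', p_rnr,
           / 'into InfoCube', p_cube, ':', lv_records.

    Looping such a lookup over the requests found in the process chain log tables would give the per-target totals described above.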

  • Urgent: data is not loaded into the ODS

    Hi,
    We tried to load the data into the ODS via the InfoPackage. The job finished with green status. In the Requests tab I saw that the transferred records are 1486 and the added records are 1486.
    But the ODS shows an empty structure; no records were loaded into the ODS.
    Any suggestions, please?
    sailekha

    Hi,
    It is available for reporting and the status is green.
    All the related objects are active.
    I tried to activate the ODS data, but it asks me for a job name,
    and I am unable to activate the data in the ODS.
    I still have the same problem of how to proceed further; please let me know.
    Yesterday it was working fine, but today I got this problem in Production.
    Any suggestions, please.
    Thanks in advance

  • ODI: how to raise cross-reference errors before loading into Essbase?

    Hi John, if you read my post, I want to say that you impress me! Really, thanks for your blog.
    Today, my problem is:
    - I received a bad-quality data file from an ERP extract.
    - I have a cross-reference table (Source ==> Target).
    - >> How do I raise the error before loading into Essbase?
    My idea is the following (first of all, I'm not sure if it is a good one, and I am also having trouble doing it in ODI!):
    - Step 1: JOIN data.txt with the cross-reference table ==> create a table DATA_STEP1 in the ODISTAGING schema (the columns of DATA_STEP1 are the columns of data.txt plus those of the cross-reference tables; there are more than 20 columns in my case).
    - Step 2: Check that there is no NULL value in the target columns (NULL means that the data.txt file contains values that are not defined in my cross-reference table), using a filter (Filter = Target_Account IS NULL or Target_Entity IS NULL or ...).
    The result of this interface is sent to a reject.txt file; if reject.txt is not empty, a mail is sent to the administrator.
    - Step 3: Do the opposite: Filter NOT (Target_Account IS NULL or Target_Entity IS NULL ...) ==> the result is sent to the DATA_STEP3 table.
    - Step 4: Run the mapping proper: source DATA_STEP3 (the clean and verified data!) with the cross-reference tables, and send the data into Essbase. Normally, there are no rejected records!
    My main problem is: what is the right IKM to send data into the DATA_STEP1 or DATA_STEP3 tables, which are Oracle tables in my ODISTAGING schema? I tried IKM Oracle Incremental Update, but I get an error, and actually I don't need an update (which is time-consuming), I just need an INSERT!
    I'm just looking for an 'IKM SQL to Oracle'...
    regards
    xavier

    Thanks John, very speedy!
    I now understand better which IKM is useful.
    I found other information about error follow-up with ODI: http://blogs.oracle.com/dataintegration/2009/10/did_you_know_that_odi_generate.html
    and I decided to activate integrity control in ODI.
    I load:
    - data.txt into ODITEMP.T_DATA
    - transco_account.csv into ODITEMP.T_TRANSCO_ACCOUNT
    - transco_entity.csv into ODITEMP.T_TRANSCO_ENTITY
    - and so on...
    Moreover, I created integrity constraints between T_DATA and T_TRANSCO_ACCOUNT and T_TRANSCO_ENTITY, so I expected ODI to raise the bad records for me in E$_DATA (the error table).
    However, I have one issue when loading data.txt into T_DATA, because I have no ID or primary key. I read in a training book that I could use a SEQUENCE; I tried, but without success. :-(
    Is there another simple way to create a primary key automatically (T_DATA is in an Oracle schema, of course)? Thanks in advance
