Interpret Extractor logic

Hi All,
I need some help interpreting the job log below.
A considerable amount of time (about 4 hours) is being spent in the call to the customer enhancement EXIT_SAPLRSAP_001.
Previously this activity did not consume so much time.
We have already checked the tRFCs and short dumps.
When we checked the process overview in SM50, the sequential read was working fine.
We are not able to interpret where exactly the time is being spent.
I would appreciate your help on this.
Date        Time      Message
03.02.2011  02:20:58  Job started
03.02.2011  02:20:58  Step 001 started (program SBIE0001, variant &0000000010155, user ID BIRFC)
03.02.2011  02:20:58  Asynchronous transmission of info IDoc 2 in task 0001 (0 parallel tasks)
03.02.2011  02:20:58  DATASOURCE = 0PA_ER_2
03.02.2011  02:20:58  RLOGSYS    = BICLNT100
03.02.2011  02:20:58  REQUNR     = REQU_EMKACN0684E0YS8255NDE5PWP
03.02.2011  02:20:58  UPDMODE    = F
03.02.2011  02:20:58  LANGUAGES  = *
03.02.2011  02:20:58  *************************************************************************
03.02.2011  02:20:58  *          Current Values for Selected Profile Parameters               *
03.02.2011  02:20:58  *************************************************************************
03.02.2011  02:20:58  * abap/heap_area_nondia......... 4000000000                              *
03.02.2011  02:20:58  * abap/heap_area_total.......... 0                                       *
03.02.2011  02:20:58  * abap/heaplimit................ 40000000                                *
03.02.2011  02:20:58  * zcsa/installed_languages...... DE                                      *
03.02.2011  02:20:58  * zcsa/system_language.......... E                                       *
03.02.2011  02:20:58  * ztta/max_memreq_MB............ 2047                                    *
03.02.2011  02:20:58  * ztta/roll_area................ 6500000                                 *
03.02.2011  02:20:58  * ztta/roll_extension........... 4000000000                              *
03.02.2011  02:20:58  *************************************************************************
03.02.2011  06:32:26  Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 439,405 records
03.02.2011  06:32:26  Result of customer enhancement: 439,405 records
03.02.2011  06:32:29  PSA=0 USING & STARTING SAPI SCHEDULER
03.02.2011  06:32:29  Asynchronous send of data package 1 in task 0002 (1 parallel tasks)
03.02.2011  06:32:35  IDOC: Info IDoc 2, IDoc No. 746127, Duration 00:00:00
03.02.2011  06:32:35  IDoc: Start = 03.02.2011 02:20:58, End = 03.02.2011 02:20:58
03.02.2011  06:32:36  Asynchronous transmission of info IDoc 3 in task 0003 (1 parallel tasks)
03.02.2011  06:32:36  IDOC: Info IDoc 3, IDoc No. 746133, Duration 00:00:00
03.02.2011  06:32:36  IDoc: Start = 03.02.2011 06:32:36, End = 03.02.2011 06:32:36
03.02.2011  06:34:31  tRFC: Data Package = 1, TID = 0AD3E46709EA4D4A4C030CEA, Duration = 00:01:37, ARFCSTATE =
03.02.2011  06:34:31  tRFC: Start = 03.02.2011 06:32:54, End = 03.02.2011 06:34:31
03.02.2011  06:34:31  Synchronized transmission of info IDoc 4 (0 parallel tasks)
03.02.2011  06:34:32  IDOC: Info IDoc 4, IDoc No. 746134, Duration 00:00:01
03.02.2011  06:34:32  IDoc: Start = 03.02.2011 06:34:31, End = 03.02.2011 06:34:32
03.02.2011  06:34:32  Job finished
Regards,
Shravani

Actually, this is just a job log. You need to provide the code of your extractor extension / customer enhancement.
It would be better to run an SQL trace in transaction ST05 to see which SQL statements contribute most to the bottleneck.
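To make the point concrete: a frequent cause of this exact symptom (hours spent in EXIT_SAPLRSAP_001 for a few hundred thousand records) is a database SELECT executed once per record inside the exit. The sketch below is purely illustrative - the lookup table PA0001 and the append field ZZ_COSTCTR are assumptions, not the poster's actual code:

```abap
* Hypothetical slow pattern inside EXIT_SAPLRSAP_001:
* one DB round trip per record - with 439,405 records this alone
* can account for hours of runtime.
LOOP AT c_t_data ASSIGNING <ls_data>.
  SELECT SINGLE kostl FROM pa0001 INTO <ls_data>-zz_costctr
         WHERE pernr = <ls_data>-pernr.
ENDLOOP.

* Faster alternative: one mass read, then an indexed lookup per record.
SELECT pernr kostl FROM pa0001 INTO TABLE lt_pa0001
       FOR ALL ENTRIES IN c_t_data
       WHERE pernr = c_t_data-pernr.
SORT lt_pa0001 BY pernr.
LOOP AT c_t_data ASSIGNING <ls_data>.
  READ TABLE lt_pa0001 INTO ls_pa0001
       WITH KEY pernr = <ls_data>-pernr BINARY SEARCH.
  IF sy-subrc = 0.
    <ls_data>-zz_costctr = ls_pa0001-kostl.
  ENDIF.
ENDLOOP.
```

An ST05 trace, as suggested above, would expose this pattern immediately as a huge number of identical SELECT statements.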

Similar Messages

  • FI-GL Extractor logic in HANA

    Hi Experts,
    We have a requirement to implement the FI-GL (0FI_GL_4) and FI-SL (0FI_GL_6) extractor logic in SAP HANA. We have extracted the base tables GLT0, BSEG and BKPF into SAP HANA.
    Could anyone give pointers on the FI extraction logic or a use-case scenario?
    Thanks in advance.
    Regards,
    Priyanka

    OK TJ,
    So for the BSIK fields you can try just adding the fields via an append structure (use the same field names as SAP) and check whether they get populated automatically (transaction RSA3). If not, try the approach below.
    For the BKPF fields I'm afraid you need to program an exit (or BAdI) to populate them (after adding them to the structure). You can use BUKRS, BELNR, and GJAHR (from BSIK, already being extracted) to access table BKPF and fetch the desired fields.
    Now I see your point (I think): you want to use the fields already in BW via the FI-GL load and do a lookup in BW for these fields out of the AP flow? Yes, this should be possible, as all of it is FI-related.
    Grtx
    Marco
    Edited by: Marco Verbaan on Feb 22, 2010 6:32 PM
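    A minimal sketch of such a BKPF lookup in the exit, assuming hypothetical append fields ZZBLART and ZZBLDAT (the actual field names depend on your append structure):

    ```abap
    * Hypothetical: fill append fields from BKPF inside a CMOD exit.
    * ZZBLART / ZZBLDAT are assumed append-structure fields.
    DATA ls_bkpf TYPE bkpf.

    LOOP AT c_t_data ASSIGNING <ls_data>.
      " Key fields BUKRS/BELNR/GJAHR come from the BSIK-based record
      SELECT SINGLE blart bldat FROM bkpf
             INTO (ls_bkpf-blart, ls_bkpf-bldat)
             WHERE bukrs = <ls_data>-bukrs
               AND belnr = <ls_data>-belnr
               AND gjahr = <ls_data>-gjahr.
      IF sy-subrc = 0.
        <ls_data>-zzblart = ls_bkpf-blart.
        <ls_data>-zzbldat = ls_bkpf-bldat.
      ENDIF.
    ENDLOOP.
    ```

    For large volumes, buffer the BKPF reads in an internal table instead of issuing a SELECT SINGLE per record.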

  • Two Extractor logics for full and delta?

    Hi,
    I have a situation where the logics for full upload and delta upload are drastically different!
    What is the best way to deal with it?
    Should I have two different extractors for full and delta?
    Raj

    Hi Raj,
    I hope you are working with generic extractors. If the posting volume is too high, then you definitely need to restrict the data from the R/3 side in terms of performance, and you need to maintain the delta functionality as well.
    If you cannot maintain the delta from the R/3 side, then at least try to restrict the data from the BW side at the InfoPackage level: extract the data based on 'changed on' and 'created on' with two different InfoPackages and update them to the ODS (here make sure all the objects are set to overwrite); then you can maintain the delta load from the ODS to the cube easily.
    If the volume of data is fairly small, you can use a full load to the ODS and then a delta to the cube.
    Thanks
    Hope this helps..

  • Std Extractor logic

    Hi Guys
    We are using the 2LIS_11_V_ITM datasource for our backlog report, and now we are facing an issue with it.
    While loading, we are not able to get some orders which are still open. When I dug into this, I found that LFREL (item relevant for delivery) is not checked for those orders, but I am not able to understand why the standard extractor filters on this, or where I should check this filter / code for the standard extractor.
    The SAP Help documentation says that we can only extract delivery-relevant data for the 2LIS_11_V_ITM InfoSource, but I didn't find any document for this datasource. Can anybody clarify and send me the docs, if any?
    regards

    Please advise what the difference is between 2LIS_11_VAITM and 2LIS_11_V_ITM. I am getting all the orders in 2LIS_11_VAITM, so why do I need to use V_ITM for the backlog? Do we have a flag/status in VAITM so that we can just filter on 'open' in the BEx query?
    Also, please let me know: if I want to create a view with VBAK, VBAP, VBKD, VBUP, VBUK, what are the main steps I need to take?
    Please advise.
    Edited by: sam on Feb 7, 2008 6:28 PM

  • How to Generate Information Errors in Generic Extractor Function Module

    Hi, In my Generic Extractor logic I am reaching out to an additional table to get a field.  I would like to produce an informational error or red error in the process monitor if I'm unable to get the field.  Is this possible?  How can I generate these messages in the Extractor Function Module?
    Thanks!

    Hi,
    Please go through the docs below. They explain each and every step of creating a generic extractor through a function module.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a0f46157-e1c4-2910-27aa-e3f4a9c8df33?quicklink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c062a3a8-f44c-2c10-ccb8-9b88fbdcb008?quicklink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/30aefb04-7043-2c10-8e92-941536eebc79?quicklink=index&overridelayout=true
    Regards,
    mahesh
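    For completeness: in a generic extraction function module (typically built from the template RSAX_BIW_GET_DATA_SIMPLE), you can fill the SY message fields and raise the template's standard exception so that the message shows up in the process monitor. A sketch, where the message class ZBW, message 001 and the lookup table ZADD_TABLE are assumptions:

    ```abap
    * Hypothetical: raise a red error in the monitor when a lookup fails.
    SELECT SINGLE myfield FROM zadd_table INTO lv_myfield
           WHERE keyfield = ls_data-keyfield.
    IF sy-subrc <> 0.
      sy-msgid = 'ZBW'.      " hypothetical message class
      sy-msgty = 'E'.        " 'E' = red error; use 'I' for information
      sy-msgno = '001'.
      sy-msgv1 = ls_data-keyfield.
      RAISE error_passed_to_mess_handler.
    ENDIF.
    ```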

  • Strategy to fill setup table fo Logical Cockpit

    Hi gurus,
    I have to fill the setup tables (I mean MC11VA1ITM and MC12VCOITM) and have a lot of orders and deliveries on the R/3 side. The problem is that I can't continue an interrupted run: according to the data in RMCW3 (the table for counter readings of the info-structure re-setup), the documents started to reinitialize again in spite of the unmarked flag.
    So, any ideas about strategy will be highly appreciated.

    Hi Mikhail,
    Kindly have a look at below links,
    Part 1: SD Overview
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/1034bcbe-b459-2d10-2a89-ecdeb4e53ff1
    Part 2: DB Update logic
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/804165a5-c081-2d10-39b4-af09a680f591
    Part 3: Extractor Logic
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/1094e790-1d93-2d10-17ba-8b559bf0f75b
    Part 4: Update Modes
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/5039632a-c398-2d10-0aaf-97167a3de753
    Part 5: SD Datasource overview
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d07aa007-84ab-2d10-46ba-a5a2679f0d7b
    Part 6: Implementation methodology
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d0508c20-84c1-2d10-2693-b27ca55cdc9f
    Part 7: LO datasource Enhancements
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/c09d5356-d8c4-2d10-8b84-d24723fc1f0a
    Hope this helps.
    Regards,
    Mani

  • BI extractor

    Hi,
    I have a requirement where I have to use an extractor to move data from a custom infotype in ECC to a BI InfoCube.
    The extractor will be accessible via transaction RSA6 in ECC. The InfoCube will be accessible through transaction RSA1 in BI.
    Any pointers and material related to extractors will be really useful to me.
    Will surely award points for useful replies!

    Dear Rajesh,
    In a generic extractor based on a function module, you have the opportunity to use not only the cursor-based approach for filling the extract table, but also alternative approaches - cursors are recommended by SAP, but technically they are not the only possible way to select and fetch data here.
    As an example, you can use "usual" (cursor-free) SELECT statements to fill an internal table, then make the necessary checks or whatever else the extractor logic needs, and finally place the extracted and cleansed data into the interface table (E_T_DATA). In order to divide the extracted data into packets of a given number of records (defined by the parameter I_MAXSIZE), you can use a pair of custom local variables to define the first and last record to be passed within each packet, looping over packet generation until the NO_MORE_DATA exception is raised.
    But, from my point of view, if you have a choice between the cursor-based technique and some alternative, the preferable one is cursor-based, since this is SAP's general recommendation for developing a function module for a generic extractor. I would suggest using techniques that do not involve cursors only in cases where a cursor is not suitable or does not comply with your requirements.
    My regards,
    Vadim
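    The cursor-free packeting Vadim describes could be sketched roughly as follows; the source table ZSRC and all variable names are illustrative assumptions:

    ```abap
    * Hypothetical cursor-free packeting in a generic extractor FM.
    STATICS: gt_all  TYPE STANDARD TABLE OF zsrc,
             gv_from TYPE sy-tabix VALUE 1.

    IF gt_all IS INITIAL AND gv_from = 1.
      " First data call: fill the buffer once (apply the I_T_SELECT
      " restrictions here in a real implementation).
      SELECT * FROM zsrc INTO TABLE gt_all.
    ENDIF.

    " Hand over at most I_MAXSIZE records per call.
    IF gv_from > lines( gt_all ).
      RAISE no_more_data.
    ENDIF.
    APPEND LINES OF gt_all FROM gv_from
           TO gv_from + i_maxsize - 1 TO e_t_data.
    gv_from = gv_from + i_maxsize.
    ```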

  • 0FI_AR_4 Datasource, Delta

    Hi Experts,
    we are using the 0FI_AR_4 datasource, which is delta-enabled, but the problem is that we can run the delta just once a day.
    Can anyone please let me know how to change this so that I can run the delta more than once a day?
    Any document or a link would be of great help.
    Thanks in advance.
    Ananth

    hi Ananth,
    take a look at Note 991429 - Minute Based extraction enhancement for 0FI_*_4 extractors
    https://websmp203.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=991429&_NLANG=E
    Symptom
    You would like to implement a 'minute based' extraction logic for the data sources 0FI_GL_4, 0FI_AR_4 and 0FI_AP_4.
    Currently the extraction logic allows only for an extraction once per day without overlap.
    Other terms
    general ledger  0FI_GL_4  0FI_AP_4  0FI_AR_4  extraction  performance
    Reason and Prerequisites
    1. There is a huge volume of data to be extracted on a daily basis from FI to BW, and this requires a lot of time.
    2. You would like to extract the data at more frequent intervals during the day, like 3-4 times a day, without re-extracting all the data that you have already extracted on that day.
    In situations where there is a huge volume of data to be extracted, a lot of time is taken up when extracting on a daily basis. Minute based extraction would enable the extraction to be split into convenient intervals and can be run multiple times during a day. By doing so, the amount of data in each extraction would be reduced and hence the extraction can be done more effectively. This should also reduce the risk of extractor failures caused because of huge data in the system.
    Solution
    Implement the relevant source code changes and follow the instructions in order to enable minute based extraction logic for the extraction of GL data. The applicable data sources are:
                            0FI_GL_4
                            0FI_AR_4
                            0FI_AP_4
    All changes below have to be implemented first in a standard test system. The new extractor logic must be tested very carefully before it can be used in a production environment. Test cases must include all relevant processes that would be used/carried in the normal course of extraction.
    Manual changes are to be carried out before the source code changes in the correction instructions of this note.
    1. Manual changes
    a) Add the following parameters to the table BWOM_SETTINGS
                             MANDT  OLTPSOURCE    PARAM_NAME          PARAM_VALUE
                             XXX                  BWFINEXT
                             XXX                  BWFINSAF            3600
                  Note: XXX refers to the specific client(like 300) under use/test.
                  This can be achieved using transaction 'SE16' for table
                             'BWOM_SETTINGS'
                              Menu --> Table Entry --> Create
                              --> Add the above two parameters one after another
    b) To the views BKPF_BSAD, BKPF_BSAK, BKPF_BSID, BKPF_BSIK
                           under the view fields add the below field,
                           View Field  Table    Field      Data Element  DType  Length
                           CPUTM       BKPF    CPUTM          CPUTM      TIMS   6
                           This can be achieved using transaction 'SE11' for views
                           BKPF_BSAD, BKPF_BSAK , BKPF_BSID , BKPF_BSIK (one after another)
                               --> Change --> View Fields
                               --> Add the above mentioned field with exact details
    c) For the table BWFI_AEDAT index-1  for extractors
                           add the field AETIM (apart from the existing MANDT, BUKRS, and AEDAT)
                           and activate this Non Unique index on all database systems (or at least on the database under use).
                           This can be achieved using transaction 'SE11' for table 'BWFI_AEDAT'
                               --> Display --> Indexes --> Index-1 For extractors
                               --> Change
                               --> Add the field AETIM to the last position (after AEDAT field )
                               --> Activate the index on database
    2. Implement the source code changes as in the note correction instructions.
    3. After implementing the source code changes using the SNOTE instructions, add the following parameters to the respective function modules and activate them.
    a) Function Module: BWFIT_GET_TIMESTAMPS
                        1. Export Parameter
                        a. Parameter Name  : E_TIME_LOW
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Pass Value      : Ticked/checked (yes)
                        2. Export Parameter
                        a. Parameter Name  : E_TIME_HIGH
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Pass Value      : Ticked/checked (yes)
    b) Function Module: BWFIT_UPDATE_TIMESTAMPS
                        1. Import Parameter (add after I_DATE_HIGH)
                        a. Parameter Name  : I_TIME_LOW
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Optional        : Ticked/checked (yes)
                        e. Pass Value      : Ticked/checked (yes)
                        2. Import Parameter (add after I_TIME_LOW)
                        a. Parameter Name  : I_TIME_HIGH
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Optional        : Ticked/checked (yes)
                        e. Pass Value      : Ticked/checked (yes)
    4. Working of minute based extraction logic:
    The minute-based extraction works by considering the time when selecting the data (in addition to the date of the document, whether changed or new, as in the earlier logic). The code is modified so that it considers the new flags in the BWOM_SETTINGS table (BWFINEXT and BWFINSAF); without these flags set as per the instructions, the earlier extraction logic behaves as before (though the code is modified to include the new logic).
    The safety interval now depends on the flag BWFINSAF (in seconds; default 3600 = 1 hour), which tries to ensure that documents delayed in posting (due to delays in update modules for any reason) are still captured. There is also specific coding that posts an entry to BWFI_AEDAT with the details of documents which failed to post within the safety limit of 1 hour, so that they are at least extracted as changed documents if they were missed as new documents. If a huge number of documents fails to post within the safety limit, the volume of BWFI_AEDAT will increase correspondingly.
    The flag BWFINSAF can be set to a particular value depending on specific requirements (in seconds, but at least 3600 = 1 hour), e.g. 24 hours / 1 day = 24 * 3600 = 86400. With the new logic switched on (flag BWFINEXT = 'X'), the flags BWFIOVERLA, BWFISAFETY and BWFITIMBOR are ignored, while BWFILOWLIM and DELTIMEST work as before.
    As per the instructions above the index-1 for the extraction in table BWFI_AEDAT would include the field AETIM which would enable the new logic to extract faster as AETIM is also considered as per the new logic. This could be removed if the standard logic is restored back.
    With the new extractor logic implemented, you can switch back to the standard logic at any time by resetting the flag BWFINEXT from 'X' to ' ' and extracting as before. But ensure that no extraction is running (for any of the 0FI_*_4 extractors/datasources) while switching.
    As with the earlier logic, to restore the previous timestamp in table BWOM2_TIMEST (to get the data from a previous extraction), LAST_TS can be set to the previous extraction timestamp when no extractions are running for that particular extractor or datasource.
    With the frequency of the extraction increased (say 3 times a day) the volume of the data being extracted with each extraction would decrease and hence extractor would take lesser time.
    You should optimize the interval of time for the extractor runs by testing the best suitable intervals for optimal performance. We would not be able to give a definite suggestion on this, as it would vary from system to system and would depend on the data volume in the system, number of postings done everyday and other variable factors.
    To turn on the new logic, BWFINEXT has to be set to 'X', and reset to ' ' when reverting. This change must be done only when no extractions are running, considering all the points above.
                  With the new minute based extraction logic switched ON,
    a) Ensure BWFI_AEDAT index-1 is enhanced with addition of AETIM and is active on the database.
    b) Ensure BWFINSAF is at least 3600 (1 hour) in BWOM_SETTINGS.
    c) An optimum value of DELTIMEST is maintained as needed (the recommended/default value is 60).
    d) Proper testing (functional and performance) is performed in a standard test system, and the results are all positive, before moving the changes to production, with the test system identical to the production system in settings and data.
    http://help.sap.com/saphelp_bw33/helpdata/en/af/16533bbb15b762e10000000a114084/frameset.htm
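    As a quick runtime check (not part of the note itself), you can read the switch straight from BWOM_SETTINGS, using the PARAM_NAME/PARAM_VALUE fields listed above:

    ```abap
    * Check whether the minute-based extraction logic is switched on.
    DATA lv_value TYPE bwom_settings-param_value.

    SELECT SINGLE param_value FROM bwom_settings INTO lv_value
           WHERE param_name = 'BWFINEXT'.
    IF sy-subrc = 0 AND lv_value = 'X'.
      WRITE / 'Minute-based extraction logic is active.'.
    ELSE.
      WRITE / 'Standard (daily) extraction logic is active.'.
    ENDIF.
    ```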

  • Issues while generating Schema DAT files

    We are facing two types of issues when generating schema ".dat" files from an Informix database on Solaris using the
    "IDS9_DSML_SCRIPT.sh" script.
    We execute the command at the Solaris prompt as follows:
    "IDS9_DSML_SCRIPT.sh <DBName> <DB Server Name>"
    The first issue is that after the command is executed, the following error occurs while generating the ".dat" files. This error occurs for many tables:
    19834: Error in unload due to invalid data : row number 1.
    Error in line 1
    Near character position 54
    Database closed.
    This happens randomly for some schemas, so we move the script to a different folder in Unix and execute it again.
    Can we get a solution for avoiding this error?
    2. The second issue is as follows.
    When the ".dat" files are generated without any errors using the script, these .dat files are provided to the OMWB tool to load the source model.
    The issue is that sometimes OMWB is not able to complete the process of creating the source model from the .dat files and gets stuck.
    Sometimes the tables are loaded, but with wrong names.
    For example, the .dat file has the table name s\ysmenus for the sysmenus table,
    and when loaded into Oracle the table is created with the name s_ysmenus.
    Based on our analysis and understanding, this error occurs due to the delimiter.
    For example, this is a snippet from a .dat file generated by the IDS9_DSML_SCRIPT.sh script. The table name sysprocauthy is generated as s\ysprocauthy.
    In Oracle this table is created with the name s_ysprocauthy.
    s\ysprocauthy║yinformixy║y4194387y║y19y║y69y║y4y║y2y║y0y║y2005-03-31y║y65537y║yT
    y║yRy║yy║y16y║y16y║y0y║yy║yy║y╤y
    Thanks & Regards
    Ramanathan KrishnaMurthy

    Hello Rajesh,
    Thanks for your prompt reply. Please find my findings below:
    *) Have there been any changes in the extractor logic causing it to fail before the write-out to file, since the last time you executed it successfully? - I am executing only the standard extractors out of the extractor kit, so presumably this shouldn't be an issue.
    *) Can this be an issue with changed authorizations? - I will check this today, but again this does not seem likely, as the same object for a different test project I created executed fine and a file was created.
    *) Has the export folder been locked or write-protected at the OS level? Have the network settings (if it's a virtual directory) changed? - Does not seem so, for the above reason.
    I will do some analysis today and revert with my findings.
    Regards
    Gundeep

  • Current and Non Current

    Hello Experts,
    I am quite new to BPC and have some doubts relating to current and non-current classifications in the balance sheet while doing consolidation.
    We have one operational G/L account for a long-term loan, and based on the due dates we need to bifurcate that G/L into current and non-current portions in BPC for reporting purposes. As there is no due-date concept in ECC, the whole amount comes under one G/L account only.
    Is there any way to bifurcate a particular G/L into current and non-current portions based on due dates in BPC 10.0 NW?
    Regards,
    Rahul

    HI Rahul
    To explain further, I will take your example of current and non-current portions of a loan.
    The amount falling in each category should be determinable in ECC using a combination of account, due date, etc., depending on how the amount is stored (e.g., in the case of accounts payable, each line item carries a different due date).
    In BPC the two approaches are:
    1) Having two accounts - loan current and loan non-current - in BPC
    1a) Amount determinable in ECC
    If the amounts are clearly determinable at the time of extraction, the split can be done using the extractor logic (a line-item extractor would help in determining the amount with the help of the due date).
    1b) Manually derived
    In case the amounts are manually calculated, the split could be done using an input schedule or script logic, where the splits are taken as user input and pushed into the relevant G/L.
    2) One account and flow combination
    This basically means creating statistical flows in BPC to represent current and non-current, say 'Cur01' or 'NonCur01'.
    2a) Amount determinable in ECC
    The transaction type could be filled with the identified value by the extraction program when it determines the nature and splits the amount.
    2b) Manually derived
    In the input schedule or script, the input can be saved against the account and flow combination.
    Regards
    Surabhi

  • What is Unsynchronized Backup Recovery in BW System?

    What is Unsynchronized Backup Recovery in a BW system?
    I am using the 2LIS_06_INV datasource, and I see that this datasource's extractor logic is "Unsynchronized Backup Recovery in BW System".
    Thanks

    also how can we do delta extraction for this data source ?
    How can we do LO Cockpit Extraction for this ?
    Thanks in advance ...

  • Memory Management Questions

    Hello All!
    I read the Memory Management Programming Guide for Cocoa - several times. But some things are still not really clear, and I would like and need a deeper understanding, so I hope someone can help me. The problem is that I had to get rid of several (..) memory leaks in my app, and now I am a bit confused and unsure about my skills.
    1.
    What is the difference between sayHello1, sayHello2, getHello1, getHello2 and getHello3, and which one is "better" (and why)? Please don't try to interpret the logic/sense of the methods themselves.
    - (NSString *)sayHello1 {
        return [[[NSString alloc] initWithString:@"Hello"] autorelease];
    }
    - (NSString *)sayHello2 {
        return [[[NSString alloc] initWithString:@"Hello"] retain];
    }
    - (void)getHello1 {
        NSString *hello = [self sayHello1];
        [hello release];
    }
    - (void)getHello2 {
        NSString *hello = [self sayHello2];
        [hello release];
    }
    - (void)getHello3:(NSString *)hello {
        [hello retain];
        NSLog(@"%@", hello);
        [hello release];
    }
    Concerning this, there are several questions:
    2.
    If I have to release everything I retain/alloc, why do I have a memory leak if I return an object (allocated with alloc and init) from a method without autorelease? The object is still in memory, but the following method won't work, which I accept. The object, once returned, is not reachable, yet also not released. Why is it not automatically released then? (I don't mean autorelease.)
    - (NSString *)sayHello1 {
        return [[NSString alloc] initWithString:@"Hello"];
    }
    - (void)getHello {
        NSString *hello = [self sayHello1]; // won't work: the object is not reachable, but also not released. WHERE is it?
        [hello release];
    }
    3.
    When is a delegate released if I have no variable I can use to release it? That is, if I have nothing to access the delegate with, like an NSURLConnection delegate?
    Should I, for example, call [self release]?
    - (void)startParser {
        Parser *parser = [[Parser alloc] init];
        [parser start];
        // should I use [parser autorelease] or [parser retain] here?
    }
    - (void)parserDidEndDocument:(NSXMLParser *)parser {
        // do some things with the parser stuff
        [self release];
    }
    4.
    *And the last question:*
    Where can I see in Instruments which elements have retain counts > 1 and which are potential leaks? I was reading the Instruments guide, but there is only theoretical stuff - no practical guidance like "your app should not use more than X megabytes of RAM". For example: my app gets slower and slower the longer I use it, which indicates memory leaks.

    A Leak is only a leak if the reference to the object is lost.
    https://devforums.apple.com/message/189661#189661

  • How to extract CATSDB ?

    I have a request where I need to replicate the BW CATSDB extractor logic within R/3, i.e. extract data from CATSDB. I have used FM CATS_BIW_GET_DATA2 (extractor 0CA_TS_IS_1); the FM requires a value in I_REQUNR, but nothing works. Does anyone have any other suggestion on how to get the data?
    Note: I have tried CATS_BIW_GET_DATA, and that worked when I entered i_requnr = '000'. However, I received a few thousand records fewer than the actual number in the table itself.

    Hi,
    CATS_BIW_GET_DATA selects only status '30' (entries which are approved) from CATSDB. This might be the difference.
    Regards
    Nicola
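    If the approval status is indeed the difference, a quick count on CATSDB makes it visible (STATUS '30' = approved, per the reply above; the exact statuses to include depend on your process):

    ```abap
    * Compare total vs. approved records in CATSDB.
    DATA: lv_total TYPE i,
          lv_appr  TYPE i,
          lv_diff  TYPE i.

    SELECT COUNT(*) FROM catsdb INTO lv_total.
    SELECT COUNT(*) FROM catsdb INTO lv_appr WHERE status = '30'.

    lv_diff = lv_total - lv_appr.
    WRITE: / 'Total records:', lv_total,
           / 'Approved (30):', lv_appr,
           / 'Not approved: ', lv_diff.
    ```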

  • ACS v4.1 PEAP and MAC Address Validation

    I would like to authenticate to a ACS server via both 802.1x (PEAP) and to also validate the MAC Address of the user. Can both of these be done? I have 802.1x (PEAP) working to the ACS and Active Directory but now I would like to add the MAC Address of the laptops. Can I use Network Access Profiles and add the MAC-address under MAC-Authentication bypass?
    Your assistance is appreciated.

    I seem to have figured my way out of this. The reason for the short dot1x timer is that we are using MAB to authenticate the client MAC, so we actually WANT the dot1x authentication to timeout as quickly as possible for the secondary (MAB) authentication to execute.
    I'm also suffering from the age-old problem of interpreting the logic of a config originally implemented by someone else. I'm wondering if all the dot1x commands we have are actually necessary in our situation.
    What I have found when comparing new switches to old is that on the 3750s, show authentication sessions for an interface only shows mab as a runnable method, while on the 3850s it lists dot1x, mab and webauth (in that order). Using authentication order mab and authentication priority mab on an interface of the 3850 seems to do the trick. With debug mab turned on you can see the mab authentication working and the switch then allows the interface to pass traffic. Just as importantly, it blocks the port if I try using a client whose MAC is not in the ACS database.
    Appreciate your help.

  • Query counter : event 3100 not logged in table RSDDSTAT_OLAP

    Hi experts,
    We have a customer scenario wherein the table RSDDSTAT_OLAP is not populated with event ID '3100' when a query is executed. Due to this, the query counter does not get any value (according to the extractor logic RSTC_BIRS_OLAP_AGGREGATED_DATA).
    The changes from notes 1250182, 1505325 and 1093755 are already present in the customer's system.
    Hence we would like to know the cases in which the OLAP read event ID 000003100 would not be recorded in the table RSDDSTAT_OLAP on execution of a query.
    Thanks in Advance,
    Keerthi

    Hi,
    It's a good decision you made because you shouldn't create objects in the SYS schema. I see you're using the add_job_email_notification procedure, so I'm assuming you're on 11g onwards.
    This procedure is meant to make the job easy for you. That is, you don't need to worry about setting up AQ, the procedure does it all for you. All you need to do is setup the mail server if not already done, and call this procedure to add the mail notification. That's it.
    You can also specify a filter to the add_job_email_notification procedure, so maybe you could specify directly your CLIENT.LOAD_DATA job to this procedure.
