Max number of records for 'BAPI_PBSRVAPS_GETDETAIL'

Hi All,
Can you suggest the maximum number of records that should be fed to 'BAPI_PBSRVAPS_GETDETAIL'?
I am using a few location products for 9 key figures. Whenever the number of records
in the selection table increases, the BAPI behaves strangely and the code written below it does not get executed.
Please guide me; full points will be awarded.
Thanks in Advance,
Chandan Dubey

This is a server memory issue!

Similar Messages

  • Max Number of records for BAPI 'BAPI_PBSRVAPS_GETDETAIL'

    Hi Uma,
    It comes out of the program after this code is executed. I have 50 location-product combinations in the vit_selection table.
      CALL FUNCTION 'BAPI_PBSRVAPS_GETDETAIL'
        EXPORTING
          planningbook                = planning_book
          period_type                 = 'B'
          date_from                   = l_from_week
          date_to                     = l_to_week
          logical_system              = logical_system
          business_system_group       = business_system_group
        TABLES
          selection                   = vit_selection
          group_by                    = vit_group_by
          key_figure_selection        = vit_kf_selection
          time_series                 = vit_t_s
          time_series_item            = vit_t_s_i
          characteristics_combination = vit_c_c
          return                      = vit_return.
      LOOP AT vit_return.

  • Max number of records in MDM workflow

    Hi All
    Need urgent recommendations.
    We have a scenario where we need to launch a workflow upon import of records. The challenge is that the source file contains 80k records and it is always a FULL load (on a daily basis) in MDM. Is there any limitation in MDM workflow on the maximum number of records? Will there be significant performance issues if we have a workflow with such a huge number of records in MDM?
    Please share your inputs.
    Thanks-Ravi

    Hi Ravi,
    Yes, it can cause performance overhead, and you will also have to optimise the MDIS parameters for this.
    Regarding the WF, I think it is normally 100 records per WF. I think you can set a particular threshold of records after which the WF will auto-launch.
    It is difficult to say what the optimum number of records per WF should be, so I would suggest a test run with 100/1000 records per WF. The Import Manager guide says there are several performance implications of importing records in a WF, so it is better to try different ranges.
    Thanks,
    Ravi

  • Max number of records in an internal table

    Hi,
    Can anyone tell me the maximum number of records we can get into an internal table?
    If you have any link to SAP help on this, please forward it.
    Thanks in advance.
    Regards,
    Lakshmikanth.T.V

    Hi lakshmikanth,
    Internal Tables as Dynamic Data Objects
    Internal tables are always completely specified regarding row type, key and access type. However, the number of lines is not fixed. Thus internal tables are dynamic data objects, since they can contain any number of lines of a particular type. The only restriction on the number of lines an internal table may contain are the limits of your system installation. The maximum memory that can be occupied by an internal table (including its internal administration) is 2 gigabytes. A more realistic figure is up to 500 megabytes. An additional restriction for hashed tables is that they may not contain more than 2 million entries. The line types of internal tables can be any ABAP data types - elementary, structured, or internal tables. The individual lines of an internal table are called table lines or table entries. Each component of a structured line is called a column in the internal table.
    regards,
    keerthi.

  • SQL help: return number of records for each day of last month.

    Hi: I have records in the database with a field in the table which contains the Unix epoch time for each record. Let's say the table name is ED and the field utime contains the Unix epoch time.
    Is there a way to get a count of the number of records for each day of the last month? Essentially I want a query which returns a list of counts (number of records for each day) based on the utime field containing the Unix epoch time. If a particular day does not have any records, I want the query to return 0 for that day. I have no clue where to start. Would I need another table which has the list of days?
    Thanks
    Ray

    Peter: thanks. That helps, but not completely.
    When I run the query to include only records for July, using a statement such as the following
    ============
    SELECT /*+ FIRST_ROWS */ COUNT(ED.UTIMESTAMP), TO_CHAR((TO_DATE('01/01/1970','MM/DD/YYYY') + (ED.UTIMESTAMP/86400)), 'MM/DD') AS DATA
    FROM EVENT_DATA ED
    WHERE AGENT_ID = 160
    AND (TO_CHAR((TO_DATE('01/01/1970','MM/DD/YYYY')+(ED.UTIMESTAMP/86400)), 'MM/YYYY') = TO_CHAR(SYSDATE-15, 'MM/YYYY'))
    GROUP BY TO_CHAR((TO_DATE('01/01/1970','MM/DD/YYYY') + (ED.UTIMESTAMP/86400)), 'MM/DD')
    ORDER BY TO_CHAR((TO_DATE('01/01/1970','MM/DD/YYYY') + (ED.UTIMESTAMP/86400)), 'MM/DD');
    =============
    I get the following
    COUNT(ED.UTIMESTAMP) DATA
    1 07/20
    1 07/21
    1 07/24
    2 07/25
    2 07/27
    2 07/28
    2 07/29
    1 07/30
    2 07/31
    Some dates do not have any records and so produce no output. Is there a way to show the missing dates with a COUNT value = 0?
    Thanks
    Ray
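    One way to also show the days that have no records is to generate a row for every day of the target month (for example with CONNECT BY LEVEL against dual) and outer-join the event table to it. Below is a minimal sketch against the EVENT_DATA table from the query above; the ADD_MONTHS(SYSDATE, -1) month anchor and the AGENT_ID = 160 filter are assumptions carried over from this thread, so adjust them to your needs:
    WITH days AS (
      -- one row per day of last month
      SELECT TRUNC(ADD_MONTHS(SYSDATE, -1), 'MM') + LEVEL - 1 AS day
        FROM dual
      CONNECT BY LEVEL <= EXTRACT(DAY FROM LAST_DAY(ADD_MONTHS(SYSDATE, -1)))
    )
    SELECT TO_CHAR(d.day, 'MM/DD') AS data,
           COUNT(e.utimestamp)     AS cnt   -- returns 0 when no rows join
      FROM days d
      LEFT JOIN event_data e
        ON  e.agent_id = 160
        AND TO_DATE('01/01/1970','MM/DD/YYYY') + e.utimestamp/86400 >= d.day
        AND TO_DATE('01/01/1970','MM/DD/YYYY') + e.utimestamp/86400 <  d.day + 1
     GROUP BY d.day
     ORDER BY d.day;
    Note that the AGENT_ID filter has to stay in the join condition rather than in a WHERE clause; otherwise the zero-count days would be filtered out again.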

  • Max Number of items for an FI document (999) has been exceeded

    Dear Gurus,
    I have a production BOM for A (a truck), and now when I declare the production it gives me an error:
    Max Number of items for an FI document (999) has been exceeded.
    How can I overcome this error?
    Please help
    Regards
    Rahul Bhardwaj

    Hi
    You cannot post more than 999 line items in an FI document, and you need to split the document if the line items exceed 999. It looks like your production BOM itself contains some errors; check it, because no production BOM should contain such a huge number of materials. Splitting the BOM into sub-BOMs may be another option.
    Srinivas

  • What is the max number of records in table

    Hello Friends,
    I am using Oracle 11g.
    How many records can we store in a table, i.e. what is the maximum size of a table? On what factors does it depend?
    If the number of records keeps growing, what is the best possible solution?
    thanks/kumar

    There is a limit based on the limit of the ROWID.
    You may find this limit in the Oracle documentation.
    From database version 9.0 onwards it is virtually unlimited in practice, as you are hardly likely to reach the maximum value of the ROWID with the data volumes we can store now and the actual speed of our computers (in our lifetime).

  • Is a subquery in a BO report limited to a max number of records???

    Here's my problem:
    I received an Excel sheet with 700 customer records from a client who wants me to create a report with specific data for these customers in my Business Objects universe (BO6.5 on SQL Server).
    So I created a data provider with query 1, i.e. the requested data of the customers. Then I created a second data provider, query 2, based on 'personal files', i.e. the Excel sheet. In query 1 I added to the conditions that each customer should be in (sub)query 2 (CustomerId in list of the query result 'query2.CustomerId').
    The syntax I have used for this seems OK.
    However, I receive the following error: "Too many selected values (LOV0001)". I know this error has to do with the parameter MAX_INLIST_VALUES, which is limited by default to 99 and can be extended to 256 max. But I thought it referred to the maximum number of items in lists of values.
    When I limit the number of records in the Excel sheet to 99 the result is perfect (proof that I got the syntax right!). I can raise the parameter to 256, and can split the Excel sheet into three, but that will not be useful when next time my client sends me 10,000 customer records.
    Can I make reports in BO which use subqueries that return more than 256 records at all? (Hardly imaginable.)
    What is the best way to do this?
    Thanks in advance!

    Hi Lucas,
    Following is the information regarding the issue you are getting and might help you to resolve the issue.
    ADAPT00519195- Too many selected values (LOV0001) - Select Query Result operand
    For XIR2 Fixed Details-Rejected as this is by design
    I have found that this is a limitation by design and when the values exceed 18000 we get this error in BO.
    There is no fix for this issue, as it's by design. The product has always behaved in this manner.
    Also an ER (ADAPT00754295) for this issue has already been raised.
    Unfortunately, we cannot confirm if and when this Enhancement Request will be taken on by the developers.
    A dedicated team reviews all ERs on a regular basis for technical and commercial feasibility and whether or not the functionality is consistent with our product direction. Unfortunately we cannot presently advise on a timeframe for the inclusion of any ER to our product suite.
    The product group will then review the request and determine whether or not the functionality/feature will be included in a future release.
    Currently I can only suggest that you check the release notes in the ReadMe documents of future service packs, as it will be listed there once the ER has been included.
    The only workaround which I can suggest for now is:
    Workaround 1:
    Test the issue by keeping the value of the MAX_Inlist_values parameter at 256 at the designer level.
    Workaround 2:
    The best solution is to combine 'n' queries via a UNION. You should first highlight the first 99 or so entries from the LOV list box and then combine this query with a second one that selects the remaining LOV choices.
    Using UNION between queries is the only possible workaround.
    Please do let me know if you have any queries related to the same.
    Regards,
    Sarbhjeet Kaur

  • Pulling records where number of records for unique ID = 6

    I have a table that contains address information for everyone in the system. It has numerous fields, though I've only included a few in the CREATE TABLE statement below for the sake of brevity. The PIDM uniquely identifies each record as belonging to a particular person in the database. A person can have multiple addresses in the table, though we normally do not allow them to have more than one active address of a particular ATYP_CODE. Again, I am simplifying here for the sake of brevity.
    What I need to do is pull all the records for each PIDM, but only where there are >= 6 records per PIDM. The user doesn't care if the data are pivoted (I can do that part if needed). Pulling the actual data isn't the issue; I just need a little help figuring out how to get only the records of PIDMs with six or more records in the table. So, from the example data below, the records for PIDM 12345 and 34567 are the ones that should be in the output, but the ones for PIDM 23456 should not.
    DROP TABLE SPRADDR;
    CREATE TABLE SPRADDR
    (PIDM              NUMBER(8),
    ATYP_CODE     VARCHAR2(2 CHAR),
    STREETLINE1   VARCHAR2(60 CHAR),
    CITY              VARCHAR2(60 CHAR),
    STATE              VARCHAR2(2 CHAR),
    ZIP              VARCHAR2(10));
    INSERT INTO SPRADDR VALUES (12345,'PR','1 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (12345,'MA','1 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (12345,'BU','1 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (12345,'PR','2 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (12345,'MA','3 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (12345,'PR','4 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (23456,'PR','1 MAIN','KENT','OH','44240');
    INSERT INTO SPRADDR VALUES (23456,'MA','1 MAIN','KENT','OH','44240');
    INSERT INTO SPRADDR VALUES (23456,'BU','1 MAIN','KENT','OH','44240');
    INSERT INTO SPRADDR VALUES (34567,'PR','1 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (34567,'MA','1 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (34567,'BU','1 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (34567,'PR','2 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (34567,'MA','3 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (34567,'PR','4 MAIN','CANFIELD','OH','44406');
    INSERT INTO SPRADDR VALUES (34567,'PR','6 MAIN','CANFIELD','OH','44406');
    COMMIT;
    I'd greatly appreciate any help you might be able to provide. I'm sure this is easy, but what I've done so far has not worked, and I'm not including the code I tried because it's totally cockeyed and not working at all.
    Thanks,
    Michelle Craig
    Data Coordinator
    Admissions Operations and Transfer Systems
    Kent State University

    "PIDM 12345 and 34567 are the ones that should."
    Why 12345? It is repeated 6 times, where you asked for > 6. Anyway:
    SQL> select  *
      2    from  (
      3           select  s.*,
      4                   count(*) over(partition by pidm) cnt
      5             from  spraddr s
      6          )
      7    where cnt > 6
      8    order by pidm
      9  /
          PIDM AT STREETLINE1     CITY                 ST ZIP               CNT
         34567 PR 1 MAIN          CANFIELD             OH 44406               7
         34567 MA 1 MAIN          CANFIELD             OH 44406               7
         34567 BU 1 MAIN          CANFIELD             OH 44406               7
         34567 PR 2 MAIN          CANFIELD             OH 44406               7
         34567 MA 3 MAIN          CANFIELD             OH 44406               7
         34567 PR 4 MAIN          CANFIELD             OH 44406               7
         34567 PR 6 MAIN          CANFIELD             OH 44406               7
    7 rows selected.
    SQL> select  *
      2    from  (
      3           select  s.*,
      4                   count(*) over(partition by pidm) cnt
      5             from  spraddr s
      6          )
      7    where cnt >= 6
      8    order by pidm
      9  /
          PIDM AT STREETLINE1     CITY                 ST ZIP               CNT
         12345 PR 1 MAIN          CANFIELD             OH 44406               6
         12345 MA 1 MAIN          CANFIELD             OH 44406               6
         12345 BU 1 MAIN          CANFIELD             OH 44406               6
         12345 PR 2 MAIN          CANFIELD             OH 44406               6
         12345 MA 3 MAIN          CANFIELD             OH 44406               6
         12345 PR 4 MAIN          CANFIELD             OH 44406               6
         34567 PR 1 MAIN          CANFIELD             OH 44406               7
         34567 MA 1 MAIN          CANFIELD             OH 44406               7
          PIDM AT STREETLINE1     CITY                 ST ZIP               CNT
         34567 BU 1 MAIN          CANFIELD             OH 44406               7
         34567 PR 2 MAIN          CANFIELD             OH 44406               7
         34567 MA 3 MAIN          CANFIELD             OH 44406               7
         34567 PR 4 MAIN          CANFIELD             OH 44406               7
         34567 PR 6 MAIN          CANFIELD             OH 44406               7
    13 rows selected.
    SQL>
    SY.
    Edited by: Solomon Yakobson on May 10, 2012 10:02 AM
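    For comparison, the same rows can also be returned without the analytic function by filtering PIDMs in a GROUP BY ... HAVING subquery. A minimal sketch against the SPRADDR table posted above (the analytic version avoids reading the table twice, so which one performs better depends on indexing and data volume):
    SELECT s.*
      FROM spraddr s
     WHERE s.pidm IN (SELECT pidm
                        FROM spraddr
                       GROUP BY pidm
                      HAVING COUNT(*) >= 6)   -- PIDMs with six or more rows
     ORDER BY s.pidm;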

  • Max number of records to hold in explicit cursor

    Hi Everyone,
    What is the maximum number of records that can be held in
    an explicit cursor for manipulation? I need to process millions of records.
    Can I hold them in cursors, or should I use a temp table to hold those records and
    do the fixes with volume control?
    Thanks

    Hi Kishore sorry for the delayed response,
    Table1
    prim_oid     sec_oid          rel_oid
    pp101     cp102          101
    pp101     cp103          101
    pp102     cp104          101
    pp102     cp105          101
    Table2
    ID     p_oid     b_oid     rel_oid
    1     pp101     -51     102
    2     pp102     -51     102
    3     cp102     52     102
    4     cp103     53     102
    5     cp104     54     102
    6     cp105     54     102
    From table1 I get the parent and child records based on rel_oid = 101;
    the prim_oid and sec_oid are related to another column in table2, again
    with a rel_oid. I need to get all the prim_oid that are linked to a negative b_oid
    in table2 and whose child sec_oid are linked to a positive b_oid.
    In the above case, parent pp101 is linked to 2 children, cp102 and cp103, and
    pp102 is linked to 2 children, cp104 and cp105. Both pp101 and pp102 are linked
    to a negative b_oid (table2), but the children of these parents are linked to positive b_oids.
    However, pp101's children are linked to 2 different b_oids, while pp102's children are linked
    to the same b_oid. For my requirement I can only update the b_oid of pp102 with that
    of its children's b_oid, whereas I cannot update pp101's b_oid as its children are
    linked to different b_oids.
    I've a sql that will return prim_oid, b_oid, sec_oid, b_oid as a record as below
    1     pp101     -51     3     cp102     52
    1     pp101     -51     4     cp103     53
    2     pp102     -51     5     cp104     54
    2     pp102     -51     6     cp105     54
    With a cursor SQL that returns records as above, it would be difficult to process
    distinct parents and distinct children. So I have a cursor that returns only the parent
    records, as below,
    1     pp101     -51
    2     pp102     -51
    and then for each parent I get the distinct child b_oid; if I get only one child
    b_oid I update the parent, else I don't. But the problem is that table2 has 8 million parent records
    with a link to a negative b_oid, while the children of only 2 million of those records link to a single distinct
    b_oid.
    If I include volume control in the cursor SQL, chances are it returns only rows like
    pp101 for which no update is required, so I should not have volume control in the
    cursor SQL, which will then return all 8 million records (my assumption).
    Is there any other feasible solution? Thanks
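    An explicit cursor does not materialize all its rows up front; the practical limit is how many rows you fetch into memory at once, which is why BULK COLLECT with a LIMIT clause is the usual pattern for volumes like this. Below is a minimal sketch of that pattern, reusing the table2 column names from this thread; the per-parent check/update logic from your description is only indicated by a placeholder:
    DECLARE
      -- driving cursor: parent rows linked to a negative b_oid
      CURSOR c_parents IS
        SELECT id, p_oid
          FROM table2
         WHERE rel_oid = 102
           AND b_oid   < 0;
      TYPE t_id_tab  IS TABLE OF table2.id%TYPE;
      TYPE t_oid_tab IS TABLE OF table2.p_oid%TYPE;
      l_ids  t_id_tab;
      l_oids t_oid_tab;
    BEGIN
      OPEN c_parents;
      LOOP
        -- fetch in chunks of 10,000 instead of holding 8 million rows in memory
        FETCH c_parents BULK COLLECT INTO l_ids, l_oids LIMIT 10000;
        EXIT WHEN l_ids.COUNT = 0;
        FOR i IN 1 .. l_ids.COUNT LOOP
          NULL;  -- per-parent check of the distinct child b_oid and the UPDATE go here
        END LOOP;
      END LOOP;
      CLOSE c_parents;
    END;
    /
    If the per-parent check can be expressed in SQL, a single set-based UPDATE (or MERGE) over the join of table1 and table2 will usually outperform any row-by-row cursor loop.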

  • Max number of records in a cube & architectural issues

    Hi,
    Sorry if my question has already been asked, but I can't find the same question with the search button (maybe I don't have the right words to search with; I'm not a native English speaker).
    I am on a very big IP project. The forecast volume of planned records is about 1,000,000,000 records a year, so we chose to split the records across many cubes.
    1) Is there a maximum number of records supported by a cube that is to be planned? We plan to put a maximum of 100,000,000 records in each cube to be planned. Is that too much?
    2) If I make 100 cubes (one for each organizational entity) with 10,000,000 records per cube, and if I "plug" a planning layout onto these 100 cubes with a multiprovider, will IP:
    spend time searching for the right cube to write to (based on the selected entity) across the 100 cubes (too much time!), or
    search directly in the right cube (thanks to a user exit that matches the cube with the selected entity), so that the response time will be about the same as for 1 layout plugged onto 1 cube of 10,000,000 records?
    Thanks a lot, and sorry for my English
    Georges

    Hi Georges,
    Having too many records in the cube should not be very detrimental to the performance of the planning application, as long as you ensure that the data volume you fetch in one go is restricted to reasonable limits, using restrictions in filters (the more restrictive the better). Take care of this while modelling both your planning functions/sequences and your input-ready queries.
    I understand that you'll need to create a multiprovider for reporting purposes, but if you don't need the data from more than one cube for planning purposes, it is better to create the aggregation level (and the rest of the planning model) on top of the individual cube. In case you want to use the same planning functions/input queries for multiple cubes (which will probably be the case), you can create the aggregation levels on the multiprovider, but make sure you restrict the characteristic 'infoprovider' properly in the filter restrictions to avoid the function reading unnecessary data from many cubes.
    Hope this helps.

  • Max number of connection for adapters

    Hi PI/XI Experts,
    I have a WS-to-Proxy synchronous scenario. I need to know how many clients can consume the web service at the same time. Is there any limitation? If so, what is the maximum number, and where can it be changed?
    There is another question: how many of the WS clients' messages can be processed via the ABAP proxy? It is important to transport all the messages to the ABAP proxy at the same time as much as possible.
    Kind regards,
    Altuğ Bayram

    Hi Raj,
    Where you get this knowledge? When i do search before this message i only found about RFC adapter max connection limitations. Every server/adapter should have a maximum capacity. Maybe it is high number for WS and XI adapters but i need to know what it is. If you how can i find it please share the information.
    Kind regards,
    Altuğ Bayram

  • Maximum number of Records for Emigall Upload

    Hi,
    Is there any limit on the maximum number of records that can be uploaded via EMIGALL at one time?
    Thanks.

    Hi Satish Kumar,
    There exists no limit, apart from some exceptions ;o) These exceptions are objects that require more and more memory during runtime due to growing internal tables. This behavior leads to performance issues because more and more time is spent working on the internal tables instead of updating the database. This is known for the PARTNER migration object and all MM- and PM-related migration objects, such as CONNOBJ, INST_MGMT, etc.
    On the other hand, a long-running import run (because it takes such a long time to migrate the objects in the import file) limits your options for controlling the data import, for example restarting a cancelled import run. As already pointed out, the Distributed Import should be your choice when migrating huge import files with many objects to be migrated.
    I hope this answers your question.
    Kind regards,
    Fritz

  • Maximum number of records for usage of "For all entries"

    Hi,
    Is there a limit on the maximum number of records that can be selected from the database using the "For all entries" statement?
    Thanks in advance

    There is an undocumented(?) behaviour:
    FOR ALL ENTRIES does a hidden SELECT DISTINCT and drops duplicates.
    http://web.mit.edu/fss/dev/abap_review_check_list.htm
    3 pitfalls
    "FOR ALL ENTRIES IN..." is very fast, but keep in mind its special features and the 3 pitfalls of using it.
    (a) Duplicates are removed from the answer set as if you had specified "SELECT DISTINCT"... So unless you intend for duplicates to be deleted, include the unique key of the detail line items in your select statement. In the Data Dictionary (SE11) the fields belonging to the unique key are marked with an "X" in the key column.
    ^^!!!!
    (b) If the "one" table (the table that appears in the clause FOR ALL ENTRIES IN) is empty, all rows in the "many" table (the table that appears in the SELECT INTO clause ) are selected. Therefore make sure you check that the "one" table has rows before issuing a select with the "FOR ALL ENTRIES IN..." clause.
    (c) If the 'one' table (the table that appears in the FOR ALL ENTRIES IN clause) is very large, there is performance degradation. Steven Buttiglieri created sample code to illustrate this.

  • Max number of instances for stateful session beans

    Hi,
    I am using OC4J 9.0.3, which is embedded in JDeveloper. I would like to set the maximum number of instances for stateful session beans. I know that the session-deployment section of the orion-ejb-jar.xml file has an attribute called max-instances, but it only works for stateless session beans (at least the documentation says so, and it did not work with my stateful session bean).
    Anyone knows how to do it?
    Thanks,
    Leonardo Penha

    Leonardo -- Stateful session beans do not come from a pool but are assigned one bean to one user. When the EJB's life ends, the bean goes away. If what you are asking about is passivation configuration, then that is different. We currently do not support passivation based on some count of SFSBs that exist inside the container, but we will provide that capability in the next release of OC4J. (Note: when passivation occurs is mostly up to the vendors; it is not required by the J2EE 1.3 specification based on performance metrics. What is required is that when an SFSB is passivated it is done correctly, and we do that.)
    Thanks -- Jeff
