Determine the count of data packages in a DTP request

Hello,
in a start routine I need to know the count (i.e. the highest number) of all data packages in the current DTP request.
I have had a look at table RSTSODSREQUESTPG; it contains exactly the fields I need,
but I do not find every DTP request in this table.
It looks to me as if this table was only filled in BW 3.5 but not in BI 7.0, or there is another reason why not every DTP request is stored in this table.
Is there a new table where I can get this information?
Thanks
Armin

Hello,
in the table RSBKDATAPAKID I found the field DATAPAKID, which is what I was looking for.
But the field REQUID is only 6 bytes, like: 123456.
In the start routine I only have a 30-byte field, like: DTPR_4B34567890123456789012345
In the Administrator Workbench I can see that both numbers refer to the same request.
Is there a mapping table?
In the table RSDDSTATDTP the field INSTANCE seems to be the 30-byte DTPR number,
and a field DATAPAKID is also in this table.
That looks good; I will try whether it works. A rough sketch of the lookup is below.
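A minimal sketch of that lookup, assuming RSDDSTATDTP is really filled for the request (BI statistics recording must be active) and that lv_instance already holds the 30-byte request ID available in the start routine:
DATA: lv_instance  TYPE rsddstatdtp-instance,
      lv_max_pakid TYPE rsddstatdtp-datapakid.
* lv_instance = 'DTPR_...'  <- the 30-byte request ID from the routine
SELECT MAX( datapakid ) INTO lv_max_pakid
  FROM rsddstatdtp
  WHERE instance = lv_instance.
* lv_max_pakid now holds the highest data package number of the request.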
Thanks
Armin
Edited by: Armin Batzelt on Sep 17, 2008 4:44 PM

Similar Messages

  • Determining the size of a data package

    Hi all,
    There is a simple way to determine the size of the data packages in SAP NW BW 7.x with a DTP, where you can set it easily (option: Package Size).
    How do you determine the package size in SAP BW 3.x, where there is no Package Size option? Maybe it is possible with a function module (datasource in SAP R/3)? Or some other way? I have a problem with that.
    Thanks for your response.

    In a start routine or field routine you can read the size of SOURCE_PACKAGE like below:
    DATA: wa_size TYPE sy-tfill.
    DESCRIBE TABLE source_package LINES wa_size.
    wa_size will hold the size, i.e. the number of entries in the package.
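    On newer ABAP releases (7.02 and up) the same count can also be taken with the lines( ) built-in function; a one-line sketch:
    wa_size = lines( source_package ).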
    Tx,
    SB

  • Problems with the O2 BlackBerry data package on my Curve 3G.

    I have already informed O2 about this, but they claim that I should use the BlackBerry support services, and nothing there helps me!
    I got my BlackBerry Curve 3G on September 9th this year and added the BlackBerry Data Package bolt-on to my phone on September 16th. I then received a text to say they had taken £5 from my credit and it would be up and running within the next 24 hours. It is now September 19th, my BBM is not working at all, and I am extremely upset with the service and behaviour I have received from both O2 and BlackBerry.
    Is there any way you can help? If this fails, I shall be forced to go back to the shop where I got my BlackBerry and ask for their help.
    Many thanks, Jade.


  • To determine the count when a range is present in the select-option

    Hi!
    I want to know how we can determine the count (number of values) when a range is present in the select-option.
    Helpful answers will be rewarded.
    Thanks & regards,
    Raghavendra Kulkarni.

    Refer to my reply in the other thread:
    SELECT <field>
      FROM <master table>
      INTO TABLE itab
      WHERE <key field> IN <range/select-option>.
    DESCRIBE TABLE itab LINES v_lines.
    Alternatively, read the system field directly after the SELECT:
    count = sy-dbcnt.
    (A runnable version is sketched below.)
    Regards,
    ravi
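    A self-contained version of that template, using VBAK/VBELN purely as an illustrative example (any table with a key field would do):
    REPORT zcount_selopt.
    TABLES vbak.
    SELECT-OPTIONS s_vbeln FOR vbak-vbeln.
    DATA: lt_vbeln TYPE TABLE OF vbeln,
          v_lines  TYPE i.
    START-OF-SELECTION.
      SELECT vbeln FROM vbak INTO TABLE lt_vbeln
        WHERE vbeln IN s_vbeln.
      " sy-dbcnt is set by the SELECT itself
      v_lines = sy-dbcnt.
      " equivalent alternative:
      DESCRIBE TABLE lt_vbeln LINES v_lines.
      WRITE: / 'Count:', v_lines.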

  • User exit which determines the confirmed delivery date in order creation

    Hi,
    I need to obtain the ATP details which contain the confirmed delivery date of all the line items. The BAPI BAPI_MATERIAL_AVAILABILITY determines the material availability date but not the confirmed delivery date. I require the confirmed delivery date, which is calculated based on picking, working days, and so on.
    Is there any user exit available to determine the confirmed delivery date from the material availability date?
    Please help, it is urgent.
    Thanks
    Suresh Kumar

    Hi,
    The user exits related to the material availability check during sales document processing are:
    User Exits For Availability Check
    The user exits named here are intended for projects that are to be carried out in agreement with SAP development, as they may only be used in consultation with SAP. SAP provides no guarantee if they are used without our advice!
    USEREXIT_PLANT_SELECTION in program RV03VFZZ is an exception to this rule (see below).
    User exits in program FV45VFZZ
    USEREXIT_ADD_FIELD_TO_HEADER
    USEREXIT_ADD_FIELD_TO_LINE
    User exits in program FV45VFZY
    USEREXIT_DELIVERY_GROUPS
    USEREXIT_MVERF_INIT
    USEREXIT_QUOTA_KEY_VALUE
    User exits in program RV03VFZZ
    USEREXIT_AVAILABILITY_IN
    USEREXIT_AVAILABILITY_OUT
    USEREXIT_DARA_REFRESH
    USEREXIT_PLANT_SELECTION
    This might be of some use.
    Thanks and regards,
    Satya

  • Why the $29.99 data package?

    I have been eligible to upgrade my phone for almost a year now. I haven't upgraded because the phones that I want to upgrade to require the $29.99 data package. I don't like it, but I understand requiring a data package. But why can't I get a smartphone with the $9.99 package? There are two phones on my plan, and I just can't see adding another $60 to my plan for services that I don't need and may only use occasionally.
    There are other carriers out there with cheaper plans but, obviously, their service is not as good in my area or I would have switched by now. I love Verizon's coverage and dependability, but I don't want to be stuck with my LG Dare for the rest of my life. Are there any other options? Or am I going to be required to pay another $60 or switch to a lesser service?

    It was rumored a while back that Verizon was going to change its data plans around like AT&T has, but again, it was just a rumor. Verizon has not done anything.
    If you really want that phone you'll have to fork over the money for the data plan. Otherwise, here is some good news you might like: smartphones released prior to 11-14-2008 do NOT require a data plan at all. You could get a smartphone that was released prior to that date and not get a data plan.

  • Impact of Changing Data Package Size with DTP

    Hi All,
    We have a delta DTP to load data from a DSO to an InfoCube. The default data package size in the DTP is 50,000 records.
    Due to the huge amount of data, internal table memory is exhausted and the data load fails.
    We then changed the data package size to 10,000, and the data load executes successfully.
    The DTP with a package size of 50,000 ran for 40 minutes and failed, while the DTP with a package size of 10,000 took 15 minutes (for the same amount of data).
    Please find my questions below:
    Why does a DTP with a bigger packet size run longer than a DTP with a lower packet size?
    Also, by reducing the standard data package size from 50,000 to 10,000, will it impact any other data loading?
    Thanks

    Hi Sri,
    If your DTP is taking more time, then check your transformation:
    1. Transformations with routines always take more time, so if you want to reduce the execution time the routines should be optimized for good performance.
    2. Also check whether you have filters at DTP level. Due to filters a DTP takes a long time; if the same data is filtered at routine level it takes much less time.
    3. If you cannot change the routine, you can set semantic keys on your DTP. The package data will be sorted by the semantic keys, which may help the routine process it faster.
    4. Your routine is failing due to internal table memory, so check whether you have a SELECT statement in a routine without a FOR ALL ENTRIES IN RESULT_PACKAGE or SOURCE_PACKAGE clause; using it will reduce the record count.
    5. Wherever possible delete duplicate records, and if possible filter useless data in the start routine itself (a small sketch follows after this reply).
    6. Refresh internal tables whose data is no longer needed. If your tables are global, the data stays present across every routine, so refreshing them helps to reduce the memory footprint.
    7. The maximum memory that can be occupied by an internal table (including its internal administration) is 2 gigabytes; a more realistic figure is up to 500 megabytes.
    8. Also check the number of jobs running at that time. If lots of jobs are active at the same time, less memory is available and the DTP may fail.
    Why does a DTP with a bigger packet size run longer than a DTP with a lower packet size?
    Start and end routines work at package level, so the routine runs for each package one by one. By default a package holds data sorted by keys (the non-unique keys/characteristics of the source or target), and by setting semantic keys you can change this order. So a package with more data takes more time to process than a package with less data.
    By reducing the standard data package size from 50,000 to 10,000, will it impact any other data loading?
    It only impacts the running of that load. But yes, if lots of other loads run simultaneously, the server can allocate more space to them. So before reducing the package size, check whether it actually helps routine performance (start and end routines) or just increases overhead.
    Hope these points are helpful.
    Regards,
    Jaya Tiwari
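    A minimal sketch of points 5 and 6, assuming a BW 7.x start routine over SOURCE_PACKAGE; the field names (doc_no, delflag) and the global lookup table gt_lookup are purely illustrative placeholders for your own objects:
    " drop useless rows as early as possible (point 5)
    DELETE source_package WHERE delflag = 'X'.
    " remove duplicates before the expensive per-record logic (point 5)
    SORT source_package BY doc_no.
    DELETE ADJACENT DUPLICATES FROM source_package COMPARING doc_no.
    " release a global buffer once it is no longer needed (point 6)
    FREE gt_lookup.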

  • Loading data into a csv file based on the count of data in a table in ODI

    Hi,
    I have a requirement where I have data in a table that needs to be loaded into a csv file.
    If the count of the data in the table exceeds the csv row limit, then I need to call the same procedure again to
    generate one more csv file and load the remaining data into it. The amount of data in the table may vary each time.
    I am new to ODI. Can anyone tell me a way to do this in ODI, or suggest some logic for it?
    Thanks in advance..
    Edited by: 883410 on May 22, 2012 12:01 AM

    What limit on the csv file are you talking about? There isn't a limit (unless you are on Unix, where the file size limit is 2 GB). Are you talking about opening it in earlier versions of Excel, where the row limit is 65,536? That is irrelevant for your operation.

  • How to determine the sales area data

    Hi,
    If I know the sales organisation to which a user is assigned, how do I find out the sales office, sales group, distribution channel and division associated with that sales org?

    Hi Gincy,
    Yes, there is a function module to get the sales area data of a user.
    Have a look at the rule for the org model named ORGMAN12; it derives the sales org info of the user assigned to the org unit.
    I believe the FM name is CRMD_ORGNMAN_OBJECTS_7, but you can confirm it by going inside the rule ORGMAN12:
    use transaction PFAC, open the rule for the org model, and inside ORGMAN12 you can see the name of the FM.
    Best regards,
    Ashish

  • Data Package Issue in DTP

    Hi gurus,
    My data flow is datasource -> InfoSource -> write-optimized DSO with a semantic key.
    In the source I have 10 records, of which 7 are duplicate records (same set of keys).
    I reduced the DTP data package size from 50,000 to 5.
    When I executed the DTP I got 2 data packages: the first data package held all 7 records for the same set of keys, and the second data package held the remaining records.
    My doubt is: I defined the data package size as 5, so how can the first data package hold 7 records instead of 5?
    Thanks in advance !

    Hi,
    It is because of the semantic key setting that you have maintained. Data records that have the same key are combined in a single data package, so a package can grow beyond the configured size: in your case all 7 records share the same semantic key, so they have to travel together even though the package size is 5. This setting is only relevant for DataStore objects with data fields that are overwritten.
    Semantic groups define how the data packages that are read from the source (DataSource or InfoProvider) are built.
    This setting also defines the key fields for the error stack. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.
    Hope it helps.
    Thanks
    Kamal Mehta

  • How does the SCCM client determine the required space for a package in the cache?

    Hi,
    I have deployed many small applications with SCCM. Now it is time to deploy Lingvo; the size of the installation folder is 2.5 GB, and my test deployment report says that there is not enough client cache disk space... From the SCCM right-click tools, I've
    changed the cache size many times (and restarted the SMS host agent too), and the deployment was successful only after I set the client cache size to 65 GB. Why so ridiculous?

    The Configuration Manager 2012 client cache maintains itself automatically. You can find more information here about how that works:
    http://technet.microsoft.com/en-us/library/gg712288.aspx#BKMK_ClientCache
    Have you verified that your existing applications are NOT marked as "persist in cache"? If they are, they will not be deleted automatically. You could use various scripts to delete the cache on your clients:
    http://cm12sdk.net/?p=1526
    These two link might be handy as well:
    http://blogs.msdn.com/b/helaw/archive/2014/01/07/configuration-manager-cache-management.aspx and
    http://blogs.technet.com/b/meamcs/archive/2012/10/04/managing_2d00_configuration_2d00_manager_2d00_2012_2d00_client_2d00_cache.aspx

  • Data package size

    What is the basic difference between RSCUSTV6 and SBIW -> General Settings -> Maintain Control Parameters in relation to modifying the data package settings?

    Hi,
    Just see the help on
    Maintain Control Parameters for Data Transfer:
    1. Source System
    Enter the logical system of your source client and assign the control parameters you selected to it.
    You can find further information on the source client in the source system by choosing the path
    Tools -> Administration -> Management -> Client Maintenance.
    2. Maximum Size of the Data Package
    When you transfer data into BW, the individual data records are sent in packages of variable size. You can use these parameters to control how large such a data packet typically is.
    If no entry is maintained, the data is transferred with a default setting of 10,000 kBytes per data packet. The memory requirement depends not only on the settings of the data package, but also on the size of the transfer structure and the memory requirement of the relevant extractor.
    3. Maximum Number of Rows in a Data Package
    With large data packages, the memory requirement mainly depends on the number of data records that are transferred with the package. Using this parameter you control the maximum number of data records that the data package should contain.
    By default a maximum of 100,000 records are transferred per data package.
    The maximum main memory requirement per data package is approximately 2 x 'Max. Rows' x 1000 bytes, i.e. about 200 MB at the default of 100,000 rows.
    4. Frequency
    The specified frequency determines after how many data IDocs an Info IDoc is sent, i.e. how many data IDocs an Info IDoc describes.
    Frequency 1 is set by default. This means that an Info IDoc follows every data IDoc. In general, you should select a frequency between 5 and 10, but no higher than 20.
    The bigger the data IDoc packets, the lower the frequency setting should be. In this way, during an upload you obtain information on the progress of the data load at relatively short intervals.
    With the help of every Info IDoc, you can check in the BW monitor whether there are any errors in the loading process. If there are none, the traffic light in the monitor will be green. The Info IDocs contain information such as whether the respective data IDocs were uploaded correctly.
    5. Maximum number of parallel processes for the data transfer
    An entry in this field is only relevant from release 3.1I onwards.
    Enter a number larger than 0. The maximum number of parallel processes is set to 2 by default. The ideal parameter selection depends on the configuration of the application server which you use for transferring data.
    6. Background job target system
    Enter the name of the application server on which the extraction job is to be processed.
    To determine the name of the application server, choose
    Tools -> Administration -> Monitor -> System monitoring -> Server. The name of the application server is displayed in the column Computer.
    7. Maximum Number of Data Packages in a Delta Request
    With this parameter, you can restrict the number of data packages in a delta request or in the repetition of a delta request.
    Only use this restriction when you expect delta requests with a very high data volume, so that, despite sufficiently large data package sizes, more than 1000 data packages can result in a request.
    With an initial value or a value of 0, there is no restriction. Only a value larger than 0 leads to a restriction in the number of data packages. For reasons of consistency, this number is not always adhered to exactly. The actual restriction can, depending on how much the data is compressed in the qRFC queue, deviate from the given limit by up to 100.
    RSA6:
    Used to change the data packet size.
    Thanks
    Reddy
    Edited by: Surendra Reddy on Mar 12, 2010 6:27 AM

  • How to determine the owner of the current package?

    From a package (that is defined with invoker's rights: AUTHID CURRENT_USER) I want to determine the owner of that package. (We plan to deploy the same code into multiple schemas.)
    One approach would be to parse the results of DBMS_UTILITY.format_call_stack . . . any other ideas?
    Thanks,
    Mike

    mtefft wrote:
    We are using Total Recall, aka Flashback Data Archive.
    If you have a Flashback Data Archive on a table, you cannot drop it without removing the archive, and to do that you need the ADMINISTER FLASHBACK ARCHIVE privilege. We want to give users the ability to remove the flashback archive (this is an ALTER TABLE) from their own tables without granting that privilege.
    So, we have a utility schema which has this privilege. We created a definer-rights package to issue the ALTER, but it does not have the ALTER ANY TABLE privilege. So, we added an invoker-rights package that the table owner can call. This first issues GRANT ALTER ON [input-table-name] TO itself, then calls the definer-rights package, which now has both the system privilege and the object privilege needed.
    Why not just create a procedure owned by an "admin" user that does the ALTER TABLE, and grant execute on the proc to the table owners?
    As an example with scott and sys (using sys in a real-world system would be a Bad Idea© but I'm on a playground database on my laptop):
    As Scott:
    Scott@my11g SQL>!cat createTable.sql
    create table mytable (
    id integer,
    val varchar2(30),
    constraint mytable_pk primary key(id)
    );
    Scott@my11g SQL>@createTable
    Table created.
    As admin user:
    Sys@my11g SQL>!cat startFlashback.sql
    alter table scott.mytable flashback archive fl_archive;
    Sys@my11g SQL>alter table scott.mytable flashback archive fl_archive;
    Table altered.
    Sys@my11g SQL>select * from dba_flashback_archive_tables;
    TABLE_NAME  OWNER_NAME  FLASHBACK_ARCHIVE_NAME  ARCHIVE_TABLE_NAME  STATUS
    MYTABLE     SCOTT       FL_ARCHIVE              SYS_FBA_HIST_79498  ENABLED
    As admin user, I create the procedure and grant execute to scott:
    Sys@my11g SQL>!cat createProc.sql
    create or replace procedure remove_flarch(
    p_tabname varchar2
    ) is
    l_strcmd varchar2(200);
    begin
    for cur in (
    select owner, table_name from dba_tables
    where owner = user and table_name = upper(p_tabname)
    ) loop
    l_strcmd := 'alter table '||cur.owner||'.'||cur.table_name||' no flashback archive';
    execute immediate l_strcmd;
    end loop;
    end;
    /
    sho err
    grant execute on remove_flarch to scott;
    Sys@my11g SQL>@createProc.sql
    Procedure created.
    No errors.
    Grant succeeded.
    As Scott, I cannot disable the flashback archive explicitly:
    Scott@my11g SQL>alter table mytable no flashback archive;
    alter table mytable no flashback archive
    ERROR at line 1:
    ORA-55620: No privilege to use Flashback Archive
    But I can do it using the procedure:
    Scott@my11g SQL>begin
      2  sys.remove_flarch('MYTABLE');
      3  end;
      4  /
    PL/SQL procedure successfully completed.
    Checking from the admin point of view:
    Sys@my11g SQL>select * from dba_flashback_archive_tables;
    no rows selected
    mtefft wrote:
    All is well, except for the fact that we want to deploy this under multiple schemas. So we need to know 'to whom should the ALTER TABLE grant be given?'
    Just grant execute on the procedure to your different users.

  • What determines the amount of data in a waveform from a TDS 1012 scope

    Hello,
    What determines the amount of data that is in a waveform that comes out of a TDS 1012 scope? I am assuming that I will have to look at the driver VI to determine the commands sent to the scope to figure it out. I am in a situation where I need the y axis to have a high resolution, and that results in very little data being collected from the scope.
    Regards,
    Kaspar

    Hello,
    The amount of data that comes out of the TDS 1012 scope is determined by the data start (DATa:STARt) and data stop (DATa:STOP) commands that are defined on page 58 (2-38) in the 346-page programming manual for the scope. I found that the data start was not set to 1, which is the beginning of the data.
    I also had a very low-level signal on the channel that was almost unreadable by the scope, which caused me to think that I was not getting all of the data.
    Regards,
    Kaspar

  • Unable to load data into the cube using a DTP in the quality system

    Hi,
    I am unable to load the data from the PSA to the cube using a DTP in the quality system for the first time.
    I am getting errors like "Data package processing terminated" and "Source TRCS 2LIS_17_NOTIF is not allowed".
    Please suggest.
    Thanks,
    Satyaprasad

    Hi,
    Some InfoObjects were missing while collecting the transport.
    I collected those objects and transported them; now it is working fine.
    Many Thanks to all
    Regards,
    Satyaprasad
