Solution for mass data import to VB02

Hello,
Do you have a procedure for mass data import into the VB02 exclusion list for the ZSD1 third list key
(sales dept. / distribution channel / material / customer)?
We don't want to do that manually - more than 600 records.
Do you have any idea how to import it - any easier way?

Hi,
You can import the records with either LSMW or a BAPI, whichever you are comfortable with.
LSMW has 14 steps, as you probably know.
Thanks,
Surya
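
For the batch-input route, here is a minimal ABAP sketch: it reads the ~600 exclusion records from a tab-delimited file and replays them through VB02 with CALL TRANSACTION. The module pool, screen numbers and field names in the BDC table are placeholders - record the real sequence for your ZSD1 key with transaction SHDB and substitute them (the same recording can also serve as the basis of an LSMW batch-input object).

    REPORT zsd_vb02_exclusion_upload.

    * Assumed flat-file layout: one exclusion record per line
    TYPES: BEGIN OF ty_excl,
             vkorg TYPE c LENGTH 4,   " sales organization
             vtweg TYPE c LENGTH 2,   " distribution channel
             matnr TYPE c LENGTH 18,  " material
             kunnr TYPE c LENGTH 10,  " customer
           END OF ty_excl.

    DATA: lt_excl TYPE STANDARD TABLE OF ty_excl,
          ls_excl TYPE ty_excl,
          lt_bdc  TYPE STANDARD TABLE OF bdcdata,
          ls_bdc  TYPE bdcdata,
          lt_msg  TYPE STANDARD TABLE OF bdcmsgcoll.

    * Load the tab-delimited file from the presentation server
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename            = 'C:\temp\exclusions.txt'
        has_field_separator = 'X'
      TABLES
        data_tab            = lt_excl.

    LOOP AT lt_excl INTO ls_excl.
      CLEAR: lt_bdc, lt_msg.
      " First screen - placeholder values, take the real ones from SHDB
      CLEAR ls_bdc.
      ls_bdc-program = 'SAPMV13B'. ls_bdc-dynpro = '0100'. ls_bdc-dynbegin = 'X'.
      APPEND ls_bdc TO lt_bdc.
      " ... append one BDCDATA row per field/OK-code from the recording ...

      CALL TRANSACTION 'VB02' USING lt_bdc
           MODE 'N' UPDATE 'S' MESSAGES INTO lt_msg.
      " Evaluate lt_msg here and log records that failed
    ENDLOOP.

For 600 records this runs in seconds; if you prefer the guided route, LSMW with a recording of the same VB02 sequence produces an identical BDC stream without custom code.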

Similar Messages

  • Standard tcode for (mass) data change of internal orders or ... ??

    Hi!
I really need some info on whether SAP by any chance has a standard transaction for mass data change of internal orders (more particularly, the distribution rules in the settlement rule section, which can be found in transaction KO02).
I am trying to change distribution rules for settlement receivers in the settlement rule section, that is, close out past distribution rules by filling the TO PERIOD and TO FISCAL YEAR fields on the right of each rule, and then enter new rules (which I get from an external source - flat file, MS Excel, CSV...).
If I wanted to import data into SAP, I guess I would have to develop a batch input. But that would take me some time to develop because it is pretty complicated.
I found tcode KO08, but I do not really know how to use it. Maybe there is another tcode that I am not aware of?
I would appreciate any suggestions!
Thnx, UK

    Hi Srilakshimi,
If you are familiar with the MASS transaction, then you can modify the User Responsible field for internal orders from transaction KOK2.
As a first step you must create a selection variant in order to define which orders you want to modify. Once the selection variant is created, execute the transaction with it and you'll get a screen similar to the MASS transaction. Select the field you want and replace it massively. Do not forget to save.
    Best Regards!
    Mgitur

  • Value Mapping Replication for Mass Data - Performance Issues

    Hi All,
We are looking into Value Mapping Replication for mass data. We have done this for a smaller number of fields.
Now we might have to keep 15,000 records in the cache for the value mapping. I am not sure how this would affect the Java cache and the Java engine as a whole.
There might be a situation where we will have to leave the 15K records in the cache table on the Java engine...
Are there any parameters that we can look into just to see how this hits performance?
Any links or guidance in the right direction would help.
Regards

    Naveen,
Check Jin's reply in this thread (they have done it with the API, and without the API using graphical mapping, but still with some issues):
Value mapping performance using LookUp API
    ---Satish

  • Anyone worked on Value Mapping Replication for Mass Data

Hi all,
Has anyone worked on Value Mapping Replication for Mass Data?
What is it?
Page 139 of the doc below talks about registering:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/627d1cbc-0601-0010-aea2-c275521673f2
Can anyone guide me in that direction?
Regards,

    Naveen,
    Have a look at the weblog /people/sreekanth.babu2/blog/2005/02/23/value-mapping-replication
Refer to the discussion below too
    Re: How to edit roles for specific tasks in IR & ID?
    Best regards,
    raj.
    Message was edited by:
            Raj

  • Is PI recommended for initial data import / mass upload to MDM

We have a requirement as below:
There are around 500,000 records in a customer master in SAP R/3.
This data needs to be imported into SAP MDM (after this step, SAP R/3 will be removed/scrapped from the landscape).
After importing into MDM, the records need to be updated in SAP ECC.
So the requirement is something like:
SAP R/3 -> MDM -> ECC
The R/3-to-MDM import is a one-time, initial activity, to be done for around 500,000 records.
To migrate this much data, is it recommended to use PI for both the inbound and outbound (w.r.t. MDM) scenarios?
Thanks in advance.
Ganesh

Hi,
In general it's not recommended to use PI for mass uploads,
but I guess in this case the only issue will be the upload to SAP ECC
and not the export from the old R/3,
so I guess you could use PI to export the data from R/3 to a file or files
and use those files in LSMW in ECC.
    Regards,
    Michal Krawczyk
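
If PI would really only serve to dump R/3 tables to files, a plain ABAP extract report on the R/3 side is a lighter alternative for that step. A minimal sketch - the customer master fields, delimiter and file name are only illustrative:

    REPORT z_customer_extract.

    * Writes a tab-delimited extract to the application server for LSMW.
    DATA: lt_kna1 TYPE STANDARD TABLE OF kna1,
          ls_kna1 TYPE kna1,
          lv_line TYPE string.

    SELECT * FROM kna1 INTO TABLE lt_kna1.   " ~500,000 records

    OPEN DATASET '/tmp/customers.txt' FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    LOOP AT lt_kna1 INTO ls_kna1.
      CONCATENATE ls_kna1-kunnr ls_kna1-name1 ls_kna1-land1
             INTO lv_line SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
      TRANSFER lv_line TO '/tmp/customers.txt'.
    ENDLOOP.
    CLOSE DATASET '/tmp/customers.txt'.

For half a million records you would typically add SELECT ... PACKAGE SIZE to keep memory use flat, but the principle stays the same.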

  • Value Mapping Replication for Mass Data

    Hi SDN,
I have to design an interface in XI 3.0 (SP14 on AIX) for replicating mass data from file, database, etc. to the XI runtime cache.
I went through the following weblog, which proved to be helpful: /people/sreekanth.babu2/blog/2005/02/23/value-mapping-replication
ValueMappingReplicationOut is my asynchronous outbound interface and ValueMappingReplication is the asynchronous inbound interface, in the software component SAP BASIS, software component version SAP BASIS 6.40, in the namespace http://sap.com/xi/XI/System.
I am aware that for this scenario the receiver is predefined (it must be on the Integration Server).
My question here is how do I use the outbound interface to send data to XI. In my case I have a file and SQL Server as the sender.
Can somebody detail the steps involved? Suppose I don't want to use an ABAP proxy, Java proxy, or the HTTP adapter to send the message:
do I have other options to send data to the XI value mapping interface?
    Regards,
    Vineet

When you want to send data from a file, just map the structure from the arriving XML to ValueMappingReplication. There is no need to use the interface ValueMappingReplicationOut or an ABAP proxy.
    Regards
    Stefan

  • Reg:Efficient solution for a data upload scenario

    Hi All,
I have the following task:
Data is required from a legacy system (which generates data only in the form of flat files) into SAP R/3 as FB01 journals, and the output file should be generated periodically (daily, weekly, fortnightly, etc.).
Solution approaches:
1) Write a BDC program to extract the data.
2) Write an ABAP program to populate an IDoc (if a standard IDoc is available) or generate an outbound proxy (if a standard IDoc is not available) to push the data into SAP XI.
Could anyone tell me which would be the best and most efficient approach for this task? I need your recommendations.
Thanks in advance.
B.Lavanya
    Edited by: Lavanya Balanandham on Mar 31, 2008 2:23 PM

    Hi Lavanya,
Required data from a legacy system (flat files only) into SAP R/3 as FB01 journals - use BDC for this, because it will handle large source files better.
The output file should be generated periodically (daily, weekly, fortnightly, etc.) - if this output file contains acknowledgments for the data uploaded by the above process, create an ABAP report for it and schedule it. But if this output contains some other IDoc data which you need to send as a file to a third-party system, then go for SAP XI, provided the IDoc data is not too large. If the IDoc size is huge, then just create an ABAP report that writes the data to a file on the application server and FTP the file to the third-party system.
    Regards,
    Rajeev Gupta
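
    As an alternative to screen-based BDC for FB01, the same journals can be posted from the flat file with the standard accounting BAPI, which avoids any screen dependency. A minimal sketch - company code, account, currency and amount are illustrative only, and a real document needs balanced line items:

        DATA: ls_header TYPE bapiache09,
              lt_gl     TYPE STANDARD TABLE OF bapiacgl09,
              ls_gl     TYPE bapiacgl09,
              lt_amt    TYPE STANDARD TABLE OF bapiaccr09,
              ls_amt    TYPE bapiaccr09,
              lt_ret    TYPE STANDARD TABLE OF bapiret2.

        * Document header - illustrative values
        ls_header-username   = sy-uname.
        ls_header-comp_code  = '1000'.
        ls_header-doc_date   = sy-datum.
        ls_header-pstng_date = sy-datum.
        ls_header-doc_type   = 'SA'.

        * One G/L line built from a file record
        ls_gl-itemno_acc = 1.
        ls_gl-gl_account = '0000400000'.
        APPEND ls_gl TO lt_gl.

        ls_amt-itemno_acc = 1.
        ls_amt-currency   = 'EUR'.
        ls_amt-amt_doccur = '100.00'.
        APPEND ls_amt TO lt_amt.

        CALL FUNCTION 'BAPI_ACC_DOCUMENT_POST'
          EXPORTING
            documentheader = ls_header
          TABLES
            accountgl      = lt_gl
            currencyamount = lt_amt
            return         = lt_ret.

        READ TABLE lt_ret WITH KEY type = 'E' TRANSPORTING NO FIELDS.
        IF sy-subrc <> 0.
          CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
        ENDIF.

    Looping this over the file records and committing in packets gives BDC-like throughput with proper error handling via the RETURN table.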

  • IP mobility solution for a Data Centers

    Hello guys
    Please give me an advice.
I am trying to work out a networking solution for two geographically separated data centers (DCs).
Both data centers will contain a virtualized server infrastructure (VMware).
The main design problem is choosing a good solution for IP mobility and for the DC interconnect. We need the ability to move VMs between the DCs both for maintenance and for disaster recovery. And users must be connected to the moved VMs seamlessly, without any advanced knowledge about routing. The users currently work in 1100 different offices around Ukraine.
Yes, I know a little about the Cisco OTV, RHI and LISP solutions. But all these solutions require top-level switches and routers like the Nexus 7000 and Catalyst 6500, and they are very costly.
Is there a cheaper solution for IP mobility?
Maybe the NHRP protocol that is part of DMVPN could be a good solution?
Are there any features in NHRP or another protocol for monitoring the health of a VM, and for customizing and propagating the routing info when a VM moves?
    Thank you for help.


  • Is Workspace Manager the right solution for history data only

I've been experimenting some with WM, just enabling history on all tables in an application we have with about 100 tables.
Everything works fine, except for the fact that we have lots of FK constraints that require enabling versioning in a particular order.
We know that changes to the DB layout will occur as the app develops with business requirements.
My question is: is WM the right solution here, as I don't see any way to "transport" historical data in the event of a migration from an old DB layout to a new layout
(DB refactoring)?
    /christian

    Hi Christian,
    When you execute enableversioning on the tables, you can specify the complete set of tables that you want to version, instead of having to maintain the correct table order.
    There shouldn't be a need to transport the data between tables. We support DDL changes on versioned tables by using the beginDDL/commitDDL procedures. If you do require the transport of data between databases, then note that we only support a full database import/export. The user guide covers both of these topics in greater detail.
    Regards,
    Ben

  • Solution for Overlapping dates

    Hi
I am looking for an easy solution to the problem described below.
I have the following records under party key:

PartyKey  account_id  taxtype  date_start  date_end
123       145         100      01/01/2001  12/31/2002
123       145         100      05/01/2001  05/31/2001
123       145         100      01/01/2002  11/30/2002
123       145         100      01/01/2003  12/31/2007
124       111         300      01/01/2001  12/31/2006
125       222         260      04/01/2001  11/30/2001
125       222         260      01/01/2001  12/31/2001

I want to select all the records that have dates overlapping with another record for the same party key and other columns, and also list all the records that have no overlapping dates.
For example, for party key 123 the first 3 records have overlapping dates, and for party key 125 both records have overlapping dates.
Party key 124 has no overlapping dates, and the dates of the last record of party key 123 do not overlap with the first three records.
Can anybody help me in building the code? It's very urgent, please.
    Thanks
    Rajesh

My bad, I had to head out shortly so I rushed and introduced some copy+paste errors... This should be what you're after. Hopefully you'll take the time to understand it, since you'll have to support it :)
select
   id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect, dt_end,
   case when
      (dt_effect between lead(dt_effect) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) AND lead(dt_end) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) )
      OR
      (dt_end between lead(dt_effect) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) AND lead(dt_end) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) )
      OR
      (dt_effect between lag(dt_effect) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) AND lag(dt_end) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) )
      OR
      (dt_end between lag(dt_effect) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) AND lag(dt_end) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) )
   then
      'Overlap'
   else
      'No Overlap'
   end as overlaps
from
(
   select *
   from MV_ITS_PARTY_ADDRESS_EXCEP
   order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc
);
           ID_INTERNAL       CD_TYPE_ACCT       CD_TYPE_ADDR ID_ACCT                              DT_EFFECT  DT_END     OVERLAPS
                 93535       300000042829                 10 300                                  10/01/2000 10/31/2000 Overlap
                 93535       300000042829                 10 300                                  10/01/2000 10/31/2000 Overlap
                 93535       300000042829                 10 300                                  11/01/2000 12/31/2001 No Overlap
           20001115051                  0                 60 0                                    01/01/2004 12/31/2004 No Overlap
           20001115051                  0                 60 0                                    01/01/2005 12/31/2005 No Overlap
           20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
           20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
           20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
           20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
           20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
           20001115051                  0                 60 0                                    02/28/2006 12/31/9999 No Overlap
    11 rows selected.
    Elapsed: 00:00:00.18
Message was edited by:
        Tubby

  • Call Bundling for custom bapi for mass data processing

    Hi all,
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/4c/4c0e96725311d396a80004ac96334b/frameset.htm
Can I create a custom BAPI where I can compress the created update tasks - not single inserts, but a single SQL insert with many records?
Are there SAP function modules to do this? The documentation says I must do "Operations in buffer" and "Update buffer data".
    Regards
    Paul

    Is ABAPFIELD an IMPORTING parameter?
    > Total Questions:  17 (15 unresolved) 
    Maybe you should consider cleaning up your old posts.
    Rob
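
    On the actual question: the call-bundling pattern from that help page is to collect all records in the dialog program, register one update function module with the whole internal table, and let that module do a single array INSERT. A minimal sketch - Z_BUNDLE_INSERT, ZEXCL and the table type ZEXCL_TT are placeholder names:

        * Function module Z_BUNDLE_INSERT, processing type "Update module"
        * ("Start immediately") in SE37; ZEXCL_TT is a table type of ZEXCL rows.
        FUNCTION z_bundle_insert.
        *"  IMPORTING
        *"     VALUE(IT_RECS) TYPE ZEXCL_TT
          " One array insert instead of one INSERT per record
          INSERT zexcl FROM TABLE it_recs.
        ENDFUNCTION.

        * Dialog program: collect everything first, then bundle the call
        DATA lt_recs TYPE zexcl_tt.
        " ... fill lt_recs with all records to be inserted ...
        CALL FUNCTION 'Z_BUNDLE_INSERT' IN UPDATE TASK
          EXPORTING
            it_recs = lt_recs.
        COMMIT WORK.   " the single bundled insert runs in the update task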

  • Looking for a proper analytical query/solution for below data

    I have data as shown below:
    mob_id     status_code     status_text     media_date
    39585     600     Online     04-Aug-09
    54988     600     Online     05-Aug-09
    39585     600     Online     05-Aug-09
    39585     600     Online     06-Aug-09
    39585     600     Online     07-Aug-09
    39585     600     Online     08-Aug-09
    39585     600     Online     09-Aug-09
    39585     600     Online     10-Aug-09
    39585     600     Online     11-Aug-09
    39585     600     Online     12-Aug-09
    39585     600     Online     13-Aug-09
    39585     600     Online     14-Aug-09
    39585     600     Online     15-Aug-09
    39585     600     Online     16-Aug-09
    39585     700     Deinstall     17-Aug-09
    54988     600     Online     06-Aug-09
    54988     600     Online     07-Aug-09
    54988     600     Online     08-Aug-09
    54988     600     Online     09-Aug-09
    54988     600     Online     10-Aug-09
    54988     600     Online     11-Aug-09
    54988     600     Online     12-Aug-09
    54988     600     Online     13-Aug-09
    54988     600     Online     14-Aug-09
    54988     600     Online     15-Aug-09
    54988     600     Online     16-Aug-09
    39585     600     Online     20-Aug-09
    39585     600     Online     21-Aug-09
    39585     600     Online     22-Aug-09
    39585     600     Online     23-Aug-09
    39585     600     Online     24-Aug-09
    39585     600     Online     25-Aug-09
    39585     700     Deinstall     26-Aug-09
    39585     600     Online     27-Aug-09
    39585     600     Online     28-Aug-09
    39585     600     Online     29-Aug-09
    39585     600     Online     30-Aug-09
    39585     600     Online     31-Aug-09
    39585     600     Online     01-Sep-09
    39585     700     Deinstall     02-Sep-09
    54988     600     Online     17-Aug-09
    54988     600     Online     18-Aug-09
    54988     600     Online     19-Aug-09
    54988     600     Online     20-Aug-09
    54988     600     Online     21-Aug-09
    54988     600     Online     22-Aug-09
    54988     600     Online     23-Aug-09
    54988     600     Online     24-Aug-09
    54988     600     Online     25-Aug-09
    54988     700     Deinstall     26-Aug-09
    69875     600     Online     20-Aug-09
    69875     600     Online     21-Aug-09
    69875     600     Online     22-Aug-09
    69875     600     Online     23-Aug-09
    69875     600     Online     24-Aug-09
    69875     600     Online     25-Aug-09
    69875     600     Online     26-Aug-09
Using the above data I need to produce the result set below. Can anyone help with this?
    occurrnace_seq     mob_id     start_media_date     end_media_date     no_of_days
    1     39585     04-Aug-09     17-Aug-09     13
    2     39585     20-Aug-09     26-Aug-09     6
    3     39585     27-Aug-09     02-Sep-09     6
    1     54988     05-Aug-09     26-Aug-09     21
    1     69875     20-Aug-09          null null
Here start_media_date comes from a record with status_code = 600 and end_media_date from a record with status_code = 700.
Please note that a mob_id can start multiple times.
Can anyone help me produce this result using SQL or PL/SQL?
Many thanks in advance.
    Thanks
    Guttiwas

guttis wrote:
Can I run this query on 70 million records? Does it raise any performance problems? If you have any ideas, just throw out some possible suggestions to protect against such issues.

Well, you can certainly run it on 70 million records. How long it will run depends on your hardware, Oracle and OS settings. That said, there is a simpler solution:
    select  occurrenace_seq,
            mob_id,
            min(case grp when 'start-of-group' then media_date end) start_media_date,
            max(case grp when 'end-of-group' then media_date end) end_media_date,
            max(case grp when 'end-of-group' then media_date end) - min(case grp when 'start-of-group' then media_date end) no_of_days
      from  (
             select  t.*,
                     case
                       when status_text = 'Deinstall' then 'end-of-group'
                       when lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) = 'Deinstall' then 'start-of-group'
                     end grp,
                     sum(case status_text when 'Deinstall' then 1 else 0 end) over(partition by mob_id order by media_date) +
                     case lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) when 'Deinstall' then 1 else 0 end occurrenace_seq
           from  your_table t
        )
  where grp in ('start-of-group','end-of-group')
      group by mob_id,
               occurrenace_seq
      order by mob_id,
               occurrenace_seq
/
With your sample:
    with t as (
               select 39585 mob_id,600 status_code,'Online' status_text, to_date('04-Aug-09','dd-mon-yy') media_date from dual union all
               select 54988,600,'Online',to_date('05-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('05-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('06-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('07-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('08-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('09-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('10-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('11-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('12-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('13-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('14-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('15-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('16-Aug-09','dd-mon-yy') from dual union all
               select 39585,700,'Deinstall', to_date('17-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('06-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('07-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('08-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('09-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('10-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('11-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('12-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('13-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('14-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('15-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('16-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
               select 39585,700,'Deinstall', to_date('26-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('27-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('28-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('29-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('30-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('31-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('01-Sep-09','dd-mon-yy') from dual union all
               select 39585,700,'Deinstall', to_date('02-Sep-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('17-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('18-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('19-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
               select 54988,700,'Deinstall', to_date('26-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('26-Aug-09','dd-mon-yy') from dual
          )
    select  occurrenace_seq,
            mob_id,
            min(case grp when 'start-of-group' then media_date end) start_media_date,
            max(case grp when 'end-of-group' then media_date end) end_media_date,
            max(case grp when 'end-of-group' then media_date end) - min(case grp when 'start-of-group' then media_date end) no_of_days
      from  (
             select  t.*,
                     case
                       when status_text = 'Deinstall' then 'end-of-group'
                       when lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) = 'Deinstall' then 'start-of-group'
                     end grp,
                     sum(case status_text when 'Deinstall' then 1 else 0 end) over(partition by mob_id order by media_date) +
                     case lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) when 'Deinstall' then 1 else 0 end occurrenace_seq
           from  t
        )
  where grp in ('start-of-group','end-of-group')
      group by mob_id,
               occurrenace_seq
      order by mob_id,
           occurrenace_seq
/
    OCCURRENACE_SEQ     MOB_ID START_MED END_MEDIA NO_OF_DAYS
                  1      39585 04-AUG-09 17-AUG-09         13
                  2      39585 20-AUG-09 26-AUG-09          6
                  3      39585 27-AUG-09 02-SEP-09          6
                  1      54988 05-AUG-09 26-AUG-09         21
                  1      69875 20-AUG-09
SY.

  • XML Solutions for Large Data Sets

    Hi,
I'm working with a large data set (9 million records comprising 36 gigabytes) and am exploring the use of XML with it.
I've experimented with a JDBC app (taken straight from Steve Muench's excellent Oracle XML Applications) for writing to CLOBs, but achieve throughputs of much less than 40K/s (the minimum speed required to process the data in under 10 days).
What kind of throughputs are possible loading XML records from CLOBs into multiple tables (using server-side Java apps)?
Could anyone comment on whether XML is feasible for a data set of this size?
    Regards,
    Mike

    Just would like to identify myself (I'm the submitter):
    Michael Driscoll <[email protected]>.

  • Need solution for mass exclusion of computers from IE 10 deployment

Hey all,
Hopefully I've got the right forum for this question.
I have to deploy IE 10 company-wide (about 1800 computers), but I need to exclude hundreds of computers because of an incompatibility with a web-based application.
We have a query-based collection in SCCM that we use for deploying updates company-wide, but we can't use this, since I need to exclude computers from the deployment.
When I deployed IE 9, I ended up exporting the query results for this collection, then did a copy/paste of the results to a new collection, minus the computers that had to be excluded for the IE 9 deployment.
Unfortunately, the IE 10 deployment has too many exclusions for this to be efficient.
So I'm trying to figure out the best approach to this - any ideas/suggestions?
Is there anything in SCCM (or a third-party option) that can generate a list of computer names based on a list of AD user names? The user name list would be lengthy, close to 500.
I am using SCCM 2007 SP2.
Thanks in advance for any help with this!

    Hi,
I would try to use an exclude collection:
1. Create a collection with all computers that are excluded from IE 10; you can use whatever query you like, inventory, etc.
2. Create an exclude rule in the collection used for deploying, excluding the collection above from the deployment.
No, there is no built-in feature to convert a list of user names to computer names; check Top Console User, that would be the best source of information.
Regards,
Jörgen

Thanks for the response.
Are there any third-party options that can generate a list of computer names based on a list of AD user names?
Manually adding hundreds of computer names to (or removing them from) a collection really isn't an option.

  • ADF-Faces table and mass data

    Hello Oracle,
We are considering using ADF Faces with Spring and TopLink or Hibernate, which support all important commercial databases.
We need a solution for selecting mass data and showing it using an ADF Faces component.
The ADF Faces table looks great and is fine for a few hundred datasets. But we have customers who could select 80,000 or more datasets, and it cannot be the solution to load them completely into the collection model in memory.
Do you have a solution, or do you plan something? A JSF data table component for mass data (e.g. loading from the database on every "next" or "previous" click)?
    Regards
    Florian

I'm pretty new to ADF Faces etc., but I know the ADF table component supports paging. If you're using ADF Business Components as the persistence framework, it supports paging out of the box, giving you exactly what you're looking for.
If you're using EJBs or TopLink, I think you have to write your own paging interface, but I don't know this for sure.
    Corey
    Message was edited by:
    cpuffalt
