Reg: Efficient solution for a data upload scenario
Hi All,
I have the following task.
Data is required to flow from a legacy system (which generates data only in the form of flat files) into SAP R/3 as FB01 journals, and the output file should be generated periodically (daily, weekly, fortnightly, etc.).
Solution Approaches:
1) Write a BDC program to read the flat files and post the data.
2) Write an ABAP program to populate an IDoc (if a standard IDoc is available) or build an outbound proxy (if no standard IDoc is available) to push the data into SAP XI.
Could anyone tell me which would be the best and most efficient approach for this task? I need your recommendations.
Thanks in Advance.
B.Lavanya
Edited by: Lavanya Balanandham on Mar 31, 2008 2:23 PM
Hi Lavanya,
Required data from a legacy system (flat files only) to SAP R/3 as FB01 journals - use BDC for this, because it copes better with large source files.
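To illustrate the BDC route, here is a minimal call-transaction sketch (the program/screen values are only placeholders from a typical FB01 recording - record the transaction in SHDB to get the real dynpro and field names; for very large files the session method via BDC_OPEN_GROUP/BDC_INSERT/BDC_CLOSE_GROUP is usually preferred):

DATA: lt_bdcdata TYPE STANDARD TABLE OF bdcdata,
      ls_bdcdata TYPE bdcdata,
      lt_msgs    TYPE STANDARD TABLE OF bdcmsgcoll.

* first screen of the recording
CLEAR ls_bdcdata.
ls_bdcdata-program  = 'SAPMF05A'.   "FB01 header screen - verify in SHDB
ls_bdcdata-dynpro   = '0100'.
ls_bdcdata-dynbegin = 'X'.
APPEND ls_bdcdata TO lt_bdcdata.
* ... append one bdcdata row per screen field, exactly as recorded ...

CALL TRANSACTION 'FB01' USING lt_bdcdata
                        MODE 'N'        "no screen display
                        UPDATE 'S'      "synchronous update
                        MESSAGES INTO lt_msgs.
* inspect lt_msgs for type 'E'/'A' messages and log the failed records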
The output file should be generated periodically (daily, weekly, fortnightly, etc.) - if this output file contains acknowledgments for the data uploaded by the above process, create an ABAP report for it and schedule it. But if the output contains some other IDoc data which you need to send as a file to a third-party system, then go for SAP XI, provided the IDoc data is not too large. If the IDoc size is huge, just create an ABAP report that writes the data to a file on the application server and FTP the file to the third-party system.
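A minimal sketch of such a report's file output (the path and internal table here are made up for illustration):

DATA: lv_file   TYPE string VALUE '/interface/out/ack_file.txt',
      lv_line   TYPE string,
      lt_output TYPE STANDARD TABLE OF string.

* ... fill lt_output with the prepared lines ...
OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  MESSAGE 'Could not open output file' TYPE 'E'.
ENDIF.
LOOP AT lt_output INTO lv_line.
  TRANSFER lv_line TO lv_file.
ENDLOOP.
CLOSE DATASET lv_file.

Schedule the report as a periodic background job (SM36); the file can then be pulled from the application server by an FTP script.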
Regards,
Rajeev Gupta
Similar Messages
-
Reg: Efficient solution for a scenario
Hi Lavanya,
I think LSMW is the best solution for your requirement.
Reward points if it is helpful.
Thanks
Ram -
IP mobility solution for data centers
Hello guys
Please give me some advice.
I am trying to work out a networking solution for two geographically separated data centers (DCs).
Both the data centers will contain a virtualised server infrastructure (VMware).
The main design problem is choosing a good solution for IP mobility and for the DC interconnect. We need the ability to move VMs between the DCs, both for maintenance and for disaster recovery, and users must be connected to the moved VMs seamlessly, without any advance knowledge of routing. The users currently work in 1100 different offices across Ukraine.
Yes, I know a little about the Cisco OTV, RHI and LISP solutions, but all of these require top-level switches and routers like the Nexus 7000 and Catalyst 6500, and they are very costly.
Is there a cheaper solution for IP mobility?
Maybe the NHRP protocol that is part of DMVPN could be a good solution?
Are there any features in NHRP, or in another protocol, for monitoring the health of a VM and for customising and propagating the routing info when a VM moves?
Thank you for help. -
Function module vs. BDC for master data upload
Hi,
Please advise whether we should use the following function modules for master data upload, or whether we should go for BDC.
MP_RFC_SINGLE_CREATE
MP_RFC_INACT_CHANGE
MPLAN_CREATE
MPLAN_CHANGE
MPLAN_SET_DELETION_INDICATOR
ASSET_MASTERRECORD_MAINTENANCE
MPLAN_ITEM_CREATE
MPLAN_ITEM_CHANGE
GL_ACCT_MASTER_SAVE
Actually, we have already used these function modules in our upload program, but we are not sure whether they will create any data inconsistencies.
Please let me know if we should continue using these FMs, or whether there is any risk in using them and we should replace them with BDC.
Thanks in advance.
Hi Vikram,
Better to search for BAPIs for uploading the master data, because we have problems with both BDC and FMs.
If you use FMs, they may not contain all the fields you want. If you go for BDC, it is not maintainable across future releases: if you upgrade, the screens may change.
If you don't have any BAPI, then better go for BDC.
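Whichever BAPI you find, the calling pattern is the same; a rough sketch (the BAPI name below is a placeholder - look up the real one for each object in the BAPI Explorer, transaction BAPI):

DATA: lt_return TYPE STANDARD TABLE OF bapiret2,
      ls_return TYPE bapiret2.

* 'BAPI_EXAMPLE_CREATE' is NOT a real name - substitute the actual
* master-data BAPI and its import/table parameters
CALL FUNCTION 'BAPI_EXAMPLE_CREATE'
  TABLES
    return = lt_return.

* BAPIs do not commit themselves - check RETURN, then commit or roll back
READ TABLE lt_return INTO ls_return WITH KEY type = 'E'.
IF sy-subrc = 0.
  CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
ELSE.
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING
      wait = 'X'.
ENDIF.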
Thanks -
LSMW used only for master data upload?
Hi
Can you please let me know if LSMW is used only for master data upload, or whether we can also use it for transaction data?
Hi Christino,
I have come across standard SDN threads which deal with uploading master data; refer to these:
- SDN reference for uploading master data using LSMW: "How can we upload master data by using LSMW"
- SDN reference for which upload tool is preferred: "Which one is better for uploading data, LSMW or eCATT?"
Good Luck & Regards.
HARSH -
Optimization for bulk data upload
Hi everyone!
I've got the following issue:
I have to do a bulk data upload using JMS, deployed on GlassFish 2.1, to process and validate data against an Oracle 10g database before it is inserted.
I have a web interface that loads a file and then delegates the processing to a Stateless Session Bean, which reads N lines at a time and then sends a message to a JMS queue. The MDB has to parse each line, validate it against the data already in the DB, and finally persist the new data.
This processing is very resource-intensive, and I need to improve the turnaround time. I tried changing the GlassFish default JMS and JDBC pool sizes, but it made no big difference.
Do you have any advice that could help me?
Thanks in advance!
Hi! Thank you for your answer!
The heavy processing is in the MDB.
I'm grouping each batch of N read lines in the EJB and then sending the message to the JMS queue. The MDB then parses and persists each line into different related tables.
Thanks again! -
Tutorial for new Data Upload feature in 4.1?
Greetings,
Is there a tutorial available for using the new Data Upload feature in 4.1? We have not upgraded to 4.1 yet and I would like to try out the new feature on my APEX test workspace, so I can use the new feature once we are upgraded to 4.1.
Thanks,
John
I installed the Product Portal Sample Database and went to page 21. Very nice looking. Is there any tutorial for this sample application? In other words, is there a tutorial that uses this application as its basis?
What I am really looking for (my apologies if I have not been clear) is a tutorial that steps you through the process of setting up the new feature of Data Upload in APEX 4.1. I would like to create a new application on my test workspace and learn how to set up the Data Upload page.
Seeing the Data Load in action is very helpful though. Thanks for pointing me to this.
Thanks,
John -
Hi, Experts
Normally in a cube we just have the request ID, which carries only the number and nothing else (no request date, time, selection, or type of data upload...).
Can I make a BEx query show information just like cube manage? We need to check whether a duplicated-selection request was not deleted, or whether a request is missing, in the case of multiple DataSources loading one cube.
I cannot find any useful query among the BW statistics queries.
Thanks in advance.
I also could not find enough information in table RSMONICDP.
In our case, cube 0COOM_C02 has lots of InfoSources; some are full uploads and some are delta uploads. All of the InfoPackages are scheduled in one process chain.
When I go to the log of this process chain, I see that errors happened on some days, so sometimes the process chain did not finish; that means cube 0COOM_C02 has missing requests and duplicated requests.
It is hard to find all the problem requests through cube manage because there are so many requests and the window is so small. So my question is: is there any BEx query or BW table that shows information similar to the cube manage - request tab?
Then I could analyse them in Excel, which is quite easy for me.
thank you all -
Xls. sheet for Master data upload
Hi, can anybody suggest or send me a sample of how to maintain an xls template of the relevant fields for infotypes 0000, 0001, 0002, 0007, 0008, etc. for the purpose of data upload?
<removed by Moderator>
thanks
S Mishra
Hi Mishra,
You can look into the Standard Business Blueprint Templates - the Data Transfer Tool of SAP - to get an idea.
Check Note 1060029 - SAP Best Practices for HCM US - Variants, Misc, LSMW.
You will find files for infotypes there, and you can set up your templates based on these.
The other way is to:
Go to SE11 and enter the infotype structure PNNNN (with NNNN being the infotype number).
It presents you with the structure. Copy that structure and create your Excel sheet from that template.
Good Luck !!!!
Kumarpal Jain. -
Maintain number range interval for master data upload for existing employee
Hi Experts,
1) I have a scenario to upload additional data for existing employees in PA. The employees already exist in SAP HR, but additional infotypes need to be maintained for them.
2) I have a scenario where I have to upload master data for new employees.
Please give a detailed description of how to maintain the number range interval (external or internal) for the upload in both of the above scenarios. Do we have to maintain the number range manually in the master data record and then upload it through BDC or LSMW?
<< Moderator message - Everyone's problem is important. But the answers in the forum are provided by volunteers. Please do not ask for help quickly. >>
Edited by: Rob Burbank on Jan 12, 2011 3:49 PM
s c patil wrote:
> 2) For new employees I have to maintain the desired (mine or the client's?) number range in the SAP system as an external number range, then default that number range in NUMKR, maintain those numbers in the master data records, get the data template filled by the client, upload the data, and after that create a new number range (following the existing external range) as an internal number range, and then default that internal number range.
>
> Please reply ASAP
Yes, Mr. Patil.
For existing employees
You need to execute the HIRING action through BDC with the external number range. While recording, you have to use at least three infotypes, i.e. IT0000, IT0001 and IT0002. In addition, you can upload the other infotypes through PA30.
For new employees
During configuration you can create another number range as internal for new hiring, and use the NUMKR feature as well.
Here I don't understand why you are looking for an upload process for new hires: if it is not a mass hiring, it should be a day-to-day activity done by the user through PA40.
Best Regards,
Anand Singh -
.CSV file for Master data upload in GRC PC
Hi all,
We want to upload some master data for GRC Process Controls 3.0. Since we do not have access to the MDUG tool yet, and we have a demo planned soon, we decided to upload some data using the program GRPCB_UPLOAD.
So I uploaded a .csv file with the following structure and was able to create master data, but the object name was not updated.
Structure
Object Type
Object ID
Infotype
Sub-type
Start
End
Object abbreviation
Object Name
Language
In addition, for some objects, we'd like to update the other attributes with data too.
Would anyone have the .csv format for such a detailed upload for Process / Sub process / Control?
Regards,
Preksha
Hi all,
First of all, thanks in advance. I've tried to upload a template similar to Preksha's, but when I upload it through the GRPCB_UPLOAD program it doesn't work properly: when I execute the background process and check its status, it has been cancelled.
In addition, I followed the recommendations in the PDF called "Process Control Upload Master Data v1" released by the RIG in June 2008, but I still can't upload the structure correctly.
Do you have any idea, or could you explain how to create a correct template? Do you have any clue as to what I might have done wrong?
Thanks a lot.
Regards. -
Solution for Overlapping dates
Hi
I am looking for an easy solution to the problem described below.
I have the following records under a party key:
PartyKey AccountID TaxType DateStart DateEnd
123 145 100 01/01/2001 12/31/2002
123 145 100 05/01/2001 05/31/2001
123 145 100 01/01/2002 11/30/2002
123 145 100 01/01/2003 12/31/2007
124 111 300 01/01/2001 12/31/2006
125 222 260 04/01/2001 11/30/2001
125 222 260 01/01/2001 12/31/2001
I want to select all the records whose dates overlap with those of another record for the same party key and the other columns, and also to list all the records that have no overlapping dates.
For example, for party key 123 the first 3 records have overlapping dates, and for party key 125 both records have overlapping dates.
Party key 124 has no overlapping dates, and the dates of the last record of party key 123 do not overlap with the first three records.
Can anybody help me build the code? It's very urgent, please.
Thanks
Rajesh
My bad, I had to head out shortly so I rushed and introduced some copy+paste errors... This should be what you're after; hopefully you'll take the time to understand it, since you'll have to support it :)
select
    id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect, dt_end,
    case when
        (dt_effect between lead(dt_effect) over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect asc) and lead(dt_end) over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect asc))
        or
        (dt_end between lead(dt_effect) over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect asc) and lead(dt_end) over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect asc))
        or
        (dt_effect between lag(dt_effect) over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect asc) and lag(dt_end) over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect asc))
        or
        (dt_end between lag(dt_effect) over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect asc) and lag(dt_end) over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect asc))
    then 'Overlap'
    else 'No Overlap'
    end as overlaps
from (
    select *
    from MV_ITS_PARTY_ADDRESS_EXCEP
    order by id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect asc
);
ID_INTERNAL CD_TYPE_ACCT CD_TYPE_ADDR ID_ACCT DT_EFFECT DT_END OVERLAPS
93535 300000042829 10 300 10/01/2000 10/31/2000 Overlap
93535 300000042829 10 300 10/01/2000 10/31/2000 Overlap
93535 300000042829 10 300 11/01/2000 12/31/2001 No Overlap
20001115051 0 60 0 01/01/2004 12/31/2004 No Overlap
20001115051 0 60 0 01/01/2005 12/31/2005 No Overlap
20001115051 0 60 0 01/01/2006 02/27/2006 Overlap
20001115051 0 60 0 01/01/2006 02/27/2006 Overlap
20001115051 0 60 0 01/01/2006 02/27/2006 Overlap
20001115051 0 60 0 01/01/2006 02/27/2006 Overlap
20001115051 0 60 0 01/01/2006 02/27/2006 Overlap
20001115051 0 60 0 02/28/2006 12/31/9999 No Overlap
11 rows selected.
Elapsed: 00:00:00.18
Message was edited by:
Tubby -
Is Workspace Manager the right solution for history data only
I've been experimenting with WM to enable history on all tables in an application we have with about 100 tables.
Everything works fine, except that we have lots of FK constraints that require versioning to be enabled in the right order.
We know that changes to the DB layout will occur as the app evolves with business requirements.
My question is: is WM the right solution here, as I don't see any way to "transport" historical data in the event of a migration from an old DB layout to a new one
(db refactoring).
/christian
Hi Christian,
When you execute enableversioning on the tables, you can specify the complete set of tables that you want to version, instead of having to maintain the correct table order.
There shouldn't be a need to transport the data between tables. We support DDL changes on versioned tables by using the beginDDL/commitDDL procedures. If you do require the transport of data between databases, then note that we only support a full database import/export. The user guide covers both of these topics in greater detail.
Regards,
Ben -
Looking for a proper analytical query/solution for the data below
I have data as shown below:
mob_id status_code status_text media_date
39585 600 Online 04-Aug-09
54988 600 Online 05-Aug-09
39585 600 Online 05-Aug-09
39585 600 Online 06-Aug-09
39585 600 Online 07-Aug-09
39585 600 Online 08-Aug-09
39585 600 Online 09-Aug-09
39585 600 Online 10-Aug-09
39585 600 Online 11-Aug-09
39585 600 Online 12-Aug-09
39585 600 Online 13-Aug-09
39585 600 Online 14-Aug-09
39585 600 Online 15-Aug-09
39585 600 Online 16-Aug-09
39585 700 Deinstall 17-Aug-09
54988 600 Online 06-Aug-09
54988 600 Online 07-Aug-09
54988 600 Online 08-Aug-09
54988 600 Online 09-Aug-09
54988 600 Online 10-Aug-09
54988 600 Online 11-Aug-09
54988 600 Online 12-Aug-09
54988 600 Online 13-Aug-09
54988 600 Online 14-Aug-09
54988 600 Online 15-Aug-09
54988 600 Online 16-Aug-09
39585 600 Online 20-Aug-09
39585 600 Online 21-Aug-09
39585 600 Online 22-Aug-09
39585 600 Online 23-Aug-09
39585 600 Online 24-Aug-09
39585 600 Online 25-Aug-09
39585 700 Deinstall 26-Aug-09
39585 600 Online 27-Aug-09
39585 600 Online 28-Aug-09
39585 600 Online 29-Aug-09
39585 600 Online 30-Aug-09
39585 600 Online 31-Aug-09
39585 600 Online 01-Sep-09
39585 700 Deinstall 02-Sep-09
54988 600 Online 17-Aug-09
54988 600 Online 18-Aug-09
54988 600 Online 19-Aug-09
54988 600 Online 20-Aug-09
54988 600 Online 21-Aug-09
54988 600 Online 22-Aug-09
54988 600 Online 23-Aug-09
54988 600 Online 24-Aug-09
54988 600 Online 25-Aug-09
54988 700 Deinstall 26-Aug-09
69875 600 Online 20-Aug-09
69875 600 Online 21-Aug-09
69875 600 Online 22-Aug-09
69875 600 Online 23-Aug-09
69875 600 Online 24-Aug-09
69875 600 Online 25-Aug-09
69875 600 Online 26-Aug-09
Using the above data, I need to produce the result set below. Can anyone help with this?
occurrnace_seq mob_id start_media_date end_media_date no_of_days
1 39585 04-Aug-09 17-Aug-09 13
2 39585 20-Aug-09 26-Aug-09 6
3 39585 27-Aug-09 02-Sep-09 6
1 54988 05-Aug-09 26-Aug-09 21
1 69875 20-Aug-09 null null
Here start_media_date comes from a record with status_code = 600 and end_media_date from the record with status_code = 700.
Please note that a mob_id can start multiple times.
Can anybody help me produce this result using SQL or PL/SQL?
Many thanks in advance.
Thanks
Guttis
wasguttis wrote:
Can I run this query on 70 million records? Does it raise any performance problems? If you have any ideas, just throw out some possible suggestions to protect against such issues.
Well, you can certainly run it on 70 million records. How long it will run depends on your hardware and your Oracle and OS settings. That said, there is a simpler solution:
select occurrenace_seq,
mob_id,
min(case grp when 'start-of-group' then media_date end) start_media_date,
max(case grp when 'end-of-group' then media_date end) end_media_date,
max(case grp when 'end-of-group' then media_date end) - min(case grp when 'start-of-group' then media_date end) no_of_days
from (
select t.*,
case
when status_text = 'Deinstall' then 'end-of-group'
when lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) = 'Deinstall' then 'start-of-group'
end grp,
sum(case status_text when 'Deinstall' then 1 else 0 end) over(partition by mob_id order by media_date) +
case lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) when 'Deinstall' then 1 else 0 end occurrenace_seq
from your_table t
)
where grp in ('start-of-group','end-of-group')
group by mob_id,
occurrenace_seq
order by mob_id,
occurrenace_seq
/
With your sample:
with t as (
select 39585 mob_id,600 status_code,'Online' status_text, to_date('04-Aug-09','dd-mon-yy') media_date from dual union all
select 54988,600,'Online',to_date('05-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('05-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('06-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('07-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('08-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('09-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('10-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('11-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('12-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('13-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('14-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('15-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('16-Aug-09','dd-mon-yy') from dual union all
select 39585,700,'Deinstall', to_date('17-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('06-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('07-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('08-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('09-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('10-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('11-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('12-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('13-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('14-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('15-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('16-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
select 39585,700,'Deinstall', to_date('26-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('27-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('28-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('29-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('30-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('31-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('01-Sep-09','dd-mon-yy') from dual union all
select 39585,700,'Deinstall', to_date('02-Sep-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('17-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('18-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('19-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
select 54988,700,'Deinstall', to_date('26-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('26-Aug-09','dd-mon-yy') from dual
)
select occurrenace_seq,
mob_id,
min(case grp when 'start-of-group' then media_date end) start_media_date,
max(case grp when 'end-of-group' then media_date end) end_media_date,
max(case grp when 'end-of-group' then media_date end) - min(case grp when 'start-of-group' then media_date end) no_of_days
from (
select t.*,
case
when status_text = 'Deinstall' then 'end-of-group'
when lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) = 'Deinstall' then 'start-of-group'
end grp,
sum(case status_text when 'Deinstall' then 1 else 0 end) over(partition by mob_id order by media_date) +
case lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) when 'Deinstall' then 1 else 0 end occurrenace_seq
from t
)
where grp in ('start-of-group','end-of-group')
group by mob_id,
occurrenace_seq
order by mob_id,
occurrenace_seq
/
OCCURRENACE_SEQ MOB_ID START_MED END_MEDIA NO_OF_DAYS
1 39585 04-AUG-09 17-AUG-09 13
2 39585 20-AUG-09 26-AUG-09 6
3 39585 27-AUG-09 02-SEP-09 6
1 54988 05-AUG-09 26-AUG-09 21
1 69875 20-AUG-09
SY. -
BDC for IP01, data upload in table control
I was working on a BDC for IP01. There is a table control in which 5 rows are visible, and I need to fill data up to the 6th row; the 6th row only gets filled by scrolling. Will scrolling get captured in the recording? Is this possible or not, and if yes, how?
Sir, please help me.
With regards,
Hi
Scrolling does not get recorded while recording. You can achieve it by setting the ok_code to P+ (scroll one page down) or P++ (scroll to the last page).
Refer the following link for detailed description.
http://wiki.sdn.sap.com/wiki/display/ABAP/BatchInputFAQ#BatchInputFAQ-Howtoscrollatablecontrol%3F
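A rough sketch of the paging loop (ZPROGRAM/0100 and the ZITEM-VALUE field name are placeholders - take the real values from your own recording; bdc_dynpro and bdc_field are the usual helper forms generated with an SHDB recording, which append rows to the BDCDATA table):

TYPES: BEGIN OF ty_item,
         value TYPE char20,          "illustrative column to be entered
       END OF ty_item.
DATA: lt_items  TYPE STANDARD TABLE OF ty_item,
      ls_item   TYPE ty_item,
      lv_row(2) TYPE n,
      lv_fnam   TYPE bdcdata-fnam.

* lt_items is assumed to hold all rows destined for the table control
LOOP AT lt_items INTO ls_item.
  lv_row = lv_row + 1.
  CONCATENATE 'ZITEM-VALUE(' lv_row ')' INTO lv_fnam.
  PERFORM bdc_field USING lv_fnam ls_item-value.
  IF lv_row = 5.                                 "last visible row filled
    PERFORM bdc_field USING 'BDC_OKCODE' 'P+'.   "page down
    PERFORM bdc_dynpro USING 'ZPROGRAM' '0100'.  "re-enter the same screen
    lv_row = 0.
  ENDIF.
ENDLOOP.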
Hope this helps you.
Regards
Vinodh