Data load with ABAP control
Hi:
I need to load data into a cube with ABAP code in the InfoPackage, but I am a novice in ABAP!
I need your help.
I need to load into the cube only the records that have NETDUEDATE greater than the system date
(SY-DATUM), and drop the records that do not meet the condition.
I am not sure what the code/syntax for this condition would be.
Appreciate your help.
Thanks....Pbs
Hi PBS,
You can write ABAP logic for your requirement. In the data selection tab of the InfoPackage, set the type to ABAP routine for the selection field. You can implement the logic with an IF...ENDIF condition.
Refer to:
ABAP routine in InfoPackage
Regards,
Satyam
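A minimal sketch of such a data-selection routine (the type-6 ABAP routine behind an InfoPackage selection field), assuming the DataSource exposes NETDUEDATE as a selectable field; l_t_range and p_subrc belong to the generated routine frame:

```abap
* Keep only records whose NETDUEDATE lies after today's date.
* The field name and the frame variables (l_t_range, p_subrc) are
* assumptions based on the standard generated routine skeleton.
DATA: l_idx LIKE sy-tabix.

READ TABLE l_t_range WITH KEY fieldname = 'NETDUEDATE'.
l_idx = sy-tabix.

l_t_range-sign   = 'I'.        " include
l_t_range-option = 'GT'.       " greater than
l_t_range-low    = sy-datum.   " system date at load time
MODIFY l_t_range INDEX l_idx.

p_subrc = 0.                   " 0 = selection filled successfully
```

If the DataSource does not support selection on NETDUEDATE, the same filter can instead be applied in a start routine by deleting non-matching records from the data package.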
Similar Messages
-
Cost center master data is loaded with no controlling area
Hi All,
0COSTCENTER master data was loaded from different source systems (R/3 and flat files). Some of the cost centers have no controlling area, so they were loaded without one.
Here is an example for cost center 12345678; when I check the contents for this cost center, it displays like this:
Controlling area | Cost center | Language | Valid to   | Valid from | Business area
(blank)          | 12345678    | (blank)  | 12/31/1999 | 01/01/1000 |
C01              | 12345678    | EN       | 04/01/2005 | 01/01/1000 |
C01              | 12345678    | EN       | 12/31/9999 | 04/02/2005 |
The first row has a blank controlling area and a blank language. Records with no controlling area are causing errors after the data is loaded into the cubes. The user says this is duplicate data. Please advise what action I should take; this is a production issue, so please respond as soon as possible.
Many thanks in advance.
Thanks,
Kumar.
Kumar,
You said you are getting data from R/3 and flat files.
In that case, for the flat file there could be an unassigned communication structure; check the communication/transfer structure.
Cheers,
HVR. -
Deleting Data Package with Abap - error in Abap statement
Hi,
I am trying to delete data with ABAP, but my logic is not working; it fails with the error message "Error in an ABAP/4 statement when processing".
Requirement:
- Delete records from the data package when Plant = INDIA
- Delete records from the data package when Area = 01 or Group = J001 in the customer master data table
Logic
DATA: t_data TYPE data_package_structure OCCURS 0 WITH HEADER LINE.
DATA: lt_bi0_pcustomer LIKE /bi0/pcustomer OCCURS 0 WITH HEADER LINE.

LOOP AT data_package.
  MOVE-CORRESPONDING data_package TO t_data.
  REFRESH lt_bi0_pcustomer.
  SELECT * FROM /bi0/pcustomer INTO TABLE lt_bi0_pcustomer
    WHERE customer = t_data-sold_to.
  READ TABLE lt_bi0_pcustomer WITH KEY customer = t_data-sold_to.
  IF sy-subrc EQ 0.
    LOOP AT lt_bi0_pcustomer.
      IF lt_bi0_pcustomer-area = '01' OR
         lt_bi0_pcustomer-group = 'J001'.
        DELETE t_data.
        APPEND t_data.
      ENDIF.
    ENDLOOP.
  ENDIF.
ENDLOOP.
data_package[] = t_data[].

* Delete data package when Plant EQ INDIA
DELETE data_package WHERE plant = 'INDIA'.
thanks
Edited by: Bhat Vaidya on Oct 19, 2010 8:41 AM
Edited by: Thomas Zloch on Oct 19, 2010 9:55 AM
Hi,
DELETE DATA_PACKAGE WHERE PLANT = 'INDIA'.
The syntax above deletes data from an internal table.
To delete from the database table, write the statement as follows:
DELETE FROM DATA_PACKAGE WHERE PLANT = 'INDIA'.
You are missing the FROM keyword in that statement.
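For the internal-table case, here is a sketch of the delete logic reworked so that DELETE operates on the table actually being looped over (the original issues DELETE T_DATA without an index while looping over lt_BI0_PCUSTOMER, which is not allowed); the table and field names are taken from the post, and the BW 3.x routine context with header lines is assumed:

```abap
* Sketch only: DATA_PACKAGE and /BI0/PCUSTOMER as described in the post.
DATA: lt_cust LIKE /bi0/pcustomer OCCURS 0 WITH HEADER LINE.

* Buffer the customer master once instead of one SELECT per record.
IF NOT data_package[] IS INITIAL.
  SELECT * FROM /bi0/pcustomer INTO TABLE lt_cust
    FOR ALL ENTRIES IN data_package
    WHERE customer = data_package-sold_to.
  SORT lt_cust BY customer.
ENDIF.

LOOP AT data_package.
* Drop records for plant INDIA.
  IF data_package-plant = 'INDIA'.
    DELETE data_package.
    CONTINUE.
  ENDIF.
* Drop records whose customer master has Area 01 or Group J001.
  READ TABLE lt_cust WITH KEY customer = data_package-sold_to
       BINARY SEARCH.
  IF sy-subrc = 0 AND
     ( lt_cust-area = '01' OR lt_cust-group = 'J001' ).
    DELETE data_package.
  ENDIF.
ENDLOOP.
```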
Is that what you meant, or have I misunderstood your query?
Do you want to delete data from the database table or from the internal table? -
Data loading with 1:N & M:N relationship
Hi,
Which data (master or transaction) we can load with 1 : N relationship and M : N relationship.
Can anyone give me the idea.
Thx
In the case of master data, characteristic InfoObjects with master-data attributes handle the 1:N relation. For example, for material master data, the material number is the key on which data is loaded for each new material. For transaction data with a 1:N relation, you can use a DSO: make the primary keys the key fields of the DSO, so repeating combinations are overwritten. Use an InfoSet when you need an inner join (intersection) or a left outer join between master data/DSO objects. An InfoCube can be used for an M:N relation. For a union between two InfoProviders, a MultiProvider may be used.
Hope this helps you out.
Regards,
Rinal -
Data Load with warnings, but dataload.err empty
Hi guys,
loading data in the EAS Console for an Essbase database gives the message below:
Database has been cleared before loading.
dataload.err is EMPTY.
Any ideas??
Some data is loaded, but how can I find out where it stopped, or whether all my data is loaded?
Message:
Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
Transaction [ 0x190004( 0x4fd89600.0x64578 ) ] aborted due to status [1003050].
There were errors, look in H:\My Documents\temp\dataload.err
Database import completed ['AXN_Sim'.'BewMod']
Output columns prepared: [0]
Thanks, Bernd
Hi,
There is not much you can tell about whether the data is completely loaded; you have to check that yourself.
Since the load was aborted with a warning, it could be a network issue.
Try loading a small file and check whether you still face the problem, or post your findings on it here.
Thanks,
CM -
Material data load with two units (0MAT_UNIT) failed?
Hi,
I am loading 0MAT_UNIT data from R/3 QAS to BW QA, and I am getting the error below.
0MAT_UNIT : Data record 34 ('000000000000010249 ') : Duplicate data record RSDMD 191
There are two records for the same material number with two different units (EA and PAK). The data loaded to BW dev successfully even with duplicate records for a material with two units, but it failed in BW QA.
Any setting I am missing?
Please help.
Thanks,
Steve
If you look at the definition of the P table (/BI0/PMAT_UNIT), you will find that both material and unit are keys. So the system recognizes a record as a duplicate only when a given material with a given unit exists twice in the data. Based on what you are saying, this is not the case: you have two separate units for a given material. Can you please check whether there is another record for the same material in a different data packet?
-
Data Load with HAL Problem with Decimal-delimiter/Language Setting
Hi,
we use HAL to load data into Essbase. The data uses "." as the decimal point, as in
1252.25. When I upload this data with HAL, the number becomes 125225.00.
The Locale of my OS is German
The Locale i specified for the file in HAL is English
If I change the locale of my OS to English the problem disappears, but that is annoying.
Has anybody else such a problem ?
Is There a Solution for this ?
Thanks.
Kevin
Reading over John's blog, we created a rule file.
But it looks like the load process using the rule file is hanging. Are we missing anything? We are using a comma separator and have the number of lines to skip set to zero. -
Extraction problem - selection conditions for data load using abap program
Hi All,
I have a problem loading data over a selected period where the date range is selected using an ABAP routine (type 6). Although the selection date range is populated correctly in the request header tab of the monitor screen, no records are extracted. But if I delete the ABAP filter and enter the same date range directly in the selection, we are able to extract data. If anybody has faced a similar problem and has a solution, please help me with your suggestions.
Thanks,
nithin.
It seems the date range is not properly set in the routine.
You can check the value of the selection period generated by the routine: on the data selection tab there is an Execute button.
Click it to test the selection values generated by the ABAP routine.
If the values look correct, paste the code of the routine you have written along with brief details of the logic you applied.
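For comparison, a minimal sketch of a type-6 selection routine that fills a date range; the field name PSTNG_DATE and the 30-day window are assumptions, and l_t_range/p_subrc come from the generated routine frame:

```abap
DATA: l_idx LIKE sy-tabix.

* Assumed example: select the last 30 days up to today.
READ TABLE l_t_range WITH KEY fieldname = 'PSTNG_DATE'.
l_idx = sy-tabix.

l_t_range-sign   = 'I'.
l_t_range-option = 'BT'.            " between LOW and HIGH
l_t_range-low    = sy-datum - 30.
l_t_range-high   = sy-datum.
MODIFY l_t_range INDEX l_idx.

p_subrc = 0.
```

A common cause of the symptom described (the selection shows in the monitor but no records are extracted) is a wrong SIGN/OPTION value or an empty HIGH together with option BT, so those fields are worth checking first.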
Sonal..... -
Data load through ABAP Program
Hi,
I have an issue with a data source for which I need to load data through an ABAP program.
Can anyone suggest a step-by-step approach to extracting data through ABAP programming? I think my question is not clear, so:
I have a table 'DD02T' for which:
In ECC it has 70,000 records, while in BI it fetches only 20,000 records.
below is the code:
Can anyone suggest how I can get all the records into BI? -
Data loading with routine displays zero for key fig values
Hi all,
my source field is Amount and my target field is Research Amount.
If I restrict Amount (the source field) by cost element (COAEOM7) and value type (010), it is equal to the target field Research Amount.
for this my code is
IF COMM_STRUCTURE-VTYPE = '010' AND
COMM_STRUCTURE-COSTELMNT = 'COAEOM12'.
RESULT = COMM_STRUCTURE-AMOUNT.
ENDIF.
But when I load the data, it displays only zeros.
Pleas suggest
Regards,
Raj.
Hi Raj,
Do you need cost element values other than 'COAEOM12' in the target?
Are you writing this routine in a start routine or an end routine (if it is BI 7)?
If you are loading data from a source cube/DSO to the target structure, you need to write a start routine in the transformation; before that, map the Amount, Cost Element and Value Type fields from the source to Research Amount in the target.
Then you need to write code along these lines:
IF the SOURCE_PACKAGE cost element = 'COAEOM12' AND the SOURCE_PACKAGE value type = '010',
then Research Amount = Amount.
ENDIF.
I hope this will solve your problem. -
Time Data loading with missing format
Hi expert,
after we load time data, we find the following formatting errors:
0CALDAY:
20.080.112 (error)
2008.01.12 (correct)
0CALMONTH:
200.801 (error)
2008.01 (correct)
0CALQUARTER:
20.081 (error)
20081 (correct)
Could anyone let us know how to correct the badly formatted data, step by step? Many thanks!
What is the source for this data?
Have it corrected in the source
Or
Correct in PSA and load it from there.
Does it happen only for a few records ? -
Error in Data loading with ODS
Hi Guru's
I'm loading data from a flat file to an ODS.
The key figure Net Mass Volume, with data type quantity, shows the value multiplied by a thousand, but the PSA shows the correct value from the source system.
For example, the value of this key figure in the flat file is 48.984, meaning 48 point 984,
but in the ODS it shows as 48,984 (comma), so somewhere it seems to be multiplied by 1000.
There are no transfer rules (it is direct mapping) and there are no routines.
Please let me know where the problem is.
Points will be assigned.
Hi Sanjay,
It is not multiplied by 1000; the settings are like that. The decimal separator used here is ',' instead of '.'.
Steps to change the settings:
Go to menu System -> User Profile -> Own Data; in the user profile maintenance screen, go to the Defaults tab. There you will find a dropdown for Decimal Notation.
From there you can change the setting. You can also use transaction SU01.
After changing the settings, log off and log in again for the changes to take effect.
Hope this helps you.
Regards,
Yokesh. -
Master data loading with Time dependent InfoObjects
Hi
For the product master, I have an InfoObject Standard Cost, a time-dependent key figure attribute.
How do I load data for the product master
1. from a flat file?
2. from R/3? Please provide the steps to perform on the R/3 side.
Thanks in advance,
Regards,
Vj
Hi,
Though your material grade is time-dependent and sequentially changing,
you can create 4 different DTPs with a grade selection (and only 1 transformation).
For example, one DTP with filter Grade A, another DTP with Grade B, and so on.
Then execute all 4 DTPs sequentially and activate the master data after every DTP run.
I hope it will work out.
Thanks,
Balaram -
Transction data loading with master data as key?
Expert's,
I am trying to load transaction data with master data as a key. ZEMPID is my master data, which has the attributes ZEMPNAM and ZEMPDES. My transaction data contains ZEMPID, ZEMPSAL and ZEMPAGE. I was able to create the master data and load it successfully. But when I create the InfoSource for the transaction data, I expect to get all the attributes I have defined for ZEMPID, and I do not get them. Also, while creating the cube, if I select the InfoSource, then in the characteristics tab I do not get my characteristics (master data + attributes). Please can anyone help me?
Hello Vasu,
You are talking about transactional data and saying you have not created the cube/ODS yet.
Let me put things in order:
1. When you load master data you create an InfoSource (direct or flexible); all the attributes automatically come into the InfoSource. Once you assign an application component on the master data tab in InfoObject maintenance, it is ready for data to be loaded. Search for the application component under InfoSources and you will see the flow.
2. When you load data into an InfoCube, only transactional data goes into it; master data stays in the master data tables. In the InfoSource you define which fields need to be present in the cube, so you can include only those objects that are in the cube's dimensions, plus attributes that are defined as navigational in the cube.
Master data attributes are never pushed automatically; only compounded objects are pushed automatically.
If you want to load the attributes of ZEMPID, make them navigational in the cube or add them to one of the cube's dimensions.
I hope I have made myself clear.
Thanks
Tripple k -
Data-load with Negetive values
Hi All,
Five people viewed my query but nobody answered.
Is it such a simple query that I was not supposed to post it?
If not, please respond; otherwise please ignore it.
Please advise me on this.
In my application the data file contains positive values.
Only for a few members do I have to load the data as negative values.
Example: Count Type Code is D for Debit, C for Credit.
If D, load Contract and Member counts as negative numbers.
If C, load them as positive.
I think I need to make changes in the rule file.
Please suggest how to write the load rule for this.
Thanks in advance.
Bunny
Edited by: Bunny on Feb 11, 2009 3:55 AM
Hi Bunny,
I don't know the rule-file setting off the top of my head, but I have one solution (you might not like it):
1. Load the complete data with negative values, then use an IF...ELSE condition and include the @ABS function for the required members.
2. That is: apply the condition and use the ABS function, which makes negative numbers positive.
Check out other posts by other experts too.
If it is a development setup and you can change the data source, try this.
Sandeep Reddy Enti
HCC
http://hyperionconsultancy.com/