Delta Update with flat file into InfoCube
Hi
Can anyone explain the steps to achieve a delta update with a flat file into an InfoCube?
Thanks
There is a documented way of managing a delta, but it mostly boils down to how you model the process. SAP says you can add a ROCANCEL field (or something similar to it) to the transaction data within the files.
That way, when you update the data into the cubes, the system can manage whether a record has to be added or overwritten.
Therefore, the best way to manage this would be to make some changes in the flat file.
Are the flat files being generated by some program? Maybe you can extend it to flag the updated records.
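As a rough illustration of the idea above (the field names, key and delimiter are invented for this sketch, not taken from the thread): a delta flat file typically carries a record-mode column, where by SAP convention a blank value means after-image (overwrite), 'D' means delete, and further values such as 'N' and 'R' cover new and reverse images.

```text
DOC_NO;AMOUNT;RECORDMODE
4711;100.00;       <- after-image: overwrites the existing record with key 4711
4712;250.00;       <- after-image of a new key: simply added
4713;0.00;D        <- deletion: the record with key 4713 is removed
```

With such a column mapped to 0RECORDMODE, an ODS object in overwrite mode can derive the delta for the cubes.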
Similar Messages
-
How to Delta Update in Flat Files
Hi all,
I would like to know whether delta update is supported for flat files. If yes, how does this work? Is there an SAP white paper or document on this?
Thanks
Nathan
Hi,
Yes, delta upload is supported for flat files. In this case, when the delta data records are loaded by flat file, only delta type 'F' is used, for DataSources of flat file source systems. Flat files are imported to the PSA when you execute the InfoPackage, and the staging process then begins. The delta queue is not used here.
As Siggi already said in the following thread, post your flat file data to an ODS object with update mode overwrite. The ODS object will then do the delta for you:
Delta Upload for Flat File
Also Check this.......
SAP NetWeaver 7.0 BI: Data Transfer Process with Only get Delta Once
SAP NetWeaver 7.0 BI: Data Transfer Process (DTP) / The Normal DTP ...
Hope this helps.
Regards,
Debjani
Edited by: Debjani Mukherjee on Nov 12, 2008 9:40 AM -
Hello Guys,
We need to define the structure of the flat files we'll receive from our ETL team to specify how we'll manage the updates of the records we already loaded into BW.
We'll load those flat files through PSA then into an ODS that will feed an InfoCube. The records can contain new or modified characteristics and/or key figures.
I read a lot of documentation about deltas but I didn't find a clear answer regarding this subject for flat files:
Do we need to specify, for every single record in the flat files, whether a record is new, modified or deleted, or can the ODS figure that out by itself? I guess it can't (at least for deletions), so do we have to add an additional field flagging which type of modification (new, modified or deleted) a record belongs to?
If we do have to add that information, can I just use ' ' for a new record, 'N' for a modified record and 'D' for a deleted record?
Thank you in advance (and don't worry, I'll reward you).
best regards,
SF
Hi Sebastien,
The answer really depends on what types of records you are working with. If your record has a well-defined key (like a document number) and your ETL team gives you after-images of the changed records, you can set your ODS object up with the same key and use update rules set to overwrite. Then, the ODS object change log will automatically provide delta-compatible updates to the InfoCubes.
Hope that helps.
Adam -
Unsorted Flat File into IDoc with multiple use of nodes
Hi Experts!
I am facing a little problem. I have a source flat file for a classification where some fields appear several times.
My source flat file looks like this:
item1; field1a
item2; fieldA
item3; fieldxa
item1; field1b
As you can see, item1 exists twice (further occurrences are also possible).
Now I have to map the flat file into an IDoc structure.
My target IDoc looks like this
Header
-- node1
attribute1
-- node2
"field1a" and "field1b" have to be mapped into "attribute1" in "node1". "node1" has to be duplicated each time an "item1" appears (and likewise if item2, item3 etc. appear two, three or four times).
So how can I make node1 duplicate automatically when an item appears twice or more? I know it could be possible to work with "SplitByValue", but for that I need all item1 entries in consecutive order, and they don't arrive sorted.
I am looking forward to your suggestions.
Thank you in advance.
Udo
Complex sorting is not possible, or at least not easy, with the graphical mapping tool.
Use a sequence mapping: the first mapping is a simple XSLT which does the sort; the second mapping works as usual.
I have an example XSLT which I used for a different purpose:
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/">
    <ORDERLIST>
      <xsl:for-each select="ORDERLIST/ITEM">
        <xsl:sort select="ID"/>
        <xsl:copy-of select="."/>
      </xsl:for-each>
    </ORDERLIST>
  </xsl:template>
</xsl:stylesheet>
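To illustrate what the stylesheet does (the ORDERLIST/ITEM/ID element names come from Stefan's example; the item values are invented), an unsorted input such as:

```xml
<ORDERLIST>
  <ITEM><ID>item2</ID><VALUE>fieldA</VALUE></ITEM>
  <ITEM><ID>item1</ID><VALUE>field1a</VALUE></ITEM>
  <ITEM><ID>item1</ID><VALUE>field1b</VALUE></ITEM>
</ORDERLIST>
```

comes out with the ITEM elements sorted by ID: both item1 entries first (xsl:sort is stable, so field1a stays before field1b), then item2. A subsequent SplitByValue in the graphical mapping then sees equal keys in consecutive order, which is exactly what the duplication of node1 needs.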
Regards
Stefan -
Problem with extraction of flat file to infocube
Hi there,
We are working on SAP SCM APO release 7, which contains the BW release 7 component, and we're trying to load data from a flat file into an InfoCube like we used to in release 5.
The problem is that we get the message that no data is available every time we run the DTP. The data flow is:
1. source system: flat file
2. data source: a flat file according with our CSV file
3. transformation between the data source and the infocube
4. infocube
Do I need to create an InfoSource even though I'm loading from a flat file? With release 5 of SAP SCM APO I didn't need to create an InfoSource; it worked fine from the DataSource to the InfoCube.
We can't find any related notes, so I'm asking whether you have experienced this before with release 7.0 and how you solved it.
Regards
Hi,
In BI, the DTP loads data directly from the DataSource to the InfoCube; there is no need to create an InfoSource. But make sure that you have loaded data up to your DataSource, i.e. into the PSA. For that you need an InfoPackage to load data from the flat file to the PSA. Then, from the PSA, you can load data to the cube with your DTP.
You have to create a flat file DataSource and give it the path to the CSV file; I guess you have already done this. Now right-click on your DataSource and choose Create InfoPackage, check the path here as well, and start the InfoPackage. It will load data into your PSA. If this load is successful, you have data in your PSA; right-click on your DataSource and choose Manage to check the data in the PSA. You also need a transformation between your DataSource and the cube, which you already have. Now run your DTP and your data will be loaded from the PSA.
Indrashis -
Load flat file into oracle with SQL Loader
Hi All,
Oracle 9i
I want to load a flat file into Oracle with the help of SQL*Loader, but I want to skip some columns from the flat file.
Can anyone tell me how we can skip a column from the flat file? I can't open the flat file in Excel as CSV due to the large volume.
Does anyone have a solution for this?
Umesh Goel
FILLER can be used when we want to skip a database table column, or when we want to put null into a database table column,
but if we have 10 columns in the flat file and we want to load only columns 1, 2, 5 and 7 from the flat file,
then I think FILLER will not work.
If it does, please let me know.
thx
UG -
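For what it's worth, FILLER fields can also skip input columns in the data file, not only table columns: the field is parsed against the input but never loaded. A sketch of a control file (table and column names are invented) that loads only input columns 1, 2, 5 and 7 of a 10-column comma-separated file:

```sql
LOAD DATA
INFILE 'data.txt'
INTO TABLE target_tab
FIELDS TERMINATED BY ','
( col1,
  col2,
  skip3  FILLER,   -- input column 3: read from the file, never loaded
  skip4  FILLER,
  col5,
  skip6  FILLER,
  col7,
  skip8  FILLER,
  skip9  FILLER,
  skip10 FILLER
)
```

The FILLER names are placeholders; they only have to be unique within the field list.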
Hello Everyone
I am Srikanth. I would like to know whether we have a facility for delta upload for flat files. If yes, can you please give me the steps?
thanks in advance
Srikanth
Hi Sabrina, thank you for your help. I did load the data from the cube to the ODS.
Steps:
1. I generated an export DataSource on the cube.
2. I found the name of the cube, with prefix 8<infocube name>, in the InfoSources under the DM application component.
3. A communication structure and transfer rules were already activated, but when creating update rules for the ODS I got the message that 0RECORDMODE is missing in the InfoSource.
4. So I went to the InfoSource, added 0RECORDMODE to the communication structure and activated it, but the transfer rules stayed yellow; there was no object assigned to 0RECORDMODE, but I activated anyway.
5. I went back to the ODS and created and activated the update rules (this time I didn't get any message about 0RECORDMODE).
6. I created an InfoPackage and loaded.
a) Now my question: without a green signal on the transfer rules, how did data get populated into the ODS? In your answer you mentioned creating the communication structure and transfer rules, but I didn't do anything there.
b) Will I face any problem if I keep loading the data into the ODS from the cube with the yellow signal? Is this a correct procedure? Please correct me. Thanks in advance. -
Error while loading flat file into DSO
Hi
I am loading data from a flat file into a DSO. The fields in my flat file are Trans_Dt (CHAR), Particulars (CHAR), Card_Name (CHAR), Exps_Type (CHAR),
Debit_Amount, Credit_Amount, ***._Amt, Open_Bal_Check_Acnt (CURR), and
0Currency (CHAR).
In the Proposal tab, apart from the above-mentioned fields, 3 additional fields (field 10, field 11 and field 12) have appeared. How did these 3 additional fields come in when I don't have any additional fields in my flat file? I've deleted these extra 3 fields, though.
When I activate the DataSource it gets activated, but then I get the message 'Data structures were changed. Start transactions beforehand'. What does this message mean?
When I hit the 'Read Preview Data' button, it doesn't show me any data and gives me the error 'Missing reference to currency field / unit field' for the fields Debit_Amount, Credit_Amount, ***._Amt and Open_Bal_Check_Acnt.
How do I create a reference field for the above-mentioned fields?
Earlier I didn't have the 0Currency field in the flat file. But in my DSO, while creating the key figures, the 0Currency field also got created by default, which is quite obvious. Then, while activating the transformations, I got the message 'No source field for the field 0Currency'. Hence I had to create a new field in my flat file called 0Currency and fill it with USD in all rows.
Please help me in loading this flat file into the DSO.
Thank you.
TR.
Hi guys,
Thanks a lot for your answers. With your help I could see the data in 'Read Preview Data' and schedule the load. I used all the InfoObjects in the InfoObjects column of the DataSource to load the flat file.
The data arrived in the PSA successfully, without any issues, but when I executed the DTP it failed with errors.
Earlier there was no mapping from the Currency field in the source to all the key figure fields in the target of the transformation; the mapping was only from Currency to 0CURRENCY, but the transformation still got activated. As per your advice I mapped the Currency field to the remaining key figure fields, but then I got the error
'Source parameter CURRENCY is not being used'
Why is that so?
List of errors after executing the DTP:
1. 'Record filtered because records with the same key contain errors'
Message:
Diagnosis: The data record was filtered out because data records with the same key have already been filtered out in the current step for other reasons, and the current update is non-commutative (for example, MOVE). This means that data records cannot be exchanged on the basis of the semantic key.
System Response: The data record is filtered out; further processing is performed in accordance with the settings chosen for error handling.
Procedure: Check these data records and data records with the same key for errors and then update them.
Procedure for System Administration
Can you please explain this error and how I should fix it?
2. Exception input_not_numeric; see long text - ID RSTRAN
Diagnosis: An exception input_not_numeric was raised while executing function module RST_TOBJ_TO_DERIVED_TOBJ.
System Response
Processing the corresponding record has been terminated.
Procedure
To analyse the cause, set a breakpoint in the program of the transformation at the call point of function module RST_TOBJ_TO_DERIVED_TOBJ. Simulate the data transfer process to investigate the cause.
Procedure for System Administration
What does this error mean? How do I set a breakpoint in the program to fix this error in order to load the data?
What does 'Procedure for System Administration' mean?
Please advise.
Thank you.
TR. -
How can I load a flat file into a ZTABLE dynamically
I need to create a program which can load a Z table from a flat file (delimited and fixed-width options required). We have many Z tables where this will be required, so I was hoping to do it dynamically somehow; otherwise I will have to create one ABAP program for every Z table we have to load.
My inputs should be:
PARAMETERS: p_ztable TYPE ddobjname,     "Z Table Name
            p_infile(132) LOWER CASE,    "File Name
            p_delim(1).                  "Delimiter
I know that I can read the file by using GUI_UPLOAD:
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename            = c_infile
    has_field_separator = p_delim
  TABLES
    data_tab            = indata
  EXCEPTIONS
    file_open_error     = 1
    file_read_error     = 2
    OTHERS              = 9.
I know that I can split the contents of this file (if a delimiter is used). However, I will not know the actual field names of the table until runtime, so I don't know which fields to split it into. In the example below I have the actual table (t_rec) and each of the fields (pernr, lgart, etc.) in the code, but each table I need to load will be different; it will have a different number of fields as well.
FORM read_data_pc.
  LOOP AT indata.
    PERFORM splitdata USING indata.
    APPEND t_rec.
    CLEAR t_rec.
  ENDLOOP.
ENDFORM.
FORM splitdata USING p_infile.
  SPLIT p_infile AT p_delim INTO
    t_rec-pernr    "Employee #
    t_rec-lgart    "Wage Type
    t_rec-begda    "Effective date
    t_rec-endda.   "End date
ENDFORM.
Once I split the data into the fields, I can just validate and insert the record.
Does anyone have any ideas? Specific code examples would be great. Thanks!!
Hi Janice,
Try this sample code, where you can upload data from a flat file into an internal table.
REPORT z_test.
TABLES: mara.
FIELD-SYMBOLS: <fs>.
DATA: fldname(50) TYPE c.
DATA: col TYPE i.
DATA: cmp LIKE TABLE OF rstrucinfo WITH HEADER LINE.
DATA: itab TYPE TABLE OF alsmex_tabline WITH HEADER LINE.
DATA: BEGIN OF zupload1_t OCCURS 0,
        matnr LIKE mara-matnr,
        ersda LIKE mara-ersda,
        ernam LIKE mara-ernam,
        laeda LIKE mara-laeda,
      END OF zupload1_t.
* DATA: zupload1_t LIKE mara OCCURS 0 WITH HEADER LINE.
DATA: wa_data LIKE zupload1_t.

* Selection screen
SELECTION-SCREEN: BEGIN OF BLOCK blk WITH FRAME TITLE text-001.
SELECTION-SCREEN: SKIP 1.
PARAMETERS: p_file LIKE rlgrap-filename.
SELECTION-SCREEN: SKIP 1.
SELECTION-SCREEN: END OF BLOCK blk.

* F4 value help for the file
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
  CALL FUNCTION 'KD_GET_FILENAME_ON_F4'
    EXPORTING
      program_name  = syst-repid
      dynpro_number = syst-dynnr
      field_name    = ' '
      static        = 'X'
      mask          = ' '
    CHANGING
      file_name     = p_file
    EXCEPTIONS
      mask_too_long = 1
      OTHERS        = 2.
  IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.

START-OF-SELECTION.
* Read the Excel file into a cell table (one entry per cell: row/col/value)
  CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
    EXPORTING
      filename    = p_file
      i_begin_col = 1
      i_begin_row = 1
      i_end_col   = 5
      i_end_row   = 12507
    TABLES
      intern      = itab
    EXCEPTIONS
      inconsistent_parameters = 1
      upload_ole              = 2
      OTHERS                  = 3.
  IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.

* Get the component names of the target work area
  CALL FUNCTION 'GET_COMPONENT_LIST'
    EXPORTING
      program    = sy-repid
      fieldname  = 'ZUPLOAD1_T'
    TABLES
      components = cmp.

* Transpose the cell table into structured rows: assign each cell value
* to the component matching its column, append at the end of each row
  LOOP AT itab.
    col = itab-col.
    READ TABLE cmp INDEX col.
    CONCATENATE 'ZUPLOAD1_T-' cmp-compname INTO fldname.
    ASSIGN (fldname) TO <fs>.
    <fs> = itab-value.
    AT END OF row.
      APPEND zupload1_t.
      CLEAR zupload1_t.
    ENDAT.
  ENDLOOP.
  DELETE zupload1_t WHERE matnr EQ space.

  LOOP AT zupload1_t INTO wa_data.
*   Here I am just checking; in real use insert into your Z table instead
    MOVE-CORRESPONDING wa_data TO mara.
    INSERT mara.
    WRITE: / wa_data-matnr, 20 wa_data-ersda,
           45 wa_data-ernam, 55 wa_data-laeda.
  ENDLOOP.
I have tried it for MARA. Please let me know whether it was helpful.
Regards,
Kannan -
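Coming back to the dynamic requirement in Janice's original question: with runtime data creation the table name and delimiter can stay parameters, so one program serves all Z tables. A rough, untested sketch (error handling and field conversions omitted; a naive MOVE is used per component, so date/amount fields may need conversion logic):

```abap
REPORT z_dyn_upload.

PARAMETERS: p_ztable TYPE ddobjname,     "Z table name
            p_infile(132) LOWER CASE,    "file name
            p_delim(1).                  "delimiter

DATA: lr_tab  TYPE REF TO data,
      lr_line TYPE REF TO data,
      lv_file TYPE string,
      lt_raw  TYPE TABLE OF string,
      lv_raw  TYPE string,
      lt_cols TYPE TABLE OF string,
      lv_col  TYPE string,
      lv_idx  TYPE i.

FIELD-SYMBOLS: <lt_data> TYPE STANDARD TABLE,
               <ls_line> TYPE any,
               <lv_comp> TYPE any.

START-OF-SELECTION.
* Build a work area and an internal table typed like the Z table
* whose name is only known at runtime
  CREATE DATA lr_tab TYPE STANDARD TABLE OF (p_ztable).
  ASSIGN lr_tab->* TO <lt_data>.
  CREATE DATA lr_line TYPE (p_ztable).
  ASSIGN lr_line->* TO <ls_line>.

  lv_file = p_infile.
  CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
      filename = lv_file
    TABLES
      data_tab = lt_raw
    EXCEPTIONS
      OTHERS   = 1.

* Split each line at the delimiter and move the pieces into the
* structure components by position
  LOOP AT lt_raw INTO lv_raw.
    SPLIT lv_raw AT p_delim INTO TABLE lt_cols.
    CLEAR <ls_line>.
    lv_idx = 0.
    LOOP AT lt_cols INTO lv_col.
      lv_idx = lv_idx + 1.
      ASSIGN COMPONENT lv_idx OF STRUCTURE <ls_line> TO <lv_comp>.
      IF sy-subrc <> 0. EXIT. ENDIF.
      <lv_comp> = lv_col.
    ENDLOOP.
    APPEND <ls_line> TO <lt_data>.
  ENDLOOP.

* Dynamic database insert into the named Z table
  INSERT (p_ztable) FROM TABLE <lt_data>.
```

The same pattern covers the fixed-width case if you substitute the SPLIT with offset/length moves driven by the component metadata.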
Load a flat file into BW-BPS using SAP GUI
Hi,
We are using BW-BPS version 3.5. I implemented the how-to guide "How to load a flat file into BW-BPS using SAP GUI" successfully, without any errors.
I included three InfoObjects in the text file: cost element, posting period and amount. The same three InfoObjects make up the file structure in the global data, as specified in the how-to document.
The flat file format is like this:
Costelmnt Postingperiod Amount
XXXXX #
XXXXX 1 100
XXXXX 2 800
XXXXX 3 700
XXXXX 4 500
XXXXX 5 300
XXXXX 6 200
XXXXX 7 270
XXXXX 8 120
XXXXX 9 145
XXXXX 10 340
XXXXX 11 147
XXXXX 12 900
I successfully loaded the above flat file into the BPS cube, and it is displayed in the layout as well.
But users are requesting to load the flat file in the format below:
Costelmnt Annual(PP=#) Jan(PP=1) Feb(PP=2) ........................................Dec(PP=12)
XXXXX Blank 100 800 900
Is it possible to load a flat file like this?
They want to load a single row instead of 13 rows for each cost element.
How can this be done? Please advise if anybody has come across this requirement.
In the InfoCube we have got only one InfoObject 0FISCPER3 (posting period) and one 0AMOUNT (amount).
Do we need 13 InfoObjects, one for each posting period and amount?
Is there any possibility to implement a user exit, as we use in BEx queries?
Please share your ideas on this.
Thanks in advance
Best regards
SS
Hi,
There are 2 ways to do this.
One is to change the structure of the cube to have 12 key figures for the 12 posting periods.
Another way is to write an ABAP function module to fetch the values from each record based on the posting period and store them in the cube for the corresponding characteristic. This way, you don't have to change the structure of the cube.
If this particular cube is not used anywhere else, I would suggest changing the structure itself.
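The second option, transposing each wide row into 12 cube records, could be sketched like this; all names and types here are invented for illustration, not taken from the how-to guide:

```abap
* Hypothetical input row: cost element + 12 period amounts
TYPES: BEGIN OF ty_in,
         costelmnt TYPE c LENGTH 10,
         amt01 TYPE p DECIMALS 2, amt02 TYPE p DECIMALS 2,
         amt03 TYPE p DECIMALS 2, amt04 TYPE p DECIMALS 2,
         amt05 TYPE p DECIMALS 2, amt06 TYPE p DECIMALS 2,
         amt07 TYPE p DECIMALS 2, amt08 TYPE p DECIMALS 2,
         amt09 TYPE p DECIMALS 2, amt10 TYPE p DECIMALS 2,
         amt11 TYPE p DECIMALS 2, amt12 TYPE p DECIMALS 2,
       END OF ty_in.
* Narrow output row matching the cube model (0FISCPER3 + 0AMOUNT)
TYPES: BEGIN OF ty_out,
         costelmnt TYPE c LENGTH 10,
         fiscper3  TYPE n LENGTH 3,    "posting period 001..012
         amount    TYPE p DECIMALS 2,
       END OF ty_out.

DATA: ls_in  TYPE ty_in,              "assume: filled from one file row
      lt_out TYPE TABLE OF ty_out,
      ls_out TYPE ty_out,
      lv_idx TYPE i.
FIELD-SYMBOLS <lv_amt> TYPE any.

* Turn one wide row into 12 narrow records
DO 12 TIMES.
  lv_idx = sy-index + 1.              "component 1 is the cost element
  ASSIGN COMPONENT lv_idx OF STRUCTURE ls_in TO <lv_amt>.
  ls_out-costelmnt = ls_in-costelmnt.
  ls_out-fiscper3  = sy-index.        "converted to '001'..'012'
  ls_out-amount    = <lv_amt>.
  APPEND ls_out TO lt_out.
ENDDO.
```

The annual (PP=#) column would need its own handling, since 0FISCPER3 has no period value for it.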
Hope this helps. -
Uploading the data from a flat file into ztable
Hi,
I have a requirement where I have to upload the data from 2 flat files into 2 Z tables (ZRB_HDR, ZRB_ITM). From the 1st flat file, data for only a few fields has to be uploaded into the Z table ZRB_HDR. From the 2nd flat file, data for all the fields has to be uploaded into the Z table ZRB_ITM. How can I do this?
Regards,
Hema
Hi,
Declare two internal tables with the structure of your tables.
Your flat files should be .txt files.
Now make use of the GUI_UPLOAD function module to upload your flat files into the internal tables.
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename            = 'c:\file1.txt'
    has_field_separator = 'X'
  TABLES
    data_tab            = itab1
  EXCEPTIONS
    OTHERS              = 1.
Use this function twice, once for each table.
Then loop over the internal tables individually and make use of the INSERT command.
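That last step might look like this (the declarations simply mirror the two Z tables; an untested sketch without error handling):

```abap
DATA: itab1 TYPE TABLE OF zrb_hdr,
      itab2 TYPE TABLE OF zrb_itm,
      wa1   TYPE zrb_hdr,
      wa2   TYPE zrb_itm.

* ... after GUI_UPLOAD has filled itab1 and itab2 ...

LOOP AT itab1 INTO wa1.
  INSERT zrb_hdr FROM wa1.   "one INSERT per header record
ENDLOOP.
LOOP AT itab2 INTO wa2.
  INSERT zrb_itm FROM wa2.   "one INSERT per item record
ENDLOOP.
```

For the header file, where only a few fields are wanted, fill wa1 field by field (or via MOVE-CORRESPONDING from a staging structure) before the INSERT.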
Issue with flat file loading timing out
Hello
I have a scenario where I am loading a flat file with 65k records into a cube. The problem is that, in the loading process, I have to look up the 0MATERIAL table, which has a million records.
I do have an internal table in the program into which I select a subset of the material table, i.e. around 20k to 30k records. But my extraction process takes more than 1 1/2 hours and is failing (timed out).
How can I address this issue? I tried building indexes on the material table, and it is not helping.
Thanks,
Srinivas.
Unfortunately, this is BW 3.5, so there is no end routine option here. And I tried both .csv and Notepad file methods; both are creating problems for us.
This is the complete code; do you see any potential issues?
Start routine (main code):
REFRESH i_oldmats.
REFRESH zi_matl.
DATA: wa_datapak TYPE transfer_structure.
* Collect all the old material numbers from the flat file into i_oldmats
LOOP AT datapak INTO wa_datapak.
  i_oldmats-/bic/zzoldmatl = wa_datapak-/bic/zzoldmatl.
  COLLECT i_oldmats.
ENDLOOP.
SORT i_oldmats.
* ZI_MATL only has records where old materials exist
* (this gets about 300k records out of 1M)
SELECT /bic/zzoldmatl material
  FROM /bi0/pmaterial
  INTO zi_matl
  FOR ALL ENTRIES IN i_oldmats
  WHERE /bic/zzoldmatl = i_oldmats-/bic/zzoldmatl.
  COLLECT zi_matl.
ENDSELECT.
SORT zi_matl.
Transfer rule routine (main code):
IF tran_structure-material = 'NA'.
  READ TABLE zi_matl INTO zw_matl
       WITH KEY /bic/zzoldmatl = tran_structure-/bic/zzoldmatl
       BINARY SEARCH.
  IF sy-subrc = 0.
    result = zw_matl-material.
  ENDIF.
ELSE.
  result = tran_structure-material.
ENDIF.
Regards,
Srinivas. -
Delta upload for flat file extraction
Hello everyone,
Is it possible to initialize a delta update for a flat file extraction? If yes, please explain how to do that.
Hi Norton,
For a flat file DataSource, the upload will always be FULL.
If you need to extract delta records from a flat file, we can write a routine at InfoPackage level; please refer to the following document:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a0b3f0e2-832d-2c10-1ca9-d909ca40b54e?QuickLink=index&overridelayout=true&43581033152710
Regards,
Harish. -
Working with flat file as source in owb 10.2
Hi,
I am working with a flat file as source. While validating the mapping I am getting the following error:
"To specify a data file, configure the mapping, add a node under 'Source data file', type in the file name
and select the file location."
Please give me a suggestion; it is urgent.
Hi Venkat,
I tried the following steps:
1. In Design Center, select your mapping, right-click and select Configure.
2. Select SQL*Loader data files and select Create.
3. On the right-hand side, for Data File Name, enter your source file name (e.g. source.csv).
4. Click the OK button.
5. Open the mapping and validate.
The mapping validates. After validating, I deployed the mapping. Up to this point the mapping works fine.
But when I start the mapping, it completes with errors.
The error message is:
Status
Error Log
RPE-01013: SQL Loader reported error condition, number 1.
LRM-00112: multiple values not allowed for parameter 'control'
Job Summary
Updated : 2009-02-24 15:32:43.0 Job Final Status : Completed with errors Job Processed Count : 1 Job Error Count : 1 Job Warning Count : 0
Please give me the suggestions.
Thanks,
Venkat -
Upload data from flat file into internal table
Hi friends,
I want to upload the data from a flat file into an internal table, but the problem is that all the columns in that flat file are separated by the "|" character instead of tabs.
Please help me out.
Hello,
Do it like this:
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename                = lv_filename
    filetype                = 'ASC'
    has_field_separator     = 'X'   " Check here
*   header_length           = '1'
*   read_by_line            = 'X'
*   dat_mode                = ' '
*   codepage                = ' '
*   ignore_cerr             = abap_true
*   replacement             = '#'
*   check_bom               = ' '
* IMPORTING
*   filelength              =
*   header                  =
  TABLES
    data_tab                = it_cojrnl
  EXCEPTIONS
    file_open_error         = 1
    file_read_error         = 2
    no_batch                = 3
    gui_refuse_filetransfer = 4
    invalid_type            = 5
    no_authority            = 6
    unknown_error           = 7
    bad_data_format         = 8
    header_not_allowed      = 9
    separator_not_allowed   = 10
    header_too_long         = 11
    unknown_dp_error        = 12
    access_denied           = 13
    dp_out_of_memory        = 14
    disk_full               = 15
    dp_timeout              = 16
    OTHERS                  = 17.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
Vasanth
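One caveat on the above: HAS_FIELD_SEPARATOR = 'X' makes GUI_UPLOAD split at tab characters, so for a pipe-delimited file a common alternative is to read raw lines and SPLIT them yourself. A sketch (the target structure and its field names are invented for illustration):

```abap
DATA: lv_filename TYPE string VALUE 'c:\data.txt',
      lt_lines    TYPE TABLE OF string,
      lv_line     TYPE string,
      BEGIN OF wa,                 "assumed target structure
        field1 TYPE c LENGTH 10,
        field2 TYPE c LENGTH 20,
        field3 TYPE c LENGTH 20,
      END OF wa,
      it_data LIKE TABLE OF wa.

CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename = lv_filename
    filetype = 'ASC'               "plain lines, no tab splitting
  TABLES
    data_tab = lt_lines
  EXCEPTIONS
    OTHERS   = 1.

LOOP AT lt_lines INTO lv_line.
* split each line at the pipe character into the structure's fields
  SPLIT lv_line AT '|' INTO wa-field1 wa-field2 wa-field3.
  APPEND wa TO it_data.
ENDLOOP.
```

This keeps the upload itself untouched and moves the delimiter handling into plain ABAP, where any separator character works.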