One Time Flat File load in a Cube that extracts from ECC
Hi,
I have a cube that already extracts data from ECC on a daily basis. I have been asked to load some historical data into the cube from a legacy system. My question is: how do I prepare the file from the legacy system to match the fields in the cube? What technical points do I need to look at?
For Example:
0MATERIAL in the Cube: Material Number in the Flat File
ZAMOUNT in the Cube: Cost of Product in the Flat File
How do I arrange to match these?
Thanks
Hi
Create a DataSource and enter the required objects
Preview the data
Create the transformation (map the DataSource fields to the target objects - direct assignment)
Create an InfoPackage and extract the data up to the PSA
Create a DTP and run it - it will move the data from the PSA to the target
DataSource settings for such a file:
No. of header rows to be ignored: 1
File type: CSV
Data separator: ;
Escape sign: "
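A file matching those settings might be produced like this (a sketch; the two column names are taken from the question, and Python's `quotechar` plays the role of the escape sign):

```python
import csv

# One header row -> "No. of header rows to be ignored: 1".
rows = [
    ("0MATERIAL", "ZAMOUNT"),
    ("M-100", "25,50"),
    ("M-200", "1;5"),   # contains the separator, so it must be escaped
]

with open("upload.csv", "w", newline="") as f:
    # delimiter and quotechar mirror "Data separator: ;" and 'Escape sign: "'
    writer = csv.writer(f, delimiter=";", quotechar='"')
    writer.writerows(rows)
```

Only the field containing the separator gets wrapped in the escape sign; the others are written as-is.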
Thanks,
Similar Messages
-
FLAT FILE LOADING INTO TRANSACTION CUBE
Hi Guru's,
I'm working on BPS to load a flat file into a transaction cube.
When I'm activating the init function module, I'm getting the following error:
'IS NO COMPONENT EXIST WITH THE NAME PARNM'.
The error comes when the FM is reading the file name.
Thanks in advance.
Hi Vijay,
Just check Function Module Parameters:
IMPORT: I_AREA TYPE UPC_Y_AREA
I_PLEVEL TYPE UPC_Y_PLEVEL
I_PACKAGE TYPE UPC_Y_PACKAGE
I_METHOD TYPE UPC_Y_METHOD
I_PARAM TYPE UPC_Y_PARAM
IT_EXITP TYPE UPF_YT_EXITP
ITO_CHASEL TYPE UPC_YTO_CHASEL
ITO_CHA TYPE UPC_YTO_CHA
ITO_KYF TYPE UPC_YTO_KYF
EXPORT: ETO_CHAS TYPE ANY TABLE
ET_MESG TYPE UPC_YT_MESG -
Best practice around handling time dependency for flat file loads
Hi folks,
This is a fairly common situation - handling time dependency for flat file loads. Can anyone share their experience with it? One common approach is to handle the time-validity changes within the flat file itself, where they are easily changeable by the user, but that is prone to user input errors. Another is to handle this via a DSO. Possibly the data could also be entered directly in BI using IP planning layouts. There is an IP planning function that allows loading flat file data, but it only works without the time-dependency factor.
It would be great to hear your thoughts, or if anyone can point to a best practice document for such a scenario.
Thanks.
Bump!
-
What are the settings for datasource and infopackage for flat file loading
Hi,
I'm trying to load data from a flat file to a DSO. Can anyone tell me what the settings for the DataSource and InfoPackage are for flat file loading?
Please let me know.
Regards,
Kumar
Loading of transaction data in BI 7.0: a step-by-step guide on how to load data from a flat file into the BI 7 system
Uploading of Transaction data
Log on to your SAP system.
Transaction code RSA1 leads you to Modeling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( Transaction data )
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Fields tab you can give the technical names of the InfoObjects from the template; then you do not have to map them during the transformation - the system will map them automatically. If you do not enter them in the Fields tab, you have to map them manually in the transformation of the InfoProvider.
Activate data source and read preview data under preview tab.
Create an InfoPackage by right-clicking the DataSource and, in the Schedule tab, click Start to load the data into the PSA (make sure the flat file is closed during loading).
3. Creation of data targets
In left panel select info provider
Select created info area and right click to create ODS( Data store object ) or Cube.
Specify a name for the ODS or cube and click Create.
From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
Click Activate.
Right click on ODS or Cube and select create transformation.
In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
Activate created transformation
Create a Data Transfer Process (DTP) by right-clicking the data target.
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
4. Monitor
Right-click the data target, select Manage, and in the Contents tab choose Contents to view the loaded data. An ODS has two tables, a new table and an active table; to move data from the new table to the active table you have to activate the request after loading. Alternatively, the monitor icon can be used.
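Before scheduling the InfoPackage in step 2, a quick structural check of the flat file can save a failed load. A minimal sketch (the expected column count is whatever your DataSource defines; this is not an SAP tool, just a local pre-check):

```python
import csv

def check_flat_file(path, expected_cols, delimiter=","):
    """Cheap sanity check before scheduling the InfoPackage: every row
    should have the same number of columns as the DataSource expects."""
    problems = []
    with open(path, newline="") as f:
        for i, row in enumerate(csv.reader(f, delimiter=delimiter), start=1):
            if len(row) != expected_cols:
                problems.append((i, len(row)))
    return problems
```

An empty result means the file is at least structurally consistent; any tuples returned name the row number and its actual column count.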
Loading of master data in BI 7.0:
For Uploading of master data in BI 7.0
Log on to your SAP system.
Transaction code RSA1 leads you to Modeling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Fields tab you can give the technical names of the InfoObjects from the template; then you do not have to map them during the transformation - the system will map them automatically. If you do not enter them in the Fields tab, you have to map them manually in the transformation of the InfoProvider.
Activate data source and read preview data under preview tab.
Create an InfoPackage by right-clicking the DataSource and, in the Schedule tab, click Start to load the data into the PSA (make sure the flat file is closed during loading).
3. Creation of data targets
In left panel select info provider
Select created info area and right click to select Insert Characteristics as info provider
Select required info object ( Ex : Employee ID)
Under that info object select attributes
Right click on attributes and select create transformation.
In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
Activate created transformation
Create Data transfer process (DTP) by right clicking the master data attributes
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets. -
Flat File loading Initialize with out Data transfer is disabled in BI 7.0
Hi experts,
When loading through a flat file in BI 7.0, at the InfoPackage level "Initialization Delta Process with Data Transfer" is selected by default, but when I want to select "Initialization Delta Process without Data Transfer" it is disabled. (In the creation of the flat file DataSource, in the Extraction tab, the delta process was changed to FIL1 Delta Data (Delta Images).)
please provide me Solution.
regards
Subba Reddy.
Hi Shubha,
For the flat file load, please go through the following link:
http://help.sap.com/saphelp_nw70/helpdata/EN/43/03450525ee517be10000000a1553f6/frameset.htm
This will help.
Regards,
Mahesh -
Format for an Excel file to load data into an InfoCube and on to a planning area
Hi gurus,
I need the format for an Excel file to load data into an InfoCube and on to a planning area.
Can you tell me what I should maintain in the header?
What I have so far is:
plant,location,customer,product,history qty,calendar
100,delhi,suresh,nokia,250,2011211
Is this right or wrong? Can you explain and send me the Excel file format?
Babu
Hi Babu,
The file format should match what you want to upload. The sequence of the file columns should be the same as the communication structure:
first the columns with characteristics (e.g. plant, location, customer, product),
then the date column (check the date format, e.g. calendar),
and last the columns with key figures (history qty).
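The ordering above can be sketched as a concrete row layout (the column names are the hypothetical ones from the question; note that the sample date 2011211 is only 7 digits, while a calendar-day value is normally 8 characters, YYYYMMDD):

```python
# Hypothetical column layout: characteristics first, then the date
# column, key figures last -- mirroring the communication structure.
header = ["plant", "location", "customer", "product", "calday", "history_qty"]
row    = ["100", "delhi", "suresh", "nokia", "20110211", "250"]

# A calendar-day value should be a full 8-character YYYYMMDD string.
assert len(row[header.index("calday")]) == 8

line = ",".join(row)
```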
Hope this helps.
Regards,
Nawanit -
Hi Friends,
I am struggling with a flat file loading problem. I am trying to load a .csv file into a data target. I took all precautions while loading; I checked the preview and simulated the data, and everything was OK. But when I schedule the load, I find 0 records in the monitor. The following is the status message for the problem:
No data available
Diagnosis
The data request was a full update.
In this case, the corresponding table in the source system does not
contain any data.
System response
Info IDoc received with status 8.
Procedure
Check the data basis in the source system.
Can anybody help me with what the problem is and the procedure to resolve it?
Regards,
Mahesh
Hi Eugene,
Thanks for the quick reply. The following screen-shot tells you the messages of detail tab;
OVER ALL STATUS MISSING WITH MESSAGES OR WARNINGS
REQUEST: MISSING MESSAGES
EXTRACTION
EVERYTHING IS OK
DATA REQUEST RECEIVED
NO DATA AVAILABLE, DATA SELECTION ENDED.
PROCESSING
NO DATA
The above message was shown in details tab. Pls guide me to locate the problem.
Regards,
Mahesh -
Transformation not generating-Flat file loading
Hello guys, I hope you can help me with this little confusion I have on BI7 Flat file loading.
I got a File (CSV) on my workstation. I am trying to load Master Data. Here is the example of my file and issues:
Let's say I have a CSV file named "CarModel.CSV" on my PC.
This CSV file has 10 records and no attributes for this field.
So the records should show like this and it is showing correctly in PSA level, DS level.
A
B
C
D
E
F
My goal is to load Flat file data to InfoObject (inserted in Infoprovider)
I created the source system, DataSource, and all that.
I am now on the Display DataSource screen, under the Proposal tab. I set it to show 10 records and hit "Load example data"; it works fine and shows all 10 records. However, in the bottom part of this screen, what should show as a field? In my case it first showed the first record ("A"). I didn't think that was correct, but I proceeded anyway, and the transformation could not be generated.
I also tried deleting the "1" from the field "No. of header rows to be ignored", with the same result - no transformation. I know it should be very simple to load this data, but I am not sure what I am doing wrong. My questions are:
1) What should show under the Field/Proposal tab in my case? Am I supposed to create this field?
2) Should the system propose the field from my flat file header? In my case I don't have any header. Should I include a header in my CSV file, like "car model"? I am confused; please give me some info. Thanks
DK
Hi,
In the Fields tab you have to enter your InfoObject names in order; press Enter and it will automatically fill in the description and the other properties.
I think you should have a header, e.g. "customer,cust ID"; those are the names you enter as fields in the Proposal tab. Try that.
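Following that advice, prepending a header line to the header-less file is a one-liner; a sketch (the field name CARMODEL is just an example):

```python
def add_header(path, out_path, header_fields):
    """Prepend a header line so the Proposal tab proposes a field name
    instead of treating the first data record ("A") as the field."""
    with open(path) as src:
        body = src.read()
    with open(out_path, "w") as dst:
        dst.write(",".join(header_fields) + "\n" + body)
```

After adding the header, set "No. of header rows to be ignored" back to 1 so the header line itself is not loaded as data.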
rgds, -
Unit Code, Commercial Code - Flat File loading vs BPS Loading
Hi SDN Community,
Recently, we have moved our flat file loading to be performed by BPS interfaces.
During flat file loads, the Commercial code of all units of measures, are verified.
eg.DAY
But when loaded by BPS, the UNIT code is verified.
eg. TAG.
In the display of the BPS upload, it displays DAY which is the commercial code.
The only thing is that the customer is forced to use TAG in the BPS upload files.
They wish to create another record in the transaction
CUNI to be
Unit code = DAY
Commercial code = DAY.
However, i have found that we cannot allocate the same commercial code to another unit code.
Is this a design constraint, or a process error on my part?
Thank you.
Simon
04.01.2010 - 02:22:53 CET - Reply by SAP
Dear customer,
I do not fully understand how this works, but the base table T006A
has 2 entries, one for English and one for German. Should it not be that the
English (EN) entry is used rather than the German (DE), hence DAY
should be used? Can you please confirm my understanding of this?
>>> If you check my attachment "SE16.xls", you can see that it's for
language >>English<<, and the internal format is TAG while the external
format is DAY.
- Are there any plans to modify the BPS functionality in newer SAP BW
versions to allow the customer to use the same UOM as in the flat
file loads? Or is there an upgraded Support Package that allows this?
- If not, would it be possible to make any customer enhancements to
allow this to take place depending on customer requirements.
>>> There is no plan at this moment to change this BPS functionality
in newer SAP BW versions or support packages.
We really recommend you to use internal format TAG in the upload file
in BPS which I think should be acceptable and feasible to you. All
other ways trying to use external format is risky and we cannot assure
you that it will work well as it's not SAP standard function. (I think
it's not worth the risk as the standard function which requires
internal format should not be too unacceptable)
Thanks for your understanding.
I also don't think my BC-SRV-ASF-UOM colleague would be able to help
you a lot regarding this.
Best Regards,
Patricia Yang
Support Consultant - Netweaver BW
Global Support Center China
SAP Active Global Support -
Hi Experts,
I have loaded data into a cube through a flat file in BI7. The request is green in the Manage screen and reportable, but the transferred and added records are 0. Please help me out.
Regards,
Ravi
Did you run a delta DTP from the PSA to the cube? The request may already have been loaded - check in the cube.
If the request from the PSA was already loaded to the cube, the next time you run the delta DTP it will fetch 0/0 records.
Are you sure you loaded the PSA with a new request that was not yet loaded to the cube?
Edited by: Srinivas on Aug 6, 2010 5:07 PM -
Flat File Load Issue - Cannot convert character sets for one or more charac
We have recently upgraded our production BW system to SPS 17 (BW SP19),
and we have issues loading flat files (containing Chinese, Japanese, and
Korean characters) that worked fine before the upgrade. The Asian-language
characters appear as invalid (garbled) characters, as we can see in the PSA. The character set
setting was previously the default and worked fine until
the upgrade. We referred to note 1130965, and with code page 1100
the load went through (without the "Cannot convert character sets for one or more characters" error), but the Asian-language characters still appear
as invalid characters. We tried all the code pages suggested in the note, e.g.
4102, 4103 etc., on the InfoPackages, but it did not work. Note that the
flat files are encoded in UTF-8. I checked the lowercase option for all InfoObjects.
When I checked the PSA failure log, the number of records processed is "0 of 0". My question is: why is the system throwing this error message without processing a single record?
When I load the same file from the local workstation, there is no error message.
I suspect that when I FTP the file from the AS/400 to the BI application server (Linux), some invalid characters are being added - but how can we track those invalid characters?
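One way to track them: scan the transferred file for the byte offsets at which it stops being valid UTF-8 (a text-mode FTP transfer between EBCDIC and ASCII hosts is a classic source of such bytes). A sketch:

```python
def find_invalid_utf8(path):
    """Return the byte offsets at which the file fails to decode as UTF-8 --
    one way to track characters mangled by a text-mode FTP transfer."""
    data = open(path, "rb").read()
    offsets = []
    pos = 0
    while pos < len(data):
        try:
            data[pos:].decode("utf-8")
            break  # the remainder decodes cleanly
        except UnicodeDecodeError as exc:
            offsets.append(pos + exc.start)  # absolute offset of the bad byte
            pos += exc.start + 1             # skip past it and keep scanning
    return offsets
```

An empty result on the server-side copy (but not the workstation copy) would point at the transfer; transferring in binary mode usually avoids the corruption.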
Gurus please share your thoughts on this I will assign full points.
Thanks, -
Hello
I am trying to load a flat file from my workstation to BI. I have transaction data with 4 characteristic fields and 3 key figure fields. I am a little confused about loading in BI7, since InfoSource, transfer rules, and update rules are not required.
I am creating the flat file source system and DataSource as well. I have InfoObject catalogs for both characteristics and key figures. Do I have to load master data first, before loading transaction data? I have the data in Excel format - all master data in separate files, and the transaction data (including MD and KFs) in one file. Can I just create a DataSource for the transaction data and load it, or do I have to create a DataSource for each characteristic InfoObject to load the master data first?
When I tried to create the DataSource for the transaction data, for some reason the Proposal tab only populates one field, FIELD1. I assume I will have to go to the Fields tab and manually select all fields by adding new rows.
Could someone give me some idea of how to do the loading into the InfoCube? I also tried to create the transformation, and it gave me an error.
When we create a DataSource for transaction data using a flat file, shouldn't the system map all the DataSource fields to the cube in the transformation? That's not happening for me - please help.
thanks
DK
Hi,
Before loading transaction data you have to load master data.
1) For each master data object (i.e. each InfoObject) you have to create one DataSource.
You can create the DataSource using tcode RSO2.
The General tab is the same for all DataSources.
In the Extraction tab:
Real Time >> RDA is not supported
Adapter >> Load text-type file from workstation
Name of the file >> select your file
Header rows to be ignored >> 1
Character set settings >> Default
Data format >> Separated with separator
Data separator >> ;
Escape sign >> #
Number format >> Direct entry
Thousands separator >> .
Decimal point separator >> ,
SAP suggests the use of a header row, as the system can use this header information to help define the fields of the file.
The Proposal tab reads the header row and proposes field names and types. Here you choose "Load example data" and take a look at the proposed list; make sure that all fields are flagged "Copy to field list". The proposal is derived from the header line and the contents of your flat file.
These proposed fields and types can be changed in the Fields tab. At this point you need to check that the field types and lengths are correct; in some cases the proposal will not be accurate, and you will need to adjust the field types and lengths on the Fields tab.
Your file should be saved in .csv format.
2) Then you have to create an InfoPackage and load the data up to the PSA.
3) Then create a transformation between the DataSource (PSA) and the target (InfoObject).
4) Then create a DTP and load the data.
After this, in the same way, you have to create a DataSource for the transaction data; the InfoPackage, transformation, and DTP process is the same.
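With the number format settings listed above (thousands separator "." and decimal point separator ","), a key figure like 1.234,56 means 1234.56. A sketch of that parsing rule, for checking file contents outside BW:

```python
def parse_amount(text):
    """Parse a key figure written with thousands separator "." and
    decimal separator "," (e.g. "1.234,56" -> 1234.56)."""
    return float(text.replace(".", "").replace(",", "."))
```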
Check this link :
http://help.sap.com/saphelp_nw70/helpdata/EN/43/03450525ee517be10000000a1553f6/frameset.htm
Hope this helps you.
Regards,
Debjani
Edited by: Debjani Mukherjee on Sep 21, 2008 9:44 AM -
Error while creating process chain for flat file loading
Hi All,
I had created a process chain to load transaction data (full load) from a flat file which is on my computer:
Start --> Load --> DTP --> Delete Index --> DTP loading Cube --> Create Index
But the system is throwing the error "An upload from the client workstation in the background is not possible."
I don't know why this error is coming.
Can some one help me
Regards
Mamta
Hi Mamta,
Basically, if you want to load the DataSource from a flat file through a process chain, the flat file has to be placed on the application server. We can't load the flat file when it is located on the client's local workstation (a file on your PC).
So you should remove the InfoPackage step from the process chain and load the InfoPackage manually. Once it has completed, you can start the process chain with the following steps:
Start --> DTP --> Delete Index --> DTP loading Cube --> Create Index
Hope it is clear & helpful!
Regards,
Pavan -
Regarding flat file load to DSO...
Hi Experts,
I am trying to load transaction data from a flat file to a DSO in BI 7.0; it comprises two characteristics and one key figure. The characteristics are of type CHAR and the key figure is of type INT. Up to the PSA the data is fine. In the transformation I have done a one-to-one mapping of source fields to target fields, but while executing the DTP the following error popped up:
Data package 1:Errors during processing
Extraction completed, processing termination
Please help. I tried various combinations, but I am not able to schedule the load.
Thanks
Hi there,
Here are some ideas for your problem.
Assuming that you defined the InfoObjects correctly (lengths and types), mapped the transformation correctly, and the data in the flat file is absolutely correct:
First, the very common one: check the data lengths in the flat file against the defined lengths of the InfoObjects. If those match,
go to the DataSource and check the settings for the flat file - separator, escape sign, and rows to be ignored - and check the data preview.
Then compare the data in the PSA with the flat file; please do this very carefully. If it matches,
go to the DataSource and, in the Fields tab, check whether you assigned the InfoObjects correctly. If so,
check the conversion routines in the same Fields tab; if you see PER16 on the key figure, remove it, reload to the PSA, and then load the cube.
Hope this gives you an idea.
Regards, KP -
Urgent: Flat file load issue
Hi Guru's,
We are loading into a data target (ODS) via flat file. The problem is that the flat file shows the date field values correctly with 8 characters, but when we do a preview it shows 7 characters, and the loading is not going through.
Does anyone know where the problem is - why the preview screen shows 7 characters for the date when the flat file has 8?
Thanks
MK
Hi Bhanu,
How do I check whether a conversion is specified? And it is not just one field - we have 6 date fields, and all of them show 7 characters in the PSA after loading, whereas the flat file has 8 characters in all 6 date fields.
In PSA I checked the error message it shows 2 error messages:
First Error message:
The value '1# ' from field /BIC/ZACTDAYS is not convertible into the
DDIC data type QUAN of the InfoObject in data record 7 . The field
content could not be transferred into the communication structure
format.
System Response
The data to be loaded has a data error or field /BIC/ZACTDAYS in the
transfer structure is not mapped to a suitable InfoObject.
The conversion of the transfer structure to the communication structure
was terminated. The processing of data records with errors was continued
in accordance with the settings in error handling for the InfoPackage
(tab page: Update Parameters).
Check the data conformity of your data source in the source system.
On the 'Assignment IOBJ - Field' tab page in the transfer rules, check
the InfoObject-field assignment for transfer-structure field
/BIC/ZACTDAYS .
If the data was temporarily saved in the PSA, you can also check the
received data for consistency in the PSA.
If the data is available and is correct in the source system and in the
PSA, you can activate debugging in the transfer rules by using update
simulation on the Detail tab page in the monitor. This enables you to
perform an error analysis for the transfer rules. You need experience of
ABAP to do this.
2nd Error Message:
Diagnosis
The transfer program attempted to execute an invalid arithmetic
operation.
System Response
Processing was terminated, and indicated as incorrect. You can carry out
the update again any time you choose.
Procedure
1. Check that the data is accurate.
- In particular, check that the data types, and lengths of the
transferred data, agree with the definitions of the InfoSource,
and that fixed values and routines defined in the transfer rules
- The error may have arisen because you didn't specify the
existing headers.
2. If necessary, correct the data error and start processing again.
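Per step 1 above, a local check of the file can flag rows whose date fields are not the expected 8 characters (the 8-in-file vs. 7-in-preview symptom described earlier). A sketch; the date column indexes are whatever your file layout uses:

```python
import csv

def short_date_rows(path, date_cols, delimiter=","):
    """Flag rows whose supposed YYYYMMDD date fields are not 8 characters,
    returning (row number, column index, value) for each offender."""
    bad = []
    with open(path, newline="") as f:
        for i, row in enumerate(csv.reader(f, delimiter=delimiter), start=1):
            for c in date_cols:
                if len(row[c]) != 8:
                    bad.append((i, c, row[c]))
    return bad
```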
Thanks
MK