Help with Flat File Extraction
I have downloaded data from the R/3 sandbox tables (BSEG & BKPF) and saved it as .CSV files on the BW sandbox. The connection between the two systems is not yet established.
Now I want to upload these flat files into the ODS (0FIGL_O02) using the 0FI_GL_4 DataSource.
I have assigned the flat file source system and activated the transfer rules.
But I'm getting confused while creating the InfoPackage.
Can anyone suggest the best way to do this?
Thanks in advance,
Srini
1. Go to InfoSource 0FI_GL_4.
2. Right-click it to assign a DataSource.
3. Select your "FF" (file upload) source system.
4. Accept the default mapping in the transfer rules by activating the InfoSource.
5. Go back and right-click the FF DataSource to create an InfoPackage.
6. Select General Ledger: Line Items.
7. Enter your InfoPackage description and click Save.
8. Click the External Data tab and enter your data file name (load external data from the client or the application server).
9. Select CSV file as the file type.
10. Click Save before you try to test with the Preview button.
11. Start loading immediately if you are uploading from your client PC.
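Before testing with the Preview button, it can save time to sanity-check the CSV locally. A minimal sketch (the column count of three is just an assumption for illustration, not the actual 0FI_GL_4 structure):

```python
import csv

def check_csv_rows(lines, expected_cols, delimiter=","):
    """Return (bad_line_numbers, total_rows) for lines whose parsed
    column count does not match the DataSource structure."""
    rows = list(csv.reader(lines, delimiter=delimiter))
    bad = [i + 1 for i, row in enumerate(rows) if len(row) != expected_cols]
    return bad, len(rows)

# Example: the second line is missing a field
print(check_csv_rows(["0001,1000,100.00", "0002,1000"], expected_cols=3))  # → ([2], 2)
```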
Please let me know if you have problems.
Similar Messages
-
Oracle API for Extended Analytics Flat File extract issue
I have modified the Extended Analytics SDK application as such:
HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_FLATFILE
instead of
HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_STANDARD
I am trying to figure out where the Flat file extract is placed once the Extract is complete. I have verified that I am connecting to HFM and the log file indicates all the steps in the application have completed successfully.
Where does the FLATFILE get saved/output when using the API?
Ultimate goal is to create a flat file through an automated process which can be picked up by a third party application.
thanks for your help,
Chris
Never mind. I found the location on the server.
-
Extended Analytics Flat File extract in ANSI
Hi,
Version: EPM 11.1.2.1
The Flat file extract using Extended Analytics from HFM is exported in UNICODE format. Is there any option to get the export in ANSI code format?
Thanks,
user8783298
Hi,
Currently the data from HFM can only be exported in Unicode format, even in the latest version. A defect has been filed with Oracle for this.
At present the only workaround is to change the file encoding after it has been extracted, using third-party software.
For example, the extracted file can be opened in Windows Notepad and the desired encoding can be selected under Save As.
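If this step has to run unattended, the same Notepad-style re-encoding can be scripted. A rough sketch in Python (assuming the extract is UTF-16 and the target "ANSI" code page is cp1252; both are assumptions to verify against your files):

```python
def convert_to_ansi(src_path, dst_path, src_enc="utf-16", dst_enc="cp1252"):
    """Re-read the extracted file and write it back in the ANSI code page.
    Characters that do not exist in the target code page are replaced."""
    with open(src_path, "r", encoding=src_enc) as src:
        text = src.read()
    with open(dst_path, "w", encoding=dst_enc, errors="replace") as dst:
        dst.write(text)
```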
Hope this helps.
thank you
Regards,
Mahe -
Encountering problem in Flat File Extraction
Flat file extraction: the settings I maintain in the DataSource "Fields" tab are conversion routine "ALPHA" and format "External". But when I load the data, a record maintained in lower case in the flat file is not converted to upper case. Without the ALPHA conversion, the lower-case data is converted to the SAP-internal format (upper case). Could you please help me fix this problem when I use both ALPHA and External together?
Hi, did you enable the "Lowercase letters" checkbox at the InfoObject level for the objects where you want lower-case values? Check the help.sap.com site as well.
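To illustrate what the two settings do, here is a rough sketch, not SAP's actual conversion exit: ALPHA only zero-pads purely numeric values to the field length, while the upper-casing comes from the standard conversion applied when the InfoObject does not allow lower case.

```python
def alpha_input(value, length, lowercase_allowed=False):
    """Sketch of BW input conversion: ALPHA zero-pads numeric values;
    upper-casing happens only if lower case is not allowed."""
    v = value.strip()
    if v.isdigit():                 # ALPHA part: pad numeric values
        v = v.rjust(length, "0")
    if not lowercase_allowed:       # InfoObject "Lowercase letters" flag
        v = v.upper()
    return v

print(alpha_input("4711", 10))   # → 0000004711
print(alpha_input("abc", 10))    # → ABC
```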
Good day. -
Generic and Flat file extraction
Hi experts
can any body send the Real time scenario for Generic and flat file extraction with examples.
Thanks and regrds,
satya
Hi,
Generic extraction is an extraction where you create a generic DataSource and, based on it, extract the data from R/3; in other words, you have a source system from which the generic DataSource pulls its data.
Flat file extraction is an extraction where you do not have to connect to a source system; in this case your source system is a PC. Here you design an InfoSource based on the fields/columns of your flat file, map the corresponding InfoObjects to it, and create update rules.
You then load/extract the data using the file as the source.
These two things are different in nature and usage.
Hope this helps.
Assign points if useful.
cheers,
Pattan. -
Changing the file name in Flat File Extraction
Hi,
Currently I am using flat file extraction for my forecast data, and I am sending the file through the application server.
I have created the directory successfully, and every morning I receive a file via FTP with a name such as 20060903.csv; this name is based on one field in my flat file data, e.g. /interface/asf/20060903.csv.
In the middle of the month we have a cut-off date, which varies from month to month. At that point the file name on the FTP server changes, and a file with a different name, e.g. 20061002.csv, will exist on the application server.
In the InfoPackage I also need to set the deletion settings so that if the file name is the same, the previous requests are deleted. I could achieve this if the file name changed.
If I am not changing the file name, how do I set the deletion condition so that it does not delete when the field (scenario) changes, i.e. from 20061002 to 20061101? I should have only one file for 20061002, one file for 20061101, and so on. If the scenario is the same, it should delete.
Anyone kindly advise. Very urgent and critical.
Tks & regards,
Bhuvana.
Hi Bhuvana,
Try the following ABAP code in a routine under the External Data tab of the InfoPackage:
DATA: BEGIN OF i_req OCCURS 0,
        rnr LIKE rsiccont-rnr,
      END OF i_req.

* Get the most recent request loaded into the data target
SELECT * FROM rsiccont UP TO 1 ROWS
  WHERE icube = <datatargetname>
  ORDER BY timestamp DESCENDING.
  i_req-rnr = rsiccont-rnr.
  APPEND i_req.
  CLEAR i_req.
ENDSELECT.

* Delete the request if it was loaded from the same file name
LOOP AT i_req.
  SELECT SINGLE * FROM rsseldone WHERE rnr = i_req-rnr
                                   AND filename = p_filename.
  IF sy-subrc = 0.
    CALL FUNCTION 'RSSM_DELETE_REQUEST'
      EXPORTING
        request                    = i_req-rnr
        infocube                   = <datatargetname>
      EXCEPTIONS
        request_not_in_cube        = 1
        infocube_not_found         = 2
        request_already_aggregated = 3
        request_already_comdensed  = 4
        no_enqueue_possible        = 5
        OTHERS                     = 6.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE 'I' NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ELSE.
      MESSAGE i799(rsm1) WITH i_req-rnr 'deleted'.
    ENDIF.
  ENDIF.
ENDLOOP.
Let me know if you run into any problem with this logic.
regards,
Raju -
Delta upload for flat file extraction
Hello everyone,
Is it possible to initialize a delta update for a flat file extraction? If yes, please explain how to do that.
Hi Norton,
For a flat file DataSource, the upload will always be FULL.
If you need to extract delta records from a flat file, we can write a routine at the InfoPackage level; please refer to the following document:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a0b3f0e2-832d-2c10-1ca9-d909ca40b54e?QuickLink=index&overridelayout=true&43581033152710
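The routine-based pseudo-delta boils down to comparing the new file against the last loaded snapshot and keeping only new or changed records. A hedged sketch of that idea (field names are made up for illustration):

```python
def pseudo_delta(previous_rows, current_rows, key_fields):
    """Return rows from current_rows that are new or changed versus
    previous_rows, compared on the given key fields."""
    old = {tuple(row[k] for k in key_fields): row for row in previous_rows}
    delta = []
    for row in current_rows:
        key = tuple(row[k] for k in key_fields)
        if old.get(key) != row:   # new key, or same key with changed values
            delta.append(row)
    return delta
```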
Regards,
Harish. -
Hierarchy flat file extraction
Hi experts--
Can anyone guide me through flat file hierarchy extraction, step by step?
If possible, with screenshots.
You can send it to [email protected]
regards,
Rambo.
Hi,
Flat file hierarchy extraction is similar to the normal flat file extraction procedures. But the file structure itself can be complex and is different than normal flat files.
Take a look at the threads below for more details :
Hierarchy Flat file
Hierarchy from flat file
Program to load a flat file in a Hierarchy
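For intuition, a hierarchy flat file is essentially a parent-child node list, and loading it means nesting those rows into a tree. A toy sketch (the column names "node", "parent", "name" are illustrative, not the exact BW hierarchy template):

```python
def build_tree(rows):
    """Nest parent-child rows into a list of root nodes."""
    nodes = {r["node"]: {"name": r["name"], "children": []} for r in rows}
    roots = []
    for r in rows:
        parent = nodes.get(r["parent"])
        if parent:
            parent["children"].append(nodes[r["node"]])
        else:
            roots.append(nodes[r["node"]])   # no parent row -> root node
    return roots
```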
Cheers,
Kedar -
Omit the Open Hub control file 'S_*' for flat file extracts
Hi Folks,
a quick question is it somehow possible to omit the control file generation for flat file extracts.
We have some Unix scripts running that get confused by the S_* files, and we were wondering if we can switch the creation of those files off.
Thanks and best regards,
Axel
Hi,
You say that the S_ (structure) file does not reflect the proper structure, i.e. that it lists all fields sorted alphabetically by field name instead of in the order they appear in the output file.
As far as I know, and as I have checked, this is not the case: the S_ file has the fields in the order of the InfoSpoke object sequence (and of the source structure in the transformation tab).
I would suggest you check it again.
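Until the control files can be suppressed, the simpler fix is usually on the script side: ignore anything matching S_*. A sketch:

```python
import fnmatch

def data_files(filenames):
    """Drop the Open Hub 'S_*' control files, keeping only data extracts."""
    return [name for name in filenames if not fnmatch.fnmatch(name, "S_*")]

print(data_files(["S_SALES.CSV", "SALES.CSV", "COSTS.CSV"]))  # → ['SALES.CSV', 'COSTS.CSV']
```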
Derya -
Flat file extraction with FILE
Hi BWers,
I want to extract a flat file to BW with an infopackage.
This flat file is located in a physical path that changes according to the system:
- on the development server, the path is:
...\dev\file.csv
- on the quality server, the path is:
...\qual\file.csv
- on the production server, the path is:
...\prod\file.csv
In the file name field on the InfoPackage extraction tab, I have to enter the physical path, a logical file, or a routine.
I don't want to enter the physical path directly, because I would then have to change it on the quality and production servers.
I don't want to use a routine either.
So I created a logical file with transaction FILE, but in the physical path I put:
...\dev\<FILENAME>
How can I use a directory variable like $(directory)<FILENAME>, so that on development $directory resolves to ...\dev, on quality to ...\qual, and so on?
Thanks for help.
Cheers,
Hi,
Here is the answer I found:
In FILE, under "Assignment of Physical Paths to Logical Path", use a variable like this:
<V=Z_INT_SRV>
Declare this variable under "Definition of Variables".
Z_INT_SRV -
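Conceptually, the FILE variable does the same job as a lookup from system ID to directory. An illustrative sketch (the system IDs and directories are assumptions; the real values come from transaction FILE):

```python
# Hypothetical mapping; in BW this comes from FILE variable definitions
SYSTEM_DIRS = {"DEV": "...\\dev", "QAS": "...\\qual", "PRD": "...\\prod"}

def resolve_path(sysid, filename):
    """Pick the physical directory for the current system id."""
    return SYSTEM_DIRS[sysid] + "\\" + filename

print(resolve_path("DEV", "file.csv"))  # → ...\dev\file.csv
```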
How to use 'flat file extraction' option in Infospoke
Hello Gurus,
We are trying to extract BW data through an InfoSpoke to a 3rd-party ETL tool (DataStage). On the current BW 3.5 release the customer is on SP11, and we are not able to extract any data using the 3rd-party system option in the InfoSpoke. We found out from SAP that, since we are getting the error "3rd party system is not set" in the InfoSpoke, we have to upgrade the SP level from 11 to 16 or 17. The customer we are dealing with is not going to do that until July 2007.
So now we have a second option: load a flat file onto the BW application server and FTP it to DataStage. Is this a workable option?
When using that option within the InfoSpoke destination tab, we can see two choices. One is to use the File option and load the data into the standard directory DIR_HOME. I tried that, but I can't see the file and the entire data within the standard directory. Can anyone explain why, and what procedure to follow?
The second choice is to use a logical file. I have never done that before. Can anyone explain the step-by-step process, or provide a detailed document and/or link?
Since we are at a critical stage of the timeframe, I would appreciate a fast response.
I will make sure to give reward points to each and every helpful answer.
thanks in advance,
Maulesh
Hi Shashi,
First, let me clarify one thing: I am not saying "3rd party tool" is an option in the InfoSpoke. I know the InfoSpoke provides only two options (DB table and file). But when you use the DB table option, you check "3rd party destination", and that's where you define an RFC destination for the 3rd-party tool. When you execute the InfoSpoke, it starts extracting data and loads it into the defined destination. Since we are on SP11 of the BW 3.5 release, we get the error "3rd party destination is not set" while executing an InfoSpoke using the DB table option. When I searched the Marketplace, I found an OSS note saying that we need an upgrade of the SP level (up to 17).
As I mentioned earlier, the client doesn't want to upgrade until next year, and we are at a critical stage where we have to decide whether to hold this project until the SP upgrade happens or find an alternate approach to extract the data from BW (somehow) and load it into the Teradata database via the DataStage BW Pack (3rd-party ETL).
So we might be able to use the second option in the InfoSpoke destination, which is the flat file load.
Now, when I use the File option with "App. server" checked, the data loads into the default server and directory on BW, which is DIR_HOME. I can see the extracted files in that directory under transaction AL11, but I can't see any data. Do you think I have to request an increase in the size limit? If I can see all the data, how can I FTP it from the BW server? Do I have to work with SAP Basis, or create a Unix script?
When I use a logical file name, I guess I have to follow what you described in your response, correct? Can I use an existing logical path and file name? Do you have a how-to document on creating the YEAR and PER function modules? Is it possible that Basis can do this work for us? What happens after we assign a logical path and logical file name in the destination tab of the InfoSpoke? What is the use of a logical file versus a file? Overall, I am looking for the process from extracting the data with the InfoSpoke, using either a file or a logical file, all the way to loading it into a 3rd-party database.
looking forward for your response!
thanks,
Maulesh -
Date format problem in Flat file extraction
My flat file sends the date like this: DD/MM/YYYY HH:MM:SS. How can I extract the date and time into my ODS? I need the date, and if possible the time, in separate objects.
How should I create my InfoObjects and how do I map them?
Thanks,
Babu
Hi,
Create a routine and use the statement CONVERT TIME STAMP. Check the F1 help for details of the statement.
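Outside of ABAP, the split looks like this (a sketch; the BW-internal target formats YYYYMMDD and HHMMSS are assumed for date- and time-type InfoObjects):

```python
from datetime import datetime

def split_datetime(value):
    """Split 'DD/MM/YYYY HH:MM:SS' into (YYYYMMDD, HHMMSS) strings."""
    dt = datetime.strptime(value, "%d/%m/%Y %H:%M:%S")
    return dt.strftime("%Y%m%d"), dt.strftime("%H%M%S")

print(split_datetime("03/09/2006 14:05:59"))  # → ('20060903', '140559')
```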
Siggi -
BI 7.0 Flat file extraction
Hi all....
To get acquainted with BI 7.0, I am trying to extract data from a flat file.
1) Everything is fine; I created DTPs at the cube level as well as at the DataSource level.
2) I executed/triggered the InfoPackage to schedule the data.
3) The request status is green in the InfoCube, and it is also available for reporting.
4) When I checked the monitor I could see an overall green status, yet I could find no data in the InfoCube.
I have done the transformations and DTPs correctly, so why can't I see the data in the InfoCube, and why do I see all requests set to green and available for reporting?
Thanks in advance,
Regards,
Ram
Hi,
follow these steps for easy data load from flat file..
Uploading master data
Log on to your SAP system.
Transaction code RSA1 leads you to Modeling.
1. Creation of InfoObjects
In the left panel, select InfoObjects.
Create an InfoArea.
Create InfoObject catalogs (characteristics and key figures) by right-clicking the created InfoArea.
Create new characteristics and key figures under the respective catalogs according to the project requirement.
Create the required InfoObjects and activate them.
2. Creation of the DataSource
In the left panel, select DataSources.
Create an application component (AC).
Right-click the AC and create a DataSource.
Specify the DataSource name, source system, and data type (master data attributes, texts, hierarchies).
In the General tab, give short, medium, and long descriptions.
In the Extraction tab, specify the file path, the header rows to be ignored, the data format (CSV), and the data separator (,).
In the Proposal tab, load example data and verify it.
In the Fields tab you can give the technical names of the InfoObjects from the template; then you do not have to map them during the transformation, since the server will map them automatically. If you do not map them in the Fields tab, you have to map them manually in the transformation of the InfoProvider.
Activate the DataSource and read the preview data under the Preview tab.
Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
3. Creation of data targets
In the left panel, select InfoProvider.
Select the created InfoArea and right-click to choose "Insert Characteristic as InfoProvider".
Select the required InfoObject (e.g. Employee ID).
Under that InfoObject, select Attributes.
Right-click Attributes and select Create Transformation.
As the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
Activate the created transformation.
Create a data transfer process (DTP) by right-clicking the master data attributes.
In the Extraction tab, specify the extraction mode (full).
In the Update tab, specify the error handling (request green).
Activate the DTP, and in the Execute tab click the Execute button to load the data into the data targets.
4. Monitor
Right-click the data target, select Manage, and in the Contents tab select Contents to view the loaded data. Alternatively, the monitor icon can be used.
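The Extraction-tab settings in step 2 (header rows to ignore, CSV format, separator) behave like this simple sketch:

```python
import csv
import io

def read_flat_file(text, header_rows=1, separator=","):
    """Skip the configured header rows, then split each remaining
    line on the configured separator."""
    rows = list(csv.reader(io.StringIO(text), delimiter=separator))
    return rows[header_rows:]

print(read_flat_file("ID,NAME\n100,Alice\n200,Bob\n"))  # → [['100', 'Alice'], ['200', 'Bob']]
```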
BW 7.0
Uploading transaction data
Log on to your SAP system.
Transaction code RSA1 leads you to Modeling.
5. Creation of InfoObjects
In the left panel, select InfoObjects.
Create an InfoArea.
Create InfoObject catalogs (characteristics and key figures) by right-clicking the created InfoArea.
Create new characteristics and key figures under the respective catalogs according to the project requirement.
Create the required InfoObjects and activate them.
6. Creation of the DataSource
In the left panel, select DataSources.
Create an application component (AC).
Right-click the AC and create a DataSource.
Specify the DataSource name, source system, and data type (transaction data).
In the General tab, give short, medium, and long descriptions.
In the Extraction tab, specify the file path, the header rows to be ignored, the data format (CSV), and the data separator (,).
In the Proposal tab, load example data and verify it.
In the Fields tab you can give the technical names of the InfoObjects from the template; then you do not have to map them during the transformation, since the server will map them automatically. If you do not map them in the Fields tab, you have to map them manually in the transformation of the InfoProvider.
Activate the DataSource and read the preview data under the Preview tab.
Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
7. Creation of data targets
In the left panel, select InfoProvider.
Select the created InfoArea and right-click to create an ODS (DataStore object) or a cube.
Specify a name for the ODS or cube and click Create.
From the template window, select the required characteristics and key figures and drag and drop them into the data fields and key fields.
Click Activate.
Right-click the ODS or cube and select Create Transformation.
As the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
Activate the created transformation.
Create a data transfer process (DTP) by right-clicking the data target.
In the Extraction tab, specify the extraction mode (full).
In the Update tab, specify the error handling (request green).
Activate the DTP, and in the Execute tab click the Execute button to load the data into the data targets.
8. Monitor
Right-click the data target, select Manage, and in the Contents tab select Contents to view the loaded data. There are two tables in an ODS, the new table and the active table; to move data from the new table to the active table you have to activate the request after loading. Alternatively, the monitor icon can be used.
hope it helps...
all the best.. -
Errors in flat file extraction
Experts, please answer:
What are the possible errors while extracting data from a flat file to SAP BW 3.5?
thanks in advance,
ramakrishna
Hi,
Possible errors:
You have to select CSV as the file type in the InfoPackage selection while loading.
The path of the file may be wrong.
If your file is open during the load, you can also get an error.
If the DataSource structure differs from the flat file, you will also get an error when scheduling.
Hope it helps you.
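Those checks can also be automated before scheduling the load. A sketch (the expected column count is whatever your DataSource structure defines; the "file open" case cannot be detected portably and is left out):

```python
import csv
import os

def preload_checks(path, expected_cols, separator=","):
    """Report the common flat file load problems: missing file, wrong
    file type, and rows that do not match the DataSource structure."""
    if not os.path.exists(path):
        return ["file not found"]
    problems = []
    if not path.lower().endswith(".csv"):
        problems.append("not a .csv file")
    with open(path, newline="") as f:
        for i, row in enumerate(csv.reader(f, delimiter=separator), start=1):
            if len(row) != expected_cols:
                problems.append("row %d: %d fields, expected %d"
                                % (i, len(row), expected_cols))
    return problems
```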
Reg
Pra