Loading data into an InfoCube in BI 7.0 from a flat file.
Hello All,
I need the complete procedure for loading data into an InfoCube from a flat file in BI 7.0, i.e. using transformations, DTPs, etc.
Please help me with some good documents.
Hi Pratighya,
Step-by-step procedure for loading data from a flat file:
1. Create the InfoObjects you need in BI.
2. Create your target InfoProvider.
3. Create a source system.
4. Create a DataSource.
5. Create and configure an InfoPackage that will bring your records to the PSA.
6. Create a transformation from the DataSource to the InfoProvider.
7. Create a Data Transfer Process (DTP) from the DataSource to the InfoProvider.
8. Schedule the InfoPackage.
9. Once it is successful, run the DTP.
10. This will fill your target.
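As a small aid for step 5 and onwards, the flat file can be sanity-checked locally before scheduling the InfoPackage, so that malformed rows show up before the PSA load. A minimal sketch in Python; the file name, delimiter, and column names are assumptions for illustration only:

```python
import csv

def check_flat_file(path, expected_columns, delimiter=","):
    """Return a list of (line_number, problem) tuples for a delimited flat file."""
    problems = []
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter=delimiter)
        header = next(reader)
        if header != expected_columns:
            problems.append((1, "header mismatch: %s" % header))
        # data rows start on line 2 of the file
        for line_no, row in enumerate(reader, start=2):
            if len(row) != len(expected_columns):
                problems.append((line_no, "expected %d fields, got %d"
                                 % (len(expected_columns), len(row))))
    return problems

# Example call (file and field names are made up):
# check_flat_file("sales.csv", ["MATERIAL", "PLANT", "QUANTITY"])
```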
Hope this helps
Regards
Karthik
Similar Messages
-
ABAP short dump when loading data into Infocube
Hello All,
I am getting an ABAP short dump error when loading data into an InfoCube. I tried to load data from the PSA and from an ODS.
I ran the code to compress the InfoCube; no improvement, I am still getting the same error.
Error Analysis:
A RAISE statement in the program "SAPLRSDU_PART" raised the exception.
Internal notes:
The termination occurred in the function "ab_i fune" of the SAP Basis system, specifically in line 2316 of the module "//bas/620/src/krn/runt/abfunc. C#18".
The internal operation just processed is "FUNE".
Advance thanks !!!
Siv
Hello Siv,
try to run program SAP_DROP_EMPTY_FPARTITIONS to display existing partitions. Then compare it to the number of uncompressed requests. Maybe there's a mismatch.
SAP notes that might help: 385163, 590370
Regards,
Marc
SAP NetWeaver RIG, US BI -
Hello, can anybody please tell me how to load data into an InfoCube in BW 7.0?
thanks a lot
komal.
Hi,
These are the steps for creating an InfoCube and loading data:
1. It is the same as creating an InfoCube in BW 3.5.
2. Once you have created the InfoCube, assign dimensions to it.
3. Create an application component and an InfoSource, similar to BW 3.5.
4. After creating the InfoSource, right-click on it.
5. All the dimensions and key figures are mapped automatically in the transformation rules.
6. Right-click on the InfoCube and find the option "Create Data Transfer Process".
7. Create an InfoPackage as usual and schedule it to load the data into the InfoProvider. -
Cannot load data into infocube in BW NW2004s sneak preview
For BW with NW2004s preview.
I can load data into the PSA, but an error occurs when loading data from the PSA into the InfoCube using a DTP. The error is 'Join table TESTDATRNRPART0 does not exist - read SAP note 871084'. Note 871084 says that SP05 fixes the problem. But when I check the status of the BW server, it shows 'release 700 / level 0005 / highest support package SAPKW0005'.
Does that mean the BW server has SP05?
Hi Parvendar and Vinod,
Thanks for the information. I checked in SM58. For today it shows 4 tRFCs: 1 with user BWREMOTE and 3 with user BASIS (the user I am logged in as). All have an error saying "An exception occurred that was uncaught".
I also tried to reschedule the data for the InfoPackage alone. We have 3 InfoPackages in the process chain and I tried with one of them.
When I scheduled it, InfoCube -> Manage shows the status of the request as yellow. When I try to see the load monitor, it displays the same message as I saw yesterday.
In the extraction phase it shows:
Extraction (messages): Errors occurred
Data request received - green
Data selection scheduled - green
41 records sent (41 records received) - yellow
Data Package 1 (41 records): Missing messages - yellow
Data selection ended - green
Transfer (IDocs and tRFC): Missing messages or warnings
Data Package 1: arrived in BW; processing: data packet not yet processed
Info IDocs 1, 2, 3 and 4 are all green
Update PSA, transfer rules and update rules are all green
Update (0 new / 0 changed): warning received
Missing message: update finished for 0TCT_C21
Please let me know what the possible issue could be.
Regards,
Raghavan
Edited by: Raghavan Guruswaminathan on May 11, 2010 8:10 AM -
How to store data into database by reading sql statements from text file
How do I write a Java program that stores data into a database by reading SQL statements from a text file?
Step 1: Create a property file holding the various queries.
Step 2: Read the properties file using ResourceBundle.
Step 3: Use JDBC to execute the query read from the property file.
So in future, if you need to change a query, no modifications are needed in the Java program; it only depends on how you use the property file. -
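The three steps above can be sketched as follows. This sketch uses Python's configparser and an in-memory SQLite database in place of Java's ResourceBundle and JDBC, purely to illustrate the pattern; the file name and query keys are made up:

```python
import configparser
import sqlite3

def load_queries(path):
    """Read named SQL statements from a properties-style file, so that
    queries can change later without touching program code."""
    cfg = configparser.ConfigParser()
    cfg.read(path)
    return dict(cfg["queries"])

def run_query(conn, queries, key, params=()):
    """Execute the query stored under `key` and return all rows."""
    return conn.execute(queries[key], params).fetchall()
```

Editing the query file is then enough to change behaviour; the program itself never embeds SQL.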
Runtime error while loading data into infocube
Hi all,
I am getting a runtime error while loading the transaction data into the InfoCube.
Exception name: CX_SY_GENERATE_SUBPOOL_FULL
Runtime error: GENERATE_SUBPOOL_DIR_FULL
Anyone has any idea about this error? Any help will be really helpful. Thanks in advance.
Regards,
Sathesh
Hi Sathesh,
Please provide more details for the problem.
You can check in ST22. Go through the dump analysis and let us know.
Regards
Hemant Khemani -
Loading data into Fact/Cube with surrogate keys from SCD2
We have created 2 dimensions, CUSTOMER & PRODUCT with surrogate keys to reflect SCD Type 2.
We now have the transactional data that we need to load.
The data has a customer id that relates to the natural key of the customer dimension and a product id that relates to the natural key of the product dimension.
Can anyone point us in the direction of some documentation that explains the steps necessary to populate our fact table with the appropriate surrogate keys?
We assume that we need a lookup between the current version of the customer and the incoming transaction data, but we are not sure how to go about this.
Thanks in advance for your help.
Laura
Hi Laura,
There is another way of handling SCD and changing facts: use a different table for the history. Let me explain.
The standard approach has these three steps:
1. Determine if a change has occurred
2. End Date the existing record
3. Insert a new record into the same table with a new Start Date and dummy End Date, using a new surrogate key
The modified approach also has three steps:
1. Determine if a change has occurred
2. Copy the existing record to a history table, setting the appropriate End Date en route
3. Update the existing record with the changed information giving the record a new Start Date, but retaining the original surrogate key
What else do you need to do?
The current table can use the surrogate key as the primary key, with the natural key being a unique key. The history table has the surrogate key and the end date in the primary key, with a unique key on the natural key and the end date.
For end-user queries, which more than 90% of the time go against current data, this method is much faster, because only current records are in the main table and no filters on dates are needed. If a user wants to query history and current data combined, a view that unions the main and historical tables can be used.
One more thing to note: if you adopt this approach for your dimension tables, they always keep the same surrogate key. This means that if you follow a strict Kimball approach and make the primary key of the fact table a composite key of the foreign keys from each dimension, you NEVER have to rework this primary key. It always points to the correct dimension, thereby eliminating the need for a surrogate key on the fact table!
I am using this technique to great effect in my current contract and performance is excellent. The splitter at the end of the map splits the data into three sets. Set one is for an insert into the main table when there is no match on the natural key. Set two is when there is a match on the natural key and the delta comparison has determined that a change occurred. In this case the current row needs to be copied into history, setting the End Date to the system date en route. Set three is also when there is a match on the natural key and the delta comparison has determined that a change occurred. In this case the main record is simply updated with the Start Date being reset to the system date.
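The modified approach can be sketched in plain SQL; here it is driven from Python against SQLite, with all table and column names invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- current rows only; the surrogate key never changes
    CREATE TABLE customer_dim (
        cust_sk    INTEGER PRIMARY KEY,
        cust_id    TEXT UNIQUE,          -- natural key
        cust_name  TEXT,
        start_date TEXT
    );
    -- rows copied out here when a change is detected
    CREATE TABLE customer_dim_hist (
        cust_sk    INTEGER,
        cust_id    TEXT,
        cust_name  TEXT,
        start_date TEXT,
        end_date   TEXT,
        PRIMARY KEY (cust_sk, end_date)
    );
""")

def apply_change(conn, cust_id, new_name, today):
    """Insert a new customer, or, if an attribute changed, copy the current
    row to history (end-dating it en route) and update the main row in
    place, keeping the original surrogate key."""
    row = conn.execute(
        "SELECT cust_sk, cust_name, start_date FROM customer_dim "
        "WHERE cust_id = ?", (cust_id,)).fetchone()
    if row is None:
        conn.execute(
            "INSERT INTO customer_dim (cust_id, cust_name, start_date) "
            "VALUES (?, ?, ?)", (cust_id, new_name, today))
    elif row[1] != new_name:
        conn.execute(
            "INSERT INTO customer_dim_hist VALUES (?, ?, ?, ?, ?)",
            (row[0], cust_id, row[1], row[2], today))
        conn.execute(
            "UPDATE customer_dim SET cust_name = ?, start_date = ? "
            "WHERE cust_id = ?", (new_name, today, cust_id))
```

Fact loading then only needs a join on the natural key against the current table to pick up the (stable) surrogate key.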
By the way, I intend to put a white paper together on this approach if anyone is interested.
Hope this helps
Regards
Michael -
Loading data into HANA DB (using hdbsql client) with Control file.
Hi,
I am working on a Project where the requirement is to load data from a csv file into HANA Database.
I am using HDBSQL, command line client on a windows machine to upload the data into HANA DB on a linux server.
I am able to successfully use the HDBSQL to export the file.
I have the following questions w.r.t to Bulk uploading data from CSV:
Where should the CSV file reside? Can this be in the windows machine or is it mandatory to have in the dropbox location.
Where should the control file reside? Can this be in the windows machine or is it mandatory to have in the dropbox location.
Where will the error file reside in case of errors?
I am new to this and any help is much appreciated.
Thanks,
Shreesha
Hi Shreesha,
Where should the CSV file reside? Can this be in the windows machine or is it mandatory to have in the dropbox location.
Where should the control file reside? Can this be in the windows machine or is it mandatory to have in the dropbox location.
Where will the error file reside in case of errors?
We need to create the DATA, CONTROL and ERROR folders on the Linux server on which HANA is installed (or on a server accessible to the SAP HANA server).
Hence we need to SFTP the file to the HANA server before loading it into the table.
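For reference, the control file for such a load is simply a SQL script containing an IMPORT FROM statement; a minimal sketch, in which the schema, table name, and the folder paths are placeholders to adapt to your server layout:

```sql
-- control file, e.g. /dropbox/CONTROL/customers.ctl (path is a placeholder)
IMPORT FROM CSV FILE '/dropbox/DATA/customers.csv'
INTO "MYSCHEMA"."CUSTOMERS"
WITH RECORD DELIMITED BY '\n'
     FIELD DELIMITED BY ','
     ERROR LOG '/dropbox/ERROR/customers.err';
```

Rejected rows are then written to the file named in ERROR LOG, which answers the third question above.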
Also have a look at a similar requirement I worked on:
SAP HANA: Replicating Data into SAP HANA using HDBSQL
Regards,
Krishna Tangudu -
Error while loading Data into Essbase using ODI
Hi,
I am very new to ODI. I have installed ODI and am working in the demo environment only. I haven't done any configuration. I am using the Essbase technology that comes by default.
I have created a sample outline in Essbase and a text file to load data into Essbase using ODI.
Following is my text file:
Time Market Product Scenario Measures Data
Jan USA Pepsi Actual Sales 222
I am getting an error. I checked in Operator; it fails at step 6, i.e. "Integration SampleLoad data into essbase".
Here is the description.
from com.hyperion.odi.common import ODIConstants
from com.hyperion.odi.connection import HypAppConnectionFactory
from java.lang import Class
from java.lang import Boolean
from java.sql import *
from java.util import HashMap
# Get the select statement on the staging area:
sql= """select C3_C1 ""Time"",C5_C2 ""Market"",C2_C3 ""product"",C6_C4 ""Scenario"",C1_C5 ""Measures"",C4_C6 ""Data"" from "C$_0Demo_Demo_genData" where (1=1) """
srcCx = odiRef.getJDBCConnection("SRC")
stmt = srcCx.createStatement()
srcFetchSize=30
stmt.setFetchSize(srcFetchSize)
rs = stmt.executeQuery(sql)
#load the data
stats = pWriter.loadData(rs)
#close the database result set, connection
rs.close()
stmt.close()
Please help me to proceed further...
Hi John,
Here is the error message in the Execution tab:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 20, in ?
java.sql.SQLException: Unexpected token: TIME in statement [select C3_C1 ""Time]
at org.hsqldb.jdbc.jdbcUtil.sqlException(jdbcUtil.java:67)
at org.hsqldb.jdbc.jdbcStatement.fetchResult(jdbcStatement.java:1598)
at org.hsqldb.jdbc.jdbcStatement.executeQuery(jdbcStatement.java:194)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
at org.python.core.PyMethod.__call__(PyMethod.java)
at org.python.core.PyObject.__call__(PyObject.java)
at org.python.core.PyInstance.invoke(PyInstance.java)
at org.python.pycode._pyx4.f$0(<string>:20)
at org.python.pycode._pyx4.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java)
at org.python.core.PyCode.call(PyCode.java)
at org.python.core.Py.runCode(Py.java)
at org.python.core.Py.exec(Py.java)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
java.sql.SQLException: java.sql.SQLException: Unexpected token: TIME in statement [select C3_C1 ""Time]
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source) -
Loading a field of length greater than 60 and up to 2048 from a flat file to BW.
Hello,
Can you please suggest ways by which we can load a field from a flat file whose length is greater than 60 and up to 2048 characters into BW.
Facts:
An InfoObject cannot hold more than 60 characters.
Observations:
The idea of using multiple InfoObjects (i.e. for a 2048 maximum length it would be 35 InfoObjects) is not acceptable to my client.
So, can anyone suggest other ways in BW 3.5 of loading a text of length up to 2048 from a flat file into BW.
Hi,
If your source field is in DD/MM/YYYY format, please try this code in the field routine:
DATA: lv_date(8) TYPE c.
lv_date+0(4) = <source field name>+6(4).
lv_date+4(2) = <source field name>+3(2).
lv_date+6(2) = <source field name>+0(2).
RESULT = lv_date.
Thanks,
Saru
Edited by: P. Saravana Kumar on Apr 19, 2009 2:38 PM -
Steps for loading data into the infocube in BI7, with dso in between
Dear All,
I am loading data into the InfoCube in BI 7, with a DSO in between. My data flow looks like...
Top to bottom:
InfoCube (Customized)
Transformation
DSO (Customized)
Transformation
DataSource (Customized).
The mapping and everything else look fine, and data is also seen in the cube on the FULL load.
But due to some minor error (I guess), I am unable to see the DELTA data in the DSO, although it is loaded into the DataSource through process chains.
Kindly advise me where I went wrong.
OR... step-by-step instructions for loading data into the InfoCube in BI 7, with a DSO in between, would be really helpful.
Regards,
Hi,
My first impulse would be to check whether the DSO is set to "direct update". In that case no delta is possible, because the change log is not maintained.
My second thought would be to check the DTP moving data between the DSO and the target cube. If it is set to full, you will not get a delta. Only one DTP is possible between them, and one created in FULL mode cannot be switched to delta; delete it and create a new DTP in delta mode.
Hope this helps.
Kind regards,
Jürgen -
Error while loading data into 0pur_c03 using 2lis_02_scl
Hi,
While trying to load data into 0PUR_C03 using DataSource 2LIS_02_SCL from R/3, the error message "The material 100 does not exist or is not active" is displayed. I have loaded master data for 0MATERIAL and performed the change run as well, but the problem is still not resolved. Please suggest how to resolve this issue.
Regards,
Hari Prasad B.
I have the same problem in my system, and I found that material XXXXX had movements (sales documents and more), but at a certain moment the material was deleted in R/3. It is not loaded into the 0MATERIAL InfoObject because it is marked for deletion.
If you have this error, you must activate the load request manually; then your InfoCube can hold the data without any problem.
Regards -
Cannot load data into the ODS.
When I was trying to load data into BW 3.5 from the R/3 system, I got this message:
"For InfoSource ZSD_A507_011, there is no Hierarchies for master data in source system R3PCLNT300"
Any idea? There are no hierarchies for this.
Hello Mat,
What are you trying to load?
Is the DataSource a hierarchy DataSource?
Just check this thread:
There are no hierarchies for this InfoSource in source system
Thanks
Ajeet -
Use of Open Hub Destination to load data into BPC
Hi Gurus,
I want to load the data from SAP BI to BPC (NW version), using Open Hub Destination. I want to know few things about this.
1) What method should I use? Should I use Destination as Flat Files or Database tables? If Database tables, how can I use this data in the tables in BPC?
2) If I go for Flat Files, which is saved in the application server of BI, how can I import those files to BPC and use it using Data Manager?
3) Also, in case of Flat Files, there are two files which are created. One is the control file and other one is the data file. How can I use both of them? Or can I just use data file without header?
Your replies will be much appreciated.
Thanks,
Abhishek
Hi Anjali,
I can use the standard Data Manager package from BI to BPC if the CSV file is available on the BPC server, or if I am directly extracting from a BW object (InfoObject or InfoCube).
But since I will be using Open Hub, its output can be in a database table or a CSV file, preferably on the SAP application server. In such cases, how can I use the database table or CSV file in the standard Data Manager package?
Thanks for your reply.
Abhishek -
Unable to load data into Planning cube
Hi,
I am trying to load data into a planning cube using a DTP.
But the system throws a message saying that real-time data loading cannot be performed into the planning cube.
What would be the possible reason for the same?
Thanks & Regards,
Surjit P
Hi Surjit,
To load data into the cube using a DTP, the cube must be set to loading mode.
The real-time cube behaviour is set to planning mode only when data is input to the cube via the planning layouts or through the file upload program.
You can change the behaviour of the cube in RSA1 -> right-click on the cube -> Change Real-Time Load Behaviour, and select the first option (real-time cube can be loaded with data; planning not allowed).
Best Rgds
Shyam
Edited by: Syam K on Mar 14, 2008 7:57 AM
Edited by: Syam K on Mar 14, 2008 7:59 AM