Load data into infocube
Hello, can anybody please tell me how to load data into an InfoCube in BW 7.0?
thanks a lot
komal.
Hi,
These are the steps for creating an InfoCube and loading data:
1. Creating the InfoCube itself is the same as in BW 3.5.
2. Once you have created the InfoCube, assign dimensions to it.
3. Create an application component and an InfoSource, similar to BW 3.5.
4. After creating the InfoSource, right-click on it and create a transformation.
5. All the dimensions and key figures are mapped automatically in the transformation rules.
6. Right-click on the InfoCube and choose the option "Create Data Transfer Process".
7. Create an InfoPackage as usual and schedule it to load the data into the InfoProvider.
Similar Messages
-
ABAP short dump when loading data into Infocube
Hello All,
I am getting an ABAP short dump error when loading data into Infocube. I tried to load data from PSA and ODS.
I ran the code to compress the Infocube, no improvement still getting the same error.
Error Analysis:
A RAISE statement in the program "SAPLRSDU_PART" raised the exception.
Internal Notes:
The termination occurred in the function "ab_jfune" of the SAP Basis system, specifically in line 2316 of the module "//bas/620/src/krn/runt/abfunc.c#18".
The internal operation just processed is "FUNE".
Advance thanks !!!
Siv
Hello Siv,
try to run program SAP_DROP_EMPTY_FPARTITIONS to display existing partitions. Then compare it to the number of uncompressed requests. Maybe there's a mismatch.
SAP notes that might help: 385163, 590370
Regards,
Marc
SAP NetWeaver RIG, US BI -
Cannot load data into infocube in BW NW2004s sneak preview
For BW with NW2004s preview.
I can load data into the PSA, but an error occurs when loading data from the PSA to the InfoCube using a DTP. The error is: 'Join table TESTDATRNRPART0 does not exist - read SAP note 871084'. Note 871084 says that SP05 fixes the problem, but when I check the status of the BW server it shows 'release 700 / level 0005 / highest support package SAPKW0005'.
Does that mean the BW server has SP05?
Hi Parvendar and Vinod,
Thanks for the information. I checked in SM58. For today it shows 4 tRFCs: 1 with user bwremote and 3 with user BASIS (the user I am logged in as). All have an error saying "An exception occurred that was uncaught".
I also tried to reschedule the data for one InfoPackage alone. We have 3 InfoPackages in the process chain, and I tried with one of them.
When I scheduled it, InfoCube -> Manage shows the status of the request as yellow. When I try to open the load monitor, it displays the same message as I saw yesterday.
In Extraction Phase it shows
Extraction (messages): Errors occurred
Data request received - green
Data selection scheduled - green
41 records sent (41 records received) - yellow
Data Package 1 (41 records): Missing messages - yellow
Data selection ended - green
Transfer (IDocs and tRFC): Missing messages or warnings
Data Package 1: arrived in BW; processing: data packet not yet processed
Info IDocs 1, 2, 3, 4 are all green
Update PSA, Transfer Rules and Update Rules are all green
Update (0 new / 0 changed): warning received
message missing: update finished for 0TCT_C21
Please let me know what could be the possible issue
Regards,
Raghavan
Edited by: Raghavan Guruswaminathan on May 11, 2010 8:10 AM -
Loading data into infocube in bi 7.0 from flat file.
Hello All,
I need the complete procedure for loading data into an InfoCube from a flat file in BI 7.0, i.e. using a transformation, DTP, etc.
Please help me with some good documents.
Hi Pratighya,
Step by step procedure for loading data from flat file.
1. Create the InfoObjects you might need in BI
2. Create your target InfoProvider
3. Create a source system
4. Create a DataSource
5. Create and configure an InfoPackage that will bring your records to the PSA
6. Create a transformation from the DataSource to the InfoProvider
7. Create a Data Transfer Process (DTP) from the DataSource to the InfoProvider
8. Schedule the InfoPackage
9. Once successful, run the DTP
10. This will fill your target.
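The steps above assume a delimited flat file on the source system. A minimal, purely illustrative example of such a file (the field names and semicolon separator here are assumptions for the sketch, not from the original thread):

```
CUSTOMER;MATERIAL;CALDAY;QUANTITY
C1000;M2000;20100511;25
C1001;M2001;20100512;40
```

The header row and separator must match what you configure in the DataSource's extraction settings, otherwise the InfoPackage load to the PSA will fail or misalign fields.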
Hope this helps
Regards
Karthik -
Runtime error while loading data into infocube
Hi all,
I am getting a runtime error while loading the transaction data into the InfoCube.
Exception name: CX_SY_GENERATE_SUBPOOL_FULL
Runtime error: GENERATE_SUBPOOL_DIR_FULL
Anyone has any idea about this error? Any help will be really helpful. Thanks in advance.
Regards,
Sathesh
Hi Sathesh,
Please provide more details for the problem.
You can go and check in ST22. Go through the Dump Analysis and Let us know.
Regards
Hemant Khemani -
Steps for loading data into the infocube in BI7, with dso in between
Dear All,
I am loading data into the InfoCube in BI7, with a DSO in between. My data flow looks like:
Top to bottom:
InfoCube (Customized)
Transformation
DSO (Customized)
Transformation
DataSource (Customized).
The mapping and everything else looks fine, and data is also seen in the cube on the FULL load.
But due to some minor error (I guess), I am unable to see the DELTA data in the DSO, although it is loaded into the DataSource through process chains.
Kindly advise me where I went wrong.
OR... step-by-step instructions for loading data into the InfoCube in BI7, with a DSO in between, would be really helpful.
Regards,
Hi,
my first impulse would be to check whether the DSO is set to "direct update". In that case no delta is possible, because the change log is not maintained.
My second thought would be to check the DTP moving data between the DSO and the target cube. If it is set to Full, you will not get a delta. A DTP cannot be switched from Full to Delta after creation, so if you created one in Full mode, delete it and create a new DTP in Delta mode.
Hope this helps.
Kind regards,
Jürgen -
Error while loading data into 0pur_c03 using 2lis_02_scl
Hi,
While trying to load data into 0pur_c03 using DataSource 2lis_02_scl from R/3, the error message "The material 100 does not exist or not active" is displayed. I have loaded master data for 0MATERIAL and performed the change run as well, but the problem is still not resolved. Please help in resolving this issue.
Regards,
Hari Prasad B.
I have the same problem in my system. I found that the material XXXXX had movements (sales documents and more), but at a certain point the material was deleted in R/3, so it is not loaded into the 0MATERIAL InfoObject because it is marked for deletion.
If you have this error, you must activate the load request manually; your InfoCube can then hold the data without any problem.
Regards -
Can not load data into the ods.
When I was trying to load data into BW 3.5 from the R/3 system, I got this message:
"For InfoSource ZSD_A507_011, there is no Hierarchies for master data in source system R3PCLNT300"
Any idea? There are no hierarchies for this.
Hello Mat,
What are you trying to load?
Is the DataSource a hierarchy DataSource?
just check this thread
There are no hierarchies for this InfoSource in source system
Thanks
Ajeet -
Unable to load data into Planning cube
Hi,
I am trying to load data into a planning cube using a DTP.
But the system throws a message saying that, real time data loading cannot be performed to the planning cube.
What would be the possible reason for the same?
Thanks & Regards,
Surjit P
Hi Surjit,
To load data into the cube using a DTP, the cube must be set to loading mode.
The real-time cube behaviour is set to planning mode only when data is entered into the cube via the planning layouts or through the file upload program.
You can change the behaviour of the cube in RSA1 -> right-click on the cube -> Change Real-Time Load Behaviour, and select the first option (real-time cube can be loaded with data; planning not allowed).
Best Rgds
Shyam
Edited by: Syam K on Mar 14, 2008 7:57 AM
Edited by: Syam K on Mar 14, 2008 7:59 AM -
Hi,
We are having trouble while importing one ledger 'GERMANY EUR GGAAP'. It works for Dec 2014 but while trying to import data for 2015 it gives an error.
Import error shows " RuntimeError: No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'."
I tried all the Knowledge Documents from Oracle Support but had no luck. Please help us resolve this issue, as it is occurring in our Production system.
I also checked all period settings under Data Management> Setup> Integration Setup > Global Mapping and Source Mapping and they all look correct.
Also, it is only happening to one ledger; all other ledgers are working fine without any issues.
Thanks
Hi,
there are some Support documents related to this issue.
I would suggest you have a look at them.
Regards -
How to load data into an ods from multiple info sources.
hi all...
I have been given a task to load data into an ODS from 3 InfoSources.
Can someone please give me the flow?
Thank you in advance.
Hi Hara Pradhan,
You have to create 3 update rules, specifying the 3 different InfoSources while creating each update rule. And you have to create the InfoPackages under each InfoSource. With this you can load the data into the same data target from multiple InfoSources.
Hope it helps!
Assign points if it helps! -
Can i use one interface to load data into 2 different tables
Hi Folks,
Can I use one interface to load data into 2 different tables (same schema or different schemas) from one source table with the same structure?
Please give me advice
Thanks
Raj
Edited by: user11410176 on Oct 21, 2009 9:55 AM
Hi Lucky,
Thanks for your reply.
What I am trying is this: I am trying to load the data from legacy tables (3) into Oracle staging tables, but I need to load the same source data into two staging tables (these staging tables are in two different schemas).
Can I load this source data into two staging tables using a single standard interface (there is some business logic involved)?
If i can then give me some suggestion how to do that
Thanks in advance
Raj -
How can I load data into table with SQL*LOADER
How can I load data into a table with SQL*Loader when the column data length is more than 255 bytes?
When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
CREATE TABLE A (
A VARCHAR2 ( 10 ) ,
B VARCHAR2 ( 10 ) ,
C VARCHAR2 ( 10 ) ,
E VARCHAR2 ( 2000 ) );
control file:
load data
append into table A
fields terminated by X'09'
(A , B , C , E )
SQL*LOADER command:
sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
datafile:
column E is more than 255bytes
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
Check this out.
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961 -
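For what it's worth, the usual cause here is that SQL*Loader assumes a default datatype of CHAR(255) for fields in the control file. Declaring an explicit length for the long column avoids the error; a sketch based on the control file above:

```
load data
append into table A
fields terminated by X'09'
(A, B, C, E CHAR(2000))
```

The CHAR(2000) matches the VARCHAR2(2000) declaration of column E in the table, so rows with long values in E will load.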
Error Loading Data into a flat file
I am receiving the following error when loading data into a flat file from a flat file. SQL Server 2005 is my back-end DB. If I cut the file in half (approx. 500K rows) my ODI interface works fine. Not sure what to look at. I rebuilt the interface, which before was just dying giving no error, and now I am getting this.
Thanks
java.lang.Exception
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
Figured it out; I found a similar post that suggested changing the heap size.
Increase the heap size in odiparams.bat in the bin folder and restart Designer.
For eg:
set ODI_INIT_HEAP=128m
set ODI_MAX_HEAP=1024m -
Error in loading data into essbase while using Rule file through ODI
Hi Experts,
Referring to my previous post, "Error while using Rule file in loading data into Essbase through ODI":
I am facing a problem while loading data into Essbase. I can load data into Essbase successfully, but when I use a rule file to add values to existing values, I get an error.
test is my Rule file.
com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Cannot put olap file object. Essbase Error(1053025): Object [test] already exists and is not locked by user [admin@Native Directory]
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
at java.lang.Thread.run(Thread.java:662)
from com.hyperion.odi.common import ODIConstants
from com.hyperion.odi.connection import HypAppConnectionFactory
from java.lang import Class
from java.lang import Boolean
from java.sql import *
from java.util import HashMap
# Get the select statement on the staging area:
sql= """select C1_HSP_RATES "HSP_Rates",C2_ACCOUNT "Account",C3_PERIOD "Period",C4_YEAR "Year",C5_SCENARIO "Scenario",C6_VERSION "Version",C7_CURRENCY "Currency",C8_ENTITY "Entity",C9_VERTICAL "Vertical",C10_HORIZONTAL "Horizontal",C11_SALES_HIERARICHY "Sales Hierarchy",C12_DATA "Data" from PLANAPP."C$_0HexaApp_PLData" where (1=1) """
srcCx = odiRef.getJDBCConnection("SRC")
stmt = srcCx.createStatement()
srcFetchSize=30
#stmt.setFetchSize(srcFetchSize)
stmt.setFetchSize(1)
print "executing query"
rs = stmt.executeQuery(sql)
print "done executing query"
#load the data
print "loading data"
stats = pWriter.loadData(rs)
print "done loading data"
#close the database result set, connection
rs.close()
stmt.close()
Please help me on this...
Thanks & Regards,
Chinnu
Hi Priya,
Thanks for the reply. I already checked that no locks exist on the rule file. I don't know what the problem is. It works fine without the rule file, but throws this error only when using the rule file.
Please help with this.
Thanks,
Chinnu