Steps before data load in BI 7.0
Hi all,
We have upgraded BW from 3.1 to 7.0 SP 9. What steps and configurations need to be taken care of in BW and in R/3 before extracting the data? The data flow is in the 3.x model; we are not migrating to the new data flow.
Do I need to migrate the DataSource to 7.0 before loading data?
Thanks in advance.
Regards,
PM.
Hello PM
If you stay on the 3.x format, you do not have to do anything.
Refer to these threads:
/message/1358020#1358020 [original link is broken]
and
/message/2523353#2523353 [original link is broken]
and
https://www.sdn.sap.com/irj/sdn/docs?rid=/library/uuid/7c2a7c65-0901-0010-5e8c-be0ad9c05a31
Regards,
BVC
Similar Messages
-
Adding leading zeros before data is loaded into DSO
Hi
In the PROD_ID below, some IDs are missing leading zeros when data is loaded into BI from SRM. The data type is character, total length 40. If leading zeros are missing, data activation of the DSO fails and they have to be added manually in the PSA table. I want to add the leading zeros, if they're missing, before the data is loaded into the DSO. For example, if the value is 1502 there should be 36 zeros in front of it, and if it is 265721 there should be 34. Only values of length 4 or 6 arrive, so 34 or 36 leading zeros are always needed in front of them if zeros are missing.
Can we use the CONVERSION_EXIT_ALPHA_INPUT function module? As this is a character field, I'm not sure how to use it in this case. Do I need to convert it to an integer first?
Can someone please give me sample code? We're using the BW 3.5 data flow to load data into the DSO. Please also say where the code should go, in a rule type or in a start routine.

Hi,
Can you check at InfoObject level what kind of conversion routine it uses?
Use transaction RSD1, enter your InfoObject and display it.
At DataSource level you can also see which external/internal format is maintained.
If your InfoObject uses the ALPHA conversion routine, it will get the leading zeros automatically.
Check in RSA3 how the data comes from the source.
If you are seeing this issue for some records only, you need to check those records.
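For illustration, the padding that CONVERSION_EXIT_ALPHA_INPUT performs for an ALPHA-converted CHAR40 field can be sketched in Python (in BW the fix itself would be an ABAP routine calling that function module; the 40-character length comes from the thread):

```python
def alpha_pad(prod_id: str, length: int = 40) -> str:
    """Left-pad a numeric character value with zeros, mimicking
    what CONVERSION_EXIT_ALPHA_INPUT does for a CHAR40 field."""
    value = prod_id.strip()
    # Only purely numeric values are padded; ALPHA conversion leaves
    # non-numeric values essentially unchanged.
    if value.isdigit():
        return value.zfill(length)
    return value

print(alpha_pad("1502"))    # 36 leading zeros followed by "1502"
print(alpha_pad("265721"))  # 34 leading zeros followed by "265721"
```

In an actual start routine you would loop over the data package and apply this to the PROD_ID field before activation.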
Thanks -
Selective Deletion Before Data Load
Hi Experts - I need to load data into an Oracle data warehouse. Before loading the data, I need to do some selective deletion from the target table.
In the source dataset I have a date column with a max and a min date. I need to delete the data from the target lying between this min and max date.
Any idea how to do this selective deletion?
Thanks
R

Create a workflow, and declare two local variables, $DateMin and $DateMax, of either date or datetime datatypes, as appropriate. Create a script:
$DateMin = sql('DS','select min([datetime field]) from [incoming table]');
$DateMax = sql('DS','select max([datetime field]) from [incoming table]');
Add a dataflow to your workflow, and connect it downstream of the script. Add two parameters to the dataflow -- let's say you call them $P_DateMin and $P_DateMax. Back in your workflow, in the "Calls" tab of the Variables & Parameters window, set the mapping of the two dataflow input parameters to your two local workflow variables.
In your dataflow: perform a selection of the primary key (the column(s) which constitute the PK) of your target table, filtering on your two input parameter values ($P_DateMin and $P_DateMax). If you want to be on the safe side in terms of preventing blocking issues, send these records into a Data Transfer transform (file preferred, but up to you). Then, downstream from the Data Transfer transform, send the records into a Map Operation transform, mapping 'Normal' to 'Delete'. Then, simply send them into your target table.
You could, of course, just write a SQL script to delete the records, but scripts are best avoided as they break the lineage & impact chains.
If all your date or datetime stamp fields on your target table are "whole" dates, with no time portion, and you have a smallish number of dates between your min. and max. dates, and you have a large number of records to delete between those dates, and your target table has an index on the date stamp column, then another approach would be to generate records, one per day, using a Date Generation transform, still making use of your two dataflow parameters. You'd declare the date field so generated to be the (false) primary key, map the records to deletes w/ the Map Operation transform, and then send them into your target, with the "Use input keys" option selected. -
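The min/max-bounded delete described above can also be expressed in plain SQL; here is an illustrative Python/sqlite3 sketch (table and column names are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, load_date TEXT)")
cur.execute("CREATE TABLE incoming (id INTEGER, load_date TEXT)")
cur.executemany("INSERT INTO target VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-02-15"), (3, "2024-03-31")])
cur.executemany("INSERT INTO incoming VALUES (?, ?)",
                [(10, "2024-02-01"), (11, "2024-02-28")])

# Equivalent of the $DateMin / $DateMax script step
date_min, date_max = cur.execute(
    "SELECT MIN(load_date), MAX(load_date) FROM incoming").fetchone()

# Selective deletion bounded by the incoming min/max dates
cur.execute("DELETE FROM target WHERE load_date BETWEEN ? AND ?",
            (date_min, date_max))
conn.commit()

remaining = [r[0] for r in cur.execute("SELECT id FROM target ORDER BY id")]
print(remaining)  # only the row inside the window is removed
```

In Data Services the same delete happens via the Map Operation transform so that lineage is preserved; the SQL above is just the logical equivalent.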
How many errors in data file before data load is aborted?
Is there a limit to how many incorrect records you can have when loading a free form file before Essbase aborts the load process?
I am trying to export a complete database to another database where the cost center level detail is removed, i.e. all parents exist (and data has been exported using All Levels).
Thanks

I have done it this way in the past:
1-Create a dummy scenario.
2-Write a calc script to copy data to that scenario, but only fix on the blocks you want to extract.
3-Delete all other scenarios, thus leaving ONLY the data you copied to your dummy scenario.
4-Rename the dummy scenario to the scenario name you need.
5-Export the full database to a file.
6-Load the export into your destination database.
This might work, but not knowing your exact source and target hierarchies, it may not. -
Hi,
I have a very large Excel workbook which I need to load into SSIS. Using the default Excel connection manager, SSIS seems to load the entire worksheet into memory before performing any operation, even after setting IMEX=1 within the connection string.
Can anyone provide me with step-by-step instructions on how to use an OLE DB Data Source with Microsoft.ACE, please?
I've found this link:
https://www.connectionstrings.com/ace-oledb-12-0/
which looks very useful.
Since I currently have Excel 2010 as my data source I tried:
Data Source=C:\PoC\POC_Data_Source\my.xlsx;Provider=Microsoft.ACE.OLEDB.12.0;Extended Properties="Excel 12.0 Xml;HDR=YES ;IMEX=1";
The connection tests successfully. However, SSIS still seems to load the entire worksheet into memory before starting the data load.
Kind Regards,
Kieran.
Kieran Patrick Wood http://www.innovativebusinessintelligence.com http://uk.linkedin.com/in/kieranpatrickwood http://kieranwood.wordpress.com/

Thanks again Koen,
I am developing remotely on a development server. So when I run SSDT I am running it within the development server box. My current technology / environment mix means I can't run SSDT from my local desktop.
I am pretty sure that the scanning of rows is causing the performance issue; e.g. if I just click on the data source component, SSDT freezes for several minutes, unless I use a cut-down version of the Excel file, say the first 5,000 rows, in which case the response of SSDT is acceptable.
Also, when I have SSDT pointing at the full spreadsheet and I click around in the associated Data Flow, I notice devenv.exe *32 slowly grabbing 4 GB of RAM and releasing it, according to Windows Task Manager.
Kind Regards,
Kieran.
Kieran Patrick Wood http://www.innovativebusinessintelligence.com http://uk.linkedin.com/in/kieranpatrickwood http://kieranwood.wordpress.com/ -
Can you give clear steps on how to load 3 data targets at a time with the help of parallelism?
Can you give clear steps on how to load 3 data targets at a time with the help of parallelism?
hi,
Create the load InfoPackage process type and assign the InfoPackage you need to load.
Create 3 similar process types.
The chain should have this flow:
start -> delete index (if a cube is the target) -> the three load processes connected to the start process in parallel -> AND process -> create index (if cube) / activation of ODS (if ODS)
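As an illustration only (in BW the parallelism is configured in the process chain itself, not coded), the fan-out/fan-in pattern above, three loads started together with the AND process waiting for all of them, can be sketched with Python threads; `run_load` is a hypothetical stand-in for one InfoPackage load:

```python
from concurrent.futures import ThreadPoolExecutor

def run_load(target: str) -> str:
    # Stand-in for triggering one InfoPackage load into a data target
    return f"{target} loaded"

targets = ["CUBE_A", "CUBE_B", "CUBE_C"]  # illustrative target names

# Start process -> three loads in parallel -> AND process joins on all three
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_load, targets))

print(results)
```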
Ramesh -
ODI - How to clear a slice before executing the data load interface
Hi everyone,
I am using ODI 10.1.3.6 to load data daily into an ASO cube (version 11.1.2.1). Before loading data for a particular date, I want the region defined by that date to be cleared in the ASO cube.
I suppose I need to run a PRE_LOAD_MAXL_SCRIPT that clears the area defined by an MDX function. But I don't know how I can automatically define the region by looking at several columns in the data source.
Thanks a lot.

Hi, thank you for the response.
I know how to clear a region in ASO database. I wrote a MaxL like the following:
alter database App.Db clear data in region '{([DAY].[Day_01],[MONTH].[Month_01],[YEAR].[2011])}'
physical;
I have 3 separate dimensions: DAY, MONTH and YEAR. My question was that I don't know how to automate the clearing process before each data load for a particular date.
Can I somehow automatically set the Day, Month and Year information in the MDX function by looking at the day, month and year columns in the relational data source? For example, if I am loading data for 03.01.2011, I want my MDX function to become {([DAY].[Day_01],[MONTH].[Month_03],[YEAR].[2011])}. In the data source table I have separate columns for Day, Month and Year, which should make it easier, I guess.
I also thought of using substitution variables to define the region, but then again the variables would need to be set according to the day, month and year columns in the data source table. I should also mention that the data source table is truncated and loaded daily, so there can't be more than one day or one month etc. in the table.
I don't know if I have stated my problem clearly; please let me know if there are any confusing bits.
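One way to automate this is to build the region specification from the date columns before issuing the MaxL. A sketch in Python (parsing the date as MM.DD.YYYY to match the Day_01/Month_03 expectation in the example above; the App.Db name comes from the thread, the function name is made up):

```python
def region_for(date_str: str) -> str:
    """Build the ASO clear-region MDX tuple for one load date (MM.DD.YYYY)."""
    month, day, year = date_str.split(".")
    return ("{{([DAY].[Day_{d:02d}],[MONTH].[Month_{m:02d}],"
            "[YEAR].[{y}])}}").format(d=int(day), m=int(month), y=year)

mdx = region_for("03.01.2011")
# The MDX tuple is then spliced into the MaxL clear statement
maxl = f"alter database App.Db clear data in region '{mdx}' physical;"
print(mdx)
```

The same string could equally be built in the ODI procedure step from the Day/Month/Year columns of the staging table before the PRE_LOAD_MAXL_SCRIPT runs.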
Thanks a lot. -
Hi all,
One of our process chains failed at a data loading step. In the monitor screen, the status tab shows the message:
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Extractor .
Refer to the error message.
Procedure
How you remove the error depends on the error message.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
In the error message option, I can see:
1) No Planning Version Selected ID SAPAPO/TSM IDno224
2) Errors in source System ID RSM IDno340
Please help me to resolve the issue.

Hi
Right-click on the failed DTP --> select Monitor.
Here on top you will find "Job overview"; select this.
Now check the job log of this DTP; you will come to know what the exact error is.
Activate the objects related to this DTP (I mean source and target).
Regards,
Venkatesh -
Steps for master data loading by using process chain
Hi
I want to load master data by using a process chain. Can anyone tell me the step-by-step method for loading the master data?
I mean, what processes should I include in the process chain? Please tell me the sequence.
I'll assign the points.
kumar

Hi,
The process of loading master data is similar to transaction data in terms of using the InfoPackages.
First you have to load attributes, followed by texts and hierarchies. For master data you have to include the process type called Attribute Change Run. This can be included as a separate process type or using the AND process.
Hope this helps.
Assign points if useful.
Regards,
Venkat -
"master data deletion for requisition" before master data loading
Hello Gurus,
In our BW system, the process chains for loading master InfoObjects all include the "master data deletion for requisition" ABAP
process, except for one process chain. My question is:
why is that process chain for master data loading different from the others in lacking "master data deletion for requisition"?
So does it not matter whether you include the "master data deletion for requisition" ABAP process in a process chain for master data loading?
Many thanks.

Hi,
ABAP Process means some ABAP program is being executed in this particular step.
It's possible that for all of your process chains except that one, the requirement was to do some ABAP program processing.
You can check which program is executed as follows:
Open your process chain in planning view -> double-click on that particular ABAP process -> here you can see the program name as well as the program variant.
Hope this helps!
Regards,
Nilima -
Error while starting data loading on InfoPackage
Hi everybody,
I'm new to SAP BW and I'm working through the "Step-By-Step: From Data Model to the BI Application in the Web" document from SAP.
I'm having a problem at Chapter 9, item c - Starting Data Load Immediately.
If anyone can help me:
Thanks,
Thiago
Below is a copy of the error from my SAP GUI.
<><><><><><><><><><<><><><><><><><><><><><><><><><><><><><><><>
Runtime Errors MESSAGE_TYPE_X
Date and Time 19.01.2009 14:41:22
Short text
The current application triggered a termination with a short dump.
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact you SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
Short text of error message:
Batch - Manager for BW Processes ***********
Long text of error message:
Technical information about the message:
Message class....... "RSBATCH"
Number.............. 000
Variable 1.......... " "
Variable 2.......... " "
Variable 3.......... " "
Variable 4.......... " "
How to correct the error
Probably the only way to eliminate the error is to correct the program.
If the error occures in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"MESSAGE_TYPE_X" " "
"SAPLRSBATCH" or "LRSBATCHU01"
"RSBATCH_START_PROCESS"
If you cannot solve the problem yourself and want to send an error
notification to SAP, include the following information:
1. The description of the current problem (short dump)
To save the description, choose "System->List->Save->Local File
(Unconverted)".
2. Corresponding system log
Display the system log by calling transaction SM21.
Restrict the time interval to 10 minutes before and five minutes
after the short dump. Then choose "System->List->Save->Local File
(Unconverted)".
3. If the problem occurs in a problem of your own or a modified SAP
program: The source code of the program
In the editor, choose "Utilities->More
Utilities->Upload/Download->Download".
4. Details about the conditions under which the error occurred or which
actions and input led to the error.
System environment
SAP-Release 701
Application server... "sun"
Network address...... "174.16.5.194"
Operating system..... "Windows NT"
Release.............. "5.1"
Hardware type........ "2x Intel 801586"
Character length.... 8 Bits
Pointer length....... 32 Bits
Work process number.. 2
Shortdump setting.... "full"
Database server... "localhost"
Database type..... "ADABAS D"
Database name..... "NSP"
Database user ID.. "SAPNSP"
Terminal.......... "sun"
Char.set.... "English_United State"
SAP kernel....... 701
created (date)... "Jul 16 2008 23:09:09"
create on........ "NT 5.2 3790 Service Pack 1 x86 MS VC++ 14.00"
Database version. "SQLDBC 7.6.4.014 CL 188347 "
Patch level. 7
Patch text.. " "
Database............. "MaxDB 7.6, MaxDB 7.7"
SAP database version. 701
Operating system..... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2, Windows
NT 6.0"
Memory consumption
Roll.... 8112
EM...... 11498256
Heap.... 0
Page.... 65536
MM Used. 6229800
MM Free. 1085264
User and Transaction
Client.............. 001
User................ "THIAGO"
Language key........ "E"
Transaction......... "RSA1 "
Transactions ID..... "CD47E6DDD55EF199B4E6001B782D539C"
Program............. "SAPLRSBATCH"
Screen.............. "SAPLRSS1 2500"
Screen line......... 7
Information on where terminated
Termination occurred in the ABAP program "SAPLRSBATCH" - in
"RSBATCH_START_PROCESS".
The main program was "RSAWBN_START ".
In the source code you have the termination point in line 340
of the (Include) program "LRSBATCHU01".
Source Code Extract
Line SourceCde
310        endif.
311      l_lnr_callstack = l_lnr_callstack - 1.
312    endloop. " at l_t_callstack
313  endif.
314
315  *---- Eintrag für RSBATCHHEADER -
316    l_s_rsbatchheader-batch_id = i_batch_id.
317    call function 'GET_JOB_RUNTIME_INFO'
318      importing
319        jobcount = l_s_rsbatchheader-jobcount
320        jobname = l_s_rsbatchheader-jobname
321      exceptions
322        no_runtime_info = 1
323        others = 2.
324    call function 'TH_SERVER_LIST'
325      tables
326        list = l_t_server
327      exceptions
328        no_server_list = 1
329        others = 2.
330    data: l_myname type msname2.
331    call 'C_SAPGPARAM' id 'NAME' field 'rdisp/myname'
332                       id 'VALUE' field l_myname.
333    read table l_t_server with key
334      name = l_myname.
335    if sy-subrc = 0.
336      l_s_rsbatchheader-host = l_t_server-host.
337      l_s_rsbatchheader-server = l_myname.
338      refresh l_t_server.
339    else.
>>>>>    message x000.
341    endif.
342    data: l_wp_index type i.
343    call function 'TH_GET_OWN_WP_NO'
344      importing
345        subrc = l_subrc
346        wp_index = l_wp_index
347        wp_pid = l_s_rsbatchheader-wp_pid.
348    if l_subrc <> 0.
349      message x000.
350    endif.
351    l_s_rsbatchheader-wp_no = l_wp_index.
352    l_s_rsbatchheader-ts_start = l_tstamps.
353    l_s_rsbatchheader-uname = sy-uname.
354    l_s_rsbatchheader-module_name = l_module_name.
355    l_s_rsbatchheader-module_type = l_module_type.
356    l_s_rsbatchheader-pc_variant = i_pc_variant.
357    l_s_rsbatchheader-pc_instance = i_pc_instance.
358    l_s_rsbatchheader-pc_logid = i_pc_logid.
359    l_s_rsbatchheader-pc_callback = i_pc_callback_at_end.

Hi,
I am also getting a related issue; kindly see the short dump description below.
Short text
The current application triggered a termination with a short dump.
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact you SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
Short text of error message:
Variant RSPROCESS0000000000705 does not exist
Long text of error message:
Diagnosis
You selected variant 00000000705 for program RSPROCESS.
This variant does not exist.
System Response
Procedure
Correct the entry.
Technical information about the message:
Message class....... "DB"
Number.............. 612
Variable 1.......... "&0000000000705"
Variable 2.......... "RSPROCESS"
Variable 3.......... " "
Variable 4.......... " "
How to correct the error
Probably the only way to eliminate the error is to correct the program.
If the error occures in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"MESSAGE_TYPE_X" " "
"SAPLRSPC_BACKEND" or "LRSPC_BACKENDU05"
"RSPC_PROCESS_FINISH"
If you cannot solve the problem yourself and want to send an error
notification to SAP, include the following information:
1. The description of the current problem (short dump)
To save the description, choose "System->List->Save->Local File
(Unconverted)".
2. Corresponding system log
Display the system log by calling transaction SM21.
Restrict the time interval to 10 minutes before and five minutes
after the short dump. Then choose "System->List->Save->Local File
(Unconverted)".
3. If the problem occurs in a problem of your own or a modified SAP
program: The source code of the program
In the editor, choose "Utilities->More
Utilities->Upload/Download->Download".
4. Details about the conditions under which the error occurred or which
actions and input led to the error.
System environment
SAP-Release 701
Application server... "CMCBIPRD"
Network address...... "192.168.50.12"
Operating system..... "Windows NT"
Release.............. "6.1"
Hardware type........ "16x AMD64 Level"
Character length.... 16 Bits
Pointer length....... 64 Bits
Work process number.. 0
Shortdump setting.... "full"
Database server... "CMCBIPRD"
Database type..... "MSSQL"
Database name..... "BIP"
Database user ID.. "bip"
Terminal.......... "CMCBIPRD"
Char.set.... "C"
SAP kernel....... 701
created (date)... "Sep 9 2012 23:43:54"
create on........ "NT 5.2 3790 Service Pack 2 x86 MS VC++ 14.00"
Database version. "SQL_Server_8.00 "
Patch level. 196
Patch text.. " "
Database............. "MSSQL 9.00.2047 or higher"
SAP database version. 701
Operating system..... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2, Windows
NT 6.0, Windows NT 6.1, Windows NT 6.2"
Memory consumption
Roll.... 16192
EM...... 4189840
Heap.... 0
Page.... 16384
MM Used. 2143680
MM Free. 2043536
User and Transaction
Client.............. 001
User................ "BWREMOTE"
Language Key........ "E"
Transaction......... " "
Transactions ID..... "9C109BE2C9FBF18BBD4BE61F13CE9693"
Program............. "SAPLRSPC_BACKEND"
Screen.............. "SAPMSSY1 3004"
Screen Line......... 2
Information on caller of Remote Function Call (RFC):
System.............. "BIP"
Database Release.... 701
Kernel Release...... 701
Connection Type..... 3 (2=R/2, 3=ABAP System, E=Ext., R=Reg. Ext.)
Call Type........... "asynchron without reply and transactional (emode 0, imode
0)"
Inbound TID.........." "
Inbound Queue Name..." "
Outbound TID........." "
Outbound Queue Name.." "
Information on where terminated
Termination occurred in the ABAP program "SAPLRSPC_BACKEND" - in
"RSPC_PROCESS_FINISH".
The main program was "SAPMSSY1 ".
In the source code you have the termination point in line 75
of the (Include) program "LRSPC_BACKENDU05".
Source Code Extract
Line SourceCde
45 l_t_info TYPE rs_t_rscedst,
46 l_s_info TYPE rscedst,
47 l_s_mon TYPE rsmonpc,
48 l_synchronous TYPE rs_bool,
49 l_sync_debug TYPE rs_bool,
50 l_eventp TYPE btcevtparm,
51 l_eventno TYPE rspc_eventno,
52 l_t_recipients TYPE rsra_t_recipient,
53 l_s_recipients TYPE rsra_s_recipient,
54 l_sms TYPE rs_bool,
55 l_t_text TYPE rspc_t_text.
56
57 IF i_dump_at_error = rs_c_true.
58 * ==== Dump at error? => Recursive Call catching errors ====
59 CALL FUNCTION 'RSPC_PROCESS_FINISH'
60 EXPORTING
61 i_logid = i_logid
62 i_chain = i_chain
63 i_type = i_type
64 i_variant = i_variant
65 i_instance = i_instance
66 i_state = i_state
67 i_eventno = i_eventno
68 i_hold = i_hold
69 i_job_count = i_job_count
70 i_batchdate = i_batchdate
71 i_batchtime = i_batchtime
72 EXCEPTIONS
73 error_message = 1.
74 IF sy-subrc <> 0.
>>> MESSAGE ID sy-msgid TYPE 'X' NUMBER sy-msgno
76 WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
77 ELSE.
78 EXIT.
79 ENDIF.
80 ENDIF.
81 * ==== Cleanup ====
82 COMMIT WORK.
83 * ==== Get Chain ====
84 IF i_chain IS INITIAL.
85 SELECT SINGLE chain_id FROM rspclogchain INTO l_chain
86 WHERE log_id = i_logid.
87 ELSE.
88 l_chain = i_chain.
89 ENDIF.
90 * ==== Lock ====
91 * ---- Lock process ----
92 DO.
93 CALL FUNCTION 'ENQUEUE_ERSPCPROCESS'
94 EXPORTING
If we do this:
use table RSSDLINIT, in OLTPSOURCE enter the name of the DataSource,
in LOGSYS enter the name of the source system, and delete the entry for that InfoPackage. Will the process chain then run successfully, with no short dump, or what? Kindly give a detailed explanation of this RSSDLINIT table.
Regards,
poluru -
BI Content data load - Caller 70 ST22 Error CX_RSR_X_MESSAGE
Hi All,
Currently, we are in the process of activating BI Content, mostly 3.x flows, within our new development system,
including cubes 0FIAP_C02, 0FIAR_C05, 0FIAR_C02, 0FIGL_C01 etc. Activation of content through to the source system has been successful, though loading data into the cubes through the 3.x flow has proved to be an issue.
The problem is that when the InfoPackage is run to update the specific cube, the load remains in yellow pending status before eventually reaching its time-out limit and producing a Caller 70 error as below:
Short dump in the Warehouse
Diagnosis
The data update was not finished. A short dump has probably been logged in BI. This provides information about the error.
System Response
"Caller 70" is missing.
Further analysis:
Search the BI short dump overview for the short dump that belongs to the request. Pay attention to the correct time and date on the selection screen.
You access a short dump list using the wizard or the menu path "Environment -> Short dump -> In the Data Warehouse".
Error handling:
Follow the instructions in the short dump.
After checking ST22 we obtain a number of exceptions, as detailed below:
The exception 'CX_RSR_X_MESSAGE' was raised, but it was not caught anywhere
along
the call hierarchy.
Since exceptions represent error situations and this error was not
adequately responded to, the running ABAP program 'SAPLRRMS' has to be
terminated.
If the error occures in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"UNCAUGHT_EXCEPTION" "CX_RSR_X_MESSAGE"
"SAPLRRMS" or "LRRMSU13"
"RRMS_X_MESSAGE"
75 ELSE.
76 DATA: l_text TYPE string,
77 l_repid TYPE syrepid.
78
79 l_repid = i_program.
80 l_text = i_text.
81
>>>>> RAISE EXCEPTION TYPE cx_rsr_x_message
83 EXPORTING text = l_text
84 program = l_repid.
85 ENDIF.
86
87 ENDFUNCTION.
I have looked through the many articles related to the error, including notes 551464, 850428 and 615389,
though to no avail. There are no locked IDocs in SM58, Basis have checked the DB size and all is OK, and reactivating and dropping indexes did not work either.
Just wondering if there could be anything further to check to resolve the data loading issue?
Many thanks

Hi,
Based on the provided details, please take a look at the following detail in the short dump:
I_IOBJNM
<Object> (in most cases the impacted object is 0REQUEST)
So, check the version of the object: tcode RSD5 -> <object> -> Display
-> 'Version Comparison' button -> Display All Properties -> Active/Content Version.
With these steps you can see what is different between the versions (A-Active and D-Delivered).
To solve the issue, check the steps provided in note 1157796 (in the first part, the Symptom item).
It should help.
Regards,
Sinara -
"UNICODE_IN_DATA" error in ODI 11.1.1.5 data load interface
Hello!
I am sorry, I have to ask for help again with a new issue in ODI 11.1.1.5. This is a multiple-column data load interface. I am loading data from a tab-delimited text file into Essbase ASO 11.1.2. The ODI repository database is MS SQL Server. In the target datastore some fields are not mapped to the source but hard-coded with a fixed value; for example, since only budget data is loaded by default, the mapping for the "Scenario" field in the target has the input string 'Budget'. This data load interface has no rules file.
At "Prepare for loading" step the following error is produced:
org.apache.bsf.BSFException: exception from Jython:
Traceback (most recent call last):
File "<string>", line 86, in <module>
AttributeError: type object 'com.hyperion.odi.common.ODIConstants' has no attribute 'UNICODE_IN_DATA'
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
at java.lang.Thread.run(Thread.java:662)
I will be very grateful for any hints.

Have you changed any of the Hyperion Java files?
I have not seen this exact error before, but I have seen errors like this when the KM is not in sync with the Java files.
Also I always suggest using a rules file.
If you have changed the files, revert to the original odihapp_common.jar and see if it works; if you changed the files to get around the issues I described in the blog, you should be all right having changed only odihapp_essbase.jar.
This is the problem now with Oracle and all their different versions and patches of ODI; it seems to me they put effort into the 10.1.3.x Hyperion modules and then in 11.1.1.5 just gave up and totally messed a lot of things up.
I hope somebody from Oracle reads this, because they need to get their act together.
Cheers
John
http://john-goodwin.blogspot.com/ -
Data load taking very long time between cube to cube
Hi
In our system, data loading between cubes using a DTP in BI 7.0 is taking a very long time.
Most of the time is spent in the "start of extraction" step. Can anybody help with reducing the start-of-extraction time, please?
Thanks
Kiran

Kindly elaborate your issue a little: how is the mapping between the two cubes, is it one-to-one or is there a routine in the transformation? Any filter or routine in the DTP? Also, before loading data to the cube, did you delete the indexes?
Regards,
Sushant -
Data load along with analysis
Hello,
We had a data inconsistency situation and ran RSRV tests. One of the tests is "Initial key figure units in fact tables". This test gave a red light with the message that the check for key figure units in the fact tables of InfoCube "ZXXXXX" found empty units.
The description is as follows:
Message no. RSRV117
Description:
In the fact tables (E and F tables) of an InfoCube, a search is being carried out for those records that contain values other than zero for key figures that have units, but for which the unit of the key figure is blank (has no value).
Since the value of the unit has to correspond to the value of the key figure, such records would indicate an error in the data upload. The values of the units have not been loaded correctly into BW.
Repairs
There are no automatic repair options. Reload the data.
So do I need to load all my data once again? Is this error because the data coming from R/3 has no units? Do I need to correct anything in the InfoCube before loading data? Also, can somebody give me a link or documentation with a step-by-step procedure for extracting and loading from R/3 to the InfoCube while analyzing the data load at each step?
Thank you

Hi Visu,
As I understand it, what he meant is to check the fields in the extractor that populate the currency key fields you specified in BW.
If you have a 1:1 mapping in your transfer structure:
run the extractor in RSA3 on the R/3 side for the records which have no units coming in,
and then in the output display make sure that those fields are being populated.
If you are using transfer rules to populate those fields, debug the transfer rules to find out why the units are not being populated correctly.
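The condition that RSRV flags here (a key figure with a non-zero value but a blank unit) is simple to express; a sketch over a few made-up fact rows:

```python
# Illustrative fact rows: (key figure value, unit)
fact_rows = [
    (100.0, "EUR"),
    (250.0, ""),    # non-zero value without a unit -> the RSRV error case
    (0.0, ""),      # zero with a blank unit is acceptable
]

# Records RSRV would flag: key figure non-zero but unit initial/blank
bad = [row for row in fact_rows if row[0] != 0 and not row[1].strip()]
print(bad)
```

Running the same check against the extractor output in RSA3 tells you whether the blank units already arrive from R/3 or are lost in the transfer rules.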
Let us know ..
Ashish.