0ORGUNIT_ATTR data load results in short dump ITAB_DUPLICATE_KEY
Hi Everyone,
I tried to load 0ORGUNIT master data in the production system for the first time, and I am getting the short dump ITAB_DUPLICATE_KEY.
The error comes from a standard program. Any thoughts or help in this regard is highly appreciated.
If the error occurs in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"ITAB_DUPLICATE_KEY" " "
"SAPLHRMS_BW_PA_OS" or "LHRMS_BW_PA_OSU06"
"HR_BW_EXTRACT_IO_ORGUNIT"
LOOP AT orgunits.
* Collect positions per plan variant (PLVAR) and call the cost-center FM:
  IF last_plvar IS INITIAL.
    last_plvar = orgunits-plvar.
  ENDIF.
  IF orgunits-plvar = last_plvar.
    MOVE-CORRESPONDING orgunits TO in_objects.
    APPEND in_objects.
  ENDIF.
  IF ( orgunits-plvar <> last_plvar ) OR ( sy-tabix = n_of_lines ).
    REFRESH main_co.
    CALL FUNCTION 'RH_GET_COSTCENTER_OF_OBJECT'
      EXPORTING
        plvar           = last_plvar
        begda           = p_begda
        endda           = p_endda
*       svect           = '1'
*       active          = ' '
*       dist            = ' '
        object_only     = only_direct
        buffered_access = 'X'
        read_it0001     = 'X'
        i0027_flag      = ' '
        ombuffer_mode   = ' '
      TABLES
        in_objects       = in_objects
        main_costcenters = main_co
      EXCEPTIONS
        OTHERS           = 0.
    INSERT LINES OF main_co INTO TABLE main_costcenters.   " <-- error line
    last_plvar = orgunits-plvar.
    REFRESH in_objects.
    MOVE-CORRESPONDING orgunits TO in_objects.
    APPEND in_objects.
  ENDIF.
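For context on the dump itself: ITAB_DUPLICATE_KEY is raised when an INSERT ... INTO TABLE statement adds a line whose unique key already exists in the target table (here, main_costcenters). A minimal Python analogy of the failing pattern with a key guard (the field names are hypothetical, not taken from the extractor):

```python
# Toy model: a table with a unique key, like a sorted/hashed internal table.
main_costcenters = {}  # key: objid -> line

def insert_lines(target, lines):
    """Insert lines, skipping keys that already exist.

    An unguarded insert of a duplicate key is what aborts the ABAP program
    with ITAB_DUPLICATE_KEY; guarding on the key first avoids the dump.
    """
    for line in lines:
        key = line["objid"]
        if key in target:
            continue  # duplicate key: skip instead of aborting
        target[key] = line

insert_lines(main_costcenters, [{"objid": "50000001", "kostl": "CC1"}])
insert_lines(main_costcenters, [{"objid": "50000001", "kostl": "CC1"}])  # duplicate skipped
```

In the ABAP case the equivalent guard would be checking for the key before the INSERT, or deduplicating main_co first; the Note cited in the accepted answer instead sidesteps this code path by dropping the KOKRS/KOSTL attributes.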
I need to solve this issue as soon as possible.
Please help me.
Thanks and Regards,
Phaneendra
Hi,
I have solved this issue. Please follow the note mentioned below. This note can be implemented only if you don't need the KOKRS and KOSTL attributes as part of 0ORGUNIT master data.
Note 543324 - Performance extraction DataSource 0ORGUNIT_ATTR
Thanks for the time every one.
Regards,
Phaneendra
Similar Messages
-
Data load results in Endless loop
Hi guys,
I have an InfoPackage in a process chain. It was working fine until I retransported the process chain without the InfoPackage. Now when I run the process chain, or the InfoPackage on its own, it goes into an endless loop: it just keeps loading 2,000 records in every data packet, and after hours I get the short dump MESSAGE_TYPE_X. Any ideas what happened and what went wrong?
I have tried retransporting the same InfoPackage from DEV again and I get the same error. Please suggest anything else I can try to check what is going on and why the load is not ending.
Thanks
Hi,
This error may occur when there is a shortage of resources (work processes).
It may also be that IDocs are stuck in the source system and need to be passed to BW.
Use BD87 and process them manually, or check SM58 for stuck LUWs and process those manually as well.
Regards,
Vishwa -
Data Load function - max number of columns?
Hello,
I was able to successfully create a page with the Data Load Wizard. I've used this page wizard before with no issues; however, in this particular case I want to load a spreadsheet with a lot of columns (99 to be precise). When I run the page, it uploads the spreadsheet into the proper table, but only the first 45 columns; the remaining columns are null for all rows. Also, there are 100 rows in the spreadsheet and it doesn't load them all (it loads 39).
Is there a limit to the number of columns it can handle?
Also, when I re-upload the same file, the data load results show that it inserted 0 rows, updated 100 rows, and failed 0 rows. However, there are still only 39 rows in total in the table.
Thoughts?
Steve
Steve wrote:
FYI, I figured out why it wasn't loading all 100 rows. Basically I needed to set two dependent columns in the load definition instead of one; it was seeing multiple rows in the spreadsheet and assuming some were the same record based on one column. So that part is solved...
I still would like feedback on the number of columns the Data Load Wizard can handle, and whether there is a way to handle more than 45.
The Data Load Wizard can handle a maximum of 46 columns: +{message:id=10107069}+ -
Working with Data Loading page type
All,
I am creating a page of type "Data Loading" so I can load various data. However, I want to bypass the step where we map the fields/columns (i.e. Data/Table Mapping), because all the data I load is stored in only one column, so this mapping step is unnecessary. My requirement is to load the data, then when you click the Next button it should take you to the Data Validation step and finally the Data Load Results step.
How do I go about this?
APEX 4.1.1
You might try to mimic a button press for the [Next] button. Probably just a submit will do after page rendering (so create a Dynamic Action that submits immediately after load).
-
All,
I am writing a data load process to copy/paste or upload files, and all is good. But now I want to bypass the step for column mapping (run it in the background / move it into step 1) so a user doesn't see it while loading the data. By default the stages are:
Data Load Source
Data / Table Mapping
Data Validation
Data Load Results
So I want to run the 2nd step in the background (or, if possible, move that function and combine it with step 1)... any help?
APEX 4.2
Thanks.
Maybe consider page branches on the relevant page, or the plugin
- Process Type Plugin - EXCEL2COLLECTIONS -
Master data load failure. RSRV check resulted in inconsistencies.
Hi...
In our production system, master data load to 0EMPLOYEE is failing every alternate day. When I check the same in RSRV, following checks are red:
1. Time intervals in Q table for a characteristic with time-dep. master data
2. Compare sizes of P or Q and X or Y tables for characteristic 0EMPLOYEE
3. SID values in X and Y table: Characteristic '0EMPLOYEE'
If I repair it, it becomes green and the load is fine. Next day the load fails again. When I check in RSRV, I get the same 3 errors. So, again I need to repair it. Let me know the permanent solution for this.
I ran the programs: RSDG_IOBJ_REORG and RSDMD_CHECKPRG_ALL but these fixes are also temporary. Moving a new version of the object from Dev to QA and then to Production is not preferable right now as this involves a high amount of visibility.
I know the SID tables and all are corrupted from the logs I see. But is there any permanent solution for this without transports?
Thanks,
Srinivas
Hi
Check this link; it will help you: Master data deletion
Regards
Ashwin. -
CALL_FUNCTION_CONFLICT_TYPE Standard Data loading
Hi,
I am facing a data loading problem using Business Content on the CPS_DATE InfoCube (0PS_DAT_MLS DataSource).
The R/3 extraction processes without any error, but the problem occurs in the update rules while updating the milestone date. Please find below the log from ST22.
The really weird thing is that the process works perfectly in the development environment and not in the integration one (the patch levels are exactly the same: BW 3.5 Patch #16).
I apologise for the long message below... this is a part of the system log.
For information, the routine_0004 is a standard one.
Thanks a lot in advance!
Cheers.
CALL_FUNCTION_CONFLICT_TYPE
Except. CX_SY_DYN_CALL_ILLEGAL_TYPE
Symptom: Type conflict when calling a function module.
Cause: Error in ABAP application program.
The current ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" had to be terminated because one of the statements could not be executed.
This is probably due to an error in the ABAP program.
A function module was called incorrectly.
Error analysis
An exception occurred. This exception is dealt with in more detail below
. The exception, which is assigned to the class 'CX_SY_DYN_CALL_ILLEGAL_TYPE', was neither caught nor passed along using a RAISING clause, in the procedure "ROUTINE_0004"
"(FORM)" .
Since the caller of the procedure could not have expected this exception
to occur, the running program was terminated. The reason for the exception is:
The call to the function module "RS_BCT_TIMCONV_PS_CONV" is incorrect:
The function module interface allows you to specify only fields of a particular type under "E_FISCPER".
The field "RESULT" specified here is a different field type.
How to correct the error.
You may be able to find an interim solution to the problem in the SAP Note system. If you have access to the Note system yourself, use the following search criteria:
"CALL_FUNCTION_CONFLICT_TYPE" "CX_SY_DYN_CALL_ILLEGAL_TYPE"
"GP420EQ35FHFOCVEBCR6RWPVQBR" or "GP420EQ35FHFOCVEBCR6RWPVQBR"
"ROUTINE_0004"
If you cannot solve the problem yourself and you wish to send
an error message to SAP, include the following documents:
1. A printout of the problem description (short dump)
To obtain this, select in the current display "System->List->
Save->Local File (unconverted)".
2. A suitable printout of the system log.
To obtain this, call the system log through transaction SM21. Limit the time interval to 10 minutes before and 5 minutes after the short dump. In the display, then select the function
"System->List->Save->Local File (unconverted)".
3. If the programs are your own programs or modified SAP programs, supply the source code.
To do this, select the Editor function "Further Utilities-> Upload/Download->Download".
4. Details regarding the conditions under which the error occurred
or which actions and input led to the error.
The exception must either be prevented, caught within the procedure
"ROUTINE_0004"
"(FORM)", or declared in the procedure's RAISING clause.
To prevent the exception, note the following:
Environment system SAP Release.............. "640"
Operating system......... "SunOS" Release.................. "5.9"
Hardware type............ "sun4u"
Character length......... 8 Bits
Pointer length........... 64 Bits
Work process number...... 2
Short dump setting....... "full"
Database type............ "ORACLE"
Database name............ "BWI"
Database owner........... "SAPTB1"
Character set............ "fr"
SAP kernel............... "640"
Created on............... "Jan 15 2006 21:42:36"
Created in............... "SunOS 5.8 Generic_108528-16 sun4u"
Database version......... "OCI_920 "
Patch level.............. "109"
Patch text............... " "
Supported environment....
Database................. "ORACLE 9.2.0.., ORACLE 10.1.0.., ORACLE 10.2.0.."
SAP database version..... "640"
Operating system......... "SunOS 5.8, SunOS 5.9, SunOS 5.10"
SAP Release.............. "640"
The termination occurred in the ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" in
"ROUTINE_0004".
The main program was "RSMO1_RSM2 ".
The termination occurred in line 702 of the source code of the (Include)
program "GP420EQ35FHFOCVEBCR6RWPVQBR"
of the source code of program "GP420EQ35FHFOCVEBCR6RWPVQBR" (when calling the editor 7020).
Processing was terminated because the exception "CX_SY_DYN_CALL_ILLEGAL_TYPE" occurred in the procedure "ROUTINE_0004" "(FORM)" but was not handled locally, not declared in the RAISING clause of the procedure.
The procedure is in the program "GP420EQ35FHFOCVEBCR6RWPVQBR ". Its source code starts in line 685 of the (Include) program "GP420EQ35FHFOCVEBCR6RWPVQBR ".
672 'ROUTINE_0003' g_s_is-recno
673 rs_c_false rs_c_false g_s_is-recno
674 changing c_abort.
675 catch cx_foev_error_in_function.
676 perform error_message using 'RSAU' 'E' '510'
677 'ROUTINE_0003' g_s_is-recno
678 rs_c_false rs_c_false g_s_is-recno
679 changing c_abort.
680 endtry.
681 endform.
682 ************************************************************************
683 * routine no.: 0004
684 ************************************************************************
685 form routine_0004
686 changing
687 result type g_s_hashed_cube-FISCPER3
688 returncode like sy-subrc
689 c_t_idocstate type rsarr_t_idocstate
690 c_subrc like sy-subrc
691 c_abort like sy-subrc. "#EC *
692 data:
693 l_t_rsmondata like rsmonview occurs 0 with header line. "#EC *
694
695 try.
696 * init variables
697 move-corresponding g_s_is to comm_structure.
698
699 * fill the internal table "MONITOR", to make monitor entries
700
701 * result value of the routine
>>>> CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
703 EXPORTING
704 I_TIMNM_FROM = '0CALDAY'
705 I_TIMNM_TO = '0FISCPER'
706 I_TIMVL = COMM_STRUCTURE-CALDAY
707 I_FISCVARNT = gd_fiscvarnt
708 IMPORTING
709 E_FISCPER = RESULT.
710 * if the returncode is not equal zero, the result will not be updated
711 RETURNCODE = 0.
712 * if abort is not equal zero, the update process will be canceled
713 ABORT = 0.
714
715 catch cx_sy_conversion_error
716 cx_sy_arithmetic_error.
717 perform error_message using 'RSAU' 'E' '507'
718 'ROUTINE_0004' g_s_is-recno
719 rs_c_false rs_c_false g_s_is-recno
720 changing c_abort.
721 catch cx_foev_error_in_function.
Contents of system fields
Name Val.
SY-SUBRC 0
SY-INDEX 2
SY-TABIX 0
SY-DBCNT 0
SY-FDPOS 65
SY-LSIND 0
SY-PAGNO 0
SY-LINNO 1
SY-COLNO 1
SY-PFKEY 0400
SY-UCOMM OK
SY-TITLE Moniteur - Atelier d'administration
SY-MSGTY E
SY-MSGID RSAU
SY-MSGNO 583
SY-MSGV1 BATVC 0000000000
SY-MSGV2 0PROJECT
SY-MSGV3
SY-MSGV4
Selected variables
No. 23 Type FORM
Name ROUTINE_0004
GD_FISCVARNT
22
00 RS_C_INFO I
4
9
COMM_STRUCTURE-CALDAY
20060303
33333333
20060303
SYST-REPID GP420EQ35FHFOCVEBCR6RWPVQBR 4533345334444454445355555452222222222222 704205135686F365232627061220000000000000
RESULT
000
333
00
You have an update routine in which you are calling FM 'RS_BCT_TIMCONV_PS_CONV'. The parameter E_FISCPER must have the same type as the variable you pass (you can see the data type in the FM definition, transaction SE37). You should do something like the following:
DATA: var TYPE <the same type as E_FISCPER in the FM definition>.
CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
EXPORTING
I_TIMNM_FROM = '0CALDAY'
I_TIMNM_TO = '0FISCPER'
I_TIMVL = COMM_STRUCTURE-CALDAY
I_FISCVARNT = gd_fiscvarnt
IMPORTING
E_FISCPER = var.
result = var.
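The same idea in Python terms (everything below is a hypothetical stand-in, not the real function module): call the strictly typed routine with an intermediate variable of exactly the type it declares, then move the value into the differently typed RESULT field afterwards.

```python
def timconv_ps_conv(calday: str) -> int:
    """Hypothetical stand-in for RS_BCT_TIMCONV_PS_CONV.

    Takes a 0CALDAY value 'YYYYMMDD' and returns a fiscal period as an
    integer YYYYPPP. The callee insists on its declared types, like a
    typed FM parameter.
    """
    assert isinstance(calday, str) and len(calday) == 8
    return int(calday[:4]) * 1000 + int(calday[4:6])

var = timconv_ps_conv("20060303")  # intermediate with the callee's exact type
result = f"{var:07d}"              # then convert into the target field's type
```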
-
Hi guys...
Suppose I have two DataSources that are mapped to an InfoSource, and this InfoSource is mapped to one DSO (all objects up to the DSO are emulated from 3.x to 7.x). When I load data, I assume that I have to use two InfoPackages, and I get data into the DSO in two requests. I have a few questions about this, assuming I have only these two requests in my DSO:
1. When I tried to create a query directly on the DSO in Query Designer, I could not find the InfoObject 0REQUESTID in Query Designer. How can I see the data request by request rather than all together?
2. Suppose the DSO gets data like below:
Fields in DSO: X1, X2, Y1, Y2, Y3 [X1, X2 are characteristics and also keys; Y1, Y2, Y3 are key figures]
Data fed by DataSource 1: X1 X2 Y1
a b 10
Data fed by DataSource 2: X1 X2 Y2 Y3
a b 20 30
So when I load data, I will load it in two requests, and these are the only two requests I have in my DSO. How will the data look in the DSO: does it get stored in two separate rows or a single row? How is it shown in a query result?
If the keys are not matched, how will the data be shown for key figures that are not loaded by that request?
3. I know that in a DSO we have two options, Overwrite/Addition. How will the data loading behave in the following situation:
Datasource 1 feeds like this in Request 1:
X1 X2 Y1
a b 10
Datasource 2 feeds like this in Request 2:
X1 X2 Y1 Y2 Y3
a b 30 40 50
How will the result be shown in our two options, Addition and Overwrite? Will request 2 overwrite or add up the data in Y1?
Thanks.
Hi guys...
Suppose I have two DataSources that are mapped to an InfoSource, and this InfoSource is mapped to one DSO (all objects up to the DSO are emulated from 3.x to 7.x). When I load data, I assume that I have to use two InfoPackages, and I get data into the DSO in two requests. I have a few questions about this, assuming I have only these two requests in my DSO:
1. When I tried to create a query directly on the DSO in Query Designer, I could not find the InfoObject 0REQUESTID in Query Designer. How can I see the data request by request rather than all together?
The request ID is only a part of the new-data table; after activation of your data, the request will get lost. If you want to see what is happening, load your data request by request and activate the data after each request.
2. Suppose the DSO gets data like below:
Fields in DSO: X1, X2, Y1, Y2, Y3 (X1, X2 are characteristics and also keys; Y1, Y2, Y3 are key figures)
Data fed by DataSource 1: X1 X2 Y1
a b 10
Data fed by DataSource 2: X1 X2 Y2 Y3
a b 20 30
So when I load data, I will load it in two requests, and these are the only two requests I have in my DSO. How will the data look in the DSO: does it get stored in two separate rows or a single row? How is it shown in a query result?
If the keys are equal, you will have only one data record in your DSO.
If the keys are not matched, how will the data be shown for key figures that are not loaded by that request?
Then you will have two data records in your DSO.
3. I know that in a DSO we have two options, Overwrite/Addition. How will the data loading behave in the following situation:
Datasource 1 feeds like this in Request 1:
X1 X2 Y1
a b 10
Datasource 2 feeds like this in Request 2:
X1 X2 Y1 Y2 Y3
a b 30 40 50
How will the result be shown in our two options, Addition and Overwrite? Will request 2 overwrite or add up the data in Y1?
If you choose Overwrite, you will get 30; if you choose Addition, you will get 40.
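The Overwrite/Addition semantics above can be sketched in a few lines of Python (a toy model keyed on X1/X2, not real BW code):

```python
def load_request(dso, request, mode="overwrite"):
    """Apply one request to a toy DSO: dict keyed by (X1, X2) -> key figures."""
    for row in request:
        key = (row["X1"], row["X2"])
        figures = {k: v for k, v in row.items() if k not in ("X1", "X2")}
        if key not in dso:
            dso[key] = dict(figures)
        elif mode == "overwrite":
            dso[key].update(figures)          # the later request replaces Y1
        else:  # addition
            for k, v in figures.items():
                dso[key][k] = dso[key].get(k, 0) + v

dso = {}
load_request(dso, [{"X1": "a", "X2": "b", "Y1": 10}])
load_request(dso, [{"X1": "a", "X2": "b", "Y1": 30, "Y2": 40, "Y3": 50}])
# overwrite: one record, Y1 -> 30; with mode="addition" Y1 would be 40
```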
Thanks. -
QUERY PERFORMANCE AND DATA LOADING PERFORMANCE ISSUES
What are the query performance issues we need to take care of? Please explain and let me know the transaction codes. Urgent.
What are the data loading performance issues we need to take care of? Please explain and let me know the transaction codes. Urgent.
Will reward full points.
Regards,
Guru
BW back end
Some Tips -
1)Identify long-running extraction processes on the source system. Extraction processes are performed by several extraction jobs running on the source system. The run-time of these jobs affects the performance. Use transaction code SM37 Background Processing Job Management to analyze the run-times of these jobs. If the run-time of data collection jobs lasts for several hours, schedule these jobs to run more frequently. This way, less data is written into update tables for each run and extraction performance increases.
2)Identify high run-times for ABAP code, especially for user exits. The quality of any custom ABAP programs used in data extraction affects the extraction performance. Use transaction code SE30 ABAP/4 Run-time Analysis and then run the analysis for the transaction code RSA3 Extractor Checker. The system then records the activities of the extraction program so you can review them to identify time-consuming activities. Eliminate those long-running activities or substitute them with alternative program logic.
3)Identify expensive SQL statements. If database run-time is high for extraction jobs, use transaction code ST05 Performance Trace. On this screen, select ALEREMOTE user and then select SQL trace to record the SQL statements. Identify the time-consuming sections from the results. If the data-selection times are high on a particular SQL statement, index the DataSource tables to increase the performance of selection (see no. 6 below). While using ST05, make sure that no other extraction job is running with ALEREMOTE user.
4)Balance loads by distributing processes onto different servers if possible. If your site uses more than one BW application server, distribute the extraction processes to different servers using transaction code SM59 Maintain RFC Destination. Load balancing is possible only if the extraction program allows the option
5)Set optimum parameters for data-packet size. Packet size affects the number of data requests to the database. Set the data-packet size to optimum values for an efficient data-extraction mechanism. To find the optimum value, start with a packet size in the range of 50,000 to 100,000 and gradually increase it. At some point, you will reach the threshold at which increasing packet size further does not provide any performance increase. To set the packet size, use transaction code SBIW BW IMG Menu on the source system. To set the data load parameters for flat-file uploads, use transaction code RSCUSTV6 in BW.
6)Build indexes on DataSource tables based on selection criteria. Indexing DataSource tables improves the extraction performance, because it reduces the read times of those tables.
7)Execute collection jobs in parallel. Like the Business Content extractors, generic extractors have a number of collection jobs to retrieve relevant data from DataSource tables. Scheduling these collection jobs to run in parallel reduces the total extraction time, and they can be scheduled via transaction code SM37 in the source system.
8). Break up your data selections for InfoPackages and schedule the portions to run in parallel. This parallel upload mechanism sends different portions of the data to BW at the same time, and as a result the total upload time is reduced. You can schedule InfoPackages in the Administrator Workbench.
You can upload data from a data target (InfoCube or ODS) to another data target within the BW system. While uploading, you can schedule more than one InfoPackage with different selection options in each one. For example, fiscal year or fiscal year period can be used as selection options. Avoid using parallel uploads for high volumes of data if hardware resources are constrained. Each InfoPackage uses one background process (if scheduled to run in the background) or dialog process (if scheduled to run online) of the application server, and too many processes could overwhelm a slow server.
9). Building secondary indexes on the tables for the selection fields optimizes these tables for reading, reducing extraction time. If your selection fields are not key fields on the table, primary indexes are not much help when accessing data. In this case it is better to create secondary indexes with the selection fields on the associated table using the ABAP Dictionary to improve selection performance.
10)Analyze upload times to the PSA and identify long-running uploads. When you extract the data using PSA method, data is written into PSA tables in the BW system. If your data is on the order of tens of millions, consider partitioning these PSA tables for better performance, but pay attention to the partition sizes. Partitioning PSA tables improves data-load performance because it's faster to insert data into smaller database tables. Partitioning also provides increased performance for maintenance of PSA tables for example, you can delete a portion of data faster. You can set the size of each partition in the PSA parameters screen, in transaction code SPRO or RSCUSTV6, so that BW creates a new partition automatically when a threshold value is reached.
11)Debug any routines in the transfer and update rules and eliminate single selects from the routines. Using single selects in custom ABAP routines for selecting data from database tables reduces performance considerably. It is better to use buffers and array operations. When you use buffers or array operations, the system reads data from the database tables and stores it in the memory for manipulation, improving performance. If you do not use buffers or array operations, the whole reading process is performed on the database with many table accesses, and performance deteriorates. Also, extensive use of library transformations in the ABAP code reduces performance; since these transformations are not compiled in advance, they are carried out during run-time.
12)Before uploading a high volume of transaction data into InfoCubes, activate the number-range buffer for dimension IDs. The number-range buffer is a parameter that identifies the number of sequential dimension IDs stored in the memory. If you increase the number range before high-volume data upload, you reduce the number of reads from the dimension tables and hence increase the upload performance. Do not forget to set the number-range values back to their original values after the upload. Use transaction code SNRO to maintain the number range buffer values for InfoCubes.
13)Drop the indexes before uploading high-volume data into InfoCubes. Regenerate them after the upload. Indexes on InfoCubes are optimized for reading data from the InfoCubes. If the indexes exist during the upload, BW reads the indexes and tries to insert the records according to the indexes, resulting in poor upload performance. You can automate the dropping and regeneration of the indexes through InfoPackage scheduling. You can drop indexes in the Manage InfoCube screen in the Administrator Workbench.
14)IDoc (intermediate document) archiving improves the extraction and loading performance and can be applied on both BW and R/3 systems. In addition to IDoc archiving, data archiving is available for InfoCubes and ODS objects.
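Tip 8 (breaking data selections into parallel InfoPackages) can be illustrated with a toy Python sketch; load_selection is a hypothetical stand-in for one InfoPackage run restricted to a slice of fiscal year periods:

```python
from concurrent.futures import ThreadPoolExecutor

def load_selection(periods):
    """Pretend-load one selection range; returns the packet count (one per period)."""
    return len(periods)

all_periods = [f"2006{p:03d}" for p in range(1, 13)]  # 2006001 .. 2006012
chunks = [all_periods[i::4] for i in range(4)]        # four disjoint "InfoPackages"
with ThreadPoolExecutor(max_workers=4) as pool:       # run the slices in parallel
    loaded = sum(pool.map(load_selection, chunks))
print(loaded)  # 12 -- every period covered exactly once
```

The point of the sketch is the partitioning: the selections must be disjoint and together cover the full range, just as the tip describes for fiscal-year-period selections.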
Hope it Helps
Chetan
@CP.. -
Unable to load CSV data using APEX Data Load using Firefox/Safari on a MAC
I have APEX installed on a Windows XP machine connected to an 11g database on the same Windows XP machine.
While on the windows XP, using IE 7, I am able to successfully load a CSV spreadsheet of data using the APEX Data Load utility.
However, if I switch to my MacBook Pro running OS X Leopard, log in to the same APEX machine using Firefox 2 or 3 or Safari 3, and then try to upload CSV data, it fails on the "Table Properties" step: after it asks you for the name of the new table, the table properties just never appear (they do appear in IE 7 on Windows XP), and if you try to hit the Next button you get the error message: "1 error has occurred. At least one column must be specified to include in new table." Of course, you can't specify any of the columns because there is nothing under Set Table Properties in the interface.
I also tried to load data with Firefox 2, Firefox 3 (beta), and Safari 3.1, but get same failed result on all three. If I return to the Windows XP machine and use IE 7.0, Data Load works just fine. I work in an ALL MAC environment, it was difficult to get a windows machine into my workplace, and all my end users will be using MACs. There is no current version of IE for the MAC, so I have to use Firefox or Safari.
Is there some option in Firefox or Safari that I can turn on so this Data Load feature will work on the MAC?
Thanks for your help. Any assistance appreciated.
Tony
I managed to get this to work by saving the CSV file as Windows CSV (not DOS CSV), which allowed the CSV data to be read by Oracle running on Windows XP. I think the problem had to do with different character sets being used for CSV on the Mac versus CSV on Windows. Maybe if I had created my Windows XP Oracle database with Unicode as the default character set, I would never have experienced this problem.
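A small Python sketch of that workaround, assuming the root cause was classic-Mac CR line endings rather than the character set itself (the file content below is hypothetical): normalize every line ending to the Windows CRLF form before upload.

```python
def to_windows_csv(data: bytes) -> bytes:
    """Normalize any mix of CR / LF / CRLF line endings to CRLF."""
    unified = data.replace(b"\r\n", b"\n").replace(b"\r", b"\n")  # unify first
    return unified.replace(b"\n", b"\r\n")                        # then expand

mac_csv = b"id,name\r1,Tony\r2,Steve"          # classic-Mac style: bare CR
print(to_windows_csv(mac_csv))                  # b'id,name\r\n1,Tony\r\n2,Steve'
```

Unifying to LF first makes the conversion safe to run on files that are already in CRLF form.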
-
Takes Long time for Data Loading.
Hi All,
Good Morning.. I am new to SDN.
Currently I am using the DataSource 0CRM_SRV_PROCESS_H, which contains 225 fields. I am using around 40 of those fields in my report.
Can I hide the remaining fields at the DataSource level itself (transaction RSA6)?
Currently the data load takes a long time to load the data from the PSA to the ODS (ODS 1).
Also, right now I am pulling some data from another ODS (ODS 2) as a lookup. It takes a long time to update the data in the active data table of the ODS.
Can you please suggest how to improve the data loading performance in this case?
Thanks & Regards,
Siva.
Hi....
Yes, you can hide them: just check the Hide box for those fields. Are you on BI 7.0 or BW? Either way, is the number of records huge?
If so, you can split the records and execute in parts; that is, use the same InfoPackage, but execute it with different selections.
Check in ST04 whether there are any locks or lock waits. If so, go to SM37 and check whether any long-running job is there and whether that job is progressing. Double-click the job, copy the PID from the job details, go to ST04, expand the node, and check whether you can find that PID there.
Also check the system log in SM21 and short dumps in ST22.
Now, to improve performance, you can try to increase the virtual memory or servers if possible; that will increase the number of work processes, since if many jobs run at a time there may be no free work processes to proceed.
Regards,
Debjani...... -
Error while starting data loading on InfoPackage
Hi everybody,
I'm new at SAP BW and I'm working in the "Step-By-Step: From Data Model to the BI Application in the web" document from SAP.
I'm having a problem at Chapter 9, item c (Starting Data Load Immediately).
If anyone can help me:
Thanks,
Thiago
Below are the copy of the error from my SAP GUI.
<><><><><><><><><><<><><><><><><><><><><><><><><><><><><><><><>
Runtime Errors MESSAGE_TYPE_X
Date and Time 19.01.2009 14:41:22
Short text
The current application triggered a termination with a short dump.
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact your SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
Short text of error message:
Batch - Manager for BW Processes ***********
Long text of error message:
Technical information about the message:
Message class....... "RSBATCH"
Number.............. 000
Variable 1.......... " "
Variable 2.......... " "
Variable 3.......... " "
Variable 4.......... " "
How to correct the error
Probably the only way to eliminate the error is to correct the program.
If the error occurs in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"MESSAGE_TYPE_X" " "
"SAPLRSBATCH" or "LRSBATCHU01"
"RSBATCH_START_PROCESS"
If you cannot solve the problem yourself and want to send an error
notification to SAP, include the following information:
1. The description of the current problem (short dump)
To save the description, choose "System->List->Save->Local File
(Unconverted)".
2. Corresponding system log
Display the system log by calling transaction SM21.
Restrict the time interval to 10 minutes before and five minutes
after the short dump. Then choose "System->List->Save->Local File
(Unconverted)".
3. If the problem occurs in a program of your own or a modified SAP
program: The source code of the program
In the editor, choose "Utilities->More
Utilities->Upload/Download->Download".
4. Details about the conditions under which the error occurred or which
actions and input led to the error.
System environment
SAP-Release 701
Application server... "sun"
Network address...... "174.16.5.194"
Operating system..... "Windows NT"
Release.............. "5.1"
Hardware type........ "2x Intel 801586"
Character length.... 8 Bits
Pointer length....... 32 Bits
Work process number.. 2
Shortdump setting.... "full"
Database server... "localhost"
Database type..... "ADABAS D"
Database name..... "NSP"
Database user ID.. "SAPNSP"
Terminal.......... "sun"
Char.set.... "English_United State"
SAP kernel....... 701
created (date)... "Jul 16 2008 23:09:09"
create on........ "NT 5.2 3790 Service Pack 1 x86 MS VC++ 14.00"
Database version. "SQLDBC 7.6.4.014 CL 188347 "
Patch level. 7
Patch text.. " "
Database............. "MaxDB 7.6, MaxDB 7.7"
SAP database version. 701
Operating system..... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2, Windows
NT 6.0"
Memory consumption
Roll.... 8112
EM...... 11498256
Heap.... 0
Page.... 65536
MM Used. 6229800
MM Free. 1085264
User and Transaction
Client.............. 001
User................ "THIAGO"
Language key........ "E"
Transaction......... "RSA1 "
Transactions ID..... "CD47E6DDD55EF199B4E6001B782D539C"
Program............. "SAPLRSBATCH"
Screen.............. "SAPLRSS1 2500"
Screen line......... 7
Information on where terminated
Termination occurred in the ABAP program "SAPLRSBATCH" - in
"RSBATCH_START_PROCESS".
The main program was "RSAWBN_START ".
In the source code you have the termination point in line 340
of the (Include) program "LRSBATCHU01".
Source Code Extract
Line SourceCde
310 endif.
311 l_lnr_callstack = l_lnr_callstack - 1.
312 endloop. " at l_t_callstack
313 endif.
314
315 *---- Eintrag für RSBATCHHEADER -
316 l_s_rsbatchheader-batch_id = i_batch_id.
317 call function 'GET_JOB_RUNTIME_INFO'
318 importing
319 jobcount = l_s_rsbatchheader-jobcount
320 jobname = l_s_rsbatchheader-jobname
321 exceptions
322 no_runtime_info = 1
323 others = 2.
324 call function 'TH_SERVER_LIST'
325 tables
326 list = l_t_server
327 exceptions
328 no_server_list = 1
329 others = 2.
330 data: l_myname type msname2.
331 call 'C_SAPGPARAM' id 'NAME' field 'rdisp/myname'
332 id 'VALUE' field l_myname.
333 read table l_t_server with key
334 name = l_myname.
335 if sy-subrc = 0.
336 l_s_rsbatchheader-host = l_t_server-host.
337 l_s_rsbatchheader-server = l_myname.
338 refresh l_t_server.
339 else.
>>>>> message x000.
341 endif.
342 data: l_wp_index type i.
343 call function 'TH_GET_OWN_WP_NO'
344 importing
345 subrc = l_subrc
346 wp_index = l_wp_index
347 wp_pid = l_s_rsbatchheader-wp_pid.
348 if l_subrc <> 0.
349 message x000.
350 endif.
351 l_s_rsbatchheader-wp_no = l_wp_index.
352 l_s_rsbatchheader-ts_start = l_tstamps.
353 l_s_rsbatchheader-uname = sy-uname.
354 l_s_rsbatchheader-module_name = l_module_name.
355 l_s_rsbatchheader-module_type = l_module_type.
356 l_s_rsbatchheader-pc_variant = i_pc_variant.
357 l_s_rsbatchheader-pc_instance = i_pc_instance.
358 l_s_rsbatchheader-pc_logid = i_pc_logid.
359 l_s_rsbatchheader-pc_callback = i_pc_callback_at_end.
Hi,
I am also getting this issue. Kindly see the short dump description below.
Short text
The current application triggered a termination with a short dump.
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact your SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
Short text of error message:
Variant RSPROCESS0000000000705 does not exist
Long text of error message:
Diagnosis
You selected variant 00000000705 for program RSPROCESS.
This variant does not exist.
System Response
Procedure
Correct the entry.
Technical information about the message:
Message class....... "DB"
Number.............. 612
Variable 1.......... "&0000000000705"
Variable 2.......... "RSPROCESS"
Variable 3.......... " "
Variable 4.......... " "
How to correct the error
Probably the only way to eliminate the error is to correct the program.
If the error occurs in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"MESSAGE_TYPE_X" " "
"SAPLRSPC_BACKEND" or "LRSPC_BACKENDU05"
"RSPC_PROCESS_FINISH"
If you cannot solve the problem yourself and want to send an error
notification to SAP, include the following information:
1. The description of the current problem (short dump)
To save the description, choose "System->List->Save->Local File
(Unconverted)".
2. Corresponding system log
Display the system log by calling transaction SM21.
Restrict the time interval to 10 minutes before and five minutes
after the short dump. Then choose "System->List->Save->Local File
(Unconverted)".
3. If the problem occurs in a program of your own or a modified SAP
program: The source code of the program
In the editor, choose "Utilities->More
Utilities->Upload/Download->Download".
4. Details about the conditions under which the error occurred or which
actions and input led to the error.
System environment
SAP-Release 701
Application server... "CMCBIPRD"
Network address...... "192.168.50.12"
Operating system..... "Windows NT"
Release.............. "6.1"
Hardware type........ "16x AMD64 Level"
Character length.... 16 Bits
Pointer length....... 64 Bits
Work process number.. 0
Shortdump setting.... "full"
Database server... "CMCBIPRD"
Database type..... "MSSQL"
Database name..... "BIP"
Database user ID.. "bip"
Terminal.......... "CMCBIPRD"
Char.set.... "C"
SAP kernel....... 701
created (date)... "Sep 9 2012 23:43:54"
create on........ "NT 5.2 3790 Service Pack 2 x86 MS VC++ 14.00"
Database version. "SQL_Server_8.00 "
Patch level. 196
Patch text.. " "
Database............. "MSSQL 9.00.2047 or higher"
SAP database version. 701
Operating system..... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2, Windows
NT 6.0, Windows NT 6.1, Windows NT 6.2"
Memory consumption
Roll.... 16192
EM...... 4189840
Heap.... 0
Page.... 16384
MM Used. 2143680
MM Free. 2043536
User and Transaction
Client.............. 001
User................ "BWREMOTE"
Language Key........ "E"
Transaction......... " "
Transactions ID..... "9C109BE2C9FBF18BBD4BE61F13CE9693"
Program............. "SAPLRSPC_BACKEND"
Screen.............. "SAPMSSY1 3004"
Screen Line......... 2
Information on caller of Remote Function Call (RFC):
System.............. "BIP"
Database Release.... 701
Kernel Release...... 701
Connection Type..... 3 (2=R/2, 3=ABAP System, E=Ext., R=Reg. Ext.)
Call Type........... "asynchron without reply and transactional (emode 0, imode
0)"
Inbound TID.........." "
Inbound Queue Name..." "
Outbound TID........." "
Outbound Queue Name.." "
Information on where terminated
Termination occurred in the ABAP program "SAPLRSPC_BACKEND" - in
"RSPC_PROCESS_FINISH".
The main program was "SAPMSSY1 ".
In the source code you have the termination point in line 75
of the (Include) program "LRSPC_BACKENDU05".
Source Code Extract
Line SourceCde
45 l_t_info TYPE rs_t_rscedst,
46 l_s_info TYPE rscedst,
47 l_s_mon TYPE rsmonpc,
48 l_synchronous TYPE rs_bool,
49 l_sync_debug TYPE rs_bool,
50 l_eventp TYPE btcevtparm,
51 l_eventno TYPE rspc_eventno,
52 l_t_recipients TYPE rsra_t_recipient,
53 l_s_recipients TYPE rsra_s_recipient,
54 l_sms TYPE rs_bool,
55 l_t_text TYPE rspc_t_text.
56
57 IF i_dump_at_error = rs_c_true.
58 * ==== Dump at error? => Recursive Call catching errors ====
59 CALL FUNCTION 'RSPC_PROCESS_FINISH'
60 EXPORTING
61 i_logid = i_logid
62 i_chain = i_chain
63 i_type = i_type
64 i_variant = i_variant
65 i_instance = i_instance
66 i_state = i_state
67 i_eventno = i_eventno
68 i_hold = i_hold
69 i_job_count = i_job_count
70 i_batchdate = i_batchdate
71 i_batchtime = i_batchtime
72 EXCEPTIONS
73 error_message = 1.
74 IF sy-subrc <> 0.
>>> MESSAGE ID sy-msgid TYPE 'X' NUMBER sy-msgno
76 WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
77 ELSE.
78 EXIT.
79 ENDIF.
80 ENDIF.
81 * ==== Cleanup ====
82 COMMIT WORK.
83 * ==== Get Chain ====
84 IF i_chain IS INITIAL.
85 SELECT SINGLE chain_id FROM rspclogchain INTO l_chain
86 WHERE log_id = i_logid.
87 ELSE.
88 l_chain = i_chain.
89 ENDIF.
90 * ==== Lock ====
91 * ---- Lock process ----
92 DO.
93 CALL FUNCTION 'ENQUEUE_ERSPCPROCESS'
94 EXPORTING
If we do this: in table RSSDLINIT, under OLTPSOURCE, enter the name of the DataSource, and under LOGSYS enter the name of the source system, then delete the entry for that InfoPackage. Will the process chain then run successfully so that the short dump no longer appears, or not? Kindly give a detailed explanation of table RSSDLINIT.
Regards,
poluru -
Inventory snapshot scenario - Data load frequency?
Hi,
I have gone through "How to" document for inventory snapshot extraction.
Here are few questions for which I could not find answers in the document -
1. Process 1 loads initial stock using BX data source into ODS.
2. Then Process 2 - I assume there are two steps in this -
a) Init using BF/UM Data source - If this is done, historical movements get added to initial stock in this ODS, which yields wrong results. So, is mapping to ODS required while doing init from BF/UM data sources? Init(Process 3) adds stock into snapshot cube for all the months from date of movement to current(system) date.
b) Delta using BF/UM Data source - Adding delta to snapshot ODS makes sense. Is it also required to load this parallelly to snapshot cube(as mentioned in process 3)?
No intention to confuse anybody; I just want to know which of the following data load scenarios yields correct results. The document is not self-explanatory on this topic -
I assume that, 0IC_C03 is being updated in parallel.
1. Initial stock load using BX datasource into ODS. Then Initialize BF/UM data source into snapshot cube ONLY(Does this actually store historical snapshot?). Then do delta from BF/UM to ODS ONLY. Then follow rest of the steps.
2. Initial stock load using BX datasource into ODS. Then Initialize BF/UM data source into snapshot cube and ODS. Then do delta from BF/UM to snapshot cube AND ODS. Then follow rest of the steps.
3. Initial stock load using BX datasource into ODS. Initialize BF/UM data source WITHOUT DATA TRANSFER. Then start delta into snapshot cube AND ODS.
Any help on this will be greatly appreciated.
Thanks and Regards,
Anup
Hi,
Ensure that the 3 key figures are included in the communication structure (of course they will be included, as it is a standard DataSource). When you create the update rules, set these 3 key figures to "no update" and then populate the 3 KFs using a routine.
Hope this helps.
Thanks, Ramoji. -
Data load from variable file names
I have multiple files that I want to load into a cube, each starting with the same 5 characters but ending differently, e.g. GM1010104 and GM1010204. What's the best option for a MaxL script to automate this data load? Can you use a wildcard name in the script to pick up anything starting with GM101****?
No - you need to specify the file name as it appears properly (I've never tried it but I am pretty sure it wouldn't work). One solution to this problem though is to have a shell script (or DOS commands) auto-generate an ESSCMD/MaxL script based on the files that exist in a directory. Most scripting environments should allow you to loop through a list of files that match some pattern - you can then create a script with the results and execute it. Another option is to build a MaxL script that accepts a parameter (file name) and have a shell script call it as it loops through the file list. Hope that helps.
Regards,
Jade
----------------------------------
Jade Cole
Senior BI Consultant
Clarity [email protected]
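The auto-generation approach Jade describes can be sketched in a small shell wrapper. This is only an illustration, not code from the thread: the application/database names (Sample.Basic), the rules file name (LdRule), and the file locations are placeholder assumptions, and the demo creates two empty sample files so the loop has something to match.

```shell
#!/bin/sh
# Demo setup: create two empty files matching the GM101* pattern.
# In real use these extract files would already exist in the load directory.
touch GM1010104 GM1010204

# MaxL cannot expand wildcards in 'import data from', so generate a
# MaxL script with one import statement per matching file instead.
out=load_all.msh
: > "$out"
for f in GM101*; do
  printf "import database Sample.Basic data from data_file '%s' using server rules_file 'LdRule' on error write to 'load.err';\n" "$f" >> "$out"
done

cat "$out"
# essmsh load_all.msh   # execute against Essbase (requires the MaxL client)
```

The generated load_all.msh then contains one import statement per matching file and can be handed to essmsh in a scheduled job.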
-
Troubleshooting 9.3.1 Data Load
I am having a problem with a data load into Essbase 9.3.1 from a flat, pipe-delimited text file, loading via a load rule. I can see an explicit record in the file but the results on that record are not showing up in the database.
* I made a special one-off file with the singular record in question and the data loads properly and is reflected in the database. The record itself seems to parse properly for load.
* I have searched the entire big file (230Mb) for the same member combination, but only come up with this one record, so it does not appear to be a "last value in wins" issue.
* Most other data (610k+ rows) appears to be loading properly, so the fields, in general, are being properly parsed out in the load rule. Additionally, months of a given item are on separate rows, and other rows of the same item are loading properly and being reflected in the database. As well as other items are being loaded properly in the months where this data loads to, so, it is not a metadata-not-existing issue.
* The load is 100% successful according to the non-existent error file. Also, loading the file interactively results in the file showing up under "loaded successfully" (no errors).
NOTE:
The file's last column does contain item descriptions which may include special characters including periods and quotes and other special characters. The load rule moves the description field to the earlier in the columns, but the file itself has it last.
QUESTION:
Is it possible that a special character (quote??) in a preceding record is causing the field parsing to include the CR/LF, and therefore the next record, in one record? I keep thinking that if the record seems to be fine alone, but is not fine where it sits amongst other records, it may have to do with preceding or subsequent records.
THOUGHTS??
Thanks Glenn. I was too busy looking for explicit members that I neglected thinking through implicit members. I guess I was thinking that implied members don't work if you have a rules file that parses out columns... that a missing member would just error out a record instead of using the last known value. In fact, I thought that (last known value) only worked if you didn't use a load rule.
I would prefer some switch in Essbase that requires keys in all fields in a load rule or allows last known value.
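One quick way to test the "swallowed record" theory before touching the load rule is to count the fields in every row of the pipe-delimited file: a record that absorbed its neighbor shows roughly double the field count. A minimal awk sketch follows; the sample rows and the expected field count of 4 are invented for the demo, so substitute your own file and column count.

```shell
#!/bin/sh
# Demo input: three pipe-delimited rows; the second has extra fields,
# standing in for a record that swallowed the following record.
cat > extract.txt <<'EOF'
A|Jan|100|desc one
B|Feb|200|desc two|C|Mar|300
D|Apr|400|desc three
EOF

# Flag every row whose field count differs from the expected 4.
awk -F'|' 'NF != 4 { printf "line %d has %d fields\n", NR, NF }' extract.txt
```

For the demo input above this prints `line 2 has 7 fields`, pointing straight at the row (and therefore the preceding record) worth inspecting for stray quotes or CR/LF.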