Modify pa_online_expenditure_types_v to pass in transaction date
I need to pass in either the transaction date or week-ending date from the OTL timecard to customize the view pa_online_expenditure_types_v to use information from HR. How do I do this?
Do I have to modify the timecard layout? Here is the current version of my timecard layout for the expenditure_type.
BEGIN HXC_LAYOUT_COMPONENTS "Projects Timecard Layout -XX- Expenditure Type"
OWNER = "ORACLE"
COMPONENT_VALUE = "EXPENDITURETYPE"
REGION_CODE = "HXC_CUI_TIMECARD"
REGION_CODE_APP_SHORT_NAME = "HXC"
ATTRIBUTE_CODE = "HXC_TIMECARD_EXPTYPE"
ATTRIBUTE_CODE_APP_SHORT_NAME = "HXC"
SEQUENCE = "230"
COMPONENT_DEFINITION = "LOV"
RENDER_TYPE = "WEB"
PARENT_COMPONENT =
"Projects Timecard Layout -XX- Day Scope Building blocks for worker timecard matrix"
LAST_UPDATE_DATE = "2004/05/24"
BEGIN HXC_LAYOUT_COMP_QUALIFIERS
"Projects Timecard Layout -XX- Expenditure Type"
OWNER = "ORACLE"
QUALIFIER_ATTRIBUTE_CATEGORY = "LOV"
QUALIFIER_ATTRIBUTE1 = "ExpenditureType2LOVVO"
QUALIFIER_ATTRIBUTE2 = "NODISPLAYCACHE"
QUALIFIER_ATTRIBUTE3 = "HXC_CUI_EXPTYPE2_LOV"
QUALIFIER_ATTRIBUTE4 = "809"
QUALIFIER_ATTRIBUTE5 = "1"
QUALIFIER_ATTRIBUTE6 =
"HxcCuiExptypeDispColumn|EXPTYPE-DISPLAY|CRITERIA|N|HxcCuiExptypeExpType|EXPTYPE|RESULT|N|HxcCuiExptypeDispColumn|EXPTYPE-DISPLAY|RESULT|N|HxcCuiExptypeSysLinkFunc|SYSLINKFUNC|RESULT|N"
QUALIFIER_ATTRIBUTE7 = "SYSLINKFUNC|SystemLinkageFunction"
QUALIFIER_ATTRIBUTE8 = "DisplayColumn"
QUALIFIER_ATTRIBUTE9 = "ExpenditureType"
QUALIFIER_ATTRIBUTE10 =
"oracle.apps.hxc.selfservice.timecard.server.ExpenditureType2LOVVO"
QUALIFIER_ATTRIBUTE11 =
"TIMECARD_BIND_END_DATE|TIMECARD_BIND_END_DATE|TIMECARD_BIND_START_DATE|TIMECARD_BIND_END_DATE|TIMECARD_BIND_END_DATE|TIMECARD_BIND_START_DATE"
QUALIFIER_ATTRIBUTE17 = "OraTableCellText"
QUALIFIER_ATTRIBUTE20 = "N"
QUALIFIER_ATTRIBUTE21 = "Y"
QUALIFIER_ATTRIBUTE22 = "L"
QUALIFIER_ATTRIBUTE25 = "FLEX"
QUALIFIER_ATTRIBUTE26 = "PROJECTS"
QUALIFIER_ATTRIBUTE27 = "Attribute3"
QUALIFIER_ATTRIBUTE28 = "EXPTYPE"
LAST_UPDATE_DATE = "2004/05/24"
END HXC_LAYOUT_COMP_QUALIFIERS
END HXC_LAYOUT_COMPONENTS
Hi,
Step 4 is wrong:
4. Attached the Custom VO to standard AM- TimecardAM in location-oracle.apps.hxc.selfservice.timecard.server.TimecardAM and in LovAM location-oracle.apps.hxc.selfservice.configui.server.LovAM
You cannot attach an extended VO directly to the standard AM. A VO extension requires a .jpx import; that import is all you have to do, the substitution takes effect on its own, and there is no need to modify the standard AM.
Please check the steps for VO extension at http://oracle.anilpassi.com/extend-vo-in-oa-framwork-2.html
Thanks,
Pratap
Similar Messages
-
Pass transaction data from an InfoProvider to the BPC within BW
Hi Gurus,
I need to pass transaction data from an InfoProvider to the BPC Real-Time InfoCube, but I need to do it from BW, not from the BPC Excel interface.
The reason is that I want to run the whole process in one step, and at the beginning of the process the end user has to enter some parameters in BW, as we are launching the process chain from a transaction.
Therefore I copied the process chain /CPMB/LOAD_INFOPROVIDER and tried to supply the different parameters that are normally delivered by the prompt in the Excel interface.
Can anybody help? Does anybody know which parameters have to be supplied for each process type, in what form, and whether all the process types are needed?
Using a normal BW transformation is also not an option, as we don't want to leave plan mode and stop people from planning.
It is very important for my end user to do everything in one step, without having to enter the BPC Excel interface.
Thanks in advance.
Cheers,
Àlex
Hi Frank,
I have checked and we have many processes in SM50 available for the job to use. Sometimes there are no other jobs running in the BW System except for the 2 jobs but still the 2nd job waits for the first to finish.
I did another check with transaction code SM58 in the BW system. It shows all the transactional RFC queue entries generated by the 2 jobs, and I saw that only 2 tRFC queues are able to execute at one time. I am not sure if this is the problem, but is it possible to set the number of queues that can execute at one time to more than 2? If yes, where can I find such a setting?
This is quite urgent as this problem causes all my jobs to end very late.
Thanks.
Shunhui. -
Error while loading Transactional data from NW BW Infoprovider
Hi,
I am trying to load transactional data using the delivered "Transactional data from NW BW InfoProvider" package and getting the error message "Error occurs when loading transaction data from other cube".
Below is the transformation file content
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = TAB
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=NO
CREDITPOSITIVE=YES
MAXREJECTCOUNT= 9999999999999
ROUNDAMOUNT=
*MAPPING
ACCOUNT = 0ACCOUNT
BUSINESSTYPE = *NEWCOL(NOBTYPE)
BWSRC = *NEWCOL(R3)
COMPANYCODE = 0COMP_CODE
DATASRC = *NEWCOL(R3)
FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
GAME = *NEWCOL(NOGAME)
INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR) ; 0CURRENCY)
PROBABILITY = *NEWCOL(NOPROB)
PRODUCT = *NEWCOL(NOPROD)
PROFITCENTER = 0PROFIT_CTR
PROJECT = *NEWCOL(NOPROJECT)
TIME = 0FISCPER
VERSION = *NEWCOL(REV0)
WEEKS = *NEWCOL(NOWEEK)
SIGNEDDATA= 0AMOUNT
*CONVERSION
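As a rough editorial sketch (this is Python, not the BPC transformation engine, and the helper names are invented): the *MAPPING rules in the file above behave like this on one source record, with *NEWCOL(x) filling a target column with the constant x and *IF(field = *STR() then default; field) substituting a default when the source field is empty.

```python
def apply_mapping(src: dict) -> dict:
    """Apply a simplified analogue of the *MAPPING rules above to one record."""
    newcol = lambda const: const  # *NEWCOL(...): constant column
    # *IF(f = *STR() then *STR(d); f): default when empty
    iff = lambda val, default: default if val == "" else val
    return {
        "ACCOUNT": src["0ACCOUNT"],
        "BUSINESSTYPE": newcol("NOBTYPE"),
        "FUNCTIONALAREA": iff(src["0FUNC_AREA"], "NOFA"),
        "INPUTCURRENCY": iff(src["0CURRENCY"], "NOCURR"),
        "SIGNEDDATA": src["0AMOUNT"],
    }

# Hypothetical source record: empty functional area gets the NOFA default.
record = {"0ACCOUNT": "400100", "0FUNC_AREA": "", "0CURRENCY": "EUR", "0AMOUNT": 125.0}
print(apply_mapping(record))
```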
Below is the Error Log
/CPMB/MODIFY completed in 0 seconds
/CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
InforProvide=ZPCA_C01
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\R3_TD_P&L.xls
CLEARDATA= Yes
RUNLOGIC= No
CHECKLCK= Yes
[Messages]
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other cube
Application: ACTUALREPORTING Package status: ERROR
Has anyone had this issue before, or is there something that needs to be looked at in the BW-delivered task?
Any input will be appreciated.
Sanjay
Hello Guru,
Currently I am getting the below error while loading cost center master data from BW to BPC.
Task name MASTER DATA SOURCE:
Record count: 189
Task name TEXT SOURCE:
Record count: 189
Task name CONVERT:
No 1 Round:
Info provider is not available
Application: ZRB_SALES_CMB Package status: ERROR
Can anybody tell me if I have missed anything?
Regards,
BI NEW
Edited by: BI NEW on Feb 23, 2011 12:25 PM -
Error while loading the transaction data ..
Hello Expert
I need your help in resolving this error. I encountered the error below while uploading transaction data from the cube 0PP_C14, even though my transformation file shows the records as accepted.
/CPMB/MODIFY completed in 0 seconds
/CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
InforProvide = 0PP_C14
SELECTION = <?xml version="1.0" encoding="utf-16"?><Selections xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><Selection Type="Selection"><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-DESIGNER
</LowValue><HighValue /></Attribute><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-PLANK
</LowValue><HighValue /></Attribute><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-HALF-WIDTH
</LowValue><HighValue /></Attribute><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-NON-STD-WIDTH
</LowValue><HighValue /></Attribute><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-RECESS
</LowValue><HighValue /></Attribute><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-STD-WIDTH
</LowValue><HighValue /></Attribute><Attribute><ID>0BASE_UOM</ID><Operator>1</Operator><LowValue>4FT</LowValue><HighValue /></Attribute><Attribute><ID>0PLANT__0SALESORG</ID><Operator>1</Operator><LowValue>SILD</LowValue><HighValue /></Attribute><Attribute><ID>0PLANT__0SALESORG</ID><Operator>1</Operator><LowValue>SILE</LowValue><HighValue /></Attribute></Selection><Selection Type="FieldList"><FieldID>0EXTMATLGRP</FieldID><FieldID>0FISCPER</FieldID><FieldID>0PLANT</FieldID><FieldID>0PLANT__0SALESORG</FieldID></Selection></Selections>
TRANSFORMATION = \ROOT\WEBFOLDERS\SIL\MIS\DATAMANAGER\TRANSFORMATIONFILES\COST SHEET\0PP_C14_FOR_DOMESTICS_EXPORT.xls
TARGETMODE = Yes
RUNLOGIC = No
CHECKLCK = No
[Message]
Task name CONVERT:
No 1 Round:
An exception with the type CX_ST_MATCH_TYPE occurred, but was neither handled locally, nor declared in a RAISING clause
System expected a value for the type g
model: MIS. Package status: ERROR
Does anyone have a solution for it?
Hi NEERAJ DEPURA,
I think it is caused by incorrect character entries you made in the 'set selection' screen when you were filtering data, because I was able to reproduce the error in the following way:
1. I changed my input language to Chinese.
2. In the 'set selection' screen, I selected 0COMP_CODE as the attribute, with operator '=', and entered the value 'SGA01'.
3. I saved the selection and ran the package. It failed with the same error you got.
In my cube the value 'SGA01' is stored in English, so if it is entered in Chinese in the 'set selection' screen, a mismatch occurs and results in the error.
If I change my input language to English and enter the value 'SGA01' again in the 'set selection' screen, the error does not occur.
Please check the values you entered in the 'set selection' screen.
Hope it solves your issue -
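A small editorial illustration of the mismatch described above (assuming the Chinese IME produced fullwidth characters, which is one plausible cause): fullwidth letters look like their ASCII counterparts but are different Unicode code points, so an equality filter finds nothing.

```python
import unicodedata

stored = "SGA01"       # ASCII, as stored in the cube
typed = "ＳＧＡ０１"    # fullwidth forms, as a CJK input method may produce

print(stored == typed)  # the visually similar strings are unequal

# NFKC compatibility normalization folds fullwidth forms back to ASCII.
normalized = unicodedata.normalize("NFKC", typed)
print(normalized == stored)
```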
Error While importing the transaction data
All SAP BPC Gurus,
I need your help in resolving this error. I encountered the error below while uploading (importing) transaction data into a (non-reporting) application. Would you please help me resolve it? I don't know if I'm doing anything wrong or if anything is wrong with the setup.
I used the Data Manager in the BPC Excel client and ran the IMPORT Transaction package.
/CPMB/MODIFY completed in 0 seconds
/CPMB/CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
FILE= DATAMANAGER\DATAFILES\IFP_BPCUSER121\rate data file.csv
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\SYSTEM FILES\IMPORT.XLS
CLEARDATA= No
RUNLOGIC= No
CHECKLCK= No
[Messages]
Task name CONVERT:
XML file (...ANAGER\TRANSFORMATIONFILES\SYSTEM FILES\IMPORT.TDM) is empty or is not found
Cannot find document/directory
Application: PLANNING Package status: ERROR
Are you using the standard "Import" data package?
Check the code in the Advanced tab of the Import data package.
Check that your transformation file is in the correct format.
The code in the Advanced tab should be as below:
PROMPT(INFILES,,"Import file:",)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(RADIOBUTTON,%CLEARDATA%,"Select the method for importing the data from the source file to the destination database",0,{"Merge data values (Imports all records, leaving all remaining records in the destination intact)","Replace && clear datavalues (Clears the data values for any existing records that mirror each entity/category/time combination defined in the source, then imports the source records)"},{"0","1"})
PROMPT(RADIOBUTTON,%RUNLOGIC%,"Select whether to run default logic for stored values after importing",1,{"Yes","No"},{"1","0"})
PROMPT(RADIOBUTTON,%CHECKLCK%,"Select whether to check work status settings when importing data.",1,{"Yes, check for work status settings before importing","No, do not check work status settings"},{"1","0"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%ACTNO%,%INCREASENO%)
TASK(/CPMB/CONVERT,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/CONVERT,ACT_FILE_NO,%ACTNO%)
TASK(/CPMB/CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/CONVERT,SUSER,%USER%)
TASK(/CPMB/CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/CONVERT,SAPP,%APP%)
TASK(/CPMB/CONVERT,FILE,%FILE%)
TASK(/CPMB/CONVERT,CLEARDATA,%CLEARDATA%)
TASK(/CPMB/LOAD,INPUTNO,%TEMPNO1%)
TASK(/CPMB/LOAD,ACT_FILE_NO,%ACTNO%)
TASK(/CPMB/LOAD,RUNLOGIC,%RUNLOGIC%)
TASK(/CPMB/LOAD,CHECKLCK,%CHECKLCK%)
TASK(/CPMB/LOAD,CLEARDATA,%CLEARDATA%) -
Upload transaction data for transactions like KSU2/KSU5,KB31N
We are trying to upload transaction data for transactions like KSU2/KSU5 and KB31N using batch data processing within LSMW. We are facing a couple of problems:
We have to upload data that has a single header record with 200-300 line-item records. How do we define target structures to represent the header and line items? Currently we are able to upload a header with 1-2 line items, depending on the recording; the present recording only lets us load 1 line item per document.
Can Excel be used instead of a text file for uploading data in LSMW?
During processing of the transaction, the user clicks a button and checks a value; based on this value the future course of action is determined. How can we do the BDC recording, or customize it (write custom code)?
If errors occur during the upload, will processing stop at that line item, or will it skip the error record and continue processing the remaining data?
How can we write custom code to modify the data during the BDC session, and not before the session?
Immediate pointers would be very useful. Let me know if more information is required.
Thanks,
Mehfuz
Hi Ali,
As for your question about Excel upload via LSMW, you may find the links below useful:
[Upload of excel file in LSMW]
[Upload excel by LSMW for FV60 ?]
[Upload Excel file through LSMW]
[How upload an excel file using an LSMW.]
regards,
koolspy. -
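The one-header-to-many-line-items layout asked about in the LSMW thread above can be sketched as follows. This is an editorial Python sketch under an invented file format ("H" lines are headers, "I" lines are items, tab-separated), not LSMW target-structure syntax; in LSMW itself the same grouping is done by defining a header target structure with the item structure as its child.

```python
def group_documents(lines):
    """Group a flat upload file into documents: one header, many items."""
    docs = []
    for line in lines:
        kind, *fields = line.rstrip("\n").split("\t")
        if kind == "H":
            # A header line starts a new document.
            docs.append({"header": fields, "items": []})
        elif kind == "I":
            if not docs:
                raise ValueError("item line before any header")
            # Item lines attach to the most recent header.
            docs[-1]["items"].append(fields)
    return docs

sample = [
    "H\tDOC1\t2011-02-23",
    "I\t10\t100.00",
    "I\t20\t250.00",
    "H\tDOC2\t2011-02-24",
    "I\t10\t75.00",
]
docs = group_documents(sample)
print(len(docs), len(docs[0]["items"]))  # 2 documents, first with 2 items
```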
Need to pass period start date as parameter to ProjectLOVVO in OTL module
Hi All,
I have a requirement of passing period start date as a parameter to ProjectLOVVO.
It will check against the projects base view PA_ONLINE_PROJECTS_V and display only those
projects in the list where the project completion date and the
resource assignment end date are greater than the period start date.
This VO is in Oracle Time and Labour Module. Navigation is SelfService Time> Create Timecard Screen.
The screen is based on AK Developer.
I have tried to follow the steps in the document hxcconfiguiwp.pdf during the development work.
The steps are listed below, but it is still not working.
Please go through the steps and, if you have worked in this module, please provide your valuable
suggestions as soon as possible.
Steps:
1. Modified the customizable view PA_ONLINE_PROJECTS_V to
keep the assignment end date in the select query.
2. Created a custom VO, XXProjectLOVVO, using JDeveloper, modeled on ProjectLOVVO.
Its SQL query is:
SELECT project_number projectnumber,
project_name projectname,
project_details projectdetails,
project_id projectid,
start_date,
completion_date,
carrying_out_organization_id
FROM pa_online_projects_v
WHERE TRUNC(completion_date) >= TRUNC(FND_DATE.CANONICAL_TO_DATE(:1))
AND TRUNC(assignment_end_date) >= TRUNC(FND_DATE.CANONICAL_TO_DATE(:2))
3. Placed the VO files (XXProjectLOVVO.xml and XXProjectLOVVOImpl.class) in the custom directory on the server.
In location- apps_st/comn/java/classes/custom/oracle/apps/xxhxc/selfservice/timecard/server
4. Attached the Custom VO to standard AM- TimecardAM in location-oracle.apps.hxc.selfservice.timecard.server.TimecardAM
and in LovAM location-oracle.apps.hxc.selfservice.configui.server.LovAM
5. In AK Developer module created LOV Attributes 'XX_HXC_CUI_PROJECT_CO_ORG_ID', 'XX_HXC_CUI_PROJECT_END_DATE',
'XX_HXC_CUI_PROJECT_ID','XX_HXC_CUI_PROJECT_NAME',
'XX_HXC_CUI_PROJECT_NUMBER', 'XX_HXC_CUI_PROJECT_START_DATE'
and created a LOV region 'XX_HXC_CUI_PROJECT_LOV' attached to the database object 'ICX_PROMPTS',
and added all the above attributes as region items.
Added the LovAM location in place of the AM and the LovCO location in place of the CO.
6. Migrated the LOV with the below code
begin
hxc_lov_migration.migrate_lov_region
(p_region_code => 'XX_HXC_CUI_PROJECT_LOV'
,p_region_app_short_name => 'HXC'
,p_force => 'Y');
commit;
end;
7. Copied the hxczzhxclayt0019.ldt LDT file from the server, renamed it to xhxczzhxclayt0019.ldt,
and modified the layout section for the project LOV:
BEGIN HXC_LAYOUT_COMPONENTS "Custom Projects Timecard Layout - Project"
OWNER = "CUSTOM"
COMPONENT_VALUE = "PROJECT"
REGION_CODE = "HXC_CUI_TIMECARD"
REGION_CODE_APP_SHORT_NAME = "HXC"
ATTRIBUTE_CODE = "HXC_TIMECARD_PROJECT"
ATTRIBUTE_CODE_APP_SHORT_NAME = "HXC"
SEQUENCE = "210"
COMPONENT_DEFINITION = "LOV"
RENDER_TYPE = "WEB"
PARENT_COMPONENT =
"Projects Timecard Layout - Day Scope Building blocks for worker timecard matrix"
LAST_UPDATE_DATE = "2010/11/10"
BEGIN HXC_LAYOUT_COMP_QUALIFIERS "Custom Projects Timecard Layout - Project"
OWNER = "CUSTOM"
QUALIFIER_ATTRIBUTE_CATEGORY = "LOV"
QUALIFIER_ATTRIBUTE1 = "XXProjectLOVVO"
QUALIFIER_ATTRIBUTE2 = "N"
QUALIFIER_ATTRIBUTE3 = "XX_HXC_CUI_PROJECT_LOV"
QUALIFIER_ATTRIBUTE4 = "809"
QUALIFIER_ATTRIBUTE5 = "12"
QUALIFIER_ATTRIBUTE6 =
"XxHxcCuiProjectNumber|PROJECT-DISPLAY|CRITERIA|N|XxHxcCuiProjectId|PROJECT|RESULT|N|XxHxcCuiProjectNumber|PROJECT-DISPLAY|RESULT|N"
QUALIFIER_ATTRIBUTE8 = "ProjectNumber"
QUALIFIER_ATTRIBUTE9 = "ProjectId#NUMBER"
QUALIFIER_ATTRIBUTE10 =
"custom.oracle.apps.xxhxc.selfservice.timecard.server.XXProjectLOVVO"
QUALIFIER_ATTRIBUTE11 = "TIMECARD_BIND_START_DATE|TIMECARD_BIND_START_DATE"
QUALIFIER_ATTRIBUTE17 = "OraTableCellText"
QUALIFIER_ATTRIBUTE20 = "N"
QUALIFIER_ATTRIBUTE21 = "Y"
QUALIFIER_ATTRIBUTE22 = "L"
QUALIFIER_ATTRIBUTE25 = "FLEX"
QUALIFIER_ATTRIBUTE26 = "PROJECTS"
QUALIFIER_ATTRIBUTE27 = "Attribute1"
QUALIFIER_ATTRIBUTE28 = "PROJECT"
LAST_UPDATE_DATE = "2010/11/10"
END HXC_LAYOUT_COMP_QUALIFIERS
END HXC_LAYOUT_COMPONENTS
8. Used the FNDLOAD command to upload the LDT:
FNDLOAD apps/apps@instance 0 Y UPLOAD /ad01/app/o2cdev/apps/apps_st/appl/hxc/12.0.0/patch/115/import/hxclaytlayoutsld.lct ./xhxczzhxclayt0019.ldt
9. Bounced Apache.
10. Checked the database tables below to verify whether the layout records were updated, and found the records exist:
In table hxc_layout_components where region_code = 'HXC_CUI_TIMECARD' and component_value = 'PROJECT' and sequence = 210
and component_name = 'Custom Projects Timecard Layout - Project'
In table hxc_layout_comp_qualifiers where layout_component_id is as in the layout components table.
11. Checked that XXProjectLOVVO is attached to the TimecardAM and LovAM. It is visible there, but when querying the project list it picks up all projects irrespective of the start-date parameter value,
as if it is picking up all records from the seeded ProjectLOV, not the custom LOV.
Regards,
K Bosu
Hi,
Step 4 is wrong:
4. Attached the Custom VO to standard AM- TimecardAM in location-oracle.apps.hxc.selfservice.timecard.server.TimecardAM and in LovAM location-oracle.apps.hxc.selfservice.configui.server.LovAM
You cannot attach an extended VO directly to the standard AM. A VO extension requires a .jpx import; that import is all you have to do, the substitution takes effect on its own, and there is no need to modify the standard AM.
Please check the steps for VO extension at http://oracle.anilpassi.com/extend-vo-in-oa-framwork-2.html
Thanks,
Pratap -
CMOD EXIT_SAPLRSAP_001 for transactional data ABAP code
Hi, please confirm that I can write the actual code in the CMOD exit EXIT_SAPLRSAP_001 for transactional data. If I put the enhancement code for 20 DataSources in there, isn't that too bulky, and won't one wrong piece of code cause all the extractors to fail? If you still suggest I go ahead with 20 similar pieces of code, please see my code below, which I have used for all 20 DataSources with little modification.
If you can recommend some changes to the code to improve performance, that would be great.
case i_datasource.
  when '0CUSTOMER_ATTR'.
    loop at i_t_data into l_s_BIW_KNA1_S.
      l_tabix = sy-tabix.
      clear i_knvp.
      select single * from KNVP into i_knvp
        where KUNNR = l_s_BIW_KNA1_S-KUNNR.
      if sy-subrc = 0.
        l_s_BIW_KNA1_S-ZZPARFN = i_knvp-PARVW.
        l_s_BIW_KNA1_S-ZZCUSNOBP = i_knvp-KUNN2.
        modify i_t_data from l_s_BIW_KNA1_S index l_tabix.
      endif.
    endloop.
endcase.
Thanks
Poonam
Check this simple code for the Z... include in the FM EXIT_SAPLRSAP_001, where zcomp is a new field added to the DataSource 8ZSALES2:
data: l_s_/BIC/CE8ZSALES2 like /BIC/CE8ZSALES2,
      l_tabix like sy-tabix.
case i_datasource.
  when '8ZSALES2'.
    loop at c_t_data into l_s_/BIC/CE8ZSALES2.
      l_tabix = sy-tabix.
*     fill the new field
      select comp_code from /BI0/MCOMP_CODE into l_s_/BIC/CE8ZSALES2-zcomp
        where comp_code = '1000'.
        if sy-subrc = 0.
*         l_s_ZFIAA_IS001_TXT50 = ANLA_TXT50.
          modify c_t_data from l_s_/BIC/CE8ZSALES2 index l_tabix.
        endif.
      endselect.
    endloop.
endcase.
Edited by: Rohan Kumar on Jan 16, 2008 8:21 AM -
Transactional data modification
Hi all,
I have one query. In our R/3 system wrong data was entered, and that data was uploaded to BW by a transactional DataSource.
How can I modify the transactional data to correct it? It is not getting modified in R/3 either.
Regards,
Pravin.
Hi,
You can have the functional team correct the data for you on the R/3 side and then repull the data to BW. That's the recommended practice.
Of course you can correct the data on the BW side, but then your source and BW data will be out of sync, which is not a good idea.
Cheers,
Kedar -
Problems with BPC Export Transaction Data process chain
Hi everybody,
we are trying to implement the procedure "How to Export BPC Transaction Data using a custom process chain" for SAP BPC NW 7.0 and everything seems fine. However, when we execute it from Excel, the package apparently runs well (the message 'IMMEDIATE RUN: The package is running...' appears), but when we go to check the package status, the package does not appear in the list and no file is exported.
When we look at the log of the process chain in NW, the Modify Dynamically step fails, but there is no clue as to what could be done to solve it. We have redone it from scratch but always get the same error.
Any idea out there?
Thanks in advance. Kind regards,
Albert
Hi Rich,
in the end we managed to export data. Our NW administrator had created the custom process type wrongly - putting a blank space before ZBPCTXEXP - and now it has been corrected and everything works fine.
We have also managed to transport applications from one server to another and to export/import the transactional data.
Is there any doc that explains how to transport process chains from one server to another?
In the BPC operations manual there is not much information, and we need to transport quite a lot of process chains related to RUNLOGIC issues.
Thanks a lot for your help.
Kind regards,
Albert Mas. -
ABAP/4 processor: MESSAGE_TYPE_X while loading transaction data in infopak
Hi,
While loading transaction data through an InfoPackage, a short dump appears indicating the following error:
ABAP/4 processor: MESSAGE_TYPE_X
I guess there is an SAP Note to be applied.
If yes, which note number? Otherwise, please tell me the solution.
Thanking you,
Tarun Brijwani.
Hi,
In ST22, I got the following:
Probably the only way to eliminate the error is to correct the program.
If the error occures in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"MESSAGE_TYPE_X" " "
"SAPLRSS1" or "LRSS1F11"
"RSM1_CHECK_FOR_DELTAUPD"
Please suggest some suitable notes.
Thanking you,
Tarun Brijwani. -
Changing master data record while loading Transaction data
Hello All,
We have a requirement to change one of the master data fields (FLAG) while loading the transaction data.
We get the material info in the master data, and in the sales order item data we also get the material.
While loading the transaction data, I have to set the FLAG field to 'S' in the material master data based on the key selection:
master data material = transaction data material.
I have written and implemented the code below. I get the correct records, but I face a huge performance issue. Can anyone guide me, please?
DATA: itab1 TYPE STANDARD TABLE OF /bi0/pmaterial,
      wa_itab1 TYPE /bi0/pmaterial,
      w_tabix TYPE sy-tabix.
IF itab1 IS INITIAL.
  SELECT * FROM /bi0/pmaterial INTO TABLE itab1.
ENDIF.
LOOP AT result_package ASSIGNING <result_fields>.
  READ TABLE itab1 INTO wa_itab1
    WITH KEY material = <result_fields>-material.
  IF sy-subrc = 0.
    w_tabix = sy-tabix.
    IF <result_fields>-/bic/paa1c2033 IS NOT INITIAL.
      wa_itab1-flag = 'S'.
      MODIFY itab1 FROM wa_itab1 INDEX w_tabix TRANSPORTING flag.
    ENDIF.
  ENDIF.
ENDLOOP.
IF itab1 IS NOT INITIAL.
  MODIFY /bi0/pmaterial FROM TABLE itab1.
ENDIF.
Here are some performance tips:
Add FOR ALL ENTRIES IN result_package WHERE material = result_package-material to your select statement
After your select statement, add IF SY-SUBRC = 0. SORT itab1 BY material. ENDIF.
In your read statement, add BINARY SEARCH to the end of it
At the end of your end routine, make sure to CLEAR itab1.
You can also increase the number of parallel processes for your DTP, and DSO activation (assuming your target is DSO). -
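The SORT plus BINARY SEARCH tip above can be illustrated outside ABAP. This is an editorial Python sketch (invented data, not SAP code): sorting the lookup table once turns each lookup from a linear scan into an O(log n) bisect, which is the same effect `READ TABLE ... BINARY SEARCH` has on a sorted internal table.

```python
import bisect

def build_lookup(rows):
    """rows: (material, record) pairs; sort once, like SORT itab1 BY material."""
    rows = sorted(rows)
    keys = [k for k, _ in rows]
    return keys, rows

def read_binary(lookup, material):
    """O(log n) lookup, the analogue of READ TABLE ... BINARY SEARCH."""
    keys, rows = lookup
    i = bisect.bisect_left(keys, material)
    if i < len(keys) and keys[i] == material:
        return rows[i][1]
    return None  # analogue of sy-subrc <> 0

lookup = build_lookup([("MAT3", "r3"), ("MAT1", "r1"), ("MAT2", "r2")])
print(read_binary(lookup, "MAT2"))  # r2
print(read_binary(lookup, "MAT9"))  # None
```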
Transaction Data Load from BW info provider
Hi gurus,
I am doing a transaction data load from the BW data feed to the BPC cube. When I validated the transformation file, the task completed successfully, with some records skipped as expected based on the conversion files.
ValidateRecords = YES
[List of conversion file]
Conversion file: DataManager\ConversionFiles\EXAMPLES\TIMECONV.XLS!CONVERSION
Conversion file: DataManager\ConversionFiles\EXAMPLES\VERSIONCONV.XLS!CONVERSION
Conversion file: DataManager\ConversionFiles\EXAMPLES\ACCOUNTCONV.XLS!CONVERSION
Conversion file: DataManager\ConversionFiles\EXAMPLES\ENTITY.XLS!CONVERSION
Record count: 25
Accept count: 13
Reject count: 0
Skip count: 12
This task has successfully completed
But when I ran the package, the load failed.
/CPMB/MODIFY completed in 0 seconds
/CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
InforProvide=ZPCAOB01
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\EXAMPLES\ABF_TRANS_LOAD.xls
CLEARDATA= Yes
RUNLOGIC= No
CHECKLCK= No
[Messages]
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other cube
Application: TEST Package status: ERROR
This is a fresh system and we are doing the data load for the first time. We are using BPC NW 7.5 with SP4.
Is there something we are missing that is supposed to be performed before starting the load for the first time?
My transformation file is as below:
*MAPPING
Account=0ACCOUNT
Currency=0CURRENCY
DataSrc=*NEWCOL(INPUT)
Entity=ZPCCCPLN
ICP=*NEWCOL(ICP_NONE)
Scenario=0VERSION
Time=0FISCPER
SIGNEDDATA=0AMOUNT
*CONVERSION
TIME=EXAMPLES\TIMECONV.XLS
SCENARIO=EXAMPLES\VERSIONCONV.XLS
ACCOUNT=EXAMPLES\ACCOUNTCONV.XLS
ENTITY=EXAMPLES\entity.xls
Thanks a lot in advance.
Regards
Sharavan
Hi Gersh,
Thanks for the quick response.
I checked in SLG1 and I have the below error in the log:
Class: CL_UJD_TRANSFORM_PROXY:CONSTRUCTOR Log:DATATYPE 3
Class: CL_UJD_TRANSFORM_PROXY:CONSTRUCTOR Log:CURRENT_ROUND 1
Error occurs when loading transaction data from other cube
Message no. UJD_EXCEPTION137
We are on SP4, BPC NW 7.5. Please advise.
Regards
Sharavan. -
Differences between delta and full uploads of transactional data
Hi All,
This is with reference to loading transactional data into the cube.
I understand that delta and full uploads of transactional data into the cube are done for the following reasons:
1. A delta upload allows changes in transactional data (add/delete/modify) for the selected period, but this is not the case with a full upload.
2. A full upload takes less time than a delta upload for the same period.
Please let me know whether my understanding is correct. Otherwise, please explain these two concepts with examples based on loading transactional data into the cube.
Regards,
Ningaraju
Hi,
A full upload is done to avoid a longer load time: if you use initialization, it takes more time to load the same amount of data than a full load does.
However, your statement about the delta is correct.
You can perform a delta only after a successful initialization.
So:
1. First a full load (to save time).
2. Initialization with zero records.
3. Delta.
regards -
Not able to Retrieve Transaction Data based on the property of master data
Hi,
I am trying to retrieve transaction data based on a property of the ACCOUNT master data (property ACCTYPE = 'EXP')
in BPC 10 for NetWeaver.
The transaction data is present in the backend, but I am not getting data in the internal table after running the RSDRI query.
I am using this code:
DATA: lt_sel TYPE uj0_t_sel,
      ls_sel TYPE uj0_s_sel.
ls_sel-dimension = 'ACCOUNT'.
ls_sel-attribute = 'ACCTYPE'.
ls_sel-sign = 'I'.
ls_sel-option = 'EQ'.
ls_sel-low = 'EXP'.
APPEND ls_sel TO lt_sel.
lo_query = cl_ujo_query_factory=>get_query_adapter(
  i_appset_id = lv_environment_id
  i_appl_id = lv_application_id ).
lo_query->run_rsdri_query(
  EXPORTING
    it_dim_name = lt_dim_list       " BPC: Dimension List
    it_range = lt_sel               " BPC: Selection condition
    if_check_security = abap_false  " BPC: Generic indicator
  IMPORTING
    et_data = <lt_query_result>
    et_message = lt_message ).
Data comes through if I use the ID of ACCOUNT directly, e.g.:
ls_sel-dimension = 'ACCOUNT'.
ls_sel-attribute = 'ID'.
ls_sel-sign = 'I'.
ls_sel-option = 'EQ'.
ls_sel-low = 'PL110'.
APPEND ls_sel TO lt_sel.
In this case data comes through, but it does not for the property.
Please can you help me with this?
Thanks,
Rishi
Hi Rishi,
There are 2 steps you need to do:
1. Read all the master data with the required property into an internal table; in your case ACCTYPE = 'EXP'.
2. Read the transaction data filtered on the master data you just selected.
Then you will get all your results.
Andy
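The two-step pattern described in the reply above can be sketched in Python (an editorial illustration with invented sample data reusing the PL110 member from the thread, not SAP code): first select the master-data IDs whose property matches, then filter the transaction records on those IDs.

```python
# Hypothetical master data: account members with an ACCTYPE property.
master = [
    {"ID": "PL110", "ACCTYPE": "EXP"},
    {"ID": "PL200", "ACCTYPE": "INC"},
    {"ID": "PL300", "ACCTYPE": "EXP"},
]
# Hypothetical transaction data keyed by account member.
transactions = [
    {"ACCOUNT": "PL110", "SIGNEDDATA": 100.0},
    {"ACCOUNT": "PL200", "SIGNEDDATA": 50.0},
    {"ACCOUNT": "PL300", "SIGNEDDATA": 75.0},
]

# Step 1: read the master data filtered on the property (ACCTYPE = 'EXP').
exp_accounts = {m["ID"] for m in master if m["ACCTYPE"] == "EXP"}

# Step 2: read the transaction data restricted to those members.
result = [t for t in transactions if t["ACCOUNT"] in exp_accounts]
print(sum(t["SIGNEDDATA"] for t in result))  # 175.0
```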