Getting an error while uploading data to CCM through an Excel file.
Hi
While uploading supplier data to CCM I am getting the error "The destination FXP_HTTP_ADAPTER has not been maintained! (see long text)" followed by "Error when sending to SAP XI; upload canceled".
FXP_HTTP_ADAPTER (production) is not maintained in transaction SM59; we have maintained FXT_HTTP_ADAPTER (test) there.
We want to connect to the test link, not the production link, but the upload is taking the production link.
How could it be pointing to the production link?
thanks,
Prashsanth.
Hi Prashsanth
Please check in customizing which RFC destination is maintained for uploading catalogs.
SPRO -> Cross Application Components -> SAP Catalog Authoring Tool -> Specify Settings for Uploading Catalogs.
The destination maintained there is probably wrong.
Regards.
Jason
Similar Messages
-
Getting error while uploading data using Excel (GUI_UPLOAD)
Dear Friends,
I am uploading data from my Excel sheet to a custom table using the code below:
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename            = l_fn
    filetype            = 'BIN'   " 'ASC'
    has_field_separator = 'X'
  TABLES
    data_tab            = p_i_ins_db
  EXCEPTIONS
    OTHERS              = 17.
In my internal table p_i_ins_db, all the data comes in garbled (it looks encrypted).
The flat file given to me is a .csv file.
Could anyone please let me know how I can upload the data that is in the .csv file?
regards
Syamala
Hi Syamala,
Before using GUI_UPLOAD to move the data from the file into the internal table, you can try using the function module 'SAP_CONVERT_TO_TEX_FORMAT' or 'SAP_CONVERT_TO_CSV_FORMAT', as these help in formatting the data. Note also that filetype 'BIN' reads the file as raw binary, which would explain the garbled content; for a .csv file, upload with filetype 'ASC'.
In case you have any further clarifications, do let me know.
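A minimal sketch of that approach, with placeholder field names (lv_f1, lv_f2) standing in for the real structure of the custom table:

```abap
DATA: lt_raw TYPE TABLE OF string,
      lv_f1  TYPE string,   " placeholder fields
      lv_f2  TYPE string.

" upload the file as plain text rather than binary
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename = l_fn
    filetype = 'ASC'
  TABLES
    data_tab = lt_raw
  EXCEPTIONS
    OTHERS   = 17.

" split each CSV line at the comma into individual fields
LOOP AT lt_raw INTO DATA(lv_line).
  SPLIT lv_line AT ',' INTO lv_f1 lv_f2.
  " ... move the fields into the custom table's work area and append ...
ENDLOOP.
```

The SPLIT statement handles the comma-separated layout; for tab-delimited files, HAS_FIELD_SEPARATOR = 'X' with filetype 'ASC' would split into a structured table directly.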
Regards,
Puneet Jhari. -
DB Connect Load - "Unknown error while uploading data from the DB Table"
Hi Experts,
We have our BI7 system connected to an Oracle-DB-based third-party tool. The loads perform quite well in the DEV environment.
I would like to know how we transport DB Connect DataSources to the quality system. Is there a different process to be followed for DB Connect DataSources?
At present the connections between BI quality and the third-party quality system are established. We transported the DataSource from the BI DEV system to the BI quality system, but on triggering an InfoPackage we are not able to perform loads. It prompts: "Unknown error while uploading data from the DB Table".
Also, on comparing the DataSources in the DEV and quality systems, there are no fields in the "Proposal" tab of the DataSource in the quality system. I also cannot change or activate the DataSource in the quality system, as we don't have change access there.
Please advise.
Thanks,
Abhijit
Hi,
Sorry for bumping an old thread.
Did this issue ever get resolved?
I am facing the same one. The loads work successfully in DEV, and the transport for the DB Connect DataSource also moved in successfully.
One strange thing is that the DB user for DEV did not automatically change to the DB user from quality when I transported the DB Connect DataSource; the DataSource still shows me the DEV DB user in the quality system.
I get "Unknown Error" whenever I trigger the data package.
Advait -
Getting error while loading data into an ASO cube from a flat file.
Hi All,
I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
Does anyone have a solution?
Regards,
VM
Are you using ODI to load the data, or MaxL? If you are using an ODI interface, are you also using a load rule? And which versions of Essbase and ODI are you using?
Cheers
John
http://john-goodwin.blogspot.com/ -
Getting error while uploading gl master data using LSMW batch input data
Hi Experts,
I am getting an error while uploading GL master data through LSMW using batch input recording.
After completing all the steps, at the time of running the batch input session the error I get is "Maintain EN language description".
Also, please tell me: is it possible to upload transactional data using LSMW? If yes, please explain briefly.
Hello,
Kindly post in the FI forum to get a better response.
Regarding the issue you might check whether the correct column is being picked up where you have maintained the description.
Kind Regards // Shaubhik -
Getting error while uploading multiple files in sharepoint hosted app in 2013 with REST API
Hi All,
In one of my tasks I am stuck on an issue: uploading multiple files into a custom list with the REST API.
I am trying to upload multiple files to a library with REST calls for an app. The issue is that if I upload 4 images at once, only 3 image files are stored, and I then get a "Conflict" error. Below is a screenshot of the exact error:
Status code: 409
Status text: Conflict
For this operation I am uploading the files as attachments to a list item. Below is the code used for uploading multiple files:
function PerformUpload(listName, fileName, listItem, fileData) {
    var urlOfAttachment = "";
    // var itemId = listItem.get_id();
    urlOfAttachment = appWebUrl + "/_api/web/lists/GetByTitle('" + listName + "')/items(" + listItem + ")/AttachmentFiles/add(FileName='" + fileName + "')";
    // use the request executor (cross domain library) to perform the upload
    var reqExecutor = new SP.RequestExecutor(appWebUrl);
    reqExecutor.executeAsync({
        url: urlOfAttachment,
        method: "POST",
        headers: {
            "Accept": "application/json; odata=verbose",
            "X-RequestDigest": digest
        },
        contentType: "application/json;odata=verbose",
        binaryStringRequestBody: true,
        body: fileData,
        success: function (x, y, z) {
            alert("Success!");
        },
        error: function (x, y, z) {
            alert(z);
        }
    });
}
Hi,
This is a common issue when the file size is large: it occurs when uploading a document of more than about 1 MB, but works well for KB-sized files.
https://social.technet.microsoft.com/Forums/office/en-US/b888ac78-eb4e-4653-b69d-1917c84cc777/getting-error-while-uploading-multiple-files-in-sharepoint-hosted-app-in-2013-with-rest-api?forum=sharepointdevelopment
or try the below method
https://social.technet.microsoft.com/Forums/office/en-US/40b0cb04-1fbb-4639-96f3-a95fe3bdbd78/upload-files-using-rest-api-in-sharepoint-2013?forum=sharepointdevelopment
Please remember to click 'Mark as Answer' on the answer if it helps you -
While uploading data into the r/3 using call transaction or session method
hi experts
While uploading data into R/3 using the call transaction or session method, if an error occurs in the middle of processing, how do these methods behave? Do they transfer the next records or not?
Hai,
Session method: The records are not added to the database until the session is processed. sy-subrc is not returned. Error logs are created for error records. Updation in database table is always Synchronous.
Call Transaction method: The records are immediately added to the database table. sy-subrc is returned to 0 if successful. Error logs are not created and hence the errors need to be handled explicitly. Updation in database table is either Synchronous or Asynchronous.
With the session method, when errors occur, the erroneous data is not transferred to the SAP system until the errors are corrected. The system records the errors in a system-generated error log (transaction SM35), which is why the session method does not return a value.
With the call transaction method, data is passed directly to the SAP system, so it must return a value: call transaction works like a function, and a function must return a value.
To capture the errors with call transaction, do the following:
First declare an internal table with the structure of BDCMSGCOLL.
Then, when writing the CALL TRANSACTION statement, use the MESSAGES INTO addition (typically with display mode 'E') so that all the messages are captured.
Finally, format the captured messages with the function module FORMAT_MESSAGE and append the resulting texts to the internal table declared at the beginning.
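A minimal sketch of that error-capture pattern (the transaction code and the BDC data filling are placeholders, not from the original post):

```abap
DATA: it_bdcdata TYPE TABLE OF bdcdata,
      it_msgs    TYPE TABLE OF bdcmsgcoll,
      wa_msg     TYPE bdcmsgcoll,
      lv_text    TYPE string.

" ... fill it_bdcdata from the recording ...

CALL TRANSACTION 'XK01'        " placeholder transaction code
  USING it_bdcdata
  MODE 'N'                     " 'E' to display screens only on errors
  MESSAGES INTO it_msgs.       " collect all messages in BDCMSGCOLL format

" format each error message into readable text
LOOP AT it_msgs INTO wa_msg WHERE msgtyp = 'E'.
  CALL FUNCTION 'FORMAT_MESSAGE'
    EXPORTING
      id  = wa_msg-msgid
      no  = wa_msg-msgnr
      v1  = wa_msg-msgv1
      v2  = wa_msg-msgv2
      v3  = wa_msg-msgv3
      v4  = wa_msg-msgv4
    IMPORTING
      msg = lv_text.
  WRITE: / lv_text.            " or append to an error log table
ENDLOOP.
```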
Getting error while uploading a document in a library
Hi,
I'm getting a weird error while uploading a document to the library. I don't get it every time, but most of the time.
It reads:
The file <library name>/<file name> has been modified by i:0#.<user-id> on <datetime stamp>
How to resolve this ? I have referred many articles on this one, but none helped.
Thanks for your response in advance.
Regards
Hi Prajk,
Based on your description, I recommend checking whether there are any ItemAdded event receivers or workflows that are triggered when an item is created to update the document properties in the library.
If there is an ItemAdded event receiver on the library, I recommend waiting for ItemAdded to finish before the ListFieldIterator control residing on EditForm.aspx loads (it's the ListFieldIterator that displays the item's fields).
http://blogs.msdn.com/b/manuj/archive/2009/09/22/itemadded-event-on-document-library-the-file-has-been-modified-by-on-error.aspx
If there is a workflow on the library, I recommend adding a Pause for Duration step at the beginning of the workflow.
If the above does not work, please check the ULS log for the detailed error message.
For SharePoint 2013, by default, ULS log is at C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\LOGS.
Best regards.
Thanks
Victoria Xia
TechNet Community Support -
Error while uploading data in SAP (me01)using LSMW
Hi All,
I am using LSMW for the first time.
I am trying to upload data to the source list transaction (ME01) using batch input recording. I created a new recording through LSMW itself. My source structure consists of the following fields:
<b>MATNR C(18)
WERKS C(4)
VDATU N(8)
BDATU N(8)
LIFNR C(10)
EKORG C(10)</b>
Now the problem is that when I try to convert the data, it gives me an error saying:
"For type "C", a maximum length specification of 65535 is allowed."
I have noticed that, unlike the direct input method, using batch input does not give any fields in "Maintain Field Mapping & Conversion Rules". So even though I have fields in my source structure, there are no fields to map them to.
I don't know whether that is simply how it is with batch input.
My file contains data in the following order:
DANGEROUS GOODS,ABBY,20060801,20060831,30010,TEST
Please help me solve this problem.
Hi Swapna!
You get such an error message when you define a constant and forget the closing ':
g_werks = '1000. "wrong
g_werks = '1000'. "correct
In general, a batch input recording can have fixed values and some variables. You need to define which fields you would like to fill with a variable. Go to the overview of the recordings, open the recording in change mode, and assign variable names to the corresponding batch input lines.
<a href="http://help.sap.com/saphelp_erp2005vp/helpdata/en/76/a05b69e8a411d1b400006094b944c8/frameset.htm">Editing batch input recordings</a>
Follow the help for the following steps of structure assignments.
Regards,
Christian -
Getting Error While Uploading Clause Document in SAP Sourcing 9.0
Hi,
--------------------system description--------------
SAP Sourcing 9.0
SAP NW7.31 AS JAVA
IBM DB2 9.7
I am getting an error while uploading a clause document.
----------------------Error Description---------------
Log in to the system with the buyer link (it has all authorizations),
then go to Contract Management -> Clause Library -> Clause list -> click the New button -> click the Add button -> browse for the document.
It shows a popup with the error "There is a problem with the Contract Generation web service. Contact your system administrator."
Thanks and Regards,
Murtaza Najmi
Hi Gary,
It is a standalone system; I have installed everything on one server.
I have already tried restarting the IIS server, and I can see the Word service when running the URL in contractgen.serviceurl; the Word service runs fine in the browser. But when I upload the document, a popup window comes up with the error shown above in my query.
Thanks once again, Gary, for replying.
Thanks and Regards,
Murtaza Najmi -
Error while uploading data in table t_499s through BDC Prog
Hi
I am facing a problem while uploading data into table T499S through a BDC program: if there are more than 15 records in the file, it does not allow the upload. Kindly suggest what to do.
Thanks
Mukesh S
Hi,
If you want to update only a single table that has user maintenance allowed, you can use the MODIFY statement.
EX:
LOOP AT ITAB INTO WA_TAB.
MOVE-CORRESPONDING WA_TAB TO T499S.
MODIFY T499S.
CLEAR T499S.
ENDLOOP.
It will update the table; to verify, go to SM30 and check view V_T499S.
Rgds
Aeda -
Upload data from multiple worksheets of excel file into SAP by using MIME?
Hi all,
I'm trying to get the data from a multi-worksheet Excel file by using the MIME repository.
First of all I implemented it like [here|http://abap-explorer.blogspot.com/2008/12/upload-data-from-multiple-worksheets-of.html] in a normal ABAP report.
When trying to transfer the code to a Web Dynpro component, I run into problems with this part of the code, because ActiveX is not allowed in our web browser, so I have to find another solution to get the data from the Excel file:
CALL METHOD c_oi_container_control_creator=>get_container_control
  IMPORTING
    control = iref_control
    error   = iref_error.
I tried to rebuild this solution in Web Dynpro, but it doesn't work.
So my next step was trying to read the Excel sheet directly from MIME by getting the content (XSTRING).
But there is no useful information there, just a mix of '######' and other symbols; converting it into a string gives the same problem.
Maybe someone has an idea how to rebuild [this code|http://abap-explorer.blogspot.com/2008/12/upload-data-from-multiple-worksheets-of.html] in Web Dynpro so that it works correctly.
Currently I'm using the FileUpload UI element. All works fine: I can upload the Excel file to the MIME repository and open it from there.
But I can't get the same clear information from the file, to write it later to a database, as I could before without Web Dynpro.
I hope someone can help me.
Edited by: Sascha Baumann on Apr 20, 2009 4:28 PM
You can't read the native binary Excel format in server-side ABAP. The functions and classes that did this in classic Dynpro used OLE Automation via the SAPGUI Control Framework to remotely control Excel and read the data. Because in the browser you have no connection to the SAPGUI and are sandboxed inside the browser, the same functionality is not possible.
I would suggest that you look into saving the Excel file in an open, text-based format. You might be able to use XML (although the Excel XML format can be complex) to support multiple sheets. You would have to build the logic yourself (using XSLT or the iXML parser) to process the XML format back into ABAP data. -
Error while uploading data from a flat file to the hierarchy
Hi guys,
After I upload data from a flat file to the hierarchy, I get the error message "Please select a valid info object". I am loading data using the PSA and have activated all external characteristics, but I still get the problem. Some help on this, please.
regards
Sri
There is no relation between the InfoObject name in the flat file and the InfoObject name on the BW side.
Please check the objects in BW, their lengths and types, and check whether your flat file has the same types there.
Now check the sequence of the objects in the transfer rules and activate them.
There you go. -
AIR for iOS getting error while uploading the app, ITMS-9000 "Missing Provisioning Profile"
hi!
I get this error when uploading the binary to iTunes. Any help on how to solve this problem? And what does it mean?
For installing on devices, you need to use a distribution certificate and a distribution provisioning profile for ad hoc. When ready for the App Store, you can use the same distribution certificate but a distribution provisioning profile for the App Store. I use a Windows PC for my entire process and use TestFlight to distribute to my testers. I'm not sure what the process is for the simulator on the Mac; I just use the Mac to create my certificates and profiles, and then Application Loader.
When you tried uploading to TestFlight, did you use a distribution certificate (p12) and an ad hoc distribution provisioning profile? Also, are the devices you are sending to registered in that profile?
Also, I've been doing this for 3 years now but still consider myself a newbie, because I'm self-taught, still learning, and do things in an unorthodox way. But on average I have about 10 uploads of updates or new apps to the App Store a year and haven't had an app rejected. These forums will be your best resource of info, with users like @colin. -
Getting error While loading data from ERP integrator to HFM
Hello,
We are getting the following error while loading the data from ERPI to HFM.
2013-12-31 22:44:54,133 INFO [AIF]: ERPI Process Start, Process ID: 300
2013-12-31 22:44:54,137 INFO [AIF]: ERPI Logging Level: DEBUG (5)
2013-12-31 22:44:54,139 INFO [AIF]: ERPI Log File: C:\Windows\TEMP\/aif_501_300.log
2013-12-31 22:44:54,141 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Java HotSpot(TM) 64-Bit Server VM (Oracle Corporation)]
2013-12-31 22:44:54,143 INFO [AIF]: Java Platform: java1.7.0_25
2013-12-31 22:44:56,821 INFO [AIF]: COMM Process Periods - Insert Periods - START
2013-12-31 22:44:56,828 DEBUG [AIF]:
SELECT l.SOURCE_LEDGER_ID
,l.SOURCE_LEDGER_NAME
,l.SOURCE_COA_ID
,l.CALENDAR_ID
,'0' SETID
,l.PERIOD_TYPE
,NULL LEDGER_TABLE_NAME
FROM AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE br.RULE_ID = 27
AND l.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = br.SOURCE_LEDGER_ID
2013-12-31 22:44:56,834 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_PERIODS (
PROCESS_ID
,PERIODKEY
,PERIOD_ID
,ADJUSTMENT_PERIOD_FLAG
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,GL_PERIOD_NAME
,GL_PERIOD_CODE
,GL_EFFECTIVE_PERIOD_NUM
,YEARTARGET
,PERIODTARGET
,IMP_ENTITY_TYPE
,IMP_ENTITY_ID
,IMP_ENTITY_NAME
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_ID
,TRANS_ENTITY_NAME
,PRIOR_PERIOD_FLAG
,SOURCE_LEDGER_ID
SELECT DISTINCT brl.LOADID PROCESS_ID
,pp.PERIODKEY PERIODKEY
,prd.PERIOD_ID
,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
,COALESCE(prd.YEAR,0) GL_PERIOD_YEAR
,COALESCE(prd.PERIOD_NUM,0) GL_PERIOD_NUM
,prd.PERIOD_NAME GL_PERIOD_NAME
,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM,0) AS VARCHAR(38))) GL_PERIOD_CODE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
,COALESCE(ppa.YEARTARGET, pp.YEARTARGET) YEARTARGET
,COALESCE(ppa.PERIODTARGET, pp.PERIODTARGET) PERIODTARGET
,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) IMP_ENTITY_ID
,prd.PERIOD_NAME IMP_ENTITY_NAME
,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) TRANS_ENTITY_ID
,pp.PERIODDESC TRANS_ENTITY_NAME
,'N' PRIOR_PERIOD_FLAG
,1 SOURCE_LEDGER_ID
FROM (
AIF_BAL_RULE_LOADS brl
INNER JOIN TPOVCATEGORY pc
ON pc.CATKEY = brl.CATKEY
INNER JOIN TPOVPERIOD_FLAT_V pp
ON pp.PERIODFREQ = pc.CATFREQ
AND pp.PERIODKEY >= brl.START_PERIODKEY
AND pp.PERIODKEY <= brl.END_PERIODKEY
LEFT OUTER JOIN TPOVPERIODADAPTOR_FLAT_V ppa
ON ppa.PERIODKEY = pp.PERIODKEY
AND ppa.PERIODFREQ = pp.PERIODFREQ
AND ppa.INTSYSTEMKEY = 'FMTEST2'
INNER JOIN AIF_GL_PERIODS_STG prd
ON prd.SOURCE_SYSTEM_ID = 3
AND prd.CALENDAR_ID IN ('10000')
AND prd.SETID = '0'
AND prd.PERIOD_TYPE = 'Month'
AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
AND prd.START_DATE > pp.PRIORPERIODKEY
AND prd.START_DATE <= pp.PERIODKEY
WHERE brl.LOADID = 300
ORDER BY pp.PERIODKEY
,GL_EFFECTIVE_PERIOD_NUM
2013-12-31 22:44:56,915 INFO [AIF]: COMM Process Periods - Insert Periods - END
2013-12-31 22:44:56,945 INFO [AIF]: COMM Process Periods - Insert Process Details - START
2013-12-31 22:44:56,952 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
SELECT PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,'AIF_EBS_GL_BALANCES_STG' TARGET_TABLE_NAME
,CURRENT_TIMESTAMP EXECUTION_START_TIME
,NULL EXECUTION_END_TIME
,0 RECORDS_PROCESSED
,'PENDING' STATUS
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' LAST_UPDATED_BY
,CURRENT_TIMESTAMP LAST_UPDATE_DATE
FROM (
SELECT DISTINCT PROCESS_ID
,IMP_ENTITY_TYPE ENTITY_TYPE
,IMP_ENTITY_ID ENTITY_ID
,IMP_ENTITY_NAME ENTITY_NAME
,(COALESCE(SOURCE_LEDGER_ID,0) * 100000000 + GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
) q
ORDER BY ENTITY_NAME_ORDER
2013-12-31 22:44:56,963 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
SELECT PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,'TDATASEG' TARGET_TABLE_NAME
,CURRENT_TIMESTAMP EXECUTION_START_TIME
,NULL EXECUTION_END_TIME
,0 RECORDS_PROCESSED
,'PENDING' STATUS
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' LAST_UPDATED_BY
,CURRENT_TIMESTAMP LAST_UPDATE_DATE
FROM (
SELECT PROCESS_ID
,TRANS_ENTITY_TYPE ENTITY_TYPE
,MIN(TRANS_ENTITY_ID) ENTITY_ID
,TRANS_ENTITY_NAME ENTITY_NAME
,MIN(COALESCE(SOURCE_LEDGER_ID,0) * 100000000 + GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
AND PRIOR_PERIOD_FLAG = 'N'
GROUP BY PROCESS_ID
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_NAME
) q
ORDER BY ENTITY_NAME_ORDER
2013-12-31 22:44:56,970 INFO [AIF]: COMM Process Periods - Insert Process Details - END
2013-12-31 22:44:57,407 DEBUG [AIF]: EBS/FS GL Balances Print Variables - Printing Variables - START
2013-12-31 22:44:57,408 DEBUG [AIF]:
p_process_id: 300
p_sql_db_type: ORACLE
p_partitionkey: 12
p_rule_id: 27
p_source_system_id: 3
p_application_id: 26
p_target_application_type: HFM
p_is_multi_currency: true
p_data_load_method: CLASSIC_VIA_EPMI
p_bal_balance_method_code: STANDARD
p_bal_ledger_group_code: SINGLE
p_bal_amount_type: MONETARY
p_prd_entity_name: JAN-13
p_prd_period_id: 135
p_prd_gl_period_name: JAN-13
p_prd_source_ledger_id: 1
p_prd_source_coa_id: 101
p_source_ledger_id: 1
p_source_coa_id: 101
p_bal_actual_flag: A
p_bal_seg_column_name: SEGMENT1
p_max_ccid_loaded_to_stg: 148137
2013-12-31 22:44:57,408 DEBUG [AIF]: EBS/FS GL Balances Print Variables - Printing Variables - END
2013-12-31 22:44:57,806 INFO [AIF]: LKM EBS/FS Extract Type - Load Audit AND Full Refresh - START
2013-12-31 22:44:57,817 DEBUG [AIF]:
SELECT p.PROCESS_ID
,br.RULE_NAME
,l.SOURCE_LEDGER_NAME
FROM AIF_GL_LOAD_AUDIT aud
,AIF_PROCESSES p
,AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND p.PROCESS_ID = aud.LAST_LOADID
AND p.STATUS = 'RUNNING'
AND p.PROCESS_ID <> 300
AND br.RULE_ID = p.RULE_ID
AND l.SOURCE_SYSTEM_ID = aud.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = aud.SOURCE_LEDGER_ID
2013-12-31 22:44:57,826 DEBUG [AIF]:
SELECT 'Y' VALID_FLAG
FROM GL_PERIOD_STATUSES
WHERE APPLICATION_ID = 101
AND SET_OF_BOOKS_ID = 1
AND PERIOD_NAME = 'JAN-13'
AND CLOSING_STATUS IN ( 'O','C','P' )
2013-12-31 22:44:57,847 DEBUG [AIF]:
SELECT 'Y' EXISTS_FLAG
FROM GL_TRACK_DELTA_BALANCES
WHERE SET_OF_BOOKS_ID = 1
AND PROGRAM_CODE = 'FEM'
AND PERIOD_NAME = 'JAN-13'
AND ACTUAL_FLAG = 'A'
AND EXTRACT_LEVEL_CODE = 'DTL'
AND CURRENCY_TYPE_CODE = 'B'
AND ENABLED_FLAG = 'Y'
2013-12-31 22:44:57,883 DEBUG [AIF]:
SELECT MAX(DELTA_RUN_ID) MAX_DELTA_RUN_ID
FROM GL_BALANCES_DELTA
WHERE SET_OF_BOOKS_ID = 1
AND PERIOD_NAME = 'JAN-13'
AND ACTUAL_FLAG = 'A'
2013-12-31 22:44:57,898 DEBUG [AIF]:
SELECT brl.EXECUTION_MODE
,( SELECT CASE COUNT(aud.DELTA_RUN_ID) WHEN 0 THEN 'N' ELSE 'Y' END
FROM AIF_GL_LOAD_AUDIT aud
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND aud.LAST_LOADID <> brl.LOADID
AND COALESCE( aud.STATUS, 'SUCCESS' ) = 'SUCCESS'
) GL_LOAD_AUDIT_SUCCESS_FLAG
,( SELECT CASE COUNT(aud.DELTA_RUN_ID) WHEN 0 THEN 'N' ELSE 'Y' END
FROM AIF_GL_LOAD_AUDIT aud
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND aud.LAST_LOADID <> brl.LOADID
AND aud.STATUS = 'RUNNING'
) GL_LOAD_AUDIT_RUNNING_FLAG
FROM AIF_BAL_RULE_LOADS brl
WHERE brl.LOADID = 300
2013-12-31 22:44:57,904 DEBUG [AIF]:
INSERT INTO AIF_GL_LOAD_AUDIT ( LAST_LOADID
,DELTA_RUN_ID
,SOURCE_SYSTEM_ID
,SOURCE_LEDGER_ID
,GL_PERIOD_ID
,BALANCE_TYPE
,GL_EXTRACT_TYPE
,STATUS
) VALUES ( 300
,0
,3
,1
,135
,'A'
,'FULLREFRESH'
,'RUNNING'
2013-12-31 22:44:57,907 DEBUG [AIF]:
DELETE FROM AIF_EBS_GL_BALANCES_STG
WHERE SOURCE_SYSTEM_ID = 3
AND SOURCE_COA_ID = 101
AND SOURCE_LEDGER_ID = 1
AND ACTUAL_FLAG = 'A'
AND PERIOD_NAME = 'JAN-13'
2013-12-31 22:44:59,283 INFO [AIF]: LKM EBS/FS Extract Type - Load Audit AND Full Refresh - END
2013-12-31 22:45:06,507 INFO [AIF]: COMM End Process Detail - Update Process Detail - START
2013-12-31 22:45:06,514 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'SUCCESS'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 57408
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_IMP'
AND ENTITY_NAME = 'JAN-13'
2013-12-31 22:45:06,519 INFO [AIF]: COMM End Process Detail - Update Process Detail - END
2013-12-31 22:45:07,106 INFO [AIF]: EBS/FS Load Data - Load TDATASEG_T - START
2013-12-31 22:45:07,112 INFO [AIF]:
Import Data from Source for Period 'January 2013'
2013-12-31 22:45:07,115 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,brl.EXECUTION_MODE
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.CURRENCY_CODE
,br.INCL_ZERO_BALANCE_FLAG
,br.BAL_SEG_VALUE_OPTION_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
FROM AIF_BAL_RULE_LOADS brl
,AIF_TARGET_APPLICATIONS app
,AIF_BALANCE_RULES br
WHERE brl.LOADID = 300
AND app.APPLICATION_ID = brl.APPLICATION_ID
AND br.RULE_ID = brl.RULE_ID
2013-12-31 22:45:07,120 DEBUG [AIF]:
SELECT PERIODKEY
FROM TPOVPERIOD
WHERE PERIODDESC = 'January 2013'
2013-12-31 22:45:07,122 INFO [AIF]:
Import Data from Source for Ledger 'JWR Books'
2013-12-31 22:45:07,125 DEBUG [AIF]:
SELECT COA_SEGMENT_NAME
,ACCOUNT_TYPE_FLAG
,BALANCE_TYPE_FLAG
FROM AIF_COA_SEGMENTS
WHERE SOURCE_SYSTEM_ID = 3
AND SOURCE_COA_ID = '101'
AND (
ACCOUNT_TYPE_FLAG = 'Y'
OR BALANCE_TYPE_FLAG = 'Y'
2013-12-31 22:45:07,127 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 26
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 12
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-12-31 22:45:07,154 DEBUG [AIF]:
INSERT INTO TDATASEG_T (
LOADID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,VALID_FLAG
,CHANGESIGN
,CODE_COMBINATION_ID
,SOURCE_LEDGER_ID
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,YEAR
,PERIOD
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ACCOUNT
,ACCOUNTX
,ENTITY
,ENTITYX
,ICP
,ICPX
,UD1
,UD1X
,UD2
,UD2X
,UD3
,UD3X
,UD4
,UD4X
,DATAVIEW
,DATAKEY
,STAT_BALANCE_FLAG
,CURKEY
,AMOUNT_PTD
,AMOUNT_YTD
,AMOUNT
,AMOUNTX
SELECT pprd.PROCESS_ID LOADID
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,pprd.PERIODKEY
,'Y' VALID_FLAG
,0 CHANGESIGN
,ccid.CODE_COMBINATION_ID
,bal.SOURCE_LEDGER_ID
,pprd.GL_PERIOD_YEAR
,pprd.GL_PERIOD_NUM
,pprd.YEARTARGET YEAR
,pprd.PERIODTARGET PERIOD
,pprd.PROCESS_ID ATTR1
,bal.SOURCE_SYSTEM_ID ATTR2
,bal.SOURCE_LEDGER_ID ATTR3
,pprd.GL_PERIOD_YEAR ATTR4
,pprd.GL_PERIOD_NAME ATTR5
,bal.ACTUAL_FLAG ATTR6
,bal.BUDGET_VERSION_ID ATTR7
,bal.ENCUMBRANCE_TYPE_ID ATTR8
,ccid.ACCOUNT_TYPE ATTR9
,NULL ATTR10
,NULL ATTR11
,NULL ATTR12
,NULL ATTR13
,NULL ATTR14
,ccid.SEGMENT4 ACCOUNT
,NULL ACCOUNTX
,ccid.SEGMENT1 ENTITY
,NULL ENTITYX
,NULL ICP
,NULL ICPX
,ccid.SEGMENT5 UD1
,NULL UD1X
,ccid.SEGMENT3 UD2
,NULL UD2X
,ccid.SEGMENT6 UD3
,NULL UD3X
,ccid.SEGMENT2 UD4
,NULL UD4X
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN 'Periodic' ELSE 'YTD' END ) DATAVIEW
,TDATASEG_DATAKEY_S.NEXTVAL
,'N' STAT_BALANCE_FLAG
,bal.CURRENCY_CODE CURKEY
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) * ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR ) AMOUNT_PTD
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) * ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR ) AMOUNT_YTD
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) *
( CASE
WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
ELSE ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
END
AMOUNT
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) *
( CASE
WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
ELSE ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
END
AMOUNTX
FROM AIF_EBS_GL_BALANCES_STG_V bal
,AIF_EBS_GL_CCID_STG ccid
,AIF_PROCESS_PERIODS pprd
WHERE bal.SOURCE_SYSTEM_ID = 3
AND bal.SOURCE_LEDGER_ID = 1
AND bal.ACTUAL_FLAG = 'A'
AND ccid.SOURCE_SYSTEM_ID = bal.SOURCE_SYSTEM_ID
AND ccid.SOURCE_COA_ID = bal.SOURCE_COA_ID
AND ccid.CODE_COMBINATION_ID = bal.CODE_COMBINATION_ID
AND pprd.PROCESS_ID = 300
AND pprd.PERIODKEY = '2013-01-01'
AND pprd.SOURCE_LEDGER_ID = bal.SOURCE_LEDGER_ID
AND pprd.GL_PERIOD_NAME = bal.PERIOD_NAME
AND (
bal.BEGIN_BALANCE_DR <> 0
OR bal.BEGIN_BALANCE_CR <> 0
OR bal.PERIOD_NET_DR <> 0
OR bal.PERIOD_NET_CR <> 0
AND bal.CURRENCY_CODE <> 'STAT'
AND bal.TRANSLATED_FLAG IS NULL
2013-12-31 22:45:09,269 INFO [AIF]: Monetary Data Rows Imported from Source: 12590
2013-12-31 22:45:09,293 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_AUDIT (
LOADID
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,EXPORT_TO_TARGET_FLAG
SELECT DISTINCT PROCESS_ID LOADID
,'HFM' TARGET_APPLICATION_TYPE
,'FMTEST2' TARGET_APPLICATION_NAME
,NULL PLAN_TYPE
,SOURCE_LEDGER_ID
,YEARTARGET EPM_YEAR
,PERIODTARGET EPM_PERIOD
,'Y' SNAPSHOT_FLAG
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,PERIODKEY
,'N' EXPORT_TO_TARGET_FLAG
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
AND PERIODKEY = '2013-01-01'
AND SOURCE_LEDGER_ID = 1
2013-12-31 22:45:09,297 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_PRD_AUDIT (
LOADID
,SOURCE_LEDGER_ID
,GL_PERIOD_ID
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
)
SELECT DISTINCT pprd.PROCESS_ID LOADID
,pprd.SOURCE_LEDGER_ID
,pprd.PERIOD_ID GL_PERIOD_ID
,(SELECT MAX(gl.DELTA_RUN_ID)
FROM AIF_GL_LOAD_AUDIT gl
WHERE gl.SOURCE_SYSTEM_ID = 3
AND gl.SOURCE_LEDGER_ID = pprd.SOURCE_LEDGER_ID
AND gl.BALANCE_TYPE = 'A'
AND gl.GL_PERIOD_ID = pprd.PERIOD_ID
) DELTA_RUN_ID
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,pprd.PERIODKEY
FROM AIF_PROCESS_PERIODS pprd
WHERE pprd.PROCESS_ID = 300
AND pprd.PERIODKEY = '2013-01-01'
AND pprd.SOURCE_LEDGER_ID = 1
2013-12-31 22:45:09,302 INFO [AIF]:
Total Data Rows Imported from Source: 12590
2013-12-31 22:45:09,305 INFO [AIF]: EBS/FS Load Data - Load TDATASEG_T - END
2013-12-31 22:45:09,381 INFO [AIF]: COMM Update Data - Init DataLoadUtil - START
2013-12-31 22:45:09,385 INFO [AIF]: COMM Update Data - Init DataLoadUtil - END
2013-12-31 22:45:09,485 INFO [AIF]: COMM Update Data - Update TDATASEG_T/TDATASEGW - START
2013-12-31 22:45:09,491 DEBUG [AIF]:
DELETE FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND AMOUNT = 0
2013-12-31 22:45:09,559 WARN [AIF]:
Warning: Data rows with zero balances exist
2013-12-31 22:45:09,636 INFO [AIF]: Zero Balance Data Rows Deleted: 1879
2013-12-31 22:45:09,654 DEBUG [AIF]:
SELECT DIMNAME
,CASE WHEN RULE_ID IS NULL THEN 'N' ELSE 'Y' END RULE_MAP_FLAG
,SRCKEY
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,DATAKEY
,MAPPING_TYPE
FROM (
SELECT DISTINCT tdm.DIMNAME
,tdm.RULE_ID
,NULL SRCKEY
,NULL TARGKEY
,tdm.WHERECLAUSETYPE
,tdm.WHERECLAUSEVALUE
,NULL CHANGESIGN
,1 SEQUENCE
,COALESCE(tdm.SYSTEM_GENERATED_FLAG,'N') SYSTEM_GENERATED_FLAG
,NULL DATAKEY
,CASE
WHEN tdm.WHERECLAUSETYPE IS NULL THEN 1
ELSE 3
END MAPPING_TYPE
FROM TDATAMAP_T tdm
WHERE tdm.LOADID = 300
AND tdm.PARTITIONKEY = 12
AND tdm.TDATAMAPTYPE = 'ERP'
AND (tdm.RULE_ID IS NULL OR tdm.RULE_ID = 27)
AND tdm.WHERECLAUSETYPE IS NULL
UNION ALL
SELECT tdm.DIMNAME
,tdm.RULE_ID
,tdm.SRCKEY
,tdm.TARGKEY
,tdm.WHERECLAUSETYPE
,tdm.WHERECLAUSEVALUE
,tdm.CHANGESIGN
,CASE tpp.PARTSEQMAP
WHEN 0 THEN CASE
WHEN (tdm.WHERECLAUSETYPE = 'MULTIDIM') THEN 2
WHEN (tdm.WHERECLAUSETYPE = 'BETWEEN') THEN 3
WHEN (tdm.WHERECLAUSETYPE = 'LIKE') THEN 4
ELSE 0
END
ELSE tdm.SEQUENCE
END SEQUENCE
,COALESCE(tdm.SYSTEM_GENERATED_FLAG,'N') SYSTEM_GENERATED_FLAG
,tdm.DATAKEY
,CASE
WHEN tdm.WHERECLAUSETYPE IS NULL THEN 1
ELSE 3
END MAPPING_TYPE
FROM TDATAMAP_T tdm
INNER JOIN TPOVPARTITION tpp
ON tpp.PARTITIONKEY = tdm.PARTITIONKEY
WHERE tdm.LOADID = 300
AND tdm.PARTITIONKEY = 12
AND tdm.TDATAMAPTYPE = 'ERP'
AND (tdm.RULE_ID IS NULL OR tdm.RULE_ID = 27)
AND tdm.WHERECLAUSETYPE IN ('MULTIDIM','BETWEEN','LIKE')
) q
ORDER BY DIMNAME
,RULE_ID
,SEQUENCE
,SYSTEM_GENERATED_FLAG
,SRCKEY
2013-12-31 22:45:09,672 INFO [AIF]:
Processing Mappings for Column 'ACCOUNT'
2013-12-31 22:45:09,677 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ACCOUNTX = ACCOUNT
,ACCOUNTR = 121
,ACCOUNTF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ACCOUNTX IS NULL
AND (1=1)
2013-12-31 22:45:10,044 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,053 INFO [AIF]:
Processing Mappings for Column 'ENTITY'
2013-12-31 22:45:10,057 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ENTITYX = ENTITY
,ENTITYR = 122
,ENTITYF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ENTITYX IS NULL
AND (1=1)
2013-12-31 22:45:10,426 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,435 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ENTITYX = ENTITY
,ENTITYR = 132
,ENTITYF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ENTITYX IS NULL
AND (1=1)
2013-12-31 22:45:10,446 INFO [AIF]: Data Rows Updated by Location Mapping 'DEFAULT' (LIKE): 0
2013-12-31 22:45:10,448 INFO [AIF]:
Processing Mappings for Column 'ICP'
2013-12-31 22:45:10,452 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ICPX = '[ICP None]'
,ICPR = 130
,ICPF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[ICP None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ICPX IS NULL
AND (1=1)
2013-12-31 22:45:10,784 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,798 INFO [AIF]:
Processing Mappings for Column 'UD1'
2013-12-31 22:45:10,802 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD1X = '[None]'
,UD1R = 124
,UD1F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD1X IS NULL
AND (1=1)
2013-12-31 22:45:11,134 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,156 INFO [AIF]:
Processing Mappings for Column 'UD2'
2013-12-31 22:45:11,160 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD2X = '[None]'
,UD2R = 123
,UD2F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD2X IS NULL
AND (1=1)
2013-12-31 22:45:11,517 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,531 INFO [AIF]:
Processing Mappings for Column 'UD3'
2013-12-31 22:45:11,535 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD3X = '[None]'
,UD3R = 125
,UD3F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD3X IS NULL
AND (1=1)
2013-12-31 22:45:11,870 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,883 INFO [AIF]:
Processing Mappings for Column 'UD4'
2013-12-31 22:45:11,887 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD4X = '[None]'
,UD4R = 128
,UD4F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD4X IS NULL
AND (1=1)
2013-12-31 22:45:12,192 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:12,204 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ATTR14 = DATAKEY
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:12,739 DEBUG [AIF]:
UPDATE TDATASEG_T
SET VALID_FLAG = 'N'
WHERE 1=1
AND (
(1=0)
OR TDATASEG_T.ACCOUNTX IS NULL
OR TDATASEG_T.ENTITYX IS NULL
OR TDATASEG_T.ICPX IS NULL
OR TDATASEG_T.UD1X IS NULL
OR TDATASEG_T.UD2X IS NULL
OR TDATASEG_T.UD3X IS NULL
OR TDATASEG_T.UD4X IS NULL
)
AND LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND VALID_FLAG = 'Y'
2013-12-31 22:45:12,754 INFO [AIF]:
Total Data Rows available for Export to Target: 10711
2013-12-31 22:45:12,773 INFO [AIF]: COMM Update Data - Update TDATASEG_T/TDATASEGW - END
2013-12-31 22:45:12,808 INFO [AIF]: COMM End Process Detail - Update Process Detail - START
2013-12-31 22:45:12,823 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'SUCCESS'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 10711
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_TRANS'
AND ENTITY_NAME = 'January 2013'
2013-12-31 22:45:12,829 INFO [AIF]: COMM End Process Detail - Update Process Detail - END
2013-12-31 22:45:12,987 INFO [AIF]: COMM Update YTD Amounts - Update YTD Amounts - START
2013-12-31 22:45:12,993 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,pprd.YEARTARGET
,pprd.PERIODTARGET
,pprd.SOURCE_LEDGER_ID
FROM AIF_BAL_RULE_LOADS brl
,AIF_PROCESS_PERIODS pprd
WHERE brl.LOADID = 300
AND pprd.PROCESS_ID = brl.LOADID
GROUP BY brl.PARTITIONKEY
,brl.CATKEY
,pprd.YEARTARGET
,pprd.PERIODTARGET
,pprd.SOURCE_LEDGER_ID
HAVING COUNT(*) > 1
2013-12-31 22:45:12,995 INFO [AIF]: COMM Update YTD Amounts - Update YTD Amounts - END
2013-12-31 22:45:13,052 INFO [AIF]: COMM Load TDATAMAPSEG/TDATASEG - Load TDATAMAPSEG/TDATASEG - START
2013-12-31 22:45:13,057 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,brl.EXECUTION_MODE
FROM AIF_BAL_RULE_LOADS brl
WHERE brl.LOADID = 300
2013-12-31 22:45:13,059 DEBUG [AIF]:
SELECT PERIODKEY
FROM AIF_APPL_LOAD_AUDIT
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND RULE_ID = 27
ORDER BY PERIODKEY
2013-12-31 22:45:13,061 INFO [AIF]:
Processing Data for PeriodKey '2013-01-01'
2013-12-31 22:45:13,065 DEBUG [AIF]:
DELETE FROM TDATAMAPSEG
WHERE PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND (
TDATAMAPTYPE = 'ERP'
OR (
TDATAMAPTYPE = 'MULTIDIM'
AND EXISTS (
SELECT 1
FROM TDATAMAPSEG parent
WHERE parent.PARTITIONKEY = TDATAMAPSEG.PARTITIONKEY
AND parent.DATAKEY = TDATAMAPSEG.TARGKEY
AND parent.CATKEY = TDATAMAPSEG.CATKEY
AND parent.PERIODKEY = TDATAMAPSEG.PERIODKEY
AND parent.TDATAMAPTYPE = 'ERP'
)
)
)
2013-12-31 22:45:13,074 INFO [AIF]: Number of Rows deleted from TDATAMAPSEG: 8
2013-12-31 22:45:13,077 DEBUG [AIF]:
INSERT INTO TDATAMAPSEG (
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,DIMNAME
,SRCKEY
,SRCDESC
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,VBSCRIPT
,TDATAMAPTYPE
,SYSTEM_GENERATED_FLAG
)
SELECT DATAKEY
,PARTITIONKEY
,4
,'2013-01-01'
,DIMNAME
,SRCKEY
,SRCDESC
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,VBSCRIPT
,TDATAMAPTYPE
,SYSTEM_GENERATED_FLAG
FROM TDATAMAP_T
WHERE LOADID = 300
2013-12-31 22:45:13,081 INFO [AIF]: Number of Rows inserted into TDATAMAPSEG: 8
2013-12-31 22:45:13,083 DEBUG [AIF]:
DELETE FROM TDATASEG
WHERE LOADID < 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND RULE_ID = 27
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:15,659 INFO [AIF]: Number of Rows deleted from TDATASEG: 10711
2013-12-31 22:45:15,728 DEBUG [AIF]:
INSERT INTO TDATASEG (
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,CURKEY
,DATAVIEW
,CALCACCTTYPE
,CHANGESIGN
,JOURNALID
,AMOUNT
,AMOUNTX
,AMOUNT_PTD
,AMOUNT_YTD
,DESC1
,DESC2
,ACCOUNT
,ACCOUNTX
,ACCOUNTR
,ACCOUNTF
,ENTITY
,ENTITYX
,ENTITYR
,ENTITYF
,ICP
,ICPX
,ICPR
,ICPF
,UD1
,UD1X
,UD1R
,UD1F
,UD2
,UD2X
,UD2R
,UD2F
,UD3
,UD3X
,UD3R
,UD3F
,UD4
,UD4X
,UD4R
,UD4F
,UD5
,UD5X
,UD5R
,UD5F
,UD6
,UD6X
,UD6R
,UD6F
,UD7
,UD7X
,UD7R
,UD7F
,UD8
,UD8X
,UD8R
,UD8F
,UD9
,UD9X
,UD9R
,UD9F
,UD10
,UD10X
,UD10R
,UD10F
,UD11
,UD11X
,UD11R
,UD11F
,UD12
,UD12X
,UD12R
,UD12F
,UD13
,UD13X
,UD13R
,UD13F
,UD14
,UD14X
,UD14R
,UD14F
,UD15
,UD15X
,UD15R
,UD15F
,UD16
,UD16X
,UD16R
,UD16F
,UD17
,UD17X
,UD17R
,UD17F
,UD18
,UD18X
,UD18R
,UD18F
,UD19
,UD19X
,UD19R
,UD19F
,UD20
,UD20X
,UD20R
,UD20F
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ARCHIVEID
,HASMEMOITEM
,STATICDATAKEY
,LOADID
,RULE_ID,
CODE_COMBINATION_ID
,STAT_BALANCE_FLAG
,VALID_FLAG
)
SELECT
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,CURKEY
,DATAVIEW
,CALCACCTTYPE
,CHANGESIGN
,JOURNALID
,AMOUNT
,AMOUNTX
,AMOUNT_PTD
,AMOUNT_YTD
,DESC1
,DESC2
,ACCOUNT
,ACCOUNTX
,ACCOUNTR
,ACCOUNTF
,ENTITY
,ENTITYX
,ENTITYR
,ENTITYF
,ICP
,ICPX
,ICPR
,ICPF
,UD1
,UD1X
,UD1R
,UD1F
,UD2
,UD2X
,UD2R
,UD2F
,UD3
,UD3X
,UD3R
,UD3F
,UD4
,UD4X
,UD4R
,UD4F
,UD5
,UD5X
,UD5R
,UD5F
,UD6
,UD6X
,UD6R
,UD6F
,UD7
,UD7X
,UD7R
,UD7F
,UD8
,UD8X
,UD8R
,UD8F
,UD9
,UD9X
,UD9R
,UD9F
,UD10
,UD10X
,UD10R
,UD10F
,UD11
,UD11X
,UD11R
,UD11F
,UD12
,UD12X
,UD12R
,UD12F
,UD13
,UD13X
,UD13R
,UD13F
,UD14
,UD14X
,UD14R
,UD14F
,UD15
,UD15X
,UD15R
,UD15F
,UD16
,UD16X
,UD16R
,UD16F
,UD17
,UD17X
,UD17R
,UD17F
,UD18
,UD18X
,UD18R
,UD18F
,UD19
,UD19X
,UD19R
,UD19F
,UD20
,UD20X
,UD20R
,UD20F
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ARCHIVEID
,HASMEMOITEM
,STATICDATAKEY
,LOADID
,RULE_ID,
CODE_COMBINATION_ID
,STAT_BALANCE_FLAG
,VALID_FLAG
FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:16,838 INFO [AIF]: Number of Rows inserted into TDATASEG: 10711
2013-12-31 22:45:16,858 DEBUG [AIF]:
DELETE FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:17,123 INFO [AIF]: Number of Rows deleted from TDATASEG_T: 10711
2013-12-31 22:45:17,153 DEBUG [AIF]:
DELETE FROM TDATAMAP_T
WHERE LOADID = 300
2013-12-31 22:45:17,156 INFO [AIF]: Number of Rows deleted from TDATAMAP_T: 8
2013-12-31 22:45:17,161 INFO [AIF]: COMM Load TDATAMAPSEG/TDATASEG - Load TDATAMAPSEG/TDATASEG - END
2013-12-31 22:45:17,993 DEBUG [AIF]:
SELECT CASE app.METADATA_LOAD_METHOD
WHEN 'EPMA' THEN CASE dim.TARGET_DIMENSION_CLASS_NAME
WHEN 'Generic' THEN dim.TARGET_DIMENSION_NAME
ELSE dim.TARGET_DIMENSION_CLASS_NAME
END
ELSE dim.TARGET_DIMENSION_NAME
END TARGET_DIMENSION_NAME
,adim.BALANCE_COLUMN_NAME
FROM AIF_TARGET_APPLICATIONS app
,AIF_TARGET_APPL_DIMENSIONS adim
,AIF_DIMENSIONS dim
WHERE app.APPLICATION_ID = 26
AND adim.APPLICATION_ID = app.APPLICATION_ID
AND dim.DIMENSION_ID = adim.DIMENSION_ID
AND dim.TARGET_DIMENSION_CLASS_NAME IN ('Custom1','Custom2','Custom3','Custom4','Generic')
2013-12-31 22:45:17,997 DEBUG [AIF]:
SELECT SCENARIO "Scenario"
,YEAR "Year"
,PERIOD "Period"
,DATAVIEW "View"
,DATAVALUE "Value"
,ACCOUNT "Account"
,ENTITY "Entity"
,ICP "ICP"
,UD2 "Area"
,UD1 "Tail"
,UD3 "Special"
,UD4 "Facility"
,AMOUNT "DataValue"
FROM AIF_HS_BALANCES
WHERE LOADID = 300
2013-12-31 22:45:18,000 INFO [SimpleAsyncTaskExecutor-9]: ODI Hyperion Financial Management Adapter
2013-12-31 22:45:18,002 INFO [SimpleAsyncTaskExecutor-9]: Load task initialized.
2013-12-31 22:45:18,028 INFO [AIF]: LKM COMM Load Data into HFM - Load Data to HFM - START
2013-12-31 22:45:18,031 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
) VALUES (
300
,'PROCESS_BAL_EXP_HFM'
,NULL
,'FMTEST2'
,NULL
,NULL
,CURRENT_TIMESTAMP
,NULL
,NULL
,'RUNNING'
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
,CURRENT_TIMESTAMP
)
2013-12-31 22:45:18,034 INFO [SimpleAsyncTaskExecutor-9]: Connecting to Financial Management application [FMTEST2] on [10.150.20.40] using user-name [admin].
2013-12-31 22:45:18,155 INFO [SimpleAsyncTaskExecutor-9]: Connected to Financial Management application.
2013-12-31 22:45:18,157 INFO [SimpleAsyncTaskExecutor-9]: HFM Version: 11.1.2.2.300.
2013-12-31 22:45:18,160 INFO [SimpleAsyncTaskExecutor-9]: Options for the Financial Management load task are:
<Options>
<Option name=LOG_FILE_NAME value=C:\Windows\TEMP\/aif_501_300.log/>
<Option name=IMPORT_MODE value=Replace/>
<Option name=CONSOLIDATE_ONLY value=false/>
<Option name=CONSOLIDATE_PARAMETERS value=""/>
<Option name=LOG_ENABLED value=true/>
<Option name=ACCUMULATE_WITHIN_FILE value=false/>
<Option name=DEBUG_ENABLED value=true/>
<Option name=CONSOLIDATE_AFTER_LOAD value=false/>
<Option name=FILE_CONTAINS_SHARE_DATA value=false/>
</Options>
2013-12-31 22:45:18,168 INFO [SimpleAsyncTaskExecutor-9]: Load Options validated.
2013-12-31 22:45:18,176 ERROR [SimpleAsyncTaskExecutor-9]: Error occurred during load process ORA-00904: "DATAVALUE": invalid identifier
com.hyperion.odi.common.ODIHAppException: ORA-00904: "DATAVALUE": invalid identifier
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:216)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
at org.python.core.PyObject.__call__(PyObject.java:355)
at org.python.core.PyMethod.__call__(PyMethod.java:215)
at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
at org.python.core.PyMethod.__call__(PyMethod.java:206)
at org.python.core.PyObject.__call__(PyObject.java:397)
at org.python.core.PyObject.__call__(PyObject.java:401)
at org.python.pycode._pyx161.f$0(<string>:98)
at org.python.pycode._pyx161.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java:165)
at org.python.core.PyCode.call(PyCode.java:18)
at org.python.core.Py.runCode(Py.java:1204)
at org.python.core.Py.exec(Py.java:1248)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1889)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$2.doAction(StartScenRequestProcessor.java:580)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor.doProcessStartScenTask(StartScenRequestProcessor.java:513)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$StartScenTask.doExecute(StartScenRequestProcessor.java:1066)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:724)
Caused by: java.sql.SQLSyntaxErrorException: ORA-00904: "DATAVALUE": invalid identifier
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)
at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:202)
at oracle.jdbc.driver.T4CStatement.executeForDescribe(T4CStatement.java:942)
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1283)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1441)
at oracle.jdbc.driver.OracleStatement.executeQuery(OracleStatement.java:1690)
at oracle.jdbc.driver.OracleStatementWrapper.executeQuery(OracleStatementWrapper.java:446)
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:212)
... 38 more
2013-12-31 22:45:18,208 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'FAILED'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 0
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_EXP_HFM'
AND ENTITY_NAME = 'FMTEST2'
2013-12-31 22:45:18,210 FATAL [AIF]: Error in LKM COMM Load Data into HFM

The issue is that you are mapping "Data Value" to the amount column in the Target Application import format:
SELECT SCENARIO "Scenario"
,YEAR "Year"
,PERIOD "Period"
,DATAVIEW "View"
,DATAVALUE "Value"
,ACCOUNT "Account"
,ENTITY "Entity"
,ICP "ICP"
,UD2 "Area"
,UD1 "Tail"
,UD3 "Special"
,UD4 "Facility"
,AMOUNT "DataValue"
FROM AIF_HS_BALANCES
WHERE LOADID = 300
The ORA-00904 is raised because AIF_HS_BALANCES has no DATAVALUE column. You need to map AMOUNT to "AMOUNT" in the HFM application. Check that the dimension mapping is correct for each dimension class in the target application, and that your import format maps the source amount field to the proper target dimension (Amount) rather than to the Value dimension.
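For comparison, once the import format no longer sources the Value dimension from the non-existent DATAVALUE column, the generated extract should look roughly like this. This is only a sketch: the '<Entity Currency>' member is an assumption (use whatever Value member your application loads to), and everything else mirrors the query in the log above.

```sql
-- Hypothetical corrected extract query. The Value dimension comes from a
-- constant member ('<Entity Currency>' is an assumed example), and the
-- existing AMOUNT column feeds the HFM DataValue field.
SELECT SCENARIO             "Scenario"
      ,YEAR                 "Year"
      ,PERIOD               "Period"
      ,DATAVIEW             "View"
      ,'<Entity Currency>'  "Value"
      ,ACCOUNT              "Account"
      ,ENTITY               "Entity"
      ,ICP                  "ICP"
      ,UD2                  "Area"
      ,UD1                  "Tail"
      ,UD3                  "Special"
      ,UD4                  "Facility"
      ,AMOUNT               "DataValue"
  FROM AIF_HS_BALANCES
 WHERE LOADID = 300
```

With the Value dimension supplied as a constant, every selected column exists in AIF_HS_BALANCES and the ORA-00904 disappears.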