Error Loading Rate Data through Data Manager Package
Hi all,
I wanted to load the Rate data, so I uploaded my Excel file to the server, and the upload was successful. Even when I preview the data file it shows me the first 200 records, but when I run the Data Manager package I see the following error in the status.
TOTAL STEPS 2
1. Convert Data: Failed in 0 sec.
[Selection]
FILE=\STUDENT\RATE\DataManager\DataFiles\My Files\Rate.xls
TRANSFORMATION=\STUDENT\RATE\DATAMANAGER\TRANSFORMATIONFILES\SYSTEM FILES\IMPORT.XLS
CLEARDATA= No
RUNLOGIC= Yes
CHECKLCK= No
[Messages]
The data file is empty. Please check the data file and try again.
My transformation file is as follows:
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER =
SKIP = 1
SKIPIF =
CREDITNEGATIVE=NO
CONVERTAMOUNTWDIM=
MAXREJECTCOUNT=
VALIDATERECORDS=YES
*MAPPING
*CONVERSION
Please help.
Diksha Chopra.
Edited by: Diksha Chopra on Apr 10, 2010 12:45 AM
Dear Diksha Chopra,
I think there is something wrong with the *MAPPING section of your transformation file, which is why your import process failed. Please follow these directions:
1. Open the transformation file which you used for the import process.
2. Copy and paste this *MAPPING configuration into your transformation file:
*MAPPING
Category=*COL(1)
InputCurrency=*COL(2)
Rate=*COL(3)
RateSrc=*COL(4)
Time=*COL(5)
Amount=*COL(6)
3. Validate and save the transformation file.
4. Make sure your data file has a header row, with the columns in this order: Category,InputCurrency,Rate,RateSrc,Time,Amount
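For reference, a comma-delimited data file matching that mapping might look like the following sketch (all member names and values are purely illustrative; note that for a comma-separated file the blank DELIMITER line in the *OPTIONS section would also need to be set to a comma):

```text
Category,InputCurrency,Rate,RateSrc,Time,Amount
ACTUAL,USD,AVG,RATECALC,2010.APR,1
ACTUAL,EUR,AVG,RATECALC,2010.APR,0.74
ACTUAL,EUR,END,RATECALC,2010.APR,0.75
```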
Kind Regards,
Wandi Sutandi
Similar Messages
-
Hello,
EBS version : 11.5.10.2
DB version : 11.2.0.3
OS version : AIX 6.1
As part of the 11.5.10.2 to R12.1.1 upgrade, while applying the merged 12.1.1 upgrade driver (u6678700.drv), we got the error below:
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ATTENTION: All workers either have failed or are waiting:
FAILED: file glsupdas.ldt on worker 3.
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 adwork003.log
Restarting job that failed and was fixed.
Time when worker restarted job: Wed Aug 07 2013 10:36:14
Loading data using FNDLOAD function.
FNDLOAD APPS/***** 0 Y UPLOAD @SQLGL:patch/115/import/glnlsdas.lct @SQLGL:patch/115/import/US/glsupdas.ldt -
Connecting to APPS......Connected successfully.
Calling FNDLOAD function.
Returned from FNDLOAD function.
Log file: /fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log/US_glsupdas_ldt.log
Error calling FNDLOAD function.
Time when worker failed: Wed Aug 07 2013 10:36:14
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 US_glsupdas_ldt.log
Current system time is Wed Aug 7 10:36:14 2013
Uploading from the data file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt
Altering database NLS_LANGUAGE environment to AMERICAN
Dumping from LCT/LDT files (/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0), /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt) to staging tables
Dumping LCT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0) into FND_SEED_STAGE_CONFIG
Dumping LDT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt into FND_SEED_STAGE_ENTITY
Dumped the batch (GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS , GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS ) into FND_SEED_STAGE_ENTITY
Uploading from staging tables
Error loading seed data for GL_DEFAS_ACCESS_SETS: DEFINITION_ACCESS_SET = SUPER_USER_DEFAS, ORA-06508: PL/SQL: could not find program unit being called
Concurrent request completed
Current system time is Wed Aug 7 10:36:14 2013
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Below is some info about file versions and INVALID packages related to GL.
PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG is invalid with error component 'GL_DEFAS_DBNAME_S' must be declared.
I can see GL_DEFAS_DBNAME_S is a VALID sequence accessible by apps user with or without specifying GL as owner.
SQL> select text from dba_source where name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG') and line=2;
TEXT
/* $Header: glistdds.pls 120.4 2005/05/05 01:23:16 kvora ship $ */
/* $Header: glistddb.pls 120.16 2006/04/10 21:28:48 cma ship $ */
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
SQL> select * from all_objects where object_name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG');
OWNER OBJECT_NAME SUBOBJECT_NAM OBJECT_ID DATA_OBJECT_ID OBJECT_TYPE CREATED LAST_DDL_TIME TIMESTAMP STATUS T G S NAMESPACE EDITION_NAME
APPS GL_DEFAS_ACCESS_DETAILS_PKG 1118545 PACKAGE 05-AUG-13 05-AUG-13 2013-08-05:18:54:51 VALID N N N 1
APPS GL_DEFAS_ACCESS_SETS_PKG 1118548 PACKAGE 05-AUG-13 06-AUG-13 2013-08-05:18:54:51 VALID N N N 1
APPS GL_DEFAS_ACCESS_SETS_PKG 1128507 PACKAGE BODY 05-AUG-13 06-AUG-13 2013-08-06:12:56:50 INVALID N N N 2
APPS GL_DEFAS_ACCESS_DETAILS_PKG 1128508 PACKAGE BODY 05-AUG-13 05-AUG-13 2013-08-05:19:43:51 VALID N N N 2
SQL> select * from all_objects where object_name='GL_DEFAS_DBNAME_S';
OWNER OBJECT_NAME SUBOBJECT_NAM OBJECT_ID DATA_OBJECT_ID OBJECT_TYPE CREATED LAST_DDL_TIME TIMESTAMP STATUS T G S NAMESPACE EDITION_NAME
GL GL_DEFAS_DBNAME_S 1087285 SEQUENCE 05-AUG-13 05-AUG-13 2013-08-05:17:34:43 VALID N N N 1
APPS GL_DEFAS_DBNAME_S 1087299 SYNONYM 05-AUG-13 05-AUG-13 2013-08-05:17:34:43 VALID N N N 1
SQL> conn apps/apps
Connected.
SQL> SELECT OWNER, OBJECT_NAME, OBJECT_TYPE, STATUS
FROM DBA_OBJECTS
WHERE OBJECT_NAME = 'GL_DEFAS_ACCESS_SETS_PKG';
OWNER OBJECT_NAME OBJECT_TYPE STATUS
APPS GL_DEFAS_ACCESS_SETS_PKG PACKAGE VALID
APPS GL_DEFAS_ACCESS_SETS_PKG PACKAGE BODY INVALID
SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE;
Warning: Package altered with compilation errors.
SQL> show error
No errors.
SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE BODY;
Warning: Package Body altered with compilation errors.
SQL> show error
Errors for PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG:
LINE/COL ERROR
39/17 PLS-00302: component 'GL_DEFAS_DBNAME_S' must be declared
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>cat $GL_TOP/patch/115/sql/glistdab.pls|grep -n GL_DEFAS_DBNAME_S
68: SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL
81: fnd_message.set_token('SEQUENCE', 'GL_DEFAS_DBNAME_S');
SQL> show user
USER is "APPS"
SQL> SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL FROM dual; -- with GL.
NEXTVAL
1002
SQL> SELECT GL_DEFAS_DBNAME_S.NEXTVAL from dual; -- without GL., i.e. using the synonym
NEXTVAL
1003
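A diagnostic worth trying at this point: when GL.GL_DEFAS_DBNAME_S resolves fine in plain SQL (as above) but PL/SQL compilation raises PLS-00302, the prefix GL is sometimes being captured by another object named GL during PL/SQL name resolution, rather than resolving to the schema. A sketch of a query to check for such shadowing (the list of object types is illustrative):

```sql
-- Look for any object named GL that could shadow the schema name
-- during PL/SQL name resolution.
SELECT owner, object_name, object_type, status
  FROM dba_objects
 WHERE object_name = 'GL'
   AND object_type IN ('PACKAGE', 'SYNONYM', 'FUNCTION', 'PROCEDURE', 'TYPE');
```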
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdab.pls|grep '$Header'
REM | $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ |
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdas.pls |grep '$Header'
REM | $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ |
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */
-
Cfreport & "Error loading image data"
i'm using cf report builder.
I insert in report an image linked to a url
If i execute the report i receive this error
Report data binding error Error loading image data :
http://DOMAIN/PATH/0701.jpg.
the image exists: if i copy the url to my browser, it appears
If i replace link with filesystem path, i receive the same
error
this is the error exception
"Error","jrpp-7","01/29/07","21:13:20","marcappelli_frontend","Report
data binding error Error loading image data :
http://venusia/lavoro/marsportcappelli_it/html/catalogo/2007/0701.jpg.Error
loading byte data :
http://venusia/lavoro/marsportcappelli_it/html/catalogo/2007/0701.jpg
The specific sequence of files included or processed is:
/home/htdocs/clienti/marsportcappelli_it/html/_script/catalogo2007.cfm
coldfusion.runtime.report.Report$ReportDataBindingException:
Report data binding error Error loading image data :
http://venusia/lavoro/marsportcappelli_it/html/catalogo/2007/0701.jpg.
at
coldfusion.runtime.report.Report.runReport(Report.java:303)
at
coldfusion.tagext.lang.ReportTag.doEndTag(ReportTag.java:467)
at
coldfusion.runtime.CfJspPage._emptyTag(CfJspPage.java:1909)
at
cfcatalogo20072ecfm389367973.runPage(/home/htdocs/clienti/marsportcappelli_it/html/_scrip t/catalogo2007.cfm:13)
at coldfusion.runtime.CfJspPage.invoke(CfJspPage.java:152)
at
coldfusion.tagext.lang.IncludeTag.doStartTag(IncludeTag.java:343)
at
coldfusion.filter.CfincludeFilter.invoke(CfincludeFilter.java:65)
at
coldfusion.filter.ApplicationFilter.invoke(ApplicationFilter.java:210)
at
coldfusion.filter.RequestMonitorFilter.invoke(RequestMonitorFilter.java:51)
at coldfusion.filter.PathFilter.invoke(PathFilter.java:86)
at
coldfusion.filter.ExceptionFilter.invoke(ExceptionFilter.java:50)
at
coldfusion.filter.BrowserDebugFilter.invoke(BrowserDebugFilter.java:52)
at
coldfusion.filter.ClientScopePersistenceFilter.invoke(ClientScopePersistenceFilter.java:2 8)
at
coldfusion.filter.BrowserFilter.invoke(BrowserFilter.java:38)
at
coldfusion.filter.GlobalsFilter.invoke(GlobalsFilter.java:38)
at
coldfusion.filter.DatasourceFilter.invoke(DatasourceFilter.java:22)
at coldfusion.CfmServlet.service(CfmServlet.java:105)
at
coldfusion.bootstrap.BootstrapServlet.service(BootstrapServlet.java:78)
at
jrun.servlet.ServletInvoker.invoke(ServletInvoker.java:91)
at
jrun.servlet.JRunInvokerChain.invokeNext(JRunInvokerChain.java:42)
at
jrun.servlet.JRunRequestDispatcher.invoke(JRunRequestDispatcher.java:257)
at
jrun.servlet.ServletEngineService.dispatch(ServletEngineService.java:527)
at
jrun.servlet.jrpp.JRunProxyService.invokeRunnable(JRunProxyService.java:204)
at
jrunx.scheduler.ThreadPool$DownstreamMetrics.invokeRunnable(ThreadPool.java:349)
at
jrunx.scheduler.ThreadPool$ThreadThrottle.invokeRunnable(ThreadPool.java:457)
at
jrunx.scheduler.ThreadPool$UpstreamMetrics.invokeRunnable(ThreadPool.java:295)
at jrunx.scheduler.WorkerThread.run(WorkerThread.java:66)
any ideas?
any help is very appreciates!
thanks
Rob
I think the new error you saw was due to the cross-schema issue on 10.1.0.2/10.1.0.3 (as described in another thread on creating pyramids).
You should connect as SYSTEM to grant the permissions. With 10.1.0.2/10.1.0.3, importFrom should be called from the same schema where the GeoRaster tables are defined. (Upgrade to 10.1.0.4 or higher if you want to perform cross-schema operations.)
Other than the connection part, your scripts looked OK. Please make sure that the file with the specified file name does exist and is on the same machine where the database is running. Note that even if the grant_permission call is executed successfully, it does not necessarily mean the following file operations will always succeed. For example, grant_permission can be called successfully on a file that does not even exist. If a file with the specific name is later created, no permissions are granted on the new file. The case with the schema 'GIS' is similar. You need to make sure that the schema 'GIS' and the file do exist before the grant_permission call; and grant_permission can be called repeatedly, if necessary.
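For reference, a typical grant_permission call of the kind discussed above looks something like this (run as SYSTEM; the schema name and file path are illustrative):

```sql
BEGIN
  -- Grant the GIS schema read access to one specific file.
  -- The file must already exist: granting on a non-existent path
  -- succeeds but protects nothing, as noted above.
  dbms_java.grant_permission(
    'GIS',
    'SYS:java.io.FilePermission',
    '/export/data/georaster/image.tif',
    'read');
END;
/
```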
Hope this helps.
-
Shared Services 9.3.1 error - "...error loading tree data"
We have recently upgraded to version 9.3.1 and have been using it for several months now.
This past week, when logging in to Shared Services, we are suddenly experiencing a pop up with an error that says "There was a communication error while loading tree data".
Has anyone seen this error, and why would it suddenly start appearing when everything has worked for months and no changes have been applied?
Thanks for any advice, I will post screens and some sections from log files soon...
We are also facing the issue while accessing the Hyperion Planning 9.3.1 application from Workspace, but it allows us to access it through the Planning URL. The Workspace classic administration for Planning works perfectly fine. Let me know in case you get any solution.
-
Error loading the data into ODS - Message no. BRAIN060?
Hi,
I am getting the following error while loading data from a flat file. The data loaded successfully from the flat file to the PSA, but I got this error while updating the data from the PSA to the data target:
Value '010384 javablue' (hex. '30003100300033003800340020006A0061007600610062006C') of characteristic 0PO_NUMBER contains invalid characters
Message no. BRAIN060
Diagnosis
The following standard characters are valid in characteristic values as default:
!"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ
Characteristic values are not allowed if they only consist of the character "#" or begin with "!". If the characteristic is compounded, this also applies to each partial key.
You are trying to load the invalid characteristic value 1. (hexadecimal representation 30003100300033003800340020006A0061007600610062006C).
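The validity check described in the diagnosis can be sketched in a few lines of Python (the permitted set is taken from the diagnosis text, with the space character added as an assumption, since BW allows it by default):

```python
# Default permitted characters as listed in the BRAIN060 diagnosis,
# with the space character added (an assumption).
ALLOWED = set(" !\"%&'()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ")

def invalid_chars(value):
    """Return the characters in `value` that BW would reject."""
    return sorted({c for c in value if c not in ALLOWED})

# The lower-case letters in 'javablue' are what trigger the error;
# digits and the space pass the check.
print(invalid_chars('010384 javablue'))
```

Permitting a value like this would mean adding every lower-case letter to the allowed set (via RSKC), which is where the 72-character limit discussed below becomes a problem.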
I am trying to load Value '010384 javablue' to "0PO_NUMBER" info object for ODS with in one row with some other data.
Any idea or any input to resolve this issue?
Thanks in advance for any input.
Steve
Thanks Soumya. I have maintained the upper-case letters, but I am loading mixed upper and lower case for the PO number, and it is not working. What is the solution to this? If I set the lowercase property on the PO number InfoObject, it won't accept upper case; if I uncheck lowercase, it won't accept lower-case letters. I cannot add both upper- and lower-case letters in RSKC, because it accepts up to 72 characters and I already have more than 60 (special characters, numbers, and 26 upper-case letters).
I have already tried a transfer routine, but that can only convert to all lower or all upper case, which doesn't work. We need both upper and lower case for the PO number, and R/3 accepts it. Why doesn't BW accept both?
Any idea what can be done?
Thanks in advance for your help.
Steve
-
Error 'Loading Textfile data into external table'
I am using Apex 2.0, Oracle Database 10g, and Internet Explorer (version 6.5).
I tried to run the following code in SQL Commands in SQL Workshop of Apex.
Code Here:
declare
ddl1 varchar2(200);
ddl2 varchar2(4000);
begin
ddl1 := 'create or replace directory data_dir as
''C:\LAUSD_DATA\''';
execute immediate ddl1;
ddl2:= 'create table tbl_temp_autoload
(ID NUMBER,
RECTYPE VARCHAR2(2),
EXPORTNBR NUMBER(2),
EXPORTDATE DATE,
BIMAGEID VARCHAR2(11),
APPLICNBR NUMBER(10),
MEALIMAGE VARCHAR2(17),
MEDIIMAGE VARCHAR2(17),
LANGUAGE VARCHAR2(1),
APPLICCNT NUMBER(1),
OTHAPPLICNBR VARCHAR2(10),
PEDETDATE DATE,
PESTATUS VARCHAR2(1),
ERRORREMARKS VARCHAR2(50),
COMMENTS VARCHAR2(50),
SID1 VARCHAR2(10),
WID1 NUMBER(10),
MEDIROW1 VARCHAR2(1),
LASTNAME1 VARCHAR2(20),
FIRSTNAME1 VARCHAR2(20),
SCHOOL1 VARCHAR2(15),
LOCCD1 VARCHAR2(4),
BIRTHDATE1 DATE,
CASENBR1 VARCHAR2(15),
FOSTER1 VARCHAR2(1),
CHILDINC1 VARCHAR2(4),
RACEAIAN VARCHAR2(1),
RACEBAA VARCHAR2(1),
RACENHPI VARCHAR2(1),
RACEASIAN VARCHAR2(1),
RACEWHITE VARCHAR2(1),
HISPANIC VARCHAR2(1),
NOTHISPANIC VARCHAR2(1),
ADULTSIG3 VARCHAR2(1),
ADULTSIGDATE3 VARCHAR2(10),
ADULTNAME3 VARCHAR2(30),
SSN3 VARCHAR2(1),
SSN3NOT VARCHAR2(1),
ADDRESS VARCHAR2(30),
APTNBR VARCHAR2(6),
CTY VARCHAR2(20),
ZIP NUMBER(5),
PHONEHOME VARCHAR2(12),
PHONEWORK VARCHAR2(12),
PHONEEXTEN VARCHAR2(6),
MEALROWA VARCHAR2(1),
LNAMEA VARCHAR2(20),
FNAMEA VARCHAR2(20),
MINITA VARCHAR2(6),
GENDERA VARCHAR2(1),
AGEA NUMBER(2),
MOTYPEA VARCHAR2(1),
MONAMEA VARCHAR2(1),
FATYPEA VARCHAR2(1),
FANAMEA VARCHAR2(1),
FAMNBR NUMBER(1),
PARENTINC NUMBER(5),
FAMILYINC NUMBER(5),
FAMILYSIZE NUMBER(2),
PGSIG1 VARCHAR2(1),
PGNAME1 VARCHAR2(30),
PGSIGDATE1 VARCHAR2(10),
FAMSIZE1 VARCHAR2(2),
FAMINC1 VARCHAR2(6),
PGSIG2 VARCHAR2(1),
PGNAME2 VARCHAR2(30),
PGSIGDATE2 DATE,
FAMSIZE2 VARCHAR2(2),
FAMINC2 VARCHAR2(4),
GRADE NUMBER(2),
SCHOOL VARCHAR2(40),
RACE VARCHAR2(4),
MEDI_CALID VARCHAR2(15),
MEDSTATUS VARCHAR2(1),
ADULT1NAME VARCHAR2(40),
ADULT1TYPE VARCHAR2(1),
ADULT1INC1 VARCHAR2(5),
ADULT1INC2 VARCHAR2(5),
ADULT1INC3 VARCHAR2(5),
ADULT1INC4 VARCHAR2(5),
ADULT2NAME VARCHAR2(40),
ADULT2TYPE VARCHAR2(1),
ADULT2INC1 VARCHAR2(5),
ADULT2INC2 VARCHAR2(5),
ADULT2INC3 VARCHAR2(5),
ADULT2INC4 VARCHAR2(5),
ADULT3NAME VARCHAR2(40),
ADULT3TYPE VARCHAR2(1),
ADULT3INC1 VARCHAR2(5),
ADULT3INC2 VARCHAR2(5),
ADULT3INC3 VARCHAR2(5),
ADULT3INC4 VARCHAR2(5),
ADULT4NAME VARCHAR2(40),
ADULT4TYPE VARCHAR2(1),
ADULT4INC1 VARCHAR2(5),
ADULT4INC2 VARCHAR2(5),
ADULT4INC3 VARCHAR2(5),
ADULT4INC4 VARCHAR2(5),
ADULT5NAME VARCHAR2(40),
ADULT5TYPE VARCHAR2(1),
ADULT5INC1 VARCHAR2(5),
ADULT5INC2 VARCHAR2(5),
ADULT5INC3 VARCHAR2(5),
ADULT5INC4 VARCHAR2(5),
ADULT6NAME VARCHAR2(40),
ADULT6TYPE VARCHAR2(1),
ADULT6INC1 VARCHAR2(5),
ADULT6INC2 VARCHAR2(5),
ADULT6INC3 VARCHAR2(5),
ADULT6INC4 VARCHAR2(5),
AGE1LT19 VARCHAR2(1),
AGE2LT19 VARCHAR2(1),
AGE3LT19 VARCHAR2(1),
AGE4LT19 VARCHAR2(1),
AGE5LT19 VARCHAR2(1),
AGE6LT19 VARCHAR2(1),
MIDINIT1 VARCHAR2(1))
organization external
( type oracle_loader
default directory data_dir
access parameters
( fields terminated by '','' )
location (''DataJuly07.txt'')
)';
execute immediate ddl2;
end;
Error Received:
ORA-29913:error in executing ODCIEXITTABLEFETCH callout ORA-30653: reject limit reached
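A note on ORA-30653: an external table's default REJECT LIMIT is 0, so the first row that fails conversion (for example, a DATE column whose text doesn't match the session date format) aborts the fetch with this error. A sketch of the ORGANIZATION EXTERNAL clause with bad/log files and an unlimited reject limit (the file names are illustrative; the column list is elided):

```sql
organization external
( type oracle_loader
  default directory data_dir
  access parameters
  ( records delimited by newline
    badfile data_dir:'DataJuly07.bad'   -- rejected rows land here
    logfile data_dir:'DataJuly07.log'   -- rejection reasons land here
    fields terminated by ','
    missing field values are null
  )
  location ('DataJuly07.txt')
)
reject limit unlimited
```

Checking the log file then shows which column and row caused the rejections.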
Please help ASAP. I am new to Oracle and Apex. Any help will be appreciated greatly.
I downloaded the External Table simple application from the Apex packaged applications and installed it. It installed successfully, but when I run it, it doesn't load the data even though I get a message stating it was successful. I am running it on Oracle XE with Apex 3.0.1.
In addition, I tried running the stored procedure (ext_employees_load) directly in SQL Developer and received ORA-30648.
Please help.
thanks,
Tom
-
Error Loading Transactional Data into Cube(0PCA_C01)
Hi Guys,
I am trying to install the following cubes from Business Content: 0PCA_C01 / 0PCA_C02 (Profit Center Analysis). Everything got replicated. I am trying to load transaction data now. I created an InfoPackage and loaded the data. It has been running for a long time and still says "not yet completed/warning". If I try to see the content of the cube, I get the following errors.
"Your user master record is not sufficiently maintained for object Authorization Object3.
System error: RSDRC / FORM AUTHORITY_CHECK USER NOT AUTHORIZED 0PCA_C01 0PCA_C01
System error: RSDRC / FUNC RSDRC_BASIC_CUBE_DATA_GET ERROR IN RSDRC_BASIC_QUERY_DATA_GET 0PCA_C01 64
System error: RSDRC / FORM DATA_GET ERROR IN RSDRC_BASIC_CUBE_DATA_GET 0PCA_C01 64"
Also, if I try to change something in the InfoPackage, it says "Init. select. for field name currently running in". I guess it's because the job is still running.
Please let me know if i missed something.
Raj
Hi Raj,
This seems to be an authorization case.
I guess you are in the BW Dev system.
Go to SU01, enter your user ID, and display it. Go to the Roles and Profiles tabs.
Ideally, in development, a developer's ID should have access to all development activities. So check with the Basis folks and get access to the relevant profiles. If you can get SAP_ALL, then perfect!
Prakash
Assigning points is a way of saying thanks on SDN!
Error loading the data into cube
Hi All,
When I am loading data into the ASO cube it gives a warning, but I did everything correctly. Yesterday it loaded correctly, but today it is not loading: it gives the warning "Aggregate storage applications ignore update to derived cells", even if I clear the data. What is this error?
So please, can anyone help?
Reading Rule SQL Information For Database accdb
Reading Rules From Rule Object For Database accdb
Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
Aggregate storage applications ignore update to derived cells. 29 cells skipped
Data Load Elapsed Time with dataldcu.rul : 0.094 seconds
Database import completed
Output columns prepared: [0]
Thanks
Ram
In ASO cubes you can only load data at level 0 (the leaf level). Every upload to an aggregated dimension member is ignored. The warning message "ignore update to derived cells" means that you are trying to load data at a node level. As this is not possible with ASO cubes, Essbase ignores that data.
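The behavior described in this reply can be illustrated with a small sketch (the outline and data are hypothetical; this just mimics how an ASO load treats upper-level members):

```python
# Hypothetical child -> parent outline. Level-0 (leaf) members are the
# ones that never appear as a parent.
OUTLINE = {"East": "Market", "West": "Market", "Market": "Total"}

def aso_load(records):
    """Mimic an ASO data load: keep leaf-level cells, skip derived ones."""
    parents = set(OUTLINE.values())
    loaded, skipped = [], 0
    for member, value in records:
        if member in parents:      # derived (aggregated) cell
            skipped += 1           # ASO ignores the update
        else:
            loaded.append((member, value))
    return loaded, skipped

# "Market" is derived, so that cell is skipped; "East" and "West" load.
loaded, skipped = aso_load([("East", 10), ("Market", 99), ("West", 5)])
```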
-
Error loading Master Data ARFCSTATE = SYSFAIL.
Hi All,
When I am trying to load master data, one data package is not getting loaded. From R/3 the job completes, but in BW it is still yellow. When I checked the job log in R/3, it shows ARFCSTATE = SYSFAIL.
This indicates that the data package has a problem on the R/3 side itself.
Can anyone help me in solving this issue.
Regards
Bhanu
Hi,
I suggest you check a few places where you can see the status.
1) SYSFAIL in SM37 generally means that the packet was not sent to the BW side, but this may not be true in all cases. Rarely, we have come across situations where you see SYSFAIL for a particular package in R/3 but find the packet updated into the PSA. Check this in RSMO > Details.
In case you don't find that packet in the BW PSA, try this:
Go to R/3 > SM37 > go to the data packet with SYSFAIL. Adjacent to that SYSFAIL message you will see a TID number.
Copy that, go to SM58, and search for it among the pending tRFCs. There will be one with the same TID (transaction ID). Execute it. The packet may get transferred to BW.
If that doesn't happen, you may have to retrigger the load.
Otherwise, once you identify the error you can rectify it.
If all the records are in the PSA you can pull them from the PSA to the target; otherwise you may have to pull them again from the source InfoProvider.
Thanks,
JituK -
Error loading master data RSDMD114
Attempting to load master data - several objects fail with message:
RSDMD114, "The Master Data is deleted in the background"
I went as far as to delete all of the old data and start again.
The object was reactivated.
No locks found on the lock tables.
Still getting the error.
Anyone seen this before ?
THANKS
The problem has been solved.
The data deletion job failed - so the control table was still set to show that the deletion run was in progress.
Table RSDCHABASDEL, Status Logging for Master Data Deletion.
Delete the entry from this table, look for the name of the object in question.
Everything ran OK with the entry cleared from the table. -
BPC75NW: Error Loading Master data for BPC Dimensions from BW InfoObject
Hello Gurus,
I'm trying to load master data for BPC Dimensions from BW Infoobjects.
The ID used in BW is 32 characters long. When I run the load, the ID is truncated after 20 characters and the records are reported as duplicates.
Due to this, the load is failing.
I cannot use any other ID as the texts will not be loaded in that case.
Is there any work around to handle this?
I cannot load the transaction data either.
I looked at some posts and blogs in sdn, but nothing really helped.
Cube - 0RPM_c05 (Financial Planning cube in SAP PPM)
Version: BPC 7.5 NW SP5
Thanks,
Vasu
Thanks everyone.
" you can write a transformation file and give a new name to those IDs who have values more than 20 characters."
Poonam - Could you explain more?
I tried using an additional dimension property to hold the 32-char ID, but I cannot access it in the transformation.
Is there a way to refer to the dimension property in the transformation file or in the UJD_ROUTINE BADI implementation?
Thanks,
Vasu -
Error loading essbase data into flat files through DIM
Hi All,
When running the session in DIM 8.1.1 that loads data from Hyperion Essbase (11.1.1.3) to a flat file, it fails.
The log file shows it fails to load the HASReader plugins.
All the environmental variables are properly set.
Still I am facing this issue.
Could anyone suggest what I am missing?
Thanks,
Hi Avs sai,
By default, the columns in the destination flat file will be non-Unicode columns, i.e. the data type of the columns will be DT_STR. If you want to keep the original DT_WSTR data type of the input column when outputting to the destination file, you can check the Unicode option on the "Choose a Destination" page of the SQL Server Import and Export Wizard. Then, on the "Configure Flat File Destination" page, you can click the "Edit Mappings…" button to check the data types.
Regards,
Mike Yin
TechNet Community Support -
Error Loading XML Data into Template Builder
Hi,
I am using XMLP 5.6.2 with E-business Suite. I copied a seeded Oracle Apps report and changed the output from Text to XML. I executed the report and it generated XML Output. I saved the output file as an XML file using IE.
When I try to load the XML Data file in MS Word, I am getting:
'Error No :-1072896682: Invalid at the top level of the document.
Line No : 1
Position : 1
File Pos: 0
Source: Authentication failed
I never had this problem before using the older version of XMLP (I think it was 5.0).
Does anyone have an idea how to fix this?
Thanks,
Steve
Try using "view source" within IE to get the XML into a text file (Notepad) and save it from there. Then it should load fine. When you save it directly from the browser window there is extra formatting that prevents it from being loaded in Template Builder.
-
One of the master data variables (0CUSTOMER) is unable to load data.
RSMO has status red; "data is loaded with errors" is the message.
The error message is as follows..
Error records written to application log
error 4 in the update.
i have planned to delete the request number, but i could not find
the same for the master data variable.
Plz suggest
Regards
Hi,
the error message is basically
@5C@ 08/07/2007 03:06:18 Value 'PRODUÇÃO D ' for characteristic 0SORTL contains invalid characters RRSV 7
I have checked in RSKC and executed the same, and reloaded the data, but the load is still red, with records 5167 from 5167 in RSMO for the master data variable.
Rgs
-
Error loading master data attr and text for 0material
Hi gurus,
I am new to BI. I am getting error RSDMD-194 when I am loading 0MATERIAL_ATTR and TEXT.
It is showing for nearly 100 records in the error stack.
Can anyone explain some basics,
1) why do we need to add 0MATERIAL to a particular Info area to start loading its ATTR and TEXT
2) Are the info objects that appear in the attributes tab of 0MATERIAL the same as the fields of 0MATERIAL_ATTR datasource, basically what i don't understand is are we mapping the 0MATERIAL_ATTR datsource fields to the info objects appearing in the attributes tab of 0MATERIAL info object.
3) Also when i added 0MATERIAL to my Info Area , there were some extra info objects that got in, are there any dependent objects that get added ?
<removed by moderator>
Edited by: Siegfried Szameitat on May 19, 2009 11:53 AM
Hi,
"Error RSDMD-194 when I am loading 0MATERIAL_ATTR and TEXT."
Please check if there is any invalid character in those 100 records, correct it, and load again.
1) why do we need to add 0MATERIAL to a particular Info area to start loading its ATTR and TEXT
It doesn't matter where 0MATERIAL is. An InfoArea is a kind of folder, used to easily locate and keep all the related objects in one place.
2) Are the info objects that appear in the attributes tab of 0MATERIAL the same as the fields of 0MATERIAL_ATTR datasource, basically what i don't understand is are we mapping the 0MATERIAL_ATTR datsource fields to the info objects appearing in the attributes tab of 0MATERIAL info object.
If it is a Business Content InfoSource, it will automatically map the attribute to the DataSource field. If you are modifying or creating your own object, you should map it manually based on field name and description.
3) Also when i added 0MATERIAL to my Info Area , there were some extra info objects that got in, are there any dependent objects that get added ?
If you move any object, all the compounded objects will also come along to the InfoArea.
Regards,
Kams