EHP6 to EHP7 Upgrade Issue: Junk Values are showing for BW data source
Hi,
We are upgrading ECC from EHP6 to EHP7. We have a custom data source built on a custom InfoSet, and the InfoSet is built on pooled table ZXXX. After the upgrade, the data source shows junk values in RSA3. I have re-generated the InfoSet and tested, but it made no difference. Below is the RSA3 data screenshot. Can anyone suggest how to overcome this?
Thanks,
Prasad
Hi Prasad,
A few things to check:
1. The pooled table data on which the data source is created.
2. Reactivate the extract structure, and also check the data elements for the date fields.
3. Is there any custom code in the InfoSet? If yes, the dates may be calculated in that custom code with the wrong format.
Thanks
Amit
Similar Messages
-
Not all my titles are showing for each data set
Hello,
I have made a column graph for 4 different data sets, I need the titles to show for each data set along the X axis. I highlight the titles in my data when making the graph, however, only two of the four are showing. Is there any way I can get all 4 to show?
^^^^ Need the title after Baseline and after Numb Hand
Thank you

Try clicking the chart, then the blue 'Edit Data References' rectangle, and then, in the lower left of the window, switch from 'Plot Columns as Series' to 'Plot Rows as Series'. Switch back if needed. All four should then show up.
SG -
Line Graph Issue - In BIP values are summed for same date
Hi
I have below data
Date-----------------------Sell
27-Mar-13----------------10
28-Mar-13----------------15
28-Mar-13----------------20
30-Mar-13----------------25
There are two entries on 28-Mar-2013. On the BIP line graph the Sell values are added together, and the graph shows the value 35 on 28-Mar-2013.
I want to show two different values on the line graph for 28-Mar-2013.
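BIP charts group measure values by the x-axis member, so the two 28-Mar-13 rows collapse into one summed point. The aggregation can be reproduced outside BIP with a quick awk sketch over the sample rows above (the tools here are illustrative, not part of BIP):

```shell
# Sum the Sell values per date, exactly as the chart's group-by does.
summed=$(printf '27-Mar-13 10\n28-Mar-13 15\n28-Mar-13 20\n30-Mar-13 25\n' |
  awk '{ sum[$1] += $2 } END { for (d in sum) print d, sum[d] }' |
  sort)
printf '%s\n' "$summed"   # 28-Mar-13 appears once, with 15+20 = 35
```

To keep the two points separate, the x-axis values have to be made distinct, e.g. by adding a sequence number or timestamp column to the data set and plotting against that.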
Edited by: 1010402 on Jun 7, 2013 12:49 AM

Hi 913804,
I am amazed you found the browser certification for OBIEE 11.1.1.6; I could not find it anywhere in the documentation or from Metalink personnel.
Regarding the graphs not showing: we found this to be an issue with IE, not with Firefox. First make sure you have followed the upgrade step listed in
3.9.11.2 Updating Oracle Business Intelligence Catalogs from http://docs.oracle.com/cd/E23943_01/doc.1111/e16793/patch_set_installer.htm#PATCH250. The last resort was to recreate the graph component in each report from scratch. As you say, pie charts are unaffected.
Regards,
Nick. -
Inventory cube - non-cumulative key figure values are showing negative values
Hi Guru's,
To improve the performance of the inventory cube 0IC_C03, I did the following:
1) Created a history cube as a copy of the actual cube (0IC_C03).
2) Transferred all four years of data (2007, 2008, 2009, 2010) to the history cube as a backup, for clustering and cube remodelling.
3) After that, loaded the most recent three years of data (2008, 2009, 2010) back into the actual cube and kept one year (2007) in the history cube, i.e. maintained only the recent three years in the actual cube.
4) Created a MultiProvider that includes the actual and history cubes, and pointed the existing report at the MultiProvider.
5) After purging one year of data from the actual cube, the stock values in the reports show negative values.
6) To clear that issue I loaded the 2007 data back into the actual cube (so the cube again has all years of data, as before), but the stock values still show negative values.
How can I solve this issue in the inventory cube?
How can I eliminate the negative values in the reports, which worked properly before the data purge (removing the first year of data from the actual cube)?

Hi Prayog, thank you for answering. Yes, I went to the data targets, and the formula is already written like this: IF( Debit/Credit = 'H', Qty in OUn, ( -1 ) * Qty in OUn ) for the Actual Consumption key figure, and IF( Debit/Credit = 'H', Amt. in local curr., ( -1 ) * Amt. in local curr. ) for the amount.
As I already said, from one of the InfoSources the data flows through an ODS and then to the cube. So I checked the data in the ODS with the movement type and posting date as in the report. I selected 'Debit/Credit' = H plus the movement type and posting date, but in the ODS output the key figures are not displayed. This is the problem.
Cheers,
Hemanth Aluri... -
Value Mapping Replication for Mass Data - Performance Issues
Hi All,
We are looking into Value Mapping Replication for mass data. We have done this for a small number of fields.
Now we may have to keep 15,000 records in the cache for the value mapping. I am not sure how this would affect the Java cache and the Java engine as a whole.
There may be a situation where we have to leave the 15K records in the cache table on the Java engine.
Are there any parameters we can look at to see how this affects performance?
Any links or guidance in the right direction would help.
Regards

Naveen,
Check Jin's reply in this thread (they have done it with the API, and without the API using graphical mapping, but still with some issues):
Value mapping performance using LookUp API
---Satish -
Using MISSING FIELD VALUES ARE NULL for external table
I want to place a null for values missing in the sub_account field. Here is my external table:
CREATE OR REPLACE DIRECTORY INCOMING_ORDERS_log_dir
AS 'c:\starpubs\starpubs\dataformats\logs\INCOMING_ORDERS\log';
CREATE OR REPLACE DIRECTORY INCOMING_ORDERS_bad_dir
AS 'c:\starpubs\starpubs\dataformats\logs\INCOMING_ORDERS\bad';
create table ext_INCOMING_ORDERS_table (
Account varchar(5),
Sub_Account varchar(1),
Override_Code varchar(1),
Nomenclature varchar(28),
chg_nbr varchar(3),
quantity integer,
U_I varchar(5),
zipcode varchar(5),
type_reject varchar(2)
)
organization external (
type oracle_loader
default directory user_dir
access parameters (
records delimited by newline
missing field values are null
badfile INCOMING_ORDERS_bad_dir:'INCOMING_ORDERS%a_%p.bad'
logfile INCOMING_ORDERS_log_dir:'INCOMING_ORDERS%a_%p.log'
fields (
Account(1:5) char(5),
Sub_Account(7:7) char(1),
Override_Code(10:10) char(1),
Nomenclature(11:38) char(28),
chg_nbr(40:42) char(3),
quantity(44:48) integer external,
U_I(50:54) char(5),
zipcode(56:60) char(5),
type_reject(61:62) char(2)
)
)
location('PTCLICK.MANUAL.NOMEN.TXT','PTCLICK.ORDERS.TXT', 'EUR_RES.TXT', 'MQ.TXT', 'BPRO.TXT')
)
reject limit unlimited;
How can I place the MISSING FIELD VALUES ARE NULL for the missing sub_account values?

After I made the change, I received this error:
SQL> select * from ext_INCOMING_ORDERS_table;
select * from ext_INCOMING_ORDERS_table
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "no": expecting one of: "comma, date_format,
defaultif, enclosed, ltrim, lrtrim, ldrtrim, notrim, nullif, optionally, ),
rtrim, terminated"
KUP-01007: at line 7 column 26
CREATE OR REPLACE DIRECTORY INCOMING_ORDERS_log_dir
AS 'c:\starpubs\starpubs\dataformats\logs\INCOMING_ORDERS\log';
CREATE OR REPLACE DIRECTORY INCOMING_ORDERS_bad_dir
AS 'c:\starpubs\starpubs\dataformats\logs\INCOMING_ORDERS\bad';
create table ext_INCOMING_ORDERS_table (
Account varchar(5),
Sub_Account varchar(1),
Override_Code varchar(1),
Nomenclature varchar(28),
chg_nbr varchar(3),
quantity integer,
U_I varchar(5),
zipcode varchar(5),
type_reject varchar(2)
)
organization external (
type oracle_loader
default directory user_dir
access parameters (
records delimited by newline
badfile INCOMING_ORDERS_bad_dir:'INCOMING_ORDERS%a_%p.bad'
logfile INCOMING_ORDERS_log_dir:'INCOMING_ORDERS%a_%p.log'
fields (
Account(1:5) char(5),
Sub_Account(7:7) char(1) NO PRESERVE BLANKS,
Override_Code(10:10) char(1),
Nomenclature(11:38) char(28),
chg_nbr(40:42) char(3),
quantity(44:48) integer external,
U_I(50:54) char(5),
zipcode(56:60) char(5),
type_reject(61:62) char(2)
)
)
location('PTCLICK.MANUAL.NOMEN.TXT','PTCLICK.ORDERS.TXT', 'EUR_RES.TXT', 'MQ.TXT', 'BPRO.TXT')
)
reject limit unlimited; -
Delete records whose Data values are NULL before loading data to the BPC model
Hi Everyone,
I am loading the data from flat file to BPC Model (10.0 Version).
Source data (Flat file) looks like below:
RP_Employee RPT_Currency Data
Test USD 8
Test1 USD
Test2 USD 6
My requirement is to delete the records whose Data value is NULL before loading the data into the BPC model.
Please let me know how I can meet this requirement.
I am thinking it is possible using a start routine BAdI. If I am correct, please let me know the process, such as creating the class and implementing the BAdI.
Thanks in advance!!

Hi Nilanjan,
Please see my source data below:
Account Client Employee Time Data
123 XYZ Vishu 2014.01 300
456 2014.01
789 ABC Alexander 2014.02 200
If you look at the second record: when the Data value is NULL, the Employee and the other dimensions are also NULL.
So I want to delete the second record.
If a start routine is the way, please share the code and the steps to do it.
Thanks in advance!!
Regards,
Viswanath -
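The in-system fix is a start routine in the transformation that drops rows with an initial Data value, implemented in ABAP. Purely as a file-side sketch (assuming the flat file is exported as CSV with Data as the last column; the function name is made up for illustration), the same filtering looks like this:

```shell
# Keep the header row plus every row whose last (Data) column is non-empty.
filter_null_data() {
  awk -F',' 'NR == 1 || $NF != ""' "$1"
}
```

Run against the sample above, the row with no Data value is dropped before the load.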
Hi there,
I am having a kind of weird issue with my Oracle Enterprise database, which had been working perfectly since 2009. After some trouble with my network switch (the switch was replaced), the whole network came back and all subnet devices are functioning perfectly.
The server uses NFS for the Oracle database backups, and Oracle is not starting at the mount/alter stage.
Here the details of my server:
- SunOS 5.10 Generic_141445-09 i86pc i386 i86pc
- Oracle Database 10g Enterprise Edition Release 10.2.0.2.0
- 38TB disk space (plenty free)
- 4GB RAM
And when I attempt to start the db, here the logs:
Starting up ORACLE RDBMS Version: 10.2.0.2.0.
System parameters with non-default values:
processes = 150
shared_pool_size = 209715200
control_files = /opt/oracle/oradata/CATL/control01.ctl, /opt/oracle/oradata/CATL/control02.ctl, /opt/oracle/oradata/CATL/control03.ctl
db_cache_size = 104857600
compatible = 10.2.0
log_archive_dest = /opt/oracle/oradata/CATL/archive
log_buffer = 2867200
db_files = 80
db_file_multiblock_read_count= 32
undo_management = AUTO
global_names = TRUE
instance_name = CATL
parallel_max_servers = 5
background_dump_dest = /opt/oracle/admin/CATL/bdump
user_dump_dest = /opt/oracle/admin/CATL/udump
max_dump_file_size = 10240
core_dump_dest = /opt/oracle/admin/CATL/cdump
db_name = CATL
open_cursors = 300
PMON started with pid=2, OS id=10751
PSP0 started with pid=3, OS id=10753
MMAN started with pid=4, OS id=10755
DBW0 started with pid=5, OS id=10757
LGWR started with pid=6, OS id=10759
CKPT started with pid=7, OS id=10761
SMON started with pid=8, OS id=10763
RECO started with pid=9, OS id=10765
MMON started with pid=10, OS id=10767
MMNL started with pid=11, OS id=10769
Thu Nov 28 05:49:02 2013
ALTER DATABASE MOUNT
Thu Nov 28 05:49:02 2013
ORA-00202: control file: '/opt/oracle/oradata/CATL/control01.ctl'
ORA-27037: unable to obtain file status
Intel SVR4 UNIX Error: 79: Value too large for defined data type
Additional information: 45
When I try to start the db without mounting, it starts without issues:
SQL> startup nomount
ORACLE instance started.
Total System Global Area 343932928 bytes
Fixed Size 1280132 bytes
Variable Size 234882940 bytes
Database Buffers 104857600 bytes
Redo Buffers 2912256 bytes
SQL>
But when I try to mount or alter db:
SQL> alter database mount;
alter database mount
ERROR at line 1:
ORA-00205: error in identifying control file, check alert log for more info
SQL>
From the logs again:
alter database mount
Thu Nov 28 06:00:20 2013
ORA-00202: control file: '/opt/oracle/oradata/CATL/control01.ctl'
ORA-27037: unable to obtain file status
Intel SVR4 UNIX Error: 79: Value too large for defined data type
Additional information: 45
Thu Nov 28 06:00:20 2013
ORA-205 signalled during: alter database mount
We have already checked everywhere on the system and engaged Oracle Support as well, without success. The control files are in place and, checked with strings, they are correct.
Can somebody give me a clue, please?
Maybe somebody had a similar issue here.
Thanks in advance.

I did the touch to update the date, but no joy either.
These are further logs, so maybe can give a clue:
Wed Nov 20 05:58:27 2013
Errors in file /opt/oracle/admin/CATL/bdump/catl_j000_7304.trc:
ORA-12012: error on auto execute of job 5324
ORA-27468: "SYS.PURGE_LOG" is locked by another process
Sun Nov 24 20:13:40 2013
Starting ORACLE instance (normal)
control_files = /opt/oracle/oradata/CATL/control01.ctl, /opt/oracle/oradata/CATL/control02.ctl, /opt/oracle/oradata/CATL/control03.ctl
Sun Nov 24 20:15:42 2013
alter database mount
Sun Nov 24 20:15:42 2013
ORA-00202: control file: '/opt/oracle/oradata/CATL/control01.ctl'
ORA-27037: unable to obtain file status
Intel SVR4 UNIX Error: 79: Value too large for defined data type
Additional information: 45
Sun Nov 24 20:15:42 2013
ORA-205 signalled during: alter database mount -
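"Value too large for defined data type" is the text of the POSIX EOVERFLOW errno: a 32-bit stat() fails when a file's size or inode number does not fit in 32 bits, which can happen on NFS after server-side changes. A quick way to test that theory on each control file (a hypothetical helper, not an Oracle tool):

```shell
# Print "unsafe" if the file's size or inode number exceeds 2^31 - 1,
# the range a 32-bit stat() can report; otherwise print "safe".
check_32bit_safe() {
  max=2147483647
  size=$(wc -c < "$1")
  inode=$(ls -i "$1" | awk '{ print $1 }')
  if [ "$size" -gt "$max" ] || [ "$inode" -gt "$max" ]; then
    echo "unsafe"
  else
    echo "safe"
  fi
}
```

For example, check_32bit_safe /opt/oracle/oradata/CATL/control01.ctl; an "unsafe" result would point at the NFS server handing out out-of-range inode numbers rather than at the control files themselves.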
Bpartner value to be added in data source
Dear SDN Experts,
I am facing an issue with a CRM data source. There is an activity generation module used in CRM where the business partner value is configured under the PARTNERS tab. I am able to extract this transaction data using 0CRM_SALES_ACT_1, but the particular partner value maintained in that tab is not extracted by this data source.
The business wants this field in the BW report, so I want to enhance the data source, but I cannot find out in which CRM table this partner value is stored against each transaction.
Please help if anyone has faced a similar issue.
Thanks & Regards,
Priyanka

Hello Satya,
I have tried using that BAPI function, but when I use it in the extractor the internal table eats a lot of memory and the extractor ends with a no-temp-space memory error. That is why I tried searching for the base tables where the transactional data is stored.
Regards,
Priyanka -
Stock report with value and quantity for a given date, not month-wise
Hi gems,
Can anybody give me the standard report for stock value and quantity for a given date (not month-wise) at storage location level?

Hi,
Check the report S_P00_07000139 with the option 'inventory and raw material report - detail' and a selection date (from and to dates the same). The list will give opening and closing balances with the goods movements and their values.
Thanks -
What are the steps to be taken care of for MM data sources while extracting
What are the steps to be taken care of for MM data sources while extracting from R/3 tables?
Please also provide the steps involved on the R/3 side.
Thanks,
Racha

Hi,
For Inventory Management, you can have a look at the following link.
[https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328] -
What are the Non SAP data sources supported for Analysis workbooks?
AO 1.4 SP6
BO 4.1 SP2
What are the Non SAP data sources supported for Analysis workbooks?
Thanks.

HANA is a data source (which could contain non-SAP data).
For other Excel front-ends that can connect to non-SAP data, look at Live Office, or Power BI by Microsoft - see Excel and Power BI connectivity to SAP BusinessObjects Universes | Power BI -
Time series functions are not working in OBIEE for ESSBASE data source
Hi All,
I am facing a problem in OBIEE: I am getting error messages for measure columns with time series functions (Ago, ToDate and PeriodRolling) in both the RPD and Answers.
The error is "Target database does not support Ago operation".
But I am aware that OBIEE supports time series functions for Essbase data sources.
We are using Hyperion 9.3.1 as the data source and OBIEE 11.1.1.5.0 as the reporting tool.
Appreciate your help.
Thanks,
Aravind

Hi,
This is because time series functions are not supported for fragmented content; see this note from Oracle Support:
The error occurs because fragmented data sources are used with some time series measures. Time series measures (i.e. AGO) are not supported on fragmented data sources.
Confirmation is documented in the following guide - Creating and Administering the Business Model and Mapping Layer in an Oracle BI Repository > Process of Creating and Administering Dimensions
Ago or ToDate functionality is not supported on fragmented logical table sources. For more information, refer to “About Time Series Conversion Functions” on page 197.
Regards,
Gianluca -
[SOLVED] Value too large for defined data type in Geany over Samba
Some months ago, Geany started to output an error with every attempt to open a file mounted via smbfs/cifs.
The error was:
Value too large for defined data type
Now the error is solved thanks to a french user, Pierre, on Ubuntu's Launchpad:
https://bugs.launchpad.net/ubuntu/+bug/ … comments/5
The solution is to add this options to your smbfs/cifs mount options (in /etc/fstab for example):
,nounix,noserverino
It works on Arch Linux up-to-date (2009-12-02)
I've written it on the ArchWiki too: http://wiki.archlinux.org/index.php/Sam … leshooting

An update on the original bug: this is the direct link to Launchpad bug 455122:
https://bugs.launchpad.net/ubuntu/+sour … bug/455122 -
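For reference, the fix lands in the mount options of the fstab entry. A sketch with placeholder server, share, and credential paths (only nounix,noserverino comes from the bug report; the rest is an assumed typical entry):

```
# /etc/fstab -- cifs mount with the workaround options appended.
# noserverino makes the client generate its own inode numbers instead of
# using the server's, and nounix disables the unix extensions involved
# in the bug.
//fileserver/share  /mnt/share  cifs  credentials=/etc/smbcred,nounix,noserverino  0  0
```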
'Value too large for defined data type' error while running flexanlg
While trying to run flexanlg to analyze my access log file I have received the following error:
Could not open specified log file 'access': Value too large for defined data type
The command I was running is
${iPLANET_HOME}/extras/flexanlg/flexanlg -F -x -n "Web Server" -i ${TMP_WEB_FILE} -o ${OUT_WEB_FILE} -c hnrfeuok -t s5m5h5 -l h30c+5 -p ctl
Which should generate a html report of the web statistics
The file has approx 7 Million entries and is 2.3G in size
Ideas?

I've concatenated several files together from my web servers as I wanted a single report; several reports based on individual web servers are no use.
I'm running iWS 6.1 SP6 on Solaris 10, on a zoned T2000
SunOS 10 Generic_118833-23 sun4v sparc SUNW,Sun-Fire-T200
Cheers
Chris
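The 2.3 GB file is likely the problem: "Value too large for defined data type" is EOVERFLOW, which a 32-bit open() returns for any file of 2 GiB or more. One workaround, sketched with a made-up helper name, is to split the concatenated log on line boundaries back under the limit and run the analyzer on each piece:

```shell
# Split a log into line-aligned pieces named access_part_aa, access_part_ab, ...
# $1: log file; $2: max lines per piece (e.g. 5000000 keeps pieces well under
# 2 GiB for typical access-log line lengths).
split_access_log() {
  split -l "$2" "$1" access_part_
}
```

Each piece then opens fine in 32-bit tools, though the per-piece reports would need combining afterwards; a large-file-aware build of the analyzer would avoid the split entirely.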