Data Template - Very Strange Record Data Mismatch
Using the enterprise edition with the very latest patch for 10.1.3.3.1... Below is my data template; it's very simple. The resulting XML is generated, but the values don't match the database rows. The data source is a MySQL database.
<dataTemplate name="SUGGESTIONBOX" defaultPackage="" description="SuggestionBox Data">
<properties>
<property name="include_parameters" value="true"/>
<property name="include_null_Element" value="true"/>
<property name="xml_tag_case" value="upper"/>
<property name="db_fetch_size" value="500"/>
<property name="scalable_mode" value="off"/>
<property name="include_rowsettag" value="false"/>
<property name="debug_mode" value="off"/>
</properties>
<parameters>
<parameter name="p_date" dataType="number" defaultValue=""/>
</parameters>
<lexicals/>
<dataQuery>
<sqlStatement name="Q1" dataSourceRef="">
<![CDATA[select sug_name
, sug_text
, sug_time
from corporate.suggestion
where sug_time > :p_date]]>
</sqlStatement>
</dataQuery>
<dataStructure>
<group name="G_ENTRIES" source="Q1" groupFilter="">
<element name="NAME" value="sug_name" function=""/>
<element name="TEXT" value="sug_text" function=""/>
<element name="DATE" value="sug_time" function=""/>
</group>
</dataStructure>
</dataTemplate>
It is generating the following XML. Notice that the first G_ENTRIES contains the correct name and date values, but the text value is actually the text value of the second record. If I run the exact same SQL in a query browser, it returns the correct values (name1, text1, date1, name2, text2, date2, etc.)
<SUGGESTIONBOX>
<P_DATE>20070101</P_DATE>
<G_ENTRIES>
<NAME>name1</NAME>
<TEXT>text2</TEXT>
<DATE>date1</DATE>
</G_ENTRIES>
<G_ENTRIES>
<NAME>name2</NAME>
<TEXT>text3</TEXT>
<DATE>date2</DATE>
</G_ENTRIES>
<G_ENTRIES>
<NAME>name3</NAME>
<TEXT>text3</TEXT>
<DATE>date3</DATE>
</G_ENTRIES>
</SUGGESTIONBOX>
Any clues?
I am now able to reproduce this issue with any table that contains a TEXT column. I can work around it by changing my data template. Note the CONCAT function, which must result in converting the TEXT field to a VARCHAR (just a guess).
<sqlStatement name="Q1" dataSourceRef="">
<![CDATA[select suggestion.sug_name,
CONCAT(suggestion.sug_text,"") as sug_text,
suggestion.sug_time,
suggestion.sug_id
from corporate.suggestion suggestion
where sug_time > :p_date]]>
</sqlStatement>
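If the CONCAT trick really works by coercing the TEXT column to a character type, an explicit CAST may achieve the same thing; this is only a sketch and has not been verified against the same driver:

```sql
-- Hypothetical alternative to the CONCAT workaround: cast the TEXT column
-- to CHAR explicitly so the driver sees an ordinary character value.
select sug_name,
       CAST(sug_text AS CHAR) as sug_text,
       sug_time
from corporate.suggestion
where sug_time > :p_date
```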
Similar Messages
-
VERY strange recording problem
I've seen some strange things in many years of using a Mac, but this is by far the most bizarre. I'm a syndicated radio producer and I transfer my weekly DJ music show from DAT onto my computer to make CD copies for my affiliate stations. For a while I've thought this was a problem with Peak Pro, but now it has cropped up when I'm recording directly into Quicktime 7.3.1.
When I play back the recording I find random elements from elsewhere in the show popping up in the middle of a song -- one or two second bursts. What's really weird is that it's always my voice, not part of some other piece of music. At other times I've gotten distortion that I can only describe as sounding like a "motorboating" effect, what you would get if you ran your finger over your lower lip repeatedly while talking.
I have tried both M-Audio and Behringer USB input devices. Settings in Audio Midi Setup seem to be correct. This has occurred using various iterations of OS X 10.4.x. I'm currently using 10.4.11. It has also occurred across different versions of Quicktime.
Here is some info from System Profiler:
Machine Name: Power Mac G4 (AGP graphics)
Machine Model: PowerMac3,1
CPU Type: PowerPC 60? (1.1)
Number Of CPUs: 2
CPU Speed: 1.6 GHz
L2 Cache (per CPU): 512 KB
Memory: 2 GB
The CPU upgrade is a NewerTech 1.6 GHz from Other World Computing. Here's a list of my USB devices. (The M-Audio/Behringer are plugged directly into a USB 2.0 bus, not to an external hub.)
Upon further review... Looks like I was premature in blaming QT for this. I opened the .mov file in Peak and saved it as a .aiff file. THAT file had the problem. When I listened to the original .mov, everything sounded fine.
sorry 'bout that. -
Very Strange- Mail changes all dates after two days
I have a very strange behavior:
Apple Mail changes the date-information of incoming and outgoing mails after about two days -
for the first two days it says "today" or "yesterday"; after that it changes all dates to one single date (December 12, 2009, and some to December 12, 2010).
I just put in a new pram battery, so this cannot be the problem.
What the heck is this?
Hi Noe, do you have anything in Rules that is set to happen after 2 days?
Not certain, but this can fix myriad Mail problems...
Safe Boot from the HD (holding the Shift key down at bootup); it will try to repair your Disk Directory while the spinning radian is happening, so let it go. Then run Disk Utility in Applications>Utilities, highlight your drive, click Repair Permissions, and move these folder & file to the Desktop.
Move this Folder to the Desktop...
/Users/YourUserName/Library/Caches/Mail/
Move this file to the Desktop...
/Users/YourUserName/Library/Mail/Envelope Index
Reboot.
If that doesn't do it and you can afford to redo all your Rules, try these & reboot...
/Users/YourUserName/Library/Mail/MessageRules.plist
/Users/YourUserName/Library/Mail/MessageRules.plist.backup -
Hi Folks,
We have a data mismatch issue for one InfoObject (Base UOM) when the data goes from the staging DSO to the conformance DSO. I will explain the issue here.
For one invoice number, the Base UOM value is displayed as M2 in the staging DSO. When the data goes from the staging DSO to the conformance DSO, the Base UOM is displayed as a blank value in the conformance DSO. There is no code written for calculation of Base UOM at the transformation level; it is a direct mapping to the Base UOM field in the conformance DSO from the staging DSO. For all other invoices the Base UOM value is displayed correctly in the conformance DSO when compared to the staging DSO. For this one specific invoice, the Base UOM value shows as blank in the conformance DSO.
Could you please give me your suggestions on the reasons why the Base UOM value shows as blank in the conformance DSO.
Thanks in advance.
Regards,
Nag
Hi,
You will have to check following things,
1) Check if other records with the base unit value M2 are updated properly in the conformance DSO. This will make sure there is no issue with this particular base unit.
2) As you have mentioned that other records are successfully getting updated, you might have to debug the data load for this particular record. Do selective data loading and check where the unit is getting cleared.
You can also check if there is any conversion routine at the infoobject level.
Regards,
Durgesh. -
Data mismatch with ECC FBL5N report and debtor ageing report(BI side)
Hi,
I am facing a data mismatch problem between the FBL5N report (t-code for the customer line item report) on the ECC side and the debtor ageing report on the BI side.
The problem is
1) There is a mismatch of data for some customer amounts on both the ECC and BI side.
2) There are customer numbers with amounts in the BI debtor ageing report which do not appear in ECC FBL5N (t-code for the customer line item report).
For the second problem, I checked the tables BSID and BSAD on the ECC side; these customer numbers are not available there either.
One more strange thing is that with the same selection in the report on both ECC and BI, the data mismatch and the extra customers in the BI report change every day, i.e. we are getting a new set of mismatched data and extra customer numbers on the BI side.
If anyone have worked on this type of issue.....kindly help...
Thanks in advance
Hi,
On the one hand, it may be the delta mechanism of the FI_*_4 extractors with its timestamp issue, so that your comparison between BI and ECC is at no point in time up to date.
FI Extraction
On the other hand, it may be a delta problem between data targets in your BI system, in case you load the FI data from a DSO to a cube and report on the cube. I have this problem at the moment and will watch this thread for more suggestions.
Inserting to MS-Access -Data mismatch
I am trying to insert records into an MS-Access table. However, I got a data mismatch exception, and on figuring it out I realized there was a date field that I was populating with a string. Hence I converted it into a date and ran the query again; however, now I am neither getting an exception nor is the table getting updated.
The following is the code snippet where i get the data
List <org.cuashi.wof.ws.nwis.ValueSingleVariable > ValueList = result.getTimeSeries().getValues().getValue();
for (org.cuashi.wof.ws.nwis.ValueSingleVariable value : ValueList)
System.out.format("%20s%10.4f",value.getDateTime().toString(),value.getValue());
System.out.println();
System.out.println("obtaining time series data");
String dateTime = value.getDateTime().toString().replace('T',' ');
//to convert string in to date
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
java.util.Date dt = sdf.parse(dateTime);
sdf.applyPattern("yyyy-MM-dd HH:mm:ss");
java.util.Date dt2 = sdf.parse(dateTime);
updateTable(siteCode,variableCode,dt2,timvalue,varCode,qualifierCode,qualityControlLevel,conn,rs);
}catch(Exception e)
public void updateTable(String siteCode,String variableCode,java.util.Date dt2,double tmvalue,String varCode,String qualifierCode,int qualityControlLevel,Connection con,ResultSet rs)
try{
System.out.println("inside update");
// con.setAutoCommit(false);
PreparedStatement pstmt = con.prepareStatement("INSERT INTO DataValues(ValueID,DataValue,ValueAccuracy,LocalDateTime,UTCOffset,DateTimeUTC,SiteID,VariableID,OffsetValue,OffsetTypeID,CensorCode,QualifierID,MethodID,SourceID,SampleID,DerivedFromID,QualityControlLevelID) VALUES(?,?,?,?,?,?.?,?,?,?,?,?,?,?,?,?,?)");
pstmt.setString(1,"1");
pstmt.setDouble(2,tmvalue);
pstmt.setInt(3,0);
pstmt.setDate(4,(java.sql.Date) dt2);
pstmt.setInt(5,0);
pstmt.setString(6,"0");
pstmt.setString(7,siteCode);
pstmt.setString(8,varCode);
pstmt.setInt(9,0);
pstmt.setInt(10,0);
pstmt.setInt(11,0);
pstmt.setString(12,qualifierCode);
pstmt.setInt(13,0);
pstmt.setInt(14,1);
pstmt.setInt(15,0);
pstmt.setInt(16,0);
pstmt.setInt(17,qualityControlLevel);
System.out.println("Statement prepared");
pstmt.execute();
//commit the transaction
con.commit();
pstmt.close();
}catch(SQLException e)
System.out.println("The Exception is " +e);
I found out that after field 4 the control does not go to the remaining fields at all.
Please let me know what I am missing.
System.out.format("%20s%10.4f",value.getDateTime().toString(),value.getValue());
System.out.println();
System.out.println("obtaining time series data");
String dateTime = value.getDateTime().toString().replace('T',' ');
//to convert string in to date
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
I'd recommend that you setLenient(false) on your sdf.
This pattern is what you've got? Sure?
java.util.Date dt = sdf.parse(dateTime);
sdf.applyPattern("yyyy-MM-dd HH:mm:ss");
java.util.Date dt2 = sdf.parse(dateTime);
updateTable(siteCode,variableCode,dt2,timvalue,varCode,qualifierCode,qualityControlLevel,conn,rs);
}catch(Exception e)
{
Empty catch block? Not a smart idea. Print the stack trace.
public void updateTable(String siteCode,String variableCode,java.util.Date dt2,double tmvalue,String varCode,String qualifierCode,int qualityControlLevel,Connection con,ResultSet rs)
try{
System.out.println("inside update");
// con.setAutoCommit(false);
PreparedStatement pstmt = con.prepareStatement("INSERT INTO DataValues(ValueID,DataValue,ValueAccuracy,LocalDateTime,UTCOffset,DateTimeUTC,SiteID,VariableID,OffsetValue,OffsetTypeID,CensorCode,QualifierID,MethodID,SourceID,SampleID,DerivedFromID,QualityControlLevelID) VALUES(?,?,?,?,?,?.?,?,?,?,?,?,?,?,?,?,?)");
pstmt.setString(1,"1");
pstmt.setDouble(2,tmvalue);
pstmt.setInt(3,0);
pstmt.setDate(4,(java.sql.Date) dt2);
I'd recommend this:
pstmt.setDate(4,new java.sql.Date(dt2.getTime()));
pstmt.setInt(5,0);
pstmt.setString(6,"0");
pstmt.setString(7,siteCode);
pstmt.setString(8,varCode);
pstmt.setInt(9,0);
pstmt.setInt(10,0);
pstmt.setInt(11,0);
pstmt.setString(12,qualifierCode);
pstmt.setInt(13,0);
pstmt.setInt(14,1);
pstmt.setInt(15,0);
pstmt.setInt(16,0);
pstmt.setInt(17,qualityControlLevel);
System.out.println("Statement prepared");
pstmt.execute();
//commit the transaction
con.commit();
pstmt.close();
You should be closing your statement and connection in a finally block, in individual try/catch blocks.
Set autoCommit back to true.
}catch(SQLException e)
{
So you don't roll back if there's an exception? Bad idea.
System.out.println("The Exception is " +e);
I found out that after field 4 the control does not go to the remaining fields at all.
Please let me know what I am missing.
Lots of stuff. See above.
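Pulling the replies together, here is a minimal, hedged sketch of the date handling alone (the class name and sample timestamp are invented for illustration): parse strictly, then wrap the milliseconds in a java.sql.Date rather than casting. Note also that the quoted INSERT contains `?.?` where `?,?` was presumably intended, which by itself would break parameter binding past that point.

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

public class DateFix {

    // Parse a "yyyy-MM-dd HH:mm:ss" string strictly and convert it to java.sql.Date.
    static java.sql.Date toSqlDate(String dateTime) throws ParseException {
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        sdf.setLenient(false); // reject malformed input instead of silently guessing
        java.util.Date parsed = sdf.parse(dateTime);
        // A java.util.Date cannot be cast to java.sql.Date; wrap the millis instead.
        return new java.sql.Date(parsed.getTime());
    }

    public static void main(String[] args) throws ParseException {
        // The web-service timestamp uses 'T' as a separator; replace it before parsing.
        String raw = "2008-06-30T14:15:00".replace('T', ' ');
        System.out.println(toSqlDate(raw)); // prints 2008-06-30
    }
}
```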
-
IDOC failed due to date mismatch
Hi Experts,
I am facing a problem in IDOC.
An IDOC was sent from one SAP system to another.
In the target system it failed, the reason being a date mismatch.
When I checked through WE02, it failed because the TODATE and FROMDATE were wrong, i.e. they got swapped.
So is there any way to process this errored-out IDOC?
Is it possible to change the dates and then process it manually?
Please let me know, as it is high priority.
Note : BOTH SAP system are Production .
Thanks in advance
Hi Hemant,
Please find the following steps to edit IDOC segment after you find the error using WE02.
Example code can be found on this website:
http://www.sapgenie.com/sapedi/idoc_abap.htm
STEP 1 - Open document to edit
CALL FUNCTION 'EDI_DOCUMENT_OPEN_FOR_EDIT'
EXPORTING
document_number = t_docnum
IMPORTING
idoc_control = itab_edidc
TABLES
idoc_data = itab_edidd
EXCEPTIONS
document_foreign_lock = 1
document_not_exist = 2
document_not_open = 3
status_is_unable_for_changing = 4
OTHERS = 5.
STEP 2 - Loop at itab_edidd and change data
LOOP AT itab_edidd WHERE segnam = 'E1EDKA1'.
e1edka1 = itab_edidd-sdata.
IF e1edka1-parvw = 'LF'.
e1edka1-partn = t_eikto.
itab_edidd-sdata = e1edka1.
MODIFY itab_edidd.
EXIT.
ENDIF.
ENDLOOP.
STEP 3 - Change data segments
CALL FUNCTION 'EDI_CHANGE_DATA_SEGMENTS'
TABLES
idoc_changed_data_range = itab_edidd
EXCEPTIONS
idoc_not_open = 1
data_record_not_exist = 2
OTHERS = 3.
STEP 3a - Change control record
CALL FUNCTION 'EDI_CHANGE_CONTROL_RECORD'
EXPORTING
idoc_changed_control = itab_edidc
EXCEPTIONS
idoc_not_open = 1
direction_change_not_allowed = 2
OTHERS = 3.
STEP 4 - Close Idoc
Update IDoc status
CLEAR t_itab_edids40.
t_itab_edids40-docnum = t_docnum.
t_itab_edids40-status = '51'.
t_itab_edids40-repid = sy-repid.
t_itab_edids40-tabnam = 'EDI_DS'.
t_itab_edids40-mandt = sy-mandt.
t_itab_edids40-stamqu = 'SAP'.
t_itab_edids40-stamid = 'B1'.
t_itab_edids40-stamno = '999'.
t_itab_edids40-stapa1 = 'Sold to changed to '.
t_itab_edids40-stapa2 = t_new_kunnr.
t_itab_edids40-logdat = sy-datum.
t_itab_edids40-logtim = sy-uzeit.
APPEND t_itab_edids40.
CALL FUNCTION 'EDI_DOCUMENT_CLOSE_EDIT'
EXPORTING
document_number = t_docnum
do_commit = 'X'
do_update = 'X'
write_all_status = 'X'
TABLES
status_records = t_itab_edids40
EXCEPTIONS
idoc_not_open = 1
db_error = 2
OTHERS = 3.
Another alternative is to use WE19 with reference to the error IDoc number.
After you change the incorrect segment(s), you can post to the application.
As a result, a new IDoc will be created.
You can use FM EDI_DOCUMENT_DELETE to delete the incorrect/error IDoc number.
Also you can use WE02 to change the incorrect segment.
Double-click on the incorrect segment to get the detail info. Then go to Data Record (menu) -> Change -> make the necessary changes -> Save.
Then you can use the program RBDINPUT to reprocess the error IDoc.
Hope this will help.
Regards,
Ferry Lianto -
OSI / OTI table data mismatch, OTI being subset of OSI
Hi,
We have a custom application where we fetch the count and data for all provisioning records assigned to the logged-in user.
To fetch details of open provisioning tasks, the Oracle recommendation is to use the OTI table.
When we studied the current system, we had the findings below regarding the behavior of the OSI and OTI tables -
1. When a request is submitted to OIM, an approval process is triggered depending upon the approval policy.
2. This workflow is responsible for creating approval tasks.
3. When the approval tasks are approved, the request in OIM is closed and the provisioning task for the system gets created.
4. The OIM database tables used to fetch these provisioning tasks are OSI/OTI.
5. According to Oracle documentation, the OTI table is a subset of the OSI table.
6. When we checked both tables, we found that the entries in the two tables are not exactly the same.
7. For a particular request, the OSI table had OSI_ASSIGN_TYPE as 'Group' and the OTI table had OSI_ASSIGN_TYPE as 'User'.
8. As the OTI table has a null osi_assigned_to_ugp_key value, this table cannot be used.
It looks like the OTI table does not hold correct data. And the OSI table has around 9.6 million records, so when we query the OSI table we get correct data, but it takes 6 minutes.
We are now looking into why there is this mismatch and how it can be solved.
Regards,
Deepika
Hi Kevin,
Thanks for these details.
We would try this query in our environment and see the result.
But regarding the OSI/OTI data mismatch, I understand that OTI contains tasks that are still open, e.g. Rejected and Pending. These tasks are also in OSI; when we check a record in OSI the task is assigned to a group, while the same record in OTI shows the assign type changed.
Is that intended functionality or is it something wrong here.
Because of this we cannot get correct data directly from the OTI table and have to use OSI (which is hampering performance).
This is the query that we are using:
select distinct oti.orc_key, a.usr_login, (select min(inner.oti_create) from oti inner where inner.orc_key = oti.orc_key)
as oti_create from oti inner join oiu on oti.orc_key = oiu.orc_key inner join usr a on oiu.usr_key = a.usr_key
inner join osi on oti.sch_key = osi.sch_key where sch_status in ('P', 'R') and obj_key in
(select obj_key from obj where obj_name in ('DB App', 'K2 App', 'LDAP App', 'SPML App', 'Manual App'))
and (osi.osi_assign_type = 'Group' and osi.osi_assigned_to_ugp_key in
(select ugp_key from usg where usr_key = " + Long.toString(Oim.getCurrentUser().getKey()) + "))
UNION ALL
select distinct oti.orc_key, a.usr_login, (select min(inner.oti_create) from oti inner where inner.orc_key = oti.orc_key)
as oti_create from oti inner join oiu on oti.orc_key = oiu.orc_key inner join usr a on oiu.usr_key = a.usr_key
inner join osi on oti.sch_key = osi.sch_key where sch_status in ('P', 'R') and obj_key in
(select obj_key from obj where obj_name in ('DB App', 'K2 App', 'LDAP App', 'SPML App', 'Manual App'))
and (osi.osi_assign_type in ('User', 'Default task assignment') and
osi.osi_assigned_to_usr_key = " + Long.toString(Oim.getCurrentUser().getKey()) + ")
order by oti_create
Regards,
Deepika -
Data Mismatch problem in Production system
Dear Experts,
We have been encountering serious data mismatch problems with our source system since 29th Dec 2010.
We realized this mismatch only on 11th Jan 2011, so we deleted all data from 29th Dec 2010 to 11th Jan 2011 and then tried to load the delta again as per our process chain sequence. Since then, our production system has become very inconsistent. We found that the entire data from 29th Dec 2010 to date has started showing mismatches.
Our data has been damaged drastically. If I delete any bad requests from our Cubes & DataStore Objects, the next time we try to load data, the cube & DataStore Objects show that the deleted request number was not updated properly. Because of this, our process chain fails daily and gives many technical errors. We are totally confused about how to restore our data from 29th Dec 2010 to date.
Can anyone please help us restore our data fully and get the process chain to run successfully?
Thanks in Advance,
Suman
Hi Suman,
I understand that you are using the 3.x version of BW.
Such issues occur if you disturb the delta sequence by deleting requests without making them red.
You can resolve this in two ways:
1) Go to the reconstruction tab of the DSO and cube, select all the requests from 29th Dec 2010 through 11th Jan 2011 and reconstruct them. Once they appear in the request tab, set the QM status to red one by one and delete them from the cube and DSO.
This should set the delta status back to the 29th, and you will be able to run a repeat delta from then in your next delta.
2) Go to RSRQ and give the SID or the request ID, one by one, for each request that is reported as not updated properly.
Make the QM status red for all of them.
Since you have already deleted them from the targets, just changing the QM status to red will be sufficient.
You have to make the QM status red in order to inform the system that the request is an incorrect one and has been deleted.
Once it is set to red, the system will not prompt any request and say "request no. # not updated properly".
I suggest the second option.
Once you resume the delta load, it should run successfully, and if you feel some data is still missing from 29th Dec 2010 through 11th Jan 2011, just run a full repair load to get the data back.
If your DataSource is an LO extraction, you may need to fill the setup tables to perform the full repair.
Please let me know if the DataSource is an LO extraction. Also let me know once the issue is resolved.
Regards,
Sudheer.
Edited by: Sudheer Kumar Kurra on Jan 22, 2011 7:58 AM
Edited by: Sudheer Kumar Kurra on Jan 22, 2011 8:03 AM -
Hi All
I have loaded 2LIS_03_BX into the cube and compressed the request with marker update (flag not set).
Now when I tried to compare the quantities (0TOTALSTCK) with the R/3 report MMBE, there is a data mismatch.
When I checked the data in the PSA, I found that some records are coming in without storage location details.
So I tried applying the restrictions below, and the result exactly matches the R/3 report MMBE:
1. Excluded storage location records with value '#'.
2. Excluded stock type records with value 'D'.
So, can anyone suggest why there is a data mismatch?
Hi,
Did anyone get a similar problem?
Regards
Seshu -
What is a data mismatch ticket and how is it rectified?
Hi,
I am Pradeep, a job seeker. Please send me any doc regarding data mismatch and the step-by-step process to rectify it. I will give good points.
Regards,
Pradeep Kandepu
Hi,
These are some support issues...
As to frequent failures and errors, there is no fixed reason for a load to fail. If you want it from the interview perspective, I would answer it this way:
a) Loads can be failed due to the invalid characters
b) Can be because of the deadlock in the system
c) Can be because of a previous load failure, if the load is dependent on other loads
d) Can be because of erroneous records
e) Can be because of RFC connections
These are some of the reasons for the load failures.
We can perform these daily activities in Production:
1. Monitoring Data load failures thru RSMO
2. Monitoring Process Chains Daily/weekly/monthly
3. Perform Change run Hierarchy
4. Check Aggr's Rollup
To add to the above
1) Check that data targets are ready for reporting.
2) No failed or cancelled jobs in SM37 monitors and the BW Monitor.
3) All requests are loaded for the day, month and year.
4) Also note down the time taken to load critical InfoCubes that are used for reporting.
5) Check for any break in the schedules of your process chains.
hope this helps,
partha -
Cisco Historical reports Data Mismatches
dear all
I need to ask you about Cisco historical reports. I am using UCCX 8.6 and CUCM 8.6.
I am collecting data on calls presented, handled and abandoned from these three reports:
CCDR Call Detail
CSQ Detail
Call Abandoned Detail
From all three reports the data is different, and the difference is around 1000 calls over the whole month; abandoned, presented and handled are all showing mismatched data.
Second question: which report should I use to get data on presented calls, handled calls, abandoned calls and per-agent abandoned calls? Please suggest.
Please help me to resolve this issue.
Kindly read the following for a better understanding of their relationship:
a) UCCX Reporting Guide - Data reconciliation between reports
b) UCCX Report Developer Guide - Interpret Database Records
Thanks!
-JT- -
Data mismatch between 10g and 11g.
Hi
We recently upgraded OBIEE from 10.1.3.4.0 to 11.1.1.6.0. While testing we found a data mismatch between 10g and 11g in the case of a few reports which include a front-end calculated column with division in it, for example ("- Paycheck"."Earnings" / COUNT(DISTINCT "- Pay"."Check Date")) / 25.
The data matches in the scenarios below.
1) When the column is removed from both 10g and 11g.
2) When the aggregation rule is set to either "Sum" or "Count" in both 10g and 11g.
It would be very helpful and greatly appreciated if any workaround/pointers to solve this issue were provided.
Thanks
jfedynic wrote:
The 10g and 11.1.0.7 Databases are currently set to AL32UTF8.
In each database there is a VARCHAR2 field used to store data, but not specifically AL32UTF8 data but encrypted data.
Using the 10g Client to connect to either the 10g database or 11g database it works fine.
Using the 11.1.0.7 Client to go against either the 10g or 11g database and it produces the error: ORA-29275: partial multibyte character
What has changed?
Was it considered a Bug in 10g because it allowed this behavior and now 11g is operating correctly?
29275, 00000, "partial multibyte character"
// *Cause: The requested read operation could not complete because a partial
// multibyte character was found at the end of the input.
// *Action: Ensure that the complete multibyte character is sent from the
// remote server and retry the operation. Or read the partial
// multibyte character as RAW.
It appears to me a bug got fixed.
Data mismatch in Test and Prod environments
Hi,
We have a query in the Test and Prod environments. This query is not giving the same result in Test and Production. Please have a look and share your thoughts.
Select D1.C3,D1.C21,D2.C3,D2.C21
from
(select
sum(F.X_SALES_DEDUCTION_ALLOC_AMT)as C3,
O.Customer_num as C21
from
ESA_W_ORG_D O,
ESA_W_DAY_D D ,
ESA_W_SALES_INVOICE_LINE_F F
where
O.ROW_WID = F.CUSTOMER_WID
and D.ROW_WID = F.INVOICED_ON_DT_WID
and D.PER_NAME_FSCL_MNTH = '2012 / 12'
group by O.Customer_num)D1,
(select
sum(F.X_SALES_DEDUCTION_ALLOC_AMT)AS c3,
O.Customer_num as C21
from
Sa.W_ORG_D@STPRD O,
Sa.W_DAY_D@STPRD D ,
Sa.W_SALES_INVOICE_LINE_F@STPRD F
where
O.ROW_WID = F.CUSTOMER_WID
and D.ROW_WID = F.INVOICED_ON_DT_WID
and D.PER_NAME_FSCL_MNTH = '2012 / 12'
group by O.Customer_num)D2
where
D1.C21=D2.C21
and D1.C3<>D2.C3;
I have done the following steps:
1. I created one temporary table and searched for duplicate records, because if any duplicates were found I planned to delete those records; but I didn't find any duplicates. I also searched for common column values using an equi-join condition. Are there any other possible causes of the data mismatch apart from this?
2. This query takes around 45 minutes to retrieve the output. I want to improve the performance of the query, so I created a unique index on 5 columns, but it still takes the same time.
I also ran the query using the ALL_ROWS hint, but it still takes the same time.
Can you suggest anything else to increase the performance?
I appreciate your support.
Thanks.
If you can create a temporary database link between the two environments, use DBMS_RECTIFIER_DIFF or DBMS_COMPARISON to compare the two tables' contents.
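Building on that suggestion, a rough, unverified sketch of how DBMS_COMPARISON might be driven across the STPRD database link (the comparison name is invented, one of the tables from the query is used as an example, and the package requires a usable index on the compared table; check the package documentation for your release):

```sql
-- Register a comparison of the local table against the remote one over the db link.
BEGIN
  DBMS_COMPARISON.CREATE_COMPARISON(
    comparison_name    => 'CMP_SALES_F',               -- illustrative name
    schema_name        => USER,
    object_name        => 'ESA_W_SALES_INVOICE_LINE_F',
    dblink_name        => 'STPRD',
    remote_schema_name => 'SA',
    remote_object_name => 'W_SALES_INVOICE_LINE_F');
END;
/

-- Scan for row differences and report whether the tables are consistent.
DECLARE
  scan_info  DBMS_COMPARISON.COMPARISON_TYPE;
  consistent BOOLEAN;
BEGIN
  consistent := DBMS_COMPARISON.COMPARE(
    comparison_name => 'CMP_SALES_F',
    scan_info       => scan_info,
    perform_row_dif => TRUE);
  IF NOT consistent THEN
    DBMS_OUTPUT.PUT_LINE('Differences found; scan_id = ' || scan_info.scan_id);
  END IF;
END;
/
```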
http://www.morganslibrary.org/reference/pkgs/dbms_comparison.html
http://www.morganslibrary.org/reference/pkgs/dbms_rectifier_diff.html -
Data Mismatch between ODS and Infocube
Hi,
I have a data mismatch between an ODS and an InfoCube. In the ODS, for one sender cost centre there is only one company code; in the InfoCube it shows many. The DataSource for the ODS is 0HR_PT_1. In the update rules of the ODS, Company Code was assigned to Company Code. But for the InfoCube, should I assign Company Code to Company Code, or to some other characteristic as a master data attribute? Please advise.
It is very urgent.
Post your BW-related queries in the BW forum.