Visa, Data mismatch
Good afternoon.
I'm really new to communications, so I would appreciate your help.
I have a VI that sends and receives data, so it is a constantly changing state machine that alternates between "transmit" and "receive".
This VI controls and reads a prototype I am building with an Arduino. If a button is pressed, the VI sends a byte to the Arduino, and the Arduino interprets it.
After that the Arduino should send 12 bytes. I then take string subsets to divide them into 4 packages of 3 bytes each and graph them. Each package is a position of my prototype.
The problem is that I don't get the 12 bytes I need. For example, I get 6 bytes from the old transmission and 6 from the new, or 3 from the last and 9 from the new, etc.
I tried putting in an identifier character and reading the buffer: if the first byte is my identifier byte, then I divide and graph my data.
This solves the problem, but it is still sloppy and impractical because a lot of data is getting lost, and the graph is not as smooth as it should be.
My first hypothesis is that the Arduino is sending the data too quickly, because I made a program that just sends 12 bytes with a delay of 100 ms and it was fine. I can't put a delay in the Arduino because it would mess up the prototype.
My second hypothesis was that I was flushing the buffer the wrong way; I flush it in the transmit state.
Any suggestions?
I heard about putting an identifier on the transmission so it will start reading from the desired byte. What do you think?
Thank you
Attachments:
control 1.vi 37 KB
Hi
Don't flush when sending or receiving, except once right after opening the port.
Serial communication runs on two wires that send and receive independently, at least on the PC side.
Just write a byte, later read the complete message, then write again.
greetings from the Netherlands
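The replier's advice (frame your message and read the complete frame before writing again) can be sketched outside LabVIEW too. Below is a minimal Python sketch of sync-byte framing for the 12-byte packets; the header value 0xAA and the function name are illustrative assumptions, not taken from the attached VI:

```python
# Sketch of sync-byte framing for a 12-byte payload. The header value
# 0xAA and the frame layout are illustrative assumptions.
HEADER = 0xAA
FRAME_LEN = 13  # 1 header byte + 12 data bytes

def extract_frames(buffer: bytearray):
    """Pull every complete frame out of `buffer`, resynchronizing on HEADER.

    Returns a list of 12-byte payloads; leftover bytes stay in `buffer`,
    so a partially received frame survives until the next read.
    """
    frames = []
    while True:
        # Drop noise until the next header byte.
        while buffer and buffer[0] != HEADER:
            buffer.pop(0)
        if len(buffer) < FRAME_LEN:
            return frames  # wait for more bytes
        frames.append(bytes(buffer[1:FRAME_LEN]))
        del buffer[:FRAME_LEN]

# Example: two frames arriving split across reads, with a stray byte first.
buf = bytearray()
buf += bytes([0x01, 0xAA]) + bytes(range(12))  # noise + complete frame 1
print(extract_frames(buf))                     # one complete frame
buf += bytes([0xAA]) + bytes(range(6))         # frame 2, half received
print(extract_frames(buf))                     # nothing yet
buf += bytes(range(6, 12))                     # rest of frame 2
print(extract_frames(buf))                     # frame 2 complete
```

A lone sync byte can still false-lock if 0xAA happens to occur inside the payload; adding a length or checksum byte makes resynchronization robust. On the LabVIEW side the equivalent is to read exactly 13 bytes per frame and discard bytes until the header matches, rather than reading whatever "Bytes at Port" happens to report.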
Similar Messages
-
Hi Folks,
We have a data mismatch issue for one InfoObject (Base UOM) when data moves from the staging DSO to the conformance DSO. I will explain the issue here.
For one invoice number, the Base UOM value is displayed as M2 in the staging DSO. When the data goes from the staging DSO to the conformance DSO, the Base UOM is displayed as a blank value in the conformance DSO. There is no code written for the calculation of Base UOM at the transformation level; it is a direct mapping to the Base UOM field in the conformance DSO from the staging DSO. For all other invoices the Base UOM value is displayed correctly in the conformance DSO when compared to the staging DSO. For this one specific invoice, the Base UOM value shows as blank in the conformance DSO.
Could you please give me your suggestions on the reasons why the Base UOM value shows as blank in the conformance DSO?
Thanks in advance.
Regards,
Nag
Hi,
You will have to check the following things:
1) Check whether other records with base unit value M2 are updated properly in the conformance DSO. This will make sure there is no issue with this particular base unit.
2) As you have mentioned that other records are getting updated successfully, you might have to debug the data load for this particular record. Do a selective data load and check where the unit is getting cleared.
You can also check if there is any conversion routine at the infoobject level.
Regards,
Durgesh. -
Delta load from ODS to cube failed - Data mismatch
Hi all
We have a scenario where the data flow is like
R/3 table -> DataSource -> PSA -> InfoSource -> ODS -> Cube.
The cube has an additional field called "monthly version",
and since it is a history cube, it is supposed to hold
snapshots of all the data in the current cube for each month.
We are facing the problem that the data for the current month
is there in the history ODS but not in the cube. In the ODS -> Manage -> Requests
tab I can see only 1 red request, and that too with 0 records.
However, in the cube -> Manage -> Reconstruction tab, I can see 2 red requests
with the current month's date. Could these red requests be the reason for
the data mismatch between the ODS and the cube?
Please guide me on how I can solve this problem.
thanks all
annie
Hi
Thanks for the reply.
The load to the cube is a delta and runs daily.
The load to the ODS is a full load on a daily basis.
Can you help me sort out this issue? I have to work directly in the production environment, so it has to be safe and foolproof.
Thanks
annie -
Hi all,
My project manager showed me a screenshot of a data mismatch between the R/3 side and a BW report. Now I have to analyse where it got stuck.
Please give me the steps to follow to find that error. You surely have an idea of all the places where it is possible to get stuck, so please give your
suggestions; then I can find the problem easily.
thanks & regards,
R. Saravanan
Hi Saravanan,
Go to the R/3 side --> transaction RSA3 --> enter the DataSource name --> give some restriction w.r.t. some characteristics --> extract the data.
Come to the BW side --> transaction RSA1 --> start checking with either the InfoCube contents or the MultiProvider contents --> right click --> Display data --> give the same restrictions as you gave in RSA3 on R/3 --> Execute.
Compare the field values to see whether you are getting the same data or not.
If the data matches here, then there might be some filtering in the query --> check that filtering and check the data accordingly.
If it does not match at the cube/MultiProvider level --> then compare the data at the PSA level, before which check all the routines/update rules/transformations, because there is a possibility that the data is being filtered at any of these levels.
Hope it helps!
Regards,
Pavan -
Sometime before Firefox 3.x, the History window would show you "First Visited" and "Last Visited" columns.
Now you only get "Visit Date" which is equivalent to "Last Visited".
* Question -- anyone know how to bring back the "First Visited" column to the History window?
* I looked in about:config and couldn't find anything.
* I searched for an add-on and didn't find a current one. (There used to be an add-on called Extended History Manager that would show that column, but an administrator disabled its download page.)
* Anyone know how to bring back the "First Visited" column?
Having that column by default in the Mozilla Suite browser (pre-Firefox) was awesome! And at least in Firefox 1 and 2 you could use the Extended History Manager add-on. I miss "First Visited"! :(
Hi Fractalogic, this support forum doesn't work very well as a suggestion box because of the volume of posts and the audience reading it. You can try the following to get more visibility for the proposal:
* Help > Submit Feedback
* Filing a request for enhancement on https://bugzilla.mozilla.org/
You might also try recruiting any developer that created a useful history-related add-on to incorporate this feature. -
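For what it's worth, the underlying data for a "First Visited" column still exists: Firefox keeps one row per visit in places.sqlite (table moz_historyvisits), so MIN(visit_date) per page is the first visit and MAX(visit_date) the last. A small sketch, using an in-memory stand-in for that schema rather than opening a real profile (the column subset and sample values are illustrative):

```python
import sqlite3

# "First Visited" is recoverable from Firefox's places.sqlite, where
# moz_historyvisits stores one row per visit. Here we build a tiny
# in-memory stand-in for that schema instead of touching a real profile.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE moz_places (id INTEGER PRIMARY KEY, url TEXT);
    CREATE TABLE moz_historyvisits (
        id INTEGER PRIMARY KEY, place_id INTEGER, visit_date INTEGER);
    INSERT INTO moz_places VALUES (1, 'https://example.org/');
    INSERT INTO moz_historyvisits (place_id, visit_date)
        VALUES (1, 1000), (1, 2000), (1, 3000);
""")

# MIN(visit_date) gives "First Visited", MAX(visit_date) "Last Visited".
row = con.execute("""
    SELECT p.url, MIN(v.visit_date), MAX(v.visit_date)
    FROM moz_places p JOIN moz_historyvisits v ON v.place_id = p.id
    GROUP BY p.id
""").fetchone()
print(row)  # ('https://example.org/', 1000, 3000)
```

In a real profile, visit_date is microseconds since the epoch; an add-on (or an external script against a copy of places.sqlite) could surface the MIN per page as a "First Visited" column.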
Data Mismatch while selecting from External Table
Hi. I am not able to create an external table. I am trying to create a test table; I am able to create it, but when I select the data it shows a data mismatch. I tried it with test data but it returned an error. I want to load from an Excel file saved as test.csv.
CREATE TABLE Per_ext (
CITY VARCHAR2(30),
STATE VARCHAR2(20),
ZIP VARCHAR2(10),
COUNTRY VARCHAR2(30)
)
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER
DEFAULT DIRECTORY dataload
ACCESS PARAMETERS (
MISSING FIELD VALUES ARE NULL
)
LOCATION ('test.csv')
)
REJECT LIMIT UNLIMITED;
test.csv file contents
city ---------- state---------Zip------------country
Bombay----- MH------------34324-------india
london-------London------1321---------UK
Pune---------MH------------3224---------india
Banglore---- Karnataka---11313-------india
rgds
soumya
Hi Justin,
I am getting the following error when I try it from TOAD:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "comma": expecting one of: "badfile, byteordermark, characterset, column, data, delimited, discardfile, exit, fields, fixed, load, logfile, language, nodiscardfile, nobadfile, nologfile, date_cache, processing, readsize, string, skip, territory, variable"
KUP-01007: at line 1 column 29
ORA-06512: at "SYS.ORACLE_LOADER", line 19
rgds
soumya -
Data mismatch with ECC FBL5N report and debtor ageing report(BI side)
Hi,
I am facing a data mismatch problem between the FBL5N report (t-code for the customer line item report) on the ECC side and the debtor ageing report on the BI side.
The problems are:
1) there is a mismatch in some customer amounts between the ECC and BI sides.
2) there are also customer numbers with amounts in the BI debtor ageing report which are not in the ECC FBL5N report (t-code for the customer line item report).
For the second problem, I checked the tables BSID and BSAD on the ECC side; these customer numbers are not available there either.
One more strange thing is that, with the same selection in the report on both the ECC and BI sides, the mismatched data and the extra customers in the BI report change every day, i.e. we get a new set of mismatched data and extra customer numbers on the BI side each day.
If anyone has worked on this type of issue, kindly help...
Thanks in advance
Hi,
On the one hand, it may be the delta mechanism of the FI_*_4 extractors with its timestamp issue, so that your comparison between BI and ECC is never up to date at any point in time.
FI Extraction
On the other hand, it may be a delta problem between the data targets in your BI system, in case you load the FI data from a DSO to a cube and report on the cube. I have this problem at the moment and will watch this thread for more suggestions. -
Hi,
I am getting a data mismatch between an ECC standard table and an ECC standard extract structure. Please help me.
I am using 2LIS_04_P_P_MATNR data source (standard extract structure).
I have the following data in the ECC table (AFPO - Order Item):
AUFNR (Order Number) --> 4000460
PGMNG (Total planned quantity) --> 0
PSMNG (ORDER QTY) --> 2000
The extract structure shows the following data:
AUFNR (Order Number) --> 4000460
PGMNG (Total planned quantity) --> 2000 (It should be zero)
PSMNG (ORDER QTY) --> 2000
I have checked with many other order numbers. The extract structure in RSA3 shows correct values only if PGMNG is non-zero. If PGMNG is zero, it automatically takes the PSMNG value.
Is this common? I want to display the correct values for PGMNG from the AFPO table. Why this mismatch between table data and extract data? I have done all the required steps for refreshing the data (deleting setup tables, filling setup tables, etc.).
Please suggest. Points will be assigned to every useful answer.
Thanks in advance.
Hi Avneet,
I have a similar problem, where AFPO shows the value 1000 in the PSMNG field and RSA3 shows 1200 for extractor 2LIS_04_P_MATNR.
I have found that, as standard behavior, the extractor extracts the data at the time of release of the process order. If the process order is changed after the release, the extractor will not pick up the change.
Ex: AFPO value 1000 --- BI value 1000
Now released in ECC.
AFPO value 1000 --- BI value 1000
In ECC the value is changed to 1500 after the release. BI will not pick up the change.
The extractor pulls the data from the MCAFPOV table. You can check the values there; they match the RSA3 values.
I was just going through the forums for a solution. Having found it, I am just updating it here.
Hope it helps you.
Thanks
Srikanth -
Data mismatch in AFPO table & 2LIS_04_P_MATNR str. in RSA3
Hi,
I am getting a data mismatch between an ECC standard table and an ECC standard extract structure. Please help me.
I am using 2LIS_04_P_P_MATNR data source (standard extract structure).
I have following data in ECC Table (AFPO - Order Item).
AUFNR (Order Number) --> 4000460
PGMNG (Total planned quantity) --> 0
PSMNG (ORDER QTY) --> 2000
Extract Structure is showing following data.
AUFNR (Order Number) --> 4000460
PGMNG (Total planned quantity) --> 2000 (It should be zero)
PSMNG (ORDER QTY) --> 2000
I have checked with many other order numbers. The extract structure in RSA3 shows correct values only if PGMNG is non-zero. If PGMNG is zero, it automatically takes the PSMNG value.
Is this common? I want to display the correct values for PGMNG from the AFPO table. Why this mismatch between table data and extract data? I have done all the required steps for refreshing the data (deleting setup tables, filling setup tables, etc.).
Please suggest. Points will be assigned to every useful answer. Can anyone debug this in debug mode in RSA3 and let me know your findings?
Thanks in advance.
I have tried all the ways, but did not understand the logic. Can anyone please explain the logic behind it?
thanks, -
Inserting to MS-Access -Data mismatch
I am trying to insert records into an MS Access table. However, I got a data mismatch exception, and on figuring it out I realized there was a date field that I was populating with a string. Hence I converted it into a date and ran the query again; however, now I am neither getting an exception nor is the table getting updated.
The following is the code snippet where I get the data:
List <org.cuashi.wof.ws.nwis.ValueSingleVariable > ValueList = result.getTimeSeries().getValues().getValue();
for (org.cuashi.wof.ws.nwis.ValueSingleVariable value : ValueList)
System.out.format("%20s%10.4f",value.getDateTime().toString(),value.getValue());
System.out.println();
System.out.println("obtaining time series data");
String dateTime = value.getDateTime().toString().replace('T',' ');
//to convert string in to date
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
java.util.Date dt = sdf.parse(dateTime);
sdf.applyPattern("yyyy-MM-dd HH:mm:ss");
java.util.Date dt2 = sdf.parse(dateTime);
updateTable(siteCode,variableCode,dt2,timvalue,varCode,qualifierCode,qualityControlLevel,conn,rs);
}catch(Exception e)
public void updateTable(String siteCode,String variableCode,java.util.Date dt2,double tmvalue,String varCode,String qualifierCode,int qualityControlLevel,Connection con,ResultSet rs)
try{
System.out.println("inside update");
// con.setAutoCommit(false);
PreparedStatement pstmt = con.prepareStatement("INSERT INTO DataValues(ValueID,DataValue,ValueAccuracy,LocalDateTime,UTCOffset,DateTimeUTC,SiteID,VariableID,OffsetValue,OffsetTypeID,CensorCode,QualifierID,MethodID,SourceID,SampleID,DerivedFromID,QualityControlLevelID) VALUES(?,?,?,?,?,?.?,?,?,?,?,?,?,?,?,?,?)");
pstmt.setString(1,"1");
pstmt.setDouble(2,tmvalue);
pstmt.setInt(3,0);
pstmt.setDate(4,(java.sql.Date) dt2);
pstmt.setInt(5,0);
pstmt.setString(6,"0");
pstmt.setString(7,siteCode);
pstmt.setString(8,varCode);
pstmt.setInt(9,0);
pstmt.setInt(10,0);
pstmt.setInt(11,0);
pstmt.setString(12,qualifierCode);
pstmt.setInt(13,0);
pstmt.setInt(14,1);
pstmt.setInt(15,0);
pstmt.setInt(16,0);
pstmt.setInt(17,qualityControlLevel);
System.out.println("Statement prepared");
pstmt.execute();
//commit the transaction
con.commit();
pstmt.close();
}catch(SQLException e)
System.out.println("The Exception is " +e);
I found out that after field 4 the control does not go to the remaining fields at all.
Please let me know what I am missing.
System.out.format("%20s%10.4f",value.getDateTime().toString(),value.getValue());
System.out.println();
System.out.println("obtaining time series data");
String dateTime = value.getDateTime().toString().replace('T',' ');
//to convert string in to date
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
I'd recommend that you setLenient(false) on your sdf.
This pattern is what you've got? Sure?
java.util.Date dt = sdf.parse(dateTime);
sdf.applyPattern("yyyy-MM-dd HH:mm:ss");
java.util.Date dt2 = sdf.parse(dateTime);
updateTable(siteCode,variableCode,dt2,timvalue,varCode,qualifierCode,qualityControlLevel,conn,rs);
}catch(Exception e)
{Empty catch block? Not a smart idea. Print the stack trace.
public void updateTable(String siteCode,String variableCode,java.util.Date dt2,double tmvalue,String varCode,String qualifierCode,int qualityControlLevel,Connection con,ResultSet rs)
try{
System.out.println("inside update");
// con.setAutoCommit(false);
PreparedStatement pstmt = con.prepareStatement("INSERT INTO DataValues(ValueID,DataValue,ValueAccuracy,LocalDateTime,UTCOffset,DateTimeUTC,SiteID,VariableID,OffsetValue,OffsetTypeID,CensorCode,QualifierID,MethodID,SourceID,SampleID,DerivedFromID,QualityControlLevelID) VALUES(?,?,?,?,?,?.?,?,?,?,?,?,?,?,?,?,?)");
pstmt.setString(1,"1");
pstmt.setDouble(2,tmvalue);
pstmt.setInt(3,0);
pstmt.setDate(4,(java.sql.Date) dt2);
I'd recommend this:
pstmt.setDate(4,new java.sql.Date(dt2.getTime()));
pstmt.setInt(5,0);
pstmt.setString(6,"0");
pstmt.setString(7,siteCode);
pstmt.setString(8,varCode);
pstmt.setInt(9,0);
pstmt.setInt(10,0);
pstmt.setInt(11,0);
pstmt.setString(12,qualifierCode);
pstmt.setInt(13,0);
pstmt.setInt(14,1);
pstmt.setInt(15,0);
pstmt.setInt(16,0);
pstmt.setInt(17,qualityControlLevel);
System.out.println("Statement prepared");
pstmt.execute();
//commit the transaction
con.commit();
pstmt.close();
You should be closing your statement and connection in a finally block, in individual try/catch blocks.
Set autoCommit back to true.
}catch(SQLException e)
{
So you don't roll back if there's an exception? Bad idea.
System.out.println("The Exception is " +e);
I found out that after field 4 the control does not go to the remaining fields at all.
Please let me know what I am missing.
Lots of stuff. See above.
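One thing worth checking that neither post calls out: the VALUES clause above reads `?,?,?,?,?,?.?,...` with a period instead of a comma between the 6th and 7th markers, so the statement has fewer parameter markers than the 17 values being set. A small Python/sqlite3 sketch of the same pattern, with the marker count asserted and the date parsed once and bound cleanly (table and column names are illustrative, and sqlite3 stands in for the Access/JDBC stack):

```python
import sqlite3
from datetime import datetime

# Sketch of the same parameterized-insert pattern. The table and its
# columns are illustrative, not the original Access schema.
sql = "INSERT INTO DataValues (ValueID, DataValue, LocalDateTime) VALUES (?,?,?)"

# A '?.?' typo like the one above would silently change this count.
assert sql.count("?") == 3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE DataValues
               (ValueID TEXT, DataValue REAL, LocalDateTime TEXT)""")

# Parse the 'T'-separated timestamp once, strictly, then bind it.
dt = datetime.strptime("2008-05-01T12:30:00".replace("T", " "),
                       "%Y-%m-%d %H:%M:%S")
con.execute(sql, ("1", 2.5, dt.isoformat(" ")))
con.commit()

print(con.execute("SELECT * FROM DataValues").fetchone())
# ('1', 2.5, '2008-05-01 12:30:00')
```

Counting the markers against the number of bound values is a cheap sanity check in any driver; in JDBC the mismatched marker is exactly the kind of thing that makes binding stop partway through without an obvious error.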
-
IDOC failed due to date mismatch
Hi Experts,
I am facing a problem with an IDoc.
An IDoc was sent from one SAP system to another system.
In the target system it failed, the reason being a date mismatch.
When I checked through WE02, it had failed because the TODATE and FROMDATE were wrong, i.e. they got swapped.
So is there any way to process this errored-out IDoc?
Is it possible to change the dates and then process it manually?
Please let me know, as this is high priority.
Note: both SAP systems are production systems.
thanks in advance
Hi Hemant,
Please find below the steps to edit an IDoc segment after you find the error using WE02.
The example code can be found on this website:
http://www.sapgenie.com/sapedi/idoc_abap.htm
STEP 1 - Open document to edit
CALL FUNCTION 'EDI_DOCUMENT_OPEN_FOR_EDIT'
EXPORTING
document_number = t_docnum
IMPORTING
idoc_control = itab_edidc
TABLES
idoc_data = itab_edidd
EXCEPTIONS
document_foreign_lock = 1
document_not_exist = 2
document_not_open = 3
status_is_unable_for_changing = 4
OTHERS = 5.
STEP 2 - Loop at itab_edidd and change data
LOOP AT itab_edidd WHERE segnam = 'E1EDKA1'.
e1edka1 = itab_edidd-sdata.
IF e1edka1-parvw = 'LF'.
e1edka1-partn = t_eikto.
itab_edidd-sdata = e1edka1.
MODIFY itab_edidd.
EXIT.
ENDIF.
ENDLOOP.
STEP 3 - Change data segments
CALL FUNCTION 'EDI_CHANGE_DATA_SEGMENTS'
TABLES
idoc_changed_data_range = itab_edidd
EXCEPTIONS
idoc_not_open = 1
data_record_not_exist = 2
OTHERS = 3.
STEP 3a - Change control record
CALL FUNCTION 'EDI_CHANGE_CONTROL_RECORD'
EXPORTING
idoc_changed_control = itab_edidc
EXCEPTIONS
idoc_not_open = 1
direction_change_not_allowed = 2
OTHERS = 3.
STEP 4 - Close Idoc
Update IDoc status
CLEAR t_itab_edids40.
t_itab_edids40-docnum = t_docnum.
t_itab_edids40-status = '51'.
t_itab_edids40-repid = sy-repid.
t_itab_edids40-tabnam = 'EDI_DS'.
t_itab_edids40-mandt = sy-mandt.
t_itab_edids40-stamqu = 'SAP'.
t_itab_edids40-stamid = 'B1'.
t_itab_edids40-stamno = '999'.
t_itab_edids40-stapa1 = 'Sold to changed to '.
t_itab_edids40-stapa2 = t_new_kunnr.
t_itab_edids40-logdat = sy-datum.
t_itab_edids40-logtim = sy-uzeit.
APPEND t_itab_edids40.
CALL FUNCTION 'EDI_DOCUMENT_CLOSE_EDIT'
EXPORTING
document_number = t_docnum
do_commit = 'X'
do_update = 'X'
write_all_status = 'X'
TABLES
status_records = t_itab_edids40
EXCEPTIONS
idoc_not_open = 1
db_error = 2
OTHERS = 3.
An alternative is to use WE19 with reference to the errored IDoc number.
After you change the incorrect segment(s), you can post to the application.
As a result, a new IDoc will be created.
You can use FM EDI_DOCUMENT_DELETE to delete the incorrect/errored IDoc number.
You can also use WE02 to change the incorrect segment.
Double-click on the incorrect segment to get the detail info. Then go to Data Record (menu) -> Change -> make the necessary changes -> Save.
Then you can use the program RBDINPUT to reprocess the errored IDoc.
Hope this will help.
Regards,
Ferry Lianto -
hi team
I have a report where the data mismatches the R/3 data. My questions are:
1. How do we do reconciliation with the R/3 data?
2. What is aggregated level?
3. How do we run the statistical setup in LO?
Can you explain, step by step, how we solve data mismatch errors, e.g. a missing company code?
Thanks in advance
v.muralikrishna
Hi
1. How do we do reconciliation with the R/3 data?
Yes, we can. For MM we use the MB5B and MMBE t-codes and also some tables (this is one example only). The related functional people will also need to work on reconciling the data; the BW consultants alone can't do the reconciliation.
2. What is aggregated level?
Eg:
06.2010 -- 10
06.2010 -- 20
The aggregated level is
06.2010 -- 30
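The aggregation in this example can be sketched in a few lines:

```python
from collections import defaultdict

# Records as (period, amount) pairs, as in the example above.
records = [("06.2010", 10), ("06.2010", 20)]

# Aggregated level: one summed row per period.
totals = defaultdict(int)
for period, amount in records:
    totals[period] += amount

print(dict(totals))  # {'06.2010': 30}
```

In BW terms, the aggregated level is what the query or aggregate sees: key figures summed over the characteristic values that remain in the drill-down.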
3. How do we run the statistical setup in LO?
For LO, follow the steps below.
These steps are for the SD module; for your DataSource, change the t-code used to fill the setup tables and replace the SD DataSource with your DataSource in the following steps.
1. First install the DataSource in RSA5, see it in RSA6, and activate it in LBWE.
Before doing steps 2 to 6, lock the ECC system, i.e. no transactions should happen.
2. Then delete the queues in LBWQ as below:
MCEX11 --> For 2LIS_11_*
MCEX12 --> For 2LIS_12_*
MCEX13 --> For 2LIS_13_*
Be careful while doing all these deletions on production servers.
3. Then delete any entries in RSA7:
Eg:
2LIS_11_*
2LIS_12_*
2LIS_13_*
At the time of filling the setup tables, no entries should exist in LBWQ in ECC for the following queues:
MCEX11 --> For 2LIS_11_*
MCEX12 --> For 2LIS_12_*
MCEX13 --> For 2LIS_13_*
4. Then delete the setup tables using t-code LBWG, selecting the application number, i.e. 11, 12 or 13.
5. Then load the setup tables using OLI7BW, OLI8BW and OLI9BW.
Give Name of run = XYZ, Termination Date = tomorrow's date, and execute it in the background,
i.e. Program --> Execute in Background.
2LIS_11_*: use t-code OLI7BW to fill the setup tables
2LIS_12_*: use t-code OLI8BW to fill the setup tables
2LIS_13_*: use t-code OLI9BW to fill the setup tables
6. Check the job status in SM37, and once it has finished go to RSA3, execute it and check.
7. Then replicate the DataSource in BW.
8. Install the InfoCube/DSO from Business Content, or create the InfoCube/DSO, and then map the ECC DataSource fields to the BW InfoObjects in the transfer rules (in BW 3.5) or in transformations (in BI 7.0).
9. Map the InfoObjects in the InfoSource to the InfoObjects in the InfoCube/DSO in the update rules (in BW 3.5) or in transformations (in BI 7.0).
10. Create an InfoPackage and load an init or full load.
Thanks
Reddy -
OSI / OTI table data mismatch, OTI being subset of OSI
Hi,
We have a custom application where we fetch the count and data of all provisioning records assigned to the logged-in user.
To fetch the details of open provisioning tasks, the Oracle recommendation is to use the OTI table.
When we studied the current system, we had the findings below regarding the behavior of the OSI and OTI tables:
1. When a request is submitted to OIM, an approval process is triggered depending upon the approval policy.
2. This workflow is responsible for creating approval tasks.
3. When the approval tasks are approved, the request in OIM is closed and the provisioning task for the system gets created.
4. The OIM database tables used to fetch these provisioning tasks are OSI/OTI.
5. According to the Oracle documentation, the OTI table is a subset of the OSI table.
6. When we checked both these tables, we found that the entries in them are not exactly the same.
7. For a particular request, the OSI table had OSI_ASSIGN_TYPE as 'Group' and the OTI table had OSI_ASSIGN_TYPE as 'User'.
8. As the OTI table has the osi_assigned_to_ugp_key value as null, this table cannot be used.
It looks like the OTI table does not hold correct data. And the OSI table has around 9.6 million records, so when we query the OSI table we get correct data, but it takes 6 minutes to get it.
We are now looking into why there is this mismatch and how it can be solved.
Regards,
Deepika
Hi Kevin,
Thanks for these details.
We will try this query in our environment and see the result.
But regarding the OSI/OTI data mismatch, I understand that OTI contains tasks that are still open, like Rejected and Pending. These tasks are also in OSI; when we check a record in OSI those tasks are assigned to a user, while when we see the same record in OTI the assign type has changed.
Is that the intended functionality, or is something wrong here?
Because of this we cannot get correct data directly from the OTI table and have to use OSI (which is hampering performance).
This is the query that we are using:
select distinct oti.orc_key, a.usr_login, (select min(inner.oti_create) from oti inner where inner.orc_key = oti.orc_key)
as oti_create from oti inner join oiu on oti.orc_key = oiu.orc_key inner join usr a on oiu.usr_key = a.usr_key
inner join osi on oti.sch_key = osi.sch_key where sch_status in ('P', 'R') and obj_key in
(select obj_key from obj where obj_name in ('DB App', 'K2 App', 'LDAP App', 'SPML App', 'Manual App'))
and (osi.osi_assign_type = 'Group' and osi.osi_assigned_to_ugp_key in
(select ugp_key from usg where usr_key = " + Long.toString(Oim.getCurrentUser().getKey()) + "))
UNION ALL
select distinct oti.orc_key, a.usr_login, (select min(inner.oti_create) from oti inner where inner.orc_key = oti.orc_key)
as oti_create from oti inner join oiu on oti.orc_key = oiu.orc_key inner join usr a on oiu.usr_key = a.usr_key
inner join osi on oti.sch_key = osi.sch_key where sch_status in ('P', 'R') and obj_key in
(select obj_key from obj where obj_name in ('DB App', 'K2 App', 'LDAP App', 'SPML App', 'Manual App'))
and (osi.osi_assign_type in ('User', 'Default task assignment') and
osi.osi_assigned_to_usr_key = " + Long.toString(Oim.getCurrentUser().getKey()) + ")
order by oti_create
Regards,
Deepika -
How to identify the data mismatch between inventory cube and tables?
Hi experts,
I have a scenario: how do I identify a data mismatch between 0IC_C03 and the underlying tables, and what steps should I follow to avoid such a mismatch?
Hi
You can use the data reconciliation method to check the consistency of data between R/3 and BW. Please check the links below:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a0931642-1805-2e10-01ad-a4fbec8461da?QuickLink=index&overridelayout=true
http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d08ce5cd-3db7-2c10-ddaf-b13353ad3489
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/7a5ee147-0501-0010-0a9d-f7abcba36b14?QuickLink=index&overridelayout=true
Thanks & Regards,
RaviChandra -
Data Mismatch problem in Production system
Dear Experts,
We have been encountering serious data mismatch problems with our source system since 29th Dec 2010.
We realized this mismatch only on 11th Jan 2011, so we deleted all data from 29th Dec 2010 through 11th Jan 2011. Then we tried to load the delta again as per our process chain sequence. Since then, our production system has become very inconsistent. We found that all the data from 29th Dec 2010 to date started showing a mismatch.
Our system's data got drastically damaged. If I delete any bad requests from our cubes and DataStore objects, then the next time we try to load data, the cube or DataStore object shows that the deleted request number was not updated properly. Because of this, our process chain fails daily and gives many technical errors. We are totally confused about how to restore our data from 29th Dec 2010 to date.
Can anyone please help us restore our data fully and get the process chain to run successfully?
Thanks in Advance,
Suman
Hi Suman,
I understood that you are using the 3.x version of BW.
Such issues occur if you disturb the delta sequence by deleting requests without making them red.
You can resolve this in two ways:
1) Go to the reconstruction tab of the DSO and cube, select all the requests from 29th Dec 2010 through 11th Jan 2011 and reconstruct them. Once they appear in the request tab, set the QM status to red one by one and delete them from the cube and DSO.
This should set the delta status back to the 29th, and you will be able to run a repeat delta in your next delta load.
2) Go to RSRQ and enter, one by one, the SID or request ID of each request reported as not updated properly.
Make the QM status red for all of them.
Since you have already deleted them from the targets, just changing the QM status to red will be sufficient.
You have to set the QM status to red in order to inform the system that the request is an incorrect one and has been deleted.
Once it is set to red, the system will no longer prompt about any request, saying "request no. # not updated properly".
I suggest the second option.
Once you resume the delta load, it should run successfully. And if you feel some data is still missing from 29th Dec 2010 through 11th Jan 2011, just run a full repair load to get the data back.
If your DataSource uses LO extraction you may need to fill the setup tables to perform the full repair.
Please let me know if the DataSource uses LO extraction. Also let me know once the issue is resolved.
Regards,
Sudheer.
Edited by: Sudheer Kumar Kurra on Jan 22, 2011 7:58 AM
Edited by: Sudheer Kumar Kurra on Jan 22, 2011 8:03 AM