Query on DTP error
Hi BW Experts,
I am loading data to a master data InfoObject from ECC 6.0. The data has arrived in the PSA correctly. While running the DTP to move the data from the PSA to the master data InfoObject (ZZEMPLOYEE), some records show an error.
The error is: Data record 2 ('00050008'): version 'HYD' is not valid.
The same error appears for another 10 records.
What should I do now? Could someone please help me out?
Thanks
Thanks
Hi Vishwaanand,
The datatypes are the same on both sides (ECC 6.0 field and InfoObject in BI 7.0), namely CHAR.
In this scenario, 10 fields were added as enhancements to 0EMPLOYEE_ATTR in ECC 6.0. Of these 10 fields, two come from table T500P (NAME1, Transfer from Circle), two from table T001P (BTEXT, Transfer from Location), two from PA0000 (BEGDA), two from PA0023 (ARBGB, the difference between BEGDA and ENDDA), etc.
I checked the datatypes of those fields in ECC 6.0 and created corresponding InfoObjects prefixed with Z. I mapped these InfoObjects in the transformation from 0EMPLOYEE_ATTR to YEMPLOYEE (the target master data InfoObject).
The data flow in 7.0 is 0EMPLOYEE_ATTR (3.x) -> InfoSource -> YEMPLOYEE.
I replicated the DataSource and ran the InfoPackage, and all 362 records reached the PSA correctly.
While running the DTP, 29 of the 362 records are in error, with messages such as:
YTRTOLOC (Transfer to Location): Data record 6 ("00050008"): Version 'chennai' is not valid
YTRTOCIR (Transfer to Circle): Data record 6 ("00050008"): Version 'Corporate' is not valid
YTRFRLOC (Transfer from Location): Data record 6 ("00050008"): Version 'Chennai' is not valid
YTRFRCIR (Transfer from Circle): Data record 6 ("00050008"): Version 'Corporate' is not valid
etc.
How can I correct these records?
Could you please help me out?
Similar Messages
-
DTP Error when loading Master data from DSO to InfoProvider
Hi Experts,
My DTP is failing when I run a DELTA load from the DSO to the InfoProvider. The errors are either duplicate records or overlapping dates.
Thanks,
SB.
Hi Gopal,
Here is the info on error message.
" There are duplicates of the data record 1 & with the key '00000000038 &' for characteristic EPROPERTY &. "
Hope this helps in understanding the DTP error.
Thanks,
SB. -
We are getting multiple 8623 errors in the SQL log while running a vendor's software.
How can you catch which query causes the error?
I tried to catch it using a SQL Profiler trace, but it doesn't show which query/SP is the one causing the error.
I also tried an Extended Events session to catch it, but it doesn't create any output either.
Error:
The query processor ran out of internal resources and could not produce a query plan. This is a rare event and only expected for extremely complex queries or queries that
reference a very large number of tables or partitions. Please simplify the query. If you believe you have received this message in error, contact Customer Support Services for more information.
Extended Event Session that I used;
CREATE EVENT SESSION
overly_complex_queries
ON SERVER
ADD EVENT sqlserver.error_reported
ACTION (sqlserver.sql_text, sqlserver.tsql_stack, sqlserver.database_id, sqlserver.username)
WHERE ([severity] = 16
AND [error_number] = 8623)
ADD TARGET package0.asynchronous_file_target
(SET filename = 'E:\SQLServer2012\MSSQL11.MSSQLSERVER\MSSQL\Log\XE\overly_complex_queries.xel' ,
metadatafile = 'E:\SQLServer2012\MSSQL11.MSSQLSERVER\MSSQL\Log\XE\overly_complex_queries.xem',
max_file_size = 10,
max_rollover_files = 5)
WITH (MAX_DISPATCH_LATENCY = 5 SECONDS)
GO
-- Start the session
ALTER EVENT SESSION overly_complex_queries
ON SERVER STATE = START
GO
It creates only the .xel file, but not the .xem file.
Any help/advice is greatly appreciated.
Hi VK_DBA,
According to your error message, for query statements that fail with error 8623, as mentioned in another post you can use trace flags 4102 and 4118 to work around this error. Another approach is to look for queries with very long IN lists, a large number of UNIONs, or a large number of nested subqueries; these are the most common causes of this particular error message.
Error 8623 occurs, for example, when attempting to select records through a query with a large number of entries in the IN clause (> 10,000). To avoid this error, I suggest applying the latest Cumulative Update for SQL Server 2012 Service Pack 1, then simplifying the query. You may try a divide-and-conquer approach: get part of the query working (e.g. into a temp table) and then add the extra joins/conditions. Or you could try running the query with the hints OPTION (FORCE ORDER), OPTION (HASH JOIN), or OPTION (MERGE JOIN), possibly via a plan guide.
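As a rough, database-agnostic sketch of the divide-and-conquer idea for oversized IN lists (the table name `orders` and the batch size here are made up for illustration), a client can split the list into batches and run one small query per batch instead of one giant statement:

```python
def chunked(values, size):
    """Yield successive batches of at most `size` items."""
    for i in range(0, len(values), size):
        yield values[i:i + size]

def build_batched_queries(ids, batch_size=1000):
    """Build one small IN-list query per batch instead of a single huge IN list."""
    queries = []
    for batch in chunked(ids, batch_size):
        placeholders = ", ".join("?" for _ in batch)
        queries.append(("SELECT * FROM orders WHERE id IN (%s)" % placeholders, batch))
    return queries

qs = build_batched_queries(list(range(2500)))
print(len(qs))  # 3 batches: 1000 + 1000 + 500
```

The application then executes each (query, parameters) pair and unions the results client-side, which keeps every individual plan small.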
For more information about error 8623, you can review the following article.
http://blogs.technet.com/b/mdegre/archive/2012/03/13/8623-the-query-processor-ran-out-of-internal-resources-and-could-not-produce-a-query-plan.aspx
Regards,
Sofiya Li
Sofiya Li
TechNet Community Support -
Dtp error when loading data from one cube to another cube?
hi,experts
It seems strange: when I tick some characteristics as navigational attributes in the cube, an error occurs during execution of the DTP: error while updating to target Z* (cube name).
Once I turn the flag off, no error appears. Could anyone give me a clue?
Thanks in advance!
Hi,
When you make changes in the cube, you need to make the corresponding changes in the transformation and reactivate the DTP. The navigational attributes you checked in the cube will appear in your transformation as new fields; you need to create a mapping for them, activate the transformation, activate your DTP, and then load the data. This should resolve your issue.
Regards -
Using NVL in Query of Query resulting in error
I'm still using CF8 and Oracle 11G back-end.
When I use NVL in the query of query, I get an error. Can't I use NVL to check for null values? Please help.
Here is my code:
<cfquery name="GetC2" datasource="#Trim(application.OracDSN)#">
SELECT CamID2, rel2_2,p_ln2,p_fn2,ins,l_year
FROM prt_temp
WHERE Ins = 'CC'
AND l_year = '1481'
AND NVL(Child_LN2,' ') <> ' '
AND NVL(Child_FN2,' ') <> ' '
</cfquery>
<cfif GetC2.Recordcount NEQ 0>
<cfquery name="CheckRel2C2" dbtype="QUERY">
SELECT CamID2, rel2_2
FROM GetC2
WHERE NVL(Rel2_2,' ') <> ' '
AND NVL(p_ln2,' ') = ' '
AND NVL(p_fn2,' ') = ' '
AND Ins = 'CC'
AND l_year = '1481'
</cfquery>
</cfif>
The error:
Error Executing Database Query.
Query Of Queries syntax error.
Encountered "NVL ( Rel2_2 ,". Incorrect conditional expression. Expected one of [like|null|between|in|comparison] condition.
NVL is an Oracle function, and is not available in ColdFusion Query of Query. If you are trying to check for null values, then use IS NULL or IS NOT NULL. So
WHERE NVL(Rel2_2,' ') <> ' '
AND NVL(p_ln2,' ') = ' '
AND NVL(p_fn2,' ') = ' '
becomes
WHERE Rel2_2 IS NOT NULL
AND p_ln2 IS NULL
AND p_fn2 IS NULL
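If it helps to see the IS NULL behavior in isolation, here is a small sketch against SQLite's in-memory engine (column names borrowed loosely from the thread; ColdFusion QoQ itself is not involved):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (rel2_2 TEXT, p_ln2 TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [("child", None), (None, "smith"), ("kid", "jones")])

# Portable NULL test: IS NULL / IS NOT NULL works in essentially every SQL
# dialect, unlike the Oracle-specific NVL() function.
rows = conn.execute(
    "SELECT rel2_2 FROM t WHERE rel2_2 IS NOT NULL AND p_ln2 IS NULL"
).fetchall()
print(rows)  # [('child',)]
```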
-Carl V. -
How do I clear or delete the DTP ERROR STACK..?
HI Experts
I have a number of records written to a DTP error stack for a master data load. Subsequent executions of the DTP write further records to the stack if they have the same semantic key (even if they are not in error).
I know I can execute an Error DTP to clear the stack.
But is there any way to clear the stack rather than running the Error DTP? I don't want the records in error to be processed through to the InfoObject. I only want the subsequent records to go through, without going into the error stack.
tony
Hi Tony,
you have to remove the request from the target to delete the records from the error stack. If you don't want to reprocess them, try to avoid moving them to the error stack in the first place. One way is to skip the records without creating a monitor entry, so the system doesn't register the error and the record is not shifted to the error stack; the other is to delete the record from the source or result package through a routine.
Deleting single records from the error stack is not possible. For further information read this thread:
Delete & Automate deletion of Error Stack
Regards Michael
Edited by: Michael Seifert on Nov 20, 2008 4:04 PM -
Processing a cursor of 11,000 rows and Query completed with errors
So I have 3rd-party data that I have loaded into a SQL Server table. I am trying to determine whether the 3rd-party members reside in our database by using a cursor, going through all 11,000 rows and substituting the parameter values in a LIKE statement, trying to keep the match fairly broad. I tried running this in SQL Server Management Studio; it churned for about 5 minutes and then just quit. I figured I was pushing the buffer limits within SSMS, so instead I created it as a stored procedure, changed Query Options/Results, and checked "Discard results after execution". This time it churned away for 38 minutes and then stopped with "Query completed with errors". I did throw a COMMIT in there, thinking the COMMIT would free up resources and I'd see the table being loaded in chunks, but that didn't seem to work.
I'm kind of at a loss here in terms of trying to tie back this data.
Can anyone suggest anything?
Thanks for your review and am hopeful for a reply.
WHILE (@@FETCH_STATUS=0)
BEGIN
-- note: each SET after the first must append to @SQLString, not overwrite it
SET @SQLString = 'INSERT INTO [dbo].[FBMCNameMatch]' + @NewLineChar;
SET @SQLString = @SQLString + ' (' + @NewLineChar;
SET @SQLString = @SQLString + ' [FBMCMemberKey],' + @NewLineChar;
SET @SQLString = @SQLString + ' [HFHPMemberNbr]' + @NewLineChar;
SET @SQLString = @SQLString + ' )' + @NewLineChar;
SET @SQLString = @SQLString + 'SELECT ';
SET @SQLString = @SQLString + CAST(@FBMCMemberKey AS VARCHAR) + ',' + @NewLineChar;
SET @SQLString = @SQLString + ' [member].[MEMBER_NBR]' + @NewLineChar;
SET @SQLString = @SQLString + 'FROM [Report].[dbo].[member] ' + @NewLineChar;
SET @SQLString = @SQLString + 'WHERE [member].[NAME_FIRST] LIKE ' + '''' + '%' + @FirstName + '%' + '''' + ' ' + @NewLineChar;
SET @SQLString = @SQLString + 'AND [member].[NAME_LAST] LIKE ' + '''' + '%' + @LastName + '%' + '''' + ' ' + @NewLineChar;
EXEC (@SQLString)
--SELECT @SQLReturnValue
SET @CountFBMCNameMatchINSERT = @CountFBMCNameMatchINSERT + 1
IF @CountFBMCNameMatchINSERT = 100
BEGIN
COMMIT;
SET @CountFBMCNameMatchINSERT = 0;
END
FETCH NEXT
FROM FBMC_Member_Roster_Cursor
INTO @MemberIdentity,
@FBMCMemberKey,
@ClientName,
@MemberSSN,
@FirstName,
@MiddleInitial,
@LastName,
@AddressLine1,
@AddressLine2,
@City,
@State,
@Zipcode,
@TelephoneNumber,
@BirthDate,
@Gender,
@EmailAddress,
@Relation
END
--SELECT *
--FROM [#TempTable_FBMC_Name_Match]
CLOSE FBMC_Member_Roster_Cursor;
DEALLOCATE FBMC_Member_Roster_Cursor;
GO
Hi ITBobbyP,
As Erland suggested, you can compare all rows at once. Basing on my understanding on your code, the below code can lead to the same output as yours but have a better performance than cursor I believe.
CREATE TABLE [MemberRoster]
(
MemberKey INT,
FirstName VARCHAR(99),
LastName VARCHAR(99)
);
INSERT INTO [MemberRoster]
VALUES
(1,'Eric','Zhang'),
(2,'Jackie','Cheng'),
(3,'Bruce','Lin');
CREATE TABLE [yourCursorTable]
(
MemberNbr INT,
FirstName VARCHAR(99),
LastName VARCHAR(99)
);
INSERT INTO [yourCursorTable]
VALUES
(1,'Bruce','Li'),
(2,'Jack','Chen');
SELECT * FROM [MemberRoster]
SELECT * FROM [yourCursorTable]
--INSERT INTO [dbo].[NameMatch]
--[MemberNbr],
--[MemberKey]
SELECT y.MemberNbr,
n.[MemberKey]
FROM [dbo].[MemberRoster] n
JOIN [yourCursorTable] y
ON n.[FirstName] LIKE '%'+y.FirstName+'%'
AND n.[LastName] LIKE '%'+y.LastName+'%'
DROP TABLE [MemberRoster], [yourCursorTable]
If you have any question, feel free to let me know.
Eric Zhang
TechNet Community Support -
Query engine failed error for crytal report refreshing to new params in jsp
Using a licensed WebLogic 8.1 server in production mode. WebLogic Workshop has integrated support for Crystal Reports 9. Using a standalone report and accessing SQL Server through ODBC, I got results for the different parameters passed.
Problem area: passing a parameter to produce a specific report causes an unexpected "query engine failed" error in the com.crystaldecisions.report.web.viewer.CrystalViewer class. If the viewer.refresh method is commented out, then the static (already saved) report is displayed through the JSP. But using viewer.refresh for dynamic report generation with new parameters through the JSP gives the above error.
Hello
I'm experiencing the same problem. Please let me know if you have any solution.
DRG-50901: text query parser syntax error
The query
SELECT * FROM ij
where
CONTAINS (ij.summary, 'ATTENZIONE!') > 0 returns an error:
ORA-29902: error in executing ODCIIndexStart() routine
ORA-20000: Oracle Text error:
DRG-50901: text query parser syntax error on line 1, column 13
Why?
There is a TEXT index on the summary column:
CREATE INDEX IJL_SUMMARY_IX ON IJ
(SUMMARY)
INDEXTYPE IS CTXSYS.CONTEXT
PARAMETERS('
lexer MITO_LEXER
wordlist DEFAULT_WORDLIST
stoplist IJL_STOPLIST
storage IJL_TEXT_STORAGE
SYNC (EVERY "SYSDATE + 10/1440")')
PARALLEL ( DEGREE 4 INSTANCES 1 );
where the MITO_LEXER is
BEGIN
CTX_DDL.create_preference ('mito_lexer', 'BASIC_LEXER');
CTX_DDL.set_attribute ('mito_lexer', 'INDEX_STEMS', 'ITALIAN');
-- MITO-318: search on Text Index for Asterisks
CTX_DDL.set_attribute ('mito_lexer', 'printjoins', '*');
END;
/
Because the exclamation mark ("!") is a reserved operator, meaning soundex, and must appear before the word it applies to.
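If I understand the Oracle Text escaping rules correctly, a common workaround is to wrap the term in braces so reserved characters such as "!" are treated literally. A hypothetical query-builder helper might look like:

```python
def escape_text_query(term):
    """Wrap a user-supplied term in braces so Oracle Text reserved
    characters like '!' are taken literally rather than as operators.
    Strips any '}' from the input, since it would end the escape early."""
    return "{" + term.replace("}", "") + "}"

print(escape_text_query("ATTENZIONE!"))  # {ATTENZIONE!}
```

The escaped term would then be used as `CONTAINS (ij.summary, '{ATTENZIONE!}') > 0`.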
-
What does this DTP error indicate?
I am trying to correct this error in DTP:
"Error in formula function (routine 63), record 37816"
"Records filtered because records with same key contain errors"
This is a data error, but not one caused by bad or invalid characters. I am not sure how it can be corrected based on the above message.
Has anyone come across such an error, and how was it resolved?
Thanks!
Edited by: BI Quest on Aug 30, 2008 9:20 AM
Hi,
This issue can be resolved by correcting the error records in the PSA and updating them into the target. The first step is to identify whether all the records are in the PSA. You can find this out by checking the Details tab in RSMO, the job log, or the PSA (sorting records based on status, etc.). Once that is confirmed, force the request to red and delete the particular request from the target cube. Then go to the PSA, edit the incorrect records (correcting or blanking out the invalid entries in the particular field/InfoObject of each incorrect record), and save. Once all the incorrect records are edited, go to RSA1 > PSA, find the particular request, and update it to the target manually (right-click on the PSA request > Start update immediately).
I will add the step by step procedure to edit PSA data and update into target (request based).
Identifying incorrect records.
The system won't show all the incorrect records the first time. You need to search the PSA table manually to find all of them.
1. First, in RSMO > Details, expand the update rules / processing tabs and you will find some of the error records.
2. Then go to the PSA and filter using the status of the records; filter all the red requests. This may still not show all the incorrect records.
3. Then go to the PSA and filter the incorrect records based on the particular field.
4. If this also doesn't work, go to the PSA and sort (not filter) the records based on the particular field with incorrect values, and it will show all the records. Note down the record numbers and then edit them one by one.
If you want to confirm, find the PSA table and search manually.
Also run the report RS_ERRORLOG_EXAMPLE. With this report you can display all the incorrect records of the load, and you can also find whether the error occurred in the PSA or in the transfer rules.
Steps to resolve this
1. Force the request to red in RSMO > Status tab.
2. Delete the request from the target.
3. In RSMO, at the top right you can see the PSA maintenance button; click it to go to the PSA.
4. Edit the record.
5. Save the PSA data.
6. Go to RSA1 > search by request name > right-click > update the request from the PSA to the target.
Refer how to Modify PSA Data
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40890eda-1b99-2a10-2d8b-a18b9108fc38
This should solve your problem for now.
As a long-term fix, you can apply a user exit on the source system side, or change your update rules to ensure that this field is blanked out before being loaded into the cube, or add that particular character to the permitted character list in BW.
Also in Tcode RSKC --> type ALL_CAPITAL --> F8 (Execute)
OR
Go to SE38 and execute the program RSKC_ALLOWED_CHAR_MAINTAIN and give ALL_CAPITAL or the char you want to add.
Check the table RSALLOWEDCHAR. It should contain ALL_CAPITAL or the char you have entered.
In RSKC you can add up to 45 characters.
If you add a character in RSKC, it is automatically added to the RSALLOWEDCHAR table; in this table you can add up to 72 characters using an ABAP program.
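As an illustration of the cleansing-routine idea (independent of RSKC, and with a made-up permitted set), such a routine replaces characters outside the allowed list before the record reaches the target:

```python
# Hypothetical permitted set, loosely modeled on BW's ALL_CAPITAL plus digits.
ALLOWED = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 -_")

def cleanse(value):
    """Replace characters outside the permitted set with '#', mirroring what
    a transfer/update routine would do before the record hits the target."""
    return "".join(ch if ch.upper() in ALLOWED else "#" for ch in value)

print(cleanse("EQUIP!1111*TAG"))  # EQUIP#1111#TAG
```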
Refer
Invalid characters in SAP BW 3.x: Myths and Reality. Part 2.
Invalid characters in SAP BW 3.x: Myths and Reality. Part 1.
Steps of Including one special characters into permitted ones in BI
http://help.sap.com/saphelp_nw04/helpdata/en/64/e90da7a60f11d2a97100a0c9449261/frameset.htm
For adding Other characters
OSS Note #173241, "Allowed characters in the BW System"
Sample cleansing routine (#)
Help loading char EQUIP#1111#TAG#3311 SN#A01040 * into Cube
Invalid character error while activating DSO - Please help
Regarding Special Characters
how to prevent bad charectors from source system
Regards
Tg -
Query 0: Runtime error There is already a line with the same key. with para
Dear all,
I have a query with several variables. One of the variables is Version. When I use certain values, e.g. 1, 2, or 3, I can run the query without any problem. However, when I use other values, e.g. 4, 5, or 6, the query gives me the following error messages:
1. Query 0: Runtime error There is already a line with the same key. with parallel processing via RFC
2. Error while reading data; navigation is possible
3. >> Row: 174 Inc: LRSDRPU02 Prog: SAPLRSDRP
Error 1
Query 0: Runtime error There is already a line with the same key. with parallel processing via RFC
Message no. DBMAN428
Error 2
Error while reading data; navigation is possible
Message no. BRAIN289
Diagnosis
An error occurred while reading the data. The query result is therefore empty or inconsistent and is not buffered in the OLAP cache.
Procedure
You can continue to navigate or return to the last navigation step.
Error 3
>> Row: 174 Inc: LRSDRPU02 Prog: SAPLRSDRP
Message no. RS_EXCEPTION301
Diagnosis
An error has been triggered. This message specifies where in the coding the error occurred. This helps you to localize the error quickly.
May I know what causes this error and how to troubleshoot it?
Thank you.
Venkat,
You are either a genius or working for SAP.
In any case, your solution solved my problem.
Thanks heaps! -
Comma delimited in Sql query decode function errors out
Hi All,
DB: 11.2.0.3.0
I am using the query below to generate comma-delimited output in a spool file, but it errors out with the message below:
SQL> set lines 100 pages 50
SQL> col "USER_CONCURRENT_QUEUE_NAME" format a40;
SQL> set head off
SQL> spool /home/xyz/cmrequests.csv
SQL> SELECT
2 a.USER_CONCURRENT_QUEUE_NAME || ','
3 || a.MAX_PROCESSES || ','
4 || sum(decode(b.PHASE_CODE,'P',decode(b.STATUS_CODE,'Q',1,0),0)) Pending_Standby ||','
5 ||sum(decode(b.PHASE_CODE,'P',decode(b.STATUS_CODE,'I',1,0),0)) Pending_Normal ||','
6 ||sum(decode(b.PHASE_CODE,'R',decode(b.STATUS_CODE,'R',1,0),0)) Running_Normal
7 from FND_CONCURRENT_QUEUES_VL a, FND_CONCURRENT_WORKER_REQUESTS b
where a.concurrent_queue_id = b.concurrent_queue_id AND b.Requested_Start_Date <= SYSDATE
8 9 GROUP BY a.USER_CONCURRENT_QUEUE_NAME,a.MAX_PROCESSES;
|| sum(decode(b.PHASE_CODE,'P',decode(b.STATUS_CODE,'Q',1,0),0)) Pending_Standby ||','
ERROR at line 4:
ORA-00923: FROM keyword not found where expected
SQL> spool off;
SQL>
Expected output in the spool /home/xyz/cmrequests.csv
Standard Manager,10,0,1,0
Thanks for your time!
Regards,
Get to work immediately on marking your previous questions ANSWERED if they have been!
>
I am using the below query to generate the comma delimited output in a spool file but it errors out with the message below:
SQL> set lines 100 pages 50
SQL> col "USER_CONCURRENT_QUEUE_NAME" format a40;
SQL> set head off
SQL> spool /home/xyz/cmrequests.csv
SQL> SELECT
2 a.USER_CONCURRENT_QUEUE_NAME || ','
3 || a.MAX_PROCESSES || ','
4 || sum(decode(b.PHASE_CODE,'P',decode(b.STATUS_CODE,'Q',1,0),0)) Pending_Standby ||','
5 ||sum(decode(b.PHASE_CODE,'P',decode(b.STATUS_CODE,'I',1,0),0)) Pending_Normal ||','
6 ||sum(decode(b.PHASE_CODE,'R',decode(b.STATUS_CODE,'R',1,0),0)) Running_Normal
7 from FND_CONCURRENT_QUEUES_VL a, FND_CONCURRENT_WORKER_REQUESTS b
where a.concurrent_queue_id = b.concurrent_queue_id AND b.Requested_Start_Date <= SYSDATE
8 9 GROUP BY a.USER_CONCURRENT_QUEUE_NAME,a.MAX_PROCESSES;
|| sum(decode(b.PHASE_CODE,'P',decode(b.STATUS_CODE,'Q',1,0),0)) Pending_Standby ||','
>
Well if you want to spool query results to a file the first thing you need to do is write a query that actually works.
Why do you think a query like this is valid?
SELECT 'this, is, my, giant, string, of, columns, with, commas, in, between, each, word'
GROUP BY this, is, my, giant, string
You only have one column in the result set, but you are trying to group by several columns, none of which are even in the result set.
What's up with that?
You can only group by columns that are actually IN the result set. -
DTP Error: Duplicate data record detected
Hi experts,
I have a problem with loading data from DataSource to standart DSO.
In DS there are master data attr. which have a key containing id_field.
In End routine I make some operations which multiple lines in result package and fill new date field - defined in DSO ( and also in result_package definition )
I.E.
Result_package before End routine:
__ Id_field ____ attra1 ____ attr_b ...___ attr_x ____ date_field
____1________ a1______ b1_________ x1
____2________ a2______ b2_________ x2
Result_package after End routine:
__ Id_field ____ attra1 ____ attr_b ..___ attr_x ____ date_field
____1________ a1______ b1_________ x1______d1
____2________ a1______ b1_________ x1______d2
____3________ a2______ b2_________ x2______d1
____4________ a2______ b2_________ x2______d2
The date_field (date type) is in a key fields in DSO
When I execute DTP I have an error in section Update to DataStore Object: "Duplicate data record detected "
"During loading, there was a key violation. You tried to save more than one data record with the same semantic key."
As far as I know, the result_package key contains all fields except those of type i, p, and f.
In simulation mode (debugging) everything is correct and the status is green.
In the DSO I have unchecked the checkbox "Unique Data Records".
Any ideas?
Thanks in advance.
MG
Hi,
In the end routine, try giving
DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE COMPARING XXX YYY.
Here XXX and YYY are the key fields, so that you can eliminate the extra duplicate records.
Or you can even try giving
SORT itab_XXX BY field1 field2 field3 ASCENDING.
DELETE ADJACENT DUPLICATES FROM itab_XXX COMPARING field1 field2 field3.
This can be placed before you loop over your internal table (in case you are using internal tables and loops); itab_xxx is the internal table.
field1, field2, and field3 may vary depending on your requirement.
By using the above lines, you can get rid of duplicates coming through the end routine.
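For readers outside ABAP, the net effect of SORT plus DELETE ADJACENT DUPLICATES COMPARING can be sketched like this (the key fields below are hypothetical):

```python
from operator import itemgetter

def delete_adjacent_duplicates(rows, keys):
    """Sort by the key fields, then keep only the first row of each key group,
    the same net effect as ABAP's SORT + DELETE ADJACENT DUPLICATES COMPARING."""
    key_of = itemgetter(*keys)
    rows = sorted(rows, key=key_of)
    result = []
    for row in rows:
        if not result or key_of(result[-1]) != key_of(row):
            result.append(row)
    return result

pkg = [{"id": 1, "date": "d1"}, {"id": 1, "date": "d1"}, {"id": 2, "date": "d2"}]
print(delete_adjacent_duplicates(pkg, ["id", "date"]))
```

Only rows that agree on every compared key field are collapsed; rows differing in the date field (as in the result package above) survive as distinct records.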
Regards
Sunil
Edited by: Sunny84 on Aug 7, 2009 1:13 PM -
Hello,
We have a query and have defined a report for it.
But then when going to
Tools > Queries > Query Print Layout and clicking the Print Preview button, this message appears:
"Internal error occurred (-101) Message 131-183."
The version is 2005 PL09.
The strangest thing is that the reports had been working fine until a few days ago, so it is really strange...
Have you dealt with this before?
Thank you in advance!
Hi Vanessa,
I had the same problem, but after three or four days it started working fine again.
Thanks,
Suresh Yerra -
Hi,
I was using OBIEE 10.1.3.3.3. After upgrading to 10.1.3.4.2, the queries are not running as expected (long-running queries only), so all the campaigns with long-running queries are cancelled, and the error message is seen in the XML log in Marketing Jobs Management.
We upgraded OBIEE from 10.1.3.3.3 to 10.1.3.4.2 yesterday, after which the connectivity was checked with SQL*Plus and worked fine. But when the campaign was loaded, Analytics received the request for the segment query and, after a long run, the task was cancelled with the error message "The query was cancelled", as shown below in the job stats:
<jobStats>
<jobID>2</jobID>
<jobType>WriteListFiles</jobType>
<jobUser>userid</jobUser>
<jobState>Error</jobState>
<jobTotalMilliSec>1h 41m 21s 160ms</jobTotalMilliSec>
<jobStartedTime>2012-03-22T08:06:13Z</jobStartedTime>
<jobFinishedTime>2012-03-22T09:47:34Z</jobFinishedTime>
<jobIsCancelling>N</jobIsCancelling>
- <exception>
<message>Job request of type "WriteListFiles" failed.</message>
- <exception>
<message>Error executing the list generation SQL.</message>
- <exception>
<message>Error in executing cursor for WorkNode (Id:0)</message>
- <exception>
<message>The query was cancelled.</message>
</exception>
</exception>
</exception>
</exception>
</jobStats>
Please let us know the suggestions
So far, the UnaccessedRunningTimeoutMinutes tag has been set to 50 under the ODBC tag in instanceconfig.xml, and DefaultTimeoutMinutes has been set to 150 in the same instanceconfig.xml.
Along with that, the RPD changes were done: for all users apart from Administrator, the timeout was changed from 10 minutes to 150 minutes. But it still isn't working.
The segments were fired with the Administrator login, so if the timeout for the admin login can be set, I assume the problem might be solved. Please let us know your suggestions here too.
If that is the case: when we open the RPD, for the Administrator user and group only, the Permissions button is disabled. So I want to know how to change the settings for the Admin user in the RPD.
Please advise us on the same.
Regards,
Madasamy M.
Edited by: Madasamy Murugaboobathi on Mar 26, 2012 1:40 PM
The issue has been resolved by setting the following parameters:
<UnaccessedRunningTimeoutMinutes>120</UnaccessedRunningTimeoutMinutes>
<DefaultTimeoutMinutes>120</DefaultTimeoutMinutes>
<ClientSessionExpireMinutes>120</ClientSessionExpireMinutes>
<ConnectionExpireMinutes>120</ConnectionExpireMinutes>
<UIDefaultTimeoutMinutes>1440</UIDefaultTimeoutMinutes>