Table /BI0/I... has many records
What to make of the sizes of the Hierarchy tables?
/BI0/H... has about 500 records
/BI0/K... has about 7 records
/BI0/I... has about 1,400,000 records
Why does the /BI0/I... table have so many records?
Reviewing the hierarchy, there are many nodes that are not assigned to 0HIER_NODE. Is this why there are so many records in table /BI0/I..., because of the unassigned nodes?
The /BI0/P... has about 1,500,000 records.
Should this be alarming?
Hi Sam,
It depends on the intervals of the hierarchy: all the value ranges are stored in the /BI0/J... table. If you are missing any nodes for the hierarchies, reload them into BW and compare the SETHEADER, SETNODE and SETLEAF tables (passing the hierarchy name for setclass) on the R/3 side with the corresponding tables, such as the H and J tables, in BW.
Regards,
Venkata Boga
Similar Messages
-
How can I analyze and solve: Table /BI0/XCOMP_CODE has not yet been analyzed
Hi All,
I made a cube, and among its characteristics is 0COMP_CODE, but the statistics appear in red and the performance of the query is really bad. In transaction RSRV I tried to solve it, in:
All Elementary Tests - Database -Database Statistics for an InfoCube and Its Aggregates
And this message appeared. I checked 0COMP_CODE in "Check Master Data for a Characteristic" and it doesn't report anything wrong, so now I don't know how to solve this:
Message:
ORACLE: Table /BI0/XCOMP_CODE has not yet been analyzed
Message no. RSCV520
Diagnosis
Table /BI0/XCOMP_CODE has not yet been analyzed. There is, therefore, no statistical information in the database optimizer. This can lead to bad run schedules and, as a result, poor performance levels.
Procedure
Analyze table /BI0/XCOMP_CODE. Use a repair routine that is assigned to this analysis routine (function Remove error in transaction RSRV).
Help...About "run your query using display run schedule. There your IOBJ field will come in yellow - click on the same and select analyze from the window that pops up..."
Hi Arun,
I did it. I clicked Analyze, and it took me to another window; the InfoObject did not appear, I suppose because 0COMP_CODE is part of the cube, and the query I execute in RSRT with "display run schedule" doesn't use this InfoObject. Anyway, on the next screen, when I clicked Analyze, it ran for table /BI0/XCOMP_CODE and said it was successful. I went back to RSRV to check if the problem was solved, but it is still red. I don't know if there is something more to do in the window "Analysis of oracle tables for creating statistics".
I don't know what else to do. Can you tell me if I'm doing something wrong? -
I am designing a table, and I load its data from different tables via joins. I have a Status column with about 16 different statuses coming from different tables; for each case there is a condition, and if it is satisfied that particular status is shown in the Status column, so I need to write the query as 16 different cases.
Now, my question is: what is the best way to write these cases so that all the conditions are satisfied and the data still loads quickly? The data comes mostly from big tables of about 7 million records, and if the logic is written as a CASE it will scan the table for each case, about 16 times. How can I make this faster? Can anyone help me out?
Here is the code I have written to get the data from temp tables, which take records from a 7-million-row table filtered to year 2013. It is taking more than an hour to run. I am posting the part of the code that is running slow, mainly the Status column.
SELECT
z.SYSTEMNAME
--,Case when ZXC.[Subsystem Name] <> 'NULL' Then zxc.[SubSystem Name]
--else NULL
--End AS SubSystemName
, CASE
WHEN z.TAX_ID IN
(SELECT DISTINCT zxc.TIN
FROM .dbo.SQS_Provider_Tracking zxc
WHERE zxc.[SubSystem Name] <> 'NULL')
THEN
(SELECT DISTINCT [Subsystem Name]
FROM .dbo.SQS_Provider_Tracking zxc
WHERE z.TAX_ID = zxc.TIN)
End As SubSYSTEMNAME
,z.PROVIDERNAME
,z.STATECODE
,z.TAX_ID
,z.SRC_PAR_CD
,SUM(z.SEQUEST_AMT) Actual_Sequestered_Amt
, CASE
WHEN z.SRC_PAR_CD IN ('E','O','S','W')
THEN 'Nonpar Waiver'
-- --Is Puerto Rico of Lifesynch
WHEN z.TAX_ID IN
(SELECT DISTINCT a.TAX_ID
FROM .dbo.SQS_NonPar_PR_LS_TINs a
WHERE a.Bucket <> 'Nonpar')
THEN
(SELECT DISTINCT a.Bucket
FROM .dbo.SQS_NonPar_PR_LS_TINs a
WHERE a.TAX_ID = z.TAX_ID)
--**Amendment Mailed**
WHEN z.TAX_ID IN
(SELECT DISTINCT b.PROV_TIN
FROM .dbo.SQS_Mailed_TINs_010614 b WITH (NOLOCK )
where not exists (select * from dbo.sqs_objector_TINs t where b.PROV_TIN = t.prov_tin))
and z.Hosp_Ind = 'P'
THEN
(SELECT DISTINCT b.Mailing
FROM .dbo.SQS_Mailed_TINs_010614 b
WHERE z.TAX_ID = b.PROV_TIN)
-- --**Amendment Mailed Wave 3-5**
WHEN z.TAX_ID In
(SELECT DISTINCT
qz.PROV_TIN
FROM
[SQS_Mailed_TINs] qz
where qz.Mailing = 'Amendment Mailed (3rd Wave)'
and not exists (select * from dbo.sqs_objector_TINs t where qz.PROV_TIN = t.prov_tin))
and z.Hosp_Ind = 'P'
THEN 'Amendment Mailed (3rd Wave)'
WHEN z.TAX_ID IN
(SELECT DISTINCT
qz.PROV_TIN
FROM
[SQS_Mailed_TINs] qz
where qz.Mailing = 'Amendment Mailed (4th Wave)'
and not exists (select * from dbo.sqs_objector_TINs t where qz.PROV_TIN = t.prov_tin))
and z.Hosp_Ind = 'P'
THEN 'Amendment Mailed (4th Wave)'
WHEN z.TAX_ID IN
(SELECT DISTINCT
qz.PROV_TIN
FROM
[SQS_Mailed_TINs] qz
where qz.Mailing = 'Amendment Mailed (5th Wave)'
and not exists (select * from dbo.sqs_objector_TINs t where qz.PROV_TIN = t.prov_tin))
and z.Hosp_Ind = 'P'
THEN 'Amendment Mailed (5th Wave)'
-- --**Top Objecting Systems**
WHEN z.SYSTEMNAME IN
('ADVENTIST HEALTH SYSTEM','ASCENSION HEALTH ALLIANCE','AULTMAN HEALTH FOUNDATION','BANNER HEALTH SYSTEM')
THEN 'Top Objecting Systems'
WHEN z.TAX_ID IN
(SELECT DISTINCT
h.TAX_ID
FROM
#HIHO_Records h
INNER JOIN .dbo.SQS_Provider_Tracking obj
ON h.TAX_ID = obj.TIN
AND obj.[Objector?] = 'Top Objector'
WHERE z.TAX_ID = h.TAX_ID
OR h.SMG_ID IS NOT NULL
)and z.Hosp_Ind = 'H'
THEN 'Top Objecting Systems'
-- --**Other Objecting Hospitals**
WHEN (z.TAX_ID IN
(SELECT DISTINCT
h.TAX_ID
FROM
#HIHO_Records h
INNER JOIN .dbo.SQS_Provider_Tracking obj
ON h.TAX_ID = obj.TIN
AND obj.[Objector?] = 'Objector'
WHERE z.TAX_ID = h.TAX_ID
OR h.SMG_ID IS NOT NULL
)and z.Hosp_Ind = 'H')
THEN 'Other Objecting Hospitals'
-- --**Objecting Physicians**
WHEN (z.TAX_ID IN
(SELECT DISTINCT
obj.TIN
FROM .dbo.SQS_Provider_Tracking obj
WHERE obj.[Objector?] in ('Objector','Top Objector')
and z.TAX_ID = obj.TIN
and z.Hosp_Ind = 'P'))
THEN 'Objecting Physicians'
--****Rejecting Hospitals****
WHEN (z.TAX_ID IN
(SELECT DISTINCT
h.TAX_ID
FROM
#HIHO_Records h
INNER JOIN .dbo.SQS_Provider_Tracking obj
ON h.TAX_ID = obj.TIN
AND obj.[Objector?] = 'Rejector'
WHERE z.TAX_ID = h.TAX_ID
OR h.SMG_ID IS NOT NULL
)and z.Hosp_Ind = 'H')
THEN 'Rejecting Hospitals'
--****Rejecting Physicians****
WHEN
(z.TAX_ID IN
(SELECT DISTINCT
obj.TIN
FROM .dbo.SQS_Provider_Tracking obj
WHERE z.TAX_ID = obj.TIN
AND obj.[Objector?] = 'Rejector')
and z.Hosp_Ind = 'P')
THEN 'Rejecting Physicians'
----**********ALL OBJECTORS SHOULD HAVE BEEN BUCKETED AT THIS POINT IN THE QUERY**********
-- --**Non-Objecting Hospitals**
WHEN z.TAX_ID IN
(SELECT DISTINCT
h.TAX_ID
FROM
#HIHO_Records h
WHERE
(z.TAX_ID = h.TAX_ID)
OR h.SMG_ID IS NOT NULL)
and z.Hosp_Ind = 'H'
THEN 'Non-Objecting Hospitals'
-- **Outstanding Contracts for Review**
WHEN z.TAX_ID IN
(SELECT DISTINCT
qz.PROV_TIN
FROM
[SQS_Mailed_TINs] qz
where qz.Mailing = 'Non-Objecting Bilateral Physicians'
AND z.TAX_ID = qz.PROV_TIN)
Then 'Non-Objecting Bilateral Physicians'
When z.TAX_ID in
(select distinct
p.TAX_ID
from dbo.SQS_CoC_Potential_Mail_List p
where p.amendmentrights <> 'Unilateral'
AND z.TAX_ID = p.TAX_ID)
THEN 'Non-Objecting Bilateral Physicians'
WHEN z.TAX_ID IN
(SELECT DISTINCT
qz.PROV_TIN
FROM
[SQS_Mailed_TINs] qz
where qz.Mailing = 'More Research Needed'
AND qz.PROV_TIN = z.TAX_ID)
THEN 'More Research Needed'
WHEN z.TAX_ID IN (SELECT DISTINCT qz.PROV_TIN FROM [SQS_Mailed_TINs] qz where qz.Mailing = 'Objector' AND qz.PROV_TIN = z.TAX_ID)
THEN 'ERROR'
else 'Market Review/Preparing to Mail'
END AS [STATUS Column]
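A pattern often suggested for this kind of bucketing, instead of running a correlated IN (SELECT ...) per CASE branch, is to LEFT JOIN each lookup table once and test the joined columns, so the big table is scanned a single time. A minimal sketch under illustrative table and column names (not the real schema above), shown here with SQLite for brevity; the same shape carries over to T-SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Illustrative stand-ins for the big claims table and two lookup tables.
cur.executescript("""
CREATE TABLE claims (tax_id TEXT, src_par_cd TEXT, hosp_ind TEXT);
CREATE TABLE provider_tracking (tin TEXT, subsystem_name TEXT);
CREATE TABLE mailed_tins (prov_tin TEXT, mailing TEXT);

INSERT INTO claims VALUES ('111', 'E', 'P'), ('222', 'X', 'P'), ('333', 'X', 'H');
INSERT INTO provider_tracking VALUES ('333', 'SysA');
INSERT INTO mailed_tins VALUES ('222', 'Amendment Mailed (3rd Wave)');
""")

# One pass over claims: each lookup is joined once, and the CASE tests
# the joined columns instead of re-running an IN (SELECT ...) per branch.
rows = cur.execute("""
SELECT c.tax_id,
       CASE
         WHEN c.src_par_cd IN ('E','O','S','W')          THEN 'Nonpar Waiver'
         WHEN m.mailing IS NOT NULL AND c.hosp_ind = 'P' THEN m.mailing
         WHEN p.tin IS NOT NULL                          THEN 'Tracked'
         ELSE 'Market Review/Preparing to Mail'
       END AS status
FROM claims c
LEFT JOIN mailed_tins       m ON m.prov_tin = c.tax_id
LEFT JOIN provider_tracking p ON p.tin      = c.tax_id
ORDER BY c.tax_id
""").fetchall()

print(rows)
```

The first matching WHEN still wins, exactly as in the original CASE, but each lookup is now touched once via a join rather than once per branch. If a lookup table can hold several rows per TIN, join a de-duplicated subquery instead, so rows are not multiplied.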
Please suggest on this -
Passing values to subreport that has many records?
I have a report with a subreport in it and that subreport is a bill that includes things like name, address, etc. My report will run the subreport many times for each bill that comes up.
I need to pass values to text objects in these individual subreports; the information is obtained in the code. My code works fine when the subreport is run on its own, but I'm having trouble figuring out how to change the code to put the information into the individual subreports. Here is a simple example of some of my code:
using (PrintBills reportPrintBills = new PrintBills())
using (CrystalReportBill reportBill = new CrystalReportBill())
{
    CrystalDecisions.CrystalReports.Engine.TextObject lblBalanceDue = ((CrystalDecisions.CrystalReports.Engine.TextObject)reportBill.Summary.ReportObjects["lblBalanceDue"]);
    CrystalDecisions.CrystalReports.Engine.TextObject lblName = ((CrystalDecisions.CrystalReports.Engine.TextObject)reportBill.Summary.ReportObjects["lblName"]);
    DataSet ds = new DataSet();
    GetBills(ref ds, "Bills", date);
    DataView dvID = new DataView(ds.Tables["Bills"]);
    num = 0;
    foreach (DataRowView rowID in dvID)
    {
        GetBioInfo(ref ds, "BioData", id);
        DataRow row = ds.Tables["BioData"].Rows[num];
        lblName.Text = row["Stu_Name"].ToString();
        lblBalanceDue.Text = String.Format("{0:C}", row["Amount_Due"]);
        num++;
    }
    reportPrintBills.SetParameterValue("@tblMonth", date);
    reportPrintBills.SetParameterValue("@begdate", strBegdate);
    reportPrintBills.SetParameterValue("@enddate", strEnddate);
    //Export to PDF code here
}
Obviously, reportBill does not display the values. Somehow I need to establish in code that reportBill is a subreport of reportPrintBills, so that each record will recognize the TextObject values that I am sending to it.
Another solution I've tried is adding parameter fields to reportBill and reportPrintBills and linking them up and then passing values like so:
foreach (DataRowView rowCWID in dvCWID)
{
    GetBioInfo(ref ds, "BioData", id);
    DataRow row = ds.Tables["BioData"].Rows[num];
    name = row["Stu_Name"].ToString();
    amtDue = String.Format("{0:C}", row["Amount_Due"]);
    reportPrintBills.SetParameterValue("@strName", name);
    reportPrintBills.SetParameterValue("@strAmtDue", amtDue);
    num++;
}
But the value stays the same in all of the subreports rather than changing with each pass through.
Open the report up in the Designer and click on Edit, Subreport Links. Likely what you can do is use shared variables to pass values from the main report to the subreport.
You need to do this in the report first. If you are using RAS, then you can do it at runtime. If RAS is not available to you, then there is no way to do it in code.
See these samples:
Root Page
http://wiki.sdn.sap.com/wiki/display/BOBJ/BusinessIntelligence%28BusinessObjects%29+Home
Enterprise Samples (including managed and unmanaged ras)
http://wiki.sdn.sap.com/wiki/display/BOBJ/BusinessObjectsSDKSampleApplications
Non-Enterprise Samples
http://wiki.sdn.sap.com/wiki/display/BOBJ/CrystalReportsSDKSampleApplications
Exporting Samples (RAS)
http://wiki.sdn.sap.com/wiki/display/BOBJ/NETRASSDK+Samples#NETRASSDKSamples-Exporting%2FPrinting
Also refer to the SDK help files for the Engine or RAS and search on the SubreportController.
If you are using RAS I'll move your post to the SDK forum.
Thank you
Don -
One to Many table join -- concat field per record grouped by id
Post Author: wm5
CA Forum: Formula
Hello,
I am using Crystal Reports XI and have two tables with a one-to-many relationship, joined by a JobID (number).
Below is a sample with relative fields for each table.
job_table: JobID (number), Manager (text), Status (text)
jobaudit_table : JobAuditID (number), JobID (Number), FormID (Number)
There is a one to many relationship with jobaudit_table having multiple records for each JobID in job_table.
I have created a Group Header using the job_table.JobID and suppressed the detail section.
In the group header for each JobID I display the JobID, Manager, Status. I also use the below formula to determine if any records in the jobaudit_table has a record where FormID = 90. If so, I display "Yes". If not, "No".
So my report currently looks like.
JobID Manager Status Audit Performed
1 Manager 1 Closed N
2 Manager 2 Closed Y
Here are the formulas I use to determine whether any record in jobaudit_table has FormID = 90.
@ja90exists
if {jobaudit_table.FormID} = 90 then 1 else 0;
@audit performed
if sum({@ja90exists},{job_table.JobID}) = 0 then "No" else "Yes";
Everything so far works fine. What I would like to do now is add a hyperlink to a script to view the job audit when in the above report the "Audit Performed" column is "Yes"
So Report is now:
JobID Manager Status Audit Performed
1 Manager 1 Closed N
2 Manager 2 Closed Y (hyperlink to view audit)
I cannot figure out how to gather the valid JobAuditIDs where FormID = 90 grouped by JobID to be used in the Group Heading section of the report.
Also, it is unlikely, but possible that more than one job_audit record exists with FormID = 90 per JobID. So, my hyperlink could look like http://mysite.com/viewjobaudit.aspx?jobid=[jobaudit_table.JobAuditID],[jobaudit_table.JobAuditID] .
Thanks for any help. And if this post is not clear let me know and I will clarify.
wm5
Post Author: bettername
CA Forum: Formula
Although I can't think of a way to get multiple hyperlinks, this should be a start. It should hyperlink to the last job/audit in the group that has a FormID of 90. Oh, I assumed that the hyperlink should have been xxxx...jobID,jobAuditID!
I think there may be a way of getting hyperlinks to every "90" record, but that will involve a subreport, so let's try this first...
1 - everything from your group header to the group footer...
2 - add a formula into the group header that says:
whileprintingrecords;
stringvar jobauditID := "";
stringvar jobID := "";
3 - Then add a formula to the details section:
whileprintingrecords;
stringvar jobauditID;
stringvar jobID;
if {jobaudit_table.FormID} = 90
then (jobID := totext({job_table.job_id},0,""); jobauditID := totext({jobaudit_table.jobaudit_id},0,""));
4 - Finally, on your "Audit Performed" formula, have a conditional hyperlink that says:
whileprintingrecords;
stringvar jobauditID;
stringvar jobID;
if {@audit performed} = "Y" then "http://mysite.com/viewjobaudit.aspx?jobid=" + jobID + "," + jobauditID -
How to delete parent table data even though it has child records
hi all,
How to delete parent table data even though it has child records.
ex: delete from pa_request cascade constraints;
But this command is not working .
Regards,
P Prakash

833560 wrote:
ex: delete from pa_request cascade constraints;
CASCADE CONSTRAINTS is a DROP TABLE option. It can't be used with DELETE. You need to delete the child rows first, or drop the foreign keys and recreate them with ON DELETE CASCADE. Then:
delete from pa_request will automatically delete child rows. However, personally I don't like ON DELETE CASCADE. You can, by mistake, delete half of your database without even realizing it.
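The ON DELETE CASCADE behaviour described above can be demonstrated in a few lines (sketched with SQLite, where foreign keys must first be enabled via a PRAGMA; the pa_request_line child table is an illustrative stand-in for the real child tables):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled

conn.executescript("""
CREATE TABLE pa_request (request_id INTEGER PRIMARY KEY);
CREATE TABLE pa_request_line (
    line_id    INTEGER PRIMARY KEY,
    request_id INTEGER REFERENCES pa_request(request_id) ON DELETE CASCADE
);
INSERT INTO pa_request VALUES (1), (2);
INSERT INTO pa_request_line VALUES (10, 1), (11, 1), (20, 2);
""")

# Deleting the parent removes its child rows automatically.
conn.execute("DELETE FROM pa_request WHERE request_id = 1")
remaining = conn.execute("SELECT line_id FROM pa_request_line").fetchall()
print(remaining)
```

Which is exactly the danger mentioned: one DELETE on the parent silently removed two child rows.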
SY. -
How to feed many records into a table component without using the append command again
hi master
How can I feed multiple records into a table component without using the append and save commands again and again,
same as the Oracle grid:
I enter many records and save one time.
Please give me an idea how I can add multiple records and save one time, without using the append and save commands again and again.
Thanks,
aamir
Hi!
The appendRow() method just adds a temporary row. To add rows to the DB, the commitChanges() method should be used. So you can call appendRow() several times, fill every new row, and only then call commitChanges(). In this case all the new rows will be added to the DB in one go.
Thanks,
Roman. -
How many records are fetched by my query into my internal tables
Hi
I am in the debugger and would like to know how many records are fetched by my query into my internal tables. How do I find out?
thanks
siva
Hi,
Do the following,
Step 1:
Fill your internal table with select query.
Step 2: Use the DESCRIBE statement with the LINES addition to get the number of rows in the internal table.
For further information, check:
http://help.sap.com/saphelp_nw70/helpdata/en/fc/eb3798358411d1829f0000e829fbfe/content.htm
Regards,
Anirban -
Function error: Too many records
I am writing a function that needs to return the total count of a SQL statement. It will divide two calculated columns to get an average. I have two versions. Version 1 compiled successfully, and I am trying to run it either in Reports or in the database. I get an error stating that the function returns too many rows. I understand that a stored function must return a single value, but how can I modify the code so it returns one value each time it is called?
Here is the main calculation. SUM(date1-date2) / (date1-date2) = Avg of Days
version1:
create or replace FUNCTION CALC_OVER_AGE
RETURN NUMBER IS
days_between NUMBER;
days_over NUMBER;
begin
select (determination_dt - Filed_dt), SUM(determination_dt - Filed_dt) into days_between, days_over
from w_all_cases_mv
where (determination_dt - Filed_dt) > 60
and ;
return (days_between/days_over);
END CALC_OVER_AGE;
version2:
CREATE OR REPLACE FUNCTION CALC_OVER_AGE (pCaseType VARCHAR2)
RETURN PLS_INTEGER IS
v_days_between W_ALL_CASES_MV.DAYS_BETWEEN%TYPE;
v_total NUMBER;
days_over NUMBER;
i PLS_INTEGER;
BEGIN
SELECT COUNT(*)
INTO i
FROM tab
WHERE case_type_cd = pCaseType
AND determination_dt - Filed_dt > 60;
IF i <> 0 THEN
select SUM(determination_dt-Filed_dt), days_between
into v_total, v_days_between
from tab
where determination_dt - Filed_dt > 60;
RETURN v_total/v_days_between;
ELSE
RETURN 0;
END IF;
EXCEPTION
WHEN OTHERS THEN
RETURN 0;
END CALC_OVER_AGE;
Table structure:
WB_CASE_NR NUMBER(10)
RESPONDENT_TYPE_CD VARCHAR2(10)
INV_LOCAL_CASE_NR VARCHAR2(14)
CASE_TYPE_CD VARCHAR2(10)
FILED_DT DATE
FINAL_DTRMNTN_DT DATE
REPORTING_NR VARCHAR2(7)
INVESTIGATOR_NAME VARCHAR2(22)
OSHA_CD VARCHAR2(5)
FEDERAL_STATE VARCHAR2(1)
RESPONDENT_NM VARCHAR2(100)
DAYS_BETWEEN NUMBER
LAST_NM VARCHAR2(20)
FIRST_NM VARCHAR2(20)
DETERMINATION_DT DATE
DETERMINATION_TYPE_CD VARCHAR2(2)
FINAL_IND_CD VARCHAR2(1)
DESCRIPTION VARCHAR2(400)
DETERMINATION_ID NUMBER(10)
ALLEGATION_CD VARCHAR2(1)
ALGDESCRIPTION VARCHAR2(50)
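As a side note on version 1 above: the error arises because the SELECT mixes a per-row expression with SUM(), and a SELECT INTO must fetch exactly one row. If the intended result is the average number of days, computing it entirely with aggregates returns a single row. A sketch of that shape, with illustrative dates and SQLite's julianday standing in for Oracle's date subtraction:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE w_all_cases_mv (filed_dt TEXT, determination_dt TEXT);
INSERT INTO w_all_cases_mv VALUES
  ('2023-01-01', '2023-04-01'),   -- 90 days
  ('2023-01-01', '2023-03-12'),   -- 70 days
  ('2023-01-01', '2023-01-20');   -- 19 days: excluded by the 60-day filter
""")

# One aggregate row: SUM of day spans divided by COUNT of qualifying cases,
# so the SELECT INTO equivalent would always receive exactly one row.
(avg_days,) = conn.execute("""
SELECT SUM(julianday(determination_dt) - julianday(filed_dt)) / COUNT(*)
FROM w_all_cases_mv
WHERE julianday(determination_dt) - julianday(filed_dt) > 60
""").fetchone()
print(avg_days)
```

With the sample rows this yields (90 + 70) / 2 = 80 days. In PL/SQL the same one-row aggregate query can be selected INTO a single NUMBER variable.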
The output is for Reports; I am trying to get the last calculation, which is the average days. The report is grouped on case types and has several bucket counts for each, like this:
Case Type Count All Completed Pending Over Age Avg Days
A 5 3 4 2 15
Z 10 7 6 3 30
4 8 3 5 4 22
To get the Avg Days, the Over Age calculation is used as well as the Days Between (Determination_Dt - Filed_Dt). That is the (date1-date2) example that I gave in my first post. So the calculation would be SUM(Days_Between) / Days_Between. -
Creating table to hold 10 million records
What should be the TABLESPACE,PCTUSED,PCTFREE,INITRANS, MAXTRANS, STORAGE details for creating a table to hold 10 million records.
TABLESPACE: A tablespace big enough to hold 10 million rows. You may decide to have a separate tablespace for a big table, you may not.
PCTUSED: Are these records likely to be deleted?
PCTFREE: Are these records likely to be updated?
INITRANS, MAXTRANS: How many concurrent users are likely to be working with these records?
STORAGE: Do you want to override the default storage values of the tablespace you are using?
In short, these questions can only be answered by somebody who understands your application, i.e. you. The required values of these parameters have little to do with the fact that the table has 10 million rows. You would need to answer the same questions for a table that held only ten thousand rows.
Cheers, APC -
BPEL - Provide Input to dbadapter proc that has plsql record as input param
Hi,
I have a BPEL process where I have defined a DB adapter to execute a DB package/procedure. The procedure takes a PL/SQL record as an input parameter (as detailed below),
ie Package has the following defined -
TYPE rec_in_params IS RECORD(
p_param_name VARCHAR2(30),
p_param_value VARCHAR2(30));
TYPE t_params IS TABLE OF rec_in_params
INDEX BY BINARY_INTEGER;
and the procedure has a IN variable of type t_params
so Procedure param_proc(p_params IN t_params).......
Question - In the BPEL process, how do i iteratively populate the input variable with the name/value pairs (in the assign activity) before invoking the db adapter ?
Thanks.
Use a while loop, and in that loop write your business logic. Actually, create a tempVariable of the input record type of the stored procedure. In the while loop, write the logic so that you assign the current record to the tempVariable and then append the tempVariable data to the invokeInputVariable; the record is appended in the last place if there are already records in the invokeInputVariable, or in the first place if there are none. Iterate the while loop based on how many records you want to assign, let's say 5. Within the loop, in order to access the correct record, index into it in the XPath and assign that record to the tempVariable. So in the first iteration the 1st record gets assigned, and then that record in the tempVariable gets appended to the invokeInputVariable. This continues for n iterations. After the while loop, invoke the stored procedure...
Hope this helps...
Thanks,
N -
How can I set limitations on how many records a report can return
I have a report on the web using Oracle Reports builder and I have the client enter in date parameters for the report that they want.
Well with date ranges for different clients a different number of records are returned. Because of time it can take the report to return I want to limit the number of records that report can return.
How can I go about doing that? I don't want to limit with date parameters, because dates won't really work for me. I need to limit how many records can be returned. If it exceeds 10,000 records, I want the client to refine the date range or schedule the report to run later (meaning we will run that report). So I would have two check boxes: if the count was over 10,000, do you want to refine your dates or schedule the job to run later?
Can anyone help me with this? How would I go about it?
To know if the report is going to return more than 10,000 records, you first have to run the query with a 'select count(1) from ... where ...' (with the same FROM and WHERE clauses as your normal query). Since this takes about the same time as running your report, I wonder if you really gain anything (although formatting may take some time too).
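The count-first check suggested above can be sketched like this (SQLite for brevity; the orders table, date column and 10,000 threshold are illustrative):

```python
import sqlite3

LIMIT = 10_000  # refuse to run the full report above this many rows

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_dt TEXT);
INSERT INTO orders VALUES ('2024-01-05'), ('2024-01-06'), ('2024-02-01');
""")

# Same FROM/WHERE clauses as the real report query, but only a count.
where = "WHERE order_dt BETWEEN ? AND ?"
params = ("2024-01-01", "2024-01-31")
(n,) = conn.execute(f"SELECT COUNT(1) FROM orders {where}", params).fetchone()

if n > LIMIT:
    decision = "ask user: refine date range or schedule for later"
else:
    decision = "run report now"
print(n, decision)
```

In Reports this branch would live in the After Parameter Form trigger, raising a message instead of setting a string.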
You may simplify the select count(1) query by omitting all the lookup tables that are only needed for formatting. That way your query may run a lot faster. You can put this in your after parameter form trigger. -
Reading azure table - The request has timed out
Hi,
I am developing a Windows Store application with offline synchronization using Azure Mobile Services. I have a sync table:
IMobileServiceSyncTable<PRODUCT> ProductTbl = App.MobileServiceClient.GetSyncTable<PRODUCT>();
When I call ProductTbl.PullAsync("ProductSync_Table", ProductTbl.Where(x => x.SKU == pItem.SKU)),
the PullAsync() method throws an exception: Request Timed Out, with status code 503: Service Unavailable.
The PRODUCT table has 7 million records in Azure, but this query returns only one record.
Under Mobile Service logs have the following exception:
Error
Exception=System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.Data.Entity.Core.EntityCommandExecutionException: An error occurred while executing the command definition. See the inner exception for details. ---> System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. ---> System.ComponentModel.Win32Exception: The wait operation timed out
--- End of inner exception stack trace ---
at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()
at System.Data.SqlClient.SqlDataReader.get_MetaData()
at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, SqlDataReader ds)
at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean asyncWrite)
at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)
at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)
at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)
at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)
at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)
at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)
at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)
at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)
at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)
at System.Data.Entity.Core.EntityClient.Internal.EntityCommandDefinition.ExecuteStoreCommands(EntityCommand entityCommand, CommandBehavior behavior)
--- End of inner exception stack trace ---
at System.Data.Entity.Core.EntityClient.Internal.EntityCommandDefinition.ExecuteStoreCommands(EntityCommand entityCommand, CommandBehavior behavior)
at System.Data.Entity.Core.Objects.Internal.ObjectQueryExecutionPlan.Execute[TResultType](ObjectContext context, ObjectParameterCollection parameterValues)
at System.Data.Entity.Core.Objects.ObjectQuery`1.<>c__DisplayClass7.<GetResults>b__6()
at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)
at System.Data.Entity.Core.Objects.ObjectQuery`1.<>c__DisplayClass7.<GetResults>b__5()
at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)
at System.Data.Entity.Core.Objects.ObjectQuery`1.GetResults(Nullable`1 forMergeOption)
at System.Data.Entity.Core.Objects.ObjectQuery`1.<System.Collections.Generic.IEnumerable<T>.GetEnumerator>b__0()
at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()
at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
at System.Web.Http.OData.Query.TruncatedCollection`1..ctor(IQueryable`1 source, Int32 pageSize)
at System.Web.Http.OData.Query.ODataQueryOptions.LimitResults[T](IQueryable`1 queryable, Int32 limit, Boolean& resultsLimited)
--- End of inner exception stack trace ---
at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)
at System.Reflection.RuntimeMethodInfo.UnsafeInvokeInternal(Object obj, Object[] parameters, Object[] arguments)
at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
at System.Web.Http.OData.Query.ODataQueryOptions.LimitResults(IQueryable queryable, Int32 limit, Boolean& resultsLimited)
at System.Web.Http.OData.Query.ODataQueryOptions.ApplyTo(IQueryable query, ODataQuerySettings querySettings)
at System.Web.Http.OData.EnableQueryAttribute.ApplyQuery(IQueryable queryable, ODataQueryOptions queryOptions)
at System.Web.Http.OData.EnableQueryAttribute.ExecuteQuery(Object response, HttpRequestMessage request, HttpActionDescriptor actionDescriptor)
at System.Web.Http.OData.EnableQueryAttribute.OnActionExecuted(HttpActionExecutedContext actionExecutedContext)
at System.Web.Http.Filters.ActionFilterAttribute.OnActionExecutedAsync(HttpActionExecutedContext actionExecutedContext, CancellationToken cancellationToken)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Controllers.ActionFilterResult.<ExecuteAsync>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.AuthorizationFilterAttribute.<ExecuteAuthorizationFilterAsyncCore>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Controllers.AuthenticationFilterResult.<ExecuteAsync>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Controllers.ExceptionFilterResult.<ExecuteAsync>d__0.MoveNext(), Id=39b9d12d-ce59-4f0f-af96-68d07ebd6eb4, Category='App.Filters'
Can anyone please tell me how to get out of this issue?
If this is timing out on you:
var items = ProductTbl.Where(x => x.SKU == pItem.SKU).ToCollectionAsync();
Then you most likely need to add an index on your SKU column, or use another filter so the query can avoid scanning all 7 million records.
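A minimal sketch of what such an index might look like, assuming a SQL Server backing store with a hypothetical dbo.Products table and SKU column (adjust the names to your actual schema):

```sql
-- Hypothetical table/column names; adjust to your schema.
-- A nonclustered index on SKU lets the WHERE SKU = ... filter seek
-- directly to matching rows instead of scanning all 7 million.
CREATE NONCLUSTERED INDEX IX_Products_SKU
    ON dbo.Products (SKU);
```

Whether a plain index like this is enough depends on how selective SKU is and what other columns the query returns; checking the actual execution plan in SQL Server is the safest way to confirm.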
Have you timed how long the raw query takes when run directly against your SQL database?
You will most likely need to set up some indexes. For offline sync, the service will also want to sort and filter by "__updatedAt" in addition to your filters, so keep that in mind when optimizing your SQL performance. -
ABAP - CUA : Initial load : too many records in the CUA?
We are running :
SP03 for IDENT. CENTER DESIGNTIME 7.1 Patch 0
SP03 for IDENTITY CENTER RUNTIME 7.1 Patch 1
SP03 for NW IDM IC UIS 7.00 Patch 1
We have connected our customer's CUA system to IdM: we created an Identity Store called 'SAP_Master', created the CUA repository, defined the required attributes as described in the guide 'IDM 7.1 - IdM For SAP Systems - Configuration 7-3.pdf', created the jobs based upon the templates, etc. The dispatcher used has 'run provisioning jobs' disabled.
On our sandbox server, when we connect to our sandbox CUA system (CUA_YS5_200), everything is ok, the 'AS ABAP - Initial load' job with only 'ReadABAPRoles' enabled, runs fine.
On our QA system, when we connect to our 'production' CUA system (CUA_YP0_400), the 'AS ABAP - Initial load' job with only 'ReadABAPRoles' enabled finished with the message 'could not start job, rescheduling'. Since there is a huge number of records (we looked it up in the system: 311.679 records), we decided to switch on the 'bootstrap job' parameter. Now the result is that it takes forever; after half a day the job is still running. In the database, table 'sapCUA_YP0_400role' is still completely empty (no records). Therefore, it seemed interesting to connect our QA IdM system to our development CUA system (CUA_YS5_200). After a while, the exact same job finished and table 'sapCUA_YS5_200role' contains 18.580 records.
After some additional testing, we discovered that the cause of the issue might be that the number of records in our CUA is too big.
In the Java code of the fromSAP pass there are 2 calls to the SAP system for reading the roles into IdM. The first one reads table USRSYSACT (311.000 records), the second one reads table USRSYSACTT (1.000.000 records). All these records are stored in a Java HashMap; we suspect that 1 million records exceeds what the HashMap can handle here, although no Java error is thrown.
When we debug the function module RFC_READ_TABLE and change the rowcount to 100.000, everything works fine. When we set the rowcount to 200.000, the Java code does not generate an error, but the job in IdM never ends...
When running function module RFC_READ_TABLE in the backend system, the 1.000.000 records are processed in less than one minute. So apparently the issue is related to the processing in the Java code.
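If the Java side is the bottleneck, one common workaround is to page through the table instead of pulling everything in one call. Below is a minimal sketch of that pattern, not the actual IdM job code: PagedRoleLoader and fetchPage are hypothetical names, and fetchPage stands in for an RFC_READ_TABLE call made with ROWSKIPS/ROWCOUNT set per page (here it just slices an in-memory list).

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: read a large table in fixed-size pages instead of one huge call.
// fetchPage() is a hypothetical stand-in for RFC_READ_TABLE invoked with
// ROWSKIPS/ROWCOUNT; in this sketch it only slices an in-memory list.
public class PagedRoleLoader {

    static List<String[]> fetchPage(List<String[]> table, int skip, int count) {
        int end = Math.min(skip + count, table.size());
        if (skip >= end) {
            return new ArrayList<>();   // past the last row: empty page
        }
        return new ArrayList<>(table.subList(skip, end));
    }

    static Map<String, String> loadAll(List<String[]> table, int pageSize) {
        Map<String, String> roles = new HashMap<>();
        int skip = 0;                   // plays the role of ROWSKIPS
        while (true) {
            List<String[]> page = fetchPage(table, skip, pageSize);
            if (page.isEmpty()) {
                break;                  // no more rows to read
            }
            for (String[] row : page) {
                roles.put(row[0], row[1]);  // role name -> description
            }
            skip += page.size();        // advance the offset for the next call
        }
        return roles;
    }
}
```

Paging like this keeps each individual RFC call below the row count that worked in the debugging test (100.000) while still accumulating all rows, so a single oversized call never has to materialize the whole result set at once.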
Java Dispatcher heap size is set to 1024.
Has anybody already come across this issue?
Thanks & best regards,
Kevin
Installing the patch, re-importing the SAP Provisioning Framework (I selected 'update') and recreating the jobs didn't yield any result.
When examining pass 'ReadABAPRoles' of Job 'AS ABAP - Initial Load' -> tab 'source', there are no scripts used.
After applying the patch we decided to verify the scripts (sap_getRoles, sap_getUserRepositories) in our Identity Center against those of 'Note 1398312 - SAP NW IdM Provisioning Framework for SAP Systems', and they are different.
The file sizes of SAP Provisioning Framework_Folder.mcc for SP3 Patch 0 and Patch 1 are also exactly the same.
Opening file SAP Provisioning Framework_Folder.mcc with WordPad and searching for 'sap_getRoles':
<GLOBALSCRIPT>
<SCRIPTREVISIONNUMBER/>
<SCRIPTLASTCHANGE>2009-05-07 08:00:23.54</SCRIPTLASTCHANGE>
<SCRIPTLANGUAGE>JScript</SCRIPTLANGUAGE>
<SCRIPTID>30</SCRIPTID>
<SCRIPTDEFINITION> ... string was too long to copy
paste ... </SCRIPTDEFINITION>
<SCRIPTLOCKDATE/>
<SCRIPTHASH>0940f540423630687449f52159cdb5d9</SCRIPTHASH>
<SCRIPTDESCRIPTION/>
<SCRIPTNAME>sap_getRoles</SCRIPTNAME>
<SCRIPTLOCKSTATE>0</SCRIPTLOCKSTATE>
-> Script last change 2009-05-07 08:00:23.54 -> that's not an update!
So I assume the updates mentioned in Note 1398312 aren't included in SP3 Patch 1. I manually replaced the current scripts with those from the note and re-tested: no luck, same issue.
Thanks again for the help,
Kevin -
Installed Mavericks on MacBook Pro. Now my Contacts has no records. Contacts on my iPad are OK. How can I repopulate my MacBook Contacts file from my iPad?
rbarge
This is only a partial answer, since the details will depend on how you want to do it.
Post by Zevoneer: iPod media recovery options - https://discussions.apple.com/message/11624224 - this is an older post and many of the links are also for old posts, so bear this in mind when reading them.
Commercial software utility for transferring songs from i-device to Mac - http://www.fadingred.com/senuti/
http://support.apple.com/kb/HT1848 - just media purchased from iTunes Store
You'll need to get it all onto one iTunes collection on your Mac Pro, then sync the devices to that. iTunes only lets you transfer purchases from a device, otherwise you have to use third party software.
I guess you could also look into getting iTunes Match.