Number of records being pulled from OLAP/SQL in BPC 5.1
Hello BPC gurus,
We are experiencing performance issues with EVDRE. Basically, the report errors out and the error log states "Decompressing request failed". We are on BPC 5.1.
We are trying to understand how many records the EVDRE is pulling from OLAP / the database, so that we can look into some fine-tuning opportunities for the EVDRE.
In the BI world we have RSRT, in which we can view the number of records read from the database and the number of records transferred. Is there any such feature in BPC where we can get information on record counts?
We have turned on the error logs, but none of them give us an idea of the record count.
Appreciate your help in advance.
Thanks
sai
Hi Sorin,
Thank you very much for getting back to me with clarification on the record count. As per your suggestion, we have already looked into this OSS note and changed the entries in the table. After making these changes, queries that normally execute in 1 minute now take 30 minutes to complete. I believe this was also the observation in some of the threads related to this issue.
You had mentioned that there might be an issue with the communication between the application server and the BPC client, or with the SQE generating the MDX query. Can you please give us some pointers on how to investigate this? We have turned on the error logs evdataserver_debug.txt and EVDATASERVER_TRACE.txt on the file server, but I believe OSS note 1311019 says these logs do not work with SP9.
If you can guide us on the following, that would be helpful:
1. How to debug the issue that we are currently facing.
2. How the concept of compressing/decompressing works in BPC.
Thanks
sai
Similar Messages
-
How do I create a list that shows in a dropdown box, with the list being pulled from another tab rather than the cell Data Format junk?
I currently run OS X 10.10.1
I have been trying to work on this for a while now, and what I want to do should be simple, but apparently it's not.
Here is an example of what i want to happen.
I will have 2 tabs: Contact | Sales
Now Contacts will have the list of names and various information about each customer, while Sales will have one dropdown box per cell row showing the names of the people in the Contacts tab.
For what I want to do, I can't use the Data Format pop-up menu, because the list is edited several times a day.
Now how do I do this? Excel can do this, so how can Numbers do it?
Hi Shegra,
Paste this into an AppleScript Editor window and run it from there. In the script you may need to adjust the four properties to agree with your spreadsheet. Let me know if you have any questions.
quinn
Script starts:
-- This script converts column A in one table into an alphabetized list of popups. It copies the last cell in that column. Then reverts the column to text. It then refreshes popups in column A of a data table starting with a user defined row.
property DataEntrySheet : "Sheet 1" --name of sheet with popups to be refreshed
property DataEntryTable : "Sales" --name of table with popups to be refreshed
set copyRange to {}
property PopValueSheet : "Sheet 1" --name of sheet with popup values table
property PopValueTable : "Contacts" --name of table with popup values
set PopStartRow to {}
tell application "Numbers"
set d to front document
set ps to d's sheet PopValueSheet
set pt to ps's table PopValueTable
set s to d's sheet DataEntrySheet
set t to s's table DataEntryTable
set tf to t's filtered --this records filter setting on data Entry Table
display dialog "Start from row #..." default answer "" with icon 1 -- with icon file "Path:to:my.icon.icns" --a Week # row
set PopStartRow to {text returned of result}
tell pt --convert list to alphabetized popups
set ptRows to count rows
set copyRange to ("A2:" & name of cell ptRows of column "A")
set selection range to range copyRange
set selection range's format to text
sort by column 1 direction ascending
set selection range's format to pop up menu
-- popupsmade
set selection range to cell ptRows of column 1 of pt
set v to value of cell ptRows of pt
end tell
activate application "Numbers"
tell application "System Events" to keystroke "c" using command down
tell pt
set selection range to range copyRange
set selection range's format to text
end tell
tell t
set filtered to false
set tRows to count rows
set pasteRange to ((name of cell PopStartRow of column "A") & ":" & (name of cell tRows of column "A"))
set selection range to range pasteRange
tell application "System Events" to keystroke "v" using command down
set filtered to tf
end tell
end tell -
Number of records in cube from Query Designer
I don't have access to the cube in BW (LISTSCHEMA). I only have access to pull reports from the cube in Query Designer. How can I tell the total number of records in the cube?
Thanks.
Hi,
You can use the technical content in the Query Designer to display the count of records, or you can do the same by creating a new CKF.
Or see the link below on how to display the count:
https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/009819ab-c96e-2910-bbb2-c85f7bdec04a -
Records not pulling from Pervasive database
We just recently upgraded to Crystal Reports 2008 from Crystal Reports 7. We have some reports that pull from a Pervasive database. After some looking I found that, in the reports we had already made, I needed to set up a Btrieve database connection (it is pulling from a Pervasive database) and update it, and all the reports loaded fine. I was able to update all the reports that use this database and they all seemed to be working great. Now, a couple of days later, when we run the report it acts like it is pulling the information but never displays it. Once we hit the refresh button, it pops up with the parameters box; we enter the values, and then at the bottom it acts like it is running through all the records. Except that instead of saying 1 of 1 and 2 of 2, ending at 10 of 10 like it always did before, it goes 0 of 1, 0 of 2, ending at 0 of 10.
I also go to the old version of Crystal Reports and it comes up with the same problem.
I tried redoing the connection, but that didn't help either.
Please help!
I have finally had a chance to go to the computer with Crystal Reports to try your SQL test. When I go to the Database menu, "Show SQL Query" is grayed out, so I am unable to select it. I tried this in both design and preview view, with no luck.
I was able to play around some more to try to pinpoint the problem, and here are a couple of things that stood out to me. Maybe they will help you determine my problem.
After running the report I went to Dependency checker and it came back with "Data Source: There was an unknown problem detected when checking table Vendor, on server M:\FB\AP\FILE.DDF. Please verify your report." and "Data Source: There was an unknown problem detected when checking table Invoices, on server M:\FB\AP\FILE.DDF. Please verify your report." I have verified my database under the database menu and that has always come back with database updated. I never did see a verify report button to do what the error message says.
The other thing is that I found which parameter is causing me problems. There are 4 parameters: status, month, day, year. In status we always put S, but the values in that field can be S, C, or X. If I change the status to C or X, it will pull and display data. So I initially thought that maybe there was no "S" satisfying the rest of the parameters. However, after looking through the reports in the program this database was made for, there are in fact "S"s that would satisfy the rest of the parameters and should be showing this data.
So my next question: why would the report run correctly for some values but not others? -
Optimal number of records to fetch from Forte Cursor
Hello everybody:
I'd like to ask a very important question.
I opened a Forte cursor with approx 1.2 million records, and now I am trying
to figure out the number of records per fetch needed to obtain
acceptable performance.
To my surprise, fetching 100 records at once gave me only about a 15 percent
performance gain in comparison
with fetching records one by one.
I haven't found a significant difference in performance fetching 100, 500 or
10,000 records at once. At the same time, fetching 20,000
records at once makes performance approx 20% worse (a fact I cannot
explain).
Does anybody have experience with improving fetch performance from a
Forte cursor with a large number of rows?
Thank you in advance
Genady Yoffe
Software Engineer
Descartes Systems Group Inc
Waterloo On
Canada
You can do it by writing code in the start routine of your transformations.
1. If you have any specific filtering criteria, go with that and delete the unwanted records.
2. If you want to load a specific number of records based on a count, then in the start routine of the transformation, loop through the source package records while keeping a counter until you reach your desired count, and copy those records into an internal table.
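The counter loop in point 2 might look like the sketch below. Python is used here only as a language-neutral illustration (in BW itself this would be ABAP in the start routine), and all names are invented:

```python
def keep_first_n(source_package, desired_count):
    """Loop through the source package with a counter and copy
    records into an internal table until the desired count is
    reached, as described in point 2 above."""
    internal_table = []
    counter = 0
    for record in source_package:
        if counter >= desired_count:
            break
        internal_table.append(record)
        counter += 1
    # Delete the records in the source package, then assign the
    # kept records back to it (modelled by in-place replacement).
    source_package[:] = internal_table
    return source_package

package = [{"id": i} for i in range(10)]
print(len(keep_first_n(package, 3)))  # prints 3
```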
Delete the records in the source package, then assign the records stored in the internal table back to the source package. -
Data being retrieved from PL/SQL is "???"
Hi,
I followed the sample code for getting data from PL/SQL using Java.
Please see below:
OracleCallableStatement stmt =(OracleCallableStatement)conn.prepareCall
( "begin ? := getEMpArray; end;" );
// The name we use below, EMPARRAY, has to match the name of the
// type defined in the PL/SQL Stored Function
stmt.registerOutParameter( 1, OracleTypes.ARRAY,"EMPARRAY" );
stmt.executeUpdate();
// Get the ARRAY object and print the meta data assosiated with it
ARRAY simpleArray = stmt.getARRAY(1);
System.out.println("Array is of type " + simpleArray.getSQLTypeName());
System.out.println("Array element is of type code "+simpleArray.getBaseType());
System.out.println("Array is of length " + simpleArray.length());
// Print the contents of the array
String[] values = (String[])simpleArray.getArray();
for( int i = 0; i < values.length; i++ )
System.out.println( "row " + i + " = '" + values[i] + "'" );
But when I view the data from Java, the data being fetched is "???". All the array data is like this: "???". :(
But when I debug the PL/SQL, the data is OK.
I don't know if this is binary numbers or what.
I don't know how to solve this and get the real data.
Please help.
By the way, I'm using Eclipse as the IDE.
Thanks.
Ok, here is the code.
if(nvl(values1,0) >= nvl(:cf_qty,0)) then
return (nvl(value1,0)-nvl(value2,0));
else
pck.p_count_first := pck.p_count_first + 1 ;
pck.p_FGI(pck.p_count_first):= nvl(value2,0) - nvl(value1,0);
end if;
Here for First counter value stored as 1,
Second value stored as 1.
When I am retrieving the values, like:
if (pck.p_FGI.count > 0) then
PCK.p_count_second := pck.p_count_second + 1 ;
v_total := pck.p_FGI(pck.p_count_second) VALUE3;
end if;
Here Value pck.p_FGI(pck.p_count_second) comes as 0 instead of 1.
Can you tell us why this shows the value 0? -
No updates being pulled from Microsoft
Software updates are not being downloaded anymore. The last updates were in September 2014. I went to Software Library and ran "sync", but nothing from October was pulled down. The wsyncmgr.log is all bad. What would cause this? Everything worked perfectly last month.
Too many consecutive failures. Aborting sync. SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Sync failures summary: SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 5998e48e-de95-433d-9386-7cffe5d13bdc. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update c704787d-976d-406d-828b-9f1aca4430de. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 89f208f1-9036-4049-b8f6-a28497437604. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 103f6cdd-9f0a-4ea1-965c-becfc95fac92. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 76d24af8-b032-483c-8f13-9517afad2eb4. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update ba4bbea6-72cb-47d6-b61f-c2221cf4d5e8. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 9d460da0-8983-4bb4-a839-df88e5777488. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 37f39f03-27ae-4240-899d-d90a7f147510. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 60fb007f-b9b1-487a-ad90-2809876b9940. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update fbc1b097-d33b-4295-a961-8d0104adf708. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 1278a789-b29b-482d-b721-f3bc3c9ede66. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 597fc574-c938-4c75-ba99-332705631b68. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update e0b4a717-3fc7-4e8b-9d38-e498d48a6a72. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 9452503c-05be-4a64-99ed-9a59f7d65398. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 2261358b-eedc-4825-8114-ef7011072e88. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 363855a3-39b4-4b6e-944e-da8ccea428d7. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update b8c6524f-941e-4c4c-9d80-4d19e05573fc. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update b3af9afe-4c5e-40c9-8e9b-00b130d31216. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 1d48aa50-3db5-4492-8bde-cb6cd912a2e6. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update 3331cb93-46b3-438c-a5f5-6eda6c76c50a. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Failed to sync update a024086f-17fd-4875-8d84-a5d9a6811ed4. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:51 PM 2152 (0x0868)
Sync failed: Failed to sync some of the updates. Source: Microsoft.SystemsManagementServer.SoftwareUpdatesManagement.WsusSyncAction.WSyncAction.SyncUpdates SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:52 PM 15944 (0x3E48)
STATMSG: ID=6703 SEV=E LEV=M SOURCE="SMS Server" COMP="SMS_WSUS_SYNC_MANAGER" SYS=MGH-SCCM12-01.acme.org SITE=MH1 PID=12184 TID=15944 GMTDATE=Fri Nov 07 20:39:52.468 2014 ISTR0="Microsoft.SystemsManagementServer.SoftwareUpdatesManagement.WsusSyncAction.WSyncAction.SyncUpdates" ISTR1="Failed to sync some of the updates" ISTR2="" ISTR3="" ISTR4="" ISTR5="" ISTR6="" ISTR7="" ISTR8="" ISTR9="" NUMATTRS=0 SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:52 PM 15944 (0x3E48)
Sync failed. Will retry in 60 minutes SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:52 PM 15944 (0x3E48)
Setting sync alert to active state on site MH1 SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:52 PM 15944 (0x3E48)
Sync time: 0d00h06m43s SMS_WSUS_SYNC_MANAGER 11/7/2014 12:39:52 PM 15944 (0x3E48)
mqh7
MQH7,
The EULAs are associated with Service Pack classifications, so I assume someone has added that classification to your sync since last month.
Things to check
- Your WSUSContent folder has the following rights on the share and in NTFS: Full Control for Network Service and WSUS Administrators.
- In IIS Manager on the SUP, go to the WSUS Administration site and click Content. Right-click and choose Manage Virtual Directory > Advanced Settings. Ensure the physical path is set to your WSUSContent location.
Cheers
Paul | sccmentor.wordpress.com -
Why are closed caption movies being pulled from iTunes?
As of today there is only one movie left in all of iTunes that is Closed Captioned. This is a real problem for those in the deaf community that have purchased iPods & iPhones, or even just those who use iTunes on Macs & PCs.
While Apple may not be fully liable, I am sure it is against the spirit of the ADA, and definitely not consumer friendly.
There is a small argument for blaming the movie studios for not providing CC content. Apple should be proactive in requesting movies be CC.
What is the reason for pulling existing movies that were CC and replacing them with non-CC versions?
If the CC movies are coming back, how does that work for those of us who purchased non-CC movies? I see 145 movies, not too many of which I want to see.
I love the Harry Potter movies, but I have a harder and harder time understanding them. My deafness is such that not only do I find it difficult to understand what's being spoken if there's a lot of ambient noise, but I also have a lot of difficulty with foreign accents (my being USian). My hearing is bad enough to qualify me for a TTY phone! The sound quality (even if the settings are the same in iTunes) is uneven across the movies, and all I want to do is watch the movie and understand what's being said!
If I can't get the closed captioning with the movies and TV shows on iTunes, I'll just have to purchase or rent from Blockbuster and the like.
It boggles my mind that iTunes has overlooked this issue. I guess you get what you pay for. -
BAPI returns less number of records when called from WebDynpro
Hi,
We have a BAPI which updates some tables and then bring back output in the form of a table.
When we execute the BAPI from R/3 we get all records. When we execute the BAPI using Web Dynpro, for the same input values, we always get 22 records. This count always remains the same.
When we put a breakpoint in the BAPI and tested it using Web Dynpro, we got a few more records. Wondering what the problem is?
Any help?
regards,
Shabeer
Hi,
Are you using the same user when running the BAPI from R/3 and from the portal?
We had a similar problem when the user from the portal didn't have the necessary authorizations.
Adi. -
Hello, we are trying to import some of our Incident records from BMC Remedy into Service Manager. Our data architect is running into some difficulty finding the information necessary to create the format file (we plan to use the CSV import process).
Is there a comprehensive document that details all of the properties available for Incidents, affected users, relationships, etc., in order to build the XML format file? For example, System.WorkItem.Incident is the main type for Incidents. Under that, there is an 'Affected User' class.
Is anyone aware of this type of documentation for Service Manager?
Thanks in advance,
Brett
Hi,
'Affected user' is a relationship; System.User is a class. Your data architect has to investigate the SCSM management packs to get the required info. Use this PowerShell command from the standard SCSM PowerShell console to get all management packs as pure XML ("c:\mp" is a path that you can change):
Get-SCSMManagementPack | Export-SCSMManagementPack -Path c:\mp
Cheers,
Marat
Site: www.scutils.com -
Vendor invoice - Profit Center not being pulled from cost-center
Hi Guys,
Please help me out with this:
On the vendor expense line item, I am using an account whose field status group has cost center optional, business area required, and profit center required. The cost center master has both the profit center and the business area populated.
But when I populate just the cost center on the expense line item, the profit center is not populated automatically, which it should be. Can you please guide me on what the problem could be? Is it in the field status group, or maybe in the linking of the cost center and profit center?
Thanks for your help.
-MS
Hi,
Are you talking about the vendor line item or the expense line item?
If you are talking about the expense line item, check whether the profit center is activated and check its validity period.
Check the profit center assignment in transaction KS03.
If you are talking about the vendor line item, what version of SAP are you using?
If you are using classic GL and PCA, run transaction 1KEK - Transferring Payables/Receivables.
Hope this helps you.
Thanks,
Rau -
RE: (forte-users) Optimal number of records to fetch from Forte Cursor
The reason why a single fetch of 20,000 records performs worse than
two fetches of 10,000 might be related to memory behaviour. Do you
keep the first 10,000 records in memory when you fetch the next
10,000? If not, then a single fetch of 20,000 records requires more
memory than two fetches of 10,000. You might have some extra overhead
from Forte requesting additional memory from the OS, garbage
collections just before every request for memory, and maybe even
the OS swapping some memory pages to disk.
This behaviour can be controlled by modifying the Minimum memory
and Maximum memory of the partition, as well as the memory chunk
size Forte uses to increment its memory.
Upon partition startup, Forte requests the Minimum memory from the
OS. Within this area, the actual memory being used grows until
it hits the ceiling of this space. This is when the garbage collector
kicks in and removes all unreferenced objects. If this does not suffice
to store the additional data, Forte requests one additional chunk of a
predefined size. Now the same behaviour is repeated in this slightly
larger piece of memory: actual memory keeps growing until it hits
the ceiling, upon which the garbage collector removes all unreferenced
objects. If the garbage collector reduces the amount of memory being
used to below the original Minimum memory, Forte will NOT return the
additional chunk of memory to the OS. If the garbage collector fails
to free enough memory to store the new data, Forte will request an
additional chunk of memory. This process is repeated until the Maximum
memory is reached. If the garbage collector fails to free enough memory
at this point, the process terminates gracelessly (which is what happens
sooner or later when you have a memory leak; something most Forte
developers have seen once or twice).
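The growth pattern just described (start at the partition Minimum, grow by fixed chunks, cap at the Maximum, never hand chunks back) can be sketched as a toy model. Python and every number below are invented purely for illustration; they are not actual Forte tuning values:

```python
def chunks_needed(live_bytes, minimum, chunk, maximum):
    """Return how much memory a partition would hold from the OS
    for a given amount of live data: the minimum allocation plus
    however many fixed-size chunks had to be requested, capped at
    the maximum (beyond which the process terminates)."""
    held = minimum
    while held < live_bytes:
        held += chunk
        if held > maximum:
            raise MemoryError("maximum partition memory exceeded")
    return held

# One fetch of 20,000 records (say 1 KB each) vs. a 10,000-record
# batch where the previous batch was released first:
one_shot = chunks_needed(20_000 * 1024, minimum=8 << 20, chunk=4 << 20, maximum=64 << 20)
two_step = chunks_needed(10_000 * 1024, minimum=8 << 20, chunk=4 << 20, maximum=64 << 20)
print(one_shot >> 20, two_step >> 20)  # prints 20 12 (MB held from the OS)
```

With these made-up figures, holding all 20,000 records at once costs three extra chunks, while a 10,000-record batch fits after a single extra chunk; the larger single fetch paying more for memory requests is consistent with the behaviour reported in the thread.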
Pascal Rottier
STP - MSS Support & Coordination Group
Philip Morris Europe
e-mail: [email protected]
Phone: +49 (0)89-72472530
+++++++++++++++++++++++++++++++++++
Origin IT-services
Desktop Business Solutions Rotterdam
e-mail: [email protected]
Phone: +31 (0)10-2428100
+++++++++++++++++++++++++++++++++++
/* All generalizations are false! */
For the archives, go to: http://lists.sageit.com/forte-users and use
the login: forte and the password: archive. To unsubscribe, send in a new
email the word: 'Unsubscribe' to: [email protected]
Hi Kieran,
According to your description, you want to figure out the optimal number of records per partition, right? As per my understanding, this number varies with your hardware: the better the hardware you have, the more records per partition.
An earlier version of the SQL Server 2005 Analysis Services Performance Guide stated this:
"In general, the number of records per partition should not exceed 20 million. In addition, the size of a partition should not exceed 250 MB."
Besides, the number of records is not the primary concern here. Rather, the main criterion is manageability and processing performance. Partitions can be processed in parallel, so the more there are, the more can be processed at once. However, the more partitions you have, the more things you have to manage. Here are some links describing partition optimization:
http://blogs.msdn.com/b/sqlcat/archive/2009/03/13/analysis-services-partition-size.aspx
http://www.informit.com/articles/article.aspx?p=1554201&seqNum=2
Regards,
Charlie Liao
TechNet Community Support -
RE: (forte-users) Optimal number of records to fetch from Forte Cursor
Guys,
The behavior (1 fetch of 20,000 vs. 2 fetches of 10,000 each) may also be DBMS
related. There is potentially high overhead in opening a cursor and initially
fetching the result table. I know this covers a great deal of DBMS technology
territory here, but one explanation is that the same physical pages may have to
be read twice when performing the query in 2 fetches as compared to doing it in
one shot. Physical I/O is perhaps the most expensive (vis-à-vis resources)
part of a query. Just a thought.
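The batch-size experiment this thread revolves around can be reproduced with any cursor API that supports batched fetches. As a sketch only, the example below uses Python's built-in sqlite3 module (not Forte) and an invented table, purely to show the fetchmany pattern whose batch size is being tuned:

```python
import sqlite3

def fetch_in_batches(cursor, batch_size):
    """Drain a cursor batch_size rows at a time. Each fetchmany call
    pays one round of fetch overhead, which is why tiny batches are
    slow, while huge batches hold more rows in memory at once."""
    total = 0
    while True:
        rows = cursor.fetchmany(batch_size)
        if not rows:
            break
        total += len(rows)  # a real program would process the batch here
    return total

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(1000)])
print(fetch_in_batches(conn.execute("SELECT n FROM t"), 100))  # prints 1000
```

Timing this loop with different batch_size values is the same measurement Genady describes, just against a local database instead of a Forte cursor.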
"Rottier, Pascal" <[email protected]> on 11/15/99 01:34:22 PM
To: "'Forte Users'" <[email protected]>
cc: (bcc: Charlie Shell/Bsg/MetLife/US)
Subject: RE: (forte-users) Optimal number of records to fetch from Forte Cursor
The reason why a single fetch of 20.000 records performs less then
2 fetches of 10.000 might be related to memory behaviour. Do you
keep the first 10.000 records in memory when you fetch the next
10.000? If not, then a single fetch of 20.000 records requires more
memory then 2 fetches of 10.000. You might have some extra over-
head of Forte requesting additional memory from the OS, garbage
collections just before every request for memory and maybe even
the OS swapping some memory pages to disk.
This behaviour can be controlled by modifying the Minimum memory
and Maximum memory of the partition, as well as the memory chunk
size Forte uses to increment its memory.
Upon partition startup, Forte requests the Minimum memory from the
OS. Whithin this area, the actual memory being used grows, until
it hits the ceiling of this space. This is when the garbage collector
kicks in and removes all unreferenced objects. If this does not suffice
to store the additional data, Forte requests 1 additional chunk of a
predefined size. Now, the same behaviour is repeated in this, slightly
larger piece of memory. Actual memory keeps growing until it hits
the ceiling, upon which the garbage collector removes all unrefer-
enced objects. If the garbage collector reduces the amount of
memory being used to below the original Miminum memory, Forte
will NOT return the additional chunk of memory to the OS. If the
garbage collector fails to free enough memory to store the new data,
Forte will request an additional chunk of memory. This process is
repeated untill the Maximum memory is reached. If the garbage
collector fails to free enough memory at this point, the process
terminates gracelessly (which is what happens sooner or later when
you have a memory leak; something most Forte developpers have
seen once or twice).
Pascal Rottier
STP - MSS Support & Coordination Group
Philip Morris Europe
e-mail: [email protected]
Phone: +49 (0)89-72472530
+++++++++++++++++++++++++++++++++++
Origin IT-services
Desktop Business Solutions Rotterdam
e-mail: [email protected]
Phone: +31 (0)10-2428100
+++++++++++++++++++++++++++++++++++
/* All generalizations are false! */
-----Original Message-----
From: [email protected] [SMTP:[email protected]]
Sent: Monday, November 15, 1999 6:53 PM
To: [email protected]
Subject: (forte-users) Optimal number of records to fetch from Forte
Cursor
Hello everybody:
I 'd like to ask a very important question.
I opened Forte cursor with approx 1.2 million records, and now I am trying
to figure out the number of records per fetch to obtain
the acceptable performance.
To my surprise, fetching 100 records at once gave me only about a 15 percent
performance gain in comparison
with fetching records one by one.
I haven't found a significant difference in performance fetching 100, 500,
or 10,000 records at once. At the same time, fetching 20,000
records at once makes performance approx 20% worse (a fact I cannot
explain).
Does anybody have any experience in how to improve performance when fetching
from a Forte cursor with a large number of rows?
Thank you in advance
Genady Yoffe
Software Engineer
Descartes Systems Group Inc
Waterloo On
Canada
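The batch-size tradeoff described above can be reproduced with any cursor-based API. A minimal sketch using Python's sqlite3 module (not Forte, and the timings will differ, but the fetchmany pattern is the same):

```python
import sqlite3

# Build a small in-memory table as a stand-in for the real data source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 ((i, "row%d" % i) for i in range(10000)))

def fetch_in_batches(batch_size):
    """Fetch every row, batch_size rows per fetch call."""
    cur = conn.execute("SELECT id, val FROM t")
    total = 0
    while True:
        rows = cur.fetchmany(batch_size)
        if not rows:
            break
        total += len(rows)
    return total

# Every batch size returns the same row count; only the number of fetch
# calls (and hence per-call round-trip overhead) changes, which is why
# gains flatten out quickly once batches are reasonably large.
assert fetch_in_batches(1) == 10000
assert fetch_in_batches(100) == 10000
assert fetch_in_batches(10000) == 10000
```

Timing these calls with increasing batch sizes typically shows the same diminishing-returns curve the poster observed: most of the win comes from the first jump away from row-at-a-time fetching.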
For the archives, go to: http://lists.sageit.com/forte-users and use
the login: forte and the password: archive. To unsubscribe, send in a new
email the word: 'Unsubscribe' to: [email protected]
Hi Kieran,
According to your description, you want to figure out the optimal number of records per partition, right? As per my understanding, this number varies with your hardware: the better the hardware, the more records you can put in each partition.
The earlier version of the performance guide for SQL Server 2005 Analysis Services Performance Guide stated this:
"In general, the number of records per partition should not exceed 20 million. In addition, the size of a partition should not exceed 250 MB."
Besides, the number of records is not the primary concern here. Rather, the main criterion is manageability and processing performance. Partitions can be processed in parallel, so the more there are, the more can be processed at once. However, the more partitions
you have, the more things you have to manage. Here are some links which describe partition optimization:
http://blogs.msdn.com/b/sqlcat/archive/2009/03/13/analysis-services-partition-size.aspx
http://www.informit.com/articles/article.aspx?p=1554201&seqNum=2
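As a quick sanity check, the 20-million-row guideline quoted above translates into a minimum partition count with plain arithmetic (the fact-table sizes below are made up for illustration):

```python
import math

# Guideline quoted from the SQL Server 2005 Analysis Services
# performance guide: at most ~20 million rows per partition.
MAX_ROWS_PER_PARTITION = 20_000_000

def min_partitions(row_count, max_rows=MAX_ROWS_PER_PARTITION):
    """Smallest number of partitions keeping each within the guideline."""
    return max(1, math.ceil(row_count / max_rows))

assert min_partitions(15_000_000) == 1    # fits in a single partition
assert min_partitions(20_000_001) == 2    # just over the line
assert min_partitions(365_000_000) == 19  # e.g. a year of daily data
```

In practice you would then round up further to a natural slicing scheme (per month, per year) for manageability, per the links above.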
Regards,
Charlie Liao
TechNet Community Support -
Ajax Report Pull from another Page Pagination override
We are on a time crunch and need to get this application working in a timely manner, any help would greatly be appreciated..
*[History leading up to problem]*
We followed the code from this example http://apex.oracle.com/pls/otn/f?p=11933:48:4441142106394445 which pulls a report from another page using AJAX so we could update a report (in this case a tabular form). Since Tabular Forms cannot update past 30-40 records we had to turn on pagination to prevent errors when trying to submit it.
Since Pagination on the bottom of the report/tabular form links to the other page the report is on, I decided to attempt to override next> href link to instead call a javascript with the url from next> (Report/Tabular Form Pagination) with Jquery the following code {the override is working} {comments added here for clarification}...
var oldhref =$("a.t20pagination").attr("href"); //get Apex Pagination a href link location
$("a.t20pagination").attr('href', "javascript:return false"); // change the Apex Pagination a href link location to do nothing
$("a.t20pagination").click(function () { // Create an onclick for Apex Pagination a href that calls a javascript function
get_report_page(oldhref); // This function is needed because I am grabbing the report from another page, I hope to
}); // get this javascript working [the reason why I am asking this question here]
$("a.t20pagination").removeClass("t20pagination"); // Removed the pagination class just in case it was causing issues.
The function I am working on below... {I figured I have to do the htmldb_get function {I am still new to Apex so I am not totally clear on it, but learning}, to get the report again from the other page, but I need to add pagination.. So I figured I could split the link location based on the : and go through the array for the parts I need to pass in the htmldb_get}, with alerts to help diagnose the problem...
function get_report_page(curUrl) {
    alert('Hello how are you today?');
    var temp = new Array();
    temp = curUrl.split(':');
    alert('I like the number ' + temp.length);
    for (var i = 0, len = temp.length; i < len; ++i) {
        alert(i + ":" + temp[i]); // square brackets restored, per the note in the original post
    }
    var get = new htmldb_Get(null, $x('pFlowId').value, null, 205, null, null, temp[4]); // square brackets restored, per the note in the original post
    gReturn = get.get(null, '<htmldb:BOX_BODY>', '</htmldb:BOX_BODY>');
    get = null;
    $x('ReportDrop').innerHTML = gReturn;
}
*[Actual Problem]*
Since the report is on another page (in this case 205) and is being pulled onto page 200 where the user is, when the user clicks Next> for the report/tabular form pagination, the user is automatically thrown to page 205 {I want to keep the user on page 200, pass the pagination, then pull the report/tabular form again}
I looked at the link http://apex.oracle.com/pls/otn/f?p=11620:63:834860992188521::NO::: but I am not sure where to stick the report pagination{I thought it would be queryString {above you see I am setting temp[4] in the queryString parameter}..
I know the function is working since I get the alert messages, after that where the report would be it gives the following error...
Bad Request
Your browser sent a request that this server could not understand.
mod_plsql: /pls/apex/wwv_flow.show HTTP-400 Signature Mismatch or Missing '='
Oracle-Application-Server-10g/10.1.2.0.2 Oracle-HTTP-Server Server at nysastst.senate.state.ny.us Port 7778
Can I simply splice the url like I am doing above and pass it to htmldb_get (or is there anyway to pass the report pagination using htmldb_get)?
If so, what am I doing wrong?
Also, does this seem like a decent solution or is there a better solution for this issue?
- Thanks in advance,
- Brian
Edited by: brheitner on Dec 10, 2009 8:09 AM
Edited by: brheitner on Dec 10, 2009 8:11 AM
Edited by: brheitner on Dec 10, 2009 8:13 AM
Thanks cb... it worked like a charm.
but at first, I could not get it to work because the report being pulled from the other page did not have Enable Partial Refresh set to Yes under Report Attributes > Layout and Pagination. When it was off, the a href link of the pagination showed a standard link. Once Enable Partial Refresh was set to Yes, the pagination showed a link to javascript:$a_report(...); which meant it could be overridden with the javascript function.
Here is the thread I found it in
Call Process Through AJAX -
Table size is huge even though the number of records is relatively low
Dear team,
When I'm checking one table it has below number of records.
select count(*) from table1
4980092
but the space allocated for this table
select sum(bytes) from user_segments where segment_name = 'table1';
SUM(BYTES)
2361712640
I'm surprised by this size.
While looking for the cause, I found that deleting records does not free the space. How can I free up the space for this table?
Deletes happen on this table frequently, on a daily basis.
user11081688 wrote:
Dear team,
When I'm checking one table it has below number of records.
select count(*) from table1
4980092
but the space allocated for this table
select sum(bytes) from user_segments where segment_name = 'table1';
SUM(BYTES)
2361712640
I'm surprised by this size.
why?
While looking for the cause, I found that deleting records does not free the space.
correct
then how can I free up the space for this table?
there is no need to do so, since the space will be reused by new rows.
Deletes happen on this table frequently, on a daily basis.
if DELETE occurs daily, why is the number of rows close to zero?
how many rows get INSERT daily?
what is average ROW LENGTH?
SQL> select 2361712640/4980092 from dual;
2361712640/4980092
474.230725
SQL>
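The responder's division above is just average allocated bytes per current row; the same arithmetic in Python, using the figures quoted in the thread (note that user_segments reports allocated space, including blocks left free by deletes, which Oracle reuses for new inserts rather than returning to the tablespace):

```python
# Figures quoted in the thread.
segment_bytes = 2361712640   # sum(bytes) from user_segments
row_count = 4980092          # select count(*) from the table

# ~474 bytes of allocated segment per current row. A high value here can
# simply mean deleted rows left free space inside the segment; it does
# not by itself indicate a problem, since that space is reused.
avg_bytes_per_row = segment_bytes / row_count
print(round(avg_bytes_per_row, 2))
```

Comparing this figure against the table's actual average row length (e.g. avg_row_len in user_tables after gathering statistics) would show how much of the segment is genuinely free space versus row data.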