Maximum amount of records
hi,
I need to calculate the maximum number of records that I can store in my database. How can I calculate that?
Thanks for any ideas.
Hi,
Conceptually there is no such limit: the maximum number of records you can store in a database depends only on the space available on whatever media (hard disks or otherwise) you store it on.
Raju
Similar Messages
-
I purchased a docking station for my iPad so that I could record a live performance of my band directly from a mixing board this weekend using GarageBand. Unfortunately, I was unaware of the song-length limitations of the iPad version of this program, and I was unable to capture more than 15 minutes of the gig. I've read other conversations on this forum and can't really determine the maximum amount of time this application can record. Can anyone tell me how long GarageBand for iPad can record in a single take, and how to adjust the settings to achieve this? Also, if anyone is aware of other apps that could be used to record for longer, please share. Thanks.
I received an email reply from George, the founder and CEO of Studiomini. He indicated that their app has no time limitation on recording like GarageBand does; you are only limited by the amount of free space on your iPad. For what it's worth, I think the Alesis IO Dock is an amazing piece of equipment. I would highly recommend purchasing the IO Dock and plugging your mixing board into it to capture your rehearsal. Hopefully the feedback I received from Studiomini is accurate and I can capture my entire rehearsal in one take. I hope this feedback helps. One last note: apparently there will be an update soon so that Studiomini can be viewed in landscape on the iPad instead of just portrait.
-
What is the best practice of deleting large amount of records?
hi,
I need your suggestions on best practices for regularly deleting a large number of records from SQL Azure.
Scenario:
I have a SQL Azure database (P1) into which I insert data every day. To keep the database from growing too fast, I need a way to remove, every day, all records older than 3 days.
For on-premise SQL Server I could use a SQL Server Agent job but, since SQL Azure does not yet support SQL Agent jobs, I use a Web Job scheduled to run daily to delete the old records.
To prevent table locking when deleting a very large number of records, my Web Job code limits each run of the delete stored procedure to 5,000 records, removed in batches of 1,000:
1. Get the total count of old records (older than 3 days).
2. Compute the number of iterations: iterations = total count / 5000, rounded up.
3. Call the SP in a loop:
for (int i = 0; i < iterations; i++)
    Exec PurgeRecords @BatchCount=1000, @MaxCount=5000
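For illustration only, the driver loop above can be sketched in Python; the stored-procedure call is just a placeholder, and the row count is the one from the job log:

```python
import math

TOTAL_OLD = 1_721_586   # rows older than 3 days (from the job log)
MAX_COUNT = 5_000       # rows staged per stored-procedure call
BATCH_COUNT = 1_000     # rows removed per DELETE inside the procedure

# Rounding up ensures the final partial chunk is also purged.
iterations = math.ceil(TOTAL_OLD / MAX_COUNT)

for i in range(iterations):
    # placeholder for: EXEC PurgeRecords @BatchCount=1000, @MaxCount=5000
    pass
```

With these numbers the loop runs 345 times, matching the "Total iterations: 345" line in the log below.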
And the stored procedure is something like this (with the @table variable declared; an INT RecordId is assumed):
DECLARE @table TABLE ([RecordId] INT PRIMARY KEY)
INSERT INTO @table
SELECT TOP (@MaxCount) [RecordId] FROM [MyTable]
WHERE [CreateTime] < DATEADD(DAY, -3, GETDATE())
DECLARE @RowsDeleted INTEGER
SET @RowsDeleted = 1
WHILE (@RowsDeleted > 0)
BEGIN
    WAITFOR DELAY '00:00:01'
    DELETE TOP (@BatchCount) FROM [MyTable]
    WHERE [RecordId] IN (SELECT [RecordId] FROM @table)
    SET @RowsDeleted = @@ROWCOUNT
END
It basically works, but the performance is poor. For example, it took around 11 hours to delete around 1.7 million records, which is far too long.
Following is the Web Job log for deleting those ~1.7 million records:
[01/12/2015 16:06:19 > 2f578e: INFO] Start getting the total counts which is older than 3 days
[01/12/2015 16:06:25 > 2f578e: INFO] End getting the total counts to be deleted, total count: 1721586
[01/12/2015 16:06:25 > 2f578e: INFO] Max delete count per iteration: 5000, Batch delete count 1000, Total iterations: 345
[01/12/2015 16:06:25 > 2f578e: INFO] Start deleting in iteration 1
[01/12/2015 16:09:50 > 2f578e: INFO] Successfully finished deleting in iteration 1. Elapsed time: 00:03:25.2410404
[01/12/2015 16:09:50 > 2f578e: INFO] Start deleting in iteration 2
[01/12/2015 16:13:07 > 2f578e: INFO] Successfully finished deleting in iteration 2. Elapsed time: 00:03:16.5033831
[01/12/2015 16:13:07 > 2f578e: INFO] Start deleting in iteration 3
[01/12/2015 16:16:41 > 2f578e: INFO] Successfully finished deleting in iteration 3. Elapsed time: 00:03:33.6439434
Per the log, SQL Azure takes more than 3 minutes to delete 5,000 records in each iteration, which adds up to around 11 hours in total.
Any suggestions for improving the delete performance?
This is one approach:
Assume:
1. There is an index on CreateTime.
2. Peak-time inserts are N times the average. For example, if the average is 10,000 rows per hour and the peak is 5 times that, the peak is 50,000 rows per hour. This doesn't have to be precise.
3. The desired maximum number of records deleted per batch is 5,000; this doesn't have to be exact either.
Steps:
1. Find the count of records more than 3 days old (TotalN), say 1,000,000.
2. Dividing TotalN (1,000,000) by 5,000 gives 200 delete batches if inserts were perfectly even. Since they are not, and peak inserts can be 5 times the average, set the number of batches to 200 * 5 = 1,000.
3. Dividing 3 days (4,320 minutes) by 1,000 gives a slice of 4.32 minutes.
4. Create a delete statement and a loop in which iteration i (i = 1 to 1,000) deletes records with CreateTime < (now - 3 days) - (3 days - 4.32 * i minutes); each iteration advances the cutoff by one 4.32-minute slice, and the last one stops exactly at the 3-day boundary.
This way the number of records deleted per batch is uneven and not known in advance, but it should mostly stay within 5,000; you run many more batches, but each one is very fast.
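A Python sketch of the cutoff schedule described above. This is my reading of the approach, assuming the old rows span a 3-day window sliced so that the final cutoff lands exactly on the 3-day retention boundary; the function name and numbers are illustrative:

```python
from datetime import datetime, timedelta

def delete_cutoffs(total_old, per_batch, peak_factor, retention_days, now=None):
    """CreateTime cutoffs for time-sliced batch deletes.

    The window holding the old rows is cut into many thin slices, so each
    DELETE ... WHERE CreateTime < cutoff removes roughly one small batch,
    and no cutoff ever passes the retention boundary.
    """
    now = now or datetime.utcnow()
    batches = (total_old // per_batch) * peak_factor  # 200 * 5 = 1000
    window = timedelta(days=retention_days)           # assumed span of old rows
    step = window / batches                           # 4.32 minutes per slice
    boundary = now - window                           # the 3-day line
    return [boundary - window + step * i for i in range(1, batches + 1)]

cutoffs = delete_cutoffs(1_000_000, 5_000, 5, 3)
# 1000 cutoffs, 4.32 minutes apart, the last one exactly at the 3-day boundary
```

Each SQL batch then becomes `DELETE FROM [MyTable] WHERE [CreateTime] < @cutoff`, which touches only one thin slice of rows per statement.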
Frank -
Set maximum amount posted in a GL account
Hi All,
Is it possible to set a maximum amount that can be posted to a GL account in a period, blocking anything beyond it?
Please advise.
Thanks,
Safi
Hi,
You would have to transfer the line items using a manual entry, or through a recording with LSMW.
Regards
Sanil -
Maximum Multitrack Audio Recording Length
Hi there..
I'd like to know the maximum multitrack audio recording length in LE.
For Cubase, the multitrack recording length depends on the capacity of your hard disk, but in GarageBand the recording length is limited; you can extend it by adjusting the measures and lowering the tempo.
May I know how it is for LE? Is it possible to set the recording length to as much as 5 or even 8 hours?
thanks...
Ray
This is correct, but it doesn't actually give you an ENDLESS amount of recording time.
The time is still dictated by Logic; it seems to be linked to a certain number of bars. Therefore, by setting the tempo really low, as the original poster said, you can get a very long time out of it. I'm not sure exactly how long this would equate to.
Once, before I transferred a minidisc to CD for a friend, I remember it stopped recording at about 70 minutes, and that was at the default of 120 BPM. So I would expect somewhere in the region of 4 times this length when setting the tempo to 30 BPM. The minimum tempo you can set Logic to work at is 5 BPM, which would equal roughly 1680 minutes, or 28 hours. Whether it would actually work is a different matter; bear in mind these are rough calculations from a distant memory of a rough figure.
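The rough arithmetic above can be written out, assuming the ceiling really is a fixed number of bars so that recording time scales inversely with tempo (the 70-minute figure is remembered, not a specification):

```python
OBSERVED_MINUTES = 70   # rough length observed at the default tempo
OBSERVED_BPM = 120      # Logic's default tempo

def max_minutes(bpm):
    """Estimated recording ceiling at a given tempo, if the true limit
    is a fixed number of bars (time is inversely proportional to tempo)."""
    return OBSERVED_MINUTES * OBSERVED_BPM / bpm

max_minutes(30)   # 280 minutes, about 4x the default-tempo figure
max_minutes(5)    # 1680 minutes, roughly 28 hours
```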
But hope that helps. -
Hi Friends,
I want to select the record with the maximum amount from an internal table. Can anyone tell me how to extract that maximum record?
Regards,
Line
Hi Line Turbin,
Let me tell you one simple way:
sort the internal table descending by the amount, then loop over it and exit after the first row.
example:
SORT i_tab BY wmbtr DESCENDING.
LOOP AT i_tab.
  IF sy-tabix = 1.
*   move the highest value into a variable, or the row into a work area
    EXIT.
  ENDIF.
ENDLOOP.
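The same idea in a quick Python sketch; the amount field is named wmbtr as in the ABAP example, while the document numbers and values are made up for illustration:

```python
# Each dict mimics a line of the internal table; wmbtr is the amount field.
i_tab = [
    {"belnr": "0001", "wmbtr": 120.00},
    {"belnr": "0002", "wmbtr": 540.50},
    {"belnr": "0003", "wmbtr": 75.25},
]

# Equivalent of sorting descending and reading the first row,
# but without sorting the whole table:
top = max(i_tab, key=lambda row: row["wmbtr"])
top["wmbtr"]   # the highest amount, 540.5
```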
award points if useful -
Amount of records loaded to dso is not same as in psa
I performed a load from PSA to a DSO. I have 2 DataSources under this DSO, and the number of records loaded from PSA into the DSO for these 2 DataSources is not consistent: the PSA for the 1st DataSource has 3k records and the PSA for the 2nd has 5k, but after loading both DataSources into the DSO, the record count is lower. Does anyone know why this is so?
hi,
The DSO has the overwrite option, hence you end up with fewer records.
Check whether you have enough key fields in the DSO, so that you can reduce the number of records getting overwritten.
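A minimal Python sketch of why overwrite mode reduces the record count; the keys and values are invented for illustration:

```python
# PSA rows as (key, value) pairs. Two rows share the key "1000".
psa_rows = [
    ("1000", 10),
    ("1001", 5),
    ("1000", 20),   # same key as the first row
]

# DSO in overwrite mode: the last row loaded for a given key wins,
# so rows sharing a key collapse into a single record.
dso = {}
for key, value in psa_rows:
    dso[key] = value

len(psa_rows)   # 3 rows in the PSA
len(dso)        # 2 records in the DSO
```

Adding more fields to the DSO key makes fewer rows collide, which is why more key fields means more surviving records.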
Ramesh -
How can we increase the maximum number of records which we export from UME
Hi All,
Is there any way to increase the maximum number of records which we can export from the UME.
Please give your valuable suggestions as soon as possible.
Thanks in Advance
Regards,
Ramalakshmi.S
I didn't find any configuration you can set to increase the number. I think it is related to the UI; the number is fixed programmatically.
Lisa Zheng
TechNet Community Support -
How to insert large amount of records at a time into oracle
Hi, I'm Dilip, and I'm a newbie to Oracle. For practice I got some SQL code containing a huge number of records in text format, which I need to copy and paste into SQL*Plus on Oracle 9i. But when I try to paste into SQL*Plus, I'm unable to paste more than 50 lines of code at a time. One of the text files has 80 thousand lines of record code to paste. Please help me. Here is the link to the text file I'm using: http://www.mediafire.com/view/?4o9eo1qjd15kyib . Any kind of help will be much appreciated.
Instead of pasting, run the file as a script:
sqlplus user1/pass1
@sql_text_file.sql
Doing the above will execute all the SQL statements in the text file. -
F110/ FBZP - maximum amount paid per Vendor in Payment run
Hi
How do I create a payment run in which the maximum paid to a vendor per transaction is restricted?
If the amount exceeds this limit, I still wish to pay, but split over more than one payment order.
Our BACS payments are now being processed via Faster Payments, and there seems to be a limit of £100,000 per transaction per vendor; anything over is being rejected by Danske Bank.
So, for the example above, if a vendor is due 156,000, I still wish to pay them, but over 2 payment orders, neither of more than 100,000.
I have tried several methods in FBZP using the maximum amount, but none of my settings seem to work so far.
Please advise
Many thanks for your help
Tony
Hi Tony,
This is not exactly a solution, but going forward you could split the invoices and process them separately at invoicing time. Otherwise you can try installment payment terms, though there may be at least one day's difference in payment.
Warm regards,
Murukan Arunachalam -
Insert into table a large amount of records
I was trying to find a fast way to optimize a script that inserts a large number of records into a table. The initial script was like:
insert into table_xxxx
select a.camp1, a.camp2, a.camp3, a.camp4, b.camp1, b.camp2, b.camp3
from table_a a, table_b b
where a.camp0 = b.camp0
The commit statement was at the end of the insert script, so I came up with this solution:
Declare
  TYPE Tab_Hist IS TABLE OF table_xxxx%ROWTYPE INDEX BY BINARY_INTEGER;
  g_tHist Tab_Hist;
  CURSOR c_Base IS
    select a.camp1, a.camp2, a.camp3, a.camp4, b.camp1, b.camp2, b.camp3
    from table_a a, table_b b
    where a.camp0 = b.camp0;
BEGIN
  OPEN c_Base;
  LOOP
    FETCH c_Base BULK COLLECT INTO g_tHist LIMIT 1000;  -- fetch in chunks of 1000 rows
    EXIT WHEN g_tHist.COUNT = 0;                        -- also covers the final partial fetch
    FORALL i IN 1 .. g_tHist.COUNT                      -- index from 1, not FIRST .. COUNT
      INSERT INTO prov_cobr_dud VALUES g_tHist(i);
    COMMIT;                                             -- commit each chunk
  END LOOP;
  CLOSE c_Base;
END;
If anyone could tell me another way to do the same thing, I'll appreciate it a lot; I'm keen to learn more efficient ways to optimize scripts.
PS: The initial insert loaded the table with 120,000 records, more or less.
Hello,
Wrong forum. This is the Oracle Forms forum. You should post in the SQL and PL/SQL forum.
Francois -
Maximum amount of items in an edb database
I am trying to find out whether there is a limitation on the maximum number of items (messages, appointments, etc.) in an EDB database.
I'm currently on Exchange 2010 Enterprise SP3 with 3 databases in a DAG. One of these databases contains more than 5,000,000 items.
Does anyone know when it's going to become critical?
Thanks for your help.
Thank you for your reply. That website seems to limit its scope to user mailboxes, but it led me to this TechNet article, which was helpful.
http://technet.microsoft.com/en-us/library/cc535025.aspx
"When you consider the appropriate maximum item count for your organization, understand that there is no cliff at the maximum value, but instead a progressive degradation in performance. The stated limits are not hard-coded. They are numbers based
on testing and code analysis. As the item count increases, performance may degrade to a user perceivable level. The level of performance that is acceptable for users in your environment will dictate the appropriate maximum item count for the environment."
I think that is what I've been searching for. In fact, there is no hard limit on objects in a database; there are limits on objects in critical folders that must be considered to keep Exchange performing well.
Cheers -
How can I copy the maximum amount of music to my iPod classic 160GB?
How can I copy the maximum amount of music to an iPod classic 160GB?
See also Repair iTunes Security Permissions and apply the suggested steps to the iTunes folder and the source media folder.
I may be reading too much into things, but is your media stored in a bunch of folders where each folder represents a playlist? If so, my script ImportFolderStructure might be useful to you...
tt2 -
Maximum no. of records in a webdynpro Node
Hi experts ,
I wanted to know whether there is any limitation on the maximum number of records a Web Dynpro node can store, and also on the maximum number of records a table can display.
Regards,
Ashish Shaa
Hi Ashish,
I think there is no limitation on this.
Regards,
Murtuza -
Maximum Number Of Records Import Manager can handle.
Hi Guys,
I want to know the maximum number of records Import Manager can import / handle at a time.
Thanks in advance .
Best Regards,
Ramchandra Kalkar.
Amol,
The reference guide lists the limit as 50,000 records.
My experience is that this is not necessarily the case. The maximum import seems to depend somewhat on the number of fields you are trying to import: you can probably import 50,000 records if the file contains only two columns/fields, but if it contains many columns/fields you will probably have difficulty importing 50,000 records at a time.