0 to 100,000,000

Does anyone have a script to make a textfield count from 0 to
100,000,000 - but with commas?
Thanks in advance.
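The thread never got an answer, but the comma-grouping part is straightforward. Here is a minimal sketch in Java (the original question is presumably about an ActionScript textfield, so treat this as an illustration of the formatting logic, not a drop-in script; driving an on-screen counter from a timer is the platform-specific part left out here):

```java
import java.text.NumberFormat;
import java.util.Locale;

public class CommaCounter {
    // Format a count with grouping commas, e.g. 100000000 -> "100,000,000".
    public static String withCommas(long n) {
        return NumberFormat.getIntegerInstance(Locale.US).format(n);
    }

    public static void main(String[] args) {
        // Print a few sample values; a real counter would update a
        // textfield from a timer instead of printing.
        for (long i = 0; i <= 100_000_000L; i += 25_000_000L) {
            System.out.println(withCommas(i));
        }
    }
}
```

NumberFormat with a fixed locale inserts the grouping separators for you, so there is no need to splice commas into the string by hand.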

Similar Messages

  • I bought a new computer (Windows 8) and plugged in my 2TB drive containing 100,000 songs and iTunes won't recognize 5,000 of them no matter what I do other than re-encode them.  Any ideas?

    All I can figure out to do is to sort the "unknown artist" and "unknown album" tracks by album name (ironically, Windows Explorer shows the artist and album names just fine) and then move each album's worth of songs to the library, where they show up as Unknown. Then I can re-encode the tracks to go where they're supposed to be, but all track number info is also gone, so I have to edit each track to add track numbers. This is a nightmare of gigantic proportions that will take months to fix. It's as if these 5,000 tracks (out of 100,000) were not encoded properly to begin with, although I can't imagine how that happened, since I encoded them from CDs originally. What a mess! Any ideas? Please join in.
    P.S. why do they never include iTunes itself as the option that you're having a problem with?

    See Repair security permissions for iTunes for Windows.
    tt2

  • How to change data '100000-' to '-100,000' using edit_mask in ALV

    hey experts,
    I want to display the value '100000-' as '-100,000'. I used edit_mask 'V_____' to put the '-' in front of 100000, but then the comma disappears.
    How can I do that?
    thanks,
    heum

    Hi Kim,
    Use the function module:
    CLOI_PUT_SIGN_IN_FRONT
    [Example|http://abaplovers.blogspot.com/2008_06_23_archive.html]
    Regards,
    Chandra Sekhar
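The transformation the function module performs can be sketched outside SAP as well. Below is a hypothetical Java illustration (not SAP code) of what CLOI_PUT_SIGN_IN_FRONT does: move a trailing '-' to the front while re-applying the comma grouping that a raw edit mask loses:

```java
import java.text.NumberFormat;
import java.util.Locale;

public class SignInFront {
    // Illustrative only: convert a SAP-style trailing-sign value such as
    // "100000-" into a leading-sign, comma-grouped string "-100,000".
    public static String signInFront(String sapValue) {
        String trimmed = sapValue.trim();
        boolean negative = trimmed.endsWith("-");
        long magnitude = Long.parseLong(
                negative ? trimmed.substring(0, trimmed.length() - 1) : trimmed);
        // Re-apply grouping separators after stripping the sign.
        String grouped = NumberFormat.getIntegerInstance(Locale.US).format(magnitude);
        return negative ? "-" + grouped : grouped;
    }
}
```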

  • How to append a new entry in a list of 100,000 names without iterating the list each time?

    I have a list of 100,000+ names and I need to append another 100,000 names to it. Each name must be unique. Currently I iterate through the entire list to be sure the name does not already exist. As you can imagine, this is very slow. I am new to Java and I am maintaining a 15+ year old product. Is there a better way to check for an existing name?

    We are using a Java List because that is how the original developers coded it. I don't think they planned for that many entries. I know I need to refactor this, which is why I am asking for opinions on how to make it more efficient. Currently we don't use a database for anything in the product, so I would like to stay away from that if possible.
    Ok - but it still raises the question in my mind as to how that data is being used.
    I gave you a couple of options that will take care of the UNIQUE requirement (Hashtable, HashMap) but the BEST solution depends on:
    1. How often new entries are made
    2. How often entries are deleted
    3. How often entries are changed
    4. How often entries are accessed
    5. How the data is actually used
    If you just have a one time requirement to merge the two lists then just do it and get it over with - it won't really matter how you do it.
    But Hash classes will present their own performance issues if the typical access is for many, most or all of that 200k+ entries.
    Without knowing the full set of requirements we can't really know just what part of the process needs to be optimized.
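For the one-time merge case, the hash-based suggestion above can be sketched like this, assuming plain String names (class and method names here are illustrative, not from the product in question). Building the set is O(n) once; each membership check is then O(1) on average, instead of a full list scan per insert:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class UniqueMerge {
    // Merge newNames into existing, skipping duplicates, without scanning
    // the whole list for every insert. LinkedHashSet keeps insertion order,
    // so the result looks like the original list with new names appended.
    public static List<String> merge(List<String> existing, List<String> newNames) {
        Set<String> seen = new LinkedHashSet<>(existing);
        seen.addAll(newNames); // duplicates are dropped automatically
        return new ArrayList<>(seen);
    }
}
```

If order does not matter and lookups dominate, keeping the data in a Set permanently (rather than converting back to a List) avoids rebuilding the set on every merge.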

  • 30,000 to over 100,000 pageouts of memory is this normal?

    Hi, I have 8 GB of memory in my Mac Pro. I notice that sometimes iStat nano shows a lot of pageouts.
    After restarting my computer, it usually starts at around 300 pageouts, then goes up to 30,000, then over 100,000. This can happen within a day of restarting my Mac, or a few days.
    These are the applications I have open: iPhoto (45,000 pictures), iTunes, Calendar, QuickTime X, Apple Mail, Finder, Safari, Preview.
    Is it normal to have that many pageouts with those apps open? Sometimes I don't even use my Mac much and it still goes up. Thanks, guys.

    Depends on how much installed RAM you have, number of open applications, and the sort of work you are doing. By itself Pageouts is not an indicator of anything special other than over the course of time the computer has accumulated that total as a result of various applications having to page out data to the hard drive. A more significant indicator is the value in parentheses shown by 'top' in the Terminal. If that number is positive and increasing then it means you do not have sufficient RAM to accommodate all the applications you are running concurrently. See:
    About OS X Memory Management and Usage
    Reading system memory usage in Activity Monitor
    Memory Management in Mac OS X
    Performance Guidelines- Memory Management in Mac OS X
    A detailed look at memory usage in OS X
    Understanding top output in the Terminal
    The amount of available RAM for applications is the sum of Free RAM and Inactive RAM. This will change as applications are opened and closed or change from active to inactive status. The Swap figure represents an estimate of the total amount of swap space required for VM if used, but does not necessarily indicate the actual size of the existing swap file. If you are really in need of more RAM that would be indicated by how frequently the system uses VM. If you open the Terminal and run the top command at the prompt you will find information reported on Pageins () and Pageouts (). Pageouts () is the important figure. If the value in the parentheses is 0 (zero) then OS X is not making instantaneous use of VM which means you have adequate physical RAM for the system with the applications you have loaded. If the figure in parentheses is running positive and your hard drive is constantly being used (thrashing) then you need more physical RAM.
    Also, visit The XLab FAQs and read the FAQ on the spinning beachball of death (SBBOD.)
    Message was edited by: Kappy

  • Conversion Problem in Unit of measure 1PAC=100,000 EA

    Hi Experts,
    I am facing a problem with the conversion of units of measure.
    Some materials have a conversion unit when we order from the vendor.
    The unit of measure is EA; the order unit is PAC.
    The conversion is 1 PAC = 100,000 EA.
    But the system only allows 1 PAC = 10,000 EA; it does not allow entering 100,000 EA and throws the error
    Entry too long (enter in the format __,___)
    Message no. 00089
    How can we increase the number of characters allowed for the unit of measure conversion? We have some 50 materials like this, and I need to increase the length of the conversion fields. How can this be done? Please guide me.
    rgds
    Swamy

    Hi
    You can check this: the denominator factor is stored in structure SMEINH, field UMREN.
    The best option is to maintain decimal places for the UoM, which is easier to maintain.
    The same is explained in the note. Please read below about too-large numerators and denominators:
    When 120,000 CM3 = 0.2 tons (TO), you can no longer save the numerator and denominator of the conversion ratio 600,000 CM3 = 1 TO, as numerator and denominator may have at most five digits.
    Here you must either select a larger volume unit or a smaller unit of weight: with DM3 the conversion ratio would be 600 DM3 = 1 TO; with KG the conversion ratio would be 600 CM3 = 1 KG.
    Generally, the alternative units of measure and the base unit of measure should result in quantities of the same order of magnitude, since the conversion factors may not be larger than 99999/1 nor smaller than 1/99999.
    Thanks & Regards
    KK
    Edited by: Kishore Kumar Chiluka on Apr 22, 2008 8:25 AM
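The five-digit limit described above can be checked mechanically: reduce the quantity pair to lowest terms and see whether both numerator and denominator fit in 99,999. A small sketch (plain Java, not SAP code; the class name is made up for illustration):

```java
public class UomRatioCheck {
    static long gcd(long a, long b) { return b == 0 ? a : gcd(b, a % b); }

    // Returns true if the conversion "altQty alt-units = baseQty base-units"
    // can be stored, i.e. both sides reduced to lowest terms fit in 5 digits.
    public static boolean storable(long altQty, long baseQty) {
        long g = gcd(altQty, baseQty);
        return altQty / g <= 99_999 && baseQty / g <= 99_999;
    }
}
```

By this check, 1 PAC = 100,000 EA is rejected (100,000 has six digits), matching the Message 00089 error, while the note's 600 DM3 = 1 TO workaround passes.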

  • My Mac Mail is downloading over 100,000 emails from the server every time I click the icon, and I can't receive the current emails

    Was looking to see if anyone has a solution to stop Mac Mail from downloading over 100,000 emails each time I click on the Mail icon. I can't get newer emails to load. Has anyone dealt with this, and were you able to fix it?
    thank you

    Does your antivirus have a spam tool? If so, turn it off and see.

  • MatchCode for 100.000 rows = dump

    Hi!
    I use FM F4IF_INT_TABLE_VALUE_REQUEST for a personal matchcode in a dynpro field.
    But if the internal table has 100,000 rows, the system dumps.
    How can I display the matchcode without a dump?
    Thanks very much!

    A matchcode where you have more than 100,000 rows is not a good matchcode!
    You should provide at least some criterion to restrict the list. The maximum number of hits is only 4 digits in SAP, and you should always restrict your list according to this maximum.
    You do this by adding to your SELECT statement:
    UP TO callcontrol-maxrecords ROWS

  • 100,000 images?

    Last fall, I was shooting beside a friend who teaches Aperture at various shows and events. He mentioned rumors of LR having problems with the database files once the catalog reaches 100,000  images. At the time, I probably only had 75,000 images in my main catalog, but I am now just over the number. I upgraded to 3.0 just in case the story was true, and hoping Adobe would have fixed the problem if there ever was one. So, was he spreading a bad rumor? Was there really a problem? And if there was a problem in 2.x, is it now resolved in 3.x?
    Are there people here with well over 100,000 images in their catalogs?
    Thanks in advance,
    M. Jackson

    John,
    Thanks for the response. Sounds like the 100,000 barrier might just be a rumor if people have 275,000 in LR2. After I heard it, I kept watching for posts with similar concerns and haven't seen it. The guy who told me is an active wildlife photographer and is around lots of people with plenty of experience using LR. He went on to say Apple had a different way of maintaining the database than Adobe, making it more reliable. I put more faith in the comment than some comments I might hear in the field.
    Back when I was building my first catalogs, people here said they knew of people with 90,000 images in their catalogs and going strong. That gave me the courage to continue with just one main catalog. At something like 101,000, I started to hear the echo of that warning.
    Take care,
    M. Jackson

  • Saving 100,000 records in a loop too slow

    I have an explicit cursor, a FOR loop over it, and an UPDATE statement inside the loop. It works, but it performs 100,000 individual updates, which is bad for performance. Any ideas to speed it up?
    My loop is
    FOR code_rec IN cur_codes_table
    LOOP
    code_loop := code_loop +1 ;
    UPDATE <table name> T1
    SET T1.error_indicator = 'Err?'||main_rec.REC_ROWNUM
    WHERE T1.REC_ROWNUM = main_rec.ROWNUM ;
    END LOOP;

    You haven't really provided us enough information to go on but I would suggest removing the FOR LOOP logic altogether and using a SINGLE UPDATE statement.
    If you can provide CREATE / INSERT scripts to describe your situation as well as desired result the folks here can probably develop a solution for you.
    Here is an example, it may or may not apply to your situation:
    UPDATE  TABLE_T T1
    SET     ERROR_INDICATOR = (
                    SELECT  'Err?'||MT.REC_ROWNUM
                    FROM    MAIN_TABLE MT
                    WHERE   T1.REC_ROWNUM = MT.REC_ROWNUM
            )
    WHERE   EXISTS (
                    SELECT  NULL
                    FROM    MAIN_TABLE MT
                    WHERE   T1.REC_ROWNUM = MT.REC_ROWNUM
            );

  • Display 100,000 rows in Table View

    Hi,
    I have received a strange requirement from a customer, who wants a report based on a Direct Database Request that returns about 100,000 rows.
    I understand that OBIEE is not an extraction tool and that a report with more than 100-200 rows is not very useful. However, the customer insists that such a report be generated.
    The report returns about 97,000 rows and has about 12 columns and is displayed as a Table View.
    To try to generate the report, I set the ResultRowLimit in the instanceconfig.xml file to 150,000 and restarted the services. I also set the query limits in the RPD to 150,000, so that is not the issue either.
    When running the report, the session log shows the record count as 97,452, so all the records are available in the BI Server.
    However, when I click the button to display all rows at the end of the report, the browser hangs after about 10 minutes with nothing displayed.
    I have gone through similar posts, but nothing conclusive was mentioned in them. Any input to fix this issue would be highly appreciated.
    Thanks,
    Ab
    Edited by: obiee_user_ab on Nov 9, 2010 8:25 PM

    Hi Saichand,
    The client wants the data to be downloaded in CSV, so the row limit in the Excel template that OBIEE uses is not an issue.
    The 100,000 rows are retrieved after applying a Dashboard Prompt with 3 parameters.
    The large number of rows is because these are month-end reports, which are more like extractions.
    The customer wants to implement this even though OBIEE does not work well with large numbers of rows; there are only a couple of reports like this, and it would be an expensive proposition to use a different reporting system for only 3-4 reports.
    Hence, I am on the lookout for a way to implement this in OBIEE.
    The other option is to download the report directly into CSV, without having to load all the records into the browser first. I read a couple of blog entries about this, but the steps mentioned were not clear, so any help on this front would also be great.
    Thanks,
    Ab

  • I want to delete approx 100,000 million records and free the space

    Hi,
    i want to delete approx 100,000 million records and free the space.
    I also want to free the space.How do i do it?
    Can somebody suggest an optimized way of archiving data.

    user8731258 wrote:
    Hi,
    i want to delete approx 100,000 million records and free the space.
    I also want to free the space.How do i do it?
    Can somebody suggest an optimized way of archiving data.

    To archive, backup the database.
    To delete and free up the space, truncate the table/partitions and then shrink the datafile(s) associated with the tablespace.

  • If you can figure this out, I will pay $100,000,000!!!

    If anyone can figure out how to stream videos between itunes using the "shared library", I WILL PAY YOU 100,000,000!!!!! Any videos, even video podcasts. Stream between two itunes from mac to windows

    aapl.up wrote:
    If anyone can figure out how to stream videos between itunes using the "shared library", I WILL PAY YOU 100,000,000!!!!! Any videos, even video podcasts. Stream between two itunes from mac to windows
    The answer is both easy and frustrating.
    First the easy part.
    1. Add the videos to iTunes on the Mac (you did say from Mac to PC)
    2. If the video is a Music video, set the flag appropriately
    3. Tell iTunes on the Mac to share its iTunes library over the network
    4. Make sure either you share the entire library, or a playlist that includes the video(s)
    5. Set the PC to look for shared libraries on the network
    The above is probably obvious and you have probably done this. The next bit is to address the cause of the problem.
    First, I have various music videos acquired from different sources. Some from the iTunes Store before they started charging for them, some from other sites on the Internet. All are in QuickTime compatible format and hence all play locally in iTunes. However most of them do not work when I try and access them via iTunes Sharing.
    Now I had a strong idea what the cause was, but here are the possible causes that could be considered:
    1. It's the wrong format (even if it can be played locally), i.e. not MPEG-4 or H.264
    2. It's the wrong pixel size or bit rate
    3. It did not have the "Music Video" flag set
    4. It's too long in duration or file size
    5. It has not been prepared for streaming
    To put you out of your misery, it appears to be number 5. I had already suspected this because I had previously seen reports that Front Row (on a Mac) had problems playing videos from another Mac if the video had not been prepared for streaming.
    I have found two reasonably easy ways to convert videos so they are prepared for streaming.
    1. If you have QuickTime Pro (for Mac or Windows) you can export the video using the QuickTime Player and in options in the export dialog box, set it to enable the streaming option.
    2. If you select the video in iTunes itself, you can tell iTunes to convert it to iPod or Apple TV format; this will also set the streaming option at the same time.
    I tested both methods using a video that previously would not work between Mac and Windows iTunes sharing and both these solutions worked. This was tested using iTunes 7.4.2 on Mac OS X 10.4.10 and iTunes 7.4.2 on Windows XP Pro.
    I look forward to receiving your cheque for $100,000,000

  • Installing Lightroom 5 for first time. Chose standard previews of the 100,000+ photos on removable hard drive. Lightroom stopped creating previews after the first 10,000 or so pictures. Don't see how to start it moving forward again. Thanks!

    I think I did everything correctly. Moved pictures to the external drive as per Microsoft's instructions for Windows 8. All worked fine. Everything else in Lightroom seems to work fine. However, it just stopped creating standard size previews.

    Glad you had success. You can check to see how many preview files have been built by checking the Lightroom 5 Catalog Previews.lrdata folder with File Explorer:
    /Users/[user name]/Pictures/Lightroom/Lightroom 5 Catalog Previews.lrdata
    Right-click on the folder and select 'Properties.' Next to 'Contains' will be the file count representing the number of built previews. It should be the same as the number of pictures (100,000) or slightly more.
    Creating previews for 100,000 image files will take a long time! My Windows 7 i7-860 processor system with 21 mp Canon 5D MKII raw files takes about 3 seconds to build one standard preview. Using this number for 100,000 previews:
    100,000 x 3 seconds = 300,000 sec. = 5,000 min. = 83 hours = 3.5 days!

  • Backup tips for 100,000+ images/yr

    I shoot roughly 100,000 images a year and I am on the road quite a bit. I am currently using a Macbook Pro w/256GB SSD so space is limited.  I pretty much just keep the images from the current week before I move them to a backup drive. 
    Any suggestions to access the archive on the road?  I currently have a few Travel HD's that are roughly 2TB each. I often get requests where I need to access a specific RAW file when I am out of town.  Any Cloud based solutions? Should I explore a home server?

    Cloud Storage & Unlimited Online Backup | Livedrive
