Power of X function with very large numbers & HEX array

Hello,
I'm having 3 problems and I would be grateful if someone can help. I've attached
1) I need to calculate 982451653^15. I've used the 'Power of X' function but the result I'm getting is incorrect.
Is there a way to get the correct result?
2) After that I need to calculate the modulo of the result, but I get nothing. I'm using the 'Quotient & Remainder' function.
3) I need to convert the number 982451653 to HEX --> 3A8F05C5 and write it into an array grouped two hex digits at a time, grouping from the end, as shown below:
3A8F05C5 --> [3A][8F][05][C5]
Array should be:
...and for the hex number 3A8F05C56 --> [03][A8][F0][5C][56]
Array: 
Please help!
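(For reference, the same arithmetic is easy to check in Java with BigInteger. This is only a sketch to verify the expected values, not a LabVIEW solution; the modulo divisor below is a placeholder, since the post does not say which divisor is needed.)

import java.math.BigInteger;

public class BigPowerDemo {
    public static void main(String[] args) {
        BigInteger base = BigInteger.valueOf(982451653L);

        // 1) 982451653^15 -- a 135-digit result, far too big for a double-based Power of X
        BigInteger power = base.pow(15);
        System.out.println("982451653^15 = " + power);

        // 2) modulo of that result; 1000000007 is only a placeholder divisor
        BigInteger divisor = BigInteger.valueOf(1000000007L);
        System.out.println("mod = " + power.mod(divisor));

        // 3) hex digits grouped two at a time from the end:
        //    3A8F05C5  -> [3A][8F][05][C5]
        //    3A8F05C56 -> pad to 03A8F05C56 -> [03][A8][F0][5C][56]
        String hex = base.toString(16).toUpperCase();
        if (hex.length() % 2 != 0) {
            hex = "0" + hex;                  // pad in front so pairs line up from the end
        }
        StringBuilder grouped = new StringBuilder();
        for (int i = 0; i < hex.length(); i += 2) {
            grouped.append('[').append(hex, i, i + 2).append(']');
        }
        System.out.println(grouped);          // [3A][8F][05][C5]
    }
}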
 

Just for "fun", I decided to take my own suggestion and write a Big Number project to handle Addition and Multiplication of Arbitrarily-Long Integers.  I built in "sign" handling for Multiplication, but (in the interest of getting a "testable") I currently only support non-negative Addition (and no Subtraction, yet -- it should be a fairly easy, and you'll forgive the accidental pun, Add-On).  The Project has 11 sub-VIs, including Product, Sum, and Power, plus one designed for output called "Big Number String" (currently only a Decimal string is supported).  I was not necessarily "coding for speed of execution", but rather for clarity of operation and ease of "proving that this works".
I tried it out on your problem.  I got out a 135-digit decimal number that appears to match what you posted as the Correct Answer (it starts with 76677477 ... and ends with ...35294157).  It executes in about 20 milliseconds.
Just for fun, I also coded up a computation of 10000! (after reading Altenbach's post).  I was not aware of the Factorial Challenge, and haven't looked at the Post he cited, so am unsure how my algorithm compares with the 100-millisecond champ.  I'm definitely slower -- about 37 seconds -- and while I didn't print out the result, I got 35,660 digits, one more than what is noted in Christian's Post.  However, you can Google "Factorial 10000" and find its value posted on the Web -- my answer agrees with the posted value for the first (most-significant) 20-or-so digits that I compared.
For the time being, I'm going to skip over how to convert this monster decimal string representation of a number to a hex representation -- my suspicion is that it will be easier to write a Hex Package to do the same calculation (and to define an inherently Hexy format to store the arbitrary-precision number) than to try to write a direct Conversion routine.  I'll leave this task (as well as creating one's own Big Number Project) as an "Exercise for the Reader".  Consider this an Existence Proof.
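At its heart, though, the Project is just schoolbook arithmetic on arrays of digits.  Here is a rough sketch of the Multiplication step in Java (textual, since I can't paste LabVIEW diagrams here; it illustrates the idea only and is not the actual sub-VIs):

/** Schoolbook multiplication on digit arrays -- illustration only. */
public class DigitArrayDemo {
    // Digits are stored least-significant first, e.g. 123 -> {3, 2, 1}.
    static int[] multiply(int[] a, int[] b) {
        int[] result = new int[a.length + b.length];   // enough room for the product
        for (int i = 0; i < a.length; i++) {
            int carry = 0;
            for (int j = 0; j < b.length; j++) {
                int t = result[i + j] + a[i] * b[j] + carry;
                result[i + j] = t % 10;
                carry = t / 10;
            }
            result[i + b.length] += carry;
        }
        return result;   // may include leading zeros at the high end
    }

    public static void main(String[] args) {
        int[] a = {3, 2, 1};              // 123
        int[] b = {6, 5, 4};              // 456
        int[] p = multiply(a, b);         // expect 56088
        StringBuilder s = new StringBuilder();
        for (int i = p.length - 1; i >= 0; i--) s.append(p[i]);
        System.out.println(s);            // prints 056088 (note the leading zero)
    }
}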
Bob Schor
 

Similar Messages

  • Store Very Large Numbers

    Hello,
    I am trying to find some info about how to store very large numbers, with, say, 2000 digits or even more, and do calculations on them.
    Does anyone have any info or links about this?
    Thanks.

    BigDecimal speaking. What can I do for you?
    :)
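    For whole numbers of that size, java.math.BigInteger handles arbitrary precision out of the box (BigDecimal if you need fractional values). A minimal sketch:

    import java.math.BigInteger;

    public class TwoThousandDigits {
        public static void main(String[] args) {
            // A 2001-digit number: 10^2000
            BigInteger big = BigInteger.TEN.pow(2000);

            // Ordinary arithmetic works regardless of size
            BigInteger sum = big.add(BigInteger.valueOf(12345));
            BigInteger product = big.multiply(big);          // 10^4000, i.e. 4001 digits

            System.out.println("digits in sum:     " + sum.toString().length());
            System.out.println("digits in product: " + product.toString().length());
        }
    }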

  • Very Large Numbers Question

    I am a student with a question about how Java handles very large numbers. Regarding this from our teacher: "...the program produces values that are larger than Java can represent and the obvious way to test their size does not work. That means that a test that uses >= rather than < won't work properly, and you will have to devise something else..." I am wondering about the semantics of that statement.
    Does Java "know" the number in order to use it in other types of mathematical expressions, or does Java "see" the value only as gibberish?
    I am waiting on a response from the teacher on whether we are allowed to use BigInteger and the like, BTW. As the given program stands, double is used. Thanks for any help understanding this issue!

    You're gonna love this one...
    package forums;

    class IntegerOverflowTesterator {
      public static void main(String[] args) {
        int i = Integer.MAX_VALUE - 1;
        while (i > 0) {
          System.out.println("DEBUG: i=" + i);
          i++;   // wraps past Integer.MAX_VALUE to a negative value, which is what finally ends the loop
        }
      }
    }

    You also need to handle the negative case... and that gets nasty real fast... A positive plus/times a positive may overflow, but so might a negative plus a negative.
    This is a decent summary of the underlying problem: http://mindprod.com/jgloss/gotchas.html#OVERFLOW.
    The POSIX specification is also worth reading regarding floating-point arithmetic standards... Start here http://en.wikipedia.org/wiki/POSIX I guess... and I suppose the JLS might be worth a look too: http://java.sun.com/docs/books/jls/second_edition/html/typesValues.doc.html
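    To make the "devise something else" part concrete -- a sketch, assuming Java 8+ (Math.addExact / Math.multiplyExact) is allowed; otherwise the same check can be done by computing in long and comparing against the int range:

    import static java.lang.Math.*;

    public class OverflowCheckDemo {
        public static void main(String[] args) {
            int big = Integer.MAX_VALUE - 1;

            // Detect overflow explicitly instead of testing the wrapped result with >= or <
            try {
                int sum = addExact(big, 10);          // throws: positive + positive overflows
                System.out.println("sum = " + sum);
            } catch (ArithmeticException e) {
                System.out.println("addition overflowed");
            }

            // The same idea covers the negative case (negative + negative can overflow too)
            try {
                int product = multiplyExact(Integer.MIN_VALUE / 2, 3);
                System.out.println("product = " + product);
            } catch (ArithmeticException e) {
                System.out.println("multiplication overflowed");
            }
        }
    }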

  • Ipod touch 4th, running OS5, boots up with very large icons, impossible to navigate, how to get back standard sized homepage?

    Ipod touch 4th, running OS5, boots up with very large icons, impossible to navigate, need to return to standard sized homescreen?

    Triple-click the Home button and then go to Settings>General>Accessibility and turn Zoom off. If problems persist, see:
    iPhone: Configuring accessibility features (including VoiceOver and Zoom)

  • JRockit for applications with very large heaps

    I am using JRockit for an application that acts as an in-memory database storing a large amount of data in RAM (50 GB). Out of the box we got about a 25% performance increase compared to the HotSpot JVM (great work guys). Once the server starts up, almost all of the objects will be stored in the old generation and a smaller number will be stored in the nursery. The operation that we are trying to optimize needs to visit basically every object in RAM, and we want to optimize for throughput (total time to run this operation, not worrying about GC pauses). Currently we are using huge pages, -XXaggressive and -XX:+UseCallProfiling. We are giving the application 50 GB of RAM for both the max and min heap size. I tried adjusting the TLA size to be larger, which seemed to degrade performance. I also tried a few other GC schemes, including singlepar, which also had negative effects (currently we are using the default, which optimizes for throughput).
    I used the JRMC to profile the operation and here were the results that I thought were interesting:
    Live set: 30%
    Heap fragmentation: 2.5%
    GC pause time (average): 600 ms
    GC pause time (max): 2.5 s
    It had to do 4 young generation collects, which were very fast, and then 2 old generation collects, which were each about 2.5 s (the entire operation takes 45 s).
    For the long old generation collects, about 50% of the time was spent in mark and 50% in sweep. When you get down to sub-level 2, 1.3 seconds were spent in objects and 1.1 seconds in external compaction.
    Heap usage: Although 50GB is committed it is fluctuating between 32GB and 20GB of heap usage. To give you an idea of what is stored in the heap about 50% of the heap is char[] and another 20% are int[] and long[].
    My question is: are there any other flags that I could try that might help improve performance, or is there anything I should be looking at more closely in JRMC to help tune this application? Are there any specific tips for applications with large heaps? We can also assume that memory could be doubled or even tripled if that would improve performance, but we noticed that larger heaps did not always improve performance.
    Thanks in advance for any help you can provide.

    Any suggestions for using JRockit with very large heaps?

  • Any suggestions on calculating with very large or small numbers?

    It seems that double values are about 17 decimal places (10e17) in precision.
    Is there a way in iPhone calculations to get more precision for very large and small numbers, like 10e80 and so forth? I know that's more than the entire number of atoms in the universe, but still.
    I tried "long double" but that didn't seem to make any difference.
    Just a limitation?
    Thanks,
    doug

    Hmmm... maybe I was just having a problem with my formatted string then?
    I was using the NSString %g format, which is supposed to print in exponential notation if the number is greater than 1e4 or less than 1e-4, or something like that.
    But I was not getting any exponents greater than 1e17, and then I was apparently getting overflows because the numbers were coming out with negative mantissas.
    All the variables involved were double...
    How did you "look at" z?
    Thanks,
    doug
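    For what it's worth, double on the iPhone is the same IEEE 754 64-bit format as everywhere else, so its range reaches roughly 1.8e308; the ~15-17 significant digits are the real limit, not the exponent. A quick sketch of the same behaviour in Java, which uses the identical double format:

    public class DoubleRangeDemo {
        public static void main(String[] args) {
            double z = 1e80;                       // well within double's range (~1.8e308)
            System.out.printf("%g%n", z);          // prints 1.00000e+80
            System.out.printf("%g%n", z * 1e200);  // 1.00000e+280, still finite
            System.out.printf("%g%n", z * 1e300);  // overflows to Infinity

            // Precision, not range, is the real limit: ~15-17 significant digits
            System.out.println(0.1 + 0.2);         // 0.30000000000000004
        }
    }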

  • Maximum value of a sequence and viewing very large numbers

    Hi,
    We have a table which has a column populated using a sequence - it is maintained by code in OWB but growing at a larger rate than expected.
    E.g. the sequence starts at, say, 10000000000; we would expect the next row to be created with 10000000001, but there is a large gap in between.
    We have a separate ticket raised with Oracle for this, as it is maintained via dimension operator code.
    However, we have a couple of questions regarding sequences.
    1) How large can they be?
    2) If we query numbers which are over 14 digits long, the tool we are using (PL/SQL Developer) starts to show e. Is there a way to ensure we can see the whole number (SQL*Plus?), and
    I'm assuming that if we had a number column which is very large, say 18 digits long, joined to another table 18 digits long, then there are no issues?
    Thanks

    1. You really should read the docs.
    "E.g. the sequence starts at say 10000000000, would expect the next row to be created with 10000000001 but large gap in between." Not necessarily. An Oracle sequence guarantees uniqueness, not continuity.
    There can be gaps by design, especially if the CACHE of the sequence is big.
    "Have a separate ticket raised with Oracle for this as maintained via dimension operator code." You really should read the docs before creating tickets.
    "1) How large can they be?" As large as the NUMBER datatype can be - 38 digits.
    I am wrong. The doc says 28 digits.
    "2) If we try to query numbers which are over 14 digits long it starts to show e in the tool we are using (PL/SQL Developer). Is there a way to ensure we can see the whole number (SQL*Plus?)" What is printed on screen is a matter of formatting. You can choose the formatting you like in SQL*Plus and SQL Developer.
    "I'm assuming that if we had a number column which is very large, say 18 digits long, joined to another table 18 digits long, then no issues?" No issues.
    Edited by: Mark Malakanov (user11181920) on Apr 10, 2013 2:06 PM

  • Best data structure for dealing with very large CSV files

    Hi, I'm writing an object that stores data from a very large CSV file. The idea being that you initialize the object with the CSV file, and then it has lots of methods to make manipulating and working with the CSV file simpler. Operations like copy column, eliminate rows, perform some equations on all values in a certain column, etc. Also a method for printing back to a file.
    However, the CSV files will probably be in the 10 MB range, maybe larger, so simply loading into an array isn't possible, as it produces an OutOfMemoryError.
    Does anyone have a data structure they could recommend that can store the large amounts of data required and is easily writable? I've currently been using a RandomAccessFile, but it is awkward to write to, as well as needing an external file which would need to be cleaned up after the object is removed (something very hard to guarantee occurs).
    any suggestions would be greatly apprechiated.
    Message was edited by:
    ninjarob

    How much internal storage ("RAM") is in the computer where your program should run? I think I have 640 MB in mine, and I can't believe loading 10 MB of data would be prohibitive, not even if the size doubles when the data comes into Java variables.
    If the data size turns out to be prohibitive of loading into memory, how about a relational database?
    Another thing you may want to consider is more object-oriented (in the sense of domain-oriented) analysis and design. If the data is concerned with real-life things (persons, projects, monsters, whatever), row and column operations may be fine for now, but future requirements could easily make you prefer something else (for example, a requirement to sort projects by budget or monsters by proximity to the hero).
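    To put a number on the point that 10 MB is not prohibitive, here is a sketch that simply loads the whole file into a List of row arrays (the file path and the column used are placeholders); for a file that size it typically needs well under 100 MB of heap:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    public class CsvInMemory {
        public static void main(String[] args) throws IOException {
            List<String[]> rows = new ArrayList<>();

            // "data.csv" is a placeholder path; naive split, so no quoted commas
            try (BufferedReader in = new BufferedReader(new FileReader("data.csv"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    rows.add(line.split(",", -1));
                }
            }
            System.out.println("rows loaded: " + rows.size());

            // Column operations then become simple loops over the list, e.g. summing
            // the third column (assumes it is numeric and there is no header row)
            double total = 0;
            for (String[] row : rows) {
                total += Double.parseDouble(row[2]);
            }
            System.out.println("sum of column 3: " + total);
        }
    }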

  • Problem in compilation with very large number of method parameters

    I have a Java file which I created using WSDL2Java. The actual WSDL has a complex type with a large number of elements (around 600) in it, so the resulting Java file (from WSDL2Java) has a method that takes 600 parameters of various types. When I try to compile it using javac at the command prompt, it says "Too many parameters" and doesn't compile. The same file compiles successfully using JBuilder X. The only way I could compile successfully at the command prompt is by reducing the number of parameters to around 250, but unfortunately that is not a workable solution. Does Sun specify any upper bound on the number of parameters that can be passed to a method?

    "... a method that takes 600 parameters ..." Not compatible with the spec; see Method Descriptors.
    "When I try to compile it using javac at command prompt, it says 'Too many parameters' and doesn't compile." As it should.
    "The same is compiling successfully using JBuilder X." If JBuilder produces a class file, that class file may very well be invalid.
    "The only way I could compile successfully at command prompt is by reducing the number of parameters to around 250." Which is what the spec says.
    "But unfortunately that's not a workable solution." Pass an array of objects - an array is just one object.
    "Does Sun specify any upper bound on number of parameters that can be passed to a method?" Yes.
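    A sketch of the "pass an array of objects" suggestion, using hypothetical names (the real generated types and element names come from the WSDL): instead of one method with 600 parameters, pass a single bean or array that carries the values.

    // Hypothetical replacement for a generated method with ~600 parameters.
    public class WideRequestDemo {

        static class ComplexRequest {
            // one field per WSDL element instead of one parameter each (placeholders)
            String element1;
            int element2;
            // ... etc. ...
        }

        // One parameter, regardless of how many elements the complex type has
        static void invoke(ComplexRequest request) {
            System.out.println("element1 = " + request.element1);
        }

        // The array variant mentioned above: still a single parameter
        static void invoke(Object[] elements) {
            System.out.println("element count = " + elements.length);
        }

        public static void main(String[] args) {
            ComplexRequest r = new ComplexRequest();
            r.element1 = "value";
            r.element2 = 42;
            invoke(r);
            invoke(new Object[] {"value", 42});
        }
    }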

  • Need help with "Very large content bounds" error...

    Hey guys,
    I've been having an issue with Adobe Muse [V7.0, Build 314, CL 778523] - one of the widgets I tried from the Exchange library seemed to bug out and created a large content box.
    This resulted in this error:
    Assert: "Very large content bounds (W:532767.1999999997 H:147446.49743999972) detected in BoxUtils::childBounds"
    Does anyone know how I could fix this issue?

    Hi there -
    Your file has been repaired and emailed back to you. Please let us know if you run into any other issues.
    Thanks!
    -Sam

  • Working with VERY LARGE tables - is it possible to bypass row counting?

    Hello!
    For working with large result sets ADF provides the `Range Paging` mechanism for views, described in section 27.1.5 of the Developer’s Guide For Forms/4GL Developers.
    It works well, but by default it counts the total row count to allow paging. In some cases the query `select count(1) from (SELECT ...)...` can take a very, very long time.
    But if a view object doesn't know the row count (for example, we can override the getEstimatedRowCount() method), the paging controls don't appear in the user interface.
    Meanwhile, I suggest that it should be possible to display two paging links - Prev and Next - without knowing the row count. Is there a way to do it?
    Thanks in advance,
    Ilya Rodionov.

    Hi Ilya,
    while you wait for Frank to dig up the right sample, you can read this thread:
    Re: ADF BC: Performance issue with getEstimatedRowCount (ER?)
    There we discuss the exact issue.
    Timo

  • Bridge CS4 with very large image archves

    I have just catalogued 420,000 images from 3TB of my photographic collection. For those interested, the master cache files became quite large and took about 9 days of continuous processing:
    cache size: 140 gb
    file count: 991,000
    folder count: 3000
    All cache files were also exported to the disk directories.
    My primary intent was to use the exported cache files as a "quick" browsing mechanism with Bridge. Of course, "quick" is a rather optimistic word; however, it is very significantly faster than having Bridge rebuild caches as needed.
    I am now trying to decide if it is worth keeping the master bridge cache because of the limitations of the bridge implementation which is not very flexible as to where and when the master cache adds new temporary cache entries.
    Any suggestions as to the value of keeping the master cache would be appreciated. I don't really need key word or other rating systems since I presently use a simple external data base for this type for image location.
    I am also interested in knowing if the "500,000" entry cache limitation is real - or if more than 500,000 images can be stored in the master cache since I will be exceeding this image count next year.

    I have a Bridge 5 cache system with 600,000 images over 8 TB of networked disk.  I too use this to "speed up" the browsing process and rely primarily on key word processing to group images.  The metadata indexing is, for practical purposes, totally useless (it never ceases to amaze me why Adobe thinks it useful for me to know how many images were taken with a lens focal length of 73mm - or some other equally useless statistic).  The one serious missing keyword-indexing feature I can think of is the ability to have a key word associated with a directory.  For example, I have shot many dozens of dance, theatre, music and sports productions - it would be much more useful to catalogue a directory that has the key words "Theatre" and "Romeo and Juliette" than to attempt to key-word each individual image.   It is, of course, possible to work around the restrictions but that is very unclean and certainly less than desirable.   Key-wording a project (i.e. a directory) is a totally different kettle of fish than key-wording an image.  I also find the concept of the "collection" very useful and well implemented.
    I do maintain a complete cache build of my system.  It is spread over two master caches, one for the first 400,000 images and a second for the next 400,000 (I want to stay within the 500,000 cache size limit - it is probably associated with the MySQL component of Bridge and I think it may have problems if you exceed the limit by a substantial amount.  With Bridge on CS3, when the limit was exceeded, the cache system self-destructed and I had to rebuild).
    The only thing I can think of (and that seems to be part of Adobe's design) is that Bridge will rebuild the master cache for a working directory "for no apparent reason", such as when certain (unknown) changes are made to ACR.  Other automatic rebuilds have been reported by others, however Adobe does not comment upon when or what causes a rebuild.  Of course, this has a serious impact upon getting work done - it is a bloody pain to have Bridge suddenly process 1500 thumbs and preview extracts simply to keep the master cache completely and perfectly synchronized (in terms of image quality) with what might be displayed if you happen to want to load a raw image into Photoshop.  This strategy is IMHO completely out of step with how (at least I) use the browsing features of Bridge.
    It may be of concern that Adobe may, for design reasons, change the format of the directory cache files and you will have to completely rebuild all master and directory caches yet again - which is a real problem if you have many hundreds of thousands of images.  This happened when the cache system changed from CS3 to CS4 - and Adobe did not provide a conversion programme to migrate the old to the new format.  This significantly adds to the rebuild time since each raw image must be completely reprocessed.  My current rebuild of the master cache has taken over two elapsed weeks of continuous running.
    It would be nice if Adobe would allow some control over what is recorded in the master cache - for example, "do you wish meta data to be indexed".
    (( As an aside, Adobe does not comment upon why using Bridge to import images from a CF card results in building a .xmp file with nothing but metadata for each raw file.  I am at a loss to speculate what really useful thing results, other than maybe speeding up the processing of the (IMHO useless) aspects of metadata. ))
    To answer your question, I do think the master cache is worth keeping - and we can pray that Adobe puts more thought into why the master cache exists and who uses the present type of information indexed within the cache.

  • Very large numbers on screen

    The numbers on my phone at times become so large that they won't all fit on the screen.  The only thing I can do is turn it off and back on for them to be a normal size again.  Sometimes it is almost impossible to turn it off because the area you have to swipe isn't showing on the screen. This has started happening pretty often.  Anything I can do to stop this?

    Turn off Zoom.
    General - Accessibility - Zoom - OFF

  • Problem with really large numbers

    I'm quite new to Java and I'm having a little trouble. I need to enter a REALLY large number into a long variable (long x = 0xAA55AA0000000000). However, Java is giving me the error "integer number too large". Does anyone know why this is happening or how I can get round it?

    Indeed, to quote a friend: 0 is a small integer, infinity is a large integer, and all other integers are medium-sized.
    "I tried arguing that infinity is not an integer but she only gave me a strange look... 'OF COURSE infinity is an integer!'" I have had this argument with people as well; the fact is that infinity is not even a number, it is a concept of endlessness.
    If infinity were a number then it must have a value. By giving it a value you are fixing its "size", as it were, and then infinity + 1 would be greater than infinity, which is impossible.
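    As for the compile error itself: a literal without an L suffix is an int literal, which tops out at 0x7FFFFFFF. A quick sketch of the fix:

    public class LongLiteralDemo {
        public static void main(String[] args) {
            // long x = 0xAA55AA0000000000;   // error: integer number too large (int literal)
            long x = 0xAA55AA0000000000L;     // the L suffix makes it a long literal
            System.out.printf("%X%n", x);     // prints AA55AA0000000000
        }
    }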

  • Since getting version 22 (release update channel) I had a few days with very large text; today, Firefox will not start. How to get it running again???

    I've searched for an .exe file in Firefox program files, but there doesn't seem to be one anywhere. I'd like to uninstall, download a new program and reinstall, but I'd rather not lose bookmarks and other settings.
    Running Windows 7 on a Sony Vaio laptop. Any suggestions? Thanks in advance.

    Certain Firefox problems can be solved by performing a ''Clean reinstall''. This means you remove the Firefox program files and then reinstall Firefox. You WILL NOT lose any bookmarks, history, or settings. Please follow these steps:
    '''Note:''' You might want to print these steps or view them in another browser.
    #Download the latest Desktop version of Firefox from http://www.mozilla.org and save the setup file to your computer.
    #After the download finishes, close all Firefox windows (click Exit from the Firefox or File menu).
    #Delete the Firefox installation folder, which is located in one of these locations, by default:
    #*'''Windows:'''
    #**C:\Program Files\Mozilla Firefox
    #**C:\Program Files (x86)\Mozilla Firefox
    #*'''Mac:''' Delete Firefox from the Applications folder.
    #*'''Linux:''' If you installed Firefox with the distro-based package manager, you should use the same way to uninstall it - see [[Installing Firefox on Linux]]. If you downloaded and installed the binary package from the [http://www.mozilla.org/firefox#desktop Firefox download page], simply remove the folder ''firefox'' in your home directory.
    #Now, go ahead and reinstall Firefox:
    ##Double-click the downloaded installation file and go through the steps of the installation wizard.
    ##Once the wizard is finished, choose to directly open Firefox after clicking the Finish button.
    Please report back to see if this helped you!
