Which group of programming languages to master?

Which group of programming languages should I master to tackle a diverse set of problems? I would like this list to be small, so that I don't end up learning too many languages without having a good grasp of any of them. The purpose of this is so I can approach any problem and pick the right tool for the job.
Right now I'm thinking about C and Fortran for computational problems requiring speed (Fortran because I'm a science student), Python for quick scripting and for when development time is the most crucial factor, and Haskell as a functional middle ground. I appreciate any feedback.
Last edited by Daedalus1 (2011-11-17 08:35:08)

Calmatory wrote:Languages are merely tools to get things done. It is much more important to be able to think in an abstract manner to solve problems without creating more of them. Programming is the easy part, program design is the hard part.
Yes, I know that. I guess the question I should have asked is which categories of languages are appropriate to which problems? Then I could pick one of each category and learn them.
Basu wrote:
I mostly second what austin.rbn says, but with some additions:
1. You should learn C because it will make you learn how the machine actually works. Do not conflate C with C++, they are pretty different languages.
2. You should learn a UNIX-y scripting language -- one of Perl/Python/Ruby for quick prototyping, string mangling, web dev, etc.
3. You should learn an enterprise-y object oriented programming language such as C++, C# or Java
4. You should learn a strongly typed functional programming language, such as ML, OCaml or Haskell, to better understand the mathematical basis of computer science
5. You should learn a Lisp dialect such as Scheme or Common Lisp (I would recommend the Racket implementation of Scheme).
Let me know if you have any questions.
Although pling may disagree with some of the reasons, these seem like sensible categories. What are some of the advantages of learning a Lisp, and in which domains would it have an advantage over other languages?
bsilbaugh wrote:
You may also want to ask yourself if you will ever need to run your codes in a distributed computing environment, e.g. a Linux cluster. If yes, then there is a high probability that you'll need to learn MPI. I know that most MPI implementations provide Fortran, C, and C++ APIs, and there are also some Python wrappers out there as well (e.g. mpi4py).
I should also point out that most Fortran, C, and C++ compilers support OpenMP out of the box. This means that a code that was initially written in Fortran, C, or C++ can be modified (usually rather easily) to take advantage of multi-core processors.
As you might be aware, GPGPU computing is becoming more popular within the scientific community. Most of the GPGPU languages are minor extensions of the C language (and in some cases just additional C libraries). So, I suppose if you start out developing in C, then learning one of the GPGPU languages would seem like a natural extension of what you already know.
(As others have alluded, you may want to view all of this as a never-ending evolutionary process. That is, don't worry about finding some panacea for all your future programming needs today. Focus on learning the principles, keep an open mind, and be ready to adapt to an ever-changing environment. Whatever answers you get today, they will likely be very different 10 years from now.)
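For concreteness, here is a minimal mpi4py sketch of a distributed "hello world" with one collective operation. It assumes mpi4py is installed and that the script is launched with something like mpiexec -n 4 python hello_mpi.py:

    from mpi4py import MPI  # Python wrapper around the system MPI library

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()  # this process's id, 0 .. size-1
    size = comm.Get_size()  # total number of processes

    print("Hello from rank %d of %d" % (rank, size))

    # Collective operation: sum every rank's value onto rank 0.
    total = comm.reduce(rank, op=MPI.SUM, root=0)
    if rank == 0:
        print("Sum of all ranks:", total)

The same structure carries over to the C API, since mpi4py mostly mirrors the underlying MPI calls one-to-one.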
Yes, I will eventually need to work with distributed computing systems and maybe GPU programming such as CUDA. For this purpose, where speed is critical, I would need to have C mastered. However, I'm very used to the OOP methodology of Java and C++ and haven't been able to wrap my head around doing things otherwise. So, as pling mentioned, I need to be able to do procedural algorithms well first, and I think I need to learn C from the ground up. What I know of C, I've learned through C++. What introductory book would you recommend for this?
pling wrote:Do what a pro would do and code in a highly productive scripting language until you hit a performance problem, then write C code for that limited area. Rather than learning lots of languages badly, learn the fundamentals of procedural programming well - get books like Code Complete, The Pragmatic Programmer, Programming Pearls, etc. and work through them.
Are C and Python sufficient for general-purpose programming? Are there any cases where it is difficult to get the resulting amalgamated program to run smoothly? (just a wild guess)
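For illustration, here is the kind of amalgamation I mean: Python driving a small C routine through ctypes. The library and function names (libhot.so, dot) are invented for the example; the C side would expose double dot(double *x, double *y, int n) and be compiled with something like gcc -O2 -shared -fPIC -o libhot.so hot.c.

    import ctypes

    # Load the hand-written C routine (hypothetical library built from hot.c).
    lib = ctypes.CDLL("./libhot.so")
    lib.dot.restype = ctypes.c_double
    lib.dot.argtypes = [ctypes.POINTER(ctypes.c_double),
                        ctypes.POINTER(ctypes.c_double),
                        ctypes.c_int]

    def dot(xs, ys):
        """Dot product, delegated to C for speed."""
        n = len(xs)
        Array = ctypes.c_double * n  # a C array type of length n
        return lib.dot(Array(*xs), Array(*ys), n)

    print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # prints 32.0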
Last edited by Daedalus1 (2011-11-23 07:35:01)

Similar Messages

  • How to know if Firefox is using a master password, from some programming language?

    I am working on a security check project, and I need to know, from a programming language or by opening some file, whether Firefox is using a master password.

    Generally speaking, there is nothing in the HTTP request to indicate to a server the type of window the request originates from. So you will have to manage it at the application level.

  • Which alternative programming language do you suggest?

    Apart from LabVIEW, which we all like, which other programming language do you suggest, and why?
    (as a second option, for making projects)
    We make electronic devices for industrial control (measurement, monitoring, data logging, ...).
    As I cannot make up my mind, any comments are useful.

    As has been said before, that's a little too open-ended a question. However, we can give you some insight to answer it for yourself. I think languages such as C are what assembly language was to C a few years ago. Many higher-level languages have been written in C (including LabVIEW), but I wouldn't waste my time learning it unless you have a lot of time and need to develop algorithms that work under the hood in a DLL and such. If you want to invest heavily (and I mean heavily) in learning C, learn C++ instead, since it is the industry standard and is object-oriented. If you want a higher-level, text-based, useful, easily learned and cross-platform object-oriented language with quick development time, I would go with Python. Python is not the best-performing language out there, but it is powerful, compiles on the fly, can be used with LV, and there is a huge open source community supporting it. And Python is used just about everywhere. I started learning it myself just a few months ago.
    PaulG.
    "I enjoy talking to you. Your mind appeals to me. It resembles my own mind except that you happen to be insane." -- George Orwell

  • Which programming language

    I'm considering learning a programming language and writing some small and easy (database-related) applications on OS X.
    1. Which programming language should I be learning?
    2. Is there a list highlighting programming languages, their advantages and disadvantages and learning difficulty?
    Thanks in advance for your help!

    The choice of a first programming language often decides whether or not you are able to successfully learn to program. If it's too hard, you will give up; if it's too simple, you could have problems later on.
    In recent years, Java has been the favorite in universities, presumably because of its wide use and easy readability. The other popular choice is C++. When I went to college, we only had C.
    But none of these are great choices if you truly have never written a line of code before in your life. You would probably do better to start with an interpreted ("scripting") language, like PHP or Python, even if only for a couple months until you are comfortable with the basic programming concepts. Then you can move on to a compiled language.
    For development on the Mac platform, Objective-C is the obvious choice. It is a very simple language to get started with, and there are a lot of fun things that you can do with it. This should keep you interested enough to get you through the difficult times. And, of course, we are here for you. Good luck.

  • Which programming language do we use for iOS apps?

    Hello,
    I would like to know which programming language professional iOS app developers use for apps that work with an integrated database. I thought first of C#, but recently Ruby has come to my attention as a good app development language.
    Thank you very much.

    Since you mentioned Ruby, there is the RubyMotion toolchain, which can be used to develop applications for iOS, OS X, and Android.  It isn't free, but then again you won't have to use Objective-C or Swift (or even Xcode for that matter).  You will still need to be able to at least read Objective-C/Swift/Java/whatever, since that is what the various API documentation is written for.

  • The programming language of the future: a philosophical question

    So after reading some of these posts and reflecting on the history of programming languages, I find myself wondering what the programming language of the future will be. As the human-computer interface evolves (touch interfaces, voice recognition, computer vision, etc.), how will this influence the process of creating computer programs? Will programming become more graphical (e.g. flow charts, UML diagrams, etc.)? Setting egos aside, does it make sense for programming to be more graphical? Will we reach a point where we can essentially just sketch out a concept and let the compiler/interpreter handle the rest?
    Perhaps this question is best left to Clarke and Asimov, but they're dead. This leaves you guys.
    (Moderator, I apologize in advance if this is out of scope for this forum.)
    Last edited by bsilbaugh (2011-11-23 01:15:47)

    geniuz wrote:I think the mistake of the scientist in question was not that he re-derived the Trapezoid Rule, but that he published it as if it were a new invention, while it already existed and had been published in some form. That could certainly have been prevented by checking against the literature before publishing. In fact, I'd not even blame the scientist in question too much, but rather the committee that approved his paper for publication in the first place.
    Sure, he didn't publish the paper in isolation - but the peers who reviewed it would have been people from the same or a related profession. My point is that this kind of error results from splintering of the disciplines. You seem to think that the splintering should be somehow undone or reversed at the level of the peer reviewer or the publisher - but non-specialists (or specialists in other areas) will not be invited to comment, because it will be assumed that they lack the qualifications to do so.
    geniuz wrote:Still, I think it's a long stretch to compare the derivation of a mathematical rule or method to the invention of a complete DSL.
    I think I didn't make it clear where I was going with that. My point was that splintering of specialisations from each other permitted this duplication of effort to take place. I see DSLs as a way to splinter disciplines from each other. Thus, the rise of DSLs would make it easier for duplicate effort to take place.
    geniuz wrote:The whole point of a DSL is that it allows experts in a specific domain to focus their efforts on creatively solving problems directly related to their field of expertise. When specific DSLs are integrated into university curricula world-wide, I hardly think there will be much duplicate effort going on.
    Sorry, but this seems a little naive to me. I can see why you might think DSLs would permit specialists to more efficiently focus on solving their particular problems, and I don't entirely disagree, but even if it's true, I don't think it's controversial to think this will lead to considerably more duplicate effort.
    geniuz wrote:Also, in the world of OSS, there are multiple tools that can perform the exact same job. Do you for instance consider the existence of both of the mail clients Mutt and Alpine as duplicate effort?
    I haven't used Alpine, but if it's exactly like Mutt, then sure, I would certainly say that's duplicate effort. The more people who use one single piece of software, the more bugs can be filed against that software.
    Also, I think this is a false analogy from the start. Mutt and Alpine are both written in C. If we're talking about the connection between duplicate effort and DSLs, let's consider the fact that "communicating and storing computational models of biological processes" (from the Wikipedia page on SBML) could be better done in Lisp, Lex/Yacc, or with the Lemon C++ library, instead of coming up with some new way of using harmful XML.
    geniuz wrote:I still don't quite see how this can be interpreted as a necessary weakness or argument against more complex systems. Sure, DSLs can be dependent upon lower level languages, but if they are considered to increase the effectiveness and efficiency of certain experts, what exactly stops them from becoming dominant and continually evolving?
    The fact that higher and higher levels of specialisation are concomitant with smaller and smaller user bases who still need to communicate with experts in other groups. I'm not saying DSLs should never be used. I just think they are essentially self-limiting, and just as bacteria will long outlive us more complicated life-forms, so too will the lower-level languages long outlive more cumbersome DSLs.
    geniuz wrote:I wasn't implying computers will ever be able to mimic the human brain, and I'm not even sure whether it is something we necessarily want to strive for. All I was saying is that computers have already become indispensable tools in virtually every scientific and engineering discipline. They are computationally strong machines able to solve numerically involved problems at rates no human can ever hope to accomplish. It is this very aspect that will continue to guarantee the success of computers, not AI per se. Again, I believe computers will never (at least not while I'm alive) be able to truly independently mimic and outperform the human brain, especially when it comes to aspects like creativity, i.e. the very aspects of human intelligence that scientists have not even been able to understand and quantify to date. Hence, humans will always remain "in the loop" to a large extent.
    This all seems reasonable.
    geniuz wrote:Don't forget that laws of physics are "laws" that have been defined and created by humankind for its own convenience. Even recently this year, practice has shown that a concept as fundamental as the speed of light might not be as accurate as it was so widely acknowledged by the scientific community. This however hasn't stopped mankind from using these fundamental "laws" to invent e.g. radio communication and electronic devices.
    I don't think you're disagreeing with me. I'm aware of the fragile nature of what we call the laws of physics - but like you say, they're good enough that we can do things with them. It looks like our knowledge of the speed of light breaks down on really large length scales, and it looks like our knowledge of gravity breaks down on really small length scales, but the brain is in the middle. In between, our models for how the physical world works are very accurate, and it is in this regime that the brain operates.
    Also, you should note that most scientists regard those neutrino test results as residing within the bounds of experimental error, and therefore not strongly indicative that the neutrinos really did break light speed.
    geniuz wrote:I haven't said low-level languages will be abandoned completely, I think they will remain to serve their purpose as a base upon which higher level languages (like DSLs) are built. In that sense, I believe that the user base of these low level languages will become more limited to computer scientists, i.e. to the people responsible for "formulating suitable abstractions to design and model complex systems" (source).
    Seems reasonable.
    geniuz wrote:Having said that, I still don't see why it is so far-fetched that, for the rest of the world, physically telling a computer what to do in their native tongue, as opposed to typing it in some generic text-based programming language, will become the de facto standard. Hence, I will reformulate my statement by stating that programming as most people know it today will eventually become a redundant practice.
    I don't think a friendly human interface should be considered the same as programming. This thread, if we recall the OP, is about programming languages of the future, not user interfaces of the future. I certainly agree that user interfaces will become more intelligent and attractive. I only disagree that this will have any strong impact on how we do programming.
    Last edited by /dev/zero (2011-12-12 19:20:59)

  • Problems with language dependent master data

    Hi,
    I created an InfoObject with language-dependent master data, and I am trying to upload data from a flat file.
    My flat file has a two-letter language code (EN, DE, FR, JP, ES), and when uploading the data it seems that SAP BW only uses the first character, which leads to EN and ES being treated as duplicate records.
    Any help would be appreciated
    thanks
    Ingo Hilgefort

    Hi,
    ES is the two-letter code for Spanish, but the flat file has to use the one-character SAP code, which is 'S' for Spanish.
    Check table T002 for the one-character symbols that represent the languages; those are what should be used in the flat file.
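    If the file has to be converted first, a small preprocessing script can do the mapping. Here is a sketch, assuming the language code sits in the first column of a CSV file; the mapping and the file names are assumptions to be checked against T002:

        import csv

        # Two-letter codes from the file -> one-character SAP language keys (verify in T002).
        ISO_TO_SAP = {"EN": "E", "DE": "D", "FR": "F", "JP": "J", "ES": "S"}

        with open("master_data.csv") as src, open("master_data_sap.csv", "w", newline="") as dst:
            reader = csv.reader(src)
            writer = csv.writer(dst)
            for row in reader:
                row[0] = ISO_TO_SAP.get(row[0], row[0])  # leave unknown codes untouched
                writer.writerow(row)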

  • Does the Java programming language use call by reference for objects?

    Does the Java programming language use call by reference for objects?

    Yes. You make calls to an object via its reference.
    No.
    Yes, you're referring to passing a reference into a method, in which case the value of the reference is passed.
    I believe the OP is using the term "call by reference" to mean "pass by reference." The two are interchangeable, AFAIK. So, while "making calls to an object via its reference" is correct, I don't believe it's germane to the question.
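    The distinction is easy to demonstrate. Python happens to have the same pass-by-value-of-reference semantics as Java, so a short Python sketch can stand in for the Java behaviour being described: mutating the passed-in object is visible to the caller, while rebinding the parameter is not.

        def mutate(xs):
            xs.append(4)    # mutates the object the reference points to

        def rebind(xs):
            xs = [9, 9, 9]  # rebinds the local copy of the reference; caller unaffected

        nums = [1, 2, 3]
        mutate(nums)
        print(nums)  # [1, 2, 3, 4] -- mutation through the reference is visible
        rebind(nums)
        print(nums)  # [1, 2, 3, 4] -- rebinding is not

    If Java (or Python) were truly call by reference, the second call would change what the caller's variable refers to; it doesn't.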

  • Choose a programming language

    I want a program that can:
    analyse http://www.youtube.com/browse?s=mr&t=&l=&e=en_US&p=* (*=1-5) every 10 minutes,
    then pick out the links that start with http://www.youtube.com/watch?v=,
    export their related information and links to a form that I want,
    sort them by their YouTube added time,
    and run on my webpage.
    So what programming language should I choose to write this program?

    Your program consists of two parts: querying the site at a given interval, and displaying the videos/information on your website. I think PHP would suit best for displaying the videos on your page, but if you have a Python/Ruby powered website, then use the one you built your website with. For the query part I would use Python and write a small script that gets triggered via a cron job at the interval you like. Instead of parsing the contents of the page itself, I would use the RSS feed provided by the YouTube page; there's a Python module named "feedparser" with which you can easily access the feed contents and extract the related information (shouldn't be more than maybe 50 lines of code), write it to a database (SQLite) in the form that you like, and retrieve the contents via your PHP script on your webpage. Or completely skip the timed query part and just check for new videos when someone visits your website.
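    A rough sketch of the query part (the feed URL is a placeholder, substitute the real YouTube feed; the table layout is just an example):

        import feedparser  # third-party module
        import sqlite3

        FEED_URL = "http://example.com/youtube-feed.rss"  # placeholder for the real feed

        feed = feedparser.parse(FEED_URL)

        db = sqlite3.connect("videos.db")
        db.execute("""CREATE TABLE IF NOT EXISTS videos
                      (link TEXT PRIMARY KEY, title TEXT, added TEXT)""")

        for entry in feed.entries:
            if entry.link.startswith("http://www.youtube.com/watch?v="):
                # INSERT OR IGNORE keeps re-runs from inserting duplicate rows.
                db.execute("INSERT OR IGNORE INTO videos VALUES (?, ?, ?)",
                           (entry.link, entry.title, entry.get("published", "")))

        db.commit()

    Trigger it every 10 minutes with a cron entry like */10 * * * * python /path/to/fetch_videos.py, and let the PHP page read videos.db ordered by the added column.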
    Last edited by chimeric (2008-01-19 11:12:56)

  • Large applications - LabVIEW and other programming languages

    Hello LabVIEW users,
    since this forum saw a very interesting thread about large applications programmed in LabVIEW
    (see: http://sine.ni.com/niforum/niforum?requireLogin=False&forumDU=http://forums.ni.com/ni/board/message?... ) I would like to ask the community about their experiences with LabVIEW applications in combination with other programming languages.
    In advance: I have several years of experience programming LabVIEW applications, from quick-and-dirty solutions which had to run within a few hours to complex test solutions. I saw LabVIEW grow and become better with each released version, and a lot of things I missed in former times have been implemented in the meantime. Currently I have to develop a complex ATE solution with numerous pieces of equipment to control and numerous data to be captured and archived. Despite version 8, I still feel some drawbacks of the LV concept which make me hesitate to build the solution completely in LabVIEW:
    1) It is always hard to re-use code from complex applications, since it is not possible to do any kind of global search-and-replace of functions, variables, etc. It is nearly impossible to re-use approved code structures (e.g. a state machine) if the "inner part" changes more than a little bit.
    2) If the application requires a certain flexibility (e.g. exchangeable test equipment from varying vendors), this is hard to implement, since you have to pass a lot of parameters through your hierarchy if you don't want to use global variables, which slow down your application and hide the code's functionality.
    3) Despite modern PCs, the look and feel of LV applications appears somewhat slow compared to other applications. For complex user interfaces, the polling method generates a lot of complex code. (I don't have experience with the event structure.)
    Now my questions:
    Do you have experience implementing complex solutions that divide the code modules between LabVIEW and other languages? Which other languages did you use? Why did you use those languages (speed, flexibility of text-based code, available library functions)? Did you find that this improved your development time and code maintainability?
    (I am considering a solution where I do the single tests with LabVIEW VIs but delegate all the test sequencing and data collection to Perl, which allows really very compact code.)
    I'm curious what your experiences are.
    rainercats

    Given that you're asking these questions in a forum for LabVIEW users, the opinions given are going to be somewhat slanted towards a general liking for LabVIEW. I've been working with LabVIEW for a long time, ever since 2.something on a Mac. I've written numerous large-scale applications as well as "mundane" instrument drivers. As you've noted, you're experienced with LabVIEW, so you know some of its strengths and weaknesses.
    To address your specific questions:
    (1) Yes, that has always been a limitation in LabVIEW, but I don't believe it is an overriding reason to choose C over LabVIEW. Putnam provided one workaround for the search and replace of VIs. Once you've programmed in LabVIEW long enough you get used to doing it this way. Is it clumsy? Yes. As for the re-use of code structures, that's not entirely true. You can create a "template" VI (a regular VI, not a .vit) that contains the code you want to re-use and place it in your palette with the "Merge VI" option set. That way you can select it from your functions palette, plop it down on your diagram, and you get the "template" VI's diagram placed right into your new VI.
    (2) This is not something that is specific to LabVIEW, as this exists with any programming language. It's not the language that limits you here, it's how you've designed your code. In a language like C++ you would go with classes. You can do the same thing in LabVIEW. IVI is another option (though not preferred by me).
    (3) As Putnam mentioned, you should be using the event structure.
    Other thoughts:
    The biggest strength I see in LabVIEW is that each VI is a miniature program, which makes development and debugging of functions a snap. With a language like C you have to write another program to call a function in order to debug it. The biggest weakness? I would say user interface. Yes, even with the event structure. Don't get me wrong, the event structure has vastly improved the way user interfaces and event handling in general are done in LabVIEW, but it simply doesn't measure up to a program written in C or VB. ActiveX, anyone? LabVIEW still doesn't do ActiveX properly in terms of actually getting the controls to work. Programming ActiveX controls is just plain aggravating, what with all the property nodes taking up so much diagram space.
    It certainly makes sense to use the best tools available to you to get the job done. In my recent projects I had to write software to run automated tests on some products my company made. The test modules were written in LabVIEW. The tests executive was a proprietary engine driven by a SQL Server database. I had to write a "wrapper" DLL that interfaced between the LabVIEW code and the executive since the executive hadn't been designed to call LabVIEW DLLs directly. This allowed us to use LabVIEW as the preferred language for developing the test modules and let the guys who were fiddling with the test executive do their bit. Of course, TestStand's premise is basically that.

  • Zebra programming language

    Instead of hand-writing Zebra Programming Language (ZPL), is there any other way to convert the Smart Form into ZPL?

    Follow these links, which will give you the details:
    [Zebra programming in detail|http://www.quad.de/ftp/data/zebra/zplmanual.pdf]
    [Zebra programming - part 1|ftp://ftp.tallygenicom.com/pub/genicom/docs/mobile/mtp4x/ZPL%20Vol1.pdf]
    [Zebra programming - part 2|http://www.servopack.de/support/zebra/ZPLmanual_two.pdf]
    These will give you details of:
    1. printer settings
    2. printer programming
    3. about ZPL
    4. label printing and barcode printing
    Hope this helps.
    regards,
    Arunprasad
    Edited by: arun prasad on Jul 18, 2008 6:00 AM

  • Is Java 100% Object oriented programming language?

    Hi,
    Is Java a 100% object-oriented programming language?
    If not, what is the reason?
    Thanks in advance

    The object-oriented paradigm says that everything in a program should be an object, i.e. abstract and encapsulated. Java does not completely satisfy this paradigm. The features that violate it are: 1. the existence of primitives, which are manipulated in an ordinary, non-object style; and 2. mathematical operators, which likewise are not handled as objects. These two characteristics are why Java is not 100% object-oriented. But here again Java wins by providing wrapper classes to work around this limitation, and since Java 1.5 you will feel freer still because of the autoboxing feature, which automatically converts objects to primitives and primitives to objects.
    Regards,
    Mohd.Rafi Ansari
    Jamia Hamdard
    MCA
    09213626363

  • Adobe reader programming language

    Hey, does anyone know the programming language used to write Adobe Reader? I want to work at Adobe, and I will learn that language. Which other languages would I need to learn to get into Adobe?

    Adobe Reader is written entirely in C++. I was preparing for Adobe's campus placement; its technical paper covered mostly C programming, data structures, sorting, and C++. If you don't know C++ or C, then the following are some good resources:
    http://www.cprogramming.com/
    http://ocw.mit.edu/
    http://www.miniwiki.tk/c-for-loop.htm/

  • What programming language to learn after ABAP?

    Hi All,
    Well, I am kind of getting bored with ABAP / Workflow! I was wondering what programming language I should learn to keep myself busy. What I am looking for is something new (it can be an extension for ABAP / SAP too) but something really new age. Maybe something like mobile apps for SAP. Maybe a new language altogether. But again, since SAP is my bread and butter, it would be nice if it could be used in some way or another in SAP too.
    Any ideas?
    P.S.: I have 8 years of SAP experience, so I am not a noob!
    Moderator Message - I am un-marking this post as a question.
    Message was edited by: Suhas Saha

    Well, yes, I agree that moving to a new module is a good career move. But I am not looking to make a career move! It's something just on the side.
    I know Web Dynpro. For BI I need access to a system! HANA, same thing. I am not independent in learning those; I need someone to set up the system/login/access etc., and that's not going to happen unless I sign up for a class!
    While I write this, my previous comment is still under moderation! So I am going to copy-paste it here:
    Thanks, guys!
    I am looking for something which I can learn independently, without having to go after Basis guys to download some SDK for me and install notes for me. That's one of the reasons learning anything new in SAP is a hassle. I guess BOBJ is out of the question for this reason.
    While HTML5 sounds interesting, won't I be better off learning Python? I know I can look it up, but if you know offhand, what's the benefit of HTML5, and what kind of independent work can I take up with it? I don't care about the money, but as long as it's interesting I am OK with it!
    I do have basic knowledge of C, so C++ / Objective-C may be a good move. Java I am not sure of, as its use has been declining of late (it's still number 1, but its market share is going down).

  • Java Programming Language Answers

    Hi,
    My business is to teach Java to legacy engineers. I'd like to use the book "The Java Programming Language" as one of my materials. This book contains a number of excellent exercises, but without answers. I found a web page of answers at http://java.sun.com/docs/books/javaprog/firstedition/answers/, but that page covers Chapters 1 to 14 and the appendix, while the book has Chapters 1 through 20. I would greatly appreciate it if someone could tell me where to find the answers for Chapters 15 to 20.
    Thanks and Regards,
    Shintaro Sekine
    [email protected]

    It should be no problem if you give them your own answers.
    If you can't come up with the right answers for a few of them, Mr. jverd et al. would help you on this forum.
