Best programming language to learn

Soldato
Joined
21 Oct 2002
Posts
18,022
Location
London & Singapore
Actually, the advantages of C and C++ are growing as more and more people are using more and more infrastructure.
Not convinced about that sentiment at all. Things are moving to the cloud, where distributed systems, resiliency and scaling/load distribution are more important factors than outright execution speed. C/C++ remains popular because it is still the ultimate low-level systems language, and because of the sheer vast quantities of legacy code.

The costs of a JIT-compiled language become less and less appealing as programs become more and more expensive to run, and things like garbage collection can be very disruptive.
To be clear, JIT does not imply GC too.

To be perfectly clear, JIT does not mean a program needs to be recompiled upon every execution. See Java's HotSpot JVM. See .NET's RyuJIT.

A friend of mine is porting a large server infrastructure from Java to C++ because, despite significant money spent on trying to tame the garbage collection, it was just getting too intrusive. C# was trialled and deemed worse for some reason, so everything is being ported to C++ for better control of resources.
It sounds like the wrong language was chosen for this particular project. Is it a high frequency trading system? Not surprising that Java was inappropriate. Even still, LMAX managed to tame Java just fine. There are plenty of tricks to better control the GC. Having said that, Java's lack of value types support was always going to mean it's more GC-heavy than, say, C#, because everything is allocated on the heap. Only primitive types in Java can be passed around by value.

What version of C# and CLR were you using? Both MSCLR and Mono have made massive performance improvements in the last 3 or 4 releases.

With C++ you introduce a whole new set of problems, by the way. You do gain outright control over performance. But writing performant code is still hard to do correctly, especially since C/C++ are highly imperative languages that encourage mutability by default.


The compilation issue for server-side systems is meaningless because you have the program compiled for the specific architecture. Why compile a program every time you want to use it, when nothing has changed since the last time you ran it?
Not true. See JVM HotSpot. .NET has NGen... and RyuJIT is coming.

You also wrote later on that you "don't have control over your clients' servers", so it appears you will need to compile your C/C++ for the lowest common denominator of instruction set support. Otherwise you'll have customers complaining that your software doesn't work on their server's CPU. You're contradicting yourself.

The architecture-specific optimisations of JIT are typically way overstated. The last time I saw a comparison (a few years ago now, so things might have changed) there was typically only a few percent difference, while using C++ instead of C# or Java can easily get you a 500% speed-up.
There are plenty of scenarios where C#/Java/F# hold their own against C++. It's swings and roundabouts. You can introduce mutability to win back performance in lots of situations, but it is only rarely actually required. It sounds like this particular server application has a rather large critical execution path that needs optimising, and there may be architectural problems with its design. But who knows from where I'm sitting.

And JIT suffers from the fact that it has very limited time to compile.
Ignoring HotSpot / NGen / RyuJIT... why is startup time an important metric for server software? Server software starts up and then just stays running. Within a short time all of the critical execution path will have been JIT'd, with only rarely used code paths remaining un-JIT'd.

Ideally you need a combined system: precompiled, but with some form of online compilation that can make limited modifications.
See HotSpot and RyuJIT.

The importance of C/C++ is supported by evidence, with C and C++ continuing to remain steady with a significant market share. C still has the largest market share, and C# has only 60% of the market share of C++.
Your argument was about performance; now it has switched to importance. Nobody denies C/C++ are important languages. Whether they are appropriate for new server-side projects is an open question.

Java continues to be strong and more or less equal to C, and that is largely bolstered by Android. Objective-C was more or less unheard of before the iPhone's release, but now sits at 12% or so, over 3x the share of C# for example. It would be nice to know how much of the Java coding is specifically for Android, and to remove that to see the desktop and server market share in isolation. I'm very confident that if you looked at the server and desktop markets then C++ would equal Java for market penetration.
If you are proposing the OP learn C/C++, fine. But he should be prepared that 99% of the projects he'll be offered are legacy code that he'll merely be fixing bugs in or eking more performance out of. A minority of C/C++ developers are offered greenfield projects, but you need to be a good developer with years of experience to be offered such opportunities.

And lastly, "you will find stability pretty much always trumps speed": yep, but C++ is more stable than many other languages.
Get out. Every single buffer overflow and null-pointer dereference in history just laughed you out of the room with that sentiment.
C/C++ are not "stable" (an odd term in this context, but I'll run with it) languages. They require an expert hand to keep things in check. One must practice extreme discipline to remain on the "rails", as it were. See the lack of immutability, and a form of type safety from an extremely old generation.

I've had a python server crash months after release because of the lack of type safety.
Because writing servers in Python, along with any dynamically typed language, is a stupid thing to do. Not because C/C++ is a better alternative. A single bad keystroke on a C/C++ program and you'll be crashing hard too.

And it is dubious to call anything that uses garbage collection stable - you can't guarantee when the GC kicks in and your server stutters.
You can take out GC locks to prevent a GC occurring at a critical time. There is also the concurrent GC that Java and .NET both have, which further minimises the likelihood of even needing a GC lock.

But come on, how many server-side systems are doing time-sensitive work like that? It must be 1%. And those are not likely to be easy projects for a new developer to be on; that sort of project is for veterans.

We also have a C# server that falls over regularly due to a stack corruption in the CLR caused by a different C# process (this is .NET 1.1, but we have no control over clients' servers).
1.1 is ancient. Not much else to say on that.
 
Soldato
Joined
21 Oct 2002
Posts
18,022
Location
London & Singapore
As an FYI, it seems Apple are going to be pushing their new language, Swift, to replace Obj-C.

The advantages of C and C++ are becoming less and less as time progresses; you will find stability pretty much always trumps speed. But regardless of that, the benefits of JIT compilation can sometimes result in a faster managed application (e.g. C#) than native, because it is able to make use of instruction sets supported on the host, whereas C/C++ needs to be pre-compiled for the most widely supported sets.

This is a good post. D.P.'s is far less so.

Multi-paradigm "functional-OO" hybrid languages are all the rage at the moment, for good reason. Swift, Scala, F#, OCaml. OP should learn one of these. They are actually easier to learn than a legacy language, too, because they deal in higher-level, almost "human-like" concepts, whereas traditional imperative languages are still only a step or two above the CPU's instruction set.
 
Caporegime
Joined
18 Oct 2002
Posts
29,491
Location
Back in East London
Horses for courses. C/C++ aren't going away anytime soon. Not while there are intensive algorithmic requirements around, for which there is increasing demand. I'm not even talking about games, either. Scientific simulations, weather prediction and risk analysis for finance are some examples of industries/markets that demand very fast number crunching.

edit: Though just to be clear, I'm not advocating them as I think they are horrible languages :E
 
Caporegime
Joined
18 Oct 2002
Posts
32,618

What an opinionated piece of drivel. It seems like you have an axe to grind; I won't bother wasting my time disproving your fallacies.

Things like "he should be prepared that 99% of projects he'll be offered are legacy code" are just laughably wrong!



And to be clear, I have never said C++ is the be-all and end-all, or that it is anywhere near perfect. It is just a very widely used language which is never going anywhere, has an increasing number of uses, is the basis of many other languages, and makes for an excellent language to learn, allowing easy transitions to other languages.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618

What do you define as a legacy language? C++ is anything but legacy!

The problem with learning multi-paradigm functional languages is that they come and go with the seasons. One summer it's Haskell, then it's Scala, then it's F#. It used to be things like Lisp. Let alone that there are dozens of them, because most of them are simply people's pet projects. These functional languages are nothing new; the basics date back to the 50s and 60s and predate C++! For decades people have believed that they would be the next big thing once CPUs caught up, but they never take off like the hype suggests. F# has perhaps 1% market share, about the same as LISP (specified in 1958). Languages like Scala have less market share than Fortran (specified in 1954).


C++ is the preferred language across many industries and disciplines: anything from computer games, media, DSP, stock market and financial prediction, trading, machine learning, data mining, CV, robotics, control systems, aeronautics, simulation, petrochemicals, geophysics, climate studies, weather prediction, physics, automation, real-time systems, graphics and rendering, productivity software (Photoshop and Illustrator are C++), embedded systems (C), imaging, security and defence. High-performance servers like Google's and Facebook's use C++, as does pretty much everything from Microsoft, Adobe, media players, etc.


Most of these are growing fields.
Edit: I just checked SimplyHired jobs for California (I just happen to be keeping my eyes open) using the keyword software + the following languages:
43,000 jobs mentioning C++
28,000 mentioning Java
7,000 mentioning .NET (sadly C# fails as a search term)
1,400 mentioning Scala
13 mentioning OCaml


EDIT 2: every good programmer should be language-agnostic anyway, and be able to pick up a new language on a whim. C++11 makes an excellent basis for this; some of the more functional languages make excellent additions.
 
Soldato
Joined
21 Oct 2002
Posts
18,022
Location
London & Singapore

It only comes across as opinionated drivel because you weren't expecting somebody with an actual clue to arrive. Fair play to you for not actually responding; there was nothing you could have said to redeem yourself from that rubbish anyway.
 
Soldato
Joined
13 Jan 2005
Posts
3,351
Location
South West
To the OP:

Like you mention...

I don't mind going into iOS development

I would recommend mobile app development if you want a bit more excitement and variation.

However...

if I had the choice I would pick a backend language over a frontend one, as you have a lot less PM/PO involvement in how something should look or behave. Most of the time they are speaking from a point of no experience and forcing you to do what they want, which I can only take for so long.

...maybe not. Liaising with clients over trivial matters such as the placement of buttons gets old very quickly. :(

Also, it seems PHP developers don't make as much; I'm guessing that's because the barrier to entry is so low, and you can get a lot done with very ****** code compared to other languages.

That's one advantage Objective-C has. It's a niche skill set that does command higher-than-average salaries.

Also, things to mention: I don't mind whether it is strongly typed or not, I like things that look neat, and I'm really enjoying OOP.

Objective-C has this in spades. Modern Objective-C syntax also extensively uses dot notation, literals and subscripting (and of course ARC), so it falls more in line with other C-derived languages like C# and Java. It is much easier to get your head around the language now than it was 2 or 3 years ago.

Like you mentioned in the other thread in the Apple sub-forum, Swift has a lot of benefits going for it too. If you are serious about looking into iOS development, I would look at learning (and comparing the syntax of) both languages. I've already seen sample projects written in both for the sake of comparison, such as how method calls and the like differ. It is also very early days for Swift, and whilst beta compilers are not a fair comparison, current benchmarks show its performance is significantly slower than Objective-C's, in contrast to the claims made in the WWDC keynote.
 
Soldato
OP
Joined
4 Oct 2008
Posts
6,693
Location
London

Yeah, it's all very interesting. I'm sure Objective-C is faster; Apple just showed two edge cases, so it's meaningless. The Swift syntax is also going to change; it's not even at 1.0.

I have been re-learning C/C++ recently, and for the most part it's been fun. I think moving to Objective-C might make a lot of sense, but I'm also going to look into a few other things. I tried Python and it's not for me; the thought of not being able to declare a private function is a bit scary hahaha. Scala seems interesting too.
 