Long ago, I read a book by Edsger W. Dijkstra and, more or less, I've never had a problem with buffer overflows ever since. I read that book in the late 1980s, and the book was already old news then. Dijkstra's recommendation for error handling was this: if the operation can't legally be done, don't do the operation. No need to create a new control path; you'll be surprised how fast your loop exits the bottom once it's no longer able to do any legal operation. Then at the bottom, you check whether your contract was satisfied, and if not, you return an error condition. I hated exceptions with a passion-and refused to use them-until exception safety was properly formulated (mostly by David Abrahams) in the context of formal RAII. I continue to prefer Dijkstra's model, because it simply forces you to think more clearly, and implement your loops more cleanly. The only drawback is that it's hard to avoid defining a handful of boolean variables, and you can end up checking those variables almost every statement.
Supposing C++ had a lazy &&=, the structure looks somewhat like this:
bool function () {
bool ok = true;
ok &&= first_thing(); // function not called unless ok is true
ok &&= second_thing(); // function not called unless ok is true
return ok;
}
Mathematically, _every_ operation you attempt has a precondition on whether it can be executed legitimately.
But we learned way back in the 1970s-the age of Dennis Ritchie-not to write code like this:
for (p = p_begin, s = s_begin; s < s_end; ++s) {
if (p < p_end) *p++ = *s;
}
My God! All those extra loop iterations doing _nothing_ you might execute for no good reason if (p_end-p_begin)
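Since standard C++ has no lazy &&= operator, the same structure can be written today with the short-circuiting && itself; a minimal sketch, with first_thing/second_thing as hypothetical placeholder steps:
#include <cstdio>

bool first_thing()  { return true;  }   // placeholder step: succeeds
bool second_thing() { return false; }   // placeholder step: fails

// Dijkstra-style: no extra control paths; later steps simply don't run
// once ok turns false, because && short-circuits.
bool function() {
    bool ok = true;
    ok = ok && first_thing();    // not called unless ok is still true
    ok = ok && second_thing();   // not called unless ok is still true
    return ok;                   // was the contract satisfied?
}

int main() {
    std::puts(function() ? "ok" : "failed");
}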
@xybersurfer I wrote that long answer in two parts because my first "submit" returned me to the Kansas of an empty canvas. Google disliked the length of my post (an "error", I suppose) and so it dumped all my bits onto the floor. But I had pressed CTRL-C with my prehensile beard-as I _always_ do-before I pressed "reply", so nothing was lost. Because I've been to Kansas _so_ many times before. Social media has that strong Kansas smell, ever and always.
Christopher Hitchens in Vanity Fair: _Clive James needed an audience and damn well deserved one. His authority with the hyperbolic metaphor is, I think, unchallenged. Arnold Schwarzenegger resembled "a brown condom full of walnuts." Of an encounter with some bore with famous halitosis Clive once announced, "By this time his breath was undoing my tie."_
In a similar vein, the Kansas smell in social media curls my beard around CTRL-C before I risk committing my work to the vicissitudes of a hyper-redundant Google data farm.
The second "why" about the length of my post is that your response came up on my screen as I was already deep in contemplation of "metacognitive annotation". That's a grand expression for commenting your code, but in my version it also reaches into other large spheres. Two different "all new" approaches to fundamental physics were unveiled this month, both from the lunatic fringe where lunatic rhymes with "way smarter than your average bear". One was Stephen Wolfram. The other was Eric Weinstein. These are both avowedly ambitious with a capital AA. They have also both attracted a lot of sniffing around by the why-haven't-we-got-AGI-yet? crowd. That's another cognitive Kansas jump. This time it's human technology riding the magic-slipper carpet (and noticing nothing at all _en route_) instead of your tidy exception return value.
I don't believe in this kind of long jump in any sphere. A _lot_ of scenery has to pass under the bridge before we have a fully developed AGI. One of those things is that we need to develop a programming methodology where this entire ridiculous error-handling debate simply goes away. Probably "error" barely belongs in the conversation by the time the dust settles. Probably in more advanced systems, every request for action receives an advanced data structure which serves as an intent manifest, available for all manner of sophisticated annotation about how it seems best to handle contingencies (some of which might be rigorously classified as actual errors).
One of the problems we would have to confront in doing so is the institutional boundaries, where code degenerates into institutional silos, such as the major JavaScript frameworks.
I might not see this fixed in my lifetime, but if so, I don't expect to witness AGI in my lifetime either.
@xybersurfer Long ago I attended a pretentious university (Waterloo, Ontario) where the faculty was determined to teach freshmen and sophomores the One True Way, which in that year was the Pascal programming language, with a top-down design methodology. Everyone I knew with any talent was conniving to get an account on any machine with a viable C compiler, while the faculty was conniving to keep access to these machines as limited as possible for the sake of "sparing the children". But no high-integrity system is ever designed top down. You mainly work bottom up at first from your core integrity guarantees. And then once you have a coherent API over your core integrity layer, you work top down, but still not as I was originally instructed. The top-down portion is architected from your test and verification analysis. The best modularity is the maximally testable modularity, and not the modularity that produces the least amount or the cleanest amount of code. In particular, your abstraction boundaries only need to be clean enough, rather than conceptually pristine (pristine when you can swing it, but not at the cost of making the full system less testable). The Linux people thought that ZFS had too many layering violations, and that with the benefit of hindsight and some fundamentally better algorithms deep down, they could kick ZFS to the curb with Btrfs. Actual outcome: Btrfs was removed from Red Hat Enterprise Linux 8 (RHEL) in May 2019.
Btrfs team: We know pristine (good), we know clutter (bad).
ZFS team: We know core coherence layering (hard, but doable with a clear head), we know testability at scale (hard, but doable with sustained cultural pragmatism).
From Rudd-O: "ZFS will actually tell you what went bad in no uncertain terms, and help you fix it. ... btrfs has nothing of the sort. You are forced to stalk the kernel ring buffer if you want to find out about such things."
That's a guaranteed fingerprint of a click-your-heels-to-return-to-Kansas design ethos. At that point all you can do is emit a somewhat cryptic message to the OS ring buffer.
The original inviolable rationale for why top-down design, as I was once taught it in Pascal, was going to rule the world is that it's fundamentally based on divide and conquer. And what can possibly defeat divide and conquer? We had hundreds of years-if not _thousands_ of years-of history with this technique in mathematics. And it truly rocks in that domain. Because mathematics-of all human things-most excels at abstraction from messy circumstance. By emulating mathematics to the ultimate extent, so too can software _pretend_ to exist in this austere world.
The original ZFS design team at Sun knew how to design the system around their validation framework, because they were _deeply_ immersed in the world of the shit hitting the fan. Before Matt Ahrens proposed ZFS, a typo in the configuration of the Solaris volume manager caused the system to lose all home directories for 1000 engineers. And that was far from the only domain of extreme suffering. Original design principle: to escape the suffering, the administrator expresses intent, not device mappings.
Now I'm circling back to the original problem. When you're muzzy-headed about what constitutes a conceptually coherent error class (because you've lumped sick world & sick dog under "click-your-heels" through the underground exception-return expressway), there's barely any possible way to think clearly about application intent.
Oh, but you say "the exception return object can be arbitrarily complex to handle this". And right you would be, only there goes your pristine and uncluttered code base, for sure. And now what have you really gained? Because you are already welded into the exception mindset, you're almost certainly going to try to report the damage on first contact with a 7'2" Santa Claus guarding the post. And you return an object to encode error 666: denied by Giant Black Goliath in a red deerskin vest.
I've actually played around with writing code where I would continue to attempt to do all the things the code can legally do _after_ something has already gone fatally wrong (fatally wrong meaning there is no remaining chance to achieve your subroutine's mandatory post-condition), and then report back _all_ the resources and actions that proved unsuccessful in the quest to accomplish the assigned task. (There is no possible way to implement this coding strategy that you would not decree to be hopelessly cluttered.) And this is interesting, because when you do get six errors all at once (illegal host name, can not allocate memory, no route to host, private key does not exist), then you're immediately pretty sure you entered double-insert mode in vi and there's an extra "i" character somewhere in your application's configuration file.
Suppose you tell a human minion: deliver this yellow envelope to 555 Hurl St, last door on the right at the end of the long yellow hallway on the third floor, and then grab me a muffin from the bakery downstairs. And your minion comes back to report "no such address" (with no muffin), and doesn't point out that while the hallway on the third floor was long, it was orange instead of yellow, and didn't even check whether there was a bakery down below, because unfortunately the hallway was orange instead of yellow, so they didn't even _think_ about completing the rest of the assigned errand. "And there was no bakery down below" would have you checking your street address first of all. "And there _was_ a bakery down below" would have you checking your floor number first of all (while happily munching your hot muffin as a side bonus). Human contingency is _so_ different from code contingency, and I just think this is wrong-headed all around. But modern computer languages really do tend to compel divide and conquer, and I concede that it's extremely difficult to enact human contingency in a way that's clean and compatible with this reductionist software engineering ethos.
I've seen lots of unnecessarily cluttered code. Almost always because of a woolly conceptual foundation, where you're half in one system of coping and half in another system of coping. But I've gradually evolved to where I almost never count clean expression of preconditions or postconditions (all the way down to the level of individual statements) as active code clutter. My eye now regards such code as a somewhat prolix signature of a job well done, no ruby slippers involved.
@xybersurfer Naturalness only exists in a cultural context. You choose to operate within a software culture with a weirdly enlarged notion of "error". If I tried to sign up on Facebook and I entered "Allan Stokes" as my preferred username, would the system create me a new account, or would it display an "error" that my username is already taken by some other Allan Stokes of no particular distinction? That counts as an "error" in what sense, precisely? Was it an error that some other Allan Stokes wanted my personal name? Was it an error that I also wanted the same? Was it an error that Facebook did not originally design their social network so that all users were known by a globally unique 128-bit GUID, preventing this kind of land-rush clash for all time? I count _none_ of these things as errors. Not everybody can have everything at the same time in the world we actually live in. Since I'm already a decade or two late boarding the Facebook ark, it's now a hail Mary for me to obtain the simplest rendition of my own name in this sphere.
It's for the same reason we send Santa a list, and not just the single item Ferrari Testarossa. Santa might be short on Ferraris this year, especially in light of the present circumstance in Milan lately. So it's a wise policy to provide Santa with options B through F as well. Was it an "error" for me to want a Ferrari Testarossa? Or was it merely the first interaction in a distributed algorithm to probe the maximal intersection point between my list of desires and Santa's available merchandise? Likewise, if my algorithm wants 10 GB of DRAM working space so as to run faster, is it an "error" if my request can not presently be granted by Santa's elves? I don't count that as an error, either.
Let's suppose I'm only asking for 256 bytes, and malloc returns NULL. Under those conditions, whatever my subroutine is trying to accomplish is probably not viable. How does the OS know that this is now an error as you see it, because your program design has boxed you into a corner with no recourse but to barf up a White Flag of Surrender message box? The OS does _not_ know this. (Is it a coincidence that "throw" and "barf" are somewhat synonymous? I think not.) What the OS _could_ know for certain is that you just called free on a pointer it can't find on any active allocation list, which is _definitely_ an error. And if your program has just called free on a non-thing (type II error), what confidence do you now have that you haven't previously called free on a thing you intended to continue using (type I error)? Practically NIL. This is now Hunt the Wumpus territory. "It is now pitch dark. If you proceed, you will likely fall into a pit."
But this isn't actually defined in the standard as an "error". The standard instead says that free's behaviour is now undefined. And once _any_ function commits undefined behaviour, this condition can not be erased short of exiting the current process (although in theory, attempting to exit your process can now also be undefined)-and now the state of your Unix process tree can potentially also become undefined, and it's probably time to pay homage to CTRL-ALT-DEL. In practice, exit is usually engineered to not depend on much, so it usually won't fail to exit your process, to some number of nines (typically more for BSD, with a belt-and-suspenders attitude dating back to the original graybeard; fewer for Linux, which regards too many nines as a type II cultural error that greatly hinders the effective evolution rate).
This is the funny thing about error culture in computer science. The unambiguous errors are considered so severe that the cultural response is "why even contemplate continuing to press forward?", whereas completely foreseeable circumstances-like two people trying to sign up on Facebook with the same username-are reified into the barf loop (aka culturally re-parented into the class of "exceptionable" error events).
My personal favourite case study is the Therac-25, a Canadian radiation therapy machine which either killed or badly injured at least six people. It had two beam strengths: regular and lethal (100× beam power). There was a turntable inside to rotate a bullet-proof vest in front of the lethal beam. On striking the bullet-proof target, the lethal beam would kick out some other modality of therapeutic radiation as a byproduct. Patient saved. In Dijkstra's world, you have ONE place in the code which is capable of unleashing lethal beam energy. Under what condition are you not allowed to do this? When the bullet-proof vest has _not_ been rotated into position. Otherwise, the patient experiences an extreme shock and _literally_ runs screaming out of the treatment room (as did a fatally wounded Roy Cox in actual fact). But you see, the software was designed to ensure that the correct turntable position was already activated and properly in place long _before_ subroutine release_the_radioactive_hounds was called upon to do its dirty work. Mechanically, the turntable implemented _three_ separate microswitches to ensure that the software could reliably detect its true position (not just two sets like your microwave oven door). They got carried away with thinking about what the software was trying to _do_ (treat the patient) and forgot to think clearly about what the software was trying to _not_ do (kill the patient). There's no excuse on God's green earth for a software culture which accepts _anything_ other than a last conditional statement to check the turntable position (and every _other_ survival-critical precondition, if there were any more) before the ONE line of code capable of unleashing the 100× lethal beam. We were a good 35 years into the software industry before these fatal accidents happened. Dijkstra had completely set this straight already by the 1960s.
In ZFS or PostgreSQL-or any other life-critical application (as viewed by the data under storage)-the implementation culture is strongly biased to think first and always about what you simply MUST NOT DO (e.g. create a race condition on flush to disk). Further up the software stack, the culture flips into: call the extremely sophisticated and fast-moving API as best as you can and mostly hope for the best. If a 7'2" Santa Claus with a 30" vertical leap power-blocks your layup attempt, you click your heels through the handy-dandy uncluttered exception mechanism and return to Kansas, hardly any the wiser, but more or less still alive. Betcha didn't know Santa had that move. Good thing your ruby slippers are maximally topped up on reindeer credits.
Only here's the downside: people who live in Kansas begin to develop an extremely muzzy sense of what a true error actually looks like. "Error" becomes a hopelessly muddled class of eventualities where it seems more convenient to click your heels together than to write a robust code block with a clear sense of the hazards it is actually navigating.
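For what it's worth, a minimal sketch of that "last conditional statement right before the ONE dangerous line" discipline; turntable_in_position and fire_full_power_beam are hypothetical names, not actual Therac-25 code:
#include <cstdio>

// Hypothetical hardware interface, for illustration only.
static bool turntable_in_position() { return false; }       // stub: pretend the target is NOT in place
static void fire_full_power_beam()  { std::puts("beam fired"); }

// The check sits immediately before the one dangerous call, so no earlier
// "we already verified that" assumption can silently go stale.
bool treat_at_high_power() {
    if (!turntable_in_position())
        return false;               // refuse: survival-critical precondition not met
    fire_full_power_beam();         // only reachable when the target is in place
    return true;
}

int main() {
    std::puts(treat_at_high_power() ? "treated" : "refused");
}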
And then this becomes further reified into what "uncluttered" code ought to properly look like, because we are, at the end of the day, cultural creatures who thrive on cultural uniformity.
@Allan Stokes The problem with avoiding exceptions is that you are just manually implementing the error-handling mechanism already present in the language. You are also wasting the return value, which could be used for something more natural. The big advantage of exceptions, to me, is that you can abstract them away; that way you are left with code describing the task, which is much cleaner than code checking for errors with the actual task buried somewhere inside. I'm sure you could make it work, like most things, but it looks like it results in a lot of unnecessary code.
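For what it's worth, a small sketch of the contrast being described here, with hypothetical parse_config/apply_config steps: the exception version reads as the task itself, while the return-code version interleaves the task with bookkeeping:
#include <stdexcept>
#include <string>
#include <cstdio>

// Hypothetical steps; the first throws on failure.
std::string parse_config(const std::string& text) {
    if (text.empty()) throw std::runtime_error("empty config");
    return text;
}
void apply_config(const std::string& cfg) { std::printf("applied: %s\n", cfg.c_str()); }

// Exception style: the body is just the task; errors propagate to one handler.
void run(const std::string& text) {
    apply_config(parse_config(text));
}

// Return-code style of the same task: the bookkeeping is explicit.
bool run_checked(const std::string& text) {
    bool ok = !text.empty();        // the "parse" step, as an explicit check
    if (ok) apply_config(text);     // the "apply" step, guarded by ok
    return ok;                      // caller must remember to test this
}

int main() {
    try { run("x=1"); } catch (const std::exception& e) { std::printf("error: %s\n", e.what()); }
    std::printf("%s\n", run_checked("") ? "ok" : "failed");
}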
Also, I hope people read carefully enough to realize that I was exactly half serious: half serious because this *does* actually work, half not serious, because it only works for a stubborn iconoclast willing to put up with a certain extreme forms of surface ugliness -and stubborn iconoclasts (like Knuth and Bernstein and every second Haskell programmer) have always managed to write software with orders of magnitudes fewer bugs than competitive endeavors from lower down in the iconoclasm pecking order, so it's really not such a special value add.
There's a reason why C++ was never accepted as the language for Mach or Linux kernels. Most of its power is in its dynamic allocation, and this is unwise in kernel context.
Wow, this is unbelievable!! I never thought I'd get to sit in front of this eminent sir. I can't believe it, because to me he was one of the geniuses behind my Turbo C/C++ book. Amazing! Oh, never mind the fact that I'm sitting at home :P
C++ is faster because it runs directly on the machine; Java runs a little differently (on a virtual machine), which is why it can run on virtually everything (ATMs, cashier machines, etc.).
+Sina Madani It's obvious he's not here to promote Java; he's here to talk about C++. C++ and Java have different design objectives. C++ was mainly used to develop systems applications, so it's focused on achieving control over low-level operations like memory management (C++ has pointers and doesn't have a garbage collector), while at the same time being able to abstract things away using object orientation. Java, however, was developed to enforce the object-oriented paradigm, which has led to it being one of the most successful (if not the most successful) programming languages in the world. Java addresses two of the key challenges of today's programming world: the first is "duplicate code is bad" and the other is "code will always keep changing". Those two problems are best solved using the object-oriented paradigm, for which there is no better language than Java. If we are talking about how hard they are to learn, then yes, C++ is harder because it's a hybrid language; sometimes I view it as a combination of C and Java, but that's just me :p. But each language can do some things more easily than the other, which means each can reach a depth of complexity unreachable by the other, depending on what they are used for.
As a programmer, I'll tell you what really ticks me off about these high level OOP languages.. I'm pretty good at logic and program flow, but my productivity is drastically slowed down by having to remember, or look up, all the INSANE syntax, variable passing, and multiple file dependencies of C++
It's an amazingly powerful language, but also gives you the opportunity to shoot yourself in the foot at every step on the way. To use it simply and safely, you'll probably start using smart pointers, copying strings instead of referencing.. undoing its performance potential. Still, you can do everything, including shooting yourself in the foot.
I won't watch any of my teachers or preachers lecturing for more than 20 minutes. This is 1 hour and 40 minutes long, and I don't think Dr. Stroustrup is a gifted orator, but I will watch it to the end. The difference is the level of respect: I don't think any other living man deserves quite as much respect from me as Dr. Stroustrup.
Indeed, it's interesting. He even has a monotone voice delivery. What keeps us listening is the level of precision and relevancy with which he delivers all the information.
7:00 Tell them what you're going to tell them, tell them, tell them what you told them: done to perfection-that shiny trader is going to really enjoy this lecture.
It's a difficult situation. Every release makes it less and less welcoming for new developers, as it grows to be a larger and larger monster of a language.
@Tin Švagelj The problem is with the industry hiring new developers. With greenfield projects it can be fine, but often they have old, large legacy code bases which have already accumulated all kinds of C++ features and styles over the years. It can be a daunting experience to jump into such a project if there are holes in your understanding of "historical C++". There isn't that much you can just forget about.
You don't need to start using it by jumping straight into template metaprogramming. You start with C code, fix the type safety, replace pointers with references wherever possible, and use the STL containers you find useful (the most common are vector, map, list and tuples). You then learn how to use move and copy assignment and constructors. That's basically it; you can write good enough code knowing just that. As with any other language, you improve with practice. There are plenty of resources available online.
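A minimal sketch of that progression, assuming nothing beyond the standard library: a pointer parameter replaced by a reference, C-style buffers replaced by std::vector and std::string, and std::move used where a deep copy isn't needed:
#include <vector>
#include <string>
#include <cstdio>

// The C-style version would take (const char** names, size_t count) and manage
// memory by hand; here the container owns its storage and knows its size.
void print_names(const std::vector<std::string>& names) {   // reference, not pointer
    for (const auto& n : names)
        std::printf("%s\n", n.c_str());
}

int main() {
    std::vector<std::string> names = {"Ada", "Bjarne", "Dennis"};
    print_names(names);                                      // pass by const reference: no copy
    std::vector<std::string> moved = std::move(names);       // cheap move, not a deep copy
    print_names(moved);
}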
@J T I've read two or three different C++ books. The 1st one was written by Bjarne himself (it was C++ for ARM mostly). I was too young and stupid back then and gave up at the ~70th page or so, where he talked about references. (It could also have been a poor translation of the book, not sure.) Some time later, I understood classes, inheritance and all the related stuff from yet another book. It's well over a decade ago, but when I sit down to any C++ thing, I like to have a web page at hand briefly describing what's in the STL and some of the more advanced concepts, just to make life easier. What I've also learned is that I will never fully learn it... It's too complicated and has been changing a lot lately. For example the lambdas (hate them, or life's too short to get used to them). Keep things simple - don't overkill!!!
@Jorge Bastos Syntax?? Are you serious? C++ is art. C++ has a few flaws, but syntax is not one of them: C++ is C ++. I write Rust now, not because it's better than C++ but because I can write concurrent programs safely. But never talk about the syntax being horrible; maybe the std library is congested.
It's really not. The problem is, as the man said, that it's been around a long time, and has been, in a way, several _different_ related languages over the years. Keep in mind how very slow and clunky computers were when C++ was introduced - a lot of the original language relied on the programmer to do most of the heavy lifting. That is, you needed to think about the machine almost as much as the problem you were working on. These days, if you use the right parts of C++, you can offload most of the "thinking about the machine" part to the compiler and just concentrate on the problem you're solving. The compiler itself can be a much larger program than was possible in the early days, and it will compile your modern code in seconds or minutes rather than weeks. Most of this talk was about why it's better to use the newer language features when you can rather than the features that were there so that your steam-and-hamster-powered 4MHz processor with maybe a couple of megabytes of memory (huge for the time) wouldn't choke trying to compile it.
At 1:13:08 he spoke about the creator of OOP. I couldn't understand the name properly; I thought it was Alan Kay who invented it, but apparently Bjarne states otherwise. I'm very curious to read more about the man he talks about; if anyone can write his name, that would be lovely.
Bjarne was always one of my gurus. When I turned on this video, I was wondering if I'd be able to make any sort of comment on it. Well: 14:16 "Complexity of code should be proportional to the complexity of the task"... What failed during the last 20 years, and why do we have so much "N-gigabyte fat" crapware...
Even C++'s creator thinks it's designed using poor principles. I would bet that the greatest cost of unsafe languages has been security failures; it's easily in the billions. But once a language is entrenched, it just keeps on trucking. I bet it will still be there decades from now.
1:31:30 Actually, not even C can agree on an ABI calling convention between 64-bit Linux and 64-bit Windows running on the same x86-64 CPU on identical hardware. There's little hope for C++ to have a standard ABI before there's an agreed-upon standard for C.
I don't think we should even *know* the wrong ways of doing things... Nobody teaches non-generic Java or C#; you have to figure it out for yourself when you come across it. And neither Java nor C# has an international standards committee dedicated to it, so how come C++ can't get its act together in that regard? The committee should require compilers to mark certain language constructs as obsolete, but the committee has an abusive relationship with compiler makers, and we wonder why the committee stays ("had pizza" comes to mind).
@42:50 Is he talking about nulling the pointers, so that by nulling them the pointers aren't actually referencing anything by the time the function goes out of scope, or is this done some other way? I thought nulling a pointer led to undefined behavior.
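If this refers to the usual move idiom (not certain which slide is meant), the pattern looks roughly like the sketch below: setting the moved-from object's pointer to nullptr is perfectly defined behaviour; it's dereferencing or double-deleting a dangling pointer that is UB, and deleting a nullptr is a harmless no-op. Buffer is a made-up example, not code from the talk:
#include <utility>
#include <cstdio>

// A tiny owning handle, sketched from scratch.
struct Buffer {
    int* data = nullptr;

    explicit Buffer(int n) : data(new int[n]) {}

    // Move constructor: steal the pointer, then null out the source so its
    // destructor has nothing to free. Deleting nullptr is a no-op, not UB.
    Buffer(Buffer&& other) noexcept : data(other.data) { other.data = nullptr; }

    Buffer(const Buffer&) = delete;             // keep the sketch simple
    Buffer& operator=(const Buffer&) = delete;

    ~Buffer() { delete[] data; }                // safe even when data == nullptr
};

int main() {
    Buffer a(16);
    Buffer b(std::move(a));                     // a.data is now nullptr
    std::puts(a.data == nullptr ? "source was nulled" : "source still owns");
    std::puts(b.data != nullptr ? "destination owns the memory" : "unexpected");
}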
7:20 - B. Stroustrup introduction
11:40 - What Stroustrup keeps in mind about C/C++
16:28 - Timeline history of programming languages
25:00 - Resource management
29:50 - Pointer misuse
31:34 - Resource handles and pointers
34:30 - Error handling and resources
35:25 - Why do we use pointers?
41:10 - Move semantics
43:50 - Garbage collection
47:06 - range: for, auto, and move
49:16 - Class hierarchies
51:58 - OOP/Inheritance
54:10 - Generic Programming: Templates
59:48 - Algorithms
1:01:10 - Function objects and lambdas
1:03:08 - Container algorithms
1:03:42 - C++14 concepts
1:10:32 - "Paradigms"
1:13:56 - C++ challenges
1:15:28 - C++ more information
1:17:00 - Questions and answers
Legend
Bless you Son.
Saved me 7 minutes
I started learning C++ with Bjarne's "Programming: Principles and Practice Using C++" and I cannot recommend it enough. He manages to explain difficult concepts in an understandable way and doesn't miss a chance to add in some dry humor to keep the reader engaged.
Same here. I'm currently working thru it. It's challenging (chp 6/7) but at the same time motivating.
Another me too here
@Garegin Asatryan - are you sure she wasn't referring to "The C++ Programming Language", which he also wrote, and which is truly arcane and difficult?
@Syneyes What got you interested in video game graphics if you didn't have any programming experience? I'm just curious because that's a very specific field
@Zoran Pantic Well, I mostly have an interest in the game development area, and more specifically in graphics; second to that, yes, I also have an interest in systems. I have to agree with you: from what I've seen, those topics seem to be the most difficult ones and the ones that take the longest time to reach a decent understanding. But I don't really see myself in Web Development at all, which is part of the problem of finding something that is complex enough not to be boring and also not so difficult that it becomes boring.
the essence starts at 7:27
Took me 10 seconds to start fast forwarding...
Thank you
WYSI
Man who is this guy at the start. He's boring.
when you see it
Bjarne is a surprisingly good lecturer. Best example of an IT scientist.
I find his talk in this video really good, but the presentation slides are really not that good looking / visually structured imo :o but okay, in the end the content matters
That's because he actually lectures.
Been reading a lot of Stroustrup C++ books and I love how crystal clear his explanations are. For every "but how would you...?" that pops up in your head he explains it right away in detail.
Such a smart guy, I love that he's concerned with the language being approachable to casual users as well.
Well, it's commendable; alas, a bit too late for that ;). And not to be mean, but he himself is part of the reason why C++ is so incredibly complex, and compiler errors are confusing because of it. For example, method overloading might be a tempting idea, but in the end it doesn't really solve a real programming problem; it's only a weird concept beginners have a hard time wrapping their head around, and it makes compiler errors much weirder than they have to be. And templates are a whole other story...
I cannot thank this man enough for creating the only language I can comfortably express myself in. The only language that comes close (for me) is probably ECMAScript.
Love how this man can describe / explain things and make sense!
I'll just throw this out there. Recently I started learning C++ and some C after having achieved some degree of proficiency in VBA Excel and Python, and here is the deal: if you are serious about learning the 'science' of programming, that is, programming at the conceptual level, then choose either C or C++ as at least one of the languages you are learning. (I'm leaving aside for a moment the obvious importance of the functional languages.) I had always read that Python (and even Java and C#) is easier, but I had no reference for this until I started learning C++. Obviously it is not going to be practical for everyone, but if you want to be a true student of procedural process then go for C or C++. It flexes your brain way more than the higher-level stuff.
@Wouter rust go brrrrrr
If you want procedural programming language, try Pascal..very similar to C
Calum Tatum Depends. For performance critical tasks, like realtime audio for example, your example languages are just not fast enough. Also, C# is running on a runtime (CLR) that is written in C++. So what do you recommend for the person writing that runtime? C++ is hard, but it's also a really important language.
C++ is high level - if you are not using abstraction then you are most likely still using procedural C techniques - I agree man C++ is not easy, it is very challenging and a huge language with huge libraries, and takes years to master
good luck bro! - get into Scott Meyers series it's great!
More than half a decade old and still more modern than my university.
@Chris Kramer Well, it's been about 9 years now, and it still is quite modern looking.
Well I mean “half a decade” is really not a long time ago
Great video UofE. My favorite quote starts at 07:36, and he nails it. Thank you sir for C++ and everything else!
Simply great.. He is such a good confident teacher. This video is a must watch for all programmers.
1:22:35 He is completely right about that. Well, partially. Many professors aren't involved in any active software development anymore. One could argue even many professors just don't know any better. We still learned how to do linked lists with nullptr instead of using something sensible like optional. We have never seen the good parts of C++.
we weren't even taught linked lists with nullptr lmao our prof still uses NULL
This is so awesome. Understanding like 10%, but still super interesting to follow.
I never knew this genius, but I understand that he is capable of reading your mail at will and monitoring your I/O at the same time. And I really like the way he has programmed his haircut, hair by hair! I don't know much about programming, but somehow it was just cool to listen to him.
It’s cool that Bjarne Stroustrup uses Cygwin on Windows. I used to use it a lot too. These days I use msys2 because it lets me create programs that don’t require the Cygwin DLL file but it still gives me that Unix environment I got used to from my Ubuntu days.
1:34:17 I think the best way forward would be to specify lots of old C++ features as deprecated and automatically generate a warning if a deprecated feature is used. That would allow slowly removing the parts of C++ that have better alternatives today.
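At the library level, C++14 already has a hook for exactly this: the standard [[deprecated]] attribute. A minimal sketch with hypothetical function names:
#include <cstdio>

// Old interface kept for compatibility; any call site now gets a compiler warning.
[[deprecated("use print_name instead")]]
void print_name_unsafe(char* name) { std::printf("%s\n", name); }

void print_name(const char* name) { std::printf("%s\n", name); }

int main() {
    char legacy[] = "legacy caller";
    print_name_unsafe(legacy);   // warning: 'print_name_unsafe' is deprecated
    print_name("new caller");    // clean
}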
That's why I learn Rust.
I feel he is a gentleman, and besides his efforts and talents, his sincerity, honesty and simplicity are great.
I started learning C++ this year, 7 months ago now.
And within those 7 months I read almost 7 reference books, and after all of them I read his book. It was a very good experience, and I think I am better now at designing and thinking than I was. The only thing that helps me keep moving forward is the solemn desire to solve one of humanity's problems and make life a better place for everyone.
Bjarne Stroustrup is such a great guy. He knows programming by heart; every nook and cranny, he knows.
C++ is tough. I’m about to head into data structures this fall, which is based on c++ and it’s pretty intimidating but honestly, it’s all doable. I definitely won’t be working in c++ as my career, as I will be going into front end and UX but knowing c++ and how powerful it is, is interesting
A modern legend. Every time I struggle to understand C++ I think about him inventing it.
Geeeeeeeez....
Wouter I know the original syntax was akin to that used by SmallTalk (maybe the first OO language) which employed the message passing paradigm. As I understand it, modern flavors of Objective C support both styles of syntax for invoking methods against objects.
@Mark Teague Objective-C sucks. It's a mess of weird syntax and it lacks the generic power of C++.
Bedroom Pianist More like 30 odd years. In a sense, Objective C beat C++ to market since it was beginning to be used commercially while C++ was still just an internal intellectual curiosity at Bell Labs. One could argue that Objective C isn’t really an entirely new language, but more like C with library and pre-processor extensions.
He invented it but don't be fooled, programming languages are an iterative process and C++ has been around for 40 some-odd years.
Does it help?
All of us who understand a little more about computing than "normal" people know how important C++ is for today's computing. Thank you very much, Mr Stroustrup.
Bjarne Stroustrup needs to speak more often. He is brilliant and very humble. Almost too humble, because I have popped into IRC's C++ channel and defended him and his programming language many times. The programmers learning C++ in the channel seem to think they know more about his programming language than the man who created it, ugh.... Irritating!
Only a genius can make something complex and abstract to sound simple.
It appears Python has overtaken C/C++ as the most popular programming language. Almost all of the young engineers and computer scientists I encounter are proficient at programming in Python and have almost no experience with C/C++.
This only increases the value of C++ programmers
Love and Respect, I encourage all those who want to learn C++ deeply to read his book 4th edition!
Speaking from a small university town, I presume that job opportunities in academia these days are very limited for the Stroustrups of the world. Only the very top universities will have the resources to hire him, due to many universities dumbing down, shifting to online teaching, and fishing for research points.
I've just started messing about with an ESP32 embedded CPU, so I am having to learn a bit of C++. I would only call myself an amateur programmer at best. I started out with Python for ease of access, which is nice, but soon hit its performance limitations. So I did the research on lots of other languages to weigh up the pros and cons (they all have them, and they will be different for different people), and I settled on D. This was an interesting lecture, but it does reinforce my conclusion that C++ really wants to be D; it's just going to take 10++ years to get there. Anyway, I hope I can avoid the pitfalls highlighted by Bjarne.
Is D still alive? I used it 10 years ago and it was nice, but today I settle for Rust and have more grip on performance.
All hail Bjarne the creator of our generation!
When a James Bond villain gives a lecture on programming, I listen.
@Corrie De Beer No, maybe the old Blofeld though.
He does have a bit of a Ernst Stavro Blofeld vibe about him
Man With No Name he said james bond not dragon ball z
Thanks a lot for this video. Keep uploading more of such content. Good luck...
The potted history is that Bell Labs went through the few first-generation civilian compilers, cherry-picking them for ideas. They couldn't use A Programming Language - that had gone to the vector-coding one - so started with B, as a debug prototype to learn on, then rolled C out as a serious proposition. Because I'd worked on one of that first generation, I'd left an imperceptible double-meaning in the code, which made a lot of sense, and has become a feature of all of the derivatives. No, I'm not saying what it is.
C++ is and will be my favourite programming language forever! This man is my hero! Long life to him!
Example at 47:37: the use made of Value_type and V is a bit weird. You might think they're the same type, since values of one and the other are compared directly, yet they are different types.
Thanks for sharing the lecture!
Omg ... watch his first videos "The Design of C++" then watch this. His speaking skills improved DRAMATICALLY! He's awesome.
He probably thought "English is too simple. Why would they teach such a language as a first one, there isn't enough thinking involved" :D
EXCELLENT lecture!!!
THE MAN THE MYTH THE LEGEND
His talks are so dense! Packs in a wealth of information and insights that need real dedication to follow.
C was the language that you could learn and master (completely) in 6 months.. Mastering modern C++ would take.. ehh.. about 60 years? Still, I want to thank Stroustrup for his efforts in making a practical language with more structure than C. This talk is pretty good as well. But today C++ seems like ancient history and only suitable for carrying forth legacy code. Since I've discovered Rust, I know that using C++ for larger code bases is unwise. I still use it for microcontroller code, where you maybe have 2000-3000 LOCs. Undefined behavior is just too much of a burden to deal with when project complexity scales up.
You can learn C in an afternoon - but mastering it? That takes building an entire operating system with it - realising it sucks and doing it again over and over for 20 years 😂
@lazy horse There's more clutter when writing C for sure. The STL does a good job too. In C the standard libs have always remained very spartan. But C++ still retains the same UB, and there's an exception to every rule, plus all the legacy and os dependent / compiler dependent behavior.. which makes it an impossible language, really. That's why I switched to Rust.
I still think C++ is worth it more than C. I tried to learn both, and I found that C++ clicked more with me; I just find it more logical and modern than C. Sure, it abstracts a lot, but it also gives you the same flexibility that C gives you if you need it. I think that C is just harder than it should be: mind you, the syntax rules are simple and the language is small, but it's not practical, and it's just hard to write any useful application in it, since you have to implement everything yourself. I hated how much uglier the syntax is compared to modern C++.
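A small sketch of the "implement everything yourself" point: the C++ standard library provides owning, growable strings and containers where the equivalent C would be hand-rolled malloc/realloc/free bookkeeping:
#include <string>
#include <vector>
#include <cstdio>

int main() {
    // Growable string: in C this would be manual strcpy/realloc/free.
    std::string greeting = "Hello";
    greeting += ", world";
    greeting += "!";

    // Growable array of strings: no manual capacity tracking, no leaks.
    std::vector<std::string> lines;
    lines.push_back(greeting);
    lines.push_back("second line");

    for (const auto& line : lines)
        std::printf("%s\n", line.c_str());
}   // all memory released automatically here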
You can mix and match - university of simplicity here - put your old codebase in a library, DLL, etc., and just call it from newer C++ versions if you do not feel like updating the code.
Long ago, I read a book by Edsger W. Dijkstra and, more or less, I've never had a problem with buffer overflows ever since. I read that book in the late 1980s, and the book was already old news then. Dijkstra's recommendation to error handling was this: if the operation can't legally be done, don't do the operation. No need to create a new control path, you'll be surprised how fast your loop exits the bottom once it's no longer able to do any legal operation. Then at the bottom, you check whether your contract was satisfied, and if not, you return an error condition. I hated exceptions with a passion-and refused to use them-until exception safety was properly formulated (mostly by David Abrahams) in the context of formal RAII. I continue to prefer Dijkstra's model, because it simply forces you to think more clearly, and implement your loops more cleanly. The only drawback is that it's hard to avoid defining a handful of boolean variables, and you can end up checking those variables almost every statement.
Supposing C++ had a lazy &&=, the structure looks somewhat like this:
bool function () {
bool ok = true;
ok &&= first_thing(); // function not called unless ok is true
ok &&= second_thing(); // function not called unless ok is true
return ok;
}
Mathematically, _every_ operation you attempt has a precondition on whether it can be executed legitimately.
But we learned way back in the 1970s-the age of Dennis Ritchie-not to write code like this:
for (p = p_begin, s = s_begin; s < s_end; ++s) {
if (p < p_end) *p++ = *s;
}
My God! All those extra loop iterations doing _nothing_ you might execute for no good reason if (p_end-p_begin)
@xybersurfer I wrote that long answer in two parts because my first "submit" returned me to the Kansas of an empty canvas. Google disliked the length of my post (an "error" I suppose) and so it dumped all my bits onto the floor. But I had pressed CTRL-C with my prehensile beard-as I _always_ do-before I pressed "reply" so nothing was lost. Because I've been to Kansas _so_ many times before. Social media has that strong Kansas smell, ever and always.
Christopher Hitchens in Vanity: _Clive James needed an audience and damn well deserved one. His authority with the hyperbolic metaphor is, I think, unchallenged. Arnold Schwarzenegger resembled "a brown condom full of walnuts." Of an encounter with some bore with famous halitosis Clive once announced, "By this time his breath was undoing my tie."_
In similar vein, the Kansas smell in social media curls my beard around CTRL-C before I risk committing my work to the vicissitudes of a hyper-redundant Google data farm.
The second "why" about the length of my post is that your response came up on my screen as I was already deep in contemplation of "metacognitive annotation". That's a grand expression for commenting your code, but in my version it also has other large spheres. Two different "all new" approaches to fundamental physics were unveiled this month, both from the lunatic fringe where lunatic rhymes with "way smarter than your average bear". One was Stephen Wolfram. The other was Eric Weinstein. These are both avowedly ambitious with capital AA. They have also both attracted a lot of sniffing around by the why-haven't-we-got-AGI-yet? crowd. That's another cognitive Kansas jump. This time it's human technology riding the magic-slipper carpet (and noticing nothing at all _en route_) instead of your tidy exception return value.
I don't believe in this kind of long jump in any sphere. A _lot_ of scenery has to pass under the bridge before we have a fully developed AGI. One of those is that we need to develop a programming methodology where this entire ridiculous error-handling debate simply goes away. Probably "error" barely belongs in the conversation by the time the dust settles. Probably in more advanced systems, every request for action receives an advanced data structure which serves as an intent manifest, which is available for all manner of sophisticated annotation, for the manner in which it seems best to handle contingencies (some of which might be rigorously classified as actual errors).
One of the problems we would have to confront in doing so are the institutional boundaries, where code degenerates into institutional silos, such as the major JavaScript frameworks.
I might not live to see this fixed in my lifetime, but if so, I don't expect to witness AGI in my lifetime, either.
@xybersurfer Long ago I attended a pretentious university (Waterloo, Ontario) where the faculty was determined to teach freshmen and sophomores the One True Way, which in that year was the Pascal programming language, with a top-down design methodology. Everyone I knew with any talent was conniving to get an account on any machine with a viable C compiler, while the faculty was conniving to keep access to these machines as limited as possible for the sake of "sparing the children". But no high-integrity system is ever designed top down. You mainly work bottom up at first from your core integrity guarantees. And then once you have a coherent API over your core integrity layer, you work top down, but still not as I was originally instructed. The top-down portion is architected from your test and verification analysis. The best modularity is the maximally testable modularity, not the modularity that produces the least or the cleanest code. In particular, your abstraction boundaries only need to be clean enough, rather than conceptually pristine (pristine when you can swing it, but not at the cost of making the full system less testable). The Linux people thought that ZFS had too many layering violations, and that with the benefit of hindsight and some fundamentally better algorithms deep down, they could kick ZFS to the curb with Btrfs. Actual outcome: Red Hat deprecated Btrfs, and RHEL 8 (May 2019) shipped without it.
Btrfs team: We know pristine (good), we know clutter (bad).
ZFS team: We know core coherence layering (hard, but doable with a clear head), we know testability at scale (hard, but doable with sustained cultural pragmatism).
From Rudd-O: "ZFS will actually tell you what went bad in no uncertain terms, and help you fix it. ... btrfs has nothing of the sort. You are forced to stalk the kernel ring buffer if you want to find out about such things."
That's a guaranteed fingerprint of a click-your-heels-to-return-to-Kansas design ethos. At which point all you can do is emit a somewhat cryptic message to the OS ring buffer.
The original, inviolable rationale for why top-down design, as I was once taught it in Pascal, was going to rule the world was that it's fundamentally based on divide and conquer. And what can possibly defeat divide and conquer? We had hundreds of years-if not _thousands_ of years-of history with this technique in mathematics. And it truly rocks in that domain. Because mathematics-of all human things-most excels at abstraction from messy circumstance. By emulating mathematics to the ultimate extent, so too can software _pretend_ to exist in this austere world.
The original ZFS design team at Sun knew how to design the system around their validation framework, because they were _deeply_ immersed in the world of the shit hitting the fan. Before Matt Ahrens proposed ZFS, a typo in the configuration of the Solaris volume manager caused the system to lose all home directories for 1000 engineers. And that was far from the only domain of extreme suffering. Original design principle: to escape the suffering, the administrator expresses intent, not device mappings.
Now I'm circling back to the original problem. When you're muzzy-headed about what constitutes a conceptually coherent error class (because you've lumped sick world & sick dog under "click-your-heels" through the underground exception-return expressway) there's barely any possible way to think clearly about application intent. Oh, but you say "the exception return object can be arbitrarily complex to handle this". And right you would be, only there goes your pristine and uncluttered code base, for sure. And now what have you really gained?
Because you are already welded into the exception mindset, you're almost certainly going to try to report the damage on first contact with a 7'2" Santa Claus guarding the post. And you return an object encoding error 666: denied by Giant Black Goliath in a red deerskin vest. I've actually played around with writing code where I would continue to attempt to do all the things the code can legally do _after_ something has already gone fatally wrong (fatally wrong meaning there is no remaining chance to achieve your subroutine's mandatory post-condition) and then report back _all_ the resources and actions that proved unsuccessful in the quest to accomplish the assigned task. (There is no possible way to implement this coding strategy that you would not decree to be hopelessly cluttered.) And this is interesting, because when you do get six errors all at once (illegal host name, cannot allocate memory, no route to host, private key does not exist) then you're immediately pretty sure you entered double-insert mode in vi and there's an extra "i" character somewhere in your application's configuration file.
Suppose you tell a human minion: deliver this yellow envelope to 555 Hurl St, last door on the right at the end of the long yellow hallway on the third floor, and then grab me a muffin from the bakery downstairs. Your minion comes back to report "no such address" (with no muffin), doesn't point out that while the hallway on the third floor was long, it was orange instead of yellow, and never even checked whether there was a bakery down below, because unfortunately the hallway was orange instead of yellow, so the minion didn't even _think_ about completing the rest of the assigned errand. "And there was no bakery down below" would have you checking your street address first of all. "And there _was_ a bakery down below" would have you checking your floor number first of all (while happily munching your hot muffin as a side bonus).
Human contingency is _so_ different from code contingency, and I just think this is wrong-headed all around.
But modern computer languages really do tend to compel divide and conquer, and I concede that it's extremely difficult to enact human contingency in a way that's clean and compatible with this reductionist software engineering ethos.
I've seen lots of unnecessarily cluttered code. Almost always because of a woolly conceptual foundation, where you're half in one system of coping and half in another system of coping. But I've gradually evolved to where I almost never count clean expression of preconditions or postconditions (all the way down to the level of individual statements) as active code clutter. My eye now regards such code as a somewhat prolix signature of a job well done, no ruby slippers involved.
@xybersurfer Naturalness only exists in a cultural context. You choose to operate within a software culture with a weirdly enlarged notion of "error". If I tried to sign up on Facebook and entered "Allan Stokes" as my preferred username, would the system create me a new account, or would it display an "error" that my username is already taken by some other Allan Stokes of no particular distinction? That counts as an "error" in what sense, precisely? Was it an error that some other Allan Stokes wanted my personal name? Was it an error that I also wanted the same? Was it an error that Facebook did not originally design their social network so that all users were known by a globally unique 128-bit GUID, preventing this kind of land-rush clash for all time? I count _none_ of these things as errors. Not everybody can have everything at the same time in the world we actually live in.
Since I'm already a decade or two late boarding the Facebook ark, it's now a Hail Mary for me to obtain the simplest rendition of my own name in this sphere. It's for the same reason we send Santa a list, and not just the single item "Ferrari Testarossa". Santa might be short on Ferraris this year, especially in light of present circumstances in Milan. So it's a wise policy to provide Santa with options B through F as well. Was it an "error" for me to want a Ferrari Testarossa? Or was it merely the first interaction in a distributed algorithm to probe the maximal intersection point between my list of desires and Santa's available merchandise?
Likewise, if my algorithm wants 10 GB of DRAM working-space so as to run faster, is it an "error" if my request can not presently be granted by Santa's elves? I don't count that as an error, either.
Let's suppose I'm only asking for 256 bytes, and malloc returns NULL. Under those conditions, whatever my subroutine is trying to accomplish is probably not viable. How does the OS know that this is now an error as you see it, because your program design has boxed you into a corner with no recourse but to barf up a White Flag of Surrender message box? The OS does _not_ know this. (Is it a coincidence that "throw" and "barf" are somewhat synonymous? I think not.) What the OS _could_ know for certain is that you just called free on a pointer it can't find on any active allocation list, which is _definitely_ an error. And if your program has just called free on a non-thing (type II error), what confidence do you now have that you haven't previously called free on a thing you intended to continue using (type I error)? Practically NIL. This is now Hunt the Wumpus territory. "It is now pitch dark. If you proceed, you will likely fall into a pit."
But this isn't actually defined in the standard as an "error". The standard instead says that free's behaviour is now undefined. And once _any_ function commits undefined behaviour, this condition can not be erased short of exiting the current process (although in theory, attempting to exit your process can now also be undefined)-and now the state of your Unix process tree can potentially also become undefined and it's probably now time to pay homage to CTRL-ALT-DEL. In practice, exit is usually engineered to not depend on much, so it usually won't fail to exit your process, to some number of nines (typically more for BSD with a belt-and-suspenders attitude dating back to the original graybeard; fewer for Linux, which regards too many nines as a type II cultural error that greatly hinders the effective evolution rate).
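To make the contrast concrete, a minimal sketch (the function name is invented): the null return from malloc is a foreseeable contingency reported to the caller, while a second free of the same pointer is the genuine, undefined-behaviour error.

#include <cstdlib>

// Hypothetical worker: needs 256 bytes of scratch space to meet its postcondition.
bool do_small_job() {
    void* buf = std::malloc(256);
    if (buf == nullptr)
        return false;    // foreseeable contingency: not viable right now; the caller decides what's next
    // ... use the 256 bytes ...
    std::free(buf);      // calling std::free(buf) a second time would be the real error:
    return true;         // undefined behaviour, which no later check can take back
}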
This is the funny thing about error culture in computer science. The unambiguous errors are considered so severe, that the cultural response is "why even contemplate continuing to press forward?" whereas completely foreseeable circumstances-like two people trying to sign up on Facebook with the same username-are reified into the barf loop (aka culturally re-parented into the class of "exceptionable" error events).
My personal favourite case study is the Therac-25, a Canadian radiation therapy machine which either killed or badly injured at least six people. It had two beam strengths: regular and lethal (100× beam power). There was a turntable inside to rotate a bullet-proof vest in front of the lethal beam. On striking the bullet-proof target, the lethal beam would kick out some other modality of therapeutic radiation as a byproduct. Patient saved. In Dijkstra's world, you have ONE place in the code which is capable of unleashing lethal beam energy. Under what condition are you not allowed to do this? When the bullet-proof vest has _not_ been rotated into position. Otherwise, the patient experiences an extreme shock and _literally_ runs screaming out of the treatment room (as did a fatally wounded Roy Cox in actual fact). But you see, the software was designed to ensure that the correct turntable position was already activated and properly in place long _before_ subroutine release_the_radioactive_hounds was called upon to do its dirty work.
Mechanically, the turntable implemented _three_ separate microswitches to ensure that the software could reliably detect its true position (not just two sets like your microwave oven door).
They got carried away with thinking about what the software was trying to _do_ (treat the patient) and forgot to think clearly about what the software was trying to _not_ do (kill the patient). There's no excuse on God's green earth for a software culture which accepts _anything_ other than a last conditional statement to check the turntable position (and every _other_ survival-critical precondition, if there were any more) before the ONE line of code capable of unleashing the 100× lethal beam. We were a good 35 years into the software industry before these fatal accidents happened. Dijkstra had completely set this straight already by the 1960s.
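In code, the discipline being described amounts to very little. A hypothetical sketch (invented names, not the actual Therac-25 software), with the survival-critical precondition checked at the last possible moment, immediately before the ONE dangerous line:

struct TurntableState {
    bool in_position = false;                                // fed by the three microswitches
    bool target_in_position() const { return in_position; }
};

void emit_full_power_beam() { /* the only routine that touches the 100x hardware */ }

// The ONE place in the code allowed to fire the lethal beam.
bool fire_high_power_beam(const TurntableState& turntable) {
    if (!turntable.target_in_position())
        return false;            // refuse and report; never fire
    emit_full_power_beam();      // the single line capable of doing harm
    return true;
}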
In ZFS or PostgreSQL-or any other life-critical application (as viewed by the data under storage)-the implementation culture is strongly biased to think first and always about what you simply MUST NOT DO (e.g. create a race condition on flush to disk). Further up the software stack, the culture flips into: call the extremely sophisticated and fast-moving API as best as you can and mostly hope for the best. If a 7'2" Santa Claus with a 30" vertical leap power-blocks your layup attempt, you click your heels through the handy-dandy uncluttered exception mechanism, and return to Kansas, hardly any the wiser, but more or less still alive. Betcha didn't know Santa had that move. Good thing your ruby slippers are maximally topped up on reindeer credits.
Only here's the downside: people who live in Kansas begin to develop an extremely muzzy sense of what a true error actually looks like. "Error" becomes a hopelessly muddled class of eventualities where it seems more convenient to click your heels together than to write a robust code block with a clear sense of the hazards it is actually navigating. And then this becomes further reified into what "uncluttered" code ought to properly look like, because we are, at the end of the day, cultural creatures who thrive on cultural uniformity.
@Allan Stokes The problem with avoiding exceptions is that you are just manually implementing the error handling mechanism already present in the language. You are also wasting the return value, which could be used for something more natural. The big advantage of exceptions to me is that you can abstract them away; this way you are left with code that mostly describes the task. That is much cleaner than code checking for errors with the actual task buried somewhere inside. I'm sure you could make it work, like most things, but it looks like it results in a lot of unnecessary code.
Also, I hope people read carefully enough to realize that I was exactly half serious: half serious because this *does* actually work, half not serious because it only works for a stubborn iconoclast willing to put up with certain extreme forms of surface ugliness-and stubborn iconoclasts (like Knuth and Bernstein and every second Haskell programmer) have always managed to write software with orders of magnitude fewer bugs than competing endeavors from lower down in the iconoclasm pecking order, so it's really not such a special value add.
Good information, thank you for sharing.🙏
I love C++, but at the kernel level a lot of the features cannot be used as they are dependent on supporting libraries which are not available.
@Pieter van der Meer It's really fun to try to write a kernel in C++ though :D
There's a reason why C++ was never accepted as the language for Mach or Linux kernels. Most of its power is in its dynamic allocation, and this is unwise in kernel context.
Wow, this is unbelievable!! I never thought I'd sit in front of this eminence, sir; I can't believe it, because to me he was one of the geniuses behind my Turbo C/C++ book. Amazing! Oh, never mind the fact that I'm sitting at home :P
The internet is amazing!
Are there any talks where he gives reasons for using C++ over Java (and why it's so much more complex/low level)?
Here is a reason. Use Java (or C# if you are only targeting Windows) when you can. Use C++ only when you really have to.
C++ is faster because it runs directly on the machine; Java runs a little differently, which is why it can run on virtually everything (ATM machines, cashier machines, etc.).
Also, he doesn't like to talk badly about applications of C++, like Java.
+Sina Madani It's obvious he's not here to promote Java; he's here to talk about C++. C++ and Java have different design objectives. C++ was mainly used to develop systems applications, so it's focused on achieving control over low-level operations like memory management (C++ has pointers and doesn't have a garbage collector) and, at the same time, on being able to abstract away using object orientation.
Java, however, was developed to enforce the object-oriented paradigm, which has led to it being one of the most successful (if not the most successful) programming languages in the world. Java addresses two of the key challenges of today's programming world: the first is "duplicate code is bad" and the other is "code will always keep changing". Those two problems are best solved using the object-oriented paradigm, for which there is no better language than Java.
If we are talking about how hard they are to learn, then yes, C++ is harder because it's a hybrid language; sometimes I view it as a combination of C and Java, but that's just me :p. But each language can do some things more easily than the other, which means that each can achieve a depth of complexity unreachable by the other, depending on what they are used for.
As a programmer, I'll tell you what really ticks me off about these high-level OOP languages. I'm pretty good at logic and program flow, but my productivity is drastically slowed down by having to remember, or look up, all the INSANE syntax, variable passing, and multiple file dependencies of C++.
It's an amazingly powerful language, but it also gives you the opportunity to shoot yourself in the foot at every step of the way. To use it simply and safely, you'll probably start using smart pointers and copying strings instead of referencing them, undoing its performance potential. Still, you can do everything, including shooting yourself in the foot.
Inspired me to update my codebase: zero new/delete, fewer lines, hundreds of bugs killed.
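For anyone curious what "zero new/delete" tends to look like in practice, a minimal sketch (hypothetical Widget type) where std::unique_ptr and the containers do all the owning:

#include <memory>
#include <vector>

struct Widget { int id = 0; };

std::unique_ptr<Widget> make_widget(int id) {
    auto w = std::make_unique<Widget>();   // no raw new; ownership is explicit
    w->id = id;
    return w;
}

int main() {
    std::vector<std::unique_ptr<Widget>> widgets;
    widgets.push_back(make_widget(42));
    // no delete anywhere: everything is released automatically when widgets goes out of scope
}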
Menne Kamminga gg
Good material before a sleep session, this story is quite a lullaby.
brilliance beyond my software engineering skills
Holy crap, this was a great presentation.
I won't watch any of my teachers or preachers lecturing for more than 20 minutes. This is 1 hour and 40 minutes long and I don't think Dr. Stroustrup is a gifted orator, but I will watch it to the end. The difference is the level of respect; I don't think any other living man deserves quite as much respect from me as Dr. Stroustrup.
Indeed, it's interesting. He even has a monotone voice delivery. What keeps us listening is the level of precision and relevancy with which he delivers all the information.
Wait, so this guy is responsible for the creation of C++?!
Awsome!
This man is a legend, and C++ is a great language, maybe the most important programming language.
@Joe Stevenson Not to downplay C++, but C is easily more important than C++.
Nearly 100 dislikes might be from the creators of 100 other languages
The Morgan Stanley guy's biggest achievement in life is getting his managers to switch from C to C++. There is something very sad about that ...
@Bextract He's talking about the guy at the beginning. Not Bjarne. You completely misunderstood his very clear comment lol.
hyqhyp wth.. he invented one of the most commonly used languages of modern time. What are you smoking?
Making a large company change their primary programming language is a serious achievement.
Actually, he said he got them to move from Fortran to C and to C++, but I still agree with your point...
hyqhyp is there any article about it?
It's just a frequency that is Morse code. Depends on the frequency and beat. Particles and waves. On a keyboard.
The essence of C++ is C.
1:07:53 this is what a genius sounds like. He's speaking code if you listen carefully.
I appreciate Bjarne Stroustrup.
7:00 Tell them what you're going to tell them, tell them, tell them what you told them: done to perfection-that shiny trader is going to really enjoy this lecture.
he's one of the few who can use words like encapsulate and you know he isn't trying to pull a trick
So you think when people use words with more than two syllables in them they are trying to trick you?
Wow! Concepts and modules should have been finalized before C++17!!!
It's a difficult situation. Every release makes it less and less welcoming for new developers, as it grows to be a larger and larger monster of a language.
@Tin Švagelj The problem is with the industry hiring new developers. With green field projects it can be fine, but often they have old, large legacy code bases which have already accumulated all kinds of C++ features and styles over the years. It can be a daunting experience to jump in such a project if there are holes in your understanding of "historical C++". There's not so much you can just forget about.
You don't need to start using it by jumping straight into template metaprogramming. You start with C code, fix the type safety, replace pointers with references wherever possible and use the STL containers you find useful (the most common are vector, map, list and tuple). You then learn how to use move and copy constructors and assignment. That's basically it; you can write good enough code knowing just that. As with any other language, you improve with practice. There's plenty of resources available online.
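A rough illustration of that progression (hypothetical names, nothing from the talk): the C-style signature gives way to containers and references, and the manual size/pointer bookkeeping disappears.

#include <map>
#include <string>
#include <vector>

// C-style starting point (for comparison):
// int sum(const int* xs, int n);

// After the first pass: the container owns the data, the reference can't be null.
int sum(const std::vector<int>& xs) {
    int total = 0;
    for (int x : xs)
        total += x;
    return total;
}

// Containers compose without any manual memory management.
std::map<std::string, std::vector<int>> scores_by_name;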
The only man who actually understands C++ 🤣🤣🤣
Higher-level languages like Python and Ruby contain certain "derived" or inherent OOP aspects of C++ though... it's a good language to learn as your first.
@J T I've read two or three different C++ books. The 1st one was written by Bjarne himself. It was C++ for ARM mostly. I was too young and stupid back then and gave up at around the 70th page, where he talked about references. (It could also have been a poor translation of the book, not sure.) Some time later, I understood classes, inheritance and all related stuff from yet another book. It's well over a decade ago, but when I sit down to any C++ thing, I like to have some web page briefly describing what's in the STL and some more advanced concepts at hand, just to make life easier. What I've also learned is that I will never fully learn it... It's too complicated and has been changing a lot lately. For example the lambdas (hate them, or life's too short to get used to them). Keep things simple - don't overkill!!!
@Jorge Bastos Syntax?? Are you serious?
C++ is art; C++ has a few flaws, but syntax is not one of them.
C++ is C ++
I write Rust now, not because it's better than C++, but because I can write concurrent programs safely.
But never talk about the syntax being horrible; maybe the std library is congested.
AHAHAHA LMAOOO
When I see his code, everything is overcomplicated for my taste. I can do everything simply and properly, but in Java.
I really liked the jokes he made about programming.
The essence of C++ is to show how not to program
C++ was my language from 1991 to 2017. Not anymore. Its father killed it for the sake of academia over users.
I just watched this entire talk and I honestly can say most of what was said was beyond my understanding. I want to learn C++, but it seems too hard?
It's really not. The problem is, as the man said, that it's been around a long time, and has been, in a way, several _different_ related languages over the years. Keep in mind how very slow and clunky computers were when C++ was introduced - a lot of the original language relied on the programmer to do most of the heavy lifting. That is, you needed to think about the machine almost as much as the problem you were working on. These days, if you use the right parts of C++, you can offload most of the "thinking about the machine" part to the compiler and just concentrate on the problem you're solving. The compiler itself can be a much larger program than was possible in the early days, and it will compile your modern code in seconds or minutes rather than weeks. Most of this talk was about why it's better to use the newer language features when you can rather than the features that were there so that your steam-and-hamster-powered 4MHz processor with maybe a couple of megabytes of memory (huge for the time) wouldn't choke trying to compile it.
C++ is an absolute dragon of a language. The best languages are C# and Pascal. Wirth and Hejlsberg FTW!
I prefer the term Leviathan. ;-)
So this is the guy that made my CS classes so difficult. He looks exactly how I imagined he would look.
hahaha you made my day
At 1:13:08 he spoke about the creator of OOP; I couldn't understand the name properly. I thought it was Alan Kay who invented it, but apparently Bjarne states otherwise.
I'm very curious to read more about the man he talks about; if anyone can write his name, that would be lovely.
Kristen Nygaard who together with Ole-Johan Dahl invented Simula
At 49:43, the code for the ranged loop is wrong, it should be for(auto& x : s) x->draw();
Some of the other examples were also missing semicolons.
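For context, a minimal self-contained version of what that loop presumably runs over, assuming (as in the talk's class-hierarchy examples) that s is a container of smart pointers to a Shape base class:

#include <memory>
#include <vector>

struct Shape {
    virtual void draw() const = 0;
    virtual ~Shape() = default;
};

void draw_all(const std::vector<std::unique_ptr<Shape>>& s) {
    for (auto& x : s)    // x is a reference to each unique_ptr<Shape>
        x->draw();       // virtual dispatch to the concrete shape
}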
Christ it takes a bit for Bjarne to actually get on the stage doesn’t it?
His accent is the best I've ever heard. It's so classy.
His talk starts at 7:20.
I love the way this guy talks
What a great answer about not writing a language like Java 😄😄
Amazing man!
The 186 people who disliked this can't keep up with Mr. Stroustrup.
i love his very subtle sarcasm
Bjarne was always one of my gurus. When I turned on this video, I was wondering if I'd be able to make any sort of comment on it. Well:
14:16 "Complexity of code should be proportional to the complexity of the task"...
What failed during the last 20 years, and why do we have so much "N-gigabyte fat" crapware?
Pretty much perfect for an hour and a half ...
Funny thought of Stroustrup on computer scientists at 40:25 😂😂😂
The language of languages.
Good listening to the master.
This is so content rich.
Even c++’s creator thinks it’s designed using poor principles. I would bet that the greatest cost of unsafe languages has been security fails. It’s easily in the billions. But once a language is entrenched, it just keeps on trucking. I bet it would still be there in decades to come.
1:31:30 Actually, not even C can agree on an ABI calling convention between 64-bit Linux and 64-bit Windows running on the same x86-64 hardware. There's little hope of C++ getting a standard ABI before there's an agreed-upon standard for C.
7 and a half minutes of introduction? Did I end up in real life somehow? 🤔
Haha, legendary Bjarne "First of all, i dont go round killing people"
What did he mean by this comment?
I can understand those sighs... a most pragmatic guy invents a language that gets used in the most impractical ways.
I don't think we should even *know* the wrong ways of doing things... Nobody teaches non-generic Java or C#; you have to figure it out for yourself when you come across it, and neither Java nor C# has an international standards committee dedicated to it, so how come C++ can't get its act together in that regard? The committee should make compilers mark certain language constructs as obsolete, but the committee has an abusive relationship with compiler makers, and we wonder why the committee stays ("had pizza" comes to mind).
Scotland from C to C++.
If you want to be in the field you should learn at least 2, preferably 5, maybe a few more.
Thanks for this channel, very exquisite.
C++ is everything you could want it to be; it's a beautiful language.
It's pretty powerful,but also hard to control
Frankly you are insane
Tyler Minix "Beautiful" as in "it's what's on the inside that counts," because it's horendeus to look at 😀
Thank you .... Morgan Stanley , speaker Bjarne Stroustrup & Dave Robertson @ University of Edinburgh
Love Christopher Guest, he's the man of a thousand faces!!!
@42:50 Is he talking about nulling the pointers, so that by nulling them the pointers aren't actually referencing anything by the time the function goes out of scope, or is it some other way? I thought nulling a pointer led to undefined behavior.
Since C++11 there is a feature called nullptr.
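What he is describing around 42:50 is the standard move-constructor idiom: the moved-from object's pointer is set to nullptr so its destructor deletes nothing, and deleting a null pointer is well defined (it does nothing). A minimal sketch with a hypothetical handle type:

#include <cstddef>

class Buffer {
    int* data = nullptr;
    std::size_t size = 0;
public:
    explicit Buffer(std::size_t n) : data(new int[n]), size(n) {}
    Buffer(Buffer&& other) noexcept : data(other.data), size(other.size) {
        other.data = nullptr;    // the moved-from object now owns nothing
        other.size = 0;
    }
    ~Buffer() { delete[] data; } // delete[] on nullptr is defined and does nothing
    Buffer(const Buffer&) = delete;
    Buffer& operator=(const Buffer&) = delete;
};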