Its name was, in fact, Boren.
That's true, but also missing the point.
The main issue is that these situations are not always that obvious in real-world code.
There was a talk covering these issues posted at the ACM in February:
Not sure if the video plays outside of Norway:
It's about at 54 min. in.
- Carlsen was given 30 sec to win.
- Gates humbly said he had a 1600 rating vs someone with 2000 etc.
- Gates was actually a sponsor of Carlsen at the start.
- Carlsen said he violated one of his principles by using a cheap trick to win.
Only the compiler/bridge needs to know about the physical resources; the user needs only a model of the hardware to work with. The compiler/bridge should then be able to produce optimal code with no language overhead. Unfortunately, software today is still built in a very ad hoc way. There is little or no state modelling, so you cannot map out or simulate what happens with the resources. We'd need model representations for all the hardware.
You're ignoring the reason C went the way it did: performance. 'int' can translate to whatever is fastest, not whatever the spec demands (as in Java, say). It's a tradeoff, and since C is a language that puts speed above just about everything else, I can't fault that decision.
C actually makes it a lot harder to produce the best optimized code, because its language constructs are tied too closely to the machine domain. An optimizing compiler has less leeway in how it can rearrange and streamline the logic to reach the goal.
The thing to get here is that there are basically two kinds of OOP, so to speak.
Here's a short discussion that covers it:
In Alan Kay's world, objects are sub-computers that receive messages from other sub-computers. In Barbara Liskov's world, objects are abstract data with operators and a hidden representation.
Kay OOP is closely related to the actor model by Carl Hewitt and others.
Liskov had her own idea of OOP, and she was not aware of Smalltalk (Kay, Ingalls) at the time. She started work on her own language, CLU, at the same time as Smalltalk was developed.
.. is that C was seen as a major setback by Frances E. Allen and others.
It [C] was a huge setback for--in my opinion--languages and compilers, and the ability to deliver performance, easily, to the user.
Frances E. Allen
ACM 2006 Conference
The context here is abstraction, and whether users (programmers) should be allowed to play with pointers directly (as in C and, later, C++). Exposing the machine like that is a setback for optimization, because of the assumptions programmers make about, and the connections they build to, the underlying machine.
If you want to learn more about the ideas of the 1960s and 1970s, I highly recommend looking up talks by Alan C. Kay ("machine OOP" which is Smalltalk in a nutshell), Carl Hewitt (actor model), Dan Ingalls, Frances E. Allen (programming language abstractions and optimization), Barbara Liskov ("data OOP" which is C++ in a nutshell), and don't stop there.
I love blindly copying memes..
In this case, however, things actually did get worse on many points. Look up talks by Alan C. Kay and Frances E. Allen at the ACM to get you started. For example, C, which UNIX and Linux are based on, is a giant step back with regard to concurrency and code generation, something already recognized when the language was released in 1973.
There was a similar study on this surrounding laser printers.
I know what you mean, and this is only about context.
What you may want to try is to use context tests.
Example: you start the program, then say "I want to do this" (think of some action), and then you do that action. The key point is not allowing yourself to think too much about what you do. It's not an "academic approach," but it's quite efficient, especially in games. The psychological factor, however, is avoiding things that hurt the ego; this is especially true when developers are unable to distance themselves from their work. In the end, it's just a tool.
Programmers are so binary.
Programmers have to know the domain for which they are developing.
You have to make using the software a regular part of your time, just as code review rotates people through roles.
For your example, if your software ships with that many unknown states, it's fair to say that whether it works right comes down to luck.
You would actually need an environment where you could map out states, and simulate input devices.
Software development is still young, and I don't think we know how to do it right yet.
There are many bugs that would easily be detected by actually using the application on a daily basis.
Users do not work for you. When they do post bug reports, it is most likely in frustration.
In most countries you have to work 5-15 years, full time, to save up enough capital to get a house loan from any bank. You can rent, of course, but that's a waste of money, and you most often have to keep moving. Usually when contracts expire, the price goes up.
Just out of school? Good luck getting any job! Student loan? Even better luck having to pay down that loan, and your bank loan, and other costs connected with your apartment.
The system keeps a tight leash on its citizens. People who do make it on their own cling to their paychecks to make it one month at a time. What a fucked up world to live in.
Also, to counter all the stereotyped comments (porn, games, everything paid for, mom doing the cooking): you can still live at home and pay rent for your 10 m^2 room, do your own laundry, and buy or make your own food. Who would have thought?
Ever notice that even the busiest people are never too busy to tell you just how busy they are?