Comment: Re:Refactoring done right happens as you go (Score 1) 220

by TheRaven64 (#49179067) Attached to: Study: Refactoring Doesn't Improve Code Quality

Newton looked at the spectrum and saw that, to the human eye, it contained six distinct colours: red, orange, yellow, green, blue, purple. But his alchemical beliefs held seven to be a magic number, so he wanted the spectrum to have seven colours. He decided to split purple into indigo and violet to get there, but didn't split any of the others (even where the difference is at least as pronounced), because that would have contradicted his mystical thinking.

If even Newton, 'one of the smartest men ever to live', couldn't manage to keep his science separate from his mysticism, what hope do you think other religious people have?

Comment: Re:Uh, what? (Score 1) 83

by TheRaven64 (#49179029) Attached to: Khronos Group Announces Vulkan To Compete Against DirectX 12

This is a confusion in terms. Personally I blame Sun. An interpreter IS a form of compiler; it is the term used to refer to compilation at run time

No, sorry. A compiler is, in theoretical terms, a partial application of an interpreter to a program. In practical terms, a compiler transforms the input into some other form, which is then executed, whereas an interpreter executes the input directly. JIT compilation is still compilation: 'just-in-time compiler' is the term for a compiler that produces its output just before it is executed, as opposed to an ahead-of-time (AoT) compiler, which produces it all up front, even if some paths are never executed.

There's some complication, because most environments that do JIT compilation also include interpreters, which gather profiling information to feed into the JIT-compiled code and improve startup times. JavaScript implementations, in particular, often spend a reasonable amount of time in the interpreter, because most web pages contain a load of JavaScript that's only run once or twice and the time taken to compile it would exceed the time saved executing it. Some have multiple compilers: JavaScriptCore from the WebKit project has an interpreter and three different JIT compilers that occupy different points in the trade-off between compilation time and execution time, and it recompiles hot paths multiple times as they're executed more, with more optimisation each time.

The key difference between the interpreter and the compilers here is that each compiler is invoked once on a segment of code, which is then executed without involving the compiler again. The interpreter is involved every time the bytecode is run: it reads a bytecode, jumps to the segment of interpreter code that executes it, and returns. The compiler takes a sequence of bytecodes, generates a fragment of native code that executes them, and that fragment is then combined with other fragments to produce a running program.
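
To make the distinction concrete, here's a toy sketch in C++ (nothing from JavaScriptCore; the three-opcode stack machine and every name in it are invented for illustration). The interpreter decodes each opcode every time the code runs; the "compiler" decodes the bytecode once into a list of executable fragments, and running those fragments never touches the bytecode again. A real JIT would emit native code rather than closures, but the shape is the same.

    #include <cstddef>
    #include <cstdint>
    #include <functional>
    #include <vector>

    // A three-opcode stack machine: PUSH is followed by a literal byte.
    enum Op : std::uint8_t { PUSH, ADD, MUL };
    using Code = std::vector<std::uint8_t>;

    // Interpreter: decode-and-execute, one opcode at a time, every time it runs.
    std::int64_t interpret(const Code &code) {
        std::vector<std::int64_t> stack;
        for (std::size_t pc = 0; pc < code.size(); ++pc) {
            switch (code[pc]) {
            case PUSH: stack.push_back(code[++pc]); break;
            case ADD: { auto b = stack.back(); stack.pop_back(); stack.back() += b; break; }
            case MUL: { auto b = stack.back(); stack.pop_back(); stack.back() *= b; break; }
            }
        }
        return stack.back();
    }

    // "Compiler": translate the bytecode once into executable fragments.
    using Stack = std::vector<std::int64_t>;
    using Fragment = std::function<void(Stack &)>;

    std::vector<Fragment> compile(const Code &code) {
        std::vector<Fragment> out;
        for (std::size_t pc = 0; pc < code.size(); ++pc) {
            switch (code[pc]) {
            case PUSH: { std::int64_t k = code[++pc];
                out.push_back([k](Stack &s) { s.push_back(k); }); break; }
            case ADD: out.push_back([](Stack &s) {
                    auto b = s.back(); s.pop_back(); s.back() += b; }); break;
            case MUL: out.push_back([](Stack &s) {
                    auto b = s.back(); s.pop_back(); s.back() *= b; }); break;
            }
        }
        return out;
    }

    // Running the compiled form only walks the fragments; the decoder is gone.
    std::int64_t run(const std::vector<Fragment> &prog) {
        Stack stack;
        for (const auto &f : prog) f(stack);
        return stack.back();
    }

    int main() {
        Code code = {PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL};  // (2 + 3) * 4
        return interpret(code) == run(compile(code)) ? 0 : 1;
    }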

The shader compilers in drivers, however, are not JIT compilers. They are AoT compilers that are invoked at load time, and often at install time. They don't compile the code just before it's run; they typically compile it once and cache the result across multiple invocations of the program. Some platforms (Windows and Android come to mind) have a mechanism that allows the compilation to be done at install time. Unlike most JIT environments, graphics drivers don't tend to use run-time profiling for optimisation; the bytecode exists solely to provide an ISA-neutral distribution format.
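
A rough sketch of that compile-once-and-cache pattern, with everything hypothetical: compileToNativeISA() is a stand-in for whatever backend a real driver invokes (stubbed here so the sketch is self-contained), and the cache key is just a content hash of the portable bytecode, where a real driver would also key on driver version and GPU model.

    #include <cstdint>
    #include <functional>
    #include <string>
    #include <unordered_map>
    #include <vector>

    using Bytecode = std::vector<std::uint8_t>;
    using NativeBlob = std::vector<std::uint8_t>;

    // Hypothetical AoT backend: portable bytecode in, GPU machine code out.
    // Stubbed as the identity transform for the purposes of this sketch.
    NativeBlob compileToNativeISA(const Bytecode &bc) { return bc; }

    class ShaderCache {
        std::unordered_map<std::string, NativeBlob> cache_;

        static std::string key(const Bytecode &bc) {
            // Content hash of the bytecode.
            return std::to_string(std::hash<std::string>{}(
                std::string(bc.begin(), bc.end())));
        }

    public:
        // Called at program load (or install) time: compile on a miss, reuse on a hit.
        const NativeBlob &get(const Bytecode &bc) {
            auto k = key(bc);
            auto it = cache_.find(k);
            if (it == cache_.end())
                it = cache_.emplace(k, compileToNativeISA(bc)).first;
            return it->second;
        }
    };

    int main() {
        ShaderCache cache;
        Bytecode shader = {0x01, 0x02, 0x03};
        const NativeBlob &first = cache.get(shader);   // compiled and cached
        const NativeBlob &again = cache.get(shader);   // reused, not recompiled
        return &first == &again ? 0 : 1;
    }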

Comment: Re:File extensions? (Score 1) 507

by TheRaven64 (#49178979) Attached to: Why We Should Stop Hiding File-Name Extensions

Ugh, trust MS to fuck up a reasonable UI choice. On OS X, by default, it only happens for programs, and bypassing it requires you to close the dialog and then bring up the context menu for the program while holding a modifier key. You don't know how to do that unless you've actually read all the way to the end of the dialog, so it generally protects people.

There are some interesting corner cases though, such as shell scripts. The file manager doesn't know whether the thing that you tell it to open a shell script with is a text editor or a script interpreter, so it may warn spuriously.

Comment: Re:Lack of appropriate options gripe: (Score 1) 152

by xaxa (#49175135) Attached to: Will you be using a mobile payment system?

I very frequently pay with a contactless credit card, which uses the same technology (at least outside the USA). I don't see the attraction of paying with my phone: I still have to get something out of my pocket, and it's easier to touch a plastic card to a reader than to unlock a phone, presumably open an app, authorise the payment, and so on.

Comment: Re:File extensions? (Score 2) 507

by TheRaven64 (#49172059) Attached to: Why We Should Stop Hiding File-Name Extensions

There are two problems. The first is that the OS allows you to run porn.jpg.exe that you've downloaded from some random place on the 'net. I don't think that either OS X or Windows just lets you do that: they'll both pop up a dialog saying 'You are trying to run a program downloaded from the Internet, do you really want to?', which isn't something that normally happens when people try to open a file, so it ought to prompt them to back out (and if it doesn't, then seeing the .exe extension probably won't either).

The second is that the OS allows programs and other file types to set icons at all before their first run. This also leads to confused deputy-like attacks where you think you're opening a file with one program but are actually opening it with something that will interpret it as code. The solution to this is probably to have programs keep their generic program icon until after their first run. If you double click on something that has a generic program icon, then you probably intend to run it...

Comment: Uh, what? (Score 2) 83

by TheRaven64 (#49171381) Attached to: Khronos Group Announces Vulkan To Compete Against DirectX 12

an LLVM-based bytecode for its shading language to remove the need for a compiler from the graphics drivers

This removes the need for a shader language parser in the graphics driver. It still needs a compiler, unless you think the GPU is going to natively execute the bytecode. If you remove the compiler from a modern GPU driver, then there's very little left...

Comment: Re:c++? (Score 2) 383

C++ is a language that is very good for generic programming. It doesn't really meet Alan Kay's definition of OO (and he's the one who coined the term), nor does it pass the Ingalls Test for OO. It has classes, but method dispatch is tied to the class hierarchy, so if you really want to adopt an OO style you need to use multiple inheritance and pure abstract base classes, which is a very cumbersome way of using C++.
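
As a rough illustration of that style (all the names here are invented): interfaces are expressed as pure abstract base classes, a concrete class picks up the ones it implements via multiple inheritance, and callers dispatch through the interface rather than through a concrete class hierarchy.

    #include <iostream>
    #include <memory>
    #include <string>
    #include <utility>
    #include <vector>

    // Pure abstract base classes standing in for protocols/interfaces.
    struct Drawable {
        virtual void draw() const = 0;
        virtual ~Drawable() = default;
    };

    struct Serializable {
        virtual std::string serialize() const = 0;
        virtual ~Serializable() = default;
    };

    // A concrete class adopts both interfaces via multiple inheritance.
    class Sprite : public Drawable, public Serializable {
        std::string name_;
    public:
        explicit Sprite(std::string name) : name_(std::move(name)) {}
        void draw() const override { std::cout << "drawing " << name_ << '\n'; }
        std::string serialize() const override { return "sprite:" + name_; }
    };

    int main() {
        // Callers only see the interface, not the concrete type.
        std::vector<std::unique_ptr<Drawable>> scene;
        scene.push_back(std::make_unique<Sprite>("player"));
        for (const auto &d : scene) d->draw();
    }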

The worst C++ code is written by people who are thinking in C when they write C++, but the second-worst C++ code is written by people who are really thinking in Smalltalk. If you're one of these people, then learn Objective-C: the language is far better at representing how you think about programs.

Any programming language can be used to write code in any style. You can write good OO code with a macro assembler if you want. However, every language has a set of styles that fit naturally with the language and ones that don't. You can force C++ to behave in an OO way, and it sort-of works, but it's not using the language in the most efficient way.

Comment: Re:C++14 != C++98 (Score 1) 383

The main advantage of auto is in templates, where the type is something complex derived from template instantiations. You simply don't want to make it explicit. Oh, and in lambdas, where explicitly writing the type of the lambda is very hard (though you generally cast to std::function for assignment). As for using a float instead of an int... I have no idea how you'd do that with auto. The type of auto is the type that you initialise it with. If you're using a platform where you don't have hardfloat, then why do you have APIs that return floats? There's your bug; the use of auto was just a symptom.
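
A small, made-up example of those two cases: auto hiding a type derived from a template instantiation, and a lambda, whose compiler-generated type can't be written out by hand (std::function gives it a nameable type when you need one).

    #include <functional>
    #include <map>
    #include <string>
    #include <vector>

    int main() {
        std::map<std::string, std::vector<int>> scores{{"alice", {1, 2, 3}}};

        // Without auto this would be std::map<std::string, std::vector<int>>::iterator.
        auto it = scores.find("alice");

        // A lambda's type is compiler-generated and has no name you can spell...
        auto doubler = [](int x) { return 2 * x; };

        // ...but std::function gives it a nameable type for storage or assignment.
        std::function<int(int)> f = doubler;

        return it != scores.end() ? f(it->second.front()) - 2 : 1;  // exits 0
    }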

And if you've never seen anyone use the algorithms library, then you must be using C++ in a very specialised environment. I've not seen any C++11 code that didn't use std::move. I've rarely seen a nontrivial C++ file that didn't use std::min or std::max, and most code will also use std::copy. The only code that I've seen that didn't use the algorithms library included its own, poorly optimised and buggy, versions of several of them.
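
For what it's worth, the sort of everyday use being described looks like this (a trivial made-up example): std::move to transfer an expensive buffer, std::min/std::max for clamping, std::copy instead of a hand-rolled loop.

    #include <algorithm>
    #include <iostream>
    #include <iterator>
    #include <string>
    #include <utility>
    #include <vector>

    int main() {
        // std::move: transfer the buffer into the container instead of copying it.
        std::string big(1000, 'x');
        std::vector<std::string> log;
        log.push_back(std::move(big));

        // std::min / std::max: clamp a value to [0, 100].
        int value = 120;
        int clamped = std::max(0, std::min(value, 100));

        // std::copy: append one sequence to another without a hand-written loop.
        std::vector<int> src{1, 2, 3}, dst;
        std::copy(src.begin(), src.end(), std::back_inserter(dst));

        std::cout << clamped << ' ' << dst.size() << ' ' << log.front().size() << '\n';
    }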
