
Comment Re:Do compilers really remove this? (Score 1) 470

Clang includes a number of compilation flags, the undefined-behaviour sanitizers, that can be used to make sure, or at least as sure as the tooling can manage, that your code never hits any undefined behaviour at run time.

But normally, yes: the compiler may change the behaviour of your application if you are depending on undefined behaviour.

Comment Re:TFA does a poor job of defining what's happenin (Score 4, Informative) 470

"What every C programmer should know about undefined behaviour" (part 3, see links for first 2 parts).

For example, overflow of signed integer values is undefined behaviour in the C standard (unsigned arithmetic wraps around and is well defined). That freedom lets compilers make decisions like using an instruction that traps on overflow if it would execute faster, or if that is the only instruction available. And since overflow is undefined, the compiler may assume the programmer never intended it to happen. A test that can only succeed after an overflow therefore has a known result at compile time: the code block it guards is dead and can be eliminated.

This is why a number of optimisations that gcc can perform are disabled when building the Linux kernel (it compiles with -fno-strict-overflow, for example). With those optimisations enabled, many of the kernel's overflow and wrap-around checks on addresses and counters would be silently eliminated.

Comment Re:eh (Score 1) 568

Charge-per-unit plans that place no barrier against excessive usage inevitably backfire with unexpected bills. One of those least-consuming users will install something, or their kid will, and they'll be facing an unexpectedly huge bill at the end of the month.

This happened many times in Australia in the early days of broadband. In response, the ISPs all set up "unlimited" plans, which in fact have a fixed usage quota per month. After you hit your quota they throttle your connection back to modem-ish speeds, rather than cutting you off completely, so you can't use much more bandwidth. You may then have the option of paying for another block of data, or bumping your monthly plan up permanently.

Comment Re:False. (Score 1) 140

Without onion routing, this VPN solution will expose your social graph, one of the key things they are interested in collecting, to the NSA or to servers run by other governments operating on the network.

Blocking access to something encourages an arms race to bypass the filter. I'd be more concerned about governments that allow you access, but monitor what you are doing and who you are talking to.

Comment Re:155 Forrester Clients (Score 1) 337

Right.... with such details as "autoSpaceLikeWord95", "footnoteLayoutLikeWW8", "lineWrapLikeWord6", and so on. Down that path, any competitor must conform to the way Office already does things, with all of the quirks and workarounds that have been built into it over its lifetime.

That's the wrong way around, and not what I said. Get all the authors of competing office applications in a room to hash out an interchange format, similar to the HTML standards process, with a core grammar and clearly defined rules that render any document in the same standard way.

Importing a legacy format should involve translating its quirks into a standard representation. It should not force every document renderer to re-implement each of those quirks itself.

Comment Re:155 Forrester Clients (Score 1) 337

Office is such a complicated beast that no one can reverse engineer .doc / .docx perfectly. Those formats represent decades of continual minor changes and workarounds as the requirements changed. This isn't like supporting standards-compliant HTML5. Properly documenting the existing .doc format is futile, because the only complete specification is the Office source code itself.

The only way I can see to get out of this mess is to specify a common document interchange format, and force Microsoft to support it as their default file format.

Comment Re:routine IT work (Score 3, Interesting) 307

And then you add integration with all of the third-party legacy systems on the back end... That level of scalability testing is possible, but unlikely. It demands a test system that perfectly mirrors the production system. And I mean perfectly, down to the wiring, switches and routers. Then you have to model your users' behaviour accurately, which is difficult for a system that doesn't have any real users yet.

If you can't build such a test system, or your tests don't reflect the typical actions of your users, something could slip through the cracks. If your tests pass with 100,000 users on one server, that doesn't mean that production will work with 5 million users on 50 servers.

Comment Re:routine IT work (Score 1) 307

But even when designing for 5 million users, it's very difficult to test for that load without having 5 million actual users. There will inevitably be a bottleneck you didn't think of, and didn't or couldn't test for, that fails under real load. A bottleneck that can't simply be fixed with more servers.

Comment Re:routine IT work (Score 4, Informative) 307

Building a web site as a hobby is very different to the approach you need at massive scale. If you look at how the largest web sites scale, they all do things slightly differently, and they all had to fix their scalability problems incrementally as their popularity grew. Throwing more CPU and RAM at the problem may not fix anything if the current software design can't take advantage of it.

Comment Re:Wow. (Score 1) 999

I was mainly thinking of his research into government responses to a credit crunch. Though he has also talked about the events of 1937, when the US government believed the crisis was over and cut spending, leading to higher unemployment; that episode has more in common with the events of today.
