
Comment This will get standardised. (Score 1) 25

What will happen at some point is that sites that require age verification will demand some kind of verifiable token generated by OS-level age verification, rather than the myriad of proliferating independent age-verification schemes. But it's a legislative ratchet that is unlikely to move in the opposite direction. If you don't want OS-level age verification, then you'll likely be confined to the part of the internet that doesn't require age verification.

Comment Re: Normal (Score 1) 124

Still, you got it wrong, because the curve around 100 is flat, and given that IQ is rounded to a whole number, a significant part of the population has an IQ of exactly 100 (i.e., 99.5 to 100.5). That's what the curve-vs.-triangle comparison was aimed at. Add the fact that individual results can vary a lot, depending on the exact test series and the current state of mind of the person tested, and results between 95 and 105 are well within the IQ-100 group.
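The point above can be put in numbers. Assuming the conventional scaling of IQ as a normal distribution with mean 100 and standard deviation 15 (the function below and its name are illustrative, not from the comment):

```python
from math import erf, sqrt

def share_in_band(lo, hi, mean=100.0, sd=15.0):
    """Fraction of a normal(mean, sd) population scoring in [lo, hi]."""
    def cdf(x):
        return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))
    return cdf(hi) - cdf(lo)

# Scores that round to exactly 100 (the flat top of the curve):
print(f"{share_in_band(99.5, 100.5):.2%}")  # roughly 2.7% of the population

# The wider 95-105 band mentioned above:
print(f"{share_in_band(95.0, 105.0):.2%}")  # roughly 26% of the population
```

So even the single rounded value 100 covers a few percent of everyone tested, precisely because the density is nearly flat at the peak.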

Comment Re:Good luck. (Score 1) 58

There was an initial large disruption as they dumped a huge number of packages into alternate delivery systems that weren't prepared for the sudden massive increase in load. Within a few weeks, it had settled down, and shipping times had improved enough that same-day and next-day shipping were once again available, albeit with shorter "order by" windows. The quality of the delivery experience has dropped significantly (in terms of failed/late deliveries) due to them relying exclusively on "Intelcom" (a gig delivery service) rather than Amazon's own delivery system.

My understanding of how it works, at least for Montreal (which used to have multiple Amazon warehouses in the metro area), is that all orders are shipped from the Toronto area, a ~6 hour drive away. Amazon loads orders onto big Amazon trucks (semi trailers) and drives them to an Intelcom distribution centre in Montreal, and Intelcom handles the last-mile delivery. Intelcom doesn't do inter-city delivery, and Amazon doesn't have any infrastructure in Montreal (or Quebec more broadly).

As for why Amazon services Montreal's orders from Toronto (a ~6 hour drive away) instead of Ottawa (a ~2 hour drive away), my only guess is that Ottawa (1.5m metro pop) wasn't big enough to absorb all of Montreal's (4.3m metro pop) demand, but Toronto (6.2m) was.

Comment Re: 4GB has been insufficient for many years now (Score 2) 68

I have not seen AI code that is *more* efficient than human code, yet. I have seen AI write efficient, compact code when pressed, very, very hard to do so, but only then. Otherwise, in my hands, and those of my developer colleagues, AI produces mostly correct, but inefficient, verbose code.

Could that change? Sure, I suppose. But right now it is not the case, and the value system that is driving auto-generated code (i.e., the training set of extant code), does not put a premium on efficiency.

Comment Re:4GB has been insufficient for many years now (Score 5, Informative) 68

Web browsers are absolute hogs, and, in part, that's because web sites are absolute hogs. Web sites are now full-blown applications that were written without regard to memory footprint or efficiency. I blame the developers who write their code on lovely, large, powerful machines (because devs should get good tools, I get that) but then don't suffer the pain of running it on perfectly good 8 GB laptops that *were* top-of-the-line 10 years ago but are now on eBay for $100. MS Teams is a perfect example of this. What a steaming pile of crap. My favored laptop is said machine, favored because of the combination of ultra-light weight and eminently portable size, and Zoom works just fine on it, but Teams is unusable. Slack is OK, if that's nearly the only web site you're visiting. Eight frelling GB to run a glorified chat room.

The thing that gets my goat, however, is that the laptop I used in the late 1990s was about the same form factor as this one, had 64 MB (yes, MB) of main memory, and booted up Linux back then just about as fast. If memory serves, the system took about 2 MB, once up. The CPU clock on that machine was in the 100 MHz range. Even without counting the massive architectural improvements, my 2010s-era laptop should boot an order of magnitude faster. It does not.

Why? Because a long time ago, it became OK to include vast numbers of libraries because programmers were too lazy to implement something on their own, so you got 4, 5, 6 or more layers of abstraction, as each library recursively calls packages only slightly lower-level to achieve its goals. I fear that with AI coding, it will only get worse.

And don't get me started on the massive performance regression that so-called modern languages represent, even when compiled. Hell in a handbasket? Yes. Because CPU cycles are stupidly cheap now, and we don't have to work hard to eke out every bit of performance, so we don't bother.

Comment Re:Intel's political marketing has always been bad (Score 3, Insightful) 22

If you read this post, it shows that AMD stole Intel's design and reverse engineered it.

If you dig deeper, you'll find that AMD originally reverse engineered the *8080*, not the 8086. The two companies had entered into a cross-licensing agreement by 1976. Intel agreed to let AMD second-source the 8086 in order to secure the PC deal with IBM, who insisted on having a second source vendor.

There would have been no Intel success story without AMD to back them up.

(That actually would have been for the best. IBM would probably have selected a non-segmented CPU from somebody else instead of Intel's kludge.)

Comment Re: New religion (Score 1) 124

That's not an independent thinker. That's someone who routinely doubts everything. But as Henri Poincaré already observed more than 100 years ago: to doubt everything and to believe everything are two equally convenient strategies, both of which relieve us of the necessity of thinking or reflection. (And I know, a witty saying proves nothing.)

Comment Re:Renewables rock (Score 2) 97

It's even more complicated. German law treats the grid as "copper plate", and ignores all regional differences. If a wind park in Northern Germany offers electricity for 8 ct/kWh, then a consumer in Southern Germany is allowed to buy that power and is entitled to get it delivered via the grid. And if the grid can't handle the load because of weak interconnectivity, then a gas turbine in Southern Germany will start and generate the power for 18 ct/kWh, but the consumer only pays 8 ct. The 10 ct/kWh difference is paid by all consumers with higher energy prices.

For Southern Germany, this is quite the deal: they can now operate expensive gas turbines and get them subsidized, at least in part, by electricity consumers in Northern Germany paying higher energy prices, while the cheap energy generated in Northern Germany is switched off, since the energy is already sold on the books but the electricity is generated somewhere else. And because the Southern states profit from the situation right now, there is much resistance to changing the law, which would make energy in the South more expensive while giving the Northern states relief.
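The cost split described above can be sketched as a toy calculation. The prices are the ones from the comment (8 ct/kWh wind bid, 18 ct/kWh gas redispatch); the function and its name are illustrative, not an actual market mechanism implementation:

```python
def redispatch_cost(bid_ct_per_kwh, turbine_ct_per_kwh, kwh):
    """Cost split under a 'copper plate' market with congestion redispatch.

    The buyer pays the bid price of the wind park; the gap to the actual
    generation cost of the gas turbine that really produced the power is
    socialized across all grid users via network fees.
    """
    consumer_pays = bid_ct_per_kwh * kwh
    actual_cost = turbine_ct_per_kwh * kwh
    socialized = actual_cost - consumer_pays
    return consumer_pays, socialized

# A 1,000 kWh delivery at the prices from the comment:
paid, spread = redispatch_cost(8, 18, 1000)
print(paid, spread)  # 8000 ct paid by the buyer, 10000 ct socialized
```

The buyer sees only the 8 ct price signal, so nothing in the market discourages contracting power that the grid cannot actually move south.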
