Comment Re:He also said... (Score 1) 24

Not only that - when they scrapped 20A in August 2024 and delivered Arrow Lake on a TSMC process, they said it was because 18A was going so well that they wanted to focus on it: https://newsroom.intel.com/opi... "One of the benefits of our early success on Intel 18A is that it enables us to shift engineering resources from Intel 20A earlier than expected as we near completion of our five-nodes-in-four-years plan. With this decision, the Arrow Lake processor family will be built primarily using external partners and packaged by Intel Foundry."

Comment Re:Hold Up There Sparky (Score 1) 109

Guess it was harder for a hacking culture to appear without the internet, and things were less standardized, so it would have been a tough job for a lone hacker - but of course "Woz"-level hackers would have been able to. Besides, the first teletext implementations were done without a single microcontroller/MCU - just a bunch of fixed-function ICs chained together: one for extracting the teletext stream, a shift register for latching in the 3-digit page number, a chip that could filter out the page matching that number and copy it to RAM, the character ROM, etc. Even features like hold and reveal-hidden were just input signals that made minor modifications to what these chips did.

Comment Re:Given (Score 4, Informative) 56

Cancer persistence is more complicated - the tumor may produce molecules that inhibit the immune response or that prevent presentation of the tumor antigens to the immune system. Also, not all genetic sites are presented (and thus available for inspection) even under normal circumstances (depending on the HLA/MHC genes, which are highly individual). In general it is much harder for the immune system to recognize - after all, these are the body's own cells, just with some mutations, in contrast with e.g. a bacterium, which is a whole different organism, or a virus, which installs completely new genes and produces new proteins. The immune system definitely does play a role in recognizing and eliminating cancer, but clearly it is imperfect.

Comment Re:Given (Score 4, Informative) 56

It is probably controlled by the same general regulation mechanisms that ensure self-tolerance. There's central tolerance, ensuring that immune cells that attack the self are killed. For T-cells it happens in the thymus (thymic education, if you want to look it up), where newly produced T-cells are tested for reactions to self material. There are also other checks and balances, such as T-cells getting killed if they react too quickly after having been released from the thymus (suggesting they are more likely reacting to something in your own body than to an incoming disease agent). It is plausible these mechanisms also ensure tolerance of these cells from the mother.

Comment Re: What could possibly go wrong? (Score 1) 272

Look at the post I am responding to, claiming they don't see how it is even possible the C++ version could be slower! I am giving reasons why it could be, which is enough to refute that statement. About this being a different implementation: sure, but if it weren't, there would be no point in discussing this at all. I was thinking of the question "why could programs written in C++ using its language features be slower than something written in C" from a practical perspective of how such implementations might look. If it is the same code in both cases, then of course C++ is not slower.

Comment Re:What could possibly go wrong? (Score 1) 272

A C++ implementation would often try to use C++ classes, and perhaps features like operator overloading, to get nice syntax. This can easily lead to code passing objects around by value, involving deep copies of those objects. The alternative would be to pass pointers to the objects around, but then you lose some of the benefit of objects cleaning themselves up as they go out of scope. In C the programmer might be more naturally inclined to pass pointers around to begin with, because there are no destructors anyway and less focus on RAII. Later iterations of C++, with move semantics in combination with the various smart pointers, make it possible to avoid a lot of that copying while retaining the benefits, but that is a relatively recent addition and not all C++ developers are familiar with these features.

Comment You will eventually need to understand it yourself (Score 2) 139

TL;DR: AI's huge initial productivity gains can create a false sense of competence that leaves you stranded when you hit problems beyond its reach. I consider myself an experienced developer and AI user. I recently got the task of getting an old commercial software product to work. It had not been maintained, was not compatible with new OS versions and packages, and the huge installation guide with tons of manual steps was probably never fully correct. I also didn't know many of the underlying packages and application servers well, which normally wouldn't matter for this kind of 3rd-party product - until it did. Anyway, I decided the right approach would be to sanitize it all and dockerize it, and since I haven't worked much with Docker I turned to AI for help. It quickly found answers and solutions to lots of things that came up along the way. I could basically give it the whole installation guide and have it dockerize many steps in one shot. It even helped with some of the 'application-level' problems that started to emerge, and I became more and more ambitious in what I let the AI do. Huge success.

This was until I reached a problem concerning one of the embedded application servers that none of the AIs could crack. I tried everything - new context windows, new AIs, providing all conceivably useful context (stack traces, even decompiled files) and symptoms based on my own hunches and debugging skills - but nothing helped. I also let the AI suggest what further info might be needed. Still nothing. This problem was completely out of reach for the AIs, even with my debugging help. And this was the worst part: because I had let the AI do so much and not researched it myself, I kept trying to get the AI to fix it, wasting a lot of time because it ultimately couldn't. I was also no longer in a position to debug it myself - I didn't fully know what I was doing; I was missing steps on the knowledge ladder. Eventually I had to throw in the towel, retrace everything the AI had done - why it was done that way, how and why it worked - properly research everything, and THEN deep-debug it.

This is not a criticism of AIs as such - it is unreasonable to expect a quick fix to everything. It is just to show how some of the initial time savings (which were indeed huge) can be eaten away when the quick fixes stop working.

Comment Re:Wow! (Score 1) 201

Millennials is a very broad term, but the youngest millennials are now 25 and the oldest 45. I'm at the far end of that range, and here tattoos weren't very common. Of course some people had them, but not that many, and often in somewhat discreet locations, e.g. the upper arms, hidden with clothes on. It seems to me it is the younger generation - those who are actually young now - who have really picked up on it, often with extensive tattoos covering the entire forearm, and this being an entirely mainstream thing. It seemed to start maybe 5-6 years ago, so I guess some of the last millennials were part of it too, but it doesn't seem to be a millennial thing - more a gen-Z thing.

Comment Re:Can one recharge them? (Score 1) 79

The rewrite happens automatically in the background while the disk is idle - while the algorithms are proprietary, I would guess all recent SSDs (at least TLC/QLC) do this. But note that it has not always been so - for instance, on the 840 EVO this was added in a firmware update when retention started to become a problem (it was one of the early TLC disks, so likely it had not been much of an issue before). What is not known is how long to keep the disk powered on: when will the disk start the process, how long does it take, can it resume if interrupted, etc. I've seen papers investigating the algorithms by looking at the power consumption of idle disks, thereby making inferences about what is going on. But it is fully proprietary, and as the 840 EVO shows, it can vary for the same model based on firmware.

Comment Re:Just power? (Score 1) 79

The mechanisms are proprietary, but generally it works as a background refresh - so probably mostly when idle, though it likely won't run continuously, as that would wear the disk out, and even just scanning would cause constant high power usage. It would be nice to be able to monitor when a refresh starts, how far along it is, whether it can resume, etc.

Comment Re:Spare parts (Score 1) 79

Yeah, wondering if they are storing the firmware on a separate BIOS-like flash chip (retention typically quoted anywhere from 20 to 100 years) or on the actual main flash chip to avoid that cost. Even if the drive is empty and the firmware itself survives, the metadata (mapping tables etc.) could be corrupted, and then the drive won't work anyway. I'm not sure the drive will necessarily be able to reinitialize those metadata tables after corruption (since reinitializing them also reduces any hope of data recovery). But AFAIK the metadata is stored in SLC / simulated SLC, which should have far better longevity. It would be nice to have more insight into what goes on - how long you need to keep it plugged in to ensure a full refresh of everything, etc.

Comment I actually understand him (Score 1) 211

I’m not being ironic. I'm genuinely surprised people aren't more impressed, even reading the negative sentiment here. Yes, Microsoft’s marketing is annoying, and yes, the models hallucinate. Yes, many products may be crap. Yes, maybe there's a bubble. But looking at the raw capability (which the exec refers to, and which is also dismissed in many comments here), it is mind-blowing. If I hand the AI a proprietary piece of code with a subtle race condition and it finds the bug in seconds, many may dismiss it: "Oh, it probably just saw a similar pattern in its training data." So what? Even just being able to do this degree of very abstract and fuzzy comparison against a database of bugs is in a completely different league from whatever static analysis tools we had before. And I think the capability of generating new code based on a long list of requirements from many layers of a domain precludes it being just 'autocorrect', if only through the sheer combinatorial explosion - or at least it is autocorrect in such a fuzzy sense that anything could be called autocorrect or a parrot by that measure. It seems to be mostly a comparison used as a coping mechanism. The exec is right about the scale of the leap. We went from "Snake on a Nokia" to a machine that can explain a complex regex or debug a kernel dump in conversational English. The fact that it isn't perfect doesn't mean it isn't a marvel of engineering.

Comment Re:Only makes sense if... (Score 2) 128

Yes, it still gives redundancy, for at least two reasons. 1) Even on the Barr body, some genes are expressed (XCI escape genes). 2) It is not the same X chromosome that is inactivated in every cell: selection of the X chromosome to be inactivated happens early, but multiple cells still make the choice independently, so the body effectively consists of a mix of cells, with one X active in some cells and the other in others. This can be enough to overcome some genetic disorders, like haemophilia caused by an inherited mutation in an X-chromosome gene: as long as (roughly) half the cells in the liver have the good version active, that is enough to sustain normal blood clotting etc.
