Comment Re: There is already a safe subset of C++ (Score 1) 82

I particularly hate the elitism of the "perfect programmer" crowd. Most of us are out there writing shitty CRUD apps in the language of the day.

And the thing we always forget when we get comments like that is that, most of the time, issues with bad code and layers upon layers of crap come from the business requirements.

Software development is often criticized for its lack of formality. But no one ever seriously starts building a house planning to "refactor" it into a skyscraper later, whereas in software development we're taught to focus on the house only, and if the time to build a skyscraper ever comes, we'll see. YAGNI, KISS, etc.

And at the same time, if we do build with an eventual skyscraper in mind, we end up with stupid, useless shit like "hexagonal architecture" and 32 layers of indirection, navigating a sea of interfaces and making the developer experience terrible. (I recently had to add a single property to an API built by a "new architect". It took me 2 hours to figure out and around 8 files changed, not including tests, because the project had no tests.)

Comment Re:Money not Everything (Score 1) 229

Also, btw, your story here is irrelevant. You come from a high-income country anyway.

Indians going to the USA aren't making money like you are in Europe, and Europe isn't as welcoming as the USA is to immigrants.

For a short while I worked for a German company, and part of the team was in India because German bureaucracy had kicked them out of the country and they had to stay out while the company sorted their papers. There's a popular YouTube channel by a Vietnamese girl living in Germany where she tells how tough she had it there even after graduating, and the stress she went through every time she had to renew her visa.

People on an H1B in the USA don't really worry about that. Their company does the paperwork for them, and they're not at the whim of a bureaucrat the way they would be in Europe.

I don't want to be aggressive, but I mean, you're really a case of "check your privilege".

Comment Re:Money not Everything (Score 1) 229

Every coworker I've talked to (Europe-based software developers, all 30+) agreed that if they could change one thing in their careers, it would be to have tried to land a job in the USA in their 20s, when they had energy to burn and FAANG base salaries were $150K+ a year; then move back to Europe before burning out, but with six figures in the bank, and get a decent job with the added bonus of "former FAANG" on their resume.

Comment Re:This should stop the abuse of H1-B (Score 1) 229

Because even if you're deported, you've already made a TON of money, especially if you come from India.

A PhD in America can easily get $100K+ a year. In India? Good fucking luck.

No other country on earth has that level of salaries. So yes, it's a very safe bet. If you do well, you stay in America and become a high earner. If you're unlucky, you get deported having already made more money than you would have if you'd stayed in your home country.

Comment Re:collect IP (Score 1) 57

It's like the company I work for. Usage of AI is extremely limited (enforced via a CYA training session): do not ever upload any company code or files to the AI.

Also, the company hosts everything externally on GitHub, Gmail, and Google Drive.

Apparently the lawyers believe Google is only training its AI on the prompts you type into Gemini and isn't scanning Google Drive and Gmail for everything (which we know they are doing, because they announced "agents" for Drive files, and Gmail is full of "agents" too).

Comment Re:One non-inconsistent observation != PROOF (Score 1) 40

> "Proves" might be too strong

Different fields have different standards of proof. The most rigorous I'm aware of is mathematics, wherein a conjecture that almost all the experts think must surely turn out to be true can be heavily studied and yet remain "unproven" for an arbitrarily large number of centuries, until eventually someone finds an actual real-world use case for the math you get if it's NOT true. (The poster child for this is non-Euclidean geometry, but there are lots of other examples.)

There's an old joke about three university professors from England taking a trip up north together. On their way out of the train station, the journalism professor looked over at some livestock grazing on a hill and said, "Oh, look, the sheep in Scotland are black!" The biology professor corrected him: "Some of the sheep in Scotland are black." But the math professor said, "There exist at least three sheep in Scotland, and at least three of them appear to be black on at least one side, at least some of the time."
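
In case anyone wants the mathematician's version spelled out, it renders into first-order logic roughly like this (the set and predicate names are my own invention, nothing standard):

    \exists\, s_1, s_2, s_3 \in \mathrm{ScottishSheep} :\;
        s_1 \neq s_2 \;\wedge\; s_1 \neq s_3 \;\wedge\; s_2 \neq s_3
        \;\wedge\; \forall i \in \{1,2,3\}\; \exists\, t :\; \mathrm{AppearsBlackOnOneSide}(s_i, t)

Each sheep is only asserted to look black on at least one side, at some moment t, which is the whole joke.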

Comment Re:Hurry up already (Score 1) 243

Sorry, no, that isn't the issue either. The problem the OP is running into is much, much more basic than that.

Forget, for a moment, that the ports are USB ports and that the peripherals are USB peripherals, because as long as they match up (which they do, in the OP's scenario), none of that is the problem. The number of ports doesn't even matter; we can abstract away the 4 (or 2 + 2, same difference) and just call it N. The problem is that he's got N ports and N peripherals that he wants to keep plugged in all the time, which leaves N - N ports available for anything else he needs to plug in temporarily. But N - N is 0, so something has to be unplugged to free one up. That's a number-of-ports problem, entirely irrespective of the port type.
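
If it helps, here's the same arithmetic as a toy Python snippet (the numbers are obviously made up; any machine with as many permanent peripherals as ports comes out the same):

    # N ports, all N occupied by peripherals that never get unplugged
    total_ports = 4          # the OP's N
    permanent_devices = 4    # also N
    free_ports = total_ports - permanent_devices  # N - N
    print(free_ports)        # 0 -- nothing left for a temporary device,
                             # regardless of what type the ports are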

If you were proposing replacing the 2 USB-A ports with a *larger* number of USB-C ports, then your argument might have some relevance. But just changing the type of the ports won't bend the arithmetic in any useful direction. They could be upgraded to the new USB type K ports introduced in 2042, and it still wouldn't solve the problem: if there are still four ports and four all-the-time peripherals, there still won't be any unoccupied ports for temporarily plugging in transitory things.

At least USB is (mostly) hot-pluggable. But, again, that's as true of A as it is of C.

Comment Re:Hurry up already (Score 2) 243

I mean, some delays are good. An example is when Apple went USB-C-only for everything on MacBook Pros, but then in the newer generations added HDMI back, along with an SD card reader.

Which is very welcome. Adding HDMI to a $2k machine costs nothing and can save your ass. If you're a speaker at a conference and need to connect your laptop, it's probably gonna be HDMI, certainly not USB-C.

The SD card reader is also welcomed by photographers and videographers, who make heavy use of SD cards.

Those are two very valid use cases for people who actually own Macs, and they aren't a compromise for other users (adding HDMI or SD doesn't take anything away).

USB-C promised a lot, and it hasn't materialized. I wanted to jump on USB-C at the very beginning, about 8 years ago, and it's only gotten messier since. A given USB-C port may or may not support PD, Thunderbolt, DisplayPort, etc. Apple does it mostly right, with "all ports supporting everything"; PC manufacturers are hit and miss, and it seems to be mostly a laptop thing. I'm not sure about the current state of desktop motherboards, but last time I checked they didn't really support Thunderbolt or DisplayPort and only included a single USB-C connector.

If you're a laptop user, many monitors now offer USB-C with Power Delivery to charge your laptop, DisplayPort to carry the video, and a USB data channel over the same cable. You basically use your monitor as a docking station with a single, thin cable. It's very neat. But AFAIK you can't do this with a desktop computer.

Comment Re:SAT Sucks (Score 1) 115

This has probably changed over time. My impression when I was taking college entrance tests was that the ACT tested what you knew (i.e., memorized facts), while the SAT tested _how you think_ (i.e., how good you were at figuring things out). But that was in the early nineties, and not much later they changed some things that, among other things, resulted in more students getting higher scores, which I think was the goal all along. They had a lengthy explanation about keeping the test relevant to the changing expectations of modern institutions of higher learning, but reading between the lines, it seemed like the main outcome was giving out higher scores.

Comment Re:Windows 11 runs in 4gb of RAM (Score 1) 62

Eh. Win11 with 8 GB of RAM might work if you never connect it to the internet, or find some other way to block Windows Updates from ever happening. (Maybe once it goes EOL, they will stop issuing updates?)

But every time it starts downloading Windows Updates, it's going to try to store half the internet in system memory, and the platform's horrible virtual memory system is going to swap out exactly the page that's needed next, every single time. A download that *should* take a few minutes ends up taking more than a week, during which you can't use the computer for anything else because it's constantly swapping like it's 1996 all over again; and by the time the whole update is downloaded and installed, Microsoft will have released another one. Rinse, repeat.

4 GB of RAM, I can only assume, would be worse, if that's even possible.

16 GB is mostly usable for basic computer tasks like browsing the web. Mostly. But it's not great. I consider 32 GB to be the practical minimum if you want anything resembling decent performance on Windows 8/10/11.

Linux systems can run on 8 GB of RAM. Heck, depending on what you're doing, 1 GB of RAM will do ok, though that's not going to give you much of a desktop environment. It's fine for a lot of headless roles, though.

Comment Re:What about 32-bit Raspberry Pi? (Score 1) 40

I don't think i386 builds would be usable on an ARM system anyway.

At worst, you can always just compile your own. Granted, the Mozilla codebase is (last I checked, which admittedly was several years ago) significantly more of a pain to compile than the average open-source project, but it's not _prohibitively_ difficult. You do have to read a few lines of documentation and maybe edit a small config file, but there's nothing really tricky about it. Frankly, it's easier than installing most third-party binaries. Also, if you're using a distro that's made for your hardware, like Raspbian or whatever, it'll probably just have a package in the repo.

Comment Re:Old! (Score 1) 40

Yeah, I'm surprised and a bit disturbed that this didn't happen long ago. Linux distros pretty consistently started compiling everything for amd64 pretty much as soon as users had the hardware for it. There was no downside, because all of the software everyone was using had source code readily available and could be compiled for the new system. It all has to be recompiled anyway every time a major library (such as libc) gets any really substantial update. When you *have* the source code for everything, everyone *knows* you have the source code for everything, your system ships with a full working build chain out of the box, and compiling software from source is an extremely *normal* thing to do (to the point where people who aren't developers, and couldn't read any of the source code, have no trouble building it), it turns out that hardly anybody bothers to maintain long-term ABI compatibility, because there's no compelling reason to do so. You (the package maintainer or whoever) can just do a fresh compile every time anything gets updated; you were almost certainly going to do that anyway. Even if the update only changes the documentation, you just build the thing, because it's significantly easier to build it every time than to figure out whether you actually need to.

Certain other systems, which I don't need to name, took a decade or more (after folks had amd64 hardware) to reach widespread deployment of a 64-bit OS, and are *still* routinely running 32-bit applications in 2025; but that is for reasons that have never been relevant to Linux users. A lot of the users of such systems would probably find the above paragraph about as baffling as the WWII Japanese naval commanders who found out the Americans had entire ships dedicated to making ice cream. "They can compile software so easily that they do it even when they don't need to? They've already won the war; we just hadn't realized it yet."
