
Comment Re: "It might be tempting to blame technology... (Score 3, Insightful) 105

Yeah, there was never an irresponsible young person before this generation.

Way to completely miss the point...

It's not that this never happened, or even that it was some extremely exceptional case in the millennial/GenX/boomer eras...but the context was very different.

For starters, I think that there are two sides that both have valid points and problematic extremes. In the days of yore, companies generally saw employees as an investment. It wasn't uncommon for people to work at the same company for 25 years or more. Certainly not every company, but the majority of them accepted the tradeoff that new employees would require a good amount of training before the company broke even on their paycheck; in return, the employee who stayed would earn the company a tidy profit over time while receiving enough money to support a family. Today, it seems that everyone wants to 'skip to the end' - companies want to hire savants with 20 years of experience in Debian Trixie, make them always-on-call, pay them less than half of a living wage, and keep a fear of layoffs at the front of everyone's mind.

The counterbalance is that Gen-Z knows this is the case, and treats employers with the same sense of expendability. If a company sees an employee as someone not worth investing in, with little ability to get promoted, and with both promotions and layoffs decoupled from job performance or dedication, it's not unreasonable to adopt the mindset of "if they're doing what's best for them, then I'll do what's best for me"...and it's only a few more steps from there to "I'll work when I feel like it", demonstrated by a willingness to no-call/no-show, doomscrolling at work, or needing to be 'helicopter managed'.

This leads us to the downward spiral - with both sides leaving themselves vulnerable to exploitation if loyalty and dedication are expressed, it is only natural to reduce that vulnerability moving forward. The result, however, is both sides getting worse...it's a Prisoner's Dilemma, and neither side has incentive to break the cycle.

Comment Re:What's the difference between tablet and phone? (Score 3, Interesting) 122

Back when the iPhone was introduced I was convinced that within 10 years computing would be mostly done this way; connecting your portable computer (smart phone) to a dock that turned it into your home computer. I'm surprised that this idea never gained traction.

I think there have been a few reasons for this.

I think the biggest one is that nobody could meaningfully agree on a form factor. Now, *I* always thought that a great option would be to have a 'zombie laptop' that had a keyboard, trackpad, webcam, and a battery, with a slot to slide your phone into. The phone would connect to the peripherals and give a 12" screen and a keyboard, while charging the phone in the process.

The devil, of course, was in the details. Even if Apple made such a device and molded it to the iPhone, the problem then became that a user couldn't put their phone in a case, or it wouldn't fit in the clamshell's phone slot. There would also need to be adapters to fit the different sized phones, or different SKUs entirely with per-device slots, which then also pigeonholes Apple into a particular physical form factor. That begets the "use a C-to-C cable" option, which is better, but makes it ergonomically annoying to use if one isn't sitting at a desk. A wireless option solves both of these problems, but kills both batteries in the process. Finally, there's the price point: the cost for the end user would need to be low enough that it doesn't just make sense to have two devices, AND the first-gen owners would likely feel some kind of way if they were stuck with their old phone because it meant buying a new clamshell. It works well on paper, but pretty much any real-world testing would show the shortcomings pretty quickly.

Suppose that was solved somehow...while the Samsung Fold phones are helping justify the time spent adding a multi-window interface to Android, try installing Android-x86 in a VM for a bit and watch what happens. It's been a while since I tried, but the experience was pretty bad - the inability to open e-mails in new windows was particularly infuriating; many apps take exception to having multiple concurrent instances for side-by-side usage, and window focus gets pretty tricky to navigate. It *can* be done, but it ultimately felt like all-compromise, no-improvement.

Finally, there *is* such a thing, at least to an extent. Many, MANY apps are just frontends on a website. iCloud is like this, the whole Google ecosystem is like this, Salesforce is like this...for a solid number of apps, there is a browser-based frontend that works just as well, if not better in at least some cases. Data is commonly synced with Google or iCloud or Dropbox. The number of apps that are worth running on a phone, without a desktop or browser analogue, that would justify a user getting a clamshell to run that app in a larger window...is small enough that it is seldom worth dealing with all of the *other* compromises involved.

Comment Wow... (Score 5, Informative) 89

they are constraining what you can do using the software they provide with said hardware

It has been a VERY long time since I've seen such a textbook definition of the phrase "a distinction without a difference".

On an Intel x86 PC, even the most locked-down iterations of Windows give users a means of running whatever code they want. If the user doesn't want to run Windows at all, they can download an ISO of Ubuntu or Fedora or Proxmox or VMware or GhostBSD or Haiku, make a menu change in the BIOS, and install those OSes instead. Done and done. Windows can be replaced in 30 minutes or less if a user wants to, with nothing but GUI tools and YouTube tutorials that are almost universally accurate (admittedly with slight variations on where to disable Secure Boot in the BIOS).

On an Android phone, one must unlock the bootloader (which some phones prevent through artificial constraints), then hope that some Good Samaritan has made a different OS for it...and then go through 101 steps involving CLIs, recovery environments, and ADB interfaces...AND those steps and software downloads vary with each model of phone, AND Google gives app developers a means of telling users "sorry, I won't run on a phone you have control over", AND that assumes that a replacement OS is available in the first place...otherwise, the user needs to replace the phone, or go all the way to doing their own compiling of AOSP, which is its own rabbit hole.

So yeah, the argument rings incredibly hollow: "we're not constraining what the hardware can do...but we ARE constraining what the software can do AND constraining your ability to replace that software if you so choose." If the argument is that the constraints are purely related to software, then Google needs to put way more effort into streamlining users' ability to replace or opt out of whatever software those constraints are implemented to protect. If they aren't going to do that, then they are being disingenuous.

If, in a court of law, they cannot produce documentation regarding the means by which the hardware can be used to run unapproved code, then I would deem them guilty of perjury for making this statement under the current climate.

Comment Re:Don't get it (Score 1) 155

You know why I don't drink alcohol? In part, because of the high cost. Why the fuck would I pay $5-$10 for a small glass of liquid?

$5-$10?! What a bargain! I'd probably order a mixed drink with my meal if it was still only $10.

Most of the restaurants I've been to in the past year have bumped the cost of cocktails to $15-$20 for standard stuff. I might accept it for drinks that have several different ingredients, but even a rum-and-coke goes for $15 around me - and I'm not in NYC or LA. One brunch spot I went to had a mimosa flight that was champagne and four different fruit juices, each in MAYBE 6 oz. glasses, and they wanted $40 for it.

Not to be outdone, the last time I *did* visit NYC, I went to a restaurant that made mocktails...and while my blueberry mint lemonade was indeed delicious, it was NOT worth the $10 my friend paid for it.

So yeah, I definitely share your sentiment - a nontrivial portion of the reason people aren't drinking alcohol at restaurants is the pretty significant cost of doing so...and I'm pretty sure that those higher-priced drinks *also* have less alcohol in them than they did ten years ago. Given this, it makes way more sense to get a 1L bottle for $20-$50 that I can use to make my own drinks for a month...and as much as restaurants have always used bar drinks as a source of high-margin revenue, it's not really justifiable anymore to spend that kind of money on a single drink.
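For what it's worth, the home-bar comparison is easy to put numbers on. This is a rough sketch: the 45 ml standard pour and the per-bottle math are my assumptions for illustration, not figures from the thread.

```python
# Rough per-drink cost of mixing at home vs. a $15 bar cocktail.
# Assumptions (illustrative): 1 L bottle, 45 ml standard pour.

BOTTLE_ML = 1000
POUR_ML = 45

pours_per_bottle = BOTTLE_ML // POUR_ML  # ~22 pours per bottle

for bottle_price in (20, 50):
    cost_per_drink = bottle_price / pours_per_bottle
    print(f"${bottle_price} bottle -> ${cost_per_drink:.2f} per drink")
```

Even at the $50 end of the range, a home pour works out to a couple of dollars - an order of magnitude below current cocktail menus.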

Comment Re:It's a weird Puritan Christian thing (Score 1) 175

It's more likely just based on a really piss-poor understanding of STIs in biblical times. People understood that if you did a bunch of sleeping around, you'd likely fall ill. They didn't understand what caused it, so it was just "punishment from God".

I'm not sure I entirely buy this. Even as recently as the 1960s, the incidence of STIs was somewhere around 1:35, and that figure includes all of the STIs that made the jump from animals to humans in the centuries after the time of Christ (or Paul, who was more outspoken about it, or Moses, if you're going that far back). Yes, STIs undoubtedly happened, but with a less-than-3% chance of getting one, a person had to either be "well traveled" or extremely unfortunate to get an STI in that era. We also have to discount the asymptomatic STIs; this line of reasoning might hold water for STIs that leave visible scarring in genital regions, but not all STIs fit that description. The clearest symptom of chlamydia is difficulty in conceiving and giving birth, but there were many, many virgins-on-their-wedding-night who had trouble giving birth throughout history.
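A quick sketch of the compounding math behind the "well traveled" point, using the ~1:35 prevalence figure above and assuming - purely for illustration - that each partner is an independent draw from the general population:

```python
# Chance of encountering at least one infected partner, given a population
# prevalence of 1 in 35 and n independent partners.

prevalence = 1 / 35

def p_at_least_one(n):
    """Probability that at least one of n partners carries an STI."""
    return 1 - (1 - prevalence) ** n

for n in (1, 5, 10):
    print(f"{n:2d} partner(s): {p_at_least_one(n):.1%}")
```

One partner puts the risk just under 3%; even ten partners only gets it to roughly a quarter - consistent with the point that a single-partner person in that era was unlikely to encounter an STI at all.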

I think there were other, more practical reasons for this system. For starters, a faithful, single-partner wife would ensure paternity in a time prior to DNA testing. In addition, a virgin woman was deemed more desirable by men and was able to attract more desirable suitors. Promiscuity after marriage was a poor social reflection on the husband as well.

Even though we now understand what actually causes sexually transmitted diseases and ways to reduce the likelihood of their spread, some people still cling to the whole "God doesn't like it" thing for the usual reasons that people still believe in aspects of religion which don't stand up to logical scrutiny.

Well, the Mosaic law managed to outlast the Assyrians, the Babylonians, the Persians, the ancient Egyptians, the ancient Greeks, the Romans, and the dozens of "ites" listed in the Old Testament narrative. I'm not saying that presenting the Pentateuch to Congress to put it into law in 2025 to be enforced with police and military is something that would benefit modern society, but I *am* saying that its staying power reflects some sort of a societal benefit. I submit for your consideration that even if one disagrees with the mandates present in the Mosaic law, it might be overly reductive to assume that the rules came into existence through a concern for relatively-rare STIs.

Comment Re:If it were like it was back in the good old day (Score 1) 66

By contrast, a $70 AAA title is the equivalent of spending $35 or less when most of us were kids and AAA games were, bare minimum, $50 (many SNES games were $60-$80 for bigger titles).

Video game prices are not gouging anyone right now.

And if games were an $80, one-time purchase, nothing-more-to-buy, multiplayer-over-TCP/IP-forever investment, yes, you're right. I have no problem paying even $100 for such a game.

Except most of them are not. There are a handful of exceptions (Elden Ring, Baldur's Gate 3, and so on), but the majority of games are $60-$80 for the standard edition and $100 or more for the deluxe edition, and then there are the season passes, battle passes, in-game purchases (they're not 'micro' anymore...), lootboxes, multiple in-game currencies, and the Fallout 76 mechanic of "pay for your purchased items to not lose the stats you bought them with"...oh, and all of this only lasts for as long as the company keeps the servers up - even though one can technically play FIFA 21 in offline mode, one is stuck with their current roster and cannot unlock additional players through gameplay.

So no, $70 isn't a problem for a complete video game, like what was being sold in the SNES era. FIFA 08 was the last release of the game to include all of the players in the box. The estimate to unlock every player in FIFA 25 is 100 million in-game coins, which cost about $50,000.

So yes, video game prices ARE gouging players right now...we've just somehow accepted that $80 for an incomplete game is the same as $60 for a complete game, as it was before video games became casinos on the internet.
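Both sets of numbers in this thread are easy to sanity-check. Note the assumptions: the CPI multiplier below is my rough figure (US consumer prices roughly doubled between the mid-1990s and the mid-2020s), and the FIFA coin numbers are the ones quoted above, not independently verified.

```python
# 1) Inflation check: what is $70 today in mid-1990s dollars?
CPI_MULTIPLIER = 2.0  # assumed ~2x cumulative US inflation since ~1995 (rough)
print(f"$70 today ~= ${70 / CPI_MULTIPLIER:.0f} in 1995 dollars")

# 2) The quoted FIFA figures: implied price per in-game coin.
coins_needed = 100_000_000  # quoted estimate to unlock every player
dollar_cost = 50_000        # quoted cost of buying that many coins
print(f"~${dollar_cost / coins_needed:.4f} per coin")
```

Under that assumed multiplier, $70 today does land near the "$35 or less" in 1995 dollars that the parent claims - which is exactly why the dispute above is about completeness, not sticker price.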

Comment Re:As an IT expert I am .... (Score 1) 132

... and always have been completely bedazzled on why MS Word even has a business case. How this piece of software could gain the market let alone survive to this very day is a mystery to me.

Because Corel doesn't aggressively market the fact that WordPerfect 1.) still exists, 2.) is less expensive, 3.) is much faster and more stable, because 4.) it's not sold as SaaS, and 5.) it can open and save Word documents natively.

Unfortunately, even if they did, there are too few people who perceive WordPerfect as a "big name" anymore; nobody wants to be the first to shift away from Office or Google Docs and be the office that everyone hates sharing documents with, so it's a classic case of "everyone uses it because everyone uses it".

Comment And the PowerPDF Migration Continues... (Score 1) 69

Been moving PLENTY of my clients over to PowerPDF from Acrobat. $179 one-time, no AI garbage, no half-dozen services sending notification nags, and really the only function that's keeping anyone on Acrobat is the send-and-track functionality, which is admittedly a bit more polished than PowerPDF's analogue.

Seriously, Adobe as a whole is coasting on inertia at this point; nearly everything in their portfolio has viable replacements in one form or another.

Comment Re:I'm surprised (Score 1) 41

I'd probably go for a well-built Dell instead. Looks like their competition must be doing even worse if they're still selling.

The most recent crop of Latitude laptops has gone to hell. They used to be solid, boring laptops that were "everything you need, nothing you don't"...but they're doing all the MacBook crap now - soldered storage, nonreplaceable batteries, going for the svelte look that prevents decent cooling so the CPUs are clocked down, keyboards with so little travel that they're not all that great to type on...My company has been a Dell reseller for nearly 20 years, but we're getting clients E-Series and T-Series ThinkPads now because the Latitudes are picking up all the worst elements of the XPS line we used to avoid.

Comment Simpler Reason - Covid PCs Are Aging Out... (Score 3, Interesting) 41

People who bought laptops and PCs during the 2020 lockdowns - because they realized that trying to use their iPads for everything wasn't all it was cracked up to be - are realizing that those machines are reaching the end of their life cycle. With Copilot+ PCs from Lenovo being less expensive than their "non-AI" counterparts, combined with Windows 10 reaching its EOL in a few months, it's completely unsurprising that people are buying new computers and that they're picking inexpensive ones.

Comment Re:Digital Camera (Score 3, Interesting) 109

Kodak invented the digital camera, but its leadership feared it would cannibalize its film business so it killed it. The company would be in a different place if it had accepted the innovation, refined the digital camera and produced a product.

That's a bit oversimplified, because I don't think digital cameras killed Kodak - Instagram did.

Before the cell phone converged everything, we had cell phones, PDAs, MP3 players, and...digital still cameras. Canon, Nikon, Sony, Fuji, and Kodak were all on display at your local Circuit City and Fry's. While Canon had a bit of a vertical with their inkjet printers, Kodak actually WAS pretty innovative with their entire EasyShare platform - one could download photos to the PC and use the online EasyShare Gallery (an early Flickr with limited free space and paid tiers), or use the dock printer and pop out printed photos. It was a digital camera system so easy that grandparents the world over embraced it. While Canon and Nikon used their consumer cameras to on-ramp hobbyists to their SLR cameras, Kodak used their cameras to sell EasyShare memberships and dye-sub printer paper cartridges.

Instagram catapulted camera phones from being "the inferior camera that is used when my phone is the only thing I have on me", to "the default camera". Phones ended up getting similar sensors to the dedicated devices, and the ability to share photos via Facebook and Instagram and MMS meant that there was no need for ANYTHING akin to the EasyShare system. The cameras weren't necessary anymore because the phone was built-in, the EasyShare Gallery wasn't necessary because Instagram was free, and fewer and fewer photos were getting printed at all, because sharing was possible both immediately and irrespective of location.

I think Kodak could have pivoted more to being a chemical company like BASF and survived, but Instagram and services like it were the ultimate evolution that Kodak simply couldn't compete with any more than Polaroid could.

Comment Re:Flop (Score 1) 47

They released a Dragon Age game last year?

Wonder why it flopped...

That one upset me and made me nervous.

A huge issue with the Dragon Age game was that they spent a massive amount of development time trying to turn it into a 'live service' game, then pivoted to a classical single-player experience. Putting the sociopolitical messaging aside (and whether it was 'real' or 'perceived'), a development cycle that completely reverses that sort of underlying, fundamental paradigm is going to undo a massive amount of work...and EA still found it necessary to have the devs do a bunch of last-minute 'crunch' and 'ship now, patch later'...which meant that the early reviews reflected some of the rough edges, and then the sociopolitical messaging accusations at the height of "anti-woke" sentiment were just the icing on the cake...and thus, the flop.

So, my concern was - and still is - that the suits at EA are going to grab their Excel spreadsheets, see that FIFA still makes a mint while single-player games don't...and ignore everything else and blame Dragon Age's bad numbers on the absence of microtransactions and lootboxes and season passes.

They've been real quiet this year on the Mass Effect front; I'm nervous that they're pivoting again because of the Dragon Age failure, rather than looking at Elden Ring or Baldur's Gate or Cyberpunk 2077 as evidence that single-player games can, in fact, still make money.

Comment Re:Yeah, Okay... (Score 2) 127

If you are unable to describe the problem scenario precisely and concisely

I did, at least to ChatGPT. I summarized here; I'll give you the transcript if you like.

, how can you expect a lexically founded predictive automaton to give a useful result?

Because that's what Microsoft is pitching. I gave ChatGPT a hell of a lot more specific information regarding equipment, already-attempted procedures, and intended outcomes, than about 90% of the people at work who call me for support provide...and it STILL waited until after I bought what it said to buy, to tell me it was the wrong controller. It could have said, "this is one possibility, this is another, depending on the exact specific LED strand you're working with; I can't conclusively determine that based on what you've provided..."

You give a hodge podge of brand names and non-sequiturs like "analog LED" and "digital LED", and expect anything better than regurgitated advertising claims?

It's what the general populace is going to demand in exchange for their keyboards and mice. If the goal is for the computer to understand what the user wants, even when the user is being vague, then there's no way that Microsoft is going to get there in five years or less.

I don't care that ChatGPT was wrong in this particular instance; it's a learning experience for a hobby. No problem. I *do* care that Microsoft thinks that it will take a very short amount of time to get from that state to Jarvis, or the computer in Star Trek, in about as much time as it takes to graduate high school.

Comment Yeah, Okay... (Score 2) 127

I asked ChatGPT to help me get a string of LED lights to work. I spent half an hour following the instructions; I tried the Tuya app and the first party app, made a dummy account, tried the AP-mode instead of the BT-LE mode...never, ever got them to connect properly. I gave it the exact model number on the back of the unit, I gave it links to the exact product, I told it the quantity and color of wires in the lead, and I was still on the version 4 model. It helpfully recommended the QuinLED Dig Quad board, a super cool ESP32-based controller to replace the craptastic Tuya garbage that came with it.

I waited a week for the board to arrive, and I connected it all up...spent an hour of faffing around with no ability to control color or brightness...only to find out that after ALL of that, the Dig Quad was the wrong board because these were analog LEDs rather than digital ones.
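For anyone who hasn't hit this particular wall: the analog/digital distinction that derailed things can be sketched in a few lines. The function names and data model here are mine, purely illustrative, not from any real controller library. An analog (non-addressable) strip shares one PWM value per color channel across every LED, while a digital (addressable, WS2812-style) strip carries per-pixel data on its data line - which is what an ESP32 board like the Dig Quad is built to send.

```python
# Illustrative data model for the two LED strip types (names are hypothetical).

def analog_frame(r, g, b, num_leds):
    # Analog strip: the controller drives three shared PWM channels,
    # so every LED in the strip shows the same color.
    return [(r, g, b)] * num_leds

def digital_frame(pixels):
    # Digital strip: the data line shifts out one color value per pixel,
    # so each LED can differ.
    return list(pixels)

# An addressable controller emits per-pixel frames; an analog strip has no
# per-pixel data line to receive them - hence the mismatch described above.
all_red = analog_frame(255, 0, 0, 4)
rainbow = digital_frame([(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)])
```

The hardware consequence is the punchline of the story: no amount of reconfiguring an addressable controller will drive a strip that only has shared R/G/B power rails.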

...So now, Microsoft wants to tell me that they're going to totally overhaul Windows to use *so much AI* that it will basically be able to read my mind and do what I want from vague parameters, and be accurate about it? They're pitching Jarvis as something they'll have working properly in four years - to the point that users will actively want to be talking to their computer (along with everyone else in their adjacent cubicles), and that it'll be desirable...but today, the models can't accurately assess which LED controller to recommend when given EVERY piece of information that an informed CSR would provide?

...Given that very few people use Copilot by choice, and given that previous attempts to overhaul Windows have been niche at best (almost 15 years into touch-based computing, *how many* Windows users leverage a touch screen even 20% of the time?), and given that existing models are useful but far from indispensable, and given that there is already a growing resentment of the sheer volume of "AI slop" that's making the internet even less desirable to use for many.

...Occam's razor is telling me that this is just Nadella trying to keep the stock price from cratering by giving shareholders some sort of assurance that the bazillion dollars they've spent on GPUs weren't wasted.
