
Comment Re:Someone has never studied philosophy... (Score 2) 185

Still not falsifiable because if we cannot participate in the "real" universe we have no idea what rules do or do not exist in it. You are assuming a universe much like our own running on computing devices as we know them. But the "simulation" could exist in a universe where none of the rules that are observable in this universe exist.

Comment Someone has never studied philosophy... (Score 2) 185

I'm a bit bemused by the idea that philosophers are starting to take seriously the idea that we are trapped in a simulation. "Trapped in a simulation" is just a contemporary gloss on philosophical discussions covered in almost every Philosophy 101 course. Descartes' famous "I think, therefore I am" (written in the early 17th century) arose from a discussion of essentially whether he was trapped in a simulation (although, lacking the language of computers, he supposed a "demon" had created a false consciousness). That was undoubtedly inspired by the allegory of Plato's cave (circa 375 BC), which also supposes someone trapped in a false reality (in the form of shadow projections). You'll find more contemporary philosophers using the "trapped in a simulation" language these days because the "hot new thing" is what gets the attention of publishers and tenure committees, but it's nothing new or unique in philosophy.

As for scientists, I'm skeptical this theory is capable of scientific study (except to the extent we want to study simulation design). A core principle of scientific inquiry is that an idea must be falsifiable. That the world as we know it is a simulation is not falsifiable unless someone could first escape the simulation (either physically or by communicating with the world outside). Unless and until that happens, it's not really a subject of scientific inquiry so much as hype and speculation.

Comment Re:Trying vs Using (Score 1) 25

I've found exactly the same in my field (tax law). ChatGPT gives a lot of output that looks pretty good at first glance but is dangerously wrong (especially if the person using ChatGPT isn't already an expert in the field). You can fix it yourself, but it can be more tedious to track down the falsehoods and fix the references than to just draft it yourself.

That said, it does have some uses. It's great for stuff like summarizing, translating, and simple data analysis (e.g., finding which terms are present in dataset A but not B). It can also save some time with simple correspondence drafting.
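For what it's worth, the A-but-not-B term comparison is also a one-liner to script directly; a minimal sketch in Python (the term sets here are made up for illustration):

```python
# Hypothetical example of the kind of "simple data analysis" mentioned:
# find terms present in dataset A but not in dataset B via set difference.
terms_a = {"depreciation", "amortization", "basis", "carryover"}
terms_b = {"basis", "carryover", "deduction"}

only_in_a = terms_a - terms_b  # set difference: in A, not in B
print(sorted(only_in_a))  # ['amortization', 'depreciation']
```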

Comment Re:Not 1999 (Score 1) 181

To the contrary, I remember a LOT more upgrade pressure for things like personal computers back in the late 1990s. A 486 (introduced in 1989) was considered woefully out of date by 1997; it could barely run the then-current Windows 95 operating system or most newly released software. By contrast, I recently replaced my Intel Skylake Core i5 (introduced in 2015), and it was still happily running the latest OS and any new software save the most demanding modern game titles.

In the late 1990s to early 00s, I would frequently upgrade the CPU on the same motherboard 2-3 times because each new release brought a big leap in capabilities. Since then, I wait longer and longer to upgrade because generational differences have become so minor to the end user.

Most tech goes through a decade or so of rapid progress in which there is a lot of pressure to upgrade. For smartphones, that was 2007 to about 2017. But we are now past that. The iPhone X (released in October 2017) still looks close to identical to the latest smartphone, performs almost as well, and still gets the latest updates. In 2017, the equivalent would have been an iPhone 4S, which even then was completely out of date. It's not that marketing stopped pressuring people to upgrade their phones; the tech is simply no longer able to deliver progress by leaps and bounds with every generation like it once did.

Comment Re:It's too early... (Score 2) 107

In order to really conclude that intermittent fasting is dangerous, you'd need to take a random sample of people, assign one group to intermittent fasting (with appropriate monitoring to show they actually did it), and have another group act as a control. If you rely on people self-selecting into intermittent fasting, it's going to be impossible to rule out confounding variables.
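The random assignment step described above is easy to sketch; a minimal illustration in Python (participant IDs, group sizes, and the fixed seed are all made up for the example):

```python
import random

# Hypothetical sketch of randomized assignment: split a sample into a
# fasting group and a control group at random, rather than letting
# participants self-select into fasting.
participants = [f"p{i:03d}" for i in range(100)]  # made-up participant IDs
rng = random.Random(42)  # fixed seed so the split is reproducible
rng.shuffle(participants)

fasting_group = participants[:50]  # assigned to intermittent fasting
control_group = participants[50:]  # eats normally, serves as the control
print(len(fasting_group), len(control_group))  # 50 50
```

Because membership is decided by the shuffle rather than by the participants, any lifestyle trait that predisposes people to choose fasting is spread evenly across both groups.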

Comment Going the way of Sears (Score 1) 215

The root cause is simply that the big department store is a concept that was revolutionary in the late 19th and early 20th centuries but no longer makes sense today. There is no synergy in selling cookware, bedding, and clothing under one roof when there are plenty of specialty stores for each, online or an easy trip away. The big department store has no particular specialty to draw people in and is typically saddled with high employee costs.

Also, people do less casual browsing in stores today because they can browse casually online. When they go to a store, they typically have something specific in mind, and the experience at the old-school big department store is typically poor, with a lot of wasted time.

Macy's has managed to outlast the likes of Sears and Foley's, but I will be surprised if it still exists in 10 years.

Comment Re:Tim Cook should have taken Elon Musk's Call (Score 1) 244

But electric cars haven't plateaued, despite the flood of articles stating otherwise. EV sales grew 52% between Q2 2022 and Q3 2023.

https://mediaroom.kbb.com/2024....

It's true automakers have backed off plans to totally phase out ICE vehicles, or have cancelled some specific EVs, but that doesn't mean EV sales aren't growing. That said, I agree that making EVs isn't going to be a very profitable business, and Apple was right to cancel the project. Tesla kicked off a price war in the EV space that isn't going to end anytime soon. Any profits an automaker gets from EVs will be on razor-thin margins, and it's going to be nearly impossible to beat Tesla at its own game.

As far as self-driving cars go, it's an all-or-nothing proposition. Either you have Level 4/5 autonomy and can summon a car from across town with nobody at the wheel, or it's nothing but glorified cruise control that's mostly worthwhile as a party trick. I think Apple woke up to the fact that they weren't close to Level 4/5 autonomy and won't be anytime soon. And if there's no self-driving software, there's nothing about Apple that makes them a fit for producing electric cars: they have none of the infrastructure or supply base needed for that.

Comment Re:Consumers are bad at estimating value (Score 2) 13

I still think the current state of affairs is better than the days of the cable package. At least I'm not paying carriage fees for channels I would never ever watch, and I have "on demand" for all content. Before, even the most bare-bones cable package was $50 a month, and that got you very little beyond OTA channels (basically ESPN, the talking-head news channels, and the History Channel). Today, $50 will get you Netflix + Prime (ad-free) + YouTube Premium, which is way more content. Plus, it's so much easier to add or cancel subscriptions. You don't have to wait in call-center hell to cancel a streaming subscription; it's often one or two clicks and you're done.

Comment Re:Half of college degrees...aren't (Score 1) 266

The "stupid degrees" are mostly a red herring. There certainly are plenty of college degrees that would be better suited to trade school, but most underemployed college graduates are in fields that don't sound stupid on their face. For example, the most popular college majors by number of graduates are business and psychology. "Business" sounds like a degree for someone who wants to be gainfully employed, but many employers regard it as a degree in nothing. Likewise, psychology sounds like a career path (after all, "psychologist" is a totally valid career), but there's not much you can do with a psychology degree other than go to grad school, because licensure as a clinical psychologist requires more than a bachelor's degree.

Biology is another very popular major that is almost worthless on its own for a career. An undergraduate biology degree qualifies you to do little other than go to grad school or become a teacher (or at most a low-paid lab assistant). In fact, most of the hard sciences are like that.

Comment Basic Familiarity vs Mastery (Score 1) 165

Even if you assume that AI will be able to spit out perfect code from a simple prompt in the near future, I think it's still a good idea for most students to have a basic working knowledge of coding. That doesn't mean being able to develop commercial software, but gaining a basic understanding of what's happening behind the scenes with software is helpful in the same way that understanding the basics of electricity is helpful even if you'll never work as an EE or electrician, or understanding how an internal combustion engine works is helpful even if you'll never be a mechanic. It's just part of understanding how the world around you works, which can help prevent dangerous misunderstandings of what is happening around you.
