Comment Home video landscape offers limited choices. (Score 5, Informative) 76

This is an area I am quite familiar with professionally, but the solutions are not great.

Casting boxes that support webcams:
- Google Chromecast with Google TV supports a USB webcam connection, so this plus Google Meet to perform the calls is one option
- Xbox with a webcam and Skype. Skype has been doing video calls for a long time and likely supports a calling / ringtone option, but keep in mind the calling feature may require fees

Webcam options:
For this, the key element is "with a camera that follows us". The home webcam segment calls this "PTZ", or Pan / Tilt / Zoom. Certain companies have unique names for the tech, like "Eagle Eye". The PTZ keyword will help you find more choices beyond what I know, but the big caveat is that this is a nascent segment and some solutions are bad: either they are slow to react and recognize a new person in the room, or the video quality is poor. Consumer PTZ cameras are an area I'm less familiar with, but even the professional gear from Cisco / Poly / Jabra showed noticeably bad cropping and / or video quality when we tried it.

Suggested items to look at:
- Poly Studio P15 (includes speakers)
- Jabra Panacast MS (extremely small and unobtrusive, uses 3 fixed cameras for area coverage and stitches the images together; recommend confirming video settings are set to the highest possible, out of the box we did not observe good video quality)
- Logitech PTZ Pro 2 (a very conspicuous camera; it noticeably moves on servos when it's following people)

I will say it probably pays to do your own research here; a search for "PTZ camera review" turned up plenty of choices. It will be worthwhile seeing what the reviews have to say, as that will probably shape your shortlisting / decision-making.

Comment Re:XPS (Score 1) 37

It would be more appropriate to say you are stuck in x86 CISC land. The CISC architecture continues to hold Windows-on-x86 back with inherent inefficiencies compared to a clean RISC design, like the ARM chips being pumped out by basically everyone and the RISC-V innovations happening now. Intel unfortunately abandoned Itanium, but it made ARM chips for ages (the XScale line), so the ability to switch is there; it just has to be brave and actually do it now.

Maybe PRISM opens that door: Intel could cast off its CISC shackles and produce unique ARM designs, or even throw a curveball, develop RISC-V designs, and usher in a new era of compute power and power efficiency for its products, starting a new chapter in its story. This seems like the right time to partner with Microsoft and cast off the legacy ISA that keeps taxing Intel (and AMD CPUs, for that matter) and keeping their power draw far higher than it should be.

Comment Re:Congrats Intel (Score 1) 32

You should probably save the congratulations for the eventual market winner. Being 2 years ahead but unable to produce enough hardware to supply the market, leaving the market and enough revenue open for at least one other competitor, is not a position to brag about. And the competitors include at least 4 of the Mag 7, plus AMD, plus Intel, plus Qualcomm. To say they are well capitalized is like saying Apple has a few nickels of cash in the bank. Nvidia was first into the AI market but now finds itself with unenviable problems: it can't meet demand, and it's able to price so high that it is creating its own competition. If it were smart it would drop the bottom out of pricing so the competition can't find a profitable path and gives up. That might deter the Mag 7, but it won't be enough to stop AMD, Intel, and Qualcomm from all making a run at this out of necessity. These are interesting times.

Comment Re:1.5x faster, not available for another 4 years (Score 3, Insightful) 32

Correction: Intel's tech is launching later this year. 5-6 months, not 4 years. 4 years is the time it will take for their new, dedicated facility to be constructed, but they are using 3rd party fabs to manufacture Gaudi 3 this year.

As for "Nvidia has won everything and there's no point in trying": this is a nascent market where a 3-year lead isn't much. A real software lead is Windows. Excel. Photoshop. Android. These are real standards. Nvidia's tech isn't a standard yet, because everyone is trying to compete with it. You know you're the standard when everyone else adopts you (see the Tesla EV charging port as an example; everyone else adopted it). Right now you have at least Amazon, Microsoft, Meta, and Alphabet all working on their own AI chips, in addition to AMD and Intel. So Nvidia just has a head start, but the race isn't over, and Nvidia can't even supply all the demand. All you have to do right now is be good enough and you get deals, because you'll be far cheaper than the best. That's dangerous for Nvidia, because it means its capacity limits are enabling the existence of competition. This is only the start of the race, not the end.

Comment Re:No (Score 1) 111

Remote working will increase the prices of the suburbs and make them no more affordable than the city.

That's not how housing markets work. The price premium of cities is because that's where all the jobs are. If prices in the suburbs and the urban core become the same, then for the same price buyers can dramatically cut their commute to their office. Even with work-from-home that still leaves plenty of people who must work on site, in the urban core. Urban prices will always remain higher.

That being said - prices can rise in an entire city to such a point that the whole city becomes unaffordable. That can make the suburbs and the core equal.

Comment Urban decay is what happens. (Score 3, Insightful) 254

It's bad enough we have a new drug epidemic on our hands with Fentanyl sending overdose deaths skyrocketing, but now add WFH / WFAnywhere prevailing, and you get urban decay.

- The city's "bad part of town" expands because there are fewer people
- With fewer people downtown, homelessness becomes much more visible
- Rather than drug use being concentrated in one area, its effects show up everywhere. The result is more disturbing imagery, more random assaults, many more broken windows. And SF already has a problem with brazen daylight attacks, which is its own issue.

Result: Employees don't want to work there, companies cancel leases, buildings go under-utilized. Developers start buying commercial and appealing to city hall to convert to residential. Or some property owners go bankrupt and the buildings go dormant.

See NYC in the 70s and 80s.

Comment Re:Macs (Score 1) 284

Macs have never been Unix Workstations. When you read "workstation" think "full ATX x 2" sized cases with expansion slots for more than just a GPU. $6000 CPUs. Ugly as Windows 3.1 UIs. The most creative thing back in those workstation days was an SGI case, and maybe some specific high-style Cray supercomputer installs.

"Gimped version of BSD": it's a pretty complete BSD Unix layered on a Mach microkernel, so unlike the monolithic kernels of FreeBSD / OpenBSD and Linux, it had an interesting kernel design. Apple hasn't really done anything interesting on the Unix side since then, because it provides such a complete, user-centric software ecosystem on top. Apple may use Unix, but it never wants its customers looking at something that reminds them of DOS all over again.

"Pretty skin": you do know Apple's history, right? OS X offers a proprietary but complete, designed end-user desktop environment. This is what has defined and set Apple apart since the Lisa and the very first Mac. A pretty skin is just the appearance; the functionality of the Finder is leaps and bounds beyond that, with a track record of incorporating smart ideas, from native PDF support and the simplicity of AirDrop to aspects few people notice, like the quality of the Spotlight search engine and the amount of user-interface design that has gone into making the system intuitive to use. Compare this to literally everything else on the market, even Windows, and it is still one of the most compelling reasons why MacBooks are basically the daily-use tool in much of the tech world today.

Comment Tolkien is missing from this series. (Score 5, Interesting) 288

There's quite a bit missing from this series and it makes me wonder what the series creators were thinking.
1. It doesn't sound like Tolkien at all. The dialogue is utterly lifeless and contemporary. By comparison, in the LOTR movies and the GoT TV show you hear the prose of the writer, and that prose is one of the culturally distinguishing features, sometimes to great effect. This series lacks that entirely; it sounds like generic fiction.
2. Galadriel wears a nightgown? Galadriel travels in full plate armor? The LOTR movies did a great job of making the elves seem serene, regal, and slightly aloof; this series does nothing of the sort. Galadriel wears generic-looking outfits, and the actress's hairstyle is so thick that it rarely shows her ears, where the elves of the movies all had very fine hair that revealed their ears much of the time. I just don't get a sense that she is anything unique or significantly different from the humans beside her. And worse, once she rallies the troops she puts on full plate armor (likely some lighter-than-air Elvish armor, of course, but still). This isn't in keeping with the look of Elves at war at all; they avoid heavy armor because they rely on their natural agility and avoidance tactics when fighting. Ismael Cruz Córdova is a much better cast; he does have Elfin qualities, although here again you get no sense that Arondir is a woodland elf with a different and unique culture. He's literally a generic "archer with a cause".
3. The world is mostly just a set. In Tolkien's writing, the world itself is a character. From moths that aid Gandalf to forests that shiver and move, Tolkien's world is alive and steeped in rich magic and mysteries. When trees are decimated in this series, when insects are just bugs, the world itself is totally lifeless. A dead world is not the world of Middle Earth.

This show definitely needs some major work to really feel like it's a derivative work of Tolkien. I don't know if I'd continue watching more generic fiction that uses the words and symbols of Tolkien's world, but doesn't sound or appear like Middle Earth as Tolkien intended.

Comment "Slowing price appreciation" doesn't mean losing $ (Score 2) 180

It's realty, so it doesn't take that sophisticated a mind to realize that "slowing growth" is not the same as "prices falling". When price increases go from 1.5% monthly to 1.3% monthly, that's slowing growth. It's not taking a bath, though.
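A quick back-of-the-envelope sketch makes the point (the $500,000 starting price and the month-by-month rates here are made up for illustration; they are not Zillow's actual numbers):

```python
# A price compounding at "slowing" monthly growth rates still rises.
# Starting price and rates are illustrative only.
price = 500_000.0
for monthly_rate in [0.015, 0.015, 0.014, 0.013, 0.013, 0.013]:
    price *= 1 + monthly_rate
print(f"After 6 months of slowing growth: ${price:,.0f}")
```

Growth slows from 1.5% to 1.3% per month over the run, yet the price still ends roughly 8-9% higher than it started. Slower growth is still growth.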

And what are Zillow's actual total sale prices, not asking prices? Across all sales, not just those in one single city? One city does not tell the story of Zillow's balance sheet.

I'm surprised Bloomberg published this article at all; it's not only inaccurate, it's also poorly researched.

Comment Re:Which is it? (Score 1) 207

Intel is playing PR games with its "steal the lead back" BS; it's a repeat and shameless offender in that regard.

Great point on power, but there's a lot more Intel also isn't talking about: Intel produces strong CPUs with weak graphics, while Apple's silicon proves it can do both. I think we can discard the game benchmarks as anomalous; once Apple and game companies start optimizing for each other, that full graphics power should be unleashed. The only thing that will keep game performance low on the platform is lack of interest. This is a dev-relations problem, not a problem with the silicon itself. Apple needs to decide once and for all whether it wants to take games seriously, because with this kind of hardware the question is no longer raw potential; it's interest.

Comment Re: Science everyone understands... (Score 0) 445

This comment is so true. We can no longer live in a naive society, we must now consider: how will a new policy, rule, or system be exploited by bad actors?

We have to ask this question for absolutely everything in society, I feel. Every anti-masker can now walk into a store, say "I'm vaccinated" and store owners don't know if they are being put at risk or not. This is a colossal blunder by the US CDC.
