
Comment ...yet they FORCE you to not use OAuth services (Score 1) 19

I'm really annoyed by this. I recently switched from Google Music to Spotify, around the time Google moved to YouTube Music, and really mostly because my son wanted to use Spotify on his PlayStation.

Now, when I went to their web site to create an account, it would only let me sign up with an email address or Facebook. Installing Spotify on my phone, though, required no such thing: I was able to sign in with Google!

I then proceeded to log in online with Google sign-in (!) and use the web player just fine!

So I thought "ok they promote Facebook for some reason, but that's fine".

A few minutes later I downloaded the desktop application and started it: NO GOOGLE SIGN-IN.

Since then, I refuse to use the desktop application purely because it would require me to create a password with them. I could use Firefox Lockwise to generate one, but I can't be bothered: I have a separate email for that. When I log in to something with my Gmail, it's because that thing supports Google authentication. If it doesn't, I use a special email address I keep specifically for creating accounts with random passwords...

It really pisses me off that even though they have obviously implemented Google authentication, they try to force you to create a password with them (for new accounts / to use the desktop player), and then go and lose your password and personal information to hackers...

Comment Re:Up to 16GB of RAM? (Score 4, Interesting) 112

This is HBM or something similar as it lives on the SoC, so it must be quite expensive. Apple needs its hefty profit margins so it'll be hard to sell 32GB options at this point.

The real selling point is the UMA: these things can have the CPU and GPU work on data in place, without moving it around between RAM and VRAM, as it's common memory. You can get some really cool performance optimizations for graphics/video editing applications due to this. Likely all those "up to X times faster" claims have to do with applications optimized for this.
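A back-of-the-envelope sketch of what "not moving it around" saves (the frame size and the two-copies-per-frame model are my own illustrative assumptions, not Apple's numbers):

```python
# Toy model: bytes shuffled per second when the CPU and GPU exchange a
# 4K RGBA frame, with and without unified memory (UMA).
FRAME_BYTES = 3840 * 2160 * 4  # one 4K RGBA frame, ~33 MB


def copy_traffic(unified: bool, fps: int) -> int:
    """Bytes copied per second between RAM and VRAM."""
    if unified:
        return 0  # CPU and GPU touch the same physical pages
    # Discrete GPU: upload the frame to VRAM, then read the result back.
    return 2 * FRAME_BYTES * fps


print(copy_traffic(unified=False, fps=60))  # ~4 GB/s of pure copy overhead
print(copy_traffic(unified=True, fps=60))   # 0
```

Even this crude model shows why memory-heavy video pipelines are the workloads where UMA optimizations pay off most.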

The good news is that all these "Intel partners" that have fought UMA for ages (due to Intel not having graphics) will now need something to compete with the M1, so we can hope to finally see an AMD solution: a 5nm Ryzen with RDNA2 and HBM for UMA would be absolutely killer. Heck, even at 7nm it should still be faster than the M1.

Are you listening Microsoft? How about a new Surface Book with an AMD SoC?

Comment Re:I'm curious how fast they REALLY are (Score 1) 112

Did they really say "than comparable Intel processors"? Because in that case we're talking 14nm chips (two process nodes behind). That takes away a lot of the "wow" factor. A 7nm Ryzen would already give these things a run for their money, whereas a 5nm would likely comfortably beat them...

Comment Re:Embrace extend & extinguish (Score 5, Interesting) 56

Well, AMD is going in the opposite direction with fully OSS drivers so there's hope.

Now if only "Big Navi" turns out to be competitive and heralds the "comeback" of AMD in GPUs (like Ryzen did for CPUs), then we'll at least have that...

I'm already buying AMD-only (my last two purchases were an RX 480 and an RX 570) precisely to vote with my wallet and support the fully-OSS drivers. I also bought a Ryzen APU laptop for my son and will buy a Ryzen laptop for myself next (hoping for the Surface Laptop 4).

Comment Re:User experience != hardware (Score 1) 280

Changing the CPU architecture from x86 to ARM won't change the user experience. That's all software, all of which runs just as well regardless of the underlying CPU hardware.

It really depends how you mean this. User experience can come down to hardware architecture as well. If Apple manages to produce a performant ARM chip (as in, comparable speed to the latest x86 offerings) that runs cool enough to NOT require fans, that IS going to make a difference. For example, I hate working with hot laptops: one laptop I own is a 2018 Matebook X Pro with an i7 processor, and I run ThrottleStop and down-clock it to i5 levels just because it is too noisy and hot for me in its normal configuration.

Also, if the battery lasts 12 hours of normal use as opposed to 6, that also constitutes a difference in 'user experience'. I actually noticed my down-clocked Matebook was not just running cooler, but running considerably longer as well.

Yes, users normally won't be able to tell what changed (the software will be the same, the machine will look similar to any other Mac, etc.), but they will notice these things and point them out. The key to this better 'user experience' will be whether Apple has managed to actually design an ARM chip that's way ahead of the x86 offerings.

Comment Re:I'm less excited than I was before. (Score 2) 117

I fully agree. I've always thought of WSL as "the inverse of WINE" and that it can give near-native performance for Linux applications on Windows.

I was surprised when I first learned about WSL2 switching to a VM; I'd expect performance degradation. Apparently a lot of it had to do with disk I/O, where somehow performance was much worse than using a VM with a disk image. If I remember correctly, they switched to having a special mount and now ask you to access Linux files only that way, so that metadata does not get de-synced.
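For reference, that special mount shows up on the Windows side as a `\\wsl$` UNC path. A minimal sketch of building such a path with the stdlib (the distro name and home directory are just examples; list your own with `wsl -l`):

```python
from pathlib import PureWindowsPath

# Example distro name -- substitute whatever `wsl -l` reports.
distro = "Ubuntu"

# Linux files should be accessed from Windows via the \\wsl$ share,
# not by poking at the VM's disk image directly.
linux_home = PureWindowsPath(rf"\\wsl$\{distro}\home\user")
print(linux_home)  # \\wsl$\Ubuntu\home\user
```

`PureWindowsPath` is handy here because it manipulates Windows-style paths correctly even when the script itself runs on Linux.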

I'm not complaining though. I'm jailed in Windows at work and use Cygwin to keep my sanity, but I'd really love it if we get WSL (any flavor of it) at some point...

Comment Re:ConEmu is better (Score 2) 117

At work I am forced to use Windows but thankfully I am allowed to run Cygwin.

Running bash in ConEmu gives all sorts of terminal issues, especially when I ssh into some other box. The size of the window keeps de-syncing with what the software thinks it is, so lines start wrapping too early (or too late, overwriting the beginning of the line), paging in "less" becomes impossible, etc.
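A quick way to see this kind of de-sync is to compare what each side thinks the window size is; a small stdlib sketch (the SIGWINCH nudge is Unix-only, and the target PID is hypothetical):

```python
import shutil

# What size does this side believe the terminal is? shutil falls back
# to the COLUMNS/LINES environment variables, then to 80x24, when
# stdout is not a real terminal.
size = shutil.get_terminal_size()
print(f"columns={size.columns} lines={size.lines}")

# On the remote Unix side, a program whose idea of the window size has
# drifted can be told to re-query it (pid is hypothetical):
#   import os, signal
#   os.kill(pid, signal.SIGWINCH)
```

If the two sides report different sizes, you get exactly the premature wrapping and broken "less" paging described above.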

I love ConEmu and use it extensively for PowerShell and cmd. Its developer is doing an amazing job, but the fact is that it is impossible to get this right, given that ConEmu is a *Windows console* (see https://conemu.github.io/en/Ba...), so even WSL does not run "ideally" in it.

Getting Windows Terminal is going to be an absolute dream for me. If they give us WSL as well - to replace Cygwin - I'll be the happiest I've ever been at work.

Comment Re:USB4 in a Nutshell (Score 1) 78

First off, thank you for this very informative post.

You mention "The USB4 specification requires USB4 hubs to be backwards compatible with both USB3 AND Thunderbolt 3."

Aren't all ports of a laptop essentially connected to an 'on-board hub'? Therefore even if a laptop has just one USB4 port (and maybe a few USB3 ports on another hub), that port essentially MUST double as a Thunderbolt 3 port (because that's what USB4 really is), right?

What I'm getting at is that all current Thunderbolt 3 docking stations (or eGPU enclosures) will work in all future USB4 ports of laptops. Since these typically supply USB3 ports on the dock/enclosure and also allow you to daisy chain further Thunderbolt 3 devices, what would be the reason to buy one of those expensive 'USB4' docking stations?

I see no advantage, given that a Thunderbolt 3 hub will be cheaper, supply the same 40Gbps throughput AND is actually readily available today. Buy one now and use it as a dock for your future USB4 laptop. The only thing it will not provide is USB4 ports on the docking station so you can keep fast peripherals plugged in. But given that USB4 devices actually work in 'Thunderbolt 3' mode to achieve that speed, one can just daisy-chain them on the Thunderbolt 3 ports that current TB3 docks provide, right? And once you have more than one peripheral connected, the 40Gbps must be shared among them, so having two fast USB4 ports on a dock immediately becomes moot (as they must share a single 40Gbps uplink)?
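The sharing argument is just arithmetic — a sketch (even split under full load is my simplifying assumption; real links also lose some bandwidth to protocol overhead):

```python
# One 40 Gbps TB3/USB4 uplink shared by N saturating downstream devices.
UPLINK_GBPS = 40


def per_device_gbps(n_devices: int) -> float:
    """Even split of the uplink when every device is saturating it."""
    if n_devices < 1:
        raise ValueError("need at least one device")
    return UPLINK_GBPS / n_devices


print(per_device_gbps(1))  # 40.0 -- a single fast SSD gets the full link
print(per_device_gbps(2))  # 20.0 -- two saturating devices already halve it
```

So whether the dock exposes two USB4 ports or one TB3 daisy-chain port, two busy peripherals end up with the same ~20Gbps each.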

Finally, I've read that Intel still maintains the certification for TB3, meaning vendors cannot advertise a USB4 port as 'Thunderbolt compatible' unless they pass Intel certification. But based on what you say, if a vendor has properly implemented USB4, their hub WILL work with TB3 devices anyway (certification or not); they just won't be able to advertise it as such. Is that correct?

Comment Re:Honest question to Eclipse users (Score 1) 67

I have used Eclipse since version 2.x and am VERY familiar with it (at one point I worked on an RCP application for a job, and even wrote an OSS plugin more recently). I use it mostly for core Java development and the C/C++ natives that go with that Java. Recently I've been using it for Python as well, with CodeMix.

I'm in my 40s and have been coding since I was 13. I've also used IDEA and briefly PyCharm (switched to VS Code and use that for Python), and of course Visual Studio when I was working on a Windows C++ application. I spent quite a few years with vi (later nvim) for C development. So I've been around.

My main drivers currently are Eclipse for Java and VS Code for Python. For C/C++ I use both depending on what I'm developing natives for. I also use Atom quite often.

I've always considered Eclipse superior to everything else I've used. But for the average Joe, it is daunting.

Here's my main answer to your question, "what do you use it for, and now that this is announced why would you pick it over VS Code?". Personally I like VS Code, and I've long considered it better for Python development than Eclipse, just as I've always considered Eclipse better for Java development than VS Code. However, even before Theia, I recently started evaluating CodeMix so that I can switch fully to Eclipse.

The real thing is: both can do more or less everything you need, but Eclipse just does quite a few things better for Java.

My feeling is that the real problem with Eclipse is ACCESSIBILITY. If you go to the Eclipse web site there are so many alternatives (for Java, for Java EE, for C/C++, for PHP), so many nice features must be added via the marketplace (or update sites), it lacks a nice dark theme (which has been all the rage the past few years), and so on and so on... Go search for Markdown or YAML support in the Eclipse Marketplace and you won't know what to pick...

The way I see it, Eclipse has fallen victim to IDEA, VS Code, etc. because they look nice and work nicely out of the box. In comparison, to make Eclipse usable I have a *python script* that installs extensions I cherry-picked myself from various sources. If you know what to install, it's the best way to quickly set up Eclipse. But a new user will just go to IntelliJ for Java and VS Code for Python.
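For the curious, the core of such a script can be a loop over the Equinox p2 director in headless mode. A sketch (the eclipse binary path, repository URL, and feature id below are examples, not recommendations — substitute your own picks):

```python
# Build the headless install command for the Equinox p2 director.
def p2_install_cmd(eclipse_bin: str, repo: str, feature_id: str) -> list:
    return [
        eclipse_bin,
        "-nosplash",
        "-application", "org.eclipse.equinox.p2.director",
        "-repository", repo,
        "-installIU", feature_id,
    ]


cmd = p2_install_cmd(
    "eclipse",  # path to your Eclipse executable
    "https://download.eclipse.org/releases/latest",
    "org.eclipse.egit.feature.group",  # example feature id
)
print(" ".join(cmd))
# To actually install: subprocess.run(cmd, check=True)
```

Run once per cherry-picked feature and you get a reproducible Eclipse setup instead of clicking through the Marketplace.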

The Eclipse IDE is like Linux distributions, only worse: with Linux at least there are Ubuntu (and Fedora) as something of a 'de facto' standard, but for Eclipse there is no single source offering a curated distribution that works well and also LOOKS good. Genuitec seems to have done something nice with DevStyle, which I use for good looks. They've also released CodeMix recently, which I'm currently evaluating for Python, and so far I like what I see. I might ditch VS Code and go to a pure-Eclipse installation (if I still like it after using it for a while).

Every year Eclipse gets a bit more 'last year' and eventually it will die, which will be sad...

Hopefully it will keep a core of knowledgeable users to become the Emacs of 2030-2040.
