Comment Re:hoping this is the straw that broke the camel's (Score 3, Interesting) 30

You need at least one genius to make it. It is currently the only AI we have that is actually smarter than humans, as it can make new scientific discoveries that humans were unable to make even after spending decades trying.

Regarding this, multiple fields (CS and structural biology) disagree with you. The model is not smarter than humans in any way; heck, it is not even smart: it is a statistical model that interpolates very well but is incapable of extrapolating (not much, at least), and that's not a limitation, just part of its design.

There are already a ton of papers tackling the many limitations of AlphaFold, but that's because the whole field is working together to improve it (as it should be in science). They got the Nobel prize not because of how "smart" the model is, but because of how impactful this elegant model has been in drug discovery and structural biology, and because it showed what you can do with a well-designed ML model and good training data.

Regarding the one genius required to make it: the Nobel prize was split between two groups, the Google one and David Baker's (University of Washington), who had his own independent approach to the problem of protein folding. Again, very impactful, but there has been quite a lot of controversy over the fact that ML succeeded because it could build upon the work of countless scientists who, for 50+ years, generated, curated, and collected the data that was then used to build AF. Kinda similar to Reddit complaining about being scraped to build ChatGPT, but at a scale several billion dollars larger.

Comment Wrong frame of reference (Score 1) 121

I usually steer away from opinion controversies like this one, but I think there's a good point to make here.

I couldn't agree more that merging the two realms of applications and users is wrong.
However, the question

What software on Mac do you want for an iPad device experience?

sounds more insidious than it appears, at least to me. The convergence is happening, all right, but the discussion is looking at it from the wrong perspective. Apple has been steering toward this for a while, not because it wants to boot macOS on the iPad, but because it wants to constrain the full macOS experience even further, to resemble the one on the tablet, and that's no secret. Apple's vetting process for installing stuff on a Mac is a good example, and it's becoming more and more stringent with the years. Notarizing open-source applications has gotten way more tedious, and it's not expected to get any easier.

That's it, that's how macOS and the iPad are going to converge, eventually.

Comment Python, JS... (Score 1) 39

I develop almost exclusively in Python, and I can't see myself changing anytime soon. I see a lot of well-argued criticism of the language that makes a lot of sense, and yet I think it's misplaced.

"If your code needs C-level performance, write the critical parts in C/C++" used to be the mantra. I don't like where new development is heading in order to address problems that are the direct consequence of fundamental design choices; if you write computationally intense code in pure Python, that's your fault. As much as I can't stand JS and all its derivatives, I think it has its own ecosystem niche and reason to be, much like Python.
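And for what it's worth, that mantra is still cheap to follow. A minimal sketch of calling C from Python via ctypes, borrowing sqrt from the system C math library so nothing has to be compiled (this assumes a Unix-like system where find_library can locate libm):

    import ctypes
    import ctypes.util

    # Load the system C math library (path resolution is platform-dependent)
    libm = ctypes.CDLL(ctypes.util.find_library("m"))

    # Declare the C signature: double sqrt(double)
    libm.sqrt.argtypes = [ctypes.c_double]
    libm.sqrt.restype = ctypes.c_double

    print(libm.sqrt(2.0))  # 1.4142135623730951

The same pattern scales up to your own compiled hot loops; for anything serious, cffi or a proper C extension module does the job.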

Setting aside those who want to write an OS in an interpreted language, there is no doubt that Python did a terrific job of addressing the needs of countless programmers, as well as of people who just needed to quickly write code to get something done (scientists). Ignoring that would be silly, so I think people should reconsider their blind criticism.

Comment Re: More anti-features from nvidia? (Score 3, Insightful) 131

Not to disagree entirely with you on the anti-feature claim, but the argument is fairly weak. The fastest human reaction time is between 150 and 200 milliseconds (with training). At 30 frames per second, a crappy standard by today's GPU performance, each frame is rendered in ~33 milliseconds, meaning you could pack a bit more than 6 frames into your reaction-time window. At 144 FPS, the ideal refresh rate for Counter-Strike (which a compulsory Google search reports as "Criminally smooth. For hardcore and professional players"), you have ~7 ms per frame, which means your GPU could render about 25 frames by the time you actually perceive a movement. Is it a cheap trick? Absolutely yes. Are you going to notice any difference? Not likely, unless you're playing against your cat (reaction time: 20-70 ms).
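The arithmetic is easy to check. A throwaway sketch (the function name is my own, not from the post):

    def frames_in_reaction_window(fps, reaction_ms):
        frame_time_ms = 1000.0 / fps        # time to render one frame
        return reaction_ms / frame_time_ms  # frames rendered before you react

    print(frames_in_reaction_window(30, 200))   # ~6.0 frames at 30 FPS
    print(frames_in_reaction_window(144, 175))  # ~25.2 frames at 144 FPS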

Comment Why not? (Score 1) 132

I've diligently read all the comments as of this posting, and besides a bunch of reasonably argued points that, in principle, the same could be done with properly written C++ code, I couldn't find any good argument for why not Rust, then. What's the cost of writing the same code in Rust instead of C++? Since performance is not an issue, I frankly don't see why not use it for the majority of development currently done in C++.

The argument of "just use better programmers" does not hold, because even seasoned Linux kernel developers seem to make trivial mistakes, so why not embrace a language that can prevent the most common source of critical bugs (C/C++ buffer overflows, per a bunch of past ./ posts)? To me it seems mostly a hubris issue that "seasoned" programmers suffer from. I can clearly be wrong, but so far I couldn't find an argument convincing me otherwise.
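For reference, here is the bug class in question, illustrated with Python's runtime bounds checking as a stand-in (my own sketch, not from TFA). The equivalent C write compiles cleanly and silently corrupts adjacent memory; Rust instead rejects it at compile time or panics at the faulting index:

    buf = [0] * 8
    try:
        buf[8] = 1           # one element past the end
    except IndexError as e:
        print("caught:", e)  # caught: list assignment index out of range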

Comment Webcam abomination (Score 2) 21

Beware of the incredibly idiotic placement of the webcam in the bottom-left corner of the screen. People on the other side of your teleconference will see your head peeking out from behind your ginormous fingers and knuckles. Not enough? The sensor quality is mediocre (to be nice), and the touchpad sucks. Also, some models (mine is 3560, if I'm not mistaken) will overheat the CPU and trigger lots of MCE warnings in dmesg. Other than that, I've been a relatively happy owner of an XPS 13 DE for a while, and I'm satisfied with the battery life and the overall power.

Comment What does it look like? (Score 1) 166

I'm a big fan of unlocked phones myself, and over the years, I've looked at the Lineage project more than once. If you google the official website, you land on a fancy, modern page with all the bells and whistles about the core values and the mission of the organization, and the links to download their software.

I did it again today, and the outcome is still the same: I have no idea what this custom OS looks like, what it offers, and what its limitations are (if any).
The About link shows a page with the dictionary definition of the OS's name, and a not-so-obvious link to Wikipedia.
I'm sure there are a bunch of smart people out there willing and able to send me a link full of screenshots and LineageOS features, but the fact that a reasonably tech-savvy person can't find any of these on the project's own website (and not via Google, thank you) is disappointing, to say the least.
I'm thankful for all the volunteers who spend time supporting a large number of phones, etc., but I'm still disappointed.

Comment Re:It's rather obvious if you use incognito mode (Score 1) 114

(Apologies if it looks like I'm trying to turn this place into a venue for having useful conversations.)
Isn't that a bit of overkill? I can think of a number of alternative options:

- running multiple profiles for different activities, e.g., a Firefox "mail" profile and a Firefox "browsing" profile

- running different browsers for different activities, e.g., Firefox for email, Chromium for incognito browsing

- running different browsers/activities as different users, e.g., "sudo -u mail_only firefox www.gmail.com"

They would all have much less overhead, but I'm not sure if I'm missing something obvious here.
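A minimal sketch of the first option, assuming Firefox is installed and the two named profiles already exist (the profile names here are my own examples):

    import subprocess

    def launch(profile, url):
        # -P selects a named Firefox profile; --no-remote keeps instances separate
        subprocess.Popen(["firefox", "-P", profile, "--no-remote", url])

    launch("mail", "https://mail.google.com")
    launch("browsing", "https://example.com")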

Comment Re:What I would like to read: "1024 bugs fixed" (Score 5, Interesting) 41

To be fair, I found a more detailed changelog that mentions a few more bugs, but it's just a lot of papercut fixes: tons of warnings removed, Unicode quirks corrected, UI changes... but nothing about high CPU usage, instability, and other major issues.

It's still possible to see the desktop and all the windows on it every time the system resumes on a laptop, before KScreenLocker kicks in asking for the password... but, hey, now it compiles with "strict" compile flags, and finally exports the install location for D-Bus interfaces via CMake.

"Shake it, baby!"

Comment What I would like to read: "1024 bugs fixed" (Score 1) 41

Come on, make users happy and do what a dot-14 version should do.

I'm not even picky, you guys can choose them from any of these lists:

- KWin

- Plasma Frameworks

- PlasmaShell

But what does TFA have to say about it?

Blurred backgrounds behind desktop context menus are no longer visually corrupted.

It's no longer possible to accidentally drag-and-drop task manager buttons into app windows.

Yeah!... I guess?
[DISCLAIMER: I am indeed a KDE user]

Comment Impact (Score 1) 300

I usually don't care much about these discussions, but I feel there might be some room for clarification.
I'm not sure Python is really the future of programming languages, but it definitely provided a glimpse of what a possible future might look like. Whether people like it or not, this language significantly lowered the barrier to entry for writing useful and powerful code, without sacrificing functionality.
It's not C/C++, it's not incredibly memory-efficient, and its performance is significantly lower than that of compiled languages, but...

There are professional programmers using it for the most obvious thing (prototyping new code) as well as for writing complex programs that were not worth the time investment of writing in C++. The big difference, though, is the democratization of programming for people like me who don't have the time and the resources to build serious programming skills. The lower barrier to entry has allowed many people to implement their ideas, make them work, and spread them around... sure, in a high-level, slow, and memory-inefficient language, but do we really care? There are many places where the idea is far more important than the implementation, as is often the case in the scientific world, notoriously famous for producing crappy code.
Tons of people have benefited from countless programs doing very complex operations, or simply scratching long-standing itches (matrix operations, 3D operations, data sanitization, etc.).

If your Python code is really useful and needs to be made faster, you can hire a programmer to rewrite it in C++; in the meantime, it may have reached a critical mass of interest/users that makes that possible (getting funding, etc.).
Python is likely what an everyday programming language will look like once most people write a program at some point in their lives. On the other hand, if Python is the only reason you define yourself as a professional programmer, then that's obviously a problem.
