Comment Re:doesn't even pass the sniff test (Score 2, Insightful) 45

"if you have seen copyrighted code that does X, you should not write ANY code to do X"

Don't confuse software copyright with software patents. Software copyright does not prevent you from re-implementation.

modern tech companies don't care about intellectual properties

Please avoid the term "intellectual property"; it clouds the mind and creates confusion between copyrights, patents and trademarks, which are distinct things that exist for different purposes.

Comment Do not forget to patch... everything! (Score 4, Insightful) 74

In the good old days it would've meant a one-shot update to glibc per system, and maybe sometimes for the odd chroot-jailed application.

Nowadays, it means updating all the containers too (docker, LXC, snap, flatpak), so unless they're all homemade, it's a hunt for online updates, hoping upstream isn't slow to react.

Just like in this other operating system we laugh at when it comes to security.

Aren't we getting somewhere nice?

Comment Let's ask the RIAA (Score 1) 111

When a song includes even a minimal sample from another song, it's considered a copyright violation.

Even if two songs are not digital clones of each other, reproducing the same patterns (as a direct cover, or as a very similar melody over very similar chords) is again a copyright violation.

Isn't the above exactly what Copilot does?

(The AI process is irrelevant: if AI were used to help composers and producers write music and, for good measure, dropped a sample of U2 into a song or reproduced the melody of a Metallica song, there's no way the RIAA would say it's ok.)

Comment Re:And fusion will work, too (Score 1) 170

Is it better to unbundle all the libraries and have them update randomly, rather than having them update on the same release schedule as the Python interpreter? I really don't see why that would be an improvement.

No other language I know bundles that many libraries, and they manage to be useful despite the randomness of third-party library updates. PEP 594, which kills many batteries, pretty much spells out the caveats of bundling so much fluff: maintenance is costly; in many cases third-party libraries are better; most people use pip anyway.

Unbundling also has the advantage of incremental adaptation to changing APIs; when python 3.x is released, many bundled libraries break at the same time; if they're unbundled, a programmer can handle one library change at a time.

Python is still my favorite language

No language's perfect. There's good ranting material for all of them.

Are you shocked and horrified that C changed?

Yes, especially for those two new keywords which didn't really need to exist (the metaprogramming use case seems like a stretch).

I guess I'm too fervent a sheep in the church of Linus Torvalds: "Don't break userspace!" Programming already has enough complexities (race conditions, security, caches, scalability, ...); we don't also need artificial problems due to purely artificial code rot.

Comment Re:And fusion will work, too (Score 1) 170

when Python updates, all the "batteries included" library code updates; and almost all of those issues are library issues.

I see; I find the double discourse disappointing: those "battery" libraries are sold as "part of the language" when promoting Python (coders should rely on them; see how convenient and easy Python is), but they're "not really part of the language" when it comes to long-term support of users' existing code bases (coders should expect breakage).

Of course, over the years, standard protocols change, new data file formats replace old ones, computing needs change, and third parties create better implementations of those "battery" libraries, so many of those batteries are dead. Wait for it: in Python 3.10, PEP 594 says a bunch of those dead batteries will be removed. And obviously, all the code that relied on those batteries will break. See a pattern?

With many of today's programmers relying on internet connectivity all the time anyway (calling pip install foo or npm install bar is routine for them), maybe it's time the Python team moved all its batteries out of the core project and put its energies where they'd be better invested (GIL / performance)?

You think C hasn't introduced any new reserved words since 1978? Look up restrict and inline, added in 1999.

I learned something today, and I'm disappointed; restrict and inline are essentially optimization hints (they don't change what a correct program computes). Before the introduction of those two keywords, was there already a way to send hints to a compiler? Of course: #pragma. Were there compilers implementing #pragma inline? Yes! So I can only conclude those two added keywords are pure language pollution. I'll say it again: I'm disappointed.

Comment Re:And fusion will work, too (Score 1) 170

You still haven't found a smoking gun proving your point

As I run Gentoo, which supports multiple Python versions and compiles everything, I regularly see programs and libraries break when python-3.n+1 is introduced, followed by a new release a few weeks later that fixes the issue. Maybe the differences between versions appear trivial to you, but a search for "breaks with python 3.7" (and variations) on GitHub shows many people get bitten by backward-compatibility issues. It is a real-world issue.

And those are just the breakages we can see. There are large code bases that aren't public (I hear Google uses *lots* of Python internally); there must be plenty of man-hours spent fixing perfectly working programs so that recent Python versions don't choke on them.

You're also pointing to a specific libc changelog (GNU's), which has nothing to do with the C language definition itself; there are other C libraries such as uClibc; Windows has its own CRT DLL, BSD has its own libc, and so on. Adding/deprecating a non-POSIX function name in one specific libc implementation is not the same as declaring async to be a reserved keyword in the language itself.

There's a significant difference between the core language and its most popular libraries; it's normal for libraries to evolve, it's part of the deal. I'm not blaming python when numpy or urllib or scipy suddenly break their APIs.

We get it, you don't like Python. You don't have to like it.

I'd like to be neutral as I am with other languages of the same scope (perl, ruby, PHP, lua, ...) but it appears python has just the right set of annoyances to grind my gears (and don't get me started on the virtualenv-anaconda-mess that tries to work around broken package management and multiple concurrent python versions or on that stupid .pyc file pollution).

Comment Re:And fusion will work, too (Score 1) 170

[python 3.5 code from 2015 is unsupported and may not work with today's python] If you are going to make an extreme claim like this, you need to produce an example.

Note that I wrote may, not will. PEP 528 and PEP 529 give two examples of valid 3.5 code that breaks when running under 3.6.

For Python 3.7, the official documentation even has a porting guide; same for Python 3.8 and 3.9. Not all of those changes apply to 3.5-era syntax, but the very existence of porting guides pretty much hints that every dot-release means possible code breakage.

I will show you some C library code from decades ago that is now deprecated.

I'm eager to see that.

There's an automated tool for porting 2.x code to 3.x (which I used and it worked very well for me).

You seem to be lucky enough to use a subset of the language that works well with the tool. Many Python 2 libraries never got a port to Python 3; it took years for distributions such as Red Hat to migrate away from Python 2. When a whole book gets written on the subject of migrating from Python 2 to 3, that says a lot about how limited 2to3 really is.
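A minimal sketch of the kind of thing 2to3 cannot fix: it rewrites syntax (print statements, renamed imports), but it cannot know the types flowing through an expression, so the Python 2 → 3 change in integer division semantics has to be audited by hand. The midpoint function here is made up for illustration.

```python
def midpoint(lo, hi):
    # Under Python 2, "/" between two ints was floor division, so this
    # returned 3 for (3, 4). Python 3 returns 3.5 -- and 2to3 leaves
    # the "/" untouched, silently changing the program's behavior.
    return (lo + hi) / 2

print(midpoint(3, 4))   # 3.5 under Python 3
print((3 + 4) // 2)     # 3 -- the explicit floor-division spelling
```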

I have Python code I wrote ten years ago or more that I still use.

Count your blessings.

Comment Re:And fusion will work, too (Score 2) 170

[python 3.5 code from 2015 is unsupported] It works perfectly well in an 2015 Python environment.

python 3.5 is unsupported as of 2020/09/30. If you want to run an unsupported interpreter full of security holes that have not been backported from later versions, that's your business, but not exactly great advice.

[1978 C code from K&R still compiles] A modern compiler won't compile K&R code.

From K&R's 1978 The C Programming Language:

main()
{
    int lower, upper, step;
    float fahr, celsius;

    lower = 0;
    upper = 300;
    step = 20;

    fahr = lower;
    while (fahr <= upper) {
        celsius = (5.0/9.0) * (fahr-32.0);
        printf("%4.0f %6.1f\n", fahr, celsius);
        fahr = fahr + step;
    }
}

Now, exactly what modern compiler of yours cannot process that?

Comment Re:And fusion will work, too (Score 4, Interesting) 170

Building python tools with all the dependencies cannot be made stable.

I do remember when Python was first sold to the masses... "Unlike Perl, the code is readable because there's only one right way to do any given thing." They just forgot to mention: "The right way to do it changes with every dot-release."

I wonder if python's the language with the greatest amount of deprecated code attached to now-unsupported versions. 1978 C code from K&R still compiles; python 3.5 code from 2015 is unsupported and may not work with today's python. I wouldn't dare code anything remotely supposed to work in five years in python.

Comment Re:SQL was never a language to itself. (Score 2) 170

SQL is a way of requesting a recordset from a database... you're not supposed to do all your computing there.

Not all of it, but it's a good rule of thumb to manipulate data where it already lives. So many clueless "developers" wear out perfectly good ethernet switches calling "SELECT * FROM foo JOIN bar JOIN baz", then doing the calculations on the data in their toy scripting language, and wonder why the processing is so darn slow.
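The point above can be sketched with sqlite3 from the standard library: summing a column inside the database ships one row back, while the "toy scripting language" approach ships every row first. The table and column names are made up for illustration.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE foo (x INTEGER)")
con.executemany("INSERT INTO foo VALUES (?)", [(i,) for i in range(1000)])

# Let the database do the arithmetic: a single row crosses the wire.
(total_db,) = con.execute("SELECT SUM(x) FROM foo").fetchone()

# Client-side computation: all 1000 rows cross the wire first.
total_client = sum(x for (x,) in con.execute("SELECT x FROM foo"))

assert total_db == total_client == 499500
```

On a real network database the difference is not the arithmetic but the round-trips and bytes transferred.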

It'd be stupid to code a file server in SQL, but it'd also be stupid to write a compiler in bash (but not to call a compiler from bash).

Comment Double the speed of _future_ python (Score 1) 153

I assume current Python code has already had all the low-hanging performance fruit picked. I mean, this is not version 0.1 just following a proof of concept. I also assume they wouldn't fix the GIL/green threads in a dot-release; that's too much of an undertaking, touching too many lines.

So, how do you optimize such a code base for speed? JIT seems out of the question: too slow at startup. What's left? Gotta change the language a bit, again. Mark my words: new constructs so the language becomes optionally typed, and new functions for those new static types.

I fear current python code will not be faster. Not that Guido cares: he likes to thank people who wrote python code in the past by making sure their code has a close expiry date. Fsck them all who evangelized the masses to python2!

I mean, rewriting code for no benefit is so much fun programmers should praise Guido for the opportunity! It's fun to go see your boss and tell him "I need to rewrite this piece of code that does the job, almost from scratch!" "Why? If it ain't broke, don't fix it!" "Because Guido said so, ain't it a business case?!"

Comment Re:Normalized (Score 1) 33

This.

Users are experiencing data leak fatigue, as every month there's major news about such and such large firm being breached. So they just shrug it off, rationalize that it's part of the deal, and resume uploading personal data to whatever social media website that produces the most dopamine for them.

Comment Python, really? (Score 1, Troll) 44

Python, really?

I'm surprised that NASA would rely on a volatile language with such a short support time span. I understand this is not a probe that'll be up there in 30 years; still, why invest resources in something they'll need to rewrite in 5 years because the almighty Python gods have decided to break everything again and punish their users with the fun task of updating perfectly running code for no good reason?
