
Comment Re:Well you have the source code... (Score 1) 48

One could wonder why browsers (and Firefox) include a pdf exporter, a pdf viewer, an svg renderer,...

PDF and SVG are file formats that can and are served on the web. Having support for parsing and viewing them does not seem at all abnormal to me.

In what way is an LLM needed? I think you're undercutting your point with these examples.

The reason to include the local model LLM features in Firefox is the same as with the PDF viewer: they're useful and people use them.

No, just stop with this broken analogy already. People need something with PDF support to view PDF documents. There are no documents that only an LLM can read. It's a false equivalence.

I grant you that they could spin off a number of features I quoted (all the non-html renderer) as add-ons that could even come suggested by default, or even included by default, but easy to remove. That's how the translation feature was offered initially. ...

YES. And that's what I'm suggesting should be done. This is a community driven product, after all. I'm not asking for excuses to justify this LLM addition as similar to printing support or support for specific file formats like PDF and SVG, and I don't buy those as equivalent anyway; I'm calling for this to be handled better.

And you're right... I won't be using Firefox. But if they want to gain me back as a user in the future, there's an easy path - carve this code back out and make it a plug in or add on. In the meantime, I'll be using alternatives or forks that do so (FWIW, I don't run Safari, Edge, IE, nor Chrome either).

Comment Re:Heart Issue? (Score 2) 25

Right!?!?!? At first announcement, I wasn't very interested. But the more info that came out, the more obvious it became that they're NOT disclosing what the actual medical condition is. Now they're telling us who, and even sharing a statement from them, but _STILL_ not saying what it was!?!?!? YOU'RE KILLING ME WITH CURIOSITY, NASA!

People have a right to medical privacy - I'll accept that. But WTF is it that warrants that much privacy and yet has this much disclosed? If I shit myself (in space), I might not tell my friends and coworkers exactly that, but I'd almost certainly share that it was a bowel movement adjacent issue of some sort.

Maybe there was a fight (USA vs Russia?) and someone broke something? That might have been dramatic enough to warrant hiding that info, but I doubt that would stay hidden. Hope it wasn't something boring/lame like a tooth infection.

Comment Re:Well you have the source code... (Score 1) 48

Great example! When I print something, the printer drivers and the connection to the printer are not part of the browser. The browser has an interface to work with printing, not implementing it itself. A print menu item is fine, as would be a button or interface to make calls to an LLM for something. You must configure a printer before the browser can use it.

Why would you trust these new AI features? I've worked directly with LLMs. They go off the rails sometimes. They're not always right. Why would I give one access to my personal data and the browser in which most of my interfacing with the outside world gets done? Why give it that access by default, requiring one to opt out to avoid it? Why include that large addition to the codebase, where there are bound to be bugs that could be exploited in a classic sense, let alone LLM attacks (ex. prompt jailbreaks)?

The kill switch just removes them from your view.

Exactly! It only removes them from your view. They shouldn't be there in the first place. If there is a toggle or button for something that was added, it should lead to pulling in the code that adds these features (IE: an add-on). It should be opt-in, not opt-out.

What do you think the local models could be doing behind your back while you're not calling them?

What couldn't they do? What if a toggle stopped the LLM results from being used, but every webpage you visited was still passed to the LLM? (Eg. someone overlooked a call when adding the toggle) Then the LLM could still be tricked into doing all manner of things via LLM jailbreaks - the whole "ignore all previous instructions" type of stuff (Ex. https://learnprompting.org/blo...). And that's not even touching on classic bugs, like additional buffer overflows that may be exploitable.

That might not be happening anywhere in this release. Maybe the toggles are easy to audit right now and we're sure they got everything wrapped in them? But new bugs can and will happen, and having this black box in place and enabled by default just keeps that door wide open to becoming a big problem at some point.

There is an established and simple way to avoid all of those concerns - stick it in an add-on. They could even offer a download link that includes it by default. This is their own add-on system - eat your own dog food, damnit!

Comment Re:Well you have the source code... (Score 1, Insightful) 48

Wouldn't it be possible to verify the kill switch is doing what it claims to?

IMO, you're missing the point.

Integrating features (ex. AI features) and then LATER adding controls to toggle them means it's an afterthought.

I made a test-first-development analogy on the last post of this, and it applies here as well. In development, most would agree that test driven development (you write the test for a feature first, then implement the feature until it passes all the tests) is an ideal way to do things, though people often skip to implementing first. Same here. In this case, they could have designed this to be a module that could be added (via one of their add-ons) from the get go, and then others could use the same add-on API interfaces to add competing LLM integrations. Instead, they stuffed the code into the main codebase and are only now adding toggles to disable the feature (... and the code and LLM will still be included).

I'd trust them WAAAAAY more if the toggle wasn't needed in the first place. IMO, this feature should be opt-in, and only then pulling in all the LLM stuff.

Comment Re:Well, there is a positive way to consider this. (Score 1) 71

However, there's one thing. We desperately need third party forks.

100% agree. It's a terrific way (as a society?) to explore different implementations and features. In this context, Mozilla could have done a full on AI/LLM browser fork and littered it with integrations. If more people started using the fork, then they'd know where to dump more resources.

Unfortunately, firefox is under a bad license which doesn't fully support collaboration. The reason that there are so many webkit derived browsers which come together under the one HTML renderer is because the KDE project chose the LGPL instead of a weaker license like the Mozilla license.

IANAL, but I am a bit of a licensing nerd. While MPL vs LGPL may have played some role here, I think you're overstating the impact. KHTML / Webkit initially took off because of Safari. AFAICT, the MPL probably would have suited Apple better, but they seemed to want a browser engine that wasn't in use by the then-big two (Microsoft IE and Netscape Navigator). Apple picked up KHTML before the first release of Firefox; otherwise, we might have seen things play out differently. Google picked up Webkit (the renamed KHTML) 5 years later for Chrome, and they eventually forked it to Blink. Yada yada yada... a lot more to it than just MPL'd Gecko versus LGPL'd KHTML/Webkit.

The reason that there are so few fully open browsers is because they chose the LGPL rather than a stronger one such as the GPL v2+.

Yeah... I mostly agree. I'd probably phrase it slightly differently - the reason there are so many proprietary browsers based on these engines is because they chose the LGPL rather than a stronger one such as the GPL. But, IMO, you could swap in MPL for LGPL and things would have gone much the same way.

It remains to be seen if Firefox can create a sustainable software community around it.

How long do they need to maintain an active community around it before it's considered sustainable? It doesn't have to last forever, right? Mozilla's NGLayout/Gecko was released in 1998! It's nearly 30 years old! It's older than Safari, Chrome, and Edge. The thing that remains to be seen is how long they can maintain the established community around it.

Comment Re:Well, there is a positive way to consider this. (Score 1) 71

I get why it happens this way, but it's backwards IMO.

I guess it will start built in, a new framework will be built and in the long run we'll all be able to choose whatever AI components we prefer.

It's like test-first development. Most people will get behind the idea in theory, but they wind up developing before making the tests most of the time. I think this hits a similar mark.

IMO, all such features should start as add-ons, period. Got some new idea? Great. Can it be done as an add-on today (IE: does the add-on API have sufficient features to implement it)? If yes, then directly go to creating the add-on. If no, then the work to do on the codebase is to identify and update the things that would need to be exposed to the add-on API for it to be feasible, then (or in concert) work on the add-on. There will almost certainly need to be changes made to it during the early stages, so get those out of the way and allow it to stabilize. If it's deemed a success and there aren't (m)any competitor add-ons, merge it. If it's a success and there are competitors, take the most successful of those and make that the default implementation.

Like test driven development, I think it's easy to see why that is a wise path to take. It's hard to argue against, but it's easy to ignore and dismiss as more red-tape and/or developers thinking they know what's best already.

I wonder... did they add a compile time flag to exclude all the LLM stuff? That would be nice, and they could then ship two versions of the same browser, rather than have people running to third party forks. Personally, I don't trust that the config flags won't change in a future update, and I don't want that added bloat within the browser.

Comment Re:Well, there is a positive way to consider this. (Score 3, Insightful) 71

For those that want in-browser translation, it's no problem to have this as an available option. However, that doesn't mean it must be integrated into the browser, distributed to all, and enabled by default. What is the argument against having it be an add-on?

Or maybe the "translate this page" feature could work as it did before the local AI translation was integrated - farm the work out to an external service - but with configuration for "translate this page" akin to the search engine config, along with an option to pull down an add-on for a local AI. I note this option because I see the rationale of the relatively lightweight "translate this page" when it's just sending the data out to some other service for the heavy lifting... why not facilitate that for the times when a user actively chooses to do that by clicking the option? But why jump to including a local translation AI with the browser?!?!?

Also, maybe I already run a local LLM (I do). And maybe it's already more capable than the one they're shipping (it is). Then maybe it makes even less sense for the browser to bundle one in with it?

Long story short, most people don't want this feature creep, and those that want a translation feature AND want to use a local LLM are very very very likely to be fully capable of adding an add-on or configuring it to use their existing local LLM.

My guess... it's probably a lot easier (less friction, and less documentation/setup/support needed) to build it in when you're in that position than it is to do the same in an add-on.

Comment Re:cool and all but.... (Score 1) 57

Bash scripting is coding where the commands you are running are your external functions from a (maybe 3rd-party) API package.

Agreed. And sorry to take this on a different tangent than you intended, but... from TFS, "AI performs better with strongly typed languages. Strongly typed languages give AI much clearer constraints..."

If that's the case, why are JavaScript and Bash getting so much more usage? (NOTE: neither are strongly typed)

Comment Re:Sticky notes on the wall (Score 1) 116

NPP has excellent Find In Files capabilities ... Pretty cool, and not something vi / vim will do for you.

Vim can search across open buffers with ease. The search will highlight all the matches. You can jump around to the various matches with single key presses (n for next, shift+n for previous), and it jumps directly to the position in the file and places the cursor on the match.
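For anyone curious, here's a sketch of one way to do a find-in-files-style search across every open buffer in stock Vim (the pattern is just a placeholder; there are other idioms that work too):

```
:cexpr []                             " clear the quickfix list
:silent! bufdo vimgrepadd /pattern/ % " collect matches from every open buffer
:copen                                " browse all matches in one window
```

From the quickfix window, :cnext and :cprev (or just hitting Enter on an entry) jump straight to each match.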

If you find yourself about to say, "not something vim can do," you're probably wrong... and vim can probably do it better :-P

Comment Re:Sticky notes on the wall (Score 1) 116

Oh and speaking of yank (copy), in a GUI editor, that's Ctrl+C; cut is Ctrl+X, and paste is Ctrl-V.

To equate the two is to admit you don't fully understand vim. Vim can yank into and paste from named buffers easily. Yank word (yw) can be augmented with a named buffer ("qyw). Paste (p) can paste that q buffer ("qp). Sometimes, you want to paste a thing many times (ex. drawing some columns), and that's a simple number prefix (ex. 20p), and that also works with those (ex. 20"qp). And those can be combined with recorded macros for some incredibly powerful editing.

You mentioned shift-arrow. Yeah, that's painful. But shift+ctrl+arrow highlights a whole word at a time, or shift+home or shift+end selects from where you are to the beginning or end of the line; add ctrl and you go to the beginning or end of the text. There are a ton more shift combinations ...

OUCH! My hands stay on the home row while in Vim, and I don't need any contorted multiple modifier actions. You may be avoiding the reach for a mouse, but you're still reaching for HOME/END/ARROWS/etc...

Having used vi / vim at work for years, and then moving to GUI, I guarantee I can get around in modern editors just as fast as you can in vim, without touching a mouse.

I think you're writing checks your hands can't cash! I'd take up that challenge.

Now, if you're just trying to say that a proficient NPP user can be just as effective at their editing as a proficient Vim user (IE: though the vim user may be faster, it's not a night-and-day difference), sure. Time spent on the content (devising an algorithm, composing an essay, etc..) will make the difference insignificant. But I don't see how you could claim editing speed parity while you're CTRL SHIFT META HOME END ARROW'ing about. You said you used vim for years, so be real here - vim is unlike most editors, and that's its biggest downside, but this stuff is its biggest advantage... vim gets the win here. Now where did all those emacs users go???

Comment Re:Why? (Score 1) 84

... a simple random number generator is guaranteed to do a better job of it?

Show me how that works. Then, would you expect a normal person to be aware of how to do that?

If it's not obvious, direct use of an RNG does not produce a usable password. The result of this isn't directly usable in most situations: dd if=/dev/random of=test_password bs=32 count=1

It wouldn't surprise me if a lot of people go to google to look for a password generator, or how to make a strong password, and wind up copy/pasting from some website. If they're already using an LLM regularly, that's probably where they'll ask, and they'll probably just skip to the point and ask it to do it. If, instead, they asked it to create a program that will create a strong password, that might actually work.

PS: 100% agree that if you can write a simple shell script, you can make use of /dev/random just fine. But it's not hard to imagine people (most people) not doing that or knowing how to even start doing that.
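For what it's worth, a minimal sketch of such a script (the character set and the 24-character length are arbitrary choices of mine, not anything the parent specified):

```shell
#!/bin/sh
# Read raw bytes from the kernel RNG, keep only printable password-safe
# characters, and stop once we have 24 of them.
LC_ALL=C tr -dc 'A-Za-z0-9!@#%^&*' < /dev/urandom | head -c 24
echo
```

Unlike the raw dd output above, this pipes the RNG through tr so only typeable characters survive, which is the part most people wouldn't think to do on their own.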

Comment Re:Question is why? (Score 4, Insightful) 84

Exactly. It's not a question of "why would someone do that?", but one of, "What's the likelihood of some non-insignificant number of people using it for that purpose?" And it's not just people using it, because it's being used to orchestrate all kinds of things and will likely need to create passwords as a matter of course when doing its tasks.

It's obvious to me why these are the results, but LLMs fool most people into believing they're thinking on some level and likely being honest. Those people may ask, "why wouldn't it give me a random password if that's what I asked for?"

"Want" because you're already too lazy to read and write for yourself?

I'm kinda curious what the average person typically uses for password generation? (not counting the common case of making one up themselves)
Personally, I wrote a little shell script to do it.

Comment Re: "Profit" on one side of the scale... (Score 1) 64

Yup, it would be useful to some - Probably not useful for everyone. Instant translation of written text would be very helpful to lots of people in Europe with its zoo of languages.

Who modded this down?!? Instant translation of written text would be incredible for anyone visiting a foreign land!
