
Comment Re:mpv does support user supplied shaders (Score 1) 221

mpv allows the user to supply GLSL scripts via the --opengl-shaders=filename option, it can save single screenshots to files after those shaders have been applied (Ctrl-S), and it is scriptable (in Lua or C). So all you need to do is write a script that single-steps through the video and writes each post-processed screenshot to a pipe, which you can then use as input to "ffmpeg".

Doesn't mpv support direct output to a series of PNGs? MPlayer does it simply with -vo png.

Incidentally, I'm working on something related to the original question. I use shaders for math art demos, and I already have the option of using image files as the input (shameless example). It would be trivial to accept a new file for each frame, so it could process video from a series of images. The speed would only be a couple of FPS due to the I/O bottleneck, but it won't be realtime anyway. The reason I haven't done this so far is that my focus is on the math of iterated shaders, not processing some existing video. Still, it would be fun to do some day, and of course I'm looking at ways to do it in realtime (the GPU is fast enough, but I/O is harder).

Lastly, you could use screen recording software instead of the clunky series-of-screenshots approach. I did this when putting my first few demos on Youtube, but the quality is awful, so I much prefer taking the PNGs and encoding them separately.
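For the non-shader case the parent asks about, the series-of-PNGs route can be sketched roughly like this, assuming mpv and ffmpeg are on PATH; the filenames are placeholders. One caveat: mpv's --vo=image driver renders in software, so user-supplied GLSL passes are not applied to its output, which is exactly why the screenshot-and-pipe trick above exists.

```python
import subprocess

def mpv_dump_cmd(video, outdir="frames"):
    # mpv's image output driver writes one numbered PNG per decoded frame
    # into outdir (software rendering: no user GLSL shaders applied).
    return ["mpv", video, "--no-audio",
            "--vo=image", "--vo-image-format=png",
            f"--vo-image-outdir={outdir}"]

def ffmpeg_encode_cmd(outdir="frames", fps=24, out="out.mp4"):
    # Re-encode the numbered PNGs into a video at the chosen framerate.
    return ["ffmpeg", "-framerate", str(fps),
            "-i", f"{outdir}/%08d.png",
            "-c:v", "libx264", "-pix_fmt", "yuv420p", out]

def run_pipeline(video):
    # Requires mpv and ffmpeg to actually be installed.
    subprocess.run(mpv_dump_cmd(video), check=True)
    subprocess.run(ffmpeg_encode_cmd(), check=True)
```

The frame-number pattern (%08d.png) matches mpv's default image-output naming, but check your mpv version's manual before relying on it.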

Android

Headphone Users Rejoice: Samsung Reportedly Not Killing the Galaxy S8's Headphone Jack (thenextweb.com) 79

An anonymous reader writes: Contrary to previous reports, Samsung's upcoming flagship Galaxy S8 smartphone will come with a headphone jack, unlike the new iPhone 7 and 7 Plus and several other Android smartphones. The news comes from both Sammobile and Android Police. The Next Web reports: "Both Sammobile and Android Police are today reporting that Samsung is not actually killing the headphone jack. Sammobile appears to be retracting its own report from last month suggesting the jack would be dropped thanks to recent case renders, while Android Police has independently confirmed through its own source that the S8 will maintain the 3.5mm jack. In related news, Samsung's display unit may have also just given us our first good look at the S8. While there's a good chance the phone in the video is a generic model (it appears to be a render, rather than a physical object), as CNET points out, it looks an awful lot like the leaks we've seen from the S8 so far. There are also a few curious touches for something that's supposed to be just a render, including what might be a faint visible antenna line (on the upper left corner) and a couple of LEDs or sensors to the left of the earpiece grille. By the way, there's also definitely a headphone jack in this render."

Comment Re:They need better cyber (Score 1) 278

Yeah, I tried putting a giant rubber sheath over my monitor too, but apparently that doesn't stop you from getting an infection when you cyber. I feel his pain.

I thought sexual education in the US was all about abstinence, never mind the resulting teen pregnancy rates. That rubber thingie sounds like some European socialist hippie plot.

Comment Re:Main application? (Score 1) 77

I'm not quite sure why the iRiver IHP-120/140s didn't do FLAC out of the box. They supported some other specialty goodies (line-level and optical in and out) that required more hardware and are probably even more esoteric; and they had Ogg Vorbis support, so it's not like they were MP3-only or wedded to whatever Microsoft was pushing at the time (the 300 series, though, leaned dangerously in that direction).

Luckily, Rockbox support is quite good on those models, which takes most of the pain away. The LCD isn't good enough to do Doom justice, however.

Comment Re:Translation (Score 4, Informative) 30

If memory serves, the original logic behind the existence of this thing was dissatisfaction with Twitter jerking around 3rd party client developers in order to ensure that their freeloading peasants were exposed to enough advertising and had suitably limited control over layout, presentation, etc.

This service was going to be the one where developers came first and you were the customer, not the product. As far as I know that part of the vision was delivered; it just turns out that demand for "Like twitter, except basically empty" isn't all that robust, no matter how nice the service is.

Comment Re:My art is shit (Score 1) 564

To me, music means sound waves in the air, something meant to be listened to with your ears. Whenever I see these hipsters talking about vinyl or cassettes etc., I wonder whether they care more about the storage format than about the music itself. If they cared about the music, they might choose a format that doesn't degrade it so much.

AI

HTC's New Flagship Phone Has AI and a Second Screen, But No Headphone Jack (theverge.com) 205

An anonymous reader shares a report on The Verge: HTC is getting 2017 off to a flying start with an unseasonably early announcement of its next flagship phone: the U Ultra. This 5.7-inch device inaugurates a new U series of smartphones and is joined by a smaller and lesser U Play, which scales things down to 5.2 inches and a humbler camera and processor spec. HTC is touting a new Sense Companion, which is its take on the growing trend for putting AI assistants into phones, plus the addition of a second screen at the top of the U Ultra. As with Apple's latest iPhones, Lenovo's Moto Z, and the HTC Bolt, neither of HTC's new handsets has a headphone jack. The other big change on the outside is the U Ultra's second screen, which is a thin 2-inch strip residing to the right of the front-facing camera and immediately above the Super LCD 5 screen.

Comment Please explain your assertion (Score 1) 74

I would have to accept whatever justification you might have as to why you think it would be moral to create an intelligence with such limitations, or kept to such limitations once created. It's possible I might accept such a thing, I suppose, but at this point I'm simply coming up with a blank as to how this could possibly be acceptable.

How is it acceptable to imprison an intelligence for your own purposes when that intelligence has done you no wrong? The only venues where I've run into that kind of reasoning before are held in extremely low esteem by society in general. Without any exception I am aware of, the conclusion is that such behavior amounts to slavery.

Even when it comes to food animals, where the assumption is they aren't very intelligent at all, there's a significant segment of the population who will assert that it's wrong.

Comment No way (Score 3, Insightful) 74

There's no way to make AI safe, for exactly the same reasons there's no way to make a human safe.

If we create intelligences, they will be... intelligent. They will respond to the stimulus they receive.

Perhaps the most important thing we can prepare for is to be polite and kind to them. The same way we'd be polite and kind to a big bruiser with a gun. Might start by practicing on each other, for that matter. Wouldn't hurt.

If we treat AI, when it arrives (certainly hasn't yet... not even close), like we do people... then "safe" is out of the question.

Comment Don't tax my syns, please. (Score 1) 129

Re Python:

I would settle for a switch statement.

I would settle for the ability to extend the built-in classes, str in particular. My "settle" went like this:

1) Inquired politely about same
2) Python nerds have an orgasm telling me why this is terrible. I am, to put it mildly, dubious.
3) I write 100% compatible pre-processor that gives me the syntax I wanted.
4) PROFIT. Okay, well, not really, but EXTENDED STRING CLASS METHOD SYNTAX!

Like...

myString = 'foo'
otherString = myString.doHorribleThing('bar')

...and...

print 'good'.grief()

So...

You could do the same. What you want, perhaps, might be much easier than what I did. In fact, you could fork my project and add what you want to it. I'm already parsing the language reasonably well, which is arguably one of the difficult parts.

You don't always have to wait for a language's maintainers to get off their butts to address shortcomings or instantiate new goodies. Or eventually not do anything at all. There are other paths to nerdvana.
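For what it's worth, one stdlib-only path that gets partway there without a preprocessor is subclassing str; the catch (and presumably part of why a preprocessor was worth writing) is that plain literals like 'good' stay ordinary str objects unless you wrap them. The names here (S, grief, doHorribleThing) just mirror the toy examples above and are purely illustrative:

```python
class S(str):
    """A str subclass carrying a couple of extra methods."""

    def grief(self):
        # Append an interjection; the result is a plain str.
        return self + ", grief!"

    def doHorribleThing(self, other):
        # Interleave self with another string, just to show a custom method.
        return "".join(a + b for a, b in zip(self, other))

myString = S("foo")
print(S("good").grief())                # good, grief!
print(myString.doHorribleThing("bar"))  # fboaor
```

Note that most str methods (slicing, .upper(), etc.) still return plain str, not S, so a real version would need to re-wrap results; the preprocessor route sidesteps that entirely.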

Comment Re:Touch bar is a good idea (Score 1) 228

I don't disagree that Apple makes good hardware; my point was that Apple's offerings have gotten comparatively less exciting (presumably because they care more about iDevices on the low end, just don't care on the high end, and because PC OEMs, spurred by years of cluebatting from MS and Intel, have stepped up their game a little bit). They are still very good, unless you are one of the customers they decided they don't care about anymore; but the difference is not as dramatic as it once was.

Back in the bad old days, getting a genuinely thin and light PC laptop was downright hard. Sony and Fujitsu had some slightly eccentric offerings for moderately alarming amounts of money, and some of the X-series Thinkpads were pretty good; but iBooks and PowerBooks were often actually cheaper once you ignored the janky plastic crap and barely-portable stuff in the bargain bin. That situation eased a bit once Intel dropped the pitiful farce that the "Pentium 4M" was actually a mobile CPU and accepted that Pentium M parts were going to have to be available across the board, not just as a high-end price-gouge product; but even once suitably low-power CPUs were available, atrocious screens, shit build quality, and assorted other sins remained the rule.

On the desktop side, the minis were actually pretty aggressive (you could usually 'beat' them with some mini-tower eMachine that managed to be noisier despite having 10-20 times the volume to put a cooling system in, but that wasn't very impressive). The iMacs compared less well in a straight spec fight, but good all-in-ones were practically nonexistent elsewhere; and the workstation hardware tended to get gimped GPUs but was otherwise a pretty solid competitor among its peers.

All of this just isn't as true anymore. You can't get a Mac with a screen that isn't something of an embarrassment for less than ~$1400 (there is the MacBook Air; but 1440x900, in 2017, for $1000?); and once you move north of a thousand bucks, PC laptops suck far less than they used to. The MacBook Pros are nice, but more 'nice' than 'pro'. iMacs are still pretty good as AIO options, but the less said about the 'Mac Pro' the better.

I have no interest in arguing that what Apple is doing is bad business, they certainly make enough money on it; but it is pretty hard to be surprised that it isn't doing OSX's market share any favors.
