Comment Re:Facebook? (Score 1) 149

I don't know specifically about those apps. But many apps do it as a natural result of them being little more than web-apps running in web-view.

Yeah, I think Facebook tried that with their early iOS app (or at least they said it was "HTML5"), but I believe a few years ago they rewrote it to be "native." Not sure if they've gone back or what (or just hybridized), but both it and especially Messenger keep changing more than you'd expect for a typical app.
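For what it's worth, the "bundle in a web view" mechanism the parent describes is simple enough to sketch. All names and the manifest format below are made up; this is the general pattern, not any particular app's code: a thin native shell checks a server for a newer JS bundle at launch and points its web view at whatever it downloads, so the app's behavior changes without an App Store submission.

    import hashlib
    import json
    import os
    import urllib.request

    # Hypothetical manifest URL and cache dir -- just to sketch the pattern.
    MANIFEST_URL = "https://updates.example.com/bundle-manifest.json"
    BUNDLE_DIR = os.path.expanduser("~/.app-bundle-cache")

    def latest_bundle_path() -> str:
        """Fetch a newer JS bundle if the server advertises one; else reuse the cache."""
        with urllib.request.urlopen(MANIFEST_URL) as resp:
            manifest = json.load(resp)
        local = os.path.join(BUNDLE_DIR, manifest["version"] + ".js")
        if not os.path.exists(local):
            os.makedirs(BUNDLE_DIR, exist_ok=True)
            with urllib.request.urlopen(manifest["url"]) as resp:
                data = resp.read()
            # Check against the manifest's hash so a tampered bundle is rejected.
            if hashlib.sha256(data).hexdigest() != manifest["sha256"]:
                raise ValueError("bundle hash mismatch")
            with open(local, "wb") as f:
                f.write(data)
        return local  # the native shell points its web view at this file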

Comment Facebook? (Score 1) 149

The description of "hot code push" sounds like something Facebook and Messenger are doing on iOS. They both change the location of buttons (and occasionally some functionality)--like moving the Messages icon in the Facebook app to the top left and replacing it with a useless Marketplace icon--without needing to submit a new app, among other continual and usually annoying changes in Messenger itself. (Or at least the changes aren't obviously correlated with a new app version; they don't write real changelogs, instead offering generic "we continually update this app" nonsense, and the app continues to function like it did before...until one day it doesn't.)

I'm sure there are outright malicious uses of hot code push beyond the merely annoying ones certain apps seem fond of, but if the crackdown stops those too, I'll be happy enough.

Comment macOS browsers (Score 1) 205

In the last 24 months, Mozilla's Firefox -- the other major browser alternative to Chrome for macOS users...

Umm, Safari, anyone? I guess that's probably true if we ignore Safari, but that would be like ignoring IE on Windows. (This makes a bit more sense in the context of TFA, from which the editor and/or submitter carelessly plagiarized this paragraph, since the article as a whole is about Safari's recent 13% decline on macOS...but you'd think someone would have read this before posting.)

Comment Re:not useless, but not revolutionary. (Score 1) 139

it's just a move back towards windows 2000 gui rules.

you know, like input text boxes looking like input text boxes and buttons being distinctly buttons.

Really? I read the article and looked for additional information elsewhere (because the article is lacking), and it looks like it's taking the "Metro" ideas even further, with no window borders and no discrete title bars. Some vast spaces of solid color have disappeared, but only because they're apparently encouraging the use of photos instead (e.g., as the main "background" image in an app). This doesn't look better to me.

Comment Re:Surveillance culture (Score 1) 155

Are you this paranoid about your smartphone, too? You say it's different, but an iPhone has pretty good mics ("Hey, Siri" can hear me from across the room) and I'm sure there are comparable Androids. Sure, the phone won't listen unless you meet the conditions and enable the feature, but neither will Alexa--as far as we know about either, if we trust the manufacturer as you do for the phone. You can always unplug the Alexa device when you're not using it (or turn it off with a "smart plug" so you don't have to do it physically, assuming you trust those), but 99% of what I say at home is "turn off the lights" or a request to play Pandora, so it doesn't really bother me.

Comment Re:Manual? How old-school (Score 2) 78

>> Night Shift can be toggled on and off using the new Night Shift switch located in the Today section of the Notification Center.

Rather than have to manually turn it on/off, it seems like the much better approach would be to use a light sensor, or at least link it to the clock so it knows when it's day/night. I agree that it should be manually overrideable though.

Look, I know it's not cool to read the article, but...from TFA: "In [the preferences pane], users can schedule Night Shift to come on at sunset and turn off at sunrise or set a custom Night Shift schedule." The manual toggle is just one way you can activate it.

Comment Re:Why stop there? (Score 2) 134

Let the user pick a personalized name like they would for any child or pet.

Since it's only listening for specific "wake words," and that processing must be done on the device itself, I imagine it's easier for them to code a few specific wake words into the firmware (and perhaps not even possible to do much more; I'm not sure we know much about its hardware). Everything you speak afterwards (and, so they say, only that speech) is sent to AWS or whatnot, where there's a lot more processing power--which I imagine letting the user configure an arbitrary word would also require.
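As a toy illustration of that asymmetry (operating on already-decoded text for simplicity, and every name here is made up--the real device obviously works on raw audio): matching a handful of fixed wake words is a trivial constant-time check, while an arbitrary user-chosen word would need general speech recognition, hence the cloud.

    # All names hypothetical; a toy model of the on-device/cloud split.
    WAKE_WORDS = frozenset({"alexa", "echo", "computer"})  # fixed in firmware

    def wake_word_heard(decoded: str) -> bool:
        """Cheap on-device check against a tiny fixed vocabulary."""
        return decoded.strip().lower() in WAKE_WORDS

    def send_to_cloud(audio: bytes) -> None:
        """Stand-in for shipping the following speech off for full recognition."""
        print(f"uploading {len(audio)} bytes to the heavyweight recognizer")

    def handle_utterance(decoded: str, following_audio: bytes) -> None:
        if wake_word_heard(decoded):      # audio stays local until this succeeds
            send_to_cloud(following_audio)

    handle_utterance("Alexa", b"turn off the lights")  # -> uploads 19 bytes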

Comment Re:Specific to English? (Score 1) 128

I wonder how it performs on tonal languages like Cantonese.

I don't see any reason it shouldn't work. It encodes pitch (you really can't avoid that if you're encoding speech, which includes "voiced" sounds with a fundamental frequency), and some casual reading about how it encodes suggests that it captures more detail in the lower frequencies than in the higher ones, which also matches our (logarithmic) perception of frequency. That being said, the English sample I heard doesn't sound fantastic: think of a phone conversation in which /f/ is difficult to distinguish from /s/. I suspect that has to do with the high frequencies being either cut off or hard to distinguish in amplitude (/f/ is a bit weaker in general, and I think most of its noise is concentrated above the frequencies that survive a phone call--don't quote me on this). So the listener will have to do some work regardless of language, but there's nothing English-specific here.
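To make the "more detail at the low end" point concrete: the mel scale used in a lot of speech analysis (I don't know whether this particular codec uses it exactly, but the shape is the same) is roughly linear below 1 kHz and logarithmic above, so equal steps in Hz cover less and less perceptual ground as frequency rises:

    import math

    def hz_to_mel(f_hz: float) -> float:
        """Standard mel-scale formula: near-linear below ~1 kHz, logarithmic above."""
        return 2595.0 * math.log10(1.0 + f_hz / 700.0)

    # Equal 1 kHz bands shrink on the perceptual (mel) scale:
    for lo, hi in [(0, 1000), (1000, 2000), (3000, 4000), (7000, 8000)]:
        print(f"{lo:>4}-{hi:>4} Hz spans {hz_to_mel(hi) - hz_to_mel(lo):5.0f} mel")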

Comment Re:About letting us choose everything? (Score 1) 156

Why can't windows search subfolders while looking for drivers?

As of Windows 7 (maybe Vista--I wouldn't know), it can. There's a checkbox labeled "Include subfolders" right under the text field where you specify the search path. If they're Dell drivers, as you mention, you might want to make sure they're not just EXEs, which you'll need to extract so Windows can find the INF files it scours for matches.
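If you want to sanity-check what the search would actually see, a quick script like this (the path is just an example--wherever you extracted the packages) walks the tree the same way "Include subfolders" does and lists candidate INFs. If it prints nothing, you're probably still looking at unextracted EXEs:

    import os

    DRIVER_ROOT = r"C:\Dell\Drivers"  # example path, not a Dell convention

    # Recurse the way "Include subfolders" does, listing candidate INF files.
    for dirpath, _dirs, filenames in os.walk(DRIVER_ROOT):
        for name in filenames:
            if name.lower().endswith(".inf"):
                print(os.path.join(dirpath, name))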

Comment Re:I'm sure there's a reason... (Score 1) 192

That's funny since most printed text is printed at like 72dpi and nobody complains that printed text is pixelated or "unclear." The human eye isn't that good. What you are loving isn't resolution related -- it's the better backlight giving you better blacks than what you had on old 1080 monitors.

Uh, no, you're thinking of PPI on displays. Even cheap printers can usually handle at least 300x300 DPI. Most laser printers I've seen default to 600x600, and even models marketed for home use are often capable of 1200 or more in at least one dimension. Many inkjets can also increase their DPI for "high quality" or photo printing.

Let's take an average computer monitor, however--say a 22" monitor at 1680x1050. That works out to about 90 PPI (or DPI, as it's more often called here, even though some would argue that's not the correct term for displays). Early computer displays were often about 72 PPI, which is where your figure comes from (though Microsoft used 96 DPI for reasons beyond the scope of this explanation). Now Apple's "Retina" and other high-DPI displays are on the market for both desktops and mobile devices, with PPI often in the 200s or 300s--much closer to print.
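The arithmetic, if you want to check a display of your own, is just the diagonal pixel count divided by the diagonal size in inches:

    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Pixels per inch along the diagonal."""
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(1680, 1050, 22)))   # ~90: the 22" monitor above
    print(round(ppi(2560, 1440, 27)))   # ~109: a typical 27" QHD display
    print(round(ppi(5120, 2880, 27)))   # ~218: a 27" 5K "Retina" display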

The difference people perceive is most certainly related to pixel density and not to better backlighting. While a better backlight might help with increasing contrast, if you compare high-DPI vs. "regular" displays side-by-side, I think you'll find the difference clear--as if this even needs to be explained given that your description of printed text is egregiously incorrect. (For added fun, turn off font smoothing and compare--half the reason people think text looks decent at all on a 90-ish DPI monitor is font smoothing, usually in the form of subpixel rendering on LCDs.)

Comment Re:Oh God... (Score 1) 173

I recently was asked to set up an "out-of-office autoreply" on a friend's Outlook 2007 installation. Couldn't even find WHERE to do that on the Ribbon (although I did use it way back when it was new, too). Had to google for instructions...

Well, Outlook 2007 didn't have the ribbon interface in the main UI, so no wonder you couldn't find it there. ;)

Comment Re:Obsolete? (Score 2) 142

will become obsolete worldwide on the same date.

I have a 2006 MacMini. With iMovie '06 it's still the best front end to a Firewire camcorder I've found. The latest kdenlive dropped Firewire import.

For basic video editing it still works rather well. Transcoding is slow so I export everything in .dv and convert it on a faster machine.

Doesn't seem very obsolete to me.

That's fine--it's just not what Apple means when it declares a product "obsolete." That's the term they use for hardware for which they will no longer supply official parts for repair: generally products discontinued more than 7 years ago (or 5 for "vintage" products, which means almost the same thing, except that parts are still available in certain circumstances). In many cases their software/OS updates still support these machines, and you're obviously welcome to keep using them as long as you want in any case (though I'd personally avoid putting anything that isn't receiving updates on the Internet).

Comment Re:Convenience. (Score 1) 168

I sort of wanted one at work. I have a Thunderbolt (2.0) hub, that has thunderbolt in, with thunderbolt, gigabit ethernet, usb 3.0, audio, mini display port, and hdmi out. The hope was that one cable was all I would need to plug in whenever I dock my laptop at work, which has two monitors. Turns out that the only way to get two monitors fed from one thunderbolt cable is if one monitor takes thunderbolt directly. So while one thunderbolt cable can do one 4K monitor, it can't do two 1920 monitors. Oh well, at least it's only two cables I have to plug in.

This was disappointing for me as well, though I knew it before I bought my Thunderbolt dock. I'm pretty sure you could technically make it work with only one cable into the laptop by buying a second Thunderbolt dock and daisy-chaining it off the first, but I didn't want to spend twice as much as the already expensive dock cost. (Every once in a while I think about replacing my dual displays with a single larger one, 27" or more, but I'm still not sure about that.)
