I hated it because I could scarcely read what he was doing when I sat next to him due to the viewing angle.
If I ever build a killbot, it will be activated by the phrase "confusing to users."
Then we'll see "self pix" like remote TV reporting. No need for a camera person to tag along and no need for a remote van with that tall transmitter tower that can get mixed up with the electrical wires overhead.
This has already happened. Around here (one of the top five markets, and on the national 24/7 news channels, for that matter), they are constantly airing footage shot by viewers. Sure the quality is bad (technical issues like exposure, rolling shutter, etc), the composition is bad (not shot by someone who knows how to frame a shot or tell a story with video), and the overall experience is bad, but people love to see their name and video on TV so they'll even give it away for free. The station gets video for free that they otherwise wouldn't have access to, so they're thrilled as well. It's win-win or lose-lose, depending on your perspective.
Many stations send out one person, a reporter/camera person combo rather than the traditional two-person camera operator and reporter team. I'm sure it's a bit awkward holding the mic and camera while asking questions, but it's a significant savings (at the expense of, in my opinion, compromising content, something another poster mentioned from Thom Hogan's article about this). Similarly, a lot of newscasts are heavily automated. This leads to a lack of flexibility and occasional problems with on-air content, but stations have generally decided that those compromises are worth it in exchange for downsizing.
I can't say they're completely wrong; those are business decisions to save money at a time when there's little money going around anyway, but it's also cheapening the product and putting out substandard quality. I believe that content is king and that at some point, people will turn to the news that has actual reporters reporting alongside compelling, quality video. But that's just me, and the past five or so years have worn hard on my theory.
Note that I work in TV, regularly edit material that airs nationally, worry about what my job will look like in ten years, but I don't do news.
Another problem with using "plus addressing" as I describe above is that I have come across legitimate companies who use a website for unsubscribe requests, but their website will not process the address I used.
Yeah, it's actually worse than that. There are legitimate companies that can't send mail at all to an address containing a plus sign. It's all bad (lazy? ignorant??) programming and doesn't conform to the standards, but there isn't a thing I can do about it. If I want to get mail from certain companies, I can't use the plus notation (most recently it was a small local computer shop of all things). Frustrating, but I've given up on fighting about it.
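For what it's worth, the bug is almost always a naive validation regex. Here's a quick sketch (hypothetical patterns, not any particular company's code) of how a lazy check rejects a perfectly legal address while a slightly less lazy one accepts it:

```python
import re

# A naive pattern of the kind seen in the wild: only letters, digits, and
# dots allowed before the @. It wrongly rejects plus addressing, which the
# email RFCs explicitly permit in the local part.
naive = re.compile(r"^[A-Za-z0-9.]+@[A-Za-z0-9.-]+$")

# A still-simplified pattern that at least permits '+' in the local part.
permissive = re.compile(r"^[A-Za-z0-9.+_-]+@[A-Za-z0-9.-]+$")

addr = "me+shopname@example.com"
print(bool(naive.match(addr)))       # False: the buggy check rejects it
print(bool(permissive.match(addr)))  # True: the fixed check accepts it
```

Neither regex is a full address validator (that's genuinely hard); the point is just that the plus sign is standards-legal and the rejection is entirely on the validator's side.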
I think a lot of folks here are missing the point. The trouble is that the kernel running in secure boot mode has to be able to receive signed keys in a secure way (if you think secure boot is worth anything, many do not).
Linux running in secure boot mode is a done deal. The question is how do you import keys that are signed by Microsoft. In an ideal world you'd just upload the signed X.509 cert and you'd be done. Unfortunately, Microsoft will only sign PE binaries.
So the developers opted to enclose the X.509 cert in a PE binary. Unfortunately, that means the kernel needs to be able to read the PE binary and verify the signature all in kernel space, then extract the X.509 cert. This is undeniably messy.
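To be clear what the mess is: the thing everyone actually wants is the cert, but because of the PE wrapper, the consumer has to understand a whole container format first just to fish the cert back out. This toy Python sketch (NOT the real PE format or the kernel's parser; the container here is invented for illustration) shows the extra indirection:

```python
import hashlib

# Invented toy container -- stands in for the PE wrapper in this discussion.
MAGIC = b"TOYPE\x00"

def wrap_in_container(cert_der: bytes) -> bytes:
    """Embed a DER certificate in a trivial length-prefixed container."""
    return MAGIC + len(cert_der).to_bytes(4, "big") + cert_der

def extract_from_container(blob: bytes) -> bytes:
    """The consumer must parse the container before it can even look
    at the certificate it actually wanted."""
    assert blob.startswith(MAGIC), "not a container we recognize"
    length = int.from_bytes(blob[len(MAGIC):len(MAGIC) + 4], "big")
    start = len(MAGIC) + 4
    return blob[start:start + length]

fake_cert = b"\x30\x82\x01\x00" + b"..."  # pretend DER bytes
blob = wrap_in_container(fake_cert)
recovered = extract_from_container(blob)
print(recovered == fake_cert)                      # True: round-trips fine
print(hashlib.sha256(recovered).hexdigest()[:16])  # fingerprint of the cert
```

In an ideal world the first function wouldn't exist at all and you'd hand over `fake_cert` directly; the wrapping only exists because of what the signing authority will and won't sign.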
Now lots of folks will argue that there's no point to this and it should be done in user space. I'm not going to argue with that, but the reality is that most of the mechanics of this are already implemented, just not the PE stuff. You can sign kernel modules and verify them in kernel space with X.509 certs (at least by my reading of the thread).
Frankly, I think this is pretty much the only thing to do short of talking MS into signing X.509 certs. The other suggested work-arounds involve additional authorities or doing stuff in user space. They are all workable, but are pretty clumsy compared to what's being proposed.
I think it may have been a mistake to just drop this ugly change on Linus without his involvement. My guess is that if the problem had been stated before coming up with a proposed implementation, they might have come up with essentially the same solution with less drama.
This seems like as good a place as any for me to relate my experience.
I had a Garmin GPS that was a few years old (perhaps a year or two out of warranty) and it was time for a map update. The Garmin-documented procedure involved running their "WebUpdater" to determine which software to order (particularly important because my device didn't have enough memory to store the brand new top-of-the-line map update and they offered a couple of different versions). I should have just ordered the map update CD from Amazon and skipped reading the official documentation...but since I wanted to make sure I did it right, I followed their process...and ended up with a bricked GPS. Most of their support team was worthless, but one guy really did try to send me a fix. Unfortunately, it didn't work. So there I was, ready to hand over around $100 for a map upgrade, possibly another $200 for a second GPS for Mrs. ibennetch, and instead gave up on Garmin entirely. Oh, and to top off the joy, their phone queue was at least 90 minutes long before reaching a real person. Ugh.
I think I prefer going backwards to what I once saw on (what I remember as) a Microsoft Office installation, probably 15 years ago. The progress bar ever so slowly crept upwards...98, 99, 100% done...then 101, and so on. It finally locked up somewhere after 140%.
And by the way - GSM easily reaches 35k feet (11 km) - if there are no obstructions - you know - like in the AIR. We use a ferry to travel from Tallinn (Estonia) to Helsinki (Finland) and only right in the middle of this ~80 km journey is there no cell reception from either shore. I would extrapolate that at least 30 km (3 times the height of commercial air traffic) is easily doable.
Cell phone reception only sucks if you have buildings or plants in the way. Or a mountain.
Just because you have good reception straight out from the tower at 30km doesn't mean you'll have good service 11km in the air. The cell phone towers are tuned to have good horizontal coverage, not vertically. It's not a perfect sphere of coverage; these are directional arrays that are designed to provide coverage where most of their customers are...on the ground. The first link that came up in my Google search seems to have some more information about coverage patterns.
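A quick back-of-the-envelope shows why the direction matters as much as the distance. Assuming roughly 11 km cruise altitude and the 30 km tower distance mentioned above (the antenna figures in the comment are typical ballpark values, not from this thread):

```python
import math

altitude_km = 11.0   # rough commercial cruise altitude
distance_km = 30.0   # horizontal distance to the tower (the 30 km figure above)

# Elevation angle from the tower up to the plane.
elevation_deg = math.degrees(math.atan2(altitude_km, distance_km))
print(f"elevation angle: {elevation_deg:.1f} degrees")

# Sector antennas are usually aimed at or slightly below the horizon (a few
# degrees of downtilt, with vertical beamwidth often under ~10 degrees --
# rough typical figures). A plane sitting ~20 degrees above the horizon is
# far outside the main lobe even when the straight-line distance is fine.
```

So the 30 km of over-water coverage on the ferry doesn't transfer to 11 km straight up; it's the same radio but a very different slice of the antenna pattern.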
The one that gets me is from my last trip: when I opted out, the lady was trying to convince me to go through the scanner. "Why not go through? There's no radiation from these machines," she says. I was so floored I couldn't even reply.
A TSA agent a few weeks ago told me they're sound waves. I have to question the science portion of their training program...
And yeah, I've had a few agents try to argue with me or try to convince me it's safe as well. I'm not really interested in explaining myself or arguing with them.
I've got to correct myself here -- I got an email from the gentleman who provides the X-Mo systems and he informs me that they're indeed running extremely high framerates (4000 and 3000) for this particular show. I've worked with and met him before, and I'm pretty pleased that he spent some time in the comments of this article about his systems. I'm not sure how they're doing it, but if anyone could figure it out, he's the guy. Anyway, I can admit that I was wrong about the framerates -- he tells me I am, so there you go. He's actually replied as an AC in a sibling thread, so check that out for more information!
You're right on. I'm a tape guy and I've used all of the X-Mo systems; you're absolutely right that the lighting and camera noise affect the framerate. If you're able to pull the 600-1000fps you mention for night games, your ballpark is much better lit than ours. Adjusting for minimal flicker (grrrr), we usually run 350fps for night games, actually very similar for basketball and hockey (360 helps the flicker there a bit more). Besides, more than that and it becomes very difficult to tell a story around that one replay and the tape guy has to be cued really tight to the moment of action. Great for a freeze on a safe/out play, not so great for an outfield bobble. But you know all that already.
Look at the clips -- it looks to me like they're running a framerate around 500 and the clips are slowed down more (either by the EVS on playback or by the professor for posting). They all stutter quite a bit and I'd be shocked if it was near 1000fps, much less the 5000 he claims (note that the link he sources for the 5k fps comment says "up to" while he states that it is 5k).
I don't have much to add, but you've packed a lot of information into your post and I support what you said.
I'm surprised this is even news, actually; with many or most RSNs having an X-Mo system now, it seems most baseball fans would have seen something like this already. I know we talked about the way the bat wobbles, the bat breaks, the sweet spot, etc. years ago.
You're close, except:
A widely used web package has a backdoor inserted.
is mostly incorrect. It wasn't the phpMyAdmin project, and it wasn't that the source code was compromised; the problem is that one specific mirror was compromised and a modified copy of the phpMyAdmin source was distributed instead of the official files. It's a stretch to blame that on the phpMyAdmin project.
So, what kind of security/procedure/audit could have been in place, needs to be in place, so that something like this will raise an alarm even when the hacker isn't the most incompetent backdoor author in history? What kind of audit is needed to be sure it hasn't already happened?
I'm thinking of some sort of mathematical function where you plug in an arbitrary number of bits, say an entire file, and get out a small representation that is very, very difficult to duplicate. We could call it a hash, I suppose. Then you post the hash to the main web page for each file you distribute, and when someone downloads a file, they compare the hash of the downloaded file to the hash on the web site. Since the mirrors only host the downloads, and not the website, a compromised mirror wouldn't be able to change the website's hash. I need to find a patent attorney, I could make millions on this idea!
(please note that paragraph probably sounds more snarky than it was intended)
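Snark aside, that's exactly what published sha256 checksums are for. A sketch of the verification step (the file name and contents here are made up for the demo; in real life the published hash comes from the project's own site, not the mirror):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large downloads don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in for the mirror download (contents invented for the demo).
data = b"pretend tarball contents\n"
with open("download.tar.gz", "wb") as f:
    f.write(data)

published = hashlib.sha256(data).hexdigest()  # what the project site would list
actual = sha256_of("download.tar.gz")         # what we compute on the download
print("OK" if actual == published else "COMPROMISED")
```

The key property is the one in the paragraph above: the mirror serves the file but never the hash, so swapping the tarball changes `actual` while `published` stays put.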
What we're missing is force fields. I think that's how holodecks are supposed to work - holograms bordered by force fields.
It's supposed to be a "mix" of force fields, holographs, and actual energy-to-matter conversion, IIRC. Perhaps holographs/force fields to simulate distance and open spaces, with actual matter for the close-up stuff. So a holographic person is like a computer-controlled meat puppet.
Yes, that's the general idea, based on my rather intense reading of the Star Trek: The Next Generation Technical Manual in my younger years. That being said, the reality of how it was presented in the show was different from the theory presented in the Manual, even though it was supposed to be the guide for writers to reference. It's just one of those things where we have to accept slightly different rules in each episode. By the way, I believe the technical manual specifically mentioned that at the end of a holodeck session, the replicated material would be turned back into energy, essentially reversing the materialization process.