Comment Re:No they aren't being too pessimistic... (Score 4, Insightful) 86

If Valve can somehow get into console land with Steam Machines, you can expect PC gaming to ultimately take over. Not that I'm saying it will, but if he finds some way to crack the console market, it's a possibility.

All the consoles are basically rebadged PCs with some customization; that's all they are at this point.

No, Steam machines have a fundamental problem - they suck.

First off, the problem with PC gaming is piracy. Face it - 90% piracy has led to developers targeting consoles. And it's still that high despite Steam (no-Steam hacks are plentiful, and it's why Steam still has support for 3rd-party DRM).

So the PC will remain a secondary market for AAA devs and the playground of indies. AAA devs will do console first, make back the big bucks, then do a half-assed port to PC as always. It might be a bit easier to take your Xbone game and run it on Windows 10, but you still have a port. Basically the devs will make their big bucks on the console, then when sales taper off, they'll release the PC version and hope to sell enough to pay for the port. Any extra is icing.

This is only broken by games that DO sell well on the PC, where effective DRM is possible - i.e., games where online is a major component. So your Call of Duty or Battlefield will have day-1 ports because there is a sizable PC contingent who will buy it on day one at full price, and whose serial numbers are easily verified by servers. Plus, PC users help bring it to the point of "1 billion copies sold on day one!" type PR announcements. (There are also other valid reasons for releasing on PC, since keyboard+mouse rules the FPS world.)

But for other games... not so much. Couple that with the penchant for Steam sales and, well, you're hoping to make it up in volume. Hell, I won't buy a PC game unless it hits $5 on sale, except in VERY rare circumstances. It's a race to the bottom, and if you want your PC game to be $70, it's got to have a big customer base who will pay full price. If not, they're going to wait for a Steam sale, so better to sell on consoles for $60-70 first, release on PC 3 months later at $40, then a month later discount it to $20 for a Steam sale and let that be the PC release. Then 6 months later discount it to $5 and pick up the remainder as profit, hopefully.

Steam Machines? No, they're not taking over, unless you can guarantee me a $500 machine will last 10 years with zero upgrades. And seeing the initial batch, the $500 machines are... underwhelming. The good machines are $1200+, and even then you can get a console, get the "plus" (PS+, XBL Gold) services for $50 a year for 10 years, and still be ahead of a Steam Machine.

Or you can pop in a new $200 video card every couple of years and consoles will come out ahead.

Or we're gonna have to put up with an i3 with midrange discrete GPUs for the next 10 years as the "it must run on this configuration" system. Just like how we complained the PS3 and Xbox360 were holding back gaming... 4 years ago.

Comment Re:Not credit... so your account stays drained (Score 3, Interesting) 95

They won't do so because of the CC fees that are involved on a per transaction basis.

Debit also incurs interchange fees. Typically 25 cents plus 25 cents plus 1% (the merchant pays 25 cents, the user pays 25 cents, and the merchant then pays another 1% of the transaction as fees).

The only reason I knew about the debit fees to customers was a retailer who was super honest and kept refunding people who used debit a quarter. He said his bank charges him 25 cents, the user gets another 25 cents tacked onto the amount, and there's also a small percentage taken as well. Not as much as credit, but still. He decided long ago he'd eat all the fees, so he paid everyone who paid in debit a quarter out of the till.
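The fee split described above works out like this - a minimal sketch using the figures quoted by the retailer (real interchange rates vary by bank and network):

```python
def debit_fees(amount, flat_merchant=0.25, flat_user=0.25, pct_merchant=0.01):
    """Split the debit interchange fees between merchant and customer.

    Figures are the ones the retailer quoted, not official rates.
    """
    merchant_fee = flat_merchant + amount * pct_merchant  # flat fee plus 1% cut
    user_fee = flat_user                                  # tacked onto the amount
    return merchant_fee, user_fee

# On a $40 purchase: the merchant pays $0.65, the customer $0.25.
merchant, user = debit_fees(40.00)
print(f"merchant: ${merchant:.2f}, customer: ${user:.2f}")
```

The quarter-per-transaction refund covers the customer's flat fee exactly; the percentage still comes out of the merchant's side.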

Comment Re:Microsoft Spartan? (Score 1) 317

Isn't this how the XBox became the XBox? They released the code name of their internal project, people kept using the name, and then they just stuck with it?

On the one hand "Microsoft Spartan" doesn't seem corporate enough. On the other hand it'll fit right in with Firefox & Chrome, which also have non-descriptive names that are pan-inoffensive yet interesting...

Well, the Xbox was really internally called the DirectX Box...

Though, you'd have to wonder if maybe Nadella is secretly a Halo fan or something. I mean, it was odd enough to have Cortana. Sure, video gamers know who she is, and it kinda-sorta makes sense, but you'd think some sort of corporate self-censorship would've made it a nice bland name.

Then there's Spartan... of the many different names you could use to call a web browser, even one using the same Trident engine IE used...

(Project SPARTAN is/was the secret project behind supersoldiers like Master Chief).

I wonder what's next - what might Microsoft call, say, the Warthog?

Comment Re:Know what's worse? Cleartext. (Score 1) 132

Weak encryption is infinitely WORSE than none.

The illusion of security is more likely to cause people to divulge information that they wouldn't in plaintext.

I remember when the export key laws were in place. Once the regulations were changed doing away with them, software and equipment should have been required to remove the obsolete code or be taken off the market.

My question is how could OpenSSL still have had this potential backdoor? Why was this not removed at first opportunity?

Yes, bad encryption is worse than none.

It's why Facebook has "privacy controls" - it's purely a marketing thing. By making people think their information is safe, they're going to divulge more of it.

As for why OpenSSL did it - most likely it's not OpenSSL's fault. I'd almost guarantee what happened is that first-time startup took so long, some guy said "make it faster" (it can take a couple of minutes to generate the keys the first time on a slower embedded platform). So to "make it happen," they simply pre-generated the keys and embedded them in the firmware.
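That shortcut can be sketched as a hypothetical firmware build, with `secrets.token_hex` standing in for real (slow) key generation, showing why baking a pre-generated key into the image is effectively a backdoor:

```python
import secrets

# BUILD TIME (vendor's machine, runs once): generate a key and bake it into
# the firmware image so first boot doesn't spend minutes on key generation.
# token_hex stands in for real RSA/EC key generation here.
EMBEDDED_KEY = secrets.token_hex(32)

def first_boot():
    # What SHOULD happen: generate a fresh key pair per device on first boot.
    # What the shortcut does: reuse the key shipped inside the firmware.
    return EMBEDDED_KEY

# Every device flashed with this image holds the same "secret" key, so
# extracting it from any one unit compromises the entire product line.
device_a = first_boot()
device_b = first_boot()
assert device_a == device_b
```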

Comment Re:i don't get it..... (Score 2) 82

binaural = stereo
3d audio = surround sound (5.1/7.1/8.1/etc)

both have been around forever.

As has speaker virtualization.

This is basically another form of speaker virtualization - the ability to simulate a surround sound system using headphones. They do work (since you only have two ears, they just have to reproduce how your ear hears each speaker), and they do keep you from having the "inside your head" feeling you get with stereo sound played on normal headphones.

However, it's a bit more flexible in that you don't just have a virtual surround setup playing discrete channels of audio. Instead, it lets you position sounds, then simulates how each would sound if it were actually happening at that location and plays that modified audio to your headphones, so the object appears to be where it was placed.

Most virtualization systems (Dolby Headphone, DTS Headphone, Virtuaphone (Sony proprietary) and others) use HRTF (head-related transfer function) methods to compute how audio that's heard at some location in space will be heard by someone so it can be downmixed into two channel audio.

This is but yet another one. And I'm sure not the last - a few game engines also have the ability to compute 3D audio.

The other approach works fine if the objects sit at discrete locations - a few virtualization systems use convolution filters, with impulse responses measured from sources at those discrete locations as they're received by microphones.
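An HRTF-style virtualizer of the kind described above can be sketched in a few lines of NumPy - the impulse responses here are toy values, not measured HRIRs:

```python
import numpy as np

def virtualize(mono, hrir_left, hrir_right):
    """Position a mono source by convolving it with a left/right
    head-related impulse response (HRIR) pair, producing 2-channel audio."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    n = max(len(left), len(right))            # pad both ears to a common length
    left = np.pad(left, (0, n - len(left)))
    right = np.pad(right, (0, n - len(right)))
    return np.stack([left, right])

# Toy HRIRs for a source off to the listener's right: the left ear hears it
# later (propagation delay around the head) and quieter (head shadow).
hrir_right = np.array([1.0, 0.3])
hrir_left = np.array([0.0, 0.0, 0.6, 0.2])

stereo = virtualize(np.array([1.0, 0.0, 0.0]), hrir_left, hrir_right)
```

Real systems use banks of measured HRIRs (or parametric HRTF models) per direction, but the core operation is exactly this convolution.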

Comment Re:HTTPS? (Score 1) 93

How do you obtain the expected signatures to match against the package if not by HTTPS or similar?

As you point out, if Verisign is happy to oblige, then the signatures can be altered in-transit as easily as the packages here, and there'd be no warning.

It only takes one Apple-signed developer package in the hands of someone with this kind of access to fake the origin and authenticity of the package signature AND package if the connection itself isn't secure.

The packages are signed by Apple using a key provided by Apple. That's what Apple's trusted developer program is about.

As for your solution of providing an alternate signed binary? That was always true even in the normal case. OS X accepts two kinds of apps by default - Mac App Store apps and apps signed by a developer using a certificate provided by Apple. The latter is for those programs that aren't in the MAS for one reason or another (perhaps they were rejected, or the developer doesn't want to go through the process - whatever).

But that "vulnerability" of being able to use it to sign malware has been there from day 1 since Apple doesn't control what developers use their certificate for. In fact, there has been at least one malware attack that used a legitimate developer certificate - Apple revoked it the next day which meant the malware couldn't run anymore. (No word if the developer got a warning about it - since you paid $99 for the certificate, Apple has payment information that tracks every developer who signed anything).

A more Linux example would be someone who replaced a package in a repo with a modified one and got it all re-signed. There's no real way around it - the files are legitimately signed so as far as the computer is concerned, it's all kosher.

It isn't a TOCTOU (time-of-check to time-of-use) problem; it's basically replacing one signed binary with another. About the only thing an installer could do is check that the files it installs are signed with the same certificate - otherwise, well, if you wanted to, you could pay Apple $99 and sign your malware with it.
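That same-certificate check could look something like this sketch, with raw bytes standing in for parsed certificates (a real installer would compare fingerprints extracted from the actual signing chain):

```python
import hashlib

def fingerprint(cert_bytes):
    # Identify a signing certificate by its SHA-256 fingerprint, the way
    # code-signing tools display certificate identities.
    return hashlib.sha256(cert_bytes).hexdigest()

def installer_check(installer_cert, payload_certs):
    """Refuse to install any file signed by a different certificate
    than the one the installer itself was signed with."""
    want = fingerprint(installer_cert)
    return all(fingerprint(c) == want for c in payload_certs)

dev_cert = b"legitimate developer certificate bytes"
attacker_cert = b"attacker's (also Apple-issued) certificate bytes"

assert installer_check(dev_cert, [dev_cert, dev_cert])           # all match
assert not installer_check(dev_cert, [dev_cert, attacker_cert])  # one swapped
```

Note this only defends against mixing certificates within one install; it does nothing if the whole package was re-signed by the attacker, which is the scenario the comment describes.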

Comment Re:Aren't these already compromised cards? (Score 0) 269

ApplePay is part of the problem. Because it tries so hard to keep information away from banks and retailers it makes it harder to detect fraud. If Apple were providing things like names and phone numbers to the banks they could very easily see that a particular CC was not being used by the authorized owner or on a phone they had never used it with before.

To be fair, banks could have demanded that information during sign up, but didn't. There is plenty of blame to go around. What I'd like to know is who pays for it. Usually it is the merchant, in which case I'd expect to see some of them refusing Apple Pay.

Actually, Apple DOES provide some information to the banks. They provide information they know about the user - limited iTunes account information and how long that account has existed IS passed to the bank. (Presumably, if the card is being associated with a new iTunes account, the bank should be more careful.) Stuff like names isn't important (because when you buy a credit card number you get a name, so it's trivial to make a new iTunes account with that name). Presumably, the method by which the number was entered is also passed on (you can take a photo of your credit card to add it - presumably that's a bit more secure than if you merely typed it in... I don't know if the photo itself is sent to the bank).

And yes, banks respond back to Apple with whether it's completely accepted (green path), rejected, or further verification required.
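The green/reject/verify decision could be sketched as a toy risk score - the signals mirror the description above, but every threshold and weight here is invented:

```python
def provisioning_path(account_age_days, card_new_to_account, entry_method):
    """Return the bank's hypothetical response path for a card-provisioning
    request. Signals are from the description above; thresholds are invented."""
    risk = 0
    if account_age_days < 30:    # brand-new iTunes account: suspicious
        risk += 2
    if card_new_to_account:      # card never used with this account before
        risk += 1
    if entry_method == "typed":  # typed in, rather than photographed
        risk += 1
    if risk == 0:
        return "green"    # accept outright
    if risk <= 2:
        return "yellow"   # further verification required
    return "red"          # reject

# An old account re-adding a known card sails through; a fresh account
# typing in an unfamiliar number gets rejected.
print(provisioning_path(3650, False, "camera"))  # green
print(provisioning_path(10, True, "typed"))      # red
```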

As for who is liable, that's an interesting question because in the end, the merchant doesn't really know how the transaction was done - it appears to them as a regular credit card transaction. Only the bank knows when they look up the token that the transaction was actually done by Apple Pay.

Comment Re:wait what? (Score 1) 416

The EPA can worry about the environment; leave NASA to what NASA is supposed to do: the National Aeronautics and Space Administration. Not the climate-change administration, not the Muslim-outreach administration, but the National Aeronautics and Space Administration.

Please give NASA more money, but make sure it is used for space exploration as intended. I don't see why this is getting so much heat

Sorry, but space is NOT NASA's only mandate.

NASA's mandate is right there in the name: National AERONAUTICS and Space Administration. In fact, NASA's "space" mandate only came in 1958, when the agency was formed out of NACA, the National Advisory Committee for Aeronautics.

NASA's primary goal is actually about aviation. Space was merely tacked on because as a primarily science-based organization, they had the ability to extend their research beyond the atmosphere.

If you think climate change has nothing to do with aviation, you're wrong. Weather is an extremely important factor in aviation, and long-term changes in weather (i.e., climate) will have effects on the safety and conduct of flights. Maybe not now, but in the longer term. It's why NASA does extensive climate studies - they're actually vital. In the past 50 years of aviation accidents, we've learned a great deal about how weather and changes in it affect flights - volcanoes, windshear, microbursts, icing, and many other reasons why planes went down.

The NTSB investigates accidents and provides recommendations. The FAA implements those recommendations, however, in order to do so it might need to rely on NASA for the technology and know-how in order to make informed decisions about equipment and procedures.

NOAA? They're part of the Department of Commerce, and they're concerned about how climate can affect the economy. It's a different branch of study. NASA's concerned with how climate can affect the atmosphere (which affects how planes fly through the air). While the missions often overlap, there are things NOAA does that don't concern NASA. For example, NOAA monitors icebergs with Canada - a program established a year after the Titanic sank - but iceberg monitoring isn't important to NASA, since it primarily affects ships at sea. However, how the Arctic and Antarctic ice sheets behave IS, since that has climate implications that could impact flight.

Comment Re:commercials and young kids (Score 1) 163

I've yet to see / notice product placement in the stuff my daughters watch (mainly animated stuff). But there is a HELL of a lot of merchandising for the shows.

Yeah, it's called the show itself.

This started around 1984, actually - the FCC loosened the rules regarding advertising that targets children. It's what led to the show-length ads we called cartoons. You know, TV shows like Transformers, GI Joe, My Little Pony, Jem and others. They were basically 30-minute ads in the form of shows. They weren't completely in-your-face about it like an infomercial, but they were still ads.

And yes, the merchandising IS the reason those shows were produced.

Hell, Hasbro figured it out way back in 1986 when they created the original animated Transformers movie. Its sole goal was basically to get rid of all the existing toys to make room for a new lineup. Kids actually cried - some because their entire collection was now "gone", others over the death of Optimus Prime, who was so beloved that Hasbro brought him back midway through season 3.

But yeah, if you grew up in the 80s, most of your cartoon TV programming was actually ads. If you watch the episodes today (thanks to DVD box sets), you actually notice how blatant it is at times.

Oh yeah, this extends to today as well - Transformers, GI Joe, Battleship. Hasbro basically licensed those movies out to Hollywood for next to nothing; the prime consideration was not how well the movies did at the box office (that was Hollywood's problem), but how well Hasbro's toy sales did.

Comment Re:Or, it could be unrelated to actually extending (Score 1) 286

more charging stations

In the city, there's a (slow) "charging station" located within 10 metres of the road. It's called an "electrical socket" and while 110V 15A is slow charge, the "infrastructure" is plentiful and extremely common.

Of course, public use plugs are extremely rare, but given how many people gather around plugs to charge their smartphones...
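Some back-of-the-envelope math shows why 110V 15A counts as slow charge - the pack size is an assumption for illustration:

```python
# Charge rate from a standard 110V 15A wall socket, ignoring charger losses.
volts, amps = 110, 15
watts = volts * amps                               # 1650 W

battery_kwh = 24                                   # assumed EV pack size
hours_empty_to_full = battery_kwh * 1000 / watts   # roughly 14.5 hours

print(f"{watts} W -> {hours_empty_to_full:.1f} h for a {battery_kwh} kWh pack")
```

Slow, but for a car parked overnight that is still a full charge by morning, which is the point about the "infrastructure" already being everywhere.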

Comment Re:Do that for the laptops as well (Score 1) 51

Although I question how much of a benefit this will really be. As it is, even without heatpipes, smartphone thermal throttles are usually set WELL below the CPU's junction temperature limit - the reason is that it's to prevent other components from getting too hot (like the battery). I remember talking to some Sony engineers, and IIRC, the CPU thermal throttle in most Xperia Z family units is not set to protect any of the internal components, but to protect the user's hand. Fujitsu's tricks might actually reduce the junction temperature at which a CPU can operate without burning the user.

Actually, thermal control is a compromise. ARMs have traditionally consumed about 1mW/MHz, which was great back when everything ran at 500MHz or less. These days, with "octa-core" processors running at 2.5GHz, you're looking at 10-20W in a tiny package with poor cooling (because package-on-package stacks the RAM on top of the SoC). Add in everything else, and going full tilt you could easily have to deal with 15+W.

Given the thermal resistance of the package, you can easily reach junction temperature (125C) stupidly quickly.

One analysis of a chip I saw had 2 cores going 100%, while cores 3 and 4 had to be thermally limited to 50% to keep the junction temperature down. And throttling needed to start well before the limit was reached - otherwise the temperature would overshoot.

Then there's the system configuration - is the CPU beside the battery? Then you'd want to keep it from getting too hot (you can be at max junction temperature while the case is only at 45C, because of thermal resistance). Or use the metal in the screen as a heatsink.

A large amount of heat in a SoC is also conducted away through the balls - ground and power planes that double as heatsinks aren't uncommon.
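Plugging the comment's rough figures into the standard junction-temperature formula (Tj = Ta + P x theta_JA) shows how quickly you hit the limit - the thermal resistance here is an assumed value:

```python
# Rough figures from above: ~1 mW/MHz per core, 8 cores at 2.5 GHz.
mw_per_mhz = 1.0
cores, mhz = 8, 2500
power_w = cores * mhz * mw_per_mhz / 1000        # 20 W at full tilt

# Junction temperature: Tj = Ta + P * theta_JA, where theta_JA is the
# junction-to-ambient thermal resistance (value assumed for a small package).
theta_ja = 5.0                                   # degC per watt (assumed)
ambient_c = 25
junction_c = ambient_c + power_w * theta_ja      # right at the 125 C limit

print(f"{power_w:.0f} W -> Tj = {junction_c:.0f} C")
```

With numbers like these, an all-cores-full-speed workload sits at the 125C junction limit even at room temperature, which is why throttling kicks in within seconds.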

Comment Re:Long time... (Score 1) 240

You'd think. But you'd be wrong. Digg did it. Firefox did it. GNOME did it. Even Slashdot damn near did it. UI is about elegant discoverable interfaces between user and computer, and if this means expensive testing and actually listening to feedback that says "don't fix what isn't broken," so be it. UX, by contrast, relies on bogus metrics to justify change for its own sake - said change always requiring the hiring of more UX people, for some strange reason.

UX has become a cancer upon the profession. UXtards destroy products in order to leave their creative stamp on them. They lie to the marketroids and the C-suite by convincing them that change for its own sake is value-add. The existence of UX personnel in your organization ultimately results in a loss of marketshare and mindshare. Fire them all before your customers do.

And don't forget the compliant press, which believes shiny should be different.

Because you know what the biggest complaint about iOS 6 was? That the UI, which had changed little since iPhone OS 1.0, was "dated" and "outmoded".

UIs SHOULD be incredibly stable - they SHOULD get out of the way. The only way a UI is dated or outmoded is if it gets in the way of the user. (E.g., how iOS used to do notifications).

It's not just UX designers deciding to revamp everything - it's the press deciding that just because your windows have looked the same for 2 years, they need a redesign.

It's really why Apple bothered to go flat in iOS7, to partially implement it in Mavericks, etc. The press was basically calling out Apple for being stubborn with "stale" UIs.

Comment Re:The profession is in decline (Score 1) 154

Sure, but there are many areas of EE where demand has fallen. Programmable logic has drastically reduced the need for boards full of TTL chips. FPGAs, and even many ASICs, are designed with fully synchronous digital logic, which requires zero knowledge of most EE concepts and can be done by any kid bright enough to master Verilog/VHDL. My company has done several successful FPGA projects, none of which involved anyone with an EE degree. ADCs, DACs, PWM, and DSPs come built into many microcontrollers, which themselves increasingly come on standard PCBs, with free downloadable libraries to handle all the interfacing.

And who do you think designs those things?

EEs are very much in demand, however they're not in demand for the old "computer engineer" or "programmer" style jobs.

Digital logic design still commands a small premium because it involves a LOT of advanced technology.

Though if you want to be in a field that's in resurgence, analog IC design is in. Even in the digital world, high-speed digital signals behave in fundamentally analog ways that are hard to reason about if all your experience is in HDLs. Analog designers can easily command 6 figures, especially as modern PHYs are analog in nature, so you have to do mixed-signal ICs. Just because Joe End User doesn't have to worry about it doesn't mean someone doesn't.

Then there are plenty of analog designs out there. A popular field is power engineering - you know, utility scale. Utilities all over the world are hurting because there really are only a handful of graduates in power engineering, not enough to replace the growing crowd that is retiring. (Yes, the flashy nature of computers and technology has sapped the talent pool for other sub-disciplines.) Enough so that starting salaries are close to, if not above, 6 figures. Even those who want to retire are often asked to hang on because there's no one to replace them, or to pass the institutional knowledge to.

There's plenty of RF work as well - WiFi and the like are easy to use, but that's because the RF guys made it simple enough to do. Even so, goof the design and you'll be wondering why you have limited range.

Comment Re:Wireless charging hit mainstream ~ 1-2 years ag (Score 1) 184

Personally, I hate fumbling with MicroUSB cables and my phone. I don't exactly have sausage fingers, but trying to put in that cable when I'm half asleep, the light on my nightstand is off (and I've been reading an eBook), and the end of the cable is loose *somewhere* on the nightstand is really annoying.

Other than the loose-cable "problem" (which most people solve with a bit of tape, a binder clip, or some other mechanism, including $10 "solutions" that basically hold the end down), the real problem is that you're complaining about micro-USB, a horrendous connector.

Sorry, this is an Apple article - and the Lightning connector is much nicer to deal with. If you wonder why Apple still made their own connector instead of using micro-USB, consider why the USB Type-C connector was invented.

Everyone blasts Apple for their proprietary connectors. Yet there are valid reasons why they exist - just because something is a standard doesn't mean it doesn't suck (like micro-USB).

Comment Re:Intel chip better than Qualcomm? (Score 1) 77

It is pretty difficult to get perfect performance out of a cell modem; the underlying theory is pretty complex, and translating these complex algorithms into a practical working implementation is incredibly difficult. Neither Intel nor MediaTek knows how to close the gap. Qualcomm is probably the only company in the world that has the know-how and brainpower to do this.

Actually, Intel has plenty of experience making modems - Intel's chips aren't their own designs; they purchased Infineon, one of the big modem chip manufacturers out there. If it wasn't Qualcomm, it was Infineon.

In fact, the first iPhones used Infineon modems. Though one reason to use Infineon was extreme power management - so much so that it was responsible for killing AT&T's network. Basically, the modems immediately killed the data channel when a transfer was over, so if you were web surfing, the phone would open a bunch of data channels, transfer the content, then shut them down. The end result was an overloaded control channel (opening and closing data channels is a control message). With enough iPhones, this consumed all the control channel bandwidth. End result? Dropped calls, because a tower with a full control channel means phones cannot do a handoff. It's why AT&T, despite having the worst call-drop history, had some of the best data transfer rates (a full control channel has no relation to how many voice/data channels are available).
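The signaling math can be sketched with a toy model - every number here is invented, but it shows how aggressive channel teardown multiplies control-channel messages:

```python
def control_channel_load(phones, transfers_per_min, msgs_per_transfer=2,
                         capacity_msgs_per_min=50000):
    """Fraction of a tower's control-channel capacity consumed by data-channel
    setup and teardown messages. All figures are invented for illustration."""
    msgs = phones * transfers_per_min * msgs_per_transfer
    return msgs / capacity_msgs_per_min  # > 1.0 means the channel is saturated

# A modem that holds the data channel open: one setup per browsing session.
assert control_channel_load(phones=5000, transfers_per_min=1) < 1.0
# Aggressive power saving: every fetch tears down and reopens the channel.
assert control_channel_load(phones=5000, transfers_per_min=10) > 1.0
```

The same number of phones moving the same data can either idle the control channel or saturate it, depending purely on how often the modem cycles its data channel.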

As for Intel, maybe Intel is trying to court Apple - with modem chips at first, then maybe with SoC business.
