Comment Re:Some (backward) progress (Score 1) 379

You're confused; this is OpenBSD patching the OpenSSL that comes with their distribution. However, many of their patches will help the OpenSSL project, if someone on that team can be made to take an interest in actually improving the security of OpenSSL.

The fact that they're starting by reformatting code and ripping out anything that's not OpenBSD/posix-compatible (e.g. Windows compatibility) seems to indicate that it's highly unlikely that this effort will be backported to OpenSSL, with the exception of very specific security fixes, perhaps. What they're doing makes a lot of sense if they're planning to fork the project and maintain it exclusively for the OpenBSD project, but I'm guessing we won't see much backpatching upstream.

If they had made these changes while preserving platform independence, it might have been a different story. They're simply doing what's best for their platform, of course, so I don't begrudge them that. Obviously, the OpenBSD team isn't interested in making sure they remain compatible with Windows, etc.

I think the OpenSSL library will need its own refactoring effort, unfortunately.

Comment Re:So - who's in love with the government again? (Score 1) 397

You might want to add, "I am not an economist but..." before you write these things. "which may mean lower profits, leading to reduced employment" is as ridiculous as saying that adding a powered usb port will draw less power from your CPU and speed computation.

We have no idea what this will do for employment, there's simply too much going on. Increasing beer prices ever so slightly (I doubt this adds more than a cent or two per can, but whatever) would decrease beer consumption (also ever so slightly) and might increase productivity in other industries. Also, increasing food safety could decrease time off economy wide. It's impossible to know. But I doubt any effect would be large.

When I talk about "lower profits and reduced employment", I simply mean that in a general sense. I wasn't trying to indicate that you'd see a massive drop in employment or profits simply because of this one regulation. My point, which I apparently made somewhat poorly, was that the sum total of these regulations tends to have a negative effect on a company's profits due to the overhead of complying with said regulations. To use your analogy, of course you wouldn't see a significant power drain with a single powered USB port, but your computer can only support a limited number of powered USB ports before the power drain becomes unsustainable.

That's not an argument against all regulations, because as I stated, many are critical for safety reasons. Rather, I was simply pointing out that it's a good thing to look with a critical eye at any new regulation to see whether or not it's truly necessary because of the net economic drain it imposes.

Comment Re:So - who's in love with the government again? (Score 3, Insightful) 397

This is all a tempest in a teapot. The FDA is proposing rules for complying with a 2011 law passed by congress to ensure food safety. Brewers had been exempt from the rule because they were able to buy off congresscritters in the past. Now they will have to keep records and conduct training to make sure that they aren't shipping contaminated waste grain to feed cows. People who love to eat cows should welcome the fact that they can be assured that their cows haven't been fed contaminated feed.
All of the hysteria about driving brewers out of business is just hyperbole. Before these rules, brewers could ship contaminated, spoiled grain to feed cows without any accountability. Now they will be accountable to make sure that they don't feed cows garbage... seems reasonable.
You can read the FDA regulation (and avoid the hysterical hype) here:
http://www.fda.gov/Food/Guidan...

I haven't heard of anyone talking about driving brewers out of business wholesale, but any increase in operating cost is going to have negative repercussions for a business, which may mean lower profits, leading to reduced employment. That's just the way these things work. Note that this could also have a ripple effect, such as increasing the price of milk, since farmers have been able to rely on this cheap and nutritious feed for a long time.

You mentioned "they could have" in your response, but I could counter with "they never have so far", which seems a more powerful argument. This practice has been going on for over a century with apparently no real trouble, and suddenly the brewers are going to poison farmers' cattle? It seems a bit far-fetched, since these are, after all, the by-products of human-consumable beverages. I'd be more apt to support this if there were a documented history of problems with the practice.

Government, by its nature, tends to want to create more and more rules and regulations. I think that's partly the natural desire to proactively protect against problems, but it also has slightly less noble purposes. More regulations essentially mean the government has to grow to enforce those regulations. It's in the FDA's own self-interest to pass as many rules and regulations as it can, because then its "business" grows. That means those in the FDA can move up their own "corporate ladder", so to speak.

Government regulations have to be viewed as a necessary evil. Only the most die-hard libertarians or anarchists would say we need no regulations at all, but there's always a careful balancing act to be struck between the overhead these regulations impose and the benefits they provide in terms of safety, reliability, and consumer rights. So I think it's worth questioning whether the benefits of this new proposal are worth the overhead and costs it imposes.

Comment Re:Not True (Score 1) 245

If you fail to deliver on your promises, you won't be able to easily earn back people's trust

So the next kickstarter campaign is in your girlfriend's name.

Do you want me to find examples of people who have gone back to the kickstarter well and never really delivered?

No, not really. Honestly, I don't really care why you seem to dislike crowdfunding so much. I'm not the person to defend it, as I neither use it nor contribute to other projects. It just seems like you have to consider the source of those projects very carefully - that's up to the individual contributors, but that seems like common sense to me.

Incidentally, although I'm an indie developer (as one would define it),

Great. Then maybe you can explain why it seems impossible for new companies to produce something at the level of Half-Life, the Burnout series, etc etc. Games that people want to put over 100 hours into. Valve and Criterion were relatively small and little-known "indie" companies when those games were made. Why do game developers seem so allergic to giving good value for the price of their game. And why do so many have such low opinions of their own games that they go F2P? Are there no developers who realize just how badly that genre sucks?

You're essentially asking "Why aren't all games as good as the best ones ever made?" Is that something I can even answer? Why aren't all composers Mozart? Why aren't all directors Steven Spielberg?

80% of everything is crap, and that includes games. Of the 20% that isn't crap, only a small percentage will rise to the very top, and will probably make everything else look worse by comparison than it really is.

Making games is harder than most gamers think, incidentally. To make a top-notch game, you need a fusion of talented programmers, game designers, and artists, plus (and this is probably the rarest) enough financial backing and managerial foresight to see a project all the way through to its true completion, not just to when the contract says it's due. Incidentally, that's not the same as giving developers unlimited time and money, because that can bankrupt companies. To me, it's a miracle that as many high-quality games are released as there are, since I've seen first-hand how incredibly hard they are to make.

Comment Re:Not True (Score 1) 245

Oh, don't worry, Bungie lost its vitality all right when they sold to Microsoft. Many people don't realize that Bungie is this old, because before becoming an Xbox developer they were Mac-exclusive developers.

Yeah, they were getting the life sucked out of them by Microsoft, being asked to do nothing but Halo sequels. They seem to have regained some vitality since splitting with MS, though, which is what I was sort of implying.

Comment Re:Not True (Score 2) 245

Yeah, you're correct that "venture capitalism" is a bad analogy. It is true that you're betting your own capital, but your only potential return is a good game, and maybe some extra freebies. Minupla below gives a much better analogy as "patronage", as there's often a desire to see a specific vision come to fruition. It's not a perfect analogy, but probably a bit better than mine. Of course, any comparison or analogy is going to be flawed in some way, because crowd-funding is a rather unique mechanism for funding development.

Incidentally, although I'm an indie developer (as one would define it), I'm not taking money from crowdfunding. I saved up for many years working for established game development companies and am now self-funding my own game at tremendous cost and at my own financial risk. In fact, I've never even been to an indie developer conference. I'm a professional game developer who just happens to be working on my own right now, and I'm betting my financial future on my game being fun and engaging to play.

Keep in mind that crowdfunding is not a "get money for free" scheme. You have to pay all those people back with promised products of some sort, which are charged against your own future earnings. If you fail to deliver on your promises, you won't be able to easily earn back people's trust, and your venture will likely fail. I'm sure there are some people who have taken advantage of the system, but there are also developers working long days and nights on their own to produce a viable product that others will enjoy.

Comment Re:Problem with releasing an underpowered console (Score 1) 117

Neither console is really "next-gen", that would have been 4K resolution.

I would have been happy with true 1080p resolution. How many people actually have 4K TVs at this point? Not nearly enough to support it in a console designed for the masses, at least. 4K is pretty demanding even for PC video cards. It would have pushed the cost up by several hundred bucks with absolutely no benefit for the majority of customers.

Still, we couldn't have expected the same massive leaps in visual quality as in previous generations. After all, the 360/PS3 generation was already closing in on photo-realistic quality under ideal circumstances, so there's no helping that. From here on out, improvements to visual quality will be less noticeable even for relatively large increases in processing power. 4K takes roughly 4x the processing power of 1080p (at least in terms of fill rate), but of course it doesn't look 4x as good.

Despite my grumblings, for me it's still about the games and not really the eye candy, even though comparing transistor counts and internal resolutions seems to get the press all hot and bothered. I'll probably buy the first console that gets an epic, must-play RPG, as that's definitely my favorite genre.

Comment Re:Not True (Score 1) 245

Some old icons in the industry are now past their prime. Blizzard, Bioware, and id, longstanding favorites of mine, have all sold out. I'll no longer expect anything great from them, although I'm always willing to be surprised. Instead, younger and hungrier development shops will take their place... maybe ArenaNet and Bungie.

Uhh... Bungie is only 3 months younger than Blizzard. If you want to be pedantic, though, Blizzard Entertainment proper is actually the younger studio.

Yeah, you're right. After I posted that, I realized that "younger" wasn't really the proper term for describing Bungie, as they've been around for quite a while now too. Maybe it's just that it feels to me like Blizzard has lost its vitality since getting swallowed up by Activision, while I don't necessarily get that feeling from Bungie.

Comment Safer phones? Seriously? (Score 4, Insightful) 184

People need to stop distracting themselves while driving. Better yet, make sure that anyone who causes damage, injury, or deaths due to their negligence while driving is fully prosecuted under the law. It's no different than driving under the influence of drugs or alcohol. Driving a vehicle requires responsibility as a driver.

Let's not kid ourselves. People will just root their phones and bypass any restrictions put in place to block access to the phone while driving. And how the hell would a phone know the difference between a passenger sitting in a car and a driver?

At its heart, this really isn't a technology problem but a societal one. We need to crack down on this sort of thing so people understand that it's simply not worth the risk to break the law. It would be awesome if software or hardware could fix all those meatware-related problems, but that's not the world we live in.

Comment Re:Not True (Score 4, Insightful) 245

IMO, we've never had more choices or viable platforms as gamers - my first console was an Odyssey 2, and my first computer gaming was on an Apple II+, so I've been doing this a while now. Anyone who is longing for days long gone really needs to take off the rose-coloured glasses. Most of those older games were, if you look at it objectively, pretty trite and repetitive by today's standards. They were amazing to us largely because of their novelty, and we've elevated them on the pedestal of nostalgia.

Nothing against the classics - they were amazing for their day, but I do think a bit of perspective is in order. When I was a kid, I would have killed for an amazing RPG like Skyrim, or an MMO like Guild Wars 2, or for the sheer creativity to be found in Minecraft. I picked up Limbo the other day, and have been immensely enjoying myself - it's an incredibly clever and atmospheric platformer/puzzler. I'm still playing Puzzle Quest too, a relatively low-budget but fun puzzle-RPG hybrid. More recently, I've been going through my "bought a while ago but haven't played" list like Halo 4 and Uncharted 3, and on the PC side recently picked up The Witcher 1 & 2 in a Steam deal. I've enjoyed all these games immensely so far.

Granted, there's a lot of crap out there too. Freemium games? Yeah, I stay the hell away from those too. But I don't see how crowdfunding can be blamed when it's simply opened up the market to more niche games. Sure, some of those bets won't pay off, but welcome to venture capitalism. I'm not sure how that should be a surprise to anyone. 80% of everything is crap, anyhow. It holds true now, and it was true in the past as well. You just need to look for the products that rise to the surface... you know, read reviews, judge based on developer history.

Some old icons in the industry are now past their prime. Blizzard, Bioware, and id, longstanding favorites of mine, have all sold out. I'll no longer expect anything great from them, although I'm always willing to be surprised. Instead, younger and hungrier development shops will take their place... maybe ArenaNet and Bungie. And garage development is no longer relegated to the past either thanks to crowdfunding and improvement in tools, technology, and especially distribution platforms.

Personally, I think it's a pretty exciting time for the gaming industry, and I'm happy I'm in the middle of it.

Comment Re:It's time we own up to this one (Score 1) 149

In the process of rewriting, it's inevitable that a ton of brand-new bugs will be introduced into the new codebase, and you'll have lost all the time and effort spent hardening the library and fixing the thousands of previously exploitable issues.

I think talk of scrapping or rewriting the library is a bit of an overreaction caused by the scale and scope of the issue, and it's certainly not plausible in the short term anyhow. I'd say the proper thing to do is to halt development of new features for a time and investigate the viability of addressing some of the underlying issues that have been pointed out, which may have contributed to this bug:

- Remove the custom memory allocator, or ensure it's off by default.
- Start writing and using a proper unit test and fuzz testing framework.
- Document and clean up the source with proper variable names and comments, to make things easier for those reviewing/auditing the code.
- Make sure the library compiles cleanly.
- Pay extra attention to code that dynamically allocates memory or accepts data from users.
- Create a better review process.

Etc, etc.

Every time I make a programming mistake or introduce a bug, I ask myself "What could I have done that would have prevented this from happening in the first place? What can I realistically do now to make sure it doesn't happen again?" The OpenSSL team really needs to be asking themselves these questions right now.

Note that questions like these have to be addressed in a realistic manner as well. "It needs to be thrown out and rewritten from scratch" or "we shouldn't be using C in the first place" are not realistic solutions. Whatever merits Ada has as a language (as one example I've heard), it's simply not going to replace C as a low-level systems and library language anytime in the near future. Language choice is also impacted by issues such as efficiency, programmer availability, interoperability, shared libraries and sample/test code availability, and so on.

I don't think this is the time to throw hyperbolic accusations or wildly impractical "solutions" around. It's a great time for some serious introspection and development process review, however, not only for the OpenSSL team, but for any sort of project like theirs with similar responsibilities. Even conservative fields of study such as civil engineering have had their spectacular disasters/failures. The thing to do is to assess the situation, learn from your mistakes, and then rigorously apply what you've learned so that you don't make those same mistakes again. It often takes a serious catalyst for that to happen, so in some ways, perhaps we should be glad that this happened sooner rather than later.

Comment Re:A patch closer to usability, few more to go (Score 1) 294

Actually that's the planned idea, that when "win9" rolls out that's the "eol" for 32bit.

I'd love to see a source for that assertion. There are hundreds of thousands of legacy 32-bit Windows apps that users continue to rely on; MS is not simply going to "EOL" them starting with Windows 9. Seeing as near-perfect 32-bit compatibility is already built into both Windows and all modern 64-bit processors, they would have to actively remove that existing feature to break compatibility in the future. Why in the world would they do that and destroy their own OS's ecosystem? That just makes no sense to me.

Straight C++ is effectively platform-agnostic, and that includes the 32- and 64-bit flavors of Windows, of course. As I mentioned, I can compile my game for either 32-bit or 64-bit platforms without any changes to the source code. That's what I meant by "flip a switch" - I didn't mean using the same binary, in case that wasn't clear (you're correct in that regard). A better way to phrase it would be "flip a switch and recompile the project".

Comment Re:A patch closer to usability, few more to go (Score 2) 294

No I'm not misunderstanding. You're simply not paying attention to what's going on. There's a difference between an emulation layer, and native support. Currently we have multiple flavors of OS's with native support in either flavor, in a few years we're going to have a single flavor of OS support with an extreme drop off in support for x32. We're already seeing this in gaming with x32 binaries being thrown into the trashbin, and the entire codebase thrown and ditched. The most recent example in gaming of zero x32 support is Watch Dogs for the PC.

Your post seemed to imply that 32-bit apps will fail to run in Windows 9, which is simply not the case. Also, I fail to see the problem with the world moving to 64-bit native OSes with a 32-bit compatibility mode. And what do you mean by "an extreme dropoff" in support? 32-bit applications run flawlessly under 64-bit Windows. It's not some sort of half-assed software emulation: 64-bit processors still support the original x86 instruction sets all the way back to the 8086, and execute them natively. The compatibility layer (Windows on Windows) is for the Windows API, not for the binary's instruction set, so it's fairly minimal in terms of overhead.

Also, you seem to be under the impression that you need to completely rewrite your game engine for 64 bits, which isn't true. My current game engine is both 64-bit and 32-bit compatible with zero differences in the codebase itself; simply throw a switch in the build configuration and it's done. There are some types of code you do have to actually port, such as inline assembly (which the Microsoft compiler doesn't allow in 64-bit builds) or any tricks that rely on pointers being 32 bits wide, but for most C/C++ code there's no difference at all if it's written correctly. If a game company decides to abandon 32-bit platforms, it's because they've determined that the market can now move ahead with 64-bit platforms. This lets them push beyond the 2GB memory limit of 32-bit applications, which is becoming a serious bottleneck for modern PC games (we had to work pretty hard on my last commercial title to fit within that limit).

Anyhow, I'm not quite sure of your overall point. Do you feel that MS should continue releasing 32-bit operating systems even though there are no 32-bit-only processors being manufactured for desktop or laptop computers, and haven't been for years?

Comment Re:Why are you using the touch interface with a mo (Score 1) 294

Still worse than Win7:

1. Click Start Button
2: Click Shut Down

But it's the content of the steps you describe, not just their number, that causes the consternation with Windows 8. What about the bottom-right corner is so magical that it compels people to move their mouse there in the first place to discover this menu? What about "Settings" would lead people to believe that this is where the power control is? Yes, these are easily learned, but the more of these non-intuitive steps there are, the more frustrating things are for the user. I remember how frustrated I was trying to figure out how to shut down a full-screen Metro app for the first time.

The best user interfaces have a sense of intuitive discovery about them. Have you ever been to a "mystery meat" website? That's where the designer was being hip and minimalistic and forced the user to hover over a bunch of obscure icons to figure out what the hell each one does before clicking. Generally speaking, it's a usability disaster that no serious web designer would make today, yet Microsoft managed to do exactly this.

That's why the "Start" menu worked so well when it was introduced in Windows 95 and carried forward. It was a reliable fallback for users to access all the functionality on their computer. Not necessarily the fastest or most efficient route, but everything was right there, easily discoverable within a few clicks: every application, every commonly used computer setting, and yes, the functionality for shutting down the computer too. At the time, the Start button got some mild ribbing for being the way you shut down your computer, but that was largely played for laughs.

When I bought a Mac mini recently and used OS X for the first time (I hadn't used an Apple computer since my Apple II+), my experience was completely different. Everything was slightly unfamiliar, but it was a gorgeous visual experience and not at all hard to figure out, because of the paradigm shared by almost all modern desktop environments (closing a window - I'll guess the red button in the upper left corner). Once I wrapped my head around a few conceptual differences (such as the top-of-screen menu bar, and the separation between an app and its windows), I pretty much felt at home.

It's pretty incredible to me that, as a longtime Windows user (since 3.0), I felt roughly the same level of discomfort learning an entirely new operating system as I did simply upgrading from Windows 7 to Windows 8, which had never happened before. In previous version upgrades, I always felt like the UI was evolving and improving for the better (like the Windows 7 taskbar - it only took me a day or two to fall in love with it). In OS X, I don't feel like I'm being bludgeoned with an iOS-wannabe interface every time I start an application or perform some OS-related task. Apple understands that these are two wildly different computing paradigms. Why didn't MS figure this out?
