
Comment Language isolates (Score 1) 318

I'd like to know how languages such as Basque are reconciled with this grand scheme. Along with a handful of other languages, Basque is classified as a language isolate. Most of these became isolates in fairly recent history (and therefore shouldn't be too hard to link to a language 'tree'), but Basque is a tricky one because it has been an isolate for as long as it has been recorded, and does not share its roots with the Indo-European languages.

I read the article (for once) but it didn't go anywhere near those kinds of details - which I find kinda odd: when it comes to grand unifying theories of language, I would've thought linguists would've been all over the issue of language isolates like a rash.

The chart/diagram in the Economist article did have Basque on it - 'near' Greek and Russian.

Sure, it's a damned interesting idea to be able to link all languages to a common source -- but the article makes it seem as though it all boils down to plotting languages according to similarities and then best-fitting a line across the chart. I hope there's a lot more to it than that -- otherwise it's not really a discovery, merely an interesting hypothesis in my book.

(for those interested: Wikipedia on Basque language history)

Comment Re:Are MD and SHA easily reversible? (Score 1) 409

Ok - yeah, my point was valid, but the problem space is indeed not very large for 6-character alpha-numeric passwords.

But I stand corrected, because as the AC points out with the posted link: using a slower algorithm will indeed throw a spanner in the works for any cracker.

In the past we've implemented password hashing using Blowfish, and we set a minimum password length of 8 - which is an improvement I guess.

Why isn't Bcrypt benchmarked on the page (http://www.cryptopp.com/benchmarks-amd64.html) linked from the how-to-safely-store-a-password page? That makes it difficult to draw a true comparison.
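For what it's worth, here's a minimal sketch of the slow-hash idea, using Python's stdlib PBKDF2 as a stand-in for Bcrypt (which needs a third-party module); the round count is an illustrative assumption, not a recommendation:

```python
import hashlib
import hmac
import os

ROUNDS = 200_000  # illustrative; tune so one hash takes ~100ms on your hardware


def hash_password(password, salt=None, rounds=ROUNDS):
    # derive a slow, salted hash - every extra round multiplies a cracker's work
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    return salt, digest


def verify(password, salt, digest, rounds=ROUNDS):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    # constant-time comparison, to avoid leaking how many bytes matched
    return hmac.compare_digest(candidate, digest)
```

The point being: the attacker must pay the same per-guess cost as you do, so a dictionary run that takes seconds against a single fast MD5/SHA pass takes orders of magnitude longer here.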

Comment Are MD and SHA easily reversible? (Score 4, Interesting) 409

I don't get it - surely it shouldn't matter if someone gains access to the password verification routine, the salt and the hashed passwords... unless the password hashing is easily reversible?

They've still got to try to brute-force match the hashed data with a dictionary attack - sure, having the salt makes it easier - but if you've got the salt and the hashed passwords, it doesn't matter what algorithm is used: you've still got to run a brute-force dictionary attack. Most hashing algorithms aren't easily reversible - that's the whole point.
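To illustrate the brute-force point, here's a toy sketch (hypothetical salt and wordlist) of a dictionary attack against a salted fast hash - which is exactly why the speed of the hash matters:

```python
import hashlib


def md5_salted(salt, pw):
    # fast salted hash - trivial per-guess cost for an attacker
    return hashlib.md5(salt + pw.encode()).hexdigest()


salt = b"s4lt"                            # attacker found this in the dump
leaked = md5_salted(salt, "letmein")      # ...along with this digest

# dictionary attack: hash each candidate with the known salt and compare
wordlist = ["password", "123456", "letmein", "qwerty"]
cracked = next((w for w in wordlist if md5_salted(salt, w) == leaked), None)
print(cracked)  # -> letmein
```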

Comment Stop crying about Gmail (Score 1) 375

A lot of people seem to be crying about Gmail being a free service and therefore unreliable (even though that's not what the poster's asking us about -- I'll get back on-topic shortly :). For those that don't know, Gmail isn't just a free service with no guarantee of uptime; it's part of the Google Apps For Business package, which is available both as a free service (for up to 50 employees -- 'standard edition') and as a commercial service (costing a mere $50 per year, per employee -- 'premier edition')... which (hello, naysayers) has a 'three-nines' (99.9%) uptime guarantee in the SLA.

If three nines is good enough for your business, then having great web-based email (with someone else worrying about backups, spam filtering, security, etc), shared contacts, calendaring and documents-in-the-cloud (if you need it), etcetera, is fantastic value at only $50/year/person. And if you're a small business with fewer than 50 staff online -- and you don't need the guarantee provided by the SLA -- then, being free, it's even better value.
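For reference, the three-nines arithmetic works out like this:

```python
# 99.9% uptime leaves 0.1% allowed downtime
hours_per_year = 365 * 24                      # 8760
allowed_hours = hours_per_year * (1 - 0.999)
print(round(allowed_hours, 2))                 # -> 8.76 hours/year, roughly 44 min/month
```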

Obviously, some of this might not suit everyone's tastes for whatever reasons, but to damn Gmail in a business situation purely because you think "you're trusting your business to a free service" is to miss an opportunity that would not only be perfectly suitable but also great value to a lot of businesses.

FWIW, Google also offer editions of Apps For Business for Education, Government and Non-Profit as well as the above-mentioned ones, so they seem pretty serious about it being a business offering worthy of consideration -- and, whilst I don't know of any colleges, governments or charities that use GAFB, I do know of several businesses (besides mine) that use it... and they are more than happy with the service provided.

So in my opinion, switching from an in-house system (such as Exchange) to using a cloud-based service such as GAFB is a perfectly valid business option -- if it suits your business / business needs.

Furthermore, to get back on topic and answer the actual question asked, 'can Windows and OS X and Fedora all work together?' -- yes, of course they can.

If you've already got OS X and Windows on your network, then you'll have little problem integrating Linux boxes alongside. I'd do a staged roll-out, changing to Gmail and OpenOffice first (in whichever order is most appropriate / convenient in your business), and /then/ changing whatever Windows PCs you can over to the Linux of your choice.

Personally, I wouldn't choose Fedora as my Linux for use in the office - as other posters rightly point out, it's a bit too cutting-edge. Realistically, we considered our options as being: Debian, Ubuntu LTS (long-term support release) or plain Ubuntu -- with the most-stable/least-risky first in that list. If having later versions of certain software is an important criterion for you, then you might want to choose something further down the list.

We ended up choosing Ubuntu LTS for the servers, and latest release Ubuntu for desktops -- for various reasons we favoured the single-vendor option in this instance -- although I've got no problem with Debian for either of these, if the software suits your needs. Obviously your requirements aren't ours, so your mileage may vary.

But I think your scheme makes pretty good business sense.

/frogg

Comment Re:There were some damn fine games in that era... (Score 1) 274

indeed - not only that, but bilinear/trilinear filtering (when applied to the whole screen, as opposed to filtering when rendering textured polys) is really a technique for high-res displays - such fullscreen filtering is pretty quick and dirty, but it really helps take the hard edges off polys/models.

by comparison, on relatively low-res displays, the blur is just too much and simply causes an unwelcome loss of overall image quality.
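a sketch of why: bilinear filtering blends the 4 nearest texels, so a hard edge becomes a mid grey - fine when pixels are tiny, mush when they're not (toy single-channel example):

```python
def bilerp(img, x, y):
    """sample a 2d grid at a fractional coordinate by blending the 4 nearest texels"""
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x0 + 1] * fx
    bot = img[y0 + 1][x0] * (1 - fx) + img[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bot * fy


# a hard black/white edge: sampling halfway across it yields a mid grey,
# which softens aliasing at high res but reads as blur at low res
img = [[0, 0, 255, 255],
       [0, 0, 255, 255]]
print(bilerp(img, 1.5, 0.0))  # -> 127.5
```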

Comment Re:Funny that (eggy post follows) (Score 2, Interesting) 274

i'm not familiar with the ac's mentioned game, minecraft - but to expand on what you say: yes, for starters, back-face culling is a really cheap/quick test for whether to send polys to the render pipeline - it's done after the final transform, and involves little maths beyond a winding check on the projected verts (ie, you index your polys clockwise for front-facing; if, when you come to render a poly, its screen-space verts now wind anti-clockwise, you skip sending the poly to the render pipe)
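that winding check usually comes down to the sign of a 2d cross product on the projected verts - a sketch, assuming y-down screen coords and clockwise front faces:

```python
def should_cull(p0, p1, p2):
    """back-face test on projected 2d verts (screen coords, y down):
    front faces are wound clockwise, so a non-positive cross product
    means the poly now winds anti-clockwise - facing away, skip it"""
    cross = (p1[0] - p0[0]) * (p2[1] - p0[1]) - (p1[1] - p0[1]) * (p2[0] - p0[0])
    return cross <= 0


front = [(0, 0), (10, 0), (0, 10)]                 # clockwise on a y-down screen
print(should_cull(*front))                          # -> False (keep it)
print(should_cull(front[0], front[2], front[1]))    # -> True (reversed winding, cull)
```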

like i say, i don't know minecraft, but if it's cube-based then you'll be culling quads instead of tris, which saves half the work again. you can probably take that a step further if they're always cubes, because you know that only 3 faces of a cube can be visible at any one time.

furthermore, if the game is based around a 2d or 3d matrix/grid of cubes, then it's only a small amount of maths to project the view frustum over that space - you're not really dealing with polygon soup, as with more graphically complex games; you're merely dealing with a coarse array of cells (which will eventually be rendered as your simple cube meshes). and if the cubes are 'fixed' in orientation to the grid, then you can skip the transform for the model rotation - it's simply not needed if all the cubes are guaranteed to be at angle (0,0,0) relative to the world.
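a sketch of that cube-grid shortcut - assuming a hypothetical cubic boolean grid (True = solid), a face only needs drawing when the neighbouring cell on that side is empty:

```python
# the six axis-aligned face directions of a cube
FACES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]


def visible_faces(grid, x, y, z):
    """return the face directions of cube (x,y,z) not hidden by a solid neighbour"""
    if not grid[x][y][z]:
        return []
    n = len(grid)  # assumes an n*n*n grid

    def solid(i, j, k):
        return 0 <= i < n and 0 <= j < n and 0 <= k < n and grid[i][j][k]

    return [(dx, dy, dz) for dx, dy, dz in FACES if not solid(x + dx, y + dy, z + dz)]


# two cubes side by side: the two shared faces are culled, 10 of 12 remain
grid = [[[False] * 2 for _ in range(2)] for _ in range(2)]
grid[0][0][0] = grid[1][0][0] = True
total = sum(len(visible_faces(grid, x, y, z))
            for x in range(2) for y in range(2) for z in range(2))
print(total)  # -> 10
```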

also, if you're dealing with a strictly grid-based scene with regular/fixed-size cube models, then model/scenery occlusion is a bit easier to deal with - because it's a regular grid of cubes, you can make assumptions about visibility and take shortcuts that you can't so simply when dealing with a scene full of arbitrary mesh shapes/sizes. (although i'm not sure whether occlusion of this kind is where the biggest work savings would be made in the pipeline? it might be feasible to just skip object occlusion after you've made the above performance tricks, depending on various factors - polys are often cheap to render (with a gpu), whereas the occlusion calculations might take extra time for no real gain - it depends where the bottleneck is in the pipeline after the above optimisations)

strictly speaking, if it's a big 3d array of cubes it's basically a big voxel-based system, so all of those techniques are applicable - although, traditionally, voxel-based renderers use fuzzy-pixel-blobs (or apply a smoothing filter or somesuch, either at render time or to the resultant mesh before render time, depending on the effect required), whereas it sounds like minecraft could be treating the world as a more crisply rendered voxel mesh of cubes.

like i say, i've not seen minecraft (and perhaps i should really have a quick google ^^) so i don't know if it's using voxels or polys and points, or something in between / a combination of both (perhaps a world-grid handled as a giant voxel, with the other meshes then thrown into the resultant scene)... and/but maybe i couldn't tell by looking anyhow :) -- but... if i were going to implement something like the ac describes / hints at: that's how i'd go about it! :)

back in the day a lot of the great games on under-powered hardware (2d or 3d) relied on tricks like this (and/or other tricks depending on the game-engine style) to get that more-going-on-than-there-really-is kind of illusion - it still goes on now with newer games-engine tech, but it was much easier to understand when things were that simple. awesome stuff! :)

Comment Re:Who says DirectDraw is going away? (Score 1) 274

i was a commercial games programmer for both dos and windows during that era - i went to the first few years' worth of directx conventions / developer days (when alex st. john was still the microsoft games-dev / dx evangelist, and kate-thingumy-bob was head(?) of the directx dev team - and well before the dx api was called / rebranded as 'directx')

don't underestimate the gp poster's claim...!

it was publicly stated (ok, maybe not publicly, but to rooms full of games devs) by microsoft a number of times that directx was designed because games programmers (like us) refused to program any games for windows 95 (using the sucky dib apis) -- microsoft knew that if they couldn't get games onto -- and games programmers into -- their win95 platform, then their new 'multimedia' os wouldn't be able to take over the desktop in the way they intended it to. yes, this is pretty much what they said to us. really.

back then, to play games, people would boot to dos (or, heh, just quit windows 95 - cos it just ran on top of dos, of course! ;) even if the game had a native win95 port/version (originally quite rare). people still did this even though it was possible to run some dos games whilst win95 was running - because performance sucked if you ran stuff in a dos window under win95, and as a programmer you'd sometimes/often run into problems hitting the hardware directly (for sound/video) while windows was still running.

microsoft didn't want any of this users-running-dos lark; they were trying to kill dos and push their new whoop-de-doo multimedia 'rewrite' of win3.x/3.11, so they needed to address this problem. they said so themselves, repeatedly. i can't provide a citation, but i was there, and heard it myself from a lot of fairly high-up microsoft developers/staff, often in (semi)public forums / seminars.

i'm sure someone will correct me if i'm wrong, but the games-developer api / directx wasn't even part of the initial release of win95 - it was a late addition, and featured in a later release of win95 commonly known as win95 osr2 (which also bundled ie3 - but that's another sorry story)

sure, i agree that dx (kind of) solved the problem of providing a common api to your hardware (by enumerating your hardware devices, etc), but none of this was really the driving force behind dx - it was just how they had to implement it given the situation; it's how any coder would go about it. although the early releases weren't as helpful/useful as they were meant to be.

fwiw, their common api was pretty shitty up until dx3 or 5 (there was no dx4, it got ditched - i might be wrong, and it's aside from the main topic anyhow, but i think dx4 was due to be the release that included the joint project with sgi to unite the opengl and directx 3d drivers/apis - although this potentially wondrous event never occurred. a shame, because even back then we were rooting for opengl :)

before the time of dx3/5(~ish), microsoft didn't always even provide software fallbacks for missing-in-hardware-but-still-fairly-simple-in-software functionality in the dx software api, /despite/ the docs often implying that things should be otherwise (x/y flip when blitting comes to mind, but there were other things too). when they _did_ bother to provide such fallback functionality, it often performed so poorly that you were better off using your own implementation straight to the frame buffer anyhow (which you already had lying around from the dos games, so no big deal for the games programmer of that era).
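the kind of fallback in question really was trivial to roll yourself - a toy sketch (hypothetical framebuffer-as-nested-lists, not any real api) of a software x-flip blit:

```python
def blit_xflip(dst, src, dx, dy):
    """copy src into dst at (dx, dy), mirroring each row horizontally"""
    for row, line in enumerate(src):
        for col, px in enumerate(reversed(line)):
            dst[dy + row][dx + col] = px


fb = [[0] * 4 for _ in range(2)]   # tiny 'framebuffer'
sprite = [[1, 2], [3, 4]]
blit_xflip(fb, sprite, 1, 0)
print(fb)  # -> [[0, 2, 1, 0], [0, 4, 3, 0]]
```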

of course, before directx the game-devs already had libraries supporting all the popular audio and video hardware - they'd either developed them in-house or used an external library to provide that functionality (i forget the names of the ones we used - but if someone reminded me i'd know them). it wasn't really anything new - and because of the spotty dx fallback support, we couldn't throw a big chunk of that code away for a number of years anyhow. early dx sucked - more than people outside of game development could really imagine :(

this all happened around the same time as all the game-devs were using the watcom compiler, because microsoft's own compiler just didn't produce code as optimal (in fact watcom absolutely trounced microsoft in this area for some years). microsoft's response was to work with certain games-dev houses to find out which things in particular in their c compiler emitted sub-optimal code (compared to watcom's), and to get those things fixed fairly pronto - this, in turn, was partly the cause of the demise of watcom's compiler (i guess another reason being that watcom couldn't keep up with the windows api and changes to it - watcom provided their own set of windows header files for their compiler, which were often problematic and must've been a nightmare to maintain)

they were both interesting times and tough times! :)

on and off during that period we worked with various members of the dx team to help address some of these dx / dx-driver issues, and microsoft would often offer our various team members jobs because they needed more of these skills in-house (they were new to games development, almost in the same way that sony were just a bit later with the launch of psx)

but don't believe the hype - microsoft were definitely trying to embrace-and-extend all the dos-based games development technologies, putting an end to various providers of dos games-developer-oriented libraries/technologies/apis (including, as i mention, helping to put some final nails into watcom's coffin along the way) so that their new baby, win95, would be a success as a 'multimedia' os - because until their games-developer-toolkit / directx came along, you simply couldn't write any decent kind of action game on the windows platform, it just truly sucked.

....and that's how it was in the early 90s game-dev trenches! (now: get off my lawn! ;)

(oh, sorry - maybe you have your own lawn? -- i see you have a 5 digit uid also! hehe)

Image

Woman Creates 3-D Erotic Book For the Blind 113

Lisa J. Murphy has written an erotic book with tactile images for that special visually impaired porn connoisseur in your life. Tactile Mind contains explicit softcore raised images, along with Braille text and photos. From the article: "A photographer with a certificate in Tactile Graphics from the Canadian National Institute for the Blind, Murphy learned to create touchable images of animals for books for visually impaired children. Then she realized that there was a lack of such books for adults only. 'There are no books of tactile pictures of nudes for adults, at least the last time I looked around,' says Murphy. 'We're breaking new ground. Playboy has [an edition with] Braille wording, but there are no pictures.' She says that while we live in a culture saturated with sexual images, the blind have been 'left out.'"

Comment Re:Chinese? (Score 1) 102

now, i'm no expert in languages, but i do see that google translate also translates to/from chinese, so i'm surprised that you claim it's a non-existent language?

also, wikipedia have a page about the chinese language - whereas, conversely, and in support of the other half of your statement, they don't have a page for the indian language, instead having a page for the languages of india.

perhaps we just differ over semantics here? perhaps you would've been happier if they'd specified traditional or simplified chinese?

like i say - i'm no expert, i'm just sayin' - that's all. ;-)

Comment google translate vs ibm n.fluent? (Score 1) 102

i think ibm have some catching up to do! ;) - google translate handles a lot more languages than that (51 in total) - in fact i'm kinda surprised google haven't built it into their chromium-os or the android platform (erm, i dunno - maybe they have - it's difficult to keep up with it all)

and, to top it all, google recently added the ability to view romanisations of characters such as chinese han, and input transliteration of phonetics for hindi, arabic and persian.

to my technical yet non-linguistically-educated mind (i'm english by birth, so - thanks mostly to our poor education system, at least when it comes to languages - i only read, write and speak one language; and to be honest it's somewhat debatable how good us english folk are at our own language, although at least we don't speak americanese [/me ducks and runs] - though it's creeping into the common vernacular more and more thanks to the telly - 'though i digress somewhat), it'd be interesting to see how the technology that powers google translate differs from that which powers ibm's n.fluent - to my mind the end results look similar, so i wonder how much these kinds of technologies differ and/or how much they have in common?

Comment Re:A few problems and some solutions (Score 1) 344

actually, i've just noticed that i often do my two-finger scroll with my third finger + pinky when i'm smoking a cigarette (because i'm holding the ciggy between my first + second fingers) - and it's just as comfortable.

also, after a little thought, i wonder if a four-finger pinch might not be too bad (depending on the sensitivity of the software), using all four fingers and sliding the first finger left and right.

it's interesting to make such observations - most of these things generally go unnoticed (for me at least) and become natural / second nature - i'm sure a lot of research has likely been done in this field.
