
Comment: Um, just a sec, gotta check on something... (Score 1) 371

by Bones3D_mac (#36445196) Attached to: How Citigroup Hackers Easily Gained Access

This doesn't even qualify as a hack. It's more like a tactic a curious script kiddie would try just to see how something worked, only to be pleasantly surprised when some other user's data was handed to them on a silver platter as a reward for bothering.

Sadly, I'm willing to bet this kind of "exploit" is far more common than anyone is willing to admit. Like those of us who have "left the water running" and only realized it 50 miles down the road.

It's something so stupid, most developers wouldn't bother checking their own work for such a "rookie mistake", simply because they're just that good.

Comment: It seems very unlikely... (Score 1) 294

by Bones3D_mac (#36294654) Attached to: The Next Phase of Intelligent TVs Will Observe You

But... then again, that Kinect stuff went off like wildfire, with thousands of families instantly installing a networked 3D camera in their living room, completely unaware of the potential implications. You'd think someone would have read "1984", or at least watched the movie after hearing about it on the news following the adoption of the Patriot Act. Soon it'll be the same for our daily-use media devices and smart phones...

It'll probably have to become a video game before they figure it out, but by then the creepiness will be outweighed by a false sense of security, knowing that "there's an app for that"... featuring the next-generation Apple iDevice with multiple 3D cameras that can view the device's entire surroundings as a single 3D-sensitive sphere around 15-20 feet in diameter. It could use AI-assisted augmented reality to pick out and identify every object in view, then recreate the scene entirely in 3D from a database of similar 3D objects as hastily collected as Google's image search. That scene, in turn, would be uploaded to YouTube3D, where random users could watch you in realtime 3D and rummage through your belongings without actually having to be there. Finally, someone will think to turn this YouTube3D thing into a paid service where random people watch you 24/7 like Brinks home security, except the security "staff" also pay for the service, to watch you like an episode of "Survivor"... only it's "interactive", and the viewers can choose to either watch you die in your own home from a fire or break-in, call the police and be the great busy-body hero they imagine themselves to be, or just collect a cash reward, like some sick game show.

Comment: Number 1 Cause... (Score 2) 248

by Bones3D_mac (#36294526) Attached to: What's Killing Your Wi-Fi?

... mismatched devices!

You would not believe how many people "upgrade" their broadband to 20+ Mbit/sec service and then complain that their computer is still only getting 1-3Mbit/sec speeds. A lot of them don't realize that older 802.11 devices can significantly reduce the performance of a modern wireless network.

Most 802.11b devices (which are still in use today) top out at around 10-11Mbit/sec, and that's under perfect conditions. Once you start adding multiple users, competing networks and outside interference, things get out of hand pretty quickly.

Here's a list of things to look for in examining your wireless network for performance issues:

- Replace the router.

If your router is over three years old, it might be time to replace it, especially if it's an older 802.11a/b model. The really old 802.11 devices, like Apple's original AirPort base station, have a lot of problems working correctly when they encounter other networks within their own service range. This can result in dropped or spotty connections and overall losses in bandwidth. Many of these first-generation wireless network devices barely worked, but they worked well enough for the few people who could afford them. Most have since been trashed in favor of more recent models, either because they started failing under the weight of other networks or simply died from various flaws or age.

- Update the firmware.

Many wireless devices have firmware chips on them that can be upgraded through software. This can help weed out networking issues that might be caused by buggy firmware, or may add enhanced features that can help your device work better under heavier loads from competing networks, interference, multiple users and various security issues.

- Upgrade all client-end networking hardware at the same time.

When putting a wireless network together, or upgrading an existing one, make sure your client devices use similar configurations (or identical ones, if possible). A single, poorly configured client device can significantly impact your wireless network's performance. The more functionally similar the network devices are to each other, the simpler it is to put together an efficient setup. For example, if you have a network consisting only of 802.11g devices and set up a router to accept only 802.11g connections, it'll run at around 54Mbit/sec. But if you have a network of random 802.11 devices and a router that supports several protocols going back to 802.11b, the network will fall back to the slowest common protocol (802.11b) and force all connected clients to run at that speed (11Mbit/sec), regardless of each client's individual configuration. That bandwidth is then divided among every connection, making the network seem much slower than it is. By keeping the client and router hardware similarly configured, the network speeds are less likely to suffer. Your maximum network performance is limited only by the hardware you use to build it.
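The fallback-and-divide effect can be sketched as a quick calculation. This is a simplified model using nominal link rates (real mixed-mode behavior involves airtime sharing and protection overhead, so actual throughput is lower still):

```python
# Effective per-client throughput when a mixed network falls back to 802.11b.
# Nominal link rates in Mbit/sec; real-world throughput is typically about half.
RATE_80211G = 54
RATE_80211B = 11

def per_client_rate(link_rate_mbps, num_clients):
    """Wi-Fi is a shared medium: bandwidth is split across all active clients."""
    return link_rate_mbps / num_clients

# Four clients on an all-802.11g network:
print(per_client_rate(RATE_80211G, 4))  # 13.5 Mbit/sec each
# The same four clients after one 802.11b device forces a fallback:
print(per_client_rate(RATE_80211B, 4))  # 2.75 Mbit/sec each
```

So a single legacy device doesn't just slow itself down; it cuts every client on the network to a fraction of what the hardware could otherwise do.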

- Secure your network.

Make sure your network hardware is secure on both the router and client end. Set up your router to use the strongest encryption protocol it supports, and use MAC address filtering to identify each piece of hardware on the network, so no one outside your client list can access it. Also, don't use DHCP to assign IP addresses; manually configure each client with a static IP. Finally, disable SSID broadcasting. This reduces the likelihood of a war-driver finding your network and tagging it for others to find.
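On a Linux-based router running hostapd, most of these settings map to a handful of config options. This is only a sketch (the interface name, SSID, passphrase and file path are placeholders, not anyone's real setup):

```
interface=wlan0
ssid=HomeNet

# Strongest encryption the hardware supports: WPA2 with a pre-shared key
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=ChangeMeToAStrongPassphrase
rsn_pairwise=CCMP

# MAC filtering: only accept clients listed in the accept file
macaddr_acl=1
accept_mac_file=/etc/hostapd/allowed_macs

# Hide the SSID from casual scans (a determined sniffer will still see it)
ignore_broadcast_ssid=1
```

Worth noting: MAC filtering and a hidden SSID only deter casual snooping, since both MAC addresses and SSIDs travel in the clear; the encryption setting is what actually keeps outsiders off the network.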

- Use the latest available network protocols.

Using protocols like 802.11g or 802.11n may significantly improve your network speeds over older ones, and may also offer some added flexibility. Unlike 802.11b and g, some of the newer protocols aren't limited to one broadcast frequency (2.4GHz); 802.11n can also use the 5GHz band that 802.11a introduced. While the broadcast frequency of your wireless hardware (5GHz vs 2.4GHz) has relatively little to do with raw bandwidth, it can indirectly improve your network performance by moving your network to a less-crowded broadcasting range. As long as you don't need to share your connection with random users on older hardware, you can isolate your network to the 5GHz band alone, effectively removing the interference from other networks within range... at least until your neighbors figure this out and start using 5GHz on their setups.

It's a short term fix, but effective.

- Move the router to a new location.

Generally speaking, placing your router at the center of the area your clients occupy will give you the best connectivity. But be mindful of other devices that can interfere with the network, either because they operate on a similar frequency or simply draw a lot of power: cell / mobile phones, microwave ovens, vacuum cleaners, CRT-based monitors/TVs, radio-controlled devices, etc... Construction materials can also degrade a wireless network's performance by acting like a Faraday cage: sheet metal, aluminum siding, metal girders, fencing / screening materials, heat ducts, metal pipes, etc...

Anyway, hopefully these tips will be useful to someone. You never know when your network might fail you next, and the cause might be something as simple as one of your kid's poorly configured video game consoles.

Comment: It goes waaaaaay further back than this... (Score 1) 139

by Bones3D_mac (#36294060) Attached to: The Machines That Sparked the Beginning of the Computer Age

There is evidence that some of the first computers ever produced existed as far back as 150 BC. A device found in 1901, called the Antikythera mechanism, is a mechanical computing device believed to have been used to chart astronomical positions. Its overall design rivals the complexity of an early mechanical watch.

Another fun item: the Japanese karakuri ningyō, or clockwork doll. These are some of the earliest known examples of robotics, going back to the 17th century. The karakuri ningyō was primarily used by wealthy dignitaries for ceremonial purposes, like serving tea. One of these clockwork dolls would be placed upon a table, holding up a small tray. When a weighted object, such as a tea cup, was placed on the tray, the weight would set the mechanics in motion, causing the doll to turn 180 degrees away from the server and begin moving toward the guest at the other side of the table to deliver the tea. Once the weight was removed from the tray, the action stopped and the mechanism reset itself for the next use... allowing both server and guest to repeatedly serve each other as a form of entertainment.

Although much of this has been replaced by electronic devices, such as the Sony Aibo and the Honda Asimo, the old-style karakuri ningyō design is still in use today, mostly as large-scale devices in factory settings: carts that move large, heavy objects, like car engines, between parts of an assembly line, as a cheap way to conserve power by using an object's own weight to move it.

Comment: Re:I've heard something like that before (Score 1) 143

by Bones3D_mac (#36054490) Attached to: Scientists Afflict Computers With Schizophrenia

Actually, I have an issue like this that makes me very uncomfortable around crowds. For example, when I go to a restaurant, I can hear every conversation going on around me with the same relevance as someone sitting next to me. It drives me nuts because it prevents me from enjoying a conversation with the people I actually care about. In that situation, I either have to process everything at once and parse out the stuff relevant to whoever I'm trying to talk to, or block out everything as white noise and not participate at all. I also get a bit nauseated from dealing with so much info at one time.

Needless to say, I don't go out much and prefer to be in locations where I control the environment. Usually keeping things dark and quiet.

Comment: It makes one wonder... (Score 1) 126

by Bones3D_mac (#36032004) Attached to: AppleCrate II: Apple II-Based Parallel Computer

What kind of impact would this have had if people were doing it back in the '70s?

Granted, this guy is mostly just using it for audio processing. (Impressively done, though... especially if you've ever experimented with audio sampling on an Apple II using self-designed software and custom-built I/O interfaces.)

What I'm curious about is whether the video output from each of these boards could be combined, either into a single high-resolution display matrix approaching VGA at a low color depth, or layered atop each other to increase output depth at the Apple II's default resolution. (Basically, something like the output of 12 machines combined into a 4x3 matrix on a single display, controlled by a 13th machine, for high-res output; or the output of all 12 machines layered, with the 13th machine controlling the alpha value of each layer, to create the illusion of a higher bit depth than the Apple II was capable of.)
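As a rough back-of-envelope check on the matrix idea (a sketch assuming the Apple II's standard 280x192 hi-res mode; the AppleCrate II's actual output path may differ):

```python
# Tile the hi-res output of 12 Apple IIs (280x192 each) into a 4x3 grid
# and compare the combined resolution against VGA (640x480).
HIRES_W, HIRES_H = 280, 192  # Apple II hi-res graphics mode
COLS, ROWS = 4, 3            # 12 machines arranged in a 4x3 matrix

matrix_w = COLS * HIRES_W
matrix_h = ROWS * HIRES_H
print(f"{matrix_w}x{matrix_h}")  # 1120x576
```

Interestingly, a 4x3 tile would land at 1120x576, which actually exceeds VGA's 640x480 in pixel count; the hard part would be the sync and stitching hardware, not the raw resolution.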

Maybe then all that shareware porno imagery every library in America once hosted might actually have been identifiable...

Comment: Pets (Score 1) 343

by Bones3D_mac (#35717296) Attached to: Do Violent Games Hinder Development of Empathy?

More likely, the simple act of owning a pet probably contributes more toward instilling empathy in a child than any video game could do to decay it. Sure, there's violence in games and movies/TV, but much of it is short-term and quickly forgotten, versus something like living with an abusive parent or similar issues.

Having a child learn about responsibility through caring for an animal is long term, meaning the child eventually develops an emotional bond with it. Children quickly learn that failing to act responsibly with an animal can have serious consequences, including the death of the animal itself. And while learning about harsh topics like death that way could be somewhat traumatizing, it will make them far more empathetic toward others than any short-term habit change like taking away their video games.

Comment: A Pointless Prediction... (Score 1) 347

by Bones3D_mac (#35557386) Attached to: Michio Kaku's Dark Prediction For the End of Moore's Law

Although I like Kaku as a scientist in general, he's not exactly immune to MythBusters-style "foot in the mouth" science.

One major thing he overlooks is the high likelihood of cloud computing taking over the role of the "processor" in most PCs well before then. This isn't just a fad technology that'll go away in a few months; it's probably going to be the next evolution in computing since the introduction of the world wide web. Not only will it take over processing tasks, it'll also change the very way software is distributed, by letting companies post a "master" copy of a program onto a cloud server and then rent out usage time for an instance of the software in a user's cloud space. This way, the developer doesn't have to waste months of development time tracking down bugs specific to different system configurations. It would allow the developer to focus solely on the software's performance within the cloud, updating the "master" copy only as needed. The user would never need to worry about the downsides of installing software, such as invasive DRM, software incompatibilities or malware, because the software would never actually be running locally on their system.

Likewise, processing power would also be rented out. A larger portion of CPU time could be purchased for an extra fee, on a sliding scale. Once cloud computing becomes that flexible, one only needs the right client software to access a cloud VM interface, and you could theoretically reach it from any machine with enough local horsepower to display a streamed viewport of the user's workspace. (Probably any system from 1999 to the present.)

After that, processing power becomes largely irrelevant unless you are working on something seriously data intensive beyond anything we can probably comprehend now.

Japan

+ - Nuclear Power Plants vs Coronal Mass Ejections

Submitted by Bones3D_mac
Bones3D_mac (324952) writes "Could a major coronal mass ejection from our Sun result in planet-wide nuclear disasters similar to what we witnessed in Japan? Not only could it cause the "station blackout" effect we initially heard about, but what about the potential for loss of communication lines and overall mass confusion as our usual electrical and electronic devices suddenly go dark on us for seemingly no reason whatsoever?"

Comment: Google The Brain! (Score 1) 143

by Bones3D_mac (#35451528) Attached to: New Hardware Needed For Future Computational Brain

Ok, I admit this sounds completely absurd at first, but there are an awful lot of similarities between the neural pathways of the brain and the countless ways websites link to each other, both directly and indirectly through their contacts, and their contacts' contacts, and all the contacts that eventually show up in an endless cycle of recursion, etc...

Now, Google has to wade through all this, and constantly correct and update itself, to ensure it can get a user to the web page that best matches the search criteria.

You can't tell me that as data on the web becomes increasingly dynamic, with all these forums, blogs, news sites and endless chat/social networking sites constantly popping up and then dying, there isn't at least some algorithm they employ that could be applied to neuron connectivity and communication.

You'd think it'd just be a matter of passively connecting to a neuron to sniff its traffic, observing which nearby neurons carry the signals to and from it, then listening to those neurons and so forth, then using machine learning to break down the patterns enough that Google's setup could follow them... i.e., determine which neuron is responsible for which patterns at what frequency, etc...

Comment: Shouldn't we be more concerned about... (Score 1) 386

by Bones3D_mac (#35377248) Attached to: DHS Eyes Covert Body Scans

... how to make the questionable crap we post today permanently go away on demand so it doesn't come back to bite us in the ass in the future?

If this person wasn't even aware that data could be erased from the internet, you can bet it hasn't occurred to them that there are far greater dangers in data that never goes away on its own. That edgy statement that made you look cool to your friends in 6th grade might land you on some company or government blacklist, making it near impossible to get a job because some Watson-derived human resources bot assigned you a risk-assessment percentage that can't be overturned by human hands, any easier than getting off a sex offender list even after the case that landed you on it was overturned, etc...

Oh, and have fun when those Watson bots aren't just assessing you by your own actions, but by your associations, both direct (communication) and indirect (shared philosophies derived from each person's actions, linking you to people you've never heard of). I'm sure there's at least one serial killer out there you might be linked with based on interests alone.

Comment: Red Dwarf: The Next Generation? (Score 1) 228

by Bones3D_mac (#34957744) Attached to: New Red Dwarf Series Threatened By the Twitter Era

Honestly, I can't see how this is going to work. Are they bringing back the old cast or starting over with a new one? Is it going to be the same story following "Back To Earth", or a complete reboot like the recent "Star Trek" feature film versus the TOS version?

If it is a set of new characters, who are they? Lister's kids? A Rimmer Jr. created by Rimmer and instructed to activate in 20 years? Cloned versions of the Cat, only younger, with Kryten left to babysit them all? And what about Holly... crashed, hologrammed like Rimmer was, or replaced by Kryten as Red Dwarf's main computer?

Personally, I think Red Dwarf needs to stay dead. The entire series is on Netflix now if I get the urge to watch it again, and I doubt the series could survive in the same context it did in the 90's. Hell, just look at how tame The Simpsons has gotten in the last 20 years compared to the early years. They can't even show a character's butt in current episodes without a disclaimer or a time shift, something that was perfectly acceptable early on. (Yet the syndicated episodes are still shown with butt cracks intact.)

If you really want new Brit space humor, I recommend checking out "Hyperdrive" (also on Netflix). Granted, the show is still rough around the edges, but there is potential.
