The old MadCatz Panther XL was exactly this.
So interestingly, on Feb 1st at 12:04 AM, my network went nuts in my house. I have Comcast Business Class service and was actually on the phone with them yesterday morning, with no good results. I have a Comcast-provided SMC cable modem/router, with my own D-Link Gamer Lounge router attached to that.
Basically the problem is this: when I have two computers attached to my router now, the internet becomes unusable on all the computers. I can see the ethernet lights show a Gigabit connection (orange) and green traffic, but then it blinks out, repeating every 10 seconds or so.
The computers are Windows 7, though there is an Ubuntu VM on the main computer. Any two connected to the router cause problems, but if the cable modem isn't connected the D-Link works fine (no internet, obviously), so something is happening between the D-Link router and the cable modem's router. Nothing on my network changed, certainly nothing at specifically 12:04ish AM CST on Tuesday morning.
Comcast Modem/Router: SMC8014
D-Link Router: Gamer Lounge DGL-4300
Right now I have the second computer plugged into the SMC's router ports, and it works fine; it just can't see through the D-Link to talk to the other computers, obviously. Connecting over WiFi does not exhibit the same problems when two computers are connected on the D-Link (one via ethernet, the other via WiFi).
I am wondering if this trial has something to do with my problems. Or maybe it is just time for a new router...
Yea, it was supposed to work from the beginning, but there was a bug in Flash 9.0.48 (or whichever version it was) that was resolved in 9.0.115 or some such nonsense. I do remember having to do your workaround like 3 years ago, for sure.
You don't need an .flv extension; .mp4 works fine.
Worked fine for me for years and years, starting pre-phone with the Compaq Aero, thru my last phone, the XV6700. It was either that or Palm, which was a comparable product. Neither was exceptional but they def worked.
I do have a Droid now tho, since the day it came out on Verizon. Not saying MS didn't squander an opportunity with it, but it did work.
Or maybe they have thought about doing that and realized that it isn't easy to implement, since many of the files are already broken up into various RAR files and in many cases password-protected. All 'file detection' would do is push uploaders a little further down the 'hiding your files' curve. JDownloader or whatever would undoubtedly handle the added complexity seamlessly for the downloaders.
Except that isn't true... Sony had already announced the removal of OtherOS from the PS3 Slim before Geohot started trying to break the hypervisor.
Aug 2009 OtherOS removed from PS3 Slim
'End of 2009' Geohot begins to look for exploits.
It is true that Sony removed it from existing PS3 Fats after that, but the damage was already done. When your PS3 breaks and you need a new one, the Slim is the only style available now, unless you accept that people should be forced to go to eBay to buy PS3 Fats at ever-increasing prices as supply dwindles.
Profit maximization isn't necessarily the problem IMO. One issue is that there are externalities that are not factored into the cost of production, such as the impact of pollution or disposal. Say you had two LCD monitors, one using a lot of arsenic and mercury that cost $100, and another that used less damaging materials for $200. If the externalities were factored in, the true cost the consumer might pay could be $300 for the first and $250 for the second.
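To make the arithmetic explicit, here is a toy sketch of that monitor example. All figures are the hypothetical ones from the paragraph above; the function name is just for illustration.

```python
# Toy illustration of the monitor example: adding the external
# (pollution/disposal) cost to the sticker price flips which monitor
# is actually cheaper to society.
def true_cost(sticker_price, external_cost):
    """Total social cost = what the buyer pays + costs pushed onto others."""
    return sticker_price + external_cost

toxic_monitor = true_cost(100, 200)   # $100 sticker, $200 in externalities
cleaner_monitor = true_cost(200, 50)  # $200 sticker, $50 in externalities

print(toxic_monitor)     # 300
print(cleaner_monitor)   # 250
# The "cheap" monitor turns out to be the more expensive one overall.
print(toxic_monitor > cleaner_monitor)  # True
```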
The other main issue, somewhat related to externalities, I would call 'incomplete information for imperfect consumers'. If consumers can't know the mercury/arsenic content of an LCD, they can't make a real choice. If they don't have the education to even understand why mercury/arsenic would be bad, that is another problem. And if they don't have the ability to know the technical reasons why one heart stent is better or more expensive than another, they can't make an optimal decision.
But our lives are littered with these non-optimal decisions. And that is why the government has stepped in to force certain industries/products to incorporate those externalities, or established standards and testing to ensure that the quality of the product is as described when the consumer is unable to determine this themselves.
If you can think of some other way besides government to include these obvious economic factors into the actual costs of products, then feel free to pass that along. Though I would guess it would end up looking like government in the end.
The issue is that agencies and their creative teams aren't on the same page. Not too long ago, I was asked why my video processing code wouldn't work with iCompany's video. I asked for the video they were trying to upload, and it was a 750MB ProRes two-minute clip for a pep-club musical TV show. Trying to explain to iCompany that their own ProRes format was only supported in their program, and that they needed to get us a more standards-compliant version, was a two-day ordeal.
Lol, I'm not 'skirting close to anything'. If anything, I am saying that agencies, or the people who perform the agency function at companies, are clueless. I mean seriously, who tries to upload a 750MB ProRes clip to Facebook, throws a fit when it doesn't work, and then takes two days to come up with a more appropriate format, despite the fact that FCP is made by their own company? Oh right... they do.
I am sure the editor who made the clip was very capable, despite the Gleeful subject matter, but that doesn't mean the people upstream are, or even understand what a codec is.
Before that, it was people trying to upload Apple Intermediate Codec vids for use on the web. 'But it is a MOV... it should just work!'
I am exaggerating to some extent on the 'throwing a fit' aspect too. It was just an issue that got thrown in my lap with a 'make it work' directive, and there wasn't anything I could really do about it.
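For what it's worth, a ProRes .mov can usually be converted into a standards-compliant H.264 MP4 with ffmpeg. Here is a minimal sketch in Python; the filenames are placeholders, and it assumes ffmpeg is installed and on the PATH.

```python
import subprocess

def prores_to_h264(src, dst, crf=23):
    """Build an ffmpeg command that converts a ProRes .mov into a
    web-friendly H.264/AAC MP4. Filenames are placeholders."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264",          # encode video as H.264
        "-crf", str(crf),           # quality target; lower = larger/better
        "-c:a", "aac",              # encode audio as AAC
        "-movflags", "+faststart",  # move the moov atom up front for web playback
        dst,
    ]

cmd = prores_to_h264("clip_prores.mov", "clip_web.mp4")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run ffmpeg
```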
Accidentally, probably not. But did the programmer follow good development practices? Yes. In general, programmers are taught to store all inputs. Imagine the following scenario: The programmer DIDN'T log anything (even the wifi data with personal info stripped), and a problem was found with the processing algorithm. Do you think the programmer would have kept his job if the only solution was to send out the trucks again and redrive the routes? The natural response by a programmer is to log the incoming data to avoid that scenario.
It doesn't have to be a conspiracy. It is probably just something that didn't come up in review/planning and the programmer didn't realize it was sensitive since it was just radio data.
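The "log the inputs, but strip the sensitive bits" practice described above might look like this in miniature. The field names are invented for illustration; this is not anyone's actual code.

```python
import json

# Field names here are invented for illustration.
SENSITIVE_FIELDS = {"payload", "client_mac"}

def sanitize(record):
    """Return a copy of the record with sensitive fields dropped, so the
    rest can be logged and the processing re-run without a new drive."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

raw = {
    "timestamp": "2010-02-01T00:04:00-06:00",
    "bssid": "00:11:22:33:44:55",  # access point MAC: broadcast radio data
    "signal_dbm": -61,
    "payload": "deadbeef",         # captured frame bytes: should never be stored
}

clean = sanitize(raw)
print(json.dumps(clean, sort_keys=True))  # keeps bssid/signal, drops payload
```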
A number of people have mentioned the data, etc. But a lot of it has to do with managing and configuring your apps... I have 3 computers I use regularly and two others less so, and it is a serious pain in the ass keeping all of my settings and apps up to date across them. Enough so that I use RDP/VNC-type programs so that I only really have to maintain one copy of most of my apps (much less deal with licensing). Not saying Chrome OS solves all of these problems, or even any of them, but this is a good reason why moving things to the cloud can be considered progress: spending more time working with your data, rather than working on the programs that work on your data.
If at any time, I could:
* trade in my phone for another phone at no monetary cost
* switch to any carrier at any time at no monetary cost
* replace all of the apps I had previously purchased on my phone with no monetary loss
Maybe then I could agree with you. But in the real world, there are financial barriers to changing phones. Maybe your argument makes sense if we were talking about iPods, but it doesn't hold for cell phones.
Adding features does not necessarily increase functionality -- it just makes the manuals thicker.