
Comment Re:DoD (along with everyone else) wants clean orbi (Score 1) 253

I guess "armor" would help to a certain extent depending on the circumstances, but remember, the relative velocity between a satellite and a tiny piece of space debris can *far* exceed the velocity of a bullet on impact down here in the thick part of the atmosphere, because you don't have all that drag.

A piece of space debris the size of a fingernail can cause as much destruction in orbit as a far heavier bullet does in atmosphere. I don't think our materials science can produce anything strong enough to withstand the impact of even a small actual bullet (or other matter of equivalent mass) traveling at a typical orbital relative velocity.
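
To put rough numbers on that (a back-of-the-envelope sketch of my own; the masses and speeds are ballpark assumptions, not measured figures):

```python
# Back-of-the-envelope kinetic energy comparison. The masses and speeds
# here are my own ballpark assumptions, not measured values.
def kinetic_energy_joules(mass_kg, speed_m_s):
    return 0.5 * mass_kg * speed_m_s ** 2

bullet = kinetic_energy_joules(0.008, 400)     # ~8 g pistol round at ~400 m/s
debris = kinetic_energy_joules(0.001, 10_000)  # ~1 g fleck at ~10 km/s closing speed

print(f"bullet: {bullet:>9,.0f} J")  # ~640 J
print(f"debris: {debris:>9,.0f} J")  # ~50,000 J: 1/8th the mass, ~80x the energy
```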

Comment Space Junk Reentry question (Score 1) 253

Question for those of you with some knowledge of how satellites are set up, where they're put in orbit, probable trajectories, etc., keeping in mind that my "knowledge" of orbital mechanics is only slightly more sophisticated than that of an enthusiastic Kerbal Space Program player:

Is the space junk created by this explosion likely to remain in orbit for a very long time (on the human-lifespan timescale), or is there a high probability that most or all of the debris will reenter the atmosphere and burn up sometime soon?

If we assume debris expanding uniformly from a point at the satellite's center of mass, you'd get a "sphere" of debris: some of it receives a fairly large push down toward the atmosphere and likely reenters very soon... but what about the pieces given a large delta-v *away* from the atmosphere? What'll happen to those pieces?
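
For what it's worth, here's as far as my KSP-grade understanding gets me (a toy sketch using the vis-viva equation for a fragment kicked radially outward from a circular orbit; the altitude and delta-v are made-up numbers):

```python
import math

# Toy model: a fragment on a circular orbit gets a purely radial (outward)
# delta-v. The altitude and kick size are made-up illustrative numbers.
MU = 3.986e14          # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0  # mean Earth radius, m

def orbit_after_radial_kick(alt_m, dv_m_s):
    r = R_EARTH + alt_m
    v_circ = math.sqrt(MU / r)             # circular orbital speed
    v_new = math.hypot(v_circ, dv_m_s)     # radial kick adds in quadrature
    a = 1.0 / (2.0 / r - v_new ** 2 / MU)  # vis-viva -> new semi-major axis
    h = r * v_circ                         # angular momentum is unchanged
    e = math.sqrt(max(0.0, 1.0 - h ** 2 / (MU * a)))
    return a * (1 - e) - R_EARTH, a * (1 + e) - R_EARTH  # perigee, apogee

peri, apo = orbit_after_radial_kick(800_000, 200)  # 800 km orbit, 200 m/s kick
print(f"perigee {peri / 1000:.0f} km, apogee {apo / 1000:.0f} km")
# -> roughly "perigee 611 km, apogee 999 km": even the *outward* kick lowers
#    the perigee, because the fragment keeps passing back through the blast
#    point. A lower perigee means more drag and (eventually) faster decay.
```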

Comment Just good ol' fashioned (in)compatibility (Score 4, Informative) 136

Software fragmentation at a high level hasn't been all that scary a specter for Android yet. Hook into a few core APIs that pretty much work everywhere, and off you go. No whole-system Android fork has yet managed to challenge mainline Android with any meaningful market share.

The problem is in the details of the hardware and, to a lesser degree, the software implementations. Screen resolutions; GPU capabilities; the performance gap between the slowest and fastest (non-obsolete) devices on the market; highly variable amounts of storage, VRAM, and network bandwidth; limited vs. unlimited data plans; the amount of other crap (that may interfere) the user or manufacturer has already installed on the phone; etc., etc.

A lot of devs are starting to whitelist their apps for specific phones, or at least for specific SoC make/model/generations. Your OpenGL fragment shader may run fine on a Qualcomm Snapdragon platform with an Adreno GPU, but crash the app or the whole system on ARM Mali. Your game may get 30-60 FPS reliably on a modern Tegra GPU, but deliver a slideshow on a Droid Mini from late 2013.
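
The resulting gating logic tends to look something like this, sketched in Python for brevity (my own illustration, not anyone's actual shipping code; the device names and thresholds are made up for the example):

```python
# Sketch of the capability gating devs end up shipping. Illustrative only:
# the blacklist entries and thresholds below are assumptions for the example.
SHADER_BLACKLIST = {"Mali-400 MP"}   # GPUs where the fancy fragment shader crashes
MIN_RAM_MB_FOR_HIGH_DETAIL = 2048

def pick_render_path(gpu_name: str, ram_mb: int) -> str:
    if gpu_name in SHADER_BLACKLIST:
        return "fallback"       # skip the shader entirely on known-bad GPUs
    if ram_mb < MIN_RAM_MB_FOR_HIGH_DETAIL:
        return "low_detail"     # runs, but with the knobs turned down
    return "high_detail"

print(pick_render_path("Adreno 330", 3072))   # -> high_detail
print(pick_render_path("Mali-400 MP", 1024))  # -> fallback
```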

And that brings me to my second point: the hardware advances too quickly. A lot of people expect their smartphone to last them 4 years, maybe longer. But if you look at how far phone specs have come since 2010, it's pretty ridiculous. Most 2014 Android games and even non-trivial business apps won't even *launch* on a phone with specs 1/10th as capable as the state of the art.

These problems are hard enough to solve on their own. Most devs don't even have time to think about supporting other core systems with forks or replacements of the core Android APIs.

Comment Re:Department of Fairness can not be far behind (Score 2) 631

If the Federal Government can't determine what's fair, then who can?

Is it fair for someone to have exactly one choice of "broadband" ISP, when that choice is extremely unreliable, outdated, overpriced ADSL?

Is it fair that corporations get to ignore what customers want and sell only what's most profitable for them, paying absolutely no attention to customer satisfaction, with a three-pronged "bend over and take it / go without Internet / move house" ultimatum?

If the Federal Government won't stand up for its citizens, what recourse do citizens have left? Organize and march up to some corporate office and demand (peacefully or otherwise) to get what they want? Quit their jobs and uproot their lives to move to one of the handful of places in the entire country that actually has good Internet?

Connecting to and participating in the global economy shouldn't be a privilege reserved for the upper-crust elite. It should be accessible to everyone. Hell, there is an *enormous* financial incentive to make it so, since without that connection, you won't sell nearly as much stuff on the 'net. Games, video, software, you name it.

You corporati would gladly tear down the national highway system to avoid paying taxes on roads, even knowing full well that without roads, people won't be able to drive to Best Buy or Target or K-mart or Walmart to buy your shit.

Infrastructure is a special type of good: it's a multiplier on GDP, productivity, and the economy as a whole, and it deserves special protection. Ever since man discovered the mechanical lever, we've been using infrastructure to do more than we could without it. The capitalist system has a significant weakness in that, left completely unregulated, no one will pay for infrastructure. Categorizing Internet service as infrastructure is exactly the move that needed to be made. IN PRINCIPLE.

Now what remains is to see what actual changes fall out in practice. The principle of the matter and the actual implementation may turn out to diverge considerably, which would be unfortunate. But leaving the system as-is would all but ensure that the current bad state of affairs continues, since the old way was backwards in principle, let alone in practice.

Comment My Mon Cal sense is going off... (Score 3, Interesting) 631

"IT'S (probably*) A TRAP!"
  - Rear Admiral Akquixotic of the Mon Calamari

*: There's a small chance that this will end up actually helping consumers. A broken clock is right twice a day, and even an FCC subject to regulatory capture occasionally does things that benefit the common man.

For example, the Block C Open Access provisions on Verizon's and AT&T's LTE bands (or at least some of them) are what prohibit these carriers from blocking tethering or the use of custom devices. Any FCC-certified device, rooted or not, tethering or not, can be used on those bands, and there's nothing the carrier can do to stop it without breaking the law.

Those provisions have been a lifesaver for many customers of these two carriers who want to use the LTE on their phones to tether a laptop on the go, but don't want to pay extra or buy dedicated hardware for it. So the FCC has definitely helped in a pragmatic sense with those rules.

Then again, I'm sure the industry coalitions have fully formed lawsuits written up, signed, sealed in envelopes, and just waiting to be mailed now that this decision has hit. Who knows how long it'll take for the results to trickle down through carrier policies and plan offerings to the everyman?

Comment Re:Overlooking one small detail... (Score 2) 71

Also, 100 MHz is *a lot* of spectrum to allocate to a single client, given how little spectrum is currently available. They'd have to free up a lot of old spectrum used for obsolete stuff like 2G voice and 3G data so it could be repurposed for 5G; realistically, the only other way to pull this off would be to increase tower density. For comparison, a typical LTE carrier allocates 1.4 MHz to 20 MHz to a given client, so 100 MHz is a factor of 5 beyond the widest LTE deployments today, and a significantly larger factor beyond LTE running on narrower widths. 100 MHz is just too much to ask.

We can't just manufacture more bandwidth. Once the usable spectrum is allocated to something, we either have to wait until that technology goes obsolete, then deallocate and repurpose it, or invent newer and better transceivers that can reliably transmit and receive over a previously unused range of frequencies (factoring in problems like building penetration, which gets harder at higher frequencies). Absent such advances in transceiver technology, we're stuck with the finite frequency ranges we use today -- at least for long-distance cellular.

So while it's quite plausible they could allocate 100 MHz for this technology, maybe even as much as 500 MHz, each tower would only be able to serve far fewer clients at a time than we can today.
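
The arithmetic is brutal (my own illustration; the 500 MHz total is an assumption, and real cellular schedulers time-share channels among users rather than dedicating one per client):

```python
# How many clients could hold a dedicated channel at once, per tower?
# Illustrative division only: the 500 MHz total is an assumption, and real
# cellular schedulers time-share channels rather than dedicating them.
TOTAL_SPECTRUM_MHZ = 500

for channel_mhz in (1.4, 20, 100):
    clients = int(TOTAL_SPECTRUM_MHZ // channel_mhz)
    print(f"{channel_mhz:>5} MHz channels -> {clients:>3} clients at a time")
# ->   1.4 MHz channels -> 357 clients at a time
# ->    20 MHz channels ->  25 clients at a time
# ->   100 MHz channels ->   5 clients at a time
```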

Problem is, there will always be people who are perfectly happy with their 3G or 4G devices and resist upgrading -- people who want to stay on their current contracts and keep using the service already available. These folks give carriers a motivation to retain existing spectrum for legacy protocols, inhibiting its repurposing for the next generation. It may be 50 years before the regulators officially declare, say, the 700 MHz LTE band free for a new auction.

2020 seems very aggressive to me, mainly for policy reasons, not so much technical ones.

Comment Re:Oblig. XKCD (Score 1) 716

In addition to what AC said, there's another possibility. Even if the design of an existing solution is adequate to solve the problem, and even if someone isn't looking to create another standard simply to unify the existing ones, there are still an enormous number of *non-technical* reasons why adopting it might not be desirable.

For one thing, there will always be license purists who insist that absolutely everything in their favorite distribution comply with their specific license of choice. The two biggest offenders are the all-GPL camp and the all-BSD camp, but there are probably others too. Even if they evaluated an existing solution purely on its technical merits and found it superior, they'd still be motivated to start their own project, because Licensing Matters. Licensing matters at least a little to most people, but for some it's the #1 priority -- and if those people also have programming skill, they'll start their own project.

There are other non-licensing issues, too. Maybe the primary maintainer of the foo project requires Contributor License Agreements, and so-and-so thinks CLAs are the work of the devil. They don't want to fork (even if the license allows it) because they think it would cause too much infighting between the communities. So they go and start their own project.

One thing that seems to help is to separate *specifications* from *implementations*. This works marvelously for the networking stack (from IP to TCP to HTTP), USB device classes, etc. Implementations are free to license under the GPL, BSD, whatever they want. As long as the interfaces between your widget and the other components of the software ecosystem are standard and consistent, people who want a specific license are happy to write their own implementation. And multiple conforming implementations don't suffer from the proliferation problem, because with the right level of cooperation and standardization the compatibility issues can be ironed out to be very minor.
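
In code terms, the idea looks something like this (a toy Python sketch; the interface and both implementations are invented for illustration):

```python
from abc import ABC, abstractmethod

# One *specification*, many *implementations*. Callers depend only on the
# interface, so each implementation can carry whatever license its authors want.
class Checksum(ABC):                       # the "specification"
    @abstractmethod
    def digest(self, data: bytes) -> int: ...

class GplChecksum(Checksum):               # hypothetical GPL-licensed implementation
    def digest(self, data: bytes) -> int:
        return sum(data) % 256

class BsdChecksum(Checksum):               # hypothetical BSD-licensed implementation
    def digest(self, data: bytes) -> int:
        total = 0
        for byte in data:
            total = (total + byte) & 0xFF
        return total

def verify(impl: Checksum, data: bytes, expected: int) -> bool:
    return impl.digest(data) == expected   # identical behavior either way

print(verify(GplChecksum(), b"hello", 20))  # True
print(verify(BsdChecksum(), b"hello", 20))  # True
```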

It's a wonder we have standards as well-established as HTTP, given the number of reasons people find to justify proliferation.

Whether you agree or disagree with those justifications, like the examples above, depends on where you stand on the axis between pragmatism and purism for the issue at hand. But regardless of your opinion, people will nonetheless use these reasons to fork, and hence proliferation continues apace.

Comment Re:Alien life (Score 2) 52

Would we be so fortunate as to do so this quickly, though? Six hundred-odd years ago, almost no one was aware of the full extent of the planet's land masses, much less that there were actual people living on those other land masses. After that settled down, not a lot happened over the next several hundred years in terms of expanding the reach of human life or discovering new civilizations. Then, suddenly, in the '60s, we're extraplanetary.

It would be amazing, but unlikely IMHO, for a single generation of people to live to see both Apollo 11 and the discovery of extraterrestrial life. I think we're going to have to look a lot farther, and for a lot longer, before we bump into anyone out there.

Comment Re:Tor and systemd? (Score 1) 53

Tor's integration with systemd, if any, would be very, very small. Basically, systemd would be responsible for managing Tor's start/stop cycle and collecting its log output.
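
A minimal unit file covering that would look roughly like this (my own sketch of the general shape, not Fedora's actual packaged unit):

```
# /etc/systemd/system/tor.service -- minimal sketch, not the distro's real unit.
# Enable and start it with: systemctl enable --now tor.service
[Unit]
Description=Anonymizing overlay network for TCP (Tor)
After=network.target

[Service]
Type=simple
ExecStart=/usr/bin/tor -f /etc/tor/torrc
Restart=on-failure

[Install]
WantedBy=multi-user.target
```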

This is entirely optional, though. You can always run Tor without it being integrated into systemd's service management facility at all. If you need it in the background and headless, just run `screen -mdS tor [tor_cmdline]`.

I don't believe Tor runs automatically on a default Fedora install. You'd have to enable it yourself.

Comment Re:Does anyone care what RMS thinks any more? (Score 1) 253

While it's true that free distribution makes it difficult, if not impossible, to make money on software *distribution*, that does NOT mean it's impossible to turn a profit producing software!

One way is to profit from other aspects of the technology ecosystem. That includes selling support (or "patches and enhancements delivered for a price"), which is plenty profitable if you ask Red Hat. Another approach is to sell things that are inherently not copyable at zero cost, like hardware. I don't mean literally selling hardware like Intel does; I mean hosting a cloud platform or something like it, where you install your software and sell it as a service: Microsoft Azure, Amazon EC2, Google Compute Engine. Yet another approach is to sell content, a la Spotify or Amazon Video on Demand or iTunes.

Companies can also amortize software development as a cost of doing business. If every company does this, not only does it greatly increase net utility (think of all the individuals who "just want to use it one time" but can't stomach the $1,999 sticker shock of enterprise software), it also reduces the investment needed from each individual company, because every company is contributing. From an individualistic perspective, each company has an incentive not to contribute; but if *no one* contributes, there will be no software commons, and every company will have to reinvent the wheel itself (or pay another vendor exorbitant license fees to do it for them).

It's entirely possible for a large company like Google, Amazon, Red Hat, Microsoft, or Adobe to open source its software while still turning a net corporate profit by selling other things that are not inherently copyable -- things that integrate closely with and use that software. At the same time, by making the software open source, it can benefit from the long tail of drive-by patches, which reduces the overall cost of its software investment.

It's simple. Aim for net utility as a principle of doing business. Reduce your assets to their essence and sell them on their essential properties. Software is essentially copyable, so let it be copied. Ever hear the phrase "software wants to be free"? So let it be free. Focus your profit-making on things that can't be duplicated for the cost of a few megabytes of data, and help build a better world by contributing to the software commons (and eating your own dogfood if you're able).
