Comment "Urban Mines" (Score 1) 36
So, does that mean the guy who broke in last year and stole my old TV wasn't actually a thief, he was an "urban miner"?
Back in 2021, the Zero Electric Motorcycle company tried charging buyers extra, post-sale, to enable features that were built in but electronically disabled. It really irritated the customer base and hurt their reputation, which for a small company could be fatal. I just noticed this year's 2023 models had a price increase, but they now include all the features that were previously options.
They already do this. Don't buy one of the 3 pre-configured "bundles"; pick the option to configure your own, and choose "no OS".
At my last job they ran SLES 11/12 with a BTRFS root filesystem on all servers, which caused occasional problems with unbalanced B-trees. After too many critical servers crashed during the work day and required UNIX admin team support, they ended up setting up a cron job everywhere to rebalance the filesystem regularly.
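For anyone curious what that rebalance job looks like: below is a sketch of the kind of crontab entry that does it. The schedule, mount point, and usage thresholds here are my assumptions, not what that shop actually ran.

```shell
# Illustrative /etc/crontab entry (schedule, path, and thresholds assumed):
# every Sunday at 03:00, rebalance only data/metadata chunks under 50% usage,
# so BTRFS can compact half-empty chunks before the filesystem "fills up".
0 3 * * 0  root  /sbin/btrfs balance start -dusage=50 -musage=50 /
```

The -dusage/-musage filters keep the balance cheap by skipping chunks that are already well packed; a full unfiltered balance can take hours on a large volume.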
I also helped administer our Splunk servers, and Splunk didn't support or work on BTRFS, so we always requested an extra drive with Ext4 to store Splunk databases.
I never saw the benefits of BTRFS solve any real-world problem in the years I worked there.
I worked on Cingular Wireless' brand-new National Networks team back in 2000-2004, where we ran their Nokia WAP Gateway and related upstream systems for years. A WAP gateway was essentially an HTTP proxy that translated the UDP-based WAP protocol from the flip phones into regular TCP-based HTTP for browsing the web. These were the days when 9.6 Kbps data connections were the best you could expect, so WAP was needed because it was far more efficient than HTTP. Once wireless networks reached 2G and got faster (e.g. 64 Kbps), cell phones and PDAs started including regular HTTP-based web browsers, and WAP usage died quickly, or so I thought.
Wireless providers originally used WAP to create their own little "walled garden", where they could extract third-party money from content providers (e.g. ESPN, Yahoo News) in exchange for getting on the main menu of all their customers' phones. HTTP was designed to be decentralized, and once it took over, WAP no longer provided any added value. If we can kill Flash in 2020, why didn't we kill WAP 10-15 years ago, when the last 1G networks were phased out?
Interestingly, now, in 2020, we're apparently right back where we started, with Google, WeChat, Facebook, and others locking everybody into their own much larger walled gardens, convincing us to stay within their "app" instead of freely browsing the open web. The content providers have focused on leveraging ad networks to track every single thing we do on the open web, in exchange for access to so-called "free" content. Information is king, more valuable to them than anything else.
Enough pontificating. Go back to Reddit, Facebook and TikTok and enjoy your self-imposed prisons.
Peace out
How are they determining the OS? It can't just be the user-agent string. Does something else in JavaScript reveal that info?
Has NO ONE studied history? If you make 50% of the poorest, gun-toting folk unemployed, there won't be any rich folk (or AI) for very long.
Do you want Skynet? 'Cause THIS is how you get Skynet!!
I have a gold Pebble Time Steel, and people are always asking me if it's an Apple Watch. I just tell them no, it's a Pebble, which is cheaper, has a battery that lasts 7 days instead of 1, and has been around for years. It doesn't have a touch screen, but nobody cares about that useless feature anyway.
I'm addicted to wrist notifications, I can easily read short texts or see who is calling (and answer or send them to voice mail) without digging my iPhone out of my pocket. Great when you have a bluetooth earpiece.
Developers too often seem to think they know everything, when (especially on large teams) they often have no idea what it takes to bring their ideas to the real world. It takes serious designers to develop a scalable app, even if lots of people think they know how. I work in production support of multiple websites, meaning I have to clean up after the mistakes developers make on a daily basis. The support folks who have to write patches for our products often grieve over the situations the original developer placed them in. It often takes a major rewrite to fix many performance issues, because the original programmer never imagined all the different situations their code would be used in. Prod support is where the real issues are discovered and solved. Accept it and move on.
http://www.amazon.com/SDR-Starter-Bundle-64MHz-1700MHz-EMI-Protected/dp/B008V5NGDY/ref=sr_1_fkmr1_2?s=electronics&keywords=rtl+sdr+starter+kit
$50 at Amazon, although it's a little late to be shopping online today.
Like many of you, I've been learning and using Perl since 1.0 was first released on CSU. I was privileged to contribute to the early versions by porting Perl to the various platforms I had access to at my employer (a compiler company) at the time. 25 years later, I still find Perl useful in my current job. Love it. Thanks Larry and everyone else!
SGI's automatic parallelizing software came from Kuck and Associates, Inc. (kai.com), where I worked for 8-1/2 years. One disappointing fact we learned was that the only people who cared enough about parallelizing their software to analyze their code and modify the source for speed were either research scientists (of which there were relatively few), who mostly wanted quicker and cheaper results (because renting time on supercomputers costs $$), or marketing departments of computer hardware manufacturers (of which there were even fewer), who only wanted to advertise higher SPECmark numbers for their hardware. SGI was the only manufacturer who shipped our product with every C and Fortran compiler they sold. IBM, DEC, and HP only sold it as an option, but all used it internally to speed up their own benchmark numbers.
Automatic parallelizing is tough, tougher than you think. It's nearly impossible without a human performing program analysis and adding source code directives to tell the compiler about data dependences.
Why does this article use the term "multi-server microkernel OS"? I don't see anything in the article, or anywhere else about Genode, referring to multiple servers. Sounds like they're just trying to redefine the term "microkernel".
MESSAGE ACKNOWLEDGED -- The Pershing II missiles have been launched.