Back when the iPhone was introduced, I was convinced that within 10 years most computing would be done this way: connecting your portable computer (smartphone) to a dock that turned it into your home computer. I'm surprised that this idea never gained traction.
I think there have been a few reasons for this.
I think the biggest one is that nobody could meaningfully agree on a form factor. Now, *I* always thought that a great option would be a 'zombie laptop' that had a keyboard, trackpad, webcam, and a battery, with a slot to slide your phone into. The phone would connect to the peripherals, giving you a 12" screen and a keyboard while charging the phone in the process.
The devil, of course, was in the details. Even if Apple made such a device and molded it to the iPhone, the problem then becomes that a user couldn't put their phone in a case, or it wouldn't fit in the clamshell's phone slot. There would also need to be adapters to fit the different sized phones, or different SKUs entirely with per-device slots, which would also pigeonhole Apple into a particular physical form factor. That begets the "use a C-to-C cable" option, which is better, but ergonomically annoying if one isn't sitting at a desk. A wireless option solves both of these problems, but kills both batteries in the process. Finally, there's the price point: the cost to the end user would need to be low enough that it doesn't just make sense to own two devices, AND first-gen owners would likely feel some kind of way if upgrading their phone meant buying a new clamshell too. It works well on paper, but pretty much any real-world testing would show the shortcomings pretty quickly.
Suppose that was solved somehow... While the Samsung Fold phones are helping justify the time spent adding a multi-window interface to Android, try installing Android-x86 in a VM for a bit and watch what happens. It's been a while since I tried, but the experience was pretty bad - the inability to open e-mails in new windows was particularly infuriating; many apps take exception to having multiple concurrent instances running side by side, and window focus gets pretty tricky to navigate. It *can* be done, but it ultimately felt like all-compromise, no-improvement.
Finally, there *is* such a thing, at least to an extent. Many, MANY apps are just frontends on a website. iCloud is like this, the whole Google ecosystem is like this, Salesforce is like this...for a solid number of apps, there is a browser-based frontend that works just as well, if not better in at least some cases. Data is commonly synced with Google or iCloud or Dropbox. The number of apps that are worth running on a phone, without a desktop or browser analogue, that would justify a user getting a clamshell to run that app in a larger window...is small enough that it is seldom worth dealing with all of the *other* compromises involved.
LED lights are extremely directional, to the point that it can be difficult to get them to diffuse like an incandescent bulb. Most streetlights (and now, lights in sports stadiums, parking lots and the like) are extremely directional, pointing straight down with an abrupt drop-off in illumination around the periphery. In fact, in my town, the streetlights being replaced with LEDs are not as good because they don't cover as large an area or fade out at the edges in a natural way. They almost look like spotlights shining straight down - objects are either in the light or not lit at all.
So I guess the light pollution from LEDs is due to light reflecting off the ground? I'm just surprised LEDs haven't decreased light pollution because of how directional they are. They should definitely be decreasing light pollution outside of the visible spectrum, because that was always an issue with the various kinds of lights used in the past for lighting large outdoor areas. A big part of their inefficiency was generating unwanted infrared or ultraviolet as a by-product.
As far as I'm concerned kids are the most important people and we're just keeping the seats warm for them.
I always hear statements like this thrown around in various contexts, and while on the surface it sounds admirable, it's really not accurate at all.
The most important people on the planet are those with the knowledge, skills and the actual power and means to accomplish what needs to be done. Without the core set of human adults that actually feed our population, keep it safe, fix broken and unhealthy bodies and the like, there would be no children. Or if there were children they would mostly be dying.
Also based on that logic, the instant a "child" becomes an adult sitting in that seat you kept warm for them, they're not important anymore, and the real focus is immediately on people younger than them. Unless there's like a grace period of importance - say you get 5 years of still being most important after you're no longer a child? See what I mean?
Obviously we need children, and we want the human race to continue and all that, and to provide for them a planet better than the one we started with, with more opportunities. But saying kids are the most important people sounds good in a Whitney Houston song; it isn't very realistic.
100% of Americans want the time changes to go away. The real problem? 50% think DST should become the permanent time, while the other 50% think it should be standard (non-DST) time.
Personally, I'd rather have the extra hour of daylight in the evening. If it's dark in the morning then schools can start an hour later (which some in my region actually did for a week last year).
Either way, if it ever goes permanent, you're going to have half the population unhappy with what became permanent.
Cloud seeding isn't new, just using drones to do it.
Indeed it isn't. It's being done all day long, every day, here in the USA and around the world. It's called "geoengineering", "weather modification" or "chemtrails", depending on who you're talking to. It goes far beyond just making a rainstorm or two here or there.
Also, building cities interrupts large chaotic systems, literally affecting the climate in the area of the city. Perhaps we should stop building cities.
Now there's a sensible idea.
As with anything humans do, we need to be aware of, and manage, side effects.
Lucifer is great at solving problems. He has to be, as he's always causing them.
Hopefully the massive widescale weather experiment we are all unwittingly or helplessly a part of won't have any devastating consequences years or generations down the line. "What could possibly go wrong?"
All this is theater exactly the same way that Microsoft anti-trust thing years ago was theater. Just bread and circuses.
All of these companies have one thing in common: they were founded by US intelligence. They are proprietary fronts. None of these "companies" organically arose due to "capitalism" or "free market" or whatnot. All of their founding myths are bullshit. It's intel dollars at work.
Look up who Jeff Bezos' grandfather was. (Preston Gise was his name.) Big shot at DARPA. You think it was a coincidence this guy ended up as head of Amazon? Just a really smart guy who happened to be in the right place at the right time?
Shell wasn't the one who burned the gasoline and produced the CO2, that was you and me.
Yes and no. It's true that "we" are burning all of the oil and gas, and are responsible for the demand. But oil companies themselves emit around 15% of all greenhouse gases in the process of producing, transporting and refining oil, before they sell it to us. That's not an insignificant amount, and perhaps there's a lot of room for further improvement. They already stopped practices like flaring off that pesky natural gas that is produced along with oil.
The same goes for the manufacturers of concrete: they emit a lot of CO2 in the process... but only do so to satisfy our need for the stuff.
"The eleventh commandment was `Thou Shalt Compute' or `Thou Shalt Not Compute' -- I forget which." -- Epigrams in Programming, ACM SIGPLAN Sept. 1982