Comment Re: Last time I was in Alaska... (Score 1) 98
If you remove suicide by gun, the U.S. statistics look 'better', a lot better.
Better, yes. A lot better? No. The US homicide rate is 2 to 5 times higher than European OECD countries.
James Strawn, who was laid off from Adobe over the summer after 25 years as a senior software quality-assurance engineer.
I can only assume that for the past decade, James has been ignored, or terrible at his job. Every Adobe product has gotten progressively worse to use, forums are filled with bug reports that get ignored release after release, and the increase in system requirements does not reflect any improvement in functionality.
Whether because Adobe didn't like what he had to say, or they decided not to listen to him, it's completely unsurprising that he lost his job.
The folks offering $500K/year for AI experts aren't going to take anyone who makes the claim on a resume, they're almost guaranteed to be looking to poach someone at OpenAI or Google. Practically speaking, they're looking to benefit from the experience that those companies paid for...and James doesn't have it.
On the upside, odds are pretty good that James will have a job in short order, helping to deal with the fallout from 'vibe coders' who don't know how to do real-world testing. He's probably going to run into some combination of age discrimination and salary discrimination (no way he's working for $60K with 25 years at Adobe), but once the messes get too big to ignore, I'm pretty sure he'll be able to become a project manager who helps direct fixes for deployed code that never got actual QA. The need is most definitely there; it'll just take a bit more time to prove to the brass that he's more valuable to the company than the MBAs who are looking at their now-spherical product for more corners to cut.
We thought we would save money by going a-la-carte. Now it's more expensive.
Well, I think we did 'simple math' when we thought that, rather than 'real-world-math'. If my cable bill is $150/month for 100 channels, that's $1.50/channel. Since I only watch maybe 20 of them at most, 20*1.5 = $30/month for the 20 channels I watch. Who wouldn't want that?
The problem is that channel costs don't divide evenly. ESPN is very expensive ($8 or more of one's cable bill goes to just that channel), while Home Shopping Network and QVC have historically paid the cable companies for inclusion in the lineup. Public Access stations are a legal requirement in many jurisdictions, and their content is paid for by whoever submits it for broadcast...and again, roughly nobody would include them in a custom lineup.
Finally, there *is* a baseline cost for last-mile distribution. Whether it carries 1 channel or 1,000 channels, the infrastructure needs to exist. I'm not making any excuses for Comcast here, but someone needs to pay the townships for right-of-way on the wire runs, and the backend equipment, the staff to service it, and the staff to answer the phone for CSR requests all cost something. Streaming services generally get away with chat-only support and don't have any wires to run.
So...while I'm not calling cable a good deal by any stretch, I *am* at least acknowledging that a-la-carte math would be something closer to $30 infrastructure + $10/month ESPN + probably-something-closer-to-$3/channel for the ones with actual-content, making that beautiful $30/month bill something closer to $80/month in practice.
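To make the back-of-the-envelope math concrete, here's a quick sketch. All dollar figures are the rough guesses from the comments above, not real carriage fees, and the $3 "real content" fee is an assumption:

```python
# Naive a-la-carte math: divide the bundle price evenly.
BUNDLE_PRICE = 150.0     # $/month for the full lineup
BUNDLE_CHANNELS = 100
WATCHED_CHANNELS = 20

naive_per_channel = BUNDLE_PRICE / BUNDLE_CHANNELS   # $1.50/channel
naive_bill = WATCHED_CHANNELS * naive_per_channel

# More realistic math: fixed last-mile/support cost, one expensive
# sports channel, and a higher per-channel fee for channels that
# actually produce content (all rough guesses).
INFRASTRUCTURE = 30.0    # baseline last-mile + support cost
ESPN = 10.0              # premium sports carriage fee
PER_CHANNEL = 3.0        # rough fee for a "real content" channel
other_channels = WATCHED_CHANNELS - 1

realistic_bill = INFRASTRUCTURE + ESPN + other_channels * PER_CHANNEL

print(f"naive:     ${naive_bill:.2f}/month")      # $30.00/month
print(f"realistic: ${realistic_bill:.2f}/month")  # $97.00/month
```

With 19 non-ESPN channels at $3 each this lands a bit above the ~$80 ballpark; the exact total depends entirely on how many of your channels carry a real carriage fee.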
Personally, I think that there *does* need to be some sort of court case that uses United States v. Paramount Pictures, Inc. (1948) as precedent to decouple distribution from production, which would probably do more to solve the cost issues than anything else.
Microsoft had Windows MCE and a certification program for HTPCs to run it.
It was half baked, but it was early.
Same as it ever was: Apple succeeds on execution and on a lack of competition that isn't dragged down by advertising enshittification.
This isn't about AI. HUD glasses were always a good concept, poisoned by Google and outrage hype; if even Meta can launch HUD glasses, the outrage has dissipated. Apple is late to the party, but it's too big a niche to ignore. Much bigger than pass-through goggles, especially since Apple is allergic to VR.
Inertia is pretty much synonymous with grid forming. Small and medium solar generation is not grid forming because that's how it has been regulated. It's not an inherent property.
With the right control algorithm, any PV inverter can be as grid forming as a steam turbine with a flywheel and generator. If it has no spare power it cannot add voltage support, but it can always carry energy across a cycle through its capacitors to force phase and frequency to change only slowly (i.e., to have inertia).
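As an illustration of "inertia is just a control algorithm," here is a minimal virtual-inertia sketch based on the swing equation. The constants (H, S, the 10% power step) are invented for illustration, not taken from any real inverter:

```python
# Virtual inertia in a grid-forming inverter: the swing equation
#   df/dt = f0 * dP / (2 * H * S)
# is emulated in software, so the output frequency can only change
# slowly after a disturbance, just like a spinning mass.

F0 = 50.0     # nominal frequency, Hz
H = 4.0       # virtual inertia constant, seconds (a software knob)
S = 1.0e6     # inverter rating, VA
DT = 0.01     # control-loop time step, s

def step_frequency(f, power_imbalance_w):
    """Advance the emulated 'rotor' frequency by one control step."""
    dfdt = F0 * power_imbalance_w / (2.0 * H * S)
    return f + dfdt * DT

# Absorb a 10%-of-rating power deficit for one second: the frequency
# sags gradually (0.625 Hz/s here) instead of collapsing instantly.
f = F0
for _ in range(100):   # 1 s of simulated time
    f = step_frequency(f, -0.1 * S)
print(f"frequency after 1 s: {f:.3f} Hz")
```

A bigger H makes the emulated frequency stiffer, exactly as more flywheel mass would; the energy to back it up comes from the DC-side capacitors or battery.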
In principle, battery storage can provide inertia on the side with just a change in the control algorithm. So if you are building lots of battery storage, there might not be much use for the retrofit synchronous condensers eventually. More of a stopgap.
Inertia can be electronic (grid-forming STATCOMs).
That can be combined with battery storage, which can be combined with (not so) rotating reserve, such that the batteries remove the need of running it at minimal power all the time.
I also seriously doubt your 15-year figure. Current technology degrades about 20% after 3,000 charging cycles. Given that the 2 TWh number is three days of storage, you would need to fully recharge it about 120 times a year at most, which means that after 25 years you still have 80% of the capacity left. So you have to add 20% of the capacity after 25 years, or do a complete refresh every 125 years - and that assumes all technology development stops right now.
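The cycle-life arithmetic above checks out; here is the same calculation spelled out (numbers are the parent post's illustrative figures, not a datasheet):

```python
# Cycle-life arithmetic from the parent post (illustrative numbers).
DEGRADATION = 0.20        # fraction of capacity lost after...
CYCLE_LIFE = 3000         # ...this many full charge cycles
CYCLES_PER_YEAR = 120     # 3-day storage -> at most ~120 full cycles/yr

# Years until the fleet has accumulated its rated cycle life.
years_to_cycle_life = CYCLE_LIFE / CYCLES_PER_YEAR
print(f"{years_to_cycle_life:.0f} years to {CYCLE_LIFE} cycles, "
      f"{1 - DEGRADATION:.0%} capacity remaining")   # 25 years, 80%

# Topping up 20% every 25 years replaces the whole fleet only after:
full_refresh_years = years_to_cycle_life / DEGRADATION
print(f"complete refresh every {full_refresh_years:.0f} years")  # 125
```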
Unlikely it will be good for much with less than 1 GB of memory available. It should be a good reminder that true personal computers are still available, and at only a modest cost premium.
For everyone to chisel it in marble: Just because you can detect it does not mean it is bad.
Some 20 years ago, my employer had an employment/salary income verification service. My current employer runs a similar service today.
I would have to request an employment verification code. Then the landlord, bank, etc. uses this code to request employment information about me. I control the code (I can disable it), and it provides only the minimum information necessary.
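The flow described above could look something like this sketch; the class, method names, and record fields are all hypothetical, invented to illustrate the scoped, revocable-code idea rather than any real service's API:

```python
# Hypothetical sketch of a scoped, revocable verification-code flow:
# the employee mints a code, can revoke it at will, and a verifier
# can redeem it for only the minimum agreed-upon fields.
import secrets

class VerificationService:
    def __init__(self):
        self._codes = {}   # code -> {employee, fields, active}

    def issue_code(self, employee_id, fields=("employed", "employer")):
        """Employee requests a code scoped to specific fields."""
        code = secrets.token_urlsafe(8)
        self._codes[code] = {"employee": employee_id,
                             "fields": set(fields),
                             "active": True}
        return code

    def revoke(self, code):
        """Employee stays in control: the code can be disabled."""
        if code in self._codes:
            self._codes[code]["active"] = False

    def verify(self, code, employee_record):
        """Landlord/bank redeems the code; only scoped fields return."""
        rec = self._codes.get(code)
        if not rec or not rec["active"]:
            return None
        return {k: v for k, v in employee_record.items()
                if k in rec["fields"]}

svc = VerificationService()
code = svc.issue_code("emp-42")
record = {"employed": True, "employer": "Acme", "salary": 90000}
print(svc.verify(code, record))   # salary excluded: out of scope
svc.revoke(code)
print(svc.verify(code, record))   # None: employee disabled the code
```

The key design point is that the code, not the employee's identity, is the capability: it carries its own scope and can be killed without touching anything else.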
Modeling paged and segmented memories is tricky business. -- P.J. Denning