Comment Backwards from what they think? (Score 5, Insightful) 32

Big companies have already made it clear that they think AI will let them do the same work with fewer people, and AI use costs the company real money in compute. So the only plausible reason for execs to encourage more AI use is to find out which jobs can most easily have their headcount reduced.

The people using the most tokens are the ones whose jobs can be most easily automated. That is not, IMO, a positive sign for the long-term survival of that particular job role. The only rational response is to use AI just enough to show a speed-up, assuming the speed-up actually happens at all, but not enough to rank high on the chart of AI users. Using it much more than that seems self-defeating.

Comment Re:The geothermal plant already exists [Re:MS Pow. (Score 2) 47

The summary says that this thing is supposed to be geothermal powered. So they just have the cart before the horse here. They need to set up the geothermal power plant first, then build the datacenter after the power plant is operational.

The geothermal plant already exists: https://www.globalelectricity....

Apparently, Microsoft was proposing to build the data center there and tap into the existing geothermal power, not build new geothermal power (the summary was a little confusing about that).

Yeah, that was confusing. But Kenya's president is almost certainly wrong. Here's why:

1. It is not numerically correct, assuming the numbers in the summary are accurate. The country has a surplus adequate to power the data center at somewhere around half to three-quarters capacity even at peak grid demand, and probably at full capacity 99 days out of 100. So even if they built it out to full capacity right away and did nothing else, only a small fraction of Kenya would occasionally lose power.

2. They're not building it at full capacity. They're building a small data center at first, then building it up over time as more generating capacity comes online.

3. They're a reliable power customer. That means they will always pay the bill, even when it is high. The grid and generation operators can charge them a hefty premium for bulk power, then use that extra revenue to build more power plants. By the time the data center is running at full capacity, there could be more than enough power to supply it.

4. Even if that extra investment in production doesn't happen, the grid operators can simply decline to provide the additional power. I'm sure Microsoft knows how to do solar + storage by now, and if not, they can pay someone who does. Or they can build their own geothermal plant right next to the existing one, or install an SMR, or do any number of other things to produce power.

5. Nothing inherently prevents them from reducing power usage during peak load periods. Service will get slower, but should gracefully degrade, assuming they're doing it right. Nobody will lose power, realistically speaking.
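To make point 5 concrete, here is a minimal sketch in C of the kind of load-shedding policy I mean. The thresholds and names are entirely made up by me (a real controller would use numbers from the power contract and signals from the grid operator): run flat out off-peak, and scale deferrable work down toward a small floor as the grid approaches capacity, so latency-sensitive traffic gets slower instead of anyone losing power.

    #include <stdio.h>

    /* Hypothetical values; real ones come from the power contract
       and the grid operator. */
    #define GRID_PEAK_THRESHOLD  0.90  /* fraction of grid capacity in use */
    #define MIN_SERVICE_FRACTION 0.10  /* never shed below 10% of load */

    /* Fraction of the data center's full load to draw, given current
       grid utilization: full power off-peak, scaling down toward a
       floor as the grid nears capacity. */
    double target_load_fraction(double grid_utilization)
    {
        if (grid_utilization < GRID_PEAK_THRESHOLD)
            return 1.0; /* off-peak: run everything, batch jobs included */

        /* Peak: shed deferrable work in proportion to the overload,
           keeping a slice for latency-sensitive traffic. Slower
           service, not an outage. */
        double overload = (grid_utilization - GRID_PEAK_THRESHOLD)
                        / (1.0 - GRID_PEAK_THRESHOLD);
        double f = 1.0 - overload * (1.0 - MIN_SERVICE_FRACTION);
        return f < MIN_SERVICE_FRACTION ? MIN_SERVICE_FRACTION : f;
    }

    int main(void)
    {
        double samples[] = { 0.50, 0.85, 0.92, 0.97, 1.00 };
        for (int i = 0; i < 5; i++)
            printf("grid at %3.0f%% -> data center at %3.0f%% load\n",
                   samples[i] * 100.0,
                   target_load_fraction(samples[i]) * 100.0);
        return 0;
    }

With a policy like that, "Kenyans lose power" turns into "the data center runs a bit slower at dinnertime."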

It is unfortunate that so many people look at these data centers and the current worst-case state of resource availability and wrongly conclude that they are infeasible. It's a common mistake among planners, legislators, and the general public: they fail to account for how a data center's demand for resources triggers the construction of facilities that make previously unusable resources available, and they fail to recognize that in a true power emergency, the operator can just turn 90% of it off and shift the load to other data centers.

But the reality is that nobody is going to build a gigawatt of additional power capacity in Kenya unless the government or some private company that needs the power pays them to do it. Kenya already has a 23 to 30% surplus over its worst-case consumption, so adding more production would just drive power prices down, and producers would get less money for the power they generate.

But as soon as someone like Microsoft starts needing enough power to pull those margins down, additional capacity suddenly becomes economically feasible, and you'll see existing power companies expand or new ones enter the market. The existence of all-but-guaranteed higher future demand is the key to making that happen. Without the data center being approved, that motive to expand does not exist, and the grid will likely stay at or near its current levels unless the government forces the market's hand by paying someone to build more generating capacity.

Comment Re: Well "just" vibe code you a new API, then eh? (Score 3, Informative) 44

The biggest problem with replicating CUDA is not the technical aspects, but finding a VC with enough brains to know whom to hire. Most CS grads have the knowledge but not the drive. Most liberal arts grads have the drive and the creativity, but not the knowledge. You need to find someone with both, because creating the next Nvidia killer requires a person boring enough to reinvent the wheel yet creative enough to find novel solutions to performance problems.

The computer science and hardware engineering behind the hardware and software (Nvidia/CUDA) have been known for decades. The Nvidia hardware could be replicated with FPGAs - notwithstanding any patents Nvidia might have. The software API could be replicated rather easily; parallelism has been studied in computer engineering for decades. What Nvidia did was political: they provided both the hardware and an API to easily use it, in one package the C-suite class could understand. The challenge was never technical; it was marketing.

More specifically, you'd need to understand how compilers work and how to use YACC, bison, or something similar to generate the compiler code for you. You'd have to understand digital logic and how to build arbitrary logic functions out of NAND gates (see the sketch below). If you see an FPGA development kit, know what it is, and think to yourself, "What I could do with that..." you're probably a good fit for the job. And you'd need someone willing to bankroll your project until you could demonstrate that you beat Nvidia on something marketable - like floating-point performance, or power consumption.
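On the NAND point, here's a minimal C sketch (the function names are mine, nothing vendor-specific) showing that NAND alone is functionally complete - every other basic gate falls out of it. This is exactly the kind of boring-but-foundational knowledge I mean:

    #include <assert.h>

    /* NAND is functionally complete: every other gate can be built
       from it alone. */
    static int nand(int a, int b) { return !(a && b); }

    static int not_(int a)        { return nand(a, a); }
    static int and_(int a, int b) { return not_(nand(a, b)); }
    static int or_(int a, int b)  { return nand(not_(a), not_(b)); }
    static int xor_(int a, int b)
    {
        int n = nand(a, b);                  /* shared intermediate  */
        return nand(nand(a, n), nand(b, n)); /* 4 NANDs total        */
    }

    int main(void)
    {
        /* Check each derived gate against C's own Boolean operators
           for all four input combinations. */
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++) {
                assert(not_(a)    == !a);
                assert(and_(a, b) == (a && b));
                assert(or_(a, b)  == (a || b));
                assert(xor_(a, b) == (a ^ b));
            }
        return 0; /* all truth tables verified */
    }

That's the whole trick at the gate level; the hard part is doing it a few billion times at speed, which is where the FPGA toolchain and the compiler knowledge come in.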

From an engineering standpoint, what Nvidia has done is trivial, in the sense that the solution could be reproduced by an engineer using already-known techniques. But Nvidia combined that technical knowledge with an understanding of its market to produce the dominant position it holds today. Any computer engineer worth his diploma could produce an FPGA design that would beat Nvidia's GPUs, but Nvidia did it first.

Comment This message ... (Score 1) 61

... has been brought to you by your city's arts council, which wants the mandatory funding for the arts increased.

Just what we need: another piece of avant-garde sculpture. Like the one in the lobby of our city hall that disappeared over a weekend. An investigation led to a building janitor who thought it was construction debris and threw it out.

Comment Re:If you're doing something like that once a week (Score 1) 61

What if you already have a job that you enjoy and that involves being creative? Figure out what you enjoy doing and get a job doing that.

Now you'll have to excuse me. Amazon just delivered a 20-ton marble block, and I've got to go chisel something out of it. A David, perhaps. Or maybe a Trump.
