Does AWS or any of the big AI companies really want to build a data center in Maine?
None so far.
It's not exactly near any major Internet backbones or major tech hubs. The electricity there is expensive as well, plus you have to worry about blizzards disrupting your diesel deliveries during a power outage.
Logistics is not the new hotness. The new hotness is YOLO.
It seems like someone is trying to score easy political points "protesting" something that probably wasn't going to be built anyway.
But . . . but won't anyone think about their freedom?
Presumably no one worried about their freedom has data that goes through, or is stored in, a data center; otherwise, they'll come off like hypocrites.
1) It's a temporary ban not a permanent one. 2) There have been no major datacenter projects in Maine yet. 3) One of the reasons for the ban is to allow the state to assess infrastructure changes. Some of these datacenters are being built with a "If you build it, they will come" attitude when it comes to infrastructure demands like electricity and water. In decades past when companies built datacenters, they had to plan for these things and some datacenters built their own power and water plants.
RAM is on the same carrier as the CPU, tightly coupled for maximum bandwidth and the lowest possible latency. The SSD is soldered on and also highly optimized.
If you have looked at Windows laptops recently that's already happened. RAM and SSDs are soldered on many of them.
Adobe Premiere Pro was a launch day application for Apple Silicon. That doesn't happen by accident with Apple's level of secrecy. Apple made sure the ecosystem was seeded with flagship products that worked well.
Apple working with Adobe to make sure their flagship program works is a very different thing from accusing Apple of conspiring with Adobe to rig performance metrics to fool consumers into thinking the M1 was faster than it is.
I'm not certain how many of us actually care about benchmarks, whether that conspiracy is true or not.
Many of us do not put full faith in benchmarks as the only facts; however, benchmarks can be used as general guidelines as to whether one machine performs better than another. The conspiracy theory that Apple and Adobe were somehow colluding to fool consumers seems very paranoid, especially when history seems to contradict it. One would think that if Apple and Adobe had colluded on the M series of chips, Adobe would have had ARM-specific versions right after Apple launched its ARM processors. They did not, which suggests Adobe, like every other software developer, had to spend some time updating their code.
My wants are a computer that works quickly with every program designed for it. Give me a reason to prefer benchmarks over boots-on-the-ground performance.
And how would anyone besides you know your workflow? Benchmarks simulate general workflows. Some reviewers devise their own benchmarks that mimic what they do; however, they concede they cannot anticipate every person's needs. If you find a reviewer/benchmark that uses your workflow, go with it.
Apple pays extra for finished chips, which covers most of the cost of the binned chips, but that DOES NOT INHERENTLY INCLUDE buying the binned chips until Apple actually wants them, because they are paying per chip. It's not a fucking hard concept, nor is it difficult to believe when you look at how much money Apple has invested into TSMC. Apple is now dependent on TSMC while TSMC is no longer dependent on Apple.
Read the article again: "A sweetheart deal between the companies means TSMC effectively eats the cost of the defects that inevitably crop up in a new manufacturing process."
1) The question is which company eats the cost of defects, but that question specifically deals with new nodes, as yields on new nodes start out low. The A18 Pro is not being produced on a new node. 2) Even on a new node, if Apple only pays for finished goods, an A18 Pro with 5 or 6 working cores is still a finished good, so Apple pays for binned chips either way. 3) All of these points are speculation on the part of outsiders. Neither Apple nor TSMC has divulged their contract terms.
Kindly take your limited, garbage knowledge and get the fuck out of here you clueless fuck.
You still have not committed to whether you are complaining that Apple used binned chips or praising them. You just want to complain about Apple no matter what they do. Instead you resort to name-calling. That is who you are.
You accuse me of arrogance, but you yourself can't provide a single link to prove your point. Maybe you should look in the mirror.
Bahahahahahahaha. You NEVER asked me for a link. You simply stated you couldn't find one, therefore none existed. Let me repeat what I said earlier. WE BOTH KNOW Signal-based E2EE is a feature that Google has that is not in Universal Profile 1.0. It does not exist in Universal Profile 4.0. You have yet to address that point. Seeing how you are unable to address any other points, that one point keeps destroying your arguments.
You apparently made up the part about Google extensions. I have not found any evidence that your claim is true.
BAHAHAHAHAHAHAHAHAHHAHAHAHAHHAHAHA. So because you couldn't find something, it does not exist? That is pure arrogance. You still have yet to answer a basic question: how was Apple supposed to handle the ONE extension we both know Google implemented, Signal Protocol based E2EE?
Presumptuously correct because you don't know jack shit about Apple or TSMC.
You are under the impression that Apple "buys" chips from TSMC. They do not. They buy wafers like everyone else.
Sure, but not A18s. Apple is the only customer.
Again, you do not seem to understand that Apple, Nvidia, AMD, and everyone else buy wafers, not chips. That is the pricing and contract structure. No one buys chips.
No, I am not. So not only do you not work in the semiconductor industry because you have no idea what you are talking about, you also lack basic reading comprehension and critical thinking skills. Even if you do work in the semiconductor industry, you clearly don't know about Apple or TSMC and are operating on assumptions from your limited experience.
Please state whether you are complaining that Apple used binned parts or praising them for it. You seem unwilling to commit to either proposition.
Because what Apple orders, and receives, from TSMC is individual chips, not whole wafers.
Again, you do not know the industry. Contracts are measured in wafers across the board: "With this agreement, AMD now expects to purchase approximately $2.1 billion of WAFERS from GF (GlobalFoundries) between 2022 and 2025."
The industry does not price by the chip because chip counts are highly variable. Pricing is based on wafers. Cost is based on wafers. Processing is done on wafers. Yields are based on wafers. Everything is based on wafers.
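To make the wafer-based accounting concrete, here is a back-of-the-envelope sketch. Every number (wafer price, die size, yield) is an assumption for illustration only, not an actual Apple/TSMC contract term; the dies-per-wafer estimate uses the standard area-minus-edge-loss approximation.

```python
# Illustrative sketch of wafer-based pricing.
# All numbers are assumptions, not real Apple/TSMC contract terms.
import math

wafer_price = 20_000      # assumed price of one 300 mm leading-edge wafer, USD
wafer_diameter_mm = 300
die_area_mm2 = 105        # assumed die size for a large phone SoC

# Rough gross dies per wafer: wafer area / die area, minus edge loss.
wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
gross_dies = int(wafer_area / die_area_mm2
                 - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

yield_rate = 0.80         # assumed fraction of dies with every core working
good_dies = int(gross_dies * yield_rate)

# Under wafer pricing, the buyer pays the same amount regardless of yield,
# so the effective per-die cost falls out of the yield, not the contract.
cost_per_good_die = wafer_price / good_dies
print(f"{gross_dies} gross dies, {good_dies} fully good")
print(f"effective cost per good die: ${cost_per_good_die:.2f}")
```

The point of the sketch: the contract fixes the wafer price; the per-chip cost is just arithmetic downstream of yield.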
Perhaps other customers would have to pay for the whole wafer, good or bad, but that isn't how it works with Apple and TSMC.
Please provide evidence that somehow Apple and TSMC uses different pricing and contracts than everyone else. I'll wait.
Apple does not order by the wafer, they order a specific number of chips for a certain number of products they plan to go in.
1) How do you know? Please provide evidence. 2) Apple plans on a certain number of chips, BUT TSMC, like every other fab, manufactures wafers. The contract terms and pricing specify how many wafers will be produced.
Everything binned as not part of Apple's order (5 cores, 4 cores, etc.) waits around until Apple purchases it as well.
Again, no. Apple has already purchased those binned chips because Apple purchased the wafer. Binned chips are accounted for as wafer yield losses.
Regardless of the structure of an order, the result is the same. Apple pays TSMC a different amount of money (more) for a full A18 than they do for a binned 5-core.
Not if Apple pays by the wafer like everyone else. They do not pay twice. That is a rather large hole in your assumption.
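The "no paying twice" point can be sketched with toy numbers. Everything here (wafer price, die counts, bin split) is assumed for illustration, not a real contract: if the wafer is bought outright, binned dies are already owned, and salvaging them only lowers the effective cost per usable die.

```python
# Sketch of why wafer pricing means no "second purchase" of binned dies.
# All numbers are assumptions for illustration, not real contract terms.

wafer_price = 20_000          # assumed, paid once for the whole wafer
dies_per_wafer = 600          # assumed gross dies
full_6core = 450              # assumed dies with all cores working
binned_5core = 100            # assumed dies with one defective core
dead = dies_per_wafer - full_6core - binned_5core  # pure yield loss

# Wafer view: the buyer already owns every die on the wafer, good or binned.
# Salvaging binned dies spreads the same fixed cost over more usable parts.
cost_if_binned_discarded = wafer_price / full_6core
cost_if_binned_salvaged = wafer_price / (full_6core + binned_5core)

print(f"{dead} dead dies (pure yield loss)")
print(f"cost per usable die, discarding binned parts: ${cost_if_binned_discarded:.2f}")
print(f"cost per usable die, salvaging binned parts:  ${cost_if_binned_salvaged:.2f}")
```

Under per-chip pricing the binned part would carry its own (lower) price; under wafer pricing there is nothing extra to pay, which is exactly the disagreement in this thread.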
OR mostly ignores the success of and demand for these lower-cost devices and only creates them when TSMC has a "significant" surplus of binned chips sitting around.
That statement alone says you have no clue about semiconductor fabrication supply chains. Chips are made in advance. No OEM waits around before creating chips, especially when TSMC is fully booked. Before Apple launched the Neo (or any device), they had a large supply on hand.
While this is a very reasonable explanation, we don't have reason to believe it's actually true. AI "demand" is a great excuse for profit taking, just as AI is a great excuse to fire staff.
Well in this case, why is the price of some of these components very high right now? Firing staff at SK Group, Samsung, or Kioxia would not explain that. There have been no major disruptions, like a natural disaster, that would explain it. Tariffs would explain a 100% increase in price, not the 3-4X increase. At the same time, AI companies have been trying to build (or at least fund the building of) AI datacenters.
I can't fathom why we have a shortage of fabs making this stuff.
There are only a handful of companies that make NAND flash. The main ones are Samsung, SK Group, and Kioxia. AI companies are 1) buying out all available supply of everything they can and 2) working deals with these companies to manufacture only their orders. These companies do not have unlimited capacity, and building out capacity takes years. So AI gets its products and consumers get whatever is left. The demand for consumer flash has not dropped, just the supply.
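A toy model shows how a supply squeeze on an inelastic good can produce a multi-x price jump, far beyond the size of the squeeze itself. The elasticity value and quantities below are pure assumptions for illustration, using a standard constant-elasticity demand curve, not actual NAND market data.

```python
# Toy constant-elasticity demand model: fixed demand curve, shrinking
# consumer supply. Elasticity and quantities are assumptions, not data.

elasticity = 0.5              # assumed price elasticity of NAND demand (inelastic)
consumer_supply_before = 100  # arbitrary units
consumer_supply_after = 50    # half the supply diverted to AI buyers

# Constant-elasticity demand: Q = k * P**(-e)  =>  P2/P1 = (Q1/Q2)**(1/e)
price_multiplier = (consumer_supply_before / consumer_supply_after) ** (1 / elasticity)
print(f"price multiplier: {price_multiplier:.1f}x")
```

With those assumed numbers, halving consumer supply quadruples the market-clearing price, which is the same order of magnitude as the 3-4X increases mentioned above.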
You don't have to know how the computer works, just how to work the computer.