Intel Dramatically Cuts Prices of Top-End i9 Gaming Chips (cnet.com) 98

Intel is whacking the prices of its high-end i9 processors as it refreshes an aging design. From a report: For example, last year's top-end Core i9-9980XE processor, which cost $1,990 at retail, is replaced by the $979 Core i9-10980XE. The new chips are due to arrive in November. "People were stuck in the top part of the mainstream," unwilling to pay a lot more money for a bit more performance, said Frank Soqui, vice president of Intel's Client Computing Group. It's also cutting prices of the chips used in the related but more corporate high-end market: workstations. The price cuts are as deep as 50 percent, Soqui said.

These 10th-generation Intel i9 chips are built on an aging core called Skylake, but Intel has managed to squeeze out steady if unspectacular performance improvements in the chips year after year. Intel's newer Ice Lake-based designs have just begun arriving, but they're limited to power-sensitive, premium laptops. Once-dominant Intel faces plenty of competition in the chip market as smartphones have moved into our lives. Those are powered by the Arm family of chips, manufactured by companies like Apple, Qualcomm and Samsung. But Arm's virtues when it comes to sipping power can't keep pace with Intel's clout when you have a behemoth computer you don't have to unplug from that electrical power outlet on the wall.

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Reasons (Score:5, Funny)

    by scmcclain ( 698239 ) on Wednesday October 02, 2019 @04:21PM (#59262762)
    Intel: This price cut has NOTHING to do with AMD, we're just being really nice to our customers...
    • Funny thing about that: the Ryzen 9 3900X is still a better deal.
      • This is the HEDT space. Those chips are running against Threadripper, not Ryzen. 3rd gen Threadripper also drops in Nov, and it's going to eat the X chips' lunch.

        • by fintux ( 798480 )

          Depends on the chip and the use case... For many uses the 12/16 core Ryzen chips compete against the 10/12/14 core Intel chips. Sure, the 18-core one would compete mostly against the Threadripper. I know, there are differences in things like PCIe lanes and memory channels, so the Ryzen is not suitable for everyone.

          However, before Ryzen 1st gen, if you wanted more than 4c/8t with high IPC, your only option was Intel's HEDT. Before Ryzen 3rd gen, if you wanted more than 8c/16t, your only option was Intel HEDT.

          • For many uses the 12/16 core Ryzen chips compete against the 10/12/14 core Intel chips.

            Perhaps in performance for multithreaded workloads, but not in price. And price is really how you should be looking at things. Except for the small group of consumers for whom money is no object, everyone is working on some kind of budget when they build or purchase a new system. If my budget for a CPU is $300, then I am getting the best I can at $300, and I don't really care what "segment" Intel or AMD choose to call it. I'm in the "$300" segment. Right now AMD's 3900X is a 12-core, 24-thread processor for ~$500.

            • by Agripa ( 139780 )

              Back in the Athlon days they didn't do a price drop. Ryzen, Epyc and the upcoming Threadripper must have them squirming a lot. But I doubt it's just that; otherwise it would be just 2005 all over again.

              Back in the Athlon days, Intel paid or strong-armed the PC manufacturers to not use AMD's processors.

    • Isn't that about the amount of performance lost trying to fix their chips' security issues? Seems about right; they are only worth half. If people had known beforehand, they would not have been paying 2X what the chips were worth.
    • I think it's more to do with demand for chips that don't need software fixes for Spectre. I believe this will be the first generation of immune hardware from Intel, with AMD's having come a few months ago, the 3800 Ryzen I think. Lots of vendors are trying to pass off older chips now. Some are misbranding them as newer ones. Watch out. I'm waiting 6 months for an upgrade; now's not the time.

      • Do you suppose those sorts of exploits will increase demand for raw clock rate vs 'cute' ways to increase efficiency that may later need to be disabled?
      • I think it's more to do with demand for chips that don't need software fixes for Spectre.

        The exploits labelled "Spectre" are conceptual. They're baked into the architecture of Intel Core and AMD (Yes, AMD currently requires Spectre mitigations). They can't be resolved by a tweak in CPU microcode, or it would have already been done. For Intel/AMD to mitigate SMT exploits in hardware, they have to significantly redesign the CPU.

        Here's where I don't think you really grasp the nature of commercial hardware engineering. It's going to take years for Intel to address their CPU architecture problems.

    • Re:Reasons (Score:5, Informative)

      by The Cynical Critic ( 1294574 ) on Thursday October 03, 2019 @07:24AM (#59264740)
      This becomes particularly obvious when you start doing some comparisons between the price-cut chips and AMD's new Ryzen 9 series:

      Ryzen 9 3900X - 12C/24T - $500
      Ryzen 9 3950X - 16C/32T - $750

      i9-10900X - 10C/20T - $580
      i9-10920X - 12C/24T - $690
      i9-10940X - 14C/28T - $785
      i9-10980XE - 18C/36T - $990
  • by NewtonsLaw ( 409638 ) on Wednesday October 02, 2019 @04:21PM (#59262764)

    Yep, might be time to upgrade... my 4004 just isn't cutting it anymore.

    • Yep, might be time to upgrade... my 4004 just isn't cutting it anymore.

      I hear good things about that newfangled 8088.

      • by drnb ( 2434720 )

        Yep, might be time to upgrade... my 4004 just isn't cutting it anymore.

        I hear good things about that newfangled 8088.

        There are these awesome things called segments letting us get beyond 64K, people are going to love segment_reg + general_reg based far pointers. What could possibly go wrong?
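          The pitfall being joked about can be made concrete with a minimal sketch (Python, with a made-up helper name; obviously not period tooling) of how real-mode segment:offset far pointers work and why they were error-prone: many distinct pairs alias the same physical byte.

          ```python
          # Sketch of 8086 real-mode addressing: physical = segment * 16 + offset,
          # truncated to 20 bits (the 1 MiB address space).

          def physical(segment: int, offset: int) -> int:
              """20-bit real-mode physical address (wraps at 1 MiB)."""
              return ((segment << 4) + offset) & 0xFFFFF

          # Two different far pointers naming the same physical byte:
          a = physical(0x1234, 0x0010)
          b = physical(0x1235, 0x0000)
          assert a == b == 0x12350

          # So comparing far pointers by their raw seg:off bits fails even
          # when both point at the same data -- one classic source of bugs.
          ```
          
          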

        • by malkavian ( 9512 )

          Hehe.. I had a real WTF moment way back when I first encountered those. :)

          • by drnb ( 2434720 )

            Hehe.. I had a real WTF moment way back when I first encountered those. :)

            I swear half the errors I fixed in DOS and Win16 days were segment registers pointing to the wrong place.

        • I get upset Motorola and DEC lost the architecture wars.

          Man PDP11 assembler was the most elegant and intuitive instruction set ever made. 68K was also absolutely beautiful. By comparison X86 looks like it was made by Rube Goldberg.

          • by perpenso ( 1613749 ) on Wednesday October 02, 2019 @05:58PM (#59263198)

            I get upset Motorola and DEC lost the architecture wars.

            I was a CS major but I took all the optional EE electives. In one two-class series we wired up CPUs, interrupt controllers, UARTs, timers, etc. Pretty soon I noticed these chips were all the same parts found in a PC. When interviewing for my first job out of college, they noticed I had a bit of assembly language experience. They said the ad I answered was for a general-purpose C position, but they had an assembly language / C position for the firmware and C run-time support: a brand-new custom 80386 board where I would be free to do whatever I wanted, since there was no legacy software. They mentioned the various support chips they used; sure enough, PC parts again. I mentioned my two EE classes and the projects there. I got hired.

            When talking to the VP of Engineering and the lead hardware engineer for the 386 board, they mentioned they had wanted to use the 68K but went 386 so it would be easier to find someone with experience. I mentioned I knew 68K from programming the Mac: we programmed early Macs in assembly by cross-assembling code on an Apple II and downloading it to a Mac to see it run and debug it. We couldn't afford a $10K Lisa and its Pascal-based environment, so we improvised. The VP and lead were sad at this revelation; they really had wanted a 68K, and it turned out their 386 guy was also a 68K guy.

            Anyway, first thing I did was configure the 386 for 32-bit addressing, and I rarely touched seg:reg addressing again. I did keep a little bit of it: I did not use a flat memory model, but rather 32-bit addressing that was still segmented. I used segmentation so the kernel could have a private address space accessed through FS, something the applications would not be able to inadvertently touch.

            Man PDP11 assembler was the most elegant and intuitive instruction set ever made. 68K was also absolutely beautiful. By comparison X86 looks like it was made by Rube Goldberg.

            I started in assembly with 6502, then a PDP-11 at school, then 68K. I didn't understand the hate for assembly. Then I moved to 16-bit x86 and understood. 32-bit x86 was far better but still not in the same league as 68K.

            • by Agripa ( 139780 )

              I started in assembly with 6502, then a PDP-11 at school, then 68K. I didn't understand the hate for assembly. Then I moved to 16-bit x86 and understood. 32-bit x86 was far better but still not in the same league as 68K.

              68K assembly was nice but to those of us who had been using 8080 assembly, the 8086 was a great improvement.

          • by Agripa ( 139780 )

            I get upset Motorola and DEC lost the architecture wars.

            Man PDP11 assembler was the most elegant and intuitive instruction set ever made. 68K was also absolutely beautiful. By comparison X86 looks like it was made by Rube Goldberg.

            x86 looked very nice as an upgrade from the 8080 and Z80 at the time.

            In the long run, 68K was actually worse than x86 because of things like double indirect addressing. Motorola's lackluster availability did not help matters; they were not someone to depend on if you actually wanted to buy what they advertised. I still have some PLCC 68HC24s which arrived 2 years late for a 68HC11 project which was changed to PIC.

        • by Agripa ( 139780 )

          There are these awesome things called segments letting us get beyond 64K, people are going to love segment_reg + general_reg based far pointers. What could possibly go wrong?

          Honestly, it seemed like a good idea at the time. And it did allow easy porting of existing 8080, primarily CP/M, applications.

      • by Agripa ( 139780 )

        Yep, might be time to upgrade... my 4004 just isn't cutting it anymore.

        I hear good things about that newfangled 8088.

        The 8088 is untested. Stick with the 8008 upgrade path.

    • (See: His YouTube channel.)

      It's probably the only trustworthy computer out there.

      Of course, nowadays, a whole computer with 4G fits into one of those components he uses. ;)

      • Pfft. How can you trust those TTLs, RAMs, ROMs? You could trivially integrate a whole ARM system in each of them.

        Nah, as long as I haven't built it from transistors I won't trust it.

        • by Agripa ( 139780 )

          Pfft. How can you trust those TTLs, RAMs, ROMs? You could trivially integrate a whole ARM system in each of them.

          Nah, as long as I haven't built it from transistors I won't trust it.

          Discrete injection logic is the future.

    • by drnb ( 2434720 )

      Yep, might be time to upgrade... my 4004 just isn't cutting it anymore.

      Your keyboard says otherwise :-)

    • I hear you can get a good deal on an 8008, the peripherals may set you back a bit.

      • by Agripa ( 139780 )

        I hear you can get a good deal on an 8008, the peripherals may set you back a bit.

        The best part is that you can use AMD peripherals!

        That is no joke; AMD was a source for 8008 peripherals. I have a complete set of chips for the 8008 including an 8224 clock generator from AMD.

    • You will have to push hard to plug the i9-9900 into the 4004 socket.
      • by drnb ( 2434720 )

        You will have to push hard to plug the i9-9900 into the 4004 socket.

        Not really. The unused pins on the i9-9900 that do not line up with the 4004 socket bend out of the way with the slightest pressure. :-)

  • Wow! Call me when I can get one! ... So I can decline and pick one without IME, Spectre/Meltdown, and supporting a known thug.

    Besides: Even if they could sell a big volume, this only works as long as they got money to waste.
    And given the state of their process nodes, if they think they do, they soon won't anymore.

    • I suspect that Intel can actually afford to go that low with prices, at least while still making a modest profit. Their new prices for 10- and 14-core CPUs are similar to what AMD is charging for the 12- and 16-core Ryzens. The Intel chips might need more chip area for their I/O, but if you look at what GPU chips of similar size cost, those are cheaper.

      Anandtech estimates the size of the 10-core Skylake LCC die at 322 mm² and the size of the 18-core Skylake HCC die at 484 mm². Manufacturing technology is 14nm.
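      As a rough illustration of why the bigger HCC die is more expensive to produce, the standard dies-per-wafer approximation (illustrative only; scribe lines, defect density and yield are not modeled) gives noticeably fewer candidate dies at 484 mm² than at 322 mm²:

      ```python
      import math

      def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
          """Classic approximation: gross wafer area over die area,
          minus an edge-loss term for partial dies at the rim."""
          r = wafer_diameter_mm / 2
          return int(math.pi * r * r / die_area_mm2
                     - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

      lcc = dies_per_wafer(322)  # 10-core Skylake LCC estimate from the comment
      hcc = dies_per_wafer(484)  # 18-core Skylake HCC estimate

      # The HCC die yields roughly a third fewer candidates per wafer,
      # before defect losses (which also hit large dies harder).
      assert hcc < lcc
      ```
      
      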

      • They will have to cut their investments somewhere.
        Or rather: ... keep not investing some more.

      • by GuB-42 ( 2483988 )

        Note that Intel's 14nm process is different from AMD's 14nm; it is about 10% finer, closer to GlobalFoundries' 12nm. In the same way, Intel's 10nm is on the same level as TSMC's 7nm.
        Process node names are actually commercial designations, not actual physical measurements.

        Anyway, fabrication costs are just a small part of a chip's price, maybe $50 for a high-end chip like that. And chipmakers price according to many factors. Chips are binned and the price is matched to market demand. Lower-end CPUs and GPUs can simply be binned-down versions of the same silicon.

        • Well, Intel has been selling variations on Skylake for quite a while now, at inflated prices. The R&D costs should be long paid off. So I have very little sympathy even if they have to sell slightly over fabrication cost. I suspect, though, that they still make a nice profit, just not as much as they used to.

  • by RyanFenton ( 230700 ) on Wednesday October 02, 2019 @04:30PM (#59262806)

    Seems to me like they ran a simple production-numbers calculation and decided they could produce a larger batch and make more money by charging less in the first round, when hype was slightly larger.

    They've got the extra production space now, thanks to their lowered competitiveness compared to AMD over the past few years or so.

    That's not drama - that's just skipping out on the 'pure fleecing' stage of a product rollout, to jump to get as much of the 'early bandwagon' stage as they can. Seems reasonable.

    It's why most games start out at 15%+ off on launch day, since most aren't highly anticipated AAA games - that's the way to maximize interest and sales when it will matter the most, if you can't guarantee a long time of clear competitive advantage.

    The actual drama was the flaws in the previous products, and the reaction over time to those flaws.

    Ryan Fenton

    • by drnb ( 2434720 )

      Seems to me like they ...

      ... exhausted the supply of early adopters for whom price is no object, and are now pricing for the next lower segment of the market. In other words, maximizing the revenue from each market segment's "willingness to pay". Yes, this coincides nicely with a slow ramp-up of production, but let's not kid ourselves as to what is really dominating the pricing: milking the big whales and then the smaller whales.

    • Re:Dramatic? (Score:4, Insightful)

      by alvinrod ( 889928 ) on Wednesday October 02, 2019 @05:55PM (#59263184)

      Seems to me like they ran simple production numbers calculation and decided they could produce a larger batch and make more money charging less the first round, when hype was slightly larger.

      I think what it really comes down to is that they looked at the competition and realized that there's no way they can sell a $2,000 enthusiast CPU when a competitor's $500 mainstream top-end part can compete with it. AMD's 16-core Zen 2 CPU should probably be able to beat out Intel's best offering here, and there are some leaked benchmarks to suggest this is true. So AMD is already going to have better performance on a less expensive platform.

      Once the new Threadripper CPUs come out, Intel is screwed even with their reduced price points. It's going to be hard to consider an 18-core Intel part when there's a 32-core AMD part.

      • The low end of the new Threadrippers might be what really counts for comparison. The Skylake-X series has 4-channel RAM and more PCIe lanes than the Socket AM4 CPUs.
        I guess Threadripper 3000 will at least match the Intel I/O and start around 16 cores (?). That should make the low-end versions of Threadripper the "natural" competition for the i9-10940X and the i9-10980XE.

      • by jezwel ( 2451108 )

        It's going to be hard to consider an 18-core Intel part when there's a 32-core AMD part.

        Unless you are paying for your software licences on a per-core basis, in which case your budget takes a big hit. Reducing the number of hosts in your cluster seems to take a lot longer...

  • This is what I was told by an Intel FAE in 1984 or so - the processor he was talking about at the time was the 8080.

    Just wait and watch the price go down for the functionality.

    • by spth ( 5126797 )
      The PMS15A SoC by Padauk costs about one cent (when bought in quantities of 10 or more). Various more powerful microcontrollers (e.g. ARM, STM8, MCS-51) have been available for under $1 for years.
  • by Lisandro ( 799651 ) on Wednesday October 02, 2019 @04:43PM (#59262860)

    AMD today has offerings beating Intel at every single price point when it comes to performance, core count, power consumption and bang-for-buck.

    What they're doing with the Ryzen 3000 and Epyc lines of CPUs is really impressive.

    • Comment removed (Score:4, Insightful)

      by account_deleted ( 4530225 ) on Wednesday October 02, 2019 @05:44PM (#59263154)
      Comment removed based on user account deletion
      • Ryzen 5 stomps the shit out of the i5 on both thread count and board cost.

        Excuse me for seeming like an Intel evangelist (I am certainly not), but that's hardly a reason to buy a Ryzen 5 over an i5.

        1) Intel CPUs have better heat and power performance than Ryzens, so they will still be the preferred CPUs in laptops (but at a much, much lower performance margin).

        2) Threading means nothing. Developers have to code their applications to take advantage of threading, and most cannot apply a use case to their advantage. The use case hit hardest by the Spectre patches is cloud services.

  • The odds that China and Chinese companies will rely on foreign proprietary tech when they don't have to is what?

    • I doubt the i9 and RISC-V are in much competition at this point.

      • by aliquis ( 678370 )

        I wrote the comment because there was talk about ARM, I think. Maybe not in the right place for it to make any sense, but I also wrote it in bed.

  • will apple cut mac pro pricing or is that locked in?

    • will apple cut mac pro pricing or is that locked in?

      Buy a mac pro with a fried CPU for $100 and perform surgery.

    • I heard they were going to cut the price of the $999 monitor stand because they could get less expensive metal.

      • Well, it's $999 because (some) people purchase it. Is it better to sell n units at price P, or N units at price p (where n << N and p << P)? If n x P =~ N x p, it's still better to sell fewer at a higher price, for obvious reasons (storage, manpower, cost of raw materials ...).
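        The parent's n-versus-N argument can be sketched with toy numbers (entirely hypothetical, just to show the shape of the trade-off):

        ```python
        # With roughly equal gross revenue, fewer units at a higher price win
        # once per-unit costs (materials, storage, handling) are counted.

        def profit(units: int, price: float, unit_cost: float) -> float:
            return units * (price - unit_cost)

        high = profit(units=1_000, price=999.0, unit_cost=100.0)   # n at P
        low = profit(units=10_000, price=99.9, unit_cost=100.0)    # N at p, same gross revenue

        assert high > low  # same revenue, but volume eats the margin
        ```
        
        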
    • by Pascoea ( 968200 )
      Thanks for the laugh. I needed that.
    • Mac Pros use Xeon CPUs, NOT HEDT CPUs. Intel's not cutting those prices -- that's where they get most of their margins.
      • by Agripa ( 139780 )

        Mac Pros use Xeon CPUs, NOT HEDT CPUs. Intel's not cutting those prices -- that's where they get most of their margins.

        Apple does not pay list price. No manufacturer does.

    • will apple cut mac pro pricing

      :-) Thanks, needed some fun today.

  • Arm family of chips (Score:4, Informative)

    by chuckugly ( 2030942 ) on Wednesday October 02, 2019 @05:06PM (#59262958)

    The 'Arm family of chips', connected to the Leg family of chips. Sometimes I wonder what it takes to be an editor.

    ARM dammit.

  • No? No surprise there. Still too expensive anyways.

    • by neuro88 ( 674248 )
      Cascade Lake-X is based on the Cascade Lake Xeon design, which appears to have the latest fixes in hardware: https://www.intel.com/content/... [intel.com] However, I don't think this is 100% confirmed yet. So for now the answer is "probably".
      • by sexconker ( 1179573 ) on Wednesday October 02, 2019 @05:58PM (#59263200)

        For most of these speculative-execution attacks, the fixes aren't in "hardware"; they're in the baseline microcode each CPU ships with.
        That means you get the same performance penalty as other chips that got microcode patches.

        Intel can't and won't have a true hardware fix until they redesign their pipeline and caching. These "10th gen" parts are really 8th gen parts.

        • by neuro88 ( 674248 )
          At the very least, I think the big ones are fixed, i.e. Meltdown and MDS (ZombieLoad). Some of the others still have slower workarounds.

          I.e., for newer Intel (and all AMD) chips, we don't need to completely wipe the TLB every time we enter the kernel anymore.
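          For anyone who wants to check what their own machine reports, the Linux kernel exposes mitigation status through a standard sysfs file; here's a minimal sketch (the helper name is made up):

          ```python
          # Read the kernel's Meltdown status from sysfs. CPUs with the
          # in-silicon fix report "Not affected"; older patched CPUs report
          # something like "Mitigation: PTI" (the page-table-isolation
          # workaround discussed above). Output text varies by kernel.
          from pathlib import Path

          def meltdown_status() -> str:
              p = Path("/sys/devices/system/cpu/vulnerabilities/meltdown")
              return p.read_text().strip() if p.exists() else "unknown (not Linux or old kernel)"

          print(meltdown_status())
          ```
          
          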
        • by gweihir ( 88907 )

          These "10th gen" parts are really 8th gen parts.

          Indeed. They optimized the hell out of the details and the manufacturing process, but both are old. Intel had a good racket going, gouging customers for far too long. They probably assumed they would never need to do a new design. Boeing with the 737 MAX 8 comes to mind.

          Now if only something like that could happen to Microsoft...

          • by Agripa ( 139780 )

            Now if only something like that could happen to Microsoft...

            Intel relied on Microsoft to push x86 into the PDA and pad market so you could say that something already happened to both of them.

            • by gweihir ( 88907 )

              You have an excellent point. If mobile phones, tablets, etc. were all using Intel CPUs and some version of Windows, that would be much, much worse than what we have today.

    • "No? No surprise there. Still too expensive anyways."

      No kidding, almost a grand just for the processor in a gaming rig? That's appalling. I built a system a few years ago for gaming, back when the Nvidia 980 graphics card I bought was top of the line (outside of the Titans, of course), along with a top-tier Intel processor, and after taking advantage of a few good Cyber Monday deals built the whole rig for around $2k. How the hell can they justify a grand (let alone 2 grand) for a gaming processor? It's like they want to hand AMD their market presence.

      • by gweihir ( 88907 )

        It's like they want to hand AMD their market presence.

        Indeed. They probably still have not understood how outclassed they currently are. And AMD has done it to them before, only this time it looks like AMD will not fall behind again, because _they_ have learned that lesson.

      • by Cederic ( 9623 )

        Yeah, Intel and Nvidia prices rose substantially faster than inflation and without commensurate performance boosts.

        I don't currently want AMD CPUs or GPUs but I greatly welcome their role in returning competition to the market.

  • by waspleg ( 316038 ) on Wednesday October 02, 2019 @05:42PM (#59263152) Journal

    A 50% cut is steep, but they wouldn't do it if they weren't still making money...

    • Either they've doubled their yields, or they were making well over $1,000 in profit on every CPU sold in this SKU. If the former were true they would be crowing about it, so they're just bastards.

      Not that this is news.

  • Posting to undo mod.
  • These aren't 10th generation parts, but 8th generation parts.

    Intel reused a gen with the initial 8000-series products, and is doing so again now. Even if you count the tweaked cores as full generations, they're still really on the 8th "Core" generation.

  • This has nothing, I repeat nothing, to do with the AMD competitive threat. Intel just likes to give the average person a discount now and then!
  • "But Arm's virtues when it comes to sipping power can't keep pace with Intel's clout..."

    For today perhaps, but I doubt that's a foregone conclusion. Give ARM a wall plug power budget (which some datacenters now do) and it's going to get interesting. Even datacenters care about power consumption & heat. With most app servers spending their time in network waits, ARM doesn't hurt those cases much.

    Games are more about GPU speed, and ARM could meet the CPU requirements soon enough.

    • Give ARM a wall plug power budget (which some datacenters now do) and it's going to get interesting.

      Is it, though? Nobody has ever demonstrated a really high clock rate ARM core, so for now they are only good for the same workloads that nobody bought the last SPARC processors for, ones with lots and lots of threads (by making a very parallel ARM box.) Only, why wouldn't you just put the same workload on a PC processor with lots of threads and get it done a lot faster?

      • by Agripa ( 139780 )

        Give ARM a wall plug power budget (which some datacenters now do) and it's going to get interesting.

        Is it, though? Nobody has ever demonstrated a really high clock rate ARM core, so for now they are only good for the same workloads that nobody bought the last SPARC processors for, ones with lots and lots of threads (by making a very parallel ARM box.) Only, why wouldn't you just put the same workload on a PC processor with lots of threads and get it done a lot faster?

        We actually have some data on this from both the Raspberry Pi and Apple. For the Pi, when scaled up, Intel has a much better performance/power ratio, while Apple, running at lower performance, is merely comparable. So if you are expecting an improvement in performance/power by switching to ARM, get ready to be disappointed.

        This goes double considering that the CPU is only a fraction of the total power budget.

  • by Vandil X ( 636030 ) on Wednesday October 02, 2019 @10:22PM (#59263848)
    20 years ago, it was all about processor frequency. Anyone could get a Pentium, but what "speed" was it? It was a reasonable unit of measure that non-techies could use for a quick comparison when buying PCs.

    It worked for a good long time, then "budget" processors with less cache and fewer performance features came into existence, like the Celeron and "Pentium M", which muddied the waters for non-techies.

    And then came these "i" processors. Still the same numbers: i7, i9, etc., all saying 2GHz or so. To non-techies and tech people alike, it seems like the race ended with everyone log-jammed in the finishing chute and no clear winner any more. How can a 2019 PC with an i7 processor at 2GHz and 8GB of RAM be faster than a 2011 PC with an i7 processor at 2GHz and 8GB of RAM? Super confusing. Super unexciting. They don't even do "Intel Inside" processor commercials anymore, because general consumers don't understand them anymore.

    Then again, general society has moved on to tablets and smartphones. Unless they're messing with a gaming PC, general consumers just want a web-surfing/email machine that runs Word and prints. Just about anything out there now can do that.
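    The "same 2GHz" confusion the parent describes comes down to throughput scaling roughly with cores x frequency x instructions-per-cycle (IPC). A toy sketch (the IPC and core figures below are hypothetical, just to show the shape):

    ```python
    # Why two chips with the same GHz label can differ enormously:
    # more cores and higher IPC multiply the advertised frequency.

    def relative_throughput(cores: int, ghz: float, ipc: float) -> float:
        return cores * ghz * ipc

    old_i7 = relative_throughput(cores=4, ghz=2.0, ipc=1.0)  # notional 2011 part
    new_i7 = relative_throughput(cores=8, ghz=2.0, ipc=1.5)  # notional 2019 part

    assert new_i7 / old_i7 == 3.0  # same "2GHz", three times the throughput
    ```
    
    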
    • by Agripa ( 139780 )

      Then again, general society has moved on to tablets and smartphones. Unless they're messing with a gaming PC, general consumers just want a web-surfing/email machine that runs Word and prints.

      General society still uses laptops, unfortunately, although I am not sure for what. I wish they would move on so we could get rid of chiclet keyboards and glossy screens and have peripheral ports back.

  • by twocows ( 1216842 ) on Thursday October 03, 2019 @08:04AM (#59264808)

    These 10th-generation Intel i9 chips are built on an aging core called Skylake

    Every source I can see says that the 10th generation is Ice Lake and is built on the new 10nm process. Am I missing something?

    • Intel brands both older Skylake-based cores and newer Ice Lake-based cores as 10th-generation Core. They announced Ice Lake would be called 10th gen first, and tout it a lot louder because it's part of their move to the 10nm manufacturing process, but Ice Lake isn't good for high-clock processors. So the older 14nm process is still making 10th-gen products. See this Intel announcement for example: https://newsroom.intel.com/new... [intel.com] "The new 10th Gen Intel Core processors leverage the improvements in intra-n
