Comment Thank you, Bram (Score 1) 62

'vim' is a marvel, Bram Moolenaar did a lot of good for the world, and I am grateful for that. The long version: I started with the Borland IDEs on MS-DOS (Turbo Pascal and Turbo C, around 1987/1988), whose editors used WordStar-style key commands. I first tried 'vim' around 1994 on my first PC, running Linux from a Slackware distribution (486 DX2 at 66MHz with 8MB of RAM and a 500MB hard disk), because for university homework we were told to learn 'vi', since the lab machines were Sun workstations. For a while I kept using 'joe', a WordStar/Borland-compatible editor for Linux, because I felt more productive with it. It took me some time, but I eventually switched to 'vim' for good, once I felt I could work faster with it than with my previous editors. Although I use other editors on Windows, e.g. Notepad++ and Visual Studio, on Linux and Mac I still mostly use 'vim', even though it is often considered outdated now that more modern/popular editors such as VS Code and Kate are taking its place.

Comment Dear AMD (Score 3, Insightful) 166

Please, focus. I don't know what "the market" needs. I know what *I* need in order to buy from you again:
  • Netbook: 2-core CPU/APU at 2GHz with decent IPC for 30 USD
  • Laptop: 4-core CPU/APU at 2-2.5GHz with good IPC for 60-70 USD
  • Desktop: "PS4 on a chip" with twice the CPU frequency for 90-120 USD

Note that, compared to ARM SoCs, x86 SoCs are *crazy* overpriced. There are superb ARM SoCs for just 20 USD. WTF are you doing selling comparable consumer-grade chips for 100 USD?

Comment Not going to work (Score 1) 208

In my opinion, the whole basis of these experiments is using x86 so that units from desktop/server CPUs can be reused. The downside is having to deal with the x86 mess everywhere. This looks like a desperate reaction to AMD's CPU+GPGPU approach, which has drawbacks of its own. I bet both Intel and AMD prefer to keep the memory controller as simple as possible, settling in for a comfortable long run without burning their boats too early. For example, a CPU+GPGPU on the same die with eight separate 128-bit memory controllers configured as NUMA nodes (i.e. without channel interleaving/bonding) would be much better, but it would mean more expensive chips and motherboards, plus more DRAM chips (see the sketch below for what that NUMA configuration means on the software side). So I bet that in consumer-grade products we'll keep getting same-die CPU+GPU with a simple memory controller (perhaps with embedded RAM in a 3D package) for the next 20 years.
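To make the NUMA-vs-interleaving distinction concrete, here is a minimal sketch of my own (an illustration, not anything Intel or AMD actually ships) of what separate controllers exposed as NUMA nodes would mean for software, using Linux's libnuma; the buffer size is arbitrary, and you would build with "gcc numa_sketch.c -o numa_sketch -lnuma":

    /* Hypothetical illustration: with separate memory controllers exposed
       as NUMA nodes, software can pin a buffer to one controller for local
       latency; channel interleaving instead stripes every allocation across
       all controllers. Assumes Linux with libnuma installed. */
    #include <numa.h>
    #include <stdio.h>

    int main(void) {
        if (numa_available() < 0) {
            fprintf(stderr, "no NUMA support on this system\n");
            return 1;
        }
        /* Each node is a memory controller the OS can target directly. */
        printf("NUMA nodes visible to the OS: %d\n", numa_max_node() + 1);

        size_t len = 64UL * 1024 * 1024;  /* 64 MiB, arbitrary size */

        /* All pages come from node 0: a thread running near that
           controller gets local latency and leaves the other channels
           free for other clients (e.g. the GPU units). */
        void *local = numa_alloc_onnode(len, 0);

        /* Pages striped round-robin across all nodes: roughly what
           hardware channel interleaving does for every allocation,
           whether the access pattern benefits from it or not. */
        void *striped = numa_alloc_interleaved(len);

        if (local)   numa_free(local, len);
        if (striped) numa_free(striped, len);
        return 0;
    }

The point of the sketch is that the NUMA layout leaves the placement decision to software per allocation, whereas interleaving bakes one policy into the hardware for everything.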
