
Comment Re:Basic (Score 1) 623

I became interested in computers when I was about 8. The ZX Spectrum was relatively new at that time and a very good family friend bought one. My father soon followed suit. Of course BASIC was therefore my first experience of programming. Most of it was extremely amateurish :P It was only when I finished primary education (usual age of 12) and went to high school (or the equivalent for my region of the world) that I first got to use a PC. It took a lot of convincing to get my dad to buy a 486SX-25 PC with a VGA monitor. I soon 'upgraded' to Turbo Pascal and did my first serious programming in it, among other things using the graphics mode to render fractals. I also did some stuff in Delphi, which is Borland Pascal for Windows. I don't think I had any problems grasping the concept of functions and procedures... At the end of my BASIC 'career' I was already using GOSUB extensively to partition and re-use fragments of code of my own accord...

When I started living on my own (second year in college, 1996?) I bought an IBM Cyrix Pentium-class PC, got interested in Linux and switched to C(++) for most of my programming.

Nowadays, I call myself a programming language agnostic. Programming languages are tools, and when I start a project I mostly look at the prerequisites: whether they necessitate a certain language because of legacy code or the platform the software will be deployed on. Otherwise, if I have the primary say in the language used, I pick one I'm familiar with, and there are dozens to pick from. The only language paradigm I still feel I should improve on is functional languages... I haven't done enough coding with those...

In projects I work on and for maintenance work, I'm currently using (in order of most used) C#, Golang, C++, PHP, Java (Android), Objective-C and Python. And, if you consider them programming languages, Bash scripting and SQL.

Comment Re:Number 1 (Score 1) 149

That's a lot of 'unhappiness'...

List of programming languages - section P

P seems to be the largest group behind S and C. Don't let your happy little code experiences be thwarted by one or two rotten eggs ;)

PostScript, PowerShell and Python are very well known. But... did you know there is a Prolog implementation for the .NET CLI called P#? Pizza likes a cup of Java on the side, and there seems to be an entire PL family. And apparently, in the '80s, VM/CMS (currently z/VM) didn't have pipes built into the command line interface, but there was a separate program called 'Pipelines' with its own utility programs and syntax, so you could do tricks similar to what pipes made possible in UNIX... I learned some new things today :)

Comment Re:Madoff is small time compared to Musk (Score 1) 289

There are more than enough minable rare-earth sands in the U.S. It's just that no one kept investing in them, so China is cheaper. If you guys really are going to need those rare earths, you'll build mines on your own soil... now that I think about it, what IS holding you back??? Didn't you need more jobs?

In Europe, things are a bit different... too small and too overpopulated in most places, and the wrong regimes in the few areas (way east) where it might be possible.

world map of rare earth element deposits

Comment Re:Kids these days... (Score 1) 315

I almost agree with you... but the basis of modern computer architecture is not Turing but von Neumann. However, von Neumann was well aware of Turing's work and of the fact that it describes computers on an even more fundamental level.

For a machine to be a computer, it needs to be Turing complete*: it must be able to simulate any single-tape Turing machine. But it doesn't have to adhere to that architecture; as long as it can simulate it, it's fine. (*Caveat: limited memory.)
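
To illustrate what 'simulate a single-tape Turing machine' amounts to in practice, here is a minimal sketch of such a simulator in Go (the type names and the example machine are mine, purely for illustration):

package main

import "fmt"

// rule: what to write, where to move the head (-1 left, +1 right), and the next state.
type rule struct {
        write byte
        move  int
        next  string
}

// machine maps (state, symbol read) to a rule. Reaching state "halt" stops the run.
type machine map[string]map[byte]rule

// run executes the machine on a tape, growing the tape with blanks ('_') on demand
// as if it were infinite (which it can't be -- hence the limited-memory caveat).
func run(m machine, tape []byte, state string) []byte {
        head := 0
        for state != "halt" {
                if head < 0 {
                        tape = append([]byte{'_'}, tape...)
                        head = 0
                }
                if head >= len(tape) {
                        tape = append(tape, '_')
                }
                r := m[state][tape[head]]
                tape[head] = r.write
                head += r.move
                state = r.next
        }
        return tape
}

func main() {
        // Example machine: invert every bit, halt at the first blank.
        invert := machine{
                "scan": {
                        '0': {write: '1', move: +1, next: "scan"},
                        '1': {write: '0', move: +1, next: "scan"},
                        '_': {write: '_', move: -1, next: "halt"},
                },
        }
        fmt.Println(string(run(invert, []byte("10110"), "scan"))) // prints 01001_
}

Anything that can run that kind of loop, given enough memory, can in principle compute whatever any other architecture can; the von Neumann design is just the practical shape we settled on.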

The von Neumann architecture describes a CPU with a control unit and an ALU for calculations, a memory unit that stores both program and data, and input/output devices connected to it. That's exactly what a PC still does today, and exactly what such machines have done from the moment von Neumann began working on them, at least since his draft report on the EDVAC.

Comment Re:Yeah, but no (Score 1) 109

Thanx for the numbers :) Looks quite interesting, especially because I'm in the process of buying a fast SSD soon (a new PC setup replacing my 7-year-old Phenom; the motherboard will most probably have a PCIe 3.x x4 M.2 slot). Latency increases with block size... but when you're going for bulk data, latency gets less important, I think. It's the commands for very small bits of data, I suppose, that you want to complete with as little latency as possible. At 'various levels' of copying, my experience (just gut feeling, no numbers here) is that it's the small files that take up most of the time. Whether at the PC's internal storage level (copying a directory of random files, it flies through the first 75% of relatively large files, then takes 90% of the time to copy all the couple-of-KB-per-file junk), when using databases, or at the network level (don't get me started on SMB overhead), whatever. Either overhead, or a (relatively) larger part of the execution time is latency, or both...

So, if Optane has that insanely low 'average' latency of 10 µs, do you think Intel measured that at the optimum read block size (so it's an average over random positions in the NVRAM you read from), or do they mean that with a 'typical' load of random block sizes you get, on average, 10 µs of latency before the CPU can process the data? Well, we'll see when people get their hands on them and can benchmark them.
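
That's the kind of quick-and-dirty benchmark I'd start with myself: time reads of the same file at different block sizes and see how much of the wall time is per-request latency versus raw throughput. A rough Go sketch (the file name is made up, and the OS page cache will skew the numbers unless you drop caches or use O_DIRECT, so treat it as an illustration only):

package main

import (
        "fmt"
        "os"
        "time"
)

// timeReads reads the whole file in chunks of blockSize bytes and returns the
// elapsed time and the number of read calls issued.
func timeReads(path string, blockSize int) (time.Duration, int) {
        f, err := os.Open(path)
        if err != nil {
                panic(err)
        }
        defer f.Close()

        buf := make([]byte, blockSize)
        reads := 0
        start := time.Now()
        for {
                n, err := f.Read(buf)
                if n > 0 {
                        reads++
                }
                if err != nil { // io.EOF or a real error; either way, stop
                        break
                }
        }
        return time.Since(start), reads
}

func main() {
        path := "testfile.bin" // any large file on the drive you want to test
        for _, bs := range []int{4 << 10, 64 << 10, 1 << 20} {
                elapsed, reads := timeReads(path, bs)
                fmt.Printf("block %7d B: %6d reads in %v (%.1f µs per read)\n",
                        bs, reads, elapsed, float64(elapsed.Microseconds())/float64(reads))
        }
}

If the time per read barely drops as the block size shrinks, you're latency-bound; if total time scales with the number of reads, it's the per-command overhead doing the damage.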

I'm also very 'curious' whether Optane will indeed beat a similarly priced (but obviously larger) PCIe SSD as an HDD cache in actual real-world desktop circumstances (including using it as a 'swapfile' if you want), and whether it makes a difference that is actually noticeable in the user experience. Of course the guaranteed durability for cell writes is nice, but that will be (partly?) negated by the smaller storage volume of the device. Also, advertised durability doesn't indicate actual durability. I know of SSD tests where cheap (incidentally Samsung) SSDs could handle way more writes than advertised (the benchmarker had to break off testing after hundreds of times the advertised writes, or they'd have missed the publishing deadline of their article), while other SSDs barely hit their mark and then died completely.

Comment Re:Yeah, but no (Score 2) 109

This!
My first thought was exactly this. You can have a Samsung 960 EVO that is three times faster in read and over five times faster in write speeds for only twice the money of that Intel module. And it has a capacity of 250 GB, not 32 GB. If Samsung made a 960 EVO 128 GB model, the entire Intel product line would be dead in the water. Oh, wait. They have, somewhat... the SM961 128 GB, which is both faster and about as expensive as 32 Intel GBs.

Sorry Intel, and thanx for the déjà vu moment, for my second thought was: 'Oh, my god, this is Intel Turbo Memory / Robson modules (tm) all over again!'

Comment Re:Is It A Problem? (Score 1) 266

When such bugs are part of 'core' library code that's called often enough in a myriad of applications, they can affect the performance and usability of your system when you do need the processing power for other things.
Take this simple piece of code, loosely based on something I wrote during a long coding session 10 days ago and only found out about last week. Can you spot the bug? A hint: it decreases performance on average by a factor of 2 under certain conditions, it's very basic, and the function definitely does what it says on the tin.


func (b *barn) NeedleInHaystackAndDoMore(theNeedle needle) {
        found := false
        for _, aNeedle := range b.haystack {
                if aNeedle == theNeedle {
                        found = true
                }
        }
        if !found {
                b.haystack = append(b.haystack, theNeedle)
        }
        DoMore()
}

Comment Re:Same thing with manhatten island. (Score 1) 147

Says who? The fool who buys souls for 8-digit sums in the first place? Someone else who tries to keep you from selling your soul so they'll have it when you can no longer make use of it? I would be very careful with that proclamation, unless you have some insight I'm not aware of.

I would not accept that 8-digit sum for something that could be no more than dust in the wind. I'd first investigate the value of a soul very carefully before I'd accept or reject any sum for it. And because I know I currently definitely can't, I'd say to the buyer: 'sway me, convince me, move me, enlighten me... or sod off!' Then, when he does, we may be able to have an adult conversation on more equal terms, you know.

Comment Re:That's pretty smart (Score 1) 249

In these cases there may be grounds to doubt the meters, even if properly calibrated. Some metering systems in use can't handle high-frequency fluctuating loads because they are built with components that are too cheap, for -reasons- (anything from the meter manufacturer wanting to cheap out, to insufficient initial specifications, to some engineer having a bad hair day and an insufficient design getting passed). If you have a switching power supply that draws its energy from the grid during a single 2 kHz period every second, and a metering system that also samples during a single 2 kHz period every second, and the two run 'in sync', the meter 'thinks' you're drawing energy at 2000 times the rate you actually are. If they run out of sync, the meter will charge you nothing, and if the frequencies differ slightly, the meter will probably measure a correct average, although depending on the frequency difference it might measure 'insane' loads one month and nearly nothing the next.
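
A toy simulation of that sync effect, in Go (all numbers are made up; it just shows how badly undersampling a bursty load can go wrong):

package main

import "fmt"

// Toy model: a switching supply that draws 200 W, but only during one 0.5 ms
// window (one 2 kHz period) per second, so its true average is 200 * 0.0005 = 0.1 W.
// A meter that also only looks at the line for one 0.5 ms window per second sees
// either all of that burst or none of it, depending on how the windows line up.

// loadWatts returns the instantaneous draw at time t (in microseconds).
func loadWatts(tMicros int64) float64 {
        if tMicros%1_000_000 < 500 { // the single 0.5 ms burst at the start of each second
                return 200.0
        }
        return 0.0
}

// meterAverage samples once per second at a fixed offset inside the second and
// assumes that reading held for the whole second (which is where it goes wrong).
func meterAverage(offsetMicros int64, seconds int) float64 {
        sum := 0.0
        for s := 0; s < seconds; s++ {
                sum += loadWatts(int64(s)*1_000_000 + offsetMicros)
        }
        return sum / float64(seconds)
}

func main() {
        fmt.Println("true average draw:          0.1 W")
        fmt.Printf("meter in sync with bursts:  %.1f W (2000x too high)\n", meterAverage(0, 3600))
        fmt.Printf("meter sampling in between:  %.1f W (you get billed nothing)\n", meterAverage(250_000, 3600))
}

With a tiny frequency mismatch the sampling window would slowly drift through the burst, so over many months it averages out, but any single billing period could land anywhere between those two extremes.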

Comment Re:A better question (Score 2) 243

That totally depends on whether they're done right. Some netbooks were surprisingly capable: cheap, small, sturdy and useful for the narrow set of tasks they were made for in the first place. Ideal as a small computer when travelling, or as a second PC to surf the internet or write some documents on when another family member has occupied the main desktop/laptop. And even as a generic college student PC for office applications, many of them were more than capable. However, there were also POSs, only capable of delivering a slideshow of BSODs.

A good netbook balances hardware and software against a specific set of tasks. Many successful netbooks had to forego a Windows operating system because of that, although some late models with Win Basic may have been OK-ish. Those unfortunately got bogged down over time by too many updates, or by owners trying to shoehorn a regular Windows version onto them for the 'Aero' effect.
Later on, when the term 'netbook' went out of fashion, it wasn't only tablets that took over. i3-class hardware got a lot cheaper (both in purchase price and in power requirements) and gobbled up the higher end of the market. Some tablets were developed with sturdy keyboard docks that made them capable of everything a netbook could do, and some manufacturers still release a netbook-like configuration now and then; ASUS is still my favourite in that regard. And Chromebooks, except for the high-end ones, are netbooks in all but name. The market for mobile computing equipment just got more diverse, and 'netbook' alone no longer covered it.

Comment Re:Where's the work! (Score 1) 125

It (the work) / they (the jobs) will be gone. And that should not be a thing to worry about as long as the right politicians are voted into office. Currently, governments apply taxes mostly to work and to money spent. When those are no longer viable options, they'll have to tax something else to keep the nation running: production and property. Or they'll have to (shudder) privatize. They'll need to distribute enough wealth or risk anarchy, whether they take the 'left' road or the 'right' one.

Those with creative ideas and the will to execute them will still earn. Those depending on cheap/manual/easily replaceable labor are better off -not- breaking their backs, and having a nice walk through the park (or 'couch potatoing' some torrents) instead. Talking about breaking backs: as that's a health risk an employer has to insure against, I foresee such workers becoming a liability in the future even if they offer their services for free. Robot labor will eventually become cheaper than human labor offered voluntarily and 'without cost', unless, as a society, we accept people being worked to death.

That's why lowering the minimum wage is a dead-end road. Whether you're a 'commie' or a 'capitalist', or anything in between, changing the minimum wage to anything other than 'if you work reasonable hours per week you can live off it, have some small luxuries and support a modest family' will do more harm than good. When the minimum wage is too low, you'll have people starving and not enough consumption to keep the economy going. When the minimum wage is too high, people working above and beyond what's normal (those that used to earn a higher wage) will stop putting in the effort that keeps progress going, because they can 'earn' the same 'doing nothing'.
As such, the minimum wage is not, and should never be, meant to indicate the value of the lowest-paying jobs employers are willing to not automate or offshore away, because that will lead to dystopia. It's an indication of what we, as a society, think a human being should earn at the very least (and thus be able to consume in equal value) as a member of that same society, whether the job adds that value to society or not.

I foresee a lot of new hobbies in the not too distant future, manual labor being one of them. Maybe some of them will even become sports, so the really talented can earn money going that route. The others will be glad 'modern' humans will never have to do that anymore.

Who wants to manually weave cloth, pull a plough, dig coal using only a pickaxe or even operate a switchboard... or do any of the other jobs that have disappeared because mechanical/automated labor is lots and lots cheaper than paying human workers a decent wage?

Comment Re:Bah. My phone is based on electroweak theory! (Score 2) 129

Ehm, no. It doesn't matter in this case: electromagnetism is subsumed by electroweak theory. You don't need electroweak theory to build a smartphone unless you want it to run on fission-decay batteries. You do need the electromagnetism part, though, at least for the theory behind how the various radios built into a smartphone communicate wirelessly.
Also, you need quantum mechanics for things like transistors (semiconductor theory) and GPS navigation (atomic clocks, which rely on quantized energy transitions). I'd rather they had mentioned that one... Miniaturization of electronics is fine, but you need discrete components acting as electronic switches that can be miniaturized well in the first place. Tubes and relays don't cut it (or we would be living in the 'Fallout' universe).

Comment Re:Anthropological principle (Score 1) 187

With current AI, we see the 'mechanism' express a certain 'behaviour' when inputs are triggered and, somewhere inside, a threshold is crossed. We train such an AI with examples, and the threshold should be crossed when 'similar' (but different) examples are used as input. Sometimes it triggers on examples that may not, at first glance, have enough similarity with the training set. That's where things get interesting. Often enough the AI eventually turned out to be 'right' (it detected cancer cells where no doctor could, won a Go match, etc.).
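
For the 'threshold is crossed' part, I mean something as basic as this Go sketch of a single artificial neuron (the weights and inputs are invented, purely illustrative); training just nudges the weights until the threshold is crossed for the examples it has seen, and hopefully for similar ones too:

package main

import "fmt"

// neuron fires when the weighted sum of its inputs crosses a threshold.
type neuron struct {
        weights   []float64
        threshold float64
}

func (n neuron) fires(inputs []float64) bool {
        sum := 0.0
        for i, w := range n.weights {
                sum += w * inputs[i]
        }
        return sum > n.threshold
}

func main() {
        n := neuron{weights: []float64{0.6, 0.9, -0.4}, threshold: 1.0}
        fmt.Println(n.fires([]float64{1, 1, 0})) // true: 1.5 > 1.0, the 'behaviour' triggers
        fmt.Println(n.fires([]float64{1, 0, 1})) // false: 0.2 < 1.0, nothing happens
}
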
But AI currently doesn't analyze a problem from all sides, weigh arguments and consciously come to a conclusion, like humans (would like to) do. To my mind it's more like a subconscious processor. And I think, maybe, most of our brain works on a similar level, doing things intuitively, because doing things consciously requires a lot more energy. I think consciousness does emerge when enough 'intelligence' is connected together, when there are enough 'idle resources' to analyze a part of your input from every angle possible, and when you 'learn' (and trust) to let most of your processing be prepared by your subconscious and then cherry-pick only the really tough cases to give them all the attention (and energy) you have reserved for your conscious part... And then consciousness isn't just a by-product, it's a valuable tool that reduces your false positives. But who am I to use my meager consciousness to ponder such a question, and with so little (close to no) evidence?
