Comment Re:no (Score 1) 399

120 and 240 FPS are invisible to the human eye. More importantly, the source material is either at 20, 24, 29.97, or 60 FPS, so either you have the extra frames showing the same frames again (thus being useless), or you generate extra frames which didn't previously exist and which look a bit plasticky and odd. In test after test, the "Motion Plus" and other BS upframing is rated as adding noise, because that's all it does to the signal.

Two issues with that theory.

First of all, 60 is not evenly divisible by 24 (actually, 23.976 because of historical NTSC reasons). Since basically no consumer 60Hz televisions actually drive the panel at 24 (23.976) frames per second (even though they may accept 24p input), they generate the additional frames using a 3:2 cadence (one frame is displayed for 3 refresh periods, then the next for 2). The fact that frames are displayed for uneven periods creates judder.

On a 120Hz set each frame is simply displayed for 5 refresh periods. On a 240Hz set each frame is displayed for 10 refresh periods.
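
To make the arithmetic concrete, here is a minimal sketch (class and method names are mine, purely illustrative) of why 24fps content maps cleanly onto 120Hz and 240Hz panels but not onto 60Hz ones:

```java
// Minimal sketch of the cadence arithmetic described above (illustrative only).
// 60/24 is not an integer, so a 60Hz panel must alternate 3 and 2 refreshes
// per film frame (3:2 pulldown), which is the source of judder. 120Hz and
// 240Hz panels give every frame the same number of refreshes.
public class CadenceDemo {
    static void printCadence(int panelHz, int filmFps) {
        if (panelHz % filmFps == 0) {
            System.out.printf("%dHz panel: every %dfps frame shown for %d refreshes (even cadence)%n",
                              panelHz, filmFps, panelHz / filmFps);
        } else {
            System.out.printf("%dHz panel: %dfps frames need an uneven cadence such as 3:2 pulldown (judder)%n",
                              panelHz, filmFps);
        }
    }

    public static void main(String[] args) {
        printCadence(60, 24);    // uneven cadence -> judder
        printCadence(120, 24);   // 5 refreshes per frame
        printCadence(240, 24);   // 10 refreshes per frame
    }
}
```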

Second, I used to agree with you about motion interpolation as used on 120Hz displays, but since actually buying one my opinion has completely reversed.

Yes, the frames are faked. Yes, it looks a little weird. But I am frankly tired of the fact that films (and many television programs) are produced at an absurdly low frame rate (24Hz) that makes motion jerky and hard to follow. I like the fact that motion interpolation makes things look smoother, even if it does occasionally add artifacts.

Motion estimation technologies have gotten very good and the better TVs (like the Sony XBR9 I have) do a very good job of making film look like smooth video. It may not look 'cinematic', but I like the way it looks and many others do as well.

Comment Re:No. (Score 1) 511

I was a Linux desktop user for 10 years and just switched to Mac - not because of some nebulous "experience" (I still run fvwm over gnome or kde when given the choice), but I was sick of waiting for my laptop to reboot all the time, and the MacBook is the first computer I've ever used where power management actually, really works.

Maybe you should have been using Windows. I have been using sleep on my desktops and laptops for the past 6 years or so, and despite a wide variety of hardware (two ThinkPad models, an Acer budget laptop, a generic "whitebook" laptop, an Asus Eee PC netbook, and self-built systems with Intel and AMD CPUs, NVIDIA/AMD/Intel chipsets, and motherboards from Asus, Gigabyte, and even Jetway) I have never had a problem with sleep working properly.

To be honest, I have had decent success with sleep in Linux, too. Not 100%, but in the last 5 years things have gotten considerably better, especially if you have Intel hardware.

Comment Re:Not Dead on Arrival (Score 1) 260

for 90% of Americans you have verizon or AT&T, sprint is only useful inside of city limits.

AT&T and Verizon together are actually closer to 60% of the market, not 90%.

And I can't comment about Sprint, but T-Mobile works fine for me all over Colorado, including in rural areas.

outside of cities all coverage quickly drops to voice calls only. (I know one parking spot where my signal goes from 3G to EDGE depending on which way the wind is blowing.)

Maybe you should stop using AT&T's shit network. Verizon's and Sprint's networks are 100% CDMA2000 EV-DO, so anywhere you can make a voice call you can also get 3G data.

And EDGE is not "voice calls only". It's not fast but it does work.

Comment Re:hiring process tl;dr (Score 1) 235

As someone who has been offered full-time positions by both Microsoft (turned down) and Google (accepted), I can tell you that you're dead wrong.

Do you already have a creative reputation or prominent contacts in the field? If so, stop here and come and work for us - though your talents will probably go to waste.

I had neither. I have worked in small business and I spent two years as a graduate researcher. I don't want to do either anymore. What I want out of Google is good pay and benefits, work-life balance, and to work with smart and motivated peers. My intern experiences at both Microsoft and Google told me that both companies offer that. The decision to work at Google primarily came down to the fact that they were willing to hire me where I want to work (Boulder, CO).

Did you go to a top school, regardless of your background?

I respect my university (University of Colorado Boulder), but it most certainly doesn't have the same reputation as MIT, Berkeley, Caltech, CMU, Stanford, or the other top universities.

Straight out of college as you are, can you answer some inane questions on undergraduate computer science?

I had a mixture of algorithm-type questions, OO design and programming questions (C++ and Java), and some system design questions. All of these topics are applicable to a software engineering position.

How about some silly puzzles to prove your geekiness?

Now I know that you haven't interviewed with either company recently. Neither Microsoft nor Google has used puzzle questions since 2007; in fact, puzzle questions are now expressly forbidden at both companies.

How about Unicode? Do you know some obscure facts about Unicode? What about HTTP? Have you memorised enough of the RFC?

No one at Microsoft or Google gives a shit if you can memorize standards. Hell, you don't even need to know the standard library. On one of my questions I wrote pseudo-Java that used collection methods that don't exist (with names taken from the C++ STL and Python).

How would you improve ______?

I have never been asked subjective questions like that by either Google or Microsoft.

Do you have the same attitude as us?

No one at Google or Microsoft is going to ask you questions about your attitude or motivations, except perhaps your recruiter (who doesn't have any say in whether you get hired).

Comment Re:Java killer? (Score 2) 623

Java vastly simplifies a whole ton of conventions by making every object variable a reference.

Everything in Java is passed by value, and every object variable is a reference. So when you pass an object to a method, what gets copied is the reference, not the object itself.
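
A minimal sketch of that distinction (class and variable names are mine, purely illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// Java passes object references by value: reassigning a parameter only
// rebinds the method's local copy of the reference, while mutating the
// referenced object is visible to the caller.
public class PassByValueDemo {
    static void reassign(List<String> list) {
        list = new ArrayList<>();   // rebinds only the local copy of the reference
        list.add("ignored");
    }

    static void mutate(List<String> list) {
        list.add("visible");        // mutates the object the caller also references
    }

    public static void main(String[] args) {
        List<String> items = new ArrayList<>();
        reassign(items);
        System.out.println(items);  // [] - the caller's reference is untouched
        mutate(items);
        System.out.println(items);  // [visible]
    }
}
```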

The value of Java is that the designers know how to say "no". Adding more features to a language dramatically increases the chances that code will do something unknown or unexpected. No language is free from that sort of mistake, but Java is significantly more consistent and predictable than most languages.

Comment Re:Missing feature in Java: Copy on write (Score 1) 623

The solution to this is to have the return type be an interface/class that doesn't support assignment.

For example, I frequently write immutable objects that contain lists. Returning a reference to the internal List would break immutability (since the caller could modify the list contents). Instead, I can return an Iterable, which exposes no methods for modifying the list.
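
A minimal sketch of that pattern (the Measurement class is mine, purely illustrative):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// An immutable object that stores a List internally but only hands out
// an Iterable, so callers can read the elements without modifying them.
public final class Measurement {
    private final List<Double> samples;

    public Measurement(List<Double> samples) {
        // Defensive copy plus unmodifiable wrapper, so neither the caller's
        // original list nor anything returned from this class can mutate state.
        this.samples = Collections.unmodifiableList(new ArrayList<>(samples));
    }

    // Iterable exposes iteration only - no add/remove/set.
    public Iterable<Double> samples() {
        return samples;
    }
}
```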

Of course, there are performance penalties for wrapping every array in a collection class. But they aren't so bad, and there are much bigger fish to fry in scientific/numerical Java from a performance standpoint (like mandatory bounds checking for every access).

Comment Re:Java killer? (Score 4, Insightful) 623

Anyone who wants Java to allow overloading of == clearly has never had to deal with C# code (C# allows overloading).

The fundamental problem is that people don't understand Java semantics. In Java, equality testing is consistent and efficient: for reference types (everything except primitives), == always means reference equality. And .equals() always means value equality.

Comparing strings with == sometimes works because the runtime allocates string constants from a pool. So ("A" == "A") evaluates to true because they reference the same object. But ("A" == new String("A")) evaluates to false.
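
A minimal sketch of that behavior:

```java
// String literals are interned in a constant pool, so == "works" for them,
// but any explicitly constructed String is a distinct object.
public class StringEqualityDemo {
    public static void main(String[] args) {
        String a = "A";
        String b = "A";                  // same pooled constant as a
        String c = new String("A");      // different object, same contents

        System.out.println(a == b);      // true  - same reference
        System.out.println(a == c);      // false - different references
        System.out.println(a.equals(c)); // true  - value equality, which is usually what you want
    }
}
```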

The key here is that object variables in Java are *always* references. That's why any object variable can be null, and it's why == on objects always compares references rather than values.

The problem with overloading is that it's not consistent. Some C# types overload ==, some don't. There's also ".Equals()", which is supposed to check for value equality. And there's the static method Object.ReferenceEquals(), which always checks for reference equality.

So you end up with a trap where you either use Equals() all the time (as you would in Java), or you have some code that uses Equals() and some code that uses ==. Which doesn't make sense.

Java is about consistency and predictability. Having operators do different things in different contexts results in neither.

Comment Re:Why I'm learning .NET (Score 1) 758

I wanted to learn regular C++ or Java as those have tons of applications, but .NET, and more specifically C# is required for pretty much any computer related degree these days.

That is absolutely not my experience. I majored in CS at the University of Colorado, and the vast majority of professors didn't care what language you used. There were a few exceptions:

- In my object-oriented design courses we used Java
- In my data structures classes we used C++
- In my introductory CS course we used a mix of Python and C
- In my operating systems course we used C
- In my computer architecture courses we used a mixture of different assembly languages (M68000, MIPS, x86)

Not once was I required to work with .NET. That's mostly because .NET, despite Mono, is still very Windows-centric. Our lab infrastructure is Linux-based and a good fraction of the students and professors have Macs.

I like .NET. We used it in my senior software engineering project primarily because our sponsor was a Microsoft shop (and a very successful one at that). But I could have chosen a different team if I had wanted to avoid .NET.

Comment Re:My experience (Score 2) 758

and you won't waste time futilely trying to optimize a critical function in Java that could execute 50 times as fast in C++

I agree with your post entirely, but that point is based on the myth that Java is slow. Java typically runs anywhere from 70% to 110% as fast as C++, depending on your C++ compiler and the program you're writing.

In some pathological cases C++ can be 10x faster than Java (although there are pathological cases in the other direction too).

But C++ is basically never 50x faster than Java on any sort of real code.

Note that when I say "Java", I mean Java running on a modern JIT-based VM like HotSpot (used in Oracle's Java). Obviously interpreted Java VMs are much, much slower - but they aren't used much in practice any more.

C++ is typically much better than Java from a memory usage perspective. It's also a good deal faster (usually around 2:1) on loopy scientific code, both because of array bounds checks in Java and because things like auto-vectorization aren't as well developed in JVMs.
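
A minimal sketch of the kind of loop being described (a simple dot product; the names are mine):

```java
// Tight numeric loop of the sort where C++ typically wins: each a[i] and
// b[i] access carries an implicit bounds check (unless the JIT can prove
// it away), and the JIT may or may not vectorize the loop.
public class DotProduct {
    static double dot(double[] a, double[] b) {
        double sum = 0.0;
        for (int i = 0; i < a.length; i++) {
            sum += a[i] * b[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        double[] a = {1.0, 2.0, 3.0};
        double[] b = {4.0, 5.0, 6.0};
        System.out.println(dot(a, b)); // 32.0
    }
}
```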

Comment Re:Uh... (Score 1) 444

"Performant" is not "a truly cogent alternative syntax delivering readability, expressiveness and some compelling new language features." It's not even a word.

If you believe in descriptive linguistics, "performant" is absolutely a word. And as someone who writes software that actually has to deal with human language, taking a prescriptive view would mean my system doesn't work on the language people actually produce. Which means that I don't get to write cool papers.

In English, no one person or organization has the power to decide what is and is not a valid word. You can either deal with that or you can pretend that complaining will change things. It won't.

Comment It appears that the software worked fine (Score 3, Interesting) 187

It appears that the A330's software worked fine. The indications and reversions that the software reported over the data link are consistent with a mechanical failure (possibly caused by icing) of the Pitot-static system.

Without airspeed data the A330's autopilot and autothrottle disengaged, and the flight control system reverted to a mode known as "Alternate Law" in which most of the normal flight-envelope protections are removed. We know that this happened because the aircraft reported it over the data link before the crash.

The unfortunate reality is that the reversionary modes on the Airbus flight control system are dangerous because they tend to occur at the worst possible times - when there are multiple sensor or computer failures or when the sensors give readings that are outside the operational limits of the control system. In this situation the flight crew has to react quickly and they are often faced with inadequate, contradictory, or confusing instrument readings.

It is possible to maintain a safe airspeed in an Airbus without the Pitot-static system. The problem is that the pilots need to notice the issue (loss of airspeed data) and react before things get out of hand. It appears that the Air France pilots were unable to do so.

Comment Re:Gnome and KDE both suck (Score 1) 247

What about someone who doesn't want to?

Configuration options are fine if the defaults are sane. The problem that KDE and GNOME both have is that the defaults often aren't sane.

I don't want to have to turn the minimize button back on, or get rid of the awful spatial interface, or download an add-on to get rid of the damn cashew, or change a setting to turn the desktop back into a simple icon screen, or make the clock not huge, or select sane font sizes.

KDE gives you options for everything and then chooses insane defaults. GNOME hides the options in gconf and then chooses insane defaults.

Why can't this shit work out of the box?

And, by the way, Microsoft isn't immune from this either. They've had some real fuck-ups like the search dog in XP and hiding file extensions by default.

Comment Re:linear algebra (Score 5, Insightful) 583

Try understanding neural networks without understanding calculus. You can become a code monkey without it, but there are areas of computer science that will be beyond your grasp if you don't understand calculus (and statistics).

There is always going to be some aspect of CS that's beyond your grasp, no matter what you take.

As someone who just graduated from a 4-year CS program and is about to get an MS in CS, and as someone who is a paid researcher on a major CS research grant, let me say this: CS is much broader than most people think.

Anyone who says that CS is just about the theory of computation has a very narrow view of CS. There's a sort of bullshit 'purity' argument that anything else should be put into another category like programming or computer engineering.

Some topics are easy to categorize. Design methodologies? Software engineering. CPU design? Computer engineering.

But then there are topics that defy classification. Is compiler design a CS topic, or is it CE? It's probably both. Is static verification a CS topic or a SWE topic? Both.

And then there are topics that obviously belong (at least partially) in CS but often have very little to do with computational theory. Computer vision, natural language processing, network theory, and quite a bit more.

If you limit CS to just algorithms and the theory of computation, students get a very limited view of what's out there. I would argue that students should have a good idea of how real computer systems work, how operating systems are designed, how network systems communicate, and how software is designed and built. None of these topics fit neatly and entirely under the "CS" banner, but that doesn't mean that they aren't important and it doesn't mean that there is not legitimate and ongoing research in those fields.

There is no getting away from the fact that most people need to be able to write code after graduating from a CS program. Even in the academic community, most positions involve quite a bit of coding. There are very few positions where academics can focus on theory all day long. For most projects, though, publishable results depend on producing a working system, and that means writing code.
