
Comment Spam (Score 3, Informative) 108

Using Outlook.com for email is a bad idea. So much legitimate email is simply never delivered, and you won't even know what you're missing: it doesn't go to Spam or Junk or anything, they just silently drop it without warning you. You might as well set your primary MX record to 127.0.0.1, because email with Outlook is about that useful.

Comment Re:Updated Policy: (Score 2) 372

95% of Han Unification doesn't seem like a problem to me. The slight stylistic differences between Chinese and Japanese, where it's just a matter of "these tiny strokes point slightly left in Chinese and slightly right in Japanese", can still easily be understood no matter what the font. Even somewhat larger stylistic differences don't actually cause any problems. For example, take these two Kanji, http://jisho.org/search/%23kan... , and the other Kanji that have these shapes inside them. The fonts tend to show the Chinese version: in the first, the top stroke is drawn the same as the 3rd/4th strokes, but in Japanese the top is usually written almost like a tiny dot, as seen in the stroke-order diagram. In the second Kanji (scroll down), the last stroke is vertical in Chinese, but diagonal and connected differently in the Japanese version. Japanese people, in my experience, don't seem to have any problem with these kinds of differences.

Other, more major differences caused by Kanji simplification over the years have also resulted in two codepoints in Unicode, so Chinese and Japanese characters that *historically* had the same drawing are still usable in either language. For example, https://translate.google.com/?... shows the Japanese and simplified Chinese "fish". Japanese still uses 4 dots on the bottom, Chinese uses a line. This was given two codepoints and doesn't seem to be a problem. Many other differences were given two codepoints, and Chinese fonts typically don't include any glyph for the Japanese version and vice versa.
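As a quick sanity check, here's a minimal Python sketch showing that the two "fish" forms really are separate codepoints (assuming the pair being compared is 魚, U+9B5A, and simplified 鱼, U+9C7C), so a document can say exactly which one it means:

```python
import unicodedata

# Two distinct codepoints for "fish": the traditional/Japanese form with
# four dots on the bottom, and the simplified Chinese form with a line.
# (Assumption: U+9B5A / U+9C7C is the pair the translate link compares.)
for ch in ("\u9b5a", "\u9c7c"):
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch, '<no name>')}")
```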

The example I gave in my original post, about the Kanji meaning "leader", is one that really baffles me. Why was such a major difference in drawing merged into only one codepoint, and why was it never separated into two codepoints in a later version of Unicode? There are other Kanji with major differences in appearance that share a single codepoint because of Han Unification, and these cause a lot of trouble. Japanese people typically don't recognize the Chinese version of "leader" as having any meaning at all. It's just scribbles to them, and when a webpage or document tries to display Japanese text but Windows (or whatever) decides to fall back to a Chinese font, the entire meaning is lost, because of Unicode.

Comment Re:Updated Policy: (Score 2) 372

The issue isn't how many characters exist. There is room to add more characters to Unicode when missing ones are found. The big failure of Unicode is Han Unification, which is basically like saying "Well, the character A in America has the same *meaning* as the character B in Canada, so let's only issue one codepoint for A/B". Now when you type an A on your American computer, all Canadians see a B, because their fonts render the exact same codepoint differently. This happened with many common characters that have the same *meaning* in Chinese and Japanese but are drawn completely differently. As an example, try copying the Kanji at http://jisho.org/search/%E5%B0... into MS Word and compare the Meiryo font vs. the Microsoft YaHei font.
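A quick way to see the problem from the text side is to dump the codepoints you actually pasted: with a unified character there's only one codepoint, so nothing in the data says which glyph the reader's font will draw. A minimal Python sketch follows; the sample character 直 (U+76F4) is only an illustrative stand-in often cited as looking noticeably different in Chinese vs. Japanese fonts, not necessarily the character from the link above.

```python
import unicodedata

def dump_codepoints(s):
    """Show the codepoints in a pasted string. A Han-unified character is a
    single codepoint, so the data itself can't say whether the reader's font
    will draw the Chinese or the Japanese glyph for it."""
    for ch in s:
        print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch, '<no name>')}")

# 直 (U+76F4) is just an illustrative stand-in; paste the character from the
# jisho.org link here to check it yourself.
dump_codepoints("\u76f4")
```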

Comment Re:Updated Policy: (Score 1) 372

Quite frankly, Unicode works well right now, provided you use UTF-8 or UTF-32. UTF-16 with its surrogate pairs is really quite an ugly hack, and 16 bits are obviously not enough when we already need nearly 21 bits to encode all the existing characters. UTF-8 is quite elegant (compatible with ASCII, yet easily countable and self-synchronizing), and it can easily be extended to 31 bits should we need more codepoints in the future. UTF-16 can't be extended in any easy way and will just become a nightmare to support, should future versions of Unicode decide to start using codepoints above U+10FFFF.
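To make the "easily countable and self-synchronizing" point concrete, here's a small Python sketch: ASCII bytes pass through unchanged, every continuation byte falls in the 0x80-0xBF range, and counting characters is just counting the bytes that aren't continuations.

```python
def describe_utf8(text):
    # Lead bytes are < 0x80 (plain ASCII) or >= 0xC0 (start of a multi-byte
    # sequence); everything in 0x80-0xBF is a continuation byte, which is
    # what makes UTF-8 self-synchronizing.
    for ch in text:
        raw = ch.encode("utf-8")
        kinds = ["lead" if b < 0x80 or b >= 0xC0 else "cont" for b in raw]
        print(f"U+{ord(ch):06X} -> {raw.hex(' ')}  ({', '.join(kinds)})")

sample = "A€🎉"  # 1-byte, 3-byte, and 4-byte (above U+FFFF) characters
describe_utf8(sample)

# Counting characters = counting non-continuation bytes.
assert sum(1 for b in sample.encode("utf-8") if b & 0xC0 != 0x80) == len(sample)
```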

Comment Re:either integrated Intel HD Graphics 530 or a po (Score 1) 94

I looked into this before, and on newer setups like my laptop it seems common that your GPU choices in the BIOS are Intel-only or Hybrid. You cannot select just the nVidia one. There is probably some hardware reason it's not possible now.

Comment Re:either integrated Intel HD Graphics 530 or a po (Score 1) 94

Even with the crummy update that Google didn't need to make, Google Maps runs significantly faster and smoother on much older, slower hardware (custom-built desktops) that doesn't have a hybrid GPU setup. Having a 100% dedicated nVidia card that everything always uses is great. Having an Intel GPU that is used for anything at all makes paying for the nVidia GPU a pointless waste of money when buying a laptop.

Comment Re:either integrated Intel HD Graphics 530 or a po (Score 1) 94

Basically, the end result is that I paid extra for an nVidia card when I bought this, thinking it would *replace* the Intel one, but it did not. Several desktop computers I built from parts all work just fine with nVidia cards, with no Intel GPU or funny Optimus drivers getting in the way of things. Google Maps always runs super fast and smooth, and nothing ever crashes. Next time I buy a laptop I'm going to pay extra attention, because if you can't entirely, 100% disable the Intel GPU, there is zero point in having the nVidia GPU added on.

Comment Re:either integrated Intel HD Graphics 530 or a po (Score 1) 94

The nVidia driver actually greys out and prevents you from selecting the nVidia GPU for apps on its known-application list, and Firefox and Chrome are on the list of programs that can only use the Intel GPU. You can always copy Firefox.exe to Firefox2.exe so it's not on the known list, browse to the copy from the nVidia control panel, and set it to use the nVidia GPU. Unfortunately, doing that tends to crash the whole OS a lot, which seems pretty ridiculous. I tried it with both Firefox and Chrome and eventually gave up and left them on the Intel GPU.
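For what it's worth, here's a tiny sketch of that rename trick (the install path is an assumption; adjust it for your machine). The copy isn't on the driver's known-application list, so the nVidia control panel will let you assign it to the dedicated GPU.

```python
import shutil
from pathlib import Path

# Assumed default install path; change it to wherever your browser lives.
firefox = Path(r"C:\Program Files\Mozilla Firefox\firefox.exe")
clone = firefox.with_name("firefox2.exe")

if firefox.exists() and not clone.exists():
    shutil.copy2(firefox, clone)  # same binary, new name -> not on the known list
    print(f"Created {clone}; point the nVidia control panel at this copy.")
```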

Comment either integrated Intel HD Graphics 530 or a power (Score 3, Informative) 94

What "either integrated Intel HD Graphics 530 or a powerful GeForce GTX 960M" means is that the nVidia driver will make regular windows, and apps like Firefox/Chrome use the slow Intel card for all your regular stuff. Google maps or anything that uses WebGL will slow to a crawl. Only games are "allowed" to run on the real GPU.
At least, that's how the last laptop I got a year ago with a setup like that worked...
I have a Core i7-4500U, 16GB RAM, and a GT735M, and it is absolutely painful to use certain things like Google Maps.
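If you want to confirm you're on one of these hybrid setups, a quick check is to list the display adapters; an Optimus-style laptop shows both the Intel iGPU and the nVidia chip. A minimal Windows-only Python sketch (assumes the legacy wmic tool is still present):

```python
import subprocess

# Lists every display adapter Windows knows about; a hybrid laptop shows
# both the Intel integrated GPU and the discrete nVidia one.
result = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name"],
    capture_output=True, text=True, check=True,
)
adapters = [line.strip() for line in result.stdout.splitlines()
            if line.strip() and line.strip() != "Name"]
for name in adapters:
    print(name)
```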

Comment Re:Learn your mathematical operators (Score 2) 117

So many developers reflexively include tons of jQuery and Bootstrap CSS/JS files, 99% of which aren't used anywhere on the entire site, just because that's the only way some people know how to "code" web sites. When you add in jQueryUI and a bunch of FontAwesome fonts that aren't used either, I'm surprised some people can write a single "Hello World" page in under 20MB.

Comment Re:Drivers (Score 1) 203

I'm pretty sure this stupid hybrid GPU system is the reason I need Sony-specific drivers; the stock Intel/nVidia graphics drivers just make things run super, super slow. I do suspect the current 8/8.1 drivers will work just fine on 10, but Sony's official position right now is not to install Windows 10 until after November.

Comment Re:Drivers (Score 1) 203

Given how badly the stock Intel and nVidia drivers make my Vaio Flip 15 run compared to the modified Sony ones, I'm pretty worried about upgrading to Windows 10. Drivers aren't going to be available until November according to Sony - http://esupport.sony.com/US/p/... - because mine came with 8 (not 8.1) preinstalled, even though 8.1 was out at the time.
